Combining Second-Order Belief Distributions with Qualitative Statements in Decision Analysis
http://www.springerlink.com/content/q0vn618n6q763202
Ola Caster and Love Ekenberg
There is often a need to allow for imprecise statements in real-world decision analysis. Joint modeling of intervals and qualitative statements as constraint sets is one important approach to solving this problem, with the advantage that both probabilities and utilities can be handled. However, a major limitation with interval-based approaches is that aggregated quantities such as expected utilities also become intervals, which often hinders efficient discrimination. The discriminative power can be increased by utilizing second-order information in the form of belief distributions, and this paper demonstrates how qualitative relations between variables can be incorporated into such a framework. The general case with arbitrary distributions is described first, and then a computationally efficient simulation algorithm is presented for a relevant sub-class of analyses. By allowing qualitative relations, our approach preserves the ability of interval-based methods to be deliberately imprecise. At the same time, the use of belief distributions allows more efficient discrimination, and it provides a semantically clear interpretation of the resulting beliefs within a probabilistic framework.
Characterization of a coherent upper conditional prevision as the Choquet integral with respect to its associated Hausdorff outer measure
http://www.sci.unich.it/~doria/pub/021.pdf
Serena Doria
A model of coherent upper conditional prevision for bounded random variables is proposed in a metric space. It is defined by the Choquet integral with respect to Hausdorff outer measure if the conditioning event has positive and finite Hausdorff outer measure in its Hausdorff dimension. Otherwise, when the conditioning event has Hausdorff outer measure equal to zero or infinity in its Hausdorff dimension, it is defined by a 0-1 valued finitely, but not countably, additive probability. If the conditioning event has positive and finite Hausdorff outer measure in its Hausdorff dimension, it is proven that a coherent upper conditional prevision is uniquely represented by the Choquet integral with respect to the upper conditional probability defined by Hausdorff outer measure if and only if it is monotone, comonotonically additive, submodular and continuous from below.
New papers on imprecise probability from Sipta.org
http://www.sipta.org/ipp.xml
If you want to post an abstract, just send a message to alessandro@idsia.ch with the necessary information.
Notes on desirability and conditional lower previsions
http://www.idsia.ch/~marco/papers/2011amai-desirability.pdf
Enrique Miranda and Marco Zaffalon
We detail the relationship between sets of desirable gambles and conditional lower previsions. The former is one of the most general models of uncertainty. The latter corresponds to Walley's celebrated theory of imprecise probability. We consider two avenues: when a collection of conditional lower previsions is derived from a set of desirable gambles, and its converse. In either case, we relate the properties of the derived model with those of the originating one. Our results constitute basic tools to move from one formalism to the other, and thus to take advantage of work done on the two fronts. Sets of desirable gambles are at the same time very powerful and intuitive models of uncertainty. Given the central role of uncertainty in artificial intelligence, this work marks a key passage towards the wider accessibility of those modelling capabilities in artificial intelligence.
Belief functions combination without the assumption of independence of the information sources
http://dx.doi.org/10.1016/j.ijar.2010.10.006
Marco Cattaneo
This paper considers the problem of combining belief functions obtained from not necessarily independent sources of information. It introduces two combination rules for the situation in which no assumption is made about the dependence of the information sources. These two rules are based on cautious combinations of plausibility and commonality functions, respectively. The paper studies the properties of these rules and their connection with Dempster's rules of conditioning and combination and the minimum rule of possibility theory.
Likelihood-based inference for probabilistic graphical models: Some preliminary results
http://www.helsinki.fi/pgm2010/papers/cattaneo.pdf
Marco Cattaneo
A method for calculating some profile likelihood inferences in probabilistic graphical models is presented and applied to the problem of classification. It can also be interpreted as a method for obtaining inferences from hierarchical networks, a kind of imprecise probabilistic graphical model.
Exchangeable lower previsions
http://arxiv.org/pdf/0909.1148v1
Gert de Cooman, Erik Quaeghebeur and Enrique Miranda
We extend de Finetti's [Ann. Inst. H. Poincaré 7 (1937) 1–68] notion of exchangeability to finite and countable sequences of variables, when a subject's beliefs about them are modelled using coherent lower previsions rather than (linear) previsions. We derive representation theorems in both the finite and countable cases, in terms of sampling without and with replacement, respectively.
Imprecise Probabilities based on Generalized Intervals for System Reliability Assessment
http://www.me.gatech.edu/~ywang/publication/IJRS_impRel_wang.pdf
Yan Wang
Different representations of imprecise probabilities have been
proposed, where interval-valued probabilities are used such that
uncertainty is distinguished from variability. In this paper, we
present a new form of imprecise probabilities for reliability
assessment based on generalized intervals. Generalized intervals have
group properties under the Kaucher arithmetic, which provides a
concise representation and calculus structure as an extension of
precise probabilities.
With the separation between proper and improper interval
probabilities, focal and non-focal events are differentiated based on
the associated modalities and logical semantics. Focal events have the
semantics of critical, uncontrollable, and specified in probabilistic
analysis, whereas the corresponding non-focal events are
complementary, controllable, and derived.
A logic coherence constraint is proposed in the new form. Because of
the algebraic properties of generalized intervals, conditional
interval probability can be directly defined based on marginal
interval probabilities. A Bayes' rule with generalized intervals
allows us to interpret the logic relationship between interval prior
and posterior probabilities. The imprecise Dirichlet model is also
extended with the logic coherence constraint.
Bounding Uncertainty in Civil Engineering
http://www.springer.com/engineering/book/978-3-642-11189-1
Alberto Bernardini and Fulvio Tonon
The book is centered on applications to civil engineering problems, and essentially on the mathematical theories that can be related to the general idea of a convex set of probability distributions describing the input data and/or the final response of systems. In this respect, the theory of random sets has been adopted as the most appropriate and relatively simple model in many typical problems. However, the authors have tried to elucidate its connections to the more general theory of imprecise probabilities. If choosing the theory of random sets may lead to some loss of generality, it will, on the other hand, allow for a self-contained selection of the arguments and a more unified presentation of the theoretical contents and algorithms. Finally, it will be shown that in some (or all) cases the final engineering decisions should be guided by some subjective judgment in order to obtain a reasonable compromise between different contrasting objectives (for example safety and economy) or to take into account qualitative factors. Therefore, some formal rules of approximate reasoning or multi-valued logic will be described and implemented in the applications. These rules cannot be confined within the boundaries of a probabilistic theory, albeit extended as indicated above. The first chapter provides motivation for the introduction of more general theories of uncertainty than the classical theory of probability, whose basic definitions and concepts are recalled in the second chapter, which also establishes the nomenclature and notation for the remainder of the book. Chapter 3 is the main point of departure for this book, and presents the theory of random sets for one uncertain variable together with its links to the theory of fuzzy sets, evidence theory, theory of capacities, and imprecise probabilities.
Chapter 4 expands the treatment to two or more variables (random relations), whereas the inclusion between random sets (or relations) is covered in Chapter 5 together with mappings of random sets and monotonicity of operations on random sets. The book concludes with Chapter 6, which deals with approximate reasoning techniques. The book is written at the beginning graduate level with the engineering student and practitioner in mind. As a consequence, each definition, concept or algorithm is followed by examples solved in detail, and cross-references have been introduced to link different sections of the book.
Building knowledge-based systems by credal networks: a tutorial
http://www.idsia.ch/~zaffalon/papers/2010nova-cns.pdf
A. Piatti, A. Antonucci, M. Zaffalon
/Knowledge-based systems/ are computer programs achieving expert-level competence in solving problems for specific task areas. This chapter is a tutorial on the implementation of this kind of systems in the framework of /credal networks/. Credal networks are a generalization of Bayesian networks where /credal sets/, i.e., closed convex sets of probability measures, are used instead of precise probabilities. This allows for a more flexible model of the knowledge, which can represent ambiguity, contrast and contradiction in a natural and realistic way. The discussion guides the reader through the different steps involved in the specification of a system, from the evocation and elicitation of the knowledge to the interaction with the system by adequate inference algorithms. Our approach is characterized by a sharp distinction between the /domain knowledge/ and the process linking this knowledge to the perceived evidence, which we call the /observational process/. This distinction leads to a very flexible representation of both domain knowledge and knowledge about the way the information is collected, together with a technique to aggregate information coming from different sources.
The overall procedure is illustrated throughout the chapter by a simple knowledge-based system for the prediction of the result of a football match.
Representing uncertainty on set-valued variables using belief functions
http://www.hds.utc.fr/~tdenoeux/perso/doku.php?id=en:publi:belief_art
T. Denoeux, Z. Younes and F. Abdallah
A formalism is proposed for representing uncertain information on set-valued variables using belief functions. A set-valued variable $X$ on a domain $\Omega$ is a variable taking zero, one or several values in $\Omega$. While defining mass functions on the frame $2^{2^\Omega}$ is usually not feasible because of the double-exponential complexity involved, we propose an approach based on a definition of a restricted family of subsets of $2^\Omega$ that is closed under intersection and has a lattice structure. Using recent results about belief functions on lattices, we show that most notions from Dempster-Shafer theory can be transposed to that particular lattice, making it possible to express rich knowledge about $X$ with only limited additional complexity as compared to the single-valued case. An application to multi-label classification (in which each learning instance can belong to several classes simultaneously) is demonstrated.
Conditional models: coherence and inference through sequences of joint mass functions
http://bellman.ciencias.uniovi.es/~emiranda/regular-extension.pdf
E. Miranda and M. Zaffalon
We call a /conditional model/ any set of statements made of conditional probabilities or expectations. We take conditional models as primitive compared to unconditional probability, in the sense that conditional statements do not need to be derived from an unconditional probability. We focus on two problems: (/coherence/) giving conditions to guarantee that a conditional model is self-consistent; (/inference/) delivering methods to derive new probabilistic statements from a self-consistent conditional model. We address these problems in the case where the probabilistic statements can be specified imprecisely through sets of probabilities, while restricting the attention to finite spaces of possibilities. Using Walley's theory of /coherent lower previsions/, we fully characterise the question of coherence, and specialise it for the case of precisely specified probabilities, which is the most common case addressed in the literature. This shows that coherent conditional models are equivalent to sequences of (possibly sets of) unconditional mass functions. In turn, this implies that the inferences from a conditional model are the limits of the conditional inferences obtained by applying Bayes' rule, when possible, to the elements of the sequence. In doing so, we unveil the tight connection between conditional models and zero-probability events.
A study of Bayesian approximations of belief functions in the probability simplex
http://cms.brookes.ac.uk/staff/FabioCuzzolin/pubs.html
Fabio Cuzzolin
In this paper we provide a comprehensive study of several of the most popular Bayesian approximations of a belief function in the probability simplex. Starting from the interpretation of the pignistic function as center of mass of the simplex of consistent probabilities, we prove that a large group of approximations can be described by the notion of focus of pairs of simplices in the simplex of all probability measures.
Three alternative combinatorial formulations of the theory of evidence
http://cms.brookes.ac.uk/staff/FabioCuzzolin/pubs.html
Fabio Cuzzolin
In this paper we introduce three alternative combinatorial formulations of the theory of evidence (ToE), by proving that both plausibility and commonality functions share the structure of sum function with belief functions. We compute their Moebius inverses, which we call basic plausibility and commonality assignments. As these results are achieved in the framework of the geometric approach to uncertainty measures, the equivalence of the associated formulations of the ToE is mirrored by the geometric congruence of the related simplices. We can then describe the point-wise geometry of these sum functions in terms of rigid transformations mapping them onto each other. Combination rules can be applied to plausibility and commonality functions through their Moebius inverses, leading to interesting applications of such inverses to the probabilistic transformation problem.
The geometry of consonant belief functions: simplicial complexes of possibility measures
http://cms.brookes.ac.uk/staff/FabioCuzzolin/pubs.html
Fabio Cuzzolin
In this paper we extend the geometric approach to the theory of evidence in order to include other important finite fuzzy measures. In particular we describe the geometric counterparts of the class of possibility measures represented by consonant belief functions. The correspondence between chains of subsets and convex sets of consonant functions is studied and its properties analyzed, eventually yielding an elegant representation of the region of consonant belief functions in terms of the notion of simplicial complex. In particular we focus on outer consonant approximations, showing that they live on polytopes associated with all possible maximal chains of focal elements, which in turn form a simplicial complex in analogy with the whole consonant space.
A consonant approximation of the product of independent consonant random sets
http://sdestercke.free.fr/papers/PossAppRSI_IJUFKS_DesterckeDuboisChoj.pdf
S. Destercke, D. Dubois and E. Chojnacki
The belief structure resulting from the combination of consonant and independent marginal random sets is not, in general, consonant. Also, the complexity of such a structure grows exponentially with the number of combined random sets, making it quickly intractable for computations. In this paper, we propose a simple guaranteed consonant outer approximation of this structure. The complexity of this outer approximation only linearly increases with the number of marginal random sets (i.e., of dimensions), making it easier to handle in uncertainty propagation. Features and advantages of this outer approximation are then discussed, with the help of some illustrative examples.
Bayesian Estimation with Uncertain Parameters of Probability Density Functions
http://isas.uka.de/Publikationen/Fusion09_Klumpp-Type2Dens.pdf
Vesa Klumpp, Uwe D. Hanebeck
In this paper, we address the problem of processing imprecisely known
probability density functions by means of Bayesian estimation. The imprecise
knowledge about probability density functions is given as stochastic
uncertainty about their parameters. The proposed processing of this special
density in a Bayesian estimator is accomplished by reinterpretation of the
filter and prediction equations. Here, the parameters are treated as a higher
order state, which can be processed by Bayesian estimation techniques. For
state estimation, this avoids the need to select specific values for unknown
parameters and, thus, allows the processing of all potential parameters at
once. The proposed approach further allows the use of imprecisely known model
equations for measurement and state prediction by the same principle.
State Estimation with Sets of Densities considering Stochastic and Systematic Errors
http://isas.uka.de/Publikationen/Fusion09_Noack.pdf
Benjamin Noack, Vesa Klumpp, Uwe D. Hanebeck
In practical applications, state estimation requires the
consideration of stochastic and systematic errors. If both error types are
present, an exact probabilistic description of the state estimate is not
possible, so that common Bayesian estimators have to be questioned. This paper
introduces a theoretical concept, which allows for incorporating unknown but
bounded errors into a Bayesian inference scheme by utilizing sets of
densities. In order to derive a tractable estimator, the Kalman filter is
applied to ellipsoidal sets of means, which are used to bound additive
systematic errors. Also, an extension to nonlinear system and observation
models with ellipsoidal error bounds is presented. The derived estimator is
motivated by means of two example applications.
Nonlinear Bayesian Estimation with Convex Sets of Probability Densities
http://isas.uka.de/Publikationen/Fusion08_Noack.pdf
Benjamin Noack, Vesa Klumpp, Dietrich Brunn, Uwe D. Hanebeck
This paper presents a theoretical framework for
Bayesian estimation in the case of imprecisely known probability
density functions. The lack of knowledge about the true density
functions is represented by sets of densities. A formal Bayesian
estimator for these sets is introduced, which is intractable for
infinite sets. To obtain a tractable filter, properties of convex
sets in the form of convex polytopes of densities are investigated.
It is shown that pathwise connected sets and their convex hulls
describe the same ignorance. Thus, an exact algorithm is derived,
which only needs to process the hull, delivering tractable results
in the case of a proper parametrization. Since the estimator
delivers a convex hull of densities as output, the theoretical
grounds are laid for deriving efficient Bayesian estimators for
sets of densities. The derived filter is illustrated by means of an
example.
Reliable hidden Markov model filtering through coherent lower previsions
http://www.idsia.ch/~alessio/hmm-short_rev.pdf
Benavoli, A., Zaffalon, M., Miranda, E.
We extend Hidden Markov Models for continuous variables taking into account imprecision in our knowledge about the probabilistic relationships involved. To achieve that, we consider sets of probabilities, also called coherent lower previsions. In addition to the general formulation, we study in detail a particular case of interest: linear-vacuous mixtures. We also show, in a practical case, that our extension outperforms the Kalman filter when modelling errors are present in the system.
Multiple model tracking by imprecise Markov trees
http://www.idsia.ch/~alessandro/papers/antonucci2009e.pdf
Antonucci, A., Benavoli, A., Zaffalon, M., de Cooman, G., Hermans, F.
We present a new procedure for tracking manoeuvring objects by hidden Markov chains. It leads to more reliable modelling of the transitions between hidden states compared to similar approaches proposed within the Bayesian framework: we adopt convex sets of probability mass functions rather than single ''precise probability'' specifications, in order to provide a more realistic and cautious model of the manoeuvre dynamics. In general, the downside of such increased freedom in the modelling phase is a higher inferential complexity. However, the simple topology of hidden Markov chains allows for efficient tracking of the object through a recently developed belief propagation algorithm. Furthermore, the imprecise specification of the transitions can produce so-called indecision, meaning that more than one model may be suggested by our method as a possible explanation of the target kinematics. In summary, our approach leads to a multiple-model estimator whose performance, investigated through extensive numerical tests, turns out to be more accurate and robust than that of Bayesian ones.
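The hidden-Markov entries above share one core idea: the single transition matrix of a precise model is replaced by a convex set of transition matrices, so filtered state probabilities become intervals rather than point values. The following toy sketch (our own illustration, not the algorithm of any paper listed here; all names and numbers are made up) shows the effect for a two-state chain whose transition probabilities are only known up to intervals:

```python
from itertools import product

# Interval bounds on P(next = 0 | current = s), for s = 0, 1.
# The credal set of transition matrices is spanned by the endpoint combinations.
trans_lo = [0.6, 0.2]
trans_hi = [0.8, 0.4]

# Precise observation likelihoods P(obs | state) for one fixed observation.
likelihood = [0.9, 0.3]

def posterior_state0(prior0, p00, p10):
    """Predict one step with transition probs (p00, p10), then Bayes-update."""
    pred0 = prior0 * p00 + (1 - prior0) * p10
    joint0 = likelihood[0] * pred0
    joint1 = likelihood[1] * (1 - pred0)
    return joint0 / (joint0 + joint1)

def interval_posterior(prior0):
    """Lower/upper P(state = 0 | obs) over the extreme transition matrices.

    The posterior is monotone in the predicted probability, which is linear
    in (p00, p10), so the extremes are attained at interval endpoints.
    """
    values = [posterior_state0(prior0, p00, p10)
              for p00, p10 in product((trans_lo[0], trans_hi[0]),
                                      (trans_lo[1], trans_hi[1]))]
    return min(values), max(values)

lo, hi = interval_posterior(prior0=0.5)
print(lo, hi)  # an interval of filtered probabilities, not a single value
```

The width of the resulting interval reflects the imprecision in the transition model; when two candidate manoeuvre models both remain plausible, their posterior intervals overlap, which is the "indecision" mentioned in the abstract above.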