Page 1 of results: 4179 digital items found in 0.024 seconds

Information geometric similarity measurement for near-random stochastic processes

Dodson, C.T.J.; Scharcanski, Jacob
Source: Universidade Federal do Rio Grande do Sul  Publisher: Universidade Federal do Rio Grande do Sul
Type: Journal article  Format: application/pdf
ENG
Search relevance: 55.87%
We outline the information-theoretic differential geometry of gamma distributions, which contain exponential distributions as a special case, and log-gamma distributions. Our arguments support the opinion that these distributions have a natural role in representing departures from randomness, uniformity, and Gaussian behavior in stochastic processes. We show also how the information geometry provides a surprisingly tractable Riemannian manifold and product spaces thereof, on which may be represented the evolution of a stochastic process, or the comparison of different processes, by means of well-founded maximum likelihood parameter estimation. Our model incorporates possible correlations among parameters. We discuss applications and provide some illustrations from a recent study of amino acid self-clustering in protein sequences; we provide also some results from simulations for multisymbol sequences.
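
To make the gamma-manifold picture concrete, here is a minimal numerical sketch of the Fisher information metric of the gamma family in a mean/shape parameterization of the kind used in this line of work. The parameter names and the helper function are illustrative assumptions; this is not code from the paper.

```python
# Fisher information metric of the gamma family in the (mu, kappa) parameterization,
# where mu is the mean and kappa the shape; the exponential family is the slice kappa = 1.
import numpy as np
from scipy.special import polygamma

def gamma_fisher_metric(mu: float, kappa: float) -> np.ndarray:
    """Fisher information matrix g(mu, kappa) for
    p(x; mu, kappa) = (kappa/mu)**kappa * x**(kappa-1) * exp(-kappa*x/mu) / Gamma(kappa).
    The metric is diagonal: g = diag(kappa/mu**2, psi'(kappa) - 1/kappa)."""
    trigamma = polygamma(1, kappa)            # psi'(kappa)
    return np.diag([kappa / mu**2, trigamma - 1.0 / kappa])

# Example: the metric at the exponential point (mu = 1, kappa = 1).
print(gamma_fisher_metric(1.0, 1.0))          # diag(1.0, pi^2/6 - 1) ~ diag(1.0, 0.645)
```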

Information Geometry for Landmark Shape Analysis: Unifying Shape Representation and Deformation

Peter, Adrian M.; Rangarajan, Anand
Source: PubMed  Publisher: PubMed
Type: Journal article
Published in 02/2009  EN
Search relevance: 45.9%
Shape matching plays a prominent role in the comparison of similar structures. We present a unifying framework for shape matching that uses mixture models to couple both the shape representation and deformation. The theoretical foundation is drawn from information geometry wherein information matrices are used to establish intrinsic distances between parametric densities. When a parameterized probability density function is used to represent a landmark-based shape, the modes of deformation are automatically established through the information matrix of the density. We first show that given two shapes parameterized by Gaussian mixture models (GMMs), the well-known Fisher information matrix of the mixture model is also a Riemannian metric (actually, the Fisher-Rao Riemannian metric) and can therefore be used for computing shape geodesics. The Fisher-Rao metric has the advantage of being an intrinsic metric and invariant to reparameterization. The geodesic—computed using this metric—establishes an intrinsic deformation between the shapes, thus unifying both shape representation and deformation. A fundamental drawback of the Fisher-Rao metric is that it is not available in closed form for the GMM. Consequently, shape comparisons are computationally very expensive. To address this...
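
The computational bottleneck mentioned at the end of the abstract can be illustrated with a small Monte Carlo estimate of the Fisher information of a Gaussian mixture with respect to its component means. The 1-D two-component setup, sample size, and finite-difference scheme below are assumptions for illustration, not the authors' code.

```python
# Monte Carlo estimate of the GMM Fisher information matrix (no closed form exists).
import numpy as np

def gmm_logpdf(x, means, weights, sigma=1.0):
    comps = np.exp(-0.5 * ((x[:, None] - means[None, :]) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.log(comps @ weights)

def fisher_info_means(means, weights, sigma=1.0, n=200_000, rng=None):
    """Estimate g_ij = E[d_i log p * d_j log p] with respect to the component means."""
    rng = np.random.default_rng(rng)
    labels = rng.choice(len(means), size=n, p=weights)
    x = rng.normal(means[labels], sigma)
    eps = 1e-5
    grads = np.empty((n, len(means)))
    for i in range(len(means)):                  # central finite differences in mean_i
        mp, mm = means.copy(), means.copy()
        mp[i] += eps; mm[i] -= eps
        grads[:, i] = (gmm_logpdf(x, mp, weights, sigma) - gmm_logpdf(x, mm, weights, sigma)) / (2 * eps)
    return grads.T @ grads / n

means, weights = np.array([-1.0, 2.0]), np.array([0.4, 0.6])
print(fisher_info_means(means, weights))          # approximate 2x2 information matrix
```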

Geometria da informação: métrica de Fisher; Information geometry: Fisher's metric

Julianna Pinele Santos Porto
Source: Biblioteca Digital da Unicamp  Publisher: Biblioteca Digital da Unicamp
Type: Master's thesis  Format: application/pdf
Published on 23/08/2013  PT
Search relevance: 65.99%
Information Geometry is an area of mathematics that uses geometric tools in the study of statistical models. In 1945, Rao introduced a Riemannian metric on the space of probability distributions using the information matrix given by Ronald Fisher in 1921. With the metric associated with this matrix, one defines a distance between two probability distributions (the Rao distance), geodesics, curvatures and other properties of the space. Since then, many authors have studied this subject, which is naturally connected to several applications such as statistical inference, stochastic processes, information theory and image distortion. In this work we give a brief introduction to differential and Riemannian geometry and present a collection of results obtained in the area of Information Geometry. We present the Rao distance between some probability distributions and pay special attention to the study of the distance on the space of multivariate Normal distributions. In this space, since no closed-form expression is known for either the distance or the geodesic curve, we emphasize the computation of bounds for the Rao distance. We were able to improve...
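
A case where the Rao distance is available in closed form, and which contrasts with the multivariate normal case emphasized in the dissertation, is the univariate normal family. The sketch below uses the standard hyperbolic half-plane formula; it is illustrative and not drawn from the thesis.

```python
# Closed-form Fisher-Rao (Rao) distance between univariate normal distributions.
import numpy as np

def rao_distance_normal(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao distance between N(mu1, sigma1^2) and N(mu2, sigma2^2)."""
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    den = (mu1 - mu2) ** 2 / 2.0 + (sigma1 + sigma2) ** 2
    delta = np.sqrt(num / den)
    return 2.0 * np.sqrt(2.0) * np.arctanh(delta)

print(rao_distance_normal(0.0, 1.0, 1.0, 1.0))   # distance grows with the mean shift
print(rao_distance_normal(0.0, 1.0, 0.0, 2.0))   # sqrt(2) * log(2) for a pure scale change
```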

Information geometry, dynamics and discrete quantum mechanics

Reginatto, Marcel; Hall, Michael J. W.
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Search relevance: 45.92%
We consider a system with a discrete configuration space. We show that the geometrical structures associated with such a system provide the tools necessary for a reconstruction of discrete quantum mechanics once dynamics is brought into the picture. We do this in three steps. Our starting point is information geometry, the natural geometry of the space of probability distributions. Dynamics requires additional structure. To evolve the probabilities $P^k$, we introduce coordinates $S^k$ canonically conjugate to the $P^k$ and a symplectic structure. We then seek to extend the metric structure of information geometry to define a geometry over the full space of the $P^k$ and $S^k$. Consistency between the metric tensor and the symplectic form forces us to introduce a Kähler geometry. The construction has notable features. A complex structure is obtained in a natural way. The canonical coordinates of the Kähler space are precisely the wave functions of quantum mechanics. The full group of unitary transformations is obtained. Finally, one may associate a Hilbert space with the Kähler space, which leads to the standard version of quantum theory. We also show that the metric that we derive here using purely geometrical arguments is precisely the one that leads to Wootters' expression for the statistical distance for quantum systems.; Comment: 12 pages. Presented at MaxEnt 2012...
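
The step from the canonical pair $(P^k, S^k)$ to wave functions can be made concrete with the standard identification used in this line of work; the normalization and the role of $\hbar$ shown here are an illustration, not a quotation from the paper.

```latex
% Probabilities P^k and their conjugate phases S^k combine into complex coordinates
% that play the role of the wave function (standard Madelung-type identification):
\[
  \psi^k \;=\; \sqrt{P^k}\,\exp\!\Big(\tfrac{i}{\hbar}\,S^k\Big),
  \qquad
  \sum_k P^k \;=\; \sum_k \lvert\psi^k\rvert^2 \;=\; 1 ,
\]
% in these coordinates the information metric and the symplectic form fit together
% into the flat Kähler structure of the Hilbert-space description.
```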

Information geometry and sufficient statistics

Ay, Nihat; Jost, Jürgen; Lê, Hông Vân; Schwachhöfer, Lorenz
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Search relevance: 55.87%
Information geometry provides a geometric approach to families of statistical models. The key geometric structures are the Fisher quadratic form and the Amari-Chentsov tensor. In statistics, the notion of sufficient statistic expresses the criterion for passing from one model to another without loss of information. This leads to the question of how the geometric structures behave under such sufficient statistics. While this is well studied in the finite sample size case, in the infinite case we encounter technical problems concerning the appropriate topologies. Here, we introduce notions of parametrized measure models and tensor fields on them that exhibit the right behavior under statistical transformations. Within this framework, we can then handle the topological issues and show that the Fisher metric and the Amari-Chentsov tensor on statistical models, in the class of symmetric 2-tensor fields and 3-tensor fields, can be uniquely (up to a constant) characterized by their invariance under sufficient statistics, thereby achieving a full generalization of the original result of Chentsov to infinite sample sizes. More generally, we decompose Markov morphisms between statistical models in terms of statistics. In particular, a monotonicity result for the Fisher information naturally follows.; Comment: 37 p...
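
For reference, the two geometric structures named in the abstract have the following standard coordinate expressions on a parametrized model $p(x;\xi)$; these are the usual definitions, not necessarily the paper's own notation.

```latex
\[
  g_{ij}(\xi) \;=\; \mathbb{E}_{p}\!\left[\partial_i \log p \;\partial_j \log p\right],
  \qquad
  T_{ijk}(\xi) \;=\; \mathbb{E}_{p}\!\left[\partial_i \log p \;\partial_j \log p \;\partial_k \log p\right],
\]
% the Fisher quadratic form (a symmetric 2-tensor) and the Amari-Chentsov tensor
% (a symmetric 3-tensor); the uniqueness result characterizes them, up to constants,
% by their invariance under sufficient statistics.
```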

Transversely Hessian foliations and information geometry

Boyom, Michel Nguiffo; Wolak, Robert A.
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 29/03/2015
Search relevance: 55.93%
A family of probability distributions parametrized by an open domain $\Lambda$ in $R^n$ defines the Fisher information matrix on this domain, which is positive semi-definite. In information geometry the standard assumption has been that the Fisher information matrix is positive definite, defining in this way a Riemannian metric on $\Lambda$. If we replace the "positive definite" assumption by the existence of a suitable torsion-free connection, a foliation with a transversely Hessian structure appears naturally. In the paper we develop the study of transversely Hessian foliations in view of applications in information geometry.
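
As a point of orientation (standard background, not a claim about the paper's constructions): a Hessian structure consists of a flat connection and a local potential whose second covariant derivative is the metric, the situation realized on exponential families, where the Fisher matrix is the Hessian of the log-partition function in the natural parameters.

```latex
\[
  g \;=\; \nabla\, d\varphi ,
  \qquad\text{e.g. for an exponential family}\qquad
  g_{ij}(\theta) \;=\; \frac{\partial^{2}\psi(\theta)}{\partial\theta^{i}\,\partial\theta^{j}},
\]
% with psi(theta) the log-partition (cumulant) function; when the Fisher matrix is only
% positive semi-definite, its degenerate directions naturally suggest a foliation, which is
% the situation the paper formalizes as a transversely Hessian structure.
```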

Information Geometry and Statistical Manifold

Suzuki, Mashbat
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 08/10/2014
Search relevance: 55.88%
We review basic notions in the field of information geometry, such as the Fisher metric on a statistical manifold, the $\alpha$-connection, and the corresponding curvature, following Amari's work. We show applications of information geometry to asymptotic statistical inference.
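
As a reminder of the objects reviewed (Amari's standard definition, quoted here for convenience rather than from the manuscript itself), the $\alpha$-connection on a statistical manifold with log-likelihood $\ell = \log p(x;\xi)$ has Christoffel symbols:

```latex
\[
  \Gamma^{(\alpha)}_{ij,k}(\xi)
  \;=\;
  \mathbb{E}\!\left[\Big(\partial_i\partial_j \ell \;+\; \tfrac{1-\alpha}{2}\,\partial_i \ell\,\partial_j \ell\Big)\,\partial_k \ell\right],
\]
% alpha = 1 gives the exponential connection, alpha = -1 the mixture connection,
% and alpha = 0 the Levi-Civita connection of the Fisher metric.
```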

The Information Geometry of Mirror Descent

Raskutti, Garvesh; Mukherjee, Sayan
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Search relevance: 45.9%
Information geometry applies concepts in differential geometry to probability and statistics and is especially useful for parameter estimation in exponential families where parameters are known to lie on a Riemannian manifold. Connections between the geometric properties of the induced manifold and statistical properties of the estimation problem are well established. However, developing first-order methods that scale to larger problems has been less of a focus in the information geometry community. The best known algorithm that incorporates manifold structure is the second-order natural gradient descent algorithm introduced by Amari. On the other hand, stochastic approximation methods have led to the development of first-order methods for optimizing noisy objective functions. A recent generalization of the Robbins-Monro algorithm known as mirror descent, developed by Nemirovski and Yudin, is a first-order method that induces non-Euclidean geometries. However, current analyses of mirror descent do not precisely characterize the induced non-Euclidean geometry, nor do they consider performance in terms of statistical relative efficiency. In this paper, we prove that mirror descent induced by Bregman divergences is equivalent to the natural gradient descent algorithm on the dual Riemannian manifold. Using this equivalence...
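
As a concrete instance of the setting discussed above, the following sketch runs mirror descent with the negative-entropy Bregman potential on the probability simplex, where the update is the familiar multiplicative (exponentiated-gradient) rule; this is a canonical example of a Bregman potential, and the toy objective and step size are assumptions rather than the paper's experiments.

```python
# Mirror descent on the simplex with potential phi(x) = sum_i x_i log x_i.
import numpy as np

def mirror_descent_simplex(grad, x0, steps=100, eta=0.1):
    """Mirror descent with the entropic mirror map:
    x_{t+1} proportional to x_t * exp(-eta * grad f(x_t))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))
        x /= x.sum()                      # Bregman projection back onto the simplex
    return x

# Toy objective: KL(x || target); its gradient is log(x / target) + 1.
target = np.array([0.5, 0.3, 0.2])
grad = lambda x: np.log(x / target) + 1.0
print(mirror_descent_simplex(grad, np.ones(3) / 3))   # converges towards target
```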

Information Geometry and Evolutionary Game Theory

Harper, Marc
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 09/11/2009
Search relevance: 55.89%
The Shahshahani geometry of evolutionary game theory is realized as the information geometry of the simplex, deriving from the Fisher information metric of the manifold of categorical probability distributions. Some essential concepts in evolutionary game theory are realized information-theoretically. Results are extended to the Lotka-Volterra equation and to multiple population systems.; Comment: Added references
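
A minimal sketch of the replicator dynamics on the simplex, the dynamical system whose Shahshahani/information geometry the paper studies; the payoff matrix (rock-paper-scissors), step size, and Euler discretization are illustrative assumptions.

```python
# One Euler step of the replicator equation x_i' = x_i * ((A x)_i - x^T A x).
import numpy as np

def replicator_step(x, A, dt=0.01):
    fitness = A @ x
    avg = x @ fitness
    return x + dt * x * (fitness - avg)

A = np.array([[ 0.0, -1.0,  1.0],          # rock-paper-scissors payoff matrix (assumed example)
              [ 1.0,  0.0, -1.0],
              [-1.0,  1.0,  0.0]])
x = np.array([0.5, 0.3, 0.2])
for _ in range(1000):
    x = replicator_step(x, A)
print(x, x.sum())                           # trajectory stays on the probability simplex
```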

Extension of information geometry for modelling non-statistical systems

Anthonis, Ben
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 05/01/2015
Search relevance: 45.97%
In this dissertation, an abstract formalism extending information geometry is introduced. This framework encompasses a broad range of modelling problems, including possible applications in machine learning and in the information theoretical foundations of quantum theory. Its purely geometrical foundations make no use of probability theory, and very few assumptions are made about the data or the models. Starting only from a divergence function, a Riemannian geometrical structure consisting of a metric tensor and an affine connection is constructed and its properties are investigated. The relation to information geometry, and in particular to the geometry of exponential families of probability distributions, is also elucidated. It turns out that this geometrical framework offers a straightforward way to determine whether or not a parametrised family of distributions can be written in exponential form. Apart from the main theoretical chapter, the dissertation also contains a chapter of examples illustrating the application of the formalism and its geometric properties, a brief introduction to differential geometry and a historical overview of the development of information geometry.; Comment: PhD thesis, University of Antwerp, Advisors: Prof. dr. Jan Naudts and Prof. dr. Jacques Tempere...
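
The construction sketched in the abstract, starting from a divergence alone, follows the same pattern as the standard (Eguchi) recipe, reproduced here for orientation; the notation is generic and not necessarily the thesis's own.

```latex
\[
  g_{ij}(\xi) \;=\; -\,\partial_i\,\partial'_j\, D(p_\xi \,\|\, p_{\xi'})\big|_{\xi'=\xi},
  \qquad
  \Gamma_{ij,k}(\xi) \;=\; -\,\partial_i\partial_j\,\partial'_k\, D(p_\xi \,\|\, p_{\xi'})\big|_{\xi'=\xi},
\]
% unprimed derivatives act on the first argument, primed ones on the second;
% taking D to be the Kullback-Leibler divergence recovers the Fisher metric, and the
% divergence together with its dual yields the pair of dual affine connections.
```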

Notes on information geometry and evolutionary processes

Toussaint, Marc
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 20/08/2004
Search relevance: 45.91%
In order to analyze and extract different structural properties of distributions, one can introduce different coordinate systems over the manifold of distributions. In Evolutionary Computation, the Walsh bases and the Building Block Bases are often used to describe populations, which simplifies the analysis of evolutionary operators applying on populations. Quite independently of these approaches, information geometry has been developed as a geometric way to analyze different order dependencies between random variables (e.g., neural activations or genes). In these notes I briefly review the essentials of various coordinate bases and of information geometry. The goal is to give an overview and make the approaches comparable. Besides introducing meaningful coordinate bases, information geometry also offers an explicit way to distinguish different order interactions and it offers a geometric view on the manifold and thereby also on operators that apply on the manifold. For instance, uniform crossover can be interpreted as an orthogonal projection of a population along an m-geodesic, monotonically reducing the theta-coordinates that describe interactions between genes.

Information geometry in vapour-liquid equilibrium

Brody, Dorje C.; Hook, Daniel W.
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 06/09/2008
Search relevance: 45.97%
Using the square-root map $p \mapsto \sqrt{p}$, a probability density function p can be represented as a point of the unit sphere S in the Hilbert space of square-integrable functions. If the density function depends smoothly on a set of parameters, the image of the map forms a Riemannian submanifold M in S. The metric on M induced by the ambient spherical geometry of S is the Fisher information matrix. Statistical properties of the system modelled by a parametric density function p can then be expressed in terms of information geometry. An elementary introduction to information geometry is presented, followed by a precise geometric characterisation of the family of Gaussian density functions. When the parametric density function describes the equilibrium state of a physical system, certain physical characteristics can be identified with geometric features of the associated information manifold M. Applying this idea, the properties of vapour-liquid phase transitions are elucidated in geometrical terms. For an ideal gas, phase transitions are absent and the geometry of M is flat. In this case, the solutions to the geodesic equations yield the adiabatic equations of state. For a van der Waals gas, the associated geometry of M is highly nontrivial. The scalar curvature of M diverges along the spinodal boundary which envelopes the unphysical region in the phase diagram. The curvature is thus closely related to the stability of the system.; Comment: A short survey article. 38 Pages...
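
The square-root embedding is easy to reproduce numerically. The sketch below discretizes two densities on a grid and computes the great-circle angle between their square roots, which up to a conventional factor of two is the Fisher-Rao distance; the grid and the Gaussian example are assumptions for illustration.

```python
# Spherical geodesic distance between sqrt(p) and sqrt(q) on the unit sphere in L^2.
import numpy as np

def sphere_distance(p, q, dx):
    """Great-circle angle arccos(<sqrt(p), sqrt(q)>) for discretized densities."""
    bc = np.sum(np.sqrt(p * q)) * dx            # Bhattacharyya coefficient
    return np.arccos(np.clip(bc, -1.0, 1.0))

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
gauss = lambda mu, s: np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
print(sphere_distance(gauss(0, 1), gauss(1, 1), dx))   # angle between two unit-variance Gaussians
```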

K\"ahlerian information geometry for signal processing

Choi, Jaehyung; Mullhaupt, Andrew P.
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Search relevance: 55.98%
We prove the correspondence between the information geometry of a signal filter and a Kähler manifold. The information geometry of a minimum-phase linear system with a finite complex cepstrum norm is a Kähler manifold. The square of the complex cepstrum norm of the signal filter corresponds to the Kähler potential. The Hermitian structure of the Kähler manifold is explicitly emergent if and only if the impulse response function of the highest degree in $z$ is constant in model parameters. The Kählerian information geometry takes advantage of more efficient calculation steps for the metric tensor and the Ricci tensor. Moreover, $\alpha$-generalization on the geometric tensors is linear in $\alpha$. It is also robust to find Bayesian predictive priors, such as superharmonic priors, because Laplace-Beltrami operators on Kähler manifolds are in much simpler forms than those of the non-Kähler manifolds. Several time series models are studied in the Kählerian information geometry.; Comment: 24 pages, published version
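
In standard Kähler notation (generic notation, not necessarily the paper's; the symbol for the cepstrum sequence is ours), the structural claim in the abstract is that the metric is the mixed Hessian of a potential, here identified with the squared complex cepstrum norm of the filter.

```latex
\[
  g_{i\bar{j}} \;=\; \frac{\partial^{2}\mathcal{K}}{\partial \xi^{i}\,\partial \bar{\xi}^{\,j}},
  \qquad
  \mathcal{K} \;=\; \lVert \hat{c} \rVert^{2}
  \quad\text{(squared complex cepstrum norm, per the abstract)},
\]
% this Hessian form of the metric is what makes the metric and Ricci tensors, and the
% Laplace-Beltrami operator, cheaper to handle than in the general non-Kähler case.
```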

Critical phenomena and information geometry in black hole physics

Aman, Jan E.; Pidokrajt, Narit
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 13/01/2010
Search relevance: 45.9%
We discuss the use of information geometry in black hole physics and present the outcomes. The type of information geometry we utilize in this approach is the thermodynamic (Ruppeiner) geometry defined on the state space of a given thermodynamic system in equilibrium. The Ruppeiner geometry can be used to analyze stability and critical phenomena in black hole physics, with results consistent with those from the Poincaré stability analysis for black holes and black rings. Furthermore, other physical phenomena are well encoded in the Ruppeiner metric, such as the sign of the specific heat and the extremality of the solutions. The black hole families we discuss in particular in this manuscript are the Myers-Perry black holes.; Comment: Contribution to ERE2009, 5 pages
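
For orientation, the standard definition of the Ruppeiner geometry (background material, not specific to this manuscript): the metric is the negative Hessian of the entropy over the extensive state-space variables, e.g. mass and angular momenta for the Myers-Perry family.

```latex
\[
  g^{R}_{ij} \;=\; -\,\frac{\partial^{2} S(x)}{\partial x^{i}\,\partial x^{j}},
  \qquad
  x = (M, J_{1}, \dots)\ \text{(extensive variables; the choice shown is illustrative)},
\]
% sign changes and curvature singularities of this metric are what get compared with
% specific heats and with the Poincaré stability analysis in studies of this kind.
```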

From information to quanta: A derivation of the geometric formulation of quantum theory from information geometry

Reginatto, Marcel
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 02/12/2013
Search relevance: 45.9%
It is shown that the geometry of quantum theory can be derived from geometrical structure that may be considered more fundamental. The basic elements of this reconstruction of quantum theory are the natural metric on the space of probabilities (information geometry), the description of dynamics using a Hamiltonian formalism (symplectic geometry), and requirements of consistency (Kähler geometry). The theory that results is standard quantum mechanics, but in a geometrical formulation that includes also a particular case of a family of nonlinear gauge transformations introduced by Doebner and Goldin. The analysis is carried out for the case of discrete quantum mechanics. The work presented here relies heavily on, and extends, previous work done in collaboration with M. J. W. Hall.; Comment: 18 pages. Presented at Symmetries in Science XVI, Bregenz, Austria, July 21-26, 2013

Sketching, Embedding, and Dimensionality Reduction for Information Spaces

Abdullah, Amirali; Kumar, Ravi; McGregor, Andrew; Vassilvitskii, Sergei; Venkatasubramanian, Suresh
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 17/03/2015
Search relevance: 45.91%
Information distances like the Hellinger distance and the Jensen-Shannon divergence have deep roots in information theory and machine learning. They are used extensively in data analysis, especially when the objects being compared are high-dimensional empirical probability distributions built from data. However, we lack common tools needed to actually use information distances in applications efficiently and at scale with any kind of provable guarantees. We cannot sketch these distances easily, or embed them in better behaved spaces, or even reduce the dimensionality of the space while maintaining the probability structure of the data. In this paper, we build these tools for information distances, both for the Hellinger distance and the Jensen-Shannon divergence, as well as related measures like the $\chi^2$ divergence. We first show that they can be sketched efficiently (i.e., up to multiplicative error in sublinear space) in the aggregate streaming model. This result is exponentially stronger than known upper bounds for sketching these distances in the strict turnstile streaming model. Second, we show a finite dimensionality embedding result for the Jensen-Shannon and $\chi^2$ divergences that preserves pairwise distances. Finally we prove a dimensionality reduction result for the Hellinger...
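
For readers who want the definitions at hand, the snippet below implements the two information distances named in the abstract for discrete distributions; these are the standard formulas, not the paper's sketching or embedding algorithms.

```python
# Hellinger distance and Jensen-Shannon divergence for discrete distributions.
import numpy as np

def hellinger(p, q):
    """H(p, q) = (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2."""
    return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2.0)

def js_divergence(p, q):
    """JS(p, q) = (KL(p||m) + KL(q||m)) / 2 with m = (p + q) / 2."""
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(np.where(a > 0, a * np.log(a / b), 0.0))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
print(hellinger(p, q), js_divergence(p, q))
```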

A unified framework for information integration based on information geometry

Oizumi, Masafumi; Tsuchiya, Naotsugu; Amari, Shun-ichi
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 15/10/2015
Search relevance: 55.86%
We propose a unified theoretical framework for quantifying spatio-temporal interactions in a stochastic dynamical system based on information geometry. In the proposed framework, the degree of interactions is quantified by the divergence between the actual probability distribution of the system and a constrained probability distribution where the interactions of interest are disconnected. This framework provides novel geometric interpretations of various information theoretic measures of interactions, such as mutual information, transfer entropy, and stochastic interaction in terms of how interactions are disconnected. The framework therefore provides an intuitive understanding of the relationships between the various quantities. By extending the concept of transfer entropy, we propose a novel measure of integrated information which measures causal interactions between parts of a system. Integrated information quantifies the extent to which the whole is more than the sum of the parts and can be potentially used as a biological measure of the levels of consciousness.
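
The simplest instance of this framework is the case where the "disconnected" model is the product of the marginals, in which the divergence reduces to ordinary mutual information; the toy joint distribution below is an assumed example, not data from the paper.

```python
# Mutual information as the KL divergence from the joint to the product of the marginals.
import numpy as np

def mutual_information(joint):
    """I(X;Y) = KL( p(x,y) || p(x) p(y) ) for a discrete joint distribution."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    indep = px @ py
    mask = joint > 0
    return np.sum(joint[mask] * np.log(joint[mask] / indep[mask]))

joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])            # correlated binary variables (assumed example)
print(mutual_information(joint))          # > 0, vanishes iff X and Y are independent
```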

Mapping the Region of Entropic Vectors with Support Enumeration & Information Geometry

Liu, Yunshu; Walsh, John MacLaren
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 10/12/2015
Search relevance: 55.78%
The region of entropic vectors is a convex cone that has been shown to be at the core of many fundamental limits for problems in multiterminal data compression, network coding, and multimedia transmission. This cone has been shown to be non-polyhedral for four or more random variables; however, its boundary remains unknown for four or more discrete random variables. Methods for specifying probability distributions that lie in faces and on the boundary of the convex cone are derived, and then utilized to map optimized inner bounds to the unknown part of the entropy region. The first method utilizes tools and algorithms from abstract algebra to efficiently determine those supports for the joint probability mass functions for four or more random variables that can, for some appropriate set of non-zero probabilities, yield entropic vectors in the gap between the best known inner and outer bounds. These supports are utilized, together with numerical optimization over non-zero probabilities, to provide inner bounds to the unknown part of the entropy region. Next, information geometry is utilized to parameterize and study the structure of probability distributions on these supports yielding entropic vectors in the faces of the entropy region and in its unknown part.
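
The basic object of study is easy to compute for any explicit distribution: the entropic vector collects the joint entropies of all non-empty subsets of the variables. The helper below (an illustration, not the authors' optimization code) evaluates it for an assumed three-variable XOR example.

```python
# Entropic vector of a discrete joint distribution given as an n-dimensional array.
import itertools
import numpy as np

def entropic_vector(joint):
    """Return {subset_of_variable_indices: H(X_subset)} in bits."""
    n = joint.ndim
    vec = {}
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            axes = tuple(i for i in range(n) if i not in subset)
            marginal = joint.sum(axis=axes) if axes else joint
            p = marginal[marginal > 0]
            vec[subset] = float(-np.sum(p * np.log2(p)))
    return vec

# Example: three binary variables with X3 = XOR(X1, X2), X1 and X2 fair and independent.
joint = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[x1, x2, x1 ^ x2] = 0.25
print(entropic_vector(joint))    # single entropies are 1 bit; pairs and the full triple are 2 bits
```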

Loop Calculus for Non-Binary Alphabets using Concepts from Information Geometry

Mori, Ryuhei
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Search relevance: 55.78%
The Bethe approximation is a well-known approximation of the partition function used in statistical physics. Recently, an equality relating the partition function and its Bethe approximation was obtained for graphical models with binary variables by Chertkov and Chernyak. In this equality, the multiplicative error in the Bethe approximation is represented as a weighted sum over all generalized loops in the graphical model. In this paper, the equality is generalized to graphical models with non-binary alphabets using concepts from information geometry.; Comment: 18 pages, 4 figures, submitted to IEEE Trans. Inf. Theory
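
Schematically, and in generic notation rather than the paper's own, the Chertkov-Chernyak equality referenced in the abstract expresses the exact partition function as the Bethe approximation times a correction summed over generalized loops.

```latex
\[
  Z \;=\; Z_{\mathrm{Bethe}}\;\Big( 1 \;+\; \sum_{C \,\in\, \text{generalized loops}} r_{C} \Big),
\]
% each weight r_C is computed from the beliefs at a fixed point of belief propagation;
% the paper extends this decomposition from binary to non-binary alphabets using
% information-geometric coordinates.
```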

Information geometry and the hydrodynamical formulation of quantum mechanics

Molitor, Mathieu
Source: Universidade Cornell  Publisher: Universidade Cornell
Type: Journal article
Published on 03/04/2012
Search relevance: 55.83%
Let (M,g) be a compact, connected and oriented Riemannian manifold. We denote by D the space of smooth probability density functions on M. In this paper, we show that the Fréchet manifold D is equipped with a Riemannian metric g^{D} and an affine connection \nabla^{D} which are infinite dimensional analogues of the Fisher metric and exponential connection in the context of information geometry. More precisely, we use Dombrowski's construction together with the couple (g^{D},\nabla^{D}) to get a (non-integrable) almost Hermitian structure on D, and we show that the corresponding fundamental 2-form is a symplectic form from which it is possible to recover the usual Schrödinger equation for a quantum particle living in M. These results echo a recent paper of the author where it is stressed that the Fisher metric and exponential connection are related (via Dombrowski's construction) to Kähler geometry and quantum mechanics in finite dimension.