

A Conference in Honor of Arnold Zellner

Recent Developments in the Theory, Method, and Application of Information and Entropy Econometrics

September 19-21, 2003

Background

Information and Entropy Econometrics (IEE) is research that directly or indirectly builds on the foundations of Information Theory (IT) and the principle of Maximum Entropy (ME). IEE includes research dealing with statistical inference in (economic) problems given incomplete knowledge or data, as well as research dealing with the analysis, diagnostics, and statistical properties of information measures.

The development of ME occurred via two lines of research: Statistical Inference (Bernoulli, Bayes, Laplace, Jeffreys, Cox) and statistical modeling of problems in mechanics, physics and information (Maxwell, Boltzmann, Gibbs, Shannon). The objective of the first line of research is to formulate a theory/methodology that allows understanding of the general characteristics (distribution) of a system from partial and incomplete information. In the second line of research, this same objective is expressed as determining how to assign (initial) numerical values of probabilities when only some (theoretical) limited global quantities of the investigated system are known. Recognizing the common basic objectives of these two lines of research aided Jaynes (1957) in the development of his classic work, the Maximum Entropy (ME) formalism. The ME formalism is based on the philosophy of the first line of research and the mathematics of the second line of research.
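
In its standard form (a textbook sketch, with f_k denoting the known moment functions and \mu_k the observed global quantities), the ME formalism selects the distribution p that solves

    \max_{p}\; H(p) = -\sum_{i=1}^{n} p_i \log p_i
    \quad \text{subject to} \quad \sum_{i=1}^{n} p_i f_k(x_i) = \mu_k \;\; (k = 1, \dots, K), \qquad \sum_{i=1}^{n} p_i = 1,

whose solution takes the exponential form

    p_i^{*} = \frac{\exp\bigl(-\sum_k \lambda_k f_k(x_i)\bigr)}{\sum_j \exp\bigl(-\sum_k \lambda_k f_k(x_j)\bigr)},

where the Lagrange multipliers \lambda_k are pinned down by the K moment constraints.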

The interrelationship between Information Theory (IT), statistics and inference, and the ME principle started to become clear in the early work of Kullback and Lindley. Building on the basic concepts and properties of IT, Kullback and Leibler developed fundamental statistics such as sufficiency and efficiency, as well as a generalization of the Cramér-Rao inequality, and were thus able to unify heterogeneous statistical procedures via the concepts of IT (Kullback and Leibler 1951; Kullback 1954, 1959). Lindley (1956), on the other hand, provided the interpretation that a statistical sample could be viewed as a noisy communication channel that conveys a message about a parameter according to a prior distribution. In that way, he was able to apply Shannon's ideas to statistical theory by referring to the information in an experiment rather than in a message. Zellner (1988) established Bayes' theorem as the optimal learning rule for information processing based on logarithmic information measures.
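
For reference, the two measures underlying this line of work are Shannon's entropy and the Kullback-Leibler (KL) directed divergence between a distribution p and a reference distribution q:

    H(p) = -\sum_i p_i \log p_i, \qquad D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}.

In these terms, Zellner's result can be stated as follows: Bayes' theorem is a 100% efficient information processing rule, in that the output information (in the posterior and marginal densities) exactly equals the input information (in the prior and the likelihood).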

All of the estimation methods within IEE are based on optimizing an informational objective function subject to moment representations of the data, or to "conservation laws" representing the underlying system. This class of methods is led by the pioneering work of Zellner (1994) on the Bayesian Method of Moments (BMOM), the work on Empirical Likelihood (EL), the Generalized Method of Moments (GMM), and the Generalized ME (GME). The connection between IT and these methods became much clearer as a result of the seminal work of Imbens et al. (1998) and Kitamura and Stutzer (1997), as well as the work on the GME.
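
As a concrete illustration of this shared structure, the following sketch recovers a maximum entropy distribution from two moment constraints by minimizing the unconstrained convex dual of the ME problem above. The support points, moment functions, and target moments are invented for the example; GME, EL, and exponential-tilting estimators vary the objective or the constraints while keeping broadly the same optimization pattern.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical example: recover a discrete distribution on a fixed
    # support from two observed moments. All numbers are invented.
    x = np.linspace(0.0, 10.0, 50)      # support points x_i
    g = np.vstack([x, x**2])            # moment functions g_k(x_i), shape (K, n)
    mu = np.array([4.0, 20.0])          # target moments: E[x] = 4, E[x^2] = 20

    def dual(lam):
        # Convex dual of: max H(p) subject to g @ p = mu, sum(p) = 1,
        # i.e. minimize log sum_i exp(-lam . g(x_i)) + lam . mu over lam.
        z = -lam @ g
        zmax = z.max()                  # log-sum-exp, for numerical stability
        return zmax + np.log(np.exp(z - zmax).sum()) + lam @ mu

    lam = minimize(dual, x0=np.zeros(len(mu)), method="BFGS").x
    z = -lam @ g
    p = np.exp(z - z.max())
    p /= p.sum()                        # maximum entropy distribution p_i

    print("fitted moments:", g @ p)     # approximately equal to mu

The fitted p_i take exactly the exponential form derived above, with the estimated \lambda measuring how far the moment constraints tilt the solution away from the uniform distribution.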

These methods share the same basic objective: analyzing limited and noisy data using minimal assumptions. Because the underlying data generating process (or error margins) is uncertain or unknown, statisticians and econometricians try to avoid strong distributional assumptions or a pre-specified likelihood function. With the above in mind, and within the general objective of estimation and inference for a large class of models (linear and nonlinear, parametric and non-parametric), returning to the foundations of IT and ME was all but inevitable, and it led to the above class of information-theoretic methods. These methods can be viewed as approaches to solving ill-posed or under-determined problems: without a pre-specified likelihood or distribution, there are always more unknowns than knowns, regardless of the amount of data. That is, the problem is ill-posed because the observation matrix is irregular or ill-conditioned, or because the number of unknowns exceeds the number of data points.
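
To make the under-determination concrete with a minimal (hypothetical) example, consider the linear model

    y = X\beta + \varepsilon, \qquad y \in \mathbb{R}^{T}, \quad X \in \mathbb{R}^{T \times K}.

Whenever K > T, or X is nearly singular, infinitely many values of \beta are consistent with the data; the information-theoretic criterion (maximum entropy or minimum divergence) is precisely what singles out one solution among them.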

Used in combination, IEE methods are powerful tools for analyzing a wide variety of problems in most disciplines of science. Examples include (i) work on image reconstruction and spectral analysis in medicine, physics, chemistry, biology, topography, engineering, communication and information, operations research, political science and economics (e.g., brain scans, tomography, satellite images, search engines, political surveys, input-output reconstruction and general matrix balancing, as well as EL and GMM type methods); (ii) research in statistical inference and estimation (Bayesian and non-Bayesian methods); and (iii) ongoing innovations in information processing and IT.

Despite these significant innovations, there is still much to discuss and develop regarding the relationship connecting statistical and econometric inference and diagnostic techniques, learning, information theory, Bayesian analysis and entropy.

Conference Objectives

This conference will (i) study and explore IT solutions for linear estimation and inference problems, (ii) facilitate the exchange of research ideas in the field, (iii) promote collaboration among researchers from different disciplines, and (iv) highlight the major trends in Information and Entropy Econometrics. In particular, the conference will concentrate on the most recent (theoretical and applied) research in linear Bayesian and non-Bayesian, IT and entropic procedures, with emphasis on modeling and measuring information. In addition, the conference will deal with the interpretations and meaning of the solutions to IT estimation and inference (e.g., statistical meaning, complexity and efficiency, as well as informational meaning). Both theory and innovative applied papers will be included.

Conference Topics

Conference topics include theory, methods, and applications in various fields. Theory and methods topics include:

  • Measuring information
  • Interpretation and meaning of IT solutions to estimation and inference
  • Information in moment based estimation
  • Bayesian analysis and IT
  • Linear IT and Entropic procedures
  • Optimal information processing

Application papers from all fields, and interdisciplinary papers within and between the following fields, are particularly welcome:

  • Economics and social science
  • Business and management science
  • Natural and physical science
  • Medical and biological science

Structure

  • Paper presentations (sessions consisting of three papers and discussants)
  • Invited lectures
  • Survey-review lectures