DISCLAIMER: THE MATERIALS ON THIS WEB SITE (INCLUDING, WITHOUT LIMITATION, ALL SOFTWARE) ARE PROVIDED "AS IS" AND WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR IMPLIED. THE RESOURCE PAGE IS MANAGED BY XIMING WU (xwu@ag.tamu.edu).
Books on maximum entropy and information-theoretic econometrics and methods
A Family of Empirical Likelihood Functions and Estimators for the Binary Response Model by Mittelhammer and Judge, 2011.
An Information Theoretic Approach to Econometrics by Judge and Mittelhammer, 2011.
Elements of Information Theory by Cover and Thomas, 2nd Ed., 2006.
Maximum Entropy Econometrics: Robust Estimation with Limited Data by Golan, Judge, and Miller, 1996.
Information and Entropy Econometrics: A Review and Synthesis by Golan.
Information Theory and Statistics: A Tutorial by Csiszar.
Maximum Entropy Models in Science and Engineering by Kapur, 2006.
Entropy Optimization Principles with Applications by Kapur and Kesavan, 1992.
Entropy and Information Theory by Gray, 1990.
Entropy, Large Deviations, and Statistical Mechanics by Ellis, 1985.
Handbook of Empirical Economics and Finance (Editors: Aman Ullah and David E. A. Giles; CRC Press)
Below is an incomplete list of articles related to information-theoretic and entropy methods, with a focus on econometrics and statistics. Other general sources include IEEE Transactions on Information Theory and Entropy: An Online Journal.
Business, Economics and Finance

Abbas, A. E. (2006), “Maximum Entropy Utility,” Operations Research, 54, 277–290.

Abdel-Khalik, A. R. (1974), “The Entropy Law, Accounting Data, and Relevance to Decision-Making,” The Accounting Review, 49, 271–283.

Acar, W., and Sankaran, K. (1999), “The Myth of the Unique Decomposability: Specializing the Herfindahl and Entropy Measures?,” Strategic Management Journal, 20, 969–975.

Aharon, B.-T. (1985), “The Entropic Penalty Approach to Stochastic Programming,” Mathematics of Operations Research, 10, 263–279.

Arizono, I., Cui, Y., and Ohta, H. (1991), “An Analysis of M/M/S Queueing Systems Based on the Maximum Entropy Principle,” The Journal of the Operational Research Society, 42, 69–73.

Badran, Y. (1987), “Optimal Grouping of State Spaces of Discrete Stochastic Systems,” The Journal of the Operational Research Society, 38, 1031–1037.

Bearse, P. M., Bozdogan, H., and Schlottmann, A. M. (1997), “Empirical Econometric Modelling of Food Consumption Using a New Informational Complexity Approach,” Journal of Applied Econometrics, 12, 563–586.

Beatty, T. K. M. (2007), “Recovering the Shadow Value of Nutrients,” American Journal of Agricultural Economics, 89, 52–62.

Ben-Tal, A., and Teboulle, M. (1987), “Penalty Functions and Duality in Stochastic Programming Via F-Divergence Functionals,” Mathematics of Operations Research, 12, 224–240.

Blackman, A. (2001), “Why Don’t Lenders Finance High-Return Technological Change in Developing-Country Agriculture?,” American Journal of Agricultural Economics, 83, 1024–1035.

Block, W., and Walker, M. (1988), “Entropy in the Canadian Economics Profession: Sampling Consensus on the Major Issues,” Canadian Public Policy / Analyse de Politiques, 14, 137–150.

Boer, P. T. d., Kroese, D. P., and Rubinstein, R. Y. (2004), “A Fast Cross-Entropy Method for Estimating Buffer Overflows in Queueing Networks,” Management Science, 50, 883–895.

Borwein, J. M., Lewis, A. S., and Noll, D. (1996), “Maximum Entropy Reconstruction Using Derivative Information, Part 1: Fisher Information and Convex Duality,” Mathematics of Operations Research, 21, 442–468.

Boulding, K. E. (1973), “The Economics of Energy,” Annals of the American Academy of Political and Social Science, 410, 120–126.

Brock, W. A., and Baek, E. G. (1991), “Some Theory of Statistical Inference for Nonlinear Science,” The Review of Economic Studies, 58, 697–716.

Brockett, P., and Charnes, A. (1991), “Information Theoretic Approach to Geometric Programming,” Mathematics of Operations Research, 16, 888–889.

Brotchie, J. F. (1978), “A New Approach to Urban Modelling,” Management Science, 24, 1753–1758.

Buchen, P. W., and Kelly, M. (1996), “The Maximum Entropy Distribution of an Asset Inferred from Option Prices,” The Journal of Financial and Quantitative Analysis, 31, 143–159.

Bullock, D. S., Ruffo, M. L., Bullock, D. G., and Bollero, G. A. (2009), “The Value of Variable Rate Technology: An Information-Theoretic Approach,” American Journal of Agricultural Economics, 91, 209–223.

Callen, J. L., Kwan, C. C. Y., and Yip, P. C. Y. (1985), “Foreign-Exchange Rate Dynamics: An Empirical Study Using Maximum Entropy Spectral Analysis,” Journal of Business & Economic Statistics, 3, 149–155.

Chan, M. M. W. (1971), “System Simulation and Maximum Entropy,” Operations Research, 19, 1751–1753.

Charnes, A., Cooper, W. W., and Learner, D. B. (1978), “Constrained Information Theoretic Characterizations in Consumer Purchase Behaviour,” The Journal of the Operational Research Society, 29, 833–842.

Charnes, A., Cooper, W. W., Learner, D. B., and Phillips, F. Y. (1984), “An MDI Model and an Algorithm for Composite Hypotheses Testing and Estimation in Marketing,” Marketing Science, 3, 55–72.

Chase, R. (1980), “Structural-Functional Dynamics in the Analysis of Socio-Economic Systems: Adaptation of Structural Change Processes to Biological Systems of Human Interaction,” American Journal of Economics and Sociology, 39, 49–64.

Chemmanur, T. J. (1993), “The Pricing of Initial Public Offerings: A Dynamic Model with Information Production,” The Journal of Finance, 48, 285–304.

Converse, A. O. (1967), “The Use of Uncertainty in a Simultaneous Search,” Operations Research, 15, 1088–1095.

Cozzolino, J. M., and Zahner, M. J. (1973), “The Maximum-Entropy Distribution of the Future Market Price of a Stock,” Operations Research, 21, 1200–1211.

Csikszentmihalyi, M. (2000), “The Costs and Benefits of Consuming,” The Journal of Consumer Research, 27, 267–272.

Davis, R., and Thomas, L. G. (1993), “Direct Estimation of Synergy: A New Approach to the Diversity-Performance Debate,” Management Science, 39, 1334–1346.

Dinkel, J. J., and Kochenberger, G. A. (1979), “Constrained Entropy Models: Solvability and Sensitivity,” Management Science, 25, 555–564.

Drechsler, F. S. (1970), “The Concept of Entropy,” Operational Research Quarterly (1970–1977), 21, 476–477.

Eisner, H. (1962), “A Generalized Network Approach to the Planning and Scheduling of a Research Project,” Operations Research, 10, 115–125.

Elton, A. D., and Madan, D. B. (2005), “An Empirical Examination of the Variance-Gamma Model for Foreign Currency Options,” The Journal of Business, 78, 2121–2152.

Filar, J. A., and Krass, D. (1994), “Hamiltonian Cycles and Markov Chains,” Mathematics of Operations Research, 19, 223–237.

Fisk, C., and Brown, G. R. (1975), “A Note on the Entropy Formulation of Distribution Models,” Operational Research Quarterly (1970–1977), 26, 755–758.

Frank, M., and Stengos, T. (1989), “Measuring the Strangeness of Gold and Silver Rates of Return,” The Review of Economic Studies, 56, 553–567.

Freund, D., and Saxena, U. (1984), “An Algorithm for a Class of Discrete Maximum Entropy Problems,” Operations Research, 32, 210–215.

Gaile, G. L. (1977), “Effiquity: A Comparison of a Measure of Efficiency with an Entropic Measure of the Equality of Discrete Spatial Distributions,” Economic Geography, 53, 265–282.

Galli, E.-P., and Legros, D. (2007), “Spatial Spillovers in France: A Study on Individual Count Data at the City Level,” Annales d’Économie et de Statistique, 221–246.

Garber, T., Goldenberg, J., Libai, B., and Muller, E. (2004), “From Density to Destiny: Using Spatial Dimension of Sales Data for Early Prediction of New Product Success,” Marketing Science, 23, 419–428.

Garrison, C. B. (1974), “Industrial Growth in the Tennessee Valley Region, 1959 to 1968,” American Journal of Agricultural Economics, 56, 50–60.

Garrison, C. B., and Paulson, A. S. (1973), “An Entropy Measure of the Geographic Concentration of Economic Activity,” Economic Geography, 49, 319–324.

Gerchak, Y. (1981), “Maximal Entropy of Markov Chains with Common Steady-State Probabilities,” The Journal of the Operational Research Society, 32, 233–234.

Gersch, W., and Kitagawa, G. (1983), “The Prediction of Time Series with Trends and Seasonalities,” Journal of Business & Economic Statistics, 1, 253–264.

Gertsbakh, I., and Stern, H. I. (1978), “Minimal Resources for Fixed and Variable Job Schedules,” Operations Research, 26, 68–85.

Ghaoui, L. E., Oks, M., and Oustry, F. (2003), “Worst-Case Value-at-Risk and Robust Portfolio Optimization: A Conic Programming Approach,” Operations Research, 51, 543–556.

Golan, A., Judge, G., and Perloff, J. M. (1996), “Estimating the Size Distribution of Firms Using Government Summary Statistics,” The Journal of Industrial Economics, 44, 69–80.

Golan, A., Judge, G., and Robinson, S. (1994), “Recovering Information from Incomplete or Partial Multisectoral Economic Data,” The Review of Economics and Statistics, 76, 541–549.

Golan, A., Karp, L. S., and Perloff, J. M. (2000), “Estimating Coke’s and Pepsi’s Price and Advertising Strategies,” Journal of Business & Economic Statistics, 18, 398–409.

Golan, A., Perloff, J. M., and Shen, E. Z. (2001), “Estimating a Demand System with Nonnegativity Constraints: Mexican Meat Demand,” The Review of Economics and Statistics, 83, 541–550.

Guiasu, S. (1986), “Maximum Entropy Condition in Queueing Theory,” The Journal of the Operational Research Society, 37, 293–301.

Guiasu, S. (1987), “Maximum Entropy Condition in Queueing Theory: Response,” The Journal of the Operational Research Society, 38, 98–100.

Hall, E. H., Jr., and Caron, H. S. J. (1994), “A Methodological Note on Diversity Measurement,” Strategic Management Journal, 15, 153–168.

Handlarski, J. (1980), “Mathematical Analysis of Preventive Maintenance Schemes,” The Journal of the Operational Research Society, 31, 227–237.

Hansen, S. (1972), “Utility, Accessibility and Entropy in Spatial Modelling,” The Swedish Journal of Economics, 74, 35–44.

Hansen, S. (1975), “Utility, Accessibility and Entropy in Spatial Modelling. Reply,” The Swedish Journal of Economics, 77, 502–503.

Hauser, J. R. (1978), “Testing the Accuracy, Usefulness, and Significance of Probabilistic Choice Models: An Information-Theoretic Approach,” Operations Research, 26, 406–421.

Haynes, K. E., and Enders, W. T. (1975), “Distance, Direction, and Entropy in the Evolution of a Settlement Pattern,” Economic Geography, 51, 357–365.

Heltberg, R., Arndt, T. C., and Sekhar, N. U. (2000), “Fuelwood Consumption and Forest Degradation: A Household Model for Domestic Energy Substitution in Rural India,” Land Economics, 76, 213–232.

Herer, Y. T., and Raz, T. (2000), “Optimal Parallel Inspection for Finding the First Nonconforming Unit in a Batch: An Information Theoretic Approach,” Management Science, 46, 845–857.

Herniter, J. D. (1973), “An Entropy Model of Brand Purchase Behavior,” Journal of Marketing Research, 10, 361–375.

Herniter, J. D. (1974), “A Comparison of the Entropy Model and the Hendry Model,” Journal of Marketing Research, 11, 21–29.

Herpen, E. v., and Pieters, R. (2002), “The Variety of an Assortment: An Extension to the Attribute-Based Approach,” Marketing Science, 21, 331–341.

Hexter, J. L., and Snow, J. W. (1970), “An Entropy Measure of Relative Aggregate Concentration,” Southern Economic Journal, 36, 239–243.

Hexter, J. L., and Snow, J. W. (1971), “An Entropy Measure of Relative Aggregate Concentration: Reply,” Southern Economic Journal, 38, 112–114.

Hirschberg, J. G., Maasoumi, E., and Slottje, D. J. (2001), “Clusters of Attributes and Well-Being in the USA,” Journal of Applied Econometrics, 16, 445–460.

Holloway, G., and Paris, Q. (2002), “Production Efficiency in the Von Liebig Model,” American Journal of Agricultural Economics, 84, 1271–1278.

Holm, J. (1991), “The Distribution of Income: A Saddle Point Formulation,” The Scandinavian Journal of Economics, 93, 545–554.

Hong, H., Preston, B., and Shum, M. (2003), “Generalized Empirical Likelihood-Based Model Selection Criteria for Moment Condition Models,” Econometric Theory, 19, 923–943.

Hong, Y., and White, H. (2005), “Asymptotic Distribution Theory for Nonparametric Entropy Measures of Serial Dependence,” Econometrica, 73, 837–901.

Horowitz, A., and Horowitz, I. (1968), “Entropy, Markov Processes and Competition in the Brewing Industry,” The Journal of Industrial Economics, 16, 196–211.

Hoskisson, R. E., Hitt, M. A., Johnson, R. A., and Moesel, D. D. (1993), “Construct Validity of an Objective (Entropy) Categorical Measure of Diversification Strategy,” Strategic Management Journal, 14, 215–235.

Hu, J., Fu, M. C., and Marcus, S. I. (2007), “A Model Reference Adaptive Search Method for Global Optimization,” Operations Research, 55, 549–568.

Huffman, G. W. (1992), “Information, Asset Prices, and the Volume of Trade,” The Journal of Finance, 47, 1575–1590.

Hutchens, R. (2004), “One Measure of Segregation,” International Economic Review, 45, 555–578.

Hutton, B., and Schmidt, C. P. (1988), “Sensitivity Analysis of Additive Multiattribute Value Models,” Operations Research, 36, 122–127.

Imbens, G. W. (1997), “One-Step Estimators for Over-Identified Generalized Method of Moments Models,” The Review of Economic Studies, 64, 359–383.

Imbens, G. W. (2002), “Generalized Method of Moments and Empirical Likelihood,” Journal of Business & Economic Statistics, 20, 493–506.

Imbens, G. W., Spady, R. H., and Johnson, P. (1998), “Information Theoretic Approaches to Inference in Moment Condition Models,” Econometrica, 66, 333–357.

Iusem, A. N., Svaiter, B. F., and Teboulle, M. (1994), “Entropy-Like Proximal Methods in Convex Programming,” Mathematics of Operations Research, 19, 790–814.

Jackson, R. W., Hewings, G. J. D., and Sonis, M. (1989), “Decomposition Approaches to the Identification of Change in Regional Economies,” Economic Geography, 65, 216–231.

Jacquemin, A. P., and Berry, C. H. (1979), “Entropy Measure of Diversification and Corporate Growth,” The Journal of Industrial Economics, 27, 359–369.

Jacquemin, A. P., and Kumps, A.-M. (1971), “Changes in the Size Structure of the Largest European Firms: An Entropy Measure,” The Journal of Industrial Economics, 20, 59–70.

Jarrett, D. (1981), “Comments on ‘Maximal Entropy of Markov Chains with Common Steady-State Probabilities’,” The Journal of the Operational Research Society, 32, 1045–1046.

Jessop, A. (2002), “Prioritisation of an IT Budget within a Local Authority,” The Journal of the Operational Research Society, 53, 36–46.

Jon, L. (1998), “Constrained Maximum-Entropy Sampling,” Operations Research, 46, 655–664.

Jon, S. (1987), “The Relationship between Wages and Firm Size: An Information Theoretic Analysis,” International Economic Review, 28, 51–68.

Kahn, B. E., and Wansink, B. (2004), “The Influence of Assortment Structure on Perceived Variety and Consumption Quantities,” The Journal of Consumer Research, 30, 519–533.

Kalwani, M. U., and Morrison, D. G. (1977), “A Parsimonious Description of the Hendry System,” Management Science, 23, 467–477.

Kim, W. C. (1989), “Developing a Global Diversification Measure,” Management Science, 35, 376–383.

Kitamura, Y., and Stutzer, M. (1997), “An Information-Theoretic Alternative to Generalized Method of Moments Estimation,” Econometrica, 65, 861–874.

Ko, C.-W., Jon, L., and Queyranne, M. (1995), “An Exact Algorithm for Maximum Entropy Sampling,” Operations Research, 43, 684–691.

Koenigsberg, E. (1987), “Maximum Entropy Condition in Queueing Theory,” The Journal of the Operational Research Society, 38, 97–98.

Kotiah, T. C. T., and Wallace, N. D. (1973), “Another Look at the PERT Assumptions,” Management Science, 20, 44–49.

Kottke, F. J. (1971), “An Entropy Measure of Relative Aggregate Concentration: Comment,” Southern Economic Journal, 38, 109–112.

Kouvatsos, D. D. (1988), “A Maximum Entropy Analysis of the G/G/1 Queue at Equilibrium,” The Journal of the Operational Research Society, 39, 183–200.

Kouvatsos, D. D., and Othman, A. T. (1989), “Optimal Flow Control of a G/G/C Finite Capacity Queue,” The Journal of the Operational Research Society, 40, 659–670.

Krzysztof, C. K. (1997), “Free-Steering Relaxation Methods for Problems with Strictly Convex Costs and Linear Constraints,” Mathematics of Operations Research, 22, 326–349.

Kurkalova, L. A., and Carriquiry, A. (2002), “An Analysis of Grain Production Decline During the Early Transition in Ukraine: A Bayesian Inference,” American Journal of Agricultural Economics, 84, 1256–1263.

Leblebici, H., and Salancik, G. R. (1981), “Effects of Environmental Uncertainty on Information and Decision Processes in Banks,” Administrative Science Quarterly, 26, 578–596.

Leibenstein, H. (1975), “Aspects of the X-Efficiency Theory of the Firm,” The Bell Journal of Economics, 6, 580–606.

Lence, S. H., and Miller, D. J. (1998), “Recovering Output-Specific Inputs from Aggregate Input Data: A Generalized Cross-Entropy Approach,” American Journal of Agricultural Economics, 80, 852–867.

Lev, B., and Theil, H. (1978), “A Maximum Entropy Approach to the Choice of Asset Depreciation,” Journal of Accounting Research, 16, 286–293.

Liang, T.-P. (1992), “A Composite Approach to Inducing Knowledge for Expert Systems Design,” Management Science, 38, 1–17.

Lim, A. E. B., and Shanthikumar, J. G. (2007), “Relative Entropy, Exponential Utility, and Robust Dynamic Pricing,” Operations Research, 55, 198–214.

Lucas, D. J., and McDonald, R. L. (1990), “Equity Issues and Stock Price Dynamics,” The Journal of Finance, 45, 1019–1043.

Maasoumi, E., and Trede, M. (2001), “Comparing Income Mobility in Germany and the United States Using Generalized Entropy Mobility Measures,” The Review of Economics and Statistics, 83, 551–559.

Mayfield, E. S., and Mizrach, B. (1992), “On Determining the Dimension of Real-Time Stock-Price Data,” Journal of Business & Economic Statistics, 10, 367–374.

McClean, S. (1986), “Extending the Entropy Stability Measure for Manpower Planning,” The Journal of the Operational Research Society, 37, 1133–1138.

McClean, S., and Abodunde, T. (1978), “Entropy as a Measure of Stability in a Manpower System,” The Journal of the Operational Research Society, 29, 885–889.

McCombie, J. S. L. (1975), “Utility, Accessibility and Entropy in Spatial Modelling: A Comment,” The Swedish Journal of Economics, 77, 497–501.

Medvedkov, Y. (1970), “Entropy: An Assessment of Potentialities in Geography,” Economic Geography, 46, 306–316.

Miller, D. J. (2002), “Entropy-Based Methods of Modeling Stochastic Production Efficiency,” American Journal of Agricultural Economics, 84, 1264–1270.

Miller, D. J., and Plantinga, A. J. (1999), “Modeling Land Use Decisions with Aggregate Data,” American Journal of Agricultural Economics, 81, 180–194.

Miller, R. A. (1972), “Numbers Equivalents, Relative Entropy, and Concentration Ratios: A Comparison Using Market Performance,” Southern Economic Journal, 39, 107–112.

Mills, J. A., and Zandvakili, S. (1997), “Statistical Inference Via Bootstrapping for Measures of Inequality,” Journal of Applied Econometrics, 12, 133–150.

Myung, I. J., Ramamoorti, S., and Bailey, A. D., Jr. (1996), “Maximum Entropy Aggregation of Expert Predictions,” Management Science, 42, 1420–1436.

Nayak, T. K., and Gastwirth, J. L. (1989), “The Use of Diversity Analysis to Assess the Relative Influence of Factors Affecting the Income Distribution,” Journal of Business & Economic Statistics, 7, 453–460.

Ng, L. F.-Y. (1995), “Changing Industrial Structure and Competitive Patterns of Manufacturing and Non-Manufacturing in a Small Open Economy: An Entropy Measurement,” Managerial and Decision Economics, 16, 547–563.

Nilim, A., and Ghaoui, L. E. (2005), “Robust Control of Markov Decision Processes with Uncertain Transition Matrices,” Operations Research, 53, 780–798.

Ordentlich, E., and Cover, T. M. (1998), “The Cost of Achieving the Best Portfolio in Hindsight,” Mathematics of Operations Research, 23, 960–982.

Palepu, K. (1985), “Diversification Strategy, Profit Performance and the Entropy Measure,” Strategic Management Journal, 6, 239–255.

Paris, Q. (2001), “Symmetric Positive Equilibrium Problem: A Framework for Rationalizing Economic Behavior with Limited Information,” American Journal of Agricultural Economics, 83, 1049–1061.

Paris, Q. (2002), “An Analysis of Ill-Posed Production Problems Using Maximum Entropy: Reply,” American Journal of Agricultural Economics, 84, 247.

Paris, Q., and Howitt, R. E. (1998), “An Analysis of Ill-Posed Production Problems Using Maximum Entropy,” American Journal of Agricultural Economics, 80, 124–138.

Parisi, F. (2003), “Freedom of Contract and the Laws of Entropy,” Supreme Court Economic Review, 10, 65–90.

Perakis, G., and Roels, G. (2008), “Regret in the Newsvendor Model with Partial Information,” Operations Research, 56, 188–203.

Perry, W. L., and Moffat, J. (1997), “Measuring the Effects of Knowledge in Military Campaigns,” The Journal of the Operational Research Society, 48, 965–972.

Philippatos, G. C., and Gressis, N. (1975), “Conditions of Equivalence among E-V, SSD, and E-H Portfolio Selection Criteria: The Case for Uniform, Normal and Lognormal Distributions,” Management Science, 21, 617–625.

Pieters, R., Wedel, M., and Zhang, J. (2007), “Optimal Feature Advertising Design under Competitive Clutter,” Management Science, 53, 1815–1828.

Preckel, P. V. (2001), “Least Squares and Entropy: A Penalty Function Perspective,” American Journal of Agricultural Economics, 83, 366–377.

Preckel, P. V. (2002), “An Analysis of Ill-Posed Production Problems Using Maximum Entropy: Comment,” American Journal of Agricultural Economics, 84, 245–246.

Pulliainen, K. (1970), “Entropy Measures for International Trade,” The Swedish Journal of Economics, 72, 40–53.

Pye, R. (1978), “A Formal, Decision-Theoretic Approach to Flexibility and Robustness,” The Journal of the Operational Research Society, 29, 215–227.

Rachev, S. T., and Römisch, W. (2002), “Quantitative Stability in Stochastic Programming: The Method of Probability Metrics,” Mathematics of Operations Research, 27, 792–818.

Robertson, J. C., Tallman, E. W., and Whiteman, C. H. (2005), “Forecasting Using Relative Entropy,” Journal of Money, Credit and Banking, 37, 383–401.

Robins, J. A., and Margarethe, F. W. (2003), “The Measurement of Corporate Portfolio Strategy: Analysis of the Content Validity of Related Diversification Indexes,” Strategic Management Journal, 24, 39–59.

Robinson, P. M. (1991), “Consistent Nonparametric Entropy-Based Testing,” The Review of Economic Studies, 58, 437–453.

Rodrigues, F. C. (1989), “A Proposed Entropy Measure for Assessing Combat Degradation,” The Journal of the Operational Research Society, 40, 789–793.

Ronen, J., and Gideon, F. (1973), “Accounting Aggregation and the Entropy Measure: An Experimental Approach,” The Accounting Review, 48, 696–717.

Ryu, H. K., and Slottje, D. J. (1994), “Coordinate Space Versus Index Space Representations as Estimation Methods: An Application to How Macro Activity Affects the U.S. Income Distribution,” Journal of Business & Economic Statistics, 12, 243–251.

Sampson, A. R., and Smith, R. L. (1982), “Assessing Risks through the Determination of Rare Event Probabilities,” Operations Research, 30, 839–866.

Saviotti, P. P., and Pyka, A. (2004), “Economic Development, Variety and Employment,” Revue économique, 55, 1023–1049.

Semple, R. K., and Golledge, R. G. (1970), “An Analysis of Entropy Changes in a Settlement Pattern over Time,” Economic Geography, 46, 157–160.

Shindo, E., and McCormack, G. (1985), “Hunger and Weapons: The Entropy of Militarisation,” Review of African Political Economy, 6–22.

Shorrocks, A. F. (1980), “The Class of Additively Decomposable Inequality Measures,” Econometrica, 48, 613–625.

Soofi, E. S. (1990), “Generalized Entropy-Based Weights for Multiattribute Value Models,” Operations Research, 38, 362–363.

Stewart, J. F. (1979), “The Beta Distribution as a Model of Behavior in Consumer Goods Markets,” Management Science, 25, 813–821.

Stutzer, M. (1996), “A Simple Nonparametric Approach to Derivative Security Valuation,” The Journal of Finance, 51, 1633–1652.

Suppes, P. (1961), “Behavioristic Foundations of Utility,” Econometrica, 29, 186–202.

Tanyimboh, T. T., and Templeman, A. B. (1993), “Calculating Maximum Entropy Flows in Networks,” The Journal of the Operational Research Society, 44, 383–396.

Tavana, M. (2006), “A Priority Assessment Multi-Criteria Decision Model for Human Spaceflight Mission Planning at NASA,” The Journal of the Operational Research Society, 57, 1197–1215.

Teboulle, M. (1992), “Entropic Proximal Mappings with Applications to Nonlinear Programming,” Mathematics of Operations Research, 17, 670–690.

Thomas, M. U. (1979), “A Generalized Maximum Entropy Principle,” Operations Research, 27, 1188–1196.

Tseng, P. (2004), “An Analysis of the EM Algorithm and Entropy-Like Proximal Point Methods,” Mathematics of Operations Research, 29, 27–44.

Tummala, V. M. R., and Ling, H. (2000), “A Note on the Sampling Distribution of the Information Content of the Priority Vector of a Consistent Pairwise Comparison Judgment Matrix of AHP,” The Journal of the Operational Research Society, 51, 237–240.

Vachani, S. (1991), “Distinguishing between Related and Unrelated International Geographic Diversification: A Comprehensive Measure of Global Diversification,” Journal of International Business Studies, 22, 307–322.

Vanhonacker, W. R. (1985), “Testing the Exact Order of an Individual’s Choice Process in an Information-Theoretic Framework,” Journal of Marketing Research, 22, 377–387.

Vassiliou, P. C. G. (1984), “Entropy as a Measure of the Experience Distribution in a Manpower System,” The Journal of the Operational Research Society, 35, 1021–1025.

Vinod, H. D. (1985), “Measurement of Economic Distance between Blacks and Whites,” Journal of Business & Economic Statistics, 3, 78–88.

Webber, M. J. (1976), “Elementary Entropy Maximizing Probability Distributions: Analysis and Interpretation,” Economic Geography, 52, 218–227.

Weitzman, M. L. (2000), “Economic Profitability Versus Ecological Entropy,” The Quarterly Journal of Economics, 115, 237–263.

White, D. J. (1969), “Operational Research and Entropy,” OR, 20, 126–127.

White, D. J. (1970), “The Use of the Concept of Entropy in System Modelling,” Operational Research Quarterly (1970–1977), 21, 279–281.

White, D. J. (1975), “Entropy and Decision,” Operational Research Quarterly (1970–1977), 26, 15–23.

Whiteside, T. J. (1976), “A Comment on ‘A Marketing Model’ and on ‘Entropy and Decision’,” Operational Research Quarterly (1970–1977), 27, 1019–1020.

Willmer, M. A. P. (1966), “On the Measurement of Information in the Field of Criminal Detection,” OR, 17, 335–345.

Wilson, A. G. (1970), “The Use of the Concept of Entropy in System Modelling,” Operational Research Quarterly (1970–1977), 21, 247–265.

Wu, J.-S. (1992), “Maximum Entropy Analysis of Open Queueing Networks with Group Arrivals,” The Journal of the Operational Research Society, 43, 1063–1078.

Wu, J. S., and Chan, W. C. (1989), “Maximum Entropy Analysis of Multiple-Server Queueing Systems,” The Journal of the Operational Research Society, 40, 815–825.

Xingsi, L. (1991), “An Aggregate Constraint Method for Non-Linear Programming,” The Journal of the Operational Research Society, 42, 1003–1010.

Yu, S. B., and Efstathiou, J. (2006), “Complexity in Rework Cells: Theory, Analysis and Comparison,” The Journal of the Operational Research Society, 57, 593–602.

Zandvakili, S. (1999), “Income Inequality among Female Heads of Households: Racial Inequality Reconsidered,” Economica, 66, 119–133.

Zellner, A., and Tobias, J. (2001), “Further Results on Bayesian Method of Moments Analysis of the Multiple Regression Model,” International Economic Review, 42, 121–140.

Zhang, X., and Fan, S. (2001), “Estimating Crop-Specific Production Technologies in Chinese Agriculture: A Generalized Maximum Entropy Approach,” American Journal of Agricultural Economics, 83, 378–388.

Zheng, B. (2007), “Unit-Consistent Decomposable Inequality Measures,” Economica, 74, 97–111.

Zohrabian, A., Traxler, G., Caudill, S., and Smale, M. (2003), “Valuing Pre-Commercial Genetic Resources: A Maximum Entropy Approach,” American Journal of Agricultural Economics, 85, 429–436.
Statistics and Probability

Aaron, D. L. (2001), “Schwarz, Wallace, and Rissanen: Intertwining Themes in Theories of Model Selection,” International Statistical Review, 69, 185–212.

Adler, R. J., and Feigin, P. D. (1984), “On the Cadlaguity of Random Measures,” The Annals of Probability, 12, 615–630.

Adler, R. J., and Samorodnitsky, G. (1987), “Tail Behaviour for the Suprema of Gaussian Processes with Applications to Empirical Processes,” The Annals of Probability, 15, 1339–1351.

Agresti, A. (1986), “Applying R2-Type Measures to Ordered Categorical Data,” Technometrics, 28, 133–138.

Ahlswede, R., and Gács, P. (1976), “Spreading of Sets in Product Spaces and Hypercontraction of the Markov Operator,” The Annals of Probability, 4, 925–939.

Aitchison, J. (1970), “Statistical Problems of Treatment Allocation,” Journal of the Royal Statistical Society. Series A (General), 133, 206–239.

Alexander, K. S., and Kalikow, S. A. (1992), “Random Stationary Processes,” The Annals of Probability, 20, 1174–1198.

Alexander, K. S., and Pyke, R. (1986), “A Uniform Central Limit Theorem for Set-Indexed Partial-Sum Processes with Finite Variance,” The Annals of Probability, 14, 582–597.

Algoet, P. H., and Cover, T. M. (1988), “A Sandwich Proof of the Shannon-McMillan-Breiman Theorem,” The Annals of Probability, 16, 899–909.

Alonso, A., and Molenberghs, G. (2007), “Surrogate Marker Evaluation from an Information Theory Perspective,” Biometrics, 63, 180–186.

Ankirchner, S., Dereich, S., and Imkeller, P. (2006), “The Shannon Information of Filtrations and the Additional Logarithmic Utility of Insiders,” The Annals of Probability, 34, 743–778.

Antonelli, P. L. (1979), “The Geometry of Random Drift V. Axiomatic Derivation of the Wfk Diffusion from a Variational Principle,” Advances in Applied Probability, 11, 502–509.

Arizono, I., and Ohta, H. (1989), “A Test for Normality Based on Kullback-Leibler Information,” The American Statistician, 43, 20–22.

Arnold, L., Gundlach, V. M., and Demetrius, L. (1994), “Evolutionary Formalism for Products of Positive Random Matrices,” The Annals of Applied Probability, 4, 859–901.

Arratia, R., and Waterman, M. S. (1985), “Critical Phenomena in Sequence Matching,” The Annals of Probability, 13, 1236–1249.

Artalejo, J. R., and Lopez-Herrero, M. J. (2001), “Analysis of the Busy Period for the M/M/C Queue: An Algorithmic Approach,” Journal of Applied Probability, 38, 209–222.

Arwini, K., and Dodson, C. T. J. (2004), “Neighbourhoods of Randomness and Geometry of McKay Bivariate Gamma 3-Manifold,” Sankhya: The Indian Journal of Statistics (2003), 66, 213–233.

Asadi, M., Ebrahimi, N., Hamedani, G. G., and Soofi, E. S. (2004), “Maximum Dynamic Entropy Models,” Journal of Applied Probability, 41, 379–390.

Asmussen, S., Nerman, O., and Olsson, M. (1996), “Fitting Phase-Type Distributions Via the EM Algorithm,” Scandinavian Journal of Statistics, 23, 419–441.

Autar, R. (1975), “On a Characterization of Information Improvement,” Journal of Applied Probability, 12, 407–411.

Autar, R., and Soni, R. S. (1975), “Inaccuracy and a Coding Theorem,” Journal of Applied Probability, 12, 845–851.

Baggen, S., et al. (2006), “Entropy of a Bit-Shift Channel,” Lecture Notes-Monograph Series, 48, 274–285.

Baggerly, K. A. (1998), “Empirical Likelihood as a Goodness-of-Fit Measure,” Biometrika, 85, 535–547.

Bahadoran, C., Guiol, H., Ravishankar, K., and Saada, E. (2006), “Euler Hydrodynamics of One-Dimensional Attractive Particle Systems,” The Annals of Probability, 34, 1339–1369.

Barlow, R. E., and Hsiung, J. H. (1983), “Expected Information from a Life Test Experiment,” Journal of the Royal Statistical Society. Series D (The Statistician), 32, 35–45.

Barron, A., and Hengartner, N. (1998), “Information Theory and Superefficiency,” The Annals of Statistics, 26, 1800–1825.

Barron, A., Schervish, M. J., and Wasserman, L. (1999), “The Consistency of Posterior Distributions in Nonparametric Problems,” The Annals of Statistics, 27, 536–561.

Barron, A. R. (1985), “The Strong Ergodic Theorem for Densities: Generalized Shannon-McMillan-Breiman Theorem,” The Annals of Probability, 13, 1292–1303.

Barron, A. R. (1986), “Entropy and the Central Limit Theorem,” The Annals of Probability, 14, 336–342.

Barron, A. R., and Sheu, C.-H. (1991), “Approximation of Density Functions by Sequences of Exponential Families,” The Annals of Statistics, 19, 1347–1369.

Bass, R. F., and Pyke, R. (1985), “The Space D(a) and Weak Convergence for Set-Indexed Processes,” The Annals of Probability, 13, 860–884.

Benassi, A., and Fouque, J.P. (1987), “Hydrodynamical Limit for the Asymmetric Simple Exclusion Process,” The Annals of Probability, 15, 546560.

Beran, R., and Dumbgen, L. (1998), “Modulation of Estimators and Confidence Sets,” The Annals of Statistics, 26, 18261856.

Berlekamp, E. R. (1972), “A Survey of Coding Theory,” Journal of the Royal Statistical Society. Series A (General), 135, 4473.

Bernard, P., and Wu, L. (1998), “Stochastic Linearization: The Theory,” Journal of Applied Probability, 35, 718730.

Bernardo, J. M., and Rueda, R. (2002), “Bayesian Hypothesis Testing: A Reference Approach,” International Statistical Review, 70, 351372.

Berry, D. A. (1974), “Optimal Sampling Schemes for Estimating System Reliability by Testing Components–1: Fixed Sample Sizes,” Journal of the American Statistical Association, 69, 485491.

Bertoluzza, C., and Forte, B. (1985), “Mutual Dependence of Random Variables and Maximum Discretized Entropy,” The Annals of Probability, 13, 630637.

Clarke, B. (1996), “Implications of Reference Priors for Prior Information and for Sample Size,” Journal of the American Statistical Association, 91, 173-184.

Bhandari, S. K. (1986), “Characterisation of the Parent Distribution by Inequality Measures on Its Truncations,” Sankhya: The Indian Journal of Statistics, Series B, 48, 297-300.

Biagini, S., and Frittelli, M. (2004), “On the Super Replication Price of Unbounded Claims,” The Annals of Applied Probability, 14, 1970-1991.

Birch, J. J. (1962), “Approximations for the Entropy for Functions of Markov Chains,” The Annals of Mathematical Statistics, 33, 930-938.

Boldrighini, C., Keane, M., and Marchetti, F. (1978), “Billiards in Polygons,” The Annals of Probability, 6, 532-540.

Bolthausen, E., and Deuschel, J.-D. (1993), “Critical Large Deviations for Gaussian Fields in the Phase Transition Regime, I,” The Annals of Probability, 21, 1876-1920.

Bolthausen, E., and Giacomin, G. (2005), “Periodic Copolymers at Selective Interfaces: A Large Deviations Approach,” The Annals of Applied Probability, 15, 963-983.

Borth, D. M. (1975), “A Total Entropy Criterion for the Dual Problem of Model Discrimination and Parameter Estimation,” Journal of the Royal Statistical Society. Series B (Methodological), 37, 77-87.

Borth, D. M., McKay, R. J., and Elliott, J. R. (1985), “A Difficulty Information Approach to Substituent Selection in QSAR Studies,” Technometrics, 27, 25-35.

Bose, R. C., and Kuebler, R. R., Jr. (1960), “A Geometry of Binary Sequences Associated with Group Alphabets in Information Theory,” The Annals of Mathematical Statistics, 31, 113-139.

Bosma, W., Dajani, K., and Kraaikamp, C. (2006), “Entropy Quotients and Correct Digits in Number-Theoretic Expansions,” Lecture Notes-Monograph Series, 48, 176-188.

Boucher, C., Ellis, R. S., and Turkington, B. (1999), “Spatializing Random Measures: Doubly Indexed Processes and the Large Deviation Principle,” The Annals of Probability, 27, 297-324.

Boucheron, S., Bousquet, O., Lugosi, G., and Massart, P. (2005), “Moment Inequalities for Functions of Independent Random Variables,” The Annals of Probability, 33, 514-560.

Boucheron, S., Lugosi, G., and Massart, P. (2003), “Concentration Inequalities Using the Entropy Method,” The Annals of Probability, 31, 1583-1614.

Bowman, A. W., and Foster, P. J. (1993), “Adaptive Smoothing and Density-Based Tests of Multivariate Normality,” Journal of the American Statistical Association, 88, 529-537.

Braess, D., and Dette, H. (2004), “The Asymptotic Minimax Risk for the Estimation of Constrained Binomial and Multinomial Probabilities,” Sankhya: The Indian Journal of Statistics (2003), 66, 707-732.

Braverman, A. (2002), “Compressing Massive Geophysical Datasets Using Vector Quantization,” Journal of Computational and Graphical Statistics, 11, 44-62.

Breiman, L. (1957), “The Individual Ergodic Theorem of Information Theory,” The Annals of Mathematical Statistics, 28, 809-811.

Breiman, L. (1960), “Correction Notes: Correction to ‘The Individual Ergodic Theorem of Information Theory’,” The Annals of Mathematical Statistics, 31, 809-810.

Brown, T. A. (1963), “Entropy and Conjugacy,” The Annals of Mathematical Statistics, 34, 226-232.

Bühlmann, P., and Wyner, A. J. (1999), “Variable Length Markov Chains,” The Annals of Statistics, 27, 480-513.

Bulmer, M. G. (1974), “On Fitting the Poisson Lognormal Distribution to Species-Abundance Data,” Biometrics, 30, 101-110.

Burnham, K. P., White, G. C., and Anderson, D. R. (1995), “Model Selection Strategy in the Analysis of Capture-Recapture Data,” Biometrics, 51, 888-898.

Burshtein, D., Pietra, V. D., Kanevsky, D., and Nadas, A. (1992), “Minimum Impurity Partitions,” The Annals of Statistics, 20, 1637-1646.

Burton, R., and Pemantle, R. (1993), “Local Characteristics, Entropy and Limit Theorems for Spanning Trees and Domino Tilings Via Transfer-Impedances,” The Annals of Probability, 21, 1329-1371.

Buss, S. R., and Clote, P. (2004), “Solving the Fisher-Wright and Coalescence Problems with a Discrete Markov Chain Analysis,” Advances in Applied Probability, 36, 1175-1197.

Callen, J. L., Kwan, C. C. Y., and Yip, P. C. Y. (1985), “Foreign-Exchange Rate Dynamics: An Empirical Study Using Maximum Entropy Spectral Analysis,” Journal of Business & Economic Statistics, 3, 149-155.

Chakravarty, S. R. (1982), “An Axiomatisation of the Entropy Measure of Inequality,” Sankhya: The Indian Journal of Statistics, Series B, 44, 351-354.

Champagnat, F., and Idier, J. (2000), “On the Correlation Structure of Unilateral AR Processes on the Plane,” Advances in Applied Probability, 32, 408-425.

Chan, T. (1999), “Pricing Contingent Claims on Stocks Driven by Lévy Processes,” The Annals of Applied Probability, 9, 504-528.

Chatfield, C. (1973), “Statistical Inference Regarding Markov Chain Models,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 22, 7-20.

Chen, X.-H., Dempster, A. P., and Liu, J. S. (1994), “Weighted Finite Population Sampling to Maximize Entropy,” Biometrika, 81, 457-469.

Chiu, S. N. (1994), “Mean-Value Formulae for the Neighbourhood of the Typical Cell of a Random Tessellation,” Advances in Applied Probability, 26, 565-576.

Christensen, E. S. (1989), “Statistical Properties of I-Projections within Exponential Families,” Scandinavian Journal of Statistics, 16, 307-318.

Külske, C., Le Ny, A., and Redig, F. (2004), “Relative Entropy and Variational Properties of Generalized Gibbsian Measures,” The Annals of Probability, 32, 1691-1726.

Chung, K. L. (1961), “A Note on the Ergodic Theorem of Information Theory,” The Annals of Mathematical Statistics, 32, 612-614.

Conger, M., and Viswanath, D. (2006), “Riffle Shuffles of Decks with Repeated Cards,” The Annals of Probability, 34, 804-819.

Courbage, M., and Hamdan, D. (1994), “Chapman-Kolmogorov Equation for Non-Markovian Shift-Invariant Measures,” The Annals of Probability, 22, 1662-1677.

Cover, T. M., Gács, P., and Gray, R. M. (1989), “Kolmogorov’s Contributions to Information Theory and Algorithmic Complexity,” The Annals of Probability, 17, 840-865.

Crescenzo, A. D., and Longobardi, M. (2002), “Entropy-Based Measure of Uncertainty in Past Lifetime Distributions,” Journal of Applied Probability, 39, 434-440.

Csiszár, I. (1984), “Sanov Property, Generalized I-Projection and a Conditional Limit Theorem,” The Annals of Probability, 12, 768-793.

Csiszár, I. (1991), “Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems,” The Annals of Statistics, 19, 2032-2066.

Currin, C., Mitchell, T., Morris, M., and Ylvisaker, D. (1991), “Bayesian Prediction of Deterministic Functions, with Applications to the Design and Analysis of Computer Experiments,” Journal of the American Statistical Association, 86, 953-963.

Cutler, C. D., and Dawson, D. A. (1990), “Nearest-Neighbor Analysis of a Family of Fractal Distributions,” The Annals of Probability, 18, 256-271.

Daley, D. J., and Vere-Jones, D. (2004), “Scoring Probability Forecasts for Point Processes: The Entropy Score and Information Gain,” Journal of Applied Probability, 41, 297-312.

Darroch, J. N., and Ratcliff, D. (1972), “Generalized Iterative Scaling for Log-Linear Models,” The Annals of Mathematical Statistics, 43, 1470-1480.

Davis, H. T., and Koopmans, L. H. (1973), “Adaptive Prediction of Stationary Time Series,” Sankhya: The Indian Journal of Statistics, Series A, 35, 5-22.

Dawid, A. P., and Vovk, V. G. (1999), “Prequential Probability: Principles and Properties,” Bernoulli, 5, 125-162.

DeGroot, M. H. (1962), “Uncertainty, Information, and Sequential Experiments,” The Annals of Mathematical Statistics, 33, 404-419.

Dembo, A. (1997), “Information Inequalities and Concentration of Measure,” The Annals of Probability, 25, 927-939.

Diaconis, P., and Freedman, D. (1990), “On the Uniform Consistency of Bayes Estimates for Multinomial Probabilities,” The Annals of Statistics, 18, 1317-1327.

Diaconis, P., and Zabell, S. L. (1982), “Updating Subjective Probability,” Journal of the American Statistical Association, 77, 822-830.

Djehiche, B., and Kaj, I. (1995), “The Rate Function for Some Measure-Valued Jump Processes,” The Annals of Probability, 23, 1414-1438.

Donoho, D. L., Johnstone, I. M., Hoch, J. C., and Stern, A. S. (1992), “Maximum Entropy and the Nearly Black Object,” Journal of the Royal Statistical Society. Series B (Methodological), 54, 41-81.

Droge, B. (1998), “Minimax Regret Analysis of Orthogonal Series Regression Estimation: Selection Versus Shrinkage,” Biometrika, 85, 631-643.

Dudewicz, E. J., and Meulen, E. C. v. d. (1981), “Entropy-Based Tests of Uniformity,” Journal of the American Statistical Association, 76, 967-974.

Dudley, R. M. (1973), “Sample Functions of the Gaussian Process,” The Annals of Probability, 1, 66-103.

Dudley, R. M. (1978), “Central Limit Theorems for Empirical Measures,” The Annals of Probability, 6, 899-929.

Dudley, R. M. (1987), “Universal Donsker Classes and Metric Entropy,” The Annals of Probability, 15, 1306-1326.

Duncan, G., and Lambert, D. (1989), “The Risk of Disclosure for Microdata,” Journal of Business & Economic Statistics, 7, 207-217.

Dunham, J. G. (1980), “Abstract Alphabet Sliding-Block Entropy Compression Coding with a Fidelity Criterion,” The Annals of Probability, 8, 1085-1092.

Dutta, M. (1966), “On Maximum (Information-Theoretic) Entropy Estimation,” Sankhya: The Indian Journal of Statistics, Series A, 28, 319-328.

Duvillard, T. C., and Guionnet, A. (2001), “Large Deviations Upper Bounds for the Laws of Matrix-Valued Processes and Non-Communicative Entropies,” The Annals of Probability, 29, 1205-1261.

Dym, H. (1966), “A Note on Limit Theorems for the Entropy of Markov Chains,” The Annals of Mathematical Statistics, 37, 522-524.

Eaves, D. M. (1985), “On Maximizing Missing Information About a Hypothesis,” Journal of the Royal Statistical Society. Series B (Methodological), 47, 263-266.

Ebanks, B. (1978), “The Branching Property in Generalized Information Theory,” Advances in Applied Probability, 10, 788-802.

Ebanks, B. R. (1984), “Polynomially Additive Entropies,” Journal of Applied Probability, 21, 179-185.

Ebrahimi, N. (2000), “The Maximum Entropy Method for Lifetime Distributions,” Sankhya: The Indian Journal of Statistics, Series A, 62, 236-243.

Ebrahimi, N., Habibullah, M., and Soofi, E. S. (1992), “Testing Exponentiality Based on Kullback-Leibler Information,” Journal of the Royal Statistical Society. Series B (Methodological), 54, 739-748.

Ebrahimi, N., and Soofi, E. S. (1990), “Relative Information Loss under Type II Censored Exponential Data,” Biometrika, 77, 429-435.

Eeden, C. V., and Zidek, J. V. (2002), “Combining Sample Information in Estimating Ordered Normal Means,” Sankhya: The Indian Journal of Statistics, Series A, 64, 588-610.

Eguchi, S., and Copas, J. (1998), “A Class of Local Likelihood Methods and Near-Parametric Asymptotics,” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 60, 709-724.

Elias, P. (1972), “The Efficient Construction of an Unbiased Random Sequence,” The Annals of Mathematical Statistics, 43, 865-870.

Enns, E. G. (1975), “Selecting the Maximum of a Sequence with Imperfect Information,” Journal of the American Statistical Association, 70, 640-643.

Epifani, I., Lijoi, A., and Prünster, I. (2003), “Exponential Functionals and Means of Neutral-to-the-Right Priors,” Biometrika, 90, 791-808.

Erschler, A. (2003), “On Drift and Entropy Growth for Random Walks on Groups,” The Annals of Probability, 31, 1193-1204.

Es, B. v. (1992), “Estimating Functionals Related to a Density by a Class of Statistics Based on Spacings,” Scandinavian Journal of Statistics, 19, 61-72.

Evans, W., Kenyon, C., Peres, Y., and Schulman, L. J. (2000), “Broadcasting on Trees and the Ising Model,” The Annals of Applied Probability, 10, 410-433.

Feldman, J., and Smorodinsky, M. (1971), “Bernoulli Flows with Infinite Entropy,” The Annals of Mathematical Statistics, 42, 381-382.

Ferguson, T. S. (1973), “A Bayesian Analysis of Some Nonparametric Problems,” The Annals of Statistics, 1, 209-230.

Föllmer, H., and Gantert, N. (1997), “Entropy Minimization and Schrödinger Processes in Infinite Dimensions,” The Annals of Probability, 25, 901-926.

Föllmer, H., and Orey, S. (1988), “Large Deviations for the Empirical Field of a Gibbs Measure,” The Annals of Probability, 16, 961-977.

Franke, J. (1985), “ARMA Processes Have Maximal Entropy among Time Series with Prescribed Autocovariances and Impulse Responses,” Advances in Applied Probability, 17, 810-840.

Galli, E.P., and Legros, D. (2007), “Spatial Spillovers in France: A Study on Individual Count Data at the City Level,” Annales d’Économie et de Statistique, 221-246.

Gamboa, F., and Gassiat, E. (1997), “Bayesian Methods and Maximum Entropy for Ill-Posed Inverse Problems,” The Annals of Statistics, 25, 328-350.

Gao, F., and Quastel, J. (2003), “Exponential Decay of Entropy in the Random Transposition and Bernoulli-Laplace Models,” The Annals of Applied Probability, 13, 1591-1600.

Gardner, R. J., Kiderlen, M., and Milanfar, P. (2006), “Convergence of Algorithms for Reconstructing Convex Bodies and Directional Measures,” The Annals of Statistics, 34, 1331-1374.

Gatsonis, C. A. (1984), “Deriving Posterior Distributions for a Location Parameter: A Decision Theoretic Approach,” The Annals of Statistics, 12, 958-970.

Ge, H., Jiang, D.-Q., and Qian, M. (2006), “Reversibility and Entropy Production of Inhomogeneous Markov Chains,” Journal of Applied Probability, 43, 1028-1043.

Geer, S. v. d. (1987), “A New Approach to Least-Squares Estimation, with Applications,” The Annals of Statistics, 15, 587-602.

Geer, S. v. d. (1990), “Estimating a Regression Function,” The Annals of Statistics, 18, 907-924.

Geer, S. v. d. (1993), “Hellinger-Consistency of Certain Nonparametric Maximum Likelihood Estimators,” The Annals of Statistics, 21, 14-44.

Geer, S. v. d. (1995), “Exponential Inequalities for Martingales, with Application to Maximum Likelihood Estimation for Counting Processes,” The Annals of Statistics, 23, 1779-1801.

Gendron, P., and Nandram, B. (2001), “An Empirical Bayes Estimator of Seismic Events Using Wavelet Packet Bases,” Journal of Agricultural, Biological, and Environmental Statistics, 6, 379-406.

Genovese, C. R., and Wasserman, L. (2000), “Rates of Convergence for the Gaussian Mixture Sieve,” The Annals of Statistics, 28, 1105-1127.

Georgii, H.-O. (1993), “Large Deviations and Maximum Entropy Principle for Interacting Random Fields on Zd,” The Annals of Probability, 21, 1845-1875.

Gersch, W., and Kitagawa, G. (1983), “The Prediction of Time Series with Trends and Seasonalities,” Journal of Business & Economic Statistics, 1, 253-264.

Ghosal, S., and Vaart, A. W. v. d. (2001), “Entropies and Rates of Convergence for Maximum Likelihood and Bayes Estimation for Mixtures of Normal Densities,” The Annals of Statistics, 29, 1233-1263.

Ghosh, M., and Yang, M.-C. (1988), “Simultaneous Estimation of Poisson Means under Entropy Loss,” The Annals of Statistics, 16, 278-291.

Ghurye, S. G. (1968), “Information and Sufficient Sub-Fields,” The Annals of Mathematical Statistics, 39, 2056-2066.

Gilbert, E. N. (1958), “An Outline of Information Theory,” The American Statistician, 12, 13-19.

Gilula, Z., Krieger, A. M., and Ritov, Y. (1988), “Ordinal Association in Contingency Tables: Some Interpretive Aspects,” Journal of the American Statistical Association, 83, 540-545.

Gilula, Z., and Haberman, S. J. (2000), “Density Approximation by Summary Statistics: An Information-Theoretic Approach,” Scandinavian Journal of Statistics, 27, 521-534.

Giné, E., and Zinn, J. (1984), “Some Limit Theorems for Empirical Processes,” The Annals of Probability, 12, 929-989.

Girardin, V., and Limnios, N. (2003), “On the Entropy for Semi-Markov Processes,” Journal of Applied Probability, 40, 1060-1068.

Gleaton, J. U., and Lynch, J. D. (2004), “On the Distribution of the Breaking Strain of a Bundle of Brittle Elastic Fibers,” Advances in Applied Probability, 36, 98-115.

Golan, A., Judge, G., and Perloff, J. M. (1996), “A Maximum Entropy Approach to Recovering Information from Multinomial Response Data,” Journal of the American Statistical Association, 91, 841-853.

Golan, A., Karp, L. S., and Perloff, J. M. (2000), “Estimating Coke’s and Pepsi’s Price and Advertising Strategies,” Journal of Business & Economic Statistics, 18, 398-409.

Goldie, C. M., and Greenwood, P. E. (1986), “Variance of Set-Indexed Sums of Mixing Random Variables and Weak Convergence of Set-Indexed Processes,” The Annals of Probability, 14, 817-839.

Good, I. J. (1953), “The Population Frequencies of Species and the Estimation of Population Parameters,” Biometrika, 40, 237-264.

Good, I. J. (1963), “Maximum Entropy for Hypothesis Formulation, Especially for Multidimensional Contingency Tables,” The Annals of Mathematical Statistics, 34, 911-934.

Blower, G., and Kelsall, J. E. (2002), “Nonlinear Kernel Density Estimation for Binned Data: Convergence in Entropy,” Bernoulli, 8, 423-449.

Grandits, P. (1999), “The p-Optimal Martingale Measure and Its Asymptotic Relation with the Minimal-Entropy Martingale Measure,” Bernoulli, 5, 225-247.

Grandits, P., and Rheinländer, T. (2002), “On the Minimal Entropy Martingale Measure,” The Annals of Probability, 30, 1003-1038.

Gray, R. M., Neuhoff, D. L., and Shields, P. C. (1975), “A Generalization of Ornstein’s Distance with Applications to Information Theory,” The Annals of Probability, 3, 315-328.

Gray, R. M., Ornstein, D. S., and Dobrushin, R. L. (1980), “Block Synchronization, Sliding-Block Coding, Invulnerable Sources and Zero Error Codes for Discrete Noisy Channels,” The Annals of Probability, 8, 639-674.

Griffeath, D. S. (1972), “Computer Solution of the Discrete Maximum Entropy Problem,” Technometrics, 14, 891-897.

Grünwald, P. D., and Dawid, A. P. (2004), “Game Theory, Maximum Entropy, Minimum Discrepancy and Robust Bayesian Decision Theory,” The Annals of Statistics, 32, 1367-1433.

Gulati, B. R., and Kounias, E. G. (1973), “On Three Level Symmetrical Factorial Designs and Ternary Group Codes,” Sankhya: The Indian Journal of Statistics, Series A, 35, 377-392.

Gupta, S. S., and Huang, D.-Y. (1976), “On Subset Selection Procedures for the Entropy Function Associated with the Binomial Populations,” Sankhya: The Indian Journal of Statistics, Series A, 38, 153-173.

Gutiérrez-Peña, E., and Muliere, P. (2004), “Conjugate Priors Represent Strong Pre-Experimental Assumptions,” Scandinavian Journal of Statistics, 31, 235-246.

Hall, P. (1987), “On the Amount of Detail That Can Be Recovered from a Degraded Signal,” Advances in Applied Probability, 19, 371-395.

Hall, P., and Presnell, B. (1999), “Density Estimation under Constraints,” Journal of Computational and Graphical Statistics, 8, 259-277.

Hall, P., and Titterington, D. M. (1986), “On Some Smoothing Techniques Used in Image Restoration,” Journal of the Royal Statistical Society. Series B (Methodological), 48, 330-343.

Hansen, M. H., and Yu, B. (2001), “Model Selection and the Principle of Minimum Description Length,” Journal of the American Statistical Association, 96, 746-774.

Hart, P. E. (1971), “Entropy and Other Measures of Concentration,” Journal of the Royal Statistical Society. Series A (General), 134, 73-85.

Haussler, D., and Opper, M. (1997), “Mutual Information, Metric Entropy and Cumulative Relative Entropy Risk,” The Annals of Statistics, 25, 2451-2492.

Heavlin, W. D. (2003), “Designing Experiments for Causal Networks,” Technometrics, 45, 115-129.

Henery, R. J. (1986), “Interpretation of Average Ranks,” Biometrika, 73, 224-227.

Hickey, R. J. (1982), “A Note on the Measurement of Randomness,” Journal of Applied Probability, 19, 229-232.

Hinich, M. (1965), “Large-Sample Estimation of an Unknown Discrete Waveform Which Is Randomly Repeating in Gaussian Noise,” The Annals of Mathematical Statistics, 36, 489-508.

Holgate, P. (1981), “The Statistical Entropy of a Sample from a Community of Species,” Biometrics, 37, 795-799.

Horowitz, I. (1970), “Employment Concentration in the Common Market: An Entropy Approach,” Journal of the Royal Statistical Society. Series A (General), 133, 463-479.

Hsu, L. (1995), “New Procedures for Group-Testing Based on the Huffman Lower Bound and Shannon Entropy Criteria,” Lecture Notes-Monograph Series, 25, 249-262.

Huang, D. (1990), “On the Maximal Entropy Property for ARMA Processes and ARMA Approximation,” Advances in Applied Probability, 22, 612-626.

Hurvich, C. M. (1986), “Data-Dependent Spectral Windows: Generalizing the Classical Framework to Include Maximum Entropy Estimates,” Technometrics, 28, 259-268.

Ibrahim, J. G., and Laud, P. W. (1994), “A Predictive Approach to the Analysis of Designed Experiments,” Journal of the American Statistical Association, 89, 309-319.

Pickands, J., III, and Stine, R. A. (1997), “Estimation for an M/G/∞ Queue with Incomplete Information,” Biometrika, 84, 295-308.

Imbens, G. W. (2002), “Generalized Method of Moments and Empirical Likelihood,” Journal of Business & Economic Statistics, 20, 493-506.

Iosifescu, M. (1965), “Sampling Entropy for Random Homogeneous Systems with Complete Connections,” The Annals of Mathematical Statistics, 36, 1433-1436.

Iosifescu, M. (1969), “Correction Note: Correction to ‘Sampling Entropy for Random Homogeneous Systems with Complete Connections’,” The Annals of Mathematical Statistics, 40, 2215.

Isaki, C. T. (1983), “Variance Estimation Using Auxiliary Information,” Journal of the American Statistical Association, 78, 117-123.

Jain, N. C. (1990), “Large Deviation Lower Bounds for Additive Functionals of Markov Processes,” The Annals of Probability, 18, 1071-1098.

Jennison, C., and Turnbull, B. W. (1997), “Group-Sequential Analysis Incorporating Covariate Information,” Journal of the American Statistical Association, 92, 1330-1341.

Joe, H. (1987), “Majorization, Randomness and Dependence for Multivariate Distributions,” The Annals of Probability, 15, 1217-1225.

Joe, H. (1988), “Majorization, Entropy and Paired Comparisons,” The Annals of Statistics, 16, 915-925.

Joe, H. (1989), “Relative Entropy Measures of Multivariate Dependence,” Journal of the American Statistical Association, 84, 157-164.

Johnson, O. (2006), “A Central Limit Theorem for Non-Overlapping Return Times,” Journal of Applied Probability, 43, 32-47.

Johnson, R. A., and Wehrly, T. E. (1978), “Some Angular-Linear Distributions and Related Regression Models,” Journal of the American Statistical Association, 73, 602-606.

Johnstone, I. M. (1994), “On Minimax Estimation of a Sparse Normal Mean Vector,” The Annals of Statistics, 22, 271-289.

Johnstone, I. M., and Silverman, B. W. (1997), “Wavelet Threshold Estimators for Data with Correlated Noise,” Journal of the Royal Statistical Society. Series B (Methodological), 59, 319-351.

Jon, S. (1976), “On the Optimal Use of Multiauxiliary Information,” Journal of the American Statistical Association, 71, 679.

Jonasson, J. (1999), “The Random Triangle Model,” Journal of Applied Probability, 36, 852-867.

Jourdain, B. (2002), “Probabilistic Characteristics Method for a One-Dimensional Inviscid Scalar Conservation Law,” The Annals of Applied Probability, 12, 334-360.

Jupp, P. E., and Mardia, K. V. (1983), “A Note on the Maximum-Entropy Principle,” Scandinavian Journal of Statistics, 10, 45-47.

Kaimanovich, V. A., and Vershik, A. M. (1983), “Random Walks on Discrete Groups: Boundary and Entropy,” The Annals of Probability, 11, 457-490.

Kaimanovich, V. A., and Woess, W. (2002), “Boundary and Entropy of Space Homogeneous Markov Chains,” The Annals of Probability, 30, 323-363.

Kalikow, S., and Weiss, B. (1992), “Explicit Codes for Some Infinite Entropy Bernoulli Shifts,” The Annals of Probability, 20, 397-402.

Kamae, T. (2006), “Numeration Systems as Dynamical Systems: Introduction,” Lecture Notes-Monograph Series, 48, 198-211.

Kannappan, P. L., and Ng, C. T. (1980), “On Functional Equations and Measures of Information. II,” Journal of Applied Probability, 17, 271-277.

Karlin, S., and Rinott, Y. (1981), “Entropy Inequalities for Classes of Probability Distributions I. The Univariate Case,” Advances in Applied Probability, 13, 93-112.

Karlin, S., and Rinott, Y. (1981), “Entropy Inequalities for Classes of Probability Distributions II. The Multivariate Case,” Advances in Applied Probability, 13, 325-351.

Kavalieris, L. (1991), “A Note on Estimating Autoregressive-Moving Average Order,” Biometrika, 78, 920-922.

Kemperman, J. H. B. (1969), “On the Optimum Rate of Transmitting Information,” The Annals of Mathematical Statistics, 40, 2156-2177.

Kendall, M. G. (1973), “Entropy, Probability and Information,” International Statistical Review, 41, 59-68.

Kiefer, J. (1961), “Optimum Designs in Regression Problems, II,” The Annals of Mathematical Statistics, 32, 298-325.

Kieffer, J. C. (1973), “A Counterexample to Perez’s Generalization of the Shannon-McMillan Theorem,” The Annals of Probability, 1, 362-364.

Kieffer, J. C. (1974), “On the Approximation of Stationary Measures by Periodic and Ergodic Measures,” The Annals of Probability, 2, 530-534.

Kieffer, J. C. (1975), “A Generalized Shannon-McMillan Theorem for the Action of an Amenable Group on a Probability Space,” The Annals of Probability, 3, 1031-1037.

Kieffer, J. C. (1980), “On the Transmission of Bernoulli Sources over Stationary Channels,” The Annals of Probability, 8, 942-961.

Kim, G.-H., and David, H. T. (1979), “Large Deviations of Functions of Markovian Transitions and Mathematical Programming Duality,” The Annals of Probability, 7, 874-881.

King, R., and Brooks, S. P. (2001), “On the Bayesian Analysis of Population Size,” Biometrika, 88, 317-336.

Kleijn, B. J. K., and Vaart, A. W. v. d. (2006), “Misspecification in Infinite-Dimensional Bayesian Statistics,” The Annals of Statistics, 34, 837-877.

Klein, T., and Rio, E. (2005), “Concentration around the Mean for Maxima of Empirical Processes,” The Annals of Probability, 33, 1060-1077.

Klotz, J. H. (1978), “Maximum Entropy Constrained Balance Randomization for Clinical Trials,” Biometrics, 34, 283-287.

Konishi, S., and Kitagawa, G. (1996), “Generalised Information Criteria in Model Selection,” Biometrika, 83, 875-890.

Kosygina, E. (2001), “The Behavior of the Specific Entropy in the Hydrodynamic Scaling Limit,” The Annals of Probability, 29, 1086-1110.

Kotz, S. (1966), “Recent Results in Information Theory,” Journal of Applied Probability, 3, 1-93.

Krafft, O., and Schmitz, N. (1969), “A Note on Hoeffding’s Inequality,” Journal of the American Statistical Association, 64, 907-912.

Kruk, L. (2004), “Limiting Distributions for Minimum Relative Entropy Calibration,” Journal of Applied Probability, 41, 35-50.

Kubokawa, T. (1988), “The Recovery of Interblock Information in Balanced Incomplete Block Designs,” Sankhya: The Indian Journal of Statistics, Series B, 50, 78-89.

Kubokawa, T., Robert, C., and Saleh, A. K. M. E. (1992), “Empirical Bayes Estimation of the Covariance Matrix of a Normal Distribution with Unknown Mean under an Entropy Loss,” Sankhya: The Indian Journal of Statistics, Series A, 54, 402-410.

Kullback, S. (1952), “An Application of Information Theory to Multivariate Analysis,” The Annals of Mathematical Statistics, 23, 88-102.

Kullback, S. (1954), “Certain Inequalities in Information Theory and the Cramér-Rao Inequality,” The Annals of Mathematical Statistics, 25, 745-751.

Kullback, S. (1956), “An Application of Information Theory to Multivariate Analysis, II,” The Annals of Mathematical Statistics, 27, 122-146.

Kullback, S. (1956), “Correction to ‘An Application of Information Theory to Multivariate Analysis, II’,” The Annals of Mathematical Statistics, 27, 860.

Kullback, S. (1969), “A Bound for the Variation of Gaussian Densities,” The Annals of Mathematical Statistics, 40, 2180-2182.

Kullback, S., Kupperman, M., and Ku, H. H. (1962), “Tests for Contingency Tables and Markov Chains,” Technometrics, 4, 573-608.

Kullback, S., and Rosenblatt, H. M. (1957), “On the Analysis of Multiple Regression in K Categories,” Biometrika, 44, 67-83.

Kumar, S., and Milton, S. (1971), “Finding a Single Defective in Binomial Group-Testing,” Journal of the American Statistical Association, 66, 824-828.

Lagarias, J. C. (1993), “Pseudorandom Numbers,” Statistical Science, 8, 31-39.

Lalley, S. P. (1992), “Brownian Motion and the Equilibrium Measure on the Julia Set of a Rational Mapping,” The Annals of Probability, 20, 1932-1967.

Lapedes, A. S., Giraud, B. G., Liu, L., and Stormo, G. D. (1999), “Correlated Mutations in Models of Protein Sequences: Phylogenetic and Structural Effects,” Lecture Notes-Monograph Series, 33, 236-256.

Larimore, W. E. (1983), “Predictive Inference, Sufficiency, Entropy and an Asymptotic Likelihood Principle,” Biometrika, 70, 175-181.

Lavine, M. (1994), “More Aspects of Pólya Tree Distributions for Statistical Modelling,” The Annals of Statistics, 22, 1161-1176.

Lee, P. M. (1964), “On the Axioms of Information Theory,” The Annals of Mathematical Statistics, 35, 415-418.

Lehmann, E. L. (1983), “Estimation with Inadequate Information,” Journal of the American Statistical Association, 78, 624-627.

Léonard, C., and Najim, J. (2002), “An Extension of Sanov’s Theorem: Application to the Gibbs Conditioning Principle,” Bernoulli, 8, 721-743.

LeSage, J. P. (1991), “Analysis and Development of Leading Indicators Using a Bayesian Turning-Points Approach,” Journal of Business & Economic Statistics, 9, 305-316.

Leysieffer, F. W., and Warner, S. L. (1976), “Respondent Jeopardy and Optimal Designs in Randomized Response Models,” Journal of the American Statistical Association, 71, 649-656.

Li, W. V., and Linde, W. (1999), “Approximation, Metric Entropy and Small Ball Estimates for Gaussian Measures,” The Annals of Probability, 27, 1556-1578.

Li, X., Mikusinski, P., Sherwood, H., and Taylor, M. D. (1996), “In Quest of Birkhoff’s Theorem in Higher Dimensions,” Lecture Notes-Monograph Series, 28, 187-197.

Lin, S., and Ding, J. (2009), “Integration of Ranked Lists Via Cross Entropy Monte Carlo with Applications to mRNA and microRNA Studies,” Biometrics, 65, 9-18.

Lind, N. C. (1994), “Information Theory and Maximum Product of Spacings Estimation,” Journal of the Royal Statistical Society. Series B (Methodological), 56, 341-343.

Lu, S. (1995), “Hydrodynamic Scaling Limits with Deterministic Initial Configurations,” The Annals of Probability, 23, 1831-1852.

Luo, Y., and Lin, S. (2003), “Information Gain for Genetic Parameter Estimation with Incorporation of Marker Data,” Biometrics, 59, 393-401.

Lutwak, E., Yang, D., and Zhang, G. (2004), “Moment-Entropy Inequalities,” The Annals of Probability, 32, 757-774.

M, E. G. (1974), “On the Central Limit Theorem for Sample Continuous Processes,” The Annals of Probability, 2, 629641.

Madych, W. R. (1991), “Solutions of Underdetermined Systems of Linear Equations,” Lecture Notes–Monograph Series, 20, 227–238.

Manuceau, J., Troup, M., and Vaillant, J. (1999), “On an Entropy Conservation Principle,” Journal of Applied Probability, 36, 607–610.

Marcus, M. B. (1973), “A Comparison of Continuity Conditions for Gaussian Processes,” The Annals of Probability, 1, 123–130.

Mardia, K. V., and Gadsden, R. J. (1977), “A Small Circle of Best Fit for Spherical Data and Areas of Vulcanism,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 26, 238–245.

Martin-Löf, A. (1986), “Entropy Estimates for the First Passage Time of a Random Walk to a Time Dependent Barrier,” Scandinavian Journal of Statistics, 13, 221–229.

Martin-Löf, P. (1974), “The Notion of Redundancy and Its Use as a Quantitative Measure of the Discrepancy between a Statistical Hypothesis and a Set of Observational Data [with Discussion],” Scandinavian Journal of Statistics, 1, 3–18.

Marton, K., and Shields, P. C. (1994), “Entropy and the Consistent Estimation of Joint Distributions,” The Annals of Probability, 22, 960–977.

Marton, K., and Shields, P. C. (1996), “Correction: Entropy and the Consistent Estimation of Joint Distributions,” The Annals of Probability, 24, 541–545.

Massart, P. (1989), “Strong Approximation for Multivariate Empirical and Related Processes, Via KMT Constructions,” The Annals of Probability, 17, 266–291.

Massart, P. (2000), “About the Constants in Talagrand’s Concentration Inequalities for Empirical Processes,” The Annals of Probability, 28, 863–884.

Massart, P., and Nédélec, É. (2006), “Risk Bounds for Statistical Learning,” The Annals of Statistics, 34, 2326–2366.

Mayfield, E. S., and Mizrach, B. (1992), “On Determining the Dimension of Real-Time Stock-Price Data,” Journal of Business & Economic Statistics, 10, 367–374.

McCann, M., and Edwards, D. (1996), “A Path Length Inequality for the Multivariate-T Distribution, with Applications to Multiple Comparisons,” Journal of the American Statistical Association, 91, 211–216.

McClintock, B. T., White, G. C., Antolin, M. F., and Tripp, D. W. (2009), “Estimating Abundance Using Mark-Resight When Sampling Is with Replacement or the Number of Marked Individuals Is Unknown,” Biometrics, 65, 237–246.

McCulloch, R. E. (1988), “Information and the Likelihood Function in Exponential Families,” The American Statistician, 42, 73–75.

McDonald, J. B., and Jensen, B. C. (1979), “An Analysis of Some Properties of Alternative Measures of Income Inequality Based on the Gamma Distribution Function,” Journal of the American Statistical Association, 74, 856–860.

McEliece, R. J., and Posner, E. C. (1971), “Hide and Seek, Data Storage, and Entropy,” The Annals of Mathematical Statistics, 42, 1706–1716.

McEliece, R. J., and Posner, E. C. (1973), “Hiding and Covering in a Compact Metric Space,” The Annals of Statistics, 1, 729–739.

McMillan, B. (1953), “The Basic Theorems of Information Theory,” The Annals of Mathematical Statistics, 24, 196–219.

Menard, S. (2004), “Six Approaches to Calculating Standardized Logistic Regression Coefficients,” The American Statistician, 58, 218–223.

Merkl, F., and Rolles, S. W. W. (2005), “Edge-Reinforced Random Walk on a Ladder,” The Annals of Probability, 33, 2051–2093.

Miles, R. E. (1984), “Symmetric Sequential Analysis: The Efficiencies of Sports Scoring Systems (with Particular Reference to Those of Tennis),” Journal of the Royal Statistical Society. Series B (Methodological), 46, 93–108.

Miller, M. I., Roysam, B., Smith, K., and Udding, J. T. (1991), “On the Equivalence of Regular Grammars and Stochastic Constraints: Applications to Image Processing on Massively Parallel Processors,” Lecture Notes–Monograph Series, 20, 239–257.

Sobel, M., and Groll, P. A. (1966), “Binomial Group-Testing with an Unknown Proportion of Defectives,” Technometrics, 8, 631–656.

Mitchell, T. J., and Beauchamp, J. J. (1988), “Bayesian Variable Selection in Linear Regression,” Journal of the American Statistical Association, 83, 1023–1032.

Moothathu, T. S. K. (1990), “The Best Estimator and a Strongly Consistent Asymptotically Normal Unbiased Estimator of Lorenz Curve, Gini Index and Theil Entropy Index of Pareto Distribution,” Sankhya: The Indian Journal of Statistics, Series B, 52, 115–127.

Morf, M., Vieira, A., and Kailath, T. (1978), “Covariance Characterization by Partial Autocorrelation Matrices,” The Annals of Statistics, 6, 643–648.

Morgan, B. J. T. (1976), “Markov Properties of Sequences of Behaviours,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 25, 31–36.

Morris, M. D., Mitchell, T. J., and Ylvisaker, D. (1993), “Bayesian Design and Analysis of Computer Experiments: Use of Derivatives in Surface Prediction,” Technometrics, 35, 243–255.

Morton, K. (1960), “On Comparing Two Observed Frequency Counts,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 9, 37–42.

Mossel, E. (2001), “Reconstruction on Trees: Beating the Second Eigenvalue,” The Annals of Applied Probability, 11, 285–300.

Mossel, E., and Peres, Y. (2003), “Information Flow on Trees,” The Annals of Applied Probability, 13, 817–844.

Mudholkar, G. S., Kollia, G. D., Lin, C. T., and Patel, K. R. (1991), “A Graphical Procedure for Comparing Goodness-of-Fit Tests,” Journal of the Royal Statistical Society. Series B (Methodological), 53, 221–232.

Mureika, R. A. (1972), “The Maximization of Entropy of Discrete Denumerably-Valued Random Variables with Known Mean,” The Annals of Mathematical Statistics, 43, 541–552.

Nadarajah, S. (2002), “A Conversation with Samuel Kotz,” Statistical Science, 17, 220–233.

Nayak, T. K. (1986), “An Analysis of Diversity Using Rao’s Quadratic Entropy,” Sankhya: The Indian Journal of Statistics, Series B, 48, 315–330.

Nayak, T. K., and Naik, D. N. (1989), “Estimating Multinomial Cell Probabilities under Quadratic Loss,” Journal of the Royal Statistical Society. Series D (The Statistician), 38, 3–10.

Nayak, T. K., and Gastwirth, J. L. (1989), “The Use of Diversity Analysis to Assess the Relative Influence of Factors Affecting the Income Distribution,” Journal of Business & Economic Statistics, 7, 453–460.

Neter, J., and Maynes, E. S. (1970), “On the Appropriateness of the Correlation Coefficient with a 0, 1 Dependent Variable,” Journal of the American Statistical Association, 65, 501–509.

Neuhoff, D. L., and Shields, P. C. (1982), “Channel Entropy and Primitive Approximation,” The Annals of Probability, 10, 188–198.

Nicoleris, T., and Yatracos, Y. G. (1997), “Rates of Convergence of Estimates, Kolmogorov’s Entropy and the Dimensionality Reduction Principle in Regression,” The Annals of Statistics, 25, 2493–2511.

Nishiyama, Y. (2000), “Weak Convergence of Some Classes of Martingales with Jumps,” The Annals of Probability, 28, 685–712.

Olshen, R., and Breiman, L. (2001), “A Conversation with Leo Breiman,” Statistical Science, 16, 184–198.

Ossiander, M. (1987), “A Central Limit Theorem under Metric Entropy with L2 Bracketing,” The Annals of Probability, 15, 897–919.

Osterburg, J. W., Parthasarathy, T., Raghavan, T. E. S., and Sclove, S. L. (1977), “Development of a Mathematical Formula for the Calculation of Fingerprint Probabilities Based on Individual Characteristics,” Journal of the American Statistical Association, 72, 772–778.

O’Sullivan, F. (1990), “An Iterative Approach to Two-Dimensional Laplacian Smoothing with Application to Image Restoration,” Journal of the American Statistical Association, 85, 213–219.

Papangelou, F. (1977), “Conditional Intensities of Point Processes: Their Application to Davidson’s Problem and to Entropy,” Advances in Applied Probability, 9, 434.

Pardo, L., Salicrú, M., Menéndez, M. L., and Morales, D. (1995), “Divergence Measures Based on Entropy Functions and Statistical Inference,” Sankhya: The Indian Journal of Statistics, Series B, 57, 315–337.

Parthasarathy, K. R. (1963), “Effective Entropy Rate and Transmission of Information through Channels with Additive Random Noise,” Sankhya: The Indian Journal of Statistics, Series A, 25, 75–84.

Patterson, K. D., and Heravi, S. M. (1991), “Direct Estimation of Entropy and Revisions to the National Income Accounts,” Journal of the Royal Statistical Society. Series D (The Statistician), 40, 35–50.

Peña, V. H. d. l., Ibragimov, R., and Sharakhmetov, S. (2006), “Characterizations of Joint Distributions, Copulas, Information, Dependence and Decoupling, with Applications to Time Series,” Lecture Notes–Monograph Series, 49, 183–209.

Peres, Y. (1992), “Iterating Von Neumann’s Procedure for Extracting Random Bits,” The Annals of Statistics, 20, 590–597.

Perron, F. (1992), “Monotonic Minimax Estimators of a 2×2 Covariance Matrix,” The Canadian Journal of Statistics, 20, 441–449.

Perrut, A. (2000), “Hydrodynamic Limits for a Two-Species Reaction-Diffusion Process,” The Annals of Applied Probability, 10, 163–191.

Pitcher, T. S. (1968), “The ε-Entropy of Certain Measures on [0, 1],” The Annals of Mathematical Statistics, 39, 1310–1315.

Pittel, B. (1985), “Asymptotical Growth of a Class of Random Trees,” The Annals of Probability, 13, 414–427.

Politis, D. N. (1994), “Markov Chains in Many Dimensions,” Advances in Applied Probability, 26, 756–774.

Portnoy, S. (1973), “On Recovery of Intra-Block Information,” Journal of the American Statistical Association, 68, 384–391.

Posner, E. C., and Rodemich, E. R. (1971), “Epsilon Entropy and Data Compression,” The Annals of Mathematical Statistics, 42, 2079–2125.

Posner, E. C., and Rodemich, E. R. (1973), “Epsilon Entropy of Stochastic Processes with Continuous Paths,” The Annals of Probability, 1, 674–689.

Posner, E. C., Rodemich, E. R., and Rumsey, H., Jr. (1967), “Epsilon Entropy of Stochastic Processes,” The Annals of Mathematical Statistics, 38, 1000–1020.

Posner, E. C., Rodemich, E. R., and Rumsey, H., Jr. (1969), “Epsilon Entropy of Gaussian Processes,” The Annals of Mathematical Statistics, 40, 1272–1296.

Posner, E. C., Rodemich, E. R., and Rumsey, H., Jr. (1969), “Product Entropy of Gaussian Distributions,” The Annals of Mathematical Statistics, 40, 870–904.

Dai Pra, P., Paganoni, A. M., and Posta, G. (2002), “Entropy Inequalities for Unbounded Spin Systems,” The Annals of Probability, 30, 1959–1976.

Prescott, P. (1976), “On a Test for Normality Based on Sample Entropy,” Journal of the Royal Statistical Society. Series B (Methodological), 38, 254–256.

Preston, C. (1972), “Continuity Properties of Some Gaussian Processes,” The Annals of Mathematical Statistics, 43, 285–292.

Pyke, R. (1986), “Product Brownian Measures,” Advances in Applied Probability, 18, 117–131.

Rachev, S. T., and Rüschendorf, L. (1991), “Approximate Independence of Distributions on Spheres and Their Stability Properties,” The Annals of Probability, 19, 1311–1337.

Rahiala, M. (1986), “Identification and Preliminary Estimation in Linear Transfer Function Models,” Scandinavian Journal of Statistics, 13, 239–255.

Rao, C. R. (1982), “Diversity: Its Measurement, Decomposition, Apportionment and Analysis,” Sankhya: The Indian Journal of Statistics, Series A, 44, 1–22.

Rao, C. R. (1984), “Convexity Properties of Entropy Functions and Analysis of Diversity,” Lecture Notes–Monograph Series, 5, 68–77.

Rao, C. R., Rényi, A., and Kendall, D. G. (1965), “[On the Foundations of Information Theory]: Discussion,” Review of the International Statistical Institute, 33, 14.

Rathie, P. N. (1970), “On a Generalized Entropy and a Coding Theorem,” Journal of Applied Probability, 7, 124–133.

Ren, C., Sun, D., and Dey, D. K. (2004), “Comparison of Bayesian and Frequentist Estimation and Prediction for a Normal Population,” Sankhya: The Indian Journal of Statistics (2003), 66, 678–706.

Rényi, A. (1965), “On the Foundations of Information Theory,” Review of the International Statistical Institute, 33, 1–14.

Rheinländer, T., and Steiger, G. (2006), “The Minimal Entropy Martingale Measure for General Barndorff-Nielsen/Shephard Models,” The Annals of Applied Probability, 16, 1319–1351.

Rio, E. (1993), “Strong Approximation for Set-Indexed Partial-Sum Processes, Via KMT Constructions II,” The Annals of Probability, 21, 1706–1727.

Robert, C. (1990), “An Entropy Concentration Theorem: Applications in Artificial Intelligence and Descriptive Statistics,” Journal of Applied Probability, 27, 303–313.

Mittelhammer, R., Judge, G., Akkeren, M. v., and Cardell, N. S. (2002), “Coordinate-Based Empirical Likelihood-Like Estimation in Ill-Conditioned Inverse Problems,” Journal of the American Statistical Association, 97, 1108–1121.

Roy, D., and Mukherjee, S. P. (1986), “A Note on Characterisations of the Weibull Distribution,” Sankhya: The Indian Journal of Statistics, Series A, 48, 250–253.

Rudolph, D., and Steele, J. M. (1980), “Sizes of Order Statistical Events of Stationary Processes,” The Annals of Probability, 8, 1079–1084.

Rukhin, A. L. (2000), “Approximate Entropy for Testing Randomness,” Journal of Applied Probability, 37, 88–100.

Rukhin, A. L. (2002), “Distribution of the Number of Words with a Prescribed Frequency and Tests of Randomness,” Advances in Applied Probability, 34, 775–797.

Ryu, H. K., and Slottje, D. J. (1994), “Coordinate Space Versus Index Space Representations as Estimation Methods: An Application to How Macro Activity Affects the U.S. Income Distribution,” Journal of Business & Economic Statistics, 12, 243–251.

Samson, P.-M. (2000), “Concentration of Measure Inequalities for Markov Chains and Φ-Mixing Processes,” The Annals of Probability, 28, 416–461.

Satten, G. A., and Kupper, L. L. (1993), “Inferences About Exposure-Disease Associations Using Probability-of-Exposure Information,” Journal of the American Statistical Association, 88, 200–208.

Schmidt, K. (1978), “A Probabilistic Proof of Ergodic Decomposition,” Sankhya: The Indian Journal of Statistics, Series A, 40, 10–18.

Sebastiani, P., and Wynn, H. P. (2000), “Maximum Entropy Sampling and Optimal Bayesian Experimental Design,” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 62, 145–157.

Sebenius, J. K., and Geanakoplos, J. (1983), “Don’t Bet on It: Contingent Agreements with Asymmetric Information,” Journal of the American Statistical Association, 78, 424–426.

Seidenfeld, T., Schervish, M. J., and Kadane, J. B. (1995), “A Representation of Partially Ordered Preferences,” The Annals of Statistics, 23, 2168–2217.

Seneta, E. (1982), “Entropy and Martingales in Markov Chain Models,” Journal of Applied Probability, 19, 367–381.

Seppäläinen, T. (1994), “Large Deviations for Markov Chains with Random Transitions,” The Annals of Probability, 22, 713–748.

Seppäläinen, T. (1998), “Entropy for Translation-Invariant Random-Cluster Measures,” The Annals of Probability, 26, 1139–1178.

Seppäläinen, T. (1999), “Existence of Hydrodynamics for the Totally Asymmetric Simple K-Exclusion Process,” The Annals of Probability, 27, 361–415.

Severini, T. A. (1998), “Likelihood Functions for Inference in the Presence of a Nuisance Parameter,” Biometrika, 85, 507–522.

Sharma, B. D., and Autar, R. (1973), “On Characterization of a Generalized Inaccuracy Measure in Information Theory,” Journal of Applied Probability, 10, 464–468.

Perlmutter, S. M., et al. (1998), “Medical Image Compression and Vector Quantization,” Statistical Science, 13, 30–53.

Sheffield, S. (2006), “Uniqueness of Maximal Entropy Measure on Essential Spanning Forests,” The Annals of Probability, 34, 857–864.

Haberman, S. J. (1982), “Analysis of Dispersion of Multinomial Responses,” Journal of the American Statistical Association, 77, 568–580.

Shen, X., and Wasserman, L. (2001), “Rates of Convergence of Posterior Distributions,” The Annals of Statistics, 29, 687–714.

Shen, X., and Wong, W. H. (1994), “Convergence Rate of Sieve Estimates,” The Annals of Statistics, 22, 580–615.

Shields, P., and Thouvenot, J. P. (1975), “Entropy Zero × Bernoulli Processes Are Closed in the d̄ Metric,” The Annals of Probability, 3, 732–736.

Shields, P. C. (1992), “Entropy and Prefixes,” The Annals of Probability, 20, 403–409.

Shields, P. C., Neuhoff, D. L., Davisson, L. D., and Ledrappier, F. (1978), “The Distortion-Rate Function for Nonergodic Sources,” The Annals of Probability, 6, 138–143.

Shier, D. R. (1988), “The Monotonicity of Power Means Using Entropy,” The American Statistician, 42, 203–204.

Shirai, T., and Takahashi, Y. (2003), “Random Point Fields Associated with Certain Fredholm Determinants II: Fermion Shifts and Their Ergodic and Gibbs Properties,” The Annals of Probability, 31, 1533–1564.

Silvey, S. D. (1964), “On a Measure of Association,” The Annals of Mathematical Statistics, 35, 1157–1166.

Cox, S. J., Daniell, G. J., and Nicole, D. A. (1998), “Using Maximum Entropy to Double One’s Expected Winnings in the UK National Lottery,” Journal of the Royal Statistical Society. Series D (The Statistician), 47, 629–641.

Sinha, B. K., and Wieand, H. S. (1979), “Union-Intersection Test for the Mean Vector When the Covariance Matrix Is Totally Reducible,” Journal of the American Statistical Association, 74, 340–343.

Siromoney, G. (1962), “Entropy of Logarithmic Series Distributions,” Sankhya: The Indian Journal of Statistics, Series A, 24, 419–420.

Sitter, R. R., and Wu, C. (2002), “Efficient Estimation of Quadratic Finite Population Functions in the Presence of Auxiliary Information,” Journal of the American Statistical Association, 97, 535–543.

Skilling, J., and Gull, S. F. (1991), “Bayesian Maximum Entropy Image Reconstruction,” Lecture Notes–Monograph Series, 20, 341–367.

Słomczyński, W., and Zastawniak, T. (2004), “Utility Maximizing Entropy and the Second Law of Thermodynamics,” The Annals of Probability, 32, 2261–2285.

Small, C. G., Wang, J., and Yang, Z. (2000), “Eliminating Multiple Root Problems in Estimation,” Statistical Science, 15, 313–332.

Smith, W. (1989), “ANOVA-Like Similarity Analysis Using Expected Species Shared,” Biometrics, 45, 873–881.

Soofi, E. S. (1992), “A Generalizable Formulation of Conditional Logit with Diagnostics,” Journal of the American Statistical Association, 87, 812–816.

Soofi, E. S. (1994), “Capturing the Intangible Concept of Information,” Journal of the American Statistical Association, 89, 1243–1254.

Soofi, E. S. (2000), “Principal Information Theoretic Approaches,” Journal of the American Statistical Association, 95, 1349–1353.

Soofi, E. S., Ebrahimi, N., and Habibullah, M. (1995), “Information Distinguishability with Application to Analysis of Failure Data,” Journal of the American Statistical Association, 90, 657–668.

Sørensen, M. (1993), “Stochastic Models of Sand Transport by Wind and Two Related Estimation Problems,” International Statistical Review, 61, 245–255.

Soyer, R., and Vopatek, A. L. (1995), “Adaptive Bayesian Designs for Accelerated Life Testing,” Lecture Notes–Monograph Series, 25, 263–275.

Spiegelhalter, D. J., Best, N. G., Carlin, B. P., and Linde, A. v. d. (2002), “Bayesian Measures of Model Complexity and Fit,” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 64, 583–639.

Srivastava, S. K. (1971), “A Generalized Estimator for the Mean of a Finite Population Using Multi-Auxiliary Information,” Journal of the American Statistical Association, 66, 404–407.

Stanley, T. R., and Burnham, K. P. (1998), “Estimator Selection for Closed-Population Capture-Recapture,” Journal of Agricultural, Biological, and Environmental Statistics, 3, 131–150.

Steif, J. E. (1997), “Consistent Estimation of Joint Distributions for Sufficiently Mixing Random Fields,” The Annals of Statistics, 25, 293–304.

Stein, M. L. (1990), “Bounds on the Efficiency of Linear Predictions Using an Incorrect Covariance Function,” The Annals of Statistics, 18, 1116–1138.

Stern, H., and Cover, T. M. (1989), “Maximum Entropy and the Lottery,” Journal of the American Statistical Association, 84, 980–985.

Stine, R. A., and Shaman, P. (1990), “Bias of Autoregressive Spectral Estimators,” Journal of the American Statistical Association, 85, 1091–1098.

Sugar, C. A., and James, G. M. (2003), “Finding the Number of Clusters in a Dataset: An Information-Theoretic Approach,” Journal of the American Statistical Association, 98, 750–763.

Sugiura, N. (1989), “Entropy Loss and a Class of Improved Estimators for Powers of the Generalized Variance,” Sankhya: The Indian Journal of Statistics, Series A, 51, 328–333.

Sweeting, T. J., Datta, G. S., and Ghosh, M. (2006), “Nonsubjective Priors Via Predictive Relative Entropy Regret,” The Annals of Statistics, 34, 441–468.

Swindel, B. F., and Yandle, D. O. (1972), “Allocation in Stratified Sampling as a Game,” Journal of the American Statistical Association, 67, 684–686.

Sylvester, R. J. (1988), “A Bayesian Approach to the Design of Phase II Clinical Trials,” Biometrics, 44, 823–836.

Talagrand, M. (1990), “Characterization of Almost Surely Continuous 1-Stable Random Fourier Series and Strongly Stationary Processes,” The Annals of Probability, 18, 85–91.

Talagrand, M. (1996), “Majorizing Measures: The Generic Chaining,” The Annals of Probability, 24, 1049–1103.

Talagrand, M. (2003), “Vapnik-Chervonenkis Type Conditions and Uniform Donsker Classes of Functions,” The Annals of Probability, 31, 1565–1582.

Tarter, M. E. (1979), “Trigonometric Maximum Likelihood Estimation and Application to the Analysis of Incomplete Survival Information,” Journal of the American Statistical Association, 74, 132–139.

Theil, H., and Chung, C.-F. (1988), “Information-Theoretic Measures of Fit for Univariate and Multivariate Linear Regressions,” The American Statistician, 42, 249–252.

Thomasian, A. J. (1960), “An Elementary Proof of the AEP of Information Theory,” The Annals of Mathematical Statistics, 31, 452–456.

Thompson, E. A. (1981), “Optimal Sampling for Pedigree Analysis: Sequential Schemes for Sibships,” Biometrics, 37, 313–325.

Tomizawa, S. (1995), “Measures of Departure from Marginal Homogeneity for Contingency Tables with Nominal Categories,” Journal of the Royal Statistical Society. Series D (The Statistician), 44, 425–439.

Tsybakov, A. B., and Meulen, E. C. v. d. (1996), “Root-N Consistent Estimators of Entropy for Densities with Unbounded Support,” Scandinavian Journal of Statistics, 23, 75–83.

Vasicek, O. (1976), “A Test for Normality Based on Sample Entropy,” Journal of the Royal Statistical Society. Series B (Methodological), 38, 54–59.

Vasicek, O. A. (1980), “A Conditional Law of Large Numbers,” The Annals of Probability, 8, 142–147.

Vinod, H. D. (1985), “Measurement of Economic Distance between Blacks and Whites,” Journal of Business & Economic Statistics, 3, 78–88.

Wagner, U., and Geyer, A. L. J. (1995), “A Maximum Entropy Method for Inverting Laplace Transforms of Probability Density Functions,” Biometrika, 82, 887–892.

Walker, S. G. (2003), “How Many Samples?: A Bayesian Nonparametric Approach,” Journal of the Royal Statistical Society. Series D (The Statistician), 52, 475–482.

Wang, X. (2006), “Approximating Bayesian Inference by Weighted Likelihood,” The Canadian Journal of Statistics, 34, 279–298.

Warner, S. L. (1976), “Optimal Randomized Response Models,” International Statistical Review, 44, 205–212.

Weissman, T., and Merhav, N. (2004), “Universal Prediction of Random Binary Sequences in a Noisy Environment,” The Annals of Applied Probability, 14, 54–89.

Wen, L. (1990), “Relative Entropy Densities and a Class of Limit Theorems of the Sequence of M-Valued Random Variables,” The Annals of Probability, 18, 829–839.

Winkler, R. L., Smith, J. E., and Fryback, D. G. (2002), “The Role of Informative Priors in Zero-Numerator Problems: Being Conservative Versus Being Candid,” The American Statistician, 56, 1–4.

Wolfowitz, J. (1958), “Information Theory for Mathematicians,” The Annals of Mathematical Statistics, 29, 351–356.

Wong, W. H., and Severini, T. A. (1991), “On Maximum Likelihood Estimation in Infinite Dimensional Parameter Spaces,” The Annals of Statistics, 19, 603–632.

Wong, W. H., and Shen, X. (1995), “Probability Inequalities for Likelihood Ratios and Convergence Rates of Sieve MLEs,” The Annals of Statistics, 23, 339–362.

Wright, R. L. (1983), “Finite Population Sampling with Multivariate Auxiliary Information,” Journal of the American Statistical Association, 78, 879–884.

Xie, J., Li, K.-C., and Bina, M. (2004), “A Bayesian Insertion/Deletion Algorithm for Distant Protein Motif Searching Via Entropy Filtering,” Journal of the American Statistical Association, 99, 409–420.

Yaguchi, H. (1990), “Entropy Analysis of a Nearest-Neighbor Attractive/Repulsive Exclusion Process on One-Dimensional Lattices,” The Annals of Probability, 18, 556–580.

Yaguchi, H. (1991), “Acknowledgment of Priority: Entropy Analysis of a Nearest-Neighbor Attractive/Repulsive Exclusion Process on One-Dimensional Lattices,” The Annals of Probability, 19, 1822.

Yang, Y. (2000), “Mixing Strategies for Density Estimation,” The Annals of Statistics, 28, 75–87.

Yang, Y. (2001), “Adaptive Regression by Mixing,” Journal of the American Statistical Association, 96, 574–588.

Yang, Y., and Barron, A. (1999), “Information-Theoretic Determination of Minimax Rates of Convergence,” The Annals of Statistics, 27, 1564–1599.

Yatracos, Y. G. (1985), “Rates of Convergence of Minimum Distance Estimators and Kolmogorov’s Entropy,” The Annals of Statistics, 13, 768–774.

Yatracos, Y. G. (1989), “A Regression Type Problem,” The Annals of Statistics, 17, 1597–1607.

Young, D. L. (1976), “Inference Concerning the Mean Vector When the Covariance Matrix Is Totally Reducible,” Journal of the American Statistical Association, 71, 696–699.

Yu, B., and Speed, T. P. (1997), “Information and the Clone Mapping of Chromosomes,” The Annals of Statistics, 25, 169–185.

Yung, W., and Rao, J. N. K. (2000), “Jackknife Variance Estimation under Imputation for Estimators Using Poststratification Information,” Journal of the American Statistical Association, 95, 903–915.

Zanten, H. v. (2003), “On Empirical Processes for Ergodic Diffusions and Rates of Convergence of M-Estimators,” Scandinavian Journal of Statistics, 30, 443–458.

Zeckhauser, R. (1971), “Combining Overlapping Information,” Journal of the American Statistical Association, 66, 91–92.

Zellner, A. (1988), “Optimal Information Processing and Bayes’s Theorem,” The American Statistician, 42, 278–280.

Zhang, T. (2006), “From ε-Entropy to KL-Entropy: Analysis of Minimum Information Complexity Density Estimation,” The Annals of Statistics, 34, 2180–2210.

Zidek, J. V., and Eeden, C. v. (2003), “Uncertainty, Entropy, Variance and the Effect of Partial Information,” Lecture Notes–Monograph Series, 42, 155–167.

Zidek, J. V., Sun, W., and Le, N. D. (2000), “Designing and Integrating Composite Networks for Monitoring Multivariate Gaussian Pollution Fields,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 49, 63–79.