Info-Metrics Institute | Books & Papers

DISCLAIMER: THE MATERIALS ON THIS WEB SITE (INCLUDING, WITHOUT LIMITATION, ALL SOFTWARE) ARE PROVIDED “AS IS” AND WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED. THE RESOURCE PAGE IS MANAGED BY XIMING WU (xwu@ag.tamu.edu).

Books on maximum entropy and information-theoretic econometrics and methods

A Family of Empirical Likelihood Functions and Estimators for the Binary Response Model by Mittelhammer and Judge, 2011.
An Information Theoretic Approach to Econometrics by Judge and Mittelhammer, 2011.
Elements of Information Theory by Cover and Thomas, 2nd Ed., 2006.  
Maximum Entropy Econometrics: Robust Estimation with Limited Data by Golan, Judge, and Miller, 1996.  
Information and Entropy Econometrics: A Review and Synthesis by Golan.  
Information Theory and Statistics: A Tutorial by Csiszar.  
Maximum Entropy Models in Science and Engineering by Kapur, 2006.  
Entropy Optimization Principles with Applications by Kapur and Kesavan, 1992.  
Entropy and Information Theory by Gray, 1990.  
Entropy, Large Deviations, and Statistical Mechanics by Ellis, 1985.  
Handbook of Empirical Economics and Finance, edited by Aman Ullah and David E. A. Giles, CRC Press.

Below is an incomplete list of articles related to information-theoretic and entropy methods, with a focus on econometrics and statistics. Other general sources include the IEEE Transactions on Information Theory and the online journal Entropy.

Business, Economics and Finance

Abbas, A. E. (2006), “Maximum Entropy Utility,” Operations Research, 54, 277-290.

Abdel-Khalik, A. R. (1974), “The Entropy Law, Accounting Data, and Relevance to Decision-Making,” The Accounting Review, 49, 271-283.

Acar, W., and Sankaran, K. (1999), “The Myth of the Unique Decomposability: Specializing the Herfindahl and Entropy Measures?,” Strategic Management Journal, 20, 969-975.

Ben-Tal, A. (1985), “The Entropic Penalty Approach to Stochastic Programming,” Mathematics of Operations Research, 10, 263-279.

Arizono, I., Cui, Y., and Ohta, H. (1991), “An Analysis of M/M/S Queueing Systems Based on the Maximum Entropy Principle,” The Journal of the Operational Research Society, 42, 69-73.

Badran, Y. (1987), “Optimal Grouping of State Spaces of Discrete Stochastic Systems,” The Journal of the Operational Research Society, 38, 1031-1037.

Bearse, P. M., Bozdogan, H., and Schlottmann, A. M. (1997), “Empirical Econometric Modelling of Food Consumption Using a New Informational Complexity Approach,” Journal of Applied Econometrics, 12, 563-586.

Beatty, T. K. M. (2007), “Recovering the Shadow Value of Nutrients,” American Journal of Agricultural Economics, 89, 52-62.

Ben-Tal, A., and Teboulle, M. (1987), “Penalty Functions and Duality in Stochastic Programming Via F-Divergence Functionals,” Mathematics of Operations Research, 12, 224-240.

Blackman, A. (2001), “Why Don’t Lenders Finance High-Return Technological Change in Developing-Country Agriculture?,” American Journal of Agricultural Economics, 83, 1024-1035.

Block, W., and Walker, M. (1988), “Entropy in the Canadian Economics Profession: Sampling Consensus on the Major Issues,” Canadian Public Policy / Analyse de Politiques, 14, 137-150.

Boer, P. T. d., Kroese, D. P., and Rubinstein, R. Y. (2004), “A Fast Cross-Entropy Method for Estimating Buffer Overflows in Queueing Networks,” Management Science, 50, 883-895.

Borwein, J. M., Lewis, A. S., and Noll, D. (1996), “Maximum Entropy Reconstruction Using Derivative Information, Part 1: Fisher Information and Convex Duality,” Mathematics of Operations Research, 21, 442-468.

Boulding, K. E. (1973), “The Economics of Energy,” Annals of the American Academy of Political and Social Science, 410, 120-126.

Brock, W. A., and Baek, E. G. (1991), “Some Theory of Statistical Inference for Nonlinear Science,” The Review of Economic Studies, 58, 697-716.

Brockett, P., and Charnes, A. (1991), “Information Theoretic Approach to Geometric Programming,” Mathematics of Operations Research, 16, 888-889.

Brotchie, J. F. (1978), “A New Approach to Urban Modelling,” Management Science, 24, 1753-1758.

Buchen, P. W., and Kelly, M. (1996), “The Maximum Entropy Distribution of an Asset Inferred from Option Prices,” The Journal of Financial and Quantitative Analysis, 31, 143-159.

Bullock, D. S., Ruffo, M. L., Bullock, D. G., and Bollero, G. A. (2009), “The Value of Variable Rate Technology: An Information-Theoretic Approach,” American Journal of Agricultural Economics, 91, 209-223.

Callen, J. L., Kwan, C. C. Y., and Yip, P. C. Y. (1985), “Foreign-Exchange Rate Dynamics: An Empirical Study Using Maximum Entropy Spectral Analysis,” Journal of Business & Economic Statistics, 3, 149-155.

Chan, M. M. W. (1971), “System Simulation and Maximum Entropy,” Operations Research, 19, 1751-1753.

Charnes, A., Cooper, W. W., and Learner, D. B. (1978), “Constrained Information Theoretic Characterizations in Consumer Purchase Behaviour,” The Journal of the Operational Research Society, 29, 833-842.

Charnes, A., Cooper, W. W., Learner, D. B., and Phillips, F. Y. (1984), “An MDI Model and an Algorithm for Composite Hypotheses Testing and Estimation in Marketing,” Marketing Science, 3, 55-72.

Chase, R. (1980), “Structural-Functional Dynamics in the Analysis of Socio-Economic Systems: Adaptation of Structural Change Processes to Biological Systems of Human Interaction,” American Journal of Economics and Sociology, 39, 49-64.

Chemmanur, T. J. (1993), “The Pricing of Initial Public Offerings: A Dynamic Model with Information Production,” The Journal of Finance, 48, 285-304.

Converse, A. O. (1967), “The Use of Uncertainty in a Simultaneous Search,” Operations Research, 15, 1088-1095.

Cozzolino, J. M., and Zahner, M. J. (1973), “The Maximum-Entropy Distribution of the Future Market Price of a Stock,” Operations Research, 21, 1200-1211.

Csikszentmihalyi, M. (2000), “The Costs and Benefits of Consuming,” The Journal of Consumer Research, 27, 267-272.

Davis, R., and Thomas, L. G. (1993), “Direct Estimation of Synergy: A New Approach to the Diversity-Performance Debate,” Management Science, 39, 1334-1346.

Dinkel, J. J., and Kochenberger, G. A. (1979), “Constrained Entropy Models: Solvability and Sensitivity,” Management Science, 25, 555-564.

Drechsler, F. S. (1970), “The Concept of Entropy,” Operational Research Quarterly (1970-1977), 21, 476-477.

Eisner, H. (1962), “A Generalized Network Approach to the Planning and Scheduling of a Research Project,” Operations Research, 10, 115-125.

Daal, E. A., and Madan, D. B. (2005), “An Empirical Examination of the Variance-Gamma Model for Foreign Currency Options,” The Journal of Business, 78, 2121-2152.

Filar, J. A., and Krass, D. (1994), “Hamiltonian Cycles and Markov Chains,” Mathematics of Operations Research, 19, 223-237.

Fisk, C., and Brown, G. R. (1975), “A Note on the Entropy Formulation of Distribution Models,” Operational Research Quarterly (1970-1977), 26, 755-758.

Frank, M., and Stengos, T. (1989), “Measuring the Strangeness of Gold and Silver Rates of Return,” The Review of Economic Studies, 56, 553-567.

Freund, D., and Saxena, U. (1984), “An Algorithm for a Class of Discrete Maximum Entropy Problems,” Operations Research, 32, 210-215.

Gaile, G. L. (1977), “Effiquity: A Comparison of a Measure of Efficiency with an Entropic Measure of the Equality of Discrete Spatial Distributions,” Economic Geography, 53, 265-282.

Galli, E.-P., and Legros, D. (2007), “Spatial Spillovers in France: A Study on Individual Count Data at the City Level,” Annales d’Économie et de Statistique, 221-246.

Garber, T., Goldenberg, J., Libai, B., and Muller, E. (2004), “From Density to Destiny: Using Spatial Dimension of Sales Data for Early Prediction of New Product Success,” Marketing Science, 23, 419-428.

Garrison, C. B. (1974), “Industrial Growth in the Tennessee Valley Region, 1959 to 1968,” American Journal of Agricultural Economics, 56, 50-60.

Garrison, C. B., and Paulson, A. S. (1973), “An Entropy Measure of the Geographic Concentration of Economic Activity,” Economic Geography, 49, 319-324.

Gerchak, Y. (1981), “Maximal Entropy of Markov Chains with Common Steady-State Probabilities,” The Journal of the Operational Research Society, 32, 233-234.

Gersch, W., and Kitagawa, G. (1983), “The Prediction of Time Series with Trends and Seasonalities,” Journal of Business & Economic Statistics, 1, 253-264.

Gertsbakh, I., and Stern, H. I. (1978), “Minimal Resources for Fixed and Variable Job Schedules,” Operations Research, 26, 68-85.

Ghaoui, L. E., Oks, M., and Oustry, F. (2003), “Worst-Case Value-at-Risk and Robust Portfolio Optimization: A Conic Programming Approach,” Operations Research, 51, 543-556.

Golan, A., Judge, G., and Perloff, J. M. (1996), “Estimating the Size Distribution of Firms Using Government Summary Statistics,” The Journal of Industrial Economics, 44, 69-80.

Golan, A., Judge, G., and Robinson, S. (1994), “Recovering Information from Incomplete or Partial Multisectoral Economic Data,” The Review of Economics and Statistics, 76, 541-549.

Golan, A., Karp, L. S., and Perloff, J. M. (2000), “Estimating Coke’s and Pepsi’s Price and Advertising Strategies,” Journal of Business & Economic Statistics, 18, 398-409.

Golan, A., Perloff, J. M., and Shen, E. Z. (2001), “Estimating a Demand System with Nonnegativity Constraints: Mexican Meat Demand,” The Review of Economics and Statistics, 83, 541-550.

Guiasu, S. (1986), “Maximum Entropy Condition in Queueing Theory,” The Journal of the Operational Research Society, 37, 293-301.

Guiasu, S. (1987), “Maximum Entropy Condition in Queueing Theory: Response,” The Journal of the Operational Research Society, 38, 98-100.

Hall, E. H., Jr., and St. John, C. H. (1994), “A Methodological Note on Diversity Measurement,” Strategic Management Journal, 15, 153-168.

Handlarski, J. (1980), “Mathematical Analysis of Preventive Maintenance Schemes,” The Journal of the Operational Research Society, 31, 227-237.

Hansen, S. (1972), “Utility, Accessibility and Entropy in Spatial Modelling,” The Swedish Journal of Economics, 74, 35-44.

Hansen, S. (1975), “Utility, Accessibility and Entropy in Spatial Modelling. Reply,” The Swedish Journal of Economics, 77, 502-503.

Hauser, J. R. (1978), “Testing the Accuracy, Usefulness, and Significance of Probabilistic Choice Models: An Information-Theoretic Approach,” Operations Research, 26, 406-421.

Haynes, K. E., and Enders, W. T. (1975), “Distance, Direction, and Entropy in the Evolution of a Settlement Pattern,” Economic Geography, 51, 357-365.

Heltberg, R., Arndt, T. C., and Sekhar, N. U. (2000), “Fuelwood Consumption and Forest Degradation: A Household Model for Domestic Energy Substitution in Rural India,” Land Economics, 76, 213-232.

Herer, Y. T., and Raz, T. (2000), “Optimal Parallel Inspection for Finding the First Nonconforming Unit in a Batch: An Information Theoretic Approach,” Management Science, 46, 845-857.

Herniter, J. D. (1973), “An Entropy Model of Brand Purchase Behavior,” Journal of Marketing Research, 10, 361-375.

Herniter, J. D. (1974), “A Comparison of the Entropy Model and the Hendry Model,” Journal of Marketing Research, 11, 21-29.

Herpen, E. v., and Pieters, R. (2002), “The Variety of an Assortment: An Extension to the Attribute-Based Approach,” Marketing Science, 21, 331-341.

Hexter, J. L., and Snow, J. W. (1970), “An Entropy Measure of Relative Aggregate Concentration,” Southern Economic Journal, 36, 239-243.

Hexter, J. L., and Snow, J. W. (1971), “An Entropy Measure of Relative Aggregate Concentration: Reply,” Southern Economic Journal, 38, 112-114.

Hirschberg, J. G., Maasoumi, E., and Slottje, D. J. (2001), “Clusters of Attributes and Well-Being in the USA,” Journal of Applied Econometrics, 16, 445-460.

Holloway, G., and Paris, Q. (2002), “Production Efficiency in the Von Liebig Model,” American Journal of Agricultural Economics, 84, 1271-1278.

Holm, J. (1991), “The Distribution of Income: A Saddle Point Formulation,” The Scandinavian Journal of Economics, 93, 545-554.

Hong, H., Preston, B., and Shum, M. (2003), “Generalized Empirical Likelihood-Based Model Selection Criteria for Moment Condition Models,” Econometric Theory, 19, 923-943.

Hong, Y., and White, H. (2005), “Asymptotic Distribution Theory for Nonparametric Entropy Measures of Serial Dependence,” Econometrica, 73, 837-901.

Horowitz, A., and Horowitz, I. (1968), “Entropy, Markov Processes and Competition in the Brewing Industry,” The Journal of Industrial Economics, 16, 196-211.

Hoskisson, R. E., Hitt, M. A., Johnson, R. A., and Moesel, D. D. (1993), “Construct Validity of an Objective (Entropy) Categorical Measure of Diversification Strategy,” Strategic Management Journal, 14, 215-235.

Hu, J., Fu, M. C., and Marcus, S. I. (2007), “A Model Reference Adaptive Search Method for Global Optimization,” Operations Research, 55, 549-568.

Huffman, G. W. (1992), “Information, Asset Prices, and the Volume of Trade,” The Journal of Finance, 47, 1575-1590.

Hutchens, R. (2004), “One Measure of Segregation,” International Economic Review, 45, 555-578.

Hutton, B., and Schmidt, C. P. (1988), “Sensitivity Analysis of Additive Multiattribute Value Models,” Operations Research, 36, 122-127.

Imbens, G. W. (1997), “One-Step Estimators for Over-Identified Generalized Method of Moments Models,” The Review of Economic Studies, 64, 359-383.

Imbens, G. W. (2002), “Generalized Method of Moments and Empirical Likelihood,” Journal of Business & Economic Statistics, 20, 493-506.

Imbens, G. W., Spady, R. H., and Johnson, P. (1998), “Information Theoretic Approaches to Inference in Moment Condition Models,” Econometrica, 66, 333-357.

Iusem, A. N., Svaiter, B. F., and Teboulle, M. (1994), “Entropy-Like Proximal Methods in Convex Programming,” Mathematics of Operations Research, 19, 790-814.

Jackson, R. W., Hewings, G. J. D., and Sonis, M. (1989), “Decomposition Approaches to the Identification of Change in Regional Economies,” Economic Geography, 65, 216-231.

Jacquemin, A. P., and Berry, C. H. (1979), “Entropy Measure of Diversification and Corporate Growth,” The Journal of Industrial Economics, 27, 359-369.

Jacquemin, A. P., and Kumps, A.-M. (1971), “Changes in the Size Structure of the Largest European Firms: An Entropy Measure,” The Journal of Industrial Economics, 20, 59-70.

Jarrett, D. (1981), “Comments on ‘Maximal Entropy of Markov Chains with Common Steady-State Probabilities’,” The Journal of the Operational Research Society, 32, 1045-1046.

Jessop, A. (2002), “Prioritisation of an IT Budget within a Local Authority,” The Journal of the Operational Research Society, 53, 36-46.

Lee, J. (1998), “Constrained Maximum-Entropy Sampling,” Operations Research, 46, 655-664.

Jon, S. (1987), “The Relationship between Wages and Firm Size: An Information Theoretic Analysis,” International Economic Review, 28, 51-68.

Kahn, B. E., and Wansink, B. (2004), “The Influence of Assortment Structure on Perceived Variety and Consumption Quantities,” The Journal of Consumer Research, 30, 519-533.

Kalwani, M. U., and Morrison, D. G. (1977), “A Parsimonious Description of the Hendry System,” Management Science, 23, 467-477.

Kim, W. C. (1989), “Developing a Global Diversification Measure,” Management Science, 35, 376-383.

Kitamura, Y., and Stutzer, M. (1997), “An Information-Theoretic Alternative to Generalized Method of Moments Estimation,” Econometrica, 65, 861-874.

Ko, C.-W., Lee, J., and Queyranne, M. (1995), “An Exact Algorithm for Maximum Entropy Sampling,” Operations Research, 43, 684-691.

Koenigsberg, E. (1987), “Maximum Entropy Condition in Queueing Theory,” The Journal of the Operational Research Society, 38, 97-98.

Kotiah, T. C. T., and Wallace, N. D. (1973), “Another Look at the PERT Assumptions,” Management Science, 20, 44-49.

Kottke, F. J. (1971), “An Entropy Measure of Relative Aggregate Concentration: Comment,” Southern Economic Journal, 38, 109-112.

Kouvatsos, D. D. (1988), “A Maximum Entropy Analysis of the G/G/1 Queue at Equilibrium,” The Journal of the Operational Research Society, 39, 183-200.

Kouvatsos, D. D., and Othman, A. T. (1989), “Optimal Flow Control of a G/G/C Finite Capacity Queue,” The Journal of the Operational Research Society, 40, 659-670.

Kiwiel, K. C. (1997), “Free-Steering Relaxation Methods for Problems with Strictly Convex Costs and Linear Constraints,” Mathematics of Operations Research, 22, 326-349.

Kurkalova, L. A., and Carriquiry, A. (2002), “An Analysis of Grain Production Decline During the Early Transition in Ukraine: A Bayesian Inference,” American Journal of Agricultural Economics, 84, 1256-1263.

Leblebici, H., and Salancik, G. R. (1981), “Effects of Environmental Uncertainty on Information and Decision Processes in Banks,” Administrative Science Quarterly, 26, 578-596.

Leibenstein, H. (1975), “Aspects of the X-Efficiency Theory of the Firm,” The Bell Journal of Economics, 6, 580-606.

Lence, S. H., and Miller, D. J. (1998), “Recovering Output-Specific Inputs from Aggregate Input Data: A Generalized Cross-Entropy Approach,” American Journal of Agricultural Economics, 80, 852-867.

Lev, B., and Theil, H. (1978), “A Maximum Entropy Approach to the Choice of Asset Depreciation,” Journal of Accounting Research, 16, 286-293.

Liang, T.-P. (1992), “A Composite Approach to Inducing Knowledge for Expert Systems Design,” Management Science, 38, 1-17.

Lim, A. E. B., and Shanthikumar, J. G. (2007), “Relative Entropy, Exponential Utility, and Robust Dynamic Pricing,” Operations Research, 55, 198-214.

Lucas, D. J., and McDonald, R. L. (1990), “Equity Issues and Stock Price Dynamics,” The Journal of Finance, 45, 1019-1043.

Maasoumi, E., and Trede, M. (2001), “Comparing Income Mobility in Germany and the United States Using Generalized Entropy Mobility Measures,” The Review of Economics and Statistics, 83, 551-559.

Mayfield, E. S., and Mizrach, B. (1992), “On Determining the Dimension of Real-Time Stock-Price Data,” Journal of Business & Economic Statistics, 10, 367-374.

McClean, S. (1986), “Extending the Entropy Stability Measure for Manpower Planning,” The Journal of the Operational Research Society, 37, 1133-1138.

McClean, S., and Abodunde, T. (1978), “Entropy as a Measure of Stability in a Manpower System,” The Journal of the Operational Research Society, 29, 885-889.

McCombie, J. S. L. (1975), “Utility, Accessibility and Entropy in Spatial Modelling: A Comment,” The Swedish Journal of Economics, 77, 497-501.

Medvedkov, Y. (1970), “Entropy: An Assessment of Potentialities in Geography,” Economic Geography, 46, 306-316.

Miller, D. J. (2002), “Entropy-Based Methods of Modeling Stochastic Production Efficiency,” American Journal of Agricultural Economics, 84, 1264-1270.

Miller, D. J., and Plantinga, A. J. (1999), “Modeling Land Use Decisions with Aggregate Data,” American Journal of Agricultural Economics, 81, 180-194.

Miller, R. A. (1972), “Numbers Equivalents, Relative Entropy, and Concentration Ratios: A Comparison Using Market Performance,” Southern Economic Journal, 39, 107-112.

Mills, J. A., and Zandvakili, S. (1997), “Statistical Inference Via Bootstrapping for Measures of Inequality,” Journal of Applied Econometrics, 12, 133-150.

Myung, I. J., Ramamoorti, S., and Bailey, A. D., Jr. (1996), “Maximum Entropy Aggregation of Expert Predictions,” Management Science, 42, 1420-1436.

Nayak, T. K., and Gastwirth, J. L. (1989), “The Use of Diversity Analysis to Assess the Relative Influence of Factors Affecting the Income Distribution,” Journal of Business & Economic Statistics, 7, 453-460.

Ng, L. F.-Y. (1995), “Changing Industrial Structure and Competitive Patterns of Manufacturing and Non-Manufacturing in a Small Open Economy: An Entropy Measurement,” Managerial and Decision Economics, 16, 547-563.

Nilim, A., and Ghaoui, L. E. (2005), “Robust Control of Markov Decision Processes with Uncertain Transition Matrices,” Operations Research, 53, 780-798.

Ordentlich, E., and Cover, T. M. (1998), “The Cost of Achieving the Best Portfolio in Hindsight,” Mathematics of Operations Research, 23, 960-982.

Palepu, K. (1985), “Diversification Strategy, Profit Performance and the Entropy Measure,” Strategic Management Journal, 6, 239-255.

Paris, Q. (2001), “Symmetric Positive Equilibrium Problem: A Framework for Rationalizing Economic Behavior with Limited Information,” American Journal of Agricultural Economics, 83, 1049-1061.

Paris, Q. (2002), “An Analysis of Ill-Posed Production Problems Using Maximum Entropy: Reply,” American Journal of Agricultural Economics, 84, 247.

Paris, Q., and Howitt, R. E. (1998), “An Analysis of Ill-Posed Production Problems Using Maximum Entropy,” American Journal of Agricultural Economics, 80, 124-138.

Parisi, F. (2003), “Freedom of Contract and the Laws of Entropy,” Supreme Court Economic Review, 10, 65-90.

Perakis, G., and Roels, G. (2008), “Regret in the Newsvendor Model with Partial Information,” Operations Research, 56, 188-203.

Perry, W. L., and Moffat, J. (1997), “Measuring the Effects of Knowledge in Military Campaigns,” The Journal of the Operational Research Society, 48, 965-972.

Philippatos, G. C., and Gressis, N. (1975), “Conditions of Equivalence among E-V, SSD, and E-H Portfolio Selection Criteria: The Case for Uniform, Normal and Lognormal Distributions,” Management Science, 21, 617-625.

Pieters, R., Wedel, M., and Zhang, J. (2007), “Optimal Feature Advertising Design under Competitive Clutter,” Management Science, 53, 1815-1828.

Preckel, P. V. (2001), “Least Squares and Entropy: A Penalty Function Perspective,” American Journal of Agricultural Economics, 83, 366-377.

Preckel, P. V. (2002), “An Analysis of Ill-Posed Production Problems Using Maximum Entropy: Comment,” American Journal of Agricultural Economics, 84, 245-246.

Pulliainen, K. (1970), “Entropy-Measures for International Trade,” The Swedish Journal of Economics, 72, 40-53.

Pye, R. (1978), “A Formal, Decision-Theoretic Approach to Flexibility and Robustness,” The Journal of the Operational Research Society, 29, 215-227.

Rachev, S. T., and Römisch, W. (2002), “Quantitative Stability in Stochastic Programming: The Method of Probability Metrics,” Mathematics of Operations Research, 27, 792-818.

Robertson, J. C., Tallman, E. W., and Whiteman, C. H. (2005), “Forecasting Using Relative Entropy,” Journal of Money, Credit and Banking, 37, 383-401.

Robins, J. A., and Wiersema, M. F. (2003), “The Measurement of Corporate Portfolio Strategy: Analysis of the Content Validity of Related Diversification Indexes,” Strategic Management Journal, 24, 39-59.

Robinson, P. M. (1991), “Consistent Nonparametric Entropy-Based Testing,” The Review of Economic Studies, 58, 437-453.

Rodrigues, F. C. (1989), “A Proposed Entropy Measure for Assessing Combat Degradation,” The Journal of the Operational Research Society, 40, 789-793.

Ronen, J., and Falk, G. (1973), “Accounting Aggregation and the Entropy Measure: An Experimental Approach,” The Accounting Review, 48, 696-717.

Ryu, H. K., and Slottje, D. J. (1994), “Coordinate Space Versus Index Space Representations as Estimation Methods: An Application to How Macro Activity Affects the U.S. Income Distribution,” Journal of Business & Economic Statistics, 12, 243-251.

Sampson, A. R., and Smith, R. L. (1982), “Assessing Risks through the Determination of Rare Event Probabilities,” Operations Research, 30, 839-866.

Saviotti, P. P., and Pyka, A. (2004), “Economic Development, Variety and Employment,” Revue économique, 55, 1023-1049.

Semple, R. K., and Golledge, R. G. (1970), “An Analysis of Entropy Changes in a Settlement Pattern over Time,” Economic Geography, 46, 157-160.

Shindo, E., and McCormack, G. (1985), “Hunger and Weapons: The Entropy of Militarisation,” Review of African Political Economy, 6-22.

Shorrocks, A. F. (1980), “The Class of Additively Decomposable Inequality Measures,” Econometrica, 48, 613-625.

Soofi, E. S. (1990), “Generalized Entropy-Based Weights for Multiattribute Value Models,” Operations Research, 38, 362-363.

Stewart, J. F. (1979), “The Beta Distribution as a Model of Behavior in Consumer Goods Markets,” Management Science, 25, 813-821.

Stutzer, M. (1996), “A Simple Nonparametric Approach to Derivative Security Valuation,” The Journal of Finance, 51, 1633-1652.

Suppes, P. (1961), “Behavioristic Foundations of Utility,” Econometrica, 29, 186-202.

Tanyimboh, T. T., and Templeman, A. B. (1993), “Calculating Maximum Entropy Flows in Networks,” The Journal of the Operational Research Society, 44, 383-396.

Tavana, M. (2006), “A Priority Assessment Multi-Criteria Decision Model for Human Spaceflight Mission Planning at NASA,” The Journal of the Operational Research Society, 57, 1197-1215.

Teboulle, M. (1992), “Entropic Proximal Mappings with Applications to Nonlinear Programming,” Mathematics of Operations Research, 17, 670-690.

Thomas, M. U. (1979), “A Generalized Maximum Entropy Principle,” Operations Research, 27, 1188-1196.

Tseng, P. (2004), “An Analysis of the EM Algorithm and Entropy-Like Proximal Point Methods,” Mathematics of Operations Research, 29, 27-44.

Tummala, V. M. R., and Ling, H. (2000), “A Note on the Sampling Distribution of the Information Content of the Priority Vector of a Consistent Pairwise Comparison Judgment Matrix of AHP,” The Journal of the Operational Research Society, 51, 237-240.

Vachani, S. (1991), “Distinguishing between Related and Unrelated International Geographic Diversification: A Comprehensive Measure of Global Diversification,” Journal of International Business Studies, 22, 307-322.

Vanhonacker, W. R. (1985), “Testing the Exact Order of an Individual’s Choice Process in an Information-Theoretic Framework,” Journal of Marketing Research, 22, 377-387.

Vassiliou, P. C. G. (1984), “Entropy as a Measure of the Experience Distribution in a Manpower System,” The Journal of the Operational Research Society, 35, 1021-1025.

Vinod, H. D. (1985), “Measurement of Economic Distance between Blacks and Whites,” Journal of Business & Economic Statistics, 3, 78-88.

Webber, M. J. (1976), “Elementary Entropy Maximizing Probability Distributions: Analysis and Interpretation,” Economic Geography, 52, 218-227.

Weitzman, M. L. (2000), “Economic Profitability Versus Ecological Entropy,” The Quarterly Journal of Economics, 115, 237-263.

White, D. J. (1969), “Operational Research and Entropy,” OR, 20, 126-127.

White, D. J. (1970), “The Use of the Concept of Entropy in System Modelling,” Operational Research Quarterly (1970-1977), 21, 279-281.

White, D. J. (1975), “Entropy and Decision,” Operational Research Quarterly (1970-1977), 26, 15-23.

Whiteside, T. J. (1976), “A Comment on ‘A Marketing Model’ and on ‘Entropy and Decision’,” Operational Research Quarterly (1970-1977), 27, 1019-1020.

Willmer, M. A. P. (1966), “On the Measurement of Information in the Field of Criminal Detection,” OR, 17, 335-345.

Wilson, A. G. (1970), “The Use of the Concept of Entropy in System Modelling,” Operational Research Quarterly (1970-1977), 21, 247-265.

Wu, J.-S. (1992), “Maximum Entropy Analysis of Open Queueing Networks with Group Arrivals,” The Journal of the Operational Research Society, 43, 1063-1078.

Wu, J. S., and Chan, W. C. (1989), “Maximum Entropy Analysis of Multiple-Server Queueing Systems,” The Journal of the Operational Research Society, 40, 815-825.

Li, X. (1991), “An Aggregate Constraint Method for Non-Linear Programming,” The Journal of the Operational Research Society, 42, 1003-1010.

Yu, S. B., and Efstathiou, J. (2006), “Complexity in Rework Cells: Theory, Analysis and Comparison,” The Journal of the Operational Research Society, 57, 593-602.

Zandvakili, S. (1999), “Income Inequality among Female Heads of Households: Racial Inequality Reconsidered,” Economica, 66, 119-133.

Zellner, A., and Tobias, J. (2001), “Further Results on Bayesian Method of Moments Analysis of the Multiple Regression Model,” International Economic Review, 42, 121-140.

Zhang, X., and Fan, S. (2001), “Estimating Crop-Specific Production Technologies in Chinese Agriculture: A Generalized Maximum Entropy Approach,” American Journal of Agricultural Economics, 83, 378-388.

Zheng, B. (2007), “Unit-Consistent Decomposable Inequality Measures,” Economica, 74, 97-111.

Zohrabian, A., Traxler, G., Caudill, S., and Smale, M. (2003), “Valuing Pre-Commercial Genetic Resources: A Maximum Entropy Approach,” American Journal of Agricultural Economics, 85, 429-436.

Statistics and Probability

Lanterman, A. D. (2001), “Schwarz, Wallace, and Rissanen: Intertwining Themes in Theories of Model Selection,” International Statistical Review, 69, 185-212.

Adler, R. J., and Feigin, P. D. (1984), “On the Cadlaguity of Random Measures,” The Annals of Probability, 12, 615-630.

Adler, R. J., and Samorodnitsky, G. (1987), “Tail Behaviour for the Suprema of Gaussian Processes with Applications to Empirical Processes,” The Annals of Probability, 15, 1339-1351.

Agresti, A. (1986), “Applying R2-Type Measures to Ordered Categorical Data,” Technometrics, 28, 133-138.

Ahlswede, R., and Gacs, P. (1976), “Spreading of Sets in Product Spaces and Hypercontraction of the Markov Operator,” The Annals of Probability, 4, 925-939.

Aitchison, J. (1970), “Statistical Problems of Treatment Allocation,” Journal of the Royal Statistical Society. Series A (General), 133, 206-239.

Alexander, K. S., and Kalikow, S. A. (1992), “Random Stationary Processes,” The Annals of Probability, 20, 1174-1198.

Alexander, K. S., and Pyke, R. (1986), “A Uniform Central Limit Theorem for Set-Indexed Partial-Sum Processes with Finite Variance,” The Annals of Probability, 14, 582-597.

Algoet, P. H., and Cover, T. M. (1988), “A Sandwich Proof of the Shannon-McMillan-Breiman Theorem,” The Annals of Probability, 16, 899-909.

Alonso, A., and Molenberghs, G. (2007), “Surrogate Marker Evaluation from an Information Theory Perspective,” Biometrics, 63, 180-186.

Ankirchner, S., Dereich, S., and Imkeller, P. (2006), “The Shannon Information of Filtrations and the Additional Logarithmic Utility of Insiders,” The Annals of Probability, 34, 743-778.

Antonelli, P. L. (1979), “The Geometry of Random Drift V. Axiomatic Derivation of the WFK Diffusion from a Variational Principle,” Advances in Applied Probability, 11, 502-509.

Arizono, I., and Ohta, H. (1989), “A Test for Normality Based on Kullback-Leibler Information,” The American Statistician, 43, 20-22.

Arnold, L., Gundlach, V. M., and Demetrius, L. (1994), “Evolutionary Formalism for Products of Positive Random Matrices,” The Annals of Applied Probability, 4, 859-901.

Arratia, R., and Waterman, M. S. (1985), “Critical Phenomena in Sequence Matching,” The Annals of Probability, 13, 1236-1249.

Artalejo, J. R., and Lopez-Herrero, M. J. (2001), “Analysis of the Busy Period for the M/M/C Queue: An Algorithmic Approach,” Journal of Applied Probability, 38, 209-222.

Arwini, K., and Dodson, C. T. J. (2004), “Neighbourhoods of Randomness and Geometry of McKay Bivariate Gamma 3-Manifold,” Sankhya: The Indian Journal of Statistics (2003-), 66, 213-233.

Asadi, M., Ebrahimi, N., Hamedani, G. G., and Soofi, E. S. (2004), “Maximum Dynamic Entropy Models,” Journal of Applied Probability, 41, 379-390.

Asmussen, S., Nerman, O., and Olsson, M. (1996), “Fitting Phase-Type Distributions Via the EM Algorithm,” Scandinavian Journal of Statistics, 23, 419-441.

Autar, R. (1975), “On a Characterization of Information Improvement,” Journal of Applied Probability, 12, 407-411.

Autar, R., and Soni, R. S. (1975), “Inaccuracy and a Coding Theorem,” Journal of Applied Probability, 12, 845-851.

Baggen, S., et al. (2006), “Entropy of a Bit-Shift Channel,” Lecture Notes-Monograph Series, 48, 274-285.

Baggerly, K. A. (1998), “Empirical Likelihood as a Goodness-of-Fit Measure,” Biometrika, 85, 535-547.

Bahadoran, C., Guiol, H., Ravishankar, K., and Saada, E. (2006), “Euler Hydrodynamics of One-Dimensional Attractive Particle Systems,” The Annals of Probability, 34, 1339-1369.

Barlow, R. E., and Hsiung, J. H. (1983), “Expected Information from a Life Test Experiment,” Journal of the Royal Statistical Society. Series D (The Statistician), 32, 35-45.

Barron, A., and Hengartner, N. (1998), “Information Theory and Superefficiency,” The Annals of Statistics, 26, 1800-1825.

Barron, A., Schervish, M. J., and Wasserman, L. (1999), “The Consistency of Posterior Distributions in Nonparametric Problems,” The Annals of Statistics, 27, 536-561.

Barron, A. R. (1985), “The Strong Ergodic Theorem for Densities: Generalized Shannon-McMillan-Breiman Theorem,” The Annals of Probability, 13, 1292-1303.

Barron, A. R. (1986), “Entropy and the Central Limit Theorem,” The Annals of Probability, 14, 336-342.

Barron, A. R., and Sheu, C.-H. (1991), “Approximation of Density Functions by Sequences of Exponential Families,” The Annals of Statistics, 19, 1347-1369.

Bass, R. F., and Pyke, R. (1985), “The Space D(a) and Weak Convergence for Set-Indexed Processes,” The Annals of Probability, 13, 860-884.

Benassi, A., and Fouque, J.-P. (1987), “Hydrodynamical Limit for the Asymmetric Simple Exclusion Process,” The Annals of Probability, 15, 546-560.

Beran, R., and Dumbgen, L. (1998), “Modulation of Estimators and Confidence Sets,” The Annals of Statistics, 26, 1826-1856.

Berlekamp, E. R. (1972), “A Survey of Coding Theory,” Journal of the Royal Statistical Society. Series A (General), 135, 44-73.

Bernard, P., and Wu, L. (1998), “Stochastic Linearization: The Theory,” Journal of Applied Probability, 35, 718-730.

Bernardo, J. M., and Rueda, R. (2002), “Bayesian Hypothesis Testing: A Reference Approach,” International Statistical Review, 70, 351-372.

Berry, D. A. (1974), “Optimal Sampling Schemes for Estimating System Reliability by Testing Components-I: Fixed Sample Sizes,” Journal of the American Statistical Association, 69, 485-491.

Bertoluzza, C., and Forte, B. (1985), “Mutual Dependence of Random Variables and Maximum Discretized Entropy,” The Annals of Probability, 13, 630-637.

Clarke, B. (1996), “Implications of Reference Priors for Prior Information and for Sample Size,” Journal of the American Statistical Association, 91, 173-184.

Bhandari, S. K. (1986), “Characterisation of the Parent Distribution by Inequality Measures on Its Truncations,” Sankhya: The Indian Journal of Statistics, Series B, 48, 297-300.

Biagini, S., and Frittelli, M. (2004), “On the Super Replication Price of Unbounded Claims,” The Annals of Applied Probability, 14, 1970-1991.

Birch, J. J. (1962), “Approximations for the Entropy for Functions of Markov Chains,” The Annals of Mathematical Statistics, 33, 930-938.

Boldrighini, C., Keane, M., and Marchetti, F. (1978), “Billiards in Polygons,” The Annals of Probability, 6, 532-540.

Bolthausen, E., and Deuschel, J.-D. (1993), “Critical Large Deviations for Gaussian Fields in the Phase Transition Regime, I,” The Annals of Probability, 21, 1876-1920.

Bolthausen, E., and Giacomin, G. (2005), “Periodic Copolymers at Selective Interfaces: A Large Deviations Approach,” The Annals of Applied Probability, 15, 963-983.

Borth, D. M. (1975), “A Total Entropy Criterion for the Dual Problem of Model Discrimination and Parameter Estimation,” Journal of the Royal Statistical Society. Series B (Methodological), 37, 77-87.

Borth, D. M., McKay, R. J., and Elliott, J. R. (1985), “A Difficulty Information Approach to Substituent Selection in QSAR Studies,” Technometrics, 27, 25-35.

Bose, R. C., and Kuebler, R. R., Jr. (1960), “A Geometry of Binary Sequences Associated with Group Alphabets in Information Theory,” The Annals of Mathematical Statistics, 31, 113-139.

Bosma, W., Dajani, K., and Kraaikamp, C. (2006), “Entropy Quotients and Correct Digits in Number-Theoretic Expansions,” Lecture Notes-Monograph Series, 48, 176-188.

Boucher, C., Ellis, R. S., and Turkington, B. (1999), “Spatializing Random Measures: Doubly Indexed Processes and the Large Deviation Principle,” The Annals of Probability, 27, 297-324.

Boucheron, S., Bousquet, O., Lugosi, G., and Massart, P. (2005), “Moment Inequalities for Functions of Independent Random Variables,” The Annals of Probability, 33, 514-560.

Boucheron, S., Lugosi, G., and Massart, P. (2003), “Concentration Inequalities Using the Entropy Method,” The Annals of Probability, 31, 1583-1614.

Bowman, A. W., and Foster, P. J. (1993), “Adaptive Smoothing and Density-Based Tests of Multivariate Normality,” Journal of the American Statistical Association, 88, 529-537.

Braess, D., and Dette, H. (2004), “The Asymptotic Minimax Risk for the Estimation of Constrained Binomial and Multinomial Probabilities,” Sankhya: The Indian Journal of Statistics (2003-), 66, 707-732.

Braverman, A. (2002), “Compressing Massive Geophysical Datasets Using Vector Quantization,” Journal of Computational and Graphical Statistics, 11, 44-62.

Breiman, L. (1957), “The Individual Ergodic Theorem of Information Theory,” The Annals of Mathematical Statistics, 28, 809-811.

Breiman, L. (1960), “Correction Notes: Correction to ‘The Individual Ergodic Theorem of Information Theory’,” The Annals of Mathematical Statistics, 31, 809-810.

Brown, T. A. (1963), “Entropy and Conjugacy,” The Annals of Mathematical Statistics, 34, 226-232.

Buhlmann, P., and Wyner, A. J. (1999), “Variable Length Markov Chains,” The Annals of Statistics, 27, 480-513.

Bulmer, M. G. (1974), “On Fitting the Poisson Lognormal Distribution to Species-Abundance Data,” Biometrics, 30, 101-110.

Burnham, K. P., White, G. C., and Anderson, D. R. (1995), “Model Selection Strategy in the Analysis of Capture-Recapture Data,” Biometrics, 51, 888-898.

Burshtein, D., Della Pietra, V., Kanevsky, D., and Nadas, A. (1992), “Minimum Impurity Partitions,” The Annals of Statistics, 20, 1637-1646.

Burton, R., and Pemantle, R. (1993), “Local Characteristics, Entropy and Limit Theorems for Spanning Trees and Domino Tilings Via Transfer-Impedances,” The Annals of Probability, 21, 1329-1371.

Buss, S. R., and Clote, P. (2004), “Solving the Fisher-Wright and Coalescence Problems with a Discrete Markov Chain Analysis,” Advances in Applied Probability, 36, 1175-1197.

Chakravarty, S. R. (1982), “An Axiomatisation of the Entropy Measure of Inequality,” Sankhya: The Indian Journal of Statistics, Series B, 44, 351-354.

Champagnat, F., and Idier, J. (2000), “On the Correlation Structure of Unilateral AR Processes on the Plane,” Advances in Applied Probability, 32, 408-425.

Chan, T. (1999), “Pricing Contingent Claims on Stocks Driven by Levy Processes,” The Annals of Applied Probability, 9, 504-528.

Chatfield, C. (1973), “Statistical Inference Regarding Markov Chain Models,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 22, 7-20.

Chen, X.-H., Dempster, A. P., and Liu, J. S. (1994), “Weighted Finite Population Sampling to Maximize Entropy,” Biometrika, 81, 457-469.

Chiu, S. N. (1994), “Mean-Value Formulae for the Neighbourhood of the Typical Cell of a Random Tessellation,” Advances in Applied Probability, 26, 565-576.

Christensen, E. S. (1989), “Statistical Properties of I-Projections within Exponential Families,” Scandinavian Journal of Statistics, 16, 307-318.

Külske, C., Le Ny, A., and Redig, F. (2004), “Relative Entropy and Variational Properties of Generalized Gibbsian Measures,” The Annals of Probability, 32, 1691-1726.

Chung, K. L. (1961), “A Note on the Ergodic Theorem of Information Theory,” The Annals of Mathematical Statistics, 32, 612-614.

Conger, M., and Viswanath, D. (2006), “Riffle Shuffles of Decks with Repeated Cards,” The Annals of Probability, 34, 804-819.

Courbage, M., and Hamdan, D. (1994), “Chapman-Kolmogorov Equation for Non-Markovian Shift-Invariant Measures,” The Annals of Probability, 22, 1662-1677.

Cover, T. M., Gacs, P., and Gray, R. M. (1989), “Kolmogorov’s Contributions to Information Theory and Algorithmic Complexity,” The Annals of Probability, 17, 840-865.

Crescenzo, A. D., and Longobardi, M. (2002), “Entropy-Based Measure of Uncertainty in Past Lifetime Distributions,” Journal of Applied Probability, 39, 434-440.

Csiszar, I. (1984), “Sanov Property, Generalized I-Projection and a Conditional Limit Theorem,” The Annals of Probability, 12, 768-793.

Csiszar, I. (1991), “Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems,” The Annals of Statistics, 19, 2032-2066.

Currin, C., Mitchell, T., Morris, M., and Ylvisaker, D. (1991), “Bayesian Prediction of Deterministic Functions, with Applications to the Design and Analysis of Computer Experiments,” Journal of the American Statistical Association, 86, 953-963.

Cutler, C. D., and Dawson, D. A. (1990), “Nearest-Neighbor Analysis of a Family of Fractal Distributions,” The Annals of Probability, 18, 256-271.

Daley, D. J., and Vere-Jones, D. (2004), “Scoring Probability Forecasts for Point Processes: The Entropy Score and Information Gain,” Journal of Applied Probability, 41, 297-312.

Darroch, J. N., and Ratcliff, D. (1972), “Generalized Iterative Scaling for Log-Linear Models,” The Annals of Mathematical Statistics, 43, 1470-1480.

Davis, H. T., and Koopmans, L. H. (1973), “Adaptive Prediction of Stationary Time Series,” Sankhya: The Indian Journal of Statistics, Series A, 35, 5-22.

Dawid, A. P., and Vovk, V. G. (1999), “Prequential Probability: Principles and Properties,” Bernoulli, 5, 125-162.

DeGroot, M. H. (1962), “Uncertainty, Information, and Sequential Experiments,” The Annals of Mathematical Statistics, 33, 404-419.

Dembo, A. (1997), “Information Inequalities and Concentration of Measure,” The Annals of Probability, 25, 927-939.

Diaconis, P., and Freedman, D. (1990), “On the Uniform Consistency of Bayes Estimates for Multinomial Probabilities,” The Annals of Statistics, 18, 1317-1327.

Diaconis, P., and Zabell, S. L. (1982), “Updating Subjective Probability,” Journal of the American Statistical Association, 77, 822-830.

Djehiche, B., and Kaj, I. (1995), “The Rate Function for Some Measure-Valued Jump Processes,” The Annals of Probability, 23, 1414-1438.

Donoho, D. L., Johnstone, I. M., Hoch, J. C., and Stern, A. S. (1992), “Maximum Entropy and the Nearly Black Object,” Journal of the Royal Statistical Society. Series B (Methodological), 54, 41-81.

Droge, B. (1998), “Minimax Regret Analysis of Orthogonal Series Regression Estimation: Selection Versus Shrinkage,” Biometrika, 85, 631-643.

Dudewicz, E. J., and Meulen, E. C. v. d. (1981), “Entropy-Based Tests of Uniformity,” Journal of the American Statistical Association, 76, 967-974.

Dudley, R. M. (1973), “Sample Functions of the Gaussian Process,” The Annals of Probability, 1, 66-103.

Dudley, R. M. (1978), “Central Limit Theorems for Empirical Measures,” The Annals of Probability, 6, 899-929.

Dudley, R. M. (1987), “Universal Donsker Classes and Metric Entropy,” The Annals of Probability, 15, 1306-1326.

Duncan, G., and Lambert, D. (1989), “The Risk of Disclosure for Microdata,” Journal of Business & Economic Statistics, 7, 207-217.

Dunham, J. G. (1980), “Abstract Alphabet Sliding-Block Entropy Compression Coding with a Fidelity Criterion,” The Annals of Probability, 8, 1085-1092.

Dutta, M. (1966), “On Maximum (Information-Theoretic) Entropy Estimation,” Sankhya: The Indian Journal of Statistics, Series A, 28, 319-328.

Cabanal-Duvillard, T., and Guionnet, A. (2001), “Large Deviations Upper Bounds for the Laws of Matrix-Valued Processes and Non-Communicative Entropies,” The Annals of Probability, 29, 1205-1261.

Dym, H. (1966), “A Note on Limit Theorems for the Entropy of Markov Chains,” The Annals of Mathematical Statistics, 37, 522-524.

Eaves, D. M. (1985), “On Maximizing Missing Information About a Hypothesis,” Journal of the Royal Statistical Society. Series B (Methodological), 47, 263-266.

Ebanks, B. (1978), “The Branching Property in Generalized Information Theory,” Advances in Applied Probability, 10, 788-802.

Ebanks, B. R. (1984), “Polynomially Additive Entropies,” Journal of Applied Probability, 21, 179-185.

Ebrahimi, N. (2000), “The Maximum Entropy Method for Lifetime Distributions,” Sankhya: The Indian Journal of Statistics, Series A, 62, 236-243.

Ebrahimi, N., Habibullah, M., and Soofi, E. S. (1992), “Testing Exponentiality Based on Kullback-Leibler Information,” Journal of the Royal Statistical Society. Series B (Methodological), 54, 739-748.

Ebrahimi, N., and Soofi, E. S. (1990), “Relative Information Loss under Type II Censored Exponential Data,” Biometrika, 77, 429-435.

Eeden, C. V., and Zidek, J. V. (2002), “Combining Sample Information in Estimating Ordered Normal Means,” Sankhya: The Indian Journal of Statistics, Series A, 64, 588-610.

Eguchi, S., and Copas, J. (1998), “A Class of Local Likelihood Methods and Near-Parametric Asymptotics,” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 60, 709-724.

Elias, P. (1972), “The Efficient Construction of an Unbiased Random Sequence,” The Annals of Mathematical Statistics, 43, 865-870.

Enns, E. G. (1975), “Selecting the Maximum of a Sequence with Imperfect Information,” Journal of the American Statistical Association, 70, 640-643.

Epifani, I., Lijoi, A., and Prünster, I. (2003), “Exponential Functionals and Means of Neutral-to-the-Right Priors,” Biometrika, 90, 791-808.

Erschler, A. (2003), “On Drift and Entropy Growth for Random Walks on Groups,” The Annals of Probability, 31, 1193-1204.

Es, B. v. (1992), “Estimating Functionals Related to a Density by a Class of Statistics Based on Spacings,” Scandinavian Journal of Statistics, 19, 61-72.

Evans, W., Kenyon, C., Peres, Y., and Schulman, L. J. (2000), “Broadcasting on Trees and the Ising Model,” The Annals of Applied Probability, 10, 410-433.

Feldman, J., and Smorodinsky, M. (1971), “Bernoulli Flows with Infinite Entropy,” The Annals of Mathematical Statistics, 42, 381-382.

Ferguson, T. S. (1973), “A Bayesian Analysis of Some Nonparametric Problems,” The Annals of Statistics, 1, 209-230.

Follmer, H., and Gantert, N. (1997), “Entropy Minimization and Schrodinger Processes in Infinite Dimensions,” The Annals of Probability, 25, 901-926.

Follmer, H., and Orey, S. (1988), “Large Deviations for the Empirical Field of a Gibbs Measure,” The Annals of Probability, 16, 961-977.

Franke, J. (1985), “ARMA Processes Have Maximal Entropy among Time Series with Prescribed Autocovariances and Impulse Responses,” Advances in Applied Probability, 17, 810-840.

Gamboa, F., and Gassiat, E. (1997), “Bayesian Methods and Maximum Entropy for Ill-Posed Inverse Problems,” The Annals of Statistics, 25, 328-350.

Gao, F., and Quastel, J. (2003), “Exponential Decay of Entropy in the Random Transposition and Bernoulli-Laplace Models,” The Annals of Applied Probability, 13, 1591-1600.

Gardner, R. J., Kiderlen, M., and Milanfar, P. (2006), “Convergence of Algorithms for Reconstructing Convex Bodies and Directional Measures,” The Annals of Statistics, 34, 1331-1374.

Gatsonis, C. A. (1984), “Deriving Posterior Distributions for a Location Parameter: A Decision Theoretic Approach,” The Annals of Statistics, 12, 958-970.

Ge, H., Jiang, D.-Q., and Qian, M. (2006), “Reversibility and Entropy Production of Inhomogeneous Markov Chains,” Journal of Applied Probability, 43, 1028-1043.

Geer, S. v. d. (1987), “A New Approach to Least-Squares Estimation, with Applications,” The Annals of Statistics, 15, 587-602.

Geer, S. v. d. (1990), “Estimating a Regression Function,” The Annals of Statistics, 18, 907-924.

Geer, S. v. d. (1993), “Hellinger-Consistency of Certain Nonparametric Maximum Likelihood Estimators,” The Annals of Statistics, 21, 14-44.

Geer, S. v. d. (1995), “Exponential Inequalities for Martingales, with Application to Maximum Likelihood Estimation for Counting Processes,” The Annals of Statistics, 23, 1779-1801.

Gendron, P., and Nandram, B. (2001), “An Empirical Bayes Estimator of Seismic Events Using Wavelet Packet Bases,” Journal of Agricultural, Biological, and Environmental Statistics, 6, 379-406.

Genovese, C. R., and Wasserman, L. (2000), “Rates of Convergence for the Gaussian Mixture Sieve,” The Annals of Statistics, 28, 1105-1127.

Georgii, H.-O. (1993), “Large Deviations and Maximum Entropy Principle for Interacting Random Fields on Z^d,” The Annals of Probability, 21, 1845-1875.

Ghosal, S., and Vaart, A. W. v. d. (2001), “Entropies and Rates of Convergence for Maximum Likelihood and Bayes Estimation for Mixtures of Normal Densities,” The Annals of Statistics, 29, 1233-1263.

Ghosh, M., and Yang, M.-C. (1988), “Simultaneous Estimation of Poisson Means under Entropy Loss,” The Annals of Statistics, 16, 278-291.

Ghurye, S. G. (1968), “Information and Sufficient Sub-Fields,” The Annals of Mathematical Statistics, 39, 2056-2066.

Gilbert, E. N. (1958), “An Outline of Information Theory,” The American Statistician, 12, 13-19.

Gilula, Z., Krieger, A. M., and Ritov, Y. (1988), “Ordinal Association in Contingency Tables: Some Interpretive Aspects,” Journal of the American Statistical Association, 83, 540-545.

Gilula, Z., and Haberman, S. J. (2000), “Density Approximation by Summary Statistics: An Information-Theoretic Approach,” Scandinavian Journal of Statistics, 27, 521-534.

Gine, E., and Zinn, J. (1984), “Some Limit Theorems for Empirical Processes,” The Annals of Probability, 12, 929-989.

Girardin, V., and Limnios, N. (2003), “On the Entropy for Semi-Markov Processes,” Journal of Applied Probability, 40, 1060-1068.

Gleaton, J. U., and Lynch, J. D. (2004), “On the Distribution of the Breaking Strain of a Bundle of Brittle Elastic Fibers,” Advances in Applied Probability, 36, 98-115.

Golan, A., Judge, G., and Perloff, J. M. (1996), “A Maximum Entropy Approach to Recovering Information from Multinomial Response Data,” Journal of the American Statistical Association, 91, 841-853.

Goldie, C. M., and Greenwood, P. E. (1986), “Variance of Set-Indexed Sums of Mixing Random Variables and Weak Convergence of Set-Indexed Processes,” The Annals of Probability, 14, 817-839.

Good, I. J. (1953), “The Population Frequencies of Species and the Estimation of Population Parameters,” Biometrika, 40, 237-264.

Good, I. J. (1963), “Maximum Entropy for Hypothesis Formulation, Especially for Multidimensional Contingency Tables,” The Annals of Mathematical Statistics, 34, 911-934.

Blower, G., and Kelsall, J. E. (2002), “Nonlinear Kernel Density Estimation for Binned Data: Convergence in Entropy,” Bernoulli, 8, 423-449.

Grandits, P. (1999), “The P-Optimal Martingale Measure and Its Asymptotic Relation with the Minimal-Entropy Martingale Measure,” Bernoulli, 5, 225-247.

Grandits, P., and Rheinländer, T. (2002), “On the Minimal Entropy Martingale Measure,” The Annals of Probability, 30, 1003-1038.

Gray, R. M., Neuhoff, D. L., and Shields, P. C. (1975), “A Generalization of Ornstein’s d̄ Distance with Applications to Information Theory,” The Annals of Probability, 3, 315-328.

Gray, R. M., Ornstein, D. S., and Dobrushin, R. L. (1980), “Block Synchronization, Sliding-Block Coding, Invulnerable Sources and Zero Error Codes for Discrete Noisy Channels,” The Annals of Probability, 8, 639-674.

Griffeath, D. S. (1972), “Computer Solution of the Discrete Maximum Entropy Problem,” Technometrics, 14, 891-897.

Grünwald, P. D., and Dawid, A. P. (2004), “Game Theory, Maximum Entropy, Minimum Discrepancy and Robust Bayesian Decision Theory,” The Annals of Statistics, 32, 1367-1433.

Gulati, B. R., and Kounias, E. G. (1973), “On Three Level Symmetrical Factorial Designs and Ternary Group Codes,” Sankhya: The Indian Journal of Statistics, Series A, 35, 377-392.

Gupta, S. S., and Huang, D.-Y. (1976), “On Subset Selection Procedures for the Entropy Function Associated with the Binomial Populations,” Sankhya: The Indian Journal of Statistics, Series A, 38, 153-173.

Gutiérrez-Peña, E., and Muliere, P. (2004), “Conjugate Priors Represent Strong Pre-Experimental Assumptions,” Scandinavian Journal of Statistics, 31, 235-246.

Hall, P. (1987), “On the Amount of Detail That Can Be Recovered from a Degraded Signal,” Advances in Applied Probability, 19, 371-395.

Hall, P., and Presnell, B. (1999), “Density Estimation under Constraints,” Journal of Computational and Graphical Statistics, 8, 259-277.

Hall, P., and Titterington, D. M. (1986), “On Some Smoothing Techniques Used in Image Restoration,” Journal of the Royal Statistical Society. Series B (Methodological), 48, 330-343.

Hansen, M. H., and Yu, B. (2001), “Model Selection and the Principle of Minimum Description Length,” Journal of the American Statistical Association, 96, 746-774.

Hart, P. E. (1971), “Entropy and Other Measures of Concentration,” Journal of the Royal Statistical Society. Series A (General), 134, 73-85.

Haussler, D., and Opper, M. (1997), “Mutual Information, Metric Entropy and Cumulative Relative Entropy Risk,” The Annals of Statistics, 25, 2451-2492.

Heavlin, W. D. (2003), “Designing Experiments for Causal Networks,” Technometrics, 45, 115-129.

Henery, R. J. (1986), “Interpretation of Average Ranks,” Biometrika, 73, 224-227.

Hickey, R. J. (1982), “A Note on the Measurement of Randomness,” Journal of Applied Probability, 19, 229-232.

Hinich, M. (1965), “Large-Sample Estimation of an Unknown Discrete Waveform Which Is Randomly Repeating in Gaussian Noise,” The Annals of Mathematical Statistics, 36, 489-508.

Holgate, P. (1981), “The Statistical Entropy of a Sample from a Community of Species,” Biometrics, 37, 795-799.

Horowitz, I. (1970), “Employment Concentration in the Common Market: An Entropy Approach,” Journal of the Royal Statistical Society. Series A (General), 133, 463-479.

Hsu, L. (1995), “New Procedures for Group-Testing Based on the Huffman Lower Bound and Shannon Entropy Criteria,” Lecture Notes-Monograph Series, 25, 249-262.

Huang, D. (1990), “On the Maximal Entropy Property for ARMA Processes and ARMA Approximation,” Advances in Applied Probability, 22, 612-626.

Hurvich, C. M. (1986), “Data-Dependent Spectral Windows: Generalizing the Classical Framework to Include Maximum Entropy Estimates,” Technometrics, 28, 259-268.

Ibrahim, J. G., and Laud, P. W. (1994), “A Predictive Approach to the Analysis of Designed Experiments,” Journal of the American Statistical Association, 89, 309-319.

Pickands, J., III, and Stine, R. A. (1997), “Estimation for an M/G/∞ Queue with Incomplete Information,” Biometrika, 84, 295-308.

Iosifescu, M. (1965), “Sampling Entropy for Random Homogeneous Systems with Complete Connections,” The Annals of Mathematical Statistics, 36, 1433-1436.

Iosifescu, M. (1969), “Correction Note: Correction to ‘Sampling Entropy for Random Homogeneous Systems with Complete Connections’,” The Annals of Mathematical Statistics, 40, 2215.

Isaki, C. T. (1983), “Variance Estimation Using Auxiliary Information,” Journal of the American Statistical Association, 78, 117-123.

Jain, N. C. (1990), “Large Deviation Lower Bounds for Additive Functionals of Markov Processes,” The Annals of Probability, 18, 1071-1098.

Jennison, C., and Turnbull, B. W. (1997), “Group-Sequential Analysis Incorporating Covariate Information,” Journal of the American Statistical Association, 92, 1330-1341.

Joe, H. (1987), “Majorization, Randomness and Dependence for Multivariate Distributions,” The Annals of Probability, 15, 1217-1225.

Joe, H. (1988), “Majorization, Entropy and Paired Comparisons,” The Annals of Statistics, 16, 915-925.

Joe, H. (1989), “Relative Entropy Measures of Multivariate Dependence,” Journal of the American Statistical Association, 84, 157-164.

Johnson, O. (2006), “A Central Limit Theorem for Non-Overlapping Return Times,” Journal of Applied Probability, 43, 32-47.

Johnson, R. A., and Wehrly, T. E. (1978), “Some Angular-Linear Distributions and Related Regression Models,” Journal of the American Statistical Association, 73, 602-606.

Johnstone, I. M. (1994), “On Minimax Estimation of a Sparse Normal Mean Vector,” The Annals of Statistics, 22, 271-289.

Johnstone, I. M., and Silverman, B. W. (1997), “Wavelet Threshold Estimators for Data with Correlated Noise,” Journal of the Royal Statistical Society. Series B (Methodological), 59, 319-351.

Jon, S. (1976), “On the Optimal Use of Multiauxiliary Information,” Journal of the American Statistical Association, 71, 679.

Jonasson, J. (1999), “The Random Triangle Model,” Journal of Applied Probability, 36, 852-867.

Jourdain, B. (2002), “Probabilistic Characteristics Method for a One-Dimensional Inviscid Scalar Conservation Law,” The Annals of Applied Probability, 12, 334-360.

Jupp, P. E., and Mardia, K. V. (1983), “A Note on the Maximum-Entropy Principle,” Scandinavian Journal of Statistics, 10, 45-47.

Kaimanovich, V. A., and Vershik, A. M. (1983), “Random Walks on Discrete Groups: Boundary and Entropy,” The Annals of Probability, 11, 457-490.

Kaimanovich, V. A., and Woess, W. (2002), “Boundary and Entropy of Space Homogeneous Markov Chains,” The Annals of Probability, 30, 323-363.

Kalikow, S., and Weiss, B. (1992), “Explicit Codes for Some Infinite Entropy Bernoulli Shifts,” The Annals of Probability, 20, 397-402.

Kamae, T. (2006), “Numeration Systems as Dynamical Systems: Introduction,” Lecture Notes-Monograph Series, 48, 198-211.

Kannappan, P. L., and Ng, C. T. (1980), “On Functional Equations and Measures of Information, II,” Journal of Applied Probability, 17, 271-277.

Karlin, S., and Rinott, Y. (1981), “Entropy Inequalities for Classes of Probability Distributions I. The Univariate Case,” Advances in Applied Probability, 13, 93-112.

Karlin, S., and Rinott, Y. (1981), “Entropy Inequalities for Classes of Probability Distributions II. The Multivariate Case,” Advances in Applied Probability, 13, 325-351.

Kavalieris, L. (1991), “A Note on Estimating Autoregressive-Moving Average Order,” Biometrika, 78, 920-922.

Kemperman, J. H. B. (1969), “On the Optimum Rate of Transmitting Information,” The Annals of Mathematical Statistics, 40, 2156-2177.

Kendall, M. G. (1973), “Entropy, Probability and Information,” International Statistical Review, 41, 59-68.

Kiefer, J. (1961), “Optimum Designs in Regression Problems, II,” The Annals of Mathematical Statistics, 32, 298-325.

Kieffer, J. C. (1973), “A Counterexample to Perez’s Generalization of the Shannon-McMillan Theorem,” The Annals of Probability, 1, 362-364.

Kieffer, J. C. (1974), “On the Approximation of Stationary Measures by Periodic and Ergodic Measures,” The Annals of Probability, 2, 530-534.

Kieffer, J. C. (1975), “A Generalized Shannon-McMillan Theorem for the Action of an Amenable Group on a Probability Space,” The Annals of Probability, 3, 1031-1037.

Kieffer, J. C. (1980), “On the Transmission of Bernoulli Sources over Stationary Channels,” The Annals of Probability, 8, 942-961.

Kim, G.-H., and David, H. T. (1979), “Large Deviations of Functions of Markovian Transitions and Mathematical Programming Duality,” The Annals of Probability, 7, 874-881.

King, R., and Brooks, S. P. (2001), “On the Bayesian Analysis of Population Size,” Biometrika, 88, 317-336.

Kleijn, B. J. K., and Vaart, A. W. v. d. (2006), “Misspecification in Infinite-Dimensional Bayesian Statistics,” The Annals of Statistics, 34, 837-877.

Klein, T., and Rio, E. (2005), “Concentration around the Mean for Maxima of Empirical Processes,” The Annals of Probability, 33, 1060-1077.

Klotz, J. H. (1978), “Maximum Entropy Constrained Balance Randomization for Clinical Trials,” Biometrics, 34, 283-287.

Konishi, S., and Kitagawa, G. (1996), “Generalised Information Criteria in Model Selection,” Biometrika, 83, 875-890.

Kosygina, E. (2001), “The Behavior of the Specific Entropy in the Hydrodynamic Scaling Limit,” The Annals of Probability, 29, 1086-1110.

Kotz, S. (1966), “Recent Results in Information Theory,” Journal of Applied Probability, 3, 1-93.

Krafft, O., and Schmitz, N. (1969), “A Note on Hoeffding’s Inequality,” Journal of the American Statistical Association, 64, 907-912.

Kruk, L. (2004), “Limiting Distributions for Minimum Relative Entropy Calibration,” Journal of Applied Probability, 41, 35-50.

Kubokawa, T. (1988), “The Recovery of Interblock Information in Balanced Incomplete Block Designs,” Sankhya: The Indian Journal of Statistics, Series B, 50, 78-89.

Kubokawa, T., Robert, C., and Saleh, A. K. M. E. (1992), “Empirical Bayes Estimation of the Covariance Matrix of a Normal Distribution with Unknown Mean under an Entropy Loss,” Sankhya: The Indian Journal of Statistics, Series A, 54, 402-410.

Kullback, S. (1952), “An Application of Information Theory to Multivariate Analysis,” The Annals of Mathematical Statistics, 23, 88-102.

Kullback, S. (1954), “Certain Inequalities in Information Theory and the Cramer-Rao Inequality,” The Annals of Mathematical Statistics, 25, 745-751.

Kullback, S. (1956), “An Application of Information Theory to Multivariate Analysis, II,” The Annals of Mathematical Statistics, 27, 122-146.

Kullback, S. (1956), “Correction to ‘An Application of Information Theory to Multivariate Analysis, II’,” The Annals of Mathematical Statistics, 27, 860.

Kullback, S. (1969), “A Bound for the Variation of Gaussian Densities,” The Annals of Mathematical Statistics, 40, 2180-2182.

Kullback, S., Kupperman, M., and Ku, H. H. (1962), “Tests for Contingency Tables and Markov Chains,” Technometrics, 4, 573-608.

Kullback, S., and Rosenblatt, H. M. (1957), “On the Analysis of Multiple Regression in K Categories,” Biometrika, 44, 67-83.

Kumar, S., and Milton, S. (1971), “Finding a Single Defective in Binomial Group-Testing,” Journal of the American Statistical Association, 66, 824-828.

Lagarias, J. C. (1993), “Pseudorandom Numbers,” Statistical Science, 8, 31-39.

Lalley, S. P. (1992), “Brownian Motion and the Equilibrium Measure on the Julia Set of a Rational Mapping,” The Annals of Probability, 20, 1932-1967.

Lapedes, A. S., Bertrand, G. G., Liu, L., and Stormo, G. D. (1999), “Correlated Mutations in Models of Protein Sequences: Phylogenetic and Structural Effects,” Lecture Notes-Monograph Series, 33, 236-256.

Larimore, W. E. (1983), “Predictive Inference, Sufficiency, Entropy and an Asymptotic Likelihood Principle,” Biometrika, 70, 175-181.

Lavine, M. (1994), “More Aspects of Polya Tree Distributions for Statistical Modelling,” The Annals of Statistics, 22, 1161-1176.

Lee, P. M. (1964), “On the Axioms of Information Theory,” The Annals of Mathematical Statistics, 35, 415-418.

Lehmann, E. L. (1983), “Estimation with Inadequate Information,” Journal of the American Statistical Association, 78, 624-627.

Leonard, C., and Najim, J. (2002), “An Extension of Sanov’s Theorem: Application to the Gibbs Conditioning Principle,” Bernoulli, 8, 721-743.

LeSage, J. P. (1991), “Analysis and Development of Leading Indicators Using a Bayesian Turning-Points Approach,” Journal of Business & Economic Statistics, 9, 305-316.

Leysieffer, F. W., and Warner, S. L. (1976), “Respondent Jeopardy and Optimal Designs in Randomized Response Models,” Journal of the American Statistical Association, 71, 649-656.

Li, W. V., and Linde, W. (1999), “Approximation, Metric Entropy and Small Ball Estimates for Gaussian Measures,” The Annals of Probability, 27, 1556-1578.

Li, X., Mikusinski, P., Sherwood, H., and Taylor, M. D. (1996), “In Quest of Birkhoff’s Theorem in Higher Dimensions,” Lecture Notes-Monograph Series, 28, 187-197.

Lin, S., and Ding, J. (2009), “Integration of Ranked Lists Via Cross Entropy Monte Carlo with Applications to mRNA and microRNA Studies,” Biometrics, 65, 9-18.

Lind, N. C. (1994), “Information Theory and Maximum Product of Spacings Estimation,” Journal of the Royal Statistical Society. Series B (Methodological), 56, 341-343.

Lu, S. (1995), “Hydrodynamic Scaling Limits with Deterministic Initial Configurations,” The Annals of Probability, 23, 1831-1852.

Luo, Y., and Lin, S. (2003), “Information Gain for Genetic Parameter Estimation with Incorporation of Marker Data,” Biometrics, 59, 393-401.

Lutwak, E., Yang, D., and Zhang, G. (2004), “Moment-Entropy Inequalities,” The Annals of Probability, 32, 757-774.

Giné, E. (1974), “On the Central Limit Theorem for Sample Continuous Processes,” The Annals of Probability, 2, 629-641.

Madych, W. R. (1991), “Solutions of Underdetermined Systems of Linear Equations,” Lecture Notes-Monograph Series, 20, 227-238.

Manuceau, J., Troup, M., and Vaillant, J. (1999), “On an Entropy Conservation Principle,” Journal of Applied Probability, 36, 607-610.

Marcus, M. B. (1973), “A Comparison of Continuity Conditions for Gaussian Processes,” The Annals of Probability, 1, 123-130.

Mardia, K. V., and Gadsden, R. J. (1977), “A Small Circle of Best Fit for Spherical Data and Areas of Vulcanism,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 26, 238-245.

Martin-Löf, A. (1986), “Entropy Estimates for the First Passage Time of a Random Walk to a Time Dependent Barrier,” Scandinavian Journal of Statistics, 13, 221-229.

Martin-Löf, P. (1974), “The Notion of Redundancy and Its Use as a Quantitative Measure of the Discrepancy between a Statistical Hypothesis and a Set of Observational Data [with Discussion],” Scandinavian Journal of Statistics, 1, 3-18.

Marton, K., and Shields, P. C. (1994), “Entropy and the Consistent Estimation of Joint Distributions,” The Annals of Probability, 22, 960-977.

Marton, K., and Shields, P. C. (1996), “Correction: Entropy and the Consistent Estimation of Joint Distributions,” The Annals of Probability, 24, 541-545.

Massart, P. (1989), “Strong Approximation for Multivariate Empirical and Related Processes, Via KMT Constructions,” The Annals of Probability, 17, 266-291.

Massart, P. (2000), “About the Constants in Talagrand’s Concentration Inequalities for Empirical Processes,” The Annals of Probability, 28, 863-884.

Massart, P., and Nédélec, É. (2006), “Risk Bounds for Statistical Learning,” The Annals of Statistics, 34, 2326-2366.

Mayfield, E. S., and Mizrach, B. (1992), “On Determining the Dimension of Real-Time Stock-Price Data,” Journal of Business & Economic Statistics, 10, 367-374.

McCann, M., and Edwards, D. (1996), “A Path Length Inequality for the Multivariate-t Distribution, with Applications to Multiple Comparisons,” Journal of the American Statistical Association, 91, 211-216.

McClintock, B. T., White, G. C., Antolin, M. F., and Tripp, D. W. (2009), “Estimating Abundance Using Mark-Resight When Sampling Is with Replacement or the Number of Marked Individuals Is Unknown,” Biometrics, 65, 237-246.

McCulloch, R. E. (1988), “Information and the Likelihood Function in Exponential Families,” The American Statistician, 42, 73-75.

McDonald, J. B., and Jensen, B. C. (1979), “An Analysis of Some Properties of Alternative Measures of Income Inequality Based on the Gamma Distribution Function,” Journal of the American Statistical Association, 74, 856-860.

McEliece, R. J., and Posner, E. C. (1971), “Hide and Seek, Data Storage, and Entropy,” The Annals of Mathematical Statistics, 42, 1706-1716.

McEliece, R. J., and Posner, E. C. (1973), “Hiding and Covering in a Compact Metric Space,” The Annals of Statistics, 1, 729-739.

McMillan, B. (1953), “The Basic Theorems of Information Theory,” The Annals of Mathematical Statistics, 24, 196-219.

Menard, S. (2004), “Six Approaches to Calculating Standardized Logistic Regression Coefficients,” The American Statistician, 58, 218-223.

Merkl, F., and Rolles, S. W. W. (2005), “Edge-Reinforced Random Walk on a Ladder,” The Annals of Probability, 33, 2051-2093.

Miles, R. E. (1984), “Symmetric Sequential Analysis: The Efficiencies of Sports Scoring Systems (with Particular Reference to Those of Tennis),” Journal of the Royal Statistical Society. Series B (Methodological), 46, 93-108.

Miller, M. I., Roysam, B., Smith, K., and Udding, J. T. (1991), “On the Equivalence of Regular Grammars and Stochastic Constraints: Applications to Image Processing on Massively Parallel Processors,” Lecture Notes-Monograph Series, 20, 239-257.

Milton, S., and Groll, P. A. (1966), “Binomial Group-Testing with an Unknown Proportion of Defectives,” Technometrics, 8, 631-656.

Mitchell, T. J., and Beauchamp, J. J. (1988), “Bayesian Variable Selection in Linear Regression,” Journal of the American Statistical Association, 83, 1023-1032.

Moothathu, T. S. K. (1990), “The Best Estimator and a Strongly Consistent Asymptotically Normal Unbiased Estimator of Lorenz Curve, Gini Index and Theil Entropy Index of Pareto Distribution,” Sankhya: The Indian Journal of Statistics, Series B, 52, 115-127.

Morf, M., Vieira, A., and Kailath, T. (1978), “Covariance Characterization by Partial Autocorrelation Matrices,” The Annals of Statistics, 6, 643-648.

Morgan, B. J. T. (1976), “Markov Properties of Sequences of Behaviours,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 25, 31-36.

Morris, M. D., Toby, J. M., and Ylvisaker, D. (1993), “Bayesian Design and Analysis of Computer Experiments: Use of Derivatives in Surface Prediction,” Technometrics, 35, 243-255.

Morton, K. (1960), “On Comparing Two Observed Frequency Counts,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 9, 37-42.

Mossel, E. (2001), “Reconstruction on Trees: Beating the Second Eigenvalue,” The Annals of Applied Probability, 11, 285-300.

Mossel, E., and Peres, Y. (2003), “Information Flow on Trees,” The Annals of Applied Probability, 13, 817-844.

Mudholkar, G. S., Kollia, G. D., Lin, C. T., and Patel, K. R. (1991), “A Graphical Procedure for Comparing Goodness-of-Fit Tests,” Journal of the Royal Statistical Society. Series B (Methodological), 53, 221-232.

Mureika, R. A. (1972), “The Maximization of Entropy of Discrete Denumerably-Valued Random Variables with Known Mean,” The Annals of Mathematical Statistics, 43, 541-552.

Nadarajah, S. (2002), “A Conversation with Samuel Kotz,” Statistical Science, 17, 220-233.

Nayak, T. K. (1986), “An Analysis of Diversity Using Rao’s Quadratic Entropy,” Sankhya: The Indian Journal of Statistics, Series B, 48, 315-330.

Nayak, T. K., and Dayanand, N. N. (1989), “Estimating Multinomial Cell Probabilities under Quadratic Loss,” Journal of the Royal Statistical Society. Series D (The Statistician), 38, 3-10.

Nayak, T. K., and Gastwirth, J. L. (1989), “The Use of Diversity Analysis to Assess the Relative Influence of Factors Affecting the Income Distribution,” Journal of Business & Economic Statistics, 7, 453-460.

Neter, J., and Maynes, E. S. (1970), “On the Appropriateness of the Correlation Coefficient with a 0, 1 Dependent Variable,” Journal of the American Statistical Association, 65, 501-509.

Neuhoff, D. L., and Shields, P. C. (1982), “Channel Entropy and Primitive Approximation,” The Annals of Probability, 10, 188-198.

Nicoleris, T., and Yatracos, Y. G. (1997), “Rates of Convergence of Estimates, Kolmogorov’s Entropy and the Dimensionality Reduction Principle in Regression,” The Annals of Statistics, 25, 2493-2511.

Nishiyama, Y. (2000), “Weak Convergence of Some Classes of Martingales with Jumps,” The Annals of Probability, 28, 685-712.

Olshen, R., and Breiman, L. (2001), “A Conversation with Leo Breiman,” Statistical Science, 16, 184-198.

Ossiander, M. (1987), “A Central Limit Theorem under Metric Entropy with L2 Bracketing,” The Annals of Probability, 15, 897-919.

Osterburg, J. W., Parthasarathy, T., Raghavan, T. E. S., and Sclove, S. L. (1977), “Development of a Mathematical Formula for the Calculation of Fingerprint Probabilities Based on Individual Characteristics,” Journal of the American Statistical Association, 72, 772-778.

O’Sullivan, F. (1990), “An Iterative Approach to Two-Dimensional Laplacian Smoothing with Application to Image Restoration,” Journal of the American Statistical Association, 85, 213-219.

Papangelou, F. (1977), “Conditional Intensities of Point Processes: Their Application to Davidson’s Problem and to Entropy,” Advances in Applied Probability, 9, 434.

Pardo, L., Salicrú, M., Menéndez, M. L., and Morales, D. (1995), “Divergence Measures Based on Entropy Functions and Statistical Inference,” Sankhya: The Indian Journal of Statistics, Series B, 57, 315-337.

Parthasarathy, K. R. (1963), “Effective Entropy Rate and Transmission of Information through Channels with Additive Random Noise,” Sankhya: The Indian Journal of Statistics, Series A, 25, 75-84.

Patterson, K. D., and Heravi, S. M. (1991), “Direct Estimation of Entropy and Revisions to the National Income Accounts,” Journal of the Royal Statistical Society. Series D (The Statistician), 40, 35-50.

Peña, V. H. d. l., Ibragimov, R., and Sharakhmetov, S. (2006), “Characterizations of Joint Distributions, Copulas, Information, Dependence and Decoupling, with Applications to Time Series,” Lecture Notes-Monograph Series, 49, 183-209.

Peres, Y. (1992), “Iterating Von Neumann’s Procedure for Extracting Random Bits,” The Annals of Statistics, 20, 590-597.

Perron, F. (1992), “Monotonic Minimax Estimators of a 2x2 Covariance Matrix,” The Canadian Journal of Statistics, 20, 441-449.

Perrut, A. (2000), “Hydrodynamic Limits for a Two-Species Reaction-Diffusion Process,” The Annals of Applied Probability, 10, 163-191.

Pitcher, T. S. (1968), “The ε-Entropy of Certain Measures on [0, 1],” The Annals of Mathematical Statistics, 39, 1310-1315.

Pittel, B. (1985), “Asymptotical Growth of a Class of Random Trees,” The Annals of Probability, 13, 414-427.

Politis, D. N. (1994), “Markov Chains in Many Dimensions,” Advances in Applied Probability, 26, 756-774.

Portnoy, S. (1973), “On Recovery of Intra-Block Information,” Journal of the American Statistical Association, 68, 384-391.

Posner, E. C., and Rodemich, E. R. (1971), “Epsilon Entropy and Data Compression,” The Annals of Mathematical Statistics, 42, 2079-2125.

Posner, E. C., and Rodemich, E. R. (1973), “Epsilon Entropy of Stochastic Processes with Continuous Paths,” The Annals of Probability, 1, 674-689.

Posner, E. C., Rodemich, E. R., and Rumsey, H., Jr. (1967), “Epsilon Entropy of Stochastic Processes,” The Annals of Mathematical Statistics, 38, 1000-1020.

Posner, E. C., Rodemich, E. R., and Rumsey, H., Jr. (1969), “Epsilon Entropy of Gaussian Processes,” The Annals of Mathematical Statistics, 40, 1272-1296.

Posner, E. C., Rodemich, E. R., and Rumsey, H., Jr. (1969), “Product Entropy of Gaussian Distributions,” The Annals of Mathematical Statistics, 40, 870-904.

Pra, P. D., Paganoni, A. M., and Posta, G. (2002), “Entropy Inequalities for Unbounded Spin Systems,” The Annals of Probability, 30, 1959-1976.

Prescott, P. (1976), “On a Test for Normality Based on Sample Entropy,” Journal of the Royal Statistical Society. Series B (Methodological), 38, 254-256.

Preston, C. (1972), “Continuity Properties of Some Gaussian Processes,” The Annals of Mathematical Statistics, 43, 285-292.

Pyke, R. (1986), “Product Brownian Measures,” Advances in Applied Probability, 18, 117-131.

Rachev, S. T., and Ruschendorf, L. (1991), “Approximate Independence of Distributions on Spheres and Their Stability Properties,” The Annals of Probability, 19, 1311-1337.

Rahiala, M. (1986), “Identification and Preliminary Estimation in Linear Transfer Function Models,” Scandinavian Journal of Statistics, 13, 239-255.

Rao, C. R. (1982), “Diversity: Its Measurement, Decomposition, Apportionment and Analysis,” Sankhya: The Indian Journal of Statistics, Series A, 44, 1-22.

Rao, C. R. (1984), “Convexity Properties of Entropy Functions and Analysis of Diversity,” Lecture Notes-Monograph Series, 5, 68-77.

Rao, C. R., Renyi, A., and Kendall, D. G. (1965), “[On the Foundations of Information Theory]: Discussion,” Review of the International Statistical Institute, 33, 14.

Rathie, P. N. (1970), “On a Generalized Entropy and a Coding Theorem,” Journal of Applied Probability, 7, 124-133.

Ren, C., Sun, D., and Dey, D. K. (2004), “Comparison of Bayesian and Frequentist Estimation and Prediction for a Normal Population,” Sankhya: The Indian Journal of Statistics (2003-), 66, 678-706.

Renyi, A. (1965), “On the Foundations of Information Theory,” Review of the International Statistical Institute, 33, 1-14.

Rheinländer, T., and Steiger, G. (2006), “The Minimal Entropy Martingale Measure for General Barndorff-Nielsen/Shephard Models,” The Annals of Applied Probability, 16, 1319-1351.

Rio, E. (1993), “Strong Approximation for Set-Indexed Partial-Sum Processes, Via KMT Constructions II,” The Annals of Probability, 21, 1706-1727.

Robert, C. (1990), “An Entropy Concentration Theorem: Applications in Artificial Intelligence and Descriptive Statistics,” Journal of Applied Probability, 27, 303-313.

Mittelhammer, R., Judge, G., Akkeren, M. v., and Cardell, N. S. (2002), “Coordinate-Based Empirical Likelihood-Like Estimation in Ill-Conditioned Inverse Problems,” Journal of the American Statistical Association, 97, 1108-1121.

Roy, D., and Mukherjee, S. P. (1986), “A Note on Characterisations of the Weibull Distribution,” Sankhya: The Indian Journal of Statistics, Series A, 48, 250-253.

Rudolph, D., and Steele, J. M. (1980), “Sizes of Order Statistical Events of Stationary Processes,” The Annals of Probability, 8, 1079-1084.

Rukhin, A. L. (2000), “Approximate Entropy for Testing Randomness,” Journal of Applied Probability, 37, 88-100.

Rukhin, A. L. (2002), “Distribution of the Number of Words with a Prescribed Frequency and Tests of Randomness,” Advances in Applied Probability, 34, 775-797.

Ryu, H. K., and Slottje, D. J. (1994), “Coordinate Space Versus Index Space Representations as Estimation Methods: An Application to How Macro Activity Affects the U.S. Income Distribution,” Journal of Business & Economic Statistics, 12, 243-251.

Samson, P.-M. (2000), “Concentration of Measure Inequalities for Markov Chains and Φ-Mixing Processes,” The Annals of Probability, 28, 416-461.

Satten, G. A., and Kupper, L. L. (1993), “Inferences About Exposure-Disease Associations Using Probability-of-Exposure Information,” Journal of the American Statistical Association, 88, 200-208.

Schmidt, K. (1978), “A Probabilistic Proof of Ergodic Decomposition,” Sankhya: The Indian Journal of Statistics, Series A, 40, 10-18.

Sebastiani, P., and Wynn, H. P. (2000), “Maximum Entropy Sampling and Optimal Bayesian Experimental Design,” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 62, 145-157.

Sebenius, J. K., and Geanakoplos, J. (1983), “Don’t Bet on It: Contingent Agreements with Asymmetric Information,” Journal of the American Statistical Association, 78, 424-426.

Seidenfeld, T., Schervish, M. J., and Kadane, J. B. (1995), “A Representation of Partially Ordered Preferences,” The Annals of Statistics, 23, 2168-2217.

Seneta, E. (1982), “Entropy and Martingales in Markov Chain Models,” Journal of Applied Probability, 19, 367-381.

Seppalainen, T. (1994), “Large Deviations for Markov Chains with Random Transitions,” The Annals of Probability, 22, 713-748.

Seppalainen, T. (1998), “Entropy for Translation-Invariant Random-Cluster Measures,” The Annals of Probability, 26, 1139-1178.

Seppalainen, T. (1999), “Existence of Hydrodynamics for the Totally Asymmetric Simple K-Exclusion Process,” The Annals of Probability, 27, 361-415.

Severini, T. A. (1998), “Likelihood Functions for Inference in the Presence of a Nuisance Parameter,” Biometrika, 85, 507-522.

Sharma, B. D., and Autar, R. (1973), “On Characterization of a Generalized Inaccuracy Measure in Information Theory,” Journal of Applied Probability, 10, 464-468.

Perlmutter, S. M., et al. (1998), “Medical Image Compression and Vector Quantization,” Statistical Science, 13, 30-53.

Sheffield, S. (2006), “Uniqueness of Maximal Entropy Measure on Essential Spanning Forests,” The Annals of Probability, 34, 857-864.

Haberman, S. J. (1982), “Analysis of Dispersion of Multinomial Responses,” Journal of the American Statistical Association, 77, 568-580.

Shen, X., and Wasserman, L. (2001), “Rates of Convergence of Posterior Distributions,” The Annals of Statistics, 29, 687-714.

Shen, X., and Wong, W. H. (1994), “Convergence Rate of Sieve Estimates,” The Annals of Statistics, 22, 580-615.

Shields, P., and Thouvenot, J. P. (1975), “Entropy Zero × Bernoulli Processes Are Closed in the d̄-Metric,” The Annals of Probability, 3, 732-736.

Shields, P. C. (1992), “Entropy and Prefixes,” The Annals of Probability, 20, 403-409.

Shields, P. C., Neuhoff, D. L., Davisson, L. D., and Ledrappier, F. (1978), “The Distortion-Rate Function for Nonergodic Sources,” The Annals of Probability, 6, 138-143.

Shier, D. R. (1988), “The Monotonicity of Power Means Using Entropy,” The American Statistician, 42, 203-204.

Shirai, T., and Takahashi, Y. (2003), “Random Point Fields Associated with Certain Fredholm Determinants II: Fermion Shifts and Their Ergodic and Gibbs Properties,” The Annals of Probability, 31, 1533-1564.

Silvey, S. D. (1964), “On a Measure of Association,” The Annals of Mathematical Statistics, 35, 1157-1166.

Simon, J. C., Daniell, G. J., and Nicole, D. A. (1998), “Using Maximum Entropy to Double One’s Expected Winnings in the UK National Lottery,” Journal of the Royal Statistical Society. Series D (The Statistician), 47, 629-641.

Sinha, B. K., and Wieand, H. S. (1979), “Union-Intersection Test for the Mean Vector When the Covariance Matrix Is Totally Reducible,” Journal of the American Statistical Association, 74, 340-343.

Siromoney, G. (1962), “Entropy of Logarithmic Series Distributions,” Sankhya: The Indian Journal of Statistics, Series A, 24, 419-420.

Sitter, R. R., and Wu, C. (2002), “Efficient Estimation of Quadratic Finite Population Functions in the Presence of Auxiliary Information,” Journal of the American Statistical Association, 97, 535-543.

Skilling, J., and Gull, S. F. (1991), “Bayesian Maximum Entropy Image Reconstruction,” Lecture Notes-Monograph Series, 20, 341-367.

Slomczynski, W., and Zastawniak, T. (2004), “Utility Maximizing Entropy and the Second Law of Thermodynamics,” The Annals of Probability, 32, 2261-2285.

Small, C. G., Wang, J., and Yang, Z. (2000), “Eliminating Multiple Root Problems in Estimation,” Statistical Science, 15, 313-332.

Smith, W. (1989), “ANOVA-Like Similarity Analysis Using Expected Species Shared,” Biometrics, 45, 873-881.

Soofi, E. S. (1992), “A Generalizable Formulation of Conditional Logit with Diagnostics,” Journal of the American Statistical Association, 87, 812-816.

Soofi, E. S. (1994), “Capturing the Intangible Concept of Information,” Journal of the American Statistical Association, 89, 1243-1254.

Soofi, E. S. (2000), “Principal Information Theoretic Approaches,” Journal of the American Statistical Association, 95, 1349-1353.

Soofi, E. S., Ebrahimi, N., and Habibullah, M. (1995), “Information Distinguishability with Application to Analysis of Failure Data,” Journal of the American Statistical Association, 90, 657-668.

Sorensen, M. (1993), “Stochastic Models of Sand Transport by Wind and Two Related Estimation Problems,” International Statistical Review, 61, 245-255.

Soyer, R., and Vopatek, A. L. (1995), “Adaptive Bayesian Designs for Accelerated Life Testing,” Lecture Notes-Monograph Series, 25, 263-275.

Spiegelhalter, D. J., Best, N. G., Carlin, B. P., and Linde, A. v. d. (2002), “Bayesian Measures of Model Complexity and Fit,” Journal of the Royal Statistical Society. Series B (Statistical Methodology), 64, 583-639.

Srivastava, S. K. (1971), “A Generalized Estimator for the Mean of a Finite Population Using Multi-Auxiliary Information,” Journal of the American Statistical Association, 66, 404-407.

Stanley, T. R., and Burnham, K. P. (1998), “Estimator Selection for Closed-Population Capture-Recapture,” Journal of Agricultural, Biological, and Environmental Statistics, 3, 131-150.

Steif, J. E. (1997), “Consistent Estimation of Joint Distributions for Sufficiently Mixing Random Fields,” The Annals of Statistics, 25, 293-304.

Stein, M. L. (1990), “Bounds on the Efficiency of Linear Predictions Using an Incorrect Covariance Function,” The Annals of Statistics, 18, 1116-1138.

Stern, H., and Cover, T. M. (1989), “Maximum Entropy and the Lottery,” Journal of the American Statistical Association, 84, 980-985.

Stine, R. A., and Shaman, P. (1990), “Bias of Autoregressive Spectral Estimators,” Journal of the American Statistical Association, 85, 1091-1098.

Sugar, C. A., and James, G. M. (2003), “Finding the Number of Clusters in a Dataset: An Information-Theoretic Approach,” Journal of the American Statistical Association, 98, 750-763.

Sugiura, N. (1989), “Entropy Loss and a Class of Improved Estimators for Powers of the Generalized Variance,” Sankhya: The Indian Journal of Statistics, Series A, 51, 328-333.

Sweeting, T. J., Datta, G. S., and Ghosh, M. (2006), “Nonsubjective Priors Via Predictive Relative Entropy Regret,” The Annals of Statistics, 34, 441-468.

Swindel, B. F., and Yandle, D. O. (1972), “Allocation in Stratified Sampling as a Game,” Journal of the American Statistical Association, 67, 684-686.

Sylvester, R. J. (1988), “A Bayesian Approach to the Design of Phase II Clinical Trials,” Biometrics, 44, 823-836.

Talagrand, M. (1990), “Characterization of Almost Surely Continuous 1-Stable Random Fourier Series and Strongly Stationary Processes,” The Annals of Probability, 18, 85-91.

Talagrand, M. (1996), “Majorizing Measures: The Generic Chaining,” The Annals of Probability, 24, 1049-1103.

Talagrand, M. (2003), “Vapnik-Chervonenkis Type Conditions and Uniform Donsker Classes of Functions,” The Annals of Probability, 31, 1565-1582.

Tarter, M. E. (1979), “Trigonometric Maximum Likelihood Estimation and Application to the Analysis of Incomplete Survival Information,” Journal of the American Statistical Association, 74, 132-139.

Theil, H., and Chung, C.-F. (1988), “Information-Theoretic Measures of Fit for Univariate and Multivariate Linear Regressions,” The American Statistician, 42, 249-252.

Thomasian, A. J. (1960), “An Elementary Proof of the AEP of Information Theory,” The Annals of Mathematical Statistics, 31, 452-456.

Thompson, E. A. (1981), “Optimal Sampling for Pedigree Analysis: Sequential Schemes for Sibships,” Biometrics, 37, 313-325.

Tomizawa, S. (1995), “Measures of Departure from Marginal Homogeneity for Contingency Tables with Nominal Categories,” Journal of the Royal Statistical Society. Series D (The Statistician), 44, 425-439.

Tsybakov, A. B., and Meulen, E. C. v. d. (1996), “Root-N Consistent Estimators of Entropy for Densities with Unbounded Support,” Scandinavian Journal of Statistics, 23, 75-83.

Vasicek, O. (1976), “A Test for Normality Based on Sample Entropy,” Journal of the Royal Statistical Society. Series B (Methodological), 38, 54-59.

Vasicek, O. A. (1980), “A Conditional Law of Large Numbers,” The Annals of Probability, 8, 142-147.

Vinod, H. D. (1985), “Measurement of Economic Distance between Blacks and Whites,” Journal of Business & Economic Statistics, 3, 78-88.

Wagner, U., and Geyer, A. L. J. (1995), “A Maximum Entropy Method for Inverting Laplace Transforms of Probability Density Functions,” Biometrika, 82, 887-892.

Walker, S. G. (2003), “How Many Samples?: A Bayesian Nonparametric Approach,” Journal of the Royal Statistical Society. Series D (The Statistician), 52, 475-482.

Wang, X. (2006), “Approximating Bayesian Inference by Weighted Likelihood,” The Canadian Journal of Statistics, 34, 279-298.

Warner, S. L. (1976), “Optimal Randomized Response Models,” International Statistical Review, 44, 205-212.

Weissman, T., and Merhav, N. (2004), “Universal Prediction of Random Binary Sequences in a Noisy Environment,” The Annals of Applied Probability, 14, 54-89.

Wen, L. (1990), “Relative Entropy Densities and a Class of Limit Theorems of the Sequence of M-Valued Random Variables,” The Annals of Probability, 18, 829-839.

Winkler, R. L., Smith, J. E., and Fryback, D. G. (2002), “The Role of Informative Priors in Zero-Numerator Problems: Being Conservative Versus Being Candid,” The American Statistician, 56, 1-4.

Wolfowitz, J. (1958), “Information Theory for Mathematicians,” The Annals of Mathematical Statistics, 29, 351-356.

Wong, W. H., and Severini, T. A. (1991), “On Maximum Likelihood Estimation in Infinite Dimensional Parameter Spaces,” The Annals of Statistics, 19, 603-632.

Wong, W. H., and Shen, X. (1995), “Probability Inequalities for Likelihood Ratios and Convergence Rates of Sieve Mles,” The Annals of Statistics, 23, 339-362.

Wright, R. L. (1983), “Finite Population Sampling with Multivariate Auxiliary Information,” Journal of the American Statistical Association, 78, 879-884.

Xie, J., Li, K.-C., and Bina, M. (2004), “A Bayesian Insertion/Deletion Algorithm for Distant Protein Motif Searching Via Entropy Filtering,” Journal of the American Statistical Association, 99, 409-420.

Yaguchi, H. (1990), “Entropy Analysis of a Nearest-Neighbor Attractive/Repulsive Exclusion Process on One-Dimensional Lattices,” The Annals of Probability, 18, 556-580.

Yaguchi, H. (1991), “Acknowledgment of Priority: Entropy Analysis of a Nearest-Neighbor Attractive/Repulsive Exclusion Process on One-Dimensional Lattices,” The Annals of Probability, 19, 1822.

Yang, Y. (2000), “Mixing Strategies for Density Estimation,” The Annals of Statistics, 28, 75-87.

Yang, Y. (2001), “Adaptive Regression by Mixing,” Journal of the American Statistical Association, 96, 574-588.

Yang, Y., and Barron, A. (1999), “Information-Theoretic Determination of Minimax Rates of Convergence,” The Annals of Statistics, 27, 1564-1599.

Yatracos, Y. G. (1985), “Rates of Convergence of Minimum Distance Estimators and Kolmogorov’s Entropy,” The Annals of Statistics, 13, 768-774.

Yatracos, Y. G. (1989), “A Regression Type Problem,” The Annals of Statistics, 17, 1597-1607.

Young, D. L. (1976), “Inference Concerning the Mean Vector When the Covariance Matrix Is Totally Reducible,” Journal of the American Statistical Association, 71, 696-699.

Yu, B., and Speed, T. P. (1997), “Information and the Clone Mapping of Chromosomes,” The Annals of Statistics, 25, 169-185.

Yung, W., and Rao, J. N. K. (2000), “Jackknife Variance Estimation under Imputation for Estimators Using Poststratification Information,” Journal of the American Statistical Association, 95, 903-915.

Zanten, H. v. (2003), “On Empirical Processes for Ergodic Diffusions and Rates of Convergence of M-Estimators,” Scandinavian Journal of Statistics, 30, 443-458.

Zeckhauser, R. (1971), “Combining Overlapping Information,” Journal of the American Statistical Association, 66, 91-92.

Zellner, A. (1988), “Optimal Information Processing and Bayes’s Theorem,” The American Statistician, 42, 278-280.

Zhang, T. (2006), “From ε-Entropy to KL-Entropy: Analysis of Minimum Information Complexity Density Estimation,” The Annals of Statistics, 34, 2180-2210.

Zidek, J. V., and Eeden, C. v. (2003), “Uncertainty, Entropy, Variance and the Effect of Partial Information,” Lecture Notes-Monograph Series, 42, 155-167.

Zidek, J. V., Sun, W., and Le, N. D. (2000), “Designing and Integrating Composite Networks for Monitoring Multivariate Gaussian Pollution Fields,” Journal of the Royal Statistical Society. Series C (Applied Statistics), 49, 63-79.