An Introduction to Modern Bayesian Econometrics, reviewed by Victor Niederhoffer
Right at the outset of Tony Lancaster's book, 'An Introduction to Modern Bayesian Econometrics', he states that it is probably better not to have any background in econometrics or statistics, because the Bayesian approach is so different from the traditional ways. This book will change the way you think about decision making, prediction, and description in all economic matters, and includes computational techniques, an elaborate discussion of Bayes' theorem, and computer programs and code.
Econometric analysis differs from statistical analysis in that it deals with the behavior of economic agents making choices about goods, and with the interaction of markets. As such it is much closer to what the average investor is interested in than the typical statistics book. This book is the first to show how these decisions can be made using the modern tools available, and with these tools the way one would approach the problems of choice and decision making is radically different from what we have been accustomed to. In short, you start with beliefs about the probabilities of your hypothesis, you consider the consistency of the data with each of the realizations of your hypothesis, and then you come up with a revised tableau of beliefs about your hypothesis based on the new ex-post probabilities. This is the way most scientists actually work: they come up with a hypothesis, they state what they would expect to happen if the hypothesis were true, they examine the data to see whether it is consistent with that expectation, and finally they revise their hypothesis. This is a good model for scientists as well as traders.
The basis for the whole book is Bayes' theorem, which the author emphasizes is the only way to revise beliefs based on numerical evidence. The basic theorem is as follows:
Suppose you have a theory with different possibilities: say, that the market is a 50-50 proposition, or that it is biased to 60% up with a 10% per year drift. Call these your prior probabilities over A with two states, A1 with probability 0.5 and A2 with probability 0.5. Next, consider what you would expect to find in evidence if each theory were true. That is event B. For example, if theory A1 were true you'd expect to find 4 rises in a row about 6% of the time. But if theory A2 were true, you would expect to find 4 rises in a row about 13% of the time. Now if you do find four rises in a row, theory A2 becomes more likely by a factor of 13/6.
Originally the theories were 50-50, but now they have ex-post probabilities proportional to 0.5 x 0.13 and 0.5 x 0.06. So the revised probabilities are that A2 is now 13/19 to be true, and A1 is now 6/19.
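The two-theory calculation above can be sketched in a few lines of code (my own illustrative sketch, not from the book), using the review's numbers:

```python
# Two states of the hypothesis: A1 = market is 50-50, A2 = biased 60% up.
priors = {"A1": 0.5, "A2": 0.5}
up_prob = {"A1": 0.5, "A2": 0.6}

# Likelihood of the evidence B = "four rises in a row" under each theory.
likelihood = {h: up_prob[h] ** 4 for h in priors}  # A1: 0.0625, A2: ~0.1296

# Bayes' theorem: posterior is proportional to prior times likelihood.
unnormalized = {h: priors[h] * likelihood[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: unnormalized[h] / total for h in unnormalized}

print(posterior)  # roughly A1: 0.33, A2: 0.67 -- close to 6/19 and 13/19
```

Note that the equal priors cancel in the normalization, so after four rises the odds shift entirely by the likelihood ratio, about 13 to 6 in favor of the biased-market theory.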
Formally, Bayes' theorem states that we start with a theory A and then revise it based on the evidence B according to the formula that the probability of A and B taken together is:
P(A) x P(B|A) = P(B) x P(A|B)
Alternately, P(A|B) = [P(B|A) / P(B)] x P(A)
In the bracketed fraction, the numerator P(B|A) is the likelihood and the denominator P(B) is the overall probability of the evidence; the leftmost term P(A|B) is the revised (posterior) probability, and the rightmost term P(A) is the prior probability.
All econometric decision making may then be formulated as follows: set out a probability distribution over the various states of your hypothesis, find the likelihood of the possible realizations of the data under each state, collect the data, and apply Bayes' theorem as above to revise your probabilities. Then see how consistent your revised beliefs are with what you know about the situation, and how helpful they are going forward, i.e., criticize the model and find out how useful it is through prediction.
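The formulate-observe-revise cycle just described can be made concrete with a small sequential sketch (my own, not the book's code), reusing the two market theories from the earlier example and revising beliefs after each day's outcome:

```python
up_prob = {"A1": 0.5, "A2": 0.6}   # P(rise) under each state of the hypothesis
beliefs = {"A1": 0.5, "A2": 0.5}   # prior beliefs

def revise(beliefs, rise):
    """Apply Bayes' theorem for one observed day (rise is True or False)."""
    unnorm = {h: p * (up_prob[h] if rise else 1 - up_prob[h])
              for h, p in beliefs.items()}
    total = sum(unnorm.values())
    return {h: u / total for h, u in unnorm.items()}

# Observe a run of days and revise beliefs after each one.
for rise in [True, True, False, True, True]:
    beliefs = revise(beliefs, rise)

print(beliefs)  # revised beliefs, ready to be criticized and used to predict
```

Each pass through `revise` is one application of the theorem; yesterday's posterior becomes today's prior, which is the sense in which the approach "revises a tableau of beliefs" as evidence arrives.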
The book contains numerous examples, models, and extensions of this algorithm, along with the computer code to perform the analysis. Topics treated include linear and non-linear regression models, multinomial models, autoregressive models, time series models, Markov Chain models, binary choice models, duration models, survey data and recursive equations. Extensive use of simulation is made throughout, and the book also includes a computer program to help make the models come to life. The book is somewhat technical, especially since all the results depend on computer modeling that yields output not clearly derivable by hand. It will, however, change the way you think about decision making, and will be an important adjunct to how you think in the future, regardless of whether you master all the technical details. This book is highly recommended for a nice pencil and paper read.
Jim Sogi adds:
Bayesian Methods by Thomas Leonard discusses the philosophical underpinnings of the Bayesian Method. Classical probability is the 'm over k rule' and is appropriate when outcomes are equally likely. The second type of probability is the frequency probability of an event in the long run, proportional to the number of times the event occurs. Subjective probability measures an individual's uncertainty in an event and may vary amongst individuals. An individual who always tries to represent his uncertainty by a subjective probability distribution is referred to as a "Bayesian."
In terms of the parsimony principle of keeping models with few parameters, L.J. Savage felt that "a model should be as big as an elephant." Contrast this with the late Toby Mitchell's philosophy "the greater the amount of information, the less you know."
An interesting twist on the subject is Simpson's paradox with lurking variables. The classic case is death penalty sentencing by race, which seemed to show no racial bias, but when the lurking variable of the race of the victim is introduced, the conclusion is the opposite. This seems particularly apt in the market, where unknown variables are always lurking, seeking to turn your confidence in a bullish trade on its head.
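The reversal can be shown with a small numeric sketch. The counts below are my own hypothetical illustration (not real data), cast in market terms: strategy B wins more often than A within each regime, yet A looks better when the regimes are pooled, because the two strategies had very different exposure to each regime.

```python
# Hypothetical counts, purely illustrative: {regime: {strategy: (wins, trades)}}
data = {
    "rising":  {"A": (80, 100), "B": (9, 10)},
    "falling": {"A": (2, 10),   "B": (30, 100)},
}

def rate(wins, trades):
    return wins / trades

# Within each regime, B has the higher win rate (0.90 vs 0.80, 0.30 vs 0.20).
for regime, d in data.items():
    assert rate(*d["B"]) > rate(*d["A"])

# Pooled across regimes, A has the higher win rate -- the paradox.
pooled = {s: tuple(sum(x) for x in zip(*(data[r][s] for r in data)))
          for s in ("A", "B")}
print(rate(*pooled["A"]), rate(*pooled["B"]))  # ~0.75 vs ~0.35
```

The lurking variable here is the regime: A traded mostly in the easy rising regime, B mostly in the hard falling one, and pooling hides that difference entirely.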