Daily Speculations

The Web Site of Victor Niederhoffer & Laurel Kenner

Dedicated to the scientific method, free markets, deflating ballyhoo, creating value, and laughter;  a forum for us to use our meager abilities to make the world of specinvestments a better place.




8/16/04:
Dept. Of Continuing Learning: Information Theory by Victor Niederhoffer

One hastens to introduce information theory to the list. A good book on the subject is "Information Theory: Structural Models for Qualitative Data" by Klaus Krippendorff, a Sage Univ. book (I buy all these Sage books from my favorite book store, UC Seminary). Information is the amount of selective work a message enables us to do. It's the amount of surprise that a message has, or a measure of the difficulty of making appropriate decisions. It's measured in bits. I find it helpful to think of it in terms of powers of 2. If the probability of an event is:

1/2 ..... 2^-1
1/4 ..... 2^-2
1/8 ..... 2^-3
1/16 .... 2^-4

Leave out the minus sign and you have the number of bits in a message that an event with probability p has occurred. A related concept is entropy: the average amount of information required to predict observations by category, or the amount of uncertainty about which class an event falls into. Entropy is a measure of the variety in a classification; it's defined as -sum of p(i) log p(i) over all the possible states of the event in the sample. For example, if heads and tails each occur with probability 1/2, then the entropy is -(1/2 log2 1/2 + 1/2 log2 1/2) = 1 bit. The maximum entropy occurs when an event is evenly divided among its categories. You've probably all heard that entropy is used often in linguistics to measure the diversity of various authors, the diversity of words, the diversity of culture and art, the diversity of press coverage, how best to design a communications channel, and how to measure things like noise, equivocation, and degree of redundancy. When I was a student at Chicago, Theil wrote a series of articles relating it to job diversity, the growth and decline of social systems, and the information content of the advance/decline ratio. I've been looking for that article for some time.

Information theory should play a key role in our thinking about markets. Announcements come; they usually hold no surprise. But the employment number changes our views on all sorts of things, and the entropy is large before the employment number. An earnings announcement gives us much information about the future state of a company's earnings, and reduces the alternatives for wrong thinking by many bits relative to some other announcement. The number of bits in a head-and-shoulders pattern is not great, but, like all announcements, its effect depends on the entropy of the system in advance. When this is quantified, it can lead to some good predictions, one would think. Applications and thoughts on information theory are solicited.
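A minimal sketch of this arithmetic in Python, assuming base-2 logs as in the coin example (the function names and sample probabilities are purely illustrative):

```python
import math

def bits(p):
    """Surprise of an event with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(probs):
    """Entropy of a classification: -sum of p(i) * log2 p(i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(bits(1/2), bits(1/4), bits(1/8), bits(1/16))  # 1.0 2.0 3.0 4.0
print(entropy([0.5, 0.5]))   # 1.0  -- the fair coin, maximum for two categories
print(entropy([0.9, 0.1]))   # ~0.47 -- a lopsided coin carries less uncertainty
```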

8/18/04:
Entropy Contest

Here's an idea for a lifetime, inspired by Ms. Bonnie Lo. Entropy, as you will recall, is the amount of disorder in a system after it's been classified. It's defined as the sum of -p(i)*log(p(i)) over all the states in a classification. It has many useful properties in communications, in understanding diversity, and in physics, many of which can be googled. Might I suggest that each week the entropy in a market be calculated and used as a predictor of that market's moves over the next week or so. After determining some ways of making money in that market over that time period, generalize the concept. Our previous contests have been successful, and we will give an award of $500 to any submission that makes, in our opinion, good progress, either empirically or otherwise, on this meal for a lifetime. -- Victor Niederhoffer
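One hedged sketch of an entry point, not the contest's prescribed method: bin a week's daily returns into categories, compute the entropy of that classification, and test it against the next week's move. The bin edges and sample numbers below are my own assumptions:

```python
import numpy as np

def entropy(counts):
    """Entropy in bits of a vector of category counts."""
    probs = counts[counts > 0] / counts.sum()
    return float(-(probs * np.log2(probs)).sum())

def weekly_entropy(daily_returns, edges=(-0.01, 0.0, 0.01)):
    """Classify each day of the week by return bucket and return the entropy."""
    categories = np.digitize(daily_returns, edges)
    counts = np.bincount(categories, minlength=len(edges) + 1)
    return entropy(counts)

# Hypothetical weeks: a choppy week scores high, a one-way week scores low.
print(weekly_entropy(np.array([0.012, -0.015, 0.002, -0.003, 0.011])))  # ~1.92 bits
print(weekly_entropy(np.array([0.012, 0.015, 0.013, 0.014, 0.011])))    # 0.0 bits
```

The next step, left to contestants, would be to relate this number to the size or direction of the following week's move.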

8/19/04: Jeff Sasmor on Entropy

Is it possible that there's also an element of entropy due to the constant shift between different sectors? For example, right now retail is suffering, tech is down, and energy stocks and services (like OIH) are going up; or the shifts between large cap and small cap. Within some time these patterns will shift, so there's always some current flowing like this. If you could see this flow in action, it seems as if one could profit from the shift.

8/19/04:
Information Theory by James Sogi

If maximum entropy is 1 when there is an evenly divided possibility between two outcomes, a 50% probability, then when the outcome is certain, entropy is zero. The only time in the market when the outcome is certain is defined by the rules of the exchange: when it is closed. At the open there is the biggest jump in entropy, and at the close the largest decrease. On a weekly basis, the weekend is the zero-entropy period, and the Monday and Friday action would be the times of the greatest change in entropy. Chair mentions announcements, but since weekends come every week, they might be a good time to measure the action, and we have seen some day-of-week work. Take the market action of the last two Fridays: both slammed down to new lows for the week, while Friday 7/30 flirted with new highs for that week. In each case, it signaled a reversal of sorts. In terms of information, maximum information has accumulated over the weekend for Monday, and by Friday all the week's news and information is pretty much played out, signaling a decrease of entropy into the slow Friday close. This idea might lead to other thoughts on counting, such as the tendency to reverse on Monday or Friday. The decrease of entropy violates the second law of thermodynamics, so, to answer Maxwell's query, even traders need a day off, and the exchanges suspend the laws of physics for the weekend at least. But then why is it always Black Monday?
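A hedged sketch of the counting suggestion, assuming a pandas Series of daily closes with a DatetimeIndex (the data layout is my assumption, not part of the post):

```python
import pandas as pd

def friday_monday_reversals(closes: pd.Series):
    """Count Mondays whose move has the opposite sign of the prior Friday's move."""
    rets = closes.pct_change().dropna()
    fridays = rets[rets.index.dayofweek == 4]
    mondays = rets[rets.index.dayofweek == 0]
    reversals = continuations = 0
    for mon_date, mon_ret in mondays.items():
        prior = fridays[fridays.index < mon_date]
        if prior.empty:
            continue
        fri_ret = prior.iloc[-1]
        if fri_ret * mon_ret < 0:
            reversals += 1
        elif fri_ret * mon_ret > 0:
            continuations += 1
    return reversals, continuations
```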

8/19/04:
More on Information Theory from Tom Ryan

In keeping with the entropy and energy discussions, there has been a remarkable string of days lately where US stocks have made a daily low within 30 minutes of the open or 30 minutes of the close. I have found in the past that when many recent days have made their low at the open or the close, this tends to presage decreased volatility for the next period, even more so than, say, negative autocorrelation of daily ranges. For example, when 4 or 5 of the last five days have seen a low at the open or close, the absolute value of the change in the S&P 500 over the next 5 days this year has been 10 points with a standard deviation of 10 points. This compares to 14 points for all (overlapping) 5-day periods, with the same 10-point standard deviation. It's like the brown trout I caught in the Big Hole last week, which thrashed about mightily for a bit and then got so still I couldn't tell if he was still on the line, or a boxer who has thrown a lot of strong punches, is a bit tired, and now has to do some rope-a-dope to catch his breath.
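A hedged sketch of that test, assuming minute bars with a DatetimeIndex and a 'low' column, plus a separate Series of daily closes; the column names and layout are my assumptions, and the 10- and 14-point figures are Tom's, not the code's output:

```python
import pandas as pd

def low_near_open_or_close(minute_bars: pd.DataFrame, window_minutes: int = 30) -> bool:
    """True if the day's low came within window_minutes of the open or the close."""
    low_time = minute_bars['low'].idxmin()
    start, end = minute_bars.index[0], minute_bars.index[-1]
    window = pd.Timedelta(minutes=window_minutes)
    return (low_time - start <= window) or (end - low_time <= window)

def abs_move_after_cluster(daily_close: pd.Series, flags: pd.Series,
                           lookback: int = 5, min_hits: int = 4, horizon: int = 5) -> pd.Series:
    """Absolute horizon-day move following days where at least min_hits of the
    last lookback days were flagged by low_near_open_or_close."""
    hits = flags.astype(int).rolling(lookback).sum() >= min_hits
    forward_move = daily_close.shift(-horizon) - daily_close
    return forward_move[hits].abs().dropna()
```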

8/19/04:
Jared Albert's Take:

An (ideal) perfectly elastic market in which supply was infinite and prices didn't change would have an entropy of zero. An (ideal) orderly market in which prices and orders were continuous and singularly directional, perhaps one in which all orders for the day were matched in the morning so prices could move continuously and contiguously, would have a very nearly zero entropy. I guess there is a progression up to completely discontinuous jumps in price and volume, for which no structure could be modeled, which would have the maximum entropy possible for the system. (I don't think this would be infinite, because prices still have to fall within the system. To have infinite entropy, prices would have to be able to print anywhere, something I can't imagine with my limited abilities.) Based on this qualitative analogy, one way to approximately measure entropy in the market might be to measure the volatility of all the issues in the market. Something like: the sum over issues 1 to n of [(day high - day low) / ((day high + day low)/2)] * that issue's volume for the day * that issue's shares outstanding, divided by (total day's volume for all issues * total shares outstanding for all issues). A more exact approach might be to measure the continuous price action between ticks or minute bars of each stock and normalize it by its contribution to the total daily volume as a part of the total shares outstanding. Clearly, larger-cap and larger-volume stocks should contribute more than weak-volume or thin stocks.
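A hedged sketch of one reading of that formula, assuming a DataFrame with one row per issue and columns 'high', 'low', 'volume', and 'shares_out' (all names are my assumptions):

```python
import pandas as pd

def range_entropy_proxy(day: pd.DataFrame) -> float:
    """Volume- and float-weighted sum of each issue's range, normalized by its midpoint."""
    midpoint = (day['high'] + day['low']) / 2
    normalized_range = (day['high'] - day['low']) / midpoint
    weight = (day['volume'] * day['shares_out']) / (day['volume'].sum() * day['shares_out'].sum())
    return float((normalized_range * weight).sum())
```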

8/19/04:
Allen Gillespie on Information Theory

In my experience, information theory and entropy play out strongly in individual biotechs (FDA meetings), technology issues (new product releases, particularly in the run-up before Christmas), legal issues (MO with lawsuits and MSFT with the recent EU ruling), spin-offs, and AMR with its non-bankruptcy filing. These issues frequently (though I haven't quantified it) develop a strange-attractor nature, frequently are characterized by large gaps at the event, and frequently show high (or, in negative events, extremely low) relative year-to-date performance (similar to your S&P 400 stocking stuffer idea) in the period prior to the "event". Examples from recent years would include AAPL (Windows release of iTunes), DNA (new drug), MOGN (new drug), NVDA (Xbox release), SNDK/EK (digital photography), AMR (non-bankruptcy filing), and JCP (spin-off after S spun off its credit card division). While not predictable, many of the defense stocks also showed these same characteristics pre-9/11 (i.e., INVN, LMT, NOC, etc.). KMRT may also qualify if you count a cornering of a stock by a fund (we have noticed the tendency of ESL to take positions in firms with market caps similar to the size of its fund, so that it could buy in the entire company if necessary, sell off assets, then use corporate cash to buy back stock) -- see AZO, S, etc.

8/30/04:
An Info Tutorial by James Sogi

One of the goals of information theory is to provide a framework for solving the difficulty of making an appropriate (to a degree better than chance) decision. (Information Theory, Krippendorff.) With 4 bits of information there are 16 options, each with a 6.25% probability. This in theory improves the choice from the 50/50 random chance of only 1 bit of information, but it rapidly runs up against the human real-time decision-making capacity or threshold. There needs to be a balance between information overload and ease of decision making. Information is defined as a measure of the amount of selective work a message enables its receiver to do. Too much information may decrease the quality of the selective work. Here is where computing models utilizing information theory, like Phil's m spreadsheet, are going to be cutting edge.
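The bits-to-options arithmetic, as a minimal sketch:

```python
# n bits distinguish 2**n equally likely options, each with probability 1 / 2**n.
for n_bits in (1, 2, 3, 4):
    options = 2 ** n_bits
    print(n_bits, options, f"{1 / options:.2%}")  # 4 bits -> 16 options at 6.25% each
```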

A very interesting aspect of information theory that Krippendorff mentions is that analysis of qualitative data is more universal than analysis based on the variance of quantitative data, in that the use of variance assumes a normal distribution, leading to certain restrictions, whereas qualitative data avoids that bias. That bias seems to be one of the main objections to statistical analysis of market time series.

Redundancy is used in communication to overcome noise. It is computed as the difference between the entropy of a uniform distribution and the observed entropy, and it is an information measure in its own right:

T(A) = H(A)max - H(A)

The English language is up to 70% redundant. This makes it less efficient, but capable of correcting errors in spelling and syntax. The market has redundancy that might help in deciphering the message over the noise. For example, this year we have had three cycles of up and down within a range. Some call it noise, but the redundancy may be telling us something. What is it telling us? That there may be a bigger move ready to happen. When? Well, we still have three weeks or so to go, and 12 points if the cycle mimics the last one; but the point is not my prediction, rather that there is information in the market's redundancies.
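A minimal sketch of that redundancy measure, using the same base-2 entropy as before; the example classification is hypothetical:

```python
import math

def entropy(probs):
    """Base-2 entropy of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """T(A) = H(A)max - H(A): uniform-distribution entropy minus observed entropy."""
    return math.log2(len(probs)) - entropy(probs)

# Four categories used very unevenly: most of the channel's capacity is redundant.
print(redundancy([0.85, 0.05, 0.05, 0.05]))  # ~1.15 of a possible 2 bits
```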

Noise in information theory arises where the sender is unsure whether or how the message is received. Noise need not be undesirable under either information or chaos theory, and in many instances it is intentional, in such areas as political discourse and creative activity. Under the chaos theory of emergence, creation often arises from noise, as displayed in the biodiversity from which evolution arises (bacterial evolution), in the movement from verbal argument to understanding (Plato), from social unrest to economic development and reform (Great Leap Forward), and from choppy markets to new trends (fall 2002 vs. 2003).

For Labor Day BBQ, here's a favorite:
*Big Island Short Ribs*
3 lbs crosscut beef short ribs

Marinade #1
1 cup sake
1/4 cup sugar

Marinade #2
1 Cup tamari
2 cloves garlic
1 bunch green onion chopped
1/4 cup brown sugar
1 tbsp Asian toasted sesame oil
2 tbsp peanut oil
1 tbsp black pepper, ground

Pour marinade #1 over ribs for 15 min

Soak ribs in marinade #2 overnight

Grill 4-5 min

Enjoy with Zinfandel or Petite Sirah