Monday, February 23, 2009

Temperatures update

As I've noted a couple of times, the price of the TEMP.2009.HIGH stock is very high compared to anything a reasonable forecasting model would suggest. In previous posts, I used a variety of methods to forecast the 2009 temperature anomaly based only on the time series. A model using only a nonlinear time trend (year and year squared) and lagged temperature variables explains about 84% of the variation in the time series. Based on that model, it looked like there was no more than a 5% chance that 2009 would be warm enough to beat 1998.

We now have the global average temperature anomaly for January. Let's add it to the model and see what comes up.

The first way I did it was to use my model to predict January temperatures. That model provides a much worse fit than the one predicting annual temperatures, but that's to be expected. The prediction was spot-on: the model forecast a temperature anomaly of 0.37, and that's exactly what we got. It would have taken an actual January temperature much higher than predicted to start me worrying about my call on this contract. Instead, I'm right on the money.

The next approach is to add January temperatures in to help predict the annual average. I updated my previous forecasting model to include January temperature as an explanatory variable. I then took the regression residuals and checked how many times in the data series we've had residuals large enough that, with that bad a model fit this year, 2009 would beat 1998. It's happened 6 times in the 154 year series. That's 3.8%. The best fit model says we have a 3.8% chance that 2009 is warmer than 1998. Last time, we had a 7/154 chance. So, the addition of the January data makes it LESS likely that we'll beat 1998. Why is that, given that January temps are right on target? Because adding the January data tightens up the precision of the estimates.
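The counting step described above is simple enough to sketch outside Stata. Here is an illustrative Python version; the function name and the residuals are made up for the example, not taken from the dataset:

```python
# Illustrative sketch of the residual-counting step (hypothetical numbers).
# The idea: if the model's 2009 prediction falls short of the 1998 anomaly
# by `gap`, count how often past residuals have exceeded that gap.

def beat_probability(residuals, gap):
    """Fraction of past residuals large enough that, if repeated this
    year, the observed anomaly would exceed the 1998 record."""
    hits = sum(1 for r in residuals if r > gap)
    return hits / len(residuals)

# Toy example: 10 made-up residuals and a gap of 0.15
resids = [0.05, -0.12, 0.18, 0.02, -0.03, 0.21, -0.08, 0.01, 0.09, -0.14]
print(beat_probability(resids, 0.15))  # 2 of 10 exceed 0.15 -> 0.2
```

With the real series, the residuals would come from the regression and the gap would be the 1998 anomaly minus the model's 2009 prediction.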

Long story short, the January data suggests that the chances of 2009 beating 1998 are even lower than I'd previously thought. A fair price on this contract is $0.038.

Full disclosure: I have a very large short position on this contract, based largely on analyses like those conducted and linked to here. I'm also slightly long on the TEMP.2009 contract as my model suggests 2009 will be warmer than 2008 (74% chance), just not warm enough to beat 1998. If you want to check my work and try it for yourself, get a copy of Stata. Get the data series and import it. Add the obvious column headings. Then do the following:

gen year2 = year^2
tsset year
reg temp year year2 L1.temp L4.temp jan
predict temp_hat
predict resid, residuals
sum resid if resid > (0.546-0.3867909)

The numbers are the 1998 anomaly and the predicted 2009 anomaly. The last line tells you the number of times the model has been far enough off that it generated a residual large enough that, if repeated this year, 2009 would be warmer than 1998.
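For readers without Stata, a rough Python/numpy equivalent of the steps above might look like this. The series here is synthetic, purely for illustration; with the real dataset you would load the anomaly and January columns instead. The lag construction mirrors Stata's L1. and L4. operators:

```python
import numpy as np

# Synthetic stand-in for the temperature data (made-up series).
rng = np.random.default_rng(0)
years = np.arange(1850, 2009)
yr = years - 1850  # centre the trend to keep the design matrix well conditioned
temp = 0.002 * yr + rng.normal(0.0, 0.1, size=yr.size)
jan = temp + rng.normal(0.0, 0.05, size=yr.size)

# Design matrix: constant, year, year^2, first and fourth lags, January temp.
y = temp[4:]
X = np.column_stack([
    np.ones(y.size),
    yr[4:],
    yr[4:] ** 2,
    temp[3:-1],   # L1.temp
    temp[:-4],    # L4.temp
    jan[4:],
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta  # same idea as Stata's `predict resid, residuals`
print(resid.size)  # 155 observations remain after losing four lags
```

From there, counting residuals above the gap between the 1998 anomaly and the 2009 prediction gives the empirical probability.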

3.8%. I don't know who or what is keeping the prices up at $0.18. The current price is nearly five times what it ought to be based on fundamentals.

Wednesday, January 28, 2009

14 New Stocks - Interest, Anyone?

14 new stocks are now underway, all relating to interest rates and what they'll be doing over the next six months.

For the first time, iPredict is pointing at Australia and asking you to consider what the Reserve Bank of Australia will do with its cash rate when the RBA board next meets on 3 February, next Tuesday. Yes, it's a short, sharp set of stocks, but we expect some healthy trading.

In another first, iPredict is also asking you to forecast retail mortgage rates for home owners by the middle of this year. How low will variable rate mortgages be in June? Below 7%? Below 6%? Below 5.5%? This should be a stock of interest to home owners considering whether to break their agreements with their banks and move to a lower interest rate.

Finally, we launch another set of OCR stocks, this time for the RBNZ's 12 March announcement. Of course, we'll be keeping an eye on these stocks as tomorrow's OCR news comes through.

Friday, January 23, 2009

iPredict Forum is Underway

iPredict's forum is underway. Here's the link. Well done Aaron and Simon (our developers) for pulling this together and fighting the monster that is IE6.

If you have any trouble, please tell us by posting into the bug reports section of the forum. If you're having so much trouble that you can't see the forum, well just post it here :-)

91 Unleaded Stocks Launched


Today sees the launch of our first petrol price stocks, aimed at forecasting the price of 91 unleaded petrol at the pump.

Five binary stocks are designed to measure this, each asking whether the price of 91 unleaded will fall within one of the following ranges:

91.FEB09.VLOW: price less than 132 cents
91.FEB09.LOW: price between 132 and 139.9 cents
91.FEB09.MID: price between 140 and 148 cents
91.FEB09.HIGH: price between 148.1 and 156 cents
91.FEB09.VHIGH: price above 156 cents

How do we measure price? We use the Ministry of Economic Development's Regular Petrol Price series, available from here. Although the Ministry does not make it clear on the page, this series measures the price of 91 unleaded petrol in the Wellington region, inclusive of GST and all other taxes. I had to call the Ministry to figure that out.

We have also launched a new bundle called BUNDLE.91.FEB09 which combines all five stocks and sells for $1.

Thursday, January 22, 2009

Climate stocks

Since folks seem to be discounting at 100% the previous post on why the TEMP.2009.HIGH stock seems massively overvalued, I started worrying that traders out there know something I don't. So, I threw the data into Stata instead and ran a few prediction models with lag effects and a time trend. The best fit model predicts a 2009 temperature anomaly of 0.38821, which is higher than I previously was estimating. But the probability of exceeding 1998 seems largely unaffected because my confidence bands are tighter. The standard deviation of the regression residuals is 0.100832. The residual is the difference between the actual observation and the model's prediction for any particular year. So, if the actual 2009 temperature anomaly turns out to be 0.546, the residual would be 0.15779: the observation minus the prediction. How often do we observe residuals at least that large on the up side? 7 times in 154 observations: 4.5% of the time.

The simple model explains 84% of the variance in the time series using only lagged dependent variables and a time trend.
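For anyone unsure where the 84% figure comes from, it is the regression's R-squared: one minus the ratio of residual variance to total variance. A minimal Python sketch, with toy numbers purely for illustration:

```python
# R^2 = 1 - SS_residual / SS_total, computed from actual and fitted values.

def r_squared(actual, predicted):
    mean = sum(actual) / len(actual)
    ss_tot = sum((a - mean) ** 2 for a in actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    return 1 - ss_res / ss_tot

# Made-up actual and fitted anomalies, just to show the calculation
actual = [0.1, 0.2, 0.15, 0.3, 0.25]
predicted = [0.12, 0.18, 0.17, 0.28, 0.24]
print(round(r_squared(actual, predicted), 3))  # -> 0.932
```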

Want to replicate this? Get the dataset, add in a null 2009 observation, throw it into Stata, and type:

gen year2 = year^2
tsset year
reg deviation L1.deviation L4.deviation year year2
predict y_hat
predict resid, residuals

Then just take (0.546 - y_hat) for 2009, and see how often the residual is larger than that value:

sum resid if resid > 0.15779

7 observations in 154.

I wonder what the folks buying at 0.22 know that I don't. No way anybody's using this for hedging against a hot year, not at these stakes....

* Note I kept the first and fourth lags after a general to specific reduction starting with 7 lags.

Wednesday, January 21, 2009

Request for Your Feedback on New Features

Heaps of great feedback in earlier posts on new-feature ideas. This morning we compiled a list of new features that we've been bouncing around for a while. Part of the list we came up with is below; much of it comes from your suggestions. We've left out some administrative feature ideas and ideas relevant only to our internal prediction market customers.

Please do two things in the comments.

One, list your priorities starting with the most preferred, e.g. F, D, E, B, Q. Prioritise as many as you like; I'll compile your opinions on however many you're willing to rank.

Two, add new suggestions. I apologise in advance if a feature you submitted in the last few months isn't on the list - it was prepared in a rush. Please submit it again and I'll add it.

I will update the list with your new ideas. I may reverse the list's order so that newest ideas are put at the top.

For option D, what minimum, if any, is appropriate? $20?

Some of these may require clarification - if so please ask.

Here is what we have so far, in no particular order.

A. Forum

B. Continued API development

C. Confirm trade button disabled after one click

D. Option to display trader earnings by percentage (minimum deposit $20?) + make named appearance on list optional

E. Funds frozen following a withdrawal request + option to cancel request if not processed

F. User's cash, portfolio value and net worth plotted over time

G. User-added events/news on stock details, editable/removable/flaggable by other users via wiki, added items plotted on stock price chart

H. User stock board – lists all stocks in one place, current price/volume/day's high,low/lifetime high,low/high buy,low sell. Ajaxed and sortable.

I. Daily stock high/low prices added to stock details

J. Multiple stock trading system - allows buying and selling of multiple stocks at one time from single page – facilitates arbitraging

K. Stock details page widget which converts price to prediction for that stock taking into account any special rules (e.g. price floors) for that stock + remove probability from browse stock page and replace with "Avg. Cost" or "Rev/Stock Received At Sale" values if position taken, otherwise blank

L. Home page public notification – alerts to users arriving on the home page, warning of approaching downtime for maintenance, for example, or new stocks launch time

M. User alerts system - user can get emails/text messages sent if prices move outside defined bands

N. Basic look/feel site adjustments (colours, some page design changes)

O. Re-work trading interfaces - reduce steps, make clearer, ajax-enable

P. Re-work short selling to make clearer to user what funds have been transferred and why

Q. Post-trade suggested stocks "Users who bought this stock also traded …" displayed on the trade confirmed screen

R. Browse closed stocks page

S. Stocks vary between $0 and $max, where $max can differ from $1.

T. Connect forum and stock details pages - allow forum browsing (for posts relating to that stock only) and posting from stock details

U. Download (anonymous) raw trading data interface (currently available via API only)

V. Add Google news items to stock details pages using stock-specific search terms. Plot news item flags on stock price line.

W. One-click trading interface: buy and sell prices can be clicked for a standard parcel size (e.g. 25 contracts), subject to there being at least that number available at that price (otherwise the lesser number)

X. Either streamline browse stocks to load only selected categories or set up a browse stocks for mobile/PDA only

Y. Remove the division into 'page 1, page 2' of stocks in the lists on My Portfolio, e.g. the Watch list

Z. Show current bid and offer prices in the portfolio lists

Monday, January 19, 2009

Climate stocks

At current prices, the markets are forecasting a 60% chance that 2009 temperatures will be above 2008 temperatures, with a 24% chance that 2009 temperatures will beat the 1998 high of a 0.546 degree anomaly. The 2008 anomaly was 0.324, so traders must reckon 60% of the temperature probability distribution lies above 0.324 and 24% lies above 0.546.

In the underlying data series, which goes back to 1850, we find a mean change from year to year of close to zero (of course: it's an anomaly relative to an average baseline) with a standard deviation of the year-on-year change of 0.115. So, in any given year, the next year's anomaly will fall within +/- 0.115 of the prior year's anomaly about 68% of the time. For the 2009 anomaly to be greater than 0.546 would require an increase of at least 1.93 standard deviations of the year-on-year change. A jump that large shouldn't happen more than about 2.7% of the time in a normally distributed series. You could try arguing that the variance of the series has increased, but the standard deviation of year-on-year changes over the last decade is almost identical to that of the whole 159 year series.
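The arithmetic above can be checked in a few lines. This Python sketch just recomputes the z-score and the one-sided normal tail probability from the numbers in the paragraph:

```python
import math

# How many standard deviations of the typical year-on-year change separate
# the 2008 anomaly from the 1998 record, and how likely a jump that big is
# under a normal distribution.
anomaly_2008 = 0.324
record_1998 = 0.546
sd_yoy = 0.115

z = (record_1998 - anomaly_2008) / sd_yoy
p_exceed = 0.5 * math.erfc(z / math.sqrt(2))  # one-sided upper tail
print(round(z, 2), round(p_exceed, 3))  # -> 1.93 0.027
```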

You might well say that I ought to account for the warming trend. Let's try that. From 1980 through 2008, the average annual change in the anomaly has been 0.009483. We can add that to the 2008 temperature to get a prediction for 2009. The 2009 observation would then need to be 1.83 standard deviations above that trend-adjusted prediction to beat the 1998 anomaly. We would expect that to happen 3.36% of the time.



I've included a graph with the underlying temperature series and bands showing changes 1.83 standard deviations above and below the prior year's observation. How often in the series' 158 year* history do we see the subsequent year's anomaly outside the 1.83SD band? 14 times. Or, 8.8% of the time, with 4.4% above the band and 4.4% below the band. That's a bit more than we'd expect from the standard normal distribution (6.7% of observations, 3.35% each side), but not a ton more. I've marked these with big red dots: 1863, 1865, 1877, 1879, 1890, 1916, 1930, 1954, 1957, 1964, 1974, 1977, 1997 and 1999. Those are years with temperatures that varied by more than 1.83 standard deviations of the normal year-on-year change from the prior year's observation. You'll probably need to click on the graph to enlarge it to see things properly.
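The band-counting exercise above can be sketched the same way. The toy series below is made up, but the logic matches: count years whose anomaly moved more than 1.83 standard deviations of the typical year-on-year change from the prior year's value:

```python
# Count year-on-year moves larger than k standard deviations of the
# typical year-on-year change.

def band_crossings(series, sd_yoy, k=1.83):
    changes = [b - a for a, b in zip(series, series[1:])]
    return sum(1 for c in changes if abs(c) > k * sd_yoy)

# Toy anomaly series with one obvious jump (made-up numbers)
toy = [0.10, 0.12, 0.11, 0.40, 0.38, 0.39]
print(band_crossings(toy, sd_yoy=0.115))  # only the 0.11 -> 0.40 jump exceeds 1.83 * 0.115
```

Run on the real 159-year series, the same count gives the 14 band-crossing years marked on the graph.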

I've been shorting this stock since it launched, and I plan to keep shorting it. I'm not saying that 2009 can't be warmer than 1998; I just can't see how it's more than 20% likely to happen, or even more than 10%. The analysis above should give an upper bound estimate of the likelihood of 2009 exceeding 1998: for a midpoint estimate, I'd not account for a warming trend and would instead evaluate at the mean zero change. At what price would I start covering my shorts? Well, I think a fair price wouldn't be higher than $0.05, and I'm a bit risk averse.

Of course, feel free to follow the links provided in the stock description to get the underlying data and have your own play with it. I'm not doing anything high tech here.

*Yes, the series is 159 years long, but you can't look at the year-ahead for the most recent year. That's the one we're trading on!

Note: Post updated to provide analysis based on 1.83SD cutpoints (the upperbound case) rather than the midpoint case provided earlier.