I was glad to see some interest in the Crystal Egg model when I made my first post during the week. In particular, there seemed to be questions about how the model works, so I have put up a description of the process here, for those so inclined.

I’ve also set up a main page for the model here; during the season it will have the latest predictions for games, how well past predictions have fared, and, most importantly, how I’m doing with my bets. For the moment, though, it just has one section – predictions for the final outcome of the two leagues.

One of the nice things about using a model based on probabilities is that you can run multiple simulations of every game, and the more simulations you run, the closer the average should get to the model’s true expectation. On this principle, I ran every fixture in each league 10,000 times and took the average outcome of each one, to have a go at predicting the end-of-year table. The results are shown below:

**Pro12**

| Pos | Team | Avg. league points |
| --- | --- | --- |
| 1 | Leinster | 77.719 |
| 2 | Ulster | 67.577 |
| 3 | Munster | 66.624 |
| 4 | Ospreys | 62.608 |
| 5 | Glasgow | 58.444 |
| 6 | Scarlets | 49.3 |
| 7 | Cardiff Blues | 42.1 |
| 8 | Connacht | 34.66 |
| 9 | Edinburgh | 33.762 |
| 10 | Dragons | 32.462 |
| 11 | Benetton Treviso | 29.625 |
| 12 | Zebre | 16.951 |

**Premiership**

| Pos | Team | Avg. league points |
| --- | --- | --- |
| 1 | Saracens | 75.0389 |
| 2 | Leicester Tigers | 69.3478 |
| 3 | Northampton Saints | 66.4715 |
| 4 | Harlequins | 61.0665 |
| 5 | Bath Rugby | 51.1301 |
| 6 | Exeter Chiefs | 46.2364 |
| 7 | Gloucester Rugby | 43.868 |
| 8 | London Wasps | 38.691 |
| 9 | London Irish | 38.4475 |
| 10 | Sale Sharks | 34.1185 |
| 11 | London Welsh | 26.7133 |
| 12 | Newcastle Falcons | 22.2422 |

There are no surprises at the top or bottom of either table – Leinster are expected to top the Pro12 with 78 points and Sarries to do the same in England on 75, while Zebre and Newcastle are predicted to prop up their respective tables. Glasgow are expected to do a little worse than last year, while Exeter are given a surprisingly high 6th-place prediction in the Premiership. Interestingly, the model seems to significantly underestimate the number of bonus points scored in the Pro12 (average points are lower across the board than last year’s equivalent table), while the Premiership stands up reasonably well.
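For the curious, the simulation loop for a single fixture can be sketched as below. The try and penalty rates here are made-up placeholders, not the Crystal Egg’s actual team scores, but the league points follow the standard rules: 4 for a win, 2 for a draw, a bonus point for scoring four or more tries, and a bonus point for losing by seven or fewer.

```python
import math
import random

def poisson(lam):
    """Sample from a Poisson distribution (Knuth's method, fine for small rates)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= threshold:
            return k - 1

def simulate_fixture(home_try_rate, away_try_rate, n=10_000):
    """Play one fixture n times and return each side's average league points.
    Scores are built from Poisson-sampled tries (7 pts each, conversion
    included) plus penalties (3 pts each) at an assumed common rate."""
    home_total = away_total = 0
    for _ in range(n):
        home_tries, away_tries = poisson(home_try_rate), poisson(away_try_rate)
        home_score = 7 * home_tries + 3 * poisson(2.0)
        away_score = 7 * away_tries + 3 * poisson(2.0)
        # 4 points for a win, 2 each for a draw
        home_pts = 4 if home_score > away_score else 2 if home_score == away_score else 0
        away_pts = 4 if away_score > home_score else 2 if away_score == home_score else 0
        # try bonus (4+ tries) and losing bonus (defeat by 7 or fewer)
        home_pts += (home_tries >= 4) + (home_pts == 0 and away_score - home_score <= 7)
        away_pts += (away_tries >= 4) + (away_pts == 0 and home_score - away_score <= 7)
        home_total += home_pts
        away_total += away_pts
    return home_total / n, away_total / n
```

Averaging the league points from every fixture like this, and summing per team, is what produces the predicted tables above.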

More interestingly, since we are running 10,000 simulations, we can count how often a team comes in a certain position across all of them, and therefore work out its chances of finishing in any given position. Results are shown below:

So while Leinster and Sarries are expected to top their tables, the model is a good deal more certain about Leinster than about Saracens – 64% for the latte-sippers as opposed to 56% for the soulless mercenaries. At the other end of the scale, Zebre are considered to be much more likely to finish last in the Pro12 than Newcastle are in the Premiership. Does this back up suggestions made in Heineken Cup negotiations that the Celtic league is just less competitive?
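The position counting itself is straightforward: for each simulated season, note where every team finished, then divide each tally by the number of runs. A minimal sketch (the team names and orderings in the usage test are illustrative, not real output):

```python
from collections import Counter, defaultdict

def position_probabilities(simulated_tables):
    """simulated_tables: one list per simulated season, ordering team names
    from 1st place to last. Returns {team: {position: probability}}."""
    tallies = defaultdict(Counter)
    runs = len(simulated_tables)
    for table in simulated_tables:
        for position, team in enumerate(table, start=1):
            tallies[team][position] += 1
    return {team: {pos: count / runs for pos, count in counter.items()}
            for team, counter in tallies.items()}
```

With 10,000 simulated seasons fed in, the dictionary this returns is exactly the finishing-position chart discussed above.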

Time to put my money where my mouth is for the first time on this model. Betting on the overall outcome of the regular season is probably riskier than betting on individual matches. The simulations are based solely on team scores, not on individual line-ups in individual games (obviously, as those aren’t available yet!), so the model is a little less accurate. The simulations are also based on current scores and form, and so obviously can’t predict a run of good or bad form by any team. Still, I think there’s some value to be had from Paddy Power based on these numbers.

**Pro12**

| Team | PP odds | PP implied probability | Crystal Egg probability |
| --- | --- | --- | --- |
| Leinster | 2.62 | 38% | 63.64% |
| Ulster | 5.5 | 18% | 15.84% |
| Munster | 4.5 | 22% | 12.56% |
| Ospreys | 8.5 | 12% | 5.69% |
| Glasgow | 5.5 | 18% | 2.09% |
| Scarlets | 31 | 3% | 0.16% |
| Cardiff Blues | 31 | 3% | 0.02% |
| Connacht | 51 | 2% | 0.00% |
| Edinburgh | 81 | 1% | 0.00% |
| Dragons | 81 | 1% | 0.00% |
| Benetton Treviso | 501 | 0% | 0.00% |
| Zebre | 751 | 0% | 0.00% |

**Premiership**

| Team | PP odds | PP implied probability | Crystal Egg probability |
| --- | --- | --- | --- |
| Saracens | 3.25 | 30.77% | 55.98% |
| Leicester Tigers | 3.75 | 26.67% | 24.00% |
| Northampton Saints | 4 | 25.00% | 13.52% |
| Harlequins | 8 | 12.50% | 5.71% |
| Bath Rugby | 9 | 11.11% | 0.66% |
| Exeter Chiefs | 67 | 1.49% | 0.05% |
| Gloucester Rugby | 15 | 6.67% | 0.06% |
| London Wasps | 26 | 3.85% | 0.02% |
| London Irish | 201 | 0.50% | 0.00% |
| Sale Sharks | 51 | 1.96% | 0.00% |
| London Welsh | 501 | 0.20% | 0.00% |
| Newcastle Falcons | 501 | 0.20% | 0.00% |

The tables above show the current odds available from Paddy Power, along with the implied probability associated with each price (e.g. odds of 2/1, or 3.0 in decimal, imply a 33% chance of something happening). The first thing to notice is that the bookies imply pretty much the same order of likelihood of coming first as my model, with some strange exceptions such as Exeter Chiefs. Reassuringly, the model is delivering common-sense outputs!
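The conversion in the tables is just the reciprocal of the decimal price. This is a simplification: a real bookmaker’s book carries an overround, so the implied probabilities across a whole market sum to more than 100%.

```python
def implied_probability(decimal_odds):
    """A decimal price of 3.0 (fractional 2/1) implies a 1/3 chance."""
    return 1 / decimal_odds

def fractional_to_decimal(numerator, denominator):
    """Fractional odds a/b pay a/b profit per unit staked, so decimal = a/b + 1."""
    return numerator / denominator + 1
```

For example, Leinster’s price of 2.62 converts to 1/2.62, which is the 38% shown in the Pro12 table.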

The main result, though, is that with the exception of the favourite in each league, every other team’s chances are grossly overrated by the bookies’ prices, according to the model. Even 500/1 outsiders like London Welsh are terrible bets, because they have essentially no chance of winning. 66/1 may seem tempting for Exeter Chiefs, but even my model, which seems to love Exeter for some reason, thinks 2000/1 would be a fairer price. On the other hand, 9/4 looks pretty good for Saracens when I’m estimating they have a 56% chance of topping the table, and likewise for Leinster at 13/8.
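The value argument can be made concrete with the expected profit per unit staked: stake 1 at decimal odds `o` on an event of probability `p` and your expected return is `p * o - 1`. A quick sketch using the figures above:

```python
def expected_value(model_prob, decimal_odds):
    """Expected profit per unit staked; positive means the model sees value."""
    return model_prob * decimal_odds - 1

# Saracens at 3.25 with a 55.98% model probability: strongly positive
saracens_ev = expected_value(0.5598, 3.25)

# Exeter at 67.0 with a 0.05% model probability: you expect to lose
# almost the entire stake, despite the tempting-looking price
exeter_ev = expected_value(0.0005, 67.0)
```

On the model’s numbers, the Saracens bet returns roughly 82p of expected profit per pound staked, which is why the favourites are the only bets worth taking here.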

As this is the first bet of the season, I’ll push the boat out with two bets. Much as it pains me to choose these two teams, **a fiver each on Sarries and Leinster to top the table at the end of the season.**

Ross Christie

This is a blog I’ll definitely be following with interest. Good work, please do keep it up

Baldo

Thanks Ross, will do my best to keep it interesting