
Posts Tagged ‘data mining’

With the 2013-2014 NFL preseason games underway, the business of experts predicting games is about to start again. Can’t wait…

Here comes the ESPN expert pick for week 1.

CBSSports also joined the expert pick business this year with its own collection of experts:

Fans must be eager to know who the best expert in this NFL prediction game is, and there are already questions posted in the comment section of ESPN Expert Picks.

Based on the records I collected from ESPN over the last two years, we clearly have a winner: Seth Wickersham, who correctly predicted 69.9% and 65.2% of games in the last two NFL seasons, respectively, the best among ESPN experts. Here are the overall prediction accuracy records of each expert over the last two seasons, with more details here.

Picks        2013    2012
Allen        60.2%   65.0%
Golic        63.3%   62.9%
Hoge         66.8%   63.5%
Jaworski     65.6%   64.6%
Mortensen    69.5%   60.5%
Schefter     62.5%   61.7%
Schlereth    64.5%   65.2%
Wickersham   69.9%   65.2%
Jackson      62.9%   N/A
Johnson      60.2%   N/A
Ditka        64.8%   N/A
Carter       66.0%   N/A
Accuscore    64.1%   68.0%
Pick’em      65.9%   68.0%
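Out of curiosity, the two-season averages behind these rankings are easy to tabulate. A quick sketch, with the accuracies transcribed from the table above (experts who have an N/A season are left out):

```python
# Two-season prediction accuracies transcribed from the table above
# (percentages as decimals; experts with an N/A season omitted).
records = {
    "Allen":      [0.602, 0.650],
    "Golic":      [0.633, 0.629],
    "Hoge":       [0.668, 0.635],
    "Jaworski":   [0.656, 0.646],
    "Mortensen":  [0.695, 0.605],
    "Schefter":   [0.625, 0.617],
    "Schlereth":  [0.645, 0.652],
    "Wickersham": [0.699, 0.652],
}

# Average the two seasons and sort from best to worst.
ranking = sorted(records.items(),
                 key=lambda kv: sum(kv[1]) / len(kv[1]),
                 reverse=True)

for name, accs in ranking:
    print(f"{name:<10} {sum(accs) / len(accs):.1%}")
```

Wickersham’s 67.6% average comes out on top, which matches the result above.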

Adam Schefter has the worst prediction average among those who made picks in both of the last two seasons, and Keyshawn Johnson was the worst for last season alone.

Chris Mortensen’s results are the most curious: he is the winner of the most improved expert award. He did really well in the 2013 season, but his predictions were the worst of the worst for 2012 (large variability?). Let’s see if he keeps it up this year 🙂

Some additional background information: Accuscore is based on simulations (algorithms and data) by accuscore.com and Pick’em is the average of all predictions by NFL fans who submitted their picks on ESPN.com before the game (kind of a “crowd prediction” by non-experts).

Unlike the last two years, the ESPN expert pick page no longer includes the Accuscore prediction this year. I wish ESPN still included this algorithm-based (statistical) prediction in the game.

We also had fun comparing expert picks, algorithmic prediction and crowd prediction for the 2011-2012 season.

For this year: more experts, more fun! Now let the games begin! Are you ready for some football (and those experts)?

Read Full Post »

The 2012-2013 NFL regular season games are in the books now. Following the fun of comparing expert picks, algorithmic prediction and crowd prediction for the last (2011-2012) season, let’s check how well they predicted this time. Some background information: Accuscore is based on simulations (algorithms and data) by accuscore.com, and Pick’em is the average of all predictions by NFL fans who submitted their picks on ESPN.com before the game (kind of a “crowd prediction” by non-experts).

The first noticeable difference is that ESPN expanded its pool of experts from 8 to 12 by adding Jackson, Johnson, Ditka and Carter (to get a better crowd of experts?).

[Image: ESPN NFL expert picks]

For the 2012-2013 season, the twelve experts’ prediction accuracies ranged from 60.2% to 69.9% with the median around 64.6%, roughly the same as the 64.1% median accuracy of the eight experts in the 2011-2012 season.

Picks        2013    2012
Allen        60.2%   65.0%
Golic        63.3%   62.9%
Hoge         66.8%   63.5%
Jaworski     65.6%   64.6%
Mortensen    69.5%   60.5%
Schefter     62.5%   61.7%
Schlereth    64.5%   65.2%
Wickersham   69.9%   65.2%
Jackson      62.9%   N/A
Johnson      60.2%   N/A
Ditka        64.8%   N/A
Carter       66.0%   N/A
Accuscore    64.1%   68.0%
Pick’em      65.9%   68.0%

Pick’em tied Accuscore with 68% accuracy, better than all experts, in 2011-2012, but both clocked in much lower for the 2012-2013 season. Pick’em achieved 65.9%, slightly beating 8 of the 12 experts, while Accuscore was worse than 7 experts. Now what do we say about crowd prediction and algorithmic prediction?

By the way, it seems Wickersham is the best expert at prediction and did his homework. Way to go!

For statisticians: do these percentages differ significantly?
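One quick way to check is a two-proportion z-test. A minimal sketch, assuming each expert picked all 256 regular-season games and back-solving approximate correct counts from the rounded percentages (69.9% ≈ 179/256 for the best expert, 60.2% ≈ 154/256 for the worst):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Best (Wickersham, ~179/256) vs. worst (~154/256) of the 2012-2013 season.
z = two_proportion_z(179, 256, 154, 256)
print(f"z = {z:.2f}")   # |z| < 1.96 would mean not significant at the 5% level
```

Here z ≈ 2.3, so the gap between the very best and very worst just clears the 5% bar, but most of the middle-of-the-pack differences would not.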

Read Full Post »

Welcome to the age of big data, and see what those data crunchers can do! I believe both campaigns had such a group of data crunchers, but only those on the winner’s team get the highlight 🙂

Swampland

In late spring, the backroom number crunchers who powered Barack Obama’s campaign to victory noticed that George Clooney had an almost gravitational tug on West Coast females ages 40 to 49. The women were far and away the single demographic group most likely to hand over cash, for a chance to dine in Hollywood with Clooney — and Obama.

So as they did with all the other data collected, stored and analyzed in the two-year drive for re-election, Obama’s top campaign aides decided to put this insight to use. They sought out an East Coast celebrity who had similar appeal among the same demographic, aiming to replicate the millions of dollars produced by the Clooney contest. “We were blessed with an overflowing menu of options, but we chose Sarah Jessica Parker,” explains a senior campaign adviser. And so the next Dinner with Barack contest was born: a chance to eat…


Read Full Post »

After investigating the results of expert picks and algorithmic prediction for the 2011-2012 NFL season in earlier posts, we may be convinced that systematic study of relevant data can produce a better “expert”. Machine beats human by a wide margin (68% vs. 65% accuracy)?

Now if you move your eyes to the last column of the prediction image, you will notice the word “Pick’em”. It is the average of all predictions by NFL fans who submitted their picks on ESPN.com before the game, a kind of “crowd prediction” by non-experts.

Like Accuscore, the Pick’em prediction scored 10-6 in the first week of the 2011-2012 NFL season, no more, no less.


However, at the end of the 17 weeks of regular season games, Pick’em tied Accuscore with 174 right picks (68% accuracy). Isn’t it amazing? The next table shows the performance of the best four experts, Accuscore and Pick’em.

             Allen  Jaworski  Schlereth  Wickersham  Accuscore  Pick’em
Correct        165       155        167         167        174      174
Wrong           89        85         89          89         82       82
Accuracy       .65       .65        .65         .65        .68      .68

Aha!  Wisdom of the crowd kicks in nicely. A classic example of this phenomenon is often mentioned as:

At a 1906 country fair in Plymouth, eight hundred people participated in a contest to estimate the weight of a slaughtered and dressed ox. Statistician Francis Galton observed that the mean of all eight hundred guesses, at 1197 pounds, was closer than any of the individual guesses to the true weight of 1198 pounds.

This year, Lior Zoref actually brought a live ox on stage at TED.

How well did the mind of that crowd do? There were 500 estimates, and the results were:

-The lowest guess was 308 pounds.

-The highest was more than 8000 pounds.

-The average was 1792 pounds.

And the real weight? The ox weighed 1795 pounds. Three pounds off.

People rock! (How about random forests and boosting, if you know what I’m talking about?)
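For fun, the averaging effect behind the wisdom of the crowd is easy to reproduce in a toy simulation. All numbers here are illustrative: Gaussian noise with an 800-pound spread, loosely matching the 308-to-8000 range of guesses:

```python
import random

random.seed(42)
TRUE_WEIGHT = 1795          # pounds, as in the TED demonstration

# Toy model: each individual guess is the true weight plus large,
# independent noise.
guesses = [TRUE_WEIGHT + random.gauss(0, 800) for _ in range(500)]

crowd_estimate = sum(guesses) / len(guesses)
typical_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)

print(f"crowd error:      {abs(crowd_estimate - TRUE_WEIGHT):.0f} lbs")
print(f"individual error: {typical_individual_error:.0f} lbs")
```

A typical individual is off by hundreds of pounds, while the crowd average lands within a few dozen: with independent errors, the noise in the mean shrinks roughly like 1/√n. That is essentially the same averaging trick that bagging and random forests exploit.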

Read Full Post »

Following my previous post on expert picks of NFL games, one may wonder if anyone can do better than those experts at picking winning teams ahead of the game. If you direct your eagle eyes to the second-to-last column of the prediction image, you will notice the word “Accuscore”. What is it? ESPN describes it as:

AccuScore has powered more than 10,000 simulations for every NFL game on ESPN.com, calculating how each team’s performance changes in response to game conditions and opponent’s abilities. Each game is simulated and the game is replayed a minimum of 10,000 times to generate forecasted winning percentages.

In short, it is prediction based on simulations (algorithms and data). As a company, Accuscore runs simulations on almost every major sport: NFL, NBA, NCAA FB, NCAA BB, … Of course, they sell their predictions through memberships. It seems they are trying their best to turn massive amounts of data into a fortune. So, how well are they doing?
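As an aside, here is what a simulation-based forecast looks like in miniature. This is only a toy model (normally distributed team scores with made-up averages), not AccuScore’s actual method:

```python
import random

random.seed(7)

def simulate_game(mean_a, mean_b, sd=10.0):
    """One simulated game: draw each team's score around its average."""
    return random.gauss(mean_a, sd), random.gauss(mean_b, sd)

def forecast_win_pct(mean_a, mean_b, n_sims=10_000):
    """Replay the matchup n_sims times; return team A's winning percentage."""
    wins = sum(a > b for a, b in (simulate_game(mean_a, mean_b)
                                  for _ in range(n_sims)))
    return wins / n_sims

# Hypothetical matchup: team A averages 27 points, team B averages 21.
print(f"Team A win probability: {forecast_win_pct(27, 21):.1%}")
```

The forecast is just the fraction of simulated replays that team A wins; a real system would drive the per-game model with far richer data (injuries, matchups, weather) instead of two made-up scoring averages.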

For predicting the first week of the 2011-2012 season, the Accuscore prediction scored 10-6, just like one extra expert, no better, no worse.

However, at the end of the 17 weeks of regular season games, Accuscore did pretty well compared to any of those experts.

             Allen  Golic  Hoge  Jaworski  Mortensen  Schefter  Schlereth  Wickersham  Accuscore
Correct        165    161   162       155        155       158        167         167        174
Wrong           89     95    93        85        101        98         89          89         82
Accuracy       .65    .63   .64       .65        .61       .62        .65         .65        .68

Would we call it the power of data mining? (… to be continued …)

Read Full Post »