Gold Medals Do Not Mean Good Wine: Actual Proof?

Bear with me while I get this out of the way:

I told you so.

I’ve taken a lot of flak here at Vinography for my stance on the competition, state fair, and county fair medals that wineries like to make a big deal about. I think they’re all bunk: useless to the consumer, and a waste of money for the wineries trying to win them. For reference, I suggest you look at my posts entitled Stop The State Fair Madness and Wine Competitions are One Big Racket.

Up until this point, my opinion has been based on purely anecdotal evidence that I can summarize quite simply: I taste thousands of wines a year at large wine tasting events, where (annoyingly) many producers trumpet their medal-winning wines. The vast majority of the time, these wines aren’t any good. Some of the time, they are positively awful.

But now we have a modicum of statistical evidence to support my contention, thanks to some folks at the American Association of Wine Economists, who have just published a paper entitled “An Analysis of the Concordance Among 13 U.S. Wine Competitions.” The paper not only provides what I believe to be pretty clear statistical support for my previous claims, but also cites a study I was unaware of suggesting that gold medals don’t really increase sales to consumers in the first place!

The paper outlines an analysis of, among other things, 2,440 wines that were entered into three or more wine competitions around the country. 47 percent of these won gold medals (that fact alone should ring alarm bells), but 84 percent of those gold medal winners received ZERO medals (not even a bronze) in at least one other competition. That means a wine rated among the very best in one competition was frequently rated below average in another. Even taking into account the differences in the field of competition, this is a rather damning indictment of the quality, relevance, and value of these competitions and their awards.
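For the statistically inclined, here’s a minimal back-of-the-envelope sketch in Python of why those numbers look more like chance than discernment. The model and its parameters are my own assumptions for illustration (each wine entered in exactly three competitions, with golds awarded independently at a fixed rate), not figures taken from the paper itself:

```python
# Hypothetical chance model (my assumption, not the paper's own analysis):
# each competition awards gold independently at a fixed rate p, and each
# wine enters n competitions.
n = 3  # assumed number of competitions entered per wine

# What per-competition gold rate would produce the reported 47 percent of
# wines winning at least one gold, by luck alone?
#   P(at least one gold) = 1 - (1 - p) ** n = 0.47
p = 1 - (1 - 0.47) ** (1 / n)
print(f"implied per-competition gold rate: {p:.1%}")  # roughly 19%

# Raw counts implied by the reported percentages:
wines = 2440
gold_winners = round(0.47 * wines)    # wines with at least one gold (~1,147)
blanked = round(0.84 * gold_winners)  # of those, shut out elsewhere (~963)
print(gold_winners, blanked)
```

In other words, under this toy model an unremarkable per-competition gold rate of roughly 19 percent would, by coin-flip logic alone, reproduce the very 47 percent figure the paper reports.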

One of the interesting details of the study is that the only respect in which these competitions were concordant was in their evaluation of wines they did not like: there were groups of wines that consistently received no award or, at best, a bronze medal at these competitions. This suggests, as the author of the study has apparently published elsewhere, that the judges at these events really only agree on what they don’t like.

The 13 competitions whose results were studied in this report are:

Dallas Morning News Wine Competition
San Francisco Chronicle Wine Competition
Grand Harvest Awards
Jerry D. Mead’s New World International Wine Competition
West Coast Wine Competition
Pacific Rim International Wine Competition
San Diego National Wine Competition
Riverside International Wine Competition
Los Angeles County Fair
International Eastern Wine Competition
Orange County Fair Wine Competition
San Francisco International Wine Competition
California State Fair Wine Competition

For anyone unfamiliar with this competition circuit, these are essentially the largest and most prestigious wine competitions in America, many of which employ large numbers of wine professionals as judges. According to the paper, wineries spend more than a million dollars every year on entry fees for these competitions.

So I’ll say it again: stop the madness. I know that small wineries need every little bit of help they can get to sell their wines and get some attention from consumers, but these competitions are a lousy way to do that, and the awards do more to prop up the egos of those who enter than to help consumers make relevant buying decisions.

Even with this paper in hand, I expect a volley of stones. Read it first, and then fire away.

An Analysis of the Concordance Among 13 U.S. Wine Competitions — 161k PDF.