Our list of “The Best Beers and Breweries of 2018” is the result of a combined analysis of the top-rated beers as determined by over 10 million collective beer reviews across three separate beer-rating websites: BeerAdvocate, Untappd, and RateBeer.
Here’s how it works:
In order to rank the top beers from Untappd, RateBeer, and BeerAdvocate, we started by combining the top 50 beers from all three websites into a single super list, and then removed any meads, retired beers, and duplicates (including multiple vintages of the same beer).
This brought the total number of beers on the list down to 75. Next, we assigned a score to each remaining beer on the list by averaging a given beer’s score from Untappd, RateBeer and BeerAdvocate. Finally, we arranged all the beers on the list from highest to lowest score, and identified the top 50.
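The steps above can be sketched in a few lines of Python. This is only an illustration of the method, not the code we used, and the beer names and scores below are made up for the example:

```python
from statistics import mean

# Hypothetical per-site top lists: {beer name: site score}.
untappd = {"Beer A": 4.60, "Beer B": 4.55}
ratebeer = {"Beer A": 4.70, "Beer C": 4.50}
beeradvocate = {"Beer A": 4.65, "Beer B": 4.58, "Beer C": 4.52}

excluded = {"Some Mead"}  # meads and retired beers get dropped

# 1. Combine all three lists into a single deduplicated "super list".
super_list = (set(untappd) | set(ratebeer) | set(beeradvocate)) - excluded

# 2. Score each beer by averaging its site scores. (In the article's
# method, each beer's score was pulled from all three sites; here we
# simply average over whichever example lists contain the beer.)
def combined_score(beer):
    scores = [site[beer] for site in (untappd, ratebeer, beeradvocate)
              if beer in site]
    return mean(scores)

# 3. Sort from highest to lowest combined score and keep the top 50.
ranked = sorted(super_list, key=combined_score, reverse=True)[:50]
```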
The Top 10 Beers are listed below; however, the entire top 50 can be found here.
The Top 10 Beers of 2018
RANK | NAME | BREWERY | SCORE |
1. | Kentucky Brunch Brand Stout | Toppling Goliath Brewing Company | 4.666 |
2. | Dark Lord - Marshmallow Handjee | 3 Floyds Brewing Co. | 4.63 |
3. | Hunahpu's Imperial Stout - Double Barrel Aged | Cigar City Brewing | 4.623 |
4. | Zenne y Frontera | Brouwerij 3 Fonteinen | 4.586 |
5. | Mornin' Delight | Toppling Goliath Brewing Company | 4.576 |
6. | Pliny The Younger | Russian River Brewing Company | 4.573 |
7. | Barrel-Aged Abraxas | Perennial Artisan Ales | 4.566 |
8. | Barrel Aged Imperial German Chocolate Cupcake Stout | Angry Chair Brewing | 4.543 |
9. | Juice Machine | Tree House Brewing Company | 4.536 |
10. | Trappist Westvleteren 12 (XII) | Brouwerij Westvleteren | 4.53 |
[Note: Untappd lists beers of different vintages as separate individual entries, such as with the many “Bourbon County” vintages from Goose Island, whereas BeerAdvocate and RateBeer simply combine all the vintages and assign a single entry and single score. To simplify our list, we combined all vintages of the same label on Untappd, averaged the score, and then averaged that score with the scores from RateBeer and BeerAdvocate.]
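The vintage-handling step in the note above amounts to a nested average: collapse Untappd's per-vintage entries into one score first, then average that with the other two sites. A minimal sketch with made-up example scores:

```python
from statistics import mean

# Hypothetical scores for one beer label.
untappd_vintages = [4.70, 4.62, 4.58]  # Untappd lists each vintage separately
ratebeer_score = 4.55                  # RateBeer: one combined entry
beeradvocate_score = 4.60              # BeerAdvocate: one combined entry

# First collapse the Untappd vintages into a single score...
untappd_score = mean(untappd_vintages)

# ...then average that with the single scores from the other two sites.
final_score = mean([untappd_score, ratebeer_score, beeradvocate_score])
```

Note that this weights each site equally rather than each individual vintage entry, which matches the method described in the note.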
Determining The Top Breweries of 2018
To calculate our list of top breweries, first we combined all of the beers that appeared on RateBeer, Untappd, and BeerAdvocate’s top 50 into a single super list. Then, each brewery was awarded a single point for each beer it had on the super list, excluding meads and retired beers.
Finally, the breweries were ranked by the number of points they had, with ties broken by the average score of each brewery's beers on the list. To determine the average score of a given beer, we simply averaged that beer's individual scores from RateBeer, BeerAdvocate and Untappd.
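The brewery scoring above can also be sketched briefly: one point per beer on the super list, with the brewery's average beer score as the tiebreaker. Again, the entries below are invented for illustration:

```python
from statistics import mean

# Hypothetical super-list entries: (beer, brewery, [per-site scores]).
super_list = [
    ("Beer A", "Brewery X", [4.60, 4.70, 4.65]),
    ("Beer B", "Brewery X", [4.55, 4.58]),
    ("Beer C", "Brewery Y", [4.50, 4.52]),
]

points = {}      # one point per beer a brewery lands on the super list
avg_scores = {}  # each beer's score is itself a cross-site average
for beer, brewery, scores in super_list:
    points[brewery] = points.get(brewery, 0) + 1
    avg_scores.setdefault(brewery, []).append(mean(scores))

# Rank by points, breaking ties with the brewery's average beer score.
ranked = sorted(points,
                key=lambda b: (points[b], mean(avg_scores[b])),
                reverse=True)
```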
The Top 10 Breweries of 2018
RANK | BREWERY | # OF BEERS | AVERAGE SCORE |
1. | Tree House Brewing Company | 18 | 4.42 |
2. | Toppling Goliath Brewing Company | 10 | 4.53 |
3. | 3 Floyds Brewing Company | 7 | 4.40 |
4. | Cigar City Brewing | 6 | 4.54 |
5. | Hill Farmstead Brewery | 6 | 4.43 |
6. | AleSmith Brewing Company | 6 | 4.40 |
7. | Founders Brewing Company | 5 | 4.48 |
8. | Goose Island Beer Co. | 5 | 4.43 |
9. | Russian River Brewing Company | 5 | 4.41 |
10. | Funky Buddha Brewery | 4 | 4.47 |
[All rating data was pulled from RateBeer, BeerAdvocate and Untappd on 3/19/18.]
And of course no beer list would be complete without somebody explaining why it sucks. So for your entertainment, we’ve included a conversation with the dreaded “Critique Master” below who does an utterly thorough job of destroying our beer list and possibly every other beer list there is.
Enjoy!
A Conversation with Critique Master about the Top Beers:
Critique Master: First things first: we’re barely a third of the way through 2018, so how the heck can you make any semblance of a claim about the year’s “top beers and breweries” when the year isn’t even over yet? Even if you combine all the data from BeerAdvocate, RateBeer and Untappd, that data is subject to change at any moment, right? So isn’t this just a list of what’s best at this very second and not really useful? Just sayin’…
BeerSyndicate: Wow, coming straight out of the gate swinging, huh? Fair enough, I guess. So yes, you’re partially correct: the ranking of top beers on RateBeer, BeerAdvocate and Untappd is subject to change. But just because something can change doesn’t mean that it will. At least not much in this case, anyway.
For example, looking back almost one year ago today, 10 of the first 11 beers on BeerAdvocate’s top-beer list back then are the very same beers at the top of its list today. In fact, if one beer in that top 11 hadn’t been retired (namely King JJJuliusss from Tree House Brewing), the list from a year ago would be exactly the same as today’s, just in a slightly different order. Data points from RateBeer and Untappd were similar.
Critique Master: That’s all well and good, but do we really need another “best of” beer list? I get it, beer ratings can influence buying decisions, sometimes in a major way like in the case of Westvleteren, but I mean, can’t you spend your time doing something more important? Just sayin’…
BeerSyndicate: Yes and no, in that order… or some other order.
Critique Master: Very cute. Come to think of it though, technically this isn’t even really “your” list of “top beers”, is it? I mean, I think you guys are actually being really misleading because you’re basically just relying on all of the reviews from BeerAdvocate, Untappd and RateBeer, to which you’ve contributed exactly zero reviews.
Just sayin’…
BeerSyndicate: Well, our list was more about compiling and analyzing a larger amount of data in an attempt to get a more complete picture of the top rated beer and breweries. I don’t know, we thought it was a pretty cool idea, but you are the Critique Master, and we respect your title.
Critique Master: I am indeed the Critique Master, and you will tremble at my critiquing powers. Speaking of which, you mentioned that you excluded meads, retired beers, and different vintages of the same beer from your list? That doesn’t seem fair. Care to explain why you’re cherry picking the data?
BeerSyndicate: As tasty as mead can be, it’s a honey wine made only with honey, water and yeast. In other words, it’s not a beer. Beer, on the other hand, has to contain at least some grain. That said, if there was a braggot on any list, which, as you know, is a kind of mead made with the addition of barley malt, then we would have included it. But there wasn’t, so we didn’t.
As for excluding retired beers (which encompasses previous vintages of a given beer), we simply wanted to give a representation of the best beers currently being produced on the market and also make all of the data consistent since Untappd is the only site of the three that separates out certain vintages of the same beer.
This is just one way of looking at the data. You could certainly do it some other way.
Critique Master: Yeah, I’d probably do it a different way. Or actually I wouldn’t do it at all, because your entire dataset is flawed. Just sayin’…
BeerSyndicate: Here it comes.
Critique Master: For starters, Untappd doesn’t seem to have any set guidelines for how its users should rate a beer, so the justification for the average score of any beer is unclear, though my guess is that most users determine a rating based on how much the user “personally enjoys” a given beer, which is subjective.
On the other hand, BeerAdvocate has its users rate beer according to how well they think a beer represents the definition of the beer style listed on the BeerAdvocate website. And RateBeer is somewhere in the middle.
In other words, your list of “top beers” confusingly blends reviews that are based on purely subjective personal enjoyment like many on Untappd and also reviews that are rated according to some more fixed standard. Apples and oranges.
In addition, the varying levels of individual experience and palate acuity of the reviewers determining the score for a given beer on any social beer rating site is also problematic.
Not only that, but I bet it’s mainly Americans who are doing the beer reviews, so that limits your results to mainly the American beer scene and palate, and to a lesser extent non-American English speakers. Just sayin’…
BeerSyndicate: It’s true— we’re taking the data from BeerAdvocate, Untappd and RateBeer at face value. And though we can’t say for certain, you’re probably right that the user base of all three websites is likely mainly American, and then non-American English speakers.
You’re also right to question the level of experience any user has on any of those sites and how that might affect the relative objectivity of any beer’s score.
That said, we never claimed that our list is a list of the “best beer in the world”. It could be, but all we said was this is a list based on an analysis of the top beers per RateBeer, Untappd and BeerAdvocate.
But you have a valid point when it comes to the subjectivity of scoring a beer based on personal enjoyment as you assume many users on Untappd do. People have different personal preferences, so beer scores based on those kinds of reviews may not necessarily be very useful to everyone.
If you think reviews from Untappd are too subjective, or that it’s more of a social app than a serious rating site, we could show you what the top 10 beers would look like if we excluded Untappd’s data entirely and used only the combined results from RateBeer and BeerAdvocate.
Critique Master: I’m mildly interested. Go ahead and show me your findings, but make it quick.
BeerSyndicate: Right away, but only because you’re mildly interested:
[Below are the top 10 beers based only on BeerAdvocate & RateBeer data]
RANK | NAME | BREWERY | SCORE |
1. | Kentucky Brunch Brand Stout | Toppling Goliath Brewing Company | 4.685 |
2. | Zenne y Frontera | Brouwerij 3 Fonteinen | 4.58 |
3. | Hunahpu's Imperial Stout - Double Barrel Aged | Cigar City Brewing | 4.565 |
4. | Mornin' Delight | Toppling Goliath Brewing Company | 4.55 |
5. | Dark Lord - Marshmallow Handjee | 3 Floyds Brewing Co. | 4.535 |
6. | Pliny The Younger | Russian River Brewing Company | 4.525 |
7. | Trappist Westvleteren 12 (XII) | Brouwerij Westvleteren | 4.515 |
8. | Barrel-Aged Abraxas | Perennial Artisan Ales | 4.505 |
9. | Barrel Aged Imperial German Chocolate Cupcake Stout | Angry Chair Brewing | 4.495 |
10. | Canadian Breakfast Stout (CBS) | Founders Brewing Co. | 4.485 |
Critique Master: Huh, 9 out of 10 beers from the list without Untappd’s data were the same as the list with it. But that could mean anything. And anyways, the source data from all of these beer review sites— and even your data— could still be untruthful, corrupt or suspect in some way.
For example, Anheuser-Busch InBev acquired a minority stake in RateBeer back in 2016, so even if we assume RateBeer’s data was otherwise perfect, I now have at least a minority reason to be skeptical.
Funny— you’d think that with the glut of AB-InBev-owned Goose Island beers in Untappd’s top 50, it was Untappd that was being influenced by Anheuser-Busch. Anyways.
But here’s an even bigger problem with all of these “top beer” lists: they’re just lists of over-hyped beer. And this is largely because few if any of the beer reviews on Untappd, etc. are done blind.
BeerSyndicate: It’s a fair point to make that the “hype factor” may have some effect on some of the beer reviews that appear on BeerAdvocate, RateBeer and Untappd. But exactly how to account for the “hype factor” and to what degree that hype even has an effect on those reviews is up to speculation.
Sure, some folks might over-score a hyped beer, but some might unfairly under-score a hyped beer because it doesn’t live up to mountains of hype. And then some might be impartial and not let the hype influence their judgment either way.
So it seems reasonable that blind tastings are potentially helpful for eliminating some of the elusive “hype factor”, though someone doing a blind tasting may think they know what beer it is that they’re sampling, and somehow that could affect their scores.
Nevertheless, most professional and homebrewing competitions in the U.S. are judged blind.
If you like blind beer reviews, you might like the ones Paste Magazine puts out.
Critique Master: Actually, even though Paste Magazine does blind reviews, they score beer according to “personal enjoyment” similar to many Untappd reviews, which, like I said, is pretty darn subjective.
I mean, why go through all the trouble of doing a blind review in the name of objectivity just to introduce subjectivity back into the equation by judging beer according to personal enjoyment instead of some independent publicly established standard like the BJCP Beer Style Guidelines?
Just sayin’…
BeerSyndicate: Not sure. We still like Paste beer reviews, but you might have a point.
That said, beer styles can change over time, so a fixed definition of beer would also have to be subject to change. Paste might be trying to account for the real-time evolution of beer styles by not strictly adhering to past publications of the BJCP Beer Style Guidelines.
Critique Master: Nice try, but basing a beer review on “personal enjoyment” isn’t the same thing as trying to account for a possible subtle development of a particular beer style. Just sayin’…
BeerSyndicate: Alright, so to sum up your position thus far: You have a problem with the beer reviews from Paste and Untappd because they base their reviews on personal enjoyment, which is subjective.
You’re suspicious of RateBeer reviews because AB InBev owns a minority stake in the company.
You have a problem with BeerAdvocate because, similar to Untappd and RateBeer, its users aren’t necessarily performing blind tastings to rate beer.
And you obviously don’t like our list of top rated beers because, well, you don’t like the data it’s based on.
Does that about cover it?
Critique Master: Almost. Rating a beer based on personal enjoyment is obviously problematic due to the subjectivity of personal preference, so to make things less subjective, it’s better to rate beer according to a fixed standard like the BJCP Beer Style Guidelines.
Even if you view the style guidelines as purely conventional, sort of like the common convention of using inches or centimeters on a ruler, it’s still better to have a common public standard to rate beer than whatever private reasons an individual has for determining their personal level of enjoyment.
But even then, people still have to judge a beer based on their individual physiological sense of taste, which can vary from person to person. For example, people may perceive the intensity of certain taste sensations differently. About 25% of the population are supertasters and perceive certain flavors more intensely than other people, 50% are normal tasters, and 25% are non-tasters who perceive flavors less intensely than normal tasters and far less intensely than supertasters.
In other words, if you’re a supertaster, then your review of a given beer may not be as relevant to normal tasters, and even less so to non-tasters. Therefore, people conducting beer reviews should get tested to determine if they are a supertaster using a $5 test.
BeerSyndicate: No offense Critique Master, but hardly anyone will get tested to determine if they’re a supertaster, normal taster or a non-taster.
Not only that, but you’re gonna end up with three sets of beer reviews: one for supertasters, one for normal tasters, and one for non-tasters. Not to mention, the person reading the review would also have to be tested to know which beer review applies to them.
Then you’d also have to control for other conditions, including the serving temperature of the beer, the kind of glass the beer is served in, environmental distractions, the level of experience of the taster, the level of intoxication of the taster, maybe even the diet of the individual like they do with professional coffee tasters, and I’m sure there are other factors we haven’t thought of.
It’s too prohibitive, especially just for the sake of beer reviews. It’s simply easier for the public to buy a given beer and try it for themselves.
Critique Master: Exactly. And because of this, most or maybe all beer reviews have very little practical application to the individual— except for maybe a blind review done by experienced individuals with trained palates according to a fixed standard and intended for their own personal use. I suppose this also goes for wine and food reviews too.
By the way, this also means that you’ve wasted your time doing your little analysis of the so-called top beers and breweries of 2018, assuming you were trying to generate any meaningful data in the first place.
Just sayin’…
Next on tap on the BeerSyndicate Blog: Ranking the Beers of Rodenbach!
Hi, I’m Dan: Beer Editor for BeerSyndicate.com, Beer and Drinking Writer, Award-Winning Brewer, BJCP Beer Judge, Beer Reviewer, American Homebrewers Association Member, Shameless Beer Promoter, and Beer Traveler.