We had three goals for prototyping the Crowdtasting: The Science of Beer & Food Pairings research event we hosted on March 25, 2016. 1) We wished to understand the logistics of ethically engaging 400 people in a flavor study at the Museum, done efficiently and effectively. 2) We needed to establish whether crowdtasting research events attract people who will take the research seriously and provide high-quality data. 3) We wanted to learn how to better design the recipes, select the beers and frame the questions for a crowdtasting audience to ensure that the results would withstand peer-reviewed scientific publication.
(Information on the Beer & Food Working Group can be found here)
Disclaimer: This report is not peer-reviewed and represents a rudimentary analysis of the data for public sharing. The data will be more rigorously analyzed and then used to formulate hypotheses for future research studies (both crowdtasting and in the Genetics of Taste Lab). The results of future studies will be submitted for peer-reviewed scientific publication. In addition, for those of you who joined us on March 25, 2016, we shared with you the quick and dirty data. That was not a statistical analysis, but rather a visualization of where each food was rated individually (scored by the scale choice with the most responses) compared to the same food with each beer. Therefore, the statistical analysis you see here may disagree with that presentation, and in such cases the results presented here are a more accurate representation of the data. Finally, logistics prevented randomization of the beer and food order for this event. This is something we will strive to accomplish in future studies.
Results for Goals 1 & 2
1. Logistics: We still have much to assess (and we have plenty of meetings scheduled just to work through this very item!), but in short the logistics were a challenge meant to be overcome. Even between the first and second sensory sessions we learned so much about what to have ready (e.g. pitchers on the tables, paper surveys on hand for when Wi-Fi and phones fail), what information to give verbally and what points on the digital data entry to reinforce, that the second session went far more smoothly than the first. We currently have a survey request out to our 400 participants to learn more about how they felt the flow went, what was difficult, and what they felt we overlooked. We will take every bit of feedback into account as we determine how best to conduct crowdtasting research events in the future.
2. Crowdtasting Participants: Far and away the most impressive takeaway of the night was how poised, focused and dedicated our public panelists were. As I analyze the data, I continue to be impressed with the aptitude of the audience we hosted. We know now that we can ask much more in-depth questions, including intensity scaling, of this group, which will allow us to ask higher-impact questions in crowdtasting research events moving forward. A huge kudos to all of you out there who attended and contributed such high-quality data!
Results for Goal 3
Gender. Using a two-sample t-test, we found that men and women show statistically different (p-value < 0.05) preference ratings for one food item (men preferred the pepper sauce) and two of the beers (men preferred the stout and the IPA) (Table 1). It is also worth noting here how much women in Colorado rock. My colleagues in the beer industry were impressed with the fact that a craft beer event attracted more women than men (Figure 1). This is opposite of the typical trend, and as a scientist and craft beer fan who happens to be a woman, I was in good company last Friday night!
Table 1. Comparison of men's and women's preference scores for the beers and the food items. (* indicates the gender difference in preference for that item is statistically significant)

Item                  p-value
Savory Tart           1.000
Pecan Shortbread      0.869
Pepper Sauce          0.000*
Beer A (Brown)        1.000
Beer B (Hefeweizen)   1.000
Beer C (Stout)        0.000*
Beer D (IPA)          0.001*
Figure 1. Gender of Crowdtasting Participants
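For readers curious about the mechanics, a two-sample t-test like the one behind Table 1 can be sketched in a few lines of Python. All of the scores below are invented for illustration only; they are not the event data, and the actual analysis may have used a pooled-variance test with an exact p-value rather than the critical-value shortcut shown here.

```python
# Hypothetical sketch of a two-sample (Welch's) t-test comparing men's and
# women's preference ratings. All scores are invented; not the event data.
from statistics import mean, variance
from math import sqrt

men_ipa   = [7, 8, 6, 9, 8, 7, 8, 9, 7, 8]   # invented preference ratings
women_ipa = [5, 6, 4, 6, 5, 7, 5, 6, 4, 5]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

t_stat = welch_t(men_ipa, women_ipa)

# With ~18 degrees of freedom, |t| > 2.101 corresponds to p < 0.05 (two-tailed)
significant = abs(t_stat) > 2.101
print(f"t = {t_stat:.2f}, significant at p < 0.05: {significant}")
```

In practice a statistics package (e.g. scipy.stats.ttest_ind) would report the exact p-values shown in Table 1; the point of the sketch is simply that the test asks whether the gap between the two group means is large relative to the spread within each group.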
Pairings. To account for individual baseline liking of each food, the preference scores for each pairing were normalized to the original food preference score. To determine if there is a difference in preference for the food when paired with the given beer styles, I conducted an analysis of variance followed by Tukey's honestly significant difference (HSD) test (Figure 3).
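As a sketch of that pipeline, the snippet below normalizes invented pairing scores against each taster's food-alone score and computes a one-way ANOVA F statistic. The subtraction-based normalization is an assumption (the exact normalization was not specified), the scores are made up, and only three of the four styles are shown; a significant F would then be followed by Tukey's HSD pairwise comparisons.

```python
# Hypothetical sketch of the pairing analysis. All scores are invented and the
# subtractive normalization is an assumption, not the documented method.
from statistics import mean

# invented preference scores: food alone, then the food with each beer style
baseline = [6, 7, 5, 6, 7, 6, 5, 7]
with_beer = {
    "brown":      [8, 9, 7, 8, 9, 8, 7, 9],
    "hefeweizen": [6, 7, 6, 6, 7, 6, 6, 7],
    "stout":      [4, 5, 3, 4, 5, 4, 3, 5],
}

# normalize each pairing score to that taster's baseline liking of the food
normalized = {style: [s - b for s, b in zip(scores, baseline)]
              for style, scores in with_beer.items()}

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across the given groups."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f_stat = one_way_anova_f(list(normalized.values()))
print(f"F = {f_stat:.1f}")  # a large F means style preferences differ somewhere
```

The ANOVA only says that at least one style differs; Tukey's HSD (available in statsmodels, for example) is what identifies which specific style pairs differ, which is how the shared numbers in Figure 3 were determined.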
Figure 3. Boxplot of preference scores for foods with each beer style (normalized to original food score). Shared numbers on the graph indicate that there is no statistical difference in preference between the styles identified.
Figure 3A. Savory Tart (Primary Taste Attribute: Umami)
We found that the brown was preferred over the three other styles when paired with a primarily umami-based food. There was no statistical difference between the hefeweizen and the IPA. The stout was preferred the least in this pairing.
Figure 3B. Pecan Shortbread (Primary Taste Attribute: Sweet)
We found that the brown and the hefeweizen were preferred equally when paired with a primarily sweet-based food, and were preferred over the stout and the IPA, which were equally the least preferred.
Figure 3C. Pepper Sauce (Intended Primary Mouthfeel Attribute: Spicy. However, descriptive analysis by the taste panel at New Belgium Brewing Co. showed that there were three primary attributes for this recipe: Spicy, Sour and Salty; therefore very little can be inferred from these data)
We found that the brown, the hefeweizen and the IPA were equally preferred with this recipe. The stout was preferred the least.
This work would not have been possible without the support of the Beer & Food Working Group members, our tireless adult programs coordinator Julia Spaulding-Beegles, the Catering & Events team (with special thanks to Executive Chef Steven Alves, Executive Sous Chef Carl Klein, Director of Catering Patrick Hartnett, and banquet staff Paul Sanchez, Rick Maes, Ashley Treadway and their serving crew), the breweries that donated their beer under a strict code of confidentiality so as not to influence the outcome of the data (Avery Brewing, Prost Brewery, Deschutes Brewery and O'Dell Brewing; see beer labels below for the tasting code), the expert taste panel at New Belgium Brewing Co., the volunteers who set up 400-plus table settings (special shout-out to the folks who helped out from the Fermentation Program at CSU and the Rocky Mountain Taste and Smell Center at the Anschutz Medical Campus), and last but not least the crowdtasting participants who graciously shared their data and feedback, and cheered us on while doing so. We could not have pulled this off without you!