Science and Tech
What Makes Memes Popular? The Science of Memes

Scott Shank | 17 Dec 2015 19:00

Keith Shubeck first became interested in memes while losing at StarCraft. As his defenses crumbled, his opponents would taunt him with "AYBABTU," or "All Your Base Are Belong To Us," one of the earliest blockbuster memes of the online era.

Shubeck, now doing his Ph.D. in psychology at the University of Memphis, wondered what it was about some memes that made them so popular. "There are all these memes out there competing for our limited cognitive resources. If memes are competing with each other, those that are easier to remember should have an advantage."

While double rainbows and hot dog legs can boost any meme's chances, Shubeck realized that a much more basic mechanism influences meme memorability, and therefore success: how we process language. As Shubeck's collaborator Stephanie Huette, a professor of psychology at the University of Memphis, explains, "Every word, every syllable matters."

Armed with insights from research into language processing and working memory, Shubeck and Huette are now among a small group of scientists creating machine learning models to predict which memes are more likely to succeed.

Many linguistic factors determine how easily people recall words and sentences. Some are straightforward: shorter words and sentences, for instance, are easier to remember than longer ones. Others are far from intuitive but reflect fundamental aspects of how human cognition works. Emotional arousal improves recall, so the presence of words expressing positive or negative sentiment, like "nice" and "ugly," has some impact on memorability. Similarly, concrete words like "house" are generally easier to recall in short-term memory tasks than abstract words like "proof."
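The factors above can be turned into simple numeric features. The sketch below is a hypothetical illustration, not the researchers' actual tooling, and the word lists are made-up stand-ins for the sentiment and concreteness lexicons such tools typically use.

```python
# Illustrative stand-in lexicons (the real study would use full published lexicons).
AROUSAL_WORDS = {"nice", "ugly", "love", "hate"}      # emotionally charged words
CONCRETE_WORDS = {"house", "banana", "face", "dog"}   # concrete (vs. abstract) nouns

def extract_features(meme_text):
    """Return a simple feature dictionary for a meme's text."""
    words = [w.strip("?!.,") for w in meme_text.lower().split()]
    return {
        "length": len(words),                                    # shorter is easier to recall
        "arousal": sum(w in AROUSAL_WORDS for w in words),       # emotional words boost memory
        "concreteness": sum(w in CONCRETE_WORDS for w in words), # concrete words boost recall
    }

print(extract_features("Banana for scale"))
# → {'length': 3, 'arousal': 0, 'concreteness': 1}
```

Each meme's text is reduced to a handful of counts like these, which a model can then weigh against each other.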

Shubeck and Huette set out to test whether the same linguistic factors shown to influence recall in the lab could account for a meme's popularity online. They selected a set of 268 memes and ran them through linguistic analysis tools to detect a number of linguistic features. In addition to length, emotional arousal, and concreteness, they tracked other memorable features, including swear words and purposeful misspellings. They then fed these memes into a neural network (a kind of machine learning model that approximates connections in the brain), which learned how these features correlated with success, defined as 37,400 or more verbatim Google search results.
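The pipeline described above can be sketched in miniature. The toy network below is an assumption-laden stand-in: the study's real features, data, and architecture are not public, so the training set here is invented, with just two features (word count and concrete-word count) and a made-up success label.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented training data: features are [word_count, concrete_word_count];
# label 1 marks a "successful" meme. Short memes are labeled successful here
# purely to illustrate the kind of pattern such a network could learn.
X = np.array([[3, 1], [2, 0], [7, 1], [8, 0], [2, 1], [9, 1]], dtype=float)
y = np.array([[1], [1], [0], [0], [1], [0]], dtype=float)

# One hidden layer of four sigmoid units, trained by plain gradient descent.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)            # hidden-layer activations
    p = sigmoid(h @ W2 + b2)            # predicted success probability
    grad_p = (p - y) * p * (1 - p)      # squared-error gradient through output sigmoid
    grad_h = grad_p @ W2.T * h * (1 - h)
    W2 -= 0.5 * h.T @ grad_p; b2 -= 0.5 * grad_p.sum(0)
    W1 -= 0.5 * X.T @ grad_h; b1 -= 0.5 * grad_h.sum(0)

def predict(features):
    """Success probability for a new meme's feature vector."""
    return sigmoid(sigmoid(np.array(features) @ W1 + b1) @ W2 + b2).item()

# A short meme with a concrete word should score higher than a long one without.
short_concrete = predict([3.0, 1.0])
long_abstract = predict([8.0, 0.0])
print(short_concrete > long_abstract)
```

Once trained, the network scores any new feature vector, which is how a model like this can be "exposed to new memes" it never saw during training.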

Though Shubeck cautions that this work is still at the proof-of-concept stage, the fully trained neural network predicted success with 80% accuracy when exposed to memes it had not seen before.

For example, the model correctly predicted the success of the meme "Banana for scale." This meme is short and benefits from the presence of a concrete word, "banana." The model likewise correctly predicted a lack of success for the meme "Does this look like the face of mercy?" This meme also contains a concrete word, "face," but because the meme is over four words in length, the model treated it as long.
