Up next: Vancouver Art Book Fair, October 18–26, 2021

Prelude to a Recipe



These short texts were made by combining recipe blogger stories and an artificial intelligence (AI) model, GPT-2, developed by the research group OpenAI.

Long essays that precede recipes are a byproduct of the online publishing revenue model. To earn income, a blogger has to write content that attracts search engines, which in turn drives clicks and readers. More readers means more eyeballs on the blog’s sponsored ads, and therefore more income for the blogger. Readers, however, often complain that these essays make for an irritating user experience: one must scroll a loooong way before reaching the recipe instructions.

GPT-2 was trained to generate human-like text from source material drawn from roughly 8 million web pages. One major source was Reddit.com, whose users are mostly male, between the ages of 18 and 29, and go by names like “UncleTouchyFingers” and “I_YELL_ALOT.” GPT-2 is available for public use and can be “primed” with any text input; when the model runs, its output is a continuation of that input.

Since GPT-2 was trained on the internet without curated source material, I was curious what would happen if I primed the model with wholesome baking stories. These once-scrolled-past texts now have a new life: they describe the challenges of a pushy mother and a failing marriage, along with the insecurities that come with keeping up a popular online presence.

Related products

  • Spammy Humans
  • The Man Who Mistook His Speaker for a Mistress