AI-generated recipes won’t take you to Flavortown

Last year, as an experiment, I baked cakes based on recipes generated by ChatGPT and Bard. It went better than I expected, largely because much of each recipe didn’t seem to have been written by AI at all.

The recipes I got were eerily similar to recipes I’ve seen on food blogs or Instagram. They were lightly edited, and only Bard (now called Gemini) bothered to offer a citation, linking to a recipe from Sally’s Baking Addiction. In some ways, this is not a new problem; I’ve seen far too many copied, not-AI recipes floating around the internet. But AI makes the ethics of copying recipes more fraught. Its sheer scale threatens to create a world where human recipe developers are displaced by semi-random, AI-generated competition. There are signs that this world is coming, but we are not in it yet.

Even without the intervention of AI, similarities between recipes are a very sensitive topic in the food world. Some influencers have found themselves at the center of a storm when another food blogger or cookbook author claimed they stole their recipes. The New York Times has reported several cases of alleged recipe copying, including a 2021 case in which author Sharon Wee accused British chef Elizabeth Haigh of copying passages and recipes from her cookbook. Then there’s the online pile-on against popular food blogger Half Baked Harvest, who has been accused several times of copying two other popular food bloggers.

But food makers have few legal options if they feel someone has taken their work. A simple list of ingredients and instructions is treated as an idea, which cannot be protected by copyright. Many recipes have an element of oral tradition, passed down through family members. And while the lengthy personal preambles some recipes include can be copyrighted, the biggest consequence for recipe “stealing” is often a roasting in the daily snark threads of food snark Reddit. The recipe world is largely governed by etiquette, not law.

And just as with musical chords, there are only so many ingredients you can combine into a reasonable recipe. Take pie crust: how many combinations of sugar, butter, and flour can you write down before someone claims you’ve repeated their recipe?

Large language models, like the ones that power ChatGPT and Gemini, can take these permutations, analyze them faster than a human could, and come up with a pretty solid recipe very quickly. As a result, finding recipes that meet a specific diet is often touted as a good potential use of chatbots. On the other hand, to make an obvious point: AI tools cannot actually prepare or eat food. They don’t really “know” if a recipe will work, just that it fits the pattern of a recipe that does.

Cookbook author and recipe developer Abi Balingit, who runs the blog The Dusky Kitchen, said she doesn’t think about AI when she creates recipes. But she worries this could impact where food writers and developers can showcase their work.

“There are gradients in what is good and what is not. AI does not make recipe development worse because there is no guarantee that what it produces will work properly,” Balingit said. “But the nature of media is transient and unstable, so I fear there will come a point where publications will turn to an AI instead of recipe developers or chefs.”

Generative AI still occasionally hallucinates and invents things that are physically impossible, as many companies have discovered the hard way. Grocery delivery platform Instacart partnered with OpenAI, the maker of ChatGPT, for recipe images. The results ranged from hot dogs with the inside of a tomato to a salmon Caesar salad featuring an apparent lemon-lettuce hybrid. The proportions were off, too: as The Washington Post noted, the steak in one Instacart recipe would easily feed more people than the serving size claimed. BuzzFeed also launched an AI tool that recommended recipes from its Tasty brand.

Balingit added that people have a certain level of trust when they read someone’s recipe or see how someone makes a dish. There is the expertise of having made the food and actually tasted it.

That explains why I immediately felt the need to double-check the chatbots’ recipes. AI models can still hallucinate and wildly misjudge how ingredient amounts affect taste. Google’s chatbot, for example, inexplicably doubled the number of eggs, making the cake moist but also dense and gummy in a way I didn’t like.

Balingit says one of the things recipe makers offer is human connection. Her cookbook Mayumu is filled with dishes that trace an imaginative journey through her parents’ migration to the US from the Philippines, her childhood in California, and her current life in New York. AI, she said, has no cultural or nostalgic connection to food and eating, something that is deeply personal and that people share with others.

I felt the same. Although I never really cared who first came up with the idea of poaching chicken in a ginger broth, it is still my favorite Filipino dish: tinola. When I first started learning how to cook it (thanks in large part to my dad kicking me out of the kitchen for being a nuisance), I scoured the internet for “authentic” recipes. I chose to follow the work of people whose personal connection to the dish was strong, and it made me want to make the food they wrote about so passionately.

ChatGPT and Bard can generate functional recipes. I know that because I followed them. But as the person who baked those cakes, I knew they were emotionless and generic. My editor, Adi Robertson, likened one to a boxed cake mix; another reminded me of sad cafeteria cake. Sure, it hits the spot. Yes, it’s chocolate cake. But cake can be so much more.
