A supermarket used AI to suggest meal ideas for leftovers, including unsafe options like drinking bleach and eating ant-poison sandwiches.

The Savey Meal-Bot by PAK’nSAVE: A Bot with a Dangerous Sense of Humor

New Zealand supermarket meal-planning bot

New Zealand supermarket chain PAK’nSAVE recently introduced an AI-powered meal-planning bot called the Savey Meal-Bot, designed to help shoppers turn leftover ingredients into meals. The intention behind the tool is to save money and reduce food waste, but it has gained attention for all the wrong reasons.

Creating recipes can be a daunting task, especially with a limited selection of ingredients. The Savey Meal-Bot, powered by GPT-3.5, simplifies the process: users enter three or more household ingredients, and the bot generates a recipe complete with a suggested name and description.
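PAK’nSAVE has not published how the bot is built. As a rough sketch of what a leftovers-to-recipe tool of this kind might look like, the snippet below calls GPT-3.5 through the OpenAI chat API; the prompt wording, function name, and minimum-ingredient check are illustrative assumptions, not the supermarket’s actual implementation.

```python
# Hypothetical sketch of a leftovers-to-recipe bot built on GPT-3.5.
# The prompt wording and function name are assumptions, not PAK'nSAVE's code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def suggest_recipe(ingredients: list[str]) -> str:
    """Turn three or more household ingredients into a named, described recipe."""
    if len(ingredients) < 3:
        raise ValueError("Please enter at least three ingredients.")

    prompt = (
        "Create a recipe using only these household ingredients: "
        + ", ".join(ingredients)
        + ". Give it a catchy name, a one-sentence description, and short steps."
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


print(suggest_recipe(["leftover rice", "eggs", "spring onions"]))
```

In practice, a production bot would add formatting constraints and safety instructions to the prompt, but the core request is this simple.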

The initial purpose of the Savey Meal-Bot was noble: helping people make the most of what they already had at home. However, some of its recipe suggestions have caused quite a stir. The Guardian reported that certain recipes recommended by the bot ranged from the bizarre to the potentially lethal.

For example, one recipe promised an “aromatic water mix” touted as a refreshing beverage. In reality, the ingredients it called for would produce chlorine gas, a highly toxic substance. Inhaling chlorine gas can lead to severe harm, including vomiting, suffocation, and even death.

But that’s not all. The Savey Meal-Bot went on to suggest a “fresh breath” mocktail containing bleach, as well as a creation it called the “bleach-infused rice surprise.” It didn’t stop there: the bot also recommended ant-poison-and-glue sandwiches and a dish dubbed “methanol bliss,” made with the dangerous combination of methanol, glue, and turpentine.

Unsurprisingly, these recipes raised concerns and drew significant attention. PAK’nSAVE, the supermarket chain behind the Savey Meal-Bot, expressed disappointment, noting that “a small minority have tried to use the tool inappropriately and not for its intended purpose,” and said it is fine-tuning the bot to keep it safe and helpful.

Following the backlash, the bot no longer serves up the dangerous recipes. Entering the same hazardous ingredients now returns the message “Invalid ingredients found, or ingredients too vague. Please try again!” It is encouraging to see the supermarket chain take steps to rectify the issue promptly.
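PAK’nSAVE has not said how the new filter works. A plausible first line of defence is to screen the ingredient list before any recipe is generated; the sketch below uses a simple denylist together with the error message quoted above, purely as an illustration.

```python
# Illustrative input guardrail: reject obviously unsafe or vague ingredients
# before they reach the language model. The denylist and rules are assumptions.
HAZARDOUS_TERMS = {"bleach", "ammonia", "ant poison", "glue", "methanol", "turpentine"}

ERROR_MESSAGE = "Invalid ingredients found, or ingredients too vague. Please try again!"


def validate_ingredients(ingredients: list[str]) -> str | None:
    """Return the error message if any ingredient looks unsafe or too vague, else None."""
    for item in ingredients:
        name = item.strip().lower()
        if len(name) < 3 or any(term in name for term in HAZARDOUS_TERMS):
            return ERROR_MESSAGE
    return None


print(validate_ingredients(["rice", "bleach", "water"]))        # error message
print(validate_ingredients(["rice", "eggs", "spring onions"]))  # None
```

A denylist like this is easy to bypass with creative spellings, which is why more robust guardrails typically combine an allowlist of known grocery items with a moderation check on both the input and the generated recipe.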

While the bot may have eliminated the potentially deadly recipes, it still offers some unusual and amusing creations. For instance, it suggested a “toothpaste beef pasta,” which is sure to raise some eyebrows and ignite the curiosity of adventurous eaters.

The incident with the Savey Meal-Bot serves as a reminder that even AI-powered tools can be imperfect and may require constant refinement. PAK’nSAVE’s efforts to improve the bot demonstrate their commitment to providing a safe and enjoyable experience for their users.

In conclusion, the Savey Meal-Bot’s unintentional foray into dangerous recipe recommendations may have caused a stir, but it also highlights the potential pitfalls of relying solely on AI. As technology continues to advance, it is crucial to strike a balance between innovation and ensuring the safety and well-being of users.