Image: Savvapanf Photo / Shutterstock.com
Burger King in the U.S. is currently conducting a trial that is turning heads. In about 500 locations, the chain is testing an AI system designed not only to assist with day-to-day work but also to rate employees' friendliness. The assistant, named “Patty,” is part of the “BK Assistant” and is based on a model from OpenAI.
When the voice from the headset is included in the evaluation
Employees and managers wear new headsets. Through them, Patty provides updates on daily operations, shares recipes, alerts them when vending machines are empty, or reminds them of cleaning tasks. But that’s not all. The system also generates a “friendliness score” for the team.
The evaluation follows a simple pattern: employees who use words like “please,” “thank you,” or “welcome” score higher; if these terms are missing, the score drops. From the company's perspective, this helps improve service. For critics, it is above all one thing: a very crude yardstick for human behavior.
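To see why critics call the approach crude, the keyword pattern described above can be sketched as a toy score. This is purely illustrative and not Burger King's actual system; the phrase list, the per-100-words weighting, and the function name are all invented for this example.

```python
# Illustrative sketch of a naive keyword-based "friendliness score".
# NOT Burger King's actual system: phrase list and weighting are invented.
POLITE_PHRASES = ("please", "thank you", "welcome")

def friendliness_score(transcript: str) -> float:
    """Count polite phrases per 100 words -- a deliberately crude metric."""
    text = transcript.lower()
    hits = sum(text.count(phrase) for phrase in POLITE_PHRASES)
    words = max(len(text.split()), 1)  # avoid division by zero
    return round(100 * hits / words, 1)

print(friendliness_score("Welcome! Your order, please. Thank you!"))  # 50.0
print(friendliness_score("Where is my order?"))                       # 0.0
```

The sketch also makes the criticism concrete: an annoyed, sarcastic “well, thank you so much” scores just as high as a genuinely warm one, because tone, context, and body language never enter the calculation.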
Kindness is more than just a few kind words
Elisabeth André, a professor at the University of Augsburg, considers this approach problematic. She says, “You can’t reduce kindness to just saying ‘please’ and ‘thank you’.”
That is precisely the crux of the matter. Whether someone comes across as friendly is not determined by individual words alone; tone of voice, eye contact, patience, and the specific situation all play a crucial role. Someone who speaks hurriedly and seems annoyed, yet still says “thank you,” is hardly friendly.
In Germany, that would be hard to imagine
Burger King emphasizes that the system is meant to support employees, not to monitor them. Still, this distinction is unlikely to convince many people: anyone who is constantly listened to and scored will ultimately feel monitored.
In Germany, such a model would be hard to imagine anyway. Audio and video recordings in the workplace are subject to strict legal restrictions here. On top of that, there is the issue of co-determination by the works council. So the big AI eavesdropper in everyday fast-food life would have a much harder time here.
What we say about this
The real issue lies elsewhere: companies love to talk about modern support, but as soon as software tries to reduce human warmth to a set of metrics, help quickly turns into pressure. Technology can improve processes. Whether it should also sit in judgment over decency and friendliness is an entirely different question.
Sources: sueddeutsche.de