Employees at Samsung Semiconductor have sent sensitive data to the AI chatbot ChatGPT to make their everyday work easier, apparently without realizing that this posed a security risk to the company.

Samsung employees send sensitive data to ChatGPT

Shortly after Samsung Semiconductor allowed its employees to use ChatGPT for work, confidential data began ending up with the AI chatbot. According to the Korean edition of The Economist, there were three such incidents within just 20 days.

How did the data leak happen?

Employees sent software code and other internal company data to the chatbot to find and fix bugs. That is problematic because the data is transmitted to OpenAI's servers and thus ends up outside Samsung's network.

The dangers for Samsung’s semiconductor business

The chatbot, in turn, learns from conversations with users, so Samsung's code or other data could unintentionally show up in responses to other ChatGPT users. In the first incident, a Samsung engineer sent the code of a proprietary Samsung program to ChatGPT to fix bugs. The program was top secret and not intended to be released to the public.

In the second case, an employee typed test patterns into ChatGPT's chat box and asked the bot to optimize them. Such test sequences are used in the semiconductor industry to check chips and speed up testing, and they are worth a lot of money.

In the third case, an employee used the Clova assistant from Korean software giant Naver to turn a meeting into a document. He then sent the text to ChatGPT to create a presentation from it.

Chatbots as helpful work tools – but with limitations

All of these examples show that an AI chatbot can genuinely help employees in their everyday work. However, it should not be fed sensitive data, at least not if the bot does not come from the company itself.

Samsung’s reaction to the data leak and outlook for the future

According to The Economist, Samsung is considering developing its own AI chatbot for its employees. Until then, the company is limiting the length of the prompts employees can send to ChatGPT to 1,024 bytes. If further incidents occur, Samsung could also block access to the chatbot completely again.
