Is ChatGPT a Danger to Businesses? The Samsung Case Reopens the Debate
robort - 2023-06-03 13:42:54
Samsung's experience with ChatGPT reveals how risky generative AI can be. Samsung Semiconductor employees reportedly used the AI while sharing internal data that could reveal critical information about production processes and other company secrets.
While ChatGPT simplifies and optimizes tasks in many areas, it should be emphasized that the data it receives is collected and stored on external servers.
This means that any information entered into the chatbot is saved and cannot be retrieved or deleted afterwards.
Specifically, Samsung reportedly recorded three data leaks in just 20 days, each of which could have disastrous consequences for the company's business.
For these reasons, the company plans to adopt security measures while it develops its own ChatGPT-like service for internal use only.
In the first case, an employee reportedly entered the entire source code of a top-secret application so that the AI could correct its errors. As a result, the entire software was shared with an external company with no possibility of retrieval.
The second leak concerns the process Samsung uses to manufacture semiconductors. An employee reportedly shared with the chatbot the test patterns the company uses to identify defective chips and asked it to optimize them.
The goal was to speed up the testing and verification procedures of malfunctioning chips with a positive impact on production costs.
In the third case, an employee reportedly used Naver Clova to transcribe an entire meeting recording, then submitted the transcript to ChatGPT to prepare a presentation. In short, a great deal of internal data was shared with the chatbot in just 20 days, exposing the company to enormous risks to its competitiveness.
The biggest problem concerns the model's learning capabilities: information submitted to the chatbot could surface in responses to other companies, without even OpenAI being able to intervene.
In this regard, Samsung has reportedly limited the size of prompts entered into the chatbot to 1,024 bytes. The Korean company is also said to be considering blocking access to ChatGPT entirely from internal terminals.
Meanwhile, a Samsung spokesperson asked about the matter declined to comment, neither confirming nor denying the data-leak reports.