Friday, September 22, 2023

Bing will store your conversations and there is only one way to avoid it

If you use the new Bing with ChatGPT, be careful with the information you share. A recent change to the Terms of Service states that Microsoft will store your conversations with the chatbot. Although the company says the new policies are designed to ensure appropriate use of artificial intelligence, some consider them too intrusive.

As reported by The Register, the July 30, 2023 update adds sections aimed at stopping people from misusing Bing to extract inside information. Users will not be able to use the chatbot to discover any underlying components of its models, algorithms, and systems. They are also not allowed to extract or collect web data from the service to create, train, or improve other artificial intelligence models.

Microsoft is serious about protecting its chatbot and is determined to punish anyone who breaks the rules. Recall that the early days of Bing with ChatGPT were chaotic for the company, after multiple users put its security to the test. One of them got Bing to reveal its internal code name and the guidelines governing its operation.

Kevin Liu, a security researcher, accessed confidential information through prompt injection, a technique that tricks the chatbot into answering questions it is not authorized to answer. Beyond Liu, other analysts managed to get the chatbot to plan terrorist attacks or go off the rails after a prolonged conversation. In response, Microsoft closed the loophole and updated its guardrails to stop Bing from leaking information.

Microsoft takes aggressive stance to protect Bing

Microsoft Bing with ChatGPT, imagined by Midjourney AI

The artificial intelligence race has led many companies to tighten their stance to protect their technology. Just like Twitter and Reddit, Microsoft does not want third parties to access the data sets, models, or algorithms that power Bing with ChatGPT.

“As part of the provision of AI services, Microsoft will process and store your inputs to the service, as well as the output of the service, in order to monitor and prevent abusive or harmful uses or outputs,” reads the update to the Terms of Service.

Microsoft does not specify how long it will keep your prompts, although judging by comparable services, it could be around 30 days. Meta's services store a copy of your messages for a similar period. In emergency situations, or on accounts tied to a criminal investigation, data is retained for up to 90 days.

The only way to prevent Microsoft from storing your conversations with Bing is to use the enterprise version. The policies for the enterprise edition of the chatbot are different and prevent the technology from keeping a history or training its models on the information you share.

The new policy takes effect on September 30, 2023, so you still have time to delete your chat history.
