International Business Times UK
Technology
Vinay Patel

Microsoft Enhances Bing ChatGPT Privacy: Limits User Data in Tightened Policy

Bing's ChatGPT gets data privacy overhaul to fight misuse. (Credit: Pixabay)

Microsoft recently updated its data handling policy for interactions with Bing's ChatGPT feature (effective September 30, 2023). The changes aim to restrict misuse, but they have also sparked concerns about user privacy and data security.

Microsoft's policy updates, including one dated July 30, 2023, introduced stricter measures to prevent misuse of its Bing AI features, among them the Notebook feature, which accepts inputs of up to 16,000 characters, such as research documents.

The recently introduced policy update bars users from prompting Bing AI to reveal details about the underlying models, algorithms and systems that power it. The policy also strictly forbids any attempt to collect data from Bing's ChatGPT, especially for the purpose of creating, training or improving other AI models.

The update comes in response to previous incidents in which some users, including security expert Kevin Liu, tricked the chatbot into revealing sensitive information or performing unauthorised actions. Microsoft's revised terms of service reflect the company's commitment to protecting the technology behind Bing's ChatGPT integration from potential exploitation.

User privacy and potential implications

Notably, the new guidelines suggest Microsoft may now collect and store both user inputs and the system's responses in a bid to monitor and prevent misuse. However, some details, such as how long user data will be retained, remain unclear.

Data from most accounts could reportedly be retained for up to 30 days under normal circumstances, and for up to 90 days for accounts involved in criminal investigations or emergency situations.

The revamped data collection policy has raised questions among users about whether Microsoft can strike the right balance between improved security and privacy.

Special treatment for enterprise users

Interestingly, enterprise users of Bing's ChatGPT are exempt from the standard data retention policy and will not be subject to the same data retention practices. The exemption is also a strong signal that the Redmond-based tech giant's strategy involves customising its AI services for distinct user groups.

As part of this approach, the company aims to provide businesses with a more secure and privacy-focused environment for using chatbot technology. Its tiered data storage policy reflects a multifaceted strategy to cater to the varying needs of Bing's diverse user base.

The revised policies also represent a notable change in Microsoft's approach to data privacy and security for interactions with AI tools like ChatGPT.

Microsoft's policy update not only strengthens the controls around Bing's ChatGPT but also reflects the company's attempt to navigate the delicate balance between security and user privacy. It is worth noting that the software giant rebranded Bing Chat and Bing Chat for Enterprise as Copilot last year.
