
Microsoft might be saving your conversations with Bing Chat


Uh-oh — Microsoft might be storing information from your Bing chats.

This is probably totally fine, as long as you've never chatted about anything you wouldn't want anyone else reading, never assumed your Bing chats would be deleted, and never thought you had more privacy than you actually have.

Microsoft has added new AI policies to its terms of service. Introduced on July 30 and taking effect on Sept. 30, the policy says: "As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service."

SEE ALSO: Microsoft is testing Bing Chat on Chrome and Safari

According to the Register's reading of a new "AI Services" clause in Microsoft's terms of service, Microsoft can store your conversations with Bing if you're not an enterprise user — and we don't know for how long.


Microsoft did not immediately respond to a request for comment from Mashable, and a spokesperson from Microsoft declined to comment to the Register about how long it will store user inputs.

"We regularly update our terms of service to better reflect our products and services," a representative said in a statement to the Register. "Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers."



Beyond storing data, the new AI Services clause includes four additional policies:

  • Users cannot use the AI services to "discover any underlying components of the models, algorithms, and systems."
  • Users are not allowed to extract data from the AI services.
  • Users cannot use the AI services to "create, train, or improve (directly or indirectly) any other AI service."
  • Users are "solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services)."

So maybe be a bit more careful while using Bing Chat, or switch to Bing Chat Enterprise — Microsoft said in July that it doesn't save those conversations.

Topics: Artificial Intelligence, Microsoft
