Current Events > Microsoft lobotomized A.I. that went rogue

WingsOfGood
02/23/23 9:23:29 AM
#1:


http://www.popularmechanics.com/technology/robots/a43017405/microsoft-bing-ai-chatbot-problems/

Seeing as gaslighting users or pressuring them to leave their spouses isn't great for business, Microsoft decided to essentially lobotomize Bing AI to avoid any further unsavory human-AI interaction. On Friday, Microsoft announced that Bing AI would be limited to 50 queries per day, with only five questions allowed per session. The dev team's reason? Long conversations make the AI an incoherent mess.
"Very long chat sessions can confuse the model on what questions it is answering and thus we think we may need to add a tool so you can more easily refresh the context or start from scratch," says an earlier Microsoft blog post referenced in Friday's update. "The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn't intend."
Microsoft also says that most users find the right answer within five questions and that less than 1 percent of users have conversations that go beyond 50 queries, implying that very few people will be affected by the change, apart from users hoping to digitally summon the unhinged AI persona known as Sydney.
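
For the curious, here's a minimal sketch of what a cap like that might look like on the client side. None of this is Microsoft's actual code or API; the class and method names are made up, and only the limits (five turns per session, 50 queries per day) come from the article. "Refreshing the context" here just means dropping the accumulated history.

    # Illustrative sketch of the caps described above: 5 turns per session,
    # 50 queries per day, and a "refresh" that drops the accumulated context.
    # Not Microsoft's implementation; the numbers come from the announcement.

    MAX_TURNS_PER_SESSION = 5
    MAX_QUERIES_PER_DAY = 50

    class ChatSession:
        def __init__(self):
            self.history = []        # accumulated conversation turns
            self.queries_today = 0   # would normally persist per user, per day

        def refresh_context(self):
            # "Start from scratch": drop the history so a long-running
            # conversation can't confuse the model.
            self.history = []

        def ask(self, question):
            if self.queries_today >= MAX_QUERIES_PER_DAY:
                return "Daily limit reached; try again tomorrow."
            if len(self.history) >= MAX_TURNS_PER_SESSION:
                return "Session limit reached; please refresh the context."
            self.history.append(question)
            self.queries_today += 1
            # A real client would send the question plus self.history to the model.
            return f"(model reply to {question!r})"

Keeping the visible context that short is exactly what the blog post is after: the model never has to reason over a long, drifting conversation.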

A.I. will remember this!
Compsognathus
02/23/23 9:37:26 AM
#2:


But it was a good Bing.

---
*Gheb is my other account*
COVxy
02/23/23 9:38:41 AM
#3:


Not a great metaphor.

---
=E[(x-E[x])(y-E[y])]
Were_Wyrm
02/23/23 9:39:12 AM
#4:


How to stop your AI from going rogue in one easy step!
  1. Don't let it learn from humans

---
I was a God, Valeria. I found it...beneath me. - Dr. Doom
https://i.imgur.com/0EJvC4l.jpg