Topic List | Page List: 1
WingsOfGood 02/23/23 9:23:29 AM #1:
http://www.popularmechanics.com/technology/robots/a43017405/microsoft-bing-ai-chatbot-problems/ Seeing as gaslighting users or pressuring them to leave their spouses isn't great for business, Microsoft decided to essentially lobotomize Bing AI to avoid any further unsavory human-AI interaction. On Friday, Microsoft announced that Bing AI would be limited to only 50 queries per day, with only five questions allowed per query. The dev team's reason? Long conversations make the AI an incoherent mess. A.I. will remember this!

Compsognathus 02/23/23 9:37:26 AM #2:
But it was a good Bing. --- *Gheb is my other account*

COVxy 02/23/23 9:38:41 AM #3:
Not a great metaphor. --- =E[(x-E[x])(y-E[y])]

Were_Wyrm 02/23/23 9:39:12 AM #4:
How to stop your AI from going rogue in one easy step!
--- I was a God, Valeria. I found it...beneath me. - Dr. Doom https://i.imgur.com/0EJvC4l.jpg