I know where Bing AI chat went wrong

Ask me anything. It's the long form of an AMA and one of the most popular forms of interactive discourse on Reddit. It's also a significant challenge, as Microsoft's Bing AI chatbot, a.k.a. "new Bing," is quickly learning.

Anytime a celebrity or notable person signs up to do a Reddit AMA, usually shortly after posing with a photo to prove it's really them answering questions, there's a deep moment of trepidation.

The ability to ask anyone anything is usually a minefield of inappropriate discourse, one that's managed by a live community manager who fields and filters the questions. Otherwise, things quickly go off the rails. Even without that protection, they often do anyway.

Bing

(Image credit: Future)

When Microsoft launched its new Bing AI-powered chat, it made it clear that the ChatGPT AI was ready for any and all questions. This was either a sign of deep trust in the relatively small but growing group of users or incredible naivete.



