Fred Bloggs
Guest
This won't be too different from fools who drive their vehicles off the road or into water because GPS told them to...
Reasons why people will be suckered:
AUTHORITY
Joe Vitriol, a political scientist at Lehigh University who studies disinformation, says he expects that people will "accept its output in a biased or self-serving way, like people do with any other source of information." In other words, people will believe a chatbot if it tells them things that comport with their existing beliefs and opinions, just as they do with traditional Google results. Whether the chatbot is telling the truth or hallucinating won't really matter.
[A sure sign of a moron we've seen plenty of.]
THE POWER OF STORY
In emotional areas, a dose of storytelling makes an explanation more believable. In less personal matters, like public policy, people prefer to have facts unadorned by narrative.
PEOPLE ARE LAZY
So the chatbots will lie and get things wrong. My biggest worry is that Google and Bing users will know this and simply won't care. One theory of why disinformation and fake news spreads is that people are downright lazy. They buy whatever a trusted source is selling. If the chatbots get it mostly right most of the time, that's good enough. Until, say, your flight doesn't actually leave at that time from that airport.
Seems they need a "bot checker" backup for what the chatbot says. How hard could that be? Scan for statements of fact, then fact-check each one; a rough sketch of that idea follows below.
https://www.businessinsider.com/ai-chatbot-chatgpt-google-microsofty-lying-search-belief-2023-2
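
As a minimal sketch of that "bot checker" idea, here is what a scan-then-verify loop might look like. Everything in it is an assumption for illustration: the claim detector is a crude heuristic (sentences containing a time, year, percentage, or number), and verify() is a stub standing in for a real trusted source such as an airline's schedule feed.

import re
from typing import Dict, List, Optional

# Crude "does this sentence assert a checkable fact" heuristic:
# anything containing a time, a year, a percentage, or a bare number.
CLAIM_PATTERN = re.compile(r"\b(\d{1,2}:\d{2}|\d{4}|\d+%|\d+)\b")

def extract_claims(text: str) -> List[str]:
    """Split chatbot output into sentences and keep the fact-like ones."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [s for s in sentences if CLAIM_PATTERN.search(s)]

def verify(claim: str, trusted_source: Dict[str, bool]) -> Optional[bool]:
    """Stub verifier: in real life this would query an authoritative
    database (flight schedules, etc.). None means 'could not check'."""
    return trusted_source.get(claim)

if __name__ == "__main__":
    answer = "Flight UA102 departs at 18:45 from gate B7. Have a nice trip!"
    # Pretend the airline's own data contradicts the chatbot.
    trusted = {"Flight UA102 departs at 18:45 from gate B7.": False}
    labels = {True: "confirmed", False: "CONTRADICTED", None: "unverifiable"}
    for claim in extract_claims(answer):
        print(f"{labels[verify(claim, trusted)]}: {claim}")

The scan half is the easy part; the catch is the verify() step, which needs a live, authoritative source of truth for every domain the chatbot talks about, and that is where most of the actual work would be.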