ChatGPT creator seeking to eliminate chatbot hallucinations

Despite all of the excitement around ChatGPT and similar AI-powered chatbots, the text-based tools still have some serious issues that need to be resolved.

Among them is their tendency to make things up and present them as fact when they don't know the answer to a query, a phenomenon that has come to be known as "hallucinating." As you can imagine, presenting falsehoods as fact to someone using one of the new wave of powerful chatbots could have serious consequences.

Such trouble was highlighted in a recent incident in which an experienced New York City lawyer cited cases —…



