If you were asked a question but didn’t know the answer, instead of admitting it, you might take a shot at answering it as convincingly as you can. If no one knew any better, they might accept your reply as correct. 

As it turns out, artificial intelligence (AI) chatbots and platforms are somewhat human in that regard. They, too, may “make it up” when they don’t have all the correct information to reply to a query. It’s what’s referred to as an AI hallucination.

According to Wikipedia, a hallucination in AI is a confident response by an AI platform that does not seem justified by its training data. In other words, these AI tools are prone to fabricating and providing incorrect information. 

It’s a significant concern for users of these platforms and for the AI industry. OpenAI, creator of the ChatGPT chatbot and the text-to-image generator DALL-E 2, is working on training its generative AI tools to take a more human-like approach to “thinking” to mitigate the problem.

Understanding AI Hallucinations and Ways to Mitigate Them

The risk for small business owners or self-employed professionals using AI tools is that when those tools hallucinate, you could publish incorrect information that harms your reputation or, worse, inadvertently causes financial or physical harm to your clients, leaving you open to being sued for damages.

How Can You Tell If an AI System Is Hallucinating?

Remember that AI tools don’t know or understand things like humans do (not presently, anyway). 

If an AI tool doesn’t know the answer to a question or query you input, it reaches for the closest information it can find and presents it as fact. Here are a couple of ways to try to identify when the AI tool you’re using may be hallucinating:

Look for grammatical errors

Platforms like ChatGPT may make grammatical errors, and when they do, it may be a sign the tool is hallucinating.

When AI-generated content appears flawed

When an AI bot spits out something that seems unbelievable or that you suspect is incorrect, it probably is. For example, when Google demonstrated its new chatbot, Bard, it was asked about discoveries made by the James Webb Space Telescope. Bard incorrectly stated the telescope took the first photos of a planet outside our solar system.

Ultimately, the onus is on you to exercise caution when using any AI platform and to fact-check the information it provides before using it.

8 Ways to Mitigate AI Hallucinations

Here are eight ways to try to mitigate AI hallucinations:

  1. Tell the AI platform to admit when it doesn’t know the answer to something you ask.
  2. Provide the system with relevant, high-quality information and sources that support your query, and give it precise directions. The better the information you provide, the better the outcome will be (see the sketch following this list for how this can look in practice).
  3. Break down the tasks you ask it to perform into smaller prompts instead of one large query.
  4. Give the AI tool context so it understands what you’re after.
  5. Pick the right platform for what you’re trying to accomplish.
  6. Verify everything an AI platform gives you. When in doubt, toss it out.
  7. Give the tool you’re using feedback. For instance, on ChatGPT, you can give a thumbs up or down to the outputs it provides. That can help it improve.
  8. Report problems you encounter with an AI tool to its developers.
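
These tips apply whether you type prompts into a chatbot’s web interface or call a model from your own software. For readers who do the latter, here is a minimal Python sketch of tips 1, 2, and 4 using OpenAI’s chat API; the model name, prompts, and source text are illustrative assumptions, not recommendations from this article.

```python
# A minimal sketch of tips 1, 2 and 4 for developers calling a model through an API.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY set in the
# environment; the model name and prompt text below are placeholders, not prescriptions.
from openai import OpenAI

client = OpenAI()

# Tip 1: tell the model to admit when it doesn't know the answer.
system_prompt = (
    "You are a careful assistant for a small business. "
    "If you are not certain of an answer, say 'I don't know' instead of guessing, "
    "and rely only on the source material provided."
)

# Tips 2 and 4: supply relevant source material and the context for the question.
source_material = (
    "Commercial general liability insurance covers third-party claims of "
    "bodily injury and property damage arising from your business operations."
)
question = "Does my general liability policy cover a client who slips in my shop?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use whichever model you have access to
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": f"Context:\n{source_material}\n\nQuestion: {question}"},
    ],
)

# Tip 6 still applies: verify the answer before acting on or publishing it.
print(response.choices[0].message.content)
```

None of this removes the need for the other tips; in particular, verifying the output (tip 6) remains your responsibility.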

Can Business Insurance Cover You for Damages Because of AI Hallucinations?

A business insurance policy is not designed to cover damage or loss you suffer from publishing or promoting AI hallucinations. However, a customized policy that includes intellectual property and professional liability insurance may offer protection if you face allegations of copyright infringement or are threatened with legal action over advice or services you provide.

Fill out our online application for a free quote. If you have questions about protecting your small business, talk to one of our licensed brokers.

In addition to helping you get low-cost insurance and customize it to suit your needs, they can advise you on the types of coverage to include and the coverage limits you ought to have, and help you make informed decisions about protecting your business.

— Reviewed by Michael McDermott, Director of Underwriting, Zensurance.

About the Author: Liam Lahey

Liam is the Content Marketing Manager at Zensurance. A writer and editor for more than 20 years, he has been published in several newspapers and magazines, including Yahoo! Canada Finance, Metroland Media, IT World Canada and others.