Monday, October 23, 2023 – KFF Health News

by Fulton Watch News Feed

Your ‘Friendly’ AI Chatbot May Give You Racist Health Guidance

Some high-profile artificial intelligence chatbots perpetuate false or debunked medical information about Black people, a new study has found. The finding underscores the risks of training the new technology on low-quality data, even as other reports show how much promise AI holds in some health care settings.

Axios:
Study: Some AI Chatbots Provide Racist Health Info

Some of the most high-profile artificial intelligence chatbots churned out responses that perpetuated false or debunked medical information about Black people, a new study found. As AI takes off, chatbots are already being incorporated into medicine — with little to no oversight. These new tech tools, if fueled by false or inaccurate data, have the potential to worsen health disparities, experts have warned. (Goldman, 10/23)

Stat:
STAT Summit: Doctor V. ChatGPT Showed AI’s Promise, Blind Spots

Generative AI tools are already helping doctors transcribe visits and summarize patient records. The technology behind ChatGPT, trained on vast amounts of data from the internet, made headlines when it correctly answered more than 80% of board exam questions. In July, a team at Beth Israel saw promising results when using GPT-4 during a diagnosis workshop for medical residents. (Lawrence, 10/20)

Stat:
FDA Gives Detailed Accounting Of AI-Enabled Medical Devices

The Food and Drug Administration on Thursday released a new accounting of artificial intelligence tools cleared for use in health care, adding scores of new products designed to reshape care in several areas of medicine. (Ross and Palmer, 10/20)

In other health care industry developments —

Modern Healthcare:
Providence St. Joseph Workers In Burbank, California, To…

