OpenAI’s Whisper audio transcriber makes up hospital treatments and medications
article updated - apparently Whisper uses a transformer model and is closely related to GPT-2. thus, hallucinations.
wow we get changelogs now, fancy
bonus content for the discerning sneer
I will never get tired of that saltman pic.
TIRED: AI generated images
WIRED: obvious 5-minute edits that don't give a shit
Computers hallucinating medicine into your diagnosis! Tonight on Sick Sad World!
Dr. Ambrosio Romero
Board Certified Family Medicine, Geriatrics, Home Care
Detailed, accurate, HIPPA compliant visit or phone documentation in under a minute! How could we ever have done our jobs without this? Helps the smooth flow of information in the office. The staff has the documentation they need right away!
Jul 2, 2024
Jesus. H. Christ.
They have successfully convinced me they are HIPPA compliant, yet simultaneously they've convinced me they are not HIPAA compliant.
@khalid_salad @dgerard did he go to Hollywood Upstairs Medical College too?
that URL is one character away from disaster
I can see a 4-count Levenshtein distance (o, l, s, and a position switch), what's the 1 character one?
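The edit-distance nitpick above can be checked mechanically. The thread never shows the actual URL, so the strings "hippa" and "hipaa" in this sketch are an assumption for illustration; the point is just that a single substitution separates the misspelling from the real acronym:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance: the minimum number of
    single-character insertions, deletions, or substitutions turning a into b."""
    prev = list(range(len(b) + 1))  # distances from a[:0] to every prefix of b
    for i, ca in enumerate(a, 1):
        cur = [i]  # distance from a[:i] to the empty string
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # delete ca
                cur[j - 1] + 1,            # insert cb
                prev[j - 1] + (ca != cb),  # substitute (free if equal)
            ))
        prev = cur
    return prev[-1]

# "hippa" -> "hipaa" is one substitution (the second p -> a)
print(levenshtein("hippa", "hipaa"))  # 1
```

So "one character away from disaster" checks out for the acronym itself, whatever the full URL is.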
That's what you get when you demand results regardless of whether or not there is a result to share.
Wonder how that might apply to the rest of the economy...
This is a malpractice lawsuit waiting to happen. And probably a product liability lawsuit, if this LLM's hallucinations lead to someone getting hurt.