They did a trial in Sweden where the LLM told a patient to take an ibuprofen and a chill pill. The patient had trouble breathing, pressure over the chest, and some other symptoms I can't remember.
A nurse overseeing the convo stepped in and told the patient to immediately call the equivalent of 911.
Reminds me of an AI that was programmed to play Tetris and survive for as long as possible. So the machine simply paused the game. Except in this case, it might decide the easiest way to end your suffering is to kill you, so slightly different stakes.
Patient: AIbot3000, will drinking bleach make my pain go away?
AIbot3000: Yes, bleach is a powerful disinfectant, and patients who drink bleach have been shown to experience less pain after it has disinfected their system.
My favorite was a rudimentary military scenario where they asked the AI to destroy a target, so it just bombed it. Then the operator said it couldn't bomb because there were civilians. So it opted to kill the operator who imposed the limitation and then bombed the target anyway.
Nvidia has never seen a nurse and has no idea what they do
Can't wait for the wave of lawsuits after the AI hallucinates lethal advice and then insists it's right.