But Lipps said Fargo police did not pay for her trip home, leaving her stranded. Local defense attorneys helped cover a hotel room and food on Christmas Eve and Christmas Day, and a local non-profit, the F5 Project, was able to help her return to Tennessee, InForum reported.
Lipps is now back home but says the experience has had lasting consequences. While jailed and unable to pay bills, Lipps lost her home, her car and her dog, she said. She also told WDAY News no one from the Fargo police department had apologized.
This is far from the first case of an AI error implicating the wrong person. In October, an AI system apparently mistook a Baltimore high school student’s bag of Doritos for a firearm and called local police to tell them the pupil was armed. Taki Allen was sitting with friends outside Kenwood high school in Baltimore when police officers with guns approached him, made him get on his knees, and handcuffed and searched him – finding nothing.
Earlier this year, police arrested a man in the UK for a burglary in a city he had never visited after face-scanning software confused him with another person of south Asian heritage. Authorities had used automated facial recognition software which matched him with footage of a suspect in a £3,000 burglary 100 miles away.
Daaaaaaaaaaaaaaaamn. Lemme add this specific occurrence to my list of « why ethical application of AI is in our law »… Between GDPR and the AI Act I have to say that the USA has provided me most of my examples of nasty shit happening due to lack of protective legislation.
I hope she sues tf out of ND.
Fuck suing, I’m going to prison for murder. I can live without the stuff, but if someone touches my dog they must not like living.