Chicago won’t renew its ShotSpotter contract and plans to stop using the controversial gunshot detection system later this year, Mayor Brandon Johnson’s office announced Tuesday.

The system, which relies on an artificial intelligence algorithm and a network of microphones to identify gunshots, has been criticized for inaccuracy, racial bias and law enforcement misuse. An Associated Press investigation of the technology detailed how police and prosecutors used ShotSpotter data as evidence in charging a Chicago grandfather with murder before a judge dismissed the case due to insufficient evidence.

Chicago’s contract with SoundThinking, a public safety technology company that says its ShotSpotter tool is used in roughly 150 cities, expires Friday. The city plans to wind down use of ShotSpotter technology by late September, according to city officials. Since 2018, the city has spent $49 million on ShotSpotter.

  • DarkNightoftheSoul@mander.xyz

    That’s strange. I would assume this would be a problem unusually well-suited to machine learning techniques. Law enforcement misuse and racial bias I can see, but inaccuracy? It’s a triangulation problem mostly.
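To make the "triangulation problem" intuition concrete, here is a toy sketch (not ShotSpotter's actual method; the mic layout, grid extent, and step size are all assumptions for the example) of locating an impulse from time differences of arrival at several microphones:

```python
# Toy 2-D multilateration sketch: given the arrival times of one impulse at
# several microphones, recover the source from time differences of arrival
# (TDOA). Mic layout, grid extent, and step size are illustrative assumptions.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in dry air at ~20 °C

def simulate_arrivals(source, mics):
    """Arrival time at each mic for an impulse emitted at t = 0."""
    return np.linalg.norm(mics - source, axis=1) / SPEED_OF_SOUND

def locate(mics, arrivals, half_width=250.0, step=5.0):
    """Brute-force TDOA fit: pick the grid point whose predicted time
    differences best match the observed ones."""
    best, best_err = None, np.inf
    for x in np.arange(-half_width, half_width, step):
        for y in np.arange(-half_width, half_width, step):
            cand = np.array([x, y])
            pred = np.linalg.norm(mics - cand, axis=1) / SPEED_OF_SOUND
            # Compare *differences* so the unknown emission time cancels.
            err = np.sum(((pred - pred[0]) - (arrivals - arrivals[0])) ** 2)
            if err < best_err:
                best, best_err = cand, err
    return best

mics = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
true_source = np.array([120.0, -80.0])
estimate = locate(mics, simulate_arrivals(true_source, mics))
print(estimate)  # within one grid step of (120, -80)
```

With clean, single-path arrivals the fit is exact; the replies below explain why real city audio is nothing like this clean.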

    • Sir_Kevin@lemmy.dbzer0.com

      The system probably works great in military situations, which I believe is what it was designed for. In a dense city where sound can echo multiple times off various buildings and other structures? It probably gets things wrong quite often. Add in trigger-happy cops that don’t know how to interpret the data and you have a recipe for disaster.
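As a back-of-the-envelope illustration of the echo problem (the path lengths below are invented for the example): if a microphone hears a reflection off a building instead of the direct sound, the extra path length translates directly into ranging error.

```python
# Back-of-the-envelope multipath arithmetic (all distances invented for the
# example): a reflection that travels 60 m farther than the direct path
# arrives late, and that lateness is indistinguishable from extra range.
SPEED_OF_SOUND = 343.0  # m/s

direct_path = 150.0                  # m, mic to source (assumed)
reflected_path = direct_path + 60.0  # m, bounce off a building (assumed)

timing_error = (reflected_path - direct_path) / SPEED_OF_SOUND  # seconds
ranging_error = timing_error * SPEED_OF_SOUND                   # back to meters
print(round(timing_error * 1000, 1), "ms late ->",
      round(ranging_error, 1), "m of range error")
```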

    • xor@infosec.pub

      not if you consider fireworks, car backfires, echoes and weird geometries, and the fact that supersonic bullets have a sonic boom that travels with them…

      that and the ai was probably only trained in black neighborhoods so it thinks loud bass or black accents are required to be a positive? i dunno

      • DarkNightoftheSoul@mander.xyz

        All of those different noises have distinct soundwave profiles, and different geometries can be accounted for either in software or with strategic placement of mics. I’m convinced this would be a good ML project, if we could find a way of enforcing without police bias, which, good luck.
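A minimal sketch of the "distinct soundwave profiles" claim (a toy feature invented for illustration, not anything a real product is known to use): a broadband impulse and a pure tone separate cleanly on spectral flatness.

```python
# Toy illustration that impulsive and tonal sounds are distinguishable in
# principle: spectral flatness (geometric mean / arithmetic mean of the power
# spectrum) is high for noise-like impulses and near zero for pure tones.
# A deployable classifier would need far more than one feature.
import numpy as np

RATE = 8000  # Hz, assumed sample rate

def spectral_flatness(signal):
    """Near 1 for noise-like signals, near 0 for a single tone."""
    power = np.abs(np.fft.rfft(signal)) ** 2 + 1e-12
    return np.exp(np.mean(np.log(power))) / np.mean(power)

t = np.arange(RATE) / RATE
# Gunshot-like: a decaying burst of broadband noise (seeded for repeatability).
impulse = np.exp(-40.0 * t) * np.random.default_rng(0).standard_normal(RATE)
# Siren-like: a steady 440 Hz tone.
tone = np.sin(2 * np.pi * 440.0 * t)

print(spectral_flatness(impulse) > spectral_flatness(tone))  # True
```

The hard part, as the replies note, is not separating a tone from a burst but separating one broadband burst (a gunshot) from others (fireworks, backfires) under echoes and noise.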

        • xor@infosec.pub

          i don’t think so… each neighborhood is shaped differently and will have an effect on the sound profiles…
          maybe if you set it up and calibrated it by shooting guns all around the city (:

        • phoneymouse@lemmy.world

          they have this in DC… coincidentally, the day with the most “gunshots” is also the 4th of July, when hordes of people are openly setting off fireworks of all kinds in the street.

          • DarkNightoftheSoul@mander.xyz

            I guess I should have said “in principle it should be possible to distinguish these sounds,” because people here are making valid observations about the notorious failures of this particular product.

    • ShittyBeatlesFCPres@lemmy.world

      I wouldn’t assume a company like ShotSpotter uses modern machine learning techniques. It’s got a pitiful accuracy rate and the company was founded 28 years ago. They seem more like a company that hires people with connections rather than a company that hires AI experts and buys Nvidia H100 GPUs by the gross.

    • thantik@lemmy.world

      How TF do you racially bias a gunshot sound? That doesn’t make any fucking sense…

      • DarkNightoftheSoul@mander.xyz

        I was imagining selective enforcement/response, or restricting use to areas where one race predominates, or behaving differently in response to mic reports in white neighborhoods than black. Standard cop shit.

        edit: yeah pretty much

        • thantik@lemmy.world

          Ah okay, so the article is just poorly written. It doesn’t seem like the system itself is racially biased so much as the cops using it.

            • thantik@lemmy.world

              We actually have a local paper that suffered from what I believe is happening here. Cops regularly give newspapers and other outlets deeper access. Our local paper wrote something SLIGHTLY negative about one of the cops and immediately had all their ‘press’ credentials revoked. They stepped back in line shortly after and issued an apology to the police department…

              I’d be willing to bet that’s what’s happening here: they’re tiptoeing around saying the racial bias is the cops’ by keeping it ambiguous…

      • Kid_Thunder@kbin.social

        Well, they are placed mostly in minority communities, but beyond that, a ShotSpotter tech admitted in court that they often change the analysts’ interpretations of what is and isn’t a gunshot at the request of their police department customers. Keep in mind, those interpretations have been successfully used in court as evidence of a crime.

        There is also overwhelming data showing that the majority of their alerts lead to no arrests. The Chicago IG believes this demonstrates false positives, whereas ShotSpotter (which changed its name to SoundThinking after some criticism) says that people can fire a gun and leave no evidence, in spite of police investigating and asking people who would have witnessed it (and there being no victim).

        Furthermore, the way it works is that AI ‘assists’ a human, who then determines whether it is a gunshot and attempts to triangulate the position it came from. In the trial I mentioned earlier, ShotSpotter determined that there was no gunshot but then changed the analysis at LEO request; the sound was in actuality proven to be a helicopter…

        Furthermore, ShotSpotter keeps the details of its methodologies and models a secret and has refused an independent audit from IPVM.

        So with all of that, one could easily argue that ShotSpotter/SoundThinking is as biased as a police officer and that the evidence is purely subjective and non-transparent.

      • Ejh3k@lemmy.world

        By putting them only in predominantly minority neighborhoods. And I do mean only.

      • Death_Equity@lemmy.world

        The bias comes in because they installed them in areas with more gunshots, which happen to be areas with more minorities.

        By not spending excessively on detectors uniformly distributed across the metropolitan area, they are targeting minorities.