Deepfake scammer walks off with $25 million in first-of-its-kind AI heist
Hong Kong firm tricked by simulation of multiple real people in video chat, including voices.

  • theskyisfalling@lemmy.dbzer0.com · 9 months ago

    What kind of company lets a single employee transfer that amount of money without multiple password entries or checks from different people, seriously?

    It doesn’t matter that they had a conference call with what appeared to be certain people, as the article says; they could easily have used key-pair verification such as PGP. It sounds like poor security all around, especially considering the amounts involved.
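    A minimal sketch of the key-pair idea above. Real PGP needs external tooling, so this stand-in uses a pre-shared secret and a challenge-response check built on Python's standard-library hmac module; all names here are hypothetical and not from the article.

```python
import hashlib
import hmac
import secrets

def issue_challenge() -> bytes:
    """Verifier generates a fresh random nonce for each transfer request."""
    return secrets.token_bytes(32)

def respond(shared_secret: bytes, challenge: bytes) -> bytes:
    """The claimed executive proves possession of the secret by MACing the nonce."""
    return hmac.new(shared_secret, challenge, hashlib.sha256).digest()

def verify(shared_secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Constant-time comparison of the expected and received responses."""
    expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# A deepfake can clone a face and a voice, but not a secret it never held.
secret = secrets.token_bytes(32)
nonce = issue_challenge()
assert verify(secret, nonce, respond(secret, nonce))            # real CFO passes
assert not verify(secret, nonce, respond(b"impostor", nonce))   # deepfake fails
```

    The same shape works with actual asymmetric keys (the PGP case): the challenge is signed with a private key and checked against the known public key, so no shared secret ever has to be distributed.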

    • WhatAmLemmy@lemmy.world · edited · 9 months ago

      PGP? Have you ever dealt with any banking or financial corporations? You’d have better luck getting the money handlers and decision makers to authenticate transactions with magic.

      Hong Kong and Japan are the absolute worst I’ve experienced. Their online banking UI’s and processes are stuck in the late 90’s to early 2000’s.

        • Jojo@lemm.ee · 9 months ago

          It’s stylistically acceptable to put an apostrophe for plurals in cases where the plural thing isn’t a “normal” word, as is the case for initialisms like UI or numbers like the latter two you caught.

          Obviously a given body may make its own rules in this regard, but luckily English has no overall authority, and this is informal communication outside the domain of any minor ones (beyond, perhaps, idle pedants and prescriptivists).

    • Lmaydev@programming.dev · edited · 9 months ago

      Somewhere I worked, the CEO’s email got hacked, and the attackers asked the head of finance to change the bank account details for a 100k payment that was due to go out.

      Luckily the head of finance thought to double-check with the CEO directly, but it came really close to happening.

      This all happened via a phishing email.

      Social engineering is how most hacks happen. Doesn’t matter what protection you put in place. People are always the weakest link.

  • Sunforged@lemmy.ml · 9 months ago

    This is incredible. And scary. And incredible. I would hate to be the poor sap that fell for it though, oof.

  • BarqsHasBite@lemmy.world · 9 months ago

    It used to be so easy to spot scams and fakes; this stuff is getting scary now. I wonder if this will slow things down as we start requiring face-to-face, in-person confirmation.

  • PeroBasta@lemmy.world · 9 months ago

    I’d like to hear the whole story, like how old the scammed guy was, etc.

    To me it smells like he was either an accomplice or a very old person buried in his company’s bullshit.

  • AutoTL;DR@lemmings.world (bot) · 9 months ago

    This is the best summary I could come up with:


    Deepfakes utilize AI tools to create highly convincing fake videos or audio recordings, posing significant challenges for individuals and organizations to discern real from fabricated content.

    This incident marks the first of its kind in Hong Kong involving a large sum and the use of deepfake technology to simulate a multi-person video conference where all participants (except the victim) were fabricated images of real individuals.

    Despite initial doubts, the employee was convinced enough by the presence of the CFO and others in a group video call to make 15 transfers totaling HK$200 million to five different Hong Kong bank accounts.

    The high-tech theft underscores the growing concern over new uses of AI technology, which has been spotlighted recently due to incidents like the spread of fake explicit images of pop superstar Taylor Swift.

    Over the past year, scammers have been using audio deepfake technology to scam people out of money by impersonating loved ones in trouble.

    The police have offered tips for verifying the authenticity of individuals in video calls, such as asking them to move their heads or answer questions that confirm their identity, especially when money transfer requests are involved.


    The original article contains 519 words, the summary contains 192 words. Saved 63%. I’m a bot and I’m open source!