JOMusic@lemmy.ml to Technology@lemmy.world · English · 5 months ago

US Bill proposed to jail people who download Deepseek

Senator Hawley Proposes Jail Time for People Who Download DeepSeek (www.404media.co)

According to the language of the proposed bill, people who download AI models from China could face up to 20 years in jail, a million dollar fine, or both.
  • NauticalNoodle@lemmy.ml

    now i gotta download something i don’t even wanna download.

    • 👍Maximum Derek👍@discuss.tchncs.de

      Yup. Downloaded 7b, 32b, and 70b varieties this afternoon. Entirely out of spite.

      • T156@lemmy.world

        Since those smaller models are technically fine-tunes of Meta/Facebook’s LLAMA, using Deepseek’s outputs, I wonder if they would be covered by the bill at all.

    • ObsidianZed@lemmy.world

      I literally just did the same

  • labbbb2@thelemmy.club

    Fascist regime and power/police abuse have started.

    P.S.: It seems like the US is becoming similar to Russia: a kleptocratic country with organised crime in government.

    • TheReturnOfPEB@reddthat.com

      To be fair, for Black Americans that is a centuries-old tune.

      • labbbb2@thelemmy.club

        Oh, you’re right

      • scarabic@lemmy.world

        Don’t worry, their already bad situation will get worse too.

    • scarabic@lemmy.world

      Every step unchallenged is an invitation to do more.

  • metaStatic@kbin.earth

    For Base Model

    git lfs install
    git clone https://huggingface.co/deepseek-ai/DeepSeek-V3-Base

    For Chat Model

    git lfs install
    git clone https://huggingface.co/deepseek-ai/DeepSeek-V3
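
    If git-lfs is a hassle, roughly the same download can be done from Python with the huggingface_hub package. A minimal sketch, assuming huggingface_hub is installed, plenty of disk space, and the same repo IDs as above (the local_dir name is just an example):

    # Sketch: fetch a DeepSeek checkpoint from Hugging Face without git-lfs.
    # Assumes: pip install huggingface_hub; the full weights are hundreds of GB.
    from huggingface_hub import snapshot_download

    # Same repo ID as the git command above; swap in
    # "deepseek-ai/DeepSeek-V3" for the chat model.
    path = snapshot_download(
        repo_id="deepseek-ai/DeepSeek-V3-Base",
        local_dir="DeepSeek-V3-Base",  # example target directory
    )
    print("Downloaded to", path)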

    • theunknownmuncher@lemmy.world

      This is DeepSeek-V3. DeepSeek-R1 is the model that got all the media hype: https://huggingface.co/deepseek-ai/DeepSeek-R1

    • neon_nova@lemmy.dbzer0.com

      Can you elaborate on the differences?

      • cyd@lemmy.world

        Base models are general purpose language models, mainly useful for AI researchers and people who want to build on top of them.

        Instruct or chat models are chatbots. They are made by fine-tuning base models.

        The V3 models linked by OP are Deepseek’s non-reasoning models, similar to Claude or ChatGPT-4o. These are the “normal” chatbots that reply with whatever comes to their mind. Deepseek also has a reasoning model, R1. Such models take time to “think” before supplying their final answer; they tend to give better performance for stuff like math problems, at the cost of being slower to get the answer.

        It should be mentioned that you probably won’t be able to run these models yourself unless you have a data center style rig with 4-5 GPUs. The Deepseek V3 and R1 models are chonky beasts. There are smaller “distilled” forms of R1 that are possible to run locally, though.
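
        For the smaller distilled models mentioned above, here is a minimal sketch of running one locally through the Ollama Python client. Assumptions: Ollama is installed and its server is running, the ollama Python package is installed, and the deepseek-r1:7b tag is available in Ollama’s library (tag names may change).

        # Sketch: chat with a distilled DeepSeek-R1 variant via a local Ollama server.
        import ollama

        ollama.pull("deepseek-r1:7b")  # distilled 7B variant; larger tags exist

        response = ollama.chat(
            model="deepseek-r1:7b",
            messages=[{"role": "user",
                       "content": "Explain base vs. instruct models in one sentence."}],
        )
        print(response["message"]["content"])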

        • DogWater@lemmy.world

          I heard people saying they could run the R1 32B model on moderate gaming hardware, albeit slowly.

          • FrederikNJS@lemm.ee

            32b is still distilled. The full one is 671b.
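
            Back-of-the-envelope weight sizes make that gap concrete. A rough sketch assuming ~4-bit quantization and ignoring context/KV-cache overhead:

            # Rough weight size: parameters * bits_per_weight / 8 bytes.
            def weight_gb(params_billion: float, bits: float = 4.0) -> float:
                return params_billion * 1e9 * bits / 8 / 1e9

            # ~16 GB for the 32B distill (fits a big GPU or system RAM),
            # ~336 GB for the full 671B model (server territory).
            print(f"32B distill @ 4-bit : ~{weight_gb(32):.0f} GB")
            print(f"671B full   @ 4-bit : ~{weight_gb(671):.0f} GB")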

            • DogWater@lemmy.world

              I know, but the fall-off in performance isn’t supposed to be severe.

              • FrederikNJS@lemm.ee

                You are correct. And yes that is kinda the whole point of the distilled models.

                • DogWater@lemmy.world

                  I know. Lmao

          • meliante@lemmy.world

            My Legion Slim 5 14" can run it not too bad.

      • metaStatic@kbin.earth

        https://www.deepseekv3.com/en/download

        I was assuming one was pre-trained and one wasn’t, but I don’t think that’s correct and don’t care enough to investigate further.

        • JOMusic@lemmy.ml (OP)

          Is that website legit? I’ve only ever seen https://www.deepseek.com/

          And I would personally recommend downloading from HuggingFace or Ollama

      • thefartographer@lemm.ee

        r1 is lightweight and optimized for local environments on a home PC. It’s supposed to be pretty good at programming and logic and kinda awkward at conversation.

        v3 is powerful and meant to run on cloud servers. It’s supposed to make for some pretty convincing conversations.

        • Pennomi@lemmy.world

          R1 isn’t really runnable with a home rig. You might be able to run a distilled version of the model though!

          • theunknownmuncher@lemmy.world

            Tell that to my home rig currently running the 671b model…

            • Pennomi@lemmy.world

              That likely is one of the distilled versions I’m talking about. R1 is 720 GB, and wouldn’t even fit into memory on a normal computer. Heck, even the 1.58-bit quant is 131 GB, which is outside the range of a normal desktop PC.

              But I’m sure you know what version you’re running better than I do, so I’m not going to bother guessing.
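
              For what it’s worth, those figures line up with straight parameters-times-bits arithmetic (a rough sketch that ignores mixed-precision layers and file overhead):

              # Sanity check: weight size ≈ params * bits / 8 bytes.
              params = 671e9  # full DeepSeek-R1 parameter count
              for bits in (8.0, 1.58):
                  print(f"{bits:>4} bits -> ~{params * bits / 8 / 1e9:.0f} GB")
              # ~671 GB at 8-bit and ~133 GB at 1.58-bit, close to the
              # 720 GB and 131 GB figures quoted above.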

              • theunknownmuncher@lemmy.world

                It’s not. I can run the 2.51-bit quant.

                • Pennomi@lemmy.world

                  You must have a lot of memory, sounds like a lot of fun!

          • thefartographer@lemm.ee

            You’re absolutely right. I wasn’t trying to get that in-depth, which is why I said “lightweight and optimized” instead of “when using a distilled version”, because that raises more questions than it answers. But I probably overgeneralized by making it a blanket statement like that.

  • theunknownmuncher@lemmy.world

    Hawley’s statement called DeepSeek “a data-harvesting, low-cost AI model that sparked international concern and sent American technology stocks plummeting.”

    data-harvesting

    ???

    It runs offline… using open-source software that provably does not collect or transmit any data…

    It is low-cost and out-competes American technology, though, true

    • pixxelkick@lemmy.world

      sent American technology stocks plummeting

      Oh yeah, that’s what did it, totally.

      • sunzu2@thebrainbin.org

        You don’t fuck with the big man money tbh… That’s like rule 1 of the game.

  • EveningPancakes@lemm.ee

    I’m gonna download it even harder.

    • Imgonnatrythis@sh.itjust.works

      See you in hell, evildoer!

  • Katana314@lemmy.world

    This is astounding.

    I mean, not the Deepseek or jailing stuff. I mean a Senator actually proposing a law. I thought the way our government worked was, the annoying orange declares a vague uncited threat to be bad, and signs an executive order on it!

    • sugar_in_your_tea@sh.itjust.works

      No, we also allow mega corporations to submit bills that get rubber stamped by a rep somewhere. I don’t think a corporation would be so audacious as to submit this, so it’s a rare case of original content.

  • blady_blah@lemmy.world

    That’s awesome! I didn’t know you could download an LLM and run it locally! That’s what I’m really interested in: something that’s on my side and not a conduit to Google, MS, or others.

    I’m so glad Hawley proposed this bill or I wouldn’t have known that deepseek was open source and downloadable! I’ll have to go look for a download.

    • sugar_in_your_tea@sh.itjust.works

      Ollama makes it pretty easy, and there are other runners as well. Good luck!
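
      One of those other runners is llama.cpp; a minimal sketch with its Python bindings, assuming llama-cpp-python is installed and a quantized distilled-R1 GGUF has already been downloaded (the filename below is hypothetical):

      # Sketch: run a local GGUF model with llama-cpp-python instead of Ollama.
      from llama_cpp import Llama

      llm = Llama(
          model_path="DeepSeek-R1-Distill-Qwen-7B-Q4_K_M.gguf",  # hypothetical file
          n_ctx=4096,       # context window
          n_gpu_layers=-1,  # offload all layers to GPU if one is available
      )

      out = llm.create_chat_completion(
          messages=[{"role": "user", "content": "Say hi in five words."}]
      )
      print(out["choices"][0]["message"]["content"])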

    • xigoi@lemmy.sdf.org

      AFAICT it’s not open source, just open weights.

  • Railcar8095@lemm.ee

    Downloading the model and running it locally is the most secure and privacy-friendly way to use it.

    It’s absurd how little they know about what they are doing.

    • tabular@lemmy.world

      I doubt they understand the local vs. server distinction.

      • Railcar8095@lemm.ee

        “Server is when we ask Amazon to build a backdoor, local is when we ask Microsoft”

    • piecat@lemmy.world

      And that’s exactly why they want to stop it

      • sugar_in_your_tea@sh.itjust.works

        Nah, Congress (esp the Senate) is a bunch of old people yelling at clouds, and sometimes they yell the same thing. Don’t give them too much credit.

    • Knock_Knock_Lemmy_In@lemmy.world

      It’s easy to run a distilled version of the R1 model locally. It’s very difficult to run the full version. Min $6k to get 7 tokens per second.
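
      If you want to see what tokens-per-second you’re actually getting, a crude timing sketch against a local Ollama install (token count is approximated by whitespace splitting, so treat the result as a ballpark; the model tag is just an example):

      # Crude tokens/sec estimate against a local Ollama server.
      # Assumes: pip install ollama, Ollama running, model already pulled.
      import time
      import ollama

      model = "deepseek-r1:32b"  # example tag; use whichever one you pulled
      start = time.monotonic()
      resp = ollama.generate(model=model, prompt="Write a limerick about tariffs.")
      elapsed = time.monotonic() - start

      approx_tokens = len(resp["response"].split())  # rough proxy for real token count
      print(f"~{approx_tokens / elapsed:.1f} tokens/sec over {elapsed:.1f} s")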

      • rumba@lemmy.zip

        Here’s one for $2k if you don’t mind jank (edit: and 3-4 tokens per second :) )

        https://digitalspaceport.com/how-to-run-deepseek-r1-671b-fully-locally-on-2000-epyc-rig/

      • Kyuuketsuki@lemmy.ml

        I hear it’s easy, but I’ve had no luck at all on the most distilled models (for prelim testing), and am wondering how things have broken so badly.

  • jabathekek@sopuli.xyz

    I wasn’t thinking of downloading an AI onto my low tier computer until now.

    • atrielienz@lemmy.world

      I’ve got a laptop kicking around from 2010 that’s about to get deepseek just because they’re proposing this dumb ass shit. I don’t even use Gen AI.

      • dindonmasker@sh.itjust.works

        Finally affordable housing!

      • jabathekek@sopuli.xyz

        same lmao

        They never learn.

  • Chivera@lemmy.world

    Land of the free

  • technocrit@lemmy.dbzer0.com

    Yes, the ban on TikTok is working! We’re getting more and more freer!!! The kids will be saved!!! /s /s /s

    • jadelord@discuss.tchncs.de

      “Victory for free speech (as long as it means only we get to talk)!” /s

  • Gsus4@mander.xyz

    I wasn’t gonna, but now I gotta…

    • InFerNo@lemmy.ml

      You laugh, but stay safe

  • w3dd1e@lemm.ee

    God, I hate Hawley. He’s an embarrassment to my state.

    He doesn’t even live in Missouri.

  • dontbelasagne@lemmy.world

    So I guess it’s free speech as long as you agree with the government’s speech. If not, then it’s a crime.

    • John Richard@lemmy.world

      Elon Musk was just posting about a factory of prisoners all working for cents on the dollar, saying that America needs more of that.

    • ilmagico@lemmy.world

      Always have been, and this is a bipartisan value; heck, it’s common to all political parties of the world.

      • sunzu2@thebrainbin.org

        Yeah, that’s called being a sovereign… They will respect each other doing it, since it is a club in an oligarchy or “democracy”, but little people need to watch that motherfucking mouth, or daddy’s gonna issue some backhand.

    • TxzK@lemmy.zip

      free speech is when racial slurs obviously
