“The new device is built from arrays of resistive random-access memory (RRAM) cells… The team was able to combine the speed of analog computation with the accuracy normally associated with digital processing. Crucially, the chip was manufactured using a commercial production process, meaning it could potentially be mass-produced.”
Article is based on this paper: https://www.nature.com/articles/s41928-025-01477-0
This is already a thing; there’s a US lab doing this
Sounds like bullshit.
(x) Doubt.
Same here. I’ll wait to see real-life calculations done by such circuits. They won’t be able to, e.g., do a simple float addition without losing/mangling a bunch of digits.
But maybe the analog precision is sufficient for AI, which is an imprecise matter from the start.
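On that precision point, here’s a rough sketch of why noisy analog multiply-accumulates can still be tolerable for neural-net-style dot products, even though you’d never want them for exact float arithmetic. The ~1% per-product noise level is an assumption for illustration, not a figure from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_dot(w, x, rel_noise=0.01):
    """Simulate an analog multiply-accumulate: each product picks up
    ~1% relative error (a crude stand-in for RRAM conductance noise;
    the noise level is an assumption, not from the paper)."""
    products = w * x
    noisy = products * (1 + rng.normal(0, rel_noise, size=products.shape))
    return noisy.sum()

w = rng.normal(size=1024)
x = rng.normal(size=1024)

exact = float(w @ x)
approx = analog_dot(w, x)
print(f"exact={exact:.6f} analog={approx:.6f} "
      f"abs_err={abs(approx - exact):.4f}")
```

The absolute error stays small relative to the size of the intermediate products, which is the regime a neural-net layer lives in; for exact arithmetic (accounting, float addition with many significant digits) the same noise would indeed mangle the result.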
Wouldn’t analog be a lot more precise?
Accurate, though, that’s a different story…
Ahh yeah, and we should 1. believe this exists, and 2. believe that China doesn’t consider technology of this caliber a matter of national security
For the love of Christ this thumbnail is triggering, lol
Why? It’s a standard socket in SMOBO design (sandwich motherboard).
Just push ever so slightly more when you hear the crunching sounds.
Then apply thermal paste generously
1000x!
Is this like medical articles about major cancer discoveries?
yes, except the bullshit cancer discoveries are always in Israel, and the bullshit chip designs are in china.
1000x yes!
It uses 1% of the energy but is still 1000x faster than our current fastest cards? Yeah, I’m calling bullshit. It’s either a one-off, bullshit, or the next industrial revolution.
EDIT: Also, why do articles insist on using ##x less? You can just say it uses 1% of the energy. It’s so much easier to understand.
I mean, it’s like the 10th time I’m reading about THE breakthrough in Chinese chip production on Lemmy, so let’s just say I’m not holding my breath LoL.
Yeah it’s like reading about North American battery science. Like yeah ok cool, see you in 30 years when you’re maybe production ready
Coming from China, more like one-off BS, with nothing to back it up.
But it only does 16x16 matrix inversion.
Oh noes, how could that -possibly- scale?
To a billion-parameter matrix inverter? Probably not too hard, though maybe not at those speeds.
To a GPU, or even just the functions used in GenAI? We don’t even know if those are possible with analog computers to begin with.
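For what it’s worth, the “speed of analog with the accuracy of digital” pitch usually rests on mixed-precision iterative refinement: a fast, imprecise solver does the heavy lifting, and a cheap exact residual correction polishes the answer. A minimal sketch, modeling the analog 16x16 inverter as the true inverse corrupted by ~5% noise (both the noise figure and this particular refinement scheme are my assumptions for illustration, not details from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_inverse(A, rel_noise=0.05):
    """Stand-in for a fast but imprecise analog matrix inverter:
    the true inverse corrupted by ~5% relative noise (assumed figure)."""
    Ainv = np.linalg.inv(A)
    return Ainv * (1 + rng.normal(0, rel_noise, size=Ainv.shape))

def refine_solve(A, b, iters=30):
    """Preconditioned Richardson iteration: use the cheap approximate
    inverse once, then repeatedly correct the exact digital residual."""
    M = noisy_inverse(A)          # one 'analog' inversion
    x = M @ b
    for _ in range(iters):
        r = b - A @ x             # exact digital residual
        x = x + M @ r             # cheap approximate correction
    return x

n = 16
A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned test matrix
b = rng.normal(size=n)

x = refine_solve(A, b)
print("residual norm:", np.linalg.norm(b - A @ x))
```

As long as the approximate inverse is good enough that the iteration contracts, the final residual is driven down to digital (floating-point) precision, so the imprecise analog step sets the speed, not the accuracy.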
This seems like promising technology, but the figures they are providing are almost certainly fiction.
This has all the hallmarks of a team of researchers looking to score an R&D budget.
This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.
The digital domain is good for exact computation; analog is better for approximate computation, which is what neural networks require.
You might benefit from watching Hinton’s lecture; much of it details technical reasons why digital is much much better than analog for intelligent systems
BTW, that is the opposite of what he set out to prove. He says the facts forced him to change his mind.
“much of it details technical reasons why digital is much much better than analog for intelligent systems”
For current LLMs there would be a massive gain in energy efficiency if analogue computing were used. Much of the current energy cost comes from simulating what is effectively analogue processing on digital hardware. There’s a lot lost in the conversion, or “emulation,” of analogue.
I wish researchers like Hinton would stick to discussing the tech. Anytime he says anything about linguistics or human intelligence he sounds like a CS major smugly raising his hand in Phil 101 to a symphony of collective groans.
Hinton is a good computer scientist (with an infinitesimally narrow field of expertise). But the guy is philosophically illiterate.
That’s a good point. The model weights could be voltage levels instead of digital representations. Lots of audio tech uses analog for better fidelity. I also read that there’s a startup using particle beams for lithography. Exciting times.
what audio tech uses analog for better fidelity?
Vinyl records, analog tube amplifiers, a good pair of speakers 🤌
Honestly though digital compression now is so good it probably sounds the same.
At least one Nobel Laureate has exactly the opposite opinion (see the Hinton lecture above)
This is not a new line of research, in the sense that this isn’t the only place looking into mixed analog/digital computers. There have been articles on it for at least a year, I think. And back when digital was taking over, there was a lot of discussion about it being inferior to analog, so I bet the idea of combining the two has been thrown around since digital became a thing.
Who is China? Why is it so smart?
Edit: I removed a ChatGPT-generated summary; I had thought it could be useful.
Anyway, just have a good day. I appreciate that you wanted to help people, even if it didn’t land how you intended. :)
It was a decent summary, I was replying when you pulled it. Analog has its strengths (the first computers were analog, but electronics was much cruder 70 years ago) and it is def. a better fit for neural nets. Bound to happen.
No one is reading that.
That’s fine. Just have a good day :)
Nice thorough commentary. The LiveScience article did a better job of describing it for people with no background in this stuff.
The original computers were analog. They were fast, but electronics was -so crude- at the time, it had to evolve a lot … and has in the last half-century.