Castellucci, whose pronouns are they/them, acquired this remarkable control after gaining access to the administrative account for GivEnergy, the UK-based energy management provider that supplied the systems. In addition to the control over an estimated 60,000 installed systems, the admin account—which amounts to root control of the company’s cloud-connected products—also made it possible for them to enumerate names, email addresses, usernames, phone numbers, and addresses of all other GivEnergy customers (something the researcher didn’t actually do).
tl;dr: hacker (the good kind) exploits weak encryption key to gain access to the utility’s management system. Because you too were probably wondering how key length and power generation could possibly be related.
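For context, the root issue was a 512-bit RSA signing key. A quick way to flag undersized keys with Python’s `cryptography` package — a hypothetical audit-style sketch, not GivEnergy’s actual code:

```python
# Sketch: flag undersized RSA keys (hypothetical helper, not
# GivEnergy's actual code). Requires the `cryptography` package.
from cryptography.hazmat.primitives.asymmetric import rsa

MIN_RSA_BITS = 2048  # long-standing minimum recommendation

def is_key_strong(key: rsa.RSAPrivateKey) -> bool:
    return key.key_size >= MIN_RSA_BITS

weak = rsa.generate_private_key(public_exponent=65537, key_size=512)
ok = rsa.generate_private_key(public_exponent=65537, key_size=2048)
print(is_key_strong(weak))  # False -- 512-bit RSA was first factored back in 1999
print(is_key_strong(ok))    # True
```

A 512-bit modulus can be factored with modest cloud resources today, which is exactly what made the attack practical.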
Wow, props to Castellucci for being a stand-up person and not using their discovery to control or mess with tens of thousands of people’s power supply. And props to GivEnergy for not turning around and suing them after they reported the issue.
This could have gone badly in either direction, but we lucked out that this Castellucci seems to be an excellent and conscientious citizen.
How in the fuck do you even coax software into using a key like that? Did someone just say “yeah just use the smallest size possible, that’ll be okay” and then just like not care?
From the article:
In an email, a GivEnergy representative reinforced Castellucci’s assessment, writing:
In this case, the problematic encryption approach was picked up via a 3rd party library many years ago, when we were a tiny startup company with only 2, fairly junior software developers & limited experience. Their assumption at the time was that because this encryption was available within the library, it was safe to use. This approach was passed through the intervening years and this part of the codebase was not changed significantly since implementation (so hadn't passed through the review of the more experienced team we now have in place).
So, it sounds like they don’t have regular security audits, because that’s something that would absolutely get flagged by any halfway competent sec team.
No need for audits. It’s only critical infrastructure embedded into tens of thousands of homes, lol.
Yet another reminder that trust should be earned.
Because cryptography is specialized knowledge. Most curricula don’t even include cryptography as a core topic in a Computer Science degree. Have a look at MIT’s computer science curriculum: cryptography is instead embedded in the elective class Fundamentals of Computer Security (6.1600). That’s also why it’s DevSecOps now instead of the previous DevOps. It simply boils down to the fact that teaching and learning cryptography is hard. It’s still too early to expect a typical dev to understand how to implement cryptography, even with a good library. Most don’t know that compression and encryption don’t mix well. Nor do they understand the importance of randomness, or that you must never use the same nonce twice. They don’t even know that they can’t use built-in string comparison (`==`) for verifying password hashes, which can lead to timing attacks. Crypto lib devs who understand crypto add big scary warnings, yet someone will still mess something up. Still, I will strongly support academics adding basic cryptography knowledge to their curricula — common algorithms, key lengths, future threats, and how fast the security landscape is moving — just for the sake of the future of cyber security.
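For the `==` pitfall specifically, the stdlib already ships the fix — a minimal sketch with made-up example hashes:

```python
# Sketch: why `==` is risky for comparing secrets, and the stdlib fix.
# `==` short-circuits at the first differing byte, so comparison time
# can leak how many leading bytes match; hmac.compare_digest takes time
# independent of where equal-length inputs differ.
import hmac

stored_hash = "5f4dcc3b5aa765d61d8327deb882cf99"  # example hash value
candidate   = "5f4dcc3b5aa765d61d8327deb882cf00"  # differs in last byte

# Risky: timing depends on the position of the first mismatch.
naive_match = stored_hash == candidate

# Safe: constant-time comparison.
safe_match = hmac.compare_digest(stored_hash, candidate)

print(naive_match, safe_match)  # False False -- same answer, different leak profile
```

Both return the same boolean; the difference is only observable through timing, which is exactly what makes the bug so easy to ship.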
Eh, I disagree. Cryptography really isn’t something your average software engineer needs to know about, as long as they understand that you should never roll your own crypto. If you teach it in school, most students will forget the details and potentially just remember some now-insecure practices from their classes.
Instead, we should be pushing for more frequent security audits. Any halfway decent security audit would catch this, and probably a bunch of other issues they have as well. We should expect that from any org above a certain revenue level.
At least have a few lessons so they remember not to roll their own crypto, and to respect those scary warnings. That needs to be engraved into their minds.
I agree a security audit would catch this, but that’s after the fact. There’s a need for a more preventative solution.
Security audits should be preventative. Have them before any significant change in infrastructure is released, and have them periodically as a backup.
I had a cryptography and security class in college (I took the elective), and honestly, we didn’t cover all that much that’s actually relevant to the industry, and everything that was relevant quickly became outdated. That’s not going to be the solution; we need a greater appreciation for security audits.
At least teaching the concept of “don’t ever do it” won’t hurt, and won’t get outdated anytime soon.
However, this approach can hurt security in the long term, as it shifts the burden onto the lib devs to maintain a foolproof design. They can burn out and quit, leaving a big vulnerability behind, since most devs won’t touch the code again if it’s still “working.”
Cybersecurity is very important in today’s digital landscape, and cryptography is one of its pillars. I believe it’s essential for devs to learn the core principles of cryptography.
Again, audits are nice, and you can use them at various points, but they’re not a silver bullet. An audit is just a tool, and can’t replace proper education. People are often ignorant. An audit can generate any number of warnings, but it’s the people who need to take corrective action, and they can ignore it — or be pressured to ignore it — unless it’s part of a compliance certification process where failing could put them out of business. Otherwise, most managers are like, “Why would I care? That costs more.”
This was an incredibly interesting article.
Right? I feel like Ars Technica has been on a roll this year
I subbed because I’ve really enjoyed their content for the past few years
You know, at least when I’ve had to generate RSA keys for SSH, it seems like the highest I can possibly go is 4096. Just makes me wonder why you can’t generate a key of any length that’s a multiple of 1024. Say, what if I wanted a 20,480-bit key?
The current recommendation is to stop using RSA in new deployments altogether. ECC is preferred now, and the major programs (OpenSSL, OpenSSH, etc.) support it.
That’s ECDSA, correct? Or is that something different?
ECDSA
Yup, that’s an implementation that uses ECC (elliptic curve cryptography).
ECDSA is the elliptic curve digital signature algorithm. Key exchange is usually done with ECDH (elliptic curve Diffie-Hellman). There has been some debate on the exact best way to do ECDH, but I think the FOSS world has currently settled on Curve25519. Anyway, it is best to leave stuff like that to specialists if you’re not one yourself. As mentioned, OpenSSL and OpenSSH both provide working implementations, so go ahead and use them. The NIST curve P-256 is also perfectly fine as far as anyone can tell. It has the mathematical drawback that it’s especially easy to make mistakes and screw up the security if you don’t know what you’re doing, but the deployed implementations that are out there have been checked carefully and should be OK to use. Bitcoin uses the closely related curve secp256k1, so if anything were fundamentally wrong with that curve family, someone would have broken it and gotten pretty darn rich ;).
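To show how little curve math you actually need to touch when you lean on a vetted library — a sketch of a Curve25519 (X25519) key exchange using Python’s `cryptography` package:

```python
# Sketch: X25519 (Curve25519 Diffie-Hellman) key exchange via the
# `cryptography` package -- use a vetted implementation like this
# rather than rolling curve arithmetic yourself.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Each side generates an ephemeral keypair...
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# ...exchanges public keys, and derives the same 32-byte shared secret.
alice_shared = alice_priv.exchange(bob_priv.public_key())
bob_shared = bob_priv.exchange(alice_priv.public_key())

assert alice_shared == bob_shared
```

In practice you’d run the raw shared secret through a KDF (e.g. HKDF) before using it as a symmetric key, which the same library also provides.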
I believe you can with `openssl`, but it will take lots of time, both generating and using the key. Say you sign something with that key and the other party is using a low-end device: they might take a few minutes to verify the signature. The drawbacks just outweigh the benefits. Security is a balancing act between complexity and usability.
Yet another reason to never connect your devices to the cloud.