Yet another reason to never connect your devices to the cloud.
You have to wonder how many other things are out there with effectively worthless encryption because some old document or default option told them to (or let them) implement it without any ‘hey! some 14-year-old with a TI-83 could crack this key!’ warning.
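To make the "crackable by a kid with a calculator" point concrete, here's a toy sketch using textbook RSA numbers (nothing to do with the actual key in the article): once an RSA modulus is small enough to factor, the private key falls out immediately.

```python
from math import isqrt

# Toy numbers from the classic textbook RSA example -- NOT the real key.
n = 3233   # public modulus (= 53 * 61, small enough to factor instantly)
e = 17     # public exponent

# "Cracking" the key: trial division finds a factor of a tiny n at once.
p = next(i for i in range(2, isqrt(n) + 1) if n % i == 0)
q = n // p
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # recovered private exponent (Python 3.8+)

msg = 65
ciphertext = pow(msg, e, n)
print(pow(ciphertext, d, n))  # -> 65: decrypted without ever being given d
```

Real keys are safe only because the same step — factoring n — is infeasible at proper sizes; shrink the key and the whole scheme collapses to the loop above.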
There was a book with Bitcoin wallet generator code in it that specifically said the code was vulnerable and was only meant as a demo, and yet somebody released a wallet built on that code and fucked a bunch of people over by accident.
Castellucci, whose pronouns are they/them, acquired this remarkable control after gaining access to the administrative account for GivEnergy, the UK-based energy management provider who supplied the systems. In addition to the control over an estimated 60,000 installed systems, the admin account—which amounts to root control of the company’s cloud-connected products—also made it possible for them to enumerate names, email addresses, usernames, phone numbers, and addresses of all other GivEnergy customers (something the researcher didn’t actually do).
tl;dr: hacker (the good kind) exploits weak encryption key to gain access to the utility’s management system. Because you too were probably wondering how key length and power generation could possibly be related.
Wow, props to Castellucci for being a stand-up person and not using their discovery to control or mess with tens of thousands of people’s power supply. And props to GivEnergy for not turning around and suing them after they reported the issue.
This could have gone badly in either direction, but we lucked out that this Castellucci seems to be an excellent and conscientious citizen.
This was an incredibly interesting article.
Right? I feel like ars technica has been on a roll this year
You know, at least when I’ve had to generate RSA keys for SSH, it seems like the highest I can possibly go is 4096. Just makes me wonder why you can’t generate a key of any length that’s a multiple of 1024. Like, what if I wanted a 20,480-bit key?
I believe you can with openssl, but it will take a lot of time both generating and using the key. Say you sign something with that key and the other party is using a low-end device: it might take them a few minutes to verify the signature. The drawbacks just outweigh the benefits. Security is a balancing act between complexity and usability.
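The "using the key" cost is easy to see even without generating real keys: an RSA private-key operation is a modular exponentiation with an exponent as long as the modulus, and that cost grows roughly cubically with key size. A stdlib-only sketch (random odd numbers stand in for real key material, so the timings only illustrate scaling, not real RSA performance):

```python
import secrets
import time

def time_private_op(bits: int) -> float:
    """Time one pow(c, d, n) with d and n both `bits` long."""
    # Stand-ins for a real key: private-key ops use a full-length
    # exponent, which is exactly what makes huge keys slow to use.
    n = secrets.randbits(bits) | (1 << (bits - 1)) | 1   # odd, full-length
    d = secrets.randbits(bits) | 1
    c = secrets.randbits(bits - 1)
    start = time.perf_counter()
    pow(c, d, n)
    return time.perf_counter() - start

for bits in (2048, 4096, 8192):
    print(f"{bits:5d}-bit exponentiation: {time_private_op(bits):.3f}s")
```

Each doubling of the key size makes every signing or decryption several times slower, so a 20,480-bit key would mostly buy you latency rather than meaningful extra security.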
Does consumer-grade hardware that can decrypt (in seconds, not hours) with such a key even exist today?
How in the fuck do you even coax software into using a key like that? Did someone just say “yeah just use the smallest size possible, that’ll be okay” and then just like not care?
Because cryptography is specialized knowledge. Most curricula don’t even include cryptography as a core topic in a Computer Science degree. Have a look at MIT’s computer science curriculum: cryptography is instead embedded in the elective class Fundamentals of Computer Security (6.1600). That’s also why it’s DevSecOps now instead of the previous DevOps. It simply boils down to this: teaching and learning cryptography is hard. It’s still too early to expect a typical dev to understand how to implement cryptography, even with a good library. Most don’t know that compression and encryption don’t mix well. Nor do they understand the importance of randomness, or that you must never use the same nonce twice. They don’t even know that they can’t use built-in string comparison (==) for verifying password hashes, which can lead to timing attacks. Crypto lib devs who understand crypto add big scary warnings, and yet someone will still mess something up.

Still, I will strongly support academics adding basic cryptography knowledge to their curricula — common algorithms, key lengths, future threats, and how fast the security landscape is moving — just for the sake of the future of cyber security.
Eh, I disagree. Cryptography really isn’t something your average software engineer needs to know about, as long as they understand that you should never roll your own crypto. If you teach it in school, most students will forget the details and potentially just remember some now-insecure practices from their classes.
Instead, we should be pushing for more frequent security audits. Any halfway decent security audit would catch this, and probably a bunch of other issues they have as well. Expect that from any org with revenue above some level.
At least have a few lessons so they remember not to roll their own crypto and to respect those scary warnings. That needs to be engraved into their minds.
I agree a security audit would catch this, but that’s after the fact. There is a need for a more preventative solution.
Security audits should be preventative. Have them before any significant change in infrastructure is released, and have them periodically as a backup.
I had a cryptography and security class in college (I took the elective), and honestly, we didn’t cover all that much that’s actually relevant to the industry, and everything that was relevant was quickly outdated. That’s not going to be a solution, we need a greater appreciation for security audits.
In an email, a GivEnergy representative reinforced Castellucci’s assessment, writing:
In this case, the problematic encryption approach was picked up via a 3rd party library many years ago, when we were a tiny startup company with only 2, fairly junior software developers & limited experience. Their assumption at the time was that because this encryption was available within the library, it was safe to use. This approach was passed through the intervening years and this part of the codebase was not changed significantly since implementation (so hadn't passed through the review of the more experienced team we now have in place).
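One cheap guardrail against exactly this failure mode is to assert a minimum key size wherever keys enter the system, instead of trusting whatever a third-party library's default happened to be. A minimal stdlib-only sketch (the 2048-bit floor follows current NIST guidance; the function name is my own):

```python
MIN_RSA_MODULUS_BITS = 2048  # widely cited floor; shorter keys are breakable

def assert_key_strength(modulus: int) -> None:
    """Reject an RSA public modulus that is too short to be safe."""
    bits = modulus.bit_length()
    if bits < MIN_RSA_MODULUS_BITS:
        raise ValueError(
            f"RSA modulus is only {bits} bits; "
            f"minimum allowed is {MIN_RSA_MODULUS_BITS}"
        )
```

A check like this at key load time turns "nobody ever re-reviewed that old code path" into a hard failure the first time a weak key shows up, which is the kind of thing a junior team can adopt long before a full audit program.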
Yet another reminder that trust should be earned.
So, it sounds like they don’t have regular security audits, because that’s something that would absolutely get flagged by any halfway competent sec team.
No need for audits. It’s only critical infrastructure embedded into tens of thousands of homes, lol.