
How Much Longer Can We Rely on 2048-Bit Public Keys?

Site operators and those who manage their own web apps have long had to concern themselves with encryption, but increasingly they're being forced to choose between several options when it comes to locking down their information. PGP and RSA deployments once relied on 1024-bit public keys, but advances in factoring showed that those keys no longer offered enough of a security margin for use on the open web. That led to the adoption of more secure 2048-bit keys, but some computer specialists feel that those are on track to become just as unsafe. The debate has caused quite a bit of confusion among those who have to manage smart cards or request SSL certificates.

NIST guidance suggests that 2048-bit keys will no longer be considered secure come 2030, which means that you'd have to re-sign your code in about eight years. Instead of waiting, intrepid computer security personnel are already starting to migrate to 4096-bit keys, but the market as a whole is taking quite a long time to do so. Since bad actors are constantly finding new ways to compromise existing keys, some are urging developers to deprecate all of the 2048-bit RSA code currently in use.
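The 2030 cutoff comes from NIST SP 800-57, which maps RSA modulus sizes to symmetric-equivalent security strengths (2048 bits ≈ 112 bits of security, 3072 ≈ 128) and disallows strengths below 128 bits after 2030. A minimal policy-check sketch along those lines — the `meets_2030_guidance` helper is illustrative, not part of any standard API:

```python
# Approximate NIST SP 800-57 mapping from RSA modulus size
# to symmetric-equivalent security strength, in bits.
SECURITY_STRENGTH = {1024: 80, 2048: 112, 3072: 128, 7680: 192, 15360: 256}

def security_strength(modulus_bits: int) -> int:
    """Return the largest tabulated strength not exceeding this key size."""
    eligible = [s for bits, s in SECURITY_STRENGTH.items() if bits <= modulus_bits]
    if not eligible:
        raise ValueError(f"{modulus_bits}-bit modulus is below any tabulated strength")
    return max(eligible)

def meets_2030_guidance(modulus_bits: int) -> bool:
    """NIST disallows security strengths below 128 bits after 2030."""
    return security_strength(modulus_bits) >= 128

print(security_strength(2048), meets_2030_guidance(2048))  # 112 False
print(security_strength(4096), meets_2030_guidance(4096))  # 128 True
```

Note that under this table a 4096-bit key only has to clear the 3072-bit (128-bit-strength) bar; NIST's own recommended next step is actually 3072 bits, with 4096 providing extra headroom.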

Unfortunately, this isn’t as easy as flipping a switch.

Barriers to Adopting 4096-Bit Keys

Four kilobits — just 512 bytes — doesn't sound like a lot of information, but it's twice as much as many existing keys currently use. Multiply that by the number of public-key operations performed around the world every minute, and you can start to see why this is an issue. To make matters worse, not all devices have equally powerful silicon to throw at the problem.
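The size difference compounds, because in RSA the modulus size also dictates the size of every signature and every encrypted block. A quick back-of-the-envelope sketch:

```python
def rsa_sizes(modulus_bits: int) -> dict:
    """Sizes that scale directly with the RSA modulus, in bytes."""
    n_bytes = modulus_bits // 8
    return {
        "public_modulus": n_bytes,    # the key material itself
        "signature": n_bytes,         # one RSA signature is modulus-sized
        "ciphertext_block": n_bytes,  # so is one encrypted block
    }

print(rsa_sizes(2048))  # every field is 256 bytes
print(rsa_sizes(4096))  # every field doubles to 512 bytes
```

So each certificate in a chain, each handshake signature, and each wrapped session key doubles in size along with the key.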

While a 4096-bit key would be nothing for a modern desktop or laptop PC, many mobile devices are going to balk when presented with too many of them. Devices that use multiple layers of virtualization to maintain a sophisticated network stack might be particularly burdened, since every key check would push more data through each layer than they're currently tuned for. Some machines are designed with dedicated chips that can process antiquated 1024-bit keys, as well as more modern 2048-bit keys, almost instantaneously.

Once you add a longer key to the equation, you start to run into situations where these chips fall back to a sort of inefficient software mode in which they have to work far more than twice as hard to handle twice as many bits: doubling the modulus length roughly quadruples the cost of each underlying multiplication, and private-key operations slow down even more. On top of this, switching to a greater key length increases the overall complexity of every algorithm that touches the key.
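That super-linear slowdown is easy to demonstrate with plain modular exponentiation, the core operation of RSA. The sketch below times Python's built-in three-argument `pow()` on random moduli; the exact ratio depends on the big-integer implementation, but doubling the operand size reliably costs well over double the time:

```python
import random
import time

def time_modexp(bits: int, trials: int = 20) -> float:
    """Average seconds for one modular exponentiation at the given size."""
    random.seed(bits)  # reproducible operands
    n = random.getrandbits(bits) | (1 << (bits - 1)) | 1  # bits-long odd modulus
    base = random.getrandbits(bits) % n
    exp = random.getrandbits(bits)
    start = time.perf_counter()
    for _ in range(trials):
        pow(base, exp, n)
    return (time.perf_counter() - start) / trials

t2048 = time_modexp(2048)
t4096 = time_modexp(4096)
print(f"4096-bit exponentiation cost: {t4096 / t2048:.1f}x the 2048-bit cost")
```

On typical hardware the ratio comes out in the 5–8x range: twice as many bits, but far more than twice the work.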

Why Upgrading Encryption APIs Might Not Be So Simple

Readers with some programming experience might be of the opinion that an existing algorithm could simply be reused, as long as end users didn't mind authentication-related subroutines running at half their current speed or worse. Depending on the hardware involved, it might seem like there's really not much of a difference anyway. The problem is that the RSA libraries underpinning today's SSL/TLS deployments were never designed to resist attacks carried out by quantum computing-based devices.

Specialists in the financial fraud protection field have raised these concerns, pointing to the fact that we're approaching a time when quantum computing hardware will be functional enough that bad actors could leverage it to dramatically accelerate key-recovery attacks. This is actually a big part of the reason that authorities have been suggesting larger encryption keys will become a must by 2030 to begin with. If outdated libraries were used to present a modern security layer, then it's likely that these would eventually get cracked by brute force even if other more sophisticated types of exploits never punctured any holes in them.

Using a Simple Attack to Break Complicated Security Features

An increase in the total installed base of quantum computing technology would make it more likely that 2048-bit public keys would have to be retired before 2030. Any patches released before that time could stop complex attacks and plug zero-day exploits, but the underlying keys would remain vulnerable to straightforward brute-force factoring attempts. On top of that, such patches would make it hard for non-technical users to browse the web, as they'd probably get pestered with various modal authorization dialog boxes.
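To see why raw key size is the only real defense against this kind of attack, consider a toy brute-force factorization of a deliberately tiny "RSA modulus". Real attacks on real key sizes use far more sophisticated methods, and real moduli are hundreds of digits long, but the principle is the same: recover the private key by finding the factors of the public modulus.

```python
import math

def brute_force_factor(n: int) -> tuple:
    """Trial-divide an odd semiprime n; only feasible when n is tiny."""
    for p in range(3, math.isqrt(n) + 1, 2):
        if n % p == 0:
            return p, n // p
    raise ValueError("no odd factor found")

# A 12-bit textbook modulus: n = 53 * 61. Trial division cracks it instantly;
# each additional bit of modulus roughly doubles the work for this naive attack.
p, q = brute_force_factor(3233)
print(p, q)  # 53 61
```

No patch or protocol tweak changes this arithmetic; only a larger modulus pushes the search back out of reach.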

Validating digital documents may become more difficult as well, even for technicians. Since users couldn't trust whatever security stack was running on their device, they might simply be unable to verify the identity of a sender. For that reason, engineers will likely be sunsetting many of the existing SSL certificates that power the web today.
