This is an incomplete discussion of SSL/TLS authentication and encryption. This post only covers RSA and does not discuss DHE, PFS, elliptic-curve cryptography, or other mechanisms.
In a previous post I created a 15,360-bit RSA key and timed how long it took to generate. Some may have thought that was some sort of stunt to check processor speed. I mean, who needs an RSA key of such strength? Well, it turns out that if you actually need 256 bits of security, an RSA key of exactly that size is what's required.
According to NIST (SP 800-57, Part 1, Rev. 3), to achieve 256 bits of security you need an RSA key of at least 15,360 bits to protect the symmetric 256-bit cipher that's securing the communications (SSL/TLS). So what does the new industry-standard RSA key size of 2048 bits buy you? According to the same document, a 2048-bit key buys you 112 bits of security. Increasing the key size to 3072 bits brings you up to the 128 bits that most people expect to be the minimum protection. And this assumes that the certificate and the certificate chain are all signed using a SHA-2 algorithm (SHA-1 only gets you 80 bits of security when used for digital signatures and hashes).
So what does this mean for those websites running AES-256 or CAMELLIA-256 ciphers? They are likely wasting processor cycles without adding to the overall security of the connection. Let's look at two examples of TLS implementations in the wild.
First, we'll look at wordpress.com. This website is protected using a 2048-bit RSA certificate, signed using SHA-256, and uses the AES-128 cipher. This represents 112 bits of security because of the limitation of the 2048-bit key. The certificate properly chains back to the GoDaddy CA, whose root and intermediate certificates are all 2048 bits and signed using SHA-256. Even though the 2048-bit key reduces the overall security, it's likely more efficient to use AES-128 than any other cipher thanks to the AES hardware acceleration typically found in computers nowadays.
Next we'll look at one of my domains: christensenplace.us. This website is protected using a 2048-bit RSA certificate, signed using SHA-1, and uses the CAMELLIA-256 cipher. This represents 80 bits of security due to the limitation of the SHA-1 signature used on the certificate and on the CA and intermediate certificates from AddTrust and COMODO CA. My hosting company uses both the RC4 cipher and the CAMELLIA-256 cipher. In this case the CAMELLIA-256 cipher is a waste of processor cycles, since the certificates used aren't nearly strong enough to support such encryption. I block RC4 in my browser, as RC4 is no longer recommended to protect anything. I'm not sure exactly how much security you get from RC4, but I suspect it's less than what SHA-1 provides.
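If you want to check these parameters for yourself and have OpenSSL handy, the following sketch inspects a certificate's public-key size and signature algorithm. It generates a throwaway self-signed certificate so it runs offline; for a live site you'd feed `openssl s_client` output into `openssl x509` instead (file paths and the CN are arbitrary placeholders):

```shell
# Generate a throwaway 2048-bit RSA key and a SHA-256 self-signed cert.
# (For a live site, pipe `openssl s_client -connect wordpress.com:443 </dev/null`
# into `openssl x509` instead of reading a local file.)
openssl req -x509 -newkey rsa:2048 -sha256 -nodes -days 1 \
    -subj "/CN=test.example" \
    -keyout /tmp/test-key.pem -out /tmp/test-cert.pem 2>/dev/null

# Report the public-key size and the signature algorithm
openssl x509 -in /tmp/test-cert.pem -noout -text \
    | grep -E 'Public-Key|Signature Algorithm'
```

The output should show a 2048-bit public key and sha256WithRSAEncryption, matching the wordpress.com example above.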
So what to do? Well, if system administrators are concerned with performance, then using a 128-bit cipher (like AES-128) is a good idea. For those concerned with security, a 3072-bit RSA key (at a minimum) will give you 128 bits of security. If you feel you need more than 128 bits of security, generating a solid, large RSA key is the first step. How many bits of security you need depends on how long you want the information to stay secure. But that's a post for another day.
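If you do decide to move up to 3072 bits, generating and verifying such a key with OpenSSL is straightforward (the file path here is an arbitrary placeholder):

```shell
# Generate a 3072-bit RSA key (roughly 128 bits of security per SP 800-57)
openssl genrsa -out /tmp/rsa3072.pem 3072 2>/dev/null

# Confirm the modulus size; the first line reports the key size in bits
openssl rsa -in /tmp/rsa3072.pem -noout -text | head -n 1
```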
After years of using caff for my PGP key-signing needs I finally came across the answer to a question I've had since the beginning. I document it here so that I may keep my sanity next time I go searching for the information.
My question was "how do you make a specific certification in a signature?". As defined in RFC 4880, section 5.2.1, the four types of certifications are:
- 0x10: generic certification of a public key packet and user ID packet ("I think this key was created by this user, but I won't say how sure I am")
- 0x11: persona certification ("This key was created by someone who has told me that he is this user")
- 0x12: casual certification ("This key was created by someone who I believe, after casual verification, to be this user")
- 0x13: positive certification ("This key was created by someone who I believe, after heavy-duty identification such as picture ID, to be this user")
Generally speaking, the default settings in caff only provide the first-level "generic" certification. Tonight I found information specific to ~/.caff/gnupghome/gpg.conf. As far as I know, this file can contain three lines:
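The listing appears to have been lost from this post; based on the description that follows (a default certification level plus two hash settings), these are most likely the standard GnuPG options involved. Treat this as a reconstruction and adjust the values to taste:

```
personal-digest-preferences SHA256
cert-digest-algo SHA256
default-cert-level 2
```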
I can't find any official information on this file, as the man pages are a little slim on details. That said, if you use caff you should definitely create this file and populate it with the above at a minimum, with the exception of the default-cert-level, which should be whatever you feel comfortable with. My default is "2" for key-signing parties (after I've inspected an "official" identification card and/or passport). The other two settings are important, as they ensure a decent SHA-2 hash is used instead of the default SHA-1.
I recently upgraded to Fedora 20 and quickly found my offlineimap instance failing. I was getting all kinds of errors about the certificate not being authenticated. "Concerned" wasn't really the word I'd use to describe my feelings on the subject. It turns out the version of offlineimap in Fedora 20 (I won't speculate about earlier versions) requires either a certificate-fingerprint validation or a CA validation if ssl = yes is in the configuration file (.offlineimaprc). I was able to remedy the situation by putting

sslcacertfile = /etc/ssl/certs/ca-bundle.crt

in the config file.
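For reference, a minimal remote-repository section with this setting in place might look like the following (the repository name and host are placeholders, not my actual setup):

```
[Repository example-remote]
type = IMAP
remotehost = imap.example.com
ssl = yes
sslcacertfile = /etc/ssl/certs/ca-bundle.crt
```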
I won't speculate about the functionality in earlier versions, but checking that the SSL certificate is valid is quite important (it's what stops a man-in-the-middle attack). If you run across a similar problem, just follow the instructions above and all should, once again, be right with the world.
I've been arguing with my web hosting company about their use of RC4. Like many enterprise networks, they aren't consistent across all their servers with respect to available ciphers and such. It appears that all customer servers support TLS_RSA_WITH_CAMELLIA_256_CBC_SHA and TLS_RSA_WITH_CAMELLIA_128_CBC_SHA in addition to TLS_RSA_WITH_RC4_128_SHA (although the RC4 suite is preferred over the other two), but their backend control web servers only support RC4. This is a problem: if you are handling crypto keys (and other settings) over a weakly encrypted path in order to better secure your web service, you have essentially failed, because the weak link undermines everything built on top of it.
So what’s wrong with RC4?
It's been known for a while (years!) that RC4 is not a good encryption cipher. It's broken, and several practical attacks against it are available. So why is it being used so frequently? In a word: BEAST. The BEAST attack targets CBC-mode block ciphers, and RC4 was the only widely supported stream cipher immune to it, so it became the standard for all TLS connections. It's not clear which attack vector is worse: BEAST or the weaknesses in RC4.
In recent months most Internet browsers have implemented the 1/n-1 record-splitting workaround to fix the BEAST vulnerability. With the fix in place it should, once again, be safe to use block ciphers and thus get better encryption (better protection). Many people and organizations have been talking about the need to get rid of RC4 now, since it has become the bigger threat to web security. Yesterday Microsoft released a security bulletin discussing the problem and urged all developers to stop using RC4. (Oh yeah, and they also want everyone to stop using SHA-1 as well.) I usually think of Microsoft as trailing in the security field (let's face it, their products haven't been known for being secure ever since that whole networking thing happened), so when even they say this mess with RC4 must stop, we've reached a point where we should have already done so.
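If your server uses OpenSSL, one way to see what's left after banishing RC4 is to filter the cipher list directly; the same exclusion string can go straight into Apache's SSLCipherSuite directive (shown here as a sketch against whatever OpenSSL build you have installed):

```shell
# Build a cipher list with RC4 removed and show the first few entries;
# in Apache this would be e.g.: SSLCipherSuite DEFAULT:!RC4
openssl ciphers 'DEFAULT:!RC4' | tr ':' '\n' | head -n 5
```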
So what are we waiting for?
I think, simply, we're waiting for TLSv1.1 and TLSv1.2 to become mainstream. It's not as if these technologies have just popped up on our radar screens (they've been out since April 2006 and August 2008, respectively), but adoption of the two newer flavors of TLS has been slow. According to Microsoft, their products are ready for TLSv1.1 and TLSv1.2 (both IIS and IE 11+). Firefox supports up to TLSv1.2 as of 25.0, but you have to turn it on manually (it's there for testing), and OpenSSL (used by Apache) should support TLSv1.2 in its 1.0.1e release. It's time to start pushing these better encryption mechanisms into operation… now.
Thought I'd pass along this research study, The Keys to the Kingdom, as I found it quite interesting (especially the part where they scan the entire Internet for data). If you don't understand the math explanation at the beginning, just keep reading; you don't need a degree in math or science to understand what's going on.
Since upgrading to Fedora 19 I've been working out the kinks. Today I was finally able to run down one of my problems and fix it. It involved my MTA failing to deliver mail due to a TLS failure.
This failure occurred with both postfix and ssmtp. After much log searching, I was able to determine that ssmtp wasn't verifying the public certificate of the distant SMTP server against the CA certificates I have on my system. I was able to confirm that the problem existed on other Fedora 19 systems and that it wasn't just my crazy setup. After working with a couple of developers, it seems the ssmtp configuration file now requires the entry "TLS_CA_File=/etc/pki/tls/certs/ca-bundle.crt" to function correctly. It is not currently known what changes were made that created this problem.
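For context, the relevant portion of /etc/ssmtp/ssmtp.conf then looks something like this (the mailhub value is a placeholder, not my actual relay):

```
mailhub=smtp.example.com:587
UseSTARTTLS=YES
TLS_CA_File=/etc/pki/tls/certs/ca-bundle.crt
```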
I have not yet dug into postfix, but I suspect a similar fix will be needed.
I went to sign an outgoing message tonight and my Gemalto USB Shell Token wouldn't light up when I plugged it into my USB port. After the typical troubleshooting, I'm left with the thought that the device has given up the ghost. This isn't a big problem, because I saved the card the SIM came out of and I'm able to use my token on my personal laptop with a traditional card reader. My work computer, however, does not have one of these fancy card readers (maybe I can find my USB one somewhere?).
I can buy another Gemalto device, but I'm wondering if there is a better device to use. I mean, the Gemalto lasted almost 500 signatures. But I guess I can now take advantage of the failure to try something else. Does anyone have any suggestions?