
Archive for the ‘Integrity’ Category

caff gpg.conf file settings

2014-04-01 1 comment

After years of using caff for my PGP key-signing needs, I finally came across the answer to a question I’ve had since the beginning.  I document it here so that I may keep my sanity next time I go searching for the information.

My question was “how do you make a specific certification in a signature?”.  As defined in RFC 1991, section 6.2.1, the four types of certifications are:

     <10> - public key packet and user ID packet, generic certification
          ("I think this key was created by this user, but I won't say
          how sure I am")
     <11> - public key packet and user ID packet, persona certification
          ("This key was created by someone who has told me that he is
          this user") (#)
     <12> - public key packet and user ID packet, casual certification
          ("This key was created by someone who I believe, after casual
          verification, to be this user")  (#)
     <13> - public key packet and user ID packet, positive certification
          ("This key was created by someone who I believe, after
          heavy-duty identification such as picture ID, to be this
          user")  (#)

Generally speaking, the default settings in caff only provide the first-level “generic” certification. Tonight I found information specific to ~/.caff/gnupghome/gpg.conf. As far as I know, this file can contain three lines:

personal-digest-preferences SHA256
cert-digest-algo SHA256
default-cert-level 2

I can’t find any official information on this file as the man pages are a little slim on details.  That said, if you use caff you should definitely create this file and populate it with the above at a minimum, with the exception of default-cert-level, which should be whatever you feel comfortable setting.  My default is “2” for key-signing parties (after I’ve inspected an “official” identification card and/or passport).  The other two settings are important as they ensure a decent SHA-2 hash is used instead of the default SHA-1.
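Putting it together, here is the same file annotated; the certification level shown is my own choice (casual, i.e. the <12> class above) and should reflect your comfort level:

```
# ~/.caff/gnupghome/gpg.conf -- annotated sketch
personal-digest-preferences SHA256   # prefer SHA-256 when making signatures
cert-digest-algo SHA256              # hash key certifications with SHA-256
default-cert-level 2                 # 2 = casual certification (<12> above)
```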

Configuring offlineimap to validate SSL/TLS certificates

2014-01-30 Leave a comment

I recently upgraded to Fedora 20 and quickly found my offlineimap instance failing.  I was getting all kinds of errors about the certificate not being authenticated.  Concerned wasn’t really the word I’d use to describe my feelings on the subject.  It turns out the version of offlineimap in Fedora 20 (I won’t speculate as to earlier versions) requires either a certificate fingerprint validation or a CA validation if ssl = yes is in the configuration file (.offlineimaprc).  I was able to remedy the situation by putting sslcacertfile = /etc/ssl/certs/ca-bundle.crt in the config file.

I won’t speculate as to the functionality in earlier versions but checking to make sure the SSL certificate is valid is quite important (MITM).  If you run across a similar problem just follow the instructions above and all should, once again, be right with the world.
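For example, a minimal remote-repository stanza might look like this (the account and host names are placeholders; the CA bundle path is Fedora’s):

```
[Repository ExampleRemote]
type = IMAP
remotehost = imap.example.com
ssl = yes
# Point offlineimap at the system CA bundle so the server
# certificate can be validated against a trusted CA:
sslcacertfile = /etc/ssl/certs/ca-bundle.crt
```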

Categories: Fedora 20, Integrity, Security

Kicking RC4 out the door

2013-11-13 2 comments

I’ve been arguing with my web hosting company about their use of RC4.  Like many enterprise networks, they aren’t consistent across all their servers with respect to available ciphers and such.  It appears that all customer servers support TLS_RSA_WITH_CAMELLIA_256_CBC_SHA and TLS_RSA_WITH_CAMELLIA_128_CBC_SHA in addition to TLS_RSA_WITH_RC4_128_SHA (although the latter is preferred over the other two), but their backend control web servers only support RC4.  This is a problem: if you are handling crypto keys and other settings over a weakly encrypted path in order to better secure your web service, you have essentially failed by using the weak encryption to begin with.

So what’s wrong with RC4?

It’s been known for a while (years!) that RC4 is not a good encryption cipher.  It’s broken, and several practical attacks against it are available.  So why is it being used so frequently?  In a word: BEAST.  RC4 was the only stream cipher available that could sidestep BEAST, and so it became the standard for all TLS connections.  It’s not clear which attack vector is worse: BEAST or the weak RC4.

In recent months most Internet browsers have implemented the 1/n-1 record-splitting workaround to fix the BEAST vulnerability.  With the fix in place it should, once again, be safe to use block ciphers and thus get better encryption ciphers (better protection).  Many people and organizations have been talking about the need to get rid of RC4 now, since it is the bigger threat to web security.  Yesterday Microsoft released a security bulletin discussing the problem and urged all developers to stop using RC4.  (Oh yeah, and they also want everyone to stop using SHA-1 as well.)  I usually think of Microsoft as trailing in the security field (let’s face it, their products haven’t been known for being secure ever since that whole network thing happened), so when they say this mess with RC4 must stop, it’s gotten to a point where we should have already done so.

So what are we waiting for?

I think, simply, we’re waiting for TLSv1.1 and TLSv1.2 to become mainstream.  It’s not as if these technologies have just popped up on our radar screens (they’ve been out since April 2006 and August 2008, respectively), but adoption of the two flavors of TLS has been slow.  According to Microsoft, their products are ready for TLSv1.1 and TLSv1.2 (both IIS and IE 11).  Firefox supports up to TLSv1.2 in 25.0, but you have to turn it on manually (it’s there for testing), and OpenSSL (used by Apache) should support TLSv1.2 in its 1.0.1e release.  It’s time to start pushing these better encryption mechanisms into operation… now.
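On the server side, an Apache (mod_ssl) configuration that drops RC4 in favor of stronger ciphers might look like the following; this is only a sketch, so test it against the clients you need to support before deploying:

```
# Sketch only -- verify client compatibility before deploying.
SSLProtocol all -SSLv2 -SSLv3
SSLHonorCipherOrder on
# Exclude RC4 (and other weak options) from the cipher list:
SSLCipherSuite HIGH:!aNULL:!eNULL:!MD5:!RC4
```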

Updates

What Google has to say on the subject.

How secure are those SSL and SSH keys anyway?

2013-10-30 Leave a comment

Thought I’d pass along this research study, The keys to the kingdom, as I found it to be quite interesting (especially when you scan the entire Internet for your data).  If you don’t understand the math explanation at the beginning just continue reading as you don’t need to have a degree in math and science to understand what’s going on.

Why Android SSL was downgraded from AES256-SHA to RC4-MD5 in late 2010

2013-10-14 1 comment

Just ran across this article discussing how horrible the cipher preference list is in Android.  That’s a lot of bad crypto on the streets right now.


Trusting Trusted CAs

2013-10-09 Leave a comment

Like it or not, the basis of trust for much of the Internet rests on Certificate Authorities (CAs).  Companies like Verisign, GoDaddy, and GeoTrust are in the trust business.  They will sell you cryptographic proof of your Internet assets (namely your domain name) that others can use to verify that when they visit your website they are actually visiting your website and not some lookalike.  This is important: you don’t want to give the login credentials for your bank account to a lookalike web page that really isn’t your bank.

The trouble is, how do you know the CAs are doing their due diligence and not just issuing certificates to anyone who claims to own a particular domain name?  Well, I’m not sure we, as users, do know.  Mozilla, like other browser vendors, has a policy for including CAs in their browser product, but a quick look at the list of CAs already in Firefox shows that we as users probably can’t go through and verify them all.

If I were a conspiracy theorist I would be looking really hard at what the Electronic Frontier Foundation (EFF) recently released about the NSA spying program.  According to their research (and that of the Guardian and others), the NSA is actively performing man-in-the-middle (MITM) attacks to get malware onto computers.  This malware allows the NSA (and anyone else capable of accessing the infected computers) to circumvent protections put in place to keep information passed over the Internet secure.  To perform these MITM attacks one would need to present users with a valid SSL certificate if they happen to be visiting a site that is supposed to be secured.  The only ways of doing this are to obtain the SSL certificates from the real sites or to create new ones and have them signed by a trusted CA.  With that in mind, I wonder which option is more probable?

It’s good to note that these types of attacks are not solely done by the NSA.  Gaining access to computers is a very profitable business and one that people other than governments can do.  It’s important to protect yourself against these attacks and be smart when surfing the Internet.  The end of the EFF story contains information on how to protect your computer (and yourself) and is a good read for everyone.

Reflections on Trusting Trust

2013-09-18 1 comment

Reflections on Trusting Trust

This is an old paper written back in 1984 by Ken Thompson.  Mr. Thompson describes why it is so difficult to trust software even when you have access to the source code.  We are now reading daily about how the NSA has access to our network communications and even our computers.  If they have access you can believe that others, completely unrelated to the NSA, have access as well through many of the same software bugs or network connections.  It will be difficult to figure out how to get past these problems.  Fortunately we do have smart people thinking about these things daily.

Categories: Fedora Project, Integrity, Security

MTA certificate not verifying in Fedora 19

2013-09-06 Leave a comment

Since upgrading to Fedora 19 I’ve been working out the kinks.  Today I was finally able to run one of my problems down and fix it: my MTA was failing to deliver mail due to a TLS error.

This failure affected both postfix and ssmtp.  After much log searching I was able to determine that ssmtp wasn’t verifying the public certificate of the distant SMTP server against the CA certificates I have on my system.  I was able to confirm that the problem existed on other Fedora 19 systems and that it wasn’t just my crazy setup.  After working with a couple of developers, it seems that the ssmtp configuration file now requires the entry “TLS_CA_File=/etc/pki/tls/certs/ca-bundle.crt” to function correctly.  It is not currently known what change created this problem.

I have not yet troubleshot postfix, but I suspect a similar solution will be needed.
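For ssmtp the fix boils down to one line in /etc/ssmtp/ssmtp.conf (the mailhub below is a placeholder); if postfix turns out to behave the same way, the analogous parameter would presumably be smtp_tls_CAfile in main.cf:

```
# /etc/ssmtp/ssmtp.conf (excerpt)
mailhub=smtp.example.com:587
UseSTARTTLS=YES
# Verify the remote server's certificate against the system CA bundle:
TLS_CA_File=/etc/pki/tls/certs/ca-bundle.crt
```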

Gemalto USB Shell Token bit the dust… Now what?

2013-09-01 1 comment

I went to sign an outgoing message tonight and my Gemalto USB Shell Token wouldn’t light up when I plugged it into my USB port.  After doing the typical troubleshooting I am left with the thought that the device has given up the ghost.  This isn’t a big problem because I saved the card the SIM came out of and I’m able to use my token on my personal laptop with a traditional card reader.  My work computer, however, does not have one of these fancy card readers (maybe I can find my USB one somewhere?).

I can buy another Gemalto device but I’m wondering if there is a better device to use?  I mean, the Gemalto lasted almost 500 signatures.  But I guess I can now take advantage of the failure to try something else.  Does anyone have any suggestions?

Encrypting SMS messages and phone calls on Android

2013-03-21 Leave a comment

Much of our daily lives is contained within our smartphones and computers.  Email, text messages, and phone calls all contain bits and pieces of information that, in the wrong hands, could harm our privacy.  Unfortunately, many people either don’t understand how vulnerable their data is when sent across the Internet (or another commercial circuit) or just don’t care.  While I don’t have much to say to the crowd in the latter category (can’t fix stupid), I do try to help people in the former category understand that any network outside their control is fair game for pilfering and that basic precautions need to be taken to protect themselves.  While I’m not going to dig into how data can be intercepted (there are plenty of articles out there on the subject), I would like to talk about how one can use tools to protect their data when using an Android smartphone.

Until recently, email was the only easily encrypted mode of communication.  Most people didn’t have the means to encrypt their phone conversations and certainly not their SMS messages (unless you happen to be using a SME-PED, but those things are terrible in other ways).  Now Whisper Systems has released two open source programs that allow you to protect your communications.  The first is called “RedPhone”.  This program encrypts your phone conversations and allows you to converse securely.  The second program is called “TextSecure” and encrypts text messages using authenticated, asymmetric encryption.

I like the way TextSecure manages keys and lets you verify the other user’s key directly so you can establish trust.  RedPhone appears to rely on trust in the phone number for authentication.  RedPhone also offers to encrypt automatically when the distant party has RedPhone on their device, which is a nice feature that I wish TextSecure provided as well.  Both of these programs are very easy to use and need very little configuration.

TextSecure also provides an encrypted container for all your text messages so that your text messages are secure if the attacker has physical access to the device.

And OpenPGP is still a great option for protecting your email communications but that is a topic for later.
