
TrueCrypt no longer supported (developers abruptly pull support during audit)

It was a pretty plug-and-play setup; I used it when I had no luck setting up PGP disk encryption. Gotta say it's interesting to note how widely adopted it was, without anyone really knowing exactly why they were trusting it. I think most just figured (at least, I did) that if others who seemed smarter than me were using it, it's gotta be good!

I'm really interested to see what the second round of auditing uncovers. Nobody really knows who put this crypto out, so it could just be inept work, or an intentional 'honeypot' type thing with deliberately weak encryption or a backdoor built in on purpose...
 
Yeah, that's what I'm wondering too. And TrueCrypt isn't even the industry standard; everyone is switching over to BitLocker, which has OS integration. Makes you think twice about security. Luckily I'm only an IT intern at this company, so the fix won't be up to me, but I will definitely mention this at work...
 
Even if they don't find any smoking guns per se, the fact that the developers warned users and bailed is the de facto smoking-est gun. I dunno about industry standards and imagine they vary widely, but TrueCrypt was more popular than BitLocker is now (and has been a 'go-to' of sorts for the better part of a decade).

Question (ahem, thujone, ahem): is just doing 'manual' disk encryption with GPG or PGP* still 'legit' security ('corporate-level' protection)?
[*Is one better than the other? I've only used GPG, i.e. GnuPG, 'cause I like to support EFF/GNU/et al.]
 
I frequently hear stuff like 'OSS is good 'cause the code's open, so you know it's legit!', but in reality, who the hell actually audits the code for most things? Also, while I'm no programmer, I do wonder how hard an intentionally included backdoor or flaw is to spot. Wouldn't the auditors be at a significant disadvantage trying to find a backdoor, compared to a smarter programmer who hid it from the start?

That depends on how well the backdoor is programmed. If it opens connections to the outside and sends unencrypted data over them, a skilled programmer or cryptologist, or even someone with a simple packet sniffer, will notice immediately. However, if the backdoor is designed so that it is very hard to trigger, say, you have to follow a very specific and very long procedure to get unencrypted data out of the program, it can be much harder to spot. But I doubt such a thing would be possible in the case of TrueCrypt, as its use is so widespread. You can be sure that loads of people have audited that code: skilled programmers, students working on their dissertations, cryptologists who want to publicize their work, government agencies...
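A minimal sketch of the distinction above (in Python, with invented names; nothing here is from TrueCrypt's actual code): a 'loud' backdoor that phones home shows up on a packet sniffer, but a silent hard-coded unlock path generates no traffic at all and only a source audit will reveal it.

```python
# Hypothetical illustration of a 'quiet' backdoor: a hard-coded second
# passphrase that unlocks the volume. All names are invented for this sketch.

import hashlib

# Value known only to the attacker; its hash is baked into the binary.
MAGIC_HASH = hashlib.sha256(b"attacker-only-value").hexdigest()

def check_passphrase(stored_hash: str, passphrase: str) -> bool:
    """Normal path: hash the passphrase and compare to the stored hash."""
    h = hashlib.sha256(passphrase.encode()).hexdigest()
    # The backdoor: the magic passphrase also unlocks the volume.
    # No network traffic, no file writes; only reading the source
    # (or knowing the magic value) exposes it.
    return h == stored_hash or h == MAGIC_HASH

stored = hashlib.sha256(b"user password").hexdigest()
print(check_passphrase(stored, "user password"))         # legitimate unlock
print(check_passphrase(stored, "attacker-only-value"))   # backdoor unlock
print(check_passphrase(stored, "wrong guess"))           # rejected
```

In a real closed-source binary, even this crude version would be invisible to a sniffer; in open source it is one `or` away from being caught, which is roughly the trade-off the post describes.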

I'm not saying it isn't possible. If you're interested, look up the Stuxnet virus. It was made to specifically target industrial control systems at Iran's uranium-enrichment facilities, and it is theorized that such a complex virus must have been made by some government agency, as it was light-years ahead of anything we had seen before; people compared it to rocket science in its complexity. It exploited weaknesses to cause damage to the enrichment centrifuges over a long period of time, so as not to draw suspicion: it pushed the centrifuge rotors outside their normal speed range while replaying normal-looking sensor data to the operators, which caused the centrifuges to wear out and break over time. It took a long time for people to realize this too, so it is definitely possible, and more so if the backdoor can be built into the software itself beforehand and doesn't need to be added afterwards by way of a virus. It isn't easy, however. I am a programmer and I wouldn't know where to begin if I had to install an undetectable backdoor in something like TrueCrypt. Closed-source programs are much easier to manipulate undetectably with malicious intentions. But there are people 1000x more skilled and smarter than me, so it is definitely possible...
 
Idk if widespread use is enough to make sure important bugs are found fast. OpenSSL is widespread, so you'd think someone would have spotted the Heartbleed bug a lot sooner.

As for how hard an intentionally hidden vulnerability is to spot: if it's done well, very hard. There are all kinds of non-obvious things you can do to weaken the encryption enough to make brute-force attacks feasible: messing with the 'random' generators used for generating keys, 'optimizations' that weaken the encryption, and so on. And I don't write encryption software for a living; someone who does could name even more. There is also the possibility that the software is written perfectly, but the math behind the encryption algorithms is flawed. So far no usable flaws in the algorithms TC uses are known to the general public, but does that mean they don't exist? There's also the possibility that both the implementation and the algorithms are fine, but your other software and/or hardware are compromised.
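A minimal sketch of the 'messing with the random generators' idea above, with invented names and deliberately toy numbers: if the key generator secretly draws from a tiny seed space (here, the seconds within one day), the keys look random but an attacker can brute-force the entire space in seconds.

```python
# Hypothetical weakened key generator: output looks like 16 random bytes,
# but is fully determined by a seed folded into a space of only 86,400 values.
# (Real attacks of this shape have hit systems that seeded RNGs from the clock.)

import random

def weak_keygen(seed: int) -> bytes:
    """Derive a 16-byte 'random' key; only seed % 86400 actually matters."""
    rng = random.Random(seed % 86400)  # just 86,400 possible keys in total
    return bytes(rng.randrange(256) for _ in range(16))

# Victim generates a key from some timestamp-like seed.
victim_key = weak_keygen(1_700_000_123)

# Attacker exhaustively searches the whole seed space.
recovered_seed = next(s for s in range(86400) if weak_keygen(s) == victim_key)
print(weak_keygen(recovered_seed) == victim_key)  # attacker now has the key
```

A 128-bit key drawn from a source with ~17 bits of real entropy is, for brute-force purposes, a 17-bit key; the point is that nothing about the key bytes themselves gives this away, which is why it's so hard to spot in an audit. (In real Python code, `secrets.token_bytes(16)` is the appropriate tool, not `random`.)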

Anyway, at the moment there are no known vulnerabilities in TC and the security audit is continuing. Which encryption solution to trust is something everyone has to decide for themselves. There are alternatives, perhaps more secure, perhaps not.
 