Thursday, January 10, 2013

Securing database passwords


Recently there have been many attacks on major websites, and a lot of databases have been exposed. Although most of these databases store hashed passwords, a rainbow-table attack can still be mounted against them; worse, once the database is out, every password is under attack, and exposing the data becomes only a matter of time. Many enhancements to password storage have been suggested that increase the time needed to attack the database. Adding a salt, for example, costs almost nothing and does add some security margin against cracking, but it isn't enough on its own. There are other solutions, like using bcrypt or, even better, scrypt; while these are adequate for the time being, they are expensive in the amount of memory and/or time required for the storage/verification process. What I will suggest here is an enhancement that definitely doesn't mean you should forget about bcrypt or scrypt; you will still need a hash salt, and you might even do dual hashing on top of it. The idea will also add some security measures to the system that make it harder to manipulate.

What is it, then?

Basically, the idea is not to store the password alone, but to store the password together with an encrypted identifier (which can simply be the user name, or a hashed version of it). Very trivial! Yes, but it can be tricky to do this securely.
So the basic idea is to compute Enc( F1(password) + F2(identifier) ), where the identifier might be the user name, the user name plus something else, etc., and then hash the result with a salt, or bcrypt it, or even better scrypt it.
F1 can be the same as F2, or you can use one function over both pieces of data together (hashing is a good choice here).
Note that the function(s) you use on the password and identifier are important to examine. Since you are encrypting, you want the input data to have low frequency; it is obvious that stream ciphering is not an option here, and a variable random IV makes no sense either, so you should try to make this data as random as possible. Note that in the end you will not actually store the encrypted data in the database, which is a relief, but not much of one: it is a security feature, yet bear in mind that encrypting a lot of data with many similar patterns is always breakable. The great thing here is that it is up to you to choose and code this transformation as securely as you wish.
One thing to note is that you should track the amount of data you encrypt with any one key, so a key identifier should be stored somewhere.
More to say about this solution: even if the database is retrieved without the key, it is hard to recover the passwords or even to use them. Bear in mind again that, for utmost security, the key should be stored in secure hardware (an HSM); if that is not an option, then the key should at least not be exportable from the machine, a feature which, with some OSes, doesn't really guarantee it cannot be exported.
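To make the flow concrete, here is a minimal Python sketch of my reading of the scheme. All the names are mine, and since the standard library has no block cipher, an HMAC under a server-side key stands in for the Enc() step; in a real deployment that keyed step would be encryption under a key held in an HSM. F1 and F2 are both SHA-256, and the final slow hash is scrypt, as suggested above.

```python
import hashlib
import hmac
import os

# Hypothetical sketch; HMAC with a server-side key stands in for Enc().
# In practice the key would be non-exportable, ideally HSM-resident.
SERVER_KEY = os.urandom(32)

def stored_record(password: str, username: str) -> tuple[bytes, bytes]:
    f1 = hashlib.sha256(password.encode()).digest()   # F1(password)
    f2 = hashlib.sha256(username.encode()).digest()   # F2(identifier)
    # keyed step standing in for Enc(F1(password) + F2(identifier))
    keyed = hmac.new(SERVER_KEY, f1 + f2, hashlib.sha256).digest()
    salt = os.urandom(16)
    # final slow, salted hash over the keyed value, as the post suggests
    tag = hashlib.scrypt(keyed, salt=salt, n=2**14, r=8, p=1)
    return salt, tag

def verify(password: str, username: str, salt: bytes, tag: bytes) -> bool:
    f1 = hashlib.sha256(password.encode()).digest()
    f2 = hashlib.sha256(username.encode()).digest()
    keyed = hmac.new(SERVER_KEY, f1 + f2, hashlib.sha256).digest()
    candidate = hashlib.scrypt(keyed, salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, tag)
```

Verification recomputes the same pipeline, so an attacker who holds only the database still needs the server-side key before any offline guessing can even start.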


1- Encryption is not a very expensive process, and when added on top of bcrypt its cost is negligible (time- and memory-wise), though an HSM can be expensive.
2- A malicious user or a hacker can't simply manage to change a user's password unless he has access to the key and the database and knows the algorithms you use for F1 and F2. If you are using a non-exportable key, this can be done only if the attacker has access to specific machines; more to say, a full log can be traced back to the user (except if it is a web-application error, in which case that is hard to do).
3- You can have more control over access to the data: by putting access control on the HSM or security machine per user, a database admin can't simply change a user's password at any time.

That's all, guys. Waiting for your comments!

Sunday, September 30, 2012

X509 new Critical extension

Regarding my previous post, I have been thinking about how it would be suitable to implement fixed-size certificates in X509. Well, I got an idea: add a new extension that has a specific OID (let's call it the Fixed Size Cert OID, 1xK); it can be marked 'critical'.
In this extension you might put something like this in the octet string:
INTEGER Multiples
where Multiples says how many 1K bytes of plain data this certificate contains.
And the padding is what you actually add to make the certificate size 1K or 2K or xK!! The padding can be anything; I guess it shouldn't be random, though. I like 0xdead padding.
One note:
I don't want to make this extension flexible, for two reasons:
1- it should be hard to get similar patterns
2- it adds a predefined agreement to the cert
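As a sketch of the arithmetic only (not real ASN.1/DER encoding; the extension itself would still need proper encoding), here is how the Multiples value and the 0xdead-style padding could be computed. The function name is hypothetical:

```python
# Sketch: round the certificate's plain (to-be-signed) bytes up to a
# whole number of 1K-byte blocks, returning the Multiples integer and
# the 0xdead-style padding that fills the gap.

BLOCK = 1024

def pad_to_1k(tbs: bytes) -> tuple[int, bytes]:
    multiples = -(-len(tbs) // BLOCK)          # ceiling division
    pad_len = multiples * BLOCK - len(tbs)
    padding = (b"\xde\xad" * BLOCK)[:pad_len]  # repeat 0xdead, truncate to fit
    return multiples, padding
```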

Saturday, September 29, 2012

Hash functions: can they be more secure? Do we need them to be more secure?

A hash function can take an input of any size and gives one fixed-size output; this makes hash collisions inevitable!
Hash functions have many uses out there in the digital world, many of which are very sensitive!
PKI, for example, is built on major standards: asymmetric encryption mechanisms, X509 (ASN.1), and hashing are the three mathematical pillars of PKI, and actually of PGP too!!
Let's focus on hashing. What hashing adds to asymmetric encryption is really important, basically because it makes it feasible to sign a long message in one shot, but that doesn't come free: it comes with two flaws, padding and hash collisions.
I will focus now on hash collisions in PKI certificate signing. For any hash value there is, in reality, an infinite number of collisions; more to say, there is an infinite set of data that you are actually signing with your private key, a fact that cannot be denied even with a perfect hash function!!
One could argue: "Yes, I signed an infinite set of random inputs, but they don't qualify as an X509 certificate (for example)."
I have to tell him that one signature on a certificate is actually a signature on an infinite number of valid X509 certificates, and this is because of the X509 standard itself.
Oh my god, that's horrible! In fact it is not that horrible, because although the set is infinite it is a sparse infinite; if you are using a perfect hash function, a collision is hard to find. But with current computational power it has been proven that hash collisions can be put into action.
So what can we do?
Well, I can't say that this is a bullet-proof solution, but restricting the size, and not the structure, of the input to a hash function leads to a more secure system, or so I think; right now, for me, it is pure mathematics.
If I have a perfect hash function and one of its outputs, there is an infinite number of inputs that lead to the same result; although scattered, they exist, waiting for a manipulator to use them.
But by pre-setting the size of the input, you have only a limited number of collisions of the same size as your real input. It's obvious, isn't it?
More to say, within this limited set of collision points, only a few may satisfy a specific structure (for example, being X509 compliant, or XML, or whatever)....
If you agree with me, then I know your next question: OK, that looks good, but how might we implement it?
Well, padding again is your solution; you may choose your own padding scheme!
Your next question: OK, but that limits my input size. What if I don't have a specific upper bound on my hash input size, which is the case with most of the standards?
True, but there are usually limits; in fact, I wouldn't be surprised if I tried to enroll for a 10 KB certificate and got the CSR rejected by my CA software!
But no one would want to stop people from choosing to put their images in a certificate, for example :)
So the solution would be to first plan for a suitable input size that almost all of your data will fall below, and afterwards to treat inputs that are larger than expected as another message to be padded and then hashed.
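Here is a small Python sketch of one possible reading of that plan: fix one input size, pad every message up to it unambiguously (a 0x80 marker followed by zeros), and reduce oversized messages to their digest first so that the outer hash only ever sees fixed-size input. The size, the padding scheme, and the function name are all my assumptions:

```python
import hashlib

# Sketch of the fixed-input-size idea. FIXED_SIZE is the planned upper
# bound that almost all data should fall below.
FIXED_SIZE = 4096

def fixed_size_hash(msg: bytes) -> bytes:
    if len(msg) >= FIXED_SIZE:
        # oversize input: treat its digest as the message instead
        msg = hashlib.sha256(msg).digest()
    # unambiguous padding: 0x80 marker, then zero-fill to FIXED_SIZE
    padded = msg + b"\x80" + b"\x00" * (FIXED_SIZE - len(msg) - 1)
    assert len(padded) == FIXED_SIZE
    return hashlib.sha256(padded).digest()
```

With this, any collision against a normal-sized message must come from an input of exactly the same padded size, which is the narrowing effect argued for above.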

It looks simple, but it needs a rectification of the current standards, for example X509.

But in case you are implementing your own signature scheme, for signing a text for example, you may as well do the above as a protective solution (not bullet-proof, but let's say more protective).

One last thing I have to say here: the only issue I can think of is that someone searching for a collision has a narrower pool to search in.

In the end, I have to say I am totally open to any comments on the subject.

Thursday, October 27, 2011

Symmetric Encryption for large data

Apparently it is easy to understand a conversation between two people without knowing their language, especially when you know what they are talking about, isn't it? It is quite the same with cryptography. I know what you are going to say: I can understand the pattern. The main problem is in the chunks of data. Most symmetric encryption algorithms use blocks no longer than 256 bits, i.e. 32 bytes. Imagine a 1-megapixel image encrypted with ECB: the data pattern is quite clear there. Apparently, ECB can't protect large data with a known pattern. If the data is compressed and then sent, it is different for sure; but if an eavesdropper knows it is a JPEG, he can still understand the pattern. Only the recognition mechanism will be different, as in this case he will look for the cosine transform of the image.
Accordingly, CBC was developed, but unfortunately it is misused. CBC is usually used with one IV, and the next IVs are generated from the previous result. It is as if we said: OK, we will change our cipher language; we will start with an S, and then keep adding the last result's letter to the next phrase. It is a bit harder, but the pattern still exists. I am not an expert, but I believe that adding a totally random IV to each block might be of great value. It might be of even better value if this IV is randomly positioned, and/or if more than one random element is added to the message, and all these IVs can be wrapped up somewhere in the message (this too can be random). Doing so might improve the way the real data is hidden and scattered, increasing its entropy.
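As a toy demonstration of the pattern problem (not of the per-block-IV scheme itself), the snippet below uses a keyed pseudorandom function (SHA-256 of key||block, truncated) as a stand-in for a block cipher. It is not decryptable, but it is enough to show that ECB maps identical plaintext blocks to identical ciphertext blocks, while chaining with a random IV does not:

```python
import hashlib
import os

BLOCK = 16

def prf(key: bytes, block: bytes) -> bytes:
    # keyed PRF standing in for a real block cipher (toy, not invertible)
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb(key: bytes, data: bytes) -> bytes:
    # each block encrypted independently: patterns survive
    return b"".join(prf(key, data[i:i+BLOCK])
                    for i in range(0, len(data), BLOCK))

def cbc(key: bytes, iv: bytes, data: bytes) -> bytes:
    # each block XORed with the previous ciphertext before "encryption"
    out, prev = [], iv
    for i in range(0, len(data), BLOCK):
        prev = prf(key, bytes(a ^ b for a, b in zip(prev, data[i:i+BLOCK])))
        out.append(prev)
    return b"".join(out)

key = os.urandom(BLOCK)
plain = b"SAME_BLOCK_DATA!" * 4  # four identical blocks, like a flat image region

ecb_ct = ecb(key, plain)
cbc_ct = cbc(key, os.urandom(BLOCK), plain)

ecb_blocks = {ecb_ct[i:i+BLOCK] for i in range(0, len(ecb_ct), BLOCK)}
cbc_blocks = {cbc_ct[i:i+BLOCK] for i in range(0, len(cbc_ct), BLOCK)}
print(len(ecb_blocks), len(cbc_blocks))  # ECB collapses to 1 distinct block
```

For what it's worth, current practice addresses this with a fresh random IV per message (or, better, an authenticated mode), rather than the per-block IVs speculated about above.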

There is no standard for doing this, so I might suggest my own flawed way of doing it; please wait for my next post. Again, I am not sure this is a good idea, but to me it sounds good, since the real data can be hidden. It is then all about the randomization mechanism and the adoption of new standards to do it.

Friday, May 27, 2011


OK, so have you ever wondered whether it is not the speed of light that is constant, but our observation of it? Can it be true? Well, though it's quite hard to prove, can you imagine it: light hits, and we then observe its effect on us. Note that we are not a point in space; we extend over the universe, only that our energy is distributed such that it is focused in one region of space, something exponential-like. Can this be true?! Can it be that we are only calculating it the other way around?

Sunday, August 3, 2008

Reality or Fiction

Have you ever wondered whether it is reality or fiction? What was once fiction is now reality; what was once an axiom is now invalidated by the truth of existence.

I want to explore what we are, where we are and why are we here

That's what it is all about in this blog.

This blog was inspired by Roger Penrose's book The Road to Reality.