In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also called information divergence, information gain, or relative entropy) is a non-symmetric measure of the difference between two probability distributions P and Q.
See http://en.wikipedia.org/wiki/Kullback-Leibler_divergence for details.
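As a concrete illustration (not taken from the text above), here is a minimal Python sketch of the discrete form of the divergence, D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)); the distributions p and q are made-up examples, and the two printed values show the non-symmetry mentioned above.

import math

def kl_divergence(p, q):
    # D_KL(P || Q) for two discrete distributions given as sequences of
    # probabilities over the same outcomes (natural log, so the result is in nats).
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:  # terms with P(i) = 0 contribute nothing by convention
            total += pi * math.log(pi / qi)
    return total

# Hypothetical example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # generally differs from kl_divergence(q, p)
print(kl_divergence(q, p))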