Mutual information (MI) quantifies the dependency between two variables. Unlike correlation, which only measures linear relationships, MI can detect non-linear interactions, making it particularly useful for analyzing biological data. It is defined as the reduction in uncertainty about one variable given knowledge of the other. Mathematically, it is expressed as:

MI(X; Y) = H(X) + H(Y) - H(X, Y)

where H(X) and H(Y) are the entropies of X and Y, respectively, and H(X, Y) is their joint entropy.
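As a minimal sketch of this formula, the snippet below (hypothetical helper names, standard library only) estimates MI for discrete samples by plugging empirical entropies into MI(X; Y) = H(X) + H(Y) - H(X, Y). The example data have a symmetric, non-linear relationship with zero linear correlation, yet a positive MI:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of discrete labels."""
    n = len(labels)
    counts = Counter(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(x, y):
    """MI(X; Y) = H(X) + H(Y) - H(X, Y) for paired discrete samples."""
    joint = list(zip(x, y))  # joint entropy from the paired observations
    return entropy(x) + entropy(y) - entropy(joint)

# y is a deterministic but non-linear function of x; its symmetry around
# x = 1.5 makes the Pearson correlation zero, yet MI is 1 bit.
x = [0, 1, 2, 3] * 25
y = [(v - 1.5) ** 2 for v in x]
print(mutual_information(x, y))  # → 1.0
```

Note that this plug-in estimator is biased upward for small samples; in practice, bias-corrected or binning-aware estimators are preferred for continuous biological measurements.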