Ridge regression is a form of linear regression that adds a regularization term to the loss function to reduce overfitting. Specifically, it penalizes the squared L2 norm of the coefficient vector, scaled by a regularization strength (often written λ or α), so the objective becomes the sum of squared errors plus λ times the sum of squared coefficients. This penalty shrinks the coefficients toward zero, reducing their variance at the cost of a small increase in bias, which typically improves the model's generalization to unseen data.
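As a minimal sketch, the ridge estimator has a closed-form solution, w = (XᵀX + αI)⁻¹ Xᵀy, which can be implemented directly with NumPy. The synthetic data, true coefficients, and α value below are illustrative assumptions, not from the original text:

```python
import numpy as np

# Synthetic data (illustrative): 100 samples, 3 features, known true weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution: w = (X^T X + alpha * I)^{-1} X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w_ols = ridge_fit(X, y, alpha=0.0)     # alpha = 0 recovers ordinary least squares
w_ridge = ridge_fit(X, y, alpha=10.0)  # positive alpha shrinks the coefficients

# The ridge coefficient vector has a smaller L2 norm than the OLS one,
# illustrating the shrinkage effect described above.
print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))  # → True
```

In practice the penalty strength α is chosen by cross-validation: larger values shrink harder (more bias, less variance), while α = 0 recovers unregularized least squares.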