Ma Uttaram
Linear Algebra

Linear Regression

The Math: (y = Xw + b)

The Loop:

Use Gradient Descent:

1. Predict (y_{pred} = Xw + b).
2. Calculate the error: (y_{pred} - y_{true}).
3. Compute the gradient: (X^T \cdot \text{error} / n).
4. Update the weights: (w = w - (\text{learning_rate} \times \text{gradient})).
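The loop above can be sketched in a few lines of NumPy (a minimal illustration; the function name and defaults are my own, not from the post):

```python
import numpy as np

def linear_regression_gd(X, y, learning_rate=0.01, n_iters=1000):
    """Fit y = Xw + b with batch gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iters):
        y_pred = X @ w + b            # 1. Predict
        error = y_pred - y            # 2. Error
        grad_w = X.T @ error / n      # 3. Gradient w.r.t. weights
        grad_b = error.mean()         #    Gradient w.r.t. bias
        w -= learning_rate * grad_w   # 4. Update
        b -= learning_rate * grad_b
    return w, b
```

On data generated from y = 2x + 1, this recovers a weight near 2 and a bias near 1.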

Logistic Regression

The Difference:

Same as Linear Regression, but pass the result through the Sigmoid Function: (1 / (1 + e^{-z})).

The Goal:

The sigmoid maps any real number to a probability between 0 and 1.
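In code, the only change from linear regression is one extra function call (a minimal sketch; `predict_proba` is my own name for the helper):

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into (0, 1): 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(X, w, b):
    """Logistic regression: the linear model passed through the sigmoid."""
    return sigmoid(X @ w + b)
```

A linear score of 0 maps to probability 0.5; large positive scores approach 1, large negative scores approach 0.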

K-Means

The Logic:

1. Pick (k) random points as "centroids."
2. Assign every data point to the nearest centroid.
3. Move each centroid to the average of its assigned points.
4. Repeat until the centroids stop moving.
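The four steps translate almost directly into NumPy (a sketch under my own assumptions: random initialization from the data points, a fixed iteration cap, and an early stop when centroids no longer move):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # 1. Pick k random data points as the initial centroids.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # 2. Assign every point to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # 3. Move each centroid to the average of its assigned points
        #    (keep it in place if no points were assigned).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        # 4. Repeat until the centroids stop moving.
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels
```

On two well-separated blobs of points, the returned labels split the data cleanly into the two groups.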
