Multiple Linear Regression
The model
Multiple linear regression uses multiple features:
ŷ = w1·x1 + w2·x2 + ... + wk·xk + b
```mermaid
flowchart LR
  X1[x1] --> M[Linear Model]
  X2[x2] --> M
  Xk[xk] --> M
  M --> Y[Prediction ŷ]
```
Interpreting coefficients
All else being equal, wk tells how much the prediction ŷ changes when feature xk increases by 1.
But be careful:
- if features are correlated, individual coefficients become unstable and hard to interpret (multicollinearity)
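A small synthetic sketch of this effect (the data and noise levels here are invented for illustration): when one feature is a near-copy of another, the individual coefficients can drift far from the "true" values, even though their combined effect stays stable.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# x2 is an almost exact copy of x1 (near-perfect correlation)
x1 = rng.uniform(0, 10, size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)
y = 3.0 * x1 + rng.normal(scale=0.1, size=100)

model = LinearRegression().fit(np.column_stack([x1, x2]), y)

# The individual coefficients may be far from (3, 0),
# but their sum stays close to the true combined effect of 3.
print("coefficients:", model.coef_)
print("sum of coefficients:", model.coef_.sum())
```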
Scikit-learn example
Multiple linear regression

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Example: [size_sqft, bedrooms, age]
X = np.array([
    [800, 2, 10],
    [1000, 3, 5],
    [1200, 3, 20],
    [1500, 4, 7],
])
y = np.array([180, 240, 220, 320])

model = LinearRegression()
model.fit(X, y)
print("coefficients:", model.coef_)
print("intercept:", model.intercept_)
```
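After fitting, the model predicts from new rows with the same feature order. A quick sketch (the new house's values are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Same training data as above: [size_sqft, bedrooms, age]
X = np.array([
    [800, 2, 10],
    [1000, 3, 5],
    [1200, 3, 20],
    [1500, 4, 7],
])
y = np.array([180, 240, 220, 320])

model = LinearRegression().fit(X, y)

# Predict for a new house: 1100 sqft, 3 bedrooms, 12 years old
new_house = np.array([[1100, 3, 12]])
print("predicted price:", model.predict(new_house)[0])
```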
Common issues
- multicollinearity: features carry overlapping signal
- scaling: if you add regularization (e.g. Ridge or Lasso), standardize the inputs first so the penalty treats all coefficients comparably
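One way to address both issues at once is to standardize the features and add a small L2 penalty. A sketch using scikit-learn's Pipeline with the housing data from above (alpha=1.0 is an arbitrary choice, not a tuned value):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge

X = np.array([
    [800, 2, 10],
    [1000, 3, 5],
    [1200, 3, 20],
    [1500, 4, 7],
])
y = np.array([180, 240, 220, 320])

# Scale first so the L2 penalty affects all coefficients comparably
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
model.fit(X, y)
print("ridge coefficients:", model.named_steps["ridge"].coef_)
```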
Mini-checkpoint
- Which features are strongly correlated?
- Consider removing or combining them (feature engineering) if needed.
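A quick way to run this check is the sample correlation matrix, e.g. with np.corrcoef (feature columns assumed as in the example above):

```python
import numpy as np

# Columns: [size_sqft, bedrooms, age]
X = np.array([
    [800, 2, 10],
    [1000, 3, 5],
    [1200, 3, 20],
    [1500, 4, 7],
], dtype=float)

# np.corrcoef treats rows as variables, so pass the transpose
corr = np.corrcoef(X.T)
print(corr)
# In this tiny dataset, size_sqft and bedrooms are highly correlated,
# making them candidates for removal or combination.
```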
🧪 Try It Yourself
Exercise 1 – Train-Test Split
Exercise 2 – Fit a Linear Model
Exercise 3 – Evaluate with MSE
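One possible end-to-end sketch covering all three exercises (the split ratio and random_state are arbitrary choices; with a dataset this tiny the resulting numbers are only illustrative):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

X = np.array([
    [800, 2, 10],
    [1000, 3, 5],
    [1200, 3, 20],
    [1500, 4, 7],
])
y = np.array([180, 240, 220, 320])

# Exercise 1: hold out part of the data for testing
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Exercise 2: fit on the training split only
model = LinearRegression().fit(X_train, y_train)

# Exercise 3: evaluate on the held-out split
mse = mean_squared_error(y_test, model.predict(X_test))
print("test MSE:", mse)
```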
