Dev Duniya
Mar 19, 2025
While Simple and Multiple Linear Regression models assume a linear relationship between the variables, Polynomial Regression allows for more complex relationships. It introduces polynomial terms (such as squared or cubed terms) of the independent variable(s) into the regression equation.
The equation for a simple polynomial regression of degree 2 (quadratic) is:
y = b0 + b1x + b2x²
where y is the dependent variable, x is the independent variable, b0 is the intercept, and b1 and b2 are the coefficients of the linear and squared terms.
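A key point is that the model remains linear in the coefficients b0, b1, and b2; only the features are transformed. The following sketch (with made-up coefficients, purely for illustration) shows that ordinary least squares on the columns [1, x, x²] recovers a quadratic relationship exactly:

```python
import numpy as np

# Illustrative data only: a quadratic response with known
# coefficients b0=1, b1=2, b2=3, so y = 1 + 2x + 3x².
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 3 * x**2

# Build the design matrix [1, x, x²]; the model is linear in b0, b1, b2.
X = np.column_stack([np.ones_like(x), x, x**2])

# Ordinary least squares recovers the coefficients (no noise here).
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)  # approximately [1, 2, 3]
```

This is exactly what scikit-learn's PolynomialFeatures automates in the example below.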
Let's revisit the sales prediction example, but this time, we suspect a non-linear relationship between advertising spend and sales.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = np.array([[1], [2], [3], [4], [5]])  # Independent variable (advertising spend)
y = np.array([10, 15, 18, 20, 25])       # Dependent variable (sales)

# Expand X into polynomial features: a constant column, x, and x²
polynomial_features = PolynomialFeatures(degree=2)
X_poly = polynomial_features.fit_transform(X)

# Fit an ordinary linear regression on the expanded features
model = LinearRegression()
model.fit(X_poly, y)

# Predict sales for a new advertising spend of 6
new_advertising_spend = np.array([[6]])
new_advertising_spend_poly = polynomial_features.transform(new_advertising_spend)
predicted_sales = model.predict(new_advertising_spend_poly)
print("Predicted Sales:", predicted_sales)
This code expands the input into polynomial features (1, x, and x²) and then fits a standard LinearRegression model on the expanded features.
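As a sanity check, the fitted coefficients can be read off the model directly. One sketch, reusing the same tiny dataset (note that with PolynomialFeatures' default include_bias=True, the constant column overlaps with LinearRegression's own intercept, so intercept_ plays the role of b0):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = np.array([[1], [2], [3], [4], [5]])
y = np.array([10, 15, 18, 20, 25])

poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)  # columns: [1, x, x²]

model = LinearRegression().fit(X_poly, y)

# intercept_ plays the role of b0; coef_[1] and coef_[2] are b1 and b2.
# (coef_[0] belongs to the constant column, which fit_intercept already handles.)
print("b0:", model.intercept_)
print("b1, b2:", model.coef_[1], model.coef_[2])
```

Reading the output as y = b0 + b1·x + b2·x² gives the fitted quadratic directly.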
Polynomial Regression extends the capabilities of Linear Regression by allowing for more flexible and complex relationships between variables. By incorporating polynomial terms, it can effectively model curves and capture non-linear patterns in the data. However, it's crucial to carefully choose the degree of the polynomial to avoid overfitting and ensure a meaningful and interpretable model.
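One common way to choose the degree is cross-validation: score several candidate degrees and keep the one with the best held-out performance. A minimal sketch, using synthetic data for illustration (the true relationship here is quadratic plus noise):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

# Synthetic data for illustration: a quadratic trend plus noise.
rng = np.random.default_rng(0)
X = np.linspace(0, 5, 30).reshape(-1, 1)
y = 2 + 1.5 * X.ravel() + 0.8 * X.ravel() ** 2 + rng.normal(0, 1, 30)

# Score candidate degrees with shuffled 5-fold cross-validation;
# an overfit (too-high) degree shows up as a worse held-out R² score.
cv = KFold(n_splits=5, shuffle=True, random_state=0)
results = {}
for degree in range(1, 6):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
    results[degree] = scores.mean()
    print(f"degree={degree}: mean R² = {results[degree]:.3f}")
```

On data like this, degree 2 should clearly beat degree 1, while higher degrees add little or degrade the held-out score.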