Quadratic Regression Analysis

Quadratic Regression Analysis, also known as Second-Order Regression Analysis, is a supervised learning technique that models non-linear relationships, such as parabolic trends, using a quadratic equation. The quadratic equation is a second-degree polynomial, written in the following form:

y = ax^2 + bx + c

where a, b, and c are the coefficients to be estimated and a ≠ 0.

The goal of quadratic regression analysis is to fit this equation to the observed data, providing a more refined, non-linear model of the data than linear regression.
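For intuition, the same least-squares problem can be sketched directly with NumPy's polyfit before turning to scikit-learn (an illustrative aside; the data values are made up):

import numpy as np

# Sample data following y = x^2
x = np.array([1, 2, 3, 4])
y = np.array([1, 4, 9, 16])

# Least-squares fit of y = a*x^2 + b*x + c; coefficients are returned highest degree first
a, b, c = np.polyfit(x, y, deg=2)
print(a, b, c)  # approximately 1.0 0.0 0.0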

Syntax

In Scikit-Learn, quadratic regression analysis is performed by combining the PolynomialFeatures and LinearRegression classes:

sklearn.preprocessing.PolynomialFeatures(degree=2, *, interaction_only=False, include_bias=True, order='C')
sklearn.linear_model.LinearRegression(*, fit_intercept=True, copy_X=True, n_jobs=None, positive=False)
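The two estimators are commonly chained. As a sketch, scikit-learn's make_pipeline can combine them into a single estimator that transforms and fits in one step:

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Chain the feature transformer and the regressor into one estimator
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())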

PolynomialFeatures has the following parameters (see the sketch after this list):

  • degree (int, default=2): The degree of the polynomial features.
  • interaction_only (bool, default=False): If True, only interaction features are produced (no powers of features).
  • include_bias (bool, default=True): If True, includes a bias column (constant term) in the output features.
  • order (str, default='C'): The order of the output array. 'C' means row-major (C-style) and 'F' means column-major (Fortran-style).
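As a quick illustration of how these parameters shape the output, the following sketch transforms a single two-feature sample (values chosen arbitrarily):

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[2, 3]])

# Defaults (degree=2, include_bias=True): bias, x1, x2, x1^2, x1*x2, x2^2
print(PolynomialFeatures(degree=2).fit_transform(X))  # [[1. 2. 3. 4. 6. 9.]]

# interaction_only=True keeps products of distinct features but drops the squared terms
print(PolynomialFeatures(degree=2, interaction_only=True, include_bias=False).fit_transform(X))  # [[2. 3. 6.]]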

LinearRegression has the following parameters (see the sketch after this list):

  • fit_intercept (bool, default=True): Determines whether the model calculates an intercept term. If False, the model is forced through the origin.
  • copy_X (bool, default=True): If True, copies the input data to avoid modifying the original. Otherwise, the input data may be overwritten.
  • n_jobs (int, default=None): The number of CPU cores to use for parallel computation. -1 uses all available cores.
  • positive (bool, default=False): If True, constrains all coefficients to be positive.
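To see fit_intercept in action, this minimal sketch (with made-up data following y = 2x + 1) fits a model and reads back the learned coefficient and intercept:

import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[1], [2], [3]])
y = np.array([3, 5, 7])  # y = 2x + 1

model = LinearRegression(fit_intercept=True)
model.fit(X, y)
print(model.coef_, model.intercept_)  # approximately [2.] 1.0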

Example

The example below demonstrates quadratic regression analysis by generating degree-2 polynomial features with PolynomialFeatures, fitting a LinearRegression model, and making predictions:

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
# Define sample training data and labels
training_data = np.array([[1], [2], [3], [4]])
labels = np.array([1, 4, 9, 16])  # Example quadratic relationship: y = x^2
# Define degree-2 polynomial features without a bias term (LinearRegression fits the intercept)
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression()
# Transform training data to include polynomial features and fit the model
poly_features = poly.fit_transform(training_data)
model.fit(poly_features, labels)
# Define sample test data
test_data = np.array([[5], [6]])
# Transform test data with the same 'PolynomialFeatures' instance and predict labels
test_poly_features = poly.transform(test_data)
predictions = model.predict(test_poly_features)
print(predictions)

Here is the output for the above example:

[25. 36.]
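Because the training labels follow y = x^2 exactly, the learned parameters can also be inspected; the two coefficients correspond to the x and x^2 features, respectively:

print(model.coef_, model.intercept_)  # approximately [0. 1.] 0.0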

Codebyte Example

The following codebyte example demonstrates quadratic regression analysis by fitting a linear model to polynomial features:

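The codebyte itself runs interactively. A minimal sketch of a comparable program, assuming randomly generated training data with added noise, looks like this:

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Randomly generate training data with a quadratic relationship plus noise
rng = np.random.default_rng()
X = rng.uniform(-3, 3, size=(30, 1))
y = 2 * X[:, 0] ** 2 - X[:, 0] + 1 + rng.normal(0, 0.5, size=30)

# Fit a linear model to degree-2 polynomial features
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression()
model.fit(poly.fit_transform(X), y)

# Predict on new inputs; results vary from run to run because of the random data
print(model.predict(poly.transform(np.array([[0.5], [2.0]]))))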

Note: The output predictions will vary between runs because the training data and noise are randomly generated.
