25. Quiz: Polynomial Regression
Polynomial Regression Exercise
Get some practice implementing polynomial regression in this exercise. In data.csv, you can see data generated for one predictor feature ('Var_X') and one outcome feature ('Var_Y'), following a non-linear trend. Use sklearn's PolynomialFeatures class to extend the predictor feature column into multiple columns with polynomial features. Play around with different polynomial degrees and the Test Run button to see what fits best; when you think you have the best-fitting degree, press the Submit button to check your work!
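Before the steps, here is a minimal sketch of what PolynomialFeatures does to a single predictor column (toy values, not the quiz data):
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
# Toy column of two values, shaped as 2 rows x 1 column
X_demo = np.array([[2.0], [3.0]])
# Expand into polynomial features up to degree 3
poly_demo = PolynomialFeatures(degree=3)
print(poly_demo.fit_transform(X_demo))
# [[ 1.  2.  4.  8.]
#  [ 1.  3.  9. 27.]]
# Columns are [1, x, x^2, x^3]; the leading column of ones is the bias
# term, included because include_bias defaults to True.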
Perform the following steps below:
1. Load in the data
- The data is in the file called 'data.csv'. Note that this data has a header line.
- Make sure that you've split out the data into the predictor feature in X and the outcome feature in y.
- For X, make sure it is in a 2-d array of 20 rows by 1 column. You might need to use NumPy's reshape function to accomplish this (see the sketch after this list).
2. Create polynomial features
- Create an instance of sklearn's PolynomialFeatures class and assign it to the variable poly_feat. Pay attention to how you set the degree of features, since that is how the exercise is evaluated.
- Create the polynomial features by using the PolynomialFeatures object's .fit_transform() method, as illustrated in the sketch above. The "fit" side of the method works out how many features are needed in the output, and the "transform" side applies those considerations to the data provided to the method as an argument. Assign the new feature matrix to the X_poly variable.
3. Build a polynomial regression model
- Create a polynomial regression model by combining sklearn's LinearRegression class with the polynomial features. Assign the fit model to poly_model.
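For the reshape step in part 1, here is a minimal sketch (toy values, separate from the quiz files) of turning a 1-d array into the 2-d column that sklearn expects:
import numpy as np
values = np.array([1.0, 2.0, 3.0, 4.0])   # shape (4,)
column = values.reshape(-1, 1)            # shape (4, 1): 4 rows, 1 column
print(column.shape)                       # (4, 1)
# -1 tells NumPy to infer the number of rows from the array's length,
# so the same call works for the 20-row quiz data.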
Start Quiz:
# TODO: Add import statements
# Assign the data to predictor and outcome variables
# TODO: Load the data
train_data = None
X = None
y = None
# Create polynomial features
# TODO: Create a PolynomialFeatures object, then fit and transform the
# predictor feature
poly_feat = None
X_poly = None
# Make and fit the polynomial regression model
# TODO: Create a LinearRegression object and fit it to the polynomial predictor
# features
poly_model = None
# Once you've completed all of the steps, select Test Run to see your model
# predictions against the data, or select Submit Answer to check if the degree
# of the polynomial features is the same as ours!
data.csv:
Var_X,Var_Y
-0.33532,6.66854
0.02160,3.86398
-1.19438,5.16161
-0.65046,8.43823
-0.28001,5.57201
1.93258,-11.13270
1.22620,-5.31226
0.74727,-4.63725
3.32853,3.80650
2.87457,-6.06084
-1.48662,7.22328
0.37629,2.38887
1.43918,-7.13415
0.24183,2.00412
-2.79140,4.29794
1.08176,-5.86553
2.81555,-5.20711
0.54924,-3.52863
2.36449,-10.16202
-1.01925,5.31123
Solution:
# TODO: Add import statements
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
# Assign the data to predictor and outcome variables
# TODO: Load the data
train_data = pd.read_csv('data.csv')
X = train_data['Var_X'].values.reshape(-1, 1)
y = train_data['Var_Y'].values
# Create polynomial features
# TODO: Create a PolynomialFeatures object, then fit and transform the
# predictor feature
poly_feat = PolynomialFeatures(degree=4)
X_poly = poly_feat.fit_transform(X)
# Make and fit the polynomial regression model
# TODO: Create a LinearRegression object and fit it to the polynomial predictor
# features
# fit_intercept=False because PolynomialFeatures already includes a bias
# (all-ones) column, so a separate intercept would be redundant
poly_model = LinearRegression(fit_intercept=False).fit(X_poly, y)
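If you want to look at the fit yourself outside the quiz interface, here is a minimal sketch (not part of the graded answer; it assumes matplotlib is available in your environment) that plots the data against the model's predictions on a dense grid of Var_X values:
import matplotlib.pyplot as plt
# Predict on a dense grid, transformed the same way as the training column
x_grid = np.linspace(X.min(), X.max(), 200).reshape(-1, 1)
y_grid = poly_model.predict(poly_feat.transform(x_grid))
plt.scatter(X, y, label='data')
plt.plot(x_grid, y_grid, color='red', label='degree-4 fit')
plt.legend()
plt.show()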