Learn Before
Polynomial Feature Transformation
We can use a polynomial feature transformation to map a problem into a higher-dimensional regression space. Adding these extra polynomial features gives us a much richer set of complex functions to fit to the data. This is like fitting polynomials to the training data instead of simply a straight line, while still using the same least-squares criterion that minimizes mean squared error. Adding new features such as polynomial features is also very effective for classification.
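As a minimal sketch of this idea (the cubic target function and noise level below are illustrative assumptions, not from the source): each scalar input x is expanded into the features [1, x, x², x³], and the expanded design matrix is then fit with ordinary least squares, exactly as in plain linear regression.

```python
import numpy as np

# Hypothetical 1-D training data: y is roughly cubic in x plus noise
# (assumed example values, chosen only for illustration).
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = 0.5 * x**3 - x + rng.normal(scale=1.0, size=x.shape)

# Polynomial feature transformation: map each scalar x to [1, x, x^2, x^3].
degree = 3
X_poly = np.vander(x, N=degree + 1, increasing=True)  # shape (50, 4)

# Fit with the same least-squares criterion (minimize mean squared error)
# used for an ordinary straight-line fit.
coeffs, *_ = np.linalg.lstsq(X_poly, y, rcond=None)

# The model is still linear in the coefficients; only the features are
# nonlinear in x.
y_hat = X_poly @ coeffs
mse = np.mean((y - y_hat) ** 2)
```

Note that nothing about the fitting step changes: the model remains linear in its parameters, so the usual closed-form least-squares machinery applies unchanged to the transformed features.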
Tags
Data Science
Related
Types of linear regression
Logistic Regression
When is a model considered linear?
Linear Regression and Linear Models Videos
Example of Linear Regression Using R
Training or Fitting a linear regression model
Assessing the accuracy of linear regression
Comparison of Linear Regression with K-Nearest Neighbors
Assumptions of Linear Regression
Regression Coefficient
Linear regression vs. nonlinear regression
Popular Regularization Techniques in Deep Learning
Polynomial Feature Transformation
Locally Weighted Linear Regression
Linear Regression Dataset Notation
History of Linear Regression
Linear Regression One-Dimensional Fit
Squared Error Loss
Linear Regression Conditional Mean Assumption
Linear Regression Weight Parameters
Linear Regression Bias Parameter
Linear Regression as a Neural Network
Synthetic Data Generation for Linear Regression