Description
I have been working with the following code snippet and getting erratic results. I suspect there may be something wrong with the PolynomialLeastSquares() class, or I just don't fully understand what it does.
If I run this:
int Period = 14;
double[] inputs = new double[Period];
double[] outputs = new double[Period];
var ols = new PolynomialLeastSquares();
ols.Degree = 3;
for (int i = 0; i < Period; i++)
{
    inputs[i] = Convert.ToDouble(i);
    outputs[i] = inputs[i];
}
// Use OLS to learn the regression
PolynomialRegression reg = ols.Learn(inputs, outputs);
double result = reg.Transform(-1.0);
Print(ols.Degree + " " + reg.ToString() + " Int " + reg.Intercept);
This is what is returned:
3 y(x) = 2.37812678233955E-17x^2 + 1x^1 + 2.77718345656259x^0 Int -2.77718345656258
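Before going further, here is a quick check I can run (just a sketch, assuming Transform(0.0) simply evaluates the fitted polynomial at x = 0) to see whether the x^0 term and reg.Intercept really disagree in sign:

// If the model is y(x) = w3*x^3 + w2*x^2 + w1*x + intercept, then Transform(0.0)
// should return reg.Intercept; if it instead matches the +2.777...x^0 term from
// ToString(), the sign difference is more than a display artifact.
double atZero = reg.Transform(0.0);
Print("Transform(0) = " + atZero + ", Intercept = " + reg.Intercept);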
The printed equation appears to be only degree 2, since there is no x^3 term, and the x^0 coefficient is shown with the opposite sign from reg.Intercept. If I use a more complex input with higher values and more of a curve, the Intercept gets much larger in magnitude. I took a quick look at the source code:
https://github.com/accord-net/framework/blob/development/Sources/Accord.Statistics/Models/Regression/Linear/Fitting/PolynomialLeastSquares.cs
Some things look odd to me in how the X matrix is built: I don't see the column of 1's for the x^0 parameter (maybe that is handled elsewhere), and the Pow() call never seems to produce an x^Degree factor, only up to x^(Degree-1). Not sure if this helps, but if you could advise when you get time, I would appreciate it.
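To illustrate what I mean, here is a minimal sketch (my own illustration, not the framework's code) of how I would expect the design matrix for a degree-d fit to be assembled, assuming the intercept is estimated separately from the x^1..x^d columns; the BuildDesignMatrix helper below is hypothetical and only for this example:

// Illustrative sketch only, not Accord.NET source. Builds a design matrix whose
// columns are x^degree, x^(degree-1), ..., x^1; the constant x^0 column is
// omitted on the assumption that the intercept is fitted separately.
// Requires: using System;
static double[][] BuildDesignMatrix(double[] x, int degree)
{
    double[][] X = new double[x.Length][];
    for (int i = 0; i < x.Length; i++)
    {
        X[i] = new double[degree];
        for (int p = 0; p < degree; p++)
            X[i][p] = Math.Pow(x[i], degree - p); // highest power first, down to x^1
    }
    return X;
}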