Does the relevant polynomial regression degree depend on the number of points or other parameters?

0 votes

I am studying the stability of numerical derivatives as a function of the step size used to compute them. With a 15-point derivative (obtained by the finite-difference method), I get the following plot (each multipole "l" corresponds to a parameter on which the derivative depends, but that doesn't matter here):

[figure: derivatives computed with the 15-point stencil]

Now, I would like to compare this 15-point derivative with the derivatives computed with 3, 5 and 7 points. To do this, I simply plotted the relative differences (with absolute values):

abs(f'_15_pts - f'_3_pts)/f'_3_pts for comparison between 15 and 3 points
abs(f'_15_pts - f'_5_pts)/f'_5_pts for comparison between 15 and 5 points
abs(f'_15_pts - f'_7_pts)/f'_7_pts for comparison between 15 and 7 points
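
Concretely, a minimal sketch of how such relative differences can be computed with NumPy (f3, f5, f7 and f15 are just placeholder arrays standing in for the derivative values at each step size):

import numpy as np

# Placeholder arrays standing in for the derivatives computed at each
# of the 140 step sizes with the 3-, 5-, 7- and 15-point stencils
rng = np.random.default_rng(0)
f3, f5, f7, f15 = [1.0 + 0.01 * rng.standard_normal(140) for _ in range(4)]

# Relative differences; abs() on the denominator keeps the ratio
# positive so that log10 can be taken later
errorRelative_3_15 = np.abs(f15 - f3) / np.abs(f3)
errorRelative_5_15 = np.abs(f15 - f5) / np.abs(f5)
errorRelative_7_15 = np.abs(f15 - f7) / np.abs(f7)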

My issue occurs when I do a polynomial regression on the relative variations above for the multipole l = 366.42 (the problem persists for the other multipoles).

For example, when I do a cubic regression (degree 3), I get the following plot:

[figure: cubic-regression fit of the relative errors]

I don't know exactly how to interpret these results: maybe it means that the relative error is largest between the 3-point and 15-point derivatives, and smaller between 5 and 15 points, as it is between 7 and 15 points.

Then, if I do, for example, a polynomial regression of degree 10, I get the following plot:

[figure: degree-10 regression fit of the relative errors]

As you can see, this is totally different from the cubic regression above.

So I don't know which degree to use for the polynomial regression, i.e., which degree is relevant to get valid physical results: 3, 4, 6 or maybe 10. If I take too large a degree, the results aren't valid, since I get Dirac-like peaks and straight lines.

I guess the right polynomial degree depends on the initial number of points of the interpolated curve (140 points for the first figure) and maybe on other parameters.

In conclusion, could anyone tell me whether there is a criterion for determining which polynomial degree to apply, i.e., the degree that is most relevant from a relative-error point of view?

If I don't do a regression, I get the following plot, which is too blurred to interpret:

[figure: raw relative errors, without regression]

That's why I would like to fit these data: to see the differences between the relative-error curves more clearly.

PS: here are the code snippets for the polynomial regression:

import numpy as np
import numpy.polynomial.polynomial as poly
import matplotlib.pyplot as plt

# stepNewArray, errorRelative_*, stepArrayId and colorDerPlot are defined earlier
stepForFit = np.logspace(-8.0, -1.0, 10000)

# Degree-10 fits of log10(relative error) against log10(step)
coefs_3_15 = poly.polyfit(np.log10(stepNewArray), np.log10(errorRelative_3_15), 10)
ffit_3_15 = poly.polyval(np.log10(stepForFit), coefs_3_15)
coefs_5_15 = poly.polyfit(np.log10(stepNewArray), np.log10(errorRelative_5_15), 10)
ffit_5_15 = poly.polyval(np.log10(stepForFit), coefs_5_15)
coefs_7_15 = poly.polyfit(np.log10(stepNewArray), np.log10(errorRelative_7_15), 10)
ffit_7_15 = poly.polyval(np.log10(stepForFit), coefs_7_15)

# Plot the fitted curves, converting back from log10 space
plt.plot(stepForFit[stepArrayId], np.power(10, ffit_3_15[stepArrayId]), colorDerPlot[0])
plt.plot(stepForFit[stepArrayId], np.power(10, ffit_5_15[stepArrayId]), colorDerPlot[1])
plt.plot(stepForFit[stepArrayId], np.power(10, ffit_7_15[stepArrayId]), colorDerPlot[2])

UPDATE 1: Since I have no hypothesis (or model) for the value of the relative error, I cannot put a priori constraints on the degree of the polynomial that should best fit the data.

But maybe I have a clue: the derivatives I computed use 3, 5, 7 and 15 points, so their truncation errors are O(h^2), O(h^4), O(h^6) and O(h^14) respectively.

For example, for the 3-point derivative, the Taylor expansions are:

f(x+h) = f(x) + h f'(x) + (h^2/2) f''(x) + (h^3/6) f'''(x) + O(h^4)
f(x-h) = f(x) - h f'(x) + (h^2/2) f''(x) - (h^3/6) f'''(x) + O(h^4)

and so the final expression of the derivative is:

f'(x) = (f(x+h) - f(x-h)) / (2h) + O(h^2)

By the way, I don't understand why we pass from O(h^4) in the expansions to O(h^2) in the final expression.

But the main issue is that, for the moment, I have no hypothesis about the polynomial degree that I should apply.

Maybe I should test a range of polynomial degrees and compute the chi2 each time; the minimal chi2 would then give me the right degree to use.

What do you think about this? Do NumPy or Python already provide specific functions for this kind of study?
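
Here is a minimal sketch of the scan I have in mind (purely illustrative: the chi2 here is the plain sum of squared residuals, since I have no error bars on these points):

import numpy as np
import numpy.polynomial.polynomial as poly

def best_degree_by_chi2(x, y, degrees=range(1, 16)):
    """Return (degree, chi2) for the least-squares fit with minimal chi2,
    where chi2 is the plain sum of squared residuals (no error bars)."""
    best_deg, best_chi2 = None, np.inf
    for deg in degrees:
        coefs = poly.polyfit(x, y, deg)
        chi2 = np.sum((y - poly.polyval(x, coefs)) ** 2)
        if chi2 < best_chi2:
            best_deg, best_chi2 = deg, chi2
    return best_deg, best_chi2

# e.g. best_degree_by_chi2(np.log10(stepNewArray), np.log10(errorRelative_3_15))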

UPDATE 2: I tried to determine, over the range of degrees 1 to 15, the polynomial that best fits the data. My criterion was to fit a polynomial of each degree and then compute the chi2 between the fitted values and the "experimental" data; whenever the new chi2 was lower than the previous one, I kept that degree for the regression.

Unfortunately, for each of the 3-, 5- and 7-point derivatives, this search for the "ideal degree" always returns the maximum degree of the interval explored.

OK, the chi2 is minimal for the highest degree, but this doesn't correspond to physical results. Don't forget that below 10^-4 the behavior of Cl' is chaotic, so I don't expect a physical interpretation of the convergence of the derivatives as the number of points increases there.

But the interesting area is above 10^-4, where I have more stability.

Given that my method of selecting the best degree via the chi2 doesn't work (it always returns the maximal degree of the explored range), is there another method to get a good fit? I know this is difficult, given the chaotic region at small steps.

One last thing: the cubic regression (degree 3) gives nice curves, but I don't understand why this works only for degree 3 and not for higher degrees.

As someone said in the comments, the regression is overfitted at higher degrees: how can I fix this?

Mar 15, 2022 in Machine Learning by Nandini
• 5,480 points
620 views

1 answer to this question.

0 votes

I must admit that the wording of your inquiry perplexes me, so I can only offer a generic response. Next time, you might want to break your huge question up into multiple smaller ones.

To begin, I believe your question is: how important is the number of points in a differentiation stencil when polynomial regression is applied to the derivative afterwards?
In general, increasing the number of points in a stencil increases the precision of the derivative computation. Substituting Taylor expansions into the finite-difference formula demonstrates this: after the cancellation of terms, you are left with a higher-order term that estimates the leading-order error you make. The underlying assumption is that the function whose derivative you compute (in your case C_l) is smooth on the interval covered by the stencil. That is to say, if your function behaves badly on your 15-point stencil, that derivative is essentially useless.
In polynomial regression, on the other hand, the degree is normally a free parameter chosen by the user, because the user may know that their data behave like a polynomial up to a particular degree without knowing the coefficients. If you know something like that about your data, you can set the degree yourself.
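
To make the stencil point concrete, here is a small illustration (my own sketch, using sin as an arbitrary smooth test function) comparing 3-point and 5-point central differences; the errors shrink roughly as h^2 and h^4 respectively:

import numpy as np

def d1_3pt(f, x, h):
    # 3-point central difference, truncation error O(h^2)
    return (f(x + h) - f(x - h)) / (2.0 * h)

def d1_5pt(f, x, h):
    # 5-point central difference, truncation error O(h^4)
    return (f(x - 2*h) - 8*f(x - h) + 8*f(x + h) - f(x + 2*h)) / (12.0 * h)

x0, exact = 1.0, np.cos(1.0)  # d/dx sin(x) = cos(x)
for h in (1e-1, 1e-2, 1e-3):
    print(f"h={h:.0e}  3-pt error={abs(d1_3pt(np.sin, x0, h) - exact):.2e}"
          f"  5-pt error={abs(d1_5pt(np.sin, x0, h) - exact):.2e}")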

Now let's get down to business with your specific issue.

  • Unless you know further information about your data, there is no way to determine an ideal degree. The degree is a hyperparameter. If you want an optimum, you'll need to add more information, such as "I want the lowest-degree polynomial that fits the data with an error below epsilon" (see the sketch after this list).

  • Overfitting can be avoided by using a polynomial with a lower degree. If that doesn't work, least-squares regression isn't the method for you. 

  • In that case, investigate a regression method that uses a different metric, preprocess the data, or employ a non-polynomial fit (fit a function of a given shape, using Levenberg-Marquardt, for instance).

  • The 15-point derivative appears suspect; your data are unlikely to have this level of smoothness. If you have a valid justification, please explain; otherwise, use 2 points for the first derivative and 3 or 5 points for the second.
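
To illustrate the first bullet, here is a minimal sketch of such a criterion (eps and the helper name are just illustrative choices): pick the smallest degree whose fit keeps every residual below a tolerance.

import numpy as np
import numpy.polynomial.polynomial as poly

def lowest_degree_within_eps(x, y, eps, max_degree=15):
    """Return the smallest degree whose fit keeps the maximum absolute
    residual below eps, or None if no degree in range qualifies."""
    for deg in range(1, max_degree + 1):
        coefs = poly.polyfit(x, y, deg)
        if np.max(np.abs(y - poly.polyval(x, coefs))) < eps:
            return deg
    return None

Unlike a raw chi2 scan, this stops at the first degree that is good enough, which naturally favors low degrees and limits overfitting.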

Regarding your Landau-symbol (big-O) question: the order does drop from four to two. When the two Taylor expansions are subtracted, the even-order terms cancel, leaving 2h f'(x) plus a term of order h^3; dividing by 2h turns that leading term into an O(h^2) error.

answered Mar 17, 2022 by Dev
• 6,000 points
