I have a set of data points (x and y in the code below) and I am trying to fit a straight line of best fit through them using scipy.optimize.curve_fit. My code produces a line, but not the line of best fit. I have tried giving the function initial guesses for my gradient and my intercept, but each time it produces the exact same line, which does not fit my data points.
The blue dots are my data points; the red line should be fitted to them:

If anyone could point out where I am going wrong, I would be extremely grateful:
import numpy as np
import matplotlib.pyplot as mpl
import scipy as sp
import scipy.optimize as opt
x=[1.0,2.5,3.5,4.0,1.1,1.8,2.2,3.7]
y=[6.008,15.722,27.130,33.772,5.257,9.549,11.098,28.828]
trialX = np.linspace(1.0,4.0,1000) #Trial values of x
def f(x,m,c): #Defining the function y(x)=(m*x)+c
    return (x*m)+c
popt,pcov=opt.curve_fit(f,x,y) #Returning popt and pcov
ynew=f(trialX,*popt)
mpl.plot(x,y,'bo')
mpl.plot(trialX,ynew,'r-')
mpl.show()
SciPy's curve_fit() function performs curve fitting via nonlinear least squares. It takes the mapping function to fit along with the input and output data; the mapping function must take the input data as its first argument, followed by the parameters to be fitted. It returns popt, an array of optimal parameter values such that the sum of the squared residuals of f(xdata, *popt) - ydata is minimized, and pcov, a 2-D array holding the estimated covariance of popt; the diagonal entries give the variance of each parameter estimate.
If pcov comes back filled with inf, it means that the fit could not determine the uncertainties (variances) of the fitting parameters.
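As a toy illustration (not from the original post; it just reuses two of the question's points): fitting the two-parameter line to only two data points leaves no residual degrees of freedom, so curve_fit warns that the covariance could not be estimated and fills pcov with inf:
import numpy as np
import scipy.optimize as opt
def f(x,m,c): #Same straight-line model as in the question
    return (x*m)+c
#Two points determine the line exactly, so no covariance can be estimated
popt,pcov=opt.curve_fit(f,np.array([1.0,4.0]),np.array([6.008,33.772]))
print(popt) #Exact gradient and intercept through the two points
print(pcov) #[[inf inf] [inf inf]]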
In short, scipy.optimize.curve_fit() finds the best-fit parameters using a least-squares fit: it fits the model to the data and returns the optimal set of parameters for the defined function that best describes the provided observations. Applied to the data from the question, the full fit looks like the sketch below.
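A minimal sketch reusing the x and y from the question; the p0 starting guess and the printed uncertainties are illustrative additions, not part of the original code:
import numpy as np
import scipy.optimize as opt
x=np.array([1.0,2.5,3.5,4.0,1.1,1.8,2.2,3.7])
y=np.array([6.008,15.722,27.130,33.772,5.257,9.549,11.098,28.828])
def f(x,m,c): #Mapping function: input data first, then the parameters to fit
    return (x*m)+c
#p0 is an optional starting guess for (m,c); curve_fit defaults to 1.0 for each parameter
popt,pcov=opt.curve_fit(f,x,y,p0=[1.0,0.0])
perr=np.sqrt(np.diag(pcov)) #One-sigma uncertainties from the diagonal of pcov
print("m = %.3f +/- %.3f" % (popt[0],perr[0]))
print("c = %.3f +/- %.3f" % (popt[1],perr[1]))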
You could alternatively use numpy.polyfit to get the line of best fit:
import numpy as np
import matplotlib.pyplot as mpl
x=[1.0,2.5,3.5,4.0,1.1,1.8,2.2,3.7]
y=[6.008,15.722,27.130,33.772,5.257,9.549,11.098,28.828]
trialX = np.linspace(1.0,4.0,1000) #Trial values of x
#get the first-order fit coefficients (gradient and intercept)
fit = np.polyfit(x, y, 1)
#apply the fit to the trial x values
ynew = trialX * fit[0] + fit[1]
mpl.plot(x,y,'bo')
mpl.plot(trialX,ynew,'r-')
mpl.show()
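If you prefer, np.poly1d can wrap those coefficients into a callable polynomial; a small variant of the block above (same data, same degree-1 fit) would be:
import numpy as np
import matplotlib.pyplot as mpl
x=[1.0,2.5,3.5,4.0,1.1,1.8,2.2,3.7]
y=[6.008,15.722,27.130,33.772,5.257,9.549,11.098,28.828]
trialX = np.linspace(1.0,4.0,1000) #Trial values of x
p = np.poly1d(np.polyfit(x, y, 1)) #p is callable: p(x) = fit[0]*x + fit[1]
mpl.plot(x,y,'bo')
mpl.plot(trialX,p(trialX),'r-')
mpl.show()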
Here is the output (the data points with the fitted red line):