Thursday, July 14, 2011

Polynomial curve fitting

We have already seen how to fit a given set of points by minimizing an error function; now we will see how to find a fitting polynomial for the data using the function polyfit provided by numpy:
from numpy import *
import pylab

# data to fit
x = random.rand(6)
y = random.rand(6)

# fit the data with a 4th degree polynomial
z4 = polyfit(x, y, 4)
p4 = poly1d(z4) # construct the polynomial from its coefficients

# fit the data with a 5th degree polynomial
z5 = polyfit(x, y, 5)
p5 = poly1d(z5)

xx = linspace(0, 1, 100)
pylab.plot(x, y, 'o', xx, p4(xx),'-g', xx, p5(xx),'-b')
pylab.legend(['data to fit', '4th degree poly', '5th degree poly'])
pylab.axis([0,1,0,1])
pylab.show()

Let's see the two polynomials:
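
As a quick numerical comparison alongside the plot (a sketch, not part of the original post), one could also print the sum of squared residuals of each fit on the sample points; with only six points, the 5th degree polynomial should interpolate them almost exactly:
from numpy import polyfit, poly1d, random

# a sketch (not from the original post): compare the two fits
# on the sample points with the sum of squared residuals
x = random.rand(6)
y = random.rand(6)
p4 = poly1d(polyfit(x, y, 4))
p5 = poly1d(polyfit(x, y, 5))
print 'SSR, 4th degree:', ((p4(x) - y)**2).sum()
print 'SSR, 5th degree:', ((p5(x) - y)**2).sum()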

Saturday, May 28, 2011

Data fitting using fmin

We have already seen how to find the minimum of a function using fmin; in this example we will see how to use it to fit a set of data with a curve by minimizing an error function:
from pylab import *
from numpy import *
from numpy.random import normal
from scipy.optimize import fmin

# parametric function, x is the independent variable
# and c are the parameters.
# it's a polynomial of degree 2
fp = lambda c, x: c[0]+c[1]*x+c[2]*x*x
real_p = rand(3)

# error function to minimize
e = lambda p, x, y: (abs((fp(p,x)-y))).sum()

# generating data with noise
n = 30
x = linspace(0,1,n)
y = fp(real_p,x) + normal(0,0.05,n)

# fitting the data with fmin
p0 = rand(3) # initial parameter value
p = fmin(e, p0, args=(x,y))

print 'estimated parameters: ', p
print 'real parameters: ', real_p

xx = linspace(0,1,n*3)
plot(x,y,'bo', xx,fp(real_p,xx),'g', xx, fp(p,xx),'r')

show()
The following figure will be shown, with the original curve used to generate the noisy data in green, the noisy data in blue, and the curve found by the minimization process in red:

The parameters will also be printed:
Optimization terminated successfully.
         Current function value: 0.861885
         Iterations: 77
         Function evaluations: 146
estimated parameters:  [ 0.92504602  0.87328979  0.64051926]
real parameters:  [ 0.86284356  0.95994753  0.67643758]
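
As a quick sanity check (a sketch, not part of the original post), one can also compare the value of the error function at the estimated and at the real parameters, assuming the script above has just been run:
# a sketch, assuming e, p, real_p, x and y from the script above
print 'error with estimated parameters:', e(p, x, y)
print 'error with real parameters:', e(real_p, x, y)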

Monday, May 9, 2011

How to find the roots of a function with fsolve

The function fsolve, provided by scipy, returns the roots of the (non-linear) equations defined by func(x) = 0 given a starting estimate. We will see how to use fsolve to find a root of the function f(x) = x^3:

from scipy.optimize import fsolve
import pylab
import numpy

pow3 = lambda x : x**3

result = fsolve(pow3,10) # starting from x = 10
print result
x = numpy.linspace(-1,1,50)
pylab.plot(x,pow3(x),result,pow3(result),'ro')
pylab.grid(True)
pylab.show()
In the following graph we can see f(x) (blue curve) and the solution found (red dot):
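
As a small extension (a sketch, not part of the original post), fsolve can be pointed at different roots of the same function simply by changing the starting estimate; here with the hypothetical function g(x) = x**2 - 4:
from scipy.optimize import fsolve

# a sketch (not from the original post): a function with two roots
g = lambda x: x**2 - 4

print fsolve(g, 3)   # starting near x = 3, converges to 2
print fsolve(g, -3)  # starting near x = -3, converges to -2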

Wednesday, April 27, 2011

How to find the minimum of a function using fmin from scipy

In this example we will see how to use the function fmin to minimize a function. The function fmin is contained in the optimize module of the scipy library. It uses the downhill simplex algorithm to find the minimum of an objective function, starting from an initial guess given by the user. In the example we will start from two different initial points to compare the results. Here's the code:
import numpy
import pylab
from scipy.optimize import fmin

# objective function
rsinc = lambda x: -1 * numpy.sin(x)/x

x0 = -5 # start from x = -5
xmin0 = fmin(rsinc,x0)

x1 = -4 # start from x = -4
xmin1 = fmin(rsinc,x1)

# plot the function
x = numpy.linspace(-15,15,100)
y = rsinc(x)
pylab.plot(x,y)
# plot x0 and the minimum found starting from x0
pylab.plot(x0,rsinc(x0),'bd',xmin0,rsinc(xmin0),'bo')
# plot x1 and the minimum found starting from x1
pylab.plot(x1,rsinc(x1),'rd',xmin1,rsinc(xmin1),'ro')
pylab.axis([-15,15,-1.3,0.3])
pylab.show()
The function fmin will print some details about the iterative process performed:
Optimization terminated successfully.
         Current function value: -0.128375
         Iterations: 18
         Function evaluations: 36
Optimization terminated successfully.
         Current function value: -1.000000
         Iterations: 19
         Function evaluations: 38

And the graphical result should be as follows:


The blue dot is the minimum found starting from the blue diamond (x=-5) and the red dot is the minimum found starting from the red diamond (x=-4). In this case, starting from x=-5 fmin gets stuck in a local minimum, while starting from x=-4 it reaches the global minimum.
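
A simple way to reduce the chance of ending up in a local minimum (a sketch, not from the original post) is to run fmin from several starting points and keep the best result:
import numpy
from scipy.optimize import fmin

rsinc = lambda x: -1 * numpy.sin(x)/x

# a sketch: restart fmin from a few starting points (avoiding x=0,
# where rsinc is not defined) and keep the lowest minimum found
starts = [-12., -8., -4., 3., 9.]
candidates = [fmin(rsinc, s, disp=0) for s in starts]
best = min(candidates, key=lambda xm: rsinc(xm)[0])
print 'best minimum found at x =', best[0]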