Kalman Filter module to correct ARIMA forecast result

I am currently writing a script to forecast wind speed using ARIMA, and I get fairly good results for very short-term forecasts. I was wondering which Kalman filter implementation in Python would let me lower the root mean square error of my forecast. I am using only wind speed as input data, and I want to see some improvement in my forecast errors.

Can you give me some guidelines on which module and functions to use, and be a little specific about why I can't use the others? Do I have to use pykalman (the Kalman smoother)? Why? How? Or tsa.kalmanf.kalmanfilter? (In that module I see two different Kalman filters: one named with a capital K that has no docstring, and another in all lower-case letters.) Any help would be great!


I would use this plain-NumPy implementation from the SciPy Cookbook:

# Kalman filter example demo in Python

# A Python implementation of the example given in pages 11-15 of "An
# Introduction to the Kalman Filter" by Greg Welch and Gary Bishop,
# University of North Carolina at Chapel Hill, Department of Computer
# Science, TR 95-041,
# http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html

# by Andrew D. Straw

import numpy
import pylab

# initial parameters
n_iter = 50
sz = (n_iter,) # size of array
x = -0.37727 # truth value (typo in example at top of p. 13 calls this z)
z = numpy.random.normal(x,0.1,size=sz) # observations (normal about x, sigma=0.1)

Q = 1e-5 # process variance

# allocate space for arrays
xhat=numpy.zeros(sz)      # a posteriori estimate of x
P=numpy.zeros(sz)         # a posteriori error estimate
xhatminus=numpy.zeros(sz) # a priori estimate of x
Pminus=numpy.zeros(sz)    # a priori error estimate
K=numpy.zeros(sz)         # gain or blending factor

R = 0.1**2 # estimate of measurement variance, change to see effect

# initial guesses
xhat[0] = 0.0
P[0] = 1.0

for k in range(1,n_iter):
    # time update
    xhatminus[k] = xhat[k-1]
    Pminus[k] = P[k-1]+Q

    # measurement update
    K[k] = Pminus[k]/( Pminus[k]+R )
    xhat[k] = xhatminus[k]+K[k]*(z[k]-xhatminus[k])
    P[k] = (1-K[k])*Pminus[k]

pylab.plot(z,'k+',label='noisy measurements')
pylab.plot(xhat,'b-',label='a posteriori estimate')
pylab.axhline(x,color='g',label='truth value')

valid_iter = range(1,n_iter) # Pminus not valid at step 0
pylab.plot(valid_iter,Pminus[valid_iter],label='a priori error estimate')
pylab.legend()
pylab.show()
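To tie this back to your use case, here is a sketch (my own, not part of the Cookbook example) of how you might measure whether the same scalar filter actually lowers RMSE. It uses synthetic data standing in for wind-speed readings, with the filter wrapped in a reusable function:

```python
import numpy as np

def kalman_1d(z, Q=1e-5, R=0.1**2, x0=0.0, P0=1.0):
    """Scalar Kalman filter: same recursion as the loop above."""
    n = len(z)
    xhat = np.zeros(n)
    P = np.zeros(n)
    xhat[0], P[0] = x0, P0
    for k in range(1, n):
        # time update
        xm = xhat[k-1]
        Pm = P[k-1] + Q
        # measurement update
        K = Pm / (Pm + R)
        xhat[k] = xm + K * (z[k] - xm)
        P[k] = (1 - K) * Pm
    return xhat

rng = np.random.default_rng(0)
truth = -0.37727                       # synthetic "true" value, as above
z = rng.normal(truth, 0.1, size=200)   # stand-in for your measurements
xhat = kalman_1d(z)

# Compare RMSE of raw measurements vs. filtered estimates, after a burn-in
# of 50 steps so the filter has converged.
rmse_raw = np.sqrt(np.mean((z[50:] - truth) ** 2))
rmse_filt = np.sqrt(np.mean((xhat[50:] - truth) ** 2))
print(rmse_raw, rmse_filt)
```

For your ARIMA forecasts you would replace `z` with your observed series (or forecast residuals) and tune `Q` and `R` to your data; the values here are just the Cookbook defaults.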

A common misconception about the Kalman filter is that the noise must be normally distributed. It need not be: Gaussian noise is only required for the filter to be optimal among all estimators; with non-Gaussian noise it is still the best linear estimator.

Hope this helps and if you require further explanation, do leave a comment.
