Reportedly, neural networks are able to learn complex rules. Let's see if they can learn to predict the next value of a sine wave...
Predicting from two past values
Start by generating the values of a sine wave described by cos(6t), sampled with a period of 0.1 s, and build samples that pair the two previous values with the next one:
import numpy as np
from pybrain.datasets import SupervisedDataSet
from pybrain.tools.shortcuts import buildNetwork
from pybrain.structure import TanhLayer
from pybrain.supervised.trainers import BackpropTrainer
t = np.linspace(0, 10, 101)    # 0.1 s sampling period
s1 = np.cos(6*t)               # training signal
ds = SupervisedDataSet(2, 1)   # 2 past values in, 1 target out
for ix in range(2, 101):
    ds.addSample(s1[ix-2:ix], s1[ix])
Then, build the neural network
net = buildNetwork(2, 3, 1, hiddenclass=TanhLayer)
trainer = BackpropTrainer(net, ds)
Make a different dataset for testing. Note that it has the same frequency, only shifted in time:
# Test dataset: same frequency, shifted in time
dst = SupervisedDataSet(2, 1)
st = np.cos(6*(t-0.11))
for ix in range(2, 101):
    dst.addSample(st[ix-2:ix], st[ix])
Define a function to test the quality of the resulting network:
def evalnet():
    out = net.activateOnDataset(dst)
    stderr = np.std(out.T - st[2:101])
    print 'std error :' + str(stderr)
Then, repeatedly train the network and display the result:
trainer.trainEpochs(200)
evalnet()
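The full example_sine.py script isn't reproduced here; judging from the four error values printed per run, the train/evaluate pair is presumably repeated a few times, roughly like this (a sketch, the exact number of repetitions is an assumption):

# Sketch: repeat training and evaluation; the count of 4 is assumed
for _ in range(4):
    trainer.trainEpochs(200)
    evalnet()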
Here are the results:
Run #1
In [81]: execfile( 'example_sine.py')
std error :0.475135957881
std error :0.00528582169261
std error :0.00420217821467
std error :0.00388770773851
Run #2
In [82]: execfile( 'example_sine.py')
std error :0.825420526812
std error :0.0231771366421
std error :0.0206355959287
std error :0.0188795980662
Run #3
In [83]: execfile( 'example_sine.py')
std error :0.790629966353
std error :0.0095095475289
std error :0.0078286464515
std error :0.00764017583539
Is the network able to predict cos(5t) if trained on cos(6t)?
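The modified script isn't shown; presumably only the test signal changes to cos(5t) while the training set stays at cos(6t), roughly like this (a sketch):

# Sketch: test set built from cos(5t); training set still uses cos(6t)
st = np.cos(5*(t-0.11))
dst = SupervisedDataSet(2, 1)
for ix in range(2, 101):
    dst.addSample(st[ix-2:ix], st[ix])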
In [88]: execfile( 'example_sine.py')
std error :0.981991629948
std error :0.0691771695798
std error :0.0695395180253
std error :0.0691409803733
In [89]: execfile( 'example_sine.py')
std error :1.20686164247
std error :0.0741454399328
std error :0.0724747717285
std error :0.0700249455395
So it doesn't perform as well as on the training frequency, but it is fair enough!
Now let's set up a training dataset with 50% of the data corresponding to cos(5t) and 50% corresponding to cos(6t), and test with a similar dataset.
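The mixed-frequency construction isn't shown in the original script; one plausible way to build it is simply to add samples from both signals (a sketch):

# Sketch: mixed training set, half cos(5t) samples and half cos(6t) samples
ds = SupervisedDataSet(2, 1)
for s in (np.cos(5*t), np.cos(6*t)):
    for ix in range(2, 101):
        ds.addSample(s[ix-2:ix], s[ix])

The results: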
In [123]: execfile( 'example_sine.py')
std error :0.513686241649
std error :0.0340440621768
std error :0.0339060173516
std error :0.0338778895194
In [125]: execfile( 'example_sine.py')
std error :0.669916052193
std error :0.0395873020069
std error :0.0374792495802
std error :0.0359768805783
In [126]: execfile( 'example_sine.py')
std error :0.696332858547
std error :0.037025395267
std error :0.0363592341024
std error :0.03641336896
How well does it predict a sine wave of a different amplitude?
tt = np.linspace(0, 0.2, 3)   # three consecutive sample times, 0.1 s apart
In [153]: v=0.5*np.cos(5*tt-4); print v;net.activate([v[0],v[1]])
[-0.32682181 -0.46822834 -0.49499625]
Out[153]: array([-0.58104487])
Not quite right but, again, fair enough!
What happens with a network with more hidden neurons? 2,3,1 -> 2,5,1
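Presumably only the buildNetwork call changes, with the rest of the script left as before:

# 2-5-1 network: five hidden tanh units instead of three
net = buildNetwork(2, 5, 1, hiddenclass=TanhLayer)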
In [156]: execfile( 'example_sine.py')
std error :1.04554148959
std error :0.0385415333149
std error :0.0388701113238
std error :0.0386260279819 <- similar to before
In [157]: v=0.5*np.cos(5*tt-4); print v;net.activate([v[0],v[1]])
[-0.32682181 -0.46822834 -0.49499625]
Out[157]: array([-0.48321326])
And with another hidden layer? 2,3,1 -> 2,3,3,1
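Again, only the network construction should need to change:

# 2-3-3-1 network: two hidden tanh layers of three units each
net = buildNetwork(2, 3, 3, 1, hiddenclass=TanhLayer)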
In [161]: execfile( 'example_sine.py')
std error :0.788084537967
std error :0.0435729774591
std error :0.041549014995
std error :0.0398895692455
In [162]: v=0.2*np.cos(5*tt-2); print v;net.activate([v[0],v[1]])
[-0.08322937 0.01414744 0.10806046]
Out[162]: array([ 0.00811018])
With 2,3,1 -> 2,4,4,1 it performs more or less the same
Predicting from three past values
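The script example_sine3.py isn't reproduced in the post; presumably the samples now use three past values and evalnet also reports the maximum error, along these lines (a sketch; the hidden layer size is an assumption):

# Sketch of the changes assumed in example_sine3.py
ds = SupervisedDataSet(3, 1)          # dst is built the same way from st
for ix in range(3, 101):
    ds.addSample(s1[ix-3:ix], s1[ix])
net = buildNetwork(3, 3, 1, hiddenclass=TanhLayer)   # hidden size assumed
trainer = BackpropTrainer(net, ds)
def evalnet():
    out = net.activateOnDataset(dst)
    err = out.T - st[3:101]
    print 'std error :' + str(np.std(err))
    print 'max error :' + str(np.max(np.abs(err)))

For the one-off tests further down, tt apparently spans four samples (np.linspace(0, 0.3, 4)), judging from the four printed values.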
In [203]: execfile( 'example_sine3.py')
std error :0.849241666555
max error :2.67855045373
std error :0.0536682726789
max error :0.120103597074
std error :0.0475902319921
max error :0.108592376129
std error :0.0422407920624
max error :0.0944045487712
In [204]: v=0.7*np.cos(5.5*tt-0.2); print v;net.activate([v[0],v[1],v[2]])
[ 0.6860466 0.6575609 0.43512698 0.08435194]
Out[204]: array([ 0.07766859])
In [205]: v=0.5*np.cos(4*tt-0.2); print v;net.activate([v[0],v[1],v[2]])
[ 0.49003329 0.49003329 0.41266781 0.27015115]
Out[205]: array([ 0.21284467])
In [206]: v=0.6*np.cos(4.2*tt-0.6); print v;net.activate([v[0],v[1],v[2]])
[ 0.49520137 0.59030622 0.58280278 0.47399534]
Out[206]: array([ 0.41496148])
Works very well!