
Neural Network Delta Forecast for El Nino 3.4 Anomalies

A slight improvement in error compared to SVR: 9% reduced to 7.9% for the NN, but the maximum deviation range increased a bit.

**IMPORTANT: the error for each of the 6 months stayed at the same level, rather than increasing as it did for SVR**

3 layers: input and hidden layers each of length 37; in other words, the training sample vectors have length 37.

Output layer of length 6, for the next 6 months' forecasts.

300 sweeps over the entire data, each step repeated 30 times, for a total of 9000 learning sessions.

Dara

## Comments

John

I assume the N in NIPS is for Neural Networks. So I did this forecast for you, to be on the safe side. I also wanted to make sure the SVR results are not off.

The code is written in C, fully parallelized on a 16-CPU server via Intel's OpenMP compiler technology.

Hello John

This algorithm assumes smooth functions, and the manipulation of the error terms has a calculus background to it.

Another version of this algorithm could be obtained for continuous functions that are not smooth, with no smooth calculus operators needed.

In this case the error term would be minimized by a global minimizer, e.g. Differential Evolution, and that could run quite fast on the new GPU servers.

Dara

I will go through the code this week and thoroughly test everything to make sure the results are as accurate as possible.