This is not meant to be a detailed explanation, only a brief overview of the general idea. If you need more detail, the web or a bookstore will provide extensive information on this subject.

Biological: |
An intertwined network of neurons. |

Mathematical: |
Multiple layers of rows (vectors) of neurons.

The input neurons handle the direct input from the training data. The input can come from several sources (close, moving averages, technical indicators) and span several time periods (2 days, 15 days, etc.). Being able to span several time periods simultaneously is a great asset. It works on the same principle as a moving average: by averaging enough values you can reduce the noise inherent in the data. Similarly, by spanning several time periods the net can average the noise down to a minimum, helping it focus on the underlying pattern. By optimizing this value, the net finds the time period with the least averaged noise that does not reduce its performance.

The hidden neurons are where the non-linear mapping occurs (sinusoidal activation function). Too few hidden neurons will prevent a net from converging. Too many hidden neurons will cause the net to learn the data rather than generalize from it. Nets that have merely learned the data will perform poorly (this is known as overtraining). By optimizing this value, the net can perform at its best with minimal risk of overtraining.

The output neuron(s) receive the outputs of the hidden neurons. This layer also applies a non-linear mapping (logistic activation function) from the hidden neurons' outputs to the output's inputs. Special data preprocessing allows us to take this output and mathematically reverse it back to an actual value for use in graphs and future outputs. To get multiple days forward, you take this output and temporarily place it back into the input to obtain another day forward; repeating this process takes you X days forward. Since future outputs are wrapped around to produce further future outputs, you can quickly see that the accuracy of the net diminishes with time. The farther out, the less accurate.
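The pipeline described above can be sketched in a few lines of Python. This is only an illustration, not the actual implementation: the weights, the 3-input/2-hidden/1-output sizes, and the 95-110 scaling range are made-up placeholders (a trained net would supply real weights). It shows the sinusoidal hidden layer, the logistic output, the reversal of the preprocessing back to a price, and the feedback loop that steps the forecast X days forward.

```python
import math

def sigmoid(x):
    """Logistic activation used at the output neuron."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hid, b_hid, w_out, b_out):
    """One pass through the net: sinusoidal hidden layer, logistic output."""
    hidden = [math.sin(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(w_hid, b_hid)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

def forecast(window, days, w_hid, b_hid, w_out, b_out):
    """Step `days` forward by feeding each output back into the input."""
    preds = []
    window = list(window)
    for _ in range(days):
        y = forward(window, w_hid, b_hid, w_out, b_out)
        preds.append(y)
        window = window[1:] + [y]   # slide the input window forward one day
    return preds

# --- Demo with placeholder weights and scaling range (both hypothetical) ---
lo, hi = 95.0, 110.0                       # preprocessing scaling range
scale   = lambda p: (p - lo) / (hi - lo)   # price -> [0, 1] for the net
unscale = lambda y: y * (hi - lo) + lo     # net output -> actual price

w_hid = [[0.5, -0.3, 0.8], [0.2, 0.7, -0.4]]   # 2 hidden neurons, 3 inputs
b_hid = [0.1, -0.2]
w_out = [0.6, -0.5]
b_out = 0.05

closes = [100.0, 102.0, 101.0]             # last 3 closing prices
window = [scale(p) for p in closes]
preds = [unscale(y) for y in forecast(window, 5, w_hid, b_hid, w_out, b_out)]
print(preds)   # five successive one-day-ahead estimates
```

Note how each predicted value replaces the oldest input before the next pass; this is exactly why accuracy degrades with each extra day forward, since later predictions are built on earlier ones.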
What you are really looking at is this: if the current pattern the net found and learned were to continue for X days, this is the future value it would attain. Since many factors affect current market patterns, the net should be updated regularly to adjust for any new conditions taking place in the given security. |