Introduction

The financial industry is on the cusp of a technological revolution as Artificial Intelligence (AI) is integrated into many facets of its operations and services. Artificial Neural Networks (ANNs), a branch of AI, are a transformative technology with the potential to disrupt traditional banking and financial services.

How are ANNs related to Machine Learning and Deep Learning?

Artificial Neural Networks (ANNs) are computational models inspired by the biological neural networks of the human brain. Machine learning is a subset of AI comprising models and algorithms that enable computers to learn from data and make decisions or predictions without being explicitly programmed.

Deep learning is a further subset of machine learning used to analyse complex patterns and relationships in data, and its models are built on ANNs with many layers of neurons.

Origin of ANNs

The concept behind artificial neural networks (ANNs) is not new; it dates back to the middle of the 20th century. The following are some key milestones in the evolution of ANNs:

1. 1943 - McCulloch-Pitts Neuron Model
The McCulloch-Pitts neuron, proposed by Warren McCulloch and Walter Pitts in 1943, was the first computational model of a neuron. It is a simple model that takes binary inputs, each with an associated weight, and produces a binary output by comparing their weighted sum against a threshold. It laid the groundwork for the development of artificial neurons.

2. 1958 - Perceptron
Frank Rosenblatt introduced the perceptron in 1958: a single artificial neuron trained to perform binary classification tasks. Although the perceptron was later criticised for its limited capacity to learn complex, non-linearly separable patterns, it was an early attempt at creating a computational model inspired by neural processes; a minimal sketch of such a neuron appears after this list.

3. 1986 - Backpropagation algorithm
One of the most important breakthroughs in artificial neural networks came with the backpropagation algorithm, popularised in 1986 by Rumelhart, Hinton, and Williams. Backpropagation trains a neural network by adjusting its weights based on the error between predicted and actual outputs.

4. After 2009 - Deep Learning era
After 2009, the popularity of deep neural networks increased significantly owing to advances in computing power, the availability of large datasets, and innovations in network architectures. These breakthroughs expanded the scope of neural networks into areas such as image recognition, natural language processing, and more.
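To make the first two milestones concrete, the Python sketch below implements a single thresholded neuron in the spirit of the McCulloch-Pitts model and trains it with the perceptron learning rule on the logical AND function. The function names, the AND data, the learning rate, and the epoch count are illustrative assumptions, not a reconstruction of the historical implementations.

```python
# A single thresholded neuron in the spirit of the McCulloch-Pitts model,
# trained with Rosenblatt's perceptron learning rule. The names, the AND
# data, and the hyperparameters below are illustrative assumptions.

def step(weighted_sum, threshold=0.0):
    """Binary activation: output 1 if the weighted sum reaches the threshold."""
    return 1 if weighted_sum >= threshold else 0

def predict(weights, bias, inputs):
    """Weighted sum of the binary inputs plus a bias, passed through the step."""
    return step(sum(w * x for w, x in zip(weights, inputs)) + bias)

def train_perceptron(samples, labels, learning_rate=0.1, epochs=20):
    """Perceptron rule: nudge each weight in proportion to the prediction error."""
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in zip(samples, labels):
            error = target - predict(weights, bias, inputs)
            weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
            bias += learning_rate * error
    return weights, bias

# Learn the logical AND function, a simple linearly separable classification task.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
weights, bias = train_perceptron(samples, labels)
print([predict(weights, bias, s) for s in samples])  # expected: [0, 0, 0, 1]
```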

Learning and training process of ANNs

ANNs consist of interconnected nodes, also known as neurons, organised in layers. Each connection between nodes has an associated weight, and the network learns by adjusting these weights based on the input data and the desired outputs. Although ANNs are powerful tools, it is worth demystifying them by building an understanding of their underlying principles: learning the fundamentals of machine learning and neural networks helps people appreciate the technology without perceiving it as magical.
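As an illustration of this weight-adjustment process, the following sketch trains a tiny network (two inputs, four hidden neurons, one output) with backpropagation on the XOR toy problem using NumPy. The architecture, activation function, learning rate, and number of epochs are illustrative assumptions, not a recommendation for production models.

```python
import numpy as np

def sigmoid(z):
    """Squash a value into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# XOR: a small, non-linearly-separable toy problem (illustrative choice).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs

rng = np.random.default_rng(seed=0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # hidden -> output weights

learning_rate = 0.5
for _ in range(10_000):
    # Forward pass: propagate the inputs through the layers.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: the error between predicted and desired outputs is
    # propagated back through the network (backpropagation).
    output_delta = (output - y) * output * (1 - output)
    hidden_delta = (output_delta @ W2.T) * hidden * (1 - hidden)

    # Gradient-descent updates: adjust the weights to reduce the error.
    W2 -= learning_rate * hidden.T @ output_delta
    b2 -= learning_rate * output_delta.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ hidden_delta
    b1 -= learning_rate * hidden_delta.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # predictions should approach [[0], [1], [1], [0]]
```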

 
 