Neural Network Fundamentals
Before diving into neural networks, it is important to have a grounding in linear algebra, probability, and a few classical machine learning algorithms.
With that background, you can make sense of what a neural network architecture looks like (layers): imagine a set of "perceptrons" arranged in layers. The essence of a neural network is then the following four components:
- Input values
- Weights and bias
- Summation function
- Activation function
You give the model a set of input values; each input carries a weight reflecting how important that variable is. The summation function aggregates the weighted inputs (think of it as an equation) and feeds the result to the activation function, which, simply put, is a transformation that produces the final output.
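The four components above can be sketched in a few lines of code. This is a minimal illustration, not a real library: the input values, weights, bias, and choice of sigmoid activation are all invented for the example.

```python
import math

def perceptron(inputs, weights, bias):
    # Summation function: weighted sum of the inputs, plus the bias
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Activation function: a sigmoid squashes the sum into (0, 1)
    return 1 / (1 + math.exp(-z))

# Illustrative input values, weights, and bias
output = perceptron([1.0, 0.5], [0.4, -0.2], bias=0.1)
print(round(output, 3))  # → 0.599
```

Here the weighted sum is 1.0·0.4 + 0.5·(−0.2) + 0.1 = 0.4, and the sigmoid maps 0.4 to roughly 0.599.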
Neural networks are a form of machine learning and the foundation of deep learning. They mimic the neurons in the brain: a network of artificial neurons connected together, just like actual neurons.
- Neural networks arrange neurons in the following format:
- Neurons are arranged in different layers.
- Each layer receives input and passes output.
- The output of each neuron depends on the weights of its connections.
- These weights are adjusted during the training phase.
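The weight-adjustment step during training can be sketched with the classic perceptron learning rule. This is an illustrative toy, assuming a step activation and a hand-picked learning rate; the function names and the AND-gate dataset are invented for the example.

```python
def step(z):
    # Step activation: fire (1) only when the weighted sum is positive
    return 1 if z > 0 else 0

def train(samples, epochs=10, lr=0.1):
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            z = sum(x * w for x, w in zip(inputs, weights)) + bias
            error = target - step(z)
            # Adjust each weight in proportion to its input and the error
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Learn the logical AND function from labeled examples
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(and_data)
print([step(sum(x * w for x, w in zip(inp, weights)) + bias)
       for inp, _ in and_data])  # → [0, 0, 0, 1]
```

Each pass nudges the weights toward values that reduce the error, which is the essence of the training phase described above; real networks replace this rule with gradient descent over many layers.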