Monday, February 3, 2020

Week Two



We researched neural networks and came to understand the basic structure of a neural network (NN):



Fundamental Algorithms of NN:
- Linear Regression Analysis:
A type of predictive analysis. Its idea is to examine two things. First, does the set of predictor variables do a good job of predicting the dependent variable? Second, which particular variables are significant predictors of the outcome, and how do they affect its magnitude?
The simplest form of the regression equation:
y = c + b*x
where
y: estimated dependent variable score
c: constant
b: regression coefficient
x: score on the independent variable
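As a quick illustration of fitting y = c + b*x, here is a minimal sketch using NumPy's least-squares polynomial fit; the data points are made up for the example:

```python
import numpy as np

# Hypothetical data: x is the independent variable, y the dependent one.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Fit y = c + b*x by ordinary least squares (np.polyfit returns [b, c]).
b, c = np.polyfit(x, y, 1)
print(f"y = {c:.2f} + {b:.2f}*x")
```

The fitted coefficient b tells us how strongly x predicts y, and c is the constant offset.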

- Forward propagation:
One of the core processes of the learning phase, in which the input data flows forward through the network, passing through the hidden layers, where it is processed and then passed on to each successive layer.

- Loss function:
An objective function is a function used to evaluate a candidate solution; when the goal is to minimise it, it is called a loss function.
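A common example of a loss function is mean squared error, sketched below with made-up prediction and target values:

```python
import numpy as np

def mse_loss(predicted, target):
    # Mean squared error: the average squared difference between
    # predictions and targets; training aims to minimise this value.
    return np.mean((predicted - target) ** 2)

print(mse_loss(np.array([1.0, 2.0, 3.0]), np.array([1.5, 2.0, 2.0])))
```

A smaller loss means the network's predictions are closer to the true targets.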

- Activation function:
A function applied to a node's weighted input to produce its output (also called a transfer function).
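Two widely used activation functions, ReLU and sigmoid, can be sketched like this:

```python
import numpy as np

def relu(z):
    # ReLU passes positive values through and clamps negatives to zero.
    return np.maximum(0.0, z)

def sigmoid(z):
    # Sigmoid squashes any real input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.0, 3.0])
print(relu(z))     # negatives become 0
print(sigmoid(z))  # values between 0 and 1
```

Without a non-linear activation like these, stacked layers would collapse into a single linear transformation.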

- Backpropagation:
Gradient descent:
This method, used with backpropagation, finds a minimum of a function by starting at a random location in parameter space and then repeatedly reducing the error until it reaches a local minimum.
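The idea of gradient descent can be sketched on a simple one-parameter function, f(w) = (w - 3)^2, whose minimum is at w = 3; the starting point and learning rate are arbitrary choices for the example:

```python
# Minimal sketch of gradient descent on f(w) = (w - 3)^2.
def grad(w):
    return 2.0 * (w - 3.0)   # derivative of (w - 3)^2

w = 10.0             # starting point in parameter space
learning_rate = 0.1
for _ in range(100):
    w -= learning_rate * grad(w)  # step against the gradient

print(w)  # converges close to the minimum at w = 3
```

In a real network, backpropagation computes these gradients for every weight, and the same update rule is applied to each of them.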




