# Gradient Problem in RNN

## Description

The vanishing gradient problem is a well-known issue in training recurrent neural networks (RNNs). It occurs when gradients (derivatives of the loss with respect to the network's parameters) become too small as they are backpropagated through the network during training. The root cause is repeated multiplication: backpropagation through time multiplies the gradient by the recurrent weight Jacobian and the activation's derivative (at most 1 for tanh) once per time step, so when these factors are below 1 the gradient shrinks exponentially with sequence length. When the gradients vanish, it becomes difficult for the network to learn long-range dependencies, and the training process may slow down or even stagnate. This problem is especially pronounced in traditional (vanilla) RNN architectures.
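To make the effect concrete, below is a minimal NumPy sketch (not part of the original README; the sizes and the 0.1 weight scale are illustrative assumptions) that backpropagates a gradient through a vanilla RNN and prints its norm at a few time steps. The norm collapses toward zero as the gradient flows back in time.

```python
import numpy as np

rng = np.random.default_rng(0)

T, hidden = 50, 16                              # sequence length, hidden size
W_hh = rng.normal(0.0, 0.1, (hidden, hidden))   # deliberately small recurrent weights
b_h = np.zeros(hidden)

# Forward pass: h_t = tanh(W_hh h_{t-1} + x_t + b_h); the input projection is
# folded into x_t to keep the sketch short.
xs = rng.normal(size=(T, hidden))
hs = [np.zeros(hidden)]
for t in range(T):
    hs.append(np.tanh(W_hh @ hs[-1] + xs[t] + b_h))

# Backward pass: push a unit-norm gradient from the last hidden state back to
# the first and watch its norm shrink.
grad = rng.normal(size=hidden)
grad /= np.linalg.norm(grad)
for t in range(T, 0, -1):
    # dL/dh_{t-1} = W_hh^T (dL/dh_t * tanh'(a_t)), where tanh'(a_t) = 1 - h_t**2
    grad = W_hh.T @ (grad * (1.0 - hs[t] ** 2))
    if t % 10 == 0 or t == 1:
        print(f"step {t:3d}: ||dL/dh|| = {np.linalg.norm(grad):.3e}")
```

Gated architectures such as LSTM and GRU mitigate this by giving the gradient an additive path through time rather than a purely multiplicative one.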


## Basic Concept

The learnable parameters of a basic RNN cell are:

1. Weight
2. Bias

Both enter the cell update `h_t = tanh(W_xh x_t + W_hh h_(t-1) + b_h)`, as shown in the sketch below.
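As an illustration, here is a minimal NumPy sketch of a vanilla RNN cell (the class and parameter names are hypothetical, not from this repository), showing where the weight matrices and the bias vector enter the update:

```python
import numpy as np

class VanillaRNNCell:
    """Hypothetical vanilla RNN cell: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)."""

    def __init__(self, input_size: int, hidden_size: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.W_xh = rng.normal(0.0, 0.1, (hidden_size, input_size))   # input weights
        self.W_hh = rng.normal(0.0, 0.1, (hidden_size, hidden_size))  # recurrent weights
        self.b_h = np.zeros(hidden_size)                              # bias

    def step(self, x, h_prev):
        # One time step: combine the projected input, the projected previous
        # hidden state, and the bias, then squash with tanh.
        return np.tanh(self.W_xh @ x + self.W_hh @ h_prev + self.b_h)

# Usage: run the cell over a short random input sequence.
cell = VanillaRNNCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x, h)
print(h.shape)  # (16,)
```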

## Links

You can follow me: