Gradient Descent

What is Stochastic Gradient Descent?

Stochastic Gradient Descent (SGD) is an iterative optimization algorithm commonly used in machine learning and deep learning for training models. It is a variant of standard (batch) Gradient Descent that minimizes the cost or loss function associated with the model's parameters. The key difference is in the word "stochastic": instead of computing the gradient over the entire dataset at every step, SGD updates the parameters using the gradient from a single randomly chosen training example (or a small mini-batch). Each update is noisier, but updates are much cheaper, which usually makes training faster in practice on large datasets.
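As a minimal sketch of the idea, the snippet below fits a straight line y = w*x + b to toy data with SGD, updating the parameters from one randomly picked sample per step. The data, learning rate, and variable names here are illustrative choices, not part of any particular library's API.

```python
import random

# Toy dataset generated from the exactly linear rule y = 2x + 1,
# so SGD should drive (w, b) toward (2.0, 1.0).
data = [(x, 2.0 * x + 1.0) for x in range(-10, 11)]

w, b = 0.0, 0.0   # parameters to learn
lr = 0.01         # learning rate (step size)
random.seed(0)    # fixed seed for reproducibility

for step in range(5000):
    x, y = random.choice(data)   # "stochastic": one random sample per update
    pred = w * x + b
    err = pred - y               # gradient of 0.5 * err**2 w.r.t. pred
    w -= lr * err * x            # gradient step for w
    b -= lr * err                # gradient step for b

print(w, b)  # values close to 2.0 and 1.0
```

Batch Gradient Descent would instead average the gradient over all 21 samples before each update; SGD trades that exactness for many cheap, noisy steps.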