New logarithmic step size for stochastic gradient descent
The step size, often called the learning rate, plays a pivotal role in the efficiency of the stochastic gradient descent (SGD) algorithm. In recent years, multiple step size strategies have emerged for ...
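To make the role of the step size concrete, here is a minimal sketch of SGD with a decaying logarithmic schedule. The excerpt is truncated before the method is stated, so the specific schedule `eta0 / ln(t + e)` and the function names below are illustrative assumptions, not the article's actual formula.

```python
import math
import random

def sgd_log_step(grad, x0, eta0=0.5, steps=500):
    """Run SGD with a hypothetical logarithmic step-size schedule.

    grad: callable returning a (possibly noisy) gradient at x.
    eta0: initial step size.
    """
    x = x0
    for t in range(steps):
        # Illustrative schedule (assumed, not the paper's exact rule):
        # eta_t = eta0 / ln(t + e), so eta_0 = eta0 and eta_t decays slowly.
        eta = eta0 / math.log(t + math.e)
        x -= eta * grad(x)
    return x

# Example: minimize f(x) = (x - 3)^2 using a noisy gradient 2(x - 3) + noise.
random.seed(0)
noisy_grad = lambda x: 2 * (x - 3) + random.gauss(0, 0.1)
x_star = sgd_log_step(noisy_grad, x0=0.0)
```

Because the schedule decays only logarithmically, later iterates still take moderately sized steps, which is the kind of trade-off such schedules aim to exploit.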
Apr 22, 2024