A practical optimisation algorithm for big data applications

Image shows an aerial view of Hurricane Matthew. Improvements to optimisation algorithms enable weather forecasters to make better predictions. Credit: Pixabay

Numerous science and engineering applications require finding the lowest or highest value of a mathematical model. This is usually done computationally by running an optimisation algorithm. When working with big data, which is more complex, arriving at an optimal or close-to-optimal solution becomes much more time-consuming and expensive. The computational efficiency of the algorithm is therefore of paramount importance in such complex scenarios.

Prof Vincent TAN from the Department of Mathematics, NUS and Prof William HASKELL from the Department of Industrial Systems Engineering and Management, NUS, together with their student Renbo ZHAO, proposed several practical strategies which can improve the computational speed of a well-known optimisation method, the stochastic L-BFGS (limited-memory Broyden-Fletcher-Goldfarb-Shanno) algorithm, by several orders of magnitude. The mathematicians also proved theoretically that their proposed algorithms converge much faster than other available algorithms.
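To give a sense of what a stochastic L-BFGS method does, the sketch below shows one simplified way such a step can be organised: mini-batch gradients are fed into the standard L-BFGS two-loop recursion to build a Newton-like search direction from a short memory of curvature pairs. This is a generic illustration, not the authors' accelerated algorithm; function names such as loss_grad and the parameter choices (memory size, batch size, step size) are assumptions made for the example.

```python
import numpy as np

def two_loop_direction(grad, s_hist, y_hist):
    """Return -H*grad, where H approximates the inverse Hessian from the
    stored (s, y) curvature pairs (the standard L-BFGS two-loop recursion)."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):   # newest to oldest
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append(a)
    # initial Hessian scaling from the most recent curvature pair
    gamma = (s_hist[-1] @ y_hist[-1]) / (y_hist[-1] @ y_hist[-1]) if s_hist else 1.0
    r = gamma * q
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):  # oldest to newest
        rho = 1.0 / (y @ s)
        r += (a - rho * (y @ r)) * s
    return -r

def stochastic_lbfgs(loss_grad, x0, n_samples, steps=100, batch=64, m=10, lr=0.1):
    """loss_grad(x, idx) -> average gradient over the data points indexed by idx.
    A simplified stochastic L-BFGS loop with uniformly sampled mini-batches."""
    rng = np.random.default_rng(0)
    x = x0.copy()
    s_hist, y_hist = [], []
    for _ in range(steps):
        idx = rng.choice(n_samples, size=batch, replace=False)  # uniform mini-batch
        g = loss_grad(x, idx)
        x_new = x + lr * two_loop_direction(g, s_hist, y_hist)
        # curvature pair computed on the same mini-batch keeps s and y consistent
        s, y = x_new - x, loss_grad(x_new, idx) - g
        if y @ s > 1e-10:               # store only pairs with positive curvature
            s_hist.append(s); y_hist.append(y)
            if len(s_hist) > m:         # keep a limited memory of m pairs
                s_hist.pop(0); y_hist.pop(0)
        x = x_new
    return x
```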

Several aspects of existing stochastic L-BFGS algorithms can be improved to achieve higher computational efficiency. For example, traditional strategies use uniform sampling at each computational step. Perhaps surprisingly, the NUS mathematicians showed that by using non-uniform sampling instead, the algorithm can run much faster.
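As a rough illustration of the idea, the sketch below shows non-uniform (importance) sampling for a stochastic gradient estimate: each data point is drawn with probability proportional to an importance weight (for example its smoothness constant), and the sampled gradient is rescaled so the estimate remains unbiased. The names and weighting choice here are assumptions for illustration, not the specific scheme analysed in the paper.

```python
import numpy as np

def make_sampler(weights):
    """Return (probabilities, draw), where draw(rng, k) samples k indices
    with probability proportional to the given importance weights."""
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()
    def draw(rng, k):
        return rng.choice(len(p), size=k, replace=True, p=p)
    return p, draw

def nonuniform_gradient(per_example_grad, x, p, idx):
    """Unbiased estimate of the full gradient (1/n) * sum_i grad_i(x):
    each sampled gradient is rescaled by 1 / (n * p_i)."""
    n = len(p)
    grads = np.array([per_example_grad(x, i) / (n * p[i]) for i in idx])
    return grads.mean(axis=0)
```

Replacing the uniform mini-batch draw in the earlier sketch with such a weighted draw (and the matching rescaling) is the kind of change the reported acceleration strategies refer to.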

Prof Tan said, "Large-scale optimisation problems lie at the heart of analysis. It is important to be able to solve them efficiently. Many practical problems, such as training of neural networks for machine learning, are complex in nature and can benefit from stochastic optimisation algorithms. We believe that the theoretical advancements will have significant impact in enhancing the training phases of a variety of machine learning algorithms."

More information: "Stochastic L-BFGS Revisited: Improved Convergence Rates and Practical Acceleration Strategies", Proceedings of Conference on Uncertainty in Artificial Intelligence (UAI), Sydney, Australia, Aug 2017 (to appear).
