
The future of AI is wide, deep, and large

Two networks for classification of concentric rings of red circles and blue diamonds. (a) A one-hidden-layer network with 3 neurons to classify concentric rings whose polytope contains 1 simplex; and (b) a one-hidden-layer network with 6 neurons to classify concentric rings whose polytope comprises 4 simplices. Clearly, the number of simplices is a more meaningful complexity measure of ReLU networks than the number of polytopes. Credit: Journal of Machine Learning Research (2023). https://jmlr.org/papers/volume24/21-0579/21-0579.pdf

ChatGPT has fascinated the public as we begin to explore how generative artificial intelligence (AI) can be useful in our everyday lives. On the back end, scientists are continually advancing AI for potential applications so vast that, by accelerating scientific and technological development, it may change life as we know it.

In research recently published in the Journal of Machine Learning Research, Fenglei Fan, Ph.D. '23, former Rensselaer doctoral student and current research assistant professor of mathematics at The Chinese University of Hong Kong; Rongjie Lai, former Rensselaer associate professor and now a professor of mathematics at Purdue University; and Ge Wang, Ph.D., Clark & Crossan Endowed Chair Professor and director of the Biomedical Imaging Center at Rensselaer, found that analyzing the topology of artificial neural networks illuminated how to best harness the power of AI in the future.

Much like a topological map, the technologies that power AI have three dimensions. ChatGPT, which is all the buzz in the AI world, is a neural network with many layers, also referred to as a deep neural network; the number of layers is the network's depth. Wang and his collaborators found that width, which refers to the number of neurons in a layer, also plays a significant role.
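
To make those two dimensions concrete, here is a minimal NumPy sketch, written for this article rather than taken from the paper, that contrasts a wide, shallow ReLU network with a deep, narrow one:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)

# A "wide" network: a single hidden layer of 64 neurons (depth 1, width 64).
W1, b1 = rng.normal(size=(1, 64)), np.zeros(64)
w_out = rng.normal(size=(64, 1))

def wide_net(x):                       # x has shape (n, 1)
    return relu(x @ W1 + b1) @ w_out

# A "deep" network: four hidden layers of 8 neurons each (depth 4, width 8).
layers = [(rng.normal(size=(1, 8)), np.zeros(8))]
layers += [(rng.normal(size=(8, 8)), np.zeros(8)) for _ in range(3)]
v_out = rng.normal(size=(8, 1))

def deep_net(x):
    h = x
    for W, b in layers:                # compose layer after layer
        h = relu(h @ W + b)
    return h @ v_out
```

Both map a scalar input to a scalar output; they differ only in how their neurons are arranged.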

Interestingly, they found that one type of network may be converted into the other to accomplish a given task, such as regression or classification, which are critical elements of machine learning. (Machine learning is a subset of AI that allows computer-generated predictions without explicit instructions.) In other words, a deep neural network may be converted into a wide neural network, and vice versa.
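
As a toy illustration of such a conversion, hand-built for this article rather than drawn from the paper's construction, the piecewise-linear "hat" function ReLU(1 - |x|) can be written either as a two-hidden-layer ReLU network or as a single, slightly wider hidden layer, and the two compute exactly the same function:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Deep form: two hidden layers. The first computes |x| as ReLU(x) + ReLU(-x);
# the second applies one more ReLU to obtain ReLU(1 - |x|).
def hat_deep(x):
    h1, h2 = relu(x), relu(-x)
    return relu(1.0 - h1 - h2)

# Wide form: one hidden layer with three neurons and fixed output weights.
def hat_wide(x):
    return relu(x + 1.0) - 2.0 * relu(x) + relu(x - 1.0)

x = np.linspace(-2.0, 2.0, 401)
assert np.allclose(hat_deep(x), hat_wide(x))  # identical on the whole grid
```

The deep form composes ReLUs across layers; the wide form spends one extra neuron in a single layer to trace out the same piecewise-linear shape.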

"Early in the technology, scientists focused on very wide and shallow networks (one to two layers), to do universal approximation," said Wang. "Later, (many layers work in the feedforward fashion) were proven to be very powerful. However, we were not fully convinced that the focus should be solely on deep learning rather than both wide and deep learning. We feel that depth is just one dimension and width is another, and both need to be considered and combined."

In their research, the team considered the relationship between deep and wide neural networks. Using quantitative analysis, they found that deep and wide networks can be converted back and forth on a continuum. Using both will give a bigger picture and avoid bias.
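
One way to picture that continuum, as a simple parameter-counting illustration rather than the paper's quantitative analysis, is to hold the total parameter budget roughly fixed while trading width for depth:

```python
# Parameter count of a fully connected network with layer sizes
# [d_in, h, ..., h, d_out]: each layer contributes a weight matrix plus biases.
def param_count(sizes):
    return sum(a * b + b for a, b in zip(sizes, sizes[1:]))

d_in, d_out = 10, 1
# Widths chosen so every configuration costs roughly 3,000 parameters.
for depth, width in [(1, 256), (2, 48), (4, 30), (8, 20)]:
    sizes = [d_in] + [width] * depth + [d_out]
    print(f"depth={depth:>2}  width={width:>3}  params={param_count(sizes)}")
```

Moving down the list spends the same budget on progressively deeper, narrower networks.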

Their research hints at the future of machine learning, in which networks are both deep and wide, interconnected with favorable dynamics and optimized ratios between width and depth. Networks will become increasingly complicated, and when their dynamics reach the desired states, they will produce amazing outcomes.

"It's like playing with LEGO bricks," said Wang. "You can build a very tall skyscraper or you can build a flat large building with many rooms on the same level. With networks, the number of neurons and their interconnection are the most important. In 3D space, neurons can be arranged in myriad ways. It's just like the structure of our brains. The neurons just need to be interconnected in various ways to facilitate diverse tasks."

"Comprehending the conversion between the depth and width of neural networks remains a dynamic and evolving field," said Lai. "Both wide and deep networks offer their distinct advantages and drawbacks. Shallow networks, typically, are more straightforward to grasp. Our exploration into the symmetries inherent in these two network types illuminates a new perspective for understanding deep networks through the lens of wide networks."

"Dr. Wang's on the relationship of wide and deep neural networks opens new paths to harness the potential of AI," said Shekhar Garde, Ph.D., dean of Rensselaer's School of Engineering. "AI is impacting almost every aspect of our society, from medicine to new materials to finance. It is an exciting time for the field and Dr. Wang is at the forefront of thought on the subject."

More information: Fenglei Fan et al, Quasi-Equivalence between Width and Depth of Neural Networks, Journal of Machine Learning Research (2023). https://jmlr.org/papers/volume24/21-0579/21-0579.pdf
