Tensor2Tensor library to speed deep learning work

(Tech Xplore)—Google Brain senior research scientist Lukasz Kaiser posted an announcement on Monday on the Google Research Blog that translates into good news for those engaged in deep learning research.

"Today, we are happy to release Tensor2Tensor (T2T), an open-source system for training models in TensorFlow."

The launch of the open-source system is expected to make training deep learning models faster and easier.

The project's description on GitHub: T2T "is a modular and extensible library and binaries for supervised learning with TensorFlow and with support for sequence tasks. It is actively used and maintained by researchers and engineers within the Google Brain team."

Just how can the Tensor2Tensor release help accelerate deep-learning research? Speaking of its strengths, SD Times said "T2T includes a library of data sets and models to help kick start research."

Liam Tung, ZDNet, said, "The framework promises to take some of the work out of customizing an environment to enable deep-learning models to work on various tasks."

TechCrunch commented that "The sheer number of variables in AI research combined with the fast pace of new developments makes it difficult for experiments run in two distinct settings to match."

Because T2T is easy to work with, its impact could be to lower the barrier for organizations looking to experiment with deep learning, said TechRepublic.

What does "modular" mean in this context?

Conner Forrest, senior editor, TechRepublic: "It also utilizes a standard interface among all aspects of a deep learning system, including datasets, models, optimizers, and different sets of hyperparameters, the post said. So, users can swap versions of these components out to see how they perform together. It is this modular architecture that is one of the core values of T2T."
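To make that modularity concrete, here is a minimal sketch of how T2T's registry ties named components together. It assumes the 2017-era tensor2tensor package layout; the module paths and registry helpers may have moved in later versions, and the names MyTinyModel and my_tiny_hparams are hypothetical, introduced only for illustration.

```python
# Minimal sketch of T2T's registry-driven modularity (assumes the 2017-era
# tensor2tensor package; module paths may differ in later versions, and
# MyTinyModel / my_tiny_hparams are hypothetical names).
from tensor2tensor.models import transformer  # importing registers the model
from tensor2tensor.utils import registry


@registry.register_model
class MyTinyModel(transformer.Transformer):
    """A custom model; registering it makes it addressable by name."""


@registry.register_hparams
def my_tiny_hparams():
    # Start from a registered hyperparameter set and change one knob.
    hparams = transformer.transformer_base()
    hparams.hidden_size = 128
    return hparams


# Components are looked up by name, so swapping one for another in an
# experiment is just a string change (e.g. --model=my_tiny_model on the
# t2t-trainer command line).
model_cls = registry.model("my_tiny_model")
```

Because datasets, models, optimizers and hyperparameter sets all sit behind the same name-based interface, an experiment is fully described by a handful of names, which is what makes runs easy to reproduce and compare.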

Kaiser also discussed how T2T lightens the burden. He said many open-sourced deep learning systems "use unique setups that require significant engineering effort and may only work for a specific problem or architecture, making it hard to run new experiments and compare the results."

In contrast, "T2T facilitates the creation of state-of-the-art models for a wide variety of ML applications, such as translation, parsing, image captioning and more, enabling the exploration of various ideas much faster than previously possible."

Tung talked about this in ZDNet. He said that deep learning has had success in speech recognition, image classification and translation, but each model needs to be tuned specifically for the task at hand. Tung added that "models are often trained on tasks from the same 'domain', such as translation tasks being trained with other translation tasks."

All this slows down research work and, just as significant, such models "don't follow how the human brain works, which is capable of taking lessons from one challenge and applying it to solving a new one."

("Time," said Jon Fingas in Engadget, "is one of the biggest obstacles to the adoption of deep learning.")

Engadget's headline on Tuesday: "Google can turn an ordinary PC into a deep learning machine."

Kaiser, writing in the blog, said, "This release also includes a library of datasets and models, including the best models from a few recent papers (Attention Is All You Need, Depthwise Separable Convolutions for Neural Machine Translation and One Model to Learn Them All) to help kick-start your own DL research."
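One of those bundled models is the Transformer from "Attention Is All You Need", which ships together with the hyperparameter configurations used in the paper. A minimal sketch of pulling it out of the library, assuming the 2017-era tensor2tensor package layout (module paths may have changed in later versions):

```python
# Sketch: retrieving a published model's configuration from the T2T library.
# Assumes the 2017-era tensor2tensor package layout.
from tensor2tensor.models import transformer

# transformer_base() returns the base configuration from
# "Attention Is All You Need".
hparams = transformer.transformer_base()
print(hparams.num_hidden_layers, hparams.hidden_size)  # 6 512
```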

"We're eager to collaborate with you on extending T2T, so please feel free to open an issue on GitHub or send along a pull request to add your data-set or . See our contribution doc for details and our open issues."

© 2017 Tech Xplore
