Second-Generation Tensor Processing Units (TPUs) Announced at 2017 Google I/O
Google constantly launches new technologies with the potential to reshape entire industries, and some of its biggest announcements are reserved for the annual Google I/O conference. This year, Sundar Pichai (CEO), Jeff Dean (Research), and Urs Hölzle (Google Cloud) made coordinated announcements that effectively launched a new form of cloud hosting built around machine learning and artificial intelligence. Google has developed a new type of server built on Tensor Processing Units (TPUs), custom chips that accelerate machine learning workloads much as GPUs accelerate the mathematics of high-speed computer graphics or Bitcoin mining. Google intends to make these machine learning servers available for research and development through cloud computing plans programmed with TensorFlow, “an open-source software library for Machine Intelligence.” The launch of Cloud TPU hosting is one of the highlights of Google’s shift from a “mobile-first” to an “AI-first” company and symbolic of its goal to run “AI-first” data centers across all of its current business operations.
TensorFlow – An Open Source Software Library for Machine Intelligence
The announcement of Cloud TPU hosting potentially represents a game-changing moment for the future of web hosting, data center management, and software application development. Many have come to view Google I/O announcements with some skepticism after the heavily hyped Google Wave and Google Glass both failed to develop into viable products. On the other hand, the Android platform showcased at Google I/O 2008 has since scaled past 2 billion active devices in less than 10 years. Google’s philosophical shift from a “mobile-first” to an “AI-first” company has already been implemented internally, and the changes can be seen across many of the company’s existing search, photos, email, maps, and language applications. In launching Cloud TPU hosting services built around the TensorFlow platform and an entirely new type of cloud computer designed for the unique requirements of machine learning and artificial intelligence, Google demonstrates once again that it remains far ahead of the competition.
Industry analysts rushed to estimate the impact of the Cloud TPU hosting platform on Intel, AMD, & Nvidia after the I/O announcement, as the major chip makers and GPU design companies reacted to the latest Google product launch. Artificial intelligence and machine learning clearly offer nearly unlimited potential for new hardware & software development in IT. According to the press release, a single second-generation TPU device delivers “up to 180 teraflops of floating-point performance… a TPU pod contains 64 second-generation TPUs and provides up to 11.5 petaflops to accelerate the training of a single large machine learning model.” The AI-optimized cloud servers can be used to build new applications for advanced speech recognition, voice translation, image search, and many other purposes in research, industry, science, and design. Google is promoting the use of AI & machine learning to rethink traditional problems across all major industrial sectors and bring innovative new products to market.
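The pod figure follows directly from the per-device number. As a quick back-of-the-envelope check, using only the peak figures quoted above:

```python
# Back-of-the-envelope check of the quoted Cloud TPU figures.
tflops_per_tpu = 180      # peak teraflops per second-generation TPU device
tpus_per_pod = 64         # second-generation TPUs in one TPU pod

pod_teraflops = tflops_per_tpu * tpus_per_pod   # 11,520 teraflops
pod_petaflops = pod_teraflops / 1000            # 11.52 petaflops, matching "up to 11.5 petaflops"
print(pod_petaflops)
```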
One reason Google developed the new TPU servers is to accelerate the training of the neural networks at the heart of modern machine learning. Neural networks give computers the ability to “see” and “understand” what is in an image, as well as to “hear” through voice recognition and “read” through optical text recognition. Across healthcare, biotech research, medicine, genetic sequencing, astrophysics, quantum physics, biology, chemistry, architecture, & engineering, Google is looking for ways to move neural network research out of its current niche, where it is primarily the domain of PhD students in computer science, and into wide application across society and industry. That shift will require hundreds of thousands of professional developers and AI/ML programmers, making TensorFlow a potentially valuable platform for sharing open source code when building new software applications. Google.ai is another project that brings together experts from across Google’s departments to deliver the benefits of AI to everyone through applied research and new consumer products based on deep learning & neural nets.
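To make the idea of a computer learning to “see” more concrete, here is a minimal sketch of a single-layer image classifier written against the TensorFlow 1.x API of the time; it is an illustrative toy rather than Google’s production code, and the image size and class count are arbitrary placeholders:

```python
import tensorflow as tf  # TensorFlow 1.x style API, current as of 2017

# Flattened input images and one-hot labels (shapes are illustrative).
images = tf.placeholder(tf.float32, [None, 784])   # e.g. 28x28 grayscale images
labels = tf.placeholder(tf.float32, [None, 10])    # e.g. 10 object classes

# A single fully connected layer: the simplest possible "seeing" model.
weights = tf.Variable(tf.zeros([784, 10]))
biases = tf.Variable(tf.zeros([10]))
logits = tf.matmul(images, weights) + biases

# Cross-entropy loss and one step of gradient-descent training.
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)
```

Real vision, speech, and OCR models stack many such layers, which is exactly the training workload the TPU pods are designed to accelerate.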
Introductory Video: TensorFlow (2017)
TensorFlow: Open Source Machine Learning – “TensorFlow is an open source software library for numerical computation using data flow graphs. Originally developed by researchers and engineers working on the Google Brain Team within Google’s Machine Intelligence research organization for the purposes of conducting machine learning and deep neural networks research.” Follow TensorFlow on Twitter.
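The “data flow graphs” in that description are visible directly in code. A minimal sketch in the TensorFlow 1.x style of the time builds a small graph of operations and only then executes it in a session:

```python
import tensorflow as tf  # TensorFlow 1.x style API

# Nodes of the data flow graph: two constant matrices and a matrix multiplication.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 0.0], [0.0, 1.0]])
product = tf.matmul(a, b)

# Nothing is computed until the graph is executed in a session.
with tf.Session() as sess:
    print(sess.run(product))   # [[1. 2.] [3. 4.]]
```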
Google Cloud TPU Platform – Accelerated Machine Learning for a New Generation of Apps
Google’s cloud computing platform currently includes Intel Skylake CPUs, Nvidia GPUs, and the new TPU servers developed by the company’s internal research and development teams. The Tensor Processing Unit (TPU) is an application-specific integrated circuit (ASIC), and its performance has many wondering whether Google’s latest research will let it surpass Intel, AMD, & Nvidia in chip design for artificial intelligence and machine learning. The nearest competing products are the Nervana Engine and Nervana Cloud being developed by Intel together with the Python-based Neon deep learning framework. Nvidia has launched the Volta (V100 series) machine learning chip along with its Deep Learning Accelerator (DLA). Wave Computing’s Dataflow Architecture is another start-up product ecosystem in this sector. Google has released extensive testing results from its data centers comparing Tensor Processing Unit (TPU) performance to the Intel Haswell CPU and the Nvidia K80 GPU. Overall, Google’s research concluded that “on our production AI workloads that utilize neural network inference, the TPU is 15x to 30x faster than contemporary GPUs and CPUs.” Although some have noted that the TPU servers can currently be programmed only through the TensorFlow platform, new product development on Google’s TPU hardware is clearly just beginning.
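Because the TPU servers are exposed to developers through TensorFlow, targeting them looks much like targeting any other accelerator from the framework. The sketch below uses TensorFlow’s standard tf.device placement mechanism; the TPU device string is illustrative, since the exact name and cluster setup depend on how the Cloud TPU service is configured:

```python
import tensorflow as tf

# The device string below is illustrative; the actual name depends on how the
# Cloud TPU service exposes its accelerators to the TensorFlow runtime.
with tf.device("/device:TPU:0"):
    # A dense matrix multiplication, representative of the neural network
    # inference workloads Google benchmarked against Haswell CPUs and K80 GPUs.
    activations = tf.random_normal([1024, 4096])
    weights = tf.random_normal([4096, 4096])
    outputs = tf.matmul(activations, weights)
```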
Google Cloud TPU Hosting – “With the ending of Moore’s Law, many computer architects believe that major improvements in cost-energy-performance must now come from domain-specific hardware. The Tensor Processing Unit (TPU), deployed in Google datacenters since 2015, is a custom chip that accelerates deep neural networks (DNNs).” Learn More about Google Cloud TPU Hosting.
The 2017 TensorFlow Dev Summit: Machine Learning, AI, & Cloud Computing
TensorFlow is one of the most advanced platforms currently available for developing AI & ML applications on open source foundations. The research presentations by Google engineers at the 2017 TensorFlow Dev Summit highlighted the diverse and often unexpected software being built with neural networks by programmers around the world. In medicine, agriculture, science, industry, art, music, finance, business, aviation, and engineering, independent developers, computer scientists, academic researchers, creative artists, and garage-lab hobbyists have all put TensorFlow to work building new applications for their local environments. What remains to be seen is how these developments in cloud computing and the “internet of things” will cross over into web applications through text recognition, data mining, meme generation, and image & video search, and into a new generation of AI-driven websites and ML-enabled mobile applications for a popular audience. TensorFlow is already used by Airbnb, ARM, DeepMind, Dropbox, eBay, Google, IBM, Intel, Qualcomm, SAP, Snapchat, Twitter, Uber, and other mainstream companies in day-to-day business operations. What comes next is likely to be even more transformative than the changes already brought by cloud computing and mobile technology. The TensorFlow platform also builds a path toward the wider socially transformative vision of artificial intelligence and machine learning promoted by senior Google researcher Ray Kurzweil in his series of books on “the Singularity.” The competing commercial standards advanced by Google, Intel, Nvidia, IBM, Microsoft, & other companies at this stage of collective AI/ML development also point to the vast potential returns on seed-stage investment in this sector.
Keynote Video: Google I/O 2017
Google I/O Keynote – “Organizing the world’s information… by applying deep computer science and technological insights to solve problems at scale.” Follow Google CEO Sundar Pichai on Twitter.
TensorFlow R&D – Platform Resources for ML Programmers & AI Developers
While machine learning, neural networks, and artificial intelligence have been research topics for computer science professionals for decades, Google’s TensorFlow platform and the Cloud TPU hosting announcement, together with the new hardware behind them, show that innovation from this sector is finally reaching the mainstream. Not all TensorFlow research and development is limited to academics or Fortune 500 IT departments, however. There is a growing number of internet resources on the TensorFlow platform, such as the following (a brief getting-started sketch appears after the list):
- The TensorFlow Community
- TensorFlow Code on GitHub
- The Operator Vectorization Library
- TensorFlow Tutorials
- Machine Learning with TensorFlow
- Google Developer Channel
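For developers who want to experiment before committing to cloud hardware, TensorFlow installs as an ordinary Python package; a minimal getting-started check might look like this:

```python
# Install the library first with:  pip install tensorflow
import tensorflow as tf

# Confirm the installation and report the library version.
print(tf.__version__)
```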
Industry developments in cloud computing, big data applications, machine learning, artificial intelligence, the internet of things, and 3-D printing all appear poised to converge on the next generation of data center technology exemplified by the launch of Google’s Cloud TPU pods.
Accelerated Machine Learning – “Machine learning (ML) has the power to greatly simplify our lives. Improvements in speech recognition and language understanding help all of us interact more naturally with technology. Businesses rely on ML to strengthen network security and reduce fraud. Advances in medical imaging enabled by ML can increase the accuracy of medical diagnoses and expand access to care, ultimately saving lives.” Learn more about Cloud TPU Hosting with TensorFlow.
Deep Neural Networks (DNNs): Deep Learning & Autonomous Driving Applications
Google remains secretive about its own research into computing platforms for autonomous driving and how that work relates to developments in TPU computing, artificial intelligence, machine learning, and robotics. This is not surprising, as the automotive industry stands on the verge of launching self-driving vehicle technology across numerous companies, consumer transportation models, and competing OS platforms. Nearly all of the major auto companies and IT corporations internationally, along with countless small start-up software firms, are reported to be working toward launching autonomous vehicles within the next few years. Nvidia’s Xavier processor is closely related to the Volta chip used in its deep learning platform, and Xavier applies the same tensor math to power Nvidia’s “Drive PX” autonomous driving software with advanced AI. Google management likely views its TPU hardware research as an essential part of building the future of self-driving, autonomous vehicles powered by Google Maps, AI, & machine learning software like TensorFlow. Look for more announcements from Google on autonomous driving technology developed in association with next-generation deep learning TPU server hardware.
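The “tensor math” behind both the TPU and chips like Xavier is dominated by operations such as convolution over image tensors. The following is an illustrative TensorFlow sketch of the kind of operation these accelerators are built to speed up; the shapes are arbitrary placeholders, not anything taken from Nvidia’s or Google’s driving stacks:

```python
import tensorflow as tf

# A batch of camera frames and a bank of convolution filters (illustrative shapes).
frames = tf.random_normal([8, 224, 224, 3])   # batch, height, width, RGB channels
filters = tf.random_normal([3, 3, 3, 64])     # 3x3 kernels, 3 input channels, 64 output channels

# 2-D convolution: the core tensor operation in the vision networks used for
# tasks such as detecting lanes, vehicles, and pedestrians.
features = tf.nn.conv2d(frames, filters, strides=[1, 1, 1, 1], padding="SAME")
```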
Keynote Video – TensorFlow Dev Summit 2017
Google Developers – “Jeff Dean, Rajat Monga, and Megan Kacholia deliver the keynote address at the inaugural TensorFlow Dev Summit. They discuss: The origins of TensorFlow; Progress since TensorFlow’s open-source launch; TensorFlow’s thriving open-source community; TensorFlow performance and scalability; TensorFlow applications around the world… and share some exciting announcements!” Learn more about the TensorFlow Dev Summit 2017.