Google’s TensorNetwork library speeds up computation by up to 100 times
Do tensor networks ring a bell? They’re mathematical constructs increasingly used in machine learning to perform complex calculations, but a number of barriers stand in the way of their widespread adoption. For one, there hasn’t been a freely available library for accelerated hardware to run the underlying algorithms at scale. Moreover, most of the tensor network literature is geared toward physics applications.
Fortunately, the folks at Google are on the case: The Mountain View company’s AI division today announced TensorNetwork, an open source library and API developed in collaboration with the Perimeter Institute for Theoretical Physics and Google parent company Alphabet’s X skunkworks. It’s designed to improve the efficiency of tensor calculations by using Google’s TensorFlow machine learning framework as a backend, along with optimizations for graphics card processing.
In preliminary tests, Google reports that TensorNetwork delivers computational speedups of up to 100 times compared with running the same calculations on a CPU.
For the uninitiated, tensors are multidimensional arrays categorized in a hierarchy according to their order, which counts the number of indices needed to address a single entry. An ordinary number is a tensor of order zero (a scalar), a vector is an order-one tensor, and a matrix is an order-two tensor. Tensor networks, then, are graphical encodings of contraction patterns (a contraction being a mathematical operation that sums over indices shared between tensors) among several constituent tensors that together define a new one.
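The hierarchy of tensor orders described above can be made concrete with a short sketch in numpy, where a tensor's order is simply its number of axes:

```python
import numpy as np

# A tensor's "order" is the number of indices (axes) needed to address an entry.
scalar = np.array(3.14)             # order 0: an ordinary number
vector = np.array([1.0, 2.0, 3.0])  # order 1: one index
matrix = np.eye(3)                  # order 2: two indices (rows, columns)
order3 = np.zeros((2, 3, 4))        # order 3: three indices

print(scalar.ndim, vector.ndim, matrix.ndim, order3.ndim)  # 0 1 2 3
```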
Tensor networks can efficiently represent computations involving several, dozens, or even hundreds of tensors. How? Rather than storing or manipulating a large tensor directly, they represent it as a contraction of smaller constituent tensors arranged in the pattern of the network. This makes them much more practical for image classification, object recognition, and other AI tasks.
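To see why contractions of small tensors can stand in for a large one, here is a minimal toy sketch (not from the TensorNetwork library itself) in which a big low-rank matrix is never stored directly, only its two small factors:

```python
import numpy as np

# Hypothetical toy example: a 1000x1000 matrix of rank 2 is represented
# by two small factor tensors instead of being stored in full.
n, r = 1000, 2
a = np.random.rand(n, r)  # small constituent tensor, shape (n, r)
b = np.random.rand(r, n)  # small constituent tensor, shape (r, n)

# Contracting over the shared index j reconstructs the full matrix.
big = np.einsum('ij,jk->ik', a, b)

# Storage for the factors: 2*n*r = 4,000 numbers vs n*n = 1,000,000.
print(a.size + b.size, 'vs', big.size)  # 4000 vs 1000000
```

The same idea, applied to chains and grids of many small tensors, is what lets tensor networks stand in for tensors that would be far too large to store explicitly.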
The TensorNetwork library is designed to facilitate exactly this: it’s a general-purpose library for tensor network algorithms, which Google expects will be useful for research engineers and research scientists alike. The company notes that approximating quantum states is a typical use case for tensor networks in physics, and one well-suited to “illustrate the capabilities of the TensorNetwork library.”
“Tensor networks let one focus on the quantum states that are most relevant for real-world problems — the states of low energy, say — while ignoring other states that aren’t relevant,” wrote Google AI research engineer Chase Roberts and X research scientist Stefan Leichenauer. “With the open source community, we are also always adding new features to TensorNetwork itself. We hope that TensorNetwork will become a valuable tool for physicists and machine learning practitioners.”
Roberts, Leichenauer, and colleagues leave several applications to future work: using TensorNetwork to classify images in data sets such as MNIST and Fashion-MNIST, time series analysis, and quantum circuit simulation.