A Blog by Jonathan Low

 

Oct 31, 2021

How AI Is Reinventing What Computers Are

AI is changing how computers are designed and manufactured but, more importantly, what they're for. JL

MIT Technology Review reports

Computers haven't changed much in 40 or 50 years. They're smaller and faster, but they're still boxes of processors that carry out instructions from people. AI is changing that: how computers are made, how they're programmed, and how they're used. Ultimately, it will change what they are for. With machine learning, programmers no longer write rules. Instead, they create a neural network that learns these rules for itself. It's a different way of thinking. (And) anything can become a computer. Most household items already come in a smart version. "The core of computing is changing from number crunching to decision making."

Fall 2021: the season of pumpkins, pecan tarts, and peach-colored new phones. Every year, Apple, Samsung, Google and others release their latest devices right on cue. These fixtures on the consumer tech calendar no longer inspire the surprise and amazement of those heady early days. But behind all the marketing glitz, something remarkable is going on.

The latest offering from Google, the Pixel 6, is the first phone to have a separate chip for AI that sits next to its standard processor. And the chip that runs the iPhone has in recent years contained what Apple calls a "neural engine," also dedicated to AI. Both chips are better suited to the types of calculations required to train and run machine-learning models on our devices, such as the AI that powers your camera. Almost unnoticed, AI has become part of our everyday lives. And it is changing how we think about computers.

What does that mean? Well, computers haven't changed much in 40 or 50 years. They're smaller and faster, but they're still boxes of processors that carry out instructions from people. AI is changing that on at least three fronts: how computers are made, how they're programmed, and how they're used. Ultimately, it will change what they are for.

“The core of computing is changing from number crunching to decision making,” said Pradeep Dubey, director of the Parallel Computing Lab at Intel. Or as Daniela Rus, director of MIT CSAIL, puts it, AI frees computers from their boxes.

More haste, less speed

The first change affects the way computers are made – and the chips that drive them. Traditionally, gains in computing came from making machines faster at carrying out one calculation after another. For decades, the world benefited from chip speedups that arrived with metronomic regularity as chipmakers followed Moore's Law.

However, the deep learning models that power current AI applications require a different approach: they need to carry out vast numbers of less precise calculations all at the same time. That calls for a new type of chip: one that can move data around as quickly as possible, making sure it's available when and where it's needed. When deep learning exploded onto the scene about a decade ago, there was already a special kind of computer chip that was pretty good at this: the graphics processing unit, or GPU, designed to update an entire screen full of pixels dozens of times per second.
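To make that concrete, here is a minimal sketch (in Python with NumPy; the sizes are invented for illustration and not from the article) of why deep learning favors this style of hardware: a single network layer boils down to one enormous matrix multiplication, meaning millions of independent multiply-accumulate operations that can, in principle, all run side by side.

```python
# Illustrative sketch: a single neural-network layer is just a large
# matrix multiplication -- millions of independent multiply-accumulate
# operations that parallel hardware like a GPU can execute simultaneously.
import numpy as np

batch, n_in, n_out = 64, 1024, 1024
x = np.random.randn(batch, n_in).astype(np.float32)   # input activations
w = np.random.randn(n_in, n_out).astype(np.float32)   # layer weights

# One layer's forward pass: 64 * 1024 * 1024 (about 67 million) multiply-adds.
# A CPU works through these largely one after another; a GPU spreads them
# across thousands of cores at once, which is why it suits deep learning.
y = x @ w
print(y.shape)  # (64, 1024)
```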

Anything can become a computer. In fact, most household items, from toothbrushes to light switches to doorbells, already come in a smart version.

Now chipmakers like Intel and Arm, and Nvidia, which supplied many of the first GPUs, are developing hardware made specifically for AI. Google and Facebook are also pushing into this industry for the first time, racing to find an AI edge through hardware.

The chip in the Pixel 6, for example, is a new mobile version of Google's tensor processing unit, or TPU. Unlike traditional chips, which are geared toward ultra-fast, precise calculations, TPUs are designed for the high-volume but low-precision calculations required by neural networks. Google has been using these chips in-house since 2015: they process people's photos and natural-language search queries. Google's sister company DeepMind uses them to train its AIs.
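As an illustration of that precision trade-off, the toy comparison below (plain NumPy; the exact figures will vary from run to run and machine to machine) shows that dropping from single to half precision changes the result of a large dot product only slightly, an error a neural network can typically absorb in exchange for much higher throughput.

```python
# Illustrative sketch: neural networks tolerate low-precision arithmetic.
# The same dot product in float32 and float16 differs only slightly --
# the trade-off TPU-style hardware exploits for speed and efficiency.
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(10_000)
b = rng.standard_normal(10_000)

exact = np.dot(a.astype(np.float32), b.astype(np.float32))
approx = np.dot(a.astype(np.float16), b.astype(np.float16))

print(f"float32: {exact:.4f}")
print(f"float16: {approx:.4f}")   # close enough for a neural network
print(f"relative error: {abs(exact - approx) / abs(exact):.2e}")
```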

In the past few years, Google has made TPUs available to other companies, and these chips – as well as similar ones being developed by others – are becoming the standard in the world’s data centers.

AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm – a type of AI that learns to solve a task through trial and error – to design the layout of a new TPU. The AI eventually came up with strange new designs that no human would think of – but they worked. This kind of AI could one day develop better, more efficient chips.
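For a feel of what "trial and error" means here, the toy loop below is a heavily simplified epsilon-greedy sketch: the candidate "layouts" and their quality scores are invented for illustration, and Google's actual chip-placement system is far more sophisticated. The core idea is the same, though: try actions, observe a reward, and gradually favor what scores well.

```python
# Highly simplified sketch of reinforcement learning by trial and error.
# This toy epsilon-greedy loop is nothing like Google's actual system;
# it only shows the core idea of keeping actions that earn high rewards.
import random

# Hypothetical candidate "layouts" with hidden quality scores the agent
# must discover by trying them (in reality the reward would come from
# simulating wire length, power, and timing of a real floorplan).
true_quality = [0.2, 0.5, 0.9, 0.4]
estimates = [0.0] * len(true_quality)
counts = [0] * len(true_quality)

for step in range(1000):
    if random.random() < 0.1:                 # explore: try something new
        choice = random.randrange(len(true_quality))
    else:                                     # exploit: use the best so far
        choice = max(range(len(estimates)), key=lambda i: estimates[i])
    reward = true_quality[choice] + random.gauss(0, 0.1)  # noisy feedback
    counts[choice] += 1
    estimates[choice] += (reward - estimates[choice]) / counts[choice]

print("best layout found:", max(range(len(estimates)), key=lambda i: estimates[i]))
```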

Show, don’t tell

The second change affects the way computers are told what to do. We have been programming computers for 40 years; we will train them for the next 40, says Chris Bishop, head of Microsoft Research in the UK.

Traditionally, to get a computer to recognize speech or identify objects in an image, programmers first had to hand-craft rules for the computer to follow.

With machine learning, programmers no longer write rules. Instead, they create a neural network that learns these rules for itself. It’s a fundamentally different way of thinking.
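The toy example below tries to make that contrast concrete: a hand-written rule next to a tiny one-neuron model that learns the same rule from labelled examples. It is a deliberately simplified sketch (the task and all names are invented for illustration), not how production systems are built.

```python
# Minimal sketch of the shift from writing rules to learning them.
# Toy task: decide whether a 2-D point lies above the line y = x.
import numpy as np

# The old way: a programmer writes the rule explicitly.
def rule_based(point):
    return point[1] > point[0]

# The machine-learning way: a tiny one-neuron "network" infers the rule
# from labelled examples, without anyone spelling it out.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 2))            # random 2-D points
y = (X[:, 1] > X[:, 0]).astype(float)            # labels from the hidden rule

w, b = np.zeros(2), 0.0
for _ in range(200):                             # gradient-descent training
    pred = 1 / (1 + np.exp(-(X @ w + b)))        # sigmoid output
    grad = pred - y                              # error signal
    w -= 0.1 * X.T @ grad / len(X)
    b -= 0.1 * grad.mean()

print(rule_based((0.2, 0.7)))                                    # True, by hand-coded rule
print(1 / (1 + np.exp(-(np.array([0.2, 0.7]) @ w + b))) > 0.5)   # True, by learned rule
```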
