A Blog by Jonathan Low

 

Jan 31, 2015

Technology, Talent and Tolerance

The movie 'The Imitation Game,' about the World War II British codebreakers at Bletchley Park, has raised both awareness and questions about the interdependence of talent and technology.

It has also sparked debate about the tolerance required to optimize the impact of those in any society whose behavior or adherence to norms differs from the majority's.

Alan Turing, Steve Jobs and Bill Gates were 'different,' though each in his own way. As are many other tech luminaries who may be gay or dyslexic, or immigrants, or not in the least athletic, handsome or socially adept.

It is useful to remember this, as the following article explains, as we contemplate the success or failure of nation states in nurturing innovation, creativity and the other manifestations of cultures that both generate and attract wealth, talent or both.

Technology and talent are bound together by tolerance of differences. This is especially important when one considers how insignificant those differences are compared with the benefits their contributions have generated for the rest of the populations of which they are a part. JL

Greg Satell comments in Digital Tonto:

It is not fancy labs that produce breakthroughs, but technology, talent and tolerance.
The award-winning movie, The Imitation Game, based on Andrew Hodges' definitive biography of Alan Turing, won rave reviews for its portrayal of a rare genius. Turing not only invented modern computing, but helped win World War II by breaking the German Enigma code and pioneered artificial intelligence.
Even today, it's difficult to have a serious discussion about information technology without eventually hitting on Turing machines, Turing tests or some other concept that he invented. It'd be hard to think of anyone who matches his impact on information technology.
So it's curious, to say the least, that his country has fared so poorly in the industry that Turing helped create. There is no British Apple, Google or IBM. In fact, only one of the companies on the FTSE 100 is a computer company. The story of how that happened is more than a mere historical curiosity; it offers important lessons for how to foster technology and innovation.
The Improbable Origins Of Modern Computing
To understand what Turing did, we first must understand the context under which he did it. Mathematics in the early 20th century was in a state of disarray. Its basic foundation of logic was under siege from people like Cantor, Frege and Russell, and nobody was quite sure what to do about it.
The problem was that it had become unclear whether it was possible for a logical system to be complete, consistent and computable. In other words, we expect math to answer questions and, if it is shown to be unable to give us answers reliably, then we have a real problem. In effect, there was a massive hole at the very center of logic.
Faced with a foundational crisis, David Hilbert, perhaps the most influential mathematician of the time, instituted a program of problems to be solved that would set things aright. In 1931, a little over a decade after he first posed his challenges, Hilbert got the first answers, although not the ones he was looking for.
A young Austrian logician named Kurt Gödel proved a pair of theorems showing that any formal system powerful enough to express arithmetic could be complete or consistent, but not both. In effect, all such systems eventually crash. Mathematics, as the world knew it, was dead forever. Ironically, it was that insight that made computers possible.
Turing Machines And Unbreakable Codes
Alan Turing was still an undergraduate when Gödel published his famous theorems and was fascinated by his work. As he thought about it, he realized that he could use the Austrian's method, called Gödel numbering, to settle the computability problem, proving that no general procedure can decide every mathematical statement. It was an astounding achievement that put Turing in the first rank of mathematicians.
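To make the idea concrete, here is a rough sketch of Gödel numbering in Python. The symbol codes and function names below are invented for illustration; the essential trick is that unique prime factorization lets a single integer encode, reversibly, an entire string of symbols.

```python
# A rough sketch of Godel numbering: assign each symbol a code, then
# pack a whole formula into one integer by raising successive primes
# to those codes. Unique factorization makes the encoding reversible.
# The symbol codes here are invented for a toy formal language.

def primes():
    """Yield 2, 3, 5, 7, ... indefinitely (trial division)."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

CODES = {"0": 1, "s": 2, "=": 3}  # toy codes: zero, successor, equals

def godel_number(formula):
    g = 1
    for p, symbol in zip(primes(), formula):
        g *= p ** CODES[symbol]
    return g

# "s0=0" becomes 2^2 * 3^1 * 5^3 * 7^1 = 10500
print(godel_number("s0=0"))
```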
Yet what makes his solution important for our story is a thought experiment he invented to help him think about the problem.  He imagined a machine, consisting of little more than a strip of tape, a head that is able to read and write on that tape and a device which can manipulate the head.  Turing showed that such a machine could calculate any computable number.
(Image: diagram of a Turing machine. Credit: Wikipedia)
In essence, what Turing created in his mind was a universal computer.  There were computing machines before Turing’s, but they were fairly narrowly conceived machines designed to perform a specific task.  This one, in theory, could calculate anything, much like our computers today.
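To see how little machinery the idea requires, here is a minimal sketch of a Turing machine simulator in Python. The rule-table format and the names (`rules`, `run`) are my own simplifications rather than Turing's notation; the example program increments a binary number.

```python
# A toy Turing machine: a finite rule table driving a read/write head
# over an unbounded tape. A sketch for illustration, not Turing's
# original formulation.

BLANK = " "

def run(rules, tape, state="start", head=0, max_steps=1000):
    """Apply rules until the machine enters the 'halt' state
    (or the step cap is reached), then return the tape contents."""
    cells = dict(enumerate(tape))  # sparse, effectively infinite tape
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, BLANK)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, BLANK) for i in range(lo, hi + 1)).strip()

# Rule table for incrementing a binary number (head starts on the
# rightmost digit): flip trailing 1s to 0 while carrying left, then
# write the final 1 and halt.
increment = {
    ("start", "1"):   ("0", "L", "start"),  # carry propagates left
    ("start", "0"):   ("1", "L", "halt"),   # absorb the carry
    ("start", BLANK): ("1", "L", "halt"),   # the number was all 1s
}

print(run(increment, "1011", head=3))  # -> "1100"
```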
It was this line of thinking that led to the breaking of the German Enigma code and the development of the Colossus, the world's first programmable electronic digital computer. Due to the sensitive nature of its work, however, Winston Churchill ordered it to be destroyed after the end of the war.
The First Computers
Most people have never heard of the Colossus and, for a long time, the ENIAC, developed at the University of Pennsylvania, was considered to be the world’s first digital computer.  In truth, however, neither authentically resembles the computers that we have today.  That honor goes to the IAS machine, developed by John von Neumann at Princeton.
What von Neumann conceived, now known as the Von Neumann architecture, was a computer with separate units for storage, mathematical logic and control. Organized this way, a machine could store separate programs to carry out specific tasks, which could then be replaced by new programs that would perform different tasks.
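As a rough illustration of why that design matters, here is a toy stored-program machine in Python. The four-instruction set is invented for this sketch, but it shows the key point: instructions and data live in the same memory, so running a different program means writing new words into memory rather than rebuilding the machine.

```python
# A toy stored-program machine in the spirit of the von Neumann design:
# instructions and data share one memory. The instruction set
# (LOAD, ADD, STORE, HALT) is invented for illustration.

def execute(memory):
    """Fetch-decode-execute loop over a shared instruction/data memory."""
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, addr = memory[pc]      # fetch
        pc += 1
        if op == "LOAD":           # decode and execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program (cells 0-3) and data (cells 4-6) occupy the same memory.
program = [
    ("LOAD", 4),   # acc <- memory[4]
    ("ADD", 5),    # acc <- acc + memory[5]
    ("STORE", 6),  # memory[6] <- acc
    ("HALT", 0),
    2, 3, 0,       # data: two operands and a result cell
]
print(execute(program)[6])  # -> 5
```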
Another interesting aspect of the IAS machine was that it was essentially an open source technology. Although funded by the US government to perform calculations for the development of the hydrogen bomb, it was also used for civilian work and its design was shared widely. Von Neumann himself consulted for GE, IBM and other companies.
Turing continued to work on computers and, in fact, improved on von Neumann's design with the Automatic Computing Engine he developed at England's National Physical Laboratory. However, its commercial development was hindered by concerns about secrecy, lack of financing and, not least, Turing's suicide following his arrest for homosexual acts.
The Rise (And Fall?) Of The Creative Class
When we think about technological development, we usually think of white coats and labs.  So it’s understandable that when governments seek to promote technology, they build massive industrial parks where talented people work quietly in pristine isolation to dream up the next big thing.
However, in The Rise of the Creative Class, author Richard Florida presents an alternative vision. As he tells it, it is not fancy labs that produce breakthroughs, but technology, talent and tolerance. In effect, art galleries, cafes and a thriving social scene are just as important as big budgets and elaborate facilities.
The story of Alan Turing bears this out.  The US, with its more tolerant society and freewheeling lifestyle, was able to attract talented immigrants like John von Neumann, Andy Grove and others.  Government support of basic research led, albeit indirectly, to the Haight-Ashbury scene as much as it did to Silicon Valley.  UK policies, on the other hand, killed its technology industry, quite literally in the case of Alan Turing.
However, past is not necessarily prologue. Although the US stands atop the world of technology today, immigration, tolerance and funding for basic research are under fierce attack. In fact, it has become fashionable for politicians to question not only the science that underlies today's technology, but even higher education itself.
So in the case of Alan Turing, we should not only remember what he lived for, but what he died for.  Which was nothing.
