A Blog by Jonathan Low

 

Jul 6, 2018

How Intel Was Disrupted

In this economy, excellence in optimization too often leads to competitive innovation and subsequent irrelevance. JL

Steven Sinofsky reports in Learning By Shipping:

Intel thought no matter what, it could out-manufacture everyone. And that was true for a very long time. It was just as true that Microsoft could out-software most anyone.
But the problem is… those have to be pointed in the right direction. When you’re being disrupted, the skills and processes developed and optimized turn out to be no longer relevant. The competitive world views those as “rocks” and the river of innovation flows around them.
“Perhaps it is simpler to say that Intel…was disrupted” — @stratechery // Simple to say, but many factors at play. Disruption is never one feature, but a full set of *assumptions* that go into a business. https://stratechery.com/2018/intel-and-the-danger-of-integration/ Some stories…
Some key assumptions that the mobile/ARM ecosystem inverted, many of which were rooted in the transition from desktop to laptop PCs:
  • Proprietary instruction set
  • Discrete graphics
  • Low power
  • SoC
  • WWAN
  • Go to market (GTM)
A little on each.
Too often people look to a single factor that causes disruption — sort of like how zippers disrupted buttons or something. In reality, the disruption of a company is often a host of interconnected issues with no single issue dominating. Together these might be a theme — mobile computing — but within that it is many things working in concert to cause problems.
First what about manufacturing?
Ben’s post nails the idea that Intel thought no matter what, it could out-manufacture everyone. And that was true for a very long time. It was just as true that Microsoft could out-software most anyone.
But the problem is…
Those have to be pointed in the right direction. When you’re being disrupted, the skills and processes developed and optimized turn out to be no longer relevant.
The competitive world views those as “rocks” and the river of innovation flows around them.
A classic statement about any company (or person) is how their greatest strength becomes their weakness over time. In technology, the fast pace of innovation and change means that as quickly as a company can develop a strength, others “work around” that strength to create products doing similar, but different, things. That is why you almost never see successful startups competing head-on with a large tech incumbent. There’s a decided effort to work around their strengths (in product, go to market, etc.)
Intel was “obsessed” (my word) with AMD. So much energy went into working to maintain a proprietary advantage within “x86”.
I was at so many meetings about using specific instructions:
  • MMX
  • Threads
  • Video/Audio codecs
All aimed at making Windows/Office “not as good” on AMD.
I know this sounds crazy or perhaps petty, but I did go to so many meetings where, as a customer (of IA), I just felt like Intel didn’t “get me”. In VC++ I was very focused on helping people write Windows programs, and in fact all I was worried about was getting to 32-bits, which Intel didn’t feel was enough. They wanted more threads because “Hyper-Threading” was such a big bet. In Office, at our twice-yearly meetings, I was presented with things Intel thought would make Office better but really were just ways to have Office “support” Intel’s proprietary IA extensions. How much better would Office really be if it decompressed a JPEG using a special instruction?

We would go through benchmarks explaining that we were RAM-limited more than anything. Intel didn’t make RAM, and in fact a sore spot was how Intel would tell OEMs that the amount of RAM for a PC was xMB when we knew we needed more (to be fair, we were the ones bloating the need for more RAM). More on those incentives below.
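As an aside, to make concrete what “supporting” a proprietary extension meant in practice, here is a minimal sketch (mine, purely illustrative, not anything from Office or Intel) of the runtime-dispatch pattern those meetings were really asking for: detect the instruction set on the user’s CPU and take the fast path only when it exists. Assuming GCC or Clang on x86, it might look like this.

    /* Illustrative only: runtime detection of MMX via CPUID, assuming
       GCC/Clang on x86. The two decode paths are hypothetical stand-ins. */
    #include <stdio.h>
    #include <cpuid.h>

    static int cpu_has_mmx(void)
    {
        unsigned int eax, ebx, ecx, edx;
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
            return 0;               /* CPUID leaf 1 not available */
        return (edx >> 23) & 1;     /* leaf 1, EDX bit 23 = MMX */
    }

    int main(void)
    {
        if (cpu_has_mmx())
            puts("JPEG decode: MMX fast path");      /* hypothetical */
        else
            puts("JPEG decode: portable C path");    /* hypothetical */
        return 0;
    }

The branch itself is cheap; the cost is that every such feature means a second code path to write, test, and benchmark, which was exactly the engineering tax those Office meetings were about.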
At the same time, Windows’ future was a bet on an OS kernel abstracted from hardware — Windows NT.
A brilliant, brilliant choice by the NT team was the bet on the AMD 64-bit instructions. Seeing AMD64 all over the code drove Intel “nuts”.
That led to Itanium…more proprietary distraction.
In the original tweet one could say I implied incorrect timing. Itanium was clearly first, but was designed as a “clean break” from the PC world. So in a sense, kudos to Intel for leaning in and disrupting servers with a new architecture. It was Microsoft, with the bet on NT (AMD64 and processor independence), that contributed to the Itanium failure. But it was a whole ecosystem focused on x86 that really led to the success of AMD64, along with the brilliant work AMD did on a compatible but widened architecture. In hindsight (and to the NT team at the time) it made total sense.
So, first thing: if innovation is focused first and foremost on being proprietary versus solving problems people have, then I think you’ll always run into trouble.
In an “ecosystem” play this is always a risk. People hate being locked in, especially when it is obviously the intent.
Everyone is ultimately proprietary, even in the Open Source world. Once you invest heavily in any sort of architecture or platform, you’re not moving anywhere with your current investment. That said, no one likes to hear about potential investments knowing that the whole point is to be proprietary versus solving some core problem. Solutions in search of problems can often feel that way, especially if by “chance” they drive a wedge with a platform competitor.
Graphics was always a blind spot for Intel and, at least to me, a mystery.
Office was super focused on graphics because of PowerPoint and drawing, but still nothing like Adobe, etc. Windows was split — the 9x code (with DirectX) was huge into gfx. NT was about the command line.
In 2006 AMD (struggling) bought ATI for $5.4B. Intel just didn’t even notice. It was super weird.
When it came to graphics, the discussion always turned to compute cores. Step 1 — add a lot of cores. Then a leap, and somehow all those x86 cores would lead to graphics. 🤔
I wish I had a better way to understand this, but as early as the 2000s the discussions with Intel around graphics would get very confusing. The PowerPoint team specifically was always pushing graphics. They wanted to do all sorts of things on Windows using DirectX — floating video, alpha blending, transition effects, gradients, and more (things that were always shown off in Keynote and done with Photoshop on Windows) — but piping these through GDI was crazy. All the investments for DirectX were with ATI and Nvidia. That Microsoft was a bit “split” here only helped Intel — the NT team (AMD64) was not very focused on graphics, seeing their OS as mostly a command line, whereas the Win 9x team was all about gaming. That’s the story of Windows 2000 vs. XP for another day.
Desktops and laptops could have continued on a path where discrete graphics ruled. BUT Intel’s proprietary DNA also created a world where PC makers had to choose Intel chips with poor graphics in order to get the latest CPUs. See https://www.ftc.gov/news-events/press-releases/2010/08/ftc-settles-charges-anticompetitive-conduct-against-intel
If by chance you wanted discrete graphics but also a thin laptop with long battery life, you were out of luck. The way Intel structured its product line was to push those wanting the latest CPU to also take its integrated graphics. This was adjudicated and settled, as you can see above. The net effect, though, was a big slowdown in progress on graphics while energy went into pricing and licensing. Keep in mind that during this time ATI and Nvidia were rising with gamers and improved graphics. Oh, and the MacBook Air was being developed. Apple had always had fantastic graphics.
Then all of this really came to a head with mobile — tablets in particular. Intel had long been working on a low-power chipset for feature phones. It wasn’t coming together well. It seemed their heart wasn’t in it.
Much of the work was far removed from HQ and seemed like a side project.
Intel was most decidedly stuck with its own organization (see above, Windows NT “versus” Windows 9x). The core (e.g., “Core”) was the focus of headquarters, which was increasing cores and threads and driving the new integrated-graphics strategy (effectively the one we use today).
For better or worse, the mobile strategy (to win feature phones) was an overseas effort. Because it was an SoC, it used a different graphics engine, which also happened to be outsourced.
It would be reading too much into the situation to just throw a flag and say “organizational failure.” In fact, keep in mind that many books on disruption theory say that to avoid disruption you *must* have separate teams, so in a sense Intel was doing exactly the right thing. Hold that thought — more in a second.
The “moment” when Intel gave up on mobile was when it turned the investment that wasn’t quite yielding success in phones into a new “low-cost PC” c. 2008.
In other words, the way to compete with the nascent world of mobile was with a really under-powered PC.
As you can imagine, the early success of netbooks (millions sold) boosted everyone’s view of PCs (because 40M units is hard to miss) and brought a sense of “victory”.
But of course this wasn’t going to stop the march of mobile phones, let alone the MacBook Air. 💥
[Photo: me with a Netbook and “I’m a PC” from the Microsoft PDC. Source: a Microsoft memory.]
Yep, there’s me with a Netbook. So from my perspective Netbooks were actually quite a gift, albeit one that you had to think about very short term. Without a long fork into the Netbook drama, just consider that the PC run rate was softening, and then along come PCs for $200–400 — half of what they used to cost. And they would run Windows, people were buying them to have extra PCs around, emerging markets loved them, and so on. Seemed all good.
The core problem was that even for “lightweight” usage these PCs struggled, in particular…ready for it…on graphics. What did people want to do with these PCs? They wanted to browse the web, play web games, and watch web video, and what did all of those have in common? Flash. And Flash needed a good processor and video subsystem, which Netbooks didn’t have. For me, Office’s decade-long push to keep memory requirements down, along with the work on Windows 7, made for a perfectly “more than adequate” travel experience. Plus it would foreshadow the move to a 10” form factor, which I am using right now.
The ATOM line was technically all about building a System on Chip (SoC), which would provide a much smaller surface area, lower power, integrated graphics, and telephony (and more).
But Intel’s ATOM lacked all of those. It wasn’t a competitive SoC.
Importantly, Intel never got traction on 3G or 4G. That’s why it was so difficult to buy a PC with integrated WWAN. It required a separate card from low-volume makers. Why? Well, think about how Intel felt about AMD and then multiply by a big number to get its feelings about Qualcomm.
What is interesting about WWAN is how Intel made a huge success out of embracing WiFi. With Centrino (a name for Intel chipsets with 802.11), Intel almost made it seem like you needed Intel for WiFi. But that was built on the PC ecosystem.
Like graphics, WWAN was another mystery to me (especially when the team was making Surface, but even before then). While it was not clear how WWAN would be cost-effective (I personally still can’t believe how cheap and fast wireless data has become relative to what was happening a decade ago — note to self for sure), it was clear that WiFi would only get you so far (even true today). To be honest, I never really dove into the IP dynamics around Qualcomm at the time, but I knew that every time the topic came up it was more a discussion about royalties and IP than anything to do with space on silicon.
Coming full circle, imagine now that there’s no SoC, no WWAN, no low-power chip — these were all new assumptions in the world of mobile. But one really big assumption — that graphics would be just a feature — turned out to be a huge problem.
With the iPhone, touch interfaces, Android, the iPad…one thing totally changed about computing — the role of graphics.
Even on the PC, browsers (led by IE) bet huge on graphics. Graphics moved from a niche for gamers/video/photos to part of the core experience on par with networking.
So now, putting this all together: assume that computing would be defined by increasing gigahertz (power), many generic cores (not graphics), discrete connectivity (WWAN chips), and no small-form-factor SoC, and you can see how disruption happens.
You can almost imagine each subsystem of the Intel Architecture thinking “this is a bit competitive but not a game changer”, which was sort of the case. But when you put them all together, the package wasn’t even in the game.
This is where you really have to get a sense for how disruption is slow and then fast. You can imagine product meetings where each subsystem presents. One by one each talks about their progress and needs, and silicon is budgeted out (just like megabytes in Office).
One by one, each has a few challenges, but in isolation they think “we can do this”. Graphics says “one more iteration”. Power management says “if the OS and apps would only make changes to how they shut down, we’d be good”. Packaging says “OEMs need to stop using so many external peripherals”, and so on. You roll it all up and, if you’re optimistic, you think you’ll power through. BUT sea changes and platform shifts don’t happen according to your architectural view — they happen differently almost by definition. Then disruption happens quickly.
This is what Ben was saying about integration, but for me this is about product integration, not just product <> manufacturing.
Circling back to integration and manufacturing: it is amazing that through all of this, as Ben so eloquently shares, Intel was crushing everyone in manufacturing — of closed-system, vertically integrated desktop CPUs.
But businesses are made of the 4 P’s (product, price, place, promotion), and disruption almost always reaches a tipping point at go-to-market. That’s the last topic.
Too often in disruption, we techies focus on the product. In my view, it is almost always the go-to-market where the actual disruption happens. That is because even when a company might have a product that can enter a “new” market and compete, the problem is that the new market almost always means less revenue and/or profit, so it basically looks like self-destruction. And keep in mind, it isn’t as if the company is actually seeing the impact of this new platform in revenue yet.
Ultimately, Intel chips are expensive. On an average PC, Intel might charge $125 for the CPU/mobo (depending on how much Intel silicon is bought).
If you’ve ever built a PC you know Intel is the biggest chunk of the pricing.
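To put that in perspective against the netbook prices above: on a $200–400 machine, a $125 CPU/chipset would run roughly a third to well over half of the retail price, which is why the categories below were designed around chip price points.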
The PC maker ecosystem is a massive tug-of-war among OEMs, Intel, and Microsoft. What this does is cause all sorts of…ultimately self-defeating…dynamics in making PCs (most of which Apple addresses!)
The challenge with a multi-sided ecosystem made of large players is that eventually the market stops growing and the battles are over a shrinking pie. When that happens the incentives no longer align, and ultimately the products and customer experience begin to suffer.
This is about Intel though.
Intel’s focus on maximizing revenue per PC drove a focus on more cores, high power, and its integrated (poor) graphics. So you can see how this could run up against competing with a MacBook, for example, when it came to battery life, fans, etc.
Intel was always focused on getting the ecosystem to rally around what would make the most money, but just as important was managing which chips could be used where.
Ultrabooks, Tablets, Netbooks — these categories are specifically designed around price points of chips.
Ultrabooks would be a classic example of this. While many of the products were very good, the challenge was always some combination of pricing, power management, available screen sizes, and/or graphics. Intel would structure the CPU line so as not to create too many Ultrabooks at the expense of either lower-priced laptops or higher-end “portable workstations”. Too many low-priced PCs is as much a problem as not enough high-priced ones. And you might want a 10” screen, but then you couldn’t use the fanless Core chips and had to use ATOM, and so on.
All through this, Microsoft had its view on which variant of Windows to run, and the PC makers wanted to do more than just put their own plastic shell on an Intel mobo. Yet they were all looking at MacBooks, and none of the OEMs would use aluminum or make a nice case, because then their Core i5 whatever would cost more than the one right next to it at the store, and most PCs weren’t bought that way.
Want to make a 10” PC using a chip Intel said could only go in a 9.7” PC? Then you would forgo discounts or promotions. Want to use discrete graphics on a low-power PC? Same problem.
These go-to-market programs might all be totally fine/legal/etc. BUT they create ecosystem incentives to look elsewhere.
This fine-grained pricing created all sorts of challenges. But it also warps how R&D is done and where features/investments go.
Disruption happens slowly for sure, and then quickly. There were a lot of things going on with Intel that were amazing and beyond legendary. But the transition to mobile created an opportunity for many to thrive in new ways focused on new products.
There’s so much to this story and the legendary innovation of the company. There’s much more to come. Intel has enormous strengths, assets, and opportunities. // END
My favorite typo in this thread — Windows 9x was into graphics (gfx), not gifs, though gifs are cool too.
PS/ If you’re wondering why Intel did not just go make ARM chips and “out-manufacture” TSMC, two things:
  • Intel was an ARM architecture licensee. They knew about it.
  • But keep in mind the “proprietary mindset” and you can see how not owning the architecture was a problem.
One of the most “dramatic” business moments in my career was the call to Intel to tell them we were making Surface (😁) but it would use ARM (😐). Following that call we had many meetings about progress on ATOM, where I would hear why ARM was much less of a competitive concern and not a good investment. Often this would tie back to manufacturing. I would return to graphics, power management, WWAN. Life is funny that way.
It is worth considering that even if Intel had an ARM chipset, it would probably still have ended up pretty conflicted over a) the feature set and performance relative to Core chips (or ATOM?) and b) how to price it so as not to cannibalize Core. Both of these were evident with the homegrown ATOM line.
The intent of this post is not to paint everything Intel did in a negative light, not even a little bit. The market is where it is. How did it get here? That’s what is interesting, and why I shared my personal perspective.
This post has a lot of history in it. History has a way of being remembered differently by everyone involved, so my apologies if I got something wrong; I would certainly love to hear alternative views. This is just a quick blog post, not my own definitive history, for sure.
