Pulse Check: How Intel is Scaling to Meet the Decade’s Opportunities

Eighteen months ago, Intel announced it would address the world’s rapidly growing computing continuum by investing in variations on the Intel Architecture (IA). It was met with a ho-hum. Now, many product families are beginning to emerge from the development labs and head towards production. These chip families, all with IA DNA, are designed to be highly competitive in literally dozens of businesses new to Intel, to be produced in high volumes, and to deliver genuine value to customers and end users.

Intel is the only company with an architecture, cash flow, fabs, and R&D capable of scaling its computing engines up and down to meet the decade’s big market opportunities. What is Intel doing, and how can it pull this off?

The 2010’s Computing Continuum
Today’s computing is a continuum that ranges from smartphones to mission-critical datacenter machines, and from desktops to automobiles. These devices represent a total addressable market (TAM) approaching a billion processors a year, one that will explode to more than two billion by the end of the decade. Of that, traditional desktop microprocessors account for about 175 million chips this year, and notebooks, 225 million.

For more than four decades, solving all the world’s computing opportunities required multiple computer architectures, operating systems, and applications. That is hardly efficient for the world’s economy, but putting an IBM mainframe into a cell phone wasn’t practical. So we made do with multiple architectures and inefficiencies.

In the 1990’s, I advised early adopters NCR and Sequent in their plans for Intel 486-based servers. Those were desktop PC chips harnessed into datacenter server roles. Over twenty years, Intel learned from its customers to create and improve the Xeon server family of chips, and has achieved a dominant role in datacenter servers.

Now, Intel Corporation is methodically using its world-class silicon design and fabrication capabilities to scale its industry-standard processors down to fit smartphones and embedded applications, and up into high-performance computing applications, as two examples. Scaling in other directions is still in the labs and under wraps.

The Intel Architecture (IA) Continuum
IA is Intel’s architecture and an instruction set that is common (with feature differentiation) to the Atom, Core, and Xeon microprocessors already used in the consumer electronics, desktop and notebook, and server markets, respectively. These microprocessors are able to run a common stack of software such as Java, Linux, or Microsoft Windows. IA also represents the hardware foundation for hundreds of billions of dollars in software application investments by enterprises and software package developers, investments that remain valuable assets as long as hardware platforms can run the software; backwards compatibility in IA has protected those investments.

To meet the widely varying requirements of this decade’s computing continuum, Intel is using the DNA of IA to create application-specific variants of its microprocessors.  Think of this as silicon gene-splicing.  Each variant has its own micro-architecture that is suited for its class of computing requirements (e.g., Sandy Bridge for 2011 desktops and notebooks). These genetically-related processors will extend Intel into new markets, and include instruction-set compatible microprocessors:

  • Embedded processors and electronics known as “systems on a chip” (SOCs) with an Atom core and customized circuitry for controlling machines, display signage, automobiles, and industrial products;
  • Atom, the general-purpose computer heart of consumer electronics mobile devices, tablets, and soon smartphones;
  • Core i3, i5, and i7 processors for business and consumer desktops and notebooks, with increasing numbers of variants for form-factor, low power, and geography;
  • Xeon processors for workstations and servers, with multiprocessor-capable advances reaching well into the mainframe-class, mission-critical computing segment;
  • Xeon datacenter infrastructure processor variants (e.g., for storage systems, with communications management a logical follow-on).

A Pause to Bring You Up To Date
Please do not be miffed: all of the above was published in February 2011, more than two years ago. We included it here because it sets the stage for reviewing where Intel stands in delivering on its long-term strategy and plans for the IA computing continuum, and to remind readers that Intel’s strategy has been hiding in plain sight for going on five years.

In that piece two years ago, we concluded that IA fits the market requirements of the vast majority of the decade’s computing workloads, and that Intel is singularly capable of creating the products to fill the expanding needs of the computing market (e.g., many-core).

With the launch of the 4th Generation Core 22nm microprocessors (code-name Haswell) this week and the announcement of the code-name Baytrail 22nm Atom systems on a chip (SoCs), it’s an appropriate time to take the pulse of Intel’s long-term stated direction and the products that map to the strategy.

Systems on a Chip (SoCs)
The Haswell/Baytrail launch would be a lot less impressive if Intel had not mastered the SoC.

The benefits of an SoC compared to the traditional multi-chip approach Intel has used up until now are: fewer components, less board space, greater integration, lower power consumption, lower production and assembly costs, and better performance. Phew! Intel could not build a competitive smartphone until it could put all of the logic for a computer onto one chip.

This week’s announcements include SoCs for low-voltage notebooks, tablets, and smartphones. The data center Atom SoCs, code-name Avoton, are expected later this year.

For the first time, Intel’s mainstream PC, data center, and mobile businesses include highly competitive SoCs.

SoCs are all about integration. The announcement last month at Intel’s annual investor meeting that “integration to innovation” was an additional strategy vector for the company hints at using many more variations of SoCs to meet Intel’s market opportunities with highly targeted SoC-based variants of Atom, Core, and Xeon.

Baytrail, The Forthcoming Atom Hero
With the SoCs for Baytrail in tablets and Merrifield in smartphones, Intel can for the first time seriously compete for mobile market share against ARM competitors on performance and performance-per-watt. These devices are likely to run the Windows 8, Android, and Chrome operating systems. They will be sold to carriers globally, with variants for local markets (e.g., China and Africa).

The smartphone and tablet markets combined exceed the PC market. By delivering competitive chips that run thousands of legacy apps, Intel has finally caught up on the technology front of the mobile business.

Along with almost the entire IT industry, Intel missed the opportunity that became the Apple iPhone. Early Atom processors were not SoCs, had poor battery life, and were relatively expensive. That’s a deep hole to climb out of. But Intel has done just that. There are a lot fewer naysayers than two years ago. The pendulum is now swinging Intel’s way on Atom. 2014 will be the year Intel starts garnering serious market share in mobile devices.

4th Generation Core for Mainstream Notebooks and PCs
Haswell is a new architecture implemented in new SoCs for long-battery-life notebooks, and with traditional chipsets for mainstream notebooks and desktops. The architecture moves the bar markedly higher in graphics performance, power management, and floating point (e.g., scientific) computations.

We are rethinking our computing model as a result of Haswell notebooks and PCs. Unless you are an intense gamer or workstation-class content producer, we think a notebook-technology device is the best solution.

Compared to four-year-old notebooks in Intel’s own tests, Haswell-era notebooks are half the weight and half the height, get work done 1.8x faster, convert videos 23x faster, play popular games 26x faster, wake up and go in a few seconds, and offer 3x the battery life for HD movie playback. Why be tethered to a desktop?

Black, breadbox-size desktops are giving way to all-in-one (AIO) designs like the Apple iMac used to write this blog. That iMac has been running for two years at 100% CPU utilization with no problems. (It folds proteins for medical research in the background.) New PC designs use notebook-like components to fit behind the screen. You’ll see AIOs this fall that lie flat as large tablets or go vertical with a rear kick-stand. With touch screens, wireless Internet, and Bluetooth peripherals, these new AIOs are easily transportable around the house. That’s the way we see the mainstream desktop PC evolving.

And PCs need to evolve quickly. Sales are down almost 10% this year. One reason is global macro-economic conditions. But everybody knows the PC replacement cycle has slowed to a crawl. Intel’s challenge is to spark that replacement cycle. Haswell PCs and notebooks, as noted above, deliver a far superior experience than users are putting up with on their old, obsolescent devices.

Xeon processors for workstations, servers, storage, and communications
The data center is a very successful story for Intel. The company has steadily gained workloads from traditional (largely legacy Unix) systems; grown share in the big-ticket Top 500 high-performance computing segment; evolved with mega-datacenter customers such as Amazon, Facebook, and Google; and extended Xeon into storage and communications processors inside the datacenter.

The Haswell architecture includes two additions of great benefit to data-center computing. First, new floating-point architecture and instructions should improve scientific and technical computing throughput by up to 60%, a huge gain over the installed server base. Second, transactional memory makes it easier for programmers to deliver fine-grain parallelism, and hence to take advantage of multiple cores with multi-threaded programs, including making operating systems and systems software such as databases run more efficiently.
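Transactional memory’s appeal is easiest to see in code. The sketch below is a toy software emulation of the optimistic execute-then-commit idea only, not Haswell’s actual hardware interface; the class and method names are invented for illustration:

```python
import threading

# Toy emulation of the transactional idea: do the work optimistically
# without holding a lock, then commit only if no other thread has
# committed in the meantime; otherwise retry. Hardware transactional
# memory performs this conflict detection in the processor cache.
class ToyTransactionalCounter:
    def __init__(self):
        self.value = 0
        self.version = 0               # bumped on every successful commit
        self._lock = threading.Lock()  # guards only the brief commit step

    def add(self, delta):
        while True:
            start = self.version            # begin "transaction"
            new_value = self.value + delta  # optimistic work, lock-free
            with self._lock:                # short commit section
                if self.version == start:   # no conflicting commit?
                    self.value = new_value
                    self.version += 1
                    return
            # conflict: another thread committed first, so retry

counter = ToyTransactionalCounter()
threads = [threading.Thread(target=counter.add, args=(1,)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # all 8 increments eventually commit
```

The win over a single coarse lock is that the expensive work happens outside any lock; only the commit is serialized, which is roughly the benefit hardware transactional memory promises at much finer grain.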

In the past year, the company met one data-center threat, GPU-based computing, with Phi, a server add-in card that contains dozens of IA cores running a version of Linux to enable massively parallel processing. Phi competes with GPU-based challengers from AMD and nVidia.

Another challenge, micro-servers, is more a vision than a market today. Nevertheless, Intel created the code-name Avoton Atom SoC for delivery later this year. Avoton will compete against emerging AMD- and ARM-based micro-server designs.

Challenges
1. The most difficult technology challenge that Intel faces this decade remains software, not hardware.  Internally, the growing list of must-deliver software drivers for hardware such as processor-integrated graphics means that the rigid two-year, tick-tock hardware model must also accommodate software delivery schedules.

Externally, Intel’s full-on assault on the mobile market requires exquisite tact in managing complex relationships with the key software/platform merchants: Apple (iOS), Google (Android), and Microsoft (Windows), all of whom are tough competitors.

In the consumer space such as smartphones, Intel’s ability to deliver applications and a winning user experience are limited by the company’s OEM distribution model. More emphasis needs to be placed on the end-user application ecosystem, both quality and quantity. We’re thinking more reference platform than reference hardware.

2. By the end of the decade, silicon fabrication will be under 10 nm, and it is a lot less clear how Moore’s Law will perform in the 2020’s. Nevertheless, we are optimistic about the next 10-12 years.

3. The company missed the coming iPhone and lost out on a lot of market potential. That can’t happen again. Last month, the company set up a new emerging-devices division charged with finding the next big thing around the same time others do.

4. In the past, we’ve believed that mobile devices — tablets and smartphones — were additive to PCs and notebooks, not substitutional. The new generation of Haswell and Baytrail mobile devices, especially when running Microsoft Windows, offer the best of the portable/consumption world together with the performance and application software (i.e., Microsoft Office) to produce content and data. Can Intel optimize the market around this pivot point?

Observations and Conclusions
Our summary observations have not changed in two years, and are reinforced by the Haswell/Baytrail SoCs that are this week’s proof point:

  • Intel is taking its proven IA platforms and modifying them to scale competitively as existing markets evolve and as new markets such as smartphones emerge.
  • IA scales from handhelds to mission-critical enterprise applications, all able to benefit from a common set of software development tools and protecting the vast majority of the world’s software investments.  Moreover, IA and Intel itself are evolving to specifically meet the needs of a spectrum of computing made personal, the idea that a person will have multiple computing devices that match the time, place and needs of the user.
  • Intel is the only company with an architecture, cash flow, fabs, and R&D capable of scaling its computing engines up and down to meet the decade’s big market opportunities.

Looking forward, Intel has fewer and less critical technology challenges than at any point since the iPhone launch in 2007. Instead, the company’s largely engineering-oriented talent must help the world through a complex market-development challenge as we all sort out what devices are best suited for what tasks. We’ve only scratched the surface of convertible tablet/notebook designs. How can Intel help consumers decide what they want and need so the industry can make them profitably? How fast can Intel help the market to make up its mind? Perhaps the “integration to innovation” initiative needs a marketing component.

If the three-year evolving Ultrabook campaign is an example of how Intel can change consumer tastes, then we think industry progress will be slower than optimal. A “win the hearts and minds” campaign is needed, learning from the lessons of the Ultrabook evolution. It will take skillsets in influencing and moving markets in ways Intel will need more of as personal computing changes over the next decade, for example, as perceptual computing morphs the user interface.

Absent a macro-economic melt-down, Intel is highly likely to enjoy the fruits of five years of investments over the coming two-year life of the Haswell architecture. And there’s no pressing need today to focus beyond 2015.

Biography

Peter S. Kastner is an industry analyst with over forty-five years experience in application development, datacenter operations, computer industry marketing, PCs, and market research.  He was a co-founder of industry-watcher Aberdeen Group in 1989.  His firm, Scott-Page LLC, consults with technology companies and technology users.

Twitter: @peterskastner

Haswell Core i7 desktop microprocessor



On the Impact of Paul Otellini’s CEO Years at Intel

Intel’s CEO Paul Otellini is retiring this week. His 40-year career at Intel now ending, it’s a timely opportunity to look at his impact on Intel.

Source: New York Times

Intel As Otellini Took Over

In September 2004, when it was announced that Paul Otellini would take over as CEO, Intel was #46 on the Fortune 100 list and had ramped production to 1 million Pentium 4s a week (today, over a million processors a day). The year ended with revenues of $34.2 billion. Otellini, who joined Intel with a new MBA in 1974, had 30 years of experience at the company.

The immediate challenges the company faced fell into four areas: technology, growth, competition, and finance:

Technology: Intel processor architecture had pushed more transistors clocking faster, generating more heat. The solution was to use the benefits of Moore’s Law to put more cores on each chip and run them at controllable — and eventually much reduced — voltages.

Growth: The PC market was 80% desktops and 20% notebooks in 2004 with the North America and Europe markets already mature. Intel had chip-making plants (aka fabs) coming online that were scaled to a continuing 20%-plus volume growth rate. Intel needed new markets.

Competition: AMD was ascendant, and a growing menace.  As Otellini was taking over, a market research firm reported AMD had over 52% market share at U.S. retail, and Intel had fallen to #2. Clearly, Intel needed to win with better products.

Finance: Revenue in 2004 recovered to beat 2000, the Internet bubble peak. Margins were in the low 50% range — good but inadequate to fund both robust growth and high returns to shareholders.

Where Intel Evolved Under Paul Otellini

Addressing these challenges, Otellini changed the Intel culture, setting higher expectations, and moving in many new directions to take the company and the industry forward. Let’s look at major changes at Intel in the past eight years in the four areas: technology, growth, competition, and finance:

Technology

Design for Manufacturing: Intel’s process technology in 2004 was at 90nm. To reliably achieve a new process node and a new architecture every two years, Intel introduced the Tick-Tock model, in which alternating years deliver a new, smaller process node (tick) and a new microarchitecture on that node (tock). The engineering and manufacturing fab teams work together to design microprocessors that can be manufactured in high volume with few defects. Other key accomplishments include High-K Metal Gate transistors at 45nm, 32nm products, 3D tri-gate transistors at 22nm, and a 50% reduction in wafer production time.

Multi-core technology: The multi-core Intel PC was born in 2006 in the Core 2 Duo. Now, Intel uses Intel Architecture (IA) as a technology lever for computing across small and tiny (Atom), average (Core and Xeon), and massive (Phi) workloads. There is a deliberate continuum across computing needs, all supported by a common IA and an industry of IA-compatible software tools and applications.

Performance per Watt: Otellini led Intel’s transformational technology initiative to deliver 10X more power-efficient processors. Lower processor power requirements allow innovative form factors in tablets and notebooks and are a home run in the data center. The power-efficiency initiative comes to maturity with the launch of the fourth generation of Core processors, codename Haswell, later this quarter. Power efficiency is critical to growth in mobile, discussed below.

Growth

When Otellini took over, the company focused on the chips it made, leaving the rest of the PC business to its ecosystem partners. Recent unit growth in these mature markets comes from a greater focus on a broader range of customers’ computing needs, and from bringing leading technology to market rapidly and consistently. In so doing, the company gained market share in all the PC and data center product categories.

The company shifted marketing emphasis from the mature North America and Europe to emerging geographies, notably the BRIC countries — Brazil, Russia, India, and China. That formula accounted for a significant fraction of revenue growth over the past five years.

Intel’s future growth requires developing new opportunities for microprocessors:

Mobile: The early Atom processors introduced in late 2008 were designed for low-cost netbooks and nettops, not phones and tablets. Mobile was a market where the company had to reorganize, dig in, and catch up. The energy-efficiency work that culminates in Haswell, the communications silicon from the 2010 Infineon acquisition, and the forthcoming 14nm process in 2014 will finally allow the company to stand toe-to-toe with competitors Qualcomm, nVidia, and Samsung under the Atom brand. Mobile is a huge growth opportunity.

Software: The company acquired Wind River Systems, a specialist in real-time software, in 2009, and McAfee in 2010. These added to Intel’s own developer-tools business. The software services business accelerates customers’ time to market with new, Intel-based products. The company also stepped up efforts in consumer device software, optimizing the operating systems from Google (Android), Microsoft (Windows), and Samsung (Tizen). Why? Consumer devices sell best when an integrated hardware/software/ecosystem like Apple’s iPhone exists.

Intelligent Systems: Specialized Atom systems on a chip (SoCs) with Wind River software and Infineon mobile communications radios are increasingly being designed into medical devices, factory machines, automobiles, and new product categories such as digital signage. While the global “embedded systems” market lacks the pizzazz of mobile, it is well north of $20 billion in size.

Competition

AMD today is a considerably reduced competitive threat, and Intel has gained back #1 market share in PCs, notebooks, and data center.

Growth into the mobile markets is opening a new set of competitors which all use the ARM chip architecture. Intel’s first hero products for mobile arrive later this year, and the battle will be on.

Finance

Intel has delivered solid, improving financial results to stakeholders under Otellini. With ever more efficient fabs, the company has improved gross margins. Free cash flow supports a dividend yielding above 4%, a $5B stock buyback program, and a multi-year capital expense program targeted at building industry-leading fabs.

The changes in financial results are summarized in the table below, showing the year before Otellini took over as CEO through the end of 2012.

GAAP               2004     2012     Change
Revenue           $34.2B   $53.3B    +55.8%
Operating Income  $10.1B   $14.6B    +44.6%
Net Income         $7.5B   $11.0B    +46.7%
EPS                $1.16    $2.13    +83.6%
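As a quick sanity check, the Change column follows directly from the 2004 and 2012 figures; the few lines below just recompute the percentages:

```python
# Recompute each growth percentage from the table's raw GAAP figures
# (dollar figures in billions, EPS in dollars).
figures = {
    "Revenue":          (34.2, 53.3),
    "Operating Income": (10.1, 14.6),
    "Net Income":       (7.5, 11.0),
    "EPS":              (1.16, 2.13),
}

for name, (y2004, y2012) in figures.items():
    change = (y2012 / y2004 - 1) * 100
    print(f"{name}: +{change:.1f}%")
```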

The Paul Otellini Legacy

There will be books written about Paul Otellini and his eight years at the helm of Intel. A leader should be measured by the institution he or she leaves behind. I conclude those books will describe Intel in 2013 as excelling in managed innovation, systematic growth, and shrewd risk-taking:

Managed Innovation: Intel and other tech companies are always innovative. But Intel manages innovation among the best, on a repeatable schedule and with very high quality. That’s uncommon and exceedingly difficult to do with consistency. For example, the Tick-Tock model is a business school case study: churning out ground-breaking transistor technology, processors, and high-quality leading-edge manufacturing at a predictable, steady pace from engineering to volume manufacturing. This repeatable process is Intel’s crown jewel, and a national asset.

Systematic Growth: Under Otellini, Intel made multi-billion dollar investments in each of the mobile, software, and intelligent systems markets. Most of the payback growth will come in the future, and will be worth tens of billions in ROI.

The company looks at the Total Addressable Market (TAM) for digital processors, decides what segments are most profitable now and in the near future, and develops capacity and go-to-market plans to capture top-three market share. TAM models are very common in the tech industry. But Intel is the only company constantly looking at the entire global TAM for processors and related silicon. With an IA computing continuum of products in place, plans to achieve more growth in all segments are realistic.

Shrewd Risk-Taking: The company is investing $35 billion in capital expenses for new chip-making plants and equipment, creating manufacturing flexibility, foundry opportunities, and demonstrating a commitment to keep at the forefront of chip-making technology. By winning the battle for cheaper and faster transistors, Intel ensures itself a large share of a growing pie while keeping competitors playing catch-up.

History and not analysts will grade the legacy of Paul Otellini as CEO at Intel. I am comfortable in predicting he will be well regarded.

Follow me on Twitter @PeterSKastner

Why Would IBM Sell Off Its X86 Server Business?

The Wall Street Journal reports that IBM is in advanced talks to sell off its X86 server business, which generates about $4.5 billion a year, to China-based Lenovo. IBM sold its PC business to Lenovo in 2005.

X86 servers are the Intel Xeon or AMD Opteron workhorses of almost every data center. Even shops with massive investments in IBM Z series mainframes or Power-based servers surround the back-end with X86 server racks handling Internet loads, or many other diverse workloads. Moreover, X86 workloads are expanding while proprietary (aka “rest of IBM”) hardware is declining in the total server market.

I’m throwing my hands in the air. I can see no strategic benefit coming to IBM from divesting X86 servers. True, margins on X86 servers are slim and getting slimmer due to Open Compute-like initiatives that are commoditizing X86 hardware and everything in the racks. IBM has shown in the past (i.e., PC business) that it will exit businesses that do not generate adequate operating margin.

IBM’s strategy for the past decade has been to surround hardware sales with IBM software (very high margins) and services (pretty good margins). Importantly, IBM is one of the few companies in the world to which you can bring a complex but problematic business process and say, “Can you fix this for me, IBM?” The solution typically involves IBM hardware, software, and implementation services following on to IBM transformation services. IBM does transformation exceedingly well.

So, I ponder, if IBM no longer has X86 servers as an ingredient to its transformation engagements, is the company reducing its effective engagement footprint, perhaps sharply? I doubt IBM’s customers will be quite as excited if the message becomes “no X86 solutions from us any more”.

Alternatively, IBM could be setting up Lenovo and other preferred partners for transactional X86 sales to IBM transformation customers, with IBM taking a fee and no longer having to bother with pesky X86 R&D, break fix, inventory, training, and other costs.

This deal could definitely happen, which is why I am writing about it. The question to ask if it goes down is “how are IBM customers now going to get X86 servers?”

Follow me on Twitter @peterskastner

IBM X-series

The 2013-2014 Computing Forest – Part 1: Processors

Ignoring the daily tech trees that fall in the woods, let’s explore the computer technology forest looking out a couple of years. Those seeking daily comments should follow @peterskastner on Twitter.

Part 1: Processors

Architectures and Processes

Intel’s Haswell and Broadwell

We’ll see a new X86 architecture in the first half of 2013, code-name Haswell. The Haswell chips will use the 22 nm fabrication process introduced in third-generation Intel Core chips (aka Ivy Bridge). Haswell is important for extending electrical efficiency, improving performance per clock tick, and as the vehicle for Intel’s first system on a chip (SoC), which combines a dual-core processor, graphics, and IO in one unit.

Haswell is an architecture, and the benefits of the architecture carry over to the various usage models discussed in the next section.

I rate energy efficiency as the headline story for Haswell. Lightweight laptops like Ultrabooks (an Intel design) and Apple’s MacBook Air will sip the battery at around 8 watts, half of today’s 17 watts. This will dramatically improve battery life not only for laptops but also for smartphones and tablets, two markets Intel has literally built $5 billion fabs to supply.

The on-chip graphics capabilities have improved by an order of magnitude in the past couple of years and will get better over the next two. Like the main processor, the GPU benefits from improved electrical efficiency. In essence, on-board graphics are now “good enough” for the 80th percentile of users. By 2015, the market for add-on graphics cards will start well above $100, reducing market size so much that the demand drivers switch: consumer GPUs lead high-performance computing (HPC) today, but that will swap, with HPC becoming the demand that spins off high-end consumer GPUs.

In delivering a variety of SoC processors in 2013, Intel learns valuable technology lessons for the smartphone, tablet, and mobile PC markets that will carry forward into the future. Adjacent markets, notably automotive and television, also require highly integrated SoCs.

Broadwell is the code-name for the 2014 process shrink of the Haswell architecture from 22nm to 14nm. I’d expect better electrical efficiency, better graphics, and more mature SoCs. This is the technology sword Intel carries into its full-fledged assault on the smartphone and tablet markets (more below).

AMD

AMD enters 2013 with plans for “Vishera” for the high-end desktop; “Richland”, an SoC for low-end and mainstream users; and “Kabini”, a low-power SoC for tablets.

The 2013 server plan is to deliver the third generation of the current Opteron architecture, code-name Steamroller. The company also plans to move from a 32nm SOI process to a 28nm bulk silicon process.

In 2014, AMD will be building Opteron processors based on a 64-bit ARM architecture, and may well be first to market. These chips will incorporate the IO fabric acquired with microserver-builder Seamicro. In addition, AMD is expected to place small ARM cores on its X86 processors in order to deliver a counter to Intel’s Trusted Execution Technology. AMD leads the pack in processor chimerism.

Intel’s better performing high-end chips have kept AMD largely outside looking in for the past two years. Worse, low-end markets such as netbooks have been eroded by the upward charge of ARM-based tablets and web laptops (i.e., Chromebook, Kindle, Nook).

ARM

ARM Holdings licenses processor and SoC designs that licensees can modify to meet particular uses. The company’s 32-bit chips started out as embedded industrial and consumer designs. However, the past five years have seen a fast-rising tide as ARM chip designs were chosen for Apple’s iPhone and iPad, Google’s Android phones and tablets, and a plethora of other consumer gadgets. Recent design wins include Microsoft’s Surface RT. At this point, quad-core (plus one, with nVidia) 32-bit processors are commonplace. Where to go next?

The next step is a 64-bit design expected in 2014. This design will first be used by AMD, Calxeda, Marvell, and other, undisclosed suppliers to deliver microservers. The idea behind microservers is to harness many low-power, modest-performance processors (hundreds to start) costing tens of dollars each and running multiple instances of a web application in parallel, such as Apache web servers. This approach aims to compete on price/performance, energy/performance, and density versus traditional big-iron servers (e.g., Intel Xeon).
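The microserver bet reduces to throughput-per-watt arithmetic. The numbers below are invented for illustration, not vendor specs; they only show the shape of the comparison:

```python
# Compare a rack of many "wimpy" chips against a few "brawny" ones
# on aggregate requests/sec per watt. All inputs are made-up figures.
def requests_per_watt(n_chips, req_per_sec_each, watts_each):
    total_throughput = n_chips * req_per_sec_each
    total_power = n_chips * watts_each
    return total_throughput / total_power

micro = requests_per_watt(n_chips=200, req_per_sec_each=50, watts_each=6)
big = requests_per_watt(n_chips=8, req_per_sec_each=2000, watts_each=95)

print(f"microserver design: {micro:.1f} req/s per watt")
print(f"big-iron design:    {big:.1f} req/s per watt")
# Whether the wimpy-core design wins hinges entirely on these ratios,
# which is why per-chip cost and power matter more than peak speed here.
```

Note that requests/sec per watt simplifies to the per-chip ratio, so the chip count cancels; density and price per chip are the separate axes on which microservers also compete.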

In one sentence, the 2013-2014 computer industry dynamics will largely center on how well ARM users defend against Intel’s Atom SoCs in smartphones and tablets, and how well Intel defends its server market from ARM microserver encroachment. If the Microsoft Surface RT takes off, the ARM industry has a crack at the PC/laptop industry, but that’s not my prediction. Complicating the handicapping is fabrication process leadership, where Intel continues to excel over the next two years; smaller process nodes yield less expensive chips with voltage/performance advantages.

Stronger Ties Between Chip Use and Parts

The number of microprocessor models has skyrocketed the past few years, confusing everybody and costing chip makers a fortune in inventory management (e.g., write-downs). This really can’t continue, as every chip variation goes through an expensive set of usability and compatibility tests costing up to millions of dollars per SKU (stock-keeping unit, e.g., a unique microprocessor model spec). That suggests we’ll see a much closer match between the uses for specific microprocessor variations and the chips fabricated to meet the specific market and competitive needs of those uses. By 2015, I believe we’ll see a much more delineated set of chip uses and products:

Smartphones – the low-end of consumer processors. Phone features are reaching maturity: there are only so many pixels and videos one can fit on a 4″ (5″?) screen, and gaming performance is at the good-enough stage. Therefore, greater battery life and smarter use of the battery budget become front and center.

The reason for all the effort is a 400 million unit global smartphone market. For cost and size reasons, prowess in mating processors with radios and support functions into systems on a chip (SoCs) is paramount.

The horse to beat is ARM Holdings, whose architecture is used by the phone market leaders including Samsung, Apple, nVidia, and Qualcomm. The dark horse is Intel, which wants very much to grab, say, 5% of the smartphone market.

Reusing chips across multiple markets is becoming a clever way to glean profits in an otherwise commodity chip business. So I’ll raise a few eyebrows by predicting we’ll see smartphone chips used by the hundreds in microservers (see Part 2) inside the datacenter.

Tablets – 7″ to 10″ information consumption devices iconized by Apple’s iPad and iPad Mini. These devices need to do an excellent job on media, web browsing, and gaming at the levels of last year’s laptops. The processors and the entire SoCs need more capabilities than smartphones. Hence a usage category different from smartphones. Like smartphones, greater battery life and smarter use of the electrical budget are competitive differentiators.

Laptops, Mainstream Desktops, and All-in-One PCs – Mainstream PCs will bifurcate over the next couple of years in different ways than in the past. I’m taking my cue here from Intel’s widely leaked decision to make 2013-generation (i.e., Haswell) SoCs that solder permanently to the motherboard instead of being socketed. This is not a bad idea, because almost no one upgrades a laptop processor, and only enthusiasts upgrade desktops during the typical 3-5 year useful PC life. Getting rid of sockets reduces costs, improves quality, and allows for thinner laptops.

The point is that there will be a new class of parts with the usual speed and thermal variations that are widely used to build quad-core laptops, mainstream consumer and enterprise desktops, and all-in-one PCs (which are basically laptops with big built-in monitors).

The processor energy-efficiency drive pays dividends in much lower laptop-class electrical consumption, allowing instant-on operation and much longer battery life. Carrying extra batteries on airplanes becomes an archaic practice (not to mention a fire hazard). The battle is MacBook Air versus Ultrabooks. Low-voltage becomes its own usage sub-class.

Low End Desktops and Laptops – these are X86 PCs running Windows, not super-sized tablet chips. The market is low-cost PCs for developed markets and mainstream in emerging markets. Think $299 Pentium laptop sale at Wal-Mart. The processors for this market are soldered, dual-core, and SoC to reduce costs.

Servers, Workstations, and Enthusiasts – the high end of the computing food chain. These are socketed, high-performance devices used for business, scientific, and enthusiast applications where performance trumps other factors. That said, architecture improvements, energy efficiency, and process shrinks make each new generation of server-class processors more attractive. Intel is the market and technology leader in this computing usage class, and has little to fear from server-class competitors over the next two years.

There is already considerable overlap in server, workstation, and enthusiast processor capabilities. I see the low-end Xeon E3-1200 line moving to largely soldered models. The Xeon E5-2600 and Core i7 products gain more processor cores and better electrical efficiency over the Haswell generation.

Part 2: Form-Factors

Part 3: Application of Computing

Dell Inspiron 15z


Gaming AMD’s 2012 Strategy

AMD intends to pursue “growth opportunities” in low-powered devices, emerging markets and Internet-based businesses.

There’s an awful lot of misguided analysis wafting about regarding AMD’s new strategic direction, which the company says it will make public in February. This piece is to help you (and me) sort through the facts and the opportunities. I last took a look at AMD’s strategies earlier this year, available here.

Starting With the Facts

  • AMD has been a fabless semiconductor company since 2009. The company depends on GlobalFoundries and soon Taiwan Semiconductor to actually fabricate its chips;
  • In its latest quarter, AMD had net income of about $100 million on $1.7 billion in revenue. Subsequently, the company announced a restructuring that seeks to cut costs by $118 million in 2012, largely through a reduction in force of about ten percent;
  • AMD has about a 20% market share in the PC market, which Intel says is growing north of 20% this year, largely in emerging markets;
  • AMD’s products compete most successfully against rival Intel in the low- to mid-range PC categories, but 2011 PC processors have underwhelmed reviewers, especially in performance as compared to comparable Intel products;
  • AMD has less than a 10% market share in the server market of about 250,000 units, which grew 7.6% last quarter according to Gartner Group;
  • AMD’s graphics division competes with nVidia in the discrete graphics chip business, which is growing in profitable commercial applications like high-performance supercomputing and declining in the core PC business as Intel’s integrated graphics is now “good enough” for mainstream buyers;
  • AMD has no significant expertise in phone and tablet chip design, especially the multi-function systems on a chip (SoCs) that make up all of today’s hot sellers.

What Will AMD CEO Rory Read’s Strategy Be?

I have no insider information and no crystal ball. But my eyebrows were raised in perplexity this morning at several headlines such as “AMD to give up competing with Intel on X86“, which led to “AMD struggling to reinvent itself” in the hometown Mercury News. I will stipulate that AMD is indeed struggling to reinvent itself, as the public process has taken most of 2011. The board of directors itself seems unclear on direction. That said, here is my scorecard on reinvention opportunities, in descending order of attractiveness:

  1. Servers —  For not much more work than a desktop high-end Bulldozer microprocessor, AMD makes Opteron 6200 server processors. Hundreds or thousands more revenue dollars per chip at correspondingly higher margins. AMD has a tiny market share, but keeps a foot in the door at the major server OEMs. The company has been late and underdelivered to its OEMs recently. But the problem is execution, not computer science.
  2. Desktop and Notebook PCs — AMD is in this market and the volumes are huge. AMD needs volume to amortize its R&D and fab preparation costs for each generation of products. Twenty percent of a 400 million chip 2011 market is 80 million units! While faster, more competitive chips would help gain market share from Intel, AMD has to execute profitably in the PC space to survive. I see no role for AMD that does not include PCs — unless we are talking about a much smaller, specialized AMD.
  3. Graphics Processors (GPUs) — ATI products are neck-and-neck with nVidia in the discrete graphics card space. But nVidia has done a great job of late creating a high-performance computing market that consumes tens of thousands of commercial-grade (i.e., high-price) graphics cards. Intel is about to jump into the HPC space with Knight’s Corner, a many-X86-core chip. Meanwhile, AMD needs the graphics talent onboard to drive innovation in its Fusion processors, which marry a processor and graphics on one chip. So I don’t see an AMD without a graphics component, but I don’t see huge profit pools either.
  4. Getting Out of the X86 Business — If you’re reading along and thinking you might short AMD stock, this is the reason not to: AMD is the only legally sanctioned, software-compatible competition to X86 inventor Intel. If AMD decides to get out of making X86 chips, it had better have a sound strategy in mind and the ability to execute. But be assured that the investment bankers and hedge funds would be throwing elbows to buy the piece of AMD that allows them to mint, er, process X86 chips. So I describe this option as “sell off the family jewels”, and am not enthralled with the prospects of using those funds to generate $6.8 billion or more in profitable revenue to replace today’s X86 business.
  5. Entering the ARM Smartphone and Tablet Market — A sure path to Chapter 11. Remember, AMD no longer makes the chips it designs, so it lacks any fab margin to use elsewhere in the business. It starts against well-experienced ARM processor designers including Apple, Qualcomm, Samsung, and TI … and even nVidia. Most ARM licensees take an off-the-shelf design from ARM that is tweaked and married to input-output to create an SoC design, which then competes for space at one of the handful of global fab companies. AMD has absolutely no special sauce to win in the ARM SoC kitchen. To win, AMD would have to execute flawlessly in its maiden start (see execution problems above), gain credibility, nail down 100+ design wins for its second generation, and outrace the largest and most experienced companies in the digital consumer products arena. Oh, and don’t forget volume, profitability, and especially cash flow. It can’t be done. Or if it can be done, the risks are at heart-attack levels.

“AMD intends to pursue ‘growth opportunities’ in low-powered devices, emerging markets and Internet-based businesses.” One way to read that ambiguous sentence is as a strategy that includes:

  • Tablets and netbooks running X86 Windows 8;
  • Emerging geographic markets, chasing Intel for the next billion Internet users in places like Brazil, China, and even Africa. Here, AMD’s traditional value play resonates;
  • Internet-based businesses such as lots of profitable servers in the cloud. Tier 4 datacenters for Amazon, Apple, Facebook, Google, and Microsoft are a small but off-the-charts growing market.

So, let’s get together in February and see how the strategy chips fall. Or post a comment on your game plan for AMD.

BAPco SYSmark 2012: Dropping the Llano Shoe

No wonder AMD was upset enough over BAPco’s SYSmark 2012 benchmark to drop out of the non-profit benchmarking organization in June with much Sturm und Drang.

My testing of the AMD Fusion high-end “Llano” processor, the A8-3850 APU, shows an overall rating on SYSmark 2012 of 91. Except for the 3D component of the benchmark, the Intel “Sandy Bridge” Pentium 840 scores higher in individual components — and higher overall — with a score of 98, according to the official SYSmark 2012 web site.

The SYSmark 2012 reference platform scores 100. That puts the high-end Llano desktop at 91% of the performance of a 2010 Intel “Clarkdale” first-generation Core i3-540, a low-end mainstream processor.

Moreover, the Intel “Sandy Bridge” Core i3-2120 dual-core processor with integrated graphics costs within a dollar of the “Llano” A8-3850 but delivers a 36-point higher score – noticeably snappier performance, in my actual use experience (see Chart 1 below).

I also tested AMD’s Phenom II 1100T, a top-end AMD six-core processor with an ATI Radeon HD 4290 graphics card, against an Intel “Sandy Bridge” second-generation Core i5-2500 with integrated graphics. The Core i5-2500 is the superior processor on this benchmark; the much-maligned Intel internal graphics barely loses to the ATI 4290 external graphics card in the 3D component, while delivering a 44-point overall advantage. The results are shown below in Chart 1.

Chart 1: Selected BAPco SYSmark 2012 Scores

Processor              Overall   Office   Media   Analysis   3D    Web   Sys Mgt
Intel i5-2500            166       144     162      191      181   168     153
Intel i3-2120            127       123     125      146      125   121     122
AMD Phenom II 1100T      122       109     116      122      183   108     110
Intel Pentium 840         98       100     102      106       87    90     107
AMD A8-3850               91        91      84       96      121    73      88
Intel Pentium G620T       79        81      81       88       70    71      86

Source: Peter S. Kastner and Business Applications Performance Corporation
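As a quick sanity check, the point gaps quoted in the text can be read straight off the Overall column of Chart 1:

```python
# Overall SYSmark 2012 scores transcribed from Chart 1 above.
overall = {
    "Core i5-2500": 166,
    "Core i3-2120": 127,
    "Phenom II 1100T": 122,
    "Pentium 840": 98,
    "A8-3850": 91,
}

# i3-2120 versus the "Llano" A8-3850: the 36-point gap cited in the text
print(overall["Core i3-2120"] - overall["A8-3850"])        # → 36

# i5-2500 versus the Phenom II 1100T: the 44-point gap cited in the text
print(overall["Core i5-2500"] - overall["Phenom II 1100T"])  # → 44
```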

Is SYSmark 2012 Relevant?
SYSmark 2012 is relevant because it allows evaluators to test specific PC configurations against actual, commonly used business applications.

AMD says “AMD will only endorse benchmarks based on real-world computing models and software applications, and which provide useful and relevant information. AMD believes benchmarks should be constructed to provide unbiased results and be transparent to customers making decisions based on those results.” Let’s look at what SYSmark does and how it does it.

Serious readers will study the SYSmark 2012 Overview published at the BAPco web site. This benchmark version is built on 20 years of collaborative experience by BAPco in modeling business workloads into application scenarios and corresponding benchmarks, through a 26-phase process that takes years to complete. The previous version was SYSmark 2007, under Windows Vista. SYSmark is real-world in that it incorporates widely used applications such as Office, AutoCAD, Acrobat, Flash, Photoshop and Internet Explorer under Windows 7 in component scenarios.

SYSmark is widely used around the globe in business and public tenders to select PCs without bias towards vendor and processor manufacturer. SYSmark is the only generally accepted benchmark for general business computers since it uses actual application code in the tests, not synthetic models.

The benchmark is intensive, reflecting workload snapshots of what power users actually do, rather than light-duty office workers. There are six scenario components to SYSmark 2012, each of which counts equally in the final rating:

Office Productivity: The Office Productivity scenario models productivity usage including word processing, spreadsheet data manipulation, email creation/management and web browsing.

Media Creation: The Media Creation scenario models using digital photos and digital video to create, preview, and render a video advertisement for a fictional business.

Web Development: The Web Development scenario models the creation of a website for a fictional company.

Data/Financial Analysis: The Data/Financial Analysis scenario creates financial models to review, evaluate and forecast business expenses. In addition, the performance and viability of financial investments is analyzed using past and projected performance data.

3D Modeling: The 3D Modeling scenario focuses on creating, rendering, and previewing 3D objects and/or environments suitable for use in still imagery. The creation of 3D architectural models/landscapes and rendering of 2D images and video of models are also included.

System Management: The System Management scenario models the creation of data backup sets and the compression, and decompression of various file types. Updates to installed software are also performed.

For each of the six components, BAPco develops a workflow scenario. Only then are applications chosen to do the work. BAPco licenses the actual application source code and assembles it into application fragments together with its workflow measurement framework. The data/financial analysis component, for example, runs a large Microsoft Excel spreadsheet model.
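Since each of the six scenarios counts equally, the overall rating can be sketched as an equal-weight geometric mean of the per-scenario scores, with the reference machine calibrated to 100. BAPco’s exact formula is in its published overview; the code below is my illustration of the equal-weighting idea, not BAPco’s implementation.

```python
# Sketch of an equal-weight benchmark rating: geometric mean of the six
# scenario scores, with a reference platform pinned at 100. Illustrative
# only; see BAPco's SYSmark 2012 Overview for the official methodology.
from math import prod

def overall_rating(scenario_scores):
    # Equal weighting: the geometric mean gives each scenario the same
    # influence on the overall score regardless of its absolute level.
    return prod(scenario_scores) ** (1 / len(scenario_scores))

# The reference platform scores 100 in every scenario, so rates 100 overall.
print(round(overall_rating([100] * 6)))  # → 100
```

Applied to the i5-2500 row of Chart 1 (144, 162, 191, 181, 168, 153), this equal-weight geometric mean lands on 166, matching its published overall score, which suggests the sketch captures the aggregation reasonably well.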

What I don’t like is the “2012” moniker. This SYSmark version is built on business application components as of 2010. By naming it SYSmark 2012, BAPco implies the benchmark is forward looking, when it actually looks back to 2010 application versions. The labeling should be 2010. In spite of the labeling, SYSmark 2012 is unique as a cross-platform benchmark for stressing business desktops using real-world applications in job-related scenarios.

Analysis and Conclusions
The SYSmark 2012 reference-point PC is a Core i3-540 and has a 100 point score. When I used this processor with Windows 7 last year as my “daily-driver PC” for a month, I was underwhelmed by its overall feel. Subjective comment, yes, but my point is that the reference machine is no speed demon.

The new AMD “Llano” A8-3850, a quad-core processor with integrated graphics, is adequate for light-weight office duties as measured by BAPco SYSmark 2012. The top-of-the-line AMD Phenom II 1100T with a discrete graphics card is better suited for mainstream task-specific business computing than the “Llano” processors.

Intel’s low-end dual-core “Sandy Bridge” Pentium G620T and 840 bracket the “Llano” A8-3850 in processor performance, while lagging in the graphics-intensive 3D benchmark component.

Intel’s entry-level Core i3-2120 with integrated graphics handily beats the top-of-the-line Phenom II 1100T with a discrete graphics card in all but graphics-intensive 3D benchmarks, making it an attractive price-performer. The high-end Core i5-2500 tops the top-of-the line Phenom II 1100T with a 44 point overall advantage, despite using integrated graphics.

SYSmark’s results do not plow new performance ground. An Internet search will quickly turn up numerous reviews that conclude, using a different set of benchmarks, that the “Llano” line is weak as a processing engine and pretty good at graphics, especially 3D consumer games. Yet consumer games are typically not high on the business PC evaluation checklist.

Many of the SYSmark 2012 applications use graphics-processor acceleration, when available, including Adobe Photoshop, Flash, Premier Pro CS5, Autodesk 3ds Max and AutoCAD, and Microsoft Excel. SYSmark 2012 convinces me that today’s integrated graphics are plenty good enough for business PCs shy of dedicated workstations. But a strong processor is still necessary for good overall performance.

Business desktops ought to be replaced every three to four years. However, the reality is many businesses keep desktops for five or more years, and many have instituted a “replace it when it breaks” cycle. Productivity studies show that knowledge workers deserve the higher end of today’s performance curve in a new PC so as not to be completely obsolete — and less productive — before the machine is replaced.

No single benchmark should be the sole criterion for selecting a computer, and SYSmark 2012 is no exception. However, I disagree with AMD that SYSmark is no longer worthy of consideration, and with other analysts that SYSmark is dead because AMD walked away from BAPco.

The bottom line for PC evaluators is simple: if you believe that the extensive work by the BAPco consortium across two decades stands up to scientific and peer scrutiny, then the SYSmark results discussed above show AMD at a serious performance disadvantage. If you don’t think SYSmark is a relevant benchmark for business PCs, then neither AMD nor I have a viable substitute.

The next shoe to drop is AMD’s high-end “Bulldozer” processor, expected in the next 60 days.

Apple Follows Sun Tzu, Knocks Off Competing Generals

Sun Tzu, the perhaps-historical Chinese general, is a favorite of tech motivational speakers, with a war-making philosophy that can be summarized as “avoid direct military conflict when other means suffice”. Real or imaginary, Sun Tzu would be proud of what Apple has done to its competitors.

I’ve had to upgrade my back-of-the-envelope notes into a genuine list of now-departed CEOs who failed to confront Apple with a strategy or products that suited said CEO’s board. Here’s the list of the departed as of the first day of Q2-2011:

  • Dirk Meyer – AMD
  • Gianfranco Lanci – Acer
  • Nam Yong – LG Electronics
  • Olli-Pekka Kallasvuo – Nokia

Motorola, Sony, Toshiba, Asustek Computer (Asus) and Lenovo are all in danger of being dragged off by the smartphone wave that is iPhone.

Hewlett-Packard (HP) and Dell have modest smartphone bases, and are for the moment defending their enterprise citadels from incursions by the iPad. But walled moats have not held forever in the innovative tech industry. Verticals like education and medical are taking up tablets at a rapid rate. The PC giants need a better solution, fairly soon.

As I keep saying, it’s not just about the hardware. The so-far impregnable juggernaut that is Apple is an ecosystem of hardware, operating system, apps, an app store, and a vibrant developer community.
