
By 2040, computers will need more electricity than the world can generate

So says the semiconductor industry's last ever communal roadmap

Without much fanfare, the Semiconductor Industry Association earlier this month published a somewhat bleak assessment of the future of Moore's Law – and at the same time called “last drinks” on its decades-old International Technology Roadmap for Semiconductors (ITRS).

The industry's been putting together the roadmap every two years since the 1990s, when there were 19 leading-edge chip vendors. Today, there are just four – Intel, TSMC, Samsung and GlobalFoundries – and there's too much road to map, so the latest ITRS – written last year and officially published this month – will be the last.

The group recently suggested that the industry is approaching a point where economics, rather than physics, becomes the Moore's Law roadblock: the further transistor features shrink below 10 nanometres, the harder they become to manufacture at a profit.

That will put a post-2020 premium on stacking transistors in three dimensions without generating more heat than the stack can shed.
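To see why heat, rather than lithography, is the worry, consider a back-of-envelope sketch. The figures below are illustrative assumptions, not ITRS numbers; the point is simply that stacked layers multiply power dissipation while heat still has to escape through roughly the same die footprint.

```python
# Back-of-envelope: why stacked transistor layers run hot.
# All figures are illustrative assumptions, not ITRS data.

die_area_cm2 = 1.0          # assumed die footprint
power_per_layer_w = 50.0    # assumed power dissipated per active layer
max_flux_w_per_cm2 = 100.0  # assumed practical cooling limit for the package

for layers in (1, 2, 4, 8):
    # Power adds up layer by layer, but heat still exits through roughly
    # the same footprint, so flux grows linearly with layer count.
    flux = layers * power_per_layer_w / die_area_cm2
    verdict = "OK" if flux <= max_flux_w_per_cm2 else "exceeds cooling limit"
    print(f"{layers} layer(s): {flux:6.1f} W/cm^2 -> {verdict}")
```

On those assumed numbers, anything beyond two layers outruns the cooling budget, which is why low-power transistors matter as much as clever stacking.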

In filing its last return, so to speak, the association says computing faces crunch points far more serious than keeping Moore's Law going.

The biggest is electricity. The world's computing infrastructure already uses a significant slice of the world's power, and the ITRS says the current trajectory is self-limiting: by 2040, as the chart below shows, computing will need more electricity than the world can produce.

[Chart: Computing Energy – the ITRS projection of computing's total energy use against world energy production]

We don't got the power: the ITRS prediction for energy consumption isn't encouraging
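The arithmetic behind a curve like that is easy to reproduce. The sketch below extrapolates an assumed exponential growth in computing's energy demand against slowly growing world electricity production; the starting values and growth rates are round numbers chosen to mimic the shape of the ITRS chart, not figures taken from the report.

```python
# Illustrative reconstruction of the ITRS-style crossover.
# Starting values and growth rates are round-number assumptions,
# not figures from the report.

compute_twh = 3_000.0    # assumed computing demand in 2016, TWh/year
world_twh = 25_000.0     # assumed world electricity production, TWh/year
compute_growth = 1.12    # assumed ~12% compound annual growth in demand
world_growth = 1.02      # assumed ~2% annual growth in generation

year = 2016
while compute_twh < world_twh:
    compute_twh *= compute_growth
    world_twh *= world_growth
    year += 1

print(f"Under these assumptions, demand overtakes supply around {year}.")
```

The exact crossover year shifts with the assumed growth rates, but the underlying point of the ITRS curve survives any reasonable tweak: sustained exponential demand eventually outruns roughly linear supply.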

Another problem is that there are too many balls in the air for the semiconductor industry to be the only juggler. Apart from power consumption, the report lists seven other research areas that a confab of industry, government and academia sees as critical: cyber-physical systems; intelligent storage; realtime communication; multi-level and scalable security; manufacturing; “insight” computing; and the Internet of Things.

A quarter of a century ago, the business of setting research priorities seemed simple: microprocessors, memory, storage and communications all appeared to follow predictable trajectories, which made an industry consensus relatively easy to achieve.

One thing that's obvious from the list of research priorities is that a focus on feature size and clock speed is no longer enough: the ITRS documents a shift in which applications drive design. In discussing the change, IEEE Spectrum also notes that the big customers (like Google, Apple, Samsung, and blueprint shop Qualcomm) are calling the shots, rather than the semiconductor companies.

With fewer vendors in the industry and such a large to-do list, the ITRS states the industry can't set research priorities without help:

“The U.S. semiconductor community—including government, industry and academia—will be able to take these critical steps only through partnership and focused funding. A National Computing and Insight Technology Ecosystem initiative will support development of an aggressive research agenda and a leap forward in new knowledge. Together, the community must exploit the rapidly developing opportunities to reboot, expand, and extend the IT revolution, and thereby ensure the United States of robust, long-term information technology leadership.”

That initiative – N-CITE for short – was foreshadowed by the group in January, when it said such an effort should be part of the National Strategic Computing Initiative (NSCI). ®
