The computer industry faces epic change, as the demands of the "deep learning" form of machine learning force new requirements upon silicon, at the same time that Moore's Law, the decades-old rule of progress in the chip business, is collapsing.
This week, some of the best minds in the chip industry gathered in San Francisco to talk about what it means.
Applied Materials, the dominant maker of tools to fabricate transistors, sponsored a full day of keynotes and panel sessions on Tuesday, called the "A.I. Design Forum[1]," in conjunction with one of the chip industry's big annual trade shows, Semicon West.
The presentations and discussions had good news and bad news. On the plus side, many tools are at the disposal of companies such as Advanced Micro Devices and Xilinx to make "heterogeneous" arrangements of chips to meet the demands of deep learning. On the downside, it's not entirely clear that what they have in their kit bag will stave off a potential exhaustion of data centers under the weight of rising computing demand.
No new chips were shown at the Semicon show; those kinds of unveilings have long since passed to other trade shows and conferences. But the discussion at the A.I. forum gave a good sense of how the chip industry is thinking about the explosion of machine learning and what it means for computers[2].
Gary Dickerson, chief executive of Applied Materials, started his talk by noting the "dramatic slowdown of Moore's Law," citing data from UC Berkeley Professor David Patterson and Alphabet chairman John Hennessy showing that new processors are improving in performance by only 3.5% per year. (The figure is slightly outdated;