Wednesday, August 26, 2009

The coolest and scariest things coming in the chip industry’s future

At the Hot Chips chip design conference at Stanford University this week, chip researchers spelled out some of the toughest computing problems of the future and potential ways to deal with them. Hearing these pioneers and visionaries talk was both inspiring and disturbing. They talked alternately about running into technological brick walls and about ways to get around them. But they warned that the ever-increasing cost of making the newest chips will have an impact on the entire food chain of electronic products where chips are used. Here’s a roundup of the ideas aired at the conference:
COOL 1. Universal Translator — Jen-Hsun Huang, chief executive of graphics chip maker Nvidia, predicted in a keynote speech that graphics chips will be used to compute non-graphics tasks. That’s good because graphics chips (working in concert with microprocessors) are expected to speed up more than 570 times in the next six years, while microprocessors are expected to advance only about three-fold in performance. Among the things he looks forward to seeing is a universal translator that can translate spoken words and synthesize them for people to hear in languages they understand. This problem has gone unsolved for a long time, and not just because of hardware limitations: getting software to do this correctly and at an affordable price is still the biggest hurdle. Douglas Adams’ “Babel fish” from The Hitchhiker’s Guide to the Galaxy — a little fish you put in someone’s ear that can translate all galactic languages — may not be so far off. Maybe Obama and Ahmadinejad could one day talk to each other without the aid of human translators.
SCARY 2. Moore’s Law Slowdown — Researchers on a panel said they think the law observed in the 1960s by Intel Chairman Emeritus Gordon Moore — who accurately predicted that chips would double their capacity roughly every two years — is past its prime. Now it’s getting harder and harder to deliver performance improvements without generating excessive power. The days of easy gains through manufacturing advances are coming to an end, and chip makers and designers have to turn to more exotic and creative techniques just to stay on the Moore’s Law treadmill. That means it’s going to cost more money to make relatively modest gains in chip performance. With the current recession, that will just make things harder on chip makers, whose ranks have thinned dramatically in the past few years. Here’s a comparison every gamer can understand: if chip capacity had doubled every three years instead of every two, we would still be playing games on a device no more powerful than a Nintendo 64, the video game console that came out in 1996.
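The gamer comparison above is easy to check with back-of-the-envelope math. The sketch below is illustrative only — the real cadence of process nodes varies — but it shows how much ground a slower doubling period gives up over the 13 years since the Nintendo 64 launched:

```python
# Rough growth-factor math for the Moore's Law comparison above.
# Illustrative only; real manufacturing cadence is not this clean.

def capacity_growth(years: float, doubling_period: float) -> float:
    """Factor by which chip capacity grows if it doubles every
    `doubling_period` years over `years` years."""
    return 2 ** (years / doubling_period)

years = 2009 - 1996  # 13 years since the Nintendo 64 came out

two_year = capacity_growth(years, 2)    # doubling every two years
three_year = capacity_growth(years, 3)  # doubling every three years

print(f"2-year doubling: {two_year:.0f}x")    # ~91x
print(f"3-year doubling: {three_year:.0f}x")  # ~20x
print(f"gap: {two_year / three_year:.1f}x")   # ~4.5x difference
```

A roughly 4.5x shortfall in raw capacity is the difference the panelists were gesturing at: enough to set consumer hardware back a full console generation or two.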
COOL 3. Rise of parallel computing — While the manufacturing slowdown has hurt serial computing, or doing one task at a time extremely fast, it has created an opportunity for parallel computing, a design change where lots of computing cores work on tasks simultaneously. Researchers from labs at the University of California at Berkeley, Stanford, and the University of Illinois gave talks. Parallel computing could be the key to problems such as speech recognition, smart low-power handhelds, and computer vision, said David Patterson, a computer science professor at Berkeley and a leader of the Par Lab, a parallel computing lab at the school. With parallel computing, chip designers will become important again, since design, not manufacturing, will become the key to innovation.
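The serial-versus-parallel split described above can be sketched in a few lines. This is a minimal illustration, not code from the Par Lab or any of the other research groups; the function names and workload are made up:

```python
# Minimal sketch of the serial-vs-parallel design shift: the same
# batch of tasks done one at a time, then spread across worker
# processes. The workload below is a made-up stand-in.
from multiprocessing import Pool

def heavy_task(n: int) -> int:
    # Stand-in for real work, e.g. one frame of computer vision
    # or a chunk of a speech-recognition pipeline.
    return sum(i * i for i in range(n))

inputs = [200_000] * 8

if __name__ == "__main__":
    # Serial computing: one core works through the tasks in order.
    serial = [heavy_task(n) for n in inputs]

    # Parallel computing: the same tasks run on several cores at once.
    with Pool(processes=4) as pool:
        parallel = pool.map(heavy_task, inputs)

    assert serial == parallel  # same answers, potentially less wall time
    print("results match across", len(inputs), "tasks")
```

The catch — and the segue to the next item — is that real workloads rarely split into independent chunks this cleanly, which is exactly the programming-complexity problem the researchers keep running into.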
SCARY 4. The complexity of parallel computing – The only problem with No. 3 is that people have been trying parallel computing for decades and haven’t solved the thorny problems of programming complexity. These are things “we should have been talking about 10 or 20 years ago,” insisted John Hennessy, president of Stanford University and an accomplished chip architect. He and others are frustrated that the hardest technical problems with parallel computing haven’t been solved. If the solution to the manufacturing slowdown is itself headed for trouble, then so is chip progress. “This is the hardest problem in computer science,” said Hennessy, co-author of one of the most widely used computer architecture textbooks.
COOL 5. The big iron chips aren’t dead — The surviving chip giants are still working on extremely ambitious and complex chips that serve the purpose of simplifying data centers. At the conference, IBM described its Power 7 microprocessor, which will have 1.2 billion transistors. Meant to power high-end data center computers known as servers, the chip is due to ship in 2010. It will have eight processing cores, each able to handle four tasks, or threads, at a time. The chip will have four times the performance of the IBM Power 6 chip launched three years ago and triple the bandwidth. This kind of chip could probably download the entire iTunes library to your computer in a minute or two. It can also run 1,000 partitions at once, meaning it could probably replace lots of older servers in a data center and thereby cut both power and computing costs, said Brad McCredie, an IBM fellow and vice president. That all brings down the total cost of owning the servers. Of course, IBM wasn’t alone. Intel said its next Nehalem EX server chip will have eight cores and billions of transistors. And Sun Microsystems’ Rainbow Falls chip will also be a monster with 16 cores.
SCARY 6. Goodbye custom chips – One of the by-products of the slowdown in manufacturing gains is an accompanying rise in costs. Michael Hart, an engineering director at Xilinx, said it can take $60 million to bring out a chip with circuits that are 45 nanometers wide (a human hair is maybe 100,000 nanometers thick). Even designing a custom chip, or application-specific integrated circuit (ASIC), can cost millions of dollars and requires a big team working for a couple of years. Those custom chips have to sell many millions of units just to break even. Stanford computer science professor Mark Horowitz, co-founder of Rambus, said that chip design tools — which automate chip design and thus take the complexity out of the process — just haven’t kept up, and no one is motivated to design those tools for a shrinking industry. “It sure sounds to me like ASICs are broken,” said Berkeley’s David Patterson.
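A quick sketch shows where "many millions of units" comes from. The $60 million figure is Hart's 45-nanometer estimate from above; the per-chip margin below is a made-up assumption for illustration:

```python
# Rough break-even math for a custom chip. The design cost is
# Hart's 45nm figure; the per-chip margin is a hypothetical number
# chosen only to illustrate the scale.

def break_even_units(design_cost: float, margin_per_chip: float) -> int:
    """Units needed to recoup the up-front design cost."""
    return round(design_cost / margin_per_chip)

DESIGN_COST = 60_000_000  # $60M to bring out a 45nm chip (per Hart)
MARGIN = 5.00             # assumed gross margin per chip, in dollars

print(f"{break_even_units(DESIGN_COST, MARGIN):,} units to break even")
```

At an assumed $5 of margin per chip, the design cost alone demands 12 million units before the first dollar of profit — which is why only high-volume markets can still justify a full custom design.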
COOL 7. Personal fab in your garage — During a question-and-answer session, Ruby Lee, a computer science professor at Princeton University, noted that you don’t have to spend billions of dollars on a chip factory to exploit semiconductor technology. Why not just create a low-end, personal fab for individuals? People could create all sorts of innovative chips, just as citizens with camcorders can post movies on YouTube. Researchers at MIT have been working on these “Fab Labs” for years, with the goal of creating machines costing $50,000 or less that can fabricate simple chips. Researchers can already use Nvidia’s Tesla-based “personal supercomputers.” So the fab in the garage makes sense for the lone chip designer. One day, Lee said, such fabs could even be used to create disposable chips that could be made inexpensively yet still serve some purpose.
SCARY 8. No more chip startups — The venture capitalists have run for the hills. Hot Chips used to be a place where VCs would come to cruise for new investments. But Forest Baskett of New Enterprise Associates was one of the few chip VC diehards left at the event. Chip startups were once the engine of economic growth in Silicon Valley. When one engineer asked what to do about it and where to find jobs, Mark Horowitz of Stanford said, “Go into solar.” Since chips require a huge amount of money to get to the market, startups are becoming scarce. I’ve only written about a handful of chip fundings in the past year. That means there won’t be as many new ideas coming from startups to keep the big companies honest. If the well dries up for venture money, that’s a sign of a smokestack industry.
COOL 9. Games get less immersive, more social — Marc Snir of the University of Illinois’ parallel computing research center said that one of these days, games will have the best of both worlds: the outstanding graphics of heavily scripted games that put the player on a predefined path, plus the freedom for the player to wander through an open world. Right now, it’s usually one or the other. Better computing power will make it all possible, Snir said.
At the same time, Rich Hilleman, chief creative officer at Electronic Arts, said that big investments will now be made in user interfaces, which sense and translate human movement into a game world so that you can control a game with your entire body. That means that the social interaction in the game is more important than the immersion in a single player experience. But Jim Kahle, a fellow at IBM and architect of the Cell processor used in the PlayStation 3, said he wasn’t sure if such input devices would demand the same kind of engineering resources as graphics realism.
SCARY 10. The brain drain continues — The U.S. isn’t producing enough computer scientists or electrical engineers, and many of those who came here from abroad to study are now returning to their home countries. U.S. chip companies are outsourcing chip engineering to offshore locations where costs are lower and engineers are plentiful. And the quality of chip engineers in places such as India is now first rate. Intel recently had a chip design team in India create its Dunnington server chip, which was announced last year. That’s one of the most difficult kinds of chips to design. If Indian teams can design that kind of chip, they can do most of what American chip teams can do.
