Ten 2008 trends in system and chip design
http://cn.newmaker.com | January 3, 2008
Prediction is always a risky business, but in the world of chip and system design, there are some new methodologies, tools, and challenges that are clearly going to impact design and verification. This article examines those developments in order to identify ten top trends for 2008.

What's driving all of these trends is design complexity. Consumer electronics, for example, is bringing about a new generation of mobile, multi-media devices that must support multiple application and interface standards, low power, and fast time to market. And in every application area, the pressure is on for solutions that offer higher performance and lower cost while consuming no more power than what came before. These demands are behind the move to 65 and 45 nm ICs, the emergence of complex architectures, and the explosion of embedded software content.

The 10 trends identified here are the following:
Functional verification gains intelligence
New approaches boost analog/mixed-signal design and verification
Virtual platforms speed software development
Advanced power techniques raise verification challenges
EDA applications move to parallel computing
As 45 nm hits production, DFM challenges rise
IP trends: configurable processors, mixed-signal cores
Memory importance and options increase
FPGA design is more like ASIC design
Packages and boards come into the picture


1. Functional verification gains intelligence
Long regarded as the biggest single bottleneck in IC design, functional verification received increasing attention in 2007, and will become more intelligent and automated in 2008. Some years ago Gary Smith, chief analyst at Gary Smith EDA, put forth the vision of an "intelligent test bench" that would evaluate a design and apply the correct verification engines to various blocks. The full scope of that dream has remained elusive, but developments in 2007 suggest that we're at least heading in the direction of smarter verification.

There was some interesting startup activity in 2007. Certess, for example, introduced "functional qualification," a new approach to verification coverage that profiles tests, injects faults, and then determines whether the existing verification testbench would have detected them. Breker Verification Systems introduced a graph-based functional "test synthesis" tool that provides automatic test vector generation. And Nusym Technology, which has yet to formally announce a product, is working on technology that automates both coverage estimation and test generation. Another company to watch in 2008 is Mentor Graphics Corp., which quietly bought startup Lighthouse Design Automation, acquiring technology that promises high coverage through intelligent testbench generation.

"2008 will herald a more holistic, intelligent verification approach, with EDA tools using coverage data to accelerate the entire verification process in an automated manner," said Venk Shukla, president and CEO of Nusym. "By intelligently using key coverage metrics to select and generate the vital test sets from the overly verbose pseudo-random vector sets, potential quality issues may be directly targeted with a reduced [complexity], high-impact testbench."

Functional qualification metrics will make third-party silicon intellectual property (IP) more trustworthy, said Mark Hampton, founder and CEO of Certess. "Functional qualification is a simple concept of measuring the ability of functional verification to find design bugs," he said.
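
Hampton's definition suggests a simple mental model, sketched below in Python with hypothetical names (the real Certess tool operates on HDL designs): inject faults into the design, re-run the existing testbench, and score it by the fraction of faults it catches.

    import random

    def golden_design(a, b):
        """Reference behavior: the function the testbench was written for."""
        return (a + b) & 0xFF

    # Hypothetical "mutations" standing in for injected design faults.
    MUTANTS = [
        lambda a, b: (a - b) & 0xFF,   # operator swapped
        lambda a, b: (a + b) & 0x7F,   # wrong bit-width
        lambda a, b: (a + a) & 0xFF,   # wrong operand
    ]

    def testbench(dut, num_vectors=100):
        """A self-checking random testbench: True if any check fails."""
        random.seed(42)
        for _ in range(num_vectors):
            a, b = random.randrange(256), random.randrange(256)
            if dut(a, b) != golden_design(a, b):
                return True            # fault detected
        return False                   # fault escaped detection

    # Functional qualification score: the fraction of injected faults
    # that the existing testbench actually catches.
    caught = sum(testbench(m) for m in MUTANTS)
    print(f"qualification score: {caught}/{len(MUTANTS)} faults detected")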

There's a constant and growing demand for "helper" applications that make the verification flow more rigorous and productive, said Scott Sandler, CEO of Novas Software. While automating debugging is one example, another is "the new class of tools in the verification closure space." These new tools, he said, include formal analysis, coverage estimation, and power-aware verification.

Formal property verification will see widespread deployment in 2008, according to Craig Cochran, vice president of marketing at Jasper Design Automation. He said that formal verification is finding two new applications. One is "RTL exploration" that allows designers to quickly validate their ideas, and another is post-silicon debugging, in which formal tools can help identify the root cause of a bug and verify that it's fixed correctly.

Meanwhile, standards efforts may help make verification environments more open and interoperable this year. In 2007, Cadence Design Systems and Mentor Graphics developed the Open Verification Methodology (OVM), an open-source SystemVerilog class library and methodology that defines a framework for reusable verification IP and tests. "One clear dominant trend in verification is the move away from proprietary, tool-locked verification methodologies toward an open methodology," said Jim Miller, executive vice-president of the product and technologies organization at Cadence.

Another emerging standards effort is the Accellera Unified Coverage Interoperability Standard (UCIS), which is developing an API that can unify coverage data from different types of tools, including simulation and formal verification.
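
The idea behind a unified coverage API reduces to merging per-tool coverage databases into a single view. Here is a minimal sketch of that merge step, assuming dictionary-shaped coverage records; the real UCIS API is a far richer C-level interface.

    # Hypothetical coverage records from two tools: each maps a coverage
    # point (e.g. a branch or covergroup bin) to a hit count.
    sim_coverage    = {"fsm.idle": 120, "fsm.busy": 4,
                       "fifo.full": 0, "fsm.err": 0}
    formal_coverage = {"fsm.idle": 1, "fifo.full": 1}  # proven reachable

    def merge_coverage(*databases):
        """Union of coverage points; a point is covered if any tool hit it."""
        merged = {}
        for db in databases:
            for point, hits in db.items():
                merged[point] = merged.get(point, 0) + hits
        return merged

    merged = merge_coverage(sim_coverage, formal_coverage)
    holes = [p for p, hits in merged.items() if hits == 0]
    print(f"covered: {len(merged) - len(holes)}/{len(merged)}, holes: {holes}")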

2. New approaches boost analog/mixed-signal design and verification
Analog/mixed-signal design and verification will move forward in 2008 in at least two significant ways. First, new methodologies and tools will bring more rigor, precision, and speed to analog verification. Second, the move toward interoperable parameterized cells (p-cells) and process design kits (PDKs) will pave the way for more startup activity and more tool choice for users, challenging Cadence Design Systems' traditional market dominance in analog IC design.

In a recent Expert's Corner interview on SCDsource.com, Henry Chang, co-founder of Designer's Guide Consulting, outlined an emerging analog verification methodology that uses behavioral modeling and regression testing. Driving the need for this methodology, he said, is the increasing architectural complexity needed to support multiple standards, demanding specifications, and low power. "Applying more rigorous, systematic, and automated approaches in analog circuit verification will become more critical in 2008," Chang said. "The concept of having a separate person or team responsible for analog circuit verification will gain more traction."
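
As a rough illustration of the regression-testing methodology Chang describes, the sketch below checks a hypothetical behavioral model of an amplifier against a gain spec on every design change; in practice the models would be written in Verilog-AMS or similar, but the pattern is the same.

    def amplifier_model(vin, gain=10.0, vdd=1.8):
        """Hypothetical behavioral model: linear gain with supply clipping."""
        return max(-vdd, min(vdd, gain * vin))

    def check_gain_spec(model, points=50, tol=0.05):
        """Regression check: small-signal gain must stay within 5% of 10x."""
        for i in range(1, points + 1):
            vin = 0.001 * i                    # stay in the linear region
            gain = model(vin) / vin
            if abs(gain - 10.0) / 10.0 > tol:
                return False, vin
        return True, None

    ok, failed_at = check_gain_spec(amplifier_model)
    print("gain spec:", "PASS" if ok else f"FAIL at vin={failed_at}")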

Tom Borgstrom, director of solutions marketing at Synopsys, also foresees new analog/mixed-signal (AMS) verification approaches. "Increasing mixed-signal content will drive the emergence of new AMS verification methodologies that leverage successful approaches used in digital functional verification, while providing hierarchical techniques that use functional blocks directly in their native modeling languages," he said.

"In 2008, the EDA industry will need to bridge the gap between digital and analog design, and bring digital-type automation to the analog designer," said Ashutosh Mauskar, vice president for product and business development at Magma Design Automation's custom design business unit.

Circuit verification of analog and mixed-signal blocks "is getting close to impossible," said Mathias Silvant, CEO of physical verification provider Edxact S.A. He said that "all means" will be employed this year to solve that problem, including model order reduction, enhanced fast Spice simulation, and behavioral modeling. Paul Estrada, chief operating officer of circuit analysis provider Berkeley Design Automation, said that analog and RF designs need to be verified at a higher level of hierarchy. "There will be a shift to analog/RF verification tools that can deliver complex block and full circuit verification with full Spice accuracy," he said.

Analog and transistor-level verification has become a ripe area for startups. In 2007, Solido Design Automation introduced transistor-level statistical design technology that promises new functionality well beyond conventional Monte Carlo simulation. Xoomsys is developing a parallel Spice capability. Nascentric rolled out a multi-threaded fast Spice simulator last summer.
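
Conventional Monte Carlo analysis, the baseline these statistical tools aim to improve on, works roughly as in this sketch (the threshold-voltage delay model and its parameters are illustrative assumptions, not any vendor's engine):

    import random
    import statistics

    def delay_model(vth):
        """Hypothetical gate-delay model: delay grows as Vth approaches Vdd."""
        vdd = 1.0
        return 1.0 / (vdd - vth) ** 1.3

    def monte_carlo(n=10000, vth_nominal=0.35, vth_sigma=0.03):
        """Sample process variation and collect the resulting delay spread."""
        random.seed(1)
        delays = [delay_model(random.gauss(vth_nominal, vth_sigma))
                  for _ in range(n)]
        return statistics.mean(delays), statistics.stdev(delays)

    mean, sigma = monte_carlo()
    print(f"delay: mean={mean:.3f}, sigma={sigma:.3f}, "
          f"+3sigma={mean + 3 * sigma:.3f}")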

Meanwhile, one of the most significant developments in analog IC layout is the development of interoperable p-cells, which are a critical part of every foundry PDK. Until very recently, nearly all p-cell libraries were written in Cadence's proprietary Skill language, and were generally not portable to other vendors' tools. Startup Ciranova pioneered interoperable p-cells for the OpenAccess database in 2006, and last year, five vendors collaborated to develop an open-source, interoperable p-cell library (IPL) generated using technology from Ciranova.

"2008 will see the first practical implementation of interoperable p-cell libraries built on the Open Access database," said Rich Morse, director of marketing at IPL co-founder Silicon Canvas. "For the first time there will be an open environment for custom integrated circuit design, beginning a sea change in the last single-vendor EDA monopoly." Eric Filseth, CEO of Ciranova, said that the first interoperable PDKs from foundries will appear in 2008. Until now, he noted, foundry design kits have used proprietary formats and worked with tools from a single vendor.

3. Virtual platforms speed software development
Electronic System Level (ESL) providers have been trying to raise the IC design abstraction level for years, with limited success. But one ESL application that has been receiving increasing attention, and that promises to become even more prominent in 2008, is the use of virtual platforms (also known as "software virtual prototypes," "virtual prototypes," or "virtual system prototypes"). Virtual platforms provide a system-level model that's fast enough for software development and accurate enough for architectural exploration and optimization. As noted in a recent SCDsource feature story, virtual platforms are finding satisfied users today at system and system-on-chip (SoC) design companies such as AMCC, Freescale, General Dynamics, Infineon, Sarnoff, STMicroelectronics, and Wind River Systems.

"2008 will be the year of the virtual prototype," said John Sanguinetti, CTO of Forte Design Systems. "In 2007, most large design projects attempted to use virtual prototypes in one form or another. In 2008, designers will be trying to improve the prototyping process to get more acceptable speed, accuracy, and overall usability from their prototypes." A byproduct, he said, will be demand for higher level silicon IP.

"The growing use of virtual platforms for pre-silicon software development is being fueled in part by the establishment and adoption of open standards for constructing the transaction-level models that serve as the building blocks for these platforms," said Matt Gutierrez, director of marketing for professional services and system-level solutions at Synopsys. He noted that the Open SystemC Initiative (OSCI) Transaction Level Modeling (TLM) 2.0 draft 2 should facilitate TLM interoperability across companies and design tools. OSCI recently released that draft for review. The lack of TLM interoperability has been cited as a major obstacle to the acceptance of virtual platforms.

While ARM, Carbon, CoWare, Synopsys, Vast Systems, and Virtutech offer virtual platform tools today, Imperas plans to offer virtual prototyping in early 2008 for multicore ICs. Imperas will provide simulation, debugging, and a model library. It's a change of focus for the much-watched startup, whose original goal was to provide a high-level software programming environment for multicore ICs.

"2008 will be the year of the integrated compiler, profiler, and virtual platform," said Mark Snook, director of marketing for ARM. Combine a virtual platform with a compiler that can optimize code generation, and a profiler that's aware of operating threads and resources, "and you have a killer suite of tools," he said.

4. Advanced power techniques raise verification challenges
The demand for mobile consumer devices has made power consumption the number one design consideration for many ICs and systems. As a result, said Phil Dworsky, director of strategic alliances at Synopsys, "aggressive" power management techniques will "become mainstream in 2008." Such techniques include multiple voltage islands, power gating, and dynamic voltage and frequency scaling. Design tools and IP have been enhanced to support such techniques, and designers are becoming more comfortable with them.
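
Of these techniques, dynamic voltage and frequency scaling is the most algorithmic: software picks the slowest operating point that still meets the current demand. A minimal sketch of such a policy, with a hypothetical operating-point table:

    # Hypothetical DVFS operating points: (frequency MHz, voltage V).
    # Dynamic power scales roughly with f * V^2, so lowering both pays off.
    OPERATING_POINTS = [(100, 0.9), (200, 1.0), (400, 1.1), (800, 1.2)]

    def select_operating_point(utilization, current_freq):
        """Pick the slowest point whose frequency still covers the demand."""
        demand = utilization * current_freq        # MHz of work requested
        for freq, volt in OPERATING_POINTS:        # sorted ascending
            if freq >= demand:
                return freq, volt
        return OPERATING_POINTS[-1]                # saturate at maximum

    freq, volt = select_operating_point(utilization=0.3, current_freq=800)
    print(f"scaled to {freq} MHz @ {volt} V")      # 240 MHz demand -> 400 MHz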

But there's a problem that will have to be addressed in 2008 - advanced power management techniques greatly complicate verification. "Power has risen to become the major design issue facing the industry," said Jerry Frenkil, CTO of Sequence Design. "Less obvious, but no less thorny, is the nasty interplay between power and verification and power and test."

"To address low power design requirements, both static and dynamic functional verification tools will need to advance to accurately handle supply voltage as a functional input," noted Synopsys' Borgstrom. "Multi-voltage analysis and multi-voltage simulation will be necessary."

With multiple power domains, noted John Lenyo, director of marketing for Mentor Graphics' design, verification and test division, state data must be retained and restored, and wires must be connected to the boundary of the isolated power domain. Multi-voltage systems require level shifting to swing logic values from one voltage domain to another. "Historically, verification of the functional implications of low-power design has been performed late in the process, typically after physical design, as all relevant information was not available sooner," he said.
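
A toy model shows why this complicates verification: without isolation, a powered-down domain drives unknown values into live logic, and without retention, state is lost across the power cycle. (Hypothetical Python model; real flows capture this intent in UPF or CPF.)

    class PowerDomain:
        """Toy power domain with retention and output isolation."""
        def __init__(self, name):
            self.name, self.on = name, True
            self.state, self.retained = 0, None
        def power_down(self):
            self.retained, self.on = self.state, False  # save state first
            self.state = None                            # contents are lost
        def power_up(self):
            self.on = True
            self.state = self.retained                   # restore saved state
        def output(self, isolate=True):
            if self.on:
                return self.state
            return 0 if isolate else None   # None models corrupted 'X' values

    dom = PowerDomain("audio")
    dom.state = 42
    dom.power_down()
    assert dom.output(isolate=True) == 0      # isolation clamps the boundary
    assert dom.output(isolate=False) is None  # without isolation: X propagates
    dom.power_up()
    assert dom.state == 42                    # retention restored the state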

The Accellera Unified Power Format (UPF) will allow earlier functional verification of low-power design intent, Lenyo said. Mentor, Magma, and Synopsys support UPF. What doesn't appear to be in the cards for 2008, however, is any convergence between UPF and the rival Common Power Format (CPF) developed by Cadence Design Systems and currently managed by the Silicon Integration Initiative.

5. EDA applications move to parallel computing
While EDA vendors are working to support the design of next-generation multicore CPUs, the bigger challenge they face is making use of them to accelerate their own software. According to Gary Smith, the rewriting of EDA tools for parallel computing is the issue with the "largest potential impact on the competitive makeup of EDA" in 2008. This is no trivial matter. Smith has noted that rewriting a large CAD application can be a three-year process.

Startups appear to be leading the parallel computing charge. Among them is Xoomsys, with its parallel Spice capability. "An overarching trend will be the move toward more parallel computing in EDA," said Raul Camposano, Xoomsys president and CEO and former Synopsys CTO. "Eventually most tools that haven't already been parallelized will have to be rewritten."

Startup CLK Design Automation rolled out a multi-threaded static and statistical timing analyzer in 2007. Extreme DA quickly followed suit. According to Isadore Katz, CLK president and CEO, a 16-processor machine based on 4 quad cores with 64 Gbytes of memory will soon cost less than $20,000. This will become the "default server configuration," he said, and EDA software must take full advantage of such platforms in order to provide adequate turnaround times for 65 nm and 45 nm designs.
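
Some EDA workloads parallelize naturally; independent timing corners, for instance, can fan out one per core. The sketch below uses Python's multiprocessing to make the point (the corner list and delay model are illustrative; production timers parallelize at much finer grain):

    from multiprocessing import Pool

    # Hypothetical process corners: (name, voltage, temperature).
    CORNERS = [("ss_0.9V_125C", 0.9, 125), ("tt_1.0V_25C", 1.0, 25),
               ("ff_1.1V_-40C", 1.1, -40), ("ss_0.9V_-40C", 0.9, -40)]

    def analyze_corner(corner):
        """Stand-in for one full static timing run at one corner."""
        name, volt, temp = corner
        # Toy delay model: slower at low voltage and high temperature.
        delay = 100.0 / volt * (1 + 0.001 * (temp + 40))
        return name, delay

    if __name__ == "__main__":
        with Pool() as pool:                  # one corner per worker
            results = pool.map(analyze_corner, CORNERS)
        worst = max(results, key=lambda r: r[1])
        print(f"worst corner: {worst[0]} ({worst[1]:.1f} ps)")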

The upcoming market battle for 45 nm IC physical design may be decided, in part, on the basis of parallel computing. One attraction of Sierra Design Automation's Olympus-SoC design suite, now provided by Mentor Graphics, is its multiprocessing capability. Pyxis Technology's NexusRoute "yield-driven" autorouter, introduced in September 2007, offers multithreading. Atoptech's Aprisa, the newest netlist-to-GDSII design suite, also offers parallel processing over multiple workstations or CPUs.

6. As 45 nm hits production, DFM challenges rise
45 nm IC design will come on strong in 2008. "TSMC already has the foundry sector's first commercial 45 nm products in production," said John Wei, senior director of advanced technology marketing at TSMC. "Furthermore, our 45 nm prototyping vehicles are heavily booked through 2008, indicating high interest in the technology."

But 45 nm design will require more attention to design for manufacturability (DFM) than previous process nodes. "At 45 nm, designers will encounter lithography distortions of the poly and diffusion shapes that unavoidably alter transistor electrical characteristics," said Ricardo Borges, manager of product marketing in Synopsys' Silicon Engineering Group. "They will also encounter changes to transistor performance due to mechanical stress, which is being introduced into silicon to boost performance. Designers will need model-based tools, methodologies and design flows to account for parametric variability."

"In 2008, increased amounts of process intelligence will be incorporated into the design flow," said Prashant Maniar, chief strategy officer at Stratosphere Solutions. This intelligence, he said, will be oriented to electrical parameters and based on "a language that front-end designers actually understand." Chipmakers are investing substantial R&D into building an accurate statistical process model for designers, he said.

Closure of 45 nm designs, said Joe Sawicki, vice president of Mentor's design-to-silicon division, requires concurrent optimization across dozens of corners. He noted that IC physical verification is evolving to include more design rules and DFM checks, that parallel processing is speeding up physical verification, and that statistical analysis of volume test data is gaining popularity to speed up failure analysis.

Below 45 nm, however, DFM tools and methodologies won't be the only approach to achieving acceptable yields. Restricted design rules (RDRs) are widely expected to become more prominent. Another possible approach, advocated by PDF Solutions, is to restrict layouts to a handful of pre-characterized, regular patterns.

45 nm and 32 nm designs also present new material choices. IBM has based its Cu-45HP 45 nm ASIC family on silicon-on-insulator (SOI) technology, which promises better power and performance characteristics than bulk CMOS. An SOI Consortium has been formed to build an infrastructure that can bring SOI to fabless and ASIC designers. But SOI is controversial. Intel has rejected the technology for its 45 nm ICs, opting instead for a high-k metal gate process to boost power and performance.

"High-k metal gates will be the primary transistor solution at 32 nm," said Mike Smayling, senior vice president at Tela Innovations. TSMC's Wei said that high-k gates will be "mainstream at 32 nm for sure." SOI and high-k gates can work together, however - IBM and several partners announced in December 2007 the incorporation of high-k gates into a new generation of 32 nm SOI technology.

7. IP trends: configurable processors, mixed-signal cores
The silicon IP market continues to grow rapidly, and in 2008 two themes will be prominent. One is the use of configurable processing elements, and the other is the outsourcing of analog/mixed-signal IP.

As noted at ARC International's ConfigCon conference in December 2007, configurable processors can be a boon for mobile, multi-media devices, where designers are confronting a proliferation of standards, the need for power management, and time-to-market pressures. Others agree. "What is clear, regardless of implementation, is the need for more programmability and flexibility," said Jack Browne, vice president of marketing at MIPS Technologies. SoC designers, he said, "need configurable solutions that can scale with their product line."

Grant Martin, chief scientist at Tensilica, noted that "configurable, extensible processors are being found in a wider and wider range of devices." Dedicated application processors that may not be easily recognized as processors are providing audio and video functions, he noted. But programming and integrating these processors remains challenging, he said.

MIPS' Browne also said that 2008 will be a "big year for the outsourcing of analog and mixed-signal IP." Since analog design is difficult, many companies can no longer afford internal analog design teams, he noted. Warren Savage, president and CEO of IPextreme, noted that the unavailability of mixed-signal IP for new process nodes is delaying foundry customers' decisions to move to these nodes.

The use of mixed-signal IP for protocols like USB, PCI Express, SATA and DDR2 is becoming more commonplace, said Navraj Nandra, director of product marketing for mixed-signal IP at Synopsys. There are two challenges, he said - supporting these interfaces on 45 nm processes with 1.8V oxides, and achieving higher speeds for the protocols.

8. Memory importance and options increase
Memory is already a key part of many SoC designs, and may take on new importance in 2008. "Memory will finally move to the forefront of electronic design," predicted Juan-Antonio Carballo, general partner at Argon Venture Partners. "New and maturing memory materials, device designs, circuit architectures, and system usage models will make memory the key differentiator in an increasing number of designs." Carballo predicted an increasing array of options for memory architectures and IP, and said that "EDA tools will start changing accordingly."

Consumer devices are demanding increasing bandwidth and signaling speeds, and memory solutions must step up to these needs, said Steve Woo, technical director at Rambus. In 2008, he said, memory innovations must support faster data transfer rates and higher capacity. Rambus, a provider of embedded DRAM technology, announced in November 2007 an initiative aimed at delivering a terabyte-per-second memory bandwidth.
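
For scale, a back-of-the-envelope check shows what a terabyte per second implies at the pin level (the per-pin rate here is an assumed figure, not Rambus' disclosed architecture):

    # Back-of-the-envelope: aggregate bandwidth = pins * per-pin rate.
    target_tbytes_per_s = 1.0               # the stated terabyte/s goal
    rate_gbps_per_pin = 16.0                # assumed per-pin signaling rate
    gbits_needed = target_tbytes_per_s * 8 * 1000   # 8000 Gbit/s total
    pins = gbits_needed / rate_gbps_per_pin
    print(f"{pins:.0f} pins at {rate_gbps_per_pin:g} Gbps each")  # 500 pins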

9. FPGA design is more like ASIC design
FPGA design used to be much easier than ASIC design, but no more. In 2008, said Jeff Garrison, senior director of marketing at Synplicity, timing closure for complex FPGAs will drive the adoption of physical synthesis. He noted that FPGA physical synthesis considers the detailed routing resources of the target FPGA, and performs a full-chip placement simultaneously with logic optimization.

The availability of vendor-independent IP is becoming increasingly important for FPGA designers, said Daniel Platzker, product line director of Mentor's design and synthesis division. It's important to ensure that designs are not locked to a specific FPGA vendor's IP, he noted. Other trends Platzker cited include the increasing use of SystemVerilog for FPGAs, design at higher levels of abstraction, more use of FPGAs for ASIC prototyping, and the adoption of physical synthesis.

10. Packages and boards come into the picture
EDA is historically very IC-centric, but electronic systems include much more than just chips. One trend that emerged in 2007 and will continue in 2008 is the importance of IC/package/pc-board co-design. In October 2007, Magma Design Automation bought co-design provider Rio Design Automation, and Apache Design purchased Optimal Corp. to put together an IC/package/pc-board power and signal integrity solution.

"Designers can no longer ignore the impact of IC power and noise on the package and system design, nor can the IC designers ignore the effects of package parasitics in their design," said Dian Yang, senior vice president of product management at Apache Design Solutions. "The need to address true IC package co-design and co-verification will drive the EDA industry to expand from its IC centric view to a system-centric view," Yang said.

There may not be a lot of news about printed circuit board (PCB) design, but increasing complexity is driving a need for new approaches, said John Isaac, director of market development in Mentor's systems design division. "We foresee more high-speed content in products and multi-gigabit-per-second trends in inter-IC communications," he said. There will also be more use of fabrication technologies such as microvias, flex boards, and 3D packaging, he said. "Intellectual property creation and concurrent multi-disciplined design collaboration with remote teams may speed up PCB design," Isaac said.

Conclusion
We've identified ten trends that we believe will impact chip and system design in 2008. Taken together, these trends suggest that 2008 will be a year of profound change and progress as design and verification tools, methodologies, and silicon IP scramble to keep up with the exploding complexity of electronic devices.

By Richard Goering
