Wednesday, June 11, 2008

DAC 2008 Trip Report

Overview




DAC 2008 was significantly less crowded – unofficial counts were down 46% from last year. However, Monday (the free day) seemed very crowded. Cadence was not present at this year's DAC, but they weren't missed. While there are a lot of negative comments in terms of both the size and value of the conference, from my (Analog) perspective I was very pleasantly surprised. I am very glad I attended, and I got a lot more out of it than I was expecting.


There are two key areas I was interested in: Analog/Mixed Signal and IP Qualification/Validation. I saw other interesting things but they would be better addressed by my peers.


Analog Mixed Signal Space


Over the past several years, OpenAccess (OA) has been touted as the new open-source database from Cadence.


Background: This database replaces the traditional CDB database and, as of Virtuoso 6.1, is the only supported database going forward for Virtuoso (i.e. Analog) and Encounter (i.e. Digital). The benefits of shifting to this database are numerous: you get to use the latest and greatest tools from Cadence, and it works across the flow front to back – in other words, both the Analog AND the Digital tools can talk to the same dataset. It's fast and highly scalable. It is also open source and driven by the Si2 organization. Lastly, and perhaps the biggest reason to shift, is that because the database is open source, other tools can talk to it natively without any data loss. Using traditional tools/databases, you must do a data transfer from one database to another. Cadence has used the CDB database, Synopsys has Milkyway, Magma has Volcano, each of which is proprietary to the parent company. So to translate from one database to another you must go through the traditional transfer mechanisms (LEF/DEF/GDS/EDIF). In doing this you lose all connectivity information between the designer's intent (Schematic/Verilog) and the layout (GDS). By using a common database, OA retains that critical information and allows other tools to manipulate the data.
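To make the data-loss point concrete, here is a toy sketch (this is NOT the OpenAccess API – all names are invented for illustration) of why a geometry-only export like GDS drops connectivity, while a shared database keeps the link between shapes and nets:

```python
# Toy model: in a design database, each shape knows which logical net
# it implements.  A GDS-style export keeps the geometry but drops that link.
design_db = {
    "cell": "opamp",
    "shapes": [
        {"layer": "metal1", "bbox": (0, 0, 10, 2), "net": "vdd"},
        {"layer": "metal1", "bbox": (0, 5, 10, 7), "net": "out"},
    ],
}

def stream_out_gds(db):
    """Mimic a GDS export: pure geometry, net annotations are dropped."""
    return [{"layer": s["layer"], "bbox": s["bbox"]} for s in db["shapes"]]

def shared_db_view(db):
    """A second tool reading the same database sees geometry AND intent."""
    return db["shapes"]

gds = stream_out_gds(design_db)
print("net" in gds[0])                         # False – connectivity lost
print("net" in shared_db_view(design_db)[0])   # True – intent preserved
```

The second tool never had to round-trip through LEF/DEF/GDS/EDIF, so nothing was thrown away in transit – that is the whole argument for a common database.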


So why is that important? In short, because other tools can now access the data, tools that can do the same tasks Virtuoso does (schematic capture/layout) can now compete natively using the same database. This gives Cadence some competition, and I think we will see the start of the fragmentation of the analog space. This will include both better tools and cheaper replacements. This is not new, however; it has been on the horizon at DAC for the past three years. This year, though, we saw real products – from the big players (Synopsys-Orion, Magma-Titan), which offered full replacements, down to smaller startups with very interesting point-based technology (AnaGlobe – zero-refresh layout, Analog Rails – connectivity-correct layout, Helix – floorplanning).


All of this is being driven by the use of the open-source OA database. But we are also seeing significant work on, and adoption of, PDK standardization efforts from the IPL (Interoperable PDK Libraries) alliance. The IPL is a coalition of companies (primarily EDA vendors) working to standardize Process Design Kit data. This year they were able to bring TSMC on as a primary driver, and because of this they were able to demonstrate a full analog methodology using a TSMC PDK which was not Virtuoso-based but behaved virtually identically to the Virtuoso PDK. Further, they demonstrated the interoperability of this PDK between several competing solutions for front-end design and layout (Synopsys-Orion, SpringSoft-Laker), with tie-ins from other companies for physical verification (Synopsys-Hercules, Mentor-Calibre).


Again, why is all of this important? Pricing leverage. Cadence is no longer the only analog EDA tool shop in town. By demonstrating that this flow is fundamentally possible using other tools, and by adopting some of these components when SMSC develops a new (90nm) PDK, we could use that as a bargaining chip when re-doing licensing deals, because we are no longer chained to Cadence. That being said, there is additional effort which would need to be incurred to adopt some of these components (PyCells/IPL CDFs), but at least we have seen that it's possible.


IP Qualification – Fenix-DA


Over the past year, Fenix-DA has really started to evolve into a generic IP Qualification and Validation tool. This year they introduced several new features in their Crossfire product which will drive us to take another look at Crossfire. Specifically, they have added a generic IP Qualification path for macros, flow-based prep work, and a rich API that allows you to bolt on components you may require for your IP flow. All of these components are necessary when looking at generic macros.


In the past, Fenix-DA was focused on Standard Cell qualification, and they provide a rich system for handling this data. Fundamentally, Fenix-DA is both a Cadence Connections Partner and part of the Synopsys partner program, which gives them access to the underlying database APIs. This is important for a number of reasons, but fundamentally (as stated above) we can extract much more information from the database itself than we can by simply looking at the results of a database operation (streamOut/Verilog out). Within a single tool they can talk to both databases – and they are doing this to ensure consistency between them – hence the validation/qualification piece. They initially focused on Standard Cells, which are predictable (an AND gate function is well understood), but this becomes much more challenging when you look at macros.


Now that they support the QA/validation of macros, the challenge is a bit more generic. True, Crossfire can't do the predictive validation that it does on standard cells, but it doesn't need to. The needs for macros (at least initially) are more generic: pin name/direction validation, .lib validation, and LEF/GDS validation are all equally important and time-consuming to do by hand. While automating pieces of this is straightforward, they are already doing it, which makes it very compelling.
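To show what I mean by this kind of check, here is a minimal sketch (my own hypothetical example, not Crossfire's implementation – the snippets and regexes are simplified stand-ins for real Liberty/LEF parsing) of a pin name/direction cross-check between a .lib fragment and a LEF fragment:

```python
import re

# Simplified fragments of the same macro's views, deliberately inconsistent.
LIB_SNIPPET = """
pin (VDD) { direction : inout ; }
pin (OUT) { direction : output ; }
pin (IN)  { direction : input ; }
"""

LEF_SNIPPET = """
PIN VDD DIRECTION INOUT ; END VDD
PIN OUT DIRECTION OUTPUT ; END OUT
PIN EN  DIRECTION INPUT ; END EN
"""

def lib_pins(text):
    """Extract {pin: direction} from a (simplified) Liberty fragment."""
    return {m.group(1): m.group(2).lower()
            for m in re.finditer(r"pin\s*\((\w+)\)\s*\{\s*direction\s*:\s*(\w+)", text)}

def lef_pins(text):
    """Extract {pin: direction} from a (simplified) LEF fragment."""
    return {m.group(1): m.group(2).lower()
            for m in re.finditer(r"PIN\s+(\w+)\s+DIRECTION\s+(\w+)", text)}

def compare(lib, lef):
    """Flag pins missing from either view, or with mismatched directions."""
    issues = []
    for name in sorted(set(lib) | set(lef)):
        if name not in lib:
            issues.append(f"{name}: in LEF but missing from .lib")
        elif name not in lef:
            issues.append(f"{name}: in .lib but missing from LEF")
        elif lib[name] != lef[name]:
            issues.append(f"{name}: direction mismatch")
    return issues

for issue in compare(lib_pins(LIB_SNIPPET), lef_pins(LEF_SNIPPET)):
    print(issue)
# EN: in LEF but missing from .lib
# IN: in .lib but missing from LEF
```

Multiply this by every view of every macro in a release and you can see why having it already automated is so attractive.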


This year they have also introduced a customizable, flow-based system which allows you to create custom flows depending on your needs. You would use this flow-based approach to help generate the collateral needed prior to the release of IP. For example: I need to stream a GDS file out of Cadence. Crossfire has the Cadence API under the hood, so stream-outs are handled simply. A more complicated example would be LVS: build a customized flow to support both the cdlOut and the streamOut, run LVS, and validate the results. Pretty awesome indeed.
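The shape of such a flow can be sketched in a few lines (all names here are hypothetical – this is the idea of chaining collateral-generating steps, not Crossfire's actual flow API):

```python
# Minimal flow runner: each step reads and extends a shared context,
# so later steps (LVS) can consume collateral from earlier ones
# (cdlOut, streamOut).  The step bodies are stand-ins for real tool runs.

def cdl_out(ctx):
    ctx["netlist"] = f"{ctx['cell']}.cdl"   # stand-in for a real CDL export
    return ctx

def stream_out(ctx):
    ctx["layout"] = f"{ctx['cell']}.gds"    # stand-in for a real GDS export
    return ctx

def run_lvs(ctx):
    # A real step would invoke the LVS tool on the netlist and layout;
    # here we only check that both pieces of collateral were produced.
    ctx["lvs_clean"] = "netlist" in ctx and "layout" in ctx
    return ctx

def run_flow(steps, ctx):
    """Run each step in order, threading the context through the flow."""
    for step in steps:
        ctx = step(ctx)
    return ctx

result = run_flow([cdl_out, stream_out, run_lvs], {"cell": "bandgap"})
print(result["lvs_clean"])  # True
```

The appeal of the real product is that the Cadence/Synopsys exports are built in, so steps like cdlOut and streamOut come for free rather than being stubs.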


The last key benefit is that if the graphical, flow-based UI approach will not meet your needs, they have an open API (written in Python) which you can access directly. This is the most promising feature, because I think of it as an API to the API. Now, through a single tool, I can directly access all components of an IP block using both the Cadence API and the Synopsys API.


Conclusion


Overall I was very pleased with DAC 2008. I was surprised by what I saw, and in my opinion we will see a fragmentation followed by a subsequent re-energization of the analog tools market. This will not only be good for EDA and analog design, but should also make this a much more interesting environment to work in.








3 comments:

Marketing EDA said...

Steven,

Nice trip report. What's your take on the SPICE and FastSPICE circuit simulators at DAC this year?

Steven Klass said...

You know I didn't really look at the spice/FastSpice simulators. Since the recent release of SpectreTurbo I see very little value in looking elsewhere.

Uday A P said...

Hi Steven,
Thanks first of all for throwing light on DAC 2008.
Good contents in all your blogs.
Do you Have any idea on ballistic tool?