    Wednesday, June 03, 2009

    EDA Standardization: The Next Wave Is Here!

    In one of my earlier posts (EDA Standards I'd Love To See), I argued for a standard interconnect extraction rule format to be used by extraction tools. I quote:

    "There are only so many things one expects to be present in an RC Deck. These values, much like in DRC decks, are constant for a given process. It can't be hard to develop a common extraction rules format. Vendors can still add value via the speed and accuracy of their RC estimation algorithms."

    TSMC has now developed a common RC extraction deck format called iRCX that can be used as the input to all RC extraction tools. The utopian promise: RC tools all support the same standard for rules and compete on accuracy and runtime. According to the press release, this standard is the first of many to come (I'm expecting that DRC & LVS rule formats are next in line).

    Sunday, March 29, 2009

    Reframe and Reuse: Extending The Capabilities Of EDA Tools Through Creative Reframing

    There's a joke that goes like this (from Mike Cook's List Of Math Jokes) :

    A physicist and a mathematician are sitting in a faculty lounge.
    Suddenly, the coffee machine catches on fire. The physicist grabs a
    bucket and leaps towards the sink, fills the bucket with water and
    puts out the fire. The second day, the same two sit in the same
    lounge. Again, the coffee machine catches on fire. This time, the
    mathematician stands up, gets a bucket, hands the bucket to the
    physicist, thus reducing the problem to a previously solved one.
    My interpretation: Make the physicist do all the work!


    There are many problems in ASIC design for which straightforward solutions/tools may not exist. You want the tool to consider timing, area, power and X. The rub is that the tool does not support measuring or optimizing for X. In such cases, it is worthwhile to explore the possibility of reframing X into something the tool does understand.

    By reframing the problem, you save yourself a whole lot of work:
    • You don't code, test and release a tool that supports concurrent optimization of X, timing, area, power, etc.
    • You don't create a tool that optimizes only X and spend time iterating between X optimization and standard optimization
    • You get to leverage the powerful algorithms built into your EDA tool to concurrently optimize for timing, area, power, X, Y, Z, ...
    For example, if X is uniform placement density and the tool does not explicitly support it, you can recast the problem as one of uniform/max/min metal density, which is supported via the DRC engine. Create a virtual (user-defined) metal layer with associated density rules to represent placement density, and enhance each cell's physical view so that the tool sees the cell's entire area covered by one big piece of virtual metal. During optimization, the tool will then try to meet the virtual layer's density requirements (a stand-in for uniform placement density) along with its standard cost functions (area, timing, power, etc.).
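    To make the reframing concrete, here is a minimal sketch of the check the DRC engine would effectively be running: each placed cell contributes one rectangle of "virtual metal" covering its footprint, and a window-based density rule then bounds placement density. All function names, window sizes and density limits below are illustrative assumptions, not any tool's actual API.

```python
# Sketch: placement density recast as metal density on a virtual layer.
# Rectangles are (x1, y1, x2, y2) tuples; assumes a legal placement,
# so cell rectangles do not overlap one another.

def window_density(rects, wx, wy, w):
    """Fraction of the w-by-w window at (wx, wy) covered by rectangles."""
    covered = 0.0
    for x1, y1, x2, y2 in rects:
        # Clip each rectangle to the window and accumulate the overlap area.
        ox = max(0.0, min(x2, wx + w) - max(x1, wx))
        oy = max(0.0, min(y2, wy + w) - max(y1, wy))
        covered += ox * oy
    return covered / (w * w)

def density_violations(rects, die_w, die_h, w, dmin, dmax):
    """Windows whose virtual-metal (i.e. placement) density is out of bounds."""
    bad = []
    y = 0.0
    while y + w <= die_h:
        x = 0.0
        while x + w <= die_w:
            d = window_density(rects, x, y, w)
            if not (dmin <= d <= dmax):
                bad.append((x, y, d))
            x += w
        y += w
    return bad

# Example: four cells crowded into the lower-left of a 20x20 die,
# checked with 10x10 windows and a 20%-80% density band. The crowded
# window (density 1.0) and the three empty ones (0.0) are all flagged.
cells = [(0, 0, 5, 5), (5, 0, 10, 5), (0, 5, 5, 10), (5, 5, 10, 10)]
print(density_violations(cells, 20, 20, 10, 0.2, 0.8))
```

    A placer that already minimizes virtual-layer density violations will therefore spread cells out, without ever knowing it is optimizing "placement density".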



    Monday, January 26, 2009

    In The Shameless Pursuit Of Average : Uniformity As A Metric In Physical Design

    Physical design tools aim to create a placed and routed design exhibiting the best area, timing and power. As one progresses to smaller technology nodes, the additional steps in the flow deal with more localized phenomena.

    • Dynamic IR drop is a local phenomenon in both time and space. It occurs when a group of cells in the same region switch at the same time.
    • DFM-related phenomena such as CMP effects and CAA hotspots are local in space. They occur due to the layout geometries in a very small window area.
    The usual approach in the ASIC design flow is to implement the design first and fix violations due to local effects later. This is a valid trade-off. The overhead in tracking or resolving local effects while implementing the whole design would be high so it makes sense to follow the "design-first-fix-later" approach. Is there another effective way to skin this cat? Let's state our problem in this way:

    Local violations occur when local phenomena are far from the full-chip average.
    • If the power consumption in a small window is larger than the design average, there would be more static IR drop in that window.
    • If the power consumption is high in both a small layout window as well as a small timing window, it would result in dynamic IR drop violations.
    • If the routing density in a local window is high, it would result in CAA hotspots.
    • If the routing density in a local window is too low, it would result in CMP hotspots.
    The obvious solution that presents itself is:

    Rather than fix violations later, add uniformity as a cost function in physical design tools to avoid the occurrence of violations in the first place.

    • If uniform power density is added to the tool's cost function, we would not see large static IR drop in local layout windows.
    • If we can spread power density in both time and space, dynamic IR violations can be reduced as no local layout window will consume a large amount of power in a small time window.
    • If we can add uniform routing density as part of the cost function, we can avoid CAA hotspots a priori rather than spreading out wires later.
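    A uniformity cost term of the kind proposed above can be sketched very simply: measure some quantity (power, routing density) per layout window and penalize deviation from the full-chip average, so the optimizer is pushed toward a flat profile. The function names, weighting knob and window values below are illustrative assumptions, not from any real tool.

```python
# Sketch: uniformity as a cost function over per-window measurements.

def uniformity_cost(window_values):
    """Mean squared deviation of per-window values from the chip average."""
    avg = sum(window_values) / len(window_values)
    return sum((v - avg) ** 2 for v in window_values) / len(window_values)

def total_cost(timing_cost, area_cost, power_windows, alpha=1.0):
    """Standard cost terms plus a weighted uniformity term (alpha is a knob)."""
    return timing_cost + area_cost + alpha * uniformity_cost(power_windows)

# A lumpy power map costs more than a flat one with the same total power:
# the hot window in `lumpy` is exactly the local IR-drop hotspot we want
# the optimizer to avoid creating in the first place.
lumpy = [9.0, 1.0, 1.0, 1.0]
flat = [3.0, 3.0, 3.0, 3.0]
print(uniformity_cost(lumpy), uniformity_cost(flat))  # lumpy > flat
```

    The design choice here is the same one the post argues for: rather than detecting hotspots after the fact, the deviation-from-average term makes every move that flattens the profile look cheaper during optimization itself.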
