During the physical design phase of an IC, design teams tend to focus mainly (and sometimes exclusively) on the synthesis and layout tasks. The chip finishing tasks are pushed down the priority list and receive attention only when power analysis or timing closure becomes urgent.
This is a highly detrimental practice, especially in the case of large hierarchical designs!
Issues identified during top-level finishing can force flow modifications at the block level, causing blocks to be reopened and schedules to slip! Moreover, because different parts of the design are farmed out to different design engineers, such issues raise concerns about consistency across library versions, flows, procedures and tools.
In this article, team InSemi sheds light on the main issues in hierarchical physical design and the methodologies that help get the best out of it.
Hierarchical Physical Design
For larger designs it is always beneficial to divide the work into small, easily manageable blocks that are later combined at a higher level of hierarchy to form the complete chip. Although this physical partitioning usually follows the same boundaries as the logical hierarchy, the physical hierarchy is still distinct from the logical hierarchy described by the HDL language structures.
So, let's examine what happens when the partitioned blocks are integrated back together into the final IC.
Data Consistency
Data consistency means that all design team members work from a common set of source data that remains consistent throughout the design and partitioning process. It is an important aspect to consider, and also the one most likely to be overlooked.
A typical design might use half a dozen cell libraries, two or more pad libraries, several IP blocks and a number of in-house macros. Good data management means tracking the release versions of all of them, along with the tool-specific databases and the flow scripts.
Vulnerabilities to watch out for in data consistency are:
- Tools & Platform Differences: The same original data is looked upon in a different way by the different tools. For instance, the IP Block’s physical layout is generally developed in a GDS file, which needs to be transformed into the database-supported format for place & route tools. In the tapeout phase the place & route tool will develop the GDS for the full chip. Here it is worth checking that no piece of information has been lost in the translation.
- Data from Different Libraries: Tools used at different stages of the flow access different library data. The location of each data set must be recorded in a configuration file, in the flow scripts or in the design database to avoid confusion, and the database must be kept coherent and consistent. Every tool in use must reference the correct version of the IP data, and those references must be kept up to date through every version change (a lightweight version check of the kind sketched after this list can help).
- Hard-Coded Programming: ROM programming is often finalized only near the end of the project, and is sometimes delivered just a few days before the final tapeout. The design team needs to be certain that this late data is picked up during stream-out, and that the reference (schematic/source) netlist is also updated to incorporate the new coding.
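Much of this comes down to pinning library and IP versions in one place and checking every block against that pin. Below is a minimal Python sketch of such a check, assuming a hypothetical project-level manifest (project_libs.json) and a per-block manifest (libs.json); the file names and format are illustrative, not part of any standard flow.

```python
#!/usr/bin/env python3
"""Check that every block references the library/IP versions pinned at project level.

The manifest layout (project_libs.json at the top, libs.json per block) is an
assumption for illustration, not part of any standard flow.
"""
import json
import sys
from pathlib import Path


def load_versions(path: Path) -> dict:
    """Return {library_name: version} from a simple JSON manifest."""
    with path.open() as f:
        return json.load(f)


def check_blocks(project_manifest: Path, block_dirs: list[Path]) -> int:
    """Report every block whose library versions differ from the project pin."""
    golden = load_versions(project_manifest)
    mismatches = 0
    for block in block_dirs:
        block_versions = load_versions(block / "libs.json")
        for lib, version in block_versions.items():
            expected = golden.get(lib)
            if expected is None:
                print(f"[{block.name}] {lib}: not pinned at project level")
                mismatches += 1
            elif version != expected:
                print(f"[{block.name}] {lib}: uses {version}, project pins {expected}")
                mismatches += 1
    return mismatches


if __name__ == "__main__":
    blocks = [Path(p) for p in sys.argv[2:]]
    sys.exit(1 if check_blocks(Path(sys.argv[1]), blocks) else 0)
```

Running such a check as part of the nightly regression, rather than manually, is what keeps late library or ROM updates from slipping through unnoticed.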
Process Flow & Methodology
An efficient workflow and sound methodology are the keystones of quality design output, and both matter even more for hierarchical designs. A well-executed methodology reduces the probability of errors arising from the vulnerabilities described above.
The reference methodology of the chosen toolset should form the basis of the workflow, although such reference flows are not tied to any particular foundry technology.
For large hierarchical designs, flow management can be a real challenge. The major pitfalls are listed below.
- Different Versions of the Same Script: Several customized versions of the same script can exist for the different blocks in the design. The challenge is to apply a common project-level update and ensure the correct version of the script is used across all design blocks (see the sketch after this list).
- Data Consistency Between Blocks: As new versions of tools and databases come into play, consistency across the various blocks, and across the designers working on them, must be maintained. If a block requires a specific tool, that tool must be configured into the flow for that block.
- Functional Correctness: Signoff checks and their associated tools verify the manufacturability and functionality of the design. Third-party IP blocks may also require specific configuration to pass the related checks. A single, common set of rules must therefore be used for full-chip verification and applied to the development of every block.
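One way to keep script versions under control is to compare each block's copy of a flow script against the project-level golden copy and flag any drift. The sketch below assumes a hypothetical directory layout (flow/&lt;script&gt; for the golden copy, blocks/&lt;block&gt;/flow/&lt;script&gt; for the block copies); adapt it to your own project structure.

```python
#!/usr/bin/env python3
"""Flag block-level copies of a flow script that have drifted from the project-level
golden copy. The directory layout used here is an assumption for illustration."""
import hashlib
import sys
from pathlib import Path


def digest(path: Path) -> str:
    """SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def check_script(project_root: Path, script_name: str) -> list[str]:
    """Return the paths of block copies that differ from the golden flow script."""
    golden = digest(project_root / "flow" / script_name)
    drifted = []
    for block_copy in sorted((project_root / "blocks").glob(f"*/flow/{script_name}")):
        if digest(block_copy) != golden:
            drifted.append(str(block_copy))
    return drifted


if __name__ == "__main__":
    root, script = Path(sys.argv[1]), sys.argv[2]
    stale = check_script(root, script)
    for path in stale:
        print(f"DRIFT: {path} differs from golden flow/{script}")
    sys.exit(1 if stale else 0)
```

Where a block genuinely needs a customized script, the deviation should be recorded explicitly (for example in a waiver list) rather than silently tolerated.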
Library Quality Assurance
After ensuring an efficient flow and methodology, the next step is verification of the physical layout. Routing keep-outs around block edges prevent spacing violations with adjacent blocks and also reduce the chance of signal crosstalk. Dummy fill is inserted to address local metal density issues, and its impact on timing must be accounted for.
Quality assurance on the library data can be done to varying degrees, but at a minimum there should be a full run of DRC/LVS checks on the data. Ideally, these DRC/LVS checks should be run both on the GDS file as delivered and on the GDS file streamed out of the IP library database.
This provides an additional layer of assurance that no data quality has been lost during the format conversions.
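In practice this can be automated by running the same signoff deck on both GDS variants and comparing the results. The sketch below is only an orchestration skeleton: run_signoff.sh and the convention that its exit status carries the violation count are hypothetical stand-ins for whatever signoff tool and reporting scheme the project actually uses.

```python
#!/usr/bin/env python3
"""Run the same DRC/LVS deck on the vendor-delivered GDS and on the GDS re-exported
from the IP library database, then compare the reported violation counts.
run_signoff.sh and its exit-status convention are hypothetical placeholders."""
import subprocess
from pathlib import Path


def run_signoff_deck(gds: Path, deck: Path, workdir: Path) -> int:
    """Hypothetical wrapper: invoke the project's signoff script and return the
    number of violations it reports (assumed here to be its exit status)."""
    result = subprocess.run(
        ["./run_signoff.sh", "--gds", str(gds), "--deck", str(deck),
         "--workdir", str(workdir)],
    )
    return result.returncode


def compare_runs(delivered: Path, reexported: Path, deck: Path) -> None:
    """Flag any difference between the two QA runs."""
    v_delivered = run_signoff_deck(delivered, deck, Path("qa_delivered"))
    v_reexported = run_signoff_deck(reexported, deck, Path("qa_reexported"))
    print(f"delivered GDS:   {v_delivered} violations")
    print(f"re-exported GDS: {v_reexported} violations")
    if v_delivered != v_reexported:
        print("WARNING: violation counts differ; data may have changed during conversion")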
Layout vs. Layout Check
LVL allows two GDS files to be compared layer by layer, which is a great help in validating the IP blocks in the streamed-out GDS against the original data. The check also confirms that the specified IP versions have actually been implemented.
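As a rough illustration of the idea, the sketch below compares the (layer, datatype) polygon inventory of two GDS files using the open-source gdstk package. A signoff-grade LVL run performs a full geometric XOR; this is only a coarse consistency check.

```python
#!/usr/bin/env python3
"""Coarse LVL-style sanity check: compare polygon counts per (layer, datatype)
between two GDS files using the open-source gdstk package. Paths and cell
references are not expanded, so counts are per cell definition."""
from collections import Counter
import sys
import gdstk


def layer_histogram(gds_path: str) -> Counter:
    """Count polygons per (layer, datatype) across all cells in the file."""
    library = gdstk.read_gds(gds_path)
    counts = Counter()
    for cell in library.cells:
        for poly in cell.polygons:
            counts[(poly.layer, poly.datatype)] += 1
    return counts


def compare(gds_a: str, gds_b: str) -> bool:
    """Print differing layers and return True if the inventories match."""
    a, b = layer_histogram(gds_a), layer_histogram(gds_b)
    clean = True
    for key in sorted(set(a) | set(b)):
        if a[key] != b[key]:
            print(f"layer {key}: {a[key]} polygons vs {b[key]}")
            clean = False
    return clean


if __name__ == "__main__":
    sys.exit(0 if compare(sys.argv[1], sys.argv[2]) else 1)
```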
Reference Data from Common Library
As different engineers work on different design blocks, it is important that they link cells and IP libraries through a common set of references. Each IP has several associated data files, and all of them must be accessed from the same base location. Confining these references to a single shared configuration file with common path variables greatly reduces the chance of human error.
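A small loader like the one sketched below illustrates the idea, assuming a hypothetical shared ip_paths.json with a single base path and per-IP relative paths: every engineer and every tool setup script resolves IP views through the same file instead of hard-coding paths.

```python
#!/usr/bin/env python3
"""Resolve every IP data file from one shared configuration file so that all
engineers pick up identical references. The JSON layout (a "base" field plus
per-IP relative paths) is an assumption for illustration."""
import json
from pathlib import Path


def load_ip_views(config_file: Path) -> dict[str, dict[str, Path]]:
    """Return {ip_name: {view_name: absolute_path}} and fail loudly on missing files."""
    config = json.loads(config_file.read_text())
    base = Path(config["base"])
    resolved: dict[str, dict[str, Path]] = {}
    for ip, views in config["ips"].items():
        resolved[ip] = {}
        for view, rel_path in views.items():
            full = base / rel_path
            if not full.exists():
                raise FileNotFoundError(f"{ip}/{view}: {full} is missing")
            resolved[ip][view] = full
    return resolved


# Example usage (paths are hypothetical): every tool setup script calls the same
# loader instead of hard-coding its own paths.
# views = load_ip_views(Path("/project/shared/ip_paths.json"))
# lef_file = views["serdes_phy"]["lef"]
```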
Electrostatic Discharge Analysis
After the physical verification checks such as design rule checking and the LVL test comes ESD protection. During final integration it is highly recommended to review the ESD protection required for each IP block in the design. Most IPs have special power connectivity requirements, and additional structures such as diode clamps and pad cells may also be required.
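As a very rough illustration, a script like the one below can confirm that the expected ESD clamp cells appear at least once in the top-level gate-level netlist. The cell names are hypothetical placeholders, and such a text scan is no substitute for dedicated ESD connectivity checks.

```python
#!/usr/bin/env python3
"""Rough check that each expected ESD clamp cell is instantiated at least once in a
gate-level Verilog netlist. Cell names are hypothetical; real ESD signoff requires
dedicated connectivity analysis tools."""
import re
import sys
from pathlib import Path

# Hypothetical clamp cell names from the project's I/O / ESD library.
CLAMP_CELLS = ("ESD_CLAMP_CORE", "ESD_DIODE_VDD", "ESD_DIODE_VSS")


def count_clamps(netlist: Path) -> dict[str, int]:
    """Count whole-word occurrences of each clamp cell name in the netlist text."""
    text = netlist.read_text()
    return {cell: len(re.findall(rf"\b{cell}\b", text)) for cell in CLAMP_CELLS}


if __name__ == "__main__":
    counts = count_clamps(Path(sys.argv[1]))
    for cell, n in counts.items():
        print(f"{cell}: {n} instance(s)")
    sys.exit(0 if all(counts.values()) else 1)
```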
Winding Up
This discussion touched on some of the major aspects of hierarchical physical design, and it mainly boils down to good design practices. If these practices are not followed, there is a strong possibility of the chip eventually failing. Before signing off a design, the engineer must be confident in it and must have rechecked its functionality.
Human error can never be eliminated completely, and some residual risk is always associated with a chip. But with strict workflow management and these stringent design practices, that risk can be managed and greatly mitigated, and the design engineers can head towards the tapeout phase with much more confidence!