
Evaluating a Design Data Management System

Evaluating any EDA tool presents several challenges. You have several tools and vendors to choose from. You have to get past the marketing hype to determine what is really important to you and whether the supported feature set meets your requirements. Finally, you have to make sure that the features you need perform as advertised. And, of course, you have to do this evaluation while juggling all your other tasks.

Evaluating a design data management (DDM) system is further complicated by the fact that it is groupware. To be effective, all the engineers on the project must adopt the system. Different sub-groups may use different tools and may have different requirements. The needs of the analog designers may differ from those of the digital designers. The effectiveness of the DDM system is best judged in a real and interactive group environment, which is difficult to simulate during an evaluation cycle when only a few people are using the system in a test environment.

It is very important to define the list of requirements for your team and their relative importance BEFORE you start your research on DDM vendors. This will help you resist being seduced by vendor marketing.

The following are some of the factors that are likely to drive your requirements:

  • Geographical Separation: Are your project members spread across multiple sites now or is this likely to happen in the near future?
  • Group Size: How large is a typical project team likely to be? This will help you decide the relative importance of scalability and usability.
  • Project Duration & Size: How long do typical projects last and how big are they?
  • Design Flows: What is your design flow and what tools do you use?
  • DDM Experience of Team: How much prior experience does the team have with the use of design data management solutions and methodology?
  • Methodology: Having a well-defined design data management methodology that is suitable for your team (meeting your needs without being too complex) is very important. Roughly defining the methodology will help crystallize your requirements.

When gathering this information it is important to extrapolate for the future while being realistic.

Requirements to Consider

Most DDM solutions are likely to have the core features you need. The trick is to figure out which DDM solution will be optimal for your needs. With few people (sometimes just one engineer) involved in the evaluation process, it is difficult to simulate a real team development environment. However, it is very important to recreate real usage scenarios rather than simple repetitive tests that are easier to create but unrealistic. Here are some important requirements you should consider and tips on how to rate DDM solutions against these criteria.

Multi-Site Collaboration:

If you have multiple sites, or plan to have distributed teams in the future, then this is the most crucial criterion. DDM usability is typically affected by two factors: performance and how the system handles network issues.

Both network latency and available bandwidth affect performance. Users typically check in or check out a few files to modify. Therefore performance here is important but not critical. The bottleneck is usually the time it takes, at a remote site, to update a work area and get all the changes made at the other sites. Many hundreds, or even thousands, of files may have changed and must be brought across the WAN. DDM systems may employ different techniques to improve remote site performance.

Cache or proxy servers that cache the latest revisions of the files at the remote site: If the file revision is already in the cache, then users are automatically served the file from the local cache instead of going over the WAN. The first update would bring in the new file revisions across the WAN. However, subsequent updates by others would be a lot faster since the file revision is now available in the LAN. This reduces bandwidth usage (which your IT department will appreciate) and dramatically improves performance.  
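The read-through idea behind such a cache can be sketched in a few lines of Python. This is an illustration only; the names `RevisionCache` and `fetch_over_wan` are invented for the sketch and are not the API of any real DDM product:

```python
# Sketch of a read-through revision cache at a remote site.
# A file revision is immutable once checked in, so the cache never
# needs invalidation -- only new (path, revision) keys are added.

class RevisionCache:
    def __init__(self, fetch_over_wan):
        self._store = {}              # (path, revision) -> file contents
        self._fetch = fetch_over_wan  # slow call to the central server

    def get(self, path, revision):
        key = (path, revision)
        if key not in self._store:
            # Cache miss: the first request pays the WAN round trip.
            self._store[key] = self._fetch(path, revision)
        # Cache hit: later requests are served from the local LAN cache.
        return self._store[key]
```

The key property to verify during an evaluation is the one this sketch encodes: only the first request for a given revision crosses the WAN, and every subsequent request at that site is served locally.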

‘Push’ updates of remote caches: If sites are in close time zones, then changes may be happening at these sites simultaneously. If a push technology is deployed, the remote caches will get the latest revisions soon after they have been checked in. When a user requests that revision at the remote site, the file revision is likely to already be in the local cache.

A real possibility at a remote site is that there is a network or server outage and the users are unable to access the project for a few seconds or even for several hours. The DDM system should be able to handle a short network glitch safely, but, more importantly, should allow users to continue working with limited capabilities even if there is a prolonged outage.
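The desired behavior can be sketched as a retry wrapper with an offline fallback. This is a hypothetical sketch of the general pattern, not any vendor's actual recovery mechanism:

```python
import time

def with_retry(op, attempts=3, delay=1.0, offline_fallback=None):
    """Retry a server operation across a short network glitch; after
    repeated failures, fall back to a degraded offline mode if one
    exists (e.g., read-only access to locally cached revisions)."""
    for i in range(attempts):
        try:
            return op()
        except ConnectionError:
            if i < attempts - 1:
                time.sleep(delay)
    if offline_fallback is not None:
        return offline_fallback()
    raise ConnectionError("server unreachable and no offline fallback")
```

When evaluating, it is worth deliberately unplugging the network during a session to see which of these two behaviors (transparent retry, graceful degradation) the candidate system actually exhibits.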

Ease of Use & Administration

CAD engineers are already busy supporting the ever-increasing complexity of design flows and designers are working on complex projects with insane deadlines. New designers may join the team midway through a project. Hence a DDM solution should be easy to learn, easy to use and require minimal administration. This is especially important for startups, both because they are dynamic and because they cannot afford additional overhead. When deployed, the DDM system and your methodology should add minimal overhead to the design flow while streamlining the process and improving team productivity.

Hardware platforms and data storage for the DDM system can have a large impact on the administration costs as well as the time it will take to recover from a failure. Hardware failures that can bring work to a stop need to be prepared for in advance. Any specialized hardware will increase the overall cost due to higher initial costs and possibly the need to keep spares on hand. Specialized hardware also may require expensive hardware maintenance contracts (not to mention the added time to bring replacement hardware online). Any additions to your current data storage also are sure to increase cost (e.g., additional backup licenses) while possibly lowering the overall reliability of your data storage.

The only way to measure ease of use and administration is to actually deploy the DDM system on a real, if small, project. This will allow people with different levels of DDM knowledge to actually use the solution for an extended period of time. The feedback will be valuable and real. Skeptical designers will be able to directly experience the productivity gains.

Performance & Scalability

Large projects may grow in size to several hundreds of thousands of objects and many hundreds of gigabytes of data. Performance and scalability are very important factors to consider for these types of projects.

Often, performance tests are made by timing a script that will check in several thousand files, then check them all out, make some modification and check them all back in. Such a test is easy to set up but does not measure real usage. A typical usage scenario is that several engineers each check out and modify a small set of files. Therefore, timing check-in and check-out performance is not that important because these operations are usually performed on a small set of files after the initial check-in of the project. Usually, the time-consuming operation is updating the workarea to get all changes checked in by other engineers. This could be thousands of files, depending on the size of the project and the time since you last updated this workarea.

To measure performance and scalability, the following scenarios should be measured:

  • Are operations such as check-in and check-out affected by the size of the project, i.e., does the time to check in a file increase if the size of the project grows?
  • Does the time to update a workarea increase with the size of the project? Ideally it should be proportional to the number of new revisions to be brought into the workarea as a result of the update and not by the size of the project.
  • How does remote site workarea update compare to local update after all remote site performance features, such as caching, are turned on?
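The scenarios above can be scripted during an evaluation. Below is a minimal timing sketch, where `update_workarea` stands in for whatever update command your candidate DDM tool provides; the goal is to check that update time tracks the number of changed files, not the total project size:

```python
import time

def time_operation(op, *args):
    """Wall-clock a single DDM operation; returns elapsed seconds."""
    start = time.perf_counter()
    op(*args)
    return time.perf_counter() - start

def measure_update_scaling(update_workarea, workareas):
    """For each (workarea, changed_file_count) pair, record the update
    time so you can plot elapsed time against the number of incoming
    changes. A flat curve against project size is the ideal result."""
    results = []
    for workarea, changed in workareas:
        elapsed = time_operation(update_workarea, workarea)
        results.append((changed, elapsed))
    return results
```

Running the same script against a small project and a copy padded with dummy data makes the scalability comparison concrete.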

A very important factor that is often overlooked when measuring scalability is the use of resources such as disk space.  Although disk capacities are increasing and prices are falling, the cost of fault tolerant network storage and the management and backup of data is still expensive. As we all know, engineers are often hitting their disk quotas. Imagine a project which has grown to hundreds of gigabytes and has more than 50 designers. Even if each user has just one workarea, the total storage required would be prohibitive if workareas were created as physical copies.

DDM solutions employ different methods to optimize the use of disk space. IBM’s ClearCase® pioneered a unique virtual file system to tackle this problem. Though it is a technically elegant solution, it adds significant performance and administration overhead. Other DDM solutions have made use of native file system features such as symbolic links. Files in the workarea are created as symbolic links to a mirror or cache except for the files that the user is actually modifying. The files in the central cache or mirror are shared by everyone. This results in significant savings in data storage needs and scales well for larger projects. It is important to make sure that users have full control over their workareas even when symbolic links are being used. If the target of the symbolic link can change without the user’s knowledge, then verification and regression test results can become suspect.
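The symbolic-link approach can be illustrated with a short sketch. This is a simplification under stated assumptions (real DDM tools also track revisions, permissions, and cache updates); it only shows the space-saving idea itself:

```python
import os
import shutil

def populate_workarea(cache_dir, workarea, checked_out=()):
    """Sketch of a symlink-based workarea: every file becomes a symlink
    into the shared cache, except the files the user has checked out,
    which become real, writable private copies."""
    for root, _dirs, files in os.walk(cache_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, cache_dir)
            dst = os.path.join(workarea, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            if rel in checked_out:
                shutil.copy2(src, dst)  # private, editable copy
                os.chmod(dst, 0o644)
            else:
                os.symlink(src, dst)    # shared via the central cache
```

With fifty designers each holding a workarea, only the handful of checked-out files per user consume real disk space; everything else is a link into one shared copy.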

Integration with Flows

Hardware design flows are far more complex than software development flows since they include multiple tools, often from different vendors. The flow may include graphical tools such as schematic or layout editors. These tools produce several files and are hard to manage for several reasons:

  • A design object may be stored as multiple files by the design tool. These files together form the design object and therefore must be managed together as a single unit.
  • Tools also generate run files to maintain backups, logs, simulation output, etc. These files do not contain design data and need not and/or should not be managed.
  • DDM solutions modify the file system in many ways: they bring in updated revisions, change file permissions, etc. Each of these actions can affect the design flow.
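The run-file problem above is commonly handled with ignore patterns. Here is a minimal sketch; the patterns are made up for illustration and would need tailoring to the actual tools in your flow:

```python
import fnmatch

# Hypothetical patterns for tool-generated run files: logs, editor
# backups, simulation output. The real list depends on your tools.
IGNORE_PATTERNS = ["*.log", "*.bak", "simulation.out*", "*.tmp"]

def is_design_data(filename, ignore=IGNORE_PATTERNS):
    """Return True for files the DDM system should manage, False for
    generated run files that need not (or should not) be managed."""
    return not any(fnmatch.fnmatch(filename, pat) for pat in ignore)
```

During an evaluation, check whether the candidate DDM system lets you express this kind of filtering per tool or per library, so run files never clutter check-in dialogs.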

It is important that the DDM solution be tightly integrated with your design flow. The integration should make management of design data easy and error proof. You should look for at least the following capabilities:

  • Is the interface to DDM commands conveniently available directly within the design flow? For example, if you are using Cadence® Virtuoso®, then users should be able to access the DDM commands directly from the Library Manager and the various editors without having to open a separate tool or interface.
  • Does the DDM system recognize and manage design objects even if they are made up of multiple files? For data integrity and convenience, the co-managed set of files that represent a design object should be managed as a single object and not as separate files with program-generated names.

The design flow should recognize the presence of the DDM system and take appropriate action as needed. For instance, when you try to edit a design object, the system should prompt to check out the object if needed. Operations such as rename or hierarchical copy should correctly interact with the DDM system. If a design is open in an editor with unsaved changes then the system should prompt you to save if you try to check it in.

Your flow may also include commercial or in-house tools that the DDM system may not be integrated with. If you use such tools or plan to use new design tools in the future, then you should investigate how the tool will be able to handle data from design flows it is not integrated with. Check how customizable the DDM solution is and what it would take to manage data from a real or hypothetical design tool. Some factors to consider:

  • Command line interface and scripting.
  • Application programming interface.
  • User interface customizability.
  • Ability to automate management of complex design data.

Partitioning & Design Reuse

Design reuse is vital to improving the productivity of design teams. Process development kits (PDKs), intellectual property (IP), setup files and scripts are often reused from project to project. Typically, reusable blocks or IP are simply copied into a project and sometimes modified. This leads to duplication, and fixes and improvements are not shared between projects.

Sometimes a large project may need to be partitioned into more manageable sub-projects. There may also be other reasons to partition the project, such as:

  • Different blocks of the design are being developed at different sites. It may be more efficient to partition the project so that each site manages the part of the design it is primarily responsible for.
  • Requirements for an analog design flow may be quite different from a digital flow. Teams may have different levels of expertise with DDM solutions. Therefore, it may be convenient to split the project along functional lines.

If design reuse or design partitioning are important to you, then the DDM system should allow you to effectively create a project using blocks or libraries from other projects, whether the sub-projects are in the same site or across the WAN. Members of the super-project should be able to seamlessly access the referenced objects in the sub-project. Since the objects are referenced and not copied, you will reap the benefits of sharing the fixes and enhancements.

Access Controls

As projects grow in size, so does the number of engineers. Administrators may need to put limits and controls on who gets to do what and to which parts of the design. Here are just a few examples of the types of access controls that may be needed:

  • Ensure that designers do not edit layouts and layout engineers do not edit schematics.
  • Allow contractors to view only the parts of the design they are supposed to work on.
  • Lock down an entire project from any modifications while still allowing access to get data.
  • Allow only a designated set of engineers access to a project.
  • Restrict certain operations, like creating branches, to team leads only.

Each design team is unique and is likely to have different requirements. It may be impossible for you to even figure out what your needs may be in the future. Therefore you should investigate the flexibility of the access control models supported by the DDM system. Some features you may want to look for are:

  • Can users be assigned to groups for each project irrespective of their Unix/Linux group? You know how much effort it can be to get system administrators to add and change groups!
  • Can user roles be customized to control who gets to do which operations?
  • Can access privileges, including read privileges, be controlled per object?
  • Can certain processes or rules be enforced? For example, can you make sure that project members use a particular release of an IP library?
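A per-project role model of the kind these questions describe can be sketched as a table of roles and permitted operations. The role names and operations below are invented for illustration, not taken from any particular DDM product:

```python
# Each role maps to the set of operations it permits.
ROLE_PERMISSIONS = {
    "lead":       {"checkin", "checkout", "read", "branch", "lock"},
    "designer":   {"checkin", "checkout", "read"},
    "contractor": {"read"},
}

def can(user_roles, project, operation, roles=ROLE_PERMISSIONS):
    """True if any of the user's roles on this project allows the
    operation. user_roles maps project name -> list of role names."""
    return any(operation in roles.get(r, set())
               for r in user_roles.get(project, ()))
```

Note that the roles here are assigned per project, independent of Unix/Linux groups, which is exactly the flexibility the first question above asks about.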

Details: Technical Support and Cost

Once a design data management solution is deployed, it becomes the backbone of your design flow. A problem with a server can disrupt the entire team. An issue close to a tape-out can result in expensive delays.

Therefore, it is important to get a good measure of the technical support provided by the DDM vendor. During the evaluation process, make sure to engage with the technical support team and rate them both on the speed and quality of their response. Here are some areas where you can work with the technical support team to evaluate how well they can help you:

  • Installation and setup to meet the needs of your design team.
  • ‘How-to’ questions on advanced features.
  • Recommendations and customizations to tailor the DDM solution for your flow.
  • Understanding how the DDM solution integrates and interacts with the design tools and flow.
  • Issues and enhancement requests you may have.

Cost is always a consideration in any buying decision, even if it is not the most important one. When calculating the cost of the DDM solution, try to estimate the total cost of ownership. Consider the expense of all the factors that contribute to the long term deployment and use of the solution:

  • Software Licensing: Make sure to include costs for all features, servers, etc., that you will need. When calculating your cost, make realistic projections for your future growth. Negotiate a fair price with the vendor and lock in rates for your future purchases.
  • Hardware: Include the costs of any special hardware and any other networking, storage or other IT infrastructure upgrades needed.
  • Administration: System administration and CAD engineering resources are expensive. The admin resources needed to deploy and maintain the DDM solution may become a significant part of the cost. Investigate the effort required to upgrade to new releases of the DDM solution.
  • Training: All engineers must be trained to use the features and to follow your methodology. Ease of use has a direct impact on training costs. A pilot project will give you a good feeling for the level of training that will be needed. If you think that formal training would be needed, then investigate the cost and forms of training the vendor provides.

Once you sort through all of your requirements, there are several DDM solutions you can choose from. As with anything else, no one product is best suited for everyone. To avoid the risk of getting emotionally attached to a 'cool' feature, you need to carefully define your requirements before starting your search. Your evaluation should be designed to closely resemble real usage scenarios.

When you deploy a DDM system, it becomes an integral part of the design flow. You are not just adopting a point tool; it is a complete solution that is crucial to the success of your project. Make sure to evaluate not just the product, but also the support and response from the DDM vendor.

About the Author:

Scott Woods is the Director of Design Automation Development at Integrated Device Technology, Atlanta.  Scott leads the Atlanta CAD team developing and supporting custom design flows including PDK development for both internal and foundry processes.
