In the beginning, "you had a lump of model," explains Paul Brown, director of NX marketing for Siemens PLM Software (Plano, TX; www.siemens.com/plm). Solids modelers based on boundary representation (b-rep) were all about topology. Then came parametrics, adding intelligence to the model through dimensions that captured design intent by driving geometry. Then came feature-based, history-based modeling: "As you build the model, the 'recipe,' you built up the relationships," continues Brown. This recipe and all its ingredients (the model, its features, and the relationships between those features) are all dependent upon each other.
While feature-based parametric modeling isn't about to go away anytime soon, assures Brown, it does have limitations. First, changing the model, especially a complex model with highly interdependent features, can be time-consuming. Every change to a model requires recalculating its feature tree. Conversely, he says, "explicit modeling provides direct interaction with geometry and provides flexible editing. But it lacks control and the ability to establish design rules to manage change."
Second, "once a model is built, it cannot be changed in any ad-hoc or freeform way," writes the Aberdeen Group (Boston, MA; www.aberdeen.com) in a market alert. "It must be changed within the constraints of the original feature definitions or rebuilt. Overall, this limits the freedom the user has to make significant changes." Adds Brown, "To get reliable edits, I have to preplan how I build my model. I'm almost in the crystal-ball stage. I have to ask whether somebody will later want to change the model in a particular way."
Last, again Aberdeen: "There is no uniform formatting convention for CAD files across different applications, or even across platform versions." Some parametric relationships are garbled in translation from one 3D CAD system to another; others are lost entirely.
Now embedded in the latest versions of both NX and Solid Edge, synchronous technology (ST) from Siemens PLM tries to overcome these problems. Introduced in April 2008, ST combines the best of constraint-driven modeling with the best of direct (explicit) modeling. It removes the linear dependency of features found in parametric modeling. It captures modeling knowledge on the fly, so that CAD users can make ad hoc changes to their models.
ST analyzes geometry and topology, and then makes logical assumptions about the points, surfaces, and other features that comprise the solid model. (The designer can override these assumptions.) This dynamic analysis releases the designer from choosing between constraint-driven or history-free modeling. Says CAD industry analyst Evan Yares (www.evanyares.com), "Siemens' synchronous solver overcomes the order dependencies that have plagued history-based CAD programs by solving for the explicit and inferred constraints at the same time. The synchronous solver doesn't use a history tree, but rather holds user-defined constraints in groups associated with the surfaces to which they apply."
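The idea Yares describes, constraints held in groups with the faces they govern and solved together, rather than replayed from an ordered history, can be sketched as a toy in Python. This is an illustrative simplification, not Siemens' actual solver; the face, constraint, and model classes are all invented for the example:

```python
# Toy sketch of "synchronous" editing: constraints live with the geometry
# they govern (no history tree), and a direct edit triggers simultaneous
# constraint solving. Illustrative only -- not Siemens' implementation.
from dataclasses import dataclass, field

@dataclass
class Face:
    name: str
    offset: float  # position of a planar face along its normal, reduced to 1D

@dataclass
class SymmetricPair:
    """A user-defined or inferred constraint: two faces mirror each other about 0."""
    a: Face
    b: Face
    def enforce(self):
        self.b.offset = -self.a.offset

@dataclass
class Model:
    faces: dict = field(default_factory=dict)
    constraints: list = field(default_factory=list)  # grouped with faces, no recipe

    def move_face(self, name, new_offset):
        # Direct edit: push the geometry where the user dragged it...
        self.faces[name].offset = new_offset
        # ...then solve the applicable constraints together, instead of
        # recomputing an ordered feature history from the top.
        for c in self.constraints:
            c.enforce()

m = Model()
m.faces["left"] = Face("left", -5.0)
m.faces["right"] = Face("right", 5.0)
m.constraints.append(SymmetricPair(m.faces["left"], m.faces["right"]))

m.move_face("left", -8.0)
print(m.faces["right"].offset)  # the mirrored face follows the direct edit: 8.0
```

The point of the sketch is the data structure: the constraint is attached to the faces, so the edit order never matters.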
Other freeform and hybrid 3D CAD systems have elements of ST. However, says Brown, ST ties the geometry, the rules, and the synchronous solving together and embeds all three into the CAD system.
According to Siemens PLM, designing can be up to 100 times faster with ST, especially with large, complex models. This speedup comes from defining optionally persistent dimensions, parameters, and design rules during model creation or editing, without the overhead of an ordered history. ST also eases the reuse of data from multiple CAD systems: it automatically infers the function of various design elements in imported geometry without needing feature or constraint definitions, or remodeling. Last, new inference technology automatically applies common constraints and executes typical commands based on cursor position, making ST-based CAD easy to learn and use for occasional users.
More than just data translation
CAD vendors have been increasingly attentive to the problems in translating solids modeling data between 3D CAD systems. The result is data translators from two sources: CAD vendors and third parties (CAD vendor-neutral). Naturally, CAD vendors directly compete with other CAD vendors, and "it takes two to tango," comments Ken Tashiro, vice president and COO of Elysium, Inc. (Southfield, MI; www.elysiuminc.com). Consequently, the translators don't play nice. Continues Tashiro, "All too often behind the scenes the hand-off has been 'hands-on,' involving hours of manual, error-prone resurfacing and repairing of models."
On the other hand, third-party (vendor-neutral) translators can access code that the CAD vendors can't get from their competitors. "If we're lacking some sort of functionality, then we'll negotiate with the CAD OEM to enhance its application programming interfaces," says Tashiro.
These days, data translators have morphed into more than just translation tools. They translate CAD data. They verify CAD data. They repair CAD data (if defects exist in the translated solid model).
CAD data translating is not easy, even with STEP and IGES as the intermediate data format. "The standards are very well documented; however, it's the way you implement them," says Tashiro. "Every CAD system has a different kernel, a different architecture for constructing models, and different mathematical and topological assumptions. There are nuances in each CAD system. The more differences you encounter and solve, the better your translator is, but it'll never be perfect because of the continuing evolution of both CAD systems."
Another problem is precision. CAD programs, and even functions within a single CAD system, have different tolerances (e.g., Catia v4: 0.1 mm; Catia v5: variable; Unigraphics: 0.0254 mm). "Each system measures its own internal geometry using algorithms consistent with its core mathematical conventions. These conventions can hide issues from the user, like curves that really aren't as straight as the user thinks," says Tashiro.
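The effect of those differing tolerances can be shown with a small sketch. The tolerance values below echo the ones cited above; the point-coincidence test is a deliberate simplification of what real kernels do:

```python
# Sketch: the same two curve endpoints are "coincident" in one CAD system's
# tolerance and a gap in another's. Tolerance values echo those cited in
# the article; the test itself is a simplification.
import math

def coincident(p, q, tol):
    """Treat two points as the same vertex if they lie within the system tolerance."""
    return math.dist(p, q) <= tol

# Two endpoints that a translator writes out 0.05 mm apart:
p = (0.0, 0.0, 0.0)
q = (0.05, 0.0, 0.0)

print(coincident(p, q, tol=0.1))     # True  -- merged at a 0.1 mm (Catia v4-like) tolerance
print(coincident(p, q, tol=0.0254))  # False -- a gap at a 0.0254 mm (Unigraphics-like) tolerance
```

A model that is watertight at the looser tolerance can arrive full of gaps at the tighter one, which is exactly the class of defect the repair step exists to fix.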
CAD data verification checks the quality of the translated output and certifies that it matches the source CAD data. Automotive OEMs and Tier 1 suppliers are increasingly requiring suppliers to provide verification reports that compare the original surfaces to the translated surfaces down to several decimal places, and list any of the variances. The goal is to ensure the accuracy of the solid model before the translated data become someone else's problem downstream from the CAD authoring system (and CAD vendor).
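The verification step described above amounts to comparing the source and translated geometry point by point and reporting the worst deviation. A minimal sketch, with flat point lists standing in for the trimmed surfaces a real verifier would sample, and an invented report format:

```python
# Sketch of CAD data verification: sample matching points on the source and
# translated surfaces, report the largest deviation, and certify against a
# tolerance. Point lists stand in for sampled NURBS surfaces; the report
# fields are invented for the example.
import math

def deviation_report(source_pts, translated_pts, tol=1e-4):
    worst = max(math.dist(a, b) for a, b in zip(source_pts, translated_pts))
    return {
        "max_deviation_mm": round(worst, 6),  # reported to several decimal places
        "within_tolerance": worst <= tol,
    }

src = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
out = [(0.0, 0.0, 0.0), (1.0, 0.00002, 0.0), (1.0, 1.0, 0.00005)]

report = deviation_report(src, out)
print(report)
```

The report, not the model itself, is what the OEM asks the supplier to hand over downstream.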
And then there's CAD data repair. This is the next step after certifying the quality of a solid model. Sometimes CAD data is "healed" automatically. Sometimes not.
The Elysium CADdoctor EX3.0 performs all three of these tasks. CADdoctor is a standalone direct translator for converting and repairing CAD geometry exchanged in product development, from scanned geometry for modeling to surface preparation for molding and manufacturing parts. It supports data exchange between the major 3D CAD systems, and it operates on Microsoft Windows 2000 and Windows XP.
In addition to data translation, CADdoctor checks geometry using the product data quality standards based on specifications from the Japan Automobile Manufacturers Association and, in Europe, Verband der Automobilindustrie; checks and repairs polygonal data in STL files; simplifies geometry for finite element analysis (or for hiding features to protect intellectual property) by turning an assembly into a 3D solid; and compares source geometry to geometry after translation, repair, and other processing. The tool is capable of resolving geometry problems such as self-intersecting surfaces and curves, gaps between trim curves and base surfaces, sliver faces, and short curves. It can also check models for draft angle, undercut, wall thickness, and other manufacturability issues for mold manufacturing. CADdoctor automatically repairs common geometry defects falling within a specified tolerance.
For uncommon defects or as required, CADdoctor flags the defect and prompts the user through a manual repair process by displaying only those software functions that can repair the defect. The manual intervention is necessary, explains Tashiro. "It can be dangerous if you don't know what you're doing or don't know the original intent. In the end, there's always human intervention to get a perfect model."
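The automatic-versus-manual split Tashiro describes can be sketched as a simple triage loop: defects smaller than a user-set tolerance are healed automatically, and anything larger is flagged for a person who knows the design intent. The defect names and tolerance value below are illustrative, not CADdoctor's actual API:

```python
# Sketch of the repair triage described above: heal defects within a
# specified tolerance automatically, flag the rest for manual repair.
# Defect kinds and the tolerance are illustrative, not CADdoctor's API.
AUTO_REPAIR_TOL = 0.01  # mm, a hypothetical user-set repair tolerance

defects = [
    {"kind": "gap", "size": 0.004},              # gap between trim curve and base surface
    {"kind": "sliver_face", "size": 0.008},
    {"kind": "self_intersection", "size": 0.3},  # too large to heal blindly
]

healed, flagged = [], []
for d in defects:
    (healed if d["size"] <= AUTO_REPAIR_TOL else flagged).append(d)

print(f"auto-repaired: {len(healed)}, needs manual review: {len(flagged)}")
```

Keeping the large defects out of the automatic path is the design choice Tashiro defends: blind healing can silently destroy the original design intent.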