What’s in a product lifecycle management (PLM) system? Some authoring tools, such as computer-aided design (CAD); large dollops of simulation and visualization; lots of manufacturing data systems (e.g., computer-aided process planning (CAPP) and configuration management); heavy-duty infrastructure (database management systems (DBMS) and data communications); and plenty of behind-the-scenes utilities, such as web-based user interfaces and application programming interfaces (APIs). No one-size-fits-all exists in terms of PLM components, data requirements, or implementation, but here’s a brief description of the essential components of an effective PLM system.
CIMdata Inc. (Ann Arbor, MI) uses the label “information authoring tools” for applications ranging from mechanical and electronic CAD, to computer-aided software engineering (CASE), to technical publishing (such as the office suites running on your computer). Computer-aided manufacturing (CAM) is another authoring tool. By defining and planning manufacturing sequences up front, including geometries, machining parameters, and resources, CAM systems can generate, postprocess, and document the NC programs for cutting tools, as well as validate these NC programs before actual production.
Bill of Materials Processor
A centralized BOM system lays out product structures and provides a unified view of all product designs and part information. (Parts include standard parts, purchased parts, proprietary parts, and versions of existing parts.) Because of PLM’s enterprise-wide scope and long lifecycle focus, the BOM processor should be capable of ad-hoc querying, multi-database querying, BOM comparisons, and both the simulation and analysis of BOM versions.
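To make the idea concrete, here is a minimal sketch of a nested product structure and a version comparison between two BOM revisions. The part numbers and the two-level structure are illustrative; a real BOM processor would also track effectivity dates, revisions, substitutes, and reference designators.

```python
# Each BOM entry: part number -> (quantity per parent, sub-assembly BOM)

def flatten(bom, qty=1):
    """Walk a nested BOM and return {part: total quantity required}."""
    totals = {}
    for part, (count, children) in bom.items():
        totals[part] = totals.get(part, 0) + qty * count
        for p, n in flatten(children, qty * count).items():
            totals[p] = totals.get(p, 0) + n
    return totals

def compare(bom_a, bom_b):
    """Report parts added, removed, or changed in quantity between versions."""
    a, b = flatten(bom_a), flatten(bom_b)
    return {
        "added":   {p: b[p] for p in b.keys() - a.keys()},
        "removed": {p: a[p] for p in a.keys() - b.keys()},
        "changed": {p: (a[p], b[p]) for p in a.keys() & b.keys() if a[p] != b[p]},
    }

rev_a = {"ASSY-100": (1, {"BOLT-M6": (4, {}), "BRACKET-7": (2, {})})}
rev_b = {"ASSY-100": (1, {"BOLT-M6": (6, {}), "PLATE-3": (1, {})})}

print(compare(rev_a, rev_b))
```

The flattened view is what supports the ad-hoc queries mentioned above (“how many M6 bolts does this product consume in total?”) regardless of where a part sits in the assembly hierarchy.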
Configuration Management

Configuration management primarily tracks changes, identifies revisions, and controls effectivity. It involves identifying the functional and physical characteristics of design, process, and informational objects; controlling those objects; tracking and detailing changes; providing auditing procedures; and offering both data and metadata search capabilities. As noted by SAP, configuration management should control products during lifecycle phases such as “as-designed,” “as-built,” and “as-maintained.”
Much of configuration management involves the change control aspects related to document management: data vaulting, storage, and security. Vaulting, for example, ensures data integrity by managing the check-in/check-out of documents—and related documentation—from electronic storage. (Check-in/check-out itself includes the administrative function of tracking who used which data, and when.) Such functionality needs to apply to the entire document structure of a product, from full assembly to the associated document hierarchies of individual drawings.
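The check-in/check-out mechanism can be sketched as a simple lock-plus-audit-trail scheme. The document names and users here are hypothetical, and a production vault would persist the audit trail and enforce role-based permissions on top of this.

```python
from datetime import datetime, timezone

class Vault:
    """Minimal sketch of vault check-in/check-out with an audit trail."""

    def __init__(self):
        self.locks = {}   # document id -> user currently holding it
        self.audit = []   # (who, what, action, when)

    def check_out(self, doc, user):
        if doc in self.locks:
            raise PermissionError(f"{doc} is checked out by {self.locks[doc]}")
        self.locks[doc] = user
        self.audit.append((user, doc, "check-out", datetime.now(timezone.utc)))

    def check_in(self, doc, user):
        if self.locks.get(doc) != user:
            raise PermissionError(f"{user} does not hold the lock on {doc}")
        del self.locks[doc]
        self.audit.append((user, doc, "check-in", datetime.now(timezone.utc)))

vault = Vault()
vault.check_out("DRAWING-042.dwg", "alice")
# A second user is refused until alice checks the drawing back in:
try:
    vault.check_out("DRAWING-042.dwg", "bob")
except PermissionError as e:
    print(e)
vault.check_in("DRAWING-042.dwg", "alice")
```

The audit list is the “who, what, when” record described above; applying the same lock to a whole assembly and its child drawings extends this to the full document structure of a product.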
The Data Model
This is the crux of PLM. The data model shows and manages the inter-relationships between products, processes, and resources. In a report last year, AMR Research, Inc. (Boston, MA) pointed out that linking PLM modules and existing systems requires more than just bill of materials (BOM) data. It also requires “an object model that [PLM and the other enterprise systems] can agree on.” Linking these together, continues AMR, “is a more sophisticated version of the age-old argument between PLM and enterprise resource planning (ERP) of ‘Who owns the BOM?’” Why is this so important? Answers AMR, “explicitly modeling the product line in PLM—as the product manager views it—captures ambiguous information and defines the interface between PLM and other systems.”
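The kind of shared object model AMR describes—products linked to the processes that make them and the resources those processes consume—can be sketched as below. The class names, fields, and example objects are illustrative, not any vendor’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str  # e.g., a machine tool or a work cell

@dataclass
class Process:
    name: str
    resources: list = field(default_factory=list)

@dataclass
class Product:
    part_number: str
    processes: list = field(default_factory=list)

mill = Resource("5-axis mill")
machining = Process("finish machining", resources=[mill])
bracket = Product("BRACKET-7", processes=[machining])

# With one agreed-upon model, PLM and ERP can answer the same question
# the same way, e.g., "which resources does this product depend on?"
deps = {r.name for p in bracket.processes for r in p.resources}
print(deps)  # {'5-axis mill'}
```

The point of such a model is exactly the one AMR makes: once both PLM and ERP traverse the same product–process–resource links, the “who owns the BOM?” argument becomes a question of interface definition rather than data duplication.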
PLM is basically the mother of all enterprise databases. In practice, it can be thought of as a database plus a variety of software tools to collect, disseminate, present, and otherwise manage the data and metadata contained in the PLM system.
The DBMS must be relational and object-oriented enough to capture and manage the vast variety of data types, properties, behaviors, and relationships of data that exist in an enterprise. These include not only the obvious initial documentation—BOMs and material specifications, CAD drawings, numerical control (NC) programs, work instructions/process plans—but also the data that comes from downstream processes, such as change notices, quality reports, audit files, office documents—anything that can be put into electronic format. Not surprisingly, such a DBMS must also feature sophisticated change control, effectivity management, database security, data synchronization, and database administrator-specific tools.
No PLM system is an island of information unto itself. To ensure data interoperability between the PLM system and the rest of the enterprise, enterprise application integration (EAI) technologies within the PLM system must support the broad range of “open” “standards” defined for hardware, software, and data interoperability. While the list of standards is enough to make anyone’s eyes glaze over—and this includes both de facto industry standards as well as the proprietary APIs from software vendors—suffice it to say that EAI technology must include out-of-the-box integration and the tools and data exchange conventions to create secure integrations when they are needed.
At the very least, as AMR points out, PLM integration must include the semantics to synchronize structured, semi-structured, and unstructured information across applications; the mapping between high-level processes and individual applications; and the ability to present this information through some user interface or portal.
Computer-Aided Process Planning

CAPP helps optimize and validate manufacturing operations, rooting out inefficiencies in production sequencing and production equipment. CAPP feeds into factory modeling and simulation, and ultimately into the selection of capital equipment.
Incorporated within CAPP is group technology for classifying, searching, and managing the attributes of parts, processes, and tooling. Additional CAPP tools might be necessary to address industry-specific tasks. For example, automotive body-in-white assembly planning requires specific functionality, such as matching weld points to operation/station assignment. CAPP search capabilities, to pick one function, are not just the province of design or manufacturing: certain product classifications, for example, are relevant to purchasing, as are the software tools that view designs (in 2D and 3D) and disclose characteristics (size, material, manufacturing process).
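Group-technology search boils down to querying parts by their classification attributes, so that any function—design, manufacturing, or purchasing—can retrieve families of similar parts. The part records and attribute names below are illustrative.

```python
# Hypothetical part records carrying group-technology classification attributes.
parts = [
    {"id": "BRACKET-7", "material": "aluminum", "process": "machining", "size_mm": 120},
    {"id": "PLATE-3",   "material": "steel",    "process": "stamping",  "size_mm": 300},
    {"id": "BRACKET-9", "material": "aluminum", "process": "machining", "size_mm": 140},
]

def search(parts, **criteria):
    """Return the parts whose attributes match every keyword criterion."""
    return [p for p in parts if all(p.get(k) == v for k, v in criteria.items())]

# e.g., purchasing looks for all machined aluminum parts:
print([p["id"] for p in search(parts, material="aluminum", process="machining")])
```

Finding an existing part family this way is what lets an engineer reuse a proven design—or a buyer consolidate suppliers—instead of creating a near-duplicate part.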
Program Management

Program management might seem peripheral to PLM, but it has everything to do with product lifecycle and management itself: program and project management functions within PLM establish a work breakdown structure (a hierarchy of tasks and sub-tasks) for completing a program or project. This is not workflow; this functionality involves critical path analysis, costing and budget management, progress tracking, human resources, and a host of fundamental business processes.
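Critical path analysis over a work breakdown structure is, at its core, a longest-path computation through the task dependency graph. The task names and durations below are illustrative.

```python
# Hypothetical work breakdown structure: task -> (duration in days, prerequisites)
tasks = {
    "design":    (10, []),
    "tooling":   (15, ["design"]),
    "prototype": (5,  ["design"]),
    "pilot run": (7,  ["tooling", "prototype"]),
}

def earliest_finish(task, memo={}):
    """Longest path (in days) from project start through this task."""
    if task not in memo:
        dur, prereqs = tasks[task]
        memo[task] = dur + max((earliest_finish(p) for p in prereqs), default=0)
    return memo[task]

# The project duration is the latest finish among all tasks; here the
# critical path runs design -> tooling -> pilot run.
print(max(earliest_finish(t) for t in tasks))  # 32
```

Tasks off the critical path (here, the 5-day prototype) carry slack, which is what a program manager trades against when budgets or staffing shift.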
Simulation

PLM-based simulations let users dynamically analyze all the part and process data contained in the PLM system. Simulation lets designers and engineers see products in action, and how they’re produced and assembled. PLM users can access the appropriate data to try out different designs and production alternatives to optimize product designs (what the customer is buying) and production processes (how the enterprise is making what the customer is buying). Simulation systems can focus on piece parts, finished products, specific production operations (such as stamping operations or tool management), or full-factory modeling (including the factory layout and the interactions of material and part movements, production equipment and assembly operations, and people).
Getting data in and out of PLM is basic. Doing that easily is mandatory, along with viewing and modifying information, data mining, ad-hoc querying, and other data manipulations. Plus, there’s authoring new products, processes, and the like from that data. PLM’s user interface needs to support all of these tasks. The user interface must also support collaboration by providing the functionality required to share comments among users, maintain discussion history, and conduct conferences (from webcam and WebEx meetings to document display and redlining, whatever the document, to shared whiteboards).
Visualization tools let users anywhere in product development, manufacturing, and the supply chain see and modify product and process designs without having the authoring tools that created those designs. Visualization utilities include viewers that can display the vast variety of design files, from basic PDF displays to document displays to photo renderings to dynamic simulations. Along with that should be, as expressed in SmarTeam literature, “multiple user redlining options, enhanced printing and manipulation tools, including sectioning, mass properties, measurements, bird’s eye, and more.”
To support geographically distributed project and supplier teams, the PLM infrastructure must be able to streamline communications between all the participants, regardless of geographic location or time zone. This infrastructure should also be accessible to all participants, even “mom-and-pop” manufacturers, preferably at little or no additional cost.
These days, the Web and Web-based applications provide the data communications infrastructure and user interface for easy and secure data gathering and sharing. Collaboration involves a variety of other networking technologies, points out CIMdata: audio conferencing, teleconferencing, synchronous visualization tools, data translators, and “system administration tools to control access and manage collaborative data and relationships.”
Workflow

Workflow, according to CIMdata, is the technology that gets people interacting with information. Workflow automatically routes work from one stage to the next, initiates actions, tracks project status, expedites engineering changes, moves financial decisions along, and provides relevant data to those who need it. As MatrixOne points out, PLM must be able to “execute workflows that simplify and speed response to what is becoming a build-to-order marketplace.”
The workflow engine should be capable of guiding users through the process of creating and modifying workflows, including defining workflow participants, business objects to be distributed, trigger events, roles, and decision trees. Ideally, the workflow functionality within PLM should include an “enterprise modeler” for defining, documenting, and modifying business processes both within an enterprise and within its supply chain.
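The routing behavior described above can be sketched as a simple stage machine: each stage names the role responsible for it, and a trigger fires on every transition. The stages, roles, and the engineering-change example are hypothetical.

```python
# Hypothetical engineering change order workflow: (stage, responsible role)
STAGES = [
    ("submitted",  "originator"),
    ("review",     "engineering"),
    ("approval",   "change board"),
    ("released",   "configuration management"),
]

def advance(state, log):
    """Move the work item to the next stage and record the routing action."""
    i = [s for s, _ in STAGES].index(state)
    if i + 1 >= len(STAGES):
        return state  # already at the final stage
    nxt, role = STAGES[i + 1]
    log.append(f"routed to {role} for {nxt}")  # the trigger: notify the next role
    return nxt

log = []
state = "submitted"
while state != "released":
    state = advance(state, log)
print(log)
```

A real workflow engine would layer decision trees (approve/reject branches), parallel routes, and escalation timers onto this skeleton; the enterprise modeler mentioned above is essentially a graphical editor for such stage definitions.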
**PLM Vendors Assisting With This Article**

| Vendor | Web site |
| --- | --- |
| Arena Solutions, Inc. (Mountain View, CA) | www.arenasolutions.com |
| Dassault Systemes (Montreal, Quebec, Canada) | plm.3ds.com/en/about_plm/enovia.asp |
| EDS PLM Solutions (Plano, TX) | eds.com/plm |
| Framework Technologies Corp. (Burlington, MA) | www.frametech.com |
| IBM Corp. (Armonk, NY) | www.ibm.com |
| MatrixOne (Westford, MA) | www.matrixone.com |
| PTC (Needham, MA) | www.ptc.com |
| SAP America, Inc. (Newtown Square, PA) | www.sap.com |
| SmarTeam Americas Inc. (Beverly, MA), a division of Dassault Systemes | www.smarteam.com |