PROSTEP | Newsletter

Slicing the PLM elephants

By Bernd Pätzold

Many companies are facing the challenge of modernizing their now aging PLM landscapes. This is especially true of the PLM pioneers among them, who rolled out the first management systems for their mechanical product data more than 20 years ago. In many cases, these systems are no longer subject to further development or they no longer meet the new requirements for developing cyber-physical systems, which incorporate a high proportion of electronics and software. Their software architectures make it difficult to quickly integrate the new tools and features needed to respond agilely to new market and customer requirements.

Modernizing existing PLM system landscapes poses a challenge in many respects. First of all, companies need to be clear about which PLM capabilities they will actually need in the future to ensure that they are able to operate successfully on the market. The answer to this question depends to a great extent on their business strategy and the business processes and models that need to be supported in the future. For example, what role does software play in their smart, connected products? How important is the Internet of Things (IoT) and the use of operational data for the development of new services or new product-as-a-service offerings?

A new PLM landscape cannot be created in a vacuum; it has to be designed in the context of a broader enterprise architecture (EA). The EA describes the interaction between business processes, information flows and IT systems. In order to analyze this interaction better, our PLM strategy consulting experts have developed a best practices approach called information flow analysis (IFA). It makes it possible to define a PLM architecture that is designed to meet business requirements and provides an answer to the question of how to integrate IT applications to provide the best possible support for the business processes. You will find a few interesting examples from the shipbuilding industry in this newsletter.

Defining a future PLM architecture is one thing, implementing it is another. It is inconceivable for major companies in the automotive and supplier industries to replace their existing PLM landscapes, which incorporate a large number of monolithic IT systems, in one fell swoop – if only because of the risk involved. Instead, they need to gradually meld their brownfield legacy IT with greenfield IT. This requires a roadmap with clear priorities for restructuring their PLM landscape.

The challenge when it comes to restructuring is slicing the PLM elephants as efficiently as possible and with as little disruption to ongoing operations as possible. Every OEM has a slightly different “cutting pattern” in mind, depending on what their current PLM landscape looks like or what they want their future architecture to look like.

Some manufacturers continue to rely more heavily on commercial solutions, while others want to take greater control of both the development of vehicle software and the development of supporting IT systems themselves.

They all have certain basic principles in common when it comes to building future PLM architectures. These include the principle of modularizing or federating applications with the help of largely independent microservices, which not only requires open and integrable IT systems but also new data integration concepts. Instead of replicating data, data is linked intelligently so that it can be used across systems and domains. Semantic Web technology in combination with standardized ontologies, i.e. a harmonized and machine-readable terminology, is therefore a key element of a sustainable PLM architecture.
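The linking idea described above can be illustrated with a minimal sketch. All system names, URIs and link types below are hypothetical: instead of copying a part record from, say, a mechanical CAD system into an ALM tool, each system publishes its records under stable identifiers, and a thin linking layer stores typed links between them – the triple pattern that underlies Semantic Web-based data federation.

```python
# Sketch of cross-system data linking (all identifiers are made up):
# each domain keeps its own master data; only typed links are shared.
from dataclasses import dataclass


@dataclass(frozen=True)
class Triple:
    subject: str    # identifier of the source record
    predicate: str  # link type from a shared ontology
    obj: str        # identifier of the target record


# Links between records that stay in their respective systems.
links = [
    Triple("urn:mcad:part/4711", "ont:implementedBy", "urn:alm:swmodule/98"),
    Triple("urn:mcad:part/4711", "ont:describedBy", "urn:dms:doc/123"),
]


def related(subject: str, predicate: str) -> list[str]:
    """Resolve cross-system links without replicating the records."""
    return [t.obj for t in links
            if t.subject == subject and t.predicate == predicate]


print(related("urn:mcad:part/4711", "ont:implementedBy"))
# prints ['urn:alm:swmodule/98']
```

In a real landscape the links would be expressed in RDF against a standardized ontology and resolved by a linked-data service rather than an in-memory list, but the principle – link, don't replicate – is the same.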

An innovative approach to modularizing monolithic PLM landscapes, which our PLM experts are currently implementing at the carmaker with the star, is domain-driven design. The aim of this approach is to provide the specialist departments or specific user groups within the specialist departments with lean applications that have a task-specific functional scope, their own data model and their own interface components and which exchange data and share services with neighboring domains via clearly defined interfaces.

It is intended that the functional scope required for the respective domain be extracted from the legacy systems, packaged as containers and orchestrated together with the data management components in order to then move it to the cloud. The fact that the applications function largely independently means that they can be adapted to new requirements on an almost daily basis – at least that is the idea.
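The shape of such a domain can be sketched in a few lines. The example below is purely illustrative – the "release management" domain, its data model and its method names are assumptions, not PROSTEP's actual design – but it shows the pattern: a lean service with a task-specific functional scope, its own data model, and a narrow, explicitly defined interface toward neighboring domains.

```python
# Hypothetical sketch of one domain carved out of a PLM monolith
# in the spirit of domain-driven design.
from dataclasses import dataclass


@dataclass
class ReleaseCandidate:
    """The domain's own data model - not shared with other domains."""
    part_number: str
    revision: str
    approved: bool = False


class ReleaseService:
    """Task-specific functional scope of the release management domain."""

    def __init__(self) -> None:
        self._candidates: dict[str, ReleaseCandidate] = {}

    def submit(self, part_number: str, revision: str) -> None:
        self._candidates[part_number] = ReleaseCandidate(part_number, revision)

    def approve(self, part_number: str) -> bool:
        rc = self._candidates.get(part_number)
        if rc is None:
            return False
        rc.approved = True
        return True

    def released_revisions(self) -> dict[str, str]:
        """The clearly defined interface exposed to neighboring domains."""
        return {c.part_number: c.revision
                for c in self._candidates.values() if c.approved}


svc = ReleaseService()
svc.submit("4711", "B")
svc.approve("4711")
print(svc.released_revisions())  # prints {'4711': 'B'}
```

Packaged as a container with its own data management and deployed to the cloud, a service like this can evolve independently of its neighbors, which is precisely what makes the near-daily adaptation mentioned above plausible.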

The cloud, or the use of software from the cloud, is an integral part of the future PLM strategy for other companies as well. It seems that the pandemic has dispelled the last remaining doubts in terms of security and reliability. PROSTEP is well prepared for this paradigm shift, but I am curious to see how this trend will affect the PLM market as a whole. Regardless of the impending recession, many PLM vendors are already complaining about a lack of capacity for handling upcoming projects.

Best regards
Bernd Pätzold

© PROSTEP AG | ALL RIGHTS RESERVED | IMPRINT | PRIVACY STATEMENT | YOU CAN UNSUBSCRIBE FROM THE NEWSLETTER HERE.