The digital revolution is transforming many aspects of both business and society, from housing and mobility through to healthcare. For manufacturing companies, this revolution presents a threefold challenge. Firstly, they have to rethink their development processes and methods to take account of the new requirements associated with the interdisciplinary development of smart products and systems — the watchword here is systems engineering. Secondly, they have to get their production into shape for Industry 4.0 and autonomous manufacturing so that they can produce custom products efficiently, even in "batch size 1". And thirdly, they have to improve the integration of their service processes with engineering and production in order to support the product lifecycle into the operational phase and present their customers with new service offerings based on the Internet of Things (IoT).
PLM has been around for a while, and amid all the hype surrounding IoT and Industry 4.0 it is becoming somewhat neglected. Yet without a robust foundation in PLM, with clean product data, the digital transformation of manufacturing and service processes is built on sand. This fact was recently underlined yet again in a large number of presentations at the PI PLMx event in Hamburg.
When I speak of PLM in this context, I am not referring to a specific system solution. Rather, I am thinking of the underlying concept of product lifecycle management. The IoT has not made PLM obsolete. On the contrary, the networking of smart products and systems via the Internet is precisely what will make it possible to use digital twins to support the operational phase and thus to extend the PLM philosophy right through to the end of life of the product. Or to put it another way, it will be possible to treat the product lifecycle as a closed loop and to reuse the information from the operational phase for development of the next generation of products.
We are not talking about simply finding new names for PLM, but rather about redefining PLM. Support for aspects such as systems engineering, Industry 4.0 and IoT requires a PLM architecture that is different from what we are familiar with.
Monolithic systems are too cumbersome to provide an agile response to the new requirements associated with an extended product life. This is clearly stated in the Future PLM theses formulated by PROSTEP in collaboration with other PLM experts from industry and the research community under the auspices of the prostep ivip Association. Instead, we need modular PLM architectures with federated systems and intelligent, networked information. The prerequisite for this is openness, as demanded by the Code of PLM Openness (CPO).
The Future PLM theses draw on the experience gained from more than 20 years of working with PLM in practice, and they are undoubtedly of practical value for companies. The carmaker Audi, for example, took up the theses and used them as guidance in redesigning its PLM landscape and PLM processes during the transition to systems engineering. In an interview, Thomas Kriegel, Head of Process and Methods Development for Systems Engineering at Audi, explains some of the action plans that the company has derived from the theses and underscores the significance of openness for shaping the PLM landscape.
Audi is also giving thought to new concepts for linking the data in the various systems. But simply linking data is not always enough. Before digital data can be used and reused efficiently across the entire lifecycle, it often first needs to be conditioned and any gaps filled. The quality of the information models is the key to digitalizing information flows and business processes. Not all of this information will be managed in traditional PLM systems in future, but those systems play a key role in qualifying the original data.
Incidentally, this also applies to the qualification of (mechanical) CAD models, an aspect that has long been somewhat neglected as a result of the growing proportion of electronics and software in our smart products. It is new issues in particular, such as additive manufacturing and the digital thread, that are now prompting companies to pay more attention to how they structure their digital product models so that these can be optimized and used throughout the process chain. Good old design methodology is therefore still alive and well, as we can see in the newsletter article by Hartwig Dümler.
I hope you enjoy your read.