One of the most fascinating books published in the last couple of years is “Homo Deus: A Brief History of Tomorrow” by Yuval Noah Harari. The book offers a view of the future of mankind: new technologies, such as artificial intelligence and genetic engineering, will impact science, religion, and politics, with intelligent design ultimately replacing natural selection as the principal force of evolution.
One of the most captivating theories in the book concerns science and, specifically, the future of healthcare. In a world where “more people die from obesity than from starvation; more people die from old age than from infectious diseases”, the end goal of healthcare shifts from preventing and curing diseases to optimizing human performance: creating a “super-human” by leveraging the latest technologies and algorithms that can be ever more easily written and self-optimized. “Nanotechnology experts are developing a bionic immune system composed of millions of nano-robots, who would inhabit our bodies, open blocked blood vessels, fight viruses and bacteria, eliminate cancerous cells, and even reverse aging processes”. In short, healthcare moves from preventing and curing diseases to “avoiding diseases by design” and “optimizing human performance beyond averages”.
WHAT IS NON-DESTRUCTIVE TESTING
What works in healthcare works in the industrial world. Leveraging technologies and processes similar to those used on humans in hospitals, inspections are applied to critical industrial assets: X-rays, ultrasound, and visual inspections, among others. They look not for “diseases” but for “defects”, trying to detect them as early as possible and thereby mitigating or delaying the risk of critical failures. The beauty of it is the capability of inspecting without “touching” or “damaging” the samples: it is called “non-destructive testing”.
FROM COST CENTER TO ROI
You might say that is what Quality departments in an industrial setting usually deal with. That is correct, but it is quite interesting to see where Quality 4.0 is taking this discipline, and the parallels with Prof. Harari’s theories.
Quality has changed in recent years, keeping pace with technological innovation. While quality has historically been seen as a “necessary evil” cost center, it is now becoming a business value driver with high ROI. Thanks to Industry 4.0, it is turning into “hyperconnected quality with real-time data and IoT”: generating incremental value at each step of the value chain through new workflows that connect the real and digital worlds.
DIGITAL TWIN AND QUALITY 4.0
Everything starts with data: “good data”. What you need is big data and/or high-quality data. That data is produced by “good sensors”: high-quality, digitally enabled sensors capable of capturing and replicating the physical world. The data then needs to be associated with the specific application and the workflows in which the part under test will operate, to drive consistency, reliability, and accuracy.
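To make the “good data” idea concrete, here is a minimal sketch of an acceptance check on a sensor reading before it enters the digital record. The field names, thresholds, and the `is_good_data` helper are all illustrative assumptions, not part of any real inspection system.

```python
from dataclasses import dataclass

# Hypothetical schema for a single inspection measurement; all fields
# and thresholds below are illustrative assumptions.
@dataclass
class SensorReading:
    sensor_id: str
    value_mm: float   # e.g. an ultrasonic wall-thickness measurement
    calibrated: bool  # sensor passed its last calibration check
    snr_db: float     # signal-to-noise ratio of the capture

def is_good_data(r: SensorReading, min_snr_db: float = 20.0) -> bool:
    """A reading qualifies as 'good data' only if the sensor is calibrated,
    the signal is clean enough, and the value is physically plausible."""
    return r.calibrated and r.snr_db >= min_snr_db and r.value_mm > 0

readings = [
    SensorReading("UT-01", 12.4, True, 28.5),   # clean, calibrated capture
    SensorReading("UT-02", 12.1, False, 31.0),  # rejected: not calibrated
    SensorReading("UT-03", -1.0, True, 25.0),   # rejected: implausible value
]
good = [r for r in readings if is_good_data(r)]
print([r.sensor_id for r in good])  # ['UT-01']
```

Only readings that pass the gate would be associated with the application workflows downstream; everything else is quarantined rather than silently absorbed.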
In the past, looking at how quality was structured in industrial organizations, a gap had always existed between the Quality department and the others (e.g. Design and Manufacturing). Quality 4.0, however, places quality management on top of a smart manufacturing approach: quality data delivers unique value when integrated into the “Digital Twin” of an industrial asset. More specifically, inspection data reveals the “asset health” status, an essential piece of information for understanding and mitigating the risk of critical failures.
The Digital Twin approach enables a feedback loop where Quality data and insights are connected across the asset lifecycle: this closed-loop approach feeds Quality data back across R&D, manufacturing, the supply chain, and field services. In software terms, this means connecting Quality data produced by sensors and inspection machines to software packages such as Computer-Aided Design and Manufacturing (CAD/CAM), Manufacturing Execution Systems (MES), Product Lifecycle Management (PLM), and Enterprise Resource Planning (ERP) software.
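A minimal sketch of that closed loop: an inspection result is packaged as an event and appended to the asset’s digital-twin record, which the PLM/MES/ERP layers would all key on. The event fields, the schema tag, and the `attach_to_twin` helper are hypothetical, not a real PLM or MES API.

```python
import json

# Illustrative inspection event; field names and the schema tag are
# assumptions for the sketch, not an existing standard.
def build_inspection_event(asset_id, method, result, operator):
    return {
        "asset_id": asset_id,  # key shared with CAD/PLM/ERP records
        "method": method,      # e.g. "x-ray", "ultrasound", "visual"
        "result": result,      # pass/fail plus measured indications
        "operator": operator,
        "schema": "quality4.0/inspection-event/v1",
    }

def attach_to_twin(twin: dict, event: dict) -> dict:
    """Append the event to the twin's history so R&D, manufacturing,
    and field service all see the same asset-health timeline."""
    twin.setdefault("inspection_history", []).append(event)
    return twin

twin = {"asset_id": "GT-BLADE-0042", "design_rev": "C"}
event = build_inspection_event("GT-BLADE-0042", "ultrasound",
                               {"pass": True, "indications": 0}, "QC-line-3")
twin = attach_to_twin(twin, event)
print(json.dumps(twin, indent=2))
```

The design point is the shared `asset_id`: because every system references the same key, a measurement taken on the shop floor is immediately visible against the design revision it was built to.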
Whereas conventional Quality focused mostly on detecting defects, Quality 4.0 strives to prevent them before they even occur. The new goal is not just to monitor “asset health” and build “defect-free” assets. The real value lies in leveraging the “digital twin” approach to re-design assets and push their performance beyond the current design: a “Super Asset”, one you could simulate in use even before building the first piece.
USE CASE: Blades for critical rotating equipment (gas turbines or centrifugal compressors)
Under “conventional quality”, blades are inspected merely against the specific quality requirements of that industry/application. This happens at different stages of the product lifecycle (manufacturing, assembly, shipping, commissioning, and service), using different technologies (from X-ray to ultrasound to visual inspection), and by different actors, from the equipment manufacturer to the commissioning and field-service operators.
Quality 4.0 ties the story together with a digital thread: from enhanced data capture to analytics and insights that complement the “digital twin” of the asset.
The advantages are enormous: feeding these data back to R&D to “re-design” the assets, aiming to create “defect-free”, “over-performing” equipment. This could also drive important business outcomes: standardization, lean manufacturing, equipment robustness, and the flexibility to adapt to multiple applications with reduced effort. Based on this shift, new business models, from manufacturing to service, could emerge.
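The digital-thread idea for the blade use case can be sketched as follows: inspections recorded at different lifecycle stages, by different actors and methods, are keyed to one serial number so the asset’s history reads as a whole. The stage names, records, and the `asset_health` summary are illustrative assumptions.

```python
from collections import defaultdict

# Illustrative inspection log for one blade across its lifecycle;
# the records and stage names are made up for the sketch.
inspections = [
    {"serial": "BLD-7", "stage": "manufacturing", "method": "x-ray",      "defects": 0},
    {"serial": "BLD-7", "stage": "assembly",      "method": "visual",     "defects": 0},
    {"serial": "BLD-7", "stage": "service",       "method": "ultrasound", "defects": 1},
]

def asset_health(serial: str, records: list) -> dict:
    """Collapse the digital thread for one serial number into a
    lifecycle-wide health summary."""
    thread = [r for r in records if r["serial"] == serial]
    by_stage = defaultdict(int)
    for r in thread:
        by_stage[r["stage"]] += r["defects"]
    return {
        "serial": serial,
        "stages_covered": [r["stage"] for r in thread],
        "total_defects": sum(by_stage.values()),
        # defects surfacing late in life are exactly the signal to
        # feed back to R&D for a re-design
        "feedback_to_rnd": by_stage["service"] > 0,
    }

print(asset_health("BLD-7", inspections))
```

A defect found in service, invisible at manufacturing, is precisely the insight the feedback loop to R&D is meant to capture.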
Source: Simone Bellanova
Don’t hesitate to contact Thanh for advice on automation solutions for CAD / CAM / CAE / PLM / ERP / IT systems exclusively for SMEs.
Luu Phan Thanh (Tyler) Solutions Consultant at PLM Ecosystem Mobile +84 976 099 099
Web www.plmes.io Email firstname.lastname@example.org