In today's dynamic industrial landscape, the need for flexibility, resiliency and cost-efficiency remains the driving force behind investments in production performance improvement. Industry leaders across discrete, process and hybrid industrial models are harnessing the power of data-centric capabilities in machine learning, artificial intelligence and data federation. Conceptual frameworks like the 'industrial data fabric' are becoming foundational requirements for achieving greater levels of performance through these new data methods and tools. Many industrial operations leaders find that the investment 'recipe' of using technologically advanced tools for the acquisition, management and contextualization of data, followed by the application of advanced analytics, delivers greater rates of performance improvement than ever before. Meanwhile, the timeless, foundational and familiar principles of lean manufacturing, quality management and performance measurement remain relevant and useful.
Join Jeff Miller, Kalypso Lead Principal for Production Performance Improvement, as he defines PPI within today's industrial operating systems, explains the leading principles and strategies that guide PPI programs, and reviews how they're applied in representative case studies to deliver differentiating performance results.
Key Takeaways:
- Data is everything, and if used the right way, it can be monetized.
- PPI today allows machines to give us the context behind information, such as unplanned production downtime.
- As a result of PPI, we gain timely awareness of how the factory is performing.