Data platform modernization requires that data leaders employ a sound data platform strategy, one informed by platform maturity modeling. Learn why.
The value of data is often industry-agnostic—but healthcare data quality in particular comes with higher stakes for many data leaders and the teams they manage.
Learn how to assess and improve your data platform with a data platform maturity model to drive business growth and innovation.
Data platform modernization is going from “would be nice” to “need to do now” due to growing industry needs for high-quality data. Learn exactly why here.
The rapid rise of AI has dramatically elevated the value and strategic importance of data, transforming how upstream software engineers perceive and interact with data workflows. In this expert-led panel, industry leaders will share their experiences and insights into effectively bridging the gap between data teams and software engineers. They will discuss practical strategies for proactively managing data infrastructure, enhancing collaboration, and ensuring high-quality data to support advanced AI-driven development initiatives.
Good data, not big data, is becoming more important in today's ecosystem: machine learning models rely on good-quality data to make training more efficient and effective. We have traditionally applied data quality checks and balances in a manual, centralized way, putting a lot of onus on our customers. Shifting data quality left brings the checks closer to where data is created, preventing bad data from flowing downstream. Auto-detecting, recommending, and auto-enforcing data quality rules also makes our customers' jobs easier, while creating a more mature and robust data ecosystem.
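The abstract's two ideas, auto-detecting rules from known-good data and enforcing them at the point of creation, can be sketched in a few lines of Python. This is a minimal illustration, not any particular vendor's implementation; the field names and the two simple rule types (required fields, numeric ranges) are assumptions for the example.

```python
def infer_rules(sample):
    """Auto-detect simple per-field rules from known-good sample records:
    which fields are required, and observed numeric ranges. A real system
    would infer far richer rules (patterns, enums, freshness, etc.)."""
    rules = {}
    for field in sample[0]:
        values = [r[field] for r in sample if r.get(field) is not None]
        rule = {"required": all(r.get(field) is not None for r in sample)}
        if values and all(isinstance(v, (int, float)) for v in values):
            rule["min"], rule["max"] = min(values), max(values)
        rules[field] = rule
    return rules

def validate(record, rules):
    """Run at the producer, before the record is emitted. Returns a list of
    violations; an empty list means the record may flow downstream."""
    errors = []
    for field, rule in rules.items():
        value = record.get(field)
        if value is None:
            if rule["required"]:
                errors.append(f"{field}: missing required value")
            continue
        if "min" in rule and not (rule["min"] <= value <= rule["max"]):
            errors.append(f"{field}: {value} outside observed range")
    return errors

# Illustrative records; in practice the sample comes from vetted production data.
sample = [{"patient_id": "p-001", "heart_rate": 62},
          {"patient_id": "p-002", "heart_rate": 97}]
rules = infer_rules(sample)
print(validate({"patient_id": "p-003", "heart_rate": 180}, rules))  # flagged
print(validate({"patient_id": "p-004", "heart_rate": 70}, rules))   # → []
```

Because the check runs where the data is created, the producer gets the error immediately, rather than a downstream consumer discovering it after the fact.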
In healthcare technology, protecting patient privacy while scaling data operations requires reimagining where quality and governance live. This presentation explores Helix's journey of shifting critical processes left in its precision medicine business—from implementing automated data classification and privacy workflows to enlisting cross-functional expertise in refining operational workflows. For clinical data management, we've partnered with healthcare systems to implement OMOP standards and data contracts at the source, creating a robust foundation for research and commercial opportunities. Through practical examples, we'll demonstrate how this upstream approach has transformed our data operations, encouraged internal alignment, and strengthened partner relationships.
Data teams increasingly embrace software engineering practices to address quality and integration challenges, yet friction remains between software and data teams. This talk explores why standard practices alone aren’t enough and introduces the concept of the “Data-Conscious Software Engineer,” an emerging role critical to bridging these organizational divides. Attendees will learn how identifying and empowering engineers who deeply understand both software development and data workflows can foster stronger collaboration, improve data quality, and drive organizational change toward treating data as a strategic asset.
High-quality, governed, and performant data from the outset is vital for agile, trustworthy enterprise AI systems. Traditional approaches delay addressing data quality and governance, causing inefficiencies and rework. Apache Iceberg, a modern table format for data lakes, empowers organizations to "Shift Left" by integrating data management best practices earlier in the pipeline to enable successful AI systems. This session covers how Iceberg's schema evolution, time travel, ACID transactions, and Git-like data branching allow teams to validate, version, and optimize data at its source. Attendees will learn to create resilient, reusable data assets, streamline engineering workflows, enforce governance efficiently, and reduce late-stage transformations—accelerating analytics, machine learning, and AI initiatives.
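The branch-validate-publish workflow the session describes can be sketched engine-free: stage writes on a branch, run quality gates there, and only fast-forward `main` if they pass. The class and method names below are a toy stand-in for illustration, not the Iceberg or PyIceberg API.

```python
import copy

class Table:
    """Toy stand-in for an Iceberg-style table: named branches over
    snapshots, with an atomic publish (fast-forward of 'main').
    Illustrative only; not the real Iceberg API."""
    def __init__(self):
        self.branches = {"main": []}  # branch name -> current snapshot (rows)

    def create_branch(self, name, source="main"):
        self.branches[name] = copy.deepcopy(self.branches[source])

    def append(self, branch, rows):
        # Each append produces a new snapshot on the branch.
        self.branches[branch] = self.branches[branch] + rows

    def publish(self, branch):
        """Atomically point 'main' at the validated branch's snapshot."""
        self.branches["main"] = self.branches[branch]

table = Table()
table.create_branch("etl_staging")
table.append("etl_staging", [{"id": 1, "amount": 10.0},
                             {"id": 2, "amount": -5.0}])

# Shift-left gate: validate on the branch *before* anything reaches main.
bad = [r for r in table.branches["etl_staging"] if r["amount"] < 0]
if not bad:
    table.publish("etl_staging")

print(len(table.branches["main"]))  # → 0: bad data never flowed downstream
```

The point of the pattern is that consumers reading `main` never see the failed batch; late-stage cleanup jobs become unnecessary because the gate sits at write time.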