Collaboration, Trusted Data and the Next Phase of Operational AI
Week 19 of the Data Innovation Summit XI Edition shifts the spotlight to collaboration, trust, and operational maturity in data and AI systems. From national-scale innovation and industrial AI foundations to real-time data pipelines, governance transparency, and autonomous operations, the new sessions announced this week explore how organizations move from experimentation to durable capability.
New Keynotes, Speakers and Sessions
The highlight of the week is the opening Future Outlook Keynote, featuring Anders Ynnerman, Executive Chairman of Sferical AI and Director of Strategic Research at the Knut and Alice Wallenberg Foundation. In a dialogue moderated by Anders Arpteg, VP AI & Data at Saab Group, the session explores how large-scale collaboration and collective intelligence are becoming essential to accelerate innovation. The session sets the tone for the Summit by emphasizing a critical premise: in an era of exponential change, progress cannot happen in isolation, so the fastest way to innovate is to share the journey.
From the world of media and storytelling, Matthias Stahl, Head of Data & Visualizations at Der SPIEGEL, joins the program with From Data to Deadline: How We Built and Run a Data & Visualizations Team in the Newsroom. The session offers a behind-the-scenes look at how a 23-person interdisciplinary team combines data journalism, design, and engineering to deliver impactful stories under newsroom deadlines while still investing in long-term tools and innovation, including AI-supported workflows.
In the field of public-sector digital infrastructure, Janne Lautanala, Chief Ecosystem and Technology Officer at Fintraffic, presents From Open Traffic Data to Operational AI: How Fintraffic Builds Public Value at National Scale. The talk outlines how Finland’s national traffic operator evolved from publishing open datasets to building an operational ecosystem that enables real-time decision-making, digital services, and AI-ready capabilities across transport systems.
Cybersecurity and regulation are also increasingly intertwined with AI adoption. Per Myrseth, Senior Researcher in Cyber Security at DNV, will present AI-empowered Cyber Security Risk Management and Certification. Drawing on insights from EU-funded research projects, the session explores how AI can strengthen DevSecOps practices while helping organizations prepare for compliance with emerging frameworks such as the Cyber Resilience Act and the EU AI Act.
Understanding data at scale remains a central challenge for large technology platforms. In How Meta Understands Data at Scale, Vasileios Lakafosis and Dave Kurtzberg, Software Engineers at Meta, share how Meta’s Privacy Aware Infrastructure integrates schematization, annotation, and a universal privacy taxonomy early in the product development lifecycle.
Several sessions this week focus on the foundations required to support trustworthy AI systems. In AI is Accelerating Enterprise Decisions. Regulatory Scrutiny is Accelerating Faster, Adam Segal from Cloudera explores why data lineage has become a governance requirement rather than a technical feature. On the data architecture front, Antti Kajala, CIO at WiseDigi Oy, discusses how Finnish energy company Pohjolan Voima is preparing for AI with a high-quality data platform.

Another emerging theme this week is how generative AI is reshaping data engineering workflows. Tamara Tatian, EMEA Technical Leader and Architect at IBM Data Platform, tackles this in Vibe Coding and VibeOps – Enterprise DataOps Heaven or Hell? The session examines whether new paradigms enabled by generative AI represent an opportunity for faster development or introduce new operational complexity for enterprise-grade DataOps.
The evolution from pilots to real-world production remains one of the most pressing challenges for AI leaders. In Random Acts of AI: Moving from Pilots to Production, Dr. Chris Hillman, Global AI Lead at Teradata, outlines how enterprises can shift from isolated experiments toward coordinated roadmaps that deliver measurable business outcomes within complex environments.
Operational AI also depends on real-time infrastructure. Gregor Bauer, VP Customer Engineering at CrateDB, presents Architecting Real-Time Data Pipelines: Turning Sensor Streams into Dashboards, demonstrating how modern streaming architectures transform high-velocity IoT data into actionable operational insights.
Similarly focused on operational systems, Somil Gupta, Co-Founder of Kovant, introduces the concept of Operations 5.0, where autonomous AI agents plan, adapt, and execute tasks across real business processes. The session explores how agentic systems differ from traditional automation and what organizations need to adopt them successfully.
Several sessions address the growing importance of trusted data as the foundation for AI reliability. Christian Stadlmann, Chief Revenue Officer at One Data, explains why technical data quality alone is not enough in Data Quality Is Not Enough: Stop AI Hallucinations with Trusted Data. In another session from One Data, Tim Föckersperger explores how organizations can move Beyond Migration when modernizing SAP BW, transforming legacy environments into future-ready data foundations that support AI-driven decision making.

The transition toward agentic AI also places new demands on data infrastructure. Arjan Hijstek, Solution Architect at ClickHouse, discusses how next-generation analytics platforms can deliver the speed, concurrency, and scale required to power agentic systems in When Data Starts Acting: Speed, Scale and Simplicity for Agentic AI.
Industry collaboration stories also take center stage this week. In Finding the Right Chemistry: Combining Domain Expertise, Data & MLOps Across Borders, Nicola Holmes from Kemira and Janne Sipilä from Twoday discuss how multinational teams align domain expertise and data engineering practices to build scalable AI initiatives despite complex data environments.
The finance sector contributes another strong example of data product thinking. André Westerlund from Handelsbanken IT and Frej Örnberg from Handelsbanken Fonder present Handelsbanken Pension Fund: 10x Faster Data Products by Redesigning Data Delivery. Their session describes how shifting from point integrations to reusable data services significantly accelerated reporting and analytics while maintaining strict governance requirements.
Healthcare data transformation is also featured this week. Henry De Rudder, Head of Data, AI & IT at Ceres Pharma, shares how the organization unified 23 entities, five ERP systems, and multiple acquisitions into a single governed data layer within a year, enabling secure self-service analytics across the business.
Finally, several additional speakers join the program across the Summit tracks, including Jon Palmer (Field CTO at Omni), Dr. Christopher Royles (CTO EMEA at Cloudera), Claudia Chiţu (Engineering Manager at Arrive / EasyPark Group), and Gudrun Anna Atladottir from Novo Nordisk together with Rajesh Ananth from EPAM. Their sessions will further expand the conversation on enterprise analytics, AI readiness, and modern data platforms.
New Partners
Week 19 also brings several new organizations into the DIS26 ecosystem, further strengthening the community around enterprise data and AI innovation. We welcome Validio, Astrafy, ConfidentialMind, Aiven, JetBrains, ThoughtSpot and Sigma as new partners supporting this year’s Summit.
Week 19 highlights a clear theme: the next generation of AI systems will not be built on models alone. They require trusted data foundations, transparent governance, real-time infrastructure, and above all, collaboration across organizations and industries.
More announcements are coming next week.
The Data Innovation Team
