Programme

YOU CAN ATTEND LIVE IN UTRECHT OR ONLINE. WITH WORKSHOPS OF YOUR CHOICE.

Data Mesh & Fabric: The Data Quality Dependency

This session will briefly recap the main concepts and practices of Data Mesh and Data Fabric and consider their implications for Data Quality Management. Will the Mesh and Fabric make Data Quality easier or harder to get right? As a foundational data discipline, how should Data Quality principles and practices evolve and adapt to meet the needs of these new trends? What new approaches and practices may be needed? And what are the implications for Data Quality practitioners and for data management professionals working in other data disciplines such as Data Governance, Business Intelligence and Data Warehousing?

The concepts and practices of Data Mesh and Data Fabric are data management’s new hot topics. These contrasting yet complementary technology and organisational approaches promise better data management through the delivery of defined data products and the automation of real-time data integration.

But to succeed, both depend on getting their Data Quality foundations right. To work, Data Mesh requires high-quality, well-curated data sets and data products; Data Fabric likewise relies on high-quality, standardised data and metadata which insulate data users from the complexities of multiple systems and platforms.
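To make this dependency concrete, here is a minimal, purely illustrative sketch of an automated quality gate applied to a data product before it is published. The dataset, column names and thresholds are assumptions for illustration, not part of the session.

```python
# Purely illustrative sketch: simple quality gates applied to a data product
# before publication. Column names and thresholds are hypothetical.
import pandas as pd

def completeness_ok(df: pd.DataFrame, column: str, max_null_ratio: float) -> bool:
    """Check that a column is not missing more values than allowed."""
    return df[column].isna().mean() <= max_null_ratio

def uniqueness_ok(df: pd.DataFrame, column: str) -> bool:
    """Check that the intended key column contains no duplicates."""
    return not df[column].duplicated().any()

def quality_gate(df: pd.DataFrame) -> None:
    """Reject the data product if any rule fails."""
    checks = {
        "customer_id is unique": uniqueness_ok(df, "customer_id"),
        "email is at least 95% populated": completeness_ok(df, "email", 0.05),
    }
    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        raise ValueError(f"Data product rejected; failed checks: {failed}")

sample = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "email": ["a@example.com", "b@example.com", "c@example.com"],
})
quality_gate(sample)  # passes silently; a failing check would raise
```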

This session will include:

  • A brief overview of the main concepts of Data Mesh and Data Fabric
  • A review of the current state of Data Quality Management – its successes and failures
  • An analysis of the impact of Data Mesh and Data Fabric on Data Quality Management – will they improve or worsen the Data Quality status quo?
  • Practical guidance on how Data Quality Management needs to evolve to support these new data management approaches

  • A suggested roadmap of actions which Data Quality practitioners and other data management professionals should implement to ensure they remain relevant in the new world of Data Mesh and Data Fabric.


Building an Enterprise Data Marketplace

This session looks at what a data marketplace is, how to build one, and how you can use it to govern data sharing across the enterprise and beyond. It also looks at what is needed to operate a data marketplace and at the trend towards marketplaces for both data and analytical products.

Most firms today want to create a high-quality, compliant data foundation to support multiple analytical workloads. A rapidly emerging approach to building this is to create DataOps pipelines that produce reusable data products. However, there needs to be somewhere these data products can be made available so that data can be shared. The solution is a data marketplace where ready-made, high-quality data products can be published for others to consume and use.

  • The need for a high-quality data foundation to support decision making
  • Incrementally building a data foundation using DataOps pipelines to produce Data Products
  • Using an enterprise data marketplace to share data
  • What is the difference between a data catalog and a data marketplace?
  • Challenges in establishing a data marketplace
  • What processes are needed to operate a data marketplace?
  • Governing the sharing of data using a data marketplace
  • Trends – publishing analytical products in a marketplace
  • Progressively shortening time to value using a marketplace.
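
To illustrate the publishing side of this, the sketch below shows the kind of metadata a DataOps pipeline might register when adding a data product to a marketplace catalogue. It is a minimal, hypothetical model; the field names and the in-memory "marketplace" are assumptions, not any specific vendor's API.

```python
# Illustrative sketch of registering a data product in an enterprise data
# marketplace catalogue. Metadata fields and the catalogue are hypothetical.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataProduct:
    name: str
    owner: str                    # accountable data owner / domain team
    description: str
    location: str                 # where consumers can find the data
    quality_score: float          # e.g. the result of automated DQ checks
    tags: list[str] = field(default_factory=list)
    published_on: date = field(default_factory=date.today)

class DataMarketplace:
    """A minimal in-memory stand-in for a marketplace catalogue."""
    def __init__(self) -> None:
        self._catalog: dict[str, DataProduct] = {}

    def publish(self, product: DataProduct) -> None:
        self._catalog[product.name] = product

    def search(self, tag: str) -> list[DataProduct]:
        return [p for p in self._catalog.values() if tag in p.tags]

marketplace = DataMarketplace()
marketplace.publish(DataProduct(
    name="customer_360",
    owner="Customer Domain Team",
    description="Curated, de-duplicated customer master data",
    location="s3://datalake/products/customer_360/",
    quality_score=0.97,
    tags=["customer", "master-data"],
))
print([p.name for p in marketplace.search("customer")])
```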

Data Lakehouse: Marketing Hype or New Architecture?

This session discusses all aspects of data warehouses and data lakes, including data quality, data governance, auditability, performance, historic data, and data integration, to determine whether the data lakehouse is marketing hype or really a valuable and realistic new data architecture.

The data lakehouse is the popular new data architecture. In a nutshell, the data lakehouse is a combination of a data warehouse and a data lake. It makes a lot of sense to combine them, because they share the same data and similar logic.
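
A minimal way to get a feel for the lakehouse idea, i.e. running warehouse-style SQL directly over open file formats in a data lake, is sketched below using DuckDB. The table, columns and file name are invented for illustration; real lakehouse platforms add table formats, governance and transaction support on top of this basic pattern.

```python
# Illustrative sketch of the core lakehouse pattern: warehouse-style SQL run
# directly over an open file format. Data and file name are hypothetical.
import duckdb
import pandas as pd

# Create a tiny Parquet file standing in for a table in the data lake.
pd.DataFrame({
    "customer_segment": ["retail", "retail", "wholesale"],
    "order_value": [120.0, 80.0, 950.0],
}).to_parquet("orders.parquet")

# Query the file in place, as if it were a warehouse table.
con = duckdb.connect()
print(con.execute("""
    SELECT customer_segment,
           COUNT(*)         AS orders,
           SUM(order_value) AS revenue
    FROM 'orders.parquet'
    GROUP BY customer_segment
    ORDER BY revenue DESC
""").fetchdf())
```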


Data Observability – What is it and why is it important?

This session by internationally acclaimed analyst Mike Ferguson examines the emergence of Data Observability: what it is about, what Data Observability can observe, vendors in the market, and examples of what vendors are capturing about data.

The presentation will also look at Data Observability requirements, the strengths and weaknesses of current offerings, where the gaps are, and tool complexity (overlaps, inability to share metadata) from a customer perspective. It will explore the link between Data Observability, data catalogs, data intelligence and the move towards augmented data governance, and discuss how Data Observability and data intelligence can be used in a real-time automated Data Governance Action Framework to govern data across multiple tools and data stores in next-generation Data Governance.
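
As a purely illustrative example of the kind of signals Data Observability tools capture (not any specific vendor's approach), the sketch below computes freshness, volume and completeness observations over a table. The column names, expectations and alert rules are assumptions; real tools typically learn such baselines automatically.

```python
# Minimal sketch of data observability signals: freshness, volume and
# completeness metrics over a table, with naive alerting rules.
# Column names, thresholds and expectations are hypothetical.
from datetime import datetime, timedelta
import pandas as pd

def observe(df: pd.DataFrame, loaded_at_col: str = "loaded_at") -> dict:
    """Capture simple observations about a dataset."""
    now = datetime.now()
    return {
        "row_count": len(df),                                 # volume
        "freshness_hours": (now - df[loaded_at_col].max()).total_seconds() / 3600,
        "null_ratio_per_column": df.isna().mean().to_dict(),  # completeness
    }

def alert_on_anomalies(observations: dict, expected_rows: int) -> list[str]:
    """Very naive anomaly rules; production tools learn baselines over time."""
    alerts = []
    if observations["freshness_hours"] > 24:
        alerts.append("Data has not been refreshed in over 24 hours")
    if observations["row_count"] < 0.5 * expected_rows:
        alerts.append("Row count dropped by more than 50% versus expectation")
    return alerts

df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "loaded_at": [datetime.now() - timedelta(hours=2)] * 3,
})
obs = observe(df)
print(obs, alert_on_anomalies(obs, expected_rows=4))
```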


Knowledge Graphs – New Perspectives on Analytics

Ever since Google announced its Knowledge Graph solution in 2012, the paradigm has found its way into many real-world use cases, mostly in the analytics space. This presentation will cover what a Knowledge Graph is and how it is different yet complementary, and will look at vendors, products and standards.

Since Google announced its Knowledge Graph solution in 2012, the paradigm has found its way into many real-world use cases, mostly in the analytics space. The graph database market has exploded over the last 10 years, with at least 50 brand names today. International standardization is coming – very soon SQL will be extended with functionality for property graph queries. A full international standard for property graphs, called GQL, will surface in late 2023.
The inclusion of graph technology dramatically enlarges the scope of analytics by enabling semi-structured information, semantic sources such as ontologies and taxonomies, social networks, as well as schema-less sources of data. At the same time, graph databases are much better suited for the complex multi-joins involved in analysing large networks of data, opening up possibilities such as advanced fraud detection; the Panama Papers is the best-known example. Finally, graph theory is a mathematical discipline with a long history which, among other things, has produced graph algorithms for many complex analytics, such as clustering, shortest path, PageRank, centrality and much more.
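To give a flavour of such graph algorithms in practice, here is a minimal, purely illustrative sketch using the open-source networkx library in Python; the small ownership network is invented, and the session itself is not tied to this tool.

```python
# Illustrative sketch: classic graph algorithms (shortest path, PageRank,
# centrality) over a small invented ownership network, using networkx.
import networkx as nx

g = nx.DiGraph()
# Edges read as "X owns/controls Y" - a tiny stand-in for an ownership network.
g.add_edges_from([
    ("Person A", "Shell Co 1"),
    ("Shell Co 1", "Shell Co 2"),
    ("Shell Co 2", "Offshore Fund"),
    ("Person B", "Offshore Fund"),
])

# A multi-hop traversal that would require several joins in a relational model.
print(nx.shortest_path(g, "Person A", "Offshore Fund"))

# Which nodes are most "central" in the network?
print(nx.pagerank(g))
print(nx.degree_centrality(g))
```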
This presentation will cover what a Knowledge Graph is, how it is different and yet complementary to other technologies. Furthermore, Thomas will cover:

  • Why do semantics and relations matter?
  • What kinds of data architectures and pipelines are involved?
  • Which are the vendors and the products?
  • Which standards exist?

It is a non-technical presentation, focusing on business requirements and architecture. More technical information will be covered in the workshop Understanding Graph Technologies on the 5th of April.


The Data-Process Connection

Alec Sharp illustrates the many ways concept models (conceptual data models) support business process change and business analysis.

Whether you call it a conceptual data model, a domain map, a business object model, or even a “thing model,” a concept model is invaluable to business process and business analysis professionals. After introducing methods to get people, even C-level executives, engaged in concept modelling, we’ll cover essential guidelines for naming and defining entities, laying out a data model diagram, and verifying the model, all in ways that maximize involvement and understanding. Then we’ll move on to the heart of the presentation – techniques and real-life examples showing how a concept model can help solve problems that often arise in business process and business analysis efforts.
Drawing on over forty years of successful modelling, on projects of every size and type, this session introduces proven techniques backed up with current, real-life examples.

Topics include:

  • Concept modelling essentials, including why you shouldn’t call it a “data model”
  • “Guerrilla modelling” – starting a concept model without anyone realising it
  • Case studies, including:
    • concept models as a starting point for discovering processes, use cases/user stories, and services
    • using concept models to get stalled process and analysis work moving
    • using concept models to support selection of purchased software.

A Data Strategy for Becoming Data Driven

Becoming data driven involves more than deploying new tools and technology. It affects business models, processes and, above all, data management. This seminar by Nigel Turner outlines the practical steps needed to draw up an achievable data strategy and approach.

In this digital world, it is becoming clear to many organisations that their success or failure depends on how well they manage data. They recognise that data is a critical business asset which should be managed as carefully and actively as all other business assets such as people, finance, products etc. But like any other asset, data does not improve itself and will decline in usefulness and value unless actively maintained and enhanced.

For any organisation, a critical first step in maintaining and enhancing its data asset is to understand two things:

  • How well does data support our current business model?
  • How do we need to improve and develop it both to better sustain our current business and to enable our future business strategies and goals?

The primary purpose of a data strategy is to answer these two questions. For any data driven organisation a data strategy is essential because it serves as a blueprint for prioritising and guiding current and future data improvement activities. Without a data strategy, organisations will inevitably try to enhance their data assets in a piecemeal, disconnected, unfocused way, usually ending in disappointment or even failure. What’s needed is a well-crafted and coherent data strategy which sets out a clear direction that all data stakeholders can buy into. And as the famous US baseball player Yogi Berra once said, “If you don’t know where you are going, you’ll end up somewhere else.”

This seminar will teach you how to produce a workable and achievable data strategy and supporting roadmap and plan, and how to ensure that it becomes a living and agile blueprint for change.


The seminar

In this full-day seminar Nigel Turner will outline how to create and implement a data strategy. This includes:

  • How data strategy and business strategy interrelate
  • What a data strategy is (and is not) and what it should contain
  • Building & delivering a data strategy – the key components and steps
  • Managing and implementing a data strategy to ensure it continually aligns with changing business priorities and needs.

The seminar will take you through a simple and proven four-step process to develop a data strategy. It will also include practical exercises to help participants apply the approach before doing it for real back in their own organisations, as well as highlighting some real-world case studies where the approach has been successful.


Learning Objectives

  • Know what a data strategy is, and why it is a ‘must have’ for digital organisations
  • Understand the mutual relationship between business and data strategies
  • Identify what a data strategy needs to include
  • Understand and be able to apply a simple approach to developing a data strategy
  • Analyse business goals and strategies and their dependence on data
  • Highlight current data problems and future lost opportunities
  • Make an outline business case for strategic action
  • Assess current data maturity against required data capabilities
  • Focus in on business critical data areas
  • Identify required new or enhanced data capabilities
  • Define and create an actionable roadmap and plan
  • Secure stakeholder support and buy in
  • Manage change and communication across the organisation
  • Understand the crucial role of data governance in implementing and sustaining a data strategy
  • Track data strategy deliverables and benefits
  • Be aware of case studies of successful implementation of the approach
  • Highlight software and other tools that can help to support and automate the delivery of the data strategy.

Concept Modelling for Business Analysts

Concept Modelling (or Conceptual Data Modelling) has seen an amazing resurgence of popularity in recent years, and Alec Sharp illustrates the many reasons for this along with practical techniques and guidelines to ensure useful models and business engagement.

Whether you call it a conceptual data model, a domain model, a business object model, or even a “thing model,” the concept model is seeing a worldwide resurgence of interest. Why? Because a concept model is a fundamental technique for improving communication among stakeholders in any sort of initiative. Sadly, that communication often gets lost – in the clouds, in the weeds, or in chasing the latest bright and shiny object. Having experienced this, Business Analysts everywhere are realizing Concept Modelling is a powerful addition to their BA toolkit. This session will even show how a concept model can be used to easily identify use cases, user stories, services, and other functional requirements. 

Recognition of the value of concept modelling is also, surprisingly, taking hold in the data community. “Surprisingly” because many data practitioners had seen concept modelling as an “old school” technique. Not anymore! In the past few years, data professionals who have seen their big data, data science/AI, data lake, data mesh, data fabric, data lakehouse, etc. efforts fail to deliver expected benefits have realised it is because those efforts are not based on a shared view of the enterprise and the things it cares about. That’s where concept modelling helps. Data management/governance teams are (or should be!) taking advantage of the current support for Concept Modelling. After all, we can’t manage what hasn’t been modelled!

The Agile community is especially seeing the need for concept modelling. Because Agile is now the default approach, even on enterprise-scale initiatives, Agile teams need more than some user stories on Post-its in their backlog. Concept modelling is being embraced as an essential foundation on which to envision and develop solutions. In all these cases, the key is to see a concept model as a description of a business, not a technical description of a database schema. 

This workshop introduces concept modelling from a non-technical perspective, provides tips and guidelines for the analyst, and explores entity-relationship modelling at conceptual and logical levels using techniques that maximise client engagement and understanding. We’ll also look at techniques for facilitating concept modelling sessions (virtually and in-person), applying concept modelling within other disciplines (e.g., process change or business analysis), and moving into more complex modelling situations.

Drawing on over forty years of successful consulting and modelling, on projects of every size and type, this session provides proven techniques backed up with current, real-life examples.

Topics include:

  • The essence of concept modelling and essential guidelines for avoiding common pitfalls
  • Methods for engaging our business clients in conceptual modelling without them realizing it
  • Applying an easy, language-oriented approach to initiating development of a concept model
  • Why bottom-up techniques often work best
  • “Use your words!” – how definitions and assertions improve concept models
  • How to quickly develop useful entity definitions while avoiding conflict
  • Why a data model needs a sense of direction
  • The four most common patterns in data modelling, and the four most common errors in specifying entities
  • Making the transition from conceptual to logical using the world’s simplest guide to normalisation
  • Understand “the four Ds of data modelling” – definition, dependency, demonstration, and detail
  • Tips for conducting a concept model/data model review presentation
  • Critical distinctions among conceptual, logical, and physical models
  • Using concept models to discover use cases, business events, and other requirements
  • Interesting techniques to discover and meet additional requirements
  • How concept models help in package implementations, process change, and Agile development


Learning Objectives:

  • Understand the essential components of a concept model – things (entities), facts about things (relationships and attributes), and rules
  • Use entity-relationship modelling to depict facts and rules about business entities at different levels of detail and perspectives, specifically conceptual (overview) and logical (detailed) models
  • Apply a variety of techniques that support the active participation and engagement of business professionals and subject matter experts
  • Develop conceptual and logical models quickly using repeatable and Agile methods
  • Draw an Entity-Relationship Diagram (ERD) for maximum readability
  • Read a concept model/data model, and communicate with specialists using the appropriate terminology.

Understanding Graph Technologies

In this half-day workshop Thomas Frisendal will show what graph technologies mean in practice. He will also show how graph solutions differ and how traditional databases and graph technology complement each other. The combination of the two is very powerful and, fortunately, relatively easy to implement.

Since Google announced its Knowledge Graph solution in 2012, graph database technologies have found their way into many organizations and companies. The graph database market has exploded over the last 10 years, with at least 50 brand names today. International standardization is coming – very soon SQL will be extended with functionality for property graph queries. A full international standard for property graphs, called GQL, will surface in late 2023 (from the same ISO committee that maintains the SQL standard).

Graph databases are generally quite easy to understand – the paradigm is intuitive and seems straightforward. In spite of that, the breadth and power of the solutions one can create are overwhelmingly impressive. The inclusion of graph technology dramatically enlarges the scope of analytics by enabling semi-structured information, semantic sources such as ontologies and taxonomies, social networks, as well as schema-less sources of data.
At the same time, graph databases are much better suited for the complex multi-joins involved in analysing large networks of data, opening up possibilities such as advanced fraud detection; the Panama Papers is the best-known example.

Finally, graph theory is a mathematical discipline with a long history which, among other things, has produced graph algorithms for many complex analytics, such as clustering, shortest path, PageRank, centrality and much more.
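
For a feel of the property graph paradigm discussed in the workshop, the sketch below models a few nodes and relationships, each with labels/types and properties, and runs a simple traversal. It uses Python and the networkx library purely for illustration, and the flight data is invented.

```python
# Illustrative sketch of the property graph paradigm: nodes and relationships
# both carry labels/types and key-value properties. Modelled with networkx;
# the flight data is invented for illustration.
import networkx as nx

g = nx.MultiDiGraph()

# Nodes with a label and properties
g.add_node("AMS", label="Airport", city="Amsterdam")
g.add_node("JFK", label="Airport", city="New York")
g.add_node("XX123", label="Flight", carrier="Example Air")

# Relationships with a type and properties
g.add_edge("XX123", "AMS", type="DEPARTS_FROM", scheduled="10:05")
g.add_edge("XX123", "JFK", type="ARRIVES_AT", scheduled="12:35")

# A simple "traversal": which flights depart from Amsterdam?
for src, dst, attrs in g.edges(data=True):
    if attrs["type"] == "DEPARTS_FROM" and g.nodes[dst]["city"] == "Amsterdam":
        print(src, "departs from", g.nodes[dst]["city"])
```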

Learning Objectives

  • Understand graph parlance and paradigms
  • Understand the principles of graph data modeling
  • Understand “schema on read” approaches and use cases
  • Investigate examples on the database language level
  • Get a feel for the scope of graph solutions
  • Get an overview of the vendors and technologies
  • Get an understanding of the tools available
  • Get a good feel for investigative analytics, graph algorithms and graphs in the ML context
  • Get advice on how to get to play with graph tools
  • Get references to good resources.


Who is it for?

  • People who architect, design and manage analytical solutions, looking for additional analytics power for complex business concerns
  • People who implement analytics
  • People who use analytics applications, tools and data to resolve business issues
  • People who have some experience with database query languages and/or query tools
  • Business analysts
  • Data and IT consultants.

Although code examples (in graph database query languages) will be used frequently, the audience is not expected to be proficient database developers (but even SQL experts will benefit from the workshop).


Workshop Course Outline

  • Graph Models
    • Graph Theory, Property Graphs and data paradigms
    • Graph models compared to classic (relational) models
    • Schema less, first, last or eventually
    • The Flight Data Model as a property graph
  •  Graph Queries
    • Graph traversals and paths
    • Query languages, incl. international standards work in progress and a market overview
    • Loading, modifying and deleting Data
    • Profiling graph data
  •  Graph Analytics
    • Investigative analytics (Cypher examples)
    • Graph Algorithms
    • Graphs and Machine Learning
  • Best Practices
  • Resources
    • Literature
    • Websites
    • Getting started with a prototype.


It is a somewhat technical workshop, focusing on what and how, using examples. Business and architectural level information can be found in the Knowledge Graphs session at the DW&BI Summit on April 4th.



Prefer to attend online? Follow via the live stream!
The conference can be attended both live in Utrecht and online. Conference participants also have access to the video recordings for several months afterwards, so if you have to miss a session, nothing is lost. This also allows you to watch all parallel sessions afterwards.

4 April

09:45 - 10:45 | Data Mesh & Fabric: The Data Quality Dependency
Room 1    Nigel Turner
09:45 - 10:45 | Building an Enterprise Data Marketplace
Room 1    Mike Ferguson
09:45 - 10:45 | Data Lakehouse: Marketing Hype or New Architecture?
Room 1    Rick van der Lans
09:45 - 10:45 | Data Observability – What is it and why is it important?
Room 1    Mike Ferguson
09:45 - 10:45 | Knowledge Graphs – New Perspectives on Analytics
Room 1    Thomas Frisendal
09:45 - 10:45 | The Data-Process Connection
Room 1    Alec Sharp

Workshops

09:30 - 17:00 | A Data Strategy for Becoming Data Driven
5 April    Nigel Turner
09:00 - 12:30 | Concept Modelling for Business Analysts
5 April    Alec Sharp
13:30 - 17:00 | Understanding Graph Technologies
5 April    Thomas Frisendal