Frans Klaassen

Developing data awareness as part of Data Governance

Data-driven? Start with data awareness.

How do you create true data awareness within a large organization? In this ninth episode of the podcast, Frans Klaassen MBA/CDMP shares his unique approach, insights, and lessons learned from raising data literacy globally and implementing sustainable data governance in a corporate environment.

What is data awareness?

According to Frans, data awareness doesn't start with technology, but with people. Or as he puts it: "It starts with explaining the impact of your actions on the rest of the data chain."

A simple anecdote illustrates this perfectly: a remarkably large amount of toilet paper was being used at a refinery. The cause? Operators were quickly clicking through input screens without realizing that their actions were being recorded as deviations further down the process. By explaining this, awareness was created. Small insights, big impact.

How do you create data awareness?

Frans introduced a broadly supported programme with:

  • Peer-to-peer learning & Data Analytics Networks

  • Reach: 9,000 employees in 44 countries in 6 months

  • Practical examples, playbooks and videos

  • Champions who function as ambassadors

"You have to make data fun and understandable. That way you get the whole organization on board."

By fostering broad support rather than top-down management, a community-driven culture emerged. Initiatives were fueled by intrinsic motivation and supported by centralized resources.

Tools & approach

  • Automated reporting at the source: accelerates processes and prevents manipulation.

  • Data Quality Handbook: living document full of practical examples.

  • Data Governance as an enabler, not a compliance obligation.

  • Dynamic KPI Reporting: focus on 10 KPIs at a time, replace them when they remain green for 3 months.
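The dynamic KPI rotation described above could be automated roughly as follows. This is a minimal sketch, not Frans's actual tooling; the `KPI` class, the monthly status values, and the function names are all hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class KPI:
    name: str
    history: list = field(default_factory=list)  # monthly statuses: "green" / "amber" / "red"

def rotate_kpis(active, backlog, green_months=3):
    """Retire KPIs that have been green for `green_months` consecutive
    months and promote replacements from the backlog, keeping the
    active set at its original size."""
    kept, retired = [], []
    for kpi in active:
        recent = kpi.history[-green_months:]
        if len(recent) == green_months and all(s == "green" for s in recent):
            retired.append(kpi)
        else:
            kept.append(kpi)
    # refill to the original size (e.g. 10) from the backlog
    while len(kept) < len(active) and backlog:
        kept.append(backlog.pop(0))
    return kept, retired
```

A KPI that has stayed green for three straight months drops off the dashboard and the next candidate takes its place, keeping attention on the areas that still need work.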

“Make sure that data governance doesn't feel like an obligation, but rather a necessity to do good work.”

👥 Governance & overcoming resistance

Frans addresses the well-known challenge: resistance at the intermediate management level. His solution:

  • Taking time for 1-on-1 conversations

  • Drinking coffee instead of PowerPoint

  • Listening, explaining, and connecting with practice

You don't have to get everyone on board right away. Start with a few who truly believe.

💡 Tips for organizations that want to start:

  1. Start small – test with a pilot group

  2. Anchor consciousness in culture – repeat, show value

  3. Invest visibly in people – training, recognition

  4. Make data fun and understandable – use stories

  5. Create a consistent language – prevent proliferation of definitions

Ask your questions!

🎁 Ask all your questions for free via databewuster.com, every Friday at 3:00 PM.

🚀 Or would you like guided implementation of the Metro Model within your organization? Schedule a no-obligation consultation now via powerbison.io/dataquality.

Shannon Els

Metadata as the foundation of Data Governance

What is metadata really?

According to Shannon M. Els, metadata is much more than simply "data about data." It's context. Without metadata, data is meaningless. You can't make decisions about information without the right context, just as you can't decide whether to put on a coat at 25 degrees without knowing whether that's Celsius or Fahrenheit.

Shannon's definition: metadata is information about data that determines the context in which data can be understood, classified, and used.

Metadata is what allows you to know what 42 means—degrees Celsius, Fahrenheit, age, speed? Without metadata, data is blind.

Start simple, don't make it complex

Many organizations make metadata unnecessarily complex. Shannon emphasizes that metadata is usually not complex, just plentiful. Start with a basic set: name, definition, source, owner, and quality requirements. And importantly: drink coffee with your colleagues.

Complexity has become an excuse for stagnation. Start with what you know. Ask your colleagues what they really need.
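Shannon's basic set could be captured in something as simple as the record below. This is an illustrative sketch; the class name, field names, and example values are assumptions, not part of any specific tool:

```python
from dataclasses import dataclass

@dataclass
class MetadataRecord:
    """The basic metadata set: name, definition, source, owner, quality requirements."""
    name: str
    definition: str
    source: str
    owner: str
    quality_requirements: list

# Hypothetical example entry
temperature = MetadataRecord(
    name="outside_temperature",
    definition="Ambient temperature at the time of measurement",
    source="weather_station_feed",
    owner="Facilities team",
    quality_requirements=["unit must be degrees Celsius", "value between -60 and 60"],
)
```

Even a spreadsheet with these five columns is a perfectly valid starting point; the structure matters more than the tooling.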

The triangle of data governance

Shannon introduces the data governance triad: metadata, data quality, and master data. These three form the core of every data-driven organization:

  • Metadata gives meaning

  • Quality creates trust

  • Master data is the reference point

As soon as you touch one of these, you automatically have to consider the other two. They are inextricably linked.

"Without metadata, there's no data quality. Without data quality, there's no trust. Without master data, there's no clarity."

The metamodel: the backbone of your organization

A powerful insight from the conversation is the importance of a metamodel. This is the structured blueprint in which you specify which types of metadata objects you manage (such as tables, definitions, reports, rules, and data delivery agreements).

This model makes it possible to:

  • Clearly define responsibilities

  • Assess maturity within your organization

  • Use systems (such as Collibra) more effectively

Without a metamodel, metadata management is like building without a foundation.

The human side: metadata starts with coffee

Data governance is a human endeavor. Shannon repeatedly emphasized the importance of informal contact and human relationships:

  • Look for people who are already taking responsibility

  • Go for coffee, listen to frustrations and needs

  • Assign role labels once you know who already fulfills them informally

The best way to start governance? Have coffee with the people who are already complaining about data quality. That's where your structure begins.

The Modeling Authority

Especially in larger organizations, a clear Modeling Authority is essential: a group or individual who oversees the quality and consistency of metadata. Ideally, this is an experienced information analyst or enterprise architect with knowledge of modeling, processes, and governance.

Conclusion: Metadata is not an IT party

Metadata is at the heart of how organizations manage their information. It's not just for the IT department; it belongs to everyone. By starting small, listening carefully, and structuring effectively, you can make a difference.


Mark van der Veen

Why data governance is not a control mechanism – and how the Metro Model helps raise awareness

For many people, data governance conjures up associations with rules, control, and restrictions. But at its core, governance is about management: providing direction, formulating ambitions, and taking responsibility. In this long read, inspired by an extensive conversation with Mark van der Veen MMIT, Vice President at DAMA NL, we explore how organizations can effectively shape data governance. We'll address leadership, stakeholder management, education, and the need for data literacy in the AI era.

🔎 Data governance is managing, not controlling

In many organizations, governance is seen as something you simply "have to arrange" or as a way to manage risks. But governance is essentially about something much more fundamental: driving change and realizing ambitions with data. This requires more than policy; it requires leadership.

Governance isn't an abstract control tool. It's a way to achieve organizational goals with data.

The Metro Model, a visual aid developed by the DAMA Netherlands community, helps clarify this nuance. The model visualizes the layers of data management: from strategy and governance to tools, processes, and culture. It makes the discussion about data quality and responsibility more accessible to the business.

🧭 Leadership is not about job titles, but about behavior

Leadership in data quality doesn't have to come from a hierarchical role. On the contrary: true leaders show initiative, feel responsible, and initiate action. Mark describes how people from working groups within DAMA, without formal roles, demonstrate leadership by developing tools such as the Data Quality Management System.

You don't have to be a manager to demonstrate leadership. Above all, you have to want something and get it started.

These leaders play a key role in raising awareness and driving collaboration around data quality. Governance therefore requires social skills: knowing who your stakeholders are, and how to involve them in your ambitions.

🤝 Stakeholder management: working together on shared responsibility

Effective data governance requires collaboration between different roles in the organization, from data owners to process managers and end users.

Stakeholder analysis is crucial here. Who has an interest in which data? What are their goals? And what agreements should we make to prevent friction? The Metro Model helps with the structural involvement of these stakeholders in governance and consultation structures.

It's all about simple questions: What do you want to do with this data? What do I do with it? And should we make an agreement about this?

🧠 No data literacy, no governance

With the rise of AI, the urgency for data literacy is also growing. Employees need to not only know that data exists, but also be able to understand, analyze, and communicate with it. This applies not only to data specialists, but to everyone in the organization.

Mark explains that DAMA NL therefore also focuses on education, for example by:

  • The development of a bachelor's degree program in Data Management & BI (in collaboration with Utrecht University of Applied Sciences)
  • Factsheets and tools for data management
  • Serious business games, such as Data Moles and Data Dynamics, to work on data quality in a playful way

"You can only use data effectively if you understand what you're reading. And you only truly understand it if you can talk about it."

📦 From inspiration to implementation

All knowledge, models, and tools from the discussion, including the full Data Quality Management System factsheet and the Metro Model, are available in the free Databewuster community. Here you will find, among other things:

  • ✅ All episodes of the podcast
  • ✅ Tools and factsheets from the DAMA NL working groups
  • ✅ Updates on new content, events and working groups

 


Ruud Kuil

Why a baseline measurement is the starting point for every successful data quality strategy

In my fifth podcast episode I spoke with Ruud Kuil, a data management expert with over 16 years of experience in data governance and master data management. Ruud has trained several professionals in the DAMA DMBoK framework and helped countless organizations gain control over their data.

In our conversation, he shares his vision on data quality, practical examples, and concrete advice for managers who want to get started with data quality improvement.

1. Think big, start small

Ruud emphasizes that awareness at management level is essential: “Senior management must understand what data is and what poor data quality means for the organization.” Still, he recommends starting small.

“The goal is world domination, but you start with five attributes, not two hundred.”

One of his practical examples: an organization that started with only 5–10 attributes within one product group. Within six months, they were able to demonstrate that improved data quality delivered measurable value. This built the necessary trust and enabled rapid scaling.

2. Make the value tangible

Without a business case, data quality often remains an abstract concept. Ruud gives a striking example:

“An incorrect email address can mean an invoice is not sent, needs to be re-examined and physically printed. For one organization, this meant €100,000 per year in waste.”

By calculating and visualizing such concrete examples, you create support among stakeholders and enthusiasm within teams.

3. Start with a baseline measurement

For managers who want to get started, Ruud's advice is clear:

  • Perform a baseline measurement. Talk to employees at all levels and ask: What keeps you awake at night? What data problems are costing you time or frustrating you?
  • Define business rules. Ruud often collects 30–50 concrete rules that he then tests against the data. This provides insight into data quality and priorities.
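A baseline measurement of this kind can be sketched very simply: express each business rule as a predicate and compute a pass rate per rule. The rules, field names, and records below are hypothetical illustrations, not Ruud's actual rule set:

```python
# Business rules as named predicates evaluated against records,
# yielding a pass rate per rule as the baseline measurement.
rules = {
    "email contains @": lambda r: "@" in r.get("email", ""),
    "customer id present": lambda r: bool(r.get("customer_id")),
}

# Illustrative sample records
records = [
    {"customer_id": "C1", "email": "a@example.com"},
    {"customer_id": "", "email": "invalid"},
]

baseline = {
    name: sum(rule(r) for r in records) / len(records)
    for name, rule in rules.items()
}
# baseline maps each rule to its pass rate (here 0.5 for both rules)
```

Running the same rules again after improvements turns the baseline into a trend line, which is exactly what makes progress demonstrable to stakeholders.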

4. Role and responsibility: no empty titles

Many organizations appoint data owners or data stewards without a clear mandate. Ruud states:

A title without clear responsibilities is pointless. Define the role, determine the time commitment, and link it to concrete results.

5. Metadata management as a foundation

Another important point of attention is metadata management. Modern data catalogs provide more insight than ever into the definitions, origins, and uses of data.

Without metadata, you're blind. Start early, otherwise you'll lose momentum and fall behind.

6. Data management = culture change

Finally, Ruud emphasizes that data management is a long-term trajectory:

"This isn't a six-month project, but a 5- to 10-year culture shift. You have to give people time to adapt to a different way of working."

 

Conclusion

Improving data quality requires a long breath, but it delivers enormous value. By starting small, making the value of data visible and transparent, and actively involving stakeholders, you build a solid foundation for the future.

 



Strategically managing data quality: how the top layer of the Metro Model guides sustainable change

From Compliance to Opportunity: The Real Value of Strategy

In many organizations, compliance is still approached as a box-ticking exercise: checking off rules, delivering audit reports, and moving on. But, as Vincent Lassauw puts it:

“Compliance is also an opportunity to improve your processes, work in a more customer-focused manner, and develop new initiatives.”

Rather than simply complying with regulations, compliance should be a catalyst for business improvement. This requires strategic thinking: not just at the board level, but throughout the entire organization.

The foundation: assessments and maturity scans

According to Vincent, every data quality strategy starts with an honest picture of where you stand. Maturity assessments and stakeholder interviews are essential here. They help answer questions like:

  • How mature is our data strategy?
  • How anchored is data quality in our processes?
  • Who are our data owners, and who experiences the data consequences?

Without those insights, you can't set targeted goals. And without goals, no strategy.

Data strategy and data quality strategy: separate or integrated?

An important nuance in the conversation is the relationship between a broader data strategy and a specific data quality strategy. Vincent is clear:

A data quality strategy isn't an optional extra. It's an integral part of your data strategy. Without reliable data, you can't make data-driven decisions.

Data quality acts as a critical success factor within a broader data strategy – for example, when operationalizing KPIs, building dashboards, or modeling AI applications.

The Data Quality Management System (DQMS)

An important instrument for anchoring is the Data Quality Management System (DQMS). Such a system formalizes the continuity of data quality assurance by:

  • Setting up measurements and KPIs for data quality
  • Defining roles such as data stewards, business owners, and auditors
  • Describing the processes for monitoring, root cause analysis, and corrective actions
  • Providing education and building support

Crucial here is the systematic approach: the DQMS not only offers control over the present, but also agility towards the future.

Continuous improvement & adaptability

One of Vincent's core messages: you're never done with data quality. Not just because your organization changes, but also because the outside world changes, from legislation to technology.

Especially in the age of AI, strategic agility is essential:

Let your strategy guide your decisions around AI—not the hype. Only then can you create and manage value.

This means you must allow room for recalibration. Strategy isn't a static document, but a living management tool.

Success factors according to Vincent

The conversation with Vincent also produces a number of implicit conditions for success:

  • Creating awareness at all levels of the organization
  • Organizing ownership – governance boards, sponsors, domain managers
  • Defining the scope clearly – wide enough for impact, narrow enough for focus
  • Building business cases – making the value of data quality visible and measurable

Conclusion: strategy as a starting point, not as a final item

Vincent demonstrates that sustainable data quality begins with a well-considered and supported strategy. It requires a balance between people, processes, and technology. And the courage to make choices – even when they're painful.

 



The Standards Framework as an Architecture for Data Quality – With Nico Kohlberg and René Wiertz

In the sixth episode of The Sonny Side of Life, I spoke with Nico Kohlberg and René Wiertz about how they use the Metro Model and the standards framework as a foundation for data quality management.

The central question: How do you use the standards framework and the Metro Model to not only improve data quality but also embed it in the organizational culture?

From fact sheet to foundation

René explained how his research into data cleansing led to the development of a fact sheet, now available to everyone through DAMA Netherlands. This fact sheet provides guidance: definitions, objectives, PDCA cycles, and the relationship to other data quality elements.

The standards framework acts as the backbone here: it provides procedural and substantive requirements (often based on ISO standards) that help to clearly define processes and roles.

The Metro model as a talking picture

Nico and René use the Metro Model as a "talking picture" to make complex data quality issues understandable for non-technical stakeholders.

“With the Metro Model, we can truly take the business with us: which processes are we affecting, which roles are crucial, and what are the benefits?” – Nico

This allowed them to start at UWV with five priority elements, including issue management, monitoring and defining roles and responsibilities.

From SharePoint to data-driven culture

Staying practical is a common thread in their approach. They use SharePoint: not high-end, but effective for visibility and support. In addition, they are building a business glossary and working on process modeling in a single framework, so that data and business processes are seamlessly integrated.

Their goal?

“Not just data-driven, but data-informed: using data with common sense.” – Nico

Culture, communication & collaboration

Technology alone is not enough. Awareness, small steps, and continuing to demonstrate value are crucial:

  • Start small: tackle the biggest data pain.
  • Show the added value: “What's in it for me?” must always be answered.
  • Work together: internally with data owners and process architects, externally via platforms such as DAMA.

Why This Matters

The story of Nico and René shows how organizations can structurally improve data quality by using the standards framework and the Metro Model not only as theory, but as an architecture for change.

Want to know more? Listen to the full episode here.

And would you like to download the materials discussed and stay informed about new episodes? You can download them for free at databewuster.com.

Or subscribe to my newsletter: https://thesonnysideoflife.kit.com.


Operational top layer of data quality in the Metro model

This is Laurens van der Drift's practical take on the Metro Model. Drawing on his experience building data management software and practical projects for government organizations and others, Laurens shares his insights on data quality within the Metro Model. The discussion is peppered with practical examples and sharp observations on how organizations can manage data more effectively.

Data quality as a strategic added value

Laurens argues that data quality is more than just meeting compliance requirements. It's about setting internal requirements for data that align with the organization's mission and processes. He distinguishes three dimensions of data quality:

  • Source quality: Is the data recorded correctly?
  • Transaction/transport quality: How does data move through systems (data lineage)?
  • Contextual quality: Does the data match the user's goal?

From data lineage to business rules

A key theme is data lineage: understanding the path data takes from source to use. Laurens explains how errors arise along the way and advocates for a sharp eye on critical data objects. From these objects, you then build business rules, preferably based on automated analysis such as machine learning.

Example: An analysis of youth care data revealed a 44-year-old "youth care client." This was an exception, caused by the fact that unborn children were not registered in the Personal Records Database (BRP) and therefore their data was stored with the mother. Such insights require reflection on both technology and policy.
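A plausibility rule like the one that surfaced the 44-year-old "youth care client" could be sketched as follows. The function, field names, and age threshold are illustrative assumptions, not the actual analysis Laurens ran:

```python
from datetime import date

def flag_implausible_clients(clients, max_age=23, today=date(2025, 1, 1)):
    """Flag youth-care clients whose registered age exceeds the expected
    maximum, so each hit can be reviewed as a data error or, as in the
    unborn-child case, a policy exception."""
    flagged = []
    for c in clients:
        # approximate age in whole years
        age = (today - c["birth_date"]).days // 365
        if age > max_age:
            flagged.append((c["client_id"], age))
    return flagged
```

The point of such a rule is not to auto-correct the data, but to produce a short review list: some hits are genuine errors, while others (like the mother registered for her unborn child) reveal a policy or registration issue worth discussing.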

Impact in euros: why it matters

Laurens emphasizes that data quality has a direct financial impact. Think of double-paid invoices, incorrect subsidies, or misclassified transactions. And while some effects are difficult to quantify in euros, the costs of bad data are undeniable. The unnecessary rework for BI teams, which can consume as much as 70% of their time, also constitutes a strong business case for structural improvement.

Governance and ownership

Without internal knowledge and ownership, improvements remain superficial. Laurens is critical of outsourcing data and IT expertise. Organizations, especially government bodies, should appoint their own people such as data stewards and CDOs to manage data quality and ensure continuity.

Start small, build up

Finally, Laurens van der Drift advocates a bottom-up approach. Start with a specific pain point, map critical data objects, create simple business rules, and measure the results. Demonstrate the improvement and gradually expand. Data quality should become a habit, just like exercise!

Stay informed!

Curious about more practical insights into data quality and the Metro Model? Sign up for the newsletter via The Sonny Side of Life. We'll share the materials discussed there and you'll receive updates on new episodes!

Want to implement the Metro Model within your organization? Contact us.


Operational underlay of data quality in the Metro model

You can watch this second episode about applying the operational layer from the Metro model on YouTube, or listen to it on Spotify and Apple Podcasts.

Understanding Data Quality from Within

Aris Prins, with over 20 years of experience in data quality, shares practical insights: from data analysis and quality control to data cleansing and managing complex customer data. This article is a must-read for anyone who wants to understand and improve data quality in practice.

Why the operational layer?

The operational layer forms the backbone of the Metro Model. This is where data quality becomes tangible: analyses are performed, errors are detected, data is cleaned, and processes are improved. Aris illustrates this using recognizable situations, such as faulty invoicing due to incorrect system integrations or customers who are incorrectly registered multiple times.

From analysis to action

An important starting point in this layer is the analysis phase. Aris explains how to identify data quality issues based on complaints, patterns, and deviations in datasets. Sometimes these issues are known; sometimes you only discover them after a deeper analysis. AI tools can help predict patterns and identify anomalies, for example, when a customer's birthday appears to be January 1, 1900.
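Detecting such anomalies doesn't always require AI; known placeholder values can be caught with a simple check. The sentinel dates and field names below are assumptions (January 1, 1970 is a common epoch default, added here for illustration):

```python
from datetime import date

# Known placeholder values that usually mean "real date never captured"
PLACEHOLDER_DATES = {date(1900, 1, 1), date(1970, 1, 1)}

def suspicious_birthdays(customers):
    """Return the IDs of customers whose birth date is a known
    placeholder value, flagging them for follow-up."""
    return [c["id"] for c in customers if c["birth_date"] in PLACEHOLDER_DATES]
```

Simple deterministic checks like this catch the bulk of obvious anomalies cheaply; pattern-based or AI-assisted detection then adds value for the subtler cases.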

Rules and stakeholders

A recurring theme is the need for clear rules and collaboration between stakeholders such as data stewards, data owners, and IT. Who determines what is correct? How do we ensure this remains the case? Clear definitions and responsibilities allow for effective monitoring and improvement of data quality.

Cleanup, migration and monitoring

Data cleanup often proves more complex than anticipated. Sometimes, it's technically impossible to adapt systems, for example, due to vendor dependency. Workarounds are then sought or temporary manual solutions implemented. Migrations also carry risks, such as incorrect birth dates or duplicate customer registrations. Monitoring helps with this: how is the quality developing, and how many errors are being resolved or are new ones being introduced?

AI, automation and value

High-quality data is essential for successful AI applications and automation. Aris emphasizes that poor data can lead to incorrect predictions and inefficient processes. By prioritizing potential damages or fines, organizations can better target their investments in data quality improvement.

A practical start for enthusiasts

For those who want to get started with data quality, Aris Prins advises: start small, choose a specific pain point, analyze the data, and monitor progress. Use smart tools to recognize patterns, but above all, don't forget the importance of human insight and collaboration.

Stay informed!

Curious about more practical insights and tools related to data quality? Subscribe to our newsletter via The Sonny Side of Life. We'll share the materials discussed there and keep you updated on future episodes!

Want to implement the Metro Model within your organization? Contact us.


Improve your data quality yourself with the Metro Model

From the podcast 'The Sonny Side of Life', episode 1: Improve your data quality yourself with the Metro Model.

You can watch the first episode about improving your data quality yourself with the Metro Model on YouTube, or listen to it on Spotify and Apple Podcasts.

Improving Data Quality with the Metro Model

Data quality is becoming increasingly crucial for organizations that want to be data-driven. But how do you ensure that data is not only accurate and complete, but also used effectively? In an inspiring conversation with Marco and Peter, we discuss the Metro Model: a framework that helps structure and improve data quality management. This model, based on ISO 9001, provides organizations with guidance and helps them improve data quality at the strategic, tactical, and operational levels.

DAMA Netherlands and the Data Quality Working Group

Marco Heij and Peter van Nederpelt are active in DAMA NL, part of the international DAMA network. This network facilitates knowledge sharing around data management. The data quality working group focuses specifically on developing standards and tools, such as the Metro model, so that organizations don't have to reinvent the wheel every time.

What is the Metro Model?

The Metro Model visualizes data quality management as a metro network with thirty "stations," or elements, that together form a complete management system. The four layers of the model are:

  • Objectives: The core of the model is that data quality meets requirements and that users are satisfied.
  • Strategic level: This is where the main points are set out, such as drawing up a data quality policy and management involvement.
  • Tactical level: This level focuses on governance, stakeholder management and translating strategy into concrete measures.
  • Operational level: This concerns the daily implementation, such as data cleansing and monitoring.

The model is designed to offer organizations flexibility: you can join at any desired level and determine which elements are most important at that moment.

Why is ownership crucial?

One of the biggest challenges in data quality is ownership. Who is responsible for data quality? According to Marco and Peter, it's essential that this is properly established at all levels. This is integrated into the Metro Model by assigning a clear responsibility for each element. This prevents data quality from remaining a floating issue without concrete action points.

Your organization and the Metro Model

By using the model as a starting point, various departments within your organization can work on data quality in a uniform manner. This leads to better coordination between teams and a more effective approach to data governance.

The next step: From Data Quality to Data Utilization

Data quality is not a goal in itself, but a means to actually utilize data. This is the next step the data quality working group is focusing on. How do you ensure that high-quality data is used for data-driven work, AI, and dashboards? This is a topic that will be explored further in future discussions.

 

Stay informed!

Want to learn more about the Metro Model and how you can improve data quality in your organization? Subscribe to our newsletter via The Sonny Side of Life. We'll share valuable materials like the Metro Model for free, and you'll stay up-to-date on future episodes!