Enterprise Data Governance
eBook - ePub

Enterprise Data Governance

Reference and Master Data Management Semantic Modeling

Pierre Bonnet

About This Book

In an increasingly digital economy, mastering the quality of data is vital, yet in most organizations it remains a considerable task. The need for better governance and the reinforcement of international rules and regulatory or oversight frameworks (Sarbanes-Oxley, Basel II, Solvency II, IAS-IFRS, etc.) impose on enterprises greater transparency and better traceability of their data.

All the stakeholders in a company have a role to play and great benefit to derive from these goals, but they will invariably turn to their IT department in search of answers. However, most IT systems developed within businesses are overly complex, badly adapted and in many cases obsolete; these systems have often become a source of data or process fragility for the business. It is in this context that the management of reference and master data, or Master Data Management (MDM), together with semantic modeling, can intervene to put data management back on a forward-looking and sustainable footing.

This book shows how company executives and IT managers can take up these new challenges, and reap the advantages of reference and master data management, by answering questions such as: Which data governance functions are available? How can IT be better aligned with business regulations? What is the return on investment? How can we assess intangible IT assets and data? What are the principles of semantic modeling? What is the MDM technical architecture? Armed with these answers, they will be better able to deliver on their responsibilities and to position their organizations for growth and robust data management and integrity in the future.


Information

Publisher
Wiley-ISTE
Year
2013
ISBN
9781118622537

PART ONE

The MDM Approach

Chapter 1

A Company and its Data

This first chapter is dedicated to the role of data governance within companies and will enable us to define it clearly. We will also see how the most commonly used data repository tools, such as Customer Data Integration, Product Information Management, and structure and organization directories, fit into the context of the MDM approach.

1.1. The importance of data and rules repositories

Master data management is not solely a matter for the IT community. It is not only about rationalizing reference and master data, increasing their quality, or even laying the groundwork for a possible transformation of IT systems. These are important objectives, but they are insufficient to place MDM in its proper context.
To understand the true impact of MDM, it is important to consider the asset value of an IT system. Indeed, this is a crucial factor in the strategic and financial evaluation of the MDM approach. To get a sense of the outline of this assessment and its relationship with data management, here is a real story taken from a conversation with the head of IT systems of a large industrial company. This CIO was explaining to me that his IT systems were performing well: all the production KPIs were good, with a good capacity to respond to the needs of the business thanks to centralized operations and fixed-price outsourcing based on certified quality procedures. In the mind of this CIO, the contribution of IT to the quality of the system was beyond doubt, and he was convinced of its success. Very impressed by this viewpoint, I wanted to find out more:
“How can you measure this contribution?”, I asked.
“I put an IT management control system in place that measures the global costs for each project. At the start of the project we sign a contract with our users that covers the commitments in terms of planning and return on investment”, answered the CIO.
“You are the director of IS and IT systems. Do you not find it odd that your management control, i.e. your P&L (Profit and Loss) does not correspond with your function?”
“What do you mean?” he asked, surprised.
“Instead of only having an IT management control, you should have an IS management control, in agreement with your role as head of IS and IT systems. If you measured the costs, the planning, and the rate of IT return you would not be as convinced as to your contribution to the IS and IT system. Truth be told, you have no indicators to prove this.”
“I don't understand. What should I be measuring?”
“Your assets! What are they?”, I insisted.
“We have computer hardware, software licenses, subcontracts and operating contracts. We already take those into account in our management control.”
“And what about your software and your databases? They represent a fountain of knowledge of the company's processes. These pieces of software embody the Information System, especially as your IT systems support all the company's processes. There are, without a doubt, no decisions or operations without the support of computers as a tool. Do you realize the importance of this and do you know how to measure it?”
“No. How could we measure it? It's impossible.”
“If you cannot evaluate it, then you will not be able to measure the impact of IT on the Information System. You are not in a position to do it because the capital asset value of the Information System, the one that you should find out, is trapped in IT systems that are not readily accessible to the business, in programming languages and databases only understood by your IT experts. At best, only a part of the IS assets are known via reporting tools, business intelligence tools and workflows for certain business processes.”
“And how could it be otherwise?”
“By putting together another IT system. You need to get the IS assets out of your software. Among these assets, it is important to first consider the reference and master data. You need a solution enabling business users to take control of their reference and master data, via a unified tool applied across the whole of your IT system, and with governance functions allowing the business users to enhance this data, depending on the version and use contexts (country, organization, etc.), querying and auditing them, and rating their reliability. This concerns very large domains of data, ranging from product and service descriptions to business regulatory configurations. Its financial value is estimated using tools such as benchmarking. For example, at the time of a takeover by another company, your company would be in a position to compare its data repository for product configuration with that of the targeted company. The same comparison is possible with other data repositories such as business regulations, accounting and financial structures, etc. The value of the data repositories is thus concrete. The method to obtain such a goal exists and the MDM approach is the first step.”
“That's a radical break!”
“Yes, similar to the passage from the light bulb to the transistor! After having taken into account the asset value of the reference and master data, you have a similar approach to business rules with a BRMS (Business Rules Management System) and with BPM (Business Process Management). Your IS and IT management control can therefore be enhanced with an inventory of assets in terms of reference and master data, business rules and processes. Thus it changes into management accounting of the whole IS and IT system. It is from then on, and not before, that you can evaluate the relevance of the IT in terms of its contribution to the IS and its alignment with the business. The lower your asset inventory is, the more blurred your vision of the situation is. With no assets to evaluate, you are running blind!”
Before reaching this stage of maturity, at which you will be able to incorporate business repositories into management control, you must first sort out the data repositories, in other words establish an MDM system. To do so, you have to return to the basics of IT.

1.2. Back to basics

These past few years, innovation in IT has not left much room for database reconstruction.
Most initiatives in object-oriented development and Java, in BPM and even in Service Oriented Architecture (SOA) were carried out on top of legacy databases without restructuring them. Companies had neither the means nor the objective to modify these databases. The IT industry has largely maintained the idea that new technologies would be capable of offering a return on investment without modifying the core of existing systems. Now, companies are realizing that these new technologies have been oversold. It is now obvious, though it was not easy to convince people, that if data is no longer reliable, then the software that uses it is no longer reliable either, whether object oriented, BPM, SOA or otherwise.
During the past few years, there has been a steady decline in the quality of data, and data modeling expertise is becoming increasingly rare. It is more common to hire a Java engineer than a UML data modeler. The decline is even visible in computer science training courses, surely a sign that it runs deep. And yet, in the early days of IT, a formal approach to data existed. Procedures and methodology were set in stone with:
– conceptual data modeling to present business information in a format that can be understood by people who are not technical specialists;
– logical data modeling for the translation of conceptual modeling to data structures that respect IT needs and feasibility;
– physical data modeling for the translation of logical modeling into software, i.e. the “physical database schema”.
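As an illustration of these three levels, here is a minimal sketch for a hypothetical Customer entity; the entity, attribute and table names below are illustrative assumptions, not examples taken from the book:

```python
from dataclasses import dataclass

# Conceptual level: a business entity described in business terms,
# understandable without any reference to storage technology.
@dataclass
class Customer:
    customer_id: str   # unique business identifier
    name: str
    country: str       # ISO country code

# Logical level: the same entity translated into a relational structure
# (table, columns, keys) that respects IT needs and feasibility.
logical_model = {
    "table": "CUSTOMER",
    "columns": {"CUSTOMER_ID": "key", "NAME": "attribute", "COUNTRY": "attribute"},
    "primary_key": ["CUSTOMER_ID"],
}

# Physical level: the logical model rendered as an actual database schema.
physical_schema = """
CREATE TABLE CUSTOMER (
    CUSTOMER_ID VARCHAR(36)  PRIMARY KEY,
    NAME        VARCHAR(255) NOT NULL,
    COUNTRY     CHAR(2)      NOT NULL
);
"""
```

Each level refines the previous one: the conceptual model speaks the language of the business, while only the physical schema is tied to a particular database engine.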
IT experts concerned themselves with establishing models whose life expectancy exceeded that of the processing built on them. Relational algebra, and its translation into normal forms, contributed to the quality of data models. This strict modeling gave birth to databases placed at the heart of IT systems, providing greater stability, and was widely exploited by companies. The organization of an IT department left room for data administration. This administration was not only concerned with technical models; it also covered business models, supported by semantic data dictionaries. Data administrators, a recognized role within an IT department, brought together the knowledge of these models (conceptual, logical) in order to pool modeling resources and encourage re-use. These administrators concerned themselves with all types of data, from reference and master data (i.e. data shared by application systems) to transactional data.
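To make the point about normal forms concrete, here is a minimal, hypothetical sketch of moving a denormalized order table toward third normal form; the table and attribute names are illustrative assumptions:

```python
# Denormalized: customer details repeat on every order row, so a change of
# city must be applied everywhere (a classic update anomaly).
orders_denormalized = [
    {"order_id": 1, "customer_id": "C1", "customer_name": "Acme", "customer_city": "Lyon"},
    {"order_id": 2, "customer_id": "C1", "customer_name": "Acme", "customer_city": "Lyon"},
]

# Third normal form: non-key attributes depend only on their entity's key.
# Customer facts are stored once and referenced by key from the orders.
customers = {"C1": {"name": "Acme", "city": "Lyon"}}
orders = [
    {"order_id": 1, "customer_id": "C1"},
    {"order_id": 2, "customer_id": "C1"},
]

def order_with_customer(order):
    """Recombine the normalized tables (a join) when reading an order."""
    return {**order, **customers[order["customer_id"]]}
```

Updating the customer's city now touches a single record, which is precisely the kind of stability the early modeling discipline was after.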
Today, few organizations have maintained data administration at the business model level. For the most part, it has been reduced to technical database administration, necessary but insufficient to guarantee the durability of business models.
The addition of tactical databases around the core of the IT system, together with the silo approach and software packages, has led to a fragmented architecture and poorly documented data. Contemporary IT has not been able to fully preserve its database heritage.
In addition, new requirements such as agility and traceability of information have emerged that did not exist with the same degree of importance at the time. It is important to understand these past differences in order to act more appropriately now.

1.2.1. Past differences

How do we reclaim data when IT heritage stems from many decades of hard-coded software developments and the use of software packages?
Taking a fresh approach to the data model across the whole of a company, valid for all the functional and technical IS silos as well as for data exchanges with partners, while avoiding the big-bang effect, is not achievable in a single step.
In contrast with the early periods of IT (1960–1980), we are now facing a legacy that needs to be reformed. This is a situation that IT experts do not like, as they are keen to keep their legacy software. It is harder to take away a piece of software than to add one.
Furthermore, the IT industry is facing a regeneration that is not only generational but also technical:
– IT professionals who had the first experience of database modeling across the whole of an IS during the 1970s, are now, increasingly, retiring. And with them, the know-how of data modeling is disappearing;
– during the past 20 years, techniques have evolved dramatically, and mastering them is essential in order to improve the responsiveness of a system. It is especially important to benefit from object-oriented approaches and from standards such as XML Schema and Model Driven Architecture. The feedback and lessons learned needed to carry out such a change are missing, even though the methods and tools are ready to be exploited.
It is indeed a transformation, and not simply the addition of further layers of software to those already in place, which would only complicate the situation. The capacity of existing systems to absorb an extra layer of complexity is reaching a limit that companies should not exceed. The risk of an overdose of complexity is the loss of control of an Information System through its data.
The first step to improving the situation without starting all over again is to regain control of data, without imposing modifications on the in-house software or software packages already in place. This concerns reference and master data, i.e. data shared and initialized before use by transactional systems: for instance, product configurations, descriptions of structures and organizations, or financial classifications. This is where the MDM approach comes in.
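As a rough sketch of the idea, and not of any particular MDM product, the following hypothetical Python class shows reference data that is versioned, qualified by a use context (here, a country), published by business users and then read by transactional systems instead of being duplicated inside them:

```python
class MasterDataRepository:
    """Toy master-data repository: versioned records per use context."""

    def __init__(self):
        # (data_key, context) -> list of versions, most recent last
        self._records = {}

    def publish(self, key, context, value):
        """Business users publish a new version of a reference record."""
        self._records.setdefault((key, context), []).append(value)

    def lookup(self, key, context):
        """Transactional systems read the current version for their context."""
        versions = self._records.get((key, context))
        return versions[-1] if versions else None

    def history(self, key, context):
        """Audit trail: every version ever published for this record."""
        return list(self._records.get((key, context), []))


repo = MasterDataRepository()
repo.publish("vat_rate", "FR", 0.196)
repo.publish("vat_rate", "FR", 0.20)   # new version after a regulatory change
current = repo.lookup("vat_rate", "FR")
```

Keeping the full version history is what makes the auditability and traceability demanded by regulations such as Sarbanes-Oxley possible: the repository can always answer which value was in force, for which context, and when it changed.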
The improvement in management of this data, be it reference or master data, is also necessary in order to meet new business regulations such as Sarbanes Oxley, Basel II, Solvency II, etc. These regulations require a very high level of auditability and traceability in terms of information use.
This reactivation of data modeling comes with the incorporation of new business requirements and technological innovations:
– reference and master data governance should not only be the responsibility of IT. Business users must be allowed to govern data, in terms of manageme...
