MDM can tame and monetize IoT data explosion

In 1965, Wilf Hey is said to have coined the catchphrase “garbage in, garbage out” (GIGO) to capture the view that flawed or nonsensical input data produces nonsense output, or “garbage”.

The phrase is even more noteworthy today in the era of big data, small data and analytics. As one finance manager participating in a CXOCIETY-hosted roundtable recently attested: “We have so much data coming in, it is hard work sifting through it to glean any insight, let alone figure out what is real and what isn’t.”

His predicament stems from the realisation that, as a business, they have multiple sources of data: warehousing and inventory control, finance, sales and marketing, supply chain, product development, and so on. Yet each department sees the company through the data it holds and regards its own view as the correct version of the truth.

So as the company moves to become more data-driven, how does it reconcile the different sources [and interpretations] of data and arrive at the one true version of the truth?

What Master means

Gartner defines Master Data Management (MDM) as a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise’s official shared master data assets.

Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise including customers, prospects, citizens, suppliers, sites, hierarchies and chart of accounts.

But the description is laced with technical jargon that businesses will instantly ignore if left untethered to a business outcome.


Pierre Bonnet, vice president of Product and Engineering at TIBCO Software, believes that MDM should be a business-led programme that essentially acts as a clearinghouse to guarantee that the most important data is clean and of the highest quality. An essential attribute of this function is the ability to share that data across the organisation.

Clearinghouse

Bonnet likens MDM to a clearinghouse for data.

As companies deal with an increasing number of data sources and fragmented information from social media, mobile devices, and the cloud, MDM allows organisations to control and manage key master data entities scattered across different applications and databases. This improves visibility and control over business activities and optimises operations such as the supply chain, inventory management, forecasting, and customer service.

“In a fast-growing business market with high expectations of deep digitalisation, a company without such a “data clearinghouse” could lose control of its data quality and data governance, leading to the delivery of poor quality business processes to its market. Such an MDM system is the spine of the deep-digitalisation process a company must follow to reinforce its market sustainability,” he explained.

When consolidated and matched accurately, data can reveal opportunities, risks, and areas where the business can be improved.
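To make consolidation and matching concrete, here is a minimal, purely illustrative Python sketch. The field names, source systems, and the “latest non-empty value wins” survivorship rule are assumptions made for the example, not a description of any particular MDM product:

```python
# Illustrative master-data consolidation: records for the same customer arrive
# from two systems with slightly different values, and a simple survivorship
# rule picks the surviving value for each field of the "golden record".

def normalise_email(record):
    """Match key: a lowercased, trimmed e-mail address."""
    return record.get("email", "").strip().lower()

def merge(records):
    """Build a golden record: later, non-empty values win (simple survivorship)."""
    golden = {}
    for record in sorted(records, key=lambda r: r["updated_at"]):
        for field, value in record.items():
            if value:                      # later, non-empty values overwrite earlier ones
                golden[field] = value
    return golden

# Hypothetical source records from two silos
crm = {"email": "Ana@Example.com ", "name": "Ana Lim", "phone": "", "updated_at": "2019-01-10"}
erp = {"email": "ana@example.com", "name": "Ana G. Lim", "phone": "+65 8123 4567", "updated_at": "2019-03-02"}

# Group candidate duplicates by the match key, then merge each group
by_key = {}
for source_record in (crm, erp):
    by_key.setdefault(normalise_email(source_record), []).append(source_record)

golden_records = [merge(group) for group in by_key.values()]
print(golden_records)
# [{'email': 'ana@example.com', 'name': 'Ana G. Lim', 'updated_at': '2019-03-02', 'phone': '+65 8123 4567'}]
```

Real MDM hubs use far richer matching (fuzzy rules, survivorship policies per attribute, steward review), but the principle is the same: one agreed record where several conflicting ones existed before.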

Got MDM, will DX

While often not discussed, MDM may play an important role in organisations undertaking a digital transformation (DX) initiative. Why? At the core of many DX journeys is data – arguably the least understood, much abused and overhyped, and still relatively untapped for many organisations.

Can a business successfully achieve transformation without a clearinghouse for data?

Bonnet cautions that there are two levels to consider when discussing digitalisation.

The first is the external-facing part of digitalisation, as represented by APIs and websites. This part has a limited impact on the organisation’s internal workings.

The second level, called deep-digitalisation, is where a company rethinks its internal IT systems to create a portfolio of autonomous, reusable, coarse-grained components that can be exposed to the market via smarter APIs.
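As a rough sketch of what such a coarse-grained component might look like (Flask and the /customers endpoint here are illustrative assumptions, not anything named in the article), a customer-master service hides the underlying silos behind one interface:

```python
# Hypothetical sketch of a coarse-grained "customer master" component exposed
# over HTTP. Callers see one consolidated interface, not the internal silos.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# In a real deep-digitalisation effort this data would come from the MDM hub;
# here it is an in-memory stand-in for the example.
CUSTOMER_MASTER = {
    "C-001": {"name": "Ana G. Lim", "segment": "Retail", "country": "SG"},
}

@app.route("/customers/<customer_id>")
def get_customer(customer_id):
    """Return the governed, consolidated view of one customer."""
    customer = CUSTOMER_MASTER.get(customer_id)
    if customer is None:
        abort(404)
    return jsonify(customer)

if __name__ == "__main__":
    app.run(port=8080)   # e.g. GET http://localhost:8080/customers/C-001
```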

Bonnet explained that to make this deep-digitalisation happen at the right scale and with the right quality, the governance of the data must cover all the layers of the information system, rather than merely surfacing a few important data elements in a rough manner.

“To get this agility and depth of data governance, a high-end MDM system is mandatory. This system will be connected into all the information silos and the layers within those silos, as well as with new systems. It is not a surface MDM system, but a deep MDM system with a strong data storage layer, rich governance features, and a very fast, agile delivery process for managing changes,” he elaborated.

Secret to making it work

To achieve success at large scale, Bonnet says a company’s MDM system must allow for an agile delivery process.

“It is almost impossible to be sure about the data structures, semantics, and governance processes a company needs at the start, and predictions about the future are very hard, even impossible, to establish,” he laments.

This inability to know the future is the key reason an agility mindset is vital.

“If the MDM system is not agile enough, then all the existing systems running in a company could be slowed in their ability to change. There is also the potential for poor integration with the MDM system, which will not improve data quality and may even have the opposite effect,” he continues.

He suggests checking two points. First, the MDM system must be agile, without a rigid engineering process that could delay the delivery of the existing systems.

This is what is called “model-driven MDM”, in which the data semantics automatically drive a large part of the expected delivery.

The second is the need for a methodology framework: setting up a business glossary, modelling the data per domain at the semantic level, designing data policies and their workflows, and appointing the right roles for data governance.
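To make these two points a little more tangible, here is a hedged sketch. The entity, validation rules, glossary entry, workflow steps, and role assignments are all invented for illustration, not taken from any specific MDM product:

```python
import re

# --- Model-driven part: the supplier model is declared once, and a generic
#     validator is derived from it instead of being hand-coded per system.
SUPPLIER_MODEL = {
    "supplier_id": {"required": True,  "pattern": r"^SUP-\d{5}$"},
    "name":        {"required": True},
    "country":     {"required": True,  "pattern": r"^[A-Z]{2}$"},
}

def validate(record, model):
    """Return violations derived purely from the declared model."""
    errors = []
    for field, rules in model.items():
        value = record.get(field)
        if rules.get("required") and not value:
            errors.append(f"{field}: missing")
        elif value and "pattern" in rules and not re.match(rules["pattern"], str(value)):
            errors.append(f"{field}: '{value}' is not valid")
    return errors

print(validate({"supplier_id": "SUP-12345", "name": "Acme", "country": "sg"}, SUPPLIER_MODEL))
# ["country: 'sg' is not valid"]

# --- Methodology part: a glossary entry, a policy workflow, and governance
#     roles for the same domain. Titles and steps are illustrative only.
GLOSSARY = {"Supplier": "A party from which the enterprise buys goods or services."}
POLICY_WORKFLOW = ["draft", "data-steward review", "approved", "published"]
GOVERNANCE_ROLES = {"data owner": "Head of Procurement", "data steward": "Procurement operations lead"}
```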

“Today, after a couple of years of implementing such an MDM system, it is clear that the ‘model-driven’ approach is mature when applied to the most important data, and the methodology framework rests on rich lessons learnt and best practices ready to be shared,” he concludes.

Focus on what is important

Bonnet warns not to get hung up on sexy terms like AI, big data, and data lakes. These are just tools. The real challenge is making sure the data is clean.

“Often, big data and data lake projects rely too much on some ‘magic’ algorithms that should compute the vision for improving the future. But the business prediction will not be any good if the underlying data is wrong,” he pointed out.

He suggests subjecting the data to clear governance. This is arguably where MDM shines.
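A minimal sketch of the kind of quality gate this governance implies might look as follows; the rules, fields, and thresholds are assumptions made for the example:

```python
# Hypothetical quality gate applied before records reach analytics:
# rows that fail basic governance rules are quarantined, not analysed.
RULES = {
    "customer_id": lambda v: bool(v),                              # must be present
    "country":     lambda v: isinstance(v, str) and len(v) == 2,   # ISO-style code
    "revenue":     lambda v: isinstance(v, (int, float)) and v >= 0,
}

def quality_gate(rows):
    """Split rows into clean and quarantined based on the declared rules."""
    clean, quarantined = [], []
    for row in rows:
        failed = [field for field, check in RULES.items() if not check(row.get(field))]
        if failed:
            quarantined.append((row, failed))   # keep the reasons for the data steward
        else:
            clean.append(row)
    return clean, quarantined

clean, quarantined = quality_gate([
    {"customer_id": "C-001", "country": "SG", "revenue": 1200.0},
    {"customer_id": "",      "country": "Singapore", "revenue": -50},
])
print(len(clean), "clean row(s),", len(quarantined), "quarantined row(s)")
# 1 clean row(s), 1 quarantined row(s)
```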

“The MDM system is the centrepiece of the whole enterprise data governance solution. Once the data is aligned with the quality assurance process, then a company can start getting good results with data analytics and AI,” said Bonnet.

“By closing the loop between the operational systems and the results of data analytics, the MDM is used as a bi-directional bridge, conveying good data from the operational systems to data analytics and carrying the results of data analytics back to the operational systems. The two worlds are then connected under the governance enforced by the MDM system,” he concluded.
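Read in engineering terms, that closed loop amounts to something like the sketch below; the functions and system names are hypothetical and are not TIBCO EBX APIs:

```python
# Hypothetical sketch of the closed loop Bonnet describes: operational data
# flows through the MDM hub into analytics, and corrections discovered by
# analytics flow back through the hub to the operational systems.
def publish_to_analytics(hub):
    """Forward governed master data from the hub to the analytics layer."""
    return list(hub.values())

def apply_corrections(hub, corrections):
    """Feed analytics findings back into the hub, which syncs downstream."""
    for customer_id, fields in corrections.items():
        hub.setdefault(customer_id, {}).update(fields)

mdm_hub = {"C-001": {"segment": "Retail", "churn_risk": None}}

analytics_input = publish_to_analytics(mdm_hub)     # hub -> analytics
corrections = {"C-001": {"churn_risk": "high"}}      # a result computed by analytics
apply_corrections(mdm_hub, corrections)              # analytics -> hub -> operations

print(mdm_hub["C-001"])   # {'segment': 'Retail', 'churn_risk': 'high'}
```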

Tying it to IoT

The Forrester Wave: Master Data Management, Q1 2019 report notes that MDM is moving into its third generation, with the Internet of Things (IoT) and its massive stores of data driving the development of systems of automation and systems of design, and with them new MDM usage scenarios to support co-design and the exchange of information on customers, products, and assets within ecosystems.

Industries like consumer goods and retail will likely find MDM a centrepiece of flexibility. Forrester says “MDM within ecosystems, connecting to product information management (PIM) systems, is becoming a key success factor for such strategic MDM implementations.”

Minimising risks

Deploying technology is often a complicated solution to a complex problem, with risks escalating as more departments are added to the mix. Data, which cuts across everyone within the company, is no exception.

Bonnet is not perturbed. He noted that solutions such as TIBCO EBX™ can be used to comply quickly with evolving data quality, management, and governance requirements, while automating currently manual business processes around the management of master data.

TIBCO EBX™ comes with out-of-the-box functionality specifically designed for multi-party, multi-tier collaboration in the creation, management, and synchronisation of master data. Implementation is also faster, allowing businesses to achieve business value and return on investment sooner. Solutions also need to be scalable to meet future needs.

First published on FutureCIO
