Legacy systems, customizations, and on-premises servers, oh my!
The goals of master data management (MDM) are commendable: improve data quality, reduce inaccuracies, and increase the overall health of corporate data. Who doesn’t want clean, accurate data on which to base company decisions? Establishing best practices and principles that help departments set up a “single source of truth” (more on that concept later), so that insights are available no matter what data is needed, was an important undertaking.
But one of MDM’s main problems stems from the era in which it was conceived. Take a trip back with me:
Imagine it’s the early 2000s. You’ve been tasked with increasing your company’s data quality. As you look at the physical servers in the cold room next door, you think the job isn’t so bad. After all, everything lives right there on-premises, and it’s all your company’s own systems. Each one has been customized to fit your organization’s exact needs, and there are only a handful of them.
Now jump forward to today, and it’s an entirely different story. Your apps now include both the really old ones you still can’t get rid of in that same cold room and the ones that live in the cloud. Every system uses a different schema, duplicate information is strewn across the tech stack, and worse still: you know there’s more out there, hidden.
Out with the old, in with the new
The good news is that as businesses and technology have evolved, the market has begun to standardize. Certain CRMs and marketing automation platforms (MAPs) rule their respective categories, and because of that prevalence, standardization is becoming the norm.
In the days of yore, custom data models were all that existed: companies spent hours mapping every data point in their databases, documenting what type of information they were storing, how it related to other data, and how it moved through business processes. Today, the large majority of data models have already been dictated. Companies simply adopt the models used by the popular platforms they run on.
Better still, now that companies are starting from these standard models, other apps that work with the industry giants have also elected to use those models. The byproduct? The number of data models in the market has decreased significantly.
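To make that concrete, here’s a minimal sketch of what a standardized record shape might look like. The field names are illustrative assumptions, not any specific vendor’s schema:

```typescript
// A hypothetical standardized "Contact" shape, loosely modeled on the
// kind of record most major CRMs and MAPs expose. The fields are
// illustrative assumptions, not any vendor's actual schema.
interface Contact {
  id: string;            // globally unique identifier
  email: string;         // the de facto join key across go-to-market tools
  firstName: string;
  lastName: string;
  company?: string;      // optional: not every system captures firmographics
  lifecycleStage?: "lead" | "mql" | "sql" | "customer";
  updatedAt: Date;       // last-modified timestamp; handy for syncing later
}
```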
The boon companies get from this reduction in complexity is two-fold:
- You no longer have to always create data models from scratch.
- Using standardized schemas lets you bring more data into more places, more easily (see the sketch after this list).
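That second point has some simple arithmetic behind it: with M apps and point-to-point mappings, you need on the order of M × (M − 1) custom mappings, while a shared model needs just M adapters, one per app. A rough sketch, reusing the hypothetical Contact shape from above (the CRM payload’s field names are also invented for illustration):

```typescript
// One adapter per app into the shared Contact shape, instead of a
// bespoke mapping between every pair of apps. The CRM payload's field
// names here are hypothetical.
function fromCrm(crm: {
  Id: string;
  Email: string;
  FirstName: string;
  LastName: string;
  LastModifiedDate: string;
}): Contact {
  return {
    id: crm.Id,
    email: crm.Email.trim().toLowerCase(), // normalize once, reuse everywhere
    firstName: crm.FirstName,
    lastName: crm.LastName,
    updatedAt: new Date(crm.LastModifiedDate),
  };
}
```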
Modern systems, legacy problems
Now, beyond the inherent difficulty of implementing MDM at all (change management alone is hard), MDM is also being undermined by this paradigm shift. MDM wants to come in, create your master data, and push it all into one single repository. But, as we brought up last week, creating that “single source of truth” comes with its own challenges:
Issue #1: It takes how long?
Anyone who’s been through a project big enough to require “change management” knows those projects take forever. Unfortunately, MDM is no different, and with IT departments as overloaded as they are today, MDM projects are known to take years and carry a high failure rate. So why do companies keep beating their heads against the proverbial wall?
Issue #2: Where’s my data?
Different departments prefer working out of their own systems, and while MDM excels at pulling data together, it struggles to distribute that data back out to users’ end systems. This forces Operations to manually push data out of the single source of truth (SSOT) into the CRM for Sales, the MAP for Marketing, and so on. Yet most operations professionals don’t have the tools to do this efficiently, so they spend the majority of their time manually managing data instead of executing on strategic priorities.
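In practice, that manual distribution often looks like a hand-rolled script per downstream app. A rough sketch of the pattern, where pushToCrm and pushToMap are hypothetical stand-ins for vendor API clients:

```typescript
// Hypothetical stand-ins for each vendor's API client.
declare function pushToCrm(record: Contact): Promise<void>;
declare function pushToMap(record: Contact): Promise<void>;

// The kind of one-way fan-out an ops team ends up owning by hand when
// the MDM hub can't distribute data on its own.
async function distributeFromSsot(masterRecords: Contact[]): Promise<void> {
  for (const record of masterRecords) {
    await pushToCrm(record); // Sales' system
    await pushToMap(record); // Marketing's system
    // ...plus one more call (and one more failure mode) per downstream app
  }
}
```

Every new app means another branch in this script, and every branch is another place for the data to drift.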
Issue #3: Wait, we’re using outdated data?
Being able to stand up multiple platforms and apps quickly has been a great advancement for orgs. The problem arises when older, slower, legacy endeavors such as MDM try to keep up. The simple fact is that MDM is slow and complicated, so the data in the SSOT (if you ever get one) is already out-of-date by the time you access it.
Issue #4: We’re also using inaccurate data?!
Yet another issue with traditional MDM is its fundamental approach to data. Most often, it concerns itself only with the connections between systems and not with the data itself. In other words, MDM was born out of a desire for data integrations, not data sync:
- Integrations are all about copying and pasting data between systems, with no regard for quality. (Put garbage data in, and garbage is all you’ll get out.)
- Syncing means multiple systems operating as one on shared, trusted data. (Rules and processes are in place so data is always clean, accurate, and consistent.)
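The difference is easiest to see in code. A minimal sketch, assuming the Contact shape from earlier and an invented System interface for the apps being written to:

```typescript
// An invented stand-in for any app that can be written to.
interface System {
  write(record: Contact): Promise<void>;
}

// Integration: a blind copy. Whatever is in the source lands in the target.
async function integrate(source: Contact, target: System): Promise<void> {
  await target.write(source); // garbage in, garbage out
}

// Sync: shared rules run before anything is written, so every system
// sees the same cleaned, validated record. The rules are illustrative.
async function sync(source: Contact, targets: System[]): Promise<void> {
  const cleaned: Contact = {
    ...source,
    email: source.email.trim().toLowerCase(), // normalize
  };
  if (!cleaned.email.includes("@")) {
    return; // reject bad records instead of spreading them everywhere
  }
  await Promise.all(targets.map((t) => t.write(cleaned)));
}
```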
The intrinsic fault in the tenets of MDM means we need a new way of doing things: one that understands modern business and data environments.
Take advantage of a new way
It’s about time we finally said goodbye to MDM. It’s simply too slow and complicated, especially for such a fast-moving market. Let’s look back fondly on its well-intentioned beginnings, but adopt both modern solutions and a modern approach to data.
Today’s businesses need to:
- Be quick and efficient
- Embrace distributed truth
- Democratize data
Syncari can help with all three. Instead of waiting days for data to be updated and months (or years) for a single source of truth to be implemented, Syncari’s data automation platform lets operations teams act now. Automate your data workflows and create distributed truth through an easy-to-understand drag-and-drop interface. Clean, manage, dedupe, unify, and standardize data without a single line of code.
We’re built for the operations pros struggling in today’s data landscape. Our data automation platform helps you fix what you have and prepare for an infinitely more complex future.