Customer Data Platform (CDP) vs Master Data Management (MDM)

Melissa AU Team | Customer Data Platform, Master Data Management

There’s no doubt about it – the key to retaining customers isn’t fighting price wars, it’s offering them the best experience possible. Customers like being wooed, and to do this the right way, businesses need to understand them. This is why customer data is so important. You need to know what is in your customer’s shopping cart to send them reminder emails, you need to understand their demographic profile to personalize their landing page, and so on.

This is where Master Data Management (MDM) and Customer Data Platforms (CDP) come in. While both handle data, there are a few intrinsic differences between them. Let’s take a look at what these applications are and how they differ.

What are Master Data Management (MDM) applications?

MDM is a data-led, technology-enabled discipline aimed at creating a single, reliable data source for all operational and analytical systems. It is typically driven by IT to ensure accuracy, uniformity and semantic consistency of the enterprise’s master data assets. To do this, it pools the data into master data files and tracks relevant data points throughout the organization.

MDMs merge data by comparing multiple data points such as names, addresses and phone numbers. Many MDM products are part of larger data handling solutions.
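To make this concrete, here is a minimal Python sketch of multi-attribute matching in the spirit described above; the field names, normalization and threshold are illustrative assumptions, not the logic of any particular MDM product.

    # Illustrative multi-attribute matching: compare several data points
    # and merge two records only when most of them agree.
    def normalize(value: str) -> str:
        # Lower-case and strip punctuation/whitespace for a fair comparison.
        return "".join(ch for ch in value.lower() if ch.isalnum())

    def match_score(a: dict, b: dict) -> float:
        # Fraction of the compared data points that agree.
        fields = ("name", "address", "phone")
        hits = sum(normalize(a.get(f, "")) == normalize(b.get(f, "")) for f in fields)
        return hits / len(fields)

    record_a = {"name": "Jane Doe", "address": "1 Main St", "phone": "555-0101"}
    record_b = {"name": "jane doe", "address": "1 Main St.", "phone": "555-0101"}

    if match_score(record_a, record_b) >= 2 / 3:  # hypothetical threshold
        print("Merge into one master record")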

What is a Customer Data Platform (CDP)?

As the name suggests, a CDP caters exclusively to customer data. It can be described as packaged software that connects to all customer-related systems and allows business users to manage and manipulate the collected data. A CDP empowers marketing teams to provide personalized customer experiences. It links data by comparing a single data point, such as an IP address or email account, to identify prospects.
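By contrast, here is an equally minimal sketch of the single-data-point linking a CDP performs, keying every interaction on one identifier (an email address here); the event shapes are made up for illustration.

    # Illustrative single-key identity linking: one data point (the email)
    # is the whole identity key that unifies a prospect's interactions.
    from collections import defaultdict

    events = [
        {"email": "jane@example.com", "action": "viewed_product"},
        {"email": "jane@example.com", "action": "abandoned_cart"},
        {"email": "sam@example.com", "action": "opened_email"},
    ]

    profiles = defaultdict(list)
    for event in events:
        profiles[event["email"]].append(event["action"])

    print(profiles["jane@example.com"])  # ['viewed_product', 'abandoned_cart']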

Master Data Management vs Customer Data Platform

Let’s look at some of the critical differences in approach between MDMs and CDPs.

Data governance

Both CDPs and MDMs can handle vast amounts of data. However, as all businesses know, when it comes to data, quality matters more than quantity. A CDP cannot differentiate between good and bad data: once data enters the system, there are no rules for dealing with poor-quality records, which significantly lowers the value of the data held.

On the other hand, MDMs are designed to create unified data views. An MDM collects data from multiple sources and validates it to create a reliable data source for the organization. Often, data from an MDM is used to feed CDPs.
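To illustrate the difference, here is a toy validation gate of the kind an MDM pipeline applies before data reaches the master file; the specific rules are assumptions made up for the sketch.

    # Illustrative validation at ingest: bad records are rejected before
    # they enter the master file, rather than stored as-is.
    import re

    def is_valid(record: dict) -> bool:
        if not record.get("name"):
            return False
        # Deliberately loose email shape check, for illustration only.
        return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")))

    incoming = [
        {"name": "Jane Doe", "email": "jane@example.com"},
        {"name": "", "email": "not-an-email"},  # a CDP would ingest this as-is
    ]

    master_file = [r for r in incoming if is_valid(r)]
    print(master_file)  # only the valid record survives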

Data integration

CDPs are designed with the marketing team in mind. They pull structured and unstructured information from various marketing applications and help the marketing team assess what a customer is doing and why, and therefore what the next step should be to woo that customer. The system is not capable of further integrations.

On the other hand, MDMs are designed to create a single source of truth. They can collect and send information to various enterprise applications and are not limited to marketing. They can be used for business analytics, to drive data-based decisions, and more.

Adding context to data

For any kind of data to be valuable, it must come with context. CDPs collect and compare data, but they have limited abilities when it comes to the hierarchical management of customer data. MDMs are much better suited to this.

MDMs create links that help businesses understand aspects such as which two customers are related, which customers are also suppliers, and so on. Understanding how customers interact with each other and with the different parts of a business provides valuable operational intelligence.
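A hypothetical sketch of such links, reduced to a tiny relationship table, might look like this; the identifiers and relationship kinds are invented for illustration.

    # Illustrative cross-entity links: a small relationship table that can
    # answer "who is related to whom" and "which customers are also suppliers".
    relationships = [
        ("cust_1", "related_to", "cust_2"),
        ("cust_3", "same_entity_as", "supplier_9"),
    ]

    def customers_who_are_suppliers(rels):
        return [a for a, kind, b in rels
                if kind == "same_entity_as" and b.startswith("supplier")]

    print(customers_who_are_suppliers(relationships))  # ['cust_3']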

Single vs Multi-Domain

Data is complex, and its various domains are typically interacting with each other constantly. A CDP offers a great perspective on customer data within its own domain, but this is an isolated view of the customer. Users can apply varied rules to match data for different purposes, and each application can work with its own IDs to unify the data within the CDP.

If you’re looking for a multi-domain view, MDMs offer better functionality. By organizing data with cross-domain relationships in mind, they allow a business to see all the different factors affecting it. For example, you can see the type of products being sold to a segment of customers and the modes of purchase, to improve promotions and target customers with higher accuracy. In terms of matching data sources, MDMs follow more rigid rules.
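The example above amounts to a cross-domain join; a minimal sketch, with made-up customer and order data, might look like this.

    # Illustrative cross-domain query: join the customer domain (segments)
    # with the sales domain (orders) to see what a segment is buying.
    customers = {"cust_1": "premium", "cust_2": "standard"}
    orders = [("cust_1", "laptop"), ("cust_1", "headphones"), ("cust_2", "laptop")]

    premium_purchases = [product for cust, product in orders
                         if customers[cust] == "premium"]
    print(premium_purchases)  # ['laptop', 'headphones']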

What do You Need?

While MDMs are more established, CDPs are relatively new players in the field. You can work with both or choose one based on your end goals. If it’s just about customer data for your marketing team, a CDP will suffice. But if you need a system that helps derive context from all the data collected and gives you a holistic view of the business, an MDM is the better solution.

The Evolution Of Data Quality

Melissa AU Team | Data Quality

How do brands decide where to open a store? How does an entrepreneur decide where to advertise their new company? What makes a company decide to promote one product and not others? Data – all of these decisions are based on data.

Today, finding data is easy, but that alone is not enough. To support good short-term and long-term decision-making, businesses must ensure that they rely on good quality data – data that is an accurate representation of reality. It must be:

• Accessible
• Accurate
• Complete
• Timely
• Unique
• Reliable
• Relevant

These criteria developed over time as businesses grew more and more reliant on data. Let’s take a look at this evolution.

Establishing The Importance Of Data Quality

The term ‘Business Intelligence’ (BI) was coined by Professor Richard Millar Devens in 1865, first appearing in his Cyclopædia of Commercial and Business Anecdotes. Devens used the term to describe how Sir Henry Furnese gathered data and acted on it in a bid to boost profits.

In 1958, Hans Peter Luhn published an article about the potential of collecting and analyzing business information with the help of technology to make it useful. The article noted that being ‘first’ to use such information could offer significant advantages.

Until the late 1960s, the only people who could translate data into useful information were specialists. At the time, data was typically stored in silos and remained fragmented and disjointed, so results were often questionable. Edgar Codd recognized the problem and, in 1970, presented a solution in the form of the “relational database model”, which was soon adopted worldwide.

Early Data Management

The earliest form of database management system can be described as the decision support system (DSS); modern business intelligence has its roots in this system. By the 1980s, the number of BI vendors had grown considerably, and many businesses had realized the value of data-driven decisions. Several tools were developed to make data more accessible and organized, including data warehouses, OLAP and Executive Information Systems, all of which built on the relational databases that were by then in wide use.

Data Quality-As-A-Service (DQaaS)

At this time, name and address data was stored on large mainframe computers for data delivery. These systems were designed to track customer accounts that had become invalid because the person had moved, married, divorced, died, or for some other such reason. They could also correct common spelling errors in names and addresses.

Around 1986, government agencies allowed companies to reference postal data by cross-checking their records against the NCOA (National Change of Address) registry. This move dramatically improved data quality and accuracy, and was initially sold as a service.

The Internet And Data Availability

By the early 1990s, the internet was beginning to make its presence felt, and it brought with it a flood of data. At the same time, reliance on data analysis for business decisions was also increasing. The relational databases in use could no longer keep up with the volume of data available to them, a problem compounded by the new data types being developed.

A solution emerged in the form of non-relational databases, or NoSQL. These databases ran across multiple computer systems, could handle many different types of data, and made data management far more flexible. The large unstructured data sets handled by NoSQL came to be referred to as big data, a term that was established by 2005.

Data Governance

By 2010, businesses had developed the means to store, combine, manipulate and present information in different ways to cater to their different needs. This marked the beginning of data governance. Pioneering companies in the field designed governance organizations to find the best ways to maintain data and develop collaborative processes.

They also brought a “policy-centric approach” to data models, data quality standards and data security, designing processes that allowed data to be stored in multiple databases as long as it adhered to a common policy standard. Data thus became a corporate asset.

Data Quality In The Present Day

Data governance is still evolving. A number of associations, such as the Data Governance Institute and the Data Governance Professionals Organization, are committed to promoting data governance practices.

Given that it reduces the risk of human error and offers faster processing, Machine Learning is becoming a popular way to implement data governance. Every company that uses data should have its own data governance team and policy. Data governance teams comprise business managers and data managers, as well as the clients who use the business’s services.

The policy should ideally cover overall data management: how data is to be used, to whom it should be available, and how it is secured. There are countless tools available to enhance data quality, but finding the ones best suited to a particular situation can be a challenge. If you’re looking for ways to enhance your data quality, choose workflow-driven data quality tools that can be scaled easily while maintaining a system of trust.

How to Drain a Murky, Big Data Swamp

Melissa Team | Article, Big Data, Data Cleansing, Data Governance, Data Quality

Gathering data without proper governance can turn data lakes into data swamps – devoid of metadata management, rife with duplication, plagued with errors, and presenting an opaque view of your operations. But Big Data verification and governance solutions can transform expanding data volumes into trusted, actionable information across the enterprise. Melissa’s V.P. of Enterprise Sales and Strategy, Bud Walker, shares exclusive insight in Drain the Swamp, an article in The DZone Guide to Big Data. Turn to Page 29 to see how you can drain the murky data from your data swamp to pull deep value from the reaches of your data.

The Relationship between Data Governance and GDPR

Melissa UK Team | Article, Data Governance, Data Quality, Data Quality Services, GDPR, Global Address Verification, Global Data Quality, Global ID Verification, Global Phone Verification, United Kingdom

According to the GDPR, personal data refers to any information related to a person, including name, photo, email, address, bank details, social media updates, medical information and even IP addresses. It can come in various forms and contains several data elements. Data governance and GDPR are complementary: a strong data governance program is vital for GDPR compliance, and with GDPR now in force, world leaders are paying close attention to how data is governed, collected and shared.

This growth in the importance of data governance not only ensures that customer information is protected but also demonstrates the need for digital transformation. Whilst data governance is becoming increasingly important, many approaches to it remain outdated. Thanks to technological advances and social media, which have changed the way people access and interact with information, organisations are now collecting more data, from more sources, in huge volumes. But as the level of data held by firms increases, the way in which that data is governed lags behind.


A data governance program should be equipped with supporting technology that ensures new data sources or changes can be incorporated accurately. With consumers changing how they search, locate and interact with information thanks to social platforms, organisations are expected to adapt. Organisations should look to invest in tools and solutions that will help them manage their data and ensure its accuracy. A strong data quality solution that reviews data at the point of entry can be an extremely useful place to begin, as the sketch below illustrates.
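As a rough illustration of review at the point of entry, the sketch below standardizes what it can and flags what it cannot; the field names and rules are assumptions, not a description of any particular product.

    # Illustrative point-of-entry review: standardize fixable values and
    # flag suspect ones before the record is stored.
    def review_at_entry(record: dict):
        issues = []
        cleaned = {key: value.strip() for key, value in record.items()}
        cleaned["name"] = cleaned.get("name", "").title()
        if "@" not in cleaned.get("email", ""):
            issues.append("email looks invalid")
        return cleaned, issues

    record, issues = review_at_entry({"name": "  jane doe ", "email": "jane[at]example.com"})
    print(record, issues)  # the record is flagged for follow-up, not silently stored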

The whole organisation should take a proactive approach to how information about data elements is shared. Creating a platform based on information sharing and collective knowledge can help to ensure accuracy and accountability within the business. All departments should be responsible for monitoring data and ensuring its accuracy.

It is important to remember that investing heavily in data governance should not be done simply for GDPR compliance, but because data governance plays a key role in digital transformation. Data governance supports business growth, knowledge and performance, as it can provide critical insights into customers and help build solid customer relationships.

Melissa is able to offer data quality solutions that can be tailored to your organisation’s data governance needs. For more information, please contact info.uk@melissa.com or ring the team on 020 7718 0070.


If you are interested in continuing the conversation, please connect with us on Twitter.

Melissa Data and Semarchy Partner to Integrate Data Quality and Master Data Management

Blog Administrator | Address Quality, Analyzing Data, Analyzing Data Quality, Data Cleansing, Data Enrichment, Data Governance, Data Integration, Data Management, Data Quality, MDM, Press Release

Melissa Data Enrichers Enable Clean, Global Contact Data for Semarchy Users; Webinar Demonstrates Best of Breed Strategy for Fast, Optimized MDM Operations

Rancho Santa Margarita, Calif. – November 12, 2014 – Melissa Data, a leading provider of contact data quality and data integration solutions, today formally announced its partnership with Semarchy, a developer of innovative Evolutionary Master Data Management (MDM) software and solutions. Together the two firms are facilitating sophisticated data quality as a key performance enabler of effective MDM operations. Through this partnership, Semarchy offers its users a fast and easy way to perform worldwide address enrichment, standardization, geocoding and verification using Melissa Data’s proven data quality tools and services. In turn, Melissa Data enables its users to upgrade data quality projects, moving beyond tactical processes to engage in a more comprehensive strategy combining data quality, data governance and master data management. Data integrators can learn more in a joint webinar presented by the two firms; click here to access the presentation on demand on Melissa Data’s website.

“Clean, enhanced contact data is essential to enterprise MDM – maximizing the value of applications, empowering sales and marketing, and assuring trusted information as the basis for analysis and reporting,” said Bud Walker, Director of Data Quality Solutions, Melissa Data. “By integrating data quality operations within the Semarchy Convergence™ platform, we’re supporting users in easily unlocking this kind of high level business value, increasing the quality of supplier, location and customer data within a single de-duplicated 360° view.”

Semarchy Convergence™ for MDM is an integrated platform, available in the cloud or on-premise, enabling ground-up design and implementation of master data management initiatives; using a single tool, developers and data managers can create data models, manage quality, set up match/merge and transformation rules, and build data management applications and workflows. Semarchy Convergence™ now supports Melissa Data Enrichers, validated plug-ins that uniquely enable Semarchy users to verify, clean, correct, standardize, and enrich global customer records including full contact validation for global addresses, phones and emails.

Melissa Data Enrichers enhance the capture of new master records at the point of entry; deployed on-site or via the cloud, these tools eliminate the need for users to build integration between internal MDM systems and data quality processes. Customer records can be enhanced with missing contact data and master data can be appended to include attributes such as geographic coordinates. Users of the Semarchy Convergence™ platform can model any domain, managing data via collaboration workflows, generated user interfaces, or through simple batch processing to clean, standardize and enrich legacy data.

“Our perspective on master data management as an evolving business function recognizes the essential correlation between reliable data and authoritative analytics. By capitalizing on Melissa Data’s proven technologies, Semarchy users can more effectively manage increasingly intelligent applications of global data – consolidating unstructured sources and enriching master data across all their business domains,” said Matthew Dahlman, Technical Director Americas, Semarchy.

Click here for a 30-day trial version of Semarchy Convergence for MDM, and here to begin your free trial of Melissa Data Enrichers for the Semarchy platform. For more information, contact info@semarchy.com, sales@melissadata.com, or call 1-800-MELISSA (635-4772).
