Using Master Data Management (MDM)
MDM is the information management discipline for master data. However, MDM is not for all data. It is only about data that are worth managing (widely shared among organizational and functional units, consistent across viewpoints, foundational and valuable to the operation and the business) and that can be managed (understood by the business, stable over time).
MDM is critical for customer experience (CX) applications.
By 2020, 75% of organizations that neglect MDM and EIM while creating a 360-degree view of their customers to support the CX will adversely affect CX metrics through the use of inaccurate data during customer interactions.
Through 2017, CRM leaders who avoid MDM will derive erroneous results that annoy customers, resulting in a 25% reduction in potential revenue gains.
By 2019, 50% of application leaders supporting social for CRM will leverage MDM, resulting in higher customer satisfaction scores and increased retention.
Trusted master data is necessary for confident business use of any data, and MDM makes it possible. As the fragmentation of internal data continues unabated, the costs of current and new business processes rise, business benefits are eroded, and technical debt is created. Customers become further alienated by erroneous and inappropriate interactions, both active and passive.
MDM enables consolidation and governance of physically disparate internal instances of customer, product, and other master data. It reduces redundant, error-prone business processes and results in a better customer experience. The ability to identify matching data in external datasets (data brokers, social data, IoT, etc.) and to act on observed activity, enabling enhanced sales and service capabilities, is emerging as a key requirement for enterprise MDM initiatives.
Key benefits of linking internal profiles to external data include identifying customer engagement opportunities, prioritizing high-value customers and keeping them loyal, and avoiding public misunderstandings. For example, single views of the customer increasingly leverage social data, and MDM is key for linking social media profiles with internal data. The explosion of social networks (and smartphone use) is forcing companies to invest in applications such as sentiment analysis (what are customers saying about my company, brand or product? Do my customers categorize my products the way I do?) and network analysis (who knows whom? Who influences whom?).
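As a hedged illustration of the kind of matching involved in linking social media profiles with internal master data, the sketch below fuzzy-matches a hypothetical internal customer record against candidate external profiles. All names, fields, and the threshold are invented for illustration; production MDM matching uses far richer survivorship and scoring rules.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_profile(internal: dict, candidates: list, threshold: float = 0.85):
    """Return the best-matching external profile above the threshold, or None.

    Scores on name similarity; an exact e-mail match is treated as decisive.
    (Hypothetical record shapes -- illustrative only.)
    """
    best, best_score = None, 0.0
    for cand in candidates:
        if internal.get("email") and internal["email"].lower() == cand.get("email", "").lower():
            return cand  # exact e-mail match: link with high confidence
        score = similarity(internal["name"], cand["name"])
        if score > best_score:
            best, best_score = cand, score
    return best if best_score >= threshold else None

internal = {"name": "Jane Q. Doe", "email": "jane.doe@example.com"}
candidates = [
    {"name": "J. Doe", "email": ""},
    {"name": "Jane Doe", "email": "jane.doe@example.com"},
]
print(match_profile(internal, candidates)["name"])  # the exact e-mail match wins
```

In practice, deterministic keys (like a verified e-mail or a social network ID collected during onboarding) beat probabilistic matching, which is why the governance practices discussed next matter.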
Organizations are incorporating new data sources into their information governance organization and processes, surveying business users at all levels (especially sales and service) to determine which data would be useful to them, and changing business processes to achieve the required level of trust (e.g. asking for social network IDs during onboarding or service interactions instead of attempting to match with MDM). However, they need to navigate carefully to avoid crossing the “creepy line” when building acquisition and matching capabilities across internal master data and external datasets. Organizations want to create the required view via well-managed data governance and integration processes that use canonical forms of all data, whether batch (ETL / DW), real time (ESB / SOA), or both.
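A minimal sketch of what a canonical form might look like in practice, assuming two hypothetical source systems ("crm" and "billing") with invented field names. Real canonical models are defined by the governance process, not hard-coded like this:

```python
import re
from datetime import datetime

def to_canonical(record: dict, source: str) -> dict:
    """Map a source-specific customer record into one canonical shape.

    Field names and formats here are illustrative, not a standard.
    """
    if source == "crm":
        name, phone = record["FullName"], record["Tel"]
    elif source == "billing":
        name = f'{record["first"]} {record["last"]}'
        phone = record["phone_number"]
    else:
        raise ValueError(f"unknown source: {source}")
    return {
        "name": " ".join(name.split()).title(),   # collapse whitespace, title-case
        "phone": re.sub(r"\D", "", phone),        # digits only
        "source": source,                          # lineage for governance
        "loaded_at": datetime.utcnow().isoformat(),
    }

print(to_canonical({"FullName": "  jane   doe ", "Tel": "+1 (555) 010-0200"}, "crm"))
```

The same canonical shape can then feed both batch (ETL / DW) and real-time (ESB / SOA) integration paths.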
Understanding shifts in security
Security is borderless and extends well beyond the traditional perimeter. Securing APIs using IAM, TLS and API keys is no longer enough. Application security tools, such as web application firewalls, currently have limited API security capabilities. WAF and bot mitigation solutions also typically offer only very limited anomaly detection for APIs. As a consequence, successful exploitation of API weaknesses, such as insufficient logic boundaries, will remain a threat.
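To illustrate the kind of API anomaly detection that goes beyond key and TLS checks, here is a toy sliding-window request-rate check per API key. The class name, window, and threshold are assumptions for illustration, not a feature of any particular WAF or gateway:

```python
import time
from collections import defaultdict, deque

class RateAnomalyDetector:
    """Flag API keys whose request rate exceeds a per-window limit.

    A toy complement to key/TLS validation; real API security also needs
    checks on logic boundaries, payloads, and behavioral baselines.
    """
    def __init__(self, window_seconds=60.0, max_requests=100):
        self.window = window_seconds
        self.max_requests = max_requests
        self.hits = defaultdict(deque)  # api_key -> timestamps in window

    def observe(self, api_key, now=None):
        """Record one request; return True if the key looks anomalous."""
        now = time.monotonic() if now is None else now
        q = self.hits[api_key]
        q.append(now)
        while q and now - q[0] > self.window:  # evict hits outside the window
            q.popleft()
        return len(q) > self.max_requests

detector = RateAnomalyDetector(window_seconds=1.0, max_requests=5)
flags = [detector.observe("key-1", now=0.1 * i) for i in range(10)]
print(flags[-1])  # the 10th request within one second trips the threshold
```

A simple rate check like this would catch naive bots but not the sophisticated, low-and-slow abuse described below, which is why broader analytics are needed.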
Enterprises are focusing on the wrong party in their efforts to improve security: cloud services are not getting breached, and most security incidents are the customer’s fault. The big story in cloud security is that big hacks and failures have not occurred.
“The average time to detect a breach in the Americas is 99 days and the average cost is $4 million.”
51.8% of online traffic is bots (28.9% are bad bots).
Sophisticated security events have become too hard to detect. There is a lack of adequate security data collection and monitoring tools that operate on data streams at scale. The collected data is stored in giant repositories, and connecting the dots for an attack becomes a challenging task. Additionally, more data means even more policies and rules to be created and maintained. In some cases, once the attack goal is met, the malicious code immediately disappears without a trace. The variability of threats and adaptive adversaries make the security challenges significantly harder.
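The "connecting the dots" problem can be sketched as a toy correlation pass that flags sources exhibiting several distinct attack stages within a time window. The event shapes and stage labels are invented; real systems run this kind of logic continuously over streams at far larger scale:

```python
from collections import defaultdict

# Each event: (timestamp_seconds, source, stage). Stage labels are illustrative.
EVENTS = [
    (100, "10.0.0.5", "port_scan"),
    (160, "10.0.0.5", "brute_force"),
    (200, "10.0.0.9", "port_scan"),
    (220, "10.0.0.5", "exfiltration"),
]

def correlate(events, window=300, min_stages=3):
    """Flag sources showing several distinct attack stages within a window.

    Individually, each event looks like noise; only the combination,
    viewed per source over time, reveals the attack.
    """
    by_source = defaultdict(list)
    for ts, src, stage in sorted(events):
        by_source[src].append((ts, stage))
    flagged = []
    for src, hits in by_source.items():
        for ts, _ in hits:
            stages = {s for t, s in hits if ts <= t <= ts + window}
            if len(stages) >= min_stages:
                flagged.append(src)
                break
    return flagged

print(correlate(EVENTS))  # → ['10.0.0.5']
```

Note how each rule (window, minimum stages) is another policy to tune and maintain, which is exactly the maintenance burden the text describes.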
By 2020, there will be one major data breach of Thing Commerce, leading to the leakage of at least 100 million customers’ data and the loss of $1 billion.
Customers need to develop trust before allowing things to make decisions on their behalf.
User privacy needs to be respected and users need to have full control of their data. Technologies such as AI, NLP and bot frameworks need to further advance to be good enough for making autonomous decisions. Fraud detection and prevention need to take place in real time. Multilayer security protection needs to be deployed to ensure integrity of Thing Commerce.
There are too many false positives because security tools are not doing a good job of filtering out the noise. Responses to security incidents depend on too many manual processes (running around putting out fires is not scalable). There is also dependence on too many independent tools that are not integrated together. Though integration of varied data sources, systems, big data technologies, advanced analytics, and ML / DL driven applications has been the mantra in the business applications space for a while now, there has been little progress towards integrating the disparate point solutions used in security operations, or towards using integrated big data analytics for security and risk management. There are some big data based security and governance solutions, but very few that can meet the real-time security, risk, and governance needs of high-speed streaming data.
Security teams need to be equipped with the knowledge, tools, analytics, visualization, and infrastructure to effectively counter the increasing risks to the business, while improving the trustworthiness of the data and systems being used by the business. Additionally, the processes and skills of security teams need to be augmented as well. For example, security teams now need to include big data and ML / AI specialists, not only to protect the organization but also to advise development teams on the security practices required for building more robust and secure systems.
Organizations expect analytics to speed up detection and automation to reduce response times, acting as a force multiplier to scale the security team without adding more people. Organizations want analytics and automation to ensure they focus their limited resources on events with the highest risk and the most confidence.
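A minimal sketch of risk- and confidence-based triage, assuming a made-up scoring model (risk times confidence) to focus limited analyst capacity on the highest-priority events. The alert fields and weights are invented for illustration:

```python
def triage(alerts, capacity=2):
    """Order alerts by risk x confidence and return the top 'capacity' IDs.

    Acts as a force multiplier: analysts see only the events with the
    highest combined risk and confidence, instead of every alert.
    """
    ranked = sorted(alerts, key=lambda a: a["risk"] * a["confidence"], reverse=True)
    return [a["id"] for a in ranked[:capacity]]

alerts = [
    {"id": "A1", "risk": 9, "confidence": 0.2},   # severe but likely a false positive
    {"id": "A2", "risk": 7, "confidence": 0.9},   # serious and well corroborated
    {"id": "A3", "risk": 3, "confidence": 0.95},  # confident but low impact
    {"id": "A4", "risk": 8, "confidence": 0.6},
]
print(triage(alerts))  # → ['A2', 'A4']
```

Note that weighting by confidence demotes the severe-but-noisy A1, directly addressing the false-positive fatigue described earlier.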
In this final blog, we explored shifts towards improving trust and security in the use of business applications. These shifts need to be understood now in order to develop comprehensive strategies for addressing the associated governance, risk, and security requirements.
- Michael Patrick Moran, MDM is critical for customer experience, Gartner Application Strategies & Solutions Summit, 2017.
- Olive Huang, The state of customer experience technologies and their impact to your application strategy, Gartner Application Strategies & Solutions Summit, 2017.
- Jason Daigler, To the point: Thing Commerce: Expand sales and engage customers through smart things, Gartner Application Strategies & Solutions Summit, 2017.
- Mark O’Neill, API Security: Balancing openness and protection, Gartner Application Strategies & Solutions Summit, 2017.
- Neil MacDonald, Gartner Security & Risk Management Summit 2017.
- David Mitchell Smith, Boot camp – Getting started with cloud, Gartner Application Strategies & Solutions Summit, 2017.