
Maximizing Value in Data Science Vendor Management


Chapter 1: Understanding the Landscape

In the world of data science, the mantra often heard is, "consolidate your data." This phrase has been the catalyst for numerous challenging IT projects, promising a "360-degree view of the customer" and "data-driven" decisions. However, after significant investments and prolonged implementations, many businesses struggle to identify tangible benefits from these ventures. It's critical to note that these endeavors were meant to 'facilitate' analytics, rather than provide immediate value.

Adopting a 'data-first' approach can be misleading from a data science viewpoint. There's an abundance of valuable data available, and effective data science should begin from the top, focusing on objectives before diving into the data. Thus, establishing a clear business goal and metric is paramount.

A new breed of consultant has emerged, selling so-called actionable insights—an offering as insubstantial as vapor. We find ourselves amidst an AI hype cycle, with every consulting firm and IT provider claiming expertise in this domain, often citing their presence in some 'magic quadrant.' When executed correctly, a well-planned data science initiative can break even within six months. So, how do you navigate through the complex mathematics, visualizations, and dazzling demos to select an analytical partner who can fulfill this promise?
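That six-month break-even claim is testable with simple arithmetic rather than taken on faith. A minimal sketch (the function and every figure in it are hypothetical, not from any real engagement):

```python
def breakeven_months(upfront_cost, monthly_benefit, monthly_run_cost=0.0):
    """Months until cumulative net benefit covers the upfront investment."""
    net = monthly_benefit - monthly_run_cost
    if net <= 0:
        return float("inf")  # never breaks even
    return upfront_cost / net

# Hypothetical engagement: $300k upfront, $60k/month benefit, $10k/month to run.
print(breakeven_months(300_000, 60_000, 10_000))  # 6.0
```

Asking a vendor to fill in these three numbers, and defend them, is a quick filter for proposals that cannot articulate value.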

This article contends that analytics should be treated like any other business pursuit, making it possible to assess and manage accordingly. We will outline key questions to evaluate vendors during the proposal phase and share best practices for managing partnerships successfully.

Vendor Assessment Essentials

The first inquiry should be straightforward: "What is their Unique Value Proposition (UVP)?" Surprisingly, many vendors struggle to provide a convincing response. Do they offer a unique data source? Are their analytical capabilities proven? Can they easily integrate and evaluate third-party data? Do they employ experienced data scientists?

Large organizations often operate on inertia, valuing only familiar practices and mistakenly blending them with data science. For instance, if their legacy business involved selling mainframes, they might now be promoting cloud solutions. Consultancies tend to focus on complex integrations, while the Big 4 primarily market BI and reporting tools, leading to the familiar pitch: "You need this infrastructure first; data science can come later." This narrative conveniently allows for costs related to infrastructure and software tools to be justified without clear business value or return on investment. While some resources may be necessary, their costs should be validated through empirical analysis—essentially, through data science.

Does the vendor's leadership possess genuine data science expertise? Many firms rebrand their services as 'Data Science' without having actual data scientists on staff. It's not unreasonable to inquire about their qualifications. A quick search on LinkedIn or Google Scholar can reveal surprising truths—some organizations may have no data scientists or claim to have a handful working offshore.

In a revealing discussion, a Senior Partner at a top consulting firm admitted that his team lacked data assets and successful analytics projects. He boasted that he had never hired a data scientist over 26 years old, and his practice lead lacked a college degree. His strategy for competing with other providers was simply being a "trusted professional services partner," leveraging existing relationships rather than actual analytics capability.

The industry is not facing a shortage of skills or junior resources, but rather a lack of leaders with a solid grasp of the mathematics involved and a proven track record in data science solutions. Most projects also necessitate a field engineering lead to collaborate closely with business and operations teams, capturing process flows and constraints. Relying on intermediaries can introduce confusion and delay, jeopardizing project delivery.

Is the vendor's proposal sufficiently detailed for technical assessment? Their approach must be both technically credible and feasible. If they cannot clarify their technologies, why should you place your trust in them? If they sidestep specifics by citing proprietary information, that raises concerns about their actual capabilities.

The algorithms they propose may not hold any relevance to your business challenges. Terms like "neural network" or "Natural Language Processing" can be misleading if not clearly defined. For instance, "Cognitive" is merely an adjective without context.

A sound technical approach is essential but not the sole requirement for success. For instance, effective fraud detection solutions utilize various advanced algorithms, including anomaly detection and network analysis.
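As a hedged illustration of the anomaly-detection piece alone, here is a minimal z-score rule over transaction amounts. Real fraud systems combine many such signals with network analysis; the data and threshold below are invented for the sketch:

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.5):
    """Return amounts more than `threshold` sample standard
    deviations away from the mean of the batch."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) > threshold * sigma]

# Invented transaction amounts; one value dwarfs the rest.
txns = [20, 35, 18, 42, 25, 30, 5000, 22, 28, 33]
print(flag_anomalies(txns))  # [5000]
```

Even this toy exposes a design question worth asking a vendor: a single extreme value inflates the standard deviation, so threshold choice and robustness matter long before "neural networks" enter the conversation.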


Furthermore, do their case studies include ROI or performance comparisons? In data science, a disciplined, empirical methodology is crucial. Performance metrics should be quantifiable based on data, whether through existing data or via a champion/challenger live rollout.
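A champion/challenger comparison ultimately reduces to measuring the same success metric on both arms of the rollout. A toy sketch (the metric, counts, and split are all hypothetical):

```python
def relative_uplift(champ_hits, champ_n, chall_hits, chall_n):
    """Relative improvement of the challenger over the champion
    on a single success metric (e.g. fraud caught, conversions)."""
    champ_rate = champ_hits / champ_n
    chall_rate = chall_hits / chall_n
    return (chall_rate - champ_rate) / champ_rate

# Champion handles 10,000 cases (400 hits); the challenger
# gets a 2,000-case slice of live traffic (100 hits).
print(f"{relative_uplift(400, 10_000, 100, 2_000):+.1%}")  # +25.0%
```

A vendor whose case studies cannot be restated in this form, two arms, one agreed metric, has probably not run one.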

Were these case studies conducted using real client data? This is a critical question; many solutions have not been tested on live datasets, relying instead on synthetic data or unrelated datasets. Such practices are common among legacy software companies that prioritize standardized APIs over actual data value extraction.

Can they provide a reference site? Not all clients are willing to serve as references, but a promising vendor without an "Alpha" deployment is an opening to negotiate: a co-development agreement can secure bespoke functionality at a reduced cost.

Vendor Management Best Practices

A successful data science engagement should yield three key deliverables: a diagnostic, proof of value, and an implementation plan. A steering committee should be formed to review each deliverable throughout the engagement.

Establish a steering/review committee at the project's start, consisting of key stakeholders, including the P&L owner and analytics lead. While internal IT team leaders can assist with due diligence, they often lack data science skills, which can inflate project costs if they do not fully understand the mission or technology involved.

Define business objectives and performance metrics clearly. Metrics should tie the engineering objective to business outcomes: profit, revenue, costs, and so forth. Clear metrics simplify due diligence and set expectations for the vendor. For example, predicting customer attrition delivers no direct business benefit by itself; a prediction only creates value when it drives an intervention, so attrition prediction alone is not a strong test of a vendor's capabilities.
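The attrition example can be made concrete: a churn score only becomes a business metric once it is tied to an intervention and costed. A sketch with entirely hypothetical figures:

```python
def retention_campaign_value(n_flagged, precision, save_rate,
                             customer_margin, contact_cost):
    """Net value of acting on churn predictions: customers saved
    times their margin, minus the cost of contacting every flagged account."""
    true_churners = n_flagged * precision   # correctly flagged churners
    saved = true_churners * save_rate       # churners the offer actually retains
    return saved * customer_margin - n_flagged * contact_cost

# 1,000 accounts flagged, 40% precision, 25% of contacted churners retained,
# $500 margin per saved customer, $20 per contact.
print(retention_campaign_value(1_000, 0.40, 0.25, 500, 20))  # 30000.0
```

Framed this way, the vendor is accountable for a dollar figure, not a model accuracy number.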

Schedule a "Go/No-Go" review early in the project. This interim review should occur 2–3 weeks after data access is granted, allowing the vendor to verify if the objective aligns with the data.

After reviewing the initial results, the steering committee can decide to either proceed, re-prioritize, or halt the project. From this juncture, there should be clarity regarding the value of the approach being taken.

Be cautious of costly implementation plans. Initial implementation cost estimates may only be indicative, as many constraints and data requirements are uncovered during prototype development.

Some vendors may attempt to recover costs by overselling platforms and infrastructure. That said, many industries genuinely run on outdated decision engines, and integrating with them can complicate data science delivery, so distinguish necessary integration work from padding.

The promise of data science remains clouded by poorly planned initiatives and dubious practitioners. To succeed, rigorous due diligence is essential, alongside clearly defined business problems, established metrics, and proof of value.


Russell Anderson, the Director of Transaction Analytics Advisory, boasts over 30 years of experience in data science across financial services, retail, e-commerce, and biomedical sectors. He has provided scientific advice to renowned analytics firms and holds a Ph.D. in Bioengineering from the University of California, with numerous scientific publications and patents.

Questions or comments are encouraged: [email protected]
