Beyond Features: Choosing Data and AI Technologies That Fit

Author: Liam Agnew

Date Published: March 31, 2026

We’ve all been that disappointed kid who wasted a “fortune” on some gizmo or game that ended up being way less awesome than we imagined. Now, as well-adjusted adults, we like to think we’ve outgrown that naivety. Truth is, it’s still there. It’s just sneakier.

We see this when organizations adopt a cutting-edge data and analytics solution. They embrace the latest technology, explore what competitors are using, and then make an expensive commitment, perhaps even opting for custom tools.

Yet months or years later, they discover they’ve spent as much as, or more than, the value returned. In other cases, it is unclear what they gained or even what it cost. Why do we spend so much on analytics technologies without clear evidence of their effectiveness?

This is part four of our series on eight common gaps quietly draining ROI from data, analytics, and AI initiatives—and how to turn investment to impact.

Today, we tackle: How do we invest in solutions that measurably move the business forward?

Where does it go wrong?

There are many ways organizations end up with underperforming analytics solutions and technologies:

Underestimating the cost of integrations and change management before adoption.

Adopting tools that are misaligned with how the team works or the maturity of the business.

Choosing solutions because they were trending, aggressively marketed, or used by a competitor.

As a result, their analytics can fail to deliver trusted insights that justify the cost. In data and AI, the biggest risk to ROI isn’t choosing a technology that lacks features; it’s choosing one that doesn’t fit your business.

The most successful organizations don’t start with tools. They start with business challenges, then evaluate solutions in a tool-agnostic way to find the right fit.

Mistake #1: Missing Hidden Costs

When data and analytics solutions are chosen without fully anticipating expenses, hidden costs multiply quietly, eating away at ROI.

Total Cost of Ownership

Many teams focus on the initial purchase, implementation, development, or subscription costs but underestimate the costs of:

  • Migration
  • Training
  • Change management
  • Integrations
  • Support and operations
  • Scaling

When these are overlooked, initial cost estimates come in too low. Leaders must account for costs beyond the “sticker price.”

Take change management as an example: if a new forecasting tool or data source is introduced without appropriate investment in change management—without building understanding and buy-in—trust erodes quickly. In data and AI, trust is everything. If decision makers don’t trust the data, predictions, or outputs, they won’t use them. Don’t be caught off-guard by the investment required to develop buy-in.

Accounting for the less obvious costs of a data and AI technology adoption helps ensure your estimates hold up to reality. Look at examples from comparable organizations.
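As a rough illustration of why the sticker price misleads, consider a back-of-the-envelope total-cost-of-ownership tally. Every figure below is hypothetical; the point is that the line items from the list above often dwarf the subscription itself:

```python
# Back-of-the-envelope TCO sketch. All dollar figures are hypothetical
# placeholders; substitute estimates from comparable organizations.

sticker_price = 120_000  # annual subscription (assumed)

hidden_costs = {
    "migration": 45_000,
    "training": 20_000,
    "change_management": 30_000,
    "integrations": 60_000,
    "support_and_operations": 35_000,
    "scaling": 25_000,
}

# Total cost and the share of it that never appeared on the quote.
total_cost = sticker_price + sum(hidden_costs.values())
hidden_share = sum(hidden_costs.values()) / total_cost

print(f"Sticker price: ${sticker_price:,}")
print(f"Total cost:    ${total_cost:,}")
print(f"Hidden share:  {hidden_share:.0%}")
```

Even with these made-up numbers, the “hidden” items account for well over half the total, which is why estimates anchored to the sticker price alone rarely hold up.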

Opportunity Cost of Ownership

Even after evaluating the total cost of ownership, as Joe Reis reminds us in his Fundamentals of Data Engineering: Plan and Build Robust Data Systems, organizations must also consider the opportunity cost of ownership. It is common to focus on the explicit costs. But opportunities are sacrificed too; every technology adopted also limits what else we can do.

One way to evaluate this opportunity cost is by asking questions like:

Are we building transferable skills (e.g., widely used data science techniques) or vendor-specific skills (e.g., proprietary GUIs)?

How easy will it be to replace this system, or parts of it, later?

Is our data readily accessible if we need to migrate or change platforms?

A choice must be made, but each choice can reduce future flexibility, so it should be made deliberately.

Building vs. Buying

Building a solution is one of the easiest ways to stumble into hidden costs. The cost of building and maintaining a data and analytics solution is usually dramatically underestimated. Building is expensive and is only worth it when it provides a specific competitive edge that could not be achieved otherwise.

True, off-the-shelf solutions have less flexibility. But consider whether that flexibility is worth a tenfold increase in investment. (Sometimes it is!)

Mistake #2: Misaligning Tools to the Job

Many analytics approaches are selected because they seem promising, or because another organization succeeded with them. But do they match the team’s culture, maturity, scale, or business model? There is no ROI if the solution is not adopted and used.

Data Maturity

Many analytics tools assume a level of data maturity that simply doesn’t exist yet in an organization. Clear ownership, consistent definitions, accessible data, and quality controls are elements of data governance and data management that may need to be in place for the tool to be effective.

Without this foundation, the tool may technically work, but it won’t deliver value. It can even amplify inconsistency and erode trust.

The issue isn’t capability. The tool simply assumes a level of data maturity the organization hasn’t reached. A solution that builds the data maturity foundation would be significantly more valuable for this organization, at this stage.

Ways of Working

Analytics platforms and solutions are often opinionated. They embed assumptions about governance, workflows, and decision-making. When those assumptions don’t align with how teams actually work, adoption stalls, workarounds emerge, and value never materializes.

Solutions should fit a business’s problems, people, and systems unless there is a deliberate, strategic decision to change how the organization works. In that case, the change must be actively managed, and users must be intentionally brought along.

Involve end users early in the evaluation process, across roles, so they can assess how well a solution aligns with existing workflows and where change would be required.

Frankenstacks

Often a solution is evaluated in isolation. But you don’t want a collection of incompatible analytics tools, each adopted for its own legitimate reason. The “best tool available” may not be the best for an organization if it doesn’t integrate with other systems.

Lean on the technical experts who will manage the systems and the users who will operate them to score possible tools. Piloting—running a limited real-world trial before a full rollout—is a powerful way to cut through vendor marketing to see how that integration works in practice.

Mistake #3: Falling for the New and Flashy

This is one of the most common failure patterns. A solution looks impressive, has excellent marketing, or is dominating conference expo halls. Teams adopt it because:

“It’s the new standard.”

“Our competitor uses it.”

“It will solve our problems.”

As one of our principal data scientists, Tom Shafer, recently put it, “These days the client wants the shiny tech thing, and we have to say, ‘Are you sure?’” We often need to probe deeper to understand fit. Do they have the data maturity for it to work? Do they understand the technology’s limitations in the context of their business problems? Have they considered the full cost of effective implementation?

Matt Turck’s widely referenced ML, AI, and Data Landscape increases in size every year, and in 2025 even had to be truncated due to “an explosion of new companies and products.” It can be overwhelming to weed through the options to find solutions that will work for your business.

Focus on the business’s problems and opportunities to define capabilities needed in a tool-agnostic way. Then, score analytics options available in terms relevant to your business.
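One lightweight way to put that scoring into practice is a weighted scoring matrix. The criteria, weights, and scores below are all hypothetical placeholders; in practice they would come from your business priorities, pilots, and end-user evaluations:

```python
# Minimal tool-agnostic scoring sketch. Criteria, weights, tool names,
# and raw scores are hypothetical; define yours from business needs.

weights = {                 # relative importance (sums to 1.0)
    "fits_data_maturity": 0.30,
    "total_cost": 0.25,
    "integration": 0.25,
    "usability": 0.20,
}

# Raw scores (1-5) gathered from pilots and end-user input.
candidates = {
    "Tool A": {"fits_data_maturity": 4, "total_cost": 3,
               "integration": 5, "usability": 4},
    "Tool B": {"fits_data_maturity": 2, "total_cost": 5,
               "integration": 3, "usability": 5},
}

def weighted_score(scores: dict) -> float:
    """Combine raw criterion scores into a single weighted total."""
    return sum(weights[c] * s for c, s in scores.items())

ranked = sorted(candidates, key=lambda t: weighted_score(candidates[t]),
                reverse=True)
for tool in ranked:
    print(f"{tool}: {weighted_score(candidates[tool]):.2f}")
```

Writing the weights down before looking at any vendor forces the “new and flashy” conversation into the open: a tool that dazzles on one criterion still has to earn its score on the ones your business actually weighted highest.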

Avoiding These Mistakes

Successful organizations consider these types of questions before adopting any technologies or data solutions:

What are our unique business problems and vision?

What are the hidden costs associated with adoption?

Does this solution fit our data maturity, processes, and scale?

And ultimately, will an investment in this data and analytics solution measurably move our business forward?

Whenever possible, they develop ways to score and objectively evaluate fit to reduce bias and avoid the influence of “new and flashy syndrome.”

A Success Story

A global restaurant brand recently partnered with us to improve observability across its analytics and machine learning ecosystem—moving from noisy, low-context alerts to a clearer understanding of where issues originate and how to prevent them.

Instead of jumping to a single vendor, we led a structured, solution-agnostic assessment of multiple platforms, refining the client’s evaluation criteria along the way so they could judge each option on feature sets, usability, integration needs, scalability, and long-term operational fit.

The result was a confident, future-ready decision that gave their developers and analysts faster insight into system behavior and reduced the time spent diagnosing failures. Just as importantly, the client walked away with an objective evaluation framework they can reuse as new technologies emerge.

Moving Forward

Organizations succeed when they treat analytics as a strategic function, not just a technical one. They tie work directly to business priorities and stay technology-agnostic as needs evolve.

At Elder Research, we believe data and analytics transformation is an ongoing journey—one that’s most successful when it’s collaborative, transparent, and grounded in real business needs. Our team is ready to help you identify your organization’s unique gaps, align your analytics efforts, and turn your data investments into measurable results.

Ready to move from investment to impact? Let’s start a conversation about where your analytics journey can go next. Reach out to our team or read another post in this series linked below.