Blog

Efficiency vs Effectiveness: The AI Conversation All Insight Leaders Need to Have

Six weeks. One week. One day. Even faster. 

That is the research industry’s favorite progress story, and it is not wrong. The problem is what happens after the speed arrives. 

On Research Revolutionaries, host James (JT) Turner, founder and CEO of Delineate, spoke with Tina Tonielli, a senior insights and analytics leader and former North America Lead for Consumer and Business Insights and Analytics at Haleon. She described a moment a lot of insight leaders recognize instantly: the data shows up faster than the organization can absorb it. You are moving at “machine speed” on the output, but the business is still running on human rhythms. 

“You were getting the information so fast that you couldn’t even process it.” 

That is the moment “faster” stops being a benefit and starts being a weight. 

Tina has spent her career on the client side, inside the brands where the decisions land. She also came into insights through marketing, which gives her a “useful bias”. She is less interested in whether the technology is impressive and more interested in whether it changes what the business does next. 

She does not argue that AI is bad. She argues that speed, by itself, is not the same as better. 

You can watch or listen to the full podcast episode here: https://www.research-revolutionaries.com/e13-cheaper-faster-but-is-it-better-ais-impact-on-consumer-research/  

When Fast Turns Into Heavy

GenAI did not create the speed obsession, says Tina. It just accelerated it. 

Insights and analytics have changed more in the last two years than they did in the prior ten. Before generative AI, machine learning and big data were already shrinking timelines. The industry kept pushing the same promise, wave after wave, until “faster” stopped sounding like an advantage and started sounding like a load. 

“It used to take six weeks. Now we can get it in a week. Now we can get it in a day. Now we can get it even faster.” 

Inside a business, that curve doesn’t always feel like progress. It can feel like volume. 

“You were just being bombarded with data and overwhelmed by data. You’re just sitting under an avalanche of data, and you can’t activate it.” 

That’s the gap that matters. The work isn’t finished when the output arrives. Someone still has to make sense of it, decide what matters, and carry it into the moment where a decision gets made. The organization doesn’t absorb insight continuously. It absorbs it when people come together and commit to a call. 

“People are making decisions in forums and with some degree of process.” 

JT recognizes the same pressure from the real-time side. Brands aren’t short on signals. They’re juggling enormous numbers of sources, sometimes conflicting. Another faster feed can land as one more thing to reconcile, not one more thing to act on. 

At some point, the question stops being “How fast can we get it?” and becomes “Where does this go, inside the business, when it arrives?” 

The Slow Part: Readiness, Procurement, Onboarding

If you want to watch a room deflate, bring up procurement in a conversation about innovation. 

JT brings it up anyway. The slow bits are often not research but the steps around it: procurement, purchase orders, onboarding. The mechanics of getting a partner through the door and approved. 

Tina tells the story the way most people experience it, which is why it lands. She asked a supplier partner a simple question. If she had a magic wand, what would help them move faster? The answer was not about methods or models. 

“If it takes you eight months to get through the onboarding process with a client, then you’re almost shooting yourself in the foot.” 

This is where the “cheaper and faster” promise breaks down in practice. You can compress fieldwork and automate analysis. If the organization still takes months to start, speed becomes a detail. It’s not that the tech isn’t working. It’s that the system around it isn’t built for tempo. 

It also explains why teams can feel stuck in a loop. Leadership wants more speed and more output. The processes that govern suppliers and data access don’t change. The result is predictable. People squeeze harder on the parts they can control, then wonder why the whole thing still feels slow. 

Where “Always On” Helps and Where It Hurts

Tina doesn’t reject “always on.” She rejects the way it is talked about. 

“We love words like always on,” she says, “and nobody’s making a decision always on.” 

That line doesn’t argue against always-available data. It argues against pretending the business runs on a constant decision loop. It doesn’t. It runs on meetings, cycles, and moments where people are actually authorized to make a call. 

That’s why she keeps coming back to starting with the decision, not the feed. 

“It’s not about having data all the time. It’s about having data exactly when I need it to make that business decision.” 

If the organization meets quarterly to make decisions, then paying for weekly reporting that no one uses is not progress. It’s just a habit. 

So “always on” splits into two different realities. 

One is always-available insight, connected to the real world, ready when the business needs to answer something fast. That’s the version most insight leaders want, and it’s where always-on systems earn their place. You don’t want to wait six weeks for procurement, onboarding, setup, and fieldwork when the question is urgent. 

The other is always-on reporting, a constant stream arriving whether or not anyone is in a position to act. If the business only reviews and decides quarterly, then running something weekly “because we can” becomes expensive noise. 

JT adds a nuance that fits the way modern teams actually want to work. Sometimes you do need an “always on” posture, not because you want a daily stream, but because you want readiness: the ability to spin up fast when the moment hits. The part that slows you down is not always research time; it’s everything around it. 

This is where the Delineate idea of an always-on connection to the real world makes sense. Always on doesn’t have to mean “always shouting.” It can mean “always available,” with timing that matches how decisions really get made. 

Efficiency vs Effectiveness

“There is an efficiency play and there is an effectiveness play.” This is the line Tina uses to separate the useful from the noisy. 

Most AI use in insight work starts with efficiency because it is immediate and easy to justify. Summaries. Drafts. Faster turnaround on the things that used to burn hours. 

That’s helpful. It’s also where the trap forms, quietly. A summary can create a sense of completion without doing the hard part of the job. The hard part isn’t compressing information. The hard part is deciding what matters, what changes a recommendation, and what the business should do next. 

Effectiveness is different. It sits closer to judgment. 

Tina describes wanting help pulling insight across different places, making meaning, and getting to the real problem behind a brief. She keeps returning to business decisions, not because she dislikes technology, but because she’s seen what happens when output outpaces action. 

“I feel like I have so much data already.” 

More respondents, more streams, more dashboards can feel like more overwhelm. The need is not more volume. It’s help shaping the story and clarifying the problem statement, so the work has a chance of being used. 

JT echoes the same need from the supplier side. Clients want speed, but they also want transparency on why numbers move. They want confidence that the metric means the same thing over time, and that it connects cleanly to decisions. If the output can’t be explained, it doesn’t matter how fast it arrives. 

Efficiency buys you time. Effectiveness earns you trust. 

Where AI Starts to Get Interesting

Tina doesn’t pretend anyone can map the next few years with certainty. 

“Generative AI is a foundational technology like electricity was,” she says. If you had asked people right after electricity arrived what it would enable, you would have blown their minds. The point is not predicting perfectly. The point is experimenting, learning, and building capability as you go. 

In practice, that means trying, failing, sharing what you learn, and keeping a team curious without letting standards slip. 

That’s where she draws the next distinction. There’s the straightforward efficiency use case, like summarizing. And then there’s the more interesting effectiveness use case, where models help you work higher up the value chain. Not replacing judgment, but lifting the starting point. 

That’s where agents and training come in. If you can teach a model your organization’s way of working, and feed it the differentiating knowledge your team has built up over the years, that becomes a real advantage. 

“If you can teach it your way of doing things, that agent is your new IP.” But “it needs to partner with a human.” 

That’s also why she’s cautious about AI that simply generates more respondents or more “data,” especially if your team is already buried. More volume doesn’t help if you’re struggling to connect the dots and influence decisions. 

AI earns its keep when it helps the team do the work the team is actually struggling with, not when it helps the team produce even more output. 

Better Is Not a Speed Metric

The episode keeps returning to the “unsexy work that makes the sexy work possible”: data strategy. 

What business questions do you always need to answer? What should be standardized and automated? What work should stay human-heavy because it’s strategic and messy and requires judgment? 

She’s clear about what gets skipped when everyone rushes to the tool. 

“We jumped to the AI and skipped the not so sexy part of why do we need this data?” 

JT reinforces the same need from another angle. When tracking becomes a data stream into a broader marketing data environment, the bar rises. Consistency. Repeatability. Clear definitions. Transparency. Without those, the system produces argument instead of clarity. 

Then Tina names the personal capability cost of getting this wrong. When teams get spoon-fed by tools, they can lose the muscle that makes them valuable. The skill she puts first is not technical. 

“The number one skill… is a growth mindset.” 

Everyone needs to stay learnable, and the skills that make insight move in the room are still the ones that have always mattered: interpretation, storytelling, influence, relationships. 

“None of us know what we’re doing.” 

The people who look like experts are usually the ones who have tried things, failed, learned, and kept going. That’s the posture she’s advocating. Experiment, keep your standards, share learnings, and don’t let efficiency become the headline if effectiveness is what the business actually needs. 

Because “better” is not a speed metric. It’s whether the work changes a decision, at the moment a decision is being made. 

You can watch or listen to the full podcast episode here: https://www.research-revolutionaries.com/e13-cheaper-faster-but-is-it-better-ais-impact-on-consumer-research/  
