By Sean Perry, Chief Information Officer at Kelly
I keep hearing the same thing from companies trying to implement AI: “We can’t start until all our data is in a data lake.”
That made sense years ago, when machine learning meant building your own models and training them for days on centralized data.
It doesn’t work that way anymore.
The way to think about it now is: “I have AI that needs access to data where it already lives.”
Recency matters more than consolidation. When someone asks an AI assistant about a client, they expect it to see the same CRM record that was updated an hour ago—not a stale version that syncs overnight.
According to IBM, 80% of companies admit to making decisions based on out-of-date information, which leads to missed opportunities, operational inefficiencies, and competitive disadvantage. When AI is powered by outdated data, it produces unreliable outputs that undermine efficiency, trust, and ROI. As IBM puts it, "Without real-time data, AI is like a GPS running on last week's traffic updates."
Technologies like the Model Context Protocol (MCP) make that possible. MCP lets AI connect directly to systems like HubSpot or Workday in real time, retrieving only the information it needs, in context. Instead of shuffling data across the organization, it goes to the source, pulls what’s relevant, and responds.
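To make that concrete, here is a minimal sketch of the client side of such an exchange, using the official MCP Python SDK. The server command, the get_client_record tool, and the client_id argument are hypothetical placeholders rather than a real HubSpot or Workday connector; the point is that the assistant discovers what a source system offers and requests only the record it needs, the moment it needs it.

```python
# Minimal client-side sketch using the official MCP Python SDK ("pip install mcp").
# The server command, tool name, and arguments are hypothetical placeholders.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# An MCP server wrapping a CRM, launched as a local subprocess for this example.
crm_server = StdioServerParameters(command="python", args=["crm_mcp_server.py"])

async def main() -> None:
    async with stdio_client(crm_server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the source system exposes...
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # ...then pull only the record that is relevant right now.
            result = await session.call_tool(
                "get_client_record",              # hypothetical tool name
                arguments={"client_id": "12345"},
            )
            for block in result.content:
                print(block)

asyncio.run(main())
```

No data lake sits in the middle: the assistant holds a live session with the system of record and asks a question.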
If a CIO says they can’t move forward until their data lake is ready, someone sold them the wrong solution. That approach belongs to another era—and it takes far too long to deliver value. The goal isn’t to move data around. It’s to make the systems you already rely on accessible to AI.
Gartner estimates that 80% of enterprise data lakes fail to deliver their expected ROI because their contents are poorly cataloged and untrusted. In a 2024 Salesforce survey of 150 enterprise CIOs, 84% agreed AI will be as revolutionary as the internet, yet only 11% said they had fully implemented AI in their organizations. Security and data infrastructure challenges were cited as the leading hurdles.
CIOs report spending a median 20% of their IT budgets on data infrastructure and management versus just 5% on AI initiatives—effectively investing four times more in "getting data ready" than in the AI tools that use the data. Meanwhile, 68% say business stakeholders now have unrealistic expectations for quick AI ROI despite this slow groundwork.
Some use cases—like reporting and historical analysis—still benefit from centralized data. Organizations with well-governed centralized data (where it truly serves a purpose) have achieved 2.5× faster model deployment than their peers. But for day-to-day work, the question is: where should information live so that it supports how people actually work?
At Kelly, our salespeople spend their day in SalesHub, so that’s where we pull in the data they need. Sure, they could check Workday to see timesheets or placements—but that slows them down. Soon, we’ll be able to surface that information right inside SalesHub.
Context switching is a silent productivity killer. A Harvard Business Review study found that the average digital worker toggles between applications and websites nearly 1,200 times per day, accumulating almost 4 hours per week reorienting after these switches—equivalent to five working weeks per year lost to this "toggle tax."
When the data your teams need lives in the same system they use, the AI built into that system can immediately act on it—whether it’s marketing to a client, matching talent, or finding former employees worth rehiring.
The AI ecosystem moves quickly. MCP is barely six months old, but major vendors like HubSpot and Workday have already adopted it—and others, like Bullhorn, will follow. Our focus is connecting these systems so AI can move fluidly between them.
MCP allows AI assistants to call APIs, query databases, and interact with applications in real time, whether to retrieve supplementary information or to execute tasks. For example, using MCP, an AI could ask “What’s the latest interaction with this customer?” and get live CRM data to incorporate into its response. It can similarly update records or trigger workflows, acting as a bridge between systems that traditionally didn’t talk to each other.
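The server side of that bridge can be small. Below is a hedged sketch of an MCP server that exposes exactly that “latest interaction” question as a tool, built with the FastMCP helper in the official Python SDK. The CRM endpoint, query parameters, and response fields are hypothetical stand-ins for whatever a vendor such as HubSpot or Bullhorn actually exposes.

```python
# Sketch of an MCP server exposing one CRM question as a tool.
# Built with the official MCP Python SDK; the CRM REST endpoint and
# response fields below are hypothetical stand-ins, not a real vendor API.
import os

import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("crm-bridge")

CRM_BASE_URL = os.environ.get("CRM_BASE_URL", "https://crm.example.com/api")

@mcp.tool()
def latest_customer_interaction(customer_id: str) -> str:
    """Return the most recent logged interaction for a customer, straight from the CRM."""
    response = requests.get(
        f"{CRM_BASE_URL}/customers/{customer_id}/interactions",
        params={"sort": "-timestamp", "limit": 1},
        timeout=10,
    )
    response.raise_for_status()
    interactions = response.json()
    if not interactions:
        return f"No interactions on record for customer {customer_id}."

    latest = interactions[0]
    return f"{latest['timestamp']} ({latest['type']}): {latest['summary']}"

if __name__ == "__main__":
    # Default transport is stdio: the assistant launches this process and
    # exchanges JSON-RPC messages with it over stdin/stdout.
    mcp.run()
```

Updating a record or triggering a workflow works the same way: another tool, another live call to the source, with no copy of the data created along the way.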
We also watch how employees use our internal AI tool, Grace, to find ways to make their work easier. When we notice patterns—like users frequently reformatting resumes—we turn those into one-click automations with best practices baked in.
Over 40% of employees report using AI tools like ChatGPT for work tasks—a number that has nearly doubled in two years. Right now, most people still copy text from one system, paste it into ChatGPT, copy the result, and paste it back. It works, but it’s clunky.
The next version of Grace eliminates that. It lives in the browser, can see what users see, and acts on it directly. When someone opens a candidate page, Grace will ask, “Reformat the resume and create a summary?” One click, done. No downloading, no pasting.
That's where adoption grows—when AI removes steps instead of adding them. A McKinsey Global AI study found that while AI adoption has doubled since 2017, only 55% of companies have successfully embedded AI into their day-to-day business processes. The study notes that "the challenge isn't the algorithms—it's integrating AI into human workflows."
Moreover, a Stanford study on human-AI interaction found that systems designed to collaborate with users (augmenting their workflow) achieved nearly double the adoption rates of solutions that aimed to fully automate or replace human work without integration.
A scalable data foundation for AI doesn't mean warehousing everything before you start. It means giving AI real-time access to the systems where your data already lives, surfacing that information inside the tools people already work in, and centralizing only where it genuinely serves a purpose, such as reporting and historical analysis.
The organizations making progress with AI aren't waiting for perfect infrastructure. They're connecting what they have, learning from how people use it, and improving as they go.
The companies that thrive in the AI era will be those that connect what they have and keep improving—rather than those that delay value in pursuit of an ideal that may come too late.