Every year, L.A. drivers each spend 128 hours imprisoned in freeway traffic, among the worst figures in the nation, and all that idling siphons $19 billion out of the L.A. economy. To put that in perspective, that’s the size of NASA’s budget.
If each of those irate, captive drivers were a piece of data, the whole Los Angeles sprawl would be a fitting diagram of most B2B data infrastructure: slow, archaic, maddening, and wasteful. Data with a purpose, not getting where it needs to go. Bad data like that costs the average company $15 million per year.
Like L.A.’s poor transit authorities, most IT teams have a reason: legacy infrastructure. Also like L.A.’s transit authorities, IT teams frequently fall into the trap of thinking that adding more and bigger connections can actually make things better.
The law of capacity
L.A. is subject to an unending parade of politicians promising to tame the chaos. Wider bike lanes. A real metro. Underground tunnels. Flying Ubers. But every time the city adds transit capacity, things only grow more chaotic. When L.A. added an additional lane each way to its torturous-at-any-hour 405 freeway, the transit authority’s own study found the project actually slowed commutes by one minute. Expanding bike lanes had the same effect. So did all those dockless electric scooters.
The problem is rampant individualism: a chaotic mass of unique individuals with unique modes of transit, all trying to go their own way. The more ways to move and the more places to go, the messier the transit map grows, and the more people get stranded in parts of the city they don’t want to be in. The more capacity L.A.’s planners add, the more people want to use it, and so everything worsens. A team of economists named this the Fundamental Law of Road Congestion: adding capacity doesn’t reduce the chaos, because new capacity induces just as much new driving.
So too with data. Perhaps we can call it the fundamental law of data connections: adding more systems and connectors slows syncs and degrades data quality.
Blame the ballooning number of apps teams use. The average B2B company uses dozens of SaaS tools, the defining feature of which is their connectedness. We pushed on-premises systems into the cloud so they could all talk. But we still use the same legacy architecture (APIs, webhooks, and data pipes) from back when Salesforce’s homepage still looked oddly reminiscent of Barnes & Noble.
The data connector market is entirely fractured. Each data pipe does things slightly differently. The standard mode of transfer is unidirectional—data in, not out. And what does it mean to pipe data in, anyway? Despite lots of talk of “single source of truth,” can you point to yours now and say that it’s inviolable? If there are 14 systems linked to your CRM, can it play air traffic controller to decide which system has primacy for each data field?
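To make the air-traffic-controller idea concrete, here’s a minimal sketch of field-level primacy: every field gets an ordered list of systems allowed to win a conflict, and a resolver picks the value from the highest-priority system that has one. This is purely illustrative, not any particular product’s method; the system and field names are invented for the example.

```python
# Minimal sketch of field-level "primacy" rules. For each field, an
# ordered list of systems that are allowed to win a conflict.
# All system and field names here are hypothetical.

FIELD_PRIMACY = {
    "email": ["marketing_automation", "crm", "support_desk"],
    "phone": ["crm", "support_desk", "marketing_automation"],
    "title": ["enrichment", "crm"],
}

def resolve(field, values_by_system):
    """Return the value from the highest-priority system that has one."""
    for system in FIELD_PRIMACY.get(field, []):
        value = values_by_system.get(system)
        if value:  # skip systems with a missing or empty value
            return value
    return None  # no authoritative system had a value

# Example: three systems disagree about a contact's phone number.
conflicting = {
    "marketing_automation": "555-0100",
    "crm": "555-0199",   # CRM has primacy for "phone"
    "support_desk": None,
}
print(resolve("phone", conflicting))  # -> "555-0199"
```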
Most teams don’t have this flexibility, and the result is Carmageddon on your company’s data backbone, with data quality as the casualty. Data takes one-way trips and can’t return. Duplicates pile up. Records get stranded. Apps disagree with one another. Aging systems fail. Wrecks grind your team to a halt.
It’s no wonder that salespeople spend 900 hours each year on administrative tasks like verifying contact information. Or that marketing teams spend 800 hours each year on data cleanup. Or that 89 percent of companies are increasing their IT budgets to keep the data overpasses from collapsing. Every company has its own quiet version of L.A.’s $19 billion siphon.
Yet decades ago, for two and a half glorious weeks, L.A. traffic was chaos-free, and the way the city pulled it off offers lessons for how we can all clean up our data.
Data quality control
In July of 1984, L.A. drivers experienced the unparalleled joy of gliding down the city’s ample freeways in peace, despite an additional 650,000 temporary residents that month. The Summer Olympics had come and, fearful of athletes missing their events because they were stuck in traffic, the city cleared the roads with a squadron of 550 additional route-to-route buses that ferried people night and day between the most important locations, above all L.A.’s new focal point, the Coliseum.
A few factors made this an unqualified success:
- There was one central hub. All roads lead to the Coliseum.
- The engineers prioritized routes. Unimportant routes received no attention.
- A central authority made it happen. Drivers needed guidance and enforcement.
- Many residents left for the week. Nonessential traffic dropped off.
What might your data environment look like with the same forced reorganization? A single, central data authority to manage routes, and a hierarchy of importance to prioritize traffic. It’d offer an entirely new view of your data: you’d see it in terms of quality, not connections. Not “how many lanes do we have?” but “how complete, accurate, and available is the data in every system that needs it?”
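As a thought experiment, here’s what that quality-first view might look like in code. This is a minimal sketch under stated assumptions: the required fields, system names, and example record are all invented for illustration.

```python
# Purely illustrative: score records on quality instead of counting pipes.
# "Completeness" = required fields populated; "availability" = present in
# every system that needs the record. Field and system names are assumptions.

REQUIRED_FIELDS = ["email", "company", "owner"]
REQUIRED_SYSTEMS = {"crm", "marketing_automation", "billing"}

def completeness(record):
    """Fraction of required fields that actually hold a value."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f))
    return filled / len(REQUIRED_FIELDS)

def availability(record_systems):
    """Fraction of required systems that currently hold the record."""
    present = len(REQUIRED_SYSTEMS & record_systems)
    return present / len(REQUIRED_SYSTEMS)

record = {"email": "ada@example.com", "company": "Example Inc", "owner": None}
systems_with_record = {"crm", "billing"}

print(f"completeness: {completeness(record):.0%}")               # 67%
print(f"availability: {availability(systems_with_record):.0%}")  # 67%
```

Scores like these replace “is the pipe connected?” with “is the record complete and everywhere it needs to be?”, which is the question a central data authority actually has to answer.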
With that mindset, I’m convinced IT teams would unlock a tremendous amount of value for the entire organization. You’d reclaim thousands of sales, marketing, support, product management, and business analyst hours, and more of your hard-won data would actually be supporting the customer lifecycle instead of languishing in silos. It’d take work, and a data-management mindset shift. But unlike L.A.’s squabbling politicians, you’re actually free to try something new and, if it works, never revert.
About the author: Nick is a CEO, founder, and author with 25 years of experience in tech who writes about data ecosystems, SaaS, and product development. He spent nearly seven years as EVP of Product at Marketo and is now CEO and founder of Syncari.