From the CGA Roundtable: Joe Eberly on the Real Barriers to Better Utility Mapping

Written by

Chris Garafola

Published on

November 12, 2025

Industry Insights

When Joe Eberly, President and Chief Growth Officer at 4M, sat down for a recent Common Ground Alliance roundtable discussion on excavation safety, the conversation quickly turned to the hard questions surrounding utility mapping technology. What emerged was a frank assessment of what’s holding the industry back—and what it will take to move forward.

The Roundtable brings together utility industry leaders for unscripted discussions about excavation damage prevention. This session tackled one of the most persistent frustrations in damage prevention: why, despite abundant technology, poor maps remain a top concern for all stakeholders.

Is Poor Mapping Really a Technology Problem?

Joe cut straight to the heart of the issue. “I think the relevant issue is not that there’s a lack of mapping systems or a lack of technology,” he explained. “A lot of it has to do with the data that feeds the maps, but the industry is also struggling with the processes or the work that it takes to get the data to the maps.”

He described the usual problem: utility work gets captured in as-builts at the end of a project, but then faces a gauntlet of steps before making it into the mapping system. “Think about the time, the effort, the resources that it takes and the steps in the process to get that data into the mapping system at some point. That can be weeks for those who do it efficiently, or that could be years and years for those who unfortunately don’t have the ability to work through that process in an efficient way.”

The diagnosis? “It’s not the mapping technology that is the biggest challenge. It’s really the processes and the workflows that enable the maps themselves.”

Reactive vs. Proactive: A Fundamental Problem

Joe identified a critical flaw in how the industry approaches mapping updates. He walked through the typical process: “A strike happens. It’s recorded, there’s an incident report, an audit is done, all these different investigations are done. Ultimately, that ends up back in the hands of the GIS or mapping folks at the utility owner, and then it gets updated into a map in the case that there was a mapping error that occurred on the specific project.”

The fundamental problem with this approach is that “you will never get to a point where your map has been updated to a place where you have full confidence. Why? Because it’s reactive. It only happens after an event occurs.”

Joe’s advice: “There needs to be a fundamental shift in a proactive manner—there’s no incentive to go in, spend the time, the resources, and the money to update your maps in a proactive way versus how it’s done today, which is just to react when damage happens. It’s a fundamental problem across the industry.”

Why Technology Providers Must Evolve Beyond “Just Mapping”

Joe applied the same scrutiny to mapping technology providers: “I also think there’s a good deal of culpability on the technology firms that historically have put out mapping capabilities.”

The problem? These companies formed their businesses around a simple value proposition: “Use our maps, put your data in here, and we’ll represent it visually to you so that you have a pictorial look at where your assets are.”

But Joe sees the need for an evolution. “I think a fundamental shift needs to take place, and it’s about mapping companies or technology providers that provide the maps taking their place in the damage prevention space to say, ‘We can do more.’”

He pointed to AI as a catalyst for this transformation: “Now with the introduction of AI into utility mapping, like some of the things 4M and others are doing out there, we can shift the mindset from, ‘We’re just a mapping provider,’ to ‘We’re a utility intelligence provider.’”

The expanded role means providing actionable insights, not just visualizations: “We can give you information about risk. We can do all kinds of things that previously weren’t available and that can help the overall challenge that we have with damage prevention in the industry as a whole.”

Joe emphasized the untapped potential already sitting in utility databases: “We have a responsibility to take the technology to the next level, and present that to consumers of mapping technology to show them that there’s value embedded in the data that you already own and from which you can extract information.”

Perfect vs. Good Enough Utility Data

Perhaps the most critical point Joe raised during the discussion centered on an industry-wide mindset challenge: the pursuit of perfect data at the expense of good-enough data that could deliver immediate value.

“I want to go back to this idea of accuracy or being perfect, versus: is the data good enough for me to do the job that I need to deliver? Whether it’s early planning on a project, design work, or feeding data to the locators that go out there and make that data better through precise locating.”

He invoked a familiar phrase: “Don’t let perfect get in the way of the good—or really good or good getting better…When I’m out and about and talking with folks, the number one question that I get asked about mapping technology in general when I’m describing what 4M is doing is, ‘What about accuracy?’”

Joe argued for matching data quality to use cases: “There needs to be an acknowledgement that centimeter accuracy is important, but it has a time and a purpose…when I go to put a shovel in the ground…For everything prior to when that occurs, don’t let the perfect get in the way of the good enough.”

This isn’t about lowering standards—it’s about recognizing that different stages of a project have different accuracy requirements. Planning and design work don’t need centimeter-level precision. By insisting on perfection across the board, the industry creates bottlenecks that prevent any progress at all.

Reducing Barriers: Technology at Scale, Costs at a Fraction

When the conversation turned to costs and incentives, Joe pushed back against the notion that better data must be prohibitively expensive. “The cost of the data is proportional to the cost that it takes to collect and process the data. So if you can reduce the collection and processing side of it, naturally the cost of acquisition of that data will go down.”

Joe acknowledged that using today’s methods wouldn’t deliver value: “There’s a negative return on investment and probably not a lot of value if we use today’s methods of data gathering.”

But innovative tech changes the equation entirely: “With newer technology, you can do it at scale and you can do it efficiently at a fraction of the cost that it takes to do today.”

The challenge for technology providers, Joe emphasized, comes down to execution: “There is a proportionality there that, again, is culpable on the tech providers to be able to drive the overall cost of aggregation of data, processing of data, and dissemination of that data at a price point that can make sense for a contractor or an excavator.”

The Untapped Archive: Utility Data Already Exists

Joe highlighted a surprising reality: utilities already possess much of this data but aren’t leveraging it. “In a lot of cases, the utility owners already have that data. Most utility owners require their locators to take pictures nowadays or videos of the markouts that they do on their assets,” he pointed out.

When asked if utilities are capturing this data, Joe confirmed: “In the last 60 days, I’ve talked with the biggest ones here in the U.S., and they all have an archive of that data where it just sits in a file server somewhere unused.” This represents a massive untapped resource sitting dormant in storage.

Crowdsourcing Data: A Public Utility Database?

When the conversation turned to crowdsourcing and getting the utility industry to embrace shared data, Joe floated some interesting ideas.

“One way to do it is to make their data publicly available. Imagine a national public utility database—that’s one idea. You have to figure out what kind of business you build around something like that, but it’s certainly possible.”

Joe expanded on the concept of truly distributed data collection: “The data is not just where the utility lines are. On-surface objects and all kinds of other things can feed into a data model. So where do you get that data? There’s remote detection and all kinds of technologies that can feed data into that model, but what about creating a public app where people can take pictures of infrastructure and load it right in there?”

The scale could become transformative: “Now you’re talking about a hundred million providers of data into a common data environment, and every time they feed in there, you put five cents into their Starbucks account or something. That’s true crowdsourcing of data that has no limitations.”

What If Asset Owners Had Immunity for Sharing Underground Data?

One of the most provocative questions in the roundtable was, “What if legislation granted asset owners complete immunity from liability when they share their utility data?”

Joe’s response was unexpected—he suggested the question might soon be irrelevant. “Asking for data or getting [asset owners] to give out data is going to go by the wayside, because the way that technology is coming along, we don’t need the data. We can collect the data on our own, and chances are pretty good that it can be just as or more accurate and complete.”

He pointed to emerging capabilities that could make the liability question moot: “Can you map a city with no record sources and come up with an 80, 85, 90 percent complete and accurate map of a water system or a gas system within a city? The technology is already there and it’s being used today.”

This suggests a future where the debate over data sharing becomes irrelevant—not because liability concerns are resolved, but because alternative data collection methods bypass the need for owner-provided records entirely.

The Risk Mitigation Opportunity: Giving Excavators Better Data

One of Joe’s most practical suggestions addressed risk mitigation at the excavator level, where the consequences of poor data are most severe.

“When it comes to the risk for the excavator, what has prevented the excavator from taking their own measures to mitigate any risk or to QA/QC the data that they receive?” Joe asked. “First, it’s time consuming, and it’s expensive to be able to get your hands on the data.”

His proposed solution flips the script: “Why can’t we solve for mitigation of risk at the excavator level by giving them instant access to an alternative data set that might show something different, or that might indicate something different than what’s been given to them, along with all the risk?”

Building Trust Through Repeatability

When asked directly whether the issue was a lack of technology or change resistance, Joe was unequivocal: “The technology is here and it’s going to continue to advance and get faster, get more efficient, get cheaper, deliver better results. But all those challenges that were mentioned, those take time to overcome.”

For Joe, the path forward comes down to trust. “Over time, you build trust in your technology or your deliverables or your data, and then it becomes repeatable. Once it becomes repeatable, it can get adopted in the industry.”

From Reactive to Proactive: What Are Damage Prevention Teams Really Doing?

Joe challenged utilities to take an honest look at how their damage prevention teams spend their time.

“If you look at a damage prevention team at a facility owner utility, and take a look at what they do week over week, month over month, on a yearly basis—is it truly damage prevention? Or are they working on utility damage investigations and reporting and other types of things?”

The implication is clear: teams spending most of their time investigating past damages don’t have the capacity to prevent future ones.

He acknowledged the financial realities: “There has to be an ROI for the CFO to be able to make significant investments, to be able to turn that methodology and mentality to a much more proactive approach to damage prevention, as opposed to a reactive one.”

A proactive approach to data quality could shift that balance, freeing damage prevention teams to actually prevent damage rather than simply document it.

The Bottom Line

Joe’s contributions to the CGA roundtable discussion painted a clear picture of an industry at an inflection point. The technology exists to dramatically improve utility mapping and damage prevention. The costs can be managed through scale and efficiency. The methods are proven.

What’s missing isn’t capability—it’s willingness and trust.

Willingness to accept that “good enough” data can deliver real value while working toward better data.

Willingness to shift from reactive damage investigation to proactive data improvement.

Willingness to recognize that technology providers must evolve beyond simple visualization to become utility intelligence providers.

And most critically, willingness to build trust through repeatability—demonstrating value quickly, consistently, and at scale.

“The technology is here and it’s going to continue to advance and get faster, get more efficient, get cheaper, deliver better results,” Joe concluded. The question isn’t whether we can improve utility mapping and damage prevention. It’s whether we’re ready to embrace the tools that can make it happen—and whether we can build the trust necessary for widespread adoption.

Want to learn more about 4M’s approach to utility intelligence and damage prevention? Get a demo today.  

Chris Garafola

Brand and Content Leader

With over a decade of experience spanning agencies and innovative startups, Chris is a dedicated content marketing leader, driven by the belief that content isn't just about consumption; it's about leaving a lasting impact on the person who engages with it.

Inspired by 4M's mission to create the first online database of subsurface utilities in the U.S., Chris is eager to illuminate one of the infrastructure industry's most pressing issues and champion innovative solutions that deeply resonate with general contractors and civil engineers to address these challenges.
