- AT&T's VP of Data Platforms gave us the low-down on what goes into vetting the more than a thousand AI use cases it is exploring
- Latent bias in generative AI is a key area of concern and one more people in the industry should be thinking about, he said
- The exec also highlighted the importance of data interoperability and AT&T's relationship with Snowflake
Telcos have caught a lot of flak for being slow to adopt artificial intelligence (AI). But they’re not just standing around hemming and hawing. AT&T VP of Data Platforms Matt Dugan told Fierce there’s more going on behind the scenes than meets the eye – a lot more. In fact, he said AT&T is working to vet over a thousand AI use cases. Here’s how.
“We have extremely rigorous governance processes taking every single one of them through a review process with our privacy teams, with our legal teams, with our compliance teams, with our security teams and with our data teams to look and analyze and ask questions of what could go wrong,” Dugan explained. “Only the ones that we say ‘this is reasonable and direct and has the right protections in place’ can go forward.”
According to Dugan, the review process can take months. Use cases that don’t pass vetting are put into a suspended state while AT&T’s teams go back to the drawing board to rethink and tweak them before they’re resubmitted for review. Some use cases, though, will “never pass go,” he added.
This might seem like overkill, especially when you think of presumably simple use cases like call or document summarization. But Dugan noted even these use cases require strict guardrails. Why?
“If you wanted to say summarize a customer call log, well you might need to get certain information out of that call log every time and there’s some kinds of information that should never appear in the call log,” he explained. Think personal information or comments made by a third party in the background of the call.
“An agent knows better. An agent knows ‘oh, I heard something in the background I’m not going to type that in.’ But generative AI is listening to an audio recording, it’s not necessarily listening and discerning,” he continued.
“So, in order to make that kind of a use case something that we could do, we have to really think about how we’re interrogating that use case and ensuring it complies with all the right controls.”
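Dugan didn’t detail how AT&T implements those controls, but a guardrail of the kind he describes might look something like the minimal sketch below: scrub obvious personal information from the transcript before it ever reaches the summarization model. The regex patterns and the `model` object here are illustrative assumptions, not AT&T’s actual tooling.

```python
import re

# Minimal, hypothetical guardrail: scrub obvious personal information from a
# call transcript before it is handed to a generative summarizer. Patterns
# and the `model` interface are illustrative; real controls would go much
# further (named-entity recognition, speaker diarization, etc.).
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(transcript: str) -> str:
    """Replace matched PII with typed placeholders."""
    for label, pattern in PII_PATTERNS.items():
        transcript = pattern.sub(f"[REDACTED_{label}]", transcript)
    return transcript

def summarize_call(transcript: str, model) -> str:
    # Guardrail first, model second: raw PII never reaches the LLM prompt.
    return model.generate("Summarize this call log:\n" + redact(transcript))
```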
But even once a company like AT&T sorts out the nuances of call summarization, that doesn’t mean it can turn around and use exactly the same solution for, say, document summarization. Basically, Dugan said, each application is a distinct use case that needs to be thoroughly evaluated on its own.
(For the record, Dugan said document summarization is one of the use cases AT&T has deployed internally. The tool helps reduce manual activity and errors and speed business processes, he said.)
Looking out for latent bias
One major thing AT&T is looking out for as part of its AI review process is latent bias.
Dugan said he believes AT&T is a leader in the industry when it comes to being able to spot and mitigate bias in traditional AI. But a changing landscape – i.e. the advent of generative AI – is posing new challenges.
There are plenty of examples of generative AI making outlandish suggestions in response to prompts. Those, though, are symptomatic of a larger problem. With generative AI, he said, bias can remain hidden and only rear its head under certain conditions.
“That bias is in there for a reason…it gets to a point where if you just ask the right question or just have the right information coming in in a certain way it triggers the bias that’s hidden, that’s latent inside that model,” he said. “I think this is an area industry really needs to look at and think about.”
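Dugan didn’t describe AT&T’s bias-testing tooling, but one generic way to surface that kind of latent bias is counterfactual probing: sending the model prompts that are identical except for a single attribute and flagging outputs that diverge. The sketch below is purely illustrative; the template, attribute values and `model` object are all assumptions.

```python
from itertools import product

# Generic, hypothetical bias probe: structurally identical prompts that vary
# only in one attribute at a time. Sharp divergence between variants hints at
# latent bias worth a human review. Template and values are invented.
TEMPLATE = "Draft a service-upgrade recommendation for a {age}-year-old customer in {city}."
AGES = ["25", "70"]
CITIES = ["Dallas", "Detroit"]

def probe(model) -> dict:
    results = {}
    for age, city in product(AGES, CITIES):
        prompt = TEMPLATE.format(age=age, city=city)
        results[(age, city)] = model.generate(prompt)
    return results  # downstream: diff the outputs and flag large divergences
```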
Why is he so worried?
The dream is to use AI to run networks and systems autonomously. Right now, there’s a human in the loop to prevent catastrophic errors. But take the human out of the equation in a system-to-system interaction where the AI gets it wrong…well, you can see why operators would want to avoid latent bias crashing their networks.
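In practice, that human-in-the-loop safeguard boils down to a gate like the hypothetical sketch below, where any AI-proposed change above some risk threshold is routed to a person before it touches the network. The `Action` type, threshold and callbacks are invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical human-in-the-loop gate for AI-driven network changes.
# The risk model, threshold and callbacks are invented for illustration.
@dataclass
class Action:
    name: str
    risk_score: float  # 0.0 (benign) through 1.0 (potentially catastrophic)

RISK_THRESHOLD = 0.3

def execute(action: Action, apply_change, request_human_review):
    """Route risky AI-proposed changes to a human before they run."""
    if action.risk_score >= RISK_THRESHOLD:
        return request_human_review(action)  # the human in the loop
    return apply_change(action)              # low risk: run autonomously
```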
Mountains of data
Of course, you can’t really talk about AI without talking about data. So we did.
It’s fairly well known that AT&T has something like 600 petabytes of data crossing its network every day. But it also has multiple petabytes of business, operational, financial and other data it stores in the cloud.
Dugan said AT&T used to run its own data centers, but has moved much of its stored data to Snowflake’s platform. Why? Because there are a lot of hardware, security, power and facilities costs associated with running your own data centers, Dugan said. And if you’re not selling that capacity, it’s a direct cost rather than one tied to a revenue stream.
Plus, when you run your own data centers, you have to build to peak load to meet the needs of that one day a year where there’s a holiday sale or some other major event. By moving to Snowflake, AT&T can now consume cloud storage on-demand and scale to peak load only when necessary.
But there’s one more big reason Dugan said AT&T likes Snowflake’s platform: data interoperability.
More specifically, Snowflake supports Apache Iceberg, an open table format that lets customers tap into data stored in third-party data lakes.
“The main thing as it related to AI is making sure that you’re not introducing bias by not having a complete view of your data. For us that means getting to an interoperable view of data storage” to avoid creating islands of data in different clouds and formats, Dugan explained. Interoperability can also help AT&T find and reduce redundancy and clean up orphaned data, he said.
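To make the interoperability point concrete: because Iceberg is an open format, the same table can be read by engines outside the platform that wrote it. Here’s a minimal sketch using the open-source PyIceberg library; the catalog URI and table name are placeholders, not AT&T’s actual setup.

```python
# pip install pyiceberg
from pyiceberg.catalog import load_catalog

# Placeholder catalog settings; a real deployment would point at its own
# Iceberg catalog (REST, Glue, Hive, etc.).
catalog = load_catalog("demo", **{"uri": "https://example.com/iceberg-catalog"})

# Any Iceberg-aware engine (Snowflake, Spark, Trino, PyIceberg) can read the
# same table, which is the interoperability Dugan is describing.
table = catalog.load_table("analytics.call_summaries")
arrow_table = table.scan().to_arrow()  # pull rows into an Arrow table
```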
The exec warned that not every company in the data space shares that priority: “There are a lot of companies that are in this data and analytics and processing space that are really interested in having some level of ecosystem lock-in. They see that as a protectionist action for their business.”
“I think especially for enterprises like ours what [those] companies are going to realize is when they’re worried about total contract value, they should really be worried about that TCV going to zero if they’re locking away an enterprise’s data,” he concluded.