Why is the edge the “Goldilocks” location for AI?

BT talks monetizing edge AI, creating a global connectivity fabric and the future of conversational networks

Back in June at a telecoms industry show, BT CTO Colin Bannon referred to the edge as the “Goldilocks location for AI.” He expanded on this topic, and more, during the recent Telco AI Forum 2.0, available on demand here. The current “thesis,” he said, is that there’s a “gold rush of investment” in data center infrastructure for large AI model training in a market that’s maturing very rapidly, albeit with “still evolving” use cases and ROI.

Bannon said that as training scales, inferencing follows and latency comes to the foreground. “Up until this point with large language models, you had textual-based query and responses that are not latency sensitive. As you move to voice interaction and multimodal and agentic, all of a sudden latency becomes important. Therefore the topology and the location and where you put your engines for inference” similarly become important.
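
To put rough numbers on that latency point, consider a back-of-the-envelope comparison of propagation delay alone. The distances, turn budget and inference time below are illustrative assumptions for this article, not BT measurements, but they show why the physics starts to bite once interactions become conversational.

```python
# Rough, illustrative latency-budget sketch: why inference location matters once
# interactions become voice-based or agentic. All figures are assumptions for
# illustration, not measurements from BT or any specific network.

FIBER_SPEED_KM_PER_MS = 200.0   # light in fiber covers roughly 200 km per millisecond

def propagation_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fiber, ignoring queuing and processing."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Assumed distances: a far-away hyperscale region vs. a metro edge site.
central_cloud_km = 3000
metro_edge_km = 50

# Assumed per-turn budget for a natural-feeling voice interaction (~200 ms),
# with a fixed allowance for speech processing and model inference.
turn_budget_ms = 200
inference_ms = 120

for name, km in [("central cloud", central_cloud_km), ("metro edge", metro_edge_km)]:
    rtt = propagation_rtt_ms(km)
    headroom = turn_budget_ms - inference_ms - rtt
    print(f"{name:>13}: RTT ~{rtt:5.1f} ms, headroom left ~{headroom:5.1f} ms")
```

Under these assumptions the distant site burns 30 ms of the budget on the speed of light alone before any queuing or processing, while the metro edge spends well under a millisecond, which is the gap Bannon is pointing at.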

Bottom line: “You’re going to see models of value closer to the edge, closer to the users, as the power of AI continues to exponentially grow.” For CSPs, it’s time to decide how, when and where to invest because “what’s undeniable is the gold rush that’s going on.”

Talking through advancements in the optical layer, the types of hybrid AI models that will ultimately be in place, and the need for a more advanced multi-network mesh, Bannon got into the “platformization” of the network “where, in simple terms, you abstract the port from the platform…and you virtualize…[to] be able to spin whatever protocol up.” 
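
To make “abstracting the port from the platform” a bit more concrete, here is a minimal sketch in which a logical port exists independently of any physical box and services are spun up on it on demand. The class and field names are hypothetical illustrations of the idea, not BT’s implementation.

```python
# Minimal sketch of "abstracting the port from the platform": a logical port is
# defined independently of the physical device, and a virtualized protocol or
# service is "spun up" on it at request time. Names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class LogicalPort:
    """A customer-facing port decoupled from any specific physical platform."""
    port_id: str
    site: str
    bandwidth_mbps: int
    services: list = field(default_factory=list)

    def spin_up(self, protocol: str, **params):
        """Attach a virtualized service (e.g. 'l3vpn', 'ethernet', 'internet')."""
        self.services.append({"protocol": protocol, **params})
        return self

# The same logical port can host different protocols without re-cabling,
# which is the essence of the "platformized" network Bannon describes.
port = LogicalPort(port_id="lp-001", site="metro-edge-ams", bandwidth_mbps=1000)
port.spin_up("l3vpn", vrf="enterprise-a").spin_up("internet", nat=True)
print(port.services)
```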

“That’s a heavy lift,” he continued. “The locations of where you build the network is different. You’re at the doorstep of the cloud…You want to put it where you have those hyperconnected nodes.”

There’s also the important distinction between how CSPs will apply AI to the network and how they will build the network to support AI. “Network performance will be really important,” Bannon said. The goal is deterministic networks powered by AI that are also AI-ready. “You’re going to need new network flexibility and new fabrics of really highly resilient, very intelligent, very agile networks that can deal with new flows and new behaviors on the networks.” 

More on fabrics: On Oct. 1, BT turned on its network-as-a-service (NaaS) “Global Fabric” platform, with commercial service launch to follow early next year. The company has points of presence in 45 of the “world’s major cloud data centers,” according to an announcement. Global Fabric essentially stitches together metro edge cloud and network services for global enterprises to “shop” for lower-latency compute services via local telecoms operators and data center partners. Enterprise customers can test its management portal to play with network configurations and APIs.
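
As a thought experiment, ordering connectivity through a NaaS portal or API might look something like the sketch below. The endpoint, field names and values are illustrative assumptions only; they are not BT’s actual Global Fabric API.

```python
# Hypothetical sketch of ordering a low-latency connection through a NaaS API.
# The fields, values and endpoint mentioned in the comment are illustrative
# assumptions, not BT's actual Global Fabric interface.

import json

order = {
    "service": "cloud-connect",
    "a_end": {"type": "enterprise-port", "site": "london-dc-1", "bandwidth_mbps": 500},
    "z_end": {"type": "cloud-on-ramp", "provider": "example-hyperscaler", "region": "eu-west"},
    "sla": {"max_rtt_ms": 10, "availability": "99.99%"},
    "billing": {"model": "usage-based", "term_months": 1},
}

# In practice this JSON would be POSTed to the operator's NaaS API, e.g.
# requests.post("https://api.example-operator.com/v1/connections", json=order),
# and the fabric would provision the path across its points of presence.
print(json.dumps(order, indent=2))
```

The point of the abstraction is that the enterprise expresses intent (endpoints, bandwidth, latency target) and the fabric works out the physical path across operators and data center partners.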

As this value proposition comes to market and takes commercial shape, Bannon highlighted the importance of trust, particularly for enterprises whose data is essentially their unique selling proposition. As “data becomes gold,” he said, and as data regulation evolves, enterprises will become more discerning as to where data and AI meet. 

As CSPs embark on this multi-faceted transformation—towards NaaS, towards “platformization,” and (ideally) towards monetizing AI at the edge—Bannon spelled out what he needs from vendors. “One is assistance…Partners may be working with multiple different industries, not just telco, and cross-pollinating what they’ve learnt.” Second, he said, is “to increase time to revenue.” And third is AIOps—beyond buying AI products, operationalizing them and effectively using them. 

As far as the long-term goals, Bannon painted a picture of a sort of Borgesian library made usable with AI. “Imagine a world where you have this abstract, Byzantine, complex, impenetrable set of systems and data that you need to go through with a fine-tooth comb on the off chance you find the needle in the haystack.” He gave the example of a firewall imbued with AI where an engineer can “ask the packets what’s wrong. And have the packets and the data, the flows of those applications, summarize and explain some of that back, and actually have a conversation to actually help diagnose its problems in more approachable English…If you can picture a future where you can make these things more approachable and allow these experts who know the right questions to unlock their power better, that’s a really exciting world ahead.”
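
A minimal sketch of that “ask the packets” idea: condense flow telemetry into a plain-English summary that an engineer, or a language model, can interrogate. The flow records and the model hand-off are illustrative assumptions, not a real firewall feature or vendor API.

```python
# Minimal sketch of the "conversational network" idea: turn raw flow telemetry
# into a plain-English summary an engineer (or an LLM) can question. The flow
# records and the model hand-off are illustrative assumptions only.

from collections import Counter

flows = [
    {"src": "10.0.1.5", "dst": "172.16.0.9", "port": 443, "retransmits": 42, "rtt_ms": 180},
    {"src": "10.0.1.5", "dst": "172.16.0.9", "port": 443, "retransmits": 39, "rtt_ms": 175},
    {"src": "10.0.2.7", "dst": "172.16.0.9", "port": 443, "retransmits": 1,  "rtt_ms": 12},
]

def summarize(flows):
    """Condense flow records into a short natural-language description."""
    worst = max(flows, key=lambda f: f["retransmits"])
    lossy_sources = Counter(f["src"] for f in flows if f["retransmits"] > 10)
    return (
        f"{len(flows)} flows observed toward {worst['dst']}:{worst['port']}. "
        f"Highest retransmission count is {worst['retransmits']} from {worst['src']} "
        f"with RTT {worst['rtt_ms']} ms. Sources with elevated loss: {dict(lossy_sources)}."
    )

prompt = ("You are a network diagnostics assistant. Given this telemetry, "
          "explain the likely problem in plain English:\n" + summarize(flows))

# Handing `prompt` to a language model would produce the conversational answer
# Bannon describes; the model integration is deliberately left out of this sketch.
print(prompt)
```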
