The three big telco AI questions
If you’ve recently attended, watched or read about any tech industry conference, you’ve noticed artificial intelligence (AI), both generative AI (gen AI) and more classical AI, being characterized as a sort of panacea for business problems regardless of industry. Telecom is no exception. The idea of telco AI solutions for everything from network management and optimization to customer management and churn reduction has been broadly seeded, partnerships are being formed, vendor-led thought leadership is picking up pace, and it’s clear telco AI will be part of the landscape going forward.
For Rakuten Mobile in Japan and its sister network hardware/software provider Rakuten Symphony, telco AI is just AI. This is because these two parts of Rakuten belong to a much larger whole that provides a wide range of services, including banking, e-commerce, financial services and more. The company is developing various AI engines and applications trained on data from across the enterprise, not just Rakuten Mobile data for Rakuten Mobile applications, and so on.
The first two big telco AI questions—does the problem require AI? If so, do I have the right data?
Rakuten Symphony Managing Director and President of OSS Rahul Atri laid out the group’s approach to AI: “We always believe in building platforms,” along with driving adoption and fostering a culture of innovation. He then framed AI decision-making as a three-legged stool. First, do we have the data? Second, what’s the cost? Third, does it create new efficiency that wouldn’t otherwise be achievable?
Rakuten Group maintains a unified data lake; “we knew data would be the new oil,” Atri said. On cost, “People do see that coming but I don’t think many people are even talking about cost—cost of training a model, cost of cloud resources, cost of investment in even figuring out the use case.” In terms of efficiency, this is the big one; “Do you even need AI? Could it be a data insight problem? Can it be solved by a typical workflow engine?”
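To make Atri’s three questions concrete, here is a minimal Python sketch of a pre-screening gate for a proposed AI use case; the UseCaseAssessment fields, thresholds and example numbers are illustrative assumptions, not Rakuten’s actual criteria.

```python
from dataclasses import dataclass

@dataclass
class UseCaseAssessment:
    """Hypothetical pre-screening of a proposed AI use case,
    mirroring Atri's three questions."""
    has_training_data: bool    # 1. Do we have the data?
    estimated_cost_usd: float  # 2. Training, cloud and discovery cost
    budget_usd: float
    solvable_by_workflow: bool # 3. Could a typical workflow engine solve it?

def should_pursue_ai(a: UseCaseAssessment) -> bool:
    """Gate an AI project on data, cost, and genuine need."""
    if not a.has_training_data:
        return False  # no data, no model
    if a.estimated_cost_usd > a.budget_usd:
        return False  # cost outweighs the benefit
    if a.solvable_by_workflow:
        return False  # simpler tooling suffices; no AI needed
    return True

# Example: an idea with data and budget, but a workflow engine could do it
print(should_pursue_ai(UseCaseAssessment(True, 250_000.0, 400_000.0, True)))  # False
```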
Looking more broadly at automation tooling, which includes the use of telco AI solutions, Atri laid out a process: first is analyzing data, second is finding the root cause of a particular focus item, and third is making a decision or taking an action. The first and third steps are the easy ones, he said, highlighting that locking onto a root cause is “where we need [AI model] training and fine tuning…Focus on step two.”
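The shape of that three-step process can be sketched as a simple pipeline; the function names, telemetry fields and the placeholder heuristic in step two are hypothetical, standing in for wherever a trained, fine-tuned model would actually sit.

```python
def analyze(telemetry: list[dict]) -> list[dict]:
    """Step 1: straightforward filtering/aggregation of network telemetry."""
    return [event for event in telemetry if event.get("severity", 0) >= 3]

def find_root_cause(anomalies: list[dict]) -> str:
    """Step 2: the hard part Atri highlights. In practice this is where a
    trained/fine-tuned model would sit; a placeholder heuristic stands in."""
    if not anomalies:
        return "no fault"
    # a real system would correlate alarms, topology and history here
    return max(anomalies, key=lambda e: e["severity"]).get("source", "unknown")

def act(root_cause: str) -> None:
    """Step 3: decide or act, e.g. open a ticket or trigger remediation."""
    print(f"remediation triggered for: {root_cause}")

events = [{"source": "cell-042", "severity": 5}, {"source": "core-gw", "severity": 2}]
act(find_root_cause(analyze(events)))
```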
Atri also talked through the use of large language models (LLMs) for telco AI applications. He enumerated distinct phases in a typical user journey, starting with the use of a chatbot and refining natural language processing. The next phase introduces an LLM that can access proprietary data and become domain- and business-specific in an effort to combine data with context. He made the point that one operator may have an AI solution accessed by customer care agents, RF engineers or senior executives, all of whom would want different information from the tool based on the varied contexts of their positions.
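One way to picture that combination of proprietary data and role context is a prompt-assembly step in front of a shared model; the roles, instructions and function below are illustrative assumptions, not Rakuten’s design.

```python
# Role-specific framing for one shared, domain-tuned LLM. The roles and
# instructions are hypothetical examples of per-user context.
ROLE_CONTEXT = {
    "care_agent": "Summarize the customer's open tickets in plain language.",
    "rf_engineer": "Report interference metrics and affected cell IDs.",
    "executive": "Give a one-paragraph trend summary with business impact.",
}

def build_prompt(role: str, question: str, retrieved_docs: list[str]) -> str:
    """Combine proprietary data (retrieved documents) with role context
    before sending the prompt to whichever LLM backs the service."""
    context = "\n".join(retrieved_docs)
    return f"{ROLE_CONTEXT[role]}\n\nContext:\n{context}\n\nQuestion: {question}"

print(build_prompt("rf_engineer",
                   "Why did sector 3 degrade overnight?",
                   ["PM counters for cell-042...", "alarm log excerpt..."]))
```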
He also hit on a point of debate in both telecoms and, more broadly, amongst enterprise AI users—is building an LLM from scratch in service of a particular industry a better approach than fine tuning an already available LLM? “You can build telecom LLMs as much as you want to,” he said. “Do you want to? I don’t think so.” He used the analogy of building a cloud-native telecom network as compared to building a cloud that supports a telecom network. It’s worth noting that Rakuten Group, with its shared data lake, uses multiple LLMs.
Data-driven, software-first operations are key to telco AI success
So all signs point to AI being the right solution; but do you have the right data for that solution? Rakuten Symphony Chief Marketing Officer Geoff Hollingworth stressed the importance of “data fidelity” in increasingly automating previously manual network processes. He advised operators to “go underneath the jargon and the hype and the terminology. Embrace a business-driven, ROI-driven use case, user group, work stream approach to starting to investigate yourself, how to understand what these new technologies from automation, data and AI can do for you, and be very, very disciplined and prescriptive in that.”
Rakuten Mobile uses the analogy of autonomous vehicles, where Level 1 refers to minimal (but some) driver assistance, Level 3 to conditional automation where most tasks are automated but some manual inputs are still required, and Level 5 to a vehicle that performs all driving tasks under all conditions with no need for a human.
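That scale maps naturally onto a small enumeration; the sketch below encodes the standard SAE-style levels 0 through 5 as they might be reused for network operations (the names and helper are illustrative).

```python
from enum import IntEnum

class NetworkAutonomy(IntEnum):
    """SAE-style driving-automation scale, reused for network operations."""
    MANUAL = 0       # humans do everything
    ASSISTED = 1     # e.g. cruise-control-style binary automations
    PARTIAL = 2
    CONDITIONAL = 3  # mostly automated, humans handle edge cases
    HIGH = 4
    FULL = 5         # the "keyboard with no buttons" NOC

def needs_human(level: NetworkAutonomy) -> bool:
    """Anything below full autonomy still requires manual input."""
    return level < NetworkAutonomy.FULL

print(needs_human(NetworkAutonomy.CONDITIONAL))  # True
```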
Going up in levels of autonomy, Hollingworth envisioned “a keyboard with no buttons on it in the network operations center.” But to get to this state, AI isn’t the starting point. “The journey really starts without AI because the only way that any machine can see what is happening to it or what it is experiencing is because of the data that it can actually understand. And having access to that data is the number one level of transformation…and there’s a couple of aspects to data that it’s always important to understand. The first aspect is what is the fidelity of that data?”
Another analogy, this one with Google Earth: “If you’re out looking at the earth from the moon, you don’t see much detail, but you zoom in, the moment you go down to another level of detail, you can start to actually make a different level of intelligent decision based on the added information. The enemy of good decisions is the average, is what people say. The second aspect of data is timeliness. So obviously having availability of that data as close to the moment and making it available to the engines that are watching it, the AI models, that then can interpret it is another factor on getting to a journey that you want to have full automation, full autonomy.”
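Those two properties, fidelity and timeliness, are easy to check programmatically before data ever reaches a model; the thresholds and field names in this sketch are assumed for illustration and would vary by use case.

```python
from datetime import datetime, timedelta, timezone

# Illustrative threshold -- real values depend entirely on the use case.
MAX_STALENESS = timedelta(minutes=5)  # timeliness: how old may a sample be?

def usable_for_automation(samples: list[dict]) -> bool:
    """Reject telemetry that is too stale (timeliness) or too coarse
    (fidelity) to support an automated decision."""
    now = datetime.now(timezone.utc)
    fresh = all(now - s["timestamp"] <= MAX_STALENESS for s in samples)
    # fidelity check: every sample is per-cell, not a network-wide average
    fine_grained = all("cell_id" in s for s in samples)
    return fresh and fine_grained

sample = {"cell_id": "cell-042",
          "timestamp": datetime.now(timezone.utc) - timedelta(minutes=2),
          "throughput_mbps": 118.4}
print(usable_for_automation([sample]))  # True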
Hollingworth continued: “And then what’s interesting is that all of that is true if you just want to automate. So automation is both a receiver of data so it can decide what to do, but also automation generates data at a different granular level so you can actually analyze it. And that’s one of the areas that Rakuten really has taken a leadership role in because they have digitalized all of the processes involved in running the network, and that’s a great asset then. Without that data, you can’t see things and without that horizontal unified data, you will be down to level one automation where you are doing binary automations, go faster, go slower. But a bit like cruise control in a car is a good example of a level one automation.”
Another important consideration as telco AI solutions are adopted, Hollingworth said, is the organization, the people. This gets into an interesting area wherein telco AI needs data, but much of the relevant process data is either stored in a legacy fashion or held in the minds of workers. So how do you convert institutional memory into data that can be fed into telco AI systems?
Hollingworth called this piece “the number one problem. And an interesting question always to ask, if somebody reaches out and asks you, ‘I want AI to solve my problems,’ it’s always interesting to take a step back and ask the first question, ‘Well, how do you make decisions today and what data do you use to make those decisions and how do you get access to that data?’ And nine times out of 10, in a lot of situations, especially in very institutional organizations, they’re not using very much data. They don’t have very much data and therefore if you haven’t got the data, you can’t expect a machine to make better decisions than a human can. So data is 90% of the work in AI.”
And the third big question—what does this mean for the environment?
There’s no question that AI will impact telecommunications and essentially every other industry eager to boost productivity and streamline operations while also striving for product and/or service differentiation. But at what cost? Training AI models requires a lot of compute horsepower, today primarily delivered by graphics processing units (GPUs), which draw a lot of power. And more AI means more GPUs, which means more power. Regulators have taken note. In the U.S., lawmakers have put forward a bill calling for an assessment of AI’s environmental footprint and standardized reporting of long-term impacts. The European Union’s AI Act will require reporting on AI-related resource consumption over a particular AI system’s lifecycle. There’s also increasing discussion of sustainable AI. And there’s a bit of circularity in that AI could help industries of all sorts optimize and decrease resource consumption, but doing so requires more resource consumption by AI.
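To put the power question in rough numbers, here is a back-of-envelope sketch; every figure in it (GPU count, per-GPU draw, run length, data-center overhead) is an assumed value for illustration, not a measured or reported one.

```python
# Back-of-envelope GPU energy estimate. All inputs are assumptions.
gpus = 1_000        # accelerators used for a hypothetical training run
watts_per_gpu = 700 # assumed draw per high-end training GPU
days = 30           # assumed run length
pue = 1.5           # assumed data-center power usage effectiveness

# kW per GPU * fleet size * hours, scaled by facility overhead
energy_kwh = gpus * watts_per_gpu / 1_000 * 24 * days * pue
print(f"~{energy_kwh:,.0f} kWh for one training run")  # ~756,000 kWh
```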
Which brings us back to Atri: “I’m concerned about the cost and environmental impact. We all have been seeing it, we all have been noticing it, talking about it. But I don’t think anyone has stepped on it and said, ‘We’re not going to make more than five [large language models] in a year.’ Everyone is in a rush to create more. Everyone wants to be more efficient…So [I’m] scared about those impacts.”
He effectively made the case that AI can help drive broad forward progress that’s meaningful to individuals, to companies and to the world, but there has to be consideration of the environmental impact. “I’m super excited that this will turn into something good, super worried that every ChatGPT query is consuming water somewhere, it’s consuming power somewhere…I think these are two scenarios that if we can balance them, the technology is great. We can build a lot of innovation.”