REPORT
TM Forum | December 2023

generative AI: operators take their first steps

Author: Mark Newman, Chief Analyst
Editor: Ian Kemp, Managing Editor
Sponsored by:

contents
the big picture
section 1: key GenAI technologies and concepts
section 2: who's who in the GenAI value chain?
section 3: CSP strategies - getting ready for GenAI
section 4: To build? To train? To fine-tune? Getting to grips with LLMs
section 5: taking the first steps - early GenAI use cases for CSPs
section 6: look while you leap - key challenges for CSPs
additional features & resources
section 7: key findings and recommendations

the big picture

Generative AI (GenAI) technology exploded into the consciousness of both consumers and the business community in November 2022 with the launch of a service called ChatGPT by a little-known company called OpenAI. One year later, perceptions of GenAI have expanded rapidly from an AI-based content creation tool towards a strategic platform that is fast becoming front and center of thinking for almost every communications service provider (CSP) worldwide.

The extent of the impact GenAI is having on the telecoms industry was made clear in a comment by Jonathan Abrahamson, Chief Product and Digital Officer at Deutsche Telekom, in an article for TM Forum Inform: "It's difficult to envisage a world where this doesn't change everything, and certainly in the context of what we do as a telecommunications company," he said, referring to the proliferation of GenAI developments just one year after the launch of ChatGPT.

In simple terms, GenAI is a type of artificial intelligence (AI) that is capable of creating original content or information, including text, images, audio and video. ChatGPT remains the most widely recognized GenAI-enabled service and the most heavily backed: Microsoft has invested $13 billion in OpenAI, and ChatGPT reportedly now has more than 180 million users and 1.5 billion monthly visitors to its website. But many other enterprise IT tools and services based on GenAI have come to market in the past year, resulting in a frenzy of excitement and activity within the global enterprise IT community.

Its arrival did not come as a surprise to everyone. The technologies which GenAI is based on have been in development since the 1960s, and firms such as Google and Microsoft have been working on GenAI-enabled products and solutions for several years. So how far down the line are CSPs when it comes to adopting GenAI technology? How useful is it? And where does it fit into their business models? To feed into this report we carried out a comprehensive survey of service providers globally. The charts in the report are based on data from 104 senior-level respondents from 73 operators, just over half of whom are in technology roles.

When asked whether their organization is in a good place to exploit the potential of AI and machine learning, 79% said yes. But that still leaves one fifth who think they are not, a function perhaps of the fact that the technology is still in the early stages of commercial development, and that organizations are yet to reshape and take on the levels of skills needed to fully take advantage of these technologies. But the potential speed and impact of developments may not give them much time to adapt. When we asked when executives think GenAI and large language models (LLMs) - a type of GenAI that can understand and generate human-like language - are likely to have a significant impact on business, nearly 60% said over the next one-to-two years and a further 37% between two and five years.

[Chart] When do you expect that GenAI/LLMs will have a significant impact on your business? (TM Forum, 2023)
- Over the next 1-2 years: 57%
- Between 2 and 5 years: 37%
- More than 5 years: 6%

The graphic below, created by TM Forum and Bain & Company, shows some of the current and longer-term applications for GenAI. We explore early and future use cases in detail in section 5.

Read this report to understand:
- Some of the different technologies and definitions behind the GenAI juggernaut
- Who is doing what in the GenAI value chain
- How CSPs are preparing themselves to implement GenAI
- Early experimentation and GenAI use cases
- The key challenges to developing and adopting GenAI.

[Graphic] Generative AI will be introduced in phases (TM Forum, 2023). The graphic maps applications for operations, customers and products across three time horizons (now, next 2 years, longer term): service management and orchestration, engineer assistant, billing dispute resolution, supply-chain optimization, autonomous networks, text-to-code for applications (Codex), smart rating/charging/pricing, strategic decision-making, network digital twin, IT integration, asset management, data management, product/service design, chatbots, sales lead generation, multilingual chatbots, customer interaction analysis, social media sentiment analysis, customer engagement, personal digital assistants, real-time translation and metaverse creation.
section 1: key GenAI technologies and concepts

GenAI works by processing huge volumes of data to find patterns and determine the best possible response to a question or situation, which it then generates as an output. By feeding the AI immense amounts of data it is able to develop an understanding of correlations and patterns within the data. Whereas conventional approaches to machine learning require data scientists to develop artificial intelligence from scratch, GenAI involves the use of foundational models - deep-learning neural networks - as a starting point to develop models that power new applications more quickly and cost-effectively (see the definitions box at the end of this section).

Large language models (LLMs) are a type of GenAI that can understand and generate human-like language, with OpenAI's ChatGPT (now at version 4) the most well-known example. LLMs are foundational models specifically trained on massive data sets. They are huge, with billions or even trillions of parameters allowing them to learn complex language patterns and perform tasks that would be impossible for smaller models. There are many LLMs in existence today and, increasingly, these are designed to serve specific communities or functions. For example, StarCoder and StarCoderBase are LLMs designed specifically for code, trained using data from GitHub, while BloombergGPT is an LLM for the financial sector trained on an extensive archive of financial data.

Generative AI (GenAI) refers to any machine learning model capable of dynamically creating output after it has been trained. The algorithms create new content based on patterns learned from existing data.

Foundation models are like Swiss army knives which can be used for multiple purposes. Once a model has been developed, any developer can build an application on top of it to create content. The main application of OpenAI's GPT-3 and GPT-4, for example, is the ChatGPT chatbot, but thousands of businesses across the world are now working on their own applications using these LLMs.

What can GenAI do?

The fact that GenAI has a deliverable - an output - makes it different from other types of AI. GenAI can be applied to any content, be it words, images, audio, video or code. This content may be freely available on the internet or proprietary information owned by an organization. While there is understandable concern about the long-term impact of GenAI - and of AI more broadly - on jobs, it will be used in the short-to-medium term as a tool to help individuals and teams carry out their work more efficiently. Chatbots are deployed as an interface to enable users to interact with GenAI models, and copilots are conversational interfaces that use LLMs to support users in various tasks and decision-making processes. Microsoft 365 Copilot, for example, is a tool, developed using GPT-4 LLMs, that helps people use the company's applications to carry out tasks such as writing documents and summarizing emails.

The process of experimenting with, and adopting, GenAI for specific use cases is very different from what enterprises are used to with previous models or versions of AI. With GenAI, outputs can be created immediately, but the challenge then is to decide how much fine-tuning is needed to improve the results and what to do with the output. There are also a number of issues and risk factors associated with GenAI - we explore these in more detail in section 6 - which need to be addressed, or at least taken into consideration, before output content is used. These include hallucinations - inaccurate or inappropriate responses to inputs. Furthermore, many companies are finding that the real benefits of GenAI come in combination with other technologies and types of AI. One large US telco from our survey, for example, has combined GenAI with conversational and enterprise search to make it easier for people to navigate its website.
30、ith conversational and enterprise search to make it easier for people to navigate its website.GenAI use casesFor any enterprise or service provider seeking to exploit the potential of GenAI there is a balance to achieve between pursuing those use cases which are gaining momentum in the business worl

31、d generally and finding the right internal deployment and democratization strategy to optimize opportunities for innovation.Customer operations represents by some considerable margin the family of use cases that is generating the most interest.Interactions between customers and different touchpoints

32、 customer service agents,chatbots,websites and social media are generating a huge volume of unstructured data,which through the use of GenAI can be converted into knowledge that drives better engagement and opportunities for cross-sell,upsell and contract renewal.CSPs are already beginning to use ge

33、nerative AI to augment the jobs that human customer service agents perform.Our recent report Counter intelligence:using AI to improve customer experience explains how operators are using AI in general to improve customer experience(CX)and highlights some pitfalls of generative AI.Read the report to

34、find out more:REPORTAuthor:Sponsored by:Teresa Cottam,Contributing AnalystDawn Bushaus,Contributing Editorcounterusing AI to improve customer experienceintelligenceOther broad use cases,and families of use cases,include:Sales and marketing Network operations Fraud reduction Product innovation Softwa

35、re engineering.Most of these categories are common to all businesses,with network operations the only category that is specific to telecoms operators(more on specific GenAI use cases in section 5).This is important in the context of the approach that operators take to choosing LLMs and the extent to

36、 which they buy off-the-shelf LLMs rather than ones that need to be trained or fine-tuned.inform.tmforum.orgGenAI will be used in the short-to-medium term as a tool to help individuals and teams carry out their work more efficiently.7section 1:key GenAI technologies and conceptsinform.tmforum.orgSom

37、e telco-specific developments are likely to come through partnerships.In July Deutsche Telekom,e&,Singtel and SK Telecom announced the Global Telco AI Alliance to co-develop a platform for operator-specific AI services.Deutsche Telekom and SKT are already working together to develop an LLM for digit

38、al assistants in customer service,adapting existing LLMs to learn about telco-specific data.“There is a not insignificant amount of plumbing and orchestration that you have to build around the models to make them work for our telco context anduse cases,”Deutsche Telekoms Abrahamson told TM Forum.“Wh

39、at became very clear for us is to get really good with thisand to use it at massive scale we need to tune the model to make it work for our use cases.”Amdocs,Microsoft and Nvidia also announced a partnership in November to create LLMs for the telecoms and media industries.GenAI in telecoms CSPs have

40、 been exploring how and where to adopt AI in their businesses for the past five years or so.In our survey,55%of respondents said they had made good progress in using AI and machine learning in different parts of their business.But that leaves 45%which are either still at trial stages or just researc

41、hing their potential(see chart above).As we saw in The Big Picture,four out of five respondents say their organizations are in a good position to exploit the future potential of AI and machine learning.Despite this confidence expressed by many operator executives,the reality is that the use of AI in

42、 CSP businesses remains relatively limited and confined to specific use cases,principally in network operations and customer-facing functions.Indeed,presented with those use case options as well as product development,a full 87%said AI/machine learning would have a positive impact on customer experi

43、ence or customer relationship management and 85%on(network)operations(see chart on the next page).The arrival of GenAI has given a sense of urgency to CSPs in terms How efectively is your organization exploiting AI and machine learning?TM Forum,2023We are at the early stages of researching the poten

44、tial of AI and machine learningWe have some trials and proofs of concept running but we havent deployed it in our businessWe have made good progress in using AI and machine learning in diferent parts of our business(e.g.customer experience or network operations)55%31%14%The arrival of GenAI has give

It comes at a time when operators are desperately seeking inspiration to turn their fortunes around, both in terms of finding new sources of revenue and generating efficiencies. There is a willingness from CSPs to place big bets on GenAI. But at the same time there is an awareness that, with the landscape changing so quickly in terms of its technology and players, CSPs need to be able to change direction quickly and easily too.

Research conducted by telecoms, media and technology consultancy Altman Solon on behalf of AWS indicates that telcos will increase their expenditure on AI by as much as six times over the next two years. Almost half of respondents to a survey among 100 CSPs conducted by the consultancy anticipate that GenAI spend will account for between 2% and 6% of total technology spend in two years, up from less than 1% today. However, it is much less clear where that investment will take place. This will be a function of which approach to building, training and fine-tuning LLMs CSPs take and the commercial relationships with their technology partners. For example, CSPs will be wary of committing to pricing models which scale with usage because of the risk of costs spiraling out of control.

[Chart] Where does AI/machine learning have the most potential to positively impact the telecoms business? (TM Forum, 2023)
- Customer experience/customer relationship management: 87% high potential, 13% medium potential
- Operations/network operations: 85% high potential, 14% medium potential, 1% low potential
- New product development: 43% high potential, 46% medium potential, 11% low potential

The box below provides definitions of some of the key technologies and concepts around GenAI that we have touched on in this section and cover in subsequent sections. In the next section we look at where operators fit into the GenAI value chain.
Key technologies and concepts around GenAI

Foundation models
Foundation models - sometimes known as general-purpose AI - are a form of GenAI trained on massive data sets. They form the basis of many applications, including ChatGPT - built on the GPT-3.5 and GPT-4 foundation models - and Microsoft's Bing Chat. Rather than develop artificial intelligence (AI) from scratch, data scientists use a foundation model as a starting point to develop machine learning models that power new applications more quickly and cost-effectively. The term foundation model was coined by researchers to describe machine learning models trained on a broad spectrum of generalized and unlabeled data and capable of performing a wide variety of general tasks such as understanding language, generating text and images, and conversing in natural language.

Large language models
Large language models (LLMs) are AI systems designed to process and analyze vast amounts of natural language data and then use that information to generate human-like responses. An LLM is any statistical model of language built on large data volumes and billions of parameters. LLMs are the basis for most of the existing foundation models, which increasingly are multimodal, generating multiple types of output (such as text and images).

Parameters
A parameter is a configuration variable that is learned during a machine-learning process. Such variables are used to control the behavior of the model. LLMs have tens or hundreds of billions of parameters (GPT-4 is said to have 1.76 trillion). The number of parameters tends to be a measure of the size and the complexity of the model: the more parameters a model has, the more data it can process, learn from and generate. There is now a trend towards smaller, more specialized LLMs with fewer parameters, which are easier and cheaper to train.

Tokens and tokenization
A token is a basic unit of text or code that an LLM uses to process and generate language. Tokens can be characters, words or other segments of text or code. Tokenization is the process of splitting input and output texts into smaller units that the LLM can process. Tokenization affects the amount of data and the number of calculations that the model needs to handle: the more tokens the model has to deal with, the more memory and computational resources it consumes.
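To make the token concept concrete, the short sketch below counts the tokens in a sample customer-service sentence using OpenAI's open-source tiktoken library. It is illustrative only; the encoding name is an assumption chosen for the example, and other models use different tokenizers.

```python
# Illustrative only: counting tokens with OpenAI's open-source tiktoken library.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")  # example encoding; other models differ

text = "My broadband has been dropping out every evening since the engineer visit."
tokens = encoding.encode(text)

print(f"{len(tokens)} tokens")       # tokens are often word fragments, not whole words
print(encoding.decode(tokens[:5]))   # decoding the first few tokens back into text
```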

Training
AI training is the process of teaching an AI system to perceive, interpret and learn from data. Data scientists can spend years creating a new AI model and training it to perform complex tasks. In the first phase of a training process the model is fed massive amounts of data and is then asked to make decisions based on the information. The training part involves making adjustments to the model until it produces satisfactory results.

Fine-tuning
Fine-tuning an LLM involves making small adjustments to a pre-trained model to improve its performance in a specific task. This can yield better results than training a model from scratch, as the model already works well and can leverage its existing knowledge to learn new tasks more quickly.
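As an illustration of what fine-tuning looks like in practice, the sketch below adapts a small open-source causal language model to a handful of telco-style text snippets using the Hugging Face Transformers Trainer. The model name, the two-line "corpus" and the hyperparameters are placeholders chosen to keep the example self-contained; a real fine-tuning run would use a curated, much larger dataset and carefully chosen settings.

```python
# A minimal fine-tuning sketch using Hugging Face Transformers. Model name, data and
# hyperparameters are illustrative placeholders, not recommendations from this report.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_model = "EleutherAI/gpt-neo-125m"        # a small open-source model, for illustration
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token     # GPT-style models ship without a padding token
model = AutoModelForCausalLM.from_pretrained(base_model)

# Tiny stand-in for a domain corpus, e.g. anonymized customer-care transcripts
corpus = Dataset.from_dict({"text": [
    "Customer reports intermittent 5G connectivity after switching tariffs.",
    "Billing query: duplicate charge for an international roaming add-on.",
]})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="telco-finetune", num_train_epochs=1,
                           per_device_train_batch_size=1),
    train_dataset=tokenized,
    # mlm=False configures the collator for causal (next-token) language modeling
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # small adjustments to pre-trained weights, not training from scratch
```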

Prompt engineering
This is the process of refining LLMs with specific prompts and recommended outputs, and of refining the input to various GenAI services to generate text or images.

Hallucinations
These are inaccurate or inappropriate responses to inputs presented as facts, such as false or misleading information or race/gender bias. AI hallucinations are often caused by limitations or biases in training data and algorithms, and can result in producing content that is wrong or even harmful.

Retrieval Augmented Generation (RAG)
RAG provides a way to optimize the output of an LLM with targeted information without modifying the underlying model itself. The targeted information can be specific to a particular organization and industry, so the GenAI system can provide more contextually appropriate answers to prompts.

(sources: AWS, Ada Lovelace Institute, IBM)
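The sketch below reduces the RAG pattern described above to its core steps: retrieve the most relevant internal documents for a question, then pass them to the LLM as context. The embed_text, vector_store and generate functions are hypothetical placeholders standing in for whatever embedding model, vector database and LLM an operator actually uses.

```python
# Retrieval Augmented Generation, reduced to its core steps. All helper functions here
# (embed_text, vector_store.search, generate) are hypothetical placeholders.

def answer_with_rag(question: str, vector_store, top_k: int = 3) -> str:
    # 1. Embed the question and retrieve the most relevant internal documents
    query_vector = embed_text(question)
    documents = vector_store.search(query_vector, top_k=top_k)

    # 2. Build a prompt that grounds the model in the retrieved, organization-specific context
    context = "\n\n".join(doc.text for doc in documents)
    prompt = (
        "Answer the question using only the context below. "
        "If the context does not contain the answer, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

    # 3. Generate the answer with the unmodified underlying LLM
    return generate(prompt)
```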

section 2: who's who in the GenAI value chain?

For the time being, the GenAI value chain is dominated by hyperscalers. They are developing multiple LLMs, some generalist and some specialist, and embedding GenAI-enabled services in their enterprise platforms. Perhaps most importantly, they have the deepest pockets when it comes to investing in building and training LLMs and the computing power that supports them, as well as investing in GenAI start-ups and fledgling companies.

When it comes to building commercial models, a GenAI value chain is still emerging. That value chain, and the partnerships it is based on, is likely to change over time as new companies build their own LLMs and as successful GenAI application developers seek to extend their roles to improve their own products and generate new sources of revenue.

[Graphic] The early-stage GenAI value chain (TM Forum, 2023). The graphic shows the layers of the value chain: services (consulting, professional services, systems integration, training); applications (product innovation, customer operations, network operations, software engineering, sales & marketing); machine learning ops (prompt engineering, model hub, fine-tuning, responsible AI, vector database); foundation models; cloud platforms; and infrastructure. The graphic is representative of the early-stage GenAI market and is not intended as an exhaustive list of the companies operating in these areas.

Infrastructure
The machine learning algorithms used to create text, images and audio through GenAI require vast amounts of data and fast memory to run effectively. This GenAI infrastructure space is dominated by semiconductor giants such as Nvidia, Intel and AMD, as well as the hyperscalers, some of which are developing their own chips (among others, Google, Amazon, Microsoft, Meta, Apple and Baidu are all developing chips). According to data from Omdia's Competitive Landscape Tracker, the semiconductor industry recorded a revenue increase in the second quarter of 2023 after five straight quarters of decline. Omdia says this growth is mainly attributed to the "explosion in the AI segment, led by GenAI". "The data processing segment, driven by AI chips into the server space, grew 15% quarter-over-quarter and makes up nearly one-third of semiconductor revenue (31% in 2Q23)," Omdia notes. The research firm says quarterly revenue grew 3.8% to US$124.3 billion in the same period.

Cloud platforms
Hyperscalers develop the platforms that provide access to the computer hardware that drives GenAI services. The three largest platform providers are Amazon Web Services (AWS), Microsoft Azure and Google Cloud. But other platform companies which can be expected to play a leading role in the telecoms industry's adoption of GenAI include Salesforce, Oracle, IBM and ServiceNow. The sheer scale of investment needed to rapidly build and scale GenAI products makes it extremely difficult for other companies to compete. That said, a huge amount of private equity has flowed into start-up GenAI businesses. According to The Wall Street Journal, investment in GenAI start-ups globally totalled $3.9 billion in 2022, rising sharply to $17.8 billion in the first nine months of this year ($10 billion of that by Microsoft into OpenAI, $270 million into Cohere's AI platform, and $100 million by SK Telecom into Anthropic). One of the leading developers of LLMs for GenAI, AI21 Labs, has received $336 million in funding from companies including Intel, Comcast, Google and Nvidia.

The immediate opportunity for hyperscalers to monetize their investments in GenAI is in enriching their existing products and services. For example, they could use copilot services to bolster sales of existing productivity tools. This would command an additional per-user fee. By opening copilots to developers, hyperscalers will also drive greater usage of their core cloud computing services.

Foundation models
This is the most dynamic part of the GenAI value chain. While hyperscale service providers are extremely active in building LLMs, many other companies - from start-ups to mature technology organizations - are seeking to build their own. Some of the most prominent companies here include:
- OpenAI, the company that owns ChatGPT and in which Microsoft has invested an estimated $13 billion.
- Anthropic, set up in 2021 by former OpenAI employees, which has developed Claude 2, a rival chatbot to ChatGPT. Amazon is reportedly investing up to $4 billion in the company and Google recently agreed to invest $2 billion.
- Stability AI, a company set up in 2020 which describes itself as "the world's leading open-source generative AI company". It has created the Stable Diffusion text-to-image model and Stable LLMs, and has a partnership with AWS to run AI models on its SageMaker machine learning platform.
- Cohere, an enterprise-focused GenAI firm set up by ex-Google Brain employees to deliver "language AI" (LLM) applications including chatbots, search engines and copywriting. It has key partnerships with Amazon, Google Cloud, Oracle and Salesforce.

New LLM initiatives are announced on a weekly basis, and the sector is trending towards more specialist LLMs which are cheaper to create and train and geared towards specific market segments and geographies.

Machine learning models
Generative AI models can be complex and expensive to train because of the computing power and resources needed. This makes it difficult to scale them to production environments. Machine learning operations (MLOps) helps to streamline the process of taking machine learning models to production, and then maintaining and monitoring them. It can help to scale AI models by automating the training and deployment process, and makes use of a range of tools to curate, host, fine-tune and manage foundation models.

Applications and services
We are only at the very early stages of products being developed for consumer and B2B markets that use GenAI either off-the-shelf or with a degree of fine-tuning. While the barriers to entry for companies seeking to enter the GenAI infrastructure, cloud platform or foundation model parts of the value chain are significant - because of the sheer level of investment required - the GenAI applications space is wide open. Specialist knowledge in GenAI is already a valuable asset, and professional services firms are developing a range of services and solutions designed to help enterprises leverage the GenAI opportunity, including systems integration, training and professional services.
Telecoms and GenAI

So where does the telecoms industry fit into the GenAI value chain? What is clear is that both telecoms operators and their technology partners see huge opportunities. Analyst companies are also bullish about monetization prospects. McKinsey, for example, has identified 63 generative AI use cases spanning 16 business functions that it says could deliver total value in the range of $2.6 trillion to $4.4 trillion in economic benefits annually when applied across industries.

CSPs are already partnering with cloud platform providers - principally hyperscale service providers - and professional services firms as they take their first steps in GenAI. They are experimenting with off-the-shelf models with an expectation that they will train or fine-tune them for specific use cases. However, if operators do build their own LLMs - something which we explore in section 4 - they will need access to computer hardware in either a public or private cloud environment and build internal expertise in a range of MLOps tools.

CSPs' technology suppliers are also seeking to embrace GenAI tools and capabilities. Given that operators' own internal data - for example, data derived from customer interactions or from network operations - sits within the systems supplied by their operational and business support system (OSS/BSS) vendors, those vendors will have a crucial role to play in how operators use GenAI. The approaches of three of the companies sponsoring this research report provide good examples of how vendors will use GenAI:
- Netcracker is seeking to insert itself between the CSP and the LLM by ensuring that the LLM only uses the most relevant data, and data that adheres to rules governing security and privacy, when, for example, it receives a prompt from a chatbot.
- Tecnotree greatly increased its AI expertise with the acquisition of AI engineering platform CognitiveScale in December 2022. It is now embedding GenAI capabilities into its existing BSS stack. One of these is a chatbot that will enable its customers to ask questions about the Finnish company's products and solutions.
- Billing provider Aria Systems is partnering with Salesforce to offer a new AI-optimized "concept-to-care" monetization solution. This will allow CSPs to enhance products in their catalogs, integrate them into their OSS/BSS systems and offer automation and GenAI capabilities from Salesforce Einstein - the company's AI for CRM products portfolio - to personalize customer touchpoints.

In the meantime, operators are still working out how they fit into the GenAI value chain. The box below outlines some of the early GenAI use cases being experimented with at Vodafone and Microsoft, and we also outline TM Forum's work in AI and automation, including the development of a GenAI LLM for telecoms. In the next section we look at how some CSPs are experimenting with LLMs.

Vodafone and Microsoft experiment with early GenAI use cases

At DTW23-Ignite, Vodafone's Head of New Technologies and Innovation, Lester Thomas, talked to Ian Thornhill, Telco Industry Advisor at Microsoft, about early use cases of GenAI and how TM Forum's Open Digital Architecture enables the operator to scale services using Open APIs. Thomas says Vodafone has its first GenAI use cases in production and "experiments in virtually every part of" its business where any form of written or spoken human language is used, including supply chain, legal, customer operations and marketing. All of its AI is running in the public cloud, enabling it to focus on use cases and investment in skills. "Our goal is to enable every single employee across Vodafone, in every single function, to derive value, make faster decisions, by using insights and by using AI," says Thomas. He says that in a few years he could see "virtually every piece of software, any part of your Open Digital Architecture" using LLMs. Microsoft, among its applications for GenAI, is using its Azure OpenAI large language model as a tool to improve the linguistic capabilities of its chatbot and provide a "more natural language flow", says Thornhill. It is also using GenAI to capture and summarize data across different channels such as chatbots and call center agents. Watch the video to find out more.

TM Forum's collaborative work in AI and automation

TM Forum members are working together in collaboration projects and Catalyst proofs of concept to solve common AI challenges for the telecoms industry. Central to that is a new Innovation Hub, hosted by Jio in Mumbai, which opened on 1 December 2023. Founding members are Accenture, Deutsche Telekom, Google Cloud, Orange, Reliance Jio, Telenor and Vodafone. Among the early projects will be the development of TMF Guru, a GenAI/LLM conversational search capability to simplify and improve how to find relevant material digitally. The project members are setting out to create an architecture blueprint that outlines how GenAI technology can be seamlessly integrated into the telco landscape, addressing challenges related to security, privacy, accuracy, performance and scalability.

In the near term, TM Forum's AI innovation agenda focuses on gaining maximum benefits from AI, safely and cost-effectively, using existing IT and network systems and data. The medium-term goal is to exploit the full potential of AI to reduce operational costs and create opportunities for growth by taking AI from a bolt-on automation aid to an integral part of IT and network system designs. The Forum's AI collaboration projects are creating standards and exploring best practice for AI-driven, intent-based closed loop automation, building an industry-agreed framework for the use of AI in IT and network operations. Meanwhile, Catalyst co-innovation projects are turning ideas into proofs of concept as members explore applications of machine learning and GenAI. Members can keep control of AI deployments and avoid accumulating technical debt by following TM Forum's best practice in AIOps and AI governance, and prepare for the future by upgrading IT and network systems for AI with TM Forum's blueprint reference architectures, ontology and Open APIs. Many operators are adopting the Autonomous Networks Levels Evaluation Methodology to set targets for network automation and track their progress.
section 3: CSP strategies - getting ready for GenAI

The arrival of GenAI tools has lit a fire under CSPs' AI strategies. But will its potential impact on operators' businesses be so significant that it requires them to put in place a new set of initiatives, a distinct and separate organizational approach and new investment?

Our survey respondents were split 50:50 in terms of whether GenAI needs a specific focus or whether it should be seen as one strand within an overall AI strategy. The main argument for taking a specific focus is that GenAI tools can be deployed relatively quickly and provide near-term benefits, particularly in areas such as customer experience, but also potentially in many other functions across the business, as we saw in section 1. As such, the value of this technology can only be unlocked by democratizing access and, potentially, budget. This may mean circumventing traditional approaches for putting together business cases and approving investment. It could also be argued - although this is somewhat controversial - that GenAI has a more transformative capability than other branches of AI and that, as such, it needs to be prioritized.

The counter argument is that the main use cases for GenAI are largely the same as those for other branches of AI and that creating parallel structures and initiatives would involve a duplication of effort and, potentially, unnecessary confusion and complexity further down the line. Furthermore, GenAI is often combined with other AI technologies and, as such, does not represent something that can be pursued as a standalone strategy.

OpenAI introduced the ChatGPT API for developers to integrate its functionality in their applications in March 2023. That was the cue for CSPs to start putting together new use cases or, more likely, look at how it could help improve the results and performance of existing ones.
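To give a sense of how lightweight that first integration step can be, the sketch below calls a hosted chat model through the OpenAI Python client to summarize a customer-care transcript. It assumes the openai package (version 1 or later) is installed and an API key is configured in the environment; the model name and prompt are illustrative, and a production deployment would add the privacy, security and cost controls discussed elsewhere in this report.

```python
# Illustrative sketch of calling a hosted chat model via the OpenAI Python client (v1+).
# Model name and prompt are placeholders; assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

transcript = (
    "Agent: Thanks for calling, how can I help?\n"
    "Customer: My bill doubled this month and my broadband keeps dropping in the evenings..."
)

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You summarize telecoms customer-care transcripts for follow-up teams."},
        {"role": "user",
         "content": f"Summarize the issues, sentiment and suggested next action:\n{transcript}"},
    ],
)

print(response.choices[0].message.content)
```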

To develop these use cases many operators have created AI centers of excellence - 53% of our respondents say they have done so - in order to decide how, where and when to make the first moves. One of our respondents, as a first step, set up six work streams (six "Ps") managed by the AI center of excellence: potential, platform, partners, people, process, and policy and guidelines. Others are experimenting with GenAI across different functions and departments, and/or working with partners including technology firms to work on use cases.

[Chart] How are you approaching the GenAI opportunity (choose all that apply)? (TM Forum, 2023)
- We have different functions and departments experimenting with GenAI across the organization: 53%
- We are working with partners/technology firms to work on different use cases: 59%
- We have identified families of use cases and specific use cases within each family: 45%
- We have set up a center of excellence: 53%

In the first instance many CSPs have reached out either to hyperscalers - many have established a "preferred" hyperscaler relationship within the past two-to-three years - or to large consulting and professional services firms. This has given them a degree of comfort as they consider some of the key risk factors, such as breaching rules around privacy, security or intellectual property; these firms have already considered the various issues as they build up their own capabilities. In our survey, operators' own centers of excellence came out as the most important source of knowledge on GenAI, followed by public cloud companies (see chart below). Respondents were asked to rank six sources according to importance. Hyperscalers are already making GenAI services available alongside their existing public cloud services, making it easier for CSPs to dip their toes into the water. Industry analysts and consultants were the next most important sources.

[Chart] Which are the most important sources of knowledge about GenAI for your business (ranked most to least important)? (TM Forum, 2023). The six sources ranked were: our own internal center of excellence, public cloud companies, industry analysts, consultants, browsing on the internet, and our legacy vendors and integration partners.

Speed versus risks

The launch of ChatGPT, Google's Bard and the explosion of interest in GenAI in the spring of 2023 has put huge pressure on telecoms operator CIOs to deploy the technology and demonstrate a return on investment in early use cases. Unlike other technologies, GenAI has captured the imagination of CEOs and main boards, who are prepared to make resources available but are impatient for success. Faced with the challenge of acting quickly, the CIO of one CSP in China - a mid-sized operator with around five million customers - decided to skip the process of developing use cases centrally and, instead, to start making GenAI agents available to any team that wanted to experiment with them. In the first month 200 teams said they wanted to try out the GenAI agents, and from this process the operator identified nine "flagship" use cases - which it did not disclose - where a business case could be made for driving revenues.

Another operator talked of the "risk and fear" that can be associated with deploying new technologies and how to overcome these to maximize the potential of successful experimentation: "We are trying to inspire our employees to start taking risks in a controlled way, and to do the experimentation and eventually record the outcome of these decisions so we can see the results of these use cases, and which have worked and which haven't worked."

Getting a working proof of concept up and running as soon as possible was also seen as a priority by respondents to our survey. One western European CSP said that one of its early priorities, before receiving guidance and funding from the business, was to set up a working chatbot proof of concept. "Nothing speaks more than a working demo which actually shows you a bit of the magic," the company's Head of Data and AI told us. The time that it takes to set up a working example can vary, however. This particular operator aimed to have working proofs of concept set up in just two days, while another operator we spoke to in our research said it had taken the whole summer to set one up.

Unlocking data value

Operators have spent many years trying to improve the availability, reliability and usability of the data that their operations and customers generate. The commercial viability and availability of AI has made this a bigger priority for CSPs in recent years, and in our survey most operator respondents were relatively positive about the advances they have made in a number of different aspects of data readiness: the creation of a unified data model, data governance, awareness of the data that is available across the organization, and the availability of data in real time. But when it comes to making use of unstructured data - for example, data generated from online or real conversations between customer care agents and customers - fewer respondents (41%) said that their organizations have made good use of the opportunity (see chart below).

[Chart] How successful has your organization been in making data available for AI and machine learning? (TM Forum, 2023) - respondents agreeing with each statement:
- We are making good progress towards the creation of a unified data model: 65%
- We have effective governance in place which helps ensure that data can be accessed by different teams and functions across the organization: 62%
- There is good general awareness of the data that sits across different siloes within the organization: 63%
- Where needed we have systems in place that allow data to be accessed in real time/near real time: 57%
- We have made a good job of using unstructured data alongside structured data: 41%

This may represent an immediate opportunity for GenAI. Without GenAI the operator would need to label this data - the process of adding meaningful and informative labels to provide context so that a machine learning model can learn from it - to be able to use it. But now, with a large language model, an operator can feed this unstructured data into the model and immediately start generating insights which, over time, can be fine-tuned.

Getting ready for GenAI

In our research many senior executives stressed the importance of "organizational readiness" to pursue GenAI transformation. There are many elements to this, including:
- It is sometimes said that GenAI has the capability to democratize AI because tools such as copilots give individuals and small teams the opportunity to use new tools. But if data, both within and across siloes, is not easily accessible, democratization risks leading to frustration.
- The democratization of GenAI could also put the organization at greater risk from a range of issues associated with the technology, including hallucinations - incorrect or inaccurate responses to questions or tasks - as well as fraud, security and broader legal considerations. Operators are already putting GenAI guardrails into place - for example, barring the exposure of raw GenAI outputs directly to customers without a human filter.
- Many of the business leaders we spoke to made reference to lessons that can be learned from the early days of cloud transformation, where many operators took a "lift and shift" approach. These operators then found that the cost of running applications and workloads in the public cloud did not generate the benefits they expected and that public cloud costs spiralled way beyond what they budgeted (more on this in the next section).

Such is the speed of development of GenAI that operators are faced with the challenge of keeping their organizations moving at a pace that allows them to be flexible and agile in terms of how, where and when to adopt new solutions and technologies. In the AWS-commissioned Altman Solon survey, 19% of respondents said they had use cases in production, with this number set to increase to 48% in the next two years. We look in more detail at GenAI use cases in section 5, while in the next section we examine how CSPs are experimenting with LLMs.
section 4: To build? To train? To fine-tune? Getting to grips with LLMs

There are two categories of generative AI LLMs: proprietary and open source. Proprietary models are expensive to develop - various estimates put the cost of GPT-4 at $50 million to $100 million - and anyone who uses them needs a license. Open-source LLMs are freely available for people to access, use and tweak based on their own particular requirements. Examples of open-source models are Llama 2, developed by Meta and Microsoft, and GPT-Neo and GPT-J from EleutherAI, a collective of researchers working on open-source AI. Examples of proprietary models are BloombergGPT, developed for the financial sector, and DeepMind's Gopher.

The size of an LLM is a function of the data on which it is trained and the number of parameters. An LLM that is trained on general internet data is necessarily much larger than an LLM that is trained on domain-specific data. A domain could be a specific industry vertical - for example, healthcare or financial services - or a horizontal segment such as research papers from academia. Some LLMs contain a combination of internet data and domain-specific data.

CSPs face a number of choices when it comes to choosing foundational models, the use cases to which they are applied, the extent to which they need to train or fine-tune them, and the environments in which the LLMs operate.

Minimizing investment

In this early, initial phase, many CSPs want to use LLMs that require minimal up-front investment and which can generate immediate benefits. That means choosing the right use case mapped with the right LLM, and from a supplier that can provide support and guidance. In such instances, where external skills are needed, open-source LLMs might not be a viable option. CSPs also need to decide whether the LLMs they use should operate on their premises, on a public cloud or on vendors' clouds. The decision-making process here is similar to any other compute requirement. An on-premises solution can be attractive when dealing with sensitive data, but public cloud is a better option if the operator is experimenting or needs to scale computing resources up or down quickly. In practice, given the dominance of the hyperscalers, the vast majority of LLM usage by CSPs will likely be in the public cloud, at least in the short-to-medium term.

In our survey we asked CSPs which LLM approaches they were considering for their organizations. When it comes to using open-source models, 59% said they were considering that option in the cloud while 32% were looking at on-premises options (see chart below). About half of respondents said they were considering a service model - an approach where the CSP buys access to an LLM from a vendor along with a service agreement. A similar number are considering partnerships with LLM vendors which can support them in training proprietary models. In the AWS/Altman Solon survey more than 65% of respondents said they anticipated training off-the-shelf models, primarily using their proprietary internal data to tailor them to their specific needs. Only 15% said they would build foundation models in-house.

[Chart] Which large language model (LLM) approaches are you considering for your organization (choose all that apply)? (TM Forum, 2023)
- Open source in cloud: 59%
- Open source on premises: 32%
- A service model (i.e. buy from a vendor with a service agreement) and an LLM partnership (i.e. partner with a vendor which supports training a proprietary model): about 50% each
- Building our own proprietary model: 21%

There were no clear trends in terms of preferences for CSPs using context learning/prompt engineering (i.e. inference) or fine-tuning (i.e. training) to adapt models. Most CSPs are evaluating both approaches. However, in this early phase operators seem to be leaning towards context learning, as it is less resource intensive and provides "good enough" performance for many use cases.
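The sketch below illustrates what context learning means in practice: rather than adjusting the model's weights, the desired behavior is demonstrated with a few labeled examples inside the prompt itself. The example messages, intent labels and the call_llm function are hypothetical placeholders; the same prompt could be sent to whichever chat-style LLM an operator has chosen.

```python
# Few-shot in-context learning: the model is steered with examples in the prompt,
# with no change to its weights. call_llm is a hypothetical placeholder for any LLM call.

FEW_SHOT_PROMPT = """Classify the intent of each customer message.

Message: "I was charged twice for my roaming pass last month."
Intent: billing_dispute

Message: "Can I add another SIM to my family plan?"
Intent: upsell_opportunity

Message: "The 5G signal in my street has been unusable since Tuesday."
Intent: network_fault

Message: "{message}"
Intent:"""

def classify_intent(message: str) -> str:
    prompt = FEW_SHOT_PROMPT.format(message=message)
    return call_llm(prompt).strip()   # placeholder for the operator's chosen model endpoint
```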

Choosing between LLMs

While it is true to say that one foundation model can be used for multiple tasks, it is also a reality that one LLM is not going to do all the tasks that operators are considering. But how do CSPs make decisions between the tens or even hundreds that are available to them? The only way for operators to inform their decisions is to experiment with different models and gauge the accuracy, the bias and the level of hallucinations - inaccurate or inappropriate responses to inputs presented as facts by the model - that come with each of them. CSPs also need to learn about the training data that was used for each of the different models and how this impacts their accuracy. As new, domain-specific models are launched they may decide that these yield better results than more general-purpose ones, even though the latter have likely benefitted from greater training investment.

In deciding how good or how accurate an LLM is, a CSP will consider whether, for example, it is 40%, 60% or 80% accurate. Different use cases will require different levels of accuracy for the use of GenAI to be worthwhile. In some cases a CSP may decide that, for example, 20% accuracy is worthwhile if it creates insights that can be input into a process or decision. For others, 70% or 80% accuracy may be needed. The processes of training, fine-tuning and Retrieval Augmented Generation (RAG) - optimizing the output of an LLM with targeted information - are used to improve the accuracy of LLMs.

Choosing the right large language model(s)
- Cost & licensing: Is the cost model fully understood, particularly if/when use cases scale, and are there any restrictions on usage?
- Performance of the LLM: Has the model been assessed in terms of its accuracy, usefulness, human evaluation and fidelity?
- Use case requirements: Can the LLM fulfill the specific tasks/use cases that are required?
- LLM capacity: What are the computational and memory requirements of the LLM?
- Pre-training data and domain: Has the model been pre-trained on general internet text or domain-specific data?
- Fine-tuning flexibility: Can the model be adapted to a specific task using an organization's own labelled data?
- Ethical and responsible AI considerations: Is the model committed to fairness, bias mitigation and responsible AI practices?
- Updates & maintenance: How frequently is the model updated and how is it maintained by the model operator?

But telcos also need to learn how LLMs work and how to involve different teams and departments in order to get the most out of them. The more parameters in an LLM, the more training will be required and, as such, the more expensive it will be to develop because of the required computing resources. "We're trying to figure out how we enable our domain people to understand the nuances of a generative model," the CIO of one CSP in the Indian subcontinent told us. "Each model has its own weights and its own methodology." As such, when it comes to fine-tuning and prompt engineering, this is not solely an area that sits within the remit of a data science team. It also needs to involve domain specialists. When it comes to fine-tuning the model, the process involves understanding how the model works and then framing questions in such a way as to generate the most accurate results.

In our survey we asked what operators need to do to best exploit AI. Recruiting new skills, and training staff in AI skills, scored the highest among our respondents, while partnering with AI specialists was another popular option (see chart below).

[Chart] What do operators need to do to give themselves the best chance of effectively exploiting AI and machine learning? (TM Forum, 2023) - respondents saying each option will have a big impact:
- Recruit new AI and machine learning skills: 70% (moderate impact 28%, small impact 2%)
- Train existing staff in a range of different AI skills: 60% (moderate impact 39%, small impact 1%)
- Partner with AI specialists: 51% (moderate impact 40%, small impact 9%)
- Acquire specialist AI technology firms: 39% (moderate impact 37%, small impact 24%)

CSPs are also finding that LLMs behave differently from each other and that each one has its own "personality". This means that, for example, when comparing the performance/accuracy of one LLM versus another, initial results can be disappointing until it becomes clear how best to use it.

Other considerations when choosing an LLM include:
- Language availability. Some models are trained on a single language while others are trained on many - GPT-3, for example, is trained in more than 50 languages. In smaller countries, or ones where many different languages are spoken, CSPs may need to take some of the responsibility themselves for training different aspects of LLMs.
- Cultural nuances expressed in languages. If we take, for example, the language used between a customer and a customer services agent, there are very different types of language and conventions used across the world.
- The longevity of the model. LLMs have a limited life span because there is necessarily a cut-off point in terms of the data that is fed into a model. The evidence to date is that a model only has a lifespan of around six months. This is a particular issue for fast-moving industries where the output of the model needs to reflect data that was recently fed into it. Furthermore, there is no guarantee that when an LLM developer releases a new version it will behave in the same way.
- Level of latency supported. Do the use cases that the LLMs support require low-latency connectivity? If so, this may impact the choice of private versus public cloud.
- Ease of transferral/cross-use. How easy will it be for a CSP that has trained a specific model to transfer its input into a new or updated model?

Cost and ROI

CSPs in general are comfortable experimenting with LLMs and meeting the payment terms of their owners. While companies that embed LLM capabilities tend to charge for them on a per-user, per-month basis - Microsoft's copilot service is made available for an additional fee to its Teams customers - businesses seeking direct access to LLMs pay on an API basis. Calculations are, essentially, based on the number of words - or, more precisely, tokens - input into or output from a model. As such, costs scale directly with usage, as the back-of-the-envelope sketch below illustrates.
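The sketch below shows how quickly usage-based pricing can add up for a high-volume use case such as a customer-facing chatbot. The per-token prices and traffic volumes are invented for illustration only; they are not quotes of any vendor's actual rates or figures from our survey.

```python
# Back-of-the-envelope API cost model. All numbers are illustrative assumptions.
PRICE_PER_1K_INPUT_TOKENS = 0.01    # assumed $ per 1,000 input tokens
PRICE_PER_1K_OUTPUT_TOKENS = 0.03   # assumed $ per 1,000 output tokens

conversations_per_month = 1_000_000      # assumed chatbot traffic
input_tokens_per_conversation = 1_500    # prompt + retrieved context + conversation history
output_tokens_per_conversation = 400     # generated replies

monthly_cost = conversations_per_month * (
    input_tokens_per_conversation / 1000 * PRICE_PER_1K_INPUT_TOKENS
    + output_tokens_per_conversation / 1000 * PRICE_PER_1K_OUTPUT_TOKENS
)
print(f"${monthly_cost:,.0f} per month")  # $27,000 under these assumptions; scales linearly with usage
```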

176、nagement risk to any enterprise that deploys LLM-enabled solutions at scale.This is a particular issue with GenAI,which is seen as a tool to“democratize”AI and innovation more broadly across the organization.The discussion around cost and return on investment(ROI)has echoes of early deployments of p

177、ublic cloud services.Many operators took a“lift-and-shift”approach to moving applications to the public cloud and subsequently found that the business case was no longer viable because the workload consumed more public cloud resources than expected.As such,operators are keen to evolve to a different

178、 type of pricing model.An ideal model from the perspective of operators would be one that incentivizes What do operators need to do to give themselves the bestchance of efectively exploiting AI and machine learning?TM Forum,2023This will have a big impactThis will have a moderate impactThis will hav

179、e a small impactPartner with AI specialistsAcquire specialist AI technology firmsTrain existing staf in a range of diferent AI skillsRecruit new AI and machine learning skills2%28%70%1%39%60%24%37%39%9%40%51%22section 4:To build?To train?To fine-tune?Getting to grips with LLMsinform.tmforum.orgthe L

180、LM model owner based on a successful deployment of an LLM for a specific use case or family of use cases.That could be,for example,the number of customer queries successfully handled by a chatbot or a reduction in the time taken by field engineers to address network outages.Watch Verizon CDIO,Shanka

181、r Arumugavelu,and TM Forum CEO,Nik Willetts,discuss the comparison of GenAI adoption with the journey to cloud-native operations.Telecoms-specific LLMsFinancial information services provider Bloomberg unveiled BloombergGPT,an LLM for the financial sector,in April 2023.Bloomberg has taken a“mixed app

182、roach”by combining financial data it owns largely the data sources accessible by any user of a Bloomberg terminal with data that it has scraped from the internet.This latter category comprises both financial data and more general content.The LLM has 50 billion parameters constructed with 363 billion

183、 tokens from Bloombergs own information sources,and 347 billion from other,mainly publicly available,datasets.If Bloomberg is able to create an LLM specifically for the financial services industry,does this mean that it should be possible to create one for the telecoms sector?SK Telecom in partnersh

184、ip with Deutsche Telekom,Singtel and e&is now building an LLM for the telecoms industry(see box on p.23).But is this the right medium-to-long-term approach for the industry,or indeed the only approach?In our survey just 21%of CSPs indicated an inclination to build foundation models in-house.But why

Arguments for CSPs having their own model

- With IP ownership, a CSP may be able to license the model to other CSPs, or simply to make it part of the overall technology capabilities of the business.
- By creating its own LLMs, the telecoms industry could uncover new products and services leading to revenue growth.
- A dedicated telecoms-industry LLM will, almost certainly, produce significantly more accurate results because it has had real-world data fed into it. This is particularly true for data that is specific to telecoms - for example, network operations.
- The ability to use GenAI to generate code could produce significant benefits.
- With their own LLM, CSPs can have control over their own destiny rather than relying on third parties which do not necessarily share their best interests.
- Each CSP use case has its own requirements in terms of latency, privacy and security - major considerations when it comes to deciding whether to deploy on-premises, or on private or public cloud. With their own LLMs, CSPs can make the best choices.
- Pricing models for LLMs deployed at scale are unclear. "The industry is pretty much in its infancy in terms of deciding how they charge for the use of LLMs at scale," the CIO of one North American CSP told us. By developing their own LLMs, operators will have better visibility of costs and ROI.
- Within an operator's business many different functions may benefit from GenAI adoption. But each has a distinct approach to language: customer-facing functions use highly conversational language; technology teams deal in technology terms; finance teams have their own financial language, as does legal, and so on. Would each of these functions be better served by a domain-specific LLM?

Arguments against CSPs having their own model

- The telecoms industry may not need to build its own LLM to obtain (telecoms) domain specificity. This may be possible through the process of training or fine-tuning existing models.
- Operators are slow-moving organizations which are not used to developing and deploying their own technologies at anything close to the speed at which the AI industry is evolving. There is a risk that the challenge to innovate, to change direction and to build in new features and capabilities will mean that an operator-specific LLM fails to keep up with more general models.
- The cost to build and train a model tailored to the telecoms sector, or to specific segments of the sector, could be a significant burden, and not having clear visibility of costs or ROI could be prohibitive.
- CSPs may lack the in-house skills to build and train telecoms-specific LLMs.
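The fine-tuning route in the first argument against can be sketched with open-source tooling. The example below adapts a small open model to telecom-flavoured text using the Hugging Face Trainer; the base model, the two training sentences and every hyperparameter are placeholders chosen purely for illustration, not a recipe drawn from this report or from any operator mentioned in it.

```python
# A minimal sketch of fine-tuning an existing open-source model on telecom text,
# rather than building a foundation model from scratch. Everything here is illustrative.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_model = "gpt2"  # stand-in for whichever open-source LLM an operator licenses
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# In practice this corpus would come from internal sources: tickets, manuals, FAQs.
telecom_corpus = [
    "A PRB is a physical resource block, the smallest unit of spectrum scheduled in LTE and 5G NR.",
    "If the ONT loses sync, check the optical power level before dispatching a field engineer.",
]
dataset = Dataset.from_dict({"text": telecom_corpus})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="telco-llm", num_train_epochs=1,
                           per_device_train_batch_size=2, learning_rate=5e-5),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()  # nudges the general-purpose model towards telecom language
```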

Alliance sets out to build telco-specific LLM

South Korean operator SK Telecom has been developing its AI capabilities for the past two-to-three years using its own supercomputer, Titan. In August 2023 it invested $100 million in GenAI start-up Anthropic (also backed by Google). As part of the deal, the companies agreed to jointly develop an LLM that is optimized for telecoms operators. This LLM, which is due for launch in early 2024, is a variant of Anthropic's more general model called Claude and will support the Korean, English, German, Japanese, Arabic and Spanish languages. The model will be shared by three other operators - Deutsche Telekom, e& and Singtel - which have formed, with SK Telecom, the Global Telco AI Alliance.

Both Deutsche Telekom and SK Telecom emphasize that the early focus of the new LLM will be on helping them deliver a stronger customer experience. An LLM's reasoning capabilities make it good at analyzing customer intent and searching a telco's back-end systems for the answer, says Jonathan Abrahamson, Chief Product & Digital Officer at Deutsche Telekom.

SK Telecom Chief AI Global Officer Chung Suk-geun told attendees at TM Forum's DTW23 Ignite event in Copenhagen in September that with GenAI CSPs had the potential to redefine the relationship with their customers. In doing so, CSPs not only have the opportunity to build more loyal relationships, but they can also leverage this to sell new products and services. One example would be to use the skills gained in improving customer experience through GenAI to help enterprises outside the telco industry, he said. Watch the video to find out more.

Michael Lesniak, Business Development and Partnerships Manager for Web 3 at SK Telecom, reckons that GenAI could, more specifically, help operators develop their carrier billing (the sale of third-party services) line of business. "I think that GenAI creates an interesting opportunity for carriers to use SMS to communicate directly with customers and to initiate transactions," he told attendees during a webinar organized by Mobile Europe. Rather than making the customer go through the sometimes long-winded process of signing up for a new app on a smartphone, an operator could remove all the friction from the transaction: because it already has a billing relationship with the customer, it knows enough about their identity to open up access to the service without the usual sign-up process.

The four operators will build a so-called Telco AI Platform that sits on top of the new LLM and which will serve as the foundation for developing new services such as chatbots and other apps. Speaking at SK Telecom's financial results in August, Suk-geun said part of the thinking behind the creation of a telco-industry LLM was that operators are having problems getting the attention of the companies building LLMs. The Global Telco AI Alliance plans to add new operator members to help drive scale and increase the richness of the model and the applications that are built on top.

Giving operators more control of their business, and properly leveraging new technologies, is also part of the thinking behind building their own LLM and AI platform. By working together operators will be able to avoid the mistakes they have made in the past in ceding strategic business to rivals during a major technology shift, according to Suk-geun. This point was echoed by Deutsche Telekom's Abrahamson, who said the company "liked the idea" of developing and owning its own IP.

Suk-geun says operators will start by using GenAI to transform internal core processes such as marketing, sales and customer service operations. For example, he says, marketing cost reductions could come through automated customer service, and in networks through automated infrastructure monitoring; and service quality improvements could be made through more personalized, targeted offerings.

section 5: taking the first steps - early GenAI use cases for CSPs

Early GenAI deployments will mainly be used for improving existing functions. The technology cannot, currently, generate content that is sufficiently or consistently accurate or suitable to replace complex operations. However, as developments improve it will start to be used in this way, and in so doing will replace processes, functions and people. We are still some way away from this point. Indeed, it is likely that either businesses themselves, or policymakers, will impose strict rules and guidelines to restrict GenAI use.

With early use cases mainly taking advantage of out-of-the-box LLMs, operators are able to deliver improvements to existing functions with minimal investment. CSPs that we interviewed for this research are confident that these can generate a rapid, easily demonstrable return on investment. The cost savings for GenAI use cases that involve the replacement of existing functions can be significant, but they can also generate other benefits - for example, delivering customer service capabilities that CSPs do not currently offer because they are too costly. But for these to become viable the accuracy and fidelity of LLMs need to improve significantly.

For GenAI to deliver value it needs a body of knowledge to work on. With telecoms, pools of knowledge exist both inside the operator organization and outside. The lists below define some of those aspects.

Internal pools of knowledge

- Customer data that sits within BSS/OSS systems
- Operational data relating to the performance of the network and IT systems
- Financial and operational data/KPIs about the operator's business
- Technical manuals used by engineers to address issues and build new products and capabilities
- Code that has been written
- Data about suppliers, partners and the supply chain more broadly
- Data about distributors and channels
- Information (from HR) about people working in the organization
- The content of emails and messaging platforms between employees, customers, partners and suppliers.

External pools of knowledge

- Customer data from third-party sources such as social media sites (which tends to be unstructured data)
- Information from standards, technical bodies and academic research journals used for building new networks, IT systems and customer products and solutions
- Information from consultants and analysts about the operator's own business or the wider industry
- Insights and data about a) other operators and service providers, b) technology suppliers and partners, c) customers and potential customers, and d) other companies against which CSPs might want to benchmark themselves
- Code from repositories such as GitHub.

Once CSPs become familiar with GenAI's capabilities they have the opportunity to create new pools of data as a clear and deliberate strategy to deliver future customer service, operational efficiency and productivity gains, and to deliver new or enhanced products.

The key question many CSPs are asking is what practical applications GenAI technology can be put to in telecoms operations. But before exploring the potential use cases, and families of use cases, it is worth considering how the technology is currently being tested by early adopters.

Families of use cases

Many operators and their technology partners have begun the process of putting together families of use cases for GenAI. We have identified seven for this research: customer operations; sales and marketing; network (operations); IT and software engineering; product innovation; internal knowledge; and business operations. Within each of these families we have identified different use cases which are either being explored by CSPs today or have short-to-mid-term potential. For many we include results from the Altman Solon survey, which asked CSP respondents to say whether they were implementing or evaluating GenAI across 16 use cases in four of those categories: customer operations, sales and marketing, network operations, and IT and software engineering.

Building families of GenAI use cases

- Customer operations: customer chatbot; call center agent documentation and coaching; website assistance; predictive and personalized services
- Sales & marketing: marketing collateral generation; personalized customer/email scripts; social media automated responses
- Network: field service operations guided assistance; network/capacity planning; network security testing; post mortem creation; root cause analysis
- IT/software engineering: automated code generation and testing; automating repetitive tasks (e.g. data mapping); detection of code security vulnerabilities
- Product innovation: carrier billing; personalized services; voice value-added services; B2B customer call services
- Internal knowledge, training & development: evaluating new trends/developments; competitive analysis; supply chain analysis
- Business operations: legal/contract; fraud management; partner management (e.g. roaming); human resources

Customer operations

Delivering a better chatbot experience for customers is seen as the most tangible opportunity today for CSPs investing in GenAI. Many operators already offer their customers access to chatbots as a complement to other forms of customer interaction. But these services are extremely limited in the functionality they provide and are not, generally, liked or appreciated by customers.

With GenAI, operators have the opportunity to completely change the chatbot experience. Today's services are based on traditional AI and rules-based systems. As such they are useful for certain queries and frequently asked questions. Indeed, these chatbots can be effective whenever the system recognises a query from a customer and the data that sits behind it can point the customer to information that helps to resolve the problem. GenAI chatbots would instead use unstructured rather than structured data to understand what the customer wants and how the operator can help. These multi-modal chatbots would open up use cases such as predictive customer sentiment.

Today's LLMs, however, are not accurate enough to give CSPs sufficient confidence to deploy these types of chatbots. Operators would need to train a third-party LLM, or their own, on large volumes of customer data to give it the capability to have meaningful conversations with customers. In the shorter term, and with less accurate information, customer service agents can save themselves time by using GenAI copilots to search for information to help with customer queries. This can help to bridge the performance gap between those agents who are able to find the information that a customer needs and those who are not.
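A minimal sketch of that agent-copilot pattern is shown below: a few knowledge-base extracts are retrieved first and an LLM is asked to draft a reply for the human agent to review. The OpenAI client is used only as an example provider; the model name, system prompt, extracts and query are illustrative assumptions rather than anything described by the operators in this report.

```python
# Sketch of a contact-centre copilot: draft an answer from supplied knowledge-base
# extracts for a human agent to check. All inputs below are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def draft_agent_reply(customer_query: str, kb_snippets: list[str]) -> str:
    context = "\n".join(f"- {s}" for s in kb_snippets)
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": "You help a telecom contact-centre agent. Answer only from the "
                        "knowledge-base extracts provided; say so if they are insufficient."},
            {"role": "user",
             "content": f"Knowledge-base extracts:\n{context}\n\nCustomer query: {customer_query}"},
        ],
    )
    return response.choices[0].message.content

print(draft_agent_reply(
    "Why is my bill higher this month?",
    ["Roaming charges outside the EU are billed per MB.",
     "Plan upgrades are pro-rated from the change date."],
))
```

Keeping a human agent in the loop is what makes this viable today despite the accuracy limitations discussed above.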

The copilot approach could also be something that is offered to customers, for example when they are searching for information on a website. One North American operator who took part in our survey has combined three technologies - conversational AI, enterprise search and GenAI - to create a multi-modal experience for its customers. The role of GenAI, in this example, is to provide a summary of what the customer is looking for (and finds) on the website. Canadian telco Telus, meanwhile, is using GenAI to predict when customers will have negative experiences, for example when the network is likely to go down because an engineer is carrying out repairs in the local area.

Matthew Sanchez, Global Chief Data and AI Officer at BSS vendor Tecnotree, says his company is currently tracking around 20 use cases in customer operations. "Some of these we're working on actively with customers, some are discussions we are having with customers, and others are R&D work we are doing. We have seen some RFPs around that involve using GenAI to fully transform customer operations."

The Altman Solon survey identified four use cases within the customer service category. Its responses, which measure the percentage of total respondents that are currently implementing a use case, or evaluating it with a high likelihood to adopt, demonstrate that chatbots are, by some margin, the application attracting most interest.

[Chart] CSP GenAI use case adoption: customer service. Use cases measured: customer chatbot (the clear leader, at 92%); guided employee assistance for knowledge management & troubleshooting; contact center documentation; employee coach. (TM Forum, 2023; source: Altman Solon)

Sales and marketing

The highest profile use case within the sales and marketing category is automated generation of marketing collateral and content. It allows B2B marketers to build compelling narratives, tailor messaging to target audiences and, ultimately, drive engagement. It represents an important productivity tool as marketers come under growing pressure to create personalized messages for target audiences. Within this overall category of content marketing, GenAI can help marketers to brainstorm and generate ideas, create new content, enhance existing content and create compelling visuals.

As customers start to use their own preferred social media channels to communicate with their service providers, GenAI can help CSPs to respond to these queries. However, because of the very nature of social media this is fraught with risk if a response is, in any way, inappropriate or incorrect. A CSP would need to have an extremely high level of trust in its LLM to allow direct GenAI-created content to communicate with a customer.

When it comes to sales, GenAI gives account managers the ability to send personalized emails to customers and prospects. It can also help sales teams identify the best prospects and create scripts for conversations with them.

[Chart] CSP GenAI use case adoption: marketing and product. Use cases measured: content generation; insight generation; new feature ideation; search term generation. (TM Forum, 2023; source: Altman Solon)

Network (operations)

Whereas many of the use cases in customer operations and sales and marketing are generic to all businesses, the network is specific to telecoms. As such, use cases will emerge from operators themselves, often in partnership with GenAI technology firms. Networks are a major focus for the deployment of AI-enabled automation more broadly, and GenAI will be used both to complement existing focus areas and to address new opportunities.

Near-to-mid-term opportunities include what Altman Solon describes as "guided employee assistance for installation, troubleshooting and maintenance". Deploying field service operations teams to install network elements such as routers and switches, to maintain the network and to respond to outages is an extremely costly function. But the time it takes for network engineers to resolve issues can be shortened if they are given access to tools that enable them to quickly diagnose issues by consulting manuals and historical data gleaned from similar issues that have occurred in the past. This capturing of historical events, which can then be learned from, is also referred to as post mortem creation.

Another use case within network operations is capacity planning. Smart capex is an analytics- and AI-driven approach already used by many operators to make more efficient decisions about when and where to invest in new network capacity. GenAI can be used to enhance and fine-tune smart capex by adding unstructured data from customers - from conversations with customer service agents, comments on social media and so on - to more structured network data.
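As a toy illustration of that blend, the sketch below combines a structured utilization KPI with a sentiment score that an LLM might extract from customer comments to rank sites for investment. The site names, figures and weighting are invented for illustration and do not come from any operator mentioned here.

```python
# Sketch of smart-capex enhancement: blend structured utilization with sentiment
# mined from unstructured customer comments. All figures are illustrative assumptions.
sites = [
    {"site": "LON-014", "busy_hour_utilization": 0.91, "complaint_sentiment": -0.7},
    {"site": "MAN-202", "busy_hour_utilization": 0.76, "complaint_sentiment": -0.2},
    {"site": "BRI-087", "busy_hour_utilization": 0.88, "complaint_sentiment": -0.6},
]

def capex_priority(site: dict, sentiment_weight: float = 0.3) -> float:
    # Higher utilization and more negative sentiment both push the site up the list.
    return site["busy_hour_utilization"] + sentiment_weight * (-site["complaint_sentiment"])

for s in sorted(sites, key=capex_priority, reverse=True):
    print(f"{s['site']}: priority score {capex_priority(s):.2f}")
```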

There are many other potential applications for GenAI within the larger network planning category, for example network design. But CSPs are already using other types of AI for these tasks, and it is not always clear how much GenAI can add. Furthermore, such is the sensitivity of network data that operators' default position tends to be to proceed with caution.

Root cause analysis is an approach already used in CSPs' technology functions, and in enterprise IT more broadly, to solve issues by going to the root cause of faults or problems. By using GenAI to interrogate either internal or vendor documentation, CSPs have the possibility of automating the processes used to evaluate and classify alarms. This can help ensure that operators prioritize and address critical issues and generate recommendations for their resolution. An LLM can generate network topology diagrams or build a digital twin based on text descriptions, and learn patterns from historical data to identify anomalies and the root cause of problems. It can also automate troubleshooting and predictive maintenance. China Telecom, for example, is building its own GenAI system to find the root cause of network problems, using structured and unstructured data.

The Altman Solon survey also includes network image generation - a capability that assists with the guided employee assistance use case - and synthetic data generation for security testing.

[Chart] CSP GenAI use case adoption: network (operations). Use cases measured: guided employee assistance for installation, troubleshooting & maintenance; generative route/network design & network configuration; network image generation; synthetic data generation for security testing. (TM Forum, 2023; source: Altman Solon)

The creation of synthetic data came up as a strong theme in many of our conversations and is a capability which will help CSPs to implement many different use cases across their organizations. Synthetic data is information that is generated artificially rather than produced by real-world events. Typically created using algorithms, it can be deployed to validate mathematical models and is increasingly used to train AI/machine learning models. Synthetic data is also often used for product testing and software development.
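A minimal sketch of the idea, under assumed distributions: fabricating call records that mimic a simple international-fraud pattern so that scarce real fraud examples can be augmented before training a detection model. None of the thresholds or prefixes below are drawn from this report.

```python
# Sketch of synthetic data generation for fraud-detection training.
# Distributions, prefixes and thresholds are illustrative assumptions.
import random

def synthetic_call_record(fraudulent: bool) -> dict:
    if fraudulent:
        duration = random.expovariate(1 / 1800)        # long calls, mean ~30 minutes
        destination = random.choice(["+882", "+881"])  # example premium/satellite prefixes
        calls_last_hour = random.randint(20, 60)
    else:
        duration = random.expovariate(1 / 120)         # short calls, mean ~2 minutes
        destination = random.choice(["+44", "+49", "+1"])
        calls_last_hour = random.randint(0, 5)
    return {"duration_s": round(duration, 1),
            "destination_prefix": destination,
            "calls_last_hour": calls_last_hour,
            "label": int(fraudulent)}

# Balance a skewed training set by adding synthetic positive examples.
synthetic_frauds = [synthetic_call_record(True) for _ in range(1000)]
print(synthetic_frauds[0])
```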

Software engineering

This is an area where the use of copilots, or companions, can help software engineers and developers to be more productive. For example, AWS's AI coding companion uses a foundation model to generate code suggestions in real time. Its suggested use cases for CSPs include automating software development with text/voice to code, filling skills gaps and helping people without coding experience.

Sanchez at Tecnotree says a significant opportunity for CSPs lies in automating many of the repetitive tasks that sit within CSP operations. "Data mapping, data migration, these are constant problems in the industry," he says. "People transferring data from one place to another, validation, logic, a lot of business rules and business assumptions that have to then get coded into rules and code. Generative AI can help with a lot of this."

AT&T has given its employees access to a tool called Ask AT&T, initially built using ChatGPT. The first use case is "helping our coders and software developers across the company become more productive", says Andy Markus, AT&T's Chief Data Officer, in a blog. Another use case it is exploring is upgrading legacy software code and environments.

[Chart] CSP GenAI use case adoption: software engineering. Use cases measured: automated code generation, debugging and testing; automated code documentation; employee self-service desk; guided employee assistance. (TM Forum, 2023; source: Altman Solon)

Product development

Much of the interest in using GenAI for product innovation is based on the potential for CSPs to build better relationships with their customers and to learn about their tastes and preferences. Speaking at the DTW23 Ignite conference in Copenhagen in September, SK Telecom's Chung Suk-geun gave the example of a telecoms operator offering a customer travelling abroad personalized restaurant recommendations. Redefining customer relationships is one of the two overarching strategies that SK Telecom is pursuing with GenAI (the other is internal transformation). It has a vision for rolling out GenAI-enabled chatbots that learn about their customers' tastes and preferences and, in doing so, offer them personalized services.

Voice services also represent an opportunity for CSPs to add more "richness" and potentially to build new revenue streams, particularly in B2B. Voice calls could be transcribed and translated in real time and then stored and archived. Roaming presents a specific opportunity, allowing customers to converse with local people using their own spoken language. This builds on a concept called 5G New Calling, developed by China Mobile.

Internal knowledge

Every function that sits within a telecoms operator business has a requirement for knowledge, training and development. Today this is delivered via company intranets, third-party literature and documentation, and online and offline training and development programs. GenAI can help drive the adoption and effectiveness of these initiatives with, for example, employee chatbots and LLMs that deliver assistance in areas such as:

- competitor analysis
- staying up to date with new trends in technology, marketing and so on
- partner and supply chain background/conformance assessment.

The main challenge here is keeping the information used to train the model up to date. Competitive analysis, unless it is used only for general background purposes, can only be truly useful if it is updated on a daily basis.

Business operations

In addition to the six families of use cases we have listed so far, many other general uses are emerging from across the CSP business. A few examples are:

- GenAI can be employed to create synthetic datasets that mimic real-world fraud patterns. These synthetic datasets can be used to augment limited or imbalanced training data, improving the performance of fraud detection models. TM Forum published a whitepaper on this topic in September, and Canadian operator Telus is already using GenAI for fraud detection.
- Vodafone is building an intelligent assistant that uses GenAI to securely and quickly search through the 10,000+ contracts that it has signed with other telecoms operators for mobile roaming services.
- The legal departments of CSPs can use GenAI to help prepare contracts by feeding previous ones into an LLM.
- HR departments could use the technology to help classify and rank job candidates by feeding their CVs into a GenAI model.

Below we look at some of the TM Forum Catalyst proofs of concept experimenting with GenAI use cases. And in the next section we look at some of the risk factors and potential regulation surrounding the deployment of GenAI technologies.

GenAI Hyper-personalized Customer Experience was a finalist in the Outstanding Catalyst Business Impact category in the TM Forum 2023 Catalyst Awards. The Catalyst focuses on the future of customer experience using hyper-personalized, real-time GenAI. The aim is to address the problem of duplicate and unnecessary messages to customers, thereby reducing churn.

The solution is to send personalized messages through a real-time marketing automation pipeline that combines predictive and generative AI. It is predicated on three steps: carrying out complex event processing; using predictive AI to predict a churn score and then instantly create a dynamic campaign; and syndicating it through an omnichannel experience using generative AI to complete digital engagement.

The participants say the design pattern could be used for other applications such as network healing, agent workflows, bi-flow automation and the overall ethical use of data in AI technology evolution. There were five CSP champions: Bell Canada, BT, Deutsche Telekom, Telecom Italia and TurkNet. The solution, based on TM Forum's Open Digital Architecture and Open APIs, used Google Cloud's Vertex AI platform, Pegasystems' Customer Decision Hub, and Accenture's Customer Data platform.

Watch the video: Using GenAI to deliver hyper-personalized customer experience

The Digital Carbon Footprint Optimization Catalyst was the winner of the Best Moonshot Catalyst (The Energy Challenge) award at DTW23 Ignite. The project sets out to tackle the problem of measuring Scope 3 emissions (indirect emissions in the supplier value chain) by helping CSPs determine value chain emissions data and easily transpose that data to a product catalog.

The project uses real-time data and AI (including generative AI) built on sustainable cloud infrastructure to create a new way to build product bundles and provide ongoing carbon dioxide equivalent (CO2e) data and analytics. The solution enables customers to have full visibility of the carbon footprint of products such as phones during purchase. A built-in intelligent tool analyzes CO2e data using TM Forum Open APIs in various aspects of the shopping lifecycle, enabling the CSP to provide lower carbon footprint product offers and publish them across different digital channels. The Catalyst also demonstrates how to monitor consumers' battery consumption and battery health and provide recommendations based on AI/machine learning. Consumers thereby get full transparency into the carbon emissions generated by their devices.

Using data from sustainability consultancy Carbon Footprint, the project team was able to provide access to accurate, real-time carbon footprint information for each product component. CSP product managers and marketing teams were able to view CO2e data for each component, ultimately allowing them to create more environmentally oriented product bundles which make use of the CO2e data attributes visualized in the product catalog. The project champions were Vodafone and Carbon Footprint, with technology partners AWS, Amdocs and Snowflake.

Watch the video: Assessing Scope 3 emissions to deliver lower-carbon product offers

To achieve the level of network intelligence needed in the 5G era, CSPs are increasingly turning to AI to detect, diagnose and resolve network performance issues. The GenAI and knowledge-driven 5G operations Catalyst set out to develop an intelligent cognitive decision-making system for 5G network operation, with the aim of improving network performance, reducing operational costs and enhancing customer experience.

The project champions, China Telecom Research Institute and China Unicom Research Institute, together with their technology partners, are developing a solution to implement 5G intelligent operations, using generative AI to construct LLMs and a communication knowledge graph to perceive the quality and condition of the network and its services. The system can classify and perform root-cause analysis of problems, and then execute automated resolutions. It will automatically collect and process network data, classify and analyze root causes through a pre-trained AI model, and automatically delimit and locate problems in combination with a network resource database. It then matches the solution through the decision knowledge base, forms the intention with the resource base, and sends it to the network management system to identify the intention and perform operations on the network.

Building on TM Forum's Open APIs, the solution enables intelligent closed-loop automation to create and sustain a network capable of advanced self-cognition. As well as advancing the intelligence and efficiency of network operations, the solution can enable a range of other savings, such as reducing 5G network operations and maintenance (O&M) costs. Technology partners for the Catalyst included ZTE, Inspur Group and Beijing ZZNode Technologies.

Watch the video: GenAI helps deliver 5G intelligent operations

section 6: look while you leap - key challenges for CSPs

We have identified five categories of challenges relating to CSPs' use of GenAI. In some cases operators are facing these challenges today, while in others they are likely to be encountered in the medium term. All relate to the deployment of GenAI at scale. The categories we highlight are broad and there are many more challenges within them. But they are among the key issues CSPs will need to address to make full use of GenAI technology in the telecoms space.

Access to (CSP) internal data

CSPs have long struggled to organize, store, aggregate and extract the data that sits in their organizations and within their business and operational support systems (BSS/OSS). This should be no surprise given that those support systems have evolved over a period of 20-30 years and have been supplied by many different vendors. Nevertheless, considerable progress has been made in moving towards a unified data model (a common structure for sharing data), making it available across siloes and delivering it in real time. Furthermore, these challenges in accessing the right data apply to all forms of AI, not specifically GenAI.

The suitability of unstructured data for GenAI is an important differentiator from other types of AI. CSPs have, until now, been relatively unsuccessful in using unstructured data - for example, messages, transcriptions of voice calls or articles on the internet - in AI implementations. There are many early use cases in GenAI, particularly ones that use copilots, that exploit unstructured data. But most GenAI use cases will still need structured data from operators' support systems (Netcracker estimates that 90% of GenAI use cases need BSS/OSS data). The challenge for operators is how to integrate this sensitive customer data with the data that sits in public LLMs, and how to ensure its accuracy given that it is being drawn both from public sources, which cannot necessarily be trusted, and from operators' own more trustworthy data.

Another data-related challenge facing CSPs is making data available to non-technical individuals and teams who lack the knowledge or training to consult internal systems, which can often be complex and hard to navigate.

293、ile in others they are likely to be encountered in the medium term.All relate to the deployment of GenAI at scale.32section 6:look while you leap key challenges for CSPsinform.tmforum.orgSecurity,governance and privacyData security and privacy was cited as the number one big risk in our CSP survey b

294、y 80%of respondents(see chart above).Given the strong focus on using GenAI for customer operations,CSPs need to be fully cognizant of where the servers running the relevant LLMs are located.There are stiff penalties in many countries for sending customer data including anonymized customer data outsi

295、de of national borders.This means,for example,that because OpenAI runs its LLMs over Microsoft Azure,using customer data in GPT-3 or GPT-4 is only possible in those countries where Microsoft Azure has public cloud infrastructure.Regardless of where the LLM is running,the approach of customizing pre-

296、trained models and tailoring them with operators(sometimes proprietary)data is one that requires a robust data infrastructure and governance to ensure that the foundation model is trained on high-quality data.Even when the right safeguards and governance are in place,the very nature of GenAI means t

297、hat when it is deployed at scale,and across the organization,new systems and processes may be needed.“If its something youre going to scale you need to think about other things such as change management,”says Charlotta Lundell Berg,Head of Analytics at Telia.And Ibrahim Gedeon,CTO at Telus,believes

298、the sheer speed of innovation enabled by GenAI is putting existing security and privacy processes under pressure.“We used to get one,two or three projects that require a process to evaluate the data every year,but now with GenAI were getting one or two every week,”he says.This is forcing the operato

299、r to put more agile processes towards governance into place.As governments and regulators across the world consider the potential impact of GenAI on their economies and societies,many will introduce new safeguards in an effort to reduce some of the bigger risks.The European Union,for example,has pro

300、posed a regulatory framework for AI which What are the main risks for your organization of using GenAI?Big riskSmall riskShadow AI(i.e.non-authorized employee usage)Privacy and security(e.g.leaks of sensitive data via GenAI apps)Phishing and fraud(e.g.phishing emails or deep fakes)Lack of truth func

301、tion(i.e.creating factually incorrrect answers)Misinformation and manipulation(e.g.creation of fake content,documents)Bias and fairness(because real-world data is often biased)Legal considerations(through use of public data)Leaks of proprietary data48%52%20%80%36%64%26%74%38%62%34%66%26%74%27%73%TM

302、Forum,2023If its something youre going to scale you need to think about other things such as change management.”Charlotta Lundell Berg,Telia33section 6:look while you leap key challenges for CSPsinform.tmforum.orgspecifically references GenAI.Disclosing the content that is generated by AI,designing

303、models to prevent them from generating illegal content,and publishing summaries of copyrighted data use for training,will all become mandatory requirements for systems that use GenAI under the proposed AI Act.Phishing and fraud is another risk that can be put into this category and was cited by two

304、out of three of our survey respondents as a big risk.Similarly,legal considerations such as the use of public data which may be implicated in legal or license questions is a major concern.Other risks asked about in our survey sit within the category of data accuracy which we highlight below.IP and l

IP and lock-in

As we saw in section 4, there are two types of LLMs: proprietary ones owned by the companies that built them, and open-source ones. As such, the IP of an LLM is owned by its creator. But what happens when a company uses an LLM under license and adds to it through the process of fine-tuning? The operator may own the data it has added, but this does not mean it is possible, or easy, to transfer that data, or the learnings from the process of fine-tuning the model, from one LLM to another. An operator may want to change which LLM it is using because the model is being updated, or simply because it wants to use a different LLM.

Such considerations strengthen the argument for CSPs to develop their own LLMs. Indeed, a senior manager at a hyperscale service provider, who asked not to be named, told us he believed the telecoms industry "should own data and build the model". But as we saw in section 4, there are arguments for and against such approaches.

Issues over IP and the transferability of fine-tuning and learning activities also affect the approach that operators will take to customizing an LLM for their particular requirements. Other approaches include retrieval augmented generation (RAG), a way to optimize the output of an LLM with targeted information without modifying the underlying model itself, and prompt engineering (see page 9). The latter is a specialized AI skill that involves guiding and shaping model responses by carefully crafting specific and detailed questions, and testing out different ways to phrase instructions to generate better outputs.
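The sketch below shows the RAG pattern in miniature, and its final lines double as a tiny prompt-engineering example: a handful of policy documents are embedded, the most similar ones are retrieved for a question, and they are placed into a carefully framed prompt while the underlying model stays untouched. The embedding model, documents and question are illustrative assumptions; any chat-completion endpoint could consume the resulting prompt.

```python
# Sketch of retrieval augmented generation (RAG): retrieve targeted information and
# put it in the prompt, without modifying the model. All content is illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Fair-use policy: unlimited plans are throttled to 3 Mbps after 100 GB in a month.",
    "Roaming in the EU is included; outside the EU a daily pass applies.",
    "eSIM activation requires scanning the QR code sent to the registered email address.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model choice
doc_vectors = embedder.encode(documents, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    q = embedder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q  # cosine similarity, since the vectors are normalized
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

question = "Why has my mobile data slowed down?"
context = "\n".join(retrieve(question))
prompt = (f"Answer using only the context below. If the context is insufficient, say so.\n\n"
          f"Context:\n{context}\n\nQuestion: {question}")
print(prompt)  # this prompt would then be sent to whichever LLM the operator has chosen
```

Because the retrieved passages are known, this pattern also helps with the traceability concerns discussed in the next subsection.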

Data accuracy and traceability

Many factors contribute to an LLM generating information which is unbiased and accurate. The key question is how accurate the data needs to be to meet the requirements of a particular use case. Less accurate data can still be valuable if the alternative is no data at all, or if it is being used to help guide someone in making a decision or taking a specific course of action. But much more accurate data is needed to replace human decision-making and interaction.

Accuracy is a function of the data that is fed into a model and of how the model uses and interprets it. Bias exists when the totality of the content on the internet on a particular topic does not show a suitable balance of purported facts or opinions. Hallucinations occur because the output of LLMs tends to be presented in such a convincing way that it is easy to accept it as fact. Accuracy can also be undermined by a lack of up-to-date information if, for example, there has been a cut-off date for the information that has been fed into a model. The cut-off date for GPT-3, for example, was originally September 2021, but this has now been addressed through RAG.

"Reasoning" errors can be more difficult to address. LLMs apply statistical analysis to large bodies of text, but they are not logic engines. Furthermore, different LLMs can produce very different responses to the same interrogation or prompt. This can distort results when, for example, a CSP is making accuracy comparisons between two LLMs. The way that a prompt is phrased may create a misleading impression that one LLM is better than another, when the reverse may be true if the wording is changed.

The ability to trace the output from an LLM is also extremely important, particularly if outputs are showing sub-standard results and the user needs to trace them back to the original source of information. Traceability can also be important for issues around privacy, security or legality.

Cost and ROI

The cost of experimenting with LLMs is not prohibitive and is not a factor affecting how CSPs are learning how, where and when to develop early use cases. However, it could well become a factor when GenAI is deployed at scale. Rather than extending the API-based pricing approach being used today, which scales directly with usage, many CSPs would like to see a transition to new pricing models based on the success or outcome of a specific use case. However, hyperscale service providers, given the colossal investments they are making in GenAI, will come under enormous pressure to adopt pricing approaches which enable them to show investors that they are starting to make returns on those investments.

The visibility that a CSP has of costs varies depending on the use case. Michiel van Rijthoven, Lead Data Scientist at VodafoneZiggo in the Netherlands, cites the example of a customer chatbot using GenAI which, because there are clear metrics in place, enables the operator to gauge costs. "We know how many calls we get into the call center, we know the length of the calls, so we can make some predictions about the cost. And yes, the cost will be significant, but the business case is extremely positive," he says.
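The back-of-the-envelope arithmetic behind such a business case can be sketched as below. Every figure is an illustrative assumption rather than VodafoneZiggo data: known call volumes and an assumed containment rate make both the usage-based LLM spend and the agent-cost saving estimable.

```python
# Sketch of a chatbot business case. All figures are illustrative assumptions.
calls_per_month = 500_000
containment_rate = 0.30           # share of calls the GenAI chatbot fully resolves
cost_per_agent_call = 4.50        # assumed fully loaded cost of a human-handled call (USD)
llm_cost_per_chat_session = 0.08  # assumed usage-based API cost per automated session (USD)

deflected = calls_per_month * containment_rate
gross_saving = deflected * cost_per_agent_call
llm_spend = deflected * llm_cost_per_chat_session
print(f"Monthly net saving: ${gross_saving - llm_spend:,.0f} "
      f"(gross ${gross_saving:,.0f} minus LLM spend ${llm_spend:,.0f})")
```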

In the longer term there is a risk that operators will incur technical debt in GenAI because of the growing array of choices emerging for how to generate better results from LLMs. As we have detailed, these include fine-tuning, RAG and prompt engineering. If the usage of these techniques is not properly monitored and managed, operators could find themselves in a situation where it becomes difficult to adopt new, efficient techniques at scale because of the cost and difficulty of migrating old systems.

In the final section we look at some of the key findings from our research for this report, as well as some recommendations for CSPs setting out to experiment with and deploy GenAI LLMs.

section 7: key findings and recommendations

Throughout this report we have focused on the opportunities and challenges for CSPs considering, and starting out on, the deployment of generative AI large language models. This section brings together those findings. The figures quoted in this section are based on our survey of 104 executives from 73 operators.

CSPs are responding in different ways to the GenAI opportunity. Here are some of the early trends:

- Despite the fact that GenAI as a commercial proposition is only a year old, more than half of CSPs are experimenting with it today. The mandate for launching into GenAI is, in many cases, coming from the main board.
- The desire to act "hard and fast" on GenAI is putting intense pressure on technology leaders. They now need to work out how to align their existing AI strategies and focus with these new, bigger ambitions. In our survey there is an even split between those operators which believe that GenAI requires a specific, distinct focus and those which see it just as another type of AI that can be incorporated into an existing strategy.
- CSPs are focused today on low-hanging fruit: use cases that can be exploited using the copilot services offered by, for example, hyperscale service providers, and with out-of-the-box LLMs fine-tuned with their own data.
- Operators today are experimenting with both proprietary and open-source LLMs. The landscape is constantly shifting, and operators will experiment with many different models.
- GenAI is ideally suited to working on unstructured data and offers an immediate opportunity for CSPs. Many have struggled until now to make good use of the unstructured data that sits within, or adjacent to, their organizations.
- Today's API-based LLM pricing models allow CSPs to experiment, and to build proofs of concept, without particular concern about the costs they are incurring. However, as and when operators deploy GenAI at scale such pricing mechanisms may become less attractive.
- Some use cases for CSPs involve the use of GenAI as the only type of AI, but many require GenAI to be combined with other types of AI. For example, operators are already deploying AIOps in areas such as network operations, and GenAI will be a complementary technology with the potential to deliver improved performance.

Main challenges

- There are many internal issues that CSPs need to address before they can deploy GenAI at scale. These include siloed data, the lack of a common ontology and incomplete information architectures.
- While concerns about LLM accuracy (for example, bias and hallucinations) represent important challenges today, new techniques are emerging that will deliver significant improvements. As such, they may represent less of a long-term concern for CSP businesses than issues around privacy, security, IP and the cost of deploying GenAI at scale.
- The ability to measure model performance is a significant challenge in GenAI. In the world of predictive AI and machine learning, this was done through techniques such as precision and recall. However, approaches in AI are st…
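For reference, the predictive-AI measures mentioned above are straightforward to compute; the sketch below shows them from illustrative counts. The difficulty with GenAI is that free-text outputs rarely come with an equally simple ground truth to count against.

```python
# Precision and recall as used for predictive AI/machine learning models.
# The counts below are illustrative assumptions.
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    precision = tp / (tp + fp)  # share of items the model flagged that were actually positive
    recall = tp / (tp + fn)     # share of actual positives that the model managed to flag
    return precision, recall

print(precision_recall(tp=80, fp=20, fn=40))  # -> (0.8, 0.666...)
```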
