More Of The Latest Thoughts From American Technology Companies On AI (2023 Q4)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2023 Q4 earnings season.

Nearly a month ago, I published The Latest Thoughts From American Technology Companies On AI (2023 Q4). In it, I shared commentary, given during earnings conference calls for the fourth quarter of 2023 by the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large. 

A few more technology companies I’m watching hosted earnings conference calls for 2023’s fourth quarter after I prepared the article. The leaders of these companies also had insights on AI that I think would be useful to share. This is an ongoing series. For the older commentary:

Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management thinks the company is a leader in delivering generative AI and has a highly differentiated approach through the use of proprietary data and by being friendly with intellectual property

We’re a leader in delivering generative AI across all our clouds. We’re taking a highly differentiated approach across data, models, and interfaces. Our proprietary data is built on decades of deep domain expertise across creative, documents and customer experience management. We leverage large language models as well as have invested in building and delivering our proprietary models in the creative, document, and marketing domains. Our IP-friendly approach is a differentiator for creators and enterprises.

Adobe’s management sees an immense market opportunity for the company in AI and thinks that the company is uniquely positioned to capture the opportunity; Adobe’s end-to-end generative AI solution, GenStudio, is already seeing success with enterprises; GenStudio is a generative AI application that helps marketers plan, create, store, and deliver marketing content; GenStudio is integrated across Creative Cloud and Experience Cloud

Every student, communicator, creative professional, and marketer is now focused on leveraging generative AI to imagine, ideate, create and deliver content and applications across a plethora of channels. Adobe is uniquely positioned through the combination of Express, Firefly, Creative Cloud, Acrobat, and Experience Cloud to deliver on this immense market opportunity. The success we are already seeing with our GenStudio offering in the enterprise is validation of our leadership, and we expect that success to translate into other segments as we roll out these solutions throughout the year…

…Adobe GenStudio is a generative AI-first application that allows marketers to quickly plan, create, store, deliver, and measure marketing content in a single intuitive offering. With state-of-the-art generative AI powered by Firefly services, marketers can create on-brand content with unprecedented scale and agility to deliver personalized experiences. Adobe GenStudio natively integrates with multiple Adobe applications across Creative Cloud and Experience Cloud, including Express, Firefly, Workfront, Experience Manager, Customer Journey Analytics and Journey Optimizer. It can be used by brands and their agency partners to unlock new levels of creativity and efficiency in marketing campaigns.

Adobe’s management is seeing strong usage, value and demand for its AI solutions across all customer segments

We’re driving strong usage, value and demand for our AI solutions across all customer segments.

Acrobat AI Assistant uses generative AI to summarise long PDFs, answer questions through a chat interface, help with generating emails, reports, and presentations; AI Assistant has strong data security; Adobe’s management is pleased with the English language beta of AI Assistant and Adobe will be releasing other languages later in the year; management will monetise AI Assistant through a monthly add-on for Reader and Acrobat users; management thinks there’s a lot of monetisation opportunity with AI Assistant, including consumption-based monetisation

The world’s information, whether it’s an enterprise legal contract, a small business invoice, or a personal school form, lives in trillions of PDFs. We were thrilled to announce Acrobat AI Assistant, a massive leap forward on our journey to bring intelligence to PDFs. With AI Assistant, we’re combining the power of generative AI with our unique understanding of the PDF file format to transform the way people interact with and instantly extract additional value from their most important documents. Enabled by a proprietary attribution engine, AI Assistant is deeply integrated into Reader and Acrobat workflows. It instantly generates summaries and insights from long documents, answers questions through a conversational interface, and provides an on-ramp for generating e-mails, reports and presentations. AI Assistant is governed by secure data protocols so that customers can use the capabilities with confidence. We’re pleased with the initial response to the English language beta and look forward to usage ramping across our customer base as we release other languages later in the year. We will monetize this functionality through a monthly add-on offering to the hundreds of millions of Reader users as well as the Acrobat installed base across individuals, teams, and enterprises…

…Everyone is looking at AI Assistant in Acrobat. I certainly hope all of you are using it. It should make your lives more effective. Not just for insight, we think that there’s a lot of opportunity for monetization of insight for AI Assistant on our core base of Acrobat users but also, for the first time, doing consumption-based value. So the hundreds of millions of monthly active users of Reader will also be able to get access to AI Assistant and purchase an add-on pack there, too. So it’s a really broad base to look at how we monetize that.

Adobe’s generative AI model for creative work, Adobe Firefly, has been released for around a year and has been integrated into Creative Cloud and within Adobe Express; users of Creative Cloud and Adobe Express have generated >6.5 billion creative assets to-date (was 4.5 billion in 2023 Q3) across images, vectors, designs, and text effects; Firefly has a web-based interface which has seen tremendous adoption; enterprises can now embed Firefly into their own workflow through Firefly Services, which is commercially safe for enterprises to use

Adobe Express is inspiring millions of users of all skill levels to design more quickly and easily than ever before. In the year since we announced and released Adobe Firefly, our creative generative AI model, we have aggressively integrated this functionality into both our Creative Cloud flagship applications and more recently, Adobe Express, delighting millions of users who have generated over 6.5 billion assets to date.

In addition to creating proprietary foundation models, Firefly includes a web-based interface for ideation and rapid prototyping, which has seen tremendous adoption. We also recently introduced Firefly Services, an AI platform which enables every enterprise to embed and extend our technology into their creative production and marketing workflows. Firefly Services is currently powered by our commercially safe models and includes the ability for enterprises to create their own custom models by providing their proprietary data sets as well as to embed this functionality through APIs into their e-mail, media placement, social, and web creation process…

…The 6.5 billion assets generated to date include images, vectors, designs, and text effects. 

IBM is an early adopter of Firefly and has used it to generate marketing assets much faster than before and that have produced much higher engagement

Early adopters like IBM are putting Firefly at the center of their content creation processes. IBM used Adobe Firefly to generate 200 campaign assets and over 1,000 marketing variations in moments rather than months. The campaign drove 26x higher engagement than its benchmark and reached more key audiences.

Firefly is now available in mobile workflows through the Adobe Express mobile app beta and has a first-of-its-kind integration with TikTok’s creative assistant; the introduction of Firefly for enterprises has helped Adobe win enterprise clients in the quarter

The launch of the new Adobe Express mobile app beta brings the magic of Adobe Firefly AI models directly into mobile workflows. The first-of-its-kind integration with TikTok’s creative assistant makes the creation and optimization of social media content quicker, easier and more effective than ever before…

…The introduction of Firefly Services for enterprises drove notable wins in the quarter, including Accenture, IPG, and Starbucks. Other key enterprise wins include AECOM, Capital Group, Dentsu, IBM, Nintendo, and RR Donnelley.

During 2023 Q4 (FY2024 Q1), Adobe’s management saw the highest adoption of Firefly-powered tools in Photoshop since the release of Generative Fill in May 2023

Generative Fill in Photoshop continues to empower creators to create in new ways and accelerate image editing workflows. Q1 saw the highest adoption of Firefly-powered tools in Photoshop since the release of Generative Fill in May 2023, with customers adopting these features across desktop, web and most recently, iPad, which added Generative Fill and Generative Expand in December.

Adobe’s management expects Adobe’s AI-powered product features to drive an acceleration in the company’s annual recurring revenue (ARR) in the second half of the year; management thinks the growth drivers are very clear

You can expect to see the product advances in Express with Firefly on mobile, Firefly services and AI Assistant in Acrobat drive ARR acceleration in the second half of the year…

…As we look specifically at Creative Cloud, I just want to sort of make sure everyone takes a step back and looks at our strategy to accelerate the business because I think the growth drivers here are very clear. We are focused on expanding access to users with things like Express on mobile. We want to introduce new offers across the business with things like AI Assistant and also existing capabilities for Firefly coming into our core Firefly, our core Photoshop and Illustrator and flagship applications. We want to access new budget pools with the introduction of Firefly services and GenStudio as we talked about…

…And as we enter the back half of the year, we have capabilities for Creative Cloud pricing with Firefly that have already started rolling out late last year, as we talked about, and we’ll be incrementally rolling out throughout the year. We’re ramping Firefly Services and Express in the enterprise. As we talked about, we saw a very good beginning of that rollout toward the end of Q1. We also expect to see the second half ramping with Express Mobile and AI Assistant coming through. So we have a lot of the back-end capabilities set up so that we can start monetizing these new features, which are still largely in beta, starting in Q3 and beyond…

…We are very excited about all the innovation that’s coming out, that’s just starting to ramp in terms of monetization and/or still in beta on the Creative Cloud side. We expect that to come out in Q3 and we’ll start our monetization there. So we continue to feel very confident about the second half acceleration of Creative Cloud…

…Usage of Firefly capabilities in Photoshop was at an all-time high in Q1, Express exports more than doubled with the introduction of Express mobile, now in beta and going to GA in the coming months, and AI Assistant in Acrobat is following the same pattern. You can see that momentum as we look into the back half of the year. And from an enterprise standpoint, the performance in the business was really, really superb in Q1, our strongest Q1 ever in the enterprise. So there are a lot of fundamental components that we’re seeing around performance of the business that give us confidence as we look into the back half of the year.

Adobe’s management believes that the roll out of personalisation at scale has been limited by the ability of companies to create content variations and this is where generative AI can help

Today, the rollout of personalization at scale has been limited by the number of content variations you can create and the number of journeys you can deploy. We believe harnessing generative AI will be the next accelerant, with Creative Cloud, Firefly Services, and GenStudio providing a comprehensive solution for the content supply chain, and the generative experience model automating the creation of personalized journeys.

Adobe’s management believes that AI augments human ingenuity and expands the company’s market opportunity

We believe that AI augments human ingenuity and expands our addressable market opportunity.

Adobe’s management is monetising Adobe’s AI product features in two main ways – via generative packs and via whole products – and they are progressing in line with expectations; management thinks that it’s still early days for Adobe in terms of monetising its AI product features

I think where there’s tremendous interest and where, if you look at it from an AI monetization, the 2 places that we’re monetizing extremely in line with our expectations, the first is as it relates to the Creative Cloud pricing that we’ve rolled out. And as you know, the generative packs are included for the most part in how people now buy Creative Cloud, that’s rolling out as expected. And the second place where we are monetizing it is in the entire enterprise as it relates to Content and GenStudio. And I’m really happy about how that’s monetizing it. And that’s a combination, Brent, of when we go into an enterprise for creation, whether we provide Creative Cloud or a combination of Express, what we are doing with asset management and AEM, Workflow as well as Firefly services to enable people to do custom models as well as APIs. We’re seeing way more monetization earlier, but again, very much in line with expected…

…As it relates to the monetization of AI, I think we’re in early stages as it relates to experimentation. So we’re looking at both what the value utilization is as well as experimentation. The value utilization is actually really positive for us. I think as it relates to the monetization and the experimentation, we have the generative packs, as you know, in Creative Cloud. I think you will see us more and more have that as part of the normal pricing and look at pricing, because that’s the way in which we want to continue to see people use it. I think in Acrobat, as you’ve seen, we are not actually using the generative packs. We’re going to be using more of an AI Assistant model, which is a monthly model. As it relates to the enterprise, we have both the ability to do custom models, which depends on how much content that they are creating as well as an API and metering. We’ve rolled that out and we started to sell that as part of our GenStudio solution.

Adobe’s management pushed out the enforcement of generative AI credit limits beyond April 2024 because Adobe is still in user-acquisition mode for its AI product features

[Question] You pushed out the enforcement of generative credit limits for some products beyond April that were originally expected sooner. What’s the thinking behind this decision? And what are you seeing thus far in terms of credit consumption and purchasing patterns of those credit packs? 

[Answer] In terms of the timing of the — when we start enforcing credits, don’t read anything into that other than right now we are still very much in acquisition mode. We want to bring a lot of users in. We want to get them using the products as much as possible. We want them coming back and using it…

…So right now, look, the primary point is about proliferation and usage. 

Adobe recently released generative AI capabilities in music composition, voice-dubbing, and lip-syncing; these capabilities will require a lot of generative AI credits from users

 In the last few weeks, we’ve done a couple of sneaks that could also be instructive. Last month, we snuck music composition where you can take any music track, you can give it a music type like hip-hop or orchestra or whatever, and it will transform that initial track into this new type of music. Just this morning, we snuck our ability to do auto-dubbing and lip-syncing where you give it a video of someone talking in, say, English and then you can translate it automatically to French or Spanish or Portuguese or whatever. As you can imagine, those actions will not take 1 credit. Those actions will be much more significant in terms of what they cost.

Adobe’s management thinks that developments in generative AI models for video, such as the recent release of Sora by OpenAI, are a net positive for Adobe; Adobe is also developing its own generative AI video models and will be releasing them later this year

[Question] Clearly, a lot of news around video creation using generative AI during the quarter, of course, with the announcement of Sora. Maybe the question for you folks is can we just talk a little bit about how you think about the market impact that generative AI can have in the video editing market and how maybe Firefly can participate in that trend?

[Answer] So really great advances, but net-net, video is going to be even more of a need for editing applications in order to truly take advantage of generative AI…

…We see the proliferation of video models to be a very good thing for Adobe. We’re going to work with OpenAI around Sora. You’re going to see us obviously developing our own model. You’re going to see others develop their model. All of that creates a tailwind because the more people generate video clips, the more they need to edit that content, right? So whether it’s Premiere or After Effects or Express, they have to assemble those clips. They have to color correct those clips. They have to tone-match. They have to enable transitions. So we’re excited about what we’re building, but we’re just as excited about the partnerships that we see with OpenAI and others coming down this path. And if you take a step back, you should expect to see more from us in the weeks ahead with imaging and vector, design, text effects, in the months ahead with audio and video and 3D. We’re very excited about what all of this means, not just for the models, but for our APIs and our tools.

Adobe’s management thinks that Adobe is in a great position to attract talent for technical AI work because they believe that the company has one of the best AI research labs and can provide access to AI computing hardware; management also thinks that Adobe is in a great position to attract talent for AI sales

And so that’s not just an Adobe perspective, but it’s playing out, obviously, in the enterprises as they look at what are the models that they can consider using for production workflows. We’re the only one with the full suite of capabilities that they can do. It’s a really unique position to be in. But it’s also being noticed by the research community, right? And as the community starts looking at places, if I’m a PhD that wants to go work in a particular environment, I start to ask myself the question of which environment do I want to pick. And a lot of people want to do AI in a responsible way. And that has been a very, very good opportunity for us to bring in amazing talent. So we are investing. We do believe that we have the best — one of the best, if not the best, research labs around imaging, around video, around audio, around 3D, and we’re going to continue to attract that talent very quickly. We’ve already talked about we have the broadest set of creative models for imaging, for vector, for design, for audio, for 3D, for video, for fonts and text effects. And so this gives us a broad surface area to bring people in. And that momentum that starts with people coming in has been great.

The second part of this, too, is managing access to GPUs while maintaining our margins. We’ve been able to sort of manage our cost structure in a way that brings in the talent and gives them the necessary GPUs to do their best work…

…Regarding the sales positions in enterprises. In enterprise, we’re in a strong position because this area of customer experience management remains a clear imperative for enterprise customers. Everybody is investing in personalization at scale and the content supply chain. These help drive both growth and profitability. So when you look at these areas, from an enterprise perspective, these are a must-have, not a nice-to-have. And that’s helping us really attract the right kind of talent. We just onboarded, this week, a VP of Sales who had a lot of prior experience at Cisco and Salesforce, et cetera. 

Adobe’s management believes that Adobe’s tools will be in demand even in an AI dominated world and will not be automated away

[Question] I just wanted to give you an opportunity to debunk this hypothesis that is going around that AI is generating videos and pictures today, but the next step is that it’s going to do the actual editing and put Premiere Pro out of use, or whatnot. So that is probably the existential threat that people are debating.

[Answer] So as it relates to generative content, I’m going to sort of break it up into 2 parts. One is around the tooling and how you create the content and the second is around automation associated with the content…

…So I think the core part of this is that as more of this content gets created, you need more tooling. The best models are going to be the models that are safe to use and have control built in from the ground up. And I think we have the best controls of anyone in the industry. And they need to be able to be used in an automated fashion that can embed into your workflow. 

Adobe’s management believes that as generative AI models proliferate in society, the demand for Adobe’s products will increase partly because there will be a rise in the number of interfaces that people use for creative content

I think the first question that I hear across many folks is, hey, with the advent of AI and the increase in the number of models that people are seeing, whether they be image models or video models, does that mean that the number of seats, both for Adobe and in the world, do they increase? Or do they decrease? To me, there’s no question in my mind that when you talk about the models and interfaces that people will use to do creative content, that the number of interfaces will increase. So Adobe has to go leverage that massive opportunity. But big picture, models will only cause more opportunity for interfaces. And I think we’re uniquely qualified to engage in that, so that’s the first one.

Adobe’s management wants Adobe to work with many different AI models, even those from third-parties

Do we only leverage the Adobe model? Or is there a way in which we can leverage every other model that exists out there? Much like we did with plug-ins, with all of our Creative applications, any other model that’s out there, we will certainly provide ways to integrate that into our applications, so anybody who’s using our application benefits not just from our model creation but from any other model creation that’s out there…

…But long term certainly, as I’ve said with our partnerships, we will have the ability for Adobe, in our interfaces, to leverage any other model that’s out there, which again further expands our opportunity.

MongoDB (NASDAQ: MDB)

MongoDB’s management thinks that AI will be a long-term growth driver for the company, but it’s still early days; management sees three layers to the AI stack – the first being compute and LLMs (large language models), the second being the fine-tuning of models and the building of AI applications, and the third being the deployment and running of applications that end users interact with – and most of the AI spend today is at the first layer, where MongoDB does not compete; MongoDB’s customers are still at the experimentation and prototyping stages of building their initial AI applications and management expects the customers to take time to move up to the second and third layers; management believes that MongoDB will benefit when customers start building AI applications

While I strongly believe that AI will be a significant driver of long-term growth for MongoDB, we are in the early days of AI, akin to the dial-up phase of the Internet era. To put things in context, it’s important to understand that there are 3 layers to the AI stack. The first layer is the underlying compute and LLMs, the second layer is the fine-tuning of models and the building of AI applications, and the third layer is deploying and running the applications that end users interact with. MongoDB’s strategy is to operate at the second and third layers to enable customers to build AI applications by using their own proprietary data together with any LLM, closed or open source, on any computing infrastructure.

Today, the vast majority of AI spend is happening in the first layer, that is, investments in compute to train and run LLMs, neither of which are areas in which we compete. Our enterprise customers today are still largely in the experimentation and prototyping stages of building their initial AI applications, first focused on driving efficiencies by automating existing workflows. We expect that it will take time for enterprises to deploy production workloads at scale. However, as organizations look to realize the full benefit of these AI investments, they will turn to companies like MongoDB, offering differentiated capabilities in the upper layers of the stack. Similar to what happened in the Internet era, when value accrued over time to companies offering services and applications leveraging the built-out Internet infrastructure, platforms like MongoDB will benefit as customers build AI applications to drive meaningful operating efficiencies, create compelling customer experiences, and pursue new growth opportunities…

…While it’s early days, we expect that AI will not only support the overall growth of the market, but also compel customers to revisit their legacy workloads and build more ambitious applications. This will allow us to win more new and existing workloads and to ultimately establish MongoDB as a standard in enterprise accounts. 

MongoDB’s management is already seeing the company’s platform resonate with AI startups that are building applications across wide use cases, and this gives management confidence that MongoDB is a good fit for sophisticated AI workloads

We already see our platform resonating with innovative AI startups building exciting applications for use cases such as real-time patient diagnostics for personalized medicine, cyber threat data analysis for risk mitigation, predictive maintenance for maritime fleets and auto generated animations for personalized marketing campaigns…

…we do see some really interesting start-ups who are building on top of MongoDB, so it gives us confidence about our platform’s fit for these sophisticated workloads. 

There are three elements that are important when migrating from a relational database to a non-relational database and MongoDB’s current relational migrator offering helps automate the first two elements; the third element – rewriting the application code – is manually intensive and management believes that generative AI can help to tremendously improve the experience there

There are 3 elements to migrating an application: transforming the schema, moving the data, and rewriting the application code. Our current relational migrator offering is designed to automate large parts of the first 2 elements, but rewriting application code is the most manually intensive element. Gen AI holds tremendous promise to meaningfully reduce the cost and time of rewriting application code. We will continue building AI capabilities into relational migrator, but our view is that the solution will be a mix of products and services.

Samsung Electronics’ Digital Appliances division migrated from its previous MySQL database to MongoDB Atlas; the Samsung Smart Home Service can leverage MongoDB’s document database model to collect real-time data for training AI services; the migration improved response times by >50% and the latency was reduced from 3 seconds to 18 milliseconds

Samsung Electronics’ Digital Appliances division transitioned from their previous MySQL database to MongoDB Atlas to manage client data more effectively. By leveraging MongoDB’s document model, Samsung’s Smart Home Service can collect real-time data from the team’s AI-powered home appliances and use it for a variety of data-driven initiatives such as training AI services. The migration to MongoDB Atlas improved response times by more than 50%, and latency was reduced from 3 seconds to 18 milliseconds, significantly improving availability and developer productivity.
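The document-model advantage mentioned above can be sketched with a toy example. This is only a hedged illustration, not Samsung's actual schema: every field name below (`device_id`, `telemetry`, and so on) is hypothetical. The point is that a reading which might span several normalized tables in MySQL lives as one self-describing document in MongoDB, so accessing related data is a direct traversal rather than a multi-table join.

```python
# A minimal sketch of storing smart-home telemetry as a single document.
# All field names are hypothetical, invented for illustration only.
appliance_reading = {
    "device_id": "washer-042",           # in SQL, likely a foreign key into a devices table
    "type": "washing_machine",
    "ts": "2024-03-01T09:15:00Z",
    "telemetry": {                       # nested sub-document; no JOIN needed to read it
        "cycle": "eco",
        "water_temp_c": 30,
        "vibration": [0.1, 0.3, 0.2],    # arrays are first-class values in a document
    },
}

def cycle_of(doc):
    # Reading nested data is a direct dictionary traversal.
    return doc["telemetry"]["cycle"]

print(cycle_of(appliance_reading))  # prints: eco
```

Because AI-powered appliances can add new telemetry fields over time, the flexible schema means new fields simply appear in new documents, with no `ALTER TABLE` migration required.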

MongoDB’s management believes that the performance and cost of AI applications are still not up to mark, using ChatGPT as an example

And also, these technologies are still maturing, both from a performance and from a cost point of view. If you’ve played with ChatGPT or any of the other chatbots or large language models out there, you’ll know that these applications take 1 to 3 seconds to produce a response, depending on the type of question you’re asking. And so naturally, a chatbot is a very simple but easy-to-understand use case, but to embed that technology into a sophisticated application making real-time decisions based on real-time data, the performance and, to some degree, the cost of these architectures are still not there…

…The performance of some of these systems is — I would classify as okay, not great. The cost of inference is quite expensive. So people have to be quite careful about the types of applications they deploy.

MongoDB’s management thinks that this year is when MongoDB’s customers start rolling out a few AI applications and learn; it will be at least another year when the positive impacts of AI to MongoDB’s business really starts showing up

Customers are still in the learning phase. They’re experimenting, they’re prototyping. But I would say you’re not seeing a lot of customers really deploy AI applications at scale. So I think it’s going to take them time. I would say, this year is a year where they’re going to probably roll out a few applications, learn…

… I think it’s going to show up in a business when people are deploying AI apps at scale, right? So I think that’s going to be at least another year.

MongoDB’s management believes that the company is very well-positioned to capture AI application workloads because of the technologies underneath its platform and because it is capable of working with a wide range of AI models

We feel very good about our positioning because from an architecture point of view, the document model, the flexible schema, the ability to handle real-time data, performance at scale, the unified platform, the ability to handle data, metadata and vector data with the same query language, same semantics, et cetera, is something that makes us very, very attractive…

…We feel like we’re well positioned. We feel that people really resonate with the unified platform, one way to handle data, metadata, and vector data; that we are open and composable; that we integrate with not only all the different LLMs but also different embedding models; and that we also integrate with some of the emerging application frameworks that developers want to use. So we think we’re well positioned
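The "one query language for data, metadata and vector data" point can be illustrated with a sketch: in MongoDB Atlas, a semantic (vector) search and an ordinary metadata filter are just stages in the same aggregation pipeline. The index and field names below are hypothetical, and actually executing this requires an Atlas cluster with a vector search index, so here we only construct and inspect the pipeline structure.

```python
# Hedged sketch: a single aggregation pipeline mixing vector search with
# ordinary filtering. Index/field names ("my_vector_index", "embedding",
# "status") are invented for illustration.
query_vector = [0.12, -0.08, 0.33]  # stand-in embedding from any embedding model

pipeline = [
    {
        "$vectorSearch": {               # Atlas Vector Search stage
            "index": "my_vector_index",  # hypothetical index name
            "path": "embedding",         # hypothetical field holding embeddings
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$match": {"status": "published"}},  # ordinary metadata filter, same syntax
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]

# With a live cluster and PyMongo, this would run as:
#   results = db.articles.aggregate(pipeline)
print(len(pipeline))  # prints: 3
```

The design point is that a developer writes the semantic search, the filter, and the projection in one language against one platform, rather than stitching together a separate vector database and an operational database.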

MongoDB’s management is seeing that AI-related decisions are being made at the senior levels of a company, and so MongoDB is engaging with customers at that senior level

The other thing that we’re finding is unlike a typical sale where someone is deciding to either build a new workload or modernize a workload. The AI decision is more of a central decision — centralized decision more than ever. So it allows us to actually go higher in the organization. So we’re actually engaging with customers at much more senior levels because, obviously, this is coming down as a top-down initiative.

MongoDB’s management is seeing the first wave of AI use cases among companies being focused on reducing costs, code generation, and increasing developer productivity

In regards to use cases, we’re seeing most customers focus on driving efficiencies in their business because their existing baseline of costs are well known. So it’s much easier for them to determine how much value they can derive by using some of these new AI technologies. So I see the first wave of applications being around reducing costs. You’ve seen some announcements by some customers saying that, focusing on things like customer support and customer service, they have found ways to dramatically reduce their costs. That’s not surprising to me. I think code generation and increasing developer productivity is another area. I think those are going to be kind of 2 areas where there’s low-hanging fruit. 

MongoDB’s management is seeing high interest in AI across almost every industry

In terms of across industries, I think it’s — obviously, there’s some constraints on some customers based on the regulated nature of their industry. But in general, we see basically high interest across almost every industry that we operate in.

Customers migrate off relational databases to MongoDB for three key reasons – (1) their data model has become brittle with relational architecture, (2) their legacy systems are not scaling properly, and (3) the costs have become high – and they are accompanied by a compelling event; customers also conduct the migration to make their data more AI-enabled

Even before our IPO, we had a meaningful number of customers migrating off relational to MongoDB. They tend to come in 3 categories of reasons why. First is that the data model has become so brittle with relational architecture that it is very hard to build new features and be responsive to their customers. And so they just feel like their ability to innovate has slowed down. The second reason is that the system is just not scaling or performing given the increased number of users or the large amount of data that they have to process, that they realize that they have to get off a legacy platform. And the third reason is just the cost of the underlying platform relative to the ROI of that application. So it typically falls in one of those three buckets. Sometimes customers may have all 3 or maybe 2 of the 3 that are driving that demand. And then there’s typically some compelling event. Maybe there’s some milestones they want to hit. Maybe there’s a renewal coming up with the incumbent vendor that’s driving them to potentially move off that vendor as quickly as possible…

… On top of the 3 reasons I gave you in terms of why people move, there’s now a fourth reason, which is enabling their data and their applications to be more AI-enabled. And so it’s not just moving to a more modern platform, but making them more AI-enabled. And so that’s also something that’s getting customers’ interest.

Okta (NASDAQ: OKTA)

Okta’s management has built a strong pipeline of products that are powered by Okta AI

We’re expanding on the world’s most robust and modern identity platform, and we have a strong pipeline of products and functionality powered by Okta AI.

Okta’s management believes that Spera (the company’s new acquisition) is going to help with threat protection; threat protection with Okta AI and the Spera product will be packaged and monetised as add-ons

And so you’re seeing customers really starting, as they lean in and do more with modern identity, they’re also at the same time saying, what is this class of tools and technologies and capabilities that are going to protect that? And that’s where offerings like Identity Threat Protection with Okta AI or the Spera product are really going to help. And so in terms of how we’re going to price and package and monetize these things, think of them both as additional capabilities with an additional licensing fee. 

Salesforce (NYSE: CRM)

Salesforce’s management believes that the company is the world’s No.1 AI CRM

Salesforce is the world’s #1 AI CRM, #1 in sales, #1 in service, #1 in marketing, #1 data cloud, incredible.

In Salesforce’s management’s conversations with CEOs, they are hearing three things that the CEOs want – productivity, higher value customer relationships, and higher margins – and these things can happen through AI; Salesforce’s management thinks that company-leaders know they need to make major investments in AI right now

As I talk to CEOs around the world, they tell me, they want 3 things. You may have heard me say this already, but I’ll say it again. One, they want more productivity, and they’re going to get that productivity through the fundamental augmentation of their employees through artificial intelligence. It’s happening. It’s empirical. Number two is they want higher value customer relationships, which is also going to happen through this AI. And they want higher margins, which we are seeing empirically as well through the — when they use this artificial intelligence in these next-generation products. As we look at productivity, as we look at higher value customer relationships, as we look at higher margins, how do our customers get these things? How are they achieving these goals? It is AI. It is why every CEO and company knows they need to make major investments in AI right now.

Salesforce’s management thinks that the current AI moment will give companies an unprecedented level of intelligence and Salesforce’s Einstein 1 platform can help companies achieve this

And I believe this is the single most important moment in the history of the technology industry. It’s giving companies an unprecedented level of intelligence that will allow them to connect with their customers in a whole new way.  And with our Einstein 1 Platform, we’re helping out our customers transform for the AI future.

Salesforce’s management thinks that popular AI models are not trusted solutions for enterprises because they are trained on amalgamated public data and could hallucinate, providing customers with services that do not exist; this was the exact problem faced by an airline recently, and the airline was a Salesforce customer who did not want to work with Salesforce’s AI technologies

The truth is that these AI models are all trained on amalgamated public data. You all understand that. You’ve all seen the New York Times lawsuit of OpenAI or others who are really going to take, hey, this is all — this amalgamated stolen public data, much of it used without permission, unlicensed, but amalgamated into these single consolidated data stores…

These AI models, well, they could be considered very confident liars, producing misinformation, hallucinations. Hallucinations are not a feature, okay?…

…And there’s a danger though for companies, for enterprises, for our customers that these are not trusted solutions. And let me point out why that is, especially for companies who are in regulated markets, why this is a big, big deal. These models don’t know anything about the company’s customer relationships and, in some cases, are just making it up. Enterprises need to have the same capabilities that are captivating consumers, those amazing things, but they need to have it with trust and they need to have it with security, and it’s not easy. Look, we all read the story. Now it just happened last week. An airline chatbot was prompted by a passenger to book a flight with a 90-day refund window. It turns out the chatbot, running on one of these big models (we don’t have to use any brand names here, we all know who it was), hallucinated the option. It did not exist… 

…The airline said, “Oh, listen, that was just the chatbot. It gets that way some time. We’re so sorry — you know what, that’s just a separate technical entity, a separate legal entity and the airline, “We can’t — we’re not going to hold liability for that.” Well, guess what? That defense did not work in a court of law. The court of law said that, that AI chatbot that made up that incredible new policy for that company, well, that company was going to be held responsible, liable for that policy, that they were going to be held liable for the work of that chatbot. Just as they would for a human employee, they were being held liable for a digital employee…

…And that story I told you on the script. When I saw that last week, I’m like, I’m putting this in the script, that this company, which is a great company and a customer of ours, but did not use our technology, went out there and used some kind of rogue AI that they picked off the Internet. Some engineer just cobbled it together, hooked it up, and then it started just spewing these hallucinations and falsehoods around their loyalty program, and the courts are holding them liable. Good. Let every CEO wake up and realize, we are on the verge of one of the greatest transformations in the history of technology, but trust must be our highest value.

Salesforce’s management believes that there are three essential components for enterprises to deliver trusted AI experiences, (1) a compelling user interface, (2) a high-quality AI model, and (3) data and metadata; management thinks that Salesforce excels in all three components; management has found that Salesforce customers who are experts on AI have realised that it is the integration of AI models with data and metadata that is the important thing in powering AI experiences, and this is why customers are turning to Salesforce

The reality for every enterprise is that to deliver trusted AI experiences, you need these 3 essential components now.

You need that compelling user interface. There’s no question, a natural and effortless experience. And at Salesforce, we have some of the most intuitive user interfaces that deliver insights and intelligence across sales and service and marketing and commerce and industries. Many of you are on Slack right now. Many of you are on Tableau. Many of you are on MuleSoft, one of our other products.

Okay. Now what else do you need? Number two, you need a world-class AI model. And now we know there’s many, many models available. Just go to Hugging Face, which is a company that we’re an investor in, or look at all the other models. And by the way, not only the thousands of models right now, but there are tens of thousands, hundreds of thousands of models coming. And all the models that are available today will be obsolete 12 months from now. So we have to have an open, extensible and trusted framework inside Salesforce to be receptacles for these models. That’s why Einstein 1 is so important. Then you have to be able to use these AI models, the ones that Salesforce is developing or these public models on Hugging Face or other things, or even bring your own model. Customers are even making their own models, fantastic. Of course, we have great partnerships with OpenAI, with Anthropic, with Cohere and with many other AI models. This is the second key component. One is the UI, the second is the model, all right?…

…Now we’re in the enterprise. In the enterprise, you need deep integration of data and metadata for the AI to understand and deliver the critical insights and intelligence that customers need across their business, across sales, service, marketing, commerce, whatever it is. That deep integration of the data and metadata, that’s not so easy. That’s not just some amalgamated stolen public data set. In the enterprise, that deep integration of data and metadata, oh, that’s what Salesforce does. We are a deep integration of data and metadata. That is why it’s very, very exciting…

…And they try to stitch together a variety of AI tools and copilots and this and that and whatever. I’ve had so many funny conversations with so many customers who come to me saying they’re experts in AI. And then I just say to them, but how are you going to deliver this experience? And then finally, they realize, “Oh, I need the deep integration with the data and the metadata.” The reason why the metadata is so important is because it describes the data. That’s why so many companies are turning to Salesforce for their AI transformation. Only Salesforce offers these critical layers of AI for our customers: the UI, the model and the deep integration of the data and the metadata make the AI smart and intelligent and insightful, and without the hallucinations and without all the other problems. For more than 2 decades, we’ve been trusted with our customers’ data and metadata. And we have a lot of it. 

Salesforce’s management believes that most AI models available today will be obsolete in 12 months’ time, and that a flood of new AI models is coming soon – because of this, it’s important that Salesforce has an open, extensible framework to work with all kinds of models, and this is where Einstein 1 has an important role to play

And by the way, not only the thousands of models right now, but there are tens of thousands, hundreds of thousands of models coming. And all the models that are available today will be obsolete 12 months from now. So we have to have an open, extensible and trusted framework inside Salesforce to be receptacles for these models. That’s why Einstein 1 is so important.

Salesforce’s management believes that data is even more important for AI than chips, and this is why management is so excited about Salesforce: Because the company has one of the largest data repositories in the world for its customers

I love NVIDIA, by the way, and what Jensen has done is amazing, and they are delivering, very much in the era of the gold rush, the Levi’s jeans to the gold miners. But we all know where the gold is: the data. The gold is the data. And that’s why we’re so excited about Salesforce, because we are one of the very largest repositories of enterprise data and metadata in the world for our customers. And customers are going to start to realize this right now…

…For more than 2 decades, we’ve been trusted with our customers’ data and metadata. And we have a lot of it.

There is a lot of trapped data in Salesforce’s customers which is hindering their AI work; Salesforce’s Data Cloud helps to integrate all the disparate data sources, and it is why the service is Salesforce’s fastest-growing product ever; Data Cloud is now integrated across the entire Salesforce platform, and management is totally focused on Data Cloud in FY2025; using Data Cloud adds huge value for customers who are using other Salesforce services; Data Cloud and Einstein 1 are built on the same metadata framework – which allows customer apps to securely access and understand the data that is on Salesforce’s platform – and this prevents hallucinations and it is something only Salesforce can do

Many of our customers also have islands and thousands of systems of trapped data…

… Trapped data is all over the enterprise. Now where trapped data could be is you might be using a great company like Snowflake, and I love Snowflake, or Databricks or Microsoft, or you might be using an Amazon system or even something like Google, what do you say, BigQuery, all these various databases…

…if you’re using Salesforce, Sales Cloud, Service Cloud, Tableau, Slack, we need to be able to, through our zero copy, automatically integrate into our data cloud, all of those systems and then seamlessly provide that data back into these amazing tools. And that is what we are doing because so many of our customers have islands of trapped data in all of these systems, but the AI is not going to work because it needs to have the seamless amalgamated data experience of data and metadata, and that’s why our Data Cloud is like a rocket ship.

The entire AI revolution is built on this foundation of data, and it’s why we’re so excited about this incredible data cloud. It’s now deeply integrated into all of our apps into our entire platform. Its self-service for all of our customers to turn on. It is our fastest-growing product ever. It’s our total focus for fiscal year ’25.

With Salesforce Data Cloud, Salesforce can unlock this trapped data and bring together all of their business and customer data into one place for AI, all while keeping their data safe and secure. It’s all running inside our Einstein Trust Layer, and we’ve deployed it to all of our customers. We unleash now the copilot as well to all of our customers, deeply built on our data and metadata. And while other copilots just sit and spin because they can’t figure out what the data means (if you haven’t seen the demonstrations, you can see these copilots spin), when customers use Salesforce it all of a sudden becomes intelligent, and that is the core of the Einstein 1 platform. And all of our apps, all of our AI capabilities, all of the customer data in 1 deeply integrated, trusted metadata platform, and that’s why we’re seeing incredible demand for Data Cloud. Data Cloud brings it all together…

…We’ve never seen traction like this of a new product because you can just easily turn on the Data Cloud and it adds huge value to Sales Cloud. It adds huge value to Service Cloud, the Marketing Cloud and the CDP…

… Because Data Cloud and all of Einstein 1 is built on our metadata framework, as I just described, every customer app can securely access and understand the data and use any model, use any UI, workflow, integrate with the platform. That means less complexity, more flexibility, faster innovation, but also we want to say goodbye to these hallucinations. We want to say goodbye to all of these crazy experiences we’re having with these bots that don’t know what they’re doing because they have no data or metadata, okay? Or the data and metadata that they have is productivity data, the highest-level data, that’s not deeply integrated data. So only Salesforce can do this.

Payroll company ADP has been a long-time customer of Salesforce but wanted to evaluate other AI solutions; ADP realised that the data and metadata component was lacking in other AI solutions and it is something only Salesforce can provide

We all know the HR and payroll leader, ADP, and their incredible new CEO, [indiscernible], amazing. ADP has been a great Sales Cloud customer for 2 decades. They’ve used Einstein for years. They are one of the first customers we ever had…

…And the company wanted to transform customer service with AI to give their agents real-time insights, next best actions, auto-generating case summaries. But what I have to say to you, it was a little bit embarrassing: Salesforce was not #1 on their list. And I said to them, “How can that be? We’re the #1 Service Cloud. We’re #1 in this. We’re #1 in that.” It didn’t work. “No, we’re going to go evaluate this. We’re going to look at all the different solutions. We’re going to look at all the new AI models. We think we’re just going to hook this model up to this, and we’re going to do that.” And it sounds like a big Rube Goldberg invention, what was going to happen there. And so we had to go in, and we just wanted to partner with them and say, “All right, show us what you want to do. We’re going to work with you, we’re going to be trusted partners. Let’s go.” 

But like a lot of our customers moving into AI, ADP realized it didn’t have a comprehensive, deeply integrated platform of data and metadata that could bring together all of this into a single source of truth. And then you get the incredible customer service. Then you get the results that you’re looking for. And it’s deeply integrated with their sales systems, with marketing and custom applications. And ADP discovered that only Salesforce can do this. We were able to show ADP how we could unlock trapped data with Data Cloud zero copy, and drive intelligence, productivity and efficiency for their sales team with Einstein to levels unimagined just a year ago.

Salesforce has a new copilot, Einstein Copilot, which management believes is the first conversational AI assistant that is truly trusted; Einstein Copilot can read across all the data and metadata in Salesforce’s platform to surface valuable sales actions to take, which is something human users cannot do; management believes that other copilots cannot do what Einstein Copilot currently can without deep data integration; management thinks that Einstein Copilot is a cut above other copilots

We’re now incredibly excited to work with all of our customers to take their AI to the next level with Einstein Copilot, which is going live tomorrow. Einstein Copilot, which if you haven’t seen it, and if you haven’t, please come to TrailheadDx next week. This is the first conversational AI assistant for the enterprise that’s truly trusted. It’s amazing. It can answer questions. It can summarize. It can create new content, dynamically automate tasks on behalf of the user, from a single consistent user experience embedded directly within our platform. 

But let me tell you the 1 thing it can do that’s more important than all of that. It is able to read across all the data and metadata in our platform to get that insight instantly. And you’re going to see that. So the sales rep might ask the Einstein Copilot what lead I should focus on, or what is the most important thing I need to do with this opportunity. And it may say, you need to resolve this customer’s case because this escalation has been around for a week, or you better go and answer that lead that came in on the Marketing Cloud if you want to move this opportunity forward, because it’s reading across the entire data set. That is something that individual users cannot do that the copilot can do. With access to customer data and the metadata in Salesforce, including all this real-time data and website engagement, and the ability to read through the data set, that’s why Einstein Copilot has all the context to understand the question and surface the lead that has the highest value and likelihood to convert. And it can also instantly generate the action plan with the best steps to close the deal, such as suggesting optimal meeting times based on the lead contacts’ known preferences, even drafting e-mails. If you haven’t seen the video that I put on my Twitter feed last night, there’s a 5-minute video that goes through all of these incredible things that it’s able to do. There’s never been an enterprise AI capability quite like it. It’s amazing…

… I assure you, without the deep integration of the data and the metadata across the entire platform, with our copilot’s deep integration of that data, they cannot do it. I assure you they cannot, because they don’t have the data and the metadata, which is so critical to making an AI assistant so successful…

And I encourage you to try the demos yourself to put our copilot up against any other copilot. Because I’ll tell you that I’ve seen enterprise copilots from these other companies in action, and they just spin and spin and spin…

…I’ve used those copilots from the competitors, have not seen them work yet….

…Einstein is the only copilot with the ability to truly understand what’s going on with your customer relationships. It’s one conversational AI assistant, deeply connected to trusted customer data and metadata.

Einstein 1 is driving sales price uplift in existing Salesforce customers, while also attracting new customers to Salesforce; Salesforce closed 1,300 Einstein deals in FY2024; Einstein 1 has strong early signs after being launched for just 4-plus months

In fact, we continue to see significant average sales price uplift from existing customers who upgrade to Einstein 1 Edition. It’s also attracting new customers to Salesforce: 15% of the companies that purchased our Einstein 1 Edition in FY ’24 were net new logos…

… In FY ’24, we closed 1,300 Einstein deals, as more customers are leveraging our generative and predictive AI capabilities…

…. I think the way to think about the price uplift moving to Einstein 1 Edition, which used to be Unlimited Edition plus, is really about the value that we’re providing to our customers, because at the end of the day, our ability to get increased price is about the value that we’re going to provide. And so as customers start to ramp up their abilities on AI, ramp up their learnings and understand what it means for them economically, our ability to get price will be dictated by that. Early signs of that are pretty strong. We feel good about the progress we’ve seen. It’s only been in market for 4-plus months now in FY ’24, but we’re encouraged by what we’re seeing.

Slack now comes with AI-search features; Salesforce’s management thinks Slack can become a conversational interface for any application

We just launched Slack AI with features like AI search, channel recaps and thread summaries to meet the enormous demand for embedded AI in the flow of work from customers like Australia Post and OpenAI. It’s amazing to see what Slack has accomplished in a decade. And frankly, it’s just the beginning. We have a great vision for the future of Slack as a conversational interface for any application. 

Bajaj Finance in India is using Einstein for AI experiences, and in 2023 Q4, Bajaj became Salesforce’s second largest Data Cloud customer globally

India continues to be a bright spot for us, growing new business at 35% year-over-year, and we continue to invest in the region to meet the needs of customers, including Bajaj Finance. I had the great opportunity to meet with their CEO, Rajeev Jain, in January, and a top priority for him was using Einstein to deliver predictive and generative AI across their entire lending business, which they run on Salesforce. In Q4, Bajaj became the second largest Data Cloud customer globally, building their AI foundation on the Einstein 1 platform.

Salesforce’s management would be very surprised if other companies can match Salesforce’s level when it comes to AI

Because if you see anyone else being able to deliver on the promise of enterprise AI at the level of quality and scale and capability of Salesforce, I’ll be very surprised. 

Salesforce is deploying its own AI technologies internally and management is seeing the benefits

We are a big believer in Salesforce on Salesforce. We are deploying our own AI technology internally. Our sales teams are using it. Absolutely, we are seeing benefits right now. But the biggest benefit we’ve seen actually has been in our support operation, with case summaries and our ability to tap into a knowledge base faster to get knowledge surfaced within the flow of work. And so it absolutely is part of our margin expansion strategy going forward, which is how do we leverage our own AI to drive more efficiencies in our business to augment the work that’s being done in sales and in service and in marketing and even into our commerce efforts as well…

…We have to be customer #1 and use it, and I’m excited that we are.

Tencent (NASDAQ: TCEHY)

Tencent’s management thinks that its foundational AI model, Tencent Hunyuan, is now among the best large language models in China and worldwide; Hunyuan excels in multiturn conversations, logical inference and numerical reasoning; Hunyuan has 1 trillion parameters; Hunyuan is increasingly used by Tencent for co-pilot services in the company’s SaaS products; management’s focus for Hunyuan is on its text-related capabilities, especially text-to-video

Our Tencent Hunyuan foundation model is now among the top tier of large language model in China with a notable strength in advanced logical reasoning…

… After deploying leading-edge technologies such as the mixture of experts (MoE) architecture, our foundation model, Tencent Hunyuan, is now achieving top-tier Chinese language performance among large language models in China and worldwide. The enhanced Hunyuan excels particularly in multiturn conversations, logical inference and numerical reasoning, areas which have been challenging for large language models. We have scaled the model up to the 1 trillion parameter mark, leveraging the MoE architecture to enhance performance and reduce inference costs, and we are rapidly improving the model’s text-to-picture and text-to-video capabilities. We’re increasingly integrating Hunyuan to provide co-pilot services for enterprise SaaS products, including Tencent Meeting and Tencent Docs…

…Among our enterprise Software-as-a-Service products, we deployed AI for real-time content comprehension in Tencent Meeting, deployed AI for prompt-based document generation in Tencent Docs, and rolled out a paid customer acquisition tool for eCom…

… At this point in time, we are actually very focused on the text technology because this is actually the fundamentals of the model. And from text, we have built out text to picture from text, we build out text to video capabilities. And the next important evolution is actually what we have seen with [indiscernible], right? [indiscernible] has done an incredible job with text to a [ long ] video, and we — this is something which we would be developing in [ the next turn ]. When we continue to improve the text fundamental capability of Hunyuan, at the same time, we’ll be developing the text to video capability because we actually think that this is actually very relevant to our core business, which is a content-driven business in the area of short video, long video and games. And that’s the area in which we’ll be developing and moving our Hunyuan into. 

Tencent’s management is developing new generative AI tools for internal content production; management thinks that the main benefit of using AI for internal content production is not cost reduction, but more rapid monetisation and thus, higher revenue generation

 And we are also developing new gen AI tools for effective content production internally…

…We are increasingly going to be deploying AI, including generative AI, in areas such as accelerating the creation of animated content, which is a big business for Tencent Video and a profitable business for Tencent Video, in terms of game content, as we discussed earlier, and potentially in terms of creating [ code ] in general. But the benefit will show up not in substantial cost reductions; it will show up in more rapid content creation, and therefore, more rapid monetization and revenue generation.

Tencent’s management is starting to see significant benefits to Tencent’s business results from deploying AI technology in the company’s businesses; the benefits are particularly clear in Tencent’s advertising business, especially in the short-term; Tencent has seen a 100% increase in click-through rates in the past 18 months in its advertising business through the use of AI

More generally, deploying AI technology in our existing businesses have begun to deliver significant revenue benefits. This is most obvious in our advertising business, where our AI-powered ad tech platform is contributing to more accurate ad targeting, higher ad click-through rates and thus, faster advertising revenue growth rates. We’re also seeing early stage business opportunities in providing AI services to Tencent Cloud customers…

…In terms of the AI short-term benefits, I think financial benefits should be much more indexed towards the advertising side because if you think about the size of our advertising business as call it RMB 100 billion [ a year ]. And if you can just have a 10% increase, right, that’s RMB 10 billion and mostly on profit, right? So that’s the scale of the benefits on the advertising side and especially as we see continued growth of our advertising business and when we add in the Video Accounts e-commerce ecosystem, that just has a very long track of growth potential and also the low ad load right now within Video Accounts.

But on the other hand, if you look at the cloud and business services customers, then you are really facing a relatively nascent market. You still have to sell to these customers. And we spend a lot of time working with all the customers in different industries, trying to figure out what’s the best way of leveraging AI for their business. And then you have to go through a long sales cycle. And at the same time, it’s competitive, because your competitors will actually come in and say, “Oh, they can also provide a similar service.” And even though we believe we have a superior technology and product, it’s actually [ very careful ], and your competitor may actually come in and say they’re going to cut prices, even though it’s an inferior product.

So all these things, the low margins, the high competition and the long sales cycles of the 2B business, would actually come into play in that side of the business. So when you compare the two sides of the equation, you can actually clearly see that ramping up advertising is actually going to be much more profitable in the short term. Of course, we’ll continue to do both, right?…

… Martin gave the example that if we can improve click-through rates by 10%, then that’s CNY 10 billion in incremental revenue, probably CNY 8 billion in incremental gross operating profit. In reality, you should view 10% as being in the nature of a floor, not a ceiling. Facebook has seen substantially bigger improvements in click-through rates. For some of our most important inventories, we’ve actually seen our click-through rates increase by 100% in the past 18 months. So when we’re thinking about where the financial benefits of AI will show up, it’s advertising click-through rates, and therefore advertising revenue, first and foremost, and that’s a very high flow-through business for us.

Tencent’s management believes that AI technology can be applied in its games business in terms of creating innovative gameplay as well as generating content in existing games, but these will take some time to manifest

In terms of the application of AI to games, then like many things, the boundary between [indiscernible] reality is a function of how far forward [indiscernible] willing to look, and [ we’re willing to look very far ] forward. And all of the areas you mentioned, such as AI-powered [ MPCs ], AI-accelerated graphical content generation and graphical asset generation, are areas that [ over the years ] to come, not over the months to come, will benefit meaningfully from the deployment of AI. And I think it’s also fair to say that the game industry has always been a mixture of, on the one hand, innovation around gameplay techniques, and on the other hand, deployment of enhanced content and renewed content into existing gameplay. And it’s reasonable to believe that AI will be most beneficial for the second of those activities, but one will continue to require very talented individuals and teams to focus on the first of those opportunities, which is the creation of innovative gameplay.

Veeva Systems (NYSE: VEEV)

Veeva’s management has seen very specialised AI models being used for some time – prior to the introduction of large language models to the consumer public – to help with drug discovery, especially in areas such as understanding protein folding

[Question] What are you seeing out of life sciences companies in terms of how AI is changing things, whether that’s accelerating drug development or more targeted marketing? Maybe you could walk us through what those conversations look like, and what sort of role you think you can play in those changes?

[Answer] I would say the most direct impact, and it’s been happening for a while, before large language models as well, is with AI in drug discovery. Very, very targeted AI models that can do things like protein folding and analyzing retina images, things like that. So this is very powerful, but very therapeutic-area specific, very close to the science in R&D, and there’s not just one AI model; there are multiple specialized AI models.

Veeva’s management has seen some experimentation going on with the use of large language models in improving general productivity in the life sciences industry

Then in terms of other areas, really, there’s a lot of experimentation with large language models. And what people look at it for are: one, can I just have general productivity for my people? Can they write an e-mail faster? Can they check their e-mail faster? Can they research some information faster? So that’s one thing that’s going on. Also, specific use cases like authoring: can I author a protocol faster? Can I author a regulatory document faster? Now faster is one thing, but it also has to be very accurate. So I would say there’s experimentation on that. There’s not yet broad production use on that. And certainly, some of these critical things have to have a lot of quality control on them. So those are probably the two biggest use cases, really three: research, general productivity and authoring.

Veeva’s management has developed a product to make Veeva’s data platform extract data in a much faster way so that it works well with AI applications, but otherwise, the company has not invested in LLMs (large language models) because they are not as relevant in the company’s field

And then as far as our role, we’ve been doing some really heavy work over the last 2 years on something in our Vault platform called the Direct Data API. And that’s a pretty revolutionary way of making the data come out of Vault in a transactionally consistent manner much, much faster, roughly 100x faster than it happens now. That’s going to be critical for all kinds of AI applications on top, which we may develop, which our customers may develop, and we’re also utilizing that for some really fast system-to-system transfer between our different Vault family of applications. So that’s been the biggest thing that we’ve done. We haven’t really invested heavily in large language models. So far, we just don’t see quite the application in our application areas, not to say that that wouldn’t change in the future.

Veeva’s management thinks that the important thing for AI is data – AI models will be a commodity – and Veeva has the advantage in this

I would say we’re in a pretty good position, because the durable thing about AI is the data sources. The AI models will come on top, and those will be largely a tech commodity, but the control of, and the access to, the data sources, that’s pretty important, and that’s kind of where Veeva plays.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, Meta Platforms, MongoDB, Okta, Salesforce, Starbucks, Tencent, and Veeva Systems. Holdings are subject to change at any time.