
Data Platform as a Map to Guide Agents | Road to Service as Software

Rahul Auradkar, Salesforce EVP & GM, Data Cloud

Key Takeaways:

  • Data Cloud as complementary data amplification layer
    • Enhances existing data infrastructure (Snowflake, Databricks) rather than replacing
    • Harmonizes hundreds of data sources through zero-copy integration
    • Unlocks trapped data for real-time CRM actions and AI-driven insights
    • Expands buyer personas from business leaders to include CIOs/CTOs
  • 4D mapping creates system of intelligence
    • Tracks people, places, things, and activities over time vs traditional 2.5D snapshots
    • Enables richer agent decision-making through comprehensive customer context
    • Combines deterministic flows with LLM-based responses using grounded intelligence
    • Graph-based connections provide deeper customer journey understanding
  • Production deployments demonstrate scale and value
    • Salesforce’s own deployment: 747 data streams, 272M profiles, 145M unified
    • Customer examples include real-time lending approvals and cross-functional CRM
    • Some customers operate 5-10x larger deployments than Salesforce internally  
    • Integration spans traditional CRM plus IoT signals and adjacent applications
  • Business model evolution supports digital labor transition
    • Hybrid pricing: consumption credits alongside traditional per-user-per-month
    • Flex agreements allow budget movement between human and digital labor
    • Digital Wallet provides real-time transparency and control tools
    • Informatica acquisition enhances transparency, understanding, and governance capabilities

George Gilbert

Today on the show we have a returning alum, Rahul Auradkar, EVP and GM of Salesforce Data Cloud. Rahul’s role at Salesforce is critical because Data Cloud is the tip of the spear of the reinvention of Salesforce as a platform. It’s not a traditional platform as we’ve come to know application platforms, because in the age of AI, data programs the BI, the ML models, and the agents. So while data might be the new infrastructure, the emerging layer of value is how you model the data that drives the analytics. In Salesforce’s case, it’s the metadata that describes everything about the customer and their engagement journey with a vendor. That forms a richer four-dimensional map than the 2.5D snapshots captured by traditional metrics and dimensions.

We call this a system of intelligence. It’s the foundation that allows the system of agents to perceive the state of the customer and the business, perform analysis, make a plan, and operationalize decisions in the customer 360 apps or external systems. And most important, this integration allows agents to learn from the outcome of their actions, something much harder for vendors without a combination of the data platform, agents, and operational applications. So the data platform, in a way, makes Salesforce a software-only hyperscaler. It’s the foundation of the next generation of Salesforce and the next generation of data-driven, agent-driven applications, or what we call service as software, a term we learned from Rahul’s colleague at Salesforce, David Schmaier. So with that, welcome to the show, Rahul.

Rahul Auradkar

Introduction and platform gratitude

• System of intelligence foundation enabling agent perception

• Integration allows agents to learn from action outcomes

• Software-only hyperscaler through data platform foundation

Thank you. Thank you, George. Delighted to be here as a returning alum, if you may, of your program. So it’s just a delight to be here. Thank you for the opportunity.

George Gilbert

Data Cloud value proposition vs Databricks/Snowflake

So let’s start with the basics. I like to start in terms of explaining things that people already understand. So for customers who are already familiar with or who may have deployed one or even both of say Databricks and Snowflake, what is the Salesforce Data Cloud value proposition?

Rahul Auradkar

Complementary amplification layer

• Amplifies existing data infrastructure investments (lakes, warehouses)

• Activates real-time insights through AI and 360 applications

• Unlocks trapped data for better CRM actions vs replacing existing systems

So first off, the way to look at Data Cloud is, like you said, in the age of AI and also connected applications, you should look at Data Cloud as unlocking the full value of data across the entire enterprise. The idea is that Data Cloud powers your customer 360 applications, Agentforce, and business applications in the enterprise. What it essentially does is amplify the existing data infrastructure investments that customers have in their data estate, things like existing lake houses and warehouses, the examples that you used, Snowflake and Databricks, and activate real-time insights and activations driven through intelligent actions, through AI, and with our 360 applications. Essentially, Data Cloud is designed to complement existing systems. They exist within the data estate that enterprises have. If anything, we say that we have fantastic partnerships with the names you called out, and more, and we look at it as being the best upgrade, if you may, for those lakes and warehouses, because we are unlocking the data that sits in there to drive better actions through business applications, and more importantly through our CRM applications.

George Gilbert

Layer above existing data platforms – zero-copy integration

And so just to put a finer point on it, when you say you’re adding value, it’s that you’ve built a model of a customer and their engagement journey so the data could live elsewhere. And through zero-copy technology, you’re incorporating it, synthesizing it into a model that then can perform richer analytics and operationalize those insights into the customer 360 or external applications. So it’s a layer above the existing data platform, so that’s a way to think about it?

Rahul Auradkar

Harmonization and unification layer

• Harmonizes hundreds of data sources (web clicks, mobile, IoT signals)

• Unifies data based on accounts, leads, customers, vehicles

• Drives activation through intelligent actions and automation

Yes, that’s correct. It’s a layer above, and the more important thing is that siloed information or siloed data exists in these different silos, the data infrastructure silos, and they’re there for good reason. We are not really replacing those silos, but we are unlocking them. The word that you used was models. What we are doing is harmonizing hundreds of data sources, for example, your web clicks, your mobile clicks, or data associated with a customer or with a business, or it could be IoT signals coming in. We are harmonizing all of that. We are unifying it based on whether it’s an account or a lead or a customer or a vehicle, et cetera. And from there, you derive insights through our open and extensible platform, either by bringing your own model or by building your model inside Data Cloud.

It could be BI or AI insights. The most important part of it, George, is that final mile, like you referred to, which is activation. We are driving that activation through a combination of intelligent actions through AI, or in many cases it’s just automation that is driven through applications. Those models that you refer to are being used to drive the insights and the activations.

George Gilbert

Different buyer personas vs traditional data platforms

And how is the customer persona who’s buying this different from a traditional data platform? You talked about how technically it’s complementary, so is the buyer different? Are they ultimately buying attributes that are different, and so is the persona different?

Rahul Auradkar

Dual buyer categories emerging

• Business leaders (CMOs, sales leaders, service owners) for line-of-business use

• CIOs/CTOs increasingly involved for enterprise governance and integration

• Cross-functional binding (sales + services) requires C-level coordination

Yeah, broadly speaking, buyer personas come in two broad categories. One is a business leader within an enterprise, whether that’s a customer service owner, a CMO, a sales leader, or the owner of any business app. Combined with the fact that Salesforce Data Cloud has now become mainstream in the enterprise, we are starting to see CIOs and CTOs come in, making sure that it fits into their larger estate, making sure that it is governed appropriately, making sure that the security is there too. And because it binds multiple businesses together, think about it this way, George: one of our reference customers who is binding sales and services is FedEx. They’re finding ways in which they can capture the signals from marketing, bind them into sales and vice versa, back from sales into marketing, and they’re getting better top line from the fact that they’re able to combine marketing and sales. At that point, you would find a need for somebody from the CIO’s office, along with the CTO, to come in and drive all the coexistence, if you may, with their other data estate. That’s when we start seeing the CXOs coming in as buyer personas as well.

George Gilbert

CXO persona emergence timeline

And that persona I assume is relatively new compared to the line of business persona that you typically had?

Rahul Auradkar

Recent expansion beyond traditional CRM buyers

• Traditional: Sales/marketing leaders buying CRM applications

• New: IoT signals from automotive, broader business applications

• Expanded persona beyond traditional business leader scope

That is correct. That is correct. We have typically seen in the CRM space that you had either the sales leader or the marketing buyer buying CRM-type applications, but now the definition of CRM is so much broader. For example, IoT signals coming in from automotive companies or automobiles are now used in different types of business applications that are connected to customers. Those IoT signals are making their way into the Salesforce platform through Data Cloud. Now you’re looking at a persona that goes well beyond the traditional business leader persona that you called out.

George Gilbert

4D map vs 2.5D snapshots – system of intelligence explanation

Okay. So let’s start digging into this Data Cloud and why it’s a new, emerging category. We use this term system of intelligence because we refer to it as a 4D map, a four-dimensional map, where it’s not just entities but people, places, things, and activities, and the processes that bind them all, whereas the traditional analytics SQL databases are more like 2.5D snapshots. So to help people who aren’t familiar with the difference, maybe explain the difference between Tableau Semantics’ metrics and dimensions and the richer application semantics that are in Salesforce Data Cloud, and how that integrates the applications.

Rahul Auradkar

Rich semantics with actionable workflows

• Tableau Semantics built on Data Cloud (revenue, profit, ACV, ARR definitions)

• Semantic modeling enables automated actions (discounts, alerts, agent triggers)

• Workflows and automations driven by AI agents from semantic understanding

Yeah, when you look at Tableau Semantics, George, we have built Tableau Next, which is the next version of Tableau, on top of Data Cloud and the Salesforce platform, and that gives you rich BI-type semantics. Now, when I refer to semantics, a simple example could be revenue, profit, loss, or things like ACV, AOV, ARR. Different departments within a company might have different definitions for them, but once you have a semantic meaning, we are able to model it on top of Data Cloud using Tableau Semantics that are built into Data Cloud. Now when you start looking at the semantics as a layer above that, imagine you’re using a combination of all of these to create lifetime value, and then you’re modeling something that requires you to have an action associated with lifetime value.

As an example, if the lifetime value is beyond a certain threshold and a customer is about to cancel some sort of an order, then you want to make sure there’s an auto trigger associated with an action. An action could be providing a discount. An action could be alerting an account executive or a sales leader to go take action, or, in a lot of the current use cases we’re seeing, it could drive actions through a workflow from an AI agent, also known as an agent from Agentforce. So that’s the difference: the metrics are defined with semantic meaning in Tableau Semantics, and then you’re defining workflows, automations, and actions that are performed either through applications or through AI agents from Data Cloud modeling.
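
To make that threshold-and-trigger pattern concrete, here is a minimal sketch in Python. The class, field names, threshold, and action labels are hypothetical illustrations of the idea, not Data Cloud APIs or Salesforce’s actual logic.

```python
# Minimal sketch (hypothetical names, not Data Cloud APIs): a semantically
# defined metric plus a threshold rule that routes to an action.
from dataclasses import dataclass

@dataclass
class CustomerSnapshot:
    customer_id: str
    lifetime_value: float      # semantic metric: total expected revenue
    pending_cancellation: bool # signal harmonized from order/service data

LTV_THRESHOLD = 50_000.0

def choose_action(c: CustomerSnapshot) -> str:
    """Return the action a flow or agent should take for this customer."""
    if c.pending_cancellation and c.lifetime_value >= LTV_THRESHOLD:
        return "offer_discount_and_alert_account_executive"
    if c.pending_cancellation:
        return "route_to_retention_agent"
    return "no_action"

print(choose_action(CustomerSnapshot("001", 82_000.0, True)))
# -> offer_discount_and_alert_account_executive
```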

George Gilbert

Application logic harmonization into Data Cloud

Okay, so would it be fair to say that as you enrich the action space that the analytics, and the agents themselves, can call on, you’re gradually abstracting and harmonizing the application logic that used to live in the operational application silos? In some way you’re starting to treat the application logic and the artifacts that define how operations work as an asset that needs to be handled and harmonized just the way we handled analytic data. Is it fair that that’s now pouring itself slowly into Data Cloud?

Rahul Auradkar

Production instance demonstration

• 747 data streams, 272M profiles, 145M unified profiles (live Salesforce data)

• Web signals automatically create leads with enriched context

• Zero-copy integration with Snowflake (776M rows virtualized)

Yes, it is. Yes, it is. Actually, an example of that could be… I’m going to share my screen. I can show you an example of exactly what you referred to. If you can see my screen, this, George, is a production instance of Data Cloud. This is Salesforce’s production instance; it’s customer zero, as we call it. It is Salesforce’s deployment of our deeply unified platform in Data Cloud, and you’re looking at the production instance. If you take a look at this, you’ve got 747 data streams. This is the number of streams of data coming in. And, like you said, it’s coming from infrastructure, as in data infrastructure, from a lot of different places. It’s also coming from applications, different applications that have different meaning and context for what we want to do. But how do we bring more synthesized, harmonized meaning to all of the data that is coming in?

I’ll get to that in a second. But one thing worth hovering over here, since you brought up the 4D, the people, things, activities, et cetera, is the people part of it. There are 272 million profiles, multiple people touching us, and there are 145 million unified profiles. It’s a live instance, so you can see these are all changing on a dynamic basis. These are all the profiles in the EMEA region. This is all production. I’m showing you real production data for Salesforce, how Salesforce’s business is being run. Now, the reason why I’m bringing that up is that you brought up the notion of a signal. So here is a signal: I’m on the salesforce.com homepage, and here is a web agent built using Agentforce. I’m going to ask a question: show me a demo of Agentforce.

So what’s happening here is that the web agent is going and finding the best demo for Agentforce. And when I click on a demo here, it goes into the demo. Now, I’m logged in using my Trailblazer ID, and I’m logged in using my personal ID here, Rahul@Auradkar.com. And once I start watching the demo, it becomes a web signal. This is a signal that is coming in from the web application, in this case. It could be a mobile application. It could be tens of applications. It could also be warehouses and lake houses. Now, here is an application: this is the production Sales Cloud application. I can go in and take a look at what happened with my activity here.

Now, this is Rahul@Auradkar.com. A lead got created here. The lead got assigned to a fictitious account, and the system also recognized the fact that I’m an EVP and GM at Salesforce, even though I was at Rahul@Auradkar.com. How does it know that? That’s the harmonized, unified data that’s modeled on top of all the sources of data that came in. These were all the sources. Let me log into this; I’m showing you all the sources of data here. Speaking of infrastructure, take a look at Snowflake. You brought up Snowflake as an example. We at Salesforce use Snowflake, and this is infrastructure that we are bringing in. We are bringing in 776 million rows of data from Snowflake, but that’s coming in as direct access. This is the zero-copy that you referred to. We have deep partnerships. We created the zero-copy network in the market, and we are doing a highly standards-based implementation of it, whether at the query level or at the file level using Iceberg.

So we are bringing it in through zero-copy, but bringing in is not quite the right word. Even though it shows up as 776 million rows of data, in reality it’s virtual, in that we are virtualizing the data, we’re federating the data in, if you may. In other cases we are ingesting, and we also have an accelerated mode of zero-copy as well. Look at all the sources of data, the 776 million or whatever that number was, these streams of data coming in from Salesforce CRM data; all of that is coming in here zero-ETL. Now, you asked a question about harmonization, so let me show you an example of metadata harmonization. Here’s an example. On the left side here you have various ways in which systems are bringing in data about an individual.
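
The zero-copy idea described here can be illustrated in open-standards terms: read an Iceberg table in place rather than ETL-copying it. The sketch below uses the pyiceberg library with hypothetical catalog and table names and assumes a configured catalog; it is an illustration of the concept, not Salesforce’s implementation.

```python
# Illustrative only: the general zero-copy idea in open-standards terms,
# reading an Iceberg table in place instead of ETL-copying it. The catalog
# name and table identifier are hypothetical, and a configured pyiceberg
# catalog is assumed.
from pyiceberg.catalog import load_catalog

catalog = load_catalog("lakehouse")                  # catalog config assumed
orders = catalog.load_table("crm.snowflake_orders")  # referenced, not copied

# Read only the rows and columns needed, at query time.
recent = (
    orders.scan(
        row_filter="order_total > 100",
        selected_fields=("customer_id", "order_total"),
    )
    .to_pandas()
)
print(len(recent), "rows read in place")
```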

This is one such way, coming from one source. On the right side here is the individual, which is a canonical object. This is a cleanup, if you may, using our metadata strength. On the right side, this canonical individual object is now mapped from all the data streams on the left side. What are we doing here? We are logically separating the source systems from how you would eventually use the metadata, so the individual is a canonical model object. We cleaned it up. Now, downstream, everything that you do with the individual is separated from the upstream data sources. I can show you another example here: web engagement coming from a web application. On the left side you can see all of the web engagement data that’s coming from multiple different sources. The right side is web engagement as an object. Now, these objects are customizable. You can build your own objects.

You can package and ship your own objects. All the extensibility that customers have come to expect from Salesforce is available in Data Cloud as well. Now, once you have that harmonized data, you do get this entire map. This is the enterprise map that I have called out in the past, so these are the modeled objects, the cleaned-up metadata. I’m in a different data space here; I’m going to go to the default data space. Look at the modeled objects here. Now, what I have on the screen is the individual. Connected to the individual is the entire harmonized view of the individual across contact points. Phone number is harmonized. Privacy consent logs are brought in. They’re all harmonized. This is the whole harmonized view of the metadata.
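
A toy version of the source-to-canonical mapping being demonstrated might look like the following; the field names and source systems are invented for illustration and are not Data Cloud’s actual schema.

```python
# Toy illustration: differently shaped source records harmonized into one
# canonical Individual object. Field names are hypothetical.
CANONICAL_FIELDS = ("individual_id", "first_name", "last_name", "email", "phone")

# Per-source mappings from source column names to canonical field names.
SOURCE_MAPPINGS = {
    "web_signup":  {"uid": "individual_id", "fname": "first_name",
                    "lname": "last_name", "email_addr": "email"},
    "call_center": {"contact_id": "individual_id", "name_first": "first_name",
                    "name_last": "last_name", "phone_number": "phone"},
}

def harmonize(source: str, record: dict) -> dict:
    """Map a raw source record onto the canonical Individual shape."""
    mapping = SOURCE_MAPPINGS[source]
    canonical = {field: None for field in CANONICAL_FIELDS}
    for src_col, value in record.items():
        if src_col in mapping:
            canonical[mapping[src_col]] = value
    return canonical

print(harmonize("web_signup",
                {"uid": "42", "fname": "Rahul", "email_addr": "rahul@example.com"}))
```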

George Gilbert

4D map representation confirmation

And just to be clear, because we’re looking at a graph of how everything’s connected and the connections are like first-class citizens, we’re tracking what’s happening to everything and everyone over time, and that’s what we call the 4D map.

Rahul Auradkar

Graph-based connections as first-class citizens

• Individual connected across contact points, phone numbers, privacy logs

• Tracking everything and everyone over time

• Visual representation of harmonized metadata relationships

Correct, correct. You brought up the 4D map. Correct, this is a representation of the 4D map.

George Gilbert

Enhanced analytics and agent capabilities from 4D context

So maybe explain, once you have this richness, as opposed to a bunch of tables with foreign keys and primary keys, once you have this much richer picture of people, places, things, and the activities or processes by which they interact, how does that change the type of questions that you can answer with traditional analytics and then even with agents? In other words, this context is so much richer. Think of it like a robot that can now perceive in 4D instead of 2D; what does that allow that robot to do?

Rahul Auradkar

Richer qualification and automation

• Profile unification (272M source → 145M unified via fuzzy/deterministic logic)

• Predictive AI models determine lead qualification

• Marketing nurture agents work automatically below thresholds

First off, you’re presenting to that robot in 4D, let’s say in the people part of it. Now you have harmonized data, and you can go clean it up further. Here’s another cleanup that you can do: unification. All I showed you until now was a harmonized map, which is a schema normalization, a metadata cleanup, if you may. Now we are cleaning up the identities. There could be multiple. I showed you I was logged in as Rahul@Auradkar.com. I could be at Gmail. I could be at Salesforce. I could have multiple different phone numbers. I could be multiple entities. So now I’ve got 145 million unified profiles, whereas I started with 272 million source profiles. You can do it with fuzzy logic or you can do it with deterministic logic. Now, your question: what can you do with it? Here’s an example. This is a flow. This is deterministic, low-code automation.

What you can do with this is qualification: am I qualified to be a lead or not? In the back end, what’s happening, George, and interestingly I didn’t show this to you, is that before that lead got created, it went through a predictive AI model. The predictive AI model was scoring whether Rahul should be a lead or not, because it has the richness of Rahul’s signals. If I’m not a lead, then it hands off to a marketing nurture agent, which is running in the background. So if I fall below a threshold, it moves me to the marketing nurture agent, and that nurture agent works in the background, sending emails and making the signal richer. Once it becomes rich enough, it goes into qualified mode and hands off to a BDR or an SDR. That’s an example of low-code automation.
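
The two unification styles and the score threshold mentioned here can be sketched roughly as follows; the matching rules, threshold, and profile fields are hypothetical simplifications, not Data Cloud’s identity resolution.

```python
# Hedged sketch: deterministic and fuzzy profile matching, plus a score
# threshold that routes a lead to a nurture agent versus a human BDR/SDR.
from difflib import SequenceMatcher

def deterministic_match(a: dict, b: dict) -> bool:
    """Exact match on a shared key such as email or phone."""
    return bool(
        (a.get("email") and a.get("email") == b.get("email"))
        or (a.get("phone") and a.get("phone") == b.get("phone"))
    )

def fuzzy_match(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Name similarity as a stand-in for fuzzier unification rules."""
    ratio = SequenceMatcher(None, a.get("name", ""), b.get("name", "")).ratio()
    return ratio >= threshold

def route_lead(score: float, qualified: float = 0.7) -> str:
    """Below the threshold keep nurturing; above it, hand off to a human."""
    return "hand_off_to_bdr" if score >= qualified else "marketing_nurture_agent"

p1 = {"name": "Rahul Auradkar", "email": "rahul@example.com"}
p2 = {"name": "R. Auradkar", "phone": "555-0100"}
print(deterministic_match(p1, p2), fuzzy_match(p1, p2), route_lead(0.42))
```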

George Gilbert

Flows vs deterministic automation clarification

Okay, so just to be clear, these flows are an example of what might have lived in the silos before. You’re starting to enrich the definition of not just the data that’s in Data Cloud, but the customers and the processes that move the customers through an engagement journey, from lead to prospect to conversion to aftermarket. In other words, all the ways that you interact with customers are now becoming unified. Some are deterministic when they need to be repeatable, explainable, auditable; that’s a flow. Maybe now explain how an agent might complement that.

Rahul Auradkar

Design-time procedures with runtime intelligence

• Predictive models trigger lead creation and alerts

• Deterministic flows handle structured processes

• LLM-based responses use system of intelligence context

An agent could complement this. As an example, let me just show you this. Assume I come back to Salesforce.com and ask a question: show me the pricing for Agentforce. What the system knows about me right now is that I’m already a lead. I might have already bought a product, so it has signals about me, the richness of my signals, and they’ve gone through predictive AI models, so the system knows quite a bit about me. Now I’m asking about pricing for Agentforce. It now knows there’s a high-propensity buyer looking at pricing for Agentforce. It sends a signal or an alert to a system. In this case, let’s say the system is Slack, because our buyers live in Slack.

Now Slack can initiate an agentic response to me. An agentic response could be based on the harmonized, unified signal you have and the model that was built around that signal. It can initiate an email reach-out, or it could initiate something else, or it could even invoke a pricing model. It could query the model for Agentforce pricing and send it out to me. Now, that’s not theory. That’s real. That’s happening today. We are doing it today.

George Gilbert

Agent composition of flows vs independent action

But maybe also distinguish how much the agent is using the flows that are traditional, deterministic but low-code, and how much the agent is itself deciding how to compose the low-code flows to perform a unique action in this context. In other words, some of it is the traditional hard-coded rules, and some of it is the agent perceiving the state of what’s right for you as a customer and then invoking either procedural flows or new actions that it might generate on its own.

Rahul Auradkar

Design-time vs runtime distinction

• Design-time: Procedural flows with predictive AI triggers

• Runtime: LLM-based generative responses using grounded intelligence

• System of intelligence feeds context to large language models

It’s a great question. Think about it as design time and runtime. In this case, at design time we are giving it a procedural flow: you’ve got a predictive AI model and it’s triggering something. The something could be that the trigger creates a lead, and if that person comes back to the website again, recognize that. That’s an easy thing; you don’t need agentic actions for it. As soon as that happens, you trigger something else, and that something else is an alert that goes into Slack, for example. That’s where the determinism moves into LLM-type generative AI. The generation that happens in the response being provided to me is now based on the more non-deterministic, LLM-based side. We are feeding it the intelligence, if you may. This is the system of intelligence feeding that LLM, a large language model that generates a response based on the grounded information that the system of intelligence has fed it through the model that was created in Data Cloud.
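
The design-time versus runtime split might look roughly like this in code: a deterministic rule decides whether to act at all, and only then is a grounded prompt assembled for a language model. The profile fields and the llm_complete placeholder are assumptions for illustration, not Agentforce or Data Cloud APIs.

```python
# Minimal sketch of the design-time/runtime split: deterministic trigger
# first, then a grounded prompt for the generative step. llm_complete() is
# a placeholder for whichever model endpoint is actually in use.
def deterministic_trigger(profile: dict) -> bool:
    """Design time: a hard-coded rule, no LLM needed."""
    return profile["lead_score"] >= 0.7 and "pricing" in profile["recent_topics"]

def build_grounded_prompt(profile: dict, question: str) -> str:
    """Runtime: feed harmonized profile context to the generative step."""
    context = (
        f"Known buyer: {profile['name']}, title: {profile['title']}. "
        f"Recent activity: {', '.join(profile['recent_topics'])}. "
        f"Products owned: {', '.join(profile['products'])}."
    )
    return f"{context}\nUsing only this context, answer: {question}"

profile = {"name": "Rahul", "title": "EVP & GM", "lead_score": 0.82,
           "recent_topics": ["pricing", "Agentforce demo"], "products": ["Sales Cloud"]}

if deterministic_trigger(profile):
    prompt = build_grounded_prompt(profile, "What Agentforce pricing should we discuss?")
    # response = llm_complete(prompt)  # hypothetical model call
    print(prompt)
```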

George Gilbert

Data graph context for agent decision-making

Okay, so I want to make sure we put a really fine point on this, the context that helps that agent decide how to respond, you’re feeding it. I assume ultimately that context, just to take it down to the ground level, is the data graph, if I understand that right. That graph structures signals about the customer in a way that allows the agent to infer, to make a richer inference about the customer than if you just returned a SQL result set. Maybe explain, give us an example of how that data graph allows the agent to be smarter about what it’s going to do.

Rahul Auradkar

Comprehensive customer intelligence

• Web clicks + application data + purchase history + sentiment analysis

• Recent activity context (last 5 minutes of pricing/Agentforce clicks)

• Rich unified profile enables contextual responses and actions

So an example would be: you have all the web click information that I showed you. Then you have all of the application data that is coming in from various different applications, what the person’s bought, what the person’s purchased, what the sentiment of the customer is, et cetera. And then you have information associated with, in my case, me: my latest address, the latest phone number, which might be coming from reference data sitting in a warehouse. Now imagine all of that data coming in, the graph that you referred to, the enterprise graph that we created by harmonizing all of that and then doing the unification. That web click information could be the last five clicks, clicks in the last five minutes. It’s about pricing. It’s about Agentforce. It’s about specifics within the last five seconds or five minutes, if you may. That is the context that you’re referring to, and it is being provided with the alert.

The alert is not just an alert that Rahul looked up a website. The alert is that Rahul looked up a website, plus all the information we have about Rahul based on all these data sources that we have harmonized, unified, and created an intelligent model around. Now use that context to create a response. We could use that context to take an action.

George Gilbert

Data graph vs conventional SQL queries comparison

Okay, so just to relate it back to the conventional way of doing things, conventionally you might get the context in terms of the state of the customer and their journey by querying a database and you’re getting back a flat table. Here the agent can look at how you’ve engaged with the customer, how the customer’s engaged with the company over time and what their interests are and so it can make a richer decision. I’m just trying to distill it down.

Rahul Auradkar

That is correct. That’s correct.

Rahul Auradkar

Unlocking trapped data with fluidity

• Conventional: Flat tables from transactional systems, warehouses, lakes

• Data Cloud: Fluid, harmonized data available in business context

• Real-time to near real-time availability for web, commerce, service use cases

Okay. – Now with the SQL query, you refer to a transaction system, or you might have stored information in a warehouse, or you could have something in a lake house, or you could have it in some other repository. All of that we refer to as data that is relevant for that context in which it was stored. But it is trapped there, and we are unlocking the trapped data, and we are bringing intelligence as we are combining, harmonizing all of the data and making that available as context for the specific action that I’m referring to.

George Gilbert

Okay. – And that’s the place where we make a distinction: we are making the data fluid. We refer to this as data fluidity and metadata fluidity, making it fluid and available in the context of the business, the application, and the use case, whether in near real time or real time. In many cases, real time could be the next web page that you go to on the website as you’re shopping for commerce or being marketed to; there you want real time or near real time. In other cases, you want near real time, as in you’re providing customer service and it’s okay to do it within seconds. But all of that is fluidly available, a combination of data and metadata. That’s data and metadata fluidity, versus gravity, where it’s sitting inside the systems but not being used.

George Gilbert

Persona changes with richer data platform

Okay, so let’s back up and talk about the different personas that work with data platforms and how that’s changing with the Salesforce data platform. Because you have this much richer model about the state of the business, this 4D map about the customer and how they’ve been engaging, how does that change the personas that interact with a data platform? How does that simplify their jobs or make them more effective?

Rahul Auradkar

Seamless access without technical expertise

• Business users access through familiar applications (Service Cloud tabs)

• Digital labor (AI agents) automatically consume rich data

• Marketing nurture agents work seamlessly in background

So personas like, for example, a marketer or a sales executive or a customer service individual, they don’t log into warehouses and lake houses. When they log into Service Cloud, in the case of Salesforce, or into an application that is being fed by Data Cloud, they’re essentially logging into Data Cloud, because it’s a tab. These are personas that traditionally wouldn’t really log into warehouses and don’t know how to use that data, but now the data is available at their fingertips when they need it for what they’re doing. That’s the human persona standpoint. Now think about digital labor. In the digital labor’s case, the data is now being seamlessly fed to the AI agents. I showed you that from the web agent, we created a lead. What I didn’t show you is that if I didn’t fall above the threshold of the score where a lead got created against my name, then it would be seamlessly fed to the marketing nurture agent, which is the digital labor. That marketing nurture agent is doing its own thing: it’s nurturing me until the signal becomes rich enough to create a lead, so that it’s a high-quality lead in the system.

George Gilbert

Lead nurturing agent capabilities vs raw data

And okay, just to distill again. My takeaway is that this lead nurturing agent has much richer scaffolding: it knows where that lead came from, all the information associated with that potential lead, and also the actions it might perform to move that lead through the funnel on the operational application side. In other words, the range of potential actions that it can take. What I’m trying to do is distinguish that from trying to put the piece parts together yourself, where you’re just looking at raw modeled bronze, silver, gold data artifacts and then talking through reverse ETL. Instead of tables, this tells you the state of the customer and what you can do with that state in terms of operationalizing it into an application, just to distinguish the abstraction levels.

Rahul Auradkar

Semantic modeling with operational integration

• Bronze/silver/gold models enhanced with semantic meaning and metadata

• Application context layered on top of models

• Continuous, fluid data flow vs on-demand queries

Yeah, that’s the right way to look at it. With the medallion architecture of bronze, silver, gold, you can create models, but what we are creating are, A, models with semantics and meaning, using our metadata. We are cleaning it up. Second, to your point, we are layering on top of it semantic meaning from the application that it is going to feed, and that’s brought to bear across tens and hundreds and thousands of different data sources, so you can unlock all of that data sitting in the context of the model that you’re feeding. And it’s happening continuously, in a fluid manner. You don’t need to, like you said, go make a query when you need it; instead, it’s happening continuously based on what the needs of the application are.

George Gilbert

Salesforce’s own Data Cloud deployment learnings

Okay, so let’s transition to what’s probably the biggest production deployment of Data Cloud I would imagine, which is Data Cloud at Salesforce. And maybe give us some flavor as to how it was deployed, the learnings you had that can help other customers understand how they can get to scale and then how that enabled richer analytics and agentic applications.

Rahul Auradkar

Customer zero production implementation

• 272M source profiles, 145M unified profiles in production

• Not the largest deployment (customers 5-10x bigger)

• Web agent integration creating leads from website interactions

Sure. If we switch back to the screen, what I was showing you was the customer zero implementation. That was the flow. This was the number of profiles I referred to. This is the home screen where I’m showing our Data Cloud deployment with source profiles and unified profiles: 272 million source profiles, 145 million unified profiles. This is the production instance that I pulled up before. One example I showed you was going in here, using the web agent, and creating leads. You can now go into a variety of different places, like customer service, for example, where you’re feeding a profile for customer service. An example of that would be if I go in here and show you the enterprise map; I can even show you a unified profile. This is a unified contact. If I go in here, I can go take a look at the data space.

Give me a second. I’m going into the default data space here. And what I’m doing here, George, is picking out the unified individual against the email address, which was my personal address. So it should pick out this personal address, and now it can pick out the unified profile. By the way, the Data Cloud deployment within Salesforce is not the biggest one. We have massive customers who are maybe 10X bigger than what we do.

George Gilbert

Wow, okay. – We have one customer that is doing probably 5 to 10X more than what we have done with Data Cloud within Salesforce. And it’s not just the profile size, it’s also what they do with actions, with activations, with agentic actions, and also with insights. It’s massively bigger than what we have at the-

George Gilbert

Customer analytics and agentic applications examples

Okay, so since it sounds like naming that customer might not be kosher yet, maybe you can tell us about the different analytics that they’re able to perform now that they’ve harmonized all their data, and some of the agentic applications they’ve been able to build on that.

Rahul Auradkar

Real-time lending and cross-functional CRM

• B2B/B2B2C personal loans with 40 migrated applications

• Real-time credit history, propensity scoring for in-store approvals

• Cross-marketing, sales, service with segment optimization agents

Yeah, so an example of that would be what they’re doing here. Just give me one second; let me stop sharing so I can answer your question. So what are they doing? One of our largest customers is doing B2B and B2B2C personal loans. They have now migrated about 40 of their applications onto Data Cloud, which allows their customer service agents to serve small businesses who are lending money to customers coming into the store. Now, in the moment, in real time, they have a full profile understanding of that customer, including the person’s credit history, their propensity to pay back, their propensity to default, et cetera.

They have it in the moment, and they approve the loan there, in the moment, for the purchase that’s happening. And they have many such businesses, 40-some businesses, that they have migrated onto Data Cloud. So that’s their ability to serve their customers. They also use it to reach out to their customers, not just during sales but also during service, and they reach out to their customers based on campaigns that they’re creating for marketing. That’s an example of cross-marketing, sales, and service at a pretty large customer. Another one would be somebody doing very simple agentic work. I would create a segment, for example, to go do the marketing. Now I want to clean up that segment. I want to know something about the segment. I could go ask a question with an agent: tell me more about what’s in the segment.

Tell me more about the attributes and characteristics of the segment. They continuously use that to optimize their segments, if you may. So those are some examples of how customers are using it. There is another customer that is public right now, which we have talked about: Indeed. Indeed, as you know, is the recruiting company. Indeed uses a combination of Agentforce and Data Cloud to drive better responses and better outcomes for the small businesses who are their customers.

George Gilbert

Salesforce app integration vs heterogeneous estate

So this gets at a question about adoption and future directions as customers become more mature. There’s a tension I’m hoping you can elaborate on: to what extent does Data Cloud, because it integrates and harmonizes the Salesforce applications, make it more attractive for customers to adopt more of the Salesforce operational applications, because it makes each of them more valuable? Or to what extent does it allow them to unify a more heterogeneous customer-facing application estate, in that you can now bring those applications together in a way that wasn’t really possible before?

Rahul Auradkar

First-party priority with extensible architecture

• Seamless customer 360 across sales, service, marketing, commerce, analytics

• Zero-copy network for federation in/out to warehouses and applications

• IoT signals expanding CRM definition and adjacent application ecosystem

So as you would imagine, seamless integration with our first-party applications is probably priority number one, where we have provided seamless, what we refer to as customer 360, value across sales, service, marketing, commerce, and analytics. Those are first-party applications. Those are what our customers use today, very successfully. And we also have programmatic ways to share out what I just showed you; you used the word reverse ETL. That reverse ETL could be done, or you could export, if you may, the unified profile that I showed you into S3 or into a warehouse. I referred to the zero-copy network; when I say zero-copy network, we do zero-copy in and out, as in we federate data in and we share it out as well. You could share out a fully ETL’d version of that model that I referred to, which you have been calling out, to another application, another warehouse, or another lake house.

It could be for analytics, or for actions that are being driven through other applications. There are many customers now who are using non-traditional CRM-type data sets. I used the example of IoT earlier, where customers are bringing IoT signals into the Salesforce platform through Data Cloud and associating them with how dealers and businesses interact with customers using applications of their choice. So we are starting to see adjacencies and a broadening, if you may, of what CRM means to customers. And we are seeing adjacent applications starting to make their way into that ecosystem, benefiting from what we just talked about: the combination of data, AI, and apps, the trinity, if you may.

George Gilbert

Business model transition from seats to consumption/digital labor

So there’s a business model and a pricing model transition, it sounds like. SaaS was always a seat-based model, but now more work can be done by digital labor, with data as the substrate. How do you think about this transition over time? We’re always going to have humans in the loop, but the question is to what extent. There was the Klarna story a year and a half ago, where they maybe got ahead of themselves a little and downsized their customer service too fast, but it was directionally interesting, because at some point you’re going to have more digital labor substituting for human labor. It’ll be complementing, but to some extent substituting. How do you balance, and how does a customer budget, between buying seats and paying for Data Cloud and agents?

Rahul Auradkar

Hybrid pricing with flex agreements

• Consumption pricing based on credits alongside per-user-per-month

• Flex agreements allow budget movement between models

• Modular approach for customer flexibility in spending

That’s a great question. As you might have seen in our latest announcements, we are reacting to what customers want and where customers are going. I’ll talk a little bit about the history of Data Cloud, then I’ll come back and answer your question. When we started off, we shipped as a CDP, because that’s what the market knew, but I keep reminding our customers that traditional CDPs are more like MDPs, marketing data platforms. We are a CDP in that we put the C, the customer, in CDP. And we started shipping additional capabilities beyond what regular MDPs do. We are truly open and extensible, right down at the compute level all the way up to the sharing layer. As we started adding bring your own model, bring your own LLM, zero-copy in and out, insights, what we are doing with actions, real-time data triggers, et cetera, those were all additional capabilities, and our customers said, hey, some of that doesn’t relate to profiles.

Some of them do not relate to how we were buying things in the past. Instead, give us modularity where we’ll buy what we want and we’ll pay for what we are using. That modularity really led to consumption pricing based on credits. And that consumption pricing now coexists with the per-user-per-month model that you referred to. So it’s a hybrid model that we are seeing in the marketplace. The most important thing that we announced recently, George, is Flex: the ability for our customers to flex between the two models, in that they can spread their spend and their budget across the two and move it based on where they’re using it, whether per-user-per-month or digital labor using consumption. They’re able to flex across those two.
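
As a rough illustration of how a flex-style budget might be split between seats and consumption credits, consider the toy arithmetic below; the unit prices and split are invented for the example and bear no relation to Salesforce’s actual rate card or flex terms.

```python
# Toy arithmetic only (hypothetical prices, not Salesforce's rate card):
# splitting and re-splitting a monthly budget between per-user seats and
# consumption credits.
SEAT_PRICE_PER_MONTH = 150.0   # hypothetical per-user-per-month price
CREDIT_PRICE = 0.01            # hypothetical price per consumption credit

def split_budget(total_monthly: float, seat_share: float, seats_needed: int):
    """Return (seats funded, credits funded) for a given seat/credit split."""
    seat_budget = total_monthly * seat_share
    credit_budget = total_monthly - seat_budget
    seats = min(seats_needed, int(seat_budget // SEAT_PRICE_PER_MONTH))
    credits = int(credit_budget // CREDIT_PRICE)
    return seats, credits

# Start seat-heavy, then flex 30% of spend toward digital labor.
print(split_budget(100_000.0, seat_share=0.8, seats_needed=500))
print(split_budget(100_000.0, seat_share=0.5, seats_needed=500))
```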

George Gilbert

Flex currency usage across applications

Maybe give us an example of a customer who’s far along in the maturity of their Data Cloud deployment, where you can see where they’re going with agents. What might the mix in that flex look like, and how might it trend over time from seats to consumption and activity-based pricing?

Rahul Auradkar

Seamless credit utilization

• Credits bought for marketing automation used across customer service, sales

• Flexible currency movement across automation actions regardless of location

• Early customer adoption of human/digital labor combinations

So let me give you an example of flexibility from within Data Cloud first, and then we can talk about flexibility across products. Within Data Cloud, we have a lot of customers who bought the product first to drive digital automation through marketing automation. Then, as our vision always was, it gets used in other places: a need for unified profiles in customer service, a need for sales automation, and things like that. Now the same credits, the same currency that they bought for Data Cloud, are being used in those other places as well, whether it’s queries into the unified profile or automation actions that we are driving. It doesn’t matter where the automation actions are. They can seamlessly use the currency or credits they bought in one application in the next one, and the next, and basically use them flexibly everywhere.

Now we are seeing customers do that routinely. We have also, as we just announced with our flex agreements with Agentforce, where you can move from per-user-per-month into other places as well, just started to see customers use that with a combination of, like you said, human labor and digital labor. There are some early examples we are seeing there, George, and we are pretty delighted with the progress we’ve been seeing with our customers on that new flex agreement that we announced.

George Gilbert

Future pricing mix expectations (50/50 consumption vs seats)

Would you see leading-edge and mature customers, in a couple of years’ time, who have taken advantage of the unifying platform and Data Cloud and are now adding digital labor, moving to a mix of 50/50 consumption and activity-based pricing versus seat-based pricing? Is that something customers should expect?

Rahul Auradkar

Transparency and control tools for predictability

• Estimation tools and calculators for growth planning

• Digital Wallet provides real-time consumption transparency

• Control knobs for usage management and budget predictability

Yeah. One thing we have done is provide a whole bunch of tools. Look at these dimensions, George. One is that customers have to estimate based on what they know now and what the growth of the business could be, so we are providing tools and calculators for them to estimate. Then, once you start using it, we provide transparency: where are you using it, how are you using it, how are you consuming the credits, how are you consuming your licenses, et cetera. We have a product known as the Digital Wallet that sits inside, for example, Data Cloud or Agentforce. You can literally see within a tab what’s happening in real time. Estimation allows them to understand what they’re buying into. Transparency allows them to track the credits, what they estimated versus what they’re actually consuming; we are trying to bridge the gap between the two. And the third one we are providing is control, as in knobs you can control: how much you’re using, where you’re using it, what the usage is.

And on the roadmap for all of that, we are learning, reacting, and providing customers what they want on control and transparency. What these three things allow customers to do is drive predictability, which is the question you’re asking: what would be the percentage of credit consumption, what would be the percentage used in other places, and within credits, how much should they set aside as a budget for the next three, six, or twelve months. That allows them to drive more predictability. So we are providing estimation, transparency, and control, and customers can use that to manage their business.

George Gilbert

Third-party agent integration pricing model

Okay. Let me try to get in two quick questions. One, what happens when customers want to bring in third-party agents that might be complementary to, or might be competitive with, the first-party agents but need your data? I imagine that comes in through AgentExchange, but what does that pricing and exchange of value look like?

Rahul Auradkar

Unified consumption metering

• Third-party agents using harmonized data run same consumption meters

• No distinction between Agentforce and other agents for billing

So any of these agents that use the harmonized, unified data, whether Agentforce agents or other agents, will run the same meters, if you may. They will run the same consumption that is driven inside Data Cloud, so we don’t see a distinction between them.

George Gilbert

Informatica acquisition impact on Data Cloud expansion

Okay, one last question. If you look out a year or two, again, the magic of the system is that you add value to the data and bring it together into a rich model of the customer and their engagement journey. When you have mature technology like Informatica’s, to name one example, which has built up deep knowledge about data in other systems, how might that make it much faster to expand the footprint of what Data Cloud can ingest, model, harmonize, unify, and activate?

Rahul Auradkar

Three-dimensional enhancement value

• Transparency: Connectivity, catalog, lineage, integration capabilities

• Understanding: Rich metadata combined with unified data model for AI context

• Governance: Auto-tagging, ABAC policies, MDM, data quality controls

Yeah, it’s a great question. On Informatica, we are pretty delighted with the announcement we made of an agreement to acquire; it’s still going through regulatory approval. That notwithstanding, the way we look at it, George, is along three different dimensions. One is transparency. Our customers frequently ask us, hey, what’s the connectivity, what’s the catalog, what is the lineage, what is the integration, if you may, into the system, all of that. Informatica’s capabilities there enhance what our customers have been asking for. The second one is data understanding. We have our own metadata; that’s what has made us successful for the last twenty-some years, becoming the number one AI CRM. Informatica also has rich metadata that goes far beyond the regular CRM.

Informatica’s rich metadata, combined with Salesforce’s unified data model that I just showed you, means AI agents can interpret, connect, and act on enterprise data with much more meaningful context. So transparency is one, understanding is the second one, and the third one is governance. This is something our customers ask us about all the time. In Data Cloud, we just shipped advanced governance capabilities with ABAC, attribute-based access control. What you can do now is auto-tag data that’s coming in, structured and unstructured, and write policies on top of it. A policy could be: I’m looking at all the tags, I’ve classified the data, and this classified data, the PII or PCI information, cannot be accessed.

You can now define policies, and then you can define access: attribute-based access control, using attributes that have been defined underneath. All of that happens seamlessly. The reason we have to do auto-tagging and attribute-based access control, which augments the role-based access control we already have, is because we are talking about big data here, 10X, 100X, 1,000X bigger data. Now, on the data governance side, Informatica has MDM, data quality controls, and their own policy management. Combined with what I referred to on transparency, the catalog, we can ensure that all data coming in that’s feeding AI is standardized, accurate, consistent, and secure, and that we are adhering to consent policies. So that’s how we look at the complementary value we are creating for our customers across transparency, understanding, and governance.
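
The auto-tagging plus attribute-based access control pattern described here can be sketched as follows; the tag names, user attributes, and policy logic are illustrative simplifications, not Data Cloud’s or Informatica’s actual policy language.

```python
# Hedged sketch of auto-tagging incoming columns and applying an ABAC-style
# policy on top of the tags. All names and rules are hypothetical.
PII_MARKERS = ("ssn", "email", "phone", "credit_card")

def auto_tag(column_name: str) -> set:
    """Classify incoming columns; real systems also inspect values."""
    return {"PII"} if any(m in column_name.lower() for m in PII_MARKERS) else set()

def allowed(user_attrs: dict, column_tags: set) -> bool:
    """ABAC policy: PII-tagged data requires the pii_reader attribute."""
    if "PII" in column_tags and not user_attrs.get("pii_reader", False):
        return False
    return True

columns = {c: auto_tag(c) for c in ("customer_email", "order_total", "phone_number")}
analyst = {"role": "analyst", "pii_reader": False}
steward = {"role": "data_steward", "pii_reader": True}

for col, tags in columns.items():
    print(col, "analyst:", allowed(analyst, tags), "steward:", allowed(steward, tags))
```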

George Gilbert

Informatica technology + Salesforce penetration for expansion

Okay, and so just to sum up, it sounds like that’s a technology play… They have, I think, roughly 5,000 customers; Salesforce has 200,000. I don’t know, what’s the Data Cloud penetration that you’ve publicly talked about in terms of customer <inaudible>?

Rahul Auradkar

Value creation over footprint expansion

• Few thousand paying Data Cloud customers, many more freemium

• Focus on increased customer value across transparency, understanding, governance

• Rapid growth acceleration as outcome of enhanced value proposition

Yeah, we have a few thousand paying customers, and we have much, much more in terms of the freemium customer count.

George Gilbert

Okay, so between their technology and your growing penetration within the existing Salesforce customer base, with the transparency, the governance, and the data understanding, you can now expand, if I’m understanding, the footprint of Data Cloud much more rapidly beyond just customer data. Certainly more customer data, but also beyond the customer data.

Rahul Auradkar

Yeah, the footprint of Data Cloud is awesome, but it’s almost incidental. The more important thing is that across transparency, understanding, and governance, we add more value for our customers, whether they’re already using a combination of Salesforce and Informatica or not. It is about the value creation for our existing customers and for the new customers that we might have down the road. The Data Cloud footprint increasing follows from that. We are already growing pretty rapidly; we have achieved incredible growth with Data Cloud, and we expect that to continue and only accelerate, but that’s an outcome. What we are talking about here is an increase in value for our customers across Salesforce and Informatica.

George Gilbert

All right. Rahul, I guess we should put a pin in this rather than calling it an end, because I hope to pick up this conversation with you again soon. But this was a really rich picture of all the progress you’ve made in the roughly 15 months since we last spoke. I very much appreciate your participating, and we look forward to following the story and hearing the updates from you. Thanks.

Rahul Auradkar

Thank you, George. Thank you for the time. I’ll also point you to another video that walks through the entire demo. I was showing you some screens of Salesforce on Salesforce, customer zero; there’s one we just published publicly as well. I’ll point you to that. You might want to take a look at it.

George Gilbert

And we’ll put that in the show notes. I think I’ve seen it. It’s like a four- or six-part series just published in the last couple weeks about-

Rahul Auradkar

That’s correct. We have a one-and-a-half minute one, and then we have a 30-minute one, which goes through the entire depth of the demo.

George Gilbert

Okay. – So if you don’t have it, I’ll send that to you as well.

George Gilbert

Okay. Rahul, thanks and talk to you in V3. This is episode two, episode three to come.

Rahul Auradkar

Awesome, thank you. Thanks, George. It’s always great to talk to you. Thanks.
