
Breaking Analysis: GPUs get all the headlines, but the future of AI is real-time data

The era of AI everything continues to excite. But unlike the Internet era, where any company announcing a dotcom anything immediately rose in value, the AI gods appear to be more selective. Nvidia beat its top-line whisper number by more than $300M and the company’s value is rapidly approaching one trillion dollars. Marvell narrowly beat expectations this week but cited future bandwidth demand driven by AI, and the stock was up more than 20% on Friday. Broadcom was up nearly 10% in sympathy, on the realization that connectivity beyond the CPU is what the company does really well. Meanwhile, other players like Snowflake, which also narrowly beat earnings Wednesday and touted AI as a future tailwind, got hammered as customers dialed down cloud consumption.

In this Breaking Analysis we look at the infrastructure of AI, examining the action at the silicon layer, specifically Nvidia’s momentum. Since much of AI is about data, we’ll also look at spending data on two top data platforms, Snowflake and Databricks, to see what the survey data says, and examine the future of real-time data and automation as a catalyst for massive productivity growth in the economy.

To do so we have a special Breaking Analysis panel with John Furrier and David Floyer. 

How Nvidia Plans to Own the Datacenter with AI

Two years ago we published this research report, laying out our thesis as to how Nvidia would disrupt the trillion-dollar x86 installed base.

Basically it was a roadmap of Nvidia’s plan to take a massive chunk out of Intel’s general purpose data center dominance. We had a positive outlook on Nvidia’s prospects, specifically due to its software expertise and end-to-end capabilities. We noted not just the GPUs, but the tens of thousands of other components, the networking, the intelligent NICs and the full stack that Nvidia was building.

Here’s an excerpt from that report:

Nvidia wants to completely transform enterprise computing by making datacenters run 10X faster at 1/10th the cost. Nvidia’s CEO, Jensen Huang, is crafting a strategy to re-architect today’s on-prem data centers, public clouds and edge computing installations with a vision that leverages the company’s strong position in AI architectures. The keys to this end-to-end strategy include a clarity of vision, massive chip design skills, new Arm-based architectures that integrate memory, processors, I/O and networking; and a compelling software consumption model. 

Nvidia’s Results Called the ‘Greatest Beat in the History of Earnings Reports’

Nvidia’s recent results are evidence that this vision is coming to fruition.

John Furrier called ChatGPT the “Web browser moment.” Jensen Huang calls it the iPhone moment. Either way, Nvidia blew away its numbers with a $670M revenue beat, said its second-half supply will be significantly better, and laid out a forceful and compelling narrative that budgets are shifting away from x86 to what the company calls accelerated computing.

Nvidia’s valuation is nearly 9X Intel’s, and ChatGPT has been a massive catalyst for Nvidia. Here’s a summary of the conversation that followed on our panel.

AI Infrastructure, Semiconductors, and Data: The Rise of Parallel Computing and Cloud-Optimized GPUs

In the conversation Floyer and Furrier talked about major shifts occurring in the tech industry landscape. AI infrastructure, semiconductors, and data are the crux of these transformations, driven significantly by the adoption of parallel computing and cloud-optimized GPUs. The following three key points emerged:

  • Parallel computing has become pivotal as demand for compute cycles has spiked dramatically. This shift has favored simpler, more efficient processing technology, propelling companies like Nvidia, with its GPU-led architecture, to the forefront.
  • Tech giants like Tesla and Apple have also aggressively invested in parallel computing, with a focus on neural networks. These companies are fundamentally re-architecting their hardware to accommodate the surge in compute demand.
  • Intel, once the dominant player in the CPU world, has failed to keep pace with this paradigm shift. Floyer stated that the company’s future as a leader is in jeopardy.

In addition, there’s more than parallel computing at play. Other facets of the semiconductor industry are also undergoing significant changes.

  • Furrier noted that Nvidia was well-placed from the inception of GPUs and capitalized on the initial crypto craze. With the crypto market cooling, the company has turned its attention to cloud-optimized GPUs and AI, which Furrier believes is the next-gen hyperscale technology.
  • There’s a looming competitive battle on the horizon. While Intel may retain its dominance in server technologies through its Xeon line and traditional OEMs, the emerging markets are heading towards chip-level connectivity and cloud-optimized silicon.
  • AI’s role is not just limited to chatbots like ChatGPT. The physical layer, often overshadowed, is believed to be the next major wave. It’s akin to the OSI model, where the physical layer is addressed first, followed by other layers.
  • Nvidia’s CEO believes the data center’s future lies in becoming an “AI factory,” marking a drastic shift in spending towards AI-powered or accelerated computing. While this statement is self-serving, it’s a powerful marketing metaphor that strategically positions Nvidia to take advantage of this shift.

Bottom Line: The combination of AI infrastructure, semiconductors, and data will drive the next wave of technological advancements. Companies that can successfully ride this wave will likely shape the future of the industry. The battle among industry players is set to intensify, making this a crucial space to track.

[Watch three top analysts discuss the future of AI and the shift to the AI-powered data center].

How an AI Alpha Engineer Summarizes the GPU Shortage

In a candid conversation, a member of theCUBE community with deep AI expertise shared the following:

The people doing AI love Jensen…because he’s a baller. But if he really wants to democratize AI, he needs to lower prices. We need more competition. We’ll see new GPUs from AMD. We use Intel GPUs even though they’re suboptimal because we need other sources.

Lots of Competition Coming After Nvidia’s Dominant Position

So with that as a backdrop, let’s look at some of the silicon competition to Nvidia and other firms possibly getting a boost from AI. Nvidia is disrupting Intel, that’s clear, as is Arm. AMD is competing head on with both companies and has done an amazing job of bringing itself back to prominence. All the cloud players are developing silicon, as is IBM. Broadcom is competing for share in merchant silicon and is focused on the surrounding components including intelligent NICs, along with Marvell in connectivity. And several other players are building semiconductor capabilities including Apple, Tesla and Meta. And finally, Chinese companies are designing and manufacturing silicon chips in an effort to achieve independence. So Nvidia is far from alone in this market, but it has a big lead.

Here’s a summary of the analyst conversation on Nvidia’s success in the AI space, the importance of neural networks, the role of hyperscalers, and geopolitical concerns.

First, the panel discussed Nvidia’s lead in the AI space, primarily due to their GPU technology and innovative CUDA software. They believe Nvidia will continue to innovate by adding more neural networks to their repertoire. Both Apple and Tesla were noted for their heavy investments in neural networks, with the former dominating consumer computing and the latter focusing on inference work for their autonomous vehicles. The conversation led to the broader picture of AI, which they see as a driving force towards automation.

Next, the hyperscalers were brought into the mix, with AWS, Google, Microsoft, and Alibaba all developing their own AI products and chips. China’s looming influence in this market was also noted. Amazon, with its deep experience in silicon and a long history with AI, was highlighted as a potential leader. AWS’s approach to generative AI and aggressive messaging were seen as pivotal to its positioning.

Third, the conversation turned to the example of AWS’s acquisition of Annapurna Labs. AWS wasn’t satisfied with Intel’s performance or price, so it began partnering with Annapurna and ultimately bought the company. AWS then used Annapurna to design Arm-based chips in-house. The panel speculated as to whether AWS could follow a similar path to compete with Nvidia, potentially by acquiring AI startups to innovate its offerings. But Andy Jassy’s famous quote that “there’s no compression algorithm for experience” favors Nvidia.

The following key points are noteworthy:

  • Nvidia is seen as a dominant force in the AI space, driven by their GPU technology and CUDA software.
  • Apple and Tesla’s heavy investment in neural networks is expected to continue influencing AI hardware design.
  • Hyperscalers such as AWS, Google, Microsoft, and Alibaba are major players, designing their own chips or AI products.
  • AWS’s acquisition of Annapurna Labs demonstrated the potential for hyperscalers to lower costs and improve performance by bringing design in-house.
  • AWS or other hyperscalers could acquire startups or innovate in-house to compete with Nvidia, but for the foreseeable future they will remain reliant on Nvidia.
  • Geopolitical concerns, especially related to China and Taiwan’s TSMC, were raised as potential risks to watch out for as firms like Nvidia and Apple are exposed.
  • The cost per compute cycle, including energy costs, must come down as AI grows. Automation is the key to justifying the expense of AI.

[Watch this 10 minute deep dive into the competitive landscape for silicon chip design, the role of hyperscalers, AI inferencing and the cost of AI]. 

Bottom Line: Nvidia has plenty of competition, but its lead is substantial, and in the world of semiconductors major shifts go in long cycles.

Snowflake Catches a Cold

Let’s shift gears, look at Snowflake’s quarter and talk about where it fits in AI. The reason we say Snowflake catches a cold is because the company narrowly beat but was very cautious about the outlook, citing more tepid consumption patterns relative to the past, and investors sold. Ironically, Snowflake’s CFO was suffering from a nagging cough that plagued him throughout the conference call. Despite the selloff, Snowflake’s momentum is still strong with very low churn. The fact, however, is that customers are optimizing costs by reducing retention policies – which lowers storage costs and makes queries run faster – so less storage and compute equals lower revenue.

Snowflake’s play is to be the iPhone of data apps. Or the app store if you will. They want to be the best platform to build data apps — better than the hyperscalers, better than Databricks…better than anyone. And they’ve made some acquisitions, like Applica and now Neeva, which support the envisioned outcomes of Snowpark, a developer experience announced last year that supports interfaces other than SQL (e.g. Python, Scala and others).
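To make that developer experience concrete, here’s a minimal sketch of what DataFrame-style Python looks like in Snowpark, where the operations are pushed down to Snowflake and executed as SQL. The connection parameters and the ORDERS table below are hypothetical placeholders, not an actual account or schema:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, avg

# Hypothetical connection details -- substitute real account credentials
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# DataFrame operations are translated to SQL and executed inside Snowflake
orders = session.table("ORDERS")  # hypothetical table
result = (
    orders
    .filter(col("ORDER_DATE") >= "2023-01-01")
    .group_by(col("REGION"))
    .agg(avg(col("ORDER_TOTAL")).alias("AVG_ORDER"))
)
result.show()
```

The point is that developers stay in Python (or Scala or Java) while Snowflake handles the data, which is what makes the platform attractive as a foundation for data apps.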

The following summarizes the discussion on the future of data infrastructure in relation to artificial intelligence (AI) and automation, using Snowflake and Databricks as case studies.

Floyer is of the opinion that future company architectures should aim to reduce their workforce through automation. To do so, transactional data and analytic data need to be unified, and they have to share the same databases to minimize time lags and drive real-time automation. He believes that in the long run, architectures like Snowflake may not support this model as they would require a more direct approach from data sources to applications.

Regarding Databricks, Furrier discussed how it has successfully capitalized on the big data wave. However, he also believes that the introduction of AI will change this landscape, bringing about a shift in the infrastructure platforms. He feels databases will become invisible, automated by AI, and data storage will be controlled by developers and applications, leading to a complete reversal of the current script.

Floyer responded by expressing the importance of databases for maintaining consistency in the future, even in a more distributed form. He doesn’t believe developers will completely take over the role of data management, as databases relieve developers of many tasks. He also believes that developers will benefit from a plethora of new tools, leading to simpler orchestration of automation.

Bottom Line: There will be a major shift in the data infrastructure landscape towards distributed, developer-controlled databases and increased automation. However, there is a divergence in opinion on how much control developers will have over data management and the extent to which databases will remain an essential tool. The underlying theme is that change is inevitable, and companies will need to adapt to stay relevant.

[Watch this clip of the three analysts discussing Snowflake, Databricks, developers and the future of data platforms].

Survey Data Confirms the Deceleration in Snowflake’s Momentum

The chart below is based on a survey of 1,700 IT decision makers (ITDMs), comprising 264 Snowflake accounts. It shows the granularity of Snowflake’s Net Score across those 264 accounts. Net Score is a measure of spending velocity based on ETR’s proprietary methodology. The lime green bars show the percentage of customers newly adding Snowflake. The forest green shows the percentage increasing spending by 6% or more. The gray signifies flat spend. The pinkish bars show spending down 6% or more, and the bright red is churn. Subtract the reds from the greens and that equals Net Score. The blue line shows Net Score and the yellow line shows the share of mentions within the data set.
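Because Net Score is simple arithmetic, a quick sketch makes the definition concrete. The percentages below are illustrative placeholders, not actual ETR survey results:

```python
# Hypothetical survey percentages for a single platform (not actual ETR data)
adoption = 12.0   # lime green: customers newly adding the platform
increase = 48.0   # forest green: spending up 6% or more
flat     = 32.0   # gray: flat spend
decrease = 6.0    # pinkish: spending down 6% or more
churn    = 2.0    # bright red: leaving the platform

# Net Score = greens minus reds
net_score = (adoption + increase) - (decrease + churn)
print(f"Net Score: {net_score:.0f}%")  # -> Net Score: 52%
```

Anything above the 40% mark is what we consider a highly elevated spending velocity.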

The notable points are:

  • Snowflake’s Net Score peaked in the Jan 2022 survey and has steadily declined since.
  • It continues to be highly elevated and amongst the highest in the data set.
  • The decline in Snowflake’s Net Score is a function of a major shift toward flat spending within the customer base. This was accentuated on the Snowflake earnings call with CEO Slootman’s comment that “the CFO is in the business,” meaning the finance function is imposing caps on spending growth.
  • The churn in Snowflake accounts is very small, supporting its 150%+ net revenue retention (NRR).

Snowflake & Databricks Compete to Define the “Modern Data Platform”

Let’s now share a different view and bring Databricks into the equation and see how they stack up with Snowflake. This chart compares the data from Snowflake (N=264), Databricks (N=225) and Streamlit (N=111). It plots Net Score or spending momentum on the Y axis and account overlap/presence based on the N’s on the X axis. The squiggly lines indicate the progression over time. 

Several points are notable in the data:

  • The presence of Databricks in the enterprise over the last two-plus years has significantly increased, and the account overlap between the two leaders sets up a looming battle.
  • Snowflake’s Net Score decline has brought it in line with that of Databricks.
  • Despite the macro pressures, all three platforms are above the 40% line, which is an indicator of a highly elevated spending velocity.

Let’s delve into the key points that arose during this conversation about these data points.

The discussion begins with an acknowledgment from Furrier of the validity of Snowflake’s churn data. Its churn rates are low despite the current economic headwinds causing a market slowdown. So they’re not losing customers, similar to the dynamic amongst the cloud players. Major shifts, such as the hype around AI right now, tend to cause a freeze in the buyer market. This leads to a “wait and see” approach, further slowing down spending in the sector. But customers are not defecting.

An important comparison arises between Snowflake and Databricks, two significant players in the market. Snowflake’s strong business model has set it apart and allowed it to lead the market in the early stages. However, Databricks has gained substantial traction through leveraging the open-source community and consistently enhancing its robust product offerings.

A salient point in the discussion revolves around data retention policies. Snowflake, for example, stores an extensive amount of data for its clients, but the question arises – is all this stored data being utilized effectively? This was a topic of conversation on the earnings call, where it was noted that customers are being forced to control costs, and shortening retention times is a logical way to do so.

Floyer stressed that the value of data diminishes as it ages, prompting a shift towards efficiently capturing and extracting the value of data close to its source before disposing of it. Barclays analyst Raimo Lenschow asked what we thought was a salient question on the earnings call: Is the trend of shorter retention times potentially enduring, and will this lead to a change in the future growth of data storage needs?

  • The expectation is that the increase in the efficiency of data capture and utilization will result in a larger proportion of data being ephemeral.
  • Yet, the amount of data generated will be so large that it will continue to grow exponentially due to the emergence of new industries and use cases and the impact of AI.

Furrier introduced the concept of ‘hyper data scalers.’ These could be entities similar to cloud hyperscalers but focused on storing vast amounts of data for foundational AI models.

  • This aligns with the ongoing evolution of data storage and usage models, with foundational AI models driving this change.
  • There’s potential for the creation of new forms of data clouds, differing from the current offerings by companies like Snowflake and Databricks. These would support more distributed cloud data architectures but would require new types of databases and standards.

The conversation concludes with a focus on the value of historical data for AI, particularly for pattern recognition and training AI models. There’s speculation about a new model of data infrastructure where the emphasis is less on storage and more on real-time use and domain-specific data. This could significantly alter the landscape of data handling in the future.

Bottom Line: The current traction of both Snowflake and Databricks is notable. Both companies have strong management teams and seemingly loyal customer bases, and they’ll both likely leverage AI effectively. The market is large enough for both to thrive in the near to mid-term, with longer-term trends around real-time data and AI inferencing challenging the status quo.

[Listen and watch the three analysts discuss churn rates, real time data, whether more data will become ephemeral and if that will negatively impact storage growth].

Uber for the Enterprise – The Future of Data Apps

The world of applications is shifting toward data apps. Today’s data silos are largely a function of data being embedded in applications that automate processes. Increasingly, we believe business logic will be infused into data and apps will be built using this new model. The example we often use is Uber for the enterprise, where a digital twin of your business is created. People, places and things are digitized as independent data elements, but those data “products” are discoverable, governed and have coherence. A semantic layer enables these data elements to be completely connected and understood by the system and each other.

Using the Uber example, riders and drivers are connected with data related to destinations, ETAs and prices, based on demand and supply. This is done in real time without incurring significant trade-offs between availability, latency and consistency. We believe these types of apps will require new thinking around data architectures, standards and platforms. But more importantly, they will drive new levels of automation and productivity for businesses.
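For illustration only, here’s a minimal sketch of the data-product and semantic-layer idea described above. All of the names, entities and the registry class are hypothetical, not a reference to any shipping platform:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str                 # e.g. "riders", "drivers", "trips"
    owner: str                # accountable team, for governance
    schema: dict              # attribute name -> type
    tags: list = field(default_factory=list)

class SemanticLayer:
    """Hypothetical registry that makes data products discoverable and records how they relate."""
    def __init__(self):
        self.products = {}
        self.relations = []   # (source, target, relationship)

    def register(self, product: DataProduct):
        self.products[product.name] = product

    def relate(self, source: str, target: str, relationship: str):
        self.relations.append((source, target, relationship))

    def discover(self, tag: str):
        return [p for p in self.products.values() if tag in p.tags]

# A toy digital twin of a ride-hailing business: people, places and things as data products
layer = SemanticLayer()
layer.register(DataProduct("riders", "growth-team", {"rider_id": "str", "location": "geo"}, ["person"]))
layer.register(DataProduct("drivers", "ops-team", {"driver_id": "str", "location": "geo"}, ["person"]))
layer.register(DataProduct("trips", "ops-team", {"trip_id": "str", "eta": "float", "price": "float"}, ["event"]))
layer.relate("riders", "trips", "requests")
layer.relate("drivers", "trips", "fulfills")

print([p.name for p in layer.discover("person")])  # -> ['riders', 'drivers']
```

The sketch leaves out the hard parts (real-time matching, governance and consistency at scale), but it shows how business entities can live as first-class, connected data elements rather than being locked inside application silos.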

Here’s a bulleted summary of the closing conversation between the three analysts using the Uber example and the impact on productivity:

  • Revenue per Employee at Uber: Uber’s success in harnessing data from cars, drivers, streets, and road conditions to optimize its business has led to significant increases in revenue per employee. From 2021 to 2022, Uber’s revenue per employee grew from $600,000 to $971,000, far surpassing the typical software company’s $225,000-$250,000 per employee.
  • Automation and AI’s Impact on Businesses: Uber’s automation model represents a future where businesses have to automate extensively and leverage AI to remain solvent. Companies that don’t adopt this model within a decade may face significant risks from emerging startups using AI tools and real-time data more effectively.
  • Elon Musk’s Impact on Industry Productivity: Elon Musk has significantly impacted industries such as automotive and space with his innovative approach to productivity. By producing ‘software cars’ instead of hardware cars, and with similar innovation in SpaceX, Musk has shown that to survive and thrive, companies must adapt to new technologies and workflows.
  • Real-time Data Capture is Key: The most valuable data for decision-making is real-time data. Companies like Uber have excelled by making real-time decisions based on immediate data.
  • AI in Incumbents vs. New Entrants: Incumbents like Dell, IBM, HPE, Oracle, ServiceNow and Salesforce can certainly leverage AI. However, there’s a belief that a new model akin to keyword search, initially overlooked but eventually dominant, may emerge and become a game-changer in the industry.
  • Predicted Shift Towards Simplification: The panel predicts that the ongoing shift in technology represents a new paradigm. This shift is likely to give rise to new startups, much like the advent of the web did. We believe that the next wave will focus on simplifying things and reducing the steps it takes to accomplish tasks. As with the web’s rise, the companies that can work within and enhance this nascent market stand to capture the most value. The big players will also be involved, reaping their share of the value from this shift.

Unlike previous generations of companies, particularly as witnessed in the demise of the east coast minicomputer business (Apollo, DEC, DG, Prime, Wang), today’s leaders are much more paranoid about disruptive technologies. But blind spots exist, and often incumbents are so focused on protecting their franchises that it leads to slower growth and lack of innovation. AI represents an opportunity both for incumbents to drive automation into existing platforms and for disruptors to bring new models to industries.

Unlike the Web, which was often seen as a bifurcated opportunity between bit and atom-based businesses, AI has the potential to be even more ubiquitous.

[Watch the three analysts discuss the Uber example, the impact automation has had on Uber’s revenue per employee and the future imperative of leveraging real time data].

Keep in Touch

Many thanks to Alex Myerson and Ken Shifman on production, podcasts and media workflows for Breaking Analysis. Special thanks to Kristen Martin and Cheryl Knight who help us keep our community informed and get the word out. And to Rob Hof, our EiC at SiliconANGLE.

Remember we publish each week on Wikibon and SiliconANGLE. These episodes are all available as podcasts wherever you listen.

Email david.vellante@siliconangle.com | DM @dvellante on Twitter | Comment on our LinkedIn posts.

Also, check out this ETR Tutorial we created, which explains the spending methodology in more detail.

Watch the full video analysis:

Image: greenbutterfly

Note: ETR is a separate company from Wikibon and SiliconANGLE. If you would like to cite or republish any of the company’s data, or inquire about its services, please contact ETR at legal@etr.ai.

All statements made regarding companies or securities are strictly beliefs, points of view and opinions held by SiliconANGLE Media, Enterprise Technology Research, other guests on theCUBE and guest writers. Such statements are not recommendations by these individuals to buy, sell or hold any security. The content presented does not constitute investment advice and should not be used as the basis for any investment decision. You and only you are responsible for your investment decisions.

Disclosure: Many of the companies cited in Breaking Analysis are sponsors of theCUBE and/or clients of Wikibon. None of these firms or other companies have any editorial control over or advanced viewing of what’s published in Breaking Analysis.
