Meet Edouard Bugnion
Edouard (Ed) Bugnion is a Professor of Computer Science at EPFL, one of the leading Swiss engineering institutes. He is a co-founder of VMware, where he served as chief technology officer, and an early-stage investor in companies including Cumulus Networks and Datrium.
Recently he returned to Silicon Valley to celebrate VMware’s 20th birthday. He stopped by theCUBE studios in Palo Alto and sat down with Peter Burris to talk about trends in enterprise tech and the future of the data center.
In the interview, Burris and Bugnion identified several consequential technology changes in enterprise computing over the past two decades, including virtualization (which VMware popularized), networking, mobile technology and flash storage.
As it pertains to flash, Bugnion stated: “It’s totally changed expectations right? Before flash and before in-memory, the expectation was that anything that involved data warehousing and analytics was something that was a batch process. You have to wait for it and the notion that data is available on demand is something that is now taken for granted by users, but it wouldn’t have been possible without those new technologies.”
Flash has had an enormous impact on system design and system architecture. Combine that with the fact that digital transformation is real – meaning the emergence of data as an asset is having profound consequences for how business works – and it becomes clear that data will take a more central role in shaping how future architectures evolve.
This is a key area of Bugnion’s research, in particular its impact on the data center. Specifically, his research focuses on whether you can make efficient use of the resources in a given data center while still meeting service-level objectives. How do you make sure you can respond to user-facing requests fast enough while deploying only the right amount of capacity?
According to Bugnion, this so-called interactive behavior changes the game because of its time constraints. He asserts in the interview that it is difficult to deliver latency-critical, human-facing responses reliably and in real time without consuming an exorbitant amount of resources. He stated several times that energy is a big issue in today’s data centers, and large consumers of compute and storage believe strongly that delivering the same amount of data traffic with less underlying hardware capacity is a big win for customers.
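The tension Bugnion describes can be made concrete with a back-of-the-envelope model. The sketch below is purely illustrative and not from the interview: it assumes a simple M/M/1-style queueing approximation (latency growing as service time divided by spare capacity), and all the numbers, parameters and function names are hypothetical. It shows why meeting a tight latency objective forces operators to run servers well below full utilization, which is exactly the efficiency-versus-responsiveness tradeoff his research targets.

```python
# Illustrative only: a toy model of the latency-vs-capacity tradeoff.
# Assumes a simple M/M/1-style approximation in which mean latency grows as
# service_time / (1 - utilization); real data centers are far more complex.

def mean_latency_ms(offered_load_rps: float, servers: int,
                    per_server_capacity_rps: float = 1000.0,
                    service_time_ms: float = 2.0) -> float:
    """Approximate mean response time for a given fleet size (hypothetical numbers)."""
    utilization = offered_load_rps / (servers * per_server_capacity_rps)
    if utilization >= 1.0:
        return float("inf")  # overloaded: the queue grows without bound
    return service_time_ms / (1.0 - utilization)

def smallest_fleet_meeting_slo(offered_load_rps: float,
                               slo_ms: float = 10.0,
                               max_servers: int = 1000) -> int:
    """Find the fewest servers whose approximate latency stays under the SLO."""
    for servers in range(1, max_servers + 1):
        if mean_latency_ms(offered_load_rps, servers) <= slo_ms:
            return servers
    raise ValueError("SLO cannot be met within max_servers")

if __name__ == "__main__":
    load = 50_000.0  # requests per second, hypothetical
    n = smallest_fleet_meeting_slo(load)
    print(f"{n} servers needed to keep mean latency under 10 ms at {load:.0f} rps")
```

Even in this toy model, meeting a 10 ms objective at 50,000 requests per second requires capping each server at roughly 80 percent utilization, so the "efficient" answer is deliberately to leave headroom idle – the kind of provisioning decision Bugnion's work tries to automate.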
Will There Be Just a Few Mega Clouds?
According to Bugnion, the idea that all data is going to consolidate into a few mega data centers is a myth. He stressed that there will always be a balance: while there are clear economies of scale in these very large data centers, for many enterprises it still makes sense to keep some data on-premises, because shoving everything into the cloud simply isn’t practical.
The role of automation is also critical, according to Bugnion. He stated: “There’s an old friend of mine who once said screwdrivers don’t scale… If you want to be able to operate anything at any scale, you need to have automation. And virtualization is one of the mechanisms for automation, it’s one of the foundational elements… You want to make absolutely sure to separate the act of operating screwdrivers – i.e. adding capacity in the data center, which you only need to do once in a while – decoupling the addition of physical capacity in a data center from the operations.”
The Future of the Data Center
Burris asked Bugnion where he thought the future of the data center was headed. He responded by saying that “if there were no new applications, if there were no digital transformation, the answer would be easy…” Essentially, it would come down to a game of bigger, better, faster, cheaper.
In reality, however, digital transformation is reshaping the data center growth curve – it is accelerating. The key question becomes: how do you keep up with that growth and the complexity that comes with it? While applying virtualization to a particular domain allows that domain to scale by reducing its operational complexity, part of that complexity simply gets shifted elsewhere.

Bugnion makes the point that in the early days of virtualization at VMware they virtualized servers and clusters of servers. That was helpful because you could move VMs around transparently, but it pushed a lot of the complexity into storage networks. At small scale that was fine, and VMware and its ecosystem dealt with the problem for years through storage integration via VMware application programming interfaces (APIs). At larger scale, however, it creates an operational issue because the storage subsystem begins to choke under the weight of all that complexity. It becomes a game of constantly chasing bottlenecks and finding the pain points in the system that are causing constraints.
These constraints have become problematic as companies pursue new digital opportunities, and businesses react differently as they try to exploit data. Technology – not at the infrastructure level per se, but as an enabler and a strategic capability within the business – starts to elevate and permeate organizations. Security becomes a bigger problem as risks escalate. Customer experience becomes critical, and the IT professional is suddenly thrust into a new, more outward-facing role where complex engineering has to be simplified to create consumer-like tech experiences for customers.
According to Bugnion, that’s one of the big transitions, because when he started in tech the direct impact of technology on people’s lives was minimal. Effectively, IT was digitizing business processes that were largely back-office functions. Today, according to Bugnion, tech is at a level that can directly affect people’s lives – for better or for worse, depending on its application. He states: “As an industry we must have the appropriate introspection so that we know we’re doing things in a sensible way. It might involve actually slowing down at times the pace of innovation. Trying to do things in a more deliberate, careful way. Other segments and industries had to do that. You can’t come up with a new drug and simply put it on the market. There’s a process. Obviously this is an extreme example but tech has always been on the other extreme. And the big trend is to find the appropriate balance. I live in Switzerland now and GDPR is all over Europe. It’s actually a big change in the mindset because now you not only have to make sure that you can manage data for yourself as an enterprise but also that you actually abide by your responsibilities as a data processor for your customers and your users.”
In the interview, Burris and Bugnion go deeper into these and other issues, including computational thinking, the locality principle, data sovereignty, regulatory issues, intellectual property, security and what the world of computing will look like in 2035.
Is Bugnion an optimist or a pessimist? You can find out by watching the curated playlist below and clicking on the various sections of the interview.
Thanks to Ed Bugnion for joining us in Palo Alto and to Datrium for sponsoring the segment.