Will 3D XPoint make it against 3D NAND?
Wikibon projects that 3D XPoint will not achieve volume adoption in either the enterprise or consumer markets; at best it may occupy some specialist application niches. The capabilities claimed as unique to 3D XPoint will be addressed by DRAM, 3D NAND, and innovative non-volatile combinations of DRAM and 3D NAND.
CIOs and CTOs should assume 3D XPoint will NOT be important to the enterprise, and not evaluate it further. The focus should be on deploying 3D NAND technologies, with the expectation of a rapid cadence of improvements in performance, density, form factor, and volume. Hybrid DIMMs combining DRAM and 3D NAND, as well as UniGrid Architectures, should also be evaluated as those technologies mature.
NAND Vendors put all their chips on TLC 3D NAND
Premise: By mid-2018, 3D NAND will become the backbone of almost all NAND flash deliveries from all vendors, to both the consumer and enterprise markets, with a clear path to increased density, higher performance, and decreasing cost. Consumer NAND announcements are a leading indicator of enterprise NAND capabilities. Toshiba Consumer SSD NAND Announcement […]
True Private Cloud Projections 2015-2026

The concept of Private Cloud has been in the market since 2008, but the ability of IT organizations to build Private Cloud services that match Public Cloud has been limited. Wikibon forecasts Private Cloud market revenues from 2015 to 2026 and introduces the enhanced concept of the “True” Private Cloud.
Disruptive Trends in Cloud Computing (2015-2025)

As more companies transition into digital businesses, it is critical to understand the business and technology trends that will shape Cloud Computing over the next decade. Wikibon examines the key elements that will help companies understand this rapidly evolving landscape and marketplace.
Simplifying and Future-Proofing Hadoop

Customers are asking how to choose from the rich and growing array of tools in the Hadoop ecosystem. Part of the consideration centers on how to insulate themselves from fragmentation in the continually evolving Hadoop ecosystem. The rapid pace of innovation distinguishes the ecosystem, but it also has downsides: the mix-and-match richness of choice introduces complexity for administrators and developers.
At some point it makes sense for customers to consider investing in tools that can hide much of that complexity in getting data ready for analysis. To be clear, there is no magic product that can hide all of these technologies. But when customers take the perspective of simplifying specific end-to-end processes, solutions are available that address the problem.
Making Sense of Hortonworks’ Dataflow

IT leaders can no longer treat stream processors as esoteric functionality employed solely by leading-edge consumer Internet services. They are becoming core functionality that works side-by-side with more familiar batch processing engines such as Hive, HBase, and Impala. Application developers who need near-real-time functionality can start to evaluate stream processors as part of their design patterns. However, the analytics functionality of stream processors is still relatively primitive, and most need considerable hardening to be resilient enough for mainstream customers.
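To make the design-pattern point concrete, the sketch below contrasts the batch and streaming approaches to the same aggregation in plain, library-agnostic Python. It is illustrative only: the function and class names are hypothetical, and a real deployment would use a stream processing engine rather than hand-rolled code.

```python
# Minimal, library-agnostic sketch of the batch vs. streaming design pattern
# discussed above. All names are hypothetical, not taken from any product.
from collections import Counter
from typing import Iterable


def batch_count(events: Iterable[str]) -> Counter:
    """Batch pattern: recompute the full aggregate over a stored data set."""
    return Counter(events)


class StreamingCounter:
    """Streaming pattern: update the aggregate incrementally per event,
    so near-real-time results are available without a full recompute."""

    def __init__(self) -> None:
        self.counts = Counter()

    def process_event(self, event: str) -> None:
        self.counts[event] += 1


# The same events yield the same counts either way; the difference is when
# the answer becomes available.
events = ["login", "purchase", "login", "logout"]
stream = StreamingCounter()
for e in events:  # events arrive one at a time
    stream.process_event(e)
assert stream.counts == batch_count(events)
```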
Unpacking the Public Cloud Market

Wikibon Research has forecast Public Cloud revenues from 2015 through 2026, with the expectation that 33% of all IT spending ($500B) will move to Public Cloud services within the next decade. The impact on IT vendors and IT strategy will be significant as more applications focus on mobile and real-time data analytics.
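As a quick back-of-the-envelope illustration (not a Wikibon figure), the forecast's headline numbers imply a total IT spending base of roughly $1.5 trillion:

```python
# Illustrative check of the projection above: if $500B represents 33% of all
# IT spending, the implied total IT spend is about $1.5T.
public_cloud_spend = 500e9   # $500B moving to Public Cloud
share_of_total = 0.33        # 33% of all IT spending
implied_total_it_spend = public_cloud_spend / share_of_total
print(f"Implied total IT spend: ${implied_total_it_spend / 1e12:.2f}T")  # ~$1.52T
```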
Server SAN 2012-2026

In this research paper, Wikibon looks back at its introductory Server SAN research, adjusts the Server SAN definition to include System Drag, and raises the projected pace of Server SAN adoption based on the very fast uptake from 2012 to 2014. Overall Server SAN growth is projected at about a 23% CAGR from 2014 to 2026, with faster growth of 38% from 2014 to 2020. The total Server SAN market is projected to grow to over $48 billion by 2026. The traditional enterprise storage market is projected to decline at a 16% CAGR, leading to overall growth in storage spend of 3% CAGR through 2026. Traditional enterprise storage is being squeezed in a vise between a superior, lower-cost, and more flexible storage model in Enterprise Server SAN, and the migration of IT toward cloud computing and Hyperscale Server SAN deployments. Wikibon strongly recommends that CTOs and CIOs initiate Server SAN pilot projects in 2015, particularly for applications where either low cost or high performance is required.
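For readers who want to see how the growth figures fit together, the short sketch below back-calculates the market sizes implied by the stated CAGRs. The 2014 and 2020 values are illustrative derivations from those figures, not Wikibon's published data.

```python
# Illustrative back-calculation from the CAGR figures cited above; the implied
# 2014 and 2020 values are derived from those figures, not published numbers.
def grow(value: float, cagr: float, years: int) -> float:
    """Compound a starting value at a constant annual growth rate (CAGR)."""
    return value * (1 + cagr) ** years


market_2026 = 48e9                            # projected total Server SAN market ($48B)
implied_2014 = market_2026 / (1.23 ** 12)     # ~$4.0B base implied by a 23% CAGR over 12 years
implied_2020 = grow(implied_2014, 0.38, 6)    # ~$27.6B if the first 6 years grow at 38%

# Remaining 2020-2026 CAGR implied by those two figures (just under 10%),
# showing most of the projected growth is front-loaded into 2014-2020.
implied_late_cagr = (market_2026 / implied_2020) ** (1 / 6) - 1
print(f"{implied_2014 / 1e9:.1f}B, {implied_2020 / 1e9:.1f}B, {implied_late_cagr:.1%}")
```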