
Server SAN Projections 2016-2026

Premise. Server SAN continues to grow fast and is projected to replace most traditional storage arrays by 2026. Wikibon projects Enterprise Hyperscale Server SAN will migrate to True Private Cloud (TPC) Server SAN. Wikibon also projects that True Private Cloud Server SAN, together with Hyperscale Server SAN, will make some headway in public cloud storage. Definitions for the […]

A Guide to Concepts in Digital Twins

Premise. The digital twin programming concept will extend well beyond IoT. It presents a richer representation of real things than traditional programming technologies do. Users need conventions for core concepts and how they fit together. In our conversations with the Wikibon community, we hear both interest and confusion regarding the notion of digital twins. We believe […]
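
To make the notion more concrete, the sketch below shows one way a digital twin might be modeled in code; the PumpTwin class, its fields, and its thresholds are illustrative assumptions for this sketch, not a convention the report defines.

```python
# A minimal, hypothetical sketch of a digital twin in code. The class,
# field names, and thresholds are illustrative assumptions, not a
# standard or any vendor's API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional, Tuple


@dataclass
class PumpTwin:
    """Digital twin of a (hypothetical) physical pump.

    The twin mirrors observed state (telemetry), keeps descriptive
    metadata, and exposes derived behavior, which a plain database
    record would not.
    """
    asset_id: str
    location: str
    rpm: float = 0.0
    temperature_c: float = 0.0
    last_updated: Optional[datetime] = None
    history: List[Tuple[datetime, float, float]] = field(default_factory=list)

    def ingest_telemetry(self, rpm: float, temperature_c: float) -> None:
        """Synchronize the twin with a new reading from the physical asset."""
        self.rpm = rpm
        self.temperature_c = temperature_c
        self.last_updated = datetime.now(timezone.utc)
        self.history.append((self.last_updated, rpm, temperature_c))

    def needs_inspection(self) -> bool:
        """Derived behavior: flag the asset if it runs hot at high speed."""
        return self.rpm > 3000 and self.temperature_c > 85.0


# Example: update the twin from a simulated sensor reading.
twin = PumpTwin(asset_id="pump-017", location="plant-3")
twin.ingest_telemetry(rpm=3200, temperature_c=91.5)
print(twin.needs_inspection())  # True
```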

Further Evidence of IBM’s Strategic Retrenchment in Enterprise Analytics

If you’ve been following IBM’s strategic moves in data and analytics over the past six months, it’s hard not to have some concerns. The announcements the company has been making this year err on the side of excessive caution. What they illustrate is IBM Analytics’ ongoing retrenchment into mainframes, private clouds, and end-to-end data governance. IBM seems […]

Will 3D XPoint make it against 3D NAND?

Wikibon projects that 3D XPoint will not make it in volume in either the enterprise or consumer markets; it may reside in some specialist application niches. The areas of uniqueness claimed for 3D XPoint will be addressed by DRAM, 3D NAND, and innovative non-volatile combinations of DRAM and 3D NAND.

CIOs and CTOs should assume 3D XPoint will NOT be important to the enterprise, and should not evaluate it further. The focus should be on deploying 3D NAND technologies, with the expectation of a rapid cadence of increases in performance, density, form factor, and volume. Hybrid DIMMs combining DRAM and 3D NAND should also be evaluated as the technologies mature, as should UniGrid Architectures.

NAND Vendors put all their chips on TLC 3D NAND

Premise. By mid-2018, 3D NAND will become the backbone of almost all NAND flash deliveries from all vendors, to both the consumer and enterprise markets, with a clear path to increased density, higher performance, and decreasing cost. Consumer NAND announcements are a leading indicator of enterprise NAND capabilities. Toshiba Consumer SSD NAND Announcement […]

True Private Cloud Projections 2015-2026

The concept of Private Cloud has been in the market since 2008, but the ability of IT organizations to build Private Cloud services that match Public Cloud has been limited. Wikibon forecasts Private Cloud market revenues from 2015 to 2026 and introduces the enhanced concept of “True” Private Cloud.

Disruptive Trends in Cloud Computing (2015-2025)

As more companies transition into digital businesses, it is critical to understand the business and technology trends that will shape Cloud Computing for the next decade. Wikibon examines the critical elements to help companies understand this rapidly evolving landscape and marketplace.

Simplifying and Future-Proofing Hadoop

Customers are asking how to choose from the rich and growing array of tools in the Hadoop ecosystem. Part of the consideration centers on how to insulate themselves from fragmentation in the continually evolving Hadoop ecosystem. The incredibly rapid pace of innovation distinguishes the ecosystem, but it also has its downsides. The mix-and-match richness of choice introduces complexity for administrators and developers.

At some point it makes sense for customers to consider investing in tools that can hide much of that complexity and deliver data ready for analysis. To be clear, there is no magic product that can hide all these technologies. But when customers take the perspective of simplifying specific end-to-end processes, solutions are available to address the problem.
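
As one illustrative (and hypothetical) example of this kind of simplification, a higher-level engine such as Spark SQL can present Hive tables and raw Parquet files through a single DataFrame interface, so analysts do not have to juggle each underlying component directly; the table and path names below are invented for the sketch.

```python
# Illustrative only: PySpark's SQL layer as one example of an abstraction
# that hides lower-level Hadoop components behind a single interface.
# Table and path names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("simplified-analytics")
    .enableHiveSupport()          # reuse existing Hive metastore tables
    .getOrCreate()
)

# The same DataFrame API works whether the source is a Hive table...
orders = spark.table("sales.orders")

# ...or raw Parquet files landed by an ingest pipeline.
clicks = spark.read.parquet("hdfs:///landing/clickstream/2016/")

# Analysts work against one logical view instead of several engines.
daily = (
    orders.join(clicks, "customer_id")
          .groupBy("order_date")
          .agg({"order_total": "sum"})
)
daily.show()
```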

Making Sense of Hortonworks' Dataflow

IT leaders can no longer treat stream processors as esoteric functionality employed solely by leading-edge consumer Internet services. They are becoming core functionality that works side by side with the more familiar batch processing engines such as Hive, HBase, or Impala. Application developers who need near-real-time functionality can start to evaluate stream processors as part of their design patterns. However, the analytics functionality of stream processors is still somewhat primitive, and most need significant hardening to be resilient enough for mainstream customers.
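
To illustrate the basic design pattern rather than any particular product's API (this is not Hortonworks DataFlow code), the sketch below counts events over tumbling one-minute windows, the kind of near-real-time aggregation that would run alongside batch jobs; the event fields and window size are assumptions for this sketch.

```python
# Illustrative sketch of a near-real-time design pattern: a tumbling-window
# count over a stream of events. Plain Python, not any stream processor's API.
from collections import Counter
from typing import Dict, Iterable, Iterator, Tuple

def tumbling_window_counts(
    events: Iterable[Tuple[float, str]],   # (epoch_seconds, page) pairs
    window_seconds: int = 60,
) -> Iterator[Tuple[int, Dict[str, int]]]:
    """Yield (window_start, page -> count) each time a window closes.

    Assumes events arrive roughly in time order, as they would from a
    message queue feeding a stream processor.
    """
    current_window = None
    counts: Counter = Counter()
    for timestamp, page in events:
        window = int(timestamp // window_seconds) * window_seconds
        if current_window is None:
            current_window = window
        if window != current_window:          # window closed: emit results
            yield current_window, dict(counts)
            current_window, counts = window, Counter()
        counts[page] += 1
    if counts:                                # flush the final partial window
        yield current_window, dict(counts)

# Example: three page views, two in the first minute, one in the next.
stream = [(0.0, "/home"), (30.0, "/home"), (75.0, "/pricing")]
for window_start, page_counts in tumbling_window_counts(stream):
    print(window_start, page_counts)
```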
