
Systems of Intelligence Are Driving Database Proliferation: Part 3 – "Soft" Issues


Consumer Internet service providers in the 2000s pioneered many things, including new services, business models, and technology platforms.  In building those platforms, they made increasing use of open source databases, often ones they created themselves.  But the reasons weren’t always about technology.  The previous research update in this series explained the economic incentives: with data accumulating far faster than Moore’s Law made it cheaper to manage, prices had to come down.  In addition, the benefits of open source community stewardship became compelling to more organizations.

As databases spread further in organizations, we see more usage scenarios delineating where each fits.  This note explains the “soft” issues driving the proliferation of new types of databases: developer skills and the changing cultural norms of the organizations using them.  Purists can easily dismiss these reasons as illegitimate and trivial compared to the value of data as a corporate asset.  They may be right.  But the customer is always *more* right.


Skill Set: Avoiding SQL

For three decades SQL represented all that was right in the world of data.  Without going through all its merits, SQL made it possible for developers to say what data they wanted to operate on while leaving it to the database to figure out how to do the work as fast as possible.  The developer declared the *what*, not the *how*.  What could be simpler?
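A minimal sketch makes the contrast concrete.  Using SQLite (standing in for any SQL database) with an illustrative `orders` table, the declarative query simply names the result it wants, while the imperative version spells out every step:

```python
import sqlite3

# In-memory database with a small, illustrative orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 250.0), ("globex", 75.0)])

# Declarative: say WHAT you want; the database decides HOW to get it.
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE customer = ?", ("acme",)
).fetchone()[0]

# Imperative equivalent: the developer spells out the how, step by step.
rows = conn.execute("SELECT customer, amount FROM orders").fetchall()
total_by_hand = sum(amount for customer, amount in rows
                    if customer == "acme")

print(total, total_by_hand)  # both 350.0
```

The table and data here are invented for illustration; the point is that the optimizer, not the developer, chooses the access path in the declarative case.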

Well, some developers didn’t like dropping SQL into their applications.  It meant learning yet another language, and it meant switching between the language of their application and SQL, the language of the database.  There was grumbling.

Oracle 12c Enterprise Edition still defines the state of the art in multi-purpose SQL databases, supporting not just mainstream applications but most specialized usage scenarios.  But there’s a trade-off that comes with that richness of functionality: it isn’t easy to learn, use, or administer.  Measured against Ray Ozzie’s model of Web go-to-market products, which must support the discover, learn, try, buy, recommend process, this is definitely the software equivalent of Big Iron.

MongoDB became the anti-MySQL, anti-Oracle for developers working on the user-facing side of many emerging Web and mobile applications.  Besides being cheap and easy to get started with, it didn’t require SQL.  Developers could program in JavaScript, the lingua franca of front-end devices.  And it was highly malleable, storing each document in a JSON-like format.  That became a critical issue as more applications started getting delivered from the cloud.
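A hypothetical sketch (plain Python dicts standing in for a MongoDB collection, with invented field names) shows why that malleability matters: documents are just JSON-like objects, so two user profiles need not share a schema, and queries are expressed in the application language rather than SQL.

```python
# Documents in a MongoDB-style store are JSON-like objects, so two
# profiles in the same collection need not share a schema.
profiles = [
    {"name": "Ann", "email": "ann@example.com"},
    # A later iteration adds fields without any schema migration.
    {"name": "Bo", "email": "bo@example.com",
     "devices": ["phone", "tablet"], "plan": "pro"},
]

# The "query" is just application-language code over the documents --
# no context switch into SQL.
pro_users = [p for p in profiles if p.get("plan") == "pro"]
print([p["name"] for p in pro_users])  # ['Bo']
```

In a real deployment the filter would be a query document passed to the database (and indexed accordingly), but the developer experience is the same: one language end to end.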


Cultural Changes: Avoiding Central IT

The new application development process is the opposite of the old, ERP-class waterfall method, where all requirements, including how the data was organized, were spelled out in painstaking detail before coding began.  This new process is very hard for IT to accommodate, since IT treats data as a corporate asset whose value comes from keeping it organized, accessible, and aligned.

The new generation of online developers who manage the customer profile want to get up and running quickly with simpler tools.  In addition, they need a database that can accommodate continual change.  The development process is one of continual iteration.  In fact, with online apps, the first iterations can be prototypes that just keep evolving.  Many SQL databases can accommodate this.  Even Oracle has very sophisticated support for managing JSON data of the kind MongoDB stores, arguably more so than MongoDB itself.

But when online developers use Oracle databases, IT is generally in charge of them and all the data they contain.  The rapid iteration of customer profile information then comes into direct and almost unyielding contact with IT’s need to maintain order.

If a developer charged with updating the customer profile needs a new field, they have to submit a request to the service desk and wait for a ticket back from IT.  Several days later someone might call or push back a new ticket with a clarifying question: “Do you want an index (for performance) on that column?”  If the answer is yes, then the storage administrators have to review the space impact, the security administrators have to assess the impact on access control to this and other data, and the database administrators have to determine the performance impact on existing queries.
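What the approved ticket ultimately executes is small: two statements.  A sketch with SQLite (standing in for an enterprise database, with hypothetical table and column names) shows the mechanics; it is the review process wrapped around these statements, not the statements themselves, that takes days.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer_profile (id INTEGER PRIMARY KEY, name TEXT)")

# What the approved ticket finally runs: add the requested field...
conn.execute("ALTER TABLE customer_profile ADD COLUMN loyalty_tier TEXT")
# ...and, if the answer to the clarifying question was yes, index it.
conn.execute(
    "CREATE INDEX idx_loyalty_tier ON customer_profile (loyalty_tier)")

conn.execute(
    "INSERT INTO customer_profile (name, loyalty_tier) VALUES (?, ?)",
    ("Ann", "gold"))
row = conn.execute(
    "SELECT name FROM customer_profile WHERE loyalty_tier = 'gold'"
).fetchone()
print(row)  # ('Ann',)
```

In a document store, by contrast, the developer would simply start writing the new field into documents, with no schema-change request at all, which is precisely the appeal described above.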

Is it any surprise that developers want to avoid this type of process when they can?  Is there a downside to this new style of development?  Sure.  It’ll be a lot harder to analyze data across all parts of applications or the enterprise itself.  But applications will get built and evolve faster.

Action Item

Enterprises where central IT has sufficient control over all application development activity shouldn’t completely shut down the use of new, specialized databases.  At the same time, departments that can fly under the radar of central IT shouldn’t completely avoid it.

Maintaining data as an asset amid distributed application development requires threading a needle.  IT and the departmental developers need to agree on a common set of data that the organization needs for regulatory requirements, for performance analysis, and for any other categories they deem necessary.  That way each group can operate semi-autonomously while the critical, common data still gets collected and stays organized.
