Reality Check


Is Hadoop the death of data warehousing?

The Hadoop ecosystem has exploded in the last three years, with major IT vendors announcing connectors to Hadoop, augmentations on top of Hadoop or their own “enterprise-ready” distributions of Hadoop. Given that Hadoop adoption is rising so rapidly and its ecosystem is expanding in both depth and breadth, it is natural to ask whether Hadoop’s ascension will cause the demise of traditional data warehousing solutions.

Another way to put this question is to look at it in a bigger context: To what extent is big data changing the traditional data analytics landscape?

Data warehousing is a set of techniques and software to enable the collection of data from operational systems, the integration and harmonization of that data into a centralized database and then the analysis, visualization and tracking of key performance indicators on a dashboard.

A key difference between data warehousing and Hadoop is that a data warehouse is typically implemented in a single relational database that serves as the central store. In contrast, Hadoop and the Hadoop Distributed File System (HDFS) are designed to span multiple machines and handle huge volumes of data that surpass the capability of any single machine.

Furthermore, the Hadoop ecosystem includes data warehousing layers and services built on top of the Hadoop core. These include SQL (Presto), SQL-like (Hive) and NoSQL (HBase) types of data stores. In contrast, over the last decade, large data warehouses shifted to custom multiprocessor appliances, such as those from Netezza (acquired by IBM) and Teradata, to scale to large volumes. Unfortunately, those appliances are very expensive and out of reach for most small- to medium-sized businesses.
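To make the “SQL-like” point concrete, here is a minimal sketch of running a warehouse-style query against Hive from Python. It assumes the third-party PyHive library and a HiveServer2 endpoint on localhost, and the table and column names are hypothetical; the point is simply that a familiar aggregate query can execute over data stored in HDFS rather than in a single relational database.

```python
# Minimal sketch: a warehouse-style aggregate query over Hive.
# Assumes the third-party PyHive library and HiveServer2 on localhost:10000;
# the sales_events table and its columns are hypothetical.
from pyhive import hive

conn = hive.Connection(host="localhost", port=10000, database="default")
cursor = conn.cursor()

# HiveQL reads like ordinary SQL but executes over files in HDFS.
cursor.execute("""
    SELECT region, SUM(amount) AS total_sales
    FROM sales_events
    GROUP BY region
""")

for region, total_sales in cursor.fetchall():
    print(region, total_sales)

conn.close()
```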

With this background and context it’s natural to ask: Is Hadoop the death of data warehousing?

To answer this question, it’s important to separate the techniques of data warehousing from their implementation. Hadoop (and the advent of NoSQL databases) will augur the demise of data warehousing appliances and the “traditional” single-database implementation of a data warehouse.

Evidence of this can be seen in Cloudera, a Hadoop vendor, billing its platform as an “enterprise data hub,” in essence subsuming the need for traditional data management solutions. Similar sentiment was expressed in a recently published article on ReadWrite.com entitled “Why proprietary big data technologies have no hope of competing with Hadoop.” Likewise, a recent Wall Street Journal article described how Hadoop is challenging Oracle and Teradata.

And the Hadoop/NoSQL ecosystem is still evolving. Many big data environments are choosing hybrid approaches that span NoSQL, SQL and even NewSQL data stores. Additionally, there are changes and potential improvements to the MapReduce parallel processing engine on the horizon, such as Apache’s Spark project. So, while this story is far from over, it is safe to say that traditional, single-server relational databases and database appliances are not the future of big data or data warehousing.

On the other hand, the techniques of data warehousing, including extract, transform and load (ETL), dimensional modeling and business intelligence, will be adapted to the new Hadoop/NoSQL environments. Furthermore, those techniques will also morph to support more hybrid environments. The key principle seems to be that not all data is equal, so IT managers should choose the data storage and access mechanism that best suits how the data will be used. Hybrid environments could include key-value stores, relational databases, graph stores, document stores, columnar stores, XML databases, metadata catalogs and others.
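As a rough illustration that the techniques carry over even as the storage layer changes, the sketch below shows a tiny extract-transform-load step populating a star-schema (dimensional) model. It uses Python’s built-in sqlite3 module purely so the example is self-contained and runnable; the table and field names are hypothetical, and in a Hadoop/NoSQL environment the same dimensional-modeling logic would target Hive tables or another store.

```python
# Illustrative ETL into a dimensional (star-schema) model.
# sqlite3 stands in for whatever store is chosen; all names are hypothetical.
import sqlite3

# "Extract": records pulled from an operational system.
source_rows = [
    {"order_id": 1, "region": "East", "amount": "125.50"},
    {"order_id": 2, "region": "West", "amount": "80.00"},
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_region (region_key INTEGER PRIMARY KEY, name TEXT UNIQUE)")
conn.execute("CREATE TABLE fact_sales (order_id INTEGER, region_key INTEGER, amount REAL)")

for row in source_rows:
    # "Transform": harmonize types and resolve the dimension key.
    conn.execute("INSERT OR IGNORE INTO dim_region (name) VALUES (?)", (row["region"],))
    region_key = conn.execute(
        "SELECT region_key FROM dim_region WHERE name = ?", (row["region"],)
    ).fetchone()[0]
    # "Load": write the fact record against its dimension.
    conn.execute(
        "INSERT INTO fact_sales VALUES (?, ?, ?)",
        (row["order_id"], region_key, float(row["amount"])),
    )

# A typical BI-style query over the dimensional model.
for name, total in conn.execute(
    "SELECT d.name, SUM(f.amount) FROM fact_sales f "
    "JOIN dim_region d ON f.region_key = d.region_key GROUP BY d.name"
):
    print(name, total)
```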

As you can see, this is not really a simple question and therefore does not lend itself well to a simple answer. Nevertheless, in general, while big data will change the implementation of data warehousing over the next five years, it will not obsolete the concepts and practice of data warehousing.

What does this mean for the federal government’s huge investments in data warehouses?

First, when the capacities of current data warehouses are exceeded, those warehouses will be migrated to Hadoop-based multimachine solutions or to cloud-hosted ones. Second, instead of a one-size-fits-all approach, organizations will tailor hybrid storage approaches to their big data volumes.

Michael C. Daconta (mdaconta@incadencecorp.com) is the Vice President of Advanced Technology at InCadence Strategic Solutions and the former Metadata Program Manager for the Homeland Security Department. His new book is entitled, The Great Cloud Migration: Your Roadmap to Cloud Computing, Big Data and Linked Data.

Posted by Michael C. Daconta on Jan 08, 2014 at 9:27 AM


Reader Comments

Mon, Mar 10, 2014

I would like to suggest that analyzing the business flow can lead to greater insights; it does not assure profit, but it clearly points to the root causes that should be averted.


Thu, Jan 16, 2014 simon moss New York

Very interesting read. We all agree that the enterprise will continue to use warehouses, will still be dominated by the requirement that the data model be adhered to before value can be created, and will unfortunately continue to throw good money after bad because of an antiquated and erroneous deployment model. The problem is that the return on those investments, poor as it is today, will increasingly decline as the operating model and the components of value grow increasingly distributed and diverse. The warehouse forces all components or variables to be normalized. As a result, the adjusted R² (a measure of the efficiency of a model) will inherently decline or become more costly as a ratio to created value. The return has always been poor when centralizing data in a warehouse, but in the past we had no alternative. Now there are alternatives, and just in time, because the efficiency of a central warehouse model will decline exponentially as computing and the data edge accelerate outward.

Thu, Jan 9, 2014 JoJo

I have learned about Hadoop and how to use some of its Java APIs. I think it is just another fad. Of COURSE there are a lot of vendors offering tools that interact with Hadoop: They can smell money; they can smell long-term contracts. The way I see Hadoop being implemented, it is going to be just as expensive as, or more expensive than, databases like Oracle (which has tools that interact with Hadoop, BTW) over time. Almost everything requires custom development, yet we seem to hire fewer technical staff. A group of differently formatted spreadsheets, flat files and who knows what else does not turn into a report by magic, yet there are many vendors selling products that claim to do just that...auto-magically. I think the maintenance costs and licensing for all of the add-ons for Hadoop will make our current licensing costs look like chump change. This is if we rely on contractors and COTS software, which it seems we may have to do--for software that is supposedly free. What I would like to see is the government LEAD the way in developing a free way to handle massive sets of distributed data. By the time private industry decides to move on to something new because Hadoop has X or Y shortcoming, will agencies jump on that bandwagon too? A private company can afford to take risks with data and investments, but I don't think the government should. My opinion only.

Wed, Jan 8, 2014 Vikas S. Rajput

There is no doubt about the strides Hadoop/big data is making. But the real question is, are enterprises ready to rely on a still-maturing platform for the DW/BI requirements that help their business thrive? I too am an OSS fanatic, but the fact remains that there are very few OSS product lines (and no "suites") that companies have made their business dependent upon. MySQL was not a bad proposition, but it is known what happened. But yes, I will prefer to maintain a wait-and-watch stance.

