Reality Check

Is Hadoop the death of data warehousing?

The Hadoop ecosystem has exploded in the last three years, with major IT vendors announcing connectors to Hadoop, augmentations built on top of Hadoop or their own “enterprise-ready” distributions of Hadoop. Given that Hadoop adoption is rising so rapidly and its ecosystem is expanding in both depth and breadth, it is natural to ask whether Hadoop’s ascension will cause the demise of traditional data warehousing solutions.

Another way to put this question is to look at it in a bigger context: To what extent is big data changing the traditional data analytics landscape?

Data warehousing is a set of techniques and software used to collect data from operational systems, integrate and harmonize that data in a centralized database, and then analyze and visualize it, typically by tracking key performance indicators on a dashboard.
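As a rough sketch of that pipeline, the Python example below extracts records from a hypothetical operational database, harmonizes them and loads them into a central warehouse table. The file names, table and columns are illustrative assumptions, not a reference to any particular system.

    import sqlite3

    # Extract: pull raw orders from a hypothetical operational system
    source = sqlite3.connect("operational.db")
    rows = source.execute(
        "SELECT order_id, customer, amount_cents, order_date FROM orders").fetchall()

    # Transform: harmonize names, convert cents to dollars, normalize dates
    cleaned = [(oid, cust.strip().upper(), cents / 100.0, date[:10])
               for oid, cust, cents, date in rows]

    # Load: append the harmonized rows into the central warehouse table
    warehouse = sqlite3.connect("warehouse.db")
    warehouse.execute("""CREATE TABLE IF NOT EXISTS fact_orders
                         (order_id INTEGER, customer TEXT, amount REAL, order_date TEXT)""")
    warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", cleaned)
    warehouse.commit()

Dashboards and KPI reports then query the warehouse table rather than the operational systems themselves.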

A key difference between data warehousing and Hadoop is that a data warehouse is typically implemented in a single relational database that serves as the central store. In contrast, Hadoop and the Hadoop Distributed File System (HDFS) are designed to span multiple machines and handle huge volumes of data that surpass the capability of any single machine.
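To make that contrast concrete, here is a minimal aggregation written in the Hadoop Streaming style, where small mapper and reducer scripts run in parallel across the cluster. The input layout and the field being counted are assumptions for illustration.

    #!/usr/bin/env python
    # mapper.py -- emit one (key, 1) pair per input record read from stdin
    import sys

    for line in sys.stdin:
        fields = line.strip().split(",")
        if fields and fields[0]:
            print("%s\t1" % fields[0])   # key on the first column, e.g. an agency code

    #!/usr/bin/env python
    # reducer.py -- Hadoop delivers the pairs sorted by key; sum the counts per key
    import sys

    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.strip().split("\t")
        if key != current_key and current_key is not None:
            print("%s\t%d" % (current_key, count))
            count = 0
        current_key = key
        count += int(value)
    if current_key is not None:
        print("%s\t%d" % (current_key, count))

The same two scripts can be submitted with the Hadoop Streaming jar (along the lines of hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py -input ... -output ...), and Hadoop handles splitting, shuffling and sorting the data across the machines in the cluster.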

Furthermore, the Hadoop ecosystem includes data warehousing layers and services built on top of the Hadoop core. Those services include SQL (Presto), SQL-like (Hive) and NoSQL (HBase) data stores. In contrast, over the last decade, large data warehouses shifted to custom multiprocessor appliances, such as those from Netezza (bought by IBM) and Teradata, to scale to large volumes. Unfortunately, those appliances are very expensive and out of reach for most small- to medium-sized businesses.
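As a rough illustration of the SQL-like layer, the sketch below queries a Hive table over files in HDFS from Python. It assumes the PyHive client, a reachable HiveServer2 and a hypothetical page_views table.

    from pyhive import hive  # assumes the PyHive package is installed

    conn = hive.connect(host="hive.example.internal", port=10000)  # hypothetical host
    cursor = conn.cursor()

    # A warehouse-style aggregate in HiveQL, executed over data stored in HDFS
    cursor.execute("""
        SELECT agency, COUNT(*) AS visits
        FROM page_views
        WHERE view_date >= '2014-01-01'
        GROUP BY agency
        ORDER BY visits DESC
        LIMIT 10
    """)
    for agency, visits in cursor.fetchall():
        print(agency, visits)

Presto offers a similar interactive SQL path, while HBase serves low-latency key-value lookups rather than full SQL.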

With this background and context it’s natural to ask: Is Hadoop the death of data warehousing?

To answer this question, it’s important to separate the techniques of data warehousing from their implementation. Hadoop (and the advent of NoSQL databases) will augur the demise of data warehousing appliances and the “traditional” single-database implementation of a data warehouse.

Evidence of this can be seen in Hadoop vendors like Cloudera billing their platforms as an “enterprise data hub,” in essence subsuming the need for traditional data management solutions. Similar sentiment was expressed on ReadWrite.com in a recently published article entitled, “Why proprietary big data technologies have no hope of competing with Hadoop.” Likewise, a recent Wall Street Journal article described how Hadoop is challenging Oracle and Teradata.

And the Hadoop/NoSQL ecosystem is still evolving. Many big data environments are choosing hybrid approaches that span NoSQL, SQL and even NewSQL data stores. Additionally, there are changes and potential improvements to the MapReduce parallel processing engine on the horizon, such as the Apache Spark project. So, while this story is far from over, it is safe to say that traditional, single-server relational databases and database appliances are not the future of big data or data warehousing.
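Spark, for example, keeps intermediate results in memory and offers a higher-level API than raw MapReduce. A minimal PySpark sketch of a warehouse-style aggregation might look like the following, where the HDFS path and column positions are assumptions.

    from pyspark import SparkContext  # assumes a local or cluster Spark installation

    sc = SparkContext(appName="warehouse-style-aggregate")

    # Read CSV lines from HDFS, key on the first column and sum the third column
    lines = sc.textFile("hdfs:///data/orders/*.csv")        # hypothetical input path
    totals = (lines.map(lambda line: line.split(","))
                   .map(lambda f: (f[0], float(f[2])))
                   .reduceByKey(lambda a, b: a + b))

    for key, total in totals.collect():
        print(key, total)

    sc.stop()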

On the other hand, the techniques of data warehousing, including extract, transform and load (ETL), dimensional modeling and business intelligence, will be adapted to the new Hadoop/NoSQL environments and will morph to support more hybrid environments. The key principle is that not all data is equal, so IT managers should choose the data storage and access mechanism that best suits how the data will be used. Hybrid environments could include key-value stores, relational databases, graph stores, document stores, columnar stores, XML databases, metadata catalogs and others.
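Dimensional modeling is a good example of a technique that carries over. The star-schema sketch below uses SQLite purely for illustration, with hypothetical table names; the same fact and dimension layout could be declared just as readily in Hive over HDFS.

    import sqlite3

    db = sqlite3.connect("warehouse.db")

    # Dimension tables describe the "who, what and when" of each business event
    db.execute("""CREATE TABLE IF NOT EXISTS dim_date
                  (date_key INTEGER PRIMARY KEY, full_date TEXT, fiscal_year INTEGER)""")
    db.execute("""CREATE TABLE IF NOT EXISTS dim_agency
                  (agency_key INTEGER PRIMARY KEY, agency_name TEXT, bureau TEXT)""")

    # The fact table records the measurable events, keyed by the dimensions above
    db.execute("""CREATE TABLE IF NOT EXISTS fact_spending
                  (date_key INTEGER REFERENCES dim_date (date_key),
                   agency_key INTEGER REFERENCES dim_agency (agency_key),
                   amount REAL)""")
    db.commit()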

As you can see, this is not a simple question, so it does not lend itself to a simple answer. Nevertheless, while big data will change the implementation of data warehousing over the next five years, it will not make the concepts and practice of data warehousing obsolete.

What does this mean for the federal government’s huge investments in data warehouses?

First, when the capacities of current data warehouses are exceeded, those warehouses will be migrated to Hadoop-based, multi-machine or cloud-hosted solutions. Second, instead of a one-size-fits-all approach, organizations will look to tailor their big data volumes to hybrid storage approaches.

Michael C. Daconta (mdaconta@incadencecorp.com) is the Vice President of Advanced Technology at InCadence Strategic Solutions and the former Metadata Program Manager for the Homeland Security Department. His new book is entitled, The Great Cloud Migration: Your Roadmap to Cloud Computing, Big Data and Linked Data.

Posted by Michael C. Daconta on Jan 08, 2014 at 9:27 AM


Reader Comments

Wed, Jul 9, 2014 Hoadhead

A good read, but it is getting a little ad nauseam. First there were Hadoop/MapReduce solutions, but they required too much development work to implement. So SQL-on-MapReduce solutions began to spring up. But they were slow, so MapReduce needed refurbishment, and thus YARN. As YARN is not panning out, we are getting something better with the likes of Impala, an operating system/SQL parser specifically for HDFS. Still far behind a data warehouse, so I guess we now need to replace HDFS, the real bottleneck. Oh, wait a minute, then we will have built a data warehouse. Ugh, I am being reminded of the .com bust!!!

Sat, May 17, 2014 Kai Wähner @KaiWaehner

I have some great slides describing the difference between Hadoop and Data Warehouse, and how both complement each other: http://www.kai-waehner.de/blog/2014/05/13/hadoop-and-data-warehouse-dwh-friends-enemies-or-profiteers-what-about-real-time-slides-including-tibco-examples-from-jax-2014-online/

Sun, Apr 27, 2014 iDontWantToKnow Melbourne

Once again, CTOs with more money than sense will be taken for a ride. The IT industry is addicted to reinventing the wheel. The latest iteration of the wheel now comes with logical partitions and RAID5 (and a new buzzword too)! How revolutionary (not)!

Mon, Mar 10, 2014

I would like to suggest that analyzing the business flow can lead to greater insights. It does not assure profit, but it clearly indicates the root cause that should be averted.

