Making the First Hadoop Use Case Real for Customers

Dell, Cloudera, Syncsort, and Intel are proud to announce the next generation of the Data Warehouse Optimization – ETL Offload Reference Architecture, certified with CDH 5.5.1. Our goal is to create an intuitive, easy-to-use solution that enables new Hadoop users to develop and deploy ETL jobs in less than a week.

Together we built this solution because we continue to hear from customers that they are seeking the best way to adopt their first big data Hadoop use case – one that will add value to their business without disrupting their existing IT environment. Given the sheer number of big data solutions available, organizations often educate themselves and then step away because of the perceived complexity of deployment. Customers opt either to do nothing and risk being left behind by the competition, or to spend extensive time learning new tools, leading to long, drawn-out big data projects that rarely reach their full potential.

Dell is determined to work closely with customers to turn that difficult-to-implement situation into a Hadoop project that delivers true business value. Imagine introducing your first Hadoop use case without spending valuable time designing, architecting, testing, certifying, or even learning big data tools like Pig or Hive. Dell has listened to customers, and we’ve designed a solution that delivers results: reduced costs and greater operational efficiencies. This use-case-driven Hadoop reference architecture enables organizations to augment costly, cumbersome, capacity- and performance-constrained traditional data warehouses – lowering data transformation costs and building efficiencies while laying a strong, cost-effective, secure, scalable, and robust foundation for a modern data architecture.

With today’s release of the Dell | Cloudera | Syncsort Data Warehouse Optimization – ETL Offload Reference Architecture, Dell will work with your team to implement a solution that simplifies running extract, transform, and load (ETL) jobs in Hadoop without reconstructing your environment or learning a new skill set.

For additional information, visit or contact us at

About the Author: Armando Acosta

Armando Acosta has worked in the IT industry for over 15 years, with experience in architecting IT solutions as well as product marketing, management, planning, and strategy. His latest role focuses on Big Data | Hadoop solutions, building new capabilities for emerging customer needs and helping shape the roadmap for new products and features. Armando is a graduate of the University of Texas at Austin and resides in Austin, TX.