Apache Sqoop becomes top-level project for Hadoop data transfer

4 April 2012

As more organisations deploy Hadoop to analyse vast amounts of information, they may find they need to move large volumes of data between Hadoop and their existing databases, data warehouses and other data stores. Now the volunteer developers behind a connector designed to speed this data exchange have gained the full support of the Apache Software Foundation (ASF).

The ASF has promoted the Sqoop bulk data transfer tool to top-level project status, the organisation has announced.

As a top-level project (TLP), Sqoop will get the full support of the Apache infrastructure, including mailing lists, collaborative work space, legal aid and a code repository. TLP status also indicates that the Sqoop working group follows the ASF's process and principles for developing and maintaining the software.

Sqoop provides a way to quickly transfer large amounts of data between the Hadoop data processing platform and relational databases, data warehouses and non-relational data stores. It works with most modern relational databases, such as MySQL, PostgreSQL, Oracle, Microsoft SQL Server and IBM DB2, as well as enterprise data warehouses.

Sqoop was designed to transfer billions of rows into Hadoop in a speedy, parallel fashion, said Arvind Prabhakar, Apache Sqoop project leader, in a statement. Sqoop can place the data directly into storage managed by the Hadoop Distributed File System (HDFS), or pipe it to other Hadoop applications such as the HBase big table data store or the Hive data warehouse software.
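As a rough illustration of the kind of command involved (the connection details, table name and paths below are placeholders, not anything cited here), a parallel import from a relational database into HDFS might look like this:

    sqoop import \
        --connect jdbc:mysql://dbhost/sales \
        --username analyst -P \
        --table orders \
        --target-dir /user/analyst/orders \
        --num-mappers 8

The --num-mappers option splits the transfer across eight parallel map tasks, which is where the tool's speed comes from; the --hive-import and --hbase-table options direct the imported rows into Hive or HBase instead of plain HDFS files.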

Currently at version 1.4, Sqoop has already been adopted for production use by a number of Hadoop shops. Online marketer Coupons.com uses the software to exchange data between Hadoop and an IBM Netezza data warehouse appliance: the company can query its structured databases and pipe the results into Hadoop using Sqoop. Education company The Apollo Group uses the software not only to extract data from its databases but also to inject the results of Hadoop jobs back into relational databases.
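The return trip described by The Apollo Group, pushing Hadoop job output back into a relational database, corresponds to Sqoop's export mode. A similarly hedged sketch, again with placeholder connection details and names:

    sqoop export \
        --connect jdbc:postgresql://dbhost/reporting \
        --username analyst -P \
        --table daily_summary \
        --export-dir /user/analyst/summary_output

Here --export-dir points at the HDFS directory holding the job's output, and Sqoop turns its records into inserts against the named database table.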

Sqoop first became an ASF incubator project in 2011.

Founded in 1999, the not-for-profit ASF supports over 150 open source projects, including such widely used software as the Apache Web server, the Tomcat application server, the Cassandra database, the Lucene search engine and the Hadoop data analysis platform. Facebook, Google, IBM, Hewlett-Packard, Microsoft, VMware and Yahoo are among the companies that financially support the ASF.

IDG News Service
