The Apache Software Foundation (ASF) announced that its Apache Sqoop big data tool has graduated from the Apache Incubator to become a top-level project (TLP).

Sqoop is designed to efficiently transfer bulk data between Apache Hadoop and structured data stores such as relational databases. Apache Sqoop allows the import of data from external data stores and enterprise data warehouses into the Hadoop Distributed File System (HDFS) or related systems such as Apache Hive and HBase.
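As a rough illustration, a single Sqoop command can pull a relational table straight into Hive; the connection string, database, table name and credentials below are placeholders, not part of the project's announcement:

    # import a hypothetical "orders" table and create a matching Hive table
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username analyst -P \
      --table orders \
      --hive-import

A similar option, --hbase-table, targets HBase instead of Hive.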

Maturity

“The Sqoop Project has demonstrated its maturity by graduating from the Apache Incubator,” said Arvind Prabhakar, vice president of Apache Sqoop, in a statement. “With jobs transferring data on the order of billions of rows, Sqoop is proving its value as a critical component of production environments.”

ASF officials said Sqoop builds on the Hadoop infrastructure and parallelises data transfer for fast performance and best use of system and network resources. In addition, Sqoop allows fast copying of data from external systems to Hadoop to make data analysis more efficient, and mitigates the risk of excessive load to external systems.
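A sketch of how that parallelism is typically expressed on the command line (the split column, mapper count and paths here are illustrative assumptions):

    # split the import across eight map tasks, partitioned on the primary key
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --table orders \
      --split-by order_id \
      --num-mappers 8 \
      --target-dir /data/sales/orders

Each mapper reads a disjoint range of order_id values, so the load on the source database is spread across several smaller queries rather than a single large scan.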

“Connectivity to other databases and warehouses is a critical component for the evolution of Hadoop as an enterprise solution, and that’s where Sqoop plays a very important role,” said Deepak Reddy, Hadoop Manager at Coupons.com, in a statement. “We use Sqoop extensively to store and exchange data between Hadoop and other warehouses like Netezza. The power of Sqoop also comes in the ability to write free-form queries against structured databases and pull that data into Hadoop.”
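The free-form query import Reddy describes looks roughly like the following; a generic MySQL URL is used here for simplicity, and a Netezza source would need its own JDBC URL and connector, so treat every identifier as a placeholder:

    # import the result of an arbitrary join rather than a whole table;
    # $CONDITIONS is replaced by each mapper's split predicate at run time
    sqoop import \
      --connect jdbc:mysql://db.example.com/warehouse \
      --query 'SELECT c.id, c.region, o.total FROM customers c JOIN orders o ON c.id = o.customer_id WHERE $CONDITIONS' \
      --split-by c.id \
      --target-dir /data/warehouse/customer_orders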

Moreover, “Sqoop has been an integral part of our production data pipeline,” said Bohan Chen, director of the Hadoop Development and Operations team at Apollo Group, also in a statement. “It provides a reliable and scalable way to import data from relational databases and export the aggregation results to relational databases.”
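The export half of that round trip is a single command as well; again, the table, directory and connection details below are made-up placeholders:

    # push aggregated results from HDFS back into a relational reporting table
    sqoop export \
      --connect jdbc:mysql://db.example.com/reports \
      --table daily_aggregates \
      --export-dir /data/output/daily_aggregates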

Specialised systems

Sqoop was quickly embraced as a key SQL-to-Hadoop data transfer solution after entering the Apache Incubator in June 2011.

The project provides connectors for popular systems such as MySQL, PostgreSQL, Oracle, SQL Server and DB2, and also allows for the development of drop-in connectors that provide high-speed connectivity with specialised systems like enterprise data warehouses.
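Where a database-specific fast path exists, it is usually enabled with a single extra flag; the PostgreSQL example below is a sketch, and specialised warehouses would instead rely on a vendor-supplied drop-in connector:

    # use the database's native bulk path instead of plain JDBC reads
    sqoop import \
      --connect jdbc:postgresql://db.example.com/crm \
      --table contacts \
      --direct \
      --target-dir /data/crm/contacts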

Craig Ling, director of business systems at Tsavo Media, stated: “We adopted the use of Sqoop to transfer data into and out of Hadoop with our other systems over a year ago. It is straightforward and easy to use, which has opened the door to allow team members to start consuming data autonomously, maximising the analytical value of our data repositories.”


Darryl K. Taft

Darryl K. Taft covers IBM, big data and a number of other topics for TechWeekEurope and eWeek
