The Apache Software Foundation (ASF) announced that its Apache Sqoop big data tool has graduated from the Apache Incubator to become a top-level project (TLP).

Sqoop is designed to efficiently transfer bulk data between Apache Hadoop and structured data stores such as relational databases. Apache Sqoop allows the import of data from external data stores and enterprise data warehouses into the Hadoop Distributed File System (HDFS) or related systems such as Apache Hive and HBase.
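A sense of what a basic invocation looks like may help here. The sketch below runs a Sqoop 1.x table import via Python; the JDBC URL, credentials, table name and HDFS path are hypothetical placeholders rather than anything described in the article.

```python
# Minimal sketch of a Sqoop 1.x import, assuming the "sqoop" CLI is on the PATH
# and a MySQL source is reachable; every connection detail below is a placeholder.
import subprocess

subprocess.run([
    "sqoop", "import",
    "--connect", "jdbc:mysql://db.example.com/sales",   # source database (hypothetical)
    "--username", "etl_user",
    "--password", "secret",               # inline only for illustration
    "--table", "orders",                  # relational table to copy
    "--target-dir", "/data/raw/orders",   # HDFS directory that receives the files
], check=True)
```

Adding --hive-import to the same invocation would register the imported data as an Apache Hive table, and a separate set of options targets HBase.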

Maturity

“The Sqoop Project has demonstrated its maturity by graduating from the Apache Incubator,” said Arvind Prabhakar, vice president of Apache Sqoop, in a statement. “With jobs transferring data on the order of billions of rows, Sqoop is proving its value as a critical component of production environments.”

ASF officials said Sqoop builds on the Hadoop infrastructure and parallelises data transfer for fast performance and optimal use of system and network resources. In addition, Sqoop allows fast copying of data from external systems into Hadoop to make data analysis more efficient, and mitigates the risk of placing excessive load on external systems.
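That parallelism is the main tuning knob: Sqoop splits the source table across several map tasks, so the mapper count trades transfer speed against load on the source database. A hedged sketch, again with placeholder names:

```python
# Sketch: controlling import parallelism. "--split-by" names the column used to
# partition the table into ranges and "--num-mappers" sets how many map tasks
# copy those ranges concurrently. All names and values are illustrative.
import subprocess

subprocess.run([
    "sqoop", "import",
    "--connect", "jdbc:mysql://db.example.com/sales",
    "--username", "etl_user", "--password", "secret",
    "--table", "orders",
    "--split-by", "order_id",      # column used to divide the table between mappers
    "--num-mappers", "8",          # more mappers: faster copy, heavier source load
    "--target-dir", "/data/raw/orders",
], check=True)
```

Dropping the mapper count back to 1 serialises the copy, which is one way to keep a busy production database from being overloaded.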

“Connectivity to other databases and warehouses is a critical component for the evolution of Hadoop as an enterprise solution, and that’s where Sqoop plays a very important role,” said Deepak Reddy, Hadoop Manager at Coupons.com, in a statement. “We use Sqoop extensively to store and exchange data between Hadoop and other warehouses like Netezza. The power of Sqoop also comes in the ability to write free-form queries against structured databases and pull that data into Hadoop.”
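The free-form queries Reddy mentions correspond to Sqoop's --query option. In a query import the SQL must contain the literal token $CONDITIONS, which Sqoop replaces with a range predicate for each map task; the query, column and paths below are hypothetical.

```python
# Sketch of a free-form query import. The SQL must include $CONDITIONS, and
# "--split-by" plus "--target-dir" are required when importing a query rather
# than a whole table. Every identifier here is a placeholder.
import subprocess

subprocess.run([
    "sqoop", "import",
    "--connect", "jdbc:mysql://db.example.com/sales",
    "--username", "etl_user", "--password", "secret",
    "--query", "SELECT o.order_id, o.total, c.region "
               "FROM orders o JOIN customers c ON o.cust_id = c.id "
               "WHERE $CONDITIONS",
    "--split-by", "o.order_id",
    "--target-dir", "/data/raw/orders_by_region",
], check=True)
```

Because the arguments are passed as a list rather than through a shell, the $CONDITIONS token needs no extra escaping.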

Moreover, “Sqoop has been an integral part of our production data pipeline,” said Bohan Chen, director of the Hadoop Development and Operations team at Apollo Group, also in a statement. “It provides a reliable and scalable way to import data from relational databases and export the aggregation results to relational databases.”
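The export direction Chen describes has a mirror-image command, sqoop export, which pushes files from an HDFS directory back into an existing relational table; as before, the table, paths and delimiter are placeholders.

```python
# Sketch of a Sqoop 1.x export: delimited files in HDFS are inserted into a
# relational table that already exists on the target database. Placeholders throughout.
import subprocess

subprocess.run([
    "sqoop", "export",
    "--connect", "jdbc:mysql://db.example.com/reporting",
    "--username", "etl_user", "--password", "secret",
    "--table", "daily_aggregates",         # destination table in the database
    "--export-dir", "/data/agg/daily",     # HDFS directory holding the result files
    "--input-fields-terminated-by", ",",   # delimiter used by those files
], check=True)
```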

Specialised systems

Since entering the Apache Incubator in June 2011, Sqoop has quickly been embraced as a key SQL-to-Hadoop data transfer solution.

The project provides connectors for popular systems such as MySQL, PostgreSQL, Oracle, SQL Server and DB2, and also allows for the development of drop-in connectors that provide high-speed connectivity with specialised systems like enterprise data warehouses.

Craig Ling, director of business systems at Tsavo Media, stated: “We adopted the use of Sqoop to transfer data into and out of Hadoop with our other systems over a year ago. It is straightforward and easy to use, which has opened the door to allow team members to start consuming data autonomously, maximising the analytical value of our data repositories.”


Darryl K. Taft

Darryl K. Taft covers IBM, big data and a number of other topics for TechWeekEurope and eWeek.
