The previous three chapters had one thing in common: they described various use cases of transferring data from a database server to the Hadoop ecosystem. What if you have the opposite scenario and need to transfer generated, processed, or backed-up data from Hadoop to your database? Sqoop also provides facilities for this use case, and the following recipes in this chapter will help you understand how to take advantage of this feature.
You have a workflow of various Hive and MapReduce jobs that are generating data on a Hadoop cluster. You need to transfer this data to your relational database for easy querying.
You can use Sqoop’s export feature, which transfers data from the Hadoop ecosystem to a relational database. For example, to export data from the HDFS directory cities (the directory that contains the source data) into the database table cities (the table to populate), you would use the following Sqoop command:
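A representative invocation might look like the following. The connection string, database name, and credentials (mysql.example.com, sqoop/sqoop) are placeholder values, not part of the original text; substitute your own:

```shell
# Export the contents of the HDFS directory "cities" into the
# relational table "cities". The JDBC URL, username, and password
# below are placeholders -- replace them with your own values.
sqoop export \
  --connect jdbc:mysql://mysql.example.com/sqoop \
  --username sqoop \
  --password sqoop \
  --table cities \
  --export-dir cities
```

Here --table names the target table in the database, while --export-dir names the HDFS directory whose files will be read and inserted into that table.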
Export works much like import, only in the opposite direction: instead of reading data out of the relational database with SELECT queries, Sqoop writes data into the relational database using INSERT statements. Otherwise, Sqoop’s export workflow matches the import case, with slight differences. After you execute ...