
1 From SQL to Hadoop and Back: The "Sqoop" about Data Connections between SQL and Hadoop
Steve O'Hearn
Copyright (c) 2014 Steve O'Hearn, http://www.databasetraining.com

2 Welcome!

4 Goals for "From SQL to Hadoop and Back"
- Understand why you may want to establish data connections between the Oracle RDBMS and Hadoop
- Review various techniques and tools for establishing data connections between the Oracle RDBMS and Hadoop's HDFS
- Understand the purpose, benefits, and limitations of the various techniques and tools


6 First: What is the Oracle RDBMS? What is Hadoop?

7 Oracle 12c RDBMS
Source: Oracle Database Concepts, 12c Release 1 (12.1)

8 The Hadoop Framework
- Framework of tools
- Open source, written in Java
- Apache Software Foundation projects
- Several tools: HDFS (storage) and MapReduce (analysis); HBase, Hive, Pig, Sqoop, Flume, and more
- Network sockets

9 HDFS
- Hadoop Distributed File System
- Text files, binary files
- Very large data blocks: 64 MB minimum; typically 128 MB or 256 MB; 1 GB or higher possible
- Replication: 3 copies by default
- Namenode and Datanodes
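Block size and replication can be seen and overridden per file from the HDFS command line. A minimal sketch (file paths are made up, and exact property and format-specifier names vary by Hadoop version):

```shell
# Put a file into HDFS with an explicit 256 MB block size, overriding the cluster default
hadoop fs -D dfs.blocksize=268435456 -put weblogs.txt /data/raw/

# Show the replication factor and block size HDFS actually recorded for the file
hadoop fs -stat "replication=%r blocksize=%o" /data/raw/weblogs.txt
```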

10 MapReduce
- Analytical engine of Hadoop
- JobTracker
- TaskTracker
- Interprets data at runtime instead of using predefined schemas

11 Hadoop Configuration
(diagram: Hive, Pig, HBase, and other tools run on MapReduce, which sits on the HDFS API over HDFS; HDFS stores text files and binary files)
NOTE: There are file systems other than HDFS.

12 Oracle RDBMS vs. Hadoop

            Oracle RDBMS                Hadoop
Integrity   High                        Low
Schema      Structured                  Unstructured
Use         Frequent reads and writes   Write once, read many
Style       Interactive and batch       Batch

13 Goals
- Understand why you may want to establish data connections between the Oracle RDBMS and Hadoop
- Review various techniques and tools for establishing data connections between the Oracle RDBMS and Hadoop's HDFS
- Understand the purpose, benefits, and limitations of the various techniques and tools

14 Why establish connections?
Sample scenario:
- Move data from Oracle into Hadoop
- Perform MapReduce on datasets that include Oracle data
- Move MapReduce results back into Oracle for analysis, reporting, etc.
Other uses:
- Oracle queries that join with Hadoop datasets
- Scheduled batch MapReduce results to be warehoused

15 Oracle and Hadoop
Oracle and Hadoop together form a comprehensive platform for managing all forms of data, both structured and unstructured. Hadoop provides "big data" processing. Oracle provides analytic capabilities not found in Hadoop. (NOTE: This is changing.)

16 Goals
- Understand why you may want to establish data connections between the Oracle RDBMS and Hadoop
- Review various techniques and tools for establishing data connections between the Oracle RDBMS and Hadoop's HDFS
- Understand the purpose, benefits, and limitations of the various techniques and tools


18 Connectors
(diagram)
PUSH: SELECT via SQL, PL/SQL, or Java program units (such as stored procedures); custom Java with JDBC, DBOutputFormat, the FileSystem API, and Avro; Sqoop
PULL: custom Java with JDBC, DBInputFormat, the FileSystem API, and Avro; Sqoop; Oracle Loader for Hadoop; Oracle SQL Connector for HDFS

19 SELECT (PUSH)
- SQL's SELECT statement
- Use Java to spool and control output
- Use string concatenation to create delimited text HDFS files
- Use the Java Avro API to create serialized binary HDFS files

20 Java (PUSH / PULL)
To connect to the RDBMS:
- JDBC interacts with the RDBMS
- DBInputFormat: reading from a database
- DBOutputFormat: writing to a database
- Generates SQL; best for smaller amounts of data
- org.apache.hadoop.mapreduce.lib.db
To interact with HDFS files:
- FileSystem API
- Avro API (for binary files)

21 Sqoop (PUSH / PULL)
- Sqoop = "SQL to Hadoop"
- Command line
- Works with any JDBC-compliant RDBMS
- Works with any external system that supports bulk data transfer into Hadoop (HDFS, HBase, Hive)
- Strength: transfer of bulk data between Hadoop and RDBMS environments
- Read / Write / Update / Insert / Delete
- Stored procedures (warning: parallel processing)
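The two directions above map to Sqoop's import and export commands. A minimal sketch, assuming a reachable Oracle listener and existing tables (host, service name, credentials, and table names are made up):

```shell
# PULL: import an Oracle table into HDFS as delimited text files
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username steve -P \
  --table EMPLOYEES \
  --target-dir /data/employees

# PUSH: export HDFS files (e.g. MapReduce output) back into an existing Oracle table
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username steve -P \
  --table EMP_RESULTS \
  --export-dir /data/mr_output
```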

22 Sqoop (PUSH / PULL)
- Open source, written in Java
- Apache Top-Level Project (graduated from incubator level March 2012)
- Bundled with: Oracle Big Data Appliance; CDH (Cloudera Distribution including Apache Hadoop)
- Also available at the Apache Software Foundation
- Latest version of Sqoop2: (as of 4/15/14)
- Wiki: https://cwiki.apache.org/confluence/display/SQOOP

23 Sqoop: Incoming Formats (PUSH / PULL)
- Text: human-readable
- Binary: precision, compression; examples: SequenceFile (Java-specific), Avro
Note: Sqoop cannot currently load SequenceFile or Avro into Hive.
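The incoming format is chosen with a flag at import time. A hedged sketch of the same hypothetical import in each of the three formats (connection details and table name are assumptions):

```shell
# Text (the default): human-readable, delimited files
sqoop import --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username steve -P --table ORDERS --as-textfile

# SequenceFile: Java-specific binary container format
sqoop import --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username steve -P --table ORDERS --as-sequencefile

# Avro: binary serialization with an embedded schema
sqoop import --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username steve -P --table ORDERS --as-avrodatafile
```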

24 Sqoop: Purpose (PUSH / PULL)
- Interacts with structured data stores outside of HDFS
- Moves data from structured data stores into HBase
- Moves analytic results out of Hadoop to a structured data store

25 Sqoop: How It Works (PUSH / PULL)
- Interrogates the RDBMS data dictionary for the target schema
- Uses MapReduce to import data into Hadoop
- Parallel operation: configurable
- Fault tolerance: configurable
- Datatype mapping from Oracle SQL data types to Java data types (VARCHAR2 = String, INTEGER = Integer, etc.)
- Generates a Java class for the structured schema, with bean "get" methods plus read/write methods:
  public void readFields(ResultSet __dbResults) throws SQLException;
  public void write(PreparedStatement __dbStmt) throws SQLException;

26 Sqoop (PUSH / PULL)
$ sqoop help
usage: sqoop COMMAND [ARGS]
Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

27 Sqoop (PUSH / PULL)
$ sqoop list-databases --connect "jdbc:mysql://localhost" --username steve --password changeme
14/04/24 15:35:21 INFO manager.SqlManager: Using default fetchSize of 1000
netcents2
dt_site
exam_module
team_wiki
$

28 Sqoop Connectors (external) (PUSH / PULL)
- Generic JDBC Connector
- Connectors for major RDBMSs: Oracle, MySQL, SQL Server, DB2, PostgreSQL
- Third party: Teradata, Netezza, Couchbase
- Third-party connectors may support direct import into third-party data stores
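For databases whose connectors support it, Sqoop's direct mode bypasses JDBC in favor of the database's native bulk tools. A hypothetical MySQL example (host, schema, and table are assumptions; direct-mode support varies by connector):

```shell
# --direct uses mysqldump-style native transfer instead of JDBC for higher throughput
sqoop import --connect jdbc:mysql://dbhost/sales \
  --username steve -P --table ORDERS --direct
```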

29 Sqoop Documentation from Apache (PUSH / PULL)
User Guides: Installation; Upgrade; Five Minute Demo; Command Line Client
Developer Guides: Building Sqoop2; Development Environment Setup; Java Client API Guide; Developing Connector; REST API Guide

30 Oracle Loader for Hadoop (PULL)
- Essentially "SQL*Loader" for Hadoop
- Java MapReduce application
- Runs as a Hadoop utility with a configuration file
- Extensible (attention Java programmers)
- Command line or standalone process
- Online and offline modes
- Requires an existing target table (staging table!)
- Loads data only; cannot edit Hadoop data
- Pre-partitions data if necessary
- Can pre-sort data by primary key or user-specified columns before loading
- Leverages Hadoop's parallel processing
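Running it as a Hadoop utility with a configuration file looks roughly like the sketch below, assuming the loader is installed under $OLH_HOME and the job (target table, input paths, connection) is described in an XML file; the file names here are made up:

```shell
# Run Oracle Loader for Hadoop as a MapReduce job driven by a config file
hadoop jar $OLH_HOME/jlib/oraloader.jar oracle.hadoop.loader.OraLoader \
  -conf my_loader_conf.xml \
  -libjars $OLH_HOME/jlib/ojdbc6.jar
```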

31 Oracle Loader for Hadoop (PULL)
(diagram: mappers read from HDFS, presorting feeds a reducer, and the reducer loads the RDBMS via JDBC)

32 Oracle Loader for Hadoop vs. Sqoop
Oracle Loader advantages:
- Regular expressions (vs. as-is delimited file import)
- Faster throughput (vs. Sqoop JDBC)
- Data dictionary interrogation during load
- Support for runtime rollback (Sqoop generates INSERT statements with no rollback support)
Sqoop advantages:
- One system for bi-directional transfer support

33 Oracle SQL Connector for HDFS (PULL)
- Essentially the "external table" feature for Hadoop
- Text files only; no binary file support
- Treats HDFS as an external table
- Read only (no import / transfer)
- No indexing
- No INSERT, UPDATE, or DELETE
- As-is data import
- Full table scan

34 Oracle SQL Connector for HDFS: CREATE TABLE statement
CREATE TABLE CUSTOMER_LOGFILES
( LOGFILE_ID  NUMBER(20)
, LOG_NOTE    VARCHAR2(120)
, LOG_DATE    DATE
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY log_file_dir
  ACCESS PARAMETERS
  ( records delimited by newline
    badfile log_file_dir:'log_bad'
    logfile log_file_dir:'log_records'
    fields terminated by ','
    missing field values are null
    ( LOGFILE_ID
    , LOG_NOTE
    , LOG_DATE char date_format date mask "dd-mon-yyyy"
    )
  )
  LOCATION ('log_data_1.csv', 'log_data_2.csv')
)
PARALLEL
REJECT LIMIT UNLIMITED;
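In practice the connector also ships a command-line tool that can generate and publish the external table definition from a configuration file rather than hand-writing the DDL. A hedged sketch, assuming the connector is installed under $OSCH_HOME (install path and config file name are assumptions):

```shell
# Create the external table over HDFS files from an OSCH configuration file
hadoop jar $OSCH_HOME/jlib/orahdfs.jar \
  oracle.hadoop.exttab.ExternalTable \
  -conf my_osch_conf.xml \
  -createTable
```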

35 Quest Data Connector for Oracle and Hadoop (PULL)
- Oracle to CDH via Sqoop
- Freeware plug-in to CDH (Cloudera Distribution including Apache Hadoop)
Quest Data Transporter for Hive:
- Java command-line utility
- Saves Hive HQL output to an Oracle database

36 Concluding Remarks
There is no one best solution.
Apache Sqoop and Java APIs:
- Bi-directional; Read/Write/Insert/Update/Delete
- Limitations: JDBC and available connectors; requires knowledge of Java
Oracle Loader:
- Offers preprocessing and speed; unidirectional
Oracle SQL Connector:
- Integrates with existing SQL calls; limited to HDFS text files
Third-party tools (Cloudera, Hortonworks, etc.) are adding features to Hadoop that may reduce demand for moving data back to Oracle.

37 Thank you!
Steve O'Hearn, DatabaseTraining.com and Corbinian.com
