# SQOOP

Sqoop integrates SQL databases and Hadoop. It handles big data transfers and takes writing MapReduce out of the equation: it manages the importing and exporting of the data for you.

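Before running an import it can be worth checking that Sqoop can reach the database at all. A minimal sketch, assuming a local MySQL instance with a `movielens` database and a `root` user (adjust the connection details to your setup):
```
# List the tables Sqoop can see in the movielens database (-P prompts for the password)
sqoop list-tables --connect jdbc:mysql://localhost/movielens --driver com.mysql.jdbc.Driver --username root -P
```
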
### Import data from (My)SQL to HDFS
```
sqoop import --connect jdbc:mysql://localhost/movielens --driver com.mysql.jdbc.Driver --table table_name
```

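To check that the rows actually landed in HDFS, you can list the output directory. A minimal sketch, assuming Sqoop's default behaviour of writing to a directory named after the imported table under your HDFS home:
```
# List the part files Sqoop produced for the imported table
hadoop fs -ls table_name
# Peek at the first few imported rows (the part file name may differ on your run)
hadoop fs -cat table_name/part-m-00000 | head
```
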
### Import data from (My)SQL to Hive
```
sqoop import --connect jdbc:mysql://localhost/movielens --driver com.mysql.jdbc.Driver --table table_name --hive-import
```

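Once the Hive import finishes, you can confirm the table exists and query it from the Hive CLI. A quick sketch, assuming the default Hive database and the table name passed to `--table`:
```
# Check the table was created, then sample a few rows
hive -e "SHOW TABLES;"
hive -e "SELECT * FROM table_name LIMIT 10;"
```
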
### Export data from Hive to SQL
* First you need to create a table in MySQL to receive the data.
* Connect to MySQL on the maria_dev server using
```
mysql -u root -p
```

```
CREATE TABLE table_name (column_name type_name);
```

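As a concrete sketch of that step, matching the `exported_movies` table used in the export command below (the `filename` database name follows the placeholder in that command, and the column layout here is an assumption; adjust it to match the Hive table being exported):
```
CREATE DATABASE IF NOT EXISTS filename;
USE filename;
-- Assumed schema: columns must line up with the Hive data being exported
CREATE TABLE exported_movies (movie_id INT, title VARCHAR(255), release_date DATE);
```
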
* Exit MySQL and, back in the maria_dev shell, export the data to SQL:
```
sqoop export --connect jdbc:mysql://localhost/filename -m 1 --driver com.mysql.jdbc.Driver --table exported_movies --export-dir /apps/hive/warehouse/data_file --input-fields-terminated-by '\0001'
```
* `--connect` connects to the 'filename' database
* `-m 1` runs a single mapper, since we are running locally
* `--driver` specifies the JDBC driver
* `--table` puts the data into the `exported_movies` table
* `--export-dir` takes the data from `data_file` in the Hive warehouse
* `--input-fields-terminated-by '\0001'` says how the data fields are delimited (Hive's default field delimiter); a quick verification query follows below
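
To confirm the export worked, a quick row count back in MySQL is usually enough. A minimal check, assuming the `filename` database and `exported_movies` table from above:
```
# Count the exported rows (prompts for the MySQL root password)
mysql -u root -p -e "SELECT COUNT(*) FROM filename.exported_movies;"
```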