Importing data from MySQL into HDFS using Sqoop

Problem description:

I am using Hadoop-1.2.1 and Sqoop-1.4.6. I used Sqoop to import the table test from the database meshtree into HDFS with this command:

`sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test` 

However, it shows this error:

17/06/17 18:15:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/17 18:15:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/17 18:15:21 INFO tool.CodeGenTool: Beginning code generation
17/06/17 18:15:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
17/06/17 18:15:22 INFO orm.CompilationManager: HADOOP_HOME is /home/student/Installations/hadoop-1.2.1/libexec/..
Note: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/17 18:15:24 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java to /home/student/Installations/hadoop-1.2.1/./test.java
org.apache.commons.io.FileExistsException: Destination '/home/student/Installations/hadoop-1.2.1/./test.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:367)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
17/06/17 18:15:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.jar
17/06/17 18:15:24 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/17 18:15:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/17 18:15:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/17 18:15:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/17 18:15:24 INFO mapreduce.ImportJobBase: Beginning import of test
17/06/17 18:15:27 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/home/student/Installations/hadoop-1.2.1/data/mapred/staging/student/.staging/job_201706171814_0001
17/06/17 18:15:27 ERROR security.UserGroupInformation: PriviledgedActionException as:student cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
17/06/17 18:15:27 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)

Is there any way to figure out this problem?

  • You are importing without providing a target directory in HDFS. When no target directory is supplied, Sqoop just runs the import and creates a directory in HDFS named after your MySQL table.

So your command

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test

creates a directory named test (the table name) in HDFS, and the job fails because that directory already exists.

  • Just add a target directory, as in the command below:

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1

Hope this works fine; for details, refer to sqoop import and the related Sqoop documentation. A sketch of both ways to clear the conflict follows.
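A minimal sketch, assuming the earlier import landed in your HDFS home directory (which is where Sqoop writes when no --target-dir or --warehouse-dir is given) and that test1 is just an example directory name:

# Remove the stale output directory from the earlier attempt (Hadoop 1.x shell syntax; newer releases use hadoop fs -rm -r)
hadoop fs -rmr test

# Or import into a separate directory; Sqoop 1.4.6 also has a --delete-target-dir option to drop it automatically before the import
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1 --delete-target-dir

You may also want to delete the leftover generated file /home/student/Installations/hadoop-1.2.1/./test.java, which caused the rename error earlier in the log.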

If you plan to use Sqoop with a distributed Hadoop cluster, it is important that you do not use the URL localhost. The connect string you supply will be used on TaskTracker nodes throughout your MapReduce cluster; if you specify the literal name localhost, each node will connect to a different database (or, more likely, to no database at all). Instead, use the full hostname or IP address of the database host, which can be seen by all your remote nodes.

See the Connecting to a Database Server section of the Sqoop documentation for more information.
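For example, a connect string that names the database host explicitly might look like the following (db01.example.com and port 3306 are placeholders for your actual MySQL host and port; -P prompts for the password instead of putting it on the command line):

sqoop import --connect jdbc:mysql://db01.example.com:3306/meshtree --username user -P --table test --target-dir test1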


Thank you for the explanation. I read the Sqoop documentation, it was very helpful, and the earlier error is resolved. But another error comes up. Can you help me solve it? Error: 'Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:65)' – Christine


You are using Hadoop 1.x, but it looks like your Sqoop was compiled against Hadoop 2.x. Please compile your Sqoop against Hadoop 1.x, or download an older Sqoop release that is compatible with Hadoop 1.x. –


Thank you for your reply. I tried downloading a lower version of Sqoop (sqoop-1.4.1), but it still does not work for me. It shows this error: 'ERROR security.UserGroupInformation: PriviledgedActionException: Connection refused ERROR tool.ImportTool: Encountered IOException running import job: Connection refused' – Christine

You do not have the required privileges, so ask your MySQL DBA to grant them to you. Or you can do it yourself if you have admin access to MySQL.

grant all privileges on databasename.* to 'username'@'%' identified by 'password'; 

* - for all tables; % - from any host

The syntax above grants privileges to a user on the MySQL server. In your case it would be:

grant all privileges on meshtree.test to 'root'@'localhost' identified by 'yourpassword'; 
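To check whether the grant took effect, you can list the user's recorded privileges from the shell (a sketch with placeholder credentials; run it against your MySQL server):

# Show the privileges MySQL has stored for the user
mysql -u root -p -e "SHOW GRANTS FOR 'root'@'localhost';"

# Reload the grant tables if the new privileges do not take effect immediately
mysql -u root -p -e "FLUSH PRIVILEGES;"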

Thank you for the explanation. When I try to import the data again with the sqoop import command, it shows another error: 'Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:65)'. Can you help me solve this error? – Christine


Did granting the privileges resolve the access denied error? – TKHN


@Christine The IncompatibleClassChangeError means your Sqoop version does not match your Hadoop version. Use a Sqoop version that matches the Hadoop version you are using. – TKHN
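A quick way to check for such a mismatch is to compare the two versions and, if they disagree, pick a Sqoop binary built for your Hadoop line (the tarball name in the comment below is an assumption based on the usual Apache naming scheme, not a confirmed download):

# Print the Hadoop and Sqoop versions currently on the PATH
hadoop version
sqoop version

# For Hadoop 1.x, use a Sqoop build compiled against Hadoop 1.x, e.g. a tarball named like sqoop-1.4.6.bin__hadoop-1.0.0.tar.gz rather than the hadoop-2.x build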