org.apache.flink.core.fs.UnsupportedFileSystemSchemeException
Contents
1. Error
2. Symptom
3. Cause
4. Solution
1. Error
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:447)
at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:359)
at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
at org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:116)
… 10 more
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:58)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:443)
… 13 more
2. Symptom
The Hadoop-related JAR files cannot be found on Flink's classpath.
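You can confirm this by listing the lib directory of the Flink installation; no shaded Hadoop JAR ships with the Flink 1.9.0 binary by default (the install path below is an assumption, substitute your own):

# Typically prints nothing on a stock Flink 1.9.0 download
ls /opt/flink-1.9.0/lib | grep -i hadoop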
3. Cause
I am using Hadoop 2.8.5 and Flink 1.9.0.
Flink's checkpoints need to be written to HDFS, but starting with Flink 1.8 the Hadoop dependencies were removed from the Flink binary distribution, so the corresponding JARs cannot be found and this error is raised.
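For reference, a checkpoint setup like the following is what forces Flink to resolve the 'hdfs' scheme (a minimal sketch; the namenode host, port, and checkpoint path are hypothetical):

import org.apache.flink.runtime.state.filesystem.FsStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class HdfsCheckpointExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Take a checkpoint every 60 seconds
        env.enableCheckpointing(60_000);

        // Writing checkpoints to an hdfs:// path requires a Hadoop
        // file system implementation on the classpath; without one,
        // Flink throws UnsupportedFileSystemSchemeException
        env.setStateBackend(new FsStateBackend("hdfs://namenode:8020/flink/checkpoints"));

        // ... define sources, transformations, and sinks, then:
        // env.execute("checkpointed job");
    }
}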
4. Solution
Manually place the following JAR into Flink's lib directory:
flink-shaded-hadoop2-uber-2.8.3-1.8.2.jar
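The steps look roughly like this, assuming Flink is installed under /opt/flink-1.9.0 and the JAR has already been downloaded from the Apache Flink downloads page (the install path is an assumption; pick the shaded-Hadoop JAR matching your Hadoop and Flink versions):

# Copy the shaded Hadoop uber JAR into Flink's lib directory
cp flink-shaded-hadoop2-uber-2.8.3-1.8.2.jar /opt/flink-1.9.0/lib/

# Restart the cluster so the new JAR is picked up
/opt/flink-1.9.0/bin/stop-cluster.sh
/opt/flink-1.9.0/bin/start-cluster.sh

After the restart, resubmit the job and the hdfs scheme should resolve.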