spark — cluster errors
hadoop errors
The NameNode stays stuck in safe mode.
The log shows:
Resources are low on NN. Please add or free up more resources then turn off safe mode manually
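As the message says, safe mode must be turned off manually once resources have been freed. A sketch of the usual commands, run on a host with the HDFS client configured (freeing disk space first, otherwise the NameNode may re-enter safe mode):

```
# Check the current safe mode state
hdfs dfsadmin -safemode get

# After freeing disk space, leave safe mode manually
hdfs dfsadmin -safemode leave
```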
Troubleshooting: check file system disk space usage; the /home partition was completely full.
Fix: change the hadoop.tmp.dir value in core-site.xml so HDFS data lands on a partition with free space.
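A sketch of the core-site.xml change; the path /data/hadoop/tmp is only an example and should point at a partition with ample free space (a restart of HDFS is needed for it to take effect):

```
<property>
  <name>hadoop.tmp.dir</name>
  <!-- Example path only: choose a partition with enough free space -->
  <value>/data/hadoop/tmp</value>
</property>
```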
df — report file system disk space usage
  -h, --human-readable    print sizes in human-readable format (e.g., 1K 234M 2G)
  -l, --local             limit listing to local file systems
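Combining the two options above gives a quick view of local disk usage; checking the mount point that holds hadoop.tmp.dir (here "/" is just an example path) shows whether that partition is the one that filled up:

```shell
# Human-readable usage for all local file systems
df -hl

# Usage for one specific mount point (example: "/";
# on the cluster, check the partition backing hadoop.tmp.dir)
df -h /
```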