It hangs here and does not move on:

```
huangjingying@spark059:~$ flume-ng agent -n a1 -c $FLUME_HOME/conf/ -f /home/huangjingying/dw059/flume/file_to_kafka_059.conf -Dflume.root.logger=info,console
Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/usr/local/hbase/bin/hbase) for HBASE access
Info: Including Hive libraries found via (/usr/local/hive) for Hive access
+ exec /usr/local/java/bin/java -Xmx20m -Dflume.root.logger=info,console -cp '/opt/flume/apache-flume-1.11.0-bin/conf:/opt/flume/apache-flume-1.11.0-bin/lib/*:/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/share/hadoop/yarn:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hbase/conf:/usr/local/java/lib/tools.jar:/usr/local/hbase:/usr/local/hbase/lib/shaded-clients/hbase-shaded-client-2.4.15.jar:/usr/local/hbase/lib/client-facing-thirdparty/audience-annotations-0.5.0.jar:/usr/local/hbase/lib/client-facing-thirdparty/commons-logging-1.2.jar:/usr/local/hbase/lib/client-facing-thirdparty/htrace-core4-4.2.0-incubating.jar:/usr/local/hbase/lib/client-facing-thirdparty/reload4j-1.2.22.jar:/usr/local/hbase/lib/client-facing-thirdparty/slf4j-api-1.7.33.jar:/usr/local/hbase/conf:/usr/local/hive/lib/*' -Djava.library.path=:/usr/local/hadoop/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib org.apache.flume.node.Application -n a1 -f /home/huangjingying/dw059/flume/file_to_kafka_059.conf
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume/apache-flume-1.11.0-bin/lib/log4j-slf4j-impl-2.18.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-reload4j-
```
Date: 2025-03-31 20:10:56
### Resolving SLF4J multiple-binding conflicts when starting Flume
When Flume runs alongside the Hadoop and HBase libraries, several logging implementations can end up on the classpath at once, producing the error `SLF4J: Class path contains multiple SLF4J bindings`. The approaches below can resolve this.
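The conflict can be confirmed by listing the binding jars that Flume and Hadoop each contribute. A minimal sketch, using a scratch directory to stand in for the real installation paths (jar names follow the log above; the reload4j version is illustrative, and in a real setup you would point `find` at `flume/lib` and `hadoop/share/hadoop/common/lib` directly):

```shell
# Recreate the duplicate-binding layout in a scratch directory, then show how
# one `find` over both library trees reveals the competing SLF4J bindings
# that end up on the same classpath.
demo=$(mktemp -d)
mkdir -p "$demo/flume/lib" "$demo/hadoop/share/hadoop/common/lib"
touch "$demo/flume/lib/log4j-slf4j-impl-2.18.0.jar"
touch "$demo/hadoop/share/hadoop/common/lib/slf4j-reload4j-1.7.36.jar"   # version illustrative
find "$demo" -name '*slf4j*.jar' | sort
```

Any run that prints more than one binding jar confirms the warning's cause.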
#### Method 1: Exclude the duplicate dependency with the Maven Shade Plugin
The Maven Shade Plugin is an effective tool for repackaging a project while excluding unwanted dependencies. For Flume, it can be configured to drop the redundant SLF4J binding library[^2].
A concrete Maven configuration looks like this:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <!-- Strip signature files so the merged jar is not rejected -->
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <relocations>
          <relocation>
            <pattern>org.slf4j</pattern>
            <shadedPattern>com.shaded.org.slf4j</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```
This configuration relocates the `org.slf4j` package to a custom path (here `com.shaded.org.slf4j`), so it no longer clashes with other versions of the logging implementation on the classpath.
---
#### Method 2: Adjust the class-loading order manually
If you cannot fully control the build, you can instead force the preferred logging implementation at runtime by arranging the Java classpath: keep the desired SLF4J implementation in its own folder and have it loaded ahead of the other dependencies.
Assuming the goal is to make `slf4j-log4j12` the active binding:
1. Put the required JAR files in a standalone directory (e.g. `lib_slf4j/`);
2. Modify the launch command to declare the classpath explicitly:
```bash
java -cp /path/to/lib_slf4j/*:/path/to/flume/lib/* org.apache.flume.node.Application ...
```
This approach, much like the usual handling of Jersey dependency conflicts, effectively isolates the interference between versions.
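The "first entry wins" behavior this relies on can be sketched in plain shell: like the JVM searching `-cp` entries left to right, the first directory that contains the requested file supplies it. The directory names and the `first_match` helper below are illustrative, not part of the original post:

```shell
# Simulate left-to-right classpath search: the first directory holding the
# requested file wins, mirroring how the JVM resolves duplicate classes.
demo=$(mktemp -d)
mkdir -p "$demo/lib_slf4j" "$demo/flume_lib"
echo "preferred binding" > "$demo/lib_slf4j/StaticLoggerBinder.class"
echo "bundled binding"   > "$demo/flume_lib/StaticLoggerBinder.class"
first_match() {  # first_match <name> <dir>... : print the first hit
  local name=$1; shift
  for d in "$@"; do
    [ -e "$d/$name" ] && { cat "$d/$name"; return; }
  done
}
# With lib_slf4j listed first, its copy is the one found.
first_match StaticLoggerBinder.class "$demo/lib_slf4j" "$demo/flume_lib"
```

Swapping the two directory arguments makes the bundled copy win instead, which is exactly why the override directory must come first in `-cp`.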
---
#### Method 3: Clear the `.m2/repository` cache
Stale or corrupted dependencies in the local cache can occasionally cause unexpected behavior. Try purging the relevant entries first, then reinstalling the necessary components:
```bash
mvn dependency:purge-local-repository
mvn clean install
```
Also make sure that offline builds are not missing updated resources because of network issues[^1].
---
#### Summary
Overall, the **Shade Plugin** approach is recommended first: it not only fixes the immediate problem but also improves the project's compatibility and stability. For temporary testing scenarios, adjusting the class-loading strategy is a workable alternative.