log4j conflict encountered while integrating Storm, HBase, and Kafka
---》Error
SLF4J: Detected both log4j-over-slf4j.jar AND slf4j-log4j12.jar on the class path, preempting StackOverflowError.
SLF4J: See also http://www.slf4j.org/codes.html#log4jDelegationLoop for more details.
10059 [Thread-15-KafkaSpout] ERROR backtype.storm.util - Async loop died!
java.lang.NoClassDefFoundError: Could not initialize class org.apache.log4j.Log4jLoggerFactory
at org.apache.log4j.Logger.getLogger(Logger.java:39) ~[log4j-over-slf4j-1.6.6.jar:1.6.6]
at kafka.utils.Logging$class.logger(Logging.scala:24) ~[kafka_2.11-0.8.2.1.jar:na]
---》Cause
log4j-over-slf4j.jar and slf4j-log4j12.jar delegate to each other in a loop, so SLF4J aborts to prevent a StackOverflowError. The underlying reason is that Kafka and HBase log through log4j (pulling in slf4j-log4j12), while Storm routes log4j calls back to SLF4J via log4j-over-slf4j.
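A quick way to see which binding actually wins on a given classpath is to print the active SLF4J logger factory. This is my own diagnostic sketch, not from the original post; the class name Slf4jBindingCheck is made up.
import org.slf4j.LoggerFactory;

// Minimal diagnostic sketch: print which SLF4J binding is active on the classpath.
// With Storm's Logback binding in place you would expect
// ch.qos.logback.classic.LoggerContext; if slf4j-log4j12 is still present you
// would see org.slf4j.impl.Log4jLoggerFactory instead.
public class Slf4jBindingCheck {
    public static void main(String[] args) {
        System.out.println("Active SLF4J logger factory: "
                + LoggerFactory.getILoggerFactory().getClass().getName());
    }
}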
---》Solution
* Option 1: exclude the log4j-over-slf4j dependency from storm-core;
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>0.9.5</version>
    <scope>provided</scope>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>log4j-over-slf4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.8.2.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.0.1.1</version>
</dependency>
* Option 2: exclude the slf4j-log4j12 dependency from kafka and hbase-client;
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>0.9.5</version>
    <scope>provided</scope>
    <!--
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>log4j-over-slf4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
    </exclusions>
    -->
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.8.2.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.0.1.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!--
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-common</artifactId>
    <version>1.0.1.1</version>
</dependency>
-->
<dependency>
    <groupId>redis.clients</groupId>
    <artifactId>jedis</artifactId>
    <version>2.7.0</version>
</dependency>
* Option 2 is the better choice for cluster deployment: the log4j calls made by Kafka and HBase are redirected through Storm's log4j-over-slf4j into the SLF4J/Logback setup that the cluster already provides.
* Storm's own logging is built on SLF4J and Logback, so write your own Storm code against the slf4j API as well (see the snippet below, and the fuller bolt sketch after it); this avoids most logging jar conflicts.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyBolt {
    private static final Logger LOG = LoggerFactory.getLogger(MyBolt.class);
}
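For completeness, here is a sketch of how such a logger is typically used inside a bolt. This is my own illustrative example against the Storm 0.9.x backtype.storm API; the class name, field values, and log messages are made up, not from the original post.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Tuple;

// Illustrative bolt that logs only through the slf4j API;
// Storm's Logback binding handles the actual output.
public class PrintBolt extends BaseBasicBolt {
    private static final Logger LOG = LoggerFactory.getLogger(PrintBolt.class);

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        LOG.info("received tuple: {}", input.getValue(0));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // no output fields for this print-only bolt
    }
}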
---》References
http://www.tuicool.com/articles/aIzyqiy
http://stackoverflow.com/questions/21630860/storm-topology-not-submit
http://blog.csdn.net/gongmf/article/details/40379547