Docker: unable to send data from a Logstash container to a Kafka container
Problem description:
I have two Docker containers: one running Logstash, the other running Zookeeper and Kafka. I am trying to send data from Logstash to Kafka, but the data never seems to arrive in my Kafka topic.
I can log into the Kafka container, produce a message to my topic from the terminal, and then consume it there as well.
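The commands I used for that check looked roughly like this (the script paths and the old Zookeeper-based console consumer flags are assumptions for a Kafka 0.9-era install):

# inside the Kafka container: produce a test message to the topic
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic MyTopicName

# read it back with the old-style console consumer (via Zookeeper)
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic MyTopicName --from-beginning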
The kafka output plugin configuration I am using:
output {
  kafka {
    topic_id => "MyTopicName"
    broker_list => "kafkaIPAddress:9092"
  }
}
I got the IP address by running docker inspect kafka2.
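For example, on the default bridge network the address can be extracted directly with a Go-template filter:

docker inspect -f '{{ .NetworkSettings.IPAddress }}' kafka2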
When I run
./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf
I get this error:
Settings: Default pipeline workers: 4
Unknown setting 'broker_list' for kafka {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
I checked the config file by running the command below, which returned OK:
./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf
Configuration OK
Has anyone run into this before? Could it be that I haven't opened the port on the Kafka container? If so, how can I do that while keeping Kafka running?
Answer:
The error is here: broker_list => "kafkaIPAddress:9092"
Try bootstrap_servers => "kafkaIPAddress:9092" instead; the setting was renamed in newer versions of the kafka output plugin.
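With that change, the output block from the question becomes (same topic and address as above; option names as of the Logstash 2.3-era kafka output plugin):

output {
  kafka {
    topic_id => "MyTopicName"
    bootstrap_servers => "kafkaIPAddress:9092"
  }
}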
If the containers are on separate machines, map Kafka to the host on port 9092 and use host_address:port; if they are on the same host, use the internal Docker IP:port. See the sketch below for the cross-host case.
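A minimal docker run sketch for the cross-host mapping (the image name is a placeholder; your Kafka image and remaining flags will differ):

# publish broker port 9092 on the host so other machines can reach it
docker run -d -p 9092:9092 --name kafka2 your-kafka-image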
Works 100%, thanks – Gman