
Implementing a Kafka Message Producer and Consumer in Java

1. Overview

I won't rehash how Kafka works here; beyond the official documentation there is plenty of material online, so let's go straight to the code. The basic requirement is simple: a producer class uses a for loop to generate messages, and a consumer class consumes them. My working environment is:

centos-6.5
kafka_2.10-0.10
scala-2.10.4

2. Code

Producer:

package com.unisk.bigdata.kafka;

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class MyProducer {

  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "master:9092");
    // wait for the full set of in-sync replicas to acknowledge each record
    props.put("acks", "all");
    props.put("retries", 0);
    // batching settings: flush once 16 KB accumulate or after 1 ms
    props.put("batch.size", 16384);
    props.put("linger.ms", 1);
    props.put("buffer.memory", 33554432);
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    Producer<String, String> producer = null;
    try {
      producer = new KafkaProducer<>(props);
      for (int i = 0; i < 100; i++) {
        String msg = "Message " + i;
        producer.send(new ProducerRecord<String, String>("HelloKafka", msg));
        System.out.println("Sent:" + msg);
      }
    } catch (Exception e) {
      e.printStackTrace();
    } finally {
      if (producer != null) {  // guard against the constructor having thrown
        producer.close();
      }
    }
  }
}

Consumer:

package com.unisk.bigdata.kafka;

import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class MyConsumer {

  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "master:9092");
    props.put("group.id", "group-1");
    props.put("enable.auto.commit", "true");
    props.put("auto.commit.interval.ms", "1000");
    props.put("auto.offset.reset", "earliest");
    props.put("session.timeout.ms", "30000");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(props);
    kafkaConsumer.subscribe(Arrays.asList("HelloKafka"));
    while (true) {
      ConsumerRecords<String, String> records = kafkaConsumer.poll(100);
      for (ConsumerRecord<String, String> record : records) {
        System.out.printf("offset = %d, value = %s", record.offset(), record.value());
        System.out.println();
      }
    }

  }

}
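With enable.auto.commit set to true, offsets are committed on a timer, so a crash between a poll() and the next auto-commit can cause records to be re-delivered. If you would rather tie commits to successful processing, a minimal variation of the consumer above (same broker, topic, and group assumed; the class name MyManualCommitConsumer is just illustrative) is to disable auto-commit and call commitSync() after each batch:

import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class MyManualCommitConsumer {

  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "master:9092");
    props.put("group.id", "group-1");
    props.put("enable.auto.commit", "false");  // we commit explicitly below
    props.put("auto.offset.reset", "earliest");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    consumer.subscribe(Arrays.asList("HelloKafka"));
    try {
      while (true) {
        ConsumerRecords<String, String> records = consumer.poll(100);
        for (ConsumerRecord<String, String> record : records) {
          System.out.printf("offset = %d, value = %s%n", record.offset(), record.value());
        }
        // commit only after the whole batch has been processed
        if (!records.isEmpty()) {
          consumer.commitSync();
        }
      }
    } finally {
      consumer.close();
    }
  }
}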

3. Results

After running the producer:
Sent:Message 0
Sent:Message 1
Sent:Message 2
Sent:Message 3
Sent:Message 4
Sent:Message 5
Sent:Message 6
Sent:Message 7
……

After running the consumer:
offset = 67, value = Message 2
offset = 68, value = Message 5
offset = 69, value = Message 8
offset = 70, value = Message 11
offset = 71, value = Message 14
offset = 72, value = Message 17
offset = 73, value = Message 20
offset = 74, value = Message 23
offset = 75, value = Message 26
offset = 76, value = Message 29
……
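Don't be surprised that the offsets start at 67 rather than 0, or that only every third message appears in this excerpt. Offsets are assigned per partition, so a non-zero starting offset simply means the partition already held messages from earlier test runs, and since the records are sent without a key they are spread across the topic's partitions, so any single partition (and hence this excerpt) only sees a subset of the 100 messages.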