Joe Keen cb2ec23cee Fork 0.9.5 kafka-python and require pykafka
To let other OpenStack projects move forward with new versions of kafka-python,
we're forking kafka-python and embedding it in monasca-common.  This allows us
to migrate to the new async interfaces provided by more recent kafka clients
over time and not block other projects.

Requiring pykafka allows us to achieve ~4x more throughput once we write to
its async interfaces.

Change-Id: Ifb6ab67ce1335a5ec4ed7dd8b0027dc9d46a6dda
Depends-On: I26f9c588f2818059ab6ba24f9fad8e213798a39c

For 0.8, we have correlation ids, so we can potentially interleave requests/responses.
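
A minimal sketch of what that interleaving bookkeeping could look like; InFlightRequests and its methods are illustrative names, not part of kafka-python:

import itertools

class InFlightRequests(object):
    """Match responses to requests by correlation id (illustrative only)."""

    def __init__(self):
        self._ids = itertools.count(1)
        self._pending = {}  # correlation id -> outstanding request

    def register(self, request):
        # Tag the outgoing request with the next correlation id.
        correlation_id = next(self._ids)
        self._pending[correlation_id] = request
        return correlation_id

    def match(self, correlation_id, response):
        # Responses can arrive in any order; the correlation id tells us
        # which outstanding request this response answers.
        request = self._pending.pop(correlation_id)
        return request, response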

There are a few levels of abstraction (a rough code sketch follows this list):

  • Protocol support: encode/decode the requests/responses
  • Socket support: send/receive messages
  • API support: higher level APIs such as get_topic_metadata
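
A rough illustration of those three levels; the function names and framing here are assumptions for the sketch, not the real kafka-python code:

import struct

# Protocol support: encode/decode the requests/responses.
def encode_request(correlation_id, payload):
    # 4-byte big-endian size, then 4-byte correlation id, then the payload.
    body = struct.pack('>i', correlation_id) + payload
    return struct.pack('>i', len(body)) + body

# Socket support: send/receive messages.
def send_frame(sock, frame):
    sock.sendall(frame)

def recv_frame(sock):
    # A real client would loop until the full frame has arrived.
    size = struct.unpack('>i', sock.recv(4))[0]
    return sock.recv(size)

# API support: higher level APIs such as get_topic_metadata.
def get_topic_metadata(sock, topic):
    send_frame(sock, encode_request(1, topic.encode('utf-8')))
    return recv_frame(sock)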

Methods of producing

  • Round robin (each message to the next partition)
  • All-to-one (each message to one partition)
  • All-to-all? (each message to every partition)
  • Partitioned (run each message through a partitioning function; see the sketch after this list)
      • HashPartitioned
      • FunctionPartition
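
A sketch of what those partitioning strategies could look like as pluggable objects; the class names here are hypothetical:

import itertools

class RoundRobinPartitioner(object):
    # Round robin: each message goes to the next partition in turn.
    def __init__(self, partitions):
        self._cycle = itertools.cycle(partitions)

    def partition(self, key):
        return next(self._cycle)

class HashPartitioner(object):
    # HashPartitioned: hash the key so the same key always lands on
    # the same partition.
    def __init__(self, partitions):
        self._partitions = list(partitions)

    def partition(self, key):
        return self._partitions[hash(key) % len(self._partitions)]

class FunctionPartitioner(object):
    # FunctionPartition: delegate the choice to a user-supplied function.
    def __init__(self, partitions, fn):
        self._partitions = list(partitions)
        self._fn = fn

    def partition(self, key):
        return self._fn(key, self._partitions)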

Possible API

client = KafkaClient("localhost:9092")

producer = KafkaProducer(client, "topic")
producer.send_string("hello")

consumer = KafkaConsumer(client, "group", "topic")
consumer.seek(0, 0) # seek to beginning (lowest offset)
consumer.commit() # commit it
for msg in consumer.iter_messages():
    print(msg)