Keywords: Kafka - AWS - Technical issue - Other
bnsupport ID: 1399006e-00e5-f8a4-18a8-13ef4b410f6b
Description:
I am using the following code, with the out-of-the-box Bitnami settings for the AWS AMI, to send messages remotely from a different EC2 instance:
from kafka import KafkaProducer

topic_for_consuming_review_signal = 'topic_for_consuming_review_signal'
producer = KafkaProducer(bootstrap_servers='172.31.0.209:9092',
                         api_version=(0, 9),
                         security_protocol='SASL_PLAINTEXT',
                         sasl_mechanism='PLAIN')
print('producer created')

reviews = ['good', 'great', 'disgusting', 'bad', 'poor']
for msg in reviews:
    # producer.send(topic_for_consuming_review_signal, msg)
    print('started sending signal - ' + msg)
    producer.send('test', bytes(msg, 'utf8'))
    print('signal sent successfully - ' + msg)
producer.flush()
Each time, every send fails with:

kafka.errors.KafkaTimeoutError: KafkaTimeoutError: Failed to update metadata after 60.0 secs.
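For reference, SASL_PLAINTEXT with the PLAIN mechanism normally also needs a username and password passed to the client; the snippet above omits them. A minimal sketch of the producer configuration with those fields, assuming kafka-python (the 'user' and '<password>' values below are placeholders, not verified Bitnami defaults):

```python
# Sketch of a kafka-python producer config for SASL/PLAIN.
# sasl_plain_username / sasl_plain_password are placeholders -
# the real credentials come from the Bitnami instance.
sasl_config = {
    "bootstrap_servers": "172.31.0.209:9092",
    "security_protocol": "SASL_PLAINTEXT",
    "sasl_mechanism": "PLAIN",
    "sasl_plain_username": "user",        # placeholder
    "sasl_plain_password": "<password>",  # placeholder
    "api_version": (0, 9),
}

# The producer would then be built as:
# from kafka import KafkaProducer
# producer = KafkaProducer(**sasl_config)
```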
A few important points:
- both EC2 instances are on the same private subnet
- the Kafka instance's security group has port 9092 open
- I am able to produce and consume messages using the CLI tools
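For context, the working CLI check mentioned above would typically pass SASL credentials through a client properties file; a sketch of that configuration (the username/password and the /opt/bitnami path are assumptions, placeholders for the instance's actual credentials):

```shell
# client.properties - SASL/PLAIN client config (credentials are placeholders)
cat > client.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="user" \
  password="<password>";
EOF

# Produce to the same topic the Python code targets
/opt/bitnami/kafka/bin/kafka-console-producer.sh \
  --bootstrap-server 172.31.0.209:9092 \
  --topic test \
  --producer.config client.properties
```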