December 17, 2020

[Note] Message Queue S/W in Docker

 

IBM MQ

$ docker volume create mqdata
$ docker run --name ibmmq --hostname ibmmq \
    -e LICENSE=accept -e MQ_QMGR_NAME=QM1 -e MQ_APP_PASSWORD=passw0rd \
    -v mqdata:/mnt/mqm -p 1414:1414 -p 9443:9443 -d ibmcom/mq:latest

The above sets up the following developer defaults:

  • Queue manager QM1
  • Queue DEV.QUEUE.1
  • Channel: DEV.APP.SVRCONN
  • Listener: DEV.LISTENER.TCP on port 1414


Test 1:

$ docker exec -ti ibmmq bash
$ dspmqver      # version info
$ dspmq         # display running queue mgr

 
Test 2:
  1. Open https://localhost:9443/ibmmq/console/ 
  2. Log in with user ID "admin" and password "passw0rd".
  3. Click the "Manage QM1" box.
  4. Click DEV.QUEUE.1.
  5. Click the "Create" button; a panel slides out on the right. Type something in "Application Data" and click "Create". You'll see your message in the queue.
  6. Click the trash can icon to clear the queue.
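The same put/get round trip can be scripted from Python with the pymqi client (a sketch, not tested against this exact image: it assumes `pip install pymqi`, the container defaults above, and the dev image's "app" user whose password MQ_APP_PASSWORD sets; the helper and function names are mine):

```python
def mq_conn_info(host, port):
    """Format the connection string pymqi expects, e.g. 'localhost(1414)'."""
    return '%s(%s)' % (host, port)

def put_and_get(message=b'hello from python'):
    import pymqi  # third-party: pip install pymqi (needs the MQ client libraries)
    qmgr = pymqi.connect('QM1', 'DEV.APP.SVRCONN', mq_conn_info('localhost', 1414),
                         user='app', password='passw0rd')
    queue = pymqi.Queue(qmgr, 'DEV.QUEUE.1')
    queue.put(message)
    print(queue.get())  # echoes the message we just put
    queue.close()
    qmgr.disconnect()

# With the container running:
# put_and_get()
```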


RabbitMQ

$ docker run -d --hostname rabbitmq --name rabbitmq \
    -e RABBITMQ_DEFAULT_USER=admin -e RABBITMQ_DEFAULT_PASS=passw0rd \
    -p 15672:15672 -p 5672:5672 rabbitmq:3-management

Admin URL: http://localhost:15672
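A quick smoke test from Python with pika (a sketch, assuming `pip install pika` and the credentials set above; the queue name and function names are mine):

```python
def amqp_url(user, password, host, port=5672):
    """AMQP URL in the form pika.URLParameters accepts (default vhost)."""
    return 'amqp://%s:%s@%s:%d' % (user, password, host, port)

def publish_and_get(queue='hello', body=b'hello from python'):
    import pika  # third-party: pip install pika
    conn = pika.BlockingConnection(
        pika.URLParameters(amqp_url('admin', 'passw0rd', 'localhost')))
    channel = conn.channel()
    channel.queue_declare(queue=queue)
    channel.basic_publish(exchange='', routing_key=queue, body=body)
    method, header, received = channel.basic_get(queue=queue, auto_ack=True)
    print(received)  # the body we just published
    conn.close()

# With the container running:
# publish_and_get()
```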


Redis

$ docker run --name redis --hostname redis -p 6379:6379 -d redis

Test:

$ docker exec -it redis bash
$ redis-cli
> ping
> set name pnap
> get name
> incr counter
> incr counter
> get counter
> exit
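The same session from Python with redis-py (a sketch, assuming `pip install redis` and the container above; note that redis-py returns replies as bytes, hence the decoding helper, which is mine):

```python
def decode_reply(raw):
    """redis-py returns bytes; decode to str (None stays None)."""
    return raw.decode('utf-8') if raw is not None else None

def demo():
    import redis  # third-party: pip install redis
    r = redis.Redis(host='localhost', port=6379)
    print(r.ping())                        # True if the server is up
    r.set('name', 'pnap')
    print(decode_reply(r.get('name')))     # 'pnap'
    r.incr('counter')
    r.incr('counter')
    print(decode_reply(r.get('counter')))  # '2' on a fresh instance

# With the container running:
# demo()
```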


 

 

December 14, 2020

[Note] Kafka in Docker

Needs

A simple Kafka-in-Docker setup for development: a single Kafka broker, with Zookeeper also exposed so it can be used by things other than Kafka.

I tried various methods following several blogs/tutorials, but none of them worked or suited my needs.

 

Prerequisite

Docker is installed and running, and docker-compose is installed.

My Environment

Ubuntu 20.04


Steps
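The compose file referenced in step 1 did not survive in this note. A minimal single-broker sketch using the Confluent images (the image names, ports, and advertised listener are my assumptions, not the original file) would look like:

```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    ports:
      - "2181:2181"          # exposed so Zookeeper is usable outside Kafka too
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1   # single broker, so 1
```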



  1. Save above as docker-compose.yml
  2. $ sudo docker-compose up --build -d

Test

  1. Download Kafka to get the command-line tools (they're under bin/ in the tarball)
    https://kafka.apache.org/downloads 
  2. Create a topic
    $ ./kafka-topics.sh --zookeeper localhost:2181 --create --topic test --partitions 3 --replication-factor 1
  3. List topics
    $ ./kafka-topics.sh --zookeeper localhost:2181 --list 
  4. Describe the topic
    $ ./kafka-topics.sh --zookeeper localhost:2181 --describe --topic test
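The same topic admin can be done from Python with kafka-python's admin client, which talks to the broker directly instead of Zookeeper (a sketch; the function name is mine):

```python
def create_and_list(topic='test'):
    # Mirrors the kafka-topics.sh commands above, but over the broker at 9092
    from kafka.admin import KafkaAdminClient, NewTopic  # pip install kafka-python
    admin = KafkaAdminClient(bootstrap_servers='localhost:9092')
    admin.create_topics([NewTopic(name=topic, num_partitions=3, replication_factor=1)])
    print(admin.list_topics())  # should include the new topic
    admin.close()

# With the broker running:
# create_and_list()
```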

Test Python code

$ pip install kafka-python


producer.py

from json import dumps
from kafka import KafkaProducer

# Serialize each value to JSON, then to UTF-8 bytes
producer = KafkaProducer(bootstrap_servers=['localhost:9092'],
                         value_serializer=lambda x: dumps(x).encode('utf-8'))

for e in range(10):
    data = {'number': e}
    producer.send('test', value=data)

producer.flush()  # make sure everything is actually sent before exiting

 

consumer.py

from json import loads
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'test',
    bootstrap_servers=['localhost:9092'],
    auto_offset_reset='earliest',
    enable_auto_commit=True,
    group_id='my-group',
    value_deserializer=lambda x: loads(x.decode('utf-8')))

for message in consumer:
    print(message.value)


Run the producer first, and then the consumer.
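The value_serializer/value_deserializer pair in the two scripts is just a JSON round trip over UTF-8 bytes, which can be checked without a broker:

```python
from json import dumps, loads

def serialize(value):
    """What the producer's value_serializer does to each record value."""
    return dumps(value).encode('utf-8')

def deserialize(raw):
    """What the consumer's value_deserializer does on the way back."""
    return loads(raw.decode('utf-8'))

wire = serialize({'number': 3})
print(wire)               # b'{"number": 3}'
print(deserialize(wire))  # {'number': 3}
```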


