Apache Kafka Powered by GlobalSolutions

Kafka is primarily used to build real-time streaming data pipelines. It is an open-source system developed by the Apache Software Foundation and written in Java and Scala. Kafka runs on a cluster of one or more servers (called brokers), and the partitions of all topics are distributed across the cluster nodes. Additionally, partitions are replicated to multiple brokers. A Kafka cluster is not only highly scalable and fault-tolerant, but it also has a much higher throughput compared to other message brokers such as ActiveMQ and RabbitMQ.
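
For example, a topic's partition count and replication factor are set when the topic is created and can be inspected afterwards. The commands below are a minimal sketch, run from the /usr/local/kafka/bin folder on the broker; the topic name example-topic and the counts shown are illustrative, and the replication factor cannot exceed the number of brokers in your cluster.

  • ./kafka-topics.sh --create --topic example-topic --partitions 3 --replication-factor 1 --bootstrap-server localhost:9092
  • ./kafka-topics.sh --describe --topic example-topic --bootstrap-server localhost:9092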

Kafka combines three key capabilities for end-to-end event streaming:

  • To publish (write) and subscribe to (read) streams of events, including continuous import/export of your data from other systems.
  • To store streams of events durably and reliably for as long as you want.
  • To process streams of events as they occur or retrospectively.

 

Why Subscribe to our offering in AWS Marketplace
How to connect to the AMIs after subscribing from AWS Marketplace
Installation Location

  Category      Location
  Kafka         Running as a service
  Zookeeper     Running as a service
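
Both components run as system services on the AMI. A quick way to confirm they are up is to query their service status; this sketch assumes the systemd unit names are kafka and zookeeper, which may differ on your image.

  • sudo systemctl status kafka
  • sudo systemctl status zookeeper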

Monitoring your Kafka Service Using Grafana Dashboards

The first step in monitoring is to get access to the Grafana instance running on the server. To get the Grafana username and password, please use the following steps:

Monitoring your Kafka Cluster

Once you retrieve your username and password, open Grafana in your browser using the following URL - http://Ec2-instance-public-IP:3000 (replace Ec2-instance-public-IP with your instance's public IP address).
Provide the username and password retrieved in the previous step, and you will be able to see all the Kafka metrics.
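
If the dashboard does not load, a quick sanity check is to confirm that Grafana is responding on port 3000 and that the port is open in the instance's security group. The command below uses Grafana's standard health endpoint; the host placeholder is the same public IP as above.

  • curl http://Ec2-instance-public-IP:3000/api/health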
 


Steps to kick-start your work: create topics and consume messages

Steps to create a topic:

  1. Navigate to the /usr/local/kafka/bin folder.
  2. From here, run the command below:
    • ./kafka-topics.sh --create --topic globalsolution --bootstrap-server localhost:9092
  3. As you can see above, we have used the topic name "globalsolution".
  4. Please use a different topic name while trying these steps.
  5. Once the topic is created, the next step is to publish and consume messages from this topic; you can first verify that the topic exists using the commands shown after this list.
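
To confirm the topic was created, you can list the topics on the broker or describe your new topic. Both are standard kafka-topics.sh options, run from the same folder and against the same bootstrap server as above.

  • ./kafka-topics.sh --list --bootstrap-server localhost:9092
  • ./kafka-topics.sh --describe --topic globalsolution --bootstrap-server localhost:9092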
     

Steps to produce and consume messages using the topic created above

  1. Navigate to the /usr/local/kafka/bin folder.
  2. From here, run the command below:
    • ./kafka-console-producer.sh --topic globalsolution --bootstrap-server localhost:9092
  3. You will get a prompt; type a couple of messages, and each line you enter is published to the topic.
  4. Now, in this step, we will start a consumer that reads the messages back from the topic (a consumer-group variant is sketched after this list):
    • ./kafka-console-consumer.sh --topic globalsolution --from-beginning --bootstrap-server localhost:9092
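
If you want several consumers to share the work of reading a topic, the console consumer also accepts a --group option, and the kafka-consumer-groups.sh tool in the same bin folder can describe the group's offsets. The group name demo-group below is only an illustration.

  • ./kafka-console-consumer.sh --topic globalsolution --group demo-group --bootstrap-server localhost:9092
  • ./kafka-consumer-groups.sh --describe --group demo-group --bootstrap-server localhost:9092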

If you need further integration with alerting and other services, please get in touch with us at support@theglobalsolutions.net.
 


Support

Please contact us at support@theglobalsolutions.net for any questions on this offering in AWS Marketplace.

Copyright © 2016, The GlobalSolutions LLC or its affiliates. All rights reserved.