
91. Agent Kafka Adapter

Learning Objectives:

This unit describes:

  • What the Agent Kafka Adapter is.
  • Event-based communication.
  • How to use the adapter.
  • The advantages of using it.


This tutorial explains the Agent Kafka adapter. Kafka is built on event-based communication, which has two main roles: Publish (Producer) and Subscribe (Consumer). A publisher uploads a message to a queue or topic, and a middleware tool such as Kafka or Pulsar stores the messages and delivers them to the consumers, following the publish/subscribe principle. Any application can then subscribe. For example, Netflix plays the role of a publisher, and we play the part of the subscriber (consumer). The Kafka Adapter covered in this tutorial works the same way.

A few points to know about Kafka are given below:

Event-based Communication:

This is a modern way for different applications to communicate. It has two components: Publish and Subscribe. Examples include Netflix and YouTube.

  • See the picture below for reference. SAP is the producer pushing messages, and Kafka or Pulsar works here as the middleware that maintains the messages. All the other applications, such as Twitter, HubSpot, and Salesforce, are consumers (subscribers). SAP only needs to publish once; it does not need to send the data to each application individually.

  • Here is one more example for better understanding.

We all use Netflix to watch movies. It uses queuing and streaming tools. Netflix plays the role of the producer, and we play the part of the consumer. Netflix uploads a movie once, and everyone who subscribes to Netflix can watch it.

Subscribers do not need to request it again and again. Sending the content to each viewer individually would take a long time; here, it only needs to be uploaded once, and all the subscribers receive it.
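The fan-out described above can be sketched with a minimal in-memory publish/subscribe model. The `Broker` class here is an illustrative stand-in for middleware like Kafka or Pulsar, not a real API:

```python
# Minimal in-memory sketch of publish/subscribe fan-out.
# "Broker" stands in for middleware such as Kafka or Pulsar;
# these class and method names are illustrative, not a real API.

class Broker:
    def __init__(self):
        self.subscribers = {}  # topic -> list of callback functions

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # One publish is delivered to every subscriber of the topic.
        for callback in self.subscribers.get(topic, []):
            callback(message)

broker = Broker()
received = []

# Three "viewers" subscribe to the same topic.
for viewer in ("alice", "bob", "carol"):
    broker.subscribe("movies", lambda msg, v=viewer: received.append((v, msg)))

# The producer uploads the movie once; all subscribers get it.
broker.publish("movies", "new-release.mp4")
print(received)
```

One call to `publish` reaches all three subscribers, which is the whole point: the producer never sends anything individually.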

Event-Driven Technology Terminology:

There are five main event-driven terms:

  1. Publish
  2. Subscribe
  3. Streaming
  4. Producer
  5. Consumer

Check the picture below for a description of each term.

Any Connect:

  • Any Connect is an easy tool that helps users get their data into Salesforce objects.
  • Any Connect can also extract data from database objects into any destination, as the business requires.
  • Similarly, the Agent can perform bulk deletions by exporting the ID fields of the records we wish to delete and using that source to specify the deletions through Any Connect.
  • Many network and server monitoring solutions use agents like Any Connect to get values from the machines they monitor. These are programs that run on the remote machines and communicate with the main monitoring system. Some vendors try to hide the fact that they use such agents; they may say they "deploy" to remote systems or use other words, but it all comes down to installing custom software on the remote machines.
  • Connectors are available for Pulsar, Kafka, Database, File, FTP, FTPS, SFTP, SOAP, and REST.

Kafka Adapter:

There are two Kafka Adapters:

  • Outbound Kafka Adapter:

We need the Outbound Kafka Adapter when we want to send data out of Salesforce. First, we create a topic. From Salesforce, we send data messages to the API endpoint of our Agent, and a Camel producer then publishes them to the topic "invoice" on the Kafka side. This is the outbound process because we are sending data out; the adapter plays the role of the producer.
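The outbound flow can be sketched conceptually as follows. All the names here (`agent_endpoint`, `CamelProducer`, the in-memory `topics` dict) are illustrative stand-ins for the Agent and Camel pieces, not the real APIs, and the record fields are made up for the example:

```python
import json

# Conceptual sketch of the outbound flow: Salesforce -> Agent endpoint
# -> Camel producer -> Kafka topic "invoice". The `topics` dict stands
# in for the topics on the Kafka side; none of this is the real API.

topics = {}

class CamelProducer:
    """Publishes the payloads it receives into a Kafka topic."""
    def __init__(self, topic):
        self.topic = topic

    def publish(self, payload: bytes):
        topics.setdefault(self.topic, []).append(payload)

def agent_endpoint(record: dict, producer: CamelProducer):
    """The Agent's API endpoint: Salesforce posts a record here,
    and the Camel producer publishes it to the topic."""
    producer.publish(json.dumps(record).encode("utf-8"))

# Salesforce sends an invoice record out through the Agent.
producer = CamelProducer(topic="invoice")
agent_endpoint({"Id": "INV-001", "Amount": 250.0}, producer)
print(topics["invoice"])
```

The direction of the data is the key point: Salesforce pushes to the Agent, and the Agent publishes, so the adapter acts as the producer.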

  • Inbound Kafka Adapter:

We use the Inbound Kafka Adapter when somebody pushes a message into the topic "customer" on the Kafka side; we then consume it. We have an event-driven listener, which is a Camel consumer, so we do not need a scheduler: messages can be consumed immediately. This is the inbound process because we are receiving data; the adapter plays the role of the consumer.
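The inbound side can be sketched the same way. The point of the sketch is the event-driven delivery: the listener fires the moment a message arrives, with no polling scheduler. `KafkaTopic` and its methods are illustrative names, not a real client API:

```python
# Conceptual sketch of the inbound flow: an event-driven listener
# (the Camel consumer) is invoked the moment a message lands on the
# topic "customer", so no scheduler is needed. KafkaTopic is an
# illustrative stand-in, not a real API.

class KafkaTopic:
    def __init__(self, name):
        self.name = name
        self.listeners = []

    def add_listener(self, callback):
        self.listeners.append(callback)

    def push(self, message):
        # Delivery is event-driven: every listener fires immediately.
        for callback in self.listeners:
            callback(message)

received = []

topic = KafkaTopic("customer")
topic.add_listener(received.append)  # the Camel consumer hands data onward

# Somebody pushes a message into the "customer" topic...
topic.push({"Name": "Acme Corp"})
# ...and it is consumed immediately, without polling.
print(received)
```

Contrast this with a scheduler-based design, which would only notice the message on its next polling cycle.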

Check the picture given below for reference.


This feature is beneficial because event-based communication is a modern way for applications to integrate. Kafka can handle many terabytes of data without incurring much overhead. Kafka persists messages on disk and provides intra-cluster replication, which makes it a highly durable messaging system. Kafka replicates data and can support multiple subscribers. Developing a Kafka adapter makes our clients' work easier and more efficient. The Agent Kafka Adapter is a tool that helps users integrate data. It has two functions: the Consumer sends data from Apache Kafka to Salesforce, meaning we can do inbound integration with the Agent Kafka Adapter, and the Producer sends data from Salesforce to Apache Kafka, meaning we can do outbound integration. In the next chapter, we will learn how to use the Agent Kafka Adapter as a Consumer and as a Producer.



