
91.1 How to use the Agent Kafka Adapter for Producer?

Introduction

The Agent Kafka Adapter for Producer is used to write data to a topic, meaning it sends data from Salesforce to a topic in Apache Kafka. To use the adapter as a producer, we need a running Kafka installation; then we need to configure an adapter in the Agent control board. In this chapter, we will learn how to use the Agent Kafka Adapter for Producer, and we will see how to do outbound processing with the CSV, XML, and JSON file types.

Prerequisites

  • Configure a new agent control board.
  • Check the caching of the new agent control board.
  • Start Kafka and create the topics in the Kafka system.

1. How to configure a new agent control board

  • Go to the Integration Detail page.
  • Click on the new agent control board.

 

  • Now we need to create a connection destination. The destination can be given any name, as shown below.

  • Fill in all the required fields in Salesforce > Agent.

  • Click on the Ping Agent connection.

  • Fill in all the required fields in Agent > Salesforce.

  • Click on the Ping Salesforce connection.
  • Save it.

2. How to check the caching of the new agent control board

We can check it for the different object names.

  • Check for the Integration. For that, select Integration in the object name.

  • Check for the Interface. For that, select Interface in the object name.

  • Check for the Mapping. For that, select Mapping in the object name.

  • Check for the Adapter. For that, select Adapter in the object name.

  • Check for the Interface Group. For that, select Interface Group in the object name.

3. How to work with Kafka Outbound Adapter?

Case 1: Create Outbound KAFKA Adapter for CSV format.

  1. Firstly, we need to create a Kafka topic on the Kafka server.
  2. Create a Kafka topic named: CSV-Topic (As shown in the picture given below).

3. We should create separate topics to distinguish the different data formats. (The other topics are needed for the later use cases; for now, we are using CSV-Topic.)

  • ${KAFKA_HOME}/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic CSV-Topic
  • ${KAFKA_HOME}/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic XML-Topic
  • ${KAFKA_HOME}/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic JSON-Topic

4. After creating a topic, we can use the command below to list all Kafka topics.

  • ${KAFKA_HOME}/bin/kafka-topics.sh --zookeeper localhost:2181 --list (on newer Kafka versions, use --bootstrap-server localhost:9092 instead of --zookeeper)

 

5. Create a KAFKA Outbound Adapter for testing with the CSV format.

  • Click on New, as shown below.

 

  • Fill in all the required fields, as shown below.
  • Save it.

 

Note
CSV files support flat mapping; XML/JSON files support hierarchical mapping.
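To illustrate the difference between the two mapping styles, the hypothetical Python sketch below renders the same account record once as a flat CSV row and once as a hierarchical JSON document. The field names are taken from this tutorial's sample record; the exact payload the adapter produces may differ.

```python
import csv
import io
import json

# A sample account record, similar to the message type used in this tutorial.
record = {
    "AccountNumber": "100001",
    "Name": "TestAccountA1",
    "BillingCountry": "CambodiaA1",
    "BillingCity": "PhnomPenhA1",
}

# Flat mapping: one header row plus one delimiter-separated data row (CSV).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(record.keys())    # header row
writer.writerow(record.values())  # data row
csv_payload = buf.getvalue()

# Hierarchical mapping: nested structures are possible (JSON/XML).
json_payload = json.dumps({"AccountTestV3": record})

print(csv_payload)
print(json_payload)
```

Flat mapping can only express one level of fields, which is why the CSV adapter pairs with flat mappings while XML and JSON can carry nested records.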

6. Create a message type using the XML below.


<?xml version="1.0" encoding="UTF-8"?>
<AccountTestV3 xmlns="http://schema.infor.com/InforOAGIS/2">
  <AccountNumber>100001</AccountNumber>
  <Name>TestAccountA1</Name>
  <BillingCountry>CambodiaA1</BillingCountry>
  <BillingCity>PhnomPenhA1</BillingCity>
  <Account_ID>100001</Account_ID>
  <Description>This tutorial will teach you the basics of XML. The tutorial is divided into sections such as XML Basics, Advanced XML, and XML tools.</Description>
</AccountTestV3>
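As a quick sanity check that the message type XML is well-formed, it can be parsed with any standard XML library. The small Python sketch below (illustrative only, not part of the adapter) reads two fields back, taking care of the document's namespace:

```python
import xml.etree.ElementTree as ET

# The message type payload from this tutorial (Description omitted for brevity).
xml_payload = """<?xml version="1.0" encoding="UTF-8"?>
<AccountTestV3 xmlns="http://schema.infor.com/InforOAGIS/2">
  <AccountNumber>100001</AccountNumber>
  <Name>TestAccountA1</Name>
  <BillingCountry>CambodiaA1</BillingCountry>
  <BillingCity>PhnomPenhA1</BillingCity>
  <Account_ID>100001</Account_ID>
</AccountTestV3>"""

# Elements live in the InforOAGIS namespace, so lookups need a prefix map.
ns = {"oagis": "http://schema.infor.com/InforOAGIS/2"}
root = ET.fromstring(xml_payload)
name = root.find("oagis:Name", ns).text
print(name)
```

If `ET.fromstring` raises a `ParseError`, the XML (for example, unbalanced tags or smart quotes pasted from a browser) needs fixing before it is used as a message type.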

7. Create Integration.

8. Create Interface

  • Name
  • Source: Account
  • Status: Deployed
  • Operation: upsert
  • Type: Outbound
  • Processing Mode: Synchronous

Note
Kafka supports only Synchronous Processing Mode 
  • Adapter: Add the CSV Kafka outbound adapter, which we created in step 5.
  • Metadata Provider
  • Repository
  • Message Type: Add the message type, which we created in step 6.
  • Mapping

 

9. Consume the message sent from Salesforce in CSV format.

${KAFKA_HOME}/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic CSV-Topic

10. Callout to the Kafka server.

11. Here is the result on the server.

 

12. The result in message monitoring.

 

 

13. CSV supports other separators, such as the semicolon, tab, and pipe separators.

 

Case 2: Create Outbound KAFKA Adapter for JSON format.

  1. Create JSON Adapter.

2. Create Integration.

3. Create Interface

  • Name
  • Source: Account
  • Status: Deployed
  • Operation: upsert
  • Type: Outbound
  • Processing Mode: Synchronous

Note
Kafka supports only Synchronous Processing Mode 
  • Adapter: Add the JSON Kafka outbound adapter.
  • Metadata Provider
  • Repository
  • Message Type: Add the message type. (Hierarchical structure)
  • Mapping

  • Consume the message sent from Salesforce in JSON format.

${KAFKA_HOME}/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic JSON-Topic

4. Callout to the Kafka server.

5. Here is the result on the server.

6. The result on Message Monitoring.

Case 3: Create Outbound KAFKA Adapter for XML format.

  • We need to follow the same steps as in Case 2; we just need to create an XML adapter instead.
  • We also need to convert the data to a single line.
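One way to collapse a pretty-printed XML payload onto a single line is to strip the whitespace between tags. This is a small helper sketch, not a feature of the adapter:

```python
import re

pretty_xml = """<?xml version="1.0" encoding="UTF-8"?>
<AccountTestV3 xmlns="http://schema.infor.com/InforOAGIS/2">
    <AccountNumber>100001</AccountNumber>
    <Name>TestAccountA1</Name>
</AccountTestV3>"""

# Collapse whitespace between tags, then fold any remaining newlines.
single_line = re.sub(r">\s+<", "><", pretty_xml).replace("\n", " ")
print(single_line)
```

Note that this simple approach also squeezes whitespace out of text nodes that start or end next to a tag, which is usually acceptable for record-style payloads like the one above.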

Summary:

The Agent Kafka Adapter for Producer is used to write data from Salesforce to a topic in Kafka, meaning that we use it for outbound data from Salesforce to Kafka. The Agent Adapter for a producer can work with CSV, XML, and JSON data records, but the file type must also be configured in the adapter. Follow this guide to learn how to use the Agent Kafka Adapter for Consumer (https://apsara-consulting.com/docs/tutorial-v2-41-lightning/91-how-to-use-the-agent-kafka-adapter/91-2/).
