
Why API alone is not enough for enterprise-class integration

What if I told you that you don’t need to write a single line of code to perform enterprise-level integration? Yes, you heard right. Salesforce is all about low code, but as developers we still write code for different types of integration. In my first blog I came across the SKYVVA Data Loader and worked with it as a replacement for the Salesforce Data Loader. SKYVVA provides not only the easy data loading and importing we know from the Data Loader; it also supports complex, enterprise-class integration that is hard to achieve with an API-only approach. It is a very powerful tool that offers integration with various advanced features and no coding. This secret tool can completely change your organization’s integration process.

The Salesforce standard API

So let’s first understand the standard ways to do integration in Salesforce. I will briefly outline the different types of APIs we can use to integrate with and manipulate the data in your Salesforce organization. For complete API documentation and descriptions, please refer to the official Salesforce help site. Let’s have a look at the different APIs:


REST API

REST API is a simple yet powerful web service based on RESTful principles. It exposes all sorts of Salesforce functionality via REST resources and HTTP methods. For example, you can create, read, update, and delete (CRUD) records; search or query your data; retrieve object metadata; and access information about limits in your org. REST API supports both XML and JSON. Because it has a lightweight request-and-response framework and is easy to use, it’s great for writing mobile and web apps.
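To make this concrete, here is a small Python sketch of building (not sending) a REST API create-record request. The instance URL, API version, and token are placeholder assumptions, not values from this blog:

```python
import json
import urllib.request

# Hypothetical values -- replace with your own org's instance URL and OAuth token.
INSTANCE_URL = "https://yourInstance.salesforce.com"
ACCESS_TOKEN = "hypothetical-session-token"

def build_create_account_request(name, industry):
    """Build (but do not send) a REST API request that creates an Account."""
    url = f"{INSTANCE_URL}/services/data/v45.0/sobjects/Account/"
    payload = json.dumps({"Name": name, "Industry": industry}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
    )

req = build_create_account_request("Acme Corp", "Manufacturing")
print(req.get_method())  # POST
print(req.full_url)
```

Sending the request is then one `urllib.request.urlopen(req)` call; the lightweight JSON body is what makes REST API convenient for mobile and web clients.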


SOAP API

SOAP API is a robust and powerful web service based on the industry-standard protocol of the same name. It uses a Web Services Description Language (WSDL) file to rigorously define the parameters for accessing data through the API. SOAP API supports XML only. Most SOAP API functionality is also available through REST API; it just depends on which standard better meets your needs. Because SOAP API uses the WSDL file as a formal contract between the API and the consumer, it’s great for writing server-to-server integrations.
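As a rough illustration of the WSDL-driven contract, the following Python sketch assembles a SOAP login envelope. The namespace URN and element names are assumptions for illustration, not taken from a generated WSDL:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Illustrative namespace standing in for the one defined in the WSDL.
SFDC_NS = "urn:enterprise.soap.sforce.com"

def build_login_envelope(username, password):
    """Build a SOAP login request body following the WSDL-style contract."""
    ET.register_namespace("soapenv", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    login = ET.SubElement(body, f"{{{SFDC_NS}}}login")
    ET.SubElement(login, f"{{{SFDC_NS}}}username").text = username
    ET.SubElement(login, f"{{{SFDC_NS}}}password").text = password
    return ET.tostring(envelope, encoding="unicode")

print(build_login_envelope("user@example.com", "secret"))
```

The rigid, typed XML structure is exactly what makes SOAP heavier than REST but well suited to formally contracted server-to-server integration.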


Bulk API

Bulk API is a specialized RESTful API for loading and querying lots of data at once. By lots, we mean 50,000 records or more. Bulk API is asynchronous, meaning that you can submit a request and come back later for the results. This approach is preferred when dealing with large amounts of data. There are two versions of the Bulk API (1.0 and 2.0). Both handle large amounts of data, but Bulk API 2.0 is a bit easier to use. Bulk API is great for tasks that involve lots of records, such as loading data into your org for the first time.
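The batching idea behind Bulk API can be sketched in a few lines of Python: split a large record set into CSV chunks, each of which would be submitted as one batch of an asynchronous job. The 10,000-row batch size here is just an illustrative choice:

```python
import csv
import io

def to_csv_batches(records, batch_size=10000):
    """Split a list of dicts into CSV strings, one per Bulk API batch."""
    fieldnames = list(records[0].keys())
    batches = []
    for start in range(0, len(records), batch_size):
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records[start:start + batch_size])
        batches.append(buf.getvalue())
    return batches

records = [{"Name": f"Account {i}"} for i in range(25000)]
batches = to_csv_batches(records)
print(len(batches))  # 3 batches: 10000 + 10000 + 5000 rows
```

Each batch is then submitted and processed in the background, which is why the client never has to block while millions of records are loaded.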

Streaming API

Streaming API is a specialized API for setting up notifications that trigger when changes are made to your data. It uses a publish-subscribe, or pub/sub, model in which users can subscribe to channels that broadcast certain types of data changes. The pub/sub model reduces the number of API requests by eliminating the need for polling. Streaming API is great for writing apps that would otherwise need to frequently poll for changes.
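The pub/sub model itself is easy to sketch: subscribers register on a channel and are pushed every event published there, so nobody has to poll. This toy Python bus is only an analogy for how Streaming API channels behave, not an implementation of CometD:

```python
from collections import defaultdict

class PubSubBus:
    """Minimal pub/sub sketch: callbacks subscribe to a channel and are
    pushed every event published on it -- no polling involved."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, channel, callback):
        self._subscribers[channel].append(callback)

    def publish(self, channel, event):
        for callback in self._subscribers[channel]:
            callback(event)

bus = PubSubBus()
received = []
# Channel name modeled on a Change Data Capture channel, for illustration.
bus.subscribe("/data/AccountChangeEvent", received.append)
bus.publish("/data/AccountChangeEvent", {"Name": "Acme", "ChangeType": "UPDATE"})
print(received)  # [{'Name': 'Acme', 'ChangeType': 'UPDATE'}]
```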

API Limits

Total limits vary by org edition, license type, and any expansion packs you purchase. For example, an Enterprise Edition org gets 1,000 calls per Salesforce license and 200 calls per Salesforce Light App license. With the Unlimited Apps Pack, that same Enterprise Edition org gets an extra 4,000 calls. Total limits are also subject to minimums and maximums based on the org edition, but we won’t get into that here. If you want to know more, check out the API Request Limits documentation in the Salesforce help.
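Using the per-license numbers quoted above, a back-of-the-envelope budget calculation might look like this (minimums and maximums ignored, as noted):

```python
def daily_api_limit(full_licenses, light_licenses, unlimited_apps_packs=0):
    """Rough daily API call budget for an Enterprise Edition org, using the
    per-license figures quoted above (edition minimums/maximums ignored)."""
    return (full_licenses * 1000          # 1,000 calls per Salesforce license
            + light_licenses * 200        # 200 calls per Light App license
            + unlimited_apps_packs * 4000)  # 4,000 extra per Unlimited Apps Pack

print(daily_api_limit(50, 10, unlimited_apps_packs=1))  # 56000
```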

Which API to use in which case?

Choosing the right API for your integration needs is an important decision. Here’s some information on our most commonly used APIs, including supported protocols, data formats, communication paradigms, and use cases. Treat this section as a reference you can return to when you’re considering which API to use.

When to Use REST API

REST API provides a powerful, convenient, and simple REST-based web services interface for interacting with Salesforce. Its advantages include ease of integration and development, and it’s an excellent choice for mobile applications and web projects. For certain projects, you may want to combine REST API with other Salesforce REST APIs: to build UI for creating, reading, updating, and deleting records, including list views, actions, and dependent picklists, use User Interface API; to build UI for Chatter, communities, or recommendations, use Chatter REST API. If you have many records to process, consider using Bulk API, which is based on REST principles and optimized for large sets of data.

When to Use SOAP API

SOAP API provides a powerful, convenient, and simple SOAP-based web services interface for interacting with Salesforce. You can use the SOAP API to create, retrieve, update, or delete records. You can also use SOAP API to perform searches and much more. Use SOAP API in any language that supports web services.

For example, you can use SOAP API to integrate Salesforce with your org’s ERP and finance systems. You can also deliver real-time sales and support information to company portals and populate critical business systems with customer information. Note that SOAP is a well-defined W3C standard that has reached a long maturity, so many older application systems still support only SOAP. Old applications that deliver value to your new digital scenario are still a great asset for your company and therefore need to be integrated into the new digital process.

When to Use Bulk API

Bulk API is based on REST principles and is optimized for loading or deleting large sets of data. You can use it to query, queryAll, insert, update, upsert, or delete many records asynchronously by submitting batches. Salesforce processes batches in the background.

SOAP API, in contrast, is optimized for real-time client applications that update a few records at a time. You can use SOAP API for processing many records, but when the data sets contain hundreds of thousands of records, SOAP API is less practical. Bulk API is designed to make it simple to process data from a few thousand to millions of records.

The easiest way to use Bulk API is to enable it for processing records in Data Loader using CSV files. Using Data Loader avoids the need to write your own client application.

When to Use Streaming API

Use Streaming API to receive near-real-time streams of data that are based on changes in Salesforce records or custom payloads. For Salesforce record changes, Salesforce publishes notifications when the changes occur. For custom notifications, you can publish event messages. Subscribers can receive notifications using CometD – an implementation of the Bayeux protocol that simulates push technology. Clients can also subscribe to some types of events with Apex triggers or declaratively with Process Builder and Flow Builder. Use the type of streaming event that suits your needs.

Push Topic Event

Receive changes to Salesforce records based on a SOQL query that you define. The notifications include only the fields that you specify in the SOQL query.

Change Data Capture Event

Receive changes to Salesforce records with all changed fields. Change Data Capture supports more standard objects than Push Topic events and provides more features, such as header fields that contain information about the change.

Platform Event

Publish and receive custom payloads with a predefined schema. The data can be anything you define, including business data, such as order information. Specify the data to send by defining a platform event. Subscribe to a platform event channel to receive notifications.

Generic Event

Publish and receive arbitrary payloads without a defined schema.

Nature of API and its principal use case

Now that you have seen the different APIs available for integration and their use cases, let us look deeper into the nature and the principle behind APIs. APIs became popular as applications moved to distributed networks, where business logic no longer lives as one big bunch of logic inside a monolith. Decomposition and distribution became the design principles for a new generation of applications that can easily be used over a distributed network such as the internet. To communicate with these now autonomous pieces of software, each specialized in a business domain, the API became the vehicle: applications have to communicate with each other using APIs.

Applications were decomposed into logical units that have to communicate with each other to fulfill a business task. Most of the time this communication is synchronous and thus happens in real time. This kind of communication is good for interaction between the application and the business user. Integration, on the other hand, mostly happens in the background and runs autonomously, without a user sitting and waiting for an API response. The work has to be done in the background, sometimes with a high volume of data, automatically. In other words, the communication has to happen asynchronously and must support background and batch processing of high-volume data.

The integration pattern

Looking at the Salesforce standard APIs above, all of them except the Bulk API operate synchronously and thus follow the ‘tight coupling’ integration pattern. When you deal with integration, you will come across two principal approaches:

  • Tight Coupling
  • Loose Coupling

The following picture shows the two patterns:


A tightly coupled integration is one that creates a dependent relationship between Salesforce and whatever systems it’s being linked to, by spreading the integration business logic across those systems. While a tightly coupled integration could function just fine, it will inevitably cause scalability and maintenance problems in the long term. A loosely coupled integration, on the other hand, keeps the integration business logic separate from each system, thus creating independence, interoperability, and decoupled applications. This allows you to modify your integration logic more easily and to make changes to each individual system without fear of breaking existing functionality.

Tight Coupling

With the Salesforce SOAP and REST APIs you can easily perform simple, so-called CRUD operations on your data, and these operate synchronously. This means the calling application, or client, needs to wait for the API call to finish. What happens to the client when the API runs into an endless loop because of a recursive trigger on an sObject? Is the client still responsive? Can the user still use the application, or does it freeze? These kinds of problems occur when the tight coupling pattern is used with the wrong design. Tight coupling is a valid and useful pattern; I am not saying it is useless. The point is that in the integration world, most use cases are a better fit for the so-called loose coupling of application inter-communication.

Due to the decomposed design of modern software, with separate functional units, such tight coupling causes too many dependencies. It creates a big, monolithic block that cannot easily be distributed over the internet. Lightweight, functionally encapsulated apps are the future, and the communication between these apps therefore needs to be decoupled. With this evolution over the last few decades, we have seen new vendors entering the market with application software that provides specific, well-suited business packages. There is therefore a strong demand to integrate them in the right way with the right architecture.

Here are some thoughts and facts to consider when you plan to use a synchronous, tightly coupled architecture for your integration project.

  • The tight coupling model creates more interdependency between the two integrated systems.
  • It requires more coordination between the two systems, as both have to cooperate during integration data processing for it to succeed.
  • The caller needs to wait until the API call is finished, which blocks the sender of the requested data.
  • Waiting for API callouts results in a bad user experience and bad performance for the client. Sometimes your application appears frozen to your users. This causes frustration and business damage if your users stop using your app.
  • The client cannot use all its resources because of waiting and idle time.
  • These APIs always follow a ping-pong pattern to synchronize between caller and provider, so a tightly coupled integration requires more effort.

On the other hand, tight coupling is also a valid way to do integration when your business case needs real-time interaction to support your business users. Only in this case can the disadvantages described above be ignored.

Loose Coupling

As applications nowadays are breaking out of their monolithic structure into small, smart, and maintainable units, tight coupling is no longer in the nature of applications. The tight coupling pattern therefore no longer fits well; applications need to talk and interact loosely, without a user or process waiting on another. This kind of integration causes less interdependency and is thus easier to maintain and flexible enough to meet any kind of new business requirement.

Loose coupling follows the asynchronous communication pattern. Imagine calling somebody by phone: you always need somebody to be available at the moment you start the communication. With the asynchronous pattern you don’t need your communication partner to be available, which creates less dependency, for example when you write an email to a recipient. You are not blocked for hours; you just send your message and can turn to other things. Not being blocked by the communication partner is one of the main advantages.
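The email analogy can be sketched with a simple queue: the sender drops a message and returns immediately, while a background worker processes it later. This is a minimal Python illustration of asynchronous decoupling, not any specific product’s implementation:

```python
import queue
import threading

# The "inbox": the sender and the worker only share this queue,
# so neither blocks waiting for the other to be available.
inbox = queue.Queue()
processed = []

def worker():
    while True:
        message = inbox.get()
        if message is None:                # shutdown signal
            break
        processed.append(message.upper())  # stand-in for real processing
        inbox.task_done()

t = threading.Thread(target=worker)
t.start()

inbox.put("update account Acme")  # sender returns immediately, not blocked
inbox.put(None)
t.join()
print(processed)  # ['UPDATE ACCOUNT ACME']
```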

Loose coupling solves the disadvantages of the tight coupling pattern described above. It doesn’t bind huge resources on both sides, i.e. sender and receiver, during the communication. It releases the client, i.e. the caller, more quickly and thus doesn’t give the user the impression that the application is frozen. Furthermore, if your integration scenario involves exchanging mass data, for example updating millions of product prices from an ERP system, then asynchronous communication, such as the Bulk API, is the only practical way to send such an amount of data.

There are many great books and resources available on the internet that explain the communication patterns used in the integration domain more precisely. In this blog we cannot cover all the theoretical aspects of these patterns, because that would exceed its scope. Let’s see in the following chapter how I found a solution with SKYVVA that supports both tight and loose coupling.

Integration using the API only approach

Now that we have seen and understood the two existing approaches and patterns for doing integration, let’s have a deeper look at how we can integrate by programming manually against the Salesforce standard APIs. Salesforce provides both flavors of integration API, SOAP and REST. Until a few years ago you could only use the SOAP API to connect to your org, which is why older applications using the SOAP API still exist. The picture below shows integration using the synchronous pattern with the Salesforce standard SOAP/REST API.

It shows a use case with an SAP cloud ERP application as a client which uses the API to synchronize data like accounts, contacts, sales orders, invoices, etc. In such a business case you have to deal with mass data to synchronize between the SAP and Salesforce application systems. It requires autonomous data synchronization between the two systems without any users involved, and it needs batch processing because of the mass of data.

The characteristic of the Salesforce standard APIs is that they operate synchronously, meaning the client application has to wait until the operation triggered by the API is done. This can cause all the disadvantages we have seen above with the tight coupling approach. The most common issue with the standard APIs is that the client application seems to freeze when there are too many requests at the same time.

Another aspect of the Salesforce standard APIs is that they provide very simple CRUD operation capability. But in a real integration scenario we need more than just create (C), read (R), update (U), and delete (D) on a business record such as a quote. We need to execute more complex logic before those basic operations. For example, before updating the quote we need to check, based on the sales area assignment in SAP, whether this quote belongs to the correct territory in Salesforce. So some business logic needs to run beforehand.
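Such pre-CRUD business logic might look like the following sketch, where a quote is validated against a purely hypothetical sales-area-to-territory mapping before any update is attempted:

```python
# Hypothetical mapping from SAP sales areas to Salesforce territories,
# invented for illustration only.
SALES_AREA_TO_TERRITORY = {
    "DE-NORTH": "Territory EMEA",
    "US-WEST": "Territory AMER",
}

def validate_quote(quote):
    """Return (ok, reason); reject quotes whose SAP sales area does not
    map to the territory recorded in Salesforce."""
    territory = SALES_AREA_TO_TERRITORY.get(quote["sap_sales_area"])
    if territory is None:
        return False, f"No territory mapped for {quote['sap_sales_area']}"
    if territory != quote["sf_territory"]:
        return False, "Quote assigned to the wrong territory"
    return True, "ok"

ok, reason = validate_quote(
    {"sap_sales_area": "DE-NORTH", "sf_territory": "Territory EMEA"}
)
print(ok, reason)  # True ok
```

Only when such a check passes would the plain CRUD update of the quote record follow.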

A real enterprise integration scenario is much more than simple CRUD operations. You need a full-fledged set of services on the Salesforce side to provide a stable integration, not only in terms of simple connectivity, but also to support daily operations when something goes wrong. For this you need an integration service layer sitting on the Lightning Platform, such as the one I found in the SKYVVA solution.

Looking at the picture above, you can see the gap (the black hole in the picture) that a service layer could fill for your integration needs. It goes beyond connectivity and gives you and your Salesforce team an easy way to develop, maintain, and operate your daily integration. This is what the SKYVVA service layer adds to the Salesforce Lightning Platform: the missing piece for a robust tool for integration to and from Salesforce.

Why integration needs both patterns?

As integration problems and use cases differ widely, we cannot say that either tight or loose coupling alone is the right solution for every requirement. If you need to see, for example, the availability of a product in stock in real time, then you solve this requirement with the tight coupling pattern and thus provide a real-time experience to the user waiting on a mobile device. If you need to synchronize account address data that only has to be visible by the next day, then you can use the loose coupling pattern and delay the batch processing of the account data update to a nightly job. That way you don’t disturb the business users working during the main business hours of the day.

One approach and architecture style is not enough to solve today’s integration needs in the Salesforce ecosystem. If you use a technology that supports only one of these communication patterns, you will be lost when trying to provide a stable, reliable, and timely connected digital business on your Salesforce platform. You simply need a tool that supports both patterns effectively, without demanding that you be an expert API developer. With SKYVVA I found such a toolset, providing both patterns with native Salesforce technology. No additional tools or middleware are needed to provide enterprise-class integration.

After explaining the Salesforce standard APIs and the two integration patterns, and showing the need for both patterns to build any integration, let us now have a look at a few features offered by the SKYVVA solution. Note that all the features we show below are possible only because of the SKYVVA service layer, which is in fact the decoupling layer. Let me also repeat that SKYVVA supports both patterns and lets you run a real-time integration scenario by bypassing the SKYVVA staging layer, in the same manner as with the Salesforce standard APIs. The difference is that you get more functionality with the SKYVVA solution than when doing the integration manually yourself, or having it done by a developer team that needs deep API programming knowledge and skills.

The SKYVVA added values we want to showcase in the chapters below are the following:

  • Message Monitor, which gives you a handy tool for monitoring incoming and outgoing messages.
  • Message Reprocessing, which allows you to correct wrong incoming data and reprocess it without having to ask the sender application to send the data again.
  • Alerting, which eases your life so you don’t have to actively and permanently watch the monitor to see what is going wrong. Instead, you get an alert and notification conveniently on your mobile device or in a Chatter group.

How to monitor in case something is going wrong?

Let us start with a question: do the interfaces and integrations you have developed always run smoothly, without any issue? Have you never had a failed data integration due to incorrect data, e.g. a wrong date or currency format, or a user simply entering bad data in the sending application? How do you deal with all the kinds of errors you face in daily operations? If you are an experienced Salesforce developer, you probably have no problem finding the root cause of an integration error. But do you want to spend your resources every day fixing integration errors that could easily be handled by, for example, your admin or business-user colleagues? This is the time-saving point I came across while using the SKYVVA solution, which provides all these benefits in a very simple way.

With SKYVVA you now have end-to-end monitoring, because messages are kept for monitoring purposes in the SKYVVA service layer and staging area. Without the SKYVVA service layer you would only have the monitoring capability provided by the sender application and the middleware. This value-added feature is provided exclusively by the SKYVVA service layer.

The screen below shows an example of the monitoring screen, where you have different options to find messages. For example, the most-used search option is to find a message based on its content, e.g. by entering the account name. With one mouse click you have a clear view of your data and can compare it easily at the business level, without having to be an expert Salesforce developer able to read debug logs and traces.

It also provides the flexibility to monitor your integration data by specific date and time. Hence you don’t need to always be present in front of your system while data integration runs.

Depending on the processing, different message status filter options are available:

Completed: The API call was successful and the data was successfully posted.

Failed: The data was received but could not be posted, for example due to a data type mismatch such as a text field in the sender mapped to a number field in Salesforce. With the monitoring screen you can see the reason for the failure after the API call.

Pending: The data was sent by the sender but has not yet been posted in Salesforce.

Canceled: The data was canceled after the API call.
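The four statuses form a simple model that can be sketched as an enum plus a filter, mirroring how the monitor lets you filter messages. The code below is an illustration, not the SKYVVA data model:

```python
from enum import Enum

class MessageStatus(Enum):
    PENDING = "Pending"      # sent by the sender, not yet posted
    COMPLETED = "Completed"  # API call succeeded and data was posted
    FAILED = "Failed"        # posting failed, e.g. a data type mismatch
    CANCELED = "Canceled"    # canceled after the API call

def filter_messages(messages, status):
    """Status filter as offered by the message monitor."""
    return [m for m in messages if m["status"] is status]

messages = [
    {"id": 1, "status": MessageStatus.COMPLETED},
    {"id": 2, "status": MessageStatus.FAILED},
    {"id": 3, "status": MessageStatus.FAILED},
]
print([m["id"] for m in filter_messages(messages, MessageStatus.FAILED)])  # [2, 3]
```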

Please refer to the SKYVVA online tutorial and documentation pages for the details of the monitoring capability.

Easy data fixing and correcting using the reprocessing feature

Imagine an example where you have to bring sales quotes and orders from an ERP system such as SAP into Salesforce, to give your sales representatives a clear picture of customer quote and order amounts. Quotes and orders can be synchronized to Salesforce immediately whenever they are changed in SAP. Such a process happens in the back office, where your colleagues use SAP to create and change quotes and orders. Now imagine that, being human, a colleague makes a mistake and enters wrong data into a quote: instead of the correct currency format, he or she enters one that is not recognized by Salesforce. The posting of the quote therefore fails.

What to do now? You can ask your colleague to correct the data in the SAP system and send it again. But what if it is near the end of the business day and your colleagues have already left the office? There is no one who can correct the wrong currency. The problem is that you have to send the corrected quote from Salesforce to the potential client today. Of course, you can postpone sending the quote until tomorrow. But if you are not the only vendor quoting this potential client, and your competitor can send their quote by the end of the business day while you cannot, what is going to happen? A scenario like this can damage your business!

Such scenarios and use cases happen in reality, and you need a way to independently correct and reprocess the posting of application data, in this case to update the Salesforce quote with the correct currency from SAP. SKYVVA allows you to make such a data correction and reprocess the data immediately. Thus you can fix the wrong currency, post your quote successfully, and send it to your potential client before the end of the day. You can even set up a scenario to send the changed data back, using Process Builder with the SKYVVA callout functionality to update SAP with the changes you made.

Start from the message monitor to look for failed messages, or jump to the failed message directly from the alert in your email inbox. In the failed message, make the necessary data correction and reprocess it.
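Conceptually, reprocessing amounts to correcting the stored payload and posting it again, as in this sketch. The function names and the currency check are assumptions for illustration, not the SKYVVA API:

```python
# Currencies our hypothetical target org accepts.
VALID_CURRENCIES = {"EUR", "USD", "GBP"}

def post_quote(quote):
    """Stand-in for posting the quote to Salesforce."""
    if quote["currency"] not in VALID_CURRENCIES:
        raise ValueError(f"Unknown currency: {quote['currency']}")
    return "Completed"

def reprocess(failed_message, corrections):
    """Apply the operator's corrections to the stored payload and post
    the message again -- no resend from the sender application needed."""
    failed_message["payload"].update(corrections)
    failed_message["status"] = post_quote(failed_message["payload"])
    return failed_message

msg = {"payload": {"quote_id": "Q-1001", "currency": "EURO"}, "status": "Failed"}
msg = reprocess(msg, {"currency": "EUR"})  # operator fixes the bad currency
print(msg["status"])  # Completed
```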

Why alerting can help you

As you have seen, having an additional component to monitor data in Salesforce is a great help in your daily operations for fixing integration failures. Without the monitoring component, your only option is to dig into the debug trace and look for possible failures. This is complex, time-consuming, and needs an experienced Apex developer with deep API knowledge. With the monitoring, you can delegate this kind of work to your Salesforce admins and even business users, and stay free to develop the other applications your company might need.

Monitoring allows you to actively look at, search, and see integration data failures. But what if you don’t have time to check the monitoring every 30 minutes? You are mostly busy with meetings and other tasks and cannot afford to open the monitoring page that often. This is where alerting comes into play. With message failure alerting from the SKYVVA solution, you no longer need to actively watch for data integration failures. Even while you are drinking a cup of coffee, you will know when something goes wrong with your data integration. SKYVVA provides real-time alerts and notifications to your email inbox, can create a task for your user, and can use the social functionality of a Chatter group to post alerts.
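The alerting idea can be sketched as a simple fan-out: a failed message produces one notification per configured channel (email, task, Chatter). This is an illustration of the concept, not SKYVVA’s implementation:

```python
def alert_on_failure(message, channels, notifications):
    """Fan out a notification to every configured channel when a message
    has failed; do nothing for healthy messages."""
    if message["status"] != "Failed":
        return
    for channel in channels:
        notifications.append(
            (channel, f"Message {message['id']} failed: {message['error']}")
        )

sent = []
alert_on_failure(
    {"id": 42, "status": "Failed", "error": "bad currency format"},
    channels=["email", "task", "chatter"],
    notifications=sent,
)
print(len(sent))  # 3 notifications, one per channel
```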

With this feature, the SKYVVA solution adds the alerting functionality missing from the Salesforce platform. Now you have alerting capability along the whole integration chain: alerting everywhere, so nothing can get lost anymore. This is possible thanks to the decoupling and integration service layer that the SKYVVA solution adds to the Salesforce Lightning Platform.


Finally, you have reached the end of this long blog. There is more functionality and added value that I came across in the SKYVVA integration solution while working with this great native app. For example, regarding the reprocessing feature, you don’t need to reprocess every piece of data manually. Schedulers are available in the SKYVVA solution that do the reprocessing automatically. Failures like a business record being locked, due to an end user and the integration user processing the same record concurrently, are resolved by the reprocessing job; there is no need to reprocess manually. The scheduler can be defined flexibly, e.g. to run every 15 minutes, or only on Sunday at 10 pm.
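An every-N-minutes schedule like the one described boils down to simple date arithmetic, as this small sketch shows:

```python
from datetime import datetime, timedelta

def next_run(last_run, interval_minutes=15):
    """Compute the next reprocessing run for an every-N-minutes schedule."""
    return last_run + timedelta(minutes=interval_minutes)

print(next_run(datetime(2019, 5, 1, 10, 0)))  # 2019-05-01 10:15:00
```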

As you have seen, it is not a question of whether you can do integration with the existing Salesforce standard APIs. The answer is YES, you can do everything yourself if you have the right skills and coding experience. But can you really provide integration with the quality and robustness that enterprise-class integration needs if you use only the API connectivity approach? What about supporting decoupled processing between the client and Salesforce? Can this be done using the standard Salesforce APIs alone?

What about the error handling, including the alerting functionality you need for a really robust integration? Do you have all the functionality required to handle your daily data synchronization operations, including monitoring, reprocessing, and alerting? If you just need ping-pong connectivity and simple CRUD operations, you can use the Salesforce standard APIs. But if you want more, or your business needs a stable and maintainable integration and data synchronization, then there is no way around a professional solution like a middleware. And what if your company doesn’t have the budget for a full-blown middleware? If you are in such a situation but need the enterprise-class integration a middleware can provide, then my recommendation is definitely to have a look at the SKYVVA app.

Find the latest release version here and make a first try with the free SKYVVA Data Loader. For a test version with all the functionality, limited to 30 days, fill in the contact form and ask for a trial key.


Release Spring ´19 SKYVVA Agent – Official Version 1.49


Spring ´19 Release

With the Spring ´19 release, the following enhancements are available in the new SKYVVA Integration Agent version 1.49:

  • ‘New Property’ button for creating a new properties file in the Agent
  • File name with merge file
  • Property files can no longer be duplicated
  • Web service calls through the Agent
  • Initial mode now works with auto-switch mode
  • Retrieve more than 50K records from Salesforce with Agent outbound processing
  • New property ‘Replace Header’ for the File adapter type with file type CSV
  • Update the next run date after the schedule setting is changed in the Agent UI
  • Condition-based scheduler
  • Read file extensions case-insensitively
  • XML IChained with Hierarchical difference way

Bugs Fixed

  • MySQL Server: time zone value unrecognized
  • Package size filled in the Agent UI for an interface did not update in Salesforce
  • Agent could not process attachments when none were present in the folder
  • Hard delete did not work with a database
  • Login against the wrong server environment
  • Scheduler kept working after it was stopped

Release Spring ´19 SKYVVA Integration Cloud – Official Version 2.42.4


Spring ´19 Release

The Spring ´19 release is now available in SKYVVA Official Version 2.42.4. The release can be updated for your sandbox or production/developer organizations via the downloads below:

  • Sandbox organizations / Free Version
  • Production/developer organizations / Free Version

New Features

The Spring ´19 release is now available in SKYVVA Official Version 2.42.4. The highlights of the Spring ´19 release are summarized in the following short overview:
  • SOAP Adapter
  • Pass URL parameters to the SFDC2SAPPI adapter
  • Message layer refactoring
  • Different response patterns for inbound calls
  • Realtime CDC using Process Builder
  • View the chain definition in an interface as a tree
  • Mapping of imported WSDL request/response/fault
  • Add new operation types: Apex Class, Flow/Process
  • Selecting a field from a parent sObject in the Mapping editor should be possible as in the Query Editor
  • WSDL generation based on message type
  • Mapping editor shows fields based on the field access level of the logged-in user
  • Process interfaces manually
  • Message reprocessing per interface group
  • Message reprocessing per integration
  • For outbound calls, the response structure needs to be the same as the request structure
  • For real synchronous calls, no message needs to be created
  • Add a new value and rename an existing value for the flag “Inbound Behavior”
  • Enhance the CDC Monitor with more fields
  • New message monitor based on JSON payload
  • Redesigned log file
  • Enable using Bulk 2.0 relationships in the mapping tool
  • Validation check on the interface when bulk mode is used
  • Generate WSDL, Swagger 2.0, and OpenAPI for metadata exchange between SKYVVA and other platforms
  • Workflow transactions
  • Rename the CDC scheduler in the CDC control board
  • Create one single WSDL which contains request and response for a synchronous interface from SF -> SAP
  • Handle request and response as one transaction for synchronous calls
  • Create a hint to show that the data of a field in the message monitor is very long
  • Pass the raw payload to the custom class
  • Enhance the search filter to include transaction id and transfer id
  • Redesigned execution of outbound processing
  • Button on the interface to export metadata
  • Show the operation type of the target interface after choosing the target interface in a workflow
  • Automatically map all fields which have the same name
  • Add a new filter to the Scheduler tab to search by functional group
  • Remove deprecated schedulers from the list
  • CDC scheduler for interface group and interface
  • Make the standard scheduler behavior

Bugs Fixed

  • Comment field in message is empty after processing
  • UNABLE_TO_LOCK_ROW with future Apex during integrate
  • Date fields in inbound interfaces
  • Message monitor stuck because of a SOAP fault message
  • Processing baskets with the Reprocess button on the takeo org does not work correctly
  • Set the message type for the generated sub-interface
  • Related List and IChained do not work correctly
  • Success message returned but the record is not deleted in Message Monitoring

Patch Version 2.42.4

  • Auto Create Query based on mapping
Blog Data Loader

SKYVVA: a better free Data Loader

While working on Salesforce projects such as data integration, I have to deal daily with data mapping, importing data, exporting data from spreadsheets, and then adjusting the fields to match the Salesforce object fields. I have to use the standard Salesforce Data Loader installed on my system, generate a token, and connect to my Org.

I have to select my latest updated .csv file and then start importing it. However, when there are lots of records in my .csv file, the import frequently fails due to small mistakes I made in the file. I then have to download the error file, check why records are failing, edit the .csv file again, and repeat the process. This takes a lot of effort and time when dealing with large data volumes.
I discussed this problem with Salesforce experts as well as other developers, and I concluded that everyone out there faces the same issues when importing and exporting data with the standard Salesforce Data Loader. Hence, I started actively looking for a smarter way to do this, one that could save me time as well as effort.
While searching AppExchange for an alternative to the standard Salesforce Data Loader, I came across the SKYVVA Free Data Loader because of its excellent reviews on AppExchange.
Data mapping with the SKYVVA Free Data Loader
I was looking for an easy and powerful data loader because, with the standard Salesforce Data Loader, I am not able to build my data integration easily due to its complexity. The standard Salesforce Data Loader consumes a lot of time; maintaining it and resolving errors takes a big effort, and in the end it costs money in terms of the manpower spent building the integration.
With the SKYVVA Data Loader, all 6 steps of our loading process can be achieved in a single data load, with the added bonus of being able to resolve loading issues within the process and to import data for multi-tier custom objects in one shot.
The SKYVVA Data Loader is also a FREE data loader, like the Salesforce Data Loader, and definitely a perfect alternative solution. Data import and export can be performed within seconds. The SKYVVA Data Loader has many advantages over the standard Salesforce Data Loader, and it is quite easy to use and get started with.

No installation is needed

Just as you know how easy it is to get an app on your smartphone, the SKYVVA Data Loader is quite easy to install into a Salesforce organization. It lives in Salesforce like a native Salesforce application; there is no need to install separate software on a machine.

installation SKYVVA Data Loader
The installation takes around 5 to 10 minutes, depending on the internet speed on your side. To install the app easily, you can follow this SKYVVA tutorial:
Or follow this YouTube video:
To get the latest version, scroll down to the last chapter of this blog. After the successful installation, you will find the SKYVVA app in the App Launcher.
SKYVVA Integration App Launcher
How to work with the SKYVVA Data Loader in detail, e.g. how to create your first integration and interfaces, is covered by the online tutorial and the books on the SKYVVA page.

Data transformation using the built-in graphical data mapping tool

The key feature is the graphical mapping tool with easy drag-and-drop data mapping, which is much smarter and easier to use than the standard Salesforce Data Loader. Importing data is not just pushing raw data as-is from the source file into the Salesforce application object. Most of the time the data is not correct or not available in the required date, currency, or amount format. So you need to define data mapping rules to transform and convert the raw data into the format required by the Apex runtime. Thus you can build complex business data transformations and process the raw data correctly in Salesforce.

Data mapping Tool SKYVVA Mapping

With the formula editor, you can use different built-in formulas and functions, such as VLOOKUP, which is used to link related data or simply to search other data based on a condition.


For almost any data transformation, such as date conversion, string manipulation, or decision making, there are standard expressions and functions available out of the box.

Data transformation formula

In 90 percent of use cases, it is possible to solve the data mapping and conversion needs using the built-in functions. Should a very complex business requirement arise for which no standard function is available, you can use a piece of Apex code to implement any complex business logic.
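To give readers who have not seen the tool a feel for what such mapping rules do, here is a rough, purely illustrative Python sketch of a date conversion and a VLOOKUP-style lookup. In SKYVVA these rules are configured graphically with no code; the field names, formats, and lookup table below are invented for the example:

```python
from datetime import datetime

def convert_date(raw, src_fmt="%d.%m.%Y", dst_fmt="%Y-%m-%d"):
    """Convert a raw date string (e.g. '31.01.2019') into an ISO date."""
    return datetime.strptime(raw, src_fmt).strftime(dst_fmt)

def vlookup(key, table, default=None):
    """Return the value linked to `key` in a lookup table, like a VLOOKUP formula."""
    return table.get(key, default)

# Hypothetical example: map a raw CSV row into the format an sObject expects.
region_by_code = {"DE": "EMEA", "FR": "EMEA", "US": "AMER"}
row = {"CloseDate": "31.01.2019", "Country": "DE"}
mapped = {
    "CloseDate": convert_date(row["CloseDate"]),        # "2019-01-31"
    "Sales_Area__c": vlookup(row["Country"], region_by_code),  # "EMEA"
}
```

The point is only the shape of the transformation: raw value in, formatted or looked-up value out, one rule per target field.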

Handling complex business object

One of the most annoying things for me when building an integration with the Salesforce Data Loader is that I cannot load dependent, complex hierarchical objects in one shot. I have to define different CSV files and load them one after another in multiple rounds. This is a big effort and is error-prone. The SKYVVA Data Loader allows me to put all data in one file and import it in one run. All related parents and children can easily be linked together via a foreign key using an external key, without having to know the Salesforce Id beforehand.

SKYVVA graphical data mapping
Using the graphical data mapping, you can map a multi-tier object to Salesforce sObjects, like a quote or opportunity consisting of a header and line items. Foreign keys and relationship linking between parent and child are handled automatically by the SKYVVA runtime.
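Conceptually, the runtime turns flat rows into a parent/child hierarchy keyed by the external key, so one CSV can carry both quote headers and line items. A minimal Python sketch of that grouping idea (the column and field names are hypothetical, not SKYVVA's actual data model):

```python
def build_hierarchy(rows):
    """Group flat CSV rows into parent records with nested children,
    linked by an external key instead of a Salesforce Id."""
    parents = {}
    for row in rows:
        ext_id = row["Quote_ExtId"]
        # First row for a given external key creates the parent record.
        parent = parents.setdefault(ext_id, {"ExtId__c": ext_id,
                                             "Name": row["QuoteName"],
                                             "LineItems": []})
        # Every row contributes one child line item to its parent.
        parent["LineItems"].append({"Product": row["Product"],
                                    "Qty": int(row["Qty"])})
    return list(parents.values())

rows = [
    {"Quote_ExtId": "Q-1", "QuoteName": "ACME Quote",   "Product": "Widget", "Qty": "2"},
    {"Quote_ExtId": "Q-1", "QuoteName": "ACME Quote",   "Product": "Gadget", "Qty": "5"},
    {"Quote_ExtId": "Q-2", "QuoteName": "Globex Quote", "Product": "Widget", "Qty": "1"},
]
quotes = build_hierarchy(rows)  # 2 parents, the first with 2 line items
```

This is why no Salesforce Id is needed up front: the external key alone is enough to stitch parents and children together in one pass.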

Condition-based message processing with workflow

With the Salesforce Data Loader, you import the records from the CSV file and create sObjects like accounts or contacts. All accounts and contacts will be imported, whether they fulfill a condition or not. Imagine you get a file that you have no time to clean and filter beforehand, and you want to import only the accounts from the CSV file that belong to the sales area “EMEA”. How can you do that without cleaning up your CSV input file first?
With the SKYVVA Data Loader, you can define a workflow rule to build the condition and process only the records that contain the “EMEA” region. The other records, which do not meet the condition, will be set to ‘Pending’ or can be completely ignored and removed from the message monitor. The condition can be a complex formula to fulfill any of your requirements. Such a business flow can easily be modeled with the SKYVVA workflow.
Condition-based message processing

You can combine different conditions and execute different interfaces in a chain based on the conditions. Thus, complex business rules can be implemented to control how your messages are processed.
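The logic behind such a workflow rule can be pictured as a simple predicate-based split: records that satisfy the condition are processed, the rest are parked or dropped. A hedged Python illustration (not SKYVVA code; the record fields are invented):

```python
def route(records, condition):
    """Split records: those meeting the workflow condition are processed,
    the rest are parked (like SKYVVA's 'Pending' state)."""
    processed, pending = [], []
    for rec in records:
        (processed if condition(rec) else pending).append(rec)
    return processed, pending

accounts = [
    {"Name": "ACME",   "Sales_Area__c": "EMEA"},
    {"Name": "Globex", "Sales_Area__c": "AMER"},
]
# Import only EMEA accounts; the rest stay pending, no CSV cleanup needed.
processed, pending = route(accounts, lambda r: r["Sales_Area__c"] == "EMEA")
```

Chaining interfaces, as described above, amounts to routing each split further through another predicate.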

Message monitor

Let’s talk about what happens when your data load doesn’t work as expected. Assume we are loading 50k accounts and contacts and some of them are missing. What do you do, when using the standard Salesforce Data Loader, to find which records are missing? Do you ask your end users to find out themselves by browsing the accounts and contacts one by one in the application? If that is your approach, it could take many days to find what is missing. It would be a nightmare for your end users.
With the Salesforce Data Loader, the only option is to look at the different result CSV files after the import and go line by line to check which records are missing. How much easier would it be if you could find the missing records with a mouse click, or even search by the content of the data? This is how the SKYVVA Data Loader simplifies your life: it provides a message monitor to find the missing records in minutes instead of hours or days.
Message monitor

With the monitor app, you can search the messages by different filters, for example searching for failed messages after your import.

SKYVVA Message

What to do now with the failed messages, easily recognizable by their red color? If the data is not needed, you can simply cancel it and throw it away. But suppose, for example, only the email format is incorrect because the email address is missing the ‘@’ sign. In this case, with the standard Salesforce Data Loader, you have to identify the error CSV file, correct the file, and import it again.

With the SKYVVA monitoring tool, you can edit the data, correct it, and do a reprocessing. The data is then posted and you don’t need to re-import it. This is where it helped me dramatically to save time and ease my life in situations where the data loading is not working well because the data is so dirty. And believe me, in real production cases most errors happen due to data issues, e.g. data that is not well cleaned.
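The correct-and-reprocess cycle described above boils down to: filter for failed messages, patch the payload in place, and flag the message for another processing run instead of re-importing the file. An illustrative Python sketch with an invented message structure (the statuses and fields are assumptions, not SKYVVA's actual schema):

```python
def failed_messages(messages):
    """Filter the monitor list down to failed messages,
    like clicking the 'Failed' filter in the monitor."""
    return [m for m in messages if m["Status"] == "Failed"]

def reprocess(message, fix):
    """Apply a data correction to a failed message and queue it
    for another run, instead of re-importing the whole CSV file."""
    message["Payload"] = fix(message["Payload"])
    message["Status"] = "Pending"
    return message

msgs = [
    {"Id": 1, "Status": "Completed", "Payload": {"Email": "a@x.com"}},
    {"Id": 2, "Status": "Failed",    "Payload": {"Email": "b-at-x.com"}},
]
bad = failed_messages(msgs)
# Fix the broken email inline and mark the message for reprocessing.
reprocess(bad[0], lambda p: {**p, "Email": p["Email"].replace("-at-", "@")})
```

Only the failed record is touched; the 49,998 good ones are never loaded twice, which is where the time saving comes from.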

Data Mapping SKYVVA
Visit the SKYVVA tutorial and admin guide to see more about the monitoring functionality here:


Monitoring is good for actively checking what is going wrong. But what if you don’t always have time to sit in front of your laptop and click the refresh button of the monitor app? Wouldn’t it be cool to always be informed when something goes wrong in your integration? SKYVVA comes with a built-in alerting functionality which informs you by sending an alert mail.
This means you don’t need to actively watch the monitor; just read your mail when you get an alert. From the mail, you can jump right to the failed message record and do the data correction and reprocessing as mentioned above.

Supporting Different Data format

Another pain point, and a reason I was looking for an alternative data loader, is that for some customer projects I did not get a CSV file. The data was stored in an XML file, and in the past I first had to find a tool to convert the XML into a CSV file. This is not easy to do and is a waste of time. Fortunately, the SKYVVA Data Loader supports different file formats.

Supporting different data formats

All the facts at a glance

As you have seen in this article, the different features which are missing in the standard Salesforce Data Loader but needed for an enterprise-class data loader have been described briefly. You can expect more detailed insights from our next series, which deep-dives into the different functionalities. To make it easy for you, the table below shows the full fact sheet.

Success Key Facts SKYVVA

Most of the functionality you need in your daily work for data migration, data mapping, and import is provided additionally by the SKYVVA Data Loader. Of course, you can still use the standard Salesforce Data Loader as before. But what to do if you need complex transformation logic and data mapping? What to do if you need condition-based data import?

And what I love, because it reduced my effort and time dramatically, are the monitoring, reprocessing, and error-handling functionalities of the SKYVVA Data Loader. Once you have worked with them, you never want to miss them again.

How to get the App for data mapping?

To install this application, use this link: Click here to get the installation link
For beginners, very easy tutorials for configuration and a user guide are provided on their website in a well-organized way. You can check how to use the SKYVVA Data Loader from:

Salesforce Integration Cloud Suite Solution

With the SKYVVA Data Loader, data import as you know it from the standard Salesforce Data Loader becomes easy. With its low-effort integration solution, you can save time, which results in more productivity and accuracy in your development journey. Especially when we are talking about integration, its solutions are very useful for any organization working with complex business requirements and an integration platform.

Salesforce Integration Cloud Suite Solution

Everything is done inside Salesforce, very fast and efficiently. This is a true cloud solution for FREE.


In this blog, I have shared my findings and experience with the SKYVVA Data Loader. Note that the SKYVVA Data Loader I use here is one component of the SKYVVA product suite family. I am curious to explore the other components, like Data Connect, from the SKYVVA product suite, and I will let you join me on my integration journey.
So stay tuned for this amazing integration journey!
Release Notes SKYVVA Version Winter 19

Release Notes: Winter ´19 SKYVVA Integration Cloud – Official Version 2.41.3

The Winter ´19 release can be updated via the following links:

Bugs Fixed

The Winter ´19 release is now available in SKYVVA version 2.41.3. Here are the highlights of the release in a short overview:

  • Error message always shown after deleting a Message Type
  • Problem with deleting messages in Message Monitoring
  • SearchService API response JSON is not syntactically correct
  • The license key cannot be fully copied for other users
  • Message Board on manual load must link to one interface
  • Incorrect date/time shown for Last Run Time in the Queues tab of the CDC Control Board
  • XML payload generation for adapter SFDC2SAPPI does not follow the WSDL structure
  • Wrong order in Workflow
  • Edit button of adapter property not working
  • Errors with Export Data on the Integration Details page
  • Name of the SuppressNullVaule property is not correct
  • Visualforce URL format change for organizations with MyDomain deployed
  • Error in Query Editor
  • Added help text for the fields

Release Winter ´19 SKYVVA Agent – Official Version 1.48

With the Winter ´19 release, the following enhancements are available in the new SKYVVA Integration Agent version 1.48:

New Features


  • Enhance the Agent to be an HTTP server
  • Server configuration in the Agent UI
  • Support stored procedures for Oracle
  • Use the field “Package Size” on interface level with the Agent
  • Use SQLite for the Agent database and move all file-based configuration to SQLite
  • Automatic move to the SQLite database
  • Move the crontab file to the database
  • Test Connection for every adapter from Salesforce
  • Agent with offline mode and online mode
  • Change the Agent scheduler to use the crontab in the database
  • Agent supports real bulk mode like the Salesforce Data Loader
  • New Export and Import buttons for the properties file in Integration Properties Setup
  • Streaming API runs as a background service
Siemens Industry SAP Consulting

Siemens Industry

SAP consulting for Siemens Industry Automation and Drive Technologies

With the central product master data process PMD, Siemens Industry Automation and Drive Technologies are establishing a basis for the worldwide supply of products to their sales regions. SAP Exchange Infrastructure (SAP-XI) is used as an integral component for distributing and synchronizing the product master data.

Apsara Consulting has supported us for many years in ensuring the functionality and quality of the interfaces supplying master data to the Siemens regions worldwide. This central, complex, and business-critical interface between PMD and PMI is the backbone of the international product data supply of the Industry Automation and Drive Technologies divisions within the Siemens group. Apsara Consulting plays a decisive role in its smooth operation, for example in the annual EDI release change and in the ongoing adaptation and further development of these interfaces.

Bernd Käfferlein, Service Manager PMD
Siemens AG Division Industry Automation

Company facts:

Name: Siemens Industry
Location: Erlangen
Industry: production, transport, and building technology
Revenue: EUR 34,869 million (fiscal year 2010)
Employees: 204,000 (as of 30 Sep 2010)

Globalization and intense competition on the world market demand a fast and efficient supply of the sales regions with up-to-date and correct product master data, e.g. for the quotation process. Large corporations can no longer afford to miss quotation deadlines due to outdated product data, wrong prices, and the like, which indirectly leads to lost orders. What is needed, therefore, is a fast and efficient infrastructure for supplying product master data to the sales regions worldwide.

Highlights of the SAP consulting

• Use of Unicode as the basis for the internationalization of the product master data

• Central automatic control of the distribution via SAP-XI

• Complex, intelligent mapping for the dynamic generation of region-specific data

• Safeguarding of the critical annual EDI release change

• Switch from a 14-day master data supply cycle to a daily update

Main benefits for the customer

• Ensuring the daily, trouble-free supply of master data to the worldwide sales regions

• Simplification of processes and applications through harmonization of the product data, leading to cost reduction

• Product master data is thus available group-wide:

   1. up to date
   2. complete
   3. consistent and
   4. correct.

• Improved data quality and timeliness

The decision for SAP consulting by Apsara Consulting GmbH

• Competence, experience, and a high degree of specialization in SAP NetWeaver XI/PI

• In-depth expertise and extensive SAP NetWeaver XI/PI know-how across the complete product lifecycle (XI 2.0 – PI 7.3)

• Reliability and quality in the planning, implementation, and support of business-critical interfaces

• Best-practice and solution approaches from numerous successfully completed customer projects

• Experience with master data replication, synchronization, and harmonization

Existing system landscape

SAP-based application landscape with ERP solutions on Linux hardware platforms, the SAP NetWeaver XI integration platform as the central hub for supplying the product data, and the PMI (Product Master Data Integration) Java platform.

A fast and efficient infrastructure for supplying product master data to the sales regions


Apsara Consulting Siemens Industry Success Story SAP NetWeaver

PMD process landscape

The PMI (Product Master Data Integration) service, together with the corporate services CIP2SAP, EDI Service, and SPIRIDON Global, provides the technical infrastructure for supplying the Siemens sales regions (clusters) with all the commercial product master data from headquarters required to adequately support all relevant sales and logistics processes. As an integral part of this technical infrastructure, Siemens Industry Automation and Drive Technologies use SAP-XI as the technical middleware, or enterprise service bus. SAP-XI enables the data exchange between the R/3 systems in the individual business units and PMD (Product Master Data), and the further distribution to PMI.

“The annual EDI release change is highly business-critical. It requires adaptations in the complex XI mapping, the completion of all change requests from the business departments, and coordination between the teams involved. These activities must be completed on time and successfully. Apsara Consulting has performed this task reliably and successfully in its SAP consulting for several years.”

Norbert Faltermeier, Service Manager PMI
Siemens AG Division Industry Automation

Apsara Consulting GmbH's objectives for the SAP consulting

To ensure a flexible distribution of the master data to the regions, a dynamic mapping to the region-specific organizational data is used consistently. This allows new regions to be supplied with product master data immediately, without having to carry out extensive and lengthy changes.

The living character of the product data requires changes and adaptations, and is reflected in the annual EDI release change. Change requests must be implemented, tested, accepted, and made available in time for the release change date. This date is extremely important and requires precise coordination of the partners and services involved.

The competitive pressure created a need for daily up-to-date master data that could not be met, since the previous data supply cycle only ran every two weeks. To address this, the so-called PMI export was automated and the master data update was switched to a daily cycle. This was made possible by the introduction of the SICONPlugIn4XI.


Given the challenge of keeping the master data reliably up to date and distributing it, a mature technology designed for this purpose is necessary. SAP-XI is therefore used as the technology platform, which offers the possibility of developing standardized interfaces and thus ensuring a homogeneous exchange of information and messages.

Whereas smaller adaptations and changes used to be carried out manually, error-prone and time-consuming, the SAP-XI mapping engine enables flexible adaptation to the requirements of the task. In addition to Java and ABAP mapping, SAP-XI offers graphical mapping for creating flexible and complex conversions with little effort. For example, the IDoc “MATCOND” was converted via graphical mapping into a specific Siemens SIDOC format and handed over to the SICONPlugIn4XI for further processing. The conversion into the final EDI format was then handled by the SICONPlugIn4XI.

There was a recurring problem: it was previously not possible to detect when a master data transfer was actually finished. The reason was the missing status information for messages already handed over to the file adapter. As a solution, the adapter's built-in option for custom code extensions via so-called module exits was implemented. This made it possible to build in the logic to determine when a transfer is logically or physically finished, which was the basis for the complete automation of the master data distribution for the regions.

We count on the continued SAP consulting & support of Apsara Consulting GmbH

PMD is, now and in the future, the backbone for supplying master data to the regions within the Siemens group. Keeping it running without disruption is the top priority. Annual EDI release changes, bug fixing, and maintenance demand professional skills, know-how, and process knowledge, as well as knowledge of the applications involved, in order to continue operating the system without disruption. The process knowledge and procedures gained through many years of cooperation with Apsara Consulting, and its excellent SAP-XI know-how, are indispensable here.

Legal and corporate requirements keep generating new demands. Likewise, operating PMD yields new insights and reveals potential for improvement, leading to new and better processes. One example is the new interface for image references, which arose from practical requirements. Such change requests will have to be implemented in the future as well. The interfaces are subject to constant extension with new functionality. Apsara Consulting was and is reliable in implementing these tasks.

XI 3.0 has now been on the market for five years, and further releases with SAP-PI 7.1 and currently 7.3 are available. Due to the many improvements, stabilizations, and new features, e.g. the Advanced Adapter Engine Extended, it makes sense to migrate to SAP-PI 7.3. The expiring maintenance of SAP-XI 3.0 also makes an upgrade to a current release inevitable. With its extensive specialized SAP-PI know-how and its experience migrating other customers from XI 3.0 to PI 7.x, we see Apsara Consulting as our partner.

Apsara Consulting GmbH – we stand for a customer-oriented partnership

Apsara Consulting is an independent, internationally active SAP consulting company in the SAP NetWeaver environment, and supports its customers as an SAP NetWeaver technology partner.

We recognized the great importance of the SAP NetWeaver integration and application platform very early and specialized in it. That is why we are experts in this field today and can draw on a wealth of experience that other companies first have to build up. Our experience with SAP NetWeaver topics enables us to find the right solution for your business success. We advise you comprehensively and offer you the support you need, from a single source. Benefit from our proven technology and implementation expertise.

Consultants with proven management and technology expertise support you on site with competent SAP consulting around the SAP NetWeaver integration and application platform. Apsara Consulting can accompany your company from goal setting and strategy definition, through implementation, operation, and maintenance, to migration across the entire lifecycle of the SAP NetWeaver platform. Our goal is to make SAP NetWeaver a success story for your company.
