Extending Salesforce Search with Azure Search using Logic Apps

Hi! This time we want to bring the power of Azure Search inside the Salesforce platform, with the objective of indexing Salesforce entities directly.

Why do this? Isn't Salesforce search enough? Of course Salesforce's search capabilities are great, but we can make them even greater by adding things like fuzzy search, suggestions, etc. that Azure Search offers out of the box.

How can we achieve this? With a minimal project we can quickly demonstrate how to add this functionality with ZERO impact on existing Salesforce entities (no additional triggers or changes to your existing code), only a couple of additional Visualforce pages if you want them.

The idea is to use Azure Logic Apps to detect record changes in Salesforce objects and refresh the Azure Search index when that happens.

Step 1: Create the Azure Search Index Definition

We can use the REST API or the Azure portal directly; here is the raw REST API call:

POST https://yoursearchservice.search.windows.net/indexes?api-version=2015-02-28
api-key: [yourApiKey]
Content-Type: application/json

{
  "name": "myindex",
  "fields": [
    {"name": "id", "type": "Edm.String", "key": true, "searchable": false},
    {"name": "FirstName", "type": "Edm.String"},
    {"name": "LastName", "type": "Edm.String"},
    {"name": "Email", "type": "Edm.String"}
  ],
  "suggesters": [
    {
      "name": "sg",
      "searchMode": "analyzingInfixMatching",
      "sourceFields": ["LastName"]
    }
  ]
}

As you can see, a really simple example with just the Salesforce Id, FirstName, LastName and Email indexed (we plan to index Contact objects). We add the suggester on LastName, so you can test that even when misspelling a surname, Azure Search will still give you a good tip on the person you wanted to find.
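To see the suggester in action, here is a hedged Python sketch that builds the suggestions request URL (the service and index names come from the sample above; the helper function and the misspelled query are mine):

```python
from urllib.parse import urlencode

def suggest_url(service, index, text, fuzzy=True):
    """Build the Azure Search suggestions URL that exercises the "sg" suggester."""
    qs = urlencode({
        "search": text,
        "suggesterName": "sg",
        "fuzzy": str(fuzzy).lower(),
        "api-version": "2015-02-28",
    })
    return "https://%s.search.windows.net/indexes/%s/docs/suggest?%s" % (service, index, qs)

# Even with a typo ("Smtih"), fuzzy matching should still suggest "Smith".
print(suggest_url("yoursearchservice", "myindex", "Smtih"))
```

The actual request also needs the api-key header, exactly as in the index creation call above.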

Step 2: Create an Azure Logic App

Here is another simple process to put in place, with some minor complications due to the fact that the Azure Search REST API does not yet expose a good Swagger definition, so we have to go direct with HTTP. We first log in to Salesforce with the Salesforce connector, then we pick Contact as the object to watch.


Here is the code view of the Azure Logic App:

{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "actions": {
    "HTTP": {
      "inputs": {
        "body": "{'value': [{'@search.action': 'upload','id': '@{triggerBody()?['Id']}','FirstName': '@{triggerBody()?['FirstName']}','LastName': '@{triggerBody()?['LastName']}','Email': '@{triggerBody()?['Email']}' }]}",
        "headers": {
          "Content-Type": "application/json",
          "api-key": "yourapikey"
        },
        "method": "POST",
        "uri": "https://yoursearchservice.search.windows.net/indexes/myindex/docs/index?api-version=2015-02-28"
      },
      "runAfter": {},
      "type": "Http"
    }
  },
  "contentVersion": "",
  "outputs": {},
  "parameters": {
    "$connections": {
      "defaultValue": {},
      "type": "Object"
    }
  },
  "triggers": {
    "When_an_object_is_modified": {
      "inputs": {
        "host": {
          "api": {
            "runtimeUrl": "https://logic-apis-westeurope.azure-apim.net/apim/salesforce"
          },
          "connection": {
            "name": "@parameters('$connections')['salesforce']['connectionId']"
          }
        },
        "method": "get",
        "path": "/datasets/default/tables/@{encodeURIComponent(encodeURIComponent('Contact'))}/onupdateditems"
      },
      "recurrence": {
        "frequency": "Minute",
        "interval": 1
      },
      "splitOn": "@triggerBody()?.value",
      "type": "ApiConnection"
    }
  }
}

So what are we doing here?
Every minute we check whether any Contact has changed, and if so we update the Azure Search index with the new information.
Note the splitOn option we get out of the box with this kind of trigger: if more than one record changes during the minute, it invokes the Azure Search action automatically for each of those records without us doing anything more!
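For clarity, the body the HTTP action posts for each changed record is a standard Azure Search indexing batch with a single "upload" action; here is the same payload sketched in Python (field names match the Step 1 index, the sample contact is invented):

```python
import json

def index_batch(contact):
    """Build the Azure Search batch that uploads one changed Contact."""
    return json.dumps({
        "value": [{
            "@search.action": "upload",
            "id": contact["Id"],
            "FirstName": contact["FirstName"],
            "LastName": contact["LastName"],
            "Email": contact["Email"],
        }]
    })

print(index_batch({"Id": "003x000001", "FirstName": "Ada",
                   "LastName": "Lovelace", "Email": "ada@example.com"}))
```

The "upload" action inserts the document or overwrites it if the key already exists, which is exactly what we want for updated records.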

Step 3: Enjoy it!

Now you can build any Visualforce page you like to call the Azure Search API and leverage the incredible features this technology offers you!

Try the Azure Search suggestions API with some Ajax to propose suggestions while the user types; your call center users will love it!

Tip: you can call Azure Search using plain HTTP requests with the API key in the header, and use the automatic JSON deserialization libraries available in Apex to convert the responses into objects that you can map to Visualforce tags.
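To illustrate how small the deserialization work is, here is a Python sketch parsing a search response of the shape returned for our index (the sample document is invented); the Apex version with its JSON parsing classes is analogous:

```python
import json

# Shape of an Azure Search query response for the index defined in Step 1.
sample_response = """
{
  "value": [
    {"@search.score": 1.0, "id": "1", "FirstName": "John",
     "LastName": "Smith", "Email": "john.smith@example.com"}
  ]
}
"""

# The hits live in the "value" array; each hit carries the indexed fields.
hits = json.loads(sample_response)["value"]
for h in hits:
    print(h["LastName"], h["Email"])
```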


The Customer Paradigm Executive

Fun article this time!

Contact me or comment if you want to apply 😂😂😂 for the role!

Customer Paradigm Executive

Candidates must be able to mesh e-business users and grow proactive networks and evolve 24/7 architectures as well.

This position requires you to syndicate innovative infomediaries and morph them with viral metrics in order to synergize back-end convergence.

The objective is to deliver brand wearable eyeballs and architect granular eyeballs deployed to B2B markets incentivizing leading-edge e-business and disintermediating 24/7 relationships.

You will also have to aggregate revolutionary communities and whiteboard end-to-end systems in order to orchestrate dynamic convergence and effectively monetize efficient interfaces that can morph scalable e-markets.

Send Emails with Adobe Campaign Marketing Cloud (Neolane)

Hi! This time, instead of downloading data from Neolane or updating recipients in it, we want to use Neolane as an email cannon, leveraging its powerful template engine to manage the email look & feel while deciding on the fly, using the Neolane API, the targets of our mass email campaign.

So the use case is the following: define an email template in Neolane with some field mappings, then leverage the Neolane API to send email using this template while defining the email recipients, and the contents of the mapped fields, externally.

According to the official Adobe documentation this can be done using the Neolane Business Oriented APIs (we looked into the Data Oriented APIs in our previous articles), as specified here:


Unfortunately the documentation is not really clear or complete, and I really had to dig into Adobe logs, error codes and SOAP responses to get this working properly; here is some sample code that can help you.

The code is based on the sample provided in the Adobe documentation (external file mapping, with the data coming from the CDATA section inside the delivery XML tag structure).

Here is the C# code:

 // Your Neolane SOAP endpoint (placeholder value, adapt to your instance).
 string adobeApiUrl = "https://yourNeolaneServer/nl/jsp/soaprouter.jsp";

 string sessionToken = "Look at the other Neolane code samples on how to retrieve the session token";
 string securityToken = "Look at the other Neolane code samples on how to retrieve the security token";

 string scenario = "Write here the internal name of the delivery template";

 HttpWebRequest reqData = (HttpWebRequest)WebRequest.Create(adobeApiUrl);

 reqData.ContentType = "text/xml; charset=utf-8";
 reqData.Headers.Add("SOAPAction", "nms:delivery#SubmitDelivery");
 reqData.Headers.Add("X-Security-Token", securityToken);
 reqData.Headers.Add("cookie", "__sessiontoken=" + sessionToken);
 reqData.Method = "POST";

 string strWriteHeader = "<?xml version='1.0' encoding='ISO-8859-1'?>" +
  "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" xmlns:urn=\"urn:nms:delivery\">" +
  "<soapenv:Header/>" +
  "<soapenv:Body>" +
  "<urn:SubmitDelivery>" +
  "<urn:sessiontoken>" + sessionToken + "</urn:sessiontoken>" +
  "<urn:strScenarioName>" + scenario + "</urn:strScenarioName>" +
  "<urn:elemContent>";

 string strWriteRecipientBody = "<delivery> " +
   "<targets fromExternalSource=\"true\"> " +
   "<externalSource><![CDATA[MsgId|ClientId|Title|Name|FirstName|Mobile|Email|Market_segment|Product_affinity1|Product_affinity2|Product_affinity3|Product_affinity4|Support_Number|Amount|Threshold " +
   "1|000001234|M.|Phulpin|Hervé|0650201020|herve.phulpin@adobe.com|1|A1|A2|A3|A4|E12|120000|100000]]></externalSource>" +
   "</targets> " +
   "</delivery>";

 string strWriteFooter = "</urn:elemContent>" +
  "</urn:SubmitDelivery>" +
  "</soapenv:Body>" +
  "</soapenv:Envelope>";

 string bodyData = strWriteHeader + strWriteRecipientBody + strWriteFooter;

 byte[] byteArrayData = Encoding.UTF8.GetBytes(bodyData);

 // Set the ContentLength property of the WebRequest.
 reqData.ContentLength = byteArrayData.Length;
 // Get the request stream.
 Stream dataStreamInputData = reqData.GetRequestStream();
 // Write the data to the request stream.
 dataStreamInputData.Write(byteArrayData, 0, byteArrayData.Length);
 // Close the Stream object.
 dataStreamInputData.Close();

 var responseData = reqData.GetResponse();

 Stream dataStreamData = responseData.GetResponseStream();
 // Open the stream using a StreamReader for easy access.
 StreamReader readerData = new StreamReader(dataStreamData);
 // Read the content.
 string responseFromServerData = readerData.ReadToEnd();

 // Clean up the streams and the response.
 readerData.Close();
 dataStreamData.Close();
 responseData.Close();

 return responseFromServerData;
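The pipe-delimited block inside the externalSource CDATA section is easy to generate from structured data; here is a minimal Python sketch (the helper name is mine; the demo columns are a subset of those used in the C# sample above):

```python
def build_external_source(columns, rows):
    """Build the pipe-delimited text for the <externalSource> CDATA section:
    a header line with the column names, then one line per recipient."""
    lines = ["|".join(columns)]
    for row in rows:
        lines.append("|".join(str(row[c]) for c in columns))
    return "\n".join(lines)

# Small demo with a subset of the columns from the sample above.
print(build_external_source(
    ["MsgId", "ClientId", "Name", "Email"],
    [{"MsgId": 1, "ClientId": "000001234", "Name": "Phulpin",
      "Email": "herve.phulpin@adobe.com"}],
))
```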

Fintech for real in your pocket!

You have probably heard the word “fintech” thousands of times, floating around in newspapers, forums, websites, etc. Last month, before going to the U.S. for a 7-day trip, I read an interesting article about a young fintech company called Revolut, and I decided to install the app on my phone to play with it a bit… well, from that moment until I came back home from the U.S. I made ALL my payments with Revolut. If you are interested in the details, here is the full story…

Day 1

Curiosity killed the cat, as they say, so I installed the app with all the precautions in the world; in other words, I was ready to uninstall it at the first sign of it requiring my credit card details, bank account or anything like that (paranoia mode on). Well, the app just asked for my cell phone number, an SMS to confirm it was actually me, and some basic details (name, address, city, and country), and in less than a second I saw appearing on my screen a… brand new Mastercard… WOW!!

I mean a real one! Just to verify that all of this was real, I went to my Amazon Web Services account and set up this card as a payment method, and it worked like a charm! I immediately received a notification saying that 0 EUR had been taken from my account, which actually contained 0 EUR.

But let's step back for a moment: why am I talking about EUR if one of the main reasons I decided to try Revolut was my trip to the U.S.?

Well, Revolut gives you one account linked to your new Mastercard, and this account is divided into 3 subaccounts: one in EUR, another in Dollars and a final one in Pounds, and you can exchange money from one to another with a fee of actually ZERO! Yes, they apply a very convenient exchange rate that is actually the same one you see on the exchange market (to be fair, I verified several times that there is a minor discrepancy, but we are talking of mere cents over hundreds of dollars exchanged; we will discuss this in detail later).

OK, and what happens if I have 100 EUR in my Revolut account and I pay for a Shake Shack burger (if you go to the U.S. you have to try it, they are AMAZING!!)?

Well, Revolut will automatically convert, on the fly, the right amount of EUR into Dollars (again at the actual exchange rate, no fees) and pay for the burger!

All this on-the-fly conversion with no fees is also valid for a bunch of other currencies (see the Revolut website), so you can, for example, pay in CHF (Swiss francs) in Switzerland and never worry about exchange fees.

Finally I saw the option: do you want to receive your Revolut Mastercard at home (for free!)? Hell yes!

Day 3

After a couple of days I received the shiny new Mastercard, and the packaging was also incredibly well done (Apple style).

So, after having a new digital Mastercard, Amazon-approved, and also a new physical card delivered at home, I decided that I could put a minimum of trust in the platform and start putting some money into the Revolut account in order to cut all the usual exchange fees (usually everybody gets hit by these fees twice: a bad exchange rate that favors the bank, plus an additional fee of around 1-2% of the transaction value).

Now, here is some bad news:

  • You can top up your Revolut account only by debit card or bank transfer, so NO CREDIT CARD allowed; this is because credit card fees are too high for Revolut
  • You can transfer only EUR, Dollars or Pounds to your Revolut account, no other currencies

In my case I opted for the bank transfer (SEPA transfer with split of cost), and probably due to my super expensive bank I had to pay a 5 EUR commission on the transfer.

Another issue during the bank transfer was that my bank wrote the reference number of the payment (a code that links your transfer to your Revolut account) in a field that was not recognized by the Revolut banking system (the bank behind the scenes that manages your money is Barclays, and they have 3 real bank accounts for your 3 subaccounts in Revolut).

A call with Revolut support sorted out the reference number problem and I was able to see money in my Revolut account.

Day 4-8

Here is the experience I had using the Revolut app and the Revolut Mastercard during my trip in the U.S.

The Good:

  • Every transaction arrives immediately as a notification on your phone
  • Card accepted more or less everywhere (the hotel did not accept it, probably because they wanted to block some money not for a real transaction but just in case there were additional expenses not already paid in advance)
  • You can also withdraw money from ATMs, but be careful: Revolut will not take any fee, but several U.S. banks do (they took 3 Dollars on a 250 Dollar withdrawal)
  • The exchange rates/fees really are the ones advertised

The Bad:

  • Refunds arrive in your Revolut account with some days of delay
  • If you pay a restaurant bill with an additional tip (quite common in the U.S.) you will see the same transaction twice in your Revolut account: first the one without the tip, after some days the one with the tip, and after another week or so the transaction without the tip is finally deleted automatically (but at one point in time your account will be charged for both!)
  • Invalid transactions (like the one I had in a U.S. metro station paying for a ticket that never came out of the machine) will still take money from your account, and only after a while (a week or so) will you see your money come back (transaction deleted)
  • Support (but this is clearly stated on the Revolut support page) only works during U.K. working hours, so if you are in the U.S. nobody will actually answer you during the day
  • The support chat does not really work well on slow connections (like some Wi-Fi on planes).

Final Recommendation

Once you are aware of the tricky things explained above, you know that sooner or later all your money will be safe and that the system works; but for a first-time user like me it was a kind of heart attack every time I watched money disappear, not arrive, or get taken twice.

So, in a nutshell, I recommend it, especially for travel purposes: on 1000 Dollars spent on vacation I estimate 30-45 Dollars saved on exchange fees, and 30-45 Dollars is a free lunch that you were paying to your bank.

Indexing SQL Data Warehouse with Azure Search and consume it with Salesforce

SQL Data Warehouse is one of the brand new PaaS services that Microsoft offers as part of its Cortana Analytics suite. If we want a quick way to describe it, we can say that it is the Azure equivalent of Amazon Redshift, with some differences of course, but in essence it is a cloud MPP database that can be scaled quickly on demand, can ingest terabytes of data and, leveraging its multi-node architecture, can process/query this data in seconds.

One of the limitations of these MPP systems is that they cannot handle a high number of concurrent queries; usually the maximum is around 20/30, and once that number is reached new queries are parked in queues (of course the entire query queue processing logic is more complicated than this, but that is the essence). Now, this is fine if you plan to expose your data warehouse only to a few analysts, but if you want to make this data available also to other endpoints/consumers, it becomes a problem. An example: imagine that you store in your DWH all your customer data/interactions/orders collected across multiple systems, and you want your call center agents to access this info in order to properly support your customers. So you want a super quick response time from a service called by several clients searching customer data, and while this is a task that a classic relational database can perform with no problems, as said, it represents a problem for an MPP database.

Luckily the Azure PaaS offering has multiple ways to solve this problem; one is to use Azure Search to index the SQL Data Warehouse customer data and offer the call center systems the Azure Search API as the service providing the search capabilities needed.

Azure Search also has the Indexer functionality, which with pure configuration can automatically index a portion of a database. Looking at the Azure Search documentation it seems that SQL Data Warehouse is not supported, but trying to use the Azure SQL Indexer I had no problems performing the configuration task. Following the mentioned documentation, I was able to schedule the indexing process, and using the High Water Mark Change Detection policy (I had a field with a timestamp) I was able to process the data progressively as it was updated/added to the DWH.
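As a sketch of what that configuration amounts to, here are the rough REST payloads involved (the data source name, table name and high-water-mark column below are illustrative, not from my actual setup):

```python
import json

def datasource_payload():
    """Data source pointing at SQL Data Warehouse, reusing the Azure SQL
    indexer, with high-water-mark change detection on a timestamp column."""
    return {
        "name": "dwh-customers",
        "type": "azuresql",  # the Azure SQL indexer works, as described above
        "credentials": {"connectionString": "<your DWH connection string>"},
        "container": {"name": "CustomerTable"},
        "dataChangeDetectionPolicy": {
            "@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
            "highWaterMarkColumnName": "LastModified"
        }
    }

print(json.dumps(datasource_payload(), indent=2))
```

You would POST this to the data sources endpoint, then create an indexer referencing it and the target index; on each run only rows with a higher `LastModified` than the stored watermark are processed.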

If you need to launch on-demand indexing, you can leverage the Azure Search REST API and call the Run Indexer operation; combined with an Azure Logic App, this lets you update the index every time you need according to your workflow design (event based, depending on another schedule, etc.).

Consuming this service from call center software like Salesforce Service Cloud can be done using callouts to the Azure Search REST API; a good example can be found here, just change the JSON generation part so that instead of making queries on Salesforce accounts you actually call the Azure Search REST API. You actually have to make two calls: one for autocomplete (call the suggestions API), which provides suggestions (so even if you make a typo the fuzzy search will help you find the right customer), and one to the search API, which gives you the customer detail data coming from the index.

One important thing to note: Salesforce callouts have concurrency limits, but these limits can be bypassed using some tricks, as described here.



Salesforce Integration Tips & Tricks

Hi! I've been working with Apex coding a bit lately, and I've tried to integrate Salesforce with external systems using REST APIs.

My use case was the following: when creating a contact, call an external API, try to find some contact-related info there, then update the contact itself with the external key and some other info coming from the external system.

This seems an easy task, but it is not as easy as it seems…

In order to perform a change on a record that you are creating/updating, Salesforce offers the trigger functionality; however, you cannot perform an HTTP request inside a trigger (Salesforce rightly says that external calls cannot respond in a quick and predictable way, so they cannot live inside a trigger call).

The @future annotation comes to the rescue: simply marking the synchronous method with it makes the method asynchronous, executed outside the trigger context.

This however brings another interesting challenge: if we execute our call outside the trigger scope, how are we supposed to change the records that are currently being updated?

This can be solved by passing the Ids of the records being changed to the asynchronous method and applying the change to them outside the trigger context.

But now we have the “final boss” to be beaten 🙂

This is what happens if we put the ideas described above in place:

  1. User or Process changes a record
  2. This makes the trigger react
  3. The async call is made
  4. In the async call , since we are outside the trigger context we call the update command to change the records
  5. The trigger reacts again
  6. Infinite loop!

The Salesforce platform is smart enough to detect the infinite loop and stop you, but it is clear that we have to fix this.

The trick here is to detect who is making the trigger react: if the trigger is activated by a user/process we call the async method; if the trigger is activated by an async method we do nothing, interrupting the infinite loop.

Here is some sample code showing how the trigger is done:

trigger ExternalIDTrg on Contact (after insert, after update) {
    System.debug('Trigger Started:');
    for (Contact a : Trigger.new) {
        // Only call out when the trigger was NOT fired by an async context,
        // otherwise we would loop forever.
        if (!System.isFuture() && !System.isBatch()) {
            // Call the @future method (the helper class name is illustrative).
            ExternalIDHelper.UpdateContact(a.Id);
            System.debug('Future Call Done');
        }
    }
}

As you can see, we use System.isFuture() to detect whether the call is coming from a future method. We then call a method that matches a Contact in the external system using its email, and if we find it we write the Id into the ExternalID field in Salesforce.


public class ExternalIDHelper {
    @future(callout=true)
    public static void UpdateContact(ID ContactId) {
        String tempUID = null;
        for (Contact a : [
            Select Id, Email, ExternalID__c
            from Contact
            where Id = :ContactId
        ]) {
            if (a.ExternalID__c == null && a.Email != null) {
                System.debug('Contact Email: ' + a.Email);
                // Call the external REST API here to look up the contact
                // by email and obtain its external key into tempUID.
                System.debug('tempUID: ' + tempUID);
                if (tempUID != null) {
                    a.ExternalID__c = tempUID;
                    System.debug('ExternalID: ' + a.ExternalID__c);
                    update a;
                }
            }
        }
    }
}

Notice the @future(callout=true) annotation, which allows us to make an HTTP request inside the method.

Integrating Azure API App protected by Azure Active Directory with Salesforce using Oauth

This time I had to face a new integration challenge: in Salesforce Service Cloud, in order to offer a personalized service to customers requesting assistance, I added a call to an Azure app that exposes all the information the company has about the customer across all touch points (web, mobile, etc.). Apex coding is quite straightforward when dealing with simple HTTP calls and the interchange of JSON objects; it becomes more tricky when you have to deal with authentication.

In my specific case, the token-based authentication I had to put in place is composed of the following steps:

  1. Identify the URL that accepts our authentication request and returns the authentication token
  2. Compose the authentication request with all the needed parameters that define the requester identity and the target audience
  3. Retrieve the token and model all subsequent requests to the target app, inserting this token in the header
  4. Nice to have: cache the token in order to reuse it across multiple Apex calls, refreshing it before it expires or on request.

Basically all the info we need is contained in this single Microsoft page.

So, before even touching a single line of code, we have to register the calling and called applications in Azure Active Directory (this gives both of them an “identity”).

This step should already be done for the Azure API app if you protected it with AAD using the portal (write down the client id of the AAD-registered app), while for the caller (Salesforce) just register any simple app you want in the AAD.

When you do this step, write down the client id and the client secret that the portal gives you.

Now you need your tenant id, specific to your Azure subscription. There are multiple ways of retrieving this parameter, as stated here; for me the PowerShell option worked.

Once you have these 4 parameters you can build a POST request in this way:

Endpoint: https://login.microsoftonline.com/<tenant id>/oauth2/token


Content-Type: application/x-www-form-urlencoded

Request body:

grant_type=client_credentials&client_id=<client id of the Salesforce app>&client_secret=<client secret of the Salesforce app>&resource=<client id of the Azure API app>
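Since the body must be form-urlencoded, here is a quick Python sketch of how it is composed (the placeholder values obviously stand in for your real parameters):

```python
from urllib.parse import urlencode

# The four parameters gathered in the steps above (placeholders here).
params = {
    "grant_type": "client_credentials",
    "client_id": "<client id of the Salesforce app>",
    "client_secret": "<client secret of the Salesforce app>",
    "resource": "<client id of the Azure API app>",
}

# This is the body you POST to
# https://login.microsoftonline.com/<tenant id>/oauth2/token
# with Content-Type: application/x-www-form-urlencoded.
body = urlencode(params)
print(body)
```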

If everything goes as expected you will receive a JSON response shaped like this (placeholder values instead of real ones):

{
  "token_type": "Bearer",
  "expires_in": "<token lifetime in seconds>",
  "expires_on": "<expiry timestamp>",
  "resource": "<client id of the azure API app>",
  "access_token": "<the access token itself>"
}
Now you are finally ready to call the Azure API app endpoint; in fact, the only additional thing you have to do is add a header to the HTTP request with the following content:

Authorization: Bearer <access_token coming from the previous request> 

This should be sufficient to complete our scenario (by the way, do not forget to add https://login.microsoftonline.com and https://<UrlOfYourApiApp> to the authorized remote URLs list of your Salesforce setup).

Using the expires data of the token you can figure out how long it will last (usually 24h) and set up your cache strategy.
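As a sketch of such a cache strategy (the 60-second refresh margin is my own choice, and fetch_token stands in for whatever call retrieves a fresh token from the endpoint above):

```python
import time

class TokenCache:
    """Cache a bearer token and refresh it shortly before it expires."""

    def __init__(self, fetch_token):
        self._fetch = fetch_token  # callable returning (token, expires_in_seconds)
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Refresh slightly before the real expiry to avoid using a stale token.
        if self._token is None or time.time() > self._expires_at - 60:
            token, expires_in = self._fetch()
            self._token = token
            self._expires_at = time.time() + expires_in
        return self._token

cache = TokenCache(lambda: ("sample-token", 3600))
print(cache.get())
```

In Apex the same idea can be implemented with a custom setting or Platform Cache holding the token and its expiry timestamp.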

Happy integration then!


Here are some Apex snippets that implement what was explained.

public with sharing class AADAuthentication {
    private static String TokenUrl = 'https://login.microsoftonline.com/putyourtenantid/oauth2/token';
    private static String grant_type = 'client_credentials';
    private static String client_id = 'putyourSdfcAppClientId';
    private static String client_secret = 'putyourSdfcAppClientSecret';
    private static String resource = 'putyourAzureApiAppClientId';
    private static String JSonTestUrl = 'putyourAzureApiUrlyouwanttoaccess';

    public static String AuthAndAccess() {
        String responseText = '';
        String accessToken = getAuthToken();
        HttpRequest req = new HttpRequest();
        req.setEndpoint(JSonTestUrl);
        req.setMethod('GET');
        req.setHeader('Authorization', 'Bearer ' + accessToken);
        Http http = new Http();
        try {
            HTTPResponse res = http.send(req);
            responseText = res.getBody();
            System.debug('COMPLETE RESPONSE: ' + responseText);
        } catch (System.CalloutException e) {
            System.debug('Callout error: ' + e);
        }
        return responseText;
    }

    public static String getAuthToken() {
        String responseText = '';
        HttpRequest req = new HttpRequest();
        req.setEndpoint(TokenUrl);
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/x-www-form-urlencoded');
        String requestString = 'grant_type=' + grant_type + '&client_id=' + client_id + '&client_secret=' + client_secret + '&resource=' + resource;
        req.setBody(requestString);
        Http http = new Http();
        try {
            HTTPResponse res = http.send(req);
            responseText = res.getBody();
            System.debug('COMPLETE RESPONSE: ' + responseText);
        } catch (System.CalloutException e) {
            System.debug('Callout error: ' + e);
        }

        JSONParser parser = JSON.createParser(responseText);
        String accessToken = '';
        while (parser.nextToken() != null) {
            if ((parser.getCurrentToken() == JSONToken.FIELD_NAME) &&
                (parser.getText() == 'access_token')) {
                parser.nextToken(); // move to the field value
                accessToken = parser.getText();
            }
        }
        System.debug('accessToken: ' + accessToken);
        return accessToken;
    }
}