Helping Troy Hunt for fun and profit

Hi everyone, I’m a huge fan of the security expert Troy Hunt and of his incredible free service (if you don’t know it, please use it now to check whether your email accounts have been compromised!).


Now Troy has created a contest where you can actually win a shiny Lenovo laptop if you create something new that helps people become more aware of the security risks related to pwned accounts.

I decided to participate, and my idea is the following: help all the people who have Gmail (and Hotmail/Outlook/Office 365, in alpha!) accounts verify whether their friends, colleagues and family members have compromised email accounts.

I uploaded the code and executables here, and I strongly suggest you read the readme instructions IN THEIR ENTIRETY to understand how the tool works, what results to expect and what you can do.

Regardless of whether I win the laptop or not, I have already won: thanks to this tool I was able to alert my wife and some of my friends to the danger, and it gave me the right “push” to convince them to set up two-factor authentication.

If you want to donate for this effort, please donate directly to Troy here; he deserves a good beer!



How to create the perfect Matchmaker with Bot Framework and Cognitive Services

Hi everyone, this time I want to showcase some of the many capabilities of Microsoft Cognitive Services using a “cupido” bot built with the Microsoft Bot Framework.

So what is the plan? Here are some ideas:

  • Leverage only Facebook as the channel! Why? Because with Facebook people are already “logged in”, and you can use the Messenger profile API to automatically retrieve the user’s details and, more importantly, their Facebook photo!
  • Since the Facebook photo is usually an image with a face, we can send it to the Vision and Face APIs to understand gender, age and a bunch of other interesting info without any user interaction!
  • With a Custom Vision model, trained on some publicly available images, we can score whether a person looks like a super model or not 😉
  • Using all this info (age, gender, makeup, sunglasses, super model or not, hair color, etc.) collected with those calls, we can decide which candidates in our database are the right ones for our user and display the ones that fit according to our demo rules.

Of course, at the beginning our database of profiles will be empty, but with the help of friends and colleagues we can quickly fill it and have fun during the demo.
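The demo rules themselves can be as simple as filtering the candidate profiles on the attributes returned by the APIs. Here is a minimal sketch of that idea; the `Profile` class and the specific rules are my illustrative assumptions, not the actual bot code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical profile shape: names are illustrative only.
public class Profile
{
    public string Name;
    public string Gender;
    public double Age;             // estimated by the Face API
    public double SuperModelScore; // 0..1, from the Custom Vision model
}

public static class Matchmaker
{
    // Demo rule: different gender, within 10 years of age,
    // highest super model score first.
    public static List<Profile> Match(Profile user, IEnumerable<Profile> candidates)
    {
        return candidates
            .Where(c => c.Gender != user.Gender)
            .Where(c => Math.Abs(c.Age - user.Age) <= 10)
            .OrderByDescending(c => c.SuperModelScore)
            .ToList();
    }

    public static void Main()
    {
        var user = new Profile { Name = "Me", Gender = "male", Age = 35, SuperModelScore = 0.1 };
        var candidates = new List<Profile>
        {
            new Profile { Name = "A", Gender = "female", Age = 33, SuperModelScore = 0.8 },
            new Profile { Name = "B", Gender = "female", Age = 60, SuperModelScore = 0.9 },
        };
        foreach (var m in Match(user, candidates))
            Console.WriteLine(m.Name); // only "A" passes the age rule
    }
}
```

Any rule set works here; the point is that once the Cognitive Services calls have populated the profile attributes, matching is plain LINQ over the database.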

So how does it look in practice?

Here is the first interaction: after we say hello, the bot immediately personalizes the experience with our Facebook data (photo, first name, last name, etc.) and asks if we want to participate in the experiment:

After we accept, it uses the APIs described above to understand the image and calculate age, hair color, super model score, etc.

Yeah, I know my super model score is not great, but let’s see if there are any matches for me anyway…

Of course the bot is smart enough to display my wife’s profile; otherwise I would have been in big trouble :-).

Now I guess many of you have this question: how is the super model score calculated?

Well, I trained Microsoft’s Custom Vision service with 30+ photos of real models and 30+ photos of “normal people”, and after 4 iterations I already had about 90% accuracy in detecting super models in photos 😉

Of course there are several things to consider here:

  1. The subject should be the focus of the picture.
  2. Use sufficiently diverse images: angles, lighting and backgrounds.
  3. Train with images that are similar (in quality) to the images that will be used for scoring.

And our super model pics certainly have higher resolution, better lighting and better exposure than the photos of “normal” people like you and me, but for the purposes of this demo the results were very good.

Another consideration is that you don’t always have to use natural language processing in your bots (in our case we in fact skipped LUIS), because, especially if you are not developing a Q&A/support bot, users prefer buttons and having a minimal amount of info to provide.

Imagine a bot that handles your Netflix subscription: you just want buttons like “activate/deactivate membership” (for when you go on vacation) and “recommendations for tonight”.

Another important thing to consider is bot analytics, to understand how your bot is performing; I leverage this great tool that under the covers uses Azure Application Insights:

If instead you are in love with statistics, you can try this Jupyter notebook with the following template to analyze the Azure Application Insights metrics and events with your own custom code.

If you want to try the bot with all the telemetry setup already done, you can grab, compile and try the demo code (do not use this code in any production environment) available here; and if this is your first bot, start from this tutorial to understand the various pieces needed.

How to generate Terabytes of IoT data with Azure Data Lake Analytics

Hi everyone, during one of my projects I was asked the following question:

I’m storing my IoT sensors’ data in Azure Data Lake for analysis and feature engineering, but currently I still have very few devices, so not a big amount of data, and I can’t evaluate how fast my queries and transformations will be when, with many more devices and months or years of sensor data, my data lake grows to several terabytes.

Well, in that case, let’s quickly generate those terabytes of data using U-SQL!

Let’s assume that our data resembles the following:

deviceId, timestamp, sensorValue, …

So for each IoT device we have a unique identifier called deviceId (let’s assume it is a combination of numbers and letters), a timestamp indicating, with millisecond precision, when the IoT event was generated, and finally the values of the sensors at that moment (temperature, speed, etc.).

The idea is the following: given a real deviceId, generate N “synthetic” deviceIds that all carry the same data as the original device. So if we have, for example, 5 real deviceIds, each with 100,000,000 records (500,000,000 records in total), and we generate 1,000 synthetic deviceIds for each real deviceId, we get 1,000 × 5 × 100,000,000 = 500,000,000,000 additional records.

But we can expand the amount of synthetic data even more by playing with time: for example, if our real data has events only for 2017, we can duplicate the entire dataset for each year from 2006 to 2016 and have 11 more copies of the records.

Here is some sample C# code that generates the synthetic deviceIds:

Note the GetArrayOfSynteticDevices function, which will be executed inside the U-SQL script.
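The actual generator is in the linked code; as a rough sketch of what such a helper could look like (the suffix scheme below is my assumption, not the original implementation), it only needs to return N deterministic, unique ids per real device:

```csharp
using System;
using System.Linq;

namespace Microsoft.DataGenUtils
{
    public static class SyntheticData
    {
        // Sketch: derive 'count' deterministic synthetic ids from a real
        // deviceId by appending a zero-padded suffix. Any scheme that
        // cannot collide with real ids works just as well.
        public static string[] GetArrayOfSynteticDevices(string deviceId, int count)
        {
            return Enumerable.Range(0, count)
                             .Select(i => deviceId + "-SYN" + i.ToString("D4"))
                             .ToArray();
        }

        public static void Main()
        {
            var devices = GetArrayOfSynteticDevices("AB12", 3);
            Console.WriteLine(string.Join(",", devices)); // AB12-SYN0000,AB12-SYN0001,AB12-SYN0002
        }
    }
}
```

In the real U-SQL job the method would likely return a SqlArray&lt;string&gt; rather than a plain array, so that EXPLODE can consume it.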

Before using it, we have to register the assembly in our Data Lake account and database (in my case, master):

DROP ASSEMBLY master.[Microsoft.DataGenUtils];
CREATE ASSEMBLY master.[Microsoft.DataGenUtils] FROM @"location of dll";

Now we can read the original IOT data and create the additional data:

REFERENCE ASSEMBLY master.[Microsoft.DataGenUtils];

@t0 =
    EXTRACT deviceid string,
            timeofevent DateTime,
            sensorvalue float
    FROM "2017/IOTRealData.csv"
    USING Extractors.Csv();

//Let's get the distinct list of all the real deviceIds
@t1 =
    SELECT DISTINCT deviceid AS deviceid
    FROM @t0;

//Let's calculate for each deviceId an array of 1000 synthetic devices
@t2 =
    SELECT deviceid,
           Microsoft.DataGenUtils.SyntheticData.GetArrayOfSynteticDevices(deviceid, 1000) AS SyntheticDevices
    FROM @t1;

//Let's assign to each array of synthetic devices the same data as the corresponding real device
@t3 =
    SELECT a.SyntheticDevices,
           de.timeofevent,
           de.sensorvalue
    FROM @t0 AS de
         INNER JOIN @t2 AS a ON de.deviceid == a.deviceid;

//Let's use the EXPLODE function to expand each array into records
@t1Exploded =
    SELECT emp AS deviceid,
           de.timeofevent,
           de.sensorvalue
    FROM @t3 AS de
         CROSS APPLY EXPLODE(de.SyntheticDevices) AS dp(emp);

//Now we can write the expanded data
OUTPUT @t1Exploded
TO "SyntethicData/2017/expanded_{*}.csv"
USING Outputters.Csv();

Once you have the expanded data for all of 2017, you can use the C# DateTime functions that add years, months or days to a date, apply them to the timeofevent column, and write the new data to a new folder (for example SyntethicData/2016, SyntethicData/2015, etc.).
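The year-shifting step can be sketched as follows (a standalone illustration; in the real job you would apply the shift to the timeofevent column of every row before writing it to that year’s folder):

```csharp
using System;

public static class TimeShift
{
    // Shift a 2017 event timestamp back by a number of years,
    // keeping month, day and time-of-day intact.
    public static DateTime ShiftBack(DateTime timeOfEvent, int yearsBack)
    {
        return timeOfEvent.AddYears(-yearsBack);
    }

    public static void Main()
    {
        var original = new DateTime(2017, 6, 15, 10, 30, 0);
        // One extra copy of the dataset per year, from 2016 down to 2006.
        for (int yearsBack = 1; yearsBack <= 11; yearsBack++)
        {
            Console.WriteLine(TimeShift.ShiftBack(original, yearsBack).ToString("yyyy-MM-dd HH:mm"));
        }
    }
}
```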


Send Emails with Adobe Campaign Marketing Cloud (Neolane)

Hi, this time, instead of downloading data from Neolane or updating recipients in it, we want to use Neolane as an email cannon, leveraging its powerful template engine to manage the email look & feel while deciding the targets of our mass email campaign on the fly, using the Neolane API.

So the use case is the following: define an email template in Neolane with some field mappings, then leverage the Neolane API to send email using this template while defining the email recipients, and the contents of the mapped fields, externally.

According to the official Adobe documentation, this can be done using the Neolane business-oriented APIs (we looked into the data-oriented APIs in previous articles), as specified here:

Unfortunately, the documentation is not really clear or complete, and I really had to dig through Adobe logs, error codes and SOAP responses to get this working properly; here is some sample code that can help you.

The code follows the sample provided in the Adobe documentation (external file mapping, with the data coming from the CDATA section inside the delivery xml tag structure).

Here is the C# code:

string adobeApiUrl = ConfigurationManager.AppSettings["adobeApiUrl"];
string sessionToken = "Look at the other Neolane code samples on how to retrieve the session token";
string securityToken = "Look at the other Neolane code samples on how to retrieve the security token";

string scenario = "Write here the internal name of the delivery template";

HttpWebRequest reqData = (HttpWebRequest)WebRequest.Create(adobeApiUrl);

reqData.ContentType = "text/xml; charset=utf-8";
reqData.Headers.Add("SOAPAction", "nms:delivery#SubmitDelivery");
reqData.Headers.Add("X-Security-Token", securityToken);
reqData.Headers.Add("cookie", "__sessiontoken=" + sessionToken);
reqData.Method = "POST";

string strWriteHeader = "<?xml version='1.0' encoding='ISO-8859-1'?>" +
"<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" xmlns:urn=\"urn:nms:delivery\">" +
"<soapenv:Header/>" +
"<soapenv:Body>" +
"<urn:SubmitDelivery>" +
"<urn:sessiontoken>" + sessionToken + "</urn:sessiontoken>" +
"<urn:strScenarioName>" + scenario + "</urn:strScenarioName>" +
"<urn:elemContent>";

string strWriteRecipientBody = "<delivery> " +
  "<targets fromExternalSource=\"true\"> " +
    "<externalSource><![CDATA[MsgId|ClientId|Title|Name|FirstName|Mobile|Email|Market_segment|Product_affinity1|Product_affinity2|Product_affinity3|Product_affinity4|Support_Number|Amount|Threshold " +
    "1|000001234|M.|Phulpin|Hervé|0650201020||1|A1|A2|A3|A4|E12|120000|100000]]></externalSource>" +
  "</targets> " +
"</delivery>";

string strWriteFooter = "</urn:elemContent>" +
"</urn:SubmitDelivery>" +
"</soapenv:Body>" +
"</soapenv:Envelope>";

string bodyData = strWriteHeader + strWriteRecipientBody + strWriteFooter;

byte[] byteArrayData = Encoding.UTF8.GetBytes(bodyData);

// Set the ContentLength property of the WebRequest.
reqData.ContentLength = byteArrayData.Length;
// Get the request stream.
Stream dataStreamInputData = reqData.GetRequestStream();
// Write the data to the request stream.
dataStreamInputData.Write(byteArrayData, 0, byteArrayData.Length);
// Close the Stream object.
dataStreamInputData.Close();

var responseData = reqData.GetResponse();

Stream dataStreamData = responseData.GetResponseStream();
// Open the stream using a StreamReader for easy access.
StreamReader readerData = new StreamReader(dataStreamData);
// Read the content.
string responseFromServerData = readerData.ReadToEnd();

// Clean up the streams and the response.
readerData.Close();
responseData.Close();

return responseFromServerData;

Integration with Adobe Campaign Marketing (aka Neolane) Part II

Hi, in the previous post we saw how to read information from Adobe Campaign Marketing.

This time I want to show you how to “write” to it, in particular how to add or modify recipients. This is, I believe, something you will want to do regularly to keep in sync, for example, your users’ preferences on your sites and their current status in your campaign database. In fact, a user who removes, from his profile on a site, the consent to receive a specific newsletter expects that, from that moment, he will automatically never be disturbed again. If you do not sync this asap, you risk contacting someone who does not want to be contacted. On the other side, if a new user registers on your site, you want him in your campaign tool asap so you can target him.

Here is the C# code:

string adobeApiUrl = ConfigurationManager.AppSettings["adobeApiUrl"];
//Here, for testing purposes, username and password are simply read from config settings, but you should acquire them in a secure way!
string adobeUser = ConfigurationManager.AppSettings["adobeUser"];
string adobePass = ConfigurationManager.AppSettings["adobePass"];
//We need to write recipients, but they live inside a folder, so we have to know the folder id or name in advance
string adobeFolderId = ConfigurationManager.AppSettings["adobeFolderId"];
//Create the web request to the soaprouter page
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(adobeApiUrl);
req.Method = "POST";
req.ContentType = "text/xml; charset=utf-8";
//Add to the headers the requested Service (session) that we want to call
req.Headers.Add("SOAPAction", "xtk:session#Logon");

string userName = adobeUser;
string pass = adobePass;
//We craft the soap envelope creating a session Logon request
string body = "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" xmlns:urn=\"urn:xtk:session\">" +
"<soapenv:Header/><soapenv:Body><urn:Logon>" +
"<urn:sessiontoken/>" +
"<urn:strLogin>" + userName + "</urn:strLogin>" +
"<urn:strPassword>" + pass + "</urn:strPassword>" +
"<urn:elemParameters/>" +
"</urn:Logon></soapenv:Body></soapenv:Envelope>";
//We write the body to a byteArray to be passed with the Request Stream
byte[] byteArray = Encoding.UTF8.GetBytes(body);

// Set the ContentLength property of the WebRequest.
req.ContentLength = byteArray.Length;
// Get the request stream.
Stream dataStreamInput = req.GetRequestStream();
// Write the data to the request stream.
dataStreamInput.Write(byteArray, 0, byteArray.Length);
// Close the Stream object.
dataStreamInput.Close();

var responseAdobe = req.GetResponse();

Stream dataStream = responseAdobe.GetResponseStream();
// Open the stream using a StreamReader for easy access.
StreamReader reader = new StreamReader(dataStream);
// Read the content.
string responseFromServer = reader.ReadToEnd();
// Clean up the streams and the response.
reader.Close();
responseAdobe.Close();
//Manually parsing the response with an XmlDocument
System.Xml.XmlDocument xResponse = new XmlDocument();
// We parse the response manually. This is again for testing purposes
xResponse.LoadXml(responseFromServer);
XmlNode respx = xResponse.DocumentElement.FirstChild.FirstChild;

string sessionToken = respx.FirstChild.InnerText;
string securityToken = respx.LastChild.InnerText;

// We have done the login now we can actually do a query on Neolane
HttpWebRequest reqData = (HttpWebRequest)WebRequest.Create(adobeApiUrl);
reqData.ContentType = "text/xml; charset=utf-8";
//Add to the headers the requested Service (persist) that we want to call
reqData.Headers.Add("SOAPAction", "xtk:persist#Write");
reqData.Headers.Add("X-Security-Token", securityToken);
reqData.Headers.Add("cookie", "__sessiontoken=" + sessionToken);
reqData.Method = "POST";
//We craft the soap header; also here the session token seems to be needed
string strWriteHeader = "<?xml version='1.0' encoding='ISO-8859-1'?>" +
"<soapenv:Envelope xmlns:soapenv='http://schemas.xmlsoap.org/soap/envelope/' xmlns:urn='urn:xtk:session'>" +
"<soapenv:Header/>" +
"<soapenv:Body>" +
"<urn:Write>" +
"<urn:sessiontoken>" + sessionToken + "</urn:sessiontoken>" +
"<urn:domDoc>";

string strWriteRecipientBody = string.Empty;
string strWriteFooter = "</urn:domDoc>" +
"</urn:Write>" +
"</soapenv:Body>" +
"</soapenv:Envelope>";
//Here I loop over a list of objects that represent the adobe recipients I want to write.
//The operation is insertOrUpdate, and the key that determines whether it is an insert
//or an update is, in my case, the email; you can pick the one you think is good.

foreach (AdobeRecipient recipient in updatesOnAdobe)
{
    strWriteRecipientBody +=
    "<recipient "
    + "_operation='insertOrUpdate' "
    + "_key='@email' "
    + "xtkschema='nms:recipient' "
    + "account='" + recipient.account + "' "
    + "lastName='" + recipient.lastName + "' "
    + "firstName='" + recipient.firstName + "' "
    + "email='" + + "' "
    + "origin='" + recipient.origin + "' "
    + "company='" + + "'>"
    + "<folder id='" + recipient.folderId + "'/> "
    + "</recipient> ";
}
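One caveat when building the XML by string concatenation like this: if any recipient field can contain characters such as & or <, it should be escaped first, or the SOAP body becomes invalid. A minimal sketch using the built-in SecurityElement.Escape (my addition, not part of the original sample):

```csharp
using System;
using System.Security;

public static class XmlSafe
{
    // Escape a value before embedding it in an XML attribute.
    public static string Attr(string value)
    {
        return SecurityElement.Escape(value ?? string.Empty);
    }

    public static void Main()
    {
        Console.WriteLine(XmlSafe.Attr("Smith & Co <EU>")); // Smith &amp; Co &lt;EU&gt;
    }
}
```

In the loop above you would wrap each `recipient.*` value, e.g. `"account='" + XmlSafe.Attr(recipient.account) + "' "`.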

//Full String ready to be passed
string bodyData = strWriteHeader + strWriteRecipientBody + strWriteFooter;

byte[] byteArrayData = Encoding.UTF8.GetBytes(bodyData);

// Set the ContentLength property of the WebRequest.
reqData.ContentLength = byteArrayData.Length;
// Get the request stream.
Stream dataStreamInputData = reqData.GetRequestStream();
// Write the data to the request stream.
dataStreamInputData.Write(byteArrayData, 0, byteArrayData.Length);
// Close the Stream object.
dataStreamInputData.Close();

var responseData = reqData.GetResponse();

Stream dataStreamData = responseData.GetResponseStream();
// Open the stream using a StreamReader for easy access.
StreamReader readerData = new StreamReader(dataStreamData);
// Read the content.
string responseFromServerData = readerData.ReadToEnd();
// Here we should receive an OK from Neolane
// Clean up the streams and the response.
readerData.Close();
responseData.Close();

return responseFromServerData;

Tips on cloud solutions integration with Azure Logic Apps

The cloud paradigm is a well-established reality: old CRM systems are now replaced by Salesforce solutions, HR and finance systems by Workday, old Exchange servers and SharePoint intranets by Office 365, not to mention entire datacenters migrated to Amazon. At the same time, all the effort that was spent on premise to make all these systems communicate with each other (the glorious days of ESB solutions with TIBCO, WebSphere, BEA WebLogic, BizTalk, etc.) seems kind of lost in the new cloud world. Why?

Well, each cloud vendor (Salesforce first, I think) tries to position its own app store (AppExchange) with pre-built connectors or apps that quickly allow companies to integrate with other cloud platforms in a way that is not only cost effective but, most of all, supported by the app vendor rather than by a system integrator (so it is paid for by the license).

This should work in theory; in practice we see a lot of good will from niche players on these apps but little or no commitment from the big vendors.

Luckily, however, the best cloud solutions already provide rich and secure APIs to enable integration; it’s only a matter of connecting the dots, and here several “integration” cloud vendors are already positioning themselves: Informatica Cloud, Dell, SnapLogic, MuleSoft, etc. The Gartner quadrant for Integration Platform as a Service (iPaaS) represents the situation well.

But while Gartner produced that report in March 2015, Microsoft has since released a new kind of app on the Azure platform called Azure Logic Apps.

What is the strength of this “service” compared to the others? Well, in my opinion, it lies on a de facto proven and standard platform that can scale as much as you want; it also gives you the possibility of writing your own connectors as Azure API Apps; and finally, you can create your integration workflows right in the browser, no client attached! Of course it has many other benefits, but for me these three are so compelling that I decided to give it a try and start developing some POCs around it.

What can you do with them? Basically, you can connect any system to any other system right from your browser!

Do you want to tweet something when a document is uploaded to your Dropbox? You can do it!

You want to define a complex workflow that starts from a message on the Service Bus, calls a BAPI on SAP and ends inside an Oracle stored procedure? It’s right there!

There is a huge list of connectors that you can use, and each of them can be combined with the others in many interesting ways!

Connectors have actions and triggers! Triggers, as the word says, are useful to react to an event (a new tweet with a specific word, a new lead on Salesforce, etc.), and they can be used in a push or pull fashion (“I’m interested in this event, so the connector will notify me when it happens” or “I’m interested in this data, and I will periodically call the connector to check if there is new data”).

Actions are simply methods that can be executed on the connector (send an email, run a query, write a post on Facebook, etc.).

An Azure Logic App is a workflow where you combine all these connectors, using triggers and actions, to achieve your goals.

How do they communicate with each other? I mean, how do I tell connector B, which is linked to A, to perform its action using A’s data? It is super simple! When you link two connectors, inside the target one you will see, on every action that requires data, pick lists where you can easily select the source data! This works because each connector automatically describes its API schema using Swagger (this really rocks!).

And do you want to know the best part? If you write your own connector with Visual Studio, it will automatically generate the Swagger metadata for you! So in literally 10 minutes you can have your brand new connector ready to use!

Added bonus: you automatically get a testing API, generated by Swagger!

The Azure website is full of references to quickly ramp up on this technology, so instead of a full tutorial I want to give you some useful tips for your Logic Apps journey.

Tip 1: You will see that published connectors ask you for some configuration values (package settings), and only after that does the connector become usable in your logic app. I tried with no success to do the same in Visual Studio with a custom API App; the best I was able to find is that you can simulate this only if you use a deploy script from GitHub (look at the azuredeploy.json file; some examples here). At deployment time, the deploy setup screen lets you set some configuration values that will never change once your Azure API App is published. The way this is done is to map deploy parameters to app settings like this:

"properties": {
  "name": "[parameters('siteName')]",
  "gatewaySiteName": "[parameters('gatewayName')]",
  "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('svcPlanName'))]",
  "siteConfig": {
    "appSettings": [
      {
        "name": "clientId",
        "value": "[parameters('MS_client_id')]"
      },
      {
        "name": "clientSecret",
        "value": "[parameters('MS_client_secret')]"
      }
    ]
  }
}
Then you can use the usual ConfigurationManager.AppSettings to read these values in the code.

I guess this (the possibility of defining package settings) will be fixed when publishing to the marketplace becomes available.

Tip 2: If you store credentials inside your custom API App, please note that by default API Apps are published as public… so if this particular API App reads from your health IoT device, anyone in the world who knows or discovers the API address can call it and read your health data! So set security to internal, not public!

Tip 3: The browser designer can sometimes be unstable and produce not exactly what you were hoping for; always check the code view too!
Tip 4: Azure API Apps have application settings editable in the Azure portal, like normal Azure “Web” Apps, but they are hidden!
Look at this blog post that saved me!

That’s it!


Indexing XML files with elastic search and C#

Lately I’ve been struggling with some integration issues, and each time I had to reverse engineer workflows, troubleshoot code and search inside the logs of the enterprise service bus to find the message xyz containing the customer data of client abc. Since these logs are also stored in an encrypted format, I had to write some code to decrypt them on the fly, search inside the file contents, look inside the next log, etc.

Basically a single search for one customer was taking 20-30 minutes…

So I started to look at solutions like elastic search or SOLR that solve exactly this kind of problem, and since I had already worked with elastic search in the past, I went in that direction. The classic combo is Logstash & Elastic Search & Kibana: Logstash is used to parse and transform the incoming log files and send them to Elastic search, where they are indexed, and with Kibana you can quickly build nice dashboards on the indexed content.

This time, however, I had to face a new challenge: instead of classic web logs and ready-to-use Logstash transformations (filters), I had to work on huge xml log files stored inside the ESB, with several levels of nesting. Elastic search natively supports indexing json objects, not xml, so you have to manipulate the xml with a Logstash transformation. After reading a bit about the Logstash xml filter, I found that (probably because I did not spend much time on it) it would take too long to write the right transformation for my case.

So I started to write some C# code to do it, and I chose to leverage the NEST library (the elastic search .NET client).

While looking inside the NEST and elastic search documentation, I also discovered that objects nested inside other objects are not as easily searchable as the root ones. So I decided to flatten the xml structure into a simple flat C# class. To write the minimum amount of code for this, I first transformed the xml into a proper C# class; the fastest way I found is to use xsd.exe from the Windows SDK (look in C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin) and obtain the xsd file from a single xml document:

xsd "C:\Users\UserA\Desktop\ESB.xml" /o:"C:\Users\UserA\Desktop"

You will obtain ‘C:\Users\UserA\Desktop\ESB.xsd’.

Now use xsd.exe again to generate the c# class:

xsd /c "C:\Users\UserA\Desktop\ESB.xsd" /o:"C:\Users\UserA\Desktop"

You will obtain ‘C:\Users\UserA\Desktop\ESB.cs’.

I manually created the flat C# class by simply copying and pasting the nested object properties of the generated C# class into the flat one. Since the property names are unchanged, we can use reflection to automatically copy the property values from the nested objects to the flat one:

public static void ReplaceValues(Object source, Object destination)
{
    PropertyInfo[] propertiesIncoming = source.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance);
    PropertyInfo[] propertiesDestination = destination.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance);
    //This is sample code: do not iterate like this,
    //use a LINQ join or a dictionary lookup with large collections!!!
    foreach (PropertyInfo p in propertiesIncoming)
    {
        if (p.PropertyType != typeof(string)) { continue; }
        PropertyInfo dest = propertiesDestination.Where(y => y.Name == p.Name).FirstOrDefault();
        if (dest != null)
            dest.SetValue(destination, p.GetValue(source));
    }
}
So once we have the flat C# object, we can index it very quickly using the NEST client;
here is a sample that takes one xml file and indexes its contents.

//ESBObj is the ESB.cs class type
XmlSerializer serializer = new XmlSerializer(typeof(ESBObj));
//Even with large xml files this deserialization happens really quickly
ESBObj resultingMessage = (ESBObj)serializer.Deserialize(new XmlTextReader(this.openFileDialog1.FileName));
//Here we use the NEST library and we connect to the local node
var node = new Uri("http://localhost:9200");
//we specify that we want to work on the esb_index
var settings = new ConnectionSettings(node, defaultIndex: "esb_index");
//let's connect
var client = new ElasticClient(settings);

//here we fill the flat objects using the ESBObj levels
FlatObj tempObj = null;
int progressive = 0;
//sample code here, this can be largely improved using reflection again
foreach (var level1 in resultingMessage.Level1Objects)
    foreach (var level2 in level1.Level2Objects)
        foreach (var level3 in level2.Level3Objects)
        {
            tempObj = new FlatObj();
            ReplaceValues(resultingMessage, tempObj);
            ReplaceValues(level1, tempObj);
            ReplaceValues(level2, tempObj);
            ReplaceValues(level3, tempObj);
            //Here before indexing we assign a progressive Id to each object
            //in order to have a unique id on elastic search;
            //elastic search uses this id to identify each object
            //uniquely in the index
            tempObj.Id = progressive.ToString();
            progressive++;
            //This is the indexing call
            var index = client.Index(tempObj);
        }

Now we want to search the index that stores these contents; however, this turned out to be trickier than I thought, probably because it was my first time using the NEST library. Luckily I had also installed some elastic search plug-ins, one of which was ElasticHQ, a nice front-end for elastic search. Looking inside the JSON requests of the queries issued by ElasticHQ, I was able to find the right query to issue using NEST raw mode (where you pass the commands directly, instead of having the NEST library build them for you).

This is some sample code that “should” work, a search with City=New York, but in my case, no results:

var searchResults = client.Search<FlatObj>(s => s
                .Query(q => q
                     .Term(p => p.City, "New York")));

Here instead is how I made it work (and this way it automatically searches all the properties!):

//In searchbox we type what we want to find;
//we can type anything here and elastic search will search on all
//the flattened properties!!!
string searchVal = @"{""filtered"": {""query"": {""query_string"": {""query"": """
                   + this.searchbox.Text + @"""}}}}";
var client = new ElasticClient(settings);
var searchResults = client.Search<FlatObj>(s => s.QueryRaw(searchVal));
//Since the Documents collection is IEnumerable we can bind it on the fly
//and see the results in a grid!
this.dataGridView1.DataSource = searchResults.Documents;

So in the end, my quick POC indexed a 14 MB xml log file in 200 ms and searched any possible content in it in 50-100 ms per search issued to the elastic search node. Actually, only the index size scares me (5 MB), and I want to see how much it will grow with several files.