A massive adventure ends…

Hi everyone, my journey at Microsoft is coming to an end, and I want to use this opportunity to revisit some of the great moments I spent in this fantastic company.

The first days after I joined were like entering a new solar system, where you have to learn all the names of the planets, moons and orbits at warp speed,

but at the same time I had all the support of my manager, my teammates and my mentor!

Very quickly it was time to spring into action with my team and start working with clients, partners, developers and colleagues from different continents, teams and specializations.

We worked on labs and workshops helping customers and partners discover and ramp up on Microsoft Advanced Analytics capabilities,

and on customer engagements helping our clients exploit everything the Azure cloud can provide to reach their goals.

I certainly cannot forget the incredible Microsoft TechReady event in Seattle,

and all the talented colleagues I had the opportunity to work with.

Finally, I want to say a massive thank you to my v-team: I will miss you!!

As one adventure ends, I am really excited to bring my passion and curiosity into a new one, joining a customer.

I can’t wait to learn more, improve and be part of the massive digital transformation journey that is in front of us.

 

 

 


How to create the perfect Matchmaker with Bot Framework and Cognitive Services

Hi everyone, this time I want to showcase some of the many capabilities of Microsoft Cognitive Services using a “Cupid” bot built with the Microsoft Bot Framework.

So what is the plan? Here are some ideas:

  • Leverage only Facebook as the channel! Why? Because with Facebook people are already “logged in”, and you can leverage the Messenger profile API to automatically retrieve the user details and, more importantly, their Facebook photo!
  • Since the Facebook photo is usually an image with a face, we can use it with the Vision and Face APIs to understand gender, age and a bunch of other interesting info without any user interaction (see the sketch after this list).
  • We can score the photo with a Custom Vision model, trained on some publicly available images, to decide whether a person looks like a super model or not 😉
  • Using all this info (age, gender, makeup, sunglasses, super model or not, hair color, etc…) collected from those calls, we can decide which candidates inside our database are the right ones for our user and display the ones that fit according to our demo rules.
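For example, the face analysis behind the second bullet could look roughly like this: a hedged C# sketch against the public Face API v1.0 REST endpoint (the region, the FACE_KEY placeholder and the method name are illustrative, not taken from the demo code):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class FaceApiSketch
{
    // Detect a face in the profile photo and return the raw JSON with the
    // requested attributes (age, gender, glasses, hair, makeup).
    static async Task<string> AnalyzePhotoAsync(string photoUrl)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "FACE_KEY"); // placeholder
            var uri = "https://westeurope.api.cognitive.microsoft.com/face/v1.0/detect" +
                      "?returnFaceAttributes=age,gender,glasses,hair,makeup";
            var body = new StringContent("{\"url\":\"" + photoUrl + "\"}", Encoding.UTF8, "application/json");
            var response = await client.PostAsync(uri, body);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync(); // JSON array, one entry per detected face
        }
    }
}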

Of course at the beginning our database of profiles will be empty, but with the help of friends and colleagues we can quickly fill it and have fun during the demo.

So what does it look like in practice?

Here is the first interaction: after saying hello, the bot immediately personalizes the experience with our Facebook data (photo, first name, last name, etc…) and asks if we want to participate in the experiment:

After accepting, the bot uses the APIs described above to understand the image and calculate age, hair color, super model score, etc…

Yeah, I know my super model score is not really good, but let’s see if there are any matches for me anyway….

Of course the bot is smart enough to display the profile of my wife, otherwise I would have been in real trouble :-).

Now I guess many of you have this question: how is the super model score calculated?

Well, I trained the Microsoft Custom Vision service with 30+ photos of real models and 30+ photos of “normal people”, and after 4 iterations I already had around 90% accuracy in detecting super models in photos 😉
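As a hedged sketch of the scoring call (the exact prediction URL shape depends on the Custom Vision service version; PROJECT_ID, ITERATION and PREDICTION_KEY are placeholders, not the demo values):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SuperModelScorerSketch
{
    // Send the photo URL to the trained Custom Vision model; the JSON response
    // contains a probability for each tag (e.g. "super model" vs "normal").
    static async Task<string> ScoreAsync(string photoUrl)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Prediction-Key", "PREDICTION_KEY"); // placeholder
            var uri = "https://southcentralus.api.cognitive.microsoft.com/customvision/v3.0/" +
                      "Prediction/PROJECT_ID/classify/iterations/ITERATION/url";
            var body = new StringContent("{\"Url\":\"" + photoUrl + "\"}", Encoding.UTF8, "application/json");
            var response = await client.PostAsync(uri, body);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}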

Of course there are several things to consider here:

  1. The subject should be the focus of the picture
  2. Have sufficiently diverse images: angles, lighting and backgrounds
  3. Train with images that are similar (in quality) to the images that will be used for scoring

And super model pics certainly tend to have higher resolution, better lighting and better exposure than the photos of “normal” people like you and me, but for the purposes of this demo the results were very good.

Another consideration is that you don’t always have to use Natural Language Processing in your bots (in our case we in fact skipped LUIS), because, especially if you are not developing a Q&A/support bot, users prefer buttons and having a minimal amount of info to provide.

Imagine a bot that handles your Netflix subscription: you just want buttons like “activate/deactivate membership” (for when you go on vacation) and “recommendations for tonight”.
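A minimal sketch of that button-first approach, in Bot Framework v3 style C# (the menu entries come from the hypothetical Netflix example above, not from the matchmaker demo code):

using System.Collections.Generic;
using Microsoft.Bot.Connector;

static class ButtonMenuSketch
{
    // Attach a HeroCard with explicit choices instead of parsing free text.
    public static IMessageActivity BuildMenu(IMessageActivity reply)
    {
        var card = new HeroCard
        {
            Title = "What would you like to do?",
            Buttons = new List<CardAction>
            {
                new CardAction { Type = ActionTypes.ImBack, Title = "Activate membership", Value = "activate" },
                new CardAction { Type = ActionTypes.ImBack, Title = "Deactivate membership", Value = "deactivate" },
                new CardAction { Type = ActionTypes.ImBack, Title = "Recommendations for tonight", Value = "recommend" }
            }
        };
        reply.Attachments = new List<Attachment> { card.ToAttachment() };
        return reply;
    }
}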

Another important thing to consider is bot analytics, i.e. understanding how your bot is performing. I leverage this great tool that under the covers uses Azure Application Insights.
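Under the covers this boils down to emitting custom events; here is a hedged sketch with the Application Insights SDK (the event and property names are invented for illustration):

using System.Collections.Generic;
using Microsoft.ApplicationInsights;

static class BotTelemetrySketch
{
    static readonly TelemetryClient Telemetry = new TelemetryClient();

    // Track one custom event per incoming message so a dashboard can
    // aggregate usage by channel and by user action.
    public static void TrackMessage(string channelId, string userAction)
    {
        Telemetry.TrackEvent("MessageReceived", new Dictionary<string, string>
        {
            { "channel", channelId },  // e.g. "facebook"
            { "action", userAction }   // e.g. "start-matchmaking"
        });
    }
}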

If instead you are in love with statistics, you can try a Jupyter notebook with the following template to analyze the Azure Application Insights metrics and events with your own custom code.

If you want to try the bot with all the telemetry setup already done, you can grab, compile and try the demo code (do not use this code in any production environment) that is available here, and if this is your first bot, start from this tutorial to understand a bit of the various pieces needed.

UniFi – Install a UniFi Cloud Controller on Azure

Hi everyone, this time I want to share one of my weekend projects, inspired by this article by Troy Hunt. Like Troy, I also experienced the pain of a single router/gateway/access-point device, and I decided to switch to UniFi devices.

On the net you can find dozens of tutorials on how to assemble the various bits; here instead I want to explain how to set up the UniFi Controller software on Azure in very simple steps.

Step 1: Go to the following website https://azure.microsoft.com/ and register. You will receive $200 of Azure credits that can be used in the first month. Alternatively, you can register for the Visual Studio Dev Essentials program and get $25 each month for 1 year.

Step 2: Once you have your subscription up and running, you can provision a Linux VM by clicking on the big “+” button and searching for Linux Ubuntu Server:

Step 3: After you have selected the virtual machine, just give it a name, username and password, choose HDD as the VM disk type, pick a resource group name that you like and a data center near to where you live.

Step 4: Set the size of the VM (I used A2 Standard, but you can also try A1 or even A0; also remember you can schedule the VM to start/stop only when you need it and save a lot of money):

Step 5: In the additional settings leave everything at the default values and finally hit the Purchase button!

Step 6: After a few minutes or even less your VM is ready and you should be able to see a screen like this:

Write down the Public IP Address because you will need it shortly.

Step 7: Open the necessary ports so that the controller works correctly.

First, go to the network interfaces of the VM

Select the first one (there should be only one):

Now select the network security group, first clicking on link n.1 and then on link n.2:

Finally, add the necessary inbound rules exactly as described here (typically at least TCP 8080 for device inform, TCP 8443 for the controller web UI and UDP 3478 for STUN):

Step 8: Connect with PuTTY on Windows, or your standard macOS terminal, to the IP address you noted earlier and install the UniFi Controller software with these commands:

echo "deb http://www.ubnt.com/downloads/unifi/debian unifi5 ubiquiti" | sudo tee -a /etc/apt/sources.list

sudo apt-key adv --keyserver keyserver.ubuntu.com --recv 06E85760C0A52C50

sudo apt-get update

sudo apt-get install unifi

Step 9: Connect to the controller web interface at https://IP_Address:8443/ and complete the UniFi wizard:

Finally, you may now proceed to adopt your UniFi devices using Layer 3 Adoption!

Digital Marketing in practice (final episode)

In the last episode of the series we spoke about the holy grail of the digital marketing landscape and how it is deeply connected to the identity of our customers. So let’s recap for a moment what we need:

  1. All our customer data (web logs, app logs, purchases, calls, surveys, etc…) marked with the same identity ID, in order to properly assign every event to the same customer; we also need this data to be collected in real/near-real time.
  2. To define our targets (sales, customer satisfaction, market penetration, etc…) and a strategy to reach those goals.
  3. To define that strategy, to use the data collected at point 1 to identify the patterns that lead to high-revenue customers, abandoned carts, poor reviews, good surveys, etc….
  4. Once our overall strategy (sales, discounts, promos, coupons, social, etc…) is defined, to put it into practice by defining our customer journeys (for example look at this or this): literally, we have to define on each touch point (where) what “actions” will happen and when, who will be the target of those actions, and what subsequent “sub-actions” or steps have to happen automatically at every step of the journey.
  5. To produce, on all the touch points, the UI associated with those actions.
  6. To go back to point 1, evaluate the new data, check if the strategy is working and, if necessary, take corrective actions.

Now, in a hypothetical “perfect world”, we would be finished, but reality is much more complicated than that 🙂 .

reality check ahead sign

In fact, while we can define journeys, customer segments, profiles and target audiences, we need some “binding” to happen between our defined journeys and the real touch points.

An example? Let’s assume we define a coupon/loyalty initiative; this alone implies a quite long list of system configurations and actions:

  1. Define the new coupon initiative in the loyalty system
  2. Define the budget allocated for those coupons and the associated limits
  3. Integrate the new coupons with the e-commerce system so that they are applied and accepted at order time
  4. Integrate the journey builder actions into the e-commerce system so that the e-commerce UI displays the promotion’s new look & feel
  5. Integrate the journey builder sub-steps, if any, into the e-commerce UI engine
  6. Tag all the consumer journey steps properly in order to collect back the data points of the journey
  7. Etc…

Now repeat this for the marketing campaign system that handles email, SMS and notifications, repeat it for the call center, etc….


As you can imagine, we would need a single unified collection of products (identity, e-commerce, apps, CRM, marketing email/SMS, etc…), all connected, from the same vendor, with the “unified data collector system” also acting as the customer journey builder. In fact, we can reasonably understand whether our strategies are effective only if we can observe, on the very same tool, whether the journeys we designed are working or not (what if we define a journey of 12 steps and almost nobody goes past step 3?).

I guess that if you search now on your preferred search engine and do some basic research, you will find at least 20+ vendors claiming that they have this kind of combined solution in place.

In reality, even if we assume that all those 20+ vendors have fantastic and well-connected platforms, all enterprises already have a gargantuan amount of systems in place, and you cannot “turn off” everything and start from scratch.

At the same time, even if you start from zero, the cost and the lock-in risk associated with all-in-one solutions are often so high that you may end up considering a Lego approach anyway.


So what is our recommendation here?


The right approach is perhaps neither build nor buy; I call it smart integration.

Smart integration means the following:

  1. Define your own MML: a marketing markup language
  2. From the central data repository, define the personalized MML journey for each customer
  3. Write this MML journey onto the identity of each customer
  4. Build/configure on all the touch points the functionality needed to read the MML journey from the identity itself (leveraging first the customer device/browser to perform the integration) and translate it into meaningful actions on that specific touch point (an email template in the marketing automation tool, a Next Best Action for the call center in the CRM, etc…)
  5. Collect all the data, evaluate, correct and start again 🙂

An example of MML?

You can start simply with something like a basic loyalty rule, sketched below:

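As a purely hypothetical illustration (every name below is invented for this sketch, not part of any real product), an MML journey binding triggers to actions on specific touch points could be modeled in C# like this:

using System.Collections.Generic;

// Hypothetical MML sketch: a journey is an ordered list of steps, each binding
// a trigger to an action on a specific touch point.
public record MmlAction(string TouchPoint, string ActionType, string Payload);

public record MmlStep(int Order, string Trigger, MmlAction Action);

public record MmlJourney(string CustomerId, string Initiative, IReadOnlyList<MmlStep> Steps);

public static class MmlExample
{
    // Example journey for the coupon initiative discussed above.
    public static MmlJourney CouponJourney(string customerId) =>
        new(customerId, "CouponInitiative", new List<MmlStep>
        {
            new(1, "CartAbandoned", new MmlAction("Email", "SendTemplate", "coupon-10-off")),
            new(2, "EmailOpenedNoPurchase", new MmlAction("Ecommerce", "ShowBanner", "coupon-banner")),
            new(3, "CustomerCallsSupport", new MmlAction("CRM", "NextBestAction", "offer-coupon"))
        });
}

Written onto the customer identity, such a document is what each touch point would read and translate into its own local action.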

Now, if you wanted to create a unified MML definition for the main strategies and touch points, I think it would be a fantastic idea and for sure a very successful startup!


Digital Marketing in practice (part 2)

In part one we defined our landscape and the first challenges encountered in managing known customers.

We now want to investigate how to bring new customers to our e-commerce site and, more importantly, how to target, among potentially 7 billion people, the ones with the highest chances of buying our fantastic product.

Ideally we would like to place advertisements on:

  1. search engines (Google, Bing, etc…)
  2. social networks (Facebook, Twitter, Snapchat, etc…)
  3. big content websites (msn.com, Yahoo, news websites, etc…)
  4. inside famous apps that can host advertisements
  5. etc…

How do we contact all these different “information publishers”, and how can we create a single “campaign” targeting all these “channels”?

Here we have to go into the DMP, DSP and SSP world and see how these platforms can help us reach our objectives.

Let me explain this with an example: go now to this Yahoo page https://www.yahoo.com/news/ ; you should see, almost immediately at the top of the page, an advertisement like this:

How and why was this advertisement placed there?

The “publishers”, like Yahoo, have a so-called “inventory” of places where ads can be positioned on each page, typically varying by time of day or day of the week. So they use a platform called an SSP (supply-side platform) to communicate the following message to the entire world: “I have these open slots/positions in my pages; who pays the highest amount to buy them and place their own ads?”

On the other side of the fence there is another platform, called a DSP (demand-side platform), where “marketers” can say the following: “I want to pay this amount of money to place my banner on Yahoo pages”.

The process where “supply” and “demand” meet is called RTB, real-time bidding, and thanks to it the advertisement that appears on the Yahoo website is decided in real time. If you want to understand this in more depth, look at articles like this; the key point is that in this way new customers can reach our e-commerce site by clicking on the banner.
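To make the mechanics concrete, here is a toy sketch in C# of the second-price auction idea commonly used in RTB (an illustration of the concept only, not a real OpenRTB implementation):

using System;
using System.Collections.Generic;
using System.Linq;

public record Bid(string Advertiser, decimal CpmOffer);

public static class RtbToyAuction
{
    // The SSP announces an impression, DSPs respond with bids, the highest
    // bid wins and pays the runner-up's price (second-price rule).
    public static (Bid Winner, decimal ClearingPrice) Run(IEnumerable<Bid> bids)
    {
        var ordered = bids.OrderByDescending(b => b.CpmOffer).ToList();
        if (ordered.Count == 0) throw new InvalidOperationException("No bids received.");
        var winner = ordered[0];
        var price = ordered.Count > 1 ? ordered[1].CpmOffer : winner.CpmOffer;
        return (winner, price);
    }
}

// Example: three DSPs bid for one slot on the publisher's page.
// RtbToyAuction.Run(new[] { new Bid("BrandA", 3.2m), new Bid("BrandB", 4.1m), new Bid("TheToy", 4.5m) })
// -> TheToy wins and pays 4.1 CPM.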

But now another question comes up: is this banner the same for all visitors? And if it has been displayed to 1 million, 10 million or 100 million visitors, what is the price that we have to pay?

This is the right time to explain the concept of an audience: before going to the DSP to search for “open inventory”, we first want to define who the visitors or anonymous customers that we want to target in our campaign are; this way we can have an idea of how many of them could in theory see the banner.

But if these customers are “unknown”, how do I target them? Here the final piece of the puzzle comes into play: the DMP (data management platform). With a DMP we can actually “purchase” (or better, rent) from third parties anonymous customer profiles that are based on browser cookies or smartphone device IDs, and pick only the ones that, according to us, are the best.

So for example we select them using simple filters in this way:

Once our audience is prepared, we can have an idea of how many of these potential customers we can reach, and with this an idea of the money we will spend if all of them see our banner and hopefully click on it.

Now, this is a pretty straightforward process, but it is not really optimized…

In fact, we already have customers on our e-commerce site (the so-called first-party data) and we already know who “our best customers” are; it would make sense to find, on the DMP platform, the potential customers that are very similar to those, right?

This process exists and it is called look-alike modeling:

So now we know that we somehow have to integrate the DMP world (which is already a world of its own, with connections to DSPs and third-party data providers) into our digital marketing landscape, in at least two directions:

  1. unknown–>known campaigns integration
  2. known–>unknown look-alike modeling and the on-boarding process (bringing our first-party data into the DMP)

Be patient now 🙂 .

We will start designing our digital marketing landscape in part 3.

Digital Marketing in practice (Part 1)

You hear these buzzwords all the time: digital transformation, digital marketing, etc…; but what do they really mean from a technology point of view?

I will try to give you my point of view (mainly looking at the architectures and components involved), and I hope this can be useful to a large audience.

In order to understand the digital marketing ecosystem, imagine that we are the founders of an innovative company that is launching an incredible new mini robot aimed mainly at kids; let’s call our product TheToy:

Now that we have a product to sell we need to prepare the following:

  1. An e-commerce website to sell this product
  2. A payment provider that helps us accept any credit card/bitcoin/bank transfer/PayPal/etc…
  3. A delivery/supply chain provider that helps us bring the goods to our customers
  4. Speaking of customers, we need a CRM system with an integrated call center
  5. ……

This of course is just an oversimplified view of the needs of an e-commerce initiative, but it is already interesting to discuss, because this way we can, for example, evaluate the following options:

  • Our IT-related “stuff” (e-commerce, CRM, etc…) needs servers; where do I buy, host and maintain these servers?
  • Do I build the e-commerce website, the payment provider, the CRM, etc. from scratch, hiring some developers?

Since our focus should be creating a fantastic product and keeping customers happy, not being an IT company, we can make, for example, the following choices:

A) Pick a cloud provider that gives us all the necessary “virtual hardware” and bandwidth, pick an e-commerce software package, install/configure a huge amount of stuff and hopefully end up with something (again, high level) like this:

B) Pick a software-as-a-service e-commerce provider that already has all of this in place, where we just have to upload our product catalog and start our e-commerce site immediately:

Now, the choice is not as simple as it seems, because every time we pick “the shortest route” there is a price to pay, not only in terms of pricing of the solution, but also in terms of functionality that we may want later.

As we said, there are other pieces of the puzzle, like our CRM: not only do we have to pick the CRM that best serves our needs and budget, but it also has to be “somehow integrated” with our e-commerce site…

Before we dig into this too, let’s imagine for a moment that, magically, our e-commerce, CRM, delivery/supply chain, etc. are already in place and we are selling and delivering our products successfully. What if we have a new product/offer to sell and we want to notify our customers about it?

This process of contacting customers in order to sell/advertise a new product/offer is called a “marketing campaign”, and of course we need a tool for that 🙂 : we need a marketing campaign automation tool that helps us create targeted campaigns (we want to reach the right consumer with the right product….) and deliver those messages in multiple ways:

  • emails
  • SMS
  • push notifications, if we also have an e-commerce app
  • social accounts/pages (btw, we need to set up those accounts too!)
  • personalized pages and messages on the e-commerce website, advertising the new product only to the “right customers”
  • personalized CRM responses (when a target customer calls, only for these customers does the CRM agent have to propose the new offer/product)
  • etc…

So now things start to become a bit more complicated: we in fact need an e-commerce site integrated with a CRM, both integrated with a marketing campaign automation tool.

To have even more fun, let’s also consider the following: we said that we want to produce “targeted campaigns”; this means that we want to leverage all the data we have on our customers to target only the “right” ones….

What data can we leverage?

Some ideas:

  • The web logs of the e-commerce site (Google Analytics logs, for example)
  • The orders of the e-commerce site
  • The calls/cases of the CRM
  • The responses/interactions with our previous campaigns
  • The interactions on our social channels
  • etc…

This, even at the small scale of our little e-commerce initiative, looks like a little data warehouse project; and if we plan to find the right customers for the right offer, it is also a data science project involving machine learning…..

So we need an e-commerce site integrated with a CRM, integrated with a marketing campaign automation tool, integrated with a big data engine with machine learning capabilities.

In part 2 we will start to look at another dimension of the digital marketing landscape: the unknown customers that are waiting to purchase our product, but they don’t know that the product exists and we don’t know how to reach them 🙂

Extract text from documents at scale in the Azure Data Lake

Hi everyone, across all the content posted at Build 2017 I was really impressed by this presentation, where you can learn how to lift and shift almost any runtime to Azure Data Lake Analytics for large-scale processing.

Following those recommendations, I built a custom extractor and processor for U-SQL leveraging the Tika .NET extractor, in order to extract as plain text all the content stored in files like PDF, DOCX, etc…

The idea is to solve the following business scenario: you have a large collection of docs (Azure Data Lake Store capacity is unlimited): PDFs, XLS, PPTs, etc., and you want to quickly understand the information stored in all those documents without having to create/deploy your own cluster, in pure PaaS mode.

Here is a sample of the code, built around the Visual Studio project template for U-SQL applications.
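As a hedged sketch of the approach (assuming the TikaOnDotNet TextExtractor API and illustrative names; this is not the exact demo code), the custom extractor could look like this:

using System.Collections.Generic;
using System.IO;
using Microsoft.Analytics.Interfaces;
using TikaOnDotNet.TextExtraction;

// Process each file as a whole on a single vertex, since documents cannot be split.
[SqlUserDefinedExtractor(AtomicFileProcessing = true)]
public class TikaExtractor : IExtractor
{
    private const int MaxLength = 128 * 1024; // keep strings under the ADLA 128 KB limit

    public override IEnumerable<IRow> Extract(IUnstructuredReader input, IUpdatableRow output)
    {
        using (var ms = new MemoryStream())
        {
            input.BaseStream.CopyTo(ms);
            // Tika detects the format (PDF, DOCX, PPTX, ...) and returns plain text.
            var text = new TextExtractor().Extract(ms.ToArray()).Text ?? string.Empty;
            output.Set<string>("content", text.Length > MaxLength ? text.Substring(0, MaxLength) : text);
        }
        yield return output.AsReadOnly();
    }
}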

As you can see, in this demo we limited the max size of the extracted content to 128 KB in order to comply with this ADLA limit. This limit can be bypassed by working with byte arrays.

I then uploaded all the DLL binaries to the Data Lake Store and registered them as assemblies.


Then I launched a U-SQL job to actually extract the text stored in a collection of PDF documents, specifying 22 AUs.
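The U-SQL script itself could look roughly like this (a hedged sketch; the paths, the assembly name and the column names are illustrative, not the exact demo script):

// Hypothetical U-SQL sketch using the registered assembly and the custom extractor.
REFERENCE ASSEMBLY [TikaTextExtraction];

@docs =
    EXTRACT filename string,
            content string
    FROM "/input/docs/{filename}.pdf"
    USING new TikaExtractor();

OUTPUT @docs
TO "/output/extracted.csv"
USING Outputters.Csv();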

And in less than 2 minutes I had my collection of documents parsed into one single CSV file.

Now that the information is in text format, we can use the Azure Data Lake topics and keywords extensions to quickly understand what kind of information is stored inside our document collection.

The final result, which shows how keywords are linked to documents, can be visualized in Power BI with several nice visualizations.


And by clicking on one keyword we can see immediately which documents are linked to it.


Another way to visualize this is with a word cloud,


where this time we see, for a specific document, which keywords are most representative of it.

If you are interested in the solution and want to know more, send me a message on my Twitter handle.