Extending UniFi Data Analysis & Reports

Hi everyone, this time I want to share one of my favorite side activities: playing with my Ubiquiti home setup!

As you may already know, my controller runs on Azure, but I wanted to understand what kind of data is stored inside the controller; in other words, where the data we see in the controller dashboard actually lives.


Inspecting the binaries and reading a few forum posts, I figured out that this data sits in a MongoDB database, and of course I wanted to take a look inside it.

Here is what I did: I made a backup of the controller data using its web interface and downloaded it locally:


At this stage I downloaded the controller software for an installation on my laptop (a MacBook) and, at controller startup, I requested a restore of the backup I had just downloaded from the cloud controller.


Once the restore is done, keep the controller running; you can then use a MongoDB client like Robo 3T and connect to localhost on port 27117 (this is the mongod process started locally by the controller).
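If you prefer scripting over a GUI client, the same connection can be sketched in Python. This is only an illustration: it assumes the third-party pymongo package is installed, and the database/collection names beyond the controller's own are whatever your setup created.

```python
# Sketch: connect to the mongod process the controller starts locally.
# Host and port are the controller defaults described above.

def controller_uri(host="localhost", port=27117):
    """Build the connection URI for the controller's embedded mongod."""
    return f"mongodb://{host}:{port}/"

# Uncomment with pymongo installed (pip install pymongo):
# from pymongo import MongoClient
# client = MongoClient(controller_uri())
# print(client.list_database_names())   # the controller's databases
# doc = client["ace_stat"]["stat_daily"].find_one()
# print(doc)                            # one daily-stats document
```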

[Screenshot: Robo 3T connected to the local controller database on port 27117]

This is great! But I would like to produce some nice dashboards with a visualization tool like Tableau, Power BI, or simply Excel; however, the data is in a "document" format, while I need it in a table/records format.

The solution is the MongoDB BI Connector, which acts as a kind of "wrapper" or "translator" between the document world and the tables/records world.

But things are never simple ;-). This connector only works with MongoDB 3.0 or higher, while the one bundled with the controller software is 2.6. So first we have to download a separate MongoDB server that works with the connector and, more importantly, upgrade the database itself to the 3.x format.

First, let's copy the database from the controller folder (look for a folder called db) to another location, and write down that location.

I tried and failed several times before understanding how to do it, but this is the sequence (using brew to install MongoDB on my Mac):

1. Install MongoDB 3.0 -> open the controller database in the copied location.

2. Uninstall 3.0 / install 3.2 -> open the controller database again.

3. Uninstall 3.2 / install 3.4 -> open the controller database again.

This brings the database to a format that works with the BI Connector.
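The steps above can be sketched as a short script that prints the shell commands in order. The versioned brew formula names are an assumption (check `brew search mongodb` for what your system actually offers), and the db path is a placeholder:

```python
# Sketch of the stepwise 2.6 -> 3.0 -> 3.2 -> 3.4 upgrade as shell commands.

DB_PATH = "/path/to/copied/db"  # the location you wrote down earlier

def upgrade_commands(db_path=DB_PATH):
    """List the commands, opening the database once per version."""
    cmds = []
    for version in ("3.0", "3.2", "3.4"):
        cmds.append(f"brew install mongodb@{version}")
        # Opening the database lets mongod migrate the on-disk format;
        # stop it (Ctrl+C) before moving on.
        cmds.append(f"mongod --dbpath {db_path}")
        cmds.append(f"brew uninstall mongodb@{version}")
    cmds.pop()  # keep the final 3.4 install for the BI Connector step
    return cmds

for cmd in upgrade_commands():
    print(cmd)
```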

[Screenshot: the database opened successfully after the upgrade to the 3.x format]

Now, with the BI Connector, you can extract the schema of a document collection you like (for example the stat_daily collection of the ace_stat database) and then spawn the wrapper process that the visualization tool connects to:
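As a sketch of those two steps: mongodrdl samples a collection and writes a relational schema file, and mongosqld then serves that schema to the visualization tool. The exact flags are assumptions and may differ by connector version (check `mongodrdl --help`):

```python
# Sketch: the two BI Connector steps as shell command strings.
# mongodrdl writes a .drdl relational schema from a sampled collection;
# mongosqld serves that schema over a SQL interface for BI tools.

HOST, PORT = "localhost", 27117  # the controller's embedded mongod

def bi_connector_commands(db="ace_stat", collection="stat_daily",
                          schema_file="unifi.drdl"):
    extract = (f"mongodrdl --host {HOST} --port {PORT} "
               f"--db {db} --collection {collection} --out {schema_file}")
    serve = f"mongosqld --schema {schema_file}"
    return extract, serve

extract, serve = bi_connector_commands()
print(extract)
print(serve)
```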

[Screenshot: schema extraction and the BI Connector wrapper process running]

In my case I used Tableau to create some test dashboards:

[Screenshot: test dashboard in Tableau showing gateway CPU over the month]

Here I see that the CPU of my gateway was a bit high during the first part of the month and then decreased significantly.

I can add other metrics, like downloaded data, to understand the picture better:

[Screenshot: dashboard with additional metrics added]

Admittedly, in this specific case the controller dashboards already offer a very nice visualization:

[Screenshot: the controller's built-in dashboard]

So the really interesting thing here is that you can create your own reports and discover new insights by looking at your own network data.

So what are you waiting for? Happy custom reporting on your UniFi network and device data!


From Windows 10 to Mac OS Sierra without admin privileges

Hi everyone! Lately, thanks to my manager and my new employer, I was able to switch from a Windows 10 laptop to a shiny MacBook Pro, and I want to share with you some tips and tricks that you will probably run into if you do the same.

First, the basics: why did I choose to switch? Well, since 2009 I have only had Apple devices at home, and I have always loved the consistency and "stability" of Apple devices, but I never had the opportunity to actually work with a Mac, so this is also a learning experience for me.

If you have never used a Mac, the first obstacles will be shortcuts like CTRL+C and CTRL+V, the mouse clicks (specifically, the right click on the trackpad), scrolling with two fingers on the trackpad, and now the shiny and mysterious Touch Bar. Once past this first shock, you will quickly get used to the magic search experience of Spotlight, the backup-for-dummies of Time Machine, and the familiar experience of the App Store.

Now let's focus on the work-related stuff: you can finally have Office 2016 on a Mac too, but it is miles and miles away from the functionality and ease of use of Office 2016 on Windows. The differences are not immediately evident, but if you use Office professionally you will quickly find the missing pieces.

The solution? Go to the App Store, purchase Parallels Lite, and enjoy Linux and Windows virtual machines. You can run VMs without being an admin because Parallels Lite uses the native hypervisor available on the Mac since Yosemite.

Thanks to this, I was able to get back several "life-saving" applications that I use daily, like Power BI Desktop, SQL Server Management Studio, and Visual Studio 2017. To be honest, they have their Mac counterparts, but those versions are missing too many features to live with them alone.

So I ended up with a Windows 10 VM full of software. Why not just use Windows directly, then? Well, with the Windows VM I can use Windows for exactly the apps that run great on that platform, and if the system becomes unstable I can still work normally on my Mac, without losing my work, while Windows does its own "things" 🙂.

When needed, I use an Ubuntu VM with Docker and VS Code, following the same separation-of-duties principle (main OS fast and stable, guest OS with rich, dedicated software).

I now often work this way: SQL Server hosted on Linux, easy import/export of external data with SQL Server Management Studio from Windows, PySpark notebooks on Docker accessing the same data, and finally visualizations with Power BI Desktop on Windows.

If, like me, you have strict policies around admin accounts, I want to share this: do you remember the concept of portable apps on Windows? Well, on the Mac you can do the same with some (not all) of the applications distributed outside the App Store (you can install almost all App Store apps without admin privileges).

The technique to make a Mac application "portable" is simply the double extraction of the pkg file and its Payload to a folder you can access (like your desktop); you can check the details here and here, and then basically run those applications from the locations you like.
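A minimal sketch of that double extraction, scripted for convenience (pkgutil and tar are standard macOS tools; the package and folder names here are placeholders):

```python
# Sketch: "double extraction" of a .pkg without admin rights.
# Step 1: pkgutil expands the outer flat package archive.
# Step 2: the inner Payload is itself an archive holding the app bundle.
# Run these from a folder you own (e.g. your desktop).

def extraction_commands(pkg="SomeApp.pkg", workdir="~/Desktop/SomeApp"):
    """List the shell commands for the two extraction steps."""
    return [
        f"pkgutil --expand {pkg} {workdir}",  # first extraction
        f"cd {workdir}/SomeApp.pkg",          # the component pkg inside
        "tar -xvf Payload",                   # second extraction
    ]

for cmd in extraction_commands():
    print(cmd)
```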

The exceptions will be:

  1. Applications not signed by a recognized and well-known developer or software house
  2. Applications that on startup ask you to install additional services
  3. Applications that require the registration of specific libraries/frameworks before being launched

There are cases (like Azure Machine Learning Workbench) where the installer actually writes everything into your user account folders, but the last step is copying the UI app to the Applications folder, and this fails if you are not an admin. The solution is to look inside the installer folders and find, in the JSON files, the location of the downloaded packages. Once you find the URL of the missing one (use the installer error message to figure out which package it was unable to copy), download it locally and run the app from any location; it should work without problems.


Marketing Segmentation & Predictive Analytics

Enterprises at any level need to target their consumers, clients, and users with campaigns, measure the results of those campaigns, and hopefully improve sales/contacts after each iteration. In practice this job requires a very broad spectrum of skills, from web/art/video design (how we format our message), to language-specific skills (how we write our message to raise interest), and of course the black magic art called segmentation (writing the right message to the right potential or actual customer). How do marketing people build segments? Well, using the attributes of the customers (age, sex, zip code, children, etc.) they can create segments (for example, all young women living in NY without children) and create campaigns for them (discounted shirts? Why not!). Now, what is the role of predictive analytics in this area? Well, in theory it should help marketing people a lot:

  • Discovering clusters (segments?) in the customer base
  • Identifying the key features or influencers that lead someone to buy an item or take some action
  • Showing which items are bought together and proposing them as new packages to offer
  • Using social data to identify leaders and followers in the customer base
  • …and surely another hundred insights like the ones mentioned.

Normally, a marketing targeting tool leverages a classical database with tables, fields, and records, and the segmentation result is actually "just" a SQL query that identifies the impacted customers with some WHERE conditions. This tool is usually part of a suite; in other words, you buy a package from a vendor that usually also includes a CMS to build your website pages, an eCommerce platform to sell your stuff on the net, and an analytics/predictive-analytics package to do the tasks described above.
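To make the "just a SQL query" point concrete, here is a minimal, self-contained sketch using Python's built-in sqlite3; the table layout and the customer rows are invented for the example:

```python
import sqlite3

# Toy customer table; column names and rows are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customers (
    name TEXT, sex TEXT, age INTEGER, city TEXT, children INTEGER)""")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?, ?, ?)",
    [("Ann",  "F", 28, "NY", 0),
     ("Bea",  "F", 31, "NY", 2),
     ("Carl", "M", 27, "NY", 0),
     ("Dina", "F", 25, "LA", 0)])

# The segment from the text: young women living in NY without children.
segment = conn.execute(
    """SELECT name FROM customers
       WHERE sex = 'F' AND age < 35 AND city = 'NY' AND children = 0"""
).fetchall()
print(segment)  # [('Ann',)]
```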

Looking at this from a pure marketing-department perspective, to maximize the value of your investment you should buy the package and enjoy the entire integrated platform without worries. It's a bit like when you buy an iPhone: you really start to enjoy it when you couple it with an Apple TV, a MacBook, and lately an Apple Watch.

But while at home you can be the only one making this decision, in medium/large enterprises there are several good reasons to do exactly the opposite:

1) Predictive analytics runs on the largest possible combination of data, not only on the pure marketing/clicks/orders world.

2) It requires tools (big data clusters, in-memory engines, complex algorithms, etc.) that are much more sophisticated than the "normal" analytics capabilities provided by marketing packages. Usually it leverages best-in-class open source (R, Python, Scala libraries, etc.) or specific vendor software (SAS, IBM, etc., and lately several startups like Ayasdi).

3) The people working on it are miles away from marketing people, both in capabilities and in goals (marketing usually wants analytics to prove its gut feeling; analysts look at data with curiosity, trying to figure out patterns).

4) From an analytics standpoint, we usually want to buy software that we can reuse for all business cases (supply chain, logistics, operations, etc.), not only for the marketing ones.

The usual "attempt to solve" this conflict is separation of duties: analysts discover insights and translate them into customer features, this information "should" flow into the segmentation tool, and the good marketing folks should leverage it to run super-effective campaigns. The result? A huge amount of effort wasted without any ROI:

1) Marketing people do not own these features, so they do not trust them, or they simply ignore them.

2) When you start the customization journey (new interfaces that transport data back and forth between the analytics world and the marketing planet), you will face bugs, data quality issues, data synchronization issues, and so on.

3) As said, analysts try to cover multiple business cases, so only a fraction of their work actually targets marketing needs.

When you are a startup, however, all the complications of this approach are mitigated by the fact that your team is small and the marketing/analytics teams are in practice 2-3 people in the same room.

So, can we apply the startup model everywhere?

The key factors here are people and objectives. We can go with a complete end-to-end marketing & predictive analytics solution, integrated and customized, but we have to add a dedicated team that works on it: creative marketing and data-driven decisions can coexist, and actually help each other, if they are the product of a team laser-focused on the same business objectives.

For large enterprises, building this team is the key to providing these capabilities as a service to the various internal clients. These clients can be, for example, other marketing departments spread around the world, in different countries, working on different labels. They will benefit because they can focus on their business at the regional level and obtain not just an "agency-like" service but the entire chain of experience and results that data flowing at the global level can bring.

Here I am…


Hi everyone! According to blogging guides, in this first post I should describe myself and explain why you should read my blog. In reality, I just needed a place where I can write freely about my experiences with different products and technologies.

You will find here security-related articles, posts on specific Microsoft technologies, some experiences with Big Data products/platforms, and also some other collateral stuff.

Sometimes I loved that stuff, and sometimes I hated it with all my heart, but in the end, in both cases, the important thing is the experience that comes from it.

Several times, even while working on something not particularly interesting, I discovered precious information that became really useful later on other projects.

Of course the opposite also happens: you work on something nice and discover things that will become very useful later.

Just an example that comes to mind:

I was reading, out of personal interest, Stefan Esser's findings on iOS jailbreaks, and from them I figured out some of the strategies Apple uses to handle kernel memory. This became fundamental later, when I started helping a team that was struggling to design decent protection for their app against memory dumps.

My primary objective here is not to showcase something, but primarily to receive feedback on some of the activities I'm working on.

Good or bad feedback, it does not matter; it's just interesting to hear multiple voices and ideas.