Welcome to the World of Tomorrow:

We live in a world of emerging technologies and innovation, where IoT cannot stand alone.

The Internet of Things (a.k.a. IoT) provides us with a wealth of sensor data. However, this sensor data by itself does not add value unless we can turn it into actionable, contextualized information that is useful to people. Big data, data visualization, and Machine Learning (ML) techniques allow us to gain new insights through learning, batch processing, and information analysis (online/offline). Real-time sensor data analysis and decision-making are often done manually, but to make them scalable, they are preferably automated.

Artificial Intelligence (AI) provides us the set of frameworks and tools to go beyond typical real-time decision and automation use cases for IoT. Who knows, we may end up in a world where humans have less work to do, as in Sci-Fi Hollywood movies. Whatever happens, we have the responsibility to build a better world for the next generations.

The intent of this blog is to provide you with the latest insights into what is happening in the Cloud, IoT, and AI industry. Keep reading, keep contributing, and please share as well.

IoT is not all about Cloud

In the recent past, I have had multiple discussions in various tech forums, and many people have a misconception about IoT and the Cloud. Some think that anything like blinking an LED with a Raspberry Pi or Arduino is IoT.

I just thought of sharing some of my viewpoints on these terminologies.

  • Internet of Things (IoT) – refers to the connection of devices (other than the usual examples such as computers and smartphones) to the Internet. Cars, home and kitchen appliances, industrial devices, and even heart monitors can all be connected through the IoT.
  • Cloud Computing – often called simply “the cloud,” involves delivering data, applications, photos, videos, and more over the Internet from remote data centers.

We can break down cloud computing into six different categories:

  1. Software as a service (SaaS): Cloud-based applications run on computers off-site (or “in the cloud”). Other people or companies own and operate these devices, which connect to users’ computers, typically through a web browser.
  2. Platform as a service (PaaS): Here, the cloud houses everything necessary to build and deliver cloud-based applications. This removes the need to purchase and maintain hardware, software, hosting, and more.
  3. Infrastructure as a service (IaaS): IaaS provides companies with servers, storage, networking, and data centers on a per-use basis.
  4. Public Cloud: Companies own and operate these spaces and provide quick access to users over a public network. Examples: Amazon AWS, Microsoft Azure, etc.
  5. Private Cloud: Similar to a public cloud, except only one entity (user, organization, company, etc.) has access. Access is secured and isolated, so only that organization’s entities can reach these cloud resources. A private cloud is owned by a single organization and enables it to use cloud computing technology to centralize access to IT resources across its different parts, locations, or departments. A private cloud typically exists as a controlled environment within on-premises data centers.
  6. Hybrid Cloud: Takes the foundation of a private cloud but adds public cloud access. This combination is typically established through a secure, high-speed VPN tunnel over MPLS or other dedicated lines, or through connectivity gateways provided by the respective cloud vendor. In this model, your on-premises applications can connect to cloud infrastructure and vice versa. This gives you the flexibility to keep your mission-critical information on-premises while still utilizing the power of the cloud, without compromising your organization’s critical data.

Role of Cloud in IoT

Cloud is simply an enabler for IoT. It provides necessary services and infrastructure for things to be interconnected and operate.

The Cloud provides all the essential services to increase efficiency in implementing your IoT solutions and to accumulate and operate on IoT data. The Internet of Things requires the Cloud to work; I would rather say that Cloud and IoT are inseparable, but IoT is not all about the Cloud.

For example, millions of devices connected in an IoT ecosystem generate enormous volumes of data, and you need sufficient infrastructure to store and operate on this data to create meaningful results from it.
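To make that concrete, here is a back-of-the-envelope calculation in Python; the fleet size, message rate, and payload size are illustrative assumptions, not figures from any real deployment:

```python
# Rough daily data volume for a hypothetical IoT fleet.
devices = 1_000_000          # connected devices (assumed)
msgs_per_device_per_min = 2  # telemetry messages per device per minute (assumed)
bytes_per_msg = 200          # average payload size in bytes (assumed)

# Total bytes generated per day across the whole fleet.
bytes_per_day = devices * msgs_per_device_per_min * 60 * 24 * bytes_per_msg
gib_per_day = bytes_per_day / 2**30

print(f"{gib_per_day:.1f} GiB/day")  # roughly 536.4 GiB/day
```

Even with these modest per-device numbers, the fleet produces over half a terabyte per day, which is why cloud-scale storage and processing matter.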

Cloud service providers have started realizing the need to offer IoT-specific services that let customers quickly create fast-to-market solutions. That’s where Cloud and IoT converge. Microsoft has packaged all IoT-related components into Azure IoT, Amazon has done the same with AWS IoT, and similarly the remaining providers such as SAP HANA, IBM Cloud, etc. This lets customers pick the necessary components and build their IoT ecosystem in the Cloud, or utilize predefined (SaaS) solutions for quick enablement.

What is the role of Raspberry Pi, Arduino and Dragon board then?

These are single-board computers or microcontroller boards with sufficient hardware capacity to run a small or complex IoT program, on an operating system of your choice.

These boards are typically equipped with the basic storage and computing capabilities needed to establish an IoT device or edge capability. You can write a program of your choice to blink an LED based on your own conditions, as they are equipped with digital/analog I/O ports. Depending on the board’s capacity, you can choose from a wide variety of operating systems, such as Raspbian or Windows 10 IoT Core, to install on these devices, or deploy microcontroller programs directly.
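As a simple illustration, the classic LED blink can be sketched in Python. On a real Raspberry Pi you would drive a GPIO pin through a library such as RPi.GPIO; here a stub pin class stands in for the hardware so the logic runs anywhere:

```python
import time

class StubPin:
    """Stand-in for a hardware GPIO pin so the example runs without a board.
    On a real Raspberry Pi, replace this with RPi.GPIO or gpiozero calls."""
    def __init__(self):
        self.state = False   # current output level
        self.writes = 0      # how many times the pin was driven

    def write(self, value):
        self.state = bool(value)
        self.writes += 1

def blink(pin, times, interval_s=0.01):
    """Turn the pin on and off `times` times, pausing between transitions."""
    for _ in range(times):
        pin.write(True)    # LED on
        time.sleep(interval_s)
        pin.write(False)   # LED off
        time.sleep(interval_s)

led = StubPin()
blink(led, times=3)
print(led.writes, led.state)  # 6 writes in total, ends in the off state
```

The point is that the board alone just toggles a pin; it only becomes IoT when that device and its data are connected into a larger system.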

This means they are edge devices that you can program for your IoT use case. When deployed to the field together, they create an IoT network.

Conclusion:

Enough said: IoT is not all about the cloud, but the two are inseparable in the modern world, and whatever you are doing with a Raspberry Pi, Arduino Uno, etc. may not be IoT unless there is a specific IoT use case you are trying to solve with these devices.

Getting Started with Azure Functions App

In my previous article, I gave you an overview of Azure Functions and discussed its benefits. In this session, I will cover the steps necessary to create a basic Functions app.

Getting Started:

Log in to the Azure Portal and you will see the Function Apps section in the left menu. This is where all your Function Apps will be listed once you log in.


Let us start by creating a new Function App. In the search box, type “Functions”.


Select Function App from the Web + Mobile category, and click the Create button.

Fill in the details:

  1. App name
  2. Select a subscription
  3. Select a resource group (create a new one, or select an existing one)
  4. Select a hosting plan
  5. Specify storage
  6. Click Create


You will see a “deployment in progress” message.


Once you explore further by selecting the Function App instance, you will be able to view the URL, and in the left-side menu you will see the options to configure:

  1. Functions
  2. Proxies (Preview Feature)
  3. Slots (Preview Feature)


Getting Started – Create your first Function

Now that we have the new instance ready, let us create our first Function.

We have to make two choices:

1.)  Choose a scenario:

  1. Webhook + API
  2. Timer
  3. Data Processing

2.)  Choose a language:

  1. JavaScript
  2. CSharp
  3. FSharp
  4. For PowerShell, Python, and Batch processing, you create your own custom function


For the demo’s sake, I am creating a Timer scenario and selected CSharp as the language.


I created a simple trigger code and clicked Save and Run.


The job completed within the expected delay we introduced through Thread.Sleep:


Code Sample:

 
using System;
using System.Threading;

// Entry point invoked by the Azure Functions runtime on the configured schedule.
public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executing at: {DateTime.Now}");

    RunTest(log);

    log.Info($"C# Timer trigger function completed at: {DateTime.Now}");
}

// Simulates a long-running job: 100 iterations, sleeping one second per iteration.
public static void RunTest(TraceWriter log)
{
    for (int i = 0; i < 100; i++)
    {
        log.Info($"C# Timer trigger function executing iteration: {i}");

        Thread.Sleep(1000);

        log.Info($"C# Timer trigger function completed iteration: {i}");
    }
}

Using the Functions –> Integrate section, we can configure input and output parameters and schedule timers, and make the function available as a Web API method. You can then call this function logic from another application as a web API call, passing the necessary inputs to start another functional process.

One example of this scenario would be invoking a database record archival call after completion of an order. This is applicable when we choose the WebHook + API scenario during the creation of the function logic.


That’s all for now on this topic. I will cover more details about WebHook + API in the next part of the series.

Please share your comments and rate this article to help me understand areas of improvement.

Azure Functions App – Run On-Demand Serverless Code – a Pathway to Serverless Computing

Azure Functions is a new cloud solution from Azure that lets you execute small pieces of code, or “functions,” in the cloud. This means you do not have to worry about the infrastructure or environment needed to execute your little piece of code to solve your business problems.


Functions can make development even more productive, and you can use your development language of choice.

Benefits:

  • Pay only for the time your code runs and trust Azure to scale as needed.
  • Azure Functions lets you develop serverless applications on Microsoft Azure.
  • Supports a wide variety of development languages, such as C#, F#, Node.js, Python, or PHP.
  • Bring your own dependencies – you can bring any of your NuGet/npm dependencies for your function logic.

What can we do with Azure Functions?

Azure Functions is a very good solution for processing data, integrating systems, working with the Internet of Things (IoT), and building simple APIs and microservices.

Functions provides templates to help you  get started with some useful scenarios, including the following:

  • BlobTrigger – Process Azure Storage blobs when they are added to containers. You might use this function for image resizing.
  • EventHubTrigger – Respond to events delivered to an Azure Event Hub. Particularly useful in application instrumentation, user experience or workflow processing, and Internet of Things (IoT) scenarios.
  • Generic Webhook – Process webhook HTTP requests from any service that supports webhooks.
  • GitHub Webhook – Respond to events that occur in your GitHub repositories.
  • HTTPTrigger – Trigger the execution of your code by using an HTTP request.
  • QueueTrigger – Respond to messages as they arrive in an Azure Storage queue.
  • ServiceBusQueueTrigger – Connect your code to other Azure services or on-premises services by listening to message queues.
  • ServiceBusTopicTrigger – Connect your code to other Azure services or on-premises services by subscribing to topics.
  • TimerTrigger – Execute cleanup or other batch tasks on a predefined schedule.
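For instance, a TimerTrigger function’s schedule is declared in its function.json binding using a six-field CRON expression (seconds first). The sketch below, with an illustrative binding name, would fire every five minutes:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    }
  ],
  "disabled": false
}
```

The `name` value must match the parameter name in your function code (such as the `myTimer` parameter in the code sample earlier).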

Integration Support with other Azure Services:

The following service integrations are supported by the Azure Functions app:

  • Azure Cosmos DB
  • Azure Event Hubs
  • Azure Mobile Apps (tables)
  • Azure Notification Hubs
  • Azure Service Bus (queues and topics)
  • Azure Storage (blob, queues, and tables)
  • GitHub (webhooks)
  • On-premises (using Service Bus)
  • Twilio (SMS messages)

Costing:

Azure Functions is charged based on one of the two pricing plans below:

  1. App Service Plan – if you already have an Azure App Service running a Logic, Web, or Mobile app or a WebJob, you can use the same environment to execute your Azure Functions without paying for extra resources. You will be charged at the regular App Service rates.
  2. Consumption plan – with this plan, you pay only for how long and how many times your functions run, plus the computational resources used during that execution time. Consumption plan pricing includes a monthly free grant of 1 million requests and 400,000 GB-s of resource consumption per month.
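To make the Consumption plan’s GB-s unit concrete, here is a small back-of-the-envelope calculation; the execution counts and memory figures are made-up example inputs, not Azure quotes:

```python
# Rough Consumption-plan usage estimate (illustrative workload,
# not an official Azure pricing calculation).
FREE_GBS = 400_000            # monthly free grant, gigabyte-seconds
FREE_EXECUTIONS = 1_000_000   # monthly free grant, requests

def monthly_gbs(executions, avg_duration_s, memory_gb):
    """Resource consumption in GB-s: memory held multiplied by time run."""
    return executions * avg_duration_s * memory_gb

# Hypothetical function: 2 million runs/month, 1 s each, using 0.5 GB.
usage_gbs = monthly_gbs(2_000_000, 1.0, 0.5)          # 1,000,000 GB-s
billable_gbs = max(0.0, usage_gbs - FREE_GBS)         # 600,000 GB-s billed
billable_execs = max(0, 2_000_000 - FREE_EXECUTIONS)  # 1,000,000 requests billed

print(billable_gbs, billable_execs)
```

Azure’s actual metering rounds duration and memory allocation, so treat this only as a rough sizing aid.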

You can find further pricing related info here

Support and SLA:

  • Free billing and subscription management support
  • Flexible support plans starting at $29/month. Find a plan
  • 99.95% guaranteed up time. Read the SLA

Managed Azure Database for MySQL and PostgreSQL

During the Microsoft Build 2017 conference (May 10th, 2017) in Seattle, Scott Guthrie (EVP of the Cloud and Enterprise Group) announced two new offerings in the Azure Database Services Platform: Azure Database for MySQL and Azure Database for PostgreSQL.

I was happy that Microsoft was filling the gap for fully managed MySQL and PostgreSQL. I recollect that around April I was trying to migrate this WordPress blog from GoDaddy hosting into an Azure App Service, and since WordPress requires MySQL as the database, the only options left for me in Azure were a local MySQL (MySQL in App) inside App Service, which cannot scale well, or the ClearDB service (from a Microsoft partner in Azure). Somehow I wasn’t happy with the performance of local MySQL or ClearDB, due to my bulky blog. So I thought: what if there were a managed MySQL service, just like the managed SQL Azure services?

What is Azure Database for MySQL and PostgreSQL?

Azure Database for MySQL and PostgreSQL (currently in PREVIEW) are fully managed Platform as a Service (PaaS) offerings from Microsoft Azure, so we do not have to worry about infrastructure or managing the server instance. Below is an outline of how these services stack up against the existing SQL Database offerings. As a customer, you do not need to worry about the compute, storage, networking, or high performance/availability/scalability of these services; the Azure Data Services Platform ensures them, with built-in monitoring.

You can easily deploy an Azure Web App with Azure Database for MySQL as the database provider to deliver complete solutions for common Content Management Systems (CMS) such as WordPress and Drupal.


I will cover more details in a later series. That’s all for now. Thank you for reading my content, and please leave your comments.


Big Data & Front End Development track in the Microsoft Professional Program

Earlier, I introduced you to the Microsoft Professional Program for Data Science. Just a few days later, Microsoft announced the BETA availability of two more tracks: Big Data and Front End Development.

Big Data Track:

This Microsoft program will help you learn the necessary skills, from cloud storage and databases to Hadoop, Spark, and managed data services in Azure. The curriculum involves learning how to build big data solutions for batch and real-time stream processing using Azure managed services and open-source systems like Hadoop and Spark.

If you intend to pursue a Data Analytics career, this is the right program for you to gain the necessary insights.

The technologies you will apply to gain these skills are: Azure Data Lake, Hadoop, HDInsight, Spark, Azure Data Factory, and Azure Stream Analytics.

Below is the course outline :

  • 10 COURSES  |  12-30  HOURS PER COURSE  |  8  SKILLS
  • ENROLL NOW here
  • More details here

Front End Development Track:

This track provides you with the skills necessary to get started with advanced front-end development using HTML5, CSS3, JavaScript, AngularJS, and Bootstrap. By the end of the curriculum, you will have mastered front-end development with all the predominant modern web technologies.

So if you are a front-end UI developer, this is something you can try out to enhance your skills.

Below is the course outline :

  • 13 COURSES  |  15-30 HOURS PER COURSE  |  11 SKILLS
  • ENROLL NOW here
  • More details  here

Track detail

Each course runs for three months and starts at the beginning of a quarter: January–March, April–June, July–September, and October–December. The capstone runs for four weeks at the beginning of each quarter: January, April, July, and October. For exact dates for the current course run, please refer to the course detail page on edX.org.

[Microsoft]

Microsoft Professional Program for Data Science

Microsoft has come up with a new program to bring more skilled people into the field of Data Science by providing them the right training on the right set of tools.

Microsoft has put together a curriculum to teach key functional and technical skills, combining highly rated online courses with hands-on labs and concluding in a final capstone project. All these trainings are delivered by Microsoft, either online or through recorded sessions.

The program comprises 10 COURSES  |  16-32 HOURS PER COURSE  |  8 SKILLS

The technology skills you will gain are: T-SQL, Microsoft Excel, Power BI, Python, R, Azure Machine Learning, HDInsight, and Spark.

ENROLL NOW: through this link

Course schedule:
For exact dates for the course, please refer to the course detail page on edX.org.

For more details on this program: https://academy.microsoft.com/en-us/professional-program/data-science/ 

** This course provides the necessary insight to prepare for Microsoft’s new certification: Microsoft Certified Solutions Associate (MCSA) – Machine Learning.

Happy Learning!!