Azure

Microsoft Certified: Azure Administrator Associate–Exam Changes

March 26, 2019 Azure, Azure Administrator, Azure CLI, Certification, Cloud Computing, Infrastructure, Microsoft, Platforms

Earlier, during the Microsoft Ignite 2018 conference, the Microsoft Learning team rolled out various role-based certification exams for Administrators, Developers, Architects, and DevOps Engineers.

Initially there was a requirement to pass two exams, AZ-100 and AZ-101, with AZ-102 as a transition (upgrade) exam:

  • AZ-100 – Microsoft Azure Infrastructure and Deployment
  • AZ-101 – Microsoft Azure Integration and Security
  • AZ-102 – (Upgrade Exam) – Microsoft Azure Administrator Certification Transition

On 20th March 2019, Microsoft announced a simplification of these requirements: instead of taking two exams, you now take a single exam, AZ-103.

What’s included in AZ-103?

You can find them detailed on the official exams page here, but below is a quick list taken from the official page.

Manage Azure subscriptions and resources (15-20%)

  • Manage Azure subscriptions
    • May include but not limited to: Assign administrator permissions; configure cost center quotas and tagging; configure Azure subscription policies at Azure subscription level
  • Analyze resource utilization and consumption
    • May include but not limited to: Configure diagnostic settings on resources; create baseline for resources; create and test alerts; analyze alerts across subscription; analyze metrics across subscription; create action groups; monitor for unused resources; monitor spend; report on spend; utilize Log Search query functions; view alerts in Log Analytics
  • Manage resource groups
    • May include but not limited to: Use Azure policies for resource groups; configure resource locks; configure resource policies; implement and set tagging on resource groups; move resources across resource groups; remove resource groups
  • Manage role-based access control (RBAC)
    • May include but not limited to: Create a custom role, configure access to Azure resources by assigning roles, configure management access to Azure, troubleshoot RBAC, implement RBAC policies, assign RBAC Roles

Implement and manage storage (5-10%)
  • Create and configure storage accounts
    • May include but not limited to: Configure network access to the storage account; create and configure storage account; generate shared access signature; install and use Azure Storage Explorer; manage access keys; monitor activity log by using Log Analytics; implement Azure storage replication
  • Import and export data to Azure
    • May include but not limited to: Create export from Azure job; create import into Azure job; Use Azure Data Box; configure and use Azure blob storage; configure Azure content delivery network (CDN) endpoints
  • Configure Azure files
    • May include but not limited to: Create Azure file share; create Azure File Sync service; create Azure sync group; troubleshoot Azure File Sync
  • Implement Azure backup
    • May include but not limited to: Configure and review backup reports; perform backup operation; create Recovery Services Vault; create and configure backup policy; perform a restore operation.

Deploy and manage virtual machines (VMs) (20-25%)
  • Create and configure a VM for Windows and Linux
    • May include but not limited to: Configure high availability; configure monitoring, networking, storage, and virtual machine size; deploy and configure scale sets
  • Automate deployment of VMs
    • May include but not limited to: Modify Azure Resource Manager (ARM) template; configure location of new VMs; configure VHD template; deploy from template; save a deployment as an ARM template; deploy Windows and Linux VMs
  • Manage Azure VM
    • May include but not limited to: Add data discs; add network interfaces; automate configuration management by using PowerShell Desired State Configuration (DSC) and VM Agent by using custom script extensions; manage VM sizes; move VMs from one resource group to another; redeploy VMs
  • Manage VM backups
    • May include but not limited to: Configure VM backup; define backup policies; implement backup policies; perform VM restore; Azure Site Recovery

Configure and manage virtual networks (20-25%)
  • Create connectivity between virtual networks
    • May include but not limited to: Create and configure VNET peering; create and configure VNET to VNET; verify virtual network connectivity; create virtual network gateway
  • Implement and manage virtual networking
    • May include but not limited to: Configure private and public IP addresses, network routes, network interface, subnets, and virtual network
  • Configure name resolution
    • May include but not limited to: Configure Azure DNS; configure custom DNS settings; configure private and public DNS zones
  • Create and configure a Network Security Group (NSG)
    • May include but not limited to: Create security rules; associate NSG to a subnet or network interface; identify required ports; evaluate effective security rules
  • Implement Azure load balancer
    • May include but not limited to: Configure internal load balancer, configure load balancing rules, configure public load balancer, troubleshoot load balancing
  • Monitor and troubleshoot virtual networking
    • May include but not limited to: Monitor on-premises connectivity, use Network resource monitoring, use Network Watcher, troubleshoot external networking, troubleshoot virtual network connectivity
  • Integrate on premises network with Azure virtual network
    • May include but not limited to: Create and configure Azure VPN Gateway, create and configure site to site VPN, configure Express Route, verify on premises connectivity, troubleshoot on premises connectivity with Azure

Manage identities (15-20%)
  • Manage Azure Active Directory (AD)
    • May include but not limited to: Add custom domains; Azure AD Join; configure self-service password reset; manage multiple directories
  • Manage Azure AD objects (users, groups, and devices)
    • May include but not limited to: Create users and groups; manage user and group properties; manage device settings; perform bulk user updates; manage guest accounts
  • Implement and manage hybrid identities
    • May include but not limited to: Install Azure AD Connect, including password hash and pass-through synchronization; use Azure AD Connect to configure federation with on-premises Active Directory Domain Services (AD DS); manage Azure AD Connect; manage password sync and password writeback
  • Implement multi-factor authentication (MFA)
    • May include but not limited to: Configure user accounts for MFA, enable MFA by using bulk update, configure fraud alerts, configure bypass options, configure Trusted IPs, configure verification methods

That said, wishing all the best to all exam aspirants who want to become a Microsoft Certified: Azure Administrator Associate.

Azure Cognitive Services–Experience Image Recognition using Custom Vision (Build a Harrison Ford Classifier)

December 23, 2018 Algorithms, Artificial Intelligence(AI), Azure AI, Cognitive Services, Computer Vision Service, Computer Vision API, Custom Vision API, Custom Vision Service, Emerging Technologies, Machine Learning(ML)

Custom Vision Service, part of the Azure Cognitive Services landscape of pretrained API services, provides the ability to customize state-of-the-art computer vision models for your specific use case.

Using the Custom Vision service, you can upload a set of images of your choice, categorize them with tags, and automatically train image recognition classifiers to learn from those images and produce predictions when you supply an input image. You can later consume this service as an API in your existing applications.

For example, here is how an image of Hollywood actor Harrison Ford is accurately predicted by the custom model, after training with a series of pictures of Harrison Ford across different ages and looks.

I built this sample during the Global AI Bootcamp Letterkenny hands-on labs, and I will walk you through it in this article. Harrison Ford is my all-time favourite actor.

In another example, Harrison Ford was one among three people in a photo. Here is how the results look.

Here is how a picture of Harrison Ford’s son is predicted as Harrison Ford, due to similar facial characteristics. If we train this model further, we can improve its ability to come up with accurate predictions.

Now let us see how it was implemented.

In this article I am going to use a set of Harrison Ford images found on Google Images and upload them to the Custom Vision service. For better accuracy, I collected images of Harrison Ford from different stages of his life, so that the computer vision model could learn to predict more accurately.

Getting Started with Custom Vision:

The Azure Custom Vision API is a cognitive service that lets you build, deploy and improve custom image classifiers. An image classifier is an AI service that sorts images into classes (tags) according to certain characteristics. Unlike the Computer Vision service, Custom Vision allows you to create your own classifications. The Custom Vision service uses a machine learning algorithm to classify images.

Classification and object detection

Custom Vision functionality can be divided into two features. Image classification assigns a distribution of classifications to each image. Object detection is similar, but it also returns the coordinates in the image where the applied tags can be found.

To get started with our example, first you need a Microsoft Account; register/log in at https://www.customvision.ai

We are going to work through five steps:

1. Set up a Custom Vision Project

Create a new project by selecting the ‘New Project’ button.

Specify the settings as follows:

  • Name: HarrisonFordClassifier
  • Description: HarrisonFordClassifier
  • Resource Group: Leave it at the default ‘Limited Trial’
  • Project Types: Classification
  • Classification Types: Multi Label (this is essential, as we are going to add multiple tags per image in this example: say ‘Actor’, ‘Person’ and ‘Harrison Ford’)
  • Domain: General (for now)

2. Upload the Images

a.) Prepare Images

I have gathered a set of images which you can download from here; extract HF-Demo-Images.zip into a folder of your choice.

There are two folders in it: the first folder (harrisonford) contains the reference images for training the model, and the second folder (hf-quicktest) contains the quick-test images we will use for evaluating the model.

b.) Create Tags

Select the ‘+’ icon and create the following tags:

  • Actor
  • Hollywood
  • Harrison Ford
  • Person
  • Male

Enter the tag name and click on ‘Save’.

c.) Upload Images

Now that we have created all the tags, let’s upload the images and assign the respective tags.

Click on the ‘Add Images’ button and select the images from the “harrisonford” folder to upload.

d.) Assign Tags

Now specify the associated tags in the My Tags section, selecting from the drop-down.

Then click on Upload

Review the uploaded images.

3. Train

Now let us train the model by selecting the green Train button at the top right-hand side of the page.

This initiates the first training run (Iteration 1) based on the tags you assigned and the images associated with them.

Once that step is completed, let us review the output of the training.

It shows a Precision and a Recall of 100%, indicating our image classification model now achieves 100% precision and 100% recall on the training data.

PS: Precision means: of the tags the model predicted, what percentage were correct? Recall means: of the tags that should have been predicted, what percentage did our model correctly find?

4. Evaluate the Model

Now that our classifier is trained, let us evaluate its accuracy. For that we are going to use the sample images from the “hf-quicktest” folder.

a.) First click on the Quick Test button at the top.

b.) Select a local image or select an image URL

Let’s try another image.

Next let us try to upload an image of Ben Ford (Harrison Ford’s son)

5. Active Learning

Now that we have a couple of accurate predictions, active learning involves training the model again using the prediction samples we submitted. This helps the model evolve to provide more accurate predictions; for example, we can correct the model where it identified Ben Ford as Harrison Ford based on similar facial features. In the real world, he is a different person from his father.

Ben Ford is a chef by profession. So I am going to upload some of his pictures and tag them as Ben Ford, along with a couple of images of father and son together, and then initiate the training again. I hope they would not feel agitated.

Now if you look at the training performance, the Precision and Recall values have come down a bit; this is because we now have two people sharing some common tags.

Let us do a Quick Test with the previous image of Ben Ford again. Voilà! We now have an accurate prediction.

Similarly, we can repurpose some of the previous prediction images from the Predictions tab, assign them the right tags, and then retrain to evolve the model.

The End:

You have now learned how to train the Custom Vision API with a set of images and retrain it for more accuracy. Once your training is completed and you are happy with the performance, you can integrate the model into your existing apps using the Custom Vision REST APIs, as sketched below. You can follow the HOL that covers the integration topic here.
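
As a rough illustration of that integration, here is a minimal C# sketch that posts a local image to the prediction endpoint. The endpoint URL, project ID, and prediction key below are placeholders of my choosing; copy the real values from the Prediction URL dialog of your own project.

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class CustomVisionQuickTest
{
    // Placeholder values - copy the real ones from the Prediction URL
    // dialog of your Custom Vision project.
    const string PredictionKey = "<your-prediction-key>";
    const string PredictionUrl =
        "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/<project-id>/image";

    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Prediction-Key", PredictionKey);

            // Send the raw image bytes as the request body.
            var content = new ByteArrayContent(File.ReadAllBytes("hf-quicktest/sample.jpg"));
            content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

            var response = await client.PostAsync(PredictionUrl, content);

            // The JSON response contains a "predictions" array of tags
            // with their probabilities.
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}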

The Custom Vision service provides state-of-the-art classification and object detection capabilities that you can customize for your specific need in a few quick and easy steps. This helps you reduce your time to market and increase ROI (Return on Investment) for your product lines or ideas.

Start learning today using the below reference links.

References:

Disclaimer: All the images referenced in this article are available in the public domain, and no private images have been included in these examples. We respect Harrison Ford and his family’s privacy; this article is just an attempt to demonstrate the capabilities of the Azure Custom Vision service, and is in no way intended to insult or invade Mr. Harrison Ford’s privacy. I am a big fan of you, sir.

Azure DevOps–Community Launch-Letterkenny (08-January’ 2019)

December 21, 2018 .NET, .NET Core, .NET Framework, Announcements, Azure, Azure DevOps, Azure DevOps Server, Azure DevOps Services, Community, LKMUG, Microsoft

Inviting you all to the Azure DevOps Community Launch in Letterkenny on 08th Jan 2019. A few months back, Microsoft Visual Studio Team Services was rebranded as Azure DevOps.

Azure DevOps is now a suite of separate but integrated services for managing software projects, source control, build and release management, and automated testing, to enhance your productivity and team performance in whatever development and deployment environment you choose.

Martin Woodward, the Principal GPM for @AzureDevOps and Vice-President of the .NET Foundation, will be joining us in Letterkenny for this event.

To know more about Martin Woodward:

What we’ll cover:
  • Introduction to Azure DevOps
  • Azure Pipelines: Fully managed CI/CD platform that works with any language, platform, and cloud
  • Azure Repos: Source code repositories (Git/TFVC)
  • Azure Test Plans: Manual and automated testing
  • Azure Boards: Plan, track, and discuss work across your teams
  • Azure Artifacts: Package management

Additionally, we’ll cover how to use Azure Pipelines for continuous builds with your GitHub projects.

If you are interested in learning how to plan smarter, collaborate better, and ship faster, sign up now and RSVP/share it: https://www.meetup.com/lk-mug/events/255764767/

Please follow us on:

Azure DevOps Server 2019 RC1–Available/Download Now

November 21, 2018 Azure, Azure DevOps, Azure DevOps Server, Azure DevOps Services, Microsoft, TFS

Microsoft has announced the availability of the first release candidate (RC) of Azure DevOps Server 2019. Azure DevOps Server (previously TFS/Team Foundation Server) delivers the Azure DevOps services optimized for customers who prefer to self-host them on-premises.

Key features included/improved:

  • Branding Changes
  • Azure DevOps Server includes support for Azure SQL in addition to existing SQL Server support.
  • The new release management interface from the Azure DevOps cloud service is also included.

Editions Available:

  • Azure DevOps Server Express – Free version for individuals and small teams.
  • Azure DevOps Server – enterprise grade version with more seats.

Upgrading from TFS:

  • TFS 2012 and above: A direct upgrade to Azure DevOps Server is possible.
  • TFS 2010 or lower: Perform interim steps before upgrading to Azure DevOps Server 2019.

Production/Go-Live Use:

  • Azure DevOps Server 2019 RC1 includes a go-live license, making it suitable for production use right away.
  • Microsoft is, however, looking for feedback to incorporate into future RCs.

Download:

Source:

Azure Cosmos DB – TTL (Time to Live) – Reference Use Case

October 9, 2018 .NET, .NET Core, .NET Framework, Analytics, Architecture, Azure, Azure Cosmos DB, Azure Functions, Azure IoT Suite, Cloud Computing, Cold Path Analytics, CosmosDB, Emerging Technologies, Hot Path Analytics, Intelligent Cloud, Intelligent Edge, IoT Edge, IoT Hub, Microsoft, Realtime Analytics, Visual Studio 2017, VisualStudio, VS2017, Windows

The TTL capability within Azure Cosmos DB is a lifesaver, as it takes the necessary steps to purge redundant data based on the configuration you make.

Think of an industrial IoT scenario: devices produce vast amounts of telemetry, logs, and user session information that is only useful for a finite period of time, while we still operate and act on it. Once that data becomes surplus, we would normally need application logic to purge these old records.

With “Time to Live” (TTL), Azure Cosmos DB provides the ability to have your documents automatically purged from database storage after a certain period of time (which you configure).

  • TTL is set by default at the document collection level and can later be overridden on a per-document basis.
  • Once the TTL is set, Cosmos DB service will automatically remove the documents when its lifetime is over.
  • In order to track TTL, Cosmos DB uses an offset field to check when a document was last modified. This field, “_ts”, exists in every document you create; it is a UNIX epoch timestamp, and it is updated every time the document is modified. [Ref: Picture1]

[Picture1]

Enabling TTL on Cosmos DB Collection:

You can enable TTL on a Cosmos DB collection simply by using the Azure Portal –> Cosmos DB collection settings (for an existing collection, or during creation of a new one).

The TTL value needs to be set in seconds – if you need 90 days: 60 sec * 60 min * 24 hours * 90 days = 7,776,000 seconds.

[Picture2]

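If you prefer to configure TTL from code rather than the portal, below is a minimal sketch using the .NET DocumentDB SDK (Microsoft.Azure.DocumentDB package). The endpoint, key, and database/collection names are placeholders of my choosing; the "ttl" property on the document demonstrates the per-document override mentioned above.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Newtonsoft.Json;

class TtlSample
{
    // A "ttl" property (in seconds) on an individual document overrides
    // the collection's DefaultTimeToLive.
    class SensorReading
    {
        [JsonProperty("id")] public string Id { get; set; }
        [JsonProperty("ttl")] public int? TimeToLive { get; set; }
    }

    static async Task Main()
    {
        // Placeholder endpoint and key.
        var client = new DocumentClient(
            new Uri("https://<account>.documents.azure.com:443/"), "<auth-key>");

        // Collection-level TTL: documents expire 90 days after they were
        // last modified (_ts), unless overridden per document.
        var collection = new DocumentCollection { Id = "telemetry" };
        collection.DefaultTimeToLive = 7776000; // 60 * 60 * 24 * 90 seconds

        await client.CreateDocumentCollectionIfNotExistsAsync(
            UriFactory.CreateDatabaseUri("iotdb"), collection);

        // This particular reading expires after one hour instead.
        await client.CreateDocumentAsync(
            UriFactory.CreateDocumentCollectionUri("iotdb", "telemetry"),
            new SensorReading { Id = "reading-1", TimeToLive = 3600 });
    }
}
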
Below is one reference architecture in which Cosmos DB TTL would be essentially useful and viable for an IoT business case:

[Picture3]

Hope that was helpful in building some understanding. For more references, visit the Cosmos DB Documentation.

Azure Cosmos DB–Multi Master

October 8, 2018 .NET, .NET Core, .NET Framework, ASP.NET, Azure, Azure CLI, Azure Cosmos DB, CosmosDB, Data Consistency, Data Integrity, Microsoft, Multi-master, Performance, Reliability, Resiliency, Scalability, Scale Up

During Ignite 2018, Microsoft announced the general availability of the multi-master feature in Azure Cosmos DB, providing more control over data redundancy and elastic scalability for your data across regions, with multiple write and read instances.

What is Multi-Master essentially?

Multi-master is a capability provided as part of Cosmos DB that gives you multiple write regions and an option to handle conflict resolution automatically, through different options provided by the platform. Most of the major conflict scenarios you would encounter can be resolved with these simple configurations, as the sketch below illustrates.
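
As a quick taste of those options (conflict resolution will get its own post later), here is a minimal sketch, assuming the .NET SDK's ConflictResolutionPolicy type: it creates a collection that resolves conflicts automatically with a last-writer-wins policy keyed on the _ts timestamp. The account endpoint, key, and names are placeholders.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

class ConflictPolicySample
{
    static async Task Main()
    {
        // Placeholder endpoint and key.
        var client = new DocumentClient(
            new Uri("https://<account>.documents.azure.com:443/"), "<auth-key>");

        // Last-writer-wins: when two regions write the same document,
        // the write with the higher _ts (the most recent) is kept.
        var collection = new DocumentCollection
        {
            Id = "orders",
            ConflictResolutionPolicy = new ConflictResolutionPolicy
            {
                Mode = ConflictResolutionMode.LastWriterWins,
                ConflictResolutionPath = "/_ts",
            },
        };

        await client.CreateDocumentCollectionIfNotExistsAsync(
            UriFactory.CreateDatabaseUri("salesdb"), collection);
    }
}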

A sample diagram depicting a use case of a load-balanced web app writing to its respective regional master:

With multi-master, Azure Cosmos DB delivers single-digit millisecond write latency at the 99th percentile anywhere in the world, and now offers 99.999 percent write availability (in addition to 99.999 percent read availability) backed by industry-leading SLAs.

Wow! That’s an amazing level of performance for Cosmos DB to guarantee, so that mission-critical systems can aim for zero downtime once they start using Cosmos DB.

How to enable multi-master support in your Cosmos DB solutions?

Currently, multi-master can only be enabled for new Cosmos DB instances, using the “Enable Multi-Master” option in the Azure Portal, or through PowerShell, ARM templates, or the SDK.

These options are detailed below with necessary examples:

1.) Azure Portal – Enable Multi-region writes and Enable geo-redundancy

2.) Azure CLI 
Set the “enable-multiple-write-locations” parameter to “true”

az cosmosdb create \
   --name "thingx-cosmosdb-dev" \
   --resource-group "consmosify-dev" \
   --default-consistency-level "Session" \
   --enable-automatic-failover "true" \
   --locations "EastUS=0" "WestUS=1" \
   --enable-multiple-write-locations true

3.) AzureRM PowerShell
In the AzureRM PowerShell cmdlet, set the enableMultipleWriteLocations property to “true”

$locations = @(@{"locationName"="East US"; "failoverPriority"=0},
             @{"locationName"="West US"; "failoverPriority"=1})

$iprangefilter = ""

$consistencyPolicy = @{"defaultConsistencyLevel"="Session";
                       "maxIntervalInSeconds"= "10";
                       "maxStalenessPrefix"="200"}

$CosmosDBProperties = @{"databaseAccountOfferType"="Standard";
                        "locations"=$locations;
                        "consistencyPolicy"=$consistencyPolicy;
                        "ipRangeFilter"=$iprangefilter;
                        "enableMultipleWriteLocations"="true"}

New-AzureRmResource -ResourceType "Microsoft.DocumentDb/databaseAccounts" `
  -ApiVersion "2015-04-08" `
  -ResourceGroupName "consmosify-dev" `
  -Location "East US" `
  -Name "thingx-cosmosdb-dev" `
  -Properties $CosmosDBProperties

4.) Through CosmosDB SDK
Set the connection policy on the DocumentClient, with UseMultipleWriteLocations set to true, and add your preferred regions:

ConnectionPolicy policy = new ConnectionPolicy
{
   // Direct TCP connectivity for the lowest latency.
   ConnectionMode = ConnectionMode.Direct,
   ConnectionProtocol = Protocol.Tcp,
   // Opt this client in to writing against multiple regions.
   UseMultipleWriteLocations = true,
};
policy.PreferredLocations.Add("East US");
policy.PreferredLocations.Add("West US");
policy.PreferredLocations.Add("West Europe");
policy.PreferredLocations.Add("North Europe");
policy.PreferredLocations.Add("Southeast Asia");
policy.PreferredLocations.Add("Japan East");
policy.PreferredLocations.Add("Japan West");

The Azure Cosmos DB multi-master configuration is a game changer that really makes it a true global-scale database, with automatic conflict resolution capabilities for data synchronization and consistency.

In later posts I will write examples covering how conflict resolution can be configured and used in real-world scenarios.

Useful Refs: