The Azure Method

Login methods

ADAL JS provides two ways for your application to sign in users with Azure AD accounts.

Login with a redirect

This is the default method the library provides to log in users, and you can invoke it as follows:
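A minimal redirect-login sketch. The tenant and clientId values below are placeholders; substitute your own app registration details:

```javascript
// Placeholder configuration -- replace with your own registration values.
var config = {
  tenant: 'contoso.onmicrosoft.com',                 // placeholder tenant
  clientId: '00000000-0000-0000-0000-000000000000'   // placeholder app id
};

// AuthenticationContext is provided by the adal.js script loaded in the page.
if (typeof AuthenticationContext !== 'undefined') {
  var authContext = new AuthenticationContext(config);
  authContext.login();  // redirects the browser to the Azure AD sign-in page
}
```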

Login in a pop-up window

The library provides this approach for developers building apps where they want to remain on the page and authenticate the user through a popup window.
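The same sketch with the `popUp` option set, so that `login()` opens a pop-up window instead of redirecting the current page. All values are placeholders:

```javascript
var config = {
  tenant: 'contoso.onmicrosoft.com',                  // placeholder tenant
  clientId: '00000000-0000-0000-0000-000000000000',   // placeholder app id
  popUp: true  // authenticate through a pop-up window instead of a redirect
};

if (typeof AuthenticationContext !== 'undefined') {
  var authContext = new AuthenticationContext(config);
  authContext.login();  // opens the Azure AD sign-in page in a pop-up
}
```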

ID tokens and User info

ADAL JS uses the OAuth 2.0 implicit flow. As a result, the sign-in flow in ADAL JS authenticates the user with Azure AD and also obtains an ID token for your application. The ID token contains claims about the user, which are exposed through the user.profile property in ADAL JS. You can get user information as follows:
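A sketch of reading the cached user after sign-in; authContext is the AuthenticationContext instance created at login time:

```javascript
// Returns the cached user, logging a claim from the ID token if present.
function getSignedInUser(authContext) {
  var user = authContext.getCachedUser();
  if (user) {
    // user.profile exposes the ID token claims (name, upn, oid, ...)
    console.log('Signed in as ' + user.profile.name);
  }
  return user;
}
```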

Note: The ID token can also be used to make secure API calls to your application’s own backend API (which is registered in Azure AD as the same web app).

When the logout method is called, the library clears the application cache in the browser storage and sends a logout request to the Azure AD instance’s logout endpoint.
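Sign-out is a single call on the same context object; a minimal helper might look like this:

```javascript
// Clears the ADAL cache in browser storage and redirects the browser to
// the Azure AD instance's logout endpoint.
function signOut(authContext) {
  authContext.logOut();
}
```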

Things I Learnt in My First Azure Functions Project

As part of the EU General Data Protection Regulation (GDPR) requirements, we built a system in Visual Studio to conduct data minimisation using timer- and queue-triggered Azure Functions hosted on the consumption plan. Here is a list of gotchas and tips worth sharing, in the hope they can be helpful to others who are new to Azure Functions development.

Make sure it is right for you

Azure Functions is powerful, but it isn't a solution for every business requirement. There are two hosting plans an Azure Functions app can run on: the consumption plan and the App Service plan. Briefly, the consumption plan bills per execution, scales out automatically and enforces an execution timeout, while the App Service plan runs on dedicated VMs that you pay for continuously, with no cold starts and no enforced timeout. This page gives you very good insight into the pros and cons of each, and how they work.

The reason we chose Azure Functions on the consumption plan is that it integrates very well with Azure services, e.g. Cosmos DB and Storage queues, via bindings. This massively simplifies the code and lets us focus on things that are more relevant to our business requirement. Plus, the built-in retry feature comes free out-of-the-box in the queue trigger function.


Choose the right version

At the time of writing this post, the only version of Azure Functions officially recommended for production use is version 1.0, which doesn't support .NET Core. To use .NET Core, you would have to use Azure Functions 2.0, which is in preview at the moment and has a list of known issues worth checking.

Best practice, best practice, best practice

Azure Functions development isn’t the same as others, e.g cloud service or Azure website etc., which have their own dedicated virtual machines. Even more so if you choose Azure Functions consumption plan, which is different in terms of its cross-function communication, scalability and costing calculation etc. All of these could significantly impact your functions cost and performance if you do not fully understand how they work — this is why it’s important, as a developer, to have these differences in mind while coding. Here is a good article showing you one of best practices — definitely worth a read. It’s also always worth asking the Azure Functions team whenever there isn’t an obvious solution to a complex or unique design requirement you have, instead of rushing out a solution yourself.

Azure Functions 1.0 runs in an IIS worker process

Kudu is the engine behind Git deployments in Azure App Service, and it runs in its own w3wp process. After you deploy an app service to Azure, you can access Kudu under Platform features.

Once you are in Kudu, open the Process Explorer and you will see that Azure Functions 1.0 is hosted in an IIS worker process, w3wp.exe:

Your function assemblies are then loaded within the w3wp process, which acts like a container. The following is an example demonstrating how it is initialised:

Understand Azure Functions app settings

Just like any other .NET application, Azure Functions can read app settings by using the System.Environment.GetEnvironmentVariable method. In terms of how app settings reach your functions, though, it's probably not what you think.

Locally, app settings are defined in the local.settings.json file, which is only for local development use, as its name suggests. However, I've found that a lot of developers (including myself) still mistakenly think settings defined here take effect in Azure; they absolutely do not. To add a custom setting locally, simply add a new entry under Values.
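A sketch of the local flow, assuming a hypothetical setting named MyCustomSetting (the C# equivalent of the read below is System.Environment.GetEnvironmentVariable):

```javascript
// local.settings.json (local development only):
// {
//   "IsEncrypted": false,
//   "Values": {
//     "AzureWebJobsStorage": "UseDevelopmentStorage=true",
//     "MyCustomSetting": "some-value"
//   }
// }

// At runtime, entries under Values surface as environment variables,
// both locally and (when defined in Azure app settings) in the cloud.
function getSetting(name) {
  return process.env[name];
}
```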

To define app settings for use in Azure, it's done via the app service deployment in an Azure Resource Manager (ARM) template. Here is a quick-start template for reference. As you can see in the azuredeploy.json template, app settings are defined as part of the Microsoft.Web/sites resource, which must be deployed before your functions. So the Azure Functions app and the app settings the functions consume are deployed separately.
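An abridged sketch of the relevant fragment of such a template; the variable name, setting name and value are placeholders, and properties such as kind and location are omitted for brevity:

```json
{
  "type": "Microsoft.Web/sites",
  "apiVersion": "2016-08-01",
  "name": "[variables('functionAppName')]",
  "properties": {
    "siteConfig": {
      "appSettings": [
        { "name": "FUNCTIONS_EXTENSION_VERSION", "value": "~1" },
        { "name": "MyCustomSetting", "value": "some-value" }
      ]
    }
  }
}
```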

Limit queue trigger function scaling

One of the great features of Azure Functions is that the runtime automatically scales out function instances when under load. In our case, it scales out depending on the volume of messages in the queue it's listening to. However, this feature can cause problems for the downstream APIs the function depends on. What we want instead is to limit its auto-scaling so that we don't effectively DDoS ourselves when a huge number of messages arrives. Fortunately, the queue trigger allows you to constrain its power using the following two settings in the host.json file:
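A sketch of a v1 host.json with both settings dialled down so that only one message is processed at a time:

```json
{
  "queues": {
    "batchSize": 1,
    "newBatchThreshold": 0
  }
}
```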

This means the maximum number of concurrent messages that can be processed per function at any time is one (batchSize plus newBatchThreshold). You might think this could still cause problems if each function execution completes extremely fast, one after another, which is absolutely possible. We therefore put a performance test in place to give us an extra level of confidence, and its report shows the average duration of each function execution is approximately 1.2 seconds (I know, too slow!). The good news, however, is that this is nowhere near our downstream API rate limit. Happy days!

Be aware of the assembly binding redirect issue with Newtonsoft.Json

Since version 1.0.0-alpha6, the Azure Functions SDK strictly requires Newtonsoft.Json version 9.0.1 to prevent runtime failures. This can be frustrating at times, as it prevents you from using the latest Newtonsoft.Json package. And because of Newtonsoft.Json's popularity it is referenced by many other packages, so the chances of you running into this problem are very high.

In our case, this issue stopped us from using the WindowsAzure.Storage v9.2.0 package, which has a feature that supports accessing an Azure storage account via Azure AD.

Access client certificate in Azure Functions

Accessing a certificate is a very common scenario, typically when acquiring a token to access a remote protected resource. There are a few ways to achieve this. Before the Azure Functions team made it available, I saw some people do it by reading the certificate from the file system, which works for a public certificate but isn't secure enough for a private certificate.

Now, accessing a client certificate is pretty straightforward, but there are a few things worth noting:

  • How to provision a certificate onto the Azure app service. At the time of writing, the proper way of doing this is still a feature request. The way we get around it is with the New-AzureRmWebAppSSLBinding Azure PowerShell cmdlet, which is designed for uploading an SSL certificate but can still upload a client certificate, even though it raises an error that the domain doesn't match the subject name of the certificate. You can suppress the error by specifying the ErrorAction parameter, which will ignore all possible errors, or be more specific by checking the error in a try/catch block.
  • How to make certificates accessible. To make certificates available, you need a WEBSITE_LOAD_CERTIFICATES app setting whose value is the thumbprint of the certificate you want to read. Alternatively, you can set it to multiple thumbprints separated by commas, or simply to * to read all available certificates.
  • How to verify certificates are available. To verify that your certificates are accessible, you can install the Certificate Read Checker via the site extensions gallery in Kudu, which will show you whether your certificate can be read from code.
  • How to read a certificate in code. Because Azure Functions on the consumption plan is hosted on a public scale unit, certificates can only be installed under the CurrentUser personal store. This shows an example of how to read a certificate in C#.
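The upload workaround from the first bullet can be sketched as follows; the resource group, app name, file path and password are all placeholders, and -ErrorAction SilentlyContinue suppresses the subject-name mismatch error:

```powershell
New-AzureRmWebAppSSLBinding `
    -ResourceGroupName "my-resource-group" `
    -WebAppName "my-function-app" `
    -Name "example.com" `
    -CertificateFilePath "C:\certs\client-cert.pfx" `
    -CertificatePassword "placeholder-password" `
    -ErrorAction SilentlyContinue
```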

Integrate with Application Insights

There are two built-in logging options in Azure Functions: the WebJobs dashboard, which uses table storage, or Application Insights, which is much more advanced and robust. The following shows you how to provision and integrate Azure Functions with Application Insights.

Integrating with Application Insights is really straightforward. All you need to do is add your instrumentation key to your function app setting and Azure Functions will take care of the rest. For more information, see here.

In your ARM deployment template, use the reference function to retrieve the instrumentation key during your function app provisioning and add it to the function app's settings list.
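A sketch of the app setting entry, assuming a template variable appInsightsName that holds the name of your Application Insights resource:

```json
{
  "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
  "value": "[reference(resourceId('Microsoft.Insights/components', variables('appInsightsName')), '2015-05-01').InstrumentationKey]"
}
```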

If you choose Application Insights for logging, it is worth disabling the WebJobs dashboard logging to save cost; otherwise it still writes logs to table storage, which costs money. To do so, simply remove AzureWebJobsDashboard from the app settings.

How to implement dependency injection in Azure Functions

Unfortunately, dependency injection (DI) isn't supported out-of-the-box, and this makes a developer's life much harder as the project grows and more services are added. Boris Wilhelms suggests something pretty elegant to counter this: inject dependencies via the function entry method using a binding attribute, and register all services in the function binding extension. If you would like to have DI in your function project, the link above is all you need.

Most of the time you want your services to be able to log information or emit warnings when certain conditions occur. Once you have set up DI using the above approach, you can make the built-in logger available as a dependency to your services. The following shows an example of how to do that:

Above is a list of notes that I hope are helpful. If you have any questions, please leave a comment and I will do my best to answer.

Understanding Azure deployment methods

Learn the differences between the classic Azure deployment and the newer Resource Manager deployment in this tip.

Deploying Azure applications can be a confusing process. Azure applications typically consist of multiple components that need to be deployed together, and you need to coordinate the deployment of the different application pieces. In addition, if you are deploying multiple IaaS or PaaS applications, you want a standardized and repeatable process. On top of that, Microsoft is in the process of updating the Azure deployment process.

With the release of the Azure Preview Portal, Microsoft Azure now supports two deployment methods: classic deployment (also known as Service Manager deployment) and the newer Resource Manager deployment. The architectural differences between the two deployment models mean that Azure resources created using one model will not necessarily interoperate with resources created using the other. For example, Azure Virtual Machines created using the classic deployment model can only be connected to Azure Virtual Networks created using the classic deployment model. The resource providers that differ between the two deployment models are Compute, Storage and Network.

There is some overlap as a few resource providers offer two versions of their resources: one version for classic and one version for Resource Manager. This article explains the differences between the classic Azure deployment and the newer Resource Manager deployment and shows how you can use each one.

Microsoft Azure currently has two separate management portals: the original Azure Portal and the Azure Preview Portal. Although it has been nearly a year since its introduction, the Azure Preview Portal is still considered to be in "preview" mode. The original Azure Portal only supports classic deployments. You can see the classic Azure management portal in Figure 1.

Figure 1. Creating classic deployments with the Azure Portal

You can create resources in the classic deployment model in two ways: using the Azure Portal, or using the Azure Preview Portal and specifying Classic deployment. All Azure resources created with the classic deployment method must be managed individually, not as a group. Resources initially created using the classic deployment method were not part of any resource group; when Resource Manager was introduced, all of these resources were retroactively added to default resource groups. If you create a resource using classic deployment today, that resource is automatically placed in a default resource group. However, just because a resource is contained in a resource group doesn't mean it has been converted to the Resource Manager model. Virtual machines, storage and virtual networks created using the classic deployment model must be managed using classic operations.

In general, you should not expect resources created through classic deployment to work with the newer Resource Manager. You can learn more about the architecture used by the different deployment methods at Resources for ramping up on Azure Resource Manager. You can freely switch between the two management portals by clicking your account icon in the upper-right portion of the Azure Portal and then clicking Switch to Preview Portal, as you can see in Figure 1.

Resource Manager deployments are a part of the new management model that was introduced with the Azure Preview Portal. The infrastructure for your applications typically consists of multiple components. For instance, most applications will make use of a storage account, virtual machines and a virtual network, or you might have a Web application and database server. Because these resources are related, it’s desirable to deploy and manage them as a group. You can deploy, update or delete all of the resources for your solution in a single streamlined operation. Resource Manager also provides security, auditing and tagging features to help you manage your resources after deployment. You can see the Azure Preview Portal with options for both classic and Resource Manager deployment modes in Figure 2.

Figure 2. Creating Resource Manager deployments with the Azure Preview

When you create a new resource, such as a virtual machine, using the Virtual Machine link and a virtual machine image from the Azure Image Gallery, the Azure Preview Portal prompts you for the resource group that will contain the resource. You can see an example of how the Azure Preview Portal lets you manage the resources contained in a resource group in Figure 3.

Figure 3. Managing Azure resource group deployments

The Resource Manager's resource groups allow you to group related resources together, but they have other advantages as well. With Resource Manager you can create a template that defines the deployment and configuration of your application. The template is written in JSON and provides a declarative way to define a deployment; classic deployments cannot use templates. Templates enable you to repeatedly deploy your application in a standardized manner: use the template to define the infrastructure for your application, configure that infrastructure, and define how to publish your application code to it.

The Azure Resource Manager analyzes dependencies to ensure that resources defined in the template are created in the proper order. You can specify parameters in your template to enable customization. For example, you can pass parameter values that might customize your Azure deployment for a test environment and later provide different parameters to use that same template for a production deployment. You can create Azure Resource Manager templates using Visual Studio with the Azure SDK 2.6 installed. You can learn more about creating Azure Resource Manager Templates at Authoring Azure Resource Manager templates.
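As a sketch of the parameterization described above, here is a minimal (hypothetical) template skeleton whose single parameter switches a deployment between test and production; the parameter name and the derived storage account name are illustrative only:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "environment": {
      "type": "string",
      "allowedValues": [ "test", "production" ],
      "defaultValue": "test"
    }
  },
  "variables": {
    "storageAccountName": "[concat('app', parameters('environment'), 'stor')]"
  },
  "resources": []
}
```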

Going forward, Microsoft recommends that most new deployments use Resource Manager because of its ability to simplify the management of multiple related items. Further, Microsoft also recommends converting existing classic deployments to Resource Manager where possible. While Azure Resource Manager is Microsoft's recommended future path, some features present in classic deployments are not yet available in Azure Resource Manager.

For a more detailed understanding of the differences between the two deployment models, you can check out Azure Compute, Network and Storage Providers under the Azure Resource Manager.
