Channel: dotnetthoughts

How to add Open API support for Azure Functions


This post is about documenting Azure Functions using Open API. As with ASP.NET Core, enabling Open API support for Azure Functions is simple and straightforward.

I am using the QR Generator Azure Function and enabling Open API support for it. To enable Open API support, first you need to add the Open API package for Azure Functions. You can do this by running the command - dotnet add package Microsoft.Azure.WebJobs.Extensions.OpenApi --version 0.1.0-preview

When you add this package and run the function, you will see that the function displays three more URLs in addition to the function HTTP endpoint.

Azure Function running

But if you try to access them, they will return an internal server error, because the Open API configuration is missing. You can fix this by adding the following JSON values in the host.json file.

"openApi": {
  "info": {
    "version": "1.0.0",
    "title": "QR Code Generator",
    "description": "A serverless Azure Function which helps you to create QR code for URLs.",
    "termsOfService": "https://dotnetthoughts.net/",
    "contact": {
      "name": "anuraj",
      "email": "anuraj at dotnetthoughts.net",
      "url": "https://dotnetthoughts.net/"
    },
    "license": {
      "name": "MIT",
      "url": "https://anuraj.mit-license.org/"
    }
  }
}

Now if you run the function and browse the Swagger UI endpoint, you will see an Open API screen like the following.

Open API Specification

But it shows No operations defined in spec!. In ASP.NET Core this step isn't necessary, but in Azure Functions you need to explicitly configure the Open API operations and their associated requests and responses. Here is an example.

[OpenApiOperation(operationId: "QRGenerator", tags: new[] { "QRGenerator" },
    Summary = "Generate QR Code for the URL",
    Description = "Generate QR Code for the URL",
    Visibility = OpenApiVisibilityType.Important)]
[OpenApiParameter(name: "url", In = ParameterLocation.Query, Required = true, Type = typeof(Uri),
    Summary = "The URL to generate QR code",
    Description = "The URL to generate QR code",
    Visibility = OpenApiVisibilityType.Important)]
[OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "image/png", bodyType: typeof(FileResult),
    Summary = "The QR code image file",
    Description = "The QR code image file")]
[OpenApiResponseWithoutBody(statusCode: HttpStatusCode.BadRequest,
    Summary = "If the URL is missing or invalid URL",
    Description = "If the URL is missing or invalid URL")]
[FunctionName("QRGenerator")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    // ... rest of the QR generation implementation omitted here ...
}

Now if you run the app again you will be able to find something like this.

Open API Specification with Open API operation

Now you can test and verify this function with the Open API UI - if you provide a URL, it will generate the QR code image and display it in the UI. You can also import the Open API file into Azure API Management, so it can work with Azure Functions.

Happy Programming :)


OpenAPI and Versioning for ASP.NET Core Web API


This post is about how to enable and use Open API for ASP.NET Core Web API with versioning enabled. I have created a Web API project in ASP.NET Core 5.0, so Open API is enabled by default. Next I am adding the Microsoft.AspNetCore.Mvc.Versioning package to enable versioning support. And I am enabling version support using the following code in Startup.cs - ConfigureServices method.

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddApiVersioning(options =>
    {
        options.AssumeDefaultVersionWhenUnspecified = true;
        options.ReportApiVersions = true;
        options.DefaultApiVersion = new ApiVersion(1, 0);
    });
    services.AddSwaggerGen(c =>
    {
        // Following code to avoid swagger generation error
        // due to same method name in different versions.
        c.ResolveConflictingActions(descriptions =>
        {
            return descriptions.First();
        });
        c.SwaggerDoc("v1", new OpenApiInfo { Title = "Weather Forecast API", Version = "1.0" });
    });
}

Next I am adding the following code to enable version support in the controller.

[ApiController]
[ApiVersion("1.0")]
[Route("{version:apiVersion}/[controller]")]
public class WeatherForecastController : ControllerBase
{
    private static readonly string[] Summaries = new[]
    {
        "Freezing", "Bracing", "Chilly", "Cool", "Mild",
        "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
    };

    private readonly ILogger<WeatherForecastController> _logger;

    public WeatherForecastController(ILogger<WeatherForecastController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IEnumerable<WeatherForecast> Get()
    {
        var rng = new Random();
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55),
            Summary = Summaries[rng.Next(Summaries.Length)]
        })
        .ToArray();
    }
}

This will display something like this.

Open API support for Web API

In the Open API UI, you need to pass the version as a parameter, which is not a good practice. To fix this we need to implement two Open API filters: one to remove the version text box from the UI, and one to replace the version placeholder in the Open API document paths. So here is the first filter implementation, which removes the version textbox from the Open API UI.

public class RemoveVersionFromParameter : IOperationFilter
{
    public void Apply(OpenApiOperation operation, OperationFilterContext context)
    {
        var versionParameter = operation.Parameters.Single(p => p.Name == "version");
        operation.Parameters.Remove(versionParameter);
    }
}

And you can use this class in ConfigureServices like this.

services.AddSwaggerGen(c =>
{
    // Following code to avoid swagger generation error
    // due to same method name in different versions.
    c.ResolveConflictingActions(descriptions =>
    {
        return descriptions.First();
    });
    c.SwaggerDoc("v1", new OpenApiInfo { Title = "Weather Forecast API", Version = "1.0" });
    c.OperationFilter<RemoveVersionFromParameter>();
});

And if you run the application now, you will be able to see something like this - the version parameter got removed.

Version parameter removed

Now let me implement the document filter, which will replace the version in the URL path with the API version.

public class ReplaceVersionWithExactValueInPath : IDocumentFilter
{
    public void Apply(OpenApiDocument swaggerDoc, DocumentFilterContext context)
    {
        if (swaggerDoc == null)
        {
            throw new ArgumentNullException(nameof(swaggerDoc));
        }

        var replacements = new OpenApiPaths();
        foreach (var (key, value) in swaggerDoc.Paths)
        {
            replacements.Add(key.Replace("{version}", swaggerDoc.Info.Version,
                StringComparison.InvariantCulture), value);
        }

        swaggerDoc.Paths = replacements;
    }
}
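At its core the filter is a plain string replacement over the document's path keys. Here is a standalone sketch of the same transformation (in JavaScript for brevity, with hypothetical path data - this is not the Swashbuckle API):

```javascript
// Replace the "{version}" route template in every path key with the
// concrete version of the generated document - the same idea as the
// C# document filter. A single replace is enough because the template
// appears at most once per path.
function replaceVersionInPaths(paths, version) {
  const result = {};
  for (const [key, value] of Object.entries(paths)) {
    result[key.replace("{version}", version)] = value;
  }
  return result;
}
```

For example, `replaceVersionInPaths({ "/{version}/WeatherForecast": { get: {} } }, "1.0")` yields a document whose only path is `/1.0/WeatherForecast`.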

And similar to the RemoveVersionFromParameter class, you can use this class in ConfigureServices method, in the AddSwaggerGen method like this.

services.AddSwaggerGen(c =>
{
    // Following code to avoid swagger generation error
    // due to same method name in different versions.
    c.ResolveConflictingActions(descriptions =>
    {
        return descriptions.First();
    });
    c.SwaggerDoc("v1", new OpenApiInfo { Title = "Weather Forecast API", Version = "1.0" });
    c.OperationFilter<RemoveVersionFromParameter>();
    c.DocumentFilter<ReplaceVersionWithExactValueInPath>();
});

Now when you run the application you will be able to see something like this.

Version added to URL

You will see that the {version} placeholder is removed from the URL and replaced with the version value. Next we will add another version to the Web API controller, along with an associated method, like this.

[ApiController]
[ApiVersion("1.0")]
[ApiVersion("2.0")]
[Route("{version:apiVersion}/[controller]")]
public class WeatherForecastController : ControllerBase
{
    private static readonly string[] Summaries = new[]
    {
        "Freezing", "Bracing", "Chilly", "Cool", "Mild",
        "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
    };

    private readonly ILogger<WeatherForecastController> _logger;

    public WeatherForecastController(ILogger<WeatherForecastController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IEnumerable<WeatherForecast> Get()
    {
        var rng = new Random();
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55),
            Summary = Summaries[rng.Next(Summaries.Length)]
        })
        .ToArray();
    }

    [HttpGet]
    [MapToApiVersion("2.0")]
    public IEnumerable<WeatherForecast> GetV2()
    {
        var rng = new Random();
        return Enumerable.Range(1, 10).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 65),
            Summary = Summaries[rng.Next(Summaries.Length)]
        })
        .ToArray();
    }
}

This controller supports two versions, 1.0 and 2.0, and a method which is supported only in 2.0 - it returns a 10 day forecast instead of 5 days. If you run the application now, you won't see version 2.0, because even though we created version 2.0 of the API, we didn't add the Open API information about 2.0 in the startup code. We can modify the ConfigureServices method like this.

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddApiVersioning(options =>
    {
        options.AssumeDefaultVersionWhenUnspecified = true;
        options.ReportApiVersions = true;
        options.DefaultApiVersion = new ApiVersion(1, 0);
    });
    services.AddSwaggerGen(c =>
    {
        // Following code to avoid swagger generation error
        // due to same method name in different versions.
        c.ResolveConflictingActions(descriptions =>
        {
            return descriptions.First();
        });
        c.SwaggerDoc("1.0", new OpenApiInfo { Title = "Weather Forecast", Version = "1.0" });
        c.SwaggerDoc("2.0", new OpenApiInfo { Title = "Weather Forecast", Version = "2.0" });
        c.OperationFilter<RemoveVersionFromParameter>();
        c.DocumentFilter<ReplaceVersionWithExactValueInPath>();
    });
}

And modify the Configure method like this.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
        app.UseSwagger();
        app.UseSwaggerUI(c =>
        {
            c.SwaggerEndpoint("/swagger/1.0/swagger.json", "WeatherForecast 1.0");
            c.SwaggerEndpoint("/swagger/2.0/swagger.json", "WeatherForecast 2.0");
            c.RoutePrefix = string.Empty;
        });
    }

    app.UseHttpsRedirection();
    app.UseRouting();
    app.UseAuthorization();
    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllers();
    });
}

If you run the application now, you will see something like this.

Web API with Open API and Versioning support

When you select the 2.0 version you can try out the Weather Forecast method which returns 10 days forecast. You can find the source code of this blog post on GitHub.

Happy Programming :)

Securing Your Web API Using Azure Active Directory


This post is about securing your ASP.NET Core Web API applications using Azure Active Directory. First let's create an Azure Active Directory application which helps you to protect the application.

Open Azure Portal, Select Azure Active Directory, and select App registrations from the blade. Then click on the + New Registration. Provide a name for the application which you can change later. Since I am trying to protect the Weather Forecast API, I provided the name Weather Forecast.

Azure AD App Registration

Next, we need to create a scope - for the Weather Forecast API, we need only a Read scope. For that, select the Expose an API option from the Weather Forecast application and click on the + Add Scope button. It will show an Application ID URI - we can change it if required, but I am using the default one.

Add Scope

Click on the Save and Continue button. On the next screen, set the scope name as WeatherForecast.Read, select Admins and Users for the Who can consent? option. And set the display name and description for Admin and User consent screens.

Add Scope

And click on the Add Scope button to complete the process.

Now we can create the Web API application. I am creating an ASP.NET Core Web API application using Visual Studio 2019, changing the Authentication mode to Work or School Accounts, and selecting Cloud. Next you need to configure the Domain - you can get it from the Azure Active Directory overview page. Then select the Overwrite the application entry if one with the same ID exists checkbox, provide the Client Id and App Id URI from the App Registration overview page, and click OK. Next click on Create. This will take some time. Now let us explore the code generated by Visual Studio.

Here is the project file.

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
    <UserSecretsId>aspnet-WebApiAuth_Server-8d185841-534d-47ae-b55b-495c7284befa</UserSecretsId>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Authentication.JwtBearer" Version="5.0.0" NoWarn="NU1605" />
    <PackageReference Include="Microsoft.AspNetCore.Authentication.OpenIdConnect" Version="5.0.0" NoWarn="NU1605" />
    <PackageReference Include="Microsoft.Identity.Web" Version="1.1.0" />
    <PackageReference Include="Swashbuckle.AspNetCore" Version="5.6.3" />
  </ItemGroup>
</Project>

The Microsoft.Identity.Web package is the library which helps us to integrate Azure AD with an ASP.NET Core application. The other change is in the Startup class, in the ConfigureServices method.

public void ConfigureServices(IServiceCollection services)
{
    services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddMicrosoftIdentityWebApi(Configuration.GetSection("AzureAd"));
    services.AddControllers();
    services.AddSwaggerGen(c =>
    {
        c.SwaggerDoc("v1", new OpenApiInfo { Title = "WebApiAuth", Version = "v1" });
    });
}

In this method we are adding the JWT authentication middleware, and the AddMicrosoftIdentityWebApi extension method takes the Azure AD configuration settings and makes our API app authenticate against the Azure AD service. And here are the configuration values from the appsettings.json file.

{
  "AzureAd": {
    "Instance": "https://login.microsoftonline.com/",
    "Domain": "Domain name from Azure AD page",
    "TenantId": "Tenant Id from App Overview Page",
    "ClientId": "Application Id from App Overview Page",
    "CallbackPath": "/signin-oidc"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "AllowedHosts": "*"
}

In the controller, there is an Authorize attribute which will return a 401 if the request is not authenticated. And in the Get() method there is a call to HttpContext.VerifyUserHasAnyAcceptedScope - which helps to enable authorization for your web API. We need to modify this method, and instead of the default user_impersonation scope, provide the scope we created while creating the App Scope.

Here is the updated code.

[Authorize]
[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private static readonly string[] Summaries = new[]
    {
        "Freezing", "Bracing", "Chilly", "Cool", "Mild",
        "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
    };

    private readonly ILogger<WeatherForecastController> _logger;

    static readonly string[] scopeRequiredByApi = new string[] { "WeatherForecast.Read" };

    public WeatherForecastController(ILogger<WeatherForecastController> logger)
    {
        _logger = logger;
    }

    [HttpGet]
    public IEnumerable<WeatherForecast> Get()
    {
        HttpContext.VerifyUserHasAnyAcceptedScope(scopeRequiredByApi);
        var rng = new Random();
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55),
            Summary = Summaries[rng.Next(Summaries.Length)]
        })
        .ToArray();
    }
}

Now you’re done with the changes. If you try to access the https://localhost:44363/WeatherForecast (the port number will change, if you’re using VS Code or dotnet CLI it will be 5001) you will get a 401 error.

Next we will test the API with Postman. For testing this we need to configure callback URLs and a Client Secret. Click on the Authentication menu, and under Platform configurations, add the Redirect URIs - add the Postman callback URL - https://app.getpostman.com/oauth2/callback

Platform configurations

And you can create the Client secret using the Certificates & secrets menu - click New Client secret. Set the Description, and set the duration to Never. Click on Create - it will create a secret; copy it, because you won't be able to see it again.

Generate Client Secret

We have completed the configuration for connecting a client using Postman. Now open Postman, provide the URL - https://localhost:44363/WeatherForecast, then select the Authorization tab and choose OAuth 2.0 from the Type list.

You can get the values for Auth URL and Access Token URL from the Endpoints menu of the application.

Postman Authorization

Once you fill in all the fields - you can skip the State field - click on the Get New Access Token button. It will pop up the Azure AD login dialog and you can log in. Once the login is completed, Postman will show a Token, which can be used to talk to the API.

Token from Azure AD

Click on the Use Token button, which will add the Authorization header. Now we can send the GET request which will return the JSON results, like this.

Postman with Authorization header

Instead of using the API's own Client Id and secret, you can create another Azure App registration and use its Client Id. This way you can protect an ASP.NET Core Web API using Azure Active Directory.

Happy Programming :)

How to do OAuth2 Authorization in ASP.NET Core for Swagger UI using Swashbuckle


This post is about documenting an OAuth 2 protected ASP.NET Core Web API using Swashbuckle. In the last post - Securing Your Web API Using Azure Active Directory - I wrote about securing a web API with Azure Active Directory. In ASP.NET Core Web API, Open API is enabled by default. To enable OAuth2 authentication, first we need to write the following code. In this code we are adding a SecurityDefinition with the OAuth2 type, and configuring the Authentication URL, Token URL and Scopes.

c.AddSecurityDefinition("oauth2", new OpenApiSecurityScheme
{
    Type = SecuritySchemeType.OAuth2,
    Flows = new OpenApiOAuthFlows()
    {
        Implicit = new OpenApiOAuthFlow()
        {
            AuthorizationUrl = new Uri("https://login.microsoftonline.com/7493ef9e-db24-45d8-91b5-9c36018d6d52/oauth2/v2.0/authorize"),
            TokenUrl = new Uri("https://login.microsoftonline.com/7493ef9e-db24-45d8-91b5-9c36018d6d52/oauth2/v2.0/token"),
            Scopes = new Dictionary<string, string>
            {
                { "api://29a02307-5a1b-460c-85ba-9e9abb75e48d/Read.WeatherForecast", "Reads the Weather forecast" }
            }
        }
    }
});

This will display the Authorize button, like this.

Web API Open API Authorize

Clicking on the Authorize button will display Available authorizations, like the following.

Available authorizations

You need to provide the Client Id and select the scopes. Once you complete it, click on the Authorize button, which will open the Microsoft AD authentication page, but you might get an error like this - AADSTS700051: response_type 'token' is not enabled for the application.

AADSTS700051: response_type 'token' is not enabled for the application

It is because Access tokens and ID tokens are not enabled. You can enable them from the Authentication menu.

Authentication - Enable ID Tokens

Select Access tokens and ID Tokens and save the changes. Next you will get another error - because we didn't add the callback URL. We need to add the following URL - https://localhost:5001/oauth2-redirect.html - if you're using Visual Studio, use your app's port instead of 5001. You can add it under Authentication, Web, Redirect URIs, and save it. Next click on Authorize - you can log in, but you won't get a proper token. This is because we don't have a Client Secret configured, and we haven't configured authentication for the controllers and action methods.

To enable this you need to add the following code.

c.AddSecurityRequirement(new OpenApiSecurityRequirement()
{
    {
        new OpenApiSecurityScheme
        {
            Reference = new OpenApiReference
            {
                Type = ReferenceType.SecurityScheme,
                Id = "oauth2"
            },
            Scheme = "oauth2",
            Name = "oauth2",
            In = ParameterLocation.Header
        },
        new List<string>()
    }
});

This will show a lock sign near each action method; clicking on it shows the same dialog.

Authentication - Action methods

And finally add the following code in UseSwaggerUI() method.

app.UseSwaggerUI(c =>
{
    c.SwaggerEndpoint("/swagger/1.0/swagger.json", "Weather Forecast API 1.0");
    c.RoutePrefix = string.Empty;
    c.OAuthClientId("ff11294f-2cc9-18d4-bdf6-6cf26670316a");
    c.OAuthClientSecret("hXUedE-fItzZi8-7Mua4v9MxiTGx2--R_B");
    c.OAuthUseBasicAuthenticationWithAccessCodeGrant();
});

Now we have completed the configuration. Run the application and you will see the authentication icons on the UI; clicking one will show the authentication dialog with the Client Id pre-populated. Click on Authorize, which will open the Microsoft sign-in dialog. First you need to provide the email, then the password. And finally it will show the permission dialog like this.

Permission Dialog

Click on the Accept button and continue. It will authenticate and return a token. Then the open lock symbol changes to a closed lock symbol.

Authentication completed.

Clicking on the symbol again will show the authenticated dialog.

So we have completed the OAuth2 integration with ASP.NET Core Web API. In a similar way you can integrate other authentication protocols.

Happy Programming :)

Run EF Core Migrations in Azure DevOps


This post is about running Entity Framework Core migrations on Azure DevOps which helps you to continuously deploy your database changes to your staging or your QA environments or even to production - even though I won’t recommend it. We will be using Entity Framework Core Code first approach.

For this blog post I am using a Web API application - a todo list - which uses SQL Server as the backend. I am creating EF Core migrations and committing them to source control - which is important - in this case GitHub. I am building the SQL scripts from the EF Core migrations code committed in source control, and as part of the Release pipeline I am executing them against the Azure SQL Server.

Here is what my build pipeline looks like.

Azure DevOps Build Pipeline

It is a normal .NET Core build pipeline, but I have added a few more steps to build and generate EF Core migrations.

  1. Create Manifest file - This is again a dotnet command which helps you to create a manifest file, using the dotnet new tool-manifest command. This is required because I am installing the dotnet ef tool locally instead of installing it globally.

  2. Install EF tool - This command installs the dotnet ef tool locally, so that I can build the SQL script from the migrations. This step uses the dotnet tool install dotnet-ef command.

  3. Create SQL Scripts - This step will generate SQL scripts using the dotnet ef command, with the following arguments - dotnet ef migrations script --output $(Build.SourcesDirectory)/SQL/tododbscript.sql --idempotent --project $(Build.SourcesDirectory)/src/TodoApi.csproj --context TodoApiDbContext. In this command the --idempotent parameter is important; otherwise it might not work the way you expect.

  4. Publish Artifacts : SQL Scripts - This step will publish the SQL scripts generated by the Create SQL Scripts step as artifacts.
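For context, the --idempotent flag makes dotnet ef wrap every migration in a check against the __EFMigrationsHistory table, so the same script can safely be run repeatedly against a database in any state. The generated file looks roughly like this (an illustrative fragment - the table and migration names below are assumptions based on a typical todo project):

```sql
-- Illustrative fragment of an idempotent EF Core migration script.
-- Each migration is guarded by a check against the migrations history table.
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210101000000_InitialCreate')
BEGIN
    CREATE TABLE [TodoItems] (
        [Id] int NOT NULL IDENTITY,
        [Title] nvarchar(max) NULL,
        [IsComplete] bit NOT NULL,
        CONSTRAINT [PK_TodoItems] PRIMARY KEY ([Id])
    );
END;
GO

IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210101000000_InitialCreate')
BEGIN
    INSERT INTO [__EFMigrationsHistory] ([MigrationId], [ProductVersion])
    VALUES (N'20210101000000_InitialCreate', N'5.0.0');
END;
GO
```

Without --idempotent the script assumes a database with no migrations applied, which is why rerunning it against an already-migrated database fails.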

Here is the complete YAML script.

steps:
- checkout: self
- task: DotNetCoreCLI@2
  displayName: New Manifest for tool
  inputs:
    command: custom
    custom: 'new'
    arguments: tool-manifest
- task: DotNetCoreCLI@2
  displayName: Install EF Tool
  inputs:
    command: custom
    custom: 'tool'
    arguments: install dotnet-ef
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: restore
    projects: $(BuildParameters.RestoreBuildProjects)
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    projects: $(BuildParameters.RestoreBuildProjects)
    arguments: --configuration $(BuildConfiguration)
- task: DotNetCoreCLI@2
  displayName: Create SQL Scripts
  inputs:
    command: custom
    custom: 'ef'
    arguments: migrations script --output $(Build.SourcesDirectory)/SQL/tododbscript.sql --idempotent --project $(Build.SourcesDirectory)/src/TodoApi.csproj --context TodoApiDbContext
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: True
    projects: $(BuildParameters.RestoreBuildProjects)
    arguments: --configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)
    zipAfterPublish: True
- task: PublishBuildArtifacts@1
  displayName: Publish Artifact
  condition: succeededOrFailed()
  inputs:
    PathtoPublish: $(build.artifactstagingdirectory)
    TargetPath: '\\my\share\$(Build.DefinitionName)\$(Build.BuildNumber)'
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: SQL Scripts'
  inputs:
    PathtoPublish: $(Build.SourcesDirectory)/SQL/tododbscript.sql
    ArtifactName: SQLScripts

This script will create two artifacts - the drop zip file and SQL Script file.

Azure DevOps Generated Artifacts

Next I created a Release pipeline which takes these artifacts and deploys them to the Azure Web App and SQL Server. This step is optional - you can deploy them as part of the build pipeline. But it is recommended to use a Release pipeline, so that we can control the number of deployments to the environments.

Azure DevOps Release pipeline

The first step will deploy the drop folder to the web app using Web Deploy. Second, I am adding the Azure SQL Database deployment task, in which I am configuring the authentication mode (SQL Server), username, password and database. Then select the Deploy type as SQL Script File and select the SQL script file - $(Build.SourcesDirectory)/SQL/tododbscript.sql.

You can configure the trigger to deploy when a build completes, or you can do it manually. Here is the completed Release pipeline.

Azure DevOps Release pipeline completed

This way you can configure your application's database deployment using Azure DevOps and Entity Framework Core migrations - you need to make sure you're committing the migrations to source control. Unlike running the dotnet ef database update command, the script command won't create the database. You need to create an empty database and configure it in the SQL task in the release pipeline.
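The generated script creates the tables and the __EFMigrationsHistory bookkeeping table, but never the database itself, so a one-time setup against the server is enough before the first release runs (the database name here is an assumption for illustration):

```sql
-- Run once against the Azure SQL server before the release pipeline
-- executes the migration script. The database name is illustrative.
CREATE DATABASE [TodoDb];
```

After this, every release can apply the idempotent script to the same database without any further manual steps.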

Happy Programming :)

How to configure CSP Headers for Static Website Hosted in Azure Blob Storage


This post is about configuring CSP headers for a static website hosted in Azure Blob Storage. If you're running a Single Page Application, hosting it from the Azure Blob service is easy and cost effective. Long back I wrote a blog post on how to do this - Simple Static Websites using Azure Blob service. One of the challenges in this approach is configuring security headers. If you check your application with tools like https://securityheaders.com, you will get a very low score, like F.

Azure Static site from Blob - With Rating F

In this post I will explain how to improve this score and get an A+ with the help of Azure CDN. First I created a storage account and configured it to host a static web app - you can find more details on how to do this in the post linked above. Once it is provisioned and running, I am associating an Azure CDN with this storage account. We can do this from the Azure CDN menu of the storage account.

Azure CDN configuration

You need to choose the pricing tier Standard Microsoft, and the Origin hostname as your storage account URL with the static website associated. Click on the Create button - it might take some time to provision the CDN. Once the provisioning is completed, click on the hostname to open the Azure CDN details.

Azure CDN configuration - Endpoint

On the Azure CDN, select the Rule Engine menu, it will display something like this.

Azure CDN configuration - Rule Engine

Click on the Add rule option, which will display a configuration section like the following.

Add Rule - Rule Engine

Provide a name, click on Add Condition, and choose the Request URL option. This will show a condition section at the bottom; select the Any operator, so the rule will be applied to all incoming URLs. If you're using the Global rule you can directly add actions - you don't require any conditions - but there is a limit of 5 actions per rule, so if you need to add more than 5 actions you need to follow this method. Next click on Add Action, and select Modify Response Header. Similar to the condition, this will add one more section; in it, choose the Action as Append, set X-Frame-Options as the HTTP header name, and set SAMEORIGIN as the HTTP header value.

You need to configure following values.

These are the configuration values I am using in my application - they can change based on your application's requirements. Please look into the results from securityheaders.com and configure the headers based on your application's requirements.

Action | HTTP header name          | HTTP header value
------ | ------------------------- | -----------------
Append | X-Frame-Options           | SAMEORIGIN
Append | X-Content-Type-Options    | nosniff
Append | Content-Security-Policy   | default-src https: data:
Append | Strict-Transport-Security | max-age=31536000; includeSubDomains
Append | X-Xss-Protection          | 1; mode=block
Append | Referrer-Policy           | strict-origin
Append | Permissions-Policy        | accelerometer=(); camera=(); geolocation=(); gyroscope=(); magnetometer=(); microphone=(); payment=(); usb=()

And here is my Azure CDN Rule engine with the completed configuration.

Rule Engine configuration completed

Now that we have completed the configuration, let's run the URL through securityheaders.com again and check the results.

Security Headers Result A+

And we got an A+ as the result. Some security tools will show a warning if the response returns a Server header. We can remove that by adding one more action: choose Delete instead of Append, and in the HTTP header name option provide Server as the value. Now if you check, the Server response header won't be there. This way you can make your application more secure without web.config or any other server side technologies. We could use Azure Function proxies to achieve the same results.

Happy Programming :)

Running Playwright on Azure Functions


This post is about running Playwright on Azure Functions and deploying it to Azure. Playwright is a Node.js library to automate Chromium, Firefox and WebKit with a single API. Playwright is built to enable cross-browser web automation that is ever-green, capable, reliable and fast. Since it is a Node.js library, I am creating this Azure function in JavaScript. The function reads a URL as an input parameter and converts that URL into a PDF file. So first I am creating an Azure Function with JavaScript.

I am creating the function using VS Code. I created a function using JavaScript and an HttpTrigger. Next I am installing the Playwright package using the command - npm install playwright. This command installs Playwright and browser binaries for Chromium, Firefox and WebKit. And I am installing playwright-chromium as well, so that I can access the library in the code.

Here is my package.json file

{
  "name": "url2pdf",
  "version": "1.0.0",
  "description": "",
  "scripts": {
    "start": "func start",
    "test": "echo \"No tests yet...\""
  },
  "dependencies": {
    "playwright": "^1.6.2",
    "playwright-chromium": "^1.6.2"
  },
  "devDependencies": {}
}

Next modify the index.js file like this.

const { chromium } = require("playwright-chromium");

module.exports = async function (context, req) {
    const url = req.query.url || "https://dotnetthoughts.net/";
    const browser = await chromium.launch();
    const page = await browser.newPage();
    // emulateMedia expects an options object.
    await page.emulateMedia({ media: 'screen' });
    await page.goto(url);
    const screenshotBuffer = await page.pdf({ format: 'A4' });
    await browser.close();

    context.res = {
        body: screenshotBuffer,
        headers: {
            "content-type": "application/pdf",
            "Content-disposition": "attachment; filename=" + url + ".pdf"
        }
    };
};

This code will launch a Chromium instance, open a new tab, and navigate to the URL in the request. Once navigation is completed, it will convert the page to PDF and return the bytes in the response as a PDF file.
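One thing to watch out for: the code above builds the Content-Disposition filename from the raw URL, which contains characters such as ':' and '/' that are not valid in a filename. A small helper (hypothetical - not part of the original function) can derive a safer name from the URL; a sketch:

```javascript
// Derive a filesystem-safe PDF filename from a URL (hypothetical helper).
// Keeps only the host and path, replaces runs of unsafe characters with '-'.
function toPdfFileName(url) {
  const { hostname, pathname } = new URL(url);
  const base = (hostname + pathname)
    .replace(/[^a-zA-Z0-9._-]+/g, "-") // collapse unsafe runs into '-'
    .replace(/^-+|-+$/g, "");          // trim leading/trailing dashes
  return `${base || "page"}.pdf`;
}
```

With this, `toPdfFileName("https://dotnetthoughts.net/")` would yield `dotnetthoughts.net.pdf`, which could be used in the Content-disposition header instead of the raw URL.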

Next I am deploying it to Azure. To do that, first you need to install the Azure extension for VS Code and connect to your subscription.

VSCode Azure Function

This will connect the subscription and display your function as a Local Project. You can click on the blue Deploy to Function app icon to deploy it to Azure; you can choose an existing function app or create one. Before deploying to Azure, make sure you change the VS Code settings to run npm install on the server instead of the development machine. To do this, select the Azure Functions extension and click on the settings.

VSCode Azure Function

And select the option Azure Functions: SCM Do Build During Deployment - this will build the project on the server instead of the client. It is applicable only for Linux function apps. So next I clicked on the Deploy to Function app icon, which prompts a few questions and helps you to create the Azure function. I chose the Create new Function app in Azure - Advanced option, provided a unique name, chose Node.js 12 LTS as the runtime stack, Linux as the OS, the Consumption hosting plan, created a new resource group and storage account, and added an Application Insights resource. It will deploy the function to Azure. But it might not work as expected, because Playwright downloads Chromium to a location outside the function app's folder. To include Chromium in the build artifacts, we need to configure Playwright to install Chromium in the app's node_modules folder. To do this, we need to create an app setting named PLAYWRIGHT_BROWSERS_PATH with a value of 0 in the function app in Azure. This setting is used by Playwright at run-time to locate Chromium in node_modules. If you create the function from the portal first instead of creating it during deployment, you can avoid deploying the app again after this change.

Playwright configuration in Azure Function app

You can also remove the preDeployTask and postDeployTask from .vscode/settings.json file. Here is the updated settings.json

{
  "azureFunctions.deploySubpath": ".",
  "azureFunctions.projectLanguage": "JavaScript",
  "azureFunctions.projectRuntime": "~3",
  "debug.internalConsoleOptions": "neverOpen"
}

Now deploy it again. This will show a prompt about overwriting the deployment; select the Deploy option. It will deploy the app and provide the URL of the function. While deploying I found one issue - the Azure Functions extension setting SCM Do Build During Deployment was not working, so I had to include it in the settings.json like this.

{
  "azureFunctions.deploySubpath": ".",
  "azureFunctions.projectLanguage": "JavaScript",
  "azureFunctions.projectRuntime": "~3",
  "debug.internalConsoleOptions": "neverOpen",
  "azureFunctions.scmDoBuildDuringDeployment": true
}

Now browse the function app, passing a URL in the query string, and it will download a PDF file of that URL.

Happy Programming :)

How to configure Postman API tests in Azure DevOps


This post is about configuring Postman API tests in Azure DevOps. Postman makes API development easy. The Postman platform offers tools to simplify each step of the API building process and streamlines collaboration so you can create better APIs faster. In this post I am discussing configuring Postman API testing in Azure DevOps and GitHub Actions.

First I created a collection in Postman; I am using some API endpoints for this post. Along with that, I am adding a few test cases. Here is one request with a test.

Postman Request

Once it is executed and working as expected, click on the collection and export it.

Postman Export Collection

From the Export collection dialog, choose Collection v2.1 (recommended), and save the JSON file. For demo purposes I am not using any environment variables. If you're using any tokens or variables, you can use Postman environments, and you need to export those as well.

To run the Postman collection in DevOps pipelines you need to use the newman CLI tool from Postman, which helps you run Postman collection files. It can be installed using the npm command npm install -g newman. And you can run the tests using the command newman run TestAPICollection.json - this command will execute the test cases and print the output to the console.

Postman Running Console

Next I committed the JSON file to source control - I am using GitHub. Once you commit the JSON file we can start creating the Azure DevOps pipeline. I am creating an Azure DevOps build pipeline using the Classic Editor option, not the YAML option. Next I am adding a few build steps - the first build step is an npm task which installs newman. The next step is a command line script which runs newman with the collection file. And the final step is Publish Test Results - this step will publish the test results from the previous step. We need to provide one more parameter to newman which will export the results as JUnit XML.
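The command line script step boils down to a single newman invocation; here is a sketch - the collection file name matches this example, and the results path is illustrative:

```shell
# Run the Postman collection, ignore a failing-test exit code (-x) so the
# build continues, and export the results as JUnit XML for the publish step.
newman run Testing.postman_collection.json \
  -x \
  --reporters cli,junit \
  --reporter-junit-export Results/JunitResults.xml
```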

Here is the YAML file.

resources:
  repositories:
  - repository: self
    type: git
    ref: master
jobs:
- job: Job_1
  displayName: Agent job 1
  pool:
    vmImage: vs2017-win2016
  steps:
  - checkout: self
  - task: Npm@1
    displayName: npm custom
    inputs:
      command: custom
      verbose: false
      customCommand: install -g newman
  - task: CmdLine@2
    displayName: Command Line Script
    inputs:
      script: newman run $(Build.SourcesDirectory)\Testing.postman_collection.json -x -r junit --reporter-junit-export $(build.artifactstagingdirectory)\Results\JunitResults.xml
  - task: PublishTestResults@2
    displayName: Publish Test Results
    inputs:
      testResultsFiles: '**\*.xml'
      searchFolder: $(build.artifactstagingdirectory)

And here is the Build pipeline.

Postman Running Console

If you notice, the newman run command comes with a few more parameters which help publish the results as a JUnit XML file. The -x flag will ignore the exit code and continue the build steps; if you want to stop the build execution when any of the tests fail, remove it. In this post I built a pipeline specifically for monitoring purposes, which is why I added the ignore exit code parameter. Here are the results after running the pipeline.

DevOps Pipeline Tests

You can find more details about the test run under Test Plans > Runs, including the test runs and details about individual tests.

DevOps Pipeline Tests

You can configure it to run with a scheduled trigger or as a continuous delivery option, so that these API tests can validate your app and show the results. You can use a similar script to execute the same test cases in GitHub Actions as well.

Happy Programming :)


Scaffold an entire .NET 5 Web API using Wrapt


This post is about scaffolding an entire .NET 5 Web API with a simple YAML or JSON file using Wrapt, so you can focus on the high-value features in your web app. Recently I came across this tool which helps you scaffold a Web API application in Clean Architecture with Open API documentation, unit tests, and integration tests by providing a YAML or JSON file.

To get started you need to install a project template and a dotnet tool. So first you need to run the command dotnet new -i Foundation.Api - this command will install the Foundation API project template. Next you need to install the craftsman tool which helps you scaffold the project. You can install it using the dotnet tool install -g craftsman command. Once the installation is successful, you can run the craftsman command, which will display an output like this.

Craftsman command

Next you need to create a YAML or JSON file - it is the template from which craftsman is generating the solution and projects. For this demo I am building a todo application YAML template file, which will look like this.

SolutionName: TodoApi
DbContext:
  ContextName: TodoDbContext
  DatabaseName: TodoDb
  Provider: SqlServer
Entities:
- Name: TodoItem
  Properties:
  - Name: TodoId
    IsPrimaryKey: true
    Type: int
    CanFilter: true
    CanSort: true
  - Name: Name
    Type: string
    CanFilter: true
    CanSort: true
  - Name: IsCompleted
    Type: bool
    CanFilter: true
    CanSort: true
Environments:
- EnvironmentName: Startup
  ConnectionString: "Data Source=.;Initial Catalog=TodoDb;Integrated Security=True;Encrypt=True;TrustServerCertificate=True;"
  ProfileName: Dev
SwaggerConfig:
  Title: TodoApiService
  Description: API for managing todo items.
  ApiContact:
    Name: Anuraj
    Email: support@dotnetthoughts.net
    Url: https://dotnetthoughts.net

The YAML file is simple and straight forward. Here is the detailed explanation of each element.

  1. SolutionName - This element is used to configure the application solution name.
  2. DbContext - This element is about configuring the database context associated with the application. It has other properties like the DbContext name, database name, and provider - currently only SqlServer and PostgreSQL are supported.
  3. Entities - This element helps you configure the database entities in the application. For the todo list application I am configuring only one entity. You need to provide the name of each property, its type, and associated metadata like sort and filter support.
  4. Environments - This element helps you configure environment-specific settings - the connection string and associated variables for a given environment.
  5. SwaggerConfig - This element helps you build the Open API documentation for the Web API.

You can copy/paste the YAML file and save it as todo-api.yaml. Next let's build the web API. You can run the command craftsman new:api .\todo-api.yaml, which will parse the YAML file and scaffold the solution and projects.

Craftsman New API

So we successfully scaffolded the application. Next change the directory to TodoApi and execute the command dotnet run --project webapi - the webapi folder contains the API project file. Once the application is running, browse the /swagger endpoint which will display the Open API documentation like this.

Open API documentation

And here is the folder structure created by the tool.

Solution Explorer

This project is created using Clean Architecture; because of that, the files and projects are structured in a certain way. A while back I wrote a blog post on how to generate Angular code from OpenAPI; using those utilities we can build a web API client application as well.

You can find more details about Wrapt from https://wrapt.dev/. And here is the getting started and tutorial. Explore it and let me know your thoughts.

Happy Programming :)

How to use FastReport Open Source in ASP.NET Core


If you’re coming from the VB6 / WinForms world, one of the challenges in ASP.NET Core is the lack of reporting tools. In VB6, Crystal Reports was there, and if I am not wrong it was part of some Visual Studio versions. I was exploring a free reporting tool for one of my old companies as they were planning to move their VB6 app to ASP.NET Core and the cloud. So let's explore FastReport Open Source.

I am using an ASP.NET Core 5.0 MVC application for this demo, with the Northwind database. I am displaying the Categories table in a view with the help of the FastReport component. To use FastReport, you need the FastReport Designer - I am using the Community Edition. They offer an online edition as well, which is paid. Open the FastReport Designer executable and select the Standard Report Wizard under the Create New option.

Fast Report Designer

Then click on the Create new datasource button in the Standard Report Wizard, which helps you configure a data source for the report.

Standard Report Wizard - Step 1

From that screen, click on New Connection button - which will help you to connect to a Data source.

Data Wizard - Step 1

In the Connection dialog, choose the MS SQL Connection option; the dialog will expand and you can configure your SQL Server connection here. Since I am connecting to my local SQL Server, I provided the server name as . and, since it is in Windows Authentication mode, I chose that option; finally, the database - which is Northwind. You can do a Test Connection and click OK to complete the MS SQL Connection.

Data Wizard - Step 2

Once you click OK, the connection name will be populated on the screen - it depends on the database you are connecting to. You can click the Edit Connection button if you want to modify anything, like changing the authentication mode or switching to a different database.

Data Wizard - Step 3

Click on the Next button to move to the next screen - in that screen you will be able to see all the tables in the database. Select the categories table and click on Finish.

Data Wizard - Step 4

Now you have completed the creation of the Data Source. Once you finish the Data Wizard - you will be able to see the Categories table populated on the Select Data Source dialog.

Standard Report Wizard - Step 2

Click Next and choose the fields you would like to display in the report. I am choosing all the fields; you can choose them depending on your needs.

Standard Report Wizard - Step 3

Next you can choose fields to group by; I am not doing that in this demo. Click Next, and on this screen you can choose the layout and orientation - I chose Portrait orientation and Columnar layout. On the next screen you can choose the report style - I chose the Blue style. Click Finish to complete the Standard Report Wizard, and you will see the Report Designer.

Fast Report Designer - with Categories

You can click on File > Preview or Ctrl + P to see the Preview - this will show the report in preview mode with categories.

Fast Report Preview

If you notice, the Picture field is displayed as System.Byte[] on the preview screen; that is because in the designer it is a text field. You can drag a Picture control from the toolbar, replace the picture textbox with it, and configure the picture box's data field as Picture.

Fast Report Preview

Now if you run the preview you will see the images instead of System.Byte[]. So you have successfully created a report in the designer. To use it, you need to save the report; I am saving it as northwind-categories.frx.

Let's integrate the report into ASP.NET Core MVC. To do this, I created an ASP.NET Core MVC project and added the following packages - FastReport.OpenSource, FastReport.OpenSource.Data.MsSql, and FastReport.OpenSource.Web. For connecting to SQL Server and scaffolding the database context and model classes I am using the EF Core packages as well. Here is the updated project file.

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>net5.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="FastReport.OpenSource" Version="2021.1.7" />
    <PackageReference Include="FastReport.OpenSource.Data.MsSql" Version="2021.1.7" />
    <PackageReference Include="FastReport.OpenSource.Web" Version="2021.1.7" />
    <PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="5.0.2">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>
    <PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="5.0.2" />
  </ItemGroup>
</Project>
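With the packages in place, the Northwind database context and model classes can be scaffolded with the dotnet-ef CLI. A sketch, assuming a local SQL Server with the Northwind database attached - adjust the connection string and output folder to your environment:

```shell
# Install the EF Core CLI tool (once per machine).
dotnet tool install -g dotnet-ef

# Scaffold the database context and model classes from Northwind.
dotnet ef dbcontext scaffold \
  "Server=.;Database=Northwind;Trusted_Connection=True;" \
  Microsoft.EntityFrameworkCore.SqlServer \
  --output-dir Models
```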

Next, using the EF Core dotnet tool, I scaffolded the Northwind database - the database context and model classes; you can find more details on how to do this in this blog post. Then I copied the report file to a reports folder in the application root and modified the Startup class's Configure method to use FastReport, like this.

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }
    else
    {
        app.UseExceptionHandler("/Home/Error");
        app.UseHsts();
    }
    app.UseHttpsRedirection();
    app.UseFastReport();
    app.UseStaticFiles();
    app.UseRouting();
    app.UseAuthorization();
    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllerRoute(
            name: "default",
            pattern: "{controller=Home}/{action=Index}/{id?}");
    });
}

Also I modified the ConfigureServices method to configure the Database Connection to SQL Server for Fast Report.

public void ConfigureServices(IServiceCollection services)
{
    FastReport.Utils.RegisteredObjects.AddConnection(typeof(MsSqlDataConnection));
    services.AddDbContext<NorthwindContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("NorthWindConnection")));
    services.AddControllersWithViews();
}

And I added the following code in an action method which will display the Category report in the MVC View.

public IActionResult Report()
{
    var webReport = new WebReport();
    var mssqlDataConnection = new MsSqlDataConnection();
    mssqlDataConnection.ConnectionString = _configuration.GetConnectionString("NorthWindConnection");
    webReport.Report.Dictionary.Connections.Add(mssqlDataConnection);
    webReport.Report.Load(Path.Combine(_hostEnvironment.ContentRootPath, "reports", "northwind-categories.frx"));
    var categories = GetTable<Category>(_northwindContext.Categories, "Categories");
    webReport.Report.RegisterData(categories, "Categories");
    return View(webReport);
}

The GetTable method is a static method which helps me convert an IEnumerable to a DataTable using reflection. In this method, I also added code to fix the image display issue - Northwind stores its category pictures with a 78-byte OLE header, which needs to be stripped before the bytes can be rendered as an image.

static DataTable GetTable<TEntity>(IEnumerable<TEntity> table, string name) where TEntity : class
{
    var offset = 78;
    DataTable result = new DataTable(name);
    PropertyInfo[] infos = typeof(TEntity).GetProperties();
    foreach (PropertyInfo info in infos)
    {
        if (info.PropertyType.IsGenericType && info.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>))
        {
            result.Columns.Add(new DataColumn(info.Name, Nullable.GetUnderlyingType(info.PropertyType)));
        }
        else
        {
            result.Columns.Add(new DataColumn(info.Name, info.PropertyType));
        }
    }
    foreach (var el in table)
    {
        DataRow row = result.NewRow();
        foreach (PropertyInfo info in infos)
        {
            if (info.PropertyType.IsGenericType && info.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>))
            {
                object t = info.GetValue(el);
                if (t == null)
                {
                    t = Activator.CreateInstance(Nullable.GetUnderlyingType(info.PropertyType));
                }
                row[info.Name] = t;
            }
            else
            {
                if (info.PropertyType == typeof(byte[]))
                {
                    // Fix for Image issue.
                    var imageData = (byte[])info.GetValue(el);
                    var bytes = new byte[imageData.Length - offset];
                    Array.Copy(imageData, offset, bytes, 0, bytes.Length);
                    row[info.Name] = bytes;
                }
                else
                {
                    row[info.Name] = info.GetValue(el);
                }
            }
        }
        result.Rows.Add(row);
    }
    return result;
}

And here is the View code.

@model FastReport.Web.WebReport
@{
    ViewData["Title"] = "Categories - Report";
}
@await Model.Render()

And here is the report displaying on ASP.NET Core MVC View.

Fast Report Display on ASP.NET Core MVC

This way you can display a FastReport report in an ASP.NET Core MVC view. You can configure which buttons should be displayed in the toolbar. And you can play around with the content source - instead of pushing the Categories collection, you can run a query and push the output as the data source. You can also explore the FastReport Designer and design complex reports. You can download the FastReport Designer from here. And you can find more FastReport open source blog posts here

You can find the source code for this article on GitHub.

Happy Programming :)

A/B Testing with Azure App Service


The A/B testing feature helps you test new website content, processes, workflows, etc. by routing traffic into multiple slots. At a very high level, you route your users into two different deployments of code and measure the success of each version of the site based on your requirements. Azure App Service helps us set this up very quickly with the help of deployment slots.

To use this feature, you need to configure deployment slots for the Azure App Service. The deployment slot feature is only available on the Standard and Premium pricing SKUs. If you're not running App Service on a Standard or Premium SKU, you will see something like this.

Standard and Premium pricing SKU

You can click on the Upgrade button and change the SKU to Standard or Premium. For this blog post I am upgrading to Standard SKU.

App Service Spec Picker

It will take a few seconds to upgrade the SKU, and once it is done, the Deployment slots option will be enabled. Once enabled, the default slot is marked as production.

App Service Deployment Slots enabled

You can add a slot by clicking the Add Slot button. I am adding a staging slot. Since I don't have any configuration, I am selecting the Do not clone settings option; if you have configuration, like app settings or connection strings, you can clone it from other slots. The slot is also an Azure App Service, and most App Service features are available for slots as well.
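Creating the slot can also be scripted with the Azure CLI; a sketch with placeholder app and resource group names:

```shell
# Create a staging deployment slot on an existing App Service.
az webapp deployment slot create \
  --name abtestingdemo \
  --resource-group my-rg \
  --slot staging
```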

VS Code App Service Deployment

Next I am creating an ASP.NET Core MVC app to deploy to these App Service slots. While creating the app service I chose .NET Core 3.1 as the runtime; you can run the following command to create a .NET Core 3.1 app - dotnet new mvc --framework netcoreapp3.1 -o abtestingdemo. Next I am deploying the app to the production slot; I am using Visual Studio Code to deploy the app to the slot. Once it is successfully deployed you can browse the application and verify whether it is working or not. Next, you need to modify the code and deploy it to the staging slot. I modified the Index view in the application like this, and I am deploying it to the staging slot.

<div class="text-center">
    <h1 class="display-4">Welcome to v2 of the App Service</h1>
    <p>Learn about <a href="https://docs.microsoft.com/aspnet/core">building Web apps with ASP.NET Core</a>.</p>
    <button class="btn btn-primary">Send Feedback</button>
</div>

To deploy to a slot from VSCode, you need to right click on the Slot and choose the Deploy to Slot command.

VS Code Deployment to slot

Once the deployment is completed, you can open the Deployment slots option in the Azure App Service and configure Traffic % to 30 for the staging slot. This will make sure 30% of traffic is redirected to the staging slot and the remaining 70% to the production slot.

Traffic percentage configuration to deployment slots
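The same traffic split can be set from the Azure CLI; a sketch - the app and resource group names are placeholders:

```shell
# Route 30% of traffic to the staging slot; the rest stays on production.
az webapp traffic-routing set \
  --name abtestingdemo \
  --resource-group my-rg \
  --distribution staging=30
```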

Even though the app service URL will remain the same, the traffic will be redirected to different slots. You can identify the slot with a cookie named x-ms-routing-name: for the staging slot its value will be staging, and for production it will be self.

Slots Cookies

You can test this behavior by browsing the app and refreshing the browser. To verify it I wrote a C# console application which hits the server, reads the cookie, and identifies whether it is the production slot or the staging slot.

class Program
{
    static int productionHit = 0;
    static int stagingHit = 0;

    static void Main(string[] args)
    {
        var numberOfIterations = 10;
        for (int i = 0; i < numberOfIterations; i++)
        {
            HitUrl();
        }
        Console.WriteLine(string.Format("Number of iterations:{0} - Production Slot:{1} - Staging Slot:{2}",
            numberOfIterations, productionHit, stagingHit));
    }

    static void HitUrl()
    {
        var appUrl = "https://abtestingdemo.azurewebsites.net/";
        var cookies = new CookieContainer();
        var handler = new HttpClientHandler();
        handler.CookieContainer = cookies;
        var client = new HttpClient(handler);
        var response = client.GetAsync(appUrl).Result;
        Uri uri = new Uri(appUrl);
        var responseCookies = cookies.GetCookies(uri).Cast<Cookie>();
        foreach (Cookie cookie in responseCookies)
        {
            if (cookie.Name == "x-ms-routing-name")
            {
                if (cookie.Value == "self")
                {
                    productionHit++;
                }
                else
                {
                    stagingHit++;
                }
            }
        }
    }
}

If you run this app, you will see something like Production Slot: 7 and Staging Slot: 3. Instead of automatically driving users to the staging slot, you can implement a Beta link so that users can opt in to the beta or staging version. To do this, create a link with the query string x-ms-routing-name and the slot name as the value. For the staging slot or beta you can create something like ?x-ms-routing-name=staging; if a user clicks this, they will be redirected to the staging slot. You can identify the slot by configuring an environment variable and selecting the slot setting option. And once a user switches to a slot, that slot will be served in future requests. You can integrate Azure Application Insights into the app service and identify how users are using the features. The Azure App Service team posted a series on A/B testing with App Service; I am including the links for your reference. You can configure an Azure DevOps build pipeline to deploy artifacts to the staging slot, and once you have completed verification, you can use the swap option in deployment slots to swap the staging and production slots.

  1. Part 1: Client-side configuration
  2. Part 2: Server-side configuration
  3. Part 3: Analyzing the telemetry
  4. Set up staging environments in Azure App Service

Happy Programming :)

Working with SSL Certificate in Azure App Service


This article shows you how to work with SSL certificates in Azure App Service. Last year I wrote a blog post on how to use Azure App Service managed certificates. If you're using App Service managed certificates, you don't need to worry about expiry; the certificate is renewed automatically by Azure. If you're using custom domain names for Azure App Service, you can configure SSL certificates for the custom domain names. To create custom TLS/SSL bindings for your App Service app, your App Service plan must be in the Basic, Standard, Premium, or Isolated tier.

Basic, Standard, Premium, or Isolated tier

I am using the Basic tier for this demo. To configure the custom domain name, you can click on the Custom domains menu and click on Add custom domain. Based on the domain name, you may need to modify the DNS settings at your domain provider. If it is a root domain, you may need to add an A record, and for subdomains you may need to add a CNAME record. Recently Azure App Service introduced a TXT record for custom domain verification, so along with the A or CNAME record you need to add a TXT record as well.
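Put together, the DNS entries at your domain provider typically look like the following sketch. The values are illustrative - the verification ID comes from the Custom domains blade, and the IP address and hostname are placeholders for your own app service:

```
@          A      <app-service-inbound-ip>
asuid      TXT    <custom-domain-verification-id>
www        CNAME  <your-app>.azurewebsites.net
asuid.www  TXT    <custom-domain-verification-id>
```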

Once you have successfully configured the domain name, App Service will start showing a warning - You have custom domains that are not secured and will cause browser warnings/errors when accessed over https. Click on "Add binding" to secure your custom domains.. To fix this warning, you should configure an SSL binding for your app service.

Custom domain SSL warning

To configure an SSL binding you need an SSL certificate. You can get an SSL certificate from most domain providers. To get an SSL certificate, first you need to create a certificate signing request, or CSR. You can create a CSR using IIS. Open IIS, select the Server Certificates option, and click on the Create Certificate Request option.

Create Certificate Request

On this step you need to provide the following details

Field - Description
Common name - Name for the certificate.
Organization - Name of the organization for which the certificate is used.
Organizational unit - Name of the department or division in the organization in which the certificate is used.
City/locality - Unabbreviated name of the city or locality where your organization or organizational unit is located.
State/province - Unabbreviated name of the state/province where your organizational unit is located.
Country/region - Name of the country or region where your organization or organizational unit is located.

On the next screen you need to choose the Cryptographic Service Provider properties.

Cryptographic Service Provider

You can choose either Microsoft RSA SChannel Cryptographic Provider or Microsoft DH SChannel Cryptographic Provider. The default is Microsoft RSA SChannel Cryptographic Provider. Then choose a bit length supported by the provider you selected. By default, the RSA SChannel provider uses a bit length of 1024, and the DH SChannel provider uses a bit length of 512. A longer bit length increases the level of encryption, but it can impact performance because it requires the transmission of additional bits. I am using 2048 as the bit length since my SSL provider requires 2048 as the minimum.

And on the next screen you will be able to save the CSR to a text file. At your SSL provider you can upload this file and get the SSL certificate. Once it is generated, you can download the certificate from the provider. Then open IIS, select the Server Certificates option, and click on the Complete Certificate Request option.

Complete Certificate Request

In this screen you need to provide the file - the .crt / .cer file from the SSL provider; a friendly name - which helps you identify the certificate in IIS Server Certificates; and the certificate store - where this certificate is stored on the machine. Then click OK. If you are running Complete Certificate Request on the same machine where you generated the CSR, the certificate will be displayed in the Server Certificates list in IIS.

Now you have created an SSL certificate, but to use this certificate in Azure App Service, you need to export it as a PFX file. To do that, right click on the certificate and choose the Export option. In the export screen you need to configure the file name and provide a password - this password is used in Azure App Service while importing it.
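If you prefer the command line over the IIS export dialog, the same kind of PFX can be produced with OpenSSL. A sketch using a throwaway self-signed certificate - in practice you would feed in the certificate issued by your SSL provider and its matching private key, and the CN and password below are placeholders:

```shell
# Generate a throwaway private key and self-signed certificate
# (a stand-in for the provider-issued certificate).
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout key.pem -out cert.pem -days 365 \
  -subj "/CN=www.example.com"

# Bundle the certificate and private key into a password-protected PFX;
# the password is what you enter in the App Service upload screen.
openssl pkcs12 -export -out certificate.pfx \
  -inkey key.pem -in cert.pem \
  -password pass:MyPfxPassword
```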

Once exported, open the TLS/SSL settings option from Azure App Service blade. Select the Private Key Certificate option. And Upload Certificate option.

TLS/SSL settings

In the Upload screen, choose the PFX file you exported and provide the password. It will be imported to the resource group. If you're uploading a wildcard certificate, you can access this certificate in other app services or any service which supports an SSL file. Now to configure the App Service with the SSL file, select the Bindings menu, and click on the Add TLS/SSL Binding option.

TLS/SSL settings

In this screen you need to choose the domain - it is the custom domain you configured, select the uploaded certificate and choose SNI SSL as the TLS/SSL type. And click on Add Binding. And now if you check your custom domains blade you will be able to see the warning is gone.

It is recommended to turn on the HTTPS Only option, so that your app service always runs in secure mode; if you browse your app over http it will automatically be redirected to https. Also configure your Minimum TLS Version to 1.2 - the TLS level recommended by industry standards such as PCI DSS. Once you configure your TLS version, your app rejects all connections with lower TLS versions.
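Both settings can also be applied with the Azure CLI; a sketch with placeholder names:

```shell
# Redirect all http traffic to https.
az webapp update \
  --name <your-app> --resource-group my-rg \
  --https-only true

# Enforce TLS 1.2 as the minimum protocol version.
az webapp config set \
  --name <your-app> --resource-group my-rg \
  --min-tls-version 1.2
```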

TLS/SSL Protocol settings

This way you can create SSL certificate requests, and create and use an SSL certificate in Azure App Service. Once it expires, you can repeat the same procedure and use the renewed certificate.

  1. Use a TLS/SSL certificate in your code in Azure App Service
  2. Add a TLS/SSL certificate in Azure App Service
  3. Secure a custom DNS name with a TLS/SSL binding in Azure App Service

Happy Programming :)

Managing Azure App Service SSL Certificate with Azure Key Vault


In my last blog post I wrote about working with SSL certificates in Azure App Service. In this article I will explain how to manage Azure App Service SSL certificates with the Azure Key Vault service. If you're running SAAS applications on Azure App Service with custom domains and SSL certificates, managing them is quite complicated. Here is a screenshot of an App Service running a SAAS app with custom domains and SSL certificates.

Multiple custom domains and SSL Certificates

In this scenario updating the SSL certificate is quite complicated, because you need to upload the new certificate and manually change all the SSL/TLS bindings for all the custom domains. The alternate option is to use Azure Key Vault. Instead of uploading the SSL certificate to the app service directly, you upload the certificate to Azure Key Vault and the app service accesses the certificate via Key Vault. In case of SSL certificate renewal, you upload the latest certificate to the Key Vault, and it internally manages and provides the latest certificate to the app service.

To use Azure Key Vault, you need to create an Azure Key Vault service. In the Azure portal click on New Resource and search for Azure Key Vault. You need to select a resource group, provide a name for the Key Vault, and click on the Review and Create button. The name should be unique; using this name you will be able to interact with the Key Vault using its REST API.

Create Azure Key Vault

Once you create the Key Vault service, you can import the SSL certificate to Azure Key Vault with the help of the import option. Similar to App Service, you can import a PFX file into Azure Key Vault as well. To get this option, choose the Certificates option from the Key Vault and click on Generate/Import; after you choose Import from the list, it will display a screen like this.

Import SSL certificate

Similar to App Service, you need to provide the PFX file password on this screen. Once it is imported, you need to assign an identity to manage this SSL certificate from the web app. First you need to enable the identity in the App Service.

Enable App Service Identity

Once you enable App Service identity, you will be able to assign Azure Key Vault permissions to the identity. To do this, you need to select Access policies from the Azure Key Vault, and click on the Add Access Policy option.

Access Policies

In the Add access policy option, choose the Get option under Key permissions, Secret permissions, and Certificate permissions. For Select principal, click on None selected and search for the web application on which you enabled the identity.

Add Access Policy
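The identity and access policy steps can be scripted as well; a sketch with placeholder app, resource group, and vault names:

```shell
# Enable a system-assigned managed identity on the web app
# and capture its principal id.
principalId=$(az webapp identity assign \
  --name <your-app> --resource-group my-rg \
  --query principalId --output tsv)

# Grant the identity Get permission on keys, secrets, and certificates.
az keyvault set-policy --name <your-key-vault> \
  --object-id "$principalId" \
  --key-permissions get \
  --secret-permissions get \
  --certificate-permissions get
```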

Once selected, click on the Select button and then click the Add button, which adds the policy to the Key Vault. In the web application, select TLS/SSL settings and select the Private Key Certificates (.pfx) option. And click on the Import Key Vault Certificate option.

Import Key Vault Certificate

Now you can bind the SSL certificate to the custom domains. As I mentioned earlier, if you're using an SSL certificate from Azure Key Vault, renewal of the SSL certificate can be automated; and if you're using a certificate from an Azure Key Vault integrated provider, both creation and renewal can be automated. You can use Azure Key Vault to store your application secrets and keys as well, and you can manage them with the Azure Key Vault client SDKs.
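For completeness, the certificate import on both sides can be sketched with the Azure CLI - the vault name, certificate name, app name, PFX path, and password below are all placeholders:

```shell
# Import the PFX into Key Vault.
az keyvault certificate import \
  --vault-name <your-key-vault> \
  --name app-ssl-cert \
  --file <path-to-pfx> \
  --password <pfx-password>

# Make the Key Vault certificate available to the web app.
az webapp config ssl import \
  --name <your-app> --resource-group my-rg \
  --key-vault <your-key-vault> \
  --key-vault-certificate-name app-ssl-cert
```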

  1. About Azure Key Vault
  2. Add a TLS/SSL certificate in Azure App Service - Import a certificate from Key Vault
  3. Assign a Key Vault access policy using the Azure portal
  4. Tutorial: Use a managed identity to connect Key Vault to an Azure web app in .NET

Happy Programming :)

Azure App Service - Enable the Health Check


This article shows you what the App Service Health Check feature is and how to enable it. This feature helps you improve the availability of your Azure App Service. You can increase availability and throughput by scaling the app into multiple instances. But what happens when, due to some exception, one of your instances becomes faulty and stops responding? This feature lets you configure an endpoint which the system will ping at configured intervals; if an app service instance fails to respond to this ping, the system removes the instance from your load balancer. This feature was introduced in Azure App Service in 2019 and is now Generally Available and ready for production applications.

To enable health check, log in to the Azure Portal, select the App Service, and select Health check under Monitoring.

Enable Health Check in Azure App Service

Once you specify the path, App Service will start pinging that URL at the configured interval. If the URL responds with an error HTTP status code or does not respond, the instance is identified as unhealthy and is removed from the load balancer rotation, so the load balancer no longer sends traffic to it. The system continues to ping the endpoint: if the instance becomes healthy again, it is added back to the load balancer, and if it keeps responding with error status codes or not responding at all, App Service restarts the underlying virtual machine to bring the instance back to a healthy state.

Once you configure your App Service health check, you can monitor the health of your app service using Azure Monitor. From the Health check blade in the Portal, click Metrics. This opens a new blade where you can see the app service health status history and create a new alert rule.

Azure Alert configuration

If you're using an ASP.NET Core application, you can configure the Health Checks feature in ASP.NET Core - I wrote a blog post about the health check implementation; check it out here. It is recommended that the health endpoint checks all the critical dependencies - and if any critical dependency is not responding, the health endpoint should return an error status code.

Here are some reference links which discuss this feature in detail.

  1. Health Check is now Generally Available
  2. Monitor App Service instances using Health check

Happy Programming :)

Monitor Azure WebJobs status with Azure Application Insights


This article shows you how to monitor Azure WebJobs using Azure Application Insights. WebJobs is a feature of Azure App Service that enables you to run a program or script in the same instance as a web app, API app, or mobile app. There is no additional cost to use WebJobs. If you're using Azure WebJobs, monitoring them is a little hard - you might need to log in to Kudu and check the status. Recently I had to implement a solution which helps me to monitor Azure WebJobs. Basically, by monitoring WebJobs I wanted to verify whether the service completed the operation successfully, is still running, or has failed. In this post I am using Azure Application Insights and the Kudu REST API to monitor the availability of the WebJobs. You will be using the Availability feature of Application Insights with the help of MultiStep tests.

So first you need to create a MultiStep test. You can do it with Visual Studio Enterprise edition - create a Web Performance and Load Test Project. Once it is created, it will create a webtest file. You can right click on the node and choose the Add Request option. In this request you need to provide the Kudu endpoint of the WebJob - configure the URL property with the Kudu endpoint URL in this format: https://your_app_service.scm.azurewebsites.net/api/triggeredwebjobs/your_web_job_name, replacing the app service name and WebJob name with the ones you created. This is an authenticated endpoint using Basic authentication, so you need to add a header. Right click on the Request again and choose the Add Header option. This adds a Header entry - select Authorization for the Name property, and for the Value you need to configure the token. To get the token, download the publish profile from the App Service - it is an XML file - and find the values of the userName and userPWD parameters. Then execute the following PowerShell code.

$userName = "`$app-service-webjobs-demo"
$userPWD = "HkKy9zfyb1S9ueZx2kJPgjqPGDJ0KqmdWFgT5fHE2CKjj5legfaLLjz9jboo"
$authHeader = "Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $userName, $userPWD)))
Write-Host $authHeader

If you notice, there is one extra character ( ` ) in the username variable. It is required to escape the $ symbol in PowerShell. Once you execute this code with the variables from your publish profile, you will get a token. Use this token as the Header parameter value.
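If PowerShell is not available, the same header value can be computed with a short script in any language. Here is a minimal Python sketch - the username and password below are placeholders you replace with the values from your own publish profile:

```python
import base64

def build_basic_auth_header(user_name: str, user_pwd: str) -> str:
    # Base64-encode "userName:userPWD" - the standard HTTP Basic authentication format
    token = base64.b64encode(f"{user_name}:{user_pwd}".encode("ascii")).decode("ascii")
    return f"Basic {token}"

# Replace these placeholders with the userName and userPWD values from your
# publish profile; no escaping of the $ symbol is needed outside PowerShell.
print(build_basic_auth_header("$app-service-webjobs-demo", "your-userPWD"))
```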

Next you need to create a validation rule in the Request - this will check the response and verify the WebJob status. In the Add Validation Rule dialog choose the Find Text rule, and in the properties, put Success as the value for Find Text.

Add Validation Rule

Click OK to save the changes. Now you are ready with the test. Save the test and execute it. If your WebJob executed properly the test will pass; if the WebJob failed or is still running, the test will fail. As mentioned earlier, the webtest file is an XML file. Here is the one I created - you can use the same one, modify the parameters, and use it.

<?xml version="1.0" encoding="utf-8"?>
<WebTest Name="WebTest1" Id="b41e7ab8-2478-4ae5-8eb7-cc9eaf15e583" Owner="" Priority="2147483647" Enabled="True" CssProjectStructure="" CssIteration="" Timeout="0" WorkItemIds="" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010" Description="" CredentialUserName="" CredentialPassword="" PreAuthenticate="True" Proxy="default" StopOnError="False" RecordedResultFile="" ResultsLocale="">
  <Items>
    <Request Method="GET" Guid="ce98e408-a154-4859-84b5-f6e0aa0e8511" Version="1.1" Url="https://your-web-app.scm.azurewebsites.net/api/triggeredwebjobs/your-webjob" ThinkTime="0" Timeout="300" ParseDependentRequests="True" FollowRedirects="True" RecordResult="True" Cache="False" ResponseTimeGoal="0" Encoding="utf-8" ExpectedHttpStatusCode="0" ExpectedResponseUrl="" ReportingName="" IgnoreHttpStatusCode="False">
      <Headers>
        <Header Name="Authorization" Value="Basic your-auth-header" />
      </Headers>
      <ValidationRules>
        <ValidationRule Classname="Microsoft.VisualStudio.TestTools.WebTesting.Rules.ValidationRuleFindText, Microsoft.VisualStudio.QualityTools.WebTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" DisplayName="Find Text" Description="Verifies the existence of the specified text in the response." Level="High" ExectuionOrder="BeforeDependents">
          <RuleParameters>
            <RuleParameter Name="FindText" Value="Success" />
            <RuleParameter Name="IgnoreCase" Value="True" />
            <RuleParameter Name="UseRegularExpression" Value="False" />
            <RuleParameter Name="PassIfTextFound" Value="True" />
          </RuleParameters>
        </ValidationRule>
      </ValidationRules>
    </Request>
  </Items>
</WebTest>

Now you have completed the setup, next you need to upload this file into the Application Insights. To do this select your Application Insights resource, click on the Availability menu, and from there click on Add test.

Configure Availability Tests

In the Create test screen you need to provide a name for the test, choose Multi-step web test for the Test Type, and upload the test file. You can configure the Test Frequency based on your requirement - if the job takes some time, configure the frequency accordingly. For the other values you can accept the defaults and click on Create.

Once you create a test, it will start executing at the specified intervals and will display results like this.

Test Execution Results

To configure alerts on the Failure, you can click on the Alerts menu in the Application Insights resource. Since the alerts are configured as part of the availability test creation, one alert rule will be configured automatically.

Alert Rules

So if the test fails in two out of five geographic locations, this alert will be triggered, and you can configure email or SMS notifications based on it.

This way you will be able to monitor Azure App Service WebJobs using Application Insights. Since Microsoft is deprecating the Load Test and Performance Test project, it is better to use Azure Functions for monitoring. I will try to do a blog post on this approach in the future.

Happy Programming :)


Building Realtime applications on Angular with ASPNET Core and SignalR


This article shows you how to build realtime applications on Angular with ASP.NET Core and SignalR. To get started you need to create an ASP.NET Core application and configure a SignalR hub in it. I am using a Web API application - you can create one with the dotnet new webapi command. Once it is created, you need to create a Hub - one of the core components in the SignalR framework. Here is the Hub implementation.

using Microsoft.AspNetCore.SignalR;

namespace Backend
{
    public class MessageHub : Hub
    {
    }
}

There are no methods implemented in the Hub. Next you need to configure the Startup class to use SignalR. You can do this by adding the following code.

public void ConfigureServices(IServiceCollection services)
{
    services.AddSignalR();
    services.AddControllers();
    //code removed for brevity
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    //code removed for brevity
    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllers();
        endpoints.MapHub<MessageHub>("/messageHub");
    });
}

Now you're ready to accept SignalR connections, but you haven't implemented any methods to interact with clients. Instead of writing the code in the Hub, you can write it in a controller - which is better since you can access all the other HTTP parameters in the controller, and you can expose action methods to external systems so that they can send notifications to the client applications. Create a new API controller, and in the constructor get the HubContext, which you can use to interact with SignalR. Here is the implementation.

[ApiController]
[Route("[controller]")]
public class MessageController : ControllerBase
{
    private readonly IHubContext<MessageHub> _hubContext;

    public MessageController(IHubContext<MessageHub> hubContext)
    {
        _hubContext = hubContext;
    }

    [HttpPost]
    public async Task<IActionResult> SendMessage([FromBody] string message)
    {
        await _hubContext.Clients.All.SendAsync("MessageReceived", message);
        return Ok();
    }
}

In this code, you're exposing the SendMessage API endpoint; using this method, external systems can send messages to the clients. Next let us create the Angular project and write code to interact with the SignalR server. You can create an Angular project using the ng new --minimal command - the minimal argument skips creating the spec files, which is not recommended for production apps. Once the Angular app dependency packages are installed, you need to install the SignalR client package to interact with the SignalR server. You can do this with the npm i @microsoft/signalr command. Once installed, you can modify the app.component.ts file like this.

import { Component } from '@angular/core';
import * as signalR from "@microsoft/signalr";

@Component({
  selector: 'app-root',
  template: '',
  styles: []
})
export class AppComponent {
  title = 'Frontend';
  connection = new signalR.HubConnectionBuilder()
    .withUrl("https://localhost:5001/messageHub")
    .build();

  ngOnInit() {
    this.connection.on("MessageReceived", (message) => {
      console.log(message);
    });
    this.connection.start().catch(err => document.write(err));
  }
}

In the above code you're establishing a connection with the SignalR server and subscribing to an event - MessageReceived - which is the method the server will be invoking, and logging the message from the server to the console. Next you can run both applications and verify the output. From the Angular app, you will get a message like this.

Angular Client without CORS

It is because SignalR requires CORS to work properly, so you need to configure a CORS policy in the SignalR server. You can do it like this.

public void ConfigureServices(IServiceCollection services)
{
    services.AddCors(options => options.AddPolicy("CorsPolicy",
        builder =>
        {
            builder.AllowAnyHeader()
                .AllowAnyMethod()
                .SetIsOriginAllowed((host) => true)
                .AllowCredentials();
        }));
    services.AddSignalR();
    services.AddControllers();
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    //code removed for brevity
    app.UseCors("CorsPolicy");
    app.UseEndpoints(endpoints =>
    {
        endpoints.MapControllers();
        endpoints.MapHub<MessageHub>("/messageHub");
    });
}

In the above code you have configured the policy to accept any header, any method, and credentials. Again, this is not recommended for production apps - in production use proper domain names and allow only the required methods. Next run both applications - the Web API app using the dotnet run command and the Angular app using the ng serve --watch command. From the Swagger UI, try out the /Message endpoint and you will be able to see the message text you're sending reaching the client apps. Now you have completed a basic SignalR and Angular app implementation. But in the real world the requirements may change - for example, if you're building a chat application, you may want to welcome the user or transfer some client information to the backend. To do this you can use query strings - append the information you need to transfer to the server as query strings in the Angular app like the following.

connection = new signalR.HubConnectionBuilder()
  .withUrl("https://localhost:5001/messageHub?userId=123456")
  .build();

And you will be able to read these values on the server, in the hub's OnConnectedAsync() method, like this.

public class MessageHub : Hub
{
    public override Task OnConnectedAsync()
    {
        var userId = Context.GetHttpContext().Request.Query["userId"];
        return base.OnConnectedAsync();
    }
}

This way you can build real time applications in Angular with ASP.NET Core and SignalR. Right now you're accepting the message string from the server and displaying it in the console; instead, you can define a JSON payload and implement display logic on the client side. You can make the application more interactive with the help of the Notification API in browsers. You can find the source code for this blog post here.

Happy Programming :)

Running custom availability tests using Azure Functions


This article shows you how to create and run custom availability tests using Azure Functions. This is an update to the blog post about monitoring WebJobs using Application Insights - since Microsoft is deprecating the Load Test and Performance features, this is the recommended alternative for monitoring availability. To get started you need to create an Azure Function in C# with a Timer trigger. You can choose the trigger interval based on your requirement - this is one of the main advantages compared to the Application Insights availability test feature. Next you need to add the following packages.

  • Microsoft.ApplicationInsights - This package is for interacting with the Application Insights.
  • Microsoft.Azure.Functions.Extensions - This package is for creating a Startup class for the Azure Function.
  • Microsoft.Extensions.Http - This package is for creating the HttpClient which helps to interact with the WebJobs REST API.

The Startup class is required so that you can inject the HttpClient into the Azure Function - this is recommended instead of creating an HttpClient instance every time in the Function code. Here is the Startup class code.

[assembly: FunctionsStartup(typeof(DotNetThoughts.Demo.Startup))]
namespace DotNetThoughts.Demo
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient();
        }
    }
}

And here is the Function implementation.

private readonly TelemetryClient _telemetryClient;
private readonly string instrumentationKey = Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");
private readonly string webJobAuthenticationToken = Environment.GetEnvironmentVariable("WEBJOB_AUTHORIZATION");
private readonly string webJobRESTEndpoint = Environment.GetEnvironmentVariable("WEBJOB_RESTENDPOINT");
private const string EndpointAddress = "https://dc.services.visualstudio.com/v2/track";
private readonly IHttpClientFactory _httpClientFactory;

public WebJobAvailability(IHttpClientFactory httpClientFactory)
{
    _httpClientFactory = httpClientFactory;
    _telemetryClient = new TelemetryClient(new TelemetryConfiguration(instrumentationKey,
        new InMemoryChannel { EndpointAddress = EndpointAddress }));
}

[FunctionName("WebJobAvailability")]
public async Task Run([TimerTrigger("0 */1 * * * *")] TimerInfo myTimer, ILogger log)
{
    string testName = "WebJobStatusTest";
    string location = Environment.GetEnvironmentVariable("REGION_NAME");
    string operationId = Guid.NewGuid().ToString("N");
    var availabilityTelemetry = new AvailabilityTelemetry
    {
        Id = operationId,
        Name = testName,
        RunLocation = location,
        Success = false
    };
    try
    {
        await ExecuteTestAsync(log);
        availabilityTelemetry.Success = true;
    }
    catch (Exception ex)
    {
        availabilityTelemetry.Message = ex.Message;
        var exceptionTelemetry = new ExceptionTelemetry(ex);
        exceptionTelemetry.Context.Operation.Id = operationId;
        exceptionTelemetry.Properties.Add("TestName", testName);
        exceptionTelemetry.Properties.Add("TestLocation", location);
        _telemetryClient.TrackException(exceptionTelemetry);
    }
    finally
    {
        _telemetryClient.TrackAvailability(availabilityTelemetry);
        _telemetryClient.Flush();
    }
}

The ExecuteTestAsync method executes the business logic or test implementation. In it you write the code to interact with the Kudu REST endpoint and verify the response status.

public async Task ExecuteTestAsync(ILogger log)
{
    log.LogInformation("RunAvailabilityTestAsync - Started.");
    var httpClient = _httpClientFactory.CreateClient();
    httpClient.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Basic", webJobAuthenticationToken);
    var webJobResponseJson = await httpClient.GetStringAsync(webJobRESTEndpoint);
    var webJobResponse = JsonSerializer.Deserialize<WebJobResponse>(webJobResponseJson);
    if (!webJobResponse.LatestRun.Status.Equals("Success", StringComparison.OrdinalIgnoreCase))
    {
        throw new Exception("Latest Run status is not success.");
    }
}

The WebJobResponse class is the POCO representation of the REST endpoint response. You can convert the JSON response to POCO classes using json2csharp.com or Visual Studio. Now you can run the function and verify the status of the WebJob.

Application Insights - Failures

This way you can create and run custom availability tests using Azure Functions, and similar to Application Insights availability tests you can configure alerts. As the location is null when running locally, it shows as empty in the availability blade; if you host it on Azure Functions, the location variable will be populated automatically.

  1. Create and run custom availability tests using Azure Functions

Happy Programming :)

Getting started with Microsoft YARP


This article discusses YARP - A Reverse Proxy. YARP is a library to help create reverse proxy servers that are high-performance, production-ready, and highly customizable. So what is a reverse proxy? A reverse proxy is an intermediate connection point placed at a network's edge. It receives initial HTTP connection requests and behaves like the actual endpoint based on the configuration. A reverse proxy acts as a gateway between your application and users.

Reverse Proxy

YARP is built on .NET using the infrastructure from ASP.NET and .NET (.NET Core 3.1 and .NET 5.0). The key differentiator for YARP is that it's been designed to be easily customized and tweaked via .NET code to match the specific needs of each deployment scenario. YARP can read its configuration from the appsettings.json file or from code. In this post, you will explore how to use YARP in an empty ASP.NET Core web application that acts as a front end for two ASP.NET Core MVC applications. To get started, create an empty web application using the command dotnet new web. Next add the YARP package with the following command - dotnet add package Microsoft.ReverseProxy --version 1.0.0-preview.9.21116.1. It is still in preview.

Once the package is added, you can configure the Startup class to read the configuration and enable the reverse proxy. You can do it like this.

public class Startup
{
    public IConfiguration Configuration { get; set; }

    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
    }

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddReverseProxy()
            .LoadFromConfig(Configuration.GetSection("ReverseProxy"));
    }

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        if (env.IsDevelopment())
        {
            app.UseDeveloperExceptionPage();
        }

        app.UseRouting();
        app.UseEndpoints(endpoints =>
        {
            endpoints.MapReverseProxy();
        });
    }
}

In the ConfigureServices method the reverse proxy middleware is added and its configuration is read from the appsettings.json file, and in the Configure method the reverse proxy routes are mapped. Next you need to add the configuration by editing the appsettings.json file. Here is the reverse proxy configuration.

"ReverseProxy":{"Routes":[{"RouteId":"route1","ClusterId":"cluster1","Match":{"Path":"{**catch-all}"}}],"Clusters":{"cluster1":{"Destinations":{"cluster1/destination1":{"Address":"https://localhost:11000"},"cluster1/destination2":{"Address":"https://localhost:12000"}}}}}

This configuration has two main elements - Routes and Clusters. Routes is where you configure the endpoint routes and URLs; the Match element is configured to execute the proxy for all routes. RouteId is a unique name for the route, and ClusterId identifies the backend application servers or URLs. Inside Clusters, two application URLs are configured - the same application running on different ports. Now you're ready to run the proxy application and your other web apps. The other web apps you can run on different ports with the following command - dotnet run --urls="https://localhost:xxxxx". Now when you browse https://localhost:5001, you will see the index page of the web application - the URL won't change. If you keep refreshing, sometimes you will see the second app as well. By default YARP uses the PowerOfTwoChoices algorithm for load balancing. There are other built-in policies:

  • First - Select the first destination without considering load. This is useful for dual destination fail-over systems.
  • Random - Select a destination randomly.
  • PowerOfTwoChoices (default) - Select two random destinations and then select the one with the least assigned requests. This avoids the overhead of LeastRequests and the worst case for Random where it selects a busy destination.
  • RoundRobin - Select a destination by cycling through them in order.
  • LeastRequests - Select the destination with the least assigned requests. This requires examining all destinations.

To configure another load balancing policy, modify the configuration like this. Here the RoundRobin algorithm is used.

"Clusters":{"cluster1":{"LoadBalancingPolicy":"RoundRobin","Destinations":{"cluster1/destination1":{"Address":"https://localhost:11000"},"cluster1/destination2":{"Address":"https://localhost:12000"}}}}

YARP also supports routing traffic by checking the health of the destination applications and directing client requests based on that. If you're using ASP.NET Core apps, you can enable the ASP.NET Core health check option for this purpose. YARP comes with a lot of features and improvements - check out the documentation and home page for more details on the existing features and how to use them.
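Active health checks are configured per cluster. The fragment below is a sketch based on the YARP documentation - the exact property names and defaults may differ between preview builds, so verify them against the docs before using:

```json
"Clusters": {
  "cluster1": {
    "HealthCheck": {
      "Active": {
        "Enabled": "true",
        "Interval": "00:00:10",
        "Timeout": "00:00:10",
        "Policy": "ConsecutiveFailures",
        "Path": "/health"
      }
    },
    "Metadata": {
      "ConsecutiveFailuresHealthPolicy.Threshold": "3"
    },
    "Destinations": {
      "cluster1/destination1": { "Address": "https://localhost:11000" },
      "cluster1/destination2": { "Address": "https://localhost:12000" }
    }
  }
}
```

Here the assumed /health path would be the endpoint exposed by the ASP.NET Core health checks middleware in the destination apps.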

  1. Introducing YARP Preview 1
  2. YARP Home Page
  3. YARP GitHub Repository

Happy Programming :)

How to create social media posts from long form content using Python


This article discusses generating social media posts from long form written content using Python. Most companies create technical blogs, white papers, and articles as part of their marketing initiatives, and they push this content to social media with a summary, which helps bring customers to their websites. Most of these articles and blogs are repurposed. This solution helps you create such social media posts from blog posts or articles. In this post you will learn how to implement a solution using Python and Flask, and how to host it on Azure App Service. You will also learn to use Azure Cognitive Services instead of the nltk package.

How it works?

The algorithm is very simple. First you parse the URL and extract the keywords from the content using NLP. Next you find the sentences in the content that contain the most keywords and display them.
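The second step - scoring sentences by how many keywords they contain - can be sketched in plain Python, independent of any NLP library. The sample sentences, keywords, and threshold below are illustrative:

```python
def select_sentences(sentences, keywords, frequency=3):
    """Return sentences that contain more than `frequency` of the keywords."""
    results = []
    for sentence in sentences:
        number_of_words = sum(1 for word in keywords if word in sentence)
        if number_of_words > frequency:
            results.append(sentence.strip())
    return results

sentences = [
    "Azure App Service lets you build and host web apps in the cloud.",
    "The weather was nice today.",
    "Deploy your web apps to Azure App Service and scale in the cloud.",
]
keywords = ["Azure", "App", "Service", "web", "apps", "cloud"]
print(select_sentences(sentences, keywords))
```

In the full implementation below, nltk supplies the keywords and the sentence splitting; this sketch only shows the selection logic.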

The following packages are used in this example.

Package Name - Description
Flask - For user interface and user interactions
Newspaper - For getting the content from URLs or websites
Nltk - For extracting keywords from text and splitting content into multiple sentences

So you need to install the above packages. Here is the requirements.txt file.

Flask==1.1.2
newspaper3k==0.2.8
nltk==3.5

You can run pip install -r requirements.txt in your virtual environment. Once you install all the requirements, you can create the app.py file - you can find its contents in the implementation section. You can use VS Code for development, with the Docker and Azure extensions.

Implementation

You can use the Flask framework to show the user interface and handle user inputs. The newspaper package converts the URL into a readable format and extracts the keywords from the content with the help of the Nltk package.

from flask import Flask, render_template, request
import newspaper
import nltk
from nltk.tokenize import sent_tokenize

app = Flask(__name__)

@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')

@app.route('/', methods=['POST'])
def index_post():
    url = request.form['UrlInput']
    if (len(url.strip()) >= 1):
        article = newspaper.Article(url)
        article.download()
        article.parse()
        article.nlp()
        sentences = sent_tokenize(article.text)
        keywords = article.keywords
        results = []
        frequency = 3
        for sentence in sentences:
            numberOfWords = sum(1 for word in keywords if word in sentence)
            if (numberOfWords > frequency):
                results.append(sentence.replace("\n", "").strip())
        return render_template('result.html', url=url, results=results, keywords=keywords)
    else:
        return render_template('result.html', url=url, error='Please provide a valid URL to continue.')

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)

This implementation has one route with different HTTP methods. When a user browses the URL, the HTTP GET method is invoked and it returns the index.html file. When a user fills the UrlInput field and submits the form, the HTTP POST route is invoked. In the backend, you get the value of the UrlInput form field. Using the Newspaper package, the URL is downloaded, parsed, and nlp is run on the content with the help of nltk, which extracts the keywords. Next, using sent_tokenize, the text is split into multiple sentences. Finally, based on the number of keywords in each sentence, the sentence is added to an array and the result.html file is rendered with that array. The app exposes port 5000. You can run / debug the application using VS Code.

In the next section, you will publish the solution to Azure.

Publishing to Azure

To publish the solution to Azure, let's convert the solution into a Docker image and publish it. To do this you can use the VS Code Docker extension and add the Dockerfile. Once you add the Dockerfile, you will get a requirements.txt file with the flask and gunicorn packages. You need to add the packages you installed to this. Modify the requirements.txt file like the following.

Flask==1.1.2
gunicorn==20.0.4
newspaper3k==0.2.8
nltk==3.5

And here is the Dockerfile generated by VS Code.

FROM python:3.8-slim-buster

EXPOSE 5000

ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

COPY requirements.txt .
RUN python -m pip install -r requirements.txt

WORKDIR /app
COPY . /app

RUN useradd appuser && chown -R appuser /app
USER appuser

CMD ["gunicorn", "--bind", "0.0.0.0:5000", "app:app"]

Once it is done, run the Docker build command - docker image build --tag anuraj/postgenerator . (use your own Docker Hub or container registry id instead of anuraj). Once it is built, run the container with the command docker run -d -p 5000:5000 anuraj/postgenerator, open a browser, and check whether the application is running by browsing http://127.0.0.1:5000/. It will show the UI, but once you submit a URL, it will throw an Internal Server Error. You can check the Docker logs, and they will show something like this.

Internal Server Error - Missing package

To fix this issue, you need to download the punkt resource. You can do it in the Dockerfile like this.

FROM python:3.8-slim-buster

EXPOSE 5000

ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

COPY requirements.txt .
RUN python -m pip install -r requirements.txt

WORKDIR /app
COPY . /app

RUN [ "python", "-c", "import nltk; nltk.download('punkt', download_dir='/app/nltk')" ]
ENV NLTK_DATA /app/nltk/

RUN useradd appuser && chown -R appuser /app
USER appuser

CMD ["gunicorn", "--bind", "0.0.0.0:5000", "app:app"]

In the Dockerfile you're downloading the punkt resource to the /app/nltk directory and configuring the NLTK_DATA environment variable to point to that directory. Now build the image and run it - it should work properly. Now you have built a Docker image; next you need to publish it to a Docker registry. For this example, Docker Hub is used, and the image is tagged based on the Docker Hub account - if you're not following this convention, you need to tag the image with your id. If you're using VS Code, you can deploy it from there with the help of the Docker extension, or you can use the docker push command like this - docker push anuraj/postgenerator. It may take some time based on your internet bandwidth.

Docker image publishing to Docker Hub

Once it is completed, you can check Docker Hub and verify the image is available. To deploy the image to App Service, you can use the VS Code Docker extension - right click on the image tag and choose the Deploy Image to Azure App Service option.

Deploy the Web App to Azure App Service

It will prompt for some configuration values - similar to what you configure when creating an Azure App Service. Once it is done, VS Code will provision the App Service and deploy the container image to it.

Improvements

You can extend the implementation using Azure Cognitive Services - Text Analytics. Instead of using the nltk package to extract the keywords, you can use the Azure Text Analytics service. Here is the code for extracting keywords using Text Analytics, from docs.microsoft.com.

from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

credential = AzureKeyCredential("<api_key>")
endpoint = "https://<region>.api.cognitive.microsoft.com/"
text_analytics_client = TextAnalyticsClient(endpoint, credential)

documents = [
    "Redmond is a city in King County, Washington, United States, located 15 miles east of Seattle.",
    "I need to take my cat to the veterinarian.",
    "I will travel to South America in the summer."
]

response = text_analytics_client.extract_key_phrases(documents, language="en")
result = [doc for doc in response if not doc.is_error]

for doc in result:
    print(doc.key_phrases)

Please note that Text Analytics has request data limits - the maximum number of characters per document is 5,120 and the maximum number of documents per request is 10. So if you're planning to extract keywords from long documents, you may need to split the document and join the results.
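One way to work within those limits is a small helper that greedily packs sentences (for example, the output of sent_tokenize) into chunks of at most 5,120 characters. This is a sketch, not an official SDK utility:

```python
def split_into_chunks(sentences, max_chars=5120):
    """Greedily pack sentences into chunks no longer than max_chars each."""
    chunks, current = [], ""
    for sentence in sentences:
        candidate = (current + " " + sentence).strip()
        if len(candidate) <= max_chars:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # A single sentence longer than max_chars is truncated here for simplicity
            current = sentence[:max_chars]
    if current:
        chunks.append(current)
    return chunks
```

You can then call extract_key_phrases on batches of at most 10 chunks and merge the resulting keyword lists.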

Now you have implemented a minimal Python AI application and deployed it to Azure. You can find more details about App Service deployment, configuring CI/CD pipelines for Python applications, using Azure Cognitive Services in Python, etc., in the reference links section.

  1. Tutorial: Deploy Docker containers to Azure App Service with Visual Studio Code
  2. Configure a Linux Python app for Azure App Service
  3. Quickstart: Create a Python app in Azure App Service on Linux
  4. Use CI/CD to deploy a Python web app to Azure App Service on Linux
  5. Build an AI web app by using Python and Flask
  6. Azure Text Analytics client library for Python - Version 5.0.0

Happy Programming :)

Add Azure Key Vault support to your ASP.NET application


This article will discuss about how to connect and use Azure Key Vault in your ASP.NET MVC application.

Azure Key Vault - Provisioning and Configuration

To do this, first you need to create an Azure Key Vault. It is a straightforward process. Log in to the Azure portal, search for Key Vault, and select the Key Vault option.

Create Azure Key Vault

You need to provide a resource group, a unique name, and a location - similar to most Azure resources - and click on Review + Create. On the review screen, confirm the details and create it.

Create Azure Key Vault

Next, select the Secrets blade and add your app settings and connection strings. Click on the Generate/Import button and choose Manual as the Upload option. Then map your app settings and connection strings - keys and values - to the Name and Value fields. Keep the other options at their defaults.

Create Secret

Now that you have completed your Key Vault configuration, you can click on the Secrets blade and see your configured secrets list.

List of Secrets

In the next section, you will learn how to connect Visual Studio to Azure Key Vault and access the secret values in the code.

Connecting to Azure Key Vault

To connect to Azure Key Vault from Visual Studio, right-click on the project and select the Add > Connected Service menu.

Connected Service

From the options, choose Secure Secrets with Azure Key Vault option.

Secure Secrets with Azure Key Vault

If you're not signed in, you might be prompted to sign in to your account. Once signed in, you can choose your Subscription and Key Vault - by default Visual Studio will offer to create a new key vault, but since you already created one, you can select it from the list.

Connect and Select Azure Key Vault

Then click on the Add button to add the key vault reference to your application. This adds a reference to the NuGet package Microsoft.Configuration.ConfigurationBuilders.Azure to the project. It also adds some configuration to the Web.config file.

<configSections>
  <section name="configBuilders" type="System.Configuration.ConfigurationBuildersSection, System.Configuration, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" restartOnExternalChanges="false" requirePermission="false" />
</configSections>
<configBuilders>
  <builders>
    <add name="AzureKeyVault" vaultName="dotnetthoughts" type="Microsoft.Configuration.ConfigurationBuilders.AzureKeyVaultConfigBuilder, Microsoft.Configuration.ConfigurationBuilders.Azure, Version=1.0.0.0, Culture=neutral" vaultUri="https://dotnetthoughts.vault.azure.net" />
  </builders>
</configBuilders>

Now the configuration is complete. You can modify your appSettings and connectionStrings sections like this, so that the application reads the values from Azure Key Vault.

<appSettings configBuilders="AzureKeyVault">
  <add key="webpages:Version" value="3.0.0.0" />
  <add key="webpages:Enabled" value="false" />
  <add key="ClientValidationEnabled" value="true" />
  <add key="UnobtrusiveJavaScriptEnabled" value="true" />
  <add key="TextAnalyticsKey" value="from key vault" />
</appSettings>
<connectionStrings configBuilders="AzureKeyVault">
  <add name="DefaultConnection" connectionString="from key vault" providerName="System.Data.SqlClient" />
  <add name="StorageConnectionString" connectionString="from key vault" />
</connectionStrings>

And with that, the implementation is complete. Now if you run the application and put a breakpoint on the line reading the configuration value, you will see that the application reads it from Azure Key Vault instead of the value provided in the configuration file. Here is the sample code - var textAnalyticsKey = ConfigurationManager.AppSettings["TextAnalyticsKey"];

Debugging Code

This way you can connect to and use Azure Key Vault in your classic ASP.NET MVC applications. If your application is running on .NET Framework 4.5 or a lower version, you might need to upgrade to the latest version of .NET Framework. You can use Azure Key Vault for App Service certificates as well.

  1. Azure Key Vault Developer’s Guide
  2. Add Key Vault to your web application by using Visual Studio Connected Services

Happy Programming :)
