Microsoft Dynamics Portals – How to Clear a Lookup Attribute

In Microsoft Dynamics Portals, we sometimes need logic to clear the value of a lookup attribute.

In such cases, the code snippet below can be used:

$("#attribute_name").attr("value", null);
$("#attribute_id").attr("value", null);
$("#attribute_entityname").attr("value", null);

“attribute” is the schema name of the lookup attribute in Dynamics.
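To make this reusable, the three statements can be wrapped in a small helper. This is a sketch, not an official Portals API: the "_name", "_id" and "_entityname" suffixes are an assumption based on the hidden inputs shown above, so verify them against the actual page markup of your form.

```javascript
// Build the three hidden-input selectors Portals renders for a lookup.
// The suffixes are an assumption; check your form's markup.
function lookupSelectors(schemaName) {
  return ["_name", "_id", "_entityname"].map(function (suffix) {
    return "#" + schemaName + suffix;
  });
}

// Clear all parts of the lookup (requires jQuery, available on Portals pages)
function clearLookup(schemaName) {
  lookupSelectors(schemaName).forEach(function (selector) {
    $(selector).attr("value", null);
  });
}

// Example: clearLookup("new_primarycontact");
```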


Microsoft Dynamics Portals – Configuring Multi Steps in a Web Form Step

A web form step has a property that makes the different tabs of an entity form appear as separate steps.

This property is useful when the form behind the web form step contains a lot of information. In such cases, configuring separate web form steps is one option worth exploring.

With the property below, however, we can keep just one web form step and still present multiple steps. Splitting the form into steps lets users save information in small chunks, i.e. step by step, and then come back to the web form later.

As a prerequisite for this,

a) The Type of the Web Form Step should be set to “Load Form”.

Web Form Step Type

b) Along with that, the property “Auto Generate Steps From Tabs” should be selected as well.

Web Form Step

c) Once this is done, navigate to the Web Form and open the web form step.

Verify that all the tabs of the selected form are displayed as separate steps within the same web form step.

Web Form Step 1

Web Form Step 2

Using Azure Log Analytics to Monitor Azure VMs

Log Analytics can be used to monitor the availability and performance of applications deployed on the Azure cloud.

In my last engagement we explored using Azure Log Analytics to monitor the performance of Azure VMs, so I wanted to share some useful links on the topic.

Azure Log Analytics Overview

https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-overview

Configuring Azure Logs on the Azure VM

https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-quick-collect-azurevm
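As an illustration, once a VM is connected to the workspace and its performance counters are enabled (covered in the second link), CPU utilization can be queried with a Kusto query such as the one below. This is a sketch; which counters are collected depends on the workspace configuration.

```kusto
// Average CPU utilization per VM over 5-minute intervals
Perf
| where ObjectName == "Processor" and CounterName == "% Processor Time"
| summarize avg(CounterValue) by Computer, bin(TimeGenerated, 5m)
```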


Dynamics Virtual Entities – ODATA V4 Data Source – Using Request Parameters

Virtual entities are a recent feature in Dynamics. Using virtual entities, we can connect to external data sources and surface their data in the Dynamics UI just like any other entity.

In a virtual entity, we can configure the source as an OData v4 provider.

In this blog, we will go through an example that uses the Request Parameters of the OData v4 data source to implement custom security.

In the example below, we will create an OData Web API that will be consumed by the virtual entity OData data source.

To keep the focus on the concept, we will only walk through the OData controller, not the complete source code that produces the data shared with Dynamics.

Step 1 – Creating a Custom Authorization Class.

In this class, we will inherit from the “AuthorizeAttribute” class available in the namespace “System.Web.Http”.

We will then override the method “OnAuthorization”. In the implementation, we will read the header values present in the request. The key specified in the request header will be used later while setting up the virtual entity OData data source in Dynamics.

// Requires: using System.Linq; using System.Net.Http;
// using System.Web.Http; using System.Web.Http.Controllers;

// Custom authorization class deriving from AuthorizeAttribute
public class CustomAuthorizationClass : AuthorizeAttribute
{
    // Name of the request header parameter
    private const string Key = "token";

    // Override the method "OnAuthorization"
    public override void OnAuthorization(HttpActionContext actionContext)
    {
        try
        {
            // Read the token value from the request headers.
            // This is the header parameter configured on the virtual
            // entity data source in Dynamics (see Step 3).
            var token = actionContext.Request.Headers.GetValues(Key).First();
            // The token can now be used for any custom authentication logic.
        }
        catch
        {
            // Header missing or validation failed: reject the request
            actionContext.Response = new HttpResponseMessage(System.Net.HttpStatusCode.Forbidden);
        }
    }
}


Step 2 – Adding the custom authorize attribute on the ODataController.

The code snippet below shows this.

[CustomAuthorizationClass] // Apply the custom authorization class from Step 1
public class VirtualEntitiesController : ODataController
{
    [EnableQuery]
    public IQueryable<VirtualEntity> Get()
    {
        // Called when any view of the entity (Advanced Find, associated
        // view, etc.) is opened in Dynamics
        // return entityCollection;
        throw new NotImplementedException();
    }

    public IQueryable<VirtualEntity> Get([FromODataUri] Guid key)
    {
        // Called when an individual virtual entity record is opened
        // return the record matching the key;
        throw new NotImplementedException();
    }
}


Step 3 – Specifying the custom request header while Setting up the OData V4 Data Source

The screenshots below show how we can specify the header parameter in the data source’s Request Parameters.

a) Navigate to Settings -> Administration -> Virtual Entity Data Sources and click “New” to create a new data source.

Virtual Entity OData Provider

b) Click “Ok”. In the “Request Parameters” tab, create a new parameter with Type set to “Header”, Name set to the “token” key specified in the code in Step 1, and the desired value.

Query Parameters
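With the header parameter in place, Dynamics sends the token with every request to the OData service. As a quick way to test the endpoint outside Dynamics, the request can be simulated with curl; the URL and token value below are placeholders.

```shell
# Simulate the request Dynamics sends to the OData service,
# passing the "token" header configured on the data source.
# URL and token value are placeholders.
curl -H "token: <secret-value>" "https://yourservice.example.com/odata/VirtualEntities"
```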

Azure Text Analytics APIs and Usage in Dynamics CRM

Using the Azure Text Analytics APIs, we can set up processes that analyze keywords during an interaction with the customer and provide insights into the overall experience of the customer.

A Text Analytics API demo is available at the following URL:

https://azure.microsoft.com/en-us/services/cognitive-services/text-analytics/

As illustrated in the demo, we can analyze the text entered and get insights such as the detected language, sentiment, etc.

We can also configure these APIs in the Azure cloud and then consume them in our applications.

The steps below configure the Text Analytics API in the Azure cloud environment.

a) Navigate to the Azure portal, create a resource and select the category “AI + Machine Learning”.

Text Analytics API's

b) We can then enter the details and the APIs will be configured. We will need to copy the API endpoint and an access key, both of which are needed to call the APIs programmatically.

The link below explains how to write the C# code for this:

https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/quickstarts/csharp

Usage in Dynamics CRM

We can use the Text Analytics API in many scenarios in Dynamics. For example:

Service Request Scenario

a) Saving the last customer interaction on a service request – While working with a customer on a particular service request, the customer care executive can capture the interaction in some placeholder such as a Note (annotation).

b) On save of the request, we can call the API endpoint and capture the sentiments to classify whether the interaction was “Positive”, “Negative” or “Neutral”.

c) Now when the customer calls again, based upon the previous sentiment analysis with the customer, the customer care executive can drive the conversation.
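Step b) above could be sketched as below, e.g. from a service called on save of the request. This is a hedged sketch, not the definitive implementation: the v2.0 endpoint path matches the API generation available at the time of writing, and names such as interactionText, endpoint and accessKey are placeholders for the values copied from the Azure portal.

```csharp
// Requires: using System.Net.Http; using System.Text;

var client = new HttpClient();
// Access key copied from the Azure resource (placeholder)
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", accessKey);

// Build the request body; interactionText is the captured Note text
var body = "{\"documents\":[{\"id\":\"1\",\"language\":\"en\",\"text\":\"" + interactionText + "\"}]}";

// Call the sentiment endpoint of the configured resource (placeholder URL)
var response = await client.PostAsync(
    endpoint + "/text/analytics/v2.0/sentiment",
    new StringContent(body, Encoding.UTF8, "application/json"));

// The response contains a score between 0 (negative) and 1 (positive),
// which can be mapped to "Positive", "Negative" or "Neutral" on the record
string json = await response.Content.ReadAsStringAsync();
```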


Caching in Azure Functions

Problem Statement –

Azure Functions are stateless in nature. Therefore, even though we can use standard .NET objects to cache values, the cached values do not persist if the Azure Function scales out or is idle for some time.

In many cases, Azure Functions are used for integrations with other applications. For example, we may have an integration scenario in which we make calls to OAuth REST APIs. In these cases, we may need to preserve OAuth2 bearer tokens across Azure Function invocations.

Approach – There are several approaches to caching in Azure Functions.

a) Using standard memory objects – For example, we can create static objects such as a dictionary to cache values.

However, as indicated previously, if the function is idle for some time or scales out, the cached value will be lost.

As a side note, the code snippet below shows how we can use in-memory caching to save values.

// Required namespace
using System.Runtime.Caching;

// Static object in which the cached values are stored. Being static,
// it survives across invocations while the function instance stays warm.
static readonly ObjectCache tokenCache = MemoryCache.Default;

// "TokenKey" is the cache key; "tokenResponse" is assumed to be the
// result of the OAuth token call made elsewhere in the function.
CacheItem tokenContents = tokenCache.GetCacheItem(TokenKey);
if (tokenContents == null)
{
    // Cache miss: the token needs to be (re)generated
    CacheItemPolicy policy = new CacheItemPolicy();
    policy.Priority = CacheItemPriority.Default;

    // Set an absolute expiration for the cached token
    policy.AbsoluteExpiration = DateTimeOffset.Now.AddHours(1);
    tokenContents = new CacheItem(TokenKey, tokenResponse.access_token);
    tokenCache.Set(tokenContents, policy);
    Token = tokenResponse.access_token;
}
else
{
    // Cache hit: reuse the existing value present in the cache
    Token = tokenContents.Value.ToString();
}

b) Using Redis Cache – It is managed by Microsoft, is highly scalable, and provides very fast access to data. Performance-wise it is a good option, but from the pricing perspective it can cost more than the other options.

c) Using tables in Storage accounts – For an Azure Storage account, we can create tables and properties. These properties can then be used for saving any information. 

This approach adds an extra API call to retrieve the value from the Azure storage table, but the pricing is lower than Redis Cache.

The screenshots below show how we can configure an Azure Storage table.

  • Navigate to Microsoft Azure Storage Explorer and enter the credentials for the Azure account. Verify that it loads all the available storage accounts.
    Azure Storage Account
  • Click the Azure Storage account to which the storage table needs to be added. Verify that it displays the existing tables present in the account.
    Azure Storage Account Table
  • Right-click “Tables” and click “Create Table”. Give the table a name. Verify that the table is created with the default columns “PartitionKey” and “RowKey”.

Default Storage Account Table

  • To add a new custom property to the storage table, click the “+ Edit” button. Verify that a new screen to create a custom property pops up.

Updated Add Property


  • Add the property and click Save. To add rows to the table, click the “+ Add” button and enter the rows.
  • Make a note of the values in the columns “PartitionKey” and “RowKey”. These attributes will be used to retrieve the values saved in the Azure storage table.

Code Snippet to retrieve values saved in the Azure Storage table

The Microsoft documentation below shows how to retrieve a particular row from a storage table using its key values:

https://docs.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.table.tableoperation.retrieve?view=azure-dotnet
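Based on that documentation, retrieving a cached token could look roughly like the sketch below, using the WindowsAzure.Storage SDK. The table name "TokenCache", the PartitionKey/RowKey values and the "Value" property are illustrative; substitute the ones noted while creating the table.

```csharp
// Requires: using Microsoft.WindowsAzure.Storage;
// using Microsoft.WindowsAzure.Storage.Table;

// Connect to the storage account (connection string from the Azure portal)
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudTableClient client = account.CreateCloudTableClient();
CloudTable table = client.GetTableReference("TokenCache");

// Retrieve the row identified by the PartitionKey and RowKey noted earlier
TableOperation retrieve = TableOperation.Retrieve<DynamicTableEntity>("Tokens", "OAuthToken");
TableResult result = table.Execute(retrieve);

// Read the cached value from the illustrative "Value" property
var entity = result.Result as DynamicTableEntity;
string cachedValue = entity?.Properties["Value"].StringValue;
```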


Azure Function .Net Core 2.0 Authentication Using JWT Libraries With Certificate

Problem Statement – This blog explains a use case in which we used JWT libraries to authenticate calls between two different environments.

The consumer application is an Azure Function App deployed on Azure Cloud which needs to monitor calls happening from another application.

Background – The central idea behind the integration is to ensure that the authentication mechanism follows the same guidelines even if the consumer application is changed from Azure Function to something else.

In an Azure Function, we can use the following authentication mechanisms:

a) Active Directory Authentication with Cloud AD.

b) Authentication with other identity providers such as Facebook, Google, etc.

c) App Service Authentication using OAuth2 token validation.

In this particular implementation, as illustrated in the diagram, there could be multiple consumer applications, each following its own authentication guidelines.

Source to Target Interaction

Using JWT libraries, we can lay down a framework:

a) which involves no change in the source application, and

b) which provides consistent authentication that can be implemented in different consumer applications irrespective of their underlying implementation.

Implementation Approach-

In this particular example, we will discuss the approach using certificates. As illustrated in the diagram, the steps below will be executed.

Authentication Approach

a) Encoding the data which needs to be transferred using a Private Certificate.

b) When we use JWT libraries to encode, the token is produced as three sets of encoded characters:

Header – Containing the algorithm used for encoding.

Payload – Containing the encoded object, time at which the payload was generated.

Signature – the signature that is verified during authentication.

c) In the consumer application, decode the data using the public certificate. If required, we can also enable additional validations, such as validating the lifetime of the token, validating the issuer, etc.

As a side note, in an Azure Function, we can save the certificates in SSL settings.


SSL Certificate

Please note that to use public certificates, as highlighted in the above screenshot, the hosting plan of the Azure Function App must be “App Service Plan” and not “Consumption Plan”.

Code Snippet

The code snippet below can be used as a reference. Please note that we will also need to install a NuGet package for the JWT library; refer to the screenshot below for the package to use.

Nuget Package to Use

// Requires: using Newtonsoft.Json; using System.IdentityModel.Tokens.Jwt;
// using System.Security.Cryptography.X509Certificates;
// using Microsoft.IdentityModel.Tokens;

// Deserialize the request and retrieve the event details
NotificationEvent ObjModel = JsonConvert.DeserializeObject<NotificationEvent>(JsonContent);

var tokenHandler = new JwtSecurityTokenHandler();

// Thumbprint of the certificate to use
string thumbprint = "XXXXXXXXXXXXXX";

// Open the certificate store of the Azure Function
X509Store certStore = new X509Store(StoreName.My, StoreLocation.CurrentUser);
certStore.Open(OpenFlags.ReadOnly);

// Find the certificate by its thumbprint
X509Certificate2Collection certCollection = certStore.Certificates.Find(
    X509FindType.FindByThumbprint,
    thumbprint,
    false);

if (certCollection.Count > 0)
{
    var certificate = certCollection[0];

    // Read the public key from the certificate
    var rsa = certificate.GetRSAPublicKey();

    // Parameters which will be used for JWT token verification
    var validationParameters = new TokenValidationParameters
    {
        IssuerSigningKey = new RsaSecurityKey(rsa),
        ValidateIssuerSigningKey = true,
        ValidateIssuer = false,
        ValidateLifetime = true, // Also validates the lifetime of the token
        ValidateAudience = false
    };

    // Throws an exception if the validation fails
    var principal = tokenHandler.ValidateToken(ObjModel.Notification, validationParameters, out SecurityToken securityToken);
}
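For completeness, the producer side (step a above) can be sketched as below. This is a hedged sketch, assuming the source application also uses the System.IdentityModel.Tokens.Jwt library and has access to a certificate that includes the private key; the claim name and expiry are illustrative choices.

```csharp
// Requires: using System.IdentityModel.Tokens.Jwt;
// using System.Security.Claims;
// using System.Security.Cryptography.X509Certificates;
// using Microsoft.IdentityModel.Tokens;

public static class TokenProducer
{
    // Sign the payload with the private key of the certificate so the
    // consumer can verify it with the public key, as shown above.
    public static string CreateToken(X509Certificate2 certWithPrivateKey, string payload)
    {
        var signingCredentials = new SigningCredentials(
            new X509SecurityKey(certWithPrivateKey),
            SecurityAlgorithms.RsaSha256);

        var descriptor = new SecurityTokenDescriptor
        {
            // Illustrative claim carrying the data to transfer
            Subject = new ClaimsIdentity(new[] { new Claim("payload", payload) }),
            // Drives ValidateLifetime on the consumer side
            Expires = DateTime.UtcNow.AddMinutes(5),
            SigningCredentials = signingCredentials
        };

        var handler = new JwtSecurityTokenHandler();
        return handler.WriteToken(handler.CreateToken(descriptor));
    }
}
```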