Dynamics Virtual Entities – ODATA V4 Data Source – Using Request Parameters

Virtual entities are a recent feature introduced in Dynamics. Using virtual entities, we can connect to external data source providers and surface their data in the Dynamics GUI just like any other entity.

For a virtual entity, we can configure the source as an ODATA V4 provider.

In this particular blog, we will go through an example wherein we utilize the Query Parameters present in the ODATA V4 provider data source and show how we can implement custom security using them.

In the example mentioned below, we will create an ODATA Web API which will be consumed by the virtual entity ODATA source provider.

To keep the focus on the concept, we will only go through the ODATA controller and not the complete source code that creates the data to be shared with Dynamics.

Step 1 – Creating a Custom Authorization Class.

This class will inherit from the “AuthorizeAttribute” class available in the “System.Web.Http” namespace.

We will then override the method “OnAuthorization”. In the implementation, we will read the header variables present in the request. The key specified in the request header will be used later while setting up the virtual entity ODATA Provider source in Dynamics.

// Custom authorization class, which inherits from AuthorizeAttribute
public class CustomAuthorizationClass : AuthorizeAttribute
{
    // In the class we will override the method "OnAuthorization"
    public override void OnAuthorization(HttpActionContext actionContext)
    {
        // Name of the request header parameter
        var key = "token";

        // Reading the value of the key from the request headers
        IEnumerable<string> headerValues;
        actionContext.Request.Headers.TryGetValues(key, out headerValues);

        // Token value passed from Dynamics; this can be used for any custom authentication
        var token = headerValues?.FirstOrDefault();

        // Reject the request when the token is missing or invalid
        if (string.IsNullOrEmpty(token))
        {
            actionContext.Response = new HttpResponseMessage(System.Net.HttpStatusCode.Forbidden);
        }
    }
}


Step 2 – Adding the Custom Authorize Attribute on the ODataController.

Mentioned below is the code snippet for the same.

[CustomAuthorizationClass] // Name of the custom authorization class
public class VirtualEntitiesController : ODataController
{
    // This method is called when any view of the entity, like Advanced Find or an Associated View, is opened
    public IQueryable<VirtualEntity> Get()
    {
        // return entitycollection;
    }

    // This method is called when an individual Virtual Entity record is opened
    public IQueryable<VirtualEntity> Get([FromODataUri] Guid key)
    {
        // return the record based upon the key;
    }
}


Step 3 – Specifying the custom request header while Setting up the OData V4 Data Source

Please refer to the screenshot below, which shows how we can specify the header parameter in the request header parameters.

a) Navigate to Settings -> Administration -> Virtual Entity Data Sources. Click “New” to create a new Data Provider

Virtual Entity OData Provider

b) Click “Ok”. In the tab “Request Parameters”, create a new parameter with Type set to “Header”, Name set to the “token” key specified in the code in Step 1, and the desired value.

Query Parameters


Azure Text Analytics API’s and Usage in Dynamics CRM

Using the Azure Text Analytics APIs, we can set up processes that analyze keywords during an interaction with the customer and provide insights on the overall experience of the customer.

Text Analytics API demo is available on the following URL


As illustrated in the demo, we can analyze the text entered and share insights in terms of Language Detected, Sentiment, etc.

We can also configure these APIs in the Azure cloud and then consume them in our applications.

Mentioned below are the steps to configure the Text Analytics API in the Azure cloud environment.

a) Navigate to the Azure portal, create a resource, and select the category “AI + Machine Learning”.

Text Analytics API's

b) We can then enter the details and the API will be configured. We will need to copy the API endpoint and an access key, which we will need in order to call the API programmatically.

The link mentioned below explains how we can write C# code for the same.


Usage in Dynamics CRM

We can use the Text Analytics API in many functionalities in Dynamics. For example,

Service Request Scenario

a) Saving the last customer interaction on a Service Request – While working with a customer on a particular service request, the customer care executive can capture the interaction in some placeholder like Notes or Annotation.

b) On save of the request, we can call the API endpoint and capture the sentiments to classify whether the interaction was “Positive”, “Negative” or “Neutral”.

c) Now when the customer calls again, based upon the previous sentiment analysis with the customer, the customer care executive can drive the conversation.
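As a sketch of step (b), the snippet below posts the captured interaction text to the Text Analytics sentiment endpoint over REST. The endpoint URL, region, and key shown here are placeholder assumptions, not values from the original configuration.

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SentimentClient
{
    // Hypothetical endpoint and access key copied from the Azure portal in step (b)
    const string Endpoint = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";
    const string AccessKey = "<your-access-key>";

    public static async Task<string> GetSentimentJsonAsync(string text)
    {
        using (var client = new HttpClient())
        {
            // The access key is passed in the Ocp-Apim-Subscription-Key header
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", AccessKey);

            // The API accepts a batch of documents; we send the interaction as a single document
            var body = "{\"documents\":[{\"id\":\"1\",\"language\":\"en\",\"text\":\"" + text + "\"}]}";
            var content = new StringContent(body, Encoding.UTF8, "application/json");

            // The JSON response contains a sentiment score between 0 (negative) and 1 (positive)
            var response = await client.PostAsync(Endpoint, content);
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```

The returned score can then be mapped to the “Positive”, “Negative”, or “Neutral” classification saved on the service request, for example by treating scores above 0.6 as positive and below 0.4 as negative.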



Caching in Azure Functions

Problem Statement –

Azure Functions are stateless in nature. Therefore, even though we can use standard .NET objects to cache values, the values don’t persist if the Azure Function scales out or is idle for some time.

In many cases, Azure Functions are used for integrations with other applications. For example, we may have an integration scenario in which we make calls to OAuth REST APIs. In these cases, we may need to preserve OAuth2 bearer tokens in the Azure Function.

Approach – There are several approaches for caching in Azure Functions.

a) Using standard memory objects – For example, we can create static objects like a dictionary for caching the values.

However as indicated previously, if the function is idle for some time or scales out, the cached value will be lost.

As a side note, the code snippet below shows how we can implement in-memory caching to save values.

// Use the statement mentioned below to include the required classes
using System.Runtime.Caching;

// Static object in which we will save the cache
static readonly ObjectCache tokenCache = MemoryCache.Default;

// Retrieving the existing value in the cache
CacheItem tokenContents = tokenCache.GetCacheItem(TokenKey);
if (tokenContents == null)
{
    // Branch when the cache entry doesn't exist. This means we need to regenerate it.
    CacheItemPolicy policy = new CacheItemPolicy();
    policy.Priority = CacheItemPriority.Default;

    // Setting the expiration time for the cache entry
    policy.AbsoluteExpiration = DateTimeOffset.Now.AddHours(1);
    tokenContents = new CacheItem(TokenKey, tokenResponse.access_token);
    tokenCache.Set(tokenContents, policy);
}

// Retrieve the value present in the cache
Token = tokenContents.Value.ToString();

b) Using Redis Cache – It is managed by Microsoft, is highly scalable, and provides very fast access to data. Performance-wise it can be good, but from a pricing perspective it can cost more than the other options.

c) Using tables in Storage accounts – For an Azure Storage account, we can create tables and properties. These properties can then be used for saving any information. 

Using this approach results in an extra API call to retrieve the value from the Azure storage table, but the pricing will be lower than Redis Cache.

Mentioned below screenshots show how we can configure an Azure Storage account

  • Navigate to Microsoft Azure Storage Explorer and enter the credentials for access to the Azure account. Review that it loads all the available storage accounts.

Azure Storage Account

  • Click on the Azure Storage account to which we need to add the storage table. Review that it displays the existing tables present in the account.

Azure Storage Account Table

  • Right-click “Tables” and click “Create Table”. Give the table a name. Review that the table gets created with the default columns “PartitionKey” and “RowKey”.

Default Storage Account Table

  • To add a new custom property to the storage table, click the “+ Edit” button. Review that a new screen to create a custom property pops up.

Updated Add Property


  • Add the property and click Save. To add rows to the table, click the “+ Add” button and add the rows.
  • Make a note of the values present in the “PartitionKey” and “RowKey” columns. These attributes will be used to retrieve the values saved in the Azure storage account.

Code Snippet to retrieve values saved in the Azure Storage table

Mentioned below blog from Microsoft shows how to retrieve a particular row from storage account table using a key value
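As a minimal sketch of such a retrieval, assuming the classic Microsoft.WindowsAzure.Storage SDK, a hypothetical table name “TokenCache”, and hypothetical partition/row key values:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical entity matching the custom property added in the steps above
public class TokenEntity : TableEntity
{
    public string TokenValue { get; set; }
}

public static class TokenStore
{
    public static string RetrieveCachedToken(string connectionString)
    {
        // Connect to the storage account and the table created earlier
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudTableClient tableClient = account.CreateCloudTableClient();
        CloudTable table = tableClient.GetTableReference("TokenCache"); // hypothetical table name

        // Retrieve the row using the PartitionKey and RowKey noted earlier
        TableOperation retrieve = TableOperation.Retrieve<TokenEntity>("Tokens", "BearerToken");
        TableResult result = table.Execute(retrieve);

        // Return the saved value, or null when the row does not exist
        TokenEntity entity = result.Result as TokenEntity;
        return entity?.TokenValue;
    }
}
```

The connection string for the storage account is available in the Azure portal under the account’s Access keys.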






Azure Function .Net Core 2.0 Authentication Using JWT Libraries With Certificate

Problem Statement – This blog explains a use case in which we used JWT libraries to authenticate calls between two different environments.

The consumer application is an Azure Function App deployed on Azure Cloud which needs to monitor calls happening from another application.

Background – The central idea behind the integration is to ensure that the authentication mechanism follows the same guidelines even if the consumer application is changed from Azure Function to something else.

In an Azure Function, we can use the authentication mechanisms mentioned below.

a) Active Directory Authentication with Cloud AD.

b) Authentication with other identity providers like Facebook, Google, etc.

c) App Service Authentication using OAuth2 token validation.

In this particular implementation, as illustrated in the diagram, there could be multiple consumer applications, each following its own authentication guidelines.

Source to Target Interaction

Using JWT libraries, we can lay down a framework

a) Which will involve no change in the source application.

b) A consistent authentication mechanism which can be implemented in different consumer applications irrespective of their underlying implementation.

Implementation Approach-

In this particular example, we will discuss the approach using certificates. As illustrated in the diagram, the steps mentioned below will be executed.

Authentication Approach

a) Encoding the data which needs to be transferred using a Private Certificate.

b) When we use JWT libraries to encode, three encoded segments are created:

Header – Containing the algorithm used for encoding.

Payload – Containing the encoded object, time at which the payload was generated.

Signature – Secret which needs to be verified while authentication.

c) In the consumer application, decode the data using the public certificate. If required we can also pass some other parameters like Validating lifetime of the event, Validating issuer of the event etc.
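As a sketch of step (a), the snippet below encodes a payload with the private certificate using the System.IdentityModel.Tokens.Jwt library. The claim name and token lifetime here are illustrative assumptions, not values from the original implementation.

```csharp
using System;
using System.IdentityModel.Tokens.Jwt;
using System.Security.Claims;
using System.Security.Cryptography.X509Certificates;
using Microsoft.IdentityModel.Tokens;

public static class NotificationEncoder
{
    public static string EncodeNotification(X509Certificate2 privateCert, string payloadJson)
    {
        var tokenHandler = new JwtSecurityTokenHandler();

        // Sign with the RSA private key of the certificate (step a above)
        var rsa = privateCert.GetRSAPrivateKey();
        var credentials = new SigningCredentials(new RsaSecurityKey(rsa), SecurityAlgorithms.RsaSha256);

        var descriptor = new SecurityTokenDescriptor
        {
            // Payload: the data to transfer, plus the time at which it was generated
            Subject = new ClaimsIdentity(new[] { new Claim("notification", payloadJson) }),
            IssuedAt = DateTime.UtcNow,
            Expires = DateTime.UtcNow.AddMinutes(10), // enables lifetime validation on the consumer side
            SigningCredentials = credentials
        };

        // Produces the three dot-separated segments: header, payload and signature
        return tokenHandler.CreateEncodedJwt(descriptor);
    }
}
```

The consumer side then verifies the token with the public certificate, as shown in the code snippet later in this post.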

As a side note, in an Azure Function, we can save the certificates in SSL settings.


SSL Certificate

Please note that to use public certificates, as highlighted in the above screenshot, the hosting plan of the Azure Function App must be “App Service Plan” and not “Consumption Plan”.

Code Snippet

Mentioned below is a code snippet which can be used for reference. Please note that we will also need to install a NuGet package for the JWT library. Please refer to the screenshot below for the same.

Nuget Package to Use

// Deserialize the request and retrieve the event details
NotificationEvent ObjModel = JsonConvert.DeserializeObject<NotificationEvent>(JsonContent);

var tokenHandler = new JwtSecurityTokenHandler();

// Thumbprint of the certificate to use
string thumbprint = "XXXXXXXXXXXXXX";

// Loading the certificates available to the Azure Function
X509Store certStore = new X509Store(StoreName.My, StoreLocation.CurrentUser);
certStore.Open(OpenFlags.ReadOnly);

// Finding the certificate based upon the thumbprint
X509Certificate2Collection certCollection = certStore.Certificates.Find(
    X509FindType.FindByThumbprint, thumbprint, false);

if (certCollection.Count > 0)
{
    var certificate = certCollection[0];

    // Reading the public key from the certificate
    var rsa = certificate.GetRSAPublicKey();

    // Creating the parameters which will be used for JWT token verification
    var validationParameters = new TokenValidationParameters
    {
        IssuerSigningKey = new RsaSecurityKey(rsa),
        ValidateIssuerSigningKey = true,
        ValidateIssuer = false,
        ValidateLifetime = true, // This will also validate the lifetime of the event.
        ValidateAudience = false
    };

    // This will throw an exception if the validation fails
    var principal = tokenHandler.ValidateToken(ObjModel.Notification, validationParameters, out SecurityToken securityToken);
}





Active BPF Issue Migrating from Dynamics 2016 to Dynamics 365 – Resolution

Problem Statement

The blog is in continuation of my previous blog wherein we explored possible BPF related issues after migration from Dynamics 2016 to Dynamics 365.

In this blog we will explore the resolution strategies for all the BPF migration issues that we encountered in the last blog.

To refresh the context, mentioned below were the issues which we encountered when Dynamics was upgraded from 2016 to 365

The Dynamics 365 BPF design change causes the issues mentioned below.

  1. Each user may see a different active BPF on the same record, causing user perception and usability issues.
  2. Out of box workflows triggering on change of process and stage id fields behaving weirdly.
  3. Out of box workflows doing comparison on stage id and process id field values giving different results for different users.


Setting Same BPF for all Users in Dynamics 365

In Dynamics 365, by default “SetProcess” action or “SetProcessRequest” will only set the active BPF for the user in whose context the request is being executed and not for all the users present in Dynamics. There are two possible ways of tackling the situation

  1. Using a Workflow Code activity to execute SetProcessRequest for all the users in Dynamics

The idea behind this is to execute the “SetProcessRequest” for all the users in Dynamics for the target record. Mentioned below is a code snippet for the same.

Img 1
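Since the original snippet is only available as a screenshot (Img 1 above), the sketch below reconstructs the idea under stated assumptions: the method and parameter names are hypothetical, and the user query would in practice be filtered as discussed in the cons below.

```csharp
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class BpfHelper
{
    // Hypothetical helper called from the workflow code activity;
    // serviceFactory comes from the code activity's execution context
    public static void SetBpfForAllUsers(IOrganizationServiceFactory serviceFactory,
        EntityReference targetRecord, EntityReference bpfToSet)
    {
        // Retrieve the users for which the BPF should be set
        var adminService = serviceFactory.CreateOrganizationService(null);
        var users = adminService.RetrieveMultiple(new QueryExpression("systemuser")
        {
            ColumnSet = new ColumnSet("systemuserid")
        });

        foreach (var user in users.Entities)
        {
            // Execute SetProcessRequest in the context of each user,
            // so the active BPF is set for that user on the target record
            var userService = serviceFactory.CreateOrganizationService(user.Id);
            userService.Execute(new SetProcessRequest
            {
                Target = targetRecord,
                NewProcess = bpfToSet // EntityReference to the BPF's "workflow" record
            });
        }
    }
}
```

Creating the organization service with each user’s id is what makes the request run in that user’s context, which is the crux of this approach.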

Mentioned below is an analysis on Pros and Cons of this approach


Pros:

  1. We can configure the workflow code activity to run when the record is created. It will be a synchronous operation and will reflect for all the users immediately.


Cons:

  1. There is a 2-minute timeout constraint on the workflow assembly. Therefore, depending upon the number of users for which we want to set the same BPF, there are chances that the request will time out.
  2. In the above example, I am looping through all the user records in Dynamics. However, in real-world scenarios, not all the users may have the required license or access to the BPF record. Therefore, for those users, we will encounter an exception in the request. To navigate through that, we will need to pass only the users that are essential for setting the same BPF.
  3. We will need to be careful when using “SetProcessRequest”, because if there is already an active instance of the BPF against the target entity, it will not preserve the same stage value but rather overwrite the previous instance of that BPF.

2. Using client side API to check the active BPF for the logged in User and changing it

In this approach, the idea is to check the BPF which is currently active for the logged-in user and change it in the form’s OnLoad event depending upon the business requirement. Mentioned below is the JavaScript code snippet for the same.

img 2


Pros:

  1. The JavaScript code snippet can simply be embedded in the form OnLoad event and will be easy to maintain and change.


Cons:

  1. After setting the correct BPF for the user, there will be a one-time reload of the form. This may cause some user experience issues as compared to the first approach.
  2. This approach will only change the BPF for a user once they open the form. Therefore, at the backend, the system will continue to store different values of stage id and process id for each user.
  3. Due to this, it is imperative that the steps mentioned below, related to retrieving the active stage and process, are implemented instead of doing a direct comparison with the stage and process id fields.


Configuring out of box workflows triggering on change of stage and process id fields

Until Dynamics 2016, running workflows on update of the stage id and process id values was supported. For example, mentioned below is an out of box workflow which triggers on update of the stage and process values on a record.



However, in Dynamics 365, with the presence of multiple active BPF’s on a record, this will need to be changed. Mentioned below is the reason for the same.

  1. Suppose in CRM 2016, there were two BPF’s A and B on the record. At a time, only one could be active. Therefore, the above-mentioned workflow would have triggered only for the active process.
  2. However, now there are multiple active BPF’s on the record. Therefore, the above process will trigger for each of the two BPF’s. This might cause some issues.

To resolve this situation, we need to make the changes mentioned below in the workflow and in the corresponding BPF from which we want to trigger the workflow.

  1. Make the process On Demand and remove any on change events in it.

The first step is to make the workflow on demand and remove any change events mentioned in the workflow.

img 5

2. Calling the workflow from a stage event in the BPF.

This is a newly introduced feature in BPF’s. In a BPF, we can now add a step that will run an on demand workflow when a stage is entered or exited. Mentioned below are the steps for the same.

img 6

a) Select a stage, click on the “+ Add” button, and select “Add Workflow”. Review that a workflow step is added on the stage.

img 7

b) Now, for the workflow step, select the appropriate Trigger and workflow. Trigger can assume two values, “Stage Entry” and “Stage Exit”. In the workflow lookup, we can select any active on demand workflow set on the same entity as the BPF target entity, i.e. “account” in our example.


c) After specifying all the details, click on “Apply” and then “Validate” and “Update” the BPF.

Configuring out of box workflows doing active stage and process name comparison on the target entity

As described earlier, with the advent of Dynamics 365, there can be multiple active BPF’s on the same target record. Moreover, as discovered in the previous blog, each user may have different stageid and processid values depending upon the active process set for that user.

Therefore, in our out of box workflows, we cannot do a direct comparison on the stage and process name values. The screenshots mentioned below describe the changes that we need to make in these kinds of workflows.

a) Mentioned below is a screenshot showing a basic example of a workflow where we were comparing the values of stage and process name before executing a step



Any such workflows may cause some issues. The steps mentioned below show how we can correct the above-mentioned issue.

b) While suggesting the solution, I am assuming that we are interested in finding out the current stage of the active instance of a particular BPF process

In this case, we need to write a custom workflow code activity to make the C# SDK requests mentioned below.


/// <summary>
/// Retrieving the current process active stage as per the design changes introduced in Dynamics 365
/// </summary>
/// <param name="service"></param>
/// <param name="currentTargetRecordID"></param>
/// <param name="currentTargetLogicalName"></param>
/// <returns></returns>
protected string RetrieveActiveStageName(IOrganizationService service, Guid currentTargetRecordID, string currentTargetLogicalName)
{
    string activeProcessName = "";
    string activeStageName = "";

    // Retrieves all active BPF instances for the entity record
    RetrieveProcessInstancesRequest activeProcessReq = new RetrieveProcessInstancesRequest
    {
        EntityId = currentTargetRecordID,
        EntityLogicalName = currentTargetLogicalName
    };

    RetrieveProcessInstancesResponse activeProcessResp = (RetrieveProcessInstancesResponse)service.Execute(activeProcessReq);

    if (activeProcessResp.Processes != null && activeProcessResp.Processes.Entities != null)
    {
        for (int i = 0; i < activeProcessResp.Processes.Entities.Count; i++)
        {
            var processInstance = activeProcessResp.Processes.Entities[i];
            activeProcessName = processInstance.Attributes["name"].ToString();

            // Display name of the BPF for which we need the active stage
            if (activeProcessName.Contains("BPF A"))
            {
                var _activeStageId = new Guid(processInstance.Attributes["processstageid"].ToString());

                // Retrieving all the stages available in the BPF
                RetrieveActivePathRequest pathReq = new RetrieveActivePathRequest
                {
                    ProcessInstanceId = processInstance.Id
                };

                RetrieveActivePathResponse pathResp = (RetrieveActivePathResponse)service.Execute(pathReq);

                // Looping through the stages and selecting the active stage
                for (int j = 0; j < pathResp.ProcessStages.Entities.Count; j++)
                {
                    // Retrieve the active stage name based on the active stage id of the process instance
                    if (pathResp.ProcessStages.Entities[j].Attributes["processstageid"].ToString() == _activeStageId.ToString())
                    {
                        activeStageName = pathResp.ProcessStages.Entities[j].Attributes["stagename"].ToString();
                    }
                }
            }
        }
    }

    return activeStageName;
}

We will then add the above-mentioned code as a workflow code assembly and register it as a step in the out of box workflow.


Active BPF related issues while Migrating from Dynamics 2016 to Dynamics 365

Recently in my last engagement, we ran into some issues which were related to the BPF Design changes that were introduced in Dynamics 365. The mentioned below blog highlights the scenarios which we need to consider when we are planning the upgrade from Dynamics 2016 to Dynamics 365.

Problem Statement

The Dynamics 365 BPF design change causes the issues mentioned below.

  1. Each user may see a different active BPF on the same record, causing user perception and usability issues.
  2. For the records migrated from Dynamics 2016 to Dynamics 365, a different active BPF may appear for each user.
  3. Out of box workflows depending upon the process and stage attributes behave unpredictably.
  4. Server-side APIs do not consider the active process which is set on the record for a particular user. Therefore, a different active process is returned for each user.


Before Dynamics 365, BPF used to follow the mentioned below architecture

  1. Each entity could have at most one active business process flow.
  2. On the entity record, the GUID of the active BPF was saved in the processid field.
  3. Similarly, the GUID of the active stage was saved in the stageid field.

For setting up the active process on any entity record, we were just setting the values in these two fields, either programmatically or using out of box workflows and actions.

However, with Dynamics 365, a record can have multiple concurrent active BPF’s. As per different MSDN blogs, the changes mentioned below are to be made in the system.

  • Instead of setting values in these two fields, the user needs to execute a SetProcessRequest against the target record.
  • As before, Dynamics also provides a command action, “SetProcess”, which can be called in an out of box workflow to set the BPF.
  • The two fields processid and stageid are not to be used for comparison, as there is no guarantee that the correct values will be written back to these fields.
  • For programmatically progressing to the next stage, we will need to execute two separate requests: RetrieveProcessInstancesRequest and RetrieveActivePathRequest.


Issues Encountered

Different BPF Coming for Different Users

The main problem with SetProcessRequest is that, in Dynamics 365, it only sets the BPF for one user and not for everyone else. For example, consider the scenario mentioned below.

a) For an entity, there are two BPF’s. BPF A and BPF B. BPF A is the default business process. Mentioned below screenshot shows the two BPF’s configured for an “account” entity.

Img 1


In BPF A, there is only one stage, where the user needs to enter the “Account name”. Mentioned below is another BPF, BPF B, which also has just one stage, “Account Number”, for the account entity.

img 2

b) Now, write an out of box workflow on create of account entity. Please note that the workflow can be configured for update of account also. In the workflow, the idea is to change the active process of the record from BPF A to BPF B. Mentioned below is the screenshot for the same

img 3

img 4.png


Please note that action “SetProcess” will behave the same way as the SDK call of “SetProcessRequest”.

c) Now create a new record of account. Mentioned below are the screenshots for the same.

Please note that, being the default BPF, BPF A appears on the account form before the record is created.

img 5


Now create the record. On doing so, the workflow that we have written will execute and will set the active BPF of the account record to “BPF B”. Review the BPF which appears on the record.

img 6


Another way of setting up the BPF would be to execute “SetProcessRequest” in a post-create plugin on the record. It will behave the same way.

d) Now log in with another user who is not the owner of the workflow. Open the record created in the previous step. Review that “BPF A” appears in the record header.

img 7

This implies that the workflow did not set “BPF B” as the active business process for the record for all the users. Similar behaviour is observed for “SetProcessRequest” also.


  1. Until Dynamics 2016, all the users were seeing the same active business process flow with the same stage. However, with Dynamics 365, it depends upon the user. Different users can see different BPF’s on the same record even if they have the same security role, which can lead to confusion.
  2. If there is any client-side scripting based upon the active BPF, it may behave differently for different users.


Workflows based on Process id attributes not behaving properly

A second issue which we encountered was that any out of box workflow based on the process id related fields was behaving weirdly. The things mentioned below were observed.

a) Mentioned below is an on demand workflow we created. It just reads values from the Process id field and populates them in the account. Please note that the scope is “Organisation” and not user.

img 8

img 9

Now, we run the on demand workflow for both the users.


For the administrator user, please note that it gives the stage name of BPF B and for the other user it gives the stage name of BPF A


img 10.png

img 11.png


  1. As compared to the first issue, this abnormal behaviour may have some major implications. There is a wide variety of checks that users might have placed on the active stage name. All those workflows need to be analysed; otherwise they can cause major data issues in the environment.

2. Some of the above workflows may be running for the entire organisation. However, with the concept of an “Active Business Process Flow” per user, the implications for those workflows will need to be analysed.

C# SDK having a confusing concept of the active stage of the current active process

The blog from Microsoft mentioned below indicates that, with Dynamics 365, the processid and stageid fields present on the record are not to be used. Instead, we must modify the design to read the active stage and process from the newly introduced APIs.


 img 12

However, when executed, RetrieveProcessInstances returns a different active process depending on the user context in which the request is made. The blog indicates that the results will be sorted on the modified on date and that the first record will be the one visible in the GUI. However, that does not seem to be the case for some users. Therefore, the concept of Active Process and Active Stage in the C# SDK call seems confusing.

Xperido-Dynamics 365 Connection Error

The blog concerns the connectivity error we were getting between Dynamics CRM and Xperido after Dynamics was upgraded from CRM 2016 to Dynamics 365.

We were using Xperido for document generation, with SharePoint Online configured as the document repository. The version of Xperido which we were using was 3.4.1703.2016. When the organisation was still on Dynamics CRM 2016 version, the document generation was working as expected. However, after it was upgraded to Dynamics 365, we started getting connectivity issues between Xperido and Dynamics.

Mentioned below were the issues which we observed:

  1. In Dynamics 365, after navigating to Xperido > Xperido Management > Server Status, we were getting the error mentioned below.

Xperido-Dynamics 365 Connection Error


  2. In the Xperido server, we were not seeing any request logs in the Xperido portal.

  3. There wasn’t any issue with the Xperido solution. The solution was compatible with Dynamics 365. Along with that, no change in security role or permissions was made for the user used in the Xperido-Dynamics integration.

Mentioned below highlights the exact issue which was causing the problem:

After checking the logs in Xperido Administrative Console in the Xperido server, it was observed that Xperido was giving errors during CRM connectivity. It was an authentication exception that was causing the issue. After navigating to the installation directory of Xperido in the path Program Files > Xperido > Runtime.Net, we made the following observations:

  1. The version of DLL Microsoft.CRM.SDK.Proxy.dll was not compatible with Dynamics 365.
  2. The version of DLL Microsoft.XRM.SDK.DLL was not compatible with Dynamics 365.


We replaced these DLLs with the corresponding DLLs present in the Dynamics 365 SDK.

This resolved the issue.