Microsoft Dynamics AX Solution Architecture

Finding the X++ stack and AX user with public symbols in AX2012


This is an extension of a series of articles I wrote some time ago for AX2009, bringing it up to date for AX2012. It enables you to take a snapshot of your AOS and see what's running on it: call stacks and user sessions. If you're not familiar with analysing memory dumps, check this post first; it explains how to get set up:

http://blogs.msdn.com/b/emeadaxsupport/archive/2011/04/10/so-your-aos-crashed-is-hanging-or-you-just-want-to-see-what-it-s-doing.aspx

Here's a run-through of finding the X++ call stack and the AX user for AX2012. When you open the dump in WinDbg, run the command “kv” to see the call stack; you'll see output like this:

 0:007> kv
Child-SP          RetAddr           : Args to Child                                                           : Call Site
00000000`040b4c40 00000001`406ffc75 : ffffffff`fffffffe 00000000`00000001 00000000`00000001 00000001`403bc721 : Ax32Serv!cqlCursor::connection+0x14
00000000`040b4c70 00000001`406ffec6 : 00000000`1e84a7a0 00000000`1e56c720 00000000`00000001 00000000`040b50b0 : Ax32Serv!cqlCursor::DropTempDBTableInstance+0x75
00000000`040b4eb0 00000001`403b9771 : 00000000`00000001 00000000`00000001 00000000`040b50b0 00000001`40437346 : Ax32Serv!cqlCursor::Dispose+0x26
00000000`040b4ee0 00000001`403bc714 : 00000000`1e56c720 00000001`404351f0 00000000`1ac20d28 00000001`407d8d4b : Ax32Serv!cqlCursor::~cqlCursor+0x101
00000000`040b4f60 00000001`40369815 : 00000000`1e56c720 00000000`194c0d00 00000000`1b912800 00000001`407d8ead : Ax32Serv!cqlCursor::`vector deleting destructor'+0x14
00000000`040b4f90 00000001`403801d5 : 00000000`1e56c720 00000000`040b6100 00000000`1e0bfe40 00000001`4058067b : Ax32Serv!cqlCursor::freeRef_AdHoc+0x35
00000000`040b4fc0 00000001`40582026 : 00000000`040b50b0 00000000`00000006 00000000`040b51a0 00000001`4049e2b1 : Ax32Serv!assignCursor+0x75
00000000`040b4ff0 00000001`40582d95 : 00000000`000000c8 00000000`00000000 00000000`040b51b0 00000000`040b51a0 : Ax32Serv!CQLFreeVars+0x116
00000000`040b5040 00000001`40430c43 : 00000000`1b912800 00000000`00000005 00000000`00000005 00000000`00000005 : Ax32Serv!interpret::CQLEvalProc+0x715
00000000`040b52c0 00000001`4043370a : 00000000`03720cc8 000007fe`8f8fc3a1 00000000`18ebfb90 00000000`1b913940 : Ax32Serv!interpret::doEval+0x3e3
00000000`040b55c0 00000001`40434517 : 00000000`18ebfb00 00000000`1e522736 00000000`1e665e60 00000000`00930200 : Ax32Serv!interpret::evalFunc+0x2ca
00000000`040b56a0 00000001`404351f0 : ffffffff`fffffffe 00000001`4065167a ffffffff`fffffffe 00000000`040b6190 : Ax32Serv!interpret::xal_eval_func+0xc77
00000000`040b6030 00000001`4049e127 : 00000000`1b912800 00000000`1b912800 00000000`040b6190 00000000`040b7000 : Ax32Serv!interpret::xal_eval_id+0xd0
00000000`040b6070 00000001`4049e268 : 00000000`1b912800 00000000`00000000 00000000`1b912800 00000000`040b70e0 : Ax32Serv!interpret::evalLoop+0x167
00000000`040b60d0 00000001`40582b53 : 00000000`00000001 00000000`00000000 00000000`040b6180 00000000`00000000 : Ax32Serv!interpret::eval+0x58

For the X++ stack we care about the Ax32Serv!interpret::evalFunc frames. Let's take the evalFunc frame nearest to the top of the stack as an example:

00000000`040b55c0 00000001`40434517 : 00000000`18ebfb00 00000000`1e522736 00000000`1e665e60 00000000`00930200 : Ax32Serv!interpret::evalFunc+0x2ca

Take the second “Args to Child” value from that frame (00000000`1e522736 in this example) and then run “du <the location>” as shown below. This gives you the X++ method name:

0:007> du 00000000`1e522736
00000000`1e522736  "saveBudgetCheckResultErrorWarnin"
00000000`1e522776  "gDetails"

Now to find the class, take the first column in that frame (the Child-SP, 00000000`040b55c0 here) and run “dd <the location>+44” as shown below. The first value returned is the class ID in hexadecimal:

0:007> dd 00000000`040b55c0+44
00000000`040b5604  000f554e 1b913940 00000000 00000000
00000000`040b5614  00000000 00000000 00000000 00000000
00000000`040b5624  00000000 00000000 00000000 00000000
00000000`040b5634  00000000 00000000 00000000 00000000
00000000`040b5644  00000000 ffffff00 00000000 1e665e60
00000000`040b5654  00000000 fffffffe ffffffff 00000001
00000000`040b5664  00000000 1b913940 00000000 1e522736
00000000`040b5674  00000000 1b912800 00000000 040b7000

To find out the class name from this, first convert the ID to decimal by running “? <the ID>”:

0:007> ? 000f554e
Evaluate expression: 1004878 = 00000000`000f554e

The value 1004878 above is the class ID; you can find the class name in an X++ job, like this:

info(classid2name(1004878));

Now, to find the AX user, run the command “!tls -1”. This follows the instructions from this post, but I am giving you different offsets here:

0:007> !tls -1
TLS slots on thread: 2ed4.176c
0x0000 : 0000000000000000
0x0001 : 0000000000000000
0x0002 : 0000000000000000
0x0003 : 0000000000000000
0x0004 : 0000000000000000
0x0005 : 0000000000000000
0x0006 : 0000000000000000
0x0007 : 0000000000e29900
0x0008 : 0000000000000000
0x0009 : 0000000000000000
0x000a : 0000000000000000
0x000b : 0000000000000000
0x000c : 0000000000000000
0x000d : 0000000000000000
0x000e : 0000000000000000
0x000f : 0000000000000000
0x0010 : 0000000000000000
0x0011 : 0000000000000000
0x0012 : 0000000000000000
0x0013 : 0000000000000000
0x0014 : 0000000000000000
0x0015 : 0000000000000000
0x0016 : 0000000000000000
0x0017 : 00000000005c70f0
0x0018 : 0000000000000000
0x0019 : 0000000000000000
0x001a : 0000000000000000
0x001b : 0000000000000000
0x001c : 0000000000000000
0x001d : 0000000000000000
0x001e : 000000000f2442a0
0x001f : 00000000005daca0
0x0020 : 000000000f177e60
0x0021 : 0000000001829210
0x0022 : 000000001916a600
0x0023 : 0000000000000000
0x0024 : 0000000000000000
0x0025 : 0000000000000000
0x0026 : 0000000000000000
…<cut short for display>…

Then take the location above and run the command below to find the user's session block (the class instance that represents their session). Note that in this example I've picked the 0x0022 entry from the !tls -1 list; it won't always be the same entry. Unfortunately, with public symbols (i.e. outside of Microsoft) you can't do the !tls part more accurately, so you'll need to try each entry that has a value other than zero. You'll know you've found the right one in the next step if you see a username at offset 17c.

0:007> dq 000000001916a600+68
00000000`1916a668  00000000`194c0d00 00000000`0f26f790
00000000`1916a678  00000000`00000000 00000000`00000000
00000000`1916a688  00001d5b`00000000 00000001`3feebdc0
00000000`1916a698  00000000`00000001 00000000`00000000
00000000`1916a6a8  74007300`69004c00 65006700`61005000
00000000`1916a6b8  fff30002`00000000 00073ea8`0e010000
00000000`1916a6c8  02000000`00010001 00000000`000001ff
00000000`1916a6d8  10a8effc`a833b3d8 00000000`19152760

Then take the first value returned above (00000000`194c0d00 in this example), and at various offsets from it you'll find information about the user, as shown below.

User’s AX username:
0:007> du 00000000`194c0d00+17c
00000000`194c0e7c  "kkidder"

User’s AX company:
0:007> du 00000000`194c0d00+298
00000000`194c0f98  "dat"

User’s client machine name:
0:007> du 00000000`194c0d00+57c
00000000`194c127c  "eeax2008"
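
If you want to cross-check what you find in the dump against the live system, a small X++ job can list the active client sessions on the AOS. This is a minimal sketch that assumes the standard AX 2012 SysClientSessions table; the status value and field names are assumptions, so adjust them for your environment.

static void listActiveAxSessions(Args _args)
{
    SysClientSessions sessions;

    // Status == 1 is assumed to mean a running session; verify in your environment
    while select SessionId, UserId, ClientComputerName from sessions
        where sessions.Status == 1
    {
        info(strFmt("Session %1: user %2 on %3",
            sessions.SessionId,
            sessions.UserId,
            sessions.ClientComputerName));
    }
}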

Happy debugging!

/Tariq Bell


How to handle high memory situations on your Dynamics AX AOS


This article explains a real-world approach to dealing with high memory situations in Dynamics AX. Being able to identify the root cause of memory usage is obviously useful in support scenarios when there's a memory problem in a production system, but it's also useful in test: catch those things and identify the root causes before they have a chance to get into production.


The key to my suggested approach here is simply to apply practical steps, rather than relying on deep knowledge of the underlying memory structures (although that can be fun too!).


Notice that I was careful above not to say "memory leak". It's common for people to describe high memory situations as leaks, but I'm making the distinction here because, in my experience, the root cause of a high memory situation is rarely a leak. A leak would be a process that, every time it runs, leaves behind a little bit of memory that cannot be used by the next process. The majority of the time there is a runaway process or other heavy load causing the issue, but not actually a leak.


What shape is the issue?


The first thing to consider is what shape the high memory takes. Is it:

  • Constantly growing steadily over a long period - could be a leak - you're looking for something that runs a lot over and over.
  • Suddenly spiking up really high - unlikely to be a leak, is probably a runaway process.


Performance counters will tell us: within the counter set called "Process", use the Private Bytes and Virtual Bytes counters and enable them for Ax32Serv to monitor the AOS process. The results of these counters give you a graph in Perfmon which shows the shape of the memory issue.
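
If you also want to log memory usage from inside AX, for example at the start and end of a suspect process, you can read the same Private Bytes value through CLR interop. This is only a sketch: run it on the server tier (for example from a batch job), because if it runs on the client it measures the client process rather than the AOS.

static void logProcessPrivateBytes(Args _args)
{
    System.Diagnostics.Process process;
    int64 privateBytes;

    // CLR interop needs an explicit permission assertion
    new InteropPermission(InteropKind::ClrInterop).assert();

    // GetCurrentProcess returns the process this X++ code runs in,
    // so this only reflects the AOS when executed on the server tier
    process = System.Diagnostics.Process::GetCurrentProcess();
    privateBytes = process.get_PrivateMemorySize64();

    info(strFmt("Private bytes: %1 MB", privateBytes div (1024 * 1024)));

    CodeAccessPermission::revertAssert();
}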


If your issue is memory slowly building up and up then perhaps you have a real memory leak. It's important to note that for a leak the memory must keep growing forever; if it just grows up to a point and then stays around that level, even if that level is quite high, then it's not a leak.


Sudden spiking memory:


For sudden spiking memory I like to use a performance rule in the tool Debug Diag. This enables me to create a rule based on a performance counter, so that when a certain counter value is hit it will take a memory dump of my target process. So I tell it to look at the Private Bytes counter (mentioned earlier) and create a dump of Ax32Serv when the counter goes over the normal running threshold of the AOS - if it normally runs at 4 GB, I'll get a dump when it goes past 5 GB - and you can tell it to generate a few dumps a few minutes apart.


Once you have the memory dumps, just look at what was running at the time each dump was taken; it should be easy to see which process was running across all three dumps. There wouldn't be many things that run for a few minutes, so typically you can expect it to be the only thing running across all three. You can find out how to check a dump file for AX here.

.NET process using memory:


If you suspect it's a .NET process on your AOS (anything running as IL - batches, services/AIF) using the memory, then it's pretty easy to identify: there is a .NET memory analysis script included in Debug Diag. Just collect a dump while memory is high, and then run it through the Debug Diag analysis (second tab) using the .NET memory script listed there.


This will give you a nice HTML output which flags any large objects in memory. As far as an AOS is concerned, don't get too carried away reading every line of this report - at the top there will be headlines if it has noticed something it thinks is wrong, so look at those first - you're pretty much looking for it to report that there is a large data table in memory or something along those lines. The output is pretty human readable, so expect that it's quite easy to decipher what it's trying to tell you.
If you've written your own XppIL process and its memory usage seems much higher in IL than it is when you run it as normal X++ then read this article.


A real "LEAK"


The hardest type of memory issue to investigate is constant growth. These are rare in newer versions of Dynamics AX, since the AOS became a 64-bit process; prior to that, the 32-bit resource limit could make it seem like there was a leak, when actually if the process could have used more memory it would have been fine.


In a suspected leak situation the first thing to do is take a practical approach: look at when it started, what code changed or what new processes were introduced in that period, and then test those changes/processes to see if you can reproduce the memory issue. If you can catch it like this then finding the root cause will be easier.


If the practical approach yields nothing then you're likely to need to talk to Microsoft; contact Support (or come through your Partner if you don't have a Support contract with us directly). Expect us to run over the kind of things I've explained here, and then we'll collect a different kind of memory dump which we can analyse at Microsoft to explain what is happening.

/Tariq Bell

Power BI and Dynamics AX: Part 2: Extract and Transform


This is part two continuing from Power BI and Dynamics AX: Part 1: Introduction.

This post will focus on the steps involved in extracting data from Dynamics AX into PowerQuery and transforming that data ready to be exposed in your visualisations, or used in PowerQ&A.

Extracting Data via OData

The first few steps of this process would typically be driven by an IT Professional within the business. It requires some knowledge of the structure of data within Dynamics AX to identify the correct query.

To extract data from Dynamics AX, we will be using OData (Open Data Protocol) services. OData allows us to expose a specific query from Dynamics AX for use in external applications. Which queries are available through OData is managed from within AX in Document Data Sources.

Organisational Administration \ Setup \ Document Management \ Document Data Sources

A detailed guide to setting up your Data source can be found here.

Once you have set up your data source, you can view all your published feeds from this URL (replace <Server Name> with your AOS server):

http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/

Note: some queries, due to the way they have been developed or to certain joins, do not expose through OData properly. The best way to test is a quick connection from Excel after publishing the OData service, to see whether the query has worked properly.
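
You can also sanity-check the AOT query server-side with a small X++ job before relying on the feed. This is just a sketch, using the ProjTransPostingCube query mentioned later in this post; substitute the query you intend to publish.

static void testDocumentDataSourceQuery(Args _args)
{
    // Replace ProjTransPostingCube with the AOT query you plan to publish
    Query    query    = new Query(queryStr(ProjTransPostingCube));
    QueryRun queryRun = new QueryRun(query);
    int      records;

    // Stop counting after 1000 records; this is only a smoke test
    while (queryRun.next() && records < 1000)
    {
        records++;
    }

    info(strFmt("The query returned at least %1 record(s)", records));
}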

Some things to keep in mind when selecting your query:

  • Invest the time to find the right query; there is a huge number of queries available in standard AX which you should review before trying to extract multiple tables and join them externally.
  • OData protocol filters are not supported, so if you require a filtered dataset to be published, you need to apply the filter from within the AX query or from within PowerQuery. In AX you can do this through the AOT, or by selecting the “Custom Query” option on the Document Data Sources form when creating a new data source.
  • Each record in the response must have a unique primary key. AOT View objects which don’t have a unique key will not be presented when you try and query the OData Service.
  • If you try to access the URL from a web browser and you receive an “Internal Server Error”, you may have published a ‘bad’ query; try setting your data sources to inactive and reactivating them one by one to find the problem query.

Once you have your OData service ready to go, we are ready to connect to the data from PowerQuery. PowerQuery has two main functions when working with data: extraction - from Dynamics AX as well as other data sources - and transformation - to remove unwanted columns, rename columns and tidy up our data to make it as user friendly as possible.

PowerQuery is accessed through a ribbon within Microsoft Excel. If you don’t have PowerQuery installed, you can get it here.

A detailed guide of how to connect to your OData source from PowerQuery can be found here.

Important Note: If you plan to use the scheduled refresh functionality within Power BI, you need to ensure the correct case has been used for the OData URL when it is entered into PowerQuery. At the time of writing, the authentication process for Power BI refresh looks up credentials for the OData service with the following case:

http://<Server Name>:8101/DynamicsAx/Services/ODataQueryService/

If any characters differ in upper/lower case from the above, the authentication will fail on refresh.

Transform your Data

After you’ve connected to your OData source and pulled it into PowerQuery, you can now leverage the tools within PowerQuery to transform your data ready for end users and visualisations.

The data exposed from Dynamics will come out with technical names and often unwanted data; below is an example of the ProjTransPostingCube query from Dynamics AX 2012 R3 CU8.

A detailed guide of how to perform transformations can be found here.

The key transformations to implement when working with Dynamics AX data:

  • Remove unwanted columns.
  • Rename Column Names to user friendly names
    • Example “ProjTable_Name” to “Project Name”
    • This step is key to PowerQ&A to support natural language queries.
  • Change DateTime formatted fields to Data type “Date”
    • Example “10/02/2015 00:00:00:00” to “10/02/2015”
  • Merge with other Dynamics AX or Internal Data sources to provide a combined dataset to end users.
    • More details on a merge example can be found here.
  • Insert Year, Quarter and Month Columns to be used in Visualisations.
    • If you select a date field in the query you can add these columns by using the Date option on the “Add Column” ribbon.
    • Once added, ensure you change the Data Type to “Text”; otherwise, when you include it in visualisations it will try to total years as real number values.

Once transformed, a data source is not only easier to work with when designing visualisations, it also allows PowerQ&A to work with the data in natural language queries. Below is the same ProjTransPostingCube query after transformation.


Enhancing your data with measures

Using PowerPivot within Excel, you can start to add calculated values and KPIs to your data set to use within your visualisations. This functionality is accessed from the PowerPivot tab within Excel; to open up the PowerPivot Data Model, click Manage.

Using the calculation grid at the bottom of the pane you can create calculated measures which will then be available in your visualisations. In the example below we have a new measure for “Actual Cost” which is based on Project Transactions, filtered on “Project – Cost” transactions. A detailed guide of how to create measures can be found here.

Once you’ve created your measures and saved the data model, they will be available in the field list for PowerView and can be added to visualisations like in the example below.


If you would like to align your terminology and calculations to the standard Dynamics AX cubes, review Cube and KPI reference for Microsoft Dynamics AX [AX 2012] for a breakdown of the measures available in the standard cube and the basis of the calculations.

Merging with Data from Azure Marketplace

One of the most powerful features of PowerQuery is leveraging data from other data sources, including the Azure Marketplace. The Marketplace has a collection of data from population statistics, business and revenue information and reference data to help build your visualisations. One of the most helpful is a data source for Date information. While this may sound basic, it’s really helpful in presenting visualisations without having to reinvent the wheel in your query.

A great example, and one I have used, is DateStream (http://datamarket.azure.com/dataset/boyanpenev/datestream): a free data source which contains a reference of the month name, day name, quarter, etc. for dates.


To use a data source from Azure, you first need to sign up on Azure Marketplace with your Microsoft account (Live) https://datamarket.azure.com/home. Once you’ve signed up and found the data source you would like to use, you subscribe to the data source through the data market. Now when we log in through Excel, it will be available for us.

In Excel, the process is similar to if we are connecting to OData. From the PowerQuery tab select “From Azure” > “From Microsoft Azure Marketplace”. You will then be prompted to enter your credentials (using your Microsoft account you used at the Azure Marketplace). After signing in you will be presented with a list of data sources you have subscribed to online.

Once the data is loaded into your data model, you follow the same merge process we described earlier to merge the new date fields with your data source. The result is now the additional columns in your primary query. In the example of the date reference query, we now have Year, Quarter and the Month name to use in visualisations.

Sharing your Transformed Query with others

After you’ve invested the time to transform your query into a nice end-user-ready data source, you can save it to your Office 365 Data Catalogue. This will allow other users in your organisation to benefit from the time you’ve invested in transformation and work with the end result in Excel and in their visualisations. You can access the Data Catalogue from the PowerQuery ribbon in Excel; you’ll need your Office 365 credentials to log in.

A detailed guide to saving and sharing queries can be found here.

Now you should have a clean and friendly data source available within Excel. The next post will talk about creating and publishing visualisations.

Thanks,

Clay.

Power BI and Dynamics AX: Part 3: Create and Publish Visualisations


This is Part Three, continuing from Part 2: Extract and Transform. This post will focus on the creation and publishing of visualisations.

At this point in the process we have a workbook, with a transformed data model – we are ready to create a visualisation to present our data. This post isn’t meant to be a detailed guide of how to create a visualisation, you can find one here.

To get started with a PowerView visualisation, open the workbook we built our query in and click PowerView on the Insert tab.

Excel will load a blank PowerView sheet for you to begin creating your visualisation. Assuming your query was created correctly, you should now see your transformed data set in the “PowerView Fields” pane to the right of the screen. (If you don’t, jump back here to see how to set up the query.)

You can now start by checking the fields you would like to present in your visualisations, or dragging and dropping them from the field list onto the PowerView sheet. Once you start adding data you will notice a new ribbon “Design” is available, from here you can change visualisations and the presentation of the data.

Some key things to keep in mind when creating your PowerView visualisations:

  • Keep your visualisations simple
    • PowerView is designed to give high impact visualisations that allow a user to answer questions, or prompt action quickly and easily. Complex and detailed visualisations make this extremely difficult to do.
    • Don’t try and answer every business question in one single PowerView sheet – don’t be afraid to have multiple sheets within your workbook to present data in different ways.
  • Keep the end user device in mind
    • With mobile apps and HTML 5 support, PowerView can be viewed on a variety of devices, with different interfaces of different sizes. If you create visualisations with a small font size, or with bars on the chart that are too small, users on small touch devices won’t be able to interact with them.
    • Some visualisations aren’t supported on all modes – web, mobile app, HTML 5. While basic visualisations are, there is limited support across devices for PowerMap – check on the PowerBI team site for the latest support.
  • Leverage Hierarchies for drill downs
    • In Chart, Bar and Pie visualisations you can add multiple fields to the Axis to create a dynamic hierarchy. The visualisation will then allow the user to drill into the values in the chart.

    • While this example is using date values, you can include any fields in the hierarchy – for example Client, Project, Worker to drill into more detail on project analysis.
  • Select your Axis and Legend Fields carefully
    • Keeping in mind the purpose of simple visualisations, if you select a field that has a high number of values (Like 1,000 Customers) and then add this to your legend on your chart – not only is this going to look terrible – but PowerView will also limit the number of customers it displays and provide you with a sample set. In most cases, you won’t want to view only a sample set.
    • Leverage hierarchies to group values so a manageable number of values are presented at any one time. For example: Customer Group, Customer

Publishing your Visualisation

Now you should have a workbook, including your data model and visualisation, ready to share on PowerBI. The publishing process is quite simple; the main consideration at the time of publishing is security.

Once you publish your workbook, the Dynamics AX security model is no longer applied – the data security is only applied at the time of refresh. For example, if the designer of the report has access to 10 Dynamics AX legal entities, once they publish the report anyone with access to the workbook will see the contents. This is due to the fact that the information is uploaded into the Excel Data Model in Office 365 as part of the refresh process.

A key component of your PowerBI strategy needs to be focused on the plans for security related to your data. The security will be managed by SharePoint security, based on the workbook. As an example, if you create a folder in SharePoint for “Sales – West Region” and provide access to your West Region team, then only this team will have access to the report once it’s published to this folder.

The PowerQ&A service is also based on this security model; when a user asks a question in Q&A, PowerBI will look across all workbooks that user has access to within PowerBI. For this reason it’s even more important to ensure the correct security has been set up from day one.

Note: To publish your workbook in PowerBI for Office 365, you will need an Office 365 subscription and an active licence for PowerBI.

Firstly, you will need to create a PowerBI site on your Office 365 tenant if you haven’t already done so; to create a new PowerBI site, follow the detailed steps here.

Once you’ve created your new PowerBI site, you can upload your workbook by selecting “Add” > “Upload File” under the documents section.

Once the workbook is uploaded, you will notice it will automatically enable itself for PowerBI. If you uploaded your document through the SharePoint library, or had already loaded it before adding PowerBI to your Office 365 tenant, you will need to enable the workbook manually. You can do this by clicking “Enable” on the ellipsis menu on the workbook.

One last step is to enable the workbook for PowerQ&A; this is also done from the ellipsis menu on the workbook itself. Workbooks are not automatically enabled. The main reason is that you may have multiple workbooks with the same data source, and you don’t need to enable all of them for Q&A – only one, as long as it’s the same data source in the background.

Now that it’s enabled, you can click the workbook to see your visualisations on PowerBI. Done!


You’ll notice in the bottom right-hand corner you have an icon to take you to the HTML 5 view. It’s recommended to always view your visualisations in HTML 5 to see how they are presented; depending on the device your user is using, you want to make sure the visualisations are clear and easy to understand. In some cases you will see that they render slightly differently in Excel vs HTML 5 and may require some tweaking.


You’ve now got a Dynamics AX data source extracted and published in PowerBI. The next post in this series will talk about PowerQ&A and the data refresh options available for the data source.

Thanks,

Clay.

Power BI and Dynamics AX: Part 4: Data Refresh and Q&A


This is part four, continuing from Part 3: Create and Publish Visualisations. This post will focus on Data Refresh within PowerBI and how to get started with Power Q&A.

At this point in the process we have extracted data from Dynamics AX using Power Query, transformed our data, created a visualisation using PowerView and published it to our Power BI site on Office 365. The next step is to set up a data refresh to ensure our data and visualisations stay current. If you don’t schedule a data refresh, users will need to open the workbook in a desktop version of Excel and manually refresh the data. This will also require a connection to the AOS for the user refreshing the workbook.

The refresh process has a few components. The initial setup involves a few steps to connect your AX instance, but after this initial process new workbooks can be set up quite easily.

Firstly, you need to install the Data Management Gateway (the download is available here). The gateway is used to allow PowerBI to connect back to your on-premise data without a user being involved. You will need to deploy the gateway on a server with internet access and access to the Dynamics AX AOS. Once installed, follow the steps outlined here to configure the gateway with your tenant of Office 365.

Once configured, you can now add a data source. This is done from the PowerBI Admin Centre. You will need to create a data source for each Dynamics AX Instance (Note Instance, not Query – so you will need a data source for Production, Dev, Test, etc).

To create the new Data Source, open the PowerBI Admin Centre from the Settings (Gears) option in the top right hand corner from your PowerBI site. From the Admin centre, select Data Sources. Now click the plus sign to add a new data source and select “Power Query”

You’ll now be asked to paste in the connection string used for the connection. You need to get this from your Excel workbook. Open your Excel workbook and open the “Connections” form from the Data tab.

Note: if the Connections option is greyed out, it may be because you’re on a PowerView sheet. Insert a new blank Excel sheet and the option will become available. (Don’t forget to delete the new sheet later)

You’ll see your different data sources in the connection window; select one of your AX data sources and click “Properties”. On the Definition tab you will see the connection string; copy and paste the entire connection string into the PowerBI admin centre.

It should now load the list of connections you have in your data source – you need to complete the details for each data source:

  • Name: Use something informative, example “Dynamics AX – Test Environment”
  • Description: For your internal purposes
  • Gateway: Select the gateway for the connection to use, you may have multiple gateways configured in your environment.
  • Set Credentials: These are the credentials used for the refresh; this account must have access to Dynamics AX to perform the data refresh. It is recommended to use a system account for this refresh, not a specific user’s account.

You can now test the connection. The next two steps will allow you to specify users which have admin privileges over the data source and where notifications of errors should be sent.

Once the data source is configured we can now go and schedule our workbook to refresh. Return to your PowerBI site and navigate to the workbook. From the ellipsis menu on the workbook select the “Schedule Data Refresh” option.

From here you can see the refresh history, as well as configure the refresh for this specific workbook. You must set up the data source in the Admin centre first; otherwise this step will fail. You can find detailed steps on the refresh here.

Using Power Q&A: Natural Language Queries

Q&A is an extremely powerful tool, allowing users to use natural language to explore the data that you’ve prepared in your data model. This is where your transformation steps really pay off, as Q&A can leverage the friendly names you’ve given fields to allow users to explore your data.

To use PowerQ&A, from your PowerBI site click “Ask with PowerBI Q&A” in the top right hand corner. You’ll be presented with a blank canvas ready for questions.

As an example, using my data set I have asked for “Expense amount by client name” – PowerQ&A has prepared a visualisation of my project expense transactions sorted by client name.

Using the “Explore this result” pane on the right hand side you can start changing visualisations, filters and even the fields that are presented.

Q&A does a lot of work on its own to understand natural language: it identifies synonyms and deals with spelling mistakes, but you’ll notice as you begin to explore that PowerQ&A doesn’t always get it right. In the drop-down under your question you’ll see how Q&A is interpreting your question. In my example you can see it is showing “Show amount where transaction type is expense sorted by client name”. What you will notice is that sometimes Q&A can’t understand what you’re looking for and a word will be greyed out – this means Q&A didn’t understand your phrasing.

Synonyms are one of the easiest ways to teach Q&A about your business; you can do this through PowerPivot (detailed instructions here) or through PowerQ&A optimisation in Office 365. To manage this through Office 365 you need to open the Power BI Site Settings from your PowerBI site. From within your Site Settings you’ll see a tab for Q&A which will contain all your workbooks enabled for Q&A. From the ellipsis menu, select “Optimize for Q&A”.

You will be presented with a blank Q&A space for you to ask test questions; you’ll also notice in the pane on the right-hand side a summary of the optimisation that has already taken place. The first time you load the workbook you’ll notice the synonyms and phrasings already generated by Q&A automatically.


Starting with the last tab, Usage, this is extremely helpful for you to understand how your users are using Q&A, as well as what words or phrasing isn’t being understood. IT administrators and/or data officers should be regularly monitoring this tab to ensure Q&A is providing the right results, and is continuing to learn about the organisation.

Synonyms: you can use this tab to add a synonym to your column names. For example, you may internally refer to a “Product” as a “Part” – to teach Q&A you can add Part as a synonym for Product; now when users use the word “Part” in their questions, Q&A will be able to provide a response.

Phrasing is extremely powerful: it allows you to teach complex terms or expressions which are used within your organisation. As an example, let’s say I asked “Expenses by client” – in my mind I want the same result as my first example, but I didn’t say “expense amount”. You can see Q&A hasn’t interpreted this correctly; what it is showing me is clients which have had expenses, not the actual amount.

This is where optimisation comes into play, where we can teach Q&A that when I say “Expenses” I actually mean the amount of transactions which are of type Expense. So now, under the Phrasing tab, I can add a new phrasing.

As soon as I click ok and ask the same question again “Expenses by client”, the new result is shown below.


Some key things to keep in mind for Q&A:

  • Invest the time in PowerQuery transformation to make sure you start with a nice clean data set.
  • Plan for a pilot of Q&A before releasing it to your entire organisation, use this time to optimise your data and ensure you have your synonyms and phrasing worked out.
  • Remember access to data through Q&A is driven by your SharePoint security, so plan accordingly.
  • Q&A understands terms like more than, less than, last year, this month – explore with terminology and learn what works best for you.
  • Use the “featured questions” option to save questions and present them to users as they log in. This not only saves time in retyping questions, but also gives new users an introduction on what they can be asking.

Here are some great resources for Power BI, have a look at this content if you’re starting out:

This is the last post in this series focused on PowerBI for Office 365, the next post will be focused on an example of how to use the new PowerBI functionality with on premise data by setting up a tabular SSAS database.

Thanks,

Clay.

Sprinkling a little bit of IoT around Contoso’s Dynamics business process to make it more intelligent


I am increasingly hearing Dynamics customers asking about IoT solutions from Microsoft. This article describes a recent experience at a Dynamics AX customer, hope you enjoy. 

The Internet of Things is growing at a phenomenal pace and there are so many mind-boggling predictions about it that it can be hard to follow. To make things simple you can choose to remember just three predictions from Gartner; apparently these numbers will be valid for the next five years 🙂 so you should be ok barring major changes.

  • IoT will include 26 billion units by 2020
  • IoT and service suppliers will generate incremental revenue exceeding $300B, mostly in services by 2020
  • IoT will result in $1.9T in global economic value-add through sales into diverse end markets

As if talk about petabytes and terabytes of data wasn’t sufficient, now you also have to remember billions of sensors and trillions of dollars! Doesn’t it make you wonder where all this is going and how much is really true? Well, if it does, I invite you to look at this little bit of sprinkling of IoT devices that we are currently doing at Contoso. If this article interests you and you feel like you want to touch and feel some of these devices yourself, please come and see us at Convergence in Atlanta, March 16-19 2015, at a general session and a concurrent session. If you miss out on those opportunities you can always find us in the Customer Showcase in the EXPO hall.

So I have covered in previous posts what Contoso’s business is about. Contoso is the UK’s leading foodservice delivery company, supplying a full range of foodstuffs across the UK. It is a leading supplier to restaurants, bars, pubs, cafés and schools. Forty-five thousand customers place over a million orders a year from a catalogue of about five thousand products. More is here.

Since they make deliveries, they have their own fleet to do this. As you can imagine, any delivery of temperature-sensitive products like food or medicine has its fair share of challenges. One of these challenges for the COO of Contoso is how not to feel terribly sorry while he sits and watches thousands of pounds of food rotting in the back of a truck which has a broken cooling unit and is stuck in snarling traffic through the streets of London on a hot summer day. Half of such an event is enough to justify the meagre cost of implementing the Microsoft Azure based temperature-monitoring solution that Contoso has come up with to address their scepticism about the internet of things in general!

On a more serious note, it has happened a few times in hot summer months, with products like ice cream, shrimps and peas, that customers have called up to complain about drivers delivering defrosted stuff. There isn’t much choice for Contoso other than to provide a credit note when such a thing happens. It’s not only the lost revenue but, more importantly, the lost customer that is the real concern in such situations – not to mention the spoilt food. And there is the major risk of big damage with a whole cooling unit breaking down. This is a nightmarish scenario for every distributor engaged in any cold chain.

So, the genesis of what Contoso is building lies in a simple question to Paul Smith, the COO – “Paul, if this is a risk, why don’t you put temperature sensors in your trucks and hook them up to Azure and run some analytics on time series data to alert you before something breaks down? If results tell you something, it would mean there is some substance to all the hype”  

It was a simple enough question and a simple enough proposal. Paul was in as long as we agreed we are not going to hook up sheep pedometers to Azure to measure their health and hence quality of their meat or use automatic devices to feed pets when owners were away deeply engaged in a pub crawl! Convince yourself here – 1, 2, 3, 4, 5, 6

So we looked around for suitable temperature sensors. There are so many sensors on the market sensing so many things that at first it was hard for us, but we quickly established strict qualification requirements and narrowed it down to two sensors: a TI sensor and a ZenMeasure sensor from a vendor in China. ZenMeasure is called MiaoMiaoCe (in Chinese, this means “measure it every second”). The TI tag has more details here. You are welcome to browse details of their capabilities using the links, but I will summarise here the tests we did and comparisons we made.

ZenMeasure is pretty cool, pretty slick and seriously small – it is the size of an overcoat button. It was originally designed for consumer use, for measuring skin temperature of older patients and children in hospitals and care homes.

  So you can imagine their surprise when they got a request from Contoso to test their sensors to install in delivery trucks. In the early days (three months ago that is) ZenMeasure didn’t measure below freezing temperatures but this was quickly changed in an update a week later to allow Contoso to measure temperature of its chilled as well as frozen products on the road. (Note: In this market, there are many vendors and they move very quickly to publish changes to their products/apps).

When you go for something like this, or any device of this nature, you will quickly realize that the most important things are signal strength and battery life. Contoso is still evaluating battery life and comparing it with TI. Contoso procured two sensors at first and Paul installed them in his home refrigerator and freezer and used his iPhone app to test the features. He felt pretty excited about its capabilities. Now it’s been three months of testing and several new ZenMeasure sensors have been procured and installed in various places, including a few in delivery trucks; none so far indicate much loss in battery life. This is excellent! The TI sensor, on the other hand, is capable of sensing not only temperature but also humidity, motion etc., making it very likely to have lower battery life than ZenMeasure. Both communicate using BLE (Bluetooth Low Energy), both are very easily discoverable on iPhone and both meet the connection strength and connection range criteria.

Now about the major drawback. None of the sensor vendors we looked at have a Windows app. (Note: this is pretty common; vendors are building for iOS and Android, not so much for Windows. However, this is likely to change quickly with Windows 10 and Raspberry Pi support and a number of other things.) We tapped into teams in Microsoft and found out Chris had already made a WP app for TI. He agreed to build a WP app for ZenMeasure. He quickly got a ZenMeasure sensor for himself and it took him a few hours, I guess, to have the app working. There were a number of issues discovered and resolved, some included here to provide you a flavour of what’s in store for you if you go down the IoT road.

Some mundane stuff – Contoso spent a decent few weeks discussing where exactly to mount the sensors in the delivery trucks. They eventually called refrigeration engineers and got their help to settle on the best location within the delivery truck to mount the sensors. TI sensor comes with a little hook that can be used to hang it on the walls of the truck however ZenMeasure doesn’t come with anything so a DIY sort of casing was built to house the sensor so it could be safely stored and mounted.  

But let’s say we get a sensor, we got it working, and we got an app for it to see the reading from anywhere. Now what? After all, this is great for a hobbyist, but to be commercially useful to Contoso there should be some use of this data in some business process. Isn’t it?

It’s important to recollect here, from one of the previous posts, that Contoso has a Windows CE based drivers’ app that they use to make drop-offs to customers. Some customers ask for a temperature reading of the products being delivered. Today drivers do this by walking to the back of the truck, manually looking at the hard-wired thermometers and entering this reading by hand in the drivers’ app, which then ensures that the printed invoice contains the reading.

So, naturally this brought up all the questions related to integration: how exactly to integrate this ZenMeasure app and/or data from the sensor with the various other things going on with the business process. Contoso is also building a new WP app for drivers that will run on Microsoft Lumia 635 phones running WP 8.1. The backend is Dynamics AX and it uses a number of Azure technologies. More on that is available in the links at the bottom of the main Contoso page. So this is what is happening with the ZenMeasure data. With the new drivers’ app, when the driver has reached his drop-off destination and is ready to pick goods from the truck and drop them at the customer location, the drivers’ app interrogates both the chiller and freezer ZenMeasure sensors for the current temperature reading. This reading is then stored in the SQLite database on the local Microsoft Lumia device. Chris also made a suggestion to keep the interrogating to a minimum; more interrogation means lower battery life. Contoso has a regular maintenance schedule for its delivery trucks, which go through regular check-ups every ten weeks, and the goal is that the ZenMeasure sensors, with all the love and care showered onto them, should not need battery changes on a cycle shorter than that. These are very important considerations from a business process point of view; it’s important that disruption to the regular business process is kept to a minimum to justify building this solution for what is essentially an insurance against a future event of some likelihood.

Once the readings are in the local Lumia device’s SQLite database, the data is sent to SQL Azure storage along with all the other data from the drivers’ app. The reading is then picked up to be transferred to the Dynamics AX database, from where it is used further down in the business process. Note, this is critical for offline scenarios since connectivity on the road is poor in most places and really poor in the UK. There is also a Bluetooth printer that drivers carry with them to print invoices at their drop-offs. This led to another challenge, as for a while we could not get both the Bluetooth printer and the BLE sensor to pair with the Lumia device at the same time. Soon enough this issue was resolved. The drivers now have the ability not only to print the invoices using the Bluetooth printer but also to print the temperature at drop-off time on the invoice itself for customers who want that level of service. This helps customer service folks deal better with customer complaints about defrosted goods. Automated thermometer reading into the app ensures reliability – the reading is captured for each and every drop-off – and improves efficiency – drivers don’t have to read the thermometers manually as in the old system.

There is no end to the surprising problems we encountered while building this solution. Now, after full deployment, Contoso will have over 100 drivers using 100 Lumia phones in 100 vehicles with 200 ZenMeasure sensors installed on any given day. After a long period of contemplation, Paul decided to let his drivers keep the Lumia phones with them after the workday is over! So each driver has his phone now. And each truck has two sensors. However, drivers and trucks are in no way fixed: any driver can drive any vehicle on any given day on any given route for that day. This meant that after drivers were assigned routes (based on complicated math, a topic for another day!) and trucks, they would have to pair the two BLE sensors in that truck to their phones before they set off for the day. This is a step too complicated to add to an already complex and very well oiled business process. After all, the whole purpose of this exercise is to make the process more intelligent, more efficient and reliable. How can Paul leave this bit to chance? What if a driver couldn’t pair properly, or the pairing itself failed, or they just didn’t want to, etc.? So, Chris is working with the team to build a solution which ensures that all Microsoft Lumia phones are already permanently paired to all ZenMeasure sensors.

The WP drivers’ app also has two additional capabilities.

There are two background tasks running: one to capture battery life from the sensor and another to capture a temperature reading every couple of minutes from every sensor in over a hundred delivery trucks. Keeping in mind the offline scenarios due to poor connectivity, these captured time series are also stored locally on the Lumia phones and then streamed into Azure Event Hub. There are two plans for these time series. One is to use the battery life time series from all trucks and feed it to an Azure ML model that does a simple time series forecast to predict when the battery is expected to run out. This will give Contoso a very nice clean way to predict when the battery will run out and use this knowledge to proactively have the required batteries available for drivers to change before they head out for the day, so sensors are always ON while on the road. No excuses!

The temperature time series on the other hand is being used in a number of ways. Contoso is using Azure Stream analytics to run simple temporal queries on the streaming data. Show me average freezer reading for last 30 minutes for all vehicles on the road, Alert me if average reading for last 30 min is beyond a certain threshold for any vehicles on the road etc. Since they already capture GPS co-ordinates for all vehicles they are planning to run finer grained GPS aware temporal queries. Contoso also has a Power BI (if you are keen to know more how to use power BI with AX, see Clay’s great series here) dashboard in the transport office that displays current vehicle locations and superimposes average temperature readings from chillers and freezers on to it. Since this is quite new, they are in the process of establishing policies on what to do when temperature readings cross certain thresholds – at what point to call drivers’ back to the base etc. Eventually they want to use the temperature time series data for a predictive maintenance Azure ML model that can predict when a particular cooling unit should be sent for an overhaul before it breaks down. There is also a plan in the works to use Anomaly detection Azure ML app to see if it reveals anything interesting and helps in any way.

All in all it was somewhat surprising to note that a consumer device and a consumer app that worked pretty much out-of-the-box for consumer scenarios had to go through pretty rigorous testing and a number of modifications to get to a point where Contoso considers this as a feasible solution to address their scepticism with internet of things! Kudos to this team for their incessant focus on customer service, efficiency and on technological innovation.

Hope this was interesting for you to see how an automated process could be made more intelligent with a live example and how so many new Microsoft Azure tools and technologies can be connected to work with your business processes in Dynamics AX. If you are a Dynamics customer and want to learn more about this solution or want to deploy another IoT solution please write to me aksheyg@microsoft.com

Contributions: Chris Lovett, Akshey Gupta

Power BI and Dynamics AX: Part 5: PowerBI.com and On Premise Data (Preview)


This is the final part of the Power BI and Dynamics AX blog series for now; the earlier posts focused on the current functionality available within Power BI for Office 365. This blog post is going to talk about new functionality that, at the time of writing, is only available in preview. The topic in this post should not be applied to a production environment as it relates to preview functionality.

The new version of PowerBI.com has introduced a lot of new functionality; Dashboards, new visualisations, new apps for iOS and Windows devices all making for an extremely rich PowerBI experience on the web. This post is focused however on the new Live Connectivity functionality for On Premise Data.

The new connector that has been released allows a live connection to an on-premise Analysis Services tabular database. Unfortunately, at the time of writing the SSAS database shipped with Dynamics AX is a multi-dimensional database, so you can’t connect to it directly, but you can create a tabular database to “flatten” the multi-dimensional database for use with PowerBI.com. The latency of your data in PowerBI.com will be determined by how often you’re processing your SSAS cube.

As an organisation you need to determine the best connectivity method for you and PowerBI; this may be through SSAS or through OData as previously described. There are limits to the on-premise option at the moment: given the nature of the Q&A natural language queries and the processing that is required, Q&A is not currently supported on on-premise data, so you must still upload data into the data model for Q&A to work. For more information, start with the documentation from the PowerBI team – Which Power BI experience is right for me?

For this example we are going to use:

  • Dynamics AX 2012 R3 CU8
  • SQL Server 2014 Enterprise
    • Two SSAS Instances Deployed
      • Multidimensional with standard Dynamics AX Cubes Deployed (Find a detailed guide here.)
      • Tabular
  • PowerBI.com (Preview)
  • Power BI Analysis Services Connector (Preview)
  • Visual Studio 2013
  • Microsoft SQL Server Data Tools – Business Intelligence for Visual Studio 2013

Important Note: The SSAS Connector requires Azure Active Directory Sync to be running between your on Premise Active Directory and Azure Active Directory – otherwise you will receive an error when trying to connect to the Data source from PowerBI. So for those of you using the Dynamics Demo Virtual Machine, you won’t be able to connect. Your SSAS instances will need to be deployed on a machine synced with your Azure Active Directory.

We are going to create our Tabular Database through Visual Studio using a Business Intelligence Project Template that is installed as part of the SQL Server Data Tools pack. For the example today we are going to create a Project analysis tabular model with basic project information along with some key measures.

To start, launch Visual Studio. We are going to create a new SSAS Tabular Project by clicking “New Project”, and selecting “Business Intelligence”, “Analysis Services Tabular Project”

After selecting the project and entering a name, you will be prompted to connect to an instance of SSAS; this is where you connect to your newly created SSAS tabular instance.

Once you have your new project, you have a few options for how you would like to create the model, depending on your technical ability. For this example we are going to use the wizard as it’s the easiest option. To get started, select “Model” > “Import from Data source”. You’ll be prompted with the list of data sources you can import from. You have the option of connecting directly to the AX relational DB, but I find the cubes easier as fields like enums, references, etc. have been cleaned up and are a lot easier to work with. You also get the benefit of leveraging the calculations already in place in the standard cubes.

For our purposes today, we will use Microsoft Analysis Services. In the next screen you’ll enter the connection details for the SSAS Multidimensional Database (Dynamics AX Standard Cubes).

After entering credentials, you’ll be prompted for the MDX query that the tabular database should use for this model. You can start with MDX if you wish, or use the “Design” option to launch the visual designer. From this screen we will select the data we want from the cube to be available in our tabular model. You can drag and drop fields from cubes on the left-hand side into the pane on the right. As you add fields you will see your tabular model refresh to give you a preview of the data that will be available.

Once you’ve selected the information you want available, click OK – you can now give your query a friendly name and then click Finish. If you provided incorrect credentials, you will receive an error – you will need to go back to the credentials and update them with an account that has access to the cubes. Once you click Finish the MDX query will be processed; when it completes, close the window and you will see the results of your MDX query. You can take this time to change column names if you wish to make the data a little friendlier once we load it into PowerBI.

You can now close and save your Model. If you would like to double check its deployment, you can open up SQL Management Studio and you should see your newly created tabular DB. The SSAS On Premise model uses security from SSAS, so this is where you would apply your role security to your SSAS data model. Users need to have access to this DB to be able to explore the data in PowerBI (This is a key difference to the OData/Workbook model previously discussed)

 

The last step on-premises is to install the PowerBI Analysis Services Connector (Preview). You can find a detailed guide on how to download and install the connector here. The installation will require your PowerBI.com login details (your Office 365 credentials) as well as the details for the tabular instance of SSAS.

Now we are ready to expose the new tabular database to PowerBI. You can log into the Preview here. At the time of writing this preview is only available to customers within the United States. Once you’ve logged in, select “Get Data” > “SQL Server Analysis Services” and then “Connect”.

You will be presented with a list of all the SSAS connectors published within your organisation. Find and select yours in the list. You will then see a list of models which are available on your instance. Click the model you created earlier and click "Connect".

Once connected, you will have a new dataset available, which is your on-premises data source. (Note: the name will be your SSAS instance; you can rename it in PowerBI.com if required.)

Now your data is available to create reports and dashboards against like any other data source. From the ellipsis menu on the dataset click "Explore" and you'll be taken to a blank PowerView page to start building your report. If you're familiar with creating visualisations in PowerView you can follow the same process; if not, you can find a detailed guide here.

Below is an example of a Project Profitability analysis report based on on-premises data in Dynamics AX. The invoiced revenue, cost, hours and gross profit are all based on calculated measures defined in our standard Dynamics AX SSAS cubes. You can find a detailed reference of the information available in the cube here.

One of the key benefits of the new PowerBI.com is the ability to create dashboards. Dashboards allow visualisations from multiple reports to be pinned to a single dashboard to give you a quick and easy overview of multiple sets of data. You can then drill into that specific report by clicking the visualisation from the dashboard.

This was a very simple example of exposing some Dynamics AX data to explore the preview; users of PowerBI should consider the best connection method for them, along with planning around what data should and should not be exposed. The PowerBI technology is changing at a great pace at the moment, so it's important to keep up to date with what is coming and how it can help shape your ongoing Business Intelligence strategy.

For information on the new PowerBI.com platform, try these resources:

Hopefully this has been a helpful insight into some of the new functionality out in preview at the moment, and how it can be applied to Dynamics AX.

Thanks,

Clay.

AX Retail: Create a test site using ecommerce checkout controls


In Dynamics AX 2012 R3 CU8 we released checkout controls. In order to create and browse a test website using these integrated controls, please follow these steps:

On the demo VM:

  • Go to "My Documents" and then open the folder "Retail SDK CU8"
  • Open the solution file located at "\Online Channel\Clients\Web\Storefront\ Web.Storefront.sln" in VS 2013
  • Create a strong name key named "strongnamekey.snk" (for example with the Strong Name tool: sn -k strongnamekey.snk) and place it adjacent to the bldver.cs file
  • Compile your project and publish it to a folder
  • Take that published folder and create a new website in IIS on the demo machine
  • Make sure the app pool account is "contoso\administrator"
  • Now go to the folder where the site is running, open commerceruntime.config and change the defaultOperatingUnitNumber value to 068

  <storage defaultOperatingUnitNumber="068" />

  • Open web.config and change the database connection string

   <add name="CommerceRuntimeConnectionString" connectionString="Server=localhost;Database=RetailContosoStore;Trusted_Connection=Yes"/>

  • In the same web.config, update the section below so that the PublicKeyToken matches the strong name key you created above

<section name="ecommerceControls" type="Microsoft.Dynamics.Retail.Ecommerce.Sdk.Controls.ControlsSection,Microsoft.Dynamics.Retail.Ecommerce.Sdk.Controls, Version=6.3.0.0, Culture=neutral, PublicKeyToken=fd6b8d0172171ea7, processorArchitecture=MSIL"/>

  • Open a browser and browse to the new website you just created.

 


Announcement: Upcoming Advanced Solution Architect and Development workshops for AX2012 R3

Extensibility in Dynamics AX 2012 R3 CU8 (CRT, RetailServer, MPOS) Part 2 – New data entity


Overview

This blog expands on the knowledge you gained in part 1 of the series (http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx). I recommend that you make yourself familiar with part 1 first in case you get stuck; some of the information from there is required and assumed.

The steps are based on the Dynamics AX 2012 R3 CU8 VM image (Contoso sample) that can be requested via LCS (https://lcs.dynamics.com/Logon/Index), which most partners have access to. Alternatively, PartnerSource (https://mbs.microsoft.com/partnersource/northamerica/) can be used to download the VM as well. Make sure you get the CU8 version.

It is recommended to review some of the online resources around the Retail solution, either now or during the processes of following this blog (https://technet.microsoft.com/en-us/library/jj710398.aspx).

The areas this blog covers are:

  • AX: Adding a new data entity, related to a retail store, and populating it by means of a job (no UI)

  • CDX: Configuring CDX in order to include the new table in data synchronizations

  • CRT: Adding a new data entity and a new service

  • RetailServer: Exposing a new controller for the new entity; adding a new OData action

  • MPOS: Adding plumbing to call RetailServer; updating the UI to expose data

A future blog will cover topics and suggestions for changing existing CRT code.  Stay tuned for that.

The changed code is available in ZIP format and includes only the files that have been added or changed. It can be applied (after backing up your existing SDK) on top of the "Retail SDK CU8" folder. Note that the ZIP file includes the changes from part 1 as well.

This sample customization will update the MPOS terminal to show more detailed opening times for a store. Remember that a store worker can look up item availability across multiple stores. Imagine that, as part of that flow, the worker would like to advise the customer whether a particular store is open or not. See the screenshot below for the UI flow:

 

 

Notes:

  • The sample is intended to illustrate the process of a simple customization. It is not intended for production use.

  • All changes are made under the login contoso\emmah. If you use a different account, or different demo data altogether, please adjust the steps below accordingly.

 

High-level steps

 

The following steps need to be carried out:

 

  1. Setup the Retail SDK CU8 for development (see part 1)
  2. Prepare MPOS to be run from Visual Studio from unchanged SDK code (see part 1)
  3. Activate the MPOS device (see part 1)
  4. Include new entity in AX
  5. Configure CDX to sync new entity
  6. Channel schema update and test
  7. Add CRT entity, service, request, response, datamanager and RetailServer controller with new action
  8. Update client framework to call RetailServer endpoint
  9. Update client framework channel manager with new functionality
  10. Update client framework’s view model
  11. Update MPOS’s view to consume updated view model
  12. Test

 

Detailed steps

 

Setup the Retail SDK CU8 for development

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

 

Prepare MPOS to be run from Visual Studio from unchanged SDK code

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

 

Activate the MPOS device

See part 1 at http://blogs.msdn.com/b/axsa/archive/2015/02/17/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-1.aspx.

  

Include new entity in AX

 

In order to store the opening hours per store, we will be using a new table called ISVRetailStoreDayHoursTable. It will store the day of the week and the opening and closing times for each store.

In the ZIP folder you can find the XPO file at SampleInfo\Sample2_StoreDayHours.xpo. Import this file into AX. It includes two items: the table and a simple job that populates sample data for the Houston store.

Run the job named Temp_InsertData at least once. Then inspect the table with SQL Server Management Studio:

Configure CDX to sync new entity

In AX, add the new table as a location table, with the appropriate columns, to the AX 2012 R3 schema (USRT/Retail/Setup/Retail Channel Schema).

 

 Create a new scheduler subjob (USRT/Retail/Setup/Scheduler subjobs)

 

Click "Transfer field list" and make sure that the fields match, as shown above.

Add the new subjob to the 1070 Channel configuration job (USRT/Retail/Setup/Scheduler Job)

 

Edit the table distribution XML to include the new table (USRT/Retail/Setup/Retail Channel Schema)

 

The easiest approach is to copy the XML out of the text box, edit it externally with an XML editor, and then paste it back in. The change you need to make is to add this XML fragment:

  <Table name="ISVRETAILSTOREDAYHOURSTABLE">
    <LinkGroup>
      <Link type="FieldMatch" fieldName="RetailStoreTable" parentFieldName="RecId" />
    </LinkGroup>
  </Table>

in two places. Both times, it needs to be added inside the RetailStoreTable table XML node.

At the end, click Generate Classes (USRT/Retail/Setup/Retail Channel Schema/AX 2012 R3)

Channel schema update and test

The equivalent change to the table schema must be made on the channel side. This has to be done for all channel databases. Use SQL Server Management Studio and create the table. Since this is a sample, we won't add stored procedures; we just issue the queries from code. However, it is recommended to use stored procedures for performance and security reasons.

The table creation script can be found in the ZIP folder at SampleInfo\ChannelSchemaUpdates.txt.

Now, go back to AX and run the 1070 job (USRT/Retail/Periodic/Distribution Schedule/1070/Run now)

Then, verify in AX that the job succeeded (USRT/Retail/Inquiries/Download Sessions/Process status messages). You should see a status of “Applied” for the stores. 

 

Add CRT entity, service, request, response, datamanager and RetailServer controller with new action

Use the solution in the ZIP file at SampleInfo\RSCRTExtension\RSCRTExtension.sln and inspect the code.

Since this part is based on part 1, I assume you have:

  • already configured the pre.settings file (for rapid deployment as part of the build into RetailServer's bin directory),

  • already configured RetailServer's version of commerceRuntime.config to include the new CRT extension dll, and

  • already configured RetailServer's web.config file to include our new extension dll.

Here is a code map view of the code changes required:

You can see that we need a CRT request, response, service, data accessor and entity. Additionally, RetailServer is customized to include a new StoreDayHoursController that exposes a new OData endpoint, GetStoreDaysByStore. That endpoint uses the CRT and the request object to get a response; it does not use the data service directly.
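
To make the shape of those pieces more concrete before you open the solution, here is a minimal, illustrative C# sketch of the CRT request, response and service. The class names other than the StoreDayHours entity and the GetStoreDaysByStore flow are hypothetical, and the exact CRT base types, attributes and data manager signature in the CU8 SDK may differ from what is assumed here; treat the code in the ZIP file as the authoritative version.

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using Microsoft.Dynamics.Commerce.Runtime;            // assumed CRT namespaces
    using Microsoft.Dynamics.Commerce.Runtime.Messages;

    [DataContract]
    public sealed class GetStoreDayHoursDataRequest : Request
    {
        public GetStoreDayHoursDataRequest(string storeNumber)
        {
            this.StoreNumber = storeNumber;
        }

        [DataMember]
        public string StoreNumber { get; private set; }
    }

    [DataContract]
    public sealed class GetStoreDayHoursDataResponse : Response
    {
        public GetStoreDayHoursDataResponse(IEnumerable<StoreDayHours> dayHours)
        {
            this.DayHours = dayHours;
        }

        [DataMember]
        public IEnumerable<StoreDayHours> DayHours { get; private set; }
    }

    public sealed class StoreDayHoursService : IRequestHandler
    {
        // The runtime routes requests to this handler based on the types listed here.
        public IEnumerable<Type> SupportedRequestTypes
        {
            get { yield return typeof(GetStoreDayHoursDataRequest); }
        }

        public Response Execute(Request request)
        {
            var dataRequest = (GetStoreDayHoursDataRequest)request;

            // StoreDayHoursDataManager is the sample's data accessor; its constructor
            // signature here is assumed – see the ZIP file for the real implementation.
            var dataManager = new StoreDayHoursDataManager(request.RequestContext);
            return new GetStoreDayHoursDataResponse(dataManager.GetStoreDayHours(dataRequest.StoreNumber));
        }
    }

The StoreDayHoursController in RetailServer then simply builds a GetStoreDayHoursDataRequest, executes it through the runtime, and returns the day-hours collection from the response as the result of the GetStoreDaysByStore OData action.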

If you have configured everything correctly, compiled the solution and fired up the OData metadata URL of RetailServer (http://ax2012r2a.contoso.com:35080/RetailServer/v1/$metadata), you should see the new action:

 

 

Update client framework to call RetailServer endpoint

 

The first step is to make MPOS aware of the new entity and the new endpoint. This is basically proxy code, similar to what tools like wsdl.exe would generate for .NET web services. The Retail team is investigating providing a tool for automatic regeneration in a future release.

CommerceTypes.ts

This is a class that specifies the new entity, both as an interface and a class.

    export interface StoreDayHours {
        DayOfWeek: number;
        OpenTime: number;
        CloseTime: number;
        ExtensionProperties?: Entities.CommerceProperty[];
    }
    export class StoreDayHoursClass implements StoreDayHours {
        public DayOfWeek: number;
        public OpenTime: number;
        public CloseTime: number;
        public ExtensionProperties: Entities.CommerceProperty[];

        /**
         * Construct an object from odata response.
         *
         * @param {any} odataObject The odata result object.
         */
        constructor(odataObject?: any) {
            odataObject = odataObject || {};
            this.DayOfWeek = odataObject.DayOfWeek ? odataObject.DayOfWeek : null;
            this.OpenTime = odataObject.OpenTime ? odataObject.OpenTime : null;
            this.CloseTime = odataObject.CloseTime ? odataObject.CloseTime : null;
            this.ExtensionProperties = undefined;
            if (odataObject.ExtensionProperties) {
                this.ExtensionProperties = [];
                for (var i = 0; i < odataObject.ExtensionProperties.length; i++) {
                    this.ExtensionProperties[i] = odataObject.ExtensionProperties[i] ? new CommercePropertyClass(odataObject.ExtensionProperties[i]) : null;
                }
            }
        }
    }

 

CommerceContext.ts

This is a class that exposes the ODATA data service to the rest of MPOS.

    public storeDayHoursEntity(storeId?: string): StoreDayHoursDataServiceQuery {
        return new StoreDayHoursDataServiceQuery(this._dataServiceRequestFactory, "StoreDayHoursCollection", "StoreDayHours", Entities.StoreDayHoursClass, storeId);
    }

    export class StoreDayHoursDataServiceQuery extends DataServiceQuery<Entities.StoreDayHours> {

        constructor(dataServiceRequestFactory: IDataServiceRequestFactory, entitySet: string, entityType: string, returnType?: any, key?: any) {
            super(dataServiceRequestFactory, entitySet, entityType, returnType, key);
        }

        public getStoreDaysByStoreAction(storeId: string): IDataServiceRequest {
            var oDataActionParameters = new Commerce.Model.Managers.Context.ODataActionParameters();
            oDataActionParameters.parameters = { StoreNumber: storeId};

            return this.createDataServiceRequestForAction('GetStoreDaysByStore', Entities.StoreDayHoursClass, 'true', oDataActionParameters);
        }
    }

 

Update client framework channel manager with new functionality

Now that the low-level proxy code is done, we need to expose the new functionality in a more consumable way to the rest of the application framework. An appropriate location for the new functionality is the IChannelManager, as it already encompasses similar functionality of a more global, channel-related nature.

IChannelManager.ts:

    getStoreDayHoursAsync(storeId: string): IAsyncResult<Entities.StoreDayHours[]>;

ChannelManager.ts:

    public getStoreDayHoursAsync(storeId: string): IAsyncResult<Entities.StoreDayHours[]> {
        Commerce.Tracer.Information("ChannelManager.getStoreDayHoursAsync()");

        var query = this._commerceContext.storeDayHoursEntity();
        var action = query.getStoreDaysByStoreAction(storeId);

        return action.execute<Entities.StoreDayHours[]>(this._callerContext);
    }

 

Update client framework’s view model

The view model is an abstraction of the view that exposes public properties and commands for any view implementation to use.  Here are the 3 things we need to do in order to customize the existing StoreDetailsViewModel:

  • a variable that holds the result for the view to bind to, and a computed variable called isStoreDayHoursVisible that the view can use to toggle visibility of the UI:

        public storeDayHours: ObservableArray<Model.Entities.StoreDayHours>;
        public isStoreDayHoursVisible: Computed<boolean>;

  • data initialization in the constructor:

        // empty array
        this.storeDayHours = ko.observableArray([]);
        this.isStoreDayHoursVisible = ko.computed(() => {
            return ArrayExtensions.hasElements(this.storeDayHours());
        });

  • data retrieval function to be called by the view

        public getStoreDayHours(): IVoidAsyncResult {
            var asyncResult = new VoidAsyncResult(this.callerContext);
            Commerce.Tracer.Information("StoreDetailsViewModel.getStoreDayHours()");

            this.channelManager.getStoreDayHoursAsync(this._storeId)
                .done((foundStoreDayHours: Model.Entities.StoreDayHours[]) => {
                    this.storeDayHours(foundStoreDayHours);
                    Commerce.Tracer.Information("StoreDetailsViewModel.getStoreDayHours() Success");
                    asyncResult.resolve();
                })
                .fail((errors: Model.Entities.Error[]) => {
                    asyncResult.reject(errors);
                });

            return asyncResult;
        }

 

Update MPOS's view to consume updated view model

The StoreDetailsView.ts already calls into the view model to get the store distance. For simplicity, we just hook into the done() event handler to call the new function:

                    this.storeDetailsViewModel.getStoreDistance()
                        .done(() => {
                            this._storeDetailsVisible(true);
                            this.indeterminateWaitVisible(false);

                            this.storeDetailsViewModel.getStoreDayHours()
                                .done(() => {
                                    this._storeDetailsVisible(true);
                                    this.indeterminateWaitVisible(false);
                                })
                                .fail((errors: Model.Entities.Error[]) => {
                                    this.indeterminateWaitVisible(false);
                                    NotificationHandler.displayClientErrors(errors);
                                });

Lastly, we update the html to expose the data:

 

Please use the sample code in the ZIP archive as mentioned above. This also includes a few other changes not detailed here, for example in resources.resjson and Converters.ts.
 

Issues and solutions:

If you cannot run MPOS from the Pos.sln file because it is already installed, uninstall the app first. This link may also be helpful: http://blogs.msdn.com/b/wsdevsol/archive/2013/01/28/registration-of-the-app-failed-another-user-has-already-installed-a-packaged-version-of-this-app-an-unpackaged-version-cannot-replace-this.aspx 

Happy coding,

Andreas

 

Original link: http://blogs.msdn.com/b/axsa/archive/2015/05/20/extensibility-in-dynamics-ax-2012-r3-cu8-crt-retailserver-mpos-part-2-new-data-entity.aspx (go back to it for zip file download…)

 

Retail SDK CU8 – Extensibility Sample 2.zip

Announcement: Upcoming Advanced Solution Architect workshop for AX2012 R3 in Finland


We have an Advanced Solution Architect Workshop for Microsoft Dynamics AX 2012 R3 scheduled for November 9-11 in Finland.

This three-day instructor-led workshop is designed for solution architects working for partners that are engaged in large, complex or multi-location projects where systems integration is a key requirement.

The participants of this workshop are expected to have experience in projects where they have driven the definition of business, technical, and architectural requirements, and where they are accountable for validating the solution.

The objectives of the workshop include:

  • Learning how to articulate and architect a solution by using the key concepts of the foundational features in Microsoft Dynamics AX 2012 R3.

  • Developing an understanding of how to create a solution to a complex multi-national customer implementation by using the capabilities of AX 2012 R3.

  • Understanding the concepts behind architecting an enterprise solution with AX 2012 R3.

Training Dates: November 09, 2015 – November 11, 2015 
Training Venue: Microsoft Talo, CD-Building, Keilalahdentie 2-4, Espoo 02150, Finland  

 More information and registration links are here

Dynamics Technical Conference February 2016 Deep Dive Training


 

Happy New Year! I hope everyone is back refreshed and ready for an exciting year of Dynamics AX ahead. Over the break our readiness team announced training which will be taking place over the three days after the Technical Conference in February in Seattle, Washington.

 

If you haven't yet registered for the Technical Conference (February 23 – 25, 2016) you can do so here:

https://www.microsoft.com/en-us/dynamics/techconference/

 

In addition to all the content, breakout sessions and labs at the Technical Conference, the Solution Architecture group and core R&D teams are also running three deep dive training sessions all on the new Microsoft Dynamics AX (AX7).

 

Implementation Lifecycle Workshop for Microsoft Dynamics AX – Register Here

Training Details: February 26, 2016 – February 28, 2016

Training Time: Check-in 8:30am; Training 9:00am-5:00pm PST

Training Venue: Microsoft Conference Center, 16070 NE 36th Way, Building 33, Redmond, WA 98052, United States

Description

The Implementation Lifecycle Workshop for the new Dynamics AX (AX7) is designed for functional, technical and project managers to understand the technology and tools available through Dynamics Lifecycle Services and Visual Studio Online to support the implementation of Dynamics AX. The workshop is primarily hands-on, taking attendees through the deployment of Dynamics AX and the use of tools commonly used throughout an implementation to support the management of business processes, data, code and environments. The workshop follows an example case study to design, deploy, configure and test a solution on Dynamics AX.

 

Advanced Performance Workshop for Microsoft Dynamics AX – Register Here

Training Details: February 26, 2016 – February 29, 2016

Training Time: Check-in 8:30am; Training 9:00am-5:00pm PST

Training Venue: Microsoft Conference Center, 16070 NE 36th Way, Building 33, Redmond, WA 98052, United States

Description

The Advanced Performance Workshop for Microsoft Dynamics AX is designed to help solution architects and senior consultants plan, design, implement, stabilize and release a Dynamics AX implementation with a focus on performance. Participants will understand the different phases and steps of the performance lifecycle, from analysis to deployment, and the new tools required to get the implementation complete.

 

Advanced Presales Workshop for Microsoft Dynamics AX – Register Here

Training Details: February 26, 2016 – February 28, 2016

Training Time: Check-in 7:30am; Training 8:00am-5:00pm PST

Training Venue: Microsoft Conference Center, 16070 NE 36th Way, Building 33, Redmond, WA 98052, United States

Description

The Presales Workshop for Microsoft Dynamics AX is designed to provide a pre-sales consultant the grounding needed to demonstrate Microsoft Dynamics AX as a solution superior to competitive products in meeting the needs of a customer's business. The workshop is based on typical business scenarios, starting with foundational base knowledge, persona scenarios, exercises and case discussions using real-world examples, to enable consultants to apply and demonstrate the range of capabilities in Microsoft Dynamics AX and to serve as a base for further specializing consultants in vertical industries.

 

We are extremely excited about the new Dynamics AX, and the new training around it. See you at the Technical Conference.

 

Thanks,

Clay.

Commerce Data Exchange and Shared AX DB/Channel DB in Dynamics 365 for Operations


Commerce Data Exchange is a component of Dynamics 365 for Operations and Retail that transfers data between Microsoft Dynamics AX and retail channels, such as online stores or brick-and-mortar stores. In Dynamics 365 for Operations, the Channel Database is part of the AX DB itself and can also be part of a Retail Store Scale Unit (RSSU). A Retail Store Scale Unit consists of Retail Server, the CDX Async Client, the Cloud POS Server and, optionally, a Channel Database. The Channel Database that is part of the AX DB is installed and configured as part of every installation of Dynamics 365 for Operations.

 

Retail Architecture Diagram

The Commerce Data Exchange component that moves data between AX and the Channel Database comprises two batch jobs. One extracts data from the AX DB and writes a file for each of the distribution schedule jobs to blob storage. The second batch job takes the extracted data files from blob storage and writes them to the Channel Database that is part of the AX DB (the tables starting with ax.*).

For stores that are set up with a Retail Store Scale Unit, the batch job on the AX HQ side extracts the data and writes a file to blob storage. The CDX Async Client that is part of the RSSU picks up the file from blob storage and applies it to the channel database that is part of the RSSU. This part of the architecture remains similar to AX 2012 R3: stores/channels are grouped using channel data groups, and all channels/stores in a channel data group share the same data. Channel databases that are part of an RSSU or store are defined using the Channel database form and grouped using the Channel database group form.

 

Key Learnings for implementations

  1. Multiple databases pointing to the same HQ-hosted Channel Database – The channel database that is created with the deployment has an encrypted database connection string pointing to the HQ-hosted channel database. This cannot be configured for a newly created database unless it is for an RSSU. Further, there is no real need to push the same tables, such as CustTable, InventTable or EcoResProduct, as part of different distribution schedule jobs more than once. In customer implementations that are not using an RSSU, there is no need to create multiple channel databases pointing to the same database, or additional channel data groups.
  2. Golden configuration database move and the Retail retargeting tool – It is also common practice for implementations to move configuration databases between environments. As part of this process, the Retail retargeting tool needs to be run to modify this connection string, among other tasks. As of the date of publication of this blog, the Retail retargeting tool assumes that the only channel database shared with the AX DB is named "Default".

 

 

Authenticate with Dynamics 365 for Finance and Operations web services in on-premise


This blog explains how to take the standard examples for Dynamics 365 for Finance and Operations integration from GitHub and authenticate to an on-premises instance of Finance and Operations. At the end you'll also find some troubleshooting tips if it doesn't work the first time, which can be useful for any scenario where something is trying to authenticate to services.

Environment prerequisites

There are a few items required before you start:
- Install Visual Studio 2017 Enterprise edition (the edition does not appear to be important)
- Download the examples from GitHub: https://github.com/Microsoft/Dynamics-AX-Integration
- Open the ServicesSamples.sln solution
- Within Visual Studio, go to Tools > NuGet Package Manager > Manage NuGet Packages for Solution; it will recognise that some packages are missing. Click Restore in the top right and they download automatically.

ADFS Setup

Now, on the AD FS server in my on-premises environment, I need to add a client application. From "AD FS Management", under Application Groups, open the default application group for Dynamics 365 called "Microsoft Dynamics 365 for Operations On-premises"; it will look something like this:
ADFS application group

Click “Add application…” at the bottom, add a new server application:
Add new application

Add the redirect URL; this should be the URL for the D365 application. Take a note of the client identifier, as you'll need to use it in your client application later:
Add the redirect URL

Select the option to generate a shared secret. You must copy it now, as it will not be shown again; your client app needs this detail to connect to D365:
Generate shared secret

Summary

Completed

Next, back in the application group window, edit the “Microsoft Dynamics 365 for Operations On-premises – Web API” item:
Edit application group

On the “Client permissions” tab add a new record for the server application created in the previous step:
Add a new record for server application

Code

Within the ServicesSamples.sln solution open the ClientConfiguration.cs source file and modify similar to below (using the values from your ADFS configuration above):

public static ClientConfiguration OneBox = new ClientConfiguration()
{
UriString = "https://ax.d365ffo.zone1.saonprem.com/namespaces/AXSF/", //the normal URL for logging into D365
UserName = "not used",
Password = "",

//Note that AOS config XML is on AOS machines in: C:\ProgramData\SF\AOS_10\Fabric\work\Applications\AXSFType_App84\AXSF.Package.1.0.xml

ActiveDirectoryResource = "https://ax.d365ffo.zone1.saonprem.com/", //this is the value for AADValidAudience from the AOS config xml
ActiveDirectoryTenant = "https://dax7sqlaoadfs1.saonprem.com/adfs",//this is the value for AADIssuerNameFormat (minus the placeholder {0}, instead suffix "/adfs") from AOS config xml
ActiveDirectoryClientAppId = "6c371040-cf6b-4154-b9c4-75e613fb5104", //client app ID is from ADFS management - configure a application group
ActiveDirectoryClientAppSecret = "MO-tVemKqAjVLj1NdcCs3mfiWw2X3ZNyjuFe0UYg", //secret is from ADFS management - same place as the client app ID

// Change TLS version of HTTP request from the client here
// Ex: TLSVersion = "1.2"
// Leave it empty if want to use the default version
TLSVersion = "",
};
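
For context, here is a minimal sketch of how a client typically uses these values to obtain a token from AD FS and call a Finance and Operations endpoint. This is not the sample's own helper code; it assumes the ADAL library (Microsoft.IdentityModel.Clients.ActiveDirectory) that is restored by the solution's NuGet packages, and it skips details such as the TLSVersion setting.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory; // ADAL

public static class OnPremTokenExample
{
    public static async Task CallMetadataAsync()
    {
        ClientConfiguration config = ClientConfiguration.OneBox;

        // AD FS is the authority; validateAuthority must be false because it is not AAD.
        var authContext = new AuthenticationContext(config.ActiveDirectoryTenant, validateAuthority: false);
        var credential = new ClientCredential(config.ActiveDirectoryClientAppId, config.ActiveDirectoryClientAppSecret);

        // The resource must match AADValidAudience in the AOS config.
        AuthenticationResult authResult = await authContext.AcquireTokenAsync(config.ActiveDirectoryResource, credential);

        using (var client = new HttpClient())
        {
            client.BaseAddress = new Uri(config.UriString);
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", authResult.AccessToken);

            // Simple smoke test: request the OData metadata document.
            HttpResponseMessage response = await client.GetAsync("data/$metadata");
            Console.WriteLine(response.StatusCode);
        }
    }
}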

AX Setup

You also need to add the application within the AX application, under System administration > Setup > Azure Active Directory applications, using the client ID you put into your client code:
AX Setup

Troubleshooting:

ADFS group creation fails

If AD FS group creation fails as shown below with the error "MSIS7613: Each identifier must be unique across all relying party trusts in AD FS configuration", this means that the URL entered for the Web API is already registered in another group, probably the default D365 group. To resolve this, see below.
ADFS group creation fails

Locate the standard Microsoft Dynamics 365 for Operations On-premises ADFS application group and open it.
ADFS configuration

Forbidden

The error below occurs if the setup within the AX application under System administration > Setup > Azure Active Directory applications hasn't been completed. This error was reported back to the calling client application.
0:025> !pe
Exception object: 0000029e2b48e1d8
Exception type: System.ServiceModel.Web.WebFaultException`1[[System.ComponentModel.Win32Exception, System]]
Message: Forbidden
InnerException: <none>
StackTrace (generated):
<none>
StackTraceString: <none>
HResult: 80131501
0:025> !clrstack
OS Thread Id: 0x1cb4 (25)
Child SP IP Call Site
000000f0381bc3e8 00007ff86e233c58 [HelperMethodFrame: 000000f0381bc3e8]
000000f0381bc4d0 00007ff808fe8642 Microsoft.Dynamics.Ax.Services.ServicesSessionProvider.ThrowSessionCreationException(Microsoft.Dynamics.Ax.Services.ServicesSessionCreationErrorCode)
000000f0381bc520 00007ff808fe45b0 Microsoft.Dynamics.Ax.Services.ServicesSessionProvider.GetSession(Boolean, Boolean, System.String, System.String, System.String, System.String, System.Security.Claims.ClaimsIdentity)
000000f0381bc690 00007ff808fe4014 Microsoft.Dynamics.Ax.Services.ServicesSessionManager.InitThreadSession(Boolean, Microsoft.Dynamics.Ax.Xpp.AxShared.SessionType, Boolean, System.String, System.String, System.String, System.String, System.Security.Claims.ClaimsIdentity)
000000f0381bc730 00007ff808fe3ea6 Microsoft.Dynamics.Platform.Integration.Common.SessionManagement.ServicesAosSessionManager.InitializeSession(Boolean, System.String, System.Security.Claims.ClaimsIdentity)
000000f0381bc7a0 00007ff808fe366a Microsoft.Dynamics.Platform.Integration.Common.SessionManagement.OwinRequestSessionProvider.CreateSession(System.Security.Claims.ClaimsIdentity)
000000f0381bc7f0 00007ff808fe34cc Microsoft.Dynamics.Platform.Integration.Common.SessionManagement.ServicesRequestSessionHelper.EnsureRequestSession(Microsoft.Dynamics.Platform.Integration.Common.SessionManagement.IServicesRequestSessionProvider, System.Security.Claims.ClaimsIdentity)
000000f0381bc830 00007ff808fe2a86

Audience validation failed

The error below occurs if the value you’re using in your client application for ActiveDirectoryResource (from ClientConfiguration.cs in the example apps) doesn’t match the value in the AOS configuration for AADValidAudience. The AOS configuration is here: C:\ProgramData\SF\AOS_10\Fabric\work\Applications\AXSFType_App84\AXSF.Package.1.0.xml
Note that the error passed back to the client application is not as detailed as this; this error came from catching the exception directly on the AOS machine using WinDbg.

0:029> !pe
Exception object: 00000166f07da608
Exception type: System.IdentityModel.Tokens.SecurityTokenInvalidAudienceException
Message: IDX10214: Audience validation failed. Audiences: 'http://tariqapp.saonprem.com'. Did not match: validationParameters.ValidAudience: 'null' or validationParameters.ValidAudiences: 'https://ax.d365ffo.zone1.saonprem.com, 00000015-0000-0000-c000-000000000000, https://ax.d365ffo.zone1.saonprem.com/'
InnerException: <none>
StackTrace (generated):
<none>
StackTraceString: <none>
HResult: 80131501

Strong name validation failed on first client application run

If you tried to run OdataConsoleApplication and it failed with error: Could not load file or assembly 'Microsoft.OData.Client, Version=6.11.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. Strong name validation failed. (Exception from HRESULT: 0x8013141A)

The root cause of this could be that it's looking for version 6.11 while 6.15 is the version installed. From the NuGet package manager in VS you can change the version, then build and run successfully.

ADFS error log

To help troubleshoot AD FS errors you can use Event Viewer on the AD FS server; as shown below, the relevant log is under Applications and Services Logs > AD FS > Admin.
ADFS error log

Oh AOS why have you forbidden me


Sometimes when services are trying to authenticate to an AOS in Dynamics 365 for Finance and Operations, in both the cloud version and the on-premises version, the calling application may receive the error message "Forbidden" back from the AOS. This message is deliberately vague, because we don't want a calling application to be able to poke the AOS and learn how to get in, but unfortunately that vagueness can make it difficult to figure out what is actually wrong. In this post we'll discuss what's happening in the background and how to approach troubleshooting.

Anything which is calling web services could receive this "Forbidden" error - for example an integrated 3rd party application, or Financial Reporting (formerly Management Reporter).

First let's talk about how authentication to Finance and Operations works; there are two major stages to it:

1. Authentication to AAD (in cloud) or AD FS (in on-premises): this happens directly between the caller and AAD/AD FS - the AOS isn't a part of it.
2. Session creation on the AOS: here the caller presents the token from AAD/AD FS to the AOS, and the AOS attempts to create a session.

The "forbidden" error occurs during the 2nd part of the process - when the AOS is attempting to create a new session. The code within the AOS which does this has a few specific cases when it will raise this:

- Empty user SID
- Empty session key
- No such user
- User account disabled
- Cannot load user groups for user

For all of these reasons the AOS is looking at the internal setup of the user in the USERINFO table - it's not looking at AAD/AD FS. In a SQL Server based environment (so Tier 1 or on-premises) you can run SQL Profiler to capture the query it's running against the USERINFO table and see what it's looking for.
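
From the calling side, a quick way to tell which of the two stages failed is to check whether token acquisition itself throws, versus whether a token comes back but the service call returns 403 Forbidden. Below is a minimal, hedged sketch of that check; the getTokenAsync delegate stands in for whatever ADAL or other library call your client actually uses.

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class AuthStageCheck
{
    public static async Task CallAsync(Func<Task<string>> getTokenAsync, Uri serviceUri)
    {
        string token;
        try
        {
            // Stage 1: AAD/AD FS issues the token. The AOS is not involved yet.
            token = await getTokenAsync();
        }
        catch (Exception ex)
        {
            Console.WriteLine("Stage 1 failed (AAD/AD FS): " + ex.Message);
            return;
        }

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

            // Stage 2: the AOS validates the token and tries to create a session.
            HttpResponseMessage response = await client.GetAsync(serviceUri);
            if (response.StatusCode == HttpStatusCode.Forbidden)
            {
                Console.WriteLine("Stage 2 failed (AOS session creation): check USERINFO and the Azure Active Directory applications setup in AX.");
            }
        }
    }
}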

Examples:

- Financial Reporting (Management reporter) might report "Forbidden" if the FRServiceUser is missing or incorrect in USERINFO. This user is created automatically, but could have been modified by an Administrator when trying to import users into the database.
- When integrating 3rd party applications, the error occurs if the record in "System administration > Setup > Azure Active Directory applications" is missing.


Disable any reliance on internet in Finance and Operations on-premise


There are some features within Dynamics 365 for Finance and Operations on-premise which rely on an internet connection.

This means that the on-premise version DOES by default have a dependency on some cloud services - BUT you can turn that off, so there is no dependency.

As an example, last week there was an AAD outage, which affected on-premises customers' ability to log into the application. What was happening was: you'd log in as normal and see the home page for a moment, then it would redirect to the AAD login page, which was down, so the user would be stuck.

In the background this relates to the Skype presence feature: after the user logs in, the system contacts the Skype service online, which is what triggers the redirect to AAD when AAD is unavailable.

There is a hotfix available which allows a system administrator to turn off all cloud/internet-related functions in the on-premises version; details are available here:
Disable internet connectivity

How to select the document management storage location


In Dynamics 365 for Finance and Operations the document management feature allows you to attach documents (files and notes) to records within the application. There are several different options for storage of those documents; in this document we will explain the advantages and disadvantages of each option.

Document storage locations

There are 3 possible options for document storage:

• Azure storage: In the cloud version of Finance and Operations this will store documents in Azure blob storage; in the on-premises version this will store documents in the file share given in the environment deployment options in LCS*
• Database: stores documents in the database
• SharePoint: stores documents in SharePoint Online; this is currently only supported for the cloud version. Support for on-premises SharePoint is planned to be added in the future.

Each document storage option can be configured per document type – meaning that it’s possible to configure a type of document “scanned invoices” and choose storage “Database”, and configure another type of document “technical drawings” and choose storage “Azure storage”.

Classes

When configuring document types there are 3 different classes of document available, each class of document only allows certain storage locations:
- Attach file: this allows selection of “Azure storage” or “SharePoint” locations
- Attach URL: this allows only “Database” location
- Simple note: this allows only “Database” location

Document storage location options

Azure storage

This type of storage can be configured for the “attach file” class of document only.

As mentioned earlier in this document, in the cloud version of Finance and Operations this will store documents in Azure blob storage; in the on-premises version this will store documents in the file share given in the environment deployment options in LCS.

In the cloud version an Azure storage account is automatically created when an environment is deployed. No direct access to the storage account is given; access is only via the application. This is a highly available, geo-replicated account, so there are no additional considerations required to ensure business continuity for this component.

In the on-premises version an SMB 3.0 file share is specified at environment deployment time. High availability and disaster recovery options must be considered to ensure availability of this file share; the application accesses it using its UNC path, so ensure this UNC path is available at all times.

Files stored in this way are not readable by directly accessing the file share location; they are intended only to be accessed through Finance and Operations. Specifically, stored files are renamed to a GUID-type name and their file extension is removed. Within Finance and Operations a database table provides the link between the application and the file stored on the file system.
No direct access to this folder should be allowed for users; access for the Finance and Operations server process is controlled through the certificate specified during environment deployment.

Database

Database storage will be used automatically for document types using classes “Attach URL” or “Simple note”. The “Attach file” class of documents will not be stored in the database.
Documents stored in the database will be highly available by virtue of the SQL high availability options which are expected to be in place already as a requirement of Finance and Operations.

SharePoint

This type of storage can be configured for the “Attach file” class of document only.

For the cloud version of Finance and Operations, SharePoint Online is supported, but SharePoint on-premises is currently not supported. For the on-premises version, SharePoint Online is also not currently supported.

SharePoint Online is a highly available and resilient service; we recommend reviewing our documentation for more information.

Cloud versus On-premise

In the cloud version of Finance and Operations, for file storage, either SharePoint Online or Azure blob storage can be used.
In the on-premises version, for file storage, only the "Azure storage" option can be used, which will store files in a network file share as defined in the environment deployment options.
*The screenshot below shows the setting for the file share storage location used by on-premises environments when selecting "Azure storage".

On-premise deployment storage options

Troubleshooting on-premise environment deployment D365FFO


This document contains tips for troubleshooting on-premises Dynamics 365 for Finance and Operations environment deployment failures, based on my own experiences when troubleshooting this process for the first time.

Types of failures

The first type of failure I am looking at here is a simple redeploy of the environment. Originally I was trying to deploy a custom package, but it failed and I didn't know why, so I deleted the environment and redeployed with vanilla settings (no custom bits, just the base), and it still failed. In LCS, after it runs for approximately 50 minutes, I see the state change to Failed. There is no further log information in LCS itself; that information is on the respective machines in the on-premises environment.

Orchestrators

The orchestrator machines trigger the deployment steps. In the base topology there are 3 orchestrators, and these are clustered/load balanced. Often the first one will pick up work, but don't rely on that; it could be any of them which picks up tasks, and it could be more than one for a given deployment run (for example, server 1 picks up some of the tasks and server 2 picks up some other tasks). Always check the event logs on all of them to avoid missing anything useful.

To make it easier to check them you can add a custom view of the event logs on each orchestrator machine, to give you all the necessary logs in one place, like this:
Create custom event log view

Select events

I found in my case that server 2 was showing an error, as below. It's basically saying it couldn't find the AOS metadata service, and I noticed the URL was wrong; I'd entered something incorrectly in the deployment settings in LCS:
Example error

AOS Machines

There are also useful logs on the AOS machines. The orchestrators are calling deployment scripts, but for AX-specific functions the AOSes do the work; for example, database synchronize is run by an AOS. Again, the AOSes are clustered, so we need to check all of them as tasks could be executed by any of them. Similar to the orchestrators, I create a custom event log view to show me all Dynamics-related events in one place. This time I am selecting the Dynamics category, and I have unchecked "Verbose" to reduce noise.

AOS event log

Here's an example of a failure I had from a Retail deployment script which was trying to adjust a change tracking setting. For an issue such as this, once I know the error I can work around the problem by manually disabling change tracking on the problem table from SQL Server Management Studio and then starting the deployment again from LCS.

AOS example error

ADFS Machines

The ADFS servers will show authentication errors. A typical cause of this kind of failure is a "bad" setting entered in the deployment settings in LCS; for example, I entered the DNS address for the AX instance incorrectly, and then saw an ADFS error after deployment when trying to log into AX:

ADFS error example

If you see an error as above, you can understand more about it by reviewing the application group setup in "ADFS Management" on the ADFS machine; open it from Server Manager:

ADFS

Under application groups you'll see one for D365; double-click it to see the details.

ADFS setup

If you're familiar with the cloud version of D365, then you'll probably know that AAD requires application URLs to be configured against it to allow you to log in. In the cloud, the deployment process from LCS does this automatically, and you can see it if you review your AAD setup via the Azure portal. In the on-premises version, this ADFS management tool shows you the same kind of setup, and again the deployment process creates these entries automatically for you. Click on one of the native applications listed and then the Edit button to see what's been set up:

ADFS application group setup

The authentication error I mentioned previously:
MSIS9224: Received invalid OAuth authorization request. The received 'redirect_uri' parameter is not a valid registered redirect URI for the client identifier: 'f06b0738-aa7a-4a50-a406-5c1e486c49be'. Received redirect_uri: 'https://dax7sqlaodc1.saonprem.com/namespaces/AXSF/'.

We can now see from the configuration above that for client 'f06b0738-aa7a-4a50-a406-5c1e486c49be' the requested URL isn't configured. If we believed that the URL was correct, then we could add it here and ADFS would then allow the request to go through successfully. In my case the URL was the mistake, so I didn't change the ADFS settings; I corrected the URL in LCS and started the deployment again.

Package deployment failures

When reconfiguring an environment and including a custom package, if the deployment fails, check the orchestrator machine event logs as described above; use a custom event log view to check all the logs on a machine at once.

I have had a situation where I got failures related to package dependencies even though my package did not have the failing dependency. I will explain. The error is:

Package [dynamicsax-demodatasuite.7.0.4679.35176.nupkg has missing dependencies: [dynamicsax-applicationfoundationformadaptor;dynamicsax-applicationplatformformadaptor;dynamicsax-applicationsuiteformadaptor]]

My package does not contain demodatasuite, so the error is a mystery. It turns out that because my package has the same filename as a previously deployed package, the system is not downloading my package and is just attempting to deploy an old package with the same name. Packages can be found in the file share location, as below:
\\DAX7SQLAOFILE1\SQLFileShare\assets

The first part, \\DAX7SQLAOFILE1\SQLFileShare, is my file share (so it will differ in different environments; it's a setting given when the environment was created); the assets folder is constant.

In here I see that my current package "a.zip" (renamed to a short name to work around a deployment failure due to the path being too long) is from several weeks ago and is much larger than the package I expect. To get past this I rename my package to b.zip and attempt deployment again. Note that after PU12 for on-premises this issue no longer occurs.

Package deployment process

During the package deployment process, the combined packages folders will be created in this folder:

\\DAX7SQLAOFILE1\SQLFileShare\wp\Prod\StandaloneSetup-109956\tmp\Packages

Error when environment left in Configuration mode

When running a redeployment, the error below can occur if the environment has been left in configuration mode (for changing config keys). Turn off configuration mode, restart the AOSes and then re-run the deployment.

MachineName SQLAOSF1ORCH2
EnvironmentId c91bafd5-ac0b-43dd-bd5f-1dce190d9d49
SetupModuleName FinancialReporting
Component Microsoft.Dynamics.Performance.Deployment.Commands.AX.AddAXDatabaseChangeTracking
Message An unexpected error occurred while querying the Metadata service. Check that all credentials are correct. See the deployment log for details.
Detail Microsoft.Dynamics.Performance.Deployment.Common.DeploymentException: An unexpected error occurred while querying the Metadata service. Check that all credentials are correct. See the deployment log for details. ---> System.ServiceModel.FaultException: Internal Server Error Server stack trace: at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc) at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout) at

Error when FRServiceUser is missing

This error can also happen when the FRServiceUser is missing in USERINFO; the AOS metadata service is trying to create an AX session as this user.
This user is normally created by the DB synch process. If the user is incorrect in USERINFO, then deleting that user and re-running DB synch should recreate it; you can set USERINFO.ISMICROSOFTACCOUNT to 0 in SSMS and then re-run DB synch to create the user. DB synch can be triggered in PU12+ by clearing the SF.SYNCLOG table and then killing AXService.exe; when it automatically starts back up it will run a DB synch. Then you should see the FRServiceUser created back in USERINFO.

MachineName SQLAOSF1ORCH2
EnvironmentId c91bafd5-ac0b-43dd-bd5f-1dce190d9d49
SetupModuleName FinancialReporting
Component Microsoft.Dynamics.Performance.Deployment.Commands.AX.AddAXDatabaseChangeTracking
Message An unexpected error occurred while querying the Metadata service. Check that all credentials are correct. See the deployment log for details.
Detail Microsoft.Dynamics.Performance.Deployment.Common.DeploymentException: An unexpected error occurred while querying the Metadata service. Check that all credentials are correct. See the deployment log for details. ---> System.ServiceModel.FaultException: Internal Server Error Server stack trace: at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc) at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout) at

How authentication works in Dynamics 365 for Finance and Operations On-premises


In this article I'm going to explain the moving parts to authentication in on-premises Dynamics 365 for Finance and Operations. The intention of this article is to provide some background to how the process works, so that if you have issues you can work through them to figure out what's wrong.

First off - there's one option you provide during environment deployment, the URL for AD FS, which looks something like this:

https://dax7sqlaoadfs1.saonprem.com/adfs/.well-known/openid-configuration

You'll find that mentioned in the deployment instructions here

During deployment this is going to be used to set various options in the AOS xml config files on each AOS machine. You'll find the AOS config in a folder similar to below - note that the numbers vary from machine to machine:

C:\ProgramData\SF\AOS_10\Fabric\work\Applications\AXSFType_App218\AXSF.Package.1.0.xml

Within this config file (which is on each AOS machine) you'll find a few sections which are set from the LCS deployment setting for AD FS. This bit:


<Section Name="Aad">
<Parameter Name="AADIssuerNameFormat" Value="https://dax7sqlaoadfs1.saonprem.com/{0}/" />
<Parameter Name="AADLoginWsfedEndpointFormat" Value="https://dax7sqlaoadfs1.saonprem.com/{0}/wsfed" />
<Parameter Name="AADMetadataLocationFormat" Value="https://dax7sqlaoadfs1.saonprem.com/FederationMetadata/2007-06/FederationMetadata.xml" />
<Parameter Name="AADTenantId" Value="adfs" />
<Parameter Name="AADValidAudience" Value="https://ax.d365ffo.zone1.saonprem.com/" />
<Parameter Name="ACSServiceEndpoint" Value="https://accounts.accesscontrol.windows.net/tokens/OAuth/2" />
<Parameter Name="ACSServicePrincipal" Value="00000001-0001-0000-c000-000000000000" />
<Parameter Name="FederationMetadataLocation" Value="https://dax7sqlaoadfs1.saonprem.com/FederationMetadata/2007-06/FederationMetadata.xml" />
<Parameter Name="Realm" Value="spn:00000015-0000-0000-c000-000000000000" />
<Parameter Name="TenantDomainGUID" Value="adfs" />
<Parameter Name="TrustedServiceAppIds" Value="913c6de4-2a4a-4a61-a9ce-945d2b2ce2e0" />
</Section>

Also this section:


<Section Name="OpenIDConnect">
<Parameter Name="ClientID" Value="f06b0738-aa7a-4a50-a406-5c1e486c49be" />
<Parameter Name="Metadata" Value="https://dax7sqlaoadfs1.saonprem.com/adfs/.well-known/openid-configuration" />
</Section>
<Section Name="Provisioning">
<Parameter Name="AdminIdentityProvider" Value="https://dax7sqlaoadfs1.saonprem.com/adfs" />
<Parameter Name="AdminPrincipalName" Value="admin@exampleDomain.com" />
</Section>

The AOS uses these config values to know where to redirect to when a user tries to hit the application URL: the user hits the URL, the AOS redirects to the AD FS login page (using the values from this config), the user enters their credentials, and gets redirected back to the application URL again.

If values in the AOS config file are incorrect, then that typically means the value given for AD FS during environment deployment was wrong. The easiest thing is to delete and redeploy the environment from LCS with the right value; it is possible to manually edit the config files, but to be safe, do a redeploy. If you do edit the config files then you need to restart the AOS services for the change to take effect, either from SF Explorer (right-click the AOS node under Nodes and choose Restart, then wait for a minute or so for its status to go back to green) or by rebooting the machine.

One example of an error caused by this: if I had entered the AD FS URL in the LCS deployment incorrectly (as below; note the missing hyphen), then I would get server error 500 when going to the application URL, because the AOS no longer knows how to redirect to AD FS properly:

https://dax7sqlaoadfs1.saonprem.com/adfs/.wellknown/openid-configuration

 

The second piece of the authentication process is AD FS itself. On the AD FS server, if you open "AD FS Management" (from Control Panel\System and Security\Administrative Tools) and look under "Application groups", you'll find a group called "Microsoft Dynamics 365 for Operations On-premises". Within this group the AD FS settings for Dynamics are kept; specifically, there are application URLs, the same one you specified during environment deployment as the URL for the application. Here's an example:

AD FS application group setup

AD FS uses the client ID and the URLs to decide whether the request for access is OK or not. You will notice that the client ID is also listed in the AOS config (it's in the section I pasted above). If the client ID and the URL don't both match what the AOS is requesting, then AD FS will deny the token. If that happens you'll find an error in the event log on the AD FS server; there's a special event log for AD FS under "Application and Services Logs\AD FS\Admin".

AD FS event log error

In the case that any of the AD FS application group setup is wrong, you're likely to see an error in its event log which explains the value it was looking for, so you can figure out what is set incorrectly.

Debug a Dynamics 365 for Finance and Operations on-premises instance without Visual Studio


In this post I'm going to explain how to debug an error occurring in Dynamics 365 for Finance and Operations on-premises, directly in the on-premises environment where Visual Studio isn't available, by using a free tool called WinDbg.

This approach gives a fast way to catch exceptions occurring in the environment, identify the call stack, get a more detailed error message (for example, to see inner exceptions), and see the values of variables at the time of the exception. You can use this approach not only for debugging the AOS itself, but for any component in Windows which is running .NET code; for example, if SSRS was throwing an exception, you can do the same thing to debug SSRS itself.

It does not give the full X++ debugging experience you would normally have using Visual Studio with the Dynamics dev tools installed - I will be making another post soon explaining how to hook up Visual Studio to debug your on-premises instance.

Overview

WinDbg is a very powerful debugging tool and can be used in many different scenarios - for example debugging an exception occurring in any Windows software or analyzing memory dumps (also known as crash dumps) from a Windows process.

In this document we'll look at one particular scenario to give an introduction to the tool and how it can be helpful in conjunction with Dynamics 365 for Finance and Operations on-premises to troubleshoot exceptions.

The example scenario here is:
- I have an external application trying to call into Finance and Operations web services
- The call is failing with "Unauthorized" in the calling application
- There is no error in the AD FS event log - AD FS is issuing a token fine, but the AOS is denying the call.
- I want to know why I am "Unauthorized" because it seems AOS should be allowing me

Prepare

First install WinDbg - it is available as part of the Windows SDK here

Note: there is a newer version of WinDbg currently in preview in the Windows Store here, but this post only deals with the currently released version.

You can click Next through most of the installer pages - but when choosing which features to install, uncheck everything except "Debugging Tools for Windows", as shown below:

Once the installer completes you will find WinDbg on your Windows Start menu - both x64 and x86 versions (as well as ARM and ARM64) are installed. The rule for debugging .NET code with WinDbg is to match the version of WinDbg to the architecture of the process - a 32-bit process needs the 32-bit WinDbg, and a 64-bit process needs the 64-bit WinDbg. As we are going to debug the AOS, which is 64-bit, we'll open WinDbg (X64) - MAKE SURE to run it as Administrator, otherwise it won't let you attach to the process.

In a typical on-premises environment there will be 3 AOS instances - when we're debugging we can't be sure which of the 3 AOS instances we'll hit, so we want to turn off two of them; then we know every request will hit the remaining one, and we can debug that one. There are two options to do that:
1. Shut down the other two AOS machines in Windows.
2. From Service Fabric Explorer, disable the AOS application for the other two AOS instances - if you take this route, check in Task Manager that AXService.exe has actually stopped on both of those AOS machines, because I've found it doesn't always stop immediately - it can sit there for a while and requests will continue to go to it. A quick check is shown below.
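
One way to confirm this (a sketch - run from a command prompt on each of the other two AOS machines) is to filter the task list for the AOS process; if no matching task is returned, it has stopped:

tasklist /FI "IMAGENAME eq AXService.exe"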

Debug

Now we have the tool installed, we're ready to debug something. In WinDbg go to "File"->"Attach to process..", and a dialog opens showing all the processes currently running on the machine - select "AXService.exe" and click OK. It's easier to find in the list if you select the "by executable" radio button, which sorts the list alphabetically.

WinDbg is a command line debugger - at the bottom of the window there is a box where you can enter commands for it to execute, and that is primarily how you get it to do anything.

As we're going to debug .NET code, we'll first load an extension for WinDbg which will help us to decode .NET related information from the process. This extension exists on any machine which has the .NET framework installed. Enter this command and hit enter:

.load C:\Windows\Microsoft.NET\Framework64\v4.0.30319\sos.dll
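
If the .NET framework path is different on your machine, an alternative (assuming the CLR is already loaded into AXService.exe, which it will be once the AOS is up and serving requests) is to let WinDbg load sos.dll from the same location as the loaded clr.dll:

.loadby sos clr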

Next we're going to tell WinDbg to stop the process on a breakpoint when a .NET exception occurs. Because we don't have source code available in an on-premises environment, the easiest way for us to set a breakpoint is to base it on exceptions. The WinDbg command to break on an exception is "sxe", and the exception code is "e0434352" - we always use this same exception code here, because it is the native Windows code representing all .NET exceptions.

sxe e0434352
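
As an optional convenience (not needed for this walkthrough), the same command accepts a -c switch with a command string that WinDbg runs automatically each time it breaks on the exception - for example printing the exception details straight away:

sxe -c "!pe" e0434352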

Now we need to let the process run again, because when we attached to it WinDbg automatically broke into it. We can tell whether the process is running or not - if it's running, the command prompt says "Debuggee is running..". To let the process run again, enter "g", meaning go:

g

After entering "g" you see it is running again:

OK, now we're ready to reproduce our issue - I go to my client application and make the error happen, then in WinDbg I see this. Note that the client application will seem to "hang" - this is because WinDbg has stopped the AOS on a breakpoint and isn't letting it complete the request:

We can run the "!pe" command to show the exception detail. This command comes from the sos.dll extension we loaded earlier - the "!" denotes that it comes from an extension. Note that WinDbg is case sensitive for everything you enter.

Here I can see the exception from within the AOS - it's hard to see in the screenshot, so here's the full text:

0:035> !pe
Exception object: 000002023b095e38
Exception type: System.IdentityModel.Tokens.SecurityTokenInvalidAudienceException
Message: IDX10214: Audience validation failed. Audiences: 'https://ax.d365ffo.zone1.saonprem.com/namespaces/axsf/'. Did not match: validationParameters.ValidAudience: 'null' or validationParameters.ValidAudiences: 'https://ax.d365ffo.zone1.saonprem.com, 00000015-0000-0000-c000-000000000000, https://ax.d365ffo.zone1.saonprem.com/'
InnerException:
StackTrace (generated):
StackTraceString:
HResult: 80131501
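
In this example the InnerException is empty, but when it isn't, the same command takes a -nested switch to print the inner exception(s) as well:

!pe -nested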

I'm not going to explain the example error message in this post - but if you're interested, it is explained here

Next we can see the call stack leading to this exception by running "!clrstack". It's worth noting that the first time you run this command on a machine where you haven't used WinDbg before, it might spin for a couple of minutes - that happens because WinDbg is looking for symbols - after the first time it runs straight away. This command is useful for understanding what the AOS was trying to do when the exception occurred - it's not necessary to have the source code to make sense of the call stack; most of the time I simply read the method names and make an educated guess about what it was doing based on those names (of course it's not always that simple, but often it is).

!clrstack
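
If you also want the arguments and local variables shown inline with each frame, the same command takes an -a switch - it produces a lot more output, so I'd normally start with the plain version:

!clrstack -a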

The last command for this post shows the running .NET objects relating to the call stack we just saw. It is useful for understanding what values the AOS was running with - similar to my approach with !clrstack, I simply look through the list for human-readable values I recognize - for example, if it was an exception in a purchase order process I'd be looking for something that looks like a vendor account number or a PurchId. This is particularly useful when the value the AOS was running with isn't the value you expected it to be running with.

!dso
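
If one of the addresses in the !dso output looks interesting, you can dump that object's fields with "!do" (DumpObject) - the address below is just the exception object address from the !pe output earlier, used as a placeholder:

!do 000002023b095e38

When you're finished, detach cleanly by entering "qd" (quit and detach) rather than simply closing WinDbg - closing the debugger while still attached will take the AXService.exe process down with it.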

That's all for now, happy debugging!
