Creating Payables Transaction with Analytical Accounting (AA)

Published: Apr 04, 2017
Written by eOne Solutions

One of the most confusing pieces of data to integrate into Dynamics GP is anything involving Analytical Accounting. The main confusion comes from the fact that the field naming conventions don’t necessarily match the main module. For example, the Document Number field on the AA distribution is not the Payables Document Number but the Payables Voucher Number. There are a few things to consider when creating Payables integrations with Analytical Accounting, and really, any transaction in Dynamics GP with Analytical Accounting.

Depending on the data source and the resulting data within Dynamics GP, different mapping options are required within the Analytics Distribution. We will cover those options in this article.

You can download the maps and sample files here.

Scenario 1 – One Analytical Distribution per Payables Distribution and no duplicate accounts.

This is by far the easiest scenario to integrate as it’s a straight mapping to the fields in the Payables Transaction Header, Distributions and Analytical Distributions.

Data Source

The first step is to set up or validate the data source. I will use the example of employee expenses being tracked in Accounts Payables, with the AA Dimension and Dimension Code on each distribution line within the data source as shown below. The Dimension is in the DepartmentCode column and the Dimension Code is in the ProgramCode column. We are only going to have the Analytical information on the expense distribution and will leave it blank for the payables distribution.

NOTE: If you are using a text file such as a CSV or TAB delimited file, I would recommend using the Create Schema option to set the Dimension and Dimension Code columns to char. This will allow SmartConnect to handle any type of value without requiring a calculated field to translate a number to a string if there is a mixture of data types.

In this example we are using a CSV file with the Bulk Data and Text File data source types. The Data Source Key Field will be the DocumentNumber column.
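Since the original screenshot of the data source is not reproduced here, a representative sample of the layout might look like the following. The DocumentNumber, DepartmentCode, and ProgramCode columns come from the article; the remaining column names and values are illustrative only:

```
DocumentNumber,VendorID,AccountNumber,Amount,DepartmentCode,ProgramCode
EXP-0001,EMP001,000-6200-00,150.00,SALES,TRAVEL
EXP-0001,EMP001,000-2100-00,-150.00,,
```

Note that the payables (credit) line leaves the Dimension and Dimension Code columns blank, which is why a restriction is needed on the Analytics node.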


Mapping

I won’t go through the standard distribution and header mappings in this example but will focus on the Analytics Distribution. The most important thing to note is that the Analytics Distribution node is used for other document types in addition to Accounts Payable, so the field names might not map to the same values for each transaction type. If we don’t map the right values, Microsoft eConnect will not be able to find the distribution record for proper validation and will return an error.

In the case of Accounts Payable, the Document Number field would be mapped to the Voucher Number. The Document Type is always 0 when working with vouchers and 1 when working with a payment. This is explained in the Microsoft eConnect documentation: https://msdn.microsoft.com/en-us/library/ff623998.aspx. In our simple scenario, the Analytical Amount matches our distribution amount, so we wouldn’t have to map the amount field, but we will typically do the mapping for consistency.

Mapping Restriction

Not every distribution in my data source will have a Dimension Code assigned, so I need to create a restriction to prevent the Analytics node from being processed for those distributions.

With the Analytics Distribution Mapping window open, click the Restrictions button. The VB.NET restriction code checks whether the Dimension or Dimension Code is blank and, if so, restricts this distribution.
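Since the original screenshot of the restriction is not reproduced here, the logic can be sketched as follows. SmartConnect restrictions are written in VB.NET; this Python sketch only illustrates the check, using the DepartmentCode and ProgramCode columns from our data source (the function name is ours):

```python
def process_analytics_line(row):
    """Return True when the Analytics Distribution node should run.

    Skips the node when either the Dimension (DepartmentCode) or the
    Dimension Code (ProgramCode) is blank, mirroring the restriction.
    """
    dimension = (row.get("DepartmentCode") or "").strip()
    dimension_code = (row.get("ProgramCode") or "").strip()
    return bool(dimension) and bool(dimension_code)

# The expense line carries AA data; the payables line does not.
expense_line = {"DepartmentCode": "SALES", "ProgramCode": "TRAVEL"}
payables_line = {"DepartmentCode": "", "ProgramCode": ""}
```

In the actual map, returning False from the restriction simply causes SmartConnect to skip the Analytics node for that line rather than erroring.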


Scenario 2 – One Analytical Distribution per Payables Distribution and duplicate accounts.

The next easiest scenario to integrate is when we have the same account number on multiple distributions and each distribution has only one Analytical Distribution. For this scenario, we need to add the distribution sequence to our mapping. Depending on your data source, the value can come from the source itself or be calculated using a global variable.

Data Source

We will use the same data source as scenario 1 with the addition of a sequence number for the purchasing distribution. We need to make sure that every row has a sequence number. If you are unable to add the distribution sequence to the source, you can use this article as an example of calculating the sequence number within the map itself: http://www.eonesolutions.com/blog-post/tech-tuesday-line-sequence-numbering/
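The linked approach calculates the sequence inside the map with global variables. As a minimal sketch of that logic, shown here in Python rather than the VB.NET SmartConnect uses, and assuming rows arrive sorted by document number with sequences restarting at 1 per document:

```python
def assign_distribution_sequences(rows):
    """Assign an incrementing DistributionSequence to each row.

    Assumes rows are already sorted by DocumentNumber; the counter
    resets to 1 whenever a new document number is encountered.
    """
    current_doc = None
    sequence = 0
    for row in rows:
        if row["DocumentNumber"] != current_doc:
            current_doc = row["DocumentNumber"]
            sequence = 0
        sequence += 1
        row["DistributionSequence"] = sequence
    return rows

rows = assign_distribution_sequences([
    {"DocumentNumber": "EXP-0001"},
    {"DocumentNumber": "EXP-0001"},
    {"DocumentNumber": "EXP-0002"},
])
```

In SmartConnect the "current document" and "counter" would be global variables that persist between rows of the same map run.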

Mapping

We need to make one addition to our Add Distribution node by adding the distribution sequence.

Our Analytical Distribution mapping includes the addition of the distribution sequence. Everything else within this node remains the same.

Mapping Restriction

This will be the same restriction as used in scenario 1.


Scenario 3 – Multiple Analytical Distributions per Payables Distribution and duplicate accounts.

The next scenario to integrate is when we have the same account number on multiple distributions and there can be multiple Analytical Distributions for each payables distribution. With this scenario, we need to add both a distribution sequence and an analytical sequence to our mapping. Depending on your data source, each value can come from a source column or be calculated using global variables as described in scenario 2.

Data Source

Here is our data source with the additional Analytical Distribution sequence number. I also separated the amounts into debit and credit columns to reduce the number of calculations needed for summing amounts on the distribution node.


Mapping

In the Add Distribution node, we group on the document number and distribution sequence so that one distribution is created per account and sequence. We also sum the debit and credit amounts to ensure the distribution amount aggregates to the correct value.
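As a sketch of what this grouping and summing produces (Python for illustration; the DebitAmount and CreditAmount column names are assumptions based on the scenario 3 data source):

```python
from collections import defaultdict

def build_distributions(rows):
    """Group rows by (DocumentNumber, DistributionSequence) and sum the
    debit and credit columns, yielding one distribution per group."""
    totals = defaultdict(lambda: {"Debit": 0.0, "Credit": 0.0})
    for row in rows:
        key = (row["DocumentNumber"], row["DistributionSequence"])
        totals[key]["Debit"] += row.get("DebitAmount", 0.0)
        totals[key]["Credit"] += row.get("CreditAmount", 0.0)
    return dict(totals)

# Two AA rows share distribution sequence 1, so their debits collapse
# into a single payables distribution of 150.00.
distributions = build_distributions([
    {"DocumentNumber": "EXP-0001", "DistributionSequence": 1, "DebitAmount": 100.0},
    {"DocumentNumber": "EXP-0001", "DistributionSequence": 1, "DebitAmount": 50.0},
    {"DocumentNumber": "EXP-0001", "DistributionSequence": 2, "CreditAmount": 150.0},
])
```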


In the Analytics Distribution, we map the AASequence to the subledger assign lookup ID.


Mapping Restriction

This will be the same restriction as used in scenario 1.

Troubleshooting

We will admit that troubleshooting issues when integrating AA transactions can be difficult. If something is not mapped correctly on the Analytics Distribution node, the most common error you will receive is “An item with the same key has already been added”. Unfortunately, this error is returned to SmartConnect by the Microsoft eConnect runtime and means eConnect has generated the same error multiple times. It usually means the Microsoft eConnect taProcessAnalytics stored procedure cannot validate the Analytics Distribution against the matching payables distribution. The way we troubleshoot this issue is to use SQL Profiler to capture the text of the calls being made to the eConnect procedures and then execute those procedures within SQL Server Management Studio. https://docs.microsoft.com/en-us/sql/tools/sql-server-profiler/sql-server-profiler

Another possible method to validate the mapping is to change the map destination to Dynamics GP – File and review the XML that is generated to verify the calculations and mappings are correct.

Here is a list of items to verify when looking at the generated XML.

  • Make sure the Document Number and Document Type are correct for the Analytics Distribution node.
  • Make sure the Analytics Distribution node is after the Add Distribution node and before the Create Payables Transaction node.
  • Make sure the Distribution Sequence is correct for both the Distribution node and the Analytics Distribution node.
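The node-ordering check above can be sanity-checked programmatically by parsing the generated file. This sketch uses the eConnect payables node names as we understand them (taPMDistribution, taAnalyticsDistribution, taPMTransactionInsert); confirm them against your own generated XML, and note the trimmed document here is illustrative only:

```python
import xml.etree.ElementTree as ET

# Trimmed, illustrative example of the node ordering to look for when
# reviewing the generated eConnect XML; field values are omitted.
generated = ET.fromstring(
    "<PMTransactionType>"
    "<taPMDistribution/>"
    "<taAnalyticsDistribution/>"
    "<taPMTransactionInsert/>"
    "</PMTransactionType>"
)

tags = [child.tag for child in generated]
# The Analytics node must sit after the distribution node and before
# the node that creates the payables transaction.
correctly_ordered = (
    tags.index("taPMDistribution")
    < tags.index("taAnalyticsDistribution")
    < tags.index("taPMTransactionInsert")
)
```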


Additional Considerations

A common question is whether we can provide the Analytical Distribution Alias and have eConnect pull the right values for the Analytical data. Unfortunately, this is not handled within eConnect, so we can only provide the actual Analytical data with the integration.

Another scenario is one where the Dimension Codes are all contained on the same line of the source document. If the data is in this format, it will require pivoting the rows so that each Dimension Code is on its own line; SmartConnect cannot process the file until the Dimension Codes are on their own lines.
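A pre-processing step outside SmartConnect can handle that pivot. As a minimal sketch in Python, with hypothetical column names (the source columns holding each code, and the Dimension/DimensionCode output columns, are assumptions):

```python
def unpivot_dimensions(row, dimension_columns):
    """Split one wide source row into one row per Dimension so that
    each Dimension / Dimension Code pair lands on its own line.

    dimension_columns maps a Dimension name to the source column that
    holds its code, e.g. {"DEPARTMENT": "DepartmentCode"}.
    """
    base = {k: v for k, v in row.items() if k not in dimension_columns.values()}
    rows = []
    for dimension, column in dimension_columns.items():
        code = row.get(column, "")
        if code:
            rows.append({**base, "Dimension": dimension, "DimensionCode": code})
    return rows

# One wide row carrying two Dimension Codes becomes two long rows.
wide_row = {"DocumentNumber": "EXP-0001", "DepartmentCode": "SALES", "ProgramCode": "TRAVEL"}
long_rows = unpivot_dimensions(
    wide_row, {"DEPARTMENT": "DepartmentCode", "PROGRAM": "ProgramCode"}
)
```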

