Tuesday, January 6, 2015

Retail - Process upload jobs encountered error

You might encounter a conversion error when the Async client tries to create an upload job from the Retail transactions; more specifically, it is a decimal-to-integer SQL conversion cast error. The error makes the Async client throw an overflow exception and fall flat on its face every time "that" retail transaction is processed into an upload package. How do you resolve it, and why did it occur in the first place?
I am going to be frank: in production you will nearly have a heart attack when you see this error in the event log while the trickle transactions stop coming back, with no hint of why or how it happens. So put on your Sherlock Holmes hat and start your investigation for clues.
First, the Why. The Async client raises the exception when it tries to create a package and one of the columns in the upload tables holds a value larger than what an integer can hold. Yes, this can happen if the cashier accidentally scans a 13-digit EAN-13 barcode into a Quantity or Price field. The scanners at the Hypermart registers are always on, meaning a product barcode can get scanned even while tendering, and remember the scanner is programmed to append a line feed <Enter>, thereby committing the entry. So if the register was at cash tender for a $10 sale and a barcode does get scanned, the system takes 8,883,457,821,090 as the tendered amount and returns 8,883,457,821,080 as change. There is no issue at this point: the RetailTransactionPaymentTrans record is created with this gigantic value and the sale completes. The real exception comes when the pull job starts to create records for this transaction; the upload job cannot proceed past it, and from then on no transactions trickle-feed back to HQ.

Simple scripts to identify the issue by Qty or Amount (sales values are stored as negatives, so ABS catches outliers on either sign):

SELECT * FROM RetailTransactionSalesTrans WHERE ABS(QTY) > 50000
SELECT * FROM RetailTransactionSalesTrans WHERE ABS(NETAMOUNT) > 500000
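Since the culprit in my case turned out to be a payment line, a similar sanity check on the payment table is worth running up front. A sketch using the same arbitrary caps and the amount column names referenced later in this post:

-- Same idea for tender lines; the thresholds are arbitrary sanity caps
SELECT * FROM RetailTransactionPaymentTrans WHERE ABS(AMOUNTMST) > 500000
SELECT * FROM RetailTransactionPaymentTrans WHERE ABS(AMOUNTCUR) > 500000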

How do you solve this, then? If it is a test environment, just truncate all the transaction tables and you are good to go immediately; the Async client springs back into action and starts churning out upload job packages. For the unfortunate ones who hit the issue in production, you have to find the transaction and cleanse it before the Async client can limp back to life.
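For a test box, the wipe can be as blunt as a few TRUNCATE statements over the transaction tables from the list further below, for example:

-- TEST ENVIRONMENTS ONLY: never run this against a production store DB
TRUNCATE TABLE RetailTransactionSalesTrans
TRUNCATE TABLE RetailTransactionPaymentTrans
TRUNCATE TABLE RetailTransactionTaxTrans
TRUNCATE TABLE RetailTransactionTable
-- ...and so on for the remaining transaction tables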
A clear understanding of the exception matters more than just how to handle it: the error is triggered by SQL during the CAST of a decimal column value when the value exceeds roughly +/-7,922,816,251,426. (That odd-looking ceiling presumably comes from .NET's System.Decimal maximum of about 7.9 x 10^28 spread across the 16 decimal places AX stores its amounts with.) Yes, that's all! The Async client never tells you which transaction, value, column, or even table is involved, so you have to work it out from when the exception started, the last complete retail transaction that was synced, and which store's Async client throws it.
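If you want to see the same class of failure on demand, a one-liner in SSMS reproduces the cast overflow used by the troubleshooting scripts below, here with the scanned-barcode figure from the example above:

-- Fails: "Arithmetic overflow error converting expression to data type int"
SELECT CAST(8883457821090 AS int)
-- Succeeds: exactly the int ceiling of 2,147,483,647
SELECT CAST(2147483647 AS int)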
In the case I went through, the last clean sync of retail sales was at 10:20 PM, and the error started occurring at 10:40 PM on that store's Async client. I took the list of all tables handled by the upload job and compared them one by one for columns with disproportionately high values. Below is the complete list for the R3 RTM version:-

RetailFiscalDocumentModel2_BR
RetailFiscalDocumentModel2Line_BR
RetailFiscalDocumentModel2TaxTrans_BR
RetailFiscalPrinter_BR
RetailFiscalPrinterReport_BR
RetailFiscalReceipt_BR
RetailFiscalReceiptLine_BR
RetailFiscalReceiptTaxTrans_BR
RetailListingStatusLog
RetailPosBatchAccountTrans
RetailPosBatchTable
RetailPosBatchTenderTrans
RetailTransactionAddressTrans
RetailTransactionAffiliationTrans
RetailTransactionAttributeTrans
RetailTransactionBankedTenderTrans
RetailTransactionDiscountTrans
RetailTransactionIncomeExpenseTrans
RetailTransactionInfocodeTrans
RetailTransactionKitsDisassemblyTrans
RetailTransactionLoyaltyRewardPointTrans
RetailTransactionMarkupTrans
RetailTransactionOrderInvoiceTrans
RetailTransactionPaymentTrans
RetailTransactionPaymentTrans_BR
RetailTransactionSafeTenderTrans
RetailTransactionSalesTrans
RetailTransactionTable
RetailTransactionTable_RU
RetailTransactionTaxTrans
RetailTransactionTenderDeclarationTrans
RetailZReport_BR
RetailZReportTotalizer_BR

Most of these tables contain little to no data; just cross them off. The major tables of interest are the RetailTransactionxxxxx and RetailPosBatchxxx tables. I started narrowing down all the transactions after 10:20 PM but before 10:40 PM in these tables.
Try a SQL CAST on the suspected columns to simulate how the Async client goes belly up. After a thorough investigation, the RetailPosBatchxxxx tables turned out to be pretty much harmless: even where they hold some large values, the code handles them properly and the data flows back to AX. So I shifted my real focus to the transaction tables, using a script to type-cast and simulate the failure quickly for all records with values larger than five digits in either the Qty or Amount columns.

Sample scripts for the RetailPosBatchTenderTrans table

Simple script:

SELECT CAST(TENDEREDAMOUNT AS int), CAST(TENDEREDAMOUNTCUR AS int), * FROM RetailPosBatchTenderTrans

Detailed script for analyzing line by line; the simple cast fails wholesale without saying which row, whereas here the last row printed before the overflow points at the culprit:

DECLARE @v_TERMINAL nvarchar(10), @v_BATCHID int, @i int,
        @v_TENDEREDAMOUNT decimal(32,16), @v_TENDEREDAMOUNTCUR decimal(32,16),
        @v_int1 int, @v_int2 int

-- Walk the first 200 tender lines one by one
DECLARE cur_temp CURSOR FOR
SELECT TOP 200 TERMINALID, BATCHID, TENDEREDAMOUNT, TENDEREDAMOUNTCUR
FROM RetailPosBatchTenderTrans

OPEN cur_temp
FETCH NEXT FROM cur_temp INTO @v_TERMINAL, @v_BATCHID, @v_TENDEREDAMOUNT, @v_TENDEREDAMOUNTCUR
SET @i = 0

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT '@v_TERMINAL    ' + CAST(@v_TERMINAL AS varchar(1000))
    PRINT '@v_BATCHID    ' + CAST(@v_BATCHID AS varchar(1000))

    -- The same decimal-to-int casts the Async client effectively performs;
    -- the batch stops here with an arithmetic overflow on the offending row
    SET @v_int1 = CAST(@v_TENDEREDAMOUNT AS int)
    PRINT '@v_TENDEREDAMOUNT    ' + CAST(@v_int1 AS varchar(1000))

    SET @v_int2 = CAST(@v_TENDEREDAMOUNTCUR AS int)
    PRINT '@v_TENDEREDAMOUNTCUR    ' + CAST(@v_int2 AS varchar(1000))

    SET @i = @i + 1
    PRINT '-----completed--- i = ' + CAST(@i AS varchar(10))
    PRINT '---------------------------------'

    FETCH NEXT FROM cur_temp INTO @v_TERMINAL, @v_BATCHID, @v_TENDEREDAMOUNT, @v_TENDEREDAMOUNTCUR
END

CLOSE cur_temp
DEALLOCATE cur_temp

Repeat the steps for each of the tables in the RetailTransaction series.
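If the store database runs on SQL Server 2012 or later, a set-based TRY_CAST sweep can replace the cursor entirely and surface every offending row in one pass. A minimal sketch for the same table:

-- TRY_CAST returns NULL instead of raising the overflow, so the rows the
-- Async client chokes on fall out directly (the AX amount columns are
-- non-nullable, so NULL here can only mean the cast failed)
SELECT *
FROM RetailPosBatchTenderTrans
WHERE TRY_CAST(TENDEREDAMOUNT AS int) IS NULL
   OR TRY_CAST(TENDEREDAMOUNTCUR AS int) IS NULL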
I finally found that a CAST on the AmountMST column value in RetailTransactionPaymentTrans triggers the arithmetic conversion overflow, exactly the exception I was looking for from the Async client. The transaction amount was simply larger than what an integer can hold.
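In hindsight, a direct filter on the int ceiling would have surfaced the row immediately. A sketch, with STORE and TERMINAL assumed here as the usual key columns alongside the amount columns this post names:

-- 2,147,483,647 is the int maximum; anything beyond it cannot be packaged
SELECT TRANSACTIONID, STORE, TERMINAL, AMOUNTMST, AMOUNTCUR
FROM RetailTransactionPaymentTrans
WHERE ABS(AMOUNTMST) > 2147483647 OR ABS(AMOUNTCUR) > 2147483647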
There the error popped out. I got quick approval from the client to amend the payment amount and change-back lines directly in the production store database. Make sure the value comes down to an acceptable range for an integer, preferably something realistic, like $10 for a sales transaction that was actually $7.20.
Perform the steps: stop the Async client, commit the new values in the affected transaction (multiple amount columns may be present in the same table, such as PaymentAmount, AmountMST, and AmountCur; change all of them to the new value), and start the Async client again. If you identified the column and value correctly, the upload job should start kicking out packages again. If you still get the error, double-check the same tables for any other column holding an oversized value. By and large, the issue should be solved once you patch the transaction data.
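A hypothetical patch statement as a sketch; the key values ('000123', 'S001', 'T001') are placeholders you must replace from your own investigation, and the column list with whatever amount columns your row actually holds:

BEGIN TRAN
-- Placeholder keys; verify the affected row before touching it
UPDATE RetailTransactionPaymentTrans
SET AMOUNTMST = 10.00, AMOUNTCUR = 10.00
WHERE TRANSACTIONID = '000123' AND STORE = 'S001' AND TERMINAL = 'T001'
-- Inspect the result, then COMMIT (or ROLLBACK if anything looks off)
COMMIT TRAN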
How do you avoid this scenario entirely, before it ever happens? It is possible by adding validations for Quantity and Price. For example, always set a maximum amount for each tender type (the per-tender cap can be configured on the store's payment methods in AX); for me, $100,000 is a generous ceiling, and there will not be a sale paid with a single tender type for more than that. With the cap in place, even if the barcode gets scanned into the tender field, the amount validation stops the payment from completing.

For those who managed to solve this differently, please do share your approach in the comments; it could really help someone facing this issue in a production environment.

Reference Detailed Exception Log:
Process upload jobs encountered Exception. Error Details: System.OverflowException: Conversion overflows.
   at System.Data.SqlClient.SqlBuffer.get_Decimal()
   at System.Data.SqlClient.SqlBuffer.get_Value()
   at System.Data.SqlClient.SqlDataReader.GetValue(Int32 i)
   at Microsoft.Dynamics.Retail.StoreConnect.Request.SQLHandler.SCSqlReadRequestRunner.Run(SqlConnection connection, SqlTransaction transaction)
   at Microsoft.Dynamics.Retail.StoreConnect.Request.SQLHandler.SCSqlSourceRequestHandler.ProcessReadRequests(SqlConnection connection, SqlTransaction transaction)
   at Microsoft.Dynamics.Retail.StoreConnect.Request.SQLHandler.SCSqlSourceRequestHandler.ProcessSourceRequestHeader(ISCSourceRequestHeader sourceRequestHeader)
   at Microsoft.Dynamics.Retail.SynchClient.Core.UploadAgent.PackRequestResultsToFile(SessionManager sessionMgr, ISCSourceRequestHeader header, String fileName)
   at Microsoft.Dynamics.Retail.SynchClient.Core.UploadAgent.ProcessUploadJob(Job activeJob)