I have been working on a large-scale ETL project, and one of the data sources I regularly pull from is Salesforce. Anyone who has worked with Salesforce knows it can be a free-for-all with object field changes. That alone makes pulling data regularly from the Salesforce REST API difficult, since there can be so much activity on the objects, and the number of custom fields that can be added to a single object runs into the hundreds. With these factors in play, it can be a daunting task to debug errors when they inevitably crop up.
During a recent run of my process I started to receive the following error:
Error 40197 The service has encountered an error processing your request. Please try again. Error code 4815. A severe error occurred on the current command. The results, if any, should be discarded.
Here is a list of error codes, although it was not much help in my situation: SQL error codes
In my ETL process I am using Entity Framework and EntityFramework.BulkInsert-ef6 to do bulk inserts into my Azure SQL Database. Since I knew there was a good chance a change to an object definition in Salesforce was the cause of this error, that is where I started to investigate. As it turns out, one field's length had been changed from 40 to 60, which means the original table I had created with a varchar(40) column was going to have a problem. In my case, the error occurred when the incoming data was larger than the field size defined in the table. Hopefully this post gives someone else another troubleshooting avenue for this error.
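One way to catch this kind of mismatch before the bulk insert fails is to compare the length of each incoming value against the column sizes of the destination table. Here is a minimal sketch of that idea in Python; the records, field names, and column sizes are hypothetical examples, not part of the original process.

```python
# Minimal sketch: flag values that exceed a table's varchar column sizes
# before a bulk insert. The column sizes and records below are
# hypothetical examples.

def find_oversized_values(records, column_sizes):
    """Return (row_index, field, length, limit) for each value longer
    than the destination column definition allows."""
    problems = []
    for i, record in enumerate(records):
        for field, value in record.items():
            limit = column_sizes.get(field)
            if limit is not None and value is not None and len(value) > limit:
                problems.append((i, field, len(value), limit))
    return problems

# Example: a Name column defined as varchar(40), as in the original table
column_sizes = {"Name": 40, "Industry": 60}
records = [
    {"Name": "Acme", "Industry": "Manufacturing"},
    {"Name": "A" * 55, "Industry": "Retail"},  # exceeds varchar(40)
]

for row, field, length, limit in find_oversized_values(records, column_sizes):
    print(f"Row {row}: {field} is {length} chars but column is varchar({limit})")
```

Running a check like this against a batch before inserting would have pointed straight at the widened Salesforce field instead of the opaque server-side error.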