StackOverflow April 2016 Contributions

Below are links to questions I have contributed to on StackOverflow.  If you have time, or any of the questions interest you, take a moment to review the question and its answers.  If you feel one of the answers appropriately answers the question, please vote it up to better serve the community and future users looking for an answer to a similar question.

Accessing Salesforce Reports and Dashboards REST API Using C#

Introduction

If you have read any of my other posts, you know I have been doing work with the Salesforce REST API.  I recently had a need to access the Salesforce Reports and Dashboards REST API using C#.  While spiking out a simple example, I did not come across much documentation on how to accomplish this.  In this post I will walk through a quick spike on how to authenticate with the API and how to call it to get a report.  The full code sample can be found here on GitHub.

Prerequisites

Visual Studio

With any access to a Salesforce API you will need a user account (username, password, token) and the consumer key/secret combination from your custom connected app.  With these pieces of information, we can begin by creating a simple console application to spike out access to the Reports and Dashboards API.  Next we need to install the following NuGet packages (the DeveloperForce package for authentication and RestSharp for the report request):

  • DeveloperForce.Force
  • RestSharp

Once these packages are installed we can utilize them to create a function to access the Salesforce Reports and Dashboards API.

 
// Use the DeveloperForce common utilities to authenticate with Salesforce.
// The security token is appended to the password; the consumer key/secret
// come from your custom connected app.
var sf_client = new Salesforce.Common.AuthenticationClient();
sf_client.ApiVersion = "v34.0";
await sf_client.UsernamePasswordAsync(consumerKey, consumerSecret, username, password + usertoken, url);

Here we are taking advantage of some of the common utilities in the DeveloperForce package to create an authentication client, which will get us our access token from the Salesforce API.  We will need that token next to start making requests to the API.  Unfortunately, the DeveloperForce library does not have the ability to call the Reports and Dashboards API; we are just using it here to easily get the access token.  This all could be done using RestSharp, but it's simpler to utilize what has already been built.

                               
// Build the relative URL for the report we want to run.
string reportUrl = "/services/data/" + sf_client.ApiVersion + "/analytics/reports/" + reportId;

// Point RestSharp at the instance URL returned during authentication and
// pass the access token in the Authorization header.
var client = new RestSharp.RestClient(sf_client.InstanceUrl);
var request = new RestSharp.RestRequest(reportUrl, RestSharp.Method.GET);
request.AddHeader("Authorization", "Bearer " + sf_client.AccessToken);
var restResponse = client.Execute(request);
var reportData = restResponse.Content;

Since we have used the DeveloperForce package to handle authentication, we can now use RestSharp and the access token to query the Reports API.  In the code above we set up a RestSharp client with the Salesforce instance URL, then define the request for the report we want to execute.  Finally, we add the Salesforce access token to the Authorization header and make the request to receive the report data.
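
The response content is raw JSON.  As a quick sanity check you can pull a value back out of it with Json.NET (a minimal sketch, assuming the Newtonsoft.Json package is also installed; factMap and its "T!T" key are how the Analytics API labels the report's grand total):

// Parse the report JSON and read the grand total aggregate value.
var report = Newtonsoft.Json.Linq.JObject.Parse(reportData);
var grandTotal = report["factMap"]?["T!T"]?["aggregates"]?[0]?["value"];
Console.WriteLine("Report grand total: " + grandTotal);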

Conclusion

As described, this is a pretty simple example of how to authenticate with and request a report from the Salesforce Reports and Dashboards REST API using C#.  Hopefully this can be a jumping-off point for accessing this data.  The one major limitation for me is that the API only returns up to 2,000 rows of a report, which is especially frustrating if your Salesforce org has a lot of data.  In the near future I will be writing a companion post on how to get around this limitation.

Error 40197 Error Code 4815 Bulk Insert into Azure SQL Database

I have been working on a large-scale ETL project, and one of the data sources I regularly pull data from is Salesforce.  Anyone who has worked with Salesforce knows it can be a free-for-all with object field changes.  This alone makes regularly pulling data from the Salesforce REST API difficult, since there can be so much activity on the objects.  Not to mention that the number of custom fields that can be added to a single object ranges into the hundreds.  With these factors in play, it can be a daunting task to debug errors when they inevitably crop up.

During a recent run of my process I started to receive the following error:

Error 40197 The service has encountered an error processing your request. Please try again. Error code 4815. A severe error occurred on the current command. The results, if any, should be discarded.

There is a published list of error codes, however it was not too helpful in my situation: SQL error codes

In my ETL process I am using Entity Framework and EntityFramework.BulkInsert-ef6, so I am doing bulk inserts into my Azure SQL Database.  Since I knew there was a good chance a change to an object definition in Salesforce was the cause of this error, that is where I started to investigate.  As it turns out, one field's length had been changed from 40 to 60, which means the original table I had created with a varchar(40) column was going to have a problem.  In my case this error occurred when the incoming data was larger than the column size defined in the table.  Hopefully this post will give someone else another troubleshooting avenue for this error.
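
If you run into the same situation, the fix is simply to widen the column to match the new field length.  Here is a minimal sketch of the change (the table name, column name, and connection string are hypothetical placeholders for your own schema):

// Widen the column from varchar(40) to varchar(60) to match the new
// Salesforce field definition.
string connectionString = "<your Azure SQL connection string>";
using (var connection = new System.Data.SqlClient.SqlConnection(connectionString))
{
    connection.Open();
    var sql = "ALTER TABLE [dbo].[Account] ALTER COLUMN [Description__c] varchar(60);";
    using (var command = new System.Data.SqlClient.SqlCommand(sql, connection))
    {
        command.ExecuteNonQuery();
    }
}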

Upgrading to Microsoft.Azure.Management.DataLake.Store 0.10.1-preview to Access Azure Data Lake Store Using C#

Introduction

Microsoft recently released a new NuGet package to programmatically access Azure Data Lake Store.  In a previous post, Accessing Azure Data Lake Store from an Azure Data Factory Custom .Net Activity, I utilized Microsoft.Azure.Management.DataLake.StoreFileSystem 0.9.6-preview to programmatically access the data lake using C#.  In this post I will go through what needs to change in my previous code to upgrade to the new NuGet package.  I will also include a new version of the DataLakeHelper class which uses the updated SDK.

Upgrade Path

Since I already have a sample project utilizing the older SDK (Microsoft.Azure.Management.DataLake.StoreFileSystem 0.9.6-preview), I will use it as an example of what needs to be modified to use the updated NuGet package (Microsoft.Azure.Management.DataLake.Store 0.10.1-preview).

The first step is to remove all of the packages which supported the obsolete SDK.  Here is the list of packages that can be removed:

  • Hyak.Common
  • Microsoft.Azure.Common
  • Microsoft.Azure.Common.Dependencies
  • Microsoft.Azure.Management.DataLake.StoreFileSystem
  • Microsoft.Bcl
  • Microsoft.Bcl.Async
  • Microsoft.Bcl.Build
  • Microsoft.Net.Http

All of these dependencies are needed when using the DataLake.StoreFileSystem package.  In my previous sample I was also using Microsoft.Azure.Management.DataFactories in order to create a custom activity for Azure Data Factory; unfortunately, that package has a dependency on all of the above packages as well.  Please be careful removing these packages, as your own applications might have other dependencies on those listed above.  To show that these packages are no longer needed, my new sample project is just a simple console application using the modified DataLakeHelper class, which can be found here on GitHub.

Now let’s go through the few changes that need to be made to the DataLakeHelper class in order to use the new NuGet package.  The following functions from the original DataLakeHelper class will need to be modified:

create_adls_client()
execute_create(string path, MemoryStream ms)
execute_append(string path, MemoryStream ms)

Here is the original code for create_adls_client():

private void create_adls_client()
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenant_id}");
    var credential = new ClientCredential(clientId: client_id, clientSecret: client_key);
    var result = authenticationContext.AcquireToken(resource: "https://management.core.windows.net/", clientCredential: credential);

    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }

    string token = result.AccessToken;

    var _credentials = new TokenCloudCredentials(subscription_id, token);
    inner_client = new DataLakeStoreFileSystemManagementClient(_credentials);
}

In order to upgrade to the new SDK, there are two changes that need to be made:

  1. The DataLakeStoreFileSystemManagementClient now requires a ServiceClientCredentials object.
  2. You must set the Azure subscription id on the newly created client.

The last two lines should now look like this:

var _credentials = new TokenCredentials(token);
inner_client = new DataLakeStoreFileSystemManagementClient(_credentials);
inner_client.SubscriptionId = subscription_id;
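
Putting those two changes together, the updated create_adls_client() method looks like this:

private void create_adls_client()
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenant_id}");
    var credential = new ClientCredential(clientId: client_id, clientSecret: client_key);
    var result = authenticationContext.AcquireToken(resource: "https://management.core.windows.net/", clientCredential: credential);

    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }

    string token = result.AccessToken;

    // TokenCredentials (from the Microsoft.Rest namespace) replaces
    // TokenCloudCredentials, and the subscription id now lives on the client.
    var _credentials = new TokenCredentials(token);
    inner_client = new DataLakeStoreFileSystemManagementClient(_credentials);
    inner_client.SubscriptionId = subscription_id;
}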

Now that we can successfully authenticate again with the Azure Data Lake Store, the next change is to the create and append methods.

Here is the original code for execute_create(string path, MemoryStream ms) and execute_append(string path, MemoryStream ms):

private AzureOperationResponse execute_create(string path, MemoryStream ms)
{
    var beginCreateResponse = inner_client.FileSystem.BeginCreate(path, adls_account_name, new FileCreateParameters());
    var createResponse = inner_client.FileSystem.Create(beginCreateResponse.Location, ms);
    Console.WriteLine("File Created");
    return createResponse;
}

private AzureOperationResponse execute_append(string path, MemoryStream ms)
{
    var beginAppendResponse = inner_client.FileSystem.BeginAppend(path, adls_account_name, null);
    var appendResponse = inner_client.FileSystem.Append(beginAppendResponse.Location, ms);
    Console.WriteLine("Data Appended");
    return appendResponse;
}

The change for both of these methods is pretty simple: the BeginCreate and BeginAppend methods are no longer available, and the new Create and Append methods now take the path and the Azure Data Lake Store account name directly.

With the changes applied the new methods are as follows:

private void execute_create(string path, MemoryStream ms)
{
    // Create the file in a single call; the final argument controls
    // whether an existing file should be overwritten.
    inner_client.FileSystem.Create(path, adls_account_name, ms, false);
    Console.WriteLine("File Created");
}

private void execute_append(string path, MemoryStream ms)
{
    // Append the stream contents to the existing file.
    inner_client.FileSystem.Append(path, ms, adls_account_name);
    Console.WriteLine("Data Appended");
}

Conclusion

As you can see, it was not difficult to upgrade to the new version of the SDK.  Unfortunately, since these are all preview bits, changes like this can happen; hopefully this SDK has found its new home and won’t go through too many more breaking changes for the end user.