How to truncate text in an Angular Material UI table cell

Working on a recent project using Angular and Material UI, I needed to be able to truncate the text in one of the cells of a Material table. Here is the simplest way I have found to accomplish that.

Add this to your CSS:
mat-cell > span.truncate-text {
    text-overflow: ellipsis; 
    overflow: hidden; 
    white-space: nowrap;
}

Then wrap the long text in a span with a class of truncate-text:

<ng-container matColumnDef="longtext">
  <mat-header-cell *matHeaderCellDef> LongText </mat-header-cell>
  <mat-cell *matCellDef="let element">
    <span class="truncate-text">{{element.longtext}}</span>
  </mat-cell>
</ng-container>

Workaround – Google Sign-In without Google+ API with MVC .NET and Owin

A few years back I created a small custom application for a client that used their Google logins for authentication.  The web application was written with ASP.NET MVC and used the Katana/OWIN pipeline.  The common practice for setting up Google sign-in with that stack was to also enable the Google+ API.  If you have done this, then like me you recently received an email explaining that, as of March 2019, the Google+ APIs will be shut down.  I have spent the last few days reading through the documentation to understand what needs to be done to fix this without completely switching up the current login flow, but Google doesn't seem to make this transition easy for .NET applications.  Thankfully I finally found the answer I was looking for: a workaround posted in GitHub comments that addresses this exact issue.  For those of you who are in the same boat as me, have a look at this comment.  I have made the changes recommended there and can verify that my Google Sign-In is now working again without the Google+ API enabled.
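For context, the general shape of that workaround is to give the Google OAuth2 middleware a custom backchannel handler so the user-info request no longer goes to the Google+ endpoint. The snippet below is only my own sketch of that idea, not the code from the linked comment; the userinfo URL is an assumption, and depending on your Katana version you may also need to remap the response JSON into the field names the middleware expects.

using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin.Security.Google;
using Owin;

// Sketch only – follow the linked GitHub comment for the exact fix.
public class GoogleUserInfoBackchannelHandler : DelegatingHandler
{
    public GoogleUserInfoBackchannelHandler() : base(new HttpClientHandler()) { }

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Redirect the deprecated Google+ people endpoint to the OAuth2 userinfo endpoint.
        if (request.RequestUri.AbsolutePath.Contains("plus"))
        {
            request.RequestUri = new Uri(
                "https://www.googleapis.com/oauth2/v3/userinfo" + request.RequestUri.Query);
        }
        return base.SendAsync(request, cancellationToken);
    }
}

public partial class Startup
{
    public void ConfigureGoogleAuth(IAppBuilder app)
    {
        // Wire the handler into the existing Google middleware registration.
        app.UseGoogleAuthentication(new GoogleOAuth2AuthenticationOptions
        {
            ClientId = "your-client-id",
            ClientSecret = "your-client-secret",
            BackchannelHttpHandler = new GoogleUserInfoBackchannelHandler()
        });
    }
}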

Advent of Code 2018 – Day 1

In an effort to learn Python, I am working my way through the Advent of Code challenges.  Advent of Code is a coding challenge run during the holiday season every year; it's a fun way to keep your skills sharp.  It is also a great way to learn a new language, because it gives you focused challenges that force you to figure out how the new language works. Check it out here: https://adventofcode.com/2018

Continue reading “Advent of Code 2018 – Day 1”

Eventstore – Extra Statistics Option on Persistent Subscription – Where is it?

When creating a new persistent subscription in EventStore, there are many options to configure. The extra statistics option is a peculiar one: once it is enabled, there is no indication of what it is doing or where you can see the extra statistics. My previous post about viewing more detailed information on an EventStore persistent subscription likely gave away the answer: the JSON result of the info link for the persistent subscription has a connections collection. With extra statistics enabled, you will see statistics gathered for each connection. Continue reading “Eventstore – Extra Statistics Option on Persistent Subscription – Where is it?”
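If you would rather pull that JSON yourself than click through the web UI, a minimal sketch with HttpClient looks like the following. The stream name, group name, port, and credentials are placeholders, and the /subscriptions/{stream}/{group}/info path is based on the EventStore HTTP API convention for the info link.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class PersistentSubscriptionInfoExample
{
    static async Task Main()
    {
        using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:2113") })
        {
            // EventStore's HTTP API uses basic authentication (default admin credentials shown here).
            var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("admin:changeit"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

            // The info link for a persistent subscription group on a stream.
            var json = await client.GetStringAsync("/subscriptions/my-stream/my-group/info");

            // With extra statistics enabled, the "connections" collection in this JSON
            // contains the per-connection statistics discussed above.
            Console.WriteLine(json);
        }
    }
}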

EventStore HTTP API – Replaying Parked Messages in C# with HttpClient

EventStore's competing consumer pattern is implemented through the use of persistent subscriptions.  Competing consumers is a great pattern to use when you need multiple consumers pulling from a single stream, with the state of the stream being managed by EventStore.  In order to keep receiving messages, a consumer must ack/nack each message it receives in some way.  On the happy path, the consumer processes the message and returns an ack to signify that it was handled.  If a consumer is unable to process the message, there are multiple options that can be taken, likely ending in the message being parked.  Each subscription group created has another stream available known as the parked message queue.  Messages which are not acknowledged can be put into this queue for diagnosis later.  Through EventStore's web UI, you can easily click the replay parked messages link for a particular subscription group.  If you need to replay this queue programmatically, it can be done through the REST API.

Here is an example of how to programmatically invoke the replay parked messages functionality. Continue reading “EventStore HTTP API – Replaying Parked Messages in C# with HttpClient”
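The full example is behind the link, but at its core the call is just an authenticated POST with an empty body against the subscription group. Reusing the same HttpClient setup as in the previous sketch, and assuming the /subscriptions/{stream}/{group}/replayParked path from the EventStore HTTP API (stream and group names are placeholders):

// Replay the parked message queue for a persistent subscription group – sketch only.
var response = await client.PostAsync(
    "/subscriptions/my-stream/my-group/replayParked",
    new StringContent(string.Empty));

// A success status code means EventStore accepted the replay request.
Console.WriteLine(response.StatusCode);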

Change Default Swagger Route in an ASP.Net Core Web Api

Introduction

Over the past few weeks I have been doing some work with ASP.NET Core Web API projects using Swagger.  The setup was pretty standard until the APIs needed to be deployed to staging and production environments.  The web APIs are hosted in Docker containers behind a reverse proxy, and the staging and production environments required a route prefix for each API.  This meant that the default URLs for the controllers and Swagger would need to include a route prefix.  Adding a route prefix to Swagger and Swagger UI is a pretty quick code change. Continue reading “Change Default Swagger Route in an ASP.Net Core Web Api”
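The details are in the full post, but to give a rough idea of the shape of the change: with Swashbuckle.AspNetCore it comes down to the RouteTemplate and RoutePrefix options in Startup.Configure. The prefix value and document name below are placeholders, and your setup may differ slightly depending on the Swashbuckle version.

// Startup.Configure – sketch only, "myprefix" stands in for the reverse proxy's route prefix.
const string routePrefix = "myprefix";

app.UseSwagger(c =>
{
    // Serve the generated swagger.json underneath the prefix.
    c.RouteTemplate = routePrefix + "/swagger/{documentName}/swagger.json";
});

app.UseSwaggerUI(c =>
{
    // Serve the UI underneath the prefix and point it at the relocated document.
    c.RoutePrefix = routePrefix + "/swagger";
    c.SwaggerEndpoint($"/{routePrefix}/swagger/v1/swagger.json", "My API V1");
});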

Running an Azure Data Factory Pipeline on a Weekday Schedule Using an Azure Function

Introduction

I have written a few posts about different aspects of Azure Data Factory.  I use it as the main workhorse of my data integration and ETL projects.  One major drawback I have found with Azure Data Factory is the scheduling system; it's not as flexible as I and many others would like it to be.  With that being said, there are certainly ways to adapt and get more control over an Azure Data Factory pipeline execution.  In my post Starting an Azure Data Factory Pipeline from C# .Net, I outline the need to kick off a pipeline after a local job has completed and how this can be achieved by using the SDK to programmatically set the pipeline's Start/End dates.  You may not have that requirement specifically, but let's say you want to run a pipeline only on weekdays or on another specific schedule; this can be accomplished by using the same code from my prior post and scheduling a local console app.  However, I thought it would be more fun to use Azure Functions to kick off a pipeline on a weekday schedule and provide a fully cloud-based solution. Continue reading “Running an Azure Data Factory Pipeline on a Weekday Schedule Using an Azure Function”
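The weekday part of the solution comes down to a timer-triggered function whose CRON expression only fires Monday through Friday. Below is a minimal sketch; the function name and the 6:00 AM UTC firing time are arbitrary choices, and the actual Data Factory call is left as a placeholder pointing back to the earlier post.

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class StartPipelineOnWeekdays
{
    // NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week}
    // "0 0 6 * * 1-5" fires at 06:00 UTC, Monday through Friday only.
    [FunctionName("StartPipelineOnWeekdays")]
    public static void Run([TimerTrigger("0 0 6 * * 1-5")] TimerInfo timer, ILogger log)
    {
        log.LogInformation($"Weekday trigger fired at {DateTime.UtcNow:O}");

        // Kick off the Data Factory pipeline here, for example by setting its active period
        // with the SDK as outlined in "Starting an Azure Data Factory Pipeline from C# .Net".
    }
}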

Run Oracle RightNow Analytics Report with C#

Introduction

In a previous post, Get All Users from Oracle RightNow SOAP Api with C#, I gave a simple example of how to get object data using the QueryCSV method of the API.  There is another helpful method available on the Oracle RightNow API which allows you to run an analytics report and receive a CSV result set.  Being able to execute these reports from the API gives you the opportunity to structure the data in a form that directly meets your needs.

Prerequisites

  • Access to an Oracle RightNow instance
  • An Oracle RightNow analytics report defined
  • Visual Studio (I used Visual Studio Community 2015)
  • A new console application
  • A service reference to RightNow (follow the previous post here)

Visual Studio

In order to run the following sample method, you will need a service reference defined for the Oracle RightNow system you would like to access.  I briefly went through this in the previous post, so please follow up there on how to accomplish that.

        private static void run_report(int analyticsReportId)
        {
            // Identify the report to run by its analytics report ID.
            var analyticsReport = new AnalyticsReport();
            var reportID = new ID
            {
                id = analyticsReportId,
                idSpecified = true
            };
            analyticsReport.ID = reportID;

            //Example of a date range filter for the report (left commented out):
            //analyticsReport.Filters = new[]
            //{
            //    new AnalyticsReportFilter
            //    {
            //        Name = "UpdatedRange",
            //        Operator = new NamedID {ID = new ID {id = 9, idSpecified = true}},
            //        Values =
            //            new[]
            //            {
            //                $"{new DateTime(2016, 01, 20, 00, 00, 00).ToString("s")}Z",
            //                $"{new DateTime(2016, 01, 20, 23, 59, 59).ToString("s")}Z"
            //            }
            //    }
            //};

            // Receives the raw CSV file data via the out parameter below.
            byte[] fileData;

            // Fill in the credentials for your RightNow interface before running.
            var _client = new RightNowSyncPortClient();
            _client.ClientCredentials.UserName.UserName = "";
            _client.ClientCredentials.UserName.Password = "";

            var clientInfoHeader = new ClientInfoHeader { AppID = "Download Analytics Report data" };
            var tableSet = new CSVTableSet();

            try
            {
                // Execute the report; the results come back as a CSVTableSet and the raw bytes via fileData.
                tableSet = _client.RunAnalyticsReport(clientInfoHeader, analyticsReport, 10000, 0, ",", false, true, out fileData);
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
                throw;
            }
            // Each table in the set holds the report rows as delimited strings.
            var tableResults = tableSet.CSVTables;

            var data = tableResults[0].Rows.ToArray();
        }
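
To use it, call the method with the ID of the analytics report you want to run (the ID below is just a placeholder):

        static void Main(string[] args)
        {
            run_report(100123);
        }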

As you can see, it is pretty simple to execute an analytics report through the RightNow API.  Where the QueryCSV method requires you to define a ROQL query, here you populate the AnalyticsReport object to be executed.  Reports that require filters need the filter array populated on the AnalyticsReport object; you can see an example of a date range filter commented out in the method above.  Finally, the returned values from the report use the same result objects returned from the QueryCSV method.

Conclusion

For my current integration projects, I need to pull data from various APIs, and being able to pull data that is already structured helped speed up the process. Hopefully seeing this sample and knowing that this ability exists will give you another option for gathering data from the Oracle RightNow API.