PowerShell LINQ with Short Aliases

Most modern applications deal with some kind of filtering or querying. In C# and .NET, we have Language Integrated Query (LINQ), which we also have access to in PowerShell, since PowerShell is built on .NET.

To list the top 10 largest files in the Windows temporary folder that are larger than 1 KB and start with the letter W, skipping the first 5 and ordering by size, the C# code with LINQ would look something like this:

new System.IO.DirectoryInfo(@"C:\Windows\Temp")
    .GetFiles()
    .Where(x => x.Length > 1024 && x.Name.StartsWith("W"))
    .OrderByDescending(x => x.Length)
    .Select(x => new { x.Name, x.Length })
    .Skip(5)
    .Take(10)
    .ToList()
    .ForEach(x => Console.WriteLine($"{x.Name} ({x.Length})"));

The equivalent logic in PowerShell has somewhat more daunting syntax, especially if you're not used to it:

Get-ChildItem "C:\Windows\Temp" `
| Where-Object {$_.Length -gt 1024 -and $_.Name.StartsWith("W")} `
| Sort-Object {$_.Length} -Descending `
| Select-Object -Property Name, Length -First 10 -Skip 5 `
| ForEach-Object {Write-Host "$($_.Name) ($($_.Length))"}

That's quite explicit and verbose, but if you run the command Get-Alias in PowerShell, you will see a lot of useful aliases that make the syntax terser and easier to get an overview of:

gci "C:\Windows\Temp" `
| ?{$_.Length -gt 1024 -and $_.Name.StartsWith("W")} `
| sort{$_.Length} -Descending `
| select Name, Length -First 10 -Skip 5 `
| %{write "$($_.Name) ($($_.Length))"}

In a real scenario, you probably wouldn't write each result to the console, but let PowerShell present the result in its default table format.

HTML Encode TagHelper in ASP.NET Core

For a specific scenario recently, I wanted to display the HTML-encoded output of a TagHelper in ASP.NET Core. That is, I wanted to use the TagHelper, but instead of rendering its actual result, see the raw HTML it would have contributed to my template.

So I created another TagHelper, which lets me wrap any HTML, inline code in ASP.NET Core and other TagHelpers, and have all the content inside its tag HTML-encoded, like this:

<html-encode>
    <a href="@Url.Action("Index")">Read More</a>
    @Html.TextBox("No_Longer_Recommended-TagHelpers_Preferred")
    <my-other-tag-helper />
</html-encode>

From this, I get the raw HTML of the link with its UrlHelper-generated URL, the output of the HTML helper, and the output of my other TagHelper.

The source code for the html-encode TagHelper is as follows:

[HtmlTargetElement("html-encode", TagStructure = TagStructure.NormalOrSelfClosing)]
public class HtmlEncodeTagHelper : TagHelper
{
    public override async Task ProcessAsync(TagHelperContext context, TagHelperOutput output)
    {
        var childContent = output.Content.IsModified
            ? output.Content.GetContent()
            : (await output.GetChildContentAsync()).GetContent();

        string encodedChildContent = WebUtility.HtmlEncode(childContent ?? string.Empty);

        output.TagName = null;
        output.Content.SetHtmlContent(encodedChildContent);
    }
}
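
For the TagHelper to be picked up by Razor, it also needs to be registered in _ViewImports.cshtml. The assembly name below is just a placeholder for whichever assembly the TagHelper lives in:

@addTagHelper *, MyWebApp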

API Rate Limit HTTP Handler with HttpClientFactory

Most APIs have a rate limit of some sort. For example, GitHub has a limit of 5,000 requests per hour. This can partly be handled by limiting your usage, timing your requests to the API, or caching the results.

What about when an API limits your requests per second? This is probably something you want to handle somewhere central in your code, not spread out everywhere you make an HTTP call to the API.

For me, the solution was to add an outgoing request middleware (a DelegatingHandler) to the setup of the HttpClientFactory.

With this, I can simply configure the startup services to use this RateLimitHttpMessageHandler class with the HttpClientFactory:

services
    .AddHttpClient<IApi, Api>()
    .AddHttpMessageHandler(() =>
        new RateLimitHttpMessageHandler(
            limitCount: 5,
            limitTime: TimeSpan.FromSeconds(1)))
    .AddDefaultTransientHttpErrorPolicy();

This ensures that wherever I use the class IApi, through dependency injection, calls to the API are limited to 5 per second.
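
IApi and Api themselves aren't shown here, but since AddHttpClient<IApi, Api>() registers a typed client, a minimal sketch could look like the following; the names and the GetDataAsync method are just placeholders for illustration:

public interface IApi
{
    Task<string> GetDataAsync(string path);
}

public class Api : IApi
{
    private readonly HttpClient _httpClient;

    // The HttpClientFactory injects an HttpClient whose pipeline
    // includes the RateLimitHttpMessageHandler configured above.
    public Api(HttpClient httpClient)
    {
        _httpClient = httpClient;
    }

    public Task<string> GetDataAsync(string path)
        => _httpClient.GetStringAsync(path);
}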

A simplified version of the code for the RateLimitHttpMessageHandler:

public class RateLimitHttpMessageHandler : DelegatingHandler
{
    private readonly List<DateTimeOffset> _callLog =
        new List<DateTimeOffset>();
    private readonly TimeSpan _limitTime;
    private readonly int _limitCount;

    public RateLimitHttpMessageHandler(int limitCount, TimeSpan limitTime)
    {
        _limitCount = limitCount;
        _limitTime = limitTime;
    }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        var now = DateTimeOffset.UtcNow;

        lock (_callLog)
        {
            _callLog.Add(now);

            while (_callLog.Count > _limitCount)
                _callLog.RemoveAt(0);
        }

        await LimitDelay(now);

        return await base.SendAsync(request, cancellationToken);
    }

    private async Task LimitDelay(DateTimeOffset now)
    {
        if (_callLog.Count < _limitCount)
            return;

        var limit = now.Add(-_limitTime);

        var lastCall = DateTimeOffset.MinValue;
        var shouldLock = false;

        lock (_callLog)
        {
            lastCall = _callLog.FirstOrDefault();
            shouldLock = _callLog.Count(x => x >= limit) >= _limitCount;
        }

        // Delay until the oldest logged call falls outside the time window
        var delayTime = shouldLock && (lastCall > DateTimeOffset.MinValue)
            ? (lastCall - limit)
            : TimeSpan.Zero;

        if (delayTime > TimeSpan.Zero)
            await Task.Delay(delayTime);
    }
}
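
Outside of dependency injection, you can also try the handler out directly by wrapping it around an HttpClientHandler. This is just a rough sketch to see the throttling in action, assuming the same 5-calls-per-second configuration:

// Quick manual test of the handler, outside of the HttpClientFactory setup.
var client = new HttpClient(
    new RateLimitHttpMessageHandler(limitCount: 5, limitTime: TimeSpan.FromSeconds(1))
    {
        InnerHandler = new HttpClientHandler()
    });

for (var i = 0; i < 10; i++)
{
    // After the first 5 calls within a second, the handler should start delaying.
    await client.GetAsync("https://api.github.com/");
    Console.WriteLine($"{i}: {DateTimeOffset.UtcNow:HH:mm:ss.fff}");
}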

Azure Storage Easy Web File-Hosting

In an ambition to improve my blog a little, I wanted to include more images in my posts, but I lacked a good solution for web file-hosting. To find the best fit, I put together a checklist of features and criteria. The solution I was looking for should, or must, check the following:

  • Ownership of the files: If the solution I use disappears tomorrow, I can still access my files.
  • Predictable URLs: The URL to the resources should never change. You don't want to have to update all your blog posts or other external links floating around the internet.
  • Good tooling: Avoid slow web uploads when uploading multiple and/or large files, and easily get an overview of your files.
  • (Bonus) Pretty URLs: "Pretty" looking URLs are easier to check for copy-paste errors, but could also potentially benefit SEO.
  • (Bonus) Low or no cost: Since there are free services out there, paying for file-hosting must be worth it.

Non-fitting Alternatives

When I started evaluating alternatives a while back, Flickr was still a thing. The problem was that I couldn't predictably link directly to an image I had uploaded. After running into all the obstacles, which indicated that using Flickr for image-hosting was actively being blocked, I understood it was too much of a hack.

Historically, I had been using Google's image hosting through Blogger, which is where this blog started out. The problem was that this also felt like a hack, and I was always worried that the URLs would change and I'd have to go through every single blog post I'd ever made and update all the images.

Services like Dropbox and Google Drive seem to actively try to block the use of their services for this, even if they are accessible through the web.

Azure Storage for easy web file-hosting

Enter Azure Storage, with its widespread adoption, familiar interface and extremely affordably priced Blob Storage. It checks off all the points in my checklist, and more. Given its backing, it's fair to assume more functionality will be added over time.

Azure Blob Storage can be used to store blobs of data of almost any size in the Azure cloud. By providing a path/key, you can read or write the "file" at that "path". The overall performance of Azure Storage is great and an important feature of the service, and the simple mechanisms of Azure Blob Storage make it very fast.

Expose blob-content to the internet

You could build a web app which accesses your files in Azure Blob Storage and exposes them through URLs in your API, but you can also let Azure handle that for you, by activating anonymous read access to your blobs.

You can do this at the container level, so you can have a dedicated container for public files, separate from your other containers in the same Storage account. These files will be read-only when you use the option Blob (anonymous read access for blobs only), found under the Access policy section of the selected container.

Upload files

Then you can use the Azure Portal, programmatically use the Azure Blob Storage API to upload files, or use the Azure Storage Explorer application for a friendly GUI experience to get started.
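
If you want to upload programmatically, a minimal sketch using the Azure.Storage.Blobs package could look like this; the connection string, container name and file paths are placeholders:

// Minimal upload sketch using the Azure.Storage.Blobs package (one of several client options).
using Azure.Storage.Blobs;

var container = new BlobContainerClient(
    "<storage-account-connection-string>", // found under Access keys in the Azure Portal
    "public");                             // the container with anonymous read access

await container.CreateIfNotExistsAsync();

// The blob path becomes part of the public URL, e.g. .../public/images/example.png
await container.GetBlobClient("images/example.png")
    .UploadAsync("example.png", overwrite: true);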

Now you can share your files anywhere you want, via the URL provided by Azure or through a prettier URL using a custom domain.

Add custom domain

To fulfill the criterion of pretty URLs, you can set your own custom domain for an Azure Storage account. If you don't, the default URL for Azure Blob Storage is https://{storage-account-name}.blob.core.windows.net/{container}/{blob-path}.

Activate Azure CDN

This is a great way for a small project to start out, but in the future you can easily transition into using the full power of Azure Content Delivery Network (Azure CDN) with your existing Azure Storage account, simply by activating it from the Azure CDN section of your Storage account.

dotnet-guid: Generate GUIDs/UUIDs with the Command Line

I've been involved in a few projects, although not many, where generating GUIDs/UUIDs has been important. For that, I used to use online tools like https://guidgenerator.com, but whenever I switched machines, the autocomplete in my browser would lose the link to that tool.

Partly for this reason, and mostly for fun, I decided to write a .NET Core Global Tool which quickly generates one or more GUIDs/UUIDs, in whatever format is needed.

Installation

Download the .NET Core SDK 2.1 or later. Then install the dotnet-guid .NET Global Tool using the command line:

dotnet tool install -g dotnet-guid

Usage

Usage: guid [arguments] [options]

Arguments:
  Count         Defines how many GUIDs/UUIDs to generate. Defaults to 1.

Options:
  -?|-h|--help  Show help information
  -n            Formatted as 32 digits:
                00000000000000000000000000000000
  -d            Formatted as 32 digits separated by hyphens:
                00000000-0000-0000-0000-000000000000
  -b            Formatted as 32 digits separated by hyphens, enclosed in braces:
                {00000000-0000-0000-0000-000000000000}
  -p            Formatted as 32 digits separated by hyphens, enclosed in parentheses:
                (00000000-0000-0000-0000-000000000000)
  -x            Formatted as four hexadecimal values enclosed in braces,
                where the fourth value is a subset of eight hexadecimal values that is also enclosed in braces:
                {0x00000000,0x0000,0x0000,{0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00}}
  -e            Defines if the GUIDs/UUIDs should be empty, using zero-values only.
  -u            Defines if the GUIDs/UUIDs generated should be upper-cased letters.
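
These options map to the standard Guid format specifiers in .NET, which is presumably what the tool uses under the hood, so the same formats can be produced in plain C# too:

using System;

var guid = Guid.NewGuid();

Console.WriteLine(guid.ToString("N")); // 32 digits
Console.WriteLine(guid.ToString("D")); // digits separated by hyphens (the default)
Console.WriteLine(guid.ToString("B")); // hyphens, enclosed in braces
Console.WriteLine(guid.ToString("P")); // hyphens, enclosed in parentheses
Console.WriteLine(guid.ToString("X")); // four hexadecimal values enclosed in braces
Console.WriteLine(Guid.Empty);         // all zeros, like the -e option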

Examples

To get a single GUID/UUID, simply type:

guid

To get 3 random GUIDs/UUIDs, with letters in upper-case, enclosed in braces:

guid 3 -b -u

You can find the source code on GitHub, the package on NuGet and the latest builds on MyGet.

You can find a great list of more .NET Core Global Tools on GitHub, maintained by Nate McMaster.