Sebastian Nilsson

API Rate Limit HTTP Handler with HttpClientFactory

Most APIs have a rate limit of some sort. For example, GitHub has a limit of 5,000 requests per hour. This can partly be handled by limiting your usage, either by spacing out your requests to the API or by caching the results.

But what about when an API limits your requests per second? This is probably something you would want to handle somewhere central in your code, and not spread out across every place where you make an HTTP call to the API.


For me, the solution was to add an outgoing request middleware to the setup of the HttpClientFactory.

With this, I can just configure the startup services to use this RateLimitHttpMessageHandler-class with the HttpClientFactory:

services
    .AddHttpClient<IApi, Api>()
    .AddHttpMessageHandler(() =>
        new RateLimitHttpMessageHandler(
            limitCount: 5,
            limitTime: TimeSpan.FromSeconds(1)))
    .AddDefaultTransientHttpErrorPolicy();

This ensures that wherever I use the IApi class, through dependency injection, calls to the API will be limited to 5 per second.
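
For context, the typed client itself could look something like the sketch below. The GetUserAsync-method and the GitHub URL are just placeholders for this example; the important part is that the HttpClient is injected into the constructor:

using System.Net.Http;
using System.Threading.Tasks;

public interface IApi
{
    Task<string> GetUserAsync(string userName);
}

public class Api : IApi
{
    private readonly HttpClient _httpClient;

    public Api(HttpClient httpClient)
    {
        _httpClient = httpClient;
    }

    // The rate limiting happens transparently in the message handler
    public Task<string> GetUserAsync(string userName)
        => _httpClient.GetStringAsync($"https://api.github.com/users/{userName}");
}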

A simplified version of the code for the RateLimitHttpMessageHandler:

public class RateLimitHttpMessageHandler : DelegatingHandler
{
    private readonly List<DateTimeOffset> _callLog =
        new List<DateTimeOffset>();
    private readonly TimeSpan _limitTime;
    private readonly int _limitCount;

    public RateLimitHttpMessageHandler(int limitCount, TimeSpan limitTime)
    {
        _limitCount = limitCount;
        _limitTime = limitTime;
    }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        var now = DateTimeOffset.UtcNow;

        lock (_callLog)
        {
            _callLog.Add(now);

            while (_callLog.Count > _limitCount)
                _callLog.RemoveAt(0);
        }

        await LimitDelay(now);

        return await base.SendAsync(request, cancellationToken);
    }

    private async Task LimitDelay(DateTimeOffset now)
    {
        if (_callLog.Count < _limitCount)
            return;

        var limit = now.Add(-_limitTime);

        var lastCall = DateTimeOffset.MinValue;
        var shouldLock = false;

        lock (_callLog)
        {
            lastCall = _callLog.FirstOrDefault();
            shouldLock = _callLog.Count(x => x >= limit) >= _limitCount;
        }

        // Delay until the oldest logged call falls outside the rate limit window
        var delayTime = shouldLock && (lastCall > DateTimeOffset.MinValue)
            ? (lastCall - limit)
            : TimeSpan.Zero;

        if (delayTime > TimeSpan.Zero)
            await Task.Delay(delayTime);
    }
}
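
If you want to quickly try the handler outside of the HttpClientFactory setup, a rough sketch like the following shows the throttling in action. The target URL and the loop are just examples for illustration:

using System;
using System.Net.Http;

var handler = new RateLimitHttpMessageHandler(
    limitCount: 5,
    limitTime: TimeSpan.FromSeconds(1))
{
    InnerHandler = new HttpClientHandler()
};

var client = new HttpClient(handler);

for (var i = 1; i <= 10; i++)
{
    // From the 6th request onwards, the handler should delay to stay within 5 requests per second
    await client.GetAsync("https://api.github.com/");

    Console.WriteLine($"Request {i} sent at {DateTimeOffset.UtcNow:HH:mm:ss.fff}");
}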

Azure Storage Easy Web File-Hosting

In an ambition to improve my blog a little, I wanted to include more images in my posts, but I lacked a good solution for web file-hosting. To find the best fit, I put together a checklist of features and criteria. The solution I was looking for should, or must, check the following:

  • Ownership of the files: If the solution I use disappears tomorrow, I can still access my files.
  • Predictable URLs: The URL to the resources should never change. You don't want to have to update all your blog-posts or other external links floating around the internet.
  • Good tooling: Avoid slow web-uploads when uploading multiple and/or large files, but also easily get an overview of your files.
  • (Bonus) Pretty URLs: "Pretty" looking URLs are easier to check for copy-paste errors, but could also potentially benefit SEO.
  • (Bonus) Low or no cost: Since there are free services out there, paying for file-hosting must be worth it.

Non-fitting Alternatives

When I started evaluating alternatives, a while back, Flickr was still a thing. The problem was that I couldn't predictably link directly to an image I had uploaded. After running into all the obstacles, which indicated that using Flickr for image-hosting was being actively blocked, I realized it was too much of a hack.

Historically, I've been using Google's image hosting through Blogger, which is where this blog started out. The problem was that this also felt like a hack, and I was always worried that the URLs would change and I'd have to go through every single blog-post I've ever made and update all the images.

Services like Dropbox and Google Drive seem to actively try to block the use of their services for this, even if they are accessible through the web.

Azure Storage for easy web file-hosting

Enter Azure Storage, with its widespread adoption, familiar interface and extremely affordably priced Blob Storage. It checks off all the points on my checklist and more. Given its backing, it's fair to assume more functionality will be added on an ongoing basis.

Azure Blob Storage can be used to store data-blobs of almost any size in the Azure cloud. By providing a path/key, you can read or write the "file" at that "path". The overall performance of Azure Storage is great and an important feature of the service, and the simple mechanisms of Azure Blob Storage make it very fast.

Expose blob-content to the internet

So you could build a web-app which accesses your files on Azure Blob Storage and exposes them through URLs in your API, but you can also let Azure handle that for you, by activating anonymous read access to your blobs.

You can do this on container-level, so you can have a dedicated container for public files, separate from the other containers in the same Storage-account. These files will be read-only when you use the option Blob (anonymous read access for blobs only), found under the Access policy-section of the selected container.

Upload files

Then you can use the Azure Portal, programmatically use the Azure Blob Storage API to upload files, or use the Azure Storage Explorer application for a friendly GUI-experience to get started.

Azure Storage Explorer
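
To upload files programmatically, a minimal sketch using the current Azure.Storage.Blobs SDK could look like the one below. The connection string, container name and file names are just examples:

using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var container = new BlobContainerClient("<connection-string>", "public-files");

// Create the container with anonymous read access for blobs, if it doesn't already exist
await container.CreateIfNotExistsAsync(PublicAccessType.Blob);

// Upload a local file to a "path" inside the container
await container.UploadBlobAsync("images/example.png", File.OpenRead("example.png"));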

Now you can share your files anywhere you want, via the provided URL from Azure or through a prettier URL, using a custom domain.

Add custom domain

To fulfill the criterion of pretty URLs, you can set your own custom domain for an Azure Storage-account. If you do not do this, the default URL for Azure Blob Storage is https://{storage-account-name}.blob.core.windows.net/{file-path}.

Activate Azure CDN

This is a great way to start out with a small project, and you can easily transition in the future into using the full power of Azure Content Delivery Network (Azure CDN) on your existing Azure Storage-account, simply by activating it from the Azure CDN-section in your Storage-account.

dotnet-guid: Generate GUIDs/UUIDs with the Command Line

There have been a few projects, although not many, which I've been involved in where generating GUIDs/UUIDs has been important. For that, I used to use online tools like https://guidgenerator.com, but when I switched machines, the autocomplete in my browser might have lost the link to that tool.

Partly for this reason, and mostly for fun, I decided to write a .NET Core Global Tool which quickly generates one or multiple GUIDs/UUIDs, in whatever format might be needed.

Installation

Download the .NET Core SDK 2.1 or later. Then install the dotnet-guid .NET Global Tool using the command-line:

dotnet tool install -g dotnet-guid

Usage

Usage: guid [arguments] [options]

Arguments:
  Count         Defines how many GUIDs/UUIDs to generate. Defaults to 1.

Options:
  -?|-h|--help  Show help information
  -n            Formatted as 32 digits:
                00000000000000000000000000000000
  -d            Formatted as 32 digits separated by hyphens:
                00000000-0000-0000-0000-000000000000
  -b            Formatted as 32 digits separated by hyphens, enclosed in braces:
                {00000000-0000-0000-0000-000000000000}
  -p            Formatted as 32 digits separated by hyphens, enclosed in parentheses:
                (00000000-0000-0000-0000-000000000000)
  -x            Formatted as four hexadecimal values enclosed in braces,
                where the fourth value is a subset of eight hexadecimal values that is also enclosed in braces:
                {0x00000000,0x0000,0x0000,{0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00}}
  -e            Defines if the GUIDs/UUIDs should be empty, using zero-values only.
  -u            Defines if the GUIDs/UUIDs generated should be upper-cased letters.
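
These output formats mirror .NET's standard Guid format specifiers, so as a reference, the plain C# equivalents look like this (the mapping to the flags above is my reading of the descriptions):

using System;

var guid = Guid.NewGuid();

Console.WriteLine(guid.ToString("N")); // 32 digits, like -n
Console.WriteLine(guid.ToString("D")); // digits separated by hyphens, like -d
Console.WriteLine(guid.ToString("B")); // hyphens, enclosed in braces, like -b
Console.WriteLine(guid.ToString("P")); // hyphens, enclosed in parentheses, like -p
Console.WriteLine(guid.ToString("X")); // four hexadecimal values, like -x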

Examples

To get a single GUID/UUID, simply type:

guid

To get 3 random GUIDs/UUIDs, with letters in upper-case, formatted with braces:

guid 3 -b -u

You can find the source-code on GitHub, the package on NuGet and the latest builds on MyGet.

You can find a great list of more .NET Core Global Tools on GitHub, maintained by Nate McMaster.

dotnet-cleanup: Clean Up Solution, Project & Folder

We developers like to think of developing software as an exact science, but sometimes, you just need to wipe your source-files to solve some kinds of problems.

For .NET developers, there are many issues on Stack Overflow which are solved by just deleting your bin- and obj-folders. For people using Node.js, probably just as many answers contain the step of removing your node_modules-folder.

Those are some of the reasons why I created dotnet-cleanup, which is a .NET Core Global Tool for cleaning up a solution, project or folder. This was made easy by following Nate McMaster's post on getting started with creating a .NET Core global tool package.

Deleted files and folders are first moved to a temporary folder before deletion, so you can continue working with your projects while the tool keeps cleaning up in the background.

Installation

Download the .NET Core SDK 2.1 or later. Then install the dotnet-cleanup .NET Global Tool using the command-line:

dotnet tool install -g dotnet-cleanup

Usage

Usage: cleanup [arguments] [options]

Arguments:
  PATH                  Path to the solution-file, project-file or folder to clean. Defaults to current working directory.

Options:
  -p|--paths            Defines the paths to clean. Defaults to 'bin', 'obj' and 'node_modules'.
  -y|--confirm-cleanup  Confirm prompt for file cleanup automatically.
  -nd|--no-delete       Defines if files should be deleted, after confirmation.
  -nm|--no-move         Defines if files should be moved before deletion, after confirmation.
  -t|--temp-path        Directory in which the deleted items should be moved to before being cleaned up. Defaults to system Temp-folder.
  -v|--verbosity        Sets the verbosity level of the command. Allowed levels are Minimal, Normal, Detailed and Debug.
  -?|-h|--help          Show help information

The argument PATH can point to a specific .sln-file or a project-file (.csproj, .fsproj, .vbproj). If a .sln-file is specified, all its projects will be cleaned.

If it points to a folder, the folder will be scanned for a single solution-file and then for a single project-file. If multiple files are detected an error will be shown and you need to specify the file.

If no solution or project is found, the folder will be cleaned as a project.

Example

To clean up a typical web-project, you can specify the paths to be cleaned like this:

cleanup -p "bin" -p "obj" -p "artifacts" -p "node_modules"

You can find the source-code on GitHub, the latest builds on MyGet and the package on NuGet.

You can find a great list of more .NET Core Global Tools on GitHub, maintained by Nate McMaster.

LINQ Distinct-Method using Lambda Expression

If you've ever wanted to filter a collection for a distinct result, you probably know about the .Distinct extension-method in LINQ. It can be useful on simple data structures containing easily comparable objects, like a collection of strings or ints, but for more complex scenarios you need to pass in an IEqualityComparer. This is not very convenient.

It would be more convenient to be able to pass in a Lambda-expression, specifying which field you want to do the distinction by, like this:

var distinctItems = items.Distinct(x => x.Id);

To do this, you can add the following extension-methods to your projects:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

public static class DistinctExtensions
{
    // For IQueryable sources, e.g. Entity Framework
    public static IQueryable<TSource> Distinct<TSource>(
        this IQueryable<TSource> source, Expression<Func<TSource, object>> predicate)
    {
        // TODO: Null-check arguments
        return from item in source.GroupBy(predicate) select item.First();
    }

    // For in-memory collections
    public static IEnumerable<TSource> Distinct<TSource>(
        this IEnumerable<TSource> source, Func<TSource, object> predicate)
    {
        // TODO: Null-check arguments
        return from item in source.GroupBy(predicate) select item.First();
    }
}

The extension-method on IQueryable<T> works with ORMs like Entity Framework, while the one on IEnumerable<T> works with all types of collections, in-memory or otherwise, depending on the implementation.
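
To illustrate, here is a minimal in-memory usage sketch, with sample data made up for the example (and the extension-methods above in scope):

using System.Linq;

var items = new[]
{
    new { Id = 1, Name = "First" },
    new { Id = 1, Name = "Duplicate" },
    new { Id = 2, Name = "Second" }
};

// Keeps only the first item for each Id: "First" and "Second"
var distinctItems = items.Distinct(x => x.Id).ToList();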

Warning: Avoid using this with EF Core version 1.x or 2.0, since the .GroupBy-execution is always made in-memory. So you might get the whole content of your database loaded into memory. Only use it with EF Core 2.1 and above in production-scenarios.