Azure Storage Easy Web File-Hosting

In an effort to improve my blog a little, I wanted to include more images in my posts, but I lacked a good solution for web file-hosting. To find the best fit, I put together a checklist of features and criteria. The solution I was looking for should check off the following:

  • Ownership of the files: If the service I use disappears tomorrow, I can still access my files.
  • Predictable URLs: The URLs to the resources should never change; you don't want to have to update every blog post or other external link floating around the internet.
  • Good tooling: Avoid slow web uploads when transferring multiple and/or large files, and make it easy to get an overview of your files.
  • (Bonus) Pretty URLs: "Pretty" URLs are easier to check for copy-paste errors and could potentially also benefit SEO.
  • (Bonus) Low or no cost: Since there are free services out there, paying for file-hosting must be worth it.

Non-fitting Alternatives

When I started evaluating alternatives a while back, Flickr was still a thing. The problem was that I couldn't predictably link directly to an image I had uploaded. After running into obstacle after obstacle, which suggested that using Flickr for image hosting was being actively blocked, I realized it was too much of a hack.

Historically, I had been using Google's image hosting through Blogger, which is where this blog started out. The problem was that this also felt like a hack, and I was always worried that the URLs would change and I'd have to go through every single blog post I've ever written and update all the images.

Services like Dropbox and Google Drive seem to actively block this kind of use, even though their files are accessible through the web.

Azure Storage for easy web file-hosting

Enter Azure Storage, with its widespread adoption, familiar interface and extremely affordably priced Blob Storage. It checks off all the points in my checklist and more. Given Microsoft's backing, it's fair to assume more functionality will be added over time.

Azure Blob Storage can store data blobs of almost any size in the Azure cloud. By providing a path (key), you can read or write the "file" at that "path". The overall performance of Azure Storage is an important feature of the service in general, and the simple mechanics of Blob Storage make it very fast.
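
As a minimal sketch of that mechanism, here is roughly what writing and reading a blob by path looks like with the Python azure-storage-blob package (the connection-string environment variable, container name and blob path below are placeholders I've made up):

```python
# Minimal sketch: write and read a blob at a chosen "path" with the Python
# azure-storage-blob package. Connection string, container and blob names
# are placeholders, and the container is assumed to exist already.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client("blog-files")

# Write: upload a local file to the blob at this path
with open("header.png", "rb") as source:
    container.upload_blob(name="images/2019/header.png", data=source, overwrite=True)

# Read: download the same blob back as bytes
data = container.download_blob("images/2019/header.png").readall()
```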

Expose blob-content to the internet

You could build a web app that accesses your files in Azure Blob Storage and exposes them through URLs in your own API, but you can also let Azure handle that for you by activating anonymous read access to your blobs.

You can do this at the container level, so you can have a dedicated container for public files, separate from the other containers in the same Storage account. These files will be read-only when you use the option Blob (anonymous read access for blobs only), found under the Access policy section of the selected container.
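
If you prefer scripting this over clicking through the Portal, the same access level can be set when creating the container; here is a sketch with the Python azure-storage-blob package (the container name and connection-string variable are, again, placeholders):

```python
# Sketch: create a dedicated container whose blobs are publicly readable.
# The "blob" public-access level means anonymous read access for blobs only;
# the container's contents cannot be listed anonymously.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])
service.create_container("public-files", public_access="blob")
```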

Upload files

You can then upload files through the Azure Portal, programmatically through the Azure Blob Storage API, or with the Azure Storage Explorer application for a friendly GUI experience to get started; a small scripted example follows below.

Azure Storage Explorer
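
If you'd rather script larger uploads than drag-and-drop them, a loop like the following is one way to do it with the Python azure-storage-blob package (folder, container and blob names are made up for the example):

```python
# Sketch: upload every file in a local folder to the public container,
# preserving relative paths as blob "paths" and guessing content types so
# browsers render images directly. All names are placeholders.
import mimetypes
import os
from pathlib import Path
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client("public-files")

local_root = Path("./images")
for path in local_root.rglob("*"):
    if not path.is_file():
        continue
    blob_name = f"images/{path.relative_to(local_root).as_posix()}"
    content_type = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
    with path.open("rb") as source:
        container.upload_blob(
            name=blob_name,
            data=source,
            overwrite=True,
            content_settings=ContentSettings(content_type=content_type),
        )
    print(f"Uploaded {blob_name}")
```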

Now you can share your files anywhere you want, either via the URL provided by Azure or through a prettier URL using a custom domain.

Add custom domain

To fulfill the criterion of pretty URLs, you can set your own custom domain for an Azure Storage account. If you don't, the default URL for Azure Blob Storage is https://{storage-account-name}.blob.core.windows.net/{container-name}/{blob-path}.
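
As a hypothetical illustration of the difference (account, container and domain names are made up):

```python
# Hypothetical example: the same blob addressed via the default endpoint
# and via a mapped custom domain. Account, container and domain are made up.
account, container, blob = "mystore", "public-files", "images/2019/header.png"

default_url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
custom_url = f"https://files.example.com/{container}/{blob}"

print(default_url)  # https://mystore.blob.core.windows.net/public-files/images/2019/header.png
print(custom_url)   # https://files.example.com/public-files/images/2019/header.png
```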

Activate Azure CDN

This is a great setup for a small project to start out with, and in the future you can easily transition into using the full power of the Azure Content Delivery Network (Azure CDN) on your existing Azure Storage account, simply by activating it from the Azure CDN section of the Storage account.

Azure Deployment of Mono-Repos in GitHub

With the increasing adoption of microservices, one question that often comes up is mono-repo vs. multi-repo: do you structure your code as one large, single repository for all (or several) microservices, or as individual, small repositories for each microservice?

If you're using a mono-repo, with multiple projects in a single repository, and you want to deploy that single code base to multiple applications and services in Azure, there is a simple way to do it.

Kudu, the engine behind Git deployments in Azure App Service, has documentation on how to customize deployments, but only one of the documented approaches works well in the mono-repo scenario. The approach of using a .deployment file unfortunately locks the choice of project into your source code, so it won't work when different apps need different projects from the same repository.
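
For reference, that approach means committing a file named .deployment to the repository root, in Kudu's documented format, which pins the same project for every app deploying from the repo (the path is just an illustration):

```ini
[config]
project = src/ExampleProject.WebApp
```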

Instead, you use the Application settings available in every Azure App Service: add the key PROJECT with the path of the project you want deployed to that particular app. Since it's just an app setting, this works for Function Apps, Web Apps, API Apps and much more.

Example

If you have the following code-structure in your single repository:

  • /src/ExampleProject/ExampleProject.csproj
  • /src/ExampleProject.FunctionsApp/ExampleProject.FunctionsApp.csproj
  • /src/ExampleProject.WebApiApp/ExampleProject.WebApiApp.csproj
  • /src/ExampleProject.WebApp/ExampleProject.WebApp.csproj
  • /test/ExampleProject.Tests/ExampleProject.Tests.csproj

Then you'd set the PROJECT path in the Application settings of your different App Services like this:

  • Function App: "src/ExampleProject.FunctionsApp"
  • Web App: "src/ExampleProject.WebApp"
  • API App: "src/ExampleProject.WebApiApp"

The paths are relative to the Git repository's root. You can also point directly to the .csproj file if there is any risk of ambiguity within the same folder.

Once this is done, you can configure your Azure App Services for continuous deployment from the same repository, whether it lives in GitHub, Bitbucket, local Git or any other source supported by Azure.