Azure Storage is a quick and effortless way to store application data that is highly available, secure, scalable, and redundant. This blog post walks through a simple application that creates a short code for a long URL to make it easy to reference. It uses Table Storage to map codes to URLs and a Queue to process redirect counts. Everything is handled by serverless Azure Functions. The only prerequisite to build and run locally is Visual Studio 2017 15.5 or later with the Azure development workload installed. That workload automatically installs the Azure Storage Emulator, which you can use to program against tables, queues, blobs, and files on your local machine. You do not need an Azure account to run this on your machine.
Build and Test Locally with Function App Host and Azure Storage Emulator
You can download the source code for this project here.
Open Visual Studio 2017 and create a new “Azure Functions” project (the template is under the “Cloud” category). Pick a name like ShortLink.
In the next dialog, choose “Azure Functions v1”, select “Http Trigger”, pick “Storage Emulator” for the Storage Account, and set Access rights to “Anonymous.”
Right-click the name Function1.cs in the Solution Explorer and rename it to LinkShortener.cs. Change the function name to “Set” and update the code to use “href” instead of “name” as follows:
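If you started from the default HTTP trigger template, the updated function will look something like this sketch (the body is the stock v1 template with “name” swapped for “href”):

```csharp
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class LinkShortener
{
    [FunctionName("Set")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
        TraceWriter log)
    {
        log.Info("Set function processed a request.");

        // read the "href" query string parameter
        string href = req.GetQueryNameValuePairs()
            .FirstOrDefault(q => string.Compare(q.Key, "href", true) == 0)
            .Value;

        // fall back to the request body
        dynamic data = await req.Content.ReadAsAsync<object>();
        href = href ?? data?.href;

        return href == null
            ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass an href on the query string or in the request body")
            : req.CreateResponse(HttpStatusCode.OK, (string)href);
    }
}
```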
Hit F5 to run the function locally. You should see the function console launch and provide you with a list of URLs to access your function.
Access the end point from your web browser by copying and pasting the URL for the “Set” operation. You should receive an error message asking you to pass an href. Append the following to the end of the URL:
?href=https://developer.microsoft.com/advocates
You should see the URL echoed back to you. Stop debugging (SHIFT+F5).
Out of the box, the Functions template creates a function app. The function app hosts multiple functions, which are snippets of code that can be triggered by various events. In this example, the code is triggered by an HTTP/HTTPS request. Visual Studio uses attributes to declare the function name and specify the bindings. The log is automatically passed into the method so you can write logging information.
It’s time to add storage!
Table Storage uses a partition (to segment the data) and a row key (to identify a unique data item). The app will use a special partition of “1” to store a key that indicates the next code to use. The short code is generated by a simple algorithm that translates an integer to a string of alphanumeric characters. To store a short code, the partition will be set to the first character of the code, the row key will be the short code, and a target field will contain the full URL. Create a new class file and name it UrlKey.cs. Add this using statement:
using Microsoft.WindowsAzure.Storage.Table;
Then add the class:
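A minimal version of the key entity might look like this (the Id property name is my choice; it simply holds the next integer to encode):

```csharp
public class UrlKey : TableEntity
{
    // the next integer value that will be encoded into a short code
    public int Id { get; set; }
}
```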
Next, add a class named UrlData.cs, include the same “using” statement, and define the class like this:
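Something along these lines works, assuming a Url field for the target and a Count field for redirect statistics:

```csharp
public class UrlData : TableEntity
{
    // the full target URL to redirect to
    public string Url { get; set; }

    // how many times the short link has been followed
    public int Count { get; set; }
}
```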
Add the same using statement to the top of the LinkShortener.cs file. Azure Functions provides special bindings that take care of connecting to various resources. Modify the Run method to include a binding for the key and another binding that will be used to write out the URL information.
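Here is a sketch of the updated signature. The table name (“urls”) and the row key for the key entry (“KEY”) are names I chose for this example; the partition of “1” follows the convention described above:

```csharp
[FunctionName("Set")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
    [Table("urls", "1", "KEY")]UrlKey keyEntry,
    [Table("urls")]CloudTable urlTable,
    TraceWriter log)
```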
The Table attributes represent bindings to Table Storage. Different parameters allow behaviors such as passing in existing entries or collections of entries, as well as a CloudTable instance you can think of as the context you use to interact with a specific table. The binding logic will automatically create the table if it doesn’t exist. The key entry is automatically passed in if it exists, because the partition and row key are included in the binding. If it doesn’t exist, it will be passed as null and you can initialize it and store it as a new entry:
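A minimal sketch (the seed value of 1024 is an assumption; it happens to encode to “BNK” with the alphabet used further below, which matches the first code you will see when testing):

```csharp
// first run: the key entry doesn't exist yet, so create and store it
if (keyEntry == null)
{
    keyEntry = new UrlKey
    {
        PartitionKey = "1",
        RowKey = "KEY",
        Id = 1024
    };
    await urlTable.ExecuteAsync(TableOperation.Insert(keyEntry));
}
```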
Next, add the code to turn the numeric key value into an alphanumeric code, then create a new instance of the UrlData class.
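One possible encoder is shown below; the uppercase alphabet is an assumption chosen to keep the output consistent with the “BNK” example:

```csharp
private const string Alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";

// translate an integer into a short code, most significant "digit" first
private static string Encode(int value)
{
    var code = string.Empty;
    while (value > 0)
    {
        code = Alphabet[value % Alphabet.Length] + code;
        value /= Alphabet.Length;
    }
    return string.IsNullOrEmpty(code) ? Alphabet[0].ToString() : code;
}
```

Then, back inside the Run method:

```csharp
var code = Encode(keyEntry.Id);

var urlData = new UrlData
{
    PartitionKey = $"{code[0]}", // partition on the first character of the code
    RowKey = code,               // the short code itself
    Url = href,
    Count = 0
};
```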
The final steps for the redirect loop involve saving the data and updating the key. The response returns the code.
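The tail of the function might look like this (InsertOrReplace is used for the key update so the sketch works whether the key was just created or loaded by the binding):

```csharp
// save the new short code entry
await urlTable.ExecuteAsync(TableOperation.Insert(urlData));

// increment the key so the next request gets a fresh code
keyEntry.Id++;
await urlTable.ExecuteAsync(TableOperation.InsertOrReplace(keyEntry));

// hand the short code back to the caller
return req.CreateResponse(HttpStatusCode.OK, code);
```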
Now you can test the functionality. Make sure the storage emulator is running by searching for “Storage Emulator” in your applications and clicking on it. It will send a notification when it is ready. Press F5 and paste the same URL used earlier with the query string set. If all goes well, the response should contain the initial value “BNK”. Next, open “Cloud Explorer” (View -> Cloud Explorer) and navigate to local developer storage. Expand table storage and view the two entries. Note the id for the key has been incremented:
With an entry in storage, the next step is a function that takes the short code and redirects to the full URL. The strategy is simple: check for an existing entry for the code that is passed in. If it exists, redirect to the URL; otherwise, redirect to a “fallback” (in this case I used my personal blog). The redirect should happen quickly, so the short code is placed on a queue for a separate function to process statistics. Simply declaring the queue with the Queue binding is all it takes for the storage driver to create the queue and add the entry. You are passed an asynchronous collector so you can add multiple queue entries. Anything you add is automatically inserted into the queue. It’s that simple!
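Here is a sketch of the “Go” function, assuming the same “urls” table and a “counts” queue; the fallback URL is a placeholder you would replace with your own:

```csharp
[FunctionName("Go")]
public static async Task<HttpResponseMessage> Go(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "Go/{code}")]HttpRequestMessage req,
    string code,
    [Table("urls")]CloudTable urlTable,
    [Queue("counts")]IAsyncCollector<string> queue,
    TraceWriter log)
{
    // where to send the user if the code isn't found (placeholder fallback)
    var redirect = "https://example.com/";

    if (!string.IsNullOrWhiteSpace(code))
    {
        // partition = first character of the code, row key = the code itself
        var result = await urlTable.ExecuteAsync(
            TableOperation.Retrieve<UrlData>($"{code[0]}", code));

        if (result.Result is UrlData data)
        {
            redirect = data.Url;

            // queue the code so a separate function can update the stats
            await queue.AddAsync(code);
        }
    }

    var response = req.CreateResponse(HttpStatusCode.Redirect);
    response.Headers.Location = new System.Uri(redirect);
    return response;
}
```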
Run the project again, navigate to the new “Go” endpoint, and pass the “BNK” parameter. Your URL will look something like: http://localhost:7071/api/Go/BNK. You should see it redirect to the page you originally passed in. Refresh your Cloud Explorer and expand the “Queues” section. There should be a new queue named “counts” with a single entry (or more if you tried the redirect multiple times).
Processing the queue ties together elements of the previous functions. The function uses a queue trigger and is called once for each entry in the queue. The logic simply looks for a matching entry in the table, increments the count, then saves it.
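A sketch of the queue processing function, using the same table and queue names as above:

```csharp
[FunctionName("ProcessQueue")]
public static async Task ProcessQueue(
    [QueueTrigger("counts")]string code,
    [Table("urls")]CloudTable urlTable,
    TraceWriter log)
{
    // look up the entry that matches the short code
    var result = await urlTable.ExecuteAsync(
        TableOperation.Retrieve<UrlData>($"{code[0]}", code));

    if (result.Result is UrlData data)
    {
        // increment the redirect count and save it back
        data.Count++;
        await urlTable.ExecuteAsync(TableOperation.Replace(data));
        log.Info($"Processed {code}: count is now {data.Count}");
    }
}
```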
Run the project, and if your Storage Emulator is running, you should see a call to the queue processing function in the function app console. After it completes, refresh your Cloud Explorer. You should see the queue is now empty and the count has been updated on the URL in Table Storage.
Publish to Azure
It’s great to be able to run and debug locally, but to be useful the app should be hosted in the cloud. This step requires an Azure account (you can get one for free). Right-click on the ShortLink project and choose “Publish…”. Make sure “Azure Function App” and “Create New” are selected, then click the “Publish” button.
In the dialog, give the app a unique name (it must be globally unique so you may have to try a few variations). Choose “New” for the resource group and give it a logical name, then choose “New” for plan. Give the plan a name (I like to use the app name followed by “Link”), choose a region close to you and pick the “Consumption Plan” then press “OK.”
Click “Create” to create the necessary assets in Azure. Visual Studio will create the resources for you, build your application, then publish it to Azure. When everything is ready, you will see the message “Publish completed.” in the Output dialog for Build.
Test adding a link (replace “myshortlink” with your own function app name):
http://myshortlink.azurewebsites.net/api/Set?href=https://docs.microsoft.com/azure/storage/
Then test the redirect:
http://myshortlink.azurewebsites.net/api/Go/BNK
You can use the Storage Explorer to attach to Azure and verify the count.
But wait – isn’t Azure Storage supposed to be secure? How did this just work without me entering credentials?
If you don’t specify a connection string, all storage references default to the AzureWebJobsStorage connection key. This is the storage account created automatically to support your function app. In your local project, the local.settings.json file points to development storage (the emulator). When the Azure Function App was created, a connection string was automatically generated for the storage account. The application settings override your local settings, so the application was able to run against the storage account without modification! If you want to connect to a different storage account (for example, if you choose to use Azure Cosmos DB for premium table storage), you can simply add a new connection string and specify it as a parameter on the bindings and triggers.
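For reference, the local.settings.json generated by the template looks roughly like this; the UseDevelopmentStorage=true value is what points the bindings at the emulator:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true"
  }
}
```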
When you publish from Visual Studio, the publish dialog has a link to “Manage Application Settings…”. There, you can add your own settings including any custom connection strings you need, and it will deploy the settings securely to Azure as part of the publish process.
That’s all there is to it!
Conclusion
There is a lot more you could do with the application. For example, the application “as is” does not have any authentication, meaning anyone could access your link shortener and create short links. You may want to change the access to “Function level” for the “Set” function and secure the website with an SSL certificate to prevent anonymous access. For a more complete version of the application that includes logging, monitoring, and a web front end for pasting links, read Build a Serverless Link Shortener Faster than you can Finish your Latte.
The intent of this post was to illustrate how easy and effective the experience of integrating Azure Storage with your application can be. There are SDKs available to perform the same functions from desktop and mobile applications as well. Perhaps the biggest benefit of leveraging storage is the low cost. I run a production link shortener that processes several hundred hits per day, and my monthly cost for both the serverless function and the storage is less than one dollar. Azure Storage is both accessible and cost effective.
Here is the full project.
Enjoy!