This article was originally published at blogs.siliconorchid.com on 19-May-2019
This is part three of a three-part series that shows you a way to combine several serverless technologies, creating an online chat system that sends messages between users of WhatsApp and a real-time web app.
In part 1 of this series, we introduce you to the scenario and walk through the configuration of an entire solution, ready for local testing.
In part 2 of this series, we look at the various pieces of code that make up our solution in detail.
In part 3 of this series, we walk through the additional steps needed to set up and deploy the solution to the cloud.
In the previous articles of this series, we've tested the system locally and reviewed the code. In this final part of the series, we're now going to deploy the system to the cloud.
- In the first article of this series, we created and configured the SignalR Service in Azure. That's already set to go, so there is nothing further we need to do with this.
- We need to create and publish our Azure Functions to an App Service
- We need to create an Azure Storage account and publish the front-end web app to a Static Website.
- Return to the Azure Portal and navigate to the Resource Group that we created in Part One.
- Click the Add button (at the top) to add a new resource.
- Type "Function App" into the search bar and select "Function App" from the list (it should be at the top).
- Click the Create button.
- A "Function App Create" blade will now show. We need to provide some information:-
- Provide an app name - I used "WhatsAppChatFunction" for this demo (Azure appends ".azurewebsites.net" to form the full URL), but the name must be globally unique - so you will need to use your own name.
- Select your existing resource group from the dropdown.
- The form will prompt you to create a new Storage Account. This will be used to store the Build Artefacts (compiled code, etc.) that we will publish. These files will be used when provisioning instances of our Function. The automatically generated name is fine.
- Click the Create button - Azure will now provision the Function App.
With the Azure Function resource provisioned, select it so we can edit its values.
- Select the Platform Features (a tab along the top of the screen) to view an extended view of all the features we have access to.
- Under the "General Settings" heading, click the Configuration option.
We need to add some Application Settings. You'll see that the list of settings already contains some values - we can completely ignore these.
Click the New Application Setting button to add a new key. You will need to repeat this process for each of the four keys that we need to add. To make things easier, you can copy both the keys and their values directly from the `local.settings.json` configuration file in the solution. These are the four we need:-
Note that it's ok to enter these keys directly into the Azure application settings list. There is a potentially confusing inconsistency, in that the `local.settings.json` config file has these nested under a key called `Values` - but there is no need to include this when copying the values into Azure (i.e. you do not need to prefix the key names with `Values`).
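For reference, this is roughly the shape of the `local.settings.json` file - note how the settings sit inside the `Values` object. The key names below are illustrative placeholders, not necessarily the exact four keys from the demo; use the names from your own copy of the file:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureSignalRConnectionString": "<copy-from-azure-portal>",
    "TwilioAccountSid": "<copy-from-twilio-dashboard>",
    "TwilioAuthToken": "<copy-from-twilio-dashboard>"
  }
}
```

In Azure, each of these becomes a top-level Application Setting - e.g. the key is `AzureSignalRConnectionString`, not `Values:AzureSignalRConnectionString`.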
Although it's possible to write functions directly using the Azure Portal, we've already written all of our code in Visual Studio, so we'll be deploying our code from there.
With the Azure Function resource provisioned, configured and ready to go - let's go to Visual Studio so we can publish from there:-
- Right-click on the Azure Functions project in the Solution Explorer.
- Select Publish from the context menu (usually near the top).
- In the "Pick a publish target" dialogue, choose "Select Existing" [Azure App Service] and then click the Publish button.
- A new dialogue will appear, allowing us to choose the Resource Group and then the Function resource that you created earlier. I named mine `WhatsAppChatFunction` but you will have to choose your own.
- Ignore the "deployment slot" item and just keep the "Function Resource" highlighted.
- Click OK
Visual Studio will now go about deploying your code.
If you need to make a change to your code and re-publish, your "publish profile" will be retained on subsequent visits here, meaning you simply need to click the Publish button.
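As an alternative to Visual Studio, the same deployment can be sketched from the command line with the Azure Functions Core Tools - assuming the tools and the Azure CLI are installed, and using my demo's app name (substitute your own):

```shell
# Sign in to Azure, then build and publish the Functions project.
# Run this from the folder containing the Functions .csproj file.
az login
func azure functionapp publish WhatsAppChatFunction
```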
If you have purely static content, such as the website in this demo (or my blogging website), then there is a way to radically slash your hosting costs using Azure.
Your instinct may lead you down the path of creating resources such as "Web Apps", but there is a new feature in Azure which means you don't need to do this!
As of Dec 2018, Azure Storage now offers static website hosting.
- From your resource group, click the Add button to create a new resource.
- Search for "storage" and select "Storage Account".
- Click Create to start creating a new Storage Account resource.
- In the Create Storage account screen:-
- Select your existing resource group.
- Provide a unique name for the storage account. I chose "whatsappwebsitestorage", but you will need to provide your own.
- Choose a nearby location.
- Leave "Performance" as "Standard".
- Leave "Access Tier" as "Hot"
- Click Review and Create to proceed.
- You will be prompted to review your selections, but assuming there is no problem, carry on and create the new Storage Account and wait a few moments for the resource to be provisioned.
With the Storage Account resource created, find the option "Static Website" under the heading "Settings". The Azure portal presents a slightly overwhelming number of options, so check the screenshot below if you can't spot the option easily:-
In the dialogue that appears:-
- Switch the "Static website" option to "Enabled".
- In the text-entry for "Index document name" enter "index.html".
- You can also enter "index.html" into the "Error Document Path" option, simply as a default redirect back to the homepage, because we haven't created an actual default error page. If you were to improve the system, you could make a page such as "404.html" to contain an error message.
- Click Save
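If you prefer the command line, the static website feature can also be enabled with the Azure CLI - a sketch, assuming my demo's storage account name:

```shell
# Enable static-website hosting, serving index.html for both the index
# and the error document (matching the portal settings described above).
az storage blob service-properties update \
  --account-name whatsappwebsitestorage \
  --static-website \
  --index-document index.html \
  --404-document index.html
```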
Inside the Blobs section of this Storage Resource, the Azure Portal will have created a specially named Container called "$web". You can look at this yourself by:-
- Clicking "Overview" to take you back to the main dashboard page of the Storage Resource
- Click the prominent "Blobs" tile which will be displayed on the main dashboard (alongside the other tiles: "Files", "Tables" and "Queues").
You can click on this new "$web" container, which will list any blobs that are stored inside.
If the terminology is not familiar to you, think of a "Storage Account" as a kind of disk drive, a "Container" as a kind of "folder", and "Blobs" as the actual files.
You could, at this point, use the Azure Portal to upload files individually, but this would be slow and frustrating. Instead, we'll use a different tool for the job, which we'll talk about in the next section.
- Before we leave this Azure Portal page, make a note of the "Primary Endpoint" setting which will be displayed. This is the URL we need to visit in our browser.
- The Url will be different in your own project, but in my case, the Url is
When it comes to deploying files to an Azure Storage account, there are a number of ways we could tackle the problem. For example, in a more complicated Continuous Deployment system, we could get our build process to automatically drop files into the Storage Container for us.
For the purpose of this demonstration, we want to keep it simple and convenient to use.
We'll be using a Windows utility created by Microsoft called the "Azure Storage Explorer". Please follow the instructions in the following links:-
- You can download "Azure Storage Explorer" from here.
- You can find an Azure Storage Explorer : Getting Started guide here.
If you have followed the instructions, have installed "Azure Storage Explorer" and have provided your Azure credentials, you should be able to view your Azure Storage resources.
- In the tree structure, locate and click on the Storage Account you created for the website (mine was named "whatsappwebsitestorage").
- Note, you will also see the Storage Account that we created earlier in this demo, which is used to store the contents of our Azure Function code … make sure you don't get mixed up as to which Storage Account you should be looking at!
- Return to the file `...\WhatsAppSignalRDemo\src\WhatsAppSignalRDemo.Web\wwwroot\js\chat.js` and open it for editing.
Change the value of the constant that currently reads `http://localhost:7071/api/` to be the Url of your Azure Function Endpoint. You will have to use your own, but for me, this value would be
Using a regular Windows File Explorer:
- Navigate to the folder.
- Copy or drag-and-drop all of the files in the `wwwroot` folder into the main panel of the Container within the "Azure Storage Explorer".
- Note that the contents of `wwwroot` are just static content; there is no complication or cleverness required - just a simple file copy.
There are alternative ways to deploy a static website - for example, this tutorial shows you how to deploy directly from Visual Studio Code.
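Another command-line alternative: the Azure CLI can bulk-upload the static files to the `$web` container. A sketch, assuming my demo's storage account name and the repository's folder layout:

```shell
# Upload everything in wwwroot to the special $web container.
# Single-quote '$web' so the shell doesn't expand it as a variable.
az storage blob upload-batch \
  --account-name whatsappwebsitestorage \
  --destination '$web' \
  --source ./src/WhatsAppSignalRDemo.Web/wwwroot
```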
We've already done a lot of work by this point, but there are still a final few configuration changes that we need to make, in order for our published system to be able to work.
The Azure Function APIs have CORS restrictions that will stop our published Web App from being able to use them. We need to add our new Web App's URL to a whitelist.
In the Azure Portal, navigate through your resource group to your Azure Function App. In this demo, I called mine `WhatsAppChatFunction`.
Locate and click the "CORS" option, which is found under the heading "API".
- Add a new entry to the list of allowed hosts, to include the URL to your static website. For my demo, this host would be
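For reference, the same CORS entry can be added with the Azure CLI - a sketch using my demo's resource names (the resource group name and website URL below are placeholders; use your own):

```shell
# Whitelist the static website's origin on the Function App.
az functionapp cors add \
  --name WhatsAppChatFunction \
  --resource-group MyResourceGroup \
  --allowed-origins https://whatsappwebsitestorage.z6.web.core.windows.net
```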
We're almost there now. The final task we need to perform is to update the API endpoint stored in the Twilio dashboard and replace the callback with the URL of our published Azure Function.
You will need to use your own address, but for me, this was
Your system will now be good to go!
Messaging APIs. The decision to use one messaging service provider over another is largely a market decision. There are many companies who can provide solutions, so you will ultimately be comparing service costs against support and reputation. Twilio have emerged as a market leader; their ease of use and great documentation make them a compelling option.
Serverless Computing. We've presented the use of Azure in this series, but regardless of your preferred cloud provider (i.e. putting Google Cloud, AWS, Azure etc. on a level playing field), the case for using serverless functions instead of more conventional hosting platforms (such as regular VMs, or an Azure WebApp) is not necessarily clear-cut. That dreaded cliche of "it depends" still comes into play.
In scenarios where your service is not fully utilising your host platform, the cost savings of using serverless compute can be hugely significant.
An example of such a solution [that could make significant cost savings] could be an application that serves a domestic audience predominantly during business hours. That system would have a host machine that is effectively only being used for a third of the day and is sat largely dormant at other times. This is a waste of money.
In contrast, with a heavily-used system that has a global 24/7 usage pattern, you'll need to pay very close attention to the costs. Similarly, if stable and predictable costs are important, serverless computing may not be for you if choosing the consumption model (instead, choose the more familiar hosted model).
These articles are intended for someone taking their first steps with these technologies. They are not attempting to present an architecture that you should use in a commercial product. There are aspects that we have not covered, such as security and resilience, that you should seek to explore further.
There are some standout issues that I would highlight, which could be the subject of your next steps to improve this project.
The UI and APIs have no authentication. As presented, they are wide open to potential abuse (for example, there is nothing to stop an anonymous user sending messages to the API endpoints directly).
The individual functions communicate by calling HTTP endpoints. If there is a transient problem - e.g. the Function has not been used for a while and has "gone cold" (non-responsive) - that communication attempt could be lost.
A far better way to process tasks could be to use services such as Azure Event Hubs, Queues, Service Bus, or Event Grid. Also consider investigating the recently introduced Azure Durable Functions which accommodate the concept of chaining-together workflows of separate functions.
- For example, in this demo the `BroadcastSignalRMessageFunction` calls the `SendWhatsAppMessageFunction` by HTTP. A more resilient improvement would be to recode the `BroadcastSignalRMessageFunction` so that it adds a message to a queue. Subsequently, we could change `SendWhatsAppMessageFunction` so that it is a queue-triggered function.
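To make that idea concrete, a queue-triggered version of the function might be sketched like this (a rough illustration using the Azure Functions attribute bindings; the queue name and message shape are assumptions, not taken from the demo code):

```csharp
public static class SendWhatsAppMessageFunction
{
    // Runs automatically whenever a message arrives on the "outbound-whatsapp" queue,
    // instead of waiting to be called over HTTP by another function.
    [FunctionName("SendWhatsAppMessage")]
    public static void Run(
        [QueueTrigger("outbound-whatsapp", Connection = "AzureWebJobsStorage")] string queuedMessage,
        ILogger log)
    {
        log.LogInformation($"Dequeued: {queuedMessage}");
        // ...send the message via the Twilio API, as the HTTP-triggered version did...
    }
}
```

A nice side benefit: the storage-queue binding gives you automatic retries and poison-message handling, which helps with the "gone cold" problem described above.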
For now, I'll leave you with some other ideas of where you could take this project next:-
- SignalR Authentication, to prevent completely open and insecure access.
- Authenticated pages, where "username" can be automatically picked up from logged-in user instead of manually typing it.
- Explore using Azure CDN or Azure FrontDoor for ways to improve your static website (e.g. with your own custom domain, using free SSL).
- Microsoft: SignalR QuickStart Guide
- Microsoft: Tutorial: Get started with ASP.NET Core SignalR
- Azure SignalR Service now supports ASP.NET
- GitHub Issue : Dependency Injection support for Functions
- Microsoft : Use dependency injection in .NET Azure Functions
- DEV.to : Using Entity Framework with Azure Functions
- Azure Functions local.settings.json Secrets and Source Control
- App settings reference for Azure Functions
- Work with Azure Functions Core Tools - Local Settings File
No third party (i.e. Microsoft or Twilio) compensate me for my promotion of their services. However, my partner Layla Porter is an employee of Twilio Inc, in the capacity of a developer evangelist (and I very occasionally hang out with other Twilio evangelists) so I have a strong bias to recommend their services.
Thanks to Corey Weathers for hopping onto a video call with me to discuss parts of this project.
Thanks to Layla Porter for being my document reviewer. I don't have a team to back me up and sometimes you need another set of eyes!