DEV Community

Jeffrey T. Fritz

Posted on • Originally published at jeffreyfritz.com on

Blazor and .NET 8: How I Built a Fast and Flexible Website

I’ve been working on a new website for my series CSharp in the Cards. I built this website in a way that is easy to maintain, flexible, and, most importantly, responds quickly to requests from visitors. I knew that Blazor with .NET 8 had a static server rendering feature and decided to put it to the test. I recently published a new lesson to the website and included a WebAssembly component to allow paging and filtering of the list of lessons. I was pleasantly surprised when the performance dashboards on Azure showed that it was handling requests and responding very, very quickly.

Response times of C# in the Cards after the new episode

In this blog post, let’s talk about how I’ve optimized the website for speed and some of the finishing touches that you can put on your Blazor website to make it screaming fast running on a very small instance of Azure App Service.

Static Server Rendering – It’s Blazor, but easier

With .NET 8 there’s a new render mode for Blazor called static server rendering, or SSR. This render mode ditches the interactivity we previously had with Blazor Server and Blazor WebAssembly and instead favors delivering HTML and other content from the server to browsers at high speed. We can bolt on other techniques that we know from SEO and website optimization to make this even faster and deliver a great experience for our visitors.

@page "/About"
@attribute [OutputCache(Duration = 600)]

<PageTitle>C# in the Cards - About Us</PageTitle>

<HeadContent>
    <link rel="canonical" href="https://csharpinthecards.com/about" />
    <meta name="title" content="C# in the Cards - About Us">
    <meta name="description" content="C# in the Cards is an innovative video training series that teaches the fundamentals of C# programming language using a deck of cards. Join Microsoft instructor and Twitch streamer Jeff Fritz on this exciting journey.">
    <meta name="keywords" content="C#, Programming, Learning, Video Series, Jeff Fritz, Microsoft, Twitch, Education, Coding, Deck of Cards, Interactive Learning">
    <meta name="robots" content="index, follow">
    <meta property="og:title" content="C# in the Cards - About Us" />
    <meta property="og:description" content="C# in the Cards is an innovative video training series that teaches the fundamentals of C# programming language using a deck of cards. Join Microsoft instructor and Twitch streamer Jeff Fritz on this exciting journey." />
    <meta property="og:image" content="https://csharpinthecards.com/img/jeff.webp" />
    <meta property="og:url" content="https://csharpinthecards.com" />
    <meta name="twitter:card" content="summary_large_image">
    <meta name="twitter:site" content="@@csharpfritz">
    <meta name="twitter:title" content="C# in the Cards - About Us">
    <meta name="twitter:description" content="C# in the Cards is an innovative video training series that teaches the fundamentals of C# programming language using a deck of cards. Join Microsoft instructor and Twitch streamer Jeff Fritz on this exciting journey.">
    <meta name="twitter:image" content="https://csharpinthecards.com/img/jeff.webp">
</HeadContent>

<main>

  <h1>About C# in the Cards</h1>
  ...
</main>
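One wiring detail worth calling out: the `[OutputCache]` attribute on this page only takes effect when output caching is registered in Program.cs. A minimal sketch of that setup (using the ASP.NET Core 8 defaults):

```csharp
// Program.cs -- register the output caching services...
builder.Services.AddOutputCache();

var app = builder.Build();

// ...and add the middleware to the request pipeline
app.UseOutputCache();
```

Without both the service registration and the middleware, the attribute is silently ignored and every request re-renders the page.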

The About page is configured to output a collection of HTML head elements for the SEO folks and for social media sites to present good information about the page. Notice the elements added to satisfy the search engines:

  • a canonical link element that identifies where the page should be served from
  • a keywords meta element with information about what you can find here
  • a robots element that tells the search engine crawlers what they can do with the page
  • open graph and Twitter meta tags that instruct Twitter, Facebook, LinkedIn, Discord, and other sites about the images, titles, and description of the page

That’s fine… but there are two other features to notice:

  1. This is a static page with no data being presented. I’ve tagged it on line 2 with an attribute that enables output caching for 600 seconds (10 minutes). This way the web server doesn’t have to render a new copy when the page is requested within 10 minutes of a previous request.
  2. The images referenced are in WebP format. Let’s not overlook this super-compressed format for displaying high-quality images. It might be 2024, but every bit we deliver over the network still matters for performance, and the 600×600 portrait picture of myself on this page compressed nicely:
| Original (PNG) | Compressed (WebP) | Difference |
| --- | --- | --- |
| 450kb | 30kb | -93.3% |

93% savings… that’s crazy good, and it means your browser is not downloading an extra 420kb it doesn’t need.
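If you want to produce WebP files yourself, the `cwebp` tool that ships with Google’s libwebp can do the conversion; a sketch of the command (the quality setting and file names here are my own choices, not from the original post):

```shell
# convert a PNG portrait to WebP at quality 80 using libwebp's cwebp tool
cwebp -q 80 portrait.png -o portrait.webp
```

Most image editors and build pipelines can also emit WebP directly.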

Data is stored in static files on disk

For this simple website I don’t need a big, fancy database like SQL Server, Postgres, or even MySQL. I’ve stored all of the data in a simple CSV file on disk. That means I can edit the list of articles that are available, and the metadata that goes with them, by just opening the file in Excel and writing new content. When it comes time to read the list of available content, I’m only reading from a very small file on disk, so I don’t need to worry about any kind of contention. I also don’t need to worry about keeping a service running to deliver that data, because it comes out of a small, read-only file on disk.
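To make that concrete, here is a hypothetical shape for such a posts.csv file — the column names below are my assumption for illustration, not the actual schema from the site:

```csv
"Title","Slug","PublishedDate","Description"
"Lesson 1","lesson-1","2024-01-15","Getting started with C# and a deck of cards"
"Lesson 2","lesson-2","2024-01-29","Variables and types"
```

Each row becomes one lesson entry, and adding an episode is just appending a line.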

public class PostRepository
{

  public PostRepository(IWebHostEnvironment env, IMemoryCache cache)
  {
    Env = env;
    _Cache = cache;
  }

  public IHostEnvironment Env { get; }

  private const string POST_CACHE_KEY = "post_data";
  private readonly IMemoryCache _Cache;

  private IEnumerable<PostModel> GetPostsFromDisk()
  {

    // open the posts.csv file from the wwwroot folder using linq2csv
    var context = new LINQtoCSV.CsvContext();
    return context.Read<Post>(Path.Combine(Env.ContentRootPath, "wwwroot", "posts", "posts.csv"), new CsvFileDescription
    {
      SeparatorChar = ',',
      FirstLineHasColumnNames = true,
      QuoteAllFields = true,
      UseFieldIndexForReadingData = true,
      IgnoreUnknownColumns = true
    }).Select(p => (PostModel)p)
    .ToArray();

  }

  public PostModel[] GetPosts() 
  {

    return _Cache.GetOrCreate<PostModel[]>(POST_CACHE_KEY, entry => {
      entry.SlidingExpiration = TimeSpan.FromMinutes(30);
      return GetPostsFromDisk().ToArray();
    })!;

  }

}

In this repository class I use the LINQtoCSV library to read all of the content from the file into Post objects in the first method, GetPostsFromDisk. Later, in the public method GetPosts, you can see where I use the in-memory cache feature of ASP.NET Core to fetch the data from the cache if it’s available, or read it from disk and store it in the cache for 30 minutes. I could probably extend this timeout to several hours or even days, since the website doesn’t get any new content without a new version of the site being deployed.

The key here is that the metadata about the lessons on the site is loaded once and kept in memory. As of episode 9 of the series, the posts.csv file is only 1.4kb, so I have no worries about loading its entire contents into memory.
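The repository also needs to be registered with dependency injection so pages can request it. The lifetime below is my assumption rather than something stated in the post, but a singleton is a natural fit for read-only data:

```csharp
// Program.cs -- lifetime is an assumption; a singleton suits read-only,
// cached data because one instance can safely serve every request
builder.Services.AddSingleton<PostRepository>();
```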

Don’t forget, in order to add the MemoryCache to your ASP.NET Core application, you need to add this line to your site configuration in the Program.cs file:

builder.Services.AddMemoryCache();

I could add other cache options like Redis to the site, but with how small the data I want to cache is, I don’t need that sophistication at this point.

Pre-rendered Interactive WebAssembly Content is fast… REALLY fast

I wanted to add a subset of the lessons to the front page of the website so that you could see the latest six episodes in the video series and page back and forth through the other episodes. This needs to be an interactive component, but I still wanted the home page to render quickly and respond speedily as you page through the various episodes that are available. The natural way to do this with Blazor is to build a WebAssembly component that runs on the client and renders data as users click the paging buttons for that collection of articles.

I wrote a simple pager component that receives a collection of lesson data and renders cards for each lesson. Since we already know that the collection of lesson data is less than 2kb in size, I don’t have a problem sending the entire collection into the browser to be rendered.

<h2>Recent Lessons</h2>

<PagingPostCollection 
  Posts="Posts" PageSize="6" 
  @rendermode="InteractiveWebAssembly"
  SortAscending="false" />

When I use the @rendermode attribute in this markup, it forces the render mode to WebAssembly, and ASP.NET will pre-render and cache a version of the component’s resulting HTML with the home page. After viewers download the WebAssembly content, control is handed over to WebAssembly and it becomes a fully interactive component for them to work with.
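The internals of PagingPostCollection aren’t shown in the post, but a minimal sketch of such a component might look like the following. The parameter names match the markup above; everything else (including the PublishedDate property used for sorting) is my own assumption:

```razor
@* Hypothetical sketch of the pager -- the real component's internals may differ *@
<div class="row">
  @foreach (var post in CurrentPage)
  {
    <div class="col-md-4"><!-- card markup for each post goes here --></div>
  }
</div>

<button @onclick="() => _pageIndex--" disabled="@(_pageIndex == 0)">Newer</button>
<button @onclick="() => _pageIndex++" disabled="@(!HasMore)">Older</button>

@code {
  [Parameter] public IEnumerable<PostModel> Posts { get; set; } = Enumerable.Empty<PostModel>();
  [Parameter] public int PageSize { get; set; } = 6;
  [Parameter] public bool SortAscending { get; set; }

  private int _pageIndex = 0;

  // sort by publish date (property name is an assumption)
  private IEnumerable<PostModel> Sorted =>
    SortAscending ? Posts.OrderBy(p => p.PublishedDate)
                  : Posts.OrderByDescending(p => p.PublishedDate);

  // slice out the current page of posts
  private IEnumerable<PostModel> CurrentPage =>
    Sorted.Skip(_pageIndex * PageSize).Take(PageSize);

  private bool HasMore => Posts.Count() > (_pageIndex + 1) * PageSize;
}
```

Because all of the data is already in the browser, paging is instantaneous — no round trip to the server is needed.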

Lesson Pager on the C# in the Cards website

Blazor lets me build content to be rendered on the web, and I get to choose where exactly it should run: in the browser with WebAssembly, statically on the server, or interactively on the server if I want it to. In this case, running as WebAssembly gives a really nice usability effect that makes it easy for viewers to locate the content they want to watch.

Compress the Content from Kestrel

By default, content delivered from the ASP.NET Core Kestrel web server is uncompressed. We can add Brotli compression to the web server and deliver content in a much smaller package to our visitors with just a few simple lines of code in Program.cs. This is something that I think everybody should do with their Internet-facing websites:

#if (!DEBUG)
builder.Services.AddResponseCompression(options =>
{
  options.EnableForHttps = true;
});
#endif

// ...and after building the app, the middleware must also be added
// for compression to actually take effect:
#if (!DEBUG)
app.UseResponseCompression();
#endif

AddResponseCompression configures the server so that it delivers Brotli-compressed content. In this application I wrap it in the conditional DEBUG check because hot reload does not work with compression enabled. When the website is deployed to the production web host it runs in Release mode and compression is enabled.

Optimize all the JavaScript and CSS

CSS and JavaScript can be minified and combined to reduce the number and size of downloads for the static content that makes our websites look good. For this website I installed and used the WebOptimizer package available on NuGet. My configuration looks like the following:

builder.Services.AddWebOptimizer(pipeline => {
    pipeline.AddCssBundle("/css/bundle.css", new NUglify.Css.CssSettings
    {
        CommentMode = NUglify.Css.CssComment.None
    }, "app.css", "css/bootstrap.css", "style.css",
    "css/plugins.css", "css/colors.css", "css/responsive.css");
    pipeline.MinifyJsFiles("/js/jqueryCustom.js");
});

This configuration bundles the CSS files that were delivered with my website template and minifies the one JavaScript file that I manage in my project.
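WebOptimizer also needs its middleware added to the request pipeline so the bundle URLs resolve; per the package’s documentation, the call goes before the static files middleware:

```csharp
// WebOptimizer middleware must run before static files
// so that requests for /css/bundle.css are intercepted and served
app.UseWebOptimizer();
app.UseStaticFiles();
```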

Set long cache-control headers for static content

The last thing I did was set long-duration Cache-Control headers for static content like images, CSS, and JavaScript files. This is easy to do with a few more lines of optional configuration when setting up the static file feature in ASP.NET Core:

app.UseStaticFiles(new StaticFileOptions()
{
  OnPrepareResponse = r =>
  {
    string path = r.File.PhysicalPath;
    if (path.EndsWith(".css") || path.EndsWith(".js") ||
        path.EndsWith(".gif") || path.EndsWith(".jpg") ||
        path.EndsWith(".png") || path.EndsWith(".svg") || path.EndsWith(".webp"))
    {
      // 370 days, expressed in seconds for the Cache-Control header
      TimeSpan maxAge = new TimeSpan(370, 0, 0, 0);
      r.Context.Response.Headers.Append("Cache-Control", "max-age=" + maxAge.TotalSeconds.ToString("0"));
    }
  },
});

Summary

This website has been easy for me to build because I can rely on my normal HTML skills and the plethora of HTML templates and CSS libraries out there to make it look good. Blazor helps me make it interactive, render quickly, and grow as I add more content. My cost of running it on Azure is minimal, as I’m using a Basic B2 instance of Azure App Service running Linux to deliver this site.
