ASP.NET Static Site Generation

As hinted at in a previous post, I have transitioned from hosting dynamic web applications on a cloud hosting service (like AWS Elastic Beanstalk or Azure App Service) to hosting static websites in my object storage service of choice (Amazon S3 in this case). This post discusses my development process for a project like this.

Existing Projects

I have a number of projects that currently use this deployment paradigm. They are all some flavor of ASP.NET, developed locally against a local database, then exported to static files and uploaded to S3.

MorrisPhotos.com

My photography website, MorrisPhotos.com, is a static website hosted in S3 and pushed to the edge via Amazon CloudFront. I’ve got CloudFlare handling DNS and threat detection, and it all just works out really nicely.

LodiCornFest5k.com

A local race website, LodiCornFest5k.com, is also a static website hosted in S3. I don’t have CloudFront in front of this guy, as it’s less important to aggressively cache the data on the website.

OhioTrackStats.com

My pride and joy, OhioTrackStats.com, is a static website hosted in S3. There will be more blog posts about this one at a later date.


The Process

Local Development

I’m a .NET developer. This means I am most comfortable slinging some ASP.NET code, especially for local development. As .NET Core 2.1 is the de facto standard these days, all my new projects are moving toward it.

A simple example is my MorrisPhotos.com source. Maybe one day I’ll go into further detail on how I code, but that’s not the purpose of today’s post.

My local setup is pretty straightforward: I’ve got Visual Studio 2017 Community running on my home machine with ReSharper plugged in. Nothing special here with the setup. You could do all of this with VS Code instead, but I like the fully-featured IDE.

With respect to local development, I’m adding new features, fixing any issues, or just generally running the application locally on my home machine to see how it’s working. Nothing too special here, but this is the last place that the application is truly dynamic.

Local Database

I’ve got SQL Server 2017 Express running on my home machine. It’s nothing fancy, but it’s got the databases that drive each of the projects listed above. To access the databases in code, I typically use either Entity Framework Core or ServiceStack OrmLite. I don’t really have any performance requirements for the database access layer, so I’m basically writing bad code to access the data.

S3 Setup

In order to publish a website using S3, I follow this guide. It’s very self-explanatory, and I recommend going through each step in order to get everything set up correctly.

For example’s sake, I’ve currently got both a www.ohiotrackstats.com bucket and an ohiotrackstats.com bucket in S3 for the OhioTrackStats.com website. The www bucket redirects directly to the non-www bucket. This is the pattern I’ve got for each of my projects.
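For reference, the equivalent one-time setup through the AWS CLI would look roughly like the following (the index and error document names here are assumptions; use whatever your app actually serves):

rem Create both buckets (a one-time step)
aws s3 mb s3://ohiotrackstats.com
aws s3 mb s3://www.ohiotrackstats.com
rem Serve the non-www bucket as a static website
aws s3 website s3://ohiotrackstats.com/ --index-document index.html --error-document error.html
rem Redirect all requests on the www bucket to the non-www hostname
aws s3api put-bucket-website --bucket www.ohiotrackstats.com --website-configuration "{\"RedirectAllRequestsTo\":{\"HostName\":\"ohiotrackstats.com\"}}"

The objects themselves still need to be publicly readable for the website endpoint to serve them; that gets handled at upload time with the --acl flag shown later in this post.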

Domain Setup

As mentioned previously, I’ve got DNS in CloudFlare’s infrastructure. For MorrisPhotos.com, this looks like the following:

  • Domain registered through Google Domains
  • Name Servers set to CloudFlare’s name servers
  • A couple CNAME records in CloudFlare
    • morrisphotos.com -> non-www S3 bucket hostname
    • www.morrisphotos.com -> www S3 bucket hostname

Static Site Generation

This is where the real magic happens. Given a dynamic web application, I want to create a static HTML website that can be hosted directly through S3.

I personally use HTTrack Website Copier to crawl the locally-running ASP.NET web application and generate the HTML, CSS, and JavaScript files into a folder structure. The beauty of this tool is that it rewrites all the links on the site to relative paths, allowing the website to be served from anywhere a web server exists.

I make sure to exclude all external CSS and JS files (files hosted on external CDNs, like the Bootstrap files, for example), as I don’t want to be responsible for hosting their content on my website. Beyond that, the setup is very straightforward, and once I’ve got a site set up, I never have to update it.

In order to properly execute this step, I need to make sure I’m actively running the web application locally, and then I point the tool at the port that IIS Express is running on. I also make sure to run the application in Release mode in order to get the built-in benefits of bundling and minification.
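For example, a one-off crawl from the command line with a couple of CDN exclusions tacked on as scan rules might look roughly like this (the hostnames in the scan rules are just examples; the actual patterns depend on which external files your pages reference):

rem Crawl the locally-running site; a leading minus on a scan rule excludes matching URLs
"C:\Program Files\WinHTTrack\httrack.exe" "http://localhost:59141" -O "C:\My Web Sites\LodiCornFest5k" --update "-*bootstrapcdn.com/*" "-*googleapis.com/*"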

Deployment

Once I’ve got the static site generated, all I’ve got to do is copy the output to the non-www S3 bucket. I can do this via the AWS Console, but I’d rather not upload the entire website every time I make a small update.

To avoid those full uploads, I use the AWS CLI. Specifically, I have a batch file that runs the static site generator, then runs the S3 sync CLI command to push only the changed items. A sample batch script is below, with a description after it.

"C:\Program Files\WinHTTrack\httrack.exe" "http://localhost:59141" -O "C:\My Web Sites\LodiCornFest5k" --update
aws s3 sync "C:\My Web Sites\LodiCornFest5k\localhost_59141" s3://lodicornfest5k.com --acl="public-read"

The first line in the batch script runs the HTTrack software from the command line, pointing at http://localhost:59141, which is the IIS Express site that I set up for my LodiCornFest5k project in Visual Studio. The -O switch points the application at the mirror location, which is something I already set up in the GUI with the options I wanted. Finally, the --update switch ensures that it only writes files that have changed since the last time the tool ran.

The second line in the batch script is the upload line. It runs the aws s3 sync command, which has already been configured on my machine to use an IAM user that has write access to all my S3 buckets. This isn’t the best security, as I should probably limit that IAM user to just the buckets that I need, but it’ll do for now. I point it at the location of the files on the host machine (C:\My Web Sites\LodiCornFest5k\localhost_59141), and then the bucket location to push the files to (s3://lodicornfest5k.com in this case). In addition, I add the --acl="public-read" flag to the uploaded files to ensure that they can be viewed over the internet without authenticating, as my website is publicly available.
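For what it’s worth, tightening that up would just mean attaching a scoped-down inline policy to that IAM user, something roughly like this (the user name, policy name, and file name are all placeholders):

deploy-policy.json (grants list access on the bucket and write access on its objects):

{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "s3:ListBucket", "Resource": "arn:aws:s3:::lodicornfest5k.com" },
    { "Effect": "Allow", "Action": ["s3:PutObject", "s3:PutObjectAcl"], "Resource": "arn:aws:s3:::lodicornfest5k.com/*" }
  ]
}

aws iam put-user-policy --user-name static-site-deployer --policy-name lodicornfest5k-deploy --policy-document file://deploy-policy.json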

Again, since I’m generating the static site within this batch file, I need to make sure that I’m running the website locally. I’ve kicked this batch file off a number of times without doing that, and the whole thing borks pretty hard.

Future Enhancements

There is one major enhancement that I want to get to, but I haven’t tried it yet. Instead of manually running the batch file locally when I make changes, I would love for a CI server to do it instead. It would require me to package up the httrack.exe binary with my application. Then, upon a commit to the master branch, I would have a continuous integration tool like AppVeyor run a post-build command that is similar to the batch file itself. This would ensure that the website in S3 directly matches what’s in source control, which is the holy grail.
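Since I haven’t actually built this yet, take it as a sketch, but the post-build step would probably look a lot like the batch file above, with the build agent hosting the site itself: start the app with the .NET Core CLI instead of IIS Express, give it a few seconds to come up, crawl it, and sync. The project path, port, output folder, and wait time below are all assumptions.

rem Run the site on the build agent with Kestrel (the port is an assumption)
set ASPNETCORE_URLS=http://localhost:5000
start /b dotnet run --project src\LodiCornFest5k --configuration Release
rem Give the app a few seconds to start listening
timeout /t 20 /nobreak
rem Crawl the running site (assumes httrack.exe is packaged in the repo or on the PATH)
httrack "http://localhost:5000" -O "%TEMP%\LodiCornFest5k" --update
rem Push only the changed files to the bucket
aws s3 sync "%TEMP%\LodiCornFest5k\localhost_5000" s3://lodicornfest5k.com --acl="public-read"
rem (a real CI script would also need to stop the dotnet process afterward)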

I’ll probably get around to that one in the next year or so, so stay tuned!


Review

In all, the “do work” stage of building my custom web applications hasn’t really changed drastically with this new hosting model. I’m still running and testing locally whenever I make changes. Instead of pushing the built bits out to a web application server, though, I’m just pushing out the generated files. It’s a much more elegant and speedy solution for both me and the end user!

Cost Savings

Previously, I was using a hosted application platform for something like MorrisPhotos.com (Elastic Beanstalk, specifically). That meant I was paying for an EC2 instance and an RDS instance, plus some networking and other infrastructure costs, which added up to upwards of $100/month just to host and run the web application.

Now, with just the S3 buckets hosting the static website (and the thousands of photos in a separate S3 bucket), plus CloudFront as the CDN layer in front of the S3 bucket, I am paying less than $10/month. I don’t have the exact numbers at the moment, but we’re looking at roughly a 90% savings per month. I’d say this was absolutely worth it.

Time Savings

I don’t have hard numbers on the actual time savings for end users, but the general picture is this: instead of a web server (IIS in this case) receiving a request, handing it off to the web application to process, which hands off to the database to retrieve data, and then pushing the response back up the pipeline to the end user, I am simply giving the user a typically-cached HTML/JS/CSS/image file. The speed the end user sees is significantly faster, both in time to first byte (thanks to CloudFront’s edge locations, in MorrisPhotos.com’s case) and in total download time.

All in all, this new deployment and hosting paradigm has greatly improved everyone’s experience with my “dynamic” web applications, and it’s certainly something I’ll be doing from now on for web applications that need to be database-driven but are only updated every so often.