
Automated Static Site Hosting with Jekyll, Vagrant, S3, and CloudFront

I maintain this website, along with a few others, using Jekyll and S3. A common complaint about static sites is that they lack the ease of use you get from hosting a blog on something like WordPress. While we can't get quite all the way there, here's the process I use, which has worked well for me.

Vagrant

First, I use Vagrant to run everything in a local VM so that I don't have to install all the dependencies on my local machine. It also makes the setup easy to move to a different computer later.

Here is the Vagrantfile I use for this blog:

# -*- mode: ruby -*-
# vi: set ft=ruby :

Vagrant.configure(2) do |config|

  # Pulling from Ubuntu 14.04 LTS
  config.vm.box = "ubuntu/trusty64"

  # Give it a custom hostname
  config.vm.hostname = "blog"

  # Create a private network, which allows host-only access to the machine
  # using this specific IP. This way we don't need to forward custom ports.
  config.vm.network "private_network", ip: "10.10.10.11"

  # Share our main project folder with the VM
  config.vm.synced_folder ".", "/opt/blog", nfs: true, mount_options: ['rw', 'vers=3', 'tcp', 'fsc' ,'actimeo=1']

  # Disable the default vagrant shared folder
  config.vm.synced_folder ".", "/vagrant", disabled: true

  # Run our setup file
  config.vm.provision :shell, :path => "./vagrant-dependencies.sh"

  # Custom configuration for VirtualBox
  config.vm.provider "virtualbox" do |vb|

    # This may need to be set in order to get DNS to resolve on the client VM
    vb.customize ["modifyvm", :id, "--natdnshostresolver1", "on"]
    
    # This allows symlink creation in the shared folder, though the NFS mount above is probably what does most of the work
    vb.customize ["setextradata", :id, "VBoxInternal2/SharedFoldersEnableSymlinksCreate//opt/blog", "1"]
    
    # Customize the amount of memory on the VM:
    vb.memory = "2048"
  end

end

This puts all the files in /opt/blog on the VM and then runs vagrant-dependencies.sh, which is a bash script that looks like this:

#!/usr/bin/env bash

# Vagrant's shell provisioner runs this script as root,
# so there's no need for sudo in the commands below.

# Add Node.js to apt
curl -sL https://deb.nodesource.com/setup_4.x | bash -

# Add Ruby to apt
apt-add-repository -y ppa:brightbox/ruby-ng

# Add PPA for Java
add-apt-repository -y ppa:webupd8team/java

# Update apt and begin installing
apt-get -y update
apt-get -y upgrade
apt-get install -y build-essential libkrb5-dev git

# Install Node, Ruby
apt-get install -y nodejs ruby2.3 ruby2.3-dev

# Install Java
echo debconf shared/accepted-oracle-license-v1-1 select true | debconf-set-selections
echo debconf shared/accepted-oracle-license-v1-1 seen true | debconf-set-selections
apt-get install -y oracle-java8-installer

# Remove anything we don't need
apt-get -y autoremove

# Install global npm dependencies
npm install -g npm@latest gulp

# Install gems
gem update --system
gem install jekyll
gem install jekyll-gist
gem install s3_website
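
If you tweak this script later, you don't have to rebuild the VM from scratch; Vagrant can re-run the provisioner against the existing machine:

vagrant provision
# or reload the VM and re-provision in one step:
vagrant reload --provision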

I also keep a Makefile in the root of my project that looks like the file below. Note that because I'm using Vagrant I need to add --force_polling, since the NFS file share that Vagrant sets up does not support inotify-style file watching. You'll hit this annoyance with any build tool that has a watcher, including things like Webpack (a Webpack sketch follows the Makefile).

# Defaults so `make serve` works out of the box; the site is then
# reachable from the host at http://10.10.10.11:4000/
PORT ?= 4000
IP ?= 0.0.0.0

build:
	jekyll build

watch:
	jekyll build --watch --incremental --force_polling

serve:
	jekyll serve --port $(PORT) --host $(IP)

deploy:
	gulp && s3_website push

.PHONY: build serve deploy watch
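
The same workaround exists in other watchers. As a rough sketch (the interval value is just an example), enabling polling in Webpack would look like this in webpack.config.js:

// Poll for changes instead of relying on filesystem events,
// which don't propagate across the NFS share
module.exports = {
  // ...entry, output, loaders, etc.
  watchOptions: {
    poll: 1000 // check for changes once per second
  }
};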

gulp

From a gulp perspective, the only things I care about are copying the files into a final folder they'll be uploaded from and doing some light minification. The minification isn't strictly necessary now that the files are gzipped, but I've left it in anyway.

var gulp = require('gulp');
var htmlmin = require('gulp-htmlmin');
var del = require('del');

gulp.task('default', function() {

  // del returns a promise; returning it lets gulp know when the task finishes
  return del(['./_dist/**/*.*']).then(function() {
    console.log('Cleaned _dist folder');

    // Copy CSS and JS straight through
    gulp.src('./_site/css/*.css')
      .pipe(gulp.dest('./_dist/css'));

    gulp.src('./_site/js/*.js')
      .pipe(gulp.dest('./_dist/js'));

    // Minify the generated HTML and XML
    gulp.src(['./_site/**/*.html', './_site/*.xml'])
      .pipe(htmlmin({
        collapseWhitespace: true,
        removeComments: true
      }))
      .pipe(gulp.dest('./_dist'));

    // Copy images untouched
    gulp.src('./images/**/*.*')
      .pipe(gulp.dest('./_dist/images'));

  });

});

s3_website

Deployment of the files is made easier with a nice tool written in Ruby called s3_website. If you use the Vagrantfile above, it's installed automatically along with its dependencies, Ruby and Java.

Note that you will want to create a dedicated IAM user and lock down its permissions so that the credentials you define here can only touch this S3 bucket and CloudFront. That should limit the damage if they ever leak.
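
As a minimal sketch, such a policy might look like the following (the bucket name matches the config below; CloudFront invalidations don't support resource-level scoping, hence the wildcard):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::mdcox.net"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::mdcox.net/*"
    },
    {
      "Effect": "Allow",
      "Action": "cloudfront:CreateInvalidation",
      "Resource": "*"
    }
  ]
}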

This is what my s3_website.yml file looks like:

# These are your S3 Credentials
# You can also pull them from server environment variables
s3_id: MY_IAM_ID
s3_secret: MY_IAM_SECRET_KEY
s3_bucket: mdcox.net

# These are my settings, but see the documentation for more

max_age: 300

gzip: true

s3_reduced_redundancy: true

site: _dist/

cloudfront_distribution_id: MY_CLOUDFRONT_ID_FOR_MDCOX
cloudfront_invalidate_root: true
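
If you'd rather keep the credentials out of the file entirely, s3_website runs the config through ERB, so you can read them from environment variables instead:

s3_id: <%= ENV['S3_ID'] %>
s3_secret: <%= ENV['S3_SECRET'] %>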

Bringing It All Together

Now my process for updating this site or any other static site I manage looks like this:

  1. vagrant up
  2. vagrant ssh
  3. From within the SSH session, cd /opt/blog && make watch
  4. From within a new SSH session window, cd /opt/blog && make serve
  5. Edit the files locally to update as necessary. Changes will be shown at http://10.10.10.11:4000/
  6. When finished, make deploy from within an SSH session

The files will be uploaded to S3, and any files that changed will also be invalidated in CloudFront.
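
As a shortcut, vagrant ssh can also run a single command, so a deploy from the host machine is one line:

vagrant ssh -c "cd /opt/blog && make deploy"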
