
Using NuxtJS Server Middleware as a Proxy Pass

NuxtJS is an amazing tool for building SSR applications with Vue. There are so many benefits right out of the box, such as PWA support, automatic routing, plugins, and modules. One really powerful feature, which is less well documented, is server middleware.


By default, middleware in NuxtJS runs on both the client and the server. This is useful for common middleware like auth, where you need to check that a token is set in local storage or a cookie to confirm a user is authenticated. However, there are cases where you might want to mask an API from the client entirely.

Server Middleware

NuxtJS describes server middleware as "a connect instance that we can add our own custom middleware to. This allows us to register additional routes (typically /api routes) without need for an external server." So what exactly does that mean? 

With NuxtJS as your framework, you gain server and client support automatically. There are many instances where you may have environment variables that you want to read only on the server side and never expose to the client, such as an internal API URL or key. NuxtJS allows this through the use of server middleware.

Use Case

Imagine the following scenario: we have an internal API that users of the NuxtJS application will consume, but we do not want to expose the API to those users directly. In this case, we can create a server middleware that handles requests and proxy passes them to a server defined by an environment variable.

First, let's create a new folder to store the server middleware. NuxtJS uses /api in their example, but the folder name is completely up to you (although I would avoid placing it in /middleware/server, just from an organizational standpoint). Keeping with the NuxtJS documentation, create api/v1/index.js.

Note: don't forget to npm or yarn require the http-proxy package.

With the server middleware created, we need to register it in nuxt.config.js.
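Assuming the file lives at api/v1/index.js as above, the registration might look like this (serverMiddleware accepts plain paths or { path, handler } objects):

```javascript
// nuxt.config.js
export default {
  serverMiddleware: [
    // Requests to /api/v1/* never reach the Vue app;
    // they are handled by our proxy on the server only.
    { path: '/api/v1', handler: '~/api/v1/index.js' }
  ]
}
```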

Now any time we hit /api/v1, our requests are proxy passed to the API defined by the environment variable in the server middleware!

Note: it's important that you do not define the variable using the nuxt.config.js env property, as this publishes the values to both the server and the client.
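In other words, read the variable with process.env inside the server middleware itself, and keep it out of the config (API_URL is a hypothetical name):

```javascript
// nuxt.config.js -- don't do this; the env block is bundled
// into the client build, exposing the value to the browser:
// env: { API_URL: process.env.API_URL }

// api/v1/index.js -- do this instead; this file only ever
// runs on the server, so the value never ships to the client.
const target = process.env.API_URL
```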


A lot of credit goes to this video by Program with Erik.

My Docker setup for Craft CMS

I get asked this question often, so I figured I would document how I set up my Craft installations that are powered by Docker. There are three main components to my setup: a Dockerfile, a Makefile, and a docker-compose.yml.


The Dockerfile is the main "workhorse" of my flow; it should represent the entirety of the application and be as close to identical to production as possible. Some people opt for separate `Dockerfile` and `Dockerfile.production` files, but I personally find that a cumbersome workflow; especially as your team grows, someone will forget to update one and push a deployment.

I always keep my Dockerfile in "production" mode. This means I don't install Xdebug or other local development tools; for those requirements, I place the commands in the Makefile (e.g. `make xdebug`, which executes the installation steps inside the local Docker container). This flow also lets the CI/CD tools install Xdebug in the staging or test process when needed.
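The Dockerfile itself didn't make it into this archive. A trimmed-down sketch of the shape described here might look like the following; the image tag, paths, and extension list are illustrative, not my exact production file:

```dockerfile
FROM php:8.1-apache

# A subset of the extensions Craft CMS needs (illustrative).
RUN apt-get update \
    && apt-get install -y libpq-dev libzip-dev \
    && docker-php-ext-install pdo_pgsql zip \
    && rm -rf /var/lib/apt/lists/*

# Our document root is web/, not the image default,
# so we COPY in our own Apache vhost configuration.
COPY docker/vhost.conf /etc/apache2/sites-available/000-default.conf

COPY . /var/www/html
WORKDIR /var/www/html
```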

Apache Configuration File

This Dockerfile uses the PHP Apache image as a base; since we modify a few items (specifically the path to the document root), we need to add our own Apache configuration. You can see this in the COPY step in the Dockerfile above.

A lot of load-balanced applications will also need to handle the `X-Forwarded-Proto` header to properly detect HTTPS behind the load balancer.
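A minimal vhost file matching that setup could look like this; the paths are assumptions, and the SetEnvIf line is the X-Forwarded-Proto handling just mentioned:

```apache
<VirtualHost *:80>
    DocumentRoot /var/www/html/web

    <Directory /var/www/html/web>
        AllowOverride All
        Require all granted
    </Directory>

    # Behind a load balancer, TLS terminates upstream; trust the
    # forwarded header so PHP/Craft knows the request was HTTPS.
    SetEnvIf X-Forwarded-Proto "https" HTTPS=on
</VirtualHost>
```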

Docker Compose

This file is used for local development. Instead of running a long list of docker commands, the docker-compose file lets you "bootstrap" your application's dependencies, such as PostgreSQL and Redis.

There are a few key items of note in the docker-compose file.

  1. cached will increase performance when loading files between the host machine and the Docker image. Docker has a nice write up here
  2. volumes will create a volume that stores your PostgreSQL data between rebuilds. Since Docker containers are made to be ephemeral, this is critical to keeping your local data
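Put together, a docker-compose.yml for this setup might look roughly like the following; the service names, versions, and credentials are placeholders:

```yaml
version: "3.7"
services:
  web:
    build: .
    ports:
      - "8080:80"
    volumes:
      # cached relaxes host/container sync for faster file access.
      - .:/var/www/html:cached
    depends_on:
      - postgres
      - redis
  postgres:
    image: postgres:14
    environment:
      POSTGRES_DB: craft
      POSTGRES_PASSWORD: secret
    volumes:
      # Named volume keeps database data across container rebuilds.
      - db-data:/var/lib/postgresql/data
  redis:
    image: redis:6
volumes:
  db-data:
```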


Instead of writing a lot of local scripts and storing them in the code repository, I tend to use a Makefile. This lets me, or anyone on the team, add repeatable commands for everyone to use. Most targets handle tagging images and pushing to the Docker container registry, but the one command specific to Craft is `local`, which uses the Craft CLI to install your local development environment in one step.
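A stripped-down version of that Makefile might look like this; the image name and registry are placeholders, and `make local` assumes the Craft CLI is available inside the container:

```makefile
IMAGE ?= registry.example.com/my-craft-site
TAG   ?= latest

build:
	docker build -t $(IMAGE):$(TAG) .

push: build
	docker push $(IMAGE):$(TAG)

xdebug:
	docker-compose exec web pecl install xdebug

local:
	docker-compose up -d
	docker-compose exec web ./craft setup
```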

This is not a complete example, as it is missing a few build steps like testing.

One piece of Makefile syntax worth noting, because it is a little difficult to search for, is `SOMETHING ?= this`. This checks for the environment variable SOMETHING and, if it is not set, defaults it to `this`. This makes your CI/CD tooling really flexible, since each build step can define its own variables.
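Concretely, with hypothetical names:

```makefile
# Uses $TAG from the environment if set (e.g. by CI); otherwise "latest".
TAG ?= latest

print-tag:
	@echo $(TAG)
```

Running `make print-tag` prints `latest`, while `TAG=v2 make print-tag` prints `v2`.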

Increasing web performance locally with Lighthouse

A while ago, Google updated their PageSpeed Insights to start using Lighthouse for analyzing accessibility, performance, SEO, and progressive web apps. The way we used to calculate our score was to ship the application to a staging server and run PageSpeed Insights. That process was time consuming and most developers just gave up on it.

Running PageSpeed Insights locally


Most people don't know that you can run these checks locally and also get a list of items that need to be corrected. This means you can maximize your score locally without shipping to a publicly available server!
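One way to run the audit from the terminal is the Lighthouse CLI, the same engine behind PageSpeed Insights and Chrome DevTools; this assumes Chrome is installed and your dev server is running on port 3000 (adjust the URL to match):

```shell
# Install once globally, or run ad hoc with npx.
npm install -g lighthouse

# Audit the local dev server and open the HTML report in a browser.
lighthouse http://localhost:3000 --view
```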

DynamoDB under the hood


Tips for designing GraphQL schemas

Another excellent talk from GraphQL Summit.