Direct S3 File Uploads for NodeJS


I’m on a mission to document the most challenging parts of software development. Client-side uploads to S3 are absolutely top five. Navigating AWS permissions and troubleshooting worthless error messages is the absolute worst.

So why would you want to upload to S3? Well, it’s easy to accept file uploads in Node/Express, but then what? Hopefully your servers are disposable, so you need somewhere persistent to save the file. Furthermore, file uploads can kill your server – it’s much healthier to have the file go directly from the user to Amazon S3, with basically no stress on your server.

(By the way – check out the end of this article for some updated notes from 2022)

Step 1) Create Your S3 Bucket

First you need an S3 bucket with the proper settings to allow client-side uploads. Follow these steps:

  1. Go to this page and create a bucket
  2. Name: consider naming it a domain if you ever wish to serve files directly from there (e.g. s3.example.com)
  3. Once the bucket is created, go to Permissions > CORS Configuration and paste in an XML CORS configuration to allow client-side access to the bucket
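
The CORS configuration tells S3 which origins and HTTP methods browsers may use against the bucket. As a sketch (using the XML format the S3 console accepted at the time; newer consoles take the same rules as JSON), something like this allows browser uploads – tighten AllowedOrigin to your own domain in production:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <!-- Replace * with your site's origin, e.g. https://example.com -->
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
```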

Step 2) Setup Your AWS Permissions

This is the difficult part that most articles hand-wave. You’ll probably want to create a new user with just the permissions needed to upload files to the bucket. This will give you the tokens required:

  • Open IAM Management
  • Click “Add User”
  • Name: doesn’t matter, but I’d include the bucket name. (e.g. s3.example.com.user)
  • Choose “Programmatic access”
  • Choose “Attach existing policies” > Create Policy
    • Service: S3
    • Actions: DeleteObject, ListBucket, GetObject, PutObject
    • Resources > bucket > Add ARN > (enter bucket name)
    • Resources > object > Add ARN >
      • Bucket name: (enter bucket name)
      • Object name: (check Any)
    • Press Review
    • Name: I’d use s3.example.com.user.policy
    • Press “Create Policy”
  • Now on the Add User page, click the refresh icon
  • Search for your new policy’s name
  • Check the policy’s box and press Next until you save
  • Your user is created!
  • Make sure to copy the access key and secret. This is your only chance!
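
If you’d rather paste JSON than click through the visual editor, the policy produced by the selections above should look roughly like this (the bucket name is a placeholder – substitute your own):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::s3.example.com"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::s3.example.com/*"
    }
  ]
}
```

Note that ListBucket applies to the bucket ARN itself, while the object actions need the /* resource.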

Step 3) Setup Your Signature Route

To upload a file directly from a browser, your user will need to prove that you’ve given them permission. So here’s how that happens.

They choose a file to upload and their browser will tell the server what file (and what file type) they intend to upload. Then the server uses the secret access key to build a “signed” URL. The browser will use that URL to directly upload the file to S3.

There are so many convoluted code examples out there. Truth is, it’s very easy to sign a URL if you know what to do. Here’s how:

const express = require('express');
const app = express();
const aws = require('aws-sdk');

const s3 = new aws.S3({
  region: 'us-east-1',
  accessKeyId: '******************',
  secretAccessKey: '************************************',
});

// Returns a short-lived signed URL that lets the browser PUT one object
app.get('/uploader/sign', (req, res) => {
  const { key, type } = req.query;
  const url = s3.getSignedUrl('putObject', {
    Bucket: 's3.example.com',
    Key: key,
    Expires: 60, // seconds the signed URL stays valid
    ContentType: type,
  });
  res.send({ url });
});

Be sure to fill in the access keys and bucket name (from the first two steps). Also make sure you install the aws-sdk package from npm:

npm install aws-sdk --save

Step 4) Upload the File

First, I recommend using axios for making requests, so reference it from a CDN or install it from npm (if you’re using webpack).

Now you need a file input:

<input type="file" id="thefile" />
<button type="button" onclick="upload()">Upload</button>

Now when they click the upload button, fetch the signed URL and do the upload:

async function upload () {
  const file = document.getElementById('thefile').files[0];
  // Make sure a file is selected
  if (!file) return;
  // Fetch the signed url for this filename and content type
  const key = file.name;
  const query = `key=${encodeURIComponent(key)}&type=${encodeURIComponent(file.type)}`;
  const response = await axios.get(`/uploader/sign?${query}`);
  const url = response.data.url;
  try {
    // Attempt the upload; the Content-Type header must match the signed type
    const options = { headers: { 'Content-Type': file.type } };
    await axios.put(url, file, options);
  } catch (e) {
    alert(`Upload failed: ${e}`);
  }
}

This example uploads the file using the user’s filename to the root of your bucket. Probably not a good plan, but how you organize them is up to you. A common plan would be to put user uploads in their own folder then either rename the file or add an extra random directory in the middle so that multiple uploads with the same filename don’t clobber each other. Something like:

const random = [...Array(30)].map(() => Math.random().toString(36)[2]).join('');
const key = `uploads/${random}/${file.name}`;

Update: March 22, 2022

The last few weeks, I’ve been setting up a new project on S3. A few things have changed:

  • Never use dots in bucket names. AWS used to almost require you to name buckets like files.myapp.com because that was the only way to use the bucket for hosting a website. Now they have moved the bucket name from the path (e.g. https://s3.us-east-1.amazonaws.com/my.bucket.name/my-file.txt) to a “virtual-hosted” URL (e.g. https://my.bucket.name.s3.us-east-1.amazonaws.com/my-file.txt). With these new URLs, Amazon’s wildcard SSL certificate can’t cover the three levels (my, bucket, and name). So you’d be forced to serve your files over HTTP (bad) or proxy all the requests through CloudFront (expensive and annoying).
  • So how do you set up custom domains without dots? Use Cloudflare DNS and set your CNAME target to my-bucket-name.s3.us-east-1.amazonaws.com. Cloudflare will automatically serve your domain with SSL, and the traffic between Cloudflare and S3 will be encrypted using Amazon’s wildcard cert.
  • One other curious thing – CORS. If you use Cloudflare to proxy requests to your bucket (like I described above), you will get a CORS error when the browser tries the preflight OPTIONS check on your PutObject call. Why? S3 looks at the “Host” header to determine which bucket you’re fetching from, and your Host header will be the custom domain from the original request, not the bucket name. Cloudflare allows Page Rules to edit request headers, but only for Enterpri$e accounts. Instead, you can set up a free Cloudflare Worker to intercept the request, fetch it with the correct URL, then return the response.

One Reply to “Direct S3 File Uploads for NodeJS”

  1. Sir, you are the wind beneath my wings. At long last, an example that doesn’t bury the important stuff in mountains of “And here is how you create a Node.js project from scratch.” And it works!
