S3 upload with presigned url — React and NodeJS

DSL · Apr 11, 2021

I recently worked on a ticket whose task was to build a new feature allowing our users to upload their profile photos to S3. It is a seemingly easy task, but it can be very annoying when you hit errors or bugs; some of the errors can even be misleading, so you end up spending a lot of time going in the wrong direction. So I decided to write a post to share how I did it, what errors I saw, and how I solved them. If you are a new developer, or an experienced developer working on S3 uploads for the first time, this is the post for you.

There are different ways to do this. One way is to stream the file from the UI to the API, then upload it to S3 with the aws-sdk. This is neither efficient nor particularly secure, because you transfer the file twice, from the UI to the API and then from the API to S3, and it gets worse when the file is big. Another way is to use a presigned URL. You can read more details here.

So here are the 3 steps I took to get it to work.

1: Create an API endpoint that accepts the filename and filetype from the UI

2: The endpoint makes a request to S3 with aws-sdk to get a presigned URL and sends it back to the UI

3: UI makes a PUT request to S3 to upload the file with the returned presigned URL.

Let's dive into it!

Backend

First, let's set up an S3 bucket. I like to do it with the CLI, but you can definitely do so from the AWS console. If you are interested in reading more on how to do that with the CLI, you can read this.

aws s3api create-bucket --bucket profile-upload-test --region us-east-1
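If the call succeeds, the CLI should echo the new bucket's location, something like:

{
    "Location": "/profile-upload-test"
}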

Then make sure you add the following CORS policy. If you are new to S3 and CORS policies, you can read this for reference.

[
  {
    "AllowedHeaders": [
      "*"
    ],
    "AllowedMethods": [
      "PUT",
      "POST",
      "DELETE"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": []
  }
]
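If you prefer to stay in the CLI, you can apply the same rules with put-bucket-cors; note that the CLI expects the array wrapped in a CORSRules key:

aws s3api put-bucket-cors --bucket profile-upload-test --cors-configuration '{"CORSRules": [{"AllowedHeaders": ["*"], "AllowedMethods": ["PUT", "POST", "DELETE"], "AllowedOrigins": ["*"], "ExposeHeaders": []}]}'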

Then we need to set permissions: go to AWS IAM, create a user, and attach the following policy to it. If you are new to AWS IAM, you can read here.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "YourPolicyName",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::<BUCKETNAME>",
        "arn:aws:s3:::<BUCKETNAME>/*"
      ]
    }
  ]
}
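If you want to script this part as well, the equivalent CLI calls look roughly like this (the user name, policy name, and policy.json file are placeholders for whatever you choose):

aws iam create-user --user-name profile-upload-user
aws iam put-user-policy --user-name profile-upload-user --policy-name profile-upload-s3 --policy-document file://policy.json
aws iam create-access-key --user-name profile-upload-user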

Now that the S3 bucket is set up and operational, we can start on the API part: we need an endpoint that returns the presigned URL. I'll assume you already have a NodeJS and Express server running; if not, you can read this doc to set up a basic server before going further.
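For completeness, here is the kind of minimal server I am assuming; nothing in it is specific to this feature beyond JSON body parsing, and the port is arbitrary:

const express = require('express');
const app = express();

// Parse JSON request bodies so req.body.fileName and req.body.fileType are available.
app.use(express.json());

// ...the /presignedurl route from the next snippet goes here...

app.listen(3001, () => console.log('API listening on port 3001'));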

The following code is what the backend runs to obtain the presigned URL. Basically, it configures AWS with your credentials, then builds a params object and asks S3 for a presigned URL for that object.

const AWS = require('aws-sdk');
const S3 = require('aws-sdk/clients/s3');

app.all('/presignedurl', async (req, res) => {
  const { fileName } = req.body;
  // ...
  AWS.config.update({
    region: AWS_REGION,
    accessKeyId: AWS_ACCESS_KEY_ID,
    secretAccessKey: AWS_SECRET_ACCESS_KEY
  });
  const s3 = new S3();

  const s3Params = {
    Bucket: S3BucketName,
    Key: fileName,
    Expires: 60 * 60,        // the URL stays valid for one hour
    ContentType: 'image/*'
  };

  const url = await getPresignUrlPromiseFunction(s3, s3Params);
  if (url) return res.json({ url });
  // ...
});

You may ask why I wrap the request in a promise here, instead of just making the call with plain async/await.

function getPresignUrlPromiseFunction(s3, s3Params): Promise<string> {
  return new Promise((resolve, reject) => {
    try {
      // getSignedUrl is callback-based; resolve or reject from its callback.
      s3.getSignedUrl('putObject', s3Params, function (err, data) {
        if (err) {
          return reject(err);
        }
        resolve(data);
      });
    } catch (error) {
      return reject(error);
    }
  });
}

The reason I used getPresignUrlPromiseFunction to wrap the S3 request around a callback is that when I did it the other way, it only returned the base URL,

https://s3.us-west-2.amazonaws.com/

without the hash. A typical presigned URL should look more like

https://s3.us-west-2.amazonaws.com/<some hash here>
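For reference, a complete presigned URL carries the signature as query parameters, roughly in this shape (all the values below are illustrative):

https://s3.us-west-2.amazonaws.com/profile-upload-test/photo.jpg?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=...&X-Amz-Date=...&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=...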

With the wrapper, the complete URL comes back inside the promise. You can read my post How to Promisify a Callback and Resolve its Returned Data for more details on this technique.

I still do not fully understand why; perhaps it has something to do with my organization's AWS setup, or the call simply needs to be resolved through the callback. If you know why, please drop me a comment.
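One thing worth trying if you hit the same behaviour: depending on your SDK version, aws-sdk v2 also exposes getSignedUrlPromise. I have not verified it against my setup, so treat it as an option rather than the fix, but it may let you drop the wrapper entirely:

// Assumes a recent aws-sdk v2 release that ships getSignedUrlPromise.
const url = await s3.getSignedUrlPromise('putObject', s3Params);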

UI

Now that our API is functional, let's move on to the UI. For the UI I used ReactJS; if you use another framework, the logic should be similar. First we need a form with an input tag that accepts the photo (or any file) from the user, plus a submit button. The input gets an onChange handler and the form gets an onSubmit handler.

<form onSubmit={(e) => { e.preventDefault(); handleSubmit(); }}>
  ...
  <label>
    <input type="file" onChange={handleUploadChange}
      accept="image/*" hidden />
    <span>Add a photo</span>
  </label>
  <input type="submit" value="Submit" />
</form>

The following is the code for the onChange handler, whose purpose is to grab two things: the file and a link to it.

async function handleUploadChange(event) {
  if (event.target.files) {
    const file = event.target.files[0];
    const fileLink = URL.createObjectURL(file);
    // ...store file and fileLink in state...
  }
}

The file is an object describing the file's details, such as its name, type, and last-modified time. URL.createObjectURL is a built-in browser method that turns the file object into a temporary link of the form

blob:http://localhost:3000/<random uuid>

so you can put it in an image tag to display a thumbnail. Store the file and the file link in either local state or Redux, depending on your setup.

<img className={classes.profilePhoto} src={state.fileLink} alt="" />
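If you are not using Redux, a minimal local-state sketch could look like this (the state shape is just my assumption; adapt it to whatever your component already holds):

// assumes: import { useState } from 'react';
const [state, setState] = useState({ file: null, fileLink: '' });

// inside handleUploadChange, after creating the object URL:
setState({ file, fileLink });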

The next thing to do is push the file to S3 when the user submits the form, via the handleSubmit function. handleSubmit calls two helpers: getS3SignUrl and pushProfilePhotoToS3.

async function handleSubmit() {
  const data = await getS3SignUrl(state.file.name, state.file.type);
  if (data.url) {
    await pushProfilePhotoToS3(data.url, state.file);
  }
}

The getS3SignUrl helper makes a POST request to the presignedurl endpoint we built earlier and returns the presigned URL from S3.

export async function getS3SignUrl(fileName, fileType) {
  const headers = new Headers({ 'Content-Type': 'application/json' });
  const options = {
    method: 'POST',
    headers: headers,
    body: JSON.stringify({ fileName, fileType })
  };
  const response = await fetch(`${baseUrl}/presignedurl`, options);
  const presignedUrl = await response.json();
  return presignedUrl;
}

Then we use the returned presigned URL to PUT the file to S3; the pushProfilePhotoToS3 helper does that.

export async function pushProfilePhotoToS3(presignedUrl, uploadPhoto) {
  const myHeaders = new Headers({ 'Content-Type': 'image/*' });
  const response = await fetch(presignedUrl, {
    method: 'PUT',
    headers: myHeaders,
    body: uploadPhoto
  });
  return response;
}
If everything goes well, it should return status 200 and you should see the file uploaded to your S3 bucket.
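One small addition worth making (my own, not part of the handler above): fetch does not throw on HTTP error statuses, so check the status yourself to make failures easier to spot.

if (!response.ok) {
  throw new Error(`S3 upload failed with status ${response.status}`);
}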

Errors and Debugging

If you somehow get an error, the first thing to check is whether your S3 setup is wired up properly. Sometimes the UI throws a CORS error saying the origin is blocked, or you get a 500, a 403 access denied, or a 404 not found. My recommendation is to first make sure your AWS access key and secret are correct. If you have the AWS CLI set up and your keys configured locally, you can run the following to confirm they work.

aws sts get-caller-identity

If you get a sensible response, your keys are working. You can read more on this if you are new.
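That response is a small JSON blob identifying the credentials in use; it looks roughly like this (all values below are placeholders):

{
    "UserId": "AIDAXXXXXXXXXXXXXXXXX",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/profile-upload-user"
}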

Then make sure your S3 bucket is set up properly and reachable. For example, you can run the following to check that the bucket exists and that your credentials can reach it (it prints nothing on success and an error otherwise). You can read more here.

aws s3api head-bucket --bucket profile-upload-test

If it's none of those, log the presigned URL after the request on both the backend and the frontend, and check for anything undefined. If the URL does come back successfully, you can run the following in the terminal to try the upload manually (make sure the file is in your current directory, it is the same file you tried to upload earlier, and your local AWS environment is set up).

curl -X PUT -T file.jpg -L "your presigned URL"

If your keys are working, your S3 bucket is set up correctly, you get a presigned URL back, and running the command above gives you something like the following, that implies something is missing or misspelled in the headers.

<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>SignatureDoesNotMatch</Code>
  <Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
  <AWSAccessKeyId>ASIAWHYATVXXV24MOTYS</AWSAccessKeyId>
  <StringToSign>PUT...</StringToSign>
  <StringToSignBytes>50 55 54 0a 0a 0a 31 36 31 37 38 32 36 35 30 33 ...</StringToSignBytes>
</Error>

In my case, this error bugged me for an entire day. I thought the issue was with the access key and secret, but only later did I find out that I had spelled Content-Type as ContentType in the UI.

Above and Beyond

If you have successfully gotten it to work, congratulations! Here are a few more things you could consider doing on top of what we just covered. For example, you can do some image manipulation: resize the photo before it's uploaded so the size is standardized, or convert the format so all the images end up the same. It also helps to make sure that when a user uploads another photo later it replaces the last one; otherwise you will end up with many photos from the same user. For the resizing, check out react-image-file-resizer.
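As a rough illustration of what that can look like with react-image-file-resizer (the dimensions, quality, and output type below are arbitrary choices, not recommendations):

import Resizer from 'react-image-file-resizer';

// Wrap the callback-based API in a promise, same trick as on the backend.
const resizeFile = (file) =>
  new Promise((resolve) => {
    Resizer.imageFileResizer(
      file,      // the File object from the input
      300,       // max width
      300,       // max height
      'JPEG',    // output format
      90,        // quality
      0,         // rotation
      (uri) => resolve(uri),
      'blob'     // output type: base64 | blob | file
    );
  });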

Another tip: before you send the presigned URL back to the UI, also save the file name to the database (in my case, on the user profile table) so you can access it later. You could also write a function that generates the file name from the user ID and the file type, so you can always reconstruct it easily later.
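A hypothetical helper along those lines, so that repeat uploads overwrite the previous object instead of piling up:

// Hypothetical: derive a stable S3 key from the user id and MIME type.
function buildProfilePhotoKey(userId, fileType) {
  const extension = fileType.split('/')[1] || 'jpg';
  return `profile-photos/${userId}.${extension}`;
}

// e.g. buildProfilePhotoKey('42', 'image/png') -> 'profile-photos/42.png'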
