Automatically update your S3 static-site every time you push to your Bitbucket repository. Manual deployments are prone to error and a monotonous waste of time.
First we need an S3 bucket that will serve our static site, and an AWS user whose credentials Bitbucket can use to upload to S3.
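The Bitbucket user only needs enough access to deploy to this one bucket, so it is worth scoping its permissions down. A minimal IAM policy for that user might look like the sketch below — `your-s3-bucket-name` is the placeholder bucket name used throughout this example, and `s3:PutObjectAcl` is included because the sync step later sets a public-read ACL on uploaded files:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-s3-bucket-name"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-s3-bucket-name/*"
    }
  ]
}
```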
Bitbucket Pipelines are configured in YAML and run tasks when a user pushes to a branch the repository hosts. In our example, we will use Yarn to install our dependencies and build the site, then use the AWS CLI to push the result up to our S3 static site.
As neither Yarn nor the AWS CLI is pre-installed in the Bitbucket Pipelines environment, we will install both in our Pipeline script. The AWS CLI needs credentials for the user we created earlier in order to push to S3, so we will add these as Bitbucket environment variables.
Add your new AWS user’s credentials as environment variables in Bitbucket. Viewing your repository on bitbucket.org, go to Settings > Environment variables. Add variables named “AWS_ACCESS_KEY_ID” and “AWS_SECRET_ACCESS_KEY” with the values from the AWS user you set up earlier (AWS calls these values “Access key ID” and “Secret access key”). Make sure “AWS_SECRET_ACCESS_KEY” is set to “Secured” using the checkbox on the right; this prevents it from being output in logs. The AWS CLI reads both variables automatically, so no extra configuration step is needed in the pipeline.
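If you want the pipeline to fail fast with a clear message when the variables are missing (rather than failing later with a cryptic AWS error), a small guard like this sketch could be added at the top of the script section — the function name is just illustrative:

```shell
# Returns success only when both Bitbucket-provided variables are non-empty.
check_aws_creds() {
  [ -n "${AWS_ACCESS_KEY_ID:-}" ] && [ -n "${AWS_SECRET_ACCESS_KEY:-}" ]
}

if check_aws_creds; then
  echo "AWS credentials present"
else
  echo "Missing AWS credentials - add them under Settings > Environment variables"
fi
```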
Create a bitbucket-pipelines.yml file in the root of your repository. Below is an example Pipelines config. If you use this config, you will need to:
- adjust the Yarn task that builds your site (“yarn stage” in this example)
- adjust the folder your site is built to locally (“dist”)
- adjust the bucket name (“your-s3-bucket-name”)
Once that file is committed to master, every time a change is pushed to master Bitbucket will build your project and upload it to S3. Magic.
```yaml
image: node:6.9.4

pipelines:
  branches:
    master:
      - step:
          caches:
            - node
          script:
            # install Amazon CLI
            - apt-get update && apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awscli
            # install Yarn (for project dependencies)
            - curl -o- -L https://yarnpkg.com/install.sh | bash -s -- --version 0.18.1
            - export PATH=$HOME/.yarn/bin:$PATH
            # get project dependencies
            - yarn install
            # build project - adjust to whatever build script you run
            - yarn stage
            # sync files in "dist" folder with Amazon S3 bucket "your-s3-bucket-name"
            # (allow files to be publicly accessible and delete old files)
            - aws s3 sync dist s3://your-s3-bucket-name --acl public-read --delete
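Before committing the pipeline file, you may want to rehearse the deploy step locally. Assuming the AWS CLI is installed and your credentials are exported, the `--dryrun` flag lists the uploads and deletions sync would perform without actually touching the bucket:

```shell
# Dry-run the deploy step locally; skips gracefully if the prerequisites
# (the aws CLI and a built "dist" folder) are not present.
if command -v aws >/dev/null 2>&1 && [ -d dist ]; then
  aws s3 sync dist s3://your-s3-bucket-name --acl public-read --delete --dryrun
else
  echo "skipping: need the aws CLI and a built dist folder"
fi
```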