Terraform provider for BigCommerce

January 25, 2021

This weekend I created a Terraform provider for BigCommerce. The reason for doing so is that it lets you declaratively define certain aspects of your BigCommerce configuration in code. At the moment the provider only supports managing webhooks.

It looks like this:

terraform {
  required_providers {
    bigcommerce = {
      source = "ashsmith/bigcommerce"
      version = "0.0.4"
    }
  }
}

provider "bigcommerce" {
  client_id    = "your_stores_client_id" # env var: BIGCOMMERCE_CLIENT_ID
  access_token = "your_stores_access_token" # env var: BIGCOMMERCE_ACCESS_TOKEN
  store_hash   = "your_store_hash" # env var: BIGCOMMERCE_STORE_HASH
}

resource "bigcommerce_webhook" "orders_webhooks" {
  # The scope of events to listen for.
  scope = "store/order/*" # listens to all order events.

  # Where the webhook event payload will be sent:
  destination = "https://api.myawesomeapi.com/webhooks"

  is_active = true
}

Why define webhooks as Terraform resources?

This allows us to declaratively define which events are being consumed, which makes it easier to reason about our application's infrastructure and its dependencies. Additionally, the same Terraform code can be applied to multiple environments, ensuring they are set up exactly the same.
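One lightweight way to do that is to parameterise the store-specific values. The variable names and tfvars files below are purely illustrative (they aren't part of the provider), and this assumes the client ID and access token are supplied via the environment variables mentioned above:

variable "store_hash" {
  description = "The BigCommerce store hash for this environment"
  type        = string
}

variable "webhook_destination" {
  description = "Where webhook payloads should be delivered"
  type        = string
}

provider "bigcommerce" {
  store_hash = var.store_hash
  # client_id and access_token supplied via environment variables
}

resource "bigcommerce_webhook" "orders_webhooks" {
  scope       = "store/order/*"
  destination = var.webhook_destination
  is_active   = true
}

Each environment then gets its own tfvars file (for example staging.tfvars and production.tfvars), and running terraform apply -var-file=staging.tfvars provisions an identical configuration against the relevant store.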

We can take this a step further by combining it with other Terraform resources. For example, we can set the destination of our webhook to point directly at an HTTP-triggered GCP Cloud Function that will consume the webhook events:

# (I have omitted other GCP resources here for brevity)
# Full example of GCP Cloud Function: 
# https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/cloudfunctions_function#example-usage---public-function
resource "google_cloudfunctions_function" "function" {
  name        = "function-test"
  description = "My function"
  runtime     = "nodejs10"

  available_memory_mb   = 128
  source_archive_bucket = google_storage_bucket.bucket.name
  source_archive_object = google_storage_bucket_object.archive.name
  trigger_http          = true
  entry_point           = "testing"
}

resource "bigcommerce_webhook" "orders_webhooks" {
  scope       = "store/order/*"
  destination = google_cloudfunctions_function.function.https_trigger_url
  is_active   = true
}

Just like that, we'll have all order events sent to a newly provisioned Cloud Function. This is repeatable and, as mentioned before, can be applied to multiple environments to ensure consistency in our configuration.
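If you want to confirm what was provisioned without opening the BigCommerce control panel, you can surface the webhook's attributes as Terraform outputs. This is a minimal sketch; it assumes the resource exposes the standard id attribute alongside the destination argument shown above:

output "orders_webhook_id" {
  description = "ID of the webhook registered with BigCommerce"
  value       = bigcommerce_webhook.orders_webhooks.id
}

output "orders_webhook_destination" {
  description = "URL the webhook payloads are delivered to"
  value       = bigcommerce_webhook.orders_webhooks.destination
}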

The source code is available on GitHub, and the provider is published in the Terraform Registry.