Python 100 project #52: Cost Notification for GCP

I used to use AWS quite often, and I built a cost notification with Python on AWS Lambda and the Slack API. These past few months, however, I have been using GCP rather than AWS for personal reasons. Hence I created (almost) the same notification using Google Cloud Functions and the Slack API.

Though the objective is the same, the required steps are quite different. The biggest differences are:

  • No native API is available for cost data … there is a Billing API, but it covers pricing and billing account info only.
  • No built-in cron job is available for Cloud Functions … there is a workaround using GAE, but it is too much for this purpose.

So I adjusted the data and notification flow to fit my use: the daily billing export drops a file into a Cloud Storage bucket, the file finalization triggers a Cloud Function, the function queries the exported cost data in BigQuery, and the result is posted to Slack. It looks like the image below.

Let’s look at how it is coded.

Set Up Billing Data Export


1. Go to Billing -> Billing Export.

2. Enable File Export. Note the Bucket Name you use here; the other options (prefix, format) do not matter for the notification.

3. Enable BigQuery Export. Note the dataset name. (You can verify the export with the small check below.)
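
If you want to confirm that the export is actually landing in BigQuery, a minimal check is to list the tables in the dataset you just noted. This is only a sketch: "YOUR_DATASET" is a placeholder, and the export table typically shows up with a name like gcp_billing_export_v1_....

from google.cloud import bigquery

# Minimal sketch: list the tables in the billing export dataset.
# "YOUR_DATASET" is a placeholder for the dataset noted in step 3.
client = bigquery.Client()
for table in client.list_tables("YOUR_DATASET"):
    print(table.table_id)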

Create Cloud Function


1. Create a Cloud Function. Use Finalize/Create for Event Type, and use your Bucket Name for Bucket.

2. We use the following three files:

[ requirements.txt ]

# to query the exported cost data in BigQuery
google-cloud-bigquery
# for the HTTP POST to the Slack API
requests

[ slack.py ]

import requests

# Slack bot token (needs chat:write; chat:write.customize to override the username)
token = "YOUR_TOKEN"

def post(msg, channel, hostname):
    """Post msg to the given Slack channel, shown under the given username."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-type": "application/json",
    }
    payload = {
        "text": msg,
        "channel": channel,
        "as_user": False,
        "username": hostname,
    }
    url = "https://slack.com/api/chat.postMessage"
    resp = requests.post(url, json=payload, headers=headers)
    resp.raise_for_status()
    return resp.status_code
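
As a side note, you can sanity-check the Slack side locally before wiring it into the Cloud Function. This is just a sketch: "#general" and "LocalTest" are example values, and it assumes slack.py sits in the current directory with a valid token.

import slack

# Post a test message; channel name and username here are examples only.
status = slack.post("Test message from the cost notifier", "#general", "LocalTest")
print(status)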

[ main.py ]

from datetime import datetime
import os

from google.cloud import bigquery
import requests

import slack

SLACK_CHANNEL = os.environ['SLACK_CHANNEL']

def get_monthly_cost(month, year):
    """
    Get the monthly accumulated cost of GCP services.
    input: month - int, year - int
    return: iterable of BigQuery rows, each with the fields ["currency", "accumulated_cost"]
    """
    bq_client = bigquery.Client()
    # fully-qualified billing export table, e.g. "your_dataset.gcp_billing_export_v1_XXXXXX"
    table_id = "YOUR_DATASET.YOUR_BILLING_EXPORT_TABLE"
    # invoice.month is a "YYYYMM" string, so the month needs zero-padding
    query = (
        f'SELECT SUM(cost) as accumulated_cost, currency FROM `{table_id}` '
        f'WHERE invoice.month = "{year}{month:02d}" '
        'GROUP BY currency'
    )
    query_job = bq_client.query(query)
    costs = query_job.result()

    return costs


def main(event, context):
    month, year = datetime.today().month, datetime.today().year
    costs = get_monthly_cost(month, year)
    content = f"Estimated cost of Y{year}M{month} is as follows:\n"
    for cost in costs:
        content += f"{cost['currency']}: {round(cost['accumulated_cost'], 2)}\n"
    try:
        slack.post(content, SLACK_CHANNEL, "GoogleCloudFunction")
        print(f"Message posted to {SLACK_CHANNEL}")
    except requests.exceptions.RequestException as e:
        print(f"Request failed: {e}")

See the Notification in Action


Once deployed, a notification should be sent to the specified Slack channel.

Because this function is triggered by the billing export to the Cloud Storage bucket, it runs about once a day. The exact timing seems to vary, but timing accuracy is not important for my purposes, so that is acceptable.