
gSlack: Integrating Google Cloud Platform with Slack

One of the perks I really enjoy in my work as CTO at DoiT International is my day-to-day conversations with our customers. They are often enlightening, and I always learn something new.

Last week, I noticed a ticket from one of our clients asking what would be the best way to integrate Google Cloud Platform with Slack.

Specifically, they wanted to be notified in their Slack channel when some activity took place in one of their Google Cloud projects: for example, an instance being started or terminated, or a new bucket being created or deleted. Isn't that a nice idea? A quick Google search didn't reveal any immediate results, so I had to figure out the quickest and simplest way of achieving a Slack and Google Cloud Platform integration.

Fortunately, there is an elegant, completely serverless solution, which I will try to explain in this post. Even better, today we are open-sourcing gSlack, which you can deploy in your own GCP project in just a few minutes to get instant, flexible notifications in your Slack channel.

gSlack uses Stackdriver Logging, Google's centralized logging platform, which allows you to store, search, analyze, monitor, and alert on log data and events from Google Cloud Platform (and Amazon Web Services). Most of Google's cloud services send their logs to Stackdriver Logging. I was specifically interested in “Activity Logs”, which record changes in the Google Cloud Platform environment.

Here is how it looks in the Google Cloud Console:

Google Stackdriver Logging UI
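
To make the examples below concrete, here is a trimmed, illustrative sketch of an activity log entry (the field values are made up, and real entries carry many more fields):

{
  "logName": "projects/doit-playground/logs/cloudaudit.googleapis.com%2Factivity",
  "protoPayload": {
    "serviceName": "compute.googleapis.com",
    "methodName": "v1.compute.instances.start",
    "resourceName": "projects/doit-playground/zones/us-central1-a/instances/demo-1",
    "authenticationInfo": { "principalEmail": "someone@example.com" }
  },
  "resource": {
    "labels": { "project_id": "doit-playground", "zone": "us-central1-a" }
  },
  "operation": { "last": true }
}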

One of the nicest features of Stackdriver Logging is its ability to automatically export new log entries to Google Cloud Storage, Google BigQuery and Google Pub/Sub. You just configure an ‘export’ and everything else magically happens for you.

I needed a transport to relay log entries from Stackdriver Logging to Slack, and Pub/Sub looked like the best fit for this use case. If you are not familiar with Pub/Sub, it is Google's fully-managed real-time messaging service that allows you to send and receive messages between independent applications.

Setting up a Pub/Sub export is as easy as configuring a log filter (I only need ‘activity’ logs, hence the logName=”projects/doit-playground/logs/cloudaudit.googleapis.com%2Factivity”) and the name of the Pub/Sub topic to push the messages to:

Pub/Sub Sink Configuration
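
If you prefer the command line, an equivalent sink can be created with gcloud (the sink and topic names here are placeholders; you will also need to grant the sink's writer identity permission to publish to the topic):

gcloud logging sinks create gslack-sink \
    pubsub.googleapis.com/projects/doit-playground/topics/gslack-topic \
    --log-filter='logName="projects/doit-playground/logs/cloudaudit.googleapis.com%2Factivity"'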

So now, every time there is a new entry in Stackdriver Logging, it will automatically be pushed to my Pub/Sub topic. Pretty nice, right?

Next, I needed some ‘glue’ between Pub/Sub and Slack, so that new messages published to Pub/Sub would be posted as Slack notifications. Luckily, Google now has Cloud Functions (in beta): a serverless environment to build and connect cloud services. Basically, you write a ‘function’ in NodeJS which is triggered by one of the supported triggers, such as a new file in a bucket, an HTTP request or (I bet you've already guessed!) a new message in a Pub/Sub topic!
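
A Pub/Sub-triggered function receives the log entry base64-encoded inside the message payload. Here is a minimal sketch of such a function (the function name and logging are illustrative, not gSlack's actual code):

// index.js: a background function triggered by a Pub/Sub topic
exports.gslack = function (event, callback) {
  // The Stackdriver log entry arrives base64-encoded in the message payload
  const logEntry = JSON.parse(Buffer.from(event.data.data, 'base64').toString());
  console.log('Received log entry from', logEntry.logName);
  callback(); // signal completion back to Cloud Functions
};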

Our complete flow looks like this: Stackdriver Logging logs a certain activity in our project and automatically sends it to a Pub/Sub topic, which in turn invokes a Cloud Function that sends a message to the Slack channel using Slack's official NodeJS SDK.
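
Posting the message to Slack is then just a couple of lines with the SDK. A sketch, assuming the API token is supplied through an environment variable (the channel name and message text are examples):

const WebClient = require('@slack/client').WebClient;

// The Slack API token is expected in an environment variable here;
// gSlack's actual configuration mechanism may differ.
const web = new WebClient(process.env.SLACK_API_TOKEN);

web.chat.postMessage('#gcp-activity', "Instance 'demo-1' was started", function (err, res) {
  if (err) {
    console.error('Slack API error:', err);
  } else {
    console.log('Notification posted to Slack');
  }
});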

gSlack Architecture Diagram

To set up the Cloud Function, you'll need to upload a zip file containing the code and your package.json file:

Google Cloud Functions setup
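
For reference, a minimal package.json for such a function might look like this (the dependency versions are illustrative):

{
  "name": "gslack-function",
  "version": "1.0.0",
  "dependencies": {
    "@google-cloud/datastore": "^1.0.0",
    "@slack/client": "^3.0.0"
  }
}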

To avoid forwarding every incoming activity log entry to the Slack channel (some of them might not be that interesting), and to better format the messages being published to Slack, I decided to use one more of Google's managed services: Google Cloud Datastore.

Google Cloud Datastore is a managed NoSQL document database built for automatic scaling, high performance, and ease of application development. It has built-in integration with Cloud Functions and also a nice UI which you can use to quickly edit entities (‘kinds’ and ‘properties’ in Datastore terminology).

We need something like Google Cloud Datastore to persistently store gSlack's runtime configuration, specifically which messages we want to publish and how the messages posted to Slack should look.

Editing Datastore’s data using built-in UI
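
Reading this configuration back inside the function is straightforward with the Datastore client. A sketch, assuming the rules live under a kind I'm calling GSlackConfig here (the actual kind name in gSlack may differ):

const Datastore = require('@google-cloud/datastore');
const datastore = Datastore();

// Fetch all notification rules; each entity holds test, message and slackChannel
const query = datastore.createQuery('GSlackConfig');
datastore.runQuery(query).then(function (results) {
  const rules = results[0];
  rules.forEach(function (rule) {
    console.log(rule.test, '->', rule.slackChannel);
  });
});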

For each set of log entries, you'll need to configure a test, a message and the slackChannel where you want the notification to be posted.

The test must be a valid JS expression that returns a boolean. If it returns true, the test passes and the message will be sent to Slack. For example, if we only want to track ‘start’ and ‘stop’ instance events from Google Compute Engine, we can use the following test:

$.protoPayload.serviceName==='compute.googleapis.com' && ( $.protoPayload.methodName==='v1.compute.instances.start' || $.protoPayload.methodName==='v1.compute.instances.stop') && $.operation.last

Similarly, the message must be a valid JS template string. It will be evaluated to produce the message, e.g.:

Instance '${$.protoPayload.resourceName.split('/').slice(-1)[0]}' was ${$.protoPayload.methodName==='v1.compute.instances.start'?'started':'stopped'} at zone '${$.resource.labels.zone}' by '${$.protoPayload.authenticationInfo.principalEmail}' in project '${$.resource.labels.project_id}'
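
One way such expressions can be evaluated at runtime is to compile them into functions that take the parsed log entry as $. A sketch of the idea (not necessarily how gSlack implements it):

// rule comes from Datastore; logEntry is the parsed Stackdriver entry
function evaluateRule (rule, logEntry) {
  // Compile the test into a function of $ and run it against the entry
  const test = new Function('$', 'return (' + rule.test + ');');
  if (!test(logEntry)) {
    return null; // test failed, nothing to post
  }
  // Treat the message as a JS template literal and interpolate it
  const render = new Function('$', 'return `' + rule.message + '`;');
  return render(logEntry);
}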

As a result, the following notification will be sent to Slack. In my tests, it takes only about 5 seconds from the actual event until the message reaches the Slack channel.

Actual Slack Notification

You can add as many tests and messages as you'd like, covering various Google Cloud Platform services such as Compute Engine, App Engine, Cloud Storage, BigQuery and even Billing. In fact, we include some of these examples in the gSlack repository.

The full code, as well as deployment instructions, is available in the gSlack repository on GitHub. Feel free to star it or open a pull request to improve gSlack ;-)

As always, you can reach me at [email protected] with your suggestions and ideas.

P.S. I'd like to thank Shahar Frank, Cloud Architect at DoiT International, who actually coded the whole example in just a few hours and helped me prepare this post.
