Deploy GCP Cloud Function with Event Trigger via PubSub
Table of Contents
- Introduction
- Create index.js Entry Point File
- Include Dependency Package for GCP Cloud Function
- Deploy to GCP Cloud
- Send Pub/Sub Message
- Verify the Logs
- Trigger Event with Cloud Scheduler
- Conclusion
Introduction
This blog shows how to create a simple Node.js application and deploy it to GCP Cloud Functions with a Pub/Sub event trigger.
Create index.js Entry Point File
Create an index.js file and add the code below. It imports the '@google-cloud/functions-framework' package, registers a CloudEvent handler named myEvent, and logs the incoming event so we can verify it later.
const functions = require('@google-cloud/functions-framework');

functions.cloudEvent('myEvent', event => {
  console.info('test_data', event);
});
Include Dependency Package for GCP Cloud Function
Since the function depends on the Functions Framework, add it to the dependencies section of the package.json file.
"dependencies": {
"@google-cloud/functions-framework": "^3.3.0"
}
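For reference, a minimal complete package.json could look like the sketch below; the name, version, and main fields are placeholder assumptions, not values from the original post.

```json
{
  "name": "pubsub-trigger-function",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "@google-cloud/functions-framework": "^3.3.0"
  }
}
```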
Deploy to GCP Cloud
Create Pub/Sub Topic
Before deploying the Node.js code to GCP, we need to create the Pub/Sub topic the function will subscribe to. The command below creates a topic called my_topic.
gcloud pubsub topics create my_topic
Deploy Cloud Function via Local Environment
Below is the command for deploying the Node.js code to GCP. Run it in the source code directory. Make sure --entry-point matches the handler name registered in index.js (myEvent) and --trigger-topic is the topic we just created, my_topic.
gcloud functions deploy myEventTriggerFunction --gen2 --region=us-central1 --runtime=nodejs18 --entry-point=myEvent --trigger-topic=my_topic --allow-unauthenticated
Send Pub/Sub Message
After the function deploys successfully, we can publish a message to the topic and then check the function's logs. Run the command below locally to send a test message to our topic.
gcloud pubsub topics publish my_topic --message="my first message"
Verify the Logs
Now, if everything works as expected, we should see the log entry in the Cloud Function's Logs tab. The event's data field contains the message we published, encoded in Base64. To read it, decode message.data: either paste it into an online Base64 decoder for a quick check, or decode it programmatically in the function for further processing.

Trigger Event with Cloud Scheduler
Since the function is triggered by Pub/Sub, we can also use Cloud Scheduler to run it regularly. The command below creates a Cloud Scheduler job that publishes the message from_scheduler to my_topic every minute. This pattern is useful whenever a process needs to run on a fixed schedule, whether every minute, every hour, or once a day.
gcloud scheduler jobs create pubsub my_job --location=us-central1 --schedule="* * * * *" --topic="my_topic" --message-body="from_scheduler"
Conclusion
By following these steps, you can deploy a GCP Cloud Function with a Pub/Sub event trigger and optionally automate it with Cloud Scheduler. This pattern is useful for building event-driven architectures where functions respond to messages published to a topic.