gcf

package module
v0.0.0-...-31cf166
Published: Feb 25, 2026 License: Apache-2.0 Imports: 7 Imported by: 0

README

Bigtable Automation

Background

This directory contains the Google Cloud Function involved in BT cache generation. The Cloud Function is first notified by a Borg pipeline, via init.txt, when the cache files have been uploaded to GCS. In response, it creates a new BT table, scales up the node limit (for faster loads, only on base cache), kicks off the CsvImport Dataflow job, and writes launched.txt.

After the Dataflow job has run to completion, it notifies this Cloud Function again, via completed.txt. In response, the function scales down the node limit (only on base cache). In the future, for the branch cache, it will also notify Mixer.

All of the above .txt files are created in a per-cache directory, which allows for concurrent cache builds and for tracking past state.

There are two distinct GCFs: one used for production imports, with entry point ProdBTImportController, and the other used for private imports, with entry point PrivateBTImportController. The two functions share the workflow described above; only the GCS folder structures differ. The production entry point can be tested locally through local/main.go.

Validate BT Import End-to-end using (GCS | BT) Test Environment

  1. First start the Cloud Function locally, as follows:

    ./base/local/deploy.sh
    
  2. A test branch cache exists in this folder.

In case a test was run with that cache before, clean up first:

    ./base/local/test.sh cleanup
    
  3. Fake an init trigger from Google pipeline:

    ./base/local/test.sh init
    

    To validate this step, confirm that a new BT table was created and that a CsvImport Dataflow job was launched.

  4. Fake a completion trigger from Dataflow job:

    ./base/local/test.sh completed
    

    Validate this step by confirming that the prophet-test BT instance now has 1 node.

Deployment

After validating the change in test environment, deploy to PROD by running:

./base/gcp/deploy.sh base
./base/gcp/deploy.sh branch

When this completes, look at the prophet-cache-trigger function on the GCP console to confirm the new version.

To deploy the private GCF, identify the environment, pick the corresponding yaml file in private/*.yaml, and run:

./base/custom/deploy.sh <env>

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func BaseController

func BaseController(ctx context.Context, e lib.GCSEvent) error

BaseController consumes a GCS event and runs an import state machine.

func CustomController

func CustomController(ctx context.Context, e lib.GCSEvent) error

CustomController consumes a GCS event and runs an import state machine.

Types

This section is empty.

Directories

Path Synopsis
base
local command
Package gcf runs a GCF function that triggers in 2 scenarios:
