Retroactively analyzing more than 90 days of VM and sole-tenant node usage

When we run License Tracker for the first time, the tool analyzes the last 90 days of audit logs to determine how many VMs and physical servers we’ve been using. Going back 90 days in history is useful, but can we go back further?

License Tracker analyzes the Admin Activity audit logs. These logs are enabled by default and retained for 400 days. So there shouldn’t be anything in the way of going back further in history.

Except for Cloud Run.

License Tracker uses Cloud Run jobs, which are (currently, in July 2022) not GA yet. And because the feature isn’t GA yet, Cloud Run jobs are limited to 60 minutes of runtime. One hour seems like a generous timeout, but analyzing audit logs is slow: an hour is likely to be enough to go through 90 days of audit logs, but unlikely to suffice for analyzing 400 days’ worth of logs, especially if we’re analyzing dozens of projects at once.
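A back-of-envelope estimate illustrates the problem. If 90 days of logs just about fit into the 60-minute limit, and we assume analysis time scales roughly linearly with the window length (an assumption, not a measured number), then 400 days would need:

```shell
# Rough estimate, assuming analysis time grows linearly with the window:
# if 90 days take ~60 minutes, how many minutes would 400 days take?
minutes_for_90_days=60
window_days=400
echo $(( window_days * minutes_for_90_days / 90 ))   # prints 266 - over four hours
```

That’s well past the one-hour limit, even before accounting for multiple projects.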

But we don’t have to run License Tracker in Cloud Run – it’s a command line tool after all, and we can just as well perform the initial run elsewhere. And once the initial run has completed, we can let Cloud Run take over to manage the daily delta-analysis.

One option is to run License Tracker locally (on our workstation). But that’s likely to require a fair amount of IAM reconfiguration, and that’s not ideal. A better option is to let Cloud Build perform the initial run. That solves the timeout issue, and we can even let Cloud Build …

  • use the Cloud Run job’s service account (so that we don’t have to do any IAM reconfiguration)
  • use the same, existing Docker image (so that we don’t need to rebuild the code)

Using Cloud Build to run License Tracker

Suppose we’ve already deployed License Tracker to Cloud Run and have run the initial analysis. But we’re not satisfied with 90 days’ worth of data, so we want to re-run the initial analysis for the past 365 days.

Let’s fire up a terminal (Cloud Shell works) and …

  1. Point gcloud to the project that contains the License Tracker Cloud Run app:

    gcloud config set project PROJECT
    

    Where PROJECT is the project ID.

  2. Set the region that contains the Cloud Run app:

    gcloud config set run/region REGION
    

    Where REGION is the name of the region.

  3. Initialize some environment variables:

    # Get the service account email used by Cloud Run
    SERVICE_ACCOUNT=$(gcloud beta run jobs describe license-tracker \
      --format "value(spec.template.spec.template.spec.serviceAccountName)")
    
    # Get the Docker image used by Cloud Run
    IMAGE=$(gcloud beta run jobs describe license-tracker \
      --format "value(metadata.annotations[client.knative.dev/user-image])")
    
    PROJECT_ID=$(gcloud config get-value core/project)
    
  4. Allow the service account to write logs (required for Cloud Build to work):

    gcloud projects add-iam-policy-binding $PROJECT_ID \
      --member serviceAccount:$SERVICE_ACCOUNT \
      --role roles/logging.logWriter
    
  5. Allow the service account to pull the License Tracker Docker image from GCR:

    gsutil iam ch \
      serviceAccount:$SERVICE_ACCOUNT:objectViewer  \
      gs://artifacts.$PROJECT_ID.appspot.com
    
  6. Drop the existing BigQuery dataset:

    bq rm -r -f -d $PROJECT_ID:license_usage
    

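Before moving on, it’s worth sanity-checking that the two gcloud lookups in step 3 actually returned values – an empty SERVICE_ACCOUNT or IMAGE would only surface as a confusing error at build time. A small shell helper (hypothetical, not part of License Tracker) can guard against that:

```shell
# Hypothetical helper: fail if any of the named variables is empty.
require_vars() {
  for name in "$@"; do
    eval "value=\${$name}"
    if [ -z "$value" ]; then
      echo "ERROR: ${name} is empty - re-run the lookup above" >&2
      return 1
    fi
  done
}

# Usage, after completing step 3:
#   require_vars SERVICE_ACCOUNT IMAGE PROJECT_ID || exit 1
```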
Now we’re ready to submit a build:

  1. Create a build configuration that invokes License Tracker with the command line argument --analysis-window=365:

    cat << EOF > build.yaml
    steps:
    - name: 'gcr.io/$PROJECT_ID/license-tracker'
      entrypoint: 'dotnet'
      args: ['Google.Solutions.LicenseTracker.dll', '--autodiscover', '-v', '--analysis-window=365']
      dir: '/app'
    serviceAccount: projects/$PROJECT_ID/serviceAccounts/$SERVICE_ACCOUNT
    options:
      logging: CLOUD_LOGGING_ONLY
    EOF
    
  2. Submit the build to Cloud Build:

    gcloud builds submit \
        --no-source \
        --timeout "12h" \
        --config build.yaml \
        --async
    

The build will run in the background, so we’re safe to close the terminal.

It might take a few hours for the analysis to complete, but after that, we’ll have a new BigQuery dataset that covers the last year.

And in the meantime, we can check the status of the analysis in the Cloud Console under Cloud Build > History.
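For those who prefer the terminal over the Console, a simple poll loop does the job. Here, get_build_status is a hypothetical stand-in for however you query the build status – for example `gcloud builds list --limit 1 --format "value(status)"`:

```shell
# Poll until the build leaves the QUEUED/WORKING states, then print the
# final status. get_build_status is a hypothetical stand-in; replace it
# with a real query such as:
#   gcloud builds list --limit 1 --format "value(status)"
wait_for_build() {
  while true; do
    status=$(get_build_status)
    case "$status" in
      QUEUED|WORKING) sleep 60 ;;
      *) echo "$status"; return ;;
    esac
  done
}
```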

Any opinions expressed on this blog are Johannes' own. Refer to the respective vendor’s product documentation for authoritative information.