Cost Tracking Analytics
Tracking API usage and correlating it to cost-per-user is a standard requirement for transitioning the GenMedia Creative Studio to a production environment.
The application implements structured JSON logging via the genmedia.analytics logger. By streaming these logs to Google Cloud BigQuery, administrators can easily track usage, monitor API calls, and calculate costs.
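The exact logger wiring lives in the application code, but the idea can be sketched with Python's standard logging module. The `JsonFormatter` below is a hypothetical stand-in, not the application's actual implementation; only the logger name and field names come from this document:

```python
import json
import logging
import sys

# Hypothetical sketch of a structured JSON logger like "genmedia.analytics".
# Each record is serialized as one JSON line on stdout; Cloud Run forwards
# such lines to Cloud Logging, where they appear under jsonPayload.
class JsonFormatter(logging.Formatter):
    def format(self, record):
        return json.dumps({
            "name": record.name,
            "event_type": getattr(record, "event_type", None),
            "extra_data": getattr(record, "extra_data", None),
        })

logger = logging.getLogger("genmedia.analytics")
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Emit one model_call event (illustrative values).
logger.info(
    "model_call",
    extra={
        "event_type": "model_call",
        "extra_data": {
            "model_name": "veo-3.1-lite-generate-001",
            "status": "success",
            "duration_ms": 8421,
            "user_email": "user@example.com",
        },
    },
)
```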
1. Setting up a Log Router Sink
Google Cloud allows you to automatically route specific logs from Cloud Run directly into a BigQuery dataset for analysis.
- Navigate to Logging > Log Router in the Google Cloud Console.
- Click Create Sink.
- Sink Details: Name it `genmedia-analytics-sink`.
- Sink Destination: Choose BigQuery dataset. Select or create a new dataset (e.g., `genmedia_analytics`).
- Inclusion Filter: This is the most critical step. You only want to route the analytics logs, not standard application traffic. Use the following filter (conditions on separate lines are ANDed together):

```
jsonPayload.name = "genmedia.analytics"
jsonPayload.event_type = "model_call"
```
- Create Sink: Google Cloud will automatically create the required table schema in BigQuery as new logs arrive.
2. Understanding the Log Payload
The application enriches every `model_call` event with the following metadata inside the `jsonPayload.extra_data` field:
- `model_name`: The specific model version used (e.g., `gemini-2.5-flash`, `veo-3.1-lite-generate-001`).
- `status`: Either `success` or `failure`.
- `duration_ms`: The total execution time of the API call, in milliseconds.
- `user_email`: The email of the user who initiated the request.
- `session_id`: The unique session identifier.
- `details`: A JSON object containing model-specific parameters (e.g., `aspect_ratio`, `num_images_generated`, `video_duration`).
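Put together, a single model_call event might carry a payload like the following sketch (all values are illustrative, not taken from a real log):

```python
import json

# Illustrative model_call event as it might appear under jsonPayload.
sample_event = {
    "name": "genmedia.analytics",
    "event_type": "model_call",
    "extra_data": {
        "model_name": "veo-3.1-lite-generate-001",
        "status": "success",
        "duration_ms": 8421,
        "user_email": "user@example.com",
        "session_id": "c0ffee12",
        "details": {"aspect_ratio": "16:9", "video_duration": 8},
    },
}

# The fields used for cost correlation all live under extra_data.
extra = sample_event["extra_data"]
print(extra["model_name"], extra["status"], extra["details"]["video_duration"])
```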
3. Correlating Usage with Cost in BigQuery
Once the logs are flowing into BigQuery, you can use SQL to aggregate usage and multiply it by the standard Google Cloud pricing.
Here is an example SQL query to calculate the cost of Veo Video Generation per user for the current month:
```sql
WITH VeoUsage AS (
  SELECT
    JSON_EXTRACT_SCALAR(jsonPayload.extra_data, "$.user_email") AS user_email,
    JSON_EXTRACT_SCALAR(jsonPayload.extra_data, "$.model_name") AS model_name,
    CAST(JSON_EXTRACT_SCALAR(JSON_EXTRACT(jsonPayload.extra_data, "$.details"), "$.duration_seconds") AS INT64) AS duration_seconds,
    timestamp
  FROM `your-project.genmedia_analytics.run_googleapis_com_stdout`
  WHERE JSON_EXTRACT_SCALAR(jsonPayload.extra_data, "$.event_type") = "model_call"
    AND JSON_EXTRACT_SCALAR(jsonPayload.extra_data, "$.status") = "success"
    AND JSON_EXTRACT_SCALAR(jsonPayload.extra_data, "$.model_name") LIKE "veo-%"
    AND EXTRACT(MONTH FROM timestamp) = EXTRACT(MONTH FROM CURRENT_TIMESTAMP())
    AND EXTRACT(YEAR FROM timestamp) = EXTRACT(YEAR FROM CURRENT_TIMESTAMP())
)
SELECT
  user_email,
  model_name,
  SUM(duration_seconds) AS total_seconds_generated,
  -- Example: Assuming Veo costs $0.20 per second of generated video
  SUM(duration_seconds) * 0.20 AS estimated_cost_usd
FROM VeoUsage
GROUP BY user_email, model_name
ORDER BY estimated_cost_usd DESC;
```

Note: Pricing changes over time. Always refer to the official Vertex AI Pricing Page to update your correlation multipliers.
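To sanity-check the arithmetic outside BigQuery, for instance against a handful of exported rows, the same aggregation can be sketched in Python. The $0.20-per-second rate is the same illustrative placeholder used in the query, and `estimate_veo_costs` is a hypothetical helper, not part of the application:

```python
from collections import defaultdict

# Illustrative rate only; check the Vertex AI Pricing Page for current values.
VEO_USD_PER_SECOND = 0.20

def estimate_veo_costs(rows):
    """Aggregate successful veo-* calls into per-(user, model) cost estimates.

    Each row mirrors the columns selected by the BigQuery query above.
    """
    totals = defaultdict(int)
    for row in rows:
        if row["status"] == "success" and row["model_name"].startswith("veo-"):
            totals[(row["user_email"], row["model_name"])] += row["duration_seconds"]
    return {
        key: {
            "total_seconds_generated": seconds,
            "estimated_cost_usd": seconds * VEO_USD_PER_SECOND,
        }
        for key, seconds in totals.items()
    }

# Illustrative rows; failed calls are excluded, matching the SQL WHERE clause.
rows = [
    {"user_email": "a@example.com", "model_name": "veo-3.1-lite-generate-001",
     "status": "success", "duration_seconds": 8},
    {"user_email": "a@example.com", "model_name": "veo-3.1-lite-generate-001",
     "status": "failure", "duration_seconds": 8},
    {"user_email": "b@example.com", "model_name": "veo-3.1-lite-generate-001",
     "status": "success", "duration_seconds": 16},
]
print(estimate_veo_costs(rows))
```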
4. A Note on Historical Logs
Creating a Log Router Sink only routes future logs. It will not automatically backfill your BigQuery dataset with historical logs generated prior to the sink’s creation.
If you need to analyze historical costs for an existing instance:
- One-Time Export: Navigate to Logging > Logs Explorer, filter for your analytics logs (`jsonPayload.name="genmedia.analytics"`), and use the Actions > Download/Export feature to manually copy the last 30 days of logs into your BigQuery table.
- Log Analytics: Alternatively, if your Cloud Logging bucket has been upgraded to support Log Analytics, you can query your historical logs directly using standard SQL without needing a Log Router Sink at all.
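If you take the one-time export route, the downloaded file is newline-delimited JSON (one log entry per line). A small script can pre-filter it down to just the analytics events before loading it into BigQuery; `filter_analytics_events` is a hypothetical helper shown for illustration:

```python
import json

def filter_analytics_events(lines):
    """Keep only genmedia.analytics model_call entries from an exported
    newline-delimited JSON log dump (one log entry per line)."""
    kept = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        entry = json.loads(line)
        payload = entry.get("jsonPayload", {})
        if (payload.get("name") == "genmedia.analytics"
                and payload.get("event_type") == "model_call"):
            kept.append(entry)
    return kept

# Usage with a hypothetical export file:
# with open("downloaded-logs.json") as f:
#     events = filter_analytics_events(f)
```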