
Geospatial in Cloud: GCP

BigQuery GIS, Earth Engine, and Cloud Run. When each one is the right tool - and when it's not.

PUBLISHED: Feb 2026
SERIES: Geospatial in Cloud
CATEGORY: Technical
AUTHOR: Axis Spatial Team
  • BigQuery GIS: spatial SQL on billion-row datasets, query in 5-15 seconds, cost $5/TB scanned
  • Earth Engine: unmatched for satellite imagery analysis, but overkill for vector workflows and hard to integrate with non-Google tools
  • Cloud Run: deploy geospatial APIs (FastAPI + GDAL) with zero infrastructure, 0-to-N autoscaling, pay only for requests
  • GCP's hidden advantage: the best geospatial public datasets (Landsat, Sentinel, OpenStreetMap) are free on BigQuery

Google has the most powerful geospatial computing platform on the planet - Earth Engine. It also has the largest cloud-native spatial SQL engine - BigQuery GIS. But choosing between them, combining them, or deciding you need neither is where most teams get stuck.

This post breaks down each GCP geospatial service with real query times, actual costs, and honest assessments of where they fall short. No vendor fluff. No feature-list repackaging. Just what works in production.

Geospatial in Cloud Series

This is Part 3 of our Geospatial in Cloud series. Each post is self-contained. Part 1 covers Databricks. Part 2 covers AWS. Read the one that matches your stack.

GCP Geospatial Services Map

GCP's geospatial capabilities are spread across six services. Understanding which one to use for what is the first decision you need to make.

SERVICE | GEOSPATIAL USE | WHEN TO USE
BigQuery GIS | Spatial SQL on massive datasets | Billions of rows, ad-hoc spatial analysis
Earth Engine | Satellite imagery analysis | Multi-temporal raster analysis, change detection
Cloud Run | Containerised geospatial APIs | Processing services, tile servers, APIs
Cloud Storage (GCS) | COG/GeoParquet storage | Primary data lake (like S3)
Vertex AI | ML on geospatial data | Classification, prediction models
Dataflow (Beam) | Streaming geospatial processing | Real-time sensor data, IoT

Most geospatial teams on GCP use three of these: BigQuery GIS for analysis, GCS for storage, and Cloud Run for serving. Earth Engine is the specialist tool you bring in when satellite imagery is involved.

BigQuery GIS Deep Dive

BigQuery GIS operates on a native GEOGRAPHY type. No extensions, no plugins, no PostGIS-style setup. Spatial functions work directly on petabyte-scale tables with the same SQL your analysts already write.

A spatial join on 1 billion points against 50K polygons takes 8-12 seconds. Try that on PostGIS.

-- Spatial join: points within flood zones

SELECT
    p.name,
    z.flood_risk_level,
    ST_AREA(z.geometry) / 1e6 AS zone_area_km2
FROM `project.dataset.properties` p
JOIN `project.dataset.flood_zones` z
ON ST_CONTAINS(z.geometry, p.geometry)
WHERE p.country = 'DE'

Standard SQL. No spatial extensions to install, no index creation required. The syntax is identical to what your data team already writes - it just happens to include spatial predicates.

A distance-based query against the full OpenStreetMap dataset (1.5TB, pre-loaded in BigQuery's public datasets):

-- Distance query against OSM public data

SELECT
    osm.name,
    osm.amenity,
    ST_DISTANCE(
      osm.geometry,
      ST_GEOGPOINT(13.4, 52.5)
    ) AS distance_m
FROM `bigquery-public-data.geo_openstreetmap.planet_features` osm
WHERE osm.amenity = 'hospital'
AND ST_DWITHIN(
  osm.geometry,
  ST_GEOGPOINT(13.4, 52.5),
  10000  -- 10km radius
)
ORDER BY distance_m
LIMIT 10

KEY INSIGHT: S2 CELL INDEXING

BigQuery GIS uses S2 cell indexing internally. Spatial operations are not scan-based - they use a spatial index automatically. No EXPLAIN ANALYZE, no manual index creation, no tuning. It just works at scale. This is the single biggest difference from PostGIS, where spatial index creation and maintenance is a constant overhead.
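The effect of cell indexing can be sketched with a toy grid index. This is not S2 itself (S2 uses hierarchical cells projected onto a sphere); it is a flat 0.1-degree grid that only illustrates the principle: bucket geometries into cells so a lookup inspects one bucket instead of scanning the whole table. All names and the cell size are illustrative.

```python
# Toy cell index - NOT real S2 (S2 uses hierarchical cells on a
# sphere); a flat 0.1-degree grid that illustrates the idea.
from collections import defaultdict

def cell_id(lon, lat):
    """Bucket a coordinate into a 0.1-degree grid cell.
    (int() truncates; real indexes handle negatives and levels.)"""
    return (int(lon * 10), int(lat * 10))

def build_index(points):
    """Group (name, lon, lat) points by the cell they fall in."""
    index = defaultdict(list)
    for name, lon, lat in points:
        index[cell_id(lon, lat)].append((name, lon, lat))
    return index

points = [("a", 13.40, 52.52), ("b", 13.41, 52.52), ("c", 2.35, 48.86)]
index = build_index(points)

# A lookup near Berlin inspects one cell, not every point.
nearby = index[cell_id(13.405, 52.52)]
```

BigQuery applies this idea automatically at query time; in PostGIS the closest equivalent is a manually created and maintained GiST index.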

For a deep dive on the file formats that power this, see our guide on cloud-native geospatial formats (GeoParquet, COG, STAC). GeoParquet and COG work natively with BigQuery and GCS.

Earth Engine: Honest Assessment

Earth Engine is extraordinary for a specific set of problems. It is also wildly overused for problems it was never designed to solve. Here is where it genuinely has no competition, and where it creates more problems than it solves.

Where Earth Engine Is Unmatched

70+ petabytes of satellite imagery - free

Landsat, Sentinel, MODIS, VIIRS - all pre-processed and analysis-ready. Downloading this data yourself would take months and cost thousands in transfer fees alone.

Multi-temporal analysis in minutes

Computing an NDVI time series over 5 years for a 10km x 10km area takes under a minute. Locally, you'd spend days downloading imagery before you could even start processing.
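The index itself is trivial arithmetic; Earth Engine's value is running it across petabytes of pre-ingested imagery. A minimal sketch of the per-pixel computation, with toy reflectance values standing in for real NIR and red bands:

```python
# NDVI = (NIR - Red) / (NIR + Red), per pixel, in [-1, 1].
# Toy reflectance values stand in for real satellite bands.

def ndvi(nir, red):
    """Normalised Difference Vegetation Index for one pixel."""
    if nir + red == 0:
        return 0.0  # avoid division by zero over no-data pixels
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in NIR, weakly in red.
vegetation = ndvi(nir=0.45, red=0.05)  # high NDVI, ~0.8
bare_soil = ndvi(nir=0.30, red=0.25)   # low NDVI, ~0.09
```

A 5-year time series is the same formula applied to every pixel of every scene in the date range; the hard part Earth Engine removes is locating, downloading, and aligning those scenes.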

Built-in change detection and classification

Deforestation tracking, urban expansion monitoring, crop classification - algorithms are built in with the data co-located. No data movement.

Where Earth Engine Is Overkill or Wrong

1. Vector analysis

Earth Engine is raster-first. Vector operations are slow and limited compared to BigQuery GIS or PostGIS. Spatial joins on vector datasets that take seconds in BigQuery take minutes in Earth Engine - and the API is far less intuitive for SQL-trained analysts.

2. Custom processing pipelines

Earth Engine's execution model is opaque. You cannot control parallelism, memory allocation, or execution order. For complex multi-step pipelines where you need predictable behaviour, Cloud Run with your own GDAL/rasterio container gives you full control.

3. Integration with non-Google tools

Getting data out of Earth Engine into your data warehouse, dashboard, or downstream pipeline is painful. Exports are asynchronous and can take hours. If your architecture involves Snowflake, Databricks, or any non-Google analytics platform, plan for significant integration overhead.

4. Reproducibility

Earth Engine code runs on Google's servers with undocumented infrastructure. Exact reproduction of results is not guaranteed across time. For regulated industries (insurance, finance) where audit trails matter, this is a real compliance concern.

5. Cost predictability

The free tier is generous for research. But commercial use pricing is opaque and usage-dependent. BigQuery's $5/TB scanned model is transparent and controllable. Earth Engine's commercial pricing makes budgeting difficult.

THE HONEST RECOMMENDATION

Use Earth Engine for what it is best at: satellite imagery analysis at planetary scale. Use BigQuery GIS + Cloud Run for everything else. Do not try to force Earth Engine into a general-purpose geospatial platform - you will spend more time fighting its limitations than building your actual product.

Cloud Run for Processing

Cloud Run is the underrated workhorse of GCP geospatial. Deploy a FastAPI + GDAL container with zero infrastructure management. It auto-scales from 0 to N instances based on request volume, and you pay only for actual request time.

Perfect for tile servers, geocoding APIs, on-demand raster processing, and any geospatial microservice that needs to scale without cluster management.

# FastAPI geospatial API on Cloud Run

import json

from fastapi import FastAPI
import geopandas as gpd
from shapely.geometry import box

app = FastAPI()

# Load once at startup, not per request. Reading gs:// paths
# requires the gcsfs package in the container.
gdf = gpd.read_parquet("gs://my-bucket/parcels.parquet")

@app.post("/spatial-query")
async def spatial_query(bbox: dict):
    area = box(
        bbox["minx"], bbox["miny"],
        bbox["maxx"], bbox["maxy"]
    )
    result = gdf[gdf.intersects(area)]
    return {
        "count": len(result),
        # to_json() returns a GeoJSON string; parse it so the
        # response embeds it as an object, not escaped text
        "parcels": json.loads(result.to_json()),
    }

Deployment is a single command:

gcloud run deploy geo-api \
    --source . \
    --region europe-west1 \
    --memory 2Gi \
    --timeout 300

No Kubernetes manifests. No Docker Compose files. No load balancer configuration. Cloud Run handles HTTPS, autoscaling, and zero-downtime deployments automatically. For teams that want to deploy geospatial APIs without becoming infrastructure engineers, this is the fastest path.
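When deploying with --source, Cloud Run builds the container for you (via Buildpacks when no Dockerfile is present), but an explicit Dockerfile makes dependencies reproducible. A minimal sketch, assuming the app above lives in main.py; the base image and unpinned packages are illustrative, not prescriptive:

```dockerfile
# Minimal container for the FastAPI service above (illustrative).
FROM python:3.12-slim

WORKDIR /app
RUN pip install --no-cache-dir fastapi uvicorn geopandas gcsfs
COPY main.py .

# Cloud Run injects the PORT env var; default to 8080 locally.
CMD ["sh", "-c", "uvicorn main:app --host 0.0.0.0 --port ${PORT:-8080}"]
```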

Public Datasets: GCP's Hidden Weapon

This is the advantage nobody talks about. GCP hosts the largest collection of pre-loaded, pre-indexed, query-ready geospatial datasets on any cloud platform. On AWS or Azure, you need to download, convert, and load them yourself. On GCP, it is one SQL JOIN away.

DATASET | SIZE | UPDATES | COST
OpenStreetMap (planet) | 1.5TB | Weekly | Free
US Census | 100GB | Annual | Free
NOAA Weather | 500GB | Daily | Free
EPA Facilities | 10GB | Quarterly | Free
TIGER (US boundaries) | 50GB | Annual | Free

The practical impact: enriching your proprietary data with OSM points of interest, census demographics, or weather patterns is a JOIN operation, not an ETL pipeline. No data engineering required. No storage costs for the public data. You pay only for the query compute ($5/TB scanned).
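Sketched from Python, such an enrichment might look like this. The proprietary side (`project.dataset.sites` and its `site_id` and `geometry` columns) is entirely hypothetical, and executing the query needs the google-cloud-bigquery package plus credentials, so it is only constructed here:

```python
# Sketch: count OSM amenities near proprietary sites.
# `project.dataset.sites` and its columns are hypothetical.

def enrichment_sql(radius_m: int = 500) -> str:
    """Build a BigQuery GIS join against the public OSM dataset."""
    return f"""
    SELECT
        s.site_id,
        COUNT(*) AS amenities_within_{radius_m}m
    FROM `project.dataset.sites` s
    JOIN `bigquery-public-data.geo_openstreetmap.planet_features` osm
      ON ST_DWITHIN(s.geometry, osm.geometry, {radius_m})
    WHERE osm.amenity IS NOT NULL
    GROUP BY s.site_id
    """

sql = enrichment_sql(500)

# To execute (needs google-cloud-bigquery and credentials):
# from google.cloud import bigquery
# rows = bigquery.Client().query(sql).result()
```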

Real Benchmarks

These numbers come from production workloads, not synthetic tests. Each operation was measured with standard GCP pricing in europe-west1.

GCP GEOSPATIAL BENCHMARKS - PRODUCTION WORKLOADS

OPERATION | SERVICE | TIME | COST
Spatial join (1B points x 50K polygons) | BigQuery | 11.3s | $0.25 per query
Distance query (10km radius, OSM) | BigQuery | 2.1s | $0.01 per query
NDVI time series (5yr, 10km x 10km) | Earth Engine | 45s | Free (research tier)
COG tile extraction | Cloud Run | 230ms | $0.0000024 per request
Full pipeline (ingest + query + export) | BigQuery + Cloud Run | 18s | $0.30 per run

Cost Analysis

A side-by-side comparison for a mid-sized geospatial team (10-30 users, 1TB vector data, regular spatial analysis workloads).

WORKLOAD | GCP | ESRI ENTERPRISE | POSTGIS (SELF-MANAGED)
Store 1TB vectors | $20/mo (GCS) | ~$200/mo (Enterprise) | $50/mo (RDS)
Spatial queries (1TB/mo) | $5 (BigQuery) | Included (licence) | $0 (compute only)
Raster analysis | Free (EE research) | $500/mo (Image Server) | DIY (rasterio)
API serving | $10/mo (Cloud Run) | $300/mo (GeoEvent) | $50/mo (VM)

HONEST CAVEAT: QUERY-BASED PRICING BITES

BigQuery charges $5 per TB scanned. A single careless SELECT * on a 10TB table costs $50. Always use partitioning, clustering, and column selection to control costs. ESRI's flat-fee model is actually more predictable if you are doing heavy, frequent analysis. GCP is cheaper on average but has a higher variance. Plan accordingly.
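The arithmetic is worth encoding in your tooling so the cost check happens before a query runs. A small helper using the $5/TB on-demand rate quoted in this post (check current pricing for your region):

```python
# BigQuery on-demand pricing: $5 per TB scanned (the rate used
# throughout this post; verify against current regional pricing).
PRICE_PER_TB = 5.00

def query_cost(tb_scanned: float) -> float:
    """Cost in USD for a query scanning `tb_scanned` terabytes."""
    return tb_scanned * PRICE_PER_TB

full_scan = query_cost(10)   # careless SELECT * on a 10TB table
pruned = query_cost(0.05)    # same table, partitioned + clustered

# full_scan -> 50.0, pruned -> 0.25
```

Pair this with a dry run (BigQuery job configs support a dry-run mode that reports bytes scanned without executing) to estimate before you spend.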

When NOT to Use GCP for Geospatial

We run geospatial workloads on GCP daily. We still tell clients not to use it in these scenarios:

1. Your organisation is AWS or Azure first

Multi-cloud complexity rarely justifies the benefits. If your data team, IAM policies, and billing are on AWS, adding GCP for geospatial alone introduces operational overhead that outweighs BigQuery's advantages. Use your primary cloud's spatial capabilities first. Our AWS guide covers the alternatives.

2. You need real-time spatial queries (sub-10ms)

BigQuery has a 1-2 second minimum query time. This is fast for analytics but unacceptable for a spatial API serving a web application. For sub-10ms spatial lookups, use PostGIS with a warm connection pool or Redis with geospatial indexing.

3. You need full control over processing

Earth Engine is a black box. BigQuery abstracts away execution. Cloud Run abstracts away infrastructure. If you need to control every aspect of execution - parallelism, memory allocation, scheduling granularity - consider AWS ECS or bare Kubernetes where you manage the entire stack.

4. You are processing sensitive data

Earth Engine processes data on Google's infrastructure with limited control over data residency. For highly sensitive geospatial data (defence, classified assets, certain financial data), a self-hosted solution with full audit control may be required. BigQuery offers more control here, but Earth Engine does not.

5. Your geospatial workload is small

If you process fewer than 1M records or query less than 100GB/month, PostGIS on a single server is simpler, cheaper, and faster. BigQuery's strength is scale; below that threshold, the overhead of cloud-native tooling outweighs its convenience.

Reference Architecture

A production GCP geospatial stack for a team running mixed vector and raster workloads. This is the architecture we deploy for clients who are already on GCP.


GCP Geospatial Reference Architecture

Data Lake:
  GCS bucket (COG + GeoParquet)
  <- source data, versioned

Analytics:
  BigQuery GIS
  <- spatial SQL, joins, aggregations

  Earth Engine
  <- satellite imagery, temporal analysis
  <- raster specialisation ONLY

Processing:
  Cloud Run (FastAPI + GDAL)
  <- APIs, on-demand processing

  Dataflow (Apache Beam)
  <- streaming, real-time ingestion

ML:
  Vertex AI
  <- geospatial classification, predictions

The key principle: each service does one thing well. BigQuery for SQL analysis, Earth Engine for raster, Cloud Run for APIs, GCS for storage. Resist the temptation to route everything through Earth Engine or build everything as BigQuery stored procedures.

For teams comparing this with other platforms, Part 1 covers Databricks (stronger for lakehouse architectures) and Part 2 covers AWS (more flexible, DIY approach).

Frequently Asked Questions

Can BigQuery handle geospatial data?

Yes. BigQuery has a native GEOGRAPHY type with spatial functions (ST_CONTAINS, ST_DISTANCE, ST_INTERSECTION, and more). It handles spatial queries on billion-row datasets in seconds using automatic S2 cell indexing. No extensions or plugins needed.

Is Google Earth Engine free?

Earth Engine is free for academic and research use. Commercial use requires a paid licence, with pricing that depends on usage volume. For many commercial geospatial workloads, BigQuery GIS combined with Cloud Run is more cost-effective and predictable.

What is the best GCP service for geospatial analysis?

It depends on data type. For vector analysis at scale, BigQuery GIS is the best choice. For satellite imagery and multi-temporal raster analysis, Earth Engine is unmatched. For custom processing APIs, Cloud Run with GeoPandas/rasterio is the most flexible option.

GCP has the most powerful individual geospatial tools of any cloud platform. The challenge is knowing which tool to reach for.

BigQuery GIS for vector analysis at scale. Earth Engine for satellite imagery - and nothing else. Cloud Run for APIs without infrastructure overhead. The teams that get GCP geospatial right are the ones that resist the temptation to use one service for everything.

That is the consistent pattern across this entire series. Match the tool to the problem. Not the other way around.


READY TO MOVE TO CLOUD?

Free Geospatial Cloud Assessment

We analyse your current workflows, data volumes, and team skills to recommend the right cloud platform - whether that is Databricks, AWS, GCP, or staying where you are.

  • Platform recommendation (Databricks / AWS / GCP / hybrid)
  • Cost projection: Year 1 vs Year 3
  • Migration effort estimate