- Manual GIS cloud migration costs $50K-$800K and takes 4-6 weeks, every single time, regardless of estate size
- 80% of failures trace to undocumented dependencies and tacit knowledge missed during manual discovery
- AI agents automate the inventory, analysis, translation, and deployment steps, reducing time from weeks to hours
- Automated migration does not work for custom ArcObjects COM code, highly custom FME transformers, or workflows with undocumented business rules whose authors have since left the organisation
Quick Answer
Manual GIS cloud migration costs $50K-$800K and takes 4-6 weeks because the entire discovery, analysis, and translation process is performed manually by consultants. The market is $5.2B annually and almost entirely unautomated. AI agents can handle the inventory, dependency mapping, code translation, and deployment steps programmatically, reducing most migrations from weeks to hours. The exceptions are workflows with custom COM code or undocumented business logic whose original authors have since left.
Call any of the top GIS consultancies today (Blue Raster, GCS, Woolpert, SSP Innovations) and ask for a cloud migration quote. You will get a statement of work that runs 4 weeks minimum, costs six figures, and involves a team of three to five consultants running manual discovery workshops.
That is not incompetence. It is the only approach that existed before AI agents could read code. But that constraint is now gone. Here is what actually takes time in a GIS migration, why it fails, and what a fully automated alternative looks like.
The $5.2B Market Nobody Has Automated
GIS is deeply embedded in utilities, insurance, infrastructure, government, and environmental consulting. Most of these organisations built their spatial workflows on desktop GIS: primarily ArcPy scripts running against file geodatabases, FME workbenches moving data between systems, and QGIS projects with Python plugins for analysis.
Cloud pressure has been building for five years. Storage costs, processing scale, collaboration requirements, and the end of ArcMap support (March 2026) are all forcing the migration. The GIS professional services market (the consultants who execute these migrations) is valued at over $5.2B annually.
TYPICAL MIGRATION PRICING (2026): rates from published SOWs, excluding cloud infrastructure costs and internal engineer time.
The striking thing is the timeline. A small 20-workflow migration and a large 500-workflow migration both take roughly the same calendar time. That is because the constraint is not the volume of code: it is the discovery workshops, stakeholder interviews, sign-off cycles, and manual documentation that happen before a single line gets rewritten. Every migration goes through the same process regardless of size.
What a Manual GIS Cloud Migration Actually Looks Like
A consulting engagement follows a predictable eight-step process. Understanding each step shows exactly where the time goes, and which steps are genuinely human problems versus which are automation waiting to happen.
Inventory (Week 1)
Consultants walk through file servers, SharePoint folders, and network drives cataloguing every ArcPy script, FME workbench, QGIS project, and ModelBuilder model. This is entirely manual. In one mid-sized infrastructure consultancy we reviewed, the team found 340 scripts when the client thought they had "about 80." Nobody had a complete inventory.
Dependency Assessment (Week 1-2)
Each script is read manually to map what it calls, what data it reads, and what it writes. This uncovers hardcoded file paths pointing to \\server\gis\layers, scripts that call other scripts, and FME workbenches that reference ArcSDE connection files that may no longer exist.
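Much of what this manual read-through looks for can be expressed as simple pattern checks. A minimal sketch using only the standard library; the patterns and the sample script are illustrative, not exhaustive:

```python
import re

# Illustrative patterns for dependencies that hide in legacy GIS scripts.
UNC_PATH = re.compile(r"\\\\[\w$.-]+\\[^\s\"']+")            # \\server\share\...
SDE_CONN = re.compile(r"[\w/\\.-]+\.sde\b")                  # ArcSDE connection files
SUBPROCESS_CALL = re.compile(r"subprocess\.(run|call|Popen)\(")

def find_hidden_dependencies(source: str) -> dict:
    """Report hardcoded paths and script-to-script calls found in one script."""
    return {
        "unc_paths": UNC_PATH.findall(source),
        "sde_connections": SDE_CONN.findall(source),
        "calls_other_scripts": bool(SUBPROCESS_CALL.search(source)),
    }

# A fabricated three-line legacy script exhibiting all three problems.
sample = (
    'arcpy.env.workspace = "\\\\server\\gis\\layers"\n'
    'conn = "configs/prod.sde"\n'
    'subprocess.run(["python", "post_process.py"])\n'
)
print(find_hidden_dependencies(sample))
```

A real assessment also follows each discovered path to check whether the target still exists, which is where the "connection files that may no longer exist" findings come from.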
Business Logic Capture (Week 2)
Workshops with the GIS analysts who wrote the scripts. The code rarely explains why a buffer is 250 metres rather than 200 metres, or why a particular spatial join uses the third geometry in a multi-part polygon. That knowledge lives in people's heads. Consultants run structured interviews to extract it.
Target Architecture Planning (Week 2-3)
Deciding whether workflows go to Databricks, AWS Glue, GCP Dataproc, or Snowflake. This depends on the client's existing cloud contracts, data residency requirements, and the nature of the workloads. Most consultancies default to whichever platform they are certified on, which is not always the right choice for the client.
Code Rewrite (Week 3-4)
ArcPy calls translated to GeoPandas and Rasterio. FME workbenches converted to Python ETL pipelines. QGIS processing algorithms rewritten using GDAL bindings. Developers doing this manually typically translate 5-15 scripts per day, meaning a 200-script estate takes roughly 3-8 weeks of pure development time.
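The mechanical core of that translation work is pattern substitution. A toy sketch with two illustrative rules; a real translator also has to rewrite argument semantics (units, spatial references, cursor logic), not just call syntax:

```python
import re

# Toy translation table: ArcPy call patterns -> GeoPandas equivalents.
RULES = [
    # arcpy.analysis.Buffer(src, dst, "250 Meters") -> GeoSeries.buffer
    (re.compile(r'arcpy\.analysis\.Buffer\((\w+),\s*(\w+),\s*"(\d+) Meters"\)'),
     r"\2 = \1.copy(); \2['geometry'] = \1.geometry.buffer(\3)"),
    # arcpy.management.Project(src, dst, epsg) -> GeoDataFrame.to_crs
    (re.compile(r'arcpy\.management\.Project\((\w+),\s*(\w+),\s*(\d+)\)'),
     r"\2 = \1.to_crs(epsg=\3)"),
]

def translate_line(line: str) -> str:
    """Apply each rewrite rule to one line of legacy source."""
    for pattern, replacement in RULES:
        line = pattern.sub(replacement, line)
    return line

print(translate_line('arcpy.analysis.Buffer(roads, roads_buf, "250 Meters")'))
# -> roads_buf = roads.copy(); roads_buf['geometry'] = roads.geometry.buffer(250)
```

The 5-15 scripts-per-day rate comes from everything this sketch ignores: cursors, joins with side effects, and calls whose arguments encode business rules.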
Output Validation (Week 4-5)
Running the new cloud pipelines against the same inputs as the legacy scripts and comparing outputs. Feature counts, coordinate precision, attribute values, and spatial accuracy all need to match within tolerance. Any divergence triggers a debugging cycle back into the rewrite phase.
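The comparison logic itself is simple once outputs are summarised. A stdlib sketch under a strong simplifying assumption: each output is reduced to a plain dict of feature count, CRS code, and centroid coordinates, standing in for full geometry comparison:

```python
import math

def outputs_match(legacy: dict, cloud: dict, coord_tol: float = 1e-6) -> list:
    """Compare two migration outputs summarised as plain dicts.

    Each summary holds a feature count, a CRS code, and a list of (x, y)
    centroids -- a simplified stand-in for full geometry comparison.
    Returns a list of human-readable discrepancies (empty means match).
    """
    problems = []
    if legacy["feature_count"] != cloud["feature_count"]:
        problems.append("feature count differs")
    if legacy["crs"] != cloud["crs"]:
        problems.append("CRS differs")
    for i, (a, b) in enumerate(zip(legacy["centroids"], cloud["centroids"])):
        if not (math.isclose(a[0], b[0], abs_tol=coord_tol)
                and math.isclose(a[1], b[1], abs_tol=coord_tol)):
            problems.append(f"centroid {i} diverges beyond tolerance")
    return problems

legacy = {"feature_count": 2, "crs": "EPSG:27700",
          "centroids": [(1.0, 2.0), (3.0, 4.0)]}
cloud = {"feature_count": 2, "crs": "EPSG:27700",
         "centroids": [(1.0, 2.0000001), (3.0, 4.0)]}
print(outputs_match(legacy, cloud))  # [] -> within tolerance
```

The hard part in practice is not this comparison but producing the summaries consistently from both the legacy and cloud runs.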
Deployment (Week 5-6)
Configuring cloud infrastructure, setting up IAM roles, creating orchestration (Airflow, Databricks Workflows, AWS Step Functions), and deploying into production environments. Platform-specific issues such as IAM propagation delays and GDAL format limitations on object storage often cause a final round of debugging here.
Handover and Documentation (Week 6+)
Writing up what was migrated, what was left behind, and how the new pipelines are maintained. This is usually compressed at the end of an engagement and produces documentation that becomes outdated within months.
Steps 1, 2, 5, 6, and 7 are candidates for automation. Steps 3 and 4 require human judgement. Step 8 is valuable but rarely done properly under time pressure. That ratio (five automatable steps, two requiring humans) is where the opportunity sits.
Why 80% of GIS Migrations Fail or Overrun
Industry estimates put GIS migration failure rates above 60%. Projects that "succeed" frequently overrun by 50-100% on time and budget. The root causes are consistent across every engagement.
Tacit knowledge loss
GIS workflows accumulate undocumented decisions over years. Why does this script clip to a 500-metre coastal buffer? Because a regulation that passed in 2019 required it, and the analyst who added the line left in 2021. The consultants interviewing the current team will not find this. Neither will anyone reading the code. When the cloud rewrite removes the clip because it looks redundant, the output is silently wrong.
Undiscovered dependencies
Manual inventory misses dependencies. Script A calls Script B via a subprocess. FME Workbench C reads from a network share that is also written by Script D. The consultant catalogues A, B, C, and D separately. Nobody notices D is upstream of C until the cloud version of C fails on day one of production because D has not been migrated yet.
Platform assumptions in legacy code
ArcPy scripts written on Windows often hardcode backslash-separated paths that break on Linux cloud instances. arcpy.env.workspace points to a file geodatabase at a UNC path. Rasterio writes to object storage need a local intermediate file, because S3 and GCS do not support the random-access seeks GDAL performs while writing, so a direct write that works locally can silently corrupt GeoTIFF output in the cloud. These platform-specific traps cost days each when they surface mid-deployment.
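The standard mitigation for the seek problem is a two-stage write: render the file on local disk, then copy it to its destination in one sequential pass. A minimal sketch in which a plain file copy stands in for an object-storage upload (in production that final step would be something like boto3's upload_file):

```python
import shutil
import tempfile
from pathlib import Path

def two_stage_write(render, destination: Path) -> None:
    """Write a seek-heavy format (e.g. GeoTIFF) to local disk first,
    then copy the finished file to its destination in one pass.

    `render` is any callable that writes a file to the path it is given.
    The copy stands in for an object-storage upload.
    """
    with tempfile.TemporaryDirectory() as tmp:
        local = Path(tmp) / destination.name
        render(local)                    # GDAL can seek freely on local disk
        shutil.copy(local, destination)  # sequential transfer, no seeking

# Demo: a plain binary file stands in for a rendered raster.
out = Path(tempfile.mkdtemp()) / "output.tif"
two_stage_write(lambda p: p.write_bytes(b"fake-geotiff"), out)
print(out.read_bytes())
```

The pattern costs one extra copy but removes a whole class of silent corruption, which is why generated pipelines default to it for GDAL formats.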
Scope creep from incomplete inventory
The SOW was written against an inventory of 80 scripts. Production holds 340. The consultant discovers this in week two. The client is told the scope needs to increase. The SOW is renegotiated. The timeline slips. This pattern is so common that experienced GIS project managers budget a 2x multiplier on any client-provided workflow count.
None of these are problems with the consultants. They are structural problems with doing discovery manually against codebases that were never designed to be audited. An automated system that reads code directly sidesteps most of them.
The Automated Alternative: How AI Agents Handle Each Step
Axis Spatial's migration agents work through the same eight steps, but five of them are handled programmatically, without a consultant in the room.
Automated: Inventory
Upload a folder with any mix of .py, .fmw, .qgz, .mxd, .shp, and .tif files. The agent scans every file and generates a complete inventory with types, sizes, and estimated complexity. What takes a consultant two days takes the agent four minutes. Files never leave the browser; parsing happens locally.
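The classification step is conceptually simple. A sketch of the same logic in Python (the extension table is illustrative; the actual agent also parses file contents to estimate complexity):

```python
import tempfile
from collections import Counter
from pathlib import Path

# Illustrative mapping from file extension to GIS asset type.
GIS_EXTENSIONS = {".py": "ArcPy/Python script", ".fmw": "FME workbench",
                  ".qgz": "QGIS project", ".mxd": "ArcMap document",
                  ".shp": "Shapefile", ".tif": "Raster"}

def inventory(root: Path) -> Counter:
    """Walk a folder tree and count recognised GIS assets by type."""
    counts = Counter()
    for path in root.rglob("*"):
        if path.suffix.lower() in GIS_EXTENSIONS:
            counts[GIS_EXTENSIONS[path.suffix.lower()]] += 1
    return counts

# Demo against a throwaway folder with three GIS assets and one stray file.
root = Path(tempfile.mkdtemp())
for name in ["clip.py", "etl.fmw", "basemap.qgz", "notes.txt"]:
    (root / name).touch()
print(inventory(root))
```

The value is not the counting itself but the completeness: the scan covers every file on the share, including the 260 scripts nobody remembered.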
Automated: Dependency Mapping
The agent parses imports, file reads, subprocess calls, and FME workspace connections to build a dependency graph. Script A calling Script B is detected automatically. FME workbenches reading from specific geodatabases are identified. The dependency graph is presented for human review before migration starts, which means the "discovered 340 scripts when we thought we had 80" problem surfaces in minute five, not week two.
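For Python sources, this kind of parsing can be done with the standard library's ast module rather than regexes. A minimal sketch that extracts imports and subprocess-launched scripts; a full graph builder would also follow file reads and FME workspace references:

```python
import ast

def script_dependencies(source: str) -> dict:
    """Extract imported modules and subprocess-launched scripts from source."""
    tree = ast.parse(source)
    imports, launched = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            imports.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            imports.add(node.module)
        elif (isinstance(node, ast.Call)
              and isinstance(node.func, ast.Attribute)
              and node.func.attr in {"run", "call", "Popen"}):
            # Any string argument ending in .py is treated as a launched script.
            for arg in ast.walk(node):
                if (isinstance(arg, ast.Constant)
                        and isinstance(arg.value, str)
                        and arg.value.endswith(".py")):
                    launched.add(arg.value)
    return {"imports": sorted(imports), "launches": sorted(launched)}

src = """
import arcpy
import subprocess
subprocess.run(["python", "post_process.py"])
"""
print(script_dependencies(src))
```

Run over every script in the inventory, these per-file results become the edges of the dependency graph presented for review.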
Human required: Business Logic Capture
The agent identifies code patterns that likely encode business rules: magic numbers, spatial parameters without comments, conditional logic that lacks documentation. It then interviews the GIS team about these specific points rather than running open-ended workshops. The scope of human involvement shrinks from "tell us everything" to "explain these 14 specific decisions." This typically takes 45-90 minutes rather than two days.
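The magic-number part of that detection can be approximated with the ast module. A sketch under the assumption that a short ignore-list of common values is enough to separate noise from candidate business rules:

```python
import ast

# Values that rarely encode business rules; everything else gets flagged.
# This ignore-list is an illustrative assumption, not a tuned heuristic.
BORING = {0, 1, -1, 2, 100}

def find_magic_numbers(source: str) -> list:
    """Return (line, value) pairs for numeric literals that may encode
    undocumented business rules, e.g. a 250 m buffer distance."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if (isinstance(node, ast.Constant)
                and isinstance(node.value, (int, float))
                and node.value not in BORING):
            hits.append((node.lineno, node.value))
    return hits

src = "buffered = roads.buffer(250)\nclipped = coast.buffer(500)\n"
print(find_magic_numbers(src))  # [(1, 250), (2, 500)]
```

Each hit becomes one targeted interview question ("why 250 and not 200?") instead of an open-ended workshop prompt.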
Human-assisted: Target Architecture
The agent recommends a target platform based on workload type: raster-heavy workloads fit Databricks or AWS better than Snowflake, pure SQL analytics sit naturally on Snowflake, and serverless event-driven pipelines suit AWS Lambda. The recommendation is explained with trade-offs. The human selects or overrides. This is a 10-minute decision, not a two-day workshop.
Automated: Code Generation
Three agents collaborate on the rewrite. A Planner architects the migration at the workflow level. A Builder generates code for each script, translating arcpy.analysis.Buffer to GeoPandas, arcpy.da.SearchCursor to vectorised Pandas operations, and FME transformers to equivalent Python functions. An Auditor validates every output; if it finds issues, the Builder revises for up to three rounds before flagging for human review. Platform-specific traps (two-stage writes for GDAL formats, os.path failures on GCS) are handled by patterns learned from production deployments.
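The Builder/Auditor loop can be sketched as a simple control structure. The toy build and audit functions below stand in for LLM agents; only the loop shape reflects the described behaviour:

```python
def build_with_audit(build, audit, max_rounds: int = 3):
    """Run a build/audit loop: the builder produces output, the auditor
    returns a list of issues, and the builder revises until the audit
    passes or the round budget is spent.

    `build` takes the current issue list (empty on the first round);
    `audit` takes the build output. Both are stand-ins for LLM agents.
    """
    issues = []
    for round_no in range(1, max_rounds + 1):
        output = build(issues)
        issues = audit(output)
        if not issues:
            return {"status": "passed", "rounds": round_no, "output": output}
    return {"status": "needs_human_review", "rounds": max_rounds,
            "issues": issues, "output": output}

# Toy agents: the builder fixes the flagged issue on its second attempt.
def toy_build(issues):
    return "v2" if issues else "v1"

def toy_audit(output):
    return [] if output == "v2" else ["missing CRS assignment"]

print(build_with_audit(toy_build, toy_audit))
```

The round budget is the important design choice: it bounds cost and guarantees that anything the Auditor cannot sign off reaches a human rather than looping forever.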
Automated: Output Validation
Cloud outputs are compared against legacy results automatically: feature counts, CRS, attribute values, and spatial accuracy within configurable tolerance. Pipelines are only marked complete when outputs match. Discrepancies are flagged with exactly what diverged and which line of generated code is likely responsible.
Automated: Deployment and Error Correction
Generated pipelines deploy to the target platform with one click. If deployment fails, the system classifies the error against a knowledge base of documented GIS deployment issues, generates a fix, tests it in isolation, and redeploys. Up to three automatic attempts before escalating. Each fix is persisted so future migrations benefit from it.
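The error-classification step amounts to matching failure logs against known signatures. A sketch with an invented knowledge base; the signatures shown are typical GDAL/AWS deployment errors, not the platform's actual entries:

```python
# A tiny knowledge base mapping error signatures to documented fixes.
# Entries are illustrative examples, not a real product knowledge base.
KNOWLEDGE_BASE = [
    ("AccessDenied",
     "IAM role missing s3:PutObject; attach the policy and retry"),
    ("not recognized as a supported file format",
     "GDAL driver missing on the worker image; install it and retry"),
    ("Unable to open EPSG support file",
     "PROJ data not found; set PROJ_LIB in the job environment"),
]

def classify_error(log: str):
    """Match a deployment log against known failure signatures.

    Returns the documented fix, or None for an unknown failure,
    which is the signal to escalate to a human.
    """
    for signature, fix in KNOWLEDGE_BASE:
        if signature in log:
            return fix
    return None

log = "botocore.exceptions.ClientError: AccessDenied when calling PutObject"
print(classify_error(log))
```

Persisting each new signature-fix pair after a successful retry is what makes later migrations cheaper than earlier ones.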
Manual Consulting vs Automated Migration: Real Numbers
The comparison below uses a concrete example: a mid-sized infrastructure consultancy with 120 ArcPy scripts and 30 FME workbenches, migrating to AWS (Glue + S3 + Athena).
| Dimension | Manual Consulting | Axis Spatial |
|---|---|---|
| Discovery and inventory | 5-8 days | 4 minutes |
| Business logic workshops | 3-5 days | 45-90 min |
| Code translation (150 files) | 10-15 days | 2-4 hours |
| Output validation | 3-5 days | Automated |
| Deployment | 2-3 days | One click |
| Total calendar time | 4-6 weeks | Hours to days |
| Cost (150-file estate) | $200K - $350K | Subscription |
| Files leave your environment | Yes (consultant laptops) | Never |
| Re-migration if requirements change | Full SOW again | Re-run |
The cost comparison needs one honest clarification: the first migration on an automated platform includes a learning curve. Validating generated outputs, reviewing dependency graphs, and handling the exceptions the agents flag takes GIS team time. Budget 1-2 days of senior GIS engineer time on a first migration. By the second migration, that drops to 2-4 hours. The consultancy comparison includes no equivalent reduction: every migration costs about the same, because the hours are driven by the discovery process, which does not get faster with repetition.
When Automated Migration Does Not Work
Automated migration is not a universal replacement for consulting. There are four categories of work where automated tools will produce an incomplete result and human expertise is required throughout, not just for the 45-90 minute interview.
ArcObjects COM code
ArcPy scripts that call ArcObjects through COM interfaces (using comtypes or the arcobjects bindings) rely on ESRI's C++ internals that have no direct open-source equivalent. A geometric network topology operation or a custom raster algebra that calls into ArcObjects cannot be translated by pattern matching. These require a GIS developer to assess what the COM operation actually does and find or build an equivalent.
Highly custom FME transformers
FME workbenches using standard transformers (Reprojector, Clipper, Joiner) migrate well. Workbenches built around PythonCaller or TCLCaller with complex embedded logic, or those using custom installed transformers from the FME Hub, need individual review. The agent will flag these but cannot guarantee the translation without human verification.
Workflows with departed original authors and no documentation
If the analyst who wrote a critical script left the organisation two years ago, nobody can answer the agent's targeted questions about undocumented decisions. The agent will flag the uncertainty and generate a best-interpretation translation, but the output cannot be validated until the workflow is run against real production data. This is a lower risk for internal analytics scripts and a higher risk for regulatory submission workflows where output errors have consequences.
Network Analyst and advanced topology workflows
ESRI Network Analyst routing, service area analysis, and origin-destination matrix generation have no direct one-to-one equivalent in open-source Python. osmnx and pgRouting cover many use cases but require architectural decisions (what graph data model, what routing algorithm, what edge weight schema) that cannot be automated. These migrations need a GIS architect involved from the start.
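The edge-weight decision in particular changes the answer, not just the implementation. A self-contained Dijkstra sketch (stdlib only, standing in for osmnx/pgRouting) in which the same fabricated road network yields different "shortest" routes depending on which attribute drives the cost:

```python
import heapq

def shortest_path_cost(graph, start, goal, weight="length"):
    """Dijkstra over an adjacency dict; `weight` selects which edge
    attribute drives routing (distance vs travel time), one of the
    decisions a Network Analyst migration must make explicitly."""
    queue = [(0.0, start)]
    best = {start: 0.0}
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, attrs in graph.get(node, {}).items():
            new_cost = cost + attrs[weight]
            if new_cost < best.get(nbr, float("inf")):
                best[nbr] = new_cost
                heapq.heappush(queue, (new_cost, nbr))
    return float("inf")  # goal unreachable

# Same fabricated road network, two weight schemas, two different answers.
roads = {
    "depot": {"a": {"length": 1.0, "minutes": 10.0},
              "b": {"length": 3.0, "minutes": 2.0}},
    "a": {"site": {"length": 1.0, "minutes": 10.0}},
    "b": {"site": {"length": 3.0, "minutes": 2.0}},
}
print(shortest_path_cost(roads, "depot", "site", weight="length"))   # 2.0
print(shortest_path_cost(roads, "depot", "site", weight="minutes"))  # 4.0
```

Network Analyst bakes choices like this into its solver configuration; an open-source rebuild has to surface and re-make each one, which is why an architect is needed.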
In practice, most GIS estates have 5-15% of their workflows in one of these categories. The remaining 85-95% are standard spatial operations (buffer, clip, join, reproject, raster analysis) that migrate well with automated tooling. The right approach is to use automated migration for the bulk of the estate and allocate the freed budget to proper human review of the complex exceptions.
The GIS migration market is still priced as if discovery and translation are inherently manual tasks. They are not.
Code can be read programmatically. Dependencies can be mapped automatically. Standard API calls can be translated by pattern. Outputs can be validated against legacy results without a consultant in the room. The 4-6 week consulting timeline is a relic of a period when those tasks required humans because there was nothing else.
The work that still requires humans (capturing undocumented business logic, making architecture decisions, reviewing exception cases) is genuine expertise. That is where the consulting budget should go, not on manually reading 300 ArcPy scripts line by line.
