NGA — Map of the World Modernization
Enterprise geospatial platform modernization in a highly constrained environment—re-architecting navigation, governance, and a closed-loop user feedback system that turned global signals into roadmap decisions.
TL;DR — What Changed (Impact First)
Quick summary of where we started and what we delivered
Before (Problems Inherited)
- Fragmented navigation and inconsistent UX patterns across legacy modules
- Limited ability to translate user/stakeholder feedback into prioritized, testable backlog work
- High compliance + delivery friction (air-gapped constraints, accessibility, ATO realities)
- Developer friction: unclear UX standards, inconsistent handoffs, rework risk
My Scope (Solo UX, Manager-Level)
After / What Changed
- Implemented a durable closed-loop feedback system: signals → coded inputs → backlog → roadmap → release → telemetry → next sprint
- Re-architected IA and navigation around analyst/admin task models
- Shipped a scalable component framework (tokens, patterns, accessibility rules)
- Embedded UX into Agile + DevSecOps pipelines (2–4 sprints ahead)
- Established UX SOPs, RACI, cadence, and governance for continuous improvement
Problem & Context (5Ws)
Strategic context for enterprise platform modernization
Expert Users
Analysts and administrators who rely on the platform for time-sensitive geospatial analysis and reporting
Mission-Critical Platform
An enterprise geospatial platform undergoing modernization while still supporting ongoing operational use
Constrained Environment
Limited connectivity (air-gapped constraints) and strict compliance requirements
Multi-Release Effort
Modernization delivered through Agile increments, aligned to delivery milestones
Reduce Friction & Standardize
Reduce friction in core user tasks, standardize UX across modules, and create measurable, repeatable continuous improvement—despite constraints
Product → UX Mapping
How legacy system modernization translates into user-centered design outcomes
- Legacy baseline: fragmented tools with inconsistent patterns, siloed data, and steep learning curves
- Research methods: SME interviews, heuristic evaluations, and structured feedback collection to identify pain points
- IA outcome: information architecture restructured around analyst and admin mental models and workflows
- Design system outcome: consistent patterns, design tokens, and a component library across all modernized modules
My Role (Sole UX, Manager-Level Scope)
I served as the sole UX practitioner while operating at a manager level—owning UX strategy, research, design execution, product planning, and delivery operations. My defining contribution was designing and implementing a closed-loop feedback system that made continuous improvement durable, measurable, and actionable inside an air-gapped, compliance-heavy environment.
Product + Ops Ownership
- Product planning, backlog shaping, and roadmap framing from user/stakeholder signals
- Stakeholder management, expectation-setting, and risk/feasibility tradeoffs
- Working 2–4 sprints ahead to keep design/dev aligned and reduce rework
Research + Signal Design
- Designed data-call + questionnaire instruments to convert anecdotal input into structured signals
- Conducted SME interviews, heuristic evaluations, and gap analyses
- Built personas, journey strips, and service blueprints spanning on-screen and off-screen steps
IA + UX Execution
- Re-architected navigation and IA around analyst/admin task models
- Designed high-fidelity flows (analyst + admin) and onboarding wizard using progressive disclosure
- Created data visualization guidance for mission-speed interpretation
Design Systems + Accessibility
- Defined component framework, themes, design tokens, and accessibility rules (508/WCAG)
- Established UX standards that engineering could implement consistently
Agile + DevSecOps Integration
- Wrote user stories, Definition of Done, and acceptance criteria tied to measurable KPIs
- Integrated UX into DevSecOps workflows (including automation hooks and quality gates)
- Improved developer experience via clearer standards, repeatable handoffs, and validation steps
User Feedback Loop + Continuous Improvement
- Designed and operationalized closed-loop feedback system converting signals to roadmap decisions
- Established governance, SOPs, and cadence so improvement continued without heroics
- Routed telemetry and usage patterns back into sprint planning and backlog prioritization
Collaboration Map
Approach Overview
Discover → Define → Design → Deliver → Test → Iterate
Discover
- SME interviews + contextual inquiry (as feasible in constrained environments)
- Heuristic evaluations + gap analysis against current workflows and UI patterns
- Stakeholder alignment and requirements gathering sessions
Define
- Task-model-driven IA: analyst vs admin mental models
- Personas, journey strips, service blueprints
- KPI definitions + measurement plan (what we would track and why)
Design
- High-fidelity flows and prototypes validated through structured reviews
- Component framework + tokens + accessibility rules
- Data visualization guidance for fast interpretation and reduced cognitive load
Deliver
- Sprint-ready user stories, acceptance criteria, and Definition of Done
- Dev handoff optimized for implementation fidelity
- QA partnership (including accessibility validation and consistency checks)
Test
- Usability validation with SMEs and representative users
- Accessibility audits against Section 508 / WCAG 2.1 AA standards
- Performance and compatibility testing in constrained environments
Iterate
- Closed-loop feedback system operationalized (signals to roadmap to telemetry)
- Governance + SOPs + cadence so improvement continued without heroics
- Continuous refinement based on usage analytics and user feedback
Closed-Loop Continuous Improvement System
In an air-gapped, compliance-heavy environment, "feedback" often dies in email threads or meetings. I designed and owned a durable closed-loop system that converted user/SME signals into coded inputs, backlog decisions, roadmap commitments, and measurable releases—then routed telemetry back into the next sprint.
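The coding step of the loop, turning raw anecdotes into structured, backlog-ready signals, can be sketched as below. This is a minimal illustration only: the category names, severity scale, keyword rules, and the `codeSignal` helper are hypothetical, not the actual instruments used on the program.

```typescript
// Hypothetical sketch of coding raw feedback into structured signals.
// Categories, severity scale, and keyword rules are illustrative assumptions.

type RawSignal = { source: string; text: string };

type CodedSignal = {
  source: string;
  text: string;
  category: "navigation" | "data-import" | "accessibility" | "other";
  severity: 1 | 2 | 3; // 3 = blocks a core workflow
};

// Simple keyword-based coding so similar anecdotes roll up into one backlog theme.
function codeSignal(raw: RawSignal): CodedSignal {
  const t = raw.text.toLowerCase();
  const category =
    t.includes("menu") || t.includes("find") ? "navigation" :
    t.includes("import") || t.includes("ingest") ? "data-import" :
    t.includes("screen reader") || t.includes("contrast") ? "accessibility" :
    "other";
  const severity = t.includes("cannot") || t.includes("blocked") ? 3 : 1;
  return { ...raw, category, severity };
}

const coded = codeSignal({ source: "SME interview", text: "Cannot find the import menu" });
console.log(coded.category, coded.severity); // prints "navigation 3"
```

Once signals carry a category and severity, backlog triage becomes a sort-and-count exercise rather than a debate over anecdotes.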
The Feedback Loop
Operational Mechanics
- Data-call + questionnaire instruments to standardize input quality
- Governance: SOPs, policies, RACI, and cadence (so the process outlived any one person)
- Scorecards to track outcomes and keep stakeholders aligned
- Developer hooks: Definition of Done, accessibility gates, token usage rules, regression checks
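A token-usage gate like the one listed above could be as simple as scanning changed styles for raw color values that bypass the token layer. The check below is a hypothetical sketch; the `--token-` naming convention and the sample styles are illustrative assumptions, not the program's actual rules.

```typescript
// Hypothetical token-usage quality gate: flag raw hex colors that bypass
// design tokens. The naming convention and sample styles are illustrative.

const RAW_HEX = /#[0-9a-fA-F]{3,8}\b/;

// Returns offending lines so the pipeline can fail the check with context.
function findRawColors(source: string): string[] {
  return source
    .split("\n")
    .filter((line) => !line.includes("--token-") && RAW_HEX.test(line));
}

const sample = [
  ".nav { color: var(--token-color-primary); }",
  ".alert { color: #ff0000; } /* should use a token */",
].join("\n");

console.log(findRawColors(sample).length); // prints 1
```

Wired into CI, a non-empty result fails the build, which is what makes the standard durable rather than advisory.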
Telemetry + Measurement
- Measurement strategy designed for constrained environments
- Matomo integration for detailed user behavior analytics
- Usage patterns and friction signals routed back to backlog
- Recurring review cycles and backlog health checks
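Matomo's JavaScript tracker is driven by a `_paq` command queue, so a friction signal such as an abandoned wizard step can be recorded roughly as below. Only the `['trackEvent', category, action, name, value]` command shape comes from Matomo's documented API; the event names and the locally declared `_paq` (so the sketch runs outside a browser) are assumptions.

```typescript
// Matomo's in-page tracker consumes commands pushed onto a _paq array.
// _paq is declared locally here so the sketch runs outside a browser;
// the category/action/name values are hypothetical.
const _paq: (string | number)[][] = [];

// Record a friction signal: an analyst abandoned the import wizard at step 3.
// ['trackEvent', category, action, name, numericValue] is Matomo's command shape.
_paq.push(["trackEvent", "DataImport", "WizardAbandoned", "step-3", 3]);

// Downstream, events like these are aggregated and routed into backlog triage.
console.log(_paq.length); // prints 1
```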
Design Execution
Key design deliverables that drove the modernization

Information Architecture + Navigation
- Rebuilt navigation around task-based mental models (analyst/admin)
- Reduced cross-module inconsistency by standardizing naming, layout patterns, and navigation behaviors

Onboarding + Progressive Disclosure
- Designed an onboarding wizard that guided users through complex setup without overwhelming them
- Used progressive disclosure to balance expert power with clarity for less frequent tasks

Design System + Component Framework
- Material UI-based component patterns, tokens, and accessibility rules designed for reuse across modules
- Documentation standards so engineers could implement consistently
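As an illustration of the token layer, semantic tokens can reference core tokens so that a palette change propagates without touching component code. The token names and values below are hypothetical; on the program they would feed a Material UI theme rather than stand alone.

```typescript
// Hypothetical design-token sketch: semantic tokens reference core tokens,
// so a palette change propagates without touching component code.
const core = {
  blue700: "#1a4f8b",
  gray100: "#f4f5f7",
  space4: "16px",
} as const;

const semantic = {
  colorPrimary: core.blue700,
  colorSurface: core.gray100,
  spacingPanel: core.space4,
} as const;

// Components consume only semantic tokens, never core values directly.
console.log(semantic.colorPrimary); // prints "#1a4f8b"
```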

Data Visualization & User Behavior Analytics
- Guidance designed to support fast interpretation, comparison, and export workflows
- Matomo integration for tracking user behavior patterns and identifying friction points
- Reduced cognitive load for time-sensitive analysis tasks
Design Gallery
Wireframes, low-fidelity, and high-fidelity designs
Results (Shipped Outcomes + What We Measured)
Representative modernization use cases and the metrics we tracked
Shipped Outcomes
Representative modernization use cases:
- Expert-user UX consistency: unified patterns across modules to improve proficiency, workflow efficiency, sharing, and productivity
- Data import wizard: broke complex ingestion into a five-step workflow with clearer requirements and field mapping guidance—improving data awareness and reducing manual data grooming
- Export + reporting support: enabled exports in formats users needed for downstream work (e.g., PDF/DOCX/CSV/SHP/KML), improving record handling and continuity for analysis/reporting
- Advanced analysis support: designed workflows that made sophisticated analytics more approachable (e.g., clustering configuration and saving results for comparison)
- Improved access to critical data and enrichment: supported interfaces that made bulk enrichment and data access feasible for expert users without requiring scripts
What We Measured (Scorecard)
Measures designed/tracked (no invented results):
- Task success and friction points (top workflows)
- Onboarding completion + time-to-first-success
- Support tickets / repeat issues and time-to-triage
- Signal-to-roadmap match rate (how often user signals translated into shipped work)
- Adoption of standardized components/tokens (design system coverage proxy)
Stakeholder Outcome
Client/stakeholders supported continued investment because the program created a durable, measurable mechanism for continuous improvement under constraints.
Operating Mechanisms
How we operated at manager-level scope
UX Governance
SOPs, policies, RACI, and engagement cadence
Agile Integration
UX worked 2–4 sprints ahead; stories + acceptance criteria tied to measurable outcomes
DevSecOps Alignment
Quality gates for accessibility + token usage; clearer handoff standards; regression awareness
Continuous Improvement
Feedback loop operationalized with recurring review cycles and backlog health checks
Tools / Stack