Structured Observation Management (SOM) is a feature designed for the Esri products ArcGIS Pro and AllSource, bringing together computer vision and urban sensor data to support more intuitive, data-informed decision-making in GEOINT analysis and urban planning workflows.
2024 UX Design | User Research
The primary goal of the Structured Observation Management (SOM) framework was to bridge the gap between complex urban sensor data and the critical question faced by those in intelligence and urban planning: “What should I be looking at, and where?” My objectives focused on:
Designing intuitive workflows that integrate both still and motion imagery.
Reducing cognitive load by addressing pain points in multi-source analysis.
Ensuring scalability for use across domains like disaster response and public health.
The goal was to align technical capabilities with real user needs to support faster, smarter urban decision-making.
Through stakeholder discussions with product engineers and developers, we defined the following key features to be developed for the SOM framework.
Data Capture: Collects data from drones, satellites, urban cameras, and other sensors.
Information Extraction: Uses computer vision for object detection, including exploitation of Z-values in imagery.
Object Classification: Classifies detected objects using manual input and AI algorithms grounded in a predefined taxonomy and ontology.
Pattern Detection: Analyzes the classified data to identify patterns and generates dashboards that surface actionable insights.
To refine the SOM framework, I collaborated with a user researcher to survey intelligence analysts, environmental scientists, and urban planners at two key conferences:
Technical Exchange Meeting (TEM)
Esri User Conference
Below are a slide documenting the research process and the final report that aggregates the insights gathered.
Following the user research, I mapped out key workflows in FigJam to align technical capabilities with user needs, organized by each key feature and its capabilities. For example, below is the workflow for the information extraction stage, specifically satellite imagery correction and exploitation using the Z-axis.
From these workflows, I created user flows that detailed task sequences, decision points, and system responses, focusing on the pane where most interactions happen. These early design artifacts served as the foundation for the high-fidelity prototypes.
Beyond the Figma prototype, I also participated in the following processes:
Rendered the UI presented in Figma using the Visual Studio SDK.
Collaborated with the dev team on backend integration.
1. Data Capture
The interface enables users to load multiple still images collected by satellites and generate mosaic datasets to visualize a complete view of a target area (top).
It also supports video input, allowing users to geolocate footage on the map and visualize the flight path or movement of the capturing drone (bottom).
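As a rough illustration of how this stage could be scripted against the platform, here is a minimal ArcPy sketch that assembles still images into a mosaic dataset using the standard Create Mosaic Dataset and Add Rasters To Mosaic Dataset geoprocessing tools; the paths, spatial reference, and raster type are illustrative placeholders rather than values from the shipped implementation.

```python
# Minimal ArcPy sketch: assemble still satellite frames into a mosaic dataset
# so an analyst can view the full target area as a single layer.
# Paths, spatial reference, and raster type are illustrative placeholders.
import os
import arcpy

gdb = r"C:\data\som_capture.gdb"        # hypothetical file geodatabase
mosaic_name = "target_area_mosaic"
srs = arcpy.SpatialReference(4326)      # WGS 1984; substitute the project CRS

# Create an empty mosaic dataset, then load every raster found in the folder.
arcpy.management.CreateMosaicDataset(gdb, mosaic_name, srs)
arcpy.management.AddRastersToMosaicDataset(
    os.path.join(gdb, mosaic_name),
    raster_type="Raster Dataset",
    input_path=r"C:\data\still_images",  # folder of captured frames
)
```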
2. Information Extraction
The interface allows users to either manually annotate objects detected in the imagery or automate the process using computer vision, with all inputs managed through the side pane as contextual comments.
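To make the automated path concrete, the sketch below shows one way a pretrained detection model could be run over the mosaic with ArcPy's Image Analyst tools; the model package, argument values, and output names are hypothetical and not the production configuration.

```python
# Illustrative ArcPy sketch of the automated annotation path: run a pretrained
# deep learning model over the mosaic and write detections to a feature class.
# The .dlpk model package, arguments, and outputs are hypothetical placeholders.
import arcpy

arcpy.CheckOutExtension("ImageAnalyst")  # deep learning tools require Image Analyst

arcpy.ia.DetectObjectsUsingDeepLearning(
    in_raster=r"C:\data\som_capture.gdb\target_area_mosaic",
    out_detected_objects=r"C:\data\som_capture.gdb\detected_objects",
    in_model_definition=r"C:\models\object_detector.dlpk",
    arguments="threshold 0.5",           # model-specific; values are placeholders
)
```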
3. Object Classification
Once objects are identified, users can apply a range of analytical tools to aggregate data, generate statistical summaries, and uncover patterns—supported by algorithms grounded in a predefined taxonomy and ontology.
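The core idea of taxonomy-driven classification can be shown with a small, self-contained sketch; the labels and categories below are invented for illustration and are not the SOM ontology.

```python
# Illustrative sketch (not the production classifier): roll raw detector labels
# up through a simple taxonomy so downstream tools can aggregate by category.
from collections import Counter

# Made-up taxonomy mapping detector labels to higher-level categories.
TAXONOMY = {
    "sedan": "vehicle", "truck": "vehicle", "bus": "vehicle",
    "tent": "shelter", "container": "infrastructure",
}

def classify(detections: list[str]) -> Counter:
    """Map raw detector labels to taxonomy categories and count them."""
    return Counter(TAXONOMY.get(label, "unclassified") for label in detections)

print(classify(["sedan", "truck", "tent", "crane"]))
# Counter({'vehicle': 2, 'shelter': 1, 'unclassified': 1})
```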
4. Pattern Detection
After completing the analysis, users can visualize the results and generate dashboards to highlight patterns and insights uncovered during the process.
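For a sense of the aggregation behind such a dashboard, here is a small pandas sketch that rolls classified observations up into the kind of summary table a chart widget could consume; the field names and values are hypothetical.

```python
# Illustrative sketch: aggregate classified observations into a summary table
# that a dashboard widget could chart. Field names and values are hypothetical.
import pandas as pd

observations = pd.DataFrame({
    "category": ["vehicle", "vehicle", "shelter", "vehicle", "shelter"],
    "district": ["north", "north", "east", "south", "east"],
    "timestamp": pd.to_datetime(
        ["2024-05-01", "2024-05-01", "2024-05-02", "2024-05-02", "2024-05-03"]),
})

# Count observations per day, district, and category to surface hotspots.
summary = (
    observations
    .groupby([observations["timestamp"].dt.date, "district", "category"])
    .size()
    .rename("count")
    .reset_index()
)
print(summary)
```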
After the rollout, I worked closely with clients to support real-world use of the SOM framework—guiding users on how to extract insights and communicate findings effectively through the dashboard. By translating complex sensor data into actionable intelligence, the product became a critical tool in several mission-driven projects.
Notable applications include:
Opioid diversion tracking: Enabled public health and law enforcement teams to identify diversion risks 30% faster by visualizing supply patterns across regions.
FEMA disaster response analysis: Supported rapid situational assessments and improved resource planning, reducing time-to-decision during emergency events by an estimated 40%.
Through field engagement and feedback loops, the product was refined to better support analytical workflows—ultimately increasing adoption among key government and civilian users.