Unified Data Platform for Engineering Workflows

Role
UX Researcher · Architecture & Discovery
Year
2026

01 — The Executive Summary
The client is a global leader in energy services, operating across dozens of countries. Their drilling engineering teams are responsible for designing and executing complex well programs — decisions that directly affect safety, cost, and efficiency for oil and gas operators worldwide.
The business goal was clear: build a centralised, cloud-based platform that would let engineers query historical data, run analyses, and make better decisions — faster. My job was to understand what that actually needed to look like from a human perspective before a single line of code was written.
02 — The Strategic Problem
The stated problem was "we need a database." The real problem was much more human.
When the project brief arrived, it was framed as a technical integration challenge — connect the systems, consolidate the data, done. But when I sat down with engineers, a very different picture emerged.
"At least half my day is spent copying, cleaning, and reformatting data, before I've done any actual engineering."
These engineers were deeply skilled professionals. But their tools forced them to behave like data entry operators. I identified four distinct human problems driving this:
Information Silos: Software was workstation-based, meaning data was locked on individual laptops and inaccessible to the wider team.
The "Manual Tax": Engineers spent up to half their day cutting and pasting data from flat files and proprietary repositories into modeling tools.
Knowledge Loss: Critical lessons learned from previous projects were not captured in a searchable format, leading to redundant work across different product lines.
Strategic Goal: Develop a single user entry point (Portal) that integrates fragmented data sources into a "smart" repository compatible with future AI/ML automation.
03 — Discovery & Evidence-Based Insights
This was a highly specialised technical domain. Engineers spoke in acronyms and assumed shared context that didn't exist on the UX side. My first challenge was becoming fluent enough in their work to ask the right questions — without pretending to be an engineer.
I conducted in-depth interviews with drilling engineers, specialist engineers from a second practice area, and operations managers across multiple regions. I also reviewed existing requirement documents, workflow diagrams, and historical process materials to ground my questions in reality.
Insight 01: The Efficiency Tax
Finding: Engineers spent 50% of their day on manual data transposition, accepting it as "part of the job".
Pivot: Re-architected the workflow to automate data ingestion, aiming to recover 30% of engineering capacity for high-value analysis.
Insight 02: The Trust Barrier
Finding: Any automation is rejected if it acts as a "black box"; engineers only trust data they can trace.
Pivot: Built radical transparency into the ingestion logic, including mandatory QA/QC checkpoints and "Chain of Custody" markers for every data point.
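A "Chain of Custody" marker of this kind can be sketched as a small data structure. This is a minimal, hypothetical illustration, not the platform's actual schema; the field names, the `qa_checkpoint` range check, and the `custody_log` shape are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataPoint:
    """One ingested measurement carrying its chain-of-custody metadata."""
    value: float
    source_system: str          # where the value originated (assumed field)
    ingested_at: str            # ISO timestamp of ingestion
    qa_status: str = "pending"  # "pending" | "passed" | "failed"
    custody_log: list = field(default_factory=list)

    def record_step(self, actor: str, action: str) -> None:
        """Append one custody event so every transformation stays traceable."""
        self.custody_log.append({
            "actor": actor,
            "action": action,
            "at": datetime.now(timezone.utc).isoformat(),
        })

def qa_checkpoint(point: DataPoint, lower: float, upper: float) -> DataPoint:
    """Mandatory QA/QC range check; the point keeps its result rather than
    being silently dropped, so the engineer can always see why it passed or failed."""
    point.qa_status = "passed" if lower <= point.value <= upper else "failed"
    point.record_step("qa_service", f"range check [{lower}, {upper}]")
    return point
```

The design point the sketch makes is that provenance travels with the value itself: no transformation happens without leaving an entry the engineer can audit.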
Insight 03: Contextual Intelligence
Finding: Raw data is noise without geological or temporal context (e.g., formation type or regional performance).
Pivot: Centered the Portal design on Query Flexibility, allowing users to filter massive datasets by specific "Use Case" criteria like formation tops or drilling hazards.
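Use-case filtering of this sort reduces, in its simplest form, to matching records against arbitrary field/value criteria. The sketch below is illustrative only; the field names (`formation`, `region`) and the flat-dictionary record shape are assumptions, not the Portal's real data model.

```python
def filter_wells(records, **criteria):
    """Return the records matching every supplied use-case criterion.

    Criteria are plain field/value pairs (e.g. formation="sandstone");
    a record missing a criterion field simply never matches.
    """
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

# Hypothetical sample data for illustration.
wells = [
    {"well_id": "W-001", "formation": "sandstone", "region": "north_sea"},
    {"well_id": "W-002", "formation": "shale", "region": "north_sea"},
]
matches = filter_wells(wells, formation="sandstone")
```

Keeping criteria open-ended like this is what lets new "Use Cases" (formation tops, drilling hazards) be added without reworking the query layer.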
Insight 04: Bridging the Institutional Silo
Finding: Parallel teams were often duplicating 50% of their work due to zero cross-functional visibility.
Pivot: Implemented a Unique Well Identifier to serve as a technical bridge, allowing different product lines to instantly access shared "lessons learned".
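The bridging role of a Unique Well Identifier can be shown with a simple index: records from separate product lines keyed under one identifier. The `uwi` field name, the record shapes, and the team names below are hypothetical, chosen only to make the join visible.

```python
from collections import defaultdict

def merge_by_well(*product_line_records):
    """Index every product line's records under one Unique Well Identifier,
    so a single UWI lookup surfaces lessons learned from all teams at once."""
    index = defaultdict(list)
    for records in product_line_records:
        for rec in records:
            index[rec["uwi"]].append(rec)
    return index

# Hypothetical records from two product lines, for illustration.
drilling = [{"uwi": "UWI-100", "team": "drilling", "lesson": "lost circulation at depth"}]
completions = [{"uwi": "UWI-100", "team": "completions", "lesson": "screen-out risk in zone B"}]
shared = merge_by_well(drilling, completions)
```

The identifier does the institutional work: neither team changes its own records, yet both become visible through the same key.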
Insight 05: Preserving Expert Autonomy
Finding: Engineers are protective of their judgment; AI recommendations are seen as a threat if they feel "mandatory".
Pivot: Adopted an "Advisor, Not Dictator" model — the system generates preliminary drafts (e.g., Driller's Roadmaps) for the engineer to "tweak" and validate.
One stakeholder conversation I'm proud of: the engineering lead initially wanted the AI features as the headline of Phase 1. I pushed back — citing research showing that engineers would dismiss the platform entirely if they didn't trust the underlying data first. We agreed to position AI as a Phase 1 architectural requirement (so the data model would support it) but a Phase 3 user-facing feature. That sequencing decision likely saved the project from a credibility problem at launch.
04 — The UX Process
From interviews to a complete information architecture. My process moved through four stages, each one building directly on the last.
User Interviews & Contextual Shadowing - Conducted sessions with engineers across both practice areas. Asked them to walk through real recent tasks. Recorded pain points, workarounds, and the moments where their existing tools broke down.
Insight Synthesis & Persona Development - Color-coded affinity mapping across all research notes. Grouped themes into five core insights. Built three personas representing the distinct mental models, goals, and frustrations of the user community.
Workflow Mapping & User Capabilities Matrix - Mapped the end-to-end engineering workflow — what engineers do, when, in what order, and where data enters and exits. Built a capabilities matrix defining what each user type can see, create, edit, and share. This became the access control framework for the platform.
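A user capabilities matrix of this kind maps naturally onto a role-by-action table. The roles and the view/create/edit/share action set below are assumed for illustration; the real matrix covered more user types and finer-grained permissions.

```python
# Hypothetical capabilities matrix: each role maps to the actions it may perform.
CAPABILITIES = {
    "drilling_engineer":   {"view", "create", "edit", "share"},
    "specialist_engineer": {"view", "create", "edit"},
    "operations_manager":  {"view", "share"},
}

def can(role: str, action: str) -> bool:
    """True if the role's row in the capabilities matrix grants the action;
    unknown roles get no access by default."""
    return action in CAPABILITIES.get(role, set())
```

Expressing access as data rather than scattered conditionals is what lets the matrix serve as a single source of truth for the platform's access control.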
Sitemap & System Principles - Translated the workflow model into a full information architecture. Defined system principles that the platform must follow (e.g., "data quality is always visible," "the engineer is always the decision-maker"). Prioritised features into two delivery phases based on engineer needs and business readiness.
05 — Outcome and Impact
What the research enabled. This project didn't ship a product during my tenure — it shipped a foundation. The measure of success here is the quality and completeness of that foundation, and its direct influence on what comes next.
Aligned stakeholders around a clear product direction
Reduced ambiguity in system design
Established a scalable foundation for future development
06 — Reflection
What worked: The decision to do contextual, task-based interviews rather than survey-style research was critical. Engineers were used to being asked what they "need" — and they'd give you a wishlist. Watching them work surfaced what they'd stopped noticing was broken.
What I'd do differently: I would push earlier for a co-design session with engineers to validate the IA structure before finalising it. The pivot I made (reorganising around workflow vs. data type) came from my own synthesis, which was right — but testing it earlier would have given me more confidence and potentially saved a round of revision.
The hardest part: Working in a domain where I was always the least technically knowledgeable person in the room. I had to build enough credibility to ask "why" questions that engineers initially found surprising — "why do you do it that way?" is a question that doesn't get asked much in technical industries. Learning to sit comfortably in that role, and to trust that my outsider perspective was an asset rather than a liability, was the most important thing I developed on this project.
Scope of Work
