BAE Systems
A Unified Analytics Platform for Rapid Pattern Discovery
Defense, Aerospace and Security Company



Designs and data have been modified and do not represent real products or systems.
At a Glance
Over eight twelve-week program increments, we transformed a fragmented set of tools into a unified, AI-enhanced analytics platform, reducing user abandonment and manual workflows through a human-centered DeLUX (Lean UX) process.
~15
Features Developed
20+
Users Tested
+20%
Time Saved
40%
Increase in Satisfaction Score
Who is BAE Systems?
BAE Systems is an international defense, aerospace and security company that delivers a full range of products and services for air, land, space and naval forces, as well as advanced electronics, security, information technology solutions and customer support services.

My Role
Assistant UX Design Lead (in a team of 4 with one UX Design Lead and two UX Designers)
Platform
Responsive website
UX Methods
User Research, Competitive Audit, Wireframing, Prototyping, Usability Testing, Design Systems, Section 508 Accessibility
Tools
Adobe XD, Adobe Illustrator, Confluence/Jira, Microsoft Suite, Figma
Duration
Feb 2024 – Dec 2025
THE CHALLENGE
Bridging the Gap: Overcoming Data Silos and User Abandonment
The Problem
Analysts used a disparate patchwork of tools, manually documenting data and insights on local devices. This caused high cognitive load and information fragmentation. Furthermore, existing prototypes were outdated and clunky, leading to user abandonment.
At the beginning of 2024, BAE Systems had completed a few prototypes of data analysis tools that were continuously reworked due to usability issues and significant user abandonment.
The program goals:
Redesign failed prototypes and incorporate user feedback to improve usability
Add share and collaboration features natively in the tool

Integrate AI to recommend insights and evidence for the user's area of focus
THE SOLUTION
A Unified and Modernized Analytics Platform
Designs and data have been modified and do not represent real products or systems.
Unified Data Hub
User Need: Users need to capture, organize, and access data in one place so they can easily connect information and work more efficiently.
Pain Points: Data was scattered across spreadsheets and local files, making it hard for users to find, connect, and share information.
Solution: A unified tool with a centralized space to gather, store, and document data in one real-time environment

Share & Export
User Need: Users need a fast, reliable way to share insights and outputs with others so they can align quickly and move cases forward.
Pain Points: Sharing work required manual copying, reformatting, and external tools, slowing collaboration and increasing the risk of miscommunication.
Solution: A one-click sharing and quick-export feature that lets users instantly distribute insights and generate report-ready outputs without manual reformatting.

AI-Powered Insights
User Need: Users need help quickly surfacing relevant insights and next steps so they can make confident, timely case decisions.
Pain Points: Manually reviewing large volumes of data made it difficult to identify patterns, prioritize evidence, and reach conclusions efficiently.
Solution: Integrate AI to automatically surface relevant evidence, highlight patterns, and recommend potential case conclusions and actions, triggered from a case's side-panel tab.

THE PROCESS
The DeLUX Process, Google Design Sprints, and Section 508
We followed an internal process called DeLUX (Designing with Lean UX), a human-centered design process that evolves the double diamond and is grounded in the Lean UX cycle of Think → Make → Check.

Throughout the design process, we applied core UX and visual accessibility best practices—such as clear hierarchy, readable typography, and thoughtful contrast—particularly when designing in dark mode, to support usability and reduce cognitive load.
For major capabilities, we ran 5-Day Google Design Sprints (GDS) to compress the timeline from concept to high-fidelity prototype. In the weeks following each GDS, we refined the sub-flows and vetted feasibility with SMEs, engineers, and available users.

Once the product design was finalized, we conducted a Section 508 accessibility assessment to ensure compliance with federal accessibility standards. This review included validating contrast ratios, text legibility, focus states, and interaction patterns across key workflows.
While formal compliance validation occurred at the end of the process, insights from the assessment informed future iterations and reinforced accessibility considerations for subsequent design cycles.
USER RESEARCH
Understanding the Analyst
Before starting the project, we synthesized existing research artifacts—including personas and user journey maps—from prior work. We built on this foundation by conducting participatory design sessions and remote user interviews during GDS workshops. Together, these efforts helped us identify key behaviors, needs, and pain points. We conducted our research with these goals in mind:
Primary Research Goals
- How do users analyze data and generate insights?
- What information are users actively looking for?
- Where do users experience friction or inefficiencies?
Through research synthesis, we identified this journey map for how analysts approach their work along with their pain points.

Noted Pain Points:

Fragmented Tooling
Users rely on multiple disconnected interfaces, creating friction and slowing down workflows.

Too Much to Retain
Data, notes, and insights are difficult to save, revisit and build over time.

Limited Collaboration & Visibility
Users struggle to easily share data and analysis with teammates.

Repetitive, Time-Consuming Work
Lack of reusable data and analysis forces users to redo work, wasting time and effort.
UNDERSTANDING THE INDUSTRY
Competitors Have More Robust Information Architecture
We analyzed adjacent analytics tools to benchmark our product against other platforms and evaluate how well they support end-to-end analytical workflows. This exercise revealed that competing tools offer more robust and intuitive information architecture, helping inform opportunities to reduce cognitive overload and improve discoverability in our product.
Existing Navigation
- Minimal navigation, limited to the workspace and a few utility items, makes the full analytical workflow unintuitive
- Critical actions buried in a "Tools" menu
- Visually cluttered content that reduces readability

Other Analytic Platforms
- More robust information architecture that makes relevant items easier to find
- More functionality within the workspace, including multi-views of data and user customization

MAKE AND TEST
Testing Prior Designs Against New Concepts
To validate our research insights and reduce risk before moving into final designs, we tested prior design solutions against newly proposed concepts throughout each GDS session. Many earlier designs lacked sufficient user feedback, particularly around information architecture, so testing became a critical step in ensuring the product supported intuitive, end-to-end analytical workflows.
Testing Approach
- Conducted A/B concept testing during GDS sessions where applicable
- Tested original designs vs. newly proposed concepts
- Sessions ranged from 5–20 users, depending on feature scope
- Focused on navigation structure, analysis process, and collaboration patterns
Key Tests & Findings (The designs shown are not representative of actual data and are just examples.)
1. Navigation and Information Architecture
- 80% of users preferred a dedicated dashboard to view multiple kinds of data and quickly assess stats, trends, and active cases
- Users wanted a clear home entry point and easy access back to full case listings
- The IA felt limiting, with critical tools buried outside the navigation

2. Analysis Working Area
- All users wanted a single, multi-purpose analysis working area to analyze data, gather information, and track progress
- All users favored a multi-data view over a single-data view, as it adapts to displaying a variety of information and performing various analytical tasks quickly

3. Notifications and Staying Informed of New Evidence or Updates
- All users wanted a centralized notifications area dedicated to new evidence, where details can be explored and filtered, rather than only a notifications menu (the original design)

4. Collaboration & Knowledge Sharing
- A majority of users preferred a collaboration whiteboard with sticky notes and templates to contribute context and organize insights
- All users wanted the ability to share work internally and externally
- All users wanted to retain the ability to export work to readable formats (supported originally)

TECHNICAL CONSTRAINTS
Balancing Scope, Timing, and MVP
Technical constraints required us to continually reassess priorities throughout the project. While the vision included a broad set of workspace and collaboration features, development was limited by a predefined component library and tight delivery timelines. To balance business needs, feasibility, and MVP scope, we made intentional tradeoffs in what to build first.
What changed:
Prioritized for MVP
- Core analysis space functionality
- Notifications for new evidence and updates
- Sharing
Pushed Back
- Whiteboard collaboration tool
- Exporting in multiple formats
FINAL DESIGN
The Final Design
Designs and data have been modified and do not represent real products or systems.



IMPACT
Improved Efficiency and Higher User Satisfaction
From the start, we tested concepts with remote users and measured usability using the System Usability Scale (SUS). As the product evolved through continuous testing and iteration, we observed meaningful improvements in both user sentiment and engagement:
20% increase in satisfaction ratings
Over 50% increase in usability-testing participation, driven by excitement about the product
~30% estimated reduction in time to insight
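For context on the SUS figures above: SUS is scored with a standard rule, where each of the 10 items is answered on a 1–5 scale, positively worded (odd-numbered) items contribute their score minus 1, negatively worded (even-numbered) items contribute 5 minus their score, and the total is scaled to 0–100. A minimal sketch of that calculation (the function name is illustrative, not from the project):

```typescript
// Score one participant's System Usability Scale (SUS) questionnaire.
// `responses` holds the 10 item answers, each on a 1–5 Likert scale.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS has exactly 10 items");
  }
  const total = responses.reduce((sum, r, i) => {
    // Odd-numbered items (index 0, 2, ...) are positively worded: r - 1.
    // Even-numbered items are negatively worded: 5 - r.
    return sum + (i % 2 === 0 ? r - 1 : 5 - r);
  }, 0);
  return total * 2.5; // scale the 0–40 raw total to 0–100
}

// A fully neutral participant (all 3s) lands at the midpoint:
console.log(susScore([3, 3, 3, 3, 3, 3, 3, 3, 3, 3])); // 50
```

Averaging these per-participant scores across test rounds is what let us track satisfaction movement between iterations.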
VISUAL DESIGN
Building the Design System from the Ground Up
I led the creation of the design system for a new product with no existing foundations. To ensure scalability and speed, I embedded design system development into the workflow and partnered closely with engineering to align on Vuetify (Material Design–based) for components and FontAwesome for icons.
Building the design system's foundation and components followed a process similar to how we designed features: research, design exploration, then narrowing to the final components:

Establishing design foundations:

Building components. Sample components displayed below:

ACCESSIBILITY
Section 508 Accessibility Testing
I directed the Section 508 accessibility initiative for this product, defining the overall testing strategy and VPAT/ACR approach. I partnered with internal accessibility experts and coached the design team on accessibility best practices to ensure compliance was built into both design and execution. These efforts resulted in a comprehensive Accessibility Conformance Report (ACR) that met federal requirements while also uncovering clear opportunities for future accessibility improvements.
Tools Used
To support testing, we combined automated and manual methods using tools such as WAVE, Accessibility Insights, Lighthouse, and various contrast analyzers. This approach helped uncover both surface-level and deeper interaction issues. Primary tools included:
- WAVE
- Accessibility Insights
- Lighthouse
- Adobe Contrast Analyzer
- Windows Screen Reader
Evaluation & Testing
We evaluated the product against 45 Section 508 and WCAG 2.0 success criteria using the ICT Baseline, ensuring comprehensive coverage across design and functional requirements. Tested areas included:
Color contrast and visual hierarchy
Keyboard navigation and focus order
Accessible names, labels, and ARIA attributes
Button and link consistency
Screen reader compatibility
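The contrast checks above follow the WCAG 2.0 definition of contrast ratio, which the contrast-analyzer tools compute from the relative luminance of each color. A minimal sketch of that calculation (helper names are illustrative, not from any project tool):

```typescript
// Convert a "#rrggbb" hex color into linearized sRGB channels.
function linearize(hex: string): [number, number, number] {
  const toLinear = (c: number): number => {
    const s = c / 255;
    // Piecewise sRGB linearization from the WCAG 2.0 definition.
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  const n = parseInt(hex.replace("#", ""), 16);
  return [toLinear((n >> 16) & 0xff), toLinear((n >> 8) & 0xff), toLinear(n & 0xff)];
}

// WCAG relative luminance of a color.
function luminance(hex: string): number {
  const [r, g, b] = linearize(hex);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// WCAG contrast ratio between two colors, always >= 1.
function contrastRatio(a: string, b: string): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Pure white on black yields the maximum 21:1 ratio; WCAG 2.0 AA asks
// for at least 4.5:1 for body text and 3:1 for large text and UI parts.
console.log(contrastRatio("#ffffff", "#000000").toFixed(2)); // "21.00"
```

This is especially relevant in dark mode, where mid-gray text on dark surfaces can slip below the 4.5:1 text threshold even when it looks acceptable.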
Findings & Improvements
Key Findings
- Color contrast met the 3:1 minimum ratio
- Missing or incomplete ARIA labels
- Keyboard navigation gaps
- Focus indicator patterns were consistent in major areas
- Screen reader limitations
Actionable Recommendations
- Implement a comprehensive ARIA attribute system
- Establish consistent keyboard interaction patterns
- Optimize semantic structure for screen readers and keyboard navigation
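To illustrate what a consistent keyboard interaction pattern can look like in practice, the roving-tabindex pattern from the WAI-ARIA Authoring Practices keeps exactly one item in a toolbar or list focusable at a time and moves focus with arrow keys. This sketch (names are illustrative, not from the product) computes which item should receive focus next:

```typescript
// Keys handled by a roving-tabindex widget such as a toolbar or listbox.
type NavKey = "ArrowRight" | "ArrowLeft" | "Home" | "End";

// Given the currently focused item, the item count, and the pressed key,
// return the index that should receive focus next (arrows wrap around).
function nextFocusIndex(current: number, count: number, key: NavKey): number {
  if (key === "ArrowRight") return (current + 1) % count;        // wrap last → first
  if (key === "ArrowLeft") return (current - 1 + count) % count; // wrap first → last
  if (key === "Home") return 0;
  return count - 1; // "End"
}

// In a real component, a keydown listener would call this, set
// tabindex="0" on the returned item, and tabindex="-1" on the rest.
console.log(nextFocusIndex(4, 5, "ArrowRight")); // 0 (wraps to the first item)
```

Centralizing this logic in one helper is what makes the pattern "consistent": every arrow-key widget in the product navigates the same way.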
Retrospective
Over two years of co-leading this design effort, I’m proud of what our team accomplished—building UX from the ground up at an organization where design was still emerging. Establishing design practices, processes, and trust alongside delivery was both challenging and deeply rewarding.
Key takeaways I’ll carry forward:
- Frequent, cross-functional communication. Regular collaboration with product managers, scrum leads, systems engineers, and software engineers allowed us to iterate quickly, adapt to shifting priorities, and reduce downstream rework.
- Consistent retrospectives to improve the process. Holding biweekly retrospectives throughout each 12-week program increment gave the team space to reflect, adjust, and continuously improve how we worked together.
- Investing in the design system earlier. While we ultimately built a strong design system, prioritizing it earlier would have reduced rework and enabled users to test with a more cohesive look and feel sooner—an insight I’ll apply to future projects.

