
BAE SYSTEMS
Data Analysis Tool for Users
BACKGROUND
By the beginning of 2024, BAE Systems had completed several prototypes of data analysis tools, each repeatedly reworked due to usability issues. The workflow was so technical that users found the prototypes clunky and tedious, leading to frustration and abandonment. When a new phase of the project was approved, I was brought in as a UX Designer, later becoming Assistant UX Design Lead, joining a 3-person UX team to fully redesign the product while incorporating AI.
Quick Disclaimer: Because the details of this product are confidential, this case study will primarily focus on the process versus showing a detailed product analysis.
MY ROLE
As Assistant UX Design Lead, my goal was to simplify the workflow, making it easier for users to understand the tool and draw conclusions from their analyses quickly. For my team, I also aimed to ensure we covered the "must-have" use cases and designed effectively. My role included a mix of UX, research, copywriting, and visual design.
I contributed to a cross-functional team of about 30 professionals, collaborating with product managers, software engineers, researchers, and system engineers.
Project Type
Responsive Website
Role
Assistant UX Design Lead (Syp Siligato,
UX Design Lead; Hannah Choi, UX Designer; Eugenia Tzeng, UX Designer)
Methods
User Research
Wireframing
Usability Testing
Design Systems
Tools
Adobe XD
Adobe Illustrator
Confluence/Jira
Microsoft Suite
Duration
Feb 2024-Present
PROBLEM
USERS LACK A CENTRALIZED WAY TO ANALYZE DATA QUICKLY
Upon taking on this project, my team analyzed research, existing systems, and prior prototypes and noted these key problems:
- Analysts struggled to quickly find and interpret relevant insights within large, complex datasets.
- Users lacked a centralized place to analyze data; information was scattered across multiple systems.
- Users needed to cross-reference multiple data sources, but workflows were slow and cumbersome.
- Users needed highly secure access controls without added friction in daily work.
- Collaboration between teams on insights and analysis was inefficient or fragmented.
- Existing tools and prior prototypes were outdated and didn't match the efficiency and usability of modern technology.

THE SOLUTION
HOW MIGHT WE ALLOW USERS TO ANALYZE DATA QUICKLY AND EASILY?

1
Centralized workspace that unifies scattered data and provides real-time AI insights
We created a unified environment where analysts can access and analyze all relevant data in one place, making insights easy to identify and eliminating the need to analyze data across multiple locations.
This is an example for illustrative purposes and does not represent any actual data.


2
Enhanced Collaboration
We redesigned the tool to enable collaboration, letting teams share insights and work together efficiently within a shared whiteboard that accounts for security requirements. In the prior workflow, the experience was limited to chat and a more manual process for sharing ideas, projects, and insights.
This is an example for illustrative purposes and does not represent any actual data.



3
Cohesive & Modern UI​
I led the creation of a cohesive design system, following Material Design principles and leveraging FontAwesome icons, to deliver a modern, engaging UI that makes complex data approachable, intuitive, and visually engaging.
APPROACH
UX APPROACH: 12 WEEK SPRINTS
This project operated in 12-week design sprints, each consisting of six 2-week increments. In each sprint, the UX team focused on designing 2–3 features that would be implemented in the following sprint. We followed a cyclical approach that included planning, collaborating with research on user needs, designing concepts, validating feasibility, and refining designs to a final version.
Our process was fluid: as we designed and failed fast, we iterated on our workflow and aligned with architecture and engineering leads on the final process below.







UNDERSTANDING THE USER
COMMUNICATING REGULARLY WITH RESEARCH AND CONDUCTING REMOTE USER TESTING PROVIDED USER INSIGHTS
Due to the sensitive nature of the information, details about our users cannot be shown, but our process is described below. To arrive at personas and user journey maps, we leveraged existing user data and worked with our research team to gather information. This provided a foundation for the wireframes of each feature. Each quarter, we were introduced to new features to design. Although we kept the personas and user journey maps as references, we continuously gathered user information through the following:
Meet Frequently with User Research Team
- Who performed this? My team and I
- What was performed? Ad-hoc conversations and scheduled meetings to discuss use cases and gather design feedback
- Why? We met with our research team to understand use cases throughout the feature design lifecycle mentioned above, ensuring we took a user-centric approach
- When were these performed? Daily
Perform Remote User Testing
- Who performed this? My team and I
- What was performed? UX facilitated remote testing sessions: we led participatory design sessions to gather users' ideas, or asked users to give feedback on existing mockups. We also collected survey data to capture satisfaction ratings and any feedback not voiced aloud
- Why? To support features that had little user feedback, where we needed to gather insights within a short time frame to inform an initial concept
- When were these performed? As needed
It's important to note that these two methods are not an either/or choice; they are used in conjunction with each other. The former, meeting with the user research team, is more unscheduled: we can ask for support whenever we need to fill gaps in use cases. The latter is more time-boxed, helping get our creative juices flowing on topics where we and the user research team need more data.
WIREFRAMES
THE PROCESS OF CREATING WIREFRAMES
In each 12-week sprint, the UX team focused on designing 2–3 features that would be implemented in the following sprint, with each UX designer leading features to bring to low fidelity by the end of the 12-week period. This consisted of competitive audits, sketching, bringing the design to low fidelity, and iterating. The flow could loop back to earlier steps where needed and was not linear.

This design flow is similar to what I've done in other projects like Northeastern CAMD. Once the low-fidelity design is complete, we discuss feasibility with developers, then begin and refine the high-fidelity wireframes.
VISUAL DESIGN
BUILDING AND MANAGING THE DESIGN SYSTEM FROM THE GROUND UP
A key part of my role was leading the creation of our design system. Because this was an entirely new product with no existing design foundations, I researched best practices and developed a strategy to integrate design system development into our ongoing workflow. In collaboration with developers, we aligned on using Vuetify, an open-source library based on Google’s Material Design, as the foundation for our components, and FontAwesome for icons.
Our process was structured in two phases: initial design (exploring and defining components) and final spec (documenting in Confluence and Adobe XD and delivering production-ready designs). The steps below became the acceptance criteria for Jira tasking.
Phase 1: Initial Design
- Research Vuetify standards and analogs where relevant, depending on whether the component can follow Vuetify/Material standards or must be custom
- Compile feasibility questions to bring to developer review
- Create and refine the low-fidelity design
- Obtain feasibility signoff from the developers
Phase 2: Final Spec
- Document redline annotations for components with a table of specs on the internal wiki page
- Add the final component to the master design file for the team to easily grab and use
To arrive at the foundational elements, the colors and typography, we went through mood-board exercises with the customer to identify the look and feel they wanted for their system, and arrived at the color palette below using Material Design's theme builder:



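For context, a palette produced with Material's theme builder typically maps onto Vuetify's theme configuration along these lines. This is a minimal sketch; the hex values and theme name here are placeholders, not the actual confidential palette:

```typescript
// Hypothetical Vuetify 3 theme setup; color values are placeholders only.
import { createVuetify } from "vuetify";

export const vuetify = createVuetify({
  theme: {
    defaultTheme: "appTheme", // assumed theme name for illustration
    themes: {
      appTheme: {
        dark: false,
        colors: {
          primary: "#1A5A96",   // placeholder, not the real brand color
          secondary: "#4E616E", // placeholder
          error: "#BA1A1A",     // Material baseline error tone
          surface: "#FDFBFF",
        },
      },
    },
  },
});
```

Defining the palette once at this level is what lets every Vuetify component pick up the customer-approved colors without per-component overrides.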
Once the foundational elements were established, we then began designing components. Here’s an example of a snackbar I worked on:

Due to information sensitivity, the full design system and Confluence wiki documentation cannot be presented. The final set included standard components such as buttons, cards, data tables, chips, menus, tooltips, and text fields, as well as custom components for cards, card containers, and others more specific to the system.
CHALLENGES
ESTABLISHING A DESIGN PROCESS WITH LARGE TEAMS IS NOT EASY
Challenge 1: Balancing Speed, Collaboration, and Feasibility
As a new product built by a newly formed UX team, we had to ramp up quickly and stay agile. Working across multiple teams with different goals was challenging—UX prioritized usability, while developers focused on feasibility and delivery. Finding common ground required constant collaboration, quick iteration, and flexibility to meet client expectations.
Challenge 2: Iterating the Design Process Midstream
Early on, system engineers were driving the design approach, with UX brought in late to validate functionality. This limited our ability to advocate for user needs and design without limitations. We also started with lo-fi concepts but had little time to evolve them into high-fidelity mockups within the design system. Over time, we shifted to involve UX from the start—leading with research, then design, followed by developer feasibility checks—which closed gaps and improved both usability and delivery speed.
Challenge 3: Shifting from Lo-Fi to High-Fi Under Constraints
We initially began with low-fidelity concepts, but customer priorities shifted and less time was allocated for building high-fidelity mockups and integrating them into the design system. This forced us to prioritize functionality over style to reach initial operating capability, while still ensuring usability. Although we successfully advocated for prioritizing design system work to improve the application's look and feel, we still had to negotiate implementing the most critical components first and the less critical ones later.
IMPACT
HIGHER SATISFACTION RATES AND USABILITY TESTING PARTICIPATION
Since the beginning of our user testing, we have presented concepts to remote users and asked them to rate the product using the System Usability Scale (SUS). As we continue to test and update the product, we've seen:
20% increase in satisfaction ratings
Over 50% increase in our usability testing participation due to excitement for using the product
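The SUS behind these satisfaction ratings is a standard ten-item questionnaire scored on a 0–100 scale. Its published scoring procedure, not anything project-specific, can be sketched as:

```typescript
// Standard SUS scoring: ten responses, each on a 1-5 agreement scale.
// Odd-numbered items contribute (response - 1); even-numbered items
// contribute (5 - response). The sum is multiplied by 2.5 for a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 responses");
  }
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0
  );
  return sum * 2.5;
}
```

A participant answering all items neutrally (all 3s) scores 50, which is why SUS movement above that midpoint is a meaningful satisfaction signal.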

NEXT STEPS FOR BAE SYSTEMS
Moving forward, our focus will be to continue refining the experience. We will do that through iterative design and user testing. This will allow us to validate features with real users, ensuring every design is grounded in their needs and workflows. By maintaining a user-centered mindset throughout development, we'll continue shaping solutions that are intuitive and impactful, supporting our users in translating complex data into clear, actionable insights.
CASE STUDIES LIKE THIS
NORTHEASTERN UNIVERSITY CAMD
BIG COJONES GOLF

