Case Study · Federal Web Application

Modernizing FEMA's Mitigation Planning Portal

MPP 2.0 consolidates two outdated systems into a user-centered platform, giving federal planners a clearer, faster way to manage hazard mitigation plans across all ten FEMA regions.

Role
Solo UX Designer
Client
FEMA (Federal)
Timeline
4 months active
Platform
Web Application

Background

The Mitigation Planning Portal (MPP) is a web application developed by FEMA to track and report hazard mitigation plans and related data elements across all 10 regions. Users can create mitigation plans, manage compliance, and oversee funding across a national network of states and local jurisdictions.

"Consolidated space of record for mitigation plan status."

— Stakeholder workshop, FEMA

Understanding the Users

MPP serves a wide range of FEMA users, each with varying levels of administrative access. Users are organized into three primary groups: Federal, Regional, and State, each with a unique set of access permissions.

To cover the complexity and design process in greater depth, this case study focuses on federal users.

Primary Focus
Federal
Full edit access across all 10 regions. The most demanding user — highest data scope, highest stakes for accuracy.
Regional
Data quality oversight across all regions. Responsible for enforcing entry standards.
State
Scoped to assigned region or state. Mirror the Federal permission tiers within their geographic boundary.

The design work centered on the Federal Read & Write user — the role with the broadest scope and the most friction in the existing system. Meet Sandra:

Pain Points

Stakeholder workshops revealed that several user pain points were overlooked during the transition from the legacy version to MPP 1.0; the update shipped without resolving many of the legacy system's problems.

  • Broken User Flow

    The status dashboard provided a high-level view of plan status, but offered no direct path back to the plans users were working on. Users had to reconstruct a search to resume their work.

  • Confusing Review Process

    The existing design lacked a reliable way for users to log their activity or leave comments tied to the plan they were working on.

  • Can't Easily Track Updates

    Users can either receive all notifications for a plan or none at all. For planners working with 100 or more plans, the volume becomes overwhelming, making it nearly impossible to prioritize.

  • "Manually entering Tribal data can be tedious."

    — Stakeholder workshop, FEMA

    Legacy design screenshots

    Status Dashboard (Landing Page) - The landing page gave analysts a regional overview of plan statuses at a glance. It was a useful starting point but not a complete workflow. The design assumed users would navigate to individual plans through search, with no shortcut back to work already in progress.
    How can we improve?
    • How might we allow users to quickly resume plans they were previously working on?
    • What would a "saved" or "flagged" state look like in this context?
    Plan Detail - The plan detail page tried to show everything at once: basic plan information, funding status, and jurisdiction data all competing for space in a dense view. This made it difficult for analysts to focus on what mattered most at any point in their workflow.
    How can we improve?
    • How might we reorganize the information displayed on the screen to reduce cognitive load?
    • What level of detail does an analyst actually need at first glance vs. on demand?
    • How should users move between plan sections without losing their place?

    Ideation

    Building and refining ideas

    After reviewing both prior versions of MPP, the client and I iterated on design directions to align with user needs and business goals. The focus was on connecting the missing dots, simplifying every step of the process, and making information transparent at every stage.

    Mapping out the plan management process

    1. From Landing Page to Plan Detail — tracing how users navigate to and through individual plans
    2. Clustered plan details — reorganizing dense detail views into a simplified, focused interface
    3. Alert Preferences — introducing per-plan alert settings to help users follow the plans that need extra attention

    Features carried over from Legacy MPP are indicated in yellow. New features added for MPP 2.0 are in blue. Green indicates design suggestions that were not included in the delivered version but are documented here as part of this case study.

    Final Designs

    The client didn’t want to deviate too much from the original design. I didn’t want this to be a “make it look pretty” type of project, so I pushed for a bigger challenge and explained how the new design proposal could benefit the user experience.

    Although some were accepted, a number of design updates were rejected due to time constraints and conflicts with the existing dev environment. For this case study, I am sharing the design I initially proposed.

    Flag Plans — Introduced a feature that lets users mark any plan and return to it directly from the dashboard. Accessible from a dedicated My Plans tab, the saved plan list eliminates the need to reconstruct a search for work already in progress.


    Jurisdiction Status tab — Separated the Jurisdiction Status table into its own dedicated tab, giving the data room to breathe. I framed the change to the client around cognitive load: sectioning content helps users process information faster and reduces the likelihood of input errors.


    Status Dashboard — MPP 2.0 (My Plans tab + Flag Plans feature)

    Design Solutions

    Alert Preferences

    The client-approved alert system used a modal pop-up triggered by a bell icon within a dense data table. Each time a planner configured alerts for a plan, they lost their place in the list, disrupting context instead of supporting it.

    In government applications, modals are typically associated with errors, confirmations, or warnings. Using a modal for settings suggests urgency when users are simply configuring preferences.

    The split-panel approach

    The redesigned Alert Preferences page uses a split-panel layout, with a card list of saved plans on the left and an inline settings panel on the right. Selecting a plan opens its alert configuration without leaving or interrupting the list view.

    Each plan card displays a summary of active alerts, such as status changes, expiration windows, and review activity, allowing planners to quickly identify which plans have alerts configured. Plans without alerts are visually dimmed.

    Input type rationale

    Expiration alerts use radio buttons — selecting a time window is a mutually exclusive choice, and presenting options as a card-style group makes that logic visible. Status and comment alerts use toggles — they are independent, binary, and the toggle communicates that directly.
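    The radio-versus-toggle distinction maps directly onto a data model. A minimal TypeScript sketch of that logic — all names here are hypothetical illustrations, not taken from the MPP codebase:

```typescript
// Expiration windows are mutually exclusive, so they map to a single
// union-typed field — the data-model equivalent of a radio group.
// (The specific window values are assumptions for illustration.)
type ExpirationWindow = "30d" | "60d" | "90d" | "none";

// Status and comment alerts are independent binary choices — toggles.
interface AlertPreferences {
  expirationWindow: ExpirationWindow;
  statusChanges: boolean;
  reviewComments: boolean;
}

// A plan card is visually dimmed when no alert is configured;
// this predicate captures that "has any active alert" check.
function hasActiveAlerts(p: AlertPreferences): boolean {
  return p.expirationWindow !== "none" || p.statusChanges || p.reviewComments;
}
```

    Encoding the mutually exclusive choice as a union type means an invalid combination (two expiration windows at once) cannot be represented, mirroring what the radio group communicates on screen.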


    Outcome

    What Success Looked Like Without Metrics

    This project did not include a formal usability study or post-launch analytics access. That is the reality of government UX work: baseline data rarely exists, and access to usage analytics is limited. Success was measured by other means.

    5/5
    Pain points identified in stakeholder workshops addressed in the final design
    2→1
    Steps to resume an active plan from the dashboard, down from an open-ended search flow
    Client sign-off from FEMA after a stakeholder review process spanning multiple feedback cycles

    "Because this project lacked baseline analytics, design decisions were grounded in direct stakeholder workshop findings and validated through client review cycles."

    Reflection

    What I'd Do Differently

    Would do again
    Anchor every design decision to a specific stakeholder quote or workshop finding
    Use the persona as a decision-making tool, not just a deliverable
    Document the facilitation work — it is design work
    Advocate for my rationale even when the final decision goes another way
    Would do differently
    Establish a baseline usability benchmark earlier, even informally
    Push harder to resolve stakeholder conflicts before they affect delivery
    Get alignment on visual direction in week one, not month two
    Involve a second designer for peer review on key decision points

    This project reinforced something I keep relearning: in enterprise and government UX, the hardest design problems aren't on the screen. They're in the room — in competing stakeholder priorities, unclear mandates, and decisions that take months because no one owns them. Knowing when to advocate, when to reframe, and when to accept a direction and ship is as much the work as the artifact itself.