Ann Niou

Product Designer

GOOGLE CLOUD SECURITY · GENAI · 2024–2025

GenAI for Cloud Compliance

ROLE Lead UX Designer
TIMELINE 2024–2025
4 Months (Idea to Implementation)
TEAM Console Chat Platform
Compliance Eng
Security Eng
SKILLS AI Product Design
Service Blueprinting
Prototyping

Designing AI that earns trust in regulated environments.

90% reduction in time-to-value on high-friction compliance workflows. Launched at Google Cloud Next '24 and '25.

Since early 2024, I’ve led UX across Google Cloud's Security, Privacy, and Compliance (SPC) portfolios. My focus has been leveraging generative AI to supercharge high-stakes workflows, working directly with the core Gemini Cloud Assist team to evolve the platform.

Operating in regulated industries (healthcare, finance, government) brings unique design challenges. While the broader organization was pushing aggressively for fully autonomous, agentic AI experiences, my mandate was to balance that technical ambition with the reality of our users: deep skepticism of AI in high-stakes security contexts. Here, an AI error doesn't just erode trust—it can result in significant legal and financial repercussions.

Our overarching goal was to "democratize SPC": making complex frameworks understandable and designing an AI experience that acts as a trusted security specialist. Collaborating closely with PMs, UXR, and Engineering, I helped drive product strategy from vision to functional prototypes, taking these projects from idea to implementation in just 4 months.

SPC is a broad, complex domain spanning disparate orgs, engineering teams, and user roles. Before generating ideas for AI, we needed to establish a clear focus.

Rather than operating strictly within existing product boundaries, I leveraged UXR insights to help restructure our approach around two core Jobs-to-be-Done (JTBD):

  • Administrators managing cloud compliance (setup, testing, monitoring).
  • Developers / DevOps building services without introducing security exposure.

I secured senior leadership buy-in for this pivot. It prevented us from spreading our efforts too thin and gave previously siloed engineering teams a shared, user-centric roadmap.

Under tight deadlines for Cloud Next, I ran service blueprinting sessions with PM and UXR to map the end-to-end experience for both JTBDs.

My goal was to help the team step back from our existing UI surfaces and anchor our perspective on the user's actual steps, decision points, and desired outcomes. This gave product partners a fresh perspective on how their specific tools fit into the broader journey. By shifting the focus onto the user's goals, we were able to aggregate UXR insights and visually pinpoint the most critical areas of friction.

I then introduced a prioritization framework to evaluate opportunities based on user impact and AI suitability. This focused discovery allowed us to cut through the noise and isolate the two most significant bottlenecks in the ecosystem, setting the foundation for our two flagship features.

Service blueprint — cross-surface journey map and AI prioritization matrix

Targeting the Compliance Administrator

The Problem: The Translation Gap
Translating abstract regulations into technical configurations takes months. Teams manually map requirements to GCP controls in spreadsheets, requiring specialist knowledge they rarely have.

The North Star: Set up a compliance posture in a single sitting, without referring to external documentation.

Working closely with engineering to understand our AI stack, I built rapid prototypes to explore GenAI interaction patterns. We designed an experience that generates a personalized compliance plan in minutes, packaging clear control mappings with plain-language explanations. What previously lived in spreadsheets now had a credible starting point.

The Pivot: Balancing Agency with Skepticism
Initial engineering momentum leaned toward fully agentic, automated deployments. However, research surfaced a critical blocker: users were deeply skeptical of autonomous AI in compliance workflows. Pushing controls to production requires strict human sign-off, and users did not trust an agent to shortcut that governance.

In response, I pulled back the autonomy. I iterated the prototypes so the AI acted as an expert drafter rather than an executor, explicitly introducing dry runs, rollback options, and human-in-the-loop review steps. By intentionally designing friction back into the system, we bridged the gap between agentic capability and user trust.

Compliance plan prototype — AI-generated framework draft highlighting the "Dry Run" and human review steps

Targeting the Developer / DevOps

The Problem: Context-Switching Bottlenecks
Security errors that surface mid-build create real exposure. Developers aren't compliance specialists; when they hit a blocker, they either guess (introducing risk) or escalate to a security team (halting momentum).

The North Star: Resolve a security finding in context without escalating to a compliance team.

The developer workstream required a completely different approach. Research identified the highest-friction security findings developers encounter. I designed the Gemini experience to surface these findings directly in context and resolve them—via Terraform, gcloud, or console clicks—without forcing the developer to leave their workflow or track down a specialist.
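To make the pattern concrete: one common class of finding is a Cloud Storage bucket without uniform bucket-level access enforced. A sketch of the kind of remediation the experience could surface (the bucket name and resource details here are illustrative, not taken from the actual product):

```hcl
# Illustrative Terraform remediation: enforce uniform bucket-level access
# so IAM, rather than per-object ACLs, governs access to the bucket.
resource "google_storage_bucket" "app_assets" {
  name                        = "example-app-assets" # hypothetical bucket
  location                    = "US"
  uniform_bucket_level_access = true
}
```

For developers working outside infrastructure-as-code, the equivalent one-liner is `gcloud storage buckets update gs://example-app-assets --uniform-bucket-level-access`. Offering the same fix in whichever surface the developer already uses is what keeps the resolution in context.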

Because these high-impact experiences lived on surfaces owned by other teams, I ran weekly UX alignment sessions with cross-functional leadership to ensure our AI remediation patterns fit naturally into their shipping products.

Security finding remediation — in-context finding with Gemini-generated remediation and one-click apply

The central GenAI tiger team was developing the core console chat patterns simultaneously. I partnered directly with them to contribute the novel patterns we developed for security—specifically multi-step planning and consecutive task completion—back into the centralized Google Cloud component library, ensuring they could scale across the entire ecosystem.

GenAI pattern contribution — multi-step planning pattern contributed to the central component library

This case study covers two high-impact projects that successfully democratized SPC. By structuring a broad scope around core user journeys and designing AI that respects governance and trust, we reduced time-to-value on the highest-friction compliance workflows by 90%.

Both experiences were successfully shipped and featured center-stage at Google Cloud Next 2024 and 2025.