Lucy Carpenter

DATABRICKS

LAKEFLOW

I led this 9-month project from discovery through V1 launch—owning the full UX strategy and end-to-end design. I collaborated closely with research, engineering, and product management to shape the product vision, and drove alignment through stakeholder feedback, cross-functional reviews, and executive presentations.

project goals

Business goals

Reduce the fragility and overhead of data pipelines cobbled together across third-party tools by introducing a unified, end-to-end orchestration solution within the Databricks platform. By streamlining pipeline development and monitoring into a single, governed environment, LakeFlow aimed to deliver more secure, stable, and maintainable workflows—while also unlocking access for a broader set of users, ultimately driving greater platform adoption and revenue growth.

Design goal

LakeFlow serves a broad spectrum of users—from highly technical data engineers building complex, production-grade pipelines, to less technical users like data analysts or platform operators who need to monitor jobs, configure schedules, or trigger workflows without deep coding expertise.

This meant designing for both power and approachability: exposing advanced capabilities for experts while creating guardrails, sensible defaults, and visual affordances for less technical users working within governed environments.

solutions

Visual, End-to-End Workflow Composition

LakeFlow brought ingestion, transformation, and orchestration into one unified interface, allowing users to visually build and manage data pipelines without jumping between tools.
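
To make that composition model concrete, here is a simplified sketch of how a pipeline built on the canvas can be thought of as a graph of typed stages. The type and field names below are illustrative assumptions, not the actual Lakeflow schema.

```ts
// Illustrative data model for a visually composed pipeline.
// All names here are hypothetical, not the real Lakeflow schema.

type StageKind = 'ingestion' | 'transformation' | 'orchestration';

interface PipelineNode {
  id: string;
  kind: StageKind;
  label: string;                   // name shown on the canvas
  config: Record<string, unknown>; // stage-specific settings (source, SQL, schedule, ...)
}

interface PipelineEdge {
  from: string; // upstream node id
  to: string;   // downstream node id
}

interface Pipeline {
  name: string;
  nodes: PipelineNode[];
  edges: PipelineEdge[];
}

// A minimal ingest -> transform -> schedule flow, end to end in one definition.
const demoPipeline: Pipeline = {
  name: 'iot_telemetry_demo',
  nodes: [
    { id: 'n1', kind: 'ingestion', label: 'Ingest IoT events', config: { source: 'kafka' } },
    { id: 'n2', kind: 'transformation', label: 'Clean and aggregate', config: { language: 'sql' } },
    { id: 'n3', kind: 'orchestration', label: 'Hourly schedule', config: { cron: '0 * * * *' } },
  ],
  edges: [
    { from: 'n1', to: 'n2' },
    { from: 'n2', to: 'n3' },
  ],
};
```

Keeping the whole flow in one graph is what lets the interface show ingestion, transformation, and scheduling side by side instead of splitting them across tools.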

Built-In Observability and Iteration Tools

LakeFlow gave users visibility into pipeline performance with real-time task status, logs, and debugging features directly in the UI—helping teams quickly diagnose issues, iterate faster, and build trust in their workflows.
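
As a sketch of the kind of roll-up logic behind that visibility (the statuses and shapes here are assumptions, not the real Lakeflow API), task-level results can be summarized into a single run status for the pipeline header:

```ts
// Hypothetical task status model for the observability panel; not the real Lakeflow API.
type TaskStatus = 'pending' | 'running' | 'succeeded' | 'failed';

interface TaskRun {
  taskId: string;
  status: TaskStatus;
  durationMs?: number;   // populated once the task finishes
  errorMessage?: string; // surfaced in the debugging drawer when present
}

// Roll task-level results up into one pipeline-level status shown in the UI header.
function summarizeRun(tasks: TaskRun[]): TaskStatus {
  if (tasks.some((t) => t.status === 'failed')) return 'failed';
  if (tasks.some((t) => t.status === 'running')) return 'running';
  if (tasks.length > 0 && tasks.every((t) => t.status === 'succeeded')) return 'succeeded';
  return 'pending';
}
```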

Progressive and Context-Aware UI

The interface was designed with progressive disclosure and smart defaults, enabling users of varying expertise—from SQL-savvy analysts to Spark power users—to stay productive without being overwhelmed. Lakeflow Designer was AI-first, enabling users of any technical ability to build powerful pipelines.

challenges

Unifying Disjointed Data Engineering Workflows

Data engineers were forced to hop between fragmented tools to build pipelines end to end, so I set out to design a seamless, connected experience across ingestion, transformation, and orchestration within LakeFlow.

Balancing Visual Simplicity with Technical Depth

Because LakeFlow had to serve both seasoned Spark experts and users coming from GUI-based ETL tools, I designed a progressive interface that revealed complexity only when needed to support a wide range of skill levels.

Designing for System Interoperability

LakeFlow needed to work in harmony with existing Databricks components like Unity Catalog and DLT without duplicating functionality, so I focused on designing clear integration points and visual cues to reinforce the system’s connectedness.

Establishing Clarity in Orchestration UX

Orchestration tools often become either too abstract or too low-level to manage effectively, so I designed a visual node-based experience with built-in feedback and debugging to keep control intuitive and transparent.

early design thinking

I see myself as a facilitator of idea generation, bringing in diverse perspectives. For Lakeflow Designer, I led brainstorming sessions with devs, PMs, and designers—like this whiteboarding session mapping when users explore data vs. apply transformations.

project planning and documentation

I kick off design projects with a design doc to align on stakeholders, the happy path journey, testable hypotheses, and key deliverables. Collaborating with PMs and devs early helps us move quickly and stay aligned, even in ambiguity.

understanding real user needs

I joined 20+ customer interviews and led live demos at our ‘Ask an Expert’ station to gather feedback on upcoming products. I synthesized insights through a UX lens—capturing pain points, feature requests, and user journeys—to guide design decisions and future research.

For Lakeflow Designer, we learned that business users found Databricks intimidating and needed a no-code interface to harness its power—data quality, permissions, metadata, scheduling—without dealing with technical complexity.

I conducted competitor heuristic evaluations through a Jobs-to-be-Done lens, mapping how we compare across the user journey. This Blue Ocean approach helps identify where we can differentiate and lead.

For this project, the analysis highlighted an opportunity to stand out with best-in-class point-and-click data ingestion and smooth integration with third-party databases. We could further differentiate by simplifying next steps like dashboard creation or data export.

Design-Led Approach & Collaboration

Design Leadership & Strategy

  • Led quarterly planning sessions to align design goals with business priorities
  • Facilitated key workshops for cross-functional collaboration and strategic decision-making
  • Hosted bi-weekly stakeholder design reviews to gather feedback and maintain executive alignment
  • Defined UX metrics to measure customer success

Cross-Functional Collaboration & Communication

  • Scheduled regular 1:1s with PMs and tech leads to prioritize work and plan design deliverables
  • Coordinated meetings with cross-functional partners to ensure cohesive product development

Agile Execution & Team Engagement

  • Participated in weekly standups to stay aligned on project progress and blockers
  • Established quick feedback channels on Slack for real-time design input and iteration
  • Organized weekly brainstorming sessions with engineering teams to foster innovation and solve complex challenges

workflows

This visual shows how data engineers, analytics engineers, and platform admins interact across LakeFlow—covering ingestion, transformation, orchestration, and governance. It highlights their intersecting workflows and roles in building, scheduling, and maintaining reliable, compliant data pipelines.

Happy path design

Early in Figma, I create keyframes—like a movie storyboard—using real customer stories, such as an analyst working with IoT data from tractors. This quickly aligns the team on workflow direction and highlights design tradeoffs.

after the happy path...

I designed for multiple user roles—admin, technical, and business users—while accounting for unhappy paths and future functionality. I explored front-end and API-level errors, created solutions to help users recover, and integrated AI for observability and early issue detection.
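
One way to picture that recovery design is as a mapping from API-level errors to user-facing recovery actions. The error codes and action names below are hypothetical, for illustration only, not the actual Databricks error model.

```ts
// Illustrative mapping from API-level errors to user-facing recovery actions.
// Error codes and action names are hypothetical.

type RecoveryAction =
  | { kind: 'retry' }
  | { kind: 'open-permissions-dialog' }
  | { kind: 'show-fix-suggestion'; suggestion: string };

interface ApiError {
  code: string; // e.g. 'PERMISSION_DENIED', 'SCHEMA_MISMATCH'
  message: string;
}

function toRecoveryAction(error: ApiError): RecoveryAction {
  switch (error.code) {
    case 'PERMISSION_DENIED':
      // Let business users request access instead of reading a raw stack trace.
      return { kind: 'open-permissions-dialog' };
    case 'SCHEMA_MISMATCH':
      // A natural spot for an AI-generated "fix it for me" suggestion.
      return {
        kind: 'show-fix-suggestion',
        suggestion: 'Update the target schema to match the incoming columns.',
      };
    default:
      return { kind: 'retry' };
  }
}
```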

validation and impact analysis

I used Lyssna to test early happy path flows, validated pain points with our internal solutions team, and ran guided research with real customers using high-fidelity prototypes.



After user studies for Lakeflow Connect and Designer, I created clear reports and slides with key recommendations, future ideas, and supporting quotes and videos. I also facilitated impact analysis sessions to help the team prioritize changes based on customer impact vs. technical cost.

DESIGN REVIEWS AND FEEDBACK GATHERING

Weekly brainstorm and review workshops with devs and PMs

I like to get my developers and PMs involved as much as they want in brainstorming and reviewing designs. The focus shifted week to week depending on the type of feedback I needed to keep moving quickly, but I always set the agenda and ran the workshop: gathering early ideas, collecting technical feedback on the designs, and pushing the larger strategy forward.

Design team reviews

For these projects, I created weekly meetings with the other three designers working on other parts of Lakeflow (such as declarative DLT pipelines, notebooks, and job scheduling). Each session had one designer assigned to the first 30 minutes, with the final 30 minutes left open for whoever needed UX-centric feedback that week.

PRDs, standups and tickets

As a technically minded designer, I like to be involved down to the API level of the product and influence the architecture of the backend when possible. To achieve this level of involvement, I attend daily standups and bug bashes. I also make sure to be embedded in the ticketing system the team uses (such as JIRA) and get assigned to technical tickets. From a strategy perspective, I co-authored the PRDs with the PMs and supported their work by providing images and GIFs to illustrate the product direction.

Iterative real-time feedback

Many design decisions can be made over Slack. I was actively involved in the team Slack for both projects, where we could discuss design ideas and give feedback in real time without waiting for the weekly brainstorm sessions. This worked especially well when collaborating directly with an engineer on a specific feature: we would start a thread together and get feedback from the team without blocking anyone or filling up their calendars.

Weekly update email

I piggyback on what the PMs are doing as much as possible. For these projects, I added a UX section to the weekly update emails that went out to the executive team at Databricks. This way, I was able to get timely input from senior executives before we moved too far in the design.

vibe coding

Lately, I’ve used vibe coding tools like Lovable to build high-fidelity prototypes faster and collaborate more closely with engineers.

For Lakeflow Designer, this let me prototype key canvas interactions—like node creation and navigation—using real tech, improving both design quality and handoff.
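
As a rough idea of what such a prototype can look like, here is a minimal sketch of node creation and connection on a canvas using React Flow. The library choice and component names are my assumption for illustration; they are not necessarily the stack behind the actual prototypes.

```tsx
import { useCallback } from 'react';
import ReactFlow, {
  Background,
  Controls,
  addEdge,
  useNodesState,
  useEdgesState,
  type Connection,
  type Node,
} from 'reactflow';
import 'reactflow/dist/style.css';

// Start the canvas with a single ingestion node.
const initialNodes: Node[] = [
  { id: 'ingest-1', position: { x: 0, y: 0 }, data: { label: 'Ingest: source table' } },
];

let nextId = 2;

export default function PipelineCanvasPrototype() {
  const [nodes, setNodes, onNodesChange] = useNodesState(initialNodes);
  const [edges, setEdges, onEdgesChange] = useEdgesState([]);

  // Connect two nodes when the user drags between their handles.
  const onConnect = useCallback(
    (connection: Connection) => setEdges((eds) => addEdge(connection, eds)),
    [setEdges]
  );

  // Add a new transformation node below the existing ones.
  const addTransformNode = () => {
    setNodes((nds) => [
      ...nds,
      {
        id: `transform-${nextId++}`,
        position: { x: 250, y: nds.length * 80 },
        data: { label: 'Transform: clean + join' },
      },
    ]);
  };

  return (
    <div style={{ width: '100%', height: 500 }}>
      <button onClick={addTransformNode}>Add transform step</button>
      <ReactFlow
        nodes={nodes}
        edges={edges}
        onNodesChange={onNodesChange}
        onEdgesChange={onEdgesChange}
        onConnect={onConnect}
        fitView
      >
        <Background />
        <Controls />
      </ReactFlow>
    </div>
  );
}
```

Prototyping against the same kind of node-graph primitives a production canvas would need makes the artifact useful for handoff discussions, not just a visual mock.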

taking it to the finish line

For Connect and Designer, the finish line for the 0-to-1 release was the conference in the first week of June.

I'm both a strategic, visionary designer and a tactical, API-level designer. For these projects, this meant setting the vision for the MVP, while considering V2+.

PRODUCT ANALYSIS

The principal engineer on Lakeflow offers his perspective on the unique customer value and technological innovation of Lakeflow Connect and Lakeflow Designer.

RESULTS

Metrics

Internally, teams reported faster onboarding and fewer errors during pipeline development, signaling increased user confidence and a smoother experience. Externally, companies like Porsche cut development time by 85%.


In LakeFlow Connect, we tracked all error codes and found that troubleshooting became measurably faster once we integrated AI-powered “fix it for me” suggestions. 


In LakeFlow Designer, we saw more user IDs actively creating and scheduling pipelines—an early but promising sign that business users were now able to build and iterate faster with less reliance on code. That directly supported our broader goal of expanding adoption among less technical team members.

Media

