I led this 9-month project from discovery through V1 launch—owning the full UX strategy and end-to-end design. I collaborated closely with research, engineering, and product management to shape the product vision, and drove alignment through stakeholder feedback, cross-functional reviews, and executive presentations.
The goal was to reduce the fragility and overhead of data pipelines cobbled together from third-party tools by introducing a unified, end-to-end orchestration solution within the Databricks platform. By streamlining pipeline development and monitoring into a single, governed environment, Lakeflow aimed to deliver more secure, stable, and maintainable workflows, while also unlocking access for a broader set of users, ultimately driving greater platform adoption and revenue growth.
Lakeflow serves a broad spectrum of users, from highly technical data engineers building complex, production-grade pipelines to less technical users like data analysts and platform operators who need to monitor jobs, configure schedules, or trigger workflows without deep coding expertise.
This meant designing for both power and approachability: exposing advanced capabilities for experts while creating guardrails, sensible defaults, and visual affordances for less technical users working within governed environments.
Visual, End-to-End Workflow Composition
Lakeflow brought ingestion, transformation, and orchestration into one unified interface, allowing users to visually build and manage data pipelines without jumping between tools.
Built-In Observability and Iteration Tools
Lakeflow gave users visibility into pipeline performance with real-time task status, logs, and debugging features directly in the UI, helping teams quickly diagnose issues, iterate faster, and build trust in their workflows.
Progressive and Context-Aware UI
The interface was designed with progressive disclosure and smart defaults so that users of varying expertise, from SQL-savvy analysts to Spark power users, could stay productive without being overwhelmed. Lakeflow Designer went further with an AI-first experience, letting users of any technical ability build powerful pipelines.
Data engineers were forced to hop between fragmented tools to build pipelines end to end, so I set out to design a seamless, connected experience across ingestion, transformation, and orchestration within Lakeflow.
Because Lakeflow had to serve both seasoned Spark experts and users coming from GUI-based ETL tools, I designed a progressive interface that revealed complexity only when needed, supporting a wide range of skill levels.
Lakeflow needed to work in harmony with existing Databricks components like Unity Catalog and DLT (Delta Live Tables) without duplicating functionality, so I focused on designing clear integration points and visual cues to reinforce the system’s connectedness.
Orchestration tools often become either too abstract or too low-level to manage effectively, so I designed a visual node-based experience with built-in feedback and debugging to keep control intuitive and transparent.
I see myself as a facilitator of idea generation, bringing in diverse perspectives. For Lakeflow Designer, I led brainstorming sessions with devs, PMs, and designers—like this whiteboarding session mapping when users explore data vs. apply transformations.
I kick off design projects with a design doc that aligns the team on stakeholders, the happy-path journey, testable hypotheses, and key deliverables. Collaborating with PMs and devs early helps us move quickly and stay aligned, even in ambiguity.
I joined 20+ customer interviews and led live demos at our ‘Ask an Expert’ station to gather feedback on upcoming products. I synthesized insights through a UX lens—capturing pain points, feature requests, and user journeys—to guide design decisions and future research.
For Lakeflow Designer, we learned that business users found Databricks intimidating and needed a no-code interface to harness its power—data quality, permissions, metadata, scheduling—without dealing with technical complexity.
I conducted competitor heuristic evaluations through a Jobs-to-be-Done lens, mapping how we compare across the user journey. This Blue Ocean approach helps identify where we can differentiate and lead.
For this project, the analysis highlighted an opportunity to stand out with best-in-class point-and-click data ingestion and smooth integration with third-party databases. We could further differentiate by simplifying next steps like dashboard creation or data export.
This visual shows how data engineers, analytics engineers, and platform admins interact across Lakeflow, covering ingestion, transformation, orchestration, and governance. It highlights their intersecting workflows and their roles in building, scheduling, and maintaining reliable, compliant data pipelines.
Early in Figma, I create keyframes (like a movie storyboard) using real customer stories, such as an analyst working with IoT data from tractors. This quickly aligns the team on workflow direction and highlights design tradeoffs.
I designed for multiple user roles—admin, technical, and business users—while accounting for unhappy paths and future functionality. I explored front-end and API-level errors, created solutions to help users recover, and integrated AI for observability and early issue detection.
I used Lyssna to test early happy path flows, validated pain points with our internal solutions team, and ran guided research with real customers using high-fidelity prototypes.
After user studies for Lakeflow Connect and Designer, I created clear reports and slides with key recommendations, future ideas, and supporting quotes and videos. I also facilitated impact analysis sessions to help the team prioritize changes based on customer impact vs. technical cost.
I like to get my developers and PMs involved as much as they want in brainstorming and reviewing designs. Every week looked a bit different depending on the type of feedback I needed to keep moving quickly, but I always set the agenda and ran the workshop, whether that meant gathering early ideas, getting technical feedback on the designs, or pushing the larger strategy forward.
For these projects, I created a weekly meeting with the three other designers working on other parts of Lakeflow (such as declarative DLT pipelines, notebooks, and job scheduling). One person was assigned to present for the first 30 minutes, and the final 30 minutes stayed open for whoever needed UX-centric feedback that week.
As a technically minded designer, I like to be involved down to the API level of the product and to influence the backend architecture when possible. To achieve this level of involvement, I attend daily standups and bug bashes, stay embedded in the team’s ticketing system (such as Jira), and get assigned to technical tickets. From a strategy perspective, I co-authored the PRDs with the PMs and supported their work by providing images and GIFs to illustrate the product direction.
Many design decisions can be made over Slack. I was actively involved in the team Slack for both projects, where we could discuss design ideas and give feedback in real time without waiting for the weekly brainstorm sessions. This worked especially well when collaborating directly with an engineer on a specific feature: we would start a thread together and get feedback from the team without blocking anyone or filling up their calendars too much.
I piggy-back on what the PMs are doing as much as possible. For these projects, I added a UX section to the weekly update emails that went out to the executive team at Databricks. This way, I was able to get timely input from senior executives before we moved too far in the design.
Lately, I’ve used vibe-coding tools like Lovable to build high-fidelity prototypes faster and collaborate more closely with engineers.
For Lakeflow Designer, this let me prototype key canvas interactions—like node creation and navigation—using real tech, improving both design quality and handoff.
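To make this concrete, here is a minimal, hypothetical sketch of the kind of canvas interaction such a prototype starts from, in the React/TypeScript stack that tools like Lovable typically scaffold. The names (PipelineCanvas, TaskNode) are illustrative, not the actual Lakeflow Designer code: double-click the canvas to create a task node, click a node to select it.

```tsx
import { useState } from "react";
import type { MouseEvent } from "react";

// Hypothetical sketch of a node-canvas interaction for prototyping purposes.
type TaskNode = { id: number; x: number; y: number; label: string };

export function PipelineCanvas() {
  const [nodes, setNodes] = useState<TaskNode[]>([]);
  const [selectedId, setSelectedId] = useState<number | null>(null);

  // Create a node where the user double-clicks, relative to the canvas origin.
  const addNode = (e: MouseEvent<HTMLDivElement>) => {
    const rect = e.currentTarget.getBoundingClientRect();
    const id = nodes.length + 1; // fine for a sketch with no deletion
    setNodes([
      ...nodes,
      { id, x: e.clientX - rect.left, y: e.clientY - rect.top, label: `Task ${id}` },
    ]);
    setSelectedId(id);
  };

  return (
    <div
      onDoubleClick={addNode}
      style={{ position: "relative", width: 800, height: 480, background: "#f6f7f9" }}
    >
      {nodes.map((n) => (
        <button
          key={n.id}
          // Select a node without letting the event reach canvas-level handlers.
          onClick={(e) => {
            e.stopPropagation();
            setSelectedId(n.id);
          }}
          onDoubleClick={(e) => e.stopPropagation()}
          style={{
            position: "absolute",
            left: n.x,
            top: n.y,
            border: n.id === selectedId ? "2px solid #2272d7" : "1px solid #aaa",
          }}
        >
          {n.label}
        </button>
      ))}
    </div>
  );
}
```

Even a throwaway prototype at this level surfaces real interaction questions, like hit targets, selection state, and coordinate math, far earlier than static mocks do.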
For Connect and Designer, the finish line for the 0-to-1 release was Databricks’ annual conference in the first week of June.
I'm both a strategic, visionary designer and a tactical, API-level designer. For these projects, this meant setting the vision for the MVP while also considering V2+.
The principal engineer on Lakeflow offers his perspective on the unique customer value and technological innovation of Lakeflow Connect and Lakeflow Designer.
Internally, teams reported faster onboarding and fewer errors during pipeline development, signaling increased user confidence and a smoother experience. Externally, companies like Porsche cut development time by 85%.
In Lakeflow Connect, we tracked all error codes and found that troubleshooting became measurably faster once we integrated AI-powered “fix it for me” suggestions.
In Lakeflow Designer, we saw more user IDs actively creating and scheduling pipelines, an early but promising sign that business users were now able to build and iterate faster with less reliance on code. That directly supported our broader goal of expanding adoption among less technical team members.