Role - Embedded UX designer on multiple engineering teams launching V1 versions of these products.
Target Users - Data Engineers, Admins, and Business Users who need custom datasets.
Goal - Turn data ingestion and transformation (workflows that once required a team of developers) into a simple point-and-click UI.
I start most design projects with a design doc. This helps me move faster by clarifying the stakeholders, aligning the team on a happy-path user journey, starting to define the hypotheses we'll user-test over time, collecting all customer interviews and notes in one place, and laying out a workback plan for the major deliverables. I bring my devs and PMs along with me as I write this doc to make sure we're aligned from the start.
I conducted and participated in over 20 customer interview calls and summits at the beginning of this project. Here you can see the 'Ask an Expert' station at our conference, where I met with customers to demo upcoming products and gather their feedback. I summarize the interviews through a UX lens to define pain points, feature requests, and existing user journeys. I use these findings to drive design decision-making discussions and as an input for future usability studies and user research.
For Lakeflow Designer, this showed us that business users were intimidated by Databricks and wanted a no-code interface to turn raw data into actionable insights leveraging all the Databricks technology (data quality, permissioning, MD info, scheduling, etc.) without needing to understand how the technology was working under the hood.
Next, I perform heuristic evaluations of competitors to deeply understand the competitive landscape from a UX perspective, focusing on Jobs to be Done within the larger context of the overall user journey. I then create a visualization that shows how we compare to our competitors at each stage of the journey. This Blue Ocean lens lets me recommend the areas to focus on and nail so that our product stands out as the best solution for users.
For this project, the analysis showed that we could stand out from our competitors by building a best-in-class version of point-and-click data ingestion, with seamless handshakes between Databricks and third-party sources such as SQL Server, Salesforce, Google Analytics, and many more. We could also differentiate by making the next steps in the flow easy to perform, such as building a custom dashboard or downloading the data as an Excel file.
Since I believe that the best ideas can come from anyone on the team, I see my role as a facilitator of idea generation.
For Lakeflow Designer, I ran design brainstorming sessions with my dev team, my PMs, and other designers on the team who were interested in the project. This image shows a whiteboarding session I ran to flesh out the user journey and identify the points at which a user would want to explore the data.
One of the first things I do in the Figma file is make keyframes, much like a movie storyboard, to start getting alignment on the general direction of the product workflow. I use real customer stories from the interviews; in this case, a data analyst at a tractor manufacturer who wanted to use streaming IoT data from tractors to make inventory decisions.
This approach keeps the early design direction grounded in real customer stories.
I start iterating on the design for different user types, unhappy paths, and additional functionality the product will offer.
Most products I work on support three roles: admins, technical staff, and business users.
The unhappy paths involve understanding the front-end and API-level errors users can run into as they go through the workflow. I work to find technically feasible ways to help users recover from those errors.
There are many different user research tools available based on the hypotheses we want to test. For these projects, I used Lyssna to do early evaluations on the happy path user journey. Then I met with people on our internal customer solutions team to understand if the designs would solve the pain points that their current customers were running into. With the high fidelity prototypes, I conducted guided user research studies with actual customers who would be using these tools.
After the user studies for Lakeflow Connect and Designer, I wrote easy-to-understand reports and slide decks that highlighted the main recommendations ranked by severity, called out potential future ideas, and detailed each point with customer quotes and videos.
Using this data to drive design decisions is a powerful tool, but it is only one part of the larger product decision-making process. If the team needs to make tradeoffs on what to implement or change, I run a session with the team (in this case, an impact analysis activity) to drive the discussion on which changes deliver the largest customer impact relative to technical cost.
The through line in my career is taking highly technical, disjointed workflows and turning them into a unified, point-and-click experience.
For Lakeflow Connect, this involved understanding all the steps required to ingest data from a third-party source. For example, ingesting from SQL Server required creating a connection between the server and Databricks, provisioning an IAM role, setting up a gateway pipeline to handle streaming changes, setting up an ingestion pipeline, and scheduling it.
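To reason about this workflow as a designer, I find it helps to think of it as an ordered checklist in which each step unlocks the next. The sketch below models that idea; the step names are hypothetical illustrations for this write-up, not the actual Lakeflow Connect API:

```python
from dataclasses import dataclass, field

# Ordered setup steps for ingesting from SQL Server; each depends on the
# previous one. (Illustrative names, not real Lakeflow identifiers.)
STEPS = [
    "create_connection",         # link SQL Server to Databricks
    "provision_iam_role",        # grant the cloud permissions the pipelines need
    "create_gateway_pipeline",   # capture streaming changes from the source
    "create_ingestion_pipeline", # land the captured data in the lakehouse
    "schedule_pipeline",         # run the ingestion on a recurring cadence
]

@dataclass
class IngestionSetup:
    completed: list = field(default_factory=list)

    def complete(self, step: str) -> None:
        # Enforce the ordering: a step may only run once its predecessors are done.
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.completed.append(step)

    @property
    def ready(self) -> bool:
        # The pipeline is usable only after every step has been completed.
        return self.completed == STEPS
```

Modeling the dependencies this way made it obvious where the UI could collapse several back-end steps into a single guided screen, and where a skipped step should surface as a clear, recoverable error.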
I like to get my developers and PMs involved as much as they want in brainstorming and reviewing designs. Every week was a bit different based on the type of feedback I needed to keep moving forward quickly, but each week I would set the agenda and drive the workshop to gather early ideas, get technical feedback on the designs, and drive the larger strategy forward.
For these projects, I created weekly meetings with the three other designers working on other parts of Lakeflow (such as declarative DLT pipelines, notebooks, and job scheduling). I structured the meetings with one person assigned to the first 30 minutes and an open slot for the final 30 minutes for whoever needed UX-centric feedback that week.
As a technically-minded designer, I like to be involved down to the API level of the product and influence the architecture of the backend when possible. To achieve this level of involvement, I attend daily standups and bug bashes. I also make sure to be embedded in the ticketing system used by the team (such as JIRA) and get assigned to the technical tickets. From a strategy perspective, I co-authored the PRDs with the PMs and supported their work by providing images and gifs to illustrate the product direction.
Many design decisions can be made over Slack. I was actively involved in the team Slack for both projects where we could discuss design ideas and give feedback in real-time without waiting for the weekly brainstorm sessions. This worked great when working directly with an engineer on a specific feature. We would start a thread together and get feedback from the team without blocking anyone or filling up their calendars too much.
I piggyback on what the PMs are doing as much as possible. For these projects, I added a UX section to the weekly update emails that went out to the executive team at Databricks. This way, I was able to get timely input from senior executives before we moved too far in the design.
Recently, I've started doing my high-fidelity designs using vibe coding products such as Lovable. This way, not only can I create working prototypes much faster, but I also can collaborate with engineers using their language of code.
For Lakeflow Designer, I knew that the experience hinged on the interactions of adding nodes and navigating the canvas. With a Lovable prototype, I could evaluate those interactions directly rather than approximating them with static mockups.
For Connect and Designer, the finish line for the 0-to-1 release was the conference in the first week of June.
I'm both a strategic, visionary designer and a tactical, API-level designer. For these projects, this meant setting the vision for the MVP while considering V2+ possibilities.
The principal engineer on Lakeflow offers his perspective on the unique customer value and technological innovation of Lakeflow Connect and Lakeflow Designer.