Approach

Design Steps

Below is a list of steps I use to create software products and features. Every project is different: some may have three steps, while others have ten. The choices and trade-offs made while creating a feature or product are driven primarily by time and resources, which vary from project to project.

Validation Strategy Examples

  • I like to start every project by asking questions like these, so that I understand its value and where the final targets are.

    1. Define the problem - What are we trying to help users do?

    2. Value Props - What is the business value of this feature? 

    3. Strategy - How will this improve our market position? 

    4. Success Criteria - How will we know if it's a success?

    5. Functionality Criteria - Based on what we know, what does it need to do?

    6. Metrics - After it's launched, what do we need to measure? 

  • Putting on my detective hat, I begin to look at my assumptions and the clarity of the problem to be solved. Are my assumptions backed by user data, or do I need more research to determine their validity? Depending on the results, I make a plan for what is needed and loop in my Product counterparts or any cross-functional team members.

    Methods for validating assumptions:

    1. Competitive Analysis for market fit or need

    2. Google Analytics for user behavior

    3. Hotjar for user behavior

    If I need more detailed information on user behavior:

    1. Collect direct user feedback with interviews, surveys, contextual inquiry (field research) 

    2. Collect internal user feedback from Customer Success and Account Managers

  • This is where the fun begins and the product starts to take shape.

    1. Brainstorming with stakeholders, product and engineers

    2. White Boarding

    3. Journey maps (flow charts) of the user path(s)

    4. Create user stories 

    5. Info architecture (diagramming)

    6. Sketch ideas

    7. Wireframing

    8. Collaborating with engineering on MVP/Cs

    9. Refine ideas for the solution

    10. Confirm functionality criteria

  • Now that the major details have been hammered out, it's time to make the design concrete.

    1. Narrow down the scope of the project

    2. Convert ideas into mockups

    3. Check user flows with prototypes and click-throughs

    4. Write specs (if needed)

    5. Decomp project with engineers for tickets

  • I work with the engineers to provide feedback throughout development, before anything reaches staging. A 5-minute screen share of what they have can save them time later on. I can give feedback on functionality early as they build, and I can help with the visuals if they run into issues with the design. Working hand-in-hand like this makes things easier for the developers and delivers a higher-quality product.

  • After launch, I like to review the product to measure success and to see if anything needs to be added or adjusted. Additions and adjustments go into new epics, and the process starts over again.

    1. Analyze the agreed-upon metrics

    2. Validate feature with user feedback

    3. Perform live testing

  • UX publications I follow:

    1. UXPlanet.com

    2. SmashingMagazine.com

    3. Nielsen Norman Group (nngroup.com)

    4. UX.stackexchange.com

    5. Medium.com

  • Sources for whitepapers and case studies:

    1. Nielsen Norman Group

    2. UXPin ebooks and whitepapers on various topics.

    3. Third-party vendors sometimes provide these as validation of their products. If I use these, I also look at their competitors for balanced information.

  • Look at competitors' products that excel in the area I'm working on to see how they handle the issues. I tread carefully here. I look to leverage existing (familiar) UI patterns to reduce user friction, not to copy what they did. Competitors' cohorts are rarely identical, and their solutions may work for their users but not ours.

    Example:

    When working on a Shopping Cart, I looked at ASOS, Amazon, and a few others acclaimed for excellent Cart experiences. I looked for commonalities in how they helped the user through complex purchase flows.

    One thing that stood out in all of them was breaking the information into steps. Most used an accordion for this. This finding prompted a deeper dive into best practices for accordions.

    After I completed my research, I used accordions to break the information into steps, with auto-close and visual checkmarks for completed sections. I also used "smart" defaults pulled from the stored account information, such as the most-used shipping addresses and credit cards, to help the user quickly navigate the complicated shopping cart.
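    The accordion behavior described in that example can be sketched as a small state model: opening a step auto-closes the others, and completing a step adds a checkmark and advances to the next incomplete section. This is a hypothetical TypeScript illustration, not the actual cart code; all names here are invented.

    ```typescript
    type StepId = "shipping" | "payment" | "review";

    interface CheckoutStep {
      id: StepId;
      completed: boolean;
    }

    class CartAccordion {
      private steps: CheckoutStep[];
      private openStep: StepId;

      constructor(ids: StepId[]) {
        this.steps = ids.map((id) => ({ id, completed: false }));
        this.openStep = ids[0];
      }

      // Opening one step implicitly closes the rest (auto-close):
      // only `openStep` is ever rendered expanded.
      open(id: StepId): void {
        this.openStep = id;
      }

      // Mark a step done and advance to the next incomplete step.
      complete(id: StepId): void {
        const step = this.steps.find((s) => s.id === id);
        if (step) step.completed = true;
        const next = this.steps.find((s) => !s.completed);
        if (next) this.openStep = next.id;
      }

      // Render state: a checkmark for done sections, an arrow for the open one.
      render(): string[] {
        return this.steps.map(
          (s) => `${s.completed ? "✓" : " "} ${s.id}${s.id === this.openStep ? " ▾" : ""}`
        );
      }
    }

    const cart = new CartAccordion(["shipping", "payment", "review"]);
    cart.complete("shipping"); // "payment" opens automatically
    console.log(cart.render());
    ```

    Keeping the "only one step open" rule in a single place makes the auto-close behavior easy to test independently of the UI.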

  • Researching design trends:

    1. Dig into the sites acknowledged as trendsetters for that year.

    2. Look at previous years to see if there is a clear trajectory for trends.

    3. Look for any trend forecasting.

  • How I use Google Analytics:

    1. Look at the user flows to see if they match my intended flows, and make adjustments if needed.

    2. Investigate low cart conversions or form completions by looking at drop-offs. (These I would investigate further with Hotjar.)

    3. Review statistical user data, like device types and demographic data, to ensure it matches my user sets.
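    The drop-off investigation in step 2 is, at its core, simple funnel arithmetic: compare session counts between consecutive steps and flag the largest loss. A minimal sketch, with invented step names and session counts:

    ```typescript
    interface FunnelStep {
      name: string;
      sessions: number;
    }

    // Percentage of sessions lost between each consecutive pair of steps.
    function dropOffs(funnel: FunnelStep[]): { from: string; to: string; lostPct: number }[] {
      const out: { from: string; to: string; lostPct: number }[] = [];
      for (let i = 0; i < funnel.length - 1; i++) {
        const a = funnel[i];
        const b = funnel[i + 1];
        out.push({
          from: a.name,
          to: b.name,
          lostPct: ((a.sessions - b.sessions) / a.sessions) * 100,
        });
      }
      return out;
    }

    // Invented numbers for illustration only.
    const checkout: FunnelStep[] = [
      { name: "cart", sessions: 1000 },
      { name: "shipping", sessions: 620 },
      { name: "payment", sessions: 580 },
      { name: "confirmation", sessions: 540 },
    ];

    // The biggest loss (cart -> shipping, 38%) is the step worth a closer look.
    console.log(dropOffs(checkout));
    ```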

  • How I use Hotjar:

    1. Look at screen recordings of user behavior in Carts and forms.

    2. Review heatmaps to validate assumptions on UI patterns.

  • Surveys:

    1. Send select groups of users a personal email with a link to a crafted Google survey.

    2. I worked with Marketing on crafting their questions for email campaign surveys for several projects.

    3. I recently started looking into Hotjar's on-page surveys and would use them in the future.

  • Formal: I went on-site and observed several users completing a prescribed set of tasks, with follow-up questions I created. Afterward, I compiled and analyzed the results.

    Informal: In person or over FaceTime, I observed what users were doing with the software and asked questions. This method was most helpful when troubleshooting a bug or optimizing a workflow by understanding their workarounds.

  • Methods to get user feedback when there is no access to users:

    1. I needed some users to try our checkout process to validate some recent changes, and I could not access any existing users. I found a coffee shop that real estate agents were likely to visit. For three days, I set up my laptop with a "Try My App" sign and had people try to customize an item and check out in exchange for a free coffee card. I prepared standardized questions, gathered the responses, and noted observations in a spreadsheet I later analyzed. Several insights were approved by stakeholders and added to the Backlog.

    2. Ask friends or coworkers if they know anyone in their networks who fits our cohorts (age, skill, career sector, etc.).

      Example:

      Feedback from Customer Service on a new release of a Shopping Cart was that older users were struggling to complete the purchase and calling in for help. After reviewing the feedback, I assumed that the recent font-size and contrast updates needed tweaking for an older audience. I needed users who were 55-65 and active real estate agents to validate my assumptions, and at that time, I had no access to our users.

      So I asked my coworkers if they knew anyone matching those criteria. A coworker's mom matched and was willing to talk to me. I set up an easy task to complete, with a follow-up questionnaire. Her responses confirmed my assumptions.

      I reworked the page's font hierarchy and increased the contrast on a few colors, then ran the page through an online heuristics evaluator to confirm the changes. I had her retake the test to confirm the improvements were enough. These changes were put into production and further validated by the compliments collected by Customer Service.

  • Some of the online tools that I have used to validate design assumptions:

    1. UX Check — Performs a quick heuristic evaluation of a page.

    2. Button Checker — Reviews all buttons on a page for contrast.

    3. Colorable — Checks the contrast for colors.
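
    Contrast checkers like these generally implement the WCAG 2.1 contrast-ratio formula, which is short enough to sketch directly. The luminance coefficients and thresholds below come from the WCAG definition; the function names are my own.

    ```typescript
    // Convert an 8-bit sRGB channel to its linear value (per WCAG 2.1).
    function channelToLinear(c8: number): number {
      const c = c8 / 255;
      return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of an sRGB color (WCAG 2.1 coefficients).
    function relativeLuminance(r: number, g: number, b: number): number {
      return (
        0.2126 * channelToLinear(r) +
        0.7152 * channelToLinear(g) +
        0.0722 * channelToLinear(b)
      );
    }

    // Contrast ratio between two colors: (L_lighter + 0.05) / (L_darker + 0.05).
    // Always between 1:1 and 21:1.
    function contrastRatio(
      fg: [number, number, number],
      bg: [number, number, number]
    ): number {
      const l1 = relativeLuminance(...fg);
      const l2 = relativeLuminance(...bg);
      const [hi, lo] = l1 >= l2 ? [l1, l2] : [l2, l1];
      return (hi + 0.05) / (lo + 0.05);
    }

    // WCAG AA requires 4.5:1 for normal text and 3:1 for large text.
    const ratio = contrastRatio([0, 0, 0], [255, 255, 255]); // black on white
    console.log(ratio.toFixed(2)); // 21.00, the maximum possible
    ```

    Running a check like this during design reviews catches low-contrast button and label colors before they reach users.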