
User Centric Product Design

Craig Maxey


  1. Introduction
  2. Product sales vs. product adoption
  3. Completing functional requirements
  4. The goals of the user: personas and task scenarios
  5. Learning about the user’s world: Contextual inquiry
  6. The steps of contextual inquiry
  7. The contextual inquiry deliverables
  8. Account management strategy
  9. References


Introduction

This document outlines the process for designing a customer-centric product offering. The proposed techniques have been validated by both personal experience and the product design community. References are included to provide greater detail on the proposed techniques.

A product can be sold if it offers a compelling value proposition, but product success depends on adoption by its users. That adoption is dependent on the user’s ability to achieve the goals promised in the glossy brochure.

It should be noted that the person responsible for the product’s purchase is typically in a different organizational role than the end user of the product. We must differentiate the needs of the purchasing user from the needs of the end user who must adopt the product.

Case Study: Part Costing Software

In the manufacturing industry, the true cost of manufacturing a part is typically not known until the parts are physically produced.  The time and cost associated with producing a part can prohibit the exploration of cost saving design alternatives.  Imagine the value of a tool that could determine the cost of a part as it is being designed.  Such a product was created. 
The following steps are required to determine the cost of a part. 

  • First the CAD representation of the part needs to be analyzed to identify the part features, e.g. holes, slots, and finished surfaces.
  • Then, the resulting list of part features needs to be associated with manufacturing processes, e.g. drilling and machining.
  • Finally, the cost of each manufacturing process needs to be calculated. Summing up these individual costs allows the determination of the total part cost, as it is being designed.
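The costing logic in these steps can be sketched in a few lines. Note that the feature types, the feature-to-process mapping, and the per-process cost rates below are illustrative assumptions for the sketch, not the actual product’s cost model:

```python
# Hypothetical sketch of the part-costing pipeline described above.
# Feature extraction from CAD is assumed to have already produced a
# feature list; names and rates here are illustrative, not a real API.

from dataclasses import dataclass

@dataclass
class Feature:
    kind: str        # e.g. "hole", "slot", "finished_surface"
    size_mm: float   # a single size parameter, for simplicity

# Assumed mapping from part features to manufacturing processes.
PROCESS_FOR_FEATURE = {
    "hole": "drilling",
    "slot": "machining",
    "finished_surface": "machining",
}

# Assumed cost model: cost per mm of feature size, per process.
COST_PER_MM = {"drilling": 0.10, "machining": 0.25}

def part_cost(features: list[Feature]) -> float:
    """Map each feature to its process and sum the process costs."""
    total = 0.0
    for f in features:
        process = PROCESS_FOR_FEATURE[f.kind]
        total += COST_PER_MM[process] * f.size_mm
    return total

# A 20 mm hole plus a 50 mm slot:
features = [Feature("hole", 20.0), Feature("slot", 50.0)]
print(round(part_cost(features), 2))  # 0.10*20 + 0.25*50 = 14.5
```

Because the cost updates with every feature added or changed, a designer could see cost feedback while the part is still being designed.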

The steps described above reveal a set of product functional requirements. Geometric analysis, manufacturing process mapping, and costing are the necessary components of this costing tool. Logically, these components are outlined in the product functional requirements.  A typical product design trap is to rely on these implementation objects to define the user interaction design.  The result is the head-exploding experience that we have come to associate with tools that have complex implementations.  Users are confronted with tables of cryptic data and mazes of controls without being offered a hint of where the money is going in the manufacture of the part, much less what could be done to improve it.  The functional requirements were met, but the resulting solution faces tremendous adoption hurdles because the user cannot extract the initial value proposition: figuring out how to save manufacturing costs.  In this scenario, all functionality is presented with equal priority and equal weighting. The following questions must be asked:

  • What is the relative value of specific functionality? 
  • What should be considered as advanced functionality, hence progressively disclosed? 
  • What functionality needs to be added to ensure that the user can successfully ask the question: where is the money going, and why?

This is just a small sample of the many questions gating the interaction design, and just one example among many that could be offered.  This document outlines a strategy for answering these product design questions.

We need to discover, from the user, what is important, what is not, and what is missing.

Product sales vs. product adoption

Often the exclusive focus of product management is on identifying the product features required for a purchase.  In the example above, it was the responsibility of product management to identify the types of parts and associated manufacturing processes that needed to be added to the product to ensure the expansion of product sales.   It was the responsibility of product development to identify the geometric analysis, manufacturing, and costing capabilities necessary to satisfy these product-management-defined requirements.  As noted above, the people making the purchase decision in an organization usually have different roles than the people who will be using the product.  The stakeholders of a product determine whether it will be purchased.  The users of the product determine whether it will be adopted.  Even with a clear value proposition, the product will fail if the customer does not use it.  Product adoption depends on including the product features necessary for the user to perform her task.  Product adoption also requires that the product respect the customer’s organizational landscape. The best cost-saving product will not produce a cheaper part if the mechanical engineer is not motivated to reduce cost.

Completing functional requirements

The case study above highlights the need to identify the functional requirements that ensure product adoption.  There are obvious advantages to discovering these requirements before the product is built.  The processes described in this document should be employed in the initial functional requirements phase rather than during the initial product deployment, at which time the schedule and cost impacts are prohibitive.  (Ultimately, the adverse effect of these product design flaws on product adoption or market share prompts costly remedial design efforts.)

The goals of the user: personas and task scenarios

There are many examples of a well-understood product value proposition inspiring several initial competitors but product success being reserved for one.  Quicken and the Palm Pilot are excellent examples of products that emerged from a sea of competing approaches. Their success was driven by a superior understanding of the customer’s goals and requirements. (See Taylor & Schroeder; Butter & Pogue)
Good product design is dependent on an understanding of what the customer needs to do:

    • First of all, what are the target customer’s responsibilities and goals?
    • How do they currently achieve their goals?
    • What percentage of their time do they spend on each responsibility?
    • How is the customer measured against those goals?
    • How does the customer see the impact of our product offering on their daily work?

A successful product is dependent on the answers to these questions.  These answers take the form of task scenarios (see Cooper & Reimann). Note: There is a slight distinction between task scenarios and use cases.  Use cases tend to be more constrained, leaving less room to explore optimal design solutions.
The first step in determining the features required for product adoption is the identification and articulation of the user’s goals. These goals must be expressed in words and concepts familiar to the end users.  The task scenarios will describe, in the words of the users, the specific steps they need to take to derive value from the product.  The different customer types need to be discovered by examining the target organizations. Interviews with the different user types will support the construction of personas (see Cooper & Reimann).

A quick comment about “usability”: 
Historically usability testing has been a bottom-feeding activity, tacked on to the end of product development.  In this capacity it was able to tell you the mistakes that you made in product definition, too late to do anything other than to fix minor superficial problems.  The usability community does offer techniques for observing the work patterns of the user. (See Kuniavsky; Nielsen) These techniques will prove essential for the required user analysis.

Task scenarios and personas will be constructed from observing and interviewing the customer in their work environment.  This “contextual inquiry” is a widely accepted approach for obtaining the user information required to inform product design.  (See Cooper & Reimann; Beyer & Holtzblatt; Kuniavsky; Mayhew; Wixon & Ramey)

Learning about the user’s world: Contextual inquiry

“Need to get data about the structure of work practice.  This needs to be done in the field, where the customers do their work. Asking a customer to describe the structure of their work outside of the context of that work does not provide reliable results.”  (See Beyer & Holtzblatt)  “The core premise of Contextual Inquiry is very simple: go where the customer works, observe the customer as he or she works, and talk to the customer about the work.”

As the passages above suggest, contextual inquiry involves observing and interviewing the user in the context of his or her work.  Is the user’s environment an oil-covered computer on the shop floor or a pristine office physically removed from the production facility? How are mechanical design decisions communicated to manufacturing? How are manufacturing problems communicated to the designers? How much time does the user spend on her different responsibilities? How does she communicate her results? Observation of the workplace provides insights into these and other questions.

Figure 1

Historically, usability engineers have performed this contextual analysis. Involving the interaction designer in this process with the customer has proven essential in the past and is promoted in the definitive literature on the subject: designers do not need to be experts in the work itself, but they must understand the work structure and be skilled “in seeing patterns and distinctions in the way people organize work” (see Cooper & Reimann).  The presence of an interaction engineer allows the real-time exploration of interaction solutions to specific design problems.

Some general comments:

  • Observing versus asking: “Most people are incapable of accurately assessing their own behaviors (Pinker 1999) especially outside of the context of their activities” (see Cooper & Reimann).  While it is unrealistic to camp out and watch a user 24/7, it is very important to interview the customer at his or her workplace. 
  • Customer training is not an appropriate vehicle for obtaining this user information. The goals of these two activities are significantly different. The training goals collide with the discovery process.
  • Ethnography is a common tool used by system designers for studying the work patterns of users in highly specialized domains.  This anthropological approach allows non-physicians to design systems for doctors.  (See Wixon & Ramey)  In-house domain expertise provides important insights to work patterns and work context.
  • Usability training provides the skills required to conduct the user interviews and to make the contextual observations.  


The steps of contextual inquiry

The ideal user data gathering can be characterized by the following:

Domain understanding prior to the customer visit

The observation process requires an understanding of the target customer’s business.  In the case above, manufacturing principles need to be understood prior to a customer visit.  The observer must have enough domain awareness to understand the concepts revealed during the interview.

Account preparation prior to the customer visit

It is important to review with account management any issues that the interviewers need to either avoid or specifically address.  Asking questions that have already been answered many times in the past can be, at best, annoying to the customer.  Done right, a customer visit tends to be appreciated for the commitment it implies to solving the customer’s product problems.  It is truly a win-win opportunity.


Small number of appropriate test subjects

Ideally, 3 to 5 users representing each user type will be sufficient to identify fundamental product adoption requirements or problems with an existing product. It is important to note that these test subjects must be the actual users, the people who must adopt the product in order to declare success. Note: A systemic interaction design mistake is for software developers to falsely assume that they represent the actual users.  This is very rarely the case.  The reality that “we are not the users” (see Taylor & Schroeder) typically applies.


Minimize the disruption caused by the user data gathering process

Interviews should run one to two hours per user and be scheduled at the convenience of your customers.  Not only does an interview see diminishing returns after about an hour or so, but keeping sessions short also minimizes the inconvenience to your customers.

Script guidelines that guide the inquiry

Loose scripts need to be written that are designed to explore customer work patterns.  The scripts reflect the current understanding of the customer’s problem.  The interview process needs to be flexible.  This is the time to make real-time adjustments to assumptions, the time to learn about completely new concepts. It is for that reason that questionnaires per se have very little value here.  Usability training provides the skills required to guide, without leading, this user exploration. While product ideas may be explored, the goal of this inquiry is to listen and learn.

Team approach: interaction expertise coupled with domain expertise

The goal of the customer interviews is to gain insights into work patterns and requirements.  The fidelity of this exploration depends on both domain expertise and classical usability interview skills.  It is very rare for someone trained in usability analysis and contextual inquiry to also be an expert in the target product domain. That industry understanding is essential for the real-time exploration of concepts revealed in the interview process.  The best way to conduct these interviews is to adopt a team approach: a user analysis resource coupled with a domain expert. This team approach should be duplicated with new customer prospects.  The presence of this domain experience in the observation process reinforces customer confidence in your company’s command of the problem space.

A quick comment about customers as product designers:
Customers do not design the interaction solution any more than they design the database schema. It is essential to understand the reasons behind specific feature requests.  The customer may be requesting a workaround for another underlying problem or requesting a solution that conflicts with other existing functionality. The goal of the customer interviews is to identify the customer’s needs.  It is your responsibility to design the product solution.

The contextual inquiry deliverables

A report will be written for each user interview.  Analysis of these reports will enable the construction of both persona descriptions and task scenarios.  The details of these deliverables are beyond the scope of this document.

Account management strategy

We need to identify and address the account management concerns associated with the acquisition of this customer data.  Minimizing the demands on the customer is a key element.  Underscoring the value that this effort adds to the resulting product has also proven to be a valuable incentive.  The presence of domain expertise can only help build confidence in our product solution.

  • The customer expects the vendor to be aware of his problems.  He expects your team to be aware of the complaints that he has taken the time to share. It is very important that your organizational departments (product management, development, and professional services) coordinate efforts for this customer-facing task.
  • Quality product design requires quality interdepartmental communication.  Information obtained in product management needs to be shared with the product design team and vice versa.

The quality of the product is dependent on getting this information. In the absence of actual customers we need to find legitimate user proxies (as opposed to our local domain wunderkind).


References

  • Inside Intuit; Suzanne Taylor & Kathy Schroeder
  • Piloting Palm; Andrea Butter & David Pogue
  • About Face 2.0: The Essentials of Interaction Design; Alan Cooper & Robert Reimann (definitive reference on interaction design)
  • Contextual Design: Defining Customer-Centered Systems; Hugh Beyer & Karen Holtzblatt (definitive reference on contextual inquiry)
  • Observing the User Experience: A Practitioner’s Guide to User Research; Mike Kuniavsky
  • The Usability Engineering Lifecycle; Deborah J. Mayhew
  • Field Methods Casebook for Software Design; Dennis Wixon & Judith Ramey
  • Cost-Justifying Usability; Randolph G. Bias
  • The Design of Everyday Things; Donald A. Norman
  • Usability Engineering; Jakob Nielsen (definitive introduction)