Moderated card sorts – physical or digital?

I recently carried out a day of moderated open card sort testing as part of a website refresh project, which provided an opportunity to experiment with both physical and digital card sorts. 

In the spirit of reflecting on Lagom’s research practice, this blog talks about some of the things that I have learned through the process and how I think it affects our practice going forwards. 

Physical card sorts 

The physical card sort sessions involved printing the site content onto a set of index cards, with participants using post-it notes to write down the group headings as the card sort progressed. 

In terms of delivering the session, using physical cards felt interactive and engaging for the users. The use of post-it notes allowed participants to reflect and refine their group headings as the session progressed. Similarly, the use of index cards provided participants with the opportunity to write down new content terminology in instances where they felt a term was incorrect or ill-defined.   

To analyse the results alongside a larger number of unmoderated card sorts, I needed to upload the physical responses to our card sorting software (Optimal Sort). This process takes extra time and may have some implications for the quality of the research. 

One of the benefits of the physical approach is that it affords participants the opportunity to create and articulate sub-groups. Whilst Optimal Sort works well generally, it does not provide an equivalent opportunity to create sub-groups or allow participants to explore new content terminology. Consequently, there is a risk of losing the thinking associated with the creation of sub-groups when importing the findings to the software.      

Digital card sorts

The digital card sort sessions had some advantages in comparison to the physical approach. Notably, the process was more efficient in terms of translating the findings from participants into the analysis software, as participants were directly inputting the data into the tool. Optimal Sort works well for this process. However, as it lacks options to create and record sub-groups, I had to capture this information through conversation rather than allowing participants to map it out themselves. 

Another benefit of the digital process is that it offered the chance for clients to observe the sessions without physically being in the room, by using screen sharing software, meaning the client could hear feedback first-hand throughout the day. This proved particularly valuable when analysing the results with the clients, who were better able to understand the nuances in terminology used by participants on the day.

How will we deliver card sorts in the future?

Like most things, the answer to this question feels particularly context dependent. Both physical and digital options have their advantages and disadvantages. However, all things being equal, I have a preference for the physical option, because it offers possibilities for participants to create sub-groups and rename cards as they participate in the activity, whilst also feeling more engaging and fun. I feel that this ultimately results in better insights, which is the most important factor in research at the end of the day.
