We have recently been thinking about the way in which AI can allow us to deliver more value to our clients through our work.
As part of this thinking, we have been exploring the use of AI across our research processes. This blog will explore the central question of ‘Can ChatGPT help to articulate user needs?’
What is a user need?
GOV.UK describes user needs as the needs that a user has of a service, which that service must satisfy for the user to get the right outcome.
They are typically written using a defined syntax:
As a [user role], I need to [the task that a user has to do], so that I can [the goal that a user wants to achieve].
Understanding the needs of all users is critical to designing a service that is likely to be used, helps people get the correct outcome, and costs less to operate.
How we currently identify user needs
Identifying user needs is a multifaceted process that requires the analysis of findings from a range of research methods, for example: user interviews, observations, workshops, or analytics reviews.
We first capture our research notes in a specialist research tool called Dovetail. These notes can then be coded with distinct tags that indicate where user feedback represents a potential need.
Once we have a comprehensive list of needs, we move to a Google sheet and begin a process of refining the list. This involves converting needs from user feedback into the correct syntax, removing duplicates and sorting the needs into relevant themes. This is the foundation of our user needs backlog, which is a key output in all of our discoveries.
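As a purely illustrative sketch, the de-duplicating and theming part of that refinement step could look something like the following. The need texts and theme tags here are hypothetical, and in practice this happens manually in a Google sheet rather than in code:

```python
from collections import OrderedDict, defaultdict

# Hypothetical raw needs as they might be exported from coded research notes.
# The wording and theme tags are made up for illustration.
raw_needs = [
    {"theme": "Applying", "text": "As a trainee, I need to know the application steps, so that I can submit on time."},
    {"theme": "Applying", "text": "As a trainee, I need to know the application steps, so that I can submit on time."},  # duplicate
    {"theme": "Guidance", "text": "As a trainee, I need clear guidance, so that I can complete the paperwork."},
]

def refine(needs):
    """Remove exact duplicates, then group the remaining needs by theme."""
    seen = OrderedDict()
    for need in needs:
        seen.setdefault(need["text"], need)  # keep the first occurrence only
    themed = defaultdict(list)
    for need in seen.values():
        themed[need["theme"]].append(need["text"])
    return dict(themed)

backlog = refine(raw_needs)
```

In reality the judgement calls (near-duplicates, which theme a need belongs to) are what make this step manual; exact-match de-duplication like the above is the only part that automates trivially.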
Can ChatGPT identify and articulate user needs?
To determine whether ChatGPT can help to articulate user needs, we first needed some data. With data security in mind, we heavily edited an interview transcript from a previous project to remove all identifiable information. Where necessary, this was replaced with broader language to convey the same point but without identifying a participant.
The anonymised set of interview notes was uploaded to ChatGPT, alongside the following prompt:
“Create a bullet point list of user needs that are derived from the following research notes. Articulate each user need in the following syntax: As a [User], I need to [do something], so that I can [complete a task]. Make sure that user needs are not focusing on specific solutions at this stage.”
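In our experiment the notes were pasted into the ChatGPT interface by hand. For teams that wanted to script this step, the same prompt and notes could be assembled into a chat-style API request along these lines. This is only a sketch: the model name and message layout are assumptions, and no API call is made here:

```python
# Sketch only: package the fixed prompt plus anonymised notes into a
# chat-completion-style request body. Model name is an assumption.
PROMPT = (
    "Create a bullet point list of user needs that are derived from the "
    "following research notes. Articulate each user need in the following "
    "syntax: As a [User], I need to [do something], so that I can "
    "[complete a task]. Make sure that user needs are not focusing on "
    "specific solutions at this stage."
)

def build_request(notes: str, model: str = "gpt-4") -> dict:
    """Combine the fixed prompt with the anonymised notes in one user message."""
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": f"{PROMPT}\n\n{notes}"},
        ],
    }

request = build_request("...anonymised interview notes go here...")
```

Keeping the prompt fixed and only swapping the notes in would also make it easier to compare outputs across projects.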
This resulted in a list of 12 user needs based on the interview data. In reviewing these needs, three things stood out:
1) Some of the needs identified by ChatGPT were similar to the needs that we had identified in our initial processes
There were some positive results from this approach, with several of the generated needs bearing a close similarity to those we had identified through our existing processes during the project.
This can be illustrated by the following two user needs: the first was written by the team during the project, and the second was generated by ChatGPT from the anonymised interview notes:
As a healthcare professional, I need to know the different steps involved in applying for a global health activity, so that I can prepare and submit my application.
As a healthcare professional, I need guidance and support in navigating the application process, so that I can efficiently complete all the required paperwork and submit a successful application.
2) One new need was identified that we hadn’t captured in our initial processes
Interestingly, ChatGPT also succeeded in identifying a genuine user need that we had not captured as part of our initial processes. On reflection, this was not a strong need that came through in the user research; however, it may have been interesting to test as part of the user needs prioritisation survey.
3) Some needs were low quality or not relevant
Whilst points 1 and 2 reflect some of the successes of the approach, we also encountered some needs that were not of sufficient quality or not relevant to the project. For example:
As a user of the program’s website, I would appreciate regular updates and improvements to the website content, including additional information about global health opportunities beyond HEE partnerships, links to relevant resources, and a space for trainees to explore and discover various global health opportunities.
This does not meet the syntax requirements of a user need and has several other issues. Notably, it is far too long and touches on too many specific solutions, despite the prompt asking for needs that avoid them.
This experiment set out to answer a straightforward question: ‘Can ChatGPT help to articulate user needs?’ Having tested an approach, the answer appears to be yes, but with some limitations.
It is clear that it can be a useful tool for inspiration. Needs can often be difficult to identify and articulate, and ChatGPT offers a useful source of additional inspiration to inform the writing process.
Further, when we tested the needs with the rest of the Lagom team through a game of robot or no-bot, it became clear that it can be difficult to tell human-written needs apart from those written by ChatGPT.
However, it is clearly not a perfect tool for articulating user needs. Not all of the needs that it derived were well articulated. It also did not comprehensively identify every need that we picked out through our typical manual coding process.
Whilst the process as a whole was fairly effective, it still required a lot of human input to facilitate, either to quality assure the output or to reword large chunks of a need to perfect the phrasing.
ChatGPT isn’t going to be replacing our current processes any time soon but it will be interesting to see whether iterations to this process can make it even better in the future.