Digital Marketing & Communications

We've seen 1s and 0s you wouldn't believe

Topic: User research

Five lessons learnt about user testing

📥  Communication, Design, User research

Doing anything more regularly improves how well you do it and gives you insights into how to do it better, and over the last couple of years we've really ramped up the frequency of our user testing.
So, here are five lessons that I want to share from our latest round of testing (for our online course publisher). They really helped me to improve our process.

1. Keep the technology barrier low

We used to have our own 'testing lab' (a MacBook Pro running Silverback hooked up to PC peripherals), but we quickly found that people got confused by the unfamiliar interface.
Where possible we now get users to test on their own machines at their own desks to remove this barrier. This has the benefit of making our volunteers feel at ease in familiar surroundings. In other situations, like guerrilla testing, we've found that a tablet works really well, as it's portable and doesn't require any accessory beyond a finger.

2. Don't go out alone

It's really hard to ask the questions and capture the answers effectively without help, so bring along a colleague. This also helps when approaching people cold because you will be more relaxed and confident, and from their perspective two random strangers approaching you is less unsettling than one.

3. Groups are good

Generally you get fewer in-depth responses when testing your product with a group of people, the trade-off being the greater breadth of replies. However, group discussion of individual responses leads to additional insight that you will not get when talking with one person. Make sure you allow time in your testing script to accommodate this off-piste discussion.

4. Be realistic

It's better to test a few features thoroughly than to rush through a whole raft of different aspects. Also, it's really important not to run too many sessions at one time; it’s tiring and you’ll miss insight through fatigue and response bias (you re-interpret what someone says because you've heard the same response 8 times already that morning).
We found 4 large-scale (~45 minute) sessions were enough for one day, or around 9 smaller-scale (~10 minute) sessions.

5. Be enthusiastic

You're proud of what you've done, but you want to make it better. Don't be defensive about any issues; instead, thank the user for finding the problem - they've done you a big favour.
Always remember that people are giving up their time for you, for no reward - so make it an enjoyable experience!


How to share in the rewards of user research

📥  User research

Our priority principle is 'Put users’ needs first'. Lately we’ve been helping University teams to carry out their own user research. We've found it very effective and we now want to encourage more departments to get involved.

Why we prioritise user needs

There are people out there who know how to make your department's digital presence the best it can possibly be. Those people are the users of your information or services.

For our users, the platform is a tool to get things done. Some sections of the platform cater for general users, such as information about our courses or directions to campus. Other sections of the platform serve very specific groups of users, like the content management system used by our publishers or guidance for alumni mentors.

Putting users’ needs first assumes that what’s best for our users is what’s best for the University. So when we are designing, developing or creating content, our users and their need for quick and simple access to information and tools inform and motivate our every effort.

How we develop an understanding of user needs

Putting users first is just a lofty principle unless we make a concerted effort to learn about our users. We need to continually work away at developing a better understanding of how our ‘users’ break down into cohorts, what the needs of those cohorts are, and how they would like those needs to be met.

Some other organisations have the budgets or the specialist in-house staff to maintain a continuous flow of user research feeding into their digital development processes. We have neither, but we haven't let that frustrate us.


Renewing the Library landing page

📥  Style, content and design, User research

We’ve been aware for some time that the Library homepage (or landing page as we've come round to calling it) had been looking tired and over-burdened with links. It also featured a menu system which often provided too many pathways leading to the same destination.

The rationale for the last revamp and ongoing updates had been to provide different pathways to information and services, each responding to a different user mindset and/or university role. However, in practice, we found that the presence of so many routes obscured the content and diluted the impact of information on the site, as things became buried in various sections of the Library webpages.

Screenshot: the previous Library landing page

Screenshot: heatmap of the click rate on the previous Library landing page

We met with the Digital Marketing & Communications team, which helped “shake up” and refresh our expectations for prioritising and re-positioning content. Digital’s hotspot analysis of how the Library landing page was used played a pivotal role in this review, showing us where people made heavy use of links and where there were barren, unused sections of the page and menus.



Our best decision - what the Alumni team learned from user research

📥  User research

A preoccupation of the Alumni Relations team in 2014 (beyond being nice to our graduates) has been a thing called Bath Connection. It’s a bespoke online portal which allows students or graduates needing careers advice, support or mentoring, to search for and connect directly with ‘Alumni Experts’ who have voluntarily uploaded a career profile and offered their help.

The damned thing

We were due to launch in October. However, as the summer break sped by, and the speed of development slowed to an infuriating crawl, it would have been tempting to skip what is the subject of this blog post, and just get the thing (which was rapidly being referred to as “the damned thing”) out there.

In retrospect, I’m glad we took the Digital team’s advice to squeeze in some user testing. It was probably the best decision we made over the course of the project.


User testing with staff

📥  User research

Following on from our user testing with students, we recently ran some testing with staff for a new website that collected all the online materials relating to Agresso - the purchasing system used by the University - ahead of a new version of the software being launched.

We've been working in collaboration with the Finance Office in this project, and early on we identified the need for user testing. Agresso is an important cog in the University machine, and any issues with the website would need to be urgently addressed.

Ask the right questions

Whereas the Students webpage from last year’s project had a fairly broad remit, the Agresso site – and many staff-facing sites like it – has a very specific purpose, and contains detailed information users require to complete involved tasks.

I personally have never used Agresso, but am aware it is quite an intricate system. In drawing up a script for testing, I consulted with our Finance Office colleagues to ensure that we asked users to complete the most common and important tasks.

Keep it short

Staff who use Agresso are often administrators with a hectic workload, so it is important to keep the test concise. We trimmed down the user testing process we had used for the student-focused sprint described in the earlier blog post.

We decided against using Silverback in order to keep things straightforward, and settled on the old-fashioned approach of pen and paper, with the script printed out so we could write down the answers while talking to the users.

We felt this stripped-down approach was best because of a) the aforesaid time constraints and b) the fact that we were ultimately testing the content, not the users' familiarity with Agresso and its supporting materials.

Search widely for users

The team in the Finance Office drew up a list of staff who used Agresso on a daily basis across several departments, including the library, Student Services, Computing Services, the Faculty of Engineering & Design and ICIA (Institute of Contemporary Arts).

We then contacted all of them to ask who would be available, setting ourselves a target of the 'magic number' of five users for our testing.

Not everyone was able to find time to meet us, so when aiming for a desired number of users, it is wise to ask many more than you need.

Go to them for answers

The busy staff who were able to give us half an hour found it more convenient for us to visit them in their offices.

This has a number of advantages:

  • It places how the staff member would use the website in context. We get a fuller picture of their role and Agresso's function within it.
  • It also allows colleagues in the office to join in on any discussion that spins out of the testing, offering different perspectives on Agresso and user needs.
  • This kind of discussion gives you a broader sense of the goals staff need Agresso to achieve, which informs further thinking on the website.

Be precise with the purpose of the test

We had to revise the script when it became clear that it wasn’t providing the correct context, and the users weren't entirely clear on the purpose of the tasks. This is a by-product of testing users on something they know more about than you do.

Whereas the structure and questions in the script remained the same, the revised version was much more specific about the purpose of the website we were building, and how it was supporting staff who would be using the new version of Agresso.

Bring a knowledgeable friend

As a rule of thumb it is always a good idea to conduct user testing in pairs – one to talk the user through the test, the other to write down answers and ask additional questions. In this instance it was essential to bring along a colleague from the Finance Office who was familiar with Agresso, could articulate the needs of users better than I could, and could ask more incisive follow-up questions.

The feedback we got was positive, which validated the work we had done so far. We also had some constructive criticism which resulted in important changes being made. This led to us:

  • rewriting content for clarity, after users questioned the terminology used
  • making changes to the information architecture of particular sections of the site
  • developing a new section aimed at infrequent approvers.

You can see the results of our work when the new Agresso site goes live in the middle of March.


The Students' homepage has changed

📥  User research

The Students' homepage has changed based on feedback from students.

It’s the same information as before but improved in three ways:

1. Links to Webmail, Moodle, Samis etc are now at the top of the page

2. The same page works on your phone and tablet just as well as it does on the desktop

3. Brighter colours replace the drab greys and make it easier to pick out information.

These are the first of many regular updates we’ll make to the homepage and other bits of the site for students.

Please let us know what you think by filling out this short survey so we can carry on providing the best homepage for you.

Alternatively, you can email us or comment below.

You can read more about how the new design evolved and how we based our work on user testing with students on this blog.


Tinker, tailor...

📥  Design, User research

We've been working to a new set of principles in the Digital team since November.

As a result, we've increased our levels of user testing, based our decisions on data rather than assumption, and released web pages and sites more often.

There is still one area we haven't really tackled though - iteration.

As a small (but growing) in-demand team, we can let going back and fixing things get overlooked in favour of moving on to the next project. But a website is a living thing - and much like a garden, it can look unloved if it isn't tended to regularly.

With this in mind, we've iterated the navigation in the Research site.

User testing with students – Dos and Don’ts


📥  User research

A colleague and I have been working on a new homepage for the Students site. In keeping with our new delivery principles, we have put user testing, along with data, front and centre in informing our decisions.

We did two bouts of testing, one on the current Students homepage and another on a prototype version of the new page, and we learned what did and didn’t work for testing as a result.

Write a script

For the tests we created a document detailing our aims, the variables we put in place, and our questions. After this we created a user testing script which:


  • asked the students to complete a form giving their details (year, course) and their technical preferences (Mac or PC, browser etc)
  • gave them a simple series of tasks to complete on the Students page and
  • asked follow-up questions about the tasks, the site and their thoughts about it.

We used Silverback, a tool which films the student completing the tasks and records all their keystrokes. At this stage it is important to ask the student if they are comfortable with this, and to emphasise that they will remain anonymous and the recording won’t be distributed.

Right time, right place


The Student Union is the perfect place for user testing

Every student we spoke to was happy to help. However, there are two important things to consider:


  1. Students are busy - don’t be greedy with their time and subject them to long, involved and complicated tests.
  2. The campus is a busy place - students are constantly toing and froing. Pick the location of your user testing very carefully.

Unusually, we made all our mistakes the second time round. For the first tests we hung around the lobby of the Students Union and approached students who appeared to be in between lectures. We also went mid-afternoon, when it wasn’t too busy or noisy.

For the second test, we went to the Union again, but at lunchtime when it was heaving, and it became clear we were going to struggle to find a) anywhere to sit and b) a quiet place where our conversation and recordings weren’t going to be drowned out.

We compounded this error by decamping to the front area of the library. We managed to test five students, but it took a lot longer than our first round of six. This was because most students we approached were en route to a lecture or – the temerity! – an exam.

Key points

When conducting user testing with students, keep the following in mind:


  • write a short testing script that is quick to complete
  • make sure they are comfortable with being recorded
  • make it clear that you are testing the website, not them
  • write questions which allow them to talk freely, rather than leading questions designed to give you the answers you want to hear
  • pick a venue where students are likely to be enjoying a break, at a time when it won’t be too busy.

At the end of this you will have valuable information that will shape your product and strengthen your hand when engaging with stakeholders. After all, few things should trump ‘it’s what the students want’.