Surely the smartest people on the planet know where to click? Vicky Brock of Highland Business Research aims to answer that question, provided she doesn't succumb to the arctic temperatures in the room. She's starting the session off by threatening the audience with calisthenics in order to ward off the prospect of frostbite (did I mention that it's cold in here?).
The presentation is based on a case study of one of her clients - Durham University, one of the top universities in the UK and in Europe. The tale starts with the university cutting the marketing budget...
With the limited budget available, her client was just about able to spend some money on research, the idea being that it would give him the data to make the case for an increase in his marketing budget.
Initially, three challenges faced the project:
- Offline marketing focus - not much in the way of online marketing
- Academic teams drive content creation - research groups, etc
- Sites structured on organizational lines
In order to determine whether the current budget really was too little, there had to be a review of the customer base, to see exactly how much was currently being wasted and how much opportunity was being left on the proverbial table. This meant asking the right questions - can the current assumptions be trusted?
- Who really uses the site?
- How do they use it?
- What do they want from us?
- Where do we fit in the research process?
- Do we have the right information?
The actual goal of Durham University is not to drive more applicants - the goal is to drive the right candidates, as they currently get more than enough applicants, and the higher the number of applicants, the higher the cost to service those applications.
There are many diverse user groups, but do they match the priorities? There's a mix of evidence available to the University in the attempt to answer this question.
- Rudimentary web analytics - only just moved to base Google Analytics from log file analysis
  - Started off by looking at the analytics data, which showed that people were jumping straight to the application process without doing any research - that explained some of the unfocused applications.
  - Next, the University was looking at removing custom, unformatted personal pages belonging to people who (possibly) no longer worked there. Analytics showed that people who entered through those pages went on to view quite a few pages on the site, so maybe removing them wouldn't be a good idea - more investigation was called for.
- Surveys - multiple path surveys based on user groups (international, UK based, etc)
- The first "whoa" moment for the university. This was where they found out that people were doing research on universities for family members, not for themselves, a completely new segment that they'd never even thought of, let alone addressed
- Off-site search and sector data
  - Google Insights gave a view of university traffic cycles that enabled them to map their application cycles against the norm.
- Qualitative user testing - this human aspect was the element that tied all of the other data together
- Not lab based - in situ
- Careful recruitment (paid)
- Qualitative focus
- Record it on video for maximum impact
- Really plan the framework
- What don't they see / do?
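The entry-page check described in the analytics bullet above - comparing how deeply visitors browse depending on where they land - can be sketched in a few lines. This is a hedged illustration only: the session data, the `/people/` path convention, and the `avg_depth` helper are all hypothetical, standing in for whatever export Durham's team actually pulled from their analytics.

```python
# Hypothetical sketch: compare visit depth for sessions that entered via
# staff personal pages against all other entry points. The data below is
# invented for illustration; a real analysis would use an analytics export.
from statistics import mean

# Each tuple: (landing page path, pages viewed in that session)
sessions = [
    ("/people/j.smith/", 9),
    ("/people/a.jones/", 7),
    ("/", 3),
    ("/courses/physics/", 4),
    ("/people/j.smith/", 6),
    ("/apply/", 2),
]

def avg_depth(rows, predicate):
    """Average pages viewed for sessions whose landing page matches predicate."""
    depths = [pages for path, pages in rows if predicate(path)]
    return mean(depths) if depths else 0.0

personal = avg_depth(sessions, lambda p: p.startswith("/people/"))
other = avg_depth(sessions, lambda p: not p.startswith("/people/"))
print(f"personal-page entries: {personal:.1f} pages/visit")
print(f"other entries:         {other:.1f} pages/visit")
```

If the personal-page entries show a markedly deeper visit, that supports the post's conclusion that removing those pages outright would be premature.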
Out of 8 testers, not one was able to get a valid result out of the internal search engine... This was due to internal siloing of the site, which resulted in poor architecture. The users ended up going to Google to find the right page, which sometimes cost the University money: once they found the result they wanted, they actually clicked on the (sometimes expensive) PPC ad.
Some testers were very confused about where to go on the site: the navigation was written in very academic language, there was too much text on each page, and there was no clear flow.
Pulling all of this data together gave them an idea of the prospector journey flow:
- Is there a course or post of interest? - Basic facts, fast
- I'm potentially interested how does this work? - Exploratory mission
- This could work - where do I fit in? - Imagining a match - emotion begins to enter the process
- Can I be sure, what is it really like? - Trust and anticipation
The really big problems that they recognized needed fixing:
- Dysfunctional search engine
- Silos of content
- Too many words
- Incorrect assumptions about the users' needs:
- the application is not an end process
- users do not know the academic jargon
- hierarchy of informational needs
The results depended on the segment:
- The Transactors (MBA students) were leaving the site, most likely never to return, or to do so reluctantly
- The Chasers (research associates and the PhDs) were prepared to live with the site as it currently stood, although that didn't mean that improving the experience wouldn't benefit this segment
Having identified the issues and documented them, the budget was approved to fix the main issues:
- The internal search now works
- The custom, one-off, pages were branded and navigation was added - the Chasers were the ones that loved them and they were actually a huge entry point to the site for this highly lucrative segment
- Added pictures of young people doing stuff, to draw the prospectors in when they hit the 'emotional' stages of the cycle
Key Takeaways that apply to any site:
- Challenge and test your assumptions about who uses the site and why
- Talk to people - user test
- You can do wonders with a cheap tool set
- Get budget approved for critical fixes by showing user pain and monetizing lost opportunities.