Digital Strategy

Part Two: Moving towards mission-driven digital

October 02, 2018

Adapted from a talk delivered at the Arts Marketing Association Conference in Liverpool, July 2018.

In part one of this blog, I introduced the concept of mission-driven digital and talked about some of the motivations for taking a mission-driven approach to your digital strategy. In this second part, I will sketch out a roadmap for moving towards mission-driven digital in the arts and cultural sectors.

 ––

How do we bring together the often-separated ‘mission’ and ‘digital’ worlds in arts organisations? We need to take a human-centred look at our mission, and find opportunities to use digital technology to meet users’ genuine needs and requirements.

Most business models have focused on self interest instead of user experience.

Tim Cook, Apple CEO

In my previous blog post, I suggested that one of the problems suffered by cultural organisations is that their digital channels focus almost entirely on sales and ticketing content, to the exclusion of any information related to their mission. But in many cases, this is self-inflicted. If you want to engage users in the mission of your organisation, you need to give them something of value to engage with.

The answer to this is user-centred design. Although there’s been a gradual shift towards user-centred design in the arts and culture sector, it hasn’t developed far enough or fast enough, especially when it comes to how the mission of arts and culture organisations is reflected digitally.

What dominates instead is the shared ‘big idea’, often an inspired hunch agreed by a group of people who work for the organisation. Or, even worse, the idea comes from the HiPPO (the Highest Paid Person’s Opinion) in the room, and everyone else just has to go along with it.

Taking a data-enabled, user-centred approach to your mission-driven digital strategy is key, and it’s not that difficult to do. There are essentially five steps:

1. Research your users.

This is the step most often replaced by assumed knowledge and inspired hunches, so it’s vitally important to challenge what you think you know about your audiences.

Research can take many forms, but it should always focus on users, whether directly or indirectly.

It can be really tempting to skip this part of the process, or to try to force well-known institutional dogma (“our audiences behave like this”) into a box called research. One way to test your assumed knowledge is to start the research phase with a blank piece of paper and draw four quadrants, labelled ‘known knowns’, ‘known unknowns’, ‘unknown knowns’ and ‘unknown unknowns’.

The purpose of this chart is to provide direction for the research you undertake, and also to ensure you persevere with research until you have at least one insight in your ‘unknown unknowns’ quadrant. If you reach the end of the process with nothing in that quadrant, don’t end the process! I guarantee there is something you don’t already know about your audiences that this research will uncover.

The research you carry out will vary depending on what you already know about your audiences, as well as the type of project, but there are three basic approaches you can take:

  • Watch them. You can carry out formal user testing, either in person or remotely via testing services. We regularly use tools like Hotjar or FullStory to record and replay web sessions, and they are incredibly useful if you want to look at how real users engage with digital interfaces. If you’re testing prototypes, you might need to rely on in-person or remote testing of specific parts of your user journey. If you’re looking at information architecture, you might want to engage users in card-sort exercises, in person or remotely, to find out how they categorise information.

  • Ask them. Talking to users can be really helpful, whether that’s through focus groups, surveys, stakeholder interviews or workshops. Bear in mind, though, that all surveys, workshops, focus groups and interviews will carry some bias from the people conducting them, and will also reflect the biases of the people being asked. This doesn’t make these tools any less useful, but it should affect how you structure the questions you ask, and how you combine this data with other observations.

  • Include non-users. One of the challenges of mission-driven digital is that we are currently over-focused on our existing digital users, who we see almost exclusively as ticket-buyers. It’s important to reach out to potential users as well as current users, so that we can see how we might expand our digital offerings in ways that are useful to people we’re not currently reaching online. Do this by identifying people in your database who are known to be engaged with your organisation, but not digitally, and inviting them to take part in surveys or focus groups (there’s a rough sketch of this below).
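As a rough illustration of that last point, here is a minimal sketch of how you might pull an ‘engaged but not digital’ list out of exported CRM data. The record shape and the field names (lastAttendance, hasOnlineAccount, emailOpensLastYear) are hypothetical placeholders for whatever your own system actually exposes.

```typescript
// Hypothetical shape of an exported CRM contact record.
interface Contact {
  email: string;
  lastAttendance: Date | null; // most recent event or visit
  hasOnlineAccount: boolean;   // registered on the website
  emailOpensLastYear: number;  // engagement with digital channels
}

// Engaged offline (attended in the last year) but showing little or no
// digital activity: good candidates for surveys or focus groups.
function findNonDigitalProspects(contacts: Contact[], now = new Date()): Contact[] {
  const oneYearAgo = new Date(now);
  oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);

  return contacts.filter(
    (c) =>
      c.lastAttendance !== null &&
      c.lastAttendance.getTime() >= oneYearAgo.getTime() &&
      !c.hasOnlineAccount &&
      c.emailOpensLastYear === 0
  );
}
```

However your data is structured, the principle is the same: look for people with recent offline engagement and little or no digital footprint.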

2. Identify requirements

The next step, once you have a picture of your users in mind, is to identify their specific needs or requirements for the project in hand. Maybe you’re looking to improve the donation flow on your website or in an app. Or perhaps you want to work on your membership registration process.

If we want to engage users with our mission, and to provide genuinely valuable features and functionality to them, then we need to express requirements in a way that reflects their needs, rather than ours.

We capture requirements for our projects in a number of different ways, but one of the most powerful is through user stories.

If this is something you find yourself doing, I really recommend reading the UK Government Digital Service’s Service Manual entry on user stories. It illustrates very clearly that the most important thing, when capturing requirements, is to focus on the user’s required outcome rather than any organisational objectives.

So when the Government Digital Service redesigned the electoral registration system, their user story was: “as a UK resident, I want to get my details on the electoral register so that I can vote”. The way they wrote the requirement didn’t mention technology, or websites, or any desired outcome on the part of the government, such as cost-saving. It clearly and unequivocally focuses on the user.

3. Design

Alright, so we have our research, and we have our user stories. Step 3 is design. This requires action, so it can be the hardest step to move on to. We humans are often paralysed by complexity, especially when dealing with it means motivating groups of people.

A useful way to get going with this step is user journey mapping: plotting out the steps that you want your users to go through. Using your research from step 1 and your requirements from step 2 as a guide, you create a map of the interactions that will help you fulfil the users’ needs. Again, the Government Digital Service has a really helpful blog post on user journey maps which talks through the process of creating them.

These user journey maps can be as high-fidelity or low-fidelity as you want. You might even find value in sketching them out roughly, without too much polished design, so you can start to identify areas for improvement even if you can’t tackle the whole journey in one go.
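To make this concrete, here is a minimal sketch of the kind of information a journey map might capture, expressed as data rather than a diagram. The donation journey, its stages and the field names are illustrative assumptions, not a prescribed format.

```typescript
// One stage of a hypothetical journey map.
interface JourneyStage {
  stage: string;          // where the user is in the journey
  touchpoints: string[];  // channels or pages involved
  userNeed: string;       // what the user is trying to do (from your user stories)
  painPoints: string[];   // friction observed in your research
}

const donationJourney: JourneyStage[] = [
  {
    stage: 'Awareness',
    touchpoints: ['social media', 'email newsletter'],
    userNeed: 'Understand what the organisation does and why it matters',
    painPoints: ['mission content buried beneath ticketing pages'],
  },
  {
    stage: 'Consideration',
    touchpoints: ['website support pages', 'front-of-house conversation'],
    userNeed: 'See what my donation would make possible',
    painPoints: ['no clear link between giving levels and outcomes'],
  },
  {
    stage: 'Donation',
    touchpoints: ['online donation form', 'box office'],
    userNeed: 'Give quickly without creating yet another account',
    painPoints: ['long form', 'forced registration'],
  },
];
```

Even a rough structure like this makes it easier to spot which stages your research flagged as painful, and which ones you can realistically improve first.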

4. & 5. Test & Repeat

Steps 4 and 5 are inextricably linked, and they matter because the cyclical nature of the user-centred design process is critical to its success. We should always be aiming to improve things for our users, either as their needs change and evolve or as our capacity to meet those needs increases.

However, testing mission-driven digital is hard, because many of these interactions are more longitudinal than transactional.

For example, consider the case of a new donor prospect. Depending on what research you look at, it takes between 7 and 13 interactions to move someone from a first meeting to making a donation. Those interactions can take place across multiple platforms, online and offline. And the final ‘conversion’ step may not even take place online.

This means we need to shift from a ‘session oriented’ view of digital behaviour to a ‘customer oriented’ view.

Tools like Hotjar let you look at behaviour at the level of the individual visitor rather than the individual session, showing where users return to a site on the same device and how they interact with it over multiple visits. Survey tools can also help you capture qualitative data about the site experience and user motivations, which can be used to refine the user experience in the future.

Similarly, combining data from your CRM system with your web analytics data gives you a customer-oriented view of your digital activity. For example, with a bit of web development work, you can send high-level information about logged-in users to Google Analytics, so you can examine web sessions segmented by user type (like ‘member’, ‘subscriber’ etc.).
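As a minimal sketch of what that might look like, assuming Google Analytics 4 with gtag.js already loaded on the page (the user_type property name and the way the user’s type is read from the page are assumptions standing in for whatever your CMS or CRM integration provides):

```typescript
// Assumes gtag.js (Google Analytics 4) is already loaded on the page.
declare function gtag(...args: unknown[]): void;

// Hypothetical helper: read the logged-in user's type ('member',
// 'subscriber', etc.) from a data attribute rendered by your CMS.
function getLoggedInUserType(): string {
  return document.body.dataset.userType ?? 'anonymous';
}

// Register the user type as a GA4 user property so sessions can later be
// segmented by it in reports and explorations.
gtag('set', 'user_properties', {
  user_type: getLoggedInUserType(),
});
```

You would also need to register user_type as a user-scoped custom dimension in the GA4 admin interface before it appears in your reports.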

Testing mission-driven digital is not always as straightforward as counting pageviews and conversions; it requires a more sophisticated measurement protocol and some ingenuity.

But if you get it right, you can gather vital intelligence about the ways in which your users want to interact with you online, and move ever closer towards a genuine mission-driven digital strategy.

In the final part of this blog series, I will look at some recurring themes of mission-driven digital and discuss how organisations can employ these to engage their users more effectively.