Topic(s): analytics, research, and user research
As user research becomes firmly established in organizations around the world, it’s tempting to congratulate ourselves and retreat to our shiny new labs. But our work is nowhere near complete. As currently practiced, user research remains narrow in focus, often limited to the qualitative methods that reflect our own educational biases, and the tools that fit within our own comfort zones.
Other research practices, such as web analytics, business analytics, and market research, are equally powerful ways of learning about users’ wants and needs. More importantly, they’re often complementary with what we do. Only when our organizations combine methods that tell what is going on with methods that tell why will they truly realize the value of all user research.
In his talk, Lou Rosenfeld will explore the complementary aspects of the different research perspectives, argue for breaking down the silos that divide them, and suggest a framework for developing products and services that are better analyzed, better designed, and, ultimately, better performing.
About the speaker(s)
Lou Rosenfeld is Rosenfeld Media’s founder and publisher. Like many user experience folk, Lou started somewhere (library science), made his way somewhere else (information architecture, then user experience), and has ended up in an entirely different place (publishing). Lou spent most of his career in information architecture consulting, first as founder of Argus Associates and later as an independent consultant. He co-founded the Information Architecture Institute and the IA Summit. He helps curate the Enterprise UX and DesignOps Summit conferences. And he does know something about publishing, having edited or co-authored five books, including Information Architecture for the World Wide Web, and Search Analytics for Your Site.
Jess McMullin: It’s my great pleasure to introduce Lou Rosenfeld. He’s been tremendously influential in the field. If you haven’t had the chance to read Information Architecture for the World Wide Web, he’s one of the co-authors. It sold many, many, many copies. Over 100,000?
Lou Rosenfeld: 150,000.
Jess: 150,000 copies, wow. And not only that, I really think that Lou’s character has infused the Summit and the IA community with a particular quality that I have found very rarely in other places which is that of being open and approachable that he’s not a rock star with a rock star persona.
He’s a rock star who does amazing things but you can just come up and hang out with him and that has really influenced the tenor of the conversation and the attitude of this community. It’s why I’m part of it, is because I was able to be here and just be another one of the people at the table, so I’m grateful for that.
And, Lou’s more recent ventures include creating Rosenfeld Media, which is a phenomenal publisher of user experience titles. You may have seen their BookMobile cruising around out in the lobby. If you haven’t, check it out. We’re also going to be giving a few Rosenfeld Media titles away tomorrow.
But Lou wears two hats. He’s not only becoming a publishing mogul, one-day mogul but right now, sort of on that arc. He’s also still working with organizations to help them improve their user experience and particularly at a strategic level and how Information Architecture, User Experience and other things can benefit them as an organization.
So he has a wide roster of clients still and having had the opportunity to work on several projects with Lou in the past is certainly something that I’ve enjoyed tremendously. Lastly, I wanted to say thanks for Lou for breaking down barriers.
That same kind of attitude in the IA community of just, you can walk up and start talking to people has actually spread across to other places where I think through his influence, he’s tried to build bridges and to help us think more about not just our own little echo chamber but what we can learn from other places.
I’m really looking forward just hearing what he has to share with us today. Thanks Lou.
Lou: Thanks. That’s really nice. Thanks. Thank you Jess and thanks for the opportunity to be here. This is home for me, which oddly is one of the hardest places to speak for me so I don’t know. It’s like when I go home to my family, my parents, my brothers, I sometimes feel like it’s really hard to get a word in, New York Jewish thing perhaps.
I’m the youngest of five boys. Somehow, I almost feel that way here because I feel like this whole community is, I hate the term rock stars but really, I mean, just the presentations both by first timers as well as veterans the last couple of days have been amazing.
So I want to do this some time while I’m up here on stage and I’ll forget at the end. I just want really to give a round of applause one more to everyone who’s been involved in this event. One more time please.
All right, thanks. Jess, I like to build bridges. I’m really not so much about building bridges, I think, but burning down silos, and I’ve sort of spent the majority of my career as a consultant in the post-Argus world, ten years this month, working with large enterprise-class organizations that have these big messy blobs of information. They’re not really sites. They’re information spaces or, to use a $2.00 term, environments, where the content is often separated by political divisions or technology or what have you, and very much not in a user-centered way.
For many years, I taught a workshop on Enterprise IA, all about that: how to make information more user-centric and tear down some of those silos. Kevin Cheng actually created that graphic of me; I commissioned him to do it. Kevin is here somewhere and I’d like to get him to do some more. It’s time to update it. But I love this graphic.
I’m always dealing with silos but in my consulting in the last couple of years, I’ve started to see a new kind of silo crop up that worries me. So I’ve been, like a lot of you, certainly as a solo consultant, when I go in to a client, I’m trying to learn something about what they know.
What are their insights, what can they tell me, so that we don’t have to reinvent the wheel, so I don’t have to ask them to start doing certain studies, start doing any type of analytics research and what have you.
Where do insights live when you go into an organization? It’s really kind of confusing and kind of distressing because a lot of us, certainly information architects, we’re often working with or already part of some sort of user research team or user experience team.
And so those people can kind of tell us, “Oh, you know, we’ve been doing some user studies of this sort or that sort.” But it’s not really enough. That’s just a limited amount of the insights that are possibly resident in a large organization.
In fact, what I’m finding is we have a new kind of silo to tilt at. We have silos of insight. The information we need to make informed decisions as designers of one type or another is split up all over these large organizations.
And if you work in a large organization, as I mentioned, you might be seeing this, and if you’re not, I imagine you will be soon as you try to branch out beyond the canon of user research methods that we pretty much take for granted, because they’re not enough.
This is very typical I’m finding when I’m working with clients. I get to the client, large client and I start asking those questions, where are the insights. And like I said, I’m often working with the user research team and they’ve got their reports that they’ve been doing.
They’ve been doing user studies of one type or another as I mentioned and that’s great. I feel very comfortable but I don’t feel satisfied because it’s not enough. Where else do the insights live? Well, this is not atypical and I ask for this because they’re all about site search analytics these days.
I’d like to see their search logs. And it’s another group. And I’m already upset because, wait a minute. You’ve got one group of people that are ostensibly running the information architecture, but then you have search people over here? Wait a minute, these things are supposed to be together, and certainly the insights that come from them.
The insights we learn from looking at what people are searching for, for example, are somewhere else. It gets worse. We’ve got the call center. I can tell you, one client I worked with, one of the first things I asked them was, “I know you’ve got a call center. I know it’s somewhere else. Could you find out if there are logs, and have those call center logs been categorized one way or another?” “We don’t think so.”
We got to ask a lot of questions and go through a lot of hoops. In one case, it took something like four or five months working with the client and just pure dumb luck that we suddenly found someone who said, “Sure, I know the right people and I’ll get them for you.”
But there’s more. There are these guys, web analytics people, who are sitting on top of tons of data, and to make it more complicated, usually in large organizations it’s not just one group doing this. There’s a bunch of people working maybe with different applications.
Maybe there’s an Omniture installation somewhere and maybe there’s Unica somewhere else and so forth. So wait a minute, it’s getting more complicated. Where am I going to find the insights that me or the organization itself needs to drive its design decisions?
Then there’s, this is not atypical, there’s the voice of the customer people using their methods to figure out what we would call, not voice of the customer perhaps, but user behavior, user needs or what have you. So there are all these other people and they know something that’s different.
And then there are the people sitting on top of the CRM data. Now, I’m getting to a point where I’m not sure I’ve even really seen CRM data. I’m not sure many of us would know what it looked like if it hit us in the face.
We know it’s around somewhere and it’s important. It tells us something about user behavior. Then in a lot of organizations there’s a research center. This is where a lot of this stuff is supposed to be emanating from in the first place. These are the people who are supposed to have the insights about the future.
That’s a good thing to know about, but there are huge walls between some of these folks and the people who are making design decisions on the ground. And you know, a lot of large organizations have these research centers, but yours might not. Still, there are insights in places that you may not own; they’re called libraries.
People have done research for years. It might be applicable to what you want to do and yet we’re not going out and looking for past research to inform our decisions. Even if it was not invented here, it could still be highly valuable.
And then, this is actually a bittersweet thing for me: in one case, I showed up at a client and saw that they had a huge Indi Young-style mental model on the wall, and as the publisher of that book I was just, “Great! Great!” I wasn’t expecting that, but it was a different agency that owned it.
They were like the people who really knew how to do mental models and so that was very worrisome because it’s not just that the insights are fragmented in our organizations but sometimes they reside outside our organizations and that’s problematic because if they’re outside our organizations, they could leave. And we won’t have any institutional knowledge retained.
Then in the same organization, another agency has been brought in. They do brand architecture. I never even heard of brand architecture but when we started looking at it, it’s like, “Wow.”
We’re doing a lot of interesting, different research to come up with what we might call information architecture. And then another agency or another third party is doing net promoter research, huge, unbelievable surveys, but you had to go to them in order to change anything.
For example, the NPS work that was being done with one of my clients, as interesting as it was, asked no questions in its elaborate surveys about findability, but we couldn’t change that because those surveys resided in the hands of the third party. On and on and on.
We’re tilting at new silos and again if you’re not, I think you will be if you start looking. You probably will find them. It’s bad news because, what’s going on here? We have these organizations that get it.
They say, “Yes, we get user research, and let’s demonstrate that by showing you that we’re paying lots of money for lots of professionals who work for us or who work at our agencies or other third parties that we’re employing,” and they’re doing all this great stuff, but nothing’s coming together.
They’re totally overpaying and more importantly, besides duplicating effort, they’re losing out on the combinatorial effect of putting these insights together. There’s your pre-chewed tweet, combinatorial effect of putting insights together.
That’s what I want you to walk away with and I think it’s been a nice message actually in some of the sessions in the last couple of days like certainly Nate Silver but also, I heard great things about the one right before that I unfortunately missed because I was sweating in my hotel room.
OK, so we’re missing out on all these great benefits by not putting all these things together. It’s almost like a mapping or cartographic challenge, really a three-headed one. We have this fragmentation problem, which I’ve already mentioned: things live in silos, not just content but now the insights that ought to help us figure out what to do with content and other design issues.
We’ve got differentiation. We don’t really understand what that CRM stuff is about. I’ve never seen one of those things before. Yet, I sense it might be good to look at if I’m doing any kind of design work.
And then most importantly this combinatorial issue, the synthesis of all those insights into something that approaches an organizational brain, an organizational or institutional way to make smart design decisions.
So that’s kind of what we’re facing. In my limited experience with this, I’ve tried to map it and I think a lot of us are pretty good at doing this sort of mapping of an organization and how it works. It’s almost like the same sort of urge that we use, that got us into doing things like site maps and wire frames.
I couldn’t map it, so I came up with a bunch of dichotomies instead; let me run through some of them. What I’m finding is, there are a lot of people who are really good at figuring out what is going on, and there are a lot of other people, often not the same ones, who are really good at figuring out why those things are going on.
So for example, people draw on information that comes from analytics research, the quantitative data. They may learn something really interesting. But it’s all behavioral stuff. They don’t really know what was going on in a user’s head.
They can infer interesting hypotheses, but they can’t test those hypotheses. That’s something that people who are really good at doing user studies, for example, like a lot of us, are really good at.
We, on the other hand, aren’t always so good at knowing the right questions to ask. I’m going to start focusing a bit on two areas of practice, web analytics and user research, because these are the ones I know best.
This is really even more complex when you introduce all the other perspectives but let me just focus on these two. A lot of web analytics people can tell you what is going on. They can’t tell you why. A lot of us can tell you why things are the way they are but we don’t know what to test necessarily.
We don’t have the right questions to explore without the data to help us figure that out. There’s a whole kind of a breakdown between qualitative and quantitative people. I love this diagram with the two brains in there. I wish I had come up with it.
I’m not sure how well you can read it but numbers versus emotion, analysis versus empathy, the brain versus binky? Is that what it is? So you know, we have different ways of looking at problems, different ways we try to solve problems and we often are comfortable with different types of data or evidence to help us solve those problems or at least to help us understand what the problems might be.
So that’s a big breakdown. A lot of what I’m saying right now, I’m trying to make a point and by making that point, I’m going to over generalize quite a bit but I think a lot of us kind of would fall into one of these categories.
I don’t know that anyone is equally comfortable with qualitative and quantitative data. I’ve met very few people who seem able to do that. In many cases, I think some of us opt for qualitative studies because we’re really uncomfortable with quantitative data, or vice versa. It’s just the nature of how our minds work and what we’re comfortable with.
A lot of us are in the business of making sure our organizations reach their goals. Web analytics people, as an example, express goals as KPIs, key performance indicators: things that are measurable.
A lot of us in this room have been trained to think more on behalf of the user: what their goals are, how to identify them, and how to make sure they’re met. Sometimes those things are very easy to mesh together, especially on commerce sites, for example. Often, they’re not.
We have to resolve these things but we’re not always so good at it because usually, whoever is making the decision has a bias in one direction or the other. In effect they’re thinking with half a brain.
I think a lot of us are really good at measuring the world that we know. Certainly, again, on the analytics side, you start with your KPIs based on metrics and you say, “I’m going to look at all that data and figure out whether we’re performing against the goals that we’ve set out for ourselves as an organization. Are we doing well? Are we not doing well?”
Contrast that with looking at data for patterns, looking at data for things to emerge that were unexpected. That kind of emergent data analysis is really looking to learn about the world we don’t know and therefore don’t know how to measure.
And then yet another; I’m sure there are more dichotomies. There’s a breakdown between the comfort level with and understanding of statistical data versus descriptive data, and you could have people who make very strong garbage-in, garbage-out arguments on both sides, and they’d both probably be right.
But that’s not how they see it. Usually, we have a bias toward one direction or another. We like one, we like the other, usually not both but they tell us very interesting, but different things that often fit together nicely, as we’ll see.
I’ve tried to, like I say here, reduce this to a very over generalized, over simplified set of dichotomies that I’ve just gone through. This is just a summary of what those are. And this is just for two areas. This is just for web analytics and user experience.
But if you look at these, I hope what you’re starting to see is not just the differences but the fact that they come together quite nicely, that they’re very complementary. That’s where the combinatorial effect comes in: the insights that one has fit quite nicely with the insights of the other.
Now, I can’t map this. It’s just not in my wheelhouse but I bet some of you could. What I’m really hopeful for is that someone like Alex Osterwalder who wrote the Business Model Generation book. Is anyone familiar with it? Fantastic.
He actually created and published it with a bunch of people, building a whole new business model around publishing just to do one book. Amazing. But he did a whole bunch of mapping of essentially business models. It’s oversimplified, but damn it, it’s useful.
We need something like that to take all these types of insights and put them together in a way that would be really useful for us especially making design decisions. So without a map, why bother even trying?
You know, if we can’t map, this is really a hard problem, what’s the value of jumping in? Well, for one, we can really, really learn quite a bit from each other’s data, right? So let me give you an example.
This is one of my favorite things in the world. It’s a little snippet of site search analytics log data. All you really need to know is that if you look at it, the orange stuff like “vincense plate” is what was searched.
There are a few other things that you can maybe figure out, an IP address so you know who it is, the time-date stamps, you know when it happened. The zero next to the last bit of information is how many search results there were.
Now, look at another line. Same time, roughly two seconds later, same IP address. Now they’re searching license plate and I got, I think it’s 146 results. Interesting, what’s going on here? Maybe they spelled it right but well what happens next?
Oh, it’s a different user and they’re searching on a real mouthful. This is a state government site and this user was searching that site for Regional Transportation Governance Commission. People search things that long? They know what those things are even called? Do you know what government agencies are called?
As you’re looking through this, I bet you each one of you are already putting on your analysis hats and saying, “You know, obviously typos are an issue. How would I fix that problem? Maybe I would turn on the spell check on the search engine.”
Now did they get what they wanted when they were searching on the license plate or not? You don’t know. Is that a common thing that people search and what about this mouthful in the last line?
Basically, each one of us probably has a whole bunch of different ways of looking at this data, and we would start forming very different hypotheses. Just a couple of examples.
I think a lot of people from, to overgeneralize, the analytics community would be wondering things like, are we converting on license plate renewals? A lot of other people, like me, would be saying, “What are people searching for the most? Is license plate coming up a lot?”
If so, are we giving them that information very easily? Are we presenting it on the main page so they can renew their license plate easily, and so forth? So we look at the same data, a tiny little snippet of data, and we probably all start to come up with different conclusions, or at least different hypotheses, and think about what we do next differently.
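To make that kind of hypothesis-forming concrete, here is a minimal sketch of what poking at a site search log could look like. The log format, IP addresses, and timestamps are invented for illustration, loosely modeled on the snippet described above (what was searched, who searched it, when, and how many results came back); only the queries echo the talk's examples.

```python
from collections import Counter

# Hypothetical tab-delimited log format: <timestamp>\t<ip>\t<query>\t<result_count>
SAMPLE_LOG = """\
2010-04-11 10:32:01\t10.0.0.7\tvincense plate\t0
2010-04-11 10:32:03\t10.0.0.7\tlicense plate\t146
2010-04-11 10:32:05\t10.0.0.9\tRegional Transportation Governance Commission\t12
"""

def parse_log(text):
    """Split each line into a (timestamp, ip, query, hits) tuple."""
    rows = []
    for line in text.strip().splitlines():
        ts, ip, query, hits = line.split("\t")
        rows.append((ts, ip, query, int(hits)))
    return rows

def zero_result_queries(rows):
    """Count queries that returned nothing -- likely typos or content gaps,
    and candidates for spell check or new content."""
    return Counter(q for _, _, q, hits in rows if hits == 0)

rows = parse_log(SAMPLE_LOG)
print(zero_result_queries(rows))  # Counter({'vincense plate': 1})
```

The same parsed rows could feed very different follow-up questions, which is the point: an analytics person might join them against conversion data, while a content strategist might just sort the query counts.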
What the next action would be could be very different if you’re looking at this as an interaction designer versus an analytics person versus a content strategist. Another way we can really benefit each other is by helping improve each other’s design tools.
So I grabbed an Adaptive Path persona and you know again, I love site search analytics but there are lots of other types of analytics out there that you can do this with but I threw some site search analytics data in there.
So you’ve got your typical persona stuff, right? And then, why don’t we add some data? Wouldn’t that enrich it in a new way: “What does Steven search for?” Now I can actually go to my analytics people and say I could use some of that data.
In fact, maybe my personas might match up well with your audience segments. Maybe you can start putting these things together in some new and far more powerful ways. We can really help tell each other’s stories.
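A sketch of what matching a persona up with an audience segment's search data might look like. Everything here is invented for illustration: the segment names, the queries, and the persona fields are assumptions, not anything from a real analytics tool.

```python
from collections import Counter

# Hypothetical search queries, tagged with the analytics team's audience segments.
queries_by_segment = {
    "frequent-renewers": ["license plate", "renew registration",
                          "license plate", "registration fees"],
    "first-time-visitors": ["office hours", "license plate", "forms"],
}

def enrich_persona(persona, segment, queries_by_segment, top_n=2):
    """Return a copy of the persona with the segment's most common
    searches attached, so the persona answers 'what does Steven search?'"""
    counts = Counter(queries_by_segment[segment])
    enriched = dict(persona)  # don't mutate the original persona
    enriched["top_searches"] = [q for q, _ in counts.most_common(top_n)]
    return enriched

steven = {"name": "Steven", "goal": "renew his plates quickly"}
print(enrich_persona(steven, "frequent-renewers", queries_by_segment))
```

The design choice worth noting is the join key: the persona-to-segment mapping is exactly the kind of boundary object discussed later in the talk, something both the UX and analytics teams can agree on.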
I love this example. Adaptive Path again: Jeff Veen and a team were working on a product to make analytics data easier to understand. I think it’s called Measure Map. Is that right, anyone? Measure Map.
Google liked it. In fact, Google basically bought Measure Map but they really bought the team. And they’d already purchased the analytics application that they were going to make into Google Analytics but they wanted that team to work on it, to help tell the story of the data in a way that maybe someone from the web analytics world wouldn’t have thought.
Whereas a lot of analytics applications are just showing you reports because that’s what we do. We just show reports and more reports and more reports. What these guys did is they turned that on its end and they said reports are answers to questions.
Let’s lead with questions, let’s frame reports as questions. That’s an obvious “duh” to us, but it’s a really powerful thing that might not have happened if those people with a UX perspective weren’t involved with the project.
So we can help solve each other’s design problems. Now Jared, you’re becoming the new Jakob, because you have to be mentioned in every talk ever given in the user experience world. But this is a Jared story and I hope I get it right. It’s an oldie but goodie: Lands’ End.
If you’re not from around here, that’s a big commerce site for clothing. They were looking in their search logs and finding that people were searching on product IDs, in other words SKUs, S-K-Us, and they weren’t finding anything, which wasn’t surprising because there were no SKUs listed on the site. That information wasn’t there.
Well, what do you do? The fix is probably pretty obvious: you add that information to the web-based catalog, and that’s great. However, they wanted to learn a little bit more. Is there some mystery here? Why would people… know our SKUs? That’s kind of bizarre.
It’s like going to Amazon with the ISBN in mind. It’s probably not a common behavior. So it’s easy to fix, but they wanted to know more. The analytics data suggested a problem, and in this case the hypothesis was pretty obvious, and that’s the problem they solved, but they wanted to dig deeper, so they did a field study.
What they learned was, hey, people in their homes, they are referring to that nice printed catalog that they’ve been receiving every two months since they were 14 years old. You know because the website is a pain in the ass to use but anyway, they’re used to the printed catalog and it’s got those high res images.
Oh, I like that. Let me now go to the website and type in that number thing. Wait a minute, now you weren’t doing that twenty years ago. You were calling the 800 number. Yes, but then you have to deal with a human being and they’re so pushy. I hate them.
Or, you could send your order into the US mail but that takes a long time and apparently it’s easier to do this on a website. So it’s actually kind of a nice illustration of a cross-channel experience, which I would call service design but I might get in trouble for that.
Now this data, by the way, is based on a bunch of analytics that Netflix just threw into Excel. You will see no statistical tests. I think Nate made the point, but in case he didn’t make it strongly enough: you can do data analysis without doing statistical tests.
OK, repeat that to yourself, you can, you can, you should. Ok, so what Netflix is doing here is really interesting. They are looking at on the right, things that we searched. These are basically the names of movies and TV shows. Things like Click and The Departed and Thank You For Smoking.
And all the way on the left, sorry I didn’t lay this out, they had the number of times each one of these was searched. You’re seeing the searches sorted from most common to least common. All right, so that’s one hoop. You’re looking at stuff that was really popular.
The second hoop was they wanted to look at stuff that was being clicked through. So which results when the user did the search were being clicked through, another kind of analytics data, ClickStream analysis.
They were seeing that certain titles were clicked through at high percentages and others maybe not so high. OK, and then there’s the third hoop, which was: were those things being added to the user’s queue? Something being searched, and then OK, being clicked through, and then OK, was it being added to the queue, which is Netflix gold.
That’s what they want to see happen. In certain cases, things were very successful, like Thank You for Smoking. It was searched a lot, it was the number three search; it was clicked through at a fairly high rate, and then it was added to the queue at a really high rate.
All right, but then Netflix, they don’t really care about whether they’re succeeding. They care about where they’re failing. So, Lost, very popular, clicked through a lot and then added to the queue at a really, really low rate, any theories?
Well there’s a bunch of theories that again, you guys have probably come up with. It might have been that TV shows that are multi episode are problematic in a way that should impact the design of the way they present the search results.
It could be something with the term lost. Maybe it shows up in a lot of different titles for some reason and that confuses things. There are a lot of things that could be happening here. Maybe the availability of those items is really slow and so maybe Netflix needs to look at that and say, “Oh, we need to stock more of Lost.”
What they do do is look at patterns like this, not just Lost, and see what other patterns match up with this one, and see if they can learn something different, so they can test out some new hypotheses.
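The three-hoop funnel described above can be sketched in a few lines. The numbers here are made up for illustration; the talk only gives the qualitative pattern (Lost: searched a lot, clicked a lot, rarely queued), and the thresholds are arbitrary assumptions.

```python
# Hypothetical funnel counts per title: (searches, clickthroughs, queue adds).
funnel = {
    "Thank You for Smoking": (9000, 7200, 6100),
    "The Departed":          (8000, 5600, 4000),
    "Lost":                  (7000, 5600, 300),
}

def rates(searches, clicks, adds):
    """Clickthrough rate and add-to-queue rate for one title."""
    return clicks / searches, adds / clicks

def leaky_titles(funnel, min_ctr=0.6, max_queue_rate=0.2):
    """Titles searched and clicked a lot but rarely queued --
    the 'failing' pattern the talk says Netflix cares about."""
    out = []
    for title, (searches, clicks, adds) in funnel.items():
        ctr, queue_rate = rates(searches, clicks, adds)
        if ctr >= min_ctr and queue_rate <= max_queue_rate:
            out.append(title)
    return out

print(leaky_titles(funnel))  # ['Lost']
```

No statistical tests here, just ratios and a filter, which is consistent with the point that this kind of analysis can live happily in a spreadsheet.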
Now, if you were just an analytics person, you probably wouldn’t come up with this. You probably needed some other perspective. Maybe it was not a user experience person; maybe it was a manager who understood the basic core of the business model of Netflix. These things have to come together in order to get good, unique insights.
OK, so this is the challenge: getting these pieces together to work as one brain. It’s not that there are just two hemispheres; it’s more like 20 or 30 pieces to this brain, this jigsaw puzzle, that aren’t really coming together, and that’s what I think is a really, really fantastic challenge for us to be thinking about how we might solve.
We folks are a bunch of gap fillers. We see problems like this just like we saw problems like, hmm I’m on this project for this web thing and nobody is figuring out how to organize the information. Oh, that’s called Information Architecture, that’s me. I’m going to fill that gap.
People in this room are primarily here because we fill gaps, that’s how we got here. I think this is a key one that a lot of us are going to be dealing with for years to come. What do we do about it? From a kind of grassroots perspective, what we as practitioners can do, well it’s tough.
There are certain things we can do or we should at least ethically be obligated to try with the understanding that we’re in large organizations in many cases. We can’t always succeed which means of course, try, try again.
As an obvious, basic thing to do: surf those silos, talk to other people. Samantha Starmer gave a fantastic talk yesterday and talked about how at REI, she goes, good lord, she leaves her own building on the REI campus and goes over to where the marketing people are, in another building, and hangs out, and probably uses those really cool shoes as a great conversation piece.
She goes and talks to people, you can do that. You won’t get in trouble. You may even like, ask people to have lunch with you and find out what they do. Pretend you’re 22 years old again and you’re doing informational interviews. But do it with people in your organization who might, just might have something to do with insights that you haven’t seen. And be prepared to share what you’re doing with them of course.
I’m probably, I’m guessing not the only person in this room who loves, idolizes, would lie down and die for Dave Gray. So Dave Gray of XPLANE, he’s known as like one of the world’s foremost visual thinkers.
I was trying to resolve some of these dichotomies and figure out how to get people from different disciplines or different perspectives to talk to each other when we’re not even using the same languages. We’ve all been in that situation.
And Dave just said, "Oh, well, what you want to do of course is create boundary objects, or identify boundary objects." Boundary objects are, as you might gather, things that are common to different disciplines. The different disciplines might call them different things.
But they are resident in both disciplines, and therefore they can be a source for common discussion, and ultimately for people coming together and understanding each other. About a day after we got off the phone, Dave actually blogged about it and took the concept further, creating what he calls a boundary matrix, which you're seeing there.
But before we get into matrices, simple examples of boundary objects would be things like goals. We in user experience might see goals very differently than web analytics people do. They express goals in a quantitative way, as KPIs. Hey, that's cool: we've got something in common. That's a nice starting point.
Another example we've already covered: a persona might map quite nicely onto an audience segment, the kind of audience segments they're using in the analytics world or within our companies. That's another boundary object.
Can you sit down and show each other what you're doing with your common boundary objects, and start looking for ways to have them drive each other, meld them, improve them together, maybe even combine them?
So the URL for that blog post is right there. By the way, these slides are all up on SlideShare, although you might give me until tonight to upload this last-minute version, because I can never finish a presentation until five minutes before I have to give it.
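The boundary-object idea can be made concrete in code. The following is a minimal, purely illustrative Python sketch, not anything from the talk itself; all the concept names and discipline labels are hypothetical examples:

```python
# Illustrative sketch of Dave Gray's "boundary object" idea: the same
# underlying concept, named differently by two disciplines, can anchor
# a shared conversation. All names here are hypothetical examples.

boundary_objects = {
    "goals": {
        "user_experience": "user goals (qualitative, from interviews)",
        "web_analytics": "KPIs (quantitative targets)",
    },
    "audiences": {
        "user_experience": "personas",
        "web_analytics": "audience segments",
    },
}

def shared_vocabulary(discipline_a: str, discipline_b: str) -> list[str]:
    """Concepts that both disciplines already have a name for."""
    return [
        concept
        for concept, names in boundary_objects.items()
        if discipline_a in names and discipline_b in names
    ]

print(shared_vocabulary("user_experience", "web_analytics"))
# → ['goals', 'audiences']
```

The point is the lookup itself: each shared concept is a place where two teams can sit down, compare their versions, and maybe combine them.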
Another obvious thing: brown bags. You guys are doing incredible stuff. Can you present it every other Tuesday? Can you ask other people if they're interested in joining you? Who are those other people? Those are the silos. Bribe them; say you'll buy the pizza.
Give them an opportunity. At least try. You might get only one of the 25 people you hoped would show up, but that one is going to be pretty damn worthwhile. On the other hand, I have to say I've tried doing things like this in a more focused way, with very senior blessing.
And you know what happens? People say they're going to show up, and then work gets in the way. So take a lot of what I'm saying with a grain of salt, because ultimately, until people have clear incentives to share and to show up and meet with each other, it's hard to do.
It may be that you can't get beyond the really informal things, but you can still do those informal things without expecting people to actually come together as working groups and so forth. They may not have the blessing or the incentives from senior leadership, and that's what it ultimately comes down to.
So one last thing we can do, as I said earlier, we’re a bunch of really good mappers. You guys are probably some incredible mappers because you’ve been mapping information spaces for years and those are really hard to map. They’re multi-dimensional.
Actually, I think a lot of what we’re talking about here is not multi-dimensional, it’s maybe two-dimensional. The hard part is just doing the legwork. The harder part is doing the connections and synthesis that I mentioned earlier.
But there are some maps out there. The one I really like, Christian Rohrer has this landscape of user research methods. Some of you I’m sure have seen this. I think this is fabulous. It’s imperfect and flawed and wonderful.
So he has two axes: the data source, behavioral versus attitudinal, and the approach, qualitative versus quantitative. And he has a categorization scheme in there for the different types of methods he's mapping out.
Now, if I were a web analytics person, I'd have a lot of issues with this landscape. I'd say, my stuff is squeezed into the upper right-hand corner, and yet I have much more data than anybody else. Is that right? Does that make sense? And so forth.
It's skewed, because it's the kind of perspective folks from our community would come up with to describe the landscape of ways you might achieve insight. However, as flawed as it may be, it's useful, because now you have a thing, an artifact, to show other people, have them challenge it, take you seriously in the process, and draw them into the discussion.
Similarly, Avinash Kaushik has his Trinity Strategy. Avinash is probably the best-selling author in web analytics. If you look at his worldview, you'll see that our stuff is again a tiny little piece. He uses terms that we probably wouldn't, like "voice of the customer," in that whole experience area at the lower right.
So he accords us a little bit, just as Christian accords people like Avinash a little bit. But it's a start, and these are good maps. What we need is something that takes a much broader view, and I think we can get there. But it's a big challenge.
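A two-axis landscape like Rohrer's lends itself to a simple encoding. Here is a hedged Python sketch; the method placements below are rough approximations for illustration, not Rohrer's exact chart:

```python
# A rough, illustrative encoding of a two-axis landscape of user
# research methods (data source x approach). Placements are
# approximations, not Christian Rohrer's exact chart.

METHODS = {
    # method: (data_source, approach)
    "usability lab study": ("behavioral", "qualitative"),
    "ethnographic field study": ("behavioral", "qualitative"),
    "clickstream analysis": ("behavioral", "quantitative"),
    "A/B testing": ("behavioral", "quantitative"),
    "interviews": ("attitudinal", "qualitative"),
    "surveys": ("attitudinal", "quantitative"),
}

def in_quadrant(data_source: str, approach: str) -> list[str]:
    """All methods landing in one quadrant of the landscape."""
    return sorted(
        method for method, (source, app) in METHODS.items()
        if source == data_source and app == approach
    )

print(in_quadrant("behavioral", "quantitative"))
# → ['A/B testing', 'clickstream analysis']
```

Even this toy version makes the skew Lou describes visible: the quantitative-behavioral methods get one crowded cell, however much data sits behind them.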
I've been talking about what practitioners might be able to do. Ultimately, though, a lot of things have to come from leadership, from those people we call decision-makers; the tail can only wag the dog so far.
How do we get those decision-makers on board? One of the real problems with decision-makers, to over-generalize yet again, is that I don't think they take decision-making very seriously. Because if you look at the decision-making apparatuses that manifest themselves in large organizations, you would never have designed them that way.
In fact, one thing I really encourage you to do, if as a consultant you get access to C-level people or senior decision-makers, or if you're really lucky and someone within your own organization gets face time with very senior people: blue-sky it.
Ask the question: "You are decision-makers. I assume you need evidence or data to drive your decisions. Can we do a little blue-sky exercise around your ideal?
If we were starting our organization from scratch, if we burned the whole thing down and started over, how would we make decisions? I can assure you, it would not look like what you have now, which is organic and siloed."
The insights aren't in any kind of logical form. They're locked up in applications, in disciplines, and in perspectives that don't work together. Blue-sky the decision-making apparatus; that's your organizational brain. How would it look? What would it be like? Would it be balanced?
I would also encourage you to do something that makes a lot of people uncomfortable, but that I'm finding very effective in client meetings: banning certain terms. When I do this, I basically say, "Everyone in the room, if you use one of these terms, you have to put a dollar in the center of the table, and if I do it, it's $5.00." Or you can make the stakes higher, whatever it is.
These are the kinds of terms that are useless, that are meaningless. They're crutches, but they're worse than that. They're worse than crutches, because they have a negative impact.
When we say, "Well, our Omniture people do this," we've basically just given away whatever deeper understanding we had of what it is they really do, because we've labeled them as the Omniture people. Or we talk about certain methods without really understanding what the insights behind those methods might be.
So you can try something like this, banning certain terms that people use as crutches, to force the issue of thinking about what kinds of insights we need to make decisions: design decisions and, ultimately, strategic decisions, because they're all melding anyway.
Try banning terms. Try presenting frameworks. This is mine. Again, it's over-generalized, like every one of them will be. But it has a bunch of assumptions and questions that don't use the kinds of terms I'm suggesting you ban.
I mean, yes, I've presented web analytics and user experience; cross those out, but focus on the questions. Focus on the general concepts that fit within the realm of decision-making, and use that as your guide.
Then you're in a position to start doing design. A lot of you probably look at this and see it the way I do, which is dashboards. Executives love dashboards. They just go together, executives and dashboards.
But you know, dashboards are kind of a pain in the ass to design when those dials aren't all quantitative. Really, the metaphor starts to fall apart pretty quickly, because what we really want is to portray, or map (that term again), our insights.
We've got stuff coming in from card sorting here; we've got stuff coming in from clickstream analysis there. But then what we want to do is draw lines between them. Some of those things don't present themselves as dials, because they're not quantitative. And actually, the really interesting thing is to draw the connections between them, so we get that combinatorial effect.
So this metaphor may be the best one you can come up with, but it's an imperfect one. Use it, but realize it's not going to get you all the way. When you put these things together, you're really putting people together.
Now you’re really succeeding because you’re getting people who are amazing but haven’t been given the opportunity necessarily to work with other kinds of people.
Ultimately, the framework that you need to create, it’s not a map; it’s not a dashboard. It’s people. It really is people. It’s Soylent Green.
You've got to get these people thinking together, give them the opportunity to interact, let them rub up against each other and bump into each other and say, "Hey, who the hell are you, and what do you know?" That's how it happens.
That's the kind of organizational decision-making brain we need to start thinking about designing. We have to do some mapping, we have to do some modeling, but ultimately, we have to rethink the way our organizations are structured to make decisions.
I think we're those people. Again, we're the people who figure out what's not happening, the people who have those uncomfortable feelings of, "Ugh, something's wrong here and no one's thinking about it. And I guess I'm going to have to be that person yet again."
So you've been there; you already have some battle scars, I'm sure. You're going to be there again. You're going to solve this problem and then move on to another one. But if you're in this field, you're probably here because you see that things aren't right.
You may not even have the term to describe it yet, but you want to fix it. You want to plug that gap. So there's hope.
You guys are it. Thanks.
Jess: Thanks Lou, that was fantastic. We have some time for questions. We have a couple of mics running in the room. If you want to ask questions just raise your hand and wait until one of our mics finds you.
Audience Member: Hey, Lou, great. I'm in exactly this situation right now with a client, so this is perfect for me to hear. I just wanted to share one little anecdote from my recent past. We were trying to come up with an app to help some people with a complicated task, and we were looking at how they were accessing the forms for that task in the current state.
We did some site search analysis to see what keywords they were using. What was bizarre was that we saw people using terms that we thought regular people shouldn't know. People don't talk like this.
The cool thing was that we happened to be working with the call center too, because they were helping to lead people through this very process. We put two and two together and figured out that the call center associates were coaching people.
People would call and say, "I need to do this thing." And the call center person would say, "Oh, you need a form. Go to the website, type in this obscure word that nobody normally actually says, and it'll get you right to where you need to go."
But it was only because we connected the silos by accident, right? I just had that memory when you mentioned that.
Lou: Oh, I know.
Audience Member: Right, exactly. And now it’s occurring to me that I probably could have done some more detective work and even found other stuff in other places so this is great. Thanks.
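The site-search detective work in that anecdote can be sketched as a small script. This is hypothetical; the log format, terms, and thresholds are invented, not the questioner's actual data:

```python
# Hypothetical sketch of the site-search analysis described above:
# tally logged search terms and surface the unexpected jargon that
# "regular people shouldn't know". The log and terms are invented.

from collections import Counter

search_log = [
    "tax form", "form 27b-6", "form 27b-6", "renewal",
    "form 27b-6", "tax form", "permit",
]

# Terms a typical user would plausibly type on their own.
expected_terms = {"tax form", "renewal", "permit"}

counts = Counter(search_log)
surprising = {
    term: n for term, n in counts.most_common()
    if term not in expected_terms
}
print(surprising)
# → {'form 27b-6': 3}
```

A spike in jargon nobody would type unprompted is exactly the kind of signal that, as in the anecdote, points to another silo: here, a call center coaching people.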
Audience Member: You need like a Sherlock Holmes hat now in the picture instead of the hardhat.
Lou: You know, I always kind of worry about knowledge management, because it seems more focused, in some respects, on capturing what's in people's brains rather than benefiting from what we've already captured.
I always see it as useful but somewhat incomplete. It needs to be at the table, though. I think a lot of people in this room probably have some experience in knowledge management; there's a lot of overlap with IA, for certain. Probably a weaselly answer, but that's my answer.
Samantha: That was really fantastic, thank you, and very timely and apropos to what I'm thinking about. We recently created a centralized customer insights team that's separate from my team.
I still have user research on my side. On one hand, it's really great, because more people talking about insights is fantastic for dealing with the silo problem. On the other hand, we're now dealing with almost too much information.
Now that we have this centralized team getting time, money, and funding for new tools, how would you recommend we start identifying the insights that are the most valuable?
Lou: Let me describe it this way. When my authors are stuck, or when I'm stuck writing, it's usually because we've been writing too much and have lost sight of where we are; we're very much in bottom-up land.
What I tell them to do is flip to top-down land. Go back to your outline. Think about your structure one more time and try to fill out that skeleton. I think that's what I would do in your case.
When you get stuck bottom-up, when you're drowning in data, go back to what your goals are, and express them as KPIs if you can. Go back to why you're doing it, to what kinds of decisions are really important for the organization to make, and center your sifting and prioritization around that.
Anyone else? Are there any other anecdotes? Livia, right here.
Livia: Hey, Lou, can you clarify a little bit what you mean by mapping because I’m having a heated debate with Kevin here because we’re trying to figure out if you’re focusing on…
Livia: So, I guess my concern is this: I've done a lot of in-house work trying to get designers more interested in data. I went through an effort to bring those silos down and provide all these different sources of data so that designers could become interested in it, and then ran into Samantha's problem of becoming overwhelmed with the amount of data.
I found there's a big problem in spending time just mapping stuff out versus spending time actually trying to understand, "How do I interpret data?" Because this is new to a lot of designers.
Lou: So it's almost like there's a threshold: if you don't have a certain level of literacy about certain kinds of data, you can just spend a lot of time mapping, because that's so much more fun than trying to figure it out.
In a way, the mapping process should probably work the same way. You need to be kind of doing an ongoing effort to figure out where the insights are. You’re not going to do it all at once. I’m not sure it’s possible.
However, I think that's where boundary objects can really help. There's a handful of things you can probably identify that might have value to other people, and those can get designers to look at data.
Get those data people to look at design, because there's some commonality there. In a way, they're small touch points, high-value and high-priority, that people can focus on together. So if your real challenge is getting people together, I'd go back to Dave Gray's work on that.
That’s it, OK, thanks everyone.