Brian is the Founder and Principal of Designing for Analytics, an independent consultancy helping technology leaders turn their data into valuable data products.
Brian also hosts the podcast, Experiencing Data, where he digs into the brains of leading voices at the intersection of design, ML/AI, analytics and data product management. Brian also authored the Designing for Analytics Self-Assessment Guide, teaches a seminar called Designing Human-Centered Data Products and was published in O'Reilly Media's 97 Things About Ethics in Data Science Everyone Should Know.
I was introduced to Brian by a subscriber to my blog who knew him and recommended that I talk to him about this topic. From the interview below you can see why! Brian has spent a great deal of time focusing on how data teams should work with stakeholders.
Brian’s perspective as someone who has a design perspective is very helpful, and this interview has made me view design in a different way too. Sometimes it can be hard as data folks to step outside of our world and ourselves; Brian has some great advice and experience that he shares in this interview.
It flows really nicely from the first post in this series last week, where Elena touched on some of these topics too.
Again this is taken from a live interview transcript.
Education
I have an undergraduate degree in percussion performance, so I was a musician. Well, I am also still a professional musician in addition to my consulting work, so all of my formal training regarding technology actually happened during my time in college. Doing early web design and learning about the Internet was happening right when I was in college. I eventually started to apprentice as a designer with an agency, and by the time I graduated from my undergrad, I had a little studio with some of my own clients, mostly the School of Performing Arts and Career Services at Northern Arizona University, where I went to school - which is up in Flagstaff, Arizona, in the mountains. I had kind of just picked that up, almost like a hobby on the side. It then turned into another income stream for me - this was all the peak dot-com era - so I ended up having this dual career track for 25 years, or however long it's been since I moved to Boston.
How did you come to data in your career?
I don't really have the separation of data and digital or software or the human factors piece because ultimately, when we think about it from a design perspective, I mean, words and text are kind of data and information, so it's still consumed the same way a non-data application is. This delineation for me isn't that important. Data people tend to cluster as an affinity group and they self-identify as something that's unique and different from technology. However, as a designer who's working with that as a material, I just don't have that distinction. So, I can tell you when I started to focus my consulting and my business work in the data product space - that's been an accidental and then an intentional journey.
I worked at several startups when I first moved out to Boston. I worked at Lycos, but gradually, starting while I was there, I moved into a lot of the business-oriented products. This includes Quote.com and some others, which became Lycos Finance, a competitor to Yahoo Finance. I got put on business and information design work around 2002. A lot of the other designers wanted to work in much more visual mediums, and I didn't have a traditional graphic design background - I was a self-taught designer - but I found the challenge of working with the information to be kind of fun. So, I liked the boring business applications and data products and things like this.
So, I ended up working kind of heavily in financial services and trading. I worked at JPMorgan, then I was at Fidelity for a while, and at ETrade. Eventually, I got pulled quite heavily into enterprise software in the storage area, which is a very subspecialized area of technology that Boston also has quite a bit of talent in. We have HPS out here, EMC, NetApp, and some of my colleagues from JPMorgan actually ended up in that space. Some startups were building software for people who manage data center infrastructure, including storage, virtual servers and networking. That was all analytics, because it was all basically troubleshooting software. Troubleshooting slow performance in the data center is a very difficult user experience, because there are so many factors that could lead to it. Broken is actually usually easier to fix than slow.
I also worked in that space for a while with a startup. They got acquired by NetApp, and so I worked more in that space. Then, I got dragged over to EMC by another engineer to work on some of their products over there. Analytics kept following me through all these different projects. I worked in video analytics, and I helped launch Apptopia, a client of mine - they're an app intelligence platform. It was kind of following me, so I made a very intentional choice about seven years ago: I'll focus my design consulting work in the analytics space, because I like working there.
From there, it kind of opened my eyes to things I didn't even know existed. I'd worked with business analysts before, especially at JPMorgan and Fidelity, but they were more like requirements gatherers in old-fashioned waterfall. However, I didn't really know that there were enterprise data teams that specifically designed and created solutions for internal business colleagues to make better decisions. Effectively, a decision support team that lives inside of an enterprise organization that may not be a software company - it could be a traditional brick-and-mortar business. When I started to specialize with my company, which is called "Designing for Analytics", and I started to go on podcasts and apply to speak at conferences, I was this weird designer hanging out with all the data science people.
I could see these different affinity groups that didn't necessarily self-identify as software engineers or technologists. They really saw data as a unique thing. The crack opened up for me and I realized, oh my gosh, analytics goes well beyond just stats products, digital software products, commercial software… There's a lot of internal work happening in this space, too, that's kind of how that whole thing opened up for me. I started writing and speaking and making more content towards that side of my business. So, I really see my audience as the people I try to help fall into two categories.
At the top, it's really the people in the technology sector. At tech companies, it's usually the founders and heads of product, and less so data leaders, because the product people are the ones that are responsible for viability, feasibility, desirability, usability, utility... There may be some data partners, and often there's going to be data science talent and engineering talent on that team. Typically, those product leaders are one healthy part of my audience.
What became my new other audience are these enterprise data leaders that have internal teams. What I'm finding they are interested in is not so much "we have a data product, or a particular application, that needs design help", but rather: how do we bring a design and product orientation into how we do our work now? That audience tends to engage with me through speaking, guest speaking or training, where a leader wants to upskill their staff, because their challenge tends to be the low-adoption challenge.
"We gave our CMO exactly what they asked for, and then they didn't use it." Literally - we said we could do it, we gave them exactly what they asked for, and they didn't use it. The team is frustrated, because the business doesn't trust the information and doesn't use it. I see that as a human factors problem, and design and product orientation - skills from the more mature software discipline - can be applied to internal data work. It doesn't matter that they're not swiping a credit card every month to use your dashboard. The principles are the same. We're still going to deliver a solution. Maybe if we make it for them, if we fit it into how they see the world, if they're involved with the creation of it… which is how we think about design.
We design with our stakeholders and users and customers, not for them. So, we don't go away and make something and then come back; they are an integral part of that. This is very much my philosophy, at least, and that's what I'm trying to help that branch of my audience with in particular: getting better results with the work that they're doing. Because, usually, the technology part is fairly easy. Maybe there are challenges in getting the data for your model, and the data is dirty and needs clean-up - that's not busy work, it's the necessary grind that you have to go through. But that's mostly cost work. That's not creating any value, yet. That's just getting the mise en place, as they say. When you're cooking, right, you've got to get all your ingredients cut up, right-sized, portioned out, before you turn the heat on, before you cook anything.
I'm really kind of trying to champion this product and human-centered design methodology in the world of data science and analytics, particularly in the machine learning and AI space these days, so that we have higher adoption of the work we're doing. People's work matters. If you're someone who has the skills to build these machine learning models or whatever the solution is, it's much more fun when your work gets used.
People don't want to be left alone to do some work in the basement and then write a paper about it five years later. This was kind of the joke - or maybe it's not really a joke - but very senior data scientists didn't really care whether anything got used. Their motivation was to get really highly-accurate models so they could publish a paper: "It's not my problem if the business doesn't use it, because that's not what I'm here for." That model, I think, has changed, and I think the leaders, the people managing these teams, are realizing we're on the hook to create value here. If no one uses the stuff we're making, a) my job is on the line, but b) actually, with COVID, we had the dispersal of labor from geography - workers could work anywhere.
What I've heard, even from guests on my podcast, is that people want to work on stuff that matters now, because their job market is much bigger than it was. "What's the meaning of why I'm waking up every day to build stuff, if no one's going to use it?" The desire to work on stuff that matters has actually increased, and so it's no longer sufficient for a highly-skilled technologist in the data space to just work on their laptop and then close the lid: "I don't care. I have no idea who this is for. Some guy in sales apparently needs a propensity model, and I've never talked to them. I don't even know where they live. I know nothing about them. It's not my problem." That mentality feels like it's changed. I've heard it's changed. People actually want to work on stuff that's going to matter now.
If you do care, that's a human factors problem. So, you can't attack that with Python and Jupyter notebooks. That's not a Python problem. That's not a data engineering problem. It's a human factors problem. That's a whole set of skills that we need, in order to increase adoption.
I see the way you've talked about this approach where they may not be swiping a credit card, but it's still a product. I see that as a solution to the problem, being that sometimes, what I call the human interfaces, don't work well. I work at a two person company now. I've worked at a 10,000 person company before. In the two person company, we almost know what each other is thinking. At the 10,000 person company, you don't even know who you're talking to sometimes, because it's through three other people.
The question that I have next is - can you describe some human interfaces that you've seen that work really well? And if so, why did they work well?
If you're talking about the activities of how teams work together to get better results - if that's what you're saying when you say human interface - then yeah, I can give you a story about that. It's from a student that took one of my seminars, "Designing Human-Centered Data Products". He was a VP of Data Science at a small data science consulting firm doing data science and analytics work in social work, or something in a space related to healthcare. He told me that he is usually a client-facing person, especially in that pre-sales role doing what we call discovery work - figuring out what actually needs to get done.
He said: we started using some of your techniques and asking questions, and not assuming that the problem as expressed by the customer is actually the thing that needs to be made. So often the problem that the customer gives us is actually a solution possibility. They don't express it as a problem; they express it as a solution, which they think will help you know what to make for them, so that they can become happy with it. "David, I need a machine learning model that will predict X, Y, and Z for me." "I need a ChatGPT-like solution that will help me know which ads my team should spend money on for the next quarter." They've just preloaded the assumption that ChatGPT or an LLM is going to help them.
He said: "The questions you're asking have been so helpful in fleshing out what is actually needed." It turned a light on, because it was unsolicited feedback from the client that the discovery work and the asking of questions didn't come across as challenging them. You could look at it as: oh, I can't challenge them. This is three business departments away from me. It's been funneled down from some leader somewhere and it's just - go deliver. Instead, he pushed back and asked these questions in a generous way, where it's not a challenge, but simply a way to figure out what the desired outcome means for the client and for the people that we're making this for.
Jared Spool, in the design space, likes to talk about this: all solutions have a user experience of some kind. You can't not have one. The way to frame a positive outcome is: "How would we know if we improved somebody's life here, in a work context? Did we improve somebody's life, and how would we know if we did?"
So, our data, our decision support applications, and whatever our solutions are that we're making - how do we make that person's life better? How would we know? And the only way to know that is to know what it's like to be a salesperson that uses data to decide who to cold call, or however your sales team sells. How does the pricing person price today such that they don't price too high, they don't price too low, the margins are good, but they don't overprice it. How do they do that work today and how do we involve them in that process?
Say you just give us machine learning to tell us what to pay for carrots when we restock - carrots for our chain of grocery stores. If you don't know anything about how that person negotiates the carrot pricing now and what factors go into it, and you just say "it's 39 cents, so don't pay more than that", they say: "I pay this other seller 50 cents a pound, but his carrots fly off the shelves. I'm willing to take a hit there, because we sell a lot more of those. You're now telling me I can't go above 41 cents, because of some model that I was not involved with." At worst they say that; more likely they simply ignore it and all that work. All your model did is spit out a number on a screen somewhere. Looking at this price is only one iota of their day - it's just one factor. For you, it was a six-month project to get all the data together to actually come up with that prediction; to him, it's insignificant.
They will still buy carrots whether priced $0.42 or a little bit more or less. What's in it for them? They're going to get paid the same. They don't get a bonus based on pricing the carrots right. This is where stuff falls apart in my book - the better way to do that is you get the carrot buyer involved in the process of making it. We figure out: what does success mean for that person and what is the business success? Why does this person need to price the carrots really well from the farmers that he's buying for? Why is that important? How does that align to some company goal there? If we can get that alignment right, it makes the carrot buyer look better to their manager, because they can express how they worked with the Data Science team.
We came up with this pricing model. We're no longer overspending. I can show you how it's changed. Taking some subjectivity out of our pricing was one of our strategic goals for the quarter. I can point to this thing that I worked on with the Data team to back that up. Now everyone is winning. The Data team has a win on the board. The carrot buyer feels good, because they can show how they're actually saving the company some money. The business is seeing positive results from the carrot person and their purchases, enabled by the Data Science team who built the model. If we're collaborating together on this, it’s now a very ideal world kind of situation. And I know that there can be particularly big companies, a lot of obstacles to breaking down this stuff.
Management has to open the door - I see some of this with Data Science leaders in particular when building models. You have to be willing to say no:
"You guys aren't ready to work with us."
"Well, what do you mean? Just give me a dashboard."
"Well, what I mean is you've just told me you have no time to work on this with us and we design stuff with our clients. We will not design something and throw it over the wall to you, because what happens is it doesn't get used. And I'm going to spend $70,000 in the next quarter of your money and you're going to get something that your team won't use. So do you want me to spend 70k? Where else could you spend that 70k of your budget?"
"Let me show you the desert of stuff behind me that has not gotten used. Look at these shiny models in GitHub. Look at that one up there. Oh, yeah, I built that one back in 2016. That could save us a billion dollars a year, but it didn't get used."
"The other way is: you can join us, which means you're going to have to commit time and energy and resources in the form of people to create and design this solution with us. That's how we will actually help you with your real problem."
Have you seen through your time in consulting or elsewhere, where this has not worked well... why and when? What was bad about it?
Where I see it the most is usually in some of the training I do, particularly for private teams. The person responsible for the product management or the design of the solution is the one who ultimately decides: how would these people use this solution in the end? Who is our customer? Who do we need to satisfy? If this person is not enrolled in that journey, and they're not enrolled in the non-technical work that has to be done, it's dead, because there's always going to be more technical work to be done.
With technical work it either works or it doesn't. Product work is not like that. It's so rare that we get this instantaneous clear signal that it's working. If people aren't comfortable with the ambiguity and the grayness of this type of non-technical work that has to be done, it's not going to work. That's the first thing. The second place it doesn't work is if management and leadership don't enable challenge between teams. Teams have to be able to say no to each other.
If your salespeople can't find 2 hours a week to do a design jam with our team to help us understand: How do you sell? Show us how you sell - what's it like to sell? How do you use data right now? Are you using spreadsheets, HubSpot? How do you decide who to call? There's 50,000 people in the database. How do you decide who to call next, that you think is going to buy something? If no one's asking that question, no one wants to ask that question, or the salesperson is not involved, it's dead. And sometimes it's management that has to say, no, we will not build this for you unless, dear sales leader, you take those guys off the phone and bring them into the room because in the short term, yeah, they're not going to be selling as much. They're not going to be on the phone.
This is a strategic engagement. We're talking about potentially saving or earning millions of new revenue here. It's not going to happen tomorrow. It's going to happen in the next year, if we're successful. So you've got to take your eye off the $10,000 sale that this guy might close this week and focus on the millions of dollars that are at play here. If we get this right, if the data team can help us stop calling the wrong people and start calling the right people that are ready to buy, you're going to look better. Their sales quotas are going to go up. The business does better. The data team helped the sales team crush it. Everyone again has now won. In that scenario, there's really not a loser, except in the short term.

If someone's just like: "I've got to meet my quota, my guy has to be on the phone 18 hours a week" - that's a management problem. They need to clear the path for this kind of work to happen. Otherwise, your risk of putting out what I call "technically right, effectively wrong" just goes up again, because you're going to take a guess. You're going to make assumptions. You throw it at the sales team and you hope that they're going to use whatever it is that you made. Many times they don't, because they weren't involved. They don't know how it came up. And as soon as those numbers don't look right, they say: "I've been here for 25 years and I've never seen X. Where did you guys get that? That does not sound right to me." You're done.
In your role as a consultant working with data teams, how is it that you help them? What do you train them in to help make human interfaces and data better?
The root of it is human-centered design, but taking the parts that are relevant and applicable to someone who's working on data products. Design is actually a team sport. Designers are really facilitating the act of creating something great. We tend to think of physical objects a lot of the time with this, but effectively what's happening, especially with complex solutions, is that a lead designer is facilitating some positive outcome, usually centered on the customer. The focus is more on the user, who's not always the business sponsor or the business stakeholder. In our internal world of data products, they're often the same person: the person that's writing the internal check is the one that's going to use it, or their team is going to use it.
The act of undertaking this creative work, getting comfortable with this non-binary, grey space, with doing customer research and user research - and not looking at it as challenging the person that said they need this thing. We really want to know what it is like to be a salesperson, because we don't know. We don't do that. Why are we asking all these questions? Because we don't want to build you something that's just a waste of your time or worse. We actually impose a tax on you. We produce this crazy report or dashboard and it's 50 pages long: "How is this supposed to help me? This feels like a tax. This doesn't feel like it's helping me. You just added work to my plate. What's in it for me?" We don't want to do that to you.
By letting us get into your head and doing this work, we're learning together and we're going to come up with something better for you. We teach the soft skills - the comfort of learning how to do that work, being comfortable with the ambiguity - and that is a lot of the work. This is in addition to some of the more tactical skills, like how we create journey maps to understand workflows and processes. If the stream is running down-current here, you don't want to create something that requires them to walk up-current. You want to give them a better canoe that's already on this stream, going this way.
We want to fit it into their life and the way they do stuff now, and not try to have them exit their habits and the way they do things. These are the soft skills. Ultimately, it's about how we increase the adoption of the things that we're making. The usability, the utility, the trust, the adoption. Those are the skills I'm trying to teach to teams. And they're not technical. They're all non-technical skills.
Key Takeaways
I wasn’t aware that COVID, and the further globalisation of work caused by it, has meant that people want their work to be more impactful. It makes so much sense… there has been a greater focus during this time than any other in living memory on people globally reconsidering what is important to them - where they live, how much money they need, what they’re willing to tolerate... Choosing work that is meaningful to them (and often, for data folks, this means impactful or useful) is completely natural in my mind.
I really like the term “solution possibility”, I’d never heard it before. Savvy data folks often ask a lot of whys when asked for something: Why do you want this? To understand X. Why do you want to understand X? To make a decision about Y. Why do you want to make a decision about Y? To contribute to KPI Z. What control levers can you pull and when and in what context?…
Being able to say no to work you think won’t be used or valuable is probably going to be a key theme in this series. I love how Brian has gone further into explaining why we should say no. If the stakeholder isn’t willing to spend the time with us in design, we’ll probably build the wrong thing and it will take a lot longer or not be used. Willingness to engage, in a stakeholder, has often been a subconscious reason to prioritise work for data folks, maybe we should just quantify it and be more explicit?
Last week I saw a theme regarding sales of our data products. Brian speaks about adoption and how design with the stakeholder can mean they will embrace our work with enthusiasm. Design feels like good pre-sales work for data folks.