127.8K Downloads | 161 Episodes
If you’re a leader tasked with generating business and organizational value through ML/AI and analytics, you’ve probably struggled with low user adoption. Making the tech is getting easier, but getting users to use, and buyers to buy, remains difficult—but you’ve heard a “data product” approach can help. Can it? My name is Brian T. O’Neill, and on Experiencing Data—one of the top 2% of podcasts in the world—I offer you a consulting designer’s perspective on why creating ML and analytics outputs isn’t enough to create business and UX outcomes. How can UX design and product management help you create innovative ML/AI and analytical data products? What exactly are data products—and how can data product management help you increase user adoption of ML/analytics—so that stakeholders can finally see the business value of your data? Every 2 weeks, I answer these questions via solo episodes and interviews with innovative chief data officers, data product management leaders, and top UX professionals. Hashtag: #ExperiencingData. PODCAST HOMEPAGE: Get 1-page summaries, text transcripts, and join my Insights mailing list: https://designingforanalytics.com/ed ABOUT THE HOST, BRIAN T. O’NEILL: https://designingforanalytics.com/bio/
Episodes
Tuesday Jan 14, 2020
Joost Zeeuw is a data scientist and product owner at Pacmed, a data-driven healthcare and AI startup in Amsterdam that combines medical expertise and machine learning to create stronger patient outcomes and improve healthcare experiences. He’s also taught a number of different subjects—like physics, chemistry, and mathematics—at Lyceo, an online education service, and Luzac College in the Netherlands.
Join Brian and Joost as they discuss the role of design and user experience within the context of providing personalized medical treatments using AI. Plus:
- The role data has in influencing doctors’ decisions—without making the decisions
- The questions Joost’s product team asks before designing any AI solution at Pacmed
- How people’s familiarity with iPhones and ease-of-use has influenced expectations around simplicity—and the challenges this poses when there is machine learning under the hood
- Why Brian thinks Pacmed’s abnormal approach to design is great—and what that approach looks like
- The simple, non-technical, but critical thing Pacmed did early on to help them define their AI product strategy and avoid going down the wrong path
- An example of an unexpected treatment prediction that Pacmed’s algorithm detected—which ended up being something that a specific field of medicine had been studying with classical research techniques 10,000 km away
- Where Joost believes Western medicine falls short with respect to new drug trials
Quotes from Today’s Episode
“Pacmed in that has a three-fold mission, which is, first of all, to try to make sure that every single patient gets the treatment that has proven to work for him or her based on prior data analysis. And next to that we say, ‘well, if an algorithm can learn all these awesome insights generated by thousands and thousands of doctors, then a doctor using one of those products is also very capable of learning more and more things from the lessons that are incorporated in this algorithm and this product.’ And finally, healthcare is very expensive. We are trying to maximize the efficiency and the effectiveness of that spend by making sure everybody gets a treatment that has the highest probability of working for him or her.” — Joost
“Offering a data product like this is really another tool in that toolbox that allows the doctor to pierce through this insane amount of complexity that there is in giving care to a patient.” — Joost
“Before designing anything, we ask ourselves this: Does it fit into the workflow of people that already have maybe one of the most demanding jobs in the world?” — Joost
“There’s a very big gap between what is scientifically medically interesting and what’s practical in a healthcare system.” — Joost
“When I talk about design here, I’m talking kind of about capital D design. So product design, user experience, looking at the whole business and the outcomes we’re trying to drive, it’s kind of that larger picture here.” — Brian
“I don’t think this is ‘normal’ for a lot of people coming from the engineering side or from the data science side to be going out and talking to customers, thinking about like how does this person do their job and how does my work fit into you know a bigger picture solution of what this person needs to do all day, and what are the health outcomes we’re going for? That part of this product development process is not about data science, right? It’s about the human factors piece, about how does our solution fit into this world.” — Brian
“I think that the impact of bringing people out into the field—whatever that is, that could be a corporate cubicle somewhere, a hospital, outside in a farm field—usually there’s a really positive thing that happens because I think people are able to connect their work with an actual human being that’s going to potentially use this solution. And when we look at software all day, it’s very easy to disconnect from any sense of human connection with someone else.” — Brian
“If you’re a product owner or even if you’re more on the analytics side, but you’re responsible for delivering decision support, it’s really important to go get a feel for what people are doing all day.” — Brian
Tuesday Dec 31, 2019
Di Dang is an emerging tech design advocate at Google and helped lead the creation of Google’s People + AI Guidebook. In her role, she works with product design teams, external partners, and end users to support the creation of emerging tech experiences. She also teaches a course on immersive technology at the School of Visual Concepts. Prior to these positions, Di worked as an emerging tech lead and senior UX designer at POP, a UX consultant at Kintsugi Creative Solutions, and a business development manager at AppLift. She earned a bachelor of arts degree in philosophy and religious studies from Stanford University.
Join Brian and Di as they discuss the intersection of design and human-centered AI and:
- Why a data science leader should care about design and integrating designers during a machine-learning project, and the impacts when they do not
- What exactly Di does in her capacity as an emerging tech design advocate at Google and the definition of human-centered AI
- How design helps data science teams save money and time by elucidating the problem space and user needs
- The two key purposes of Google’s People + AI Research (PAIR) team
- What Google’s triptych methodology is and how it helps teams prevent building the wrong solution
- A specific example of how user research and design helped ship a Pixel 2 feature
- How to ensure an AI solution is human-centered when a non-tech company wants to build something but lacks a formal product manager or UX lead/resource
- The original goals behind the creation of Google’s People + AI Guidebook
- The role vocabulary plays in human-centered AI design
Resources and Links
Twitter: @Dqpdang
Quotes from Today’s Episode
“Even within Google, I can’t tell you how many times I have tech leaders, engineers who kind of cock an eyebrow at me and ask, ‘Why would design be involved when it comes to working with machine learning?’” — Di
“Software applications of machine learning is a relatively nascent space and we have a lot to learn from in terms of designing for it. The People + AI Guidebook is a starting point and we want to understand what works, what doesn’t, and what’s missing so that we can continue to build best practices around AI product decisions together.” — Di
“The key value proposition that design brings is we want to work with you to help make sure that when we’re utilizing machine learning, that we’re utilizing it to solve a problem for a user in a way that couldn’t be done through other technologies or through heuristics or rules-based programming—that we’re really using machine learning where it’s most needed.” — Di
“A key piece that I hear again and again from internal Google product teams and external product teams that I work with is that it’s very, very easy for a lot of teams to default to a tech-first kind of mentality. It’s like, ‘Oh, well you know, machine learning, should we ML this?’ That’s a very common problem that we hear. So then, machine learning becomes this hammer for which everything is a nail—but if only a hammer were as easy to construct as a piece of wood and a little metal anvil kind of bit.” — Di
“A lot of folks are still evolving their own mental model around what machine learning is and what it’s good for. But closely in relation—because this is something that I think people don’t talk as much about maybe because it’s less sexy to talk about than machine learning—is that there are often times a lot of organizational or political or cultural uncertainties or confusion around even integrating machine learning.” — Di
“I think there’s a valid promise that there’s a real opportunity with AI. It’s going to change businesses in a significant way and there’s something to that. At the same time, it’s like go purchase some data scientists, throw them in your team, and have them start whacking stuff. And they’re kind of waiting for someone to hand them a good problem to work on and the business doesn’t know and they’re just saying, ‘What is our machine learning strategy?’ And so someone in theory hopefully is hunting for a good problem to solve.” — Brian
“Everyone’s trying to move fast all the time and ship code and a lot of times we focus on the shipping of code and the putting of models into production as our measurement—as opposed to the outcomes that come from putting something into production.” — Brian
“The difference between the good and the great designer is the ability to merge the business objectives with ethically sound user-facing and user-centered principles.” — Brian
Tuesday Dec 17, 2019
When it comes to telling stories with data, Cole Nussbaumer Knaflic is ahead of the curve. In October 2015, she wrote a best-selling book called storytelling with data: a data visualization guide for business professionals. That book led to the creation of storytelling with data, an agency that helps businesses communicate more effectively using data, and she has since followed up with another best-seller: storytelling with data: let’s practice! Prior to her current role, Cole served as a people analytics manager at Google, was the owner and chief consultant at Insight Analytics, and held several positions at Washington Mutual, among other roles.
In our chat, we covered:
- Why sharp communication skills are integral to telling stories with data
- The skills data people need to effectively communicate with data
- Who Cole thinks you should run your presentations by first, and the specific colleagues you should be sharing them with
- Why it’s important to practice presentations in informal settings first
- How looking at data in different formats can help you build more effective presentations
- The differences between exploratory and explanatory data analysis in the context of storytelling
- The important role of diction when presenting data
- Cole’s opinions on the skills many modern workers need around data storytelling
- Why data visualization and the ability to tell stories is not a nice-to-have skill
- What Cole’s approach to preparing for a presentation looks like and the format she uses to kick off the process
Resources and Links
Designingforanalytics.com/seminar
Twitter: @storywithdata
Company website: Storytellingwithdata.com
Quotes from Today’s Episode
“I've always really enjoyed that space where numbers and business intersect and enjoy how we can use numbers to get to understand things better and make smarter decisions.” — Cole
“If you're the one analyzing the data, you know it best. And you're actually in a unique position to be able to derive and help others derive greater value from that data. But in order to do that, you have to be able to talk to other people about it and communicate what you've done to technical audiences and to non-technical audiences.” — Cole
“When it comes to communicating effectively with data, you can't take out the human part. That's the part where things can either succeed or fail.” — Cole
“There's no single right way to show data. Any data can be graphed a ton of different ways. And so when we're thinking about how we visualize our data, it really means stepping back and thinking about what sort of behavior we’re trying to drive in our audience. What do we want them to see in this? And then it often means iterating through different views of the data, which is also a fantastic way just to get to know your data better because different views will make observations easier or less easy to see.” — Cole
“As soon as we try to draw attention to one aspect of the data or another, it actually makes any other potential takeaways harder to see.” — Cole
“Words are very important for making our data accessible and understandable.” — Cole
“Depending on the visualization, what you're doing is you're teaching yourself not to assume that the information is necessarily clear. You're being objective. And it sounds like a dumb question, but that's kind of what I usually recommend to my clients: We need to be really objective about our assumptions about what's being communicated here and validate that.” — Brian
“The low-fidelity format—especially if you're working with a stakeholder or perhaps someone who's going to be the recipient—enables them to give you honest feedback. Because the more polished that sucker looks, the less they're going to want to give you any.” — Brian
Tuesday Dec 03, 2019
Angela Bassa is the director of data science and head of data science and machine learning at iRobot, a technology company focused on robotics (you might have clean floors thanks to a Roomba). Prior to joining iRobot, Angela wore several different hats, including working as a financial analyst at Morgan Stanley, the senior manager of big data analytics and platform engineering at EnerNOC, and even a scuba instructor in the U.S. Virgin Islands.
Join Angela and me as we discuss the role data science plays in robotics and explore:
- Why Angela doesn’t believe in a division between technical and non-technical skills
- Why Angela came to iRobot and her mission
- What data breadcrumbs are and what you should know about them
- The skill Angela believes matters most when turning data science into a producer of decision support
- Why the last mile of the UX is often way longer than one mile
- The critical role expectation management plays in data science, how Angela handles delivering surprise findings to the business, and the marketing skill she taps to help her build trust
Resources and Links
Designing for Analytics Seminar
Quotes from Today’s Episode
“Because these tools that we use sometimes can be quite sophisticated, it’s really easy to use very complicated jargon to impart credibility onto results that perhaps aren’t merited. I like to call that math-washing the result.” — Angela
“Our mandate is to make sure that we are making the best decisions—that we are informing strategy rather than just believing certain bits of institutional knowledge or anecdotes or trends. We can actually sort of demonstrate and test those hypotheses with the data that is available to us. And so we can make much better informed decisions and, hopefully, less risky ones.” — Angela
“Data alone isn’t the ground truth. Data isn’t the thing that we should be reacting to. Data are artifacts. They’re breadcrumbs that help us reconstruct what might have happened.” — Angela
“[When getting somebody to trust the data science work], I don’t think the trust comes from bringing someone along during the actual timeline. I think it has more to do with bringing someone along with the narrative.” — Angela
“It sounds like you’ve created a nice dependency for your data science team. You’re seen as a strategic partner as opposed to being off in the corner doing cryptic work that people can’t understand.” — Brian
“When I talk to data scientists and leaders, they often talk about how technical skills are very easy to measure. You can see them on paper, you can get them in the interview. But there are these other skills that are required to do effective work and create value.” — Brian
Tuesday Nov 19, 2019
Tom Davenport has literally written the book on analytics. Actually, several of them, to be precise. Over the course of his career, Tom has established himself as the authority on analytics and how its role in the modern organization has evolved in recent years. Tom is a distinguished professor at Babson College, a research fellow at the MIT Initiative on the Digital Economy, and a senior advisor at Deloitte Analytics. The discussion was timely, as Tom had just written an article about a financial services company that had trained its employees on human-centered design so that they could ensure any use of AI would be customer-driven and valuable. We discussed their journey and:
- Why, on a scale of 1-10, the field of analytics has only gone from a one to about a two in ten years’ time
- Why so few analytics projects actually make it into production
- Examples of companies who are using design to turn data into useful applications of AI, decision support and product improvements for customers
- Why shadow IT shouldn’t be a bad word
- AI moonshot projects vs. MVPs and how they relate
- Why journey mapping is incredibly useful and important in analytics and data science work
- How human-centered design and ethnography is the tough work that’s required to turn data into decision support
- Tom’s new book and his thoughts on the future of data science and analytics
Resources and Links:
- Website: Tomdavenport.com
- LinkedIn: Tom Davenport
- Twitter: @tdav
- Designingforanalytics.com/seminar
- Designingforanalytics.com
Quotes from Today’s Episode
“If you survey organizations and ask them, ‘Does your company have a data-driven culture?’ they almost always say no. Surveys even show a kind of negative movement over recent years in that regard. And it's because nobody really addresses that issue. They only address the technology side.” — Tom
“Eventually, I think some fraction of [AI and analytics solutions] get used and are moderately effective, but there is not nearly enough focus on this. A lot of analytics people think their job is to create models, and whether anybody uses it or not is not their responsibility... We don't have enough people who make it their jobs to do that sort of thing.” — Tom
“I think we need this new specialist, like a data ethnographer, who could sort of understand much more how people interact with data and applications, and how many ways they get screwed up.” — Tom
“I don't know how you inculcate it or teach it in schools, but I think we all need curiosity about how technology can make us work more effectively. It clearly takes some investment, and time, and effort to do it.” — Tom
“TD Wealth’s goal was to get [its employees] to experientially understand what data, analytics, technology, and AI are all about, and then to think a lot about how it related to their customers. So they had a lot of time spent with customers, understanding what their needs were to make that match with AI. [...] Most organizations only address the technology and the data sides, so I thought this was very refreshing.” — Tom
“So we all want to do stuff with data. But as you know, there are a lot of poor solutions that get provided from technical people back to business stakeholders. Sometimes they fall on deaf ears. They don't get used.” — Brian
“I actually had a consultant I was talking to recently who said, you know, the average VP/director or CDO/CAO has about two years now to show results, and this gravy train may be slowing down a little bit.“ — Brian
“One of the things that I see in the kind of the data science and analytics community is almost this expectation that ‘I will be handed a well-crafted and well-defined problem that is a data problem, and then I will go off and solve it using my technical skills, and then provide you with an answer.’” — Brian
Tuesday Nov 05, 2019
025 - Treating Data Science at IDEO as a Discipline of Design with Dean Malmgren
Dean Malmgren cut his teeth as a chemical and biological engineer. In grad school, he studied complex systems and began telling stories about them through the lens of data algorithms. That led him to co-found Datascope Analytics, a data science consulting company which was purchased by IDEO, a global design firm. Today, Dean is an executive director at IDEO and helps teams use data to build delightful products and experiences.
Join Dean and me as we explore the intersection of data science and design and discuss:
- Human-centered design and why it’s important to data science
- What it was like for a data science company to get ingested into a design firm and why it’s a perfect match
- Why data science isn’t always good at creating things that have never existed before
- Why teams need to prototype rapidly and why data scientists should hesitate to always use the latest tools
- What data scientists can learn from design teams and vice versa
- Why data scientists need to talk to end users early and often, and the importance of developing empathy
- The difference between data scientists and algorithm designers
- Dean’s opinions on why many data analytics projects fail
Quotes from Today’s Episode
“One of the things that we learned very, very quickly, and very early on, was that designing algorithms that are useful for people involves a lot more than just data and code.” — Dean
“In the projects that we do at IDEO, we are designing new things that don’t yet exist in the world. Designing things that are new to the world is pretty different than optimizing existing processes or business units or operations, which tends to be the focus of a lot of data science teams.” — Dean
“The reality is that designing new-to-the-world things often involves a different mindset than optimizing the existing things.” — Dean
“You know if somebody rates a movie incorrectly, it’s not like you’d throw out Siskel and Ebert’s recommendations for the rest of your life. You just might not pay as much attention to them. But that’s very different when it comes to algorithmic recommendations. We have a lot less tolerance for machines making mistakes.” — Dean
“The key benefit here is the culture that design brings in terms of creating early and getting feedback early in the process, as opposed to waiting you know three, five, six, seven months working on some model, getting it 97% accurate but 10% utilized.” — Brian
“You can do all the best work in the world. But at the end of the day, if there’s a human in the loop, it’s that last mile or last hundred feet, whatever you want to call it, where you make it or break it.” — Brian
“Study after study shows that 10 to 20% of big data analytics projects and AI projects succeed. I’ve actually been collecting them as a hobby in a single article, because they keep coming out.” — Brian
Tuesday Oct 22, 2019
David Stephenson, Ph.D., is the author of Big Data Demystified, a guide for executives that explores the transformative nature of big data and data analytics. He’s also a data strategy consultant and professor at the University of Amsterdam. In a previous life, David worked in various data science roles at companies like Adidas, Coolblue, and eBay.
Join David and me as we discuss what makes data science projects succeed and explore:
- The non-technical issues that lead to ineffective data science and analytics projects
- The specific type of communication that is critical to the success of data science and analytics initiatives (and how working in isolation from your stakeholder or business sponsor creates risk)
- The power of showing value early, starting small/lean, and one way David applies agile to data science projects
- The problems that emerge when data scientists only want to do “interesting data science”
- How design thinking can help data scientists and analytics practitioners make their work resonate with stakeholders who are not “data people”
- How David now relies on design thinking heavily, and what it taught him about making “cool” prototypes nobody cared about
- What it’s like to work on a project without understanding who’s sponsoring it
Resources and Links
Connect with David on LinkedIn
David’s book: Big Data Demystified
Quotes from Today’s Episode
“You see a lot of solutions being developed very well, which were not designed to meet the actual challenge that the industry is facing.” — David
“You just have that whole wasted effort because there wasn’t enough communication at inception.” — David
“I think that companies are really embracing agile, especially in the last few years. They’re really recognizing the value of it from a software perspective. But it’s really challenging from the analytics perspective—partly because data science and analytics don’t fit into the scrum model very well for a variety of reasons.” — David
“That for me was a real learning point—to understand the hardest thing is not necessarily the most important thing.” — David
“If you’re working with marketing people, an 80% solution is fine. If you’re working with finance, they really need exact numbers. You have to understand what your target audience needs in terms of precision.” — David
“I feel sometimes that when we talk about ‘the business,’ people don’t understand that the business is a collection of people—just like a government is a collection of real humans doing jobs and they have goals and needs and selfish interests. So there’s really a collection of end customers and the person that’s paying for the solution.” — Brian
“I think it’s always important—whether you’re a consultant or you’re internal—to really understand who’s going to be evaluating the value creation.”— Brian
“You’ve got to keep those lines of communication open and make sure they’re seeing the work you’re doing and evaluating and giving feedback on it. Throw this over the wall is a very high risk model.” — Brian
Tuesday Oct 08, 2019
Dr. Murray Cantor has a storied career that spans decades. Recently, he founded Aptage, a company that provides project risk management tools using Bayesian Estimation and machine learning. He’s also the chief scientist at Hail Sports, which focuses on applying precision medicine techniques to sports performance. In his spare time, he’s a consulting mathematician at Pattern Computer, a firm that engineers state-of-the-art pattern recognition solutions for industrial customers.
Join Murray and me as we explore the cutting edge of AI and cover:
- Murray’s approach to automating processes that humans typically do, the role humans have in the design phase, and how he thinks about designing affordances for human intervention in automated systems
- Murray’s opinion on causal modeling (explainability/interpretability), the true stage we are in with XAI, and what’s next for causality in AI models
- Murray’s opinions about the 737 Max’s automated trim control system interface (or lack thereof) and how it should have been designed
- The favorite method Murray has for predicting outcomes from small data sets
- The major skill gaps Murray sees with young data scientists in particular
- How using science fiction stories can stimulate creative thinking and help kick off an AI initiative successfully with clients, customers and stakeholders
Resources and Links
New York Times exposé on the Boeing 737 Max
New York Times article on the 737 Max whistleblower
Quotes from Today’s Episode
“We’re in that stage of this industrial revolution we’re going through with augmenting people’s ability with machine learning. Right now it’s more of a craft than a science. We have people out there who are really good at working with these techniques and algorithms. But they don’t necessarily understand they’re essentially a solution looking for a problem.” — Murray
“A lot of design principles are the same whether or not you have AI. AI just raises the stakes.” — Murray
“The big companies right now are jumping the guns and saying they have explainable AI when they don’t. It’s going to take a while to really get there.” — Murray
“Sometimes, it’s not always understood by non-designers, but you’re not testing the people. You’re actually testing the system. In fact, sometimes they tell you to avoid using the word test when you’re talking to a participant, and you tell them it’s a study to evaluate a piece of software, or in this case a cockpit, to figure out if it’s the right design or not. It’s so that they don’t feel like they’re a rat in the maze. In reality, we’re studying the maze.” — Brian
“Really fundamental to understanding user experience and design is to ask the question, who is the population of people who are going to use this and what is their range of capability?” – Murray
“Take the implementation hats off and come up with a moonshot vision. From the moonshot, you might find out there are these little tangents that are actually feasible increments. If you never let yourself dream big, you’ll never hit the small incremental steps that you may be able to take.” — Brian
Tuesday Sep 24, 2019
Scott Friesen’s transformation into a data analytics professional wasn’t exactly linear. After graduating with a biology degree and becoming a pre-med student, he switched gears and managed artists in the music industry. After that, he worked at Best Buy, eventually becoming their Senior Director of Analytics for the company’s consumer insights unit. Today, Scott is the SVP of Strategic Analytics at Echo Global Logistics, a provider of technology-enabled transportation and supply chain management services. He also advises for the International Institute for Analytics.
In this episode, Scott shares what he thinks data scientists and analytics leaders need to do to become a trustworthy and indispensable part of an organization. Scott and I both believe that designing good decision support applications and creating useful data science solutions involve a lot more than technical knowledge. We cover:
- Scott’s trust equation, why it’s critical for analytics professionals, and how he uses it to push transformation across the organization
- Scott’s “jazz” vs “classical” approach to creating solutions
- How to develop intimacy and trust with your business partners (e.g., IT) and executives, and the non-technical skills analytics teams need to develop to be successful
- Scott’s opinion about design thinking and analytics solutions
- How to talk about risk to business stakeholders when deploying data science solutions
- How the success of Scott’s new pricing model was impeded by something that had nothing to do with the data—and how he addressed it
- Scott’s take on the emerging “analytics translator” role
- The two key steps to career success—and volcanos
Quotes from Today’s Episode
“You might think it is more like classical music, but truly great analytics are more like jazz.” — Scott
“If I'm going to introduce change to an organization, then I'm going to introduce perceived risk. And so the way for me to drive positive change—the way for me to drive adding value to the organizations that I'm a part of—is the ability to create enough credibility and intimacy that I can get away with introducing change that benefits the organization.” — Scott
“I categorize the analytic pursuit into three fundamental activities: The first is to observe, the second is to relate, and the third is to predict.” — Scott
“It's not enough to just understand the technology part and how to create great models. You can get all that stuff right and still fail in the last mile to deliver value.” — Brian
“I tend to think of this in terms of what you called ‘intimacy.’ I don’t know if you equate that to empathy, which is really understanding the thing you are talking about from the perspective of the other person. When we do UX research, the questions themselves are what form this intimacy. An easy way to do that is by asking open-ended questions that require open-ended answers to get that person to open up to you.” — Brian
Tuesday Sep 10, 2019
Tuesday Sep 10, 2019
John Purcell has more than 20 years of experience in the technology world. Currently, he’s VP of Products at CloudHealth, a company that helps organizations manage their increasingly complex cloud infrastructure effectively. Prior to this role, he held the same position at SmartBear Software, makers of application performance monitoring solutions. He’s also worn several hats at companies like LogMeIn and Red Bend Software.
In today’s episode, John and I discuss how companies are moving more and more workloads to the cloud and how John and his team at CloudHealth build a platform that makes it easy for all users—even non-technical ones—to analyze and manage data in the cloud and control their financial spending. In addition to exploring the overall complexity of using analytics to inform the management of cloud environments, we also cover:
- How CloudHealth designs for multiple personas from the financial analyst to the DevOps operator when building solutions into the product
- Why John has a “maniacal point of view” and “religion” around design and UX, and how they have the power to disrupt a market
- How design helps turn otherwise complex data sets that might require an advanced degree to understand into useful decision support
- How data can lead to action, and how CloudHealth balances automation vs. manual action for its customers using data to make decisions
- Why John believes user experience is a critical voice at the table during the very earliest stages of any new analytics/data initiative
Resources and Links
Twitter: @PurcellOutdoors
Quotes from Today’s Episode
“I think that’s probably where the biggest point of complexity and the biggest point of challenge is for us: trying to make sure that the platform is flexible enough to be able to inject datasets we’ve never seen before and to be able to analyze and find correlations between unknown datasets that we may not have a high degree of familiarity with—so that we can generate insight that’s actionable, but deliver it in a way that’s [easy for anyone to understand].” — John
“My core philosophy is that you need UX at the table early and at every step along the way as you’re contemplating product delivery, and I mean all the way upstream at the strategic phase, at the identification of what you want to go tackle next including product strategy, pain identification, persona awareness, and who are we building for—all the way through solving the problem, what should the product be capable of, and user validation.” — John
“[In] the cloud, we’re just at the very early stages of [automation based on analytics] from a pure DevOps point of view. We’re still in the world of show me your math. Show me why this is the recommendation you’re making.” — John
“When making decisions using data, some IT people don’t like the system taking action without them being involved because they don’t trust that any product would be smart enough to make all the right decisions, and they don’t want applications going down.” — Brian
“I think the distinction you made between what I would call user interface design—which is the surface layer, buttons, fonts, colors, all that stuff—often gets conflated in the world of analytics as being, quote, ‘design.’ As I think our audience is hearing from John here, it [design] goes much beyond that. It can get into something like, ‘how do you design a great experience around API documentation? Where’s the demo code? How do I run the demo?’ All of that can definitely be designed as well.” — Brian
“I hear frequently in my conversations with clients and people in the industry that there are a lot of data scientists who just want to use the latest models, and they want to work on model quality and predictive accuracy, etc. But they’re not thinking about how someone is going to use this model to make a decision, and whether there will be some business value created at the end.” — Brian