Experiencing Data with Brian T. O’Neill
054 - Jared Spool on Designing Innovative ML/AI and Analytics User Experiences


December 15, 2020

Jared Spool is arguably the most well-known name in the field of design and user experience. For decades, he has been a witty, powerful voice for why UX is critical to value creation within businesses. Formerly an engineer, Jared started working in UX in 1978, founded UIE (User Interface Engineering) in 1988, and has helped establish the field over the last 30 years. In addition, he advised the US Digital Service / Executive Office of President Obama, and in 2016, Jared co-founded the Center Centre, the user experience design school that’s creating a new generation of industry-ready UX designers.

Today, however, we turned to the topic of UX in the context of analytics, ML, and AI, and what teams (especially those without trained designers on staff) need to know about creating successful data products.

In our chat, we covered: 

  • Jared’s definition of “design”
  • The definition of UX outcomes, and who should be responsible for defining and delivering them
  • Understanding the “value chain” of user experience and the idea that “everyone” creating the solution is a designer and responsible for UX
  • Brian’s take on the current state of data and AI awareness within the field of UX, and whether Jared agrees with Brian’s perceptions
  • Why teams should use visual aids to drive change and innovation, and two tools they can use to execute this 
  • The relationship between data literacy and design
  • The type of math training Jared thinks is missing in education, and why he thinks it should replace calculus in high school
  • Examples of how UX design directly addresses privacy and ethical issues with intelligent devices
  • Some example actions that leaders who are new to the UX profession can do immediately to start driving more value with data products

Quotes from Today’s Episode

“Center Centre is a school in Chattanooga for creating UX designers, and it's also the name of the professional development business that we've created around it that helps organizations create and exude excellence in terms of making UX design and product services…” - Jared

“The reality is this: on the other side of all that data, there are people. There's the direct people who are interacting with the data directly, interacting with the intelligence interacting with the various elements of what's going on, but at the same time, there's indirect folks. If someone is making decisions based on that intelligence, those decisions affect somebody else's life.” - Jared

“I think something that's missing frequently here is the inability to think beyond the immediate customer who requests a solution.” - Brian

“The fact that there are user experience teams anywhere is sort of a new and novel thing. A decade ago, that was very unlikely that you'd go into a business and there’d be a user experience team of any note that had any sort of influence across the business.” - Jared

“[At Netflix], we'd probably put the people who work in the basement on [server and network] performance at the opposite side of the chart from the people who work on the user interface or what we consider the user experience of Netflix […] Except at that one moment where someone's watching their favorite film, and that little spinny thing comes up, and the film pauses, and the experience is completely interrupted. And it's interrupted because the latency, and the throughput, and the resilience of the network are coming through to the user interface. And suddenly, that group of people in the basement are the most important UX designers at Netflix.” - Jared

“My feeling is, with the exception of perhaps the FANG companies, the idea of designers being required, or part of the equation when we're developing probabilistic solutions that use machine learning etc., well, it's not even part of the conversation with most user experience leaders that I talk to.” - Brian

Links

053 - Creating (and Debugging) Successful Data Product Teams with Jesse Anderson


December 1, 2020

In this episode of Experiencing Data, I speak with Jesse Anderson, Managing Director of the Big Data Institute and author of a new book titled Data Teams: A Unified Management Model for Successful Data-Focused Teams. Jesse opens up about why teams often run into trouble in their efforts to build data products, and what can be done to drive better outcomes.

In our chat, we covered: 

  • Jesse’s concept of debugging teams
  • How Jesse defines a data product, and how he distinguishes data products from software products
  • What users care about in useful data products
  • Why your tech leads need to be involved with frontline customers, users, and business leaders 
  • Brian’s take on Jesse’s definition of a “data team” and the roles involved, especially around two particular disciplines
  • The role that product owners tend to play in highly productive teams
  • What conditions lead teams to building the wrong product
  • How data teams are challenged to bring together parts of the company that never talk to each other – like business, analytics, and engineering teams
  • The differences in how tech companies create software and data products, versus how non-digital natives often go about the process

Quotes from Today’s Episode

“I have a sneaking suspicion that leads and even individual contributors will want to read this book, but it’s more [to provide] suggestions for middle, upper, and executive management.” – Jesse

“With data engineering, we can’t make v1 and v2 of data products. We actually have to make sure that our data products can be changed and evolve, otherwise we will be constantly shooting ourselves in the foot. And this is where the experience or the difference between a data engineer and software engineer comes into place.” – Jesse

“I think there’s high value in lots of interfacing between the tech leads and whoever the frontline customers are…” – Brian

“In my opinion, and this is what I talked about in some of the chapters, the business should be directly interacting with the data teams.” – Jesse

“[The reason] I advocate so strongly for having skilled product management in [a product design] group is because they need to be shielding teams that are doing implementation from the thrashing that may be going on upstairs.” – Brian

“One of the most difficult things of data teams is actually bringing together parts of the company that never talk to each other.” – Jesse

Links

 

052 - Reasons Automated Decision Making with Machine Learning Can Fail with James Taylor


November 17, 2020

In this episode of Experiencing Data, I sat down with James Taylor, the CEO of Decision Management Solutions. This discussion centers on how enterprises build ML-driven software to make decisions faster, more precise, and more consistent, and why this pursuit may fail.

We covered:

  • The role that decision management plays in business, especially when making decisions quickly, reliably, consistently, transparently and at scale.
  • The concept of the "last mile," and why many companies fail to get their data products across it
  • James' take on the operationalization of ML models, and why Brian dislikes this term
  • Why James thinks it is important to distinguish between technology problems and organizational change problems when leveraging ML.
  • Why machine learning is not a substitute for hard work.
  • What happens when human-centered design is combined with decision management.
  • James's book, Digital Decisioning: How to Use Decision Management to Get Business Value from AI, which lays out a methodology for automating decision making.

Quotes from Today's Episode

"If you're a large company, and you have a high volume transaction where it's not immediately obvious what you should do in response to that transaction, then you have to make a decision - quickly, at scale, reliably, consistently, transparently. We specialize in helping people build solutions to that problem." - James 

"Machine learning is not a substitute for hard work, for thinking about the problem, understanding your business, or doing things. It's a way of adding value. It doesn't substitute for things." - James

"One thing that I kind of have a distaste for in the data science space when we're talking about models and deploying models is thinking about 'operationalization' as something that's distinct from the technology-building process." - Brian

"People tend to define an analytical solution, frankly, that will never work because […] they're solving the wrong problem. Or they build a solution that in theory would work, but they can't get it across the last mile. Our experience is that you can't get it across the last mile if you don't begin by thinking about the last mile." - James

"When I look at a problem, I'm looking at how I use analytics to make that better. I come in as an analytics person." - James

"We often joke that you have to work backwards. Instead of saying, 'here's my data, here's the analytics I can build from my data […]', you have to say, 'what's a better decision look like? How do I make the decision today? What analytics will help me improve that decision? How do I find the data I need to build those analytics?' Because those are the ones that will actually change my business." - James

"We talk about [the last mile] a lot ... which is ensuring that when the human beings come in and touch, use, and interface with the systems and interfaces that you've created, that this is the make-or-break point where technology goes to succeed or die." - Brian

Links

 

051 - Methods for Designing Ethical, Human-Centered AI with Undock Head of Machine Learning, Chenda Bunkasem


November 3, 2020

Chenda Bunkasem is head of machine learning at Undock, where she is focusing on using quantitative methods to influence ethical design. In this episode of Experiencing Data, Chenda and I explore her methods for designing ethical AI solutions, as well as how she works with UX and product teams on ML solutions.

We covered:

  • How data teams can actually design ethical ML models, after understanding if ML is the right approach to begin with  
  • How Chenda aligns her data science work with the desired UX, so that technical choices are always in support of the product and user instead of “what’s cool”
  • An overview of Chenda’s role at Undock, where she works very closely with product and marketing teams, advising them on uses for machine learning 
  • How Chenda’s approaches to using AI may change when there are humans in the loop
  • What NASA’s Technology Readiness Level (TRL) evaluation is, and how Chenda uses it in her machine learning work 
  • What ethical pillars are and how they relate to building AI solutions
  • What the Delphi method is and how it relates to creating and user-testing ethical machine learning solutions

Quotes From Today’s Episode 

“There's places where machine learning should be used and places where it doesn't necessarily have to be.” - Chenda

“The more interpretability, the better off you always are.” - Chenda

“The most advanced AI doesn't always have to be implemented. People usually skip past this, and they're looking for the best transformer or the most complex neural network. It's not the case. It’s about whether or not the product sticks and the product works alongside the user to aid whatever their endeavor is, or whatever the purpose of that product is. It can be very minimalist in that sense.” - Chenda 

“First we bring domain experts together, and then we analyze the use case at hand, and whatever goes in the middle — the meat, between that — is usually decided through many iterations after meetings, and then after going out and doing some sort of user testing, or user research, coming back, etc.” - Chenda, explaining the Delphi method.

“First you're taking answers on someone's ethical pillars or a company's ethical pillars based off of their intuition, and then you're finding how that solution can work in a more engineering or systems-design fashion.” - Chenda

“I'm kind of very curious about this area of prototyping, and figuring out how fast can we learn something about what the problem space is, and what is needed, prior to doing too much implementation work that we or the business don't want to rewind and throw out.” - Brian

“There are a lot of data projects that get created that end up not getting used at all.” - Brian

Links

Undock website

Chenda's personal website

Substack

Twitter

Instagram

Connect with Chenda on LinkedIn

 

050 - Ways to Practice Creativity and Foster Innovation When You’re An Analytical Thinker


October 20, 2020

50 episodes! I can’t believe it. Since it’s somewhat of a milestone for the show, I decided to do another solo round of Experiencing Data, following the positive feedback that I’ve gotten from the last few episodes. Today, I want to help you think about ways to practice creativity when you and your organization are living in an analytical world, creating analytics for a living, and thinking logically and rationally. Why? Because creativity is what leads to innovation, and the science says a lot of decision making is not rational. This means we have to tap things besides logical reasoning and data to bring data products to our customers that they will love...and use. (Sorry!)

One of the biggest blockers to creativity is in the organ above your shoulders and between your ears. I frequently encounter highly talented technical professionals who find creativity to be a foreign thing reserved for people like artists. They don’t think of themselves as being creative, and believe it is an innate talent instead of a skill. If you have ever said, “I don’t have a creative bone in my body,” then this episode is for you.

As with most technical concepts, practicing creativity is a skill most people can develop, and if you can inculcate a mix of thinking approaches into your data product and analytical solution development, you’re more likely to come up with innovative solutions that will delight your customers. The first thing to realize, though, is that this isn’t going to be on the test. You can’t score a “92” or a “67” out of 100. There’s no right answer to look up online. When you’re ready to let go of all that, grab your headphones and jump in. I’ll even tell you a story to get going.

 

Links Referenced

Previous podcast with Steve Rader

 

049 - CxO & Digital Transformation Focus: (10) Reasons Users Can’t or Won’t Use Your Team’s ML/AI-Driven Software and Analytics Applications


October 6, 2020

Join the Free Webinar Related to this Episode

I'm taking questions and going into depth about how to address the challenges in this episode of Experiencing Data on Oct 9, 2020. 30 minutes plus Q&A time. A replay will also be available.

Register Now

Welcome back for another solo episode of Experiencing Data. Today, I am primarily focusing on addressing the non-digital natives out there who are trying to use AI/ML in innovative ways, whether through custom software applications and data products, or as a means to add new forms of predictive intelligence to existing digital experiences.

Many non-digital native companies today tend to approach software as a technical “thing” that needs to get built, and neglect to consider the humans who will actually use it — resulting in a lack of business or organizational value emerging. While my focus will be on the design and user experience aspects that tend to impede adoption and the realization of business value, I will also talk about some organizational blockers related to how intelligent software is created that can also derail a successful digital transformation effort.

These aren’t the only 10 non-technical reasons an intelligent application or decision support solution might fail, but they are 10 that you can and should be addressing—now—if the success of your technology is dependent on the humans in the loop actually adopting your software, and changing their current behavior.

Links

 

048 - Good vs. Great: (10) Things that Distinguish the Best Leaders of Intelligent Products, Analytics Applications, and Decision Support Tools


September 22, 2020

Today I’m going solo on Experiencing Data! Over the years, I have worked with a lot of leaders of data-driven software initiatives with all sorts of titles. Today, I decided to focus the podcast episode on what I think makes the top product management and digital/software leaders stand out, particularly in the space of enterprise software, analytics applications, and decision support tools.

This episode is for anyone leading a software application or product initiative that has to produce real value, and not just a technology output of some kind. When I recorded this episode, I largely had “product managers” in mind, but titles can vary significantly. Additionally, this episode focuses on my perspective as a product/UX design consultant and advisor, looking specifically at the traits associated with these leaders’ ability to produce valuable, innovative solutions customers need and want. A large part of being a successful software leader also involves managing teams and other departments that aren’t directly a part of the product strategy and design/creation process; however, I did not go deep into these aspects today. As a disclaimer, my ideas are not based on research. They’re just my opinions. Some of the topics I covered include:

  • The role of skepticism
  • The misunderstanding of what it means to be a “PM”
  • The way top software leaders collaborate with UX professionals, designers, and engineering/tech leads
  • How top leaders treat UX when building customer-focused technology
  • How top product management leaders define success and make a strategy design-actionable
  • The ways in which great PMs enable empathy in their teams and evangelize meaningful user research
  • The output vs. outcome mindset

047 - How Yelp Integrates Data Science, Engineering, UX, and Product Management when Creating AI Products with Yelp’s Justin Norman

September 8, 2020

In part one of an excellent series on AI product management, LinkedIn Research Scientist Peter Skomoroch and O’Reilly VP of Content Strategy Mike Loukides explained the importance of aligning AI products with your business plans and strategies. In other words, they have to deliver value, and they have to be delivered on time. Unfortunately, this is much easier said than done. I was curious to learn more about what goes into the complex AI product development process, and so for answers I turned to Yelp VP of Data Science Justin Norman, who collaborated with Peter and Mike in the O’Reilly series of articles.

Justin is a career data professional and data science leader with experience in multiple companies and industries, having served as director of research and data science at Cloudera Fast Forward Labs, head of applied machine learning at Fitbit, head of Cisco’s enterprise data science office, and as a big data systems engineer with Booz Allen Hamilton. He also served as a Marine Corps Officer with a focus on systems analytics.

We covered:

  • Justin’s definition of a successful AI product
  • The two key components behind AI products
  • The lessons Justin learned building his first AI platform and what insights he applied when he went to Yelp.
  • Why AI projects often fail early on, and how teams can better align themselves for success.
  • Who or what Beaker and Bunsen are and how they enable Yelp to test over 700 experiments at any one time.
  • What Justin learned at an airline about approaching problems from a ML standpoint vs. a user experience standpoint—and what the cross-functional team changed as a result.
  • How Yelp incorporates designers, UX research, and product management with its technical teams
  • Why companies should analyze the AI, ML and data science stack and form a strategy that aligns with their needs.
  • The critical role of AI product management and what consideration Justin thinks is the most important when building a ML platform
  • How Justin would approach AI development if he was starting all over at a brand new company
  • Justin’s pros and cons about doing data science in the government vs. the private sector.

Quotes from Today’s Episode

“[My non-traditional background] gave me a really broad understanding of the full stack [...] from the physical layer all the way through delivering information to a decision-maker without a lot of time, maybe in an imperfect form, but really packaged for what we're all hoping to have, which is that value-add information to be able to do something with.” - Justin

“It's very possible to create incredible data science products that are able to provide useful intelligence, but they may not be fast enough; they may not be [...] put together enough to be useful. They may not be easy enough to use by a layperson.” - Justin

 

“Just because we can do things in AI space, even if they're automated, doesn't mean that it's actually beneficial or a value-add.” - Justin

“I think the most important thing to focus on there is to understand what you need to be able to test and deploy rapidly, and then build that framework.” - Justin

“I think it's important to have a product management team that understands the maturity lifecycle of building out these capabilities and is able to interject and say, ‘Hey, it's time for us to make a different investment, either in parallel, once we've reached this milestone, or this next step in the product lifecycle.’” - Justin

“...When we talk about product management, there are different audiences. I think [Yelp’s] internal AI product management role is really important because the same concepts of thinking about design, and how people are going to use the service, and making it useful — that can apply to employees just as much as it can to the digital experience that you put out to your end customers.” - Brian

“You hear about these enterprise projects in particular, where the only thing that ever gets done is the infrastructure. And then by the time they get something ready, it’s like the business has moved on, the opportunity's gone, or some other challenge or the team gets replaced because they haven't shown anything, and the next person comes in and wants to do it a different way.” - Brian

Links

 

046 - How Steelcase’s Data Science, UX, & Product Teams Are Helping Customers Design Safer Office Workplaces Informed by Covid-19 Recommendations w/ J…


August 25, 2020

When you think of Steelcase, their office furniture probably comes to mind. However, Steelcase is much more than just a manufacturer of office equipment. They enable their customers (workplace/workspace designers) to help those designers’ clients create useful, effective workplaces and offices that are also safe and compliant.

Jorge Lozano is a data science manager at Steelcase and recently participated as a practitioner and guest on an IIA webinar I gave about product design and management being the missing links in many data science and analytics initiatives. I was curious to dig deeper with Jorge about how Steelcase is enabling its customers to adjust workspaces to account for public health guidelines around COVID-19 and employees returning to their physical offices. The data science team was trying to make it easy for its design customers to understand health guidelines around seat density, employee proximity and other relevant metrics so that any workspace designs could be “checked” against public health guidelines.

Figuring out the what, when, and how to present these health guidelines in a digital experience was a journey that Jorge was willing to share.

We covered:

  • Why the company was struggling to understand how their [office] products came together, and how the data science group tried to help answer this.
  • The digital experience Steelcase is working on to re-shape offices for safe post-pandemic use.
  • How Steelcase is evaluating whether their health and safety recommendations were in fact safe, and making a difference.
  • How Jorge’s team transitioned from delivering “static data science” outputs into providing an enabling capability to the business.
  • What Steelcase did to help dealer designers when engaging with customers, in order to help them explain the health risks associated with their current office layouts and plans.
  • What it was like for Jorge’s team to work with a product manager and UX designer, and how it improved the process of making the workspace health guidelines useful.

 

Resources and Links:

 

Quotes from Today’s Episode

“We really pride ourselves in research-based design.” - Jorge

“This [source data from design software] really enabled us to make very specific metrics to understand the current state of the North American office.” - Jorge

“Using the data that we collected, we came up with samples of workstations that are representative of what our customers are more likely to have. We retrofitted them, and then we put the retrofitted desk in the lab that basically simulates the sneeze of a person, or somebody coughing, or somebody kind of spitting a little bit while they're talking, and all of that. And we're collecting some really amazing insights that can quantify the extent to which certain retrofits work in disease transmission.” - Jorge

“I think one of the challenges is that, especially when you're dealing with a software design solution that involves probabilities, someone has to be the line-drawer.” - Brian

“The challenge right now is how to set up a system where we can swarm at things faster, where we're more efficient at understanding the needs and [are able to get] it in the hands of the right people to make those important decisions fast? It's all pointing towards data science as an enabling capability. It's a team sport.” - Jorge

 

045 - Healthcare Analytics…or Actionable Decision Support Tools? Leadership Strategies from Novant Health’s SVP of Data Products, Karl Hightower


August 11, 2020

Healthcare professionals need access to decision support tools that deliver the right information, at the right time. In a busy healthcare facility, where countless decisions are made on a daily basis, it is crucial that any analytical tools provided actually yield useful decision support to the target customer.

In this episode, I talked to Karl Hightower from Novant Health about how he and his team define “quality” when it comes to data products, and what they do to meet that definition in their daily work. Karl is the Chief Data Officer and SVP of Data Products at Novant Health, a busy hospital and medical group in the Southeast United States with over 16 hospitals and more than 600 clinics. Karl and I took a deep dive into data product management, and how Karl and his team are designing products and services that help empower all of the organization’s decision makers.

In our chat, we covered:

  • How a non-tech company like Novant Health approaches data product management
  • The challenges of designing data products with empathy in mind while being in an environment involving physicians and healthcare professionals
  • The metric Karl’s team uses to judge the quality and efficacy of their data products, and how executive management contributed to defining these success criteria
  • How Karl encourages deep empathy between analytics teams and their users by deeply investigating how the users being served by the team make decisions with data
  • How and why Novant embraces design and UX in their data product work
  • The types of outcomes Karl sees when designers and user experience professionals work with analytics and data science practitioners.
  • How Karl was able to obtain end user buy-in and support
  • The strategy Karl used to deal with a multitude of “information silos” resulting from the company’s numerous analytics groups.

 

Resources and Links:

Quotes from Today’s Episode

“I tend to think of product management as a core role along with a technical lead and product designer in the software industry. Outside the software industry, I feel like product management is often this missing hub.” - Brian

 

“I really want to understand why the person is asking for what they're asking for, so there is much more of a closer relationship between that portfolio team and their end-user community that they're working with. It's almost a day-to-day living and breathing with and understanding not just what they're asking for and why are they asking for it, but you need to understand how they use information to make decisions.” - Karl

 

“I think empathy can sound kind of hand-wavy at times. Soft and fluffy, like whipped cream. However, more and more at senior levels, I am hearing how much leaders feel these skills are important because the technology can be technically right and effectively wrong.” - Brian

 

“The decision that we got to on executive governance was how are we going to judge success criteria? How do we know that we're delivering the right products and that we're getting better on the maturity scale? And the metric is actually really simple. Ask the people that we're delivering for, does this give you what you need when you need it to make those decisions?” - Karl

 

“The number one principle is, if I don't know how something is done [created with data], I'm very unlikely to trust it. And as you look at just the nature of healthcare, transparency absolutely has to be there because we want the clinicians to poke holes in it, and we want everyone to be able to trust it. So, we are very open. We are very transparent with everything that goes in it.” - Karl

 

“You need to really understand the why. You’ve got to understand what business decisions are being made, what's driving the strategy of the people who are asking for all that information.” - Karl

 

 

 
