In this edition of NovusNorth’s thought leadership conversation, Dave Cowing had the opportunity to speak with Erin Helcl, Founder of New Helio Strategies.
Erin is the founder of New Helio Strategies and brings over two decades of experience leading experience design and transformation within complex, highly regulated industries such as telecom, retail, and financial services. Her work focuses on helping leaders and teams shape what’s next—designing services, building trust, and navigating disruption. A central theme in her approach is ensuring people can see themselves in the future; without that vision, they risk disengaging. Erin’s recent research and writing, including her article “The Future of Product and Service Design is Now,” explores how automation and evolving roles impact individuals—and how thoughtful design can help them grow into what’s next.
NovusNorth is a leading innovator in digital experience and platforms for the financial services industry and provides product management, user experience design, and development services.
Key Takeaways:
This post shares the highlights from the discussion between Dave and Erin.
Read the Transcript
Dave Cowing
In your post, “The Future of Product and Service Design is Now”, you mentioned that AI accelerates design. Can you elaborate on how AI is currently being integrated into product and service design and the impact that it’s having on the process?
Erin Helcl
What I’m starting to see is that AI is really becoming a design partner, and that product and design teams are using it to accelerate manual tasks. This includes things like synthesizing research, something that previously took a long time, sketching early prototypes, or exploring unexpected ideas in ideation.
“AI is really becoming a design partner”
– Erin Helcl
What’s interesting is how it’s shifting the designer’s role. With AI handling speed and scale, we can focus on asking the right questions, exploring what could be. That’s where the real creativity and strategy lie.
I recently took a course through IDEO University on AI and design thinking. They shared some mindsets I try to embody in my work. One of them is being collaborative: product and service designers combining AI’s scale with human empathy and creativity, because that’s what we really bring to the table. Another is making it human-centered, using AI to augment, not replace. Then there is being responsible: how are we embedding ethics into every step, not just tacking it on at the end? This applies both to our process and to our end products and services, because we’re thinking about it from both lenses now.
Then there’s the idea of using AI as a sandbox for learning and pushing boundaries. That explorer mindset. Sometimes as service designers, running ideation sessions as humans, we can get stuck in our sandbox of what we’ve always known and it can be hard to think outside that box. We can use AI as a partner to help us push beyond boundaries and generate wilder ideas to work from.
It’s not about automating design. It’s about creating space for deeper insight, faster exploration and better outcomes. But, talking through those mindsets, there are definitely implications we’ll need to think about as we grow this.
Dave Cowing
I’m wondering, if you look at how that then impacts or is implemented in large organizations, how does that manifest itself? Beyond the individual, what changes are you seeing in larger organizations?
Erin Helcl
There are a couple of things I’ve been seeing, and a few futures and trends I’ve been thinking through. As we move faster and replace more manual tasks, I’m seeing a convergence of roles. I’m starting to see shifts in teams, in terms of where design sits within the organization. Some organizations are saying, “If you want to replace an FTE, you need to show me how AI can’t do the role.” So how do we continue to show value?
Some of the futures I’ve been thinking about include the converged role: product, service, engineering, and management blending into one hybrid role powered by AI. Maybe it’s not one role but smaller teams. I don’t think these scenarios are mutually exclusive; I think some version of all of them will come true.
Another one that I talk about in the article is the agent economy. We’re designing for AI agents in customer experience and services. They’re also acting as collaborators in creating the journey and as end users within it.
Then there’s the human-centered renaissance. I think there is a bit of AI fatigue. Will people continue to adopt, or will they push back and seek out emotional, ethical, human-made experiences?
In these three futures, the service designer plays a key role, not just mapping journeys or service blueprints but acting as a systems orchestrator. Large organizations need people who can bring forward systems thinking: how tech and policy interact, storytelling, and sensemaking. Technical fluency will be critical: not coding, but being able to translate between disciplines and understand this more complex ecosystem.
Organizations need to be thinking about how the role is shifting. But at the heart of it, we still need to design with empathy, ethics, imagination. I think that’s more relevant than ever.
Dave Cowing
You keep coming back to the word trust. Can you highlight how trust can differentiate successful designs or service blueprints? What are some of the strategies designers and companies need to use to create trust?
Erin Helcl
We are designing in an age of deep mistrust. Between misinformation and automation, people are unsure what to believe or where they fit.
I recently went to an AI and creativity symposium at Toronto Metropolitan University. Bruce MacCormack, a senior advisor at the CBC (Canadian Broadcasting Corporation), spoke about work with the Coalition for Content Provenance and Authenticity within journalism and the media. He said, “Seeing is no longer believing.” That’s a challenging space to be in.
He’s leading efforts with the CBC in Canada, the BBC, and The New York Times to develop a credentialing system to preserve and signal verified news content. They are creating trust and authenticity frameworks so you know that the news you’re reading is true and real, and hasn’t changed since its source. LinkedIn has also launched this on its platform.
We’re seeing synthetic media rise. It’s important not just in journalism, but in any system where people need to know what’s real and what’s trustworthy. Trust can’t just be an outcome, it has to be an intentional part of how we design.
So, what can organizations do?
- Transparency – show how systems work and how decisions were made. If I’m going to trust you with my financial services, I need to understand what’s being done and how it’s being done. Consented control lets people opt in, opt out, or step back in the journey as needed, and understand what control they still have.
- Context – making sure innovation doesn’t outpace understanding. Just because we can automate something doesn’t mean we should. We should use that human-centered design lens to understand what makes sense, not just doing it for automation or AI’s sake but really thinking about what makes sense for the end user.
“Just because we can automate something, doesn’t mean we should.”
– Erin Helcl
I also saw this play out at that symposium at Toronto Metropolitan University. They were sharing their plans for rolling out Gemini to their student body this coming fall. It wasn’t just about launching a tool. They were preparing to do this meaningfully, to build trust so that students can use it in an ethical and authentic way that still allows for learning and showing their true value. That’s what we need to think about as we roll AI out: how is it empowering us, not replacing us? How does it create trust in the system and the organization we’re working with? Trust shouldn’t be a checkbox or disclaimer; it needs to live in the experience itself.
“Trust shouldn’t be a checkbox or disclaimer, it needs to live in the experience.”
– Erin Helcl
Dave Cowing
What do you see as the most significant trends, kind of shaping the future of product and service design?
Erin Helcl
When I wrote the article “The Future of Product and Service Design is Now,” I created a job description for a service designer of 2030. It was a fun activity, but also really hard, because is this now or 2030? The experiment helped me see myself in the future.
I thought about trends I mentioned: the converged role, the agent economy, and the human-centered renaissance. From a skills perspective, we talked about systems thinking, sensemaking, technical fluency.
From a leadership perspective, we’ll have to help guide this in organizations. There are five skills I would call out:
- Systems leadership: help others zoom out and understand business, tech, and the experience.
- Sensemaking: bring clarity to complexity and help teams see themselves in the future, both for their customer experience, their product design and for themselves and how they play in the organization.
- Psychological safety: create space to challenge the status quo, experiment, and grow without fear of failure. Innovation is about doing new things, and you have to learn your way through it, because you’re not going to have all the answers.
- Comfort with ambiguity: technology is changing at an extremely fast pace, so working within that ambiguity as a leader is going to be extremely important.
- Balancing ethics with speed: with the opportunity to be AI-powered across research, synthesis, prototyping, and ideation, there’s pressure to move faster and do it for less. Where’s the value that we continue to bring? Moving faster isn’t always better.
I’ve worked with teams where innovation stalled, not because of a lack of ideas but because people didn’t feel safe to speak up. We have to create an environment where challenge and change are welcome, and we need to have the courage to push back, to slow down, to hurry up, and to make sure we’re doing this ethically and doing this right.
As leaders, we need to model how to work with AI. We’re talking about co-creation with AI or using it as an end user. Let’s not do this as a gimmick, but let’s do it as a creative partner and really experiment. Sometimes it’s going to work out and sometimes it’s not. But if the risk isn’t high in terms of what you’re trying, work with your teams to do that. So I’ll reiterate that I think our job is to help people see themselves in the future and then build those conditions to help them get there.
Dave Cowing
Are strategic designers who deeply understand business problems better positioned against the impacts of design becoming democratized, compared to those who primarily execute or “assemble” designs?
Erin Helcl
I think there’s definitely a possibility of that. When I talk about the skills of the future, it’s about being able to elevate into that systems thinking and that more strategic piece, to be that orchestrator. AI can help us with low-fidelity wireframes and prototypes, coding, all these different things. So, where is the human element? I think we should talk a little bit about how we keep people at the center of these AI-powered systems and what human-centered design continues to look like.
Dave Cowing
Is there also a much more significant education and reskilling effort that organizations need to go through that they’re not prepared for?
Erin Helcl
One of the large financial institutions that I worked for a few years ago was thinking about that. I did a lot of deep research around people whose jobs were being impacted by automation or new ways of working. The number one thing that kept coming back was: I need to be able to see myself in this future, and then I need to have the space to learn and upskill. It can’t be off the side of my desk; it must be deliberate, with time carved out for me to do it. The benefit to the organization is that these are people who understand your business. They’re already here. They’re already on board. The cost to onboard new employees is very high. This is an opportunity to upskill the team, and we actually used those insights to create formal reskilling programs.
But I’m not seeing that across the board. I went to a talk at the Rotman School of Management in Toronto on AI use cases in financial services. Adoption in large companies today is only around 10%. There was fear about being replaced or being tracked on what you’re doing within AI. One of the consultants from one of the large consulting firms said, “If you don’t bring your people along with you, and you don’t bring your talent along with you, that is going to be your biggest mistake.” I think this is a huge opportunity from a systems thinking and service design perspective: let’s apply it to our employee base and really think about how to bring people along into the future.
Dave Cowing
What can designers do to ensure that the user is not lost in this process?
Erin Helcl
You can now generate a full research report, service maps, and personas without even talking to a person. You use the deep research function on ChatGPT, use a search function, do some good prompt analysis, and suddenly you’re creating an experience around people, not for them. This scares me, because you need to be able to hear someone’s story. Without being able to see context, this is a dangerous place to be.
I think about my career as a service designer and how some of my most meaningful insights have come about. I’ve worked in retail, designing accessible and inclusive retail experiences. I did a shop-along with a customer. We went store to store, and I was able to observe their experience and talk to them throughout it. That is what shaped the work. There are people behind the data, and we need to remember that. I don’t think AI can replicate that. Synthesis isn’t just about clustering data; it’s where we build empathy.
“There are people behind the data and we need to remember that.”
– Erin Helcl
I even feel this is a challenge today: I do the synthesis and the research, then present it to stakeholders, and something gets lost in that. So how do we bring quotes, how do we bring videos, and how do we get our partners immersed in the research as well? This is the challenge with the speed we want to go at and can go at. Just because we can do it doesn’t mean we should. So much of the meaning lives in that process when you think about service design and human-centered design.
To answer your question, we need to keep talking to people, even when it’s inconvenient. Get out of the building, get away from your computer, and go do context immersion and ethnographic research; talk to people, work with people. Ask yourself: are your users informed? Are they in control? Do they feel seen? And protect the moments that build emotional connection, not just efficiency. AI is a powerful partner, but trust, emotion, and insight still come from people.
Even without AI, we bring bias into the design process. We can unintentionally center our own perspectives or misinterpret data. AI doesn’t fix that, but it can help us see patterns or assumptions we might miss. If we design these systems thoughtfully, AI can actually be a tool for greater objectivity and inclusivity. But I think the key is to stay aware of the limitations and use it to enhance, not replace, our human responsibility as designers. What I want to work on going forward is finding that sweet spot of using AI to keep me objective without replacing the work that I do with humans.
Dave Cowing
What are some of the key areas to be thinking about in the space of trust and ethics?
Erin Helcl
To summarize, I’ll call back to that course I just took on AI and design thinking. One of the things provided was a set of ethics cards with design principles and ways of working to make sure that this is part of our process.
So number one, make sure that it’s not something you do at the end. Make sure that you have checkpoints, that you’re designing for it up front, and that you’re checking your outputs along the way.
Three other key things stood out, from a design principles perspective:
- Don’t presume desirability. Just because we can use it doesn’t mean we should.
- Design the scene. We’ve talked about people being able to step in and step out. But they also need to know where AI is acting on their behalf, and where they can intervene.
- Keep it human centered. Remember there are people behind the data. We do need to humanize the abstract and hold space for people impacted.
I have personally talked to employees who are being replaced. I’ve seen customers excluded by unintuitive systems. When working on inclusive retail design, it was the in-person conversations, not the data, that changed how we built. So make sure that ethics isn’t a checkbox at the end, that it is a mindset at every step. What data are we missing? Who’s missing? Are we reinforcing bias, or are we challenging it? Like I said before, even when humans do the work, bias creeps in. AI can’t be held to a lower standard; in some ways I think it’ll force us to raise our own. We need to create experiences that are fair, explainable, and human. We need to bring that courage to the table and be willing to slow things down when everything else is telling us to speed up.
About The Experts

Erin Helcl
Experience Strategist and Founder
Erin Helcl is an experience strategist, transformation leader, and founder of New Helio, a consultancy focused on designing human-centered services in an AI-powered world. With over two decades of experience in retail, telecom, and financial services, she’s led large-scale transformations across North America inside some of its most complex organizations. Erin teaches Developing Insights for Design at the University of Toronto and offers public and customized workshops for organizations on building the psychological safety required to drive change and innovation. Her work blends service design, systems thinking, and inclusive leadership to help teams build trust and design for futures people can actually see themselves in. She explores this thinking further in her article, The Future of Product and Service Design Is Now. Learn more at newhelio.ca.

Dave Cowing
Chief Executive Officer, NovusNorth
NovusNorth is an outcome-oriented experience consultancy that drives business results by creating compelling experiences for customers and employees in the fintech and financial services industry. Dave has 30 years of experience helping companies ranging from Fortune 500 market leaders to disruptive startups envision and create new digital product experiences that drive meaningful outcomes.
In this edition of NovusNorth’s thought leadership conversation, Dave Cowing had the opportunity to speak with Erin Helcl, Founder of New Helio Strategies.
Erin is the founder of New Helio Strategies and brings over two decades of experience leading experience design and transformation within complex, highly regulated industries such as telecom, retail, and financial services. Her work focuses on helping leaders and teams shape what’s next—designing services, building trust, and navigating disruption. A central theme in her approach is ensuring people can see themselves in the future; without that vision, they risk disengaging. Erin’s recent research and writing, including her article “The Future of Product and Service Design is Now,” explores how automation and evolving roles impact individuals—and how thoughtful design can help them grow into what’s next.
NovusNorth is a leading innovator in digital experience and platforms for the financial services industry and provides product management, user experience design, and development services.
Key Takeaways:
This post shares the highlights from the discussion between Dave and Erin.
Read the Transcript
Dave Cowing
In your post, “The Future of Product and Service Design is Now”, you mentioned that AI accelerates design. Can you elaborate on how AI is currently being integrated into product and service design and the impact that it’s having on the process?
Erin Helcl
What I’m starting to see happen is that AI is really becoming a design partner and that product and design teams are using it to accelerate manual tasks. This includes things like synthesizing research, something that would take a long time previously, or sketching early prototypes or exploring unexpected ideas in ideation.
“AI is really becoming a design partner”
– Erin Helcl
“AI is really becoming a design partner”
– Erin Helcl
What’s interesting is how it’s shifting the designer’s role. With AI handling speed and scale, we can focus on asking the right questions, exploring what could be. That’s where the real creativity and strategy lie.
I recently took a course through IDEO University on AI and design thinking. They shared some mindsets I try to embody in my work. One of them is being collaborative. Another is product and service designers combining AI’s scale with human empathy and creativity, because that’s what we are really bringing to the table. Also making it human-centered, using AI to augment, not replace. Then there is being responsible. How are we embedding ethics into every step, not just tacking it on at the end. This is both in our process as well as in our end products and services, because we’re thinking about it from both lenses now.
Then there’s the idea of using AI as a sandbox for learning and pushing boundaries. That explorer mindset. Sometimes as service designers, running ideation sessions as humans, we can get stuck in our sandbox of what we’ve always known and it can be hard to think outside that box. We can use AI as a partner to help us push beyond boundaries and generate wilder ideas to work from.
It’s not about automating design. It’s about creating space for deeper insight, faster exploration and better outcomes. But, talking through those mindsets, there are definitely implications we’ll need to think about as we grow this.
Dave Cowing
I’m wondering, if you look at how that then impacts or is implemented in large organizations, how does that manifest itself? Beyond the individual, what changes are you seeing in larger organizations?
Erin Helcl
There are a couple of things I’ve been seeing, and a few futures and trends I’ve been thinking through. As we move faster and replace more manual tasks, I’m seeing convergence of roles. I am starting to see shifts in teams, in terms of where design sits within the organization. Some organizations are saying, “If you want to replace FTE, you need to show me how AI can’t do the role.” So how do we continue to show value?
Some of the futures I’ve been thinking about include the converged role—product, service, engineering, and management blending into one hybrid role powered by AI. Maybe it’s not one role but smaller teams. I don’t think that these scenarios are mutually exclusive, I think there will be some version of all of this kind of coming true.
Another one that I talk about in the article is the agent economy. We’re designing for AI agents in customer experience and services. They’re also acting as collaborators in creating the journeying and end users in the journey.
Then there’s the human-centered renaissance. I think there is a bit of AI fatigue. Will people continue to adopt or push back and seek out emotional, ethical, human-made experiences
In these three futures, the service designer plays a key role, not just mapping journeys or service blueprints but as a systems orchestrator. Large organizations need people who can bring forward systems thinking: how tech and policy interact, storytelling, and sensemaking. Technical fluency will be critical; not coding but being able to translate between disciplines and understand this more complex ecosystem.
Organizations need to be thinking about how the role is shifting. But at the heart of it, we still need to design with empathy, ethics, imagination. I think that’s more relevant than ever.
Dave Cowing
You keep coming back to the word trust. Can you highlight how trust can differentiate successful designs or service blueprints? What are some of the strategies designers and companies need to use to create trust?
Erin Helcl
We are designing in an age of deep mistrust. Between misinformation and automation, people are unsure what to believe or where they fit.
I recently went to an AI and creativity symposium at Toronto Metropolitan University. Bruce MacCormack, a senior advisor that at the CBC (Canadian Broadcast Corporation), spoke about work with the Coalition for Content Provenance and Authenticity within journalism and the media. He said, “Seeing is no longer believing.” That’s a challenging space to be in.
He’s leading efforts with CBC in Canada, the BBC, and The New York Times to develop a credentialing system to preserve and signal verified news content. The are creating trust and authenticity frameworks so you know that the news you’re reading is true and real, and hasn’t changed since its source. LinkedIn has also launched this in their platform.
We’re seeing synthetic media rise. It’s important not just in journalism, but in any system where people need to know what’s real and what’s trustworthy. Trust can’t just be an outcome, it has to be an intentional part of how we design.
So, what can organizations do?
- Transparency – show how systems work, and decisions were made. If I’m going to trust you in my financial services, I need to understand what’s being done and how it’s being done. Consented control will let people opt in, opt out, or step back in this journey as needed and be able to understand what control they still have.
- Context – making sure innovation doesn’t outpace understanding. Just because we can automate something, doesn’t mean we should. We should use that human centered design lens to understand what makes sense, and not just doing it for automation or AI’s sake but really thinking about what makes sense for the end user.
“Just because we can automate something, doesn’t mean we should.”
– Erin Helcl
“Just because we can automate something, doesn’t mean we should.”
– Erin Helcl
I also saw this play out at that symposium at Toronto Metropolitan University. They were sharing their plans on how they were going to roll out Gemini to their student body this coming call. It wasn’t just about launching a tool. They were preparing to do this meaningfully, to build trust so that students can use it in an ethical and authentic way that still allows for learning and showing their true value. That’s what we need to think about as we roll AI out: how is it empowering us, not replacing us. How it creates trust in the system and the organization we’re working with. Trust shouldn’t be a checkbox or disclaimer, it needs to live in the experience itself.
“Trust shouldn’t be a checkbox or disclaimer, it needs to live in the experience.”
– Erin Helcl
“Trust shouldn’t be a checkbox or disclaimer, it needs to live in the experience.”
– Erin Helcl
Dave Cowing
What do you see as the most significant trends, kind of shaping the future of product and service design?
Erin Helcl
When I wrote the article The Future of Product and Service Design is Now, I created a job description for a service designer of 2030. It was a fun activity, but also really hard, because is this now or 2030? I think it was a fun experiment and helped me see myself in the future.
I thought about trends I mentioned: the converged role, the agent economy, and the human-centered renaissance. From a skills perspective, we talked about systems thinking, sensemaking, technical fluency.
From a leadership perspective, we’re have to help guide this in organizations. There’s five skills that I would call out:
- Systems leadership: help others zoom out and understand business, tech, and the experience.
- Sensemaking: bring clarity to complexity and help teams see themselves in the future, both for their customer experience, their product design and for themselves and how they play in the organization.
- Psychological safety: create space to challenge the status quo, try things without fear of failure and creating space to be able to challenge, experiment and grow. Innovation is about doing new things and you have to learn your way through it, because you’re not going to have all the answers.
- Comfort with ambiguity: technology changing at an extremely fast pace, so working within that ambiguity as a leader is going to be extremely important.
- Balancing ethics with speed: With the opportunity to be AI powered across research, synthesis, prototyping and ideation, there’s pressure to move faster and do it for less. Where’s the value that we continue to bring. Moving faster isn’t always better.
I’ve worked with teams where innovation is stalled and not because of a lack of ideas or because people didn’t feel safe to speak up. We have to create this environment where challenge and change are welcome, and we need to have the courage to push back, to slow down, to hurry up, and to make sure we’re doing this ethically and doing this right.
As leaders, we need to model how to work with AI. We’re talking about co-creation with AI or using it as an end user. Let’s not do this as a gimmick, but let’s do it as a creative partner and really experiment. Sometimes it’s going to work out and sometimes it’s not. But if the risk isn’t high in terms of what you’re trying, work with your teams to do that. So I’ll reiterate that I think our job is to help people see themselves in the future and then build those conditions to help them get there.
Dave Cowing
Are strategic designers who deeply understand business problems better positioned against the impacts of design becoming democratized, compared to those who primarily execute or “assemble” designs?
Erin Helcl
I think there’s definitely a possibility of that. When I talk about the skills of the future, to be able to elevate into that systems thinking and that more strategic piece, to be that orchestrator. AI can help us with low fidelity wireframes and prototypes, coding all these different things. So, where is the human element? I think we should talk a little bit about how we keep people at the center of these AI powered systems and what human centered design continue look like.
Dave Cowing
Is there also a much more significant kind of education and re skilling effort that organizations need to go through that they’re not prepared for?
Erin Helcl
One of the large financial institutions that I worked for a few years ago was thinking about that. I did a lot of deep research around people whose jobs were being impacted by automation or new ways of working. The number one thing that kept coming back is I need to be able to see myself in this future, and then I need to be able to have the space to learn and upskill. It can’t be at side of desk; it must be deliberate with time carved out for me to do that. The benefit to the organization is these are people who understand your business. They’re already here. They’re already on board. The cost to onboard new employees is very high. This is an opportunity to do upskill the team, and we actually used those insights to create formal, re-skilling programs.
But I’m not seeing that across the board. I went to a talk at the Rotman School of Management in Toronto, and it was AI Use Cases and Financial Services. It was 10% of adoption today in large companies. There was a fear around being replaced or being tracked on what you’re doing within AI. One of the consultants from one of the large consulting firms said, “If you don’t bring your people along with you, and you don’t bring your talent along with you, that is going to be your biggest mistake.” I think this is a huge opportunity from a systems thinking, from a services design perspective, let’s apply it to our employee base and really think how to bring people along into the future.
Dave Cowing
What can designers do to ensure that the user is not lost in this process?
Erin Helcl
You can now generate a full research report, service maps, and personas without ever talking to a person. You use the deep research function in ChatGPT, use a search function, do some good prompt analysis, and suddenly you’re creating an experience around people, not for them. This scares me, because you need to be able to hear someone’s story. Without being able to see context, this is a dangerous place to be.
I think about my career as a service designer and how some of my most meaningful insights have come through. I’ve worked in retail, designing accessible and inclusive retail experiences. I did a shop-along with a customer: we went store to store, and I was able to observe their experience and talk to them throughout it. That is what shaped the work. There are people behind the data, and we need to remember that. I don’t think AI can replicate that. Synthesis isn’t just about clustering data; it’s where we build empathy.
“There are people behind the data and we need to remember that.”
– Erin Helcl
I even feel this is a challenge today: when I’m doing the synthesis and the research and then presenting it to stakeholders, something gets lost. So how do we bring quotes, how do we bring videos, and how do we get our partners immersed in research as well? This is the challenge with the speed we want to go at, and can go at. Just because we can do it doesn’t mean we should do it. So much of the meaning lives in that process when you think about service design and human-centered design.
To answer your question, we need to keep talking to people, even when it’s inconvenient. Get out of the building, get away from your computer, and go: context immersion, ethnographic research, talk to people, work with people. Ask yourself: are your users informed? Are they in control? Do they feel seen? And protect the moments that build that emotional connection, not just efficiency. AI is a powerful partner, but trust, emotion, and insight still come from people.
Even without AI, we bring bias into the design process. We can unintentionally center our own perspectives or misinterpret data. AI doesn’t fix that, but it can help us see patterns or assumptions we might miss. If we design these systems thoughtfully, AI can actually be a tool for greater objectivity and inclusivity. But the key is to stay aware of its limitations and use it to enhance, not replace, our human responsibility as designers. What I want to work on going forward is finding that sweet spot: using AI to keep me objective without replacing the work that I do with humans.
Dave Cowing
What are some of the key areas to be thinking about in the space of trust and ethics?
Erin Helcl
To summarize, I’ll call back to that course I just did on AI and design thinking. One of the things provided was a set of ethics cards offering design principles and ways of working to make sure that this is part of our process.
So, number one: make sure ethics isn’t something you do at the end. Make sure you have checkpoints, you’re designing for it up front, and you’re checking your outputs along the way.
Three other key things stood out from a design-principles perspective:
- Don’t presume desirability. Just because we can use it doesn’t mean we should.
- Design the scene. We’ve talked about people being able to step in and step out. But they also need to know where AI is acting on their behalf and where they can intervene.
- Keep it human-centered. Remember there are people behind the data. We need to humanize the abstract and hold space for the people impacted.
I have personally talked to employees who are being replaced. I’ve seen customers excluded by unintuitive systems. When working on inclusive retail design, it was the in-person conversations, not the data, that changed how we built. So make sure that ethics isn’t a checkbox at the end; it’s a mindset at every step. What data are we missing? Who’s missing? Are we reinforcing bias, or are we challenging it? Like I said before, even when humans do the work, bias creeps in. AI can’t be held to a lower standard. In some ways I think it’ll force us to raise our own standard. We need to create experiences that are fair, explainable, and human. We need to bring that courage to the table and be willing to slow things down when everything else is telling us to speed up.
About The Experts

Erin Helcl
Experience Strategist and Founder
Erin Helcl is an experience strategist, transformation leader, and founder of New Helio, a consultancy focused on designing human-centered services in an AI-powered world. With over two decades of experience in retail, telecom, and financial services, she’s led large-scale transformations across North America inside some of its most complex organizations. Erin teaches Developing Insights for Design at the University of Toronto and offers public and customized workshops for organizations on building the psychological safety required to drive change and innovation. Her work blends service design, systems thinking, and inclusive leadership to help teams build trust and design for futures people can actually see themselves in. She explores this thinking further in her article, The Future of Product and Service Design Is Now. Learn more at newhelio.ca.

Dave Cowing
Chief Executive Officer, NovusNorth
NovusNorth is an outcome-oriented experience consultancy that drives business results by creating compelling experiences for customers and employees in the fintech and financial services industry. Dave has 30 years of experience helping companies ranging from Fortune 500 market leaders to disruptive startups envision and create new digital product experiences that drive meaningful outcomes.