Meet Darrel Rhea: Our newest expert collaborator
I'm thrilled to welcome Darrel Rhea as our latest expert collaborator at Snowmelt. I have had the pleasure of knowing Darrel for over 15 years, and he's been an incredible mentor and supporter as the worlds of strategy and design have evolved over the last couple of decades. I sat down with Darrel to ask a few questions as he joins us.
So I’d love to start off with a question about who you are, Darrel. We know your career has been incredibly broad, spanning design, psychology, research, innovation and strategy. You've worked at the highest levels in these fields and been a thought leader in these disciplines. You've worked with executive leaders in hundreds of the world's organisations, across industries and the public sector, and this has provided you with a privileged and global perspective that very few people have.
Please tell me about your journey and what brought you to the work you do today.
Thanks, and I'm excited to be in the conversation. Probably a big-picture introduction would be to say where I come from, where I started, because I think that's really important for my own self-definition. I came out of California, the 50s, 60s and 70s when I grew up, which was a time when the population doubled. No one was from California. There was the counterculture movement. There was music in Hollywood. There was hippie culture and surf culture, and culture was this thing that was exploding that I was participating in. And it felt like we were making it up, right? We were participating in it. So I learned early that culture and identity weren't things you just inherited; you actually made them, because we had the freedom to do that. Which was, I think, an anomaly at the time, historically.
I also grew up in Silicon Valley. So I started my career in Silicon Valley and watched technology explode during that time. I had a front-row seat, and I was a participant, and those kinds of things changed the world. That got me pretty excited about design and innovation, because it was an environment that was all about change. My friends from the east coast of the United States would go to Ivy League schools and try to figure out how they would fit in; on the west coast we kind of made it up, and thought it was our job to invent it and not accept anything that came our way. So those are kind of the early beginnings of it.
And then I'd say it's also important that I was mentored by Dr. Louis Cheskin, who was a seminal figure in the fields of design, the psychology of perception, and design research. He basically invented the field of design research. He came out of the 40s, 50s and 60s, a generation or two before me, and boy, I learned a lot there. He was really responsible for taking design from something considered a commercial art at that time to a science. He brought empirical science to the field of design and applied social science research techniques, quantitative and qualitative market research, to understanding human perception, which changed the field of design significantly. After that I stood on his shoulders with the firm named Cheskin, which for the next thirty-some years I helped lead and scale into a global consulting organisation working in design research, market research, design management, innovation and eventually strategy. So that's kind of the arc of my career. I've been a consultant my whole life. And it's been a really fun opportunity to travel the millions of miles that I've got to travel, work with some of the largest organisations in the world, quite often at senior executive levels, and really learn and appreciate the value of strategy as a competence.
How amazing, Darrel, thanks so much for sharing that. I think there's a really interesting thread in there around bringing art and science together, and I think your history in the 60s in California comes through in that sense that you can kind of change the world. Something a lot of people might not know about you: you're also an amazing illustrator and musician, and the way you bring that kind of meaning into the work you're doing is quite incredible.
So the next question is really about thinking about the context that we're in at the moment. We're in a massive amount of change and at a pace that most of us have never seen before.
What kinds of challenges are you seeing that leaders are consumed by today?
Well, the elephant in every room, I would say, is definitely the massive adoption of AI that is either happening or coming very, very quickly. What I'd say about that: I was involved with mainframe computers, supercomputers and minicomputers, the introduction of laptops, the introduction of search and the internet, wireless and e-commerce. All these waves largely emerged out of Silicon Valley; with my clients I participated in all of them, and they changed the world, as we all know, in pretty dramatic ways.
But what I'd say about AI is that it's an order of magnitude beyond all of that. And so it really presents a challenge to leaders of all types, because you've got to get your head wrapped around what this is and what it could mean. I think, personally, that it's the central design challenge of our lifetime, because it's not just technical, it's systemic. And I know when I talk like that, you could say, "Oh, Darrel, you're being an AI alarmist or an AI fanboy," which I'll cop to being both of those things. But it's not just that; it's really the central axis around which economic, social and governance systems are all being, or will soon be, redesigned. And so I think that's the big challenge there.
As an example, we recently had the Davos conference, and I've spoken at the World Economic Forum. It's about economic growth and stability across the globe, and it's a pretty conservative organisation; maybe they sprinkle in a little bit of equality and sustainability and a few other things so that the CEOs can be loved. But ultimately it hasn't been about technology, and this year Davos was almost dominated by AI. I think that's a major indicator that AI is now a central force reshaping power in the world. It's reshaping labor markets, governance, security and social systems worldwide.
So we've got this change going on in the period of time we're living in, and it's a level of historical change that, if you're a systems designer (which is Snowmelt’s competency), you just can't ignore. This is the biggest challenge of our lifetime, and it really demands, I think, systems design and the whole skill set that comes with design. How that shakes out is that you've got executive leaders who have brilliant people and great skills at managing recurrent business in a fairly stable or slow-changing world, suddenly realising they're moving into an era of perpetual instability. The people and systems they've taken for granted, and frankly optimised over their whole careers, are going to be less and less relevant, and they don't have the people or the tool set to really address that. So I think a lot of leaders are not just challenged; underneath the surface they're really kind of panicked. And I think they should be, and to a certain extent we all should be quite concerned.
Thanks, Darrel. That's slightly alarmist, but it’s true: we are on the precipice of enormous change coming towards organisations. You talk about this perpetual instability and the totally different ways of thinking and acting it demands.
Can you give us some examples of some of the risks that leaders are facing in this context?
Well, I'll give you two that I'm concerned about. One will be at a personal or individual level and one more organisational.
The personal level would be that as we rush to deploy AI and start using it in our own personal and professional practices, we're starting to use AI to offload our own thinking. It's a very effective tool to speed up what we're doing and increase our efficiency and productivity. And I'm watching a whole lot of people move through it really fast: get answers to their questions, cut and paste, slam it in the document, and move on. Right? Because this is great; we actually look smarter doing this. But the risk here is that we offload our cognitive abilities and they eventually atrophy. We will eventually be highly productive stupid people. What I mean by that is that it's very much like over-reliance on GPS in the car. You probably know people who can't read a map, who don't have any spatial awareness, and can only drive by following the directions a computer gives them. If it says turn left into the lake, they're likely to turn left into the lake, right? That's an example of a skill and an ability that atrophies. And that's really how I see a lot of people using AI, and that worries me. They're using it like a vending machine: they just want to get answers. I think they should be using it like the cockpit of a jet fighter, something that gives you control, where you take control and assert control. So that's not a theoretical risk. It's a real risk, and I'm working on systems to counter it. I'm really worried about it on a large scale, especially for young people who are going to grow up with an intelligence far greater than human intelligence, where it will be just really easy to rely on it. So the question is: how do we build human judgment and discernment, and how do we use AI to amplify what makes us human beings?
The second area of risk I'm interested in is organisational. Because all this work is moving at such a frightening pace, and we're producing so much, decisions tend to be made prematurely. We're moving at a speed where we're not necessarily slowing down and being fully accountable for what we're doing and why we're doing it.
So we're able to use and leverage AI to be highly productive and efficient, but we're not asking: for the sake of what are we doing it? Are we doing things and making decisions that we should be accountable for, that reflect our sense of purpose and the commitments we make in the world? I'm really worried that it will also degrade some of the intelligence of the organisation as we start to depend more on our agents and less on ourselves.
Amazing, Darrel. I think those are some really significant challenges, and I know in your work you've been really focused on ensuring there are learning loops, reflection and critique coming from your engagement with AI. I think some of the ideas you've had about building those loops into your design and learning processes are really significant.
So if we think about systems design, systemic design, moving from a creative craft into a core C-suite capability: how does that actually happen, and how might it help organisations navigate instability?
Well, I think a lot of it centers on strategy. In this time of incredible and accelerating change, organisations and leadership - whether at the top of an organisation, or a team, an initiative, a group or a governing department - all have to learn to be agile, to quickly reinvent ourselves and to create strategy. And one of the primary tools of systems design is strategy. Strategy is an argument for change. It's not a PowerPoint. It's not a deck. It is a conversation that lives in the commitments of people, a narrative that creates that context.
And I think therefore systems design is becoming more and more essential, especially because the kinds of problems businesses are going to face aren't just a matter of optimising specific areas, corners or processes. As we enable and implement AI, it's going to require a total rethinking of all of our systems and how we operate. That's something most business people aren't really used to dealing with. They're used to managing, not designing, and certainly not designing new, large, complex systems.
So, how do you deal with the complexity? How do you deal with change? I think systems design is going to be an even more essential component. It has gone from a somewhat esoteric dark art a decade or two ago to mainstream now, and I think it's going to be really critical for navigating the instability and leveraging strategy to do that.
Those are great reflections, Darrel. Something to build on that: one thing we often think about is coherence across an organisation. There's a lot of change happening, and if you're optimising one part of the system, the rest needs to be coherent with that in terms of your strategy and your intention. I know this is something you talk about a lot in terms of organisations, so maybe you can tell us a little more about that.
You often speak about moving from knowledge work to intentional work. What does that mean for the future of organisational design?
Well, it has to do with the role of the human being. The value of expertise has essentially been devalued to zero, because with artificial intelligence we have an intelligence that will, if it hasn't already - and we could argue about that - supersede human intelligence. Our identities, our skill sets and our roles are largely based around our expertise and what we bring. So a knowledge worker who prides himself on knowledge is, I think, toast. That's going to create a crisis, unfortunately. But there is a role for human beings, and that role is really to express and hold the intent of the organisation, because computers don't have agency, and computer agents don't have agency, yet. We'll see about that.
So our role is really holding that agency and expressing judgment and discernment about the decisions that are going to be made. Leaders then have to spend less time on executional issues and focus more on supervising adaptive systems where AI is the executor; it's a change of role. So I think it becomes more important to be absolutely clear about what we're doing and why we're doing it as an organisation, what design principles we have, what outcomes we are committed to and what metrics we'll use to get there. Those are the things that human beings need to declare and truly own. And that's a skill set that isn't about executing all the tasks we're so proud of being able to do. It is a skill set around conversation, co-design and a more declarative role.
And when I talk about the ‘intentional organisation’, that's really what I'm talking about: privileging that role and really emphasising it, because without it I think we're going to have machines lead us to places we don't necessarily want to go.
It's such an interesting comment, Darrel, that emphasis on clarity. It feels like now more than ever you need to understand the entire organisation: how all of your operations work together to create the value you're trying to create in the world. You mentioned earlier that people are using AI to outsource thinking; in a way you're saying there are these core parts that humans really need to be architecting and structuring if we want everything moving in the same direction. I think that's a really powerful thought to reflect on.
So, we have covered a lot of ground, Darrel. You have so much experience in so many different fields, and we're absolutely delighted to have you as an expert collaborator. Over the last year I've really appreciated you helping us learn about this AI frontier, how we can be using it, and how our clients can be utilising these really contemporary ways of designing organisations and creating opportunities. So I'm interested in throwing this question to you.
What specific frontier would you like to explore with Snowmelt’s clients to help them mobilise action?
Well, what I'd say, going back to the challenge at hand that almost all organisations have to be dealing with, is building the capacity, the capability and the skill set as an organisation to operate in an environment of incredible instability, where everything is going to be reinvented. We have to reinvent it with new tools and new rules, all at the same time. So working with you and your clients, and my clients, collaborating together, is exciting: how do we build that level of agility and resilience when we need to operate in an environment of perpetual change? And not just have better plans, but better judgment, better alignment, better satisfaction, a better sense of control. My focus has really been on helping teams choose what they want to do, align on it, mobilise and act in environments of instability and change. That's what makes systems design really work and be something other than just a conceptual practice. It really is a very pragmatic tool set that you are great at, and bringing those skills to the world is definitely needed. I'm looking forward to seeing where you go.
Learn more
- Subscribe to Darrel’s Substack
- Read Darrel's most recent article: Agentic AI: Autonomy Without Accountability
- Listen to Darrel’s music on Spotify
- See Darrel’s artworks and photography on Behance