Podcast

Remember to subscribe to our YouTube channel for a more immersive experience with our guests and content.
@scenariofutures

Transcripts & Show notes

00:00:00.000 –> 00:00:03.240

I wonder how much of the facilitator has to like

 

00:00:03.240 –> 00:00:06.820

bring in this cult of personality. Because Matt,

 

00:00:06.960 –> 00:00:10.740

it’s not just ending the workshop, right? It’s

 

00:00:10.740 –> 00:00:13.019

saying, okay, now we got to think in a different

 

00:00:13.019 –> 00:00:14.800

way. And now we’re going to take what you did

 

00:00:14.800 –> 00:00:16.699

and think in a different way now, you know, because

 

00:00:16.699 –> 00:00:18.800

we have like creativity at the start and then

 

00:00:18.800 –> 00:00:22.620

causality and puzzle solving in the middle, you

 

00:00:22.620 –> 00:00:24.940

know, and then we get all the way to others.

 

00:00:25.000 –> 00:00:28.660

So I wonder how much of that is really facilitated

 

00:00:28.660 –> 00:00:32.939

by, yeah, embracing a very specific profile of

 

00:00:32.939 –> 00:00:34.460

emotional, I don’t want to say manipulation,

 

00:00:34.479 –> 00:00:36.640

but sometimes it almost feels like that because

 

00:00:36.640 –> 00:00:44.560

we have to get through it. Back to you. Pass. I don’t

 

00:00:44.560 –> 00:00:46.920

know. I don’t know what to do with that, Megan.

 

00:00:47.179 –> 00:00:51.280

No, that’s great. It’s a great observation. I

 

00:00:51.280 –> 00:00:53.219

don’t know. Just cut this out. I don’t know.

 

00:00:53.259 –> 00:00:57.450

I don’t know. I’m sorry. A cult of personality?

 

00:00:57.670 –> 00:00:59.490

You got me stuck on that. That’s where you got

 

00:00:59.490 –> 00:01:02.310

me stuck. Welcome to Scenarios for Tomorrow,

 

00:01:02.509 –> 00:01:04.629

a podcast where we turn tomorrow’s headlines

 

00:01:04.629 –> 00:01:07.530

into today’s thought experiments. This first

 

00:01:07.530 –> 00:01:09.930

series includes conversations with the authors

 

00:01:09.930 –> 00:01:13.230

of our latest book, Improving and Enhancing Scenario

 

00:01:13.230 –> 00:01:16.379

Planning, a futures thinking volume from Edward

 

00:01:16.379 –> 00:01:20.099

Elgar Publishing. I’m your host, Dr. Megan Crawford.

 

00:01:20.239 –> 00:01:22.340

And throughout this first series, you’ll hear

 

00:01:22.340 –> 00:01:25.099

from my guests the numerous global techniques

 

00:01:25.099 –> 00:01:27.799

for practicing and advancing scenario planning.

 

00:01:27.980 –> 00:01:40.659

Enjoy. Today, we are lucky to have two guest

 

00:01:40.659 –> 00:01:44.319

authors with us. First is Nicholas Rowland, who

 

00:01:44.319 –> 00:01:47.140

is a distinguished professor of sociology at

 

00:01:47.140 –> 00:01:50.540

Penn State University in the U.S. He is also

 

00:01:50.540 –> 00:01:53.359

the academic trustee on Penn State’s Board of

 

00:01:53.359 –> 00:01:56.280

Trustees. Nicholas studies governance, the future,

 

00:01:56.480 –> 00:02:00.840

and the conduct of science. Matthew Spaniol is

 

00:02:00.840 –> 00:02:04.159

a senior researcher of strategic foresight in

 

00:02:04.159 –> 00:02:07.420

the Department of People and Technology at Roskilde

 

00:02:07.420 –> 00:02:11.039

University in Denmark. Matt has curated a YouTube

 

00:02:11.039 –> 00:02:14.060

channel called the Futurist and Foresight Papers

 

00:02:14.060 –> 00:02:17.080

Explained, where he has about a dozen or so scholars

 

00:02:17.080 –> 00:02:19.860

talking about their papers, of which both Nick

 

00:02:19.860 –> 00:02:22.419

and I have been guests. So I highly recommend

 

00:02:22.419 –> 00:02:25.280

checking that out. Links will be provided in

 

00:02:25.280 –> 00:02:28.849

the show notes. So welcome both. It’s great to

 

00:02:28.849 –> 00:02:32.629

have you here. Thank you. It’s a pleasure to

 

00:02:32.629 –> 00:02:37.009

be here. We have had the opportunity, more than

 

00:02:37.009 –> 00:02:40.509

many, to catch up at our various international

 

00:02:40.509 –> 00:02:43.710

conferences and events that have gone around.

 

00:02:44.030 –> 00:02:46.650

But as it happens, I understand that the two

 

00:02:46.650 –> 00:02:49.490

of you have known one another for some time,

 

00:02:49.590 –> 00:02:54.199

right? Indeed, that’s the case. In fact, scholars

 

00:02:54.199 –> 00:02:57.699

that have seen our work probably don’t know that

 

00:02:57.699 –> 00:02:59.259

Matt and I actually went to college together

 

00:02:59.259 –> 00:03:01.759

so many years ago. And in fact, during our freshman

 

00:03:01.759 –> 00:03:03.960

year, we coincidentally lived right across the

 

00:03:03.960 –> 00:03:07.319

hall from one another. And so once we graduated,

 

00:03:07.479 –> 00:03:09.259

our paths went in slightly different directions.

 

00:03:09.259 –> 00:03:12.960

I went directly into a sociology graduate program.

 

00:03:13.740 –> 00:03:16.439

But then later on, we kind of got connected.

 

00:03:16.680 –> 00:03:18.539

Matt, do you want to take over what you did?

 

00:03:18.960 –> 00:03:21.560

So I went into finance and banking for a while

 

00:03:21.560 –> 00:03:24.379

and then did a master’s in international development

 

00:03:24.379 –> 00:03:26.939

in South America and came back and ended up at

 

00:03:26.939 –> 00:03:28.759

the Copenhagen Institute for Future Studies.

 

00:03:30.240 –> 00:03:34.860

All right. That’s basically when we sort of rejoined

 

00:03:34.860 –> 00:03:37.719

forces, let’s say. And so we’ve been writing

 

00:03:37.719 –> 00:03:39.379

ever since. And that was about 10 years ago.

 

00:03:39.819 –> 00:03:42.620

Do you all have any idea about how many papers

 

00:03:42.620 –> 00:03:46.659

you have together or books? Papers total? Oh,

 

00:03:46.680 –> 00:03:49.550

gosh, I don’t know. At least a dozen. Yeah. Probably

 

00:03:49.550 –> 00:03:54.550

pushing 20. It could be. Nice. Nice and enviable

 

00:03:54.550 –> 00:03:57.449

professional relationship there. We’re all looking

 

00:03:57.449 –> 00:03:59.430

for that. In fact, our university, our department

 

00:03:59.430 –> 00:04:03.689

rather, just had one called Finding Your Research

 

00:04:03.689 –> 00:04:07.810

Life Partner. Something like that. I thought

 

00:04:07.810 –> 00:04:10.650

about it with that one. I was like, no, but that’s

 

00:04:10.650 –> 00:04:15.629

cute. So I’m grateful that we could find this

 

00:04:15.629 –> 00:04:17.569

time to chat because I know it was really difficult

 

00:04:17.569 –> 00:04:20.389

for all of us to get our schedules aligned, especially

 

00:04:20.389 –> 00:04:23.610

since we’re all teaching. We’re at exam times,

 

00:04:23.709 –> 00:04:28.110

all this kind of stuff. But I really, I wanted

 

00:04:28.110 –> 00:04:30.819

to talk specifically with y’all, not just because

 

00:04:30.819 –> 00:04:32.779

you’re authors on the book, but because of what

 

00:04:32.779 –> 00:04:36.360

you brought to the book. And it has to do with

 

00:04:36.360 –> 00:04:39.360

this latest shift in the focus in our field of

 

00:04:39.360 –> 00:04:42.819

scenario planning. So as mentioned in the introduction,

 

00:04:43.240 –> 00:04:45.620

we’ve just published this book together about

 

00:04:45.620 –> 00:04:48.660

scenario planning in the 21st century. We’re

 

00:04:48.660 –> 00:04:51.279

right here at the quarter century mark in 2025.

 

00:04:51.839 –> 00:04:54.420

We’d like to talk a little bit more about that.

 

00:04:54.829 –> 00:04:58.370

We understand that not all of our listeners may

 

00:04:58.370 –> 00:05:01.970

be familiar with even the concept or the term

 

00:05:01.970 –> 00:05:06.029

scenario planning, though some may have heard

 

00:05:06.029 –> 00:05:09.550

more about us since the pandemic when our field

 

00:05:09.550 –> 00:05:14.060

got… extremely popular. But one of the motivations

 

00:05:14.060 –> 00:05:16.860

for this podcast, as well as for the book, was

 

00:05:16.860 –> 00:05:19.639

to bring our world of futures and foresight science

 

00:05:19.639 –> 00:05:24.439

outside the walls of academia, within which, understandably,

 

00:05:24.439 –> 00:05:27.879

the language is closely controlled, but knowledge

 

00:05:27.879 –> 00:05:30.579

ends up being not as easy to access as we generally

 

00:05:30.579 –> 00:05:33.540

wish it to be. Which just means we’re here to

 

00:05:33.540 –> 00:05:37.019

have a chat with the public. Yeah, sounds great.

 

00:05:38.060 –> 00:05:41.180

All right. Well, your chapter then is titled

 

00:05:41.180 –> 00:05:45.259

Examining Emotion in the Facilitation of Scenario

 

00:05:45.259 –> 00:05:48.980

Planning. Now with that, let’s start with a little

 

00:05:48.980 –> 00:05:51.899

softball question, right? So what is, in your

 

00:05:51.899 –> 00:05:54.779

professional experiences, what is scenario planning?

 

00:05:57.000 –> 00:06:01.680

I’m happy to start that one. So historically

 

00:06:01.680 –> 00:06:03.579

speaking, there’s been a little bit of confusion

 

00:06:03.579 –> 00:06:08.379

around the sort of absolute definition

 

00:06:08.379 –> 00:06:11.120

of scenario planning. And that’s because, at

 

00:06:11.120 –> 00:06:14.959

least since the 1990s, there wasn’t academic

 

00:06:14.959 –> 00:06:17.939

consensus on what the term scenario itself meant.

 

00:06:18.100 –> 00:06:19.800

And that’s something that Matt and I actually

 

00:06:19.800 –> 00:06:21.579

picked up. One of the first papers that we

 

00:06:21.579 –> 00:06:24.120

wrote together was trying to understand this

 

00:06:24.120 –> 00:06:26.759

problem about the confusion over the definition

 

00:06:26.759 –> 00:06:29.560

of scenarios in the scenario planning literature.

 

00:06:29.879 –> 00:06:31.939

And so what we did is we basically scoured the

 

00:06:31.939 –> 00:06:35.459

literature for every single example where scenario

 

00:06:35.459 –> 00:06:38.019

was defined. So these would be statements like

 

00:06:38.019 –> 00:06:42.519

scenarios are blank or a scenario is. And we

 

00:06:42.519 –> 00:06:45.399

basically amalgamated them all together and found

 

00:06:45.399 –> 00:06:48.860

that scenarios have six characteristics. The

 

00:06:48.860 –> 00:06:52.329

first one, of course, is that they’re temporally

 

00:06:52.329 –> 00:06:55.829

rooted in the future. And then secondarily, they

 

00:06:55.829 –> 00:06:58.810

almost always involve reference to some kind

 

00:06:58.810 –> 00:07:02.529

of external force in that context, maybe a political

 

00:07:02.529 –> 00:07:04.930

force or an economic force. There’s a bunch of

 

00:07:04.930 –> 00:07:07.889

variability there, of course. From there, we

 

00:07:07.889 –> 00:07:10.290

found that scholars make the argument that any

 

00:07:10.290 –> 00:07:13.550

scenario should be obviously possible, but to

 

00:07:13.550 –> 00:07:16.629

some extent also very plausible, and that the

 

00:07:16.629 –> 00:07:20.629

appropriate form for them to take is a narrative,

 

00:07:20.709 –> 00:07:24.089

or to put it more plainly, a story, only it’s

 

00:07:24.089 –> 00:07:26.029

set in the future, obviously not in the past.

 

00:07:26.850 –> 00:07:29.930

From there, we know that they exist in sets,

 

00:07:30.009 –> 00:07:32.870

and that they’re meant inside of those sets,

 

00:07:32.990 –> 00:07:36.430

this is the final criterion, to be meaningfully

 

00:07:36.430 –> 00:07:39.230

different from one another, so that there’s not

 

00:07:39.230 –> 00:07:41.470

a lot of overlap in the scenarios. They’re related,

 

00:07:41.649 –> 00:07:46.410

but they cover different ground. We’ve worked

 

00:07:46.410 –> 00:07:48.990

on what the word scenario means in there, although

 

00:07:48.990 –> 00:07:51.329

Matt is the one with experience actually conducting

 

00:07:51.329 –> 00:07:55.069

scenario planning in facilitated workshops. So

 

00:07:55.069 –> 00:07:57.230

maybe you want to take over the planning part

 

00:07:57.230 –> 00:08:00.839

for this one, Matt? It’s a bit of a kind of a

 

00:08:00.839 –> 00:08:04.680

misnomer for the actual field that

 

00:08:04.680 –> 00:08:07.480

we work in, right? So planning assumes that you’ve

 

00:08:07.480 –> 00:08:09.379

got a budget and you’ve got a project and you’re

 

00:08:09.379 –> 00:08:11.879

trying to implement that project. Whereas when you’ve

 

00:08:11.879 –> 00:08:14.339

got a set of scenarios sitting in front of you

 

00:08:14.339 –> 00:08:15.740

and you’re trying to figure out what they mean,

 

00:08:15.839 –> 00:08:18.439

we’re looking for a few different

 

00:08:18.439 –> 00:08:21.639

types of outcomes rather than just plans for

 

00:08:21.639 –> 00:08:25.189

the future. So the first thing is that we’re

 

00:08:25.189 –> 00:08:28.170

looking to ideate options. So if you think of

 

00:08:28.170 –> 00:08:31.110

the scenario as the puzzle, what’s the solution

 

00:08:31.110 –> 00:08:33.710

to that puzzle? And the solutions to these puzzles

 

00:08:33.710 –> 00:08:36.389

are often going to come out in this world of,

 

00:08:36.450 –> 00:08:39.289

say, strategic options or ideas or actionable

 

00:08:39.289 –> 00:08:43.610

initiatives or something like that. But when

 

00:08:43.610 –> 00:08:46.480

you put that together again, the scenarios come

 

00:08:46.480 –> 00:08:49.360

back into the discussion again when you want

 

00:08:49.360 –> 00:08:51.799

to start talking about their ability to simulate

 

00:08:51.799 –> 00:08:56.919

the outcomes of those options. So imagine that

 

00:08:56.919 –> 00:08:59.679

you find some solutions to these types of puzzles

 

00:08:59.679 –> 00:09:03.179

and as you think of them and how they would unfold

 

00:09:03.179 –> 00:09:06.220

under these conditions set forth by the scenarios,

 

00:09:06.519 –> 00:09:08.799

right? You’re in this kind of, you’re stress

 

00:09:08.799 –> 00:09:10.799

testing your ideas, right? You’re trying to figure

 

00:09:10.799 –> 00:09:13.899

out what are the… the options that we have for

 

00:09:13.899 –> 00:09:16.200

ways forward. So we’re not setting a plan, a

 

00:09:16.200 –> 00:09:18.320

particular singular plan into place, but we’re

 

00:09:18.320 –> 00:09:21.299

looking for a divergence of different possibilities

 

00:09:21.299 –> 00:09:24.120

by which we can move our organization into the future.

 

00:09:25.320 –> 00:09:30.000

Okay. All right. So you’re honestly the first

 

00:09:30.000 –> 00:09:36.429

group or individuals to break up the concept of

 

00:09:36.429 –> 00:09:38.649

scenario planning between the artifacts, right,

 

00:09:38.649 –> 00:09:41.490

the output that comes at the end, pretty much,

 

00:09:41.490 –> 00:09:44.289

but then the process itself as well, and what

 

00:09:44.289 –> 00:09:47.389

it can be. And yeah, I’ve cited that first paper.

 

00:09:47.389 –> 00:09:50.669

It is a pretty popular one, um, and in fact I’ll

 

00:09:50.669 –> 00:09:53.409

put up your decision tree that you mentioned

 

00:09:53.409 –> 00:09:58.250

Nicholas, at the start, because, um, it’s

 

00:09:58.250 –> 00:10:01.210

actually one of the graphics, uh, figures

 

00:10:01.210 –> 00:10:03.870

I use to teach how to illustrate qualitative

 

00:10:03.870 –> 00:10:07.529

data to the PhDs and stuff, because decision

 

00:10:07.529 –> 00:10:10.210

trees are just really nice. They’re really intuitive

 

00:10:10.210 –> 00:10:14.129

and understandable. I can’t take credit for that.

 

00:10:14.269 –> 00:10:16.690

Matt thought of the flowchart idea. I thought

 

00:10:16.690 –> 00:10:20.769

so, Matt. I use it in a course called concept

 

00:10:20.769 –> 00:10:25.210

analysis. So you can use that format, right?

 

00:10:25.370 –> 00:10:27.389

So the concept analysis or concept engineering

 

00:10:27.389 –> 00:10:30.909

also follows a similar type of method, if you

 

00:10:30.909 –> 00:10:33.240

will. Right. So there are ways to do this kind

 

00:10:33.240 –> 00:10:35.799

of work independently of scenario planning. Right.

 

00:10:35.840 –> 00:10:38.580

I wrote a paper on greenwashing with some students

 

00:10:38.580 –> 00:10:42.100

where we used a very similar type of method. And

 

00:10:42.100 –> 00:10:45.820

I find that PhDs actually get a ton of value

 

00:10:45.820 –> 00:10:47.360

out of something like that because you’re teaching

 

00:10:47.360 –> 00:10:50.159

them how to do what’s called a scoping review.

 

00:10:50.360 –> 00:10:52.559

Right. Which is like a miniature literature review

 

00:10:52.559 –> 00:10:55.840

for students. And so it’s quite a nice

 

00:10:55.840 –> 00:11:01.129

way to get inducted into the PhD, or the research

 

00:11:01.129 –> 00:11:03.809

field, right, by being able to work with the

 

00:11:03.809 –> 00:11:06.570

literature in a very abridged fashion before

 

00:11:06.570 –> 00:11:09.750

you go out and do, say, more systematized or

 

00:11:09.750 –> 00:11:12.970

large literature reviews. Right. And it gets

 

00:11:12.970 –> 00:11:15.750

cited. The paper actually gets cited a lot as

 

00:11:15.750 –> 00:11:18.950

a way to do that rather than for its value as

 

00:11:18.950 –> 00:11:22.149

kind of defining scenario in itself. So that’s

 

00:11:22.149 –> 00:11:25.509

interesting you bring that up. Well, well, there

 

00:11:25.509 –> 00:11:28.950

you go. I think it all fits well. And I just

 

00:11:28.950 –> 00:11:31.289

realized, in this moment, I never told y’all

 

00:11:31.289 –> 00:11:35.110

that I could use your decision tree as a regular

 

00:11:35.110 –> 00:11:43.049

teaching example. So let’s see if we can find

 

00:11:43.049 –> 00:11:47.730

out about similar exciting, unexpected insights

 

00:11:47.730 –> 00:11:53.730

with your new approach here. One thing I do recognize

 

00:11:53.730 –> 00:11:56.409

in your work, and there’s a few out there in

 

00:11:56.409 –> 00:12:00.799

our field where you consistently keep, and when

 

00:12:00.799 –> 00:12:03.179

I say you, I mean the both of you, I’m trying not

 

00:12:03.179 –> 00:12:05.879

to be too Southern here, but y’all promote together

 

00:12:05.879 –> 00:12:11.080

um, really, like, I don’t want to, I’m trying to

 

00:12:11.080 –> 00:12:15.539

be not too flowery in this, but you really bring

 

00:12:15.539 –> 00:12:18.679

in new stuff, okay? You bring in new concepts that

 

00:12:18.679 –> 00:12:20.740

maybe we’ve all been talking about a bit in the

 

00:12:20.740 –> 00:12:23.799

past, or, you know, we chat about between

 

00:12:23.799 –> 00:12:26.519

talks at conferences and stuff. But y’all

 

00:12:26.519 –> 00:12:29.299

are really one of the teams that gets there

 

00:12:29.299 –> 00:12:33.980

first and paves the way. So this chapter, I remember

 

00:12:33.980 –> 00:12:38.379

talking with each of y’all independently. And

 

00:12:38.379 –> 00:12:41.759

this idea just sort of came out. And I remember

 

00:12:41.759 –> 00:12:44.240

saying, what do you kind of just have sitting

 

00:12:44.240 –> 00:12:46.960

on the sidelines that you haven’t found a platform

 

00:12:46.960 –> 00:12:49.659

yet to talk about? And that’s where this conversation,

 

00:12:49.779 –> 00:12:53.039

at least with me, came up for the book. So I

 

00:12:53.039 –> 00:12:55.340

did some searching afterwards and I repeatedly

 

00:12:55.340 –> 00:12:57.500

have done this for the last two years, since we

 

00:12:57.500 –> 00:13:01.100

first discussed your chapter, which is about

 

00:13:01.100 –> 00:13:04.659

examining the emotional elements in the facilitation

 

00:13:04.659 –> 00:13:09.100

of scenario planning. Right. So just before

 

00:13:09.100 –> 00:13:12.629

this interview here, right, I did it again. I went

 

00:13:12.629 –> 00:13:15.590

and I did a broad search of scenario planning,

 

00:13:15.590 –> 00:13:18.830

um, as it’s represented both in the public sphere

 

00:13:18.830 –> 00:13:21.990

and in the academic scholarship, because these

 

00:13:21.990 –> 00:13:24.649

are very, very different representations sometimes.

 

00:13:24.649 –> 00:13:28.549

And maybe we rail against that a bit too

 

00:13:28.549 –> 00:13:31.269

much, I don’t know, maybe not enough. But what I

 

00:13:31.269 –> 00:13:34.759

find is they do align in some key concepts, right?

 

00:13:34.860 –> 00:13:36.679

So if you look for scenario planning in the private

 

00:13:36.679 –> 00:13:39.779

sector, you’ll see terms like foresight and forecasting,

 

00:13:40.159 –> 00:13:42.919

uncertainty, risk, things like that come up.

 

00:13:43.039 –> 00:13:46.919

In the academic world, you’ll hear those plus

 

00:13:46.919 –> 00:13:50.019

decision-making. In the private sector, you’ll

 

00:13:50.019 –> 00:13:53.220

hear more about agility and adaptiveness. But

 

00:13:53.220 –> 00:13:56.940

what is almost exclusively missing? In fact,

 

00:13:56.940 –> 00:13:58.840

I didn’t get any hits, to be honest, except for

 

00:13:58.840 –> 00:14:01.980

your publication, was a discussion of the concept

 

00:14:01.980 –> 00:14:05.519

of emotions, or as we see in social sciences,

 

00:14:05.539 –> 00:14:11.679

affect, right, in the scenario space. And I am

 

00:14:11.679 –> 00:14:17.820

just as interested as to why this is not being

 

00:14:17.820 –> 00:14:21.129

brought up more now as I was before. Because

 

00:14:21.129 –> 00:14:24.870

foresight is rooted in the human agent. It’s

 

00:14:24.870 –> 00:14:27.750

humans engaging in foresight thinking. It’s humans

 

00:14:27.750 –> 00:14:30.769

engaging in scenario planning. And it’s humans

 

00:14:30.769 –> 00:14:34.929

that are the facilitators. So we know that humans,

 

00:14:35.110 –> 00:14:40.049

we as the species, are almost exclusively motivated

 

00:14:40.049 –> 00:14:44.309

by our emotions. And so we need to bring that

 

00:14:44.309 –> 00:14:48.529

into the dialogue. So I would like to pass the

 

00:14:48.529 –> 00:14:52.029

ball back to you now. What got you started on

 

00:14:52.029 –> 00:14:54.429

this idea of looking at the emotional aspect

 

00:14:54.429 –> 00:14:58.309

of scenario planning? Sure. I’m happy to start.

 

00:14:59.450 –> 00:15:02.210

Three things come to mind right away. One of

 

00:15:02.210 –> 00:15:07.990

them is that a couple years prior, we had been

 

00:15:07.990 –> 00:15:11.769

invited to contribute to a book about cognition. The link then

 

00:15:11.769 –> 00:15:14.090

from cognition to emotion ended up happening

 

00:15:14.090 –> 00:15:18.490

in this paper, because the way that emotion is,

 

00:15:18.710 –> 00:15:21.350

at least most frequently, though I do agree

 

00:15:21.350 –> 00:15:24.129

that it’s extremely rare, especially in any coherent

 

00:15:24.129 –> 00:15:28.710

way, despite the obviousness of emotion in the

 

00:15:28.710 –> 00:15:31.870

human drama that is every group-level facilitation,

 

00:15:32.009 –> 00:15:35.389

planning practice, strategy development, and

 

00:15:35.389 –> 00:15:38.830

so on and so forth, is in terms of hot and cold

 

00:15:38.830 –> 00:15:42.940

cognition. That’s the jargon from higher education.

 

00:15:43.620 –> 00:15:46.299

So could you walk us through that? Yeah, yeah.

 

00:15:46.320 –> 00:15:49.620

No, happily. Hot cognition is talking about, during

 

00:15:49.620 –> 00:15:52.899

especially decision-making processes, times

 

00:15:52.899 –> 00:15:55.340

when basically emotions are running high. That’s

 

00:15:55.340 –> 00:15:59.480

the hot part. And it’s seen as interrupting

 

00:15:59.480 –> 00:16:03.399

cognition. That is, it discourages rational

 

00:16:04.480 –> 00:16:07.320

thinking, in part because the emotions take over.

 

00:16:07.379 –> 00:16:09.500

They’re sort of unregulated emotion. And then

 

00:16:09.500 –> 00:16:11.080

there’s this other thing called cold cognition.

 

00:16:11.220 –> 00:16:13.539

And you can imagine it’s basically just the opposite

 

00:16:13.539 –> 00:16:17.100

where you’re being like, you know, cold, very

 

00:16:17.100 –> 00:16:19.580

rational, and you’re keeping things, you know,

 

00:16:19.580 –> 00:16:21.820

keeping the emotion level low. And the belief

 

00:16:21.820 –> 00:16:25.639

in that line of research is basically that hot

 

00:16:25.639 –> 00:16:30.000

cognition is bad and it interrupts and inhibits

 

00:16:30.000 –> 00:16:32.440

rational decision-making. And then there’s cold

 

00:16:32.440 –> 00:16:35.039

cognition, which seems to facilitate it in some

 

00:16:35.039 –> 00:16:39.360

way. Now, Matt and I weren’t super satisfied

 

00:16:39.360 –> 00:16:42.039

with that when we thought about our own experiences

 

00:16:42.039 –> 00:16:44.659

in this place, but that’s how we got to emotion.

 

00:16:44.860 –> 00:16:46.620

It was through cognition. It was being asked

 

00:16:46.620 –> 00:16:49.379

to write a book chapter about cognition. And

 

00:16:49.379 –> 00:16:51.799

secondarily, though, as soon as I read between

 

00:16:51.799 –> 00:16:53.779

the lines that even though those two terms are

 

00:16:53.779 –> 00:16:56.580

all about cognition, they’re also all about emotion,

 

00:16:56.799 –> 00:17:00.460

too, since that’s the major variable force that’s

 

00:17:00.460 –> 00:17:03.679

in the background. Right. And so being a very,

 

00:17:03.759 –> 00:17:06.160

I would say, a classically trained sociologist

 

00:17:06.160 –> 00:17:08.279

that took a lot of theory courses and everything

 

00:17:08.279 –> 00:17:11.900

like that: the sociology of emotions emerged

 

00:17:11.900 –> 00:17:15.579

in the mid-1970s and has been regularly utilized

 

00:17:15.579 –> 00:17:19.829

in order to look at how emotions fit into social

 

00:17:19.829 –> 00:17:22.829

structure, how they were driven by shared norms,

 

00:17:22.930 –> 00:17:25.250

especially within organizations, and how they

 

00:17:25.250 –> 00:17:28.670

play a role in group-level processes, of which

 

00:17:28.670 –> 00:17:30.750

the facilitation of scenario planning is, of

 

00:17:30.750 –> 00:17:34.569

course, a great example. So for me, that wasn’t

 

00:17:34.569 –> 00:17:36.589

a leap at all, even though they didn’t seem to

 

00:17:36.589 –> 00:17:39.470

be connected in my understanding of those literatures.

 

00:17:40.089 –> 00:17:42.269

But then the third part, this is the real part.

 

00:17:42.329 –> 00:17:44.309

This is the heart of it all. It’s when I first

 

00:17:44.309 –> 00:17:46.289

started working again, remember how I told you,

 

00:17:46.309 –> 00:17:48.089

Matt and I kind of, we went to college together,

 

00:17:48.150 –> 00:17:49.910

then had like a period of time where we did our

 

00:17:49.910 –> 00:17:52.329

own separate work and then came back together

 

00:17:52.329 –> 00:17:55.650

once Matt was earning his PhD. When I would listen

 

00:17:55.650 –> 00:17:57.730

to Matt, and he had a lot of scenario planning

 

00:17:57.730 –> 00:18:01.390

experience already, in between the lines all

 

00:18:01.390 –> 00:18:04.769

the time was language about emotional regulation,

 

00:18:05.009 –> 00:18:09.029

you know, and that was the most interesting

 

00:18:09.029 –> 00:18:11.890

thing, if you ask me. Bring a scenario planner

 

00:18:11.890 –> 00:18:13.349

in front of you. They’ll talk about emotion,

 

00:18:13.569 –> 00:18:16.289

no problem. Ask them to write about it. It doesn’t

 

00:18:16.289 –> 00:18:21.410

exist. For some reason, the very humanness of

 

00:18:21.410 –> 00:18:24.750

some of it just gets kind of taken out. And so

 

00:18:24.750 –> 00:18:28.670

honestly, I came to that conclusion in part just

 

00:18:28.670 –> 00:18:30.769

listening to Matt about when he was frustrated

 

00:18:30.769 –> 00:18:34.369

or when he had clients that were in a really

 

00:18:34.369 –> 00:18:36.650

negative emotional state and almost refused to

 

00:18:36.650 –> 00:18:39.609

play along and things like that. And so that’s

 

00:18:39.609 –> 00:18:43.519

the truth behind it all, honestly. And so since

 

00:18:43.519 –> 00:18:45.059

I brought that up, Matt, do you want to pick

 

00:18:45.059 –> 00:18:49.019

up on that one? I think Nick was able to see

 

00:18:49.019 –> 00:18:51.480

that between the lines and that’s quite astute

 

00:18:51.480 –> 00:18:55.670

of him. I think a lot of times when… scenario

 

00:18:55.670 –> 00:18:57.809

planners, consultants, are called into organizations,

 

00:18:57.890 –> 00:19:01.450

you’re really not faced with a very bright, sunny

 

00:19:01.450 –> 00:19:04.509

day, right? Oftentimes, they’re struggling. And

 

00:19:04.509 –> 00:19:06.269

if they could solve the problems that they’re

 

00:19:06.269 –> 00:19:08.670

facing, they would have solved them. But they’re

 

00:19:08.670 –> 00:19:10.910

bringing in external help to do this often. And

 

00:19:10.910 –> 00:19:14.250

so there’s some issues that are already kind

 

00:19:14.250 –> 00:19:17.430

of unraveling inside the organizations. Perhaps

 

00:19:17.430 –> 00:19:19.829

they’ve lost a client. Perhaps they’ve lost a

 

00:19:19.829 –> 00:19:21.569

major supplier, and they’re trying to build back

 

00:19:21.569 –> 00:19:26.640

their resilience, for example. But one of the

 

00:19:26.640 –> 00:19:30.799

main tasks that I see the facilitator has

 

00:19:30.799 –> 00:19:35.299

is when you’re walking into a situation in an

 

00:19:35.299 –> 00:19:39.240

organization in such a state, how is it that

 

00:19:39.240 –> 00:19:45.039

we can kind of make them do things differently,

 

00:19:45.240 –> 00:19:48.460

right? And so part of the emotional management

 

00:19:48.460 –> 00:19:53.890

stuff tries to shift them from… the state that

 

00:19:53.890 –> 00:19:58.109

they’re in to something where we can kind of

 

00:19:58.109 –> 00:20:00.970

jointly find new solutions together. And I’m

 

00:20:00.970 –> 00:20:03.329

a big proponent of the idea that that requires

 

00:20:03.329 –> 00:20:06.869

fun, right? So as an emotional state in itself,

 

00:20:07.009 –> 00:20:11.049

I believe that we’re having fun when we’re solving

 

00:20:11.049 –> 00:20:13.549

problems, when we’re seeing new things, when

 

00:20:13.549 –> 00:20:16.089

we’re collaborating with one another, when we’re

 

00:20:16.089 –> 00:20:18.130

building on other people’s ideas, when we’re

 

00:20:18.130 –> 00:20:21.920

iterating. And so… emotions can then work in

 

00:20:21.920 –> 00:20:23.920

our favor, right, when we can get that excitement

 

00:20:23.920 –> 00:20:27.940

going. And consultants and facilitators out

 

00:20:27.940 –> 00:20:29.660

there would recognize those moments: when

 

00:20:29.660 –> 00:20:31.700

you get that conversation and it’s moving in

 

00:20:31.700 –> 00:20:33.660

the right direction and people are really

 

00:20:33.660 –> 00:20:36.819

kind of ping-ponging and doing a great job, our

 

00:20:36.819 –> 00:20:39.519

job as facilitators can be just to kind of hide

 

00:20:39.519 –> 00:20:42.019

over in the corner and not say anything, right?

 

00:20:42.099 –> 00:20:44.880

To get out of their way because they’re finding

 

00:20:44.880 –> 00:20:48.920

ways and doing things and reigniting that conversation

 

00:20:48.920 –> 00:20:51.640

that they’ve had a hard time having, right? Over

 

00:20:51.640 –> 00:20:57.079

time, as they kind of slip and move and, say, drift

 

00:20:57.079 –> 00:21:00.720

in the directions that they’re going. The two

 

00:21:00.720 –> 00:21:04.059

of you paint a pretty, I think, accurate picture

 

00:21:04.480 –> 00:21:08.640

of what a lot of us experience in the scenario

 

00:21:08.640 –> 00:21:11.680

consulting space, right? So we’re entering a room

 

00:21:11.680 –> 00:21:15.000

where we are already recognized because they

 

00:21:15.000 –> 00:21:17.859

brought us in as the scenario planning expert.

 

00:21:17.859 –> 00:21:20.779

We’re taking that consultant role, and often they’re

 

00:21:20.779 –> 00:21:23.380

strangers, right? We might know one or two people

 

00:21:23.380 –> 00:21:26.299

in the room because that’s who we’ve been setting

 

00:21:26.299 –> 00:21:29.069

up the intervention and the workshops with. But

 

00:21:29.069 –> 00:21:30.710

a lot of the other people, a lot of the other

 

00:21:30.710 –> 00:21:34.329

executives, will be strangers. And they have this

 

00:21:34.329 –> 00:21:37.190

way that everybody comes with their

 

00:21:37.190 –> 00:21:39.509

expectations, in short, right? And those expectations…

 

00:21:40.990 –> 00:21:43.450

As you say, like when we talk about this, we

 

00:21:43.450 –> 00:21:46.390

do, we imply a lot. Sometimes explicitly we’re

 

00:21:46.390 –> 00:21:49.410

saying, you know, they’re grumpy or disruptive

 

00:21:49.410 –> 00:21:52.930

or, you know, they’re very friendly or shy or,

 

00:21:52.990 –> 00:21:55.349

you know, and some of these are behaviors I’m

 

00:21:55.349 –> 00:21:57.410

largely mentioning, but they’re rooted in, you

 

00:21:57.410 –> 00:22:01.109

know, these assumptions of emotions. Right. And

 

00:22:01.109 –> 00:22:03.829

then our reactions to them, because it’s our

 

00:22:03.829 –> 00:22:06.630

role. We constantly talk about going back to

 

00:22:06.630 –> 00:22:09.410

what we say versus what we publish, that Nicholas

 

00:22:09.410 –> 00:22:14.170

was mentioning, is we present ourselves, though,

 

00:22:14.190 –> 00:22:17.589

as these sort of objective people, right? Like

 

00:22:17.589 –> 00:22:19.450

the way when the doctor comes in, we assume the

 

00:22:19.450 –> 00:22:21.009

doctor is going to be objective. When the expert

 

00:22:21.009 –> 00:22:22.430

comes in, we assume they’re going to have some

 

00:22:22.430 –> 00:22:26.690

sort of, you know, rational, if you will, look.

 

00:22:26.890 –> 00:22:29.410

And I will push back maybe later on the difference

 

00:22:29.410 –> 00:22:33.299

between emotional and rational. I really want

 

00:22:33.299 –> 00:22:35.920

to get to your work first. I really want to focus

 

00:22:35.920 –> 00:22:38.299

on that. So I do like this picture that you’re

 

00:22:38.299 –> 00:22:40.680

painting here. And it’s something I agree with. We

 

00:22:40.680 –> 00:22:47.039

need to be more maybe forthcoming with acknowledging

 

00:22:47.039 –> 00:22:52.519

what these pieces are. And all of us have a social

 

00:22:52.519 –> 00:22:54.579

science background. I think maybe that’s why

 

00:22:54.579 –> 00:22:59.000

some of us are more ready to recognize that

 

00:22:59.000 –> 00:23:00.519

than others. Like you say, Nicholas, you were

 

00:23:00.519 –> 00:23:02.279

the one who started seeing these between the

 

00:23:02.279 –> 00:23:05.579

lines things. Matt, you’re the one who was speaking

 

00:23:05.579 –> 00:23:09.740

about these things to start with. So let’s take

 

00:23:09.740 –> 00:23:12.019

that picture, right? You’ve got at least one

 

00:23:12.019 –> 00:23:15.279

facilitator in the room. You’ve got a whole bunch

 

00:23:15.279 –> 00:23:18.200

of people who generally know each other and have

 

00:23:18.200 –> 00:23:21.779

a complete expectation of working with each

 

00:23:21.779 –> 00:23:24.980

other, which we are about to disrupt as the facilitator

 

00:23:24.980 –> 00:23:29.259

in the room. Now, when it comes to our, we talk

 

00:23:29.259 –> 00:23:31.420

about groupthink, we talk about group dynamics.

 

00:23:31.539 –> 00:23:34.339

Well, here’s group emotions, right? How do

 

00:23:34.339 –> 00:23:37.079

you see these emotions, our emotions influencing

 

00:23:37.079 –> 00:23:42.079

the decision-making process that is happening

 

00:23:42.079 –> 00:23:44.980

or that we hope will happen when we do these scenario planning

 

00:23:44.980 –> 00:23:49.359

workshops? So I think that’s a very interesting

 

00:23:49.359 –> 00:23:53.859

question, what you’re asking here about: how

 

00:23:53.859 –> 00:23:56.299

does it influence the decision-making that goes

 

00:23:56.299 –> 00:24:00.019

on in the room? In those scenario planning workshops,

 

00:24:00.279 –> 00:24:03.200

it’s not like they’re taking critical, like organizational

 

00:24:03.200 –> 00:24:06.599

type of decisions either. I mean, they’re simulating

 

00:24:06.599 –> 00:24:08.920

potential options through these different types

 

00:24:08.920 –> 00:24:10.900

of conditions and the scenarios. And I think

 

00:24:10.900 –> 00:24:14.160

so it’s more of an exploratory phase than it is some

 

00:24:14.160 –> 00:24:17.640

kind of closing-down part of the

 

00:24:17.640 –> 00:24:22.130

design diamond, if you will. But there’s plenty

 

00:24:22.130 –> 00:24:24.990

of people that are coming in with loaded agendas,

 

00:24:24.990 –> 00:24:28.190

with something that they have to get done, with

 

00:24:28.190 –> 00:24:31.670

maybe a group of stakeholders inside the organizations

 

00:24:31.670 –> 00:24:35.109

that have their demands on the table. So there’s

 

00:24:35.109 –> 00:24:38.509

a lot of pressure that we can’t necessarily know

 

00:24:38.509 –> 00:24:41.609

all the intricacies of. And this is a challenge

 

00:24:41.609 –> 00:24:45.150

because these manifest themselves in all sorts

 

00:24:45.150 –> 00:24:50.130

of different ways. Being, you know, the person

 

00:24:50.130 –> 00:24:53.470

that doesn’t want to play along, right? And that

 

00:24:53.470 –> 00:24:56.730

can be any role, right? That can be a C-suite

 

00:24:56.730 –> 00:24:59.769

member. That could be a CFO, CEO, or that could

 

00:24:59.769 –> 00:25:03.250

be one of the external members, right? And it’s

 

00:25:03.250 –> 00:25:05.230

a delicate moment and it’s a fragile moment,

 

00:25:05.390 –> 00:25:08.789

right? And so trying to get them to back up and,

 

00:25:08.789 –> 00:25:12.720

you know, downplay, if you will, a little bit of

 

00:25:12.720 –> 00:25:15.980

the seriousness of the episode, to take

 

00:25:15.980 –> 00:25:18.420

them into this kind of hypothetical. This, you

 

00:25:18.420 –> 00:25:21.059

know, these aren’t real stakes we’re talking about,

 

00:25:21.059 –> 00:25:23.670

right? These are just going through some of these,

 

00:25:23.670 –> 00:25:26.630

right, thought experiments, and trying to move

 

00:25:26.630 –> 00:25:29.369

them away from this whole, we’re going to make

 

00:25:29.369 –> 00:25:32.250

a huge decision right now, right, that’s going

 

00:25:32.250 –> 00:25:34.730

to impact all of you, your agenda, and all your stakeholders,

 

00:25:34.730 –> 00:25:37.529

to something more: let’s just explore, you know,

 

00:25:37.529 –> 00:25:40.170

some of the different options that we have.

 

00:25:40.910 –> 00:25:43.710

Right. And so that’s a big moment, right? To

 

00:25:43.710 –> 00:25:46.930

get those people, if we call it the buy-in,

 

00:25:47.029 –> 00:25:48.650

right? To get the buy-in, to get the group moving

 

00:25:48.650 –> 00:25:51.750

forward together, to be able to do this

 

00:25:51.750 –> 00:25:55.789

kind of thinking. Big surprise. I agree with

 

00:25:55.789 –> 00:26:00.329

Matthew. Having observed him facilitate numerous

 

00:26:00.329 –> 00:26:04.710

times, I think he’s exactly right. And so maybe

 

00:26:04.710 –> 00:26:07.750

returning to that idea that emotion and rationality

 

00:26:07.750 –> 00:26:09.990

are sometimes somehow locked together in this

 

00:26:09.990 –> 00:26:14.509

hot and cold cognition modeling. I don’t know

 

00:26:14.509 –> 00:26:20.470

that I’ve ever seen a facilitator actively try

 

00:26:20.470 –> 00:26:26.190

to bring the emotional state down in order to

 

00:26:26.190 –> 00:26:28.710

get to rationality. And in fact, when it’s the

 

00:26:28.710 –> 00:26:31.529

most successful, I often find what’s going on

 

00:26:31.529 –> 00:26:33.809

is they’re creating what we might now call like

 

00:26:33.809 –> 00:26:38.769

a safe space or a space where you can have hypothetical

 

00:26:38.769 –> 00:26:42.809

thought experiments where nobody is 100% committed

 

00:26:42.809 –> 00:26:45.029

to any which direction. And like Matt said before,

 

00:26:45.089 –> 00:26:48.329

it’s exploratory in nature. And then you get

 

00:26:48.329 –> 00:26:53.589

people into, for lack of a better phrase, a somewhat,

 

00:26:53.769 –> 00:26:58.230

I would call it like heightened engagement. Because

 

00:26:58.230 –> 00:27:00.809

I don’t think it involves less emotion. I don’t

 

00:27:00.809 –> 00:27:04.049

think it involves, I’m not trying to get Goldilocks

 

00:27:04.049 –> 00:27:05.809

and the Three Bears here and say that it’s either

 

00:27:05.809 –> 00:27:07.650

too little or too much, but you got to get it

 

00:27:07.650 –> 00:27:10.710

just right. But I think there is an emotional

 

00:27:10.710 –> 00:27:14.970

state that can be induced by creating an environment

 

00:27:14.970 –> 00:27:17.650

where people feel free to share and make guesses

 

00:27:17.650 –> 00:27:20.630

and maybe say something that isn’t the brightest,

 

00:27:21.069 –> 00:27:22.930

you know, as they’re just kind of exploring the

 

00:27:22.930 –> 00:27:25.829

future, but they’re engaged and they’re fully

 

00:27:25.829 –> 00:27:29.390

willing to, I don’t know, for lack of a better

 

00:27:29.390 –> 00:27:33.950

word, play, right? Like I was saying, yeah. Yeah,

 

00:27:33.970 –> 00:27:35.869

yeah, that’s exactly right. And so I don’t think

 

00:27:35.869 –> 00:27:38.009

it’s cold cognition. I think it’s something

 

00:27:38.009 –> 00:27:41.349

more like what Matt is saying about getting people

 

00:27:41.349 –> 00:27:44.430

engaged and giving them… enough room and

 

00:27:44.430 –> 00:27:48.089

enough emotional space so that they don’t feel

 

00:27:48.089 –> 00:27:50.430

like if they make a mistake they’re going to

 

00:27:50.430 –> 00:27:53.269

be judged by their colleagues or a situation

 

00:27:53.269 –> 00:27:56.309

where, unlike in regular business operations, maybe

 

00:27:56.309 –> 00:27:58.410

your boss brings up an idea and it hasn’t been

 

00:27:58.410 –> 00:28:00.829

the greatest idea that he or she has ever heard

 

00:28:00.829 –> 00:28:05.390

you know. And you create some space to get out

 

00:28:05.390 –> 00:28:07.809

of the internal politics. This is probably the

 

00:28:07.809 –> 00:28:10.470

key. Get out of the internal politics that are

 

00:28:10.470 –> 00:28:12.549

blocking the ability of this organization to

 

00:28:12.549 –> 00:28:15.190

do this on their own. I think that’s the emotional

 

00:28:15.190 –> 00:28:17.970

regulation piece that gets delivered when someone

 

00:28:17.970 –> 00:28:21.329

like Matt is facilitating. There’s another model

 

00:28:21.329 –> 00:28:23.029

that we’ll often find also in the literature

 

00:28:23.029 –> 00:28:26.170

regarding Daniel Kahneman’s Thinking, Fast and

 

00:28:26.170 –> 00:28:28.250

Slow, right? Where he’s got the system one and

 

00:28:28.250 –> 00:28:30.650

system two thinking. And I think it’s got some

 

00:28:30.650 –> 00:28:33.569

similarity to what we’re thinking about in the hot

 

00:28:33.569 –> 00:28:35.930

and cold cognition. But the hot and cold cognition

 

00:28:35.930 –> 00:28:41.470

is more about emotionally laden decisions again.

 

00:28:41.549 –> 00:28:44.619

And where the thinking hot… or the thinking

 

00:28:44.619 –> 00:28:48.000

fast and slow is about the time or the speed

 

00:28:48.000 –> 00:28:52.180

it takes you to make a decision. And I also think

 

00:28:52.180 –> 00:28:58.940

that this is also an unfair, say, depiction of

 

00:28:58.940 –> 00:29:01.380

what’s going on in these workshops. You know,

 

00:29:01.420 –> 00:29:05.059

I’ll advocate for a Goldilocks, you know, third

 

00:29:05.059 –> 00:29:09.119

way that if we can switch this form that we communicate

 

00:29:09.119 –> 00:29:14.059

into a collaborative strategic conversation, right,

 

00:29:14.059 –> 00:29:18.299

where we’re using creativity and all of, you know,

 

00:29:18.299 –> 00:29:21.500

our imaginations and all of our puzzle-solving

 

00:29:21.500 –> 00:29:25.500

abilities, right, to apply to these types of, you

 

00:29:25.500 –> 00:29:28.500

know, organizational questions, then it does become

 

00:29:28.500 –> 00:29:31.359

something that’s really engaging, right? And we

 

00:29:31.359 –> 00:29:33.019

can really build and use each other’s momentum.

 

00:29:33.019 –> 00:29:36.750

And so, the positive side of this, it doesn’t

 

00:29:36.750 –> 00:29:39.569

necessarily require that it’s a hot or cold or

 

00:29:39.569 –> 00:29:43.650

a mode one or mode two when we can have that.

 

00:29:43.730 –> 00:29:47.210

And we feel like we can find that. And some people

 

00:29:47.210 –> 00:29:49.549

will find that flow. Right. But that’s often

 

00:29:49.549 –> 00:29:52.910

depicted as an individual type of activity where

 

00:29:52.910 –> 00:29:55.210

you find that flow when you’re in that “in the

 

00:29:55.210 –> 00:29:57.910

zone” kind of thinking. But we can also have that

 

00:29:57.910 –> 00:30:00.589

interpersonally. And we don’t find that very

 

00:30:00.589 –> 00:30:04.750

much in the literature. Hmm. Sounds like a future

 

00:30:04.750 –> 00:30:12.190

paper. Definitely. Um, well, yeah, let’s

 

00:30:12.190 –> 00:30:15.869

see. So many thoughts. Um, I do like the way

 

00:30:15.869 –> 00:30:18.930

you said, Matt, that, um, one of the things,

 

00:30:18.950 –> 00:30:21.230

one of the things you look for or is enjoyable

 

00:30:21.230 –> 00:30:26.029

or is aimed for in, um, in these strategic planning

 

00:30:26.029 –> 00:30:28.809

places, spaces, right, with the scenario teams,

 

00:30:29.089 –> 00:30:32.769

um, as a facilitator, is, it’s like trying to

 

00:30:32.769 –> 00:30:35.250

manufacture, not manufacture, trying to excite

 

00:30:35.250 –> 00:30:37.890

fun, right? Trying to get some funness, at which

 

00:30:37.890 –> 00:30:40.589

then, as you, Nicholas, say, it becomes almost

 

00:30:40.589 –> 00:30:44.769

like a hot, hot, trying to think of the word,

 

00:30:44.789 –> 00:30:47.329

and it left me, hot cognition, right? Emotions

 

00:30:47.329 –> 00:30:49.970

start running high, but not necessarily in a

 

00:30:49.970 –> 00:30:52.490

bad way. And then, as you say, Matt, you just

 

00:30:52.490 –> 00:30:54.069

kind of get out of the way and just let them

 

00:30:54.069 –> 00:30:55.849

run. And I think we’ve all been there before,

 

00:30:55.869 –> 00:30:57.190

where it’s like, no, no, no, I don’t want to

 

00:30:57.190 –> 00:30:59.650

disrupt this flow. And it puts me in mind of

 

00:30:59.650 –> 00:31:01.630

the old Model T cars, where you had to, like,

 

00:31:01.789 –> 00:31:04.650

wind them up. And sometimes it took a while, and

 

00:31:04.650 –> 00:31:07.069

that first crank was always the hardest because

 

00:31:07.069 –> 00:31:09.869

the machine’s the coldest. And the way oil

 

00:31:09.869 –> 00:31:11.509

used to be back then, lubricants would be

 

00:31:11.509 –> 00:31:13.369

very thick, and you really had to heat that up.

 

00:31:13.369 –> 00:31:15.710

But once it went, you would not touch it again.

 

00:31:15.710 –> 00:31:17.250

You wouldn’t want to touch it again; you would

 

00:31:17.250 –> 00:31:23.410

disrupt what, um, the machine’s, um, now self

 

00:31:23.410 –> 00:31:26.329

-perpetuating functioning was, or flow was.

 

00:31:26.329 –> 00:31:29.930

So, yeah. Right. And then we have to disrupt

 

00:31:29.930 –> 00:31:32.049

it, because, you know, somebody set a time plan,

 

00:31:32.049 –> 00:31:35.950

right, for the day. You gotta go, and you have to

 

00:31:35.950 –> 00:31:38.170

go up there and say, oh, you know, I’m really sorry

 

00:31:38.170 –> 00:31:40.309

that I have to interrupt such, you know, interesting

 

00:31:40.309 –> 00:31:45.630

conversations, but now we have to move on. So let’s

 

00:31:45.630 –> 00:31:47.829

go with that, the speaking, you know, saying the

 

00:31:47.829 –> 00:31:50.470

things that we don’t really say in the professional

 

00:31:50.470 –> 00:31:53.970

spaces, like in our publications. I wonder, and

 

00:31:53.970 –> 00:31:56.069

this is one thing that I’ve wondered since we first

 

00:31:56.069 –> 00:31:58.309

talked about this topic of emotions in the spaces

 

00:31:58.309 –> 00:32:00.450

and what it means for the client, what it means

 

00:32:00.450 –> 00:32:03.529

for scenario planning broadly: I wonder how

 

00:32:03.529 –> 00:32:07.369

much of the facilitator has to like bring in

 

00:32:07.369 –> 00:32:10.109

this cult of personality almost, because we talk

 

00:32:10.109 –> 00:32:12.890

about fighting against that sometimes when we

 

00:32:12.890 –> 00:32:14.869

have the decision makers in the room, like we

 

00:32:14.869 –> 00:32:16.930

say, you know, the boss or the CEO or something

 

00:32:16.930 –> 00:32:20.779

who is creating this sort of blockage, right?

 

00:32:20.960 –> 00:32:24.799

But it’s because they’ve cultivated this sort

 

00:32:24.799 –> 00:32:27.839

of like cultural, corporate culture, personality

 

00:32:27.839 –> 00:32:29.960

type, and then everybody’s responding to them.

 

00:32:30.059 –> 00:32:32.839

And one of our colleagues, George Wright, he’s

 

00:32:32.839 –> 00:32:35.119

talked about taking those people out of the room

 

00:32:35.119 –> 00:32:37.839

for a while so everybody else will talk. And

 

00:32:37.839 –> 00:32:40.380

I’ve wondered how much of this that we’re talking

 

00:32:40.380 –> 00:32:44.309

about requires us to… manufacture that within

 

00:32:44.309 –> 00:32:47.250

ourselves, this cult of personality. And I’m

 

00:32:47.250 –> 00:32:49.170

like, no, no, no, trust me. We just got to get

 

00:32:49.170 –> 00:32:51.069

through the first windup, right? We just got

 

00:32:51.069 –> 00:32:52.730

to get to the pitch. We just got to get to that

 

00:32:52.730 –> 00:32:55.869

and then just let them go, and then keep

 

00:32:55.869 –> 00:32:58.970

that momentum going. Because Matt, it’s not just

 

00:32:58.970 –> 00:33:02.890

ending the workshop, right? It’s saying, okay,

 

00:33:02.950 –> 00:33:05.509

now we got to think in a different way. And now

 

00:33:05.509 –> 00:33:06.950

we’re going to take what you did and think in

 

00:33:06.950 –> 00:33:08.470

a different way now, you know, because we have

 

00:33:08.470 –> 00:33:11.609

like creativity at the start, and then causality,

 

00:33:12.480 –> 00:33:14.599

puzzle solving in the middle, you know, and then

 

00:33:14.599 –> 00:33:17.559

we get all the way to others. So I wonder how

 

00:33:17.559 –> 00:33:21.599

much of that is really facilitated by, yeah, embracing

 

00:33:21.599 –> 00:33:25.339

a very specific profile of emotional, I don’t

 

00:33:25.339 –> 00:33:27.119

want to say manipulation, but sometimes it almost

 

00:33:27.119 –> 00:33:29.539

feels like that, because we have to get through

 

00:33:29.539 –> 00:33:43.339

it. Back to you. I don’t know what to do with that, Megan.

 

00:33:43.680 –> 00:33:47.859

No, that’s great. Great observation. I don’t

 

00:33:47.859 –> 00:33:50.039

know. Just cut this out. I don’t know. I don’t

 

00:33:50.039 –> 00:33:53.799

know. I’m sorry. That’s a cult of personality.

 

00:33:53.920 –> 00:33:55.980

You got me stuck on that. That’s where you got

 

00:33:55.980 –> 00:33:58.680

me stuck. And you probably get Nick stuck, too,

 

00:33:58.740 –> 00:34:00.619

on cult of personality. Because we were thinking

 

00:34:00.619 –> 00:34:04.000

maybe we could go to the difference between emotion

 

00:34:04.000 –> 00:34:05.799

work and emotional labor if we haven’t been there.

 

00:34:05.819 –> 00:34:08.039

Have we been there yet? No, we haven’t. We’ve

 

00:34:08.039 –> 00:34:12.409

just barely touched upon that. With that in mind,

 

00:34:12.630 –> 00:34:17.989

you have these divisions that you have touched

 

00:34:17.989 –> 00:34:21.110

upon, which is the difference between, and it

 

00:34:21.110 –> 00:34:22.829

links into what we were just talking about, but

 

00:34:22.829 –> 00:34:25.309

I really wonder, it links between emotional labor,

 

00:34:25.469 –> 00:34:30.429

right, and emotional work. And again, along with

 

00:34:30.429 –> 00:34:32.849

the broader conversation about emotions, not

 

00:34:32.849 –> 00:34:37.090

really discussed. These not only… in and of

 

00:34:37.090 –> 00:34:39.309

themselves, but the difference between them.

 

00:34:39.389 –> 00:34:42.429

So with this idea of cults of personality and getting

 

00:34:42.429 –> 00:34:44.329

people going and all that kind of stuff, how

 

00:34:44.329 –> 00:34:49.969

do you see these differences? Interesting. Especially

 

00:34:49.969 –> 00:34:52.889

in the context of a facilitator, particularly.

 

00:34:53.630 –> 00:34:57.250

Right, right. I think these two terms get us

 

00:34:57.250 –> 00:35:01.110

some additional purchase in not only the way

 

00:35:01.110 –> 00:35:02.690

we think about it, but also the way we would

 

00:35:02.690 –> 00:35:06.380

examine this from a scientific perspective. The

 

00:35:06.380 –> 00:35:09.219

first piece that you brought up, I’m actually

 

00:35:09.219 –> 00:35:10.780

going to go in reverse because I think that might

 

00:35:10.780 –> 00:35:13.400

be better to start with work. So when we talk

 

00:35:13.400 –> 00:35:15.179

about emotion work, and again, keep in mind,

 

00:35:15.199 –> 00:35:17.619

this is the mid-1970s in sociology when these

 

00:35:17.619 –> 00:35:22.420

ideas first emerged. Emotion work, work is not

 

00:35:22.420 –> 00:35:24.719

meant to be, even though the other one, labor

 

00:35:24.719 –> 00:35:26.980

and work, you almost think they must be synonymous

 

00:35:26.980 –> 00:35:29.539

because those words could stand in for one another.

 

00:35:30.170 –> 00:35:33.309

When scholars use the terminology of emotion

 

00:35:33.309 –> 00:35:35.610

work, they’re simply referring to the effort

 

00:35:35.610 –> 00:35:40.550

that you or I or anyone else inside or outside

 

00:35:40.550 –> 00:35:43.769

of an organization is asked to exert on a daily

 

00:35:43.769 –> 00:35:50.690

basis in order to manage and control their expression

 

00:35:50.690 –> 00:35:53.750

and presentation of emotion during human interaction.

 

00:35:53.969 –> 00:35:59.349

So all the time we are… filtering what emotions

 

00:35:59.349 –> 00:36:02.170

we’re going to allow to move into the exterior

 

00:36:02.170 –> 00:36:05.969

part of our person or ourself and what emotions

 

00:36:05.969 –> 00:36:09.090

we’re going to keep, you know, tight to the chest

 

00:36:09.090 –> 00:36:11.889

or keep a poker face, as they sometimes say, right?

 

00:36:12.050 –> 00:36:15.269

Now, all of us are doing that all the time, right?

 

00:36:15.349 –> 00:36:17.369

I mean, there are definitely moments for, you

 

00:36:17.369 –> 00:36:19.849

know, I’m sure lots of people know these moments

 

00:36:19.849 –> 00:36:22.369

where inside you’re really frustrated or really

 

00:36:22.369 –> 00:36:24.949

upset with somebody, but on the outside, nobody

 

00:36:24.949 –> 00:36:27.760

would know. So this is emotion work. And the

 

00:36:27.760 –> 00:36:29.400

reason they call it work is that’s the effort

 

00:36:29.400 –> 00:36:33.219

that you have to deliver in order to do that.

 

00:36:33.659 –> 00:36:35.800

Emotional labor, though: now we’re in

 

00:36:35.800 –> 00:36:39.360

the labor market. So we’re talking about occupations,

 

00:36:39.360 –> 00:36:41.579

work inside of organizations, things like that.

 

00:36:42.199 –> 00:36:46.420

And emotional labor then refers to the economic

 

00:36:46.420 –> 00:36:50.460

aspects where emotion is an explicit part of

 

00:36:50.460 –> 00:36:55.539

your job. To deliver your labor effectively and

 

00:36:55.539 –> 00:36:58.820

in exchange for money, you need to manage the

 

00:36:58.820 –> 00:37:02.400

emotional states of others and, of course, yourself.

 

00:37:02.699 –> 00:37:04.659

And in some lines of work, this is really obvious.

 

00:37:05.099 –> 00:37:08.260

So in one of the banner examples in early sociology,

 

00:37:08.739 –> 00:37:13.380

there was the case of airline stewards and airline

 

00:37:13.380 –> 00:37:16.460

stewardesses, where when, for example, a plane

 

00:37:16.460 –> 00:37:19.929

experiences some turbulence, their job is to

 

00:37:19.929 –> 00:37:22.610

literally go into the cabin and manage people’s

 

00:37:22.610 –> 00:37:24.670

emotional state so that they’re calm and everything

 

00:37:24.670 –> 00:37:29.150

is collected, right? The truth is, loads of professions

 

00:37:29.150 –> 00:37:31.889

are asked to do emotional labor all the time,

 

00:37:31.889 –> 00:37:33.969

but you’d never find it in their job descriptions,

 

00:37:34.309 –> 00:37:38.190

right? Nobody’s going to say that, for example,

 

00:37:38.190 –> 00:37:40.449

a contemporary job, since we’re all faculty members,

 

00:37:40.630 –> 00:37:45.210

a contemporary aspect of the faculty workload

 

00:37:45.210 –> 00:37:47.570

right now is managing the emotional states of

 

00:37:47.570 –> 00:37:50.889

students. Sometimes that’s in the classroom where

 

00:37:50.889 –> 00:37:54.409

you want to keep discussions productive and so

 

00:37:54.409 –> 00:37:56.889

that they don’t devolve. Other times, and I know

 

00:37:56.889 –> 00:37:58.670

scholars are talking about this more than ever

 

00:37:58.670 –> 00:38:02.329

right now, in, for example, office hours and

 

00:38:02.329 –> 00:38:04.329

outside of the classroom where faculty members

 

00:38:04.329 –> 00:38:07.610

are being asked to act as kind of like de facto

 

00:38:07.610 –> 00:38:10.090

psychological counselors for their students and

 

00:38:10.090 –> 00:38:11.849

things like that. And so you’d never see that

 

00:38:11.849 –> 00:38:14.909

in a job description. But I think people are

 

00:38:14.909 –> 00:38:17.170

getting the sense that that is emerging as more

 

00:38:17.170 –> 00:38:20.670

of a norm. And so that’s the distinction. Emotion

 

00:38:20.670 –> 00:38:22.449

work, we’re all doing it all the time. It has

 

00:38:22.449 –> 00:38:24.929

a lot to do with how we keep some emotions in

 

00:38:24.929 –> 00:38:27.449

and let others out, sometimes selectively or

 

00:38:27.449 –> 00:38:29.849

even strategically. And then emotional labor

 

00:38:29.849 –> 00:38:33.630

is where your pay is effectively tied to either

 

00:38:33.630 –> 00:38:35.670

creating an emotional state in somebody else,

 

00:38:35.710 –> 00:38:39.530

like as a server, say at a restaurant. Or as

 

00:38:39.530 –> 00:38:42.789

part of managing, say, your clients or your customers

 

00:38:42.789 –> 00:38:46.550

or something like that. But you brought up the

 

00:38:46.550 –> 00:38:50.750

idea of how does that work with a cult of personality?

 

00:38:51.269 –> 00:38:55.090

You know, when you’ve got a, let’s say, a big

 

00:38:55.090 –> 00:38:58.369

wig in the room while you’re trying to get people

 

00:38:58.369 –> 00:39:02.409

into a more dynamic and imaginative space to deal

 

00:39:02.409 –> 00:39:04.030

with some of the thought experiments that you’re

 

00:39:04.030 –> 00:39:06.550

really challenged with in the organization. And

 

00:39:06.550 –> 00:39:09.929

sometimes that gets clogged up. If the current

 

00:39:09.929 –> 00:39:12.210

strategy is being designed by someone who’s elite

 

00:39:12.210 –> 00:39:13.869

in the organization, you’re going to find out

 

00:39:13.869 –> 00:39:16.130

when they come up with ideas during brainstorming

 

00:39:16.130 –> 00:39:18.809

practices, theirs is often the best idea in the

 

00:39:18.809 –> 00:39:21.829

room. And the reason why, of course, is that

 

00:39:21.829 –> 00:39:24.730

there are political consequences. Even though

 

00:39:24.730 –> 00:39:27.309

you try to create, to the best of your ability,

 

00:39:27.429 –> 00:39:30.789

a safe space to be exploratory in these facilitating

 

00:39:30.789 –> 00:39:39.579

practices, there are realities. There are employees

 

00:39:39.579 –> 00:39:41.860

in organizations that don’t want to look dumb

 

00:39:41.860 –> 00:39:44.239

in front of their colleagues. They don’t want

 

00:39:44.239 –> 00:39:49.219

to say something that might become, I don’t know,

 

00:39:49.239 –> 00:39:52.119

used against them in the future. They might find

 

00:39:52.119 –> 00:39:56.860

that if they make too many recommendations that

 

00:39:56.860 –> 00:39:58.980

are unpopular, it might come up in their next

 

00:39:58.980 –> 00:40:01.920

round of employee review or something like that.

 

00:40:04.650 –> 00:40:08.409

I think it’s undersold exactly how much risk

 

00:40:08.409 –> 00:40:11.130

some employees take on when we ask them to do some

 

00:40:11.130 –> 00:40:13.570

of this more exploratory work, especially in

 

00:40:13.570 –> 00:40:17.030

the context of their peers. And so I did my best

 

00:40:17.030 –> 00:40:19.369

to touch on the cult of personality thing. That’s

 

00:40:19.369 –> 00:40:22.510

a hard one. That’s a hard one. The truth is,

 

00:40:22.570 –> 00:40:28.469

because that is so rarely talked about in any

 

00:40:28.469 –> 00:40:30.789

systematic way, I feel like part of it is we

 

00:40:30.789 –> 00:40:33.210

don’t even have the analytical vocabulary yet

 

00:40:33.210 –> 00:40:36.340

to fully parse out some of those issues. I don’t

 

00:40:36.340 –> 00:40:39.539

disagree. I think that they’re real. And I feel

 

00:40:39.539 –> 00:40:43.760

like I know what that is when I hear it in a

 

00:40:43.760 –> 00:40:46.539

more casual conversation. But getting at that

 

00:40:46.539 –> 00:40:49.880

empirically, that would be interesting. Really,

 

00:40:49.900 –> 00:40:52.199

really interesting. Well, you know, I’ll tell

 

00:40:52.199 –> 00:40:56.000

you what first got me thinking about that as

 

00:40:56.000 –> 00:40:59.760

even like a mover and shaker in the room, you

 

00:40:59.760 –> 00:41:05.429

know, of this, of the scenario space, is when

 

00:41:05.429 –> 00:41:09.449

I was learning from George, my supervisor,

 

00:41:09.449 –> 00:41:12.429

and he had his ways. I mean, he had been doing

 

00:41:12.429 –> 00:41:14.570

it for ages, so he was obviously teaching me through

 

00:41:14.570 –> 00:41:18.269

his methods, which involved even just his physical

 

00:41:18.269 –> 00:41:20.070

gestures, you know, just the way he walks around

 

00:41:20.070 –> 00:41:21.710

the room, all this kind of stuff, just everything

 

00:41:21.710 –> 00:41:23.670

about him. And I was absorbing it like a sponge,

 

00:41:23.670 –> 00:41:26.590

as you should, right? This is how it

 

00:41:26.590 –> 00:41:28.510

works, and you’re the student and you’re being

 

00:41:28.510 –> 00:41:31.769

mentored. Um, and really, none of it… no,

 

00:41:31.789 –> 00:41:34.769

not none of it, but a lot of what I thought would

 

00:41:34.769 –> 00:41:36.849

work for me because I was just parroting him

 

00:41:36.849 –> 00:41:40.369

in these very effective ways, which is not a bad

 

00:41:40.369 –> 00:41:41.750

thing, but I was noticing they weren’t working

 

00:41:41.750 –> 00:41:44.829

for me. And then I started, all right, I’m a

 

00:41:44.829 –> 00:41:46.590

researcher. I’m just going to systematically

 

00:41:46.590 –> 00:41:49.409

try and figure out, take notes and figure out

 

00:41:49.409 –> 00:41:53.710

which parts aren’t working and why. And I, I

 

00:41:53.710 –> 00:41:57.090

mean, it’s hard to pin down a lot of what I’m

 

00:41:57.090 –> 00:42:01.210

about to say. But my mitigations and alterations

 

00:42:01.210 –> 00:42:05.010

helped support what I thought I was seeing, which

 

00:42:05.010 –> 00:42:09.170

is George is an older man in the room who commands

 

00:42:09.170 –> 00:42:11.590

authority. And I was a younger female in the

 

00:42:11.590 –> 00:42:16.139

room, attempting to command authority. And even

 

00:42:16.139 –> 00:42:18.820

when I do, when I figured out how to bring that

 

00:42:18.820 –> 00:42:21.300

authority into the space, which goes back to

 

00:42:21.300 –> 00:42:22.699

something you were saying, Nicholas, about we

 

00:42:22.699 –> 00:42:24.300

got to get them to trust us first. We got to

 

00:42:24.300 –> 00:42:26.659

get them to trust the process. And then, you

 

00:42:26.659 –> 00:42:28.800

know, we go to what Matt was saying, which is

 

00:42:28.800 –> 00:42:30.880

then we can get them going. And then, you know,

 

00:42:30.900 –> 00:42:33.179

they start going, right? Just to get to that trust

 

00:42:33.179 –> 00:42:36.900

part, I had to create a whole other version of

 

00:42:36.900 –> 00:42:39.940

myself. It’s not that it’s inauthentic, but it

 

00:42:39.940 –> 00:42:42.800

definitely wasn’t George and it definitely wasn’t

 

00:42:42.800 –> 00:42:44.579

who I was bringing in before. And that’s why

 

00:42:44.579 –> 00:42:46.380

I started thinking like, okay, well maybe this

 

00:42:46.380 –> 00:42:51.420

is like, I need them to sort of halo effect me

 

00:42:51.420 –> 00:42:53.679

a bit. And then I can just pass the ball back

 

00:42:53.679 –> 00:42:56.360

to them and get out of the way. Right. So that’s

 

00:42:56.360 –> 00:42:59.079

what got me thinking about it. And that, at its core,

 

00:42:59.079 –> 00:43:03.199

is, yeah, a lot of emotional, not work, just work

 

00:43:03.199 –> 00:43:05.980

for me and labor, but like specifically trying

 

00:43:05.980 –> 00:43:12.159

to watch theirs and sort of facilitate their

 

00:43:12.159 –> 00:43:14.519

emotional spaces as well. At least that’s how

 

00:43:14.519 –> 00:43:18.760

I was interpreting it. I don’t know. We’re breaking

 

00:43:18.760 –> 00:43:21.260

new ground here. These weren’t part of our questions.

 

00:43:22.300 –> 00:43:25.400

I think that you’re on the right. I think you’re

 

00:43:25.400 –> 00:43:28.039

absolutely on the right track. And interestingly

 

00:43:28.039 –> 00:43:29.940

enough, despite the fact that you would think

 

00:43:29.940 –> 00:43:33.460

all of that should be a core part of the training

 

00:43:33.460 –> 00:43:36.500

process for getting people ready for the field,

 

00:43:37.550 –> 00:43:40.230

I mean, you just do not have an emotional management

 

00:43:40.230 –> 00:43:45.550

101 sort of piece, even though, like Matt said

 

00:43:45.550 –> 00:43:48.130

before, I think it’s really important that oftentimes

 

00:43:48.130 –> 00:43:50.469

the emotional state is already heightened when

 

00:43:50.469 –> 00:43:52.349

you show up because if things were going well,

 

00:43:52.409 –> 00:43:54.849

you wouldn’t need external support. And then

 

00:43:54.849 –> 00:43:58.510

the expectation is not only that, well, they’re

 

00:43:58.510 –> 00:44:01.130

coming in hot, it’s that you’re going to be able

 

00:44:01.130 –> 00:44:04.309

to manage them. Right. They’re asking

 

00:44:04.309 –> 00:44:06.329

for some outside help. That’s literally the point

 

00:44:06.329 –> 00:44:11.409

of facilitation. Right. And so the assumption

 

00:44:11.409 –> 00:44:14.530

that, you know, you’re going to walk into a place

 

00:44:14.530 –> 00:44:16.769

with a heightened emotional state and that you’re

 

00:44:16.769 –> 00:44:20.690

going to be expected to manage it. And then simultaneously,

 

00:44:20.730 –> 00:44:23.949

keep in mind with the emotional work piece, also

 

00:44:23.949 –> 00:44:26.929

personally manage your own emotional

 

00:44:26.929 –> 00:44:31.280

state, too, so that even if, for example, you

 

00:44:31.280 –> 00:44:32.760

know, you’ve got, like you said before, like

 

00:44:32.760 –> 00:44:35.119

one of the big wigs in the room just like won’t

 

00:44:35.119 –> 00:44:38.980

play ball or whatever the case might be. Instead

 

00:44:38.980 –> 00:44:41.260

of experiencing your authentic emotion, which

 

00:44:41.260 –> 00:44:43.900

could be deep levels of frustration, like why

 

00:44:43.900 –> 00:44:46.179

did you bring me here if you’re not going to

 

00:44:46.179 –> 00:44:50.170

play ball in the first place? You have to keep that in check. So that’s where

 

00:44:50.170 –> 00:44:52.369

the emotional thing is really intense because

 

00:44:52.369 –> 00:44:55.349

you’re self -regulating your emotions while simultaneously

 

00:44:55.349 –> 00:44:58.090

managing the emotions in the room, all of which

 

00:44:58.090 –> 00:45:02.869

needs to move into a productive space or else

 

00:45:02.869 –> 00:45:07.050

things fall apart pretty quick. Right. So that

 

00:45:07.050 –> 00:45:08.769

was some of what we were talking about before.

 

00:45:10.050 –> 00:45:13.030

Yes. So there’s these challenges, right? These

 

00:45:13.030 –> 00:45:16.750

challenges, as the facilitator, of managing

 

00:45:16.750 –> 00:45:21.469

a bunch of cats that we’re trying to herd in

 

00:45:21.469 –> 00:45:25.190

the room for their betterment. So let’s step

 

00:45:25.190 –> 00:45:28.110

forward from there. I think we’ve talked about this quite

 

00:45:28.110 –> 00:45:32.050

a bit, but I’d like to hear what ethical concerns

 

00:45:32.050 –> 00:45:34.170

arise when facilitators manage or manipulate

 

00:45:34.170 –> 00:45:37.170

emotions in the workshop. How does that go? How

 

00:45:37.170 –> 00:45:40.400

does that go, Nick? Well, just the idea that

 

00:45:40.400 –> 00:45:45.179

people in other areas where the work is heavily

 

00:45:45.179 –> 00:45:48.380

about emotional regulation talk about burnout.

 

00:45:48.559 –> 00:45:53.059

So like the airline stewards or the airline stewardesses,

 

00:45:53.139 –> 00:45:56.079

for example, because they’re so dissociated

 

00:45:56.079 –> 00:45:58.760

from their authentic emotional state on a daily

 

00:45:58.760 –> 00:46:02.000

basis, they become slowly more and more estranged

 

00:46:02.000 –> 00:46:04.679

from their actual feelings. And this coming from

 

00:46:04.679 –> 00:46:06.599

the States, you’re familiar with that restaurant

 

00:46:06.599 –> 00:46:11.659

named… Hooters. Hooters. Yeah. Is that right?

 

00:46:12.019 –> 00:46:14.219

Sports bar kind of place. I mean, it’s like…

 

00:46:14.280 –> 00:46:17.500

Oh, it’s a Hooters. Yeah. Yeah. Well, you know

 

00:46:17.500 –> 00:46:20.099

what? Well, so they ran almost the same kind

 

00:46:20.099 –> 00:46:22.940

of interviews with women that were working at

 

00:46:22.940 –> 00:46:25.559

these places. And one thing that I will never

 

00:46:25.559 –> 00:46:29.039

forget was they were conducting one of the interviews

 

00:46:29.039 –> 00:46:31.440

right after a woman had gotten off like three,

 

00:46:31.519 –> 00:46:34.579

like, long back-to-back-to-back shifts, you know, where

 

00:46:34.579 –> 00:46:38.079

she was just, like, being, you know, low-key, uh,

 

00:46:38.079 –> 00:46:41.699

sexually exploited in the workplace, basically,

 

00:46:41.699 –> 00:46:43.780

you know, like, can’t wait to get you guys more

 

00:46:43.780 –> 00:46:45.900

beers so you can sexually harass me more, this

 

00:46:45.900 –> 00:46:49.380

should be great. And the first question

 

00:46:49.380 –> 00:46:51.900

was basically something like, well, how are

 

00:46:51.900 –> 00:46:53.659

you today, how are you feeling? It’s just kind

 

00:46:53.659 –> 00:46:56.260

of like an early rapport question. And her answer

 

00:46:56.260 –> 00:47:00.519

was, I don’t know, I haven’t been myself all week.

 

00:47:00.519 –> 00:47:04.969

And I was just like, God damn, you know? Like, that

 

00:47:04.969 –> 00:47:09.250

really hits, you know. And so either way, some emotional

 

00:47:09.250 –> 00:47:11.369

estrangement, I think, is there. So I can start

 

00:47:11.369 –> 00:47:14.989

with that. Yeah, maybe, but I don’t… as a facilitator,

 

00:47:14.989 –> 00:47:18.329

I can’t say. I mean, I’m drained, like, energy-wise,

 

00:47:18.329 –> 00:47:21.409

but I wouldn’t say that I get into some kind

 

00:47:21.409 –> 00:47:25.070

of emotional… or do I? That’s hard for me to feel.

 

00:47:25.070 –> 00:47:28.449

I don’t know. Do you get there, Megan? Because, well,

 

00:47:28.449 –> 00:47:30.800

I don’t know if I… You just said you did. Right.

 

00:47:30.920 –> 00:47:34.059

Yeah. Yeah. I mean, to an extent, exactly. And

 

00:47:34.059 –> 00:47:38.559

what Nicholas mentions is, I mean, not to

 

00:47:38.559 –> 00:47:41.219

make it too reductive, but

 

00:47:41.219 –> 00:47:44.139

like, welcome to the woman’s world. I mean, I’ve

 

00:47:44.139 –> 00:47:45.940

had these conversations with my kids who are

 

00:47:45.940 –> 00:47:49.400

both boys and I’ve told them jokingly, but it

 

00:47:49.400 –> 00:47:51.360

came from a very serious place. Just one day

 

00:47:51.360 –> 00:47:52.920

out of the blue, I said, wow, I don’t know how much

 

00:47:52.920 –> 00:47:55.440

of my personality is me and how much of it is

 

00:47:55.440 –> 00:47:58.139

just a lifetime of coping mechanisms, because…

 

00:47:58.460 –> 00:48:03.650

Jesus Christ. Sorry. Right. I said it. And I

 

00:48:03.650 –> 00:48:06.769

think that gets back to what I was saying about,

 

00:48:06.809 –> 00:48:11.190

you know, as facilitators. how we command the

 

00:48:11.190 –> 00:48:14.610

space, how we lead the space or how we feed into

 

00:48:14.610 –> 00:48:16.929

the space or, you know, whatever approach we

 

00:48:16.929 –> 00:48:19.150

take. And clearly George had one approach after

 

00:48:19.150 –> 00:48:23.329

a career of making, you know, this practice

 

00:48:23.329 –> 00:48:27.349

really famous basically. And, and then myself

 

00:48:27.349 –> 00:48:29.869

who was brand new and not only from a different

 

00:48:29.869 –> 00:48:33.019

country, but a different, you know, selection

 

00:48:33.019 –> 00:48:37.460

of demographics on top of that. So yeah, yeah,

 

00:48:37.519 –> 00:48:41.860

no, I think these are very real functions in

 

00:48:41.860 –> 00:48:43.880

the space that maybe we don’t always bring out.

 

00:48:43.960 –> 00:48:50.860

So that does bring in this ethical sort of conversation,

 

00:48:51.179 –> 00:48:53.940

right? Okay, go ahead and ask the question. I

 

00:48:53.940 –> 00:48:55.900

think that Nick can do a really good job here.

 

00:48:56.429 –> 00:48:59.530

I’m not sure that I can. Maybe you guys go for

 

00:48:59.530 –> 00:49:01.869

it. If I can bring in this Peter Schwartz stuff,

 

00:49:02.010 –> 00:49:04.550

if I find a dovetail for it that makes sense,

 

00:49:04.630 –> 00:49:07.309

I’ll go there. But it’s not the facilitator’s

 

00:49:07.309 –> 00:49:11.170

ethical problem. It’s more of a contextual ethical

 

00:49:11.170 –> 00:49:14.289

problem than it is our problem. Right. So Peter

 

00:49:14.289 –> 00:49:18.309

Schwartz says in the extended version of The

 

00:49:18.309 –> 00:49:22.110

Long View. Right. That was his book, The Art of the Long

 

00:49:22.110 –> 00:49:26.449

View. Learnings from the Long View, which came out

 

00:49:26.449 –> 00:49:29.969

in 2010, was kind of like an echo book that he

 

00:49:29.969 –> 00:49:35.409

wrote. He talks about a facilitation session

 

00:49:35.409 –> 00:49:38.530

where he’s got a gold mining company in the room

 

00:49:38.530 –> 00:49:41.469

and the CEO’s in there and all of the top brass

 

00:49:41.469 –> 00:49:44.070

are in the room and they’re building out some

 

00:49:44.070 –> 00:49:46.210

of the scenarios, but the CEO had a very kind

 

00:49:46.210 –> 00:49:51.800

of dedicated strategic direction. And

 

00:49:51.800 –> 00:49:53.800

he’s going to buy up other gold companies, and

 

00:49:53.800 –> 00:49:56.800

that’s what he’s there to do. And in

 

00:49:56.800 –> 00:49:59.659

the scenarios comes the question, what happens

 

00:49:59.659 –> 00:50:03.380

if the gold prices turn and go south? And that

 

00:50:03.380 –> 00:50:07.559

would kind of be a bad result for the strategic

 

00:50:07.559 –> 00:50:12.139

plan that he’d sealed. Anyway, after the session,

 

00:50:12.380 –> 00:50:15.179

the CEO was able to identify the people in the

 

00:50:15.179 –> 00:50:19.820

room who had different opinions about his strategy

 

00:50:19.820 –> 00:50:25.550

direction, and he fired them, right? And Peter

 

00:50:25.550 –> 00:50:27.650

Schwartz didn’t know that this was going to happen,

 

00:50:27.750 –> 00:50:30.449

right? But he did it anyway, right? And then

 

00:50:30.449 –> 00:50:33.289

lo and behold, it doesn’t take long after that.

 

00:50:33.369 –> 00:50:35.250

And then the actual prices of gold go

 

00:50:35.250 –> 00:50:38.369

south, right? And the CEO gets fired by the board.

 

00:50:38.670 –> 00:50:42.449

So, I mean, the facilitator in the

 

00:50:42.449 –> 00:50:44.989

room, Peter’s trying to create the safe space

 

00:50:44.989 –> 00:50:46.949

for the people to speak their minds and explore

 

00:50:46.949 –> 00:50:50.429

freely. That’s the irony, though. It wasn’t

 

00:50:50.429 –> 00:50:54.420

the case. It was actually being used as

 

00:50:54.420 –> 00:50:59.059

a moment where the CEO could identify the people

 

00:50:59.059 –> 00:51:05.179

who might be against his strategy. That’s a good

 

00:51:05.179 –> 00:51:07.860

one, though, because the facilitator is the one

 

00:51:07.860 –> 00:51:09.679

who’s brought in and asked to create this safe

 

00:51:09.679 –> 00:51:16.420

space. And then it can be used for purposes like

 

00:51:16.420 –> 00:51:19.400

that. I mean, that’s an ethical issue. But you

 

00:51:19.400 –> 00:51:21.519

got a lot of this. Yeah, a lot of it’s this values question.

 

00:51:21.579 –> 00:51:23.139

The other way you could go with a question like

 

00:51:23.139 –> 00:51:25.980

this is this kind of value-free exploration, where,

 

00:51:26.059 –> 00:51:29.079

look, we don’t have dogs in the fight in a sense.

 

00:51:29.159 –> 00:51:31.139

Right. So the scenarios are up on the wall and

 

00:51:31.139 –> 00:51:34.280

we try to treat them rationally. Right. Whether

 

00:51:34.280 –> 00:51:36.119

or not we want them to happen is not kind of

 

00:51:36.119 –> 00:51:38.360

the point of the exercise. It’s whether or not

 

00:51:38.360 –> 00:51:40.559

they can plausibly happen and what we should

 

00:51:40.559 –> 00:51:43.119

do if that’s the case. Right. And that’s a

 

00:51:43.119 –> 00:51:46.539

different discussion. Right. Where van der Heijden

 

00:51:46.539 –> 00:51:49.510

very much is in the movement of, we should create

 

00:51:49.510 –> 00:51:52.510

value-free scenarios, where we don’t necessarily

 

00:51:52.510 –> 00:51:57.409

take ethical positions on them. But then you’ve

 

00:51:57.409 –> 00:52:00.690

got things like Popper coming in from the back

 

00:52:00.690 –> 00:52:03.130

door saying, look, if the solutions to those

 

00:52:03.130 –> 00:52:07.130

puzzles or those scenarios are ethically challenged,

 

00:52:07.469 –> 00:52:10.349

then that’s justification for the refutation

 

00:52:10.349 –> 00:52:13.949

of that. So there’s a moral, in-context moment

 

00:52:13.949 –> 00:52:18.849

of ethical types of, say, reasons to get rid

 

00:52:18.849 –> 00:52:22.929

of maybe the unethical strategic options, right?

 

00:52:22.989 –> 00:52:25.610

In the moment, rather than judge the scenarios,

 

00:52:25.610 –> 00:52:28.469

you judge the options, right? And I don’t know

 

00:52:28.469 –> 00:52:30.110

if you want to go there with the question, right?

 

00:52:30.210 –> 00:52:31.869

Because the question, you can go in a number

 

00:52:31.869 –> 00:52:33.190

of different ways. You know what I mean? When

 

00:52:33.190 –> 00:52:38.210

you bring ethics in. In your experience, what

 

00:52:38.210 –> 00:52:41.050

are people right now, what are people not talking

 

00:52:41.050 –> 00:52:43.750

about that you think they should be talking about?

 

00:52:46.480 –> 00:52:49.219

Well, I’ll take the first swing at this one.

 

00:52:49.579 –> 00:52:54.280

And it loosely relates back to our paper, but

 

00:52:54.280 –> 00:52:56.719

not exactly. I would say that probably one of

 

00:52:56.719 –> 00:52:58.739

the most important trends that’s being discussed

 

00:52:58.739 –> 00:53:02.699

both publicly and in scholarship for strategic

 

00:53:02.699 –> 00:53:05.000

foresight and scenario planning in the entire

 

00:53:05.000 –> 00:53:09.860

future-oriented planning area is the rise of

 

00:53:09.860 –> 00:53:13.820

AI and specifically what that means for facilitators.

 

00:53:14.480 –> 00:53:16.719

If you want to talk about something that stirs

 

00:53:16.719 –> 00:53:19.219

some strong emotions amongst facilitators, it’s

 

00:53:19.219 –> 00:53:21.380

whether or not their jobs could be replaced by

 

00:53:21.380 –> 00:53:28.019

AI. And one of the themes that I seem to sense

 

00:53:28.019 –> 00:53:32.480

in this area is that a lot of scholars, and it

 

00:53:32.480 –> 00:53:34.760

turns out ourselves included in some of our earliest

 

00:53:34.760 –> 00:53:39.639

work on this topic, engaged with different AI

 

00:53:39.639 –> 00:53:44.400

tools. And the underlying current was more or

 

00:53:44.400 –> 00:53:46.460

less some version of, well, these machines will

 

00:53:46.460 –> 00:53:48.320

never be able to do what we are going to do.

 

00:53:48.380 –> 00:53:51.719

And there was a kind of low-key celebration

 

00:53:51.719 –> 00:53:54.659

of human exceptionalism in those conversations.

 

00:53:54.760 –> 00:53:56.800

But I think a lot of that needs to be returned

 

00:53:56.800 –> 00:54:01.480

to, and we need to think a lot more deeply about

 

00:54:01.480 –> 00:54:06.099

what… can be done with AI and where and when

 

00:54:06.099 –> 00:54:08.500

and really have a much more structured and strategic

 

00:54:08.500 –> 00:54:11.039

conversation, which I say, obviously, without

 

00:54:11.039 –> 00:54:15.519

irony. That said, in all of the discussion about

 

00:54:15.519 –> 00:54:18.360

AI as it pertains to strategy and planning, just

 

00:54:18.360 –> 00:54:21.119

like the broader discussions, Megan, that you

 

00:54:21.119 –> 00:54:24.920

brought up before, also almost exclusively missing

 

00:54:24.920 –> 00:54:26.960

from those discussions is any role of emotion

 

00:54:26.960 –> 00:54:31.780

and affect. And to be clear, Matt and I have

 

00:54:31.780 –> 00:54:33.940

been writing about AI. Matt and I have been writing

 

00:54:33.940 –> 00:54:37.800

about emotion and scenario planning. And I don’t

 

00:54:37.800 –> 00:54:42.219

know that even we ourselves noted that part of

 

00:54:42.219 –> 00:54:46.300

our reaction to the rise of AI is starting to

 

00:54:46.300 –> 00:54:48.380

think through some of these emotional regulation

 

00:54:48.380 –> 00:54:54.420

pieces in facilitation that I’m not sure how

 

00:54:54.420 –> 00:54:58.260

AI figures into just yet. But that’s where I

 

00:54:58.260 –> 00:55:01.179

think we should be going: trying to

 

00:55:01.179 –> 00:55:05.460

understand that piece for facilitators, for scenario

 

00:55:05.460 –> 00:55:08.320

planning related to AI and emotion. I think there’s

 

00:55:08.320 –> 00:55:14.679

some real work to be done there. Okay. And to

 

00:55:14.679 –> 00:55:18.940

you, Matt, what do you think? I think Nick said

 

00:55:18.940 –> 00:55:21.800

it the best. So thanks for having me on your

 

00:55:21.800 –> 00:55:25.099

podcast. Okay, well, I agree. Not going to take

 

00:55:25.099 –> 00:55:29.730

a swing, Matt? No. I mean, that’s some heavy

 

00:55:29.730 –> 00:55:32.050

hitting stuff. That’s the question, right? I

 

00:55:32.050 –> 00:55:33.889

think Nick did a great job there. I don’t think

 

00:55:33.889 –> 00:55:37.429

I want to add anything to that. Okay. Sorry if

 

00:55:37.429 –> 00:55:42.070

I stole your thunder. No, no, no. No thunder

 

00:55:42.070 –> 00:55:44.989

stolen. Everything else is, I mean, the question

 

00:55:44.989 –> 00:55:46.889

is, what should we be talking about? And there’s

 

00:55:46.889 –> 00:55:49.489

lots of little things that are coming in the

 

00:55:49.489 –> 00:55:54.610

pipeline, but stay tuned. Right. Okay. We will.

 

00:55:55.130 –> 00:55:58.269

Everybody stay tuned. It was great having both

 

00:55:58.269 –> 00:56:02.309

of y’all here today. I’m glad we could get through

 

00:56:02.309 –> 00:56:06.530

a bunch of the topics we had floated around ideas

 

00:56:06.530 –> 00:56:09.530

about before, but just never had the time to

 

00:56:09.530 –> 00:56:12.190

really just have a chat about them. So that’s

 

00:56:12.190 –> 00:56:16.449

great. Hopefully our audience laughed and cringed

 

00:56:16.449 –> 00:56:20.670

as much as we did. And yeah, I will see you at

 

00:56:20.670 –> 00:56:23.050

the next conference. So thank you very much.

 

00:56:24.010 –> 00:56:28.090

You know, I teach cringe. I teach cringe. I have

 

00:56:28.090 –> 00:56:31.789

a lecture on cringe and why we should be looking

 

00:56:31.789 –> 00:56:35.190

for it. Right. Very good. Thanks, Megan. Amazing.

 

00:56:35.949 –> 00:56:38.469

Scenarios for Tomorrow is produced by me, Megan

 

00:56:38.469 –> 00:56:42.230

Crawford, with invaluable feedback from Dr. Isabella

 

00:56:42.230 –> 00:56:45.750

Riza, Jeremy Creep, Brian Eggo, and as always,

 

00:56:45.949 –> 00:56:49.340

my kids. This is a production of the Futures

 

00:56:49.340 –> 00:56:52.139

and Analytics Research Hub and Pharr Lab affiliated

 

00:56:52.139 –> 00:56:55.239

with Edinburgh Napier Business School. You can

 

00:56:55.239 –> 00:56:57.699

find show notes, references, and transcripts

 

00:56:57.699 –> 00:57:02.940

at scenarios.pharrhub.org. That’s

 

00:57:02.940 –> 00:57:06.449

scenarios.pharrhub.org. You can follow us across social

 

00:57:06.449 –> 00:57:09.289

media by searching for scenario futures, all

 

00:57:09.289 –> 00:57:12.250

one word. You can subscribe to Scenarios for

 

00:57:12.250 –> 00:57:14.050

Tomorrow wherever you listen to your podcasts.

 

00:57:14.590 –> 00:57:17.769

Today’s track was composed by Rocket, whose links

 

00:57:17.769 –> 00:57:21.130

are provided in the show notes. This is Scenarios

 

00:57:21.130 –> 00:57:23.690

for Tomorrow, where tomorrow’s headlines start

 

00:57:23.690 –> 00:57:25.030

as today’s thought experiments.

00:00:00.000 –> 00:00:02.680

We have a proverb in the Jimba, actually, in

00:00:02.680 –> 00:00:05.980

my mother tongue. Well, father tongue, technically.

00:00:06.780 –> 00:00:12.759

That literally means, if you eat a lot, you shit

00:00:12.759 –> 00:00:15.320

a lot. I’m not kidding you. That’s really what

00:00:15.320 –> 00:00:18.079

the proverb is about. That’s what my parents

00:00:18.079 –> 00:00:20.500

used to explain how the tax system functions.

00:00:21.320 –> 00:00:26.100

Now, joke aside. Welcome to Scenarios for Tomorrow,

00:00:26.260 –> 00:00:28.379

a podcast where we turn tomorrow’s headlines

00:00:28.379 –> 00:00:31.320

into today’s thought experiments. This first

00:00:31.320 –> 00:00:33.700

series includes conversations with the authors

00:00:33.700 –> 00:00:37.000

of our latest book, Improving and Enhancing Scenario

00:00:37.000 –> 00:00:40.119

Planning, Futures Thinking Volume, from Edward

00:00:40.119 –> 00:00:43.880

Elgar Publishing. I’m your host, Dr. Megan Crawford,

00:00:44.020 –> 00:00:46.100

and throughout this first series, you’ll hear

00:00:46.100 –> 00:00:48.840

from my guests the numerous global techniques

00:00:48.840 –> 00:00:51.539

for practicing and advancing scenario planning.

00:00:51.700 –> 00:01:04.469

Enjoy! Kwamu Eva Fankaa is the head of the African

00:01:04.469 –> 00:01:08.469

Center of Expertise and co-runs the Decolonial

00:01:08.469 –> 00:01:10.650

Comparative Law Project at the Max Planck Institute

00:01:10.650 –> 00:01:13.450

for Comparative and International Private Law in

00:01:13.450 –> 00:01:16.780

Germany. She previously worked as the Africa

00:01:16.780 –> 00:01:19.840

Coordinator for Futures Literacy at UNESCO for

00:01:19.840 –> 00:01:23.459

four years. She has also organized her own practice

00:01:23.459 –> 00:01:27.620

as a head futurist for such international organizations

00:01:27.620 –> 00:01:32.840

as UNICEF Innocenti, the OECD, and United Nations

00:01:32.840 –> 00:01:35.540

Guiding Principles on Business and Human Rights,

00:01:35.739 –> 00:01:39.280

and with other universities as well. She is also

00:01:39.280 –> 00:01:42.120

directly invested in artistic interdisciplinary

00:01:42.120 –> 00:01:47.900

projects such as the Lagos and Yaoundé Biennials and

00:01:47.900 –> 00:01:51.519

the Taipei Arts Festival. The main inquiry behind

00:01:51.519 –> 00:01:54.980

her work has been the question, how to allow

00:01:54.980 –> 00:01:58.040

space for the negotiation of meaning to deepen

00:01:58.040 –> 00:02:01.620

conversations. Welcome, Kwamu. It’s great to

00:02:01.620 –> 00:02:06.609

have you here. Thank you for having me. Our circles

00:02:06.609 –> 00:02:08.710

have crossed a few times over the last couple

00:02:08.710 –> 00:02:12.449

of years, but this is really the only, I think

00:02:12.449 –> 00:02:13.969

it might be the first official time we’ve had

00:02:13.969 –> 00:02:16.469

a chance to just sit, but the second chance we’ve

00:02:16.469 –> 00:02:19.889

seen each other on a computer screen, to be honest.

00:02:19.969 –> 00:02:23.460

So I really appreciate it. Your work across the

00:02:23.460 –> 00:02:27.479

globe has impacted so many industries and disciplines.

00:02:27.699 –> 00:02:31.719

It’s incredible. And I’m excited that you are

00:02:31.719 –> 00:02:34.180

joining our audience today and that I get to

00:02:34.180 –> 00:02:36.759

join your audience today to learn more about

00:02:36.759 –> 00:02:39.979

your work and perspectives on the field of futures

00:02:39.979 –> 00:02:43.310

and foresight more broadly. As mentioned in the

00:02:43.310 –> 00:02:45.349

introduction, we just published a book together

00:02:45.349 –> 00:02:47.590

about scenario planning in the 21st century.

00:02:47.750 –> 00:02:52.289

And today, almost at the middle of 2025, we’re

00:02:52.289 –> 00:02:55.090

here to talk a bit about that. We understand

00:02:55.090 –> 00:02:57.270

that not all our listeners are familiar with

00:02:57.270 –> 00:03:00.449

scenario planning, though many may have been

00:03:00.449 –> 00:03:03.169

introduced to it a bit during the pandemic when

00:03:03.169 –> 00:03:07.590

our jobs became extremely popular. But one of

00:03:07.590 –> 00:03:10.199

the motivations of this podcast is to bring

00:03:10.199 –> 00:03:12.860

our world of futures and foresight science outside

00:03:12.860 –> 00:03:16.620

the walls of academia where we largely work and

00:03:16.620 –> 00:03:20.000

where language is closely controlled, understandably

00:03:20.000 –> 00:03:24.300

so, but knowledge is not as easy to access as

00:03:24.300 –> 00:03:26.919

we generally wish it to be and we assume it to

00:03:26.919 –> 00:03:30.560

be quite often, which just means we, the two

00:03:30.560 –> 00:03:32.159

of us, are here to have a chat with the public

00:03:32.159 –> 00:03:35.930

today. Your chapter in our book is titled Reframing

00:03:35.930 –> 00:03:38.969

and Futures Literacy, Tackling the Poverty of

00:03:38.969 –> 00:03:42.370

the Modern Imagination. So let’s get right into

00:03:42.370 –> 00:03:50.469

it. Thanks. I will do two things. I will first

00:03:50.469 –> 00:03:55.770

explain what anticipatory assumptions are, and

00:03:55.770 –> 00:03:59.930

then I will talk about how we can play with them.

00:04:00.750 –> 00:04:05.830

And so to define anticipatory assumptions, I

00:04:05.830 –> 00:04:08.430

will say that those are basically our entry points

00:04:08.430 –> 00:04:13.229

to thinking about the future. So what does it

00:04:13.229 –> 00:04:17.269

mean concretely? If you are being asked to think

00:04:17.269 –> 00:04:21.089

about what the world would look like in 2030

00:04:21.089 –> 00:04:25.230

or let’s be crazy in 2060, the first thing that

00:04:25.230 –> 00:04:28.079

you will do is obviously, well, you’ll think

00:04:28.079 –> 00:04:30.000

about whether you’ll be there to begin with.

00:04:30.120 –> 00:04:31.939

So, you know, what will be your potential

00:04:31.939 –> 00:04:35.339

perspective on this. But also, you can then

00:04:35.339 –> 00:04:38.060

retrieve any information that is available to

00:04:38.060 –> 00:04:41.480

you: maybe grandmother stories, maybe what you

00:04:41.480 –> 00:04:45.680

read in the news or heard from a neighbor or

00:04:45.680 –> 00:04:48.139

what you were taught at school or what you have

00:04:48.139 –> 00:04:51.300

discussed with colleagues recently and this will

00:04:51.300 –> 00:04:54.800

be the type of data that you will then try to

00:04:54.800 –> 00:04:59.829

project onto the timeframe that you wish to go

00:04:59.829 –> 00:05:03.870

to. And this is a very normal process. For any

00:05:03.870 –> 00:05:06.670

conversation that you have with people, you always

00:05:06.670 –> 00:05:10.870

have to rely on something that’s the essence

00:05:10.870 –> 00:05:13.949

of it. So to talk about anticipatory assumptions

00:05:13.949 –> 00:05:18.310

is not to shame people for having them. It’s

00:05:18.310 –> 00:05:22.230

about people being aware that those exist, and

00:05:22.230 –> 00:05:25.170

therefore that those can tell us a lot about

00:05:25.170 –> 00:05:29.889

how people think and what I found to be really

00:05:29.889 –> 00:05:32.389

exciting about anticipatory assumptions, which

00:05:32.389 –> 00:05:35.889

has been at the core of what my futures work

00:05:35.889 –> 00:05:40.829

has been about is how we can actually be put

00:05:40.829 –> 00:05:44.329

in a position where we can question the reasons

00:05:44.329 –> 00:05:48.509

why we do things, but we can also connect and

00:05:48.509 –> 00:05:53.579

relate more to the people in the room. Oftentimes,

00:05:53.579 –> 00:05:57.959

even if you think about conflicts, the minor conflicts,

00:05:57.959 –> 00:06:01.720

those originate from both having assumptions

00:06:01.720 –> 00:06:07.199

and not communicating about them. And all of that

00:06:07.199 –> 00:06:10.839

can be, to a certain extent, mitigated. And so if

00:06:10.839 –> 00:06:13.980

you think about how being part of society,

00:06:13.980 –> 00:06:16.079

being part of a group, or being in a relationship

00:06:16.079 –> 00:06:20.410

is about imagining the future together, it’s important

00:06:20.410 –> 00:06:23.430

to do the work of not necessarily assuming that

00:06:23.430 –> 00:06:25.470

we have the same idea of what the future is about.

00:06:26.910 –> 00:06:31.829

And even when we may have a similar idea, there’s

00:06:31.829 –> 00:06:33.870

also moments where we might need to question

00:06:33.870 –> 00:06:38.069

why those ideas are similar. And that could also

00:06:38.069 –> 00:06:43.029

be the problem in itself. And so earlier, many

00:06:43.029 –> 00:06:48.600

were referring to… anticipatory assumptions

00:06:48.600 –> 00:06:53.259

and how those can be similar to mental models

00:06:53.259 –> 00:06:57.100

that were discussed in the scenario planning

00:06:57.100 –> 00:07:02.120

field, and that

00:07:02.120 –> 00:07:04.180

is true, there is a connection between the two.

00:07:04.180 –> 00:07:11.779

I would see a difference, that is, that there’s

00:07:11.779 –> 00:07:14.620

a playfulness with anticipatory assumptions that

00:07:14.620 –> 00:07:18.459

allows us to be kind to ourselves. And I think

00:07:18.459 –> 00:07:22.199

that is something that is indeed needed when

00:07:22.199 –> 00:07:26.779

we produce science. And so that allows me to

00:07:26.779 –> 00:07:29.360

explain a little bit how we play with anticipatory

00:07:29.360 –> 00:07:33.399

assumptions. So in the article, I do refer to

00:07:33.399 –> 00:07:36.839

two use cases, where we were playing with the

00:07:36.839 –> 00:07:39.720

future of waste. But another one that is also

00:07:39.720 –> 00:07:42.540

a heavy topic on matters such as the future of

00:07:42.540 –> 00:07:47.480

racism. And in both contexts, those have practical,

00:07:47.839 –> 00:07:52.439

economic, emotional, political implications.

00:07:52.740 –> 00:07:56.579

So one may seem lighter than the other, but at

00:07:56.579 –> 00:07:59.519

the end of the day, both have quite severe implications

00:07:59.519 –> 00:08:04.600

for many people. And because the future is actually

00:08:04.600 –> 00:08:07.360

a very serious matter, it’s also sometimes difficult

00:08:07.360 –> 00:08:10.220

to see how playful we can actually be with the

00:08:10.220 –> 00:08:14.120

future. Because, you know, we are accountable,

00:08:14.379 –> 00:08:16.980

we have a duty to hold, and that’s really important.

00:08:17.360 –> 00:08:21.139

I just don’t believe that having a sense of duty

00:08:21.139 –> 00:08:24.720

or responsibility prevents us from seeking ways

00:08:24.720 –> 00:08:29.399

to connect playfully and in a smart fashion as

00:08:29.399 –> 00:08:34.379

well. For me, the advantage of anticipatory assumptions

00:08:34.379 –> 00:08:41.379

is the possibility to ask ourselves the question

00:08:41.379 –> 00:08:45.700

as to why? Where does it come from? And to really

00:08:45.700 –> 00:08:50.039

do some type of both personal and political inquiry

00:08:50.039 –> 00:08:53.779

and put them both together. I’ll start with the

00:08:53.779 –> 00:08:58.750

future of race. We discussed how we were looking

00:08:58.750 –> 00:09:01.490

at the future of waste with different participants

00:09:01.490 –> 00:09:07.149

from Central and Eastern Europe. Now, the matter

00:09:07.149 –> 00:09:11.009

of waste is that quite evidently, if we think

00:09:11.009 –> 00:09:13.549

about the connotation of the word itself, not

00:09:13.549 –> 00:09:16.470

even thinking about waste studies and scientific

00:09:16.470 –> 00:09:20.950

discourses on waste management, waste is usually…

00:09:21.909 –> 00:09:25.409

well, connotated with this idea that it’s something

00:09:25.409 –> 00:09:31.669

that is useless. And if I say that, you’re going

00:09:31.669 –> 00:09:33.789

to actually believe, well, that’s pretty straightforward.

00:09:33.950 –> 00:09:38.330

Yes. Thank you for coming. But it has implications

00:09:38.330 –> 00:09:42.250

as to how we deal with it. Whenever something

00:09:42.250 –> 00:09:45.330

is connotated as something that is negative or

00:09:45.330 –> 00:09:49.389

useless, the policies that result from that usually

00:09:49.389 –> 00:09:52.299

reflect that as well. And here you can easily

00:09:52.299 –> 00:09:55.440

connect that with some security matters or even

00:09:55.440 –> 00:09:57.899

matters of migration or conversation that goes

00:09:57.899 –> 00:10:00.159

in that direction, where the connotations that we have

00:10:00.159 –> 00:10:04.519

of people or things tend to make us act in a

00:10:04.519 –> 00:10:07.639

particular way. And yet again, that sounds like

00:10:07.639 –> 00:10:12.379

common sense. But we realize that common sense

00:10:12.379 –> 00:10:16.139

is not always shared by everyone, but also that

00:10:16.139 –> 00:10:19.059

it has more implications than what we think.

00:10:20.110 –> 00:10:27.269

So, on waste. One of the examples that was given

00:10:27.269 –> 00:10:30.889

was how when we ask people to think about 2050

00:10:30.889 –> 00:10:35.769

and the future of waste, we get things such as

00:10:35.769 –> 00:10:41.370

the importance of strong waste management for

00:10:41.370 –> 00:10:47.250

public health matters, or the fact that in a

00:10:47.250 –> 00:10:51.259

desire to be part of a more sustainable economy,

00:10:51.539 –> 00:10:54.659

there’s a need for waste to either be recycled

00:10:54.659 –> 00:10:59.919

or to be eradicated. Immediately, those types

00:10:59.919 –> 00:11:02.519

of behaviors that are induced by a particular

00:11:02.519 –> 00:11:07.019

assumption of what waste is for, or preventing

00:11:07.019 –> 00:11:11.480

some conversations from happening, automatically

00:11:11.480 –> 00:11:15.320

what you hear from those different policies is

00:11:15.320 –> 00:11:18.289

that waste is a problem, and a problem that needs

00:11:18.289 –> 00:11:22.450

to be fixed. So you’re being put in a position

00:11:22.450 –> 00:11:25.350

that does not necessarily allow us to think of

00:11:25.350 –> 00:11:28.169

waste as something that is quite natural in the

00:11:28.169 –> 00:11:31.350

overall ecosystem. Something that is natural in

00:11:31.350 –> 00:11:33.590

an ecosystem is not something that needs to be

00:11:33.590 –> 00:11:36.129

fixed; something that needs to be fixed is an

00:11:36.129 –> 00:11:40.889

error, and that’s different from a system. Um,

00:11:40.889 –> 00:11:44.190

and so that very small assumption that I told

00:11:44.190 –> 00:11:47.710

you about, that sounded, obviously, very reasonable.

00:11:47.909 –> 00:11:51.070

Yeah, waste is useless. Waste is something that

00:11:51.070 –> 00:11:54.669

we do not want in our lives. It allows us to

00:11:54.669 –> 00:11:57.110

immediately understand the type of policy measures

00:11:57.110 –> 00:12:00.370

or the type of behaviors that we have vis-à-vis

00:12:00.370 –> 00:12:04.710

something as small as waste. And that was

00:12:04.710 –> 00:12:08.029

just for waste. Imagine what it means for anything

00:12:08.029 –> 00:12:10.029

else when we’re doing things on the future of

00:12:10.029 –> 00:12:12.090

learning, when we’re doing things for the future

00:12:12.090 –> 00:12:14.590

of security, when we’re doing things on the future

00:12:14.590 –> 00:12:19.490

of… technology, if you go to those topics that

00:12:19.490 –> 00:12:25.090

appear to be larger, obviously the type of assumptions

00:12:25.090 –> 00:12:29.470

that are underlying will be just as large. And

00:12:29.470 –> 00:12:31.830

that’s the moment where we do need to have this

00:12:31.830 –> 00:12:33.629

type of conversation that allows us to actually

00:12:33.629 –> 00:12:36.269

see, well, let’s imagine that waste is actually

00:12:36.269 –> 00:12:39.549

the system itself. And so if waste is a system,

00:12:39.649 –> 00:12:42.090

you cannot solve it. And waste is like, not only

00:12:42.090 –> 00:12:44.529

everywhere, so it’s not waste as a variable,

00:12:45.070 –> 00:12:47.809

but we’re actually in a waste society. And so

00:12:47.809 –> 00:12:50.730

the way we interact with one another is basically

00:12:50.730 –> 00:12:54.330

um, the same way as what we do and how we deal

00:12:54.330 –> 00:12:58.809

with waste. Or what does it mean for a situation where

00:12:58.809 –> 00:13:01.629

we actually see that waste is wealth? Actually,

00:13:01.629 –> 00:13:05.230

there’s a good example from, um, so I’m Cameroonian,

00:13:05.230 –> 00:13:11.090

um, and, uh, my group of affiliation is called venerate

00:13:11.090 –> 00:13:14.509

king, and there are different Bamileke kingdoms.

00:13:14.509 –> 00:13:19.210

Now, one of them, a bit further north from

00:13:19.210 –> 00:13:23.570

where I come from, was known to have a king who

00:13:23.570 –> 00:13:29.929

would have a pile of waste and the reason for

00:13:29.929 –> 00:13:37.909

that is because waste was the symbol of accumulated

00:13:37.909 –> 00:13:42.870

wealth. And so from that perspective that some

00:13:42.870 –> 00:13:44.950

have been using before, you know, with this idea

00:13:44.950 –> 00:13:47.789

that waste can actually be a form of wealth. And

00:13:47.789 –> 00:13:51.250

if you go for a capitalist understanding of waste

00:13:51.250 –> 00:13:54.990

as well, it might just then lead you to just

00:13:54.990 –> 00:14:00.450

trying to accumulate waste and then recycle that

00:14:00.450 –> 00:14:03.570

waste, which does not necessarily put you in

00:14:03.570 –> 00:14:05.690

a sustainable practice and just puts you in a

00:14:05.690 –> 00:14:09.480

very accumulative, accumulation-based type

00:14:09.480 –> 00:14:14.279

of system. But by actually shifting around, what

00:14:14.279 –> 00:14:16.279

could be the different examples? So is it waste

00:14:16.279 –> 00:14:19.519

as wealth? Is it waste as the overall paradigm?

00:14:20.019 –> 00:14:24.940

Is it about waste suddenly not even being managed?

00:14:25.320 –> 00:14:28.580

So waste is not even a variable that we actually

00:14:28.580 –> 00:14:31.120

want to care about. What type of society are

00:14:31.120 –> 00:14:33.659

we producing every time? And so suddenly you

00:14:33.659 –> 00:14:36.299

realize that by going back to the assumption,

00:14:37.070 –> 00:14:39.549

you can actually produce different types of behaviors

00:14:39.549 –> 00:14:42.330

or policies. Or sometimes you produce the same,

00:14:42.549 –> 00:14:45.809

and then you get to understand why. And I think,

00:14:45.830 –> 00:14:49.269

especially in the society that we live in, I’m

00:14:49.269 –> 00:14:51.710

not going to go back even to what’s going on

00:14:51.710 –> 00:14:54.409

in the world, so you all have your own idea of

00:14:54.409 –> 00:14:59.690

what that means. It’s important to have tools

00:14:59.690 –> 00:15:02.490

that allow you to relate to other people and

00:15:02.490 –> 00:15:06.250

to understand yourself as a person better, and

00:15:06.250 –> 00:15:08.389

to understand your context, how it influences

00:15:08.389 –> 00:15:11.350

you, how you may have gotten manipulated in one

00:15:11.350 –> 00:15:13.669

way or another. And to what extent it’s not always

00:15:13.669 –> 00:15:16.169

a problem because we all get manipulated somehow,

00:15:16.590 –> 00:15:20.009

but are you able to play around with what you

00:15:20.009 –> 00:15:24.350

were taking for granted is really the type of

00:15:24.350 –> 00:15:28.750

skill that I wish we could develop more. Something

00:15:28.750 –> 00:15:33.029

that came up in my mind, and I don’t know if

00:15:33.029 –> 00:15:35.500

I would have ever experienced this had I stayed

00:15:35.500 –> 00:15:38.759

in my hometown or my home state for all of my

00:15:38.759 –> 00:15:42.799

life. But moving out of the country, I jokingly

00:15:42.799 –> 00:15:45.720

call it like a constant exercise in having my

00:15:45.720 –> 00:15:48.740

assumptions challenged. And it’ll be small assumptions.

00:15:48.940 –> 00:15:51.200

Like I thought the sidewalk was supposed to always

00:15:51.200 –> 00:15:53.360

look like this. I had no idea, you know, like

00:15:53.360 –> 00:15:55.379

little stuff like that. But then it can get very

00:15:55.379 –> 00:15:56.779

big, like exactly what you’re talking about.

00:15:56.820 –> 00:15:58.659

I would call waste management probably one of

00:15:58.659 –> 00:16:00.860

the biggest things and important things on this

00:16:00.860 –> 00:16:03.519

planet because every society collapses the second

00:16:03.519 –> 00:16:08.169

waste management collapses or ceases. But I digress,

00:16:08.169 –> 00:16:11.490

right? So it was even being aware that I had

00:16:11.490 –> 00:16:14.330

assumptions, just that. I think that’s one of

00:16:14.330 –> 00:16:18.330

the biggest tasks for people in our field who

00:16:18.330 –> 00:16:23.169

are being brought in to other groups who are

00:16:23.169 –> 00:16:26.169

hoping that, you know, we could use our expertise.

00:16:26.309 –> 00:16:29.789

The first thing is trying to convince them that

00:16:29.789 –> 00:16:33.049

they have assumptions, and then what those assumptions

00:16:33.049 –> 00:16:36.450

are, as you’re saying, when it comes to exercises

00:16:36.450 –> 00:16:39.490

of, yeah, well, in futures, it’s anticipation.

00:16:39.549 –> 00:16:41.830

We’re always looking towards the future. It’s

00:16:41.830 –> 00:16:45.350

literally in the title. But yeah, I had no idea

00:16:45.350 –> 00:16:48.029

how that was even a thing until experiencing

00:16:48.029 –> 00:16:51.690

it, that people could just be blind, like hard

00:16:51.690 –> 00:16:54.750

blind to their own assumptions. And most of the

00:16:54.750 –> 00:16:56.789

time that is benign. You know, most of the time

00:16:56.789 –> 00:17:00.230

it’s just a goofy moment, but it can scale to

00:17:00.230 –> 00:17:04.579

much larger. So what you’re saying is one of

00:17:04.579 –> 00:17:06.460

the techniques, one of the efforts that helps

00:17:06.460 –> 00:17:11.440

to challenge those assumptions is a reframing

00:17:11.440 –> 00:17:17.539

of reality, of imagination, of the assumptions

00:17:17.539 –> 00:17:21.380

of a variable. So waste was used as a variable

00:17:21.380 –> 00:17:25.309

there. That’s a tried and true method in psychology,

00:17:25.309 –> 00:17:28.690

isn’t it, and behavioral economics? It’s, um, connected

00:17:28.690 –> 00:17:31.549

with priming, right? You prime them with a message

00:17:31.549 –> 00:17:38.509

or a view of the world, and then you, um, have reframed.

00:17:38.509 –> 00:17:41.670

You see how that can help you with what you’re

00:17:41.670 –> 00:17:48.289

hoping. Yeah, and that’s, you know, you see this

00:17:48.289 –> 00:17:50.990

used in media. Media is probably one of the most

00:17:50.990 –> 00:17:57.980

common um what would you say, I wouldn’t say

00:17:57.980 –> 00:18:01.460

beneficiaries, but users of this priming, reframing

00:18:01.460 –> 00:18:09.220

method on the public scale. So with all that,

00:18:09.319 –> 00:18:11.279

I don’t want to step too far into your time,

00:18:11.380 –> 00:18:14.980

but with all that, you bring in this idea that

00:18:14.980 –> 00:18:18.480

is, I mean, a lot of people have talked about it, but

00:18:18.480 –> 00:18:20.200

not a lot of people have given a lot of definition

00:18:20.200 –> 00:18:24.700

to it. And it’s this, what does it take to reframe?

00:18:24.759 –> 00:18:30.599

What does it take to, and I really wanted to

00:18:30.599 –> 00:18:33.680

get back to your playful idea of imagining alternative

00:18:33.680 –> 00:18:39.799

futures and stuff. And it’s the variety that

00:18:39.799 –> 00:18:42.299

we end up being around. I’m trying to think of

00:18:42.299 –> 00:18:44.619

the word. You bring in this idea of collective

00:18:44.619 –> 00:18:49.039

intelligence, right? And that’s through a sort

00:18:49.039 –> 00:18:54.160

of participatory method, which can mean any number

00:18:54.160 –> 00:18:57.559

of things, right? But clearly an action -based

00:18:57.559 –> 00:19:05.180

effort of changing these assumptions, challenging

00:19:05.180 –> 00:19:07.920

however it is. I’m going to pass this back to

00:19:07.920 –> 00:19:11.240

you. If you could walk us through your ideas

00:19:11.240 –> 00:19:14.599

of that collective intelligence method. So on

00:19:14.599 –> 00:19:18.299

the playfulness and maybe also going back to

00:19:18.299 –> 00:19:20.960

the technique, just to get a better sense of

00:19:20.960 –> 00:19:24.180

how does it work? Because really, hopefully by

00:19:24.180 –> 00:19:28.420

now we understand what and why. Why do we do

00:19:28.420 –> 00:19:33.039

this work of trying to find out more about our relationship

00:19:33.039 –> 00:19:36.680

to power? One thing that actually I’d like to

00:19:36.680 –> 00:19:41.089

say. And actually, it will be part of the reason

00:19:41.089 –> 00:19:45.230

why we had issues finding time for this session. I will

00:19:45.230 –> 00:19:48.369

have my first exhibition at the end of this week,

00:19:48.569 –> 00:19:53.829

where we do work on technomagics. And basically,

00:19:53.890 –> 00:19:58.930

the work at CoDesign was on the reproduction

00:19:58.930 –> 00:20:02.470

of Unreal Times. And I think that connects very

00:20:02.470 –> 00:20:04.529

nicely with what we’re discussing right now.

00:20:04.849 –> 00:20:07.569

And where is that going to be, by the way? In

00:20:07.569 –> 00:20:14.210

Freiburg, in southern Germany. And, well, a bit of promotion

00:20:14.210 –> 00:20:16.470

on this one: it will be from May until July

00:20:16.470 –> 00:20:21.369

2025 in Freiburg at IVEC, or Galerie für Gegenwart,

00:20:21.509 –> 00:20:25.910

so the gallery for the present in Germany. And

00:20:25.910 –> 00:20:30.630

as part of this work, so we’re kind of interrogating

00:20:30.630 –> 00:20:32.589

the connection between technology and magic.

00:20:36.009 –> 00:20:38.049

What I found to be really interesting when we

00:20:38.049 –> 00:20:41.529

talked about technology or magic is that there’s

00:20:41.529 –> 00:20:46.210

some things that feel real and others that just

00:20:46.210 –> 00:20:51.109

don’t. And usually, for some reason, technology

00:20:51.109 –> 00:20:56.210

appears to be more real than magic. And the first

00:20:56.210 –> 00:20:58.589

thing that I think was important for me to say

00:20:58.589 –> 00:21:04.400

was that what feels real has less to

00:21:04.400 –> 00:21:07.779

do with what is actually probable and more to

00:21:07.779 –> 00:21:13.799

do with what we are told to feel or what we are

00:21:13.799 –> 00:21:17.859

allowed to feel. So, technically speaking, this

00:21:17.859 –> 00:21:21.660

idea that technology is a very tangible thing,

00:21:21.660 –> 00:21:25.160

even though, if I ask anyone, “What do you see as

00:21:25.160 –> 00:21:27.160

the future of technology?”, nobody’s going to think

00:21:27.160 –> 00:21:30.579

about the people who are behind technology such

00:21:30.579 –> 00:21:35.329

as the cobalt miners in the Democratic Republic

00:21:35.329 –> 00:21:38.130

of the Congo or the manufacturers in Vietnam

00:21:38.130 –> 00:21:40.890

or in China. They’re going to think about the

00:21:40.890 –> 00:21:44.289

users, even though I would say those making the

00:21:44.289 –> 00:21:49.890

tools should be the most tangible aspect of the

00:21:49.890 –> 00:21:53.529

matter. But somehow that aspect does not really

00:21:53.529 –> 00:21:56.869

feel real. But the idea of having smart houses

00:21:56.869 –> 00:22:01.640

feels very real. In a way, magic will also feel

00:22:05.079 –> 00:22:07.660

less real because we don’t see the process. But

00:22:07.660 –> 00:22:10.400

for tech, we don’t see the process either. And

00:22:10.400 –> 00:22:13.799

so this ability to see what’s real and what’s

00:22:13.799 –> 00:22:15.880

not has very much to do with power. And so, for

00:22:15.880 –> 00:22:18.380

any conversation where we want to connect with

00:22:18.380 –> 00:22:21.220

other people, starting from the position of just,

00:22:21.220 –> 00:22:25.200

you know, even if you cannot name where power

00:22:25.200 –> 00:22:27.180

comes from, you’re still able to feel it, and that

00:22:27.180 –> 00:22:29.980

is enough to have a conversation with other people,

00:22:29.980 –> 00:22:31.980

because we’re not just here to blame people. We

00:22:29.980 –> 00:22:31.980

just want to be able to better understand ourselves.

00:22:32.640 –> 00:22:36.039

And that is a political move already, not political

00:22:36.039 –> 00:22:38.700

in the sense of a partisan move. I’m not asking

00:22:38.700 –> 00:22:40.980

who you’re voting for, but just political in

00:22:40.980 –> 00:22:44.079

terms of committing to being part of society.

00:22:44.640 –> 00:22:47.359

And I think that’s something that we do need.

00:22:47.819 –> 00:22:50.660

And if you’re committing to being part of society,

00:22:50.900 –> 00:22:54.160

you’re committing to putting a bit of yourself

00:22:54.160 –> 00:22:57.700

to society. And so that’s where collective intelligence

00:22:57.700 –> 00:23:01.049

actually plays a role. I remember recently a

00:23:01.049 –> 00:23:04.829

conversation with an Egyptian colleague, well,

00:23:04.829 –> 00:23:07.630

an Ethiopian colleague, rather,

00:23:07.630 –> 00:23:11.250

and a German colleague, and we’re all based in

00:23:11.250 –> 00:23:15.470

Germany. Now for the first time our Ethiopian

00:23:15.470 –> 00:23:18.890

colleague was hearing that we needed to have

00:23:18.890 –> 00:23:25.529

insurance for moments when, if we basically

00:23:27.019 –> 00:23:30.220

cause any damage to the property of somebody

00:23:30.220 –> 00:23:33.980

else, that would be covered by the insurance, which would

00:23:33.980 –> 00:23:39.819

basically compensate the other party. And he was

00:23:39.819 –> 00:23:46.200

really surprised, if I can put that in milder

00:23:46.200 –> 00:23:50.920

terms, by that fact. It was like, well, why do we

00:23:50.920 –> 00:23:54.660

need to be insured for this type of damage? Wouldn’t

00:23:54.660 –> 00:23:58.099

it be just good to have a conversation with one

00:23:58.099 –> 00:24:03.579

another? And, you know, it was interesting because

00:24:03.579 –> 00:24:07.200

as he raised that question, suddenly we had to

00:24:07.200 –> 00:24:10.339

think about what our insurance is for. And it

00:24:10.339 –> 00:24:13.000

leads us to other conversations such as, you

00:24:13.000 –> 00:24:15.559

know, the fact that we choose to hide ourselves

00:24:15.559 –> 00:24:18.930

behind laws in order not to talk to anyone.

00:24:18.930 –> 00:24:21.250

You can just say, well, that’s what’s written in

00:24:21.250 –> 00:24:26.710

the civil code, please apply article X. And

00:24:26.710 –> 00:24:30.069

so the point is not necessarily

00:24:30.069 –> 00:24:34.190

to say that you should not have insurance,

00:24:34.190 –> 00:24:37.109

especially if there’s an insurance company

00:24:37.109 –> 00:24:39.609

listening to us and then suddenly blocking this

00:24:39.609 –> 00:24:43.440

podcast. But the point is just to better understand

00:24:43.440 –> 00:24:45.440

why we actually came up with an insurance-based

00:24:45.440 –> 00:24:48.119

system in Germany. And then you actually

00:24:48.119 –> 00:24:50.839

go back to how the culture functions and the

00:24:50.839 –> 00:24:55.220

type of precautions and how risk-averse people

00:24:55.220 –> 00:24:57.180

can be, et cetera, et cetera. So you actually

00:24:57.180 –> 00:24:59.140

are given the opportunity to better understand

00:24:59.140 –> 00:25:02.880

how you function. But it’s easier done when you

00:25:02.880 –> 00:25:05.619

have somebody who is not from the system telling

00:25:05.619 –> 00:25:07.900

you about this or somebody who is from the system,

00:25:07.960 –> 00:25:10.960

but not looking at the system the same way you

00:25:10.960 –> 00:25:14.910

do. And so having the opportunity to just relate

00:25:14.910 –> 00:25:18.369

better is usually just a way to understand yourself

00:25:18.369 –> 00:25:21.789

better. And so for anticipatory assumptions,

00:25:22.450 –> 00:25:28.190

the way we go about it is that we’re not just

00:25:28.190 –> 00:25:30.309

going for anticipatory assumptions. The point

00:25:30.309 –> 00:25:32.549

is not just to have a list of anticipatory assumptions.

00:25:32.710 –> 00:25:35.309

Like how many biases did we have in the room?

00:25:35.390 –> 00:25:37.690

And then be happy because we counted, I don’t

00:25:37.690 –> 00:25:43.019

know, 16 or 64. The point is to say, okay,

00:25:43.019 –> 00:25:46.380

we’re all coming together. There is an objective

00:25:46.380 –> 00:25:48.559

that we have. We want to discuss the future of

00:25:48.559 –> 00:25:50.359

waste, or we want to discuss the future of racism

00:25:50.359 –> 00:25:54.759

in the context of a particular organization that

00:25:54.759 –> 00:25:58.059

targets something in particular. We want to

00:25:58.059 –> 00:26:04.789

make sure that people of African descent anywhere

00:26:04.789 –> 00:26:07.809

in the world can actually feel like they’re part

00:26:07.809 –> 00:26:10.930

of a country that understands their history

00:26:10.930 –> 00:26:14.329

and in which they feel respected. Okay, that’s

00:26:14.329 –> 00:26:18.950

our goal. Now the question is, what are we currently

00:26:18.950 –> 00:26:21.750

thinking about the world? How are we looking

00:26:21.750 –> 00:26:25.130

at what we’re doing from the perspective of what

00:26:25.130 –> 00:26:30.380

will happen in the future? So, okay, in 2060 I see

00:26:30.380 –> 00:26:35.079

how history will be taught in history books or

00:26:35.079 –> 00:26:41.039

how it’s okay to have courses on how to do your

00:26:41.039 –> 00:26:46.400

hair in a classroom. Okay, usually the stories

00:26:46.400 –> 00:26:49.460

that you’re going to hear are based on things

00:26:49.460 –> 00:26:53.519

that people have experienced very recently. So

00:26:53.519 –> 00:26:56.539

people discussing hair politics may be due to

00:26:56.539 –> 00:26:58.609

the fact that you know they realized that they

00:26:58.609 –> 00:27:00.950

needed to pay 100 euros to get their hair done.

00:27:01.190 –> 00:27:03.529

And so that’s the topic that came up the next

00:27:03.529 –> 00:27:07.029

day. The same way when we organized the Futures

00:27:07.029 –> 00:27:13.250

UTC Live, so a futures workshop on energy with

00:27:13.250 –> 00:27:16.869

people who were based in Western Europe in February

00:27:16.869 –> 00:27:22.549

2022, which also happens to be when the war between

00:27:22.549 –> 00:27:25.500

Ukraine and Russia started. Well, the matter

00:27:25.500 –> 00:27:30.059

of energy self-sufficiency and security came

00:27:30.059 –> 00:27:32.660

up quite often, even though we were talking about

00:27:32.660 –> 00:27:37.000

the future of the role of manufacturing, which

00:27:37.000 –> 00:27:40.819

can be related to energy, but it’s not really

00:27:40.819 –> 00:27:44.660

the focus. But somehow it was on everybody’s

00:27:44.660 –> 00:27:48.019

minds. So that’s what we discussed. And you see

00:27:48.019 –> 00:27:50.440

how using the future can just be a way to talk

00:27:50.440 –> 00:27:53.519

about the fears, anxieties, or associated sources

00:27:53.519 –> 00:27:56.720

of excitement that we have at the moment when

00:27:56.720 –> 00:28:00.700

the activity is organized. So first, we’re united

00:28:00.700 –> 00:28:05.339

by a common sense of purpose, and we allow contemporary

00:28:05.339 –> 00:28:09.480

matters, immediate matters, to also be there, because

00:28:09.480 –> 00:28:11.779

they’re already there, so we might as well welcome

00:28:11.779 –> 00:28:15.779

them. Now, we know that there’s a big elephant

00:28:15.779 –> 00:28:19.119

in the room, or several big elephants in the room,

00:28:19.180 –> 00:28:22.119

which are those anticipatory assumptions. So

00:28:22.119 –> 00:28:26.220

the type of data, or data in the broad sense:

00:28:26.220 –> 00:28:29.140

it’s not only statistics, but any pieces

00:28:29.140 –> 00:28:32.920

of information that we mobilize, coming

00:28:32.920 –> 00:28:35.660

from the past, coming from the present, coming

00:28:35.660 –> 00:28:38.440

from our understanding of the past and the present

00:28:38.440 –> 00:28:40.960

and what we believe other people in the room

00:28:40.960 –> 00:28:45.230

are ready to hear. And we bring all of that to

00:28:45.230 –> 00:28:48.230

the future. Okay, what is blocking us in that

00:28:48.230 –> 00:28:51.150

process? Are there things that we hear, associations

00:28:51.150 –> 00:28:54.269

of ideas that we’re making that are preventing

00:28:54.269 –> 00:28:58.589

us from seeing what is the matter at hand? I

00:28:58.589 –> 00:29:01.410

usually like to frame it under association

00:29:01.410 –> 00:29:04.329

of ideas because that’s usually easier. So, you

00:29:04.329 –> 00:29:07.849

know, waste and usefulness, to reuse the example

00:29:07.849 –> 00:29:11.700

from before. Waste and wealth. So what are

00:29:11.700 –> 00:29:14.740

the connections? If you think about racism, what

00:29:14.740 –> 00:29:17.299

do we usually connect that with, based on what

00:29:17.299 –> 00:29:19.740

people are saying? Well, if they’re talking about

00:29:19.740 –> 00:29:23.259

history books, they usually make a connection

00:29:23.259 –> 00:29:27.420

between racism and slavery, so that there are

00:29:27.420 –> 00:29:31.259

particular episodes in history that are responsible

00:29:31.259 –> 00:29:35.440

for the way people look at one another, and

00:29:35.440 –> 00:29:38.539

so we want to document those processes. And then

00:29:38.539 –> 00:29:42.160

we realize by connecting racism and history that

00:29:42.160 –> 00:29:46.299

we actually usually frame racism as only being

00:29:46.299 –> 00:29:49.059

a matter of awareness. It’s because people don’t

00:29:49.059 –> 00:29:53.200

know that they do the things that they do. Which

00:29:53.200 –> 00:29:57.579

could be true. Could also not be true. And so

00:29:57.579 –> 00:30:00.000

it’s then interesting to think about those different

00:30:00.000 –> 00:30:03.319

layers. Because maybe if you want to talk to

00:30:03.319 –> 00:30:07.339

particular people, that layer won’t work. And

00:30:07.339 –> 00:30:09.500

so you have to think about other types of layers.

00:30:09.759 –> 00:30:12.339

So the idea is to, yet again, think about this

00:30:12.339 –> 00:30:16.019

exercise as, you know, letting you go explore

00:30:16.019 –> 00:30:20.039

what is possible. Bear in mind that, once again,

00:30:20.119 –> 00:30:24.759

I insist on the political nature of this work,

00:30:24.839 –> 00:30:28.339

not to make it all heavy because, you know, sometimes

00:30:28.339 –> 00:30:30.900

we’re afraid. We hear “political” and we want to

00:30:30.900 –> 00:30:36.180

run away because it sounds scary, like many things.

00:30:36.700 –> 00:30:38.500

Unfortunately, I don’t believe that we can

00:30:38.500 –> 00:30:41.640

avoid it. Many people’s lives are political. Just

00:30:41.640 –> 00:30:44.319

the choice to have a family or not to have one

00:30:44.319 –> 00:30:46.180

is a political one, or just the fact that you

00:30:46.180 –> 00:30:48.460

don’t have a choice is also a political matter.

00:30:48.460 –> 00:30:51.420

The fact that you go to school or they don’t

00:30:51.420 –> 00:30:56.759

go to school, the fact that you choose to talk

00:30:56.759 –> 00:30:59.440

to particular people, all of those different matters

00:30:59.440 –> 00:31:03.019

are heavily political. Even who you love is political.

00:31:03.019 –> 00:31:07.039

I was going to say, my state has now made just

00:31:07.039 –> 00:31:10.299

existing as certain types of people a political

00:31:10.299 –> 00:31:15.920

issue. So it’s difficult to avoid it, and

00:31:15.920 –> 00:31:18.859

that’s why I use that term, not to scare

00:31:18.859 –> 00:31:22.740

people away, but just to contextualize the work

00:31:22.740 –> 00:31:25.579

that we do. And so when it’s a matter of talking

00:31:25.579 –> 00:31:27.440

about the future, talking about the way society

00:31:27.440 –> 00:31:31.160

is organized or can be organized, obviously this

00:31:31.160 –> 00:31:34.599

is the definition of what is political. And so

00:31:35.099 –> 00:31:39.920

we can play around with topics that don’t

00:31:39.920 –> 00:31:41.940

necessarily sound very political and suddenly

00:31:41.940 –> 00:31:44.599

turn out to be. I did things on the future of

00:31:44.599 –> 00:31:49.440

bread, and we had a lovely conversation, not necessarily

00:31:49.440 –> 00:31:52.940

with bakers, where people ended up talking about

00:31:52.940 –> 00:31:56.039

how they don’t want to see bread anymore, because

00:31:56.039 –> 00:32:00.099

wheat has been taking over any type of conversation,

00:32:00.099 –> 00:32:02.700

and so they want to actually talk about other

00:32:02.700 –> 00:32:06.319

ingredients or other cereals. And then that’s

00:32:06.319 –> 00:32:08.559

obviously very much connected to food distribution

00:32:08.559 –> 00:32:16.019

systems, who gets access to the agribusiness industry.

00:32:16.400 –> 00:32:19.180

So all of those matters you realize, you were

00:32:19.180 –> 00:32:21.799

just talking about baguettes and you ended up

00:32:21.799 –> 00:32:25.460

talking about how our economic systems are built.

00:32:26.099 –> 00:32:30.880

And so from the perspective of techniques, we

00:32:30.880 –> 00:32:34.880

always bear that in mind. That being said, we do

00:32:34.880 –> 00:32:38.940

this work in smaller groups, and so this is then

00:32:38.940 –> 00:32:41.700

the opportunity to just have a conversation with

00:32:41.700 –> 00:32:44.660

that little group, bearing in mind that we belong

00:32:44.660 –> 00:32:51.720

to a larger world. Well, okay, so with that said,

00:32:51.720 –> 00:32:55.240

which is an incredible amount of wealth there

00:32:55.240 –> 00:32:59.640

of information and ways of looking at how, as

00:32:59.640 –> 00:33:03.819

individuals and when we identify as members of

00:33:03.819 –> 00:33:07.460

communities, we can apply these concepts of futures

00:33:07.460 –> 00:33:11.500

literacy and reframing techniques in daily life.

00:33:11.500 –> 00:33:13.960

You know, because that’s one of the

00:33:13.960 –> 00:33:16.599

big questions, right? How do we take this on,

00:33:16.599 –> 00:33:20.259

or how do we help others take this on? So with

00:33:20.259 –> 00:33:23.500

that in mind, and across this whole conversation,

00:33:23.500 –> 00:33:26.740

I have one final question for you. And this is just

00:33:26.740 –> 00:33:29.000

you, this is just about, from your view of the

00:33:29.000 –> 00:33:31.220

world, what you’re seeing, maybe in your profession.

00:33:31.220 –> 00:33:37.160

What are you seeing that people are not

00:33:37.160 –> 00:33:39.920

talking about that you think they should be talking

00:33:39.920 –> 00:33:48.480

about? I think, in general, people are trapped by

00:33:48.480 –> 00:33:53.470

a certain sense of hype whose origins they don’t

00:33:53.470 –> 00:34:00.549

even know. And then some of the most basic conversations

00:34:00.549 –> 00:34:05.950

just do not take place. I’ve already referred

00:34:05.950 –> 00:34:08.489

to what that means for technology. So the fact

00:34:08.489 –> 00:34:12.210

that we focus on the end users or how amazing

00:34:12.210 –> 00:34:14.409

it is to do all the things that we thought were

00:34:14.409 –> 00:34:18.750

impossible to do and suddenly those seem to be

00:34:18.750 –> 00:34:21.889

feasible and we stop talking about it. But wait

00:34:21.889 –> 00:34:24.650

a minute, how do we produce those things? What

00:34:24.650 –> 00:34:28.969

economy is actually required to sustain whatever

00:34:28.969 –> 00:34:31.769

we need to do? So, you know, where do minerals

00:34:31.769 –> 00:34:36.590

come from? How are they negotiated or not negotiated?

00:34:38.190 –> 00:34:41.969

Who has to sacrifice their daily lives for that

00:34:41.969 –> 00:34:46.550

to happen? And at what cost? Are those costs

00:34:46.550 –> 00:34:51.860

that I’m ready to bear, ecologically? Those

00:34:51.860 –> 00:34:54.099

types of questions are just never asked. You’re

00:34:54.099 –> 00:34:56.739

just asked, you know, to choose between

00:34:56.739 –> 00:35:02.179

two smartphones. That’s not a choice, if I may.

00:35:02.360 –> 00:35:06.219

Like, usually, with the way we describe our post-capitalist

00:35:06.219 –> 00:35:08.380

societies, or whatever terminology

00:35:08.380 –> 00:35:11.179

we want to use, we believe that we live in societies

00:35:11.179 –> 00:35:14.739

of choice. And usually, for example, the examples

00:35:14.739 –> 00:35:18.610

that we have of… right at the end of the Cold

00:35:18.610 –> 00:35:23.710

War, when people describe Soviet countries. One

00:35:23.710 –> 00:35:26.349

of the examples that comes up in these types of movies

00:35:26.349 –> 00:35:29.429

or books or testimonies that we have from people

00:35:29.429 –> 00:35:32.769

is how suddenly they went from the monopoly of

00:35:32.769 –> 00:35:36.889

the state on particular brands to having a plethora,

00:35:37.230 –> 00:35:41.809

like just a set of different brands, formats,

00:35:41.949 –> 00:35:46.610

sizes and everything. And so we have this kind

00:35:46.610 –> 00:35:49.070

of assumption that we live in a society where

00:35:49.070 –> 00:35:54.650

there’s an abundance of choice. And I do want

00:35:54.650 –> 00:35:58.730

to actually challenge that notion, which was

00:35:58.730 –> 00:36:01.210

what this idea of poverty of the imagination

00:36:01.210 –> 00:36:04.610

has been all about, which is that it’s not because

00:36:04.610 –> 00:36:06.230

you have plenty of brands in your supermarket

00:36:06.230 –> 00:36:09.369

that you’re very rich in terms of the type of

00:36:09.369 –> 00:36:13.300

choices that you can afford to make. Not only

00:36:13.300 –> 00:36:15.440

because you cannot necessarily afford all of

00:36:15.440 –> 00:36:18.300

the products that are in the supermarket, but

00:36:18.300 –> 00:36:22.219

also because there are some initial choices that

00:36:22.219 –> 00:36:25.079

you were not even allowed to make. You’re being

00:36:25.079 –> 00:36:29.139

exposed to those final end choices and not the

00:36:29.139 –> 00:36:34.500

very basic ones. And being able to reclaim those

00:36:34.500 –> 00:36:39.860

basic conversations is for me a source of wealth.

00:36:40.320 –> 00:36:43.139

That is for me a source of abundance, and that

00:36:45.019 –> 00:36:48.460

is something that we can definitely tap into,

00:36:48.460 –> 00:36:51.480

because that’s where the bread and the money are, like,

00:36:51.480 –> 00:36:54.460

that’s where real things happen. Like, I

00:36:54.460 –> 00:36:57.500

remember doing things on land law and who gets

00:36:57.500 –> 00:36:59.760

access to land, and somebody telling me, well, thank

00:36:59.760 –> 00:37:02.659

you for raising this conversation, because I realized

00:36:59.760 –> 00:37:02.659

we’re doing lots of work on urban structures

00:37:02.659 –> 00:37:05.500

and how cities should be organized in a way that

00:37:05.500 –> 00:37:08.780

accounts for, well, smart cities. So, you know,

00:37:14.239 –> 00:37:19.039

how tech can support houses, thinking

00:37:19.039 –> 00:37:25.380

about how, you know, we can have buildings

00:37:25.380 –> 00:37:29.380

that can more easily face disasters,

00:37:29.380 –> 00:37:32.179

like natural disasters, which is of course an

00:37:32.179 –> 00:37:35.320

important take. But there is this very idea that we could

00:37:32.179 –> 00:37:35.320

have architecture that responds to our needs.

00:37:35.559 –> 00:37:38.940

So, you know, what do we use a house for? Well,

00:37:39.099 –> 00:37:42.119

my house is about, I like to actually produce

00:37:42.119 –> 00:37:44.599

my own fruit if I can, but I’m not that good

00:37:44.599 –> 00:37:46.880

at it, so I’ll just do maybe the tomatoes, but

00:37:46.880 –> 00:37:48.699

actually tomatoes are hard, so it’s a bad example.

00:37:49.380 –> 00:37:53.179

But, you know, okay, basil. Or I want to use

00:37:53.179 –> 00:37:57.760

my house to welcome people, because I usually

00:37:57.760 –> 00:38:00.610

have family around at least once a month. Okay,

00:38:00.829 –> 00:38:03.130

so what part is actually more important to you?

00:38:03.210 –> 00:38:06.070

Is it the living room? Is it having more bedrooms?

00:38:06.650 –> 00:38:10.289

How do people just like sit around? This sounds

00:38:10.289 –> 00:38:13.429

like a very silly matter, but it’s actually,

00:38:13.469 –> 00:38:16.929

I think, a practical conversation that most architects

00:38:16.929 –> 00:38:21.630

should have. And I believe and trust based on

00:38:21.630 –> 00:38:25.449

some of the traditional architectural work that

00:38:25.449 –> 00:38:27.090

many architects are actually interested in this

00:38:27.090 –> 00:38:29.679

conversation and do hold that space. But you

00:38:29.679 –> 00:38:31.820

see that when you start with very basic items,

00:38:32.119 –> 00:38:37.260

those more dominant matters come up. The same

00:38:37.260 –> 00:38:39.099

thing as the waste conversation. It’s something

00:38:39.099 –> 00:38:42.019

that may sound silly, but you realize that there’s

00:38:42.019 –> 00:38:46.619

a series of significant policies and behaviors

00:38:46.619 –> 00:38:52.119

that are directed by this way of looking at the

00:38:52.119 –> 00:38:55.800

world. So I would start with the method first

00:38:55.800 –> 00:38:58.079

in terms of… What are people not talking about?

00:38:58.260 –> 00:39:00.239

Well, it would depend on whatever the matter is.

00:39:01.099 –> 00:39:05.800

But I would say the generic problem that we have

00:39:05.800 –> 00:39:09.539

comes from that. And then it affects all of the

00:39:09.539 –> 00:39:12.440

subject matters. It affects the way we think

00:39:12.440 –> 00:39:15.840

about migration in Lebanon and we just think

00:39:15.840 –> 00:39:18.320

about sending people back to where they come

00:39:18.320 –> 00:39:20.519

from without actually interrogating the type

00:39:20.519 –> 00:39:22.679

of needs that they have. How do they organize

00:39:22.679 –> 00:39:25.320

their daily lives, what are their aspirations

00:39:25.320 –> 00:39:27.440

for the future, what do they want for their families,

00:39:27.440 –> 00:39:30.360

whether they protect the idea of a dynasty for

00:39:30.360 –> 00:39:33.619

the family. We start from there. You get answers

00:39:33.619 –> 00:39:35.840

that may be very different from what you might

00:39:35.840 –> 00:39:39.019

expect, or similar for reasons that you did not

00:39:39.019 –> 00:39:43.519

suspect. It’s true for waste, it’s true for agriculture,

00:39:43.519 –> 00:39:46.860

it’s true for the way we organize our educational

00:39:46.860 –> 00:39:51.420

systems. But for sure, let’s start with very simple

00:39:51.420 –> 00:39:53.920

practical matters that bring us to all of those

00:39:53.920 –> 00:39:58.000

abstract and meta conversations that are, of

00:39:58.000 –> 00:40:02.760

course, deeply needed. Okay. Thank you so much.

00:40:02.860 –> 00:40:06.500

I’m going to do my part to listen to this interview

00:40:06.500 –> 00:40:10.760

again and see how I can bring that into the practice,

00:40:10.860 –> 00:40:12.559

because that’s what this is about, right? These

00:40:12.559 –> 00:40:14.739

conversations about learning from each other,

00:40:14.820 –> 00:40:18.349

as well as hopefully spreading more knowledge

00:40:18.349 –> 00:40:20.650

about what we do. So thank you so much. That

00:40:20.650 –> 00:40:25.610

was really, really just for me, that was an emotional

00:40:25.610 –> 00:40:31.110

rollercoaster of a ride. So, very much, I

00:40:31.110 –> 00:40:35.090

appreciate you coming today. Thank you for that.

00:40:35.670 –> 00:40:38.250

Scenarios for Tomorrow is produced by me, Megan

00:40:38.250 –> 00:40:41.989

Crawford, with invaluable feedback from Dr. Isabella

00:40:41.989 –> 00:40:45.550

Riza, Jeremy Creep, Brian Eggo, and as always,

00:40:45.730 –> 00:40:49.570

my kids. This is a production of the Futures

00:40:49.570 –> 00:40:52.389

and Analytics Research Hub and Pharr Lab affiliated

00:40:52.389 –> 00:40:55.469

with Edinburgh Napier Business School. You can

00:40:55.469 –> 00:40:57.929

find show notes, references, and transcripts

00:40:57.929 –> 00:41:03.230

at scenarios.pharrhub.org. That’s scenarios

00:41:03.230 –> 00:41:06.710

.pharrhub.org. You can follow us across social

00:41:06.710 –> 00:41:09.530

media by searching for Scenario Futures, all

00:41:09.530 –> 00:41:12.489

one word. You can subscribe to Scenarios for

00:41:12.489 –> 00:41:14.309

Tomorrow wherever you listen to your podcasts.

00:41:15.130 –> 00:41:17.989

Today’s track was composed by Rockot, whose links

00:41:17.989 –> 00:41:21.269

are provided in the show notes. This is Scenarios

00:41:21.269 –> 00:41:23.929

for Tomorrow, where tomorrow’s headlines start

00:41:23.929 –> 00:41:25.289

as today’s thought experiments.

Select episode references:

Kwamou Eva Feukeu

TEDx talk (English)

TEDx talk (Français)

Today’s track “Experimental Cinematic Hip-Hop” was composed by @Rockot

00:00:00 Megan

With lectures, I realise I say “so” a lot: so we’re doing this, so we’re doing that. So I’m trying to stay away from that, and of course…

00:00:08 Shardul

So here so.

[both laughing]

00:00:11 Megan

Welcome to Scenarios for Tomorrow, a podcast where we turn tomorrow’s headlines into today’s thought experiments. This first series includes conversations with the authors of our latest

00:00:21 Megan

book, Improving and Enhancing Scenario Planning, Futures Thinking Volume, from Edward Elgar Publishing. I’m your host, Dr Megan Crawford, and throughout this first series you’ll hear from my guests the numerous global techniques for practising and advancing scenario planning. Enjoy.

00:00:39 Music


00:00:48 Megan

Shardul Phadnis is an associate professor of operations and supply chain management at the Asian School of Business in Kuala Lumpur, Malaysia. Shardul explores the intersection of supply chains and strategic management, specifically how scenario planning influences

00:01:07 Megan

the adaptability of supply chain configurations and how organisations create value by orchestrating supply chain operations.

00:01:15 Megan

His 2022 book, Strategic Planning for Dynamic Supply Chains: Preparing for Uncertainty Using Scenarios, describes first-hand accounts of applications of scenario planning for strategic supply chain planning in three in-depth cases involving businesses and government planning agencies.

00:01:35 Megan

He received the 2015 Giarratani Rising Star Award from the Industry Studies Association for his research on apparel supply chains. Welcome, Shardul.

00:01:47 Shardul

Thank you, Megan, and thanks for that wonderful introduction.

00:01:50 Megan

Yes. Well, I honestly enjoyed being able to read your book. What was it, I think it was two years ago? And getting a chance to write, yeah, getting a chance to write a reply for it. That was really exciting, because…

00:02:00 Shardul

Almost three years ago, yes.

00:02:05 Megan

I’ve never worked

00:02:06 Megan

scenario planning and supply chain together, but they,

00:02:10 Megan

what we’ll be talking about

00:02:11 Megan

today, how well they go

00:02:12 Megan

together.

00:02:13 Shardul

Absolutely, yep. And you wrote the review for that as well. It was a very nice review that came out in, I think, Futures & Foresight Science, didn’t it?

00:02:22 Megan

Yeah, it was again a really great one. Everybody should read both of them, but it’s great to finally get a chance to sit down with you one-on-one, because I think the bulk of our chats for years have been relegated to emails and possibly just one conference that we saw each other in passing.

00:02:42 Megan

I’m not sure

00:02:43 Megan

if we’ve even made the same conferences

00:02:45 Megan

for years,

00:02:47 Megan

which seems quite extraordinary, doesn’t it, given the

00:02:50 Megan

size of our field.

00:02:52 Megan

Exactly. So here we are.

00:02:54 Shardul

Very small community. I mean, it seems like the people whose papers we read, we kind of know most of those people, and many of those people I haven’t even met in person, especially since we have been living in Kuala Lumpur for almost 10 years, and the community seems to be very

00:03:11 Shardul

much focused heavily in Europe, I would say, like where you are, and parts of the United States, but not so much

00:03:20 Shardul

in Asia, actually.

00:03:22 Megan

Yeah. And we’ll be talking about that a little bit, which is why I was really glad that you joined our book. Your perspectives were something I really, really wanted to get into this particular book. So now we finally get to deep dive right into our little-understood world of scenario planning

00:03:42 Megan

and foresight science. And we get to do it at our own pace, which is nice.

00:03:48 Megan

So as mentioned in the introduction, we’ve just published a book together about scenario planning in the 21st century, and it was nearly exactly two years in the making. In fact, I think our first interview together on this was February two years ago, 2023.

00:04:07 Shardul

Oh wow, two years ago.

00:04:08 Megan

And right, I was looking into that.

00:04:11 Megan

And today, at the quarter-century mark, 2025, we’re here to talk a bit about our joint work, but particularly your chapter. We understand that not all of our listeners are familiar with scenario planning, though many may have heard more about it since the pandemic,

00:04:32 Megan

when our jobs got really popular. And one of the motivations for this podcast is to bring our world of futures and foresight science

00:04:41 Megan

outside the walls of academia, where the language is very closely controlled for understandable reasons; that’s just the nature of science communication, and knowledge is not as easy to access as we generally wish it to be. So we’re here to have

00:04:58 Megan

a chat with the public.

00:05:00 Shardul

That’s great, that’s

00:05:01 Shardul

great. I think it is really something,

00:05:05 Shardul

a very important thing to do, to bring out this really critical process of scenario planning. And as I think about it, as we talk about it, it seems that scenario planning is getting more and more important for the world that we live in now. We are swimming in all kinds of uncertainties these days, right? Think about the trade wars and the tariffs,

00:05:26 Shardul

the issues that are going on, the geopolitical

00:05:29 Shardul

tensions. But also now, even beyond that, the whole thing with AI, which is why your book is so timely, scenario planning for the 21st century.

00:05:39 Shardul

We don’t understand how it’s going to affect organisations or, even more broadly, society.

00:05:47 Shardul

And that’s where scenario planning can be really helpful.

00:05:50 Shardul

And also, the other thing: net zero, the whole of environmental sustainability.

00:05:56 Shardul

Companies are struggling to embark on the net-zero journey successfully, and one of the biggest stumbling blocks there is really the uncertainty; they’re struggling to figure out how to make long-term investment decisions under that uncertainty.

00:06:14 Shardul

So it is

00:06:14 Shardul

absolutely the perfect timing for you and Josh to come up with this book.

00:06:19 Shardul

And I think it’s very timely that we are doing this podcast as well.

00:06:25 Megan

Great. I’m glad you and I agree. So you touched upon a lot of points there that I really, really wanted to get into, some of them being what we even mean by uncertainty. It’s a word thrown around a lot, like innovation, and everybody has an opinion on what it means; every business

00:06:46 Megan

has a specific focus on a realm of uncertainty or innovation, as it were. And so, yeah, it would be cool to see if we can…

00:06:57 Megan

Your chapter in particular gives some very concrete examples of this.

00:07:02 Megan

And let me go ahead and introduce your chapter. Your contribution to this book was titled, or is titled, Evaluating Effects of Scenario Planning: Lessons from Medical Research. Your chapter is the only one that brings in the medical field as

00:07:22 Megan

a focus for the practise of scenario planning, so I wanted to open with that. In particular, let’s just start with what motivated you to explore the effects of scenario planning through the lens of medical research methodologies.

00:07:40 Shardul

OK, so let me, I think it is worth

00:07:42 Shardul

giving some background.

00:07:43 Shardul

So I’m an engineer, so when I started applying scenario planning in the context of supply chain strategy and supply chain management.

00:07:54 Shardul

My orientation was not so

00:07:57 Shardul

descriptive, to see how this actually works in the companies that apply it,

00:08:01 Shardul

right? But it was a fairly strong prescriptive orientation, to say, look, the executives, supply chain executives specifically, that we are dealing with, that we are working with,

00:08:14 Shardul

are dealing with some extremely challenging situations. They are focused on the day-to-day job, but they also have to think about what kind of,

00:08:25 Shardul

let’s say, supply chain infrastructure, and what I mean by that is factories, distribution networks, distribution centres, investing in fleets of vehicles,

00:08:35 Shardul

making long-term partnerships with suppliers, and so on and so forth. These decisions often go beyond a few quarters. They last. Now you have to think about the next five years, 10 years; in some projects that we have done for freight infrastructure, you have to think about the next 30 years.

00:08:55 Shardul

So that’s the kind of

00:08:57 Shardul

application context I was working with.

00:09:00 Shardul

And then, in that context,

00:09:03 Shardul

it is a prescriptive orientation in the sense that you are bringing in scenario planning as a decision-making aid, something that’s going to help you overcome the limitations that you currently have, limitations of existing decision-making processes.

00:09:21 Shardul

So by nature, we need to show that it actually works.

00:09:25 Shardul

So that was kind of one motivation, to really see, is it beneficial to use scenario planning? That was one question that I had during my doctoral studies, and in my dissertation I have one paper that does use field experiments to answer that question partly.

00:09:45 Shardul

But also, the thing is that the…

00:09:50 Megan

I was just thinking, I was going to say, that is a very, very real and salient issue in the field of scenario planning: is it effective? If it is, how do we show that it’s effective? And I think that was the entrance for a lot of us into our doctoral studies.

00:10:09 Megan

It’s really fascinating to find out that you had the same motivation, considering we entered the field very differently and in very different places, but we have the same motivation there, and others in our field

00:10:24 Megan

as well. But go ahead. I’m sorry.

00:10:26

Yeah.

00:10:27 Shardul

Yeah. So, you know, when you think about scenario planning, there are some questions that

00:10:35 Shardul

sometimes arise. Maybe not everybody thinks about them. So just to give you some examples, let’s just say you are thinking for the long term and there is enough uncertainty, in the sense that you cannot really predict or foresee how things might evolve in the future.

00:10:56 Shardul

But the investments that you’re making today, let’s say you’re deciding to build a factory,

00:11:01 Shardul

and you want to know, should I build that factory in

00:11:03 Shardul

the United States,

00:11:05 Shardul

or Canada, or Mexico? Or should I build it in Asia? Or should I not even build a factory at all, but outsource my production to a contract manufacturer? When you’re making decisions of that nature, you have to think about

00:11:21 Shardul

the next 5 to 10 years, because if I start building a factory, it’s going to take a couple of years to build it.

00:11:27 Shardul

And once you build it,

00:11:29 Shardul

you’re not going to close it down six months later, thinking that, oops, I made a mistake in deciding to build this factory. You’re going to operate it for three, five years, so you have to think about that long-term planning horizon.

00:11:42 Shardul

And that’s why we bring in scenario planning, right?

00:11:45 Shardul

But then you think about it in terms of application areas.

00:11:50 Shardul

So, this process of scenario planning: is it equally useful for, say, a new startup that’s thinking about the next two to three years, versus, say, a multinational corporation that could be using scenario planning for its corporate strategy, or

00:12:07 Shardul

operations or supply chain strategy, which might have a planning horizon of five, seven, or ten years,

00:12:12 Shardul

or, let’s say, freight infrastructure investments, such as when the Department of Transportation is building highways and investing in ports and rail lines and so on; they’re thinking 20 to 30 years out.

00:12:26 Shardul

So it’s the same process. Is the same process equally useful? Does context matter? That’s number one. Second, if it is, then should you create the scenarios and apply them in the same way in all three cases?

00:12:39 Shardul

Or should there be variations?

00:12:42 Shardul

Then another thing that we talk about is, it’s not a one-time use of scenarios; you want to use them on an ongoing basis.

00:12:50 Shardul

So what are the pros and cons of that?

00:12:54 Shardul

And if you want to use it on an ongoing basis, how frequently should you use it? Do you reuse it every three months, once a year, once a month? What’s the right frequency?

00:13:03 Shardul

And then, especially in my field of operations and supply chain management, the executives

00:13:12 Shardul

are often consumed by issues of a short-term nature.

00:13:17 Shardul

Did we meet the target for this month? Are we meeting

00:13:21 Shardul

the quarterly target?

00:13:23 Shardul

And if the same executives are thinking using scenarios that go 5 or 10 years out,

00:13:28 Shardul

does that hinder their ability to think for the short term, which is also equally important, right? And then there is the whole, you know, emergence of AI in the last two or three years, where we are thinking about what decisions could be left to AI and what could be made by executives using tools like scenario planning.

00:13:48 Shardul

So there are all these questions, and

00:13:50 Shardul

I don’t think in our field we have a

00:13:55 Shardul

very scientifically valid answer to these questions.

00:14:00 Shardul

Right. And that’s where this question of evaluation comes in: how should we answer these questions? What are the different methods for doing that? That was the whole motivation behind my doctoral research, as well as your research, as you said.

00:14:15 Shardul

And the reason, to

00:14:19 Shardul

make a long story short, the reason for bringing in medicine is that,

00:14:25 Shardul

when I was doing my doctoral studies, I had a professor from mechanical engineering, Dan Frey at MIT. He was on my doctoral committee.

00:14:36 Shardul

And he had written a paper that borrowed medical research methods for evaluating design methods.

00:14:44 Shardul

with a very similar prescriptive orientation.

00:14:48 Shardul

And that kind of motivated this, and I saw, look, there are parallels between medicine and scenario planning.

00:14:55 Shardul

At least in my application, scenario planning is prescriptive, just like

00:15:00 Shardul

medicine. Just like in medicine, you cannot just try any unproven treatment on a human being;

00:15:07 Shardul

there is a very systematic way of doing that.

00:15:11 Shardul

Now, in scenario planning, you cannot just go and test these things on a corporation, because there are huge implications for the success or the failure of

00:15:20 Shardul

it. So there are ethical issues that are common to both medicine and scenario planning. It’s a prescriptive orientation, and that’s why I think, by borrowing the ideas from medicine, which has a very established way of

00:15:36 Shardul

evaluating new treatments, we can learn from that in scenario planning. And that’s what this chapter tries to do.

00:15:44 Megan

Oh, OK. So yeah, a lot of people, when they first saw this, thought it was the other way around, that it was using scenario planning to advance

00:15:54 Megan

medical questions, problems, you know, risks, things like that. Where you’re saying it was the other way around: it was learning from the well-established and robust system that is medicine.

00:16:11 Shardul

Mm-hmm.

00:16:12 Shardul

Absolutely, yeah, yeah.

00:16:14 Shardul

And that’s what we do in this book chapter: we motivate the need for learning from medicine and justify why we can do that. But then we talk about medical research methods first, and in medicine there are

00:16:30 Shardul

about 13 methods that we cover in this book chapter, and they fall into four different categories. They go from really basic, fundamental research, to clinical trials, then observational studies, and then epidemiological studies, right? And then

00:16:50 Shardul

we say, OK, how does medicine

00:16:54 Shardul

practise these different kinds of research methods to evaluate new medical treatments? And then we analogously say, OK, what can we learn from that for scenario planning, and how can we design research methods analogously, by learning from medical research methods?

00:17:14 Megan

Well, what are some of the answers you found or observations?

00:17:19 Shardul

OK, so maybe it will be useful just to give a very brief overview of medical research methods, because not everybody may be familiar with them.

00:17:32 Megan

So I’m not sure here.

00:17:34 Shardul

Yeah. I mean, neither was I until I started looking into this angle. And as I got deeper

00:17:40 Shardul

into it,

00:17:42 Shardul

I was actually quite impressed by how methodical the whole process is.

00:17:47 Shardul

It all starts with, say, basic or fundamental research.

00:17:51 Shardul

And there are about four different categories of basic research. There is theoretical research, right, when you are looking at the fundamental sciences like biology, microbiology, and so on, and you’re using the theoretical knowledge to develop new hypotheses.

00:18:11 Shardul

So that’s one thing. Second, after that, is what are called in vitro studies.

00:18:18 Shardul

That’s where researchers will take samples of cells and tissues and try things on those. So if there is a new molecule that they want to try out, they can just try it in a test tube. That’s what in vitro means, right? So it’s a very small scale.


00:18:35 Shardul

But what works at the cell or tissue level may not work for the whole, entire animal.

00:18:42 Shardul

So the third level is called in vivo studies, where medicine uses animal models, right? Because now you are not dealing with just one kind of cell, but with about 400 different types of cells, different types of tissues, and so on, and you see how this treatment works in the whole body. And then, lastly, there is computer modelling, which is called

00:19:02 Shardul

in silico studies, right? So computer modelling, simulation, and so on. So that is all kind of the

00:19:09 Shardul

theoretical

00:19:10 Shardul

research. You can establish causality because you are working in a very controlled environment.

00:19:17 Shardul

But to establish kind of external validity is difficult because you’re still in the lab.

00:19:24 Shardul

So the second

00:19:25 Shardul

group of medical studies is what is known as clinical trials.

00:19:31 Shardul

Some of these are done before a treatment is approved and released, and there’s one that is done afterwards. It starts with phase one, where you’re not even working with patients, people who are suffering from a condition, but with healthy adults.

00:19:49 Shardul

And you’re just trying to see what the right dosage is, what the safe dosage is, with a very small group of healthy individuals.

00:19:56 Shardul

Once it passes that stage, the medicine goes to the second phase of clinical trials; then you are working with a small group of people who are living with that condition.

00:20:08 Shardul

And we find out, OK, is this effective? Does

00:20:10 Shardul

it really help these people?

00:20:12 Shardul

And if that passes, then you go to the third stage of clinical trials, where you’re looking at a much larger study. This is where you might have a control group, you will have a double-blind study, and so on. What medicine is trying to decide here is, is this new treatment just as safe and just as effective as the current alternative

00:20:32 Shardul

that we have?

00:20:34 Shardul

And if it passes, then the drug gets released. But then there’s the stage 4 clinical trial, which is, you don’t stop there.

00:20:42 Shardul

They will continue evaluating to see how the drug is working in the longer term, not just in the few weeks that you had for the phase 1, 2, or 3 clinical trials.

00:20:53 Shardul

That is a very methodical way of doing clinical trials. And then there are observational studies.

00:20:59 Shardul

So observational studies come in when you cannot prescribe something.

00:21:04 Shardul

It could be too dangerous, or not ethical, to

00:21:08 Shardul

prescribe that treatment. Or it could be, let’s say, if you want to see the effect of living near a chemical plant, you cannot ask

00:21:19 Shardul

people to go live near a chemical plant, right? But you are using secondary data, kind of retrospectively, to see how it affected people who lived near that chemical plant, or what was the effect of radiation, or of having lead in the paint, and things like that.

00:21:35 Shardul

It’s kind of a secondary-data, retrospective

00:21:40 Shardul

study, which is observational.

00:21:40 Shardul

And there’s a fourth category of study that is called epidemiological, and even there, there are about three or four categories.

00:21:50 Shardul

One is just ecological studies, where medical researchers are trying to see how prevalent a certain condition is among a certain group of people,

00:22:01 Shardul

maybe by ethnicity, by geography, and so on. It’s just observational; you’re not looking at any individuals, you’re looking at a large, aggregate group of people. Then there are cross-sectional studies, which are similar to ecological ones, but you also collect some individual-level data to see

00:22:21 Shardul

whether any individual attributes affect this.

00:22:25 Shardul

Then there are case-control studies. There you say, OK, well, let’s take a group of, you know, you’re choosing people based on the dependent variable, right? So you’re saying, OK, look, there’s a group of people who are suffering from a medical condition.

00:22:39 Shardul

Let’s choose a control group who do not have that condition, and let’s look backwards and see what independent variables might affect, or might predict, why

00:22:55 Shardul

the suffering group, the patients, have this condition but the control group does not.

00:23:00 Shardul

And lastly, there are cohort studies, where you study over the long term, longitudinally. One of the most famous examples of cohort studies is the British doctors study, which we mention in the book chapter as well: longitudinally, starting from 1951,

00:23:19 Shardul

they studied doctors who were smokers versus non-smokers.

00:23:23 Shardul

And they looked at the fatality rate, how it varied, and there was very convincing proof that smoking actually is not good for you. So there is this whole array of research methods

00:23:36 Shardul

to understand and evaluate different medical treatments, understand diseases, and so on.

00:23:42 Shardul

So now we borrow from that and say, OK, can we learn from this

00:23:49 Shardul

and design,

00:23:52 Shardul

not interventions, but, let’s say, research designs,

00:23:58 Shardul

to understand how the practise of scenario planning influences decision processes, individual judgement, team judgement, outcomes, and so on.

00:24:12 Shardul

So what we have done in this chapter is, next, we talk about the scenario planning research methods.

00:24:20 Shardul

But also, once I looked at, OK, 13 medical research methods, then there are 13 analogous scenario planning research methods,

00:24:33 Shardul

and I wanted to

00:24:34 Shardul

say, how widely have these been studied in our literature?

00:24:39 Shardul

And actually, there are studies that fall into these different buckets now, and those are mentioned in

00:24:45 Shardul

the book chapter.

00:24:47 Shardul

There are also a few of these research methods for which

00:24:50 Shardul

we don’t have any studies.

00:24:52 Shardul

So it kind

00:24:52 Shardul

of, you know, the book chapter opens up and says, look,

00:24:56 Shardul

there is a vast field

00:24:58 Shardul

of scenario planning which we can evaluate using a variety of methods. For some, there’s already an established precedent and we have things that we can build on, but there are some other areas that are completely, you know, like the Wild West. Nobody has ventured there, and those could be great opportunities for new researchers

00:25:18 Shardul

to explore.

00:25:21 Megan

Yeah. And that’s something I think is what brought possibly all of us into the world of

00:25:29 Megan

futures and foresight more broadly, but scenario planning specifically. We are the lucky few, sort of at the edge of the field. Some people use the term vanguard, if you will, but I think we’re a bit late for that one; that would have been more in the 50s,

00:25:49 Megan

when scenario planning was being spearheaded by Herman Kahn

00:25:56 Megan

and the early crew. But yeah, there are a lot of gaps, a lot of gaps in knowledge. And sometimes people think it’s because there’s no interest, but I have a feeling, and I did at the start, and I remain with this feeling, that there just hasn’t been enough time and enough people.

00:26:16 Megan

It just takes time

00:26:18 Megan

for people to stand up and say, oh yeah, I have that question, let’s look into it, as you have done with the information that you shared for this chapter.

00:26:31 Megan

So one thing I took from what you were saying was that medical research is very structured, and understandably so, because we’re talking about people’s lives, right? And we’re talking about serious,

00:26:48 Megan

not in a bad way, but just serious concerns of ethicality and, you know, ethical practise, as we always should have when we’re looking at treatments that could help or harm human life, or just life in general.

00:27:06 Megan

But as well, with that, it sounds like it’s what we call an iterative process,

00:27:14 Megan

where one thing is tested, the next thing that’s tested is built off the knowledge of that, and sometimes they go back because of new discoveries they find, like, OK, we need to step back and test this again. Is that one of the methods, or the methodology, that you were looking at

00:27:34 Megan

to support scenario planning use and research?

00:27:39 Shardul

Absolutely, yeah, because any research method has some strengths and some limitations, right? And in medicine it is understood that if you’re conducting, let’s say, in vitro or in vivo studies, say with animal models,

00:27:58 Shardul

the external validity of those models is going to be questionable, right? Because you’re working on mice, per se, not human beings. But that is still valuable; what you’re learning from those animal models, that knowledge is valuable.

00:28:12 Shardul

The same thing we have to understand in the scenario planning literature as well, yes. So, for example, if I’m doing experiments with, say, MBA students, of course the decisions that are made by MBA students using a particular case in a one-semester-long course,

00:28:31 Shardul

it would be difficult,

00:28:35 Shardul

well, the external validity of those decisions would be questionable for real-world decisions, right?

00:28:42 Megan

And just for our listeners who aren’t in academia, and especially in experimentation, would you take a second to tell us what you mean by external validity?

00:28:54 Shardul

Yeah. Great. OK. So thanks, I’m glad you asked that because sometimes we take these terms for granted, right?

00:29:02 Shardul

I do. Yeah. So we kind of use these terms all the time, but they may not be common.

00:29:07 Megan

Because it’s part of our profession, right. It’s something we have to constantly acknowledge in everything we write. When it comes to studies, right? And and gathering data so.

00:29:21 Megan

I’ll pass the mic back to you.

00:29:23 Shardul

OK, OK, I’m glad you asked that.

00:29:25 Shardul

Megan. So, external

00:29:27 Shardul

validity

00:29:28 Shardul

Is. Let’s say we conduct a study. Let’s say we are testing the effect of scenario planning is used on how students choose between, say options 1-2 and three. Let’s just say hypothetically.

00:29:44 Shardul

And we find out that students who use scenario planning.

00:29:48 Shardul

opted for option three, which is, say, a more flexible investment in a certain project, versus students who did not use scenario planning, who went for option one, a very concrete, rigid investment: once you invest, you cannot deviate from it. But just say we found

00:30:07

that.

00:30:08 Shardul

Now, external validity asks whether what we have seen in this classroom experiment

00:30:15 Shardul

would work in the real world as well. In other words, can we claim the same effect for a real-world decision maker? And often these decisions are made by executives who are very senior, with a couple of decades or maybe even more of experience.

00:30:35 Shardul

Would they behave the same way, or would scenario planning have the same effect on their decisions

00:30:42 Shardul

as what we saw in the classroom with MBA students?

00:30:46 Shardul

Or even under threat?

00:30:49 Shardul

So that’s the question of external validity. Can we take the findings from our small setting and say that apply in the real world setting of corporations and?

00:31:01 Shardul

say, public sector organisations and so on?

00:31:04 Megan

You mentioned in there the causal effects. Often, in fact every time, in scenario planning practice as well as medical practice, we're looking for causal relationships. Did this thing

00:31:18 Megan

change something in the business environment? Change something in the human's biological system? Did it change something within the makeup of the bacteria, the drug, whatever we're working with?

00:31:36 Megan

But from what you've discovered in your work, particularly what you shared in the chapter, what are some of the biggest barriers that you found to conducting rigorous empirical, which means evidence-gathering, studies in

00:31:55 Megan

our field?

00:31:56 Shardul

I think the biggest challenge in our field is getting access to a setting, right? Working with an organisation that is willing to

00:32:09 Shardul

work with you, but also is willing to

00:32:12 Shardul

kind of allow you to do controlled experimentation, but just being willing to do that, right, as opposed to just saying, OK, create scenarios and then we can start using them. It's that experimentation part, where we can vary things:

00:32:31 Shardul

maybe the way we create scenarios, maybe the way we apply scenarios. By creating variation, we can see which one works better, right? Which is more effective, or whether it is not. But that is challenging. And to your earlier point, Megan, when you mentioned that a lot of the work in our field, in scenario planning, has been observational,

00:32:52 Shardul

You’re absolutely spot on.

00:32:54 Shardul

Right. So there are four broad categories of medical research methods that I mentioned. You have theoretical research; then there are the clinical trials, which are very popular, everybody knows about them, they're in the news, we hear about them. Then there are observational studies, and then there are epidemiological studies, right?

00:33:15 Shardul

OK, so,

00:33:15 Shardul

kind of more

00:33:18 Shardul

retrospective, or they could be prospective studies as well. But if you look at some of the classic works in scenario planning, the works of Pierre Wack, for example, and what we learned from the use of scenario planning at Shell, right, these are perfect examples of observational studies.

00:33:39 Shardul

We know what happened, or we have read about what happened, and based on that we believe that scenario planning is very

00:33:46 Shardul

useful.

00:33:47 Shardul

There are other cases of, say, UPS. For example, there’s a very famous Harvard Business Review case about UPS’s use of scenario planning.

00:33:56 Shardul

And I use that in my courses, because it combines scenario planning and UPS as a supply chain company, so it's a perfect combination for my course.

00:34:06 Shardul

But these are observational studies. They are useful. They are certainly valuable, because they tell us that, look, this method, scenario planning, can actually be quite useful.

00:34:18 Shardul

It can help companies get ahead of their competitors. What I say in this chapter is that those studies are valuable, but we shouldn't restrict ourselves just to observational studies. We also need to do theoretical research. We need to be inspired by the way clinical trials are conducted

00:34:39 Shardul

and use analogous methods for testing scenario planning. You can even do epidemiological studies, right?

00:34:48 Megan

Yeah, and understandably so. I think that is something not really well understood: the value of bringing

00:34:59 Megan

a facilitator on board who understands research methods. Because it's not that this is a research project, you know; it's not like we are

00:35:12 Megan

even going to write a paper from our consulting. Often it's that we have this background that helps us understand how to evidence our work, how to lay out the game plan, you know, in a very project-management kind of way,

00:35:32 Megan

but with that research

00:35:35 Megan

panache to it, if you will. Because in research we constantly have to think about and justify, before we're even allowed to step into our project, what we're looking for and how we're going to try to find it. We don't know if we're going to find it or not,

00:35:37 Shardul

Mm-hmm.

00:35:56 Megan

right? And often even how we're going to analyse all those data that we get in the end. We have to have a really good idea

00:36:07 Megan

of our space, while also leaving ourselves open to surprises, because if we're closed off to surprises, then we miss what comes down to really valuable information. And that's getting into the nitty-gritty of what it means to work in uncertainty,

00:36:29 Megan

of what it means to work in risk and risk analysis. You know, we have to understand that we don't have all the answers, but we still have to articulate how we are going to get there, right? So I think that's

00:36:44 Megan

That’s something obviously a self promotion to aspect, but that’s any any academic you bring into the private organisational you know space is that’s what you’re bringing in. Somebody who has a mind of who can break down the process step by step and see elements of the future.

00:37:04 Megan

that will help us right now in our efforts and not waste them, right? That don't make for wasted efforts.

00:37:09

Yeah, yeah.

00:37:12 Shardul

Yeah, to that point: maybe it is self-promoting, but that's the value of academics, right? So if you, as an organisation, bring in academics, you bring in a sceptical mindset. We bring this research mentality and say, OK, I'm not just going to go through the motions and deliver something.

00:37:34 Shardul

I also want to step back and ask what the actual value of this work was, right? How did it actually affect you?

00:37:43 Shardul

So I will also say: work with academics. Don't just go to a big consulting firm. No, come to us academics, and then you can get this kind of additional research knowledge, as you mentioned.

00:37:57

Yeah.

00:37:59 Megan

Yeah. OK. So I have a

00:38:02 Megan

question at the end.

00:38:03 Megan

Now I’d like to ask everybody in scenario planning and and in the foresight field we we constantly are seeing things that we have at best come out in random conversations in pubs or at dinner or conferences, but we never really find a perch.

00:38:24 Megan

for them, and I would like to make this one of those perches. My question is: in the field you're in, in the work you do, and in everything that is your environment, as the expert that you are, in the research that you do, what trends are you seeing in your area of work that are happening

00:38:45 Megan

right now, or are just barely budding?

00:38:49 Shardul

So right now, AI is such an overwhelming trend, right? The use of AI, and really the uncertainty about how it's going to affect our work, both in terms of

00:39:03 Shardul

education, so how we teach, right? How should we incorporate AI?

00:39:09 Shardul

Lots of questions about that.

00:39:12 Shardul

But also in our research, how it is going to affect our research.

00:39:16 Shardul

I mean.

00:39:18 Shardul

Should you use it for research design? I mean, there are so many things that it can do. Sometimes it is scary, especially if you're working with secondary data.

00:39:27 Shardul

Right.

00:39:28 Shardul

You you don’t need to spend several months gathering and analysing the data. I can do that like this, right? So that is definitely an overwhelming trend about use of AI. The second thing that that I see and that is more specific to my field of supply chain.

00:39:48 Shardul

is the whole idea of dealing with uncertainty

00:39:54 Shardul

in the long term, but also in the short term. So, for example,

00:39:59 Shardul

kind of related to scenario planning:

00:40:02 Shardul

if we go back historically to the 1970s, the 60s, even the 1950s, the origins of scenario planning as a method,

00:40:13 Shardul

it was created and evolved to support really long-term decisions, right? We know that in the 1950s, at the RAND Corporation, the use of scenario planning was to ask, OK, what kind of defence infrastructure do we need, right? Really long-term decisions. Same with Shell, right, really long-term

00:40:33 Shardul

questions.

00:40:35 Shardul

And that’s where the method is. Has been really useful, but the question that I get in in supply chain is that.

00:40:44 Shardul

Can we use the same approach for dealing with uncertainties in the short term?

00:40:49 Shardul

Because, let's say, if I have to organise

00:40:55 Shardul

an ocean carrier to carry several containers of goods that my factory in Asia is producing to their destination in North America,

00:41:07 Shardul

And they’ll say that I need to book this in advance at two months in advance.

00:41:12 Shardul

how do I know how much capacity I should reserve?

00:41:16 Shardul

Which carrier should I use, and what rate should I reserve it at, right? What path should it take? Should it go through, say, the Suez Canal, or should it go around through the Pacific, right? So there are questions like that that are of a very short-term nature. We're thinking about uncertainties in the next three to six months.
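
One classic way to frame that capacity-reservation question is the newsvendor model, which balances the cost of reserving too little against the cost of reserving too much. This is an illustrative sketch only, not a method from the chapter: the demand distribution and both cost figures are invented for the example.

```python
# Newsvendor-style capacity reservation under short-term demand uncertainty.
# All numbers are hypothetical, purely for illustration.
import numpy as np

rng = np.random.default_rng(7)
# Simulated demand scenarios for container slots over the booking horizon.
demand = rng.normal(loc=500, scale=80, size=10_000)

under_cost = 400.0  # per-container cost of reserving too little (spot-market premium)
over_cost = 150.0   # per-container cost of reserving too much (unused slot fee)

# The optimal service level is the critical fractile under/(under + over);
# the reservation is that quantile of the demand distribution.
critical_fractile = under_cost / (under_cost + over_cost)
reservation = np.quantile(demand, critical_fractile)
print(f"Reserve about {reservation:.0f} slots "
      f"(service level {critical_fractile:.0%})")
```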

00:41:36 Shardul

And these things used to be more predictable. You could say the next three months are going to look fairly similar to what I've seen in the past; I can create a forecasting model based on historical data.

00:41:50 Shardul

I can project it forward, say three months out. You know, simple exponential smoothing models; now you combine them with machine

00:41:57 Shardul

learning and so on. That worked beautifully for the short-term kind of horizon, but even now we are seeing massive uncertainties in that.
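
For readers who haven't met it, simple exponential smoothing is just a recency-weighted average projected forward. A minimal sketch in Python, where the monthly container volumes and the smoothing weight are made up for illustration; real pipelines would layer machine-learning models on top, as Shardul notes.

```python
# Simple exponential smoothing: the level is a weighted average that
# leans more heavily on recent observations. alpha controls that lean.
def simple_exponential_smoothing(series, alpha=0.3):
    level = series[0]
    for observation in series[1:]:
        level = alpha * observation + (1 - alpha) * level
    return level  # SES produces a flat forecast for all future periods

volumes = [120, 135, 128, 140, 150, 146]  # hypothetical monthly container counts
forecast = simple_exponential_smoothing(volumes)
print(f"Forecast for each of the next 3 months: {forecast:.1f} containers")
```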

00:42:08 Shardul

That’s a trend that we are seeing some kind of.

00:42:11 Shardul

maybe insoluble, if that's the right term, insoluble uncertainty. It's something where you cannot just say

00:42:18 Shardul

the next three months will look like what I've seen in the last three months, or like the long historical data.

00:42:24 Shardul

Mm-hmm. So that is a trend, definitely, that we see in supply chain management: unstructured uncertainties are also shaping short-term decision making.

00:42:36 Megan

OK, it’s pretty big, pretty weighty.

00:42:42 Megan

But that’s the point, right? That’s some food for thought. What? What? What is the expert on the inside? Seeing that isn’t really making it out to the general conversation yet? And this is what we do, isn’t it? This is we are Trend watchers. We are horizon scanners.

00:43:02 Megan

Red-flag

00:43:03 Megan

identifiers, if you will. OK. Well, thank you, Shardul, for coming today. This has been fascinating. I've read your chapter, I gave you feedback on your

00:43:17 Megan

chapter,

00:43:18 Megan

your chapters, in a book together with us, and I still have learned

00:43:23 Megan

an incredible amount of information from this. So I hope this

00:43:30 Megan

really gets out into the field more. This is incredible stuff, and I will be in touch, by the way, about some future research. Because, without going into it any further, all I do is research and then fieldwork, right? And you've brought in some exceptional

00:43:51 Megan

Venn-diagramming kind of concepts that I'm excited to jump into further.

00:43:58 Megan

Oh, thank you. Thank you for coming.

00:44:01 Shardul

No, thank you. Thank you. It was really fun.

00:44:03 Megan

Scenarios for Tomorrow is produced by me, Megan Crawford, with invaluable feedback from Dr Isabella Risa, Jeremy Cripe, Brian Eggo, and, as always, my kids.

00:44:15 Megan

This is a production of the Futures and Analytics Research Hub and FAR Lab, affiliated with Edinburgh Napier Business School. You can find show notes, references, and transcripts at scenarios.farhub.org.

00:44:30 Megan

That’s scenarios dot farhud dot org.

00:44:32 Megan

Or you can follow us across social media by searching for @scenariofutures, all one word. You can subscribe to Scenarios for Tomorrow wherever you listen to your podcasts. Today's track was composed by Rockot, whose links are provided in the show notes. This is Scenarios for Tomorrow, where tomorrow's headlines start as today's thought experiments.

Show notes:
This is a production of the Futures & Analytics Research (FAR) Hub.
Today’s track “Experimental Cinematic Hip-Hop” was composed by ‪@Rockot‬.

Select episode references:

Shardul Phadnis https://shardulphadnis.com

Order your copy of “Strategic Planning for Dynamic Supply Chains: Preparing for Uncertainty Using Scenarios” https://link.springer.com/book/10.1007/978-3-030-91810-1

Crawford & Plant-O’Toole https://onlinelibrary.wiley.com/doi/pdf/10.1002/ffo2.167

Pierre Wack https://en.wikipedia.org/wiki/Pierre_Wack

Shell scenarios https://www.shell.com/news-and-insights/scenarios.html

Garvin & Levesque (2006). “Strategic Planning at United Parcel Service.” Harvard Business School Case 306-002. https://www.hbs.edu/faculty/Pages/item.aspx?num=32845

00:00:25 –> 00:00:33
In our lifetimes, we will see revolutions of such unfathomable proportions that they change the very notion of our concept of reality. Bioengineered solutions to global hunger, whole-cloth relocation of capital cities, a radical coup d’état on data harvesting. These aren’t prophecies, they’re possibilities. We are the sci-fi of our ancestors, after all.

00:00:57 –> 00:01:00
My name is Dr. Megan Crawford, and this is Scenarios for Tomorrow, where tomorrow’s headlines start as today’s thought experiments.

00:01:10 –> 00:01:13
As a data scientist and foresight researcher, I study how organizations strategize for the future. But let’s be honest, scenario planning sounds like the driest of corporate jargon. That is until you see it in action.

00:01:23 –> 00:01:26
Like when healthcare teams around the world embrace foresight to outmaneuver pandemics, or watch in real time the ethical realities of our government’s scenario planning through military invasions, or witness how scenarios developed at the turn of the century changed the literal landscape of today’s infrastructure.

00:01:43 –> 00:01:46
That’s the alchemy we will explore in this podcast, transforming abstract, unknowable futures into collective action and what that means when our future becomes our past.

00:01:56 –> 00:02:00
Each episode, we’ll sit down with global futurists, intelligence and defense policy architects, scenario planners, strategists, and yes, the little people backstage like me, to unpack how we build strategies with governments, NGOs, and private firms, turn behavioral science into actions of change, and design hope in the face of radical uncertainty, but always questioning:

00:02:24 –> 00:02:27
What could happen, what should happen, and how do we get there?

00:02:27 –> 00:02:31
This is a production of the Futures and Analytics Research Hub and FAR Lab, affiliated with Edinburgh Napier Business School. You can find show notes, references, and transcripts at scenarios.farhub.org. That’s scenarios dot farhub dot org.

00:02:48 –> 00:02:51
You can follow us across social media by searching for @scenariofutures, all one word. You can subscribe to Scenarios for Tomorrow wherever you listen to your podcasts. Today’s track was composed by Rockot, whose links are provided in the show notes. This is Scenarios for Tomorrow, where tomorrow’s headlines start as today’s thought experiments.

Show notes:
This is a production of the Futures & Analytics Research (FAR) Hub.
Today’s track “Experimental Cinematic Hip-Hop” was composed by ‪@Rockot‬.

Select episode references:
Severn Suzuki speech to the United Nations
Greta Thunberg speech to the United Nations
A Letter to the Future From Kid President
Panama canal – Reuters
U.S. Department of Defense
China Military – Getty Images
COVID Ebrahim Noroozi – Associated Press
COVID pic – Hilary Swift New York Times
Ronald E. McNair’s last space flight – CBS Evening News
“Daisy” campaign for Lyndon B. Johnson