Exploring Chemophobia and the Impact of Social Media on Education with James Kennedy

Join us for a thought-provoking episode as we delve into the world of chemophobia and its impact on society with renowned science communicator James Kennedy. Discover the historical roots of chemophobia and its prevalence in today's world. From debunking myths to exploring the influence of social media on education, we explore the intersection of science, perception, and media. Tune in to gain valuable insights, challenge your perspectives, and uncover the truth behind the fear of chemicals. Don't miss this engaging conversation with James Kennedy that will leave you questioning everything you thought you knew.

Bio

James Kennedy is a British-born science communicator with BA (Hons.) and MSci degrees in Natural Sciences from the University of Cambridge, U.K. He has a wealth of experience in science communication and speaks at corporate events about how to tackle an irrational fear of “chemicals”. His 2021 book, Everything Is Natural, was published by the Royal Society of Chemistry. He has since released two chemistry textbooks and multiple sets of chemistry education materials.

Time Stamps: 

0:51 Sponsor message

1:31 How James Kennedy made the viral banana poster and introduction to chemophobia (Multiple Chemical Sensitivity Syndrome)

6:25 Is the fear of chemicals a new thing or is there a history behind it?

8:52 What is 'provenance' and how do objects carry stories 

12:15 Why are adults more chemophobic than kids?

17:29 History of Chemophobia in the United States 

22:14 The "Running Girl" Photograph and its Influence on the World 

23:42 Post-modernism and uptick of Chemophobia 

26:14 How is social media impacting education across the globe?

31:34 What is your stance on artificial intelligence and tools like ChatGPT?

37:06 Sponsor mid-roll

37:57 Will AI completely change the education system as a whole? 

44:25 How are kids being influenced by social media in schools?

50:30 The dystopia of TikTok mental health issues in kids 

54:21 "Everything is Natural" blurring the lines between natural and artificial

55:47 Final plugs

Plugs

Links
Website: https://jameskennedymonash.wordpress.com/

Kennedy College: https://kennedycollege.com.au/

Book: Everything Is Natural: Exploring How Chemicals Are Natural, How Nature Is Chemical and Why That Should Excite Us

Course: Chemistry for VCE

Twitter: James Kennedy (@JamesKennedyEDU) / Twitter

KG Food Company: KetoGeek | Official Site – KG Food Company

Shop Energy Pods (our sponsors): Energy Pods – KG Food Company

Transcription:

1
00:00:00,000 --> 00:00:08,280
Welcome to the Energize, Explore, Enjoy Podcast, where we bring you inspiring conversations

2
00:00:08,280 --> 00:00:13,840
about food, fitness, and adventure fueled by the power of the energy pods.

3
00:00:13,840 --> 00:00:17,840
Let's do this!

4
00:00:17,840 --> 00:00:32,480
Get ready to explore the fascinating world of science with our guest on today's episode.

5
00:00:32,480 --> 00:00:37,500
Introducing James Kennedy, a British-born science communicator and author with a wealth

6
00:00:37,500 --> 00:00:42,720
of experience in captivating audiences with his insights on tackling irrational fear

7
00:00:42,720 --> 00:00:43,720
of chemicals.

8
00:00:43,720 --> 00:00:48,360
Join us as we delve into the wonders of science and discover the hidden discoveries

9
00:00:48,360 --> 00:00:50,100
behind everyday chemistry.

10
00:00:50,100 --> 00:00:53,640
Hey there listeners!

11
00:00:53,640 --> 00:00:58,360
This episode is sponsored by the ever-amazing KG Food Company's energy pods.

12
00:00:58,360 --> 00:01:00,280
Are you ready for a flavor adventure?

13
00:01:00,280 --> 00:01:05,280
Get ready to taste white chocolate strawberry, breakfast mokkunuar, and the fan favorite

14
00:01:05,280 --> 00:01:06,680
chocolate nova.

15
00:01:06,680 --> 00:01:11,720
These pods are not only delicious, but also are packed with protein, healthy fats, and

16
00:01:11,720 --> 00:01:14,440
minimal sugar to keep you going all day.

17
00:01:14,440 --> 00:01:18,920
Plus, they come with a built-in spoon for your on-the-go snacking needs.

18
00:01:18,920 --> 00:01:20,920
So, don't wait another minute!

19
00:01:20,920 --> 00:01:31,000
Head over to kgfoodco.com and order some energy pods to fuel your life.

20
00:01:31,000 --> 00:01:33,120
I trained as a chemistry teacher initially, right?

21
00:01:33,120 --> 00:01:39,040
I did natural sciences at Cambridge and then went to study education here in Australia.

22
00:01:39,040 --> 00:01:44,520
This banana poster was the first slide of my first ever chemistry lesson.

23
00:01:44,520 --> 00:01:49,280
And I just wanted to try and show that chemistry is relevant and interesting and is everywhere,

24
00:01:49,280 --> 00:01:50,600
and that's what the poster was about.

25
00:01:50,600 --> 00:01:53,160
I had no political agenda at all, right?

26
00:01:53,160 --> 00:01:57,600
It was literally a hook, as we would call it in education, to get everyone's attention,

27
00:01:57,600 --> 00:02:01,560
to get some discussion going before we get into organic chemistry, which is the topic.

28
00:02:01,560 --> 00:02:08,920
So I chose a banana as one of the most recognizable fruits, and then just analyzed

29
00:02:08,920 --> 00:02:13,800
what was in it, not from my own experiments, but from literature that's already been done.

30
00:02:13,800 --> 00:02:15,000
People have already done this work.

31
00:02:15,000 --> 00:02:18,480
The later posters I made have references on the back.

32
00:02:18,480 --> 00:02:24,160
If you turn it over, there's all the academic references on that, but these are published

33
00:02:24,160 --> 00:02:25,680
later on what's in a banana.

34
00:02:25,680 --> 00:02:28,240
And it's interesting the way that it was perceived.

35
00:02:28,240 --> 00:02:31,880
So students found it interesting like, "Oh, I recognize some of those things, but some

36
00:02:31,880 --> 00:02:34,720
of them I don't," and it sparked a bit of interest in the students.

37
00:02:34,720 --> 00:02:38,840
What surprised me was when I put it online, it just literally went viral.

38
00:02:38,840 --> 00:02:45,560
It got reposted everywhere and modified and quoted in newspapers, the New York Times even, and some

39
00:02:45,560 --> 00:02:46,560
others.

40
00:02:46,560 --> 00:02:50,160
What was fascinating was people assumed that I had some kind of political agenda.

41
00:02:50,160 --> 00:02:55,960
That I was trying to show one of two things: either that modern bananas are not real,

42
00:02:55,960 --> 00:03:01,120
like there is some kind of fake banana industry that is cobbling together fake bananas

43
00:03:01,120 --> 00:03:04,000
because they're cheap, industrial, synthetic bananas.

44
00:03:04,000 --> 00:03:05,400
That's what some people assumed.

45
00:03:05,400 --> 00:03:06,880
And the other people were pro chemistry.

46
00:03:06,880 --> 00:03:09,800
They were like, "Oh, yeah, good, good," and they got the message basically that I was

47
00:03:09,800 --> 00:03:10,800
trying to show.

48
00:03:10,800 --> 00:03:13,040
But this dichotomy really emerged.

49
00:03:13,040 --> 00:03:16,000
And we started to get, you know, as usual in the Twitter comments, a back and forth

50
00:03:16,000 --> 00:03:17,000
of extremes.

51
00:03:17,000 --> 00:03:21,680
And that really alerted me to this fact that some people are, I actually want to say

52
00:03:21,680 --> 00:03:25,600
paranoid about the prevalence of chemicals around them.

53
00:03:25,600 --> 00:03:29,760
It's stifling for those people and it's a small number of people, it's a small percentage,

54
00:03:29,760 --> 00:03:34,080
but it's up to 10% who will go to extreme lengths to avoid

55
00:03:34,080 --> 00:03:38,400
what they see as chemical contamination; they perceive it as being everywhere.

56
00:03:38,400 --> 00:03:40,400
And then it just got me down this rabbit hole of research.

57
00:03:40,400 --> 00:03:44,880
It was a summer holiday by this point and I was just looking into the fear of chemistry,

58
00:03:44,880 --> 00:03:49,280
you know, Michelle Francl's work, and there are a lot of other researchers who've looked

59
00:03:49,280 --> 00:03:52,880
into this, five or ten of them, into the fear of chemistry.

60
00:03:52,880 --> 00:03:54,480
And this became my thing.

61
00:03:54,480 --> 00:03:56,880
Why are some people so afraid of chemicals?

62
00:03:56,880 --> 00:04:00,400
And it turns out there's sort of three groups of people and it comes to the perceptions

63
00:04:00,400 --> 00:04:01,800
of chemistry you get.

64
00:04:01,800 --> 00:04:07,560
The ones who are pretty open about chemicals appreciate the good that they do; most scientifically

65
00:04:07,560 --> 00:04:09,160
literate people will be in that group.

66
00:04:09,160 --> 00:04:11,280
Then you've got the middle, which is the biggest group.

67
00:04:11,280 --> 00:04:14,760
The middle is sort of uninformed, it doesn't really care that much.

68
00:04:14,760 --> 00:04:17,800
They'll sort of maybe think that chemicals are usually a bit bad, but they're also happy

69
00:04:17,800 --> 00:04:18,800
to use them.

70
00:04:18,800 --> 00:04:20,560
They don't have any strong opinion.

71
00:04:20,560 --> 00:04:24,360
Then you've got the other sort of 10% that I mentioned, which are chemophobic, and

72
00:04:24,360 --> 00:04:28,320
I really don't like that word, by the way, but they have this fear of chemicals.

73
00:04:28,320 --> 00:04:31,600
There's a clinical word for it, actually, better than chemophobic.

74
00:04:31,600 --> 00:04:36,920
I think it's multiple chemical sensitivity syndrome, or something like that.

75
00:04:36,920 --> 00:04:38,920
It's a psychological, right?

76
00:04:38,920 --> 00:04:41,720
It's proven psychological.

77
00:04:41,720 --> 00:04:43,520
It's like the fear of Wi-Fi.

78
00:04:43,520 --> 00:04:47,360
It's that kind of thing; there have been experiments along these lines where they tell people

79
00:04:47,360 --> 00:04:51,480
there is Wi-Fi, very strong Wi-Fi, in the room and they get symptoms even though there's

80
00:04:51,480 --> 00:04:52,960
no Wi-Fi in the room.

81
00:04:52,960 --> 00:04:55,360
Yeah, it's a psychological thing.

82
00:04:55,360 --> 00:04:56,880
So, this became huge.

83
00:04:56,880 --> 00:05:01,480
It became a discussion topic and it became basically the next ten years of...

84
00:05:01,480 --> 00:05:08,480
conference speeches and trying to engage the public, trying to educate journalists on how

85
00:05:08,480 --> 00:05:14,120
not to push people into that category, and working with bloggers and influencers who

86
00:05:14,120 --> 00:05:18,520
are often not science trained on how to communicate chemical stories appropriately.

87
00:05:18,520 --> 00:05:23,280
Obviously, teaching the whole time, trying to get the middle sort of 75% up into the loving

88
00:05:23,280 --> 00:05:27,560
chemistry part; that's been the bulk of my sort of bread and butter as well.

89
00:05:27,560 --> 00:05:30,920
Yeah, but it all started with the banana, really.

90
00:05:30,920 --> 00:05:32,440
Just zooming in on a banana.

91
00:05:32,440 --> 00:05:37,720
What's also important to note here is that my journey is really interwoven with what

92
00:05:37,720 --> 00:05:43,320
you were making as well, because I remember that once, when I was browsing around on Facebook,

93
00:05:43,320 --> 00:05:47,960
that's I think the first time I got exposed to one of your infographics.

94
00:05:47,960 --> 00:05:53,640
And I was like, holy moly, the brain just lights up immediately.

95
00:05:53,640 --> 00:05:56,760
You realize that everything is actually chemicals.

96
00:05:56,760 --> 00:05:57,760
What am I doing?

97
00:05:57,760 --> 00:05:58,760
What am I doing with my life?

98
00:05:58,760 --> 00:05:59,760
What am I doing with my brain?

99
00:05:59,760 --> 00:06:03,640
So, let's dial back time a little bit.

100
00:06:03,640 --> 00:06:06,360
Let's talk about where the origins are

101
00:06:06,360 --> 00:06:10,840
of this chemophobia of some sort, or, man, there's a different, difficult term that you

102
00:06:10,840 --> 00:06:11,840
mentioned.

103
00:06:11,840 --> 00:06:14,760
I don't know if I can use that, the psychological term.

104
00:06:14,760 --> 00:06:20,040
But going back in time, is there like some connection, is there an origin

105
00:06:20,040 --> 00:06:25,800
story here or have we always been afraid of chemicals or is this just like a new thing?

106
00:06:25,800 --> 00:06:27,600
It comes and goes, right?

107
00:06:27,600 --> 00:06:32,600
So at the moment, well, it dipped during COVID; we were not afraid of chemicals during COVID,

108
00:06:32,600 --> 00:06:36,760
we were afraid of the virus more or less and the lockdowns and the vaccines and everything

109
00:06:36,760 --> 00:06:41,800
else and the fear of the chemicals themselves actually dipped because we were, we were, we were

110
00:06:41,800 --> 00:06:46,840
buying more chemicals than ever to try and sanitize: bleach and other things and soaps.

111
00:06:46,840 --> 00:06:49,160
So they were the smallest of our concerns.

112
00:06:49,160 --> 00:06:53,080
But it's rising again now, it comes and goes, it's always been there just it depends

113
00:06:53,080 --> 00:06:54,080
on to what extent.

114
00:06:54,080 --> 00:06:57,960
So if you go all the way back, I like to go all the way back to ancient history because

115
00:06:57,960 --> 00:07:02,200
I think about that as a sort of a lens to view many of the things that we do as humans,

116
00:07:02,200 --> 00:07:05,800
I think it does explain a lot about what we do now.

117
00:07:05,800 --> 00:07:11,400
We evolved telling stories on the savannah around campfires, essentially, and that image,

118
00:07:11,400 --> 00:07:14,680
I have that image on my wall of like humans, early humans just around a campfire and

119
00:07:14,680 --> 00:07:18,680
the savannah just in the evening, they're telling stories, they're, they're cooking food,

120
00:07:18,680 --> 00:07:23,320
they're learning from each other in order to try and have a better hunter-gatherer experience

121
00:07:23,320 --> 00:07:24,320
the next day.

122
00:07:24,320 --> 00:07:28,400
And then we're talking 200,000 years ago now, but that's, that's the time when fire was

123
00:07:28,400 --> 00:07:33,000
becoming widespread, language was becoming widespread as well and our brain developed and so

124
00:07:33,000 --> 00:07:34,400
we really separated from animals.

125
00:07:34,400 --> 00:07:39,560
It's, it's a long process, but sometime around there, that's when we became fully sort of

126
00:07:39,560 --> 00:07:40,560
human.

127
00:07:40,560 --> 00:07:42,560
And why do I mention all of this?

128
00:07:42,560 --> 00:07:46,240
Because the stories we would tell would be stories of the hero's journey.

129
00:07:46,240 --> 00:07:51,520
So the hero's journey sort of archetypal story is, it's, it's the template for all our

130
00:07:51,520 --> 00:07:52,520
stories right now.

131
00:07:52,520 --> 00:07:56,800
Like every, every movie you go and watch is basically a hero's journey story and what happens

132
00:07:56,800 --> 00:08:00,680
in the hero's journey story is you've, you've got a hero who meets some kind of adversity

133
00:08:00,680 --> 00:08:05,080
and comes back changed, having defeated that adversity. When they come back, they come

134
00:08:05,080 --> 00:08:08,040
back to where they started, but the place is a bit different.

135
00:08:08,040 --> 00:08:10,560
They are different because they've been victorious.

136
00:08:10,560 --> 00:08:15,400
What's interesting for me is there is usually an object involved like a talisman we call it,

137
00:08:15,400 --> 00:08:20,360
that's gone on the journey itself and that object comes back different too.

138
00:08:20,360 --> 00:08:24,480
So, so relating this back to like the hunter gatherers sitting around the fire, we would

139
00:08:24,480 --> 00:08:30,800
have brought memory aids, I guess, souvenirs, I suppose, from our hunter gatherer trips during

140
00:08:30,800 --> 00:08:36,040
the day and we would have used those as memory aids for show and tell, like,

141
00:08:36,040 --> 00:08:38,600
look what I found, I killed an ox.

142
00:08:38,600 --> 00:08:43,040
I got something, I went to the faraway place, and so I got

143
00:08:43,040 --> 00:08:44,040
a rock.

144
00:08:44,040 --> 00:08:48,200
These, these objects that we carry with us help us to tell the story.

145
00:08:48,200 --> 00:08:52,040
But what's interesting is there's a, there's a psychological phenomenon called provenance.

146
00:08:52,040 --> 00:08:57,360
And provenance is the idea that the story is actually in the object itself.

147
00:08:57,360 --> 00:09:02,400
It's not just a memory aid, but by going on a journey from somewhere exotic or dangerous

148
00:09:02,400 --> 00:09:07,000
and back to your home, that object contains a story physically.

149
00:09:07,000 --> 00:09:10,240
Now, scientifically, we cannot detect that.

150
00:09:10,240 --> 00:09:12,640
There's actually nothing that's happened to the object.

151
00:09:12,640 --> 00:09:15,360
It's just moved, but we act as if it's true.

152
00:09:15,360 --> 00:09:19,080
We act as if objects that have been on an interesting journey have more value.

153
00:09:19,080 --> 00:09:20,240
And it kind of makes sense.

154
00:09:20,240 --> 00:09:24,640
Like, if you, if you think about this, say that there's an actor who wears, who, who carries

155
00:09:24,640 --> 00:09:31,280
a bag or a shirt or something in a film, in a movie, that particular

156
00:09:31,280 --> 00:09:33,120
item will sell for a lot of money.

157
00:09:33,120 --> 00:09:37,440
But the identical item that's being sold down at Target would sell for 12 bucks because

158
00:09:37,440 --> 00:09:40,240
it didn't go on that journey to, to the movie set.

159
00:09:40,240 --> 00:09:44,400
And there's lots of examples of this, you know, I remember Lady Gaga's

160
00:09:44,400 --> 00:09:46,760
nail clippings selling for like $10,000 and stuff.

161
00:09:46,760 --> 00:09:50,720
It's ridiculous, but we do act as if these things are true.

162
00:09:50,720 --> 00:09:55,880
If an object goes on a journey, it somehow embodies the journey in it.

163
00:09:55,880 --> 00:09:56,880
Okay.

164
00:09:56,880 --> 00:09:58,240
So how does this link to chemophobia?

165
00:09:58,240 --> 00:10:05,000
Well, we believe that if we can see that an object has gone on an unpleasant journey,

166
00:10:05,000 --> 00:10:07,800
we see that thing as unpleasant now permanently.

167
00:10:07,800 --> 00:10:11,120
And there are lots of other psychological things that feed into this, which

168
00:10:11,120 --> 00:10:16,280
I'll get into later on, but like, if we can see that say a, a compound or a, let's say,

169
00:10:16,280 --> 00:10:23,520
food, a food has had something from a laboratory put into it, we act as if that food is now

170
00:10:23,520 --> 00:10:28,800
tainted or if it's been through some kind of industrial food processing facility, we, we,

171
00:10:28,800 --> 00:10:32,680
we perceive it as unpleasant for whatever reason, then we see the food as being tainted.

172
00:10:32,680 --> 00:10:36,520
And a couple of interesting, um, other points which, which we could go into is we believe

173
00:10:36,520 --> 00:10:38,600
once it's tainted, it can't be purified.

174
00:10:38,600 --> 00:10:43,600
It cannot be rid of that experience that the object or the food has had,

175
00:10:43,600 --> 00:10:45,720
or it could be a skincare product or whatever it is.

176
00:10:45,720 --> 00:10:50,080
So once it's been on that journey through a sort of artificial laboratory type setting,

177
00:10:50,080 --> 00:10:52,720
it's permanently tainted; that's the idea of contagion.

178
00:10:52,720 --> 00:10:58,080
And we believe that the, yeah, the, the idea of adding anything sort of industrial or chemical

179
00:10:58,080 --> 00:11:00,200
is, is going to increase its toxicity.

180
00:11:00,200 --> 00:11:04,880
So over processing and, um, yeah, industrialized chemical, but there's another whole reason

181
00:11:04,880 --> 00:11:05,880
for that as well.

182
00:11:05,880 --> 00:11:09,480
So putting all this together, we get this idea that, yeah, foods and cosmetics that have

183
00:11:09,480 --> 00:11:14,920
been processed industrially, especially with any kind of anything synthetic added are

184
00:11:14,920 --> 00:11:18,480
permanently then bad for you and gross and to be avoided.

185
00:11:18,480 --> 00:11:24,480
And I could unpack any one of those in detail, but that sort of explains that we've evolved

186
00:11:24,480 --> 00:11:30,400
from a long time ago to have the sort of hardware ready to be scared of chemicals.

187
00:11:30,400 --> 00:11:34,840
It's just a matter of, do we activate that or do we learn that it's wrong and overcome

188
00:11:34,840 --> 00:11:35,840
it?

189
00:11:35,840 --> 00:11:36,840
That's a lot of wrong stuff, right?

190
00:11:36,840 --> 00:11:39,720
So we have to overcome most of it through education.

191
00:11:39,720 --> 00:11:43,400
It's just that the fear of chemicals is, we are prone to that.

192
00:11:43,400 --> 00:11:47,560
Oh, one of the questions: Corey handles our production and I was talking to him

193
00:11:47,560 --> 00:11:51,080
earlier, like what kind of questions should I ask James this time around?

194
00:11:51,080 --> 00:11:55,360
And he mentioned something really interesting and it sort of ties into what you just said,

195
00:11:55,360 --> 00:11:59,160
is that educating students seems to be a little bit on the easier side.

196
00:11:59,160 --> 00:12:03,680
Perhaps you have a better insight into it, but as people become older, it seems to become

197
00:12:03,680 --> 00:12:11,120
even harder to sort of re-educate them towards, like accepting that food or anything that we

198
00:12:11,120 --> 00:12:13,200
consume is made of chemicals.

199
00:12:13,200 --> 00:12:14,600
So what's going on there?

200
00:12:14,600 --> 00:12:17,040
Well, people learn about chemistry in three ways, right?

201
00:12:17,040 --> 00:12:21,320
So when they're young, they get it from school and you have a monopoly then as a teacher,

202
00:12:21,320 --> 00:12:25,640
all the kids' ideas of chemistry are taught between, let's say, grade seven and grade

203
00:12:25,640 --> 00:12:26,640
twelve.

204
00:12:26,640 --> 00:12:29,880
For most people it's grade seven and grade ten because most students don't choose science

205
00:12:29,880 --> 00:12:31,080
in the senior two years.

206
00:12:31,080 --> 00:12:32,800
At least here, that's the case.

207
00:12:32,800 --> 00:12:35,760
After that, they don't do any science, most of them.

208
00:12:35,760 --> 00:12:40,200
So you've got basically grade seven to grade ten, a monopoly on their knowledge of chemistry

209
00:12:40,200 --> 00:12:43,280
and they come in, year seven, knowing nothing at all.

210
00:12:43,280 --> 00:12:45,920
So that's seven, eight, nine, ten; that's four years.

211
00:12:45,920 --> 00:12:51,560
You can deliver lessons that promote chemistry as a discipline of, and I use these two words,

212
00:12:51,560 --> 00:12:55,560
creation and purification because I believe that it's almost a spiritual thing.

213
00:12:55,560 --> 00:13:00,520
But it elevates chemistry to, like, some kind of transcendent, spiritual good, right?

214
00:13:00,520 --> 00:13:04,840
I believe chemistry should be promoted as a discipline of creation and purification,

215
00:13:04,840 --> 00:13:08,640
because by the time they finish school, they get their chemistry knowledge from two different

216
00:13:08,640 --> 00:13:09,880
places, no longer from school.

217
00:13:09,880 --> 00:13:12,040
Obviously they get it from marketing, right?

218
00:13:12,040 --> 00:13:15,080
Product labels and advertisements and they get it from mass media.

219
00:13:15,080 --> 00:13:16,560
So Walter White, etc.

220
00:13:16,560 --> 00:13:23,400
You know, the evil chemists in, in TV shows and marketing tends to swerve towards the

221
00:13:23,400 --> 00:13:26,080
chemicals are bad, therefore buy our product.

222
00:13:26,080 --> 00:13:29,600
That's a common one, you know, you make something a villain and then say, oh, ours doesn't contain

223
00:13:29,600 --> 00:13:30,400
that.

224
00:13:30,400 --> 00:13:31,880
You can do this with any particular chemical.

225
00:13:31,880 --> 00:13:36,680
You can find any particular food group and write a book about a diet that doesn't include

226
00:13:36,680 --> 00:13:37,680
that.

227
00:13:37,680 --> 00:13:42,040
And it's a template, you know, you can do that.

228
00:13:42,040 --> 00:13:43,240
They're not all based on science.

229
00:13:43,240 --> 00:13:46,240
So they learn from that when they're adults and that's very difficult to control.

230
00:13:46,240 --> 00:13:50,080
No one person has a monopoly on that, like that's, that's an entire industry

231
00:13:50,080 --> 00:13:53,640
and there's no space, as a teacher, to go in and educate the adults because they're

232
00:13:53,640 --> 00:13:57,080
not actually learning chemistry from adults, like from teachers, sorry, they're learning

233
00:13:57,080 --> 00:13:59,040
chemistry from marketers.

234
00:13:59,040 --> 00:14:01,920
And then you've got the mass media; there's nothing we can do about that, at least not in the

235
00:14:01,920 --> 00:14:02,920
short term.

236
00:14:02,920 --> 00:14:08,160
But when you've got Neil de Grasse Tyson and Brian Cox representing physics, right, they're huge,

237
00:14:08,160 --> 00:14:10,080
huge like physics heroes.

238
00:14:10,080 --> 00:14:11,760
And there are many others as well.

239
00:14:11,760 --> 00:14:12,760
Right.

240
00:14:12,760 --> 00:14:15,560
But then you've got David Attenborough and you've got, of course you could, you could put Richard

241
00:14:15,560 --> 00:14:16,560
Dawkins in there.

242
00:14:16,560 --> 00:14:17,840
Perhaps the evolutionary biology.

243
00:14:17,840 --> 00:14:23,040
There are others, but David Attenborough is really the king of biology and, again,

244
00:14:23,040 --> 00:14:26,160
there are others, there's, you know, Bear Grylls and the others.

245
00:14:26,160 --> 00:14:28,960
I suppose they're biologists too, but there are these really

246
00:14:28,960 --> 00:14:32,920
sort of respectable people representing biology and physics.

247
00:14:32,920 --> 00:14:35,400
And I said respectable because they're knowledgeable.

248
00:14:35,400 --> 00:14:39,080
You want to watch them; they're entertaining, they're wise, they're also like pleasant

249
00:14:39,080 --> 00:14:43,600
characters like you, you feel like they're good people beyond just being scientists.

250
00:14:43,600 --> 00:14:48,120
And then you've got Walter White who, who doesn't know what he's doing at the beginning.

251
00:14:48,120 --> 00:14:51,720
And he, you know, kills a lot of people by like the third episode.

252
00:14:51,720 --> 00:14:54,000
And he's just evil, completely evil.

253
00:14:54,000 --> 00:14:56,480
And unfortunately he's the most famous chemist there is.

254
00:14:56,480 --> 00:14:57,480
Who else is there?

255
00:14:57,480 --> 00:15:02,000
Like I ask this every, every single time I do a keynote on this, I literally ask who is more

256
00:15:02,000 --> 00:15:03,960
famous as a chemist than Walter White.

257
00:15:03,960 --> 00:15:05,800
And I never get an answer.

258
00:15:05,800 --> 00:15:09,800
The one answer that people do attempt is they say, ah, the presenter of Chemistry:

259
00:15:09,800 --> 00:15:10,800
A Volatile History.

260
00:15:10,800 --> 00:15:15,640
It was a British documentary series, but the, um, presenter of that was Jim Al-Khalili,

261
00:15:15,640 --> 00:15:16,640
who was a physicist, right?

262
00:15:16,640 --> 00:15:19,240
So the BBC did a documentary series on chemistry.

263
00:15:19,240 --> 00:15:21,800
Firstly, they named it Chemistry: A Volatile History.

264
00:15:21,800 --> 00:15:23,640
That's hardly a flattering title.

265
00:15:23,640 --> 00:15:27,320
Secondly, they never got further than three episodes, which just introduces the periodic

266
00:15:27,320 --> 00:15:28,320
table and that's it.

267
00:15:28,320 --> 00:15:29,320
No more.

268
00:15:29,320 --> 00:15:35,240
So it would be like a food documentary where you're just studying episode one, flour,

269
00:15:35,240 --> 00:15:37,560
episode two, eggs, episode three, milk.

270
00:15:37,560 --> 00:15:41,720
And you never actually look at any recipes, building compounds together.

271
00:15:41,720 --> 00:15:43,400
So, on to the third thing: he's a physicist.

272
00:15:43,400 --> 00:15:44,960
He's a professor of physics.

273
00:15:44,960 --> 00:15:47,760
So he's, you know, they couldn't find an actual chemist to present it, right?

274
00:15:47,760 --> 00:15:51,160
So, yeah, I don't think there are any more famous chemists than Walter White.

275
00:15:51,160 --> 00:15:54,960
And that's another problem in the mass media, though, that the only way we can do something

276
00:15:54,960 --> 00:16:00,720
about that is, is long term find another chemistry hero and make something that is bigger than

277
00:16:00,720 --> 00:16:01,720
Breaking Bad.

278
00:16:01,720 --> 00:16:04,520
But that is just such a huge challenge; who's really able to do that?

279
00:16:04,520 --> 00:16:11,720
I don't know of one person. Well, it costs a lot of money, like, it's, um,

280
00:16:11,720 --> 00:16:15,280
and to make it engaging and it would, it would be a team of a hundred people and, and it

281
00:16:15,280 --> 00:16:18,800
would take many, many years, but, uh, some people say Bill Nye as well.

282
00:16:18,800 --> 00:16:21,760
So I looked into Bill Nye, but I'm, I'm a big fan of Bill Nye.

283
00:16:21,760 --> 00:16:24,040
A lot of kids grew up watching him this generation as well.

284
00:16:24,040 --> 00:16:26,160
He's still popular now.

285
00:16:26,160 --> 00:16:32,000
And, uh, if you analyze the topics of Bill Nye's episodes, they're almost all physics and

286
00:16:32,000 --> 00:16:33,000
engineering.

287
00:16:33,000 --> 00:16:36,680
There are, I think, two chemistry episodes out of five whole series.

288
00:16:36,680 --> 00:16:39,040
So he's not a chemist at all.

289
00:16:39,040 --> 00:16:43,680
He's not even, I wouldn't even call him a balanced scientist across all the disciplines.

290
00:16:43,680 --> 00:16:46,120
He's very much physics and engineering, which makes sense.

291
00:16:46,120 --> 00:16:47,120
That's his background.

292
00:16:47,120 --> 00:16:49,400
And I think third place was biology as a theme.

293
00:16:49,400 --> 00:16:51,240
So, so yeah, I don't count Bill Nye.

294
00:16:51,240 --> 00:16:55,400
He's not, he's not more influential than, uh, than Breaking Bad and marketing, which is

295
00:16:55,400 --> 00:16:58,800
where adults learn and we can't, at the moment, compete with that.

296
00:16:58,800 --> 00:16:59,800
Yeah, you're right.

297
00:16:59,800 --> 00:17:05,400
It's very difficult competing with mass media because at the, at the whim of some entertainer,

298
00:17:05,400 --> 00:17:09,000
you'll have a flood of hundreds of thousands of people doing what they want them to do.

299
00:17:09,000 --> 00:17:15,120
It's, it's almost like magic that we've created these monstrous sort of entities within

300
00:17:15,120 --> 00:17:16,640
our society.

301
00:17:16,640 --> 00:17:21,200
Um, sort of to sort of backtrack once more, you mentioned something happened during

302
00:17:21,200 --> 00:17:27,760
the 1960s, which sort of put a bullseye on chemicals.

303
00:17:27,760 --> 00:17:28,760
What was that?

304
00:17:28,760 --> 00:17:30,040
Well, a couple of things.

305
00:17:30,040 --> 00:17:33,200
Well, let's just backtrack to the 1950s, right?

306
00:17:33,200 --> 00:17:38,480
So 1950s, chemophobia was at a low, and the reason was, I think, post Second World War in

307
00:17:38,480 --> 00:17:43,720
Europe, we were, um, I mean, many more adults, particularly women, were getting back into the workforce.

308
00:17:43,720 --> 00:17:48,160
We were looking for ways of saving time, also conserving food for

309
00:17:48,160 --> 00:17:49,160
a long time.

310
00:17:49,160 --> 00:17:54,360
Society was changing in many ways that meant we had a need for time-saving plastics, particularly

311
00:17:54,360 --> 00:17:56,400
single-use plastics.

312
00:17:56,400 --> 00:17:59,800
They were, they were essential at that time.

313
00:17:59,800 --> 00:18:04,200
Um, it's a shame now we've forgotten how good they are, but, you know, just people were throwing

314
00:18:04,200 --> 00:18:05,200
them in the ocean.

315
00:18:05,200 --> 00:18:08,200
But anyway, in the 1950s, it was at a low.

316
00:18:08,200 --> 00:18:12,880
We get into the 60s, it starts to rise again because in 1962, uh, Rachel Carson wrote

317
00:18:12,880 --> 00:18:14,640
a book, right, Silent Spring.

318
00:18:14,640 --> 00:18:18,000
So if any of the listeners haven't read Silent Spring, I recommend actually

319
00:18:18,000 --> 00:18:19,000
getting it.

320
00:18:19,000 --> 00:18:20,560
It's, uh, it's a PDF online.

321
00:18:20,560 --> 00:18:22,200
You can, it's open, open source.

322
00:18:22,200 --> 00:18:25,680
You can just, uh, search Silent Spring, Rachel Carson, but bear in mind the following things,

323
00:18:25,680 --> 00:18:26,680
right?

324
00:18:26,680 --> 00:18:31,680
So Silent Spring is, it's in two parts, right?

325
00:18:31,680 --> 00:18:39,160
The introduction is a dystopian fiction about a completely unrealistic future where Earth

326
00:18:39,160 --> 00:18:44,320
has become devoid of all life because chemicals have taken over and the, the earth is dead.

327
00:18:44,320 --> 00:18:47,720
Basically, that's chapter one, and it's fiction.

328
00:18:47,720 --> 00:18:48,720
Really unrealistic.

329
00:18:48,720 --> 00:18:54,920
Then the rest of the book is nonfiction, and it's about real concerns about DDT, mostly

330
00:18:54,920 --> 00:18:55,920
and other things.

331
00:18:55,920 --> 00:19:00,640
So this book got put on the nonfiction shelf because it is 90% nonfiction.

332
00:19:00,640 --> 00:19:04,760
However, anybody picking it up is going to start reading at chapter one because it's a

333
00:19:04,760 --> 00:19:08,640
nonfiction book, and they're working through the fiction part, the first

334
00:19:08,640 --> 00:19:11,440
bit at the beginning, not realizing that it is fiction.

335
00:19:11,440 --> 00:19:13,040
And I've got some quotes from the book.

336
00:19:13,040 --> 00:19:14,040
You can find these online.

337
00:19:14,040 --> 00:19:15,280
It's a very short first chapter, by the way.

338
00:19:15,280 --> 00:19:16,440
It's like three pages.

339
00:19:16,440 --> 00:19:20,960
And they're still horrible; the script is harrowing, about how all the birds have died and

340
00:19:20,960 --> 00:19:25,880
people have died and the rivers are all poisoned and dried up and it's horrible, but

341
00:19:25,880 --> 00:19:29,480
the fact that it was on the nonfiction shelf scared people so much.

342
00:19:29,480 --> 00:19:31,200
Maybe that was what, what she wanted.

343
00:19:31,200 --> 00:19:36,560
I don't know, but she has, I mean, very much, she wrote this, this book for multiple reasons,

344
00:19:36,560 --> 00:19:41,480
I mean, we could go into that, but the book was tainted by her own personal experience,

345
00:19:41,480 --> 00:19:42,480
I think.

346
00:19:42,480 --> 00:19:48,440
The author was on chemotherapy at the time, 1962 chemotherapy, which was

347
00:19:48,440 --> 00:19:52,320
not particularly pleasant, it wasn't as advanced as today, and she was suffering a lot.

348
00:19:52,320 --> 00:19:58,320
So her book against chemicals in society was really a reflection of her own suffering

349
00:19:58,320 --> 00:20:02,600
against the chemotherapy symptoms, but she sort of projected that onto the world.

350
00:20:02,600 --> 00:20:07,800
And it, it is very sad actually when you read it, but the book was massively influential,

351
00:20:07,800 --> 00:20:09,640
1962, Rachel Carson's book.

352
00:20:09,640 --> 00:20:13,040
And it sort of made chemophobia peak again.

353
00:20:13,040 --> 00:20:17,240
And then a few other things happened in the 1960s, so obviously the space race is going

354
00:20:17,240 --> 00:20:18,240
on.

355
00:20:18,240 --> 00:20:20,160
And, by the way, the space race wouldn't

356
00:20:20,160 --> 00:20:24,080
have happened without all the new materials and the plastics and everything else that made

357
00:20:24,080 --> 00:20:25,080
it possible, right?

358
00:20:25,080 --> 00:20:28,960
So, so on the one hand, we're saying chemicals are evil, but at the same time, we're loving

359
00:20:28,960 --> 00:20:29,960
the space race.

360
00:20:29,960 --> 00:20:33,280
Like, that's, you know, we're, we're taking advantage of all the materials of chemistry

361
00:20:33,280 --> 00:20:34,960
and chemical industries made for us.

362
00:20:34,960 --> 00:20:36,280
Then you get to 1968.

363
00:20:36,280 --> 00:20:42,560
1968 was, I think it was Apollo 8, where they went around the back of the moon and

364
00:20:42,560 --> 00:20:45,760
took a photo of Earth for the first time and I got that photo on my wall.

365
00:20:45,760 --> 00:20:51,200
So that photo is called, um, Earthrise, and you can see the

366
00:20:51,200 --> 00:20:53,760
Earth appearing to rise over the moon.

367
00:20:53,760 --> 00:20:56,440
And it was a massively influential picture.

368
00:20:56,440 --> 00:21:01,440
That was the first image of the Earth as a whole from a distance, it's hard to imagine

369
00:21:01,440 --> 00:21:05,720
there was a time when we'd never seen the Earth as a whole, but that image changed people's

370
00:21:05,720 --> 00:21:06,720
perspective.

371
00:21:06,720 --> 00:21:11,520
And it, it made people realize that the Earth is sort of wholesome and beautiful when you

372
00:21:11,520 --> 00:21:15,240
can't see the people on it because you zoomed out far enough.

373
00:21:15,240 --> 00:21:18,920
They took, they took a sequel, I suppose, in 1972, it's called the Blue Marble and the

374
00:21:18,920 --> 00:21:23,160
Blue Marble was better because it's clearer, it's bigger and you can see the entire Earth;

375
00:21:23,160 --> 00:21:24,160
they took it from the right angle.

376
00:21:24,160 --> 00:21:28,200
So the sun is on the entire front of the Earth and it's all lit up rather than the first

377
00:21:28,200 --> 00:21:32,240
one was like half lit, like a half moon type, half Earth, I suppose, half, half light,

378
00:21:32,240 --> 00:21:35,280
half dark, but this one, the Blue Marble, 1972, shows the

379
00:21:35,280 --> 00:21:39,360
whole Earth. And Neil deGrasse Tyson makes a really interesting point about this.

380
00:21:39,360 --> 00:21:44,480
He says that that photo is the first map, the first world map, where there were no

381
00:21:44,480 --> 00:21:45,480
borders drawn on it.

382
00:21:45,480 --> 00:21:50,020
And he's got this great video, he gives a speech, a keynote speech in the US, and he lists the

383
00:21:50,020 --> 00:21:54,360
effects that they had on the environmental movement as people realize the Earth is beautiful

384
00:21:54,360 --> 00:21:59,600
as a whole without the sort of the marks that we've drawn on it to try and divide us.

385
00:21:59,600 --> 00:22:04,400
They made people feel united and almost worship Mother Earth. It became one

386
00:22:04,400 --> 00:22:06,360
of the most famous photos of all time.

387
00:22:06,360 --> 00:22:09,440
What people don't realize though is that was on the front page of the New York Times.

388
00:22:09,440 --> 00:22:13,120
Within weeks, you then had running girl, I don't know if you're familiar with the running

389
00:22:13,120 --> 00:22:14,520
girl photo from Vietnam.

390
00:22:14,520 --> 00:22:16,040
So this is a photo taken.

391
00:22:16,040 --> 00:22:20,480
Yeah, it's a horrible photo, but it's, it's the photo of the aftermath of a chemical weapon

392
00:22:20,480 --> 00:22:22,720
attack on a Vietnamese village.

393
00:22:22,720 --> 00:22:26,920
And there was a half-Vietnamese, half-American photographer who went in and took

394
00:22:26,920 --> 00:22:31,320
this photo of people fleeing the chemical attack.

395
00:22:31,320 --> 00:22:34,120
And, um, those two photos were both

396
00:22:34,120 --> 00:22:36,760
on the front page of the New York Times within weeks of each other.

397
00:22:36,760 --> 00:22:42,600
So that juxtaposition in 1972 made people realize: Mother Earth, beautiful, whole, untouched,

398
00:22:42,600 --> 00:22:46,560
pristine; and people, evil, using chemicals to cause death and destruction.

399
00:22:46,560 --> 00:22:51,120
And that juxtaposition really just accelerated the fear of chemicals.

400
00:22:51,120 --> 00:22:55,760
And from then on, you had, um, everything: the Environmental Protection Agency was founded,

401
00:22:55,760 --> 00:23:00,000
the whole catalytic converter thing, um, the mandates I guess came in,

402
00:23:00,000 --> 00:23:02,280
the Clean Air Act came in.

403
00:23:02,280 --> 00:23:06,120
You also had all sorts of other lawsuits and cleanups for chemical disasters that might

404
00:23:06,120 --> 00:23:08,080
have been ignored 50 years before.

405
00:23:08,080 --> 00:23:10,240
You know, oil spills and all these other things as well.

406
00:23:10,240 --> 00:23:13,320
So it's, uh, yeah, that, that's basically what happened in the 1960s.

407
00:23:13,320 --> 00:23:16,120
A whole sort of series of social events, which accelerated it.

408
00:23:16,120 --> 00:23:19,960
And then I think by the late 80s, we'd forgotten again, somewhat.

409
00:23:19,960 --> 00:23:21,840
It went.

410
00:23:21,840 --> 00:23:22,840
We were busy.

411
00:23:22,840 --> 00:23:26,600
I don't know what people were doing in the late 80s, um, worried about East versus West

412
00:23:26,600 --> 00:23:31,400
politics and, and busy making money, I suppose, but then it's, it picked up again.

413
00:23:31,400 --> 00:23:32,400
And, uh, here we are today.

414
00:23:32,400 --> 00:23:36,080
So again, it comes and goes, but the 60s was a particularly interesting time.

415
00:23:36,080 --> 00:23:40,760
I think that with the advent of social media, the whole thing seems to have come back.

416
00:23:40,760 --> 00:23:42,160
Is that what your observation has been?

417
00:23:42,160 --> 00:23:43,920
Oh, totally, but it's been a long time coming.

418
00:23:43,920 --> 00:23:44,920
It's before social media.

419
00:23:44,920 --> 00:23:47,320
So I mean, I got a chapter in, in the book that I wrote on this.

420
00:23:47,320 --> 00:23:49,680
It's, I actually called it post-modernism.

421
00:23:49,680 --> 00:23:52,080
And I, I called it out like in 2015.

422
00:23:52,080 --> 00:23:54,480
The first time I had a slide, I found a slide from a keynote.

423
00:23:54,480 --> 00:23:58,120
I blamed post-modernism for the recent uptick, um, back in 2015.

424
00:23:58,120 --> 00:24:01,600
And back then, what I said was, actually, let me bring up the slide here.

425
00:24:01,600 --> 00:24:02,600
I'll read it to you.

426
00:24:02,600 --> 00:24:06,440
Okay, this is from 2015, before it became a, uh, a scapegoat for everything today.

427
00:24:06,440 --> 00:24:08,800
Um, experts are no longer automatically trusted.

428
00:24:08,800 --> 00:24:10,840
A cult of the amateur has emerged.

429
00:24:10,840 --> 00:24:12,760
Scientific truth is just one opinion.

430
00:24:12,760 --> 00:24:13,760
Wow.

431
00:24:13,760 --> 00:24:16,240
So, and I, I don't like to overthink things, right?

432
00:24:16,240 --> 00:24:19,360
But the post-modernism movement, it's a bit broad.

433
00:24:19,360 --> 00:24:23,960
It's, it's, it's hard to define exactly, but one of the things that we're seeing recently is,

434
00:24:23,960 --> 00:24:27,080
in my perception, and the listeners will, will chip in and they'll comment and go,

435
00:24:27,080 --> 00:24:28,520
"No, no, no, that's not post-modernism."

436
00:24:28,520 --> 00:24:29,520
And that's okay.

437
00:24:29,520 --> 00:24:30,760
I'm interested in having that discussion.

438
00:24:30,760 --> 00:24:35,840
So, it's renegotiating the norms that we've constructed through the modern era.

439
00:24:35,840 --> 00:24:36,840
Just renegotiating everything.

440
00:24:36,840 --> 00:24:40,000
Just knock it all down and rebuild from scratch, just in case we got anything wrong.

441
00:24:40,000 --> 00:24:44,720
That includes not automatically assuming that experts and educated people should be trusted

442
00:24:44,720 --> 00:24:48,520
at the top; we should renegotiate that and figure it out from first principles all over again.

443
00:24:48,520 --> 00:24:51,720
And what that's resulted in is, like, knocking down the dominoes, knocking down the house

444
00:24:51,720 --> 00:24:54,920
of cards, and just saying, "Okay, we're all on the bottom, now we're all equal."

445
00:24:54,920 --> 00:25:00,720
The doctor's view is equal with this blogger's view, which is equal with the president's

446
00:25:00,720 --> 00:25:02,520
view, which is equal with my uncle's view.

447
00:25:02,520 --> 00:25:03,520
They're all equal.

448
00:25:03,520 --> 00:25:07,600
And social media has allowed that to happen, but that mindset came first.

449
00:25:07,600 --> 00:25:08,600
We wanted that.

450
00:25:08,600 --> 00:25:13,320
We wanted that equality of views and equality of exposure of views and equality of respect

451
00:25:13,320 --> 00:25:15,880
of all views, even if they're ridiculous.

452
00:25:15,880 --> 00:25:18,320
And we wanted that before social media came.

453
00:25:18,320 --> 00:25:20,160
Social media just provided that.

454
00:25:20,160 --> 00:25:21,960
And now we're thriving.

455
00:25:21,960 --> 00:25:24,680
Well, we're not.

456
00:25:24,680 --> 00:25:28,160
We, we, we are lapping it up even though it's really not good for us.

457
00:25:28,160 --> 00:25:32,680
We really should be trusting educated opinions much more than uneducated ones.

458
00:25:32,680 --> 00:25:35,360
But you know, you know what's going on out there.

459
00:25:35,360 --> 00:25:39,520
There's all sorts of nonsense on social media that proliferates, and people listen.

460
00:25:39,520 --> 00:25:45,000
So looking at the global educational landscape, like, for example, you're down in Australia,

461
00:25:45,000 --> 00:25:48,720
and you've also taught in China for, for about four years as well.

462
00:25:48,720 --> 00:25:53,760
So could you do like a side by side comparison to sort of analyze what your observations are,

463
00:25:53,760 --> 00:25:58,880
especially pertaining to chemistry, like, what is the global situation and where's the net effect

464
00:25:58,880 --> 00:25:59,880
going towards?

465
00:25:59,880 --> 00:26:03,200
Are we moving in a positive direction in different countries?

466
00:26:03,200 --> 00:26:08,400
Or is it going to be sort of a big problem in the near future when it comes to education

467
00:26:08,400 --> 00:26:12,840
and learning and especially taking into the context of the power of social media and the

468
00:26:12,840 --> 00:26:15,200
same effects that you just mentioned?

469
00:26:15,200 --> 00:26:19,720
Social media is a major problem and that's a problem that we are facing, I say, in, in the

470
00:26:19,720 --> 00:26:23,840
West, I don't like the term the West, like I don't like the term chemophobia, right?

471
00:26:23,840 --> 00:26:25,320
But, I mean, how else do we?

472
00:26:25,320 --> 00:26:31,120
The G7, I suppose, I mean, the sort of democratic open wealthy countries, right?

473
00:26:31,120 --> 00:26:32,840
These, it's a problem we have.

474
00:26:32,840 --> 00:26:37,720
It's a serious problem, social media, and I have spent four years in China working there,

475
00:26:37,720 --> 00:26:41,320
and you know, I speak fluent Chinese, and I always keep up with developments

476
00:26:41,320 --> 00:26:42,800
in China and see the direction it's going.

477
00:26:42,800 --> 00:26:43,800
I love China.

478
00:26:43,800 --> 00:26:45,600
I want China to thrive and succeed.

479
00:26:45,600 --> 00:26:51,480
I want it to be one of the world leaders culturally, economically, to be safe and stable and a great

480
00:26:51,480 --> 00:26:53,400
place to live and all of that.

481
00:26:53,400 --> 00:26:58,000
One thing that they're doing, I don't want to say well, but better than us, is managing

482
00:26:58,000 --> 00:26:59,600
fake news online.

483
00:26:59,600 --> 00:27:05,160
And I don't think they've got it 100% right, but fake news can't really spread that fast

484
00:27:05,160 --> 00:27:06,160
through China.

485
00:27:06,160 --> 00:27:12,240
I mean, I'm all for free speech, but when, you know, it's, we are suffering from the consequences

486
00:27:12,240 --> 00:27:15,640
of too much exposure of freedom of speech.

487
00:27:15,640 --> 00:27:19,920
I think that summarizes it: I believe in freedom of speech, but we have algorithms that

488
00:27:19,920 --> 00:27:23,320
amplify dumb things, and that's what China doesn't want to allow.

489
00:27:23,320 --> 00:27:28,480
So you can say whatever you want in China on social media, but you're limited: each group is

490
00:27:28,480 --> 00:27:32,960
limited to 500 people, and celebrities can't have more than, there's a certain number of followers

491
00:27:32,960 --> 00:27:36,240
or the way, sorry, the way I understand it, the number of likes on a post kind of

492
00:27:36,240 --> 00:27:37,240
maxes out.

493
00:27:37,240 --> 00:27:40,680
So you can't, nobody's more famous than say 100,000 likes.

494
00:27:40,680 --> 00:27:46,280
It just says more than 100,000, you don't know who's super huge and who's just slightly huge.

495
00:27:46,280 --> 00:27:48,440
So this is a thing, they're doing a few things.

496
00:27:48,440 --> 00:27:53,000
Yeah, once you get to 100,000, you don't, it just says 100,000 plus, you've reached fame,

497
00:27:53,000 --> 00:27:55,600
and you don't know whether it's millions or tens of millions, you don't know.

498
00:27:55,600 --> 00:28:00,640
It creates a sort of equality at the top while still allowing a little freedom. I'm not saying

499
00:28:00,640 --> 00:28:01,640
it's a perfect system at all.

500
00:28:01,640 --> 00:28:02,640
It's really not.

501
00:28:02,640 --> 00:28:06,000
There are lots of things I don't like about that system, like there's lots of information

502
00:28:06,000 --> 00:28:08,360
you can't find, and Google doesn't work.

503
00:28:08,360 --> 00:28:12,240
There's lots of other services that don't work, but yeah, it's not, it's not perfect at all,

504
00:28:12,240 --> 00:28:15,720
at all, but we are suffering from the consequences of doing the opposite, which is just complete

505
00:28:15,720 --> 00:28:19,000
freedom and, um, it's not the freedom that's the problem.

506
00:28:19,000 --> 00:28:24,880
It's the amplification of misleading things and, and then large actors are playing the

507
00:28:24,880 --> 00:28:29,520
system to look like a large number of small actors and manipulate people into thinking

508
00:28:29,520 --> 00:28:34,000
stuff, but we wanted it, that's my point, we wanted it before social media came, we wanted

509
00:28:34,000 --> 00:28:38,200
equality of thoughts, everybody's thoughts to be treated equally with equal respect and

510
00:28:38,200 --> 00:28:39,200
equal air time.

511
00:28:39,200 --> 00:28:40,720
We actually wanted that before it came along.

512
00:28:40,720 --> 00:28:44,800
Now we've got this problem, which exists here, not so much in China, which

513
00:28:44,800 --> 00:28:48,280
exists here, that science is just an opinion, that you can go through the

514
00:28:48,280 --> 00:28:53,720
rigor of academia and years, you know, master's degree, PhD, the, the rigor of a scientific

515
00:28:53,720 --> 00:28:58,440
study for many years and come out, and then you've got somebody just saying, no, fake news,

516
00:28:58,440 --> 00:29:03,440
or no, no, you can just make something up and it gets treated with equal respect among

517
00:29:03,440 --> 00:29:04,440
an educated audience.

518
00:29:04,440 --> 00:29:10,080
And I think, I call it postmodernism; it's one of the tenets of postmodernism,

519
00:29:10,080 --> 00:29:14,960
we just don't trust a hierarchy we've built, which was actually based on ability and education,

520
00:29:14,960 --> 00:29:15,960
we've brought it right down.

521
00:29:15,960 --> 00:29:19,400
But the Chinese don't have this so much, they're quite happy to look up to experts and again,

522
00:29:19,400 --> 00:29:23,400
not a perfect system, but they look up to the experts and go, yep, I agree with that.

523
00:29:23,400 --> 00:29:26,760
Well, yeah, that's also not, that's also not right, you've got to be skeptical.

524
00:29:26,760 --> 00:29:27,760
Yeah.

525
00:29:27,760 --> 00:29:30,440
You know, what happened here, like, it's really very unfortunate with

526
00:29:30,440 --> 00:29:35,320
our system here, we had experts here saying, don't wear a mask and then you must wear a mask

527
00:29:35,320 --> 00:29:39,320
and then we also had, no, no, no, how dare you say it came from a lab, and then it came from

528
00:29:39,320 --> 00:29:40,320
a lab, right?

529
00:29:40,320 --> 00:29:43,880
So, and there are other things now, I could, I am not going to say what they are, but I've

530
00:29:43,880 --> 00:29:48,280
got a list of predictions of things which I think will become untrue in the next couple

531
00:29:48,280 --> 00:29:51,720
of years, things you cannot say right now.

532
00:29:51,720 --> 00:29:58,240
But come back in 2025 and we'll see. I should tell you offline, but there are things where the

533
00:29:58,240 --> 00:30:00,160
official narrative will completely change.

534
00:30:00,160 --> 00:30:04,600
So, no one's doing a perfect job at this, but we are suffering the consequences of social

535
00:30:04,600 --> 00:30:09,440
media and having everyone's opinions taken equally, I don't know what the solution is.

536
00:30:09,440 --> 00:30:12,960
I really don't, but more discussion is certainly one of the parts of the solution.

537
00:30:12,960 --> 00:30:19,200
One of the ways I look at it is that America, for example, our country has really tied

538
00:30:19,200 --> 00:30:21,320
influence to monetary gains.

539
00:30:21,320 --> 00:30:22,560
That's a huge problem here.

540
00:30:22,560 --> 00:30:28,160
And I think that's a, that's a powerful driving force where science gets trumped by someone

541
00:30:28,160 --> 00:30:29,840
with a large following.

542
00:30:29,840 --> 00:30:34,280
So, yes, I do think that influence is a huge issue.

543
00:30:34,280 --> 00:30:38,360
And another thing is just to add to the whole freedom of speech concept here is that I'm

544
00:30:38,360 --> 00:30:40,160
not against it either.

545
00:30:40,160 --> 00:30:46,600
But the way I think about it is, when we entered the industrial age, we had ways to sort of

546
00:30:46,600 --> 00:30:51,840
expedite processes, we had rules in place, we had everything streamlined.

547
00:30:51,840 --> 00:30:56,120
We're in the information age right now and I think we don't have the right tools to sort

548
00:30:56,120 --> 00:30:58,640
of navigate the waters.

549
00:30:58,640 --> 00:31:03,720
Like, for example, if somebody's been working 40, 60 hours a week, they're going to go online

550
00:31:03,720 --> 00:31:07,680
and look up something that their friend mentioned at work, and they are going to find the

551
00:31:07,680 --> 00:31:12,360
most popular thing because maybe popular is the best and it seems to work.

552
00:31:12,360 --> 00:31:17,800
So by that proxy and sort of pipeline, these people get brainwashed or indoctrinated in

553
00:31:17,800 --> 00:31:22,760
bad information, which brings me to a very interesting question for you, which is the rise

554
00:31:22,760 --> 00:31:24,520
of artificial intelligence.

555
00:31:24,520 --> 00:31:31,040
So what's going on with that, what's your sort of outlook on it, and are there any concerns

556
00:31:31,040 --> 00:31:34,480
that you have or do you see a positive light at the end of the tunnel?

557
00:31:34,480 --> 00:31:37,480
Oh, definitely positive, very much pro AI.

558
00:31:37,480 --> 00:31:38,480
I use it.

559
00:31:38,480 --> 00:31:45,960
I subscribe to ChatGPT, and the pace of advancement is mind-blowing.

560
00:31:45,960 --> 00:31:48,800
So we have to embrace it like every other technology.

561
00:31:48,800 --> 00:31:51,000
It's not as scary as people make out.

562
00:31:51,000 --> 00:31:52,800
It's not as scary as people say.

563
00:31:52,800 --> 00:31:59,080
So it's probably as world changing as maybe, I would say somewhere between the invention

564
00:31:59,080 --> 00:32:03,600
of DVDs and the invention of the internet, I'd put it somewhere between there.

565
00:32:03,600 --> 00:32:06,880
It's less influential than the internet, but it's more influential than DVDs.

566
00:32:06,880 --> 00:32:09,640
You know, the ability to watch high-quality digital movies.

567
00:32:09,640 --> 00:32:12,320
It's another tool and we have to use those tools.

568
00:32:12,320 --> 00:32:13,320
This is our human history.

569
00:32:13,320 --> 00:32:16,520
Again, I go back to ancient humans, evolutionary history.

570
00:32:16,520 --> 00:32:24,040
The pattern of human evolution has been always to concentrate more energy use and more knowledge

571
00:32:24,040 --> 00:32:26,600
into ever smaller units and use them faster.

572
00:32:26,600 --> 00:32:30,000
That's been an exponential, that's been the pattern of human development.

573
00:32:30,000 --> 00:32:32,920
So AI is just the next technological step.

574
00:32:32,920 --> 00:32:37,120
So I mean, AI comes out of sort of predictive text and cluster analysis machine learning

575
00:32:37,120 --> 00:32:38,120
for statistics.

576
00:32:38,120 --> 00:32:40,200
It's only a step above that.

577
00:32:40,200 --> 00:32:43,400
It's just gone from being a tool for professionals to public, right?

578
00:32:43,400 --> 00:32:47,080
And then ChatGPT just made it public, but these tools aren't just out of the blue, like they

579
00:32:47,080 --> 00:32:48,880
appeared, sort of, from the sky.

580
00:32:48,880 --> 00:32:53,040
They were already tools and then one step more and now they're public.

581
00:32:53,040 --> 00:32:54,720
Take ChatGPT, for example.

582
00:32:54,720 --> 00:32:56,440
Just think about predictive text.

583
00:32:56,440 --> 00:33:00,440
We've had that before and no one was afraid of, you know, at least a couple of years ago,

584
00:33:00,440 --> 00:33:01,440
nobody was afraid of it.

585
00:33:01,440 --> 00:33:04,480
Maybe at the beginning people were worried that your phone was learning what you say and

586
00:33:04,480 --> 00:33:05,880
then giving you the next word.

587
00:33:05,880 --> 00:33:09,480
But essentially, ChatGPT is a glorified version of that.

588
00:33:09,480 --> 00:33:14,480
And a model that is trained on large amounts of text from the internet is the next logical

589
00:33:14,480 --> 00:33:16,240
step and we have to embrace it.

590
00:33:16,240 --> 00:33:18,880
And the people who don't embrace it will be left behind.

591
00:33:18,880 --> 00:33:19,880
It's inevitable.
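
As a rough, illustrative sketch of the "glorified predictive text" idea described here, the snippet below is a toy bigram model in Python that suggests the next word purely from counts. The corpus, function names, and printed outputs are assumptions made only for this example; real systems like ChatGPT are vastly larger neural networks, but the next-word-prediction principle is the same.

```python
# A minimal sketch (illustrative only, not how ChatGPT is actually built):
# a toy bigram "predictive text" model that suggests the next word from counts.
from collections import Counter, defaultdict

def train_bigram_model(sentences):
    """Count, for each word, which words tend to follow it."""
    follows = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.lower().split()
        for current_word, next_word in zip(words, words[1:]):
            follows[current_word][next_word] += 1
    return follows

def predict_next(model, word):
    """Return the most frequently observed follower of `word`, or None."""
    word = word.lower()
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

# Tiny made-up training corpus (an assumption, chosen only for the example).
corpus = [
    "chemicals are natural",
    "chemicals are everywhere",
    "nature is chemical",
]
model = train_bigram_model(corpus)
print(predict_next(model, "chemicals"))  # -> "are"
print(predict_next(model, "are"))        # -> "natural" (first-seen tie wins)
```

Scaling this same idea up from simple word counts to a large network trained on internet text is, loosely, the step being described in the conversation.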

592
00:33:19,880 --> 00:33:21,280
I mean, we could argue whether it's going to be good

593
00:33:21,280 --> 00:33:25,200
or it's going to be bad, but it's just the trajectory of human development and we have to

594
00:33:25,200 --> 00:33:26,960
be on board or we'll be left behind.

595
00:33:26,960 --> 00:33:32,520
It's like, you know, some, particularly older people did not get on board with the internet.

596
00:33:32,520 --> 00:33:38,400
And you know, not being able to use the internet now is a bit of a problem; it causes inconveniences.

597
00:33:38,400 --> 00:33:42,720
And you know, you've got to walk into the town centre to do everything.

598
00:33:42,720 --> 00:33:45,840
But those who don't get on board with AI will have those kinds of problems in the future

599
00:33:45,840 --> 00:33:46,840
as well.

600
00:33:46,840 --> 00:33:50,720
It'll just be harder to do stuff, or it won't be as easy as it is for other people.

601
00:33:50,720 --> 00:33:54,120
No, I'm a big fan of AI and self-driving cars.

602
00:33:54,120 --> 00:33:56,000
We've got to get the policy on board though.

603
00:33:56,000 --> 00:33:58,040
So two things we've got to be aware of.

604
00:33:58,040 --> 00:34:01,160
Number one, first we get on board with it.

605
00:34:01,160 --> 00:34:03,920
We don't really have a choice there, but the two things we've got to be aware of: we

606
00:34:03,920 --> 00:34:07,800
always need to have a human being on top of the AI.

607
00:34:07,800 --> 00:34:14,080
Always we have to have a human, one human responsible for what the AI does legally.

608
00:34:14,080 --> 00:34:18,800
So, you know, in the same way that we could build a tool, an AI is just another tool.

609
00:34:18,800 --> 00:34:20,080
We could build a hammer, for example.

610
00:34:20,080 --> 00:34:25,920
You can't sue the hammer company if you injure yourself or cause damage to the product

611
00:34:25,920 --> 00:34:26,920
you're making, right?

612
00:34:26,920 --> 00:34:27,920
It's the person's fault.

613
00:34:27,920 --> 00:34:32,440
The person has to take on that responsibility of using the tool and be legally responsible

614
00:34:32,440 --> 00:34:35,680
and be able to be punished for anything bad the tool does.

615
00:34:35,680 --> 00:34:39,880
If the tool malfunctions and hurts someone, it has to be the user's fault, and it doesn't work any

616
00:34:39,880 --> 00:34:40,880
other way.

617
00:34:40,880 --> 00:34:44,920
I'm not saying that's like ethically the right thing to do, but legally it doesn't work any

618
00:34:44,920 --> 00:34:45,920
other way.

619
00:34:45,920 --> 00:34:49,480
Otherwise we're in this quandary of like, oh, machine did something.

620
00:34:49,480 --> 00:34:52,400
Nobody's responsible, so it's going to happen again then.

621
00:34:52,400 --> 00:34:55,480
And then the machines take over.

622
00:34:55,480 --> 00:34:57,320
We have to have one, yeah.

623
00:34:57,320 --> 00:35:02,520
When one person is responsible for each AI tool that is being used, so, you know,

624
00:35:02,520 --> 00:35:03,920
they'll be careful with it.

625
00:35:03,920 --> 00:35:04,920
They'll exercise restraint.

626
00:35:04,920 --> 00:35:08,960
They'll put their human judgment onto it and say, do I really want to put this up?

627
00:35:08,960 --> 00:35:14,680
So everything I've used, AI for, I've ultimately put my name on and said, I will be responsible

628
00:35:14,680 --> 00:35:16,160
for this if there's anything wrong with it.

629
00:35:16,160 --> 00:35:17,680
I check it, obviously, for that reason.

630
00:35:17,680 --> 00:35:21,240
But if I wasn't sitting on top of it, if I wasn't legally responsible for what the AI

631
00:35:21,240 --> 00:35:24,440
makes, I'd copy paste without reading it and put it out on the internet and that's

632
00:35:24,440 --> 00:35:25,440
where we get into a problem.

633
00:35:25,440 --> 00:35:26,720
There's another problem as well.

634
00:35:26,720 --> 00:35:30,880
If people start doing that and then there's nobody sitting on top, it's like sitting on top

635
00:35:30,880 --> 00:35:32,280
of the horse in a way.

636
00:35:32,280 --> 00:35:38,280
Like you can't have an extremely powerful, capable horse with nobody on top of it controlling

637
00:35:38,280 --> 00:35:39,280
it.

638
00:35:39,280 --> 00:35:41,360
Like it's just going to do what it wants and ruin the village.

639
00:35:41,360 --> 00:35:44,440
So we need to have a person sit on it, responsible for what that horse does.

640
00:35:44,440 --> 00:35:48,640
And then if that horse does something wrong, the rider gets punished, right?

641
00:35:48,640 --> 00:35:51,880
I mean, in religion, they'd say AI doesn't have a soul.

642
00:35:51,880 --> 00:35:53,520
Like animals don't have souls, right?

643
00:35:53,520 --> 00:35:57,960
And in the religious context, meaning it's like they're not responsible legally for their

644
00:35:57,960 --> 00:36:00,240
actions because they're not conscious.

645
00:36:00,240 --> 00:36:04,960
And then scientifically, whatever, it's just an analogy to say AI is like an animal.

646
00:36:04,960 --> 00:36:09,400
In a way, we have to own it and control it and be responsible for everything it does

647
00:36:09,400 --> 00:36:10,400
like a pet owner.

648
00:36:10,400 --> 00:36:12,280
If your dog bites someone, it's the owner's fault.

649
00:36:12,280 --> 00:36:16,600
Right, if my AI writes misleading information, puts it out, hurts someone, it's my fault.

650
00:36:16,600 --> 00:36:18,080
Right, so I put my name on it.

651
00:36:18,080 --> 00:36:19,520
We have to stay on top of the AI.

652
00:36:19,520 --> 00:36:21,320
So what does that mean for like self-driving cars?

653
00:36:21,320 --> 00:36:23,320
Well, someone's got to be responsible.

654
00:36:23,320 --> 00:36:29,240
And it would have to be either the driver, which I think initially it has to be ultimately.

655
00:36:29,240 --> 00:36:32,160
Or you could say the maker, say Elon Musk, although he's going to have a lot of trouble going

656
00:36:32,160 --> 00:36:35,240
forward if he takes responsibility for all the self-driving cars.

657
00:36:35,240 --> 00:36:40,440
So probably for a very long time, the responsibility is going to be within the hands of the driver.

658
00:36:40,440 --> 00:36:45,680
You are an expert communicator and at the same time you've taught in various different

659
00:36:45,680 --> 00:36:46,680
schools.

660
00:36:46,680 --> 00:36:50,360
I believe it was about 50 different schools.

661
00:36:50,360 --> 00:36:56,800
So I'm just curious, what do you see is going to happen in the future when it comes to, for

662
00:36:56,800 --> 00:37:02,920
example, artificial intelligence, teaching, are we entering a new phase of education as we

663
00:37:02,920 --> 00:37:03,920
speak?

664
00:37:03,920 --> 00:37:07,520
This question I've seen many times.

665
00:37:07,520 --> 00:37:12,960
Hey listeners, before we jump back into our fascinating conversation, let's take a moment

666
00:37:12,960 --> 00:37:17,920
to appreciate the delicious goodness of KG Food Company's Energy Pods.

667
00:37:17,920 --> 00:37:23,320
These delightful pods come in a variety of amazing flavors like white chocolate strawberry,

668
00:37:23,320 --> 00:37:27,280
breakfast mocha, and the ever-popular chocolate nova.

669
00:37:27,280 --> 00:37:31,960
And they're not just tasty, they're also packed with protein, healthy fats, and minimal

670
00:37:31,960 --> 00:37:34,640
sugar to keep you fueled throughout the day.

671
00:37:34,640 --> 00:37:36,360
Perfect for when you're on the go.

672
00:37:36,360 --> 00:37:40,720
These energy pods even come with a built-in spoon for your convenience.

673
00:37:40,720 --> 00:37:46,560
So go ahead and treat yourself to these mouthwatering pods that will make your taste buds dance.

674
00:37:46,560 --> 00:37:53,480
Visit KGFOODCO.COM to grab your favorite flavors.

675
00:37:53,480 --> 00:37:57,720
And now let's get back to our inspiring guest.

676
00:37:57,720 --> 00:38:02,160
With every new technology, people think this new technology is going to revolutionize

677
00:38:02,160 --> 00:38:04,280
education, it's going to revolutionize everything about education.

678
00:38:04,280 --> 00:38:06,440
And the answer is no, it won't, not that much.

679
00:38:06,440 --> 00:38:10,440
There will be a small change, a small incremental change to what we already do.

680
00:38:10,440 --> 00:38:14,520
So think back: what was the first big change, possibly?

681
00:38:14,520 --> 00:38:18,080
Well, you could go way back to using paper versus using slate.

682
00:38:18,080 --> 00:38:20,280
For example, is this going to revolutionize education?

683
00:38:20,280 --> 00:38:21,600
No, it didn't.

684
00:38:21,600 --> 00:38:27,440
Did whiteboards instead of blackboards revolutionize it? Not really, not massively so.

685
00:38:27,440 --> 00:38:31,120
Keep going forwards, you've got, how about this, spell checkers?

686
00:38:31,120 --> 00:38:34,400
So that was a big debate back in the day, like when I was a kid.

687
00:38:34,400 --> 00:38:35,400
Turn off your spell checker.

688
00:38:35,400 --> 00:38:37,800
If you're typing your work, you should know how to spell.

689
00:38:37,800 --> 00:38:41,160
And then we realized, actually, they're just going to turn it on anyway.

690
00:38:41,160 --> 00:38:43,800
They'll have it off when the teacher's there and they'll quickly turn it back on and

691
00:38:43,800 --> 00:38:44,800
turn it back off.

692
00:38:44,800 --> 00:38:47,880
It's very difficult to turn off the spell checker now.

693
00:38:47,880 --> 00:38:50,880
But back then there was a button that you had to press to do the spell check.

694
00:38:50,880 --> 00:38:55,480
And so we went from being, no, no, no, we can't do this.

695
00:38:55,480 --> 00:38:56,480
The students will cheat.

696
00:38:56,480 --> 00:39:00,080
They won't learn how to spell, to encouraging students to use it.

697
00:39:00,080 --> 00:39:04,920
And now as a teacher, if I see a typed essay that's been badly spelled, I'll just say

698
00:39:04,920 --> 00:39:06,600
check the spelling using Word.

699
00:39:06,600 --> 00:39:08,680
I'll encourage them to do that.

700
00:39:08,680 --> 00:39:09,680
And that's the pattern.

701
00:39:09,680 --> 00:39:13,720
We tend to go from skepticism to opposition and then encouragement.

702
00:39:13,720 --> 00:39:17,000
And that happens for every technology that comes into education.

703
00:39:17,000 --> 00:39:18,320
So yeah, spell check was one.

704
00:39:18,320 --> 00:39:19,800
Then laptops was another one.

705
00:39:19,800 --> 00:39:21,320
Well, iPads and then laptops.

706
00:39:21,320 --> 00:39:22,800
The ultimate became laptops.

707
00:39:22,800 --> 00:39:25,200
So laptops in schools became a thing.

708
00:39:25,200 --> 00:39:29,280
It was maybe 10 years ago, roughly like that.

709
00:39:29,280 --> 00:39:31,840
Where now the students are on their laptops all day, every day.

710
00:39:31,840 --> 00:39:33,520
I don't know what it's like in America.

711
00:39:33,520 --> 00:39:35,240
You know, what are they doing in schools?

712
00:39:35,240 --> 00:39:38,080
Like here, pretty much every school all day.

713
00:39:38,080 --> 00:39:39,920
I'd say maybe not literally all day.

714
00:39:39,920 --> 00:39:42,320
There's like four lessons out of five in the day.

715
00:39:42,320 --> 00:39:43,520
They'll be on their laptops.

716
00:39:43,520 --> 00:39:44,520
Yeah, except sport.

717
00:39:44,520 --> 00:39:50,120
Yeah, I remember when I went to college, and that was many years back.

718
00:39:50,120 --> 00:39:51,760
We were using laptops regularly.

719
00:39:51,760 --> 00:39:55,600
I used to personally own a laptop myself all the time.

720
00:39:55,600 --> 00:40:01,960
I'd imagine now we just throw probably phones and tablets into the mix, probably tablets

721
00:40:01,960 --> 00:40:03,160
more so.

722
00:40:03,160 --> 00:40:07,600
So I'm not sure, but I can totally see that it's probably very similar over here as well.

723
00:40:07,600 --> 00:40:08,600
Probably.

724
00:40:08,600 --> 00:40:13,060
What's happened here is it's now become compulsory to have a laptop in the student's school

725
00:40:13,060 --> 00:40:16,180
bag all day from grade seven onwards.

726
00:40:16,180 --> 00:40:18,620
Like all the learning materials are delivered via the laptop.

727
00:40:18,620 --> 00:40:20,300
The laptop is open every lesson.

728
00:40:20,300 --> 00:40:21,460
That's the main means of learning.

729
00:40:21,460 --> 00:40:26,420
You also have a notebook, but most of the learning is done on the laptop and you can tell

730
00:40:26,420 --> 00:40:29,500
because the school's printing budgets have plummeted.

731
00:40:29,500 --> 00:40:33,780
It used to be, you know, 50 grand a year, whatever, and it'll now be like 12.

732
00:40:33,780 --> 00:40:38,540
There's no printing anymore because most of the 12 will go on nice brochures and

733
00:40:38,540 --> 00:40:39,540
stuff.

734
00:40:39,540 --> 00:40:42,340
So everything is delivered electronically and the kids are on the laptops all day and

735
00:40:42,340 --> 00:40:46,220
we thought, oh, they're going to get, remember we used to be worried about like, was it like,

736
00:40:46,220 --> 00:40:47,660
was it, you get square eyes?

737
00:40:47,660 --> 00:40:48,660
Remember that?

738
00:40:48,660 --> 00:40:49,900
That used to be a worry, right?

739
00:40:49,900 --> 00:40:54,660
There were a lot of those, man.

740
00:40:54,660 --> 00:40:59,340
So yeah, they'll get square eyes, they'll have a hunched-over posture and all these issues.

741
00:40:59,340 --> 00:41:03,980
We went from skepticism to opposition to, let's embrace it.

742
00:41:03,980 --> 00:41:04,980
Let's just do it.

743
00:41:04,980 --> 00:41:09,540
It saves the planet and they learn better and also an interesting upside of laptops was

744
00:41:09,540 --> 00:41:10,820
the discipline improved.

745
00:41:10,820 --> 00:41:18,540
So when you have kids who are a bit agitated, fidgety, usually they're boys, who might have difficulty

746
00:41:18,540 --> 00:41:24,140
engaging in a task for a long period of time, they get a Chrome extension for a little

747
00:41:24,140 --> 00:41:28,260
game, because the school can block the game sites, they can block

748
00:41:28,260 --> 00:41:30,100
steam, they can block all of that.

749
00:41:30,100 --> 00:41:34,700
But for some reason, they can't block Chrome extensions, and you can get Tetris and

750
00:41:34,700 --> 00:41:36,660
stuff on the browser.

751
00:41:36,660 --> 00:41:41,500
So these fidgety kids, instead of, like, pushing rubbers,

752
00:41:41,500 --> 00:41:46,540
sorry, not rubbers, we call them erasers, off the table and stealing pens, whatever, causing

753
00:41:46,540 --> 00:41:50,220
a little minor chaos in the classroom, they'll just get on and play Tetris and Pac-Man

754
00:41:50,220 --> 00:41:53,780
on the Chrome extension and discipline actually improved.

755
00:41:53,780 --> 00:41:58,380
Those kids who are disengaged no longer bother others and the same thing happens during recess

756
00:41:58,380 --> 00:42:03,180
and lunch, like, there's a downside, right, there's less physical play.

757
00:42:03,180 --> 00:42:07,980
Now, there's a lot less sport happening, voluntary spontaneous sport at recess and lunch, because

758
00:42:07,980 --> 00:42:10,060
everyone's on their devices, on their laptops.

759
00:42:10,060 --> 00:42:15,900
So yeah, the upside was physical bullying went down and disruption in the classroom definitely

760
00:42:15,900 --> 00:42:18,620
went down with the introduction of laptops and now we encourage it.

761
00:42:18,620 --> 00:42:20,780
So the same thing will happen with AI, so what will happen with AI?

762
00:42:20,780 --> 00:42:24,260
We've gone from no, no, no, they're going to cheat on the essays, let's block it.

763
00:42:24,260 --> 00:42:28,220
Italy blocked it and then reversed it and the government schools here blocked it, the

764
00:42:28,220 --> 00:42:31,700
Catholic and private schools encouraged it.

765
00:42:31,700 --> 00:42:34,580
And here we have a three tier system, right?

766
00:42:34,580 --> 00:42:39,320
So the bottom, it's, I wish it wasn't this, it's like a French flag, I wish it wasn't this

767
00:42:39,320 --> 00:42:44,420
stratified here, but we have three types of schools, government schools, I mean, oversimplifying

768
00:42:44,420 --> 00:42:48,380
here, they're for the low socio-economic groups generally, there are very few exceptions,

769
00:42:48,380 --> 00:42:53,780
then you've got the middle Catholic schools, the middle group and then for the richest 20%

770
00:42:53,780 --> 00:42:54,860
go to private schools.

771
00:42:54,860 --> 00:43:02,060
So the top two sort of layers of schools encouraged it, the bottom one banned it, and that tells me

772
00:43:02,060 --> 00:43:04,020
that's just going to widen the gap, right?

773
00:43:04,020 --> 00:43:07,700
We need to get government schools here on board with ChatGPT, and teachers, because teachers

774
00:43:07,700 --> 00:43:12,020
use it. I'm on LinkedIn teachers' groups, I'm on Facebook teachers' groups,

775
00:43:12,020 --> 00:43:14,060
they're always using ChatGPT to plan their lessons.

776
00:43:14,060 --> 00:43:18,420
At the same time, banning the students from using ChatGPT is going to widen the gap,

777
00:43:18,420 --> 00:43:21,500
you know, some of the kids here haven't heard of ChatGPT in government schools.

778
00:43:21,500 --> 00:43:25,540
Meanwhile, Catholic schools and private schools are encouraging its use and teaching them

779
00:43:25,540 --> 00:43:30,860
how to make good prompts and how to get ahead using AI and that's the right side of history

780
00:43:30,860 --> 00:43:31,860
to be on.

781
00:43:31,860 --> 00:43:36,760
Skip the skepticism and the opposition, get straight into embracing it, like every other

782
00:43:36,760 --> 00:43:41,380
tool throughout human history, even, you know, even nuclear weapons, right?

783
00:43:41,380 --> 00:43:44,460
I mean, you could argue they're a thing we don't want to use, but we are using them

784
00:43:44,460 --> 00:43:47,580
as a, as a deterrent, they're extremely effective.

785
00:43:47,580 --> 00:43:51,020
We don't have to, you know, press the button for them to be useful; every

786
00:43:51,020 --> 00:43:54,100
tool, including those, has been embraced.

787
00:43:54,100 --> 00:43:55,580
So let's do that.

788
00:43:55,580 --> 00:43:57,180
Let's not be on the wrong side of history.

789
00:43:57,180 --> 00:44:01,860
That's absolutely true and I'm totally on board with it, and I'm so happy that there are many

790
00:44:01,860 --> 00:44:04,740
people who have actually started to embrace that.

791
00:44:04,740 --> 00:44:08,860
One of the questions I have is when you and I probably went to school, we didn't have anything

792
00:44:08,860 --> 00:44:09,860
called social media.

793
00:44:09,860 --> 00:44:13,820
Nowadays we have things like TikTok and Instagram and all of these kind of things.

794
00:44:13,820 --> 00:44:20,220
Now, how is schooling being affected by that, and how are the interpersonal relationships between

795
00:44:20,220 --> 00:44:25,180
kids being influenced by that and the relationship perhaps between the parents and the children

796
00:44:25,180 --> 00:44:26,180
as well?

797
00:44:26,180 --> 00:44:27,180
That's an interesting one.

798
00:44:27,180 --> 00:44:33,340
Well, firstly for the kids themselves, so Jonathan Haidt, H-A-I-D-T is his surname.

799
00:44:33,340 --> 00:44:39,180
So Jonathan Haidt's probably, as far as I can tell, the world leader on this, and I follow

800
00:44:39,180 --> 00:44:40,180
him on Substack.

801
00:44:40,180 --> 00:44:41,980
And I recommend your listeners do as well.

802
00:44:41,980 --> 00:44:46,980
Jonathan H A I D T and he's got a book coming out soon on exactly this, on the effect

803
00:44:46,980 --> 00:44:52,980
of social media and he's got lots of data, studies, massive sort of, he's got a Google

804
00:44:52,980 --> 00:44:56,700
Doc where people can pull up and put in extra studies that have been done on this, and he's building

805
00:44:56,700 --> 00:45:00,460
up a picture that is going to go into a book that's coming out in roughly six to 12 months

806
00:45:00,460 --> 00:45:02,420
time, something like that.

807
00:45:02,420 --> 00:45:07,940
So he's talking early 2024 roughly for the book, but it's on the effect of social media and

808
00:45:07,940 --> 00:45:10,460
exactly how and who it affects.

809
00:45:10,460 --> 00:45:16,820
And the reason there are so many conflicting studies on this is some studies look at

810
00:45:16,820 --> 00:45:20,420
screen time, just generic screen time.

811
00:45:20,420 --> 00:45:23,700
And if you look at just screen time, it turns out it's not that bad.

812
00:45:23,700 --> 00:45:26,380
So the worries about square eyes and stuff were overblown.

813
00:45:26,380 --> 00:45:30,740
In multiple ways, people's wellbeing, academic results, they're not really affected that much

814
00:45:30,740 --> 00:45:32,420
by having more screen time.

815
00:45:32,420 --> 00:45:38,180
What matters is who is watching, so some demographics are more affected than others.

816
00:45:38,180 --> 00:45:44,260
So girls more than boys, poor more than rich and liberal more than conservative, open to

817
00:45:44,260 --> 00:45:48,740
new ideas coming at them, even if they're ridiculous.

818
00:45:48,740 --> 00:45:53,780
So who's watching and what you're doing on the screen makes a huge difference.

819
00:45:53,780 --> 00:45:57,620
So it turns out watching a movie on the couch with a friend, that's screen time, but that's

820
00:45:57,620 --> 00:46:01,100
actually good for your mental health.

821
00:46:01,100 --> 00:46:03,820
And they found that different types of screen time are not equal.

822
00:46:03,820 --> 00:46:08,340
So when you break it down, what Jonathan Haidt does is, he says to the people who

823
00:46:08,340 --> 00:46:11,300
go, no, no, no, this study found no correlation with blah, blah, blah.

824
00:46:11,300 --> 00:46:13,100
He goes, no, no, look at what the study was measuring.

825
00:46:13,100 --> 00:46:18,620
Is it screen time or was it specifically social media doomscrolling?

826
00:46:18,620 --> 00:46:23,980
Because in terms of a hierarchy, the worst thing you can do is doomscrolling through influential

827
00:46:23,980 --> 00:46:29,860
TikToks, and I'm being very specific here, of glorification of mental health problems.

828
00:46:29,860 --> 00:46:31,500
No, I don't, I don't know.

829
00:46:31,500 --> 00:46:32,500
TikTok doesn't give me this.

830
00:46:32,500 --> 00:46:36,940
It might be that I'm the wrong age bracket, but I've met with young people who've shown

831
00:46:36,940 --> 00:46:40,780
me their TikTok For You page, and roughly every fifth video...

832
00:46:40,780 --> 00:46:48,660
So you get four funny videos, four interesting funny videos, some farmer planting trees in

833
00:46:48,660 --> 00:46:54,620
double time really fast and then you get some cool sort of eclipse effects, some cool stuff,

834
00:46:54,620 --> 00:47:00,980
just relaxing brain downtime from those videos, short videos, something funny.

835
00:47:00,980 --> 00:47:06,100
Then roughly every fifth video is a glorification of a self-diagnosed mental illness.

836
00:47:06,100 --> 00:47:08,100
And it's oddly specific.

837
00:47:08,100 --> 00:47:12,300
Each person gets pushed into this sort of different niche of mental illness.

838
00:47:12,300 --> 00:47:17,380
And I don't know whether it's done deliberately by TikTok, I don't think so because usually

839
00:47:17,380 --> 00:47:20,580
these things happen through negligence rather than malice.

840
00:47:20,580 --> 00:47:22,220
That's my, my Occam's razor.

841
00:47:22,220 --> 00:47:28,060
I just assume negligence first before I resort to assuming it's malice.

842
00:47:28,060 --> 00:47:32,380
But I think the algorithm has just realized people lap that up, and you need the four nice

843
00:47:32,380 --> 00:47:33,380
funny videos.

844
00:47:33,380 --> 00:47:37,940
Otherwise, it becomes horrible and unpalatable because it's all that, but it literally

845
00:47:37,940 --> 00:47:38,940
is.

846
00:47:38,940 --> 00:47:40,700
Every fifth video is like a glorification of a mental illness.

847
00:47:40,700 --> 00:47:48,180
And so for example, let's say, like an eating disorder, and there'll be somebody who just

848
00:47:48,180 --> 00:47:54,340
proclaims to have this and shows what it's like and hashtags and all of that.

849
00:47:54,340 --> 00:47:57,780
And they might actually have it for real, but what's interesting is people then

850
00:47:57,780 --> 00:48:01,500
start imitating because they realize that gets followers; they'll self-diagnose, other

851
00:48:01,500 --> 00:48:04,900
people who don't have it will self-diagnose and make videos about it.

852
00:48:04,900 --> 00:48:08,740
To the point that they actually start getting it, and you get this thing they call a social

853
00:48:08,740 --> 00:48:14,220
contagion. I wish it wasn't real, but please do read Jonathan Haidt's research on this.

854
00:48:14,220 --> 00:48:19,900
This kind of stuff, so with TikTok, there was a study done recently: they got a thousand

855
00:48:19,900 --> 00:48:25,860
AIs actually and they randomly gave the AIs different interests, really specific interests.

856
00:48:25,860 --> 00:48:30,540
So they set the AIs up on, you know, a thousand TikTok accounts, a thousand SIM cards,

857
00:48:30,540 --> 00:48:35,580
a thousand of them, and each AI had its own specific set of interests, like rabbits jumping over fences

858
00:48:35,580 --> 00:48:38,220
or something really specific.

859
00:48:38,220 --> 00:48:43,140
And the AIs just went through TikTok, and the AIs were judging, what's in the video?

860
00:48:43,140 --> 00:48:44,460
What can I see in the video?

861
00:48:44,460 --> 00:48:45,460
Do I like it? If I do,

862
00:48:45,460 --> 00:48:49,380
I'll stay on that for longer; if I don't, no rabbits, I scroll.

863
00:48:49,380 --> 00:48:54,700
So 30 minutes was the average; in 30 minutes TikTok could go from just randomly

864
00:48:54,700 --> 00:48:58,260
showing you stuff to figuring out what you want, for each of these AIs.

865
00:48:58,260 --> 00:49:02,700
And what was more interesting was that after that 30 minutes, if you changed the preferences

866
00:49:02,700 --> 00:49:05,700
of the AIs, TikTok didn't change its algorithm.

867
00:49:05,700 --> 00:49:06,700
What?

868
00:49:06,700 --> 00:49:10,260
Once you're pushed into that, you're stuck.

869
00:49:10,260 --> 00:49:14,300
Like, you might go in on a bad day, you're feeling low,

870
00:49:14,300 --> 00:49:17,220
you think you've got some mental health issue, whatever, and you're stuck there.

871
00:49:17,220 --> 00:49:21,420
It gives you that then, even if you change what you want. Now TikTok has released a reset

872
00:49:21,420 --> 00:49:25,420
button somewhere in the settings, so you can reset the algorithm and start again, but

873
00:49:25,420 --> 00:49:27,100
who's going to do that?

874
00:49:27,100 --> 00:49:31,180
Like it learns so much about you, it pigeonholes you in 30 minutes and it pushes you down

875
00:49:31,180 --> 00:49:35,820
this rabbit hole of, um, celebrating mental illness and it's very strange.
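
As a rough sketch of the feedback loop described here, and assuming nothing about TikTok's real system, the snippet below shows a toy engagement-driven recommender in Python: it serves videos from a few made-up categories, observes only watch time from a simulated viewer, and then locks onto whatever niche keeps that viewer watching. The category names, watch times, and parameters are all illustrative assumptions.

```python
# A minimal sketch (illustrative assumptions only, not TikTok's actual algorithm):
# an engagement-driven recommender that pigeonholes a simulated viewer.
import random

CATEGORIES = ["funny", "farming", "eclipse", "mental_health"]

def simulated_viewer(liked_category, video_category):
    """Watch time in seconds: long if the video matches the viewer's interest."""
    return 30.0 if video_category == liked_category else 2.0

def run_session(liked_category, cold_start=12, rounds=60):
    """Serve a random feed first, then always serve the best average watch time."""
    avg_watch = {c: 0.0 for c in CATEGORIES}
    served = {c: 0 for c in CATEGORIES}
    feed = []
    for step in range(rounds):
        if step < cold_start:
            category = random.choice(CATEGORIES)          # initial "figure you out" phase
        else:
            category = max(avg_watch, key=avg_watch.get)  # pure exploitation afterwards
        watch_time = simulated_viewer(liked_category, category)
        served[category] += 1
        avg_watch[category] += (watch_time - avg_watch[category]) / served[category]
        feed.append(category)
    return feed

random.seed(42)
feed = run_session(liked_category="mental_health")
tail = feed[-20:]
# Whatever niche wins the cold start dominates the rest of the feed, and
# nothing in this loop ever resets the scores on its own.
pigeonholed = max(set(tail), key=tail.count)
print(f"{tail.count(pigeonholed)} of the last 20 videos were '{pigeonholed}'")
```

The "reset" button mentioned in the conversation would amount to wiping avg_watch and served back to zero, which this loop never does by itself; that is the pigeonholing effect in miniature.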

876
00:49:35,820 --> 00:49:39,700
I encourage you to read Jonathan Haidt's work on this. Get onto TikTok if you

877
00:49:39,700 --> 00:49:43,220
know a young person who you can trust to show you their genuine For You page,

878
00:49:43,220 --> 00:49:48,220
because it's a very personal thing, and have a look at what's on it; it's literally glorifying,

879
00:49:48,220 --> 00:49:50,980
celebrating mental health issues.

880
00:49:50,980 --> 00:49:54,780
Beyond awareness, I'm all for like awareness, it goes beyond that.

881
00:49:54,780 --> 00:50:00,860
It goes towards glorification and FOMO, like if you don't have one, you're out

882
00:50:00,860 --> 00:50:02,020
of the crowd.

883
00:50:02,020 --> 00:50:03,020
You need one.

884
00:50:03,020 --> 00:50:04,700
Like it's, it's gone way too far.

885
00:50:04,700 --> 00:50:10,540
And I'm all for acceptance, of course, uh, understanding education about it, awareness,

886
00:50:10,540 --> 00:50:11,700
but it's gone way too far.

887
00:50:11,700 --> 00:50:15,100
And some of these people are, they look like they're mocking the mental health issues,

888
00:50:15,100 --> 00:50:18,540
because you can look at them and go, nah, that's not real, they're really doing

889
00:50:18,540 --> 00:50:21,900
this for the followers, the likes, the views, but it's very strange.

890
00:50:21,900 --> 00:50:26,580
And some of these mental health issues are, we've gone from nearly zero to being quite prevalent.

891
00:50:26,580 --> 00:50:30,020
Yeah, the most common one was, uh, I think it was the end of last year or the year before,

892
00:50:30,020 --> 00:50:32,380
um, but the Tourette's thing, did you hear about that?

893
00:50:32,380 --> 00:50:34,220
No, I barely used TikTok.

894
00:50:34,220 --> 00:50:35,580
Yeah, Tourette's.

895
00:50:35,580 --> 00:50:36,580
So real, right?

896
00:50:36,580 --> 00:50:39,620
This sounds like, this sounds like a Black Mirror episode.

897
00:50:39,620 --> 00:50:41,420
This is a dystopian nightmare, man.

898
00:50:41,420 --> 00:50:44,780
Our kids are there, and adults don't tend to know, like they don't browse the

899
00:50:44,780 --> 00:50:46,180
For You page of it, right?

900
00:50:46,180 --> 00:50:49,460
But as a teacher, having been to so many schools, seen thousands

901
00:50:49,460 --> 00:50:55,020
of students, I've literally seen this. So the Tourette's thing was this

902
00:50:55,020 --> 00:51:01,380
one specific example where it was in the US only, not Canada, it was in the US, where

903
00:51:01,380 --> 00:51:05,660
one of the mental health issues that spread, for whatever reason, was Tourette's, and that

904
00:51:05,660 --> 00:51:11,940
just became, again, a niche, sort of quite minor issue, like it's not very prevalent,

905
00:51:11,940 --> 00:51:13,260
but it rocketed.

906
00:51:13,260 --> 00:51:18,900
And people started presenting to GPs suddenly within two months of having Tourette's.

907
00:51:18,900 --> 00:51:22,980
Sudden-onset Tourette's, by the way, like they never had a history of it and they

908
00:51:22,980 --> 00:51:23,980
suddenly got Tourette's.

909
00:51:23,980 --> 00:51:28,980
Then the GPs realized there was a pattern in this sudden explosion of diagnoses

910
00:51:28,980 --> 00:51:29,980
of Tourette's.

911
00:51:29,980 --> 00:51:34,280
So these were TikTok addicts and Instagram addicts, social media doomscrolling

912
00:51:34,280 --> 00:51:35,280
addicts.

913
00:51:35,280 --> 00:51:39,100
And what they found, when they interviewed them, was that they said,

914
00:51:39,100 --> 00:51:42,220
oh, actually, I've been shown loads and loads of Tourette's videos.

915
00:51:42,220 --> 00:51:44,320
Now, I wish it wasn't true, but it is.

916
00:51:44,320 --> 00:51:49,580
But some mental health issues are amplified through repeated exposure by the algorithm and

917
00:51:49,580 --> 00:51:51,800
the algorithm knows that and it shows you more.

918
00:51:51,800 --> 00:51:55,840
And I'm not saying China does this right, but China shuts stuff like that down.

919
00:51:55,840 --> 00:52:00,080
Now I'm all for freedom of speech and freedom of press.

920
00:52:00,080 --> 00:52:04,360
But when you're amplifying stuff to the point that it's causing mass harm, I do ask questions,

921
00:52:04,360 --> 00:52:09,360
you know, there was, yeah, like for example, the way that China treats its, you know, China

922
00:52:09,360 --> 00:52:14,480
has occasionally, not very often, but they'll have like a, you know, mass killing, like

923
00:52:14,480 --> 00:52:18,320
in the US they happen, sorry, they do, they happen all the time.

924
00:52:18,320 --> 00:52:21,960
But within China, when that happens, the press just doesn't mention it.

925
00:52:21,960 --> 00:52:26,480
Or they'll mention it once, somewhere, and then they don't talk about it constantly

926
00:52:26,480 --> 00:52:31,320
and ruminate on it and make it look normal. I think the amplification is possibly

927
00:52:31,320 --> 00:52:35,960
too much, maybe, because it doesn't seem to be working, it's still happening in the US.

928
00:52:35,960 --> 00:52:40,640
The amplification of these issues does not work.

929
00:52:40,640 --> 00:52:41,840
The Chinese just keep it quiet.

930
00:52:41,840 --> 00:52:45,360
Now whether they have fewer of them will never know if they keep it quiet.

931
00:52:45,360 --> 00:52:49,440
So we don't know whether that works either, but certainly what we're doing is not working.

932
00:52:49,440 --> 00:52:50,440
We amplify these issues.

933
00:52:50,440 --> 00:52:54,000
They spread, and the Tourette's one is the most famous, but there are others and some of them

934
00:52:54,000 --> 00:52:57,440
are so oddly specific, but they spread through TikTok.

935
00:52:57,440 --> 00:52:59,240
They're crazy, man.

936
00:52:59,240 --> 00:53:00,240
Yeah.

937
00:53:00,240 --> 00:53:01,240
It's crazy.

938
00:53:01,240 --> 00:53:04,760
So they've banned smartphones here now in schools across the board.

939
00:53:04,760 --> 00:53:07,840
Obviously in China or in your side of the...

940
00:53:07,840 --> 00:53:10,080
Here in Australia, our state.

941
00:53:10,080 --> 00:53:16,000
Anyway, our state has just banned smartphones in schools for kids.

942
00:53:16,000 --> 00:53:17,160
That's man.

943
00:53:17,160 --> 00:53:19,240
I hope things get better over time.

944
00:53:19,240 --> 00:53:20,240
Yeah.

945
00:53:20,240 --> 00:53:22,240
Well, it's like...

946
00:53:22,240 --> 00:53:23,240
What do you do?

947
00:53:23,240 --> 00:53:26,840
I think the easiest fix is just get TikTok to tweak a few things.

948
00:53:26,840 --> 00:53:27,840
That's all.

949
00:53:27,840 --> 00:53:30,040
Make the algorithm automatically reset every week.

950
00:53:30,040 --> 00:53:31,040
Something like that, I don't know.

951
00:53:31,040 --> 00:53:32,840
Yeah, I don't know, man.

952
00:53:32,840 --> 00:53:34,920
I didn't realize this problem was this bad.

953
00:53:34,920 --> 00:53:36,920
You see it as a teacher.

954
00:53:36,920 --> 00:53:42,000
The adults have not seen it yet, I think, but I mean, I'm in classrooms with grade

955
00:53:42,000 --> 00:53:44,640
seven to grade 12, and I see it.

956
00:53:44,640 --> 00:53:45,640
And they'll notice eventually.

957
00:53:45,640 --> 00:53:48,560
The adults will notice eventually when these kids graduate and come into society in five

958
00:53:48,560 --> 00:53:49,560
years time.

959
00:53:49,560 --> 00:53:50,560
Oh, man.

960
00:53:50,560 --> 00:53:51,560
And those are...

961
00:53:51,560 --> 00:53:52,560
All right.

962
00:53:52,560 --> 00:53:54,480
We can start wrapping things up a little bit.

963
00:53:54,480 --> 00:53:56,920
We'll probably have another podcast.

964
00:53:56,920 --> 00:54:00,480
We're going to do another one because I have these topics that I really wanted to delve

965
00:54:00,480 --> 00:54:05,240
into with you, but that can wait for the next one and we'll keep it all going.

966
00:54:05,240 --> 00:54:06,240
But I really want to...

967
00:54:06,240 --> 00:54:10,960
First of all, before we get into the final plugs, I want you to give me a paper trail of

968
00:54:10,960 --> 00:54:16,360
all the books you've written, because in case somebody's interested in chemophobia or any

969
00:54:16,360 --> 00:54:19,920
of the topics that you've covered in detail, I want them to go check it out.

970
00:54:19,920 --> 00:54:20,920
Yeah, absolutely.

971
00:54:20,920 --> 00:54:21,920
So...

972
00:54:21,920 --> 00:54:22,920
I would say...

973
00:54:22,920 --> 00:54:24,400
I'll give you one, right?

974
00:54:24,400 --> 00:54:25,720
Everything is natural.

975
00:54:25,720 --> 00:54:29,360
Everything Is Natural is my book, and it's published with the Royal Society of Chemistry.

976
00:54:29,360 --> 00:54:30,360
And Everything Is Natural.

977
00:54:30,360 --> 00:54:31,360
It's 2021.

978
00:54:31,360 --> 00:54:32,360
I've published that.

979
00:54:32,360 --> 00:54:33,360
And it's on...

980
00:54:33,360 --> 00:54:37,520
It's blurring the lines between natural and artificial and saying that actually humans have

981
00:54:37,520 --> 00:54:43,760
been engaging with nature and utilizing it and manipulating it and actually improving it for

982
00:54:43,760 --> 00:54:45,080
thousands of years.

983
00:54:45,080 --> 00:54:46,080
Let's not forget that.

984
00:54:46,080 --> 00:54:47,880
Let's thank our ancestors for doing that.

985
00:54:47,880 --> 00:54:50,000
We are just continuing that journey.

986
00:54:50,000 --> 00:54:52,680
The extra processing and manufacturing we are doing.

987
00:54:52,680 --> 00:54:57,160
Yes, we have to look after nature from a sustainability perspective.

988
00:54:57,160 --> 00:55:01,960
But we've been using nature to our advantage and improving it for its own good and survival

989
00:55:01,960 --> 00:55:02,960
for a very long time.

990
00:55:02,960 --> 00:55:04,480
We've been tending the landscape for a long time.

991
00:55:04,480 --> 00:55:06,000
Let's keep doing that.

992
00:55:06,000 --> 00:55:09,880
Not let's go back to the wilderness and hunter-gatherer times.

993
00:55:09,880 --> 00:55:13,240
Let's celebrate the work that our ancestors did.

994
00:55:13,240 --> 00:55:14,240
Everything is natural.

995
00:55:14,240 --> 00:55:18,080
And it talks about the history of how we became fearful of chemicals and the sort of recent

996
00:55:18,080 --> 00:55:23,040
question marks over human influence on the earth, and sort of tries to allay some of

997
00:55:23,040 --> 00:55:24,040
those fears.

998
00:55:24,040 --> 00:55:25,040
So everything is natural.

999
00:55:25,040 --> 00:55:26,840
I've got other books as well, but they're very academic.

1000
00:55:26,840 --> 00:55:28,840
So I'd recommend just that one for listeners.

1001
00:55:28,840 --> 00:55:33,400
The others are like, no, knowing me, I might just go and sneak into your academic sort of

1002
00:55:33,400 --> 00:55:35,160
list and start reading those books.

1003
00:55:35,160 --> 00:55:36,680
It's chemical equations and stuff.

1004
00:55:36,680 --> 00:55:39,000
I'd rather, yeah, everything is natural.

1005
00:55:39,000 --> 00:55:40,000
That's the one to go for.

1006
00:55:40,000 --> 00:55:42,040
I'll put that in the show notes as well.

1007
00:55:42,040 --> 00:55:45,680
And speaking of show notes, how can people find you and reach out and communicate with you?

1008
00:55:45,680 --> 00:55:46,680
Yeah, good question.

1009
00:55:46,680 --> 00:55:54,400
So I would say go to my website, jameskennedymonash.wordpress.com, and you'll find

1010
00:55:54,400 --> 00:55:58,080
the sort of summary of recent stuff I've done, all the books are on there, any podcast

1011
00:55:58,080 --> 00:55:59,080
appearance.

1012
00:55:59,080 --> 00:56:00,080
How about this?

1013
00:56:00,080 --> 00:56:01,080
Twitter.

1014
00:56:01,080 --> 00:56:02,080
You can give my Twitter handle in the description as well.

1015
00:56:02,080 --> 00:56:04,520
JamesKennedyEDU, sounds fantastic.

1016
00:56:04,520 --> 00:56:10,080
Well, thank you so much for the nukes of information that you threw at me and blew my mind.

1017
00:56:10,080 --> 00:56:14,720
Like, I, dude, I just feel like I just came out from under a rock.

1018
00:56:14,720 --> 00:56:16,680
And so thank you.

1019
00:56:16,680 --> 00:56:17,680
Well, thank you.

1020
00:56:17,680 --> 00:56:18,680
It's always a pleasure.

1021
00:56:18,680 --> 00:56:19,680
It's great.

1022
00:56:19,680 --> 00:56:23,320
Get ready for an adrenaline-fueled adventure.

1023
00:56:23,320 --> 00:56:28,800
Our next episode features Jesse, an Energy Pod customer with captivating and mind-blowing

1024
00:56:28,800 --> 00:56:32,560
stories from his incredible journeys across the country.

1025
00:56:32,560 --> 00:56:37,280
Prepare to be hooked by Jesse's captivating tales of adventure, excitement and inspiration.

1026
00:56:37,280 --> 00:56:41,040
Don't miss this thrilling episode coming to you next week.

1027
00:56:41,040 --> 00:56:48,440
Thank you for listening to our podcast.

1028
00:56:48,440 --> 00:56:53,480
If you enjoyed today's episode, please rate, review and subscribe to our Energize, Explore,

1029
00:56:53,480 --> 00:56:57,640
Enjoy podcast on your favorite podcasting platform.

1030
00:56:57,640 --> 00:56:58,640
See you next week.

1031
00:56:58,640 --> 00:57:08,640
[MUSIC]

1032
00:57:08,640 --> 00:57:10,340
(floor sound)