Roxane Heaton, CIO, Macmillan Cancer Support

Overview

Join Roxane Heaton, CIO, Macmillan Cancer Support, as she discusses digital inclusion, innovation and leadership.

Transcript

1
00:00:03,600 --> 00:00:05,880
Welcome to CIO Leadership Live UK.
2
00:00:05,880 --> 00:00:08,880
I'm Lee Rennick, Executive Director,
CIO Communities,
3
00:00:09,000 --> 00:00:11,560
and I'm very excited to introduce
and welcome Dr.
4
00:00:11,560 --> 00:00:14,560
Roxane Heaton,
CIO, Macmillan Cancer Support.
5
00:00:14,880 --> 00:00:16,480
Roxane, please introduce yourself.
6
00:00:16,480 --> 00:00:19,480
And could you tell us a little bit
about your current role?
7
00:00:19,680 --> 00:00:21,920
Thanks so much,
Lee, and lovely to be here today.
8
00:00:21,920 --> 00:00:25,600
So as I've mentioned, my name is Roxane
Heaton, the Chief Information Officer
9
00:00:25,960 --> 00:00:30,800
of one of the UK's largest cancer charities,
which is Macmillan Cancer Support.
10
00:00:31,080 --> 00:00:35,400
And I'm responsible for technology
and digital and data optimization.
11
00:00:35,640 --> 00:00:39,760
We are 98% funded by fundraising
and we are the size of 250,
12
00:00:39,760 --> 00:00:41,800
so it's really important
we spend our money wisely
13
00:00:41,800 --> 00:00:43,040
for people living with cancer.
14
00:00:43,040 --> 00:00:44,560
Well, thank you so much for sharing that
15
00:00:44,560 --> 00:00:46,400
and I really appreciate you
joining us here today.
16
00:00:46,400 --> 00:00:50,160
Roxane, we've created the series
to support diversity in technology
17
00:00:50,160 --> 00:00:53,680
and to listen to women working in
the sector who are building and supporting
18
00:00:53,680 --> 00:00:57,960
DE&I. This year
the IWD theme was #EmbraceEquity.
19
00:00:58,200 --> 00:01:00,240
So the first question,
can you please tell us a little bit
20
00:01:00,240 --> 00:01:01,200
about your own career
21
00:01:01,200 --> 00:01:04,440
path and provide some insights
or tips along that path. As a woman
22
00:01:04,440 --> 00:01:07,440
especially, are there any lessons learned
that you could share?
23
00:01:07,600 --> 00:01:10,600
It's a great question,
and I think it's applicable to any sector.
24
00:01:11,080 --> 00:01:15,080
I went to an all girls boarding school
and then straight into university
25
00:01:15,080 --> 00:01:15,800
doing engineering.
26
00:01:15,800 --> 00:01:18,880
So there were only seven women
on our course of nearly 100.
27
00:01:19,160 --> 00:01:22,160
So it was straight
into a very different world.
28
00:01:22,360 --> 00:01:26,760
And I remember what I've always learned,
especially as someone who's got a stammer.
29
00:01:26,920 --> 00:01:30,480
So I'm always aware of different
superpowers and everyone's got superpowers
30
00:01:30,480 --> 00:01:33,720
and it's about maximizing those
because we're all very different.
31
00:01:33,880 --> 00:01:36,800
So I've had a squiggly career,
which I think
32
00:01:36,800 --> 00:01:40,000
is hugely valuable, especially in
discussions with people. We're all different.
33
00:01:40,000 --> 00:01:42,320
We will bring different
perspectives to the table.
34
00:01:42,320 --> 00:01:46,160
So I started my career in the Royal Navy
after a quick stint in banking.
35
00:01:46,160 --> 00:01:49,280
Actually,
that got me really hungry for making sure
36
00:01:49,280 --> 00:01:52,600
we did great things with money
within the restrictions that come with money.
37
00:01:52,600 --> 00:01:54,920
But yeah,
I spent 12 years in the Royal Navy,
38
00:01:54,920 --> 00:01:58,280
which I think really
embedded my instinctive
39
00:01:59,360 --> 00:02:01,560
ambitions and behaviors of
40
00:02:01,560 --> 00:02:05,280
teamwork and being a team member first
and a leader second.
41
00:02:05,280 --> 00:02:10,520
But actually, leadership
is everyone's responsibility.
42
00:02:11,080 --> 00:02:15,120
I was really lucky to jump from there
to Morrisons, one of the UK's
43
00:02:15,320 --> 00:02:19,800
largest supermarkets, which is an end-to-end
retailer, so everything from farm to fork.
44
00:02:20,120 --> 00:02:20,920
So hugely interesting.
45
00:02:20,920 --> 00:02:23,240
I hope I can touch on some
insight from there later.
46
00:02:23,240 --> 00:02:27,040
But everything I learned through
all of that really, again, just showed me
47
00:02:27,040 --> 00:02:30,040
that whatever skill you bring,
48
00:02:30,320 --> 00:02:31,920
you can bring a different view.
49
00:02:31,920 --> 00:02:35,360
So finding different allies
and different people who could champion
50
00:02:35,360 --> 00:02:39,360
you was,
I think, hardest early on in my career.
51
00:02:39,360 --> 00:02:42,560
But actually it's been super interesting
as I've gone on later in my career
52
00:02:42,720 --> 00:02:47,960
to find that the people I really learn
from are those around my network.
53
00:02:49,080 --> 00:02:50,960
And that's
where I get my greatest learning.
54
00:02:50,960 --> 00:02:52,680
But actually those are
my greatest allies too.
55
00:02:52,680 --> 00:02:55,080
And it probably took me a little while
to realize that.
56
00:02:55,080 --> 00:02:58,880
So building your relationships across
networks is the best thing you can do.
57
00:02:59,320 --> 00:03:00,840
Well,
I really appreciate you sharing that.
58
00:03:00,840 --> 00:03:03,360
And it looks like, and we're
going to talk about this in just a minute,
59
00:03:03,360 --> 00:03:05,360
it segues
really well into the next question.
60
00:03:05,360 --> 00:03:09,480
So I was researching some data
around women and technology in the UK,
61
00:03:09,480 --> 00:03:13,280
and some new data reveals
that the proportion of female employees
62
00:03:13,280 --> 00:03:18,520
in UK tech has declined for
the first time in five years, by over 2%.
63
00:03:18,520 --> 00:03:21,440
Now, this has also happened in Canada,
which is very interesting.
64
00:03:21,440 --> 00:03:25,200
So it looks like it might be a bit
of a global phenomenon.
65
00:03:25,800 --> 00:03:28,800
This study actually revealed
that under 15% of IT
66
00:03:28,800 --> 00:03:31,680
directors in the sector
in the UK are women.
67
00:03:31,680 --> 00:03:34,760
So you are really a big supporter
of diversity and equity.
68
00:03:35,120 --> 00:03:36,960
And I was wondering if you could share
69
00:03:36,960 --> 00:03:40,880
what you believe organizations
could do to support diverse workforces.
70
00:03:41,240 --> 00:03:44,960
Gosh, it's a really concerning stat, isn't it?
And it's something that comes after
71
00:03:45,440 --> 00:03:49,120
so much work by a lot of people
across the globe to increase that:
72
00:03:49,120 --> 00:03:53,920
female leaders, female CEOs
and definitely women in technology. It's
73
00:03:53,920 --> 00:03:57,520
really concerning, and the why is going to be
something to really unpack.
74
00:03:58,080 --> 00:04:00,840
But it's not only concerning
for the workforce we have, it's
75
00:04:00,840 --> 00:04:03,280
concerning for the solutions
that we're designing.
76
00:04:03,280 --> 00:04:07,000
And that's why it's so important
that we really address this.
77
00:04:07,360 --> 00:04:09,680
So we must normalize the conversation.
78
00:04:09,680 --> 00:04:13,760
A big thing that I try and do is
talk in business outcomes.
79
00:04:14,320 --> 00:04:18,840
Make sure it's really open to non-techies,
as I try and call them. Those people
80
00:04:19,200 --> 00:04:22,120
add just as much value,
if not more value.
81
00:04:22,120 --> 00:04:24,760
If you talk about the user, you're a technologist,
82
00:04:24,760 --> 00:04:28,200
because everybody uses technology
and I would love to hear from everybody.
83
00:04:28,440 --> 00:04:32,040
It's about getting in early as well
and making it relevant
84
00:04:32,040 --> 00:04:33,920
for people, back to normalizing.
85
00:04:33,920 --> 00:04:37,440
Help people understand what value
they can bring to the technology sector.
86
00:04:39,200 --> 00:04:39,960
But that's not good enough.
87
00:04:39,960 --> 00:04:43,960
So we've worked really hard
in terms of making new routes,
88
00:04:44,160 --> 00:04:48,600
sideways routes through retraining
and also realizing not everyone's a unicorn.
89
00:04:48,800 --> 00:04:51,120
So as long as we can talk about
clear development plans
90
00:04:51,120 --> 00:04:54,120
and that's for everyone,
not just females,
91
00:04:54,400 --> 00:04:56,160
we can support everyone to grow.
92
00:04:56,160 --> 00:05:00,840
So we've currently got 40% female in our technology organization,
which is fantastic, and
93
00:05:01,960 --> 00:05:05,080
50% female in my leadership team.
We've gone from
94
00:05:05,120 --> 00:05:09,480
almost one out of 12
to 50%, which is fantastic.
95
00:05:09,680 --> 00:05:12,240
But we must keep pushing that
and having those role models.
96
00:05:12,240 --> 00:05:15,720
There's that, there's an old phrase
that you can't be what you can't see.
97
00:05:16,040 --> 00:05:19,640
So we need more diversity of lots
of different demographics
98
00:05:19,920 --> 00:05:24,360
at all different stages in people's
careers, and understanding
99
00:05:24,360 --> 00:05:28,120
what the traps are. As you
mentioned that 2% decline, let's work out why.
100
00:05:28,160 --> 00:05:29,520
So yeah, it's really important.
101
00:05:29,520 --> 00:05:30,240
Yeah, I think it is.
102
00:05:30,240 --> 00:05:32,760
And given that you said you started
your career in engineering,
103
00:05:32,760 --> 00:05:36,880
I mean, I think there's that ability
to look at how younger women
104
00:05:36,880 --> 00:05:40,480
are looking at STEAM versus STEM
and bringing the arts
105
00:05:40,480 --> 00:05:44,280
and you talked about the soft skills, like
making sure that, you know,
106
00:05:44,280 --> 00:05:47,280
we recognize people's skills
outside of technology
107
00:05:47,280 --> 00:05:50,160
that seems to be trending a lot
in these conversations I'm having.
108
00:05:50,160 --> 00:05:53,640
Like we want people to bring in skills
and then a lot of the,
109
00:05:53,920 --> 00:05:55,720
you know, male allies
I've been speaking to
110
00:05:55,720 --> 00:05:57,600
in the tech industry have been saying,
you know, look,
111
00:05:57,600 --> 00:06:03,080
we know that women don't apply for jobs
if they only have 50 or 60%
112
00:06:03,080 --> 00:06:06,120
of the skills base, they don't think
they should, whereas men will.
113
00:06:06,440 --> 00:06:07,080
And so, you know,
114
00:06:07,080 --> 00:06:11,000
just the male allies that are encouraging
and looking at those applications
115
00:06:11,000 --> 00:06:14,640
that come in, encouraging women to apply,
I think that's such an important part.
116
00:06:14,640 --> 00:06:19,160
So, just as you mentioned right now,
having this dialog is so impactful for me.
117
00:06:19,160 --> 00:06:21,240
Every time I speak to a woman
who's working in the sector
118
00:06:21,240 --> 00:06:23,560
to talk about their career
and where they've come from,
119
00:06:23,560 --> 00:06:28,160
I gain some insights myself, so I really
appreciate you sharing that very much.
120
00:06:28,680 --> 00:06:32,440
You mentioned in the beginning that you've
had a career in various sectors,
121
00:06:32,760 --> 00:06:37,040
and a lot of CIOs I speak with talk
about knowledge gained by working across
122
00:06:37,040 --> 00:06:40,040
sectors, you know, and the key learnings
in their career through that.
123
00:06:40,920 --> 00:06:41,400
You've worked
124
00:06:41,400 --> 00:06:44,400
both in private and public sector
and now you're in the charitable sector.
125
00:06:44,680 --> 00:06:49,000
So could you please provide some insights
on how you utilize your learnings
126
00:06:49,000 --> 00:06:52,960
between sectors and perhaps why
cross-sector learning is really important?
127
00:06:53,320 --> 00:06:56,000
So when I think about this question,
I think about how
128
00:06:56,000 --> 00:06:57,920
our user doesn't care that
129
00:06:57,920 --> 00:07:01,200
I only work in the charitable sector
or the public sector or the private sector.
130
00:07:01,400 --> 00:07:03,240
They don't just experience one sector.
131
00:07:03,240 --> 00:07:06,240
They bounce across each sector
and they expect
132
00:07:06,240 --> 00:07:08,800
the same sort of level of service.
133
00:07:08,800 --> 00:07:12,600
Not everyone expects the Amazon experience
if they don't want to shop at Amazon,
134
00:07:12,960 --> 00:07:15,960
but absolutely
it's about keeping up with the Joneses.
135
00:07:16,560 --> 00:07:21,560
So it's not good enough to just think that
because I'm a charity, it's
136
00:07:21,560 --> 00:07:25,960
okay that I offer a less good experience
than the experience in those other sectors.
137
00:07:26,240 --> 00:07:29,240
So I think always keeping
punchy is really important
138
00:07:29,320 --> 00:07:31,360
because otherwise
the users will go elsewhere.
139
00:07:31,360 --> 00:07:35,360
And they'll get the service from elsewhere
and donate their money elsewhere.
140
00:07:35,560 --> 00:07:36,480
From my perspective,
141
00:07:36,480 --> 00:07:40,160
that's why I mentioned that networking
is so important, and there are huge learnings
142
00:07:40,200 --> 00:07:43,360
with peers
and we're all hungry for the same thing.
143
00:07:43,360 --> 00:07:45,960
There's not enough money
to go around to reinvent the wheel
144
00:07:45,960 --> 00:07:47,720
and the wheel
doesn't need to be reinvented
145
00:07:47,720 --> 00:07:50,800
in a lot of places. We can learn from each other
as we're going on
146
00:07:50,800 --> 00:07:53,800
in each journey
into different uses of technologies
147
00:07:53,880 --> 00:07:58,240
and soft skills,
and sharing that diversity. So
148
00:07:59,280 --> 00:08:00,520
what I've learned,
149
00:08:00,520 --> 00:08:03,160
either packing boxes
in a warehouse
150
00:08:03,160 --> 00:08:06,160
or understanding
the fish canning line at Grimsby,
151
00:08:06,240 --> 00:08:10,600
for example, looking inside
fish to make sure that the quality is good
152
00:08:10,600 --> 00:08:15,960
enough, or even the very high-tech
potato sorter on the manufacturing site,
153
00:08:16,040 --> 00:08:19,240
just to give a few examples
of the supermarket
154
00:08:19,240 --> 00:08:24,120
you see behind the shop front,
I always think about how technology,
155
00:08:24,120 --> 00:08:27,240
the user, the journey,
the efficiency for people are so valuable.
156
00:08:28,400 --> 00:08:31,440
And so often when I'm talking to different
stakeholders and users,
157
00:08:32,240 --> 00:08:35,400
when I think about different experiences
I've had across those different sectors,
158
00:08:35,640 --> 00:08:38,960
I always think
there's a different way to solve a problem
159
00:08:39,120 --> 00:08:44,720
and that there is always sort of certain
friction, and I always find an analysis
160
00:08:44,720 --> 00:08:47,720
I can do from a sector
that I've been in
161
00:08:47,960 --> 00:08:50,800
to find new ways to solve problems.
162
00:08:50,800 --> 00:08:52,080
I think. So
163
00:08:52,080 --> 00:08:54,720
the charity sector
is no different to another sector.
164
00:08:54,720 --> 00:08:57,560
Even more
so with the same stakeholder pressures.
165
00:08:57,560 --> 00:09:01,080
We've all got shareholders,
daily profit and loss targets,
166
00:09:01,080 --> 00:09:03,480
it's just different
things we're looking at over
167
00:09:04,440 --> 00:09:07,560
revenue implications,
or even when there are no revenue implications.
168
00:09:07,560 --> 00:09:11,080
But we can again keep up with the Joneses,
we can really push ahead
169
00:09:11,320 --> 00:09:13,280
and look outside
at the global trends as well,
170
00:09:13,280 --> 00:09:17,640
because someone could have done it out
there, and so why not?
171
00:09:17,720 --> 00:09:21,680
So my only advice is always think
differently, always keep hungry,
172
00:09:21,680 --> 00:09:24,160
so you can stay ahead of the game.
And that's absolutely fine,
173
00:09:24,160 --> 00:09:26,680
looking across sectors, across the world.
I love that.
174
00:09:26,680 --> 00:09:28,560
And you know,
when you talk about across the world
175
00:09:28,560 --> 00:09:32,200
and in the charitable sector,
you know, a few years back the UK
176
00:09:32,480 --> 00:09:34,560
introduced the tap to pay
at a lot of events. Right?
177
00:09:34,560 --> 00:09:38,760
And so instead of having to put your £5 in
or whatever into the donation box,
178
00:09:38,760 --> 00:09:40,400
you could just tap your card.
179
00:09:40,400 --> 00:09:43,920
Well, that hadn't come out in Canada,
but it was through learning from charities here
180
00:09:43,920 --> 00:09:46,880
what's happening over there. I went,
oh my God, this is amazing.
181
00:09:46,880 --> 00:09:49,960
And so, like the banking industry
did a whole innovation around
182
00:09:49,960 --> 00:09:53,000
that in the UK, which again,
you talk about global learning.
183
00:09:53,000 --> 00:09:54,960
So I really appreciate
you bringing that up
184
00:09:54,960 --> 00:09:56,880
because that is such an important part
of it.
185
00:09:56,880 --> 00:10:00,000
And obviously, again, your background
working at Morrisons,
186
00:10:00,000 --> 00:10:03,080
having that opportunity to look at,
you know, how food is processed
187
00:10:03,080 --> 00:10:06,480
and probably looking at blockchain
and other things just, just again
188
00:10:06,480 --> 00:10:08,400
enhances your role that you're in now
189
00:10:08,400 --> 00:10:10,720
because you're having those outside
learnings.
190
00:10:10,720 --> 00:10:14,400
A CIO at a roundtable said to me once,
or we were at a roundtable
191
00:10:14,400 --> 00:10:16,680
talking about cloud
and all sorts of things
192
00:10:16,680 --> 00:10:20,440
but the one CIO said,
you know, the end user
193
00:10:20,440 --> 00:10:24,320
does expect that Amazon experience,
you order one day and get it the next.
194
00:10:24,440 --> 00:10:27,400
Or we talked about the Ocado
experience of food delivery
195
00:10:27,400 --> 00:10:30,720
or whatever,
so I appreciate you sharing that with us.
196
00:10:30,720 --> 00:10:34,320
And I wanted to really talk
about tech for good.
197
00:10:34,560 --> 00:10:35,680
It really seems to be
198
00:10:35,680 --> 00:10:38,920
with a lot of the CIOs,
especially in the UK, talking about this
199
00:10:38,920 --> 00:10:41,920
and really being intentional
about looking at that.
200
00:10:42,360 --> 00:10:45,160
And the last time we spoke,
you spoke about your passion for tech
201
00:10:45,160 --> 00:10:48,040
for good, really to support inclusion
of underrepresented
202
00:10:48,040 --> 00:10:50,720
communities in accessing technology.
203
00:10:50,720 --> 00:10:53,280
And you made a really interesting
point to me around,
204
00:10:53,280 --> 00:10:56,200
you know, building data models
that can really help inform that.
205
00:10:56,200 --> 00:10:59,600
So really being intentional
and working with data to understand
206
00:10:59,880 --> 00:11:02,880
where you can, you know,
support underrepresented communities.
207
00:11:03,360 --> 00:11:06,120
So could you please discuss this
and perhaps some of the ways
208
00:11:06,120 --> 00:11:09,080
you believe leaders
and organizations can support this?
209
00:11:09,080 --> 00:11:12,640
This is a huge question and again,
it's super exciting.
210
00:11:12,840 --> 00:11:16,800
I'd probably pick up on your last comment actually
about different sectors
211
00:11:16,800 --> 00:11:19,880
and globally as well,
because I think about Blockbuster,
212
00:11:19,880 --> 00:11:24,360
which was a film hire company in the UK.
I always say
213
00:11:24,360 --> 00:11:28,320
you don't want to turn into a Blockbuster.
The need was there to change,
214
00:11:28,320 --> 00:11:31,240
but it happened a little bit too late.
And I think that's really interesting
215
00:11:31,240 --> 00:11:34,240
about the global thing as well
and about doing tech for good.
216
00:11:34,520 --> 00:11:37,640
The nations who are really performing
are actually some of those
217
00:11:37,640 --> 00:11:42,200
who are furthest behind in terms
of digital inclusion, and that's why
218
00:11:42,240 --> 00:11:45,240
I think it's really important
to not be complacent.
219
00:11:45,680 --> 00:11:48,360
There were some really
groundbreaking activities
220
00:11:48,360 --> 00:11:54,240
in different areas of the world
that were leapfrogging us in tap to pay,
221
00:11:54,240 --> 00:11:57,880
which you've mentioned,
because they need to. That need is so critical.
222
00:11:57,880 --> 00:12:00,960
And so when I think about
looking at different data models, I've got
223
00:12:01,840 --> 00:12:05,200
some different viewpoints
on joining up different data models
224
00:12:05,200 --> 00:12:08,200
from different styles of data,
for example,
225
00:12:08,600 --> 00:12:13,160
and the power of that, because we
actually only look at the boundaries
226
00:12:14,200 --> 00:12:17,200
that we set, or the datasets
that we currently see.
227
00:12:17,400 --> 00:12:20,880
And so by thinking differently,
by looking out into space,
228
00:12:20,880 --> 00:12:21,840
as one of my peers would say,
229
00:12:21,840 --> 00:12:25,720
and just searching for new stars,
you can find insight.
230
00:12:25,960 --> 00:12:27,920
And so when I think about the charity
231
00:12:27,920 --> 00:12:31,720
sector that I work in, I currently serve
a certain portion of people living with cancer,
232
00:12:32,160 --> 00:12:33,000
and that's not good enough.
233
00:12:33,000 --> 00:12:36,000
And so when I think about
how I can solve that,
234
00:12:36,240 --> 00:12:38,080
I think about what could help join that up.
235
00:12:38,080 --> 00:12:41,640
So I draw on the experience from when I worked in
central government at the height of COVID,
236
00:12:41,880 --> 00:12:46,920
when we formulated the discussions
237
00:12:46,920 --> 00:12:50,280
to join up different datasets
from across government departments.
238
00:12:50,600 --> 00:12:54,240
And what the power of that showed
is that through aggregated datasets
239
00:12:54,240 --> 00:13:01,440
that were anonymized, you had a kind of
a nationwide view, but localized impact.
240
00:13:01,680 --> 00:13:05,760
You could see the different impact
of different datasets being joined up.
241
00:13:05,760 --> 00:13:10,200
So for example, with water data
or the number of people on the road,
242
00:13:10,240 --> 00:13:15,280
you can see the transmissions,
for example, either increase or decrease.
243
00:13:15,600 --> 00:13:19,760
And I think the same thing can happen in
different examples, using open
244
00:13:19,760 --> 00:13:25,440
datasets or datasets
from other organizations
245
00:13:25,440 --> 00:13:29,080
that can be shared,
for example linked to social care.
246
00:13:29,120 --> 00:13:30,160
So,
247
00:13:30,160 --> 00:13:34,920
at what point in a social ecosystem
can you apply a certain lever
248
00:13:35,040 --> 00:13:38,360
so that the individual will not go back
to social care like a rubber band?
249
00:13:38,600 --> 00:13:42,120
And so it's understanding those datasets
and understanding, well,
250
00:13:42,120 --> 00:13:45,440
actually if you provide that lever over
and over again, you can move that child
251
00:13:45,440 --> 00:13:49,080
and that dependence away from social care
and break the trend.
252
00:13:49,840 --> 00:13:52,240
And I think that's so important
because obviously
253
00:13:52,240 --> 00:13:55,440
the data is out there,
but we in society aren't using it.
254
00:13:55,480 --> 00:13:57,720
We're trying to hold on to it so closely.
255
00:13:57,720 --> 00:13:59,520
But actually we all know it's there
256
00:13:59,520 --> 00:14:01,600
and different people are
using it for different activities.
257
00:14:01,600 --> 00:14:04,000
I mean, we've actually been
using it, even offline.
258
00:14:04,000 --> 00:14:07,280
So as a retailer for example, someone
would look out, see it's raining
259
00:14:07,440 --> 00:14:12,360
or see a film being loaned to another cinema
and adjust their offering.
260
00:14:13,520 --> 00:14:15,840
People have naturally done it for years.
261
00:14:15,840 --> 00:14:16,560
But actually
262
00:14:16,560 --> 00:14:19,920
what can we do if we actually harness
the power of data sitting in systems,
263
00:14:20,520 --> 00:14:23,520
making sure it's safe
and controlled, that people's trust
264
00:14:23,880 --> 00:14:26,680
and people's control of that
data is really clear,
265
00:14:26,680 --> 00:14:30,360
but actually showing the benefits
with real-life applications.
266
00:14:30,720 --> 00:14:34,520
If we look at different societies, some
have achieved this really, really well
267
00:14:34,560 --> 00:14:38,000
through COVID. They democratized the trust
268
00:14:39,440 --> 00:14:44,160
and they democratized data, control and input in discussions,
and it really increased
269
00:14:44,160 --> 00:14:48,400
people's engagement with activity
and the trust in the entire system.
270
00:14:48,560 --> 00:14:50,200
And therefore onwards
271
00:14:50,200 --> 00:14:54,320
they could understand the motivators
and detractors from using digital tools.
272
00:14:54,480 --> 00:14:57,280
So I think it's really, really powerful
273
00:14:57,280 --> 00:14:59,880
as one element of tech for good.
274
00:14:59,880 --> 00:15:02,880
And so there are so many foundations
we must put in place
275
00:15:03,000 --> 00:15:07,600
before we get on to shiny new things
such as, well not just AI, but elements
276
00:15:07,600 --> 00:15:12,080
of AI or other things, because people
need to want to trust the system.
277
00:15:12,360 --> 00:15:15,840
So foundations are really important as part of it.
278
00:15:16,680 --> 00:15:17,760
You're really inspiring me.
279
00:15:17,760 --> 00:15:19,680
Thank you so much for saying that,
because I'm thinking
280
00:15:19,680 --> 00:15:22,320
we're going to our next question,
which is going to be about innovation.
281
00:15:22,320 --> 00:15:22,920
And GenAI.
282
00:15:22,920 --> 00:15:25,200
But, you know,
I think what you're saying is
283
00:15:25,200 --> 00:15:28,960
and what I'm hearing is
almost that we have this opportunity right now
284
00:15:28,960 --> 00:15:31,960
with this new inflection point in tech
and maybe quantum,
285
00:15:32,400 --> 00:15:35,000
You know, I feel like
Quantum is really coming in behind.
286
00:15:35,000 --> 00:15:38,840
So there's that opportunity to potentially
look at how we're connecting this data.
287
00:15:38,840 --> 00:15:39,600
Imagine
288
00:15:39,600 --> 00:15:43,480
the speed at which we could do things
quicker and faster and support individuals
289
00:15:43,480 --> 00:15:44,400
and how we could share.
290
00:15:44,400 --> 00:15:48,120
And if corporations were aligned to that,
to making sure there was, you know,
291
00:15:48,120 --> 00:15:51,280
because I think of organizations
that donate food to food banks
292
00:15:51,280 --> 00:15:54,360
or other organizations,
imagine if that was all connected, how
293
00:15:54,360 --> 00:15:55,600
that would really change the world.
294
00:15:55,600 --> 00:15:57,960
So, yeah,
it's very inspirational.
295
00:15:57,960 --> 00:15:59,960
So thank you for sharing that.
296
00:15:59,960 --> 00:16:03,480
And I did want to go through
to our last question, which is I'm asking
297
00:16:03,480 --> 00:16:07,520
everybody this question right now,
and the theme is really around innovation
298
00:16:07,520 --> 00:16:11,360
and GenAI. So,
you know, obviously, GenAI
299
00:16:11,360 --> 00:16:14,640
and LLMs are very prevalent right now
in discussions about innovation
300
00:16:14,640 --> 00:16:17,920
and just in discussions
about technology, really.
301
00:16:18,280 --> 00:16:21,560
So could you share your views on that
and perhaps some of the ways
302
00:16:21,560 --> 00:16:24,800
you're looking at to deploy
or what you're seeing in the market?
303
00:16:25,000 --> 00:16:28,360
So if I think about this,
I'm really a big fan of shiny toys,
304
00:16:29,080 --> 00:16:32,280
but shiny toys
that are bound with their limitations
305
00:16:32,280 --> 00:16:35,760
in the discussion.
There's definitely a place for keeping up
306
00:16:35,760 --> 00:16:38,760
and understanding what's out in the market
and playing safely.
307
00:16:38,840 --> 00:16:42,360
But at the same time, I think there are
some foundations that we need to think about.
308
00:16:42,600 --> 00:16:45,040
If I use an example from
one of my wonderful medical team,
309
00:16:45,040 --> 00:16:48,200
who I'm really privileged to work with because
they work directly with the frontline,
310
00:16:49,440 --> 00:16:51,720
it's that I
think we need to recontextualize.
311
00:16:51,720 --> 00:16:57,240
We're at a really critical, exciting
point in a technological revolution
312
00:16:57,840 --> 00:17:01,920
or evolution to think, to recontextualize,
not just do as we've always done.
313
00:17:02,000 --> 00:17:05,640
So arthritis, for example,
has been treated the same way since the 1950s,
314
00:17:05,800 --> 00:17:10,600
based on one dataset of white middle-aged men
in the north of Manchester in the UK.
315
00:17:11,040 --> 00:17:12,240
Why do we still do that?
316
00:17:12,240 --> 00:17:13,520
And so it becomes
317
00:17:13,520 --> 00:17:17,480
pretty apparent that actually the medical
treatments are very different now.
318
00:17:17,640 --> 00:17:21,880
And so the same solution, the same fix,
isn't working.
319
00:17:22,040 --> 00:17:24,960
So in the same way,
how can we use the opportunity
320
00:17:24,960 --> 00:17:28,560
with AI and large language models
and whatever is going to come next
321
00:17:29,320 --> 00:17:30,560
in order to recontextualize?
322
00:17:30,560 --> 00:17:33,240
I go back to the bones:
what is the user actually needing?
323
00:17:33,240 --> 00:17:35,560
And how can we solve that problem?
324
00:17:35,560 --> 00:17:38,880
Because no doubt it will create
more discussions about dependencies
325
00:17:39,080 --> 00:17:44,680
and also get rid of the wasted
effort in all those processes, and also expose
326
00:17:44,680 --> 00:17:48,560
second order effects of using AI and
327
00:17:49,840 --> 00:17:50,480
LLMs.
328
00:17:50,480 --> 00:17:53,600
What is this
all done for? If you make the hospital porter
329
00:17:53,600 --> 00:17:57,640
group more efficient, of course
it's going to open up more bed spaces.
330
00:17:57,960 --> 00:18:01,880
With some of these examples, if we just
map them out and think a bit broader,
331
00:18:02,160 --> 00:18:05,160
we can think about different
implications.
332
00:18:05,880 --> 00:18:09,480
So for me it's about exciting people
about the opportunities of data,
333
00:18:09,760 --> 00:18:14,400
really talking to people about ethics
and talking about biases, and helping
334
00:18:14,400 --> 00:18:19,200
people understand what that excitement is,
to bring them along on the journey.
335
00:18:19,840 --> 00:18:23,760
Because at the end of the day, we're
all looking to make the future better,
336
00:18:24,000 --> 00:18:27,080
but it must be secure, resilient
and joined up
337
00:18:27,080 --> 00:18:30,560
and efficient because all of these things
are really expensive as well.
338
00:18:30,840 --> 00:18:34,440
So I think some of us know
some of the pricing models
339
00:18:34,440 --> 00:18:37,720
with new AI systems
through our hype vendors
340
00:18:37,720 --> 00:18:41,280
as staff are trying to go live, and
it's extortionate, frankly.
341
00:18:41,520 --> 00:18:44,480
I, as a charity, can't keep up.
342
00:18:44,480 --> 00:18:48,400
So how can we all get on this bus and be
supporting the users in the same way,
343
00:18:48,720 --> 00:18:53,400
including those vendors.
Obviously there will be people who will be left behind,
344
00:18:53,640 --> 00:18:57,000
and I don't mind
if it's us, but I'm really
345
00:18:57,440 --> 00:19:00,480
conscious of the person using the system
because if we're being left behind
346
00:19:00,800 --> 00:19:04,000
the user who needs the most services
will absolutely be left behind.
347
00:19:04,200 --> 00:19:07,440
And those are the people
I'm worried about.
348
00:19:07,640 --> 00:19:11,160
So how I answer questions in my mind
is about knowing what's out there,
349
00:19:11,480 --> 00:19:14,920
absolutely making sure
that the foundations
350
00:19:15,120 --> 00:19:18,240
of data sharing, data ethics and data
management are in place,
351
00:19:19,040 --> 00:19:22,880
and a real hunger for people
352
00:19:23,080 --> 00:19:27,960
to understand the opportunities
and to excite people with that.
353
00:19:28,280 --> 00:19:31,480
Because at the same time,
that will open up those motivators
354
00:19:31,760 --> 00:19:34,600
for others
to get on a digital inclusion journey,
355
00:19:34,600 --> 00:19:38,080
which will only help the health, wealth
and ultimate well-being
356
00:19:38,680 --> 00:19:42,600
in the community
if they are enabled and excited to make
357
00:19:42,600 --> 00:19:45,720
best use of the exciting technologies
that are out there.
358
00:19:45,800 --> 00:19:50,040
Dr. Roxane, you're very, very inspiring
and I really appreciate this conversation
359
00:19:50,040 --> 00:19:50,400
today.
360
00:19:50,400 --> 00:19:53,520
It has been just phenomenal for me
to listen to you and hear
361
00:19:53,560 --> 00:19:55,080
your viewpoints on technology.
362
00:19:55,080 --> 00:19:56,760
And so thank you so much.
363
00:19:57,960 --> 00:19:58,320
Thank you.
364
00:19:58,320 --> 00:20:01,080
Thank you very much.
I really loved talking to you. Thank you.
365
00:20:01,080 --> 00:20:03,920
And if you're interested in learning more,
please don't hesitate to visit us
366
00:20:03,920 --> 00:20:06,320
at cio.com/uk
367
00:20:06,320 --> 00:20:06,920
Thanks again.