WEBVTT

00:00:02.200 --> 00:00:07.100
In a deep sense, this is the technology that the world, that society and

00:00:07.100 --> 00:00:07.900
people have always wanted.

00:00:07.900 --> 00:00:10.100
So I can talk about this for a very long time.

00:00:10.100 --> 00:00:13.900
I think we can do a lot to really maximize the good and to manage

00:00:13.900 --> 00:00:14.500
and mitigate the bad.

00:00:14.500 --> 00:00:19.300
But the scary part is just sort of putting this lever into the

00:00:19.300 --> 00:00:22.000
world will for sure have unpredictable consequences.

00:00:51.500 --> 00:00:54.000
We think it's very important to our mission to deploy

00:00:54.000 --> 00:00:55.100
things like ChatGPT,

00:00:55.100 --> 00:01:01.300
so that people gain some experience and feeling of the capabilities

00:01:01.300 --> 00:01:02.600
and limitations of these systems.

00:01:02.600 --> 00:01:06.900
But as these models get to know you better and are capable of

00:01:06.900 --> 00:01:11.900
more, you can really imagine a world where you have a fairly

00:01:11.900 --> 00:01:15.200
simple and short conversation with the model and a huge amount

00:01:15.200 --> 00:01:18.100
of things get done on your behalf, pretty soon.

00:01:18.100 --> 00:01:21.300
I think we'll just expect all the products and services

00:01:21.300 --> 00:01:23.300
we use to have some intelligence baked in,

00:01:23.300 --> 00:01:27.000
and it'll just kind of be expected like a mobile app is today.

00:01:27.000 --> 00:01:29.500
I think if this technology goes wrong,

00:01:29.500 --> 00:01:33.800
it can go quite wrong, and we want to be vocal about that.

00:01:33.800 --> 00:01:36.200
We want to work with the government on it. It

00:01:36.200 --> 00:01:39.100
is our responsibility to educate policymakers and the public

00:01:39.100 --> 00:01:41.800
at large about what we think is happening,

00:01:41.800 --> 00:01:43.200
what we think may happen,

00:01:43.200 --> 00:01:47.600
what we may have in the future, and to put the technology out into the

00:01:47.600 --> 00:01:48.700
world so people can see it.

00:01:51.200 --> 00:01:52.100
proliferation of

00:01:53.900 --> 00:01:56.800
deepening of societal inequalities.

00:01:57.900 --> 00:02:03.300
We have seen how algorithmic biases can perpetuate discrimination

00:02:03.300 --> 00:02:08.900
and prejudice, and how the lack of transparency can undermine

00:02:08.900 --> 00:02:12.800
public trust. That voice was not mine.

00:02:13.800 --> 00:02:15.600
The words were not mine.

00:02:16.800 --> 00:02:23.400
And the audio was an AI voice cloning software trained on my floor

00:02:23.400 --> 00:02:27.500
speeches. The remarks were written by ChatGPT

00:02:28.600 --> 00:02:34.800
when it was asked how I would open this hearing.

00:02:34.800 --> 00:02:37.800
So we've grown up in a world where we sort of trust someone's

00:02:37.800 --> 00:02:38.900
voice we hear over the phone.

00:02:38.900 --> 00:02:42.100
This particular example, I think, is going to be a problem

00:02:42.100 --> 00:02:45.500
and we need society to adjust to it fast.

00:02:45.500 --> 00:02:47.900
So I think we just all need to start telling people:

00:02:47.900 --> 00:02:51.900
this is coming. You can't trust a voice you hear over the phone

00:02:51.900 --> 00:02:56.300
anymore. And society is capable of adapting to this. People are much

00:02:56.300 --> 00:02:58.700
smarter and savvier than I think a lot of the so-called

00:02:58.700 --> 00:02:59.400
experts

00:03:00.500 --> 00:03:01.100
think.

00:03:03.100 --> 00:03:06.500
These systems are already quite powerful and will get tremendously

00:03:06.500 --> 00:03:11.200
powerful, and we have come together as a global community

00:03:11.200 --> 00:03:15.800
before for very powerful technologies that pose a substantial

00:03:15.800 --> 00:03:18.800
risk that one has to overcome to get to the tremendous

00:03:18.800 --> 00:03:19.100
upside.

00:03:20.900 --> 00:03:26.700
I think AI will contribute a lot to increasing the total amount

00:03:26.700 --> 00:03:27.500
of wealth in the world.

00:03:27.500 --> 00:03:30.800
However, it won't be sufficient on its own.

00:03:30.800 --> 00:03:34.500
And I still think we need much more public policy around

00:03:34.500 --> 00:03:37.200
how we're going to divvy up access to these systems,

00:03:37.200 --> 00:03:39.000
and how we're going to divvy up governance of these systems.

00:03:39.000 --> 00:03:41.000
That will be a matter for policy,

00:03:41.000 --> 00:03:45.900
not the technology alone. The effects that AI will have are going

00:03:45.900 --> 00:03:48.700
to be very different in the world than the effects that social media had.

00:03:48.700 --> 00:03:54.500
And so I think it is difficult and dangerous to say,

00:03:54.500 --> 00:03:56.500
oh, it's going to play out like this.

00:03:56.500 --> 00:03:59.000
We're going to think of this like social media, or we're going to think of this

00:03:59.000 --> 00:04:01.200
like nuclear weapons, or think of this like synthetic

00:04:01.200 --> 00:04:03.300
biology, or think of this like the iPhone.

00:04:03.300 --> 00:04:07.700
This is just a new thing and I think analogies from the past

00:04:07.700 --> 00:04:09.500
are going to fit particularly badly,

00:04:09.500 --> 00:04:10.400
at least this time around.