WEBVTT

00:00:00.000 --> 00:00:02.200
Hey everyone, welcome back to Today in Tech.

00:00:02.200 --> 00:00:03.300
I'm Julia Beauchamp.

00:00:03.300 --> 00:00:07.000
I'm here with the CEO of DrumWave, Michelle Dennedy. Today

00:00:07.000 --> 00:00:10.300
we're talking privacy and data and how to make it work for you,

00:00:10.300 --> 00:00:11.200
so stick around.

00:00:18.100 --> 00:00:21.700
Michelle, thank you so much for calling in all the way from sunny,

00:00:21.700 --> 00:00:26.200
probably warm California. It's good to be here.

00:00:26.200 --> 00:00:30.900
Yeah, we're dealing with cold over here in the Northeast,

00:00:30.900 --> 00:00:33.100
but I think we're warming up hopefully soon.

00:00:33.100 --> 00:00:36.000
But anyway, we're here to talk about data privacy.

00:00:36.000 --> 00:00:39.400
And I think, for a lot of us who hear privacy,

00:00:39.400 --> 00:00:41.500
I know I'm certainly one of them.

00:00:41.500 --> 00:00:45.100
I hear it and I think, okay, data privacy laws, and it makes you feel

00:00:45.100 --> 00:00:48.800
as if data is something that is inherently mishandled,

00:00:48.800 --> 00:00:50.200
but is that really the truth?

00:00:51.100 --> 00:00:52.200
Not at all.

00:00:52.200 --> 00:00:54.100
And I'm glad you asked, for that very reason.

00:00:54.100 --> 00:00:57.200
I think it's a misconception and a missed opportunity.

00:00:57.200 --> 00:01:00.800
So when you think about privacy, I think there are two sorts

00:01:00.800 --> 00:01:04.800
of reactions. One is that it's super legal, super regulated,

00:01:04.800 --> 00:01:08.600
and you're Dr. No: I can't talk to the people I want to talk

00:01:08.600 --> 00:01:11.900
to, I can't market to the people I want to market to. Or you

00:01:11.900 --> 00:01:13.500
think, mine, mine, mine.

00:01:13.500 --> 00:01:18.200
Privacy is about hiding in our little cloistered world and

00:01:18.200 --> 00:01:22.800
not being observed. Privacy is actually a much more nuanced

00:01:22.800 --> 00:01:23.900
and interesting conversation.

00:01:23.900 --> 00:01:28.300
It's about authorized sharing of information according

00:01:28.300 --> 00:01:30.700
to ethical, moral, and legal guidelines.

00:01:30.700 --> 00:01:35.600
So it's a technology conversation, it's a legal and policy conversation,

00:01:35.600 --> 00:01:38.300
and it's a business and cultural conversation.

00:01:39.300 --> 00:01:43.600
So how can we think of our data as something that isn't something

00:01:43.600 --> 00:01:46.400
to be afraid of? Is this a relatively new conversation,

00:01:46.400 --> 00:01:48.300
or have we always been afraid of data?

00:01:49.300 --> 00:01:51.900
I think we've been afraid of data for a super long time.

00:01:51.900 --> 00:01:56.700
So privacy laws go far, far back in history, to the Code of Hammurabi,

00:01:56.700 --> 00:02:00.200
where you talk about technology and availability.

00:02:00.200 --> 00:02:03.500
There's actually a lot in the Code of Hammurabi that says

00:02:03.500 --> 00:02:05.000
what happens to you

00:02:05.000 --> 00:02:08.900
If you peek into the window of your neighbor and look upon his

00:02:08.900 --> 00:02:12.300
wife in a state of undress. Now, while the consequences

00:02:12.300 --> 00:02:15.800
were a bit dire, it was a different time, but what they were talking

00:02:15.800 --> 00:02:20.900
about was technology. It was possible to look into your neighbor's

00:02:20.900 --> 00:02:25.000
home, behaviors that clearly were going on. People were peeking, I guess,

00:02:25.000 --> 00:02:29.600
and they were doing things, and the prohibitions against those

00:02:29.600 --> 00:02:33.500
activities were based on the technology that was available at the

00:02:33.500 --> 00:02:37.900
time, and balanced against cultural needs. So the culture

00:02:37.900 --> 00:02:42.000
of privacy goes back to ancient times. But to talk about

00:02:42.000 --> 00:02:46.400
data as an asset is starting to be a newer conversation.

00:02:46.400 --> 00:02:47.700
That's gathering steam.

00:02:47.700 --> 00:02:48.800
But one of my

00:02:49.200 --> 00:02:55.300
idols, Rear Admiral Grace Hopper, said in 1965 that data would

00:02:55.300 --> 00:02:58.400
someday be on the corporate balance sheet because in many cases

00:02:58.400 --> 00:03:04.200
it's more valuable than the hardware that processes it. That was over 50 years

00:03:04.200 --> 00:03:05.800
ago. I think it's time to get on it.

00:03:06.900 --> 00:03:10.800
That's a really interesting point. It seems like something

00:03:10.800 --> 00:03:14.600
that has maybe been shifted into the, you know, public conversation

00:03:14.600 --> 00:03:18.600
more often. I would think that a lot of our viewers are acutely

00:03:18.600 --> 00:03:21.300
aware of the fact that a lot of tech companies are profiting

00:03:21.300 --> 00:03:25.100
very much off of their data, and that's why you maybe see some

00:03:25.100 --> 00:03:27.700
of these consumer privacy laws coming in and trying to stop

00:03:27.700 --> 00:03:31.500
some companies from benefiting and profiting off of user data

00:03:31.500 --> 00:03:36.500
in a way that seems unfair. But it seems like people are sort

00:03:36.500 --> 00:03:40.100
of more aware of how their data is used.

00:03:40.100 --> 00:03:43.600
And do you think that's leading to people sort of, you know, feeling

00:03:43.600 --> 00:03:44.400
in control.

00:03:46.200 --> 00:03:49.900
I think yes and no. I think as soon as you start talking about

00:03:49.900 --> 00:03:52.800
transparency as some sort of magical Elixir,

00:03:52.800 --> 00:03:56.100
I'm going to tell you what I collect about you and then suddenly

00:03:56.100 --> 00:04:01.100
it's some sort of a get-out-of-jail-free card or I have to say

00:04:01.100 --> 00:04:03.800
either share everything or share nothing.

00:04:03.800 --> 00:04:08.000
It turns into this weird dichotomy where you're not really

00:04:08.000 --> 00:04:11.700
advancing the real dynamic, I think. To look at it,

00:04:11.700 --> 00:04:13.600
Let's look at it in the context of this year.

00:04:13.600 --> 00:04:17.700
And now we're in the middle of stay-at-home orders all over

00:04:17.700 --> 00:04:21.700
the world fighting a global pandemic by staying isolated

00:04:21.700 --> 00:04:24.800
from each other because we don't have a cure for this type of

00:04:24.800 --> 00:04:29.300
disease. So what did we do? We digitized almost every conversation

00:04:29.300 --> 00:04:33.900
overnight. Where you used to have these long-winded discussions

00:04:33.900 --> 00:04:37.500
about bring your own device, and should you allow your workers

00:04:37.500 --> 00:04:38.100
to work from home?

00:04:39.500 --> 00:04:41.900
Guess what? They're working from home.

00:04:41.900 --> 00:04:46.900
So we're using those digital services, and I think

00:04:46.900 --> 00:04:48.300
we can look at it in one of two ways.

00:04:48.300 --> 00:04:52.000
You can trust the Digital Services because you have no other

00:04:52.000 --> 00:04:53.400
choice; you're in your home.

00:04:53.400 --> 00:04:56.200
If you want to work, you have to turn on your screen and share,

00:04:56.200 --> 00:05:00.300
you know, welcome to my home office or whatever place you are.

00:05:00.300 --> 00:05:02.800
I can turn on the technology and you can see,

00:05:02.800 --> 00:05:05.900
you know, the Golden Gate Bridge or stars or whatever Zoom

00:05:05.900 --> 00:05:06.800
has on the menu today.

00:05:06.800 --> 00:05:12.300
You can also look at this in a business and consumer partnership

00:05:12.300 --> 00:05:14.000
sort of fashion.

00:05:14.000 --> 00:05:17.700
What are the assets right now that are valuable?

00:05:17.700 --> 00:05:21.200
My team is stretched across the whole world.

00:05:21.200 --> 00:05:25.800
So down in Brazil, up in Canada, all the way in Spain, and we

00:05:25.800 --> 00:05:29.100
have conversations where I can look into their faces and sort of guess

00:05:29.100 --> 00:05:30.500
things about their moods.

00:05:30.500 --> 00:05:34.000
It's not coming from just a digital conversational

00:05:34.000 --> 00:05:37.800
word or text, and we can have a conversation and do business.

00:05:39.400 --> 00:05:42.900
That is a blessing. Understanding that the data and information

00:05:42.900 --> 00:05:46.900
can be intellectual property that drives DrumWave Incorporated,

00:05:46.900 --> 00:05:52.200
that a conversation can be deeply personal, that a conversation

00:05:52.200 --> 00:05:55.500
can be possessed and observed by third parties

00:05:55.500 --> 00:05:59.200
if we're not careful. These sorts of re-evaluations

00:05:59.200 --> 00:06:02.200
are not just during a crisis or pandemic.

00:06:02.200 --> 00:06:06.300
I believe that we are not talking about getting back to normal

00:06:06.300 --> 00:06:08.200
or anything. The world has changed.

00:06:08.200 --> 00:06:10.600
We went digital overnight.

00:06:10.600 --> 00:06:13.600
We've been prophesying that the world was going to experience

00:06:13.600 --> 00:06:17.000
a digital transformation ever since the internet allowed

00:06:17.000 --> 00:06:20.800
us to have reachability and email addresses connecting

00:06:20.800 --> 00:06:25.600
everything. So I think the reality is that we're all experiencing

00:06:25.600 --> 00:06:30.300
data in a totally new way and we all know we're not going to

00:06:30.300 --> 00:06:32.000
have a lot of currency in the next several months.

00:06:33.100 --> 00:06:36.300
But what I do know is you're going to have a lot more data.

00:06:36.300 --> 00:06:38.500
And so the question is what's next?

00:06:39.500 --> 00:06:42.600
Right, and I think my question for you then is twofold.

00:06:42.600 --> 00:06:45.900
How can, you know, like you said, enterprises have shifted

00:06:45.900 --> 00:06:47.600
to using a lot of these collaboration tools.

00:06:47.600 --> 00:06:52.000
I mean, we're connecting right now via Zoom, virtually overnight.

00:06:52.000 --> 00:06:56.800
So how can enterprises, first of all, make sure that, you know,

00:06:56.800 --> 00:06:59.800
they're getting value out of the product and that perhaps

00:06:59.800 --> 00:07:02.400
their privacy isn't being compromised? And on the

00:07:02.400 --> 00:07:05.400
other hand, how can these vendors that are now experiencing

00:07:05.400 --> 00:07:08.800
massive, massive upticks in use? I

00:07:08.800 --> 00:07:12.100
mean, I think Zoom said that some of their daily video calls

00:07:12.100 --> 00:07:17.200
or maybe monthly calls, from December to now, have just increased,

00:07:17.200 --> 00:07:20.100
so they're experiencing a ton more traffic.

00:07:20.100 --> 00:07:23.500
How can those sorts of collaboration tool companies make sure

00:07:23.500 --> 00:07:27.100
that they're doing right by their data and using that in a way

00:07:27.100 --> 00:07:29.500
that benefits them and also they aren't,

00:07:29.500 --> 00:07:32.600
you know, messing all of their customers up on the other end.

00:07:33.600 --> 00:07:37.900
Yeah, I mean, you know, Zoom has been called out for having

00:07:37.900 --> 00:07:41.900
encryption that was not what was expected, et cetera.

00:07:41.900 --> 00:07:47.100
I think if you look at all of these tools, and the

00:07:47.100 --> 00:07:50.600
place and time where they were invented and created, and the capacity,

00:07:50.600 --> 00:07:53.600
as you've said. I heard from another vendor this morning that

00:07:53.600 --> 00:07:56.800
they had something like sixty billion seconds of meeting

00:07:56.800 --> 00:08:04.600
minutes since January of 2020. That's a huge bandwidth load.

00:08:07.100 --> 00:08:10.300
And I would say every conversation is a digital

00:08:10.300 --> 00:08:13.600
transaction. So think about it: if we talk about banking with

00:08:13.600 --> 00:08:18.100
our digital assets, we're talking about the World Bank exploding

00:08:18.100 --> 00:08:19.000
with digital currency.

00:08:19.000 --> 00:08:22.500
So to your question what can vendors do and what can consumers

00:08:22.500 --> 00:08:25.100
do? I think it's the time. I mean,

00:08:25.100 --> 00:08:28.200
this is like, you know, so biased and ridiculous of me,

00:08:28.200 --> 00:08:32.300
but it could be time for some privacy engineering.

00:08:32.300 --> 00:08:38.300
Hello! So let me quickly just say: privacy by design is the policy,

00:08:38.300 --> 00:08:42.100
the outcome you want to have is a shared dynamic where you're

00:08:42.100 --> 00:08:47.900
not having a default conversation that says corporations

00:08:47.900 --> 00:08:51.000
get to take advantage of individuals and individuals

00:08:51.000 --> 00:08:53.300
don't have any rights in their information, in their digital

00:08:53.300 --> 00:08:58.500
lives. Privacy by design says literally you should be designing

00:08:58.500 --> 00:08:58.800
better.

00:08:59.800 --> 00:09:03.900
Design is great, policy is great, architecture is important,

00:09:03.900 --> 00:09:07.900
but engineers are builders, they're problem solvers.

00:09:07.900 --> 00:09:13.200
So a privacy engineer looks across the people, process, and technology

00:09:13.200 --> 00:09:16.300
and says: what do the fair principles mean?

00:09:16.300 --> 00:09:19.600
How do we disclose it for that transparency?

00:09:19.600 --> 00:09:22.200
How do we build controls

00:09:22.200 --> 00:09:24.100
so that we know that we're doing

00:09:24.100 --> 00:09:26.400
well? How do we measure it?

00:09:26.400 --> 00:09:30.700
Welcome to DrumWave Incorporated: understanding what your metrics

00:09:30.700 --> 00:09:32.900
are and what your digital assets mean to you.

00:09:32.900 --> 00:09:36.800
And then how are you actually communicating that with some sense of

00:09:36.800 --> 00:09:40.800
choice? Do we have a black and white option or is there

00:09:40.800 --> 00:09:43.200
something in there where we can have custom communications

00:09:43.200 --> 00:09:49.300
and expectations? We're actually competing on our ability to manage

00:09:49.300 --> 00:09:53.500
digital assets. And of course I'm biased, but I think it's a pretty

00:09:53.500 --> 00:09:57.500
good way to satisfy a customer, to meet them where they are and

00:09:57.500 --> 00:09:59.600
have something of value to offer them.

00:10:01.900 --> 00:10:06.700
So now that a lot of these companies are experiencing,

00:10:06.700 --> 00:10:08.600
you know, massive, massive upticks in traffic,

00:10:08.600 --> 00:10:11.800
is it too late for privacy by design?

00:10:11.800 --> 00:10:14.900
I mean, is that something that needs to happen from, you know,

00:10:14.900 --> 00:10:18.400
minute one, the second right after you figure out a name for your

00:10:18.400 --> 00:10:26.900
company? I think what you're asking is, is privacy dead? You know, I worked

00:10:26.900 --> 00:10:29.500
for Scott McNealy, who said you have zero privacy anyway,

00:10:29.500 --> 00:10:33.700
get over it. And he sits on my board now, actually, all these

00:10:33.700 --> 00:10:37.100
years later, over 20 years later, and we still sort of have some

00:10:37.100 --> 00:10:40.800
fundamental disagreements and some really fundamental

00:10:40.800 --> 00:10:44.600
agreements. And the disagreement I have is that technology

00:10:44.600 --> 00:10:45.200
gets to choose.

00:10:45.200 --> 00:10:51.700
So just because you can doesn't mean you should. Just because

00:10:51.700 --> 00:10:55.400
we can observe everything and just because it's possible

00:10:55.400 --> 00:10:59.200
for your employer to tell you to leave your camera on all

00:10:59.200 --> 00:11:01.700
the time because they're too lazy to come up with metrics to

00:11:01.900 --> 00:11:17.000
measure your work, doesn't mean they should. When we used to travel and give talks somewhere,

00:11:17.000 --> 00:11:23.000
somebody said to me, well, because there's been a breach, doesn't your...

00:11:23.000 --> 00:11:24.500
I can't even remember what it was anymore.

00:11:24.500 --> 00:11:29.400
Doesn't that prove to you at last that privacy is dead because

00:11:29.400 --> 00:11:31.200
this guy was able to hack?

00:11:31.200 --> 00:11:31.700
I know what it was.

00:11:31.700 --> 00:11:33.800
They hacked Bezos's phone.

00:11:33.800 --> 00:11:38.400
Somebody hacked his phone.

00:11:38.400 --> 00:11:41.500
It was a bad password, shame on him.

00:11:41.500 --> 00:11:44.800
So it wasn't a logical flaw,

00:11:44.800 --> 00:11:48.700
it was a user error, and then potentially maybe we need better than

00:11:48.700 --> 00:11:50.200
passwords. That's a new innovation.

00:11:50.200 --> 00:11:54.000
But what I said to this fellow at the time and I do believe

00:11:54.000 --> 00:11:58.000
this is true, is: are we saying that privacy is dead because

00:11:58.000 --> 00:12:01.700
we don't have the tools to serve that very human

00:12:01.900 --> 00:12:05.100
and fundamental need? And if that is true,

00:12:05.100 --> 00:12:11.100
apply the same logic. So banks have fraud all the time, people

00:12:11.100 --> 00:12:15.600
steal currency, sometimes at knifepoint on the subway, back when we used

00:12:15.600 --> 00:12:21.300
to take the subway, and sometimes electronically. Does the fact

00:12:21.300 --> 00:12:25.100
that people are able to steal currency from banks mean that banks

00:12:25.100 --> 00:12:28.100
are dead? I'm so confused.

00:12:28.100 --> 00:12:31.900
Maybe I just have a simple mind but it seems to me that if

00:12:31.900 --> 00:12:35.400
we want to use currency, we protect banking transactions

00:12:35.400 --> 00:12:39.400
and we have some regulations and we figure out how to make loans

00:12:39.400 --> 00:12:44.000
and share currency and use it for whatever we want to do, social,

00:12:44.000 --> 00:12:48.100
economic, or business needs. Why should a digital asset be so

00:12:48.100 --> 00:12:51.100
fundamentally different? Just because it's hard

00:12:52.100 --> 00:12:53.800
doesn't mean that it doesn't exist anymore.

00:12:55.100 --> 00:12:56.200
I think it's not too late.

00:12:57.300 --> 00:12:58.400
Well, that's certainly good news.

00:12:58.400 --> 00:13:05.300
And I think it really echoes a lot of what the

00:13:05.300 --> 00:13:08.400
sort of, like, you know, tenet of privacy's very closely

00:13:08.400 --> 00:13:14.600
related sibling twin, security, is always about: minimizing

00:13:14.600 --> 00:13:16.900
the risk. You're never going to have a one-hundred-percent

00:13:16.900 --> 00:13:19.600
foolproof security practice, and perhaps we can also never

00:13:19.600 --> 00:13:23.000
have like a one-hundred-percent foolproof privacy practice

00:13:23.000 --> 00:13:25.700
But when you start to minimize that risk, is that when you can

00:13:25.700 --> 00:13:28.700
really start to get the most out of all of this out of your

00:13:28.700 --> 00:13:32.700
data? Exactly. I think COVID in particular,

00:13:32.700 --> 00:13:35.700
you know, before you can get some privacy, you just need functionality

00:13:35.700 --> 00:13:37.000
in general in a digital world.

00:13:37.000 --> 00:13:39.700
I just said Columbia, MD.

00:13:39.700 --> 00:13:44.000
A perfect example of an accident; language is clumsy.

00:13:44.000 --> 00:13:49.600
I can say Colombia and it doesn't, hopefully, ruin all my credibility.

00:13:49.600 --> 00:13:54.400
I just misspoke. So you can have errors as long as you correct

00:13:54.400 --> 00:13:57.100
those errors. Some are huge.

00:13:57.300 --> 00:14:03.000
Huge, huge, and you can't come back from that. And others are mistakes:

00:14:03.000 --> 00:14:08.900
you trusted vendors, or a switch that was not turned off after a

00:14:08.900 --> 00:14:12.000
pandemic; you shared too much information in a panic and didn't

00:14:12.000 --> 00:14:13.100
plan to take it off.

00:14:13.100 --> 00:14:16.100
So I think there's a lot of nuance along the way, and I

00:14:16.100 --> 00:14:19.000
think we're going to stumble, and I think there are going

00:14:19.000 --> 00:14:24.800
to be errors made but I think that as we somehow muddle through

00:14:24.800 --> 00:14:29.900
the two thousand different languages that are currently on the

00:14:29.900 --> 00:14:33.600
planet, I think we're also going to fumble our way through

00:14:33.600 --> 00:14:35.800
the 7 billion different choices.

00:14:35.800 --> 00:14:37.100
We make as individuals.

00:14:37.100 --> 00:14:38.000
Whether you're a twin or not:

00:14:38.000 --> 00:14:39.500
It's a perfect case study.

00:14:39.500 --> 00:14:46.900
Do you have bodily integrity from your twin? Without a doubt. Do you

00:14:46.900 --> 00:14:49.500
have a special relationship with this person?

00:14:49.500 --> 00:14:55.600
Absolutely. But those two things do not mean that you can

00:14:55.600 --> 00:14:59.200
be joined, and so all this is to say there isn't one model for everything.

00:14:59.200 --> 00:15:01.800
So to take it back to, you know,

00:15:01.800 --> 00:15:04.300
the here and now, what are some steps that you'd recommend,

00:15:04.300 --> 00:15:08.600
especially for collaboration vendors, which, as I think

00:15:08.600 --> 00:15:11.500
I've said, you know, a million times in this video and a million

00:15:11.500 --> 00:15:14.300
times over on our channel, are experiencing a massive uptick

00:15:14.300 --> 00:15:17.900
in traffic. What are some steps that they can take now to ensure

00:15:17.900 --> 00:15:21.300
privacy, and that they also don't get burned once this is all

00:15:21.300 --> 00:15:31.900
over, by their customers for mishandling data? We're doing telemedicine like we've

00:15:31.900 --> 00:15:35.700
never done, and maybe we don't ever go back. For the common

00:15:35.700 --> 00:15:38.600
cold. There's not really a great reason for you to infect

00:15:38.600 --> 00:15:40.300
other people in the doctor's office.

00:15:40.300 --> 00:15:42.500
If you can do telemedicine, for example,

00:15:42.500 --> 00:15:45.200
There's very sensitive information changing hands.

00:15:45.200 --> 00:15:50.500
I will say, every organization, whether you're a not-for-profit

00:15:50.500 --> 00:15:55.600
or a government agency or a massive multi-billion-

00:15:55.600 --> 00:15:56.600
minute-per-day

00:15:57.300 --> 00:16:03.300
collaboration platform, the number one thing you need to do is understand

00:16:03.300 --> 00:16:04.000
your digital assets.

00:16:04.000 --> 00:16:06.200
I don't know where you're going.

00:16:06.200 --> 00:16:09.600
I know every organization is going somewhere slightly different.

00:16:09.600 --> 00:16:14.000
But what I do know down to my core is that data is an

00:16:14.000 --> 00:16:19.200
asset. It is our most powerful currency, particularly in these

00:16:19.200 --> 00:16:23.200
uncertain times and you need a map.

00:16:23.200 --> 00:16:27.000
So if you haven't figured out how to really get ahold of your

00:16:27.000 --> 00:16:31.500
digital assets, figure out how to map them.

00:16:31.500 --> 00:16:34.800
I happen to know a company, drumwave.com.

00:16:34.800 --> 00:16:37.200
That's why we exist.

00:16:37.200 --> 00:16:41.900
There's a reason a privacy person is the CEO of this type of company:

00:16:41.900 --> 00:16:45.200
because I understand the inherent integrity and asset value

00:16:45.200 --> 00:16:51.200
when you balance value, including values like ethics and humanity

00:16:51.200 --> 00:16:55.700
and individuality and autonomy, with risk, loss,

00:16:56.700 --> 00:17:01.700
degradation, defamation, all those nasty bad-for-brand things that

00:17:01.700 --> 00:17:04.200
happen. That's your digital balance sheet.

00:17:04.200 --> 00:17:08.100
That's what Grace Hopper was talking about in 1965. Balance it

00:17:08.100 --> 00:17:13.600
so that the asset value of your data exceeds the risks that

00:17:13.600 --> 00:17:16.300
you're taking and be able to show that metric. I

00:17:16.300 --> 00:17:17.600
think that's really really important.

00:17:17.600 --> 00:17:25.300
Of course, I'm very biased, but think about it. So obviously

00:17:25.300 --> 00:17:27.300
it seems like there's really no one-size-fits-all

00:17:27.300 --> 00:17:29.500
privacy practice much like there's no one-size-fits-all

00:17:29.500 --> 00:17:31.900
practice for virtually anything.

00:17:31.900 --> 00:17:36.800
But then, to your point that, you know,

00:17:36.800 --> 00:17:38.700
there are some baseline rules that we need to follow,

00:17:38.700 --> 00:17:42.000
What's the benefit of these privacy regulations,

00:17:42.000 --> 00:17:44.900
you know, GDPR and CCPA are obviously the big ones, if there's

00:17:44.900 --> 00:17:45.400
no one-size-fits-all.

00:17:47.200 --> 00:17:51.700
I think if you look to the core of, I'll say, 99% of these

00:17:51.700 --> 00:17:53.400
regulations, you'll find

00:17:54.400 --> 00:17:58.400
the fair principles that we laid out in the 1960s and even earlier.

00:17:58.400 --> 00:18:02.500
And really, the word fair sounds so mushy;

00:18:02.500 --> 00:18:07.900
engineers hate it. They're like, we didn't go into these math classes

00:18:07.900 --> 00:18:11.600
because we like squishy words.

00:18:11.600 --> 00:18:14.100
We're here for the zeros and the ones, kid.

00:18:14.100 --> 00:18:17.800
So how do you break down fairness and fair principles?

00:18:17.800 --> 00:18:20.000
Things like transparency,

00:18:20.000 --> 00:18:21.600
so that you're not surprising people.

00:18:21.600 --> 00:18:25.600
What does informed consent mean? Whether you're calling it opting

00:18:25.600 --> 00:18:29.000
in or opting out, it's giving enough information so that you're

00:18:29.000 --> 00:18:34.000
having a real transaction; having the controls so that people can make

00:18:34.000 --> 00:18:39.900
a choice whether to share or not share, without bias; understanding

00:18:39.900 --> 00:18:46.100
that security is an absolutely critical component of privacy

00:18:46.100 --> 00:18:48.200
and intellectual property management.

00:18:48.200 --> 00:18:50.900
It used to be this old saw of like,

00:18:50.900 --> 00:18:53.800
well, you can have security without privacy,

00:18:53.800 --> 00:18:57.300
but you can't have privacy without security. Somehow that turned into

00:18:57.300 --> 00:19:00.400
a dichotomy where privacy was somehow subordinate.

00:19:00.400 --> 00:19:06.700
No. Privacy is the what and the why; security can be your how. So understanding

00:19:06.700 --> 00:19:09.400
that these principles repeat over and over again,

00:19:09.400 --> 00:19:14.500
whether it's GDPR, whether it's the newly revised rules in New

00:19:14.500 --> 00:19:19.500
Zealand, or LGPD down in Brazil, you're starting to see these buckets

00:19:19.500 --> 00:19:23.000
again and again: tell me what you're doing, prove to me

00:19:23.000 --> 00:19:26.800
that you're being ethical and legal and moral and give me some

00:19:26.800 --> 00:19:27.400
control.

00:19:28.600 --> 00:19:30.400
Great. Thank you so much, Michelle.

00:19:30.400 --> 00:19:34.200
I really appreciate you calling in, lots of great info, and

00:19:34.200 --> 00:19:39.900
it makes a lot of sense to think about: you have this massive

00:19:39.900 --> 00:19:40.500
amount of data.

00:19:40.500 --> 00:19:43.400
If you're a corporation, you can be doing a lot of good, not

00:19:43.400 --> 00:19:46.100
just you know for yourself, but I think you could also probably

00:19:46.100 --> 00:19:48.000
be doing some good in terms of customized,

00:19:48.000 --> 00:19:50.900
you know, experiences for a customer with that data. It's not all

00:19:50.900 --> 00:19:54.400
scary or bad. Absolutely. And employees:

00:19:54.400 --> 00:19:55.300
We have had to.

00:19:55.300 --> 00:20:00.000
We've had to serve our employees like never before. If you didn't

00:20:00.000 --> 00:20:04.100
understand that your human resources department isn't just

00:20:04.100 --> 00:20:09.500
about not getting you sued, it's time to look at some of these digital

00:20:09.500 --> 00:20:12.600
experiences that your employees are having now, and how

00:20:12.600 --> 00:20:16.500
you're measuring their success. Everything's changed now,

00:20:16.500 --> 00:20:20.600
so you have to rethink leadership from the inside out. Who are your

00:20:20.600 --> 00:20:22.300
employees? Who are your partners?

00:20:22.300 --> 00:20:24.200
How do we know we're being successful?

00:20:24.200 --> 00:20:26.500
All of those answers are digital.

00:20:27.500 --> 00:20:30.100
Well, thank you so much, Michelle, and thank you so much for

00:20:30.100 --> 00:20:31.400
calling in. Really appreciate it.

00:20:31.400 --> 00:20:35.900
Thank you, and thank you all so much for watching this episode

00:20:35.900 --> 00:20:36.500
of Today in Tech.

00:20:36.500 --> 00:20:38.400
If you liked this episode, be sure to give it a thumbs-up

00:20:38.400 --> 00:20:40.000
and subscribe to our channel.

00:20:40.000 --> 00:20:42.500
If you have any questions or comments about privacy,

00:20:42.500 --> 00:20:46.300
please leave them below or any tips about getting the most out

00:20:46.300 --> 00:20:49.800
of your data without running into any sort of big lawsuit.

00:20:49.800 --> 00:20:50.800
Would love to hear it.

00:20:50.800 --> 00:20:51.400
Leave them below.

00:20:51.400 --> 00:20:53.500
Thanks again for watching and we'll see you next time.