Chewing the Fat with Jeff Fisher - Ep 81 | Facial Recognition, The Good, The Bad and the Ugly; Just Look Into The Camera | Guest: Tom Kulik
Episode Date: April 20, 2019. More than 60 AI experts have publicly called on Amazon to stop selling Rekognition, its facial recognition software, to U.S. law enforcement. The technology has serious baked-in gender and racial biases, they suggest, and shouldn't be used by police before the country puts legal guardrails in place.
Transcript
All right, so you know that we cover internet and technology on our Saturday shows here on Chewing the Fat quite a bit and AI and where we're headed and what's happening to your privacy.
And so I wanted to talk a little bit about facial recognition.
You know, this has been coming on now for 15, 16, maybe even 20 years.
I remember living in Tampa when they tried to bring facial recognition to Ybor City during the Super Bowl.
And they talked us into doing this and how they were going to catch criminals.
But they didn't have any database to fall back on.
So they weren't really finding any faces.
And it took forever, because it was searching and searching and searching.
And by the time it ended its search, if anything was found, that person was long gone.
Well, Tom Kulik, an attorney here in Dallas-Fort Worth, a cybersecurity attorney, a technology attorney from Scheef & Stone, has been dealing with this for that much time and longer, really, right, Tom?
That's true.
Now, the facial recognition that we have today works better for multiple reasons, but it, again, it's the databases that they didn't have in the past.
So arguably, yes, that's part of it.
Right. So now we have a bigger database because we've all given up our rights pretty much on our phone saying,
yes, I just want to take a picture. If you want access to my pictures, fine. You can have access to my pictures.
But we weren't thinking that they were going to use those against us. And now does it seem that they are or are they going to?
I mean, I realize it's all for our safety. Oh, yes. Absolutely. It's for our safety.
No, it's a really, really great point. And it's a very layered question.
when you think about it. Because one, just to set the foundation for the audience here: when we're talking about facial recognition, it's not just facial analysis, which is just a computer system being able to see a face and know, hey, that's a face.
Facial recognition takes it that extra step and then maps that. That's a face and that's who it is.
Correct. That's who it is. Exactly. And nowadays, you know, you can go to any corner in Dallas or Fort Worth,
New York City, L.A., and you're going to see a camera.
You're on camera. I mean, we've reconciled to the fact that wherever we go, we're on camera.
And oddly enough, the law for many, many years now has reconciled itself to say, hey, your fingerprint, your face, you're presenting that out to the world.
So your face is your identifier.
So I don't necessarily have a right to privacy in my face itself because it identifies me.
Where the issue is, is how those are being used in the database and otherwise.
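The facial analysis versus facial recognition distinction can be sketched as a toy pipeline in a few lines. The bounding box, vectors, and names here are made up for illustration; this is not any real system's API:

```python
import math

def detect_faces(image):
    # Facial *analysis*: report that faces exist and where, nothing more.
    # Stand-in for a real detector; returns one hypothetical bounding box.
    return [(40, 60, 120, 140)]

def recognize(faceprint, database):
    # Facial *recognition*: the extra step, mapping a face to a known
    # identity by nearest-neighbor match on feature vectors.
    return min(database, key=lambda name: math.dist(faceprint, database[name]))

# Enrolled identities: feature vectors a real system computes from photos.
db = {"alice": [1.0, 1.0, 1.0], "bob": [-1.0, -1.0, -1.0]}

probe = [0.9, 1.1, 1.0]      # a fresh capture, numerically close to alice
print(recognize(probe, db))  # prints: alice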
I mean, think about it this way.
We all worry about our phones being able to track or our use in the internet and them tracking where we've gone through cookie files, etc.
Well, I've got a GPS built into me without any technology itself.
It's my face now.
And if you're able to go from camera to camera, you're now able to track my movements over a period of time, in a way I don't think the law ever, ever contemplated.
So that's really where we're going.
And that's my fear.
It's not so much just, like, if you remember the scene in Minority Report with Tom Cruise, where he walks into, I think it was the Gap,
and it automatically scans his face, or really the other face he was wearing,
and he's getting targeted with all of these ads and everything like that.
It's more than that database targeting.
It has some issues too in creating these databases.
Absolutely.
I mean, that's what we're after Zuckerberg for and we're after all of that for.
Yes.
I mean, there's another big article today where, you know,
they're worried about him selling our stuff.
But we've given the rights to him.
Why wouldn't he?
I mean, I'm not on his side, but we said, we're using your product, you can use whatever you want from us. So what is the problem?
It's a great point, because we in the United States have a different system and a different treatment of privacy rights, especially when it comes to our personal data, than the EU. The EU takes a more individual-rights standpoint. You've probably heard about the GDPR, the General Data Protection Regulation. I'm doing a lot of work with that legally right now, and I will tell you, you heard it here but don't hold me to it, Europe actually got this right. I have always agreed that this is something that we need more control over. But since the very beginnings of when I first started dealing with data privacy on the web, our approach in the United States was always, hey, you get to opt out. We're going to use it, but you can choose to opt out. We'll give you notice, we'll get your consent with respect to certain uses and certain things we can use it for, et cetera, and so forth. It's a very different paradigm. I don't think that paradigm's going to work as technology keeps evolving.
That's what I'm worried about.
You're right.
We've given some of that.
But here's my criticism, and there are many, of Facebook.
Even myself, who has dealt with Facebook: I've read through their terms and conditions, and it was not easy.
You're the one.
Although they're very good.
The problem was actually drilling down to address your privacy rights and what third parties can look at, et cetera.
Since Cambridge Analytica and that whole fiasco happened with them, it became a lot easier.
They made that easier to do.
but it's not perfect.
And we have made a decision as a society to say, hey, I'm okay with giving up some of my privacy. But now people are starting to wake up. And part of what I saw as the tipping point was the Cambridge Analytica matter with Facebook. I think there was a tipping point there where people said, wait a minute, what do you mean you're using it like that?
I know a lot of people are not reading those terms, and they're just clicking "I agree."
I just want to be able to take the picture.
That was my point to begin with.
Exactly.
Yes.
Yes.
Yes.
Yes.
You can look at my pictures on my phone.
I just want to take a picture of my kid.
And some of it, I get, and it's okay. For instance, take their own facial recognition technology, which you can opt into for tagging friends, et cetera.
There's a nice use.
If you're using Facebook, you can choose to say, hey, you can recognize me, recognize my closest friends, the people you see me interacting with most through my page.
And from there, I'm okay with that.
And I don't have a problem with Facebook.
outlining it.
Barely.
Exactly.
But I don't have a problem with that.
I think most people don't.
It's the fact of how that will be used. Are they going to then compile it? Their database, I don't have a lot of information on it; they haven't participated in certain studies that I'm aware of, because it's their tech and they're keeping it close to the vest. And it's reputed to be very, very good, rivaling Google's.
And I would absolutely agree with that.
But if they're going to create this database of faces, are they going to commercialize it the way Amazon is trying to commercialize Rekognition? And how are they going to do that? Because let's be honest, we've seen it through and through: Facebook's trying to make money. Sure. That's their deal.
For making money, that's their deal. That was my point with him selling all the data.
Agreed. We've okayed it. So yeah, he's going to try to monetize it. Now with these databases,
and maybe I'm jumping ahead a little bit, but with these databases, so they're trying to make money.
Now, okay, so let's say I want to, one of the problems originally with the facial recognition in Florida, when I was there, was that the database wasn't big enough.
Right.
It wasn't big enough, right?
So now we have, we have technology that's able to have other law enforcement be in one database.
They can all, they can all access this one big database.
Or cross-linking them. I mean, we're getting to the point where they're trying to get them to talk to each other and do other things. So yeah, you're right, we're getting a larger and larger data set. So is it okay to sell it to law enforcement? Or is law enforcement saying, hey, we're not going to buy it, you're going to give it to us?
Wow, that's a great question there. Right now, from what I can tell, we have larger databases, but guess what? Lo and behold, the facial recognition is not working so well on the very large databases. When you're talking about 10,000 to 20,000 faces, where so many are so much alike, it might be okay. But when you have what they'll call disruptors put in there, which would be other faces, and you get up to a million faces, it's not working all too well. There was a University of Washington study on this recently.
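The scaling effect Tom describes can be made concrete with a little probability. The false match rate below is an illustrative number for the sketch, not a figure from the study:

```python
def p_any_false_match(fmr, gallery_size):
    # Chance that at least one of N independent one-to-one comparisons
    # falsely matches, given a per-comparison false match rate (fmr).
    return 1 - (1 - fmr) ** gallery_size

# Illustrative per-comparison false match rate: 1 in 100,000.
fmr = 1e-5
for n in (10_000, 20_000, 1_000_000):
    print(f"{n:>9,} faces -> {p_any_false_match(fmr, n):.1%} chance of a false hit")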
That doesn't really surprise me.
I mean, people change their looks all the time, right?
Correct.
Change their looks with drugs on purpose.
So it's the same face, but that different look.
And remember, when we're talking about this type of analysis and facial scanning, there are different types of technologies here. But unlike Apple's Face ID, which is actually projecting 30,000 infrared points and 3D-mapping your face, most of these databases are using two-dimensional images. And some of them are not necessarily a perfect frontal view.
It's profile views, three-quarter views, things like that.
And that is not the easiest thing to do.
Are we going to get better with it very quickly?
Absolutely, because of AI.
With artificial intelligence that can go and learn from all these images, it will proceed very, very quickly to become very, very accurate.
I do believe that.
Yeah. So what's the biggest problem right now with facial recognition? Other than, you know, you talked about how, when you start getting a larger database, the failure rate is going to exponentially explode as well.
Well, there is an apparent, and I've just seen this from some commentators and other articles that I've seen out there, there is an ostensible gender bias and an ostensible racial bias in some of these programs. You know what that is? It's not that the program knows, or, oh my gosh, these racist people are programming it. It's that when you're looking at the eyes and the nose and the mouth, which are the predominant areas, there are certain things, depending on certain skin tones and certain spacing, that for certain types of people, certain races, the programs seem to have a harder time with. That will resolve itself, but right now it seems to be a little bit of a problem, at least with some. The other issue that I have is that it's obviously not necessarily accurate. Let's use a hypothetical here. I realize
I might be jumping ahead with this, but we've talked about how quickly technology will catch up and how large these databases will grow. We've already witnessed it. You're right. And, you know, I don't want to single out Amazon's Rekognition, although there's been a lot of writing on that. But look at it right now: it's not that accurate. And even if you're 92% accurate, that remaining 8% is a lot of people. So what happens if, in fact, there's a camera
that is a government camera on this one area
that there's been a lot of different crime,
let's say in my town that I live in.
And then lo and behold,
that recognition software somehow tags me
as being present in that area when I wasn't
because there was an error by that recognition.
And now I'm being brought into a police investigation
that I shouldn't be brought into in the first place.
Not because of some really good,
hard shoe leather work by detectives,
but because some piece of software somewhere said,
hey, I was there at this time.
And now I have the burden of proof.
Right.
That's what I'm saying.
Since we have in the past three years, maybe four, we've now turned the innocent until
proven guilty into guilty until you prove yourself innocent.
You know, it's a great.
Which is agonizing to me.
It's an interesting point.
And this is where I've been quite vocal over the last, especially last five years in my writings
and my speeches, where I'm very concerned.
And some judges have begun listening to me with respect to Touch ID and Face ID, and not compelling someone broadly, without a warrant, to open up that phone. They're saying, wait a minute, it's not about, you know, being able to identify that this is you. It's about what you can get to and what's on that phone. That might be outside the scope of your authority or the scope of any warrant. Ah, we won't look at that. What are you talking about? You're crazy. But think about that in terms of facial recognition now.
And that's where, you know, we've talked about this earlier, I'm worried, because the law by its very nature moves slower. Our common law system does not move as quickly, not even near as quickly, as technology, which is like Moore's law with computers. That used to be a good thing. I mean, it is a hard thing. It is. But it's a difficult thing when it comes to technology, because now there's this right to privacy, and read the Constitution, folks, read the Bill of Rights: a right to privacy is not in there. It was construed by the Supreme Court from the First, Fourth, Fifth, Sixth, and Fourteenth Amendments.
So now we're looking at it from the context that what used to be a good thing with technology is now more difficult.
And look at what the rationale of a court was recently. A few years ago, thankfully, I think it was U.S. v. Jones, the Supreme Court said, hey, government, you can't run up to somebody's car and throw a GPS device on it without a warrant.
That's a good thing.
But look how long it took to get to that point.
And now, what about facial recognition?
But again, okay, so you can't.
And I want to pause for just a second now.
We can't do that without a warrant.
However, we've given them a little wide berth on that as well:
I've got probable cause.
Well, yes.
Okay, so I can put that GPS there because I've got probable cause.
I know I don't have a warrant, but I've got probable cause.
Right.
And again, I totally agree with that.
Where I was trying to direct this, and where we're going, is, you know, the government can do certain things when it shows a compelling state interest.
What if, when it comes to facial recognition and the database that might be created by all these cameras in a city, over time it's getting better and better at recognizing all these people? And they're showing through their own statistics that, hey, it's reducing crime because we're seeing this stuff, and we want to have your picture in there, we're doing all these things with these databases. And we have a compelling state interest to do that now, because they can reduce crime.
Where my problem is, when you have an Amazon or Facebook give their database to law enforcement,
where's our check and balance on law enforcement's use of that? I'm not some conspiracy theorist who doesn't trust the government.
I'm saying there's a check and balance that needs to be there.
That's what the Fourth Amendment and Fifth Amendment do, and the Sixth Amendment, in our system.
Fourth Amendment: unreasonable searches and seizures.
Fifth Amendment, Miranda, you know: I'm not going to sit here and say something that'll incriminate myself.
Or the Sixth Amendment, where I get to face my accuser.
Do I get to face my AI?
I mean, you see what I'm saying?
What is that?
And it gets me concerned that the law needs to catch up, just as I've been an advocate
for having more control over our own privacy rights and the data that's collected
from us.
It's not that you can take a picture of my face.
It's how you're going to use that.
That's my concern.
And we're already seeing that people are manipulating other people's faces on the internet to make them appear to do things they never did.
You have no idea.
It's scarier, Jeff.
There is a program out there, I believe it was through OpenAI, that Elon Musk is one of the people behind. It's so scary good at creating a person's face that they're not releasing those components of the platform, because, literally, they showed some pictures and you would swear that's an actual picture of a person. Sure. But that person that looks so real does not exist.
I mean, so are we taking over someone's person now? We're just creating another.
We could. I mean, when I looked at that, my mind immediately jumped forward to all the legal issues I'll be dealing with when it comes to misappropriation of a name or likeness that someone might use in a commercial.
It's like, that wasn't even me.
I didn't even film that.
No, they created that from their programming.
I mean, it's...
Did your wife decide that you were going to go on another vacation
with the money you're going to make from this already?
Very interesting.
I didn't even think of that.
Thanks for putting that in my mind.
But it's really, to me, I hope it's not taken the wrong way.
It's fascinating to me to watch how technology does this,
but it also scares me, because I've been dealing with the law long enough now as a practitioner to realize that we need to be a little bit more proactive.
And that's why I've become more vocal on certain things, and I realize I might take heat for some of it, and some people will agree or disagree.
It's not that I'm right.
I'm pushing forward the conversation.
We need to do that.
So when you say become proactive, you know, that's easy to say.
Yeah.
And really, it's easy to say, but it's not easy to do.
Right.
I mean, because we all want the convenience that's there right with these, whatever it is.
I mean, do I want the same-day delivery from Amazon to my house when I need toothpaste?
You're damn right I do.
That's right.
Yeah, we need toothpaste.
Great point.
It's at the door.
Okay, thank you.
Awesome.
But, you know, so, yes, I want that.
But I also don't want Amazon using my likeness for whatever they want to use it for.
Exactly right.
I mean, at the very least, it needs to be outlined very clearly and very simply, so that someone can actually read it: hey, we're using your face for this, this, this, and that.
If you're okay with it, you can say yes, but it should be: we can use it for this, but not that. Kind of like the GDPR. I have a right to be forgotten under that European law.
I could say you have to erase all of my information company.
I don't want you to keep it anymore.
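The "use it for this, but not that" consent model plus an erasure right can be sketched as a small data structure. This is a hypothetical simplification for illustration, not the GDPR's actual legal requirements:

```python
class DataStore:
    """Toy consent ledger: data may only be used for purposes the user
    okayed, and the user can demand full erasure at any time."""

    def __init__(self):
        self._records = {}   # user_id -> (data, set of consented purposes)

    def collect(self, user_id, data, consented_purposes):
        self._records[user_id] = (data, set(consented_purposes))

    def use(self, user_id, purpose):
        # Purpose limitation: every access must name a consented purpose.
        record = self._records.get(user_id)
        if record is None or purpose not in record[1]:
            raise PermissionError(f"no consent for purpose: {purpose}")
        return record[0]

    def erase(self, user_id):
        # Right to be forgotten: drop everything held on this user.
        self._records.pop(user_id, None)

store = DataStore()
store.collect("jeff", {"face_photo": "..."}, {"tagging"})
store.use("jeff", "tagging")       # allowed: a consented purpose
store.erase("jeff")                # erasure request wipes the record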
Do we think, okay, and I love that, but do we think, do we buy that that's going to happen?
I don't.
You know why?
Because I think the Pandora's box has been opened.
I really do.
And I think we'll be able to rein in certain elements. But our hands are tied by how we began with the Internet, allowing these companies to gather all this information under the guise of an opt-out, which, of course, most people never did.
I mean, how many of us read these? I write terms of use and privacy policies.
And I hate to admit it, but sometimes I'm just not reading them. I go through it.
I agree.
I agree.
I agree.
My Apple is my greatest one.
Oh, it's updated my thing.
Do you agree?
Yes, I agree.
You know, it's an Apple phone.
So be it.
And virtually everybody does.
And they know that.
And even with the GDPR now, where you're required to have some of these, you should see how some of these companies are architecting it. You know, when you read something from the Daily Mail here in the U.S., they'll say, hey, you know, there's a new law and you have to manage these things. And now you have to go through and check off these things and not others. And it becomes a hassle, when it's so easy to just say okay and get to the article that you want to read, versus spending five minutes trying to go through everything to say, don't use this, but you're okay with this and not that.
Most people, again, are not going to do that.
Just want to read the article.
That's exactly.
Just want to read the article.
stop with this stuff. I get it. And what's happened, in a way now, because of the allure of the service or the goods that we can get, you know, I'm a proud Prime member. I get it. It's a wonderful service. But at the same time, we have now had this carrot dangled in front of us that we've become used to chewing. We've caught it and chewed on it. And we don't mind that that carrot's leading us down a road that we might not want to go.
We still think we are going to be
able to stop it, right?
Yeah, I think, I think, oh, yeah, it's not as bad.
When I get to the cliff, I'll be able to stop.
I guarantee you that there are members of your audience here who'll be like, oh, Tom's just overreacting.
Jeff, Tom, you guys are wrong.
I respect that.
But I'm just telling you, because I've seen the evolution in the last 21 or so years that I've been dealing with data privacy and data security. You know, I dealt with it as an engineer before I became a lawyer. I dealt with viruses on a trading floor. I mean, I've dealt with this stuff for a long, long time.
And it's beginning to scare me, because the tech is proceeding so quickly, I just don't know how we close that box again.
So we're talking to Tom Kulik, an attorney at law, Esquire. Or, do you have the Esquire sign?
I do not.
I got rid of the Esquire sign, my friend.
You have let me down.
I'm sorry.
I don't even know why I'm talking to you.
But, you know, to make you feel better, I think I do have a business card holder on my desk that
I got from my aunt when I had become a member of the bar that does have the ESQ on it.
So there you go.
Maybe I'll let that one go.
So we're talking a little bit about, you know, AI and facial recognition and where we're headed, you know, in the future and really focusing on facial recognition.
So many, so many things that we do in today's world, you know, take our picture and they're going to use it for whatever they're going to use it for.
And I don't know.
What's the number one thing people can do proactively?
Okay.
To maybe put a pause on their face being used for whatever.
Well, that's
a fantastic question. The one thing you're
not going to be able to do is
I mean, I guess you can. You could
put on a wig and a mustache
and beard or a burqa going down the streets
so those cameras don't get you. But ultimately
I think whenever...
Will a foil hat work? Maybe.
You know, it's a great point. That might work.
Although that goes into a whole
conspiracy theory that we should probably not talk about.
But I think, with Facebook, for instance, where you might be using any type of system that might be recognizing photos that you've uploaded to it or whatnot: Facebook, believe it or not, as much as I criticize it at times, has gotten better with you being able to say, hey, this is how we're going to use that.
And it's gotten relatively good.
Go through it.
Just don't take it for granted.
Opt out of it if you don't want it used. Say, sorry, don't try to match my face to me and my friends or any of that stuff. Don't try to auto-tag me. Leave that out.
And they will do that, ostensibly.
They're saying they're doing that.
And I will trust them at their word that they will do that because they've taken enough heat.
At least in the front of the store, they're not doing that.
Exactly right. Exactly correct. That is one thing that I would encourage
anyone to do. And you can do that with all the social media accounts, really. I mean, Twitter, Instagram, which is Facebook. They have the same thing. I was surprised when I heard an interview with Jack, and he was saying, well, we have that. You just have to go in and opt out of it. And I was thinking, I didn't know that.
And, you know, they're not advertising that. But if you go and actually read some of the terms, like, here's our privacy policy, some of them are now nicely written, not legalese, and you're like, oh, I can do that.
And most people don't even realize that.
So, like, in Facebook, they have a little privacy checkup portion that you can take, and it'll drill you down to some of the various ways that they use your data.
I encourage and have been for some time encouraging people to take control of that on all
their social media.
But when it comes to facial rec, where it's at such an early stage, I tell people don't worry so much about your Apple Face ID. That basically creates a mathematical representation.
Yeah, yeah.
Because most of these actually create a mathematical representation that's then encrypted and saved on your device. It's not saved in the cloud or elsewhere, and it's not something that can be decrypted. It's just used as a true biometric to open your device. So I don't worry as much about that, although those don't work perfectly either. Just this past week, the media was talking about a mother and her 10-year-old son. She was all happy because she got this new Face ID phone, and she says, son, you're not going to be able to open it anymore. He takes it, looks at it,
and it opened right up.
It turned out to have something to do with, you know, again, it's not perfect. Lighting can be an issue: sometimes the amount of light or the direction of sunlight on the lens can create problems.
She was able to duplicate it, and it showed that.
So again, nothing's perfect.
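The on-device idea Tom describes can be caricatured in a few lines. This is a toy distance check with made-up numbers, not Apple's actual Face ID algorithm:

```python
import math

def enroll(face_vector):
    # The device keeps only a numeric template derived from the face.
    # (A real system encrypts this and keeps it in dedicated secure
    # hardware; nothing leaves the device.)
    return list(face_vector)

def unlock(template, new_capture, tolerance=0.5):
    # Match if the fresh capture lands within a distance tolerance of
    # the stored template. Tolerance is the trade-off: too tight and the
    # owner gets locked out in odd lighting; too loose and a similar
    # face can open the phone.
    return math.dist(template, new_capture) <= tolerance

owner = enroll([0.2, 0.8, 0.5])
print(unlock(owner, [0.22, 0.79, 0.51]))   # owner, slight variation: True
print(unlock(owner, [0.9, 0.1, 0.4]))      # a very different face: False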
But right now it's a little hard outside of what I just said to be proactive with it.
Unless, you know, I mean, especially outside.
Again, the government has a right.
This building, which has security cameras around it.
They have a right to do that.
At least I saw some.
You know, there's plenty.
Were those set up for me? You guys knew I was coming.
There's plenty.
But, you know, that's one way that you can be a bit proactive. And, you know, just try to stay abreast of this when you read things about it. There are a lot of people now that are starting to say, oh, wait a minute, this stuff is being used in ways that we didn't really think about. And so, you know, again, try to keep abreast of that. For any social media you're using, look for news on anything that talks about their use of facial rec.
Facebook's obscenely quiet about this right now.
And I think it's because they don't want to give away how they want to use this.
Like you said, they're good in the front of the store, but I'm really worried about the back of the store.
Well, I mean, there's an article. I haven't even read it yet, but I saw the headline that Zuckerberg was getting in trouble. I briefly mentioned it.
You're right.
I saw that article on something about how he was using some of the data to punish some of
his competitors, et cetera.
Haven't read it yet.
Look, that's what he's in business for.
I know.
I know.
It's hard for me to be really angry about that because, as I said, we said, yeah, sure,
go ahead.
My criticism is not with them making money.
This is a capitalist system.
Good for them.
What I don't want to happen is they're doing it in a way that's not open.
They're doing it in a way that's...
So what happens when, let's say, you know, Facebook and Twitter went before the United States
government and said, hey, we're a platform.
Yep.
And we don't want to be able to be sued by people that are using us because we're a platform.
And so we want to let everybody have their say, only they're not doing that.
They're not really doing that.
So what happens when we take that away?
And I doubt we're going to take that away.
Probably the government is going to say, well, just come into our fold and we'll regulate you.
And it's those things that we don't see that you and I both know are probably happening.
And some of that is, hey, look at the great facial database that we have.
And boy, this could be really helpful to you guys.
And why don't you use this for your law enforcement purposes?
because you have a great interest in being able to reduce crime, et cetera.
And here you just pay us X or God forbid.
If we could just access the NSA database to share with ours.
Actually, I'm thinking even further ahead. What's scary for me is that it makes sense these databases will not be alone.
They'll cross link.
And you won't just have this one overriding database, although that might happen. It's the interoperability and interaction of those databases that will create other ways that this could be used,
that's beyond the digital advertising
and beyond the facial recognition
for law enforcement that we're talking about.
We probably haven't even dreamed up
some of the ways that this would be used yet.
For instance, this one came to my mind
the other day.
I'm walking up to my car and my car automatically opens. Why? Because the camera looked at my face.
You know, but now, you know, again, as I said,
we...
Oh, it's because you had the key fob in your pocket.
You know, but that's actually going to go away. Again, just like keys have gone by the wayside, I think the fob will too. Because what am I? I'm a walking identifier right now. I just am.
And facial recognition is getting even better. This is happening right now: there are systems in testing that don't just try to map geometric points. The resolution's getting so good, it's actually mapping the pores on your face. And so they're saying it'll be so accurate, it could actually tell the difference between twins.
And so that's pretty powerful.
Now, imagine the power.
They probably don't have a fat guy face on file.
I'm good.
I'm good.
But I tell you, it's, it just shows the power of the technology.
Now, what does it do to how we're being used? My big issue here, just as it was with Touch ID and Face ID and the government being able to open a phone without a warrant, unless of course they had probable cause, we understand that, is how this can be used. You know, I saw this with toll tags, where they can see where I was going through tolls at a certain point. But now, everywhere I walk and everywhere I go, because of cameras, you can map what I've done over the span of an entire day. And okay, how is that being used? Will someone then want to grab that?
Oh, law enforcement, look at you guys, you're doing this.
And, of course, governments are always looking for more money.
So they're going to license that to them.
We believe you were here.
Let us have access to your phone so that we know exactly where you were.
You can prove to us that you weren't there.
Which is not the way the law should work, is it?
It really isn't.
No. It's not my burden, is it?
It's not supposed to be that way, right?
We flipped that completely.
Yeah.
I mean, for sure, in the last three or four years, that has just flipped on its head.
You know, it's interesting to
watch when you look at the composition, especially now, of the Supreme Court. And I was encouraged by
the Jones decision and Carpenter and others that were basically saying, no, no, no, this Fourth Amendment
check is an important check. We have to have that here. No unreasonable searches and seizures.
But when it comes to our privacy, you know, I don't have a Fourth Amendment right spelled out. I've got what has
been construed over time through common law. And although I'm grateful that we have that,
that's a pretty tenuous thread to be on. It really is when you think about it. And there's
nothing concrete there; I'm not interpreting a statute that says X. I'm interpreting prior decisions. Griswold
v. Connecticut, which talked about this penumbra of rights and our right to privacy, was in
1965. Wow. That's a pretty long time ago. Long time ago. Think about that. And that's one of the
foundations of our privacy. So much has changed. So much is, exactly right. And I think you'll see more,
really over these next 10 years especially, more decisions talking about this and parsing
this. And I applaud that, because we need to catch up, and we have to look at this
through this technological lens now.
It's not a matter of whether you identify me
by my face or my fingerprint.
It's now a matter of what you are doing
with that data to track
what I'm doing and where I'm going.
That's worse than Orwellian.
It really is.
Look, their argument is going to be, right?
You tell me, their argument is going to be well,
these people over here are going to be able to use it.
Oh, yeah.
These people over here are going to be able to use it.
It's just us.
You're right.
It's just us.
We'll be fine.
Oh, don't worry.
Don't worry.
And my point is, that's wonderful at face value,
but where's my check and balance on that?
You say, you know, I have Congress, the president, the judiciary, but you've got
to believe me, Tom.
Believe me, it's fine.
You know, this department, Orlando Police Department's going to be fine using this.
You just got to believe this.
Where's my check and balance?
Well, you've got a Fourth Amendment balance.
Really?
No, you don't.
You know, I'd like something at the forefront rather than have to rely on this getting
through a court to say X.
I would like a better check and balance on that.
Maybe it would be my control over that to say, you know, there's certain things you
as a government will be able to do, but now you can't have my picture in a commercialized
database.
Sorry, guys. You
can't take that database and sell it back to Amazon or back to Facebook for them to improve
their algorithms and their deep scans. We won't use it. Now, as far as facial recognition,
there'll be, you know, there'll be kind of a back door in, too, right? Like with the DNA
tests, right? How they're catching the bad guys with DNA tests. We don't have your DNA, but we go
about it through the back door with family members. That's exactly right. Well, we've got this. Oh,
that links you to this. You'd be surprised how many people don't recognize, when you look at the terms that
they give you for doing that DNA test, that 23andMe, Ancestry.com,
all say, we can use this data for our own commercial purposes.
And they take that data and share it with law enforcement.
Now, to be fair, and I'm going to be fair,
23andMe is a sponsor here at the network.
But I think that they also give you an opt-out on that.
They do.
Do they not?
And you know what?
I applaud them for that.
Because from my standpoint, when you're using any of those services, that's what I'm
talking about with Facebook,
what I talk about with Amazon.
Let me know,
and let me make that decision.
And that's,
that's fantastic.
That's the best that we can do.
That's probably the number one thing
that we can focus on, right?
And if you tell me that,
especially in today's world.
I'm fine with that.
That's absolutely,
I think that's the most
that we can ask for.
But unfortunately,
that's not what's happening
with a lot of things.
Tom Kulik,
Esquire,
here in the Metroplex.
You're going to come into the office
and just see that play out, aren't you?
I really appreciate you coming by.
I'm fascinated.
I could be here
the rest of the day with you.
It's a real pleasure.
Thank you so much for having me on.
If there's anything
that we need to cover
then come back. Always, always. Anything that's dealing with the intersection of law and technology
with our lives and especially with business, it's what I do every day as a lawyer.
Amazing. It's got to be fascinating. It's fun stuff. It keeps me awake at night, sadly.
But that's a good thing. I believe that. No kidding. Thanks, Tom. I appreciate it.
My pleasure.
And thank you for spending a little time with me on Saturday. I appreciate it. It was fascinating.
A very interesting conversation. I hope to have more of those for you as some extra
podcasts here on Chewing the Fat.
For those of you that are subscribers,
now would be a good time for you to rate and review.
20 stars, best podcast ever.
And then share it with either a friend or someone you don't like.
It doesn't matter.
Just share it and say, hey, thinking of you, you should subscribe.
If you're listening to this and you're not a subscriber, why not?
Subscribe.
I mean, every podcast needs subscribers.
This one needs them more than any others.
So be sure to subscribe.
It's available wherever free podcasts are sold.
I don't know what more I can do for you.
