Crime Fix with Angenette Levy - Florida Man Used AI To Turn Girls' Photos Into Porn: Sheriff
Episode Date: October 2, 2025

Lucius Martin, 39, faces a long list of charges after a woman found nude photos of her daughter and the girl's friend on his phone. The Marion County Sheriff says the photos were not original... images of the girls and Martin used artificial intelligence to alter the photos and turn them into pornography. Deputies stopped Martin earlier this week to take him into custody, and at that time they said he destroyed evidence. Law&Crime's Angenette Levy goes through the disturbing allegations and why Martin could face more charges in this episode of Crime Fix — a daily show covering the biggest stories in crime.

PLEASE SUPPORT THE SHOW:
Download the FREE Upside App at https://upside.app.link/crimefix to get an extra 25 cents back for every gallon on your first tank of gas.

Host: Angenette Levy https://twitter.com/Angenette5
Guests: Dave Aronberg https://x.com/aronberg, Tim Jansen https://www.instagram.com/courtroomchef
Producer: Jordan Chacon

CRIME FIX PRODUCTION:
Head of Social Media, YouTube - Bobby Szoke
Social Media Management - Vanessa Bein
Video Editing - Daniel Camacho
Guest Booking - Alyssa Fisher & Diane Kaye

STAY UP-TO-DATE WITH THE LAW&CRIME NETWORK:
Watch Law&Crime Network on YouTubeTV: https://bit.ly/3td2e3y
Where To Watch Law&Crime Network: https://bit.ly/3akxLK5
Sign Up For Law&Crime's Daily Newsletter: https://bit.ly/LawandCrimeNewsletter
Read Fascinating Articles From Law&Crime Network: https://bit.ly/3td2Iqo

LAW&CRIME NETWORK SOCIAL MEDIA:
Instagram: https://www.instagram.com/lawandcrime/
Twitter: https://twitter.com/LawCrimeNetwork
Facebook: https://www.facebook.com/lawandcrime
Twitch: https://www.twitch.tv/lawandcrimenetwork
TikTok: https://www.tiktok.com/@lawandcrime

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
Wondery Plus subscribers can binge all episodes of this Law & Crime series ad-free right now.
Join Wondery Plus in the Wondery app, Apple Podcasts, or Spotify.
A truck driver pulled over and arrested.
The sheriff says he was using artificial intelligence to make child porn using a girl he knew.
I have the disturbing allegations being made against Lucius Martin, what the girl's mom showed detectives, and I'll tell you why he could be in even more trouble in another county.
Welcome to Crime Fix. I'm Angenette Levy. Now, you've probably heard about artificial intelligence, or AI, and how people are using it to do a lot of things. Well, a sheriff in Florida says a man was using it to create child sex abuse material.
They used to just call it child pornography.
This case is out of Marion County, Florida, about 100 miles northwest of Orlando.
Lucius Martin is now in the Marion County Jail, and he's facing three charges, creation of child
pornography, possession of child pornography, and destroying evidence.
Deputies pulled over Martin on Monday in his big rig, and it sounds like they think they caught him
in the act of destroying evidence.
I'll have more on that in a bit, but here's the video of him being taken into custody.
So you saw that the deputy ordered Lucius Martin to get out of the cab of his semi and he didn't do it.
The other deputy went around the truck to try to get into it, but the door was locked.
The probable cause affidavit says that's when Martin was destroying evidence.
The affidavit states, while barricaded within his vehicle, he apparently reset or restarted his cellular device.
The deputies immediately obtained the device and made all attempts possible to stop the restarting process by powering the device off.
This destruction of evidence was clearly observed by the device displaying the rebooting screen when it was obtained by law enforcement on scene. I later observed the screen of the device to show what appeared to be the new device setup screen. This is consistent with the device
being factory reset. Wow, that's not good, and it's a major red flag when it comes to
investigating an allegation that someone is making and has child pornography on his phone.
Detectives say the investigation into Martin began when the mother of a girl found images of her
daughter on Martin's cell phone. There were multiple duplicate images which showed the original image of the child clothed and a duplicate of the same with the child victim unclothed. The woman also
identified her daughter's friend depicted in a similar manner. The woman captured a photo of the
photo gallery using her own cell phone and deleted the image from the defendant's phone.
If you haven't heard about Upside, it's a great app that gives you cash back on things like gas, groceries,
and takeout. Upside gives you real money that you can transfer from the app straight into your bank
account. When I pump gas, I use Upside. I've also used it when I go out for ice cream.
Here's how it works. Download the app, claim an offer for whatever you're buying, pay as usual using a debit or credit card, and follow the steps to get paid and find out how much you could earn.
Click the link in the description to download Upside or scan the QR code on your screen and use
our promo code crime fix to get an extra 25 cents back on every gallon on your first tank of gas.
That's promo code crime fix for an extra 25 cents back on every gallon on your first tank of gas.
Now, if you're the mother of this girl, you are probably freaking out when you discover this.
The detective wrote in his affidavit that an analyst extracted data from the mom's phone and found
the images were apparently taken on June 3rd, nearly four months ago. The detectives described
the images that they found, and some of them were real images that were later altered to make them
pornographic, according to the detectives. The affidavit explained: Original image: This image depicts the child in a black bathing suit, apparently standing at the edge of a small body of water. Altered image: The altered image depicts the same environment and background, but the child is nude. The child is clearly altered for sexual gratification, as the focal point is now superimposed female breasts and genitalia. Altered image: This depiction is a second alteration of the image of the child in a black bathing suit. It depicts the same background and environment, but the child is topless. The AI generation depicts different female breasts and contains different deformities to the extremities.
Now, I mentioned to you that these original images were real images of the girl and another girl, her friend, clothed. The detective wrote,
The above images appear to have been obtained from a public-facing social media application.
They were then altered via digital means. The altered images clearly depicted the children in sexual positions
and are clearly focused on sexual gratification. This is evidenced by the images being altered by the
suspect to depict the children nude. Now get this. Detectives asked to hold Lucius Martin without bail
because they think he's dangerous and he's actually under investigation for possibly molesting a child
in Lake County. The affidavit states, through the course of this investigation, I was contacted by
the Department of Children and Families. I learned the defendant is currently the subject of an
active investigation in Lake County regarding the sexual molestation of a child whose name is
redacted. This is an active investigation currently assigned to the Lake County Sheriff's Office.
They also wrote that Martin destroyed evidence, and the victims in this case, they're afraid of him given that he has, quote, violent and aggressive behavior that he's displayed in the past.
Right now, Lucius Martin's bail is set at just over $60,000.
I'm going to play the entire clip of Lucius Martin's arrest uninterrupted later on in the show, but for now, I'll pick it up where I left off earlier.
Here, hold that one back.
Let's throw that one, can you care.
Hold on this one.
Alright, we're gonna go to this part here.
Got anything on you, not supposed to have?
No.
Any bombs, guns, drugs, anything crazy?
No, dude.
Alright. Spread your feet for me.
Go ahead and finish that and then spit it out.
What have I done?
Y'all still have a talk?
I'm explaining it to you.
Just give me one minute, okay?
I need to grab it.
Sorry.
You can take up the whole back seat if you need it and I'll crank the air up if it's warm.
You ain't got another pair of these, do you?
I don't have three, no man.
Not on me, they're on the other side, but you need to sit sideways or they got to work?
No.
There you go.
So I thought we should bring in Dave Aronberg to discuss this.
He is a former state attorney for Palm Beach County, now a defense attorney.
Dave, put your prosecutor hat on for this one.
You've got a guy taking pictures of a young woman, or a young girl, I should say a girl, a child, and then altering them with AI to put breasts and stuff like that, naked body parts, on her and turning this into child pornography. So, yikes. When you were a prosecutor, did you have a case like this ever?
Angenette, I don't recall a specific case like this, but I know that it was a growing problem,
enough that the state legislature back in 2024 enacted a law to get tough on this. Because if technology
changes and evolves, these perverts will change and evolve and try different things. And they may
think that because it's not an actual image of this individual that they can get away with it.
Now, Florida law makes it a crime to knowingly possess, control, intentionally view, or create AI-generated child pornography.
It's a second degree felony, punishable by up to 15 years in prison.
And that's the thing.
Are these people... I mean, I guess I need to look at AI and see what exactly you can do with it.
To me, this is a guy who drives a truck for a living who somehow has gotten some AI tools
allegedly, according to the prosecution, and according to the police, and manufactured child
pornography with a photograph of an actual person, an actual child.
So that takes, to me, some thought and a little bit of skill, at least.
Yeah.
Yes. And if you are generating child pornography using AI and the individual, the victim here, is an identifiable minor as opposed to a fictional person, the identifiable minor will get you a second degree felony punishable by up to 15 years in prison. If it's a fictitious person who a reasonable person would say is under 18, that's a third degree felony, punishable by up to five years in prison. So there's a difference. So the fact that this victim is a real person who lives
in their home and is a minor will get him a more serious charge. My mind then goes to,
I hope this child hasn't been physically touched by this person. Because if you're manufacturing
pornography of a person who's close to you and you have access to, it makes me wonder: have you ever acted on any impulses related to that? Or is this
just all some kind of sick fantasy in your head? It is not far-fetched to think that the person who goes
out of his way to allegedly create an AI version of someone who lives in his household and is
engaging in sexual acts would perhaps try to touch that individual.
Again, these are allegations, innocent till proven guilty, but it doesn't look good.
And if you look at the guy, my goodness, he walked with a cigarette hanging from his mouth
and slovenly, and he was in the truck allegedly deleting images.
Talk about consciousness of guilt.
He wouldn't get out of his truck because he was trying to delete images from his phone.
It doesn't work like that.
A guy who's, I guess, technologically sophisticated enough to create AI doesn't know that deleting stuff from your phone isn't going to completely delete stuff from your phone forever?
Mm-hmm.
And that's a, that's the concern here, too.
Has he ever done anything?
And are there other electronic devices that he has that maybe now they'll go get a search
warrant for and look into?
Yes.
We've seen this before.
We saw this in the case out of the, in the Orlando area, Maddie Soto, the tragic death.
Yeah.
That, you know, you have people who are in that lifestyle mindset where they swap photos back and forth.
There are different levels.
You have individuals who are possessors.
Then you have individuals who are distributors, which I don't know which one he's going to be
considered.
And then you have those who are creators.
And generally, the federal prosecutors will come down the harshest on the creators.
And this guy allegedly is a creator.
I don't know if he's distributed to others.
And the fact that it's not real child pornography, it's an AI deepfake, it's still illegal.
It won't be charged as seriously as if it was a real person, but you can still get, like I said, a second-degree felony out of this, punishable by up to 15 years in prison.
That's serious.
That is very serious.
And so this unfolded rather quickly.
I mean, mom goes to the sheriff's office.
She reports this.
The minute she finds it, they discover that it's AI-generated, or an AI-altered image, because it's, like, girl in a swimsuit and then girl not in the swimsuit, nude, things of that nature, and then things are done to the image to make it a nude girl. If I were mom, I'd just be losing my mind and coming out of my skin, which it sounds like mom was. But this is just what we're reading about this particular victim. You cannot tell me that this is going to be the only stuff that they find on that phone. This is just what mom went to the cops about. So if he's doing this with this girl, there could be other stuff on the phone that he's trying to wipe, obviously. You mentioned that earlier with the Maddie Soto case, but I'm thinking, if he's in the cab of his truck and he's trying to, you know, factory reset the phone to get everything off there, there probably was more on there than just photos of this girl.
Yeah, I think that's a fair assumption, although, you know, I hate to speculate when he
hasn't been charged with anything else yet, but it is a, but that's why we're here is to
give our legal opinions, and it's been my experience that when you have individuals who
engaged in this kind of conduct that it's not a one-off. These are not people who just, oh,
the first time they do it, they get caught. Generally, it takes a while until they get caught,
and then they do. And then the entire criminal justice system comes down on them really hard.
You know, when it comes to child pornography, it's one of the few crimes where the possession
is treated really harshly, similarly to the sale of it.
So even though there are gradations, as I mentioned, but generally, like, for example, with
drugs, if you possess drugs, you will likely get diversion.
You will try to get rehab, whereas the seller is treated really harshly.
Not so when it comes to child pornography.
The possessor will be treated harshly as they should and the seller as well,
because we want to get rid of the demand side.
even though it's hard.
A lot of these people are sick in the head,
but we want to make the penalty so harsh for the demand side
that people do not try to buy it off the internet thinking there'll be a slap on the wrist if they do.
You said that the Florida legislature amended the law in 2024.
Usually it takes a really long time for laws to catch up to the technology.
But it sounds like this was probably something they were seeing as a problem for a while, and then somebody went to the legislature and said, hey, you've really got to do something to change the law so we can go after this.
Not only did the legislature update the law on this, but in the last session, 2025, they added a law called Brooke's Law.
That was named after the former mayor of Jacksonville's daughter, who apparently was a high school student who had an AI-generated
sexual image, uh, created and circulated online. And when they tried to get that removed,
the family found that the social media and online platforms lacked a clear mechanism for doing so.
It was hard to get it removed from the internet. So this law requires them to remove this stuff
from the internet.
Wow. That's just so disturbing. Well, we'll keep an eye on it. Dave Aronberg, thank you so much.
Thanks for having me, Angenette.
I want to turn now to Tim Jansen. He is a defense attorney down in the Tallahassee area of Florida. He also used to be a federal prosecutor. Tim,
if you're a defense attorney and this case lands in your lap, what do you do with it?
Well, you've got a lot of issues here because, first of all, it's pretty clear they got him for destruction of evidence, if they can prove he wiped the phone when he was parked. We don't know the relationship of whoever screenshotted the images that he created. So they have a pretty strong case that it looks like he created it, or someone created this AI child pornography. So the question is really going to be, it wasn't on his phone, and if they wiped it clean, even if they Cellebrite it, would they be able to tell that it was on that phone? A lot of times they can find it. It sometimes hides in different areas, so when you reset it, it doesn't all the way reset. So that might help give the mother credibility, because she's the one that took screenshots and went to the authorities.
Yeah, I mean, what is laid out in this probable cause affidavit is so disgusting. And absolutely, if you're a mom or even a dad
and you come across this, you're going to be like, what?
You're turning pictures of my child into porn, like with AI.
And that's what's scary about this stuff.
And, you know, I remember a prosecutor years and years ago mentioning this, talking about how there was all this AI porn, you know, coming about and it was being used on the dark web and stuff like this.
This guy's got it on his phone, allegedly.
And one thing that is interesting to me, and we saw this in the Maddie Soto murder case, Tim, is the fact that Stefan Stearns, he pleaded no contest, he's doing a life sentence now for Maddie's murder.
He had a Google Drive backed up, like he had it linked to his phone.
So I'm wondering, did this guy have potentially a similar setup where he had an iCloud backed up or some type of
thing backed up, where all these images might be stored despite doing the factory reset?
Well, that's true. And the more images they find, the bigger sentence they can give him, because of those images. And now in federal court, they have really increased the penalties. There are so many enhancements under the guidelines that they, I mean, they overindulge on the enhancements. So if you get a plea offer in federal court versus state court, you're getting a lot of enhancements that aren't in the state system, and you've got minimum mandatory prison time.
It's clear this guy really knew what he was doing, or was sophisticated. Even by looking at him, he didn't look it, but he was sophisticated enough that he could do this, and knowledgeable enough that he knew the jig was up when the police pulled him over.
Yeah, it's almost like he knew exactly why they were stopping him.
Right.
You know, I've seen over the past where you have, like, coaches or the school photographer taking pictures of girls.
And we had a guy here at a private school taking all these pictures.
And then they got them for child porn.
And they went into the house and found all these images of these girls that were innocent
pictures, but they were placed in an area.
This guy was clearly obsessed with these young girls.
And this is another step, because not only is he taking pictures, now he's making them nude with their faces on it.
It's very troubling, very troubling.
When I look at this, too, you know, I could see a defense attorney trying to say, and you're a defense attorney: well, this is fake, it's artificial intelligence. But that doesn't matter. I mean, if you're using it to create something that is child pornography, or deemed child pornography, child sex abuse material, does it matter if you're taking a photograph of somebody and altering it to then make it pornography? I mean, it's still pornography. It's still child porn, right?
I think the statute does cover that. I think the Florida statute covers depicting child pornography. In federal court, to get restitution, it has to be a live victim. They have to identify the victim. That's when you have to have a live victim. But if you depict somebody and make it look like child pornography and you're possessing it, I think they can still be prosecuted. The laws are catching up to technology. And with all the good technology we have,
there are always criminals who find a way around it. They always find ways to commit crimes. And sometimes the law is behind the criminal mind. But they are eventually catching up. And then these people will try another way to get around it. If these people spent less time finding ways to commit crimes, especially against children, they could be very productive members of society. And some of them are productive. This is just deep-seated in their minds. It's troubling.
And we get them. I'm seeing more of it, I can tell you. I see a lot of it, child pornography,
possession of child pornography, traveling to meet minors. It just seems to be overwhelming in the
communities. What kind of scares me and frightens me, if you're making this stuff, that's the
allegation that he was manufacturing it and possessing it on his phone. Are you taking it?
Are you going to the next step? Are you actually engaging with minors? And I'm assuming that could be
part of the investigation.
Yeah, that is always the investigation.
And I had a sentencing the other day where I had a forensic psychiatrist.
He's the best in Florida.
And they use him for Jimmy Ryce cases here in Florida.
The state attorney does.
He says the people that look at child pornography are the least ones to recidivate and touch
a child.
The ones that are trying to reach out and meet them or touch them, they're two different classes, because the guy that wants to look at child porn in his basement doesn't think he'll get caught; he's not a psychopath. The guy that's trying to go meet them or touch them is a psychopath and believes he can get away with it. Most of the time, they say 70 or 80 percent, the guy watching child pornography is not going to try to meet a child, because they're too scared, but they enjoy just looking at it. And that's hard to swallow. But when he was telling me the statistics over the years, it was very interesting, and the judge bought it. It was different than I thought, like you
think. I believe that if you're watching child porn, you want to meet a child. But that's not
what he says. That's interesting. But I, I kind of feel like, yeah, you might live in a fantasy world
in your head, but there's got to be a time when they want to try to act on it. That's just
my opinion. I don't know. I've been covering this stuff for years. You've represented plenty of
defendants, but obviously this doctor is the expert. So I am creeped out by this brave new world AI stuff.
Totally. It just makes me nauseous, just even reading this complaint.
Those that travel to meet a minor or pay to meet a minor, those are the ones we really have to
watch out for because they're willing to go out there.
and actually meet a minor and engage in sex. They know it's wrong and they're willing to do it.
Those are the ones, and they've got the highest recidivism rate of any of them.
But this is, this is all new.
I have not seen one like this.
I knew of a lawyer, I'll tell you this funny one. I knew a lawyer in town who got mad at a prosecutor, and he imaged her face onto a stripper at a strip club. It got out, and he got disciplined by the bar. He's still a lawyer today, but he was able to put the prosecutor's face on a stripper in a strip club. And that was probably 15, 20 years ago. Strange, gross stuff.
Tim Jansen, we'll keep an eye on it. Thanks so much.
Thank you. Good seeing you.
Now here's that entire clip of the body-worn camera footage from
the arrest of Lucius Martin earlier this week.
I don't know how much.
I know how do you have to be on the other than I think you don't get to me and do.
Okay, okay I find out.
Okay, okay.
Hold on that one.
I got it.
Okay.
Go ahead.
Okay.
Go ahead.
Okay.
Hold on.
Turn around this one.
Turn around this one.
Alright, we're going to go to this part here.
Got anything on you, not supposed to have?
No.
Any bombs, guns, drugs, anything crazy?
No, dude.
Spread your feet for me.
Go ahead and finish that and then spit it out.
What have I done?
I'm explaining it to you. Just give me one minute, okay?
You can grab it.
Sorry.
You can take up the whole back seat if you need it and I'll crank the air up if it's warm.
You ain't got another pair of these, do you?
I don't have three, no, man.
Not on me, they're on the other side, but you need to sit sideways or they got to work?
There you go.
Right now, Lucius Martin is being held in the Marion County Jail, but it sounds like this investigation is far from over.
So stay tuned. That's it for this episode of Crime Fix. I'm Angenette Levy. Thanks so much for
being with me. I'll see you back here next time.