Employee Survival Guide® - Inside The AI Dash Cam Lawsuit Transforming Workplace Privacy

Episode Date: March 9, 2026

Comment on the Show by Sending Mark a Text Message.

A camera on a windshield shouldn't spark a constitutional showdown—but that's exactly where we go. We dig into a federal class action over AI-powered dash cams in commercial trucks and the claim that detecting risky driving crossed into biometric surveillance under the Illinois Biometric Information Privacy Act. Using three primary filings—the complaint, the company's answer, and the settlement memo—we translate dense legal and technical arguments into a clear narrative about consent, data retention, and what it means to profit from your face in the AI era.

We break down how "machine vision plus AI" allegedly tracks behaviors like phone use, drowsiness cues, eating, and smoking, and why plaintiffs argue such precision requires mapping a driver's unique face geometry. Then we press on the company's firm denial that any facial mapping occurs, exploring its 23 affirmative defenses: vendor status, implied consent inside monitored cabs, lack of concrete harm, federal preemption under trucking safety laws and FAAAA, and the dormant Commerce Clause. Along the way, we show how per-scan statutory damages could balloon into existential numbers, and why a $4.25 million non-reversionary settlement for an estimated 85,000 drivers became the pragmatic off-ramp for both sides.

This isn't just about one fleet technology provider. It's a stress test for how privacy laws, interstate commerce, and safety innovation collide when algorithms enter the workplace. We sit with the human reality of life inside a long-haul cab, the psychological weight of being watched, and the murky line between helpful alerts and intrusive analysis. And we look beyond faces: if future systems can identify us by seat-pressure patterns, grip rhythms, or behavioral signatures, do today's laws still protect us?

If you care about AI ethics, workplace surveillance, biometric privacy, trucking safety, and the future of consent, this deep dive is your roadmap. Subscribe, share with a friend who loves hard problems, and tell us: where should we draw the line between prevention and privacy?

If you enjoyed this episode of the Employee Survival Guide please like us on Facebook, Twitter and LinkedIn. We would really appreciate it if you could leave a review of this podcast on your favorite podcast player such as Apple Podcasts and Spotify. Leaving a review will inform other listeners that you found the content on this podcast important in the area of employment law in the United States. For more information, please contact our employment attorneys at Carey & Associates, P.C. at 203-255-4150, www.capclaw.com.

Disclaimer: For educational use only, not intended to be legal advice.

Transcript
Starting point is 00:00:08 Hey, it's Mark here and welcome to the next edition of the Employee Survival Guide, where I tell you, as always, what your employer definitely does not want you to know about, and a lot more. Welcome back to the deep dive. Today, we are opening up a stack of documents that sit right at the collision point of modern artificial intelligence and your privacy at work. And a massive federal class action lawsuit. Right, exactly. If you have ever wondered exactly who owns the mathematical map of your face or how the tools we use to stay safe might be fundamentally changing the definition of surveillance, you are in the exact right
Starting point is 00:00:45 place. You really are. It is a fascinating space. Because we are looking at a dispute that on the surface is about commercial truck drivers and dash cams, but underneath it is a battle over the raw materials of the AI age. It is a remarkable convergence of technology and constitutional theory and state law. Yeah. The source material driving our deep dive today consists of highly detailed legal filings from a federal class action lawsuit that played out in the United States District Court for the Southern District of Illinois. We are pulling directly from three primary documents, right. Yes, three main ones.
Starting point is 00:01:17 First, the amended class action complaint, which lays out the plaintiff's grievances and their technical analysis of the surveillance in extensive detail. Which is fascinating reading, by the way. It really is. Second, we had the defendant's official answer to that complaint, where the tech company at the center of this pushes back with a veritable wall of robust legal and technical defenses. A massive wall. And finally, we are examining the memorandum in support of the final $4.25 million class action settlement. The mission for our deep dive today for you, the learner listening,
Starting point is 00:01:50 is to take these dense legal filings and translate them. Unpack them completely. We want to unpack exactly how an AI-enabled dash cam system, a piece of hardware placed inside commercial trucking fleets to monitor safety, ended up triggering a massive legal war under the Illinois Biometric Information Privacy Act, which you'll hear us refer to as BIPA today. We will be using that acronym a lot. We are going to explore the granular mechanics of the technology, the incredibly strict legal frameworks trying to contain it, and the high-stakes risk assessments that lead to multi-million dollar payouts. And before we get into the weeds, a crucial point to establish regarding our approach to these sources. We are going to be presenting the factual allegations made by the plaintiffs. Right. The claims they are making.
Starting point is 00:02:38 Exactly. And we're also going to present the vigorous categorical denials and counterarguments made by the defendants. Just laying it all out. We are entirely impartial here. Our goal is not to take a side or act as a judge, but simply to analyze the fascinating, sometimes contradictory arguments presented in these source documents. We are your guides through the complexity of the claims. Okay. Let's unpack this. Let's start with the technology at the center of the storm, because without understanding the hardware, the legal arguments just don't make sense. They fall apart completely. The defendant in this case is a company called Lytx, Incorporated. They are a San Diego-based
Starting point is 00:03:13 video telematics and fleet management systems corporation. Which is a very long way of saying they provide high-tech video and analytics services to the transportation industry. Right, to put it in plain English. And according to the complaint, they are an absolute giant in this space. The sources note that Lytx technology is used by more than 4,000 fleets across the country. It is a massive footprint. They claim to hold data based on over 100 billion miles of driving. That 100 billion miles figure is critical.
Starting point is 00:03:45 I mean, it is not just a marketing statistic. It is the foundational data set they continually use to refine the accuracy of their software. Because that is how machine learning works, right? Exactly. The more miles their cameras observe, the more edge cases their algorithms encounter, and the more robust their models become over time. It just keeps learning. The specific piece of hardware facilitating this massive data gathering
Starting point is 00:04:08 and the device at the center of the lawsuit is the SF300 DriveCam. Which is a specialized camera system retrofitted into commercial trucks. But we need to be clear. It is entirely different from a standard dash cam you might buy for your personal car to record fender benders. Right. You just stick a GoPro on your dash. This is not that. No. The SF300 monitors both the outside of the vehicle and the interior cab of the truck simultaneously.
Starting point is 00:04:34 It is looking out at the highway ahead, and it is looking directly inward at the driver. And it is what the camera does with that inward-facing video that kicks off this entire legal saga. Lytx markets their premier technology under a specific acronym. They call it MV+AI. MV+AI stands for machine vision plus artificial intelligence. It sounds very sci-fi.
Starting point is 00:04:58 It does. And Lytx has a very distinct way of explaining it to their clients, which the plaintiffs actually quoted directly in their complaint to build their case. Using their own marketing against them. According to Lytx's own documentation, the machine vision component sees and recognizes, while the artificial intelligence component interprets and decides. Which is a really interesting distinction.
Starting point is 00:05:18 The stated purpose of this combined system is to detect risky driving behaviors in real time. We are talking about detecting inattentive driving or speeding or failing to wear a seatbelt. The standard safety stuff. Right. but it goes much further into behavioral monitoring. The system is looking for drivers who are smoking, eating, drinking, or using a mobile device while behind the wheel. When the system detects these behaviors, it issues an in-cab audio alert, usually a beep or a spoken warning to the driver. The goal is to prompt the driver to self-correct in the moment, theoretically preventing a catastrophic highway accident before it happens.
Starting point is 00:05:54 Which sounds undeniably great on paper. I mean, preventing accidents involving massive commercial trucks is a goal everyone shares. Nobody wants tired or distracted truck drivers on the road. Absolutely. But this is where the plaintiff's technical claims come in. And this forms the entire biometric foundation of the lawsuit. The plaintiffs, who are commercial truck drivers, allege that this system is not just a passive video camera recording footage for safety reviews. They allege it is an active face-scanning surveillance tool.
Starting point is 00:06:23 The crux of the plaintiff's argument is basically an engineering assumption. They claim that in order for this drive cam to know, with high algorithmic certainty that a driver is holding a cell phone to their ear or putting a cup of coffee to their lips, the camera first has to mathematically map the driver's face.
Starting point is 00:06:41 The allegation is that the camera scans the driver's face geometry, identifying unique anchor points. Like what kind of points? Things like the distance between the eyes, the curve of the mouth, the tip of the nose, the edge of the lips. The plaintiffs argue that the system
Starting point is 00:06:56 isn't merely detecting a general human shape. They allege it is extracting highly specific geometric data points from that individual human's face to calculate exactly what that human is doing at any given millisecond. To make this concrete for you listening, I really want to describe the visual exhibits the plaintiffs submitted in this complaint because they are wild to look at. They are striking images. The plaintiffs pulled these illustrations directly from Lytx's own marketing materials and technical white papers. They show images of a driver sitting in the cab, and overlaying the driver's face is this glowing digital wireframe mesh. It looks exactly like the motion capture technology they use in Hollywood sci-fi movies or
Starting point is 00:07:36 video games. Exactly. The complaint shows these wire frames locked on to the driver's head movements. In one image, a driver is holding a phone, and the system is allegedly tracking the proximity of the phone's coordinates to the facial landmark coordinates. There is another showing a driver with a cup and another with a cigarette. And the plaintiffs point to these images and say, look, you cannot accurately track a cigarette going into a mouth unless you have mathematically mapped exactly where that specific
Starting point is 00:08:03 mouth is using facial geometry. It is a very compelling visual argument. I have to admit, seeing those wireframes makes the concept of workplace monitoring feel incredibly intimate. I was joking earlier that if I had one of these in my car during my morning commute, the AI would be screaming at me every 10 seconds for sipping my coffee or adjusting my radio. It is a level of scrutiny that most people have never experienced. To really understand the mechanics of what the plaintiffs are alleging the complaint includes several highly technical exhibits, specifically exhibits 8 through 11.
Starting point is 00:08:36 Which detail the fundamental underlying mechanics of computer vision and face detection technology. And the very first concept we have to untangle here, which is absolutely crucial in biometric law, is the stark difference between face detection and face recognition. I am glad we are pausing on this, because to anyone outside of a computer science lab, face detection and face recognition sound like two ways of saying the exact same thing. They really do sound synonymous. But in this legal battle, the distinction is everything. It truly is. Face detection is the process of an algorithm looking at a digital image and answering a very basic question.
Starting point is 00:09:12 Is there a human face anywhere in this picture? It just finds a face. Any face. Exactly. It finds a face. It does not care who the face belongs to. Face recognition takes it a massive step further and answers the question whose face is this. It identifies the person. Right. It takes the detected face, measures it, and matches it to a specific known identity in a database, like unlocking your smartphone with your face.
Starting point is 00:09:35 Now, the plaintiffs are alleging that Lytx's system uses sophisticated face detection algorithms that inherently cross the line into biometric collection. Specifically, the exhibits detail foundational computer vision models like the Viola-Jones framework and Haar cascade detection. Let's break down Haar cascade detection, because the way these algorithms actually see is totally alien to how a human brain processes a picture. An algorithm doesn't look at a photo and see a person. No, not at all. It sees a grid of millions of pixels with different numerical color values. Exactly. A face detection algorithm starts by scanning the
Starting point is 00:10:11 pixels of an image for the easiest, most universal human feature to detect, which is the eyes. Why the eyes specifically? Because in a digital image, the area across the eyes is generally darker than the area across the upper cheeks and the forehead, simply due to the way shadows fall on the human skull. Oh, that makes sense. The algorithm isn't looking for an eyeball. It is looking for that specific mathematical contrast pattern in the pixel data, dark, light, dark. Once it finds that contrast of the eyes, it attempts to detect the eyebrows, the mouth, the bridge of the nose, the nostrils. It applies a cascading series of mathematical tests. That is the cascade in Haar cascade. I was reading through this part of the source material. And the best way I could visualize it is to imagine
Starting point is 00:10:55 the algorithm as an incredibly strict, very fast bouncer at a nightclub doing a visual pat down. I like that analogy. The bouncer isn't looking at your whole outfit at once. First, they check if you have an ID in your hand. That is the eyes. If you don't have an ID, they instantly reject you and move to the next person in line. They don't even bother looking at anything else. Right. If you do have an ID, they move to the second test. Does the picture match? That is the nose and mouth. If it fails at any step, the bouncer immediately stops processing you and moves on. It is a rapid cascading series of yes or no tests. And if a region of pixels passes every single stage of the bouncer's checks, the algorithm concludes mathematically that it has found a
Starting point is 00:11:37 human face. That highlights the speed and the sequential nature of the processing. It happens in fractions of a second, continually, frame by frame. But this brings us to the part of the process that the plaintiffs argue crosses the legal line. Feature extraction. Right. This is the core of their claim. The plaintiffs' exhibits explain that once the model is trained and operating, it extracts these specific features. These geometric data points representing the eyes, nose, and mouth.
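To make the detection-versus-recognition distinction concrete for anyone who wants to see it in code, here is a minimal, illustrative sketch using OpenCV's stock Haar cascade face detector. This is emphatically not Lytx's proprietary system, and the image file name is a hypothetical stand-in; it is only a generic example of the cascading yes-or-no tests described above.

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade model.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

# Hypothetical single frame from an inward-facing camera.
frame = cv2.imread("cab_frame.jpg")
if frame is None:
    raise SystemExit("Could not read the sample frame.")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detection only answers "is there a face here?" and returns bounding boxes.
# Each candidate window runs the cascading contrast tests described above and
# is rejected the moment any stage fails. No identity is established.
faces = face_cascade.detectMultiScale(
    gray, scaleFactor=1.1, minNeighbors=5, minSize=(60, 60)
)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

print(f"Detected {len(faces)} face(s); no identity was established.")
```

The sketch stops at plain detection, which is the legally uncontroversial step. What the plaintiffs care about is the next step, where a system goes on to extract and use the geometric features of a specific face.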
Starting point is 00:12:04 And it compares them to stored patterns to verify the face and track its real-time movements. The plaintiffs argue that this extraction process by its very nature involves collecting biometric identifiers. Their argument is that you cannot run a mathematical algorithm that plots the distance between my eyes and my mouth to see if I am drinking coffee without simultaneously capturing my unique face geometry. This is where I want you to step into the shoes of the drivers for a moment. Imagine your own workplace, whatever that looks like.
Starting point is 00:12:33 Imagine you are sitting at your desk or operating a forklift or stocking inventory. Just doing your normal daily tasks. Right. How would it change your psychological state if a company-mandated camera was pointed at you and every single second of your workday an algorithm was mathematically plotting your facial landmarks? It is a lot of pressure. It is checking the geometry of your eyes and mouth over and over again, running those bouncer tests constantly just to ensure you aren't taking a sip of water or glancing down at a text message.
Starting point is 00:13:03 It fundamentally changes the feeling of being on the clock. It is a profound shift in what it means to be monitored by your employer. It certainly is. And that profound shift in surveillance brings us directly into the legal, framework that this entire lawsuit is built upon. Let's talk about that framework. We need to look at BIPA, the Illinois Biometric Information Privacy Act. If you follow technology or privacy news, you've probably heard this acronym whispered in terror by corporate lawyers. It is widely considered the strictest biometric
Starting point is 00:13:33 privacy law in the United States. What is genuinely remarkable about BIPA is its history. It was enacted all the way back in 2008. Which is crazy to think about. In terms of modern artificial intelligence, deep learning, and smartphone technology, 2008 is practically the stone age. The first iPhone had barely been released. Exactly. But the Illinois legislature was incredibly prescient. They enacted BIPA because they recognized early on the severe, unique, and irreversible risks of compromised biometric data. The core philosophy behind BIPA is actually really elegant when you strip away the legalese. Think about standard data breaches. If someone hacks a corporate server,
Starting point is 00:14:14 and steals your password or your social security number or your credit card details, that is a massive headache. Huge hassle. You have to make phone calls, freeze your credit, maybe deal with some fraud. But you can change your password. You can cancel your credit card. You can be issued a new PIN. The damage can be mitigated. You can reset those things. Right. But if a server is hacked and cybercriminals steal the exact mathematical map of your face geometry or your fingerprints or your retina scan, you cannot change your face.
Starting point is 00:14:42 No, you're stuck with it. You cannot get new fingerprints. You are permanently compromised. That data is tied to your immutable physical biology forever. That concept of biological immutability is the engine of the law. And because the stakes of losing that data are so unimaginably high, BIPA lays out very strict definitions to protect it. How does it actually define the data? The statute defines a biometric identifier as a retina or iris scan, fingerprint, voiceprint, or a scan of hand or face geometry. So very specific physical traits. Yes. But it also casts a wider net by defining biometric information as any information, regardless of how it is captured, converted, stored, or shared, that is based on an individual's biometric identifier and used to identify them.
Starting point is 00:15:25 So armed with those strict definitions, the plaintiffs in this class action bring the hammer down. Their complaint hits Lytx with three specific counts. Three distinct ways they allege this AI dash cam system violated Illinois law. Let's walk through them because each one builds on the last. Count I focuses entirely on the retention schedule, which falls under Section 15A of BIPA. What exactly does this section demand from a company? Section 15A is about data life cycle management. The rule states that any private entity in possession of biometric identifiers must have a written policy.
Starting point is 00:15:59 And crucially, that policy must be made available to the public. It can't be a secret internal memo. No, it has to be public. This policy has to establish a clear retention schedule and, specific guidelines for permanently destroying that biometric data once the initial purpose for collecting it has been satisfied. Or within three years of the person's last interaction with the company, whichever comes first. So you can't just quietly hoard the data in a server farm forever building up a shadow profile of a person over decades. You have to tell the public exactly when
Starting point is 00:16:30 and how you're going to delete it. Precisely. The transparency is mandatory. And the plaintiff's core allegation for count I is that Lytx completely failed to maintain such a policy. But don't they have a privacy policy on their website? The complaint notes that while Lytx obviously had a standard privacy policy on its website, that policy actively disclaimed responsibility for the data practices of its clients, the trucking companies. Oh, so they were pointing the finger at the employers. Yes.
Starting point is 00:16:57 Lytx's policy stated that they act only as a service provider processing personal information on behalf of their clients and directed anyone reading it to go look at the clients' own individual privacy policies. So they were basically saying, don't look at us, look at the trucking fleet you work for. Exactly. The plaintiffs argue this passing of the buck completely fails to meet BIPA's explicit requirement for a publicly available definitive destruction schedule maintained by the entity actually holding the data, which they argue is Lytx. Okay, so they allegedly didn't have a specific retention schedule. To a layman, that might sound like a bureaucratic oversight. Maybe a failure of the compliance department to update a webpage. Is a missing public
Starting point is 00:17:39 policy really enough to trigger a multi-million dollar federal lawsuit? On its own, a technical violation of 15A is serious, but it is the second count that truly creates the massive legal exposure. The plaintiffs follow up with count two, which is the bedrock of BIPA, informed written consent under Section 15B. This is the section that gives corporate compliance officers nightmares. Walk us through the mechanics of 15B. The rule is absolute. A company cannot collect, capture, purchase, receive through trade, or otherwise obtain your biometrics without doing three specific things.
Starting point is 00:18:14 What are the three steps? First, they must inform you in writing that your biometric identifiers or information are being collected or stored. Second, they must inform you in writing of the specific purpose and the length of term for which your biometrics are being collected, stored, and used. Okay, so full disclosure. Full disclosure. And third, and this is where companies always trip up, they must receive a written release
Starting point is 00:18:33 executed by you. An actual signature. Yes. You have to physically or digitally sign a document giving them explicit informed permission before a single piece of biometric data is collected. And the allegation here from the drivers is a total across-the-board failure on all three fronts. The drivers claim they were never informed in writing by Lytx that their face geometry was being mathematically scanned by these inward-facing dash cams. They claim they were completely in the dark.
Starting point is 00:19:01 They claim they were never told how long the resulting data would be kept on Lytx's servers. And most importantly, they allege they absolutely never signed any kind of written release allowing Lytx to capture their biometrics in the first place. They just showed up for work, got into their assigned trucks, and the cameras were already rolling. Building on that lack of consent, we reach count three, which is perhaps the most fascinating from a modern tech industry perspective. Count three looks at the commercialization of this data under Section 15C. What does 15C say?
Starting point is 00:19:32 The rule in BIPA is incredibly strict. No private entity in possession of a biometric identifier may sell, lease, trade, or otherwise profit from a person's biometric identifiers or biometric information. The way the plaintiffs structure this allegation is really clever because they aren't accusing Lytx of taking a truck driver's face map and literally selling it to a third-party data broker for a quick buck. That would be a cartoonish violation. Right. Nobody's accusing them of selling faces on the black market. Instead, the plaintiffs look at the economics of how AI is built. They claim that Lytx uses this massive database of continuously scanned drivers, again drawing on that 100 billion
Starting point is 00:20:12 miles of driving data to relentlessly train and refine its artificial intelligence. By continually feeding their machine learning algorithms with this allegedly biometric data, Lytx engineers newer, better, more accurate behavioral detection products. They then turn around and market this supposedly superior, highly trained technology to new corporate clients to win market share. That is the core of their argument. The plaintiffs are arguing that using biometric data as the raw material to build a better commercial product and then selling that resulting product to increase market dominance
Starting point is 00:20:44 is a direct, albeit indirect, violation of BIPA's no-profit rule. They are essentially saying you are profiting off our faces by using them to make your AI smarter so you can sell more dash cams. What's fascinating here is how the defense chose to respond to this massive multilateral attack, because Lytx did not just roll over and try to settle quietly on day one. Not at all. When you read Defendant Lytx's official answer to the complaint, you don't just see a disagreement over how to read a statute. You see a masterclass in aggressive, comprehensive,
Starting point is 00:21:19 scorched-earth legal defense. The answer document is formidable. Lytx didn't just quibble with the plaintiff's interpretation of Illinois law. They flat out denied the entire foundational technological premise of the lawsuit. It is a blanket, categorical denial of the core allegation. It is a fundamental clash of reality. The plaintiffs are saying your machine vision mathematically maps our faces to know if we are smoking a cigarette. And Lytx is saying, no, it doesn't.
Starting point is 00:21:45 You fundamentally misunderstand how our proprietary software works. It is two completely different stories. Throughout their answer, Lytx freely admits that their MV+AI technology detects risky driving behaviors. They proudly admit they sell this technology to improve fleet safety and save lives. They stand behind the product. But they vehemently, repeatedly deny that the drive cam scans drivers' face geometry. They deny that it harnesses or extracts biometric data points as defined by BIPA.
Starting point is 00:22:16 And they expressly, unequivocally deny maintaining a trove of biometrics on their servers. So how do they explain what the camera is actually doing then? They argue that their system detects the objects and the behaviors, the presence of a cell phone, the motion of an arm, the contrast of an eye closing, but it does not create or store a geometric map of the driver's unique facial identity. And that technological denial, as forceful as it is, is just the opening salvo. Lytx's legal team doesn't put all their eggs in the 'we don't scan faces' basket. In their answer, they list 23 separate affirmative defenses. 23 is a lot.
Starting point is 00:22:50 Now, for those who might not spend their free time reading civil litigation filings, an affirmative defense is essentially a legal pivot. It is a way for a defendant to say, even if a jury decides that everything the plaintiff says is 100% true, which, for the record, we strongly deny, the plaintiffs still cannot win this lawsuit because of this entirely separate legal reason. Categorizing and analyzing these 23 defenses reveals a brilliant multi-pronged strategy designed to attack the lawsuit from every conceivable angle. Let's dig into the major categories of these defenses because they show how complex it is to apply a state privacy law
Starting point is 00:23:28 to interstate technology. First, we have what we could loosely call the 'we're just the vendor' defense. Right. Lytx argues that it is a third-party technology provider. They do not employ these truck drivers. The trucking companies, the fleets that hire the drivers, are the ones who purchase a Lytx system and mandate the installation of the cameras in their trucks. Lytx argues that BIPA should not apply to a third-party software vendor that has no direct
Starting point is 00:23:52 contractual employment relationship with the individuals being recorded. They claim it would result in an absurd and overly burdensome application of the statute to hold the software developer liable for how an employer chooses to monitor its own employees. It's like suing Microsoft because your boss uses Excel to track your bathroom breaks. That is exactly the logic they are using. Building on the relationship between the driver and the employer, Lytx brings up the implied consent and clean hands defense. How does implied consent work in this context? This defense posits that even if biometric data was collected, which, again, they fiercely deny, the drivers
Starting point is 00:24:29 implicitly consented to that collection through their actions. How exactly does one implicitly consent to a biometric scan? Lytx's argument is that these drivers voluntarily got into the cabs of these commercial trucks day after day, week after week. They continuously drove trucks that they knew full well were equipped with inward-facing Lytx technology. There is a camera staring right at them. Yes. Lytx argues you cannot knowingly drive a monitored truck for months,
Starting point is 00:24:56 reap the financial benefits and wages of that employment, and then turn around years later and claim your privacy was secretly invaded without your consent. By doing the job under those known conditions, they argue that drivers accepted the terms. Lytx also brings up the no actual harm argument, which is a classic enduring defense in privacy and data breach litigation. It is the 'so what' defense. Lytx points out quite accurately that the plaintiffs didn't suffer any actual concrete damages. No one's identity was stolen. No, no one's bank account was drained by a hacker. No one lost a job or suffered reputational damage because a cybercriminal stole a wireframe map of their face from a Lytx server. Lytx argues that BIPA was designed to prevent harm, and without actual demonstrable harm,
Starting point is 00:25:41 the plaintiffs are not entitled to the massive, ruinous financial damages they are seeking. But the defense doesn't stop at state law interpretations or procedural arguments. They go all the way to the top. They pull out the United States Constitution and federal law to mount their counterattacks. These are the truly heavy legal theories that threaten to invalidate the plaintiff's entire case. Indeed. Lytx raises the doctrine of federal preemption. In the hierarchy of American law, federal law trumps state law.
Starting point is 00:26:09 Lytx argues that BIPA, which is a state-level privacy law passed in Illinois, is preempted by the massive body of federal laws governing the interstate trucking industry. They point specifically to federal trucking safety regulations and the Federal Aviation Administration Authorization Act, or FAAAA, which, despite its name, heavily deregulated and governs the motor carrier industry. The argument is straightforward. The federal government heavily regulates commercial trucking to ensure safety across all 50 states. Right. If an AI dash cam is proven to improve federal safety standards and reduce highway fatalities, the state of Illinois cannot use a local privacy law to effectively ban, penalize, or severely restrict that safety technology. The federal mandate for safety overrides the state mandate for privacy.
Starting point is 00:26:52 Then they invoke the dormant commerce clause of the U.S. Constitution. I want to spend a minute on this because it is a fascinating application of a very old constitutional principle to cutting-edge AI. It is a brilliant stretch of the law. Historically, the framers of the Constitution included the commerce clause because they didn't want the individual states getting into trade wars with each other, setting up tariffs at their borders and destroying the national economy. They needed free trade between the states. The dormant aspect of this clause prohibits states from passing legislation that improperly burdens or discriminates against interstate commerce. Applying that to AI dash cams is a brilliant legal maneuver. Lytx argues that applying an Illinois privacy law to interstate commercial truck drivers,
Starting point is 00:27:39 drivers who are constantly crossing state lines carrying essential goods across the country, creates an unconstitutional undue burden on the national supply chain. Just imagine the logistical nightmare this would create. Let's say a truck driver is hauling a load of goods from Indiana through Illinois and into Iowa. Following a standard route. The safety camera is on in Indiana. But the moment the truck's tires cross the state line into Illinois, the trucking company would have to somehow geofence the truck and automatically disable the safety features of the AI camera to avoid BIPA liability,
Starting point is 00:28:12 only to turn it back on the moment the truck enters Iowa. Lytx argues that Illinois simply does not have the constitutional authority to impose its hyper-local privacy standards on the national interstate supply chain in a way that forces companies to degrade their safety protocols. And lastly, if all of those defenses fail, Lytx brings up the excessive fines clause of the Eighth Amendment. Let's do the math on this, because this is where the sheer scale of the risk becomes terrifying for a corporation.
Starting point is 00:28:40 The numbers get huge, very fast. BIPA allows for statutory damages of $1,000 for every single negligent violation and $5,000 for every single intentional or reckless violation. And the courts have heavily debated what constitutes a violation. Is it one violation per driver? Or is it one violation every single time the camera mathematically scans the face? If an algorithm runs at 30 frames per second, checking facial geometry continually for an eight-hour driving shift, you can see how the math quickly becomes astronomical.
Starting point is 00:29:10 Exactly. Even using conservative estimates, if you multiply a $5,000 intentional violation penalty by thousands of truck drivers, the resulting damages don't just punish the company. They obliterate it. Lytx argues that using a state statute to impose hundreds of millions or potentially billions of dollars in damages for a technical privacy violation, where no actual data was hacked or stolen, operates as an unconstitutional, disproportionate punishment that violates the excessive fines clause.
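Since the excessive-fines argument turns entirely on this arithmetic, here is a quick back-of-the-envelope sketch. The per-violation dollar amounts and the roughly 85,000-driver class size come from the filings discussed in this episode; the scan rate, shift length, and day count are purely illustrative assumptions.

```python
# Back-of-the-envelope BIPA exposure math (illustrative only).
NEGLIGENT = 1_000        # dollars per negligent violation (statutory figure)
RECKLESS = 5_000         # dollars per intentional/reckless violation (statutory figure)
DRIVERS = 85_000         # estimated class size from the settlement memo

# Theory 1: one violation per driver.
per_person = RECKLESS * DRIVERS
print(f"Per-person theory: ${per_person:,}")     # $425,000,000

# Theory 2: one violation per scan. Every number below is an assumption,
# chosen only to show why this reading is described as astronomical.
FRAMES_PER_SEC = 30
SHIFT_HOURS = 8
WORK_DAYS = 250
scans_per_driver = FRAMES_PER_SEC * 3600 * SHIFT_HOURS * WORK_DAYS
per_scan = RECKLESS * scans_per_driver * DRIVERS
print(f"Per-scan theory: ${per_scan:,}")
```

Even the more conservative per-person theory starts at nearly half a billion dollars, which is the figure Lytx points to when it calls the statute's penalties an excessive fine.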
Starting point is 00:29:43 This wall of defenses really crystallizes the tension of the whole case. The technology scales effortlessly across state lines, across borders, and across corporate boundaries. But the legal accountability and the laws governing that tech are fragmented, localized, and incredibly rigid. It's easy to get lost in the constitutional law, the Dormant Commerce Clause, and the mathematical mechanics of Haar cascade algorithms. But we have to remember there are actual human beings sitting inside these truck cabs. We can't lose sight of the people. Let's pivot and look at the reality of the people living under this technology. The amended complaint brings forward three named plaintiffs to represent the massive class of drivers. Joshua Lewis, who drove for Maverick Transportation, James Kavanaugh, who drove for Quick Crete,
Starting point is 00:30:23 and Nathaniel Timmons, who drove for Gemini Motor Transport. The deep dive into their daily lives, as outlined in the complaint, paints a very specific and isolating picture of the alleged surveillance reality. We have to contextualize their workspace. Right. This isn't an office. These aren't people sitting in an office cubicle from nine to five where they can get up, walk to a break room or step outside for a private phone call. They're over-the-road commercial truck drivers. The cab of that truck is their entire world for hours and days on end. It is their workspace while driving, their break room while parked, and often their dining room and
Starting point is 00:30:58 bedroom. And according to the complaint, the drive cam is a constant, unblinking presence in that confined space. The psychological toll of knowing an AI lens is pointed at you analyzing your micro-movements while you are just trying to eat a sandwich or drink a coffee on a grueling 10-hour haul is immense. It creates a pressure cooker environment. But there is a specific technical detail from the sources that brings this entire federal lawsuit into the jurisdiction of Illinois. And it has to do with tracking. Yes, the geolocation data. The inward-facing camera doesn't exist in a vacuum.
Starting point is 00:31:32 The Lytx system is a comprehensive telematics unit, meaning it is constantly tracking the GPS coordinates, the speed, and the physical location of the trucks in real time. And that GPS ping is the trap being sprung. Exactly. These drivers work national or regional routes. But because their designated routes inevitably took them across the border and physically into the state of Illinois, BIPA's jurisdiction was triggered. So they didn't even have to be Illinois residents? No, not at all.
Starting point is 00:32:00 The plaintiffs construct a very logical argument. Because Lytx's system was actively tracking the truck's precise geolocation, Lytx knew exactly where the drivers were at all times. Therefore, Lytx knew full well that the alleged facial scanning was occurring while the drivers were physically present within the geographic borders of Illinois. It didn't matter if the trucking company employing the driver was based in Arkansas, or if Lytx's server farms were based in California. The moment those tires rolled onto Illinois asphalt with the inward-facing camera running, the plaintiffs argue Illinois law applied, and Lytx was on the hook.
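To illustrate why the geolocation point matters, here is a toy sketch of the kind of check a telematics backend could run. The bounding box is only a rough approximation of Illinois, the data is made up, and nothing here reflects Lytx's actual code; it simply shows that once GPS fixes exist, knowing which state a scan happened in is trivial.

```python
from dataclasses import dataclass

@dataclass
class GpsFix:
    lat: float
    lon: float
    driver_id: str

# Very rough lat/lon box around Illinois (illustrative, not a real boundary).
IL_LAT = (36.97, 42.51)
IL_LON = (-91.51, -87.02)

def in_illinois(fix: GpsFix) -> bool:
    """Return True if the fix falls inside the rough Illinois bounding box."""
    return IL_LAT[0] <= fix.lat <= IL_LAT[1] and IL_LON[0] <= fix.lon <= IL_LON[1]

# Hypothetical fixes: one in downtown Chicago, one in Indianapolis.
fixes = [GpsFix(41.88, -87.63, "driver-123"), GpsFix(39.77, -86.16, "driver-123")]

for f in fixes:
    status = "inside" if in_illinois(f) else "outside"
    print(f"{f.driver_id}: {status} Illinois at ({f.lat}, {f.lon})")
```

The point of the toy example is simply that the jurisdictional hook is not speculative; the location data already exists inside the telematics system.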
Starting point is 00:32:36 Here's where it gets really interesting, because you have this unstoppable technological force meeting an immovable legal object. You have a massive, well-funded tech company vigorously denying the claims and raising 23 formidable constitutional-level defenses. And on the other side, you have plaintiffs armed with the strictest biometric privacy law in the country and a very sympathetic narrative of intense workplace surveillance. It is a legal standoff. How does this end? It ends with a $4.25 million settlement. But to understand why both sides agreed to lay down their arms, we have to look at the sheer attrition of the litigation. The settlement memorandum outlines a grueling timeline.
Starting point is 00:33:16 This wasn't a quick payout or a nuisance settlement. It was three years of grinding, complex, highly adversarial litigation. Lytx fought incredibly hard. They filed a motion to dismiss the case entirely early on, which the plaintiffs successfully survived. Surviving a motion to dismiss is a huge hurdle. It means the judge looked at the complaint and said, yes, there is enough legal merit here to proceed to discovery. And discovery in a case involving proprietary AI algorithms is an absolute nightmare.
Starting point is 00:33:45 It is the grueling process of demanding internal emails, technical specifications, source code summaries, and engineering documents from each other. They fought over what documents had to be produced. Finally, after years of spending money on corporate lawyers, it culminated in a full-day, intense mediation session with a respected independent mediator. There is also a quick procedural note in the documents regarding Maverick Transportation, the employer of the first-named plaintiff, Joshua Lewis. That's an important detail.
Starting point is 00:34:15 Early on in the process, Maverick settled their portion of the claims out of court. The documents note they were dismissed from the lawsuit with prejudice. Wait, what does with prejudice actually mean in plain English? Because it sounds like the judge was angry at them. It's a common legal phrase that simply means the dismissal is final and permanent. When a case is dismissed with prejudice, it means the plaintiff is legally barred from ever bringing that exact same claim against that exact same defendant in court. The book is permanently closed on Maverick. That dismissal left Lytx standing entirely alone to face the music as the sole remaining defendant against the entire class of drivers. So let's look
Starting point is 00:34:54 at the final deal they struck in that mediation room. The memorandum lays it out clearly. The settlement establishes a non-reversionary cash fund of $4,250,000. Non-reversionary is a vital term here. It means that Lytx writes the check and puts the money in an escrow account, and no matter how many drivers actually fill out the paperwork to claim their share of the money, Lytx doesn't get a single penny back. If only 10 drivers claim the money, those 10 drivers get a massive payout. Any leftover funds do not revert back to the company. The scale of the class size is staggering. The settlement covers an estimated 85,000 unique class members. That means there are 85,000 individual commercial truck drivers who operated a Lytx-equipped vehicle within the state of Illinois during the specified
Starting point is 00:35:40 class period. The court-appointed settlement administrator executed a massive notice program to try and track down these drivers, sending emails, physical mail, and digital ads, and successfully provided direct notice to 22.5% of the massive class. And notably, when the court asked for objections, not a single class member out of the thousands notified objected, and no one opposed the final approval of this settlement. But the big question, the million dollar question, or I guess the $4.25 million question, is why settle? If Lytx had 23 amazing defenses, if they believed they could win on the Dormant Commerce Clause, and they swore up and down that their tech fundamentally does not scan face geometry, why write a check for $4.25 million? And conversely, if the plaintiffs truly felt their
Starting point is 00:36:24 fundamental human privacy was deeply violated by an illegal biometric surveillance tool, why let Lytx off the hook for what, if you divide $4.25 million by 85,000 drivers, amounts to about 50 bucks apiece before the lawyers take their cut? It all comes down to the brutal reality of risk assessment. Both sides faced absolute existential threats if they refused to settle and took this to a jury trial. For the plaintiffs, the risk was that a jury might actually listen to Lytx's engineers and believe them. What if Lytx successfully proved at trial, using expert computer science testimony, that their MV+AI algorithm genuinely does not scan face geometry as strictly defined by BIPA?
Starting point is 00:37:03 What if they proved it only detects shapes and contrast patterns without ever extracting unique biometric identifiers? If a jury believed that technical distinction, the plaintiffs lose everything. They get zero dollars after years of work. Alternatively, what if the judge eventually agreed with Lytx's federal preemption defense? If the court ruled that federal trucking safety laws completely override Illinois state privacy laws, the plaintiff's case is instantly destroyed on constitutional grounds. And for Lytx, the risk was apocalyptic. Apocalyptic is the exact right word. Let's revisit the math on the statutory damages in BIPA. Let's assume the best-case scenario for the plaintiffs and the worst case for Lytx. A jury finds that Lytx intentionally or recklessly violated
Starting point is 00:37:45 BIPA. That is $5,000 per violation. Multiply that $5,000 by 85,000 drivers. The damages immediately start at $425 million, nearly half a billion dollars just as a baseline. But it gets worse, because BIPA damages have often been interpreted by courts to accrue per scan, not per person. If the AI dash cam mathematically scans a driver's face geometry multiple times a minute for an eight-hour driving shift for three years, the math creates a financial black hole that would bankrupt almost any tech company on Earth. A trial loss could have been a fatal, company-ending event. When you look at an exposure of potentially billions of dollars, a $4.25 million settlement is not an admission of guilt. It is a highly calculated, pragmatic business expense to permanently eliminate a catastrophic risk.
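To put the two sides of that risk calculus side by side, here is a small, hedged sketch of the settlement economics. The $4.25 million fund and the estimated 85,000-member class come from the settlement memorandum as described above; the attorney-fee percentage and the claim rates are purely illustrative assumptions, not figures from the filings.

```python
# Illustrative settlement economics; fee share and claim rates are assumptions.
FUND = 4_250_000          # non-reversionary settlement fund (from the memo)
CLASS_SIZE = 85_000       # estimated class members (from the memo)
ASSUMED_FEE_PCT = 0.35    # hypothetical attorneys' fee share, not from the filings

net_fund = FUND * (1 - ASSUMED_FEE_PCT)

# Because the fund is non-reversionary, the pot does not shrink when fewer
# drivers file claims; the per-claimant share simply grows.
for claim_rate in (1.0, 0.25, 0.10):
    claimants = int(CLASS_SIZE * claim_rate)
    print(f"claim rate {claim_rate:>4.0%}: {claimants:>6,} claimants, "
          f"about ${net_fund / claimants:,.2f} each")
```

Set against the nine-figure-and-up exposure sketched earlier, those modest per-claimant numbers are exactly the pragmatic off-ramp both sides took.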
Starting point is 00:38:32 If we connect this to the bigger picture, this settlement serves as a massive warning beacon for the entire tech and transportation industry. In the age of AI, data is an incredibly valuable asset. It is a raw material that builds the future. But this case proves that unregulated, improperly consented biometric data is a multi-million dollar liability waiting to detonate. It shows that deploying AI surveillance tools, even tools designed with the noble, necessary goal of saving lives on the highway, carries immense, unquantifiable legal risks if those tools intersect with aggressive state privacy laws. So what does this all mean? We started this deep dive looking at a small plastic camera mounted to the windshield of a truck, and we end up looking at the fragile high-wire balancing act of modern society.
Starting point is 00:39:17 On one hand, we possess advanced artificial intelligence that can genuinely prevent horrific accidents. It can catch a tired driver falling asleep at the wheel. It can sound an alarm before a distracted driver rear-ends a family minivan. It definitively saves lives. But on the other hand, we have the strict legal boundaries of human privacy and the right to exist without being mathematically quantified by our employers. Do we have to sacrifice the sovereign ownership of our own biological face geometry to be safe on the highway? This lawsuit proves that we as a society have absolutely not figured out the answer to that question. And that is the ultimate takeaway for you listening to this.
Starting point is 00:39:56 The very definition of what constitutes you in a digital space is still being actively fought over in the courts. Is 'you' just a traditional photograph? Is it a glowing wireframe map of your facial landmarks? Is it a statistical model of your eye movements? Technology is evolving exponentially faster than the law can comprehend. And every day, judges and juries are being forced to draw legal boundaries around concepts that didn't even exist 20 years ago.
Starting point is 00:40:21 I want to leave you with a brand new concept to chew on. Something that builds on this entire debate, but looks just over the horizon. Right now, privacy laws like BIPA focus strictly on physical biological identifiers, your face geometry, your fingerprints, your retina. Well, what happens in three or five years when AI systems become so hyper-advanced that they don't even need to see your face to know exactly who you are and what you are doing? What if a smart seat in a truck cab or your office chair could identify you and predict your
Starting point is 00:40:51 level of distraction or fatigue solely by the unique microscopic way your body shifts its weight over time? What if sensors embedded in a steering wheel or a keyboard can identify you based on the distinct rhythm, temperature, and pressure of how your hands grip the surface? When your very behavior and movement become your biometric signature, will our current rigid laws be able to protect that data signature? Or will the entire legal framework have to be torn down and rewritten all over again? That is a wild, slightly terrifying thought to end on, and exactly why we love digging into these sources. Thank you for joining us on this
Starting point is 00:41:30 deep dive into the legal and technological frontier. We hope we unpacked the complexity and gave you the insight you were looking for. Keep asking the big questions. Keep looking past the headlines, and above all, stay insanely curious. We'll catch you on the next one. Hey, it's Mark, and thank you for listening to this episode of the Employee Survival Guide. If you'd like to be interviewed for our podcast and share your story about what you're going through at work and do so anonymously, please send me an email at M-C-A-R-E-Y at C-A-P-C-L-A-W dot com. And also, if you like this podcast episode and others like it, please leave us a review.
Starting point is 00:42:00 It really does help others find this podcast. So leave a review on Apple or Spotify or wherever you listen to this podcast. Thank you very much. And I'm glad to be of service to you.
