It Could Happen Here - Best and Worst (non AI) Products at CES

Episode Date: January 13, 2026

From exoskeletons to rolling laptops and VR therapy, Robert and Garrison discuss some of the products at CES 2026 that aren't powered by AI and ChatGPT. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 This is an iHeart podcast. Guaranteed Human. Also Media. This is It Could Happen Here, a podcast about things falling apart. And today, the things falling apart are consumer electronics as an industry. We are at CES 2026, the trade show where the tech industry shows us everything it's going to sell us in the next year. And we've seen some cool stuff and a lot of bloatware, a lot of crap, a lot of AI-enabled stuff that doesn't need to be AI-enabled. Here to talk about it is our panel of experts.
Starting point is 00:00:38 Experts is a strong word, but okay. Yeah, panel is a strong word. Of is even a strong word. I'm Robert Evans. Garrison Davis is also with me. And I'm Ben Rose Porter. And, you know, Ben, you're an academic, right? You do college.
Starting point is 00:00:54 Yeah, I do. I teach sociology. That's right. That's right. As a sociopath, uh, how have you enjoyed the trade show? It's fascinating. I mean, you know, it's a fascinating world sociologically. You can learn a lot of
Starting point is 00:01:11 the maladies of society at this place. Yeah, most of them are on display here. We are now going to walk you through a bunch of the largely non-AI products that we were able to find after sorting through the gunk. Yeah, it's kind of like, you know how people pan for gold? Yes. Except instead of panning for gold in, like, a beautiful mountain stream, you were panning for gold in, like, a pile of used condoms. Gold is also generous. This is panning for pennies in a pile of used condoms. Right, right.
Starting point is 00:01:43 Now, what was the best used condom you saw at CES 2026, Garrison? I think you mean what's the best penny? Good products. Did we see any good products? Yeah, there's some good ones. Did I? I don't know if I saw any good products this year. The translation stuff is still around and some of it's really cool.
Starting point is 00:02:01 No, I mean, yeah, in other episodes we've talked about the glasses and the earbuds. Most of the product sheets I have out now are at the very least mixed, if not bad. At Eureka Park, there was this product called Nodie, which is a smartphone replacement for kids, like, ages 6 to 12, if you don't want them having a smartphone. And it allows them to send voice messages to a parent who, you know, has this approved contact list. So they can also listen to music, listen to, like, some radio, you know, radio dramas, probably some books.
Starting point is 00:02:38 It connects to Spotify. You can learn languages, allegedly. You can learn words, right? You can learn words from other languages. And you can communicate with your parents. And this was, like, fine, right? It's the stuff that we've kind of seen before, but it was this little, you know, silicone thing.
Starting point is 00:02:53 It kind of looks like an Elf Bar, if you're familiar with an Elf Bar. And yeah, if you don't want to give your, like, you know, six-year-old an iPhone, but you want them to have a way to contact you. Right. And that's, like, fine, right? So there's a lot of basic stuff like that, I guess, that is kind of okay. I don't want to be cynical for the sake of being cynical.
Starting point is 00:03:14 That kind of stuff bores me. But there really weren't that many good products this year, because so many of them were just LLM wrappers, as we discussed in our previous episode. And there's another category: there were products that I could tell were good products on their own, but they had still thrown in AI functionality that I don't want, that I know will blow out the price, and that makes me want to buy it less. LG was advertising their new OLED television that has, like, the most vibrant colors,
Starting point is 00:03:39 the best OLED screen that's ever been on a TV. And it looked fucking great. I have a video of it. The colors, the darks are amazing. LG screens are really good. But it's also with AI, with Evo AI. And to show off its vibrant colors and what it's capable of, they had a loop of videos that were inspired by the Hubble Space Telescope.
Starting point is 00:03:56 and you can see it, if you ever watch Deep Space 9, it's about the quality of the Deep Space 9 intro video. It does look very Star Trek intro, yeah. Which is frustrating because, again, we have images from the telescope.
Starting point is 00:04:13 They look cool because you know that what they are is real. Like, they're real products of one of the most impressive things human beings ever made, and you're bragging that, like, we have an AI generating images inspired by this thing that look worse.
Starting point is 00:04:27 Yeah, a lot of these things are kind of mixed. I wanted the TV. I was like, wow, I might like a TV like that, until you displayed that part of it. And now I just kind of feel dirty about even the prospect of wanting
Starting point is 00:04:38 one of your products. Here's an interesting product I saw at the Innovation Awards showcase. This is the Acoustic Eye, which is kind of just... First off, that's an ear. That's an ear, right?
Starting point is 00:04:47 That's an ear. We don't need to do that. But this is specifically a security system that tries to detect very small drones. And this is, like, for, you know, high-profile people's houses, mostly. You can put it on your roof or your window, and it'll detect very small drones that cameras would not be able to detect. So it's trying to, like, listen for drones. It provides 360-degree omnidirectional
Starting point is 00:05:12 aerial surveillance, detecting invisible aerial threats. So I would assume that they're trying to sell these to, like, you know, executives, CEOs, politicians, people who are at risk of either drones filming around their house or more, like, kinetic drone-based attacks. Right. And this is an interesting product that's very current. It's a very current product. Like, this is a reflection of
Starting point is 00:05:37 some of the times that we're in. And I found that to be an interesting thing. I guess we should discuss the exoskeleton, which is maybe the best part of the show for me. Yes. There's a lot more exoskeletons this year. The technology has clearly matured.
Starting point is 00:05:54 And matured to the point that not only do they have viable versions for, like, industrial use, for people working in factories and whatnot, right? Which two years ago is what they were always advertising. Yeah. These are things that you buy at an enterprise level. And, you know, I know they're not primarily concerned with the health and well-being of their workers, but actually these do improve the health and well-being of workers, right? It reduces the felt load and the felt strain. It reduces the damaged knees and backs, right? And that does affect their productivity.
Starting point is 00:06:20 Right. And it also affects their profit, because you're less likely to have workman's comp claims, right? It's one of those things where it's a really good idea and the products work. There's a number of very good exoskeletons. We received an exoskeleton from a company called HyperShell before the convention this year, which you and I both wore on the floor. I have some data on it, which is that I timed my normal walking pace when I'm not particularly trying to get anywhere at about 19 minutes a mile, right? If I'm just kind of walking casually. When I put the HyperShell on and had it at 75% power mode,
Starting point is 00:06:53 my walking pace was 15 to 16 minutes a mile, about 15 and a half, I think, is what I generally averaged. And my heart rate didn't change meaningfully. It was, like, one or two beats higher than it normally is, but not really a significant change in heart rate, right? And I felt like, at the end of the day, my feet hurt about a normal amount for a day at CES, but my lower back and my knees felt less strain, right?
Starting point is 00:07:13 That was my experience with the HyperShell. It's like an external hip. It attaches around your waist. Yeah, there's a belt around your hip, and it goes down to right above your knees. And it's not a huge footprint. No, it's a very small device. And yeah, it goes around your hip, then another strap goes above your knee, and it kind of assists or guides your leg and your hip up and down.
Starting point is 00:07:34 Yeah, and for the record, folks, the basic version of this product is about $1,000. The version we had was about $2,100, right? And the battery will last about 30 kilometers, they say. I didn't have any trouble getting about an eight-hour day out of it. Ben, you wore it much more than I did. I wore it a little bit. Do you want to talk about your experience with the HyperShell? Yeah, I was impressed with it.
Starting point is 00:07:55 First of all, HyperShell, very funny name. I like that name. It was pretty comfortable to wear, which... I always see the exoskeletons and I'm like, that looks kind of awkward. But it was very easy to put on and pull off. And it was comfortable, and it was pretty simple. And it basically just has two motors that sort of assist when you move your leg,
Starting point is 00:08:17 it pushes your leg, and when it comes back in the step, it pushes it back down. And so it's just assisting, and it tracks your leg pretty well. So there's just very little time when you're pushing against the machine. It's coordinated very well.
Starting point is 00:08:30 I mean, it just functioned. It worked, and you could walk a long time. Yeah, the fact that it's a product that works and does a thing that has utility feels increasingly rare at CES. I was a little disappointed that the product didn't
Starting point is 00:08:45 pay attention to my emotions and build a relationship of empathy with me. But the walking was good. Yeah, I asked the HyperShell about its opinions on Proust, and it had very little to say. It has, like, two main modes. It has this eco mode, and this hyper mode, which can get really aggressive. If you turn up hyper mode all the way, you feel it, like, kind of lifting your legs. You could be bounding.
Starting point is 00:09:07 Yeah. And you can adjust, like, the torque. You can adjust how much delay it has. You have a lot of different settings. The one thing that I had a lot of fun with is that there's this other experimental mode, I don't know if you've turned it on yet, called fitness mode.
Starting point is 00:09:20 Fitness mode is cool. It does the opposite: it adds resistance to the movement. So it's for, like, working out. If you want a harder hike, right, then you can turn on fitness mode, and then it's more work to walk, like how Goku trains.
Starting point is 00:09:35 Sure. That's an anime thing, by the way. Now, one thing that I found out through my own cunning is that if you have hyper mode turned on all the way, which it was when Ben was wearing the exoskeleton, on my phone,
Starting point is 00:09:55 I can switch from hyper mode to fitness mode immediately, which completely halts any movement. So you can be walking at, like... You can really fuck up your friend if they don't have the app. You're walking at, like, seven miles an hour, really fast speedwalking. And then I press a button,
Starting point is 00:10:13 and your legs come to an immediate halt. And it was really fun to do that for about seven hours. Yeah. I almost crashed into several people just for you to get your kicks. Oh, I'm glad you found the terrorize-your-friend option on there. It was really fun. It was really fun. Most of the time I used it, I had installed the app when I got it, but I didn't think about it after that.
Starting point is 00:10:34 Because you can do everything on device, or at least you can do a lot of things on the device. You get a lot more options when you're using the app, but you don't need to use the app
Starting point is 00:10:54 to handle the basic functionality, right? I loved adjusting the intensity of how much it's doing. And it shows a whole bunch of applications, right? You can wear this hiking, like, running, you know, lifting, at your work. And I like the thoughtfulness in that you don't need the app to use this, but the app gives you a lot of control that you're not gonna get off of a simple, like, button, right? It just struck me as good design. It has a lot of, like, fidelity via the app. So that's the HyperShell, folks, a good exoskeleton. I've used a few at various CESs, and
Starting point is 00:11:13 this one certainly strikes me as a very good consumer option, like, if you as an individual want one. And I'm sure most of the sales are still going to be, like, enterprise, different companies that, you know, want these for people who are doing loading and unloading at a loading dock or whatever. But I think the prices will continue to go down, and they are now hitting the point where this is a thing that individuals can afford if they want one. And there is a lot of, and HyperShell focused on this in some of their advertising, there's a lot of utility for people with disabilities for stuff like this, right?
Starting point is 00:11:44 Like, that's part of the point of all of these different products. And in general, when it came to this stuff, because there's AI in this too, HyperShell talks about it, and it's mostly just how it learns and reacts to your motions, right? That's, like, machine learning. And when we talk about AI, usually the useful applications you could also just call machine learning. That's what we used to say.
Starting point is 00:12:05 But in general, the products that impressed me most and scared me most at CES were health-care-related products, right? Where we have a towel that reads your sweat and can tell if you have, like, vitamin deficiencies, or if you're not hydrated enough, or all of these different... Like, the number of products where it's both, yes, this thing can tell if you have, like, fatty liver disease without needing to go to a doctor. And I'm sure that is useful and will help a lot of people. And also, all of these products are selling your data to the highest bidder,
Starting point is 00:12:39 your health data, your biometrics. Did you ask them about that? Sorry, I am not aware of that being the specific thing for the fatty liver people. That was my problem with all of the health wearables, I should say. Just to be clear. I should clarify. The wearables are all on the cloud, and every one of the ones I saw has deals with the LLM companies that they're working with and are handing your data over.
Starting point is 00:13:01 Yeah, because that's how they get quote-unquote smarter: through massive data collection. And so there's this kind of baseline expectation here that everyone is fine with handing over all of their data, all of their physical data, all of their biometrics. Which, like, the utility is undeniable of things that can diagnose that you need to go to a doctor, or can at least suggest the existence of problems. Combined with: we are not at all interested in keeping that information secure. I find that kind of casual. And no one will, because I don't think people will. I think people will buy these products and not think about who's getting access
Starting point is 00:13:39 to their biometric data. And I wish that people cared about that. And we've seen that specifically be a problem around like pregnancy and with states restricting abortion and the ways in which these companies are aware of people's bodies even before the actual people are.
Starting point is 00:13:59 Here's an ad break. All right, we're back from ads. So I went to Lenovo's booth. It was a bunch of laptops. I use Lenovo laptops. They make good products. Every product there was either, here is a new update to the line of laptops we've been making for 30 years. This is the latest one. This is the latest ThinkPad Carbon. This is the latest
Starting point is 00:14:28 IdeaPad or whatever. And then they had a couple of... The big thing they were showing was the Lenovo Twist, which is a laptop that has a screen that can twist around, so it can lay flat like a tablet. But the thing that they were really showing is that it's motorized and has AI enabled so it can follow your face. And you can set it to track an individual's face, and as you move, it will move with you. Now, it has two different modes. One of them is you can gesture to it, or you can command it by voice, and you can say, go into laptop mode, go into tablet mode, turn left, turn right. That did not work well. About half the time in the demo that I saw, it did not respond, maybe because the room was loud, maybe because the data was bad. But then
Starting point is 00:15:07 he put it into face tracking mode. And when you have multiple faces, you can pick which face it tracks, and it swiveled to meet your face. And it was cool, and it did work very well. I was like, this is impressive. But what is the use case? Why would you want it? And he was like, well, say you're doing a presentation, like you're a CEO or something doing a presentation.
Starting point is 00:15:29 This way the screen with your text on it or the PowerPoint on it will follow you as you move around. And I was like, that could be useful. I don't feel like many people are in that situation often. I've never been in that situation in my life. And I speak in public for a living sometimes.
Starting point is 00:15:45 So I guess, yeah, there probably is a CEO who would benefit. There's like five of those guys. Like, what is the, this is a whole laptop product line. You have multiple versions. And I didn't get a single reason why you would want this other than that, other than for presentations. And it's genuinely impressive that it can track your face and move as you move, but why?
Starting point is 00:16:09 Another product they had that was in the but why category was the Lenovo Legion, and this is not a product that's ever going to come out, but it was like a proof of concept. So it's their gaming laptop line. And it has a normal screen that can widen to be three times as long. And it unfolds. They have screens that unfold.
Starting point is 00:16:28 And it's cool that they could do that. And it looked neat. And it's neat that a screen has that capability. I don't want it, because, I mean, obviously I can see the utility in having a screen that gets bigger without it being a bigger footprint for the laptop.
Starting point is 00:16:43 But when the screen is unfolded, there's these huge, speed-bump-sized wads of screen that bulge out, and it doesn't look good. Like, it's a bad screen. It's, like, a little bubbly. Yeah. When it's fully extended, it's not a good screen. Okay. You said it's a proof of concept piece. Right. Right. Right. I mean, they're showing that they're working on folding screen technology. The Twist was, like, a previous version of that. Yes. The proof of concept, called the Swivel, yeah, was at CES last year.
Starting point is 00:17:15 Yes. And they improved it, and now it's a real product called the Twist. Right. And maybe this could be the case for this like unrolling, unrolling, unfolding gaming laptop screen.
Starting point is 00:17:24 Right. And maybe this could be the case for this unrolling, unfolding gaming laptop screen. Eventually it will be in products, right? Yeah. But I also don't see how you can not have the bulge. I don't know. I mean, it seems integral to how the screen works. The folding screens have come a long way
Starting point is 00:17:35 the past five years. I use one. Yeah. They also do have some pixels dying in the fold area. I mean, yeah, if you're trying to buy products
Starting point is 00:17:43 for longevity, probably not the thing for you. If you like it for the novelty and for some reason have enough cash to burn, then maybe it's something someone will be interested in. As I was watching it unfold, there were two guys behind me talking about it.
Starting point is 00:17:56 And one of them was like, yeah, it's not real, it's not going to actually come out. Like, that's even the old version of the chassis. And I said, I just don't really see a use. I don't think people want a product like this. Like, I'm looking at it. The screen's not great. And I just don't see who's going to buy this. And the
Starting point is 00:18:12 guy behind me said, well, I think the use case is, like, billionaire CEOs and other people with a lot of money. And I looked back, and it was a Lenovo rep. And I was like, that's insane. Why? Who are you? Like, did you just say that to me? A Lenovo rep said that. Yeah, yeah, he's one of the guys doing the demos. He had a Lenovo badge on. And yeah, that was weird to me. That's wild. But again, at least it was a thing. No, it was a physical product. Because the other thing they had was the workstation, where the thing they were showing was an app on it that looks at your face and
Starting point is 00:18:47 shows you how fatigued you are by percent, and how fatigued your eyes are, and other data. And I was like, oh, that's creepy and kind of impressive. But then I walked away, and I came back, and it gave me a totally different set of numbers for my fatigue. Were you differently fatigued? No, it was a second later. And I did it four times. And every time, the set of numbers was different enough that the only assumption I can make is that those aren't real numbers. It's just generating a number and telling you that, and it's full of shit, because it wouldn't have been so different if it was actually measuring anything. It's just random numbers that it's putting up to make you think it's measuring something. It's probably trying to. There might be just
Starting point is 00:19:26 subtle things that dramatically change the number that's being displayed. Once I noticed that, I specifically tried to keep my face flat. And I did notice when you move closer and further it changes, but I think it's just programmed to alter the numbers as you move, so that you see the number moving. But every time I came back, it was a different number. When I started, it was, like, 0.02, and the second time I came back, it was 0.50. And again, I did nothing. I was specifically keeping my face neutral.
Starting point is 00:19:53 Yeah, that's interesting. Like, it's not a big deal. I think that type of stuff will get better. Like, we've seen versions of that before that have actually done, like, okay. Yeah. Well, there could be a lot of factors in why the demo goes a certain way. The face tracking and facial recognition were...
Starting point is 00:20:11 But I went to this booth where, like, the big thing they were doing was driving assistant robots that would, like, yell at you if you fell asleep. Or if you looked away and were, like, texting or something, it would say, like, look away from your phone. Look at the screen. Please. And there's definitely utility there, right? Like, that is probably a good idea. We saw that last year at Samsung's section of Eureka Park for, like, test taking, to make sure students aren't cheating at tests. Right.
Starting point is 00:20:37 So it, like, sends an alert every time the student's eyes go away from the computer screen. If they keep looking down, like, they could be checking their phone or notes. Smart Eye is the company. One thing I did appreciate
Starting point is 00:20:50 was that the little device that they put in was just like a circle with two eyes on it as opposed to like a whole dashboard. So that seemed nice. But the thing that they had, the first thing I used
Starting point is 00:20:59 was this optical recognition system where it learned my eyes and then when I came on it would give my name every time I walked up to it. And so like, yeah, it definitely like recognizes at least your eyes.
Starting point is 00:21:09 And it could switch between different people. But also, it can tell when you're drunk, they claim. And I couldn't test that. I was high on kratom and I had been smoking Delta 8, so I was definitely not sober. It didn't recognize me as not sober, so it couldn't measure those things. Maybe it can tell if you're drunk. The demo, they said it could, and they showed a woman when she was sober and when she was drunk. And they explained to me, it was actually really difficult, because we did this on a closed track and she is really drunk. And we had to jump through a lot of hoops for them to let us have a person drunk driving.
Starting point is 00:21:44 That's interesting. And I did find that kind of funny. No, yeah. That is how you test. That would be kind of a hard thing to test. Yeah. Yeah. You could test in like simulations.
Starting point is 00:21:50 Right. But they wanted, like, this was supposed to be a proof of concept in a real vehicle, and it'll shut down the car. I did have some people posting, when I posted a video of that, that, like, I have this condition with my eyes or that condition with my eyes, and normal optical recognition stuff doesn't work with me. Is this going to show that I'm drunk, or is it going to be able to...? And I actually don't know.
Starting point is 00:22:09 Again, I was not able to test the product on that. That does seem like a concern I would have hoped they'd dealt with, but also I kind of doubt they did, because usually there's gaps in products like this. I could see a company like this partnering with, like, an insurance company
Starting point is 00:22:22 or partnering with like certain cars that would like stop the car from being able to be moved if it detects the driver is drunk and how like false positives would play into that and then you just like are like locked out of your car because this robot thinks you're intoxicated
Starting point is 00:22:36 and actually you're fine, you just, like, look like that. And I could also see them rolling this product out and it hits, like, Thailand or something. And then there being a big story where it's like, they didn't test this on any people of Thai ancestry, and it actually sees all of them as drunk because of, like, an error in the coding. Sure. And being like, oh, great. Again, Smart Eye, from what I can tell, their technology worked. But in order to adequately review and test it, you need more access to it than they're giving at the show. I can't actually tell you
Starting point is 00:23:07 if it works at determining when people are drunk. I can just show you they had a video claiming it does. So, one product that I feel at best mixed about, I first saw in the CES Innovation Awards section. This is called self-insight therapy, ResolveXR, from South Korea. Oh, one of my two favorite Koreas, by the way, just since we're talking Korea.
Starting point is 00:23:29 This is a VR therapy program that is supposed to give you a final goodbye with a deceased loved one in VR. Yes, this was my initial reaction as well. And, like, I've seen versions of this before, where it's, like, an LLM or an AI pretending to be your dead wife that you, like, hug in VR. Sure. And that type of stuff I've generally felt bad about. No.
Starting point is 00:23:59 Some of the people who've used it, like in the promotional videos, are, you know, crying, and I feel for them, and, in a way, like the product. As a friend, I will promise you, if that moment ever comes for you, I'll pretend to be your dead wife. I don't even know what to say to that. But what I found interesting about ResolveXR is that the avatar of your deceased loved one is not... That's a bleak term? Is not actually an AI. Nor is it a fully pre-recorded, prescripted simulation.
Starting point is 00:24:32 It is being puppeted by a therapist that you are working with, as a part of the Gestalt Empty Chair Therapy technique. And this is what the product is. So you're working with a therapist who is using text-to-speech that is talking as your deceased loved one, as a part of this therapy exercise.
Starting point is 00:24:52 If you have recordings of their voice, the AI will try to replicate their voice. That's something I feel a little bit odd about. But that is, like, the one aspect of, quote, AI that's being used here: it's for the voice cloning. And then there's a pre-recorded set of gestures that someone does in, like, motion capture.
Starting point is 00:25:10 But the actual live puppeting of this thing is done by a therapist that you're sitting across from, but you have, you know, VR goggles on. And this is not supposed to be something that you do, like, routinely. It's not like, oh, I'm talking to my wife. Like, my wife is in VR. This is a therapeutic exercise meant for people dealing with extreme grief, specifically when loved ones have been taken away during, like, accidents. Like, they specifically mentioned a plane crash that happened a year ago. And so this is for people in, like, extreme,
Starting point is 00:25:40 extreme grief, to give them closure through this therapeutic exercise. And this is the pamphlet. I went through a lot of whiplash, because obviously my first assumption was, this is an evil product where you, like, feed your loved one's social media data and it pretends to be them. And it's not that. And it's good that it's not that.
Starting point is 00:25:59 But then, it's basically a therapist puppeting your dead loved one. There's a couple of conclusions I have. First off, these people are trying to be ethical. It does seem like they care, and they are attempting to provide something that is useful to people who are suffering. I also think this idea might be fundamentally unethical and impossible to do well. So I think this might be a case of someone trying to do the most ethical version of something that cannot be done ethically, which is a category of AI device that I've seen this year.
Starting point is 00:26:33 It's tricky, because on one hand, you know, as I was talking to the woman at the booth and reading through the materials they had, it seemed like they were selling this as, this is just sort of an augmentation to a therapeutic practice that's already done. We're just putting a, you know, digitally generated face and voice to it. But it's so, I mean, it's so easy to imagine.
Starting point is 00:26:58 This company seems somewhat ethically focused. But there are all kinds of directions you could go with this. It's a little bit of a Pandora's box, of, like, we're starting to venture into creating replicas of the dead. Yeah. And that is... And I went to a panel in the same vein, on mental health and AI, that I thought was going to be an opportunity for me to harass an executive during a Q&A.
Starting point is 00:27:29 Which is one of your favorite CES activities? It's my favorite thing to do at CES. And there literally is ample data that AI is a disaster for mental health. Not just AI psychosis, but there are a lot of things that it makes worse, and a lot of problems that it causes people, and a lot of problems that it exacerbates, including, like, suicidal ideation. This is documented. There's data on it. So that's what I was showing up prepared to do. And what I actually got was an actual clinical therapist who first started by very much admitting the dangers with AI and the
Starting point is 00:28:03 things that it harms in terms of mental health, and then was trying to say, what would a responsible and ethical, like, therapeutic AI do? And her argument was, we know how many people need therapy and don't have access to it, both in the United States and worldwide. And some sort of automated bot system that handles aspects of therapy might be the only way to provide affordable therapy to the number of people who need it who can't currently afford it. And I disagree. Or at least, I don't disagree that there's way more people who need mental health care than can afford it, right? That's undeniable, undebatable. I disagree that AI can help this problem in any meaningful way, and in fact think it will only make it worse. But I understand
Starting point is 00:28:49 that she was coming at this from a place of, I am attempting to define what a responsible therapeutic AI might do. And through the course of that, because I talked to her about this afterwards too, I believe she thinks it might be possible, but is not convinced that it is in fact possible for there to be an AI therapy system that is actually useful. And one of the things she brought up was that traditional AI chatbots, the big ones, are all programmed to gas you up in order to keep you using them, right? They're programmed to make you want to continue to interact with them. And so they do things that are really bad for your mental health and that can exacerbate and cause new problems, right, because of the way these are programmed. So any responsible AI therapy chatbot would have to
Starting point is 00:29:36 not do that. Which, I'm like, that is true, that you can't be a useful therapeutic tool that only praises people, right? That's just not a thing. But when I came up to her afterwards, I was like, my issue is, I think you're right about that. But I also think, if you're saying the AI therapy bot is going to be a separate product that does not do these things: number one, it's a high bar to get people to pay money for a tool when they already have the chatbot. And number two, if the chatbot that is good for them doesn't do the thing that makes it addictive, people will continue to use the addictive one for therapy. And she said, yeah, that's my worry too. And so I came away from it being like, she's trying to explore if this can be done. And my conclusion, based on her exploration, is it can't.
Starting point is 00:30:20 But it's so interesting that she said that, because, and this is me hearing you talk about her, she sounds like the only person who was thinking about this at all at this convention. And she was the only person on the panel. It was a speech, not a panel, really. Yeah. Of thinking about, what are the social relations behind these technologies? Because, of course, that's the main question here. A technology that can have a conversation with you is one thing. That really doesn't seem like the core of the problem so much as, well, all the machines that are having conversations are driven by very specific incentives, you know, to interact with their users in a particular way that has everything to do with the social
Starting point is 00:31:00 relationships of their production and use. And, you know, the notion that technology would have any connection to social relations at all is completely absent from any discussion of any product I've seen here. Speaking of social relations between products and human beings, here's some ads. We're back, and God, I really love those Chumba Casino ads. They remind me that whenever I'm out in the world, sitting down with my loved ones, you know, watching the big game, I could be gambling. And kind of every other moment of my life that I'm not gambling is wasted. Robert, have you heard of Kalshi? No. So you know about politics, right? I love politics. And you know about insider trading, right? I love insider trading.
Starting point is 00:31:57 What if you could do insider trading about all of politics on the exclusive information that you will learn as a journalist? Wow, that sounds legal. It shockingly is. There's actually zero federal or state regulation affecting this whatsoever. That's the Kalshi guarantee. There's no regulatory mechanism that exists on a state level to regulate this behavior. Now, Garrison, you say that, but that's literally just what's printed above their booth. Let's talk about maybe the worst product that I saw at CES: Child Free Trust. Boy. So what?
Starting point is 00:32:30 I don't know. Is it like a software that lets you make a trust? This software was marketed towards child-free people. They lead with: 25% of Americans don't have children and don't intend to have them. Hell yeah. Yeah, and so their idea, as they write here, is that Child Free Trust is the first comprehensive nationwide solution
Starting point is 00:32:52 providing medical and financial POA, executor and trustee representation for child free and permanently childless people. So if you are childless and you don't want to burden your loved ones, that's their wording, with, you know, your estate plan when you die, this is a company that will do that for you. What's interesting, though, is that I asked them, I said, well, you know, this presumably already happens.
Starting point is 00:33:19 What is in place? I have a trust and no children. Yes. Yeah. Well, when the state appoints someone to handle this, they will first see if you have any loved ones who would want to take on these duties. And obviously, people without kids aren't capable of love, so. Yeah.
Starting point is 00:33:36 So, strangely, it's a product for something we already have: a public service that searches for loved ones who could take on these responsibilities, and if it cannot find those, takes it on as a public good. So it was a product completely without purpose. In order to use this, you just have to be intent on separating yourself from society in this way. You have no loved ones, presumably, or you don't want them involved in your estate planning. You also don't want the state involved. So it requires this third-party company. It was just a very strange product, the way they presented it. Well, I mean, again, I have a trust and a will. If you don't have a kid and you don't have,
Starting point is 00:34:21 like, you're not married or you don't have a surviving spouse, then yes, the state will appoint somebody, but that process is slower and more expensive. If you want to avoid that cost, having a trust and a will is not an unreasonable thing, especially if you have assets and you want to make sure they go to a specific place, if you want to donate them somewhere or whatever. Well, and like, a lawyer can handle that. A lawyer should handle that, is what I'm saying. You shouldn't use an app. Specifically, what they do that the lawyer can't is they provide a service for the corporation
Starting point is 00:34:51 to execute power of attorney. And that is the main thing. So this is the most antisocial service I've seen at all of CES, because it's built on this idea that you have no kids and you are so separated from the rest of your family, like, you don't trust any siblings, you don't trust a spouse, a partner. Maybe you don't have one. You don't trust parents.
Starting point is 00:35:13 You don't even trust a friend to do this for you. Instead, you turn to a company, a private company. You don't even trust the state, right? Because the state can handle this. It is a private corporation that is going to handle your will, your estate, and power of attorney. That POA thing is fucked up. That's what really got me. Because I asked them, like, a lawyer can handle all this.
Starting point is 00:35:36 And like, well, no, a regular lawyer can't be power of attorney. And I was like, oh, this is the core of your product, actually: it's for people who are so antisocial, who have so separated themselves, that they don't trust anyone who could do that. They don't have any loved ones, really. It's not just about being child-free. No.
Starting point is 00:35:53 It's about, you do not exist in a social network whatsoever. Because the kind of people who might need this, or might want this, are not the kind of people who use apps. Because the group of people who definitely don't have kids and also may not have any living friends
Starting point is 00:36:10 are people who are incredibly elderly. It's not even a factor of like their life is bleak. It's just you live way too long. You literally don't have anyone left that you knew. But they're not going to use an app. Like that's just not how they think about if they don't have a lawyer,
Starting point is 00:36:23 they'll have the state handle it. But this 104-year-old Okinawan woman is not going to download the child-free app to, like, handle this for her. There's a chart on the bottom that lists features and which different versions have which features,
Starting point is 00:36:38 and the three versions are Child Free Trust, Trust and Will, and then FreeWill as one word, but the W is capitalized. I just like seeing free will and then checks and X's at the bottom. You don't get free will on all the options, sorry. What is FreeWill as a service, Garrison? Are they saying the machine has it? Or is it literally a free will-making service? I think it's creating a will for free, and they're just calling it FreeWill.
Starting point is 00:37:00 I think that's what they're doing. Okay. I mean, but yeah. And this was in Eureka Park. Yeah.
Starting point is 00:37:12 It was really bizarre. Yeah. There's a fun bit that you could do if you had the money to show up and just do bits at CES, where it's like, this company can handle all of your end-of-life care and decisions, and it's just a booth with a handgun on a table. So, to close, let's just discuss what CES kind of means in general. We already kind of discussed the AI angle of this. And it's something that we've seen throughout this show: these massive banners hanging everywhere about how
Starting point is 00:37:40 CES is where innovators show up. Sure, yeah, absolutely. And how everything's based around innovation and creativity. This is where everything descends from. And this sort of tech idealism that the world is based around these concepts of innovation and creativity. And they do
Starting point is 00:37:56 not mention any physical way that actually comes into the world, or the sort of mechanisms of the world that allow innovation to take place. And Ben, we've been talking a lot about this. Yes, for the past, like, two days. I mean, it's all, we're solving all the problems of the world and we're fixing everything
Starting point is 00:38:14 and it's all eternal sunshine. And where that comes from is innovation and creativity, but those are just, yeah, totally abstract quantities. There's not even a subject given of, like, innovators. Well, who is that? If you have a product, you're innovating. Yeah. Is that the owner? Is that the workers who make it?
Starting point is 00:38:35 I mean, no mention of labor at any point in any of this. I mean, that's a given. Oh, that's not true. They talked about all the laborers you could replace. Yeah, all right. We don't need them anymore. Innovation has come full circle. It produces itself now.
Starting point is 00:38:49 Yeah, just this abstract quantum of... I mean, it really is kind of interesting. It does become a blur of who is the magical font producing all of this stuff. Sometimes it almost seems like it's the consumer. It's sort of suggesting, like, it's actually you who creates all of this. It was very vague, very strange. Mm-hmm. We went to that one panel about trying to address underserved people who are, like, cut out of tech and cut out of all of these industries. Yeah, I mean, it was very nondescript about who exactly that was.
Starting point is 00:39:25 Yeah, they had people talking, there were paralyzed veterans, someone from the NAACP. And again, there was no actual discussion of the social relationships behind technology. It was just entirely about: this technology is here, it is the first priority, so everything else has to follow after. And yeah, they would all talk about what technology could be used for, but were entirely nondescript about where that process comes from, who's making those decisions, where those centers of power are. The reach for existing brands and companies that make real things to have an AI angle was the most obvious and tortured thing that I saw.
Starting point is 00:40:08 And one of my favorite booths every year is the Jackery booth. Jackery makes batteries and solar panels. And they make pretty good batteries and solar panels for like expeditions, for camping. Like they're rugged. And I use them. They're good products. And this year they had the new edition of all their batteries, the new edition of all
Starting point is 00:40:23 their solar panels. And as generally happens with technology, everything's a little better than it was last year. But there's not much room for AI, aside from, like, the batteries have AI, by which they mean there's a learning algorithm that can determine how to optimize aspects of, like, power draw or whatever. Sure, that's not really AI in the way that the AI industry means it, but, sure, sounds real.
Starting point is 00:40:46 But because that wasn't enough, they had this thing that they called their Mars rover, which was not a Mars rover. I don't think it will ever go to Mars. It does not look like it could survive on Mars. But it is a rover that is a big battery on wheels that is intelligent and can drive itself and has solar panels that slide out. And the use case for this was, it will travel around and can go to where the sun is in order to charge itself up, and then head to you to offer you outlets when you're doing
Starting point is 00:41:11 work. And it's like, a robot that moves... I can see, like, two points in my life where I might have gotten use out of this. Who's going to buy this? For what? Where will it be deployed? It just roams around outside, this expensive machine that does not look like it should get rained on too much, and finds the sun to charge itself up, and then heads over to you to charge your devices. It can't charge a home. It doesn't power a house. It's like a little robot. Is it for, like, camping? I don't know. That was unclear.
Starting point is 00:41:41 They showed it being used and they showed it as like, I'm outside using power tools. The robot came up to me so I could plug in. I guess you could take it like the park? You could take it to... A park? But it's big. It is pretty big.
Starting point is 00:41:52 It's like a sizable machine. It probably weighed 80 pounds. And again, it's impressive that it can go seek out the sun to charge itself up. It's impressive that you can call it with an app and it will come over to you and there's an outlet. But who will buy this? Why? When? That is an odd one. Again, I can think, working in my yard, working outside, you know, when I'm shearing the goats or whatever, I have had to carry a battery with me because there's not an outlet out there and the shears need a battery. And yeah, I guess it would be easier if the robot moved there, but this has to be like $20,000. I'm not going to buy a robot to do that. I can just pick up a battery and walk with it 100 feet. Who will use this? Why? It's not going to Mars. I assure you it's not going to Mars. Again, I wouldn't want it to be left out in the rain. And I love the Jackery products. They make good stuff. And the fact is, like, yeah, you clearly scrambled to make this so that you had a thing that can compete with all the other AI things. I wish you were just devoting your lives to making better solar panels, which is what I want from you. Anyway.
Starting point is 00:43:03 I'm just curious: how many of the companies here are profiting? Do they... Yeah, I mean, some of them. Some of them. But a lot, I mean... Some of them are funded with VC money and have not made a profit. Some of them are funded with VC money
Starting point is 00:43:16 and have been losing money for years. And then there's also, like, Lenovo makes... It's a company. It makes a profit. They make products people buy, you know? Jackery makes products people buy. There's also a lot of startups. Like the Eureka Park section
Starting point is 00:43:26 that we've been referring to, like the stuff in the bottom floor of the Venetian, that those are, you know, a lot of startup companies who are looking for investors as well. So, yeah, it is definitely a mix. Some of them are trying to do like business to business sales.
Starting point is 00:43:38 Some of them are more consumer facing. Some of them are looking for investors. Some of them are profitable and other ones are trying to boost their stock price by being here, kind of like Cloyd at LG. But I think what's really important is that everyone here is an innovator. And you know why we know that they're innovators? Because they've shown up. And only innovators show up here.
Starting point is 00:43:58 And they're the real driving force of the economy. And honestly, I was feeling bad about myself until I saw that banner and realized that I am an innovator. You know you are an innovator. Thank you. I showed up and I figured out how to be the guy on the most kratom at the, uh, at the CES show floor. No, I mean, CES is so interesting, too. I innovated. Like, it runs both on this technology idealism where everything is based on, you know, lone geniuses, you know, your Steve Jobs, having, like, an idea, and he is the innovator, and everything descends from the idea. It's like this tech Platonism. So, like, that's one side of it.
Starting point is 00:44:35 You also have, like, the tech accelerationists at CES, where it's like they occupy this position of being so pro-technology, no matter what downsides the current iteration might have, because they need someone to hold that position in order for this thing to move forward. They know that there's concerns around data protection, but their opinion is that it doesn't matter. There are problems there, but we, the people here at CES, the innovators, need to ignore them because we have to push forward. And this is what, like, the Austrian Secretary of State said at that one panel I went to.
Starting point is 00:45:08 It was like, data protection is a problem, but it gets in the way of innovation. Yeah. And that's literally what he said. And so you have this tech-optimist, accelerationist viewpoint of, like, technology will be better, but in order for it to get better and save us eventually, it's kind of shitty and has some problems now.
Starting point is 00:45:27 But we need to push forward through that all the way. Like, we can't go slowly. It has to go forward. So they adopt this viewpoint because they need someone to hold this tech-optimism viewpoint in order for the process to, like, unfold. Yeah, there's something strangely clear-eyed about that, in the way that it's like, yeah, if you're limiting your view to the system of capitalism, the whole thing goes into crisis if you're not squeezing a little more juice out of the orange. And if this is what it takes to do that, then full steam ahead. They're lucid about that.
Starting point is 00:46:02 Yeah. Yeah. You know what else is lucid? Us saying it's time to end this fucking podcast. Goodbye. Another CES miracle. We have kind of survived. It Could Happen here is a production of Cool Zone Media.
Starting point is 00:46:19 For more podcasts from Cool Zone Media, visit our website, Coolzonemedia.com, or check us out on the IHeart Radio app, Apple Podcasts, or wherever you listen to podcasts. You can now find sources for It Could Happen here listed directly in episode descriptions. Thanks for listening. is an IHeart podcast. Guaranteed human.
