The Morning Stream - TMS 2715: Noodle Tax

Episode Date: October 7, 2024

Spurb Your Enthusiasm. Rocket tubes! Wickedly Talented Adele Denim. I Do Like Geneseeeeeeeeee. Assaulted Pecans. Pervasive Nut. Curving Our Wheels. Let's test the ship's bikini. MRI pot raid. Turns out ...the gun was cancer free. Micronauts FIX. They're still married, for now. Shape Up, Shane. Billionaire Bootlicker, Thursdays on TLC. Perfect Paint Job with Bobby and more on this episode of The Morning Stream. Hosted on Acast. See acast.com/privacy for more information.

Transcript
Starting point is 00:00:00 The milkman broke my front door and got chocolate milk all over everything. But you know what? I'm not mad because we have a bunch of rad patrons at patreon.com slash TMS. Coming up on the morning stream, spurb your enthusiasm. Rocket tubes. Wickedly talented Adele Denim. I do like Genesee.
Starting point is 00:00:22 Assaulted pecans. Pervasive nut. Curving your wheels. Let's test the ship's bikini. MRI pot raid Turns out the gun was cancer free Thank goodness Micronauts fix
Starting point is 00:00:33 They're still married
For now
Shape up Shane
Billionaire bootlicker
Thursdays on TLC
Perfect Paint Job with Bobby
And more
On this episode of
Starting point is 00:00:42 The Morning Stream Are you sorry sir That you brought your son along To see Alien No ma'am I think he should have seen it It's something that He needs to know
Starting point is 00:00:51 That things could like That could happen in life That could be a true story Did the movie scare you at all Yes ma'am it did What happens to Nicole Kinman? The morning stream. Maybe I can help you.
Starting point is 00:01:13 I am Boba Fett. Hello, everybody, and welcome to TMS. It is the morning stream for October 7th. It's a Monday, 24. I'm Scott Johnson. That's Brian Abbott. Hi, Brian. It is. Hello. Hi.
Starting point is 00:01:35 Hi. For most listeners, you never even saw us leave, but Brian was gone on a weekend. It had a busy weekend, you know? I did have a busy weekend. My goodness. So much stuff. We need to report. We need to report on that wedding first. Can you get into that and tell us how that went? For sure. Yeah, so wedding basically right after the show Thursday, did a little bit of a little bit of work here at the house and then had to get up to the mountains. for this wedding. Gorgeous location, this place called the Inn at the Pines,
Starting point is 00:02:05 right off of I-70 at Genesee, if you're familiar with Denver, you know where Genesee is at, and just a really, really pretty. Just getting into the mountains. So you have some places right there in Genesee have this gorgeous view of the entire Denver metro area. It's just absolutely gorgeous. So John Goodman calls Genesis, Genesis, Genesee, in O Brother, Where Art Thou, when he's talking about the Bible. Oh, does he? He calls it Genesee, yeah. He said,
Starting point is 00:02:31 I'll tell you what, I was reading Genesee, whatever he says. And so whenever I hear that, it's, that's all I can hear. So I'll have to get used to your story, even though I'm thinking of John Goodman running around eating chicken. It'll be fine. Well, it'll be the last time I mentioned Genesis anyway, because the wedding, um, got up there and I, you know, had been practicing and, uh, going through the stuff. We did the rehearsal and got all my nerves out then and thought, all right, cool, no problem. Going to be a piece of cake. And it was, right? Got up at the top of the parking lot and then led the procession of the groom and then all the groomsmen and then all the bridesmaids down to the thing.
Starting point is 00:03:13 And there was quite a bit of time where I was up on the, it's not really a pulpit. I guess it's a stage, even though it's the same. level as everything else. But I'm up in front of all the attendants. And I'm calm. I'm like, all right, so excited, do this. But nobody's told my heart to be calm. And so my heart is like up in my throat.
Starting point is 00:03:39 Doop, do, do, do, do, do. I'm like, well, this is weird. I'm totally feeling calm, but why, like, I'm doing deep breaths. Let's see if I can get, get, you know, get the heart rate down, whatever. Sure. But basically was able to ad lib a couple things and had the audience laughing, uh, or, or not laughing but like chuckling, because obviously the show, the stars of the show, are the bride and groom,
Starting point is 00:04:08 and I'm not taking any, any of their, uh, their, their show light, their showcase away, their spotlight away. But, um, uh, definitely had, definitely had a great, a great time doing it, and, um, did well. They're still married, so at least the stuff I did worked, uh, and those two are still married three days later. So we didn't have some kind of rom-com situation where they were left at the altar and some other guy burst in and... Yeah, no. No, and I didn't do that.
Starting point is 00:04:35 I left out... There's a lot of things that I left out of the pronouncement, like, I'm not going to do the whole, if anybody has any reason why these two should not be married, speak now or forever hold your peace. Nobody ever wants that in there. I've never seen it in a real wedding, not even once.
Starting point is 00:04:51 Right, no. No, it's always, it's always the joke, you know, TV weddings, movie weddings, all that sort of thing. Yeah. Elaine! Yeah. But then you've also got the honor and obey. Obviously, that gets left out.
Starting point is 00:05:06 That's silly. But, no, I didn't even have kind of the traditional, like, in sickness and in health, in richness and in riches and in poor, richer and poorer, blah, blah, blah. I had a whole different thing written up, and they loved it. And everybody kept coming up to me during the reception and saying that was great, that was awesome. Tristan asked me if I would do his when the time comes. Nice. Apparently this was kind of an audition and I didn't know about it.
Starting point is 00:05:33 Oh, they were trying. They were going to see how you did before they asked. They wanted to see how I did and then they were going to kind of decide from there. Sure. Yeah. So there we go. Yeah. Done.
Starting point is 00:05:43 And then stayed out really late that night at the reception because we wanted to dance with everybody and have a good time. And then at 4 o'clock in the morning, after three and a half hours of sleep, we got driven to the airport by Crazy Neighbor to catch a 6.30 a.m. flight to Atlanta, where we met up with, uh, that's why I was going to ask you if you ended up sleeping at all or if you just like push through, you know? I, I, we slept. Um, I slept because Tina needed to sleep. Tina can't sleep on planes. She can't sleep in cars. So I knew that she was going to have to sleep. And I didn't want to try and be the, you know,
Starting point is 00:06:25 make any accidental noise in the other room that would have interrupted the little sleep that she got. Yeah, I mean, you can, if you're only going to get three hours, you'll take three hours. It's fine. Exactly. Yeah, I'll take it. Yeah. So, uh, met up with, uh, Chuck and Amy and the Duces and Matt from BC, and, um, and then they drove us to, uh, Savannah, three and a half hours in the car to Savannah. Stopped at a Buckees along the way, um, got, got pecans. I had so many pecans on the trip, Scott. So at Buckees, we got the, they're glazed and roasted pecans, which are sweet. And we're trying to figure out, well, are these, they're made in Georgia? Are they Georgia pecans? And then we realize, well, Buckees is headquartered in Texas. So these could just as easily be Texas pecans.
Starting point is 00:07:12 Sure. So no, no benefit there. But then we got some roasted and salted ones. We got some raw Georgia pecans. I had pecan encrusted French toast yesterday morning. I had pecan-crusted fried chicken on Saturday night. I didn't realize it was that pervasive of a nut down there. I know. It really is. And then got some gophers,
Starting point is 00:07:39 got two big boxes of gophers when we were walking around. It must have been Friday night. And the large box of milk chocolate gophers, no idea what happened to those. Like somebody, and that was a pound
Starting point is 00:08:01 of milk chocolate gophers. I grabbed the dark chocolate gophers and brought it over. We had two separate rooms in the same building with like five bedrooms in each one. So we had the whole place loaded up. And Dunaway, friggin rocked it. He made sure both rooms had plenty of retro video games on the TV. Like both TVs had their own systems of video games. He brought a separate setup with his giant Atari
Starting point is 00:08:31 fight stick joystick thing that has buttons and a roller ball and all that stuff. Can't get those anymore, and tried. They're awesome. No, I know, I know. And yeah, you can, you can RetroPie those things so that they play a thousand games. It's so, so cool. Yeah. Um, but, uh, um, uh, Ben, Brian Coffered, um, oh, finally got to meet Audra. Oh, nice. Uh, yeah, his forever fiancee. She is freaking awesome. Yeah, she is so much his better half, you know what I mean?
Starting point is 00:09:02 Take that, don't we? Yeah. Of course, Natalie, September, and Rob, Shane was there, but had to leave a little bit early. Unfortunately, he had to leave on Saturday, so we did a ghost tour of Savannah on a tram that was hosted by a dude who in his other life is an Elvis impersonator. So he did grace us with a little bit of, are you lonesome tonight? Tonda and Hondo We had both of those guys at the same place It was like a Ron Silver
Starting point is 00:09:31 Touching himself moment from Time Cop But fortunately Fortunately worked out okay No gelatinous blob of purple goo Leftover when they were together In the same room That's great Exactly
Starting point is 00:09:41 Daniel and his wife Jonathan Bush who had just gotten married A month ago Lives in Savannah area He came to this thing Yeah it was a packed A packed house
Starting point is 00:09:54 Did Shane do a slow rotation with his camera or no? Did we not get one of those? We did not see one of those. So he might have done one of those at another time. But we even walked around with him, I was waiting for him to take a, you know, a rotating video selfie and he never did. Shane, I'm, I mean, I won't say I'm disappointed, but I'm a little, I'm a little disappointed, just a little. You know, I follow that guy on social meet, all of his video things. I didn't see any video like that, and that made me sad.
Starting point is 00:10:22 So, Shane, shape up, Shane. I'm going to call you shape up Shane. Shane, shape up. Shape up, Shane. Yeah. What else? The, had great dinners, breakfast with everybody. I mean, it was just a really, really good time.
Starting point is 00:10:37 Chuck and September made this incredible, like, egg bites and bacon that September smoked herself, sausage that she made. Chuck made a bread pudding for breakfast that was incredible. Um, uh, uh, went and got some coffee from a local place that he knew called Coffee Fox that was legit. Mm. Totally legit good. Coffee Fox. I like the name. Oh, yeah.
Starting point is 00:11:04 Jonathan gave me a, uh, it's a knife. You can't really see it. It's teeny tiny. And it's, and it's nearly invisible on this wine shirt, but it is a, um, a butcher's knife with the Overlook from The Shining. So it's a, oh, no way. So it's supposed to be like the knife that, uh, he, uh, Jack Torrance uses through the door and stuff. Is that the same?
Starting point is 00:11:24 Exactly. That's awesome. But it's just got the Overlook carpet pattern, that memorable hexagonal carpet pattern. I love that. That is so cool. Very cool. That's great. Yeah, it was a good time.
Starting point is 00:11:39 Man, here's the only complaint. I have one big complaint. I hope everybody's ready for now. Now comes the time of complaints. It's a Dunaway-sized complaint. Oh, just kidding. Go ahead. That's right.
Starting point is 00:11:50 Oh, yeah, there would have been an axe right to get through the door. definitely says one more day should have been Thursday like it felt so quick the whole I mean the whole experience was so quick it was like we flew out there we flew back on the Friday and the Sunday we really just had one day
Starting point is 00:12:05 to to be out there really and I know a lot of people you know probably couldn't do an extra day price-wise or getting off work time or something like that and I know it wasn't even you know it's not something that
Starting point is 00:12:21 was even suggested at the beginning, but I just enjoyed spending so much time with all these people that one more day would have been even better. And then, I mean, it's funny because even in this year's case, you wouldn't have been able to anyway because you had that wedding. No, the wedding would have preempted us being able to be out there on a Thursday anyway. So as we're talking about 2025, I think adding a Thursday in there would be ideal.
Starting point is 00:12:47 That's awesome. Glad you had a good time. And maybe a place that's cheaper to fly into. that we don't have that three-hour we don't have like seven hours of driving back and forth for Amy and Chuck to cart people there.
Starting point is 00:13:01 Yeah, that's a lot. Was it the van I was thinking? Or it was like a big old van, right? It was a big old transit van, yeah. And Chuck did a great job, drove the whole way, both ways, never, you know, like we all offer it, hey, if you need take a break, if you need anyone else to drive,
Starting point is 00:13:15 we can do it. But he took the wheel and drives a hard bargain, as they say. Yeah, exactly. That's awesome. Well, well done. I, uh, spent, I spent my weekend, uh, we were supposed to have a giant party for Phoebe's second birthday. Yeah. It didn't happen. She got sick and barfed all night. Oh, yeah. We don't know what's going on there. Something ripping through her house over there. When you send your kids to school, this is just what happens, because Van brings home whatever he brings home. Of course, they're little petri dishes at this point. Yeah. So that got canceled, but we saw her yesterday, so that would have been Saturday, but she was
Starting point is 00:13:50 She was much better by yesterday, so we got to see her yesterday. Anyway, we have been, because the thing got canceled, we've had a couple of nights open up. And we're like, well, should we keep our little alien watching thing going? And so we did. So we watched Alien 4, or Alien Resurrection, as it was known. And also last night, Prometheus. So here's my takeaway from this experience so far with two people who haven't seen any of these, Carter and her friend Alicia, who's currently staying with us.
Starting point is 00:14:20 They're enjoying the hell out of it, and that's great. But the thing I'm learning here is we came to these sequels with so much baggage, you and I and everybody in our generation, because we see the first one we go, horror masterpiece, will there be anything as good? Probably not. That's the attitude we all had. And then the second one comes, we're like, oh, my gosh, I can't believe it. They made an action horror movie and an incredible sequel.
Starting point is 00:14:42 I can't believe they pulled it off. And then three comes, and we're like, oh, disappointment, disappointment. Four comes, oh, disappointment, disappointment. And they don't come at this with any of that baggage. They're just coming at this with like, let's just see what these are. Yeah. And they're looking at it as a whole, like they want to see the entirety of it. And they're enjoying it so much.
Starting point is 00:15:01 And I can't help but enjoy it with them in a way that I haven't, even with our film sack viewing of some of these. Sure. I'm enjoying the hell out of these. Like there's stuff in four that there's problems with four, big problems. Sure, sure. But there's stuff in there that's awesome. There's some really cool shit.
Starting point is 00:15:18 And Prometheus in particular, which I already liked. I'm not one of those anti-Prometheus people. I really like Prometheus. But I came to it this viewing, even enjoying it even more because, again, I'm watching it with people with open minds and none of the baggage. We're just like watching these things for what they are and for when they are and we're having a ball. Here is the one thing I will say. My biggest takeaway from Prometheus, and it makes sense because it's so much time later. Sure. But cinematography-wise, holy shit, what an upgrade. It's such a gorgeous film. It's a beautiful film. It's like not even fair. It's not even fair. He gets to, it gets to just bring along all his Gladiator
Starting point is 00:16:00 people with him. It's the first, it's really the first film that, you know, doesn't take place almost entirely inside a dang crummy, musty, dirty spaceship. And, and the one time they land on planets, it's like, oh no, it's a permanent night with the clouds and storms and stuff like that.
Starting point is 00:16:17 It's also crazy how I've mixed things up. There are things that happen in Covenant, which we haven't watched with them yet, but I've seen recent enough. There are things in Covenant that I swear we're in Prometheus. I got it all mixed up. There's this whole thing at the end with the guy's head blowing out the back and the thing.
Starting point is 00:16:36 I was sure that was part of the other movie. I was sure that them out in a field with a bunch of white-looking little aliens running around the small little dog-sized ones. I swore that was Prometheus. It's not. So I'm excited to see we're going to roll right into that sometime this week. Good deal.
Starting point is 00:16:51 It's going to be great. I loved Prometheus more on my second viewing, on my home viewing, than I did when I saw it in theaters. And from my, you know, the Alien super special that I did for Film Sack, my monthly bonus special, you know, I rated them, or I ranked them all, and put Resurrection last with the caveat that even though Resurrection is last, I still say it's like a B-minus movie. It just happens to be my least favorite, or maybe I said C-plus, I remember, but it just happens to be my least favorite among a whole cluster of fantastic movies. And something has to be at the bottom, and it's Resurrection, and it's still a great movie. Yeah, same. Resurrection's at
Starting point is 00:17:34 the bottom for me, but it's not that I don't actually have a good time in there. Yeah, yeah. I don't need to see Dan Hedaya's hairy back again, but it's fine, whatever. Well, who does, right? The guy is so hairy. It's a Loretta, Loretta. My wife looked at it. She hadn't seen him forever. And she goes, oh my gosh, that's Nick Tortelli. I'm like, yeah, that is. Yeah, you're right. Well, anyway.
Starting point is 00:17:55 I'm like, yeah, that is. Yeah, you're right. Well, anyway. Carla, I'm still loving you, Carla. Having a ball with it. And we're screaming our, I guess we only have, because we're not doing AVPs at all. So we're going to go straight from Covenant to Romulus.
Starting point is 00:18:10 Yeah. Romulus hits digital on the 15th. So we're just about on time. It's going to work out quick. Perfect timing. I got a quick mailman update from our favorite mailman. This is Bob the Mailman and CT. He says,
Starting point is 00:18:24 Funny enough as a mailman, when we park our vehicles at a curb, we are required to turn our front wheels toward the curb. I didn't know this. Referred to as curbing our wheels. When my son was so very young, he referred to the act as spurping my tires. That has stuck with me and my wife ever since. So that's my definition of spurping.
Starting point is 00:18:45 Also, my guess would be that the shilling letter was stuck in a piece of empty equipment or lost in a facility's nether regions only to be found again and returned to you. Hope that helps. Okay. Which makes sense. Yeah. Yeah. Yeah.
Starting point is 00:18:57 Yeah. Yeah. Yeah. Yeah. Yeah. Yeah. If there's any sort of hill, and somehow your emergency brake goes out, this keeps your postal vehicle from just rolling down the hill into another car because having the wheels turned, whether
Starting point is 00:19:12 you're facing downhill or facing uphill will always rotate you. So this would apply to any car, right? Like if you're parking on a hill. Any car, yeah. And in Denver, you learn to do that a lot. I'm surprised, yeah. You probably were taught that in Utah as well. Oh, yeah.
Starting point is 00:19:28 Yeah, if you're parking on a hill, do that, even if you're using the emergency brake, because sometimes those emergency brakes go out. Yeah. You want to make sure. There's a part of Salt Lake City we call the Avenues, and it's just constant steep hill. It reminds me of San Francisco. It's like that. And, uh, yeah, I assume this is true in San Francisco for whoever is dumb enough to drive there. But you should always... So let's learn a little something here today, listeners. Yes,
Starting point is 00:19:53 aim them toward, spurp your tires toward the curb. Right, right. Basically, like, if you are facing downhill, uh, then, and, uh, the curb is to your right, turn your wheels to the right. There you go. If you're facing uphill, turn your wheels to the left. Yeah, I think it's on our test. I think you had to get that right to get a license here. I think so. Yeah. I think so. Well, yeah, San Francisco, big time. Yeah, makes sense. But if you're in, I don't know, Kansas, you'd probably get away with it. Maybe not so much. Nebraska. Yeah. What's a curb? It's a place to put your enthusiasm, you know? Yeah, exactly. I like spurp your enthusiasm. That's a, there's a new idea. There you go. Somebody bang yes. Spurper. We had a lot of, over the weekend, we would do bang yes and say a title, even though no one was, no one was tracking titles. No one was
Starting point is 00:20:41 casting votes. But, uh, you didn't have Preston at home keeping them on the sheet. We did not. No, nobody was. Unfortunately, nobody was keeping track. That's a shame. Quite a few hilarious ones, Amy and Chuck. Those ones are lost. Really, really funny. Lost to time, those. Yeah. We watched, uh, Dunaway made us watch the movie Return to Oz. Oh, Lord. Which is a very, very dark, twisted horror sequel to, uh, The Wizard of Oz. I'm shocked that he would do such a thing. Shocked. I know. Amazing, right? How could he? How dare he? How dare he? And, uh, not in his wheelhouse at all. Speaking of which, I can hear his creaky headset now. Yeah.
Starting point is 00:21:23 Hey, Brian Dunaway. What are you doing, man? Oh, hi, Scott and Brian. I'm just trying to recover from our little trip to Savannah. It was so much fun. I miss everybody already, especially those cookies. Oh, that's, that's where they ended up. Damn it. I knew it. I knew it. Oh, yeah, we had boxes of cookies. Oh, the cookies.
Starting point is 00:21:44 That's right. The big Tupperware is a... I thought you're talking about the gophers that disappeared. No, I believe that was. Who made the cookies? Chuck did? Was it Shaladin or was it, uh, yeah, I think is, was it his wife? I think it talked for just a couple of times, just a couple of times.
Starting point is 00:22:00 I didn't get to talk to, we like to give shout out to. Daniel, Daniel, sorry. It was Daniel. I kept going back and forth. I accidentally called Daniel, Ben, sometimes and Ben Daniel sometimes, and I hope Ben Danielson. Hoping, hoping they both forgive me. I had to remember a Sutrano because he helped me carry in that big, that CRT,
Starting point is 00:22:18 so can't forget that. And Honda helped, too. So, yeah, that was good work. Good work. Somebody's got to carry your big stupid monitors that you find on the side of the road. And who else are going to do it? I was like, I can't carry this myself. What am I?
Starting point is 00:22:29 Yeah, who are you? Yeah, what am I, upper body strength? What are you doing here? Get out of here. Dunaway's here. We're going to play a game. It's called the Monday Half-asses. And Brian Ibbott here will now explain what it
Starting point is 00:22:40 is and who might win what. Okay, welcome. The Morning Half-ass is a trivia game where I'm actually going to be giving you the answers. I'm going to give Scott and Brandy a category and six possible answers, three of which are correct and three of which, like Fairuza Balk's movie Return to Oz, are far, far, uh, just in the wrong, just in the wrong. Depending on how confident they feel in the category, they can provide one, two, or three guesses. But if they get any of those guesses wrong, they get zero points for that round. Get one right gets you a point. Two right gets you two, three points. I'm sorry, two right, three points, three right, five points. There you go.
Starting point is 00:23:12 We'll add up all those points at the end and the person with the most points. Wins prizes for their contestant. And I'm going to tell you who these contestants are right now. Scott, you're playing for Luke in PA. Nice. Hey, Luke. Otherwise known as a whatnot, frequent community member or whatnot. And Brian, you're playing for Andrew N. in Lakeland, Florida.
Starting point is 00:23:35 Oh, nice. Hope he's staying safe down there. I hope so, too, with what Milton is the one that's coming towards them. Yeah, I believe that Milton, category four, last time I looked at, think, hey, no more hurricanes. Appreciate it. Oh, exactly. Tis the season. It makes landfall, what, Thursday, I think, is the expected, yeah, going to be bad.
Starting point is 00:23:54 They'll probably hook my broadband up the same day. Yeah, got to go. It'll go down, yeah. All right, let's get to the game here. Let's start off with question number one. We're going to begin with, oh, yeah. So, you know, we've got something coming up here in, what, 30 days, under 30 days, 20-some days. Are you referring to all hallows?
Starting point is 00:24:19 Let's say it's that, but really, you know, we know what it is. How about other things that happen every four years? Oh, that thing. Yeah. That happened every four years. Just as scary. Yes, exactly. Which of these happen every four years?
Starting point is 00:24:33 The National Boy Scout Jamboree, presidential elections in France. the Cricket World Cup Catholic Jubilee Years, the Van Clyburn Piano Competition, and the Pritzler Prize, sorry, the Pritzker Prize for Architecture is awarded. Should we be seeing these on the... I know, I see a bunch of blanks. I'm clicking the button right now.
Starting point is 00:24:55 There we go. Jeez. I like to read them out. Somebody has a wedding and travels all over the country, comes back, and wants to pretend. For the first round, I always like to read them out first before you see them. All right. Let's see. National Boy Scout Jamboree actually sounds right. I used to do stuff with that. I used to do the jamboree.
Starting point is 00:25:17 The Van Cliburn piano competition. You know, if there's one thing I keep track of, it's the Van Cliburn. Yeah, we call it the Cliburn, you know, here in the know, with people in the know. Of course, you got to keep it, the short version, sure. Yeah. Catholic Jubilee years. I have no idea on these. I'm just, I'm just poking buttons. I'm hitting two. I don't know. I don't know. The two of you selected four totally different ones
Starting point is 00:25:45 Which means at least one of you is getting something wrong Right just by by mathematical computations National Boy Scout Jamboree Scott picked that one And that is correct so his memory on that one is very very good I'm screwed unless they got the other two right not likely Also not looking good Catholic Jubilee years, whatever the hell that thing is. Is it correct?
Starting point is 00:26:11 Scott was right with the Van Clyburn piano competition. I totally guessed that one. I have no idea. Good job. The Cricket World Cup was the other one that apparently happens every four years. I don't know how often you, you, oh, presidential elections in France for every five years. Catholic Jubilee years are irregular, so not on a fixed schedule. And the Pritzker Architecture Prize is awarded annually.
Starting point is 00:26:34 So every year, you could win a Pritzker Architecture Prize. Well, start working on your submission for next year. I was going to say, I got a little piece of paper. It's early stages, but I'm going to work for it. I'm going to get there. And then the Van Clyburn. I'm also getting ready for that. So I've got a lot going on.
Starting point is 00:26:49 Oh, good. Yeah, yeah, yeah. Tickle those Ivories, Scott. Let's get to question number two. Names that are also letters in the NATO Radio alphabet. Which of these are also letters in the NATO radio alphabet? Victor, Walter. Jack, Oscar, Mike, and Ivan.
Starting point is 00:27:07 Oh, my Lord. So if you know your Alpha Bravo Charlie's, then you're going to be able to figure this one out without any problem. Okay, that's as close as I know. I don't even know if that's right. Well, you both said Victor, and Victor is correct, yeah. Victor is the V in the NATO alphabet. The other three you guys chose, you chose.
Starting point is 00:27:35 You chose Oscar, Mike, and Ivan, but not the same person. Chose all those, of course. Yeah, no. I don't think it'll let us. Nope. Brian, you chose Oscar and Mike. Scott, you chose Ivan. Ah, shit.
Starting point is 00:27:51 The I in the NATO alphabet. Oscar and Mike are the other ones. So Victor, Oscar, and Mike. Brian, getting five points on that. All right. Let me give you some, my, let me give you a little peek behind the curtain of my thinking, all right? I thought Ivan, because in my head, I'm thinking NATO. Well, I'm thinking NATO, they protect Europe. Like, that's the idea. It's a big European alliance. And they're like,
Starting point is 00:28:13 hey, Russia, Russia, not so, not so fast. We're going to block you from this or block you from that. So I was thinking, like, names like Victor and Ivan are very, like, Eastern European dictator names. So I probably, yeah, I should have just gone with my gut because I thought Oscar and Mike looked right, too. But that sounded like Americans doing that. Sure. Anyway, there you go, a peek behind the curtain on how Scott's brain doesn't work right. It's all right. It's all right. Let's get to question number three and let's see how well your brain works for this. I don't want to, I don't want a repeat of last week where Scott came in at the last second and whooped me, so I'm a little nervous right now.
Starting point is 00:28:51 should be yeah be nervous of course and by the way let's go into this last round here we've got brian with five points scott with three so really is anybody's game going into question number three. Inventions named after a place. Which of these are inventions that are named after a place? Hockey, pin... Oh, you cut out. After pinball. You said pin. Pinball, bikini,
Starting point is 00:29:15 Dum-Dum Bullets, Radar, and Denim. Excellent. Three of those are named after an actual place. Oh, man. I need to get big points on this one. You need two correct to tie,
Starting point is 00:29:38 or not to tie, but then you need Brian to get zero right. Dum-dum bullet. The dum-dum bullet is either the most obvious non-answer or is 100% correct. I hate that feeling. Right, I know. It's like, why do they put that there? Is it just so dumb that it's the right answer, or is it so dumb that you're going to think it's the right... Or is it so... Dum-dum is definitely the right answer. Um, I mean, how dumb-dumb do you want your suckers to be? Let's see. Um, ah, shit. If I get two right, that gives me three points. If Brian gets any wrong, that wins you the game. But if Brian
Starting point is 00:30:06 even if he gets one right guesses one gets one right you tie and then on any other do your best all right I'm gonna just I'm just no way pinballs you got pretty lucky there the first time
Starting point is 00:30:24 that was some good guesses pinball's a who knows ball with pins all right I'm doing it I'm just I'm winging it I don't know. Okay. Okay. I did pick pinball because I just like pinball.
Starting point is 00:30:34 I didn't have any eyes like one odd. I love it because I get to watch as you choose, right? I get to see what you select and what you deselect and things like that. And right away you guys both, of course, went for bikini because it's the Bikini Islands where the, yeah, or the Bikini Atoll where the, um. Where all the bikinis run around. They tested the bomb and stuff. Um, and then Scott selected denim. Yep.
Starting point is 00:30:56 By denim. And he wins. And denim is. from a place called Nemes in France, or Nima. I'm not sure how it's pronounced, N-I-M-E-S in France, but it's where Denim, de N-N-E-N-E-M. My pants are De N-N-E-M. Yep.
Starting point is 00:31:11 Which probably tells me right there how it's pronounced. It's denim. Adele Dazim. And Scott stopped there for a really long time. Yeah, I did. All right. With two correct answers, Scott stopped there. And if he would have really stopped there.
Starting point is 00:31:27 Shit. Instead of choosing hockey, which is not named after a place. Neither is pinball. The other one really was Dum-Dum Bullets. Oh, shit. Nice. Yeah, the Dum Dum Arsenal in India is where the dum-dum bullets come from.
Starting point is 00:31:41 I should have just done the two. If you would have stopped at the two, you would have had it. Congratulations, Brian, and by proxy, congratulations, Andrew N. In Lakeland, Florida, you're getting the prizes. Those prizes, by the way, are infraspace, which we tried to give away last week. somebody already had it or had some situation going on. In for space, you had something going on.
Starting point is 00:32:05 I don't care about their lives. Whatever. SpongeBob SquarePants: The Cosmic Shake. You're getting a copy of both of those. That's cool. But a whatnot, Luke in PA, you're getting a copy of Astrea: Six-Sided Oracles. All good. Every one of those games is good.
Starting point is 00:32:21 And InfraSpace, in particular, is good. I don't know about that SquarePants game, but I'm a sucker for SpongeBob stuff. So everybody wins today. Everybody, nobody loses except me. Think of it that way. Yeah. Plus they're on, big, plus he's in Bikini Bottom. Real place. Right. Right. Oh, look at that. There's the connection right there to this game.
Starting point is 00:32:37 Bikini bottom. Actual place. Dunaway, you did it again, man. You pulled it out. You made everybody look bad except you. You did great. Good job. Dude, you lost this one. Not me. I didn't do anything. I just showed up. That's true. He'll be back Wednesday, though. We'll see if we can get a little revenge. And I look forward to that. Stay out of trouble. Get your internet back and kiss our No, you. Butts. All right.
Starting point is 00:32:59 There he goes. We do have time for the news, it turns out, and we'll do it now. It's time for the news brought to you by. Watch Tom Mom-M-M-M-N-N-N-N-O-N-R-M and Friends play D-N-D on Twitch. Get it at Twitch.com.T-O-M-N-O-R-M. There you go. Two M's in that Tom. Two M's in that Tom.
Starting point is 00:33:25 Get them both, or it doesn't count. Let's talk first about this. LAPD raid. That's the Los Angeles Police Department for those unaware here in the States. And this goes bad.
Starting point is 00:33:38 From bad to farce, really is what this headline says after a gun allegedly sucked into an MRI machine. Oh, Lord. I've seen a video where they did this just for funsy. It was like they were decommissioning
Starting point is 00:33:50 an MRI machine. And for fun, they filmed a bunch of stuff getting sucked into it and like metal chairs just lifted right off the ground and sucked into that tube. Like insane amount of magnetism in that thing.
Starting point is 00:34:02 It is. So don't go in there with like your wallet, you know, like a chain on your wallet or shit like that. Necklaces or rings or whatever unless they're a non-magnetic metal. Yeah. But still, how do you know? Obviously, they know and they'll say, nope, that, that cheap-ass necklace you're wearing is just fine. You can wear that in there. What if it tears your nipple rings out or anything like that?
Starting point is 00:34:23 Oh, geez. Yeah, you better, you better come forward about it. Come clean about your nipple rings. Can you imagine, just, chink chink? Oh, that hurt. I can imagine. I can imagine, because I feel like we've seen it in a movie. Yeah. And then you still got to sit in there, going the whole time, that thing sucks. Hellraisers hate it. Yeah. Uh, all right. LAPD raid goes bad. An officer in the Los Angeles Police Department found out the hard way that
Starting point is 00:34:50 you can't take a real metal object in this case his gun his sidearm near an MRI machine after their rifle flew out of their hands and became attached to the machine during a pot raid gone bad pot raid I thought it was legal in LA They were doing a pot raid into a place With an MRI machine
Starting point is 00:35:08 I love it What do you raid now when it's legal in your state I guess it'd have to be illegally trafficked Yeah It must let's see Los Angeles Police Department so It's legal there It's legal there I guess
Starting point is 00:35:22 Maybe somebody didn't have their pot card Yeah I don't know how that works. Anyway, the owners of NoHo Diagnostic Center, Noho. Noho, Hank, in its Diagnostic Center. It's really weird. Are suing the LAPD, the city of Los Angeles, and multiple police officers alleging they violated the business owner's constitutional rights and demanded an unspecified amount in damages. Officers allegedly raided the Diagnostic Center located in the Van Nuys neighborhood of Los Angeles, thinking it was a front for an illegal cannabis cultivation facility,
Starting point is 00:35:52 pointing to higher than usual energy level use and a distinct odor of cannabis plants according to the lawsuit. They raided the place on the 18th and detained the lone female employee while they searched the business. However, they did not find a single cannabis plant and only saw a typical medical facility
Starting point is 00:36:10 with rooms for conducting x-rays, ultrasound, CT scans, and MRIs, according to the owners. They went in there with a gun, sucked it out of his arms. It sounds so comical, right? Like bust in, this is a rain. Hey! My rifle. Yeah, big dummies.
Starting point is 00:36:24 Anyway, it turns out the, the gun is cancer-free, so everything's fine. Mom to pay son for throwing out his comics. Is this the story? Is this the one? Like my mom did with my micronauts rocket tubes. Oh, okay. This is the connection. I was wondering.
Starting point is 00:36:40 That's where the connection is, yeah. Oh, boy. Rocket tubes and a bunch of, and a bunch of micronauts, too. Rocketoes. Or gave him away to some other kid or something, or who knows what, Goodwill. I don't know, but... Well, the Chai... District Court has ordered their mother to pay...
Starting point is 00:36:59 Wait, where is this? China? No. Where is this? This is Taipei. Thailand. So what is... NT stands for? I don't know what that stands for. That's their money, but... Yeah, I don't know what the... Noodle tax. I don't know Taiwanese money is called.
Starting point is 00:37:17 Yeah. I think it's noodle tax. I'm going to go with that. Noodle tax. Okay, sure. I think that's, there's nothing offensive about that at all. No, nothing at all. That's how much they want in damages, actually, for throwing out her son's collection of 32 Attack on Titan comic books. A fine can be commuted to labor and the ruling can be appealed, according to the court. The woman, named Chien, lives with her 20-year-old son, but the two have a strained relationship due primarily to the son's habit of collecting manga and magazines and refusal to dispose of them, according to the ruling. The son was livid and sued his mother, saying some of the books were just out of print,
Starting point is 00:37:51 meaning they were hard to get. Kind of like your tubes. Your tubes. Yeah. Yeah. Very hard to get. Thanks, Mom. Thanks, Mom.
Starting point is 00:37:56 Thanks a lot. Tian. I'm not going to sue her, though. No. Not for 5,000. For now. Not for 5,000 NT or whatever that is. No.
Starting point is 00:38:05 Maybe North Taiwanese dollars, because they do have the dollar, but it might be, I'm seeing that, um, uh, I'm seeing a lot of NT, meaning north Taiwan, north of Taiwan, or north of Taipei. They have different currency in the south, maybe? It's weird. Maybe, yeah. Let's see. He was contrite, asked to settle out of court. She did not appear when summoned for further questioning. The son was unwilling to engage in dialogue with his mother to find a settlement.
Starting point is 00:38:34 So the prosecutors indicted Chien for destruction of property per the criminal code. So moms out there, just know, you can't just chuck your son's shit out the door. No, no, don't do it while they're away at college thinking, you know, I need the space for my needlepoint. Yeah. The worst part is he still lives there. He has to go see her now, you know? I know. Yeah. That's going to be really awkward, too, with the lawsuit and everything. Yeah. Attack on Titan comics. It must be the mangas that he means. Because I don't think there's a comic comic book. Yeah. Oh, just like the...
Starting point is 00:39:11 He said he collects manga and magazines, but at the top it says, attack on Titan comic books. I think those are the same. I don't know. Typeatimes.com. Get your show. straight. The type A times. Type A times. Ah, my favorite blood type. Always have to be so right. You can't be wrong, can you boys?
Starting point is 00:39:32 Get relaxed. Type A times. Here's an interesting story about a woman. My tab just went nuts. There we go. A woman gets a reply to an job application that she made 48 years ago. Wow. Finally, I've been
Starting point is 00:39:48 waiting. She's been waiting for this job at the Dairy Queen to really pan out. Yeah. A woman spent 48 years wondering why her application for her dream job was never answered and finally found out why. Tizzy Hodson, age 70. Tizzy. I love that name.
Starting point is 00:40:04 Gedney Hill in Lincolnshire, is one of your people there, Zoe, could not believe her eyes when she opened the post to discover her original letter applying for a job as a motorcycle stunt rider, sent in January of 1976, had been stuck behind a post office drawer all these years, like the shilling, like the shilling Christmas card.
Starting point is 00:40:25 The shilling Christmas card, yes, the notary shilling Christmas card. Oh, man. It says, despite it getting lost in the post, the post. You guys are cute over there with the post. The post. The setback did not hamper her daredevil career as she found a job that took her all over the world. Describing the letter being returned as amazing,
Starting point is 00:40:42 Ms. Hobson said, I always wondered why I never heard back about that job. Now I know why. That's my best version of her that I've Very good Yeah, it's very good As I imagine at 78 Or 70
Starting point is 00:40:55 That's exactly what she sounds like too Yeah And by the way So I I was always giving me crap For calling the post cute But if there are any Britishisms I wish we would have adopted
Starting point is 00:41:05 It's the post I like it Yeah Yeah and lifts I'm gonna take a lift up to the third floor Yeah It would have messed up that company They have to call them something else Otis lifts
Starting point is 00:41:15 Yeah. Oh, did you hear that Alicia Keys disguised herself as a Lyft driver and then drove. Yeah, drove around to see how long it would take people to figure out it was her or something like that. Some story rolling around about it. Oh, you mean it would change the Lyft company, having to have a different
Starting point is 00:41:29 Yeah, yeah, yeah, because they if you were... Because we'd have felt like we're just getting people up and down in buildings. Yeah, with lift. Yeah, but I'm with you. I think that post and yeah. You got any others? I don't know. Pram. Pram's good. Mushy peas. Whatever.
Starting point is 00:41:47 You can keep the mushy peas Yeah, those are yours to have Give me the bag The bangers and mash I'll take the bangers and mash But you can keep the mushy peas I would eat bangers and mash today Although I had tacos last night
Starting point is 00:42:00 And I over did it I had two tacos I had three tacos And I should add two Well, I overdid this weekend Because Turns out you can't get anything That's not deep fried in Savannah
Starting point is 00:42:11 It's all deep fried chicken Deep fried chicken fingers It's exes deep fried shrimp and what else I have oysters and yeah
Starting point is 00:42:21 the American South baby that's what they do to you wouldn't have traded any of it no so good yeah I mean you know when in Rome right that's what they say exactly yeah
Starting point is 00:42:31 This lady says it was amazing to get this. I always wondered why I never heard back about the job. Now I know why. At the top of the letter is a handwritten note that reads, late delivery
Starting point is 00:42:41 of Staines post office, found behind drawer. Only about 50 years late. It's written on the thing. Let's see here. How they found me now that I've moved house 50-odd times,
Starting point is 00:42:54 because she traveled a ton for her job and even moved countries four or five times, is a mystery. It means so much to me to get it back after all this time later. I remember very clearly sitting in my flat in London typing the letter. Oh, it's cute.
Starting point is 00:43:08 I was really hoping that she got the letter back from them like the letter from them saying you're hired It's a motorcycle But it's just her letter Applying that she got back And that they never received It's like, oh man, I think Yeah, that's a bummer
Starting point is 00:43:23 It would have been better if it was the letter Of either rejection or admission That never got it So it really is a lot like the shilling letter Just never got there Right, never made it to its intended destination Yeah, thanks a lot, post All right, that'll do it for today's news
Starting point is 00:43:41 We're going to take a break And when we come back, Science with Bobby, It's time for Bobby Time for Science. Science with Bobby. He and I just did another 8 of 8 clear of a raid the other night. That was super fun. Oh, cool. Nice. All by ourselves. Two tanks. Nobody else involved.
Starting point is 00:43:57 No other DPS or healing or anything. Just the two of you. Nice, well done. Yeah, it was slow. It took forever. But we did it. We got it. Anyway, more on that coming up and some science talk with Bobby. Stick around. But before that, a song. Brian bring a song. Play a song. Sure.
Starting point is 00:44:12 So, Pennywise, really cool group a lot of people know, a skate punk, uh, band, Pennywise. Well, lead singer and chief songwriter Jim Lindberg is doing a solo thing with, well, doing a splinter group with his band The Black Pacific. First album that they've released in nearly 14 years is coming out. It's actually available now. It's called
Starting point is 00:44:35 Here Comes Our Wave this is the first song from it you might think it's a garbage cover but it's not it's called I Think I'm Paranoid You got terrorist groups are planning bombs Cold Rocks at the Pentagon To a white hate crime now gone When is it gonna stop? Yeah
Starting point is 00:45:05 CIA and XIVs A abortion kill is a piece of peace The world is full of and I need is when it's gonna stop I think I'm paranoid But it's something that I can't avoid That the world's a scary place I'm afraid of a human race Well I think I'm paranoid
Starting point is 00:45:22 But it's something that I can't avoid That world's a scary place I'm afraid of a human race Let's go This is Nazi socialists, anti-fascist and a fascist and a kiss Everybody's really pissed when is it gonna stop, yeah Carbon hit is spinal, life, license, pirates, killer, please Hurricanes have source climate change, when is it gonna stop?
Starting point is 00:46:01 I think I'm paranoid, heard something that I can't avoid, that the world's a scary place, I'm afraid for human race. I think I'm paranoid But it's something that I can't avoid That was a scary place I'm afraid of the human race Let's go
Starting point is 00:46:18 Gotter's grouped out Cold I hate crime no time, when is it gonna stop? Yeah, CIA and SEDs, Horse and Kill as a piece like prince What is full of enemies? When is it gonna stop? I think I'm paranoid,
Starting point is 00:46:51 But it's something that I can't avoid But the world's a scary place I'm afraid of a human race I think I'm paranoid But it's something that I can't avoid The worlds are scary place I'm afraid of a human race Killer cops and killer bees
Starting point is 00:47:08 CIA and SUVs are some things you can avoid I think I'm paranoid Once you have your sheets It's time to make sure you have all your bedside accoutrement Fresh cat's milk Whiskey And a cabinet full of lucky bones and hair
Starting point is 00:47:25 This is a wrap that I've made into a pencil case It's got a pencil sharpener up its ass And we returned, Brian, tell me who that was again. Sure, that was Jim Lindberg and his band The Black Pacific with a song that was not by garbage originally called I Think I'm Paranoid. It just happens to have the same title. So when you're looking for it, the one you want is from the Black Pacific. Nice. Nothing wrong with the garbage one, though, if you want to go, you know.
Starting point is 00:48:06 No, no, nothing wrong with the garbage one. No, go to them both. Or Icestorm is going to have a heart attack when he gets ready to do the notes for this episode. And quickt-tmS.L.I because he's going to see, I think I'm paranoid and realize it's not the garbage. Yeah, but he'll have a jolt of excitement that we love giving him occasionally. Right. Oh, for sure. We don't hear enough from him outside of the little duty he does.
Starting point is 00:48:28 You know what I mean? He should be here live with us, is what I'm saying. I know he works. Yes, agreed. I would love to see Mike around more. In the early days, we had him on for that video game segment. Yeah, it was super fun. We hope you're doing well, ice warm, and your ice is suitably warm.
Starting point is 00:48:43 I think about you every time I drive by the wizard's chest in Denver because I love that store, and I know you work there. Yeah. Does he come to the meetups in Denver? Sometimes when it works out, he does, yeah. Oh, that's good. All right. Who am I adding? Oh, yeah, Bobby.
Starting point is 00:48:57 Bobby. Bobber. What day is it? Give us a little Bobby. All right. Let's get Bobby in here. Let's have some science in our lives. Let's play his intro right here.
Starting point is 00:49:07 Science. Bob is hungry and the soup looks good. It does indeed. It's Bobby Frankenberger. You may know him as co-host of the instance and creator of such things as perfect tank job and also all around science. Bobby, welcome to the show. Oh, and it's nice to see you, man.
Starting point is 00:49:29 I'm doing great. Are you going to start a podcast called Perfect Tank Job? Because I really hope so. Perfect Tank Job. He's a good tank. Although, there was some confusion in that Heroic fight, but, you know, we'll figure it out. We'll figure it out.
Starting point is 00:49:42 It's a tough one. It's a tough one. Yeah. It turns out, it's funny how these things work, because the first boss, Zizi is normal. No problem. Rip right through it. First one shot.
Starting point is 00:49:53 Second boss on Heroic, because things got armpit lasers that want to eat you and kill you and lots of fun. Anyway, more on that later in this month's episode of The Instance. Not now, though. We're gonna do some science. Bobby brought a little science with him. What do we got going today? What's going on? Um, I wanted to talk a little bit about your favorite topic, Scott, which is artificial intelligence. Oh, I do love some AI discussion. Loves AI. Yeah, I love it in some contexts. I hate it in others. So it's a real, it's a real, uh, I got to be picky about what kind of
Starting point is 00:50:29 AI I like or not. But if you're going to hear, if you're here to tell me about generative AI, uh, which so far nobody's giving me a good reason for it to exist. It's mostly used for memes, misinformation, and utter bullshit. So as soon as somebody says, oh, here's a good use for it, then maybe I'll be on board. But I've yet to see it. This, this will definitely fuel your distaste for AI, I think. Great. Let's do it. Um, and it'll make you even more intolerable in your arguments because it's science backing it. Great. So you now get the. So there was a recent paper published in Nature on September 25th that says that large language model AI models, that's redundant. Large language model AIs are becoming less reliable in their ability to provide accurate answers.
Starting point is 00:51:17 to questions as their technology is scaled up. And it's more concerning than that, because they're also becoming more convincing in their ability to give answers. So those two combined together: less reliable but more convincing. Oh, gosh. Yeah, I would love this. This is great. I've changed my mind, everybody. I think it's an amazing thing that we're doing to ourselves. Wonderful. So, so we're... As we're all aware, more people are using AI nowadays, and not just in regular, like, harmless applications. There are lots of people using it in education, medicine, science, lots of AI is being integrated into every aspect of life in the world, right? And so it's important, it's very important for it to work accurately and efficiently.
Starting point is 00:52:21 Sure. Back when it first came out, it was very prone to error. We were all kind of kind of all knew that as we were using it. But a lot of times AI, whenever you asked it a question or to do something that it couldn't do or couldn't answer, then it would just, it just wouldn't answer, right? It would just say, I don't know or some version of that, right? Here's what I found on the web. Yeah.
Starting point is 00:52:50 The equivalent of that. Of the equivalent of going on to a forum asking a question and then someone giving you a Google link. Let me Google that for you. Yes. Right. So, but that would be disappointing for users who wanted to use AI to answer questions, right? Sure. You ask you a question, you want an answer if it can't.
Starting point is 00:53:13 And so that would lead to people thinking that AI was unreliable or something like that, right? Yeah. And so researchers and engineers set about to solve this problem. They wanted to make AI better, able to answer more questions, and for people to think it was more reliable in order to be able to use it as a tool. And one of the ways that they did that was by scaling up the models, right? Right.
Starting point is 00:53:43 And by scaling up the models, what I mean is that they would make the, the data that it trained on, basically large language models, the way that they train the data is just by training it on huge amounts of text. All right, because it's a language model, right? It's trying to predict
Starting point is 00:54:01 what to say next. Yeah, by the way, my beef with AI is nothing to do with LLMs. It's got nothing to do with business applications or my dentist's new AI model that better molds crowns to fit your teeth.
Starting point is 00:54:17 Or millions of other little tiny little use cases that nobody even talks about. My beef is generative AI. Now, you could say LLMs are a form of generative AI, and they can be, I suppose. But there's a future, which I'm ready for, where LLMs are better than they are now, not good enough. They need to be kind of, like you said a second ago, we need to convince more people or not convince them, but people need to believe that they're more correct. I'd just like them to straight up be more correct. I don't want to believe in it.
Starting point is 00:54:49 I want to know that they're more correct. So I have issues with that. Like the inaccuracy stuff is a problem. Obviously, that's a problem for everybody. Yeah. Those aren't where my beefs are though. So anyway, continue on LLM, bam, bam, boom, let's go.
Starting point is 00:55:02 Yeah, yeah. And to be clear, I was only, I was using you as a proxy to be funny that, of people who rant against AI, right? Mostly it's this thing where people create fake photographs, and I don't just mean for the lulls. I mean, there's no point to it. There's no actual point to it other than to use a bunch of... Fake photographs, imitating voices, stuff like that. There are definitely lots of problems.
Starting point is 00:55:28 But this is specifically... I use it all... Go ahead. But I use this all the time with a client of mine who does basement remodeling and can't remember to get his clipboards and his tools off the countertops, and it saves me the hours that I would normally spend doing photo retouching. I can just go, jup, jup, in Photoshop: all right, fill this in with the wood grain pattern that normally would be there under this workbench
Starting point is 00:55:56 or the... no, workbench. There's a subcategory of very good applications of that same technology, you're right. Yeah, exactly. I think bad actors are the real issue when it comes to AI, and it's a very real concern. And that's a whole other conversation, I think. But I think this segues into that conversation,
Starting point is 00:56:14 because of what these researchers found out. Well, so when they started to try to figure this out... engineers tried, a long time ago... well, not a long time ago, it's not been around that long, but earlier in these iterations, engineers set out to solve this problem of unreliability by scaling up the models. And that means just making larger training data. So, for example, GPT-3 was trained on 45 terabytes of text data. That's a lot of text data. And GPT-4 was trained on even more. So, just a lot. And the number of language parameters that GPT-3 was trained on was 175 billion language
Starting point is 00:57:01 parameters. Now, a language parameter is like... you know how you've been told before that the way AI works under the hood, the way the programming works, is that in the code they code neurons, right? And they're not real neurons, they just call them neurons, but basically they're just like branching trees that make decisions based on how you traverse those trees with a certain input. And if a percentage of the input goes one way, then it'll take this next branch. If it goes the other way, it'll take a different branch. Each of those neurons, as they call them...
Starting point is 00:57:40 Each of those branches is a language parameter. You really can't get more specific than that, because it really depends on what model you're looking at, and that's kind of the meat and potatoes of a model, how these parameters work. But what you can say is that the more parameters you have, the more sophisticated this branching decision ability is. And a lot of it you also can't explain, because that's the whole scary thing about AI nowadays: it's a black box. Nobody really understands how it's working.
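To put a rough number on what a "parameter" means here, this is a minimal sketch in Python that just counts the adjustable weights in a tiny invented feed-forward network. The layer sizes are made up purely for illustration, and real GPT-style models are organized differently and are vastly larger; the only point is that every adjustable weight counts as one parameter.

def dense_layer_params(inputs: int, outputs: int) -> int:
    # One fully connected layer: one weight per input/output pair, plus one bias per output.
    return inputs * outputs + outputs

# A tiny invented network: 512 -> 2048 -> 2048 -> 512.
layers = [(512, 2048), (2048, 2048), (2048, 512)]
total = sum(dense_layer_params(i, o) for i, o in layers)
print(f"{total:,} parameters")  # about 6.3 million, versus the roughly 175 billion cited for GPT-3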
Starting point is 00:58:13 It's just being trained on data, and under the hood, in this black box, a lot of these language parameters are being adjusted. Anyway, so what happened was they scaled up these models, and they realized after they scaled them up that it was
Starting point is 00:58:28 improving certain things, but it wasn't good enough, because scaling up didn't completely solve the reliability problem. What it did was it sometimes made questions able to be answered, but it was unreliable in a different way. Now you would get slight variations: in a prompt, like when you're prompting the AI with a question, slight variations in the wording of the prompt
Starting point is 00:58:54 or even individual characters would generate wildly different answers, and that was becoming a problem. So in something like art, sure, something creative, variance is just fine. But with science, you don't want slight variations just based on the way you phrase a question, if all the same data is consistent. One of the examples I saw was in just math problems. Now, early on, gosh, early on, large language models couldn't perform math, but they have gotten better at that by changing the way that they train it. But when it was getting better, you could say, you know, add 256 and 75, and it would give you one answer. And then if instead you typed in 256 plus 75 equals, it would give you a different answer. Oh, jeez. Like, you're just using a different set of characters, and it was giving... So it's still the same numbers. Yeah. Right. It's still the same math problem,
Starting point is 01:00:00 just presented in a different way, and so not very reliable. So there have been all these techniques for developing and giving better answers, and all of these techniques are, like, reward-based, with human feedback and stuff like that. You know, you train it on data, it gives you some output, and you have some way of telling it how good that output is. That's just how these AI trainings work, right? Okay, so that's all the background. What is the research
Starting point is 01:00:34 that was done? Well, what the researchers did was they compared large language model AIs from previous generations to the more modern generation. So the example is like GPT-3 to GPT-4. That jump kind of represents
Starting point is 01:00:50 the scaling up, because the big difference between those two is that it was scaled up with a lot more data. But also, within those large generations, there were other iterations of them that they call shaped-up versions. So there was GPT-3 Turbo. There was GPT-4o. You remember that? Yeah, I do remember that.
Starting point is 01:01:14 So those are shaped-up versions. And what that meant was those were iterations of the same scale of model, but they'd been trained in different ways to sort of give what they thought would be more reliable output. Anyway, the point is that there are many different iterations of these large language models that they were testing in this research,
Starting point is 01:01:38 and seeing how they did. They were measuring, well, basically, three things. They were measuring, when they asked it a question, did it give a correct answer, an incorrect answer, or did it
Starting point is 01:01:54 sort of defer and not answer the question. Basically, avoid the question somehow: either redirect, or say I don't know, or something like that, right? Right. So three possible results: incorrect, correct, or avoidant, right? And they asked five different types of questions. There would be addition problems. They would ask it to solve an anagram, you know, where you mix up letters to create more words. Yeah. They would ask it geography questions.
Starting point is 01:02:37 They would ask it science questions. And then there was a fifth category that they called transforms, and I'm not 100% sure what that one was, but just the way they were describing it, it sounded like they were asking more open-ended kinds of questions. Not necessarily subjective questions, but more like questions that required longer answers, like in the hundreds or thousands of words. I'm not sure why they called it transforms or how that works,
Starting point is 01:02:57 but the point is, these are all questions that have factual answers that you can check. So not a short question like, how many times has the Earth rotated around the sun in a year, which is a question of how many, but more like, please describe the nature of planetary orbit around a star, or something like that, where you're getting a much larger, more complicated, nuanced answer, I think. So, I think so. An essay question versus a math question kind of thing. Right, I think so.
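As a sketch of the three-way scoring being described, correct versus incorrect versus avoidant, judged against questions with checkable answers, here is a toy classifier. The avoidance phrases and the example answers are invented for illustration and are not taken from the actual study.

# Toy scorer: bucket each model answer as correct, incorrect, or avoidant.
# The avoidance markers below are invented for illustration.
AVOIDANCE_MARKERS = ("i don't know", "i can't answer", "i'm not sure")

def score_answer(model_answer: str, expected: str) -> str:
    text = model_answer.strip().lower()
    if any(marker in text for marker in AVOIDANCE_MARKERS):
        return "avoidant"
    return "correct" if expected.strip().lower() in text else "incorrect"

print(score_answer("256 plus 75 is 331.", "331"))             # correct
print(score_answer("256 plus 75 comes out to 329.", "331"))   # incorrect
print(score_answer("I don't know how to add those.", "331"))  # avoidant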
Starting point is 01:03:29 Yeah, what you said, Brian, that's sort of my take on it: more like an essay question rather than a single answer or whatever. Yeah. Right. So the point, the main point, being that these are questions that you can check the answer to. Yeah. So the results were that the researchers found that when you scaled up the training data,
Starting point is 01:03:52 when you go from, like, GPT-3 to GPT-4. I'm using GPT as an example because a lot of people are familiar with it, but they used some other large language models, like the one that Meta has, and... who has Claude? Claude is... I like Claude. Claude's the most accurate one I've ever used. I forget the name of who has it, but whoever that is, that one's a good one. Yeah. They tested multiple ones, and among all of them, they found that when the models were changed by scaling up the amount of data they were trained on, that would dramatically increase the number of correct answers, which is what you hope for, right?
Starting point is 01:04:30 Yeah, more data available to it, the more correct answers, yeah. Right, lots more correct answers when they were training it on larger data, especially for non-addition questions, because for addition questions the number of correct answers didn't change that much. But it shouldn't, right? Right, it seems pretty cut and dry. Right, yeah, this one actually says the answer is 18, this one says 19 and a half. So, yeah. And it is funny to think that it could get it wrong. But, to be fair, when they were testing it on addition, they were using 50- or 100-digit addition problems. And some of them had different amounts of, you know how you have to carry the 10 in addition problems? There would be different amounts of that going on. And, jeez, I thought it might be like a calculated situation where: I'm sorry, I can only go up to eight digits, and I can only do math problems where the answer is eight digits or less. Or commas might
Starting point is 01:05:28 throw it off, like having a, you know, a billions number with a bunch of commas, and it's like, oh, you want me to add these six three-digit numbers together and then add it to the other number you're giving me, or something like that. Right. So they did see... and also, that makes me think of another thing to note, for people thinking about how this experiment was designed. They also gave it questions of a lot of different difficulties. They had difficulty ratings on all the questions they gave it; across all the different types of questions, they rated them from a difficulty of one to a hundred. And the difficulty they're talking about is as compared to how difficult it would be for
Starting point is 01:06:12 a human to answer the same questions. But anyway, across all difficulties and across all types of questions, they saw at least a slight increase in the accuracy of the answers whenever they increased the scale. And for some of the questions, like anagrams and geography and science, they saw a massive increase in the number of correct answers. But here's the thing: they also saw an increase in the number of incorrect answers. You might be wondering, how is that possible? How can you have more correct answers, but also more incorrect answers? Because you scaled up. So you scaled up all of it, right? Well, but we're talking about percentages, right? You can't increase the 70% and then also increase the 30% on the other side as well. Right.
Starting point is 01:07:03 Yeah. Exactly. So you saw a percentage increase of correct answers, but the reason they also saw a percentage increase of incorrect answers is because, as these models improved over time, they were less likely to give you avoidant answers. So they were less likely to tell you they didn't know, and instead they would just make something up. Right? So therein lies the rub, as you might say, right?
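To make that arithmetic concrete, here is a toy before-and-after with invented percentages showing how the correct and incorrect rates can both climb once avoidant answers collapse.

# Invented numbers; each set of three buckets sums to 100%.
old = {"correct": 50, "incorrect": 20, "avoidant": 30}  # older, smaller model
new = {"correct": 62, "incorrect": 33, "avoidant": 5}   # scaled-up model

for outcome in old:
    print(f"{outcome}: {old[outcome]}% -> {new[outcome]}%")
# The 25 points freed up by avoiding less get split between right and wrong answers,
# so the correct rate and the incorrect rate go up at the same time.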
Starting point is 01:07:32 Sure. And then they also found that with those incorrect answers, they additionally tested and found that the models would often spend more time and effort, and be more wordy, trying to convince you, being convincing: here's why my incorrect answer is really correct. Were they able to figure out what the errors were, what the confusion was, for the answers that were wrong? Like, oh, it's in the way that we worded this, the order of words we put in here could have been misleading to the AI? Or did they
Starting point is 01:08:08 figure out what caused it to go wrong? They think that the... well, the short answer is yes. And that's where it's really interesting, because it's in the way that it was trained. They saw that when you jumped from... I'm going to use GPT as an example again. When you jump from GPT-3 to GPT-4,
Starting point is 01:08:28 where the big difference is the size of the training data, you did see an increase in the number of incorrect answers and less likelihood of giving avoidant answers. But within those two different sets, as
Starting point is 01:08:44 the different iterations tweaked the way that they were trained, to try to be more reliable, you know, they would give it a lot more feedback. The type of training I'm talking about is like: you ask it a question, it gives you an answer, and then you go back into it and give feedback in some way or another, this is correct or this is incorrect. Right. Right. And you feed that back into the model. And I'm not talking about just in your conversation with the AI; you feed that feedback into the core of the model
Starting point is 01:09:20 so that it can give quote-unquote better answers. Whenever that type of training was done and the iterations changed, that's when they saw the most dramatic decrease in avoidant answers. And they're pretty sure the reason this is happening
Starting point is 01:09:36 is because the AI, being trained by humans on human answers, has a bias being injected into it: it's trying to give what we human beings will perceive as correct answers. So that's the positive feedback, right? Because humans are fallible. These answers are right... they don't actually... Correct. Exactly. They don't actually have to give you correct answers. They just have to give you answers that you think are correct. Right. Okay. Wow. Yeah, that's not problematic at all.
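Here is a toy illustration of that bias, assuming a reward signal that comes from whether a human rater approves of an answer rather than from whether the answer is true. The scores, phrases, and function name are all invented; real feedback training is far more involved.

def rater_approval(answer: str) -> float:
    # Pretend human rater: confident, detailed answers feel better than refusals,
    # whether or not they are actually correct. The numbers are invented.
    if "i don't know" in answer.lower():
        return 0.2   # honest refusal, but raters tend to score it low
    return 0.9       # sounds authoritative, so it gets approved

candidates = {
    "honest refusal": "I don't know the answer to that.",
    "confident but wrong": "The answer is definitely 42, and here are three reasons why.",
}
for label, answer in candidates.items():
    print(f"{label}: reward = {rater_approval(answer)}")
# Training toward this reward pushes the model away from ever saying "I don't know".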
Starting point is 01:10:15 Yeah. Right. So that's what these researchers found out. And so, I mean, that's important to know. It's getting trained by that infinity of Facebook posts that say, what is the answer to this complex math question? It's not what you think. Right. Yeah.
Starting point is 01:10:35 Exactly. That's what it's training on. Great. Excellent. Yeah. And the answer is all the answers in the comments, of course. Right. Right. And so the interesting thing to me is that, as we continue on with AI, what we're learning is that it's very complicated. It turns out that when you try to create something that thinks like people, it has a comparable level of complication, right? Like, our brains are complicated, so these programs are going to be complicated, the more we try to make them behave like our brains. Right. Yeah. Right. That's interesting. I don't know what the ultimate... one of the aspects of this that is at least interesting,
Starting point is 01:11:19 and maybe it's not as big a deal as I think, but there are so many competing models. GPT comes up because, you know, they're kind of early in the game, but there are so many competing models. The one thing that you always assume from a science fiction standpoint is that an AI, an all-knowing AI in the form of, let's say, this very advanced large language model, will be a singular entity, and not a bunch of 15 competing entities. And so then, what are they competing on? Those are interesting questions, I think, that are different from what we usually ask of AI in our books and our movies, right? You know, we hear Mother or Father in these
Starting point is 01:12:01 Alien movies, or we hear, you know, whatever AI is in whatever movie or book, and you go, oh yeah, well, of course everyone's using it, it's everywhere, it's the ship's computer in Star Trek, you know. Yeah. Did the Weyland company, like the Weyland-Yutani Corporation, basically say, well, we could install Mother in our ship, in the Nostromo, but we also have this other one, and DeBlink 2.0 is available and...
Starting point is 01:12:34 Yeah, did Earth government require them to have the option to sideload? We have a contract with the Mother company, so we have to use theirs. Yeah, it's the one aspect that we never seem to get into in these books and these stories. And so I'm always curious about what that actually means, if anything, to the biases, to the issues, to the advancements, whether or not that competition will make things better or harder. I mean, obviously, it advances technology to have competition. That's always been true. But in this case, it just seems strange for us to say, well, your artificial intelligence is smarter than my artificial intelligence. Like, it's just a weird new way of thinking about it. Right.
Starting point is 01:13:08 Well, because what I think we're starting to realize, as companies and organizations are starting to actually put these types of AI models to use, is that, like you said, there's not one single one. It's all about how you train it. They might all fundamentally work on similar principles, but the difference is in how you train it, because you need to reinforce certain types of answers. So, like GPT, for example, ChatGPT: they're realizing that the way it's being trained reinforces just convincing people of the answer that you give. But a software company might want to have a large language model that helps them write software, which exists. Those are already out there. And the way those are trained is that they're trained to give
Starting point is 01:14:08 code that works, or they're trained to write code that is... you know, even among different software companies who have different values, they might have different ways that they train their software-helper AI, right? Like, some companies might value code that's easy, that's human-readable, right, so that humans can go back and debug it easily, for example. Or some companies might really want code to be super efficient, so it doesn't matter if it's human-readable, as long as it gets the results
Starting point is 01:14:45 in fewer lines of code, or something like that, right? And it's all about how you train it. And so there are just going to be a bunch of different kinds of AI that are used for different things.
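As a sketch of "it's all about how you train it": two shops could rank the same pair of candidate solutions differently just by weighting readability against brevity. The proxies and weights below are made up purely to show the idea, not how any real code assistant is actually trained.

# Two invented candidate solutions for the same task.
verbose = """def total(nums):
    # Add the numbers one at a time so it's easy to step through.
    result = 0
    for n in nums:
        result += n
    return result"""
terse = "def total(nums): return sum(nums)"

def score(code: str, readability_weight: float, brevity_weight: float) -> float:
    lines = [ln for ln in code.splitlines() if ln.strip()]
    readability = sum(1 for ln in lines if "#" in ln)  # crude proxy: number of commented lines
    brevity = 1.0 / len(lines)                         # crude proxy: fewer lines is better
    return readability_weight * readability + brevity_weight * brevity

for name, code in (("verbose", verbose), ("terse", terse)):
    print(name,
          round(score(code, readability_weight=1.0, brevity_weight=0.2), 2),  # readability-first shop
          round(score(code, readability_weight=0.2, brevity_weight=1.0), 2))  # brevity-first shop
# The readability-first weighting prefers the verbose version; the brevity-first weighting prefers the terse one.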
Starting point is 01:15:00 That's why I don't really worry about... like, people are so worried about the AI apocalypse. I don't worry about that at all. I just worry about people being dicks. That's it, right? Like, anybody who's going to bring misinformation using AI. Yeah, exactly. If AI is going to cause a problem for society, it's going to be because of us and the way that we've trained it. Oh yeah, the call will come from inside the house, for sure. Right. This isn't going to be a machine that suddenly goes Skynet, self-aware, and then destroys everybody. That's a fun thing to explore. It does play to basic human fears. I understand why we like it as a construct to tell Terminator stories. But that's not
Starting point is 01:15:38 the thing that I'm worried about, not even close. I'm worried about massive amounts of misinformation, bad stewardship of tools and of technology. Like always, we're the shitheads, you know? It's not the machines. The machines are doing what we tell them to do, and we're the problem. There's a new podcast series that I just listened to recently. I can't remember what it's called, but it's about a guy who trains an AI, you know, like the AI voice models that can talk for you, right? A lot of people use those because, you know, some people have trouble talking and want to be able to... like, companies exist now that will make your voice using AI, and they're very, very good, very, very convincing. And he
Starting point is 01:16:27 took that and hooked it up to basically ChatGPT, like a large language model, and then had it having full interactive conversations with people. And this whole podcast series is all about this, all the different things he did with it and the journey he went on with it. You know, he had it call customer service lines and talk to his family and friends and stuff like that. Mm-hmm. So it was just a really interesting podcast. I wish I could remember the name of it. I mean, there's this, you know, like, here, listen to this. Let me play it real quick here. Let's see what happens. I mean, see. I eat babies and I will continue to eat more babies.
Starting point is 01:17:13 Okay. So that's my sister, Wendy, claiming that she eats babies and that she will eat more babies if given a chance. That sounded like my sister. Like, I eat babies... I mean, that's 100% her. That's her on her crappy mic on a Thursday. Right. Because that's what it was trained on. So people hear that and they're like, oh, it's totally Wendy. That kind of thing is fine in the right hands and with the right intentions. It's those kinds of tools that I worry about. It's that, it's art, it's photos, it's that stuff. Because, like it or not, a lot of this research that's being done, and the slow perfecting of the technology behind the scenes that's happening, is not what's paying the bills. What pays the bills is some dingus on his phone spending $8 a month to make Trump videos that don't exist. And it will be on the back of those dollars
Starting point is 01:18:11 that these companies have some semblance of revenue. That's why it frustrates me. If I haven't explained it very well, then that's my explanation for why I don't like it. The podcast series, I looked it up, it's called Shell Game. You should listen to it. It explores those ideas, all from the journalist's perspective, thinking, like, what can we do with this, and what are people doing with it? He has a whole episode about scammers who are using this technology, doing exactly the same thing he's doing, but, you know, to try to take people's money. Is every episode AI-focused, or just a couple? You know, it's all about AI. It's a limited series, a six-episode series. It should be...
Starting point is 01:18:58 They should call it all around AI and then you could sue them, you know. All around shells. Yeah. Wait, no. That wouldn't work at all. Well, anyway, this is fascinating stuff. I suspect that in the future, as big things happen around the AI space, that Bobby will be happy to come on here and talk about it in his science segment.
Starting point is 01:19:17 Or at least AI Bobby will be happy to come and talk about it. Yeah. You'll never know. You'll never know. I know. Exactly. Well, Bobby, it's always good to talk to you. Tell people about your science podcast and where they can get it.
Starting point is 01:19:29 Our podcast is called All Around Science. We talk about science every week, me and my co-host, more of science news or whatever we find interesting going on. And this episode that just came out today is about space elevators. It's my favorite 80s game, Space Elevators. Oh, it sounds like one, doesn't it? It does. It's your Space Invaders, Elevator Action hybrid game
Starting point is 01:19:55 that Brian Dunway probably has a deluxe version of somewhere. Sign me up. I'm in. Cool. That's going to be fun. Yeah. Yeah. Well, real quick, Bobby, what's 8 plus 4? 12.
Starting point is 01:20:05 Okay. All right. It's really Bobby. It's really Bobby. He got the math right. And if he got it wrong, you would have to have said, oh, you're right. I was hoping we'd get some avoidance. Yeah.
Starting point is 01:20:18 He'd be saying it was 15 and trying to convince you why. Right. It's 15, but really, here's why. Yeah. Also, don't forget to check out The Instance later this month. We'll be talking about the continuing adventures of World of Warcraft on The Instance 2.0, me and Bobby. We don't know what date yet; we've got to lock that down, probably this week, but we'll let you know when you can expect a new episode for the month of October. Bobby, have a great scary month. And I hate this new
Starting point is 01:20:45 feature in macOS that expands your windows when you don't mean to. I've got to... Oh, like if you drag it to the corner? Yeah, I've got to turn that shit off. Yeah, I don't like it. Yeah, turn that off. I'm still using... I turned it off, and I'm using Moom, because it gives me more granular control over my window arrangements. I don't want 50-50 anywhere. You want to control it.
Starting point is 01:21:06 You want to pick your own ratios. That's what I want. So if you, Apple, I'm changing. Still and get rid of Bobby, is what we're saying. Bobby, have a fantastic week. We'll see you next time. Bye. Let's go now where we always go, which is toward the end of this show and a quick email to
Starting point is 01:21:23 take us out. This is an email that came in from Tyler. His name is Tyler, but he says he's a janitor at the end of this. Okay. All right. Hey, Sunshine and Beach. I met a dude on Threads who was upset that his Prime Video suddenly and unexpectedly had ads. I informed him they announced they would be rolling out a tier with ads over a year ago. He said, that's a bait and switch then. Called me a billionaire bootlicker. But the name-calling is beside the point. Billionaire Bootlicker, coming to TLC this fall. Billionaire Bootlicker.
Starting point is 01:21:57 That's 100% a TLC show. Or Bravo, on Bravo. Any of those crappy basic cable channels would carry that bullshit. I'd be interested in your thoughts. The ad-free tier still exists. They gave users ample warning of the change. And is this a bait and switch?
Starting point is 01:22:13 If a baker offered to sell me a donut for $2 and instead gave me an Oreo with the frosting licked out of it, that would be a bait and switch,
Starting point is 01:22:28 correct? Lured me in with the prospect of one thing and then presented me with another thing. If the baker sold me a donut for two and said, starting next year, $2 donuts come with a punch in the face, or I can pay $5 for a punchless donut, that wouldn't be bait and switch. Or would it? Thoughts? You're 100% correct. Yeah, exactly. I don't like what Amazon did either, but it's not bait and switch.
Starting point is 01:22:41 It's not a bait and switch. They gave you, A, they gave you warning, and still there's, you know, there's still the availability of the tier you had. You just have to pay a little bit more for it, but they warned you that the price was going to be going up. Wait a minute, this gasoline was $1.15 when I was a kid. Now it's $3. What a bait and switch.
Starting point is 01:23:00 Yeah, to keep it in the zone of the Prime example, Prime Video: bait and switch would have been them doing a full advertised thing where you went to go sign up and it said, you'll never get ads, it's amazing, sign up here. And then as soon as you signed up, you got ads and you had to pay for it next year. And then they go, oh, you don't like those? Well, it's another two.
Starting point is 01:23:24 That's bait and switch. You can call this bad business or shitty price increase. You can call it a million other things, but it's not bait and switch. It's not bait and switch, no. We're with you, man. The janitors of the world unite, all right? The guy on threads lied. Yeah, he did.
Starting point is 01:23:39 People get stupid, you know? Sure. We're surrounded by it. That's it for the show. Frogpants.com, TMS, for all the things you might ever need. And we're going to leave you now with a song request, probably. Brian, what do you got?
Starting point is 01:23:52 Yeah, she's in the tadpole right now. We've been chatting with her. She has had a hard night, so she's needing something to cheer her up, and this is just the thing to do it. Stephanie and her pets wrote in and said: Sup, Spurb and Burp. Stephanie and her pets here, requesting a song for my birthday that happened on Saturday, September 28th. I turned 34, so Scott can play the Let's Party clip, because I'm still below 40.
Starting point is 01:24:16 Yeah, you're still young enough. Let's party. Still get the gas at the end, too. Which is very important. It's been a weird year for me, and I can't say things have been easy, but I'm glad that I have a job that allows me to be in your live TMS chat most mornings. For my birthday song, I would love to hear the best cover you have of Pet Shop Boys song, Opportunities Let's Make Lots of Money. It's one of my favorite 80s songs and always gets me pumped.
Starting point is 01:24:40 If the cover of that song doesn't exist, blah, blah, blah, I'm not even going to have to go there. I love you guys. I hope TMS lasts as long as you both are alive. I don't know what I would do without you. Thanks for playing my request. Your friend Stephanie, in all caps, try and stop me. I just heard that song last night on this local station
Starting point is 01:24:55 that, when they started, was all alternative and music from that era. They were called KJQ at the time; now they're called X96. And on the weekends they have what they call Gen X 96, and they play nothing but stuff from that era, and they played it two different times within the same hour, they played
Starting point is 01:25:11 Let's Make Lots of Money from that job. Oh, funny. That's crazy. Wow. Nice timing. See, I don't listen to a lot of terrestrial radio, but for that, I would. Yeah, that's about the only time we ever hear them, is on the weekends; the rest of the week, I have no idea what they're doing, but yeah, it's good stuff. So, Stephanie, right there with you, feeling the mood.
Starting point is 01:25:30 It's good stuff. Good. Well, the version you're going to hear comes from 1998 and a guy named Frank Bennett, who, if his name sounds like a combination of Frank Sinatra and Tony Bennett, well, that's intentional. This guy sounds like a cross between Frank Sinatra and Tony Bennett, and he does nothing but covers of songs that you would never hear either of those two men cover, including this one right here.
Starting point is 01:25:54 From 1998, here is Frank Bennett, and Opportunities Let's Make Lots of Money. I've got the brains, you've got the looks, let's make lots of money, you've got the brawn, I've got the brains, let's make lots of money. I've had enough scheming and messing around with jerks My car is parked outside I'm afraid it doesn't work I'm looking for a partner
Starting point is 01:26:28 Someone who gets things fixed Ask yourself this question Do you want to be rich I've got the brains You've got the looks Let's make lots of money You've got the brawn I've got the brains
Starting point is 01:26:45 Let's make lots of money. You can tell I'm educated, I studied at the Sorbonne, doctored in mathematics, I could have been a don. I can program a computer, choose the perfect time. If you've got the inclination, I've got the crime. There's a lot of opportunities, if you know where to take them. There's a lot of opportunities, if there aren't, you can make them. Make or break them
Starting point is 01:27:20 I've got the brains You've got the looks Let's make lots of money Let's make lots of money Let's make lots of money I mean, uh, uh, yeah,
Starting point is 01:27:50 uh, and uh, I've got the brains. You've got the looks. Let's make lots of money. You've got the brawn. I've got the brains. Let's make lots of money. You can say I'm single-minded.
Starting point is 01:28:31 I know what I could be. How do you feel about it? Come take a walk with me. I'm looking for a partner regardless of expense. Think about it seriously and know it makes good sense. There's a lot of opportunities if you know where to take them. There's a lot of opportunities. There's a lot of opportunities.
Starting point is 01:28:56 If there aren't, you can make them. Make or break them. I've got the brains. You've got the looks. Let's make lots of money. You've got the brawn. I've got the brains. Let's make lots of money.
Starting point is 01:29:15 I've got the brains. You've got the looks. Let's make lots of money. You've got the brawn. I've got the brains. Let's make lots of money.
Starting point is 01:29:32 Let's make lots of money. This show is part of the Frog Pants Network. Yes. Get more at frogpants.com. This nightmare will never end.
