The David Knight Show - INTERVIEW Epic Tech Failures — We Hope

Episode Date: April 2, 2024

Neuralink, the "internet of brains and bodies," AI "inbreeding," "self-driving cars," the EV push, and grid deconstruction — tools of technocratic slavery that have some major issues to solve for our... wannabe masters. Goat Tree joins.

Find out more about the show and where you can watch it at TheDavidKnightShow.com
If you would like to support the show and our family, please consider subscribing monthly here: SubscribeStar https://www.subscribestar.com/the-david-knight-show
Or you can send a donation through:
Mail: David Knight, POB 994, Kodak, TN 37764
Zelle: @DavidKnightShow@protonmail.com
Cash App at: $davidknightshow
BTC to: bc1qkuec29hkuye4xse9unh7nptvu3y9qmv24vanh7
Money is only what YOU hold: Go to DavidKnight.gold for great deals on physical gold/silver
For 10% off Gerald Celente's prescient Trends Journal, go to TrendsJournal.com and enter the code KNIGHT
Become a supporter of this podcast: https://www.spreaker.com/podcast/the-david-knight-show--2653468/support

Transcript
Starting point is 00:00:00 All right, welcome back. And joining us now is a guest we've had on many times, always enjoy talking to Goat Tree. He works in the cybersecurity area, and he is, I should say, a white hat guy protecting against the black hat guys. In other words, a business will hire him to try to find vulnerabilities and warn them before the bad guys do. So joining us now is Goat Tree. Good to have you on, Goat Tree. Thanks for joining us. Hey, good morning, David.
Starting point is 00:00:36 What's on your mind today? Oh, man, everything. You know, you've got a great show rolling today. And, you know, I was listening and I was thinking, you know, failure's good. It's really good. Because a lot of this that you were talking about has failed. And I hope it continues to fail. Even though they're promoting it like it's on the horizon.
Starting point is 00:01:04 Yeah. Oh, yeah. Absolutely. I'm happy to see these self-driving cars fail, but you know what? They don't care. There are several articles out from mainstream media in the last couple of days saying, well, whether it works or not, we're going to equip the semi trailers to be self-driving. It's like, you've already seen it doesn't work. You've already seen how bad it is in San Francisco and other places like that, and you've had to stop these experiments.
Starting point is 00:01:31 So now what you're going to do is you're going to fail up. I mean, this is ridiculous. They keep taking it up to bigger and bigger things to fail. Well, I mean, but they'll promote it as a success. I mean, you know, one of the things I've really got to chuckle about, and everybody's panicked about it. Well, you should be panicked about it, but you know, Elon Musk just did that brain chip thing. And he's promoting it as a success.
Starting point is 00:02:01 And one thing, it validates what we've been saying for years about it, to plug people into the grid. He put this chip into a man that was paralyzed from the neck down, under the premise that it's going to give him his motor skills back and do all this. The only thing that has come out of it so far is the guy's hooked up to the net, just like we've been predicting all along. Yeah. And the only result Elon's gotten from it was posted on X. I'm Elon's bot.
Starting point is 00:02:38 Yeah. He still can't move his arms. He still can't move his legs, but he's wired up to the net now. Yeah. Yeah. He can play video games with his mind. Isn't that a great accomplishment? Yeah.
Starting point is 00:02:51 That's a pretty radical, uh, game accessory, to open up your skull and put a chip on your brain so that you can play a game. I mean, that's amazing. Well, now I think the most successful part of that is, you know, how we've been talking about, how are they going to hook you up to the net? And then that's going to lead into your facts. But how is he able to transmit outside of his skull? I mean, that was one of the sticking points. And the only thing I could think of is the guy's got, like, the same antenna as these cartoon Martians, you know, with the raising things, in the old TV show
Starting point is 00:03:35 my favorite Martian where thanks about it it's come out of his head I don't know but you know at the same time as they move forward in this and of course here here's the vision coming from the rand corporation they said that um there's going to be an internet of brains by 2050 this sounds like the matrix i mean these people are trying to make science fiction come true and it's kind of interesting you know to see um writers like annie jacobson she's now out again talking about nuclear war but i've talked about her books on operation paperclip and on darpa and other things like that uh she gets access to these people in the pentagon and and and talks about you know what is coming up
Starting point is 00:04:16 But, um, it truly is amazing. Uh, she said that they'd like to get these science fiction writers and movie producers into the Pentagon and talk to them, because then that gives them a vision for what they want to implement. And she said when she went there with Gale Anne Hurd, who was the former wife of James Cameron, they were married and co-producers of the Terminator series. And then they got divorced, and he, as far as I'm concerned, he made a bunch of junk underwater movies and everything. But, you know, with Gale Anne Hurd there, uh, everybody was like, oh, I love the Terminator. I mean, they identify with the villains, and she said that's kind of the scary part of this. She said, first of all, they're all kind of, you know, when they go home, they've got a family and they've got a dog and they've got a TV, and they're kind of like us. But she said the bad thing about it is these people identify with the villains, and they're looking at these science fiction writers.
Starting point is 00:05:09 Give me some ideas of some things that I can do. Well, here we've got the Rand Corporation talking about an Internet of Brains. It looks like they've been camping out on the Matrix. Mm-hmm. And here's another fail. Okay. Now, you think about this. Google AI.
Starting point is 00:05:27 They're presenting the founding fathers as African Americans. This is not AI. And to me, it's just flat out fraudulent. It's a failure. It's programmed. It has nothing to do with AI. They're trying to influence people under the guise of AI. That's the public access to their AI. I don't know what's in the back room. But to me, if I had a staff
Starting point is 00:06:00 of programmers that did that to my AI, everybody would... It was a perfect talk. But you know, it's perfect because it was a picture of what they were doing with text, you know, and yet when you do it with a picture, everybody gets it instantly. And now they're talking about the fact that they can do voice cloning so quickly that they only need 15 seconds of your voice in order to clone it. Okay, well, I guess maybe if Google's doing this, everybody's going to sound
Starting point is 00:06:30 like samuel l jackson right hey maybe they all sound like me they'll get lucky you know yeah they're gonna they're gonna be misappropriating voices all over the place this is such such dangerous technology that only the government can have it. This is Sam Altman, right, who is always, this guy is dangerous. But he's going to give that technology to the government to clone our voices, but you and I can't have that. It's just too dangerous for us to have that kind of technology. You know, once again, I think maybe it's time someone start calling failures
Starting point is 00:07:07 a good thing. Because have you seen this technology? I haven't. I mean, I hear about it. Yeah, I mean, but I haven't seen anything. And it's great cover for these politicians, you know. They can say, well, that wasn't me, that was AI. That's right. Trump was saying that when they were doing all of this stuff: all these clips of me saying that, that's all AI. He's already using that as an excuse, as an alibi. He's like, no, I think that was you. So, you know, it's like everybody's talking about failure.
Starting point is 00:07:42 I used to be totally anti-failure, but now I'm embracing it. It's like failure is a good thing with what these people are doing. Yeah. Well, you sound like me. We start out in technology, and it's like, yeah, this stuff is great. And then we start seeing how it's being monopolized by the government and by the military industrial complex and DARPA and all the rest of the stuff, and we start rooting for failure.
Starting point is 00:08:12 You know, I've got a great story for you, and that goes back to your days at TI. Um, you remember, back in the 80s, or even in the 70s, man, if you had a calculator, a handheld calculator, you were top of the food chain. And one of the first things I did, uh, I was working in radio at the time. I was a techie, but, you know, back then tech didn't pay anything. You were looked at like the village idiot. So I was working in the radio station, and with my first paycheck, I bought a TI handheld calculator, about the size of a credit card. And, you know, I whipped that thing out. Man, I'm the biggest nerd in the room.
Starting point is 00:08:54 And it was instant respect. Ooh, ah. That and your digital watch. Well, I didn't go for the digital watch. I just needed a calculator. My mother, who passed, well, it was last year, and I was cleaning her house out, and I found that calculator, because I gave it to her after there were bigger and better things. The more zeros and the more commas you could calculate,
Starting point is 00:09:24 the bigger geek you were. So I gave her my old one, and I found that. And it has to be every bit of 45 years old. I turned it on. It was working. I'm like, you know, Ross Perot and David got it right the first time. But, you know, what they're doing with technology now, it's just setting up for failure. It's not doing anything like we used to be able to do, where you had longevity to it. Yeah. Yeah. And it
Starting point is 00:09:57 was a big leap. You know, when I got into engineering, they had just, um, approved for use, for the students, they could use calculators instead of a slide rule, instead of having to carry around with you a slide rule and a book with tables and extrapolate between them. And so, you know, it was like, oh, this is great. And it really was, you know, and it was very useful and helpful. Uh, and so they had a TI programmable and they had an HP programmable. I think HP has put itself out of business now, but I got the HP. It had RPN in it.
Starting point is 00:10:29 Remember that? Reverse Polish notation, basically. It was a stack. And so you do the operators in a different order and everything. But it was a lot of fun, and you had a great, great time with that stuff. And I remember going into classes, and they had these giant slide rules on the wall, a couple of them, that were maybe about five or six feet long, so the professor could demonstrate how people had to do it. And I was like, boy, I sure am glad I didn't start this a year or two earlier. You know, it was great timing to miss out
Starting point is 00:11:02 on that stuff. But, yeah, it used to be that technology was something that was very useful and directed to you. I remember the other thing that was happening right about that time was the revolution in personal computers. And you were involved in that as well. But, you know, the thing is, they had centralized computers, and it was such a hassle to use them. And then you had computers that would sit there and wait for you, instead of you sitting there waiting for the computer. But now it's completely the other way around. They're using technology in every regard to surveil and to control us, not to serve us. That's right. You know, you said that, but I was employee number 30 at Compaq back in the day.
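[Editor's note: the RPN that comes up here is simple enough to sketch. In reverse Polish notation, operands pile up on a stack and each operator consumes the top two entries, which is why those old HP calculators needed no parentheses. A minimal illustration, not how any actual calculator firmware worked:]

```python
import operator

def eval_rpn(tokens):
    """Evaluate a reverse-Polish (postfix) expression.

    Operands are pushed onto a stack; each operator pops the top two
    entries, applies itself, and pushes the result back.
    """
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}
    stack = []
    for tok in tokens:
        if tok in ops:
            right = stack.pop()   # top of stack is the right-hand operand
            left = stack.pop()
            stack.append(ops[tok](left, right))
        else:
            stack.append(float(tok))
    return stack.pop()

# "3 4 + 2 *" is postfix for (3 + 4) * 2
print(eval_rpn("3 4 + 2 *".split()))  # → 14.0
```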
Starting point is 00:11:46 Wow. And, you know, what we built back then is now what we call smartphones. But I remember one day, and this is a great story, because this is where they're going wrong with technology. This is before Compaq, you know, really hit the market. And we were trying to build the ultimate machine. And we did a lot of things. We smoked things we weren't supposed to smoke back then, but it helped us in the lab.
Starting point is 00:12:21 So we got the bright idea, let's shock test the prototypes. Well, we grabbed about four of the prototypes and went up to the roof, and we were throwing them off the roof, shock testing them. Rod Canion, the CEO, drove up. I swear I've never seen a man have a nervous breakdown in the middle of a parking lot. But of those computers that we threw off the roof, four of the five survived. Oh, no. Can you imagine throwing any kind of electrical device off the roof nowadays
Starting point is 00:12:54 and expecting it to survive? That's right. Oh, that could have been a marketing thing right there. It's not a bug, it's a feature. You know, we can drop these things off the roof and a certain percentage of them survive. Well, the vice presidents walked in, and we were bowling with them. I mean, we were using them like bowling balls to see if they'd survive. But, you know, nobody does that anymore.
Starting point is 00:13:27 It's like, it's a piece of junk. You drop your iPhone on the floor, it's going to break. And, you know, this is the type of stuff, the longevity of things, as opposed to what we seem to be doing now, which is totally wrong. You think about these brain chips. You get one of these put in. You know, while I'm on that subject, I remember you were talking about DARPA.
Starting point is 00:13:58 I was part of the NEMEX program, which was the big web server, the one that looks at everything on the web, it searches everything on the internet. Well, at the time, parallel to it, was a DARPA brain initiative, and I had the clearance. In these type things, where, you know, you're not talking about that, you slip over there and read and watch and see what worked and what didn't. When I'm talking about these chips and what they're doing, I already had, this was 2013, 2014, I already had a glimpse of what was coming.
Starting point is 00:14:39 And something dawned on me with these COVID shots. You know how they're pulling all these sinews and this white rubbery stuff out of these people's veins and stuff? It dawned on me. This is the threading that will go from the chip to the person's antenna,
Starting point is 00:15:01 which, as it stands right now, will be, uh, the tattoos. Wow. Wow. And that nanotech lost control, or they lost control of it, something lost control, and it didn't stop building at thread status. It just kept building until it plugged these veins and arteries. Wow. Uh, this nanotech is in these people from these shots. It's almost going rogue. Yeah.
Starting point is 00:15:35 Yeah. As a matter of fact, that's kind of interesting, because you look at their key technologies: genetics, robotics, artificial intelligence, and nanotech. And the acronym to remember that is GRAIN. But they talk a great deal now about genetics, always talking about genetics, always talking about robotics, always talking about artificial intelligence. But what they don't talk about is nanotech. And I think, you know, that's the dog that's not barking right now. The fact that they're not talking about nanotech, that's a big deal. And it's interesting that you would mention that, because I think that's very significant. Uh, we had, um, in Japan, they threw out first 1.2 million, uh, doses of this genetic code injection, and then they threw out an additional million. And what happened was, and I
Starting point is 00:16:18 don't know if these things were not properly refrigerated or what it was, but they started noticing black particulates in it that would react to magnets. And so they just trashed, like, two and a quarter million doses of this stuff, because they had magnetic particulates come out of the... A precipitate. Yeah, that was that graphene oxide. Yeah, exactly. Yeah.
Starting point is 00:16:43 And, uh, but you know, me, I can't do it. I mean, if someone waved one of these big fiber things in front of me, I'm going to squeal like a 12-year-old girl and go, ooh, get it away. Well, they're pulling these things out of dead people, because it killed them. You know, like you said, that's always the thing about nanotech. They're talking about what happens with nanotech, self-reproducing and very small. What happens if we can't stop it?
Starting point is 00:17:15 And it just runs away. And they say, well, you just get this gray goo. And maybe you're right. Maybe that is what is happening with these. Well, you know, it has run away. But these undertakers and people that are dealing with this, well, I'm thinking I would like to see, and it would be the smoking gun and the be-all, end-all in proving it.
Starting point is 00:17:33 This is it: they send an electrical charge to it and see if it will conduct electricity. Yeah, that's right. You see, if it will handle a little 9-volt charge, or even smaller, it's a minuscule amount of electricity, but something that could verify it. That would leave no doubt that this was this threading system that they have planned for these brain chips. It'd be interesting to know what the resistance is across it, wouldn't it? Yeah, it would.
Starting point is 00:18:09 And that's my point. If it's conducting electricity, you've got some explaining to do, Pfizer. No, they don't have to explain anything. They'll just say, it's rare. It's not happening. I don't care what you see. Don't believe your eyes. I'm the FDA.
Starting point is 00:18:24 I'm Fauci. I'm Trump, or Biden, or whatever. I'm going to tell you what this is about. Uh, yeah, it truly is amazing. And yet, you know, getting back to this brain chip thing and what the RAND Corporation is now saying, you know, eventually, when they're ready to roll something out, they'll finally admit what they're doing. You've seen this stuff happening for a very long time. And, you know, getting back to Annie Jacobsen, she was talking about some of these agencies. She said, you know, you have this one agency, and it was involved in launching satellites and all this kind of stuff. Huge agency.
Starting point is 00:18:54 And nobody knew it existed for decades. You look at the NSA. For the longest time, people would say it's No Such Agency. It doesn't even exist. We're not even going to acknowledge its existence, that type of thing. But they keep these, even entire agencies, secret. One guy was bragging about it. He said, we had, um, 10,000 people that were involved in this project, and not one person leaked it. And he said, well, actually 10,001, because we had Tom Clancy involved as well.
Starting point is 00:19:20 Tom Clancy. You know, Tom Clancy, this is the guy who worked with Steve Pieczenik. And he was working with this guy, he's keeping this big project with 10,000 people, he's keeping it secret. It's like, what is going on with it? Anyway, the bottom line is, when they start talking about this stuff, it means that they're right on the cusp of putting it out. And so now the RAND Corporation is talking about an internet of bodies. We've always talked about how, you've talked about this for a very long time, all these different medical devices and things that they put on, and how they can use that to hook us
Starting point is 00:19:54 into the internet of things, and how we all become more things that they can monitor. And so they talk about the internet of bodies, and they say, well, then, you know, if we've got an internet of bodies, we can have an internet of brains as well. Elon Musk is actually even talking about this, Goat Tree, talking about how social media for him is kind of like a hive mind.
Starting point is 00:20:15 And maybe that is a key part of why he got Twitter, because maybe it isn't just about building his public image with conservatives as Mr. Free Speech or something. But maybe it is about this hive mind mentality and being able to monitor what everybody is saying. Maybe that's the key part of it. Maybe that's the hive mind that he is trying to set up. You've got the answer. You just haven't applied it to the right application yet. That's AI. You see, machine learning, remember? Here's the thing.
Starting point is 00:20:54 This is what it's going to take to drive AI. I'm going to put my cutoff date at year 2000 after Bill Gates failed to do Y2K. I love the reminder of that. His first virus pandemic didn't work out. Neither did his second one. Keep going, Bill. Smartest man in the room.
Starting point is 00:21:18 Um, let's hope that their surveillance stuff works as well as his search function on Windows. I'm telling you, man, that's what I'm saying. I tell you, it's good. Yeah. But if you took everything that was ever said, published, or put on the internet, and put it into an insanely large, uh, library to power the AI, AI would be giving you answers by consensus. Okay.
Starting point is 00:21:58 You just said Elon's trying to turn Twitter into, basically, an AI forum. You're right. What better bang for your buck than to buy the largest social media platform there is? You're limited. Well, except if you want to pay for it, you can post more,
Starting point is 00:22:13 but you're limited to a set amount of characters. It's perfectly tailor-made to feed AI. So, you know, you could ask AI, what is the world's favorite color? And it could access all these, uh, trillions of replies referring to colors and come up with the one that is primarily the most favorite. And then it would lie to you about it, because it's biased.
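[Editor's note: the "answers by consensus" idea described here is, mechanically, just majority voting over a pile of scraped replies. A toy sketch with hypothetical data, not any real scraping API:]

```python
from collections import Counter

def answer_by_consensus(replies):
    """Return the most frequent answer among scraped replies,
    plus the share of replies that agreed with it."""
    counts = Counter(r.strip().lower() for r in replies)  # normalize casing/whitespace
    answer, votes = counts.most_common(1)[0]
    return answer, votes / len(replies)

# Hypothetical replies to "what is the world's favorite color?"
replies = ["Blue", "blue", "green", "Blue ", "red", "blue", "green"]
print(answer_by_consensus(replies))  # → ('blue', 0.571...)
```

The "then it would lie to you" step the hosts describe is exactly what this sketch does not do: any bias would have to be bolted on after the counting, which is the point being made about pre-programmed results.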
Starting point is 00:22:46 Well, sure. You could do like Google and just pre-program it. You know, like, hey, let's just program it to give out the results we want. Exactly. Yeah. Now, that's not AI.
Starting point is 00:23:01 I mean, I'm sorry. That there is, well, I can't say it on air. I mean, you know, everybody turns around and says, did he really say that? But no, Google needs to be called out for this. Yeah. Yeah. It's pure Bolshevik, you know.
Starting point is 00:23:18 it's a Marxist there's the word I was looking for it rhymes with it anyhow yeah that's right that's right. That's right. Well, you know, when you look at what they're talking about, though, and as you're pointing out, they've got to scrape all this information to train AI. And then they've got to take the extra step of inputting a bias into it, you know, to get it to give you the first. It's got to know what the right answer is.
Starting point is 00:23:44 And then it's got to be trained to lie to you. So they've got a task ahead of them. But when they grab all of this information, it's now even being talked about by the Wall Street Journal. They're calling it inbreeding, whereas in the past, other people have talked about it, they called it kind of cannibalism. And they made the analogy to, uh, mad cow disease. You know, you feed cows other cows, and after a while they get this brain disease. Or humans, uh, cannibals, they get Creutzfeldt-Jakob disease, which is the human equivalent of mad cow. But they said, you know, the artificial intelligence has to feed on, uh, human content
Starting point is 00:24:23 on the internet. And the problem is that it's pumping out so much content. Once it starts feeding on its synthetic stuff, once it starts bringing in artificial intelligence stuff instead of human stuff to use, then it starts getting really stupid in just a couple of generations. It's just like mad cow disease. And so they're calling it that now, the Wall Street Journal. I talked about this months ago.
Starting point is 00:24:50 Now the Wall Street Journal is talking about it. Instead of making the reference to mad cow disease, they call it inbreeding, you know, kind of comparing it to the royal family, I guess. But either way, they've got a real issue, because they're pumping out so much content, they're going to be polluting their own data source. One could never have enough inbred silicon, you know. Yeah, that's right.
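[Editor's note: the "inbreeding" failure mode discussed here, models degrading when trained on their own synthetic output, can be illustrated with a deliberately crude toy. No real model is involved; the "model" below is nothing more than resampling its own training corpus, so no new phrase can ever appear and diversity can only shrink across generations:]

```python
import random

random.seed(0)

def next_generation(corpus, size):
    """Simulate training on your own output: the next corpus is just a
    resample (with replacement) of the previous one."""
    return [random.choice(corpus) for _ in range(size)]

# A "human" corpus with 12 distinct phrases.
corpus = [f"phrase-{i}" for i in range(12)]
diversity = [len(set(corpus))]
for _ in range(100):                       # 100 generations of synthetic feeding
    corpus = next_generation(corpus, len(corpus))
    diversity.append(len(set(corpus)))     # count distinct phrases that survive

print("distinct phrases, gen 0 vs gen 100:", diversity[0], "->", diversity[-1])
```

Because resampling can only repeat existing entries, the count of distinct phrases is monotonically non-increasing; real model collapse is far more complicated, but the direction of the pressure is the same.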
Starting point is 00:25:21 That's what it's going to be. And you talk about failure being good. That may be the thing that saves us from artificial intelligence: the fact that it's inbred, or it turns into this cannibalism thing. If you stop and think about it, you could make the comparison of AI being zombies. You know, got to feed on brains. You know, brains, I need more brains. That's what your brain chip may evolve into.
Starting point is 00:25:54 I was trying to remember, uh, in ancient Egypt, they used beetles for when a person passed away, upon your gray matter. What was it, the scarabs or something like that? There's a scarab beetle. Yeah. I didn't know that they ate brains. Well, yeah, they're really beautiful. They're rainbow colored and really beautiful, and they have a really, uh, ugly existence. And that may be what we need, digital scarabs, you know. That's right. Instead of Trojan horses or viruses, we could have scarabs, uh, for the AI. Maybe that'll be a thing.
Starting point is 00:26:30 Maybe you can help make that come true. Who knows? I don't know about that. But, you know, with the wall of crap that we're having to put up with, I mean, it's like, when you brought up the ship that ran into the bridge earlier, you don't hear any more about it. It only happened like two or three days ago. That's old news. It's something they've already moved on from. Can you imagine how old the news is going to be when it's a decade
Starting point is 00:26:55 from now and they still haven't built a replacement? By then, people will forget there was even a bridge there, as short as our attention spans are getting. They'll forget that Baltimore was there, because Baltimore will disappear. That may not be a bad idea at the moment. Well, except that they're all going to relocate somewhere else. It's like smashing an anthill. The problem is they're going to reconstitute themselves somewhere else that may be closer to your house. Maybe we could rename it, like, Hollywood or something.
Starting point is 00:27:29 If they ask, we don't remember Baltimore, but Hollywood, only in Maryland, is really a nice place. But all this, this information overload we're under, it's almost got to the point where the people that try to keep up with it... Because I was thinking, okay, how am I going to do your show? You know, what are we going to focus on? And there is so much out there right now, it's overwhelming. If you tried to focus on it, you'd almost feel like a schizophrenic. Oh, I know. Tell me about it. That's my wife. You think about what that's going to do to AI.
Starting point is 00:28:13 I mean, how can it process the amount of information, disinformation, and general nonsense that's being hurled at it on a daily basis? Yeah. Yeah. I mean, that's true. Well, you know, when you look at it, I think what is really scary, too, is, you know, not only is it their encroachment into our minds and their intention to, uh, you know, take over our minds, literally, connect us to their internet. Because that's a two-way street. You know, if you can project your thoughts, guess what, they can project stuff into your thoughts as well. Right. So it's kind of a two-way street there. But, you know, the other part of it is, uh, robotics
Starting point is 00:28:54 and the merger of robotics with artificial intelligence. I played, a couple of weeks ago, a clip from this company called Figure, Figure AI. And they had a very, very impressive robot. I mean, this thing had smooth moves, you know, and not this jerky kind of stuff like you see with Disney animatronics or anything like that. But, I mean, it had a great deal of dexterity. It could pick up, you know, soft uncooked eggs and things like that and not break them. It had a pause.
Starting point is 00:29:30 You could interact with it, because it had ChatGPT as a brain. But it had incredible dexterity, and it could process and learn from watching humans how to do tasks. It could interact with people and kind of draw conclusions and stuff. There was a little bit of a hesitation after they talked to it, but it would come back, and it even had, like, this kind of, you know, California surfer dude way of speaking. They even programmed it with that, to make it seem more human, even though you're just looking at a faceless glass head. Uh, but, you know, when you have something like that... Now, these people, the guy who did Figure, Figure AI, the CEO, is now saying that we're going to have these things everywhere. They're going to be ubiquitous.
Starting point is 00:30:26 They're going to be like your cell phone. And I started thinking about that. What could possibly go wrong with that, Goat Tree? Everybody will forget how to think. Just go look at your glass head, and it'll talk to you. It has answers. And this thing is going to follow you around. This is worse than Orwell.
Starting point is 00:30:43 Orwell envisioned a big brother. I that's going to constantly be watching it. This thing is going to follow you around. This is worse than Orwell. Orwell envisioned a big brother eye that's going to constantly be watching you. This thing is going to be mobile. And it's going to be metal and strong. I mean, it could throttle you if it wants to. But, you know, it's going to be your constant companion following you around. Instead of having your spy device in your pocket, your big brother device, the smartphone in your pocket, you're going to have some minder bodyguard robot, or more like a prison guard, who's going to be following you around everywhere. Yep, it goes
Starting point is 00:31:15 back to the old saying, we're alone together. Yeah, that's right. It's going to be, again, kind of a babysitter, kind of a minder, a constant surveillance there. When they had the Snowden tapes, um, the Snowden leaks, uh, it was only reported in Germany, in Der Spiegel. They didn't show it here in the U.S., so I've shown it over and over again. And it was three slides, and they said, who would have thought in 1984, and they show the Super Bowl commercial. And the next slide is Steve Jobs holding up a smartphone: who would have thought in 1984 that this would be Big Brother? And they show him holding a smartphone. And then they
Starting point is 00:31:54 show everybody lined up in an Apple store to buy it. And they said, and that the zombies would line up to pay for it themselves. And so I guess they're going to get people to pay for these robots themselves. Uh, the early adopters will think they're so cool. And then they'll probably heavily subsidize these things so that everybody can have their own prison guard. Don't you think?
Starting point is 00:32:14 Well, you know, I mean, people, uh, you know, I hate to say it. Maybe I'm just going rogue.
Starting point is 00:32:21 I don't know, but people, they, they start to drumbeat of this cool factor. And they condition these people to you must have it. Yeah. And by the time it actually rolls out, it's like us taxpayers in the U.S., we're funding our own demise.
Starting point is 00:32:44 And we're happily doing it. It's the same thing with the great big glass heads. I don't even know how you're going to cart that thing around, but anyhow. Well, it can walk. It can walk. It can probably do somersaults on top of you as well. Pound you into the ground. It's like one of these Boston Dynamics bots that are out there.
Starting point is 00:33:12 Well, yeah, some people, they'll be going, I can't wait. And they'll probably even turn them into some... oh, man. I don't even want to think about that. I'm about to perv out thinking about it. Hey, like, give me a big sexy one. You know, like, now we're going there. Yeah. Yeah. Oh yeah.
Starting point is 00:33:27 There'll be all kinds. I mean, you know, you got the people who are into a little fantasy stuff, the furry type of people. They'll probably give them like little Pokemon things to follow them around; they'll dress them up in all different kinds of ways. But it'll be... yeah. Oh, yeah. If they're smart... you remember, you go back to look at the Segway, right? When the guy, what was his name?
Starting point is 00:33:48 Was it Dean Kamen or something? I can't remember his name. Anyway, he got a whole bunch of Silicon Valley executives together to show them the Segway. And Steve Jobs said, that's great, that's really amazing. But he said, you've got to make it look cool or people aren't going to buy it. And he said, furthermore, you've got to control the release of this thing, because if you don't, you're going to have some idiot have a really bad fall on it, and that's going to kill the marketing buzz on it.
Starting point is 00:34:19 Well, it turned out that idiot was George W. Bush. And it didn't look cool. It looked really stupid. You know, I rode one once. Karen and I rode all around Chicago once on these things. And we looked so incredibly stupid. But we were having so much fun, we didn't really care. But because it is a blast to ride these things, you just kind of lean.
Starting point is 00:34:39 And you're going about 12 miles an hour in one direction. But they do look really stupid, and that was what he was saying. So if they've got somebody that's smart, like Steve Jobs, they're going to make them look cool. They're going to make them look sexy or playful or intimidating or something like that, so the rappers can have a couple of these things following them around all the time. Of course. Then you have your collector's edition so that you're building up the demand,
Starting point is 00:35:08 or, you know, this is a one off. It's all the one in existence. Yeah. Watch him build for it. Like rare art, you know, it's yeah. Yeah. It's just stupid. It, it, it, it, the things that they're trying to push off on, on the world.
Starting point is 00:35:24 I mean, I'm rooting for failure now. I mean, I no longer want to build indestructible stuff. I'm rooting for failure. Yeah. I hope the big glass head fails or whatever that we're talking about with that invention. Yeah, the figure bot. Well, talking about failure, you've got all these fast food companies now. The new rage is for them to add artificial intelligence.
Starting point is 00:35:49 And I wonder how artificial intelligence makes the food taste. Does it taste any better? I haven't been to a fast food place in 10 years. I could not tell you. I don't even like driving by them because, you know, it's like everybody drives crazy getting in and out of there. Oh, yeah. And that's what they're doing. They're making them all drive-thrus.
Starting point is 00:36:12 Right? So even Chick-fil-A is setting up now; they're opening up restaurants that are drive-thru only. And now this is an article from the Wall Street Journal, and they're talking about this corporation, Yum, that owns Taco Bell and Pizza Hut and, you know, several of these.
Starting point is 00:36:33 yeah and so they're they're saying uh they're going to start setting up changing their restaurants of kentucky fried chicken they're going to set it up as a drive-through type of thing so everything is about drive-through but i thought they were also the same people that are trying to ban private cars how's that going to work i'm thinking that it'll be the bike lane yeah it is it is crazy uh but yeah that's what they're doing everywhere and i I keep looking at this, but I think it's a good example of how these people are not really, they're just chasing fads. You know, it's like a corporate fad to add AI to stuff. And it's not really going to help the quality of their fast food at all. Maybe the AI could come up with something that's more addictive or something, you know, or more unhealthy.
Starting point is 00:37:23 But it's not really going to help the food. And then it comes up with, oh, let's just do drive-thrus, when they're not paying any attention to the fact that Biden and company are trying to stop all the cars, including the electric cars. None of this stuff really makes any sense. You know, if you conceptualize what these guys are saying: no one's going to have a job. No one's going to have a car.
Starting point is 00:37:48 How are you going to pay for that? How are you going to acquire it? And then how are you going to pay for it? That's right. That's right. I mean, you know, they're making themselves obsolete. Failure's good. Mm-hmm.
Starting point is 00:38:00 Mm-hmm. Maybe global health will rebound and people will become healthy because they can't access the yuck, or Yum. Yeah, yuck is more like it. But yeah, we see in California that the $20-an-hour minimum wage for all of the fast food restaurants just went into effect under Newsom. And so you know that this is going to push replacement of the employees with robots and other things like that. And as you're saying, who's going to buy their food? Right. Already a lot of these fast food chains are saying, well, we're seeing a drop-off.
Even in the cheap fast food stuff, we're seeing a drop-off, because a lot of our people can't afford it anymore. And this is only going to raise the price up, so they're going to fire people and replace them with robots. So again, are you going to have robots going through the drive-thru to get something to eat? No, they're not going to do that. Henry Ford said he wanted to have an automobile that his workers could afford.
Starting point is 00:39:04 They don't think like that anymore. They don't want us. The future doesn't need us. The future doesn't want us. And these futurists don't want or need us. Well, and then that brings us to digital currency. Okay, you don't need a job. We'll just replenish your wallet every month or whatever.
Starting point is 00:39:23 And, you know, suddenly, where are we at? This is something that kind of goes off the grid for a moment, but then brings it back into the subject. And this is the big three, maybe even the big four, that people don't realize... well, also with the vax.
Starting point is 00:39:46 I mean, you can even throw this into it. But the leaders in Washington, the leaders on the state level, the local level, the ones that are advocating abortions, they have Americans killing Americans. I mean, the next generation, you know, they're always about sustainability. Well, this is not sustainable. That's right. And
Starting point is 00:40:13 this is sociological engineers coming up with this stupidity, just like fast food. And then, if you do have a child that manages not to get aborted, and you raise it to five years old and you put it in the public school, they're doing their dead-level best to scramble the kids' brains with all this transvestite stuff.
Starting point is 00:40:42 I don't call it change gender. This, this is just stupid, it's mental illness. And if your kid does finally manage to survive all that mental
Starting point is 00:40:56 torture and manage not to come out to be self-loathing and you know, which restroom to use and if it's a boy or girl and what species it is. You know, they had messed up our society so bad. They're even trying to kill us off with the vaccines. That's right. Yeah.
Starting point is 00:41:23 Yeah. Yeah. They try to sterilize us mentally, and if they can't do that, they sterilize us physically. They can use our food. They can use the medicine. But it is a mass psyop of sterilization: to hate other people, not even just the other sex, the other gender; they want you to hate being around other people completely. You
know, I had an article last week about a woman who has been wearing her mask now for four years. She doesn't want her grandkids around her; she said they're little germ monsters. They've got everybody so OCD with all this stuff that they hate other people. They just want to live their life alone, live their life through the internet, just hook up that way and live vicariously that way. She may have a point about kids. They are pretty nasty. Yeah, little incubators. But they have their strong points as well. But it is interesting to see this program and how they have gotten traction on it.
Starting point is 00:42:34 And of course, it was very key to have everybody locked down in 2020, to get everybody a taste of universal basic income. I think one of the amazing things to me was how long this went on and how easy it was to train people to take a handout. I mean, they pacified them, you know, just like you take a wild animal and train it to be hand-fed. And they did that to people very quickly. They've got the American public domesticated and so servile that, you know, I honestly think you could do pretty much anything you wanted to the general populace.
Starting point is 00:43:17 And if you've got a good enough excuse or narrative, the people will go along with it. You know, it's crazy. excuse are merited, the people will go along with it. You know, it's crazy. And then, if this isn't bad enough, this is our own government that is orchestrating this. This is not a government for the people or by the people.
Starting point is 00:43:40 They're bringing in replacements. They made such a mess of our society and what they've done to the citizenry, they're bringing in replacements they made such a oh yeah a mess of our society and what they've done to the citizenry they're bringing in replacements who don't have abortions and who don't go along with this stuff and we're going to have one massive culture collision here shortly if if these clowns in washington are not removed i mean yeah well they want to have different they'll have different cultures different languages different groups of people who as they collapse the system are going to start fighting each other instead of fighting them
Starting point is 00:44:18 that's the key thing they got to have us fighting each other instead of fighting them and that's what i think is really in store this year in the politics you know they want everybody focused around the two guys who have been rolling this program out for the last four years on us uh but we got to have one of the two of them and it's all about their personalities and look at how everybody this far away from the election how much everybody hates the guy in the other party from them. It truly is amazing. But that's what they're training us for. They're pushing us into an ethnic civil war as they pull down society. That's exactly what they want.
Starting point is 00:44:55 And, you know, here's something that people don't think about with this. And it's a very real scenario. No one seems to conceptualize this would be the end if we had a civil war. Because we're out here killing, enslaving, whatever they have envisioned, each other, creating a power vacuum i guess who's going to slip in there on that vacuum it's either going to be china mexico we don't have to worry about canada but uh maybe russia i think they have their own issues at the moment. So, you know, yeah, let's have the civil war. And in the end, it's not going to turn out the way people envision. No, no. But if it attracted our politicians and bureaucrats, it's not going to turn out the way they envision.
Starting point is 00:46:01 That's right. You know, Boy, you're good here. Well, we live in interesting times, and they're going to get even more interesting at an accelerating rate, I think, as we head into the last. This next term of whoever it is that's president, they're going to take us right up to the end of the fourth turning.
Starting point is 00:46:19 And they're going to take us through collapse. Either Biden or Trump could do it or Lala Harris. They'd be perfect for this role. Any of them. They've already gone through the vetting. They already know these people can perfectly manage to collapse our society. So that's what they're going to try to do.
Starting point is 00:46:37 And so that just, to me, it just underscores how we've got to get more focused on our own lives, how we've got to prepare for it. We've had already some forward-looking state legislators you know here in tennessee we've got senator nicely in utah they've just set up some new new laws that are going to allow the utah state government to start accumulating gold to protect themselves against the the financial weaponization of a centralized digital currency and other things like that, and to the possibility of a financial collapse. And so you've got a lot of different states. There's several of them here in the southeast that are looking at this. You've got Utah, which is just one guy has been very active there.
Starting point is 00:47:24 Ivory is his name. And just like nicely here, he's put in several bills that are going to make it so the state and individuals, it's easier for them to use gold and to have gold as a counter to what's going to be done by the Federal Reserve. So you have people who do see what's coming. But if we see it, we got to make those moves ourselves, really, to try to protect that. going to be done by the federal reserve so you have people who do see what's coming uh but if we see it we got to make those moves ourself really to try to protect that yeah but i mean if this goes off the way that they're saying you know it's going to have you're going to you know i hate to even say it you're going to have to cut off the head of the snake.
Starting point is 00:48:09 You're going to have to have your Ruby Ridges. You're going to have to have your Branch Davidian incidents. You're going to have to... oh man, I don't even like saying this stuff. You're going to have to eradicate, one way or another, and I don't mean terminally, this thought process of these politicians and these bureaucrats who have led us to this point. They have.
Starting point is 00:48:37 They have to be punished. They have to be removed from power. And things need to be reestablished to represent the citizens; it no longer does that. Yeah, I know we're supposed to be talking about tech, but you can look at the news feed right now. Instead, let's take Easter, the oyster rabbit. He did say oyster when he meant to say Easter. All hail the great oyster rabbit. There's all this
Starting point is 00:49:20 transgender stuff being pushed on us. For the life of me, I cannot make sense of it. Why would you take something that's less than 1% of the United States population and offend about 75% of your voters with it? I mean, I would say 50% to 75% of the citizens in the U.S. identify as Christian. So why would you run that risk, for less than 1% of the population, to offend the majority? Well,
Starting point is 00:49:55 I have an idea. I have an idea because I think this is fundamentally about, you know, making people say two plus two equals five. And as I said, around the broadcast, you have the two plus two equals five. That's said earlier in the broadcast you you have the two plus two equals five that's what the tyrants want to do as as solzhenitsyn said as orwell said they want
Starting point is 00:50:11 you to live by the lie they want you to lie to yourself to double think to say whatever they say you know once they once you accept two plus two equals five then they'll give you something else that you have to lie to yourself about and, and just blindly repeat whatever they say. But it's also, um, this, this idea that it ties in, uh, the, um, the hedonism, uh, that, uh, uh, you had with, uh, Orwell, um, and not Orwell, but, um, Huxley, you know you know, and so it ties all of that together. But it's a very useful construct. When you look at, you know, from the transgender stuff, they've also got the furry stuff. And I think that that dovetails into an augmented reality and people who want to just live their
Starting point is 00:51:02 life alone, living on virtual reality and living on the internet. That's what they're pushing people through. I think that's another aspect of this transgender stuff is to keep people away from having physical contact with the real world, keep them living in a bubble, keep them in a virtual reality where they can get on the internet. Their entire existence is on the internet just like ready player one they can be a rabbit their entire life they can be an oyster rabbit or whatever you know and i think that's a part of it as well what do you think well i'm tending to agree with you on it but this is so obviously the death of democracy everybody keeps screaming about
Starting point is 00:51:46 democracy yet you're taking one percent less than one percent of the population and you're elevating their importance over the majority oh yeah that's not democracy that's right that's right and if you have a leader in them, I mean, I'm pretty apolitical. I think they all are unfit for their office. But if you have a leader that is disregarding the beliefs of the majority, that person needs to be removed from office. Yeah. I'm not talking about religious.
Starting point is 00:52:25 I'm not talking about, uh, any, any of this stuff that they, they, they get you lost in the weeds. I'm talking about the concept. We're a Democrat Republic.
Starting point is 00:52:36 Well, okay. We're a Republic that runs off democracy. When democracy is not being respected, we've got a problem. Yeah. Years ago, newt gingrich and he was just a politician blowing in the wind and everything but at least he cared what the majority said i remember him talking about he said so you got i don't even remember what the issue was he goes so
Starting point is 00:52:55 you got 80 of the people who want this and you got 20 of the people who want that well as a guy who's a politician i want to go with the 80 But we don't have people like that anymore. You know, as you're pointing out, they really, they keep talking democracy, democracy, even though we've got a Republican, we've got a Bill of Rights and things like that. They hold democracy up as this great ideal, and yet they don't care. They don't care about the 80%. They're going to go with the 0.02%.
Starting point is 00:53:21 Well, I'll say this. If this podcast thing gig don't work out you really ought to run for uh congress to get what a senate just just make my life total nightmare in hell right i told you i told karen i said you know with this abortion stuff happening i'm tempted to run for congress uh and do the same thing that that uh, uh, that, that progressive liberal did and say, Hey, I'm running for Congress. I can show pictures of an aborted baby if I want to. And there's nothing you can do about it. Just make the whole thing about that. But I said, you know, I'd have to run as a Democrat because I don't want to make sure that I didn't get elected.
Starting point is 00:53:57 Well, I mean, if you did get elected, it'd be totally worth it because it would restore my faith in democracy. Yeah. Yeah. That would be an interesting thing, but yeah, we'll have to see maybe in the future years when things start to wind down, maybe I'll do that. Just have,
Starting point is 00:54:13 have the, have the campaign be a single issue thing and everybody just sending money to fund a, a, um, uh, commercials, you know,
Starting point is 00:54:21 somebody else can do it. You don't have to wait for me. We could, we could use your soft, scary face as a campaign poster, you know somebody else can do it you don't have to wait for me we could use your pissed off scary face as a campaign poster you know there we go that's it yeah well it's always great talking to you goat tree and and it always is a blast and it's great tapping into your experience here with technology because that's where they're coming at us with you know the it's a technological slavery and you and i have both watched this this thing turn 180 degrees around from the point at which technology was a great thing we loved it it was making our lives better and now it's being
Starting point is 00:54:57 turned into a system of slavery and they're putting these chains on us right now and people can't wake up to it it truly is amazing so it's always great to have you on thank you so much for joining us goat tree appreciate it well thank you very much david all right we'll talk to you later and uh folks thanks for joining us and don't forget uh david knight dot gold if you want gold is going up really fast but if you want to have something that's going to be an effective counter to this digital slavery, go to DavidKnight.Gold. Let me tell you, the David Knight Show you can listen to with your ears. You can even watch it by using your eyes in fact if you can hear me that means you're listening to
Starting point is 00:55:49 the David Knight show right now yeah good job and you want to know something else you can find all the links to everywhere to watch or listen to the show at thedavidknightshow.com That's a website.
