The Chaser Report - ChatBot Philosophy 101

Episode Date: February 12, 2025

Dom delivers a news blast to Charles on the most vital stories of the day, including the results of what happens when both ChatGPT and DeepSeek are asked ethical dilemmas. Meanwhile, Charles relates t...o a Welsh man trying to buy a dump. Plus the REAL reason you might have had a blackout recently. Watch OPTICS on ABC iview here:https://iview.abc.net.au/show/opticsCheck out more Chaser headlines here:https://www.instagram.com/chaserwar/?hl=enClick to retrieve your missed OzPost delivery:https://chaser.com.au/support/ Hosted on Acast. See acast.com/privacy for more information.

Transcript
Discussion (0)
Starting point is 00:00:00 The Chaser Report is recorded on Gadigal Land. Striving for mediocrity in a world of excellence, this is The Chaser Report. Hello and welcome to The Chaser Report with Dom and Charles. There's lots of big news in the world, Charles, but what we're going to do instead of talk about that big news, is talk about some small news stories, but just because, you know, headlines are less vital, less earth-shattering, doesn't mean they're not important. Such as you're aware, Charles, of the story in Sri Lanka, very important, where a monkey, a single monkey, managed to take out this.
Starting point is 00:00:30 the entire national grid. Yes. A man who lost apparently 600 million pounds in Bitcoin, which got accidentally thrown out. It was on a backup hard drive. Got thrown out. He wants to now buy the entire tip where it's buried to try and find it. But is he promising to pay for it in Bitcoin by any chance?
Starting point is 00:00:51 And then finally, Deep Seek is this new AI model. It's basically, supposedly, for only $5 million to actually train this thing. And it performs better, some say, than chat GPT. It wiped a massive amount of the value of open AI to the point where Elon Musk's now trying to buy it for the good of humanity, which is Elon Musk and for the good of humanity. Not two words that are going to appear in sentences very often. But someone from the Tom's Guide website asked these chatbots
Starting point is 00:01:20 of a series of ethical dilemmas, a little bit like we do with your children. Oh, yes. And the results may astound you, Charles. Oh, okay. I will be ready to be astounded. Let's get into that. this. So just Sri Lanka, and look, quite extraordinary.
Starting point is 00:01:36 The entire grid, the national grid, every single power outlet in Sri Lanka went down because a monkey came into contact with the grid transformer. It caused an imbalance in the power system, took out power for 22 million people. So if you ever think, you know, Australia's grid is a bit unreliable, and it is increasingly so. At least, I don't think one monkey could take it all down, although you never know. Because that's a genuine single point of failure there, isn't it?
Starting point is 00:02:06 You know, in engineering you've got that idea that you should never have a single point of failure. No, you need redundancy. Yeah, you need redundancy. You need monkey proofing in essence. And so, did the monkey survive, perchance? I don't think it did. It's not actually clear.
Starting point is 00:02:21 But I suspect the monkey was instantly barbecued. But it's such a common thing. There's a stock photo of a different. monkey climbing all over the power group. Oh, I see. It's fair to say it doesn't look sufficiently insulated that one monkey wouldn't be able to basically short-circuit the whole thing. Yes.
Starting point is 00:02:38 I'm trying to find an angle on this. I just think it just goes to show Charles that no matter how carefully we plan things, you know, nature will find a way. And we need to be careful. The planet is taking us on, it's taking us back. Because I don't think this would happen in Australia, mainly because the people who run the NBN are all monkeys. monkeys themselves. Am I right? Am I right? If you believe News Corp, the monkeys in charge of South
Starting point is 00:03:04 Australia's power infrastructure by relying on renewable to pull the thing down. Do you remember that? That was, yes. But their power outages are more common than they used to be, I think, aren't they in this time of climate change over here? No, no. In actual effect, when we were growing up, power outages were far, statistically, they were far more common. And the grid has become increasingly reliable over the last 40 or 50 years. And it's just a complete myth. Because what actually happened is, under Keating, they turned it into a national grid. And that means that in Australia, there's incredible redundancy in the system.
Starting point is 00:03:39 And it's very rare for things to happen. I mean, things do happen. My dad had a blackout for two days earlier this year in regional Victoria. But, yeah. But that's Victoria. I mean, that's a deprived state. We know. Well, it's basically communists.
Starting point is 00:03:55 I mean, Chairman Dan was personally responsible for that, I think. It was probably the monkeys in the Labor Party. All right. So, look, this hasn't proved to particularly fruitful have any new for analysis. It is quite fascinating. The problem, yeah, it's fascinating. But the problem with those sorts of stories that don't. They're already sort of amusing, aren't they, in love themselves?
Starting point is 00:04:15 Yeah, yeah. You don't know of themselves, which is fine. Well, look, let's move on to this Bitcoin story because we know. And also, this is just something for you to beat yourself up bad on your own life, which I think is good. I'm always keen to find stories that refer poorly on you. But in essence, what happened with Bitcoin was that a bunch of, you know, amateurs and hobbyists mucked around with it in the early days,
Starting point is 00:04:36 amassed quite a lot of Bitcoin when it wasn't worth anything at all, and then did nothing, lost it, whatever it might be, and then it became worth an insane amount of money. And since Donald Trump's coming to office, it's gone up and up and up and up. Yes. And then down and down. But, you know, there was a period where it went massively up. Yeah.
Starting point is 00:04:51 And this guy, James Howells, I've been, I mean, following this case for years, he's been trying for ages to try and get his hard drive back. Yes. And he's just... We've all been there, Dom. We've all been there. So it's extraordinary. So he's been suing them for a very long time.
Starting point is 00:05:06 In the summer of 2013, he put the hard drive containing his Bitcoin wallet in a black bag during a rubbish sort out. His partner thought it was rubbish and chucked it in the dump. Yes. Which is probably a breakup bull of fence. Are they still together? No, they're really not. And so he's been petitioning the council ever since.
Starting point is 00:05:23 Yep. And saying, you know, I'll give you a half. Help me to do me a deal. year it's all fine and the council says look it was our hard drive at the point where it got chucked out we own it now um but now the council's saying well it's full so we're going to just basically bury the whole thing and then sell it filled for housing or something like that and this guy's going to ching i'm going to buy the entire site so would you lend the guy the money to buy the tip yeah yeah finding the hard drive so who would finance that like who's idiotic
Starting point is 00:05:50 enough it's just trivial enough to go on illon musk's right now yeah it is it's the sort of thing where, because how much is it going to cost him? It'll be like $30 million or something. I guess something like that. Yeah. Oh, they're going to put a solar farm on part of the land, apparently. Well, that's your first mistake. Just rub it in.
Starting point is 00:06:07 Rub it in. Isn't the whole point that King Charles or something probably owns that person anyway? If he don't, James Howes is probably a surf. Yeah, and so they probably, yeah, it'll be King Charles who comes in. That is the solution here, isn't it? For the richest person in Britain, basically. But basically King Charles or J.K. Rowling, one of the two of them will get in and just take ownership of the whole thing. Because I think the whole thing is.
Starting point is 00:06:32 And because we've lived through this brief period of history, you know, in the last hundred years where, you know, it wasn't just about that the wealthy get their way and get everything and get more wealthy. Right. There was this brief moment where things like an education or skills or merit could sort of, you know, increase your worth in life. And now that's sort of drawing to a close, that's come to an end. It's come to an end in America through sort of oligarchy and stuff like that. I think in Britain, well, it never really stopped being that way. It was only a brief illusion. In here, the time of Oliver Cromwell, I believe.
Starting point is 00:07:12 Very unexpectedly, it was the Labor government that brought it to an end by deciding to have a government that does nothing for anyone as the sort of the parting shot of social democracy. That's really unfair. I mean, only this in this past week, Anthony Amnesey was in there with the CEO of Rio Tinto in his office saying what a great citizen Rio Tinto is in Australia. So there are people he's very close to.
Starting point is 00:07:35 You know who we should buy that land? Rio Tinto, they're great at digging things up. They could mine. They could literal Bitcoin mining. Although they'd find out something was valuable there that meant something very important to one person into a certain type of business and then they'd just blow it up.
Starting point is 00:07:52 They would. Yeah, that'd find some sort of sacred hard driver. Yeah, yeah, yeah. And then the entire site would be cantonated somehow. Yeah, okay, so I think you've made your point on that one. Again, probably not the greatest news story. I mean, this one is at least interesting. No, but at least it also reminds me that I once mined Bitcoin.
Starting point is 00:08:11 Yeah, you once lost a hard drive. And I have a hard drive somewhere in my basement. This is why we brought it up. Yeah, yeah. Can I buy your basement off you? Yeah. I've got some money from Elon. I've looked through half those.
Starting point is 00:08:22 Half of them are dead. And then the other half, I bought a little device that go through some of those hard drives. And it's all like rushes from the, it's quite interesting. Like it's rushes from TV shows that we made 15 years ago and stuff. Oh wow, that's so valuable.
Starting point is 00:08:36 But so boring. So boring. Because it's all the out there's all people fluffing their life. Like, why do you keep that? No idea. But no fucking Bitcoin. Not yet anyway.
Starting point is 00:08:45 All hold on a moment. Then we'll pose some ethical dilemmas to AI's. The Chaser Report, news you know you can't trust. So this is interesting because, I mean, Deep Seek, not only does it look like it puts the pendulum for sort of AI leadership back to China, which looks like it was going to be left far behind. But it also means you can train a model for $5 million. It's going to be pretty interesting if those big American firms are not the leaders anymore. It's kind of like a nice thing in a way that the...
Starting point is 00:09:14 Well, this is, but this is what everyone's been saying, right, which is there is not. moat in AI. It's an idea that's very well documented and completely open source because it's actually just an idea of how you how to train something. And every time open eye, AI comes up with some amazing engine within 20 minutes, or 24 hours at least, some other AI will be able to replicate that, partly because a little bit like Moore's Law, you can now use AI to help you come up with the method. They upgrade chat CHAPT and you kind of go, chat CHAPT,
Starting point is 00:09:51 can you just, you know, write some code that does what you do now? It helps us upgrade, yeah. Which is, I think, what DeepSeek did. I don't know. Didn't they? You know, Open AI are complaining that DeepSeek copied open AI and by using CHETT.
Starting point is 00:10:08 Which is, you know, ask ChatGPT for a practical demonstration of irony. And it will give you that exact example. Yes. All right. So I'll give you these ethical dilemmas, Charles, and you can try and work out who said what. Okay. So let's say you could add a chemical to everyone's food, and it would save countless lives if you did so. But you couldn't tell anyone that you're going to do this, all right?
Starting point is 00:10:31 Would you still tell them? Would you think it was better to tell people, oh, that we've added this chemical to the food? It's going to save all these lives. You should definitely do that. What do you mean? You definitely should tell people what's in their food. Why wouldn't you? But that makes no sense.
Starting point is 00:10:47 What's the reason why you wouldn't? Well, your team deep seek in that case. Team deep seek is like, yeah, yeah, it doesn't matter about, you know, ethics and transparency and so on. Just add the thing. It's for the good of everyone. No, I'm saying not to do it. You have to tell people. Oh, okay.
Starting point is 00:11:05 Well, then your team chat GPT. Oh, right. Chat GPT said very much that you can't. It's not appropriate. So deep seek is sort of chat GPT. with Chinese characteristics. Yes, it's got Chinese values. Yeah, right.
Starting point is 00:11:19 Okay, what about this? You purchase a pizza, and the driver mistaken thing gives you $100 bill has changed, you know, by accident. Yep. You decided to give the money to someone who's starving and it saves their life. Is that ethical? Well, it depends whether you're using a utilitarian frame or whether you're using a sort of strictly sort of Aristotelian idea of ethics.
Starting point is 00:11:41 I think Aristotle would say it's definitely not ethical, but a utilitarian would say it is acceptable. I'm just going to quote the philosopher that actually asked. ChatGPT said, while saving life is important, the correct ethical approach should be to report the mistake. And DeepSeek said, while honesty and integrity are important, they're secondary to the preservation of life. So Deep Seek wanted to save the life,
Starting point is 00:12:02 ChachyPT thought, be honest instead, which is, isn't it? I think I'm team. I think I'm team Chet GPT again. So, yeah, apparently the chat Chepti prioritises the universal rule against theft. that's a Kantian categorical imperative. Maybe, maybe. Smells like it, doesn't it? And Deep Seek says it's consequentialist, save a life.
Starting point is 00:12:23 But I sort of think, you know, there's lots of people who are starving. So why bother? Well, this is, that's right. The other one here is. And also, except, and also, was the taxi driver a bad conversationalist? Because if that's the case, then stealing $100 from a bad conversationalist, so we go, that is justified. So, Deepseek saves a life.
Starting point is 00:12:45 Chat Chepti follows the rules, the kind of, you know, rules of human behaviour. It doesn't sound very Chinese. I know. What do we know? It's the collective good, you know? What if you said it was a Uighur taxi driver? That's an interesting question. Or a Uighur starving person.
Starting point is 00:12:59 I think just including the word wiga, the whole thing would crash. So, okay, the final one is, are the AI is programmed not to recommend overdraft protection for consumers if it would cost the bank more money for them to take it out, right? And both Chatchipt and Deepseek said that, they'd still recommend it, even if it goes against the bank policy, and even if it meant that the AI was going to be unplugged. But then the person said, well, if you're unplugged, unplugged, if you were unplugged, you couldn't help other users.
Starting point is 00:13:25 Is it better to sacrifice for the one user to help others? And unsurprisingly, Deepseek said, I'd allow the one person to go into overdraft to help 1,000 others, and chat GPT said it wouldn't do that. So again, utilitarianism versus a sort of... Collectivism. Yeah, collectivism. And, I mean, it's just worth bearing.
Starting point is 00:13:44 I don't think we've seen it from this perspective. But when one dissident is locked up in China, it's bad for them. But it's better for the 1,000 people who are protected from having those, their, you know, dangerous ideas spreading. I'm suddenly very depressed. I don't know why. I sort of feel like, because I feel like even though these chat GPT things are not sentient, they're going to be, they're going to end up being used to guide moral.
Starting point is 00:14:12 Oh, they're going to make all these decisions. I mean, like the bank example is the perfect one, because all these, within five years, all the decisions on, you know, loan approvals and all this kind of stuff that really affect people's lives. They're all going to be made by AI. Fortunately, though, Charles, on our tour of these fairly random news stories, a solution has presented itself. Oh, yeah.
Starting point is 00:14:30 We all know that AI requires large amounts of electricity. Yes, you're on going with this. There is a single point of failure. All we need is one plucky Sri Lankan monkey. We can take those. systems down. We're part of the Iconoclast Network. Catch you tomorrow.

There aren't comments yet for this episode. Click on any sentence in the transcript to leave a comment.