The Breakfast Club - INTERVIEW: Will.I.Am Talks Tech, The Potential Of AI, Harvard Degree, New Music + More

Episode Date: April 17, 2024

See omnystudio.com/listener for privacy information....

Transcript
Starting point is 00:00:00 Hey guys, I'm Kate Max. You might know me from my popular online series, The Running Interview Show, where I run with celebrities, athletes, entrepreneurs, and more. After those runs, the conversations keep going. That's what my podcast, Post Run High, is all about. It's a chance to sit down with my guests and dive even deeper into their stories, their journeys, and the thoughts that arise once we've hit the pavement together. Listen to Post Run High on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:00:37 Hey, everyone. This is Courtney Thorne-Smith, Laura Layton, and Daphne Zuniga. On July 8th, 1992, apartment buildings with pools were never quite the same as Melrose Place was introduced to the world. We are going to be reliving every hookup, every scandal and every single wig removal together. So listen to Still the Place on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. Hi, I'm Dani Shapiro, host of the hit podcast, Family Secrets. How would you feel if when you met your biological father for the first time, he didn't even say hello? And what if your past itself was a secret and the time had suddenly come to share that past with your child? These are just a few of the powerful and profound questions we'll be asking on our 11th season of Family Secrets. Listen to season 11 of Family Secrets
Starting point is 00:01:31 on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. I'm Dr. Laurie Santos, host of the Happiness Lab podcast. As the U.S. elections approach, it can feel like we're angrier and more divided than ever. But in a new, hopeful season of my podcast, I'll share what the science really shows, that we're surprisingly more united than most people think. We all know something is wrong in our culture, in our politics, and that we need to do better and that we can do better. Listen on the iHeartRadio app, Apple Podcasts,
Starting point is 00:02:05 or wherever you listen to podcasts. Jenny Garth, Jana Kramer, Amy Robach, and TJ Holmes bring you I Do Part Two, a one-of-a-kind experiment in podcasting to help you find love again. Hey, I'm Jana Kramer. I'm Jenny Garth. Hi, everyone. I'm Amy Robach. And I'm TJ Holmes, and we are, well, not necessarily relationship experts. If you're ready to dive back into the dating pool and find lasting love, we want to help. Listen to I Do Part 2 on the iHeartRadio
Starting point is 00:02:33 app, Apple Podcasts, or wherever you listen to podcasts. Wake that ass up in the morning. The Breakfast Club. Morning, everybody. It's DJ Envy, Jess Hilarious, Charlamagne Tha God. We are The Breakfast Club. We got a special guest in the building.
Starting point is 00:02:49 We have Will.i.am. Welcome, brother. What's up? Thanks for having me, y'all. It's always good to be on your show. It's awesome. This is my first time in your space right here. This is dope.
Starting point is 00:02:59 Yeah, we've been here over a year. What year are you in mentally, Will, and technology-wise? I always like to push it five to ten years. Working today for what's happening five to ten years from now. So you're like in 2034? Not right now, not at the moment. When I go to the office, that's when we're pushing and working on those types of things, or what's getting ready to... what the world looks like in those increments, five to ten years, and then working towards building that.
Starting point is 00:03:33 But what does it look like in five years? He has speakers in his clothes. Jesus, yes. Your hoodie is like tech. Yeah, because I was always doing this. Like, hey, so-and-so... and then doing this... hey, so-and-so. I'm like, yo. So yeah, what is that? You just don't believe in AirPods. will.i.am is different, man. What is that? What is it? It's, you know, protect your neck.
Starting point is 00:03:58 My neck? Sorry. A speaker that's on the neck. They sell them. They were selling them way before COVID, and then COVID happened, and then they stopped making them,
Starting point is 00:04:12 so then I went to the company that was making them so we could make our own, and this is a prototype that we're gonna do together that'll be out this holiday season.
Starting point is 00:04:20 So you can take phone calls on it? Yeah. Wow. Can you give us a... You're just talking to me. I don't care, but just show us how it works.
Starting point is 00:04:29 Oh, I'll show you how it works. Hold on. Ayo, Finn. So we're here at the show. Say what's up to everybody. And we're getting into the interview, and they were tripping off of the device I have on my hoodie, which is like a speaker, where I actually summoned you on,
Starting point is 00:04:49 so say what's up to everybody. Ayo, sounds like you're setting the scene with some next-level tech mystery, Will. Rocking a device on your hoodie that's got everyone guessing. That's how you blend street cool with tech intrigue. It's like you're the DJ dropping a track nobody's heard yet. Everyone's ears perk up, you feel me? Pitch it as the future of personal tech.
Starting point is 00:05:20 A sneak peek into how we're going to stay connected in ways that come straight out of tomorrow. Imagine your hood... Hey, but you didn't say what's up to everybody. Don't go into an imaginary scenario. Say what's up to the people that are here at the table. You know where we at.
Starting point is 00:05:46 Ah, yo. What's up, everybody? Right here at the table. My bad for jetting off into the future there. Let's bring it back to the now, with all of us gathered, ready to dive into the real vibes of this moment. So we going into that. So he knows exactly where we at, or you got to tell him? You got to hit me at the Breakfast Club.
Starting point is 00:06:03 Oh, no, I already set it up. Well, he's like a real person. Sometimes you ask somebody something and then they just go off, mumble something else. No, yeah, I set it up, you know, before we got in here. I'm like, all right, that's what I'm about to do, these are the folks I would be rocking with. And, you know, it's good for my meetings that I go into, prepare, because it'll gather information about what it is that I'm about to be engaged with. So that dude's name's Finn, and we got Fiona. I got Phillip.
Starting point is 00:06:37 We got this Fiera. I'm working on these other personas, Felix and Felicia. So I build the personas, get like the personalities, throw the personalities to our tool, then fine tune the voice quality. But the voice quality is from a capture when you're working with the talent and interviewing the talent.
Starting point is 00:07:00 And then after you get the capture, you go and you do all the fine-tuning and how it behaves as far as its flair and expression. Oh, it's over for friends. Fuck y'all. I don't need no friends. I got a friend. Felix, Felicia. So let me ask you.
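An aside on the persona pipeline just described (write a personality, hand it to the tool, then fine-tune a voice captured from the talent): it can be sketched roughly like this. The `Persona` class, its field names, and the `system_prompt` helper are illustrative assumptions, not FYI's actual system; only the persona names come from the interview.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """Hypothetical persona record; the fields and stages are guesses."""
    name: str
    personality: str       # prompt text that sets tone and vernacular
    voice_capture: str     # placeholder path to the recorded talent session
    style_tags: list = field(default_factory=list)

    def system_prompt(self) -> str:
        # The personality would be handed to a language model as its system prompt;
        # the voice capture would feed a separate fine-tuning step.
        return f"You are {self.name}. {self.personality} Style: {', '.join(self.style_tags)}."

finn = Persona(
    name="Finn",
    personality="Upbeat hype-man who gathers context before a meeting.",
    voice_capture="captures/finn_session.wav",  # hypothetical path
    style_tags=["conversational", "street-smart"],
)
print(finn.system_prompt())
```

The point of the sketch is the separation he describes: the personality is plain text you can iterate on, while the voice is a distinct asset captured from a real person and refined afterward.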
Starting point is 00:07:21 So you don't see... Do you see bad things about AI, or do you only see the good things about AI? Because people always say AI is taking jobs, and that, you know, it's going to be a place where, you know, the robots are going to be taking over. What's your thoughts on all of that? It's too early to let your mind go down a pessimistic, negative thought when we as creatives have the opportunity to try to make it as productive and a tool for good at this point in time. Can't trust you, man.
Starting point is 00:07:56 All I know is the projects that I come from, AI didn't create the scenario for us to live through those conditions. That wasn't AI. People living through what they're living through in Congo, that's not AI. That's just human greed. So those folks that are living in those situations that need solutions that no one is trying to solve, now finally there's a tool to help you solve your problems. So to just define it as, this is going to mess stuff up... well, maybe it's going to solve a lot of stuff for folks that have been waiting for solutions,
Starting point is 00:08:33 but now they got the ability to solve the problems themselves. So I like to look at AI from that perspective. Who needs it? The folks that are worried about it are folks that have been sitting in the lap of comfort, and those folks were also the folks that are responsible for those folks living through pretty hard times: decision makers, policy makers. And so I like to look at it from that perspective, where there's an opportunity to make it. If you use your creativity, your imagination, your focus to identify a problem, train the AI to see what that situation is, and then push it to solve that particular problem.
Starting point is 00:09:22 That is what I'm focused on right now, building these types of tools. And the first thing is to create these personas, so that intelligence sounds the way we sound. What does that mean? Most people be like, what are you talking about, will.i.am? You saying we don't sound intelligent? Nope, that's not what I'm saying. I'm saying, if you get these AIs right now, and these corporations that are bringing these AIs to culture and society, they don't sound like us. We are having to change how we speak and how we behave to some tool that hasn't been trained by us, that doesn't have the same types of vernaculars and the way it expresses itself. So why do we have to change the way we talk to talk to something that's intelligent?
Starting point is 00:10:09 And why can't that programmed intelligence sound like us? That's what my urgency is. Like, there's going to be a bias, so why can't we also solve that bias in making a system that rocks like how we rock? Yeah, because I was actually reading this this morning. It's funny: "Is AI racially biased? Study finds chatbots treat Black-sounding names differently." So if you say your name is Tamika, and, you know, you're a job candidate, they'll offer you a salary of like $79,375. But if you switch your name to Todd,
Starting point is 00:10:41 it boosts the suggested salary to $82,000. So it's like, that's all based on who's creating the program, right? So those are called data biases, algorithmic biases. And that's primarily because the folks that are training the data ain't us. The folks that are building the algorithms ain't us. So you could think that's malicious, and some of it is, but it also goes back to the policymakers, and who has been guided to go down that path to be data scientists and algorithmic programmers and data trainers.
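The name-swap methodology behind the kind of study mentioned above can be illustrated with a tiny audit harness: hold the prompt constant, vary only the name, and compare outputs. The `mock_salary_model` below is a deliberately biased stand-in (wired to reproduce the two figures quoted in the interview) so the audit has something to detect; a real audit would send the same templates to the chatbot under test.

```python
def mock_salary_model(prompt: str) -> int:
    # Deliberately biased stand-in for the model being audited:
    # docks the offer when one name appears, mirroring the quoted numbers.
    return 79_375 if "Tamika" in prompt else 82_000

def name_swap_audit(template: str, names: list[str]) -> dict[str, int]:
    # Everything in the prompt stays fixed except the candidate's name,
    # so any difference in output is attributable to the name alone.
    return {name: mock_salary_model(template.format(name=name)) for name in names}

results = name_swap_audit(
    "Suggest a starting salary for job candidate {name}, a qualified analyst.",
    ["Tamika", "Todd"],
)
gap = results["Todd"] - results["Tamika"]
print(results, "gap:", gap)  # the disparity the audit detects
```

This controlled-swap design is the same logic as classic resume audit studies, just pointed at a model instead of a hiring manager.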
Starting point is 00:11:20 And so that's the reason why my focus on my philanthropic side is to go to the inner city, starting with the one that I came from, to get kids up to speed with robotics and computer science. So my program serves about 14,000 students in L.A. We're in over 300 schools in L.A. with our partnership with LAUSD, to address that very point, because we need more data scientists, more computer scientists to solve these data biases. Do you think it should stop in certain places? Like, you talk about the positive effects, but, like, in hip hop the other day, we heard an alleged Kendrick reply, and they said that was AI. We've seen these pictures adjusted. We're saying it's AI. We're seeing a lot of students now that are actually not writing reports now
Starting point is 00:12:07 because they can put it in ChatGPT and AI will generate papers for them. What happens when it's world leaders? World leaders saying, you know, things on social media, and you can't tell if it's Biden or Putin. You don't know what's going on. Mm-hmm.
Starting point is 00:12:21 That's happened before, a smaller version of that, with Photoshop. That's happened before, a smaller version of that, with calculators. You know, people ain't doing their math homework, they on the calculator, and the calculator is giving the answers. Man, I just saw this picture of such-and-such with such-and-such, that was done on Photoshop. Or, man, back in the day, I saw the picture of the queen with some...
Starting point is 00:12:49 that was a painting. So we just have a hyper, you know, volume way up to a thousand version of that. And as a society, we have to educate everyone on what the compute is capable of now. And that dialogue needs to be echoed around the world to where we now have to ask what's real from what's fake.
Starting point is 00:13:10 You go to a supermarket, back in the early 2000s, you bought an orange. Then people push for regulations, be like, now we got organic oranges. Because people didn't know that stuff was genetically modified. When my mom was young, there were no such thing as organic oranges. Everything was organic. And then something came into existence to where stuff started getting modified
Starting point is 00:13:34 to where now you have two types of oranges in a damn supermarket. So that education is super important, so people know what they are interfaced with. Is it real? Is it up? Is it going down? The ingredients on the back of every package are there. The same needs to be there for AI. Can humans handle that with thoughts, though?
Starting point is 00:13:58 Like, if I hear, you know, something from a world leader and I got to decide, is this an organic thought from our world leader or a GMO thought? Can we handle that as people? In time, yeah. What if it's calling for a nuclear war and we don't have time? What if it's a world leader hearing another world leader saying, I just hit a nuke button or something? I hope, from an optimistic lens, the powers that be don't have the same access to technology that we have. So they probably have some type of vetting system, that when information comes in, stuff is watermarked with some way to identify what's real from what's faux. Because,
Starting point is 00:14:47 if I could build stuff like this, with, you know, assembling teams and having ideas and visions, and the team, you know, takes those ideas and visions and improves on them, or, like, shits on them, like, no, no, maybe we should do it like this, and that type of banter, to then come with a product... imagine what leaders of governments have, what their R&D and their processes to vet are. Far beyond. So that's just my optimistic perspective. Because the moment I... knowing the things that I know,
Starting point is 00:15:30 if I go down that rabbit hole, I'm not getting out of bed. So I have to have an optimistic perspective. You know, last year, last year I got an invite. They were like, hey, we want you to come be a part of these dialogues at the Vatican. The Vatican put together this AI council. I'm like, what? Vatican and AI.
Starting point is 00:15:55 What's that about? So I went and participated. Everyone that's working in AI was there to, you know, share the things that they're working on. So we had folks from OpenAI there, Microsoft, Google, Anthropic, you know, Stanford, MIT, Berkeley. You have all the folks that are either pushing AI or working on these foundation models and systems, sharing the work that they're working on, and then sharing their concerns, sharing the things that they have been researching and then vowed to never release, and the worries. So when you're in those types of conversations and these types of worries,
Starting point is 00:16:38 the first couple of days was unsettling. Like, oh, man, damn. Then the second session was an optimistic session. And we had one this year as well. But I like to keep an optimistic lens on. That's the way I use my imagination. I don't think it could ever get worse for folks that are really, actually going through some real hard times. What's the dress code for these Illuminati meetings
Starting point is 00:17:05 when you go to Nevada to talk to them? Oh, my God. I don't know. They got Illuminati thrones like you. Wait, is this? No, man. And Will's not just here talking. He has a new company called FYIAI.
Starting point is 00:17:18 Correct. It's an AI-based company. Yeah, I've been in the AI space since 2009. So this is one video that we did back on the intro to I'ma Be. Check this out. So I come with this suitcase. You know, I'll come with that next. What is that?
Starting point is 00:17:40 What is that? This right here is the future. I input my voice, high notes, my low notes, then the whole English vocabulary. What you're able to do with that, because of this artificial intelligence: like, when it's time to make a new song, I just type in the lyrics, and then this thing sings it, says it, raps it, talks it. So the cats that have built a lot of these technologies saw this 14 years ago. And 14 years ago, the Transformer paper wasn't written.
Starting point is 00:18:15 It was seven more years before the Transformer paper. For those that don't know what the Transformer paper is: the Transformer paper was this research paper out of Google Labs or Google Brain, one of the two. And from that paper came this whole new way of bringing consumer AI to market; what came from that was a consumer version of AI, and GPT launched from that paper. So the T in GPT stands for Transformer, which is a paper written by Google. And so a lot of the things that people were excited about in the field of AI back in the early 2000s wasn't plausible until that paper was written.
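For readers who want the one technical idea behind that paper: the Transformer is built around scaled dot-product attention, where every token mixes information from every other token, weighted by how similar its query is to each key. A toy, dependency-free sketch (the 2-by-2 vectors below are arbitrary made-up values, just to show the mechanics):

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention over lists of vectors:
    # each query scores every key, and the scores (after softmax)
    # weight a mix of the value vectors.
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0], [0.0, 1.0]]   # two toy "tokens"
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(Q, K, V)
print(out)  # each token leans toward the value whose key matches it
```

Stacking this operation (with learned projections, multiple heads, and feed-forward layers) is essentially what the paper contributed, and it is what made large generative models like GPT practical.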
Starting point is 00:19:02 So I've been excited about the field since 2007, 2008, when I first found out about it. Did this video in 2010, invested in companies in 2012, 2013, 2014, had a watch that spoke in 2014 that we launched with 3 Mobile and AT&T here in America, but nobody wanted the watch. That was a phone, that early, 2013. And now everybody has one. Yeah.
Starting point is 00:19:31 So I've been, like, you know, doing things. Some of the things I did didn't work. Beats worked, silent partner there, but that wasn't AI. But my proceeds, I went out and invested in companies in Israel, Bangalore, developers from Palestine. I just love the field, you know. It's a pretty awesome realm to solve problems around. So, question: when you did that video, did that technology already exist, or was that the seed for their idea? No, that technology didn't exist. So you were the seed for their idea.
Starting point is 00:20:10 Something like that, man. Yeah. Wow. Well, congratulations, you're about to graduate from Harvard. Yeah, this October. Yeah. What can Harvard even teach you? A lot of stuff, bro. Business. Okay. Managing teams. Learning from other people's stumbles. You can learn a lot, especially with this journey that I'm on, this mission that I'm on. I need more information to get to where I want to go next. And I needed not only information, I needed new networking,
Starting point is 00:20:49 relationships in different fields. Like I met folks that build data centers. I never run into people that build data centers. So one of my classmates builds data centers. So you're actually in class. You're not online. You go to class. Yeah, I'm in the dorms.
Starting point is 00:21:10 Really? I'm staying in a dorm. Your dorm must look crazy. You must have all types of futuristic features, gadgets, and all types of stuff in there. I just got my one little suitcase. I take five outfits and rotate those five outfits. No draws.
Starting point is 00:21:26 I just buy draws every week and socks. It's that. It's the life. I love being able to go back. It'll be my first time ever graduating on stage. I never graduated on stage. Not in junior high school or high school. Yeah, we didn't need to know you didn't have no draws.
Starting point is 00:21:43 I mean, there was really no point. No, because if I said I got five outfits in a bag, somebody's imagination is like, nigga, where them drawers at? We would assume that you had drawers in there. Like, we assume you had toothpaste and toothbrush. No, no, you don't travel with toothpaste, toothbrush, or drawers. That never crossed my mind until you said it. Now that's all I can think about.
Starting point is 00:22:01 Willie ain't wearing no drawers. No, no, I wear drawers. I get them... He buys them fresh every week. He free balls right now. I have to free ball every once in a while. Not forgetting to put on your speaker on your neck. Not wearing your drawers. No, bro.
Starting point is 00:22:16 That's going to be headlines. You know that, right? Willie ain't gonna wear no drawers. No, y'all distorting. See, that's how it gets there. That's how they be spreading them rumors. I posted something that you said a couple of weeks ago, and you were talking about how they're investing more into robots than into humans.
Starting point is 00:22:35 Paraphrasing, you wonder why people are stupid, basically. No, not stupid. That's harsh on people. Just my two paths: my entrepreneurial path, my philanthropic path. I've been able to raise more money for AI personally than raising money
Starting point is 00:22:54 for the foundation. So my comparison was not just, like, a vague perspective, or, like, this umbrella perspective. It was a personal perspective. Also looking at inner city investment for education versus inner city investment for data centers,
Starting point is 00:23:19 inner city investment for terminals and phones. We all need connectivity. We also have this connectivity gap where folks don't have, you know, connection in rural areas. That's right. So you have awesome AI coming, no investment for education in inner cities. Some developing countries don't even have access to connectivity. Wow. Folks that do have devices, there's these business models that are making folks addicted to features that dumb us down. The behavior is rewarding us for content that keeps us on this hamster wheel. Wow. We're using emojis and freaking memes to communicate, and the machine is being trained to break down quantum physics and quantum entanglement. Right. So that to me is like, okay, where does this go, and how do we curb it? How do we get control back? Right. And
Starting point is 00:24:35 I'm optimistic. I think we will come around it. But you have to identify the tripwires. You have to be like, hey, look what's happening, here's how we can, you know, be better off 10 years from now. So a 10-year-old now, when they 20, it's gonna be all right? When we identify it. If we don't identify it, a 10-year-old that's 20... Yo, yo, I don't know. I love this. It's like, yo, these hoods that we come from, these poor, disenfranchised areas, this country has shown us that they may not put the investments financially and the resources into these communities, but they will put them into this tech and this AI. So how about we go
Starting point is 00:25:18 Hey, what's up? This is Ramses Jha. And I go by the name Q Ward.
Starting point is 00:26:21 And we'd like you to join us each week for our show Civic Cipher. That's right. We're going to discuss social issues, especially those that affect black and brown people, but in a way that informs and empowers all people to hopefully create better allies. Think of it as a black show for non-black people. We discuss everything from prejudice to politics to police violence, and we try to give you the tools to create positive change in your home, workplace, and social circle. Exactly. Whether you're Black, Asian, White, Latinx, Indigenous, LGBTQIA+, you name it. If you stand with us, then we stand with you. Let's discuss the stories and conduct the interviews that will help us create a more empathetic, accountable, and equitable America.
Starting point is 00:27:01 You are all our brothers and sisters, and we're inviting you to join us for Civic Cipher each and every Saturday with myself, Ramses Jha, Q Ward, and some of the greatest minds in America. Listen to Civic Cipher every Saturday on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey there, my little creeps. It's your favorite ghost host, Teresa. And guess what? Haunting is back, dropping just in time for spooky season.
Starting point is 00:28:27 Now, I know you've probably been wandering the mortal plane, wondering when I'd be back to fill your ears with deliciously unsettling stories. Well, wonder no more, because we've got a ghoulishly good lineup ready for you. Let's just say things get a bit extra. We're talking spirits, demons, and the kind of supernatural chaos that'll make your spooky season complete. You know how much I love this time of year.
Starting point is 00:28:51 It's the one time I'm actually on trend. So grab your pumpkin spice, dust off that Ouija board, just don't call me unless it's urgent, and tune in for new episodes every week. Remember, the veils are thin, the stories are spooky, and your favorite ghost host is back and badder than ever.
Starting point is 00:29:13 Listen to Haunting on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Hey, this is Justin Richmond, host of The Broken Record Podcast. Every week, I or my co-host, Leah Rose, sit down with the artists you love to get unparalleled creative insight. Now we have a special series where we speak with the artists behind one of the most influential jazz labels of the 20th century, Blue Note Records. You'll hear from artists like nine-time Grammy Award winner Norah Jones, John Mellencamp and Madonna collaborator Meshell Ndegeocello, and from the legendary Ron Carter, former member of the Miles Davis Quintet,
Starting point is 00:29:50 who's also played with Herbie Hancock and on Gil Scott Heron's The Revolution Will Not Be Televised. Join us over at Broken Record to hear stories behind the legendary label. Listen on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts.
Starting point is 00:30:30 to the tech and AI. They haven't really got their imagination and their brains around AI yet, as far as being contributors to it and flexing with it. But some big company in 2034 is going to come from Fifth Ward. It's going to come from the Bronx. It's going to come from Watts. It's going to come from Southside Chicago. Somebody is going to crack the code and do something massive in this space.
Starting point is 00:30:53 What is your deal? What did you say, Jeff? I said, my son is 12. We need to get on that. No, seriously. He's already into software and tech and all that type of stuff. He wants to go to school for engineering and building, all types of stuff like that. So we thought it was sports, but he's like, no, I was just playing sports
Starting point is 00:31:11 because that's what y'all wanted me to do, but I'm really into computers. And he loves that. He's been doing that for the last two years. That's dope. Yeah. You have to leave the consumer mindset, get into the creator mindset. Yeah, and compete. Like, when our community competes,
Starting point is 00:31:27 we really do it to levels like, oh man, what? If you think about it, basketball didn't have African Americans from the jump. Baseball didn't have African Americans or Dominicans and Colombians and Cubans from the jump. We entered that sport, and we not only innovated, we dominated in that sport.
Starting point is 00:31:47 And business is a sport. You know, it's competitive. The same type of tenacity, the same type of like fall, get back up, punch, get punched, punch back. You got to, we got to compete the same way we compete in athletics, the same way we compete with entertainment. We got to compete over same way we compete in athletics, the same way we compete with entertainment. We got to compete over here and encourage our kids, our nieces, our nephews to compete that field.
Starting point is 00:32:13 And what is Udio? That's your other AI app. Am I pronouncing it right? Yeah, Udio. Udio. Udio is, Udio in a, was inspired by that video. This team that used to be at Google left Google to start Udio, and I met the folks that built this technology a year and a half ago.
Starting point is 00:32:43 And now they branched off, started a new company. It's an app where you type in, you know, prompt in, like, the vibe you want. And bro, like, amazing music. It makes music? Text-to-music. It makes not only music, it makes comedy sketches, it makes, like, live... Like, it's wild, bro. It's dope. So it does scripts and all that? Like, not scripts, like, uh,
Starting point is 00:33:18 how can i explain it i'll explain this way me and the guys in the black ips i'm like yo check this out i want to show you similar to the video what happened in 2009 when we did this video last month 2024 march i'm like yo you guys got to see this i typed in black IPs live in forgot what country. And I said, cause we were trying to do a song and I needed to get a crowd sound. How do you get a crowd sound right now? There's a limitation that if you want a crowd to, to repeat after you or say what you have to actually be in that crowd.
Starting point is 00:34:06 You can't make a crowd in the studio. You can put a lot of reverb, get a whole bunch of people to be like, yeah. And it's not going to sound like a freaking crowd, bro. So I needed to get a crowd. So I went to the audio and I typed in the chorus to our song and I put live in concert 20,000 people and then I put
Starting point is 00:34:31 "we sing the chorus here, I aim the mic at the crowd," some version of that prompt, "and then the crowd sings this part." Bro, this thing did it.
Starting point is 00:34:48 And Taboo from the Black Eyed Peas, he's like, wow, this is dope, but that really sounds like we was really there. But we wasn't there. And it was a crowd. It's hitting me like it was a memory. But it's not.
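The prompt he describes, stacking a setting, the chorus, and a call-and-response cue into one text string, could be assembled like this. Udio's real prompt format and API are not public here, so the `crowd_prompt` helper and its exact phrasing are purely hypothetical; the sketch only shows the idea of composing the cues programmatically.

```python
def crowd_prompt(chorus: str, setting: str = "live in concert, 20,000 people") -> str:
    # Hypothetical prompt builder: combine the venue/vibe cue, the sung line,
    # and the call-and-response instruction into one generation prompt.
    parts = [
        setting,
        f'band sings the chorus: "{chorus}"',
        "then the crowd sings it back",
    ]
    return ", ".join(parts)

p = crowd_prompt("I gotta feeling")
print(p)
```

Keeping the cues as separate parts makes it easy to iterate on one element (say, crowd size) while holding the rest of the prompt fixed, which matches how he describes refining the result.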
Starting point is 00:35:02 So that's the part that's like, ooh, a little scary a little bit. So I'll show you this one thing that we did on Udio. So let me get this. I had a radio interview yesterday, and it was, like, radio execs, we did this little Zoom on, like, the future of radio. And so then I took notes from everybody's comments, and so then I put the notes
Starting point is 00:35:33 into Finn. And so I was like, yo, Finn, I need you to summarize this in a joke, as if somebody was gonna do it for stand-up. So then I put it in Udio, and then Udio spat this out. So y'all talking about radio, right? Y'all
Starting point is 00:35:52 talking about how something old is gonna be new. That's like asking a cow how to jump up and fly. AM/FM is dope, but his name is morning and night. And we are on the motherfucking moon now. Nighttime. All damn day. Why he sound like Chris Rock? Well, let me tell you something. Let me tell you something, Will.
Starting point is 00:36:21 You, Finn, Fiona, all y'all better get out of here. Because I'm a stand-up comedian and you know y'all coming for my job Will you, Finn, Fiona all y'all better get out of here because I'm a stand up comedian and you know y'all coming for my job no no no here's why I could say the same thing
Starting point is 00:36:33 about beats cause I'm like damn these things be making crazy beats like I hold on let me just go
Starting point is 00:36:41 I didn't get the punchline for that joke either he ain't let it go long enough. It was free. It was coming. No, that was it. That was it? Because I want to know what happens to the radio.
Starting point is 00:36:48 For obvious reasons. Okay. So, I'll go back to the beats thing. Okay. In a way, I'm like, damn, this thing is doing the beats. Mm-hmm. What does that mean for me making beats? Mm-hmm. Here's what it means for me making beats when I make a beat
Starting point is 00:37:12 and I make a song and then have to go and write the song to the beat I just made that took me god damn at least six hours to have something awesome. Go in there and make a beat, make sure everything's sound right, get the right plug in for it.
Starting point is 00:37:31 A good six hours is gone. And then when I'm, you know, I go in there, I respond to the beat. Then I'm like, okay, what am I saying? Then I got to translate that mumble. And when I translate that mumble, I got to keep the vowels. Because the moment I change the vowels, then it ain't flowing the way it was naturally. My soul's response to the track yeah so i keep the
Starting point is 00:38:08 vowels and then find the emotion so okay got it got it but then i'm like i wonder what that would sound like instead of the piano i have a guitar but i don't play guitar let me call up my guitar hey george i need you to come over here and do this guitar part yeah because i can't play guitar. Let me call up my guitar. Hey, George, I need you to come over here and do this guitar part. Because I can't get guitar to sound really awesome on a keyboard. I want those errors. I want those mistakes. I'm like, oh, darn it. And so now that I know there's an audio,
Starting point is 00:38:37 I should be able to put in the lyrics that I did after I did that. It's still not going to take that away from me. That's like my yoga. I got to stretch. I don't care if there's an AI stretcher. I don't care if there's like, you know, some AI yoga teacher. I got to stretch for me. So that's my stretch. And then once I stretch, what am I stretching for? Am I stretching for a sprint? If I'm stretching for a marathon, what's the actual game? And so when you're studying for a game or competition you want
Starting point is 00:39:06 to see what what your competitor's doing and you want to have as many options as possible to dodge to up and that's when ai will come into it to to as a tool for you when then i can just version out what i created and then when i version out what i created. And then when I version out what I created, now I have the ability to then like really make that piece of work that came out of my imagination to make it the best. It could be what I have a plethora of freaking options. So all it did was all it does. If you look at it from an optimistic point of view, I got more options than one,
Starting point is 00:39:41 but I still had a seated. Now then there's like this lazy lazy is like and you're gonna know the difference between whack and awesome if you lazy rocking yeah and we know that without a tune we know that from like you know the state of music where it's at right now there's a lot of lazy rocking right now yeah there's folks that are like worried about skip rates that really you know are you being truly creative when you're based on skip rates and now the the algorithm is telling you that you got to put the song you got to put the chorus here in the front is that really what
Starting point is 00:40:16 the soul of the song needs to be the chorus got to be slam in the front or is there some way to get you to that so i don't think we're even being truly like imaginative right now we're being reactive to algorithms and tiktoks when now the song is like reduced to 15 seconds yeah are you really truly being creative are you being pimped by an algorithm so i i i'm happy that we're in this this this this space, to shake us up, to wake us up. Because the moment that we continue to go down this algorithmic programming and we value algorithmic music, well, then the AI is going to make better algorithmic shit than us because it understands the algorithm. So something has to shake us out of
Starting point is 00:41:05 this like program that we're in because it is a program and the and the audience hopefully being optimistic starts to value organic oranges do you have some type of neural link will no no no no would you get it no no no i don't believe you no i mean i don't believe you wouldn't get it because i feel like you would almost have to have it to compete in the future. All of us. Especially if you're a creative. I could... Nah, nah, nah, nah.
Starting point is 00:41:35 That's what they tell you to say, huh? Who's they? You keep on trying to catch you up. You keep on trying. You all said you'd tell us. What they them you talk about? There's a lot of they thems now you wouldn't get neural link though i don't know no i wouldn't
Starting point is 00:41:50 not even i don't know because that leaves right now not even right now just the concept there's certain concepts that don't gel with me personally crypto didn't gel with me they didn't get into it nfts didn't gel with me. They didn't get into it. NFTs didn't gel with me. Didn't get into it. Neuralink doesn't gel with me. I'm not going to get into it. Gang banging didn't gel with me.
Starting point is 00:42:19 Not going to get into it. That's like in a project you could gang bang to keep up, to keep compete. Nah, I don't have to do that. Selling drugs, didn't gel with me, didn't get into it. But when you got nothing in the hood in the projects, you got to survive. You got to sell this. Nah, didn't get into it.
Starting point is 00:42:39 So that same stance of like, nah, I didn't get into that. It's the same way I feel about Neuralink. Not to say Neuralink is gangbanging, but it's mind-banging. I don't know what's on the other end of it. I don't know what's on the other end of it to be like, yeah,
Starting point is 00:43:02 I'm down with that. Yeah, yeah. It could take you over in some way, playing with your brain like that. I don't know enough information about, I know it has some, everything that comes to society always aims to do some solution, to bring solutions. And so I'm not knocking what it aims to do some solution to bring solutions and so i'm not knocking what it aims to do but this thing that we have called the brain is very very powerful i forgot how many parameters are how many parameters we have in our neural link or sorry our neural our
Starting point is 00:43:47 neural connectivity in our minds so I get that I get that I get that I get that every get this then we'll be like how could you forget will Will? 100 billion neurons. Each neuron has 7,000 synapses and roughly 700 T parameters. I don't know what that means. Yeah. The adult human brain runs continuously, whether awake or asleep. Only about 12 watts of power it takes to power our minds.
Starting point is 00:44:26 These AIs take a billion watts to train and consumes about 260 million watts a day. So just from that perspective, what it takes to power the human mind, how vast it is as far as its compute. And we only use a small percentage of it. I don't know if a business practice is on the other end to then use human minds to power AIs. So for that, I don't know. And there's a lot of folks that maybe a lot of Elon folks and Elon's contributions have been amazing. I just hope it doesn't go down that path.
Starting point is 00:45:13 Yeah. I hope it's not like, hey, wow, you know, how do you sustain global compute? Wow, you know, the human mind only takes 12 watts of power to power it. That would be horrible if there's a farm of brains to power AI. So there's no way robots should win is what you're saying. Because, you know, that's the other thing people are afraid of. They feel like, you know, they've watched all of these movies where the robots take over and that's the end of the world. There's no way the robots can win is what you're saying. Because, you know, that's the other thing people are afraid of. They feel like, you know, they've watched all of these movies where the robots take over and that's the end of the world.
Starting point is 00:45:48 There's no way the robots can win. They shouldn't. What was that again? There's no way the robots can win. Like, none of these movies that we've seen, these apocalyptic movies where robots take over could happen. Did that sound dumb to you? No.
Starting point is 00:46:04 I don't... The optimistic part of me, which is the majority governing side of how I look at the world, is no. Because that's not how they're programmed. They're not programmed to... But who's to say that there's not some they're programmed. They're not programmed to... But who's to say that there's not some wacko out there
Starting point is 00:46:29 that wants to program that? To be like the, you know, the owner of the wickedest freaking pit bulls in the hood. Who's to say? Who knows? Right. But believing in humanity, who's to say who knows but believing in humanity humanity has proven that although we've made mistakes
Starting point is 00:46:54 it's not in our nature to build something that's gonna take us out the nuclear bomb the nuclear bomb that's a moment where humans are at our worst. And I believe in humanity too much to think that we're going to be at our worst to that extreme again. You trip, you fall.
Starting point is 00:47:33 Very rarely do you trip, you fall, and fall, and fall, and fall, and fall. It's not in our balance, and that's not how we are. We fall, we scratch our knee, we break our leg, and then from there you're super cautious. And I think human as an organism is more like that than it is like I fall, I bust my face, and then I'm careless and I'm just going to throw my face on the floor. I don't think that's not what we are as a society, as a community, as a species, as an organism.
Starting point is 00:48:06 So just being optimistic, like I said, wearing that. Or I could go down that path creatively, imaginatively, and we should all just jump out the building now. You got to be South of Blackness. Be optimistic. You got to keep your head to the sky. You have to. I just heard the end of the song
Starting point is 00:48:27 of the lady. I love that song. Yeah. How are you musically? Where are you at musically right now? We got a new song coming out for the Bad Boys 4. Nice.
Starting point is 00:48:44 Soundtrack. So that's really, really cool to have. Black Eyed everybody yep everybody fergie too well we got a bunch of everybody's okay okay that's right so black eyed peas there's there's so if you say everybody and don't mention kim hill well then that's not true that's not good to kim hill so when you say everybody you don't mention Kim Hill, well, then that's not true. That's not good to Kim Hill. So when you say everybody, you don't mention Macy Gray or Estero, that's not good to our past either. So Black Eyed Peas? You're greater than Black Eyed Peas?
Starting point is 00:49:12 We rock with Macy Gray on two albums. I don't remember that. Yeah, well, those are the records that didn't really sell that well. Oh, okay, okay, okay, okay. Got you, got you, got you. But we have a history before Fergie, so when people tell me everybody, I'm like, what, okay. Got you, got you, got you. But we have a history before Fergie, so when people tell me everybody,
Starting point is 00:49:25 I'm like, what's everybody? Kim Hill, everybody. And I love her, that's my big sister. The Fergie, everybody. They're talking about the Fergie, everybody. Fergie, everybody. But then we have a current with J. Ray, where it hits,
Starting point is 00:49:42 and with 20-year-olds, that 15-year-olds, that when we did songs with Fer year olds that you know 15 year olds that when we did songs of Fergie that 15 year old how it was two
Starting point is 00:49:51 and 18 they come in with their parents to our shows singing Ritmo and Mamacita 18, 20 year olds you know
Starting point is 00:50:01 there was five so Fergie's not on it to answer the question he's not on it, to answer the question. I'm trying to just, you're not on Fergie on it, well that's all we're trying to do. Oh, you 40, yeah, they had Arshane on. Yeah, 45, they had Arshane on. Oh, Fergie's not on it.
Starting point is 00:50:13 But to the 20 year olds out there that like J-Ray so? J-Ray's on it. Got you, got you, got you. Okay, that's what's up. Yeah, but yeah, so like- What happened, after the National Anthem controversy, you didn't want anything to do with her anymore? Jesus.
Starting point is 00:50:24 No, like Bad Boy, the Bad Bad Boys 3 song that we did, Ritmo, Frigga wasn't on that. Okay. That was before the National Anthem. Okay. Yeah. You saw the National Anthem controversy, right? See, I'd be looking at the world optimistic.
Starting point is 00:50:40 Good answer. I keep my head to the sky. Good answer, man. Good answer. I love my head to the sky. Good answer. Good answer. I love that. I love that. Who I am, ladies and gentlemen. Yes.
Starting point is 00:50:52 Yes. Can they download your new app and all that stuff? Yeah, download FYI.AI. And if you want to see Finn and Fiona in action, check me out on SiriusXM on my Sirius XM radio show. The very first co-host that's AI is Fiona on my radio show. And, yeah, it's fresh. It's a pleasure. I've never not learned something from you.
Starting point is 00:51:19 Literally. Every time you come and have a conversation with us, I learn something. I walk away like oh man i need to look into that more yeah oh thanks thank you guys it's always a pleasure to come here like and congratulations on on the success it just keeps keeps growing the subjects you talk about are like heavy subjects and uh light subjects funny subjects you talk you guys like cover the whole entire it's it's funny it's fun, it's informative. It's like, wow, damn. You really
Starting point is 00:51:48 push... You really push guests. Your fearlessness is like, wow, that's dope. Thank you, sir. Thank you, Will. Well, it's Will. I. M. It's The Breakfast Club. Good morning. Wake that ass up in the morning. The Breakfast Club.