The Chris Voss Show Podcast – Glendor PHI Sanitizer, AI Startup Cleanses Medical Data for Privacy and Research

Episode Date: February 7, 2025

Glendor PHI Sanitizer, AI Startup Cleanses Medical Data for Privacy and Research
glendor.com

About the Guest(s): Julia Komissarchik is the CEO and co-founder of Glendor Incorporated, a company dedicated to addressing the challenges of protecting patient privacy while enabling the sharing and aggregation of multimodal medical data for research and AI model training. With over 20 years of experience in AI, including areas such as machine learning, deep learning, OCR, and speech recognition, Julia holds degrees from UC Berkeley in Mathematics and from Cornell University. Additionally, she has authored 10 U.S. patents in the field of AI.

Episode Summary: In this episode of The Chris Voss Show, host Chris Voss sits down with Julia Komissarchik, the CEO and co-founder of Glendor Incorporated. The discussion delves into the intricate world of AI, patient privacy, and healthcare data security. Julia shares her insights on how the growing field of artificial intelligence intersects with the medical sector, emphasizing the importance of protecting sensitive medical data while facilitating its use for research and AI model advancement. As the conversation unfolds, Julia talks about her experience in Silicon Valley and how she perceives Utah's burgeoning tech scene, known as Silicon Slopes, and its potential for fostering innovative startups like Glendor. Julia elaborates on the critical issue of healthcare data privacy and the role of technology in safeguarding patient information. She discusses how Glendor's AI technology functions like a "washing machine," cleansing medical records of personally identifiable information to make them safe for data sharing and research use. The conversation touches upon the pressing need for diverse and comprehensive datasets to ensure AI models in healthcare are effective and applicable beyond a few select regions.
Julia highlights the significance of a collective effort to balance privacy concerns with the necessity of data contribution for medical advancements. The episode closes with a call to action for increased awareness and participation in the conversation about AI and data privacy.

Key Takeaways:
- Glendor's innovative AI technology enables secure sharing and utilization of medical data by anonymizing sensitive information.
- Medical records are highly valuable on the dark web due to their comprehensive nature and lifelong relevance.
- A significant portion of AI models in healthcare rely on data from limited geographic regions, which underscores the importance of broader data contribution.
- Julia sees great potential in Utah's tech ecosystem, comparing its energy and innovation spirit to the Silicon Valley of the 1990s.
- Participating in safeguarding and sharing medical data responsibly can lead to advancements in AI applications in healthcare.

Notable Quotes:
"Our focus is protecting patients' privacy while empowering data sharing. That's what we do."
"Once the data is removed for most of, let's say, x-rays, then it'll be safe to share this data with the AI company so that it can be used for training."
"If we don't start to contribute our data, then we'll be in deep trouble as well."
"Please think. Spend time thinking. And thinking about AI, thinking about privacy, and just making it a conscious choice."
"We as humans need to think through this. And it's going by us so fast that we don't have time, but we should take time."

Transcript
Starting point is 00:00:00 You wanted the best. You've got the best podcast, the hottest podcast in the world. The Chris Voss Show, the preeminent podcast with guests so smart you may experience serious brain bleed. The CEOs, authors, thought leaders, visionaries, and motivators. Get ready, get ready, strap yourself in. Keep your hands, arms, and legs inside the vehicle at all times, because you're about to go on a monster education roller coaster with your brain. Now, here's your host, Chris Voss. Hi, folks. Chris Voss here from thechrisvossshow.com. There you go, ladies and gentlemen. The Iron Lady sings, and that makes it official. Welcome to the big show.
Starting point is 00:00:45 We certainly appreciate having you guys. As always, the Chris Voss Show has been around for 16 years, going on 17 years and over 2,300 episodes. We are growing like a weed, putting out, what is it, two to three new episodes a day, 10 to 15 new shows a week. If you can't find something to binge watch on the Chris Voss Show, the CEOs, the billionaires, the presidential advisors, the Pulitzer Prize winners, the people with their amazing stories, their journeys, their lessons of life that we share on the show, I don't know, maybe
Starting point is 00:01:13 you're just really too smart for us. Or maybe not. Maybe you just think you are, which is probably my problem as a narcissist. Anyway, guys, welcome to the show. Go to Goodreads.com, FortressCris Vcom fortes chris was chris was one of the tick tockety and chris facebook.com you can find us all there and stalk us today an amazing young lady on the show we're gonna be talking about her insights in ai we've been doing a profiling of companies when we're visiting utah here there's a there's a silicon valley-ish sort of startup network. Maybe I should let her explain
Starting point is 00:01:46 it here in Utah. And so we've been kind of doing some local interviews. We kind of like the local folks and all the stuff they do. Today, we're going to be talking with Julia Komisarkic. Do I have that right, Julia? Yeah, probably do. Did I get that right? Yes. There we go. I wrote it down and we try and make sure we get people's names right. So welcome to the show, and we'll get into it here. You are the CEO and co-founder of Glendore Incorporated, the companies solving the challenge of protecting patients' privacy while enabling worldwide multimodal medical data access. I'm having some multimoda right there multimodal in my coffee right now and sharing and aggregation for research in AI model
Starting point is 00:02:30 training so we're gonna get in the AI kids we'll be talking about the future I should have a sound bite for that the future I Julie is over 20 years working experience of our AI including machine learning deep learning OCR image processing, speech recognition, analysis, natural language processing. She has degrees from UC Berkeley in mathematics, Cornell University, and 10 U.S. patents in AI, and she's reached the pinnacle of a career on The Chris Foss Show. Welcome to the show, Julia.
Starting point is 00:02:57 How are you? Thank you. Good to be here. It's fun to have you as well. We cut it up and do a lot of infotainment on the show and have fun with people. So, Julia, give us dot coms wherever you want people to get to know you better on the interwebs. I'm here to talk about two things. One of them is how scary it is to have your personal medical information being accessed on the dark web.
Starting point is 00:03:20 And another is how important it is that the medical information is contributed for research. Otherwise, we'll be left out. Yeah yeah and where can we find your website it's glendor.com and so give us a 30 000 overview what you guys do there our focus is protecting patients privacy while empowering data sharing that's what we do and why would anybody want to bother about this is because let me tell you a couple of scary stories so first story is and not the first-hand experience but if you go to dark web and you want to buy somebody's social security number it will be like 30 50 cents if you want to buy somebody's one single medical record it will be 250 to 300 really it 100x. And there was a reason for it. Because, you know, credit card reports, social securities, all this come and go.
Starting point is 00:04:09 But medical records are the same for the patient from birth to death, right? If somebody had a childhood disease, they will have it. Nothing will change. If somebody had COVID, they will have it. Nothing will change. So this information is valuable. And in the wrong hands can be actually quite dangerous. So you guys are in Utah-based.
Starting point is 00:04:31 I think that's how we found you in a big media group that we took over. And we've been profiling companies out of Utah. And we, of course, we usually profile a bunch of AI companies out of South by Southwest, or not South by Southwest, CES show. But tell us a little bit about what goes on here in Utah. We were talking before the show, there's a bit of a Silicon Valley-ish sort of lots of startups here, lots of energy and technology. Yeah, so most of my professional life I spend in Bay Area, in Silicon Valley. And here in Utah, we have a concept of Silicon Slopes. Same thing, but with a lot of fun and skiing.
Starting point is 00:05:06 And I find that Silicon Slopes right now is very similar to the Silicon Valley in the 90s. Lots of startups, lots of energy, lots of people who are interested in doing cool things and experimenting. Yeah. And I think it's cool. I think it's really important because for a long time, Silicon Valley was just, you know, the technology and everything was centralized in Silicon Valley. Which is, I mean, it's a great bed for growing these ideas and different things. But, you know, it seemed like I had a lot of friends that were coders and engineers and people that did internet stuff. I'm a layman, clearly. They did internet stuff. I'm a layman, clearly. They did internet stuff.
Starting point is 00:05:47 But they would live in different parts of the country or different parts of the world. And they'd be like, we can't afford to move to Silicon Valley. We've got family and kids here. We can't. And so back when it was kind of this hotbed, really, and I think Silicon Valley is probably still a hotbed. But back then then it just kind
Starting point is 00:06:05 of seemed god if you weren't you know i had friends that tell me i'm like i'm a coder if i geez i live in alabama like i need to be in you know silicon valley if i really want to make it you know and i want my app or whatever i'm working on to make it and so now it seems like over the past 20 years that there's been you you know, it's spread out. It's, you know, in New York, it's in Alabama, it's in, you know, all sorts of places. And, you know, places like Utah. Utah, I know they spent a lot of money to get infrastructure in here to get the cabling so that, you know, they get the high speed bandwidth for the big companies. You've got Adobe here, a lot of amazing tech companies and stuff. So tell us what your AI software does specifically and
Starting point is 00:06:46 how it helps people and why people should be interested in it. You can think of it as a washing machine. It's a washing machine which deletes or washes away sensitive information so that what's remained is something that can be shared safely without leading back to the patient. And that's the ultimate goal, how to share this data with the AI company so that it can be used for training. Yeah, but it is a washing machine, so we make sure that dirty laundry doesn't leave the house and that dirty laundry doesn't enter the house. So now, I've seen the pictures that are on your website, and it looks like if I have an x-ray taken of me, and I imagine this goes to my other medical records, there's data on there that's specific to me or identifies me. And here's Chris, and yeah, he jumped off that roof again because he was drunk and broke his back.
Starting point is 00:07:42 Don't do that, folks. It's bad. Yeah. You know, here's Chris again. He's drank too much around us don't do that folks it's bad yeah you know here's chris again he's he's uh drank too much i don't know i don't have a joke for this folks okay i'll move on but you know and so they see that but now why would why would is that is that data need to be cleaned because because of the dark web or does that data need to be cleaned because maybe they will use that my x-rays for medical improvement purposes or look you know hey look what this idiot did you know they pass i'm sure there's like a doctor's facebook group
Starting point is 00:08:17 where they pass around everyone's like look what this idiot did right i don't i don't know if that's true folks i'm just making it up but is it is it both or is it how how does that play out yeah and by the way i'll take objection to the idiot part but yes sometimes doctors have use cases sort of example cases of special cases that they do share around a lot of times it's also used for research and the largest use case right now is AI training. But let me get back to you in terms of what is x-ray. So for x-ray, if you see your x-ray, it's actually usually done in a format called DICOM. And this format has two parts to it.
Starting point is 00:08:57 One of them is the image itself. And another is the so-called metatags. So metatags are easier to explain for somebody who hasn't worked with DICOMS because it's similar to what we have meta tags for. It's not quite, but similar to meta tags that we have, for example, for images, et cetera. When it comes to the image itself, quite often what you have is the x-ray plus your name will be actually engraved into the image. So it will burn in. Think of like Texas Rangers, right? Something is stamped into the image itself. And that information will have your name, or sometimes your address, but definitely your date of birth, date of study, etc. All of this information is actually highly sensitive, because it will tell somebody who shouldn't know that you actually had that x-ray and also will give information that it's your x-ray and not somebody else. So it's actually protected by the laws. For our country, it's HIPAA. For European countries, it's GDPR. But also, it needs to
Starting point is 00:09:50 be protected and removed just for the general privacy concerns. But once that data is removed for most of the, let's say, x-rays, then it will be safe to share this data with, for example, with AI companies, because then it will not lead back to you. However, there are interesting, scary side problems. One of them is, so x-ray might actually contain, let's say, a pacemaker. And if the light was right, you can actually see the ID of that pacemaker on the image. And that we know all from csi can lead back to the patient even though the name is not there so that's one thing another thing which is even scarier and something that everybody has to be aware of if let's say it's a brain mri and my mri or ct
Starting point is 00:10:37 scans what happens is you take slices of head right and then you show that those slices however if you take those slices and combine them together, you will see a face. And face recognition is now good enough to actually recognize the face. So, you know, the name will be deleted, but the face will be out there. And then it becomes the question of how do we modify the face so that it's no longer recognizable at the same time the medical information that will be in those images is still workable oh and i suppose with the ai you could put the face back together too now that's the trick i tell you to do that so that it can be right and there's certain things
Starting point is 00:11:17 that can be done and that's you know that's just for one modality we keep talking about multimodality what we mean by that is it's x-rays it's cd scans it's mris but it's also reports it's videos it's photos it's voice recordings you know there's so many different ways different types of medical information each type has its own challenges and each type needs to be de-identified so that patients are protected. Yeah. Yeah, because no one wants your data out there. So can I use your software to, you know, anonymize my, is that a word? Anonymize my x-rays, and then I can go sell them on the dark web and make some money? I mean, $250.
Starting point is 00:11:56 This is, I mean, $250 is $250. It won't be sellable on dark web anymore. That's the beauty of it. Oh, damn it. Once you remove the patient identification, then this particular piece of data is no longer attractive for the quote-unquote bad guys. So you won't sell it on the dark web, but you can actually contribute to the data lakes and quote-unquote sell it for research.
Starting point is 00:12:19 Ah, yeah. And that's something that everybody, I think, will need to think about. Because the other scary part, so I started with the dark web, but there's also another side of this. And that is, if you look at where the AI and healthcare models are trained on right now, it's a handful of data sets coming from large hospitals from the coast, from the East Coast and the West Coast. Whereas data that comes from other states, which are not coastal states, is almost not present. And that's actually dangerous because if you look at ChatGPT, right, GPC, anybody like that, what they're doing is they're training. So AI models need to be trained on the data.
Starting point is 00:12:58 And there's expression, garbage in, garbage out. If the data is not good, then the models will not be effective. But for ChatGPT, the data that they use for training was openly available public internet data yeah they're actually reporting that they're running out of that wow so imagine how difficult situation is in health care because they don't have access to that data because this is highly private so the only way to get access to that data has to be first cleansed in the washing machine. So you use the washing machine, you remove everything sensitive,
Starting point is 00:13:30 and then this data, A, will no longer be attractive for any nefarious purposes, and B, can actually be shared to train the model so that models will be applicable to all of us and not just a few hospitals. So it's really interesting. So why did you decide to get into this field? What made you decide that this was the startup proponent you wanted to take and do? I'm an AI, but this is my opportunity to work in healthcare. Okay.
Starting point is 00:13:57 Something to help. So that's one of the exciting parts of this. And it's a startup, right? You guys are doing it? It is a startup, yeah. Okay, there you go. So what do you like about, we'll do a little ad for Utah, what do you like about doing startups in Utah?
Starting point is 00:14:14 I guess there's a talent pool here and other things going on. What do you find maybe some benefits of that over, I mean, I don't need you to throw rocks at Silicon Valley, but maybe there's some skiing. We have we have we've skied it's a good thing energy so a lot of people who are very interested in events so for example last week there was a utah tech week tons of events every event hosted by somebody else and you can see all the different types of communities different types of trust different types of things that are happening. And that's extremely great to see for an entrepreneur because, you know, most of the time it's very lonely and hard work.
Starting point is 00:14:55 But this energy just keeps you going. Yeah. You know, we talk about that on the show a lot, being an entrepreneur and doing startups. And, you know, you're kind of sometimes you're working in the dark there for a while trying to make things work sometimes in stealth mode and you know you can't maybe sometimes talk about what you're doing or sometimes you just feel alienated it's it's hard to be the the startup folks the the ceo it's hard it's hard to be the people who are trying to launch these and get them going because sometimes you don't know if you really have a market fit or if it's going to click or
Starting point is 00:15:29 if consumers or business to business are going to buy it. It seems like a great idea, but you never know. I've seen entrepreneurs build the most greatest product ever known to man, but if people want to adopt for it or adapt to it one of the two or people just don't find value in it even though it's a great product you can sit and look at it you can make that is amazing like people should be buying that you know and uh instead they're buying like yugos or something i don't know it's a 1990s yugo car joke the so why don't we cover that we want to tell people about your thing.
Starting point is 00:16:05 Are you looking for investors? Are you looking for, is there anything that you're trying to do, reach out? Maybe that if someone's out there listening, they can tap into for you. Are you looking for investors or people that can proselyte, spread the word, et cetera, et cetera? What I'm looking for is this awareness that everybody needs to have in terms of the patient's privacy and also awareness that if we don't start to contribute our data, then we'll be in deep trouble as well. So for me, that's the most important sort of two-pronged challenge, how to explain what does it mean patient's privacy and how important it is to think about how you publish your data. Because once it's out, it's out. There is no way to retrieve it.
Starting point is 00:16:48 At the same time, how important it is to make sure that we do have that data. Otherwise, the models will not be applicable to us. And that's the two things that I'm always looking for, people who are interested in hearing this and who are interested in working on this. Another thing which we're looking into is creating, this is more Utah specific, a Utah medical data lake, something that will help AI companies train their models on a large, diverse set of data. Because in Utah, we have urban areas, we have suburban areas, but we also have rural hospitals. We have clinics, tribal clinics. So all of this data becomes extremely important to build effective models that can be used in healthcare. Interesting. So let me ask you this. You're creating a solution for this AI to go through,
Starting point is 00:17:40 identify metadata, identify notifications and stuff. How do they do it now? Is there a way they just take some scissors and cut it off the x-ray? How do they do that now? Or do they do it now? You can think of spy movies where the document is blacked out. But actually the black can be removed under certain lights, so that's dangerous.
Starting point is 00:18:00 You actually have to modify pixels. But let's forget about documents. Let's forget about medical images. Think of videos. So in the video, there's a face that can be recognized. Certain light, you can actually recognize fingerprints. There could be a tag that the company in the hospital is wearing. There could be a whiteboard. You know, sometimes people have presentations and they forget to wipe out the whiteboard in the back.
Starting point is 00:18:21 And that might have some identifying information. There's so many different things, tattoos. There's so many different things one can think of, which actually, it's sort of almost like a CSI, but in reverse, right? You're trying to figure out how to make sure that nobody can identify, if let's say it's a video or a photo, nobody can identify the location and who the person is while still using that for training for certain use cases. And so the AI does it. It probably saves time and money is one of the benefits of having this utilized
Starting point is 00:18:53 if you're a small rural hospital, like you mentioned. Yeah, precisely. But more importantly than for, for example, if we're talking about rural hospitals, it's an opportunity tool. For a while, it was a dirty word, but monetize the data. A lot of rural hospitals are in the red. Something like 80% and higher of rural hospitals in each state are in the red. And it becomes very dangerous because those hospitals will be closed.
Starting point is 00:19:19 But they can monetize safely. And that's one thing. It has to be the data. It has to be de-identified. It has to be no dirty laundry leaves the house, right? But's de identified it can be this data can be monetized so the i healthcare companies can use it to train their model so it's a win-win situation that this is starting to to take hold in the society where people are starting to realize that we can't do that there was was actually an interesting study in Canada where they asked patients whether or not they will be willing to share their data.
Starting point is 00:19:48 Over 90% said yes. Once it's de-identified, they're comfortable sharing data for research. Huh. Now, I would imagine, and I guess you tell me how this works, is it possible that by sharing this data, we could maybe lead to better advances through ai and in health and maybe fight cancer etc that's the thing because well look at the situation the number of doctors is not necessarily increasing but the number of patients in aging population just now population in size, is increasing all the time.
Starting point is 00:20:25 So we're running out of doctors. So whether or not we want it or not, a lot of things will be done with AI. But AI is stuck because it doesn't have access to the data, so it can't be built. As a result, if we can bypass that problem, then we can have AI, which will get better and better, and it won't be replacing decision making but it can replace certain uh certain tasks and be an assistant to the doctor
Starting point is 00:20:52 there you go that sounds great i mean i they say sometimes maybe ai might cure cancer that or it might kill us all it's i guess it's still on the table it's up to ai no it's up to us it's still on the table. It's up to AI. No, it's up to us. It's up to us. We have to think about it. One of the things which is a lot we as a society need to think about is what are we using AI for, what it's trained on, how biased it is, how generalizable it is, basically how can it be applied to different situations. But at the end of the day, we as humans need to think through this. And it's going by us so fast that we don't have time, but we should take time. Most people are busy watching the Kardashians and The Bachelor, so I'm not sure how far that's going to go. But that's the reason we have you on the podcast.
Starting point is 00:21:36 That's why sometimes it's useful to just throw out the TV. Yeah, just throw it right out the window. Make sure you're on the bottom floor, though. You might hit somebody. You know, if you're doing it and nobody's downstairs, then it gives you a very nice feeling, right? Learn something. Read a book, folks. This is why we're talking about AI. Because, I mean, like you say, we are the shapers of this future with AI.
Starting point is 00:21:59 And we've got to really pay attention to pitfalls, dangers. I mean, certainly the top end of it seems to be infinite as to what AI can think about, because AI doesn't really have to think about what we think about every day. A lot of people are chasing kids around and trying to buy the newest car and the newest cell phone and, I don't know, watch the latest Bachelor or Kardashian show. But AI is going to be sitting around going like how do i square off infinity or something you know they're gonna be they're gonna be focused on some things that they're not that we're not you know they're ai is not gonna be worried am i wearing the right outfit you know so it'll be interesting to see what ai comes up with as long as you know it
Starting point is 00:22:42 keeps us around please ai if we like you, don't hurt us. Any final thoughts as we go out and telling people how to onboard or reach out to you for more information or data? Are you guys hiring maybe in the upcoming future?
Starting point is 00:22:56 Not right now. We're a small company, so it takes a while, but we're always looking for fellow travelers. Okay. I can say, please think. Spend time thinking.
Starting point is 00:23:07 And thinking about AI, thinking about privacy, and just making it a conscious choice if you're doing something. That would be my call to action. Call to start thinking. You know, people have to care. We're stewards of this community. We're stewards of this democracy. We all have to give a crap.
Starting point is 00:23:23 We all have to figure this out because if we don't someone else will and it might be a computer somewhere that's called skynet so that's my terminator jokes of ai anyway guys thank you very much for coming the show i learned a whole lot about what's going on with my x-rays and stuff i think what i'm going to do is i'm going to go find me some x-rays that have been cleansed. And then when I go into my doctor and he's like, oh, we have your x-rays here, man. You're in bad shape. And then when he's not looking or he goes, you know, he loses that room for a second, I'll just switch them. And then I'll come back and be like, wait, you've been improving.
Starting point is 00:23:57 And I'll be like, yeah, I don't know, man. I don't see what the issues are here. Don't tell my insurance to say about that. Don't tell my insurance company to do that. Not all the jokes are funny, people. That's how it works. Anyway, thank you very much, Julia, for coming on the show. We really appreciate it.
Starting point is 00:24:16 And good luck with your startup. And we hope you are successful and make the world a better place. Without people like you and entrepreneurs who look to make the world a better place, this wouldn't be so cool so thank you thank you for having me thanks to our audience for tuning in go to goodreads.com for chest chris fos linkedin.com for chest chris fos chris fos one on the tick tockety and chris fos facebook dot com all those crazy places in that be good to each other stay safe be good to each other stay safe and we'll see you guys next time
