Screaming in the Cloud - Fixing Shadow AI and Surviving re:Invent with Chase Douglas
Episode Date: February 5, 2026
Chase Douglas, CEO at Archodex, talks about AI security problems and why re:Invent has become a nightmare. Chase helps companies capture every AI interaction so they don't get in trouble with... compliance. Corey and Chase discuss Shadow AI, why Corey runs Claude Code in an account called "Superfund," and how re:Invent put metal spikes on benches so people couldn't sit down. They also talk about why AWS released fewer announcements than before, and why Chase is finally optimistic about AI coding tools after months of frustration.
Show Highlights:
(01:51) What Archodex Does
(07:00) The Superfund Account for AI
(08:19) Shadow AI Problem
(11:41) What Happened at re:Invent
(14:59) Sponsorship Costs at re:Invent
(17:00) Metal Spikes on Benches
(21:39) AWS Releases Declining
(25:24) Why Chase Is Finally Optimistic About AI Coding
(27:13) Code Review Changed with AI
(31:22) Where to Find Chase
Links:
Archodex: https://archodex.com
LinkedIn: https://www.linkedin.com/in/chasedouglas/
Sponsored by: duckbillhq.com
Transcript
There's a lot of interesting stuff happening in AI where we're starting to realize on the coding front.
Like, what does it mean to code efficiently with AI when you get into spec-driven, test-driven development?
Welcome to Screaming in the Cloud. I'm Corey Quinn, and I am joined by dear friend of mine and yours, Chase Douglas, CEO at Archodex these days.
Chase, how have you been?
I've been good. I've been good.
You wound up here through an interesting series of evolutions.
You were the CEO at a startup called Stackery.
Before you did that, no one knows.
No one has any idea what your past is.
Just to be clear, CTO, because I worked alongside some amazing people.
Oh, my apologies.
All those three letter titles are strange.
CTO at Stackery and co-founder.
Let's not forget that part of it.
Exactly.
Which honestly means your job is whatever needs to be done in that moment.
You have five things that must get done today.
Here's a thousand-question questionnaire for compliance, about printers, about your SaaS product. AI can do it. Yeah. So then you got acquired by
AWS, which was interesting because the way I found out that you got acquired was I got an
email as a Stackery customer. And then I emailed a few of you, and you'd be like, what the hell? Like,
hey, we've been like that for months now and we haven't been allowed to tell anyone. Thanks for
noticing. So yeah, that was Amazon marketing once again taking a vow of silence for no
discernible reason. You were there for a few years. You got a few things out the door, the second
coming of Stackery, but it's a UI-focused product at AWS, so it's of course doomed.
That company cannot build a good UX to save its life. And then you decided that, all right, I'm going to go do something else, something that makes me unhappy, unreasonably so, and you went to go tilt at a windmill. What windmill is it?
I'm building a product that's called Archodex. It's me and a fellow principal engineer from AWS. There's a lot of really interesting, fun stuff happening with AI.
When companies go out there and they say, like, we have a great idea for how we can give
people an amazing feature, an amazing solution, a product, whatever, and it's going to use
AI on the back end.
A lot of times you get these product teams, they dive in, they do some cool stuff, some POCs,
and then they're like, we're ready to launch.
And then they start talking with their lawyers, their compliance teams, especially in
enterprise departments with like five different security teams.
And everything skids to a halt. And you have to figure out, what do we do to solve a lot of this compliance stuff?
And especially because AI, like, people are sending all kinds of crazy stuff into these systems, freeform text and whatnot.
And so it's a challenging space.
But what's really interesting is leading companies doing AI.
So these are companies like AWS, who themselves have their own AI products.
Not like Bedrock is an AI product, but I mean like more like their DevOps agent.
Like it's using AI.
Intercom with their Fin AI-enabled customer service solution.
Sentry with, again, their kind of DevOps-focused AI stuff.
I've talked with people at companies like these.
And there's a lot of common things where they'll say, okay, to solve for security and governance and compliance,
we have this kind of, like, central pipeline that we send all our LLM/AI interactions to. And from there we figure out, what do we need to do? Audit logging,
where do we send the raw logs? Analytics, how do we anonymize these things and send it to the right
analytics warehouse or OTel service vendor of choice? Potentially anomaly detection
for security. But the point is there's a central pipeline where this is all governed and managed
because of how radioactive it could be if this were mishandled.
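That central choke-point idea can be sketched in a few lines. This is a minimal illustration, not Archodex's implementation; the in-memory sinks and the email-only redaction rule are assumptions for demonstration:

```python
import hashlib
import json
import re
from datetime import datetime, timezone

# Stand-ins for real sinks: a durable, access-controlled audit store
# and an analytics warehouse that only ever sees anonymized data.
AUDIT_LOG = []
ANALYTICS_LOG = []

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def anonymize(text: str) -> str:
    """Replace obvious PII (just email addresses here) with a stable hash."""
    return EMAIL_RE.sub(
        lambda m: "email:" + hashlib.sha256(m.group().encode()).hexdigest()[:8],
        text,
    )

def record_interaction(model: str, prompt: str, response: str) -> None:
    """The single pipeline every LLM interaction flows through."""
    event = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "model": model,
        "prompt": prompt,
        "response": response,
    }
    AUDIT_LOG.append(event)  # raw copy, tightly access-controlled
    ANALYTICS_LOG.append(
        {**event, "prompt": anonymize(prompt), "response": anonymize(response)}
    )

record_interaction("example-model", "Summarize feedback from pat@example.com", "Done.")
print(json.dumps(ANALYTICS_LOG[0]["prompt"]))
```

In practice the audit sink would be a locked-down durable store and the redaction rules far broader, but the shape is the point: one choke point that forks a raw copy and an anonymized copy to their respective destinations.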
Yes.
Even now, we're a small startup.
We're building software at Duckbill.
And in fact, this is probably a good point to wind up doing the sponsored read.
Since we are now the only sponsor of this; after years, I have finally gotten it down to just us.
We are helping companies solve their cloud contracts with software and services to boot.
If your cloud contract is expiring within the next year, please reach out to us.
If nothing else, you can get that sad look in your eyes that we all get when talking about contracts combined with cloud billing
and commiserate.
And odds are we might be able to help you out with things.
Our product Skyway is doing amazing things around contract management.
Please, check us out at duckbillhq.com.
But as we were mentioning, from the AI perspective, yeah, we have an entire policy
internally around which systems we're allowed to put AI through, for models, how we get
those models.
and we have a standing policy that customer data does not touch AI just for, you know, customer comfort, if nothing else.
Right. So there's a bunch of stuff in this space about how do you handle this?
Whether you're dealing with public information, customer data, you're mixing things, whatever it is,
how do you make sure you have confidence? There's one place where all of this flows through a single pipeline
that is properly anonymized if it needs to be for certain endpoint destinations.
You know who accesses those. You've got logging on who can see my Snowflake cluster or wherever I've got my audit logs. Once you've
got that in place, then you can have confidence to turn on the gas and really enable your product
teams to build, deliver, ship awesome new features. So that's what we are working with customers
to do is to make this turnkey easy. And that's where my co-founder Aperva and myself come in, being previously AWS principal engineers, working in an enterprise environment, working with legal teams to define and develop processes and implementations for compliance considerations, security, and whatnot, as we worked on AI-enabled solutions. If you're saying to yourself that you are in this problem space, having to figure this out, and you wish you had two AWS principal engineers working on this and helping you solve this and figuring out how to talk with your lawyers internally or externally: give us a ring.
You can hit me up on LinkedIn, Chase Douglas.
You can go to our site.
It's archodex.com.
That's A-R-C-H-O-D-E-X.com.
Request a demo, hit our contact form, whatever you want to do.
We're here to help you.
We're early days.
So we're building things that meet what our customers are asking us to build for them. Hit us up so that we can help you. I do want to highlight that this has become
a problem. Here's the approach I take. Because I do a lot of random one-offs with AI. Go vibe code me
a thing. I just redid my entire email newsletter production pipeline over the break for that reason.
I've been sitting on that for six years. It was $50,000 of development effort to have a human do
it. Or I can now finally get to a point where I can just poke Claude Code for a few weeks and
it winds up spinning out something that works for me. And the way I do this is I run Claude Code
in its own EC2 instance. It has root on the box. I run this in its own AWS account. It has
an administrator access IAM role for that entire account. And that account is called Superfund,
as in the government program to clean up toxic waste sites. It's expensive and it's toxic.
And it does a whole lot of terrible things. But there's nothing persistent in that account. And
there's no long-term data; there's no data that lives there at all that has any sensitivity.
So everything that gets out of there has to go through a CI/CD pipeline that has more guardrails on it.
Because I don't want the innovation to get slowed down, but I also, you know, don't want to accidentally the database.
I can't run Claude Code in dangerous mode on my laptop.
That has stuff on there that cannot see the light of day.
So I have to be a little bit more cautious.
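One guardrail such a CI/CD pipeline might run before anything leaves the sandbox account is a secret-scanning gate. A minimal sketch, where the patterns and function names are illustrative assumptions rather than the actual pipeline:

```python
import re
import sys

# Illustrative patterns only; a real pipeline would use a dedicated
# scanner (and far more rules) before promoting any artifact.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "private_key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan(text: str) -> list:
    """Return the names of any secret patterns found in the artifact."""
    return [name for name, pat in SECRET_PATTERNS.items() if pat.search(text)]

def gate(artifact_text: str) -> bool:
    """CI guardrail: refuse to promote anything containing a credential."""
    hits = scan(artifact_text)
    for name in hits:
        print(f"BLOCKED: found {name}", file=sys.stderr)
    return not hits

print(gate("print('hello, world')"))         # clean artifact passes
print(gate("key = 'AKIAABCDEFGHIJKLMNOP'"))  # leaked access key is blocked
```

The design point is that the sandbox account stays fast and permissive, while the single exit path, the pipeline, carries the scrutiny.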
Yeah.
And it's also a challenge when there's a thing that people started calling shadow AI.
It's this idea of, oh, yeah, when we are rolling out our AI-enabled products, we put it through Bedrock.
So it's going through all the AWS guardrails and everything.
That's great.
But what are your developers on the teams doing when they're spiking their POCs and they're testing out, you know, some changes or whatever?
I've talked to so many engineers from so many companies who say, yeah, we have Bedrock, but, like, it's locked down.
I can't use Bedrock to really figure out what model's going to work best
and how to prompt it the right way.
So I reach for things like openrouter.ai,
which now lets me hit any number of models and test them out.
But the head of security and compliance at your company is like, wait, you're doing what?
We don't have any relationship with OpenRouter.
We don't know where that goes.
We don't know how they log things.
We don't know what data you're mixing into it.
So that's this so-called shadow AI.
You're sending your entire code base through this thing into Claude Code.
Well, in my case, jokes on you.
Claude Code wrote the whole thing anyway.
Really?
being able to capture that and understand what's going on. That's the other really kind of
interesting, innovative part of what we built is with kind of like a zero instrumentation approach,
you can slap on a little bit of instrumentation to your like compute cluster, and we'll be able
to capture every AI-LLM interaction and tool interaction that your workloads do, whether they're
in dev, test, production, QA, whatever it is. And so that's kind of an exciting thing, too,
that not only are we helping make sure that people have these centralized guard-rail pipelines
on the back end, we're making sure that they're capturing everything on the front end, too,
even in dev and test.
I want to kick the tires on this myself.
One of the things, even in this Superfund account, as you can imagine, it uses in some cases
the Anthropic API, in some cases, Bedrock.
In one godforsaken instance, it's using a lot of Vertex AI over on Google Cloud for unrelated
reasons.
And one of the challenges...
I've heard people like it.
Well, it's great.
The challenge I've got is that every quarter when they come out with new models, I get to play
whack-a-mole throughout a bunch of different micro-repos and figure out what's calling what.
I've been able to do it so far by API key and guess and check, but I don't want to be running
Sonnet 2 anymore. So seeing what's calling what, and from where, would be helpful.
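That guess-and-check hunt can be partly automated by grepping for hard-coded model identifiers across a repo. A hedged sketch, where the regex and file extensions are assumptions to extend for whichever providers you actually use:

```python
import re
import tempfile
from pathlib import Path

# Illustrative model-ID patterns; a real inventory would cover every
# provider's naming scheme and every config format in your repos.
MODEL_ID_RE = re.compile(r"claude-[\w.-]+|gemini-[\w.-]+|anthropic\.[\w.:-]+")

def inventory(repo_root: str) -> dict:
    """Map each file in a repo to the model identifiers hard-coded in it."""
    found = {}
    for path in Path(repo_root).rglob("*"):
        if path.is_file() and path.suffix in {".py", ".ts", ".json", ".yaml"}:
            ids = sorted(set(MODEL_ID_RE.findall(path.read_text(errors="ignore"))))
            if ids:
                found[path.name] = ids
    return found

# Demo against a throwaway "repo"
with tempfile.TemporaryDirectory() as repo:
    Path(repo, "worker.py").write_text('MODEL = "claude-3-5-sonnet-20241022"')
    Path(repo, "config.yaml").write_text("model: gemini-1.5-pro\n")
    print(inventory(repo))
```

Run across each micro-repo, this turns the quarterly model bump from guesswork into a checklist of files to update.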
My co-founder was just playing around with our demo that we built yesterday, and he noticed that on OpenRouter, one of their previously free models is just returning errors, saying,
uh, we no longer do this anymore. We're not doing a free tier of this model.
So you've got to go find a different model.
I don't know how many people are using open router in production per se,
but this is a kind of challenge.
It is, as you said, whack-a-mole of
how do we keep this stuff operational
when there's thousands of different models
out there and how you access them,
what their price points are,
they're changing all the time.
You even noted AWS is changing their price points
on some of their, I don't know if it was models.
It might have been, what was it again?
They changed their pricing recently on? Capacity Blocks for ML, raised by 15%. They do claim to adjust them every quarter, and everyone's whining. It's like spot. It's dynamic pricing. Changing things once every
three months is not my definition of dynamic. Maybe that's enterprise dynamism. Yeah, I suppose.
Maybe the enterprise CFO is totally fine with that. Speaking of enterprise dynamism,
I am curious as far as what it is that you have seen about re:Invent this year.
So I went to re:Invent. I went there to have conversations with people. A little bit to learn, but
most of the time I don't go to like sessions or whatnot. It's about like networking, finding people
in the hallways at the events and both like expanding the people I know who I can then follow up with
when I realize, oh, they know a thing that I don't know and I want to know more and I could do that
after the fact, but also just, like, you know, in real time, learning: what's the state of play? What are people doing? And yeah, this year was interesting. I didn't go last year, but I had been to, previous to that, eight re:Invents in a row. So this year,
things felt a little different. I don't know what you kind of sensed in it, but it was getting
harder to just have those networked conversations with people. I was talking with like Ben Kehoe.
And we were struggling just to meet up. Like where are we going to meet? Because if you've been
to reinvent in the past, you know that it's in this sort of like very large casino conference
center place. And if you are walking around, there are places where it extends even into a mall.
And there's benches, there's this place called St. Mark's Square with a ton of tables and chairs, and you can sit and have a conversation.
There are restaurants lining all of these venues that 10 years ago, you could just go to any of these restaurants during the day and just sit down and have a coffee or something.
Over the past few years, they would get sort of like locked down in the evening for private events.
But this year, every single one of those places, including the illy coffee shop, was locked up as a private event the whole day and night.
Mike, my business partner, was saying that the catchphrase for re:Invent is "closed for private event."
God forbid, you're making the terrible decision to vacation there that week.
Where are you supposed to eat?
Every restaurant is booked up and closed.
It's awful.
And he couldn't find a cup of coffee at one point, which, if you know Mike, is like three quarters of his personality. I just needed to get some easy, quick food. I ended up going
like one block down, which Vegas, one block is not a block anywhere else in the world, one block
down to get to the closest easily accessible fast food, it was like subway or whatever,
which is in like the very back of a low-key, like, slots casino. And I was like, you know what?
At least if I went this far, it's semi-quiet here.
I got the bling going on of, like, the slots.
But I don't know.
Like, it's hard.
It's hard.
You know, like, I'm friends with, you know, many other people from lots of different companies.
I was talking with some people from one company that has sponsored reinvent in the past and had booze at their expo and everything.
They're like, yeah, we're not even doing that this year.
Like, it's so expensive as a vendor to show up.
And the leads are... it's hard to justify what you're getting out of it as the expenses have gone up and up and up for the conference.
And so this is kind of happening on both sides.
A ticket to go to the conference is $2,100 this year.
The sponsorship has, like, a crazy entry price, probably like 50 to 75K for a tiny little booth
at this point.
I don't actually know.
And it reminds me of Cory Doctorow's idea of enshittification. It's gotten terrible. It was burned into my memory years ago that at re:Invent, in the opening keynote, Andy was talking about, what is this for? Is it a sales conference? No, he lied, given how expensive everything has become. Is it a training event? What is it? He said, it's about education. It's about learning things. Yeah, the problem I'm seeing is that the education has a very speaking-from-doctrine, as opposed to, you know, speaking-from-reality perspective.
I couldn't find places to sit this year.
And I don't know if that's because there were fewer places to sit,
or I just wasn't invited to some of the special places.
I too got to sit with Ben Kehoe and chat.
And the reason I was able to do that is because I snuck my way into a place I didn't belong
because everyone thinks I'm an AWS hero.
This episode is sponsored by my own company, Duck Bill.
Having trouble with your AWS bill? Perhaps it's time to renegotiate a contract with them.
Maybe you're just wondering how to predict what's going on in the wide world of AWS.
Well, that's where Duck Bill comes in to help.
Remember, you can't duck the Duck Bill Bill,
which I am reliably informed by my business partner is absolutely not our motto.
To learn more, visit DuckbillHQ.com.
There was some ex-AWS hero allowances that I availed myself of as well in the same way.
But to your point, the thing I found hilarious was the Venetian entrance.
So the Venetian is like the casino hotel that this is in,
and it's got its own entrance separate from the Palazzo,
separate from a bunch of other places you can get in.
The Venetian entrance had, historically, this kind of like,
you'd almost think it was a fountain, but there's actually no water fixture in there.
It's just kind of like a planted area in this rotunda.
and you could sit there on these low, like, areas.
It was almost like a bench.
And this year, there were these like metal decorative grates
that had like spiky tips on those benches.
You could not even sit there.
I had to laugh when I was walking and I'm like,
oh, I know I can go down there
and I walk a quarter mile down to that place.
And I get there and it's like,
what's greeting me is metal spiky things.
so I can't even sit there.
So this was while I was doing a FaceTime with my family one of the evenings.
And I'm like, oh, turns out I'm just going to be pacing around this rotunda as I talk with you
because I got nowhere to just sit.
But yeah, like Cory Doctorow, like, other Cory.
He spells his name incorrectly, but I like him anyway.
Right.
It's like, phase one of enshittification is, like, you know, making it, what was it, super awesome for the consumers.
Like, you know, re:Invent 10, 15 years ago was all about education.
And then phase two is we're making it not so awesome for the consumers; we're making it super awesome for, like, vendors, businesses.
So like, okay, now, you know, it's a great place to go
and have a booth and vendor or whatever.
And then like phase three is we're now raking in so much money.
This is our business.
We're not making it awesome for either the consumers or the businesses.
And it's just a money machine now.
And I feel like that's what ReInvent is.
It's like, I don't actually know what the revenue or profit is like for re:Invent itself.
But, you know, I just look at the pricing and everything.
And I'm like, well, who's actually making out well here?
And it's like, well, I think it's AWS is primarily making out well here for their conference.
So, yeah.
So one thing I'm curious about is I need to talk to more folks in various spaces.
But Reinvent has lost all sense of itself.
I gave a talk, two talks, this year, or last year now that this is January.
And it was the only time for those talks where I left the Venetian.
And it was like a completely separate conference.
There was an entire world I never got to see.
This, it could have been two separate conferences for all I noticed.
So I don't fully understand the value, or what value people think they're getting from it.
We're looking at budgeting for marketing for the year.
And that's pretty expensive to sponsor that stuff.
Like, people are like, oh, it would be great if we had, like, a place, like a desk we could do a demo at at our booth rather than just the kiosk.
It's like, yeah, that's going to have a six-figure bill attached to it if we want that.
Like, is it worth that much? I don't know. It feels like a cash grab.
I do have to say, the best event I went to at Reinvent this year was your event at Atomic Liquors
because it was well off strip over in the Fremont area. It was free.
The right people show up; some people just fly into town just for that event.
And the people who end up there are there because they know it's going to be good.
They're not just there like, where's my next free drink on the strip kind of thing?
And I had such great conversations there.
That was like re:Invent of old.
But you have to find it in pockets here and there now.
That is my favorite part of Reinvent.
I would do that regardless.
But the rest of it, I am at a point now where I don't go to sessions other than the ones I'm speaking at, because I don't like waiting in line forever to get turned away at the door.
A lot of the people I normally want to talk to just aren't there in the way that they used to be.
Some of the folks that I love spending time with at re:Invent the most just weren't showing up.
It is a shadow of itself.
And I get it, nothing good can stay.
That is, that is the inherent problem.
Like, nostalgia.
We always want things to be like they used to be.
But I do find myself wondering who this conference is for because increasingly it's not for me.
And maybe that's okay.
there's a lot more people out there who are not like me than people who are, thankfully for the world.
And maybe that's the right answer.
I'm sure AWS has a vision for this, I hope.
I just don't know what it is.
Yeah, the other interesting thing is,
I was curious when I was just seeing, like,
I follow the what's new feed to keep up with AWS,
and especially at ReInvent every day,
it's like, okay, so what was actually released here?
And I remember five to ten years ago
every day at ReInvent was like another 100 releases.
And it would actually take me a couple hours just to sift through them.
I'm not reading every one of them.
I'm just sifting through the headlines and then poking in on the 10 that are interesting.
Now, that was also during the heyday of them just releasing anything for any niche that anyone said,
yeah, sure, that sounds like I would like that from AWS.
Yeah, go build that.
And they would.
It's not necessarily good either.
I went back. I could only go back in their feed history to, like, 2022. But you could see the amount of releases has dwindled too. So, like, 2022 was something like 101 releases, still coming out of COVID a little bit. 2023, 127 releases. And then 2024, 66 releases. 2025, 69 releases.
So it's really coming down.
And some of that is, I think, the pivot to AI and having to re-figure out what they're doing in AI.
But I think some of it is also a little bit of like,
where is the continued innovation that they're known for?
I'm going to be very curious to see, like,
just kind of how the future continues to unfold for AWS,
because there are a lot of people there who remain who are super smart and doing amazing stuff.
And you see that in some of the stuff that continues to get released.
And you've also seen, you know, as I have, I'm sure, a lot of people have left AWS for various reasons.
And I kind of wonder, what is happening with the innovation? And if that's part of what has also driven re:Invent over the years and what's been amazing about it, where is that going, that input that feeds into what makes re:Invent great?
I am deeply curious to see how that's going to shape up.
I just, at this point, it feels like it can't be getting results for folks.
Like, you take a look at some of these sponsor packages for tens of millions of dollars,
and are you generating that much business from reinvent for that?
Brand marketing.
I think it's brand marketing at this point, when you can go up and down the strip
and see on every casino's giant video board that about two out of every three ads is for a tech company: come see us at Expo booth number 592.
And then, hilariously, the other third are for, like, rodeo cowboy gear, because as soon as re:Invent leaves, it's now, like, a PBR rodeo event or something.
But yeah, it's crazy.
I don't know, I'm not a CMO at a giant company.
It's a weird mess.
I really want to thank you for taking the time to chat with me.
about this. What are you looking forward to this year? It's 2026. We are on the cusp of something
interesting happening, I think. What are you excited about? Like, there's a lot of pessimism
out there, but what do you actually, what makes you smile when you think about this coming year?
Because if it's reinvent, we're done. This is a little bit more like individual,
a little bit more personal sized. I'll tell you what I have really enjoyed. I have had moments
of skepticism about AI. And I also want to say I'm not like, I know there's environmental concerns
about it too. And I want to hope that we will find ways of lessening its impact on all kinds of
different ways and not just try to shove it under the rug. But that said, there's a lot of
interesting stuff happening in AI where we're starting to realize on the coding front. Like,
what does it mean to code efficiently with AI when you get into spec-driven, test-driven development?
And I have actually found it fun.
Midpoint of last year, I was, like, throwing spaghetti at the wall, like, AI, go do this for me, try it.
And getting terrible results.
And then, you know, talking with people, it's like, you need to do the spec-driven kind of development.
And all of a sudden, it's like unlocked so much productivity.
And yet I still get enjoyment when I get to say to my AI coding tool of choice being like,
yeah, cool, I see what you did there, but my Rust program that you wrote, like, you shouldn't be using threads for parallelism, you should be using async Tokio mechanisms. And being able to take the raw computer engineering, you know, like, I went to school for computer engineering, using stuff I learned from the operating systems class back then still today, to mix that in. Even though I'm
not handwriting the code, I am still activating that part of my brain that's putting that to use
and knowing that it's creating better software because I'm putting the right guardrails around the
stuff. So I will say that coming out of 2025, I'm much more hopeful about just what it means to be a
software engineer than I was coming into 2025 and hearing all this AI stuff and yet feeling
very unsure about it. I would say that I'm very happy with a lot of what
Claude Code is putting out. And yeah, is it architecturally inconsistent? Yes, so is my own
nonsense. What of it? You can iterate forward on it. And efficiency of code versus does it work
is important. So the question right now is, how do we review all of this slop code that's being generated? Well, how do we build systems that do that? Because on some level, like, well, then I won't
know what every line of code in production is. Yeah, you're right. You won't. Do you look at the compiled
output in some languages? Of course you don't. No one does. So how do we wind up making sure it does
the thing that we think it's doing? And that leads to better test harnesses, better controls around
things that have criticality in them. But if I'm building something for fun or a demo for shit posting,
I don't care as much as I do if it's the payments code.
There's a difference here.
And there's a different level of rigor that I bring to it.
And also, I think it's useful to point out, early career for myself, when I reviewed code, I would review it line by line.
Like, is this actually doing the right thing?
Later career for myself, to this point, as you start to get seniority and tenure and you're actually helping organizations of teams,
build, ship, deliver, and manage their code bases,
you start to realize, like, you can't do line by line reviews.
It's way more about what are the things I'm seeing here?
What kinds of, like, how did you think about the tests you chose to write?
How did you think about the architecture you chose?
How did you think about some of your implementation choices, the library choices?
And I'm going to poke in and just ask you this and ask you that.
And if you're giving me good answers, then I'm going to have way more confidence it's doing what it should be doing. And if there are issues, it's probably just a bug that we need to address, but, like, all code has bugs you've got to address eventually. And if you're giving me
not so good answers, then it's like, okay, now we need to dig a little deeper. That kind of skill
set is still highly applicable to AI. I will have times where I'm like, I see what you're doing there,
but I want you to change it a little bit because I know we need to be doing it a slightly
different way. There are other times I was dealing with some code yesterday where I'm like,
no, you're doing it wrong. And I'm telling you, I know you're doing it wrong. You need to be doing
it this way. It was even just trying to put some heavy dependencies into regular dependencies instead of just developer dependencies. And I had to really say, no, you're doing
this wrong. And eventually it's like, you know what, you're right. That should be in the developer
dependencies. So it doesn't end up in the container image we're going to be shipping to customers.
That kind of skill set is still applicable in AI-driven or augmented development today.
That has me happy about being a software engineer still.
I find that, and this is going to be controversial, I get it.
People tend to view software engineering in different ways.
For me, it's never been typing.
It's never been wrestling with weird syntax or boilerplate.
It's never been getting all of the weird things set up correctly.
It's been more about architecture and designing the overall flow.
Maybe I'm the weird one.
I've never been a professional software engineer, just enthusiastic amateur.
That's the part of it that I like.
And I think that people are in some cases missing the forest for the trees.
I'm sure that'll come back to bite me.
At least there's always going to be software that needs to be written, just like there are always going to be buildings that need to be constructed. Engineering, as just a skill set, is always going to be needed.
And so it's just a matter of figuring out, like, what are our best tools to use today?
And how do we figure out how to use them efficiently and hopefully enjoy using them in the process?
We should all be so lucky. Maybe that's something to look forward to.
Chase, I want to thank you for taking the time to speak with me.
Again, people can visit you at archodex.com to learn more.
Hit me up, as I said, LinkedIn even.
It may be easier for you to just remember, Chase Douglas.
Send me a message, connect.
Like, there's lots of ways to reach me, to reach us and what we're doing with AI,
governance, security, compliance, that kind of stuff for organizations.
You know, we're new, so we don't have the gargantuan marketing reach yet.
But yeah, like I said, if you want Amazon-caliber principal engineers helping you figure out how to do compliance and governance
and security for your AI workloads. Find us. We'll help you out. And hopefully you don't want
Amazon caliber UX designers implementing the same. Chase, thank you so much for your time. I appreciate it.
Yeah, yeah, yeah. This has been so much fun. Anytime. Anytime. Chase Douglas, CEO at Archodex.
I'm cloud economist Corey Quinn. This is Screaming in the Cloud. My thanks to my own company, Duckbill HQ.
And if you've enjoyed this podcast, please leave a five-star review on your podcast platform of choice.
Whereas if you've hated this podcast, please leave a five-star review on your podcast platform
of choice, along with an angry, insulting comment featuring an annotated copy of the re:Invent sponsorship prospectus.
