Algorithms + Data Structures = Programs - Episode 47: Combinatory Logic!
Episode Date: October 15, 2021
In this episode, Conor waxes rhapsodic about how beautiful combinatory logic is!
Date Recorded: 2021-10-03
Date Released: 2021-10-15

Show notes:
- Functional vs Array Programming Talk
- ArrayCast: Why Tacit?
- The Wolfram S... Combinator Challenge
- On the building blocks of mathematical logic, 1924, Schönfinkel
- An Analysis of Logical Substitution, 1929, Curry
- Combinatory logic. Volume I, 1958, Curry and Feys
- SKI Combinator Calculus
- Higher-order function
- C++20 std::identity
- J language isPalindrome tweet
- C++ std::reverse
- C++ std::equal
- J |. (reverse)
- J -: (match)
- Haskell . (Bluebird)
- J Essays/Hook Conjunction?
- APL/J Trains
- Haskell intersect
- Haskell null
- Haskell Data.Composition
- Haskell .: (Blackbird)
- C++20 std::ranges::sort
- Haskell on
- C++ std::mismatch
- C++ std::accumulate
- C++ std::transform
- C++ std::min_element
- C++ std::max_element
- C++ std::minmax_element
- John Backus's FP
- C++North Conference

Intro Song Info:
Miss You by Sarah Jansen https://soundcloud.com/sarahjansenmusic
Creative Commons — Attribution 3.0 Unported — CC BY 3.0
Free Download / Stream: http://bit.ly/l-miss-you
Music promoted by Audio Library https://youtu.be/iYYxnasvfx8
Transcript
I was really dying when you sent me the recording of the duck that you encountered on your run.
And you hear the duck's quacking, and Conor's doing very bad duck sound effects.
I thought that was a decent duck quack.
Quack.
Quack.
Quack. Quack. Quack.
Quack.
Quack.
Welcome to ADSP: The Podcast, episode 47, recorded on October 3rd, 2021.
My name is Conor, and today with my co-host Bryce, we talk about my favorite topic at the moment,
combinatory logic, and we talk about some combinator birds and real birds as well.
You should teach me some stuff about combinators or something because...
Oh, while we wait for Dave?
Is Dave coming?
Have we heard from him?
I have not heard from Dave.
So...
Oh man, do we want to do like a mini combinators episode
until Dave shows up?
If Dave shows up?
I think so.
I think so.
Okay, here we go.
Teach me, master.
Oh boy, oh boy, oh boy, oh boy.
So some of you may have already seen the talk that I uploaded and premiered on YouTube on October 2nd.
There was a digression in the midst of that talk that covered the history of combinatory logic.
But where do I want to start?
So one, you can go watch that talk if you are interested afterwards.
Also, on the most recent episode of ArrayCast, my other podcast,
I go on like a 10 or 15-minute ramble about how combinatory logic and combinators
have led to the most beautiful code that I've ever seen in my life.
But yeah, let's start with the history, a mini history of combinatory logic.
The year is 1924. My main guy, Moses Schönfinkel. Real sad story: he had health problems and ended up dying pretty
young. But before he did that, he managed to publish one paper on his own, another with a
co-author. And that one paper was called On the Building Blocks of Mathematical Logic. And my guy, Moses Schönfinkel, introduces five combinators: I, C, and S, and two others that don't really matter, T and Z. But C would later be renamed the K combinator, and I,
K, and S, for those of you that are in the know, are the three combinators that make
up the S-K-I combinator calculus.
We'll get to that in a sec.
This is 1924.
Five years later, a guy named Haskell Curry,
some of you may or may not have heard of him.
The Haskell in the Haskell language is from Haskell, and the curry in currying is from Curry.
No big deal.
He rediscovers combinatory logic
and also discovers that Schönfinkel had this paper in '24
that introduces these combinators.
So his paper, I believe, is called
An Analysis of Logical Substitution, or
something like that. And that is one of his first papers that will lead to two or three decades of
research on what ultimately is going to be called combinatory logic. And he publishes, Haskell
Curry, that is, publishes a seminal text, which is volume one of a two-volume textbook. I believe the first volume is authored with Robert Feys, who I actually don't know much about, but co-authors should get mentioned.
And this is basically, it's an untyped, simple lambda calculus. So everyone's heard of lambda
calculus, but, like, at least to me, a lot fewer people have heard of this S-K-I combinator calculus or combinatory logic.
And all you actually need to have a Turing complete language is S and K.
I is expressible in terms of S and K.
Okay, but hang on, hang on, hang on.
What is a combinator?
A combinator, in my opinion, is a composition pattern.
They're all just functions, higher order functions.
Some of them don't really express, quote unquote, composition patterns, such as the I combinator,
because the I combinator is just the identity function. For those of you that have not heard
of the identity function, it's a function that takes a single argument and returns you back that
argument. When I first heard of the identity function,
I thought, that sounds stupid. Why would you ever need that? And admittedly, I think that's a very reasonable reaction to have to a function that literally just returns you what you pass into it.
But if it's so useless, why did it show up in C++20? Well, you need it for composition, right?
You know, if you're going to plug into some other higher order function. Yes, there are many higher order functions. And just in case there are some
non-functional programmers that are not super familiar with the term, a higher order
function is just a regular function, but it can take a function as an argument or return a function.
So it's just a function that takes other functions or returns functions. Great examples of these,
sort of, if you're being a little hand-wavy, are all of the std algorithms, you know, std::accumulate. Any algorithm that can be customized
with a function object or a lambda, you can consider as kind of a higher order function.
It's not, you know, by the book definition, because, you know, things decay to pointers and
blah, blah, blah. But the equivalents of these in other languages like Haskell and functional languages, those are all higher order functions, versus a function like, you know, plus, that just
adds two numbers together. That is not a higher order function. It takes sort of data as arguments.
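As a rough illustration of that hand-wavy sense of higher-order, here is a minimal C++ sketch: std::accumulate customized with a lambda (the lambda and values are just illustrative):

```cpp
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> v{1, 2, 3, 4};
    // std::accumulate is higher-order in the hand-wavy sense:
    // it takes a binary function as a customization point.
    int product = std::accumulate(v.begin(), v.end(), 1,
                                  [](int acc, int x) { return acc * x; });
    std::cout << product << '\n';  // prints 24
}
```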
So all of these combinators, so S, K, and I, I can describe, S is a little complicated, but I and K
are super easy to describe. So I takes a single argument. It's a unary function.
Just returns you what you passed in.
K takes, it's a binary function, takes two arguments and just returns you the first one.
So it throws away the second argument.
Also seems a tiny bit useless, but these things become very important.
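A minimal C++ sketch of I and K as just described; the names identity and konst are illustrative, not from the episode (C++20's std::identity plays the I role):

```cpp
#include <cassert>

int main() {
    // I: a unary function that just returns what you pass in.
    auto identity = [](auto x) { return x; };

    // K: a binary function that returns the first argument
    // and throws away the second.
    auto konst = [](auto x, auto) { return x; };

    assert(identity(42) == 42);
    assert(konst(1, 2) == 1);
}
```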
And S is what is known as the most powerful combinator of all.
In fact, Stephen Wolfram, for those of you that are following that guy, Mathematica,
he has the combinator challenge out right now.
He literally just launched it two weeks ago.
And it's $20,000 to anybody that can show that the S combinator on its own is universally,
is computationally universal, whatever that means.
Initially, I thought that meant, you know, Turing complete. S and K are shown to be Turing complete just on their own, so I thought he was asking to show that S alone is Turing complete, but I don't think computationally universal is the same as Turing complete. But what S is, is a function that takes three arguments: basically a binary function, a unary function,
and an argument.
And the composition pattern there
is you take your argument, pass it to the unary function,
and then you take the result of that evaluation
along with the original argument again,
and pass those as two arguments to the binary function.
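A minimal C++ sketch of that pattern, following the argument order Conor describes (the result of the unary function, then the original argument); the name S and the sample functions are illustrative:

```cpp
#include <cassert>

// S takes a binary function f and a unary function g, and yields a
// unary function: apply g to x, then feed that result together with
// the original x to f.
auto S = [](auto f, auto g) {
    return [=](auto x) { return f(g(x), x); };
};

int main() {
    auto add   = [](int a, int b) { return a + b; };
    auto twice = [](int x) { return 2 * x; };
    assert(S(add, twice)(10) == 30);  // add(twice(10), 10) == 30
}
```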
So that is a little bit of something to digest mentally. But a really easy example, which I think we've talked about on the podcast before, is palindrome. So if you want to check, is this string or is this vector of numbers palindromic? If you reverse it, is it the same thing? So your unary function is reverse; in C++, literally, it's std::reverse.
Your binary function is std::equal, checking whether a string or a list are the same thing.
And the composition pattern then is you reverse the string or the vector,
and then you take that reverse string with your initial string,
pass those to stood equal, and if they're the same,
you know that you have a palindrome.
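A sketch of that palindrome check in C++, with std::reverse as the unary step and std::equal as the binary step (copying the string for simplicity):

```cpp
#include <algorithm>
#include <cassert>
#include <string>

// The S-combinator pattern by hand: reverse is the unary function,
// std::equal is the binary function.
bool is_palindrome(std::string s) {
    std::string reversed = s;
    std::reverse(reversed.begin(), reversed.end());           // unary step
    return std::equal(s.begin(), s.end(), reversed.begin());  // binary step
}

int main() {
    assert(is_palindrome("racecar"));
    assert(!is_palindrome("combinator"));
}
```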
So that's a very, very simple example of where you can use the S combinator. And very interestingly, in the J language, which is the second array programming language that Ken Iverson worked on after APL,
two juxtaposed functions form the S combinator. And they call that a hook because of sort of the pattern, which I'm not going to get into. And that is different from APL, where two juxtaposed functions is the B combinator.
Okay, explain what you mean by juxtaposed.
Just like side by side. So literally, in J, the equivalent of std::equal is a verb or a function called match, which is hyphen colon. In J they have digraphs, so two ASCII symbols. So the equivalent of std::equal is hyphen colon, and reverse is pipe period.
So when you put those next to each other with the binary function first and the unary function
second, literally without doing anything else that automatically forms an S combinator,
which they call a hook. Whereas in APL, if you put those same functions next to each other,
it's not going to evaluate correctly because it forms the B combinator, which if you're a Haskell
programmer is just the dot composition operator, which is you take two unary functions, you first
evaluate the first one, and then you take the result of that and pass it to the second function. So, for example, if you want to add one to a number and then multiply it by two, your first function is one plus x, your second is two times x, and you just evaluate those in order.
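A minimal C++ sketch of that B combinator; the name B and the two sample functions are illustrative:

```cpp
#include <cassert>

// B composes two unary functions: apply f first, then g
// (Haskell's . operator).
auto B = [](auto g, auto f) {
    return [=](auto x) { return g(f(x)); };
};

int main() {
    auto add_one   = [](int x) { return x + 1; };
    auto times_two = [](int x) { return 2 * x; };
    // Add one to a number, then multiply it by two:
    assert(B(times_two, add_one)(3) == 8);  // times_two(add_one(3))
}
```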
So it's very interesting that APL uses the B combinator for what they call two trains, which is just two juxtaposed functions, whereas in J, they use the S combinator. And super interestingly, J was developed in the early 90s, and there's a paper by Roger Hui, who was the main implementer that worked with Ken Iverson on J, published in 2006. He called it "Hook Conjunction?", and in it he basically asserts that after 17 years of experience using the S combinator for two juxtaposed functions, he thinks that was a mistake. Because in APL, they have
the three train, and J also has this. So that's when you have three juxtaposed functions; that forms what's called a fork, or an S prime combinator. And it's very, very similar to the
S combinator, but the S combinator
takes a binary function, a unary function, and your argument. An S prime combinator takes one
binary function and two unary functions. And it has the exact same pattern, but where we passed the original argument as the second argument for the S combinator, here you take the initial argument and apply each
unary function to that initial argument, and then take the results of both of those and
pass them to the binary function.
So in order to define reverse, or sorry, isPalindrome, using an S prime combinator, you
just need to, for that second unary operation, use the I combinator, use the identity function.
So just like pass along the
original one. And so basically, he was pointing out that this S prime combinator, it's the more
general version of the S combinator. So as soon as you have the S prime combinator, you can already
spell the S combinator just with an extra one or two characters. And so it's sort of a waste.
And using two juxtaposed functions as the S combinator leads to a bunch of other things that are less ideal. Um, and I had actually always thought that, but then I was like, oh, this is APL 2.0; Ken obviously, you know, had his best ideas most likely in J. But the thing is, they only started experimenting with combinators in J; they didn't actually have these in APL. And Dyalog APL didn't get combinators or trains until 2014. So eight years after Roger Hui had decided that the S combinator in J was a mistake. So yeah: S, K, I. And the birds that correspond to those are the Starling, the Kestrel, and the Identity, or Idiot, but I don't really like the word idiot. And at this point I've probably lost, like, all the listeners by trying to explain that the S prime combinator is a more general form of the S combinator, and if you have that, yada yada. But, like, it is so beautiful. So for instance, let me take... I talked about this one briefly in the ArrayCast episode.
So for listeners of both podcasts, I apologize.
But I did not approve this! You can't be like, I've already made my peace with the fact that you're cheating on me with another podcast, but you can't come in here and advertise it.
Oh no.
I just, uh, I mean, I can cut it out, but I won't because, hey, I'm the editor of...
So if you want to... there was a, was it a LeetCode or a Perl Weekly Challenge? One of the two.
Given two lists of numbers, return true if they're disjoint, if they have no overlapping elements.
Very simple problem in terms of, like, a problem statement. So if you're given, you know, as your first list, one, two, three, and your second list, four, five, six, that returns true, because those don't have any overlapping elements. So one way to solve this: you take the intersection of the two lists, you know, the elements that show up in both, and then just check, is that empty? And Haskell has both of these
functions: the intersect function is called intersect, and the is-the-list-empty function is called null. So ideally, like, the way you can solve this without using combinators is you
just, you know, in parentheses, you go, you know, if your first list is A and your second list is B,
you go in parentheses, intersection AB, and then the result of that, you just pass to null. But
like, I like point-free. I don't like having to state, you know, my arguments. And I also don't
really like parentheses. I like reading things linearly. So how do you compose a function that takes two arguments with a function that takes one argument? In Haskell,
the composition operator, the dot, composes unary functions: a function that, you know, takes an A and returns a B, and another that takes a B and returns a C. You can compose those together
with the dot and then get a function that takes an A as an argument and returns a C. But this one, how do you do that?
So you basically can't unless if you know about the Blackbird.
The Blackbird, otherwise known as the B1 Combinator.
I love the B1 Combinator.
It is just so awesome.
It comes up all the time.
And you can, if you download the Data.Composition module or library in Haskell,
it provides you with an operator or a function that is dot colon. And that is the B1 combinator.
You throw that in between null and intersection and you're good to go.
What does it do? What does it do?
It composes a function that takes two arguments with a function that takes one argument.
But how?
It's just the mechanics.
You just define it such that it performs that order of operations.
And so what it does is if you put on the left side of the…
Wait, wait, it feeds the binary into the unary? So it's going to, if given a function a, that's a unary
function and a function b, that's a binary function. If you, if you spell a dot colon b,
aka insert the binary, the blackbird or the b1 combinator in between it, it will then construct
a function that takes, that's a binary function that takes two arguments and
first applies B, feeds those two arguments to B, evaluates it, and then feeds the result of that
to A. Oh, okay. I see. So you could very simply do this in Python or C++ by just writing a function
that takes, you know, two arguments, uh, that would either have to be templated, or you can just use auto. So it would be auto a, auto b, auto c, auto d, where c and d are your arguments and a and b are your functions, and then you would just return a lambda that, you know, captures all that stuff. And inside the body of the lambda it's basically spelt: return a, parentheses, b, parentheses, c comma d, end parentheses, end parentheses, semicolon.
I think I got that right.
I might have missed one end parentheses.
Yeah.
I don't know whether that made sense to anybody.
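Here's a hedged sketch of what Conor just spelled out, in C++; blackbird, intersect, and is_empty are illustrative names, not standard ones, and the intersection is done with std::set_intersection on sorted copies (one of several ways to spell it):

```cpp
#include <algorithm>
#include <cassert>
#include <iterator>
#include <vector>

// The Blackbird (B1): given a unary function a and a binary function b,
// produce a binary function that applies b first, then a.
auto blackbird = [](auto a, auto b) {
    return [=](auto c, auto d) { return a(b(c, d)); };
};

int main() {
    // Illustrative stand-ins for Haskell's intersect and null:
    auto intersect = [](std::vector<int> x, std::vector<int> y) {
        std::vector<int> out;
        std::sort(x.begin(), x.end());
        std::sort(y.begin(), y.end());
        std::set_intersection(x.begin(), x.end(), y.begin(), y.end(),
                              std::back_inserter(out));
        return out;
    };
    auto is_empty = [](const std::vector<int>& v) { return v.empty(); };

    // null .: intersect, spelled with the Blackbird:
    auto is_disjoint = blackbird(is_empty, intersect);
    assert(is_disjoint(std::vector<int>{1, 2, 3}, std::vector<int>{4, 5, 6}));
    assert(!is_disjoint(std::vector<int>{1, 2, 3}, std::vector<int>{3, 4, 5}));
}
```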
But anyways, these-
It sort of reminds me of that extra parameter that we have
on some of the range-based overloads of algorithms in C++.
Oh, yeah, yeah, yeah, yeah, yeah, yeah.
Yep, yep, yep, yep, yep.
I can't remember.
What's that parameter?
Yeah, projection.
There we go, yeah.
So what's a great example?
Actually, what combinator is that?
It's applying a unary function to a sequence of elements and then doing your algorithm on it. So that is... I don't actually
think that specifically maps to... Well, isn't it similar to that B combinator that you just
described? It is, yes. I think actually that in all cases, that might be the case. I'm just trying to think that...
So like, let's... The classic example of this is sort. So say you have a list of strings,
and you want to sort them by length. Currently, pre-C++20, the way you're going to do that is by writing a custom comparator, a binary lambda, and then you're going to write the comparator
that is doing a.size less than b.size or whatever.
Or more accurately, LHS.size for left-hand side
less than RHS.size.
But in C++20, you can now use the projection overload
to pass a member function.
So I believe the spelling of that would be, like, ampersand: if it's string, it's going to be &std::string::size. I might have misspelled that.
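A minimal C++20 sketch of that projection spelling, assuming the usual ranges overload (the {} is the defaulted comparator):

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> words{"sorting", "by", "length"};
    // C++20 ranges overload: default comparator, plus a projection
    // that maps each string to its size before comparing.
    std::ranges::sort(words, {}, &std::string::size);
    assert(words.front() == "by");
}
```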
But then once you have that, it's going to basically apply that unary function.
So it's very similar to a transform iterator,
if you're familiar with the Boost iterators or the Thrust iterators. And then it's going to perform the sort on that, uh, basically modified element. So instead of comparing on strings, you're comparing on size_t's or whatever that returns. So I think this is an example of, like, an embedded B combinator. Although... it's not really, though, right? Because actually what's being performed is a custom comparator. So it's applying a binary function in the midst of this sort algorithm, after applying a unary function to modify the elements. So it's actually...
Oh, it's actually an embedded psi combinator.
Wow, that's awesome.
And that's what I mean is that I think
for a function like a std::transform,
it wouldn't really make sense to use a projection, I guess,
or maybe it would.
But like you could really just embed
whatever that projection is inside your lambda
that you're performing.
But potentially you're already using some custom function object, and you don't want to have to create a lambda.
So if you're doing that, then I don't know.
There might be some use case for it.
You could also, there's a range adapter.
There's a views::transform.
So you could just pipe two of those together. Right, right, right. Yeah. But so that's
the thing is in the case where you're performing a unary operation on the projected elements,
it's an example of like an embedded B combinator where you're doing one unary function followed by
another unary function, where it's a binary function, whether that's in the form of a custom
comparator, or, you know or a binary function that returns some
sort of element, that's an embedded psi combinator. So a psi combinator, P-S-I, in case I'm pronouncing
that wrong, it's so awesome. It's known as `on` in Haskell, where basically you apply a unary function to each of the two arguments of your binary function and then apply your binary function.
So it's actually a specialization,
another specialization of the S prime combinator.
So the S prime combinator had a specialization
in the S combinator where one of the unary functions
was fixed to identity.
Right.
The psi combinator is a specialization
of the S prime combinator
where each of the unary functions is the same.
So in the case of doing our, you know,
sorting strings by length,
that unary function is our size method.
And then the binary operation is just the less than
or whatever kind of sort we're doing.
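A minimal C++ sketch of that psi combinator; the name psi is illustrative, and this is the same shape as Haskell's `on`, used here as the custom comparator for sorting strings by length:

```cpp
#include <algorithm>
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Psi (Haskell's `on`): apply the same unary function to both
// arguments, then combine the results with a binary function.
auto psi = [](auto binary, auto unary) {
    return [=](const auto& a, const auto& b) {
        return binary(unary(a), unary(b));
    };
};

int main() {
    auto by_length = psi(std::less<>{},
                         [](const std::string& s) { return s.size(); });
    std::vector<std::string> words{"sorting", "by", "length"};
    std::sort(words.begin(), words.end(), by_length);  // sort by length
    assert(words.front() == "by");
}
```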
Ah, that's awesome.
That's awesome.
That's awesome.
I mean, and there's this fantastic parallel,
although I'm sure this exists in like everything in life, where the same way that there are the
most general versions of algorithms, like std::mismatch is at the top of sort of the mismatch algorithms, and std::accumulate, roughly speaking, is at the top, or the bottom, you know, the root of the tree of reduction algorithms. I say sort of because most of our reduction algorithms return iterators and std::accumulate returns a value. So, you know,
yada yada. But so the point is, is you have these like hierarchical relationships where one algorithm
is a specialization of another algorithm is a specialization of another algorithm, but can all
be traced back to std::transform. The same thing exists in combinator land, like S prime combinator.
I'm not sure if that's a root one, but there's these sort of, not latent, but direct relationships between them. I haven't read enough of the literature to know if they talk about how one is, like, the more general version of the other one. But yeah. And that's the thing: Richard Park, who was one of the panelists on the most recent ArrayCast episode, when he heard me and sort of others talking about tacit programming or point-free programming and using all these combinators under whatever name you want to give them, he used the word highfalutin.
It sounded like, oh, like, you know, and I think there is a lot of truth. Like there is something, I think it's a valid criticism to hear someone going on and on
about like, oh, this is so awesome. It's so beautiful. It's so elegant, but it's, it's all
of this extra literature and sort of structure that you need to learn and know. But I, I think
it is worth learning. And I also think, and you know this already, Bryce, but I think that if you were to express your programs in a point-free kind of form that heavily relies on these combinators, the structure, the patterns within these composition patterns that are combinators, enables a compiler to be written that can do ridiculous, like, AST transformations. I'm not sure if I've talked about this on the podcast or if it's just been in private conversations with Bryce, but, like, and I guess this is fixed a little bit with ranges, but the fact that you call, you know, three std::transforms in a row in C++11, that's three times slower than calling a single one.
And like a sophisticated enough compiler that maybe removed some of the legacy of C++
should be able to see that,
oh, hey, I'm calling three algorithms
and those could be fused together.
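A rough sketch of the fusion being described, with f, g, and h standing in for arbitrary unary operations; whether the fused form is actually faster depends on the caveats raised next:

```cpp
#include <algorithm>
#include <vector>

// f, g, h are illustrative stand-ins for arbitrary unary operations.
inline int f(int x) { return x + 1; }
inline int g(int x) { return x * 2; }
inline int h(int x) { return x - 3; }

// Three passes over the data: three separate std::transform calls.
void three_passes(std::vector<int>& v) {
    std::transform(v.begin(), v.end(), v.begin(), f);
    std::transform(v.begin(), v.end(), v.begin(), g);
    std::transform(v.begin(), v.end(), v.begin(), h);
}

// The fused, single-pass version a sufficiently smart compiler
// could in principle derive from the three calls above.
void one_fused_pass(std::vector<int>& v) {
    std::transform(v.begin(), v.end(), v.begin(),
                   [](int x) { return h(g(f(x))); });
}
```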
We shouldn't say three times slower
because that's not actually correct.
What you mean is three times more efficient.
I mean, it's actually worse than three times slower.
It's typically... you're making certain assumptions there. It may actually be the case that, like, I'll give you an example: let's say that combining those three transforms meant that you blew up cache, right? That, like, when you run all three of those transform operators consecutively on the elements, that causes you to blow out your, like, L1 cache. And so it's less cache efficient, and so it's actually slower to combine all of those operators into a single pass than to do three separate passes.
Right. Okay, yes. So there are going to be certain use cases where I'm sort of hand waving and making general broad statements that won't be true.
Right. But my point is, you're talking about speed, but you didn't actually mean speed, you meant efficiency. You meant, you know, this is a three-pass algorithm versus a one-pass algorithm. You meant, like, speed in a theoretical sense; you didn't mean speed in practice.
I mean, in practice, I feel like... actually, I shouldn't speak about that, but there are many cases where in practice it will be.
Right, right, there are. But there are also many cases where, like, it's not something that you can just blindly do. Um, there will be plenty of cases where, you know, that sort of loop fusion will be less efficient.
And, like, that's something that you have to, like, tune for and balance.
Right.
Things like register pressure, etc.
Right.
In general, though, for, like, the simple cases: there have been code reviews where there were two algorithms, you know, a reduction and a transform, and I've spelled it using the correct algorithms, and then the code review says, this will be slower, use a std::for_each and combine these. And then I profile it, and, you know, sure enough, that's the case. And that makes me sad, because I'm choosing the less expressive way to...
Well, why aren't you using a transform_reduce?
Because that does not work. It's not a transform and a reduction; it's like I'm doing a reduction and a transform, each separately.
So you're doing a reduction, then you're doing a transform that depends on the reduction?
No, they're orthogonal to each other.
Oh, I understand.
But it's like you're iterating over the same data.
Yeah.
And so the most expressive way to code that
is just to use the correct algorithms.
But if you find yourself coding two algorithms
one after another, and they're operating on the same
data, it is going to be faster to use what is, in my opinion, one of the worst algorithms, that std::for_each, and sort of lose the expressivity of what you're doing, but gain the performance.
And my dream is to have a programming language where you can write it the most expressive way and it's still as efficient as if it were sort of folded into one. And that's the thing: using these combinators,
like the classic example is if you're using a S prime combinator, AKA what's known as a fork
in APL, to perform the equivalent of a min-max element. So minimum and maximum are
unary operations. They take a single range and they do a reduction. And then the binary operation
in this case is just like make pair. But there are many different versions of the algorithm.
Like say you want to know the difference between the maximum and the minimum. Well, you just,
you change your binary operation from make pair to minus, and you make sure that your maximum is on the left side of that binary operation.
If you perform, like, you know, min_element and then max_element, that will be half as efficient as the minmax_element, because that's a single pass algorithm.
Well, with this expressed in the S prime combinator, it enables... this doesn't exist yet, because in APL I've profiled this, and it is roughly 2.6 times as slow, based on the profiling that I did. But you could hypothetically write an interpreter or compiler that recognizes that
idiom and says, oh, look, we're doing two reductions. And we know that because it's the
S prime combinator, it's working on the same sequence of data. So you can just bundle those
binary operations into a reduction that performs both of them. And the accumulator is a pair of,
you know, your two results. And poof, you are now able to express this in the most expressive way possible, but you're not giving up any performance.
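A minimal C++ sketch of that fused reduction: one pass with a pair as the accumulator, instead of separate min_element and max_element passes (the function name is illustrative, and it assumes a non-empty input):

```cpp
#include <algorithm>
#include <cassert>
#include <numeric>
#include <utility>
#include <vector>

// One pass instead of two: a single reduction whose accumulator is a
// pair holding both the running minimum and the running maximum.
// Assumes a non-empty input.
std::pair<int, int> minmax_one_pass(const std::vector<int>& v) {
    return std::accumulate(
        v.begin() + 1, v.end(),
        std::make_pair(v.front(), v.front()),
        [](std::pair<int, int> acc, int x) {
            return std::make_pair(std::min(acc.first, x),
                                  std::max(acc.second, x));
        });
}

int main() {
    std::vector<int> v{3, 1, 4, 1, 5};
    assert(minmax_one_pass(v) == std::make_pair(1, 5));
}
```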
And this is just a single example; there's a bunch of other examples on top of that. And I've actually heard someone, I need to look into this,
but apparently John Backus had a language in the sixties. I don't know, I might be off by a decade, called FP, that was largely based on APL.
From the light reading that I've done on that, it is more focused on the algebraic
properties that we've discussed in the past: where, like, if you know the associativity or the commutativity, can you compile down to different parallel implementations of algorithms that can be used in this kind of compiler that I'm
talking about. That, that's sort of a different thing though. That's just keeping track of,
you know, if you're using a point-free, you know, operation and some reduction,
knowing at the end of the day that, oh, it's this composition of things is still both
commutative and associative. Let's call std::reduce. Woo. Hey, Dave. So, uh,
what is point-free programming?
Point-free programming is programming without points.
That was the worst definition I've ever heard.
Really?
The worst?
You haven't heard worse than that?
You used all of the term, like one of the key rules of good definition is don't define it in terms of itself.
I was going to elaborate.
I just wanted, I saw the opportunity for a joke
and I did my best.
It's very counter, or it's very confusing
because in Haskell, as I've mentioned before,
the B combinator, the composition operator,
is the period, which is a dot or a point.
And when you're doing point-free programming,
you're using a lot of combinators
and the main combinator being the B combinator,
aka the point.
So point-free programming
in Haskell ends up using a lot of points. And I was like, what the, what the heck? Like this
doesn't seem right. But what they mean by point-free is argument-free. Point refers to the
function arguments. So not having to mention the arguments explicitly. So if I want to go back to that is-this-set-disjoint question: one way, the non-point-free way, to define that is to go, isDisjoint a b equals null, parentheses, intersect a b, end parentheses, and you're done. But note, I had to name the two arguments, the two lists, a and b, and those get mentioned in the solution. The point-free version is: isDisjoint equals null, B1 combinator, aka dot colon, intersect. And a and b are not mentioned anywhere. There's no
arguments mentioned. And some people don't really, they don't like it. They think it's confusing. I absolutely love point-free solutions.
It's the epitome of elegance in programming, in my opinion.
I just realized we didn't tell Dave that he needed to record on his end.
We didn't.
Oh, yeah.
Well, I assume.
So, yeah.
All right. Uh, you know, the listener has heard some amount of... well, we just heard from Dave. He popped in and, uh, didn't have his mic recording. So, yeah, Dave popped his head in for a sec, and he's sorting some stuff out. But yeah, anyways, we should...
I'm sorry, I'm just making this super hard for you to edit.
No, no, it's the best, man. I have a blast. This honestly, it's every whatever two weeks or
whatever when we record, this is like, it's one of the highlights of the week. You know, what is,
so I was talking to someone, this is just random tangent, but my sister was in town
for the last week from Calgary. Yeah, Shannon, right? Yeah. Yeah. Yeah. It was awesome to have her around and her partner, Evan. Um, and we went to this sort of outdoor patio at one point
and, uh, I ended up, well, actually he was thinking about starting a podcast and yeah,
he was like asking me like, you know, like, how do you, do you like always have topics planned
ahead and, uh, et cetera, et cetera. And I was like, oh, sometimes we do, sometimes we don't.
Like, I mean, it was really structured at the beginning.
And now that's just gone out the window.
Yeah.
We had this whole document of like, he's like, we got to brainstorm.
We got to get the first, you know, 40 ideas laid out.
And then I was like, honestly, when your co-host is Bryce and is as entertaining as he is.
I was like, that's the key.
Just find a really entertaining co-host.
No, no, no.
The key, Conor.
The key is the chemistry.
It's that rich, lush, luscious Bryce-and-Conor chemistry.
That's what makes the podcast.
If it was just me with some random person, it wouldn't be a podcast.
That's true.
That's true.
Yeah, there has to be chemistry. I will admit, though, I think between the two of us, uh, you definitely bring a lot more, I don't know, personality, or whatever it is. But, like, you know, the whole bit last time about you being like, well, I am quite fond of waterfowl. I mean, that's a rare kind of, you know, thing to find in someone.
I was really dying when you sent me the recording of the duck that you encountered on your run.
And you hear the duck's quacking.
And Conor's doing very bad duck sound effects.
I thought that was a decent duck quack.
It was pretty good.
It was pretty good.
Yeah, we got tons of birds. You got a CppNorth, man, you're definitely gonna have to come up and, uh...
Oh yeah. Should we, I mean, at some point, maybe we should do a live, you know, a live recording of this, like, in front of a live audience?
Um, I mean, we definitely need to record in the same room at some point. Whether we would invite... I don't know how I feel about that, because, I don't know, it just seems very... like, do people care enough? Do they actually want to be...
Oh yeah, we have fans. We have fans, trust me.
I don't know. Do we? I think we have people that listen. Would they call themselves fans?
So there's people who listen and tolerate.
But yeah.
Thanks for listening.
We hope you enjoyed and have a great day.