CppCast - Expression Templates
Episode Date: October 5, 2015

Rob and Jason are joined by Joel Falcou to discuss Expression Templates. Joel Falcou is an assistant professor in France where he works on torturing compilers to get the best performance out of modern hardware. He's an active member of the Boost community and CTO of NumScale, a start-up aligned with parallel processing tools.

News: Rejuvenating the Microsoft C/C++ Compiler, Coroutines in Visual C++ 2015, Holy Build Box

Joel Falcou: @joel_f, Joel Falcou on GitHub, Joel Falcou on StackOverflow

Links: NumScale, Expression Templates - Past, Present, Future
Transcript
This episode of CppCast is sponsored by JetBrains, maker of excellent C++ developer tools including
CLion, ReSharper for C++, and AppCode.
Start your free evaluation today at jetbrains.com slash cppcast dash cpp.
Episode 29 of CppCast with guest Joel Falcou, recorded October 5th, 2015.
In this episode, we talk about rejuvenating the Microsoft C++ compiler.
And we'll interview Professor Joel Falcou.
Joel will talk to us about how to get started with expression templates. Welcome to episode 29 of CppCast, the only podcast for C++ developers by C++ developers.
I'm your host, Rob Irving, and I have a bit of a different intro this episode because I need to apologize for some technical difficulties.
When we recorded this episode with Joel, Joel and Jason both recorded their side of the audio fine.
But my recording software had a problem and we did not get my audio recorded from our conversation.
So we thought about just re-recording the entire episode,
but because this is an interview show, most of the speaking and the content tends to come from
our guests and Joel's audio came through fine. So we're going to go ahead and release this episode.
What I've done is I've tried to insert some of the questions that I asked Joel back into the interview.
I apologize if some of this comes through a little awkward,
but I'm trying to do the best I can and preserve the content.
And I just wanted to apologize again and promise that we'll do our best
to make sure this doesn't happen again. Thanks.
At the top of every episode, I'd like to read a piece of feedback.
This week, Druva writes in: After years of programming C++ and spending all my time solving system-side problems in kernel space, the show has rekindled my interest to again go back to books and learn the advances in the language. Thank you for that. Well, thank you, Druva, for writing in. That's definitely what we're trying to bring to the
community with CppCast, trying to give people an interest in learning about all these new and
great things that are happening with the language. So we'd love to hear your thoughts about the show
also. You can always email us at feedback at cppcast.com, follow us on Twitter at twitter.com
slash cppcast, or like us on Facebook at facebook.com slash cppcast.
And again, you can always review us on iTunes as well, which helps us get more listeners.
So joining us tonight is Joel Falcou.
Joel is an assistant professor in France where he works on torturing compilers to get the best performance out of modern hardware.
He's an active member of the Boost community and CTO of NumScale,
a startup aligned with parallel processing tools.
Joel, welcome to the show.
Hi, how are you?
So, what is your favorite compiler to torture?
Oh, well, I was stuck for a while torturing GCC,
because that was the only compiler that could support the kind of torture I wanted to do.
But, well, with time, currently I'm doing pretty much good with Clang,
which has this very nice idea of giving us actual meaningful error messages.
That's always a bonus.
But recently, and I think that goes with one of the news,
is I started working again with Visual Studio. And it started being quite good to do the newer stuff we were dealing with recently.
But yeah, my main target is game.
Okay.
So the first news item is this blog post from the Microsoft Visual C++ team on rejuvenating the Microsoft C++ compiler.
We've all kind of followed how they've gone through
several updates with their compiler for Visual Studio 2015,
which finally brought constexpr support.
And they've kind of detailed all of the work
they've been putting in there behind the scenes.
And Joel, I know you actually were commenting
on this article on Twitter.
Yes, yes.
It's actually very good news
because that shows that Microsoft
actually cares for us
somehow, C++ developers.
Well, I'm saying that
with half a joke, but that's
very cool news because
I mean, Microsoft's
IDE, I mean,
the IDE is probably the best I ever saw.
And I was
rather, you know, sad that the compiler was not really up to date
and not really well performing.
But now if I can actually get this cool IDE and a cool compiler,
I mean, that's basically what you'd be expecting.
I knew a bit of the stories that the blog post talks about,
this kooky way to compile the code and so on.
So it's cool to see that they are getting
out of this quagmire.
I agree. It's a good
article to read if you have any interest in
the Visual C++ compiler, basically, is what
it comes down to. This next article comes from
Kenny Kerr, and he is a friend
of the show and former guest, and he's
writing about coroutines in
Visual C++ 2015, which is a
new feature that got added. You can use it by passing the slash await compiler flag when
you use Visual C++ 2015, and hopefully this feature will make it into the C++17 standard. I have not read
the article yet in depth, but there is a talk from Gor Nishanov, who is the person who implemented coroutines in the Visual C++ compiler, I believe, with a rather enticing title at CppCon 2015 of negative overhead abstraction with coroutines.
So I really wanted to go to that talk, but I was giving a talk at the same time.
So I was not able to, but I am very much looking forward to that being online and learning more
about coroutines in general. Yeah, I mean, coroutine is one of those models that makes,
you know, concurrent programming so much easier. It's cool to also have this kind of,
not actually a replacement, but alternative
for all the systems that we already have. That's actually something pretty much interesting
to look into. I glanced over the blog post. I wanted to go to the talk myself, but I didn't
have time, so I just glanced over the slides, and it looks like pretty much
a nice piece of work. So I think it's interesting to see how it goes, and
what kind of competition it gives to the traditional way of dealing with
futures and whatnot, or threads, and if we can actually get something out of all those alternatives.
So that's, yeah, that's an interesting topic right now.
And this last news item I want to mention is coming from GitHub,
and it's the Holy Build Box,
which is a tool for building cross-distribution Linux C++ binaries.
And I'm just going to let Joel introduce this one.
All right, it solves this very, very embarrassing problem
that if you really want to be able to distribute your executable on
different Linux boxes that
the user may have or not have
installed the proper dependencies
or the correct version of this or that,
you end up with
strange issues that
are not even related to your application, but
the way it gets deployed. So having
a system that can help you actually
make everything, you know, smooth.
I think I spent a bit of time looking at the GitHub repo,
and the way they do it, it's actually pretty cool.
So you are sure you've got everything
in the correct versions, you know,
in terms of dependencies and dynamic libraries
just there in the box.
And that's solving a massive amount of issues
with deploying Linux systems.
So yeah, that's pretty
much something I put on my stack of
cool things to test when I have
time because it seems really, really interesting
and very well thought out.
I'm also going to have to give it a try.
I'm just slightly concerned that
in the big projects that I work on
that could really benefit from something
like this, how gigantic of a package is it going to make?
That's the problem.
I mean, enjoy your multiple gigabytes, you know, deploying stuff.
That's my fear, too.
But, well, I mean, let's try.
Right.
Give it a shot.
See what we find.
Yeah.
So, Joel, can you get us started with an overview of expression templates?
Yeah.
So normally that's a point where you got this, you know, overall movie style background music.
Okay.
So expression templates, it's a bit funny because it's a bit old and new at the same time.
Expression templates actually date back to 1995, when a guy called Todd Veldhuizen
was actually trying to solve an optimization issue
with his small-scale image processing library.
And what he wanted to do is get rid of temporary objects.
So what he did was building, you know,
a library with image object,
with overload operators and functions,
which is a traditional way, let's say.
And the issue is that whenever you wrote something like image A equal B plus C minus D,
every binary operator was generating a temporary that the compiler was not really able to optimize,
probably getting a memory allocation and a copy that should not be there.
And so Todd devised a system that was able to remove all those temporaries.
And to do so, what he did was basically to say that when I write A equals B plus C minus D,
what interests me is the fact that at some point, I want to compute this B plus C minus D in one shot. So expression templates are a way to say, okay,
you've got those types that have this property that any operation you put on them is actually lazy.
They don't do any computation; they just build a very, you know, lightweight object that represents
the actual operation you really want to do, but it doesn't do anything. And you build this kind of
tree-like structures combining operators and functions that ultimately contains reference
to your actual data. And at some point, you want this tree-like structure to be, how to say that,
condensed or compressed into the actual value it should be if it gets evaluated.
So you can actually, using expression templates, arbitrarily delay the time between the point you build an expression
and the point you actually process it in any way.
It could be computing a value in a table.
It could be displaying it.
It could be sending over the wire.
And by decorrelating those two phases,
you can actually eliminate all those temporaries
because everything you would be able to do is,
okay, I have this tree structure.
I would just go down the tree
and compute my value on the fly
and doesn't build any temporaries.
Or what you can do,
and that's what we do in other tools,
is before evaluating the
tree, you can just, at compile time, how to say that, reconstruct the tree in a different way,
because you detected patterns, or you detected useless codes, and so on. And so before evaluating,
you can apply arbitrary high-level optimizations that a compiler may know or may not know about.
And expression templates in themselves are a set of techniques that help us implement
such a delayed evaluation process.
Yeah, that's a bit about the news for it.
I think that it's probably one of the most, you know, not complex, but intricate items of C++.
And so I said it was both old and new because it's very old.
I mean, it's 1995.
That's almost now 20 years, something like this.
Yeah, it is.
It's a 20-year-old technique.
And the funny thing is that, you know, its popularity or usage pattern just, you know,
waxes and wanes with time, because depending on how much need people have
for this kind of stuff, it just, you know, climbs up in popularity before just going back to the
obscurity it came from, and so on and so on. And currently, actually, there is a lot of discussion
about whether this technique is still needed. Because, I mean, if you want to get rid of temporaries,
what about just using move semantics and return value optimization?
OK. And in fact, yes. And so if you remove the no-temporaries trick, what's left for expression templates?
And what's actually left is that as you can actually perform this arbitrary transformation on your expression,
you can actually embed in your library information that the compiler cannot,
you know, deduce itself. Like this simple example of, let's say you have a linear algebra,
you know, library, and people keep writing something like a equal b times the inverse
of some matrices, which is a very costly operation. And if you are an expert in algebra, you know that this particular pattern
can be simplified by calling a very much simpler operation.
And that's what expression templates lets you do.
Match this pattern, replace the code by something faster, and then generate it.
And that's where we are still winning with expression templates
because we can teach compilers to exploit
high-level expert information from the code that they cannot find out by themselves.
So you're really asking a lot of the compiler, is that right?
Oh, yes. Michael Caisse, I don't know if you know who he is, actually wanted to make me a t-shirt
with a compiler abuser or something on it.
Yeah, we stress the compiler a lot
because basically we ask it
to do its work
at least two or three times, you know.
So it's a very, very intensive
compile time process.
And there were a lot of work
by different people
to try to solve that.
But the idea is to say
that basically,
okay, you probably compile twice as slow as before,
but you expect your code to be 10 times faster.
So sometimes, I mean,
you are ready to pay this huge amount of compile time
because you know that the final application
will be blazingly fast.
But it's a problem of, you know,
choices of how you want to do your thing.
So you mentioned linear algebra equations, that kind of thing.
Is there other kind of like real world applications for this
that maybe our average listener would be able to appreciate?
Oh, yeah.
So, yeah.
So linear algebra is the first stuff.
That's basically what it was made for.
You have a bunch of libraries.
For example,
I think some people may know about
Boost Spirit,
which is a library that helps you write
text parser and text generator
from a grammar.
Okay, so that's basically built using the same system.
So you basically write directly
what looks like an actual language grammar,
and it gets compiled to the most efficient automaton
that is able to parse your code.
Other example is the Boost MSM library,
which lets you build state machines using such systems. That just looks like what you would be writing if you were writing a state machine in UML.
And then again, compile to the perfect automaton for that.
You can also use expression templates on a very small scale when you really only want to be able to
capture a very few specific patterns that you want to optimize, so you can actually have a small lazy
operation that you can put into basically any kind of library that requires this optimization to be done. For a while, it was used in Boost Phoenix,
which was basically the C++ 2003 version of lambda functions.
So you can actually write,
actually regular C++ expression in the code
and get a Lambda.
And that was basically doing the same.
And speaking of regular expression,
Boost RegExp is also another example of that.
So you have a library.
You need a way to be able to optimize arbitrary patterns
and get rid of temporaries in the same time.
You basically need expression templates.
So yeah, that's basically the use case.
So Boost Spirit, all these libraries, are actually better designed as small languages.
So, if your library looks like a language, better use expression templates.
So, you just came back from a three-hour talk on expression templates at CppCon this year.
How did it go?
Yep, I survived.
You survived.
Well, actually, it was quite cool.
My first CppCon was a bit different because, I mean, the conference was brand new and we
had a few ideas about what kind of talk we should be giving.
And by witnessing last year's CppCon, I was struck by the fact that, okay, people come
there to learn things they don't know about, and they want to go back home with pieces of, you know, information that they can go and reuse in their actual, you know, production code.
So I devised my presentation this year to be some kind of, you know, progressive steps: what are expression templates,
what you should be using them for,
this is a way to write one by hand, which doesn't take ages,
and this is the limitation of this, and this is the tool you should be using instead.
I got a lot of feedback just after the talk where people were actually quite happy, because
they actually learned small pieces of stuff, even if they probably don't get the grand idea as a whole. But
people learn about
the CRTP
techniques, which is
basic stuff that we need to use for
implementing expression templates,
and for them it was already
a lot because they knew the name,
didn't know what it was, and same for
the people that have issues
seeing why, for example, auto was not a good
fit with expression templates, and so on. So it was quite long, but I think even for me it was quite a
good experiment, because I felt I could actually, you know, bring something to some people, and that
was the effect I was looking for. CLion natively supports C and C++, including the C++11 standard, libc++, and Boost.
You can instantly navigate to a symbol's declaration or usages, too.
And whenever you use CLion's code refactoring, you can be sure your changes are applied safely throughout the whole codebase.
Use Subversion, Git, GitHub, Mercurial, CVS, Perforce via plugin, and TFS, with a unified interface for all of those. Run a terminal
inside the IDE and install one of the dozen of plugins like Vim emulation mode, for example.
Download the trial version and learn more at jb.gg slash cppcast dash clion and use the following
coupon code to get a 20% discount for the CLion personal license:
CppCast JetBrains CppTools.
So you just kind of casually mentioned CRTP,
which I'm guessing some of our listeners are not familiar with.
Would you mind going into that?
Sure. So CRTP is the Curiously Recurring Template Pattern. The idea is that you want to have some types that behave in a polymorphic way,
but you don't really want to pay the cost
of runtime polymorphism.
So virtual functions are out of the way.
So how do you do this?
The idea is to say that
you have a bunch of derived classes
that implement some behavior, and they all inherit from a base class,
which is a template class, and they pass their own type to this base class.
So you end up with something like struct derived colon public base of derived. And that's strange because it looks like you're actually defining the class with its own name.
That's where the curious part comes from.
And when you look at the base class,
so the base class just says, okay, I'm a base of some type T.
And what you have there is that you have the exact same interface
you want your derived class to have.
And inside, what you do is you say,
okay, if you have a base of something,
what I really want to do is actually call the interface,
the function of something.
And what you do is you take your this pointer,
you cast it to your derived class and call the function on it. And you are
allowed to do this because the only way you can actually have
a base of t is when you are writing the t
class that derives from base of t. So you know that your
this object is safely castable to the
derived class.
And so you end up passing base of t reference
when you pass base into a function,
and you call the interface from base,
but statically it recalls the function from the derived class.
So it's basically polymorphism resolved at compile time.
Okay.
So you have the same...
Go ahead.
Sorry.
You have the same benefits
on regular polymorphism in terms of interface.
That means you can have case A and B and C
all with the same interface
that you can pass to function in an arbitrary way,
and they all select at compile time
the correct implementation of the interface you
need. Okay, so this is something compilers are used to seeing, and they're able to do things like
inline the function calls. Yeah, exactly. And you don't pay the cost for the virtual table, and so on,
and so on. Okay. And so we need that in expression templates, so you can actually wrap every kind of
expression, unary expression, binary expression, expression with three parameters, below a
single expression class that does things the correct way.
And you don't have to care how many sub-tree you have in your expression.
Everything is there, and you can call it in a generic way.
Yeah, you probably lost half your audience at this point.
I'm sorry for that.
That's fine.
So one of the questions I personally had is, I just, because of my own schedule, I only saw the last 30 minutes of your talk at CppCon this year.
But something you mentioned, and something that I believe Edouard mentioned when he was on our podcast, was that reducing symbol sizes in these gigantic template metaprograms
can reduce your compile time.
And that's been bothering me for the last couple of weeks on several levels.
I want to know, is this true across all compilers, for instance?
That's a good question.
I'm actually trying to get a meaningful experiment
with controlled environment and properly instrumented compilers to exactly know what happens.
Currently, it's something you observe from the outside.
But it's very difficult to know in your compile stack when the magic happens. My hypothesis is that at some point when you build a type, you have some representation of
this type or this type name actually stored as a string somewhere inside the compiler. And at some
point, if you have too many too-long symbols, you basically end up in a corner case of the
compiler's algorithm for handling names that devolves from its normal complexity
to something that just blows up.
Okay.
And my idea is to say,
okay, let's actually measure these things
and see where it came from.
Actually, which part of the compiler
is actually acting weird
and try to see what happens.
Because if it's basically,
well, I mean,
algorithmic problem inside the compiler, maybe we can
fix it, ship a patch, and done.
And if not, if it's something
more complex, we can actually have clues to know
what's going on. So currently, GCC
actually
exhibits this behavior.
And I think Edouard
spoke about Brigand last time.
And in Brigand,
it's very clear. I mean, as we don't need to use
very complex type names using macros like what MPL does, it's basically instant compile time,
and the only places that take time to compile are the ones where we cannot do anything, that
emulate what MPL was doing in terms of naming. So it's exhibited by GCC.
It's exhibited by Clang.
But by experience, it looks like you need to have far greater symbol size to get any meaningful impact.
OK.
And Visual Studio also shows something similar.
It's still, you know, something you see and you know that if you do it differently, it's
better.
I'm still missing the scientist behind me.
I'm still missing the exact why.
So I have this plan to actually run a proper experiment and see what's going on.
Okay.
So probably make a blog post about that whenever I got something
and see if we can actually solve this the proper way.
For me, it looks like somewhere deep into the compiler,
you have to sort all of this.
And there are so many types with such long,
you know, names that something, you know,
crumbles under its own weight.
And the idea is to find what.
Well, so until the meantime,
until the compilers are fixed,
do you have any practical...
Am I here?
Hello? Hello?
Yeah.
So in the meantime, until the compilers are fixed for this kind of long symbol name problem,
do you have any practical advice for how to identify the symbols that might be a problem
for us?
Well, identifying is actually the main issue.
It's more like, you know, a cumulative issue. You know, I think that you can actually make a phony file where you don't have very long names, but you have a very large amount of them.
And at some point, you would just get the same effect.
So actually what you should be trying to do is think like a compiler and try to count how many names you should be handling and how long they are and try
to minimize this globally.
So the using technique that
Edouard mentioned is actually one
of the very simplest ones.
What I
suspect is that as soon as you have
recursive template types
that can contain themselves
and so on and again and again,
you end up with some kind of, you know,
almost quadratic behavior
because you will build the first one
and you will copy it again into the larger one
and so on and so on and so on, you see.
So recursive templates that instantiate themselves
inside themselves are probably something
you want to try limiting.
And on the more, let's say, down-to-earth cases,
you don't really want to have types
that depend on too many other types for a long time, actually.
It means that in a single translation unit,
you should try to limit the amount of memory
that your type names should be consuming.
One trick we found,
that's probably the one you see when you were there,
is you can actually erase type information
if you capture value in a lambda,
which is not polymorphic,
and that just returns an internal structure
that contains a template you want to hide,
and you just return this value.
It's a bit complex.
Probably it's better if you look at my slides.
But basically, yeah, you can hide type information inside a structure
which is returned from a local lambda.
Because the compiler will say that this type is basically
name of the lambda something, the name of the type inside,
which may contain template parameters by capturing them from the outside, but not having them inside its own type.
And this kind of erasure of the type actually lets you manage the exposed symbol size;
you can just arbitrarily reduce it.
And on this front, MSVC is actually probably far ahead, because when you return something from a lambda, the lambda name is basically, I don't know, maybe a 16- or 32-character GUID, which has no relationship to where you call it or what the file name is.
So you always end up with 32 characters, whatever the crap you put inside.
And that's something that is basically constant in terms of size
and solves a lot of problems.
And so if the GCC or Clang guys are actually hearing you,
yeah, please do the same thing.
I think we might have a couple of listeners on those teams,
but I'm not sure.
Okay.
So, yeah, I tried playing with that technique
yes that you're mentioning that I did
see in the last part of your talk and
I couldn't quite get it to do what I wanted
to do so your slides are online
now is that right? Yes
they are online and you should have
all the material into my
GitHub account, and there's a
CppCon 2015
repository,
and if something is not working,
just send me an email
and we can work something out.
Okay.
So aside from watching your talk
when it's available,
do you have any resources
you'd recommend for developers
who want to get started
with Expression Template?
Yes.
So there are two takes on that.
First,
either you really want to understand
what's going on inside. That's something
that some people like to do before
jumping into something.
I recommend reading the
actual first paper
about the subject by Todd Veldhuizen.
I got the reference
somewhere. I can give it to you
afterwards.
It basically goes the way
I went in my presentation: okay, this
is a problem I want to solve, and this is what we need to do. And it goes, you know, step by step,
building all the building blocks so you understand everything, you know, inside out. I don't recommend
trying to do this manually at home for more than just, you know, teaching purposes, because it's very cumbersome if you want to maintain such a code base, okay?
And once you get a grasp of it,
you can jump into the documentation of Boost Proto,
which is a Boost library that is actually some kind of compiler toolkit.
So you can actually build your own expression templates in a vastly simpler way.
So Proto provides abstractions about
what your language should be supporting, or how you transform an expression into something else, and so on.
So it lets you focus on designing the actual interface of the language instead of, you know, getting into the gory details.
It's a bit slower to compile but you can actually play with a
lot of things. There is a bunch of examples ranging from a simple
calculator to an actual pseudo array-based computation system. I also made,
I think it was, a three- or six-hour tutorial about Boost Proto at C++Now a bunch of years ago.
You should be able to find the slide and material on my GitHub.
And it's a piece by piece, you know, let's do this with Proto, you know, from the very beginning,
checking everything in between so you can actually know what each part of the library brings to the complete results.
There is not much other
material. I know there is a bunch of guys that were using Boost.Proto or manual
expression templates that may have blog posts on that, but usually people just use it
silently, you know, and don't really boast about it, for obvious reasons. So, yeah. Start with Veldhuizen's paper.
Try to transition towards
the documentation of
Boost Proto. Try to see
Eric Niebler, the Boost Proto
author, talk about Boost Proto.
You got to find example
and go through into the details, which is
very well done. And I got a bunch of
material on that and a couple of tutorials
which also can help going inside this very complex subject.
Did you have any personal highlights from attending CppCon this year?
Well, I spent a great time over there, as always, with the gang.
I really liked the Bjarne keynote about the Core Guidelines
and the GSL effort.
That's something which is actually
bearing fruits already.
I think we needed this kind of effort
to get out of the PR nightmare
that we were pretty much stuck for years.
And that's very cool that Bjarne,
all these people in Microsoft
are actually teaming up to get this live.
I think that was the
most, you know, enlightening part
of the conference. I
really enjoyed all the keynotes.
As usual, Jon Kalb did
a great work on that.
The talk
were great. The venue
was great, but that's given
now. So, yeah, definitely coming
back next year. Where can people
find you online, Joel?
You can follow me on Twitter,
even if I don't tweet much, but
sometimes I do.
Most of my
tricks,
doing something, lies in my
GitHub account at jfalcou
in a very obvious way.
They can reach me
on classical social networks,
most of them being Facebook,
Google+, or LinkedIn.
I don't think there are many people
with my name and surname combination,
so I'm pretty much a given.
And I usually have
a very C++-centric avatar,
so I should be, you know,
noticeable out of the potential ambiguity.
And I'm always responding to email if needed,
so just shoot me something at joel.falcou at gmail.com if needed.
Jason, was there anything else you wanted to ask before we let Joel go?
I did not. I am good, thank you.
I've got a lot to research now.
Joel, thank you so much for your time.
Thank you, Rob, thank you, Jason, and thanks for this opportunity, and listening to you soon.
All right, thank you for joining us.
Thank you.
Thanks so much for listening as we chat about C++.
I'd love to hear what you think of the podcast.
Please let me know if we're discussing the stuff you're interested in, or if you have a suggestion for a topic, I'd love to hear that also. You can email all your
thoughts to feedback at cppcast.com. I'd also appreciate if you can follow CppCast on Twitter
and like CppCast on Facebook. And of course, you can find all that info and the show notes
on the podcast website at cppcast.com. Theme music for this episode is provided by podcastthemes.com.