The Changelog: Software Development, Open Source - Coming to asciinema near you (Interview)
Episode Date: October 11, 2023. This week we're joined by Marcin Kulik to talk about his project asciinema. You've likely seen this out there in the wild — asciinema lets you record and share your terminal sessions in full fidelity. Forget screen recording apps that offer blurry video. asciinema provides a lightweight, text-based approach to terminal recording with lots of possibilities. Marcin shares the backstory on this project, where he'd like to take it, who's supporting him along the way, and we even included 11 minutes of bonus content for Changelog++ subscribers.
Transcript
This week on The Changelog, we're joined by Marcin Kulik, talking about his project, asciinema.
You've seen this out there in the wild, I'm sure.
asciinema lets you record and share your terminal sessions in full fidelity.
Forget screen recording apps that offer blurry video. asciinema provides a lightweight, text-based approach to terminal recording with lots of possibilities.
Plus, it's open source.
Marcin shares the backstory on this project, where he wants to take it, who's supporting him along the way.
And we even included 11 minutes of bonus content for our Plus Plus subscribers.
If you're not a Plus Plus subscriber, correct that at changelog.com slash plus plus.
10 bucks a month, 100 bucks a year.
Drop the ads, get closer to the metal,
get bonus content, of course,
and support us directly.
We appreciate that.
A big thank you to our friends
and partners at Fastly and Fly.
This podcast got to you fast all over the world
because Fastly is fast, super fast, globally.
Check them out at Fastly.com.
And our friends at Fly will help you put your app and your database closer to users all over the world with no ops.
Check them out at Fly.io.
What's up, friends? I'm here with Vijay Raji, CEO and founder of Statsig, where they help
thousands of companies from startups to Fortune 500s to ship faster and smarter with a unified
platform for feature flags, experimentation, and analytics. So Vijay, what's the inception story
of Statsig? Why did you build this? Yeah. So Statsig started about two and a half years ago. And before that, I was at Facebook
for 10 years where I saw firsthand the set of tools that people or engineers inside Facebook
had access to. And this breadth and depth of the tools that actually led to the formation of the
canonical engineering culture that Facebook is famous for. And that also got me
thinking about how do you distill all of that and bring it out to everyone if every company wants to
build that kind of an engineering culture of building and shipping things really fast,
using data to make data-informed decisions, and then also informing what you need to go invest in next. And all of that was fascinating and really, really powerful.
So much so that I decided to quit Facebook and start this company.
Yeah, so in the last two and a half years, we've been building those tools that are helping
engineers today to build and ship new features and then roll them out.
And as they're rolling it out, also understand the impact of those features.
Does it have bugs?
Does it impact your customers in the way that you expected? Or are there some side effects, unintended side effects? And knowing those things helps you make your product better. It's somewhat common now to
hear this train of thought where an engineer or developer was at one of the big companies,
Facebook, Google, Airbnb, you name it. And they get used to certain tooling on the inside. They
get used to certain workflows, certain developer culture, certain ways of doing things, tooling,
of course. And then they leave and they miss everything they had while at that company.
And they go and they start their own company like you did. What are your thoughts on that?
What are your thoughts on that kind of tech being on the inside of the big companies and those of us out here, not in those companies, without that tooling?
In order to get the same level of sophistication of tools that companies like Facebook, Google, Airbnb, and Uber have, you need to invest quite a bit.
You need to take some of your best engineers and then go have them go build tools like this.
And not every company
has the luxury to go do that, right? Because it's a pretty large investment. And so the fact that
the sophistication of those tools inside these companies have advanced so much, and that's like
left behind most of the other companies and the tooling that they get access to, that's exactly
the opportunity where I was like, okay, well, we need to bring that sophistication outside so everybody can benefit from it.
OK, the next step is to go to Statsig.com slash changelog, where they're offering our fans
free white glove onboarding, including migration support, in addition to five million free
events per month.
That's massive.
Test drive Statsig today at Statsig.com slash changelog.
That's S-T-A-T-S-I-G.com slash changelog.
The link is in the show notes. We are here with Marcin Kulik, the creator of asciinema.
Welcome to the show.
Thank you. Thanks for having me.
asciinema has been around a long time.
For those who don't know, this is a service which allows you to record and play back your terminal.
You'll probably have seen it in readmes, on websites, all around the world.
12 years, I think you said, in your email.
You've been working on this.
Can you take us back to the beginning?
Why you built this in the first place and how?
Yeah, it's been 12 years now.
Why?
That's a good question.
I think it was just like a hacker spirit.
I was playing in my terminal with this old-school script command that some Unix systems have, like Linux and macOS. It has its roots in the '70s, I think, and it just records all the terminal output into a file called typescript. Not to be confused with the TypeScript language.
Yeah, yeah, okay. This format predates the whole JS ecosystem thing. Before it was cool.
Yes, it goes back to before TypeScript was, exactly. So I was playing with it, and it just felt so magical when I replayed it in my terminal without touching the keyboard. It was just like a ghost would be typing, and it felt really, really cool. And I thought, well, not many people know about this, and how can I share these recordings with my colleagues?
The immediate thing was like, well, I can send the typescript file over email and ask them to replay it in the terminal. And then it felt like, yeah, that's just too much to ask.
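For reference, the classic util-linux workflow Marcin is describing looks roughly like this (a sketch; flag spellings vary between script versions, so check your man page):

    script -t 2> demo.timing demo.log     # record a session; timing data goes to stderr
    # ...work in the shell, then type `exit` to stop recording...
    scriptreplay demo.timing demo.log     # replay it at the original speed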
And I thought, well, there should be a way to share this on the internet,
on the website somewhere so I can just send a link to someone so they can watch it.
And then I started experimenting with how I would replay such a file in a browser, on a website.
And yeah, it was just me experimenting. My initial prototype was a jQuery-based replay inside a div element on a page, with some span elements inside, some styling, etc. It was a proof of concept.
It worked.
So I thought, yeah, that sounds cool.
And fast forward 12 years, and it's kind of bigger than I thought.
And a lot of people use it.
So there's been many phases in this project,
the early stages and then the recent developments.
It's been growing consistently, gradually. It was just a fun side project initially. I didn't even know if it would work, but I saw that it could work. So I just persisted.
Well, 12 years is a long time, and it does seem like over those 12 years the trend of sharing your terminal with the world, whether you're demoing your new library or project or a terminal tool you built, has been trending upward dramatically.
Even just sharing code snippets, sharing technical things in general onto various social networks, and in readmes, of course. Even as open source tooling has kind of matured to where people now release their open source tool and they have a marketing plan, you know, a go-to-market document where they're like, here's what I'm going to do. And "use asciinema to create a video" is probably on a lot of people's to-do lists when they're going to launch their new thing.
So you were definitely way ahead of that trend, and really took the hard bit.
I mean, who did we have on, Adam? From Carbon? Remind me the tool that allows you to have really cool, pretty code snippets and share them with the world, and he made it dead simple. I can't recall.
Anyways, we talked about the technical details of that and how interesting it was. But in terms of problem sets, I mean, that's a static image, right? So Marcin, you really picked the harder of the tasks, right? Like, let's record disparate terminals on all these different computers around the world, and then provide some sort of way of hosting and embedding and playing back. So you bit off a big chunk. You said it started off as like a jQuery div replay thing, like you're manipulating HTML elements.
Yeah, well, but that was just the initial prototype. It was a Ruby on Rails application for hosting, for uploading the recordings, and inside the assets directory in the Ruby on Rails project there was a giant JS file which was meant to replay that. It was kind of a hacky, ad hoc approach to replaying. It was not what the player is today, as a distinct, separate, full-featured thing. But yeah, that was how it worked initially. It was like, create a div on a page and then go over the recorded frames.
By frames, I mean timed chunks of data that were printed to the terminal at the time of recording. And then parse those chunks of data and try to render them on the page with a set of span elements within a span element.
So you were taking the script and actually animating what the script was doing.
Exactly.
So you weren't actually recording a video.
That's totally correct. So this was, other than the ability to share the recordings easily, the other aspect that I focused on from the beginning: to solve the blurry screencasts. Video at the time, like 2010, 2011, was okay, but not as high-definition as these days. So when someone recorded a tutorial for YouTube, or to put on their own website, by just recording the screen as pixels into an MP4 file or something like that, it didn't scale well. It was way too much bandwidth for how little essence is actually in there, right? I mean, by essence, like the core information payload. Because it's just text, after all.
So the player, back then and still today, is animating using HTML elements. It's animating the terminal. Maybe I'll go back a little, before the animation part.
What happens is the player actually embeds a terminal emulator. Like you have your iTerm or Terminal.app or Alacritty or xterm. In order to correctly display how the terminal looked at the time of recording, it needs to be recreated at the time of playback. The recorder doesn't record your terminal visually, it doesn't grab the text characters like a grid. It just intercepts the standard output, everything that the applications, like your shell or other applications running in your shell, are writing to the terminal. So it's just a byte stream, and the player reconstructs that. Basically, it's a simple terminal emulator in the shape of something that looks like a video player, right? So at the heart of it, there's an actual terminal emulator. It wasn't like that from the beginning, but these days this is a terminal emulator I wrote in Rust, compiled to WebAssembly and embedded in the JS player.
So some really cool technology.
I'm learning this as we go.
I just thought they were videos, dude. Like, now I'm inspecting element, I'm watching it change the HTML. Talk about hard mode. Adam, did you know that these aren't actually videos? They just look like videos.
I suspected that it was like character sniffing, essentially, and recreating video out of it. Okay, I suspected that, because that would make sense, because it's characters, essentially. What else would make sense is you just record the screen and then provide a place to play that back, you know? But maybe this was back when that was more difficult, and you were trying to solve fuzziness. So here we have resolution independence, so that's pretty cool.
Exactly. And that allows me to scale the font, for example, dynamically, automatically. So if you open any recording on asciinema.org, which is the primary hosting site that the community uses, if you resize your browser window, you can see how the player resizes, and the text at the same time. And you can select the text and copy and paste it, because it's just text there. If you hit the F key for full screen, you go to full-screen mode, it fills your entire screen, and the text also scales up accordingly.
That's really cool.
I thought it was simpler than that.
That sounds hard.
I'm sure you've worked on it and improved it over time. I guess it's probably a huge data win too, because you're hosting these things at the end of the day. What does my computer upload in order to host that? Is it just whatever I sent to standard out? Like you were saying, it's probably minuscule in comparison to if I recorded my screen for 30 or 60 seconds and sent that to you.
Exactly. The file format the recorder uses is called asciicast, which is a JSON-based text format which you can just open and read. The current version of the recorder uses asciicast format version 2, which is newline-delimited JSON, where you have a JSON document on every line. And those are small JSON documents. Well, calling them documents is not quite the right word here. But it's a kind of readable format. It's documented in the asciinema repository, so you can write your own recorder, you can write your own player, and then you can deal with those asciicast recordings.
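To make that concrete, a minimal asciicast v2 file looks roughly like this: a JSON header on the first line, then one JSON array per event holding the time offset in seconds, the event type ("o" for output, "i" for input), and the data. The header fields shown here are just illustrative:

    {"version": 2, "width": 80, "height": 24, "timestamp": 1696000000, "title": "demo"}
    [0.25, "o", "$ "]
    [1.31, "o", "echo hello\r\n"]
    [1.82, "o", "hello\r\n"]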
So how the recorder works is, when you start the asciinema recorder in your shell, it creates a thing called a pseudo-terminal. Unix systems have this capability, PTY, a pseudo-terminal, which creates an imitation of a terminal for the program which runs in it. So the asciinema recorder creates a pseudo-terminal inside of which it launches your shell again, and that shell gets recorded, and whatever you type in it gets recorded. By being in control of this pseudo-terminal, the recorder can intercept all the output that goes from your programs inside the terminal, as well as all the input, which is the keystrokes. But that's off by default. So you can optionally capture keystrokes, and these would also be included in the asciicast file.
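In practice that looks like the commands below; the --stdin flag for capturing keystrokes is taken from the recorder's documented options, so double-check asciinema rec --help on your version:

    asciinema rec demo.cast            # record the session to a local file
    asciinema rec --stdin demo.cast    # also capture keystrokes (off by default)
    asciinema play demo.cast           # replay it right in your terminal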
Does it also record things like Vim? Like if you open up Vim and different things like this, does it record literally everything and recreate it, you know, as if you're sitting there watching my terminal in the past? Obviously, it's not a video.
Yes, yes. It records everything, like Vim, what have you. Anything. Right now, the only thing that you cannot really reproduce is those newfangled things like images, displaying regular images inside the terminal.
Emoji? How about emoji?
Well, emoji works. The recorder and player speak UTF-8 natively, and so emoji also works there. And the whole recording, when you, let's say, record a short session, like five minutes long, depends on what you do there and how much activity there is, but roughly it can take like 10 kilobytes of disk space. And if you started a recording session and went to make a coffee and forgot about it for 10 hours, let's say, and you come back to your computer at night and it's still there, over those 10 hours it would write zero bytes to the disk, because it only captures...
Not capturing any action.
Yes, it only captures when there is action. That's why the files are so small.
It'd be cool if Adobe Audition would do that, Adam, because I've definitely stopped recording a podcast, but not actually stopped recording the podcast, and then left, and then the next day I have come to my computer to like a 17 gigabyte file or more.
Well, in that case,
it is capturing though
because it's capturing the microphone.
It's capturing the silence
of the microphone.
Yeah, I mean, it would compress well.
It is idle silence.
Not as cool as asciinema
that just is like,
hey, if you're not a standard outing,
I'm not capturing anything.
So that makes sense.
Well, one thing I've noticed though, Marcin, is that
even though you may delay, so
in particular, let's say you're typing
a command. You're walking somebody through
NetPlan. You want to reconfigure
your network configuration. I just did this as a test.
And you open it up in Vim. But while you delay typing new characters in, the cursor kind of continues to blink. So you're saying that in a scenario like that, while you're delaying more characters being typed, or tab being pressed, or the command being completed, there's that delay. It's not writing anything, but the cursor continues to blink. And in the playback, you're mimicking or playing back that delay. How does that work, to not write anything but also write the delay? I guess, how do you know the time? How do you not write something but also know there's a delay in the user interaction with the terminal?
Yeah, so the right word is mimicking. There's just this trick that I implemented: cursor blinking at a constant rate, like
every half a second, it switches from being visible to invisible and it gets reset.
Like the blinking cycle gets reset every time there's new output being rendered.
This is actually how real terminals do it as well. Some of them stop blinking after some time, but in general there's an initial delay: when you're typing, the cursor is visible,
the block, the white block is visible all the time.
But once you stop for half a second or second,
something like that, it starts blinking.
And this is purely visual thing.
It's not related to any data being written anywhere.
It's a playback thing, essentially.
Yeah, a presentational thing.
So the player is exactly mimicking how terminals do it.
What I find interesting too is that in this playback,
I can copy and paste from the video.
Like you said, Jerod, you thought it was a video.
I never inspected them.
I just stare at them.
I'm like, oh, that's a cool video.
I guess I never thought about it.
Me, I've never really... similar. I've never been somebody to figure out how I can use this. I don't do much teaching. I haven't used it. I don't do much demoing. So I'm usually a consumer, not a producer, in regards to asciinema... ASCII cinema... gosh... however you say it.
It's all good. First time I've said it out loud, actually, so forgive me.
There's one funny pronunciation which shows up every few years: ASCII enema.
Oh my gosh.
Right.
That's when it's time to beg to differ. You know, when someone says that one, then you can interject and say, well...
That's where I draw the line. You have to choose one: ASCII cinema or asciinema.
But I love that.
I mean, because, you know, especially as a teachable tool,
like if you're teaching with the tool,
if you're recording your terminal and you're an educator
or even just somebody helping a friend or a coworker in any way, sharing the knowledge you have and you're recording your terminal either for yourself in future playback.
Because sometimes I do documentation to teach future Adam what past Adam did and why did he?
So in many ways, these could be like a version of a video to tell me in the future through motion rather than just static documentation.
You know, how to go through the process of setting X, Y, Z up and how that goes.
And you can actually get to see it, but you get to copy and paste from that.
So it's not this, you know, video where you can't OCR the thing or whatever like you have.
It's literally text.
Right.
That's actually a good idea. I've never thought about that, but I've often just gone through steps that I've painfully learned: here's the process. And I will throw them into whatever notes, a Markdown file, Obsidian, sometimes into a .sh file. I'm like, well, these are just executed, you know, I'll just put it into an actual script. But other times I just want to know what I did, because I'm not going to run it, I'm going to modify it. And actually just recording yourself and saving that for yourself later is probably a pretty compelling use of asciinema that I had never considered.
The only thing I've changed, though, is the speed. I know that you record the delay, you recognize it. I'd love it if, when I type a command, it could be played back at a certain speed, you know? Is there a way to say, speed it up to X? Because there are certain things where you literally see it back in real time, so if you're thinking, or you're delaying, it's too slow, right? What you get with video is the ability to edit and, you know, finesse the output. Whereas this, thus far, seems to be one-to-one. Is that true? Can you edit these things? Is there finessing to this thing?
Yeah, yeah.
There are two controls that you can use to make it more pleasant to watch.
So one thing, the speed of playback: the player supports changing the speed. One way you can do it is when you self-host the player, which is available as an NPM library that you can embed on your own website. There's an option when you initialize the player called speed, and you can just pass the number two, for example, and you have double the speed, or 0.5 to halve the speed. For the recordings on the asciinema.org website, when you have one open, you can append a parameter to the URL, like question mark speed equals two, and then you will have it doubled. So that's one thing, good old speed control. On that note, one thing that's missing from the player's UI is the ability to change the speed on the fly, just by clicking the UI. That's on my to-do list, which is always growing. But that's probably something that would be neat there.
Another control for making videos more pleasant is something called idle time limiting, which is a capability in the player for removing all the pauses, the idle moments where nothing happens. So let's say you want to keep the maximum inactivity to two seconds. Then you can use this idle time limit parameter for the player, which you can also set on the recording, in the recording settings page on the website. It essentially removes all the boring moments where nothing happens, compressing the delays between animation frames. So if you record something and you go make a coffee and come back an hour later, you can still continue and you don't need to re-record, because after the recording you can apply this idle time limit option, set it to, say, two seconds, and then it will be smooth and pleasant to watch, because the longest pause you will see there will be two seconds.
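As a rough sketch of the self-hosted setup he's describing, with option names as I understand them from the asciinema-player docs (verify against the current README):

    <div id="demo"></div>
    <script>
      // assumes the asciinema-player JS and CSS bundles are already loaded on the page
      AsciinemaPlayer.create('/casts/demo.cast', document.getElementById('demo'), {
        speed: 2,          // play back at double speed
        idleTimeLimit: 2   // cap any pause at 2 seconds
      });
    </script>

On asciinema.org, the ?speed=2 URL parameter he mentions gives the same speed-up without self-hosting.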
As you were describing that, I was in my settings fiddling with it. And, as you said, the unpleasantness of consuming the playback went away. I set it to 1.5, so basically it was, you know, very similar to how you would normally delay, versus my true delay, which was about 10 seconds.
I was just fiddling with it.
So it sort of took all that away to make the viewing experience more pleasant
because it pretty much went through the command quickly.
It went to the next step, which was actually opening Vim and actually showing off the configuration
and then exiting Vim because I know how to do that and all that good stuff.
So it was kind of cool. I like that.
Was that a humble brag? I know how to do that and all that good stuff. So it was kind of cool. I like that. Was that a humble brag?
I know how to do that.
Well, yeah.
Just in case you didn't know.
I'm not stuck in Vim.
I'm out.
I'm out, baby.
Yeah.
And this feature is possible thanks to not having this in a video format, like pixels, or grabbing the screen, but just capturing events, all the events that the terminal, you know, that applications produce, and then being able to manipulate them before presenting them.
The cool thing too is that you can change... I was in the settings again. I changed the terminal theme to Dracula. Shout out to the Dracula producers.
Zeno.
Zeno Rocha. Amazing. I love Dracula.
But you're able to change the terminal theme and I'm a
JetBrains Mono fan
myself. So I chose
the nerd font. It's my favorite.
I like Firikoto. I'm not a hater.
My preference is JetBrains Mono. I'm a longtime Ubuntu Mono user, but no love lost. All good fonts.
Yeah, these are all good, man. All good fonts.
But you can change how it looks afterwards.
I think that's just,
it's literally like you have a replayable terminal session,
themable, totally changeable.
That's so amazing that that's even a possibility.
Yeah, it's like Apple doing all these magic things on new iPhones with video where
you can do some changes after you recorded them and they can look different, like to
play with depth of field and stuff like that.
So this is a different thing, but you can do fun stuff with it.
What's up, friends?
Today we have an awesome sponsor, .tech domains, and they're giving this segment away to .tech founders to showcase the amazing things that are being built on a .tech domain through their startups.tech program.
.tech domains are the go-to namespace to build anything in tech, and home to the world's most innovative startups. For example, a self-driving AI company that's raised $3.7 billion and is building on Aurora.tech. There are thousands of companies like this who are taking advantage of .tech domains to reinforce their brand as tech-focused and forward-thinking.
But here's the cool thing. Instead of just selling domains, .tech domains wants to give their users a platform to show the world the amazing things their .tech startups are building. So if you're building on a .tech domain, you can simply apply to this startups.tech program by going to startups.tech slash changelog and filling out the form. That way, .tech startups get to be in front of thousands of people, like on this show, and we get to learn about cool things they're building on .tech. Again, go to startups.tech slash changelog. Once again, startups.tech slash changelog.
I guess my question is, with that capability, what is the limit?
What is the possibility?
Like, I know this is an open source project.
I don't know how you maintain it in terms of like your financial stability with it. You know, the support on it, you have a sponsor or two.
And you have backers, I think, through GitHub sponsors, probably things like that.
How do you vision the possibility where this can go?
This seems very capable of being more than what it is.
Not that it's not enough, but it's got a lot of possibility,
a lot of potential.
Yeah.
So as I mentioned, I have a huge backlog of things to work on
and both missing features requested by users
as well as some crazy ideas I dream up once in a while.
Maybe I'll touch a bit on how the development progresses
and how I do it in terms of time and resources.
Sure.
So pretty much since the beginning, it was a hobby project.
And it's still kind of is it now it's more than that.
But it's been a side project that I've been working on in my spare time exclusively, while having a regular developer job.
And there were phases where I had a little bit more time to spend on this project.
And there were like literally years that I had very little time for maintenance, not even for thinking of new features.
So it's not a business.
It's not a product.
It's always been an open source project. And I've been over the idea that maybe turning it into business would be something cool.
Like many times I've been over this.
But I haven't decided to turn it into like a, let's say, SaaS product, like software as a service, where I just like offer like premium accounts and try to monetize users.
Maybe I, well, first I just,
it's just too much fun to ruin it
with like money chase for me.
It's a great, you know,
place where I can learn new technology,
play with some cool stuff.
And there's like no stakeholders.
Nobody can tell me, no, you cannot rewrite this in Rust, or something like that, right? So it's been bringing me a lot of fun over these 12 years. And another side of that is it's kind of a niche audience. So maybe I'm not good, you know, at this business thing, because I don't really believe you could make decent money on such a niche thing. So yeah, I settled on this idea that I want to work on it and be supported by the community, but still keep it as a pure free and open source project, and keep the asciinema.org site free to use as well, like it's been for all this time.
So recently I've set up a GitHub sponsors program to support that and I offer consulting
services around the project. There are companies who use
some parts of the asciinema stack in their systems, in their products, who need some customizations or some missing features, and they are willing to pay for that. So that's how I'm trying to make it sustainable.
How is GitHub Sponsors working out for you?
Well, I set it up just a few weeks ago, probably literally years too late. I should have done it a long time ago, but some years ago I was really busy with my regular job and was invested in that, so I wasn't really thinking about going full-time on asciinema. But now, yeah, it's been a few weeks. I have three monthly sponsors right now, and it's a start. I think it's a pretty good start, because one of those is a company which donates a significant amount of money. They just started donating like two days ago, but it's a big thing for me. It's not there yet where I can say it's sustainable for me, like covering all the living costs and operating costs of the service.
But it's a good start, I think.
So I hope there will be more people who think that this is cool project
and can just chime in a little bit.
Absolutely.
The one thing that I feel like is a big thing, but also kind of changes the project perhaps, is... I'm sitting here watching Star Wars Episode IV, which is a really cool one, by the way. It's like 17 minutes. And I hate that it's Episode IV. It's like, come on guys, it's the first one. But okay, let's not get off on Star Wars. A New Hope, as they renamed it, too. And I'm thinking, I would love to have, you know, the audio.
And of course, when you're doing tutorials and stuff,
the one thing that's missing is audio, right?
But at the same time, I understand the purity of text only
and the fact that you can safely hit that play button
and not worry about it blasting.
Or maybe it does blast and I just have it muted.
Is there audio in this thing?
I haven't found any.
No, there isn't.
Okay, there isn't. It's hard; I second-guess myself. I'm like, oh, maybe it's there and I just don't see it, because everything I'm seeing is just, you know, text. And it's awesome. But now...
So the asciicast format is just for textual data, for what gets printed, and optionally for inputs, like keystrokes. I don't think it would be a good fit for trying to cram some audio inside of that.
Well, obviously we could invent a new format, or maybe a sidecar or something, like you send two files, kind of a thing.
Yeah, yeah. So actually, there are people who use the standalone player on their websites and they do use it with audio. They record audio separately, and the player has an API which you can use to control it, and it also emits several events, like playing, paused, stopped. Kind of similar to how an HTML5 video element on the page can be controlled and inspected. So you can do it with the player as well. I know about people doing it, just creating an audio element next to the player and then coordinating those. It would be nice to have built-in support for an optional audio file when you initialize the player, so the player would actually handle this. Because right now, yeah, you need to write a little bit of glue code to achieve that, but it's possible.
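A sketch of that glue code, assuming the object returned by AsciinemaPlayer.create exposes the play/pause events Marcin describes (the exact method and event names should be checked against the player docs):

    <audio id="narration" src="/narration.mp3" preload="auto"></audio>
    <div id="demo"></div>
    <script>
      var audio = document.getElementById('narration');
      var player = AsciinemaPlayer.create('/casts/demo.cast', document.getElementById('demo'));
      // keep the separately recorded audio roughly in step with the cast playback
      player.addEventListener('play', function () { audio.play(); });
      player.addEventListener('pause', function () { audio.pause(); });
      player.addEventListener('ended', function () { audio.pause(); audio.currentTime = 0; });
    </script>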
Right.
And you also couldn't host it on asciinema.org. Like, you'd have your own player and all that.
Yeah, exactly. Which is fine. And because the recordings are small, it doesn't cost much to actually host them out of my pocket, because right now there are half a million recordings on the site. I store them all in S3 buckets, and I pay $3 a month for that.
So that's really cheap. That's cheap for a hosting site with half a million recordings.
Right.
Now, if you're on R2,
I'm not a Cloudflare representative,
but we did switch ourselves to R2
because that zero egress cost for us makes a
large difference when we're sending MP3s around the world, even with a CDN in front of our bucket.
So if you did something like that, where at least your outbound audio playback would cost zero, my guess is that would scale rather nicely, depending, of course, on how popular that feature is and how long the audio recordings go. And I fully admit this would be a large shift.
Oh yeah.
Yeah.
Addition to the product, but, uh, would be an interesting dimension
that you're currently not supporting.
Probably, on average, if you have a recording right now which is like 10 kilobytes, adding audio to it would be an extra megabyte or 10 megabytes, so like a hundred to a thousand times more than it is right now. It would kill the lean aspect of all of it. So maybe that's also why I'm kind of hesitating. But you know, you can do it. If you can use the player, you can do it.
You need audio, Jerod. That's what you're saying. You need something audio-wise in there.
Well, I just feel like, because you're watching the Star Wars, you know, Episode IV version, which just begs for audio...
Well, it definitely begs for audio, but that was mostly just me. That's a really cool one, by the way. We should link it in the show notes.
Yeah. This Star Wars recording, it was created in the late '90s, actually.
Is that the one where you can telnet to like blinking lights or something and watch it?
Yes, exactly.
So telnet.
It's the same one.
Okay.
Yeah.
Yeah.
So, telnet blinkenlights. I think it's defunct.
Oh, is it defunct?
Since last year, I think.
This is a very old, cool hack that you could do, and you could tell your friends. If you telnet to, I think it was blinkenlights.nl, I believe, maybe that's wrong, then you would connect, and they had a Star Wars server, and it would actually just play back Star Wars, A New Hope, in your terminal.
And that's what we're watching here.
Is that right?
Yeah.
Yeah.
So on that note, I wrote a blog post on the asciinema blog called Blast from the Past, which is kind of technical, but it's also fun, because the asciinema player has had the ability, since a few versions back, to let you write your own parser for a recording file. So in addition to being able to play the asciicast format, it can now also play the old-school typescript format from the '70s, which started it all for me. And also, there was a tool, and it's maybe still used by some people, called ttyrec, a similar tool for recording the terminal, and it produces terminal sessions in its own ttyrec file format. So the asciinema player can now play back those formats as well, because in essence they all do the same thing, those recorders: they capture the data stream. So in that blog post, Blast from the Past, I show how you can write a custom parser for Simon Jansen's Star Wars Asciimation, as he calls it.
The Asciimation.
He created this as a TXT file with distinct frames in there. So the asciinema player is also able to play stuff like that.
The reason why I bring up audio, and to answer Adam's question directly, is because when I look at what people use for their demos, it's either asciinema, or it's a YouTube video embedded, and YouTube videos have audio. So maybe if you want to put a cool soundtrack on your 45-second deal, or if you want to narrate and walk through what it is that you're showing off, it's just very handy to have an audio track. And so I think that's compelling. And I think there are a lot of people who use YouTube because they can do both. And I think that if asciinema did both, even if it was just default off, but you can get it done, I think that might be compelling for a lot of people.
That's why I bring it up.
Not that I need it.
I just think it's a nice to have.
That is a valid use case, yeah, certainly. I noticed people do work around the lack of audio by writing comments inside the terminal...
Oh yeah, totally.
...and clearing them, which is a workaround. So yeah, having audio there would definitely be helpful in those cases. Another option, and this is a feature that's coming, is to have the ability to have some form of subtitles. So you could prepare a script of, you know, what you want to show at certain time points, and it would show like a subtitle.
Yeah. That would definitely be a nice addition that's more in line with the spirit of asciinema, which is text-based, roots in ASCII, at least metaphorically. We know it's UTF-8.
Yeah.
So more than just ASCII, but roots in that world, and text, and a small amount of kilobytes, and trying to really streamline and make it all those things.
Same thing with captions is you can scale them up or down
as you redraw, so you have that resolution independence
with captions as well.
That probably gets you like 90% of the way there,
I think, if it was easy to use.
What about things like embedding and whatnot?
How do people use this as a true tool to, I mean, I'm just on the explore tab
and it begs for categorization, tagging of sorts.
Like, can I just hang out in Elixir for a bit and just see people do things in Elixir?
I don't know, write some Elixir or something like that.
Or, you know, things where you're setting up particular things with, you know,
standing up an Ubuntu server or something like that,
like configuring Kubernetes or whatever it might be.
Like things that are generally challenging to showcase, but, you know, asciinema offers all the flexibility in everything it does.
What about things like that?
If I did that, then being able to embed it in my blog,
how does a user use this beyond the website or beyond hosting their own player?
Yeah, sure.
So there are a few options there.
One option is, if you record and upload to asciinema.org, there are two links on the recording page. One is download and the other is share. And if you click on the share one, you have a few options for how you can share it with people, including embedding it. So you can just place a JS snippet inside your HTML which creates an embedded player inside an iframe on your site. It's still hosted on asciinema.org, but inserted from there. You can have it just by pasting a JS snippet. That's one option.
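The share-page snippet is roughly this shape (the ID shown is the Star Wars recording discussed later in the episode; the exact markup is whatever the Share dialog gives you):

    <script async id="asciicast-569727" src="https://asciinema.org/a/569727.js"></script>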
And another option is you can self-host the player. So you can use the player on your own site, and if you have a recording on your computer, you can just use it there. But you can also download the asciicast file from asciinema.org, include it in the assets of your own website, and use the player there. So that's how you can, yeah.
You can also have a link to the recording which is an image preview. In fact, it's SVG, because SVG is sharp and can produce a nice-looking, scalable preview. So you can use an SVG thumbnail thing, which shows one of the frames from the recording, and it links to the site.
Also.
And another thing also, yeah.
I think I know what you're going to say.
Yeah.
And also, you can convert asciicast files to GIF files.
That's exactly what I was going to say.
Yes.
And I pronounce it GIF.
Correctly.
Yeah.
So, yeah, there's a recent project, a sub-project I created, called agg,
asciinema gif generator.
You sure it's not Generator?
It's a GIF generator?
Yeah.
You got it for a second.
He had to stop and think about that one.
He did.
He's like, oh.
What happened?
Should I agree or not agree?
Is he right?
agg is like a second generation of the GIF generator I created. The first one used ImageMagick and some other pieces of the Unix pipeline to generate a series of images and combine them with ImageMagick and whatnot. But it was really resource intensive, like slow, used huge amounts of memory, and for many people it was impossible to convert medium-sized recordings to GIF files with that. So it was kind of a prototype solution that people started using, and it couldn't be saved. So it needed a rewrite. And since I had this virtual terminal emulator written in Rust, I thought, I'll rewrite this in Rust.
So agg is super, super fast, uses a very minimal amount of memory, and can convert any asciicast file to a GIF file in a matter of seconds.
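Basic usage is a single command; the extra flags shown are from the agg README as I recall them, so confirm with agg --help:

    agg demo.cast demo.gif
    agg --font-size 20 --theme dracula demo.cast demo.gif   # bigger text, different theme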
This is exciting for me
because as a person who routinely helps other people with getting eyeballs on their projects,
I'm often looking for the GIF or the image to help share what they're up to.
And oftentimes I'll come across a readme, they'll have an asciinema,
and I'll think, ooh, this is actually harder for me to share than if this was just a GIF embedded in here.
And I did not know about this until today. Is this a brew install away? How do I get this thing installed?
Cargo install. Well, you can... I think it may be in brew... Docker...
Oh, you can try brew. Worth a shot: brew install agg, see what happens.
Yeah. Well, I've had this funny relationship with GIF generation, actually, since GIF is a very inefficient format for what is kind of video. It's like the worst. But at the same time, it's so easy to embed everywhere, so people just use it. So initially I thought it's such a bad idea. It stands in opposition to what the project was meant to be, like lean, tiny files with sharp rendering. And people started asking for it: how am I going to convert them to GIFs, how can I convert it to MP4, and stuff like that.
After many, many people asked about it and tried various approaches, I thought, okay, I can spend some time on that. So even though I always prefer, you know, the real thing, seeing it in a player where you can rewind, pause, et cetera, I see how it's useful. And I still get asked, how can I convert it to MP4? People really want to create videos, which, you know, I can also understand. Because if we leave that tiny file size aside, another nice thing that the asciinema recorder has is the ease of recording. You're in your terminal and you think, oh, okay, I'm going to record that. You just type asciinema rec, and then it starts recording. You don't need to go anywhere else, you don't need to click outside of a terminal window. And you can also automate this. You can, for example, put a snippet in your shell configuration file which will start recording in every shell that you start. So when you open new tabs in your terminal emulator, every tab would be recorded automatically to a file somewhere, right? You can do stuff like that.
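A sketch of that shell-config trick, assuming the ASCIINEMA_REC variable the recorder sets inside recorded sessions (that's the guard against recording recursively); adjust paths to taste:

    # ~/.bashrc or ~/.zshrc
    if [ -z "$ASCIINEMA_REC" ]; then
      mkdir -p "$HOME/recordings"
      exec asciinema rec "$HOME/recordings/$(date +%Y%m%d-%H%M%S).cast"
    fi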
So the ease of recording, I think, is what people like, even though some of them want to convert it to a video and probably some of them even uploaded this stuff to YouTube.
That would be funny, but probably it happened.
As I look at it, I'm just thinking, why is this agg not just built right into asciinema when you install it? Because it's like...
agg as a service, man.
Oh, even into the command line tool. Right into it. You can either export the ones you store locally, or it's built into the website, which uses agg to export, essentially. And you can choose, you know, it's the exporter. It's really just a feature of the core thing, not a whole new project, really. The next best thing to seeing it on your site or self-hosting it would be to export it and use it in a different way. It's now stuck that way; you have to re-export it if you wanted a different font or theme, but you have your pristine source that you can just re-export into new versions of GIFs, MP4s, whatever.
Yeah, yeah.
Before I forget, agg as a service, that's actually a thing. I haven't worked on that, but there is one person, Mari, I don't remember where she works, but they've created a web service which generates social media preview cards for various things, many things that don't have them. Like, when you put a link to some resource on the internet, on Twitter or Mastodon, etc., some of the sites don't include the necessary metadata, like the Open Graph tags. So they created a proxy web service thing that generates some previews on the fly, and they actually use agg as a library, because you can use agg as a library in your own Rust code. And they generate GIFs on the fly there for those previews.
So yeah, it's a thing.
That's cool.
But yeah.
But back to why agg is not part of the initial package. Well, I guess it's a historical thing. First, the recorder, the thing you actually run in your terminal, is written in Python. And in order to generate a GIF, you need to actually visualize the whole thing after capturing the bytes. In order to do that, you need to have some form of terminal emulation, like an embedded one inside your software. So I have this one, asciinema's virtual terminal, which is a separate Rust library called avt. And this one is embedded in agg. So agg takes the asciicast file, feeds all the captured events into this terminal emulator, and at each step it generates a picture of it and then feeds it to the GIF generator library.
It's kind of resource intensive, and with Rust it works smooth and fine. But the recorder is in Python, so I would need to... well, there are ways to embed Rust code as Python native extensions inside Python projects. There's one project called PyO3 which allows you to call Rust code from your Python code, so that's one way. But it would probably complicate packaging a lot for many people. I'm not packaging the asciinema recorder for dozens of distros and various other operating systems myself, and the packagers would just burn me at the stake if I made it so you need a Python environment and the whole Rust toolchain and maybe something else to package it up. So I would be more willing to rewrite the recorder in Rust than to try to combine those things in one package. It'd be less work to rewrite than it would be to actually bundle and package those.
What's up, friends?
There's so much going on in the data and machine learning space.
It's just hard to keep up.
Did you know that graph technology lets you connect the dots across your data and ground your LLM in actual knowledge?
To learn about this new approach, don't miss Nodes on October 26th.
At this free online conference, developers and data scientists from around the world will share how they use graph technology for everything from building intelligent apps and APIs to enhancing machine
learning and improving data visualizations.
There are 90 inspiring talks over 24 hours.
So no matter where you're at in the world, you can attend live sessions. To register for this free conference, visit neo4j.com slash nodes.
That's N-E-O, the number 4, J, dot com slash nodes.
My install process: I installed it on Ubuntu versus macOS, because I just happen to have Proxmox with a VM just chilling there to tinker with. So I just SSH'd into it really quickly. And the process to install is via a PPA repository, which I believe stands for Personal Package Archive. And so this is a special way you can... I'm not even that familiar with this, really, but the process to install is add-apt-repository, ppa colon, and then essentially the namespace and then the name of the thing you're installing. And I think if there are updates to it, I would get them when I do apt-get update, so if there are new asciinema updates, I just sort of get them when I do typical Linux maintenance, which is cool.
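The commands behind that, with the PPA name left as a placeholder since it isn't spelled out here (check the asciinema install docs for the actual one):

    sudo add-apt-repository ppa:<namespace>/<name>
    sudo apt-get update
    sudo apt-get install asciinema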
But then I'm like, well, I'm happy to just install a separate project the same way, but I didn't even know it existed until I had to go hunt it down. And it's a couple of years old. So my recommendation isn't necessarily to combine them, but just don't obscure them, you know, where I've got to go and discover it somewhere out in the wider world of asciinema, because it's an obvious feature for the usage of it.
It is, yeah. Maybe... I think agg is not mentioned in the main asciinema README, even. Maybe I need to check that. But yeah, the discoverability... that's nice feedback, thanks.
So, brew install agg. I did brew install agg, and I got 1.4.3, which I believe is the latest version.
Okay.
And I downloaded the Star Wars cast, this is, uh, asciinema 569727.cast. That's a 2.3 megabyte file.
That's a long one. What did we say, it's like 17 minutes, something like this?
Yeah, this one is on the longer side.
And I ran agg on it to generate the GIF. It generated in six seconds, and the star wars.gif is six megabytes, so three times the size. And that's at 582 by 274 pixels resolution, so super small by default. I'm sure you can set that somehow with agg.
Yes, you can. You can change the font size.
So yeah, you can change the font size, but just the default, here it is.
And the GIF file, which was created pretty quick, six seconds on a modern Mac, is three times the size of the cast. And it's this tiny little thing, so if I blew this thing up big, to the way I want it, it's going to be massive.
Yeah. And you could convert this to an MP4 now, which will be smaller, but it will be...
Illegible, I guess.
Yeah. So, you know, it is what it is.
Yeah, that's an extreme example, though. Like, who's going to do that, necessarily? Most things might be, you know, a terminal session where you're demonstrating something. Who wants to watch Star Wars in their terminal? Who would watch a 17-minute-long GIF?
Yeah, I can't imagine.
One which you can't really pause.
Yeah, exactly.
And if you miss something, you can't rewind.
Oh, man, no.
Oh, yeah.
But note on that thing where you said, Adam, that you upload it and then you need to export it to create a GIF from it.
In fact, you can actually record it locally to a file
without uploading to the site.
So you can just do asciinema rec demo.cast, and then you can convert that demo.cast to demo.gif locally without...
The cast file, that's like your source file there, right? That's your source of truth.
Yes.
When you're done recording, you get the option to upload to asciinema.org, or Ctrl-C to save locally. And that actually still saves to my temp directory, which is interesting, at least on Ubuntu. I don't know if this is like a setting or a config you can do for, you know, somewhere else.
Yeah, you can configure that. But it always records to disk, so you don't lose your recording in case of, let's say, you end the session and you hit enter to upload it, and then there's a 504 or something, or you don't have internet.
So yeah, it always records in real time to disk.
And then at the end, you can either upload it by hitting enter,
and then the temp file gets deleted,
or you can just control C out of it,
and the file stays there.
You can recover it from TMP if you want,
or just leave it there and it gets cleaned up at some point.
Yeah, I'm actually seeing the details here now for a configuration file too. I guess this might be a Mac-only thing... I guess you always have a home directory and a .config folder. asciinema slash config is where you can set API things, record things, standard in, environment variables and stuff like that. A lot of configuration here.
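For reference, that config file is INI-style; the options below are a sketch based on the documented settings, so verify the names against the asciinema docs:

    # ~/.config/asciinema/config
    [api]
    url = https://asciinema.org

    [record]
    stdin = yes            # capture keystrokes too
    idle_time_limit = 2    # bake in the idle-time cap at record time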
I see a lot of possibility with this. I mean, I want to be encouraging, because we're grilling you on how it works, less to be like, here's all the warts, so that you leave this podcast and be like, man, these guys gave me a to-do list.
He'll work on his audio support when he leaves here.
Yeah, yeah. It's less like that. I believe there's a lot of possibility. I don't see the exact path to get there, but you've built such an amazing tool. It is super cool that you get this replayable, re-exportable, you know, non-static, dynamic option to record a session, however you want to do it, whether it's an emulation of Star Wars the movie as ASCII, or a demo. I think there are a lot of cool things there. And there are so many documentation websites that just lack that. I guess a GIF is static in that case, and there are a lot of documentation sites now that use dynamic code examples and stuff like that. Maybe that's not the solution there, but there are a lot of options. I think this does make sense to use, because it keeps the exact contents that you typed in. It's not a movie, it's not static and immutable, you can change it. So I love that about it. A lot of possibility, though.
Yeah, I've been hearing from some people that they actually prefer, when there's an installation tutorial, just a list of commands, so they can copy-paste them and just follow them by reading. So I have this feature in mind, actually, where I'm thinking about implementing a transcript view in the player, where you could switch from the regular player view, where you can seek and pause and play, to a text dump of it which is scrollable. Kind of like how you would see your own terminal at the very end, and you are able to scroll back and see what's there. This could be an alternative view which you could toggle to. It could be the best of both worlds, because you can record it, and then if someone prefers to just scroll through it, I mean, scroll vertically, and just see the output and copy-paste, they could do it. Or people who want to see this in a more lively fashion, like animated, they can keep using the default, the current view.
Well, I have a feature request,
but I thought maybe we would save it for the post show.
It might be a little bit too nerdy to...
Go for it.
Go for it now?
Give me, yeah.
What do you think, Adam?
Should I do it now?
Should I do it for maybe a plus plus thing?
Just go for it, I guess.
Just go for it.
This is an agg feature request, and one that, if it's feasible and you would help me, I'd be willing to open a PR on agg, or you could just code it up yourself. You let me know, but I will be willing to help on this. You type agg, space, input file name, output file name. That's the standard deal. As an alternative to an input file name, it should also accept an asciinema.org URL, and if it detects an asciinema.org URL, it should resolve where that cast file is, download it, and create the GIF, all in one step.
It's there. It works.
Oh, really?
Yeah, you can just... yeah, you can. I think it should be in the readme as well. I don't...
I didn't see that either.
I was curious about that.
It's not in the help.
Like when I did... agg.
Hold on, let me try this.
And then paste the URL and then the name of the output GIF file and you will get it.
Okay.
Yeah, try it.
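So, assuming the URL support works as Marcin describes, the one-step version is presumably just:

    agg https://asciinema.org/a/569727 starwars.gif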
Thanks a lot, Marcin.
I like the way you think.
It's all there.
It's all there.
Try it.
No pull request required.
Copy, paste.
Yeah, I mean, that's the best kind of code there is.
The code that's already been written by somebody else.
AGG, boom, test.gif.
I think it worked.
It doesn't say much.
Test.gif.
And it's a duck pond, which is what I downloaded, duck pond.
Cool, man.
Good thinking.
That's a good feature. I would definitely recommend
giving installation instructions.
Okay, yeah. A smaller request.
And I would definitely help with this.
We could put that into the help
documentation so that you know
that it works. Oh, yeah.
Good idea. Like, if I just type agg,
it doesn't tell me that.
Possibly not.
Oh, like the message that gets generated from just typing the prompt itself,
the command itself?
Yeah, like you just type agg with nothing, and it says:
the following required arguments were not provided:
input file name, output file name. Usage, options,
input file name, output file name.
Right there it could say: input file name or asciinema URL.
Yeah.
You know what I'm saying?
Yeah, yeah, yeah.
And agg --help might say it.
This help message is generated by the library for command-line parsing,
the clap library, but I'm sure there's a way to just modify it in some way.
Cool.
I can code that up for you, Marcin.
I can fix that help message for you.
Oh, awesome. Awesome.
If you'll accept me.
Oh, yes, of course.
Yeah.
Contributor?
Yeah.
Oh, yeah.
It is October too, Jared.
So, you know, you can't get a t-shirt this year,
but you can get your...
No more t-shirts for Hacktoberfest.
I want to see if the, I suppose,
input slash output to open source this year
changes because of that.
Because of the incentive. My guess is it's going to be way down, because everybody who's found out there's
no free t-shirt, they're all just like, yeah, sorry, digital rewards just don't get us.
Yeah, I think that, definitely, the incentivization needs to be better. But to their credit,
I mean they were printing 50,000 t-shirts
and sent them around the world.
Can you imagine the operation?
I mean, I wouldn't want to do that.
Yeah, and the gaming, I think it really...
But then maybe you just think to yourself,
maybe we just shouldn't do Hacktoberfest, I guess.
Well, either way, I'm sure you'll still be motivated, right?
Because you love open source.
I don't know. Off topic, off topic.
It is off topic. We're bike-shedding a little bit.
I guess I want to come back to maybe less to say what you should do,
but more like what do you want to do with this?
Given the possibility and the potential here,
if you could just wave a magic wand, what would it be?
Would it be you don't work anywhere else and do different things
and you find a way to convert this into a lifestyle business?
Maybe you have one or two employees.
What do you want from this?
I would like to be able to just work on it. I'm not really imagining myself turning it into a business, but
just really keeping this as free software, and something that people
really enjoy using and admire, because I'm constantly hearing how people
are amazed by this, like specifically how it works, and this keeps me going.
I really enjoy it. I've been programming my whole life, and I've been
doing it just for fun, and I want to keep it that way. So if I could wave a magic wand,
I would love to have enough sponsors, patrons
that would allow me to focus on this
for as long as possible.
So that's my kind of dream.
That would be great.
I've put a lot of work into this project over the years
and I can't really stop.
It's so fun, you know.
And then I have some upcoming
new cool things that,
well, I can share a little bit.
Tease them if you can.
Can you tease something?
Right.
Yeah.
So one thing I've been working on
is live streaming,
like, you know, Twitch for terminal nerds.
So just being able to stream your terminal in real time,
share the link with people and they watch it,
and then you just do some coding or some stuff like that.
So I have it implemented in the server;
the server side is Elixir,
Phoenix. And I have the WebSocket driver inside the player, so basically, instead of playing from
the file, it can play from a live source. And I have a streaming component that you locally run
together with the recorder, which forwards the whole
stream to the server, and the server distributes it to the players of the viewers. So I have this
almost ready. I've been just busy with other things, but I hope to release it
quite soon. And I'm still kind of on the fence,
because the streaming component is also written in Rust.
And it makes even more sense for this one
to be part of the initial package of the recorder.
But I have the same problem with joining,
like merging this Rust piece
with the existing Python code base.
So right now you can combine the asciinema recorder with the streamer,
just piping one to another, and it will just forward this stuff to the viewers.
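To make the shape of that pipeline a little more concrete, here is a minimal, purely illustrative Python sketch of the idea: run a shell behind a pseudo-terminal and forward whatever it prints to a server over a WebSocket. This is not asciinema's actual streamer (which, as mentioned, is written in Rust); the server URL and the use of the third-party websockets library are assumptions made only for this example.

    import asyncio
    import os
    import pty
    import subprocess

    import websockets  # third-party (pip install websockets); an assumption, not what asciinema uses


    async def stream_shell(server_url: str) -> None:
        # Run a shell behind a pseudo-terminal so everything it prints passes through us.
        master_fd, slave_fd = pty.openpty()
        proc = subprocess.Popen([os.environ.get("SHELL", "/bin/sh")],
                                stdin=slave_fd, stdout=slave_fd, stderr=slave_fd)
        os.close(slave_fd)

        loop = asyncio.get_running_loop()
        async with websockets.connect(server_url) as ws:
            while True:
                try:
                    # os.read blocks, so run it in a worker thread to keep the event loop free.
                    data = await loop.run_in_executor(None, os.read, master_fd, 4096)
                except OSError:
                    break  # the child exited and the pty was closed
                if not data:
                    break
                await ws.send(data)  # the server would fan these bytes out to connected players
        proc.wait()


    # Hypothetical endpoint, purely for illustration.
    asyncio.run(stream_shell("wss://example.invalid/stream"))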
But I'm thinking about... yeah, it all points to a Rust rewrite of the recorder, because then I could have the existing code
of this streamer thing inside the recorder,
and I could also merge agg into that.
So it could be like one single executable for all of it.
But that's, yeah, that would be some work.
The recorder code base is not huge.
It's a relatively small Python code base.
Actually, I rewrote it in Go seven years ago,
and then after a year or so,
it turned out to be not the best choice for the project,
so I went back to Python.
So, but now...
Why is that?
Why wasn't it a good choice?
There was... I wrote a blog post about that
on the asciinema blog,
but it's been like...
So one thing with this Python implementation
is that it only uses things
from the Python standard library,
because this pseudo-terminal pty module is there,
it's stable, it works on all platforms,
like even on Android or some obscure systems.
So packaging was kind of solved,
because you just package, like, a .py file
and distribute it to systems
which have this standard library, and it's covered.
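As a rough illustration of what the standard library gives you here, a stripped-down capture loop can be built on pty.spawn alone. This is just a sketch of the general technique, not the actual asciinema recorder code:

    import os
    import pty
    import sys
    import time

    # Everything here is from the Python standard library: pty.spawn runs a child
    # process behind a pseudo-terminal and calls master_read for every chunk of
    # output, which is exactly the hook a terminal recorder needs.
    events = []            # (seconds since start, raw bytes) pairs
    start = time.time()


    def record_output(fd):
        data = os.read(fd, 1024)
        events.append((time.time() - start, data))
        return data        # returning the data keeps it flowing to the real terminal


    pty.spawn([os.environ.get("SHELL", "/bin/sh")], master_read=record_output)
    print(f"captured {len(events)} output chunks", file=sys.stderr)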
So there was a problem with the Go implementation
that some low-level systems stuff needed to be implemented
or tweaked for every platform.
And at the time, Go packaging was kind of in a weird state,
in my opinion.
I know that it's better now,
but it was like different options
of like how you vendor libraries or package things.
And then the maintainers of asciinema
for different distributions,
they had some troubles with it as well.
And then in the end, I kind of didn't like Go.
Well, that's a valid reason.
I mean, if you don't like it, you don't like it.
Yeah, for me, that was one of the reasons.
You can read the blog post.
There's many more there.
If it had been just one of them,
I would probably have stuck with it,
but there was a whole bunch of reasons.
But now I have a bunch of reasons to do a Rust rewrite.
So maybe I'll find time for that
and then all will fall into place
and maybe unifying this stuff would open more possibilities.
Let's try and wave that magic wand.
What would it take?
I know that you said what you would do if you could
and you mentioned GitHub sponsors and you mentioned patrons.
Do you have a Patreon too?
Or do you just focus on GitHub sponsors?
Right now, just GitHub sponsors.
Okay.
Yeah.
And I noticed that you've got a five monthly supporters goal.
It's 60% there.
So you've got, I don't know how many.
One, two, four.
So you have four sponsors right now,
which is pretty awesome for just being a few weeks into being sponsorable.
How many does it take? Like, what does it take for you to step away from your existing gig? Not so much the details, but what does it take, how many sponsors does it take? How
can we, you know, ask the community that's listening to this podcast to shower you with
sponsors, with whatever it takes to get you to one of the next levels, one of the next milestones?
Definitely, it would help if there were some corporate sponsors.
On my GitHub Sponsors profile page,
I've set up some monthly tiers
where I offer some perks for sponsors.
So those higher-level tiers would really help,
because that would allow me to focus on that and not
worry about living costs and other things like that. Of course, individual donations are
more than welcome. But yeah, I'm really new to this sponsorship thing. I never did that before.
But yeah, I think monthly sponsors,
not just one-time donations, would change it, right?
Because it compounds.
Very cool.
Well, it's a very cool project.
I have cloned agg,
and I am looking for the correct places to submit my pull request soon.
So stay tuned for that.
There's one more thing I want to implement hopefully soon, which is full-text search on the website.
Because it's all text, so I can index that with Elasticsearch or something like that.
And then search all the recording content.
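As a rough sketch of what that could look like, assuming a v8-style elasticsearch Python client and a hypothetical recordings index (neither of which is anything asciinema.org actually runs today), indexing and querying the text of a cast is only a few lines:

    from elasticsearch import Elasticsearch  # assumes the official elasticsearch-py client, 8.x API

    es = Elasticsearch("http://localhost:9200")


    def index_recording(recording_id: int, title: str, text: str) -> None:
        # An asciicast is a JSON header plus timed text events, so the searchable
        # body of a recording can simply be its concatenated terminal output.
        es.index(index="recordings", id=recording_id,
                 document={"title": title, "text": text})


    def search_recordings(query: str) -> list[str]:
        hits = es.search(index="recordings", query={"match": {"text": query}})
        return [hit["_source"]["title"] for hit in hits["hits"]["hits"]]


    # Hypothetical example data, purely for illustration.
    index_recording(1, "zero to ZFS", "zpool create tank mirror /dev/sda /dev/sdb")
    print(search_recordings("zpool"))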
How many recordings did you say you have up there?
Half a million?
Half a million recordings, yes.
Yeah.
What's the size of your S3 bucket?
That's a good question.
Three bucks a month can't be much.
That is in some tens or hundreds of gigabytes,
but not more.
We have some friends we can introduce you to.
Typesense, similar
to Elastic, maybe
easier to use, and I'm sure willing to
talk to you in some way, shape, or form.
I can't make any promises, but
happy to make that intro. We are big fans of
Typesense. Jason Bosco and team do a great job
of leading that project. It's
open source. They also have a cloud
supported version of it, and they're one of our sponsors.
This isn't a paid plug, but we just like them a lot.
When we get a chance,
we tell people about it. That's what we're doing here now.
Awesome. Thank you. I agree with that.
I think full-text search would be super cool. There's a lot of things
you could do on-site to categorize
even the Explorer experience
to provide discovery.
A lot of the reason people use tools
is distribution of ideas.
Not just a tool for me to communicate one-to-one with the people I already have planned,
but how can you help me meet and reach the masses that I don't even know,
that are interested in what it takes to go from zero to ZFS, for example.
And if I did a terminal session of like zero to ZFS, what is that like?
Or what is it like to configure static IP addresses in a bonded nature for XYZ on Ubuntu?
These are things I've done recently, for example, and I'm like, I want to share that,
but static documentation is kind of boring, you know. Is there another way?
And I've always known about this, but just didn't know how to use the tool, and wasn't really sure
of your plans. But if we can get some sponsors to you, and you're
dedicating more of your time to
this, and this is what you want to do anyways, and it gets better as a tool, maybe it becomes
something that people can use beyond its current usage, which is, like you mentioned, live streaming.
I think that's super cool, the idea of live-streaming the terminal to people.
That's an interesting idea. There's legs to this idea, more so than what's there now.
Yeah, there's, like, tons of ideas in my head, and ideas in GitHub issues from people.
It's just, like, time is needed. Time and focus.
Well, you mentioned an email. Do you mind if I
put your email here on air? You mentioned it on your sponsors page, that's kind of public. Do
you mind if I mention it right here? Is that cool?
Oh, yeah. Yeah, sure.
So, listeners, if you're
nodding your head to the things that
Jared and I are probing
Marcin about and just ideating
with him, and you've got more,
reach out to him: marcin
at asciinema.org.
We'll put that in the show notes. That's also
on the sponsors page. If you've got any questions about that,
or you want to support him, go check that out.
But I've got to imagine there's times, Jared, when people leave
these episodes we do, that are maybe slightly smarter than you and maybe half as smart as me,
and they come back with better ideas, and they email the people, right? And they go,
like the recent thing with Pokey Rule, that was cool, that thing in Slack. Like, that's
cool how that happened from that podcast, that whole entire video, which the listeners aren't aware of, but you are. That's kind of cool how people riff and
munch these ideas. So get in touch with Marcin if you've got some thoughts, and, worst case
scenario, just say hello and say you like the tool. Amen to that.
Marcin, anything else that we've
left unsaid that you wanted to bring up before we call it a show?
I'm not sure. I feel like we've covered a lot.
Oh yeah.
So we have a Matrix room.
Oh yeah.
Where we hang out.
So the community is there, we discuss stuff. Just join in and say hi.
You can find the link to the Matrix room on the website, in the footer.
There you go.
Pop into their channel and say hi if this is something that interests you.
Well, thanks so much for coming on the show
and sharing all about this really cool project.
I learned a lot.
I thought I knew what it was and I didn't even know how cool it was.
I agree.
Hopefully our listener also learned a thing or two. Very cool. And we wish you all the best with
continuing to work on this. Thank you very much. Thanks for having me.
Let's help Marcin make this a reality, to work on asciinema full time. Go to github.com slash sponsors slash Kulik, with the L being a one. So
it's K-U-1-I-K instead of K-U-L-I-K, because his name is Marcin Kulik. He's got a slew of sponsorship
options for everyone to help him back this project. As of recording, he has seven sponsors
who have funded his work on asciinema. And that is so cool. The link to his sponsors page is in
the show notes. And again, we have 11 minutes of bonus content on this show with Marcin for our
plus plus subscribers. If you're not a plus plus subscriber, it is easy to change
that by going to changelog.com slash plus plus. It's better. That's right. It is better.
changelog.com slash plus plus. Drop the ads. Get closer to the metal, that cool Changelog metal.
Directly support us. And of course, bonus content. That's cool.
One more shout-out to our friends and our partners
at Fastly, Fly, and also Typesense. I mentioned them at the end of the show. Love Typesense. Jason is
awesome, his team is awesome. Super fast in-memory search. So cool. typesense.org. We use it and we love
it. And also to our awesome Beats Master in Residence, a.k.a. Breakmaster
Cylinder. Those beats, so good. So, so good. And coming up this Friday on the pod, we are back for our next
edition of Kaizen. Kaizen! That's right, Kaizen is back on Changelog & Friends.
Me, Jared, and Gerhard go deep on what's new around here.
Got to tune in.
That's it.
The show's done.
Thank you again for tuning in.
We'll see you next time.