LINUX Unplugged - Episode 152: To .NET or to .NOT? | LUP 152
Episode Date: July 6, 2016
Noah joins Wes for the second time this week to talk with the mumble room. Package management for Bash takes it one step too far, Nvidia starts putting GPUs in your containers, we learn some surprising things about open source at Comcast & discuss just what "Microsoft ♥ Linux" really means.
Transcript
40, 50 years ago, amateur radio operators and just general electronic enthusiasts, in fact, the Linux users of like the 1920s, 30s, and 40s, were into a hobby called ham radio.
If you haven't heard of it, basically it's you take, you build your own radios, or you buy pre-made radios, and then you play with them, and you learn about them.
And the federal government literally takes millions of dollars of frequency spectrum and gives it to amateur radio operators to play with.
The only real requirement is you have to pass a very basic test proving you're not going to kill yourself or anyone else.
And as we have entered 2012, 13, 14, 15, 16, technology has evolved.
And now instead of buying these physical radios, you still buy a physical radio, but the radio doesn't have an interface.
There's no dial.
There's no ability to interface with the radio.
It's just a box with a USB plug.
And you plug that box into your computer and you can control the radio from software, thus the name Software Defined Radio.
So what Wes is talking about is this Lua Radio.
And I'm over here at luaradio.io.
And it says LuaRadio is a lightweight, embeddable flow graph signal processing framework for software-defined radio. It provides a suite of source, sink, and processing blocks, with a simple API for defining flow graphs, running flow graphs, creating blocks, and creating data types.
So I'm assuming, if I'm reading this right, and if I'm understanding it from the, you know, 45 seconds I've been looking at this, basically you're able to go through and piece together different components of the radio inside of the software.
Am I understanding that right?
Yeah, that's what it looks like.
Have you used GNU radio at all?
I have, yeah.
It's a real challenge if I'm being perfectly honest with you.
Yeah, and so that's what – when I was reading through this, I haven't used software-defined radio very much, but I'm definitely interested in it.
I have a physics background, so any kind of playing with the spectrum is definitely interesting,
even though I haven't done much of it myself.
And so a lot of people are saying this has similar capabilities, but is easier to use, more lightweight,
and if you're familiar with Lua at all, easier to interface with.
Yeah, man. Have you ever used Xmonad?
You know, I have not. I've looked at it, but I have not actually tried it.
So I've been playing with Xmonad. For better or for worse, I was stuck on a very low-powered machine.
And the way to drag the most horsepower out of that machine was to use a tiling desktop manager.
And I chose Xmonad, and I was super happy with it.
In fact, so much so that I'm considering making it my primary desktop interface.
Wow.
Yeah, I know.
I know it sounds crazy, but here's the thing.
I used to think that I used GNOME because I had too much to get done and I didn't have enough time to worry about memorizing weird commands and stuff.
But after using Xmonad for a while, I've honed in on this is a more efficient way to work.
And to that end, you can configure it up the wazoo with Lua.
I believe it's Lua, if I'm not mistaken.
No, buddy, it's Haskell.
It's Haskell.
Xmonad is Haskell.
Okay, which one?
Is it Awesome, then, that's Lua?
Awesome is Lua.
Awesome.
Okay, okay, okay.
Well, so I guess my point doesn't really stand,
except for the fact that what I found was
the things that I was going to tweak
were already tweaked for me in Xmonad.
So the things I wanted to set up
were already set up the way I would have wanted them to.
I thought that was,
apparently I confused the two.
I'd gone through a couple of different ones,
but the point,
all that to say that those kinds of configuration setups, where you just have this big configuration file in something like Lua, are super easy for anyone to use. Even if you don't understand programming, it's a little daunting at first, but once you learn what the little things you need to change are, it's super easy to get set up and use.
So I guess what I'm saying is, I feel like it would be the same thing with software-defined radio, right?
Yeah, definitely. It definitely seems like that.
People are excited that it should be portable.
Maybe they can use it on all kinds of different interfaces and devices.
So it'll be interesting to see where this comes from.
It's exciting to see it launch anyway.
Right, right.
This is Linux Unplugged, episode 152 for July 5th, 2016.
Welcome to Linux Unplugged, your weekly Linux talk show that's taken your beloved host,
shoved him into the back of an RV, and sent him trailing across the country to enjoy himself. Stepping in for the one, the only, the great, the Linux podcaster, hair master, Chris Fisher, it's me, Noah Chalaya, and...
Hey guys, it's Wes.
It's Mr. Wes. How are you doing today, Wes?
Oh, I'm doing wonderful, and it's great to be with you, Noah. How about yourself?
You and I totally rocked Linux Action Show on Sunday. Now we're going to rock Linux Unplugged today. That's right.
And we have an exciting, exciting show. I told you on Sunday
the show was going to be so big that we couldn't put a word to it. Do you have a word to describe
this show that we're about to do right now?
Maybe two words. Exciting and confusing.
Really?
Well, that got me intrigued.
I think we'll find out.
All right.
Well, let's head on down into our mumble room and welcome our virtual LUG. This is a group that meets over at mumble.jupitercolony.com.
And you can join for free.
Just download the mumble client and participate with us.
2 p.m. Pacific time. Hello, Mumble room, time-appropriate greetings!
Hello! Hello! Hello, everyone.
Oh, those voices. I love them. It's like we have our own little fan club, and they join us via the internet. And everyone is welcome, so if you're listening to Linux Unplugged, you could join Linux Unplugged next week by downloading Mumble and getting with the program.
Please do. It's so much fun.
All right, Wes, tell me about our first story.
Our first story. Okay, so containers, all the rage, everyone's talking about them,
and maybe people are actually using them, right? Like you were just talking with Michael
on Coder about how he's deploying a lot of Docker containers up on DigitalOcean. So now, if you're one of the people using Docker containers, and I know at least me, I use them on DO, but I also find uses deploying apps on my local computer, maybe something I don't want to mess up. You know, we've been talking about Snap packages and Flatpak, etc., but really, Docker can do a lot of the same things. There are other considerations, right? Like, it's more complicated; it's not quite in the same boat. But in the end, you can use it to deploy applications in the same way. So now NVIDIA is releasing a plugin
that allows you to have GPU accelerated containers. Awesome. Awesome. You know,
that takes things to a whole new level in some degree, because if we could start to containerize games, that seems like it would be a major game changer. You're no longer packaging and developing for a specific platform anymore; now you're targeting a container. And if you could take that container and make it interoperable among all three of the big operating systems, Linux, Mac, and Windows, now we've really changed the landscape.
Oh, that would be interesting.
Yeah, it looks like at first right now,
they're really targeting machine learning applications,
places where you might want to be deploying these apps,
but inside the container, they're too restricted to get access to the GPU.
So NVIDIA, they're working on, they're basically exporting,
they're adding some hooks into the userland part and having these userland-accessible parts exported into the Docker containers.
And then you can do, like, CUDA-type stuff right from inside Docker.
Awesome. So what have you seen? This is early days, I assume; this is not something that has taken off or has any real teeth to it yet, right?
Yeah, it is early days. They do have some code up on GitHub. You know, the InfoWorld article here does note that one of the biggest drawbacks is that
CUDA is a proprietary standard,
but it also just happens to be
that NVIDIA really is the chip maker
that does a lot of the machine learning stuff. You know,
these big clusters, people doing the serious work, they're usually
doing it on NVIDIA. AMD has been
opening up, which is really interesting, but
it really doesn't have the same kind of market
uptake that NVIDIA does, especially in machine learning.
Sure. Derp learning.
Derp learning, that's right. We have a term for this, guys.
Yeah. Derp learning. Derp learning.
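For reference, this is the kind of invocation being described, a minimal sketch assuming NVIDIA's nvidia-docker wrapper from their GitHub project and the public nvidia/cuda image; treat the exact command as an assumption rather than gospel:

```bash
# Hedged sketch: run a CUDA-capable container via NVIDIA's Docker plugin.
# Assumes the nvidia-docker wrapper and nvidia-docker-plugin are installed
# from NVIDIA's GitHub project and the host already has the NVIDIA driver.
nvidia-docker run --rm nvidia/cuda nvidia-smi   # the GPU is visible from inside the container
```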
I'm interested. Outside of – so you're saying – are you telling me that at this point in time what they're really targeting is really kind of only the whole machine learning thing or is it going to expand beyond that?
You know, the article doesn't mention anything that isn't kind of focused on the machine learning stuff. I'm sure that's kind of what's
hot and what's motivating this. But once you have access to the GPU, it'd be interesting to see
what else can be run there. Yeah. Yeah, it really would. I, you know, it's interesting because I
watched, I watched, we went to LinuxCon last year. And, you know, the way that LinuxCon works is they always have a sub-focus.
You know, Linux is obviously the general focus,
and then they have this, like, deep-down focus that, you know,
that they kind of run in on.
And this year, it was all about containers.
And, in fact, they were joking, not joking,
the little subtitle was ContainerCon, right?
And so, and I have watched everything in Linux over the last year,
focus or shift focus to be centered on or around containers. And, you know, first we saw it happen
on the server side. And then very shortly after that, we saw that entering on the desktop side.
And it makes me wonder if the future of computing in general is not just going to be containers.
Yeah, I know. Right. And it's interesting, you know, having been using LXD in the past, I do kind of get this. I'll build up a host, right?
And I'll have this main host
and then I'll install a bunch of containers on it.
And I just love, it's really nice
having that main host just kept so clean.
Anything you really want to do,
you end up in a container.
If you're done with that, you just delete it.
There's really, it's very easy to manage.
I can see why people are fond of it.
Yeah, yeah, for sure.
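A minimal sketch of the throwaway-container workflow Wes is describing, using the stock lxc client; the image alias and container name here are just examples:

```bash
# Spin up a disposable container, do the messy work inside it, then delete it;
# the host stays clean.
lxc launch ubuntu:16.04 scratch      # create and start a container from the Ubuntu 16.04 image
lxc exec scratch -- bash             # get a shell inside it
lxc stop scratch
lxc delete scratch                   # done with it? just remove it
```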
You know, I wonder,
have you played with flat pack and snap packages and all those?
Yeah, I've been getting my feet wet. I mean, I haven't done anything, any real work in them yet,
but I've played with them a little. So I'll tell you, the Telegram conversation I was having
literally as I was getting on the air is one of our prized and beloved community members is
actually working on packaging me an osTicket app inside of Electron.
Oh, really? Interesting.
So one of the things I hate is not being able to have
a dedicated application when I'm
working.
And I can't alt-tab between things.
It really messes up the window
management kind of workflow, yeah.
It does, and especially because things like – so for example, our ticketing system, I'm basically constantly referencing back and forth.
There's never a time when – no matter what I'm doing, something is either going in or out of the ticket system.
And to not have that in its own dedicated window is a huge crutch for me.
And so Sean was working – Sean Adram was working on packaging this just because he's a nice guy and sends me a telegram.
He's like, here, I got this done for you.
I'm like, really, dude?
And he's like, yeah, it actually didn't really take me all that long.
Yeah, I know.
So I'm playing with that. And then the first story we get on the air, we start talking about containers.
It seems like every time I turn around, something else is now possible.
Something else is now happening because of containers.
Yeah, exactly.
Well, I mean, the Electron thing is just because it's a web app, but you don't even need an Electron framework to do that.
There's browsers like Surf, which is from the Suckless project,
and they have
a browser that's pretty much
bare bones, nothing.
It doesn't have user
Chrome anywhere, and for people who are not
aware, that means the GUI
of the browser, not
actually Chrome.
It has the same kind of thing. You can have custom cookies for
each individual website. So you can just load up any web app into surf as its own independent window
and not have to worry about putting any kind of framework in it.
Really? That does sound very useful. Have you had any experience deploying this stuff?
Yeah, I actually
have like three of them running right now.
Really? And how have you found it to be?
It works great.
The only problem is that I'm running an older Ubuntu base that has an older version of the Suckless packages,
and the problem with that is only that packages don't update very much.
So once that's no longer an issue, then it wouldn't matter
because the latest
version of surf has like the latest web kit engine. So it doesn't have any kind of like weird
issues. Like for example, I have Gmail in one of them that every once in a while, Gmail will say
you're using an old version of web kit. Other than that, it works flawlessly. It just constantly
tells me that I'm using an old version, but as far as like functionally, it works fantastic. And all you need to do is create a text file that stores the cookies and you're done.
It runs just like a web app in a like a full desktop thing.
But, you know, web app style.
Now, excuse my ignorance, but isn't the entire idea of the whole container thing that it
shouldn't matter what version of Ubuntu you're on and what version of Ubuntu your users are
on?
Right. But Suckless is not based on the container stuff.
This is just an old traditional dev thing.
I'm saying this is a browser that doesn't need any kind of framework
to run around it to be what you want,
or it's a separate window and everything like that,
and it has a separate session.
So you can have two or a hundred different Gmail clients running at the same time,
and they would all be separate windows, and they would all be tracked differently,
and they all have a different session and all that stuff.
Wow. I could definitely use that at work and a few other places.
Yeah, I have one for each Slack that I use.
I have one for multiple Gmails and so on and so on.
Yeah. So Suckless surf is a really good app for that, yeah.
I'll have to look into that. You know, it's funny, too, because for a long time I told myself: what I'm asking for, what I need, what I want, is so simple and plain and straightforward that I can't believe something doesn't exist. And then I kind of got the idea that the only real way to do that was with something like Electron, and I'm like, well, I guess that works.
But I just feel like there should be a simpler, more straightforward solution.
I'm finding out, as usual, it's just a matter of my ignorance, not a matter of, you know, a lack in Linux.
It's also that they don't really advertise that they have this feature; they just say it's a bare-bones browser.
They don't mention that they have an individual sessions system.
But it's interesting, because all you need to do is run whatever website you want to go to: you just type in surf, put in the cookie file location and then the website, and that's it.
And it runs like how you expect it to.
Nice.
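A minimal sketch of the surf usage being described; the -c cookie-file flag is per the surf man page, and the paths and URLs here are just examples:

```bash
# Each instance gets its own cookie file, so each web app keeps its own session
# and shows up as its own window for alt-tabbing.
surf -c ~/.surf/cookies-gmail.txt https://mail.google.com &
surf -c ~/.surf/cookies-slack.txt https://yourteam.slack.com &
```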
Well, thank you.
I appreciate that.
Rotten.
All right.
You know, the next thing I want to talk about, or I guess rather Wes wants to talk about, is making his computer more artificially intelligent.
And I'll tell you what.
Here is your – I'll give you two secret surprises.
Secret surprises.
Surprises, it's not really secrets I'm about to tell you, but it's still a surprise, kind of, sort of.
Keep it secret for the surprise.
There are two things.
One is I am desperately trying to get more patrons for Chris.
So when he gets home, we have to do it.
Yeah, I know.
So he'll get home and he'll be like, oh, my gosh, Noah and Wes did not burn the network down.
And dad still has a house.
This is absolutely great.
So we are at 650 right now.
So we have two patrons.
My goal is to get us to 660.
Now, I can't tell you quite yet because I haven't finalized the details of what it is, but I have something in the works.
And I also have a backup plan if that first plan falls through of something super special and cool that we're going to be giving away to one of the 12 people.
Two spots are taken.
We've got 10 left.
So, yeah, I know, I know. And so, if you have been considering signing up for Patreon, now is the time to do it: patreon.com slash today. And if you sign up before Chris gets back, and that's going to be... we'll have to do it on the Linux Action Show this coming Sunday, so it has to be before then. If you sign up before then, you'll be entered into the drawing.
I don't know what we're going to do.
We're going to do some kind of drawing
and pull out names
and you'll win something really cool.
I think it's cool anyway.
And it might be even cooler
if Noah's plan number one goes through.
And if not, I have a backup plan number two.
Okay, that's super secret special thing number one.
Super secret special thing number two.
Next week's Linux Action Show, if you don't usually tune in, obviously with Chris being
gone, it gives us, we have to fill for his absence, and those are some pretty big shoes
to fill.
However, we think we might have a pretty good idea of how to do it, and I'm not going to
tell you exactly what it is, but suffice to say that it is going to be
amazing, and it may or may not relate to this next story that we're going to talk about, which, by the way, is that the Mycroft AI personal assistant is now available on KDE Plasma 5. What do you think about that, Wes?
Well, you know, Chris has just been making everyone so jealous. He's using that GNOME extension, he's loving it. You've got that personal AI,
Popey's voice right there all the time.
Popey uses Arch.
Right, exactly.
Oh, how I wish I had that.
Chris, why did you leave and not give me that?
Would you like it?
Would you like it?
I can send it to you because I have it.
Let's set that up later.
I think so.
So if you are a KDE Plasma 5 user,
rotten corpse, ahem,
you can now have access to the Mycroft AI application,
which, let me tell you, you know,
the thing is I am still in kind of disbelief that he's managed to get this project as far as he has
in the short amount of time that he has.
Oh, it's incredible.
Without compromising on basically any of his core values.
So we really have a hugely industry competitive project.
In fact, arguably the only one of its kind, right? Because you have things like the Amazon Echo, but that's a very specifically tailored device that you can do very specifically tailored things.
It's not a drop in –
It's not an open, extensible world.
Right.
But even if we abstract away from the open source argument, even if we get above that, there's still no drop-in solution that I'm aware of that you can just take an AI software stack and just drop it in and say,
okay, this is my device that does my widget that does X, Y, Z, and I want it to have AI. So here we go. I don't know of any other
software system that will allow you to do that. And so, you know, he's really, he has struck a
nerve. And I think that in the long run, that's going to work out very, very well because it's
inevitably Mycroft is going to be a more versatile and useful platform
than anything that remotely competes with it.
Right, right, exactly. And I think we've seen, like, because it's so open, you know, if you become a business that wants to have the same thing, it's open to you. Sure, you have to contribute back upstream, of course, but it allows the community and people who really have needs to build the tool that will work for them.
Do you have to contribute back upstream?
Well, depending on the license.
So, yeah, if you make changes, yeah, yeah.
But I'm saying if you are a project and all you want to do is just use that Mycroft AI in its form, you're not making changes to it.
You're just using it.
You just take it and use it, right?
Yeah, totally.
That's exciting.
I think so. I think so.
And the fact that we're already starting to see... you know, I remember we were at System76 in November, and they were still in the launching phase of this.
So to watch, to go from November to July and have this not be a thing just on GNOME, but now they've extended out to KDE.
You know, where is this going to be one year from now? Is this going to be on every device I own?
I hope so.
I know, and I also feel like when, you know, like when
Cortana came to the desktop and Siri's coming to it,
I just feel, before, before Mycroft
really had picked up as much momentum as it had now,
it really felt like, what was Linux going to
have to compete? What was the open source world going to have to compete?
We were so far behind. We didn't have anything.
We're not that great at desktop integration anyway. So it really didn't
seem like this was possible. So really, I have to throw a hat tip out here to Aditya Mehra,
who seems to be the author of both this Qt application as well as the Gnome Shell extension.
So it's really helping desktop users out a lot. Okay. So here's the, I guess here's the thing.
Well, I won't say that. I won't say that. Yeah, we'll leave it at that. We'll leave it at that.
Wrong show, Noah.
Yeah, right. No, that's my thing. It's my trademark; I take it everywhere. But suffice to say that I think Mycroft is going to be absolutely awesome. I think that at this point, from what I've seen of what Mycroft can do, and that's a small scope, right, I think it's much larger than that, but from the very little that I've seen, it blows away Cortana. Cortana, to me, is the new Clippy of 2016. That's how I feel about it. I mean, I played with it for a little bit, and it feels like Clippy. It feels like that stupid, irritating paper clip that bounces around, gives you nothing useful, and just annoys the heck out of you. Except now it's almost like Clippy combined with an annoying Flash infomercial embedded in the page with super loud sounds.
Yeah, exactly. And it's, like, under your fingertips all the time.
All right, so I see you're wanting to run a podcast. Would you like me to mumble for you?
Exactly, exactly. Dude, that's exactly how I feel about it. I think it's the dumbest thing ever.
All right. Well, it looks like Arch has finally caught up to Ubuntu.
Zygoon.pl headline, Snappy in Arch move to community repo.
And says, hey there, Snappers, I'd like to announce something that you may have noticed during the last update of SnapD to version 2.0.10.
The AUR package is no longer there.
Instead, you can now get an updated Snappy in Arch simply by running this one-liner, and then he gives pacman -S snapd.
You know, it's interesting, Wes. I was just talking, again, I was talking to Adrian right before the show,
and he was in the process of packaging this Electron thing for me, and he says, well, I know that you have Ubuntu on your laptop now, so I'll just package it as a snap. And I said, doesn't it really not matter anymore what operating system I have now? If it's just a snap, I just, you know, install it on any OS, right, or any distro. And he said, no, actually. So it turns out it's still very much a community thing. So the community distros, and Arch has it, and, you know, it's available in Ubuntu and it's available in Copr, but it's not widely spread.
Right.
It's fairly wide.
It is fairly wide.
I mean, for the fact that it's only been around for two months, it's pretty wide now.
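For reference, a hedged sketch of the one-liner from the announcement plus the follow-up step usually needed on Arch; check the Arch wiki or the snapd docs for the current procedure:

```bash
sudo pacman -S snapd                       # snapd now comes from the community repo, no AUR needed
sudo systemctl enable --now snapd.socket   # typically required before `snap install` will work
```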
Well, I guess my question is twofold.
The first part of the question is, doesn't it kind of eliminate a universal installer
if it's not
truly universal throughout all distros? And then my second question...
Good, yes, but that's only a temporary thing, because people have to adopt it to make it universal.
Yeah. So what you're saying is eventually we'll get there.
More than likely. I mean, the fact that
it's been two months, and when it first started two months ago, it was in one distro. And then it became available in every flavor, and then in like six other distros now. CentOS has it, Fedora has it, Gentoo has it, Linux Mint has it now.
The fact that it's only been two months and there's already that much support: the more distros that adopt it, and the more time we have, the more it becomes universal.
I think it just hasn't had enough time to get there.
Okay.
Well, I guess that makes sense.
I mean it's a good point.
You can't have universality just on the get-go or that would be pretty hard to do.
And we will see.
If we get the critical mass, like if you can tip the ship here with enough distros, then you'll feel like an outsider if you don't have it, and your users will probably come to expect it.
Yeah, every time this discussion comes up, I have to reference the whole XKCD thing.
Problem, we have 13 competing standards.
Let's create one standard to standardize them all.
Problem, we have 14 competing standards.
I mean, it's true, but that's also kind of like, we didn't really have a standard for
universal apps yet, so they're all competing to become the standard.
Yeah.
So it will eventually – we will have a standard because there's so many people who are behind the idea of having a standard that it is very likely that one of them will get it.
I don't know which one, but one of them will.
And I think the best example right now is if someone wants to install like NextCloud, they can just install Ubuntu or whatever that supports Snap and then just Snap install NextCloud and you're done.
Yeah.
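A minimal sketch of the example being given, assuming snapd is already set up and the nextcloud snap is still published under that name:

```bash
sudo snap install nextcloud   # one command and the whole stack is running
```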
Like, it's that simple now. That is such a huge change from what it used to be that I see this is definitely going to be a revolutionary type of thing.
Yeah. Go ahead, Dar.
I will claim that it doesn't even matter if it doesn't become universal, as long as the core distributions implement it. What I mean is, the distribution concept is a concept of trying different things and seeing if you succeed, and if it becomes good, people adopt it.
The toolkits, GTK, Qt, all that kind of jazz.
Essentially, for me, if it is good,
it's going to get adopted by the major distributions.
Usually people run mostly on the major distributions.
Those that are running different things,
it's probably because they're looking for different things anyways.
It doesn't matter if it's not universal,
as long as it's in the core ones.
If you're developing software and you want to ship it, heck, if you can have a single binary that it runs, that's what matters for your user as long as he can execute your program.
And at the end of the day, it doesn't matter.
Yeah, that's fair.
That's fair.
I guess I started out – I mostly took the opposite side of this argument simply to provide context and because I've been speaking with some developers and they had some concerns. But I think overall, if I had any concerns, even if those concerns haven't been directly addressed or squashed, I guess they're outweighed by the huge benefits of what this brings to the table. And I guess the only thing is, it is interesting to me that something of this magnitude
is still in the community repo.
It's not in the main Arch repo.
Although, I suppose on the other hand...
It is the main Arch repo, though.
Is it?
Well, there's a distinction in terms.
Yeah, the community is just a branch.
It's not officially maintained by Arch.
But the fact that they're allowing a trusted user to maintain it in the official Arch repos, it is kind of like they are kind of acknowledging that it's more than just an AUR. It is being maintained by someone in that community.
That's good clarification because I was not aware of that.
So anyway, yeah, I think this is going to be interesting.
It'll be exciting.
This will be my first Snap package that I will rely on.
I've installed Telegram, but I have always had the option of doing it from a PPA.
I installed other applications, but it was specifically for the purpose of trying out Snap. This osTicket desktop application will be the first one that I'll have ever
used that is only available to me in a Snap package. And because it's in a Snap package,
I guess I'll be able to use it on anything, right? Yeah, I'll be really curious to see,
you know, how that goes for you. You should definitely, when you, once you've got a hold
of it, you should definitely see how far you can spread it,
just see which servers and how easy it is.
Yeah, I totally will.
I'll tell you what.
The funny thing about the snaps is that most of the things that –
on the developer side, there are certain things that I see that it could be a problem,
but there's also certain things that I think that are fantastic.
But the funny thing about it is that so far, since it's only been a couple of months, the amount of progress that has been made on the Snap side is kind of amazing.
Just because in the very beginning, there were two or three problems that were called out as being catastrophic, as in there's no way it's going to be adopted, and then they happened: Snap released features for those. And then the immediate jump was the shared library issue, and now, in like a couple of weeks, Snap is basically going to have shared libraries released. So pretty much any time there's an issue that seems like it would keep Snap from being something that people would want, they go, well, okay, now that you've brought it to our attention, we'll make it.
Huh. That's a good sign.
Yeah, it is. That's a very promising, very promising
outlook.
So, Wes, do you have the chat room up right now?
Yes, I do.
So I'll tell you something. I had the chat room up on my laptop, and then, as it turns out... one of the problems, so here, behind the scenes, ready? Behind the scenes, one of the problems with doing shows from Grand Forks is I'm not usually doing them from Grand Forks. And so every show, I think I kind of have everything down. And to be fair, Coder Radio was so much less overhead than LAS that I literally walked into the studio, set up in about 15 minutes, and then I was like, well, now what? That's awesome. I guess I'll go have lunch and come back in an hour and a half and do the show. And it worked out beautifully. Well, I thought I was all set up to do Unplugged, but the one thing that I missed was... see, I didn't do the intro music for Coder because there were some editing concerns. Look, I've done the intro music before, and I was like, well, it should be no problem to knock it out. But I realized I didn't have a third computer to play the intro music on. So right before the show starts, I'm like, well, I'll use my laptop.
So I bring it over.
I set it in the middle of the studio, plug the audio cable in, and run the music.
Well, then I get back here, and I realize I don't have access to the chat room.
That's a problem, right?
Well, the great thing about Quassel is it syncs on all of the devices.
Turns out today is the day that our techs decide they're going to upgrade all of the access points.
So we're going to AC, right?
Nice.
I have no Wi-Fi in here. I have no Wi-Fi. So all I have is my phone; I have no Wi-Fi.
What are you going to do?
Well, I'll tell you what I did. I have my Samsung Galaxy S6 on Ting, and the cool thing about Ting, especially if you use the code Linux or you visit the website linux.ting.com,
is they are going to give you $25
off your first device or your first month of service.
Now, my total bill,
my total bill is $24.
That's what I pay every month,
and that is for a phone for myself and my wife.
That's crazy.
Yeah, we have two phones,
and it costs us 24 bucks a month.
Now, the great thing,
how that's even possible is
you only pay for what you use.
So I don't use minutes.
In fact, I hate phone calls.
If you call me,
you're likely to get rejected.
I only answer phone calls
from like three people.
And other than that,
the little ringer thing,
by default, is set to silent.
And then there's certain people
that I have set a ringtone for,
like my wife,
if she needs something urgently. For the most part, I just don't do phone calls. So my minutes
are zero. My messages, I hate texting because I'm on so many different devices that I can't text
from one device. You just send me a text message. In fact, I have those alerts disabled. So I check
them like once a month, for those people that refuse to join 2016. But I use Telegram, over, you know, data, for everything. Now, we're an IT company. So what does that mean? That means
that we install IT infrastructure, I never have to worry about not having Wi Fi, because if I'm at
my office, except for today, I have Wi Fi, if I'm at my home, I have Wi Fi. And if I'm at any client
location, we have a super secret special network that all of our all of our tech devices connect
to that give us admin access to
the network and unrestricted speeds and all that good stuff. I just throw my phone on there. So I
have network access literally everywhere I go. So much so that when I got my S6, I didn't have a
GSM SIM card at the time for it because I still have my other phone activated and I want to
number two. It's a long story, but I didn't have a SIM card. I actually used my phone for three days while I waited for my Amazon Prime SIM card. By the way, you can get them, the Ting ones, off of Amazon Prime. I used it for three days: voice calls through Zoiper SIP, texting through Telegram, and of course all of my email stuff, and I never even really missed a beat. I didn't have data while I was in the car, but that wasn't really the end of the world. So it's kind
of like gap coverage and it was just kind of funny today because it was one of the first times where I was like,
hey, I actually don't have, I don't have wifi for the first time. And that doesn't usually happen.
So all that to say, linux.ting.com, go over there for 25 bucks. You should easily be able to get
your first month of service. If not, if you're a single person, you might even be able to milk
two months out of that, just six bucks per line.
And you can do a hotspot.
So I actually, in addition to my two phones, I've also got a hotspot that I use when we go out to the lake and stuff like that, just so the laptops can get connected, the kids can be there and whatnot.
Linux.ting.com and a huge thanks to Ting for sponsoring the Linux Unplugged program.
Thanks to Ting for sponsoring the Linux Unplugged program.
So the next story on the docket is, I believe it's this BPKG.
Am I pronouncing that right? You are dead on.
What does that sound like to you?
Big B package.
I mean, I know what it is because I got it pulled up here, but
is it B package? Is that how I'm supposed to pronounce it? Let's go with B package. That
seems like the most fun. So, you know, we all love package managers. We were just talking about
the next round of pseudo package managers with Snap and Flatpak and Docker. And well, everyone,
you know, you also have things like, you know, Rust has Cargo, you have pip for Python. Now we have bpkg for Bash.
As a developer, I hate them all.
Okay. Tell me why. Hold on. Hold on. Tell me why.
Okay. I will say the following. If you're developing Python, somebody uses easy_install, somebody uses pip. Yes, they eventually fixed and merged both of the package managers. But that
means only that now you have software where, when you download the repository and the source code, you actually don't get all of the dependencies you need for that program to build, even though you're just getting source code. In tarballs, you would get the source code and you would have it all. Git has submodules, but developers seem to want to use whatever package manager they have. Go has the same problem.
They put the URL as an import inside of the language.
What that means?
You download your source code.
You want to work offline.
You can't.
You want to build.
You can't.
If the server goes down of one of the dependencies, you're dead.
Essentially, it's terrible.
Wrong.
The problem is much worse than the problem you end up finding in the distributions' repositories, because at least there you know that it's your distro.
They might not support all the packages you're looking for.
But when they have something, they tend to have all the dependencies you need.
So what would you recommend then?
I mean are you saying that none of these should exist?
Are you saying that the problem is we got too much of a good thing?
I think it's because it creates isolationism
and makes the people using a specific language
end up having to recreate all of these libraries
that could be used across languages
and actually have the ability to use code
that already exists outside of their own isolationist thing.
I believe if you want to fix your source code, if you're already shipping source code and if you're already using some sort of version control, it's not difficult to put it in your build script, or whatever your version control supports. Git has submodules and subtrees; use those things, please. That avoids so much. Just one tool to do the job.
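A hedged sketch of the approach Dar is advocating: vendor a dependency at an exact commit so a build never depends on an external package index. The URL, path, and commit here are placeholders.

```bash
git submodule add https://example.com/some/dependency.git vendor/dependency
git -C vendor/dependency checkout <known-good-commit>    # pin exactly that hash
git add .gitmodules vendor/dependency
git commit -m "Vendor dependency at a known-good commit"
# Anyone cloning the project later gets exactly that version:
git clone --recursive https://example.com/your/project.git
```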
Okay, so you're saying it's unnecessary?
I'm saying it's unnecessary
because let's say I want to use a Python program.
Now I have to have pip, easy install.
I have to have git.
I have to have the distribution.
So all of these packages are spread across,
not uniformly, not all in the same version.
It's like, it's just a pain.
As a developer, I think it's terrible, actually.
Hmm.
Okay.
Wes, do you have a contrary opinion?
Well, you know, I wasn't exactly excited about this.
I just think it was interesting to see something,
like see these concepts promulgate down even to the shell level.
I don't do too much bash.
I mean, I enjoy writing a bash script.
It certainly is useful.
But, you know, if I get to a certain level where there's enough logic,
I usually have migrated to another language
that has solved these problems already.
I did install it on my laptop here,
and, you know, it's working well.
It was a fast install, although they do recommend
one of the ways to install it is the curl pipe to bash,
so there's probably a theme going on here.
But, you know, you can install stuff.
I've got, like, an RTail program for remotely tailing log files. So I
can see some utility in it. And certainly if you were a Bash developer, I'm not sure if that's a
thing, but if it is a thing, maybe it does allow you to have, it will set up local dependencies
for you. So you can install things in user land rather than needing root permissions.
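A hedged sketch of the bpkg workflow Wes describes. The installer URL is left out because the show doesn't give it, the user/repo argument form and default install location follow bpkg's README at the time and should be treated as assumptions, and the package name is a placeholder:

```bash
# curl -sLo- <bpkg setup script URL> | bash   # the curl-pipe-to-bash installer mentioned above
bpkg install <user>/<repo>       # fetch a Bash package from GitHub into the project's deps/, no root needed
bpkg install <user>/<repo> -g    # or install it globally into your prefix
```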
Let me just add one small thing. I see nothing wrong with having maybe an index of where you
can find the multiple sites, the things that don't get published into a repository.
If you have some sort of index that allows you to search code,
like search code website, for example, you can do that.
But honestly, when it comes down to the way you distribute your package,
come on, it's source code.
If you are building a program, that program, you're shipping the source code,
the user of that source code expects to be able to build and run your program.
Make that work.
Stop depending on yet another tool to push your code
so that all of your collection of code is available
and start dealing with less surface.
A lot of the errors and issues come exactly from increasing your surface area.
You as a developer can comprehend all of it perfectly.
You end up having dependencies.
And you can see as an example what happened with Node.
Essentially, one developer decided to, okay, since I can't have the name I wanted because of trademarks, I'm going to delete it and broke a lot of websites because projects depended on it.
It was an example.
That's not a very good example, just because Node had a very poor infrastructure. If Node did not allow them to, you know, remove it at any time for any reason whatsoever, that wouldn't have happened. They screwed up; it's not necessarily the structure, not the concept of having shared dependencies and stuff like that. It was the fact that Node screwed up, or npm specifically.
Actually, what about this? And this proves the example that I was speaking about.
If you are using versioning control, and you fork a repository that you are using as a dependency,
you can pull exactly that hash version. And guess what? You can insert into your project
as a submodule, as a subtree, and it will be just there. Okay, but, well, first off, I would just point out that you're talking about an idealistic
situation where everything and all source code is open source, but that's not the case,
and the majority of proprietary software would be a dependency that you could not do that
with, so.
You can.
You can.
You can have Git repositories that have binaries that are proprietary.
You can totally do it.
It's a lame excuse.
It's doable.
We have the tools.
It's people who are so centric about the way that they're used to developing code.
They don't ask: how do I actually make my source code something that just builds and just works?
They complain about the packages issues, but most of the maintainers of the multiple distributions constantly fight,
okay, this package pulls from pip and this package pulls from easy_install and this package pulls from Node.js. Which one respects my licenses, in the case of Debian? The maintainer has all of
these efforts to go through. If maintainers didn't have these efforts to go through,
they could just look, okay, we prefer in our
distribution to put binaries in /usr/bin. And so they would look at where, in your package build, so in the directory where you build your project, the binaries are generated, and make those binaries go to /usr/bin. That actually would simplify the
process for both developers and maintainers.
But no, nobody sees that.
Okay. Also, I don't want to be dependent on maintainers to update my applications for the rest of my life. So even if your solution is good in that sense, it still puts it on the maintainers, and it still means that I can't update it whenever I want. So even if that was a solution, I still don't want that.
For all intents and purposes, you can be the maintainer as much as you want.
I don't care.
I just think that it separates the concerns between what a developer ships in terms of source code and what a maintainer ships in terms of as a maintainer job.
Currently, the maintainer job basically is to reverse the process that the developer decided to create to build their package, and understand how they can incorporate
that into their distro.
Wanting or not,
that's the process of a maintainer.
I just say that if you as a developer
have the care of isolating
your source environment into a location,
it's actually preferable.
Even if it's you doing the work
as a maintainer,
it's you shipping a binary
or it's someone else
coming and picking up on it.
You know, Rakai says in the chat room, he made an interesting point I'd like to get you guys' take on. He says: someone point me to a distro that is packaging a ton of bash scripts as packages, because I don't know of any, so this argument doesn't make any sense to me. I can see it for normal packages, but bash scripts?
And then he goes on to say...
Sorry?
I was just going to say, I'm surprised that this is even a thing.
No, just that bash has packages.
I mean, I don't even know, like, a .sh file, that's a package for Bash.
I don't even understand.
I think that's what he's saying.
And somewhere in here, I can't find the exact quote, but he says, oh, yeah, here it is.
I would love a Bash script repository.
So that makes sense to me.
But there's no way to get it elsewhere without just searching GitHub for code and pulling it down locally.
Does that change your perspective at all?
Either Rotten Corpse or Dart Evelyn or Wes, if you want to jump in.
Slightly, but very mildly only,
in the sense that I agree with the repository,
I agree that there's not something for it,
and in that sense, maybe distribution should look into incorporating common bash scripts.
Then you have to make the other question.
Aren't bash scripts for customization, personalization of things?
When things need to be part of a system, they become their own isolated daemon or program.
If we look at Bash as
something to automate things for
our own personal taste, then a
repository makes sense in the terms of
community sharing, not necessarily
as distribution, actually
distributing this. It's very
personal.
Those things become
standardized, they become small little programs
that end up being deployed as a package.
Go ahead.
My feeling, I'll just add: in some ways there's also kind of a weird reverse or perverse incentive, where a lot of times with bash scripts, I'll use them where I might not use another tool because, you know, they're very portable, they're easy to run, you don't have to compile them, etc. So if I have to install this secondary tool to then download these, obviously you don't have to, you can get them just through the repo or whatever, but it makes it like, well, if I'm going to already be using a package manager, I might as well use something in Python or a higher-level language that's used to facilitate a lot of things.
In fact, Rakai actually shares with me a couple of little out scripts of the way that some of the videos are transcoded.
And my understanding is that's all wrapped inside of a Bash script.
That's great.
I actually tried using – I took a course on Linux Academy, more about them in a little bit.
But I actually tried writing a little bit of code in Bash, and then I decided I was going to, quote-unquote, graduate to a higher programming language. And so I took a course from Linux Academy on Python, and what ended up happening was I found out how to call Bash scripts from Python. And so basically I ended up wrapping all these Bash scripts inside of Python, which made absolutely no sense, but it was fun to do.
But I think part of it is that you have to be comfortable with Bash to be a Linux admin to begin with. Working inside of Bash enables us to get the functionality as if we were a developer writing a program, without the overhead of actually having to learn the development language and, quote-unquote, doing it properly. And in that regard, I think that Bash has a lot of power behind it. You know, to that point, how many people do you know in 2016 that
are writing hugely complicated scripts in DOS or whatever? I mean, our shell is so powerful that we're talking about a package manager for it.
That in and of itself is cool.
Without disrespecting anybody, I will say that once those things start needing a package manager, I think it's actually an indication that perhaps those things should be moved outside Bash.
It's just a suggestion because a lot of things don't contemplate the use case scenarios that are beyond the, you know, small automation.
Bash was originally, at least if I understand correctly, intended for you to invoke mostly other programs to do things. But when you start doing manipulation, when you start producing...
It's a shell, yeah, exactly.
When you start trying to produce output, maybe it's the time that you need to make a stop in your heart and say, really, I need to get this done properly. Because seriously, it actually just damages things further, because then Bash needs to always be there, and it ends up having the issues that other programming
languages already have solved,
and now we're having to deal with an
extra package manager. I just
don't believe it's a good idea.
A package manager might not be good for
Bash, but I agree
that the idea of a repo would be pretty
cool. So maybe not practical
but still pretty cool.
Yeah. Well, you guys should all just try it out and report back if you actually use it.
Yeah, yeah. Maybe that's an idea; maybe we could have like a community review kind of a thing. You know, I think one part of the drive is just that we're familiar with it. But I think the reason that a lot of us, especially as system administrators, I don't know about us, but the reason that I've become so ingrained in Bash is because I can run
it from the command line, right?
Yeah, exactly.
And so basically what that enables you to do is if you have a tool that you fundamentally
rely on that requires a web interface, it means that you're only able to access that
tool when you have... I'm sorry, did I say web interface? Graphical interface. You're only able to access that tool when you have the graphical desktop installed. And that's not
always the case. In fact, a lot of times you'll end up using like all of my servers, and I can
say this with 100% certainty, every single server I own, whether it is a virtual server or it's a
physical box, does not have a head on it. And it actually provides for some comical interaction
because I'll get calls from clients. You know, all the servers that we have installed at client locations,
they're on a rack and there's no display connected to it because we administrate everything you know
over SSH.
Totally.
And that enables us to get to it remotely. But it'll be funny: every once in a while you have a client whose server goes down or something stops working, and they don't want to pay us to come out, so they'll go in there and they're like, I'll fix this, I'll just go and click the restart button or turn it back on.
Yeah, well, that's all right.
They're locked.
They can't get logged in.
But they have the cable.
They plug a monitor cable and they set a monitor up.
And I've gotten more than one call where they're like, well, yeah, this is so-and-so from so-and-so client.
And we went in the server room.
Something's definitely wrong with our server.
We plugged a monitor in and it's booted to a DOS prompt and it's not loading Windows.
You know, they just they have no idea, right?
But I think the drive behind that, not only is it lower resources, but it fundamentally enables us to take those servers if we ever need to, and it's happened more than once, and move them over to a VPS provider.
So you have a client, and they have a server, and all of a sudden it goes down.
The protocol used to be 10 years ago. Here's how that conversation would look.
Hey, I'm really sorry that your server went down. I understand that you're not able to get any work
done. If you'd like, I can go ahead and overnight a brand new server to you and would have it up
and running in about 12 hours. If it's super, super, super, super, super, super, super important,
I can call XYZ and they'll go ahead and put an expedite thing on it, and I can drive to Minneapolis, four and a half hours, and I can have it up in eight hours, and then you'd be back online.
Today, that conversation looks like this: Hey guys, I'm really sorry that your server went down. If you pay us $15, we will go ahead and set you up an account over at DigitalOcean, which is our chosen VPS provider.
And then we bill them 15 bucks and DigitalOcean charges us five bucks. And so we make a nice
little spread in there. And it's a recurring cost because if you're in small business,
you know that the most valuable revenue stream that you have is the recurring sources. It's
not the one offs that the people that call, even if it's a big client, even if it's a big job
and you bid it out and it's, you know, 10, 15, 20, even $30,000, that money will
eventually go away. So it's those, it's those, it is those reoccurring contracts where they sign up
and they say, yeah, we want, because it's only five bucks. We want a main server. We want a
backup server. And then we want a failover server offsite, you know, this, that, and the other.
Okay. No problem. I log into my digital ocean control panel. I create their main server. I create their backup server. And those
may be on one data center. And then I put another one in like, you know, Frankfurt or wherever
that's totally offsite and it's doing offsite replication. So if anything fails at the main
data center or something like that, we have the backups and I can do all of this from my laptop
or even my freaking cell phone right on site. And I can tell the customer and I have told the
customer,
Hey,
we'll have this up and running for you in about 20 minutes.
Cause it takes me about 30 seconds to get the server spun up and working.
And then it takes me,
you know,
the,
the rest of the time I probably spent half an hour locking the system down and installing all their applications and stuff like that.
The thing is, if you use the code DOUNPLUGGED, you're going to get ten dollars off your first Linux rig. Now, you could apply that towards a big Linux rig and run a really powerful server for a free month, or you could do what I do, which is spread that over two machines, and you can
actually run two $5 rigs. Now, just to see how far I could push DigitalOcean, a couple of weeks ago we did a review of virtualization.
And I actually virtualized a $5 droplet to see if I could do it.
You can actually get mini droplets out of your $5 droplet if you really had to.
So if you had like a good example might be a web server, a small web server that has like a company portal.
Maybe it redirects to a different site or maybe you just
have a couple of little company resources, something like that, just something super, super tiny. Maybe you need two of those; you could probably get those to run on a $5 rig. I'm not necessarily suggesting you put that into production, but for five bucks a month, who cares? I can tell you this: there isn't a VPS provider out there that I haven't tried, and hands down, the zero-compromise VPS solution is DigitalOcean.
So go over there, use the code DOUNPLUGGED, and get $10 towards your first Linux rig.
And a huge thank you to DigitalOcean for sponsoring the Linux Unplugged
program.
So, Wes, this next piece here is, uh, confusing, to say the least.
Linux and open source are powering comic massive restructure?
Am I reading that right?
Almost.
So Linux and open source are powering Comcast's massive infrastructure.
Comcast.
I didn't see the wrap.
There you go.
Okay.
You know, and that's a hard one to say for us people.
Comcast is not always our friends.
I know here at the JB1, we certainly complain about them.
I think that's every day.
But, you know, you don't always think about, like, how much of the open source technology they really do and have to leverage to compete as a huge ISP.
Uh-huh.
You speak of problems; we have had a plethora of Comcast problems recently, so much so that Chris is actively looking to try to find alternatives to Comcast. The problem is, in his particular neck of the woods, they are not terribly available. In fact, I heard him on the air last week sometime,
he was even joking. He's like, maybe I'll just go do shows from the road because I probably have more reliable
internet. Yeah, exactly. I mean, that says it right there. Okay. So dive into this a little
bit for me. Tell me how Comcast is using Linux. Well, okay. So basically, I mean, to start with,
obviously Comcast ships a crazy amount of traffic. Let's see. In 2012, they had 1.2 million gigabytes of data traffic.
In 2013, that grew to almost 7 million gigabytes of traffic.
And it just keeps getting more.
In 2014, now it's at 74.8 million gigs of traffic.
People are clearly picking it up, and they need things that can really scale.
It's just interesting to hear what their people say.
Here's what their representative is saying.
Comcast, like so many others, is a very Linux-heavy operating system company.
Generally speaking, we're more on the open source side of those Linux distributions than the commercial side.
So they mostly are using Ubuntu and CentOS.
In addition, they're huge OpenStack users.
So they mostly deploy KVM, and then they do all their data center orchestration through OpenStack.
They have roughly a petabyte of memory and around a million virtual CPU cores that they are running under OpenStack,
mostly to run all their web interfaces, billing, management, account stuff, the X1 interface, the on-screen interface, all that stuff.
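For a rough feel of what that orchestration looks like from the operator side, here's a hedged sketch of booting one KVM guest through OpenStack's compute API with the openstacksdk Python library. The cloud name, image, flavor, and network names are placeholders I made up, not anything Comcast actually runs.

```python
# Hedged sketch: launch one instance through OpenStack, assuming openstacksdk
# is installed and a clouds.yaml entry named "internal" exists.
import openstack

conn = openstack.connect(cloud="internal")

image = conn.compute.find_image("centos-7")           # placeholder image
flavor = conn.compute.find_flavor("m1.small")         # placeholder flavor
network = conn.network.find_network("internal-net")   # placeholder network

# The scheduler places this on a hypervisor (KVM in the setup described above).
server = conn.compute.create_server(
    name="billing-web-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, "is up")
```

Multiply that by a million virtual cores and you get the kind of fleet they're describing.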
So I really never really thought about it before, but it's kind of interesting.
You know, the thing is, it would almost be cool to use this as something like a Runs Linux segment on the Linux Action Show. But the reality is, they are the epitome of everything I hate about these kinds of Linux users. And it is this, in very short order:
They don't care about Linux or the community.
They care about profiting off the backs of the hardworking people and the excellent infrastructure that Linux offers.
Now, I'm a small business owner. I'm not against making money. I'm not against making a lot of
money. Right. Our economy relies on that. Yeah. But I am against abusing a community
and having zero respect for that community. Cough, Oracle.
Well, and then further is pushing the people that made that community exist in the first
place out of the community because they are not part of your business model and don't benefit you directly.
And Comcast, in the few little things that I've had to deal with them, seems like they hit every one of those bullet points square on the head.
And I hate it.
And so, you know, it's kind of cool.
But let's face it.
All that to say, you can't run a company the size of Comcast unless you're running it on Linux.
I mean Windows small business server isn't going to cut it.
And Comcast needs like best-in-class engineers, and those people are going to be using Unix tools and a lot of Linux-based tools.
So here's an interesting twist for you.
Are you familiar with Cumulus Networks?
Only a little bit.
OK.
So Cumulus Networks had a really bright idea.
They said: right now in business we have a problem. We have network engineers and network technicians, and we have system administrators. The system administrators are very good with Bash, very good with Linux, very good with computers, and they understand the networking side of Linux. But any time we want to do some serious heavy lifting with networking, we go to our network engineers and our network technicians. And those are the guys that know Cisco, and they know Juniper, and they know HP.
Very different worlds.
You know, MikroTik. Right, exactly.
And they are who provide our network.
So if you want to have a high-functioning company,
you employ the best of the best of the system administrators
and you employ the best of the best of network engineers and technicians.
And if your network crashes, your system administrator notices, he contacts the network
engineer and they take care of it.
And if your network engineer or technician is there and the server goes down, they don't
touch it.
You just wait for the administrator to come.
But basically, that means your payroll is doubled, because you have two equally technically proficient people, but they are specializing in different areas. Even though your network engineer has to have some basic system administration knowledge, because he's copying files to and from a TFTP server most of the time, and your system administrator has to have some basic knowledge of networking, because he's got to get his servers to talk on the network. And so they said, why not do away with this whole dichotomy and use Linux as the network operating system, routing everything through iptables and through a custom interface that they wrote on Linux?
Oh, interesting.
Yeah, it is really cool, dude. So basically they sell this switch and router
and whatever for the data center. We're talking like 40-gig connections.
They're not small scale, right? This is really big stuff, ISP level, that kind of thing.
But it's all being done on Linux, and you're using Linux.
So their selling point is
train your guys to be a system administrator.
They can administrate our networking equipment as well as any of your
servers,
because it's all one operating system and you interface with it like you
would any Linux distro and all that,
arriving at the point that it seems like that would be a really great route for some place like Comcast, who probably has an abundance of people that are working in networking. And now they're trying to branch out, well, not necessarily branch out, but they're being forced, because cable is going by the wayside, to focus on these video-type services, which require a huge back-end infrastructure of administration. It seems like you could really cut your costs down if you could get a network infrastructure in place that utilizes the same talent as the administrators that are administrating all this video content. What do you think?
Yeah, it's interesting to see, especially with Microsoft working on their Linux-based switching thing, and Dell has a new operating system doing the same sort of thing. Definitely a lot of software-defined networking happening.
I will also say, personally, networking is interesting. I have several friends who have Cisco certifications or are networking guys, and I've worked with network engineers. We get along on the fundamentals and the protocols, but it's just interesting: a lot of the professional network engineers are applying practices with very proprietary equipment, very locked down, following specifications made and implemented by those vendors. It's very different in the Linux world, where networking really has that spirit of, you know, maybe you won't get quite the throughput, and you're not running a giant core router or anything, but the amount of flexibility and the things you can do with the Linux networking stack continually impress me.
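To give a flavor of that flexibility, here's a minimal sketch. It is not Cumulus's actual tooling, just the stock iproute2 and sysctl commands any sysadmin already knows, driven from Python; the interface names and addresses are placeholders, and it needs root.

```python
# Hedged sketch: address a port, add a static route, and turn on forwarding
# using ordinary Linux tools (run as root; names and addresses are placeholders).
import subprocess

def sh(*cmd):
    """Run one command and fail loudly if it errors."""
    subprocess.run(cmd, check=True)

sh("ip", "addr", "add", "10.0.0.1/24", "dev", "eth1")          # address the interface
sh("ip", "link", "set", "eth1", "up")                          # bring it up
sh("ip", "route", "add", "10.1.0.0/24", "via", "10.0.0.254")   # add a static route
sh("sysctl", "-w", "net.ipv4.ip_forward=1")                    # forward packets like a router
```

The pitch is that the same person who writes this kind of automation for servers can manage the switches too.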
You know, it's interesting, too, because they've literally proven it. And I mean, I'm telling you, this is not cheap equipment, right? You're looking at $20,000 plus.
Totally.
Yeah. So, I mean, they're selling these things right and left, and they're hugely popular. And I think the reason is because, in the long run, if you're looking at large scale, your costs are really going down, not up. And I think that alone is hugely advantageous.
Now, I don't mean to sidetrack the conversation, Wes, but do you have
any networking certifications? I do not have any certifications now.
Have you ever gone to any of the training? I've had some informal training from network
engineers, yeah. So I have sat and passed some of the Cisco exams, right? And I made an offhand comment on LAS, and I've actually made it more than once, that the certifications for networking are kind of a joke. And I didn't mean to say that network engineers are not talented, or that it's not difficult to pass the test, or anything like that.
Totally.
But what I am saying, and I stand by this, and after I explained it via an email, a guy wrote back and he's like, oh, man, you brought back some bad memories.
You know, I thought, when I went to sit down... I was actually certified in Red Hat first, and then I went back and did my Cisco stuff, because at the time I was focusing on system administration; then I went and branched out with my own company, this and the other. When I went to sit for the Cisco test, I expected to sit down in front of a Cisco router, and they would give me a computer to SSH or telnet or use a console connection or whatever, and give me a series of tasks to configure the switch or router or whatever.
Right.
Turns out what they actually have is a little Java applet that emulates the router, but
only the commands that they think that you should have.
So right off the bat, you know, you get used to doing certain things, especially if you're
a Cisco guy.
One of the first things that you'll do when you sit down is type show, space, config, enter, and then it spits out exactly what the router is doing right now. Or show int; you know, you can put in non-ambiguous terms. So the full command would be something like show interfaces brief.
But I've never actually typed that out.
In fact, I couldn't even tell you. Right, why would you?
I couldn't even tell you if it's interface or interfaces.
I don't remember.
What I do know is I type show space int space br enter, and it shows me all of the interfaces that are there.
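If you're wondering how that non-ambiguous abbreviation trick works, here's a toy Python model of it. It is not Cisco's code and the command list is made up, but it shows the idea: each word only needs enough characters to match exactly one candidate.

```python
# Toy model of IOS-style command abbreviation (not Cisco code; commands are made up).
COMMANDS = [
    ["show", "interfaces", "brief"],
    ["show", "ip", "route"],
    ["show", "running-config"],
    ["configure", "terminal"],
]

def expand(abbreviated):
    """Expand e.g. 'sh int br' to the one full command it unambiguously matches."""
    candidates = COMMANDS
    expanded = []
    for i, word in enumerate(abbreviated.split()):
        # Keep only the i-th tokens that start with this abbreviation.
        matches = {cmd[i] for cmd in candidates if len(cmd) > i and cmd[i].startswith(word)}
        if len(matches) != 1:
            raise ValueError(f"{word!r} is ambiguous or unknown: {sorted(matches)}")
        token = matches.pop()
        expanded.append(token)
        candidates = [cmd for cmd in candidates if len(cmd) > i and cmd[i] == token]
    return " ".join(expanded)

print(expand("sh int br"))  # show interfaces brief
print(expand("conf t"))     # configure terminal
```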
Well, a lot of this stuff isn't working in this stupid Java thing, and it is throwing me off because I am painfully aware that I'm not actually onto a router.
I'm on this little
emulation thing that doesn't feel like a router and nothing feels right to me.
It's not testing your ability on a real router.
No, it's really not. Aside from the fact that it's super delayed and it's not in Guake and I can't
draw, it's just everything is off to me and I'm already feeling out of my comfort zone.
And then I get past the little simulation things, and they go on to these questions where they ask you to calculate subnets. Now, I would figure, if I was writing the test, that if I wrote four subnet questions and the person can answer all four of them, then they can subnet; and if they can't, then they can't subnet, and that should be the end of it. Turns out they want you to do some absurd thing, like 30 of these questions. I'm not very good at math, so I have to do all the addition basically by hand, and they give you, like, an eight-and-a-half-by-eleven transparency thing and an erasable marker.
Oh man.
Yeah, but you can't erase it, because they don't really give you anything; they give you like a Kleenex, but they don't give you an eraser.
So you didn't wear your best clothes there?
No kidding, dude. I'm like spitting on the thing because there's no water, and I use up my Kleenex in like the first 30 seconds, and then the rest of it is all over my arm and I'm trying to find places.
It's just a mess, right?
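For what it's worth, the arithmetic behind those questions is easy to sanity-check with Python's standard ipaddress module; the addresses below are just made-up examples.

```python
# Worked subnet example using only the standard library (addresses are arbitrary).
import ipaddress

net = ipaddress.ip_network("192.168.10.0/26")
print(net.netmask)             # 255.255.255.192
print(net.broadcast_address)   # 192.168.10.63
print(net.num_addresses - 2)   # 62 usable hosts (network and broadcast excluded)

# Carve that /26 into four /28s.
for sub in net.subnets(new_prefix=28):
    print(sub)
```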
Well, that was in stark contrast to doing my Red Hat cert, which was you sit down at
a Linux box, they give you 25 things to do and you do them and you restart the computer.
And if the computer does all of the things, then you pass.
And if the computer doesn't do all the things and you don't pass, it's a really straightforward
thing.
So I am looking at my certifications coming up for renewal, and one of the big things I was concerned about was Red Hat 7, because we've made this big change, we've gone from RHEL 6 to RHEL 7, and I have to go sit for this test. And so I was a little concerned, because things have changed and I know I'm not necessarily in the swing of it, because I'm no longer focusing on, you know, knowing everything. When I did the 6 test, I was focusing on learning all the things for 6. Now I'm focusing on learning all the things I need to do to get my job done.
Well, it turns out Linux Academy was able to help me out there.
I went over to Linux Academy and signed up for an account by going to Linux Academy dot com slash unplugged.
Got a huge discount and actually did all of my REL 7 training there.
So I think that is awesome.
I feel perfectly confident.
I have done it both ways, guys. I have paid the, like, three-thousand-dollar class fee, and then you go for a week. And there is an advantage to that, I won't lie. There is an advantage to sitting in a classroom and being surrounded by other people that want to learn Linux and spending nine hours a day, you know, with an hour break.
Right, it's like a boot camp.
Yeah. No real coffee; you know, you have some coffee, but it's cold. And sometimes the smells aren't great, because you're sitting in front of other IT people, and it's somebody else's computer, and it's their uncomfortable chair sometimes. And they frown upon you bringing your own laptop, which means you have to do things at their pace.
But there is some advantage to that.
But the bigger advantage is going to Linux Academy dot com slash unplugged and signing up for an account.
It'll cost you pennies on the dollar, literally like one one-hundredth of the cost.
About that, actually.
And then, on top of that, you decide when you want to train. So if I work best between the hours of, like, 10 p.m. and three in the morning, that's my power zone; I can get almost anything done in those hours, partly because my phone isn't constantly interrupting me, but also because I'm the most awake and the most focused right about that period of time. Well, Linux Academy works great for me with that. I curl up in bed, I take my laptop, I take a course; did Red Hat 7, also did Python.
You can learn just a ton of things
on these e-learning sites.
And the other thing is,
and I talked about this on the Linux Action Show,
these instructors are not your average,
these aren't the people that are here for money.
They're not the people that they needed a job
and they couldn't actually administrate Linux,
so they decided to teach it.
You wind up with some of those people sometimes.
These are people that are genuinely passionate about Linux
and teaching other people about Linux.
And I said it on the Linux Action Show.
I'll say it again on Linux Unplugged.
If you're not making $70,000,
but you understand what is going on in this show,
then you need a new job. And you
need to go over to linuxacademy.com slash unplug, sign up for an account, show your support for the
Linux Action Show, and pick yourself up a new skill. Get a certification. The Red Hat certification
to take the test is like $200. To take the class is like $3,000.
That's a crazy difference.
It is. It is. And so if you can do the learning part of it, if you can actually pick up the learning over at Linux Academy for pennies on the dollar and then go take the test for 200 bucks, you'll wind up in the same boat that I am in.
I think if you're one of those people that are working at like a help desk for $30,000, $35,000 a year, you could double your income in just a couple of months by going to linuxacademy.com slash unplug. They'll hook you
up. And a couple of people after saying similar things on the Linux Action Show have written in
and asked me some small business questions and hiring questions. By all means, do that. Go over
to jupiterbroadcasting.com, click on the contact link and choose Linux Unplugged from the drop-down
menu. I'd be more than happy to address any specific questions. Of course, you can always
email me, noah at jupiterbroadcasting.com.
I have some experience in that area, and I'm more than happy to share it with you.
So, Wes, I want to talk about, let's see here, LocalWire.
Let's see here.
I'm sorry.
Where's the next story?
I got lost.
Set up a core on Ubuntu, is that right?
The LocalWire one?
What?
Yep.
Okay, okay, good.
Okay, localwire.pl headline,
Set up .NET Core on Ubuntu.
And then subtitle, Microsoft loves Linux.
.NET Core 1.0 is here, and it's great. And it's a great opportunity to start playing with it, not only on the Windows platform but also on Linux. Today, I will show you not only how to set up .NET Core, but how to set up the whole development environment for developing .NET. Since one of Microsoft's goals was multi-platform support, let's take a look at how to run .NET Core on Ubuntu.
Why Ubuntu? Well, it's popular and easy, and doesn't work right. I added that last part. Also, Microsoft used it as the platform of choice for their Bash and Docker support for Windows, so the decision seemed quite straightforward to me. We'll be working with Ubuntu 14.04 LTS; good choice, as it's the most widespread version at this time. You can also try out other platforms, as I did, specifically Red Hat Enterprise Linux, which is totally free at developers.redhat.com for developers. They also support .NET on RHEL as part of the .NET Foundation. Quite rock-solid backing, isn't it?
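To give a sense of the workflow the article is describing once the runtime is installed, here's a hedged sketch that just drives the dotnet CLI from Python. It assumes dotnet is already on your PATH, and the subcommands are roughly as they looked around .NET Core 1.0, so check the current docs.

```python
# Hedged sketch: scaffold, restore, and run a .NET Core console app by calling
# the dotnet CLI (assumes .NET Core is installed per the article's setup).
import pathlib
import subprocess

project = pathlib.Path("hello-dotnet")
project.mkdir(exist_ok=True)

for cmd in (["dotnet", "new"],      # scaffold a console project
            ["dotnet", "restore"],  # pull NuGet dependencies
            ["dotnet", "run"]):     # compile and run it
    subprocess.run(cmd, cwd=project, check=True)
```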
In fact, he misses it, but I'll inform you that you can actually do the same thing on CentOS. In fact, that's what I would actually recommend, because they're not going to cut you off. RHEL is really for if you've got money and you want to pay for it.
And you really need that support.
Yeah, exactly. That's really what you're paying for. I'm a big fan of CentOS, whether it's for me personally or even if it is a professional production environment. If it's not a large company that depends on support, I'll oftentimes stick them on CentOS, just because you get basically the same thing for less money. But this is super exciting, especially because it seems like there is a lot of disagreement on exactly what .NET Core on Linux meant.
Exactly.
And the big thing was, I have directly been told by people much smarter than me that you could not develop .NET on Linux; it had to be done on Windows, and this specifically says otherwise. So I think there's a lot of FUD going around.
No, it used to be true. The .NET Core thing is pretty much brand new, but there are certain things you can do on Linux and certain things you can't. Just like you can't use Visual Basic on Linux, that kind of thing.
Okay, I guess I can talk a little bit about this, as I've looked into actually using it on Linux, because there was a need from a customer that uses Windows and I just wanted to write it on Linux and deploy it there.
I see.
Essentially, what it comes down to is that a lot of the things you would rely on to have your .NET application built and shipped on Windows, which is kind of the appeal, you wrote it in your Linux environment and you ship it there, are actually non-existent. So WPF, the Windows-specific stuff, is not available. You can't even call it with dummy functions that would allow you to then deploy and test, so you can't really do that. Development environments on Linux for .NET are really not curated, so that experience also sucks. And it's a language that benefits a lot from the existing code base of the .NET Framework; without that, it becomes kind of useless most of the time. You're better off going with C++ and Qt or something like that for making your application at that point.
Essentially, there's also this other thing that is not exactly helping its adoption on the Linux platform, which is that everybody has seen what Oracle has tried to do to Google in terms of APIs and copyrights. Even though Google won on fair use, it was fair use in Google's case; it doesn't guarantee your case is going to be fair use, and you're a little bit scared of that. And let's face it, Microsoft is also a proprietary software company. They've kept things closed for the longest time, and they're famous for copyrighting and patenting software. So if you're scared about litigation, you might not use it. In that sense, we already have a curated environment around other languages that provide us those features. And then you say, but at least with .NET, I can develop for mobile platforms. Well, you have to pay about $5,000 to have a Xamarin account, which is the only development environment for that, and it's based on MonoDevelop. It is supposed to support the iPhone and Android. So then you look at your options and you say, just no. Essentially, this is why I think it hasn't been the explosion it was intended to be. Basically, you lack a consistent environment to work in on the platform, so it's useless for most people, I would claim.
Yeah, there are definitely a lot of usability limitations. I'm just kind of curious, because I know there are a lot of people who grew up with these languages; C# is very popular, some people really like F#, and I wonder, for certain server-side things, if there are people who were kind of bound to Windows and comfortable in those environments, were okay with a minimal environment, but still wanted to use the core of those languages, whether they could find a use case here.
Essentially, they lack the tooling to support them in their use case. They probably could do the switch, but they don't have Visual Studio as they're used to; if they had that, they probably could do it. MonoDevelop is very good, it's the best choice you have when it comes to C# on Linux, but it still mostly sucks, honestly. Sorry.
Point well taken. You know, I guess I'm watching this with interest.
From all angles, the world is getting to be a smaller place. Now we're developing for containers rather than targeting specific distros. I just talked with Michael Dominick on Monday's Coder Radio, great show, you should check it out, and he was talking about how he pushes clients to develop for HTML5, because they can move from device to device and it becomes platform agnostic. And now we're looking at, you know, Bash coming to Windows, and .NET on Linux. And it's like...
It's very confusing.
I don't know if confusing is the right word. Or I guess I shouldn't say the right word or wrong word, but it's not the word I would choose.
But it just seems like the walls are falling down, and we're all going to look up and realize that we're all standing in the same room with the same things, regardless of what operating system we're using. That couldn't be exemplified more. Sorry, go ahead.
No, that same thing will be Linux. I've already made this prediction on Unplugged before: Microsoft will use the Linux kernel. That's why Bash on Windows exists. That's why they bought a lot of Android developers, Android companies, basically to get the developers. And they will move into having a proprietary desktop on top of Linux. People won't even notice; they will just suddenly, swiftly change what's underneath, and since every program is already running on Linux, Linux will be what we've seen as the Tron operating system.
Dude, you want to
know what I would do? Would I use it? I would use it, exactly. I would use it if tomorrow, I tell you what, if tomorrow Microsoft came out and said, you know what, Windows is terrible, we have destroyed the Windows name, we have destroyed the Windows brand, and we are sick of Windows, and Windows 10 is a total failure because it sucks just a little bit less than 8, which sucked just a little bit less than Vista, which sucked just a little bit more than XP. If that happened tomorrow, and Microsoft says, okay, we're done with Windows, we're going to do the next thing, called Microsoft Next OS, which is our next version of Windows into the future and stuff like that.
And basically, because we have done all of this stuff and we're trying to drag these developers off their Macs and onto Windows, we have implemented Bash, and there's no point in maintaining the NT kernel anymore, because now we could just let the Linux community, which we're already relying on for the Bash stuff and basically everything at the developer level, handle it; all of that stuff we're pulling in from the open source community. And hey, we love Linux, why not just use the Linux kernel? Okay, so here's our new Windows version, and it's kind of like Windows except it's running a Linux kernel. And we have the traditional Windows desktop that you all know and love, with the stupid, idiotic Start menu, which, by the way, we patented, because Start is such a unique and signifying word. We're going to take that UI and stick it on top of Linux, and now all of you developers don't have a reason to go anywhere else. Oh, and by the way, because we're Microsoft and we basically dominate the PC market space, everyone will now develop for Linux, because we have the Linux kernel. If that happened tomorrow, when I buy a new computer, I wouldn't bother reinstalling Linux. I'd just use what's on there.
Because frankly, even if it's a proprietary desktop, am I super excited about it? No. But if it saves me the trouble of having to reinstall the operating system, if I get support from the OEM, and I can run all of my software, because all I have to do is install the snap packages on top of the existing Ubuntu base, which is probably what they're basing it off of anyway, do I really care if it's a proprietary desktop? I can live with that. I think that would be amazing, and I wouldn't take back, but I would significantly reduce, all of the Microsoft bashing I've done over the years. That would be like Noah's dream come true, because instantaneously, America switches to Linux, basically overnight, if that happened.
And I realize I'm pipe dreaming.
You know, no need to send me an email
and tell me how crazy I am.
I know.
What?
It's not going to happen.
Two and a half years have passed since my prediction. In those two and a half years, Bash on Windows happened. A Linux distribution on the server side happened. They bought a lot of Android companies. Just mark my words. It's only two and a half years since my prediction; my total prediction was five years. I said this on the show, and I'm saying it again. I stick to my word.
Do you know how happy I am? I don't even know. I will do the show naked. I will show up and do the Linux Action Show naked if that ever happens.
Please do not.
I will. The camera only goes waist up. I don't know what I would do. I'd be so happy, words cannot describe. I don't know what I'd do; I'd do something crazy, that's how excited I would be. It'll never... I don't believe it's going to happen. I hope so.
Who did that? Whoever did that should be shot.
No, I don't think so.
I'm really not.
I hope you're right.
I so hope you're right.
I have a hard time buying into it.
But our next story kind of exemplifies why you might actually be right.
Turns out you can run i3 on Windows.
So it's not exactly what we're just talking about because it's still the NT kernel, but we're running an open source desktop on it.
Wes, tell me about this.
Yeah, you know, this Brian Kettleson here,
you know, he kind of goes out
and kind of talks about why,
you know, he feels macOS is slipping.
He really isn't quite comfortable enough
on the Linux desktop.
I know you and I would both have a lot of things
to say there about improvements, you know,
but for him, he really appreciates it.
He deploys to the Linux environment server side.
He appreciates, you know, the niceties of the Linux desktop, but he also has certain things he needs for Windows.
A lot of people embedded in, you know, larger enterprises, they're hooked on Outlook or have compatibility issues with other things.
And again, I'm sure we wouldn't say like, hey, you know, you can probably just virtualize that.
But he explored running i3 natively, well, semi-natively, I guess, but natively through Windows.
I'd seen people do this; you know, I've played a little bit with the Bash shell, and I know you've played a little bit, but I've not played at all with getting X11 stuff working. So he uses MobaXterm, and he was able to get the settings right to have the X server export working.
He configures the Ubuntu Bash setup.
He runs i3, and he gets it running.
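Here's a rough sketch of the moving parts, assuming an X server such as MobaXterm is already listening on the Windows side and i3 is installed inside the Ubuntu Bash environment; the display address is an assumption, so use whatever your X server actually reports.

```python
# Hedged sketch: point X11 clients at the Windows-side X server and launch i3.
# Run this from inside the Ubuntu Bash (WSL) shell.
import os
import subprocess

os.environ.setdefault("DISPLAY", "localhost:0")  # assumption: MobaXterm listening on display :0
subprocess.run(["i3"], check=True)               # anything i3 spawns renders through that X server
```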
I don't know. I really want to try it out. I don't think it would be the solution for me,
but I can think of a few devs I know that, you know, they're more comfortable in Windows or
Windows with their default or they're playing games. I don't know how I feel about it,
but it's interesting how far we can go to really have like a mini non-virtualized Linux world.
I think we're definitely making progress and I think that we're going in a very good direction. And the exciting thing about some of this stuff is,
I don't think any of us would organically trust Microsoft's motives. But I think where I get
super encouraged and super giddy and super happy is Microsoft, voluntarily or involuntarily,
is losing control over a large aspect of this. For example, they can choose to implement Bash.
They don't really have a ton of control after that.
The developers are going to expect the same environment inside of Bash on Windows as they would in Bash on Ubuntu.
And as has been exemplified numerous times,
if that isn't the case,
they'll simply log into a VPS
and do all the work in the
VPS, and it makes it less valuable. Go ahead, Dar.
But there could be a reversal, because what matters is who sets the expectation, and whoever has the most users sets the trend, right? I'm not trying to doom the vision I just described, that they will transition; I'm just trying to be transparent here. Microsoft originally was a company that provided software for other operating systems. They became the platform because their apps were the most adopted on those systems, and they developed an ecosystem that supported development.
How do you arrive at that?
Microsoft started basically from the get-go as an operating system company. I mean, they competed at the fundamental level: they provided an operating system to IBM, who didn't have a way to run their computers, and so they had software that ran, and had to run, on their operating system. It was like that from day one.
Yes and no, because one of the biggest departments of Microsoft that made money was the one selling to Apple. They were developing software for Apple. But what developers look at is, where are the people creating things that users want, where they can make money?
That's why people move gradually to Microsoft.
It's not because Apple stopped being a good platform.
It's because they saw that this company not only was able to develop software,
but had an entire developer-friendly ecosystem.
Something I've criticized our ecosystem a lot for not having, which is: you go to MSDN for Microsoft, you go to developers.apple.com for Apple. We barely have something similar that provides a toolchain where you can know, this is how you develop for Ubuntu. Only now are we looking at it. The quicker we have that, the better and the easier it is to fix.
But Dara, even…
Go ahead.
They become a company that provides these awesome tools and demonstrates by building tools themselves, acquiring tools themselves.
Now they become attractive.
They set the standard again.
The day that Snap came out, when we were going to do Snap on the show, I mentioned it to a couple of different developers, and you might have even been in the room when I had this discussion; there was instant pushback. And I'm hearing, well, I don't think this is a good way to go, a prescribed toolchain, because the great thing about Linux is that every distro has its own way of getting packages in, and that enables us to have much more flexible and much more distro-aware developers; now they're going to dumb all of the software down and not take advantage of the environment and all of the environment variables, because they have to go with the lowest common denominator. I guess what I'm getting at is that, yes, universal apps are a great thing. Yes, we don't have some of the stuff that is core functionality in Microsoft Windows and Mac OS X. But at the same time, there are a lot of people, the primary users of that operating system, that I think really like those differences.
And to some degree, this whole universal app thing is an answer to a problem that only people really new to Linux have an issue with. I have not had an issue getting software for my computer in a very long time. Chris has not had an issue being able to get software on his computer for a very long time. Now, he and I struggle with different aspects of that.
That is true, right.
So it's a two-sided coin, though. On my side of the coin, yes, I have to go to a PPA, and yes, sometimes the software packages are slightly less up to date. But on his side of the coin, he can download the latest version of TeamViewer, but it might not run, and he can download the latest version of Lightworks, and it may not run. And so he has the software availability, but because of the dynamic nature, he can't necessarily guarantee that every piece of software works, unless, of course, we're talking about GNU, open source, all that.
The standard.
Yeah, right.
And this is where –
The standard for somebody who is only using open source apps, yes.
Yes, true.
But you back away from that and you say, well, the standard in the IT industry for remote controlling somebody's desktop, what piece of software can I be almost guaranteed that they're going to have on their computer that I could log into?
Well, TeamViewer is a pretty good chance that that's going to happen.
And we don't use that for work.
I have a different program that, if we weren't going to talk about it on Sunday, I would talk about it here. But totally, yeah, we use it all the time. I use it all the time for personal things, where somebody calls in and says, hey, you know, grandma's computer doesn't work, my mother's computer doesn't work. Those things are excellent examples of where TeamViewer works. But, coming full circle, I think, I don't know, we'll leave it there: there are different advantages to how you can get that software and how you get that stuff to work. Anyway, Wes.
Unless it doesn't work on Linux at all, and then
you're screwed, right? Yeah, well, that's, for example, someone who needs something for their job, and it doesn't work on Linux at all, and they have to go through this workaround of using a virtual machine or something like that. So the solution, more importantly, is bringing things in; companies and developers don't want to mess with an ecosystem that is incredibly convoluted. But if someone is already in the ecosystem, yeah, it doesn't feel that convoluted.
Except for when you find an application and you're like, cool, they released it as a deb, but I use Fedora, so I use RPMs. I can't use this deb, but they don't make anything other than a deb. Or what about someone who uses Debian, and they provided a deb specifically for Ubuntu? There are so many reasons why our current ecosystem is not a good idea.
As far as the core operating system, it's fantastic.
But the application layer, no.
So Ryan, you're saying we should all standardize on snaps running on GNU slash K Windows?
Yeah, maybe.
I don't care, really.
I want the standard application layer.
I don't care what the distro decides as the core fundamental format they want to use, as long as I can have an application that will actually run on my system, and, as a developer, I can release an app without having to release four apps for the same distribution based on the different versions of that distribution, or having to release technically 20 packages just to support Linux as a whole. That's absurd. If I could release a Flatpak, a snap, and an AppImage, and that's it, I would be happy to do that. That is at least 10 times better than what it is right now.
Or, the more important thing is, when they get the packages from a repo, the maintainer decides, I'll update it in a year.
Let's just have the crazy idea: what about you treat each distribution as its own operating system and give them the goddamn freedom to innovate and be different as they wish?
Yeah.
Honestly, honestly, if people didn't do different things,
we wouldn't have the diversity or the benefits we got.
Some things got adopted cross-wide.
They took their time.
Fair enough.
Well, we'll leave it there, guys.
I really appreciate you guys being here.
Thanks so much.
It's been a lot of fun hanging out with you and chatting about Linux.
That'll put this episode of Linux Unplugged in the books.
A special thanks to our Mumble room, who takes their time every week to be here.
Michael Tunnell, our producer.
Rekhi, our video editor,
and of course the entire team at Jupiter Broadcasting
for making this show possible.
Hey, if you have any questions, comments, or concerns,
head over to jupiterbroadcasting.com,
click on the contact link,
and choose Linux Unplugged from the drop-down menu.
While you're there, don't forget to check out
jupiterbroadcasting.com slash calendar to stay up to date on current showtimes. We'll see you right back here next week. Thank you.
So now, if you try tiling... Uh, there's no episode. Yeah, here's the thing: I gotta talk Chris into it. And I have, like, sort of brought it up, kind of, in, like, a passing sort of way.
And there's just, like, I can read people super, super well.
It's either a gift or a curse.
I haven't decided which yet.
It's a curse.
I think so.
But the thing is, like, I can just tell.
It does not tweak him the same way it tweaks me, right?
Yeah, I actually do tiling, but not through a tiling window manager; I just use tiling functionality.
Oh yeah, something that's pretty... I will say, I do sometimes get away with, like, if I'm in Cinnamon or something, it can at least do the quadrants, which will often be enough. Not all the time, but sometimes it's enough.