[svlug] Devuan

Ivan Sergio Borgonovo mail at webthatworks.it
Mon Jul 11 07:36:00 PDT 2016

On 07/11/2016 09:34 AM, Rick Moen wrote:

> My usual inclination when discussions seem to be getting lost in the
> clouds of abstract language is to discuss something specific.  So,
> consider a Linux-based Internet server.  Its software presents one attack
> surface to remote agents, and a second and broader attack surface to
> local agents.

> When I learned firewalls and Internet security (primarily from the
> Cheswick & Bellovin text of that name, http://wilyhacker.com/1e/ -- but
> also others), a key principle was that the more software functions are
> accessible with elevated privilege, the greater the risk of critical
> malfunction or security breach.  _And_, incidentally, the critical
> malfunctions need not require actual attack, to result in harm.  Because
> bugs are a thing, without attackers.

> Thus, it is in the interest of the server sysadmin to use the minimally
> featured code that suffices to accomplish the machine functions deemed
> necessary:  Mutatis mutandis, you would thus prefer the software
> alternative for each role that has the smallest-scoped feature set able
> to do the job required.  This is why, for example, SVLUG favours
> Lighttpd (nginx would be good, too) over Apache httpd for its current
> software needs, and NSD rather than BIND9 for authoritative-only DNS
> service.

That's because there are enough people who think lighttpd is worth 
maintaining. lighttpd doesn't live in a vacuum; it has dependencies and 
an environment in which it can work.
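A concrete sketch of the minimal-feature-set idea Rick describes: a 
static-only lighttpd setup needs only a handful of directives, with no 
CGI, proxying, or server-side scripting modules loaded (the document 
root and port below are hypothetical):

```
# Minimal lighttpd.conf for a static site -- no CGI, no proxying,
# no scripting modules loaded, so the code paths exposed to remote
# agents stay small.
server.document-root = "/var/www/example"   # hypothetical path
server.port          = 80
index-file.names     = ( "index.html" )
mimetype.assign      = ( ".html" => "text/html",
                         ".css"  => "text/css",
                         ".png"  => "image/png" )
```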

Were you paid to choose and install lighttpd on the SVLUG website?
The environment where picking lighttpd vs. nginx can earn you a living 
is the kind of environment where managing many small, simple things adds 
up fast enough to make them complicated ;)

Behind the simple thing you chose there is an "economy" that keeps it 
alive, one you are "abstracting" away and taking for granted.
You wouldn't know the difference between lighttpd and Apache if you 
hadn't had the chance to make that difference relevant in some environment.

Is there anyone making a living just by maintaining a *simple* piece of 
software?
No; generally they make a living on services (or by blackmailing users <g>).

Maintaining well-integrated alternatives to a piece of software has a 
cost. It can be low, but it adds up.

You can wait for Debian (or someone else) to find a way to accommodate 
systemd and whatever alternative init makes sense, or you can fork Debian.
Forking Debian over just one package doesn't add enough value to justify 
switching. Then again, you may end up with a specialized distribution 
that does "add some value".

It is no surprise that tools sometimes seem to be getting more 
complicated than they should, while "locally" you have the feeling 
you're "picking the simplest one".
You're just oscillating around the maximum complexity you can handle at 
the level of abstraction you chose to work at, which in turn is 
oscillating around the sweet spot between complexity and added value.

Simplicity is an artifact of looking at things at the right scale at a 
given moment; moving up and down that scale so as to see things as 
"simple" is a technique, not an added value, not an end in itself.
A good tool may be "simple this way" and not "simple that other way" 
because its optimal integration requires it to be simple this way and 
not the other.
Choice requires knowledge and availability, and those add complexity 
somewhere.
Picking a "simpler" tool is the engineering choice of picking the scale 
at which you want to improve a solution. It doesn't make "adding value" 
to your choice any simpler.
Complexity doesn't subtract value from the tools you use; it is not an 
inherently bad characteristic. Complex things are bad when they were 
badly engineered. Progress lies in making things more complex rather 
than simpler.
What is the value of a nearly perfectly secure email server? Surely less 
than that of an organization making money with it.

The downside of reality being sliced into thinner and thinner layers in 
order to be understood is that people find it harder and harder to get 
the "big picture", and the cost of communicating information among 
"layers" keeps getting higher.
Sooner or later evolution will kick in and decide whether:
a) people with power, and animal instincts unfit for the times we are 
living in, will exploit our ignorance and exterminate us
b) we will get a 4 kg brain
c) we will lose obsolete animal behaviors
d) choose your own mix

This half-chewed "unix philosophy" thing, wielded like a club, is plain 
bullshit if you try to stretch it to every scale. But people like dogmas 
and simple recipes, because those are what they can handle most easily.
As an added bonus, people like talking about a glorious past of safe 
accomplishments, one that may not even belong to them.
And still there is a lot of engineering wisdom in the "unix philosophy", 
on which most of the modern software engineering techniques of nearly 40 
years later are based.
The people working at Bell Labs didn't have to wait for Metcalfe's law 
to know what lies at its heart[1], and that's the part of the "unix 
philosophy" that the cult's Unix apprentices tend to forget.

The "oh, this thing is breaking stuff, it goes beyond what it should do, 
you're breaking my universe" reaction is just an overreaction to the 
temporary vacuum left by all these things oscillating and by people 
having to adapt to a different level of abstraction. They can go to a 
lower level of abstraction or a higher one, but finding their new place 
has a cost, and it is not going to "simplify" their lives; it is, 
however, unavoidable.
Then things settle and you have the feeling that you can understand and 
control stuff at the abstraction level you're comfortable with. That's 
just because the stuff has been boxed properly and well engineered.
Defects in the box have been fixed and you really don't care anymore 
about what's inside it. It just works. And if it doesn't, that just 
means the work is not finished yet.

Captious estimates of cost, or added drama, don't make things simpler.

But at least the Debian developers and the people working on an 
alternative init were actually doing something; they are "finishing the 
work".
The really gratuitous complexity is still coming from the spectators 
complaining and throwing shit at both sides, unable to cope with the 
complexity of a process they never showed themselves willing to handle.
Everything else was excusable engineering misjudgment in a process of 
improvement, or quantum fluctuations.

When a better solution comes along, I bet the people who were just 
complaining will say that if it hadn't been for their shit-throwing 
nothing would have happened, and that they knew something better could 
be done.
Because they are stakeholders and Debian is theirs.

>>> You do not 'manage' that by crowdsourcing it.

>> You do.

>> Because a big thick wall is a single point of failure.
>> Too simple things may not be flexible enough. You make them more
>> complicated, you increase attack surface.
>> If you don't come to compromises people will try to circumvent your
>> defenses etc...
>> Value and complexity go together. You may argue that they may not
>> increase according to the same law, but once you've finished exploring
>> the boundaries, if you have to increase value, you'll have to increase
>> complexity.
>> Deterministic behavior is just one of the many properties you may want
>> from a system.
>> The most current theories say you have to be pretty careful about what
>> you could expect from determinism ;)

> I'm really sorry, but the above is so _very_ abstract that I really have
> no idea what it means in the real world and what connection it has with
> upthread discussion.  Probably me being an irritatingly literal person,
> again.

I do understand the value of examples, not only to make an abstract 
claim clearer but also as a test of the soundness of the argument (not 
as a proof).
Of course, when an abstract argument is complicated by several layers of 
abstraction, getting down the chain by way of examples takes a whole lot 
of work.

>>> Let me tell you a story about mej (Michael E. Jennings).
>> [...]
>> And your point is?
> That _one guy_ beautifully maintained a major Linux distribution,
> unaided, for multiple years.  The _whole_ megillah.  By himself.
> And with very high quality.

But where was the added value?
What were the expectations at that time?
How did they earn their money?

> Then, you missed the point about the mej anecdote completely.

I still find it not particularly relevant, but probably my own point 
wasn't clear either.

[1] The UNIX Programming Environment: "the idea that the power of a 
system comes more from the relationships among programs than from the 
programs themselves"

Ivan Sergio Borgonovo
