More code often translates to more features on the ground, though. While I would prefer to run software built on amazing, concise, well-designed code, ultimately none of those criteria factor into my software selection.
If I have a choice between two pieces of software: (1) lots of features, not-too-bad start-up time, a usable UI, and good compatibility with other software I need to use; or (2) missing some important feature, fast start-up, a pretty UI, and poor compatibility, then I (and nearly everyone else) will choose option 1. The best example of this is, obviously, Microsoft Windows.
It comes down to something simple: the only people who care about the quality of the code are those who have to look at it. Everyone else is only interested in the surface layer, and sometimes even bad code doesn't show up on the surface if it's been battle-hardened with enough bug fixes.
Up to a point. But then, interesting work will need to be done with a large graph of small pieces of software, which starts to look a lot like a large piece of software. Delaying complexity is not the same thing as avoiding it; the former can be a reasonable approach, the latter is whistling past the graveyard.
I like their philosophy, and I like the C language (it's a beautiful, minimal thing, although there is definite room for improvement). But making software that sucks less in C is still a hard sell.
I like suckless too, and use some of their tools (surf is a godsend for my dying laptop), but I find it interesting that they linked to the "Duct Tape Programmer" essay at the bottom. I mean, aren't they doing the opposite of duct-tape programming?
"Write beautifully clean and minimal C code" seems to me just as detached from "just ship the damn thing!" as a massively multiple-inheritance, templated C++ architecture, albeit in the opposite direction.
I was wondering exactly the same thing, especially after reading "Most hackers actually don’t care much about code quality. Thus, if they get something working which seems to solve a problem, they stick with it.".
Tell me about it. C is great for clearly defined system-level programs (I do a lot of layer 2 and 3 networking stuff at work), but take the same language into an environment where the customer doesn't really know what he wants, and where the guy before you decided that everything should be done as if you're going to be the next MegaCorp, so you need a lot of fifos, schedulers, DSLs and other fancy stuff everywhere to be as efficient and scalable as possible...
C is great, but it does make foot amputation rather easy for some people.
I write a fair amount of video processing and container parsing software, and I wouldn't even think of doing this in any language but C. But for anything else I do -- which is all strictly higher-level, mind -- the bare metal begins to burn a little.
To me, a "suck less" philosophy means that my environment should:
* Mount flash drives automatically when I plug them in.
* Let users configure the wireless network connection: automatically reconnect with the same settings to different APs on the same network, allow overriding certificate settings, and remember those overrides.
* Let users configure printer settings.
Sure, dwm is great if you spend all of your time in xterm and emacs/vim. But I spend at least 0.02% of my time doing the above tasks, and I don't want to inflate that number.
Some people believe that the fewer lines of code they write, the more elegant their software becomes. Bull. Just like writing more lines doesn't make your code better, writing fewer lines doesn't make it better.
I read the argument as "Gnome versus dwm", not as "Metacity versus dwm".
Every moment of my life that I spend editing fstab, running mount commands, or scripting my system is a moment that I could spend doing anything else. If I install Gnome, it will automount USB drives with no intervention on my part. Call this "low interface complexity." The same thing happens on OS X and Windows.
Now remember that it's not just about automounting. It's about monitor calibration, input devices, keyboard layouts, text conversion software, assistive technologies, wireless network configuration, printer configuration, etc. "Myriad" is a good word here.
If I uninstall Gnome, all of those tasks get shoved into the "unsolved" category, except for one — window management. For each task I have to find an application for it, compare different applications that solve the same task, and configure it. If it's something that has to always run in the background, like automounting, then I have to figure out a way to make it start when I log in. Call this "high interface complexity."
What I'd like to get across is this: optimizing for the number of lines of code is the wrong thing to optimize for. It's wrong (incorrect) to optimize for a large number of lines of code, and it's wrong (incorrect) to optimize for a small number of lines of code. Lines of code is a poor metric for almost everything, with the exception of bug count.
You should always be optimizing for "quality of life". If your idea of a higher quality of life is using a more minimalistic window manager, then what we have is a disagreement of a spiritual nature.
To me, Gnome sucks less than dwm by a mile.
And yes, I still know that Gnome is a desktop environment and dwm is a window manager.
I'm not really confused, just not very fond of cobbling together a usable system out of several dozen different applications, which I must individually install and configure. Gnome fixes that problem.
I have used dwm as a window manager, along with dmenu for application selection. I tend to mix and match their stuff, as I find the convenience of a display manager (gdm/slim) and a graphical file manager (thunar) useful. I'm glad they are around as an alternative!
"Many (open source) hackers are proud if they achieve large amounts of code, because they believe the more lines of code they’ve written, the more progress they have made"
No need, let me explain. Back in the day, men had hair on their chests, programmers wrote their own alloc()/dealloc(), managers believed waterfall would solve all problems, and productivity used to be measured by line count. That notion still lives on in many heads out there.
Bullshit. Back in the day, men killed each other and then ate the corpses. But you wouldn't base an article on the premise that people still widely practice cannibalism, would you?
With good reason, too. The primitive societies you're talking about (and, for the most part, the ideologies that drove them) are dead, so it would be daft to write articles about them. OTOH, many PHBs and their various ideologies are still very much alive (e.g. [0]), so it would make sense to write an article about them.
I think we didn't have computers in the era you're referring to. I was talking about the seventies; I gave you clues about that when I mentioned the memory management functions and waterfall.
Anyway, you seem to have some anger management issues and this is not the place to sort it out. You should look for help.
Many open source developers (I imagine more than 25% but probably less than 50%) are from countries like China, Poland, Brazil, or Russia. We didn't have computers in the seventies. We didn't hear about waterfall except from the books blaming it. We never counted our lines as a metric. It's not in our culture, which is rather young.
So it could rather be about cannibals.
You seem to think that by assaulting me you get out of a stupid statement for free. Guess what, you don't.
Assault is a loaded word for an honest observation based on the needlessly graphic picture you painted. You might not need help to sort out your anger issues, but you certainly need some work on your politeness skills. Here's a tip on why that matters: politeness is appreciated as much as rudeness is abhorred.
Now onto your claims about open source developers from Brazil, Russia, India, Poland, and China. I'm going to ignore the fact that you only backed them up with your imagination, and focus on the interesting part: your assumption. You assumed that all these fellow programmers had no access to books, older programmers, or even computers in the seventies, despite the fact that programmers from these countries have been consistently shipping great software for decades. What really staggers me is that you assume, on behalf of all these people, that they lack culture, just to prove your point.
Do humanity a favor, and go read a book [0]. Or at least try to leave home so you can talk to people, and finally work on your poor social skills.
Why should I be polite to someone who doesn't support his claims with any facts? I'm still waiting for some, and I have yet to see any that aren't based on ancient history.
You don't help your point by stating that some people in non-first-world countries had access to computers. Sometimes they built those computers before programming them. You also have to prove they were exposed to line-counting culture, which you haven't yet. Note: you can quote books, but not just refer to them.
A delusion is a belief held with strong conviction despite superior evidence to the contrary.[1] Unlike hallucinations, delusions are always pathological (the result of an illness or illness process).[1] As a pathology, it is distinct from a belief based on false or incomplete information, confabulation, dogma, illusion, or other effects of perception.[0]
Nobody sets out to write bad code. It is easier not to suck if you can control or choose the scope of your project, but more often than not, the scope is dictated by a real-world set of complex needs and circumstances that evolve throughout the whole life of the project.
In the end, good UI, proper testing and responsive customer support play a bigger role in customer satisfaction than the quality of code.
No, but a lot of people (especially managers trying to get promoted via over-ambitious IT projects) set out to write Big Code, which almost always goes that way.
"In the end, good UI, proper testing and responsive customer support play a bigger role in customer satisfaction than the quality of code."
If you've been in the enterprise long enough, you've learned that some systems are so far gone in code quality that keeping the UI and customer support up to date becomes impossible. Fixing one bug creates two more.