I just spotted a number of hits from Ars Technica on my blog. Every so often one of my blog posts gets linked in some online discussion, causing a noticeable spike in my statistics. When that happens, I usually check out the discussion. This was a rare occasion where I actually enjoyed it. It also reminded me directly of a post I made only a few weeks ago: The Pessimist Pamphlet.
You can find this particular discussion here on Ars Technica. In short, it is about a news item on one of Microsoft’s recent patches, namely to the Equation Editor. The remarkable thing here is that they did a direct binary patch, rather than patching the source code and rebuilding the application.
The discussion that ensued seemed to split the crowd into two camps: one camp that was blown away by the fact that you can actually do that, and another camp that had done the same thing on a regular basis. My blog was linked because I have discussed patching binaries on various occasions as well. In this particular case, the Commander Keen 4 patch was brought up (which was done by VileR, not myself).
Anyway, the latter camp seemed to be the ‘old warrior’/oldskool type of software developer, which I could identify with. As such, I could also identify with various statements made in the thread. Some of them closely related to what I said in the aforementioned Pessimist Pamphlet. I will pick out a few relevant quotes:
(In response to someone mentioning various currently popular processes/’best practices’ such as unit tests, removing any compiler warnings etc):
I know people who do all this and still produce shitty code, as in it doesn’t do what it’s supposed to do, or there are some holes that users can exploit, etc. There’s no easy answer to it as long as it’s a human that is producing the code.
I have said virtually the same thing in another discussion the other day:
That has always been my objection against “unit-test everything”.
If you ask me, that idea is mainly propagated by people who aren’t mathematically inclined, so to say.
For very simple stuff, a unit-test may work. For complicated calculations, algorithms etc, the difficulty is in finding every single corner-case and making tests for those. Sometimes there are too many corner-cases for this to be a realistic option to begin with. So you may have written a few unit-tests, but how much of the problem do they really cover? And do they even cover the relevant areas in the first place?
I think in practice unit-tests give you a false sense of security: the unit-tests that people write are generally the trivial ones that test things that people understand anyway, and will not generally go wrong (or are trivial to debug when they do). It’s often the unit-tests that people don’t write, where the real problems are.
(People who actually had an academic education in computer science should be familiar with both mathematics and the studies in trying to formally prove correctness of software. And it indeed is a science.)
On to the next:
What you consider “duh” practices are learned. Learned through the trials and efforts of our elders. 20 years from now, a whole generation of developers will wonder why we didn’t do baby-simple stuff like pointing hostile AIs at all our code for vulnerability testing. You know, a thing that doesn’t exist yet.
This touches on my Pessimist Pamphlet, and why something like Agile development came into existence in the first place. Knowing where something came from and why is very important.
The one process that I routinely use is coding standards. Yes, including testing for length required before allocating the memory and for verifying that the allocation worked.
The huge process heavy solutions suck. They block innovation, slow development and still provide plenty of solutions for the untrained to believe their work is perfect – because The Holiest of Processes proclaims it to be so.
Try getting somewhat solid requirements first. That and a coding standard solves nearly every bug I’ve ever seen. The others, quite honestly, were compiler issues or bad documentation.
Another very important point: ‘best practices’ often don’t really work out in reality, because they tend to be very resource-heavy, and the bean counters want you to cut corners. The only thing that REALLY gives you better code quality is having humans write better code. Which is not done with silly rules like ‘unit tests’ or ‘don’t allow compiler warnings’, but by having a proper understanding of what your code is supposed to do, and how you can achieve it. Again, as the Pessimist Pamphlet says: make sure that you know what you’re doing. Ask experienced people for their input and guidance; get trained.
Another one that may be overlooked often:
There’s also the problem that dodgy hacks today are generally responses to the sins of the past.
“Be meticulous and do it right” isn’t fun advice; but it’s advice you can heed; and probably should.
“Make whoever was on the project five years ago be meticulous and do it right” is advice that people would generally desperately like to heed; but the flow of time simply doesn’t work that way; and unless you can afford to just burn down everything and rewrite, meticulous good practice takes years to either gradually refactor or simply age out the various sins of the past.
Even if you have implemented all sorts of modern processes today, you will inevitably run into older/legacy code, which wasn’t quite up to today’s standards, but which your system still relies on.
And this one:
You can write shit in any language, using any process.
Pair programming DOES tend to pull the weaker programmer up, at least at first, but a weird dynamic in a pair can trigger insane shit-fails (and associated management headaches).
There’s no silver bullet.
Exactly: no silver bullet.
The next one is something that I have also run into various times, sadly… poor management of the development process:
Unfortunately in the real world, project due dates are the first thing set, then the solution and design are hammered out.
I’m coding on a new project that we kicked off this week that is already “red” because the requirements were two months behind schedule, but the due date hasn’t moved.
And the reply to that:
It’s sadly commonplace for software project to allot zero time for actual code implementation. It’s at the bottom of the development stack, and every level above it looks at the predetermined deadline and assumes, “well, that’s how long I’VE got to get MY work done.” It’s not unusual for implementation to get the green light and all their design and requirements documents AFTER the original delivery deadline has passed. Meanwhile, all those layers – and I don’t exclude implementation in this regard – are often too busy building their own little walled-off fiefdoms rather than working together as an actual team.
Basically, these are the managers who think they’re all-important: once they have some requirements, they’ll just shove them into a room with developers, and the system will magically come out on the other end. Both Agile development and the No Silver Bullet article try to teach management that software development is a team sport, and that management should work WITH the developers/architects, not against them. As someone once said: Software development is not rocket science. If only it were that simple.
Another interesting one (responding to the notion that machine language and assembly are ‘outdated’ and not a required skill for a modern developer):
The huge difference is that we no longer use punchcards, so learning how punchcards work is mostly a historic curiosity.
And really, most of the skill in reading assembly isn’t the assembly itself. It’s in understanding how computers and OSes actually work, and due to Leaky Abstraction (https://en.wikipedia.org/wiki/Leaky_abstraction), abstractions can often be broken, and you need to look under the curtain. This type of skill is still pretty relevant if you do computer security related work (decompiling programs would be your second nature), or if you do performance-sensitive work like video games or VR, or have hard real-time requirements (needing to understand the output of the compiler to see why your programs are not performing well).
Very true! We still use machine code and assembly language in today’s systems. And every now and then some abstraction WILL leak such details. I have argued that before in this blogpost.
Which brings me to the next one:
We can celebrate the skill involved without celebrating the culture that makes those skills necessary. I’d rather not have to patch binaries either, but I can admire the people who can do it.
A common misunderstanding on the blogpost I mentioned above is that people mistook my list of skills for a list of ‘best practices’. No, I’m not saying you should base all your programming work around these skills. I’m saying that these are concepts you should master to truly understand all important aspects of developing and debugging software.
This is also a good point:
My point is: software engineering back in the days might not have all those fancy tools and “best practises” in place: but it was an art, and required real skills. Software engineering skills, endurance, precision and all that. You had your 8 KB worth of resources and your binary had to fit into that, period.
I am not saying that I want to ditch my code syntax highlighter and auto-completion tools and everything, and sure, I don’t want to write assembler. But I’m just saying: don’t underestimate the work done by “the previous generations”, as all the lessons learned and the tools that we have today are due to them.
If you learnt coding ‘the hard way’ in the past, you had to hone your skills to a very high level to even get working software out of the door. People should still strive for such high levels today, but sadly, most of them don’t seem to.
Just as frustrating is that quite a few developers have this mania for TDD, Clean Architecture, code review processes etc., without really understanding the why. They just repeat the mantras they’ve learnt online and from conference talks by celebrity developers. Then they produce shitty code anyway.
And the response to that:
A thousand times this. Lately I have a contractor giving me grief (in the form of hours spent on code reviews) because his code mill taught him the One True Way Of Coding.. sigh.
As said before: understand the ideas behind the processes. Understanding the processes and the thinking behind them makes you a much better developer, and allows you to apply the processes and ideas in the spirit their initiators meant them, for best effect. And I cannot repeat it often enough: There is no silver bullet! No One True Way Of Coding!
Well, that’s it for now. I can just say that I’m happy to see I’m not quite alone in my thoughts on software development. On some forums you only see younger developers, and they generally all have the same, dare I say, naïve outlook on development. I tend to feel out-of-place there. I mostly discuss programming on vintage/retro-oriented forums these days, since they are generally populated with older people and/or people with a more ‘oldskool’ view on development, and years of hands-on experience. They’ve seen various processes and tools come and go, usually failing to yield much in the way of results. The common factor in quality has always been skilled developers. It is nice to see so many ‘old warriors’ also hanging out on Ars Technica.
And again, I’d like to stress that I’m not saying that new tools or processes are bad. Rather that there’s no silver bullet, no One True Way of Coding. Even with the latest tools and processes, humans can and will find ways to make horrible mistakes (and conversely, even many moons ago, long before current languages, tools and processes had been developed, there were people who wrote some great software as well). Nothing will ever replace experience, skill and just common sense.