Being the same vintage as Jim, much of this brought back memories.
There really wasn't any such thing as a tech stack - our company would produce software for various Unixen and, especially, the VAX, and the language was always C, with some Pascal on the VAX side (as, I was told, the VAX Pascal compiler produced far better code than the C compiler).
Being a software engineer was like being a furniture-maker. You had to master a small number of tools - the saw, the chisel - and then craftsmanship consisted of discovering an affinity with these tools and a love of getting ever more skilled at using them. This path wasn't for everyone, which was fine - plenty of management and other non-tech roles for those folks - but if you grooved with your tools it was a real joy to be able to move from chair- to table- to cabinet-making and carry and refine your skills as you went.
This idyll could not last, however. Before too long it became uneconomical to make furniture this way, so for your chair-making project you instead had to learn how to run a chair-making machine. When you moved on to a tables project, the machines there were infuriatingly dissimilar to the chair-making ones you'd just mastered. Worse still, moving back to chair-making after a year or two, you discover that the old machines you knew are now obsolete and you have to re-learn how to run their replacements. This stops being fun, or interesting, after a while, and yet sadly it's all that most younger software engineers know.
Occasionally, there is the need to make some shim or gizmo that the machines don't cover, so out come the trusty saw and chisel of yore - much to the amazement of the young'uns, who are astonished that an old-timer can still wield these antiques so effectively.
> Worse still, moving back to chair-making after a year or two, you discover that the old machines you knew are now obsolete and you have to re-learn how to run their replacements.
Love your analogy, it's all spot-on.
On the above, I'll add that the new chair-making machine is hardly ever better than the one from two years ago; it's simply all completely different, for the sake of being disruptive to you, its user. Most of the time it's actually worse, with fewer features and more restricted customization.
As a youngster who has had to pick up a saw and chisel to repair an old chair, I think you're looking back with rose-tinted glasses a bit. I've never been astonished by legacy code so much as bemused by the lack of object orientation, data normalization, etc. The technology changes, of course, but there are advancements made in craftsmanship/techniques.
Some 'cleverness' has probably been lost in the new ways vs the old. I've seen some pretty novel approaches to what should be simple tasks in legacy code, and I do recognize that for what it is. But as impressive as that cleverness can be, I think it's one of the reasons why a lot of legacy code persists and nobody wants to touch it. Among many other reasons.
Huge disagree on this. Old code tends to be much more terrible because the tools were terrible. They gave little incentive and absolutely no help to write even passable code. We ended up with a mess of poorly documented, sparingly commented spaghetti code that sometimes works, often by accident.
Nowadays, we have linters to point out common classes of mistakes, programming languages with actual type systems to enforce invariants, integrated testing tools, etc...
Note that, yes, we do still have new, terrible codebases. But the tooling nowadays enforces a minimum floor of quality that, while very low, is still oh so much higher than it used to be.
> Huge disagree on this. Old code tends to be much more terrible because the tools were terrible.
I've looked at 1990s code in the Windows NT kernel. It was wonderful.
Oldest source file I went through was from 1993. Perfectly readable and understandable.
One of the best modern code bases I ever worked in didn't even have a linter set up. The principal dev reviewed every single commit and enforced a consistency across the code base that was better than what any tools ever could have done.
> The principal dev reviewed every single commit and enforced a consistency across the code base that was better than what any tools ever could have done.
How was it so much better that it justified that level of busywork on the part of the senior member of the team (and busywork for everyone else, to fix the style nits they enforced in review)? I would have guessed that taking a few hours to install and configure a formatting linter to free up the principal dev's time to focus on other things would have been hugely high leverage.
It ensured not only consistency of style, but also consistency of ideas. Every file was structured similarly, impedance mismatches were minimized, work across the entire code base was organized and unified. Junior engineers got a chance to talk to the principal developer about every commit they made, and accordingly their abilities as software engineers skyrocketed.
Developers quickly set their IDEs to follow the team's coding guidelines, style wasn't really a problem.
We had "coding standards," and if you didn't code it to the right format, you'd get fussed at. I didn't start professionally until the mid 90s, so I'm a little younger than the OP. I've been coding since the mid 80s though as a youngster.
Putting BEGIN..END around a one-line block was always a point of contention, since it wasn't required: less code vs. arguably more readable code (Pascal).
> The code quality is determined by the author, just like today.
This is nonsense. The worst authors can still produce bad code with the best tools and the best authors can still produce good code with the worst tools, but most authors are somewhere in the middle, and tooling makes a huge difference to the average.
> As a youngster ... so much as bemused by the lack of object orientation.
And as an oldster you'd be happy again to see the lack of object orientation rather than the five-level, misabstracted inheritance hierarchy that someone built not on purpose but just because university taught it as the modern way to automatically end up with sanely structured code... and sure, sure, every tool can be used well or misused, and one should never overgeneralize 8-)
I may be wrong, but youngsters and oldsters both have their own rose-tinted glasses.
I think you might be missing the parent comment's point. The way I read it, they aren't saying that old code is better, just that you generally built everything up from scratch (or to a much greater extent than nowadays) instead of using whatever framework is in fashion this week.
You also have to understand one thing (source: me, a sw developer since late 80s in Europe) - there was no internet, and most of the people working in the field had no CS degree.
This meant that neat new tricks (and stupid old mistakes) were done and (re)discovered all over the place, ALL THE TIME.
You might have decided that data normalization was a good idea, or you might have got the idea from one of your mentors... but you could not know that precisely the same thing was being done/taught on the other side of the street, or even two floors below your office.
Data normalisation was huge in the past because of storage concerns, but it makes less sense now. There are still concerns about inconsistent data and duplication, but often it's more performant not to normalise to the extreme extent we learned to in the 90s with all the normal forms. It's still a good concept, but no longer a strict rule.
The inconsistency can be covered with stored procedures and prepared statements which are also a great way to improve security (SQL injection)
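The prepared-statement point can be seen in a minimal sketch using Python's built-in sqlite3 module (the table, column names, and sample input are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

# Parameterized query: the user input is bound as data, never
# spliced into the SQL text, so injection attempts stay inert.
user_input = "alice' OR '1'='1"
rows = conn.execute(
    "SELECT id, name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the malicious string matches no name

# The unsafe alternative builds SQL by string concatenation, e.g.
# conn.execute(f"SELECT ... WHERE name = '{user_input}'")  # injectable
```

The same binding mechanism is what a prepared statement gives you server-side in a full RDBMS, with the added benefit that the plan is parsed once and reused.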
So IMO excessive normalisation is no longer the be-all and end-all in this day and age, just as an RDBMS is no longer always the way to go. Sometimes object storage is better.
But anyway I'm surprised you saw less normalisation in old code. Personally I saw much more then than now.
I remember when I told my dad about SQL databases. Relational stuff, normalization. Awesome.
He told me that was all fine and dandy but just too slow, this new fangled SQL database stuff. He used databases where you simply accessed rows by key. If you needed to access something by a different key you just made a different table where the same data was arranged by that key instead. Super performant.
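The scheme described above - the same records duplicated into a second structure arranged by a different key - can be sketched with plain dicts (the record layout and field names are invented):

```python
# Two "tables" holding the same employee records, each arranged
# by a different lookup key - no query planner, just direct access.
employees = [
    {"emp_id": 7, "dept": "payroll", "name": "Jones"},
    {"emp_id": 9, "dept": "ops", "name": "Smith"},
]

by_id = {e["emp_id"]: e for e in employees}
by_dept = {}
for e in employees:
    by_dept.setdefault(e["dept"], []).append(e)

# Lookups are cheap hash probes, but every write must now
# update both structures to keep the copies consistent.
print(by_id[9]["name"])                          # Smith
print([e["name"] for e in by_dept["payroll"]])   # ['Jones']
```

The trade-off is exactly the one the parent describes: super-fast reads, with the cost of keeping the duplicated arrangements in sync pushed onto the application.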
Of course my dad also programmed in languages like 370 assembler.
Funny how the young folks today talk about NoSQL databases indeed.
SQL databases were a dumb misstep, it's always baffled me how they ever caught on. In 10 or 20 years we'll look back on them the same way we look on C++/Java-style OO today.
It's not really such a dumb idea actually. It just depends on what you favour. There are plenty of nice things about SQL databases. I suppose it's the regular back and forth between one extreme and another. Both driven by "we need something new to work on" and changing technology landscape.
Relational databases really weren't all that practical back in the day with the hardware that was available. Putting thought up front into defining how you were going to query your data was required, though. And relational algebra wasn't a thing from the start either.
If someone comes along and tells you that you don't have to know all this up-front and you can come up with a query you want to ask about the data you have, you will be able to, isn't that awesome? Ad-hoc, just like that. No need to carefully transform the data you have, ensure you keep it all up-to-date in multiple places etc. Of course data volumes grow and even the newer hardware you have can soon no longer handle what you got in a timeframe that you like. Indexing will be a thing. Of course even indexes grow way too huge to really perform, but hardware to the rescue, where at least all the indexes you need frequently will fit in RAM. Lots of caching going on too for your regular workloads.
Guess where the story is going? Well of course there's the old analytical vs. transactional load thing, i.e. your "Data Warehouse" is a separate database that is optimized for the pre-defined queries again, actually de-normalizing lots of things, being on different hardware, so as not to disturb warm caches for the transactional load etc. And yes, finally NoSQL again, i.e. back to the roots. Put more thought into how you're going to query this as your globe spanning SaaS load won't fit onto the hardware you have available. Of course this brings problems because we're just so good at predicting what kind of query we want to ask about our data. Databases like MongoDB, Cassandra, AWS DocumentDB etc. grow indexes supporting querying arbitrarily ... There's a hole in my bucket dear Liza, dear Liza ... :)
[I'm sure this nice story line is not globally completely correct/adhering to exact timelines but illustrates the point]
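The "ad-hoc query first, then bolt on an index when the data outgrows a scan" step of the story above can be sketched with Python's sqlite3 (the orders schema and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 10.0), (2, "acme", 25.0), (3, "globex", 5.0)],
)

# Ad hoc: nobody had to plan this access path up front.
total = conn.execute(
    "SELECT SUM(total) FROM orders WHERE customer = ?", ("acme",)
).fetchone()[0]
print(total)  # 35.0

# When the scan gets slow, bolt on an index; the query text
# doesn't change at all, only the access path underneath does.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer = ?",
    ("acme",),
).fetchall()
print(plan)  # the plan now mentions idx_orders_customer
```

That decoupling of "what you ask" from "how it's found" is the relational promise; the NoSQL turn described above is essentially giving part of it back in exchange for predictable performance at scale.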
A different index, surely, not a different table? This sounds like an ISAM database to me, where you'd have to do lookups manually one by one, picking the right index for each yourself.
To be honest, I don't really know much about what he told me any more and I can't go back and ask him any longer. It's possible but I can't tell you yes or no for sure. It just evoked the memory of that conversation. Same with the Pick and MUMPS the other reply mentions. Doesn't ring a bell, seems possible though.
A puzzle: chair-making was automated because crafting each chair by hand is expensive. Copying and shipping a piece of software costs nearly nothing, and yet we're still shouldering the absurd complexity of industrial production.
A thing I hear from a lot of software engineers from the 1980-2000 timeframe is how much software engineering - to engineers, at least - was about the craft. It was all about the craft of making good software: people invested real energy into trying to do it right, and treated it as such. This is of course likely colored by experience and "rose-tinted" glasses; however, I think we as an industry have let this go, especially since the rise of bootcamps. It just isn't about the craft anymore, it's all about "hustle". There is some truth to the idea that engineers in that era cared more about the craft: caring about performance, caring about shipping quality, etc.
I like thinking about things as a craft. I don't always like the "hustle" aspect that is pervasive now; it feels like a guarantee of shipping too early and never taking quality and craftsmanship seriously.
I want to work somewhere again that sees engineering software as a craft again.
Oh my goodness, no -- that wasn't it at all. To the contrary, the 1990s sucked for software development: everything was proprietary and (not unrelated) mostly terrible. Worse, the world was moving in the wrong direction, as what had started as a liberating computing movement (personal computing) had turned into a proprietary monster in the form of Microsoft. If you want to know what software development in the 1990s was like, read "Showstopper!"[0]; if you want to know what it was like to try to start a company in the shadow of Microsoft, read "Startup"[1] -- and if you want to know just how shabbily software engineers were thought of, read (actually don't -- it's terrible) Ed Yourdon's "Decline and Fall of the American Programmer."[2]
Anyone telling you that the 1990s were "all about the craft" is pranking you. This may be an uncomfortable truth, but you live in the best time in human history to be a software engineer!
I am 50 years old and I agree. I've never liked programming as much as I do today, because I do less of it and get to collaborate with all the other people also doing less of it. Basically, programming is a terribly difficult way to spend time, and open source lets us reduce this tax on humanity.
> "...as what had started as a liberating computing movement (personal computing) had turned into a proprietary monster in the form of Microsoft."
Given that most of us today work in server-based software, it must be mentioned that the UNIX server/workstation vendors were equally interested in becoming proprietary monsters of their own, with Sun Microsystems at the top of the heap and charging prices that made even Microsoft server products look attractive. They would've been a Microsoft if they could've, and they certainly tried their damnedest to do so.
Also, having to ship the product on physical media and fast internet not being a thing meant that software would actually have a "finished" state. Today nothing is ever finished, everything is in eternal beta and constantly updated for no good reason.
I was a professional developer starting around 1996, so this doesn't apply to earlier timeframes, but there were a lot of people TALKING about software engineering being a craft. In the trenches, there were a lot of people just trying to get things working. You would try to do things "the right way" (cf. how Agile is described vs how it actually works today), but a lot of it was aspirational. We were hacking things together and rushing it out the door back then as well.
I hear this too, but don't forget about context. Like the article states, it was hard to push updates, so there was a grind. Extreme Programming wasn't a thing yet, nor were online updates to software, so there were self-inflicted downsides.
It's also not like everyone forgot at the end of the day there was a business that had to be run, and profits had to be made, to continue working in the craft.
And yet, even these "hacks" were interesting, at least, and to some extent, sometimes quite masterful.
When I listen to folks recalling this period, it's not that they don't admit there were shortcomings - of course there were! It's that they remember everyone being in it together, in the thick of it, and really collaborating, because they cared, even when they had to take shortcuts to meet a deadline.
Maybe that's what it is: the average engineer just doesn't care like that anymore, for better or worse. It's a hard thing to describe exactly, but it sure isn't prevalent now.
And who can blame the average engineer today? Business incentives are often actively hostile to caring in this fashion. Technical excellence is not a goal anymore. I've worked at many places now, and IME all have the same broad-stroke issues around quality, meeting deadlines, product owners owning the engineering work pipeline (definitely not true Agile, where teams are autonomous), etc.
I disagree with you. It's still "about the craft" for probably just as many people as it was back then... but of course now there are way more people, and it's much simpler to enter the field. Now you don't need a college degree to get a job in this business - though just as many people as back then still do get one.
> I want to work somewhere again that sees engineering software as a craft again.
If you find this place, will you tell me? I will come work with you.
I admit though, that I struggle to codify what “coding as a craft” is. And that worries me. I fear I am really just playing a game of “I’ll describe it when I understand it” up-over-that-next-hill-ism with myself that is illusory and delusional.
>I struggle to codify what “coding as a craft” is.
It means different things for different people, but for me, it means:
Step one is getting it to work; step two is making it as simple/clean as possible (most people stop at step one, it seems).
Take time to fix issues / sloppy code when you see it.
Ideally you want the codebase to look like it was all written by one person.
Carefully balance maintainability vs speed or cleverness.
Refactoring is a necessity and should be done periodically. Same with tuning.
Take the time to name your methods/classes/variables as accurately, verbosely and consistently as possible.
Test your shit for bugs before you give it to QA, for the love of everything holy. If QA sends something back, you should feel a little embarrassed.
Take the time to write useful comments. Focus on intent, that way the next person can know if the code is doing what it's supposed to or not.
KISS, DRY, YAGNI.
The best test is when I come back to my own code I wrote 6 months ago and if I say, "who the hell wrote this shit?!" and it's me, I need to clean it up.
You wrote: <<The best test is when I come back to my own code I wrote 6 months ago and if I say, "who the hell wrote this shit?!" and it's me, I need to clean it up.>>
Up vote on this! I do it once a week -- at least. First, I am frustrated by some unreadable logic, then I look at Git blame... "Oh..." Second, fix it... again!
Closest I ever got was working at Apple, and even then it was due to the fact I was on a very specific type of team, so I don't know if it was common case or not.
I hear Google is largely like this, but who knows anymore. I hear mixed things all the time. It feels like you have to find a specialized company that may be immune to certain pressures and likely top of their field or the only company really in their given market sector. Maybe Oxide Computer?
I think Supabase might be like this, given their marketing and hiring materials, but it's hard to say. They're in a "long game" (according to their founder), so they think more long-term. That tends to be a good indicator of this, I think: seeing long-term horizons over quarterly ones.
We're fortunate to have started at a time when VCs were incredibly irrational (read: good for founders), and even more fortunate to have found a lead investor that's patient.
We've been able to build Supabase our own way: hire the right people (mostly international), focus on what developers want (even though they don't pay much), and build open source (collaborating rather than competing). None of these are controversial until you have a profit-maximising VC breathing down your neck.
> They're in a "long game" (according to their founder)
I stand by this. It's a decade-long endeavour to build credibility as a database company and that's not something we can hurry.
Google used to be like this, but now, though some isolated pockets remain, is not like that anymore. A good mental model of how Google internally works today is Microsoft, but with free food.
My time at Microsoft was learning a craft from old guys who were there in "the beginning". I'm so grateful for the hours spent on a whiteboard with a patient sage explaining to me how things work. My code reviews were 10 iterations until I learned to do a better job. Very high bar. Even comments had to be full English sentences. I miss it.
Not sure I agree with your experience at AWS. For me, my teammates have been super critical, always trying to raise the bar and looking for ways to make things better. So, all I’ll say is that it is very team specific and you can find good and bad teams at every company.
Does caring about the quality actually mean the quality is good? Maybe it doesn't even matter. If the team cares and they are satisfied with the quality of what they are building, maybe that's enough?
It means you produce something that works, has some consistency of organization, and most importantly: other developers will be able to come behind you to support and extend it.
> ...I think we as an industry have let this go, especially since the rise of bootcamps. It just isn't about the craft anymore, it's all about "hustle". There is some truth to the idea that engineers in that era cared more about the craft: caring about performance, caring about shipping quality, etc.
Hustle almost always seems like the most direct path to market/economic success. As market imperatives increasingly dominate, other priorities (like "craftsmanship") fall by the wayside.
IMHO, the market system is important and useful, but it's also pernicious in a lot of ways. It needs to be powerfully kept in check, otherwise things become shiny and hollow.
It's a bubble/bust mentality though. I remember a lot of people pivoting from finance (the previous "winner" of high salaries and high-growth opportunities, especially pre-mortgage-crash) to tech; that also corresponded with the rise of this culture, IMO. It was still happening until maybe very recently: MBAs were flooding the tech sector looking for opportunities, founding companies, etc.
Unsurprisingly, this is when "hustle culture" and the explosion of management roles started to take off in earnest.
Craftsmanship isn't a bubble / bust mentality. Good work should stand the test of time, as it were.
>I want to work somewhere again that sees engineering software as a craft again.
I see this as an economic question. Few software products are going to have any real longevity, and few companies are willing to pay for high quality engineering. This isn't to say that the people working on the products aren't good engineers or that the products are garbage, so much as to say "good enough" is good enough for most products and companies.
I appreciate the craft type of working! Within a craft you care about your work and your tools.
The German phrase "Beruf kommt von Berufung" ("profession comes from calling") describes the whole attitude, but all translations to English seem improper, because it is a pun and probably because apprenticeship (vocational education at work and school) isn't as important in the English-speaking world.
> Even if we did see then (and we mostly didn’t) that small, frequent releases were better in a whole bunch of ways, we couldn’t really do it. It was expensive to ship on physical media, and disruptive to our customers to have to do frequent installs.
It's still disruptive to customers. As a software engineer I love CI/CD. It makes the entire process much smoother and more efficient.
As a customer / end-user, though? Nope. I absolutely hate it. Maybe I'm just getting old, but I'm extremely nostalgic for the days when I got to decide if the upgrade was worth it to me or not. These days things will change and move you on without any notice or opt-in whatsoever.
This is why I turned off auto updates on my phone. I do update stuff periodically, and sometimes I'm forced (and certainly I do OS updates when there is a security issue), but I don't want my apps randomly changing out from under me overnight. Rarely these days are the "bug fixes and performance improvements" actually an improvement for me as the user. I especially hate it when a developer says something like "we update the app as often as possible to make it better for you!" No, updating it "as often as possible" is not better for me. It might be better for YOU, but it's just annoying to me.
Sometimes when I play a retro game, I'm completely in awe of programmers of that vintage.
Games took a year or two to make, on massively constrained hardware, in assembly language, with no engines, all original artwork and music, and no bug fix updates after it was complete.
People like to pretend the future is always better and everything is advancing, but surely they had more talent than we do.
I feel like you're cherry-picking. Games have gotten way bigger content-wise and way more intense in many design aspects. Many retro games had massive teams and utilized earlier frameworks and content to release faster.
There are many old school techniques which are amazing and quite a few talented developers, but I wouldn't make a sweeping statement that "they" had more talent than "we" as a whole.
It's funny, though. Maybe it's just my age and not a comment on the quality of games, but I don't even play newer video games anymore. Every once in a while I'll start a game of Civilization IV (from 2005) if I have a lazy weekend and need a change of pace. I played a lot on the PS3, and then my PS4 was a dust collector. Complete buyer's remorse there (and don't get me started on the fact that I needed an Internet connection on first boot just to set it up. WTF?!!?!!!)
And I'll still sit and play Tetris for hours and hours and hours on my retro NES. The newer versions, not so much.
I've had this sense over the decades that as video games have grown in scope and graphics they've gotten less fun to the point where I tell people when they ask me "I don't play video games." It's probably been about ten years since I've paid for one.
> Many retro games had massive teams and utilized earlier frameworks and content to release faster
I think you and the previous commenter have different ideas of what constitutes "retro."
The people I knew who worked on retro games usually were in "teams" of one or two, even if they worked for a massive company like Atari. If there were three people working on a video game, it was a huge deal. And "frameworks" weren't even a thing. Each game started from the ground up, with the exception of code clips that you printed out and saved in a binder.
You'd still need to name specific examples. All the way from the NES to the current gen, you can name examples of both big teams and small teams / individuals having some form of success.
The coding got easier per content unit and the skill floor is lower, yes. But the ceiling is still high, the demands are far higher and you'd need to be far more skilled in various disciplines to deliver the equivalent of a few decades ago. Those hours you gained no longer coding? You better have spent them figuring out how to do VFX, some decent pixel/line art, or make some stellar sound tracks. If not, maybe be able to do marketing or know how to make your game function in multiplayer online.
>And "frameworks" weren't even a thing.
Using virtually the same frame across a wide variety of projects is the very definition of a framework, even if it isn't a software framework in the modern sense. Many big games on the NES and SNES were built off of one another; you can tell by the similarities between developers and their work, Square being the most notorious.
There's an amazing interview series with Nasir Gebelli, who made some of the first Apple II games, coded the first three Final Fantasy games and a 3d game in assembler, as well as Secret of Mana game for SNES. Highly recommended!
> As a customer / end-user, though? Nope. I absolutely hate it. Maybe I'm just getting old, but I'm extremely nostalgic for the days when I got to decide if the upgrade was worth it to me or not. These days things will change and move you on without any notice or opt-in whatsoever.
Yeah, even as someone who works as a developer, it seems that in all but a few cases development organizations have massively more control over how things are done than they should (which usually manifests as a veto by saying X is too hard/expensive). The overall effect is to make software crappy and annoying in ways that are less technical, and thus less likely to be fixed. Shipping on physical media imposed a technical constraint that helped keep that social problem in check.
Yup. In most businesses it seems the path to promotion for a Product Owner is to come up with that killer feature that will garner massive new adoption and launch the company into hyper-growth. This leads to existing end users being the sacrificial guinea pigs in a never-ending experiment that places hypothetical (and almost never realized) future users over the immediate interests of those currently paying for the product.
I love it as a customer. There's a lot of excitement when new changes come out, and I want them right now!
Beyond business software, I can think of the way that indie games are delivered as Early Access releases on Steam. Most recently, I would get excited for every little Valheim update. It extended the life of the game, which probably won't be truly done for many more years. If they were limited to a single boxed physical release, the game would be left with a shorter development cycle limited by the production budget. Now, the game's sales more directly dictate the level of attention the game gets. Concepts are validated earlier, and customers get to provide feedback more frequently, rather than developers spending years building something and hoping that people like it.
That particular developer also did a great job of generally not breaking your world and save game (somewhat unlike similar creative/survival games like Minecraft, where you'd be missing out on a lot of things by not starting fresh for each major update).
Sure, frequent updates can break things...but back in the day, there were still plenty of bugs on formally released products, and then you're stuck with those broken things for months or years waiting for the next version to physically ship to you.
One of the things that makes CI/CD possible in the first place is that testing has become far more automated and sophisticated. No one would tolerate CI/CD if it didn't have that additional level of automated quality.
When I see someone who claims not to like when software gets frequent updates, I ask myself: "Do you even like software? I thought new functionality was supposed to be exciting!" I personally see resistance to change as a generally negative personality trait (obviously, there are limits, not all change is good change), and I think that's why I don't understand the romantic nostalgia for the days when we'd buy a piece of software in a box off of a shelf.
> When I see someone who claims not to like when software gets frequent updates, I ask myself: "Do you even like software?"
I used to. That's why I've dedicated 30 years of my life to making it.
But to be honest, not so much anymore. It's not just about breaking things, it's about CHANGING things. Things you paid for. Things you agreed to purchase in a certain shape and form. Forget video games, I'm talking about every day tools you use and depend on to do your daily job or life functions. Things like online banking - my bank suddenly rolled out a complete UX overhaul of their online system and omfg is it ever worse than it was before... I now need to take multiple steps and clicks to do something I used to be able to do in one step. They also broke the browser's back button etc.
And why does this happen? Not because brilliant teams of seasoned experts sat around the table, did focus groups, and ran market research with existing customers to figure out how to make the product better, but because a lone Product Owner sees a path to promotion if they can figure out that one single killer feature that will win massive adoption from hypothetical new users (hardly ever realized). And so we, the end users, end up as perpetual sacrificial guinea pigs in a never-ending experiment that throws us under the bus because the business is chasing a hyper-growth it is very unlikely to see.
If you're dealing with something like an indie video game, made by an individual or a small team of passionate people who actually care about their existing users and want to make things better for them, then you're going to have a different experience. 99% of the tech industry today is not even in the same universe, let alone ballpark, as that.
To me, disliking greedy corporations and bad management isn't very connected to the method of delivery of products. Frequent and often automatic updates just represent that delivery method.
In contrast to your bank, my bank's website has added numerous improvements and modernizations that have made it easier, more useful, and less frustrating. Perhaps it's just time to switch banks?
There are plenty of examples of badly managed products in the pre-SaaS era. If you bought Windows Me, it shipped as a terrible product and it never got better. As soon as you opened the shrink wrap you had no recourse but to wait for Windows XP to fix those problems.
Grand Theft Auto: Vice City featured a bug where saving at the ice cream factory was very likely to corrupt your save. The PS2 had no ability to apply patches or updates to games, so you were just stuck with the software that way forever.
So, you "agreed to purchase it in a certain shape and form" – but just like with any other product, you never truly know what you're getting until you've made the purchase.
I have no problem with greedy corporations. What I don't like is people changing stuff on me when I didn't opt in to that change. Imagine if someone broke into your house and rearranged all your furniture. That's what it feels like. I don't like not owning my software and not being in control of things I pay for. I don't like not getting to decide if/when I upgrade and what I upgrade to. No one had to upgrade to Windows ME. That's my one and only point.
I don't know why you keep trying to shift the conversation back to video games. Video games are a very different type of software from most and I don't play them or have any interest in them. They're completely irrelevant to the conversation as far as I'm concerned.
Respectfully, I'm not straw-manning nor am I shifting any goal posts. Whether or not you agree with my arguments, I believe I am making them as consistently as I can and in good faith.
I'm using video games as an easy example because it's simple to point to a specific gameplay bug that every player will encounter in those boxed games. Those kinds of experiences aren't as well documented for older productivity software. I understand you don't like video games, but you should be able to follow the concept of my argument; for the purposes of our discussion these games can be considered generic software.
My overarching argument is that the bad things about software stem from its business model, not from its delivery method in isolation.
>But because a lone Product Owner sees a path to promotion if they can figure out that one single killer feature that will get a massive new adoption of hypothetical new users (hardly ever realized).
This++. It's also the young hotshot phenomenon, where they come in and decide they can do everything better than the old timers (the previous hotshots 3-4 years ago), and change everything they can. We've all been there.
I have elderly family members, 85-plus, who can't understand why the UI changes all the time and they have to continually relearn. Explained this way, they get it, but still say, "they need to remember that not everyone wants to relearn everything every couple of years".
Yeah, I get frustrated about this A LOT. "Change for the sake of change", and chasing meaningless KPIs and metrics that involve ZERO value to users and only serve to benefit the vendor (usually at the expense of users, in fact).
> I love it as a customer. There's a lot of excitement when new changes come out, and I want them right now!...
> When I see someone who claims not to like when software gets frequent updates, I ask myself: "Do you even like software? I thought new functionality was supposed to be exciting!" I personally see resistance to change as a generally negative personality trait (obviously, there are limits, not all change is good change), and I think that's why I don't understand the romantic nostalgia for the days when we'd buy a piece of software in a box off of a shelf.
You seem to be thinking about this through the lens of computer games, which is probably not the right one to bring the problem into focus (then you pile on a bunch of personal judgement, which is frankly irritating).
Also, why should anyone "like software" (as a broad category)? That seems like putting the cart before the horse. I use software because it solves a problem I want solved (ideally with minimal effort); I don't go looking for problems just to put software to use because I "like software." You don't seem to realize that "update" does not necessarily mean "improvement". Often an update is actually a regression for the user (most obviously because of bugs, but more perniciously through annoying changes like unnecessary UX redesigns or dropped features). I'm not "excited" when my workflows get broken or my knowledge gets invalidated (e.g. by changes that force me to waste effort re-learning how to do things I already knew how to do).
The nostalgia about buying software in a box off a shelf is nostalgia for having problems that stay solved. If that software worked, it would almost certainly continue to work until you made the decision to change something. Not anymore.
I think it’s easy to exaggerate how often updates are regressions, because the ones that don’t turn out so well are going to sit in our memory more strongly than all the other ones that worked fine.
It’s like if you go to Starbucks every day, and in 20 visits your coffee was prepared correctly. Then on the 21st visit, something was messed up.
Which experience would be most memorable? The one where they messed up, not the 20 other experiences where everything was fine.
Most updates to most software are made in good faith. Companies (at least, the ones not in a complete monopoly) don’t try to scam their customers; they want to keep you happy so you keep buying. They’re not trying to mess up your workflow.
And this is why resistance to change is such an unattractive trait to me: if you’re learning every day like you’re supposed to, having to re-learn something shouldn’t be such a big deal, and it’s not like most software updates are going to just completely change how everything works (especially when they’re frequent and incremental). I think boxed software is actually worse in this regard. I always hated having to shell out a bunch of money all at once for the next version and then go through a more jarring migration process, because each update contained years of changes all at once.
Arguably, Microsoft’s ribbon interface in Office would never have happened if Office 365 had existed at the time. They needed a big visual change they could show off in screenshots to stimulate discrete sales.
Apple managed to do something as extreme as changing the underlying file system of my computer with a routine update, no clean install or data migration required. If I wasn’t a tech enthusiast, I would have had no idea it even happened! Can you imagine how mind-blowing it would have been if we could have had that experience in 1998? For most people in the 90s, upgrading your OS meant buying an entirely new computer because it was just that difficult.
I’d personally rather not sit around being bitter and cynical about things being different now, maximizing the bad and minimizing the good.
> When I see someone who claims not to like when software gets frequent updates, I ask myself: "Do you even like software? I thought new functionality was supposed to be exciting!"
Taking away a much loved feature is exciting? Losing more and more control over your own computer, over things you paid for, is exciting? Waiting an hour for your PS4 to install mandatory updates because you haven't turned it on in a few months is exciting?
Good god this is such a naive opinion. I suppose you think all change is good? Even if the people implementing them are greedy assholes in search of ego strokes or power trips?
What if I changed the balance of your checking account to $0? Would that be an exciting change?
For every update that delivers on its exciting new promise, there are four others that have made the life of its developers easier at the cost of their users.
Developers are the ones getting paid beaucoup bucks here; making their life easier should not be a higher priority than supporting the things users want.
Speaking as a developer and lifelong computer nerd of 30+ years.
Interesting example because the PS4 can update games automatically in rest mode while you sleep. It’s so much better than downloading patches manually like we used to do over our dial up connections.
I don’t think all change is good. I just think the anti-updates crowd are minimizing the good and maximizing the bad. Most software updates in our life are so seamless and uneventful that we might as well forget they exist.
You don’t remember the 20 times Starbucks made you a perfect cup of coffee, you remember the one time they messed up.
I think developers and lifelong computer nerds like us completely forget that the regular person doesn’t give a shit about any of this. They just want it to work, and they like getting new stuff.
Seamless updating for most things is a relatively recent development. And yes, I notice it all the time, it's annoying as hell. Again, only occasionally do said updates actually improve anything. If we're lucky it will be as good as before. But often we're not, and something is lost.
(And rest mode or no, whenever I turn on my PS4 -- every couple of months -- there's always a stack of updates to deal with. And it always takes forever)
And Starbucks has never made me a perfect cup of coffee. Their beans are perpetually burnt and their process sucks. Back when I was a customer of theirs I had to make my order so complicated it's frequently the butt of jokes. In all their haste to serve me and a thousand others every hour, they've stripped coffee down to its barest essence, robbing it of any character that might have saved it.
Perhaps you've never known what it is like to truly master something, only to have it ripped out of your hands because reasons. I truly hate not having control over my software.
Yeah, we like software. Probably every single person reading this site does. What we don't like is having constant new crap added that doesn't actually benefit us as users (like constantly-shifting user interface, new limitations that didn't exist before, "overhauls" that remove functionality we relied on or expected or were otherwise intimately familiar with). It's a waste of our time and energy, and causes stress.
Yeah, new functionality is neat, but it's an ignorantly-optimistic delusion to believe that frequent updates strictly involve new functionality. It's extremely common that updates involve literally zero benefit to the user, instead introducing some bullshit we didn't want like a "What's New?!" popup that now harasses every time you open the software (Miro, Discord, etc.), or a new notification icon that glows blatantly every time there's a new Awesome Sale available (looking at you, Guild Wars 2).
In many cases, no. It's an adequate tool for some task I need to accomplish. And while some updates are genuinely useful and appreciated, others are regressions or just a change that I now have to get used to and which IMO doesn't make the software a better tool for the task at hand.
I turned off automatic app updates on my phone. It turns out some apps, especially Google ones, do get whiny at you for not updating. Other than that, changes at least happen when I expect them. Nothing can save me from A/B testing though...
I lived and worked around Berkeley, San Jose, Cupertino, and Marin County from '80-'95. I've been in the original Apple office. I remember 280 before all the Cisco and Oracle buildings went up. I have my share of stories.
Still, I feel like us old guys have got to have more to offer the newcomers than (i) memories and (ii) rants about how it used to be.
A psychologist once pointed out that old people retreat to memories if they've got nothing going on now. I like Warren Buffett's line (he's noted for not investing in tech): I might be old fashioned, but I'm not old-fashioned stupid.
Part of having gray hairs is to know what was stuck-on-stupid then, and not repeat it now. Part of having those gray hairs is a Peter Drucker take on business: factual, direct, and unsentimental but not mean. The future (i.e. today relative to '89) isn't always better.
Meh. I'm a decade or two younger than you, and I've got plenty of stuff going on, but I'm still so drawn to sentimentalism.
I miss workstations and the feeling of quality. I miss slapping in a router and having connectivity. I miss the sheer degree of early 2000's brogrammer tomfoolery. I miss BBSes and local community. I miss usenet. I miss the sense that IPC and single thread performance was gonna increase forever. I miss having a T1 to my apartment and playing with frame relay and Serious HiCap Telco Circuits (tm). I miss HalTed and the maker movement before we called it a maker movement.
Man, we've got such great stuff now. I've got a home 2 Gbps PON link and 10 gigabit Ethernet everywhere and plastic crappy machines that are 20 million times faster than that decked-out SPARCstation 20, and more IOPS than I could have ever imagined.
But why'd it have to eat all the best parts of those things? (I know why, but that doesn't mean I have to like it...)
I drooled over all that stuff but couldn't afford it. Now I might be able to afford some of the workstations I coveted as a kid, but they can hardly do anything by today's standards. Some of them can heat a room pretty well at least.
The stuff I like is from an era where machines weren't too power hungry. Typical supplies were 150W, though not very efficient.
The main thing is-- it was all very nice sheet metal; they'd thought about RAS, etc; tantalum capacitors everywhere instead of aluminum electrolytic; etc. Today, machines probably have a much longer useful life than ever before, but are built more cheaply and disposable than before. If you buy a high-end ThinkStation or whatever, you basically get an ordinary PC with a nicer CPU inside.
I resented my dad buying a PS/2 when I was growing up because it had poor game compatibility and couldn't run (early) Linux... acquired one for nostalgia and found that it has a whole lot of the things I liked about late 90's Unix workstations inside.
While it eventually became feasible to ship pet food at not too much of an economic disadvantage... it's only possible to do so for the largest players and they're still not really making money on the transaction.
If we ever get serious about taxing carbon it'll become pretty seriously uneconomic again.
> "The software engineering team had two titles: Software Engineer and Senior Software Engineer. This was typical in the industry. You had to have at least 10 years of experience, but more likely 15 years, as a Software Engineer before being considered for promotion to Senior. The bar was higher for Senior then, too. I’d say a Senior engineer back then was skilled and experienced more like a Staff or Principal engineer today."
amazing to see that the title inflation in our industry has roughly matched actual economic inflation :)
on a more serious/optimistic note, there is a lot more tech to learn today than in 1989, and we probably learn at a faster rate too given the available amount of material (HN, youtube, books, etc). so perhaps not unwarranted. i'd love to pit a senior engineer of today vs a senior engineer of 1989 in doing modern programming tasks.
I suspect that the ones of today would have a mental agility and/or capability to discover things on the fly that simply weren't possible all those decades ago.
I think it's only at the real low-level or hardware/assembly or, for instance, a bit above the stack, say, optimizing a data model or fine-tuning a DB engine for a given domain, that the old timers would really shine.
I see it in my daily life: my dad was a Fortran programmer way before I was born, so, maybe you know... 1970s or something like that. He was actually very good at his job, getting a research position doing some serious programming work that was seen as groundbreaking at the time, doing some finite element modeling in extremely resource-constrained environments, probably what we would call data-oriented programming today, but taken to its extreme...
However, he didn't really keep up with programming after a few years, shifting to MATLAB based tasks for some university courses he lectures... and I feel that the complexity of today is just too much for him to grasp as it was. It's just too many moving parts and too many layers of abstraction to sift through.
By the same account, I expect your typical developer of a regular company, myself included, seniors included, etc, etc, to be completely lost if they had to optimize code for a given hardware and needed to write some assembly, or whatever, disassemble some JVM class files, etc.
> By the same account, I expect your typical developer of a regular company, myself included, seniors included, etc, etc, to be completely lost if they had to optimize code for a given hardware and needed to write some assembly, or whatever, disassemble some JVM class files, etc.
I've done both. I've gone from high level C# to counting clock cycles for embedded, and then back again.
There is a transition period, but it is perfectly do-able. It is helpful if you have someone to show you the ropes, but after that it isn't too bad.
Heck I've met some teenagers who are into disassembling JVM/CLR stuff. (Which is actually, IMHO, much easier than raw assembly).
Have you ever done scaling calculations for AWS services? The same sort of logic applies to writing code for embedded; there's just a different number of MHz, and instead of gigabytes of memory you're talking kilobytes, but other than the orders of magnitude, the reasoning is actually not that dissimilar.
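The point about the reasoning being the same at either scale can be sketched as a back-of-envelope capacity check. All the device and instance numbers below are made-up assumptions for illustration, not real MCU or AWS specs:

```python
# Back-of-envelope capacity check: the same reasoning at two scales.
# Every number here is a hypothetical assumption, purely illustrative.

def fits(budget_bytes, items, bytes_per_item):
    """Return whether a working set fits within the available memory budget."""
    return items * bytes_per_item <= budget_bytes

# Embedded: a hypothetical MCU with 64 KB of RAM, buffering 4-byte sensor samples.
embedded_ok = fits(64 * 1024, items=10_000, bytes_per_item=4)        # 40 KB -> fits

# Cloud: a hypothetical 16 GB instance caching 4 KB user sessions.
cloud_ok = fits(16 * 1024**3, items=2_000_000, bytes_per_item=4096)  # ~8 GB -> fits

print(embedded_ok, cloud_ok)  # prints: True True
```

Whether the budget is kilobytes on a microcontroller or gigabytes on a server, the question is identical: does the working set fit, and by what margin?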
My first couple of jobs from late 90s on were C++ and Java: homogeneous giant code bases with no agile, scrum, code review or anything.
I do just fine with modern code bases and their complexity. I know I also have the experience to understand when the complexity is there for the sake of it and when it’s because of the business domain.
I’ve left projects where the first kind was too high because I’ve seen enough to know I don’t have time for that BS anymore :)
It’s curmudgeonly, but I can’t help but bristle a bit at the ‘we didn’t have _____ when I was a programmer’ stuff because it very often existed in some substantial form.
Case in point, UI/UX- You could have user interface design as a job back then, perhaps at IBM working on ATMs. UI philosophy was a significant part of what was done at Xerox PARC, and you can read a 1989 article by Alan Kay on the topic:
http://worrydream.com/refs/Kay%20-%20User%20Interface,%20a%2...
In 1988 the first edition of the Handbook of Computer Human Interaction was published, a staggeringly expansive book on the subject, which included stuff like rapid prototyping, user acceptance and design review.
Sure, plenty of companies didn’t bother with this added expense, but that’s still the case today. The idea that it wasn’t there at all is misguided.
Another bit that seems off is mention how Java was ‘a total game changer’ when it came out. Cross-platform development was an exciting idea, but in practice, everything with Java was dog slow for quite a while. The true and eventual potential of the JVM was still opaque for years after the 1996 launch.
"Alls we had was waterfall" rankles too - and maybe it's just me, but I've been hearing it a lot the last couple of years - particularly since most people first heard of "waterfall" only when someone made them do Scrum.
And by 1989 OO was obviously the right thing for graphics and graphical user interfaces were arriving in a big way. MacApp was four years old. The original NeXT workstation had launched the year before. "niche" only in the sense that Turbo C++ was a year away.
https://en.wikipedia.org/wiki/MacApp
Xerox PARC was doing bleeding edge work. To compare anywhere else at the time with their revolutionary innovations, to claim that those companies just "didn't bother with the added expense" is absurd.
Graphical vector mini-supers. Unix and VAX environments, with Sun ruling the roost and SGI crashing in. Multiple instances of what became things like KNIME. X11 and multiple vendor-specific GUI toolkits. Excellent vectorizing compilers for Fortran and C. 4- and 8-core multiprocessors. Granted, it wasn’t as fast, but there was a lot there that people probably forget.
Software-wise, there were research groups in all of the ML/AI areas in chemistry (Gasteiger, others), so I assume they were active elsewhere too. The quants were starting up on Wall Street and they needed visualization systems to know whether their models were going up or down. Stereotactic imaging for medical uses, fluid dynamics, and finite element analysis were leading to changes in car and airplane design.
Microprocessors killed off this ecosystem - the PA-RISC chip from HP could be a soft PC faster than a 486 could be a PC. But don't think it wasn't there. Hell, all of the foundation we're building on now was laid by (mostly) hackers in the 60's and 70's. It just used way less memory :-)
And don’t forget the work from the folks at bell labs. And neat stuff like the blit.
I attended a lecture by him around 1990. Maybe I'm not a Turing Award class of computer scientist, but I was not so impressed by what he spoke about. I remember Parnas [1], from approximately the same year, much better.
Around 1998, Gray had built something called TerraServer, which must have been years before Google Maps or OSM. It was a real eye-opener at the time to see my little house in the middle of nowhere in Northern Europe published on the internet for free by an American company. At the time, PC disk sizes were still measured in megabytes, so you weren't even able to store that many aerial pictures. At least not pictures of current size. Probably they were smaller then; I don't have any data.
>Every last software engineer at ACD was a white man
Interesting, where I was at the time, it was a 60/40 split between men and women. Also I would say 10% were people of color. And one person was completely blind.
Two of the smartest people there were people of color, early on (before 1989) I learned a lot from them.
It's interesting, my experience at a FANG (which invests heavily in recruiting women engineers) is that approximately 25% of teams are 60/40 men/women, and 75% of them are 100% men. The women I've worked with have told me that this is not a coincidence, that they very quickly learn which managers they will not work under or which coworkers they will not be on a team with, and that they take the presence of other women on the team as an indicator that none of these characters are present.
I wonder if the same principle operates geographically - certain regions find that they just never encounter women or minorities in the workforce, and it may have as much to do with the region as the people.
I think this statement goes too far in that case, I'd walk it back:
> The software industry was a far less diverse place then.
It probably makes sense when talking specifically about the Midwest, due to demographics. But out here in SoCal in the 90s, I had a South Asian boss and foreign-national engineers from a number of countries working side by side: China, Italy, Norway. It was a big facility and we had US minority employees as well, although not a huge number. The software dept was approximately 1/4 women.
Re: being gay it was the "don't ask, don't tell" days. No one I knew cared in the least. An interesting angle is I wasn't perceptive about it as a young person. Looking back in the hindsight of maturity, obviously gay people were everywhere. Just like in Hollywood, everyone just winked and nudged. Right, Uncle Arthur is ~fifty, unmarried, and rather flamboyant, nothing to see here. ;-)
My very first programming job was in 1991 or 92 writing an interactive kiosk based upon a touch screen driven by an Amiga, using the multimedia platform Scala (quite similar to the later Macromedia Director, but it still exists).
Then with a colleague we developed an interactive phone application (the sort that asks you to "press 1 for this, press 2 for that, press # to go back to the menu") on a SCO Unix system, in some sort of weird BASIC. While one of us tested the application on the phone, the other one played "X-whack-a-mole" :)
> There was no working from home. The minicomputers were available only on the in-office network. I think it was not technically impossible to connect them to the Internet, but at home we all had only dialup access. Not only would speed have been a concern, but the family wouldn’t enjoy having the phone tied up for hours while you worked.
I was lucky in that I got to work remotely from the beginning. Late 80s at the university I set up a modem bank to one of the UNIX boxes and installed a second phone line at home, so I was dialed in pretty much 24x7 (at 2400 baud, of course, but with an all-text terminal session it's fine and usable).
At first job they had dialup into the Sun servers as well, so I often worked from home (by now at 14.4 and later 56K at which point remote X was doable).
Similar to the blank tape story: around 7 years ago at a big networking corp, a big customer needed the latest product in their possession before some tax-related date. The new product's software wasn't ready at all at that point, so we shipped them a unit with placeholder FW that could do only one thing: FW upgrade!
The section about waterfall being the only SDLC is interesting because it touches on the economics of shipping and software engineering practices:
Continuous delivery only makes sense if the value of new feedback and delivered software is higher than the total cost of delivery. If you have to mail a floppy disk to each customer, that's a lot more expensive than just updating your production servers.
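That trade-off can be sketched as a toy break-even model. Every number here is hypothetical, purely for illustration of the argument:

```python
# Toy break-even model for release frequency (all numbers are hypothetical).
# A release pays off when the value of the feedback and features it delivers
# exceeds the total cost of getting it into customers' hands.

def worth_shipping(value_per_release, cost_per_customer, customers):
    """Return whether a release's value exceeds its total delivery cost."""
    return value_per_release > cost_per_customer * customers

# Mailing floppies: say $5 per customer to duplicate, package, and post.
floppy = worth_shipping(value_per_release=2_000, cost_per_customer=5, customers=1_000)

# Updating your own production servers: near-zero marginal cost per customer.
saas = worth_shipping(value_per_release=2_000, cost_per_customer=0.01, customers=1_000)

print(floppy, saas)  # prints: False True
```

The same release that can't justify a mailing clears the bar easily when delivery is nearly free, which is why frequent releases only became economically rational once the delivery cost collapsed.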
Worked at a company a few years ago that was still using Classic ASP for some projects. It was not supported by Microsoft at this point, which is saying something, and I’d be surprised if the project was ever sunset.
The software development process sounds all too familiar, i.e. hundreds of bugs in QA, long releases, long release cycles, death marches, and burnout. Fun stuff.
This cultural mindset permeated the entire organization. Even for projects using new technology, we were coupled to this legacy project and our processes always devolved to what it had created.
Not so sure about the development/testing split of the waterfall, but two months up front just to design and gather requirements... feel like that might be a net benefit!
It sounds nice, but in reality your requirements end up being a castle in the sky. My team once spent 6 weeks on a small project (I was the team lead) creating a detailed requirements document with screenshots, a prototype app, etc. My users were all at hand and I showed them the prototype. They all agreed that it was what they wanted, and signed the requirements doc.
We spent several months building the application. When we shipped, the software was unusable, because a security feature that we had been explicitly told could wait for the second release turned out to be essential.
We also ran into a big problem with concurrent editing that no one had considered in advance, and blew up our carefully-created design.
The solution to changing requirements isn’t to nail them down, because that’s effectively impossible, but to get better at evolutionary design techniques (introduced by Extreme Programming) so you can adapt to arbitrary requirements changes without disruption.
I got my start in the software industry in that building, too. I was an intern at Rose-Hulman Ventures in 2005, building tax administration software for Indiana counties in C# 1.0 WinForms with Visual SourceSafe and Microsoft SQL Server. VSS required you to take a lock on a file to be able to edit it (there’s no merge ability) so I spent a full afternoon one day separating our data access layer class into partial classes so it would be spread across multiple files, and more than one developer could write a stored procedure wrapper at a time! Likewise, our bug database also only had enough licenses for one or two people to be logged in at a time, so we printed out our tickets to work on them. Learned a lot that summer, especially the fundamentals of databases and software architecture. Also learning C# that early gave me a ton of experience and willingness to work with C# again in my next few jobs.
Well, you survived using Visual Source Safe. (I've lost so much hard work with that tool). I worked on stuff one generation before C#. Visual FoxPro was my money maker for awhile.
We had a “robust” backup strategy. (Taking a copy of the source code onto the file share every night.) Definitely lost VSS revision history at least once. I don’t know how VSS ever made it to customers!
You’re right, I misremembered! I made a hideous inheritance chain instead:
abstract class UserDAL { }
abstract class ParcelDAL : UserDAL { }
...
abstract class DAL_n : DAL_n_minus_1 { }
class DataAccessLayer : DAL_n { }
each in their own file, of course. I tried to give them descriptive names and very narrow scopes. But inheritance is pretty dumb here. What does it mean for ParcelDAL to extend UserDAL? A parcel is not a user. When C# 2 came out, I was able to get rid of the inheritance and make them all part of one big partial class, keeping the file names the same.
I am of a similar vintage. Technically, I was in hardware/firmware in '89, but I was working in a startup then and the lines were pretty blurry - I wrote a good bit of assembly & C code. One thing that I notice was that even though there wasn't really any sort of formal push to get women into tech back then, the software group was pretty close to 50% women where I was working in the late 80s - in fact, for a while there were more women than men in the group.
Similar at the next couple of companies I worked in software into the mid-to-late 90s - maybe not 50% women, but pretty close.
But then in the aughts that gender distribution tended to skew more towards males. Not sure why that was. And now I'm working in a startup where it's about 40% women, so maybe we're getting back to where we were in the late 80s.
I love these stories. I got my start around 1998 when the web was coming into full force which was only a decade after this and very different. Things were changing quickly in that era.
The closest I've had to this was working at a company that had been around for close to thirty years and had kept many of the original developers (they'd taken the product from a terminal interface to the web, from COBOL to C#) and had many stories like the one in the original post.
I started a few years after you, around about 2003. There were certainly pockets of industry that still operated exactly the same as the article described, especially defense and aerospace.
Which makes sense, given working in embedded hardware (especially in classified contexts) back then came with a lot of the same limitations described in the article - software had to be delivered by sneaker-net, installation was burdensome, you couldn't just fire up the application for a quick bug fix at home, etc. Not to mention the inertia of having Always Done It That Way.
I started about 10 years after, and I think that was almost the best of both worlds. Agile was starting to become a thing, but wasn’t perverted to sell expensive courses and credentials yet, and tooling began to be developed that was a serious improvement on what we had.
> Then someone had a brilliant idea: we would ship US Sprint a blank tape with our usual letter listing the changes in the release.
...preceded by...
> Our relationship with US Sprint was always iffy, and one day we pissed them off one time too many and they canceled their contract
Well, yeah. If your company depends on a small number of large customers, you really can't get away with pulling nonsense like the blank tape trick on them. People aren't idiots.
This old trick is still in use nowadays. However, instead of shipping a tape, I've seen a colleague upload a zip corrupted by hand, just to gain a couple of extra days of bug fixing.
> The software industry was a far less diverse place then. Every last software engineer at ACD was a white man, most of them younger than 35. QA was much the same except for one young woman on the team.
Counterpoint: my 2nd job in the field, 1990, NYC, software development firm. I reported to a Chinese-American woman (Cooper Union grad) and the CTO was also Asian American. My ~mentor was a Bell Labs engineer with a luxuriant white beard (so way over "35"). I also had a private office that I shared with another engineer (a young white married guy who commuted from Philly). Q/A, iirc, had many Indian women.
It occurs to me that the one place you might have seen parity between the Midwest and the coasts, in terms of a lack of diversity, was in the composition of the graduating classes for compsci in 1989. I remember peeking in on one class in the Bay Area and there wasn't a woman in sight. The majority of women I knew who were involved in programming at the time came from a background in business administration. I think it's true that the workplaces on the coasts were decades ahead of the rest of the country (women, minorities, LGBT), but academia was still as far behind on inclusion as the rest of the country.
I have been working in the software industry since 1990. In 1990, I joined a company that made an innovative coupling between AutoCAD and relational databases (Oracle). However, I wrote my first program in 1978 in Algol 60. I also used Fortran, LISP, (Turbo) Pascal, BASIC, and 6502 Assembly before starting (in 1990) to develop software in C.
I remember that one day at this company, we had just gotten a MicroVAX system when someone appeared at the door with a europallet. We wondered what it could be. Turned out it was the VAX/VMS documentation that belonged to the MicroVAX. It took up more space than the system itself.
Back then I wrote a program to manage the inspection of Fire Equipment at Generating Systems, by myself, in Turbo Pascal. The inspections were recorded on Hand-Held computers (oooohhhh!) made by the Norand company in Cedar Rapids, Iowa. I wonder how well that code would hold up to modern scrutiny.
Here's a clone of TECO[1] (which I fell in love with at Rose-Hulman), written in Turbo Pascal, as a sample of how I wrote things back then... roast away
>To my astonishment I discovered that I loved to write, even more than I enjoyed writing code.
I wish I was this person. I'm one of those (great majority of) people whose documentation you'd hate.
I deeply hate writing like that because I can never properly explain my tangled thoughts. Even my comments on sites like this are a few lines at most. Describing things in code is much easier, since you don't have centuries of context baked into the words.
Your class names, variable names, and method names have plenty of centuries' worth of context in them anyway whether you like it or not, not to mention the comments and documentation.
I think you'd benefit from looking at writing differently. What you describe--"I can never properly explain my tangled thoughts"--is the essence of composition. The goal of the process of writing is to untangle those thoughts and express them!
When I say "process," I don't mean that you just begin typing immediately. Composition is about planning as well. Even if you spend just a few minutes making a list of keywords & important concepts and then arranging them on cards in the order you think you should present them, I bet it would help with your initial draft of a document.
If that doesn't appeal to you, so be it, but look into a composition course, because I genuinely think you could turn something you hate into something that helps. You might never fall in love with writing, but that doesn't mean it can't be a tool that solves problems for you.
It's weird... I was never big on writing, but when I finally tried my hand at a data format specification (https://github.com/kstenerud/concise-encoding), I found that I really enjoyed the experience.
It's 3000 lines and took me four years and countless revisions, but I feel it was worth it. I actually enjoyed writing the spec even more than writing the reference implementation :)
Wow, I like the data format specification you devised. Especially that you insist on "no optional features or extensions" -- something that makes other standards so difficult to rely on.
Is there anyone using it? It's always great to have extra ammunition to make a strong case that we should use this instead of, say, XML.
It's still in the prerelease stage, but v1 will be released later this year. I'm mostly getting hits from China since they tend to be a lot more worried about security. I expect the rest of the world to catch on to the gaping security holes of JSON and friends in the next few years as the more sophisticated actors start taking advantage of them. For example https://github.com/kstenerud/concise-encoding/blob/master/ce...
- Revamp the compliance tests to be themselves written in Concise Encoding (for example https://github.com/kstenerud/go-concise-encoding/blob/master... but I'll be simplifying the format some more). That way, we can run the same tests on all CE implementations instead of everyone coming up with their own. I'll move the test definitions to their own repo when they're done and then you can just submodule it.
I'm thinking that they should look more like:
c1
{
    "type" = {
        "identifier" = "ce test"
        "version" = 1
    }
    "tests" = [
        {
            "name" = "Some kind of test"
            "success" = [
                // These must successfully convert to each other
                {
                    "cte" = "c1 [1 2 3 4]"
                    "cbe" = |u8x 81 01 7a 01 02 03 04 7b|
                    // Events are a kind of text shorthand for what the parsing of cte or cbe should produce
                    "events" = ["v 1" "l" "n 1" "n 2" "n 3" "n 4" "e"]
                }
                {
                    ...
                }
            ]
            "failure" = [
                // These must fail to decode
                {
                    "cte" = "c1 100 ]"
                }
                {
                    "cbe" = |u8x 81 01 64 7b|
                }
            ]
        }
    ]
}
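A runner for a test format like this could loop over the cases roughly as follows. This is only a minimal sketch in Python: `decode_cte` and `decode_cbe` are hypothetical stand-ins for real Concise Encoding parsers, stubbed here to handle just the sample inputs above.

```python
# Stand-in decoders: a real runner would call actual CE parsers.
# These stubs only recognize the sample inputs from the test file.
SAMPLE_EVENTS = ["v 1", "l", "n 1", "n 2", "n 3", "n 4", "e"]

def decode_cte(text):
    if text == "c1 [1 2 3 4]":
        return SAMPLE_EVENTS
    raise ValueError("CTE decode error")

def decode_cbe(data):
    if data == bytes.fromhex("81017a010203047b"):
        return SAMPLE_EVENTS
    raise ValueError("CBE decode error")

# The test definitions, as they might look once loaded into memory.
tests = {
    "name": "Some kind of test",
    "success": [
        {
            "cte": "c1 [1 2 3 4]",
            "cbe": bytes.fromhex("81017a010203047b"),
            "events": SAMPLE_EVENTS,
        }
    ],
    "failure": [
        {"cte": "c1 100 ]"},
    ],
}

results = []
for case in tests["success"]:
    # Both encodings must decode to the same event stream.
    ok = decode_cte(case["cte"]) == case["events"] == decode_cbe(case["cbe"])
    results.append(ok)

for case in tests["failure"]:
    # Failure cases must raise a decode error.
    try:
        decode_cte(case["cte"])
        results.append(False)
    except ValueError:
        results.append(True)

print(all(results))
```

The nice property of sharing test definitions this way is that every implementation only has to write this small loop, not invent its own corpus of cases.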
> I deeply hate writing like that because I can never properly explain my tangled thoughts.
I brought a friend of mine in to give a couple of lectures at our company for this exact reason. He's a professor of non-fiction. There are a couple of techniques we learned that you can use:
I was in that Token-Ring world back then, and the terminator clip at the end of the chain was the bane of my existence. Our System 6 and 7 machines had to talk to everything, VAX and PDP and DOS and anything else that communicated over serial.
The WWW was a rumor, then a thing, and it was fun to click around on Gopher, but you couldn't find anything you could really use.
We also tended to have long-term stays at companies. We usually had to maintain the software we wrote, so it encouraged us to at least leave some decent documentation breadcrumbs.
>We also tended to have long-term stays at companies.
I was at a minicomputer company (DG) from the mid-80s until the end of the 90s. Mostly PM for hardware. A lot of engineers had been there for a long time. Even more so, people in support functions of various types. There was one woman who I think may have been the only person who knew our BOM and related systems inside out. I think there are still a few people I worked with who are still there (by way of EMC and Dell).
At least as of a few years ago, one of them was one of the engineers who worked on the computer that was chronicled in Soul of a New Machine.
1995 India - Had a paid internship (Rs 2500 per month or ~ USD 80 back then) with a small software company that worked on FoxPro + dBase. A very diverse team (gender ratio), the highly enthusiastic bunch created software for non-banking financial companies.
A thick documentation book occupied most of the space in the large, laminated box that such development software came in. Along with half a dozen floppy disks where we hoped that no disk would have a bad sector.
Requirements were constantly over the phone from clients and shipping software meant taking two sets of floppy disks of different brands (backup!), in-person, to the city where the client was.
CRTs needed warm-up time, and we had a small diesel generator (hand cranked) to power up the x86 computers whenever the main power went out (which was frequent).
No internet searches, no help other than that thick paper manual or waiting around for hours for a senior engineer to get free to help you out.
College taught us C and Pascal on Unix.
That seems so far away from the world of microservices and SPAs.
</nostalgia>
On my Amiga, UX/UI design was a thing: layout, settings pages, forms, GUIs that scaled with different font sizes. At least for some companies - some UIs had horrible UX though, like when they added shadows to text labels.
On my CPC I wrote a GUI library with auto layouts and multi-step flows with |RX (?) commands, using characters like ┤ [1] for borders - always reading Apple magazines.
Good point about the Amiga. Someone else here commented on Xerox PARC. So yes, there was UX design. However, in so many companies like the one I worked for in the story, it was utterly ignored.
> That led us to create enormous Gantt charts (printed dot matrix) that we’d tack onto a huge wall, and use them to track plans and work completed. It never failed that during the first week of coding we’d discover something that we didn’t think of in the design phase, and we’d have to replan the entire project and print all new Gantt charts. We did this over and over every project.
I’ve only ever worked with “modern” agile approaches to software development. Gantt charts feel like a mismatch with the speed of software development, but sometimes I wonder if there are certain software projects where Gantt would work better than agile.
It's interesting how the company, MDSI, started on 8-bit timeshared mainframes, then eventually ported their software to newer platforms using a combination of emulation and microcoding.
I happen to know Chuck from my years racing solar cars in college.
I hadn't considered that token-ring networks had the limitation that all nodes had to be online for it to work, but it makes total sense. We're so spoiled these days. :)
And that last idea to ship a blank tape to buy you time is pure genius. Reminds me of the high school / undergrad tricks of submitting a corrupted archive or document on the deadline date. It bought you at least a day or two before the professor noticed, and by then you'd hopefully be finished. Not that I ever partook in such reprehensible behavior...
Hey Jim, it's nice to see a familiar face around here. We crossed paths (extremely briefly) at Angie's List where I was "that one guy working remote from Chicago".
Brought back great memories for me-- I worked in Terre Haute in the early 90s.
My shop was at a local bank. We were a little more diverse-- we had some ladies working in the shop, and an African-American male also. (And several older men.)
We worked hard, running a mainframe/COBOL shop. Some late nights (batch processing was king then!) and some great teamwork. Good memories.
I saw the 'blank CD / oops our shipping department screwed up' trick in 1998. I think it was used to recognize revenue in an earlier quarter for a release that wasn't quite ready yet, but where they wanted to have a nice hockey stick growth curve. I think it eventually cost a VP his job though.
I've used that last trick a couple of times when stuff was over-optimistically promised "today, end of business", sending out random bytes renamed to .zip and having the real file ready next morning when people complained about not being able to open yesterday's zipfile.
Or, it is what any half-competent witty student would do when they couldn’t finish their essay in time - send a corrupted .docx file and blame Moodle :)
...and how much is still the same today: coding on networked UNIX machines with TCP/IP, C and its variants, emacs, etc. Also, software engineering is still mostly male.
> Then someone had a brilliant idea: we would ship US Sprint a blank tape with our usual letter listing the changes in the release. And so we did.
> “Oh! We are so sorry. We can’t imagine what must have gone wrong! We will send you another tape today.”
Those games are so common no one really falls for it. They might be polite and not call your bluff, but you can bet they know. (And I suspect that this was the "straw that broke the camel's back" for Sprint.)
I would guess they are dependencies. For example, you would want to start the database server before you started the application. If the application started with no database available, it would do some funky things and might end up in a weird state.
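That start-before dependency idea is exactly a topological ordering problem, and it can be sketched in a few lines of Python with the standard library's `graphlib`. The service names here are hypothetical, just to illustrate the shape of the graph:

```python
from graphlib import TopologicalSorter

# Hypothetical service dependency graph: each service maps to the set
# of services that must already be running before it can start.
deps = {
    "database": set(),
    "cache": set(),
    "application": {"database", "cache"},
    "web_frontend": {"application"},
}

# static_order() yields the services in a valid startup sequence:
# every dependency appears before the services that need it.
startup_order = list(TopologicalSorter(deps).static_order())
print(startup_order)
```

Shutting down safely is just the same list walked in reverse, which is presumably why orderly shutdown procedures back then mirrored the startup ones.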
I started my first programming job - well, trainee programmer - in September 1989. Five people applied for the job, and I ended up taking my TV and Atari ST to the interview to show them some stuff I had written in STOS Basic.
Got the job and started with dBase II on Concurrent DOS.
I am surprised I had to scroll this far to find a dBase reference. I am the same vintage as OP, and yeah... back in the day dBase was the key to riches...at least as a contract dev. I remember spending A Lot of Money on the big box of dBase IV when it was released...all the diskettes and manuals. Around that time there was much talk about migrating from dBase databases (those .dbf files) to the magic of "SQL" which at the time meant nothing to me. Needless to say dBase IV cratered, and the industry went elsewhere. Good times!
I also have some fond memories of writing a 'smart' algorithm in STOS Basic for generating Mandelbrot fractals. And it still took hours to fill the screen.