The longer I write software, the more my sense of “impressive” changes. What actually amazes me these days isn’t modern technology, but older systems — and the people who built them.
Take Windows 95. A full operating system from almost 30 years ago. GUI, drivers, multitasking, multimedia, process and thread management. All of it lived in roughly 50 MB on disk and ran on 4–8 MB of RAM.
Now compare that to today. The browser tab I’m using to type this text currently takes over 1 GB of memory. I’m not compiling anything. I’m not rendering video. I’m just editing text.
That number alone would horrify engineers from the 1980s — people who ran full multi-user Unix environments on machines with 2 MB of RAM and 20 MB hard drives. Entire development workflows — editors, compilers, networking, users — fit inside constraints that feel impossible now.
Even small things today feel heavy. A simple “Hello, World” after activating a virtual environment can easily pull in tens of megabytes of libraries before any real logic runs. Not because the problem is complex, but because the ecosystem around it is.
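You can see this for yourself. Here is a minimal Python sketch that measures how many modules and how much allocated memory a single import drags in; `import_cost` is a hypothetical helper name, and `json` stands in for any heavier third-party dependency you might try in a fresh virtual environment.

```python
import sys
import tracemalloc

def import_cost(module_name):
    """Count the extra modules and Python-level memory one import pulls in."""
    tracemalloc.start()
    before_modules = len(sys.modules)
    __import__(module_name)
    current, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return len(sys.modules) - before_modules, current

# Stdlib "json" is tiny; swap in "requests" or "numpy" inside a venv
# to watch the numbers jump by orders of magnitude.
mods, bytes_used = import_cost("json")
print(f"{mods} extra modules, ~{bytes_used / 1024:.0f} KiB allocated")
```

Note that `tracemalloc` only sees Python-level allocations, so the real footprint of a native-extension package is typically even larger than what this reports.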
The Disappearing Discipline
What surprises me isn’t that hardware became faster. That was inevitable. What surprises me is how abundance changed our behavior. We lost our software manners.
The constraint of scarcity once enforced an unwritten code of conduct:
- Memory was precious — you cleaned up after yourself
- Every cycle counted — you thought before you looped
- Dependencies were earned — you didn’t pull in libraries for trivial tasks
- Abstractions were understood — you knew what happened under the hood
Old systems weren’t magical. They were constrained — and that constraint forced discipline. We’ve traded that discipline for convenience.
The Professional Paradox
Here’s where it gets professionally painful: The system now rewards waste.
If you don’t pull in the mountain of libraries, cloud SDKs, and abstraction layers that consume resources (and cash) at runtime, you risk being passed over in favor of Developer B, who doesn’t care. Developer B is the “deliverer” — they ship fast, consequences be damned!
The metrics are stacked against careful craftsmanship:
- Velocity > Efficiency
- Features shipped > Resources consumed
- Time to market > Technical debt considered
- Framework familiarity > Understanding fundamentals

We’ve created a world where the most “productive” developer is often the one who piles abstraction upon abstraction, dependency upon dependency, until the entire structure becomes so bloated that it requires hardware upgrades just to maintain parity — and inflates cloud costs along the way.
The Cost of Bad Manners
This isn’t just about nostalgia. The consequences are real:
- Environmental impact: We’re burning megawatts to run inefficient software that does simple tasks
- Accessibility erosion: Software that requires the latest hardware excludes users with older devices
- Security fragility: Layers of dependencies create attack surfaces we don’t understand
- Innovation stagnation: When all our energy goes into maintaining bloat, we have little left for genuine breakthroughs

The engineers who built C on a PDP-11 with a couple of megabytes of memory weren’t just clever — they were considerate. They considered the hardware, the next programmer, the user’s resources. That consideration was their professional ethic.
Relearning Our Manners
So yes, we’re losing our manners. But manners can be relearned. It starts with small acts of consideration:
- Question dependencies: “Do I really need this 50 MB library for a simple task?”
- Profile relentlessly: Know what your code actually does, not what you think it does
- Understand one layer down: Know what happens beneath your abstraction
- Advocate for efficiency: Make performance a feature, not an afterthought

The most impressive software isn’t what uses the most resources — it’s what accomplishes the most with the least. That discipline, that consideration, that professional courtesy toward the machine and the user — that’s what we need to reclaim.
Because in the end, software development isn’t just about making computers do things. It’s about how we choose to exist in a world of limited resources. And good manners, it turns out, are just as important in code as they are in life.



Top comments (31)
That is the negative consequence of over-provisioning.
Developers get fast machines and fast internet connections, and then they assume everyone else has access to the same means.
The same is happening with hosting: even websites with a purely local audience are expected to deliver fast page loads on the other side of the world. And thanks to hyperscalers like AWS, Azure, and Google Cloud, it’s possible.
Don't get me started on AI — people are expected to have multi-gigabyte models on every device they own.
The tech industry needs to do less disrupting and add more value.
"Re-introduce constraints intentionally" is the key phrase. And you're spot on—the platforms that host this software share the responsibility. True innovation from hyperscalers wouldn't just be more power, but tooling that rewards leanness and helps us build software that's fast because it's efficient, not just because it's running on a beast!
We loved your post so we shared it on social.
Keep up the great work!
Let's spread the gospel
Running 500K+ API calls daily taught me this the hard way. Switched to Cloudflare Workers because the platform forces discipline—tight CPU limits, pay-per-request billing. Can't be wasteful when the edge punishes bloat. Constraints aren't nostalgia, they're competitive advantage.
Got it!
This is striking, but I believe the problem is outdated incentives rather than developers losing their manners. Most teams are rewarded for meeting deadlines and satisfying stakeholders rather than for efficiency or restraint. It is difficult to advocate for discipline when frameworks and deadlines essentially penalize it. Fixing this seems to begin at the organizational level rather than with individual developers attempting to be purists.
Spot on. The atmosphere we code in! The air we breathe.
I agree with your points. I would like to add another factor to the problems of modern software development: the dynamics of capitalism. I know it sounds like a bold statement, but in times of software abundance, the race to be "first" becomes overwhelming. I believe the flow of money is driven by this mechanism. The Windows system, in my opinion, corroborates this. As far as I know, Windows wasn't considered a high-quality product, but it became very popular for various reasons. This "popularity" made money flow to Microsoft, and this was an important factor in developing a better product. Not all software follows this logic (Linux, for example), but if we restrict ourselves to software in the context of companies and businesses, being first makes all the difference. Therefore, to me, it seems that software development faces a lot of pressure to be "ready" as quickly as possible, even at risk, and then we have to deal with the problems and technical debt. Unfortunately.
Good point. It reminds me of @doogal's example about prototypes vs. core systems.
The real issue is: do our development phases include a culture of cleanup? A prototype might not need to be lean, but is there ever a planned stage to pay down that debt before it becomes production bloat? Too often, the "temporary" solution becomes permanent because the business incentive to refactor it never arrives.
I think there is a balance to be struck that needs to be aware of the unique context that an application is being developed in.
We need to be thinking about the financials of the business we are working for because ultimately, that is why we are being paid. That business survival or not can be helped by the trade-offs we make.
We generally need to deliver software fast enough that the business can make money, if we deliver it in a poor state, then it can become unstable or the speed of delivery after the first couple of releases can drop as we get mired in technical debt.
The way I tend to approach it is by being deliberate about the "low-quality" parts of a product: put a higher-quality facade around something that is hacky. The facade is generally very quick to make, so you still deliver on time, and you end up with a decent interface that isolates the hacky bits, which makes dealing with them a much more manageable endeavour.
For example, say we are building a T-shirt ordering application. We know roughly that we have requirements for taking payment, telling the warehouse to ship the T-shirt, and emailing the user. Each of those three tasks could be written as a very hacky script, as long as they are isolated from one another behind a reasonable interface. Then dealing with the technical debt becomes a much more concrete conversation about improving the email system, rather than refactoring the whole application.
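That T-shirt idea might be sketched in Python roughly like this; `OrderFacade` and its stub methods are hypothetical names invented for illustration, not anything from the comment. The point is that each hacky step hides behind a small, stable method so the debt stays isolated:

```python
class OrderFacade:
    """A thin, stable facade over three deliberately hacky steps."""

    def place_order(self, user_email, shirt_size):
        payment_id = self._take_payment(shirt_size)
        self._notify_warehouse(payment_id, shirt_size)
        self._email_user(user_email, payment_id)
        return payment_id

    # Each "hacky script" lives behind a private method; any one of them
    # can be rewritten later without touching the callers above.
    def _take_payment(self, shirt_size):
        return f"pay-{shirt_size}"   # stub: call the payment provider here

    def _notify_warehouse(self, payment_id, shirt_size):
        pass                         # stub: drop a message on a queue here

    def _email_user(self, user_email, payment_id):
        pass                         # stub: send the confirmation email here
```

With this shape, “fix the email system” means rewriting one private method, and the conversation about debt stays concrete instead of turning into a full-application refactor.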
I'm from the generation of UK Spectrum bedroom coders, and when I first studied computer programming, we were marked on things such as:
When that started to change, and coding became easier because you didn't have to work within constraints, I lost interest in computers for over 20 years.
It would be nice to see it return.
PS: 1k ZX Chess, written for the ZX81 is, for me, the best computer code written. A chess game, with a computer opponent, in just 672 bytes 🎯
what we lost
This is like a breath of fresh air. Back in the 80s I ran a small dev team sharing an Altos Xenix computer with, IIRC, 128 KB of memory, developing in MF COBOL. As our target platform was IBM PCs, one of the acceptance tests was that the application had to load and execute in 64 KB.
Them was the days my friend!
Them days, wish we could go back
Thanks for your post! I mostly agree, but it's funny that you cite Windows 95 as a good example — that unstable pre-NT desktop operating system built on top of MS-DOS that needed 9 floppy disks to install. Software like Windows 95 was one of the reasons the Linux community developed desktop environments: developers were craving stable alternatives to Microsoft, and Steve Jobs hadn't yet decided to bring his NeXTSTEP ideas back to Apple.
I appreciate memory saving and efficiency right now and back then, but I'd rather point to home computer game development in the 1980s or the 1990s demo scene instead. Now you can retort and point out what an idiosyncratic chaos Commodore assembly development meant ;-)
Haha, excellent and fair critique! I was using Windows 95 more as a shocking reference for scale, not stability — but your point stands! The real craft was indeed in the demo scene, game devs, and early Unix communities working under extreme limits. It’s precisely that mindset we’ve drifted from. And you nailed the irony: today's Linux, in many forms, consumes far beyond what its pioneers imagined. The spirit of minimalism they fought for has been overshadowed by convenience!
As an eight-year Linux user, the painful irony is watching the ecosystem recreate the bloat it opposed. Even after applying limits, browsers and CLI tools crash from pure memory bloat on a machine with 8 GB of RAM! That was an entire server in those days — no complex logic, just waste. That's the tangible price of the discipline we've lost.
I've always wondered why we reinvent the wheel. Things may need overhauling, but not at the cost of a browser that eats up CPU resources and memory. When I started with my current main open source project's language, it was around Windows 3.1. I know the stuff — I don't really need AI to do my job :-)
Love this!
You're the software developer friend I've never had. We should be friends! Ha
But seriously, I spent years mentoring developers on efficiency. It's a lost art form on enterprise applications (at least where I've been). Now I see 10 microservices built for products handling 1000s of requests per DAY (not second). And it feels like a complete waste. It's not efficient with memory, CPU or IO usage. But more than that it's not efficient for the devs either, or the support staff. It feels like wasted productivity...
However, I also understand that efficiency sometimes comes at the cost of training and man-hours, so businesses tend to reward speed over efficiency. Resources aren't a constraint anymore (not until budget season), and so businesses have trained developers this way. There are still efficiency-driven developers, and when it's done well they're praised, but overall software is not forced into the same level of discipline as other engineering disciplines like electrical, mechanical, or industrial engineering. When you mess up a circuit design and it gets to production, it can cost hundreds of thousands of dollars to fix, which forces better discipline up front.
I wish we could find a way to create that same level of discipline with Software where that high schooler that learned how to program python last weekend, doesn't think he's a genius because he made something work... but hey that's just me.
Brilliant analogy. We need to make software waste as tangible as a burned PCB.
Consider me a friend!