This post is about general resource usage optimization. I'm not only talking about CPU usage here, but also RAM, disk, network, etc. I'm also not talking about the obvious cases like high-performance computing or games. I am talking about applications and services in general, be it a mail client, a calendar app, some background notification task, some server backend, you get the idea…
Back in the 80s (and early 90s) most computers and gaming consoles were 100% deterministic. For example, you could tell what a C64 was doing on every clock cycle, how the hardware interacted, when the video chip took over the bus to read RAM contents, and so on. Being efficient on those devices required a lot of in-depth knowledge about the underlying system.
If you ran into performance issues on a C64, you couldn't just upgrade the CPU or install more RAM. The only solution was to optimize the shit out of your code. That included counting clock cycles, rearranging opcodes, replacing opcodes, and moving code/data around to avoid page-boundary crossings. And yes, I am talking about machine language here, not a higher-level language. There were just so many ways you could screw up performance on those systems, and a lot of skill was needed to fully utilize the device you were developing for.
Well, on most platforms of today you don't write assembly any more. It was still needed during the 90s, as compilers for higher-level languages weren't that good at optimizing. If you're using a C++ compiler today, you will most likely not benefit from code written in pure assembly any more. So clock-cycle counting, other hardcore tricks, and in-depth knowledge of the underlying hardware are no longer needed. Still, you should be aware of the system you're developing for. You should know its limitations. Another important thing: have in-depth knowledge of the language you're using.
While hardcore optimization sessions are mostly not needed today, you can still write optimized code. Somehow the solution to every performance issue today seems to be: upgrade your system (add more servers, more RAM, close background tasks, do this, do that). Well, IMHO this is in most cases bullshit, laziness, or a twisted way of looking at code.
So why should you optimize your resource usage then?
In the past, a computer mostly had a constant energy consumption level. There was no speed stepping or energy management, they were single-CPU systems, and they mostly did not run on battery power. Today nearly everything has at least two hardware threads, speed stepping, and energy management, and more and more devices run on batteries.
There are obvious reasons to optimize resource usage, like a sluggish UI or a 20-second startup delay in your new application. But there are other, not so obvious reasons that most developers seem to forget about.
I know there are limits to how resource friendly you can be, but a messenger application that consumes 10% CPU while idle is not such a case.
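A common cause of that kind of idle CPU burn is a busy-polling loop where a blocking wait would do. Here's a minimal sketch in Python (the function names and the queue-based setup are my own illustration, not from any particular messenger):

```python
import queue
import threading


def busy_poll(q):
    # Anti-pattern: spins a full CPU core even when there is nothing to do.
    while True:
        try:
            return q.get_nowait()
        except queue.Empty:
            pass  # loop immediately and check again -- "idle" at 100% CPU


def blocking_wait(q):
    # Resource-friendly: the thread sleeps in the kernel until an item
    # arrives, consuming effectively zero CPU while idle.
    return q.get()


q = queue.Queue()
# Simulate a message arriving 100 ms later, on another thread.
threading.Timer(0.1, q.put, args=("ping",)).start()
print(blocking_wait(q))  # -> ping
```

The same principle applies to sockets, timers, and file watching: let the OS wake you up instead of asking it in a loop whether anything happened yet.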
Well, if someone invites you into their home, do you bring 100 friends along to occupy every room? Do you switch on all their devices? Do you damage the furniture? So what do you think gives you the right to behave like that on my PC/laptop/smartphone? Developing applications that should entertain users or be helpful, and then saying 'screw you', might not be the best way to be successful.
If your software significantly reduces the battery lifetime of my smartphone, you sure will get uninstalled. Your new messaging app sends uncompressed JSON/XML over the internet? Well, 'screw you'.
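And fixing that particular sin is almost free. As a rough illustration using Python's standard library (the payload structure is made up for the example), gzip typically shrinks repetitive JSON to a small fraction of its original size:

```python
import gzip
import json

# Hypothetical chat payload; field names are invented for illustration.
payload = json.dumps(
    {"messages": [{"id": i, "text": "hello world"} for i in range(100)]}
).encode("utf-8")

compressed = gzip.compress(payload)

# Repetitive JSON compresses very well; the exact ratio depends on the data.
print(f"raw: {len(payload)} bytes, gzipped: {len(compressed)} bytes")
```

Most HTTP stacks will even do this for you if you just set `Content-Encoding: gzip` / `Accept-Encoding: gzip` appropriately; radio time on a mobile device costs battery, so fewer bytes on the wire means longer battery life.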
Why is it that developers don't care about their users nowadays? Why do they think they own my device? There are so many updates to mobile applications (and even mobile operating systems) that list increased battery lifetime in their change logs. Why is this even necessary? You fricking know that you're developing for a battery-powered mobile device, so you know the limitations in advance. Do you know why this is necessary? Because you don't give a shit about your users. You just don't care. And then, when people scream in your face about how crappy your application is, how much battery it's draining, how much traffic it's using, and your boss realizes that you're about to lose revenue, then you come up with a batch of crappy emergency fixes to improve something that could have been done correctly right from the start.
You know what? There were times without internet, times when software was unchangeably burnt into ROMs. Do you know what happened to your business when you screwed that up? No 'download a patch' here, nope. You had to recall every fricking copy of the media, destroy it, and replace it with a working version … at least twice the money spent. This could easily have meant that your business was dead. Or think about the Mars rover. Imagine the same low quality of software there. Millions and billions of dollars wasted.
So, why was it, and why is it still, possible to write quality software, and you're not able to? Because you just don't care … you value your own comfort higher than that of your future users. You're lazy: either too lazy to come up with a better solution or too lazy to educate yourself about improvements upfront. There's no excuse for any of this. If you know nothing about the language you're working with or the system you're developing for, and you're therefore writing an application that wastes my time and my money, then don't expect me to use your shit or even give you feedback about obvious changes. Get your shit together and start caring about the people you develop your software for.
“Resource usage optimization, Motherfucker - Do you speak it?”