Do we really need faster computers to write more bloated software?


It’s crazy how powerful hardware has become these days; performance seems to double every three to six months, at least in the ARM world.

Now we have the quad-core RK3188 (Cortex-A9) running at 1.8 GHz. In three months there will be a new 8-core processor, then a 16-core one, and so on.

I’m starting to ask myself: do we really need such powerful processors? Even the A10, a single-core Cortex-A8 at 1 GHz, is capable of decoding and playing video. What the hell do you need more for? Can you watch 16 videos at the same time?

The only consequence of this powerful hardware that I see is that programmers write more and more bloated software on it. They become lazier: because the hardware is fast, they don’t bother to learn algorithms or to optimize their code. Why use quicksort when my computer can sort the array in a microsecond using bubble sort?
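The sad part is that the lazy choice isn’t even close. As a rough illustration (my own sketch in Python, not code from any project mentioned here), timing a textbook bubble sort against a textbook quicksort on a few thousand random numbers shows the gap the fast hardware is hiding:

```python
import random
import time

def bubble_sort(a):
    """O(n^2): repeatedly swap adjacent out-of-order pairs."""
    a = list(a)
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # already sorted, stop early
            break
    return a

def quick_sort(a):
    """O(n log n) average case: partition around a pivot, recurse."""
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    return (quick_sort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quick_sort([x for x in a if x > pivot]))

data = [random.randint(0, 10**6) for _ in range(5000)]

t0 = time.perf_counter()
bubble_result = bubble_sort(data)
t_bubble = time.perf_counter() - t0

t0 = time.perf_counter()
quick_result = quick_sort(data)
t_quick = time.perf_counter() - t0

assert bubble_result == quick_result == sorted(data)
print(f"bubble sort: {t_bubble:.3f}s   quicksort: {t_quick:.3f}s")
```

On a modern machine the bubble sort run takes on the order of a second while the quicksort finishes in milliseconds, and the ratio only gets worse as the array grows; “the hardware is fast enough” is an excuse with an expiry date.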

This is crazy!

Back in the 1980s, Borland Turbo Pascal 3.0 fit on one 5″ floppy disk in just 39 KB. It was a minimalistic IDE with an editor, Pascal compiler, linker, and run-time library, and it compiled and linked something like 10,000 lines of Pascal per second on a humble 386 machine at 25 MHz!

A few weeks ago we installed Code Composer Studio on a 3 GHz machine with 4 GB of RAM and wrote a simple, EMPTY “hello world” for the AM3352, just to check that bare-metal programming works with our TMS320-XDS100-V3 JTAG.

After thrashing the HDD for nearly 15 minutes, this one-liner was finally compiled!

We couldn’t believe our eyes! Whoever wrote this bloatware should be ashamed of themselves; this is simply ridiculous! I couldn’t imagine ANYONE with common sense ever considering working with such a compiler, so I called a friend of mine who I know uses CCS because he has no other choice.

He told me that this is absolutely normal, and that he and his colleagues who work with CCS have developed very bad habits during their time with it. Working with CCS makes him smoke at least two packs of cigarettes per day!

He says their development process usually goes like this: they hit the compile button, then go out for coffee or a smoke while the code compiles. After 10–20 minutes they come back, run the code, and find bugs; then they try to remember what they changed before hitting the compile button, which, in many cases, they’ve already forgotten 🙂

Here is another blog article about 50 bytes of source code that take 4 GB of RAM to compile.

What the hell is going on with the programming industry?

29 Comments

  1. Boyan Peychev
    Aug 23, 2013 @ 11:48:07

    It is not only bad code. Multitasking is a normal thing in my daily work. And YES, I use my 8 cores and 8 GB of RAM (Ubuntu desktop environment) in my daily job, with over 10 active applications: browser, mail client, NetBeans, database tools, terminal, Skype, Pidgin, LibreOffice, etc. And YES, I want my computer to be fast so I can do my work faster.


  2. PaceyIV
    Aug 23, 2013 @ 13:59:57

    I completely agree with you!
    There’s nothing worse than a software engineer!
    Any other engineer has to study a lot before releasing any new device! They can’t make mistakes! If that happens, a bridge may collapse, a ship can sink…
    They cannot fix their bugs with an easy update!
    A software engineer is really too lazy! They make too many bugs! They don’t think much about performance or security in their software! If the software is too slow: your PC is too old for this software!
    In Italy we call software engineers “pigiatasti”: they are just people who press some keys. 🙂

    I’m an electronic engineer but I’m working as a software engineer.


  3. guille36
    Aug 23, 2013 @ 14:11:19

    hey man, I think you have only one brain!


  4. funlw65
    Aug 23, 2013 @ 17:24:44

    For the software part, I agree with you: no more good programmers these days.
    For the hardware, I guess they want to offer an alternative to Intel processors and to prove that ARM can be good enough for desktop computers. Anyway, you know that this race is about who sells more, better, and faster ARM processors.

    BTW, let’s pretend that I want to use an OLinuXino as a replacement for my stupid Windows Vista laptop (which I use with Linux). Does the OLinuXino have enough processing power and memory to run the Eclipse IDE with the Atmel toolchain for AVR firmware development, as I do with my laptop? If yes, can I do this continuously (day and night) without the need for a cooling fan (those drive me crazy)? Do this test and you will have some answers (for you and for us) to your article. I mean, it is not enough to prove that LibreOffice runs on the existing hardware; you must prove that I can do useful tasks inside LibreOffice, the Eclipse IDE, PiTiVi (video editing), Qt development, and so on.

    Do you realize that you could provide PC hardware alternatives (this is crazy enough)? I guess you do.


  5. XFer
    Aug 23, 2013 @ 18:58:37

    Totally agree: we don’t need more raw power, especially in the ARM world.
    Not only are programmers lazy in their own coding: they don’t even exploit tools that are already out there!
    To name a few: they don’t care enough about compiler flags, multithreading (OpenMP and such), or SIMD-optimized libraries (yes, there are NEON-optimized libraries out there). Many times I feel like Don Quixote tilting at windmills.

    On the hardware side, what we may need is better power efficiency (28nm instead of 40nm for A20, for example).


  6. Radu - Eosif Mihailescu
    Aug 24, 2013 @ 03:04:11

    +1 to that. Borland Pascal on a Cyrix 486DX2v @66 MHz with 4 MiB of RAM blazed through source like it was nothing. IIRC, the only few times I saw the status text say “Linking…” for more than a fraction of a second were with 15k+ line programs spread across a dozen or more units. To be clear, that’s 15k+ lines of object-oriented, complex code!

    Today’s industry has *no* excuse whatsoever for its current state, or for the highly unprofessional attitude towards work on the part of programmers, many of them self-proclaimed “software engineers”.

    Want more examples? Here you go:
    Nokia 9000i Communicator: i386SX @24 MHz, could run a full-fledged IMAP client while remaining perfectly responsive
    All Williams WPC pinball machines: Motorola 68B09EP @2 MHz, runs cursive 128×32×4 animations while keeping score and responding to player input with no lag
    And, not in any way the least, anyone had a Sinclair Spectrum clone as a kid? Now that’s one mean machine, and it’s still nothing compared to the Amiga, which is still nothing compared to an i286.


  7. brucedawson
    Aug 24, 2013 @ 04:44:58

    That sort of performance from CCS is completely unacceptable. Rubbish. I hope you are complaining loudly to the company you purchased it from.

    Just curious — have you tried profiling it? On a machine with sufficient RAM there should not be significant disk I/O during builds but it sounds like there is. Maybe it needs more memory? Not that that is excusable, but if you can solve this by upgrading to 16 or 32 GB of RAM in your development machine then it would be well worth it. That much RAM should let the OS cache everything that could possibly be needed.

    If it’s burning CPU for 15 minutes then that’s a whole different type of crazy — hard to imagine.

    I’m the one who wrote about 50 bytes of code taking 4 GB to compile and I would caution against taking that discovery too seriously. VC++ is normally far better behaved than that — that (illegal) program just happened to expose a bug. I routinely compile and link hundreds of source files in minutes or less using VC++. Incremental builds of huge DLLs can just take seconds.


    • OLIMEX Ltd
      Aug 24, 2013 @ 08:23:06

      You may bet that we will never purchase CCS 🙂 It has a 30-day evaluation when you install it, and this is what we used.
      I haven’t spent time profiling it, but it seems all operations go through the disk; the installation itself is a few GB.


  8. boz
    Aug 24, 2013 @ 06:39:50

    +1 to the programmers in the 80’s


  9. Tom
    Aug 24, 2013 @ 11:17:35

    We need fewer people making software who are not software engineers. Learning how to make decent software takes years, and there are a lot of people who think they can get away with not learning the craft. You can’t.

    And yes, we need more raw power.


  10. Ricko
    Aug 24, 2013 @ 13:59:10

    Yep, I recognise the syndrome. I used to develop embedded code for barcode label printers on an 8080 in assembler, and had 8192 bytes of ROM and 128 bytes of RAM to fit the whole thing in. Makes you wonder if the bloatware sellers have a deal with the hardware makers to keep us buying more and bigger machines?


  11. Tom
    Aug 24, 2013 @ 17:14:08

    It is meaningless to compare those times with the times we are in now; things have changed, and for the most part for the better.
    For most of us it is not an option to manually write, for example, a USB software stack in assembly. Or a network stack. (Or a network stack at all: most of us will use a ready-made stack/library for a lot of things.) Those are the kinds of devices customers are asking for now, and they want them in a fraction of the time the hackers of those days had.
    The way this industry is heading is less and less comparable to how it was done in those days. Heck, when I started, everyone was fiddling with their crappily written software to fit it in the available 256 bytes of RAM; 10 years later we are using 32-bit microcontrollers exclusively (and hooray for that) without that stupid, time-consuming, unproductive stuff.
    We are living in a great time if you are into this craft, and I’m really glad to be a part of it.
    Crappy craftsmen are of all times; deal with it.


  12. Dude Durham Duncan
    Aug 25, 2013 @ 00:47:48

    Now you’re asking whether we need computers at all. We didn’t for thousands of years. As long as there’s still a lot that current computers don’t allow us to do, faster computers are good. Let’s not forget how many of the concepts we take for granted today started on supercomputers, then took years to show up on affordable hardware. Think of the graphical user interface or programming systems (there are example videos of Engelbart’s and Kay’s work online). After all, aren’t we still in an AI winter? The algorithms developed during the blossoming of AI research haven’t changed since then, but computers meanwhile have become powerful to the point where running those algorithms might become feasible. Supercomputers for everyone are on the horizon; have a look at the Parallella board that will start shipping production units in October.

    On the other hand, some of the ideas present in Engelbart’s 1968 “Mother of All Demos” still have not reappeared in software to this day, which is a shame. Because of the way the commercialization of hardware and software took place, we are now in a somewhat unsatisfactory situation; but then, we are in a general sense, too. Writing and optimizing software takes time. Programmers’ time used to be cheap compared to machine time, so optimization could be done. Today, computers are certainly cheaper than programmers, so if an inefficient program can be hacked together faster than a well-thought-out one, we get bloatware. Overdoing it gets us Wirth’s Law.

    In the future we will have hundreds and then thousands of cores. We already have that with graphics cards, and we will probably see it with ARM cores too, although it may take a while. We can already have small and simple computers on GreenArrays multi-computer chips (by the way, @Olimex, it would be great if you could make a cheap board or SOM with the GA144 on it, because I’m not aware of any such board; GreenArrays’ own eval board is prohibitively expensive; maybe I should make a wish via your web site). With the Propeller, we also have a multicore microcontroller from Parallax, with the Propeller 2 just around the corner (Olimex should also consider using them on a board). And then there are the hybrid SoCs from Xilinx and Altera that feature reconfigurable hardware; these are also interesting candidates for open hardware boards or SOMs. The idea is not that new: it may have started with the embedded PowerPC core in Xilinx’s Virtex-II Pro, and they now use ARM cores for their Zynq SoCs.


  13. Emiliano Daddario
    Aug 25, 2013 @ 03:18:11

    CPU speed won’t grow according to Moore’s law forever; it’ll grow more and more slowly year by year. The “new Moore’s law”, roughly speaking, is the increase in the number of cores. So the forward-looking question is not “do we need faster cores?” but rather “do we need more and more cores?” Yes. That phenomenon would be useless if software didn’t benefit from it (which is exactly the point of your article). But it does benefit, mostly thanks to the pure functional programming paradigm. Clojure is the closest to perfection: in Clojure you get parallelization almost for free. Without functional programming, parallelization is harder and uglier, maybe too hard and too ugly. Functional programming and multiple cores are a perfect marriage. But neither Clojure nor any super-modern language will save a lazy programmer who doesn’t at least write unit tests (test-driven development is even better).
    I’m an Italian software guy, not a hardware one, but I know that a compiler (C++, for example) should compile short code within a fraction of a second where possible, especially if the compilation is part of an automated test suite. Some automated tests have to be run several times a day, so they mustn’t take long. Similarly, a software user shouldn’t see an important piece of software freeze several times a day, especially if it freezes because of bad unit testing, or no testing at all.


  14. Michael Shimniok
    Aug 25, 2013 @ 08:49:52

    I will definitely say that learning embedded programming even as a hobbyist, and particularly on ATtiny devices, has really opened my eyes as to what is possible with not much (RAM, CPU, etc.)


    • Emiliano Daddario
      Aug 25, 2013 @ 10:38:26

      I agree, I’m a hobbyist with respect to embedded programming, and I find it very instructive in that sense.


  15. Tom
    Aug 25, 2013 @ 09:53:42

    IMHO, programming under such constraints compromises the quality of the software: good programming practices are abandoned because of the limitations of the hardware.


  16. Dimitar Pavlov
    Aug 25, 2013 @ 11:37:58

    Most SW people perceive product development only from the SW perspective. So they see only sub-optimal SW solutions and are not able to evaluate combined HW+SW solutions in terms of cumulative cost of development and manufacturing, and cannot evaluate parameters such as time to market, competition, etc. Hence the confusion.


    • John S
      Sep 06, 2013 @ 16:26:43

      Most software developers design their software as a stand-alone entity on a PC. They don’t accept that other things might also be running. They basically default everything to benefit their software and not the overall performance of the user’s PC. If they did, the user would get asked how many resources they want to give up for that program, such as quick launch, auto-updates, and so on. Most people don’t even look at the custom install options, fearing they will mess something up. My biggest gripe has been the lack of software makers providing a proper uninstall application for their software; so many times there are fragmented files left on the PC.


  17. Victor
    Aug 27, 2013 @ 16:21:52

    This is why I write my firmware with my own setup (Code::Blocks), and not the bloatware from the microcontroller vendors. I just download the libraries, invest a day in getting all the libraries and directories set up correctly, and I’m off!
    By the way, FPGA tooling is even worse… Tried installing the Altera toolchain lately?


  18. John S
    Sep 06, 2013 @ 16:18:56

    OK, so I have used Windows since 3.1, spent several years on OS X on Macs, and lately I started trying Chrome OS on a Chromebook. My conclusion is that modern operating systems themselves are pretty efficient; it’s the software and apps that are not. Eventually what happens is that too many programs run things in the background even when you’re not using them, supposedly to speed up their product, or to update it. Pretty soon you can have well over 100 processes running.
    Maybe they are not using a lot of CPU cycles, but they are still resident in memory. I think security software has always had a bad name for slowing PCs down; much of that is because of the active scanning required to protect the PC. When antivirus did not have live scanning and simply ran a scan every week or so, it used much fewer resources. I think you have to presume that much of the added bulk of software is simply due to the user wanting stuff done automatically, from updating the OS to software programs: everything has its own updater running at some point to check for those precious updates. But after using the Chromebook, I realize that even a dual-core ARM at 1.7 GHz can easily run a browser and play video, though even that can run into trouble when you try to do too much with it. I still find many reviews complaining about the AMD APUs and Atom-like CPUs that are simply too slow. It’s safe to assume that no CPU below 2.0 GHz will satisfy the majority of PC users.
    Even though the OS has improved in performance, the rest of the stuff has only replaced that freed-up speed with more bloat. Try running Windows 7 without a security suite at all: turn off Windows Defender and disable all programs with stuff running in the background, so you are running just system processes. Windows 7 is truly fast!


  19. patrick295767
    Jul 07, 2015 @ 22:56:32

    I completely agree with you!

    Please visit the clean source code of MS !!

    CCS is a nightmare for experienced coders.

    You may visit one post from me 🙂


  20. Trackback: We’re approaching the limits of computer power – we need new programmers now –
  21. Trackback: We are approaching the limits of computing power and we need new programmers – Gamezone Bulgaria
  22. Trackback: We are approaching the limit of computing power – we need new programmers – Developers
  23. Trackback: We’re approaching the boundaries of laptop energy – we'd like new programmers now | John Naughton - SlashPB
  24. Trackback: We're approaching the limits of computer power – we need new programmers now | TANAKA Precious Metals
  25. Trackback: Tanaka Precious Metals Group | Computers are approaching their “limit”. Why new programmers are needed
