[Linux-aus] contest proposal
jon.maddog.hall@gmail.com
jonhall80 at comcast.net
Tue Jan 2 23:48:51 AEDT 2024
So many places to comment...so early in my morning...
> But it is true I find it hard to get excited about RAM usage when, in
> what seems like just a few years ago, I was amazed PCs were zooming
> through the 32-bit barrier, yet now I carry a battery-powered device with
> 64 bits of address space, more connectivity, screen resolution, removable....
As someone who kicked off 64-bit addressing in Linux, I can tell you that 64-bit architecture is used in systems today not because every application *needs* 64 bits, but because many need 33. It is also harder for developers to maintain both a 32-bit version and a 33-bit (or larger) version of the same code, so everything moves to 64-bit.
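(For the record: 2^32 bytes is 4 GiB, so any single process that wants more than 4 GiB of address space, say a 6 GiB in-memory cache, already needs that 33rd bit.)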
I went through the move from 16-bit address spaces to 32-bit address spaces, and I really do not want to go through a move from 64-bit address spaces to 128-bit address spaces. Fortunately there are not too many applications that could utilize a 128-bit address space....
But I digress...
It is *typically* not the instruction code that *needs* the larger amounts of main (and cache) memory, but the ridiculous amount of data space that people build into their applications AND (as noted) the wild non-use of reasonable library management.
However, the other performance issue I mentioned is more a matter of engineering discipline: profiling your code, perhaps realizing that a tiny bit of it is using a huge amount of resources, and asking wouldn't it be nice to reduce that somewhat?
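To make that concrete, here is a toy C program of my own (nothing from this thread; the file name, sizes, and tool invocations are purely illustrative) where a profiler immediately points the finger at one function:

    /* hotspot.c -- toy example for the profiling discussion.
     * Build with profiling enabled and see where the time goes:
     *   gcc -O2 -pg hotspot.c -o hotspot && ./hotspot && gprof hotspot
     * (or: perf record ./hotspot && perf report)
     */
    #include <stdio.h>
    #include <stdlib.h>

    /* The "tiny bit of code" that eats nearly all the cycles. */
    static double slow_sum(const double *a, size_t n)
    {
        double s = 0.0;
        for (size_t i = 0; i < n; i++)
            for (size_t j = 0; j < n; j++)  /* accidental O(n^2) work */
                s += a[i] * a[j];
        return s;
    }

    int main(void)
    {
        enum { N = 20000 };
        double *a = malloc(N * sizeof *a);
        if (!a) return 1;
        for (size_t i = 0; i < N; i++)
            a[i] = (double)i / N;
        printf("%f\n", slow_sum(a, N)); /* profiler shows ~100% of time here */
        free(a);
        return 0;
    }

Five minutes with gprof or perf on something like this is the "engineering discipline" in question.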
I am DISTRESSED when computer professionals and academics start talking to me about how we have large memories and fast processors and we do not need to be concerned with efficiency anymore.
>(I'm sorry, but re-writing
> sin and cos doesn't seem comparable.)
NO, you are WRONG, WRONG, WRONG. It is EXACTLY COMPARABLE.
Many programs can be made much more efficient by looking at the very lowest levels. Obviously a bad algorithm will not magically become much better if you optimize its code...choosing a better algorithm would be much better, but that would require a re-design of the code, something that you have conceded people will probably not do.
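As an illustration of what a low-level rewrite of sin() can look like, here is a sketch of my own (the table size and accuracy are arbitrary choices, not anyone's production code) that trades a libm call for a table lookup:

    /* sintab.c -- illustrative table-driven sine.
     * Build: gcc -O2 sintab.c -o sintab -lm
     */
    #include <math.h>
    #include <stdio.h>

    #define TAB_SIZE 1024
    static double sin_tab[TAB_SIZE];

    static void init_tab(void)
    {
        for (int i = 0; i < TAB_SIZE; i++)
            sin_tab[i] = sin(2.0 * M_PI * i / TAB_SIZE);
    }

    /* Approximate sine: one lookup instead of a libm call.
     * Good enough where a little error is acceptable
     * (graphics, audio), disastrous where it is not. */
    static double fast_sin(double x)
    {
        double t = x / (2.0 * M_PI);    /* map x onto [0,1) turns */
        t -= floor(t);
        return sin_tab[(int)(t * TAB_SIZE) % TAB_SIZE];
    }

    int main(void)
    {
        init_tab();
        printf("libm : %f\n", sin(1.0));
        printf("table: %f\n", fast_sin(1.0));
        return 0;
    }

Whether that trade is a win is exactly the kind of question profiling answers; the point is that such low-level rewrites are possible and sometimes matter.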
And part of Russell Coker's proposal should have the nice side effect of making people aware of the problem so that they build the answers into their code in the first place.
>I also think the world at
> large isn't going to give up on wanting software that works and can be
> relied upon to not take down a fair chunk of a country's
> telecommunications network. [1]
So what in Russell Coker's proposal is going to make the software any worse? And what might make it considerably better?
My husband works on software for a very large and well-known gaming system. He recently implemented a mechanism to trap exceptions generated by the hardware that were not trapped before. Dozens of potentially fatal exceptions appeared in code that was thought to be tested....exceptions that would, from time to time, cause the applications to mysteriously crash.
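On glibc-based Linux systems the same idea fits in a few lines. This is my own minimal sketch, not the code he wrote, and feenableexcept() is a GNU extension:

    /* fptrap.c -- unmask hardware floating-point exceptions so they
     * kill the process loudly instead of silently producing NaN/Inf.
     * Build: gcc fptrap.c -o fptrap -lm
     */
    #define _GNU_SOURCE
    #include <fenv.h>
    #include <stdio.h>

    int main(void)
    {
        /* By default these exceptions are masked and the hardware
         * quietly returns NaN or Inf; unmask them to get SIGFPE. */
        feenableexcept(FE_INVALID | FE_DIVBYZERO | FE_OVERFLOW);

        volatile double zero = 0.0;   /* volatile defeats constant folding */
        double bad = 1.0 / zero;      /* raises SIGFPE right here */
        printf("never reached: %f\n", bad);
        return 0;
    }

Run that under a debugger and the "mysterious crash" turns into an exact line number.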
> So, it's a rock hits a hard place. Perhaps the Queen Mary hits Norfolk
> Island is a better metaphor, because these software projects are huge
> and cornerstones of the current internet, so nothing is going to change
> course quickly.
True, but if the Titanic had started to change course just a minute or so earlier they might have missed the iceberg altogether and we might have been spared several long movies.
Sorry that this idea does not float your boat, but I think most people would rather plug the leaks while they can.
>
> Personally, I'd give up on the Queen Mary entirely. Changing its course
> is just too hard. I'd start with the smaller projects one person can fit
> their head around
And that is what we are talking about. Plug the leaks, but at the same time teach better boat building.
>
> [0] From what I can tell, the EU New Product Liability Directive is
> mostly about defanging software shrink wrap licences. They will not be
> able to disclaim liability any more. You can't disclaim liability for a
> toaster that electrocutes someone from a design flaw, so I don't know
> why software has got away with the same thing for so long. The
> implication for open source software is that for a supplier to be liable
> you have to have bought the toaster from them, and then the toaster must
> have killed you. Kinda - you get the idea. Software killing someone who
> downloaded it from a public repository without your knowledge doesn't
> fit the bill no matter how much it may seem like it should. But the new
> law can still impact open source developers. If, for example, you were a
> log4j developer who earned money on the side by fixing bugs in it, then
> the dollar amount of damage done makes my eyes water, and you may have just
> made yourself liable for it. Once this passes, I wouldn’t do that sort
> of thing without getting professional indemnity first.
>
> [1] Granted, we all know that was more of an operations issue. But it
> sure raised a lot of eyebrows, as in called before a senate committee
> for a "please explain" type eyebrow raise. To the owners of those
> eyebrows it's all just computers all the way down - don't give us any of
> this finger pointing crap.
I am not really sure where electricians came in. However, I would recommend that any optimizations be passed by the maintainer of the code and accepted by them.
And now I can go back to sleep.
md