PostPosted: Mon Jul 07, 2014 1:24 am 

Joined: Wed Jun 26, 2013 9:06 pm
Posts: 56
Compare my SNES homebrew code to licensed developers' code:

My way: Write routines as simply as possible. Results in 100 objects onscreen with no slowdown.
Their way: Write over-complicated routines and fix them later. Results in slowdown with 4 objects onscreen.

What I want to know is why programmers make it routine to wait until the very end to do any speed optimization, when they could have dodged the problem entirely by using simpler algorithms in the first place?

I've asked this question on several websites before and I always get the same response: "they had deadlines." Well, if they had deadlines, why bother writing a slow collision routine that is 50 lines of code long instead of a fast collision routine that is only about 15 lines?
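
To make the contrast concrete: a "15-line" collision routine in the simple spirit I mean can be little more than an axis-aligned bounding-box overlap test. A minimal sketch in C (hypothetical object layout and names, not the actual SNES code, which would be 65816 assembly working on tables):

Code:
#include <stdbool.h>

/* Hypothetical object layout, purely for illustration. */
struct obj {
    int x, y;   /* top-left corner, in pixels */
    int w, h;   /* width and height, in pixels */
};

/* Axis-aligned bounding-box overlap test: four compares, no multiplies. */
static bool hit(const struct obj *a, const struct obj *b)
{
    return a->x < b->x + b->w && b->x < a->x + a->w &&
           a->y < b->y + b->h && b->y < a->y + a->h;
}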


PostPosted: Mon Jul 07, 2014 2:02 am 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8481
Location: Midwestern USA
Aaendi wrote:
What I want to know is why programmers make it routine to wait until the very end to do any speed optimization, when they could have dodged the problem entirely by using simpler algorithms in the first place?

I've asked this question on several websites before and I always get the same response: "they had deadlines." Well, if they had deadlines, why bother writing a slow collision routine that is 50 lines of code long instead of a fast collision routine that is only about 15 lines?

Professional programmers, who are being paid to write software, do have deadlines. The first deadline is to get a working program put together. Optimization is secondary to having something to demonstrate to the boss. If you ever decide to go into professional software development, you will understand.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Mon Jul 07, 2014 2:12 am 

Joined: Fri Dec 11, 2009 3:50 pm
Posts: 3367
Location: Ontario, Canada
I happened to notice a pertinent item earlier tonight on Hacker News:

Rob Pike's Rules of Programming
Quote:
Rule 1. You can't tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second guess and put in a speed hack until you've proven that's where the bottleneck is.

He lists five rules altogether, all worth reading. The HN discussion is here.

cheers,
Jeff

_________________
In 1988 my 65C02 got six new registers and 44 new full-speed instructions!
https://laughtonelectronics.com/Arcana/ ... mmary.html


PostPosted: Mon Jul 07, 2014 2:37 am 

Joined: Sun Jul 28, 2013 12:59 am
Posts: 235
The procedure that I learnt quite a number of years ago is:

  • Make it work.
  • Make it right.
  • Make it fast.

As an embedded developer, particularly on some other company's hardware, you will run into the following:

  • Aggressive deadlines, to the point where simply getting everything to work in time is difficult.
  • Requirements that change frequently, causing already functioning code to have to be rewritten.
  • Incomplete and inaccurate documentation, sometimes horribly so.
  • Inefficient, overgeneral, and buggy library code that you're not allowed to modify or replace.
  • Buggy hardware.
  • Development systems that don't fully match the behavior of production systems (AIUI, some NES development systems support the 6502's decimal (D) mode, while the actual deployed hardware does not, and this actually affected score-keeping in some games).
  • Limited available memory (both ROM and RAM).
  • Unfamiliarity with the platform, especially early on in the platform lifecycle.
  • Managers who prioritize adding new features over making sure that the existing features work right and that new features will be easier to add in the future. There's no time to sharpen the saw when you need to cut down the Sahara forest, after all!
  • Barely-competent co-workers, often with the ability to talk themselves out of trouble and into a higher-paying position at the company.
  • Organizational policies that prevent hiring competent people.

So, yes, a lot of code out there is terrible. But a lot of it is written under terrible conditions, some of it by otherwise-excellent programmers. And some programmers are simply horrible at their jobs, but it tends to work out a lot better if you give them the benefit of the doubt... Unless you're looking to hire them, which is another conversation entirely.


PostPosted: Mon Jul 07, 2014 3:41 am 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8481
Location: Midwestern USA
As I said, getting the program to work is the priority. I've done a lot of professional development, and have always made sure that stuff worked before attempting to make stuff work better.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Mon Jul 07, 2014 3:49 am 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8481
Location: Midwestern USA
Dr Jefyll wrote:
I happened to notice a pertinent item earlier tonight on Hacker News:

Rob Pike's Rules of Programming
Quote:
Rule 1. You can't tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second guess and put in a speed hack until you've proven that's where the bottleneck is.

He lists five rules altogether, all worth reading. The HN discussion is here.

cheers,
Jeff

Actually six rules. :lol:

Eric Raymond expands on Pike's rules, but basically ends up saying no more than Pike, using many more words in the process.

Doug McIlroy also had something to say about all this:

    (iii) Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Mon Jul 07, 2014 4:22 am 

Joined: Wed Jun 26, 2013 9:06 pm
Posts: 56
Dr Jefyll wrote:
I happened to notice a pertinent item earlier tonight on Hacker News:

Rob Pike's Rules of Programming
Quote:
Rule 1. You can't tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second guess and put in a speed hack until you've proven that's where the bottleneck is.

He lists five rules altogether, all worth reading. The HN discussion is here.

cheers,
Jeff


SNES developers seemed to follow rules 1 and 2, but not 3 and 4. Most of the slowness comes from not keeping the algorithms simple.


PostPosted: Mon Jul 07, 2014 4:33 am 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8539
Location: Southern California
Some managers may still be looking for "productivity" in terms of number of lines of code, even if being sloppy and just punching it out to get it working before the deadline does not actually yield any net productivity advantage over doing a good job of it with fewer lines by the same deadline. I've had to re-write code that was done that way by someone else. I couldn't believe the waste of everything. There were no advantages other than possibly impressing someone that he could keep the keyboard quite warm.

There may be some similarity to a messy work space. If you're always cleaning it, putting away things before you're done with them, etc., you waste time. Allowing some mess can let you be more productive. OTOH, if the mess surpasses a certain point, it begins to waste your time. If, however, the mess is on the complex breadboard you're building instead of on the workbench, you're sure to have lots of problems that could have been avoided by being neater. The "mess" threshold needs to be much lower there.

In reading the page at Jeff's ycombinator link, much of it seems to be related to GUIs in systems with preemptive multitasking and memory protection and so on, which I do not work in; but I'm seeing several different things being discussed under the heading of "optimization."

One seems to be about understanding the problem better, and programming a more suitable solution. If you don't understand it well, you might be able to make something appear to work by brute-forcing it just to move on to some other part of the project, and then you may not ever get the chance to come back and do it right, so there are poorly done parts that survive an embarrassingly long time.

Another seems to be that you understand the problem just fine, but there's the matter of how well a solution is carried out. In this case, it seems that taking the time to do good job of it gives a good ROI pretty consistently. It may not seem to matter in one part of the program, but doing a good job of it leaves you with something you can re-use elsewhere to great advantage.

I have to disagree 80-90% with Rob Pike's rule #1 ("You can't tell where a program is going to spend its time"). If you don't know (and it's a situation where it matters), it sounds like you have probably lost a lot of control. I suspect there are too many layers of software between you and the machine for the job. It's a situation I hate to be in. Ironically, that situation probably comes from the idea that processing power and memory are cheap. The number of bugs and the maintainability of code are not improved by saying you don't need to optimize because you can cheaply move to more-powerful hardware.

As far as not needing to optimize because processing power and memory are cheap, that's something I cannot agree with at all. Look at the bloatware that modern GUI OSs are. I remember many years ago when a new technology was introduced that improved memory prices and speed a lot in one step, and immediately Microsoft was saying, "This is great because now we don't have to be as careful and we can get new software out faster," and what happened is that the user never got the benefit. Boot-up times made no net improvement, the "disc full" messages came up just as often, and there were just as many bugs.

A nice thing about Forth (in which everything is a "word," whether it's a program fragment, a constant, an array, a defining word, an operator, whatever) is that you can get a word going as a secondary, ie, in high-level, then later re-write it as a primitive, ie, defined in assembly language, for maximum performance, without having to change any of the code that calls it.
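
The analogous move sketched in C rather than Forth (hypothetical names, purely illustrative): callers depend only on the signature, so the definition behind it can later be swapped for a faster one without touching them.

Code:
/* Callers see only this signature, so the body can later be replaced by a
   hand-tuned or assembly version; loosely like redefining a Forth
   secondary as a code primitive, without changing any call sites. */
unsigned int checksum(const unsigned char *buf, unsigned int len)
{
    /* first pass: the simple, obviously correct definition */
    unsigned int sum = 0;
    while (len--)
        sum += *buf++;
    return sum;
}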

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Mon Jul 07, 2014 5:59 am 

Joined: Sat Mar 27, 2010 7:50 pm
Posts: 149
Location: Chexbres, VD, Switzerland
I absolutely hate those one-lined so-called "rules". They make absolutely no sense, as if you could summarize good and bad programming practice in 5 (or so) lines.

It depends so much on the context: whether you are in education or industry, whether you are writing for your hobby or because you're paid to do so, whether you have deadlines, whether you want to make elegant code. And even if in theory the programming language is not related to programming in itself, in practice it is, so yes, it will also depend on the programming language.

It is absolutely wrong to say you can't know where a program is going to spend its time. In fact, if you have half a brain you can guess it very easily. The only exception is if you are making system calls and have no idea what the library will do behind them, which may be pretty much always the case for people programming in very high-level languages, where they have no idea what they're actually doing on the machine.

And the sentence "premature optimisation is the root of all evil" is simply stupid and was probably a dumb joke, but people took it seriously.

I think it would be trivial to find one evil which is not due to premature optimisation. For instance, Nazism was not a premature optimisation. Enough said.

Now let's pretend "premature optimisation is the root of all bad programming practice (aka "evil")". Again, this is wrong. Some random bad programming practices that come to mind:

- Bad indentation
- Variable names that make no sense or aren't related to the variable's purpose
- Function names that make no sense or [...]
- Not commenting what a function expects as input and output
- Writing too much stuff on one line, like: while (--i != function_call(arg1++, arg2))
- Copy-pasting a function and making it only slightly different, instead of using an extra argument

None of those bad practices are premature optimisation, so again this sentence is plain wrong.

Now let's dumb it down to "premature optimisation is the root of some bad programming practice (aka "evil")".

Then, finally, this is true, of course (at last). If you want to make a super-optimal program before writing it, there is a higher probability you'll write spaghetti code and fail, or simply lose interest in continuing it because it's too much effort (if you're a hobbyist).

However, completely ignoring any kind of "optimisation" when programming is not a very good practice either. You should really think about the proper data structures first, so that you gain time and don't end up saying "oh, if I used structure Y instead of structure X, my program could be 10x faster" and having to rewrite everything.
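
A trivial sketch of that structure-X-versus-Y point (hypothetical C, made-up names): the same "is this object active?" query written against two different structures.

Code:
#define NUM_OBJECTS 256

/* Structure X: a packed list of active object ids; every query scans it. */
static unsigned char active_list[NUM_OBJECTS];
static int active_count;

static int is_active_x(unsigned char id)
{
    for (int i = 0; i < active_count; i++)
        if (active_list[i] == id)
            return 1;
    return 0;
}

/* Structure Y: one flag per object id, indexed directly.  Same question,
   constant time; choosing this layout up front means no caller has to be
   rewritten later. */
static unsigned char active_flag[NUM_OBJECTS];

static int is_active_y(unsigned char id)
{
    return active_flag[id];
}

Which of the two ends up "10x faster" depends on how the rest of the program asks the question, which is exactly why the choice deserves thought before the surrounding code exists.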

So my advice is to optimize the data structures first, then write clear, commented code (having it "optimized" is OK as long as it doesn't hurt comprehension), and finally, if needed, make the few optimizations that could hurt the readability of the code.

Also, optimising non-bottlenecks is not a crime. In the worst case it's just a waste of your time, but it won't hurt the performance in any way. Don't get me wrong - optimising bottlenecks is better, but that doesn't make optimising the rest wrong. There are plenty of situations where measuring the bottleneck directly is simply impossible or terribly difficult, so you'll have to resort to cheap heuristics.

Conclusion: paradoxically, I think those dumb one-liners, and people citing them as if they were the Bible, are the reason there are so many bad programmers around. In reality, most of the approach depends on the context (embedded or not, hobby or work) and the programming language. Just don't trust one-liners without thinking.

I'd say: "Dumb one-liners are the root of all evil".


Last edited by Bregalad on Mon Jul 07, 2014 6:13 am, edited 1 time in total.

PostPosted: Mon Jul 07, 2014 6:10 am 

Joined: Sun Jun 30, 2013 10:26 pm
Posts: 1949
Location: Sacramento, CA, USA
GARTHWILSON wrote:
... A nice thing about Forth (in which everything is a "word," whether it's a program fragment, a constant, an array, a defining word, an operator, whatever) is that you can get a word going as a secondary, ie, in high-level, then later re-write it as a primitive, ie, defined in assembly language, for maximum performance, without having to change any of the code that calls it.


I am a habitual offender when it comes to premature optimization. I try to force myself to "get it running" first, but a defect in my character prevents me from doing so without countless optimization iterations getting in the way. For example, I'm still translating a couple hundred Camel Forth words from MSP430 to 65m32 (to help me optimize my instruction set), and I can't get past the fact that most (>75%) of the DOCOLON ... EXIT definitions can be rewritten as faster 65m32 primitives with no space penalties.

So now I have a 'nested' premature optimization loop structure that's driving me crazy. If I was trying to do this for a living, I would have been fired or bankrupted long ago ... fortunately, it's just a hobby, at least for now.

Mike


PostPosted: Mon Jul 07, 2014 8:34 am 

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10977
Location: England
There's a very nice comment on StackOverflow:
Quote:
It's kind of ironic that many people who yell "Premature optimization is the root of all evil" themselves have prematurely optimized [Knuth's] quote:
Quote:
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%


(Great discussion by the way! I'm still reading the HN piece and not yet got to Pike's piece.)

(Edit: and now I see
Quote:
I believe it was CA Hoare who said this. Even Knuth says so

and then I find that Hoare disclaimed it too...)


PostPosted: Mon Jul 07, 2014 9:37 am 

Joined: Sat Mar 27, 2010 7:50 pm
Posts: 149
Location: Chexbres, VD, Switzerland
And, no offense, but why are so many people talking about this Knuth guy like he was God or something? OK, he might be a doctor, a professor, etc., but that doesn't make him God. Taking quotes from him out of their context, dumbing them down, and then repeating them as if they were holy teachings makes no sense.

Also, while I don't disagree with his original statement (in its context), where does that 97% number come from? Did he have a record of a significant number of programming projects that went seriously wrong, and count in how many of them the problem was "premature optimisation"? Or did he just throw the number out based on his intuition? I bet it's the latter. There's nothing wrong with that by itself, but it changes a lot about what the statement actually means.

Quote:
I've had to re-write code that was done that way by someone else. I couldn't believe the waste of everything. There were no advantages other than possibly impressing someone that he could keep the keyboard quite warm.

I agree. I remember once seeing almost the same piece of code (a function) copy/pasted about 30 consecutive times with multiple variants, which was inherently stupid, since a single function with an extra argument could have done all of that. I couldn't believe someone would write code like that, and professionally at that.


PostPosted: Mon Jul 07, 2014 9:57 am 

Joined: Sat Jul 28, 2012 11:41 am
Posts: 442
Location: Wiesbaden, Germany
Optimization starts long before the first line of code is actually written. It is all about finding the best structure for a program and anticipating bottlenecks before they actually occur. And even then you must be prepared to discard all of your code and start over with a new structure if your anticipation turns out to be wrong.

The cycle counter in your head should start counting from the moment you have the idea for a new program. Writing code first and counting cycles later will limit your ability to optimize, because your mind is then preset to keep the structure you already started with.

The quote "We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil." is from a time where structured programming was a rare species. The evil in optimization was breaking the structure of a program without the real need to do so (It came from a technical paper called "Structured Programming with go to Statements"). However, it points to the bottlenecks (the magic 3%) to need attention from the very beginning.

So it is about knowing the bottlenecks of a program and paying attention to them from the start of program development. The rest of the code will probably never need optimization, not even at the end of program development.

_________________
6502 sources on GitHub: https://github.com/Klaus2m5


PostPosted: Mon Jul 07, 2014 12:30 pm 

Joined: Sun Apr 10, 2011 8:29 am
Posts: 597
Location: Norway/Japan
Aaendi wrote:
My way: Write routines as simply as possible. Results in 100 objects onscreen with no slowdown.
Their way: Write over-complicated routines and fix them later. Results in slowdown with 4 objects onscreen.

What I want to know is why programmers make it routine to wait until the very end to do any speed optimization, when they could have dodged the problem entirely by using simpler algorithms in the first place?


What I've seen in coding is that the over-complicated way is often the typical result of trying to optimise as you go, while writing code as simply as possible is what you get if you don't. Instead you simply code the algorithm in a straightforward way, as uncomplicated as possible. I code that way myself, and only after that do I look for where I can optimise. I find that it's both easier to optimise and easier to refactor and improve starting from the uncomplicated code.

A simple example that I ran into way back in time (the eighties): due to slow disks, some coder had tried to find clever ways to improve throughput. When I took over I threw it all out (very complicated stuff) and re-wrote it in a straightforward way: read/process/write. After that I could simply replace the 'read' call with a buffered, optimised library version I wrote for that purpose. Much better result, less bug-prone, and easier to maintain.
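
Roughly the shape of that read/process/write rewrite, as a C sketch with made-up names (not the original code):

Code:
#include <stdio.h>

/* made-up fixed-size record; the real code used whatever the device provided */
typedef struct { unsigned char data[128]; } record_t;

/* Plain one-record read.  This is the single call that was later replaced
   by a buffered version with the same signature. */
static int read_record(FILE *in, record_t *r)
{
    return fread(r, sizeof *r, 1, in) == 1;
}

static void process(record_t *r)
{
    for (size_t i = 0; i < sizeof r->data; i++)   /* stand-in for real work */
        r->data[i] ^= 0xFF;
}

static int write_record(FILE *out, const record_t *r)
{
    return fwrite(r, sizeof *r, 1, out) == 1;
}

/* The uncomplicated main loop: read, process, write.  It never changed
   when the buffered read was dropped in. */
void run(FILE *in, FILE *out)
{
    record_t r;
    while (read_record(in, &r)) {
        process(&r);
        write_record(out, &r);
    }
}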
Pre-"optimised" code is often difficult to fix, test, and improve. And besides, I prefer to profile or use some other method to find out where the real need is. It's often not where one would assume it to be.


PostPosted: Mon Jul 07, 2014 2:24 pm 

Joined: Fri Dec 11, 2009 3:50 pm
Posts: 3367
Location: Ontario, Canada
Bregalad wrote:
And the sentence "premature optimisation is the root of all evil" is simply stupid and was probably a dumb joke, but people took it seriously.
I thought it was clear the sentence is a mixture of humor and seriousness -- that a valid point has knowingly been exaggerated. Of course it would be more accurate to say "root of some problems," but to say "root of all evil" is more memorable and amusing -- at least most folks find it so. It's understood we're to take the statement as informative, but not at its literal face value.

As for "You can't tell where a program is going to spend its time," that'll be more true or less true depending on the platform. For example, cache misses and disk accesses can turn the picture upside down. Years ago I read an article that illustrated this dramatically. The algorithm that seems it'd be slowest may actually be fastest.

The performance bottlenecks for most projects on this forum usually can be predicted, and I suspect SNES is the same. But it's not an infallible process; I'm guessing that once in a while something will surprise you. So, IMO performance profiling still has its place, but its importance is less than with more complex platforms.

cheers,
Jeff

_________________
In 1988 my 65C02 got six new registers and 44 new full-speed instructions!
https://laughtonelectronics.com/Arcana/ ... mmary.html

