16-bit 6502 vs. ARM or MIPS?
Re: 16-bit 6502 vs. ARM or MIPS?
BigDumbDinosaur wrote:
New technology isn't always good technology, and over the years the electronics and computer industries (especially a certain software vendor in Redmond, Washington) have repeatedly demonstrated that adage.
Edit: Fixed attribution
Last edited by jmp(FFFA) on Thu Nov 26, 2015 4:55 am, edited 1 time in total.
- GARTHWILSON
- Forum Moderator
- Posts: 8773
- Joined: 30 Aug 2002
- Location: Southern California
- Contact:
Re: 16-bit 6502 vs. ARM or MIPS?
Stockholders invest for one reason: profits. If you don't take your company public, you have more freedom to do what you want, even if it's not quite the most profitable in terms of dollars. I personally got into electronics for fun, not money. When the fun is gone, so am I.
What entices gullible consumers and produces profits isn't necessarily what's smart. They'll go for what's flashy, or a fad, or think that new always means better. Bill Gates realized decades ago that he was in the business of frustration. A new version comes out with a few new features, and everyone's got to get it. Frustrations are built in, but you won't find them right away. After people do start finding the problems and complaining, the answer is, "Oh, you're still using that old version?? You just need to buy this new one. All those problems are taken care of, and it can do all this new cool stuff," and the cycle repeats, and he gets to hit you again in the wallet just seldom enough that you don't notice the plan. I quit using Windows years ago and 90% of my computer problems evaporated.
Windows 10, and now the updates from it that they're starting to apply to Windows 7 & 8, bring huge problems with snooping. Someone else here (I don't remember who) said you just have to know how to get around it; but I've read more articles since then telling about parts of it that you cannot turn off, and that even if you think you have all the snooping turned off, they're still recording some of your info and using it for marketing.
But people will keep buying, just because it's what's familiar, something that's both good and bad. Many also tend to think that "New!" means "You want this!" whereas others of us see it as "Don't be a guinea pig! Let someone else be the one to find the pitfalls. Wait 'til the bugs are worked out."
Regarding development systems, I know every company pushes their own, but someone with a lot of experience often has their own system that they've been refining for years, and they have become very efficient with it, and they don't want to be told they have to do things a different way.
Again, I like the philosophy put forth in the following articles:
- Software survivalism, by our own Samuel Falvo (kc5tja on the forum), professional programmer. (In spite of the name, it's about hardware too.) I would like to see this way of thinking become more popular and organized.
- Neo-Retro Computing, also by Samuel Falvo
- Low Fat Computing (A politically incorrect essay by Jeff Fox) He and Chuck Moore (inventor of Forth), taking an entirely different programming philosophy, plus Forth hardware and software, have improved the compactness and speed of code by factors of 100 to 1000. I am constantly challenged by this.
Yep, I'm stubborn. That can be both good and bad.
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?
Re: 16-bit 6502 vs. ARM or MIPS?
jmp(FFFA) wrote:
BTW, on modern CPUs, compiled languages like C generally run faster than hand-coded assembly language! Humans are just not that good at keeping track of register allocation, pipeline stalls, cache hits, and other factors that make modern CPUs so much faster than their predecessors.
Of course you can put out a product much faster (and even that is changing now, too -- we have come a long way within our small community), but pure assembly will always run significantly faster in an incredibly small code space. It's that simple. All those things you mentioned (register allocation, pipeline stalls, etc.) are part and parcel of what makes us tick, and yes we can and do keep track of it all (we have cycle contests even at 3 GHz), so while literally every other point you have made thus far has been arguably more or less dead-on, the last one is false and has been proven over and over.
Re: 16-bit 6502 vs. ARM or MIPS?
satpro wrote:
I've been sitting here really enjoying this conversation, so thanks for that. It's been a fun read. But if in any part of that last statement you're referring to the x86 platform, then I'll just come right out and say it: that's not even close to reality. Any half-decent asm programmer can write code that runs at least 2x-5x faster (and greater) than compiled C code -- I know I can. Even heavily "optimized" C is still not the same as pure assembly, nor is embedded assembly within C. The language design itself is the primary reason. We can write code that screams along at full speed, one instruction after another, two at a time on newer CPUs, deciding whether or not we even need a pointer or the stack, which instruction or addressing mode meets our need, and as an added bonus -- guaranteeing not to stall a pipeline or miss a cache due to working directly with the silicon and knowing exactly where we are in memory at all times. C just doesn't, again, because of the way it is designed. I like to think of x86 assembly as 6502 assembly on a ridiculously grand scale.
Of course you can put out a product much faster (and even that is changing now, too -- we have come a long way within our small community), but pure assembly will always run significantly faster in an incredibly small code space. It's that simple. All those things you mentioned (register allocation, pipeline stalls, etc.) are part and parcel of what makes us tick, and yes we can and do keep track of it all (we have cycle contests even at 3 GHz), so while literally every other point you have made thus far has been arguably more or less dead-on, the last one is false and has been proven over and over.
"Assembly is often used for performance-critical parts of a program, although it is difficult to outperform a good C++ compiler for most programmers."
https://software.intel.com/en-us/articl ... -assembly/
BTW, neither I nor, I suspect, Intel would claim that given unlimited time, a human could not outperform a good C compiler (which itself is more efficient than a C++ compiler). But given a reasonable amount of time (where reasonable might be defined as three to five times as much time as it would take a C programmer of equivalent skill to write equivalent code), the C compiler will generate code of equal or faster speed. Some architectures skew this even more in favor of the compiler (e.g. those with large numbers of registers and/or register windows) and some skew it more in favor of assembly (the 6502 is an extreme example, where even a novice assembly programmer can beat a good C compiler).
Compiler design is a fascinating subject. Just as microprocessors have evolved tremendously in the last few decades, so has the subject of compiler design. Modern architectures (x64, ARM) are designed hand-in-hand with compiler writers in order to produce an optimal blend of technology to maximize the efficiency of compiled languages, NOT for hand-coded assembly language.
By the way, much of human progress is driven by increasing levels of abstraction. Human working memory can only handle a limited number of ideas at once. This is the whole point behind the evolution in high level languages and why it is a good thing for most programmers to eschew assembly in favor of higher level languages at least on modern CPUs.
Re: 16-bit 6502 vs. ARM or MIPS?
jmp(FFFA) wrote:
(where reasonable might be defined as three to five times as much time as it would take a C programmer of equivalent skill to write equivalent code),
Quote:
By the way, much of human progress is driven by increasing levels of abstraction. Human working memory can only handle a limited number of ideas at once. This is the whole point behind the evolution in high level languages and why it is a good thing for most programmers to eschew assembly in favor of higher level languages at least on modern CPUs.
Edit, Oct 2022: I came across this 9½-minute video posted in July which seems to be an excerpt from a seminar, where the speaker is saying that all these high-level languages (HLLs) are starting to make programmers less productive now, and that the HLLs are failing to deliver the promised benefits. He says that as we go up the ladder of HLLs, "somewhere through this chain, it becomes wrong":
Programmers Aren't Productive Anymore - Jonathan Blow
https://www.youtube.com/watch?v=bZ6pA--F3D4
He started out by talking about the benefits of abstraction, and says we don't want to program in assembly language anymore because it doesn't have abstraction. But that's where I say he's wrong. As you'll see in programming examples on my site, my use of macros in assembly language gives tremendous abstraction. (Some have said it doesn't even look like assembly language. However, if you know the processor's assembly language and are mindful of what you've put in the macros, you'll know exactly what they're assembling—it's just that you no longer have to look at the ugly innards every time you do the code.) In fact, if you have the same macros for multiple processors and their assemblers, you can transfer the source code of one to a source-code file for another one, and there won't be that much modification needed to adapt it. Portability definitely won't approach 100% of course; but it will be far better than starting over, and partly addresses this criticism of assembly language. I use many of the same macros when I program PIC microcontrollers for work.
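To make that concrete, here's a minimal sketch of the kind of macro layer described above, in ca65-style 6502 syntax. The macro name and details are invented for illustration; this is not taken from Garth's actual macro set:

```asm
; Hypothetical example: a 16-bit increment hidden behind a macro, so
; the source reads as one abstract operation.  A PIC assembler could
; define a macro with the same name and argument using PIC
; instructions, letting the same source line assemble for either CPU.
.macro  INC16   addr
        inc     addr            ; bump the low byte
        bne     :+              ; if it didn't wrap to zero, we're done
        inc     addr+1          ; otherwise carry into the high byte
:
.endmacro

; Usage -- the caller never sees the three underlying instructions:
;       INC16   counter
```

The point isn't this particular macro; it's that a well-chosen macro vocabulary lets the source express intent while the assembler still emits exactly the instructions you defined.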
Re: 16-bit 6502 vs. ARM or MIPS?
jmp(FFFA) wrote:
If by the x86 platform you are referring to current generation (e.g. x64 architecture) platforms, then it seems that Intel pretty much agrees with me on this point:
"Assembly is often used for performance-critical parts of a program, although it is difficult to outperform a good C++ compiler for most programmers."
Anyway, I'm not up for the religious war.
Re: 16-bit 6502 vs. ARM or MIPS?
GARTHWILSON wrote:
Stockholders invest for one reason: profits. If you don't take your company public, you have more freedom to do what you want, even if it's not quite the most profitable in terms of dollars. I personally got into electronics for fun, not money. When the fun is gone, so am I.
GARTHWILSON wrote:
What entices gullible consumers and produces profits isn't necessarily what's smart. [...]
GARTHWILSON wrote:
Windows 10, and now the updates from it that they're starting to apply to Windows 7 & 8, bring huge problems with snooping. [...]
GARTHWILSON wrote:
Regarding development systems, I know every company pushes their own, but someone with a lot of experience often has their own system that they've been refining for years, and they have become very efficient with it, and they don't want to be told they have to do things a different way.
GARTHWILSON wrote:
Again, I like the philosophy put forth in the following articles:
In terms of computer languages, I don't see the world nearly so black and white as some others do. Forth and C are two different tools with some overlap. Forth does some things better on some platforms, and C does other things better. If you only have a hammer in your toolchest, then everything looks like a nail to you. But if your toolbox is full of mutually complementary tools, you can pick the one that best suits the problem at hand. I will say, however, that on the 6502 at least, I'd strongly prefer a good Forth implementation to any C compiler I've ever seen.
Re: 16-bit 6502 vs. ARM or MIPS?
satpro wrote:
Anyway, I'm not up for the religious war.
Re: 16-bit 6502 vs. ARM or MIPS?
Well, this devolved. My own opinions:
- Familiarity and deep experience are fine reasons to stick to an architecture - both for an individual and a commercial entity.
- Commercially, products are often evolutions/spins of existing products, and "don't reinvent the wheel" can apply. As long as a CPU can keep up with the workload, incremental diffs are likely prudent over taking on a redesign for the sake of the new shiny CPU of the week.
- I don't doubt that millions or even billions of 6502 cores have shipped. It has been around a long time.
- I don't doubt that it's still chosen for new iterations of product.
On 32-bit architectures:
- You don't lose control or understanding with a 32-bit architecture. When I programmed ARM in assembly language I was fully in control of every aspect of execution at all times. In my commercial MIPS work, the TLB-based MMU was an extremely satisfying thing to learn and master.
- 16 million addresses is fun. But you know what's really fun? 4 billion addresses.

- 'C' compilers can outperform programmers. They cannot outperform expert programmers.
- 'C' is the lingua franca of embedded systems. It's a fine language for its intended task (embedded and low level operating systems) and I think it's intellectually dishonest to claim 'C' is a problem just because it's not your cup of tea.
Every CPU is interesting in its own way. Every instruction set has its interesting quirks (who doesn't love the PowerPC's eieio instruction?). And high level languages have their niche. Yes, even 'C'. Even Forth. Even Java (I just threw up a little).
I've been a lurker for a long time here, and I love that you guys continue to hack 6502. It's absurd to say the 6502 is modern in any sense, but that doesn't stop it from being special (either historically or emotionally), and it doesn't stop it being useful or being used.
Re: 16-bit 6502 vs. ARM or MIPS?
Quote:
'C' is the lingua franca of embedded systems.
Over the years, I've also written tens of thousands of lines of embedded C code that I can fairly easily reuse from one project to the next, even switching from one ARM to another ARM or even switching from ARM to MIPS. Even all of my interrupt handlers are written in C. The only assembly language in my recent projects is for a little task switcher where I need access to the stack pointer, and some other registers/instructions that are more easily done in assembly. But that's only 100 lines of code.
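That split (the policy in portable C, the stack-pointer manipulation isolated behind one routine that would be the assembly part on a real target) might be sketched like this. All names here are illustrative guesses, not the poster's actual code, and the assembly routine is replaced by a host-side stand-in so the C portion can be shown on its own:

```c
#include <stddef.h>

/* Hypothetical task control block for a tiny cooperative switcher. */
typedef struct task {
    void        *saved_sp;   /* stack pointer, saved/loaded by the asm stub */
    struct task *next;       /* circular run queue */
} task_t;

task_t *current;             /* task now running */

/* On a real target this is the assembly routine: it saves the outgoing
 * task's stack pointer, loads the incoming one's, and returns on the
 * new stack.  Here it is a host-side stand-in so the sketch links. */
void arch_context_switch(void **save_sp, void *load_sp) {
    (void)load_sp;
    *save_sp = NULL;         /* a real stub would store the live SP */
}

/* The scheduling policy itself is plain, portable C: round robin. */
task_t *schedule_next(void) {
    current = current->next;
    return current;
}

/* Tasks call yield() to give up the CPU; it pairs the portable C
 * policy with the (stubbed) context switch. */
void yield(void) {
    task_t *prev = current;
    task_t *next = schedule_next();
    if (next != prev)
        arch_context_switch(&prev->saved_sp, next->saved_sp);
}
```

Everything except `arch_context_switch` compiles unchanged for any target, which is the portability the post describes.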
- BigDumbDinosaur
- Posts: 9425
- Joined: 28 May 2009
- Location: Midwestern USA (JB Pritzker’s dystopia)
- Contact:
Re: 16-bit 6502 vs. ARM or MIPS?
jmp(FFFA) wrote:
BigDumbDinosaur wrote:
Tor wrote:
New technology isn't always good technology, and over the years the electronics and computer industries (especially a certain software vendor in Redmond, Washington) have repeatedly demonstrated that adage.
x86? We ain't got no x86. We don't NEED no stinking x86!
Re: 16-bit 6502 vs. ARM or MIPS?
@jmp(FFFA):
I think [..]
Er, quoting error there. I didn't write that. Could you update your post please?
-Tor
Last edited by Tor on Tue Nov 24, 2015 3:23 pm, edited 1 time in total.
Re: 16-bit 6502 vs. ARM or MIPS?
BigDumbDinosaur wrote:
My evaluation is based upon the technical merits of the product, not the financial results of the vendor. Would you rather fly in an airliner that was designed to maximize the builder's profits or maximize the safety and reliability of the product?
If Microsoft made an airliner, I wouldn't hesitate to fly in it. Safety and reliability are good business practices.
Any business that isn't trying to maximize its profits (within legal boundaries, of course) is by definition defrauding its stockholders.
[Edit: fixed typo]
Last edited by jmp(FFFA) on Tue Nov 24, 2015 2:19 pm, edited 1 time in total.
Re: 16-bit 6502 vs. ARM or MIPS?
sark02 wrote:
My own opinions:
- Familiarity and deep experience are fine reasons to stick to an architecture - both for an individual and a commercial entity.
- Commercially, products are often evolutions/spins of existing products, and "don't reinvent the wheel" can apply. As long as a CPU can keep up with the workload, incremental diffs are likely prudent over taking on a redesign for the sake of the new shiny CPU of the week.
In other words, keep your toolbox well stocked with tools, and try to use the best tool for each job you do, not your favorite tool.
Re: 16-bit 6502 vs. ARM or MIPS?
There may be a missing element here, which is age. An engineer in their 20s should be outward-looking and flexible - it's far too young to get stuck in a rut. But an engineer in their 50s or 60s might be in a position to say "this is what I know, this is what I do, and these are the people and projects that can vouch for me" and choose not to upgrade their skillset. (Or, they might feel they can't take that bet, or they don't want to!)