PostPosted: Mon Nov 23, 2015 11:38 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
BigDumbDinosaur wrote:
New technology isn't always good technology, and over the years the electronics and computer industries (especially a certain software vendor in Redmond, Washington) have repeatedly demonstrated that adage.

I think it's telling that you use one of the most successful businesses in the world as an example of a company that produces new technology that isn't good technology. At least by the yardstick most of the rest of the world uses (e.g. stock valuation, market capitalization), they produce a great deal of good technology.

Edit: Fixed attribution


Last edited by jmp(FFFA) on Thu Nov 26, 2015 4:55 am, edited 1 time in total.

PostPosted: Tue Nov 24, 2015 1:17 am 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8539
Location: Southern California
Stockholders invest for one reason: profits. If you don't take your company public, you have more freedom to do what you want, even if it's not quite the most profitable in terms of dollars. I personally got into electronics for fun, not money. When the fun is gone, so am I.

What entices gullible consumers and produces profits isn't necessarily what's smart. They'll go for what's flashy, or a fad, or think that new always means better. Bill Gates realized decades ago that he was in the business of frustration. A new version comes out with a few new features, and everyone's got to get it. Frustrations are built in, but you won't find them right away. After people do start finding the problems and complaining, the answer is, "Oh, you're still using that old version?? You just need to buy this new one. All those problems are taken care of, and it can do all this new cool stuff," and the cycle repeats, and he gets to hit you again in the wallet just seldom enough that you don't notice the plan. I quit using Windows years ago and 90% of my computer problems evaporated.

Windows 10, and now the updates from it that they're starting to apply to Windows 7 & 8, bring huge problems with snooping. Someone else here (I don't remember who) said you just have to know how to get around it; but I've read more articles since then telling about parts of it that you cannot turn off, and that even if you think you have all the snooping turned off, they're still recording some of your info and using it for marketing.

But people will keep buying, just because it's what's familiar, something that's both good and bad. Many also tend to think that "New!" means "You want this!" whereas others of us see it as "Don't be a guinea pig! Let someone else be the one to find the pitfalls. Wait 'til the bugs are worked out."

Regarding development systems, I know every company pushes its own; but someone with a lot of experience often has a system they've been refining for years, they've become very efficient with it, and they don't want to be told they have to do things a different way.

Again, I like the philosophy put forth in the following articles:
  • Software survivalism, by our own Samuel Falvo (kc5tja on the forum), professional programmer. (In spite of the name, it's about hardware too.) I would like to see this way of thinking become more popular and organized.
  • Neo-Retro Computing, also by Samuel Falvo
  • Low Fat Computing (a politically incorrect essay by Jeff Fox). He and Chuck Moore (the inventor of Forth), taking an entirely different programming philosophy and applying it in both Forth hardware and software, have improved the compactness and speed of code by factors of 100 to 1000. I am constantly challenged by this.

Yep, I'm stubborn. That can be both good and bad.

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Tue Nov 24, 2015 2:01 am 

Joined: Thu Nov 27, 2014 7:07 pm
Posts: 47
Location: Ocala, Fl, USA
jmp(FFFA) wrote:
BTW, on modern CPUs, compiled languages like C generally run faster than hand-coded assembly language! Humans are just not that good at keeping track of register allocation, pipeline stalls, cache hits, and other factors that make modern CPUs so much faster than their predecessors.

I've been sitting here really enjoying this conversation, so thanks for that. It's been a fun read. But if in any part of that last statement you're referring to the x86 platform, then I'll just come right out and say it: that's not even close to reality. Any half-decent asm programmer can write code that runs at least 2x-5x faster (and greater) than compiled C code -- I know I can. Even heavily "optimized" C is still not the same as pure assembly, nor is assembly embedded within C. The design of the language itself is the primary reason. We can write code that screams along at full speed, one instruction after another, two at a time on newer CPUs, deciding whether we even need a pointer or the stack, and which instruction or addressing mode meets our need -- and, as an added bonus, guaranteeing never to stall a pipeline or miss a cache, because we work directly with the silicon and know exactly where we are in memory at all times. C just doesn't, again, because of the way it is designed. I like to think of x86 assembly as 6502 assembly on a ridiculously grand scale, and I sure haven't heard anyone suggest we should write 6502 programs in C.

Of course you can put out a product much faster in C (and even that is changing now -- we have come a long way within our small community), but pure assembly will always run significantly faster in an incredibly small code space. It's that simple. All those things you mentioned (register allocation, pipeline stalls, etc.) are part and parcel of what makes us tick, and yes, we can and do keep track of it all (we have cycle contests even at 3 GHz). So while literally every other point you have made thus far has been arguably more or less dead-on, the last one is false, and that has been proven over and over.
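
If you'd rather measure than argue, a minimal harness for this kind of claim is easy to write. This is only a sketch: sum_asm is a made-up name for a hand-coded routine you'd supply yourself in a separate .s file and link in; the C version is the sort of loop the compiler gets to optimize.

Code:
/* Compare a compiled C routine against a hand-written assembly
   version of the same job.  sum_asm() is hypothetical -- write
   your own .s file and link it in. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>
#include <time.h>

#define N (1u << 20)

extern uint64_t sum_asm(const uint32_t *p, size_t n); /* hand-coded, linked separately */

static uint64_t sum_c(const uint32_t *p, size_t n)
{
    uint64_t s = 0;
    for (size_t i = 0; i < n; i++)
        s += p[i];
    return s;
}

static double seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void)
{
    static uint32_t buf[N];
    for (size_t i = 0; i < N; i++)
        buf[i] = (uint32_t)i;

    double t0 = seconds();
    uint64_t a = sum_c(buf, N);
    double t1 = seconds();
    uint64_t b = sum_asm(buf, N);
    double t2 = seconds();

    printf("C:   %llu in %.3f ms\n", (unsigned long long)a, (t1 - t0) * 1e3);
    printf("asm: %llu in %.3f ms\n", (unsigned long long)b, (t2 - t1) * 1e3);
    return 0;
}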


PostPosted: Tue Nov 24, 2015 3:27 am 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
satpro wrote:
I've been sitting here really enjoying this conversation, so thanks for that. It's been a fun read. But if in any part of that last statement you're referring to the x86 platform, then I'll just come right out and say it: that's not even close to reality. Any half-decent asm programmer can write code that runs at least 2x-5x faster (and greater) than compiled C code -- I know I can. Even heavily "optimized" C is still not the same as pure assembly, nor is assembly embedded within C. The design of the language itself is the primary reason. We can write code that screams along at full speed, one instruction after another, two at a time on newer CPUs, deciding whether we even need a pointer or the stack, and which instruction or addressing mode meets our need -- and, as an added bonus, guaranteeing never to stall a pipeline or miss a cache, because we work directly with the silicon and know exactly where we are in memory at all times. C just doesn't, again, because of the way it is designed. I like to think of x86 assembly as 6502 assembly on a ridiculously grand scale.

Of course you can put out a product much faster in C (and even that is changing now -- we have come a long way within our small community), but pure assembly will always run significantly faster in an incredibly small code space. It's that simple. All those things you mentioned (register allocation, pipeline stalls, etc.) are part and parcel of what makes us tick, and yes, we can and do keep track of it all (we have cycle contests even at 3 GHz). So while literally every other point you have made thus far has been arguably more or less dead-on, the last one is false, and that has been proven over and over.

If by the x86 platform you are referring to current-generation (e.g. x64 architecture) platforms, then it seems that Intel pretty much agrees with me on this point:

"Assembly is often used for performance-critical parts of a program, although it is difficult to outperform a good C++ compiler for most programmers."

https://software.intel.com/en-us/articles/introduction-to-x64-assembly/

BTW, neither I nor, I suspect, Intel would claim that, given unlimited time, a human could not outperform a good C compiler (which itself is more efficient than a C++ compiler). But given a reasonable amount of time (where reasonable might be defined as three to five times as much time as it would take a C programmer of equivalent skill to write equivalent code), the C compiler will generate code of equal or faster speed. Some architectures skew this even more in favor of the compiler (e.g. those with large numbers of registers and/or register windows) and some skew it more in favor of assembly (the 6502 is an extreme example, where even a novice assembly programmer can beat a good C compiler).
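
To make the 6502 point concrete, consider a trivial routine in both forms. This is just a sketch -- the hand-written version is in the comment, and actual compiled output will vary by compiler and settings:

Code:
/* Why the 6502 favors the human: a C compiler must synthesize
   16-bit pointer arithmetic on an 8-bit CPU, while a programmer
   can lean on the indexed addressing modes directly. */
#include <stdint.h>

void clear_page(uint8_t *page)          /* clear 256 bytes */
{
    for (unsigned i = 0; i < 256; i++)
        page[i] = 0;
}

/* A hand-written 6502 version of the same thing, clearing the
   page at $0200 -- six instructions, no pointer arithmetic:

        LDA #0
        TAX             ; X = 0
   loop:
        STA $0200,X     ; absolute,X addressing does the indexing
        INX
        BNE loop        ; X wraps $FF -> $00 after 256 stores
        RTS

   Typical compiled output for the C loop is several times longer,
   because the compiler works through a generic zero-page pointer. */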

Compiler design is a fascinating subject. Just as microprocessors have evolved tremendously in the last few decades, so has the subject of compiler design. Modern architectures (x64, ARM) are designed hand-in-hand with compiler writers in order to produce an optimal blend of technology to maximize the efficiency of compiled languages, NOT for hand-coded assembly language.

By the way, much of human progress is driven by increasing levels of abstraction. Human working memory can only handle a limited number of ideas at once. This is the whole point behind the evolution in high level languages and why it is a good thing for most programmers to eschew assembly in favor of higher level languages at least on modern CPUs.


PostPosted: Tue Nov 24, 2015 3:47 am 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8539
Location: Southern California
jmp(FFFA) wrote:
(where reasonable might be defined as three to five times as much time as it would take a C programmer of equivalent skill to write equivalent code),

I don't have experience writing GUIs, or any significant amount of OOP (although when I read about OOP, I think, "Oh yes, I've done that in Forth, without knowing it was OOP"). However, the "equivalent code" part may be where the disagreement lies. The embedded applications I've written so many of are heavy on real-time bit and byte input and output, and an HLL would not have made the programming process any easier, even ignoring performance. OTOH, some applications definitely call for an HLL, or at least doing assembly in a very HLL-like way, with a lot of abstraction, which is where this comes in:
Quote:
By the way, much of human progress is driven by increasing levels of abstraction. Human working memory can only handle a limited number of ideas at once. This is the whole point behind the evolution in high level languages and why it is a good thing for most programmers to eschew assembly in favor of higher level languages at least on modern CPUs.
and the more experience I get, the more I find ways to make assembly that way, especially through macros, gaining the benefits of abstraction, and of keeping control of the project, maintainability, fewer bugs, etc., while avoiding memory and performance penalties.

Edit, Oct 2022: I came across this 9½-minute video posted in July which seems to be an excerpt from a seminar, where the speaker is saying that all these high-level languages (HLLs) are starting to make programmers less productive now, and that the HLLs are failing to deliver the promised benefits. He says that as we go up the ladder of HLLs, "somewhere through this chain, it becomes wrong":
Programmers Aren't Productive Anymore - Jonathan Blow
https://www.youtube.com/watch?v=bZ6pA--F3D4
He started out by talking about the benefits of abstraction, and says we don't want to program in assembly language anymore because it doesn't have abstraction. But that's where I say he's wrong. As you'll see in programming examples on my site, my use of macros in assembly language gives tremendous abstraction. (Some have said it doesn't even look like assembly language. However, if you know the processor's assembly language and are mindful of what you've put in the macros, you'll know exactly what they're assembling—it's just that you no longer have to look at the ugly innards every time you do the code.) In fact, if you have the same macros for multiple processors and their assemblers, you can transfer the source code of one to a source-code file for another one, and there won't be that much modification needed to adapt it. Portability definitely won't approach 100% of course; but it will be far better than starting over, and partly addresses this criticism of assembly language. I use many of the same macros when I program PIC microcontrollers for work.
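
To give a flavor of it, here is a sketch of such structure macros in ca65-style syntax. It is illustrative only -- not my actual macros, which keep an assembler-variable stack so the structures can nest -- and keycode and handle_cr are made-up names:

Code:
; Sketch of HLL-like structure macros for a 6502 assembler (ca65
; syntax).  This one-level version uses a cheap local label, so it
; cannot nest; real implementations keep a stack of fix-up
; addresses.

        .macro  IF_EQ           ; "if equal" -- fall into the block
        bne     @endif          ; otherwise branch around it
        .endmacro

        .macro  END_IF
@endif:                         ; target for the BNE above
        .endmacro

; Usage -- reads like structured code, yet assembles to a single
; BNE of overhead:

        lda     keycode
        cmp     #$0D            ; carriage return?
        IF_EQ
            jsr handle_cr
        END_IF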

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Tue Nov 24, 2015 4:02 am 

Joined: Thu Nov 27, 2014 7:07 pm
Posts: 47
Location: Ocala, Fl, USA
jmp(FFFA) wrote:
If by the x86 platform you are referring to current-generation (e.g. x64 architecture) platforms, then it seems that Intel pretty much agrees with me on this point:

"Assembly is often used for performance-critical parts of a program, although it is difficult to outperform a good C++ compiler for most programmers."

That's not how I read it, but I'm not on the outside looking in, nor am I trying to convince you to use my compiler. I think they pretty much agree with me.

Anyway, I'm not up for the religious war.


PostPosted: Tue Nov 24, 2015 4:07 am 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
GARTHWILSON wrote:
Stockholders invest for one reason: profits. If you don't take your company public, you have more freedom to do what you want, even if it's not quite the most profitable in terms of dollars. I personally got into electronics for fun, not money. When the fun is gone, so am I.

I agree with you. And that's one reason why electronics is my hobby, not my profession.

GARTHWILSON wrote:
What entices gullible consumers and produces profits isn't necessarily what's smart. They'll go for what's flashy, or a fad, or think that new always means better. Bill Gates realized decades ago that he was in the business of frustration. A new version comes out with a few new features, and everyone's got to get it. Frustrations are built in, but you won't find them right away. After people do start finding the problems and complaining, the answer is, "Oh, you're still using that old version?? You just need to buy this new one. All those problems are taken care of, and it can do all this new cool stuff," and the cycle repeats, and he gets to hit you again in the wallet just seldom enough that you don't notice the plan. I quit using Windows years ago and 90% of my computer problems evaporated.

Your experience is perfectly valid. But the bottom line is that businesses operate vastly more efficiently than they have in the past because of the deeply flawed efforts of the company you disdain. And while the market has served up plenty of other deeply flawed alternatives, none of them provided enough of an advantage over Microsoft's offerings to significantly alter the market -- at least not until Google and Apple came along. There is no PerfectOS, no matter how much the Apple and Linux fanboys claim otherwise.

GARTHWILSON wrote:
Windows 10, and now the updates from it that they're starting to apply to Windows 7 & 8, bring huge problems with snooping. Someone else here (I don't remember who) said you just have to know how to get around it; but I've read more articles since then telling about parts of it that you cannot turn off, and that even if you think you have all the snooping turned off, they're still recording some of your info and using it for marketing.

There is a lot of FUD out there about snooping. Bottom line is that Apple, Google, Microsoft, and others all do it for advertising purposes. You can opt out and give up smartphones, web browsing, smart TVs and many of the other advantages modern technology brings. But you may not have to give up as much privacy as you think in order to enjoy most of the value that technology brings. Have a look at Bruce Schneier's web site (https://www.schneier.com/) for some ideas. It keeps me in touch with the most significant threats to my privacy and lets me know what I can do to limit them.

GARTHWILSON wrote:
Regarding development systems, I know every company pushes their own, but someone with a lot of experience often has their own system that they've been refining for years, and they have become very efficient with it, and they don't want to be told they have to do things a different way.

But is it realistic to expect your efforts to compete with the efforts of entire communities working towards the same goals? Eclipse and NetBeans are de facto standards for IDEs these days, and they have come a long way from the early days. They are also easily customized for your own specialized needs if you want.

GARTHWILSON wrote:
Again, I like the philosophy put forth in the following articles:

Not surprisingly, I also like the philosophy in those articles (though admittedly I just skimmed them -- I did bookmark them for more thorough reading later). I do see a day in the future when governments will have mandatory backdoors in all off-the-shelf computing platforms available to the general public. To the extent that I'd like to maintain a modest local archive of personal data not accessible to whoever is in power at any given moment, I am confident in my ability to create a suitable system from available parts if that becomes prudent. I don't see it as an imminent threat, however.

In terms of computer languages, I don't see the world nearly so black and white as some others do. Forth and C are two different tools with some overlap. Forth does some things better on some platforms, and C does other things better. If you only have a hammer in your toolchest, then everything looks like a nail to you. But if your toolbox is full of mutually complementary tools, you can pick the one that best suits the problem at hand. I will say, however, that on the 6502 at least, I'd strongly prefer a good Forth implementation to any C compiler I've ever seen.


PostPosted: Tue Nov 24, 2015 4:17 am 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
satpro wrote:
Anyway, I'm not up for the religious war.

It's fine. I don't have time for discussion with people who can't be polite.


PostPosted: Tue Nov 24, 2015 4:53 am 

Joined: Tue Nov 10, 2015 5:46 am
Posts: 230
Location: Kent, UK
Well, this devolved. My own opinions:
  • Familiarity and deep experience are a fine reason to stick with an architecture - both for an individual and a commercial entity.
  • Commercially, products are often evolutions/spins of existing products, and "don't reinvent the wheel" can apply. As long as a CPU can keep up with the workload, incremental diffs are likely prudent over taking on a redesign for the sake of the new shiny CPU of the week.
  • I don't doubt that millions or even billions of 6502 cores have shipped. It has been around a long time.
  • I don't doubt that it's still chosen for new iterations of product.

So, I think I got what I was looking for - that the '816 is interesting because it builds upon the 6502, providing more features whilst allowing you to carry over your accumulated knowledge and experience. It also has its own tricks to be discovered... which adds to the fun. I guess (and don't take this the wrong way), it's like moving from an 8086 to 80286, or from a 68000 to a 68020... The same, but more.

On 32-bit architectures:
  • You don't lose control or understanding with a 32-bit architecture. When I programmed ARM in assembly language I was fully in control of every aspect of execution at all times. With my commercial MIPS work, its TLB-based MMU was an extremely satisfying thing to learn and master.
  • 16 million addresses is fun. But you know what's really fun? 4 billion addresses. :-)

On 'C':
  • 'C' compilers can outperform programmers. They cannot outperform expert programmers.
  • 'C' is the lingua franca of embedded systems. It's a fine language for its intended task (embedded and low level operating systems) and I think it's intellectually dishonest to claim 'C' is a problem just because it's not your cup of tea.

A few years ago I needed a low-cost micro to perform something fairly trivial. I chose an 8-pin PIC12, if I recall, which I programmed in assembly. I didn't choose the PIC12 for its instruction set; I chose it because it was tiny and dirt cheap. It took a few hours to pick up the tools and the ISA, and the function was implemented in a day. It was the right call.

Every CPU is interesting in its own way. Every instruction set has its interesting quirks (who doesn't love the PowerPC's eieio instruction?). And high level languages have their niche. Yes, even 'C'. Even Forth. Even Java (I just threw up a little).

I've been a lurker for a long time here, and I love that you guys continue to hack 6502. It's absurd to say the 6502 is modern in any sense, but that doesn't stop it from being special (either historically or emotionally), and it doesn't stop it being useful or being used.


PostPosted: Tue Nov 24, 2015 6:05 am 

Joined: Tue Nov 16, 2010 8:00 am
Posts: 2353
Location: Gouda, The Netherlands
Quote:
'C' is the lingua franca of embedded systems.

Indeed. And not only is it easier to write in C than in assembly, there's also a huge existing codebase in C, including newly written libraries from microcontroller vendors for quickly using the peripherals in their chips. Things like 3D motion tracking, audio processing, and motor drivers are all available.

Over the years, I've also written tens of thousands of lines of embedded C code that I can fairly easily reuse from one project to the next, even switching from one ARM to another, or from ARM to MIPS. Even all of my interrupt handlers are written in C. The only assembly language in my recent projects is for a little task switcher, where I need access to the stack pointer and some other registers/instructions that are more easily handled in assembly. But that's only 100 lines of code.
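
As a sketch of what that split looks like on a Cortex-M class part (illustrative names only -- current_sp and next_sp stand in for whatever the scheduler maintains, and the scheduler itself isn't shown):

Code:
/* Ordinary interrupt handlers can be plain C on a Cortex-M,
   because the hardware stacks r0-r3, r12, lr, pc and xPSR
   by itself on exception entry. */
#include <stdint.h>

volatile uint32_t ticks;

void SysTick_Handler(void)      /* vectored directly, no asm wrapper */
{
    ticks++;
}

/* The task switcher is the one piece that has to be assembly,
   since the stack pointer isn't reachable from C.  current_sp
   and next_sp are hypothetical scheduler variables. */
uint32_t *current_sp, *next_sp;

__attribute__((naked)) void PendSV_Handler(void)
{
    __asm volatile (
        "mrs    r0, psp           \n" /* outgoing task's stack ptr  */
        "stmdb  r0!, {r4-r11}     \n" /* save what hardware didn't  */
        "ldr    r1, =current_sp   \n"
        "str    r0, [r1]          \n" /* record old SP              */
        "ldr    r1, =next_sp      \n"
        "ldr    r0, [r1]          \n" /* incoming task's saved SP   */
        "ldmia  r0!, {r4-r11}     \n" /* restore its registers      */
        "msr    psp, r0           \n"
        "bx     lr                \n" /* exception return           */
    );
}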


PostPosted: Tue Nov 24, 2015 7:13 am 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8482
Location: Midwestern USA
jmp(FFFA) wrote:
BigDumbDinosaur wrote:
Tor wrote:
New technology isn't always good technology, and over the years the electronics and computer industries (especially a certain software vendor in Redmond, Washington) have repeatedly demonstrated that adage.

I think it's telling that you use one of the most successful businesses in the world as an example of a company that produces new technology that isn't good technology. At least by the yardstick most of the rest of the world uses (e.g. stock valuation, market capitalization), they produce a great deal of good technology.

My evaluation is based upon the technical merits of the product, not the financial results of the vendor. Would you rather fly in an airliner that was designed to maximize the builder's profits or maximize the safety and reliability of the product?

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Tue Nov 24, 2015 9:06 am 

Joined: Sun Apr 10, 2011 8:29 am
Posts: 597
Location: Norway/Japan
@jmp(FFFA):

jmp(FFFA) wrote:
BigDumbDinosaur wrote:
Tor wrote:
New technology isn't[...]

I think [..]

Er, quoting error there. I didn't write that. Could you update your post please?

-Tor


Last edited by Tor on Tue Nov 24, 2015 3:23 pm, edited 1 time in total.

PostPosted: Tue Nov 24, 2015 2:01 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
BigDumbDinosaur wrote:
My evaluation is based upon the technical merits of the product, not the financial results of the vendor. Would you rather fly in an airliner that was designed to maximize the builder's profits or maximize the safety and reliability of the product?

Your evaluation is based on your own biases. It is impossible for people to agree on what is "good" and what is "bad"; the closest we can come is a consensus. The market's consensus is that Microsoft is good. Even among unbiased users (i.e. those who don't work for Microsoft or one of its competitors), you will find that most people aren't yearning to switch to another operating system. In fact, if you look under the hood at what makes it tick, you will find quite a sophisticated operating system that is more technologically advanced in many ways than competitors like MacOS or Linux. But I didn't come here to be an apologist for Microsoft -- they don't need one anyway.

If Microsoft made an airliner, I wouldn't hesitate to fly in it. Safety and reliability are good business practices.

Any business that isn't trying to maximize its profits (within legal boundaries, of course) is by definition defrauding its stockholders.

[Edit: fixed typo]


Last edited by jmp(FFFA) on Tue Nov 24, 2015 2:19 pm, edited 1 time in total.

PostPosted: Tue Nov 24, 2015 2:17 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
sark02 wrote:
My own opinions:
  • Familiarity and deep experience is a fine reason to stick to an architecture - both for an individual and a commercial entity.
  • Commercially, products are often evolutions/spins of existing products, and "don't reinvent the wheel" can apply. As long as a CPU can keep up with the workload, incremental diffs are likely prudent over taking on a redesign for the sake of the new shiny CPU of the week.

If you were talking about familiarity and deep experience with some slightly out-of-date technology like Freescale ColdFire, then I might agree. But for 30-40 year old technology like the 6502/65816, I don't agree. While it may be capable of performing adequately in a few modern applications, there are much better choices not just from a performance perspective, but also from a cost and efficiency standpoint -- at least for commercial applications. IMHO, if you are a developer then it is your job to keep up with technological developments in your field, even if it's uncomfortable.

In other words, keep your toolbox well stocked with tools, and try to use the best tool for each job you do, not your favorite tool.


PostPosted: Tue Nov 24, 2015 5:41 pm 

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10977
Location: England
There may be a missing element here, which is age. An engineer in their 20s should be outward looking and flexible - it's far too young to get stuck in a rut. But an engineer in their 50s or 60s might be in a position to say "this is what I know, this is what I do, and these are the people and projects that can vouch for me" and choose not to upgrade their skillset. (Or, they might feel they can't take that bet, or they don't want to!)

