6502.org Forum

All times are UTC
PostPosted: Mon Nov 23, 2015 6:39 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
Arlet wrote:
An ARM core requires a fairly modern process, though, so I can still see people using 6502 cores on an old process, using second-hand fab equipment, especially if they also have some old software and/or programmers.

A 110 nm process is cheap enough that a number of graduate programs at universities here in the US have students in VLSI classes producing designs on a single wafer or even a portion of a wafer. That's what, 100+ times denser than the original 6502. That's plenty of room for a high-performance (ARM or MIPS) core, a small cache, and even some simple peripherals.

BigEd wrote:
I don't know why anyone would choose WDC over ARM - cost, development tools, reference designs, and support might be among the reasons. Even if a 32-bit core seems excessive for some task, that's no matter if the price is right and the product gets to market on time.

Surely you aren't suggesting that WDC has better development tools, reference designs, or support than what is available for ARM? Also, there are at least four free ARM-compatible cores available on opencores.org, plus as Arlet points out, it can't cost more than a few cents to license the real thing in quantity anyway.

I can think of one reason to use the 6502 (8-bit, 16-bit, or otherwise) over an ARM or MIPS in a modern non-commercial product: To quote Cliff Biffle -- because it's hard!

(Cliff has done some impressive work making a STM32F4 microcontroller produce nice graphics on an 800x600 VGA display here: http://cliffle.com/project/, demo here: https://www.youtube.com/watch?v=7yXxhvKmVb0. When asked why he used this part rather than something well suited to the task like an FPGA or microcontroller with a built-in VGA interface, his response was, "because it's hard!" Right on, Cliff!)


PostPosted: Mon Nov 23, 2015 8:30 pm 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8173
Location: Midwestern USA
jmp(FFFA) wrote:
Please pardon me if I don't take WDC's word for how many cores they have licensed, etc. unless and until the numbers are confirmed by outside sources. They wouldn't be the first company to misrepresent their sales.

You aren't the first one to wonder about that—I did as well some time ago. WDC is a closely-held corporation, which means they are not subject to SEC public filing rules. So using casual methods to find out their sales, profitability, etc., isn't really possible.

However, as my company is registered with Dun & Bradstreet (as is WDC), I periodically communicate with a rep there, usually to investigate the creditworthiness of a prospective client. One day when I happened to be on the phone with my D&B rep, I asked her if she could give me any info about WDC, without me starting a formal request (which costs money to process). She said that their reported sales were around the 10 million USD mark, but couldn't tell me anything about net profit, D-to-L ratio, or anything like that. The sales number, when considered against the claimed annual use of 65xx cores and what the likely royalties would be, makes sense. So I'm reasonably confident that the claimed annual usage is not overly inflated.
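For what it's worth, the sanity check is just one multiplication. Here is a minimal sketch in C; note that both the annual core count and the per-core royalty below are assumptions of mine for illustration, not figures disclosed by WDC:

```c
#include <assert.h>

/* Back-of-envelope check: royalty revenue = cores shipped per year
 * multiplied by the royalty per core. Working in whole cents keeps the
 * arithmetic exact. Both inputs are assumed values for illustration;
 * neither number comes from WDC. */
static long long royalty_revenue_cents(long long cores_per_year,
                                       int royalty_cents_per_core)
{
    return cores_per_year * royalty_cents_per_core;
}
```

At an assumed 200 million cores a year and an assumed 3 cents per core, that works out to roughly $6 million annually, which is at least consistent with total reported sales around the $10 million mark.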

Quote:
Finally, even if it were true that they have in fact licensed hundreds of millions of cores, I still stand by my comment that any modern application of their 65c02 technology outside of retro or educational use is suboptimal in comparison to current generation technology in most meaningful ways (cost, performance, efficiency, etc.).

I guess microwave ovens, refrigerators, water softeners, pacemakers and implanted defibrillators are using "suboptimal" technology. Don't those design engineers know any better? :lol:

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Mon Nov 23, 2015 8:37 pm 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8432
Location: Southern California
WDC is a fabless company. When they (or more likely, their clients) do produce hardware, they go to the fab houses with their designs. It's like asking, "What kind of PCB could I make as an individual with no real PCB equipment?" Of course the answer is none, but I can send my designs to a modern board house and get excellent-quality boards for a price much lower than a business had to pay 20 years ago.

This is from a topic on the advantages of RPN about two years ago on the HP museum forum, at http://www.hpmuseum.org/forum/thread-1374-page-2.html:
Quote:
Quote:
Quote:
Quote:
Garth Wilson wrote:
Here's an example. I'm finishing up another work project with a low-cost PIC16F microcontroller for a commercial product, with 5,500 lines of code....

Why would anyone use a PIC in a commercial product?
There are other processors that are as cheap and far more capable...

It was a suitable fit for early work projects going back 18 years, but the last project has had me taking it far beyond what it was designed to do, and future projects may go further; so yes, I will be looking into a new µC family for the next 10-20 years' work. I have a lot invested in this one, in terms of learning its traps, its I/O, having a lot of code I can re-use (including my structure macros and other macros), knowing the MPASM assembler really well, having a production-worthy programmer, etc.; so I do not take lightly the matter of switching families. That even goes for things like op amps, where there are subtleties that don't come through in the data sheets, only through experience, so I don't carelessly hop from one to another. That's rather O.T., so if I can solicit your input, I'll link to this forum topic, or this one (the same thing on two different low-traffic forums).

Hi Garth,
I quite understand your point.
COMPATIBILITY was and is a major problem in the computing world.
New, more capable things are always appearing, BUT you have to forget and throw away all the work, all the knowledge from many years of learning and effort.
I have been aware of that problem for many years, and I very deliberately avoid getting tied to anything people want to tie me to.

I've often joked that just when a programming language or other major subsystem becomes mature, stable, well-documented, bug-free and highly productive, it will be taken off the market and replaced with a new one that is none of these things. Only, it's not really a joke, is it? :wink:

I watched a few hours of lectures on the ARM instruction set, and didn't find it easy at all. (Admittedly, perhaps it was just the way it was presented. I don't have the experience with it to know.) I think it would take me a long, long time to get up to some semblance of speed on it, and even then, I would not have the benefit of all my 6502 experience. And for what? That level of computing power is usually for video (like smartphones), which I have zero interest in. For our low volumes of thousands, max, not tens of thousands of units, development time (which is fastest with something you have experience with) is more important than per-piece price; so what do I care if I can get an ARM for $.37? Sometimes, however, when I see the directions the industry has gone, I think that if anything changes with my job, I'd rather push a broom and stock shelves (and just keep electronics as a hobby).

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Mon Nov 23, 2015 9:03 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
BigDumbDinosaur wrote:
I guess microwave ovens, refrigerators, water softeners, pacemakers and implanted defibrillators are using "suboptimal" technology. Don't those design engineers know any better? :lol:

No, they don't. The world is littered with products with awful designs from every field of engineering, and electrical engineers are no exception to this. I'm no reverse engineer, but I have spent a fair amount of time taking things apart to see what makes them tick, and I've seen plenty of awful engineering out there on commercial products over the past several decades. On top of that, I've spent a good deal of time reading over EE and IT-related US patents and a significant percentage of them are written by people almost as incompetent as the employees of the patent office itself.

I'll even go a step further and say that putting a 6502 into a mass-produced microwave oven, refrigerator, water softener, pacemaker, etc. is not just suboptimal; it's actually dumb. Why put your company at a competitive disadvantage by putting dead-end technology in your products when newer technology is available that comes with much better development tools, better support, higher levels of integration, and the performance people expect in the burgeoning IoT world?


PostPosted: Mon Nov 23, 2015 9:20 pm 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8432
Location: Southern California
Competitive disadvantage? Obviously they're doing pretty well. Better development tools? I won't use most companies' IDEs. They take control away, they assume you don't know what you're doing, and they usually require you to use their sub-par editors rather than a good independent professional programmer's text editor. Better support? It varies among companies. My experience is that TI's is absolutely terrible, and Microchip's people tend to fire off an answer before reading the whole question or problem, wasting time for both of us. Higher levels of integration: WDC's licensees are integrating everything on one IC. Performance, as I said before: Many things have no need for that level of performance. IoT is one of the things that makes me want to get out of the industry. It's used for snooping, and also damages our health with radiation everywhere. (I suspect the 6502 has plenty of computing power to do IoT though.) [Edit, a couple of years later: The 65c02 is now getting into IoT.] BTW, I don't have or want a smartphone. I do have a flip phone but I haven't turned it on in a week or two. There's a good chance it has a 65c02 in it anyway, as I know the 65c02 has gone into a lot of phones-- it's just not used for the graphics-intensive stuff.

Edit, years later: To add to my quote above about many things having no use for the ARM level of performance, Wayne Freeman of Microchip, which sells 8-, 16-, and 32-bit microcontrollers, has an article in Electronic Design magazine, "11 Myths About 8-Bit Microcontrollers," where he makes clear that 8-bitters are quite viable in the market and aren't going anywhere. He answers every objection raised in this thread, and more.

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Mon Nov 23, 2015 9:24 pm 

Joined: Sun Apr 10, 2011 8:29 am
Posts: 597
Location: Norway/Japan
I'm not sure how my dishwasher, refrigerator, microwave oven, or any of those types of household items could benefit from the improved performance of, e.g., an ARM CPU. A while ago I got a chance to look at the circuit board of my home alarm system when the techs came over to upgrade the sensors. The whole thing runs off a Z8 microcontroller.
The 65C02, the Z8, and the Z80 are in volume production to this day. I don't think the engineers and companies using them do so because they don't know better. On the contrary, I suspect they know exactly what they're doing.


PostPosted: Mon Nov 23, 2015 9:28 pm 

Joined: Tue Nov 16, 2010 8:00 am
Posts: 2353
Location: Gouda, The Netherlands
Quote:
(I suspect the 6502 has plenty of computing power to do IoT though.)

Maybe, but that's not the point. People want to grab a couple of standard libraries and run them, not spend weeks and months optimizing their own hand-crafted network stack in assembly so that it will fit within the constraints of ancient technology that costs more money than modern ICs.


PostPosted: Mon Nov 23, 2015 9:28 pm 

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10798
Location: England
It doesn't really make sense to me to say an ARM has too much performance. It might be too costly, or take too much power, or you might not have the skills in your team to use it, but I can't see how excess performance can be a problem.

ARM assembly is pretty straightforward. ARM is also a better target for C programming. I'm sure that's been part of the reason for ARM's spectacular success.


PostPosted: Mon Nov 23, 2015 9:35 pm 

Joined: Tue Nov 16, 2010 8:00 am
Posts: 2353
Location: Gouda, The Netherlands
Quote:
I'm not sure how my dishwasher, refrigerator, microwave oven, or any of those types of household items could benefit from the improved performance of, e.g., an ARM CPU.

It's not just about the raw performance. It's mainly about the competitive price, modern peripherals, low power, modern tools and familiarity in the workforce. Checking on DigiKey, I see two pages worth of ARM controllers that are cheaper than the smallest Z8. Apart from old habits, I don't see any good reason to stick with old tech.

And sometimes, raw performance isn't strictly required for the main task, but it can help to make the product cheaper. For instance, the little motor that drives the rotating dish in a microwave could be a sensorless brushless DC motor, reducing cost and increasing simplicity and lifetime. The same ARM that scans the keys and drives the display has enough horsepower left over to commutate the motor phases. So, instead of just replacing the small old microcontroller, a modern controller can also absorb some of the other circuitry.
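To make that concrete, here is a minimal sketch in C of the standard six-step (trapezoidal) commutation pattern for a three-phase BLDC motor, the kind of table the main MCU can service alongside its other duties. The type and table are illustrative only, not taken from any real appliance firmware, and sensorless back-EMF zero-cross detection is hardware-dependent, so it is only noted in the comments:

```c
#include <assert.h>
#include <stdint.h>

/* Six-step commutation for a three-phase BLDC motor. Each step drives
 * two phases and leaves the third floating; on a sensorless design the
 * floating phase's back-EMF is sampled to time the next step. */
typedef struct {
    int8_t a, b, c;   /* +1 = driven high, -1 = driven low, 0 = floating */
} phase_drive_t;

static const phase_drive_t commutation_table[6] = {
    { +1, -1,  0 },   /* step 0: current A->B, sense back-EMF on C */
    { +1,  0, -1 },   /* step 1: A->C, sense B */
    {  0, +1, -1 },   /* step 2: B->C, sense A */
    { -1, +1,  0 },   /* step 3: B->A, sense C */
    { -1,  0, +1 },   /* step 4: C->A, sense B */
    {  0, -1, +1 },   /* step 5: C->B, sense A */
};

/* Advance on each detected zero crossing (detection omitted here). */
static unsigned next_step(unsigned step)
{
    return (step + 1) % 6;
}
```

In a real appliance the table entries would be written to the gate-driver pins from a timer or ADC interrupt; the point is only that the control loop itself is a small, cheap amount of work for a modern MCU.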


Last edited by Arlet on Mon Nov 23, 2015 9:53 pm, edited 1 time in total.

PostPosted: Mon Nov 23, 2015 9:52 pm 

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10798
Location: England
Yes, an existing codebase would be a compelling reason to use a particular architecture.


PostPosted: Mon Nov 23, 2015 9:55 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
GARTHWILSON wrote:
For our low volumes of thousands, max, not tens of thousands of units, development time (which is fastest if it's something you have experience with) is more important than per-piece price; so what do I care if I can get an ARM for $.37. Sometimes however, when I see the directions the industry has gone, I think that if anything changes with my job, I'd rather push a broom and stock shelves (and just keep electronics as a hobby).

I understand and empathize with what you are saying. The problem is, 99.9% of the embedded engineers out there don't know the first thing about 6502s but can develop in C for an ARM very quickly. Even if, as you say, the 6502 is "good enough" for the job (which is debatable in a world where having more features than your competition is an important marketing consideration), why should a prospective employer hire you to work on a product based on a 6502, which will become effectively unmaintainable as soon as you retire or move on to a new company?


PostPosted: Mon Nov 23, 2015 10:11 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
GARTHWILSON wrote:
Competitive disadvantage? Obviously they're doing pretty well. Better development tools? I won't use most companies' IDEs. They take control away, they assume you don't know what you're doing, and they usually require you to use their sub-par editors rather than a good independent professional programmer's text editor. Better support? It varies among companies. My experience is that TI's is absolutely terrible, and Microchip's people tend to fire off an answer before reading the whole question or problem, wasting time for both of us. Higher levels of integration: WDC's licensees are integrating everything on one IC. Performance, as I said before, many things have no need for that level of performance. IoT is one of the things that makes me want to get out of the industry. It's used for snooping, and also damages our health with radiation everywhere. (I suspect the 6502 has plenty of computing power to do IoT though.) BTW, I don't have or want a smartphone. I do have a flip phone but I haven't turned it on in a week or two. There's a good chance it has a 65c02 in it anyway, as I know the 65c02 has gone into a lot of phones-- it's just not used for the graphics-intensive stuff.

I understand your preferences, and even share some of them. I have been using vi and make for software development since the 1980s, and I'm fairly comfortable with emacs as well. As powerful as they are, they are no match for a modern NetBeans or Eclipse-based IDE. I have worked extensively with both environments and I can say without reservation that I'm more productive with the new tools than the old ones. Of course YMMV, but the industry has for the most part moved on to the newer tools, so if you don't keep up, you get left behind.

I share your experience working with TI -- they are terrible and I avoid using their parts whenever possible. I've had good experiences with Linear Technology and some of the other smaller companies out there. I'd say there seems to be an inverse relationship between company size and quality of support in general, but that's probably no surprise.

Finally, it matters not what our preferences may be, but rather what the world wants. And right now the world is cuckoo for whiz-bang technology, features, and shiny polished buttons (Apple's specialty). Trying to fulfill these desires with 30-year-old technology is not a good business strategy, IMHO.


PostPosted: Mon Nov 23, 2015 10:25 pm 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8432
Location: Southern California
I acknowledge that some of your points are valid.

I have been with my current employer for 23 years, although twice I came within a hair's breadth of quitting a few years ago over disagreements with the boss. I do document my work super well though. If I ever do want to leave, I don't want to feel like I can't because I would orphan them. I brought one product to market with a 65c02, in 1993, a high-end aircraft intercom with a load of features that the competition still has not matched, to this day. This was before microcontrollers were ubiquitous, and before there were inexpensive choices that you could program on the workbench for low-volume production. The processor ran at 1 MHz and used about 3 KB of ROM and only the first two pages of RAM. It was still in demand 13 years later when it was forced out of production by a fight between the company owners, and once in a while we still get people asking if there's any way we can put one together for them. I've brought lots of products to market though with PIC16's. There are a bazillion products out there whose control requirements don't come anywhere close to maxing out a 65c02's performance or memory space—unless you use something like C. It's a shame that such an ugly, ugly language got pushed into popularity.

I do understand about market pressures; but that subject usually refers to consumer electronics, which I don't want to have anything to do with. I've been in niche markets since 1985, and industrial and military before that. In industrial and niche markets, time-to-market pressures are very low, we're not trying to entice the giggling junior-high cheerleaders at the electronics counter at Target or the T-Mobile store, and a product model may keep selling for a decade or two. The only work I did touching on military was in applications engineering at a manufacturer of VHF and UHF power transistors, mostly for amplifiers for military communications and radars, so it did not involve microprocessors, although there was programming of in-house instrumentation. [Edit, 2018: My own employer has been getting into more military applications; and I've also been doing consulting at a company that makes propulsion units for small satellites which of course have to be super reliable since besides the cost to launch them into space, you can't just mail it back home for repair. They're using the MSP430 microcontroller.]

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Mon Nov 23, 2015 11:18 pm 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8173
Location: Midwestern USA
Tor wrote:
I'm not sure how my dishwasher, refrigerator, microwave oven, or any of those type of household items could benefit from the improved performance of e.g. an ARM CPU. A while ago I got a chance to look at the circuit board of my home alarm system when the techs came over to upgrade the sensors. The whole thing runs off a Z8 microcontroller.

The 65C02, the Z8, and the Z80 are in volume production to this day. I don't think the engineers and companies using them do so because they don't know better. On the contrary, I suspect they know exactly what they're doing.

You made my point. :D Thanks!

Economy of production and service are often more powerful business motivators than whiz-bang performance. A computer jock may pitch something to his boss in terms of how fast it is, but what the boss really cares about is whether it will be profitable and not a warranty headache. New technology isn't always good technology, and over the years the electronics and computer industries (especially a certain software vendor in Redmond, Washington) have repeatedly demonstrated that adage.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Mon Nov 23, 2015 11:26 pm 

Joined: Wed Sep 23, 2015 8:14 pm
Posts: 171
Location: Philadelphia, PA
GARTHWILSON wrote:
I do understand about market pressures; but that subject usually refers to consumer electronics, which I don't want to have anything to do with. I've been in niche markets since 1985, and industrial and military before that. In industrial and niche markets, time-to-market pressures are very low, we're not trying to entice the giggling junior-high cheerleaders at the electronics counter at Target or the T-Mobile store, and a product model may keep selling for a decade or two. The only work I did touching on military was in applications engineering at a manufacturer of VHF and UHF power transistors, mostly for amplifiers for military communications and radars, so it did not involve microprocessors, although there was programming of in-house instrumentation.

I've been in both consumer and industrial markets, and I agree with you that time-to-market pressures are much lower outside of the consumer space.

I get that for applications where the 65c02 has sufficient horsepower to get the job done, you feel you can develop a product faster than on another platform due to your extreme familiarity with it. I'm just saying that this is far from a generally compelling reason to use such a processor (or any other technology from that era) in a new product being brought to market today, in virtually any space -- consumer or industrial. If you had your own company developing your own products, it might make a certain sense for you to use them given your reluctance to learn newer technologies (though perhaps if you didn't hate C so much you'd find it a lot easier). It would certainly put you at a competitive disadvantage, but I've no doubt your skill and experience would make up for some of that.

BTW, on modern CPUs, compiled languages like C generally run faster than hand-coded assembly language! Humans are just not that good at keeping track of register allocation, pipeline stalls, cache hits, and other factors that make modern CPUs so much faster than their predecessors. But I can understand your hatred of C if you do most of your work with a 65c02 -- I can't think of a worse target for a C compiler except maybe an i4004!
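To illustrate the compiler point with a toy example (the routine itself is mine, purely for illustration): written as plain C, something like the checksum below leaves register allocation, loop unrolling, and instruction scheduling entirely to the compiler, which on a pipelined ARM typically produces code as fast as, or faster than, assembly a human could realistically maintain:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Sum bytes into 32 bits, then fold into 16 bits with end-around
 * carry (the same folding idea RFC 1071 uses for the Internet
 * checksum). Written naively; the compiler does the machine-level
 * optimization. */
static uint32_t checksum16(const uint8_t *buf, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += buf[i];
    while (sum > 0xFFFF)
        sum = (sum & 0xFFFF) + (sum >> 16);
    return sum;
}
```

The same source also compiles unchanged for the next architecture the product moves to, which is exactly the maintainability argument above.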

