6502.org Forum  Projects  Code  Documents  Tools  Forum
PostPosted: Fri Nov 29, 2019 7:40 am 

Joined: Fri Nov 09, 2012 5:54 pm
Posts: 1431
MichaelM wrote:
I realize that most of us on this forum are old and very much curmudgeons

Yes, that's the usual result when staying in a tech job for some time.

MichaelM wrote:
Threads like the one here carrying on about the 8086 real mode architectural limitations are getting to be like broken records since it's been nearly 2 decades since I used an x86 in that mode.

As an old curmudgeon, I still stick with Borland C 2.0 in a DOS box for testing concepts.
It's just that the compiler's built-in help is really good, and the code examples in it actually work...
...and it really helps to know that your code works before trying it in a newer IDE.

Does anybody know what to do when M$ Visual Studio stubbornly tells you "Exception has been thrown by the target of an invocation"?
Coding sure was more fun when computers only had 1kB of RAM. ;)


PostPosted: Fri Nov 29, 2019 8:17 am 

Joined: Thu Jan 21, 2016 7:33 pm
Posts: 282
Location: Placerville, CA
I mean, the last time I touched real-mode 8086 assembly was...um, two and a half weeks ago. It's definitely still germane to the discussion in this type of community.


PostPosted: Fri Nov 29, 2019 10:15 am 

Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10986
Location: England
Personally, I find historical stories interesting, and rants boring.

As for the 8086, I think it's worth noting that it was architected by someone (Stephen Morse) with relatively little experience and implemented in short order by a small team: it was only meant to be a stop-gap. Neither Intel nor others, I think, put great value in object-code compatibility for microprocessors, or expected such a long run as the x86 eventually got.

As for the 6809, it has about twice the transistor count of the 6502 - one would hope that it delivers a better experience. The 6502 had to be cheap, and therefore small, to have any chance at all against the 6800. It could not have been much more complex than it was, and indeed we know it was simplified in some way at a late point in the design cycle: it had gone over budget.


PostPosted: Fri Nov 29, 2019 3:17 pm 

Joined: Tue Mar 02, 2004 8:55 am
Posts: 996
Location: Berkshire, UK
BigEd wrote:
As for the 6809, it has about twice the transistor count of the 6502 - one would hope that it delivers a better experience.

It does, but only for programmers. Its instruction set is much better than the 6502's, with more registers, two stacks, and lots of nice addressing modes, but all the extra complexity slows instruction decoding down, and sadly Motorola didn't make fast versions of the chip.

_________________
Andrew Jacobs
6502 & PIC Stuff - http://www.obelisk.me.uk/
Cross-Platform 6502/65C02/65816 Macro Assembler - http://www.obelisk.me.uk/dev65/
Open Source Projects - https://github.com/andrew-jacobs


PostPosted: Fri Nov 29, 2019 5:34 pm 

Joined: Sat Dec 01, 2018 1:53 pm
Posts: 730
Location: Tokyo, Japan
BigEd wrote:
As for the 6809, it has about twice the transistor count of the 6502 - one would hope that it delivers a better experience.

Yes. For twice the transistor count, full sixteen-bit index registers and a sixteen-bit stack pointer might be expected, plus a few other fun bits such as a relocatable zero page. But the 6809 went far beyond that. (Just look at the addressing modes. It's a freakin' minicomputer.) There's no question in my mind that it was a truly beautiful design in and of itself, assuming that cost was no object. (You certainly couldn't say the same about the 6502, whose engineering beauty was absolutely tied up in how much they gave you for an incredibly low price.) And yet the 6809 wasn't all that expensive, either.

There is a whole boatload of mistakes, errors, and misjudgements you can correctly attribute to Motorola, but neither the 6809's architecture nor its relatively reasonable price falls into that category.

Quote:
The 6502 had to be cheap, and therefore small, to have any chance at all against the 6800.

I disagree. The 6502 could have been considerably larger and more expensive and still been a decent competitor. That the 6502 team eschewed being merely a notable improvement and instead delivered something revolutionary is a testament to their genius.

_________________
Curt J. Sampson - github.com/0cjs


PostPosted: Fri Nov 29, 2019 11:19 pm 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8514
Location: Midwestern USA
cjs wrote:
(For MOS with the 65816 it made sense to spend gates on binary compatibility...)

Minor point: the 65C816 was not a MOS Technology design, which was also true of the 65C02.

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Sun Dec 08, 2019 4:42 pm 

Joined: Mon May 21, 2018 8:09 pm
Posts: 1462
The 68K family was an excellent CISC ISA. I think I heard tell that it was influenced by the VAX. The VAX is actually even more flexible with its addressing modes, which can be applied independently to each input and output operand, but the simpler 68K scheme makes a lot of sense in context.

The 68K became obsolescent when it got too difficult to make CISC CPUs faster - about the time superscalar execution became mandatory to keep up with the RISC revolution. Implementing superscalar capabilities is considerably harder when you have variable-length instructions, which RISC CPUs didn't have. A popular option was to migrate 68K machines to PowerPC; the Mac did a pretty good job of this by integrating a 68K emulator into the OS. The PowerPC was sufficiently faster than the (already competent, but not superscalar) 68040 that the emulated code often ran faster than on a native 68K Mac. So the 68K was relegated to the embedded market and became CPU32, and then ColdFire.

At around the same time, x86 CPUs essentially became hardware translation engines bolted onto RISC CPUs. A pioneer in this field was NexGen, who literally built an x86 translator chip to wire up to their existing RISC CPU chip. I've actually seen a motherboard that was built with these two chips. AMD were taking a similar approach with their K5; it was internally an x86 translation engine bolted onto an Am29K RISC CPU. AMD then bought NexGen and turned their design into the K6, with a Socket 7 pinout that would fit directly into Intel Pentium motherboards (the K5 fitted the earlier Socket 5 Pentium motherboards). Again in a similar timeframe, Intel introduced the Pentium Pro and then the Pentium II which used the same basic idea - but AMD kept end-user costs lower with similar performance. Subsequently, the translation was no longer to a standard RISC ISA, but was customised to maximise performance.

There is no fundamental reason why subsequent 68K CPUs could not have been built in the same way, but somehow the funding and the technical knowhow never materialised. Indeed with the cleaner 68K ISA, it would probably have been easier to do so than it was with x86. Then it would likely have survived to the end of 32-bit ISA viability, around the time the Athlon 64 was released - and a similar solution (a modal backwards compatibility break) adopted for the move to 64-bit, as ARM also has.

Meanwhile, the 6502 has never grown all that much beyond its original design. This has actually been to its benefit, because it is still relevant to homebrew enthusiasts like ourselves, as well as to cost-sensitive embedded systems. Sure, I can pick up an ARM microcontroller for €1 when a new 65C816 costs €8, but the former doesn't let me build a complete computer around it; the latter does.


PostPosted: Mon Dec 09, 2019 2:29 am 

Joined: Sat Dec 01, 2018 1:53 pm
Posts: 730
Location: Tokyo, Japan
Chromatix wrote:
There is no fundamental reason why subsequent 68K CPUs could not have been built in the same way, but somehow the funding and the technical knowhow never materialised. Indeed with the cleaner 68K ISA, it would probably have been easier to do so than it was with x86.

It looks to me like a simple matter of installed base. The x86 PC market was several times the size of all other PC markets put together, so maintaining compatibility even at enormous (ludicrous?) cost was worth the gamble, and it paid off with continued market dominance. Had the PC market been more fragmented amongst the various CPUs, likely it would not have been worthwhile to spend so much just for compatibility: with greater multi-platform support in the software ecosystem, other vendors with cheaper architectures offering the same performance would have been worth switching to.

_________________
Curt J. Sampson - github.com/0cjs


PostPosted: Mon Dec 09, 2019 2:46 pm 

Joined: Mon May 21, 2018 8:09 pm
Posts: 1462
In other words, if IBM had gone for the 68K instead of the 8086 for their PC, we might indeed have had super-fast 68K-family CPUs using a translating architecture. But they went for a cheaper option, and something accidentally made the PC take off in a big way.


PostPosted: Tue Dec 10, 2019 6:56 am 

Joined: Sat Dec 01, 2018 1:53 pm
Posts: 730
Location: Tokyo, Japan
Chromatix wrote:
....and something accidentally made the PC take off in a big way.

Not an accident. Sure, it might not have happened exactly as it did had some things not gone well, but Gates' intent in selling IBM a non-exclusive license to MS-DOS was exactly the same as his selling a non-exclusive BASIC license to Altair more than five years earlier: commoditize its complement.

Even had IBM managed to put a stop to full-on PC clones (with the same memory map and BIOS), we still would have had a lot of different MS-DOS machines out there, all using 8086-based processors. In fact, outside the U.S. we actually did: the Victor 9000 was the most popular business computer in Europe in the very early '80s, and the PC-9800 series in Japan was more popular than PC clones right into the early '90s.

_________________
Curt J. Sampson - github.com/0cjs


PostPosted: Tue Dec 10, 2019 9:35 am 

Joined: Thu May 14, 2015 9:20 pm
Posts: 155
Location: UK
About DTACK GROUNDED...
Quote:
and you can throw out about 93% of Motorola's application information on the 68000.
I think this sums up the author's take on life :mrgreen:

We can discuss the pros, cons, and happenstances forever. None of the 8-bit CPUs were designed for home computers or for hobbyist use. If you could go back in time and ask the designers of the 6800, 6809, 6502, 8080, Z80 etc. where their MPUs would be in five years, let alone 40 years later, they would not believe that any 8-bit part would still be in production.

Talking of which, you can still buy new CMOS Z80 chips...
To wander off in this direction, you could read this: https://news.ycombinator.com/item?id=10763274

I thought I read somewhere that IBM chose the 8088 because (1) they already had dealings with Intel and (2) they wanted the chips to be available in volume, so they wanted second sources to exist...

Mark

