It's all very amusing but I don't want to descend to that level. I liked my Amiga and I learnt a lot getting things running on it: I had to drop to assembly twice, I think, once to get a library working to read DOS-format disks and once to get a C compiler connected into the OS. Hooking up different levels was an education in calling conventions.
No reason it has to be either/or.
Cheers
Ed
There's no "either/or" involved—I'm looking at it from an historical perspective—nor is there any descending taking place. I'm commenting on a certain type of myopia that afflicts some observers of computer technology. The "DTACK Grounded" editor was viewing microprocessor technology through a too-small lens, his view being further constricted by the desire to promote his company's product (I can't say I ever ran across one of those boards that adapted a 68K to a PET/CBM machine).
At the time of its introduction, the overwhelming advantage of the 68K family was its performance. However, for much of its economic life, it was hobbled by complexity and cost, exacerbated by Motorola's inability to bring the price down to the level of Intel's inferior product. There has never been any question in my mind that the 68K was superior to the x86 design in almost all aspects. It was for that reason that I taught myself 68K assembly language some 25 years ago, thinking that, for once, technical superiority might win out over market hype and inferiority. It didn't happen. Had it happened, we might all have been computing on Amigas running UNIX, and I might have actually gotten some benefit from my studiousness (I've all but forgotten 68K machine code).
The chuckles came from the predictions that were made by the "DTACK Grounded" editor despite emerging and quite apparent market forces. The microprocessor market during the period of time when "DTACK Grounded" was being published had already forked: complexity to the left (so to speak) and simplicity to the right. There was no question that the divergence would only increase in time (look at the current generation of x86 architecture systems—terribly complicated buggers).
I think it was too easy at the time to get caught up in the wave of ever-increasing capability/complexity and fail to see that not all applications required it. Did the controller in a microwave oven require the capability/complexity of a 68K or x86 MPU? How about the ASIC embedded in the then-nascent cell phone? Neither did, and that's why Bill Mensch, who obviously was not viewing the microprocessor market through a peephole lens, continues to make WDC's weekly payroll with the 65xx family. I suspect he gets a chuckle every now and then as he reminisces about the "microprocessor battles" of the 1980s and how well his tiny company has done over the years despite the predictions.