PostPosted: Tue Jul 11, 2017 8:01 pm 
Joined: Thu Dec 11, 2008 1:28 pm
Posts: 10980
Location: England
Yes, I've read somewhere about how long - weeks I think - it took Acorn to re-verify the chip using their Basic model after a change to the design. I'll try to find that... ah - found it:

Quote:
I’ve talked about Sophie who did the instruction set architecture, I did the microarchitecture reference model, three VLSI designers put the chip together, but there was also a software testing team. So Sophie built a software emulator for the ARM and several people on the software side, Hugo Tyson, John Thackeray – I can’t remember if David Seal was there then, but there were four or five people who built test programs and these were just bits of software which, you know, when they were testing the barrel shifter they checked every possible capability of the barrel shifter and the program self tested that it worked. And I do remember that on – when we came to move that from Sophie’s software emulator, which was quite efficient, to my reference model, which of course was built to model the way the hardware worked and was therefore much less efficient, that it took about two CPU weeks on a BBC second processor to run all the validation programs on this model, to check the model out. And of course every time there was a design change you had to run this two weeks of testing again.

Steve Furber at http://sounds.bl.uk/related-content/TRA ... f#page=139
via the comments at https://web.archive.org/web/20190327222 ... HcVHwqynNJ
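
To illustrate the kind of self-checking validation program Furber describes, here's a small sketch - in Python rather than BBC BASIC, and emphatically not Acorn's actual test code - that drives a model of the ARM barrel shifter through every shift type and amount and checks each result a second way. The function names and the simplification to immediate shift amounts 0-31 are mine.

Code:
# Illustrative stand-in for the barrel shifter model (carry-out omitted).
MASK = 0xFFFFFFFF  # 32-bit word

def barrel_shift(value, op, amount):
    value &= MASK
    if op == "LSL":                      # logical shift left
        return (value << amount) & MASK
    if op == "LSR":                      # logical shift right
        return value >> amount
    if op == "ASR":                      # arithmetic shift right
        out = value >> amount
        if amount and (value >> 31):     # replicate the sign bit into the top
            out |= (MASK << (32 - amount)) & MASK
        return out
    if op == "ROR":                      # rotate right
        return ((value >> amount) | (value << (32 - amount))) & MASK
    raise ValueError(op)

def check(value, op, amount):
    """Compute the expected result by a second route and compare (the 'self test')."""
    if op == "LSL":
        expect = (value * (1 << amount)) & MASK
    elif op == "LSR":
        expect = value // (1 << amount)
    elif op == "ASR":
        signed = value - (1 << 32) if value & 0x80000000 else value
        expect = (signed >> amount) & MASK
    else:  # ROR
        expect = ((value >> amount) | (value << (32 - amount))) & MASK
    assert barrel_shift(value, op, amount) == expect, (value, op, amount)

patterns = [0x00000000, 0xFFFFFFFF, 0x80000001, 0x12345678, 0xDEADBEEF]
for op in ("LSL", "LSR", "ASR", "ROR"):
    for amount in range(32):
        for value in patterns:
            check(value, op, amount)
print("barrel shifter model passed all checks")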

A bit more from Steve Furber:
Quote:
At Acorn we picked up on that [RISC] idea and set about developing the ARM. ARM stands for Acorn RISC Machine, Reduced Instruction Set Computer is ARM's middle name. It's a very simple processor. Sophie Wilson (again) drafted the instruction set. I wrote the BBC BASIC reference model, the entire processor was modelled in BBC BASIC. I thought the model was lost until 2-3 years ago when I found it on a floppy disc in my garage. It took me quite a long time to read this floppy disc, because it was a BBC floppy disc, and I don't generally have BBCs lying around operational, but on there was the original ARM reference model, 808 lines of code, the processor and the test environment. When you turn it into today's high technology design systems in Verilog it comes to 10000 lines of code.
...
Q: What didn't you like about the processors that were available at the time when you were about to launch on your processor design?

The answer is technical: there were two things we didn't like. The first was that they were based on the types of instructions that were used in minicomputers in the 1970s, and this meant they had very complex instruction sets; for example the National Semiconductor 32016 has a memory-to-memory divide instruction that took 360 clock cycles to complete, which was 60 microseconds during which it was not interruptible. The processor had a worst-case interrupt response time of 60 microseconds. This very strongly affected its real-time performance. On the Beeb we did everything with interrupts to do the real-time handling of IO, and the 16-bit processors had worse performance for IO, so we thought that was just a bad idea.

The second thing was perhaps even more fundamental. We'd done a lot of experiments to establish that the main thing that affected a processor's performance is its ability to access and use memory bandwidth. The power of a processor is roughly proportional to the number of MB/s it can suck out of memory. In the early 80s the commodity memory, the thing you paid money for when you bought a computer, was called DRAM. That delivers a certain bandwidth that you could use, and none of the processors you could buy could keep up with the memory; they effectively threw this bandwidth away, so you only got the performance the processor forced on you, not what the memory could deliver. So with the ARM, firstly, keeping it very simple and following RISC principles meant we got the real-time response, ARM's worst interrupt latency was a few microseconds; and secondly it was not a 16-bit processor, it was a 32-bit processor, which immediately gives you double the memory bandwidth, and it could make good use of all the RAM available, and furthermore it could use fast-page-mode technology, a way of exploiting sequential access modes. It was very tuned to the memory and as a result we could get effectively 3-4 times the memory bandwidth of commodity processors. Real-time performance and memory bandwidth.
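
For a rough sense of the numbers in that answer, here's a back-of-the-envelope sketch. The clock rate and DRAM cycle times below are illustrative assumptions of mine, not figures from the talk - only the 360-cycle divide, the ~60 microsecond latency and the 3-4x bandwidth claim come from Furber.

Code:
# Worst-case interrupt latency of the NS32016's uninterruptible divide,
# assuming a ~6 MHz part:
clock_hz = 6_000_000
divide_cycles = 360
latency_us = divide_cycles / clock_hz * 1e6
print(f"NS32016 divide latency: {latency_us:.0f} us")      # ~60 us, as quoted

# Memory bandwidth: a 32-bit bus moves twice the bytes per access of a
# 16-bit bus, and fast-page-mode DRAM roughly halves the cycle time for
# sequential accesses within a page (assumed cycle times below).
dram_cycle_ns = 250                                         # random-access cycle
fpm_cycle_ns = 125                                          # fast-page-mode cycle
bw_16bit = 2 / (dram_cycle_ns * 1e-9) / 1e6                 # MB/s, 16-bit, random
bw_32bit_fpm = 4 / (fpm_cycle_ns * 1e-9) / 1e6              # MB/s, 32-bit, page mode
print(f"16-bit random-cycle DRAM : {bw_16bit:.0f} MB/s")
print(f"32-bit fast-page-mode    : {bw_32bit_fpm:.0f} MB/s")
print(f"ratio: {bw_32bit_fpm / bw_16bit:.1f}x")             # in line with the 3-4x claim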


The 808-line Basic model was on display in Cambridge last year - here's a photo:
[photo: a Beeb displaying the ARM CPU simulator]
Quote:
This Beeb is showing the ARM CPU simulator, famously written in 808 lines of Basic by Steve Furber. It's too small to run it though.
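
For anyone wondering what an instruction-set-level simulator like that looks like, here's a heavily simplified sketch - a toy two-instruction machine in Python, nothing like the real 808 lines of BBC BASIC, with an invented encoding - just to show the shape: a register file, a fetch/decode/execute step, and a tiny self-checking test program.

Code:
MASK = 0xFFFFFFFF

class TinyCPU:
    def __init__(self, memory):
        self.r = [0] * 16            # 16 registers, R15 used as the program counter
        self.mem = memory            # list of 32-bit instruction words

    def step(self):
        pc = self.r[15]
        instr = self.mem[pc]         # fetch
        op = (instr >> 28) & 0xF     # decode (invented 4-bit opcode field)
        rd = (instr >> 24) & 0xF
        rn = (instr >> 20) & 0xF
        imm = instr & 0xFFFFF
        self.r[15] = (pc + 1) & MASK
        if op == 0x0:                # ADD rd, rn, #imm
            self.r[rd] = (self.r[rn] + imm) & MASK
        elif op == 0x1:              # MOV rd, #imm
            self.r[rd] = imm
        else:
            raise ValueError(f"undefined opcode {op:#x}")

# test environment: run a two-instruction program and check the result
prog = [0x1100002A,   # MOV R1, #42
        0x02100001]   # ADD R2, R1, #1
cpu = TinyCPU(prog)
cpu.step()
cpu.step()
assert cpu.r[2] == 43
print("tiny reference model self-test passed")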

