Turnkey get-started-fast solution

Let's talk about anything related to the 6502 microprocessor.
lordsteve
Posts: 56
Joined: 22 Jan 2003
Location: Estados Unidos
Contact:

Post by lordsteve »

8BIT said...
Quote:
What microcontroller did you have in mind? I would like to work with something that has 75+ MIPS that does not require complex interfaces.
Glad you asked! I am a fanboy of the SX microcontroller.
http://www.parallax.com/sx/index.asp
http://www.parallax.com/detail.asp?product_id=SX48BD-G
Don't worry. They have dip versions, too.

8BIT said...
Quote:
I have been working with AVR controllers for ethernet and FAT16/32 interfaces.
I have used an SX28 microcontroller @ 50MHz to stream 22.05kHz stereo 16-bit audio to an Analog Devices AD1866 DAC from a FAT16 file system on an SD card. Fun stuff.

The mighty Garth said...
Quote:
A modern video driver with a large video RAM is probably not something we could do with any microcontroller.
Not modern, and maybe not large, but I have used the SX micro to generate frame-buffered video with 128x96 resolution. More is possible.
Thanks for playing.
-- Lord Steve
kc5tja
Posts: 1706
Joined: 04 Jan 2003

Post by kc5tja »

And if you use an external shift register or set of shift registers, I'm sure you could grab color information as well as higher resolutions. To get anything close to 640x480 at VGA bandwidths, though, you're looking at at least an IntellaSys SEAforth chip (24 small Forth cores on a single chip), with an FPGA preferred.

Like the Terbium 32-bit architecture, seaForth chips are currently in development, and therefore, vaporware. But since they've already published instruction sets and emulators and the like, there's actually a better chance of seeing a seaForth than a Terbium at this point. ;D
faybs
Posts: 106
Joined: 16 Oct 2006

Post by faybs »

Parallax also sells a chip they call the Propeller: an 8-core chip with 32KB of internal shared memory and 2KB of private memory for each core. Cores run independently of each other, with shared-memory arbitration handled by giving full access to one core at a time, round-robin style. The chip has 32 user I/O lines and is powerful enough to drive VGA at 1600x1200 in 64 colors using only a resistor-network DAC (too bad there's only enough internal memory for text modes at that resolution...). It also has dedicated internal hardware to generate composite video, again with just a resistor-network DAC. All that in a $13 DIP40 package, plus an external serial ROM to store your programs in (the code gets sucked in just after a chip reset).

Parallax hosts a user-contributed code library on their website, with code to drive PAL and NTSC TVs and VGA monitors, read and write SD cards, generate polyphonic sound, and quite a few other things. It's quite a good candidate for an "uber peripheral" for V2 of the little trainer board.
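That round-robin hub scheme is easy to picture in a few lines of Python. This is a toy model, not anything timing-accurate to the real Propeller; it just shows the fairness property:

```python
# Toy model of round-robin shared-memory arbitration: one core ("cog")
# at a time gets full access to the hub, in a fixed rotation.

def hub_owner(cycle, ncogs=8):
    """Return which of the ncogs cores owns the shared hub on a given cycle."""
    return cycle % ncogs

# Over any window of 8 cycles, every cog gets exactly one turn:
turns = [hub_owner(c) for c in range(8)]
```

No cog can starve another, but each cog also has to wait up to a full rotation for its hub slot, which is the price of the scheme's simplicity.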
User avatar
8BIT
Posts: 1787
Joined: 30 Aug 2002
Location: Sacramento, CA
Contact:

Post by 8BIT »

I have two 22V10 PLDs on order, one Atmel and one Lattice. I want to test these in my programmer to be sure I can program at least one. I have test code written that will do the address decoding and also provide the bank switching for the upper ROM and RAM. With the 22V10, there are not enough resources to provide feedback on the current bank-select state (unless you tie the pin to an input on a VIA).
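For illustration, here is a rough Python model of the kind of decode a PLD like that would implement. The memory map (32K RAM low, a small I/O window, bank-switched ROM up top) is hypothetical, not Daryl's actual equations; real 22V10 source would express the same terms as sum-of-products logic:

```python
# Hypothetical address decode with a 1-bit ROM bank latch, modeled in Python.
# A real 22V10 would compute these selects combinatorially from A15..A10.

def decode(addr, bank):
    """Return the chip-select name for a 16-bit address and a 1-bit bank latch."""
    if addr < 0x8000:
        return "RAM"        # $0000-$7FFF: 32K static RAM
    if addr < 0x8400:
        return "IO"         # $8000-$83FF: VIA/ACIA window (hypothetical)
    # $8400-$FFFF: ROM, with the window switched by the bank bit
    return "ROM0" if bank == 0 else "ROM1"
```

The bank latch is the part the 22V10 can't report back, which is why reading its state needs either a spare VIA input or a bigger device like the XC9536.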

As an option, I've also written code to program a Xilinx XC9536 CPLD. This is a 44-pin PLCC that would allow you to read the state of the internal bank switches. As an added benefit, you can also run a two-speed clock and have an extra IO block decoded. This would be an upgrade to the 22V10, or a replacement if the 22V10 fails to perform the necessary operations.

Keeping the beginner in mind, do we need to stick to 40 pin DIP parts for the 65C02 and 65C22's or would 44 pin PLCC (with through hole sockets) parts be acceptable?

Any thoughts?

Daryl
smilingphoenix
Posts: 43
Joined: 20 May 2006
Location: Brighton, England

Post by smilingphoenix »

I've just read this topic from start to finish and have to agree with a comment made by GARTHWILSON early on - this is becoming a lofty ideas discussion and losing sight of the original idea somewhat.

The original idea was for a simple system for beginners. With this in mind, I would like to suggest what I would have liked when I started playing with the 6502 all those years ago...

A 65C02 processor with jumper options to allow a 65C816 if desired.

Two 65C22 VIAs. One handles an LCD, a keypad, a bleeper, and maybe a few LEDs or something. The other just goes to connectors for experimenting.

A 65C51 for serial connection to a PC or whatever. I know the 65C51 isn't the best ACIA in the world, but this system is for beginners who won't mind waiting a few seconds while their 1024-byte program transfers at 9600 baud.

A 32K static RAM chip.

Two ROM sockets for 8K EPROMs/EEPROMs. One ROM would hold the monitor; the other would be a socket for expansion. It might be possible to program an EEPROM in-socket, but as I've never played with them I wouldn't know.

Some basic address decoding logic, using 74HC chips or similar. I really think this would be preferable to PALs, GALs, FPGAs or other programmable logic. I would hate to learn about computers using a simple board like this and, when I understood enough to start studying the board itself, find that most of it was buried in a complex, preprogrammed device, even if the equivalent circuit was available.
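As a sketch of how little logic that decoding can take: a single 74HC138 3-to-8 decoder with A15-A13 on its select inputs carves the 64K map into eight 8K regions, each output serving as one chip select. Here's a toy Python model of that truth table (the A15..A13 wiring is one plausible choice, not any particular board's):

```python
# Model of a 74HC138 used as an address decoder: three select inputs pick
# one of eight active-low outputs, giving eight 8K chip-select regions.

def hc138(a15, a14, a13):
    """Return which of the 8 active-low outputs (/Y0../Y7) goes low."""
    return (a15 << 2) | (a14 << 1) | a13

def region(addr):
    """Map a 16-bit address to its 8K region number, 0-7."""
    return hc138((addr >> 15) & 1, (addr >> 14) & 1, (addr >> 13) & 1)
```

One chip, fully visible on a schematic, and a beginner can trace every select line with a logic probe.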

A connector carrying the CPU bus. Sooner or later, beginners are going to want to build their own interfaces that hang onto the CPU - they can't learn about this unless they can get access to the CPU bus. Bus buffers on this connector might be a good idea - they prevent the rest of the system exploding when you make a mistake and also mean you can connect an interface via a reasonable length of cable without worrying about loading the CPU too heavily (At least you can if you don't wind the clock up too far...)

The ROM should contain a basic monitor allowing the user to examine and modify memory via the keypad/LCD, and to upload and download software over the serial link from a PC or whatever. It should also contain some basic utilities. Languages and operating systems should be reserved for expansion ROMs.
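For flavor, here's a toy Python model of the examine/modify half of such a monitor. The "E addr" / "M addr val" command syntax is invented for illustration, and a real monitor would of course be written in 65C02 assembly:

```python
# Toy examine/modify command loop for a hypothetical ROM monitor.
# "E 0200"    examine the byte at $0200
# "M 0200 A9" store $A9 at $0200

memory = bytearray(0x8000)  # model the 32K of RAM

def monitor(line):
    parts = line.split()
    cmd, addr = parts[0].upper(), int(parts[1], 16)
    if cmd == "E":                           # examine one byte
        return f"{addr:04X}: {memory[addr]:02X}"
    if cmd == "M":                           # modify one byte
        memory[addr] = int(parts[2], 16) & 0xFF
        return f"{addr:04X} <- {memory[addr]:02X}"
    return "?"
```

Even a command set this small is enough to key in and check a short program from the keypad, with the serial upload/download commands layered on the same core.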

And thats it. No, really. The advanced features we prefer are a big turn-off for the absolute novice who doesn't understand them. Should the user progress to the point of wanting to add advanced functionality, he can, via the bus expansion connector or whatever. But for the basic system, I believe it should be kept as simple as possible.

A few other random points.

DIL devices are to be preferred to devices with tiny, closely spaced legs, simply because it's much easier to attach probes and see what's going on.

I haven't mentioned a clock speed. 2 or 3 MHz would do for absolute beginners, but someone is bound to want to crank it up at some point, so provision for faster devices should be included. I like the idea of using a clock oscillator can and letting the user change his clock speed by plugging in a new can.

---< Brain Overload >---

I'm sure there were some other points I wanted to make but I can't think of them right now.

Sorry to have interrupted your discussions with my inane ramblings. Please do continue, it makes for very interesting reading. I never cease to be amazed at what some of you guys can persuade the humble 6502 to do.

Hmm. Having read this through, it is conceivable that this post may be considered slightly rude. If this is the case, I do apologise; rudeness was not my intention.
Shift to the left, shift to the right,
mask in, mask out,
BYTE! BYTE! BYTE!
User avatar
8BIT
Posts: 1787
Joined: 30 Aug 2002
Location: Sacramento, CA
Contact:

Post by 8BIT »

smilingphoenix

Your comments are quite welcome and I have to agree with you. Perhaps even I have gotten a little carried away with adding too much into it.

If you have time, take a look at my SBC-2 boards and tell me if that would be something similar to what you are describing (minus the two ROM chips and bus buffers).

Thanks for your comments. (I will most likely incorporate some of my design ideas in a new SBC-3 board.)

Daryl

http://sbc.rictor.org/hardware.html
smilingphoenix
Posts: 43
Joined: 20 May 2006
Location: Brighton, England

Post by smilingphoenix »

Daryl

Yep. Your SBC-2, with a keypad & lcd driven off one of the 65C22's was pretty much what I had in mind.

They've got enough memory and I/O to handle most things a beginner is likely to want to do. With the bus available, you could then expand further if you wished.

One thing that is vital - make sure the monitor program has the ability to upload a block of memory via the RS232 port. Otherwise you've got no way to save your programs prior to the inevitable crash. Guess who forgot this when they built their first SBC all those years ago?
Shift to the left, shift to the right,
mask in, mask out,
BYTE! BYTE! BYTE!
daivox
Posts: 30
Joined: 04 Sep 2004
Location: Last Ninja 2: Basement

Post by daivox »

It's times like this that I wish my OS hadn't proven to be so poorly written. I'm planning to do a rewrite from scratch with definite pre-planning and goals instead of just jumping in for educational purposes, and maybe when that's done it'll be more helpful.
kc5tja
Posts: 1706
Joined: 04 Jan 2003

Post by kc5tja »

Be careful!!!

The best-laid plans often lead to monstrosities like OS/360! Meanwhile, the most haphazardly concocted, half-assed excuse for coding ends up dominating the world -- like Unix. ;D

In my experience, I find that my prototypes very often end up becoming production code because they work "good enough."

Do you have a webpage describing your OS? I'd be interested in looking at it.

(Although, that being said, implementing a POSIX subset seems to be the most productive thing an OS writer can do. POSIX isn't my favorite architecture (indeed, AmigaOS is my personal favorite), but you can't argue it has a metric ton of applications already written to use it. )
daivox
Posts: 30
Joined: 04 Sep 2004
Location: Last Ninja 2: Basement

Post by daivox »

IIRC, POSIX relies on C, and C is simply not something that runs quite optimally on 65xx CPUs, no?

And I don't know much about POSIX anyway. My main problem with my prototype was that it simply didn't provide certain things I needed. While tons and tons of it could be reused (I had malloc, some basic memory-moving calls, stuff like that), it needed too much reworking to include some things that were desperately needed (drivers registering IRQ/NMI hooks dynamically with the core, for example).

My plans are grand, however. It will be completely self-hosting, with the same format for executable files AND for libraries, meaning not only that the linker doesn't have to think about two different file formats internally, but also that a program can double as a library! The biggest problem I've had is that I haven't written the toolchain yet, because I'm a freakin' amateur and am learning everything by doing.

No, there's no page to describe it, but if you shoot mail to jody(at)jodybruchon.com I'll gladly throw some info at you. Maybe you can make some suggestions.
Last edited by daivox on Sun Jul 01, 2007 2:00 am, edited 1 time in total.
kc5tja
Posts: 1706
Joined: 04 Jan 2003

Post by kc5tja »

daivox wrote:
IIRC, POSIX relies on C, and C is simply not something that runs quite optimally on 65xx CPUs, no?
No, POSIX is an API standard, specifying a Unix-like API for operating systems. It says nothing about how it's implemented, although the specification does use C for detailing various data structures.

Lunix (for the Commodore 64/128 computers) implements an API that is clearly inspired by POSIX, for example, even if it isn't completely POSIX. That OS is written in a combination of assembly language and C compiled with cc65 (the kernel in assembly, and most of the on-disk utilities in C).
Quote:
My plans are grand, however. It will be completely self-hosting, with the same format for executable files AND for libraries, meaning not only that the linker doesn't have to think about two different file formats internally, but also that a program can double as a library!
Nothing grand about that -- what I want to know is, what is taking people so long about this? Python, for example, implements this philosophy. I use this all the time at work:

Code: Select all

#!/usr/bin/python

import sys

import foo
import bar

def AGlobalFunction(*args):
    ...

class AGlobalClass(foo.ABaseClass):
    ...

def main(args):
    ...

if __name__ == '__main__':
    main(sys.argv)
This allows me to import an otherwise "executable" Python script as a module elsewhere, while still retaining the ability to directly execute it if it makes sense to.

The other nice thing with this is the fact that a compiler can compile directly to an executable without concern for linking of any kind. The loader is the linker. The only need for an external linker is for creating JAR-like entities, where you combine multiple modules into a single meta-module.
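That "loader is the linker" idea can be sketched in a few lines of Python. The module record format and symbol names here are invented for illustration, not any real executable format:

```python
# Sketch of load-time linking: a module carries raw code with unresolved
# import slots, an import map, and an export map. The loader patches the
# slots from a global symbol table, then publishes the module's exports.

def load(module, symtab):
    """Resolve a module's imports, register its exports, return the linked code."""
    code = list(module["code"])                  # copy the raw code image
    for slot, sym in module["imports"].items():  # patch each import slot
        code[slot] = symtab[sym]
    symtab.update(module["exports"])             # make this module's symbols visible
    return code
```

Because a module's exports go into the same table its own imports were resolved from, any loaded program can immediately serve as a library for the next one loaded, which is exactly the executable-doubles-as-library property being described.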
Quote:
The biggest problem I've had is that I haven't written the toolchain yet, because I'm a freakin' amateur and am learning everything by doing.
From someone who already went down this path:

* Start Backwards.

Solving a puzzle is often easiest if you work your way through it backwards. For example, ignore the language issues and jump directly to the loader. Make something that works with hand-crafted assembly-language files made to look like binaries produced by a compiler as input. This way, you have your input under full control as you debug the loader.
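In practice that looks something like this Python sketch: define a trivial binary format, build a file for it by hand, and debug the loader against that known input. The 6-byte header (magic, load address, length) is invented for the example, not any real format:

```python
import struct

MAGIC = b"XB"   # hypothetical 2-byte magic number

def parse_image(blob):
    """Parse a hand-crafted executable image: 2-byte magic, then
    little-endian 16-bit load address and code length, then the code."""
    magic, load_addr, length = struct.unpack_from("<2sHH", blob, 0)
    if magic != MAGIC:
        raise ValueError("bad magic")
    code = blob[6:6 + length]
    if len(code) != length:
        raise ValueError("truncated image")
    return load_addr, code

# A hand-built input, exactly as the advice suggests: three bytes of
# 6502 code (LDA #$00 / RTS) to be loaded at $0200.
image = MAGIC + struct.pack("<HH", 0x0200, 3) + bytes([0xA9, 0x00, 0x60])
```

Since you wrote every byte of the input yourself, any failure is unambiguously a loader bug, not a compiler bug.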

* Just Let Go or You Aren't Gonna Need It

Don't over-engineer your code. Don't under-engineer it either. Every time you say, "Shoot, I'm going to need XYZ," just walk away. Solve the problem you have now and only that problem. Who cares if it isn't the right solution? You can always fix it later.

* Throw it away; you will anyhow.

Part of "fixing it later" is realizing that code is cheap -- it is the interfaces between code that are more expensive. Therefore, if you realize that it'd be easier to just start a chunk of code over from scratch than to retrofit what's already there, do it. As long as the new code conforms to whatever interface(s) are relevant, nobody will be the wiser.

By way of analogy, a Mazda RX-7 runs using a Wankel rotary engine. These engines operate very differently from a reciprocating engine. However, if you find you want more low-end torque from the '7, you have no choice but to either put in a larger Wankel, or replace it outright with a Ford 302 V8 engine. The fit isn't perfect in either case, and adapters need to be implemented, but many people have done this with satisfying results. The point being, it's not the specific details of the engine that matter, but rather, the interface between the engine and the transmission.

Indeed, conversion of a car to full electric relies on the same principle.

This works with software too. It's what allows my applications to be written in Forth, in C, or in Haskell, and Linux doesn't give a hoot what I choose. As long as my code makes use of the Linux system call interface, it's happy, and I'm happy. Take advantage of this concept in your code too.

A corollary to this is to prototype everything, and with very rare exceptions, never use prototype code in production code. You'll find the quality of your code is better the second time around anyway.

Sometimes, it is the interface itself which needs to be adjusted. That's OK too, as long as you're careful! Remember that changing an interface directly implies that two or more pieces of code will need to be altered as well, since an interface always has a provider and at least one consumer. Which brings me to...

* Test, test, test, test, test, test, and test some more.

Test Driven Development. I cannot speak highly enough about this topic. Write your unit tests first. Then you write your production code to satisfy the unit tests. This has several highly desirable attributes:

1. Well written unit tests can be written in any order whatsoever. This means that you can test high-level code long before you test low-level code, or vice versa, or even a mish-mash. It achieves this because . . .

2. Unit tests enforce natural boundaries of modularity. If you find you can't test malloc() because it depends on the memory pool implementation somehow, you either need to find a way to guarantee memory pool accessors are tested first, or, preferred, write a mock set of memory pool accessors just for the purpose of testing.

3. Unit tests are automated. As you implement more and more functionality in the production code, you'll find that you'll periodically break another unit test accidentally (e.g., an interface was inadvertently changed). Fix the broken test or the production code under test before implementing the next feature.

4. Releases can be made only with 100% unit test success rate. If even one test fails, slip your schedule and fix it. Trust me on this.

5. Concentrate on the highest priority features first. Unix is successful because of the 80/20 rule -- the API is far from perfect, but it does solve 80% of the programming problems elegantly. The remaining 20% can be made up for in other ways. It's better to have something that works Kinda OK, than to have a perfectly non-working system. This is the "Worse is Better" approach to code design. I'm not happy with it, personally, but there's no arguing with the results.
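Point 2 above can be sketched concretely in Python. The allocator and pool interfaces here are invented for illustration (a real test of C code would go through a harness), but the shape is the point -- the mock replaces the untested dependency:

```python
import unittest

class MockPool:
    """Stands in for the real memory-pool accessors during the test."""
    def __init__(self, size):
        self.size = size
        self.next = 0
    def take(self, n):
        if self.next + n > self.size:
            return None                  # pool exhausted
        addr = self.next
        self.next += n
        return addr

def my_malloc(pool, n):
    """Toy bump allocator, written to satisfy the tests below."""
    return pool.take(n)

class MallocTest(unittest.TestCase):
    def test_allocations_do_not_overlap(self):
        pool = MockPool(64)
        self.assertEqual(my_malloc(pool, 16), 0)
        self.assertEqual(my_malloc(pool, 16), 16)
    def test_exhaustion_returns_none(self):
        self.assertIsNone(my_malloc(MockPool(8), 16))
```

Because `my_malloc` is tested against `MockPool`, the test can be written and run before the real pool implementation exists at all, which is what lets the tests be written in any order.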

Unit tests aren't perfect, of course. Being software, they can be just as buggy as the production code you're writing, and you'll find that coding takes 2x to 3x as long as it would with just a debugger. However, in my decades of coding experience, I've found that using TDD eliminates all but the most dire need for a debugger altogether, that well-written unit tests serve as documentation for how to use various interfaces, that mocked dependencies show how pieces of code fit together in the big picture (very important as systems get complex!), etc.

But that's the nice thing about unit tests. It's not the correctness of any one piece of code that you're looking for, but rather, agreement between the test and the production code. The reason is simple: given an incorrect assumption about the specification of a program, you can write just as incorrect code using formal analysis as you can with unit tests. So, don't worry about unit tests being buggy. Discrepancies between production and test code will be much, much more instructive than correctness.

By way of analogy, notice that all modern, complex electronics have some form of testing functionality built-in, be it with JTAG interfaces or some other system. There is no reason complex software should be an exception.
daivox
Posts: 30
Joined: 04 Sep 2004
Location: Last Ninja 2: Basement

Post by daivox »

Yikes.

And why isn't this advice posted somewhere on this site? :)

Thanks for the help -- I was starting to figure those things out on my own, in fact, but the jump-start is quite welcome, and I'd highly recommend someone get Mike to jam that info elsewhere on the site as a permanent document.
kc5tja
Posts: 1706
Joined: 04 Jan 2003

Post by kc5tja »

Mainly because it's not specific to the 6502; it applies to programming in general. These things have all been known for some time, but their interactions haven't been appreciated until recently. You'll want to Google for "extreme programming" and "test driven development" tutorials.

Also, you'll need to find (or write) a meaningful unit test runner to get the full benefit, but quick and dirty unit test runners can be written too.

I plan on writing a tutorial using my CUT package (a unit test system I wrote for the C language), but goodness, it's been close to 10 years in the making... ;D So don't wait up for me.

EDIT

http://video.google.com/videoplay?docid ... 0081075324

An excellent video discussing behavior-driven development. Although it's centered on Ruby, the concepts are language-agnostic. You might be interested in seeing it. Approximately 1 hour long.
fachat
Posts: 1124
Joined: 05 Jul 2005
Location: near Heidelberg, Germany
Contact:

Post by fachat »

I know this might not directly fit your requirements, but here is my approach to a turnkey-get-started-fast solution.

http://www.6502.org/users/andre/csa/gecko/index.html

André
Lost
Posts: 20
Joined: 07 Oct 2007
Location: Toronto, Canada
Contact:

Starter Kit thoughts from a novice

Post by Lost »

I think that I am part of the target market for your starter kit. I've been lurking on this board for about 6 weeks now (as well as several other 6502 places) because I've been toying with the idea of trying to design my own computer, as many of you already have. I have taken a couple of electronics courses in college and can solder somewhat, but this will be my first attempt at putting a digital computer together.

I want to say that smilingphoenix has got it right from my point of view. To start with I need an extremely simple system that doesn't look too difficult to put together. After doing a lot of research and poking around I've decided to go with a kit designed for schools to teach this sort of thing. 8Bit's SBC2 was a really close contender, though.

The system I'm planning on getting (probably over Christmas) is an Aptco breadboard computer (http://www.apatco.com/ncs_details). What made me go with this one is:

1. It comes with a manual. This is extremely important for someone just starting out. You've already talked about this, so I think you understand.

2. It is simple. The whole thing fits on a breadboard and the number of chips isn't daunting. The breadboard ranks pretty highly as a deciding factor: it's a lot easier to fix things if I wire something wrong or put a diode in backwards.

3. It teaches some important basic lessons about how to control and manage the devices. The use of simple logic gates to create the chip-select lines for RAM, ROM, and VIA access, along with sorting out the memory map, is a huge deal for the beginner. I've worked in the computer field for 20 years, but always in software, and I never understood how device access worked until I started looking at hardware projects like this.

4. I have to do both the hardware and the firmware. Having pre-configured ROMs doesn't teach me as much as having to create my own (even if I copy the assembly out of a book).

There are only two things missing (in my opinion) from the Aptco kit: a serial port (or other PC interface) and an EEPROM programmer.

The serial port will be my first attempt at adding to the system myself, but it would be nice to include it in the kit. As someone else mentioned, 9600 bps would be just fine. Heck, I'll be happy if I can drive it at 2400 bps.

The EEPROM programmer wouldn't be a problem if they weren't so expensive (unless I've been looking in the wrong places). For this I will probably use one of the designs I've seen posted here or elsewhere.

To start with, the 8K of ROM and 32K of RAM seems like plenty to me (though I'm already thinking of ways to increase both).
Post Reply