6502.org Forum




PostPosted: Wed Dec 16, 2015 7:08 pm 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8389
Location: Midwestern USA
randallmeyer2000 wrote:
Whoa; just noticed this morning, Garth's primer ( http://wilsonminesco.com/6502primer/ClkGen.html ) mentions "Read-not-write" pin for the 6502 and its difference from RD/ and WR/ pins. I'll double check the '816 schematic; "RWB" if I remember correctly.

The 6502 family uses a single read/write signal (RWB on the 65C02 and 65C816), which is high when in a read cycle and low when in a write cycle. This pattern is understood by all 6502 family peripheral devices, such as the 65C22 and 65C51, and was copied from the Motorola 6800 design.

Non-6502 devices often have separate /OE and /WE inputs to tell them when data is being read from the device (/OE is low and /WE is high) or data is being written to the device (/OE is high and /WE is low), a pattern used in Intel x86 hardware. /RD and /WR or equivalents (I refer to them as /RD and /WD in my schematics) are respectively attached to /OE and /WE. A simple circuit involving an inverter and two NAND gates can generate fully qualified /RD and /WD signals that will work with almost any non-6502 peripheral device.
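That inverter-plus-two-NAND circuit can be sanity-checked with a small truth-table model. The sketch below is mine (the signal and function names are assumptions, not taken from the schematic): the inverter supplies the complement of RWB, and each NAND qualifies one strobe with Ø2 so that neither strobe can assert while Ø2 is low.

```python
# Behavioral model of fully qualified /RD and /WD generation
# from PHI2 and RWB, using an inverter and two NAND gates.

def nand(a: int, b: int) -> int:
    """Two-input NAND gate."""
    return 0 if (a and b) else 1

def strobes(phi2: int, rwb: int):
    """Return (n_rd, n_wd), both active-low."""
    n_rd = nand(phi2, rwb)        # /RD asserts (goes low) only when PHI2=1 and RWB=1
    n_wd = nand(phi2, 1 - rwb)    # /WD asserts only when PHI2=1 and RWB=0 (inverter on RWB)
    return n_rd, n_wd

# Print the full truth table
for phi2 in (0, 1):
    for rwb in (0, 1):
        n_rd, n_wd = strobes(phi2, rwb)
        print(f"PHI2={phi2} RWB={rwb} -> /RD={n_rd} /WD={n_wd}")
```

Note that with Ø2 low, both strobes are negated regardless of RWB, which is exactly the qualification that keeps write strobes from glitching during the address-setup half of the cycle.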

Attachment: read_write_qualify_reduced.gif (/RD and /WD Generation)

Quote:
(I note that he recommends 74ACxx logic in some places, where there is some debate on this forum. I am foolishly and naively stumbling through the timing diagrams, as I presently write, and so I can offer no comment on slew times and such of the 74HCxx or 74ACxx family).

Use of the fast logic families (74AC, 74F and 74ABT) can be a double-edged sword. The very low propagation delay of these devices adds performance in circuits with elaborate logic requirements. However, they produce very rapid output transitions, which can trigger ringing and noise issues. I used 74AC logic in POC V1 to avoid prop delays that would hold back performance, with full awareness of the potential for ringing and noise. Ringing per se isn't fatal to circuit behavior as long as the voltage excursions it causes don't routinely exceed the maximum input ratings for the device, or push the input logic levels into the "no man's land" of the device's switching characteristic (i.e., the voltage range between the device's Vil and Vih specs).

Noise can be quite a problem, especially if the power and ground distribution is inadequate and/or insufficient bypassing is present. Ringing and noise are counteracted by good circuit layout and construction practices, as well as use of a true ground plane on the board (e.g., by constructing on a four-layer PCB). There's a lot of discussion on the forum about this aspect of construction.

One significant advantage of the 74AC and 74ABT logic families is their strong output drive, which is of value when significant circuit loading is present. 74ABT, in particular, is very strong and hence finds a lot of use in bus drivers and transceivers.

Quote:
The data bank addressing (on the data pins ! Arghhhh!) alone is enough to give me a headache.

It's not that difficult. The basic circuit involves a 74xx573 latch gated by the inverted Ø2 signal. The latch's output generates the A16-A23 address component, which is applied to the RAM. WDC shows an example circuit on page 46 of the 65C816 data sheet.
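A behavioral sketch may make the demultiplexing clearer. This is my model of the scheme described above, not the WDC reference circuit verbatim: while Ø2 is low the '816 drives A16-A23 onto D0-D7, and the '573, whose latch-enable is the inverted Ø2, is transparent; when Ø2 rises, the latch holds the bank byte while the data pins carry actual data.

```python
# Model of a 74xx573 transparent latch capturing the '816 bank byte.

class Latch573:
    def __init__(self):
        self.q = 0

    def update(self, le: int, d: int) -> int:
        if le:            # transparent while latch-enable is high
            self.q = d
        return self.q     # holds last value when le is low

def bus_cycle(bank: int, data: int):
    """One '816 bus cycle; returns (latched A16-A23, data during PHI2 high)."""
    latch = Latch573()
    # PHI2 low: D0-D7 carry the bank address; LE = inverted PHI2 = 1
    latch.update(le=1, d=bank)
    # PHI2 high: D0-D7 carry data; LE = 0, so the bank byte is held
    latch.update(le=0, d=data)
    return latch.q, data
```

The latched byte is what gets applied to the RAM's upper address inputs, giving the full 24-bit address for the remainder of the cycle.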

Quote:
How difficult is interfacing to Blu-ray?

A word of advice. I don't want to discourage you in any way but I think you are rapidly raising the level of complexity to a point where you won't be able to get your system working. As I have often suggested, it's best to learn how to fly a single-engine plane before stepping into the cockpit of a 747 and taking off for Tokyo. :D

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Wed Dec 16, 2015 10:33 pm 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8510
Location: Southern California
randallmeyer2000 wrote:
6522's are making more sense to me, as are the KAC-9628 (image sensors). One nagging thing, for both, is the "programmable" aspects; both have internal registers and I am at a loss to explain how any of them are accessed or used.

As for access, the 6502/'816 use "memory-mapped" I/O, meaning the I/O ICs appear simply as memory, at a memory address. Per your address-decoding logic and the register-select (RS) inputs of the '22, each register will be at a particular address. Don't use the actual numbers in the code. Just make the name an equate (with the EQU assembler directive) so every time you use the name, the assembler will substitute the numerical address at assembly time. Here's an example of what you might have in your source code:
Code:
VIA_BASE_ADR:  EQU  $4800       ; Base address of the 6522 VIA.
                                ; Addresses of 16 registers in 6522.
PB:     EQU  VIA_BASE_ADR + 0   ; Port B
PA:     EQU  VIA_BASE_ADR + 1   ; Port A
DDRB:   EQU  VIA_BASE_ADR + 2   ; data direction register for port B
DDRA:   EQU  VIA_BASE_ADR + 3   ; data direction register for port A
T1CL:   EQU  VIA_BASE_ADR + 4   ; timer 1 counter  low byte
T1CH:   EQU  VIA_BASE_ADR + 5   ; timer 1 counter high byte
T1LL:   EQU  VIA_BASE_ADR + 6   ; timer 1 latch  low byte
T1LH:   EQU  VIA_BASE_ADR + 7   ; timer 1 latch high byte
T2CL:   EQU  VIA_BASE_ADR + 8   ; timer 2 counter  low byte
T2CH:   EQU  VIA_BASE_ADR + 9   ; timer 2 counter high byte
SR:     EQU  VIA_BASE_ADR + $A  ; shift register (a synchronous-serial port)
ACR:    EQU  VIA_BASE_ADR + $B  ; auxiliary control register
PCR:    EQU  VIA_BASE_ADR + $C  ; peripheral control register
IFR:    EQU  VIA_BASE_ADR + $D  ; interrupt flag register
IER:    EQU  VIA_BASE_ADR + $E  ; interrupt-enable register
PANOHS: EQU  VIA_BASE_ADR + $F  ; Port A, but with no handshaking, if handshaking is enabled

This accomplishes a couple of purposes. The names are far more meaningful to a human, and if you later want to change the memory map or apply the code to a different computer with a different address map, you don't have to change all the places in the code that refer to the registers—only change the EQUate list near the beginning. As it is above, if you say for example STA DDRA, for "store the accumulator's contents in data-direction register A," the assembler will assemble 8D 03 48, $4803 being the address of DDRA.
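The arithmetic behind that assembled instruction can be checked in a few lines. This is only an illustrative sketch of what the assembler does with an absolute-addressed STA (the opcode value $8D and the little-endian operand order are standard 65xx facts; the helper name is mine):

```python
# Check that "STA DDRA" (absolute) assembles to 8D 03 48.

VIA_BASE_ADR = 0x4800
DDRA = VIA_BASE_ADR + 3        # data direction register for port A

STA_ABS = 0x8D                 # 65xx "store accumulator, absolute" opcode

def assemble_sta_abs(addr: int):
    """Opcode followed by the operand low byte, then high byte (little-endian)."""
    return [STA_ABS, addr & 0xFF, (addr >> 8) & 0xFF]

print([f"{b:02X}" for b in assemble_sta_abs(DDRA)])   # ['8D', '03', '48']
```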

Quote:
I think, in both cases, the "default mode" will happen on startup of the devices, and I won't have to worry about "individually addressable bits" (for the 6522) [...]. Hopefully, programming--for these devices--is a thing I can ignore.

You pretty much always have to set up your I/O. RES\ specifically puts all the VIA's peripheral interface lines in the input state, and disables the timers, shift register, interrupting, etc. That means you can't even output anything until after you write to one or more control registers, like a data-direction register.

Quote:
Whoa; just noticed this morning, Garth's primer ( http://wilsonminesco.com/6502primer/ClkGen.html ) mentions "Read-not-write" pin for the 6502 and its difference from RD/ and WR/ pins. I'll double check the '816 schematic; "RWB" if I remember correctly.

Or, better said, "read, write-not" (so you don't confuse it with the opposite, "read-not, write").

In WDC lingo (and nobody else's, as far as I've found), the "B" stands for "bar," the overbar that makes the logic negative. In this case, the "write" is the negative logic, and the bar goes over the W, but not the R. It's the same for all the 65-family parts.

Quote:
(I forgot to mention; the image sensors are "programmable" for 8, 10, or 12 bit depth, i.e. in the Analog to digital conversion of pixel values. Also, I neglected any CFA (color filter array) interpolations.... I don't remember/know if that interpolation is done by the Altera/FLEX-PLA or if that is done "on-chip".).

I have read the basics on handshaking and non-handshaking reads and writes. I hope the 6522 is not limited to 8 bits at a time, handshaking, painstakingly until the data load is handled/directed/stored. One hopes there would be a way for the image sensor to say "here it comes" and the microprocessor to say "OK, send it all", and then "poof", my biologist-brain needn't worry about how to get the 1s and 0s from the image sensor into the RAM and/or permanent/mass storage.

You could still use the built-in handshaking for more than 8 bits if desired. Let's say you have ten bits, and hardware handshaking set up on port A. When you're sending, put the other two bits on port B first, then when you write to port A (which sets the "data available" line), the whole thing is there. When you receive and an interrupt says there's a new word available, read port B first, so that when you read port A (which tells the transmitter, "Thanks. Got it" (via the "data taken" line), you've already gotten the other two bits, so it's ok if they get immediately changed.
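The ordering rule in that scheme (extra bits on port B first when sending; port B read first when receiving, since touching port A is what fires the handshake) can be captured in a toy model. Everything here is illustrative; the class and method names are mine, and the real handshake lines (CA1/CA2) are reduced to a single flag:

```python
# Toy model of moving a 10-bit word through a VIA with hardware
# handshaking on port A and the two spare bits on port B.

class TenBitLink:
    def __init__(self):
        self.port_a = 0
        self.port_b = 0
        self.data_available = False   # stands in for the "data available" line

    def send(self, word: int):
        assert 0 <= word < 1024
        self.port_b = (word >> 8) & 0x03   # upper two bits go out first
        self.port_a = word & 0xFF          # writing port A asserts the handshake
        self.data_available = True

    def receive(self) -> int:
        assert self.data_available
        high = self.port_b                 # read port B first...
        low = self.port_a                  # ...then port A, which acks "data taken"
        self.data_available = False        # port B may now change safely
        return (high << 8) | low
```

The point of the ordering is visible in the model: by the time the acknowledging port-A access happens, the port-B bits have already been captured, so the transmitter is free to change them immediately.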

There is always handshaking though. Even in the case of something like RS-232 using neither RTS & CTS nor XON & XOFF, there would still be an agreement between the transmitter and receiver regarding speed, so that the transmitter knows that as long as it doesn't exceed a certain rate, the receiver can handle the data as fast as it comes. That's kind of wasteful though, because the margin could be teensy or it could be huge, and continuously changing. A big margin would waste a lot of time as the line sits idle between bytes. Handshaking makes sure that every byte gets through, without unnecessary idle time, even while transmit and receive throughput capabilities keep varying. It is common to have buffers at both ends.

Quote:
Is there some way to hook the KAC-9628 digital video output directly to some RAM, and thus eliminate the need for the microprocessor and I/O chip to handle the info?

Now you're talking about DMA (direct memory access). There are a few topics here on DMA.

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Thu Dec 17, 2015 5:51 pm 

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Thanks BDD. I know there is a lot in yesterday's response for me to obsess over, and follow, verbatim. I just have a hunch about it!


PostPosted: Thu Dec 17, 2015 6:33 pm 

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Quote:
A word of advice. I don't want to discourage you in any way but I think you are rapidly raising the level of complexity to a point where you won't be able to get your system working. As I have often suggested, it's best to learn how to fly a single-engine plane before stepping into the cockpit of a 747 and taking off for Tokyo. :D


This is "Randall Airlines"; we left for Tokyo years ago!

But you are right. I am simultaneously (1) trying to extend Moore's law for 50 more years, (2) build Strong AI, from scratch, with nothing but Popsicle sticks, and (3) build a functioning, vintage computing device.

If I had to cut something out, it would be #1. The other two, it seems, are non-negotiable and inextricably intertwined.

Are you familiar with "Le Scribe Accroupi"? A famous statue. Jay Enoch, a noted academician since the 1950s and 1960s, studied it in the 1990s or 2000s. I read an article of his; I will try to post a page, if/when I get a chance, concerning the seated scribe. https://en.wikipedia.org/wiki/The_Seated_Scribe. https://en.wikipedia.org/wiki/UC_Berkeley_School_of_Optometry (Dr. Enoch was Dean, 1980-1992, and known for his study of the Stiles-Crawford effect https://en.wikipedia.org/wiki/Stiles%E2%80%93Crawford_effect ).

So, is it beyond me? Yes. Is it seated somewhere in the "je ne sais quoi" of humanity, to seek a Golem, an automaton, an avatar, a robot, a Doppelgänger, Adam (from soil) and Eve (from his rib), Talos (of ancient Greece), a Frankenstein? Yes, I think so.

So, I know you probably have limited time to add to a folly like this, but I have to wonder if there is anything else worth doing? A lot of what goes on, in life, in my estimation, is passing time, and looking for a few simple comforts at the end of a long and arduous (in the past, physically arduous, but in our stressed out post-post-post-modernism, increasingly it is intellectually arduous) day at work. And "advancement" can't take place in a static environment; a place where all good ideas are assumed to have been "tried already".

I was born in 1980. The image sensor was invented in 1969. The "image sensor revolution" didn't occur until about the year 2000-2001 (That is when I got my first digital camera, and took about 300 pictures of marine invertebrates, in my 3 month study abroad semester in Bermuda). There are avenues of exploration that have not yet occurred; I am going to stumble down one of these (and, it seems, so far, you are going to help! hahah! Funny how things work, huh?).

You have the burden of knowledge (i.e. especially that of the exact thing the 6502/816 is immediately and unquestionably capable of). I am blessed with a lack of this burden of knowledge (yes, I just insulted myself!). I don't know what is impossible, yet, or even, for that matter, what is possible. But I know what inspires me, and I might be "artistic" in the way I go about it, but that is a scientist's prerogative (at least, I think it is? It used to be that way? Maybe all the REAL scientists have turned into engineers?).

Patent law and literature has a term "reduction to practice". Having envisioned the thing that I want, and need, I must go about manifesting it. The technology is there; it just needs to be BENT, with all the strength I/We possess, towards the endpoint (as I have defined it).

Have you ever thought about the term "planar process"? I, suddenly, have need for a "slightly-less-than-planar process". In fact, I need a "spherical process". Anybody want to loan me a few billion dollars for a new semiconductor fab? I thought not. Maybe it's impossible; maybe silicon retinas would have aberrant electrical characteristics because a hemispherically shaped image sensor would cut across the crystal plane in odd ways. Maybe the material removed from such a hemisphere of single-crystal silicon would be so much that a single image sensor cut from the boule would be prohibitively expensive? Maybe, maybe, maybe. Maybe a geodesic approximation of planar image sensors would be "good enough"?

Whoa boy; I have just "techno-ranted". I trust it made sense (mostly!). In short, thanks for your help!

(P.S. There is a patent application out there for slightly curved image sensors. I forget the number at the moment; a Japanese company? Maybe Toshiba? Mitsubishi? Panasonic? I forget.)

P.P.S. It is not my intention to make a non-functioning device, so the advice is taken and heeded. Maybe, if I am lucky, I will keep the KAC project separate from the 65816 project, and thus one will work even if the other remains mere "aspiration". Ideally, in the end, both will function and, if I'm lucky, "dovetail" nicely.


PostPosted: Thu Dec 17, 2015 6:38 pm 

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Note: functioning lenses in the Seated Scribe. Some of the oldest, if not THE oldest, confirmed lenses that mankind has constructed. Also, copper retinas, to scare the thieves out of the pharaoh's tomb! The torches would shine back, reflected from copper retinas, through the lenses! Amazing stuff!!


PostPosted: Thu Dec 17, 2015 6:52 pm 

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Itemized today's comments: (still, they are very scattered thoughts for an "itemized list", but I look at the academic papers I have been reading these past ten years and, despite my intention to study evolutionary biology until that unfortunate day when I expire, I think to myself, "What hath God wrought"!).

1) OK, last night I took home a copy of all the posts made to this thread. Read them, fairly thoroughly. I'm struck by the progress I have made and appreciative of your gentle guidance.

Aside from one lengthy digression into physical theory and transistor operation, I have stayed, remarkably, on-task. Well, except for the image sensor project, but that is my normal, standard obsession and, I think, could not be separated from this project even if I tried.

Here is the link to the KAC-9628 data sheet. http://media.digikey.com/pdf/Data%20Sheets/Eastman%20Kodak%20Co%20PDF/KAC-9628LongSpec%5B1%5D.pdf . I should have provided it a long time ago.

I was happy to read, again, your advice on files and compression (JPG, PNG, GIF), as that is what I hope to accomplish with the WDC65816/KAC9628 project. An FPA/FSA (Focal Plane Array/Focal Surface Array) produces a lot of data (as the human retina must, i.e., 120 million rod and cone cells (Osterberg, circa 1930), or 90 million cells (Curcio et al., circa 1990)). So, while you were just being helpful regarding the mechanics and particulars of PCB pattern generation, I felt I should mention how helpful it is to have "software people" looking at this project (I don't mean that as a pejorative. I know that 6502-ers are, generally speaking, equal parts software and hardware. It just so happens that software is a bit of a mystery to me, especially so when the discussion of it is divorced from discussions of the hardware. And even then, as you might have noticed by now, my grasp of hardware is tenuous, at best.).

If I am to accomplish my goals--life goals, not just goals for this project--I must begin to research video file formats. (I am afraid the complexity of my knowledge in this field does not go much further than the cursory study, in a Radio Shack "Introduction to analog circuits" handbook, of television signals, and a few bits and pieces, told to me by a friend who knows, and subsequently verified in books and on the internet, of how the color TV signal was overlaid or "piggy-backed" on the original B-and-W signal.) Beyond this rudimentary knowledge of commercial broadcast, consumer (propaganda?) electronics, I know very little about the nature of the data that gives rise to "moving images". I suppose that's not entirely true; I bought, and read much of, Jim Janesick's "Scientific Charge-Coupled Devices" (copyright, circa 2001).

So capture, I know. Transmission, I sort of know. Screen production, (i.e. cathode ray tube, raster lines, odds and evens, etc.) I sort of know. But HD-this, and HD-that, and DVD/Computer? VGA, XVGA, etc etc ? is a mystery to me!

2) I should point out, again, that Van der Spiegel et al. (circa 1989)--though I had thought of a foveated sensor before I found and read that particular article, in 2006--would rightly be considered the first attempt to do what I am proposing to do. "Hardware compression" might be a term that we could use.

Oyster (textbook "The Structure and Function of the Human Eye", 2000) discusses the "data compression" in the human eye, in vivo. Though, I think it is safe to say that, while they are still speaking the language of science, biologists, anatomists, neurologists, ophthalmologists, optometrists, and histologists definitely have their own dialect. The data and information imparted by these specialists is not PRECISELY the same as that offered by physicists, chemists, optical engineers, and electrical engineers. It can take some significant math and "glue logic" (I like that phrase, so I have co-opted it! "Glue logic"!), i.e., theory and assumption, to get the "numbers" to "talk to each other". That is, the juxtaposition of measured values in these fields must include significant conversion of one data set or another, and it must be kept in mind that this data is almost always collected under experimental conditions particular and peculiar to a given research program.


(The data that I am referring to, and that which comes to mind when I think of the "apples" of your world and the "oranges" of mine, is the data for visual acuity. Sometimes it seems that every single researcher who has ever studied the subject has/had their own standards and purposes for the data, and it can be maddening to figure out what any one researcher MEANS when he/she uses a visual acuity value. I think Lorrin Riggs, a researcher at Brown University for many years, had a good article on the different types of visual acuity measurements, published, I think, in Graham (circa 1960?).)

Now, luckily, a properly controlled experiment states the "givens" and the "assumptions", and many of the experimental programs assume the same things, and control for the same variables and variations found under natural circumstances, for simplicity's sake.

But I am digressing from the main. In short, video compression and ANALYSIS are what I hope to accomplish with this project. I realize newer designs with newer processors, DSPs, graphics processors, and floating point processors are probably preferred, but I think this present hardware project (65816/9628 FPA/FSA)--as complex as it certainly will be--is within my grasp and capabilities.

3) I looked at timing diagrams last night (really, for the first time, in depth), and started to understand them: RWB assertion, address valid times, the '816 data/address multiplex. Still thinking it all through and trying to come to some conclusion on the OE pins. (Under a different plan, I might have been able to get away with tying OE low, but I don't think it is right for this project. So, I must come up with some solution for this. I think it will involve BDD's circuit with the 74AC74 (74ABT74 preferred, but I've already purchased the AC) and the 2X-frequency oscillator can, producing a Phi1 and Phi2: a 50-50 clock, two signals precisely out of phase.)

4) Rethinking the decoding circuit, presently, as I think I might need all the memory space I can get (despite Garth's assurance that a newbie probably won't use all 24 KB of ROM or 24 KB of RAM in my previous plan). I will post the previous plan, if you want to follow my reasoning, but I am revising it to the "Garth Primer Plan" (the 74xx688, I think, was the part number for this particular circuit) that gives 32 KB of RAM and 32 KB of ROM, less the requisite I/O space.

But it is also about time that I think about the four expansions I could include in an '816 plan. Since my board will go "multilayer" now (it was tough to get it all on two layers of one board; not just tough, maybe impossible!), and since Garth reminded me that the expansion board is 32 Mb of fast RAM (not 32 MB!) per module, I might not be greedy about a little bit of "lost space" in the memory map if it'll make the decoding easier. I mean, I should keep the memory map fully functional for both native and emulation mode (sufficient I/O, RAM, and ROM for both modes) and squeeze as much bandwidth/throughput into the design as possible.

In short, I must study the timing much more than I have. I am just getting my feet wet, now.
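For what it's worth, a 32K/32K split with a carved-out I/O window can be prototyped in software before committing it to comparators and gates. This sketch is a hypothetical memory map of my own choosing (a $8000 ROM boundary and a 2 KB I/O window at $4800), not the poster's final plan or the exact '688 circuit:

```python
# Behavioral sketch of a 32K RAM / 32K ROM decode with an I/O window.

def decode(addr: int) -> str:
    """Return which chip select fires for a 16-bit address."""
    addr &= 0xFFFF
    if 0x4800 <= addr <= 0x4FFF:   # I/O window carved out of RAM space (assumed)
        return "IO"
    if addr < 0x8000:              # A15 low: RAM
        return "RAM"
    return "ROM"                   # A15 high: ROM

# Spot-check the map's corners
for a in (0x0000, 0x47FF, 0x4800, 0x4FFF, 0x5000, 0x8000, 0xFFFF):
    print(f"${a:04X}: {decode(a)}")
```

Walking a model like this over the corner addresses is a cheap way to catch overlap or gaps in a decode scheme before wiring anything.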

5) Also, after studying the I/O interface to the 65816 I find that I need to study some other designs for image sensor/microprocessor interface, to see what other people "normally do". Then, if I am familiar with an option, or two, or three, I can choose from what people "normally do" without radically re-inventing video processing/interface.

6) I am posting a diagram of my "multilayer PCB hack". Do let me know if the idea is insane, i.e. can't work. It is just two double-sided PCBs screwed together with a blank (no clad) perf board between them. It should work and should give me a decent ground plane (I think?).

I suppose I could hire out the PCB work. If I get my circuit running, I could do a second one and hire it out to "make it look pretty".

7) I did read the section on "forum etiquette", but I had probably already stepped all over the rules before I read them. I endeavor to follow the rules and social conventions, but I also must acknowledge that, as a scientist, it is my job to circumvent them when it seems absolutely necessary. I mention this now as I think, at this point in the thread, it should be clear that I am trying to "build a skyscraper out of Popsicle sticks". My only excuse is "politics". I can't delve into it too far without descending into a vicious rant and screed, so I will spare you here, since I value the intellectual fertilization that this site provides (and even I tire of reading my complaints). I think I have elbowed enough of my political adversaries (as a youth, I didn't know I even had these!) out of the way that I might be able to accomplish something, sometime soon!

8) So, in summation, the FPA/FSA imager would best be done with a multichip module ( http://forum.6502.org/download/file.php?id=2948&mode=view , http://forum.6502.org/download/file.php?id=2946&mode=view ), and in some sort of geodesic shape, a Goldberg polyhedron ( https://en.wikipedia.org/wiki/Goldberg_polyhedron , https://www.jstage.jst.go.jp/article/tmj1911/43/0/43_0_104/_pdf , and, formerly an awesome site but presently somewhat diminished in functionality, http://polyhedra.org/poly/ ).

My calculated Petzval/Coddington curvature ( http://forum.6502.org/download/file.php?id=2971&mode=view ) should be assessed, and questions about the functional human eye, in vivo, should be discussed. The work of G.H. Gliddon (Dartmouth and Rochester U., circa 1920s) should be assessed and re-done with modern equipment and optics (low-refractive-index plastics/polymers, ideally, but CaF2 might be a low-cost solution). The image sensors should be back-illuminated, probably CCD, though CMOS, I hear, is making strides to overtake the "industry standard" CCD. Backside thinning should be investigated, though the thinning of the silicon might (WILL!) cause the sensor to curl up on itself, and straightening it out might require a bulkier package and reduce the space for fitting each sensor into the array. So, maybe too bulky for an FPA/FSA? Finally, one should note that the smaller the image sensor and package, the closer the "linear", planar ICs can be made to fit the curved surface of the "robot retina".

This is how the science SHOULD be done. Why isn't it being done this way? Politics.


Attachment: Leonardo_polyhedramiror final.png (Just a joke; a bit of art. Adapted from Leonardo da Vinci's drawing of a sectioned rhombicuboctahedron, published in Pacioli's mathematics text.)
PostPosted: Thu Dec 17, 2015 7:42 pm 

Joined: Fri Aug 30, 2002 1:09 am
Posts: 8510
Location: Southern California
randallmeyer2000 wrote:
6) I am posting a diagram of my "multilayer PCB hack". Do let me know if the idea is insane, i.e. can't work. It is just two double-sided PCBs screwed together with a blank (no clad) perf board between them. It should work and should give me a decent ground plane (I think?).

I suppose I could hire out the PCB work. If I get my circuit running, I could do a second one and hire it out to "make it look pretty".

Fortunately, the price of custom, commercially made PCBs has come down within reach of the hobbyist now. Going this direction rather than making your own, you can save time, get a much more reliable product, and get density that you can't achieve at home in a PC board.

_________________
http://WilsonMinesCo.com/ lots of 6502 resources
The "second front page" is http://wilsonminesco.com/links.html .
What's an additional VIA among friends, anyhow?


PostPosted: Thu Dec 17, 2015 11:10 pm 

Joined: Thu May 28, 2009 9:46 pm
Posts: 8389
Location: Midwestern USA
randallmeyer2000 wrote:
Quote:
A word of advice. I don't want to discourage you in any way but I think you are rapidly raising the level of complexity to a point where you won't be able to get your system working. As I have often suggested, it's best to learn how to fly a single-engine plane before stepping into the cockpit of a 747 and taking off for Tokyo. :D

This is "Randall Airlines"; we left for Tokyo years ago!

But you are right. I am simultaneously (1) trying to extend Moore's law for 50 more years, (2) build Strong AI, from scratch, with nothing but Popsicle sticks, and (3) build a functioning, vintage computing device.

If I had to cut something out, it would be #1. The other two, it seems, are non-negotiable and inextricably intertwined.

I'm not sure if you are fully understanding my point, so I'll elaborate a bit, with the hope that nothing that follows is condescending in any way.

I've worked on digital hardware for more than 45 years, and electronics even longer (I started as a child with this stuff), but it wasn't until 2009, when I was in a downward health spiral and seeking something to do that wouldn't be physically taxing, that I got a burr under my saddle and decided to build some kind of homebrew computer. I know a lot about electronics and computing in general, including knowledge acquired from some 35 years of experience with UNIX and the kinds of machines on which it is/was run. I'm in the business of building servers and high end workstations. I have several U.S. patents related to electronics. All of this is to make the point that it wasn't as though I would be working from a position of ignorant bliss and no understanding of what those electrons were up to when the switch is flipped.

Nevertheless, I made the decision to proceed in stages—and start by taking flying lessons in a Piper Cub, as it were. I took the route of starting simple, making sure I fully understood the behavior of what I had created before moving on to the next logical step. I'm not the only one around here who has taken that route. In fact, before I did any design work I read a lot about what others had done, which was quite instructive and helped me avoid the pitfalls that had gotten others.

Before you can hope to rig up image acquisition hardware, Blu-ray and all that, you need to have a solid foundation of basic digital logic under your belt, and you need to build a basic system that can compute and run in a stable fashion. Although 65xx hardware is relatively simple in concept when compared to, say, x86 or PA-RISC stuff, it's still not a cakewalk to master by any means. Your previous scribings suggest to me that there is quite a bit about digital logic, and the welding of it into a working system, that you don't yet know. In that regard, you're in (presumably) good company, as all of us started at that point at some time in the past.

So of your list, yes, number one (extending Moore's law) should be out of the equation, as by definition, anything built with 65xx hardware will be well behind the Moore's law curve. Numbers three and two need to be interchanged. To use yet another metaphor, the foundation for your new house needs to be poured and the concrete needs to have cured before you can start framing, roofing and laying brick. Summed up, you can't install the kitchen appliances before having a place to call a kitchen. :D

_________________
x86?  We ain't got no x86.  We don't NEED no stinking x86!


PostPosted: Fri Dec 18, 2015 9:29 pm 

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
No, BDD, it's not condescending. No worries. To me, sequential circuits are still very much a mystery. I'll confess to looking at the flip-flops and latches in my "Wakerly" digital design textbook and reading the first and last sentences in each section. Then, loudly proclaiming "I understand that!", I march forward, run into flip-flops and latches again, and have to ask myself: do I? Do I really understand them?

I have enough calculus and algebra to get me through the signal analysis. I have enough physics (maybe?) to understand the propagation and PCB construction. But I just need to study the digital design (i.e., the specific implementations and functionality of transistors, in specific ICs, made by specific manufacturers at specific times). I hope I don't offload too much of that workload; it can't be outsourced! (Famous reply of Euclid, when asked by royalty if there was an easier way to learn math: "There is no royal road to geometry.")

Not laziness, per se, but it has been a subject that has not been made a priority, by me (for various reasons). I am now making it a priority.

It's probably clear I am a jack of all trades, master of none. But I take my history, and science, and philosophy seriously. And reduction to practice (the patent mantra; though recent patent law changes might make that different now?) is my aim.

I will think again, and try to take it slower, with reasonable goal posts. I like the history and technology and I understand that my part in it is small. But I also know it is important, and if I don't do it--push for it--it won't get done.

Thanks again, for your appropriate concerns.


PostPosted: Fri Dec 18, 2015 9:39 pm 
Offline

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
(In response to private message; but I liked the question and answer so much, I decided to share, here; I hope that's OK? It should be.)

Yes. Awesome. The correct question.

There is an easy answer, but I must digress. I have, on several occasions, happened upon wonderful books discussing, in depth, the state of knowledge of the evolution of eyes and/or light sensitivity in Animalia, and biota in general. I wish (a) I had the money to run out and purchase these crucial references... though I sometimes still remember the precise place in the stacks and the color of the binding of the book that gives the answer, and sometimes I even remember the title... and/or (b) the memory to recall what was learned in the brief time I held the book in my hands. (I probably took good notes, but my filing system precludes any reasonable retrieval. I am preparing to write a book, and have been since I left my undergraduate career and began preparing for graduate-school entry.)

Shorter answer? Nature does it in different ways. Like twenty, or so. An awesome convergence of design principles: analogy, as a general rule, not homology (though, to be sure, there is a bit of both, if memory serves me correctly).

One thing about the computer and digital and internet revolution is the "cheapness" of thought. I am afraid I succumb to such, and your Pascal quotation is apt and diplomatic and warranted. At least I am in good company.

Similar cheapness of thought in the latter days of the printing press. All this paper and "capacity" and few scholars (due to previous century of Black Death, according to James Burke's history treating that particular subject) to fill the pages! Well, that was rectified rather quickly, with stupendous and sometimes horrifying results!

Eyes; monophyly https://en.wikipedia.org/wiki/Eye#Evolution ; https://en.wikipedia.org/wiki/Evolution_of_the_eye#One_origin_or_many.3F

Benefits to curved image sensors? With lenses being what they are, NONE! But, if one considers lenses BEING WHAT THEY WERE, well that changes the equation.

The proper reference in this regard is Rudolf Kingslake's "A History of the Photographic Lens", and specifically the first few chapters. It is a wonderful read, with minimal technical knowledge necessary. I will pass on a page or two, maybe on the forum. (I do like my thoughts "open source"; I am coming off a recent "career"--if one calls it that--as an "ugly-duckling crank scientist" (about 10 years), a scientific-radical-liberal radio DJ (about 2 years), and a third-party candidate for the U.S. House of Representatives for the State of Vermont (6 months, though I didn't campaign; I thumbed to Boston, from Vermont, and went to the National Archives to read the MIT Rad Lab docs from WWII and, more to the point, the Harvard Optical Laboratory documents concerning "Fluorite Apochromatic lenses for aerial photography"), so I am desirous of being rather "in the limelight".)

I guess, what I am saying, is I like my thoughts copious, transparent, deep, unstifled, and--your point is taken--circumspect and organized. I suppose that is asking a lot of myself, and others.

I'll slow down, in the near future, as it is the holiday season, and, in any event, I must bathe and steep in the timing diagrams. I must absorb them by osmosis. And, above all, I must order a few more parts, draw a few more drawings, start plating, and etching and soldering. I gotta get something running, or it WILL BE an endless conversation, and not just SEEM like one!

Quote:
I'm not sure why it should be advantageous to have a curved imaging surface. It's what we find in eyes, but then eyes belong in sockets and like to swivel, so a more or less spherical shape is just what's needed. Of course, optics is not my speciality subject - but can you say why you believe this is a good approach?


SHORT ANSWER:

Lenses naturally produce a curved image. It took lens designers and glass chemists careful thought and attention to flatten the field (i.e., the image field: the surface of optimum focus upon which all or most of the image points fall). Some tried a curved platen (cylindrical; a few tried spherical, but papers had to be specially made, so the cylinder only corrected in one direction: +/- X, but not +/- Y). Others just lined up the subjects along a city street corner, so that the subjects, i.e., the people in the periphery, were standing where the lens "wants" them to stand.

The Petzval curvature (also Coddington curvature, though nobody calls it that, even though Henry Coddington probably has priority) and the equation describing it are intimately associated with the equations for the aberration of astigmatism. The Seidel aberrations are another topic you could check on Wikipedia if you so desired. Correcting for field curvature tends to correct for astigmatism too, but correcting for both, as I recall, is a balancing act.

Suffice it to say just about any lens purchased on the market is designed to place a flat image on a flat piece of film; and now, of course, sensors.

Is there a good reason for curved sensors? I don't know ("evolution" thinks so!). When I started the project, I planned only on a foveated imager (dense receptors in the central region; less dense in the periphery). When I found that those existed (a 3-to-5-year journey just finding the right papers, on my own, with no graduate advisor and/or access to a premium research university), I moved on to seeing if I could make them better. The photoreceptor densities then (the 1980s and 1990s, when the papers on retinomorphic sensors were written) were not very good, and I have yet to see one that approaches the human eye (120 million receptors, Osterberg, circa 1930; or 94 million, Curcio et al., circa 1990; the eye's effective output is probably about tenfold less, as the "on-chip data compression" is substantial! hahaha! I never did find a definitive answer to "How many megapixels is the human eye?", as the "apples" and "oranges" made a difficult comparison. But I collected many different data sets and normalized a few values to try to get closer to an answer.)

But before I got a good answer, suddenly the market got crazy, the pixels got small, and vision was everywhere (I graduated college in 2000, and around 2005-2010, things got really intense. CMOS helped a lot as it forced CCD, the mature technology, to compete.). So, 120 MP camera, suddenly didn't seem nuts to me. Neither did a 94 million, or 10 million.

I studied a bit about the history of motion pictures and the flicker fusion frequency (the rate at which successive images blur together and create the illusion of motion to the human eye). So, 60 fps is a good rule of thumb for human vision, though 30 fps, interlaced (for a cathode ray tube with a known phosphor decay time), is adequate too. I think Hugh Davson's "Physiology of the Eye" cites between 40 and 50 as the flicker fusion frequency, but, as always, his "apples" don't match computer science's "oranges": the study he cites mentions all sorts of experimental conditions and anatomical caveats.

So, not a very short answer, but, in the end, I decided it was foolish to try to planarize the foveated retina-sensor when the lens "wants" to produce a curved image in the first place. (I still might move back to planarized foveated retina sensors, but I will still need a special lens to deal with (a) barrel/pincushion distortion and (b) Petzval curvature/astigmatism.)

Also, the human optical system is (a) wide-field, (b) quite sensitive (high dynamic range), and (c) variable focal length (due to the flexible lens and accommodating musculature).

Besides, if curved retinas were good enough for your grandparents--and their grandparents' grandparents' grandparents--then why aren't they good enough for you?! Why, back in Adam and Eve's day, they never heard of a flat-field lens!

As for commercial application, it probably depends on how "human" you want your Strong AI to be. Must it see like a human? Human perspective? Euclid himself wrote on issues of human "optical perspective" (i.e., Euclid's "Optics", trans. Harry Edwin Burton, Journal of the Optical Society of America, circa the 1940s), and he didn't even have a good theory of light or images (Alhazen has the best claim to priority for a theory of images; the ancient Greeks were somewhat confused about the nature of light. David Park wrote a nice book, "The Fire Within the Eye", that I have been neglecting. Perhaps I'll read it over the holidays?).

I was almost toying with the idea of NOT buying computer parts with the money my father was sending me. I was almost going to get one of those books I have needed for years: "A Natural History of Vision" by Nicholas Wade (the scientist, not the science writer; the distinction has become one worth emphasizing, as the latter had a controversial book out last year). I should try to be clear about light and vision, as the history and philosophy are tangled ones, and not ones that seem very settled, even now. Most of us just "consent" to the weird world of Einstein and beyond (i.e., quantum physics).

Despite the ridiculous length (and near-irrelevance to my precise '816-9628-fsa project) of this, I might put this on the forum post. The method and madness and the man are one and the same. If I am going to fail in this project, I am going to do it in public. If I succeed, then others will have helped me, (clearly, as I lack any and all requisite resources.). If others succeed in doing what I have set out to do, that is OK too (but I would be a little bit disappointed). If NOBODY EVEN TRIES to do what I am trying to do here, that would be a REAL TRAGEDY!

Patent law recently changed. I almost decided to try to get a patent with my holiday money instead of computer parts. But I think I prefer to build it--reduce to practice--rather than try to profit. That "game" is a bit of a mess, which is why I asked PandaPro to draw me a patent troll (for the Technopoly/65-0-2-opoly game). Unfortunately, his ten-year-old brain violated the copyright on the drawn examples of patent trolls that I gave him, so we will have to try to tease some more creativity out of him! hahaha!

There will be time to try for a patent later. Reduction to practice is my present mantra.

I'd love it if there were a payday in here somewhere, but there just isn't. Someday there might be; I did the "legwork" (historical, philosophical, and now, technical). But I don't see a huge commercial market for "robots" since humans are cheap and plentiful (crass, but true. I didn't invent capitalism, and I never said it was ethical, or even legal.). Perhaps, someday, the free world will prefer "guilt free" manual laborers? But Strong AI robots have rights too? Don't they?

So, what I am doing is science, not business. Short and simple. Any help you can offer, despite my endless techno-philosophical rants, is appreciated.

And, as always, you have my promise; I will do my homework; good ideas will be incorporated into my project. I laid out my thoughts above, from the last 10 years, or so, and any ideas that come after this are some "open-source" version of yours and mine (you folks know who you are! You frequent "post-ers").

I am not prone to idle praise. I hope you know that (as I, lately, praise you folks at least once every post!).

Note: One last historical caveat: glass history and the Dartmouth Eye Institute. OK, this has to be a shorter story than I can actually tell; I will have to tell it again in another place, as it is pretty interesting. I'll just mention that G.H. Gliddon tried to make a scale-model functioning optical apparatus of the human eye circa the 1920s, at Dartmouth and the U of Rochester, but for lack of suitable low-index glasses had to scale the size up (I think it was 1.2 or 1.5x the size of the in vivo human eye). Furthermore, I have a just-post-WWI report that mentions that the war interrupted scientific glass supplies from abroad and that America had to ramp up research and production of such supplies for the crucial war effort. So, oddly, our country can sometimes be a little bit backwards. In our defense--as Kingslake's book mentions--the Germans (Schott, Abbe, and Zeiss) were just getting started in their science in the period of 1860-1880, and before that, optics was a bit of a scientific specialty, i.e., for telescopes. The invention of the Daguerreotype in the 1830s-1840s brought about a need for new glass and new lens design, just as the invention of the telescope had done one or two hundred years earlier (1740s, achromatic lenses, England, Chester Moore Hall, etc.).

Fascinating history! Sorry, so long! But history is a long thing!

EDIT: I'm sure a million typos above: I left them in. But one seemed "not good" to leave: "period of 1860-1880" in the paragraph directly above, not "1869-1880".

Whooops; I graduated college in 2002! (second edit!)


PostPosted: Fri Dec 18, 2015 9:49 pm 
Offline

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Ran out of computer time yesterday (because I "prattle" at the keyboard);

I meant to post the following schematics, so here they are.

Also, I awoke at 4 am and read the old posts (from this thread), and wrote a short response. I almost read through all of them. So, I might comment (tomorrow), as I am quickly running out of time today (Star Wars movie, today; let's hope Disney doesn't ruin the franchise! I have fond memories of my favorite space opera....childhood, etc. Where were you for Star Wars?).

OK, I had hoped to post something semi-intelligent about the following four schematics, but can't. My ride arrived, it is 4 PM, the movie is at 6:15, it's a 5-minute ride to town, and there's a 1-hour wait because it is STAR WARS.

Here are the schematics, I have to go.

Cheers


NOTE: One short note is necessary: this is the old circuit, and I am going to scrap it. Below is a closeup of the decoder ICs. I did not bring Phi2 into the circuit, but probably should have. 74HC138 and 74HC11 (3-input AND).


Attachments:
12 14 2015 Layer 4 of 4 decode closeup.png
12 14 2015 Layer 4 of 4 decode closeup.png [ 24.13 KiB | Viewed 1373 times ]
12 14 2015 Layer 1 of 4 decode closeup.png
12 14 2015 Layer 1 of 4 decode closeup.png [ 99.73 KiB | Viewed 1373 times ]
12 14  2015 Layer one of four top.png
12 14 2015 Layer one of four top.png [ 51.23 KiB | Viewed 1373 times ]
12 14  2015 Layer 4 of 4 bottom.png
12 14 2015 Layer 4 of 4 bottom.png [ 49.46 KiB | Viewed 1373 times ]
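On the Phi2 point in the note above: the usual reason to bring Phi2 into the decode is to gate the '138's active-high enable with it, so no chip select can assert until the valid half of the bus cycle. Here is a toy model of that idea; the pin assignments and the A13-A15 address split are illustrative, not taken from the actual schematic:

```python
# Hypothetical sketch: a 74HC138 3-to-8 decoder with its G1 enable
# driven by Phi2, so chip selects are qualified by the clock.
def hc138(a, g1, g2a_n=0, g2b_n=0):
    """Model a 74HC138; returns the 8 active-low outputs as a list.

    a            -- 3-bit select input (0..7)
    g1           -- active-high enable (tie to Phi2 to qualify selects)
    g2a_n, g2b_n -- active-low enables (0 = enabled)
    """
    outputs = [1] * 8              # all outputs idle high (inactive)
    if g1 and not g2a_n and not g2b_n:
        outputs[a] = 0             # only the selected output goes low
    return outputs

def chip_selects(address, phi2):
    """Derive a chip select from A13-A15, qualified by Phi2."""
    a = (address >> 13) & 0b111    # top three address bits
    return hc138(a, g1=phi2)

# With Phi2 low, nothing is ever selected, so writes can't glitch
# while the address bus is still settling.
assert chip_selects(0x8000, phi2=0) == [1] * 8
# With Phi2 high, address 0x8000 (A15..A13 = 100) selects output 4.
sel = chip_selects(0x8000, phi2=1)
assert sel[4] == 0 and sel.count(0) == 1
```

Whether Phi2 belongs on the decoder enable or further downstream (e.g., on the write strobe only) is a design choice; this just shows the qualification mechanism.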
PostPosted: Sun Dec 27, 2015 10:21 pm 
Offline

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Slightly bummed today. I had hoped to stack (and bolt together, with smooth, unthreaded "registration pins" so that all layers aligned nicely) a double-sided PCB to a bare board, to another double-sided PCB, and thus make a multilayer board without getting fancy (fancy, like on this website http://www.thinktink.com/stack/volumes/volvi/multilyr.htm), but I just checked the DIP pins and they won't reach through the 3/16" of board that would result from such a setup (i.e., 1.6 mm x 3 ≈ 4.8 mm).
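The thickness arithmetic above can be sanity-checked; note the ~3.3 mm DIP lead length below is an assumed typical value, not taken from any particular datasheet:

```python
# Quick sanity check of the stacked-board idea. DIP_LEAD_MM is an
# assumed typical usable lead length, not a datasheet figure.
BOARD_THICKNESS_MM = 1.6   # standard FR-4 double-sided board
LAYERS = 3                 # PCB + bare spacer board + PCB
DIP_LEAD_MM = 3.3          # assumed lead length below the package body

stack_mm = BOARD_THICKNESS_MM * LAYERS
assert round(stack_mm, 1) == 4.8      # ~5 mm, as estimated in the post
assert stack_mm > DIP_LEAD_MM         # pins can't reach through the stack
```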

Still thinking it might work, with some "finagling". Maybe with all the ICs on one side of one board, and pins only travelling through that one board. Then the ground plane on the back of that board. Then the bare layer. Then another ground plane. Then the last layer could have signals "jumper wired" through.

Just a thought. Maybe half-baked. Will "think and tinker" until it gets done right. Maybe a PCB house will, in the end, be the solution?

I read everywhere that "prepreg" is used for multilayer boards. Where do I buy this? I can find it nowhere. I have, however, found some "PCB-standard" copper foil, from Philmore/Datak. Oh well, I'll keep looking, in case I want to try to do a PCB the right way.


PostPosted: Sun Dec 27, 2015 10:53 pm 
Offline

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Have been focusing lately. Read my KAC-9628 (Kodak image sensor) PDFs and now understand much more. The chief obstacle to my understanding was that I2C is NOT for output of video data! It's only for programming--and there is quite a bit of programming in a CMOS imager. They are kind of like a VIA/PIA, i.e., having many registers which can be addressed and written to, affecting the functionality of the chip. So, the several parallel data outputs of the KAC-9628 have several modes of output; the vsync and hsync traces/lines/signals can be in "free running" or "data ready" mode. I am still sorting the details in my head, and DMA--a vague notion of it, in my head--might help reconcile the problem of the massive data and high-frequency clock of image sensors vs. 65816 "clunkiness". (I am sure she was quite a vision, in her prime!)
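To make the "I2C is only for programming" point concrete, here is a rough sketch of the byte sequence an I2C master sends to write one configuration register. The device address (0x6E) and register number are invented for illustration; the real values come from the KAC-9628 datasheet:

```python
# Hypothetical sketch of configuring a sensor register over I2C.
# The device address and register numbers here are made up; consult
# the KAC-9628 datasheet for the real ones.
def i2c_write_bytes(dev_addr7, reg, value):
    """Return the byte sequence a master sends (between START and STOP)
    for a single-register write: [address byte with R/W bit = 0 (write),
    register index, data byte]."""
    assert 0 <= dev_addr7 < 0x80          # 7-bit device address
    addr_byte = (dev_addr7 << 1) | 0      # shift left; R/W = 0 for write
    return [addr_byte, reg & 0xFF, value & 0xFF]

# e.g., set a (hypothetical) output-mode register to "data ready" mode
frame = i2c_write_bytes(0x6E, reg=0x12, value=0x01)
assert frame == [0xDC, 0x12, 0x01]
```

The pixel data itself never travels over this bus; it comes out on the parallel outputs, framed by vsync/hsync.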

This KAC-9628 PDF-reading has NOT gotten me closer to understanding the 65816/6522 system. I am only halfway through the 6522 data sheet, and understanding it better, but I am certain I am falling behind in other areas that I thought I understood.

Specifically, I thought I understood the timing, decoding, and glue logic of the 65816 hardware system, but am finding that the interface to memory (i.e., including the Phi2 signal in the decoding scheme, due to memory being "weird"... by "weird" I mean "made to interface with x86 devices", I suspect!) is not as straightforward as I thought. It is probably simple, but I'm just hitting a mental roadblock until the block unblocks! I think BDD's site and posts will set me straight, when I get down to serious study of them.

PCB design and manufacture (the mechanics and chemistry of it) has me in knots (as usual), but this is OK. At least here, I know how it SHOULD be done, even if I can't--or refuse to, due to poverty and "cheapness"/frugality--do it the right way!

Looking at laser diodes and laser diode circuits. Vague notions and dreams of an "all-optical 6502": 3500 all-optical switches/gates would be glorious? Mere fantasy? Perhaps. Velocity factor might be improved with light and some photonic crystal fibers? (Refractive index is the ratio of the speed of light in a vacuum to that in the medium, and thus a hollow-core PCF might be advantageous.)
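The velocity-factor idea reduces to v = c/n. A quick check with nominal index values (round figures for illustration, not measured data):

```python
# Back-of-envelope check: v = c / n, so a hollow (air-filled) core with
# n near 1.0 propagates light faster than solid glass with n ~ 1.5.
C = 299_792_458            # speed of light in vacuum, m/s

def speed_in_medium(n):
    """Phase velocity of light in a medium of refractive index n."""
    return C / n

v_glass  = speed_in_medium(1.5)     # typical solid silica, nominal
v_hollow = speed_in_medium(1.0003)  # air-filled hollow core, roughly

assert v_hollow > v_glass
assert round(v_glass / C, 2) == 0.67   # ~67% velocity factor in glass
```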

Not quite as semi-rich as I thought, but I made an order today anyway. Got a 6522, but from Rockwell (I think? Bought from Jameco!), so I have to take a look at a different spec sheet. I hope it's compatible; I didn't check too closely.

Got a nice cheap 5V regulator for PandaPro; maybe we'll start that TTL 4-bit computer from the RadioShack book I have. That should be a good starter system, for him and me!

Gave more thought to the Goldberg polyhedra. Really, there is way too much package for these itty-bitty image sensors. A multichip module is essential for a good "smoother curve" polyhedral approximation to the Petzval/Coddington curvature! I never searched for "bare wafer CMOS imagers"; perhaps somebody sells them?


PostPosted: Mon Dec 28, 2015 8:25 am 
Offline

Joined: Mon Mar 25, 2013 9:26 pm
Posts: 183
Location: Germany
What kind of tool are you using to create the layout? Is there any schematic for this layout? I find it really hard to read.

Mario.

_________________
How should I know what I think, until I hear what I've said.


PostPosted: Fri Jan 01, 2016 8:00 pm 
Offline

Joined: Mon Oct 12, 2015 5:19 pm
Posts: 255
Thanks for bothering to try, Mario.

The main difficulty with following "my plan" is the lack of a definite one. I am using TurboCAD, a "cheesy" piece of software I bought at Staples 10 years ago. It is both sufficient and deficient. It is inefficient, too. I have downloaded and installed a PCB layout tool, but haven't set aside the time to learn the intricacies of the program (same for the OSLO lens design tool I have, and the LASI IC/ASIC layout tool).

(I was trying to implement the design on one double-sided board, without plated-through holes, using just solder pads near the drill holes to jumper the wire through to the backside. But the lack of a ground plane made me rethink doing it this way. I might scrap what I have done and rethink the whole plan.)

I am presently reading Barry Brey's "The Intel Microprocessors"--upon Garth's suggestion that I study direct memory access--and I have found the book much more informative than the first time I tried to digest it. I suppose the later chapters are more "hardware heavy" and thus more reasonable to me? I dunno. I suppose that the 8237 DMA chip is probably specific to the Intel family? Perhaps it can be adapted? Brey's book mentions "shared bus operation" and I might think about that too.

The trouble, Mario, is that my low-cost project, "roundify-ing" the planar-process (i.e., integrated circuit) image sensors, requires a makeshift "multichip module" (though perhaps some HF acid could cut them out of their packages, and for a mere $80,000 or so I could get a gold-wire ball bonder to repackage the image sensors in a proper buckyball/soccer-ball shape) and something like 9 times (or 18 times, for binocular vision) VGA resolution. Now, this is not crazy, since HDTV is about this resolution. But to do this with the 6502/65816 might be a stretch? I dunno?

But I did see a Nolan Bushnell (Atari) oral history video on YouTube that describes his early idea to run six or seven terminals off of one microprocessor. This would have made video arcades look a little bit different, but he couldn't get the processors of the day to run fast enough. I think at 3 TV screens he could make the economics of the system work, but 2 made it impossible to justify the cost of the microprocessor. So, he went to "hard-wired", discrete-chip systems.


So,

648 pixels X 488 pixels X 12 bit depth X 30 fps X 1 sensor = 114 Mb / sec
648 pixels X 488 pixels X 12 bit depth X 30 fps X 9 sensors = 1.024 Gb / sec
648 pixels X 488 pixels X 12 bit depth X 30 fps X 18 sensors = 2.049 Gb / sec

648 pixels X 488 pixels X 1.5 Byte depth X 30 fps X 1 sensor = 14 MB / sec
648 pixels X 488 pixels X 1.5 Byte depth X 30 fps X 9 sensors = 128 MB / sec
648 pixels X 488 pixels X 1.5 Byte depth X 30 fps X 18 sensors = 256 MB / sec

648 pixels X 488 pixels X 1.5 Byte depth X 30 fps X 1 sensor X 60 sec/min = 854 MB / min
648 pixels X 488 pixels X 1.5 Byte depth X 30 fps X 9 sensors X 60 sec/min = 7.7 GB / min
648 pixels X 488 pixels X 1.5 Byte depth X 30 fps X 18 sensors X 60 sec/min= 15 GB / min

And, thus,

@ 18 sensors: 922 GB/hour, or about 15 TB per waking day (waking day = 16 hours).
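The figures above can be reproduced mechanically; this assumes the same 648 x 488, 12-bit (1.5-byte), 30 fps numbers used in the table:

```python
# Reproducing the bandwidth arithmetic from the table above
# (648 x 488 sensor, 12-bit = 1.5-byte pixels, 30 frames/second).
W, H, FPS = 648, 488, 30
BYTES_PER_PIXEL = 1.5

def mb_per_sec(sensors):
    """Sustained data rate in MB/s for the given number of sensors."""
    return W * H * BYTES_PER_PIXEL * FPS * sensors / 1e6

assert round(mb_per_sec(1))  == 14     # one sensor: ~14 MB/s
assert round(mb_per_sec(9))  == 128    # nine sensors
assert round(mb_per_sec(18)) == 256    # eighteen sensors

# At 18 sensors: per hour, and per 16-hour "waking day"
gb_per_hour = mb_per_sec(18) * 3600 / 1000
assert round(gb_per_hour) == 922                  # ~922 GB/hour
assert round(gb_per_hour * 16 / 1000, 1) == 14.8  # ~15 TB/waking day
```

Note the scale of the single-sensor figure against the 65816: 14 MB of data per second versus a 16 MB total address space.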

The most relevant number is the 14 MB/sec for one sensor. Note that the 65816, which I already own but have not done anything with yet, has a maximum addressable space of 16 MB--one sensor would fill it in just over a second. Clearly some sort of "architectural accommodation" must be made.

I spent some of this holiday vacation studying the timing diagrams, programming options, and other parameters of the KAC-9628 (i.e. LM 9628) image sensor (now discontinued).

So, IF I proceed with the 65816 project (i.e., separate from my KAC project), then I think I will learn the PCB CAD program AND/OR route the signals through a multilayer board (whereas earlier I envisioned a two-layer board, which is probably--at least for my skill level--unworkable).
IF I want to think more about DMA or a shared bus architecture, I can draw up a second plan. IF I want something that is "modern" (i.e., faster), then graphics processors, DSPs, DMA controllers/processors, DRAM (I think?) caches, etc. would be wise additions to the plan. Thanks.

