GARTHWILSON wrote:
I would put the trimmer at R1 or R2 instead of where you show it. This will adjust the gain of the op amp circuit, and keep the R-2R ladder nicely balanced.
The charge pump will be fine for the negative power-supply voltage. Ideally the op amp's output voltage will depend only on the signal inputs and the feedback network, as long as the power-supply voltage is great enough to reach the desired peak output voltages. The only problem might be that at the higher frequencies a charge pump works at, the PSRR (power-supply rejection ratio) might be poor; but particularly if the op amp is taking very little current relative to what the charge pump is capable of, the ripple shouldn't be a problem. If you notice any artifacts in your video, they'll probably be cleared up by increasing the capacitor values in the charge pump.
Thanks! I came across this useful app note from Analog Devices showing the best ways to implement gain adjustment:
https://www.analog.com/en/technical-art ... cuits.html. Based on that and your advice, I've now moved the trimmers.
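Mostly as a note to myself: assuming the buffers end up as plain non-inverting stages (I haven't settled on the exact topology, and the resistor values below are placeholders, not board values), the trimmer in the feedback network simply scales the gain without loading the ladder:

# Sketch only: non-inverting stage assumed; rf/rg and the 1k values are placeholders.
def noninverting_gain(rf, rg):
    return 1.0 + rf / rg

# sweeping a small trimmer in series with rf shows the adjustment range
for trim in (0, 250, 500):
    print(noninverting_gain(1000 + trim, 1000))   # 2.0, 2.25, 2.5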
For the charge pump, I was worried because it shows a significant drop in output voltage: 1V between 0mA and 20mA, which is roughly what the three opamps would draw. However, I have since realized I had already purchased a MAX739 a long time ago for a DRAM project, so I updated the power supply to be based on that instead. It is a regulator, so it should be more stable over the range I'm using it in, although it needs a few more passives.
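To put rough numbers on why I hesitated (back-of-envelope only; the switching frequency and capacitor value are assumptions, not datasheet figures):

# Rough figures for the unregulated charge-pump option.
i_load = 20e-3                 # roughly what the three opamps would draw
droop  = 1.0                   # quoted output drop between 0 mA and 20 mA
print("effective output resistance:", droop / i_load, "ohms")    # ~50 ohms

f_sw, c_res = 10e3, 10e-6      # ASSUMED pump frequency and reservoir cap
ripple = i_load / (2 * f_sw * c_res)    # the cap feeds the load for ~half a cycle
print("worst-case ripple estimate:", ripple * 1e3, "mV p-p")      # ~100 mV

That last number also lines up with Garth's suggestion that bigger caps are the first thing to try if artifacts show up.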
Now, that makes both ICs on this board quite expensive, at 7€ each. Add the trimmers and R-2Rs, and it is an expensive board. Maybe in the future I'll try to reduce costs; I know going to SMD would likely make this a fraction of the price, but for now I want to experiment.
drogon wrote:
I wonder if you're over-thinking it...
So, just as another data point: there is a VGA adapter for the Raspberry Pi that's driven purely from the GPIO connector (by the GPU) and is nothing more than an R/2R network, and it works remarkably well without any buffer amplifier, trim pots, etc. It's only 6 bits per channel, but for the most part you'd never know it wasn't full 24-bit RGB, and despite being entirely passive it runs right past full-HD resolutions too.
Search for Gert VGA666 for more details. (inc. schematic, etc.)
-Gordon
Yes, I'm most definitely over-thinking it. I agree that a couple of resistors is all you need, but where is the fun in that!
If I can get marginally better color reproduction and signal integrity, while learning something in the process, then I'll be happy.
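Just to put a number on "marginally better", here is the step size per code with the usual 0.7V full-scale VGA level:

# Step size per code for a 0.7 V full-scale video signal.
full_scale = 0.7
for bits in (6, 8):
    step_mv = full_scale / (2**bits - 1) * 1e3
    print(f"{bits} bits: {step_mv:.1f} mV per step")   # ~11.1 mV vs ~2.7 mV

So the extra two bits only matter if the rest of the chain is clean to a few millivolts, which is exactly the part I want to play with.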
AndrewP wrote:
I'm not sure you need that negative voltage. As far as I know the VGA RGB signals are 0V to 0.7V. It's possible that the LT1260 can be run off +5V positive and 0V negative, but I only think that from a quick scan of the datasheet - don't trust me until you've tested it.
You might not need the LT1260 at all. If you haven't seen them then James Sharman's videos are a pretty good reference:
DAC Test - VGA from Scratch - Part 10.
Thanks! I watch James' videos religiously, and his approach is sound. What's missing from his presentation is the effect of load current on the R2R ladder. If some current flows into the cable, or you have signal-integrity problems because of the impedance mismatch from not driving it at 75 ohms, then I would argue it could generate errors comparable to the 1% tolerance of the resistors, which becomes noticeable at 8 bits of resolution. I have not measured this, though, so maybe I'm overstating it. Still, an opamp buffer is an approach that should fix that, and it's a learning opportunity for me!
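To check whether that claim is even plausible, I threw together a tiny Thevenin-reduction model of a voltage-mode R-2R ladder. It's idealized (it ignores driver output impedance and the cable entirely), and the 500 ohm / 3.3V numbers are placeholders rather than my actual parts:

# Idealized voltage-mode R-2R ladder, reduced Thevenin-style from the LSB end.
# bits[0] is the LSB; r2[i] is bit i's 2R branch; r_ser[i] links node i to i+1.
def r2r(bits, vref, r2, r_ser, r_term):
    v, r = 0.0, r_term                      # 2R terminator to ground at the LSB end
    for i, b in enumerate(bits):
        vb, rb = b * vref, r2[i]
        v = (v * rb + vb * r) / (r + rb)    # parallel-combine with this bit's branch
        r = r * rb / (r + rb)
        if i < len(bits) - 1:
            r += r_ser[i]                   # series R to the next node
    return v, r                             # open-circuit voltage and output resistance

n, R, vref = 8, 500.0, 3.3                  # placeholder values
ideal  = [2 * R] * n
skewed = ideal[:]
skewed[-1] *= 1.01                          # MSB branch off by 1%

worst = 0.0
for code in range(2**n):
    bits = [(code >> i) & 1 for i in range(n)]
    v_i, _ = r2r(bits, vref, ideal,  [R] * (n - 1), 2 * R)
    v_s, _ = r2r(bits, vref, skewed, [R] * (n - 1), 2 * R)
    worst = max(worst, abs(v_s - v_i))

lsb = vref / 2**n
print("worst-case code error:", worst / lsb, "LSB")
print("ladder output resistance:", r2r([0] * n, vref, ideal, [R] * (n - 1), 2 * R)[1], "ohms")

With a 1% error in just the MSB branch, the worst-case code error comes out at a sizeable fraction of an LSB at 8 bits, and the ladder presents roughly R of output resistance. That at least suggests the effect is in the same ballpark as an LSB rather than negligible, and it's why I want a buffer between the ladder and the 75-ohm load instead of having the ladder drive it directly.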
On the power supply issue, they do recommend +/-5V, but they also say it could run from a single supply. I'm not sure how to interpret the datasheet to work out how much error there would be near 0V in a single-supply circuit. And I'm not familiar at all with DC biasing and filtering to shift the opamp into a better operating range; maybe that's a future learning project.
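As a thought experiment only (the 0.3V headroom figure below is invented for illustration, not taken from the LT1260 datasheet), my worry is simply that the output can't swing all the way down to the negative rail on a single supply:

# Hypothetical illustration of single-supply clipping near 0 V.  The 0.3 V
# output headroom is an assumed figure, NOT an LT1260 datasheet number.
v_neg_rail, headroom = 0.0, 0.3
v_floor = v_neg_rail + headroom
for wanted in (0.0, 0.1, 0.35, 0.7):           # intended video levels
    print(wanted, "->", max(wanted, v_floor))  # levels below the floor get crushed

If something like that is what happens, the darkest levels get crushed toward the swing limit, and either the small negative rail or some DC shifting plus AC coupling would sidestep it.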