Chromatix wrote:
Right, so the VDU commands are a lot like the WYSE ones, but approximately one byte shorter - because they don't have an ESC prefix.
Which could make them ambiguous in the real (i.e., ASCII) world where a common standard is required. What you have described is a very narrow and specific hardware example.
<ESC> was defined in the ASCII standard so a receiving device could unambiguously detect when the following character(s) are to be interpreted in some way instead of being sent on to be displayed. Even the Commodore 128 used escape sequences to affect the video in some fashion, thus insulating the programmer (BASIC or otherwise) from having to know anything about the video driver or the underlying hardware.
In the timesharing BASIC dialects that were developed during the minicomputer era (and are in use to this day), the abstraction was even greater. For example, to print flashing, reverse-video text to the terminal, one could write PRINT 'BR','BB',"Flashing Text",'EB','ER' in a program, and the BASIC interpreter would generate the escape sequences needed to turn on reverse video ('BR') and flashing ('BB'). Internally, the character string would be tokenized to <ESC>BR<ESC>BBFlashing Text<ESC>EB<ESC>ER, the <ESC> bytes telling the interpreter that the next two characters were a display-control command. The programmer didn't have to know anything about the terminal's command language; the interpreter took care of that.
That said, this topic isn't really one about how to control a display. Whatever floats your boat is the right way to do it.
However, standards do have a purpose, and the use of <ESC> to tell a device that something special is coming is nearly universal in computing—for example, any printer that understands PCL is using escape sequences. It's been that way since I started back in 1970 and will probably stay that way into the foreseeable future.
commodorejohn wrote:
Unfortunately, if you're not relying on eight-bit transmission of what is technically a seven-bit character code, there aren't any places to fit in more elaborate one-byte cursor controls...
...which is why the VT-100 command set evolved the way it did. DEC was a laggard when it came to adopting 8-bit serial transmission, opting instead to stick with 7 bits with parity. By the mid-1980s, parity in terminal interfaces was dying out (it really didn't have any purpose, as a framing or other error would immediately be obvious to the terminal user) and 8N1 had become the de facto standard. I can't recall the last time I configured a terminal server to use parity (parity does live on in some serial data acquisition devices, such as credit card readers).
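For anyone wiring up a serial port on a modern UNIX box, the 8N1-versus-7E1 distinction comes down to a few termios flags. This is a minimal sketch of just the relevant c_cflag bits, not a complete port-configuration routine (no baud rate, no flow control, no raw-mode settings):

```c
#include <termios.h>

/* Configure a termios structure for 8N1: 8 data bits, no parity,
 * one stop bit -- the de facto standard framing. */
static void set_8n1(struct termios *t)
{
    t->c_cflag &= ~(tcflag_t)(PARENB | CSTOPB | CSIZE);
    t->c_cflag |= CS8;
}

/* Configure for 7E1: 7 data bits, even parity, one stop bit --
 * the older DEC-style framing. */
static void set_7e1(struct termios *t)
{
    t->c_cflag &= ~(tcflag_t)(PARODD | CSTOPB | CSIZE);
    t->c_cflag |= PARENB | CS7;
}
```

In practice you would read the current settings with tcgetattr(), apply one of these, and write them back with tcsetattr().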
Quote:
Which isn't to say that the VT-100 standard isn't over-engineered and an inefficient use of bandwidth;
Dunno about "over-engineered."
More like "over-complicated." Even after 8-bit ASCII became commonplace, DEC stuck with their scheme, apparently as some form of vendor lock-in (it didn't work—WYSE and others added VT-100 emulation to their products).
As I earlier said, the more compact WYSE 60 command structure had a lot to do with that terminal's overwhelming success (also, the quality of its display was a factor). In a complex screen layout in which there are many fields requiring many cursor-plotting functions, bandwidth consumption on the slower serial interfaces of the 1980s was a real concern. For example, sending four bytes to position the cursor (WYSE) instead of eight (VT-100), multiplied by 15 or 20 such commands per screen significantly reduced bandwidth consumption in a large installation (in the past, I routinely installed systems with 30-40 terminals, all running off a single UNIX box). Reducing the total outflow reduced low-level kernel processing, since serial I/O in those systems was interrupt-driven.
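To make those byte counts concrete, here is a rough sketch of the two cursor-addressing schemes in C. The WYSE sequence shown is the native ESC = row col form with a +32 offset on each coordinate, as I recall it; the VT-100 form is the standard ANSI cursor-position sequence with decimal coordinates. Treat the exact formats as illustrative rather than authoritative.

```c
#include <stdio.h>

/* WYSE 50/60-style native cursor addressing: ESC = <row+32> <col+32>.
 * Always four bytes, regardless of position (row/col are 0-based).
 * Returns the number of bytes written. */
static int wyse_goto(char *buf, int row, int col)
{
    return sprintf(buf, "\x1b=%c%c", row + 32, col + 32);
}

/* VT-100 (ANSI) cursor addressing: ESC [ row ; col H, with 1-based
 * coordinates sent as decimal digits. Returns bytes written. */
static int vt100_goto(char *buf, int row, int col)
{
    return sprintf(buf, "\x1b[%d;%dH", row + 1, col + 1);
}
```

Positioning to the bottom-right of a 24x80 screen costs four bytes the WYSE way versus eight the VT-100 way, and that difference multiplied over every field of every screen of every terminal added up on a loaded serial mux.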
Getting back to the topic, a change of terminal personality definitely can be done if one is familiar with C and knows the details of the terminal to be emulated. I understand that a development environment can be downloaded from Microchip's website and installed to write code. A PICkit is needed to flash the MCU. The genuine Microchip PICkit costs about 50 USD through the usual electronics sources. There are Chinese clones that vary in quality. As always, caveat emptor.