ChatGPT: ASCII to decimal conversion
Re: ChatGPT: ASCII to decimal conversion
All these systems are stochastic - they don't produce the same text, even for the same inputs. And they are text generators, not lookup machines, so you're not getting information, you're getting something which plausibly resembles information. That resemblance can be very good, but there's still a difference. Even when they provide links as if they are citations, those links may not support the statements made - they are merely plausible links. They might even be to defunct websites or websites which have since been domain-squatted.
Although these systems are getting better, the only way to use them reliably is to check everything they output. Which of course you did.
- BigDumbDinosaur
- Posts: 9425
- Joined: 28 May 2009
- Location: Midwestern USA (JB Pritzker’s dystopia)
- Contact:
Re: ChatGPT: ASCII to decimal conversion
Martin_H wrote:
I am porting code from the 6502 to 65816, and I needed a routine to print a 16-bit binary number as decimal...In Edge I type "65816 assembly to print number as decimal number" and at the top is a panel with AI-generated code. But the code doesn't look right: it has repeated blocks of code and do-nothing blocks.
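For context, the routine being asked for - print a 16-bit value in decimal - is small. A C sketch of the usual approach (repeated division by 10, digits pushed on a stack so they emerge most-significant first; on a 6502/65816 the divide-by-10 itself would be done with shifts and subtracts, and the function name here is illustrative):

```c
#include <stdint.h>

/* Convert a 16-bit value to a decimal string. Digits come out of the
   divide loop least-significant first, so they are pushed onto a small
   stack and popped in printing order -- the same structure an assembly
   routine for the 65816 would use. */
void print_u16_decimal(uint16_t n, char *out)
{
    char stack[5];          /* 65535 has at most 5 digits */
    int sp = 0;
    do {
        stack[sp++] = (char)('0' + (n % 10));
        n /= 10;
    } while (n != 0);
    while (sp > 0)
        *out++ = stack[--sp];
    *out = '\0';
}
```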
In the same vein of relying on artificial “intelligence,” there was this news story about autonomous automobiles, aka “self-crashing cars.” The revelation that the autonomy isn’t quite what it seems does a lot to bolster my confidence in the ability of machines to make timely decisions that, if wrong, could result in large hospital or mortician bills.
Quote:
But here's the weird part. I redo the search out of curiosity, and it generates a completely different block of code. Initially it looks better, but as I read it, I realize it's likewise bonkers, so I give it the thumbs-down feedback.
Quote:
I'm not wasting my time trying this, but I thought I would share it. I added a few comments in caps where it's off the wall.
Quote:
Go home, AI, you're drunk.
x86? We ain't got no x86. We don't NEED no stinking x86!
Re: ChatGPT: ASCII to decimal conversion
@BDD, thanks for the link to the Waymo story. I've been telling my brother-in-law that they had human rescue drivers and now I have a source.
- BigDumbDinosaur
Re: ChatGPT: ASCII to decimal conversion
Martin_H wrote:
@BDD, thanks for the link to the Waymo story. I've been telling my brother-in-law that they had human rescue drivers and now I have a source.
BTW, you should know that my automobile also has a “rescue driver”: me!
Re: ChatGPT: ASCII to decimal conversion
Why does MS claim that 'AI' produces 30% of its code? Because after they spent so many billions on it, they can't afford programmers any more...
Except in very specialised applications, 'AI' is delivering the distilled and averaged intelligence of the internet. Ask yourself which part of the population posts loudest and most often... and rejoice that now this idiocy can also include its cohort's idiocy in its own training material. Eventually, the answer to any question asked of generative 'AI' will be "Eh?"
Fifteen years ago, I wrote a Master's dissertation on the subject of autocorrection for OCR'd documents of book length - 100k to 1M words. The software I wrote for it included knowledge of English spelling, frequency of letters and partial word forms, common errors, and detailed noise models for particular OCR programs. It generated correction lists for words which were not in normal spelling dictionaries; eventually it either decided that a word was correct, or offered a _list_ of alternatives in descending probability. It was very very good, and I used that software to correct a lot of scanned books. But I would never allow it to correct something on its own, as people trust 'AI' to do...
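The core of such a correction list - scoring candidate words against an OCR'd token so they can be offered in ascending-distance order - can be sketched in C. This is a toy version using plain Levenshtein distance; a real system like the one described would weight individual edits by an OCR noise model rather than counting them all equally:

```c
#include <string.h>

/* One-row dynamic-programming Levenshtein distance: the number of
   single-character insertions, deletions, and substitutions needed to
   turn string a into string b. Candidates with the smallest distance
   to the OCR'd token go to the top of the correction list. */
int levenshtein(const char *a, const char *b)
{
    int la = (int)strlen(a), lb = (int)strlen(b);
    int row[64];                     /* assumes words shorter than 64 chars */
    for (int j = 0; j <= lb; j++)
        row[j] = j;
    for (int i = 1; i <= la; i++) {
        int prev = row[0];           /* diagonal value from the previous row */
        row[0] = i;
        for (int j = 1; j <= lb; j++) {
            int cur = row[j];
            int cost = (a[i - 1] == b[j - 1]) ? 0 : 1;
            int best = prev + cost;              /* substitute (or match) */
            if (row[j] + 1 < best) best = row[j] + 1;     /* delete */
            if (row[j - 1] + 1 < best) best = row[j - 1] + 1; /* insert */
            row[j] = best;
            prev = cur;
        }
    }
    return row[lb];
}
```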
Neil ([/rant])
- BigDumbDinosaur
Re: ChatGPT: ASCII to decimal conversion
barnacle wrote:
Why does MS claim that 'AI' produces 30% of its code? Because after they spent so many billions on it, they can't afford programmers any more...
Quote:
Except in very specialised applications, 'AI' is delivering the distilled and averaged intelligence of the internet. Ask yourself which part of the population posts loudest and most often... and rejoice that now this idiocy can also include its cohort's idiocy in its own training material. Eventually, the answer to any question asked of generative 'AI' will be "Eh?"
Quote:
Fifteen years ago, I wrote a Master's dissertation on the subject of autocorrection for OCR'd documents of book length - 100k to 1M words.
Re: ChatGPT: ASCII to decimal conversion
While Microsoft will have strong business reasons (as they see them) to do what they do, there's no shortage of cash: a quick search says $86 billion.
Re: ChatGPT: ASCII to decimal conversion
AI has come a long way in the nearly 3 years since I started this thread. I've primarily been using Claude (Sonnet 4.5) to assist me with PUNIX. It does still make mistakes from time to time (I would say with somewhat lower frequency than I do), but it is able to accept instruction and make corrections. It will also correct my mistakes in sensible ways. Basically, it's like working with a collaborator: not an infallible coding god, but no longer something that just emits nonsense. The code it generates typically will assemble and run with expected results, and the mistakes it makes are semantically plausible - for example, it would like there to be an (indirect,x),y addressing mode (don't we all), and will sometimes wishfully employ it. When I point this out, Claude acknowledges the correction and outputs the appropriate fix (which, for my architecture, means using a zero page "pseudoregister" as an intermediate storage target for the indirection).
"The key is not to let the hardware sense any fear." - Radical Brad
Re: ChatGPT: ASCII to decimal conversion
Interesting video on the failure rate of LLMs in real world usage:
https://youtu.be/z3kaLM8Oj4o?si=8n_iTOBFc_7_QO3r
Re: ChatGPT: ASCII to decimal conversion
Paganini wrote:
it would like there to be an (indirect,x),y addressing mode (don't we all)
Paganini wrote:
[...] the appropriate fix (which, for my architecture, means using a zero page "pseudoregister" as an intermediate storage target for the indirection).
Usage is pretty simple. Imagine that X selects a z-pg pair which points to a multi-byte array. The 65C02 can very easily fetch the 0th byte of the array by using...
Code:
LDA (0,X)

KK recognizes undefined opcode $CB as an alias, which gets translated to $A1 -- LDA (ind,X) -- before the 'C02 sees it. This means $CB is equally apt at speedily fetching the 0th byte of the array. But! $CB also secretly latches the resolved pointer into W.
Because W is readable at a fixed location in z-pg, we don't need X to access it. Thus it becomes fast and easy to use LDA (W),Y to access the remaining bytes in the array. In one very commonly used snippet (in which the array is simply a 16-bit word), performance almost doubles! (an 89% speedup)
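The pattern amounts to: resolve the indirection once, cache the pointer in a fixed place, then index cheaply from there. A minimal C model of that idea (the name W comes from the post; the tables array and the two function names are illustrative stand-ins, not any real hardware interface):

```c
#include <stdint.h>

static const uint8_t *W;            /* fixed "pseudoregister", like z-pg W  */
static const uint8_t *tables[16];   /* pointers that (zp,X) would select from */

/* Models LDA (zp,X): follow the pointer selected by X and fetch byte 0.
   The side effect mirrors the trick described above: the resolved
   pointer is latched into W. */
uint8_t lda_ind_x(uint8_t x)
{
    W = tables[x];
    return W[0];
}

/* Models LDA (W),Y: every remaining byte of the array is now reachable
   through the cached pointer, without needing X or redoing the lookup. */
uint8_t lda_w_y(uint8_t y)
{
    return W[y];
}
```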
-- Jeff
In 1988 my 65C02 got six new registers and 44 new full-speed instructions!
https://laughtonelectronics.com/Arcana/ ... mmary.html
Re: ChatGPT: ASCII to decimal conversion
I have to say that my experiments in FAT32 are leading me to wonder about a 32-bit 6502. But I fear it would have so many extra instructions: I find it annoying, for example, that INC and DEC weren't given the (zp) addressing mode. I'd also like to be able to use X and Y as sources and destinations for the ALU operations. But then, we could add perhaps sixteen 32-bit register pairs and we've reinvented the ARM. Hmmm.
Neil
- BigDumbDinosaur
Re: ChatGPT: ASCII to decimal conversion
barnacle wrote:
I have to say that my experiments in FAT32 are leading me to wonder about a 32-bit 6502...
You are in a somewhat similar boat (I just had to get that in).
While the 65C02 has excellent throughput when handling load/store instructions and register operations, it’s mathematically weak. Aside from efficient disk accesses, the filesystem driver needs a strong arithmetic library. It’s the nature of the filesystem beast, something that became clear to me years ago when I was working with the Xetec Lt. Kernal. Filesystem stuff is not easy, even with a 32-bit MPU with built-in multiply and divide instructions.
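As an illustration of what that arithmetic library has to do, here is a C sketch of 32-bit addition done the way an 8-bit CPU does it: the value lives as four little-endian bytes and a carry ripples through them, the C equivalent of a CLC/ADC chain on a 65C02 (the function name is illustrative):

```c
#include <stdint.h>

/* Add two 32-bit values stored as four little-endian bytes each,
   low byte first -- mirroring a 65C02 sequence of CLC followed by
   four ADCs, one per byte, with the carry flag chaining them. */
void add32(uint8_t dst[4], const uint8_t a[4], const uint8_t b[4])
{
    unsigned carry = 0;
    for (int i = 0; i < 4; i++) {
        unsigned sum = a[i] + b[i] + carry;
        dst[i] = (uint8_t)(sum & 0xFF);
        carry = sum >> 8;            /* carry into the next byte */
    }
}
```

Multiplication and division get built the same byte-at-a-time way, which is why FAT32's cluster arithmetic feels so laborious on this class of CPU.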
I say look at what you’ve accomplished to date and keep pushing forward. Just watch out for icebergs.
Re: ChatGPT: ASCII to decimal conversion
Encouragement is always appreciated. I feel that, at this point, I have all the major algorithms in place and mostly working (except f_alloc(), which is sulking). It's a question of sorting this out in minimal assembly; it's about 4K at the moment, though with a lot of unnecessary debugging code in place.
I feel I could - with the knowledge I now have - probably write a C program to handle the major arcana in a week. Assembly, a little longer perhaps. C is, of course, assembly language with structures and 32-bit arithmetic.
I'm writing up a small screed to summarise the thread as I go along. It's currently around 20 pages...
Neil
Re: ChatGPT: ASCII to decimal conversion
I think it's a good adventure. We all know (surely) that any reasonable CPU can do anything that any other reasonable CPU can do, given memory and time. So FAT32 on the '02 is surely going to be feasible, it's just that it's somewhat tricky. It might be that some of the trickiness would be just as challenging on another CPU with more and wider registers. It might be that suitable macros would hide some of the details of doing 32 bits on an 8 bit micro. But we've been doing that ever since we ran a sensible Basic on our 6502 computers - most of us never even thought to consider the lack of an FPU, because an 8 bit micro can do anything. Indeed, that's part of the joy!
Re: ChatGPT: ASCII to decimal conversion
I do hope it's been an enjoyable adventure, even if challenging.