BigDumbDinosaur wrote:
whartung wrote:
Also consider runtimes such as the JVM that will switch from interpreted to compiled to recompiled based on runtime behaviors.
Just my opinion, but I don't think Java is a good example. We have been discussing programming languages in which the compiler emits executable machine instructions that will run without the need of a runtime package. Java doesn't qualify in that sense as a compiled language.
Nonsense.
All compiled code has a runtime. Some runtimes are as minor as _crt0 and the standard library, statically linked into the image. Others include more sophisticated capabilities, such as garbage collectors and dynamic module loaders (DLLs, the old DOS overlay managers, etc.), plus all those calls into the local OS. Java's is just a particularly sophisticated runtime. Even Java ME (the Java runtime targeted at embedded applications) includes a JIT layer.
Early Microsoft applications were a combination of native code and "p-code". They did this to save space.
When the Macintosh switched over from the 68K to the PPC, it ran in a mixed environment, with the 68K code being emulated (and later, dynamically compiled) alongside the now-native PPC code. This wasn't just at the application level; even early system ROMs were a mix of PPC and 68K. The file system primitives were one example that remained in 68K in the early releases. This worked so well that they did the same thing when they moved to the x86. An x86 OS running PPC programs with 68K code in them…makes your head hurt.
The .NET runtime compiles into an intermediate code (CIL), much as Java compiles to bytecode, but that intermediate code is JIT compiled on load. Unlike the JVM, the CLR never "interprets" the intermediate code; it always runs native instructions. It's just that those native instructions aren't created until the modules are loaded at runtime.
All of these systems were "compiled" in every sense of the word. You can ship Java code with the runtime bundled as a standard EXE file.
LLVM is similar: an intermediate representation that is both statically compiled and JIT compiled, yet the host language is simply compiled down into the LLVM code. Now compiler writers focus on creating optimal LLVM code, while the LLVM developers focus on creating better machine code out of the LLVM intermediate form for the individual CPU architectures.
The assertion was that a machine can't create better code than a human. Are there instances where the machine will most certainly be worse than a human? Naturally. But it no longer holds as a blanket statement that a machine will produce worse code than a human, simply because in many modern scenarios the machine has much more information at hand (including, in some cases, dynamic runtime information) and can make better decisions than a human, particularly in large projects.