I assume GP meant that a lot of compilers also interpret and interpreters also compile.
For compilers, constant folding is a pretty obvious optimization. Instead of compiling a constant expression like 1+2 into code that evaluates it at runtime, the compiler can evaluate it itself and just emit the final result, in this case 3.
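As a concrete illustration, a minimal sketch using CPython's `dis` module (which works here because CPython itself performs this folding):

    import dis

    # The expression 1 + 2 never survives to runtime: CPython folds it
    # into the constant 3 at compile time, so the disassembly shows
    # LOAD_CONST 3 rather than an addition instruction.
    dis.dis(compile("x = 1 + 2", "<example>", "exec"))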
Then, some language features require compilers to perform some interpretation, either explicitly, as with C++'s constexpr, or implicitly, as with type checking.
Likewise, interpreters can do some compilation. You already mentioned bytecode. Producing the bytecode is a form of compilation. Incidentally, you can skip the bytecode and interpret a program by, for example, walking its abstract syntax tree.
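A toy version of that approach, as a sketch (using Python's `ast` module; the `evaluate` function and the handful of node types it handles are just what this example needs):

    import ast

    def evaluate(node):
        # Interpret a tiny arithmetic subset of Python by walking the
        # tree directly, with no bytecode step in between.
        if isinstance(node, ast.Expression):
            return evaluate(node.body)
        if isinstance(node, ast.BinOp):
            left, right = evaluate(node.left), evaluate(node.right)
            if isinstance(node.op, ast.Add):
                return left + right
            if isinstance(node.op, ast.Mult):
                return left * right
            raise NotImplementedError(type(node.op).__name__)
        if isinstance(node, ast.Constant):
            return node.value
        raise NotImplementedError(type(node).__name__)

    print(evaluate(ast.parse("1 + 2 * 3", mode="eval")))  # 7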
Also, compilers don't necessarily create binaries that are immediately runnable. Java's compiler, for example, produces JVM bytecode, which requires a JVM to be run. And TypeScript's compiler outputs JavaScript.
Then what is the difference? I always thought of Java as closer to Python, in the sense that it's running the bytecode. And Python also has bytecode.
I don't know what the difference is. I know there can be interpreters for compiled languages, but generally speaking it's hard to find compilers for interpreted languages.
E.g. C++ has both compilers (gcc) and interpreters (cpi).
JS doesn't have compilers IIRC; it can have transpilers. Js2c is a good one, but I'm not sure if they are failsafe (70% ready).
I also have to thank you, this is a great comment.
Programming languages mostly occupy a 4-dimensional space at runtime, and each axis is actually a bit more complicated than a simple line:
* The first axis is static vs dynamic types. Java is mostly statically-typed (though casting remains common and generics have some awkward spots); Python is entirely dynamically-typed at runtime (external static type-checkers do not affect this).
* The second axis is AOT vs JIT. Java has two phases - a trivial AOT bytecode compilation, then an incredibly advanced non-cached runtime native JIT (as opposed to the shitty tracing JIT that dynamically-typed languages have to settle for); Python traditionally has an automatically-cached barely-AOT bytecode compiler but nothing else (it has been making steps toward runtime JIT stuff, but poor decisions elsewhere limit the effectiveness).
* The third axis is indirect vs inlined objects. Java and Python both force all objects to be indirect, though they differ in terms of primitives. Java has been trying to add support for value types for decades, but the implementation is badly designed; this is one place where C# is a clear winner. Java can sometimes inline stack-local objects though.
* The fourth axis is deterministic memory management vs garbage collection. Java and Python both have GC, though in practice Python is semi-deterministic, and the language has a somewhat easier way to make it more deterministic (`with`, though it is subject to unfixable race conditions).
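To make that last point concrete, a minimal sketch (the `resource` context manager and its name are made up for illustration):

    from contextlib import contextmanager

    @contextmanager
    def resource(name):
        print("acquire", name)
        try:
            yield name
        finally:
            # Runs deterministically when the `with` block exits, whether
            # normally or via an exception -- no waiting for the GC.
            print("release", name)

    with resource("db connection") as r:
        print("using", r)
    # Guaranteed order: acquire, using, release.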
I have collected a bunch more information about language implementation theory: https://gist.github.com/o11c/6b08643335388bbab0228db763f9921...
The easy definition is that an interpreter takes something and runs/executes it.
A compiler takes the same thing, but produces an intermediate form (bytecode, machine code, or another language; that last case is sometimes called a "transpiler") that you can then pass through an interpreter of sorts.
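A small illustration of that split, using CPython's own building blocks (`compile` produces the intermediate form, `exec` hands it to the interpreter; the source string is arbitrary):

    import dis

    source = "total = sum(i * i for i in range(5))"

    # "Compilation": source text -> a code object holding bytecode.
    code = compile(source, "<example>", "exec")
    dis.dis(code)              # inspect the intermediate form

    # "Interpretation": CPython's VM executes the bytecode.
    namespace = {}
    exec(code, namespace)
    print(namespace["total"])  # 30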
There is no fundamental difference between Java and the JVM, Python and the Python virtual machine, or even a C compiler targeting x86 and an x86 CPU. One might be called bytecode and the other machine code .. they do the same thing.
While an interpreter can do optimizations, it does not produce "bytecode" -- by that time it is a compiler!
As for the comparison with the JVM .. compare it to a compiler that produces x86 code: the output cannot be run without an x86 machine. You need a machine to run something, be it virtual or not.