I'm really interested in the PyPy project, but for the first (and less well-known) of its purposes listed below:
However, I've read elsewhere that RPython can be troublesome to work with -- syntax created for dynamic typing, suddenly restricted to inferred static typing, leads to hard-to-understand compile errors.
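To make the complaint concrete, here is a hedged sketch (my own example, not from the RPython docs) of code that plain CPython runs happily but that RPython's annotator rejects, because the variable `x` cannot be given a single inferred type:

```python
# Fine in plain Python, but RPython's type inference cannot unify the
# two branches: `x` is an int on one path and a str on the other, so
# the annotator reports an error instead of compiling.
def describe(flag):
    if flag:
        x = 42           # inferred as an integer here...
    else:
        x = "forty-two"  # ...and as a string here: RPython rejects this,
                         # CPython runs it without complaint.
    return str(x)

print(describe(True))   # prints: 42
print(describe(False))  # prints: forty-two
```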
It's less about syntax (the only thing Python syntax has to do with typing is that it has no place for type annotations, and that can be changed - and was, in 3.0), and more about:
So my question is, are there any other projects that would allow you to write a brainfudge interpreter/JIT like in the tutorial above? Or is PyPy the only option for doing so as succinctly?
I am not aware of any other project which attempts to create a JIT compiler from an interpreter. I'm pretty confident the idea was new when the PyPy guys did it, so the odds that something else like this (if it exists) is more mature than RPython are slim. There are numerous projects which aid with individual aspects. There are also a few which tackle many or "all" of these aspects together, such as Parrot. But AFAIK none of them have success stories anywhere near as compelling as PyPy's.
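For reference, the interpreter in the brainfudge tutorial boils down to a dispatch loop like this (a minimal plain-Python sketch of my own, without the RPython JIT hints the tutorial then adds):

```python
# A minimal Brainfuck interpreter: the kind of loop the PyPy tutorial
# turns into a tracing JIT just by annotating it with RPython hints.
def mainloop(program, bracket_map):
    tape = [0] * 30000
    pc = ptr = 0
    out = []
    while pc < len(program):
        op = program[pc]
        if op == '>':
            ptr += 1
        elif op == '<':
            ptr -= 1
        elif op == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif op == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif op == '.':
            out.append(chr(tape[ptr]))
        elif op == '[' and tape[ptr] == 0:
            pc = bracket_map[pc]   # jump past matching ']'
        elif op == ']' and tape[ptr] != 0:
            pc = bracket_map[pc]   # jump back to matching '['
        pc += 1
    return ''.join(out)

def build_bracket_map(program):
    # Pre-match brackets so jumps are O(1) at run time.
    stack, pairs = [], {}
    for i, op in enumerate(program):
        if op == '[':
            stack.append(i)
        elif op == ']':
            j = stack.pop()
            pairs[j], pairs[i] = i, j
    return pairs

prog = "++++++++[>++++++++<-]>+."  # 8 * 8 + 1 = 65 -> 'A'
print(mainloop(prog, build_bracket_map(prog)))  # prints: A
```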
Parrot is a VM for dynamic languages and features several backends (no JIT since v1.7 as I just learned, but the architecture permits re-introducing one transparently), and apparently grew a rich set of tools for language implementers. The CLR and JVM offer similar services for static object-oriented languages, though I do not know of tools quite as sophisticated as Parrot's.
But instead of you writing an interpreter, it defines an IR (several ones, in fact), and your job is compiling the language to that IR (and defining built-in functionality in terms the VM can understand). In this regard, it's different from the RPython approach of writing an interpreter. Also, as with other VMs, you are screwed should some aspect of your language map badly to the IR. Need something radically different from the VM's services? Have fun emulating it (and suffering awful performance). Need a language-specific optimization (one that is not valid for arbitrary IR, and cannot be done ahead of time)? Say goodbye to those performance improvements. I'm not aware of a complete language implementation on Parrot except for toy languages. And since they are not constantly bragging about performance, I fear that they are currently weak in this regard.
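To illustrate the difference in approach, here is a toy compile-to-IR pipeline (deliberately simplified, and not real Parrot IR): rather than walking the AST yourself at run time, you lower it once to instructions for a stack VM and hand those off for execution:

```python
# Toy illustration of the compile-to-IR approach (not actual Parrot IR):
# lower an expression AST to stack-machine instructions, then execute.
def compile_expr(node):
    # node is a nested tuple: ('num', n) or (op, left, right)
    if node[0] == 'num':
        return [('push', node[1])]
    op, left, right = node
    return compile_expr(left) + compile_expr(right) + [(op, None)]

def run(ir):
    stack = []
    for op, arg in ir:
        if op == 'push':
            stack.append(arg)
        elif op == 'add':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == 'mul':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[0]

ast = ('mul', ('add', ('num', 1), ('num', 2)), ('num', 3))  # (1 + 2) * 3
print(run(compile_expr(ast)))  # prints: 9
```

The catch the answer describes shows up exactly here: the moment your language needs a construct this instruction set cannot express, you must emulate it on top of the IR rather than extend the VM.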
LLVM (mentioned by others), as well as many other code generators/backends, is just one ingredient. You'd have to write a full-blown static compiler lowering your language to the abstraction level of machine code, rather than an interpreter. That may be feasible, but it is certainly quite different.
If one exists, what's the point of RPython in general?
"Writing JIT compilers is hard, let's go ~~shopping~~ write interpreters." Well, it probably started out as "we want to do Python in Python, but it'd be too slow and we don't want to make a programming language from scratch". But these days, RPython is a very interesting programming language in its own right, by virtue of being the world's first programming language with JIT compilers as a first-class (not quite in the sense of first-class functions, but close enough) language construct.
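What "JIT compilers as a language construct" means in practice: you annotate your interpreter loop with hints, and the translation toolchain generates the JIT for you. The sketch below follows the names used in the PyPy tutorial (`JitDriver`, `greens`, `reds`, `jit_merge_point` are real RPython API); the `try`/`except` stub is my addition so the sketch also runs under plain CPython, where the hints do nothing:

```python
# RPython's JIT is driven by hints placed in the interpreter source.
try:
    from rpython.rlib.jit import JitDriver  # only in the RPython toolchain
except ImportError:
    class JitDriver(object):                # no-op stand-in for CPython
        def __init__(self, greens=None, reds=None):
            pass
        def jit_merge_point(self, **kwargs):
            pass

# greens: values identifying a position in the interpreted program;
# reds: everything else live across the loop.
jitdriver = JitDriver(greens=['pc', 'program'], reds=['tape', 'ptr'])

def mainloop(program):
    tape, pc, ptr = [0] * 300, 0, 0
    while pc < len(program):
        # Tells the generated JIT where the interpreter loop's head is.
        jitdriver.jit_merge_point(pc=pc, program=program, tape=tape, ptr=ptr)
        op = program[pc]
        if op == '+':
            tape[ptr] += 1
        elif op == '>':
            ptr += 1
        pc += 1
    return tape

print(mainloop("+++>++")[:2])  # prints: [3, 2]
```

From those few annotations, `rpython --opt=jit` derives a tracing JIT for whatever language the interpreter implements; no other language I know of makes that a built-in facility.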
Would it have made more sense just to do "PyPy" in an existing interpreter-creation tool?
Just for the sake of being meta, doing research, and showing it works, I favor the current approach. Up until the point where the JIT generator worked, you could have had the same in any language with static compilation, C-ish performance potential, and macros (or another way to add what they call "translation aspects") -- although such languages are rare. But writing a well-performing compiler (JIT or not) for the entire Python language has repeatedly proven too hard for humans to do. I'd say it would not have made more sense to write an interpreter, then struggle to get the same semantics right a second time in a separate JIT compiler codebase, and still manage to optimize anything.