finally the #wasmtime winch backend can run the ghc wasm backend test suite. there's a good reason why every engine out there has a baseline jit and tier-up strategy, glad my daily driver is catching up
The magic of #WebAssembly and #WASI is that as long as you can feed your target program to a runtime capable of understanding it, everything else just kinda works, because it's already expressed on the far side of the abstraction boundary.
I have not written a line of #Rust before, but I was able to slightly tweak one of the #wasmtime examples to get the CPython WASI build running for the most basic task.
This is huge! This could allow other Web browser engines like #Firefox's #Gecko to run on iOS, and expand the possibilities with #WebAssembly on iOS by allowing engines like #wasmtime to run on Apple phones 🥳
(o/c terms and conditions apply, and the costs might be prohibitive)
Interested in trying out the new wasmtime serve in #Wasmtime 14? Here's a quick tutorial showing how to build a #WASI HTTP component, and how to run it:
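(For anyone who just wants the shape of it before reading the tutorial: a minimal sketch, assuming you already have a built WASI HTTP component — the file name `my-component.wasm` is a placeholder, and 8080 is wasmtime serve's default listen port.)

```shell
# serve a WASI HTTP component; wasmtime serve listens on 0.0.0.0:8080 by default
wasmtime serve my-component.wasm &

# hit it from another terminal
curl http://localhost:8080/
```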
@sunfish I just updated #wasmtime to 14.0.1 - but it looks like the new wasmtime serve command is behind a feature flag, and the normally shipped wasmtime is built without this feature? So I guess I would have to build wasmtime from source?
#Wasmtime has a JIT and a #Wasm runtime that uses a lot of pointers, so it's especially cool that this PR is able to make it preserve strict provenance and run under stacked borrows in Miri: