jax-js
A pure JavaScript port of Google's JAX ML framework that compiles and runs neural networks directly in the browser via auto-generated WebGPU kernels—with autodiff, JIT, and vectorization built in.
About jax-js
jax-js is a JavaScript reimplementation of Google DeepMind's JAX framework, a Python ML library comparable to PyTorch and known for its composable function transformations: automatic differentiation (grad), vectorization (vmap), and JIT compilation (jit). Unlike runtime-only solutions such as TensorFlow.js or ONNX Runtime Web, which ship pre-built kernels, jax-js is a full ML compiler that generates optimized WebGPU and WebAssembly kernels from scratch at runtime.
The library runs entirely in the browser with zero dependencies. It reaches ~3 TFLOP/s on matrix-multiplication benchmarks (on an M4 Pro) and can train MNIST to 99% accuracy completely client-side. A MobileCLIP demo runs semantic search over 180,000 words of text at ~500 GFLOP/s, all in pure frontend JavaScript.
- Full ML compiler, not just runtime — Generates optimized GPU compute kernels dynamically rather than relying on handwritten kernel libraries, enabling automatic fusion of operations.
- JAX-style transformations — Supports grad() for automatic differentiation, vmap() for vectorization, and jit() for compilation—composable transformations that work together.
- Browser-native training and inference — Train neural networks with hot module reloading (edit code while training runs) or run inference on models like MobileCLIP entirely client-side.
- Zero dependencies — Pure JavaScript package installable via npm with no external requirements.
- Rust-like memory model — Uses move semantics with .ref for reference counting, avoiding JavaScript's GC limitations for numerical workloads.
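The memory-model bullet can be made concrete with a toy sketch. This is not jax-js source code; DeviceBuffer and its methods are hypothetical stand-ins that mimic the described .ref counting:

```javascript
// Conceptual toy, not jax-js internals: a plain-JS illustration of why
// move semantics plus an explicit .ref help release GPU memory promptly.
// JavaScript's GC offers no deterministic destructor, so a library that
// owns GPU buffers must count references itself.
class DeviceBuffer {
  constructor(data) {
    this.data = data; // stands in for a GPU allocation
    this.refs = 1;    // the creator holds the only reference
  }
  // Taking .ref bumps the count, so the value survives one more "move".
  get ref() {
    this.refs += 1;
    return this;
  }
  // Each consumer "moves" the value into dispose(); the backing storage
  // is released only when the last reference is dropped.
  dispose() {
    this.refs -= 1;
    if (this.refs === 0) this.data = null;
  }
}

const x = new DeviceBuffer(new Float32Array(4));
const y = x.ref; // share the buffer: refs is now 2
x.dispose();     // one reference consumed; data still alive
y.dispose();     // last reference gone; storage released
```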
To get started, install via npm (npm install @jax-js/jax), initialize WebGPU with await init('webgpu'), and use the NumPy-like API. The live REPL at jax-js.com lets you experiment immediately.
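A minimal getting-started sketch might look like the following. The package name and await init('webgpu') come from the instructions above; the numpy namespace and array constructors are assumptions based on the library's NumPy-like API, so check the live REPL for the exact surface:

```javascript
// Hypothetical named exports, assuming a NumPy-like namespace.
import { init, numpy as np } from '@jax-js/jax';

await init('webgpu'); // select the WebGPU backend before any array ops

// NumPy-style array math; each op lowers to a generated GPU kernel.
const a = np.ones([512, 512]);
const b = np.ones([512, 512]);
const c = np.matmul(a, b); // 512x512 matmul on an auto-generated kernel
```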

Pricing
Free Plan Available
Open-source ML compiler and framework for browser-based training and inference.
- Full ML compiler with kernel generation (not just runtime)
- JAX transformations: grad, vmap, jit
- WebGPU acceleration (~3 TFLOP/s matmul on M4 Pro)
- Zero dependencies, npm installable
- Live REPL and working demos (MNIST, MobileCLIP)
Capabilities
Key Features
- ML compiler that generates WebGPU/Wasm kernels (not handwritten)
- JAX-style autodiff (grad), vectorization (vmap), JIT compilation
- ~3 TFLOP/s matmul performance, ~500 GFLOP/s transformer inference
- Train neural networks in-browser with hot module reloading
- Zero dependencies, pure JavaScript, npm installable
- Rust-like move semantics for memory management
- NumPy-compatible API
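As a sketch of how the listed transformations compose: the import path and the names init, grad, vmap, and jit come from the feature list above, while the numpy namespace, constructors, and op names are illustrative assumptions.

```javascript
import { init, grad, jit, vmap, numpy as np } from '@jax-js/jax';

await init('webgpu'); // pick the WebGPU backend

// Scalar loss: f(x) = sum(x * x), so df/dx = 2x.
const f = (x) => np.sum(np.multiply(x, x));

// Compose: differentiate, vectorize over a batch, then JIT-compile
// the whole pipeline into fused GPU kernels.
const dfBatch = jit(vmap(grad(f)));

const batch = np.ones([8, 16]); // hypothetical constructor
const grads = dfBatch(batch);   // per-example gradients, shape [8, 16]
```

The key property, as in Python JAX, is that the three transformations are freely composable: any order of grad, vmap, and jit yields another transformable function.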