# jax-js

> A pure JavaScript port of Google's JAX ML framework that compiles and runs neural networks directly in the browser via auto-generated WebGPU kernels, with autodiff, JIT, and vectorization built in.

jax-js is a JavaScript reimplementation of Google DeepMind's JAX framework, a Python ML library similar to PyTorch that is known for its composable function transformations: automatic differentiation (grad), vectorization (vmap), and JIT compilation. Unlike runtime-only solutions such as TensorFlow.js or ONNX Runtime that ship pre-built kernels, jax-js is a full ML compiler that generates optimized WebGPU and WebAssembly kernels from scratch at runtime.

The library runs entirely in the browser with zero dependencies. It achieves ~3 TFLOP/s on matrix multiplication benchmarks (M4 Pro) and can train MNIST to 99% accuracy completely client-side. A MobileCLIP demo performs semantic search over 180,000 words of text at ~500 GFLOP/s, all in pure frontend JavaScript.

- **Full ML compiler, not just a runtime**: Generates optimized GPU compute kernels dynamically rather than relying on handwritten kernel libraries, enabling automatic fusion of operations.
- **JAX-style transformations**: Supports grad() for automatic differentiation, vmap() for vectorization, and jit() for compilation; the three compose freely with one another.
- **Browser-native training and inference**: Train neural networks with hot module reloading (edit code while training runs), or run inference on models like MobileCLIP entirely client-side.
- **Zero dependencies**: Pure JavaScript package installable via npm with no external requirements.
- **Rust-like memory model**: Uses move semantics with .ref for reference counting, avoiding JavaScript's GC limitations for numerical workloads.

To get started, install via npm (`npm install @jax-js/jax`), initialize WebGPU with `await init('webgpu')`, and use the NumPy-like API; an illustrative usage sketch appears after the Links section below. The live REPL at jax-js.com lets you experiment immediately.

## Features

- ML compiler that generates WebGPU/Wasm kernels (not handwritten)
- JAX-style autodiff (grad), vectorization (vmap), JIT compilation
- ~3 TFLOP/s matmul performance, ~500 GFLOP/s transformer inference
- Train neural networks in-browser with hot module reloading
- Zero dependencies, pure JavaScript, npm installable
- Rust-like move semantics for memory management
- NumPy-compatible API

## Integrations

WebGPU, WebAssembly, npm, GitHub

## Platforms

WEB, DEVELOPER_SDK

## Pricing

Open Source

## Version

0.0.5

## Links

- Website: https://jax-js.com/
- Documentation: https://jax-js.com/docs
- Repository: https://github.com/ekzhang/jax-js
- EveryDev.ai: https://www.everydev.ai/tools/jax-js
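
## Usage sketch

The snippet below is a minimal, illustrative sketch of the getting-started flow described above. The package name, `await init('webgpu')`, and the `grad`/`jit` transformation names come from this page; the `numpy` export alias, the `np.array`/`np.sum`/`np.multiply` functions, and the `.data()` readback call are assumptions made for illustration and may differ from the real API, so consult the documentation for actual signatures.

```js
// Sketch only: the import path and init('webgpu') are documented above;
// the numpy namespace and the array/readback method names are assumptions.
import { init, numpy as np, grad, jit } from '@jax-js/jax';

await init('webgpu'); // select the WebGPU backend

// NumPy-like API: f(x) = sum(x^2), so grad(f)(x) = 2x.
const f = (x) => np.sum(np.multiply(x.ref, x));
const df = jit(grad(f)); // differentiate, then JIT-compile

const g = df(np.array([1, 2, 3]));
console.log(await g.data()); // assumed readback method; expect [2, 4, 6]
```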
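The Rust-like memory model deserves a similarly caveat-laden illustration. This page only confirms that arrays use move semantics and that `.ref` performs reference counting; the interpretation below (an op consumes its arguments unless you pass `.ref`) and the specific op names are assumptions.

```js
// Sketch of the move-semantics model (an interpretation, not documented API):
// passing an array to an op moves it, so its buffer can be freed after use;
// `.ref` bumps the refcount so the array survives for another use.
import { init, numpy as np } from '@jax-js/jax';

await init('webgpu');

const x = np.array([1, 2, 3]);
const y = np.add(x.ref, x.ref); // two refs: x stays alive afterwards
const z = np.multiply(x, y);    // last use: x and y are moved (consumed) here
// Using x or y past this point would be an error; z owns the result.
```

If this reading is right, the design choice mirrors Rust: buffer lifetimes become deterministic instead of waiting on JavaScript's garbage collector, which matters when each array may pin megabytes of GPU memory.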