- **50x** faster than pure Python
- **10x** less memory, for lower server costs
- **100%** type safe across language boundaries
Write once in Rust. Use everywhere.
```rust
use bridgerust::prelude::*;

#[bridge]
struct Calculator {
    value: f64,
}

#[bridge_methods]
impl Calculator {
    #[constructor]
    fn new() -> Self {
        Self { value: 0.0 }
    }

    #[method]
    fn add(&mut self, x: f64) {
        self.value += x;
    }

    #[method]
    fn get_value(&self) -> f64 {
        self.value
    }
}
```

```python
from calculator import Calculator

calc = Calculator()
calc.add(10.5)
calc.add(5.3)
print(calc.get_value())  # 15.8
# 50x faster than pure Python! ⚡
```

```javascript
import { Calculator } from 'calculator';

const calc = new Calculator();
calc.add(10.5);
calc.add(5.3);
console.log(calc.getValue()); // 15.8
// 10x faster than pure JavaScript! 🚀
```

One Rust implementation. Two native libraries. Zero compromises.
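For comparison, the Calculator's behavior can be sketched as a plain Python class. This is only a reference implementation of the semantics shown above, not anything BridgeRust generates; the bridged class is compiled Rust.

```python
# Pure-Python reference for the bridged Calculator above
# (illustration only; the real class is a compiled Rust extension).
class Calculator:
    def __init__(self) -> None:
        self.value: float = 0.0

    def add(self, x: float) -> None:
        self.value += x

    def get_value(self) -> float:
        return self.value


calc = Calculator()
calc.add(10.5)
calc.add(5.3)
print(calc.get_value())  # 15.8
```

The bridged version exposes the same surface, but every method call dispatches into native code instead of Python bytecode.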
BridgeRust unifies PyO3 and napi-rs into a single, zero-cost macro system. Write your logic once in Rust, and deploy native high-performance bindings to Python, Node.js, and beyond.
Before: Python Image Processing

- Pillow: 50 images/sec
- Memory: 500 MB
- Problem: Users complain about timeouts

```python
# Slow pure Python
for img in images:
    process(img)  # 20ms each
```

After: Rust + BridgeRust

- Your Library: 2,500 images/sec
- Memory: 50 MB
- Result: Happy users, 80% lower costs

```python
# Fast Rust core
process_batch(images)  # 0.4ms each
```

Before: Celery Task Queue

- Celery: 1,000 tasks/sec
- Latency: 50ms per task
- Problem: Can't scale without more servers

After: Rust + BridgeRust

- Your Queue: 50,000 tasks/sec
- Latency: 1ms per task
- Result: Same servers, 50x throughput
```bash
# Install the BridgeRust CLI
cargo install bridge

# Verify installation
bridge --version
# bridge 0.1.2

# Create a new BridgeRust project
bridge new my-fast-lib
cd my-fast-lib

# Start the development server (live reload)
bridge dev
# Watching for changes...

# Build bindings for both Python & Node.js
bridge build --all

# Generate documentation automatically
bridge docs --open

# Run tests
cargo test

# Run benchmarks across languages
bridge benchmark
# Running benchmarks...
# Rust: 100ns/iter
# Python: 5000ns/iter (50x slower)
```

You just built a universal library!
Write Once
DRY (Don't Repeat Yourself). Define with #[bridge] or #[bridge_module]. We generate PyO3 and NAPI-RS bindings automatically.

```rust
#[bridge_module]
mod my_lib {
    pub fn process() {} // Automatically bridged!
}
```

No duplicate FFI code. No maintenance burden.
Native Speed
10-50x faster than pure Python/Node. Zero runtime overhead. Compiled to native machine code.
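The speedup comes from escaping interpreter overhead, and you can observe that effect without BridgeRust at all: CPython's C-implemented builtins already beat an equivalent pure-Python loop by a wide margin, and a compiled Rust core widens the gap further. A quick, hypothetical harness (absolute numbers vary by machine):

```python
import timeit

data = list(range(1_000_000))

def py_sum(xs):
    # Pure-Python loop: every iteration pays interpreter overhead.
    total = 0
    for x in xs:
        total += x
    return total

# sum() runs its loop in native C code -- the same reason a
# compiled Rust core outruns an interpreted hot path.
t_py = timeit.timeit(lambda: py_sum(data), number=5)
t_native = timeit.timeit(lambda: sum(data), number=5)

assert py_sum(data) == sum(data)
print(f"pure Python: {t_py:.3f}s, native: {t_native:.3f}s")
```

The exact ratio depends on the workload, but the native loop is reliably several times faster.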
Real benchmarks from Embex.
Type Safety
Compile-time guarantees across language boundaries. Automatic type mapping: Vec<T> → List[T] / T[].
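As a sketch of what that mapping means on the Python side, here is a hypothetical pure-Python stand-in for a bridged method (the function name and messages are illustrative, not generated output): Rust's `Vec<f64>` surfaces as `list[float]`, and a Rust `Result` surfaces as a return value or a raised exception.

```python
# Hypothetical Python-side view of a bridged method.
# Rust `Vec<f64>` maps to `list[float]`; the Err branch of a
# Rust `Result<String>` maps to a raised exception.
def process(data: list[float]) -> str:
    if not data:
        # Mirrors returning Err(...) from Rust.
        raise ValueError("empty input")
    return f"processed {len(data)} values"

print(process([1.0, 2.5, 3.0]))  # processed 3 values
```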
Catch errors at compile time:
```rust
#[method]
fn process(&self, data: Vec<f64>) -> Result<String>
```

Open Source
MIT licensed. Free forever. Built on battle-tested PyO3 and NAPI-RS.
Growing community of contributors.
Universal SDKs & Clients
Write one API client in Rust. Publish to both PyPI and npm. Perfect for: - Database drivers - API wrappers - Cloud service clients - Authentication libraries
High-Performance Data Processing
Process gigabytes of data with Rust’s speed. Let users script in Python/JavaScript: - ETL pipelines - Data transformations - Analytics engines
System Tools & Utilities
Build powerful system-level tools that integrate with both ecosystems: - File processors - Network utilities - Compression libraries - Cryptography tools
Real-Time Services
High-throughput services for production use: - WebSocket servers - Task queues - Message brokers - Event processors
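The task-queue pattern these services build on can be sketched with the standard library alone (illustration only; a BridgeRust-backed queue would move this hot loop into Rust):

```python
import queue
import threading

tasks: queue.Queue = queue.Queue()
results = []

def worker():
    # Drain tasks until a None sentinel arrives.
    while True:
        item = tasks.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real work
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()
for i in range(5):
    tasks.put(i)
tasks.put(None)  # signal shutdown
t.join()
print(results)  # [0, 2, 4, 6, 8]
```

In pure Python, the queue's dispatch loop is the bottleneck at high throughput; that loop is exactly what a native core replaces.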
See what’s possible with the framework:
| Feature | BridgeRust | PyO3 Only | NAPI-RS Only | Pure Python/Node |
|---|---|---|---|---|
| Write Once | ✅ Yes | ❌ Python only | ❌ Node only | ✅ Yes |
| Python Support | ✅ Native | ✅ Native | ❌ No | ✅ Native |
| Node.js Support | ✅ Native | ❌ No | ✅ Native | ✅ Native |
| Performance | 🚀 10-50x | 🚀 10-50x | 🚀 10-50x | 🐌 Baseline |
| Code Duplication | ✅ None | ❌ High | ❌ High | ✅ None |
| Type Safety | ✅ Compile-time | ✅ Compile-time | ✅ Compile-time | ⚠️ Runtime |
| Learning Curve | 📚 Medium | 📚 Hard | 📚 Hard | 📘 Easy |
The Best of All Worlds: Write once, native performance, both ecosystems.
Watch the framework being built in public. Follow along as real libraries are created with BridgeRust.
Start building universal libraries
MIT licensed • Free forever • Open source
No signup required • No tracking • No data collection