
50x faster than pure Python
10x less memory, lower server costs
100% type safe across language boundaries


Write once in Rust. Use everywhere.

use bridgerust::prelude::*;

#[bridge]
struct Calculator {
    value: f64,
}

#[bridge_methods]
impl Calculator {
    #[constructor]
    fn new() -> Self {
        Self { value: 0.0 }
    }

    #[method]
    fn add(&mut self, x: f64) {
        self.value += x;
    }

    #[method]
    fn get_value(&self) -> f64 {
        self.value
    }
}

One Rust implementation. Two native libraries. Zero compromises.


BridgeRust unifies PyO3 and napi-rs into a single, zero-cost macro system. Write your logic once in Rust, and deploy native high-performance bindings to Python, Node.js, and beyond.
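To make the duplication concrete, here is roughly what the Calculator above looks like when each binding is written by hand. This is a sketch using the standard PyO3 and napi-rs attribute macros (module registration and build setup omitted), shown only to illustrate the two parallel layers that a single #[bridge] definition replaces; it is not BridgeRust's generated output.

// In the Python binding crate, written by hand with PyO3:
use pyo3::prelude::*;

#[pyclass]
struct Calculator {
    value: f64,
}

#[pymethods]
impl Calculator {
    #[new]
    fn new() -> Self {
        Self { value: 0.0 }
    }

    fn add(&mut self, x: f64) {
        self.value += x;
    }

    fn get_value(&self) -> f64 {
        self.value
    }
}

// ...and the same class again, in a separate Node.js binding crate with napi-rs:
use napi_derive::napi;

#[napi]
pub struct Calculator {
    value: f64,
}

#[napi]
impl Calculator {
    #[napi(constructor)]
    pub fn new() -> Self {
        Self { value: 0.0 }
    }

    #[napi]
    pub fn add(&mut self, x: f64) {
        self.value += x;
    }

    #[napi]
    pub fn get_value(&self) -> f64 {
        self.value
    }
}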


Before: Python Image Processing

  • Pillow: 50 images/sec
  • Memory: 500 MB
  • Problem: Users complain about timeouts

# Slow pure Python
for img in images:
    process(img)  # 20ms each

After: Rust + BridgeRust

  • Your Library: 2,500 images/sec
  • Memory: 50 MB
  • Result: Happy users, 80% lower costs

# Fast Rust core
process_batch(images)
# 0.4ms each
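For context, here is a minimal sketch of what the Rust side behind such a process_batch call might look like, using the #[bridge_module] attribute shown further down this page. The function name, the byte-buffer image representation, and the supported argument types are illustrative assumptions, not a published API.

use bridgerust::prelude::*;

#[bridge_module]
mod imaging {
    // Hypothetical batch entry point: the whole list crosses the
    // language boundary once, and every image is processed in
    // native code.
    pub fn process_batch(images: Vec<Vec<u8>>) -> Vec<Vec<u8>> {
        images.into_iter().map(process_one).collect()
    }

    // Placeholder for the actual image transform.
    fn process_one(img: Vec<u8>) -> Vec<u8> {
        img
    }
}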

Before: Celery Task Queue

  • Celery: 1,000 tasks/sec
  • Latency: 50ms per task
  • Problem: Can’t scale without more servers

After: Rust + BridgeRust

  • Your Queue: 50,000 tasks/sec
  • Latency: 1ms per task
  • Result: Same servers, 50x throughput
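As a rough sketch of the second case, an in-process queue exposed to Python and Node.js can be little more than a struct wrapping a VecDeque, using the same #[bridge] attributes as the Calculator example above. The type, its methods, and the assumption that Option<String> maps to None/null are illustrative, not documented behavior.

use std::collections::VecDeque;

use bridgerust::prelude::*;

#[bridge]
struct TaskQueue {
    tasks: VecDeque<String>,
}

#[bridge_methods]
impl TaskQueue {
    #[constructor]
    fn new() -> Self {
        Self { tasks: VecDeque::new() }
    }

    // Enqueue a task payload (a plain string here for simplicity).
    #[method]
    fn push(&mut self, task: String) {
        self.tasks.push_back(task);
    }

    // Dequeue the oldest task, if any.
    #[method]
    fn pop(&mut self) -> Option<String> {
        self.tasks.pop_front()
    }

    #[method]
    fn len(&self) -> u32 {
        self.tasks.len() as u32
    }
}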

What will YOU speed up?

# Install the BridgeRust CLI
cargo install bridge
# Verify installation
bridge --version
# bridge 0.1.2

You’re ready to build your first universal library.


Write Once

DRY (Don’t Repeat Yourself). Define with #[bridge] or #[bridge_module]. We generate PyO3 and NAPI-RS bindings automatically.

#[bridge_module]
mod my_lib {
    pub fn process() {} // Automatically bridged!
}

No duplicate FFI code. No maintenance burden.

Native Speed

10-50x faster than pure Python/Node. Zero runtime overhead. Compiled to native machine code.

Real benchmarks from Embex:

  • Vector search: 50x faster
  • Memory usage: 10x less

Type Safety

Compile-time guarantees across language boundaries. Automatic type mapping: Vec<T> → List[T] / T[].

Catch errors at compile time:

#[method]
fn process(&self, data: Vec<f64>) -> Result<String>
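For a slightly richer signature, the sketch below annotates how each parameter would presumably appear on the other side of the boundary. The Vec mapping is stated above; the Option and error-to-exception mappings are assumptions based on the usual PyO3 and NAPI-RS conventions rather than documented BridgeRust behavior.

#[method]
fn summarize(
    &self,
    samples: Vec<f64>,     // Python: List[float]    / TypeScript: number[]
    label: Option<String>, // assumed: Optional[str] / string | null
) -> Result<String>        // assumed: Err surfaces as a raised exception / thrown Error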

Open Source

MIT licensed. Free forever. Built on battle-tested PyO3 and NAPI-RS.

Growing community of contributors.


Universal SDKs & Clients

Write one API client in Rust. Publish to both PyPI and npm. Perfect for:

  • Database drivers
  • API wrappers
  • Cloud service clients
  • Authentication libraries

High-Performance Data Processing

Process gigabytes of data with Rust’s speed. Let users script in Python/JavaScript:

  • ETL pipelines
  • Data transformations
  • Analytics engines
  • Machine learning preprocessing

System Tools & Utilities

Build powerful system-level tools that integrate with both ecosystems:

  • File processors
  • Network utilities
  • Compression libraries
  • Cryptography tools

Real-Time Services

High-throughput services for production use:

  • WebSocket servers
  • Task queues
  • Message brokers
  • Event processors


See what’s possible with the framework:


| Feature | BridgeRust | PyO3 Only | NAPI-RS Only | Pure Python/Node |
|---|---|---|---|---|
| Write Once | ✅ Yes | ❌ Python only | ❌ Node only | ✅ Yes |
| Python Support | ✅ Native | ✅ Native | ❌ No | ✅ Native |
| Node.js Support | ✅ Native | ❌ No | ✅ Native | ✅ Native |
| Performance | 🚀 10-50x | 🚀 10-50x | 🚀 10-50x | 🐌 Baseline |
| Code Duplication | ✅ None | ❌ High | ❌ High | ✅ None |
| Type Safety | ✅ Compile-time | ✅ Compile-time | ✅ Compile-time | ⚠️ Runtime |
| Learning Curve | 📚 Medium | 📚 Hard | 📚 Hard | 📘 Easy |

The Best of All Worlds: Write once, native performance, both ecosystems.


Watch the framework being built in public. Follow along as real libraries are created with BridgeRust.


Start building universal libraries

MIT licensed • Free forever • Open source

No signup required • No tracking • No data collection