Merged
2 changes: 0 additions & 2 deletions .github/workflows/build-nodejs.yml
@@ -5,8 +5,6 @@ on:
branches: [master, main]
tags:
- 'v*'
pull_request:
branches: [master, main]
workflow_dispatch:
inputs:
release_tag:
3 changes: 3 additions & 0 deletions .gitignore
@@ -22,6 +22,9 @@
.vscode/
*.iml

# Vite
.vite/

# Ignore Node.js-specific files
node_modules/
npm-debug.log
82 changes: 79 additions & 3 deletions bindings/typescript/README.md
@@ -5,6 +5,7 @@ Remote connect to major Lightning node implementations with one TypeScript interface
- Supports major nodes: CLN, LND, Phoenixd
- Supports protocols: BOLT11, BOLT12, NWC
- Includes custodial APIs: Strike, Speed, Blink
- Includes Spark (`SparkNode`) with pure TypeScript signer patching (no UniFFI bindings)
- LNURL + Lightning Address support (`user@domain.com`, `lnurl1...`)
- Frontend-capable TypeScript runtime (`fetch`-based)

@@ -59,6 +60,80 @@ const status = await node.lookupInvoice({ paymentHash: invoice.paymentHash });
const txs = await node.listTransactions({ from: 0, limit: 10 });
```

### Spark (browser + Expo Go oriented)

```ts
import { createNode, installSparkRuntime } from '@sunnyln/lni';

// One-time runtime setup for browser / React Native / Expo Go.
// - Adds Buffer + atob/btoa polyfills when missing
// - Wraps fetch for gRPC-web body reader compatibility
// - Optionally injects API key header into same-origin Spark HTTP calls
const sparkRuntime = installSparkRuntime({
apiKey: 'optional-api-key',
apiKeyHeader: 'x-api-key',
apiKeySameOriginOnly: true, // default true; safer for browser CORS
});

const sparkNode = createNode({
kind: 'spark',
config: {
mnemonic: 'abandon ...', // store securely in production
network: 'mainnet', // or regtest/testnet/signet/local
// optional:
// passphrase: '...',
// defaultMaxFeeSats: 20,
// sparkOptions: { ...sdk options... },
// sdkEntry: 'auto' | 'bare' | 'native' | 'default'
},
});

const sparkInfo = await sparkNode.getInfo();
const sparkInvoice = await sparkNode.createInvoice({
amountMsats: 25_000,
description: 'Spark invoice',
});

// Optional cleanup (restores original global fetch if installSparkRuntime changed it)
sparkRuntime.restore();
```

### How does the pure TypeScript Spark signer work?

The Spark protocol uses **2-of-2 threshold Schnorr signatures (FROST)** for every operation — sending, receiving, even wallet initialization. The official Spark SDK (`@buildonspark/spark-sdk`) ships a **Rust WASM binary** that handles all the FROST cryptography. That works in Node.js, but breaks in **browsers and Expo/React Native** where WASM loading is restricted or unavailable.

We didn't write a FROST library from scratch. We used two existing TypeScript packages:
- **`@frosts/core`** — generic FROST protocol types and operations
- **`@frosts/secp256k1-tr`** — secp256k1 + Taproot (BIP-340) specific implementation

Then in `spark.ts`, we wrote two functions that **replace** the SDK's WASM calls:

1. **`pureSignFrost()`** — computes the user's signature share. This is the user's half of the 2-of-2 signing. It takes the user's secret key, nonces, and the signing package (commitments from both the user and the statechain operator), then produces a signature share using the FROST round2 math.

2. **`pureAggregateFrost()`** — combines the user's share with the statechain operator's share into a final Schnorr signature. For the adaptor path (Lightning payments via swap), it also handles adaptor signature construction where the signature is offset by an adaptor point.

These get injected into the Spark SDK via `setSparkFrostOnce()`, which overrides the SDK's default WASM signer with our pure TS implementation.
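
A minimal sketch of that install-once pattern (the `FrostSigner` shape and function bodies below are assumptions for illustration, not the real SDK types):

```typescript
// Hypothetical signer interface standing in for the SDK's WASM-backed one.
type FrostSigner = {
  signFrost: (signingInput: Uint8Array) => Uint8Array;
  aggregateFrost: (shares: Uint8Array[]) => Uint8Array;
};

let installedSigner: FrostSigner | null = null;

// Install-once guard: the first caller wins and later calls are no-ops,
// so the pure TS signer cannot be accidentally replaced mid-session.
function setSparkFrostOnce(signer: FrostSigner): boolean {
  if (installedSigner !== null) return false;
  installedSigner = signer;
  return true;
}

// Placeholders standing in for pureSignFrost() / pureAggregateFrost().
const pureTsSigner: FrostSigner = {
  signFrost: (input) => input,
  aggregateFrost: (shares) => shares[0],
};

console.log(setSparkFrostOnce(pureTsSigner)); // true — installed
console.log(setSparkFrostOnce(pureTsSigner)); // false — already installed
```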

The tricky part wasn't the basic FROST math — it was matching the exact behavior of Spark's **custom FROST variant** ("nested FROST" from Lightspark's fork). In nested FROST the user is always a singleton participant group, so the user's Lagrange interpolation coefficient (lambda) is always 1, while the statechain operators form their own group. This differs from standard FROST, where Lagrange coefficients are computed across all participants.
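
The lambda-equals-1 property can be checked directly: the Lagrange coefficient is an empty product for a singleton signer set. A self-contained sketch over the secp256k1 scalar field (illustrative, not the `@frosts` implementation):

```typescript
// Secp256k1 group order (the scalar field FROST signatures live in).
const Q = 0xfffffffffffffffffffffffffffffffebaaedce6af48a03bbfd25e8cd0364141n;

const mod = (a: bigint): bigint => ((a % Q) + Q) % Q;

// Modular exponentiation; Q is prime, so a^(Q-2) is the inverse of a.
function modPow(base: bigint, exp: bigint): bigint {
  let result = 1n;
  base = mod(base);
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % Q;
    base = (base * base) % Q;
    exp >>= 1n;
  }
  return result;
}
const inv = (a: bigint): bigint => modPow(a, Q - 2n);

// Standard FROST Lagrange coefficient for participant `i` in signer set `S`:
// lambda_i = prod_{j in S, j != i} j / (j - i)  (mod Q).
function lagrange(i: bigint, S: bigint[]): bigint {
  let num = 1n;
  let den = 1n;
  for (const j of S) {
    if (j === i) continue;
    num = mod(num * j);
    den = mod(den * (j - i));
  }
  return mod(num * inv(den));
}

// Standard FROST with two signers: non-trivial coefficients.
console.log(lagrange(1n, [1n, 2n])); // 2n
// Nested FROST's singleton user group: the product is empty, lambda = 1.
console.log(lagrange(1n, [1n])); // 1n
```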

Three cryptographic bugs were discovered and fixed during implementation:

- **Adaptor signature validation key**: The adaptor z-candidate validation was checking against the user's individual public key instead of the aggregated group key. The BIP-340 challenge hash depends on the full group key, so validation was effectively random (~50% correct).

- **Non-adaptor aggregate path**: The `@frosts` library's built-in `aggregateWithTweak()` function broke for two reasons: (a) JavaScript Maps use reference equality for object keys, and different `Identifier` instances for the same logical signer don't match, and (b) the library computed binding factors with a slightly different key than what was used during signing. Fixed by doing manual aggregation — sum the shares and serialize — matching the adaptor path.

- **Payment completion polling**: The SDK's `payLightningInvoice()` returns immediately after initiating the swap, before the Lightning payment settles. Added polling via `getLightningSendRequest` to wait for the terminal status and return the preimage.
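
The Map pitfall in the second bullet is easy to reproduce; `Identifier` here is a minimal stand-in for the library's type, not the real class:

```typescript
// JS Maps compare object keys by reference, not by value: two
// Identifier-like objects with the same contents are different keys.
class Identifier {
  constructor(readonly id: number) {}
}

const shares = new Map<Identifier, bigint>();
shares.set(new Identifier(1), 42n);
console.log(shares.get(new Identifier(1))); // undefined — lookup misses

// Workaround: key by a canonical serialization instead of the object.
const byValue = new Map<string, bigint>();
const keyOf = (i: Identifier) => `id:${i.id}`;
byValue.set(keyOf(new Identifier(1)), 42n);
console.log(byValue.get(keyOf(new Identifier(1)))); // 42n
```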

Reference implementation for Spark's FROST behavior: [buildonspark/spark](https://github.com/buildonspark/spark) (uses Rust WASM, not `@frosts`).
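
The completion-polling fix generalizes to a small helper. This is a hedged sketch — `pollUntilTerminal` and its callbacks are illustrative names, not the actual code around `getLightningSendRequest`:

```typescript
// Poll a status lookup until the status is terminal or a timeout elapses.
// `getStatus` stands in for the SDK's getLightningSendRequest call.
async function pollUntilTerminal<T>(
  getStatus: () => Promise<T>,
  isTerminal: (status: T) => boolean,
  opts: { intervalMs?: number; timeoutMs?: number } = {},
): Promise<T> {
  const { intervalMs = 1000, timeoutMs = 60_000 } = opts;
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const status = await getStatus();
    if (isTerminal(status)) return status;
    if (Date.now() >= deadline) {
      throw new Error('timed out waiting for terminal payment status');
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}

// Example: wait for a simulated payment to settle.
const states = ['PENDING', 'PENDING', 'SUCCEEDED'];
let call = 0;
pollUntilTerminal(
  async () => states[Math.min(call++, states.length - 1)],
  (s) => s === 'SUCCEEDED',
  { intervalMs: 10, timeoutMs: 1_000 },
).then((finalStatus) => console.log(finalStatus)); // logs 'SUCCEEDED'
```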

Spark entrypoint behavior:
- `sdkEntry: 'auto'` (default) uses a browser-safe bundled Spark bare runtime in browser/Expo and falls back to the default SDK entry in Node.
- `sdkEntry: 'bare'` forces the browser-safe bundled no-WASM/no-native path.
- `sdkEntry: 'default'` uses the standard SDK export (may load WASM).
- `sdkEntry: 'native'` uses the React Native native SDK entry.

Node-only note: use `sdkEntry: 'default'` for Node environments.
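
The selection rules above can be summarized in a tiny resolver. This is an illustrative sketch of the documented behavior, not the package's internal code:

```typescript
type SdkEntry = 'auto' | 'bare' | 'native' | 'default';

// Resolve 'auto' per the documented behavior: the bundled bare runtime in
// browser / Expo, the standard SDK entry in Node. Explicit values pass through.
function resolveSdkEntry(
  entry: SdkEntry,
  env: { isNode: boolean },
): Exclude<SdkEntry, 'auto'> {
  if (entry !== 'auto') return entry;
  return env.isNode ? 'default' : 'bare';
}

console.log(resolveSdkEntry('auto', { isNode: true })); // 'default'
console.log(resolveSdkEntry('auto', { isNode: false })); // 'bare'
console.log(resolveSdkEntry('native', { isNode: false })); // 'native'
```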

For NWC specifically, `createNode` returns `NwcNode` when `kind: 'nwc'`, so you can close it:

```ts
@@ -104,14 +179,15 @@ await node.onInvoiceEvents(
- `StrikeNode`
- `SpeedNode`
- `BlinkNode`
- `SparkNode`
- LNURL helpers (`detectPaymentType`, `needsResolution`, `resolveToBolt11`, `getPaymentInfo`)

Not included yet:
- `SparkNode` (planned)

## Frontend Runtime Notes

- Uses `fetch`, no Node-native runtime dependency required.
- Spark no longer requires project-level Spark shims/vendor bundles; those are packaged in `@sunnyln/lni`.
- For local `file:` package development with Expo, build the package first (run `npm run build` in `bindings/typescript`) and use the Expo example's `metro.config.js` pattern for `./dist/*` resolution.
- If Expo shows `Invalid call ... import(specifier)` in `dist/nodes/spark.js`, rebuild `bindings/typescript` and restart Metro with cache clear (`npx expo start --clear`).
- You can inject custom fetch via constructor options:
- `new LndNode(config, { fetch: customFetch })`
- Most backends require secrets (API keys, macaroons, runes, passwords). For production web apps, use a backend proxy/BFF to protect credentials.
41 changes: 41 additions & 0 deletions bindings/typescript/examples/spark-expo-go/.gitignore
@@ -0,0 +1,41 @@
# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files

# dependencies
node_modules/

# Expo
.expo/
dist/
web-build/
expo-env.d.ts

# Native
.kotlin/
*.orig.*
*.jks
*.p8
*.p12
*.key
*.mobileprovision

# Metro
.metro-health-check*

# debug
npm-debug.*
yarn-debug.*
yarn-error.*

# macOS
.DS_Store
*.pem

# local env files
.env*.local

# typescript
*.tsbuildinfo

# generated native folders
/ios
/android