My Wiggum Loop
tl;dr
- A simple zsh Wiggum Loop can drive Codex effectively from PROMPT.md.
- The prompt works for code when it enforces TODOs, tests, and conventional commits.
- I use plan mode first, then keep tightening evaluation checks while the loop runs.
- This approach shipped uniffi-bindgen-node-js in 79 commits, but it has not helped docs.
This is how I’m currently writing most of my code:
$ wiggle
# Run codex in a simple loop against PROMPT.md.
#
# Behavior:
# - Reads input from PROMPT.md on each iteration.
# - Stops if PROMPT.md is missing, empty, or only contains blank lines.
# - Stops if the last non-empty line of PROMPT.md is exactly ✅.
# - Prints codex output live.
# - Prints codex's exit code after each run.
# - Stops if codex exits with a non-zero status.
# - Sleeps 1 second between iterations.
# - Exits cleanly on Ctrl-C.
# - Rings the terminal bell when the loop exits.
#
# Usage:
# wiggle [extra codex exec args...]
#
# Notes:
# - Put this function in ~/.zshrc, then run: source ~/.zshrc
# - PROMPT.md is resolved relative to your current working directory.
wiggle() {
  local i=1
  local last_line
  local exit_code
  trap 'echo; echo "wiggle stopped at iteration $i"; return' INT
  while :; do
    last_line="$(awk 'NF { last=$0 } END { print last }' PROMPT.md 2>/dev/null)"
    if [[ -z "$last_line" || "$last_line" == "✅" ]]; then
      echo "PROMPT.md is missing, empty, blank, or marked complete; stopping"
      break
    fi
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] wiggle iteration $i"
    codex exec "$@" - < PROMPT.md
    exit_code=$?
    echo "exit code: $exit_code"
    if (( exit_code != 0 )); then
      echo "codex failed; stopping"
      break
    fi
    ((i++))
    sleep 1
  done
  printf '\a'
}
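The stop check is worth understanding on its own: awk remembers the last non-empty line of PROMPT.md, and the loop halts when that line is missing or is exactly ✅. Here is a minimal standalone sketch of that logic, using a throwaway file instead of a real PROMPT.md:

```shell
# Build a throwaway prompt file whose last non-empty line is the sentinel.
tmp="$(mktemp)"
printf 'do the next TODO\n\n✅\n\n' > "$tmp"

# Same extraction the loop uses: remember each non-empty line, print the last.
last_line="$(awk 'NF { last=$0 } END { print last }' "$tmp" 2>/dev/null)"

if [ -z "$last_line" ] || [ "$last_line" = "✅" ]; then
  echo "stop"      # file missing, empty, blank, or marked complete
else
  echo "continue"  # keep iterating
fi

rm -f "$tmp"
```

With the ✅ present this prints `stop`; remove that line and the last non-empty line becomes the TODO text, so it prints `continue`.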
It’s a Ralph Wiggum loop, and it’s shockingly effective if you use the right prompt.
Here’s an actual prompt I used to generate the initial version of my uniffi-bindgen-node-js:
SlateDB-First Node UniFFI Bindgen
# SlateDB-First Node UniFFI Bindgen
## Summary
- Turn this repo into a single Cargo package that exposes a reusable NodeBindingGenerator library plus a uniffi-bindgen-node-js CLI.
- Target UniFFI 0.29.x first so the generator matches SlateDB’s current bindings/uniffi crate.
- Use UniFFI library-mode generation + Askama templates to emit one ESM npm package per invocation, with ready-to-consume .js + .d.ts output.
- Replace ffi-rs with koffi, and scope v1 to the UniFFI feature set SlateDB actually needs: objects, constructors, sync/async methods, records, flat/tagged enums, error enums, Option, Vec, HashMap, bytes, and synchronous callback interfaces.
## Public Interfaces
- CLI: generate <lib_source> --crate-name <crate> --out-dir <dir> plus --package-name, --cdylib-name, --node-engine, --lib-path-literal, --manual-load, and --config-override.
- UniFFI config: add [bindings.node] support with the same knobs; defaults are package name = crate namespace, node engine = >=16, auto-load enabled, and library path = sibling native library file.
- Generated package API conventions:
- bytes -> Uint8Array; Node Buffer inputs work naturally.
- Option<T> -> T | undefined, Vec<T> -> Array<T>, HashMap<K,V> -> Map<K,V>, i64/u64 -> bigint.
- Records -> plain JS objects + .d.ts interfaces.
- Flat enums -> frozen runtime constant objects with string literal values.
- Tagged enums -> tagged JS objects/classes.
- UniFFI errors -> Error subclasses.
- Objects -> JS classes backed by UniFFI handles.
- Generated files: package.json, index.js, index.d.ts, one component API pair (<namespace>.js/.d.ts), one low-level FFI pair (<namespace>-ffi.js/.d.ts), and shared runtime helper modules under runtime/.
## Implementation Changes
- Add src/lib.rs, src/bindings/*, src/subcommands/*, templates/**/*, and askama.toml; keep this repo as one Cargo package, not a workspace.
- Port the old BindingGenerator/library-mode shape from /tmp/uniffi-bindgen-node, but simplify it for JS output and SlateDB-first scope.
- Pin uniffi/uniffi_bindgen to the SlateDB-compatible 0.29 line first.
- Build a Node runtime around koffi:
- declare UniFFI FFI functions, RustBuffer, ForeignBytes, RustCallStatus, and callback-interface vtables from ci.ffi_definitions().
- port/adapt the minimal runtime pieces from uniffi-bindgen-react-native: errors, ffi-types, ffi-converters, rust-call, async-rust-call, handle-map, callbacks, and objects.
- normalize Koffi 64-bit integer marshalling so the public API always exposes bigint.
- make load() idempotent; on first load, open the library, register callback vtables, then run UniFFI contract-version and checksum checks.
- use registered Koffi callbacks for long-lived callback-interface vtables and the global Rust-future continuation callback.
- Keep native library resolution simple in v1: prefer configured platform/arch native packages, fall back to sibling-library loading, and retain a literal-path override.
- Reject unsupported v1 constructs with explicit generator errors: custom types, external types, async callback-interface methods, CommonJS output, and multi-package/platform-switch packaging.
## Test Plan
- Rust unit tests for config parsing, path resolution, Koffi FFI type mapping, bytes/enum/error naming, and unsupported-feature diagnostics.
- Snapshot/codegen tests using small local fixture crates that cover:
- objects + constructors + async methods
- flat/tagged enums and error enums
- records, Option, Vec, Map, bytes
- synchronous callback interfaces
- End-to-end Node smoke tests:
- build a small fixture cdylib, generate a package into /tmp, install npm deps, and execute a JS smoke script.
- run the same smoke flow against local SlateDB bindings/uniffi without modifying that repo; the script should import the generated package, create Settings.default(), mutate settings via set()/to_json_string(), create a WriteBatch and put()/delete() using Buffer, and call init_logging() with a JS callback implementation.
- Acceptance criteria:
- generated SlateDB bindings import cleanly in plain JS and TypeScript.
- async methods resolve through the Rust-future polling path.
- callback interfaces register and round-trip without handle leaks or stale-handle errors in normal use.
## Assumptions And Defaults
- ESM-only output for v1.
- JS + .d.ts are generated directly from Askama templates; downstream packages do not need a TypeScript build step.
- UniFFI 0.29.x compatibility is the first-class target; widening to newer UniFFI versions is follow-up work after SlateDB is green.
- Runtime choice is koffi, because its docs explicitly cover aggregate C types, JS callbacks, and registered callbacks for long-lived native interactions: https://koffi.dev/ and https://koffi.dev/callbacks
## Instructions
You are in a Ralph Wiggum loop. You are making progress on the plan defined above. Work through the first few TODOs in the `## TODO` section below.
- update PROMPT.md with an updated TODO list after each change
- never ever change any PROMPT.md text _except_ the items in the `## TODO` section
- you may update the TODO items as you see fit: remove outdated items, add new items, mark items as completed
- commit after each completed TODO
- use conventional commit syntax for commit messages
- if there are no items left in the `## TODO` section, append a final line in PROMPT.md that contains only the emoji: ✅
## TODO
- [x] Convert the repo from the current stub into a real single-package Cargo crate with both a library target and the `uniffi-bindgen-node-js` binary.
- [x] Add `src/lib.rs` and move CLI startup out of `src/main.rs` into reusable modules.
- [x] Add `askama.toml` and a `templates/` directory for all generated output.
- [x] Add Rust dependencies for `anyhow`, `askama`, `camino`, `cargo_metadata`, `clap`, `heck`, `serde`, `serde_json`, `textwrap`, `toml`, `uniffi`, and `uniffi_bindgen`.
- [x] Pin `uniffi` and `uniffi_bindgen` to the UniFFI 0.29 line to match SlateDB.
- [x] Implement a `generate` subcommand that accepts `lib_source`, `--crate-name`, and `--out-dir`.
- [x] Add CLI options for `--package-name`, `--cdylib-name`, `--node-engine`, `--lib-path-literal`, `--manual-load`, and `--config-override`.
- [x] Make the CLI call UniFFI library-mode generation with a custom `NodeBindingGenerator`.
- [x] Parse `[bindings.node]` from `uniffi.toml` and merge it with CLI overrides.
- [x] Fail with clear errors when required inputs are missing or invalid.
- [x] Add `src/bindings/mod.rs` with a `NodeBindingGenerator` that implements `uniffi_bindgen::BindingGenerator`.
- [x] Add a config struct for `[bindings.node]` settings and defaults.
- [x] Write generated files into one output package directory per invocation.
- [x] Generate `package.json`, `index.js`, `index.d.ts`, `<namespace>.js`, `<namespace>.d.ts`, `<namespace>-ffi.js`, and `<namespace>-ffi.d.ts`.
- [x] Generate shared runtime helper files under `runtime/`.
- [x] Add a generator-side component model that collects top-level functions, objects, constructors, methods, records, enums, and errors before template rendering.
- [x] Reject unsupported v1 inputs up front with explicit generator errors for custom types, external types, and callback interfaces that are still waiting on runtime support.
- [x] Render public API types for `bytes`, `i64/u64`, `Option<T>`, `Vec<T>`, `HashMap<K, V>`, and the nested combinations SlateDB needs.
- [x] Generate public JS + `.d.ts` skeletons for top-level functions, objects, constructors, methods, records, flat enums, tagged enums, and error enums.
- [x] Support synchronous callback interfaces needed by SlateDB.
- [x] Reject unsupported v1 features with explicit generator errors: async callback-interface methods, CommonJS output, and multi-package platform-switch packaging.
- [x] Add `koffi` as the generated package FFI dependency instead of `ffi-rs`.
- [x] Implement library loading and symbol binding with `koffi`.
- [x] Declare runtime representations for UniFFI `RustBuffer`, `ForeignBytes`, `RustCallStatus`, handles, and callback vtables.
- [x] Normalize Koffi 64-bit values so generated bindings consistently expose `bigint`.
- [x] Make library loading idempotent.
- [x] Support automatic library loading by default.
- [x] Support `--manual-load` by exporting explicit load and unload helpers.
- [x] Support sibling-library lookup by default plus literal-path override support.
- [x] Add a `runtime/errors.js` + `.d.ts` module with UniFFI error helpers and internal runtime errors.
- [x] Add a `runtime/ffi-types.js` + `.d.ts` module with `RustBuffer` and raw byte helpers.
- [x] Add a `runtime/ffi-converters.js` + `.d.ts` module for primitive, optional, sequence, map, timestamp, duration, string, and byte-array converters.
- [x] Add a `runtime/rust-call.js` + `.d.ts` module for sync Rust call handling and `RustCallStatus` checking.
- [x] Add a `runtime/async-rust-call.js` + `.d.ts` module for Rust future polling, completion, cancellation, and cleanup.
- [x] Add a `runtime/handle-map.js` + `.d.ts` module for foreign callback/object handles.
- [x] Add a `runtime/callbacks.js` + `.d.ts` module for callback-interface registration and callback error propagation.
- [x] Add a `runtime/objects.js` + `.d.ts` module for UniFFI object factories, object converters, and destruction semantics.
- [x] Generate Koffi declarations for every FFI function in `ci.ffi_definitions()`.
- [x] Generate Koffi struct definitions for every UniFFI FFI struct used by the component.
- [x] Generate callback declarations for callback-interface methods and Rust future continuation callbacks.
- [x] Generate low-level wrappers for checksum functions and the UniFFI contract-version function.
- [x] Run contract-version validation at initialization time.
- [x] Run checksum validation at initialization time.
- [x] Generate public object classes for UniFFI objects.
- [x] Generate constructor wrappers that lift returned handles into JS objects.
- [x] Generate sync method wrappers that lower arguments, call FFI, check `RustCallStatus`, and lift results.
- [x] Generate async method wrappers that create, poll, complete, and free Rust futures.
- [x] Generate record type definitions and record converters.
- [x] Generate flat enum runtime representations and converters.
- [x] Generate tagged enum runtime representations and converters.
- [x] Generate error classes and error converters.
- [x] Generate callback-interface wrappers for `LogCallback` and `MergeOperator`.
- [x] Register callback-interface vtables during package initialization.
- [x] Confirm generated bindings compile against the full SlateDB UniFFI surface without unsupported-feature failures.
- [x] Ensure `HashMap<String, i64>` maps cleanly to `Map<string, bigint | number>` according to the chosen converter rules.
- [x] Ensure nested `Vec<Vec<u8>>` arguments and returns work correctly.
- [x] Ensure `Option<Vec<u8>>` and `Option<Arc<Object>>` patterns work correctly.
- [x] Ensure synchronous callback interfaces work for `init_logging` and merge operators.
- [x] Ensure async methods work for DB, reader, iterator, snapshot, transaction, and WAL APIs.
- [x] Create a minimal local UniFFI fixture crate for objects, records, enums, errors, async methods, and bytes.
- [x] Create a local fixture crate for synchronous callback interfaces.
- [x] Add Rust tests that snapshot generated JS and `.d.ts` output for the fixtures.
- [x] Add Rust tests for config parsing and output path resolution.
- [x] Add Rust tests for unsupported-feature diagnostics.
- [x] Add Rust tests for generated checksum and contract-version initialization code.
- [x] Build a fixture cdylib during tests.
- [x] Generate a Node package into a temp directory.
- [x] Install npm dependencies in the temp directory.
- [x] Run a plain JS smoke script that imports the generated package and exercises sync and async calls.
- [x] Run a TypeScript smoke script or `tsc --noEmit` check against the generated `.d.ts` output.
- [x] Verify that passing Node `Buffer` values into `Uint8Array` byte parameters works correctly.
- [x] Build `/Users/chrisriccomini/Code/slatedb/bindings/uniffi` as a cdylib without modifying the SlateDB repo.
- [x] Generate a Node package from the built SlateDB library into a temp directory.
- [x] Install npm dependencies for the generated package.
- [x] Run a smoke script that imports the generated SlateDB package.
- [x] In the smoke script, call `Settings.default()`, `set()`, and `to_json_string()`.
- [x] In the smoke script, create a `WriteBatch`, call `put()` with `Buffer` keys and values, and call `delete()`.
- [x] In the smoke script, call `init_logging()` with a JS callback implementation and verify callback delivery.
- [x] Treat successful import plus these calls as the first end-to-end acceptance gate.
- [x] Add a README for this repo describing installation, CLI usage, supported UniFFI features, and current limitations.
- [x] Document that v1 is ESM-only and emits ready-to-consume `.js` + `.d.ts`.
- [x] Document that v1 targets UniFFI 0.29.x first.
- [x] Run Rust tests last.
- [x] Run the end-to-end Node tests after Rust tests pass.
- [ ] Remove the validation in src/bindings/mod.rs that errors when lib_path_modules is set.
- [ ] Parse lib_path_modules as structured native package entries with platform and arch fields.
- [ ] Generate runtime loader code that picks the matching native package for the current process.platform and process.arch.
- [ ] Fall back to sibling-library loading when no native package config is present.
- [ ] Throw a clear runtime error when native package config exists but nothing matches the current platform.
- [ ] Add a native-package subcommand that emits one npm package per platform/arch.
- [ ] Let the main package declare optionalDependencies on native packages.
- [ ] Add config for native package names and supported platform/arch targets.
- [ ] Add config for publish metadata like version, description, license, and repository.
- [ ] Stop hardcoding "0.0.0" as the generated package version.
- [ ] Update the main package.json template to be publish-ready.
- [ ] Add native-package templates for package.json, JS entrypoint, .d.ts, and README.
- [ ] Copy the built .so/.dylib/.dll into generated native-package output.
- [ ] Make runtime loading try installed native packages before sibling-library fallback.
- [ ] Keep sibling-library loading working for local development.
- [ ] Keep manual_load working with native-package resolution.
- [ ] Emit clear runtime errors when no matching native package is installed.
- [ ] Support structured platform/arch module entries, not just raw strings.
- [ ] Add tests for parsing the new publish and native-package config.
- [ ] Add tests for generating the main package with optionalDependencies.
- [ ] Add tests for generating native packages with correct os and cpu.
- [ ] Add runtime loader tests for matching, missing, and incompatible native packages.
- [ ] Add npm pack --dry-run tests for both main and native packages.
- [ ] Add smoke tests that install a generated main package plus a local native package.
- [ ] Document published-package generation separately from local generation.
- [ ] Standardize docs and examples on the uniffi-bindgen-node-js binary name.
I use it like this:
wiggle --add-dir "$HOME/.cargo" -c sandbox_workspace_write.network_access=true
Or, often, just:
wiggle
The prompt above ran for five hours, made 79 commits, and generated my uniffi-bindgen-node-js repo. I iterated on the project with several more Wiggum Loops, which I will talk about in another post.
Some important notes about this workflow:
- Use plan mode to generate the prompt.
- Tell the LLM to take a pass on the TODOs and break up any that seem too big or complex.
- Don’t be afraid to modify the PROMPT.md while the loop is running.
- I only use Codex 5.4 xhigh. As soon as there’s a new frontier model, I’ll switch to that.
- Make sure the prompt includes whatever code coverage, linting, formatting, and testing commands you want run on every iteration. This keeps things from getting too far off the rails.
- The prompt needs to include a way to evaluate progress and make sure things haven’t regressed. Tests, compiler checks, code quality, cyclomatic complexity, soak tests, performance tests, and so on. You really need to overdo it. Fortunately, you can have the LLMs write this stuff, too.
- I like to have the loop commit its changes as it goes. I instruct it to use conventional commit syntax, and I have been using a single-line style for that. It occurred to me that having it write really nice long-form commit messages in the body would be a great way to document what it’s doing. I haven’t tried this yet, but I suspect it will help the LLM itself keep track of what it’s doing and why over consecutive iterations.
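A long-form conventional commit along those lines might look like the following. The scope, details, and file names here are invented for illustration; the point is that the body records the reasoning, not just the change:

```
fix(runtime): release callback handles when vtable is unregistered

The handle map kept entries alive after unload because the teardown path
only cleared the vtable pointer. Free each registered handle during
unregistration so repeated load/unload cycles do not accumulate entries.

Reproduced with the load/unload soak script before the change and
verified the entry count stays flat afterward.
```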
The example above shows a Wiggum Loop that completes, but I sometimes just let the Wiggum Loop run forever. In that case, the instructions look like this:
Memory Leak Detection and Fixing
Use the leak and soak harnesses in this repo to find one real, reproducible memory leak, fix the root cause, and verify that the fix is correct. Don't worry about inefficient memory usage; target only actual memory leaks that could cause a long-running process to run out of memory.
Start by reproducing a leak with the existing workflows documented in `DEVELOPMENT.md`.
Runtime leak setup:
- `cargo run --example runtime_leak_prep -- basic --out-dir /tmp/uniffi-basic-leaks`
- `cargo run --example runtime_leak_prep -- callbacks --out-dir /tmp/uniffi-callback-leaks`
- `cargo run --example runtime_leak_prep -- basic --manual-load --out-dir /tmp/uniffi-basic-manual-leaks`
Runtime soak probes:
- `UNIFFI_LEAK_PACKAGE_DIR=/tmp/uniffi-basic-leaks /tmp/node-v22.22.2-darwin-arm64/bin/node --expose-gc scripts/leaks/runtime-basic-soak.mjs`
- `UNIFFI_LEAK_PACKAGE_DIR=/tmp/uniffi-basic-leaks /tmp/node-v22.22.2-darwin-arm64/bin/node --expose-gc scripts/leaks/runtime-basic-soak.mjs --scenario bytes`
- `UNIFFI_LEAK_PACKAGE_DIR=/tmp/uniffi-basic-leaks /tmp/node-v22.22.2-darwin-arm64/bin/node --expose-gc scripts/leaks/runtime-basic-soak.mjs --scenario objects`
- `UNIFFI_LEAK_PACKAGE_DIR=/tmp/uniffi-basic-leaks /tmp/node-v22.22.2-darwin-arm64/bin/node --expose-gc scripts/leaks/runtime-basic-soak.mjs --scenario async`
- `UNIFFI_LEAK_PACKAGE_DIR=/tmp/uniffi-callback-leaks /tmp/node-v22.22.2-darwin-arm64/bin/node --expose-gc scripts/leaks/runtime-callback-soak.mjs`
- `UNIFFI_LEAK_PACKAGE_DIR=/tmp/uniffi-basic-manual-leaks /tmp/node-v22.22.2-darwin-arm64/bin/node --expose-gc scripts/leaks/runtime-load-unload-soak.mjs`
If needed, bisect further with `runtime-basic-soak.mjs --scenario ... --case ...`. Use `--baseline-only` and `--pause` when useful. On macOS, use `leaks --atExit -- ...` or `leaks <pid>` for native leak inspection.
If the runtime probes are not the best target, you may use the generator leak probe:
- `cargo run --example generator_leak_probe -- both --pause-after-warmup --pause-at-end`
Requirements:
- Choose one confirmed leak and explain how you reproduced it.
- Fix the root cause, not just the symptom.
- Re-run the relevant soak test after the change and confirm the leak is gone or materially improved.
- Review your own diff before committing. Do that review in the context of the reproduced leak, and explicitly check for ownership and lifetime mistakes, missing cleanup on error paths, behavioral regressions, and missing test coverage.
- Do not commit if your review finds problems. Fix them first.
- Do not introduce performance regressions. Run the benchmark suite with Node 22 and compare the relevant results before and after your change:
- `PATH=/opt/homebrew/opt/node@22/bin:$PATH cargo test --test node_benchmarks -- --ignored --nocapture`
- Before committing, formatting and lint checks must pass:
- `cargo fmt --check`
- `cargo clippy --all-targets -- -D warnings`
- Do not let CCN complexity increase too badly in any area. If the fix adds control-flow complexity, check the affected Rust code with the `lizard` workflow documented in `DEVELOPMENT.md` and keep any increase small and justified.
- Before committing, all required tests must pass:
- `cargo test --locked -- --ignored`
- `cargo test`
Definition of done:
- The chosen leak is reproducible before the fix.
- The relevant soak test is clean or materially improved after the fix.
- The benchmark run shows no meaningful regression in the affected area.
- Your review is complete and does not identify unresolved issues.
- `cargo fmt --check` and `cargo clippy --all-targets -- -D warnings` pass before commit.
- Both required test commands pass before commit.
## Instructions
You are in a Ralph Wiggum loop. Keep iterating until the chosen leak is fixed and fully verified.
Rules:
- never ever change any PROMPT.md text except to append the final line if there are no detected memory leaks
- run `cargo fmt --check` and `cargo clippy --all-targets -- -D warnings` before each commit
- commit after each completed fix
- use conventional commit syntax for commit messages
- if there are no detected memory leaks, append a final line in PROMPT.md that contains only the emoji: ✅
Deliverables:
- a short note describing the leak, how you reproduced it, the root cause, and why the fix is correct
- a short review summary describing what you checked before commit and any residual risk
- benchmark and test results showing the change is safe to land
For open-ended work like this, I usually forgo the TODO list and just give the loop a definition of done.
So far a Wiggum Loop has worked well for me with:
- Greenfield from-scratch projects
- Performance improvements
- Memory leak detection and fixes
- Test coverage improvements
- Decreasing code complexity (cyclomatic complexity)
I haven’t had it work well with textual documentation improvements. It works fine for code comments, but I haven’t found it useful for improving README or documentation site content.