# Quick Start
Let’s get you running code in 5 minutes.
## Your First Process
Create a file called `hello.ts`:

```ts
import { run } from "jsr:@j50n/proc@0.24.6";

// Run a command and capture output
const lines = await run("echo", "Hello, proc!").lines.collect();

console.log(lines); // ["Hello, proc!"]
```
Run it:
```sh
deno run --allow-run hello.ts
```
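Deno is secure by default, so spawning child processes requires explicit permission; that is what `--allow-run` grants.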
### What just happened?
- `run()` started the `echo` command
- `.lines` converted the output to text lines
- `.collect()` gathered all lines into an array
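You don't have to collect everything, either. The output is an async iterable, so you can stream lines as they arrive. A minimal sketch:

```ts
import { run } from "jsr:@j50n/proc@0.24.6";

// Stream lines one at a time instead of buffering them all.
for await (const line of run("echo", "Hello, proc!").lines) {
  console.log(line);
}
```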
## Chaining Processes
Let’s chain commands together, like shell pipes:
```ts
import { run } from "jsr:@j50n/proc@0.24.6";

const result = await run("echo", "HELLO WORLD")
  .run("tr", "A-Z", "a-z") // Convert to lowercase
  .lines.first;

console.log(result); // "hello world"
```
Each .run() pipes the previous output to the next command’s input.
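For instance, here is a three-stage pipeline, the equivalent of `echo ... | tr ... | grep ...` in a shell. This is a sketch using standard Unix tools:

```ts
import { run } from "jsr:@j50n/proc@0.24.6";

// echo | tr | grep, expressed as chained .run() calls.
const matches = await run("echo", "one\ntwo\nthree")
  .run("tr", "a-z", "A-Z") // uppercase every line
  .run("grep", "T") // keep lines containing "T"
  .lines
  .collect();

console.log(matches); // ["TWO", "THREE"]
```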
## Working with Files
Process a file line by line:
```ts
import { read } from "jsr:@j50n/proc@0.24.6";

const errorCount = await read("app.log")
  .lines
  .filter((line) => line.includes("ERROR"))
  .count();

console.log(`Found ${errorCount} errors`);
```
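Because this example reads from the filesystem rather than spawning a process, run it with `--allow-read` (substitute whatever filename you saved the script as):

```sh
deno run --allow-read count-errors.ts
```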
## Handling Errors
Errors propagate naturally—catch them once at the end:
```ts
import { run } from "jsr:@j50n/proc@0.24.6";

try {
  await run("false") // This command exits with code 1
    .lines
    .collect();
} catch (error) {
  // `catch` bindings are typed `unknown` in Deno, so narrow before reading `.code`.
  console.error(`Command failed: ${(error as { code?: number }).code}`);
}
```
No need to check errors at each step. They flow through the pipeline and you catch them once. For details, see Error Handling.
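It doesn't matter where in the pipeline the failure happens. In this sketch the failing command sits in the middle, and the error still surfaces at the single `await` at the end:

```ts
import { run } from "jsr:@j50n/proc@0.24.6";

try {
  await run("echo", "data")
    .run("false") // fails mid-pipeline with exit code 1
    .run("cat")
    .lines
    .collect();
} catch (error) {
  // One catch handles a failure anywhere in the chain.
  console.error(`Pipeline failed: ${error}`);
}
```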
## Using Array Methods
Work with async data using familiar Array methods:
```ts
import { enumerate } from "jsr:@j50n/proc@0.24.6";

const data = ["apple", "banana", "cherry"];

const numbered = await enumerate(data)
  .enum() // Add indices
  .map(([fruit, i]) => `${i + 1}. ${fruit}`)
  .collect();

console.log(numbered);
// ["1. apple", "2. banana", "3. cherry"]
```
## A Real Example
Let’s find the 5 most recent commits that mention “fix”:
```ts
import { run } from "jsr:@j50n/proc@0.24.6";

const commits = await run("git", "log", "--oneline")
  .lines
  .filter((line) => line.includes("fix"))
  .take(5)
  .collect();

commits.forEach((commit) => console.log(commit));
```
This chains multiple operations, all streaming, using minimal memory. For more complex examples, see Recipes.
## Data Transforms (Optional)
Need to work with CSV, TSV, JSON, or Record formats? Import the transforms module:
```ts
import { read } from "jsr:@j50n/proc@0.24.6";
import { fromCsvToRows, toTsv } from "jsr:@j50n/proc@0.24.6/transforms";

// Convert CSV to TSV
await read("data.csv")
  .transform(fromCsvToRows())
  .transform(toTsv())
  .writeTo("data.tsv");
```
The transforms module is separate from the core library to keep your bundle lightweight. See Data Transforms for details.
## See Also
- Key Concepts — Properties vs methods, resource management
- Running Processes — All the ways to run commands
- Error Handling — How errors propagate through pipelines
- Recipes — Copy-paste solutions for common tasks