export API
Data export utilities for browser and worker contexts.
npm install @rowops/export
Worker Factory
createExportWorker
Creates a web worker for streaming exports.
import { createExportWorker } from "@rowops/export";

const worker = createExportWorker();

worker.postMessage({
  requestId: "1",
  type: "export",
  payload: {
    ipcBytes: arrowIpcBuffer, // Arrow IPC bytes for the table you want to export
    format: "csv",
  },
});

worker.onmessage = (event) => {
  const { csvBytes } = event.data.payload;
  // downloadBlob is your own helper for saving bytes as a file
  downloadBlob(csvBytes, "export.csv");
};
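The requestId on the message lets you match a response to the request that produced it when several exports are in flight. The sketch below wraps the round trip in a promise; it assumes the worker echoes requestId back on its response messages, which you should verify against your own message handling.

// Sketch only: assumes response messages carry the same requestId they were sent with
function exportToCsv(worker: Worker, ipcBytes: Uint8Array): Promise<Uint8Array> {
  const requestId = crypto.randomUUID();
  return new Promise((resolve) => {
    const onMessage = (event: MessageEvent) => {
      if (event.data.requestId !== requestId) return; // another request's response
      worker.removeEventListener("message", onMessage);
      resolve(event.data.payload.csvBytes);
    };
    worker.addEventListener("message", onMessage);
    worker.postMessage({ requestId, type: "export", payload: { ipcBytes, format: "csv" } });
  });
}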
Browser Utilities
downloadCsvFromRows
Quick CSV download from materialized JS rows.
import { downloadCsvFromRows } from "@rowops/export";
// Export error rows
downloadCsvFromRows(errorRows, "errors.csv");
// Export sample data
downloadCsvFromRows(sampleRows, "sample.csv");
Note: This is for small datasets only (error reports, validation failures). For large datasets, use the worker-based export path or downloadCsvFromRowChunks.
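If you do not know in advance how large the input will be, you can branch on row count before choosing a path. A rough sketch; the 10,000-row cutoff is an arbitrary choice, and because the rows here are already materialized, chunking only avoids holding the full CSV string in memory:

import { downloadCsvFromRows, downloadCsvFromRowChunks } from "@rowops/export";

const SMALL_EXPORT_LIMIT = 10_000; // arbitrary cutoff, tune for your data

async function downloadRows(rows: Record<string, unknown>[], filename: string) {
  if (rows.length <= SMALL_EXPORT_LIMIT) {
    downloadCsvFromRows(rows, filename);
    return;
  }
  // Feed the rows to the streaming path in slices
  async function* chunks() {
    for (let i = 0; i < rows.length; i += SMALL_EXPORT_LIMIT) {
      yield rows.slice(i, i + SMALL_EXPORT_LIMIT);
    }
  }
  await downloadCsvFromRowChunks(chunks(), filename, { useStreamingDownload: true });
}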
downloadCsvFromRowChunks
Streaming CSV download from an async chunk source. Supports the File System Access API for streaming directly to disk.
import { downloadCsvFromRowChunks } from "@rowops/export";

// Stream rows from an async generator
async function* generateChunks() {
  for (let page = 0; page < 100; page++) {
    // fetchPage is your own data source (API call, IndexedDB read, etc.)
    const rows = await fetchPage(page);
    yield rows;
  }
}

await downloadCsvFromRowChunks(generateChunks(), "large-export.csv", {
  useStreamingDownload: true, // Use the File System Access API if available
});
Options:
interface CsvDownloadOptions {
  /** Enable streaming to disk via File System Access API (default: true) */
  useStreamingDownload?: boolean;
}
Note: When useStreamingDownload is enabled and the browser supports the File System Access API, the file is written directly to disk without buffering the entire CSV in memory. If the API is not available, the export falls back to a regular Blob download.
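Whether streaming will actually kick in is easy to check up front, for example to tell the user whether the export will go straight to disk or be buffered first. A small feature-detection sketch using the standard web API:

// True in browsers that expose the File System Access API (currently Chromium-based);
// elsewhere downloadCsvFromRowChunks falls back to a Blob download.
function supportsStreamingDownload(): boolean {
  return typeof window !== "undefined" && "showSaveFilePicker" in window;
}

const hint = supportsStreamingDownload()
  ? "Export will stream directly to disk."
  : "Export will be buffered in memory and saved as a regular download.";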
Types
ExportFormat
type ExportFormat = "csv";
MaskConfig
Re-exported from @rowops/schema for use with exports.
interface MaskConfig {
  autoDetectPII?: boolean;
  defaultStrategy?: MaskStrategy;
  emailStrategy?: MaskStrategy;
  phoneStrategy?: MaskStrategy;
  // ... other fields
}
Streaming Export (Recommended)
For large datasets, use the streaming callbacks on RowOpsImporter:
Arrow IPC Streaming (Best Performance)
<RowOpsImporter
  onExportIpcChunk={async (bytes, chunkIndex) => {
    // Send directly to server
    await fetch("/api/upload", {
      method: "POST",
      body: bytes,
      headers: {
        "Content-Type": "application/vnd.apache.arrow.stream",
        "X-Chunk-Index": String(chunkIndex),
      },
    });
  }}
  onExportComplete={(summary) => {
    console.log(`Exported ${summary.totalRows} rows in ${summary.totalChunks} chunks`);
  }}
/>
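On the receiving side, each request body is a single Arrow IPC chunk and the X-Chunk-Index header gives its position. Below is a minimal Node/Express sketch of an endpoint shaped like the one the example posts to; the route, storage layout, and reassembly strategy are illustrative assumptions, not part of @rowops/export.

import express from "express";
import { writeFile } from "node:fs/promises";

const app = express();

// Accept the raw Arrow IPC bytes without any body parsing
app.post(
  "/api/upload",
  express.raw({ type: "application/vnd.apache.arrow.stream", limit: "100mb" }),
  async (req, res) => {
    const chunkIndex = Number(req.header("X-Chunk-Index"));
    // Illustrative: one file per chunk in an existing ./uploads directory;
    // in practice you would also key this by an upload/session id
    await writeFile(`./uploads/chunk-${chunkIndex}.arrow`, req.body);
    res.status(204).end();
  }
);

app.listen(3000);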
JS Object Streaming
<RowOpsImporter
  onExportChunk={async (rows, progress) => {
    console.log(`Processing ${progress.current}/${progress.total}`);
    for (const row of rows) {
      // processRow is your own per-row handler
      await processRow(row);
    }
  }}
  onExportComplete={(summary) => {
    console.log("Done!");
  }}
/>
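If processRow does network or database work, awaiting it once per row can be slow; since onExportChunk already hands you a whole chunk, you can send the chunk in one request instead. A sketch assuming a hypothetical /api/rows/bulk endpoint:

<RowOpsImporter
  onExportChunk={async (rows, progress) => {
    // One request per chunk instead of one per row (endpoint is hypothetical)
    await fetch("/api/rows/bulk", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(rows),
    });
    console.log(`Uploaded chunk ${progress.current}/${progress.total}`);
  }}
/>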
CSV Escape Handling
The downloadCsvFromRows function handles:
- Commas in values (wraps in quotes)
- Quotes in values (escapes with double quotes)
- Newlines in values (wraps in quotes)
- Null/undefined values (empty string)
// Input row
const row = {
  name: 'John "JD" Doe',
  address: "123 Main St, Apt 4",
  notes: "Line 1\nLine 2",
};

// CSV output
// "name","address","notes"
// "John ""JD"" Doe","123 Main St, Apt 4","Line 1
// Line 2"
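For reference, those rules amount to only a few lines. The sketch below reproduces the behavior shown in the example (non-null fields quoted, embedded quotes doubled); it is a generic illustration, not the library's internal code.

// Quote one CSV field: null/undefined become an empty string, everything else is
// wrapped in quotes with embedded quotes doubled, which also makes commas and
// newlines inside values safe.
function escapeCsvValue(value: unknown): string {
  if (value === null || value === undefined) return "";
  return `"${String(value).replace(/"/g, '""')}"`;
}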
Usage Examples
Export Validation Errors
<RowOpsImporter
  onComplete={(result) => {
    // Export is handled via onExportErrors callback
  }}
  onExportErrors={() => {
    // Triggered when user clicks "Export Errors" button
    // The component handles the download internally
  }}
/>
Manual CSV Export
import { downloadCsvFromRows } from "@rowops/export";

function exportFilteredData(allRows: Record<string, unknown>[]) {
  // Filter to high-value customers
  const highValue = allRows.filter((row) => (row.total_spend as number) > 10000);

  // Download
  downloadCsvFromRows(highValue, "high-value-customers.csv");
}
Server-Side Export (Headless)
import { runHeadlessJob } from "@rowops/headless";

const result = await runHeadlessJob({
  inputPath: "./data.csv",
  outputPath: "./validated.csv",
  exportFormat: "csv", // or "ipc" for Arrow
  schema: mySchema,
});

console.log(`Exported to: ${result.outputPath}`);
Export Modes Comparison
| Mode | Use Case | Memory | Performance |
|---|---|---|---|
| onExportChunk | Medium datasets, need JS access | Medium | Medium |
| onExportIpcChunk | Large datasets, server upload | Low | Fast |
| downloadCsvFromRows | Error reports, samples | High | N/A |
| downloadCsvFromRowChunks | Large datasets, client download | Low | Fast |
Tier Restrictions
| Feature | Free | Pro | Scale | Enterprise |
|---|---|---|---|---|
| CSV export | Yes | Yes | Yes | Yes |
| Streaming export | Yes | Yes | Yes | Yes |
| Arrow IPC export | No | Yes | Yes | Yes |
| Max rows per export | 1,000 | 100,000 | 1,000,000 | Unlimited |