# import-core API

Browser-native import engine for high-volume, privacy-first data ingestion.

```bash
npm install @rowops/import-core
```
## createBrowserImportJob

Creates an import job with automatic worker setup for parsing, validation, profiling, masking, and export.

```typescript
import { createBrowserImportJob } from "@rowops/import-core";

const job = await createBrowserImportJob({
  store: chunkStore,
  schema: mySchema,
  license: {
    projectId: "proj_123",
    entitlementToken: "token_...",
  },
});
```
### Options

```typescript
interface CreateBrowserImportJobOptions {
  /** Chunk storage (IndexedDB-backed) */
  store: ChunkStore;
  /** Validation schema */
  schema?: ValidationSchema | null;
  /** Lazy schema getter */
  getSchema?: () => ValidationSchema | null;
  /** Schema ID for tracking */
  schemaId?: string;
  /** License configuration */
  license?: LicenseConfig;
  /** Transfer policy for export */
  transferPolicy?: TransferPolicy;
  /** Worker response timeout in milliseconds (<= 0 disables) */
  workerTimeoutMs?: number;
  /** Optional caller-supplied session identifier */
  importSessionId?: ImportSessionId;
}
```
### LicenseConfig

```typescript
type LicenseConfig = {
  projectId: string;
  entitlementToken: string;
};
```
## ImportJob

The main job interface returned by `createBrowserImportJob`.

### Key Methods

```typescript
interface ImportJob {
  // Parsing
  parse(file: File, options?: ParseOptions): Promise<void>;
  listSheets(file: File): Promise<ImportSheetInfo[]>;

  // Validation
  validate(): Promise<void>;

  // Profiling
  profile(): Promise<ProfileReport>;

  // Repair
  applyRepair(command: RepairCommand): Promise<void>;

  // Export
  export(options: ExportOptions): Promise<ExportResult>;
  exportIpcChunks(callback: (bytes: Uint8Array, index: number) => Promise<void>): Promise<void>;

  // Unify (deduplication)
  unify(config: UnifyConfig): Promise<UnifyResult>;

  // State
  getSnapshot(): ImportJobSnapshot;
  getRawRowsByIndex(indices: number[]): Promise<Array<{ index: number; row: Record<string, unknown> }>>;

  // Cleanup
  dispose(): Promise<void>;
}
```
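For UI code that needs to observe phase changes while the awaited calls run, a small polling helper over `getSnapshot()` can be sketched. Only the documented `getSnapshot().phase` field is assumed; the `waitForPhase` helper itself is illustrative and not part of the package:

```typescript
// Hypothetical helper: poll a snapshot getter until the job reaches one of
// the target phases (or "error"). Structural typing means any object with a
// string `phase` field works, including ImportJobSnapshot.
async function waitForPhase(
  getSnapshot: () => { phase: string },
  targets: string[],
  intervalMs = 100,
): Promise<string> {
  for (;;) {
    const { phase } = getSnapshot();
    if (targets.includes(phase) || phase === "error") return phase;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

Usage against a job would look like `await waitForPhase(() => job.getSnapshot(), ["validated"])`.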
> **Note:** Sync is executed via `@rowops/sync` (`runSync`) and the Importer Sync panel, not through `ImportJob` methods.
## IndexedDB Storage

### createIndexedDbChunkStore

Creates an IndexedDB-backed store for Arrow IPC chunks.

```typescript
import { createIndexedDbChunkStore } from "@rowops/import-core";

const store = createIndexedDbChunkStore({
  dbName: "rowops-import",
});
```
### createIndexedDbSessionStore

Creates storage for resumable import sessions.

```typescript
import { createIndexedDbSessionStore } from "@rowops/import-core";

const sessionStore = createIndexedDbSessionStore({
  dbName: "rowops-sessions",
});
```
## Types

### ImportJobSnapshot

```typescript
interface ImportJobSnapshot {
  phase: ImportPhase;
  parsing: boolean;
  totalRows: number;
  validRows: number;
  invalidRows: number;
  warningRows: number;
  validation?: {
    validRowCount: number;
    invalidRowCount: number;
    errorGroups: ErrorGroup[];
  };
  maskSkipped: boolean;
}
```
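As a sketch of consuming a snapshot, the count fields above can be reduced to a one-line summary for logging or UI. The field names come from the interface; the `summarize` helper itself is illustrative:

```typescript
// Illustrative helper: derive a human-readable summary from the documented
// count fields of an ImportJobSnapshot-shaped object.
interface SnapshotCounts {
  totalRows: number;
  validRows: number;
  invalidRows: number;
  warningRows: number;
}

function summarize(s: SnapshotCounts): string {
  // Guard against division by zero before any rows are parsed.
  const pct = s.totalRows > 0 ? Math.round((s.validRows / s.totalRows) * 100) : 0;
  return `${s.validRows}/${s.totalRows} rows valid (${pct}%), ` +
    `${s.invalidRows} invalid, ${s.warningRows} with warnings`;
}
```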
### ImportPhase

```typescript
type ImportPhase =
  | "idle"
  | "parsing"
  | "parsed"
  | "validating"
  | "validated"
  | "profiling"
  | "profiled"
  | "masking"
  | "masked"
  | "transforming"
  | "transformed"
  | "exporting"
  | "exported"
  | "done"
  | "error";
```
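For driving UI state (spinners, terminal screens), the phases above can be grouped into in-flight and settled sets. This grouping is an assumption for illustration; the package does not export these helpers:

```typescript
// Illustrative classification of the documented ImportPhase values.
// "Busy" phases are the in-progress -ing forms; "terminal" phases end the job.
const BUSY_PHASES = new Set([
  "parsing", "validating", "profiling", "masking", "transforming", "exporting",
]);
const TERMINAL_PHASES = new Set(["done", "error"]);

function isBusy(phase: string): boolean {
  return BUSY_PHASES.has(phase);
}

function isTerminal(phase: string): boolean {
  return TERMINAL_PHASES.has(phase);
}
```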
## Usage Example

```typescript
import {
  createBrowserImportJob,
  createIndexedDbChunkStore,
} from "@rowops/import-core";

async function runImport(
  file: File,
  schema: ValidationSchema,
  license: LicenseConfig,
) {
  // Create storage
  const store = createIndexedDbChunkStore({ dbName: "rowops-import" });

  // Create job
  const job = await createBrowserImportJob({ store, schema, license });

  try {
    // Parse file
    await job.parse(file);

    // Validate parsed data
    await job.validate();

    // Get results
    const snapshot = job.getSnapshot();
    console.log(`Valid: ${snapshot.validRows}`);
    console.log(`Invalid: ${snapshot.invalidRows}`);

    // Export valid rows (uploadChunk is a caller-supplied upload function)
    await job.exportIpcChunks(async (bytes, index) => {
      await uploadChunk(bytes, index);
    });
  } finally {
    await job.dispose();
  }
}
```
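If exported chunks should be assembled locally instead of uploaded (for example, to build a downloadable `Blob`), the `exportIpcChunks` callback can collect them. Only the documented `(bytes, index)` callback shape is assumed; the collector itself is an illustrative sketch:

```typescript
// Illustrative collector for exportIpcChunks callbacks: store chunks by index
// (delivery order is not assumed), then concatenate into one buffer.
function createChunkCollector() {
  const chunks = new Map<number, Uint8Array>();
  return {
    // Matches the documented (bytes, index) => Promise<void> callback shape.
    collect: async (bytes: Uint8Array, index: number): Promise<void> => {
      chunks.set(index, bytes);
    },
    // Concatenate all collected chunks in index order.
    concat(): Uint8Array {
      const ordered = [...chunks.entries()]
        .sort(([a], [b]) => a - b)
        .map(([, bytes]) => bytes);
      const total = ordered.reduce((n, c) => n + c.length, 0);
      const out = new Uint8Array(total);
      let offset = 0;
      for (const c of ordered) {
        out.set(c, offset);
        offset += c.length;
      }
      return out;
    },
  };
}
```

A job would use it as `await job.exportIpcChunks(collector.collect)` followed by `new Blob([collector.concat()])`.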
## See Also

- React Quickstart - Using import-core via React wrapper
- Node.js Quickstart - Headless usage
- Validate API - Validation module
- Transform API - Transform module