# Split Operations
## When You Need This

The high-level `upload()` handles single-piece multi-copy uploads end-to-end. Use split operations when you need:
- Batch uploading many files to specific providers without repeated context creation
- Custom error handling at each phase — retry store failures, skip failed secondaries, recover from commit failures
- Signing control to avoid multiple wallet signature prompts during multi-copy uploads
- Provider/data set targeting for uploading to known providers
## The Upload Pipeline

Every upload goes through three phases:

```text
store ──► pull ──► commit
  │         │        │
  │         │        └─ On-chain: create data set, add piece, start payments
  │         └─ SP-to-SP: secondary provider fetches from primary
  └─ Upload: bytes sent to one provider (no on-chain state yet)
```

- `store`: Upload bytes to a single SP. Returns `{ pieceCid, size }`. The piece is "parked" on the SP but not yet on-chain, and is subject to garbage collection if not committed.
- `pull`: SP-to-SP transfer. The destination SP fetches the piece from a source SP. No client bandwidth is used.
- `commit`: Submit an on-chain transaction to add the piece to a data set. Creates the data set and payment rail if needed.
## Storage Contexts

A Storage Context represents a connection to a specific storage provider and data set.

### Creating Contexts

```typescript
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x...") })

// Single context — auto-selects provider
await synapse.storage.createContext({
  metadata: { source: "my-app" },
})

// Single context with explicit targeting
await synapse.storage.createContext({
  providerId: 1n, // specific provider (optional)
  dataSetId: 42n, // specific data set (optional)
  metadata: { source: "my-app" }, // data set metadata for matching/creation
  withCDN: true, // enable fast retrieval (paid, optional)
  excludeProviderIds: [3n], // skip specific providers (optional)
})

// Multiple contexts for multi-copy
const contexts = await synapse.storage.createContexts({
  count: 3, // number of contexts (default: 2)
  providerIds: [1n, 2n, 3n], // specific providers (mutually exclusive with dataSetIds)
  dataSetIds: [10n, 20n, 30n], // specific data sets (mutually exclusive with providerIds)
})
const [primary, secondary] = contexts
```
## Data Set Selection and Matching

The SDK manages data sets to minimize on-chain transactions. The selection behavior depends on the parameters you provide.

Selection scenarios:

1. Explicit data set ID: if you specify `dataSetId`, that exact data set is used (it must exist and be accessible)
2. Specific provider: if you specify `providerId`, the SDK searches for matching data sets only within that provider's existing data sets
3. Automatic selection: without specific parameters, the SDK searches across all your data sets with any approved provider

Exact metadata matching: in scenarios 2 and 3, the SDK reuses an existing data set only if it has exactly the same metadata keys and values as requested. This keeps data sets organized according to your specific requirements.

Selection priority, when multiple data sets match your criteria:

- Data sets with existing pieces are preferred over empty ones
- Within each group (with pieces vs. empty), the oldest data set (lowest ID) is selected

Provider selection (when no matching data sets exist):

- If you specify a provider via `providerId`, that provider is used
- Otherwise, the SDK selects from endorsed providers for the primary copy and any approved provider for secondaries
- Before finalizing selection, the SDK verifies the provider is reachable via a ping test
- If a provider fails the ping test, the SDK tries the next candidate
- A new data set is created automatically during the first commit
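The exact-matching and priority rules above can be sketched as pure functions. These are illustrative helpers, not SDK API; the names `metadataMatches` and `selectDataSet` and the `CandidateSet` shape are assumptions made for the example.

```typescript
type Metadata = Record<string, string>

// Exact-match rule: a data set is reused only if its metadata has exactly
// the same keys and values as requested (illustrative, not SDK code).
function metadataMatches(existing: Metadata, requested: Metadata): boolean {
  const a = Object.keys(existing)
  const b = Object.keys(requested)
  if (a.length !== b.length) return false
  return a.every((k) => requested[k] === existing[k])
}

interface CandidateSet {
  id: bigint
  pieceCount: number
  metadata: Metadata
}

// Selection priority: prefer sets that already hold pieces; within each
// group, pick the oldest (lowest ID).
function selectDataSet(
  sets: CandidateSet[],
  requested: Metadata
): CandidateSet | undefined {
  const matching = sets.filter((s) => metadataMatches(s.metadata, requested))
  const byAge = (x: CandidateSet, y: CandidateSet) => (x.id < y.id ? -1 : 1)
  const withPieces = matching.filter((s) => s.pieceCount > 0).sort(byAge)
  const empty = matching.filter((s) => s.pieceCount === 0).sort(byAge)
  return withPieces[0] ?? empty[0]
}
```

Note that a non-empty newer set beats an empty older one: piece count is compared before age.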
## Example Upload Flow

### Store Phase

Upload data to a provider without committing on-chain:

```typescript
const contexts = await synapse.storage.createContexts({
  count: 2,
})
const [primary, secondary] = contexts

const { pieceCid, size } = await primary.store(data, {
  pieceCid: preCalculatedCid, // skip expensive PieceCID (hash digest) calculation (optional)
  signal: abortController.signal, // cancellation (optional)
  onProgress: (bytes) => {
    // progress callback (optional)
    console.log(`Uploaded ${bytes} bytes`)
  },
})

console.log(`Stored: ${pieceCid}, ${size} bytes`)
```

`store()` accepts `Uint8Array` or `ReadableStream<Uint8Array>`. Use streaming for large files to minimize memory.
After store completes, the piece is parked on the SP and can be:

- Retrieved via the context's `getPieceUrl(pieceCid)`
- Pulled to other providers via `pull()`
- Committed on-chain via `commit()`
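For the streaming input path, a minimal sketch of wrapping in-memory chunks in a web `ReadableStream<Uint8Array>`, the shape `store()` accepts. In a browser you would more likely pass `file.stream()` from a File input; `chunksToStream` is an illustrative helper, not part of the SDK.

```typescript
// Wrap in-memory chunks in a web ReadableStream<Uint8Array>, the streaming
// input shape store() accepts, so large payloads need not be buffered whole.
function chunksToStream(chunks: Uint8Array[]): ReadableStream<Uint8Array> {
  let i = 0
  return new ReadableStream<Uint8Array>({
    pull(controller) {
      if (i < chunks.length) {
        controller.enqueue(chunks[i++])
      } else {
        controller.close()
      }
    },
  })
}

// Usage sketch: await primary.store(chunksToStream(parts))
```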
### Pull Phase (SP-to-SP Transfer)

Request a secondary provider to fetch pieces from the primary:

```typescript
// Pre-sign to avoid double wallet prompts during pull + commit
const extraData = await secondary.presignForCommit([{ pieceCid }])

const pullResult = await secondary.pull({
  pieces: [pieceCid],
  from: (cid) => primary.getPieceUrl(cid), // source URL builder (or URL string)
  extraData, // pre-signed auth (optional, reused for commit)
  signal: abortController.signal, // cancellation (optional)
  onProgress: (cid, status) => {
    // status callback (optional)
    console.log(`${cid}: ${status}`)
  },
})

if (pullResult.status !== "complete") {
  for (const piece of pullResult.pieces) {
    if (piece.status === "failed") {
      console.error(`Failed to pull ${piece.pieceCid}`)
    }
  }
}
```

The `from` parameter accepts either a URL string (base service URL) or a function that returns a piece URL for a given PieceCID.

Pre-signing: `presignForCommit()` creates an EIP-712 signature that can be reused for both `pull()` and `commit()`. This avoids prompting the wallet twice. Pass the same `extraData` to both calls.
### Commit Phase

Add pieces to an on-chain data set. This creates the data set and payment rail if one doesn't exist:

```typescript
// Commit on both providers
const [primaryCommit, secondaryCommit] = await Promise.allSettled([
  primary.commit({
    pieces: [{ pieceCid, pieceMetadata: { filename: "doc.pdf" } }],
    onSubmitted: (txHash) => {
      console.log(`Transaction submitted: ${txHash}`)
    },
  }),
  secondary.commit({
    pieces: [{ pieceCid, pieceMetadata: { filename: "doc.pdf" } }],
    extraData, // pre-signed auth from presignForCommit() (optional)
    onSubmitted: (txHash) => {
      console.log(`Transaction submitted: ${txHash}`)
    },
  }),
])

if (primaryCommit.status === "fulfilled") {
  console.log(`Primary: dataSet=${primaryCommit.value.dataSetId}`)
}
if (secondaryCommit.status === "fulfilled") {
  console.log(`Secondary: dataSet=${secondaryCommit.value.dataSetId}`)
}
```

The result contains:

- `txHash` — transaction hash
- `pieceIds` — assigned piece IDs (one per input piece)
- `dataSetId` — data set ID (may be newly created)
- `isNewDataSet` — whether a new data set was created
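As an illustration of consuming those result fields, a hypothetical formatting helper (the field names come from the list above; `describeCommit` and its `CommitSummary` shape are assumptions made for the example):

```typescript
// Hypothetical helper summarizing a commit result (field names from the docs).
interface CommitSummary {
  txHash: string
  pieceIds: bigint[]
  dataSetId: bigint
  isNewDataSet: boolean
}

function describeCommit(r: CommitSummary): string {
  const setNote = r.isNewDataSet ? "new data set" : "existing data set"
  return `tx ${r.txHash}: ${r.pieceIds.length} piece(s) -> ${setNote} ${r.dataSetId}`
}
```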
## Multi-File Batch Example

Upload multiple files to two providers with full error handling:

```typescript
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x..") })

const files = [
  new TextEncoder().encode("File 1 content..."),
  new TextEncoder().encode("File 2 content..."),
  new TextEncoder().encode("File 3 content..."),
]

// Create contexts for 2 providers
const [primary, secondary] = await synapse.storage.createContexts({
  count: 2,
  metadata: { source: "batch-upload" },
})

// Store all files on primary (note: these could be done in parallel w/ Promise.all)
const stored = []
for (const file of files) {
  const result = await primary.store(file)
  stored.push(result)
  console.log(`Stored ${result.pieceCid}`)
}

// Pre-sign for all pieces on secondary
const pieceCids = stored.map((s) => s.pieceCid)
const extraData = await secondary.presignForCommit(
  pieceCids.map((cid) => ({ pieceCid: cid }))
)

// Pull all pieces to secondary
const pullResult = await secondary.pull({
  pieces: pieceCids,
  from: (cid) => primary.getPieceUrl(cid),
  extraData,
})

// Commit on both providers
const [primaryCommit, secondaryCommit] = await Promise.allSettled([
  primary.commit({ pieces: pieceCids.map((cid) => ({ pieceCid: cid })) }),
  pullResult.status === "complete"
    ? secondary.commit({
        pieces: pieceCids.map((cid) => ({ pieceCid: cid })),
        extraData,
      })
    : Promise.reject(new Error("Pull failed, skipping secondary commit")), // not advised!
])

if (primaryCommit.status === "fulfilled") {
  console.log(`Primary: dataSet=${primaryCommit.value.dataSetId}`)
}
if (secondaryCommit.status === "fulfilled") {
  console.log(`Secondary: dataSet=${secondaryCommit.value.dataSetId}`)
}
```

## Error Handling Patterns
Each phase's errors are independent. Failures don't cascade — you can retry at any level:
| Phase | Failure | Data state | Recovery |
|---|---|---|---|
| store | Upload/network error | No data on SP | Retry store() with same or different context |
| pull | SP-to-SP transfer failed | Data on primary only | Retry pull(), try different secondary, or skip |
| commit | On-chain transaction failed | Data on SP but not on-chain | Retry commit() (no re-upload needed) |
The key advantage of split operations: if commit fails, the data is already stored on the SP, so you can retry `commit()` without re-uploading. With the high-level `upload()`, a `CommitError` would require re-uploading.
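Since a failed commit leaves the stored piece intact, a generic retry wrapper is often all that is needed around the commit step. A minimal sketch; `withRetry` is a hedged illustrative helper, not part of the SDK:

```typescript
// Generic retry with exponential backoff. Safe to use around commit(),
// since a failed commit does not require re-uploading the piece.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseMs = 1000
): Promise<T> {
  let lastErr: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn()
    } catch (err) {
      lastErr = err
      if (i < attempts - 1) {
        // Back off: baseMs, 2*baseMs, 4*baseMs, ...
        await new Promise((r) => setTimeout(r, baseMs * 2 ** i))
      }
    }
  }
  throw lastErr
}

// Usage sketch: const result = await withRetry(() => ctx.commit({ pieces }))
```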
## Lifecycle Management

### Terminating a Data Set

Irreversible Operation: data set termination cannot be undone. Once initiated:

- The termination transaction is irreversible
- After the termination period, the provider may delete all data
- Payment rails associated with the data set will be terminated
- You cannot cancel the termination

Only terminate data sets when you're certain you no longer need the data.

To delete an entire data set and discontinue payments for the service, call `context.terminate()`. This method submits an on-chain transaction to initiate the termination process. Following a defined termination period, payments will cease, and the service provider will be able to delete the data set.

You can also terminate a data set with `synapse.storage.terminateDataSet({ dataSetId })` when the data set ID is known and creating a context is not necessary.
```typescript
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x...") })
const ctx = await synapse.storage.createContext({
  metadata: { source: "my-app" },
})

// Via context
const hash = await ctx.terminate()
await synapse.client.waitForTransactionReceipt({ hash })
console.log("Dataset terminated successfully")

// Or directly by data set ID
const hash2 = await synapse.storage.terminateDataSet({ dataSetId: 42n })
await synapse.client.waitForTransactionReceipt({ hash: hash2 })
```

### Deleting a Piece
To delete an individual piece from the data set, call `context.deletePiece()`. This method submits an on-chain transaction to initiate the deletion process.

Important: piece deletion is irreversible and cannot be canceled once initiated.
```typescript
import { Synapse } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x...") })
const ctx = await synapse.storage.createContext({
  metadata: { source: "my-app" },
})

// List all pieces in the data set
const pieces = []
for await (const piece of ctx.getPieces()) {
  pieces.push(piece)
}

if (pieces.length > 0) {
  await ctx.deletePiece({ piece: pieces[0].pieceId })
  console.log(
    `Piece ${pieces[0].pieceCid} (ID: ${pieces[0].pieceId}) deleted successfully`
  )
}

// Delete by PieceCID
await ctx.deletePiece({ piece: "bafkzcib..." })
```

## Download Options
The SDK provides flexible download options with clear semantics.

### SP-Agnostic Download (from anywhere)

Download pieces from any available provider using the StorageManager:

```typescript
// Download from any provider that has the piece
const data = await synapse.storage.download({ pieceCid })

// Download with CDN optimization (if available)
const dataWithCDN = await synapse.storage.download({ pieceCid, withCDN: true })
```

### Context-Specific Download (from this provider)
When using a StorageContext, downloads are automatically restricted to that specific provider:

```typescript
import { Synapse, type PieceCID } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x...") })
const ctx = await synapse.storage.createContext({
  metadata: { source: "my-app" },
})

const pieceCid = null as unknown as PieceCID

// Downloads from the provider associated with this context
const data = await ctx.download({ pieceCid })
```

### CDN Option Inheritance
The withCDN option follows a clear inheritance hierarchy:
- Synapse level: Default setting for all operations
- StorageContext level: Can override Synapse’s default
- Method level: Can override instance settings
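To make the precedence concrete, here is a minimal sketch of the resolution order as a nullish-coalescing chain. `resolveWithCDN` is a hypothetical helper for illustration, not an SDK export:

```typescript
// Hypothetical helper illustrating the withCDN precedence chain;
// not part of the SDK. The most specific setting wins:
// method option > context option > Synapse-level default.
function resolveWithCDN(
  methodOption: boolean | undefined,
  contextOption: boolean | undefined,
  synapseDefault: boolean,
): boolean {
  return methodOption ?? contextOption ?? synapseDefault
}

console.log(resolveWithCDN(undefined, undefined, true)) // true: falls back to Synapse default
console.log(resolveWithCDN(undefined, false, true)) // false: context overrides Synapse
console.log(resolveWithCDN(false, undefined, true)) // false: method overrides everything
```

The code below shows the same precedence through the real SDK calls.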
```ts
await synapse.storage.download({ pieceCid }) // Uses Synapse's withCDN: true
await ctx.download({ pieceCid }) // Uses context's withCDN: false
await synapse.storage.download({ pieceCid, withCDN: false }) // Method override: CDN disabled
```

Note: When `withCDN: true` is set, it adds `{ withCDN: '' }` to the data set’s metadata, ensuring CDN-enabled and non-CDN data sets remain separate.
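The separation works because data set reuse depends on an exact metadata match, so the extra `withCDN` key means CDN uploads never land in a non-CDN data set. A simplified sketch of that matching rule follows; `sameDataSet` is illustrative, not an SDK function:

```typescript
// Simplified illustration: two uploads reuse a data set only if their
// metadata records match exactly. Adding the `withCDN` key therefore
// routes CDN-enabled pieces into a distinct data set.
function sameDataSet(
  a: Record<string, string>,
  b: Record<string, string>,
): boolean {
  const aKeys = Object.keys(a).sort()
  const bKeys = Object.keys(b).sort()
  return (
    aKeys.length === bKeys.length &&
    aKeys.every((k, i) => k === bKeys[i] && a[k] === b[k])
  )
}

console.log(sameDataSet({ source: "my-app" }, { source: "my-app" })) // true
console.log(sameDataSet({ source: "my-app" }, { source: "my-app", withCDN: "" })) // false
```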
Using synapse-core Directly
For maximum control, use the core library functions without the SDK wrapper classes. This is useful for building custom upload pipelines, integrating into existing frameworks, or server-side applications that don’t need the SDK’s orchestration.
Provider Selection
```ts
import {
  fetchProviderSelectionInput,
  selectProviders,
} from "@filoz/synapse-core/warm-storage"

// Fetch all chain data needed for selection
const input = await fetchProviderSelectionInput(client, {
  address: walletAddress,
})

// Primary: pass endorsedIds to restrict pool to endorsed providers only
const [primary] = selectProviders({
  ...input,
  count: 1,
  metadata: { source: "my-app" },
})

// Secondary: pass empty set to allow any approved provider
const [secondary] = selectProviders({
  ...input,
  endorsedIds: new Set(),
  count: 1,
  excludeProviderIds: new Set([primary.provider.id]),
  metadata: { source: "my-app" },
})
```

`fetchProviderSelectionInput()` makes a single multicall to gather providers, endorsements, and existing data sets. `selectProviders()` is a pure function — no network calls — that applies a 2-tier preference within the eligible pool:
- Existing data set with matching metadata
- New data set (no matching data set found)
The endorsedIds parameter controls which providers are eligible. When non-empty, only endorsed providers can be selected — there is no fallback to non-endorsed. When empty, all approved providers are eligible. The SDK’s smartSelect() uses this to enforce endorsed-for-primary (hard constraint) while allowing any approved provider for secondaries.
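The eligibility rule can be sketched as a pure filter. This is an illustration of the semantics described above, not the SDK's implementation; the `Provider` shape and `eligibleProviders` helper are invented for the example:

```typescript
interface Provider {
  id: bigint
  approved: boolean
}

// Illustrative sketch of the endorsedIds eligibility rule:
// a non-empty set means endorsed providers only (no fallback);
// an empty set means all approved providers are eligible.
function eligibleProviders(
  providers: Provider[],
  endorsedIds: Set<bigint>,
  excludeProviderIds: Set<bigint> = new Set(),
): Provider[] {
  return providers.filter(
    (p) =>
      p.approved &&
      !excludeProviderIds.has(p.id) &&
      (endorsedIds.size === 0 || endorsedIds.has(p.id)),
  )
}

const pool: Provider[] = [
  { id: 1n, approved: true },
  { id: 2n, approved: true },
  { id: 3n, approved: false },
]

console.log(eligibleProviders(pool, new Set([1n]))) // endorsed-only: provider 1
console.log(eligibleProviders(pool, new Set())) // any approved: providers 1 and 2
console.log(eligibleProviders(pool, new Set(), new Set([1n]))) // approved minus excluded: provider 2
```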
Upload and Commit
```ts
import * as SP from "@filoz/synapse-core/sp"
import {
  signAddPieces,
  signCreateDataSetAndAddPieces,
} from "@filoz/synapse-core/typed-data"
import { Synapse, type PieceCID, asChain } from "@filoz/synapse-sdk"
import { privateKeyToAccount } from "viem/accounts"

const synapse = Synapse.create({ account: privateKeyToAccount("0x...") })
const client = synapse.client
const chain = asChain(client.chain)
const walletAddress = client.account.address

const primary = await synapse.storage.createContext({
  metadata: { source: "my-app" },
})

const myStream = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(new Uint8Array([1, 2, 3]))
    controller.close()
  },
})

// Upload piece to SP
const { pieceCid, size } = await SP.uploadPieceStreaming({
  serviceURL: primary.provider.pdp.serviceURL,
  data: myStream,
})

// Confirm piece is parked
await SP.findPiece({
  serviceURL: primary.provider.pdp.serviceURL,
  pieceCid,
  retry: true,
})

// Sign and commit (new data set)
const result = await SP.createDataSetAndAddPieces(client, {
  cdn: false,
  payee: primary.provider.serviceProvider,
  payer: client.account.address,
  recordKeeper: chain.contracts.fwss.address,
  pieces: [{ pieceCid }],
  metadata: primary.dataSetMetadata,
  serviceURL: primary.provider.pdp.serviceURL,
})

const confirmation = await SP.waitForCreateDataSetAddPieces(result)
console.log(`DataSet: ${confirmation.dataSetId}`)
```

SP-to-SP Pull

```ts
// `secondary` is a second StorageContext (e.g. created with
// synapse.storage.createContext() targeting another provider)
const response = await SP.waitForPullStatus(client, {
  serviceURL: secondary.provider.pdp.serviceURL,
  pieces: [
    {
      pieceCid,
      sourceUrl: primary.getPieceUrl(pieceCid),
    },
  ],
  payee: secondary.provider.serviceProvider,
  payer: client.account.address,
  cdn: false,
  metadata: secondary.dataSetMetadata,
})
```

This path requires manual EIP-712 signing. The `signAddPieces` and `signCreateDataSetAndAddPieces` functions from `@filoz/synapse-core/typed-data` handle the signature creation.
Next Steps
- Storage Operations — The high-level multi-copy upload API for most use cases. Start here if you haven’t used `synapse.storage.upload()` yet.
- Calculate Storage Costs — Plan your budget and fund your storage account. Use the quick calculator to estimate monthly costs.
- Payment Management — Manage deposits, approvals, and payment rails. Required before your first upload.