Optimizing RPC usage can significantly improve performance, reduce costs, and enhance user experience.

Transaction Optimization Patterns

Follow these best practices for optimizing Solana transaction sending and confirmation. For a detailed guide, visit Helius Transaction Optimization Guide.

Optimize Compute Unit (CU) Usage

Simulate CUs Used: Test your transaction to determine CU usage. Example:

const rpcResponse = await connection.simulateTransaction(testTransaction, { replaceRecentBlockhash: true, sigVerify: false });
const unitsConsumed = rpcResponse.value.unitsConsumed;

Set a CU Limit: Add a margin (~10%) to the simulated value:

const computeUnitIx = ComputeBudgetProgram.setComputeUnitLimit({ units: Math.ceil(unitsConsumed * 1.1) });
instructions.push(computeUnitIx);

Serialize & Encode Transactions

Serialize and Base58 encode your transaction for APIs:

const serializedTx = testTransaction.serialize();
const encodedTx = bs58.encode(serializedTx);

Set Priority Fees

Get Fee Estimate: Fetch a recommended priority fee from Helius:

const response = await fetch(HeliusURL, { method: "POST", headers: { "Content-Type": "application/json" }, body: JSON.stringify({ jsonrpc: "2.0", id: "1", method: "getPriorityFeeEstimate", params: [...] }) });
const priorityFee = (await response.json()).result.priorityFeeEstimate;

Apply the Fee:

const computeBudgetIx = ComputeBudgetProgram.setComputeUnitPrice({ microLamports: priorityFee });
instructions.push(computeBudgetIx);

Send & Confirm Transactions

  • Assemble, serialize, and send your transaction using sendTransaction or sendRawTransaction, as sketched below.
  • Tip: Set skipPreflight: true to reduce transaction time by ~100ms, but note that you lose preflight validation.
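
A minimal sketch of the send-and-confirm step, assuming the transaction is fully signed and that blockhash and lastValidBlockHeight were captured from getLatestBlockhash when it was built:

// Send the raw, signed transaction; skipPreflight trades preflight validation for speed
const signature = await connection.sendRawTransaction(testTransaction.serialize(), {
  skipPreflight: true,
  maxRetries: 0,
});

// Confirm against the blockhash the transaction was built with
await connection.confirmTransaction(
  { signature, blockhash, lastValidBlockHeight },
  "confirmed"
);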

Monitor & Rebroadcast

If a transaction isn’t confirmed:

1. Check Status: Use getSignatureStatuses to check whether the transaction has been confirmed.

2. Rebroadcast: If it has not, rebroadcast the signed transaction until its blockhash expires (see the sketch below).
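
A simplified rebroadcast loop, assuming rawTx, signature, and lastValidBlockHeight were captured when the transaction was first sent (variable names are illustrative):

let status = null;
while (!status?.confirmationStatus) {
  // Stop once the blockhash has expired
  const blockHeight = await connection.getBlockHeight();
  if (blockHeight > lastValidBlockHeight) break;

  // Re-send the same signed transaction, then re-check its status
  await connection.sendRawTransaction(rawTx, { skipPreflight: true, maxRetries: 0 });
  const { value } = await connection.getSignatureStatuses([signature]);
  status = value[0];

  await new Promise(resolve => setTimeout(resolve, 2000));
}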

For a comprehensive breakdown, visit our guide.

When to Use Jito Tips

Key Facts

  • Bundles: Groups of up to 5 transactions executed sequentially and atomically via Jito.
  • Tips: Incentives for block builders to execute bundles.
  • Auctions: Bundles compete in out-of-protocol auctions every 200ms based on tip amounts.
  • Validators: Only validators running the Jito-Solana client (currently >90% of stake) can process bundles.
  • Prioritization: Tips, compute efficiency, and account-locking patterns determine bundle order.
  • Use Cases: Ideal for landing transactions at the top of a block.

Use Cases for Jito Tips

  • MEV Opportunities: Arbitrage trading, liquidation transactions, front-running protection, specific transaction ordering.
  • Time-Critical DeFi Operations: Token launches, high-volatility trades, NFT mints.
  • High-Stakes Transactions: Immediate settlement or time-sensitive interactions.

Examples of Jito Usage

1. Arbitrage Trading: A trader identifies a price difference between two decentralized exchanges (DEXs). Jito tips help ensure the arbitrage transaction is processed at the top of the block, capturing the profit before others can react.

2. NFT Minting: During a high-demand mint, competition is fierce. Adding Jito tips improves the odds of priority placement in the block, increasing the chances of a successful mint.

3. Liquidation Transactions: In lending protocols, liquidations are time-critical. Jito tips allow the liquidator's transaction to execute ahead of others, ensuring timely liquidation and profit capture.

Best Practices for Jito Tips

  • Use Tips for Priority: Apply Jito tips only when transaction timing and order are critical.
  • Avoid Overusing Tips: Routine actions like token transfers or minor interactions typically don’t benefit from Jito tips.

Evaluation Checklist Before Using Jito Tips

JavaScript/TypeScript Optimization Tips for Solana

Lazy Loading

Load components only when needed to reduce initial load time:

if (condition) {
  const module = await import("./heavyModule.js");
  module.doSomething();
}

Optimized Loops

Use for loops instead of forEach for better performance with large datasets:

for (let i = 0; i < array.length; i++) {
  process(array[i]);
}

Use Map or Set for Lookups

For frequent lookups, Map and Set are faster than arrays:

const map = new Map([ ["key", "value"] ]);
const value = map.get("key");

Prefer const

const prevents reassignment and gives the engine more information to optimize with than var or let:

const MAX_RETRIES = 3;

Manage Memory

Clear unused intervals or subscriptions to avoid leaks:

const intervalId = setInterval(doSomething, 1000);
clearInterval(intervalId);

Batch or Debounce API Calls

Minimize redundant RPC calls:

function debounce(func, delay) {
  let timeoutId;
  return (...args) => {
    clearTimeout(timeoutId);
    timeoutId = setTimeout(() => func(...args), delay);
  };
}

const debouncedBlockhash = debounce(() => connection.getLatestBlockhash(), 300);
debouncedBlockhash();

Simplify Object Handling

Avoid deep cloning of large objects; use shallow copies:

const newObj = { ...originalObj };

Optimize JSON Handling

For large payloads, use libraries like json-bigint:

import JSONbig from "json-bigint";
const parsed = JSONbig.parse(largeJsonString);

These optimizations provide better performance and scalability, reduced resource usage, and cleaner, maintainable code.

Data Transfer Optimizations

Base64 Is Faster than Base58

For serialized transaction data on Solana, Base64 is faster and more efficient than Base58. Base64 is a simple bit-regrouping scheme, avoiding the arbitrary-precision arithmetic Base58 requires, and it is natively supported by Solana RPC APIs.

Use Base64 Encoding:

// Serialize and encode (Node.js Buffer)
const serializedTx = testTransaction.serialize();
const encodedTx = Buffer.from(serializedTx).toString("base64");

// Decode when needed
const decodedBytes = Buffer.from(encodedTx, "base64");

Why:

  • Faster encoding/decoding
  • Native support in APIs
  • Ideal for performance-critical tasks

Real-Time Data Monitoring

setInterval(async () => {
  const info = await connection.getAccountInfo(pubkey);
}, 1000);
// 1 RPC call per second, high latency

Using WebSockets (onAccountChange) pushes updates to your application in near-real time and eliminates repetitive polling.

Why:

  • Lower latency: Changes are delivered as they happen, rather than on a fixed schedule.
  • Less network overhead: You only receive data when it changes, rather than every second.
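
A minimal sketch of the push-based alternative, assuming an existing Connection and target public key:

// Subscribe once; the callback fires only when the account actually changes
const subscriptionId = connection.onAccountChange(
  pubkey,
  (accountInfo, context) => {
    console.log(`Account updated at slot ${context.slot}`);
  },
  "confirmed"
);

// Later, remove the listener to avoid leaking the subscription
await connection.removeAccountChangeListener(subscriptionId);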

Advanced Query Patterns

Token Holder Breakdown

const accounts = await connection.getParsedProgramAccounts(TOKEN_PROGRAM_ID);
const holders = accounts.filter(acc =>
  acc.account.data.parsed.info.mint === mintAddress
);
// Downloads all token accounts (~100MB+)

Why:

  • Targeted queries: Only fetch accounts for the specified mint.
  • Significant bandwidth savings: Up to a 99% reduction in data transfer.
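
A sketch of the targeted query, letting the RPC node filter on the mint field (165 bytes and offset 0 are the standard SPL token account layout; mintAddress is assumed to be a base58 string):

const holders = await connection.getParsedProgramAccounts(TOKEN_PROGRAM_ID, {
  filters: [
    { dataSize: 165 },                              // standard SPL token account size
    { memcmp: { offset: 0, bytes: mintAddress } },  // mint is the first field in the layout
  ],
});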

Program State Analysis

const accounts = await connection.getProgramAccounts(programId);
const states = accounts.filter(acc => acc.account.data.length === STATE_SIZE);
// Downloads all accounts

Why:

  • Reduced data transfer: Leverage the RPC node to filter by dataSize and memcmp.
  • Faster client processing: Only download essential fields via dataSlice.
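
A sketch combining server-side filtering with dataSlice so only the bytes you need come back (STATE_SIZE and the slice bounds are placeholders for your program's layout):

const states = await connection.getProgramAccounts(programId, {
  filters: [{ dataSize: STATE_SIZE }],   // let the RPC node filter by account size
  dataSlice: { offset: 0, length: 32 },  // download only the fields you actually use
});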

Validator Performance Check

const slots = await connection.getBlocks(start, end);
const leaders = await Promise.all(
  slots.map(slot => connection.getSlotLeader(slot))
);
// N+1 RPC calls

Why:

  • One request: Retrieves aggregated block production stats in bulk.
  • Fewer network calls: Lowers overhead and speeds up data processing.
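
A sketch of the aggregated call using getBlockProduction, which returns per-validator stats for the whole slot range in one request:

const production = await connection.getBlockProduction({
  range: { firstSlot: start, lastSlot: end },
});

// byIdentity maps validator identity -> [leaderSlots, blocksProduced]
for (const [identity, [leaderSlots, blocksProduced]] of Object.entries(
  production.value.byIdentity
)) {
  console.log(`${identity}: ${blocksProduced}/${leaderSlots} slots produced`);
}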

Account Updates Analysis

const txs = await connection.getSignaturesForAddress(address);
const states = await Promise.all(
  txs.map(tx =>
    connection.getTransaction(tx.signature, { maxSupportedTransactionVersion: 0 })
  )
);
// N+1 RPC calls, replays the full transaction history

Why:

  • Streaming approach: Capture state changes as they occur.
  • Less data: Only fetch slices of the account if you need partial info.
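
A sketch of the streaming alternative: subscribe once and record each new state as it arrives instead of replaying history with per-signature lookups:

const history = [];

// Each change is pushed to the client as it happens
const subscriptionId = connection.onAccountChange(address, (accountInfo, context) => {
  history.push({ slot: context.slot, lamports: accountInfo.lamports, data: accountInfo.data });
});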

Memory Optimization Patterns

Large Data Processing

const accounts = await connection.getProgramAccounts(programId);
const processed = accounts.map(acc => processAccount(acc));
// Loads all data into memory

Chunking ensures that you only load manageable subsets of data at a time.

Why:

  • Prevents OOM: Keeps memory usage in check by processing smaller batches.
  • Improved throughput: Parallel processing of chunks can speed up overall operation.
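
A sketch of the chunked approach (the chunk size is arbitrary and should be tuned to your account size and memory budget):

const CHUNK_SIZE = 100;

// Fetch pubkeys only; dataSlice of length 0 skips the account data up front
const accounts = await connection.getProgramAccounts(programId, {
  dataSlice: { offset: 0, length: 0 },
});

const results = [];
for (let i = 0; i < accounts.length; i += CHUNK_SIZE) {
  const chunk = accounts.slice(i, i + CHUNK_SIZE);
  // Load full data for one manageable batch at a time
  const infos = await connection.getMultipleAccountsInfo(chunk.map(acc => acc.pubkey));
  results.push(...infos.filter(Boolean).map(processAccount));
}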

Transaction Analysis

const txs = await connection.getSignaturesForAddress(address);
const graph = new Map();
for (const tx of txs) {
  const details = await connection.getTransaction(tx.signature);
  graph.set(tx.signature, details);
}
// Sequential processing, high memory usage

Why:

  • Faster: Batching transactions reduces overhead.
  • Controlled memory usage: Large sets are split into smaller requests.
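
A sketch of the batched version, resolving signatures in fixed-size groups instead of one request per transaction (the batch size is illustrative):

const BATCH_SIZE = 50;
const graph = new Map();

for (let i = 0; i < txs.length; i += BATCH_SIZE) {
  const batch = txs.slice(i, i + BATCH_SIZE);
  // One request resolves a whole batch of signatures
  const details = await connection.getParsedTransactions(
    batch.map(tx => tx.signature),
    { maxSupportedTransactionVersion: 0 }
  );
  batch.forEach((tx, idx) => graph.set(tx.signature, details[idx]));
}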

Program Buffers

const accounts = await connection.getProgramAccounts(BUFFER_PROGRAM_ID);
const buffers = new Map();
accounts.forEach(acc => {
  buffers.set(acc.pubkey, acc.account.data);
});
// Loads all buffers into memory

Why:

  • Lazy loading: Only fetch buffer contents when needed.
  • 90% reduction in initial memory usage: You avoid loading all buffers at once.
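
A sketch of the lazy-loading pattern: keep only the pubkeys up front and fetch a buffer's contents on first use (getBuffer is a hypothetical helper):

// Pubkeys only; dataSlice of length 0 skips the buffer contents
const accounts = await connection.getProgramAccounts(BUFFER_PROGRAM_ID, {
  dataSlice: { offset: 0, length: 0 },
});

const bufferCache = new Map();

async function getBuffer(pubkey) {
  const key = pubkey.toBase58();
  if (!bufferCache.has(key)) {
    const info = await connection.getAccountInfo(pubkey);
    bufferCache.set(key, info?.data ?? null);
  }
  return bufferCache.get(key);
}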

Token Accounts

const accounts = await connection.getParsedProgramAccounts(TOKEN_PROGRAM_ID);
const balances = new Map();
accounts.forEach(acc => {
  const { mint, owner, tokenAmount } = acc.account.data.parsed.info;
  if (!balances.has(owner)) balances.set(owner, new Map());
  balances.get(owner).set(mint, tokenAmount.uiAmount);
});
// Processes all token accounts

Why:

  • Targeted queries: Only query token accounts for known owners.
  • Less memory usage: An 80% reduction compared to pulling every token account on chain.
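
A sketch of the targeted query for a known owner, using getParsedTokenAccountsByOwner instead of scanning the entire token program (ownerPubkey is assumed):

const balances = new Map();
const { value } = await connection.getParsedTokenAccountsByOwner(ownerPubkey, {
  programId: TOKEN_PROGRAM_ID,
});

for (const { account } of value) {
  const { mint, tokenAmount } = account.data.parsed.info;
  balances.set(mint, tokenAmount.uiAmount);
}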

Compressed NFTs

const trees = await connection.getProgramAccounts(SPL_ACCOUNT_COMPRESSION_ID);
const leaves = await Promise.all(
  trees.map(async tree => {
    const canopy = await getConcurrentMerkleTreeAccountInfo(tree.pubkey);
    return getLeafAssetId(canopy, 0, tree.pubkey);
  })
);
// Processes all trees sequentially

Why:

  • Parallel execution: Processes multiple trees simultaneously.
  • Timeouts: Prevents tasks from blocking the entire flow.
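
A sketch of the parallel, timeout-guarded version, reusing the helpers from the snippet above (the 10-second timeout is arbitrary):

const withTimeout = (promise, ms) =>
  Promise.race([
    promise,
    new Promise((_, reject) => setTimeout(() => reject(new Error("timeout")), ms)),
  ]);

// Process every tree in parallel; a single slow tree no longer blocks the rest
const results = await Promise.allSettled(
  trees.map(tree =>
    withTimeout(
      getConcurrentMerkleTreeAccountInfo(tree.pubkey).then(canopy =>
        getLeafAssetId(canopy, 0, tree.pubkey)
      ),
      10_000
    )
  )
);

const leaves = results.filter(r => r.status === "fulfilled").map(r => r.value);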

Network Optimization Patterns

Smart Retry Logic

const getWithRetry = async (signature) => {
  for (let i = 0; i < 3; i++) {
    try {
      return await connection.getTransaction(signature);
    } catch (e) {
      await sleep(1000);
    }
  }
};
// Fixed retry pattern

Why:

  • Adaptive backoff: Dynamically extends wait time for repeated failures.
  • Handles rate limits: Checks for specific errors (e.g., “429 Too Many Requests”).
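
A sketch with exponential backoff and explicit rate-limit handling (retry counts and delays are illustrative):

const getWithRetry = async (signature, maxRetries = 5) => {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      return await connection.getTransaction(signature, {
        maxSupportedTransactionVersion: 0,
      });
    } catch (e) {
      // Back off harder when the node signals rate limiting
      const isRateLimited = e?.message?.includes("429");
      const delay = (isRateLimited ? 2000 : 500) * 2 ** attempt;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
  throw new Error(`Failed to fetch ${signature} after ${maxRetries} retries`);
};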

WebSocket Optimization

const sub1 = connection.onAccountChange(acc1, () => {});
const sub2 = connection.onAccountChange(acc2, () => {});
const sub3 = connection.onAccountChange(acc3, () => {});
// Multiple WebSocket connections

Why:

  • Fewer connections: Consolidates multiple subscriptions into one.
  • Lower overhead: Reduces the complexity of maintaining many WebSocket channels.
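
One way to consolidate, assuming the accounts all belong to the same program: a single program-level subscription with client-side dispatch (watchedAccounts and handleUpdate are illustrative):

const watchedAccounts = new Set([acc1.toBase58(), acc2.toBase58(), acc3.toBase58()]);

// One subscription covers every watched account under the program
const subscriptionId = connection.onProgramAccountChange(programId, (keyedAccountInfo) => {
  const key = keyedAccountInfo.accountId.toBase58();
  if (watchedAccounts.has(key)) {
    handleUpdate(key, keyedAccountInfo.accountInfo);
  }
});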

Custom Data Feeds

connection.onProgramAccountChange(programId, () => {});
// Receives all account changes

Why:

  • Reduced bandwidth: Filter out accounts you don’t care about.
  • Less processing: Limits the data you must handle on each event.
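
A sketch of the filtered subscription, pushing only accounts that match a size and a specific field value (the offset, size, filter value, and processUpdate handler are placeholders):

connection.onProgramAccountChange(
  programId,
  (keyedAccountInfo, context) => {
    // Only matching accounts reach this callback
    processUpdate(keyedAccountInfo);
  },
  "confirmed",
  [
    { dataSize: 165 },
    { memcmp: { offset: 32, bytes: ownerAddress } },
  ]
);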

Transaction Monitoring

setInterval(async () => {
  const sigs = await connection.getSignaturesForAddress(address);
  const newSigs = sigs.filter(sig => !processed.has(sig.signature));
  for (const sig of newSigs) {
    processed.add(sig.signature);
    const tx = await connection.getTransaction(sig.signature);
    // Process tx
  }
}, 1000);
// Polling with high overhead

Why:

  • Push-based: Gets new signatures immediately via logs.
  • Less duplication: Eliminates repeated polling intervals.
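
A sketch of the push-based approach using onLogs, which delivers new signatures that mention the address as they land:

const processed = new Set();

const subscriptionId = connection.onLogs(address, async (logs) => {
  if (processed.has(logs.signature)) return;
  processed.add(logs.signature);

  const tx = await connection.getTransaction(logs.signature, {
    maxSupportedTransactionVersion: 0,
  });
  // Process tx
});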

Best Practices

Use Appropriate Commitment Levels

  • processed for WebSocket subscriptions.
  • confirmed for general queries.
  • finalized only when absolute certainty is required.

Implement Robust Error Handling

  • Use exponential backoff for retries.
  • Handle rate limit (HTTP 429) errors gracefully.
  • Validate responses to avoid processing incomplete or corrupted data.

Optimize Data Transfer

  • Utilize dataSlice wherever possible to limit payload size.
  • Leverage server-side filtering (memcmp and dataSize).
  • Choose the most efficient encoding option (base64, jsonParsed, etc.).

Manage Resources

  • Batch operations to reduce overhead.
  • Cache results to avoid redundant lookups.
  • Bundle multiple instructions into a single transaction where applicable.

Monitor Performance

  • Track RPC usage and latency.
  • Monitor memory consumption for large dataset processing.
  • Log and analyze errors to detect bottlenecks.

Circuit Breakers & Throttling

  • Employ circuit breakers to halt or pause operations under excessive error rates (a minimal sketch follows).
  • Throttle requests to respect rate limits and ensure stable performance.
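
A minimal circuit-breaker sketch built on a consecutive-failure threshold (the class, threshold, and cooldown are illustrative):

class CircuitBreaker {
  constructor(threshold = 5, cooldownMs = 30_000) {
    this.threshold = threshold;
    this.cooldownMs = cooldownMs;
    this.failures = 0;
    this.openedAt = 0;
  }

  async call(fn) {
    // While open, reject immediately until the cooldown has passed
    if (this.failures >= this.threshold && Date.now() - this.openedAt < this.cooldownMs) {
      throw new Error("Circuit open: too many recent failures");
    }
    try {
      const result = await fn();
      this.failures = 0;  // a success closes the circuit
      return result;
    } catch (e) {
      this.failures++;
      if (this.failures >= this.threshold) this.openedAt = Date.now();
      throw e;
    }
  }
}

const breaker = new CircuitBreaker();
const blockhash = await breaker.call(() => connection.getLatestBlockhash());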

By following these techniques and best practices, you can significantly reduce operational costs, enhance real-time responsiveness, and scale more effectively on Solana.