Learn how to decode and parse transaction data from Laserstream to better understand Solana transactions.
When you receive transaction data from Laserstream, there are two important things to look for:
Message → What the user wanted to do (their signed proposal)
Meta → What actually happened (the execution result)
The challenge: Raw transaction data comes as binary byte arrays like <Buffer 00 bf a0 e8...> instead of readable addresses and signatures.
This guide shows you how to decode that binary data into a human-readable format, extract the meaningful fields, and follow the complete transaction story from proposal to execution.
Run the minimal client below. The filter flags drop vote and failed transactions, and the accountsInclude array limits results to activity that touches the Jupiter program ID.
Why decode? Raw Laserstream data contains signatures, account keys, and hashes as binary Uint8Array objects that are unreadable. You need to convert these to base58 strings to make sense of the transaction.
The solution: Laserstream uses Yellowstone gRPC, which provides built-in decoding utilities. Instead of writing separate decoders for each field type, we use one recursive function that converts all binary data to human-readable format.
```typescript
import bs58 from 'bs58';
import { subscribe, CommitmentLevel, SubscribeUpdate, LaserstreamConfig } from '../client';

// Recursive function to convert all Buffer/Uint8Array fields to base58
function convertBuffers(obj: any): any {
  if (!obj) return obj;
  if (Buffer.isBuffer(obj) || obj instanceof Uint8Array) {
    return bs58.encode(obj);
  }
  if (Array.isArray(obj)) {
    return obj.map(item => convertBuffers(item));
  }
  if (typeof obj === 'object') {
    return Object.fromEntries(
      Object.entries(obj).map(([key, value]) => [key, convertBuffers(value)])
    );
  }
  return obj;
}

async function runTransactionSubscription() {
  const config: LaserstreamConfig = {
    apiKey: 'your-api-key',
    endpoint: 'laserstream-endpoint',
  };

  const request = {
    transactions: {
      "Jupiter-transactions": {
        vote: false,
        failed: false,
        accountsInclude: ['JUP6LkbZbjS1jKKwapdHNy74zcZ3tLUZoi5QNyVTaV4']
      }
    },
    commitment: CommitmentLevel.PROCESSED,
    accounts: {},
    slots: {},
    transactionsStatus: {},
    blocks: {},
    blocksMeta: {},
    entry: {},
    accountsDataSlice: []
  };

  const stream = await subscribe(
    config,
    request,
    (update: SubscribeUpdate) => {
      if (update.transaction) {
        // Convert all binary fields to human-readable format
        const decodedTransaction = convertBuffers(update.transaction);
        console.log('💸 Decoded transaction:', JSON.stringify(decodedTransaction, null, 2));

        // Or process specific fields
        processTransaction(update.transaction);
      }
    },
    console.error
  );

  console.log(`✅ stream id → ${stream.id}`);

  process.on('SIGINT', () => {
    stream.cancel();
    process.exit(0);
  });
}

function processTransaction(txUpdate: any) {
  const tx = txUpdate.transaction;
  const meta = tx.meta;

  console.log('Transaction Details:');
  console.log('- Signature:', bs58.encode(tx.signature));
  console.log('- Slot:', txUpdate.slot);
  console.log('- Success:', meta.err === null);
  console.log('- Fee:', meta.fee, 'lamports');
  console.log('- Compute Units:', meta.computeUnitsConsumed);

  // Account keys are already available in the message
  const message = tx.transaction.message;
  if (message.accountKeys) {
    console.log('- Account Keys:');
    message.accountKeys.forEach((key: Uint8Array, index: number) => {
      console.log(`  ${index}: ${bs58.encode(key)}`);
    });
  }

  // Log messages are already UTF-8 strings
  if (meta.logMessages && meta.logMessages.length > 0) {
    console.log('- Log Messages:');
    meta.logMessages.forEach((log: string) => {
      console.log(`  ${log}`);
    });
  }
}

runTransactionSubscription();
```
This approach leverages the built-in decoding while handling the binary fields that need manual conversion. The transaction structure is already parsed - you just need to convert the binary fields to human-readable format.
Now that we can see the decoded data, let’s explore the two main parts of every Laserstream transaction update. Remember from our initial example that each transaction contains two key objects:
transaction.transaction.transaction → the signed message (user’s proposal)
transaction.transaction.meta → the execution metadata (validator’s response)
This two-part structure tells a complete story: what the user requested versus what actually happened. Let’s examine each part in detail.
Transaction Header
numRequiredSignatures tells the validator how many signatures to verify, while the two numReadonly* values label accounts the runtime can treat as read-only, enabling parallel execution.
Account Keys Dictionary
accountKeys is a plain list of public keys that acts as a lookup table. Every later integer in the transaction - programIdIndex, each element in an instruction’s accounts array - points back into this list by index, saving more than a kilobyte per message.
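The lookup works like an ordinary array index. A minimal sketch, assuming already-decoded base58 keys (the type and function names here are illustrative, not part of the Laserstream API):

```typescript
// Illustrative shape of a compiled instruction after decoding.
interface CompiledInstruction {
  programIdIndex: number;   // index into accountKeys
  accounts: number[];       // indices into accountKeys
  data: string;             // base58-encoded instruction data
}

// Resolve every index in an instruction back to a real address.
function resolveInstruction(accountKeys: string[], ix: CompiledInstruction) {
  return {
    programId: accountKeys[ix.programIdIndex],
    accounts: ix.accounts.map(i => accountKeys[i]),
    data: ix.data,
  };
}

// Example with placeholder keys:
const keys = ['FeePayer111', 'Dest111', 'SysProg111'];
const ix = { programIdIndex: 2, accounts: [0, 1], data: '3Bxs4h24hBtQy9rw' };
console.log(resolveInstruction(keys, ix).programId); // 'SysProg111'
```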
Instructions: The Actual Commands
Each instruction contains three key parts:
programIdIndex: Points to an address in the accountKeys array (e.g., index 10 = ComputeBudget111111111111111111111111111111)
accounts: The account indices this instruction touches. In the raw payload this is a byte array of indices; after convertBuffers runs it appears as a base58 string (e.g., "3vtmrQMafzDoG2CBz1iqgXPTnC" decodes to indices [21, 19, 12, 17, 2, 6, 1, 22])
data: The actual instruction data encoded as base58
This design means instead of repeating full 32-byte addresses, each instruction just references positions in the lookup table.
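Because convertBuffers base58-encodes every byte array, a decoded instruction’s accounts field arrives as a base58 string whose underlying bytes are the indices. A dependency-free sketch of recovering them (the tiny decoder is illustrative; in practice you would simply call bs58.decode):

```typescript
// Illustrative base58 decoder so the sketch needs no packages.
const ALPHABET = '123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz';

function base58ToBytes(s: string): number[] {
  const bytes: number[] = [];
  for (const ch of s) {
    let carry = ALPHABET.indexOf(ch);
    if (carry < 0) throw new Error(`invalid base58 character: ${ch}`);
    // Multiply the accumulated number by 58 and add the new digit.
    for (let i = 0; i < bytes.length; i++) {
      carry += bytes[i] * 58;
      bytes[i] = carry & 0xff;
      carry >>= 8;
    }
    while (carry > 0) {
      bytes.push(carry & 0xff);
      carry >>= 8;
    }
  }
  // Leading '1' characters encode leading zero bytes.
  for (const ch of s) {
    if (ch !== '1') break;
    bytes.push(0);
  }
  return bytes.reverse();
}

// Each recovered byte is an index into accountKeys:
console.log(base58ToBytes('3vtmrQMafzDoG2CBz1iqgXPTnC'));
```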
Signatures: Proof of Authorization
signatures contains the cryptographic signatures proving the required accounts authorized this transaction. The number of signatures must match header.numRequiredSignatures.
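That relationship can be checked directly. A small sketch, with field names following the decoded payload shape (signatures and message are siblings on the signed transaction):

```typescript
// Minimal shape for the check; real payloads carry many more fields.
interface DecodedTx {
  signatures: string[];
  message: { header: { numRequiredSignatures: number } };
}

// A well-formed transaction carries exactly as many signatures
// as the header demands.
function signaturesMatchHeader(tx: DecodedTx): boolean {
  return tx.signatures.length === tx.message.header.numRequiredSignatures;
}
```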
Address Table Lookups
If versioned is true, addressTableLookups appears, each entry carrying an on-chain table address plus two index lists (writable and readonly). Lookup tables lift the hard cap on address count to dozens while keeping the packet under the 1,232-byte MTU.
How It All Connects: The Flow
Here’s what happens from first principles:
Build the lookup table: accountKeys lists all addresses this transaction will touch
Set the rules: header specifies how many signatures are required and which accounts are read-only
Create the commands: Each instruction points to:
A program (via programIdIndex → accountKeys[index])
The accounts it needs (via accounts → multiple accountKeys[index] positions)
The instruction data (encoded in data)
Add authorization: signatures proves the required accounts approved this transaction
Set expiration: recentBlockhash ensures this transaction can’t be replayed later
Example walkthrough:
```
accountKeys[15] = "pAMMBay6oceH9fJKBRHGP5D4bD4sWpmSwMn52FMfXEA"
instruction.programIdIndex = 15
→ This instruction calls the Pump.fun AMM program

instruction.accounts = "3vtmrQMafzDoG2CBz1iqgXPTnC"
→ This base58 string decodes to account indices: [21, 19, 12, 17, 2, 6, 1, 22]
→ These indices map to: accountKeys[21], accountKeys[19], accountKeys[12], etc.
→ Which gives us the actual account addresses this instruction needs
```
In a single sweep we now know:
who must sign (header.numRequiredSignatures)
which programs and accounts each instruction touches (programIdIndex + accounts)
the deadline for inclusion (recentBlockhash)
any bonus keys fetched from tables (addressTableLookups)
That is every promise the user makes to the network.
Now let’s examine the second half of the transaction structure: the meta object. While the message shows what the user proposed, the meta reveals what actually happened when the validator executed that proposal.
The validator replies with facts: success or failure, cost, and side-effects.
```json
{
  "err": null
}
```
Execution Result
err is null on success or a structured error code when any instruction fails.
Transaction Costs and Balance Changes
fee shows the base cost - 5,000 lamports per signature - and the first key in accountKeys is always the fee payer, so subtracting its post-balance from its pre-balance proves the debit.
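A sketch of that balance-delta check, assuming the decoded meta carries preBalances and postBalances as plain lamport numbers with the fee payer at index 0:

```typescript
// Lamports the fee payer actually lost: the base fee plus
// anything the transaction itself moved out of the account.
function feePayerDebit(meta: { preBalances: number[]; postBalances: number[] }): number {
  return meta.preBalances[0] - meta.postBalances[0];
}

const meta = { preBalances: [1_000_000, 50], postBalances: [994_000, 50], fee: 5000 };
console.log(feePayerDebit(meta)); // 6000 = 5000 base fee + 1000 transferred
```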
```json
{
  "computeUnitsConsumed": 23432
}
```
Compute Usage
computeUnitsConsumed records the exact workload; pairing it with a chosen micro-lamport price gives the priority fee.
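A sketch of that calculation. Note the micro-lamport price is not part of meta - it is set by the transaction’s ComputeBudget instruction - so here it is an assumed input:

```typescript
// Priority fee in lamports, given consumed compute units and a
// compute-unit price in micro-lamports (1 lamport = 1,000,000 micro-lamports).
function priorityFeeLamports(computeUnitsConsumed: number, microLamportsPerCu: number): number {
  return (computeUnitsConsumed * microLamportsPerCu) / 1_000_000;
}

console.log(priorityFeeLamports(23432, 10_000)); // 234.32
```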
Cross-Program Invocations
Each item in innerInstructions carries an index pointing back to its parent instruction, letting you trace every cross-program call like a stack trace.
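A sketch of grouping CPI calls under their parent, assuming the decoded meta.innerInstructions shape where each entry carries an index pointing at its top-level instruction:

```typescript
// One entry per top-level instruction that spawned inner instructions.
interface InnerGroup {
  index: number;            // index of the parent top-level instruction
  instructions: unknown[];  // the CPI calls it made, in order
}

// Build a parent-index → inner-instructions map for stack-trace-style views.
function cpiByParent(inner: InnerGroup[]): Map<number, unknown[]> {
  const byParent = new Map<number, unknown[]>();
  for (const group of inner) {
    byParent.set(group.index, [...(byParent.get(group.index) ?? []), ...group.instructions]);
  }
  return byParent;
}
```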
Token Balance Changes
preTokenBalances and postTokenBalances list SPL-Token accounts with a ready-made uiTokenAmount, so you avoid doing the decimal math against each mint’s decimals yourself.
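A sketch of diffing the two lists per account and mint, using uiAmount directly (the TokenBalance shape mirrors the decoded entries; uiAmount can be null for empty accounts):

```typescript
interface TokenBalance {
  accountIndex: number;                       // index into accountKeys
  mint: string;                               // token mint address
  uiTokenAmount: { uiAmount: number | null }; // pre-scaled by the mint's decimals
}

// Net token movement per (account, mint) pair between pre and post snapshots.
function tokenDeltas(pre: TokenBalance[], post: TokenBalance[]) {
  const key = (b: TokenBalance) => `${b.accountIndex}:${b.mint}`;
  const before = new Map<string, number>();
  for (const b of pre) before.set(key(b), b.uiTokenAmount.uiAmount ?? 0);
  return post.map(b => ({
    accountIndex: b.accountIndex,
    mint: b.mint,
    delta: (b.uiTokenAmount.uiAmount ?? 0) - (before.get(key(b)) ?? 0),
  }));
}
```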
Resolved Lookup Tables
loadedWritableAddresses and loadedReadonlyAddresses confirm exactly which table keys were pulled in, closing the loop with the proposal.
Everything under message is what the user asked for; everything under meta is what the cluster observed.
Read the payload in that order - proposal first, outcome second - and every parameter explains the next without overwhelming detail. Add or drop decoders as you like, and the same stream becomes a mint detector, fee profiler, or audit logger without ever leaving the Laserstream feed.