Parsing and Decoding

Advanced Data Decoding and Accessibility

This is more than simple storage of Solana transactions: the platform has decoded the majority of these transactions, making them available in both binary serialized and deserialized formats. By mastering the ability to quickly and cost-effectively rescan the entire Solana blockchain, the team has developed a sophisticated transaction and account decoder. The decoder follows a three-level approach, sourcing schema information from the blockchain itself, from program repositories, and from code inference.
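The three-level approach can be sketched as a simple priority chain: try the on-chain schema first, then the tracked repository, and fall back to inference only when neither exists. This is a minimal illustrative sketch; the function and source names are assumptions, not the platform's actual API.

```python
# Hypothetical sketch of the three-level schema-resolution strategy:
# 1. on-chain IDL, 2. tracked GitHub repository, 3. code inference.
def resolve_idl(program_id, onchain_idls, repo_idls, infer_idl_from_source):
    """Try each IDL source in priority order, cheapest/most reliable first."""
    if program_id in onchain_idls:            # 1. IDL published on-chain
        return onchain_idls[program_id]
    if program_id in repo_idls:               # 2. IDL built from the tracked repo
        return repo_idls[program_id]
    return infer_idl_from_source(program_id)  # 3. heuristic / LLM inference

# Usage with stubbed sources (program IDs and IDL bodies are illustrative):
onchain = {"ProgramA": {"name": "program_a"}}
repos = {"ProgramB": {"name": "program_b"}}
idl = resolve_idl("ProgramB", onchain, repos, lambda pid: {"name": "inferred"})
```

The ordering reflects trust: an on-chain IDL is authoritative, a repository IDL is maintained by the project, and an inferred IDL is a best-effort reconstruction.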

Utilization of IDL/ABI and Decoding Strategies

In Solana, the Interface Description Language (IDL) is the standard for describing account and transaction structures. The platform harnesses this standard, extracting IDL schemas directly from the blockchain when available. The team has also curated a list of the most popular Solana programs and tracks their GitHub repositories, giving the analyzer access to the source code of the corresponding programs. When an IDL is not published on-chain, the platform either obtains it from the project itself or, if none is available, the decoder heuristically analyzes the code to deduce the necessary IDL, enabling the decoding of on-chain data. This step uses state-of-the-art LLM-based extraction techniques with subsequent cleaning, testing, and recompiling.
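Once an IDL is in hand, decoding is mechanical: walk the declared argument list and read each fixed-width field from the instruction's byte buffer. The IDL fragment and field layout below are illustrative assumptions (real Anchor IDLs carry richer type information); this only shows the principle.

```python
import struct

# Illustrative IDL argument list for a hypothetical instruction.
IDL_ARGS = [("amount", "u64"), ("decimals", "u8")]

# Solana programs serialize integers little-endian with fixed widths.
FMT = {"u8": "B", "u16": "<H", "u32": "<I", "u64": "<Q"}
SIZE = {"u8": 1, "u16": 2, "u32": 4, "u64": 8}

def decode_args(data: bytes, args):
    """Decode fixed-width fields from instruction data per the IDL arg list."""
    out, offset = {}, 0
    for name, ty in args:
        (out[name],) = struct.unpack_from(FMT[ty], data, offset)
        offset += SIZE[ty]
    return out

raw = struct.pack("<QB", 1_000_000, 6)          # example instruction payload
print(decode_args(raw, IDL_ARGS))               # {'amount': 1000000, 'decimals': 6}
```

Variable-length and composite types (strings, vectors, nested structs) extend the same loop with length prefixes and recursion.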

Adaptive and Error-Tolerant Data Processing

The platform is built to adapt to the evolution of unfinalized programs, accommodating changes in their structure. Erroneous deserializations are not a concern: because the binary dataset of transactions is immutable, constructing a new derived dataset from it is a relatively inexpensive operation. Currently, the platform can perform arbitrary calculations for each transaction using the Lua language, and there are plans to extend this feature by allowing transaction-processing scripts to be compiled to WebAssembly.
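The error-tolerance property follows from the architecture: since the raw binary records are the immutable source of truth, a corrected decoder can simply be re-run over them. A minimal sketch of that re-derivation, with illustrative names:

```python
# Sketch of error-tolerant re-derivation: failures are recorded, not fatal,
# and the whole derived dataset can be rebuilt from the immutable binaries
# whenever the decoder is fixed or a program's structure changes.
def rederive(binary_records, decoder):
    """Run a decoder over immutable records; collect rows and failures."""
    derived, errors = [], []
    for record in binary_records:
        try:
            derived.append(decoder(record))
        except Exception as exc:       # erroneous deserialization is tolerated
            errors.append((record, exc))
    return derived, errors

# Usage: one record uses a layout this decoder does not yet understand.
records = [b"\x01\x00", b"bad", b"\x02\x00"]
def decode(rec):
    if rec == b"bad":
        raise ValueError("unknown layout")
    return rec[0]
rows, errs = rederive(records, decode)   # rows == [1, 2], errs holds the failure
```

Keeping failures aside rather than aborting lets a later decoder version reprocess exactly the records that broke.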

Data Visualization and Advanced Filtering Capabilities

The platform offers a vast array of well-structured data and gives the community tools to build useful, derivative live datasets on top of it. These datasets can be visualized through interactive dashboards, APIs, and other data consumption tools. The platform also features rapid data filtering. A filtering condition is intelligently split into two parts: easily verifiable conditions are compiled into prefiltering predicates that discard unnecessary data at the scanning stage, while the remainder of the condition is checked after structural decoding, on already structured and possibly transformed data. The resulting data can then be aggregated, sorted, saved in derivative datasets, and, in some cases, even joined together.
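The two-stage split described above can be sketched as a cheap byte-level prefilter that runs during the scan, followed by the full predicate on decoded rows. The record layout, tag byte, and threshold below are illustrative assumptions, not the platform's actual wire format.

```python
# Two-stage filtering sketch: discard most records from raw bytes, decode
# only survivors, then check the remainder of the condition on structured data.
PROGRAM_TAG = b"\x01"  # hypothetical 1-byte program tag at offset 0

def prefilter(raw: bytes) -> bool:
    """Easily verifiable condition, compiled to run at the scanning stage."""
    return raw[:1] == PROGRAM_TAG

def full_predicate(row: dict) -> bool:
    """Remainder of the condition, checked on already-structured data."""
    return row["amount"] > 500

def scan(records, decode):
    """Decode only records that pass the prefilter, then apply the full check."""
    return [row
            for row in (decode(raw) for raw in records if prefilter(raw))
            if full_predicate(row)]

# Usage: tag byte followed by a little-endian u64 amount (illustrative layout).
decode = lambda raw: {"amount": int.from_bytes(raw[1:9], "little")}
records = [
    b"\x01" + (600).to_bytes(8, "little"),   # matching program, amount 600
    b"\x02" + (900).to_bytes(8, "little"),   # wrong program, dropped cheaply
    b"\x01" + (100).to_bytes(8, "little"),   # matching program, amount too low
]
matches = scan(records, decode)              # [{'amount': 600}]
```

The payoff is that the expensive step, structural decoding, runs only on the small fraction of records the byte-level predicate lets through.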
