This document provides a brief overview of the construction of StreamInterceptors. Note that this is an initial draft; we are currently collaborating with our early clients to establish the foundational specifications for StreamInterceptors.

Before proceeding, note that in most cases neither dataset builders nor end users will interact with StreamInterceptors in their raw form; all interactions will be facilitated through a graphical user interface (GUI).

Anatomy of an interceptor

Stream interceptors define how datasets are created and stored. This can range from simply selecting desired transactions and streaming them via webhooks, to constructing a full dataset structure managed by DH3.

Let's break down each section:

This section specifies the initial input of raw data: the source channel and, optionally, the time or block range to read from. The available options depend on the type of interceptor used.


Here you configure the computing resources allocated to the interceptor: the number of replicas and the CPU and memory limits per replica. As of now, only the ArchiveStreamInterceptor requires this configuration.


This acts as a filtering layer that runs before the more compute-intensive stages. It efficiently filters streams without fully extracting or transforming transaction data.
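As a rough sketch, a filter stage might look like the snippet below, reusing the `dh3/solanaTransactionFilter` beam from the basic example further down. The matcher fields inside `instructions` are illustrative assumptions, not a confirmed schema.

```yaml
# Hypothetical spec.filter sketch: match transactions cheaply
# before the heavier extract/transform/load stages run.
filter:
  - beam: 'dh3/solanaTransactionFilter'
    # each entry describes an instruction pattern to match;
    # the field name below is an assumed placeholder
    instructions:
      - programId: '<program address>'
```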

spec.extract -> spec.transform -> spec.load

When the filter finds matching transactions, it sends them through the Extract, Transform, and Load steps. This process covers everything from simply parsing transactions and sending them through webhooks to complex data extraction involving additional queries to other datasets or live RPCs.

For these three steps we offer useful starters: extractors for parsing full Solana transactions with extensive program support, transformers that map transaction data to database fields, and a variety of sinks (loaders) supporting both external and DH3-managed databases.
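A minimal sketch of the three stages wired together is shown below. Only `dh3/solanaTransactionParser` appears in the basic example; the other beam names (`dh3/fieldMapper`, `dh3/managedTable`) and their options are assumed placeholders for the starters described above.

```yaml
# Illustrative sketch only; beam names other than the parser
# are assumptions, not confirmed starters.
extract:
  - beam: 'dh3/solanaTransactionParser'
    instructions: true
    events: true
transform:
  - beam: 'dh3/fieldMapper'   # map parsed fields to table columns
    mapping:
      signature: tx_signature
      slot: block_slot
load:
  - beam: 'dh3/managedTable'  # DH3-managed database sink
    table: meta_updates
```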

The community can enhance this further by replacing any part with custom logic written in WASM, providing a flexible and powerful platform for tailored solutions.
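To illustrate, swapping a built-in transformer for community logic might look like the following. The `wasm` key, module reference, and `config` options here are all assumptions about how such a replacement could be declared, not a confirmed interface.

```yaml
# Hypothetical: replace the stock transform beam with a
# custom WASM module; keys shown are assumed, not confirmed.
transform:
  - wasm: 'myorg/custom-enricher'   # community-built module
    config:
      lookupDataset: 'myorg/token-registry'
```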

Types of interceptors




Basic example

kind: ArchiveStreamInterceptor
name: 'filter-meta-updates'
spec:
  channel: sol.transactions.regular
  timeRange: [111, 222]
  blockRange: [111, 222]
  replicas: 10
  limit: # limits per replica
    cpu: 1
    mem: 1Gi
  filter:
    - beam: 'dh3/solanaTransactionFilter'
      instructions: [{},{}]
  extract:
    - beam: 'dh3/solanaTransactionParser'
      includeFieldsMeta: true
      includeAccountsMeta: true
      instructions: true
      events: true
  transform: []
  load:
    - beam: 'dh3/webhook'
      urls: # key name assumed; original listed the URLs directly
        - url1
        - urlbc
      batch: 100
      gzip: true
