
Interceptors

This document provides a brief overview of the construction of StreamInterceptors. Note that this is an initial draft; we are currently collaborating with our early clients to establish the foundational specifications for StreamInterceptors.

Before proceeding, note that in most cases neither dataset builders nor end users will interact with StreamInterceptors in their raw form; all interactions will be facilitated through a graphical user interface (GUI).

Anatomy of an interceptor

Stream interceptors define how data sets are created and stored. This can range from simply selecting desired transactions and streaming them via webhooks to constructing a full data set structure managed by DH3.
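Every interceptor manifest shares the same top-level skeleton. The sketch below uses the field names from the basic example at the end of this page; the kind and name are placeholders:

apiVersion: interceptor.dh3.io/v1beta
kind: StreamInterceptor
metadata:
  name: 'my-interceptor'   # placeholder name
spec:
  stream: {}      # where the raw data comes from
  compute: {}     # resources (ArchiveStreamInterceptor only)
  filter: []      # cheap pre-filtering
  extract: []     # parse matching data
  transform: []   # reshape it
  load: []        # deliver or store it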

Let's break down each section:

spec.stream

This section specifies the initial input of raw data. Depending on the type of interceptor, it can include additional options such as time or block ranges.
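For example, the archive interceptor in the basic example below reads from a Solana transaction channel and bounds the replay with time and block ranges (the numeric bounds are placeholders):

stream:
  channel: sol.transactions.regular
  options:
    timeRange: [111, 222]    # placeholder bounds
    blockRange: [111, 222]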

spec.compute

Here you configure the computing resources allocated to the interceptor. As of now, only ArchiveStreamInterceptor requires this configuration.
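As in the basic example below, the compute section sets the replica count and per-replica resource limits:

compute:
  replicas: 10   # parallel workers
  limit:         # limits per replica
    cpu: 1
    mem: 1Gi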

spec.filter

This acts as a filtering layer before utilizing more compute-intensive resources. It efficiently filters through streams without fully extracting or transforming transaction data.
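The basic example below leaves the matcher objects empty; a sketch with a hypothetical per-instruction matcher might look like the following (programId is an illustrative field name, not a confirmed part of the schema):

filter:
  - beam: 'dh3/solanaTransactionFilter'
    instructions:
      - programId: '...'   # hypothetical matcher; the real schema is still being specified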

spec.extract -> spec.transform -> spec.load

When the filter identifies required matches, it sends the matching data through the Extract, Transform, and Load steps. This process handles everything from simple transaction parsing and delivery via webhooks to complex data extraction involving additional queries to other datasets or live RPCs.

For these last three steps, we offer useful starters, such as extractors for parsing full Solana transactions with extensive program support, transformers that map transaction data to database fields, and loader sinks that support both external and DH3-managed databases.
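Assembled together, a pipeline built from these starters could look like the following sketch (dh3/solanaTransactionParser and dh3/webhook appear in the basic example below; dh3/mapFields is a hypothetical transformer name used only for illustration):

extract:
  - beam: 'dh3/solanaTransactionParser'   # starter: parses full Solana transactions
transform:
  - beam: 'dh3/mapFields'                 # hypothetical: maps transaction data to database fields
load:
  - beam: 'dh3/webhook'                   # starter: streams results to external endpoints
    url:
      - https://example.com/hook          # illustrative endpoint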

The community can enhance this further by replacing any part with custom logic written in WASM, providing a flexible and powerful platform for tailored solutions.
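For instance, assuming community beams are referenced the same way as built-in ones, a custom WASM transformer might be swapped in like this (the beam name and referencing convention are hypothetical):

transform:
  - beam: 'acme/customEnrichment'   # hypothetical community beam compiled to WASM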

Types of interceptors

ArchiveStreamInterceptor

StreamInterceptor

DedicatedStreamInterceptor

Basic example

apiVersion: interceptor.dh3.io/v1beta
kind: ArchiveStreamInterceptor
metadata:
  name: 'filter-meta-updates'
spec:
  stream:
    channel: sol.transactions.regular
    options:
      timeRange: [111, 222]
      blockRange: [111, 222]
  compute:
    replicas: 10
    limit: # limits per replica
      cpu: 1
      mem: 1Gi
  filter:
    - beam: 'dh3/solanaTransactionFilter'
      instructions: [{},{}]
  extract:
    - beam: 'dh3/solanaTransactionParser'
      includeFieldsMeta: true
      includeAccountsMeta: true
      instructions: true
      events: true
  transform: []
  load:
    - beam: 'dh3/webhook'
      url:
        - url1
        - urlbc
      batch: 100
      gzip: true
