Developer Preview of Conduit

Background

The Algorand blockchain stores the entire history of all blocks and transactions. One of our goals is to provide an easy way to access this data; the monolithic “Indexer” currently provides this functionality. After months of design discussions, community feedback, and incremental implementation, we are very excited to release a developer preview of Conduit, a new way to access Algorand chain data. We encourage the community to test and experiment with the system and to share feedback that will help us improve it.

Conduit is a modular framework composed of importer, processor, and exporter plugins. Together these form a flexible data pipeline that enables easier access to the blockchain data applications need.
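
To make the shape of a pipeline concrete, here is a minimal sketch in Go of how the three stages fit together. The type and function names (Importer, Processor, Exporter, BlockData, runPipeline) are illustrative stand-ins, not Conduit's actual plugin API:

```go
package main

import "fmt"

// BlockData is a stand-in for the payload that flows through the pipeline;
// the real data carries the full block and much richer state.
type BlockData struct {
	Round uint64
	Txns  []string
}

// The three plugin roles: an Importer fetches blocks, Processors transform
// them, and an Exporter persists the result.
type Importer interface {
	GetBlock(round uint64) (BlockData, error)
}

type Processor interface {
	Process(b BlockData) (BlockData, error)
}

type Exporter interface {
	Receive(b BlockData) error
}

// runPipeline chains the stages: import -> process (in order) -> export.
func runPipeline(imp Importer, procs []Processor, exp Exporter, start, end uint64) error {
	for round := start; round <= end; round++ {
		blk, err := imp.GetBlock(round)
		if err != nil {
			return err
		}
		for _, p := range procs {
			blk, err = p.Process(blk)
			if err != nil {
				return err
			}
		}
		if err := exp.Receive(blk); err != nil {
			return err
		}
	}
	return nil
}

// Stub implementations so the sketch compiles and runs end to end.
type fakeImporter struct{}

func (fakeImporter) GetBlock(round uint64) (BlockData, error) {
	return BlockData{Round: round, Txns: []string{"pay", "axfer"}}, nil
}

type passthrough struct{}

func (passthrough) Process(b BlockData) (BlockData, error) { return b, nil }

type printExporter struct{}

func (printExporter) Receive(b BlockData) error {
	fmt.Printf("round %d: %d txn(s)\n", b.Round, len(b.Txns))
	return nil
}

func main() {
	if err := runPipeline(fakeImporter{}, []Processor{passthrough{}}, printExporter{}, 1, 3); err != nil {
		fmt.Println("pipeline error:", err)
	}
}
```

Because each stage is just an interface, swapping the exporter or inserting another processor leaves the rest of the pipeline untouched, which is what makes the framework database agnostic and extensible.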

Conduit is designed with the following goals in mind:

  • easy to use and maintain
  • extensible by the ecosystem
  • database agnostic
  • easily scalable
  • adaptable and customizable
  • encouraging of innovation and experimentation

With Conduit, users configure their own data pipelines for filtering, aggregating, and storing Algorand transactions and accounts. Processor and exporter plugins let users access on-chain data, slice and dice it however they need, and store the results in their database of choice.
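
A pipeline is declared in a configuration file that names one importer, an ordered list of processors, and one exporter, each with its own plugin-specific settings. The Go structs below are a minimal model of that shape; the field names and sample settings (netaddr, connection-string) are assumptions for illustration, not Conduit's exact configuration schema:

```go
package main

import "fmt"

// PluginConfig names a plugin and carries its plugin-specific settings.
type PluginConfig struct {
	Name   string
	Config map[string]string
}

// PipelineConfig mirrors the overall shape of a pipeline declaration:
// exactly one importer, zero or more processors, exactly one exporter.
type PipelineConfig struct {
	Importer   PluginConfig
	Processors []PluginConfig
	Exporter   PluginConfig
}

func main() {
	cfg := PipelineConfig{
		Importer: PluginConfig{
			Name:   "algod",
			Config: map[string]string{"netaddr": "http://localhost:8080"},
		},
		Processors: []PluginConfig{
			{Name: "filter_processor"},
		},
		Exporter: PluginConfig{
			Name:   "postgresql",
			Config: map[string]string{"connection-string": "host=localhost user=algorand"},
		},
	}
	fmt.Printf("%s -> %d processor(s) -> %s\n",
		cfg.Importer.Name, len(cfg.Processors), cfg.Exporter.Name)
}
```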

The existing Indexer will be updated to use Conduit under the hood; going forward, it should be thought of as one specific instantiation of the Conduit modular framework, with a preconfigured pipeline made up of an algod importer, a block processor, and a PostgreSQL exporter. If you plan to continue using the Indexer, this will be a transparent upgrade that requires no additional work on your end. A preview version of the Indexer that uses Conduit under the hood has also been provided.

What we’ve done so far:

Simplified Code

We refactored the Indexer to use Conduit by separating the import, processing, and export stages.

The Indexer previously ingested raw blocks from algod, processed them, and wrote the resulting account data to a PostgreSQL database. Processing each round required querying that same database for the initial account states, which created a circular dependency on the database. We removed this circular dependency in July with the Indexer 2.13.0 release: an account cache is now maintained in the Indexer data directory and used during block processing.
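
As a rough illustration of the non-circular design, the sketch below keeps account state in an in-memory map, standing in for the on-disk cache in the data directory, and updates it as each block is applied, so no database read is needed per round. The types and logic are simplified assumptions, not the Indexer's actual cache implementation:

```go
package main

import "fmt"

// accountCache stands in for the local account cache kept in the Indexer
// data directory; real account state is far richer than a single balance.
type accountCache map[string]uint64

// payment is a simplified transaction: move Amount from Sender to Receiver.
type payment struct {
	Sender, Receiver string
	Amount           uint64
}

// applyBlock updates cached balances from a block's transactions. The
// initial states come from the cache itself, not from a database query,
// which is what breaks the old circular dependency.
func (c accountCache) applyBlock(txns []payment) {
	for _, t := range txns {
		c[t.Sender] -= t.Amount
		c[t.Receiver] += t.Amount
	}
}

func main() {
	cache := accountCache{"A": 1_000_000, "B": 0}
	block := []payment{{Sender: "A", Receiver: "B", Amount: 250_000}}
	cache.applyBlock(block)
	fmt.Println(cache) // map[A:750000 B:250000]
}
```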

By simplifying the code and avoiding reads from a remote database, we improved performance. Because each round is now processed in a simpler, non-circular fashion, the time it takes to initialize a new deployment has also improved. This new architecture laid the foundation for Conduit.

Framework and Plugins

We built the Conduit framework to enable users to configure their own data pipelines composed of three components: an importer, processor(s), and an exporter.

We have built an initial set of plugins, including:

  • algod importer: fetches blocks one by one from the algod REST API
  • filter_processor: filters transactions so that only the ones you want are kept
  • postgresql: writes block data to a PostgreSQL database
  • file_writer: writes block data to files (see the sketch after this list)
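
To give a concrete picture of what a simple exporter like file_writer does, the sketch below writes each block it receives to a JSON file named after its round. The file layout and naming scheme here are assumptions for illustration, not the plugin's actual output format:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
)

// blockData is a simplified stand-in for the block payload being exported.
type blockData struct {
	Round uint64   `json:"round"`
	Txns  []string `json:"txns"`
}

// writeBlock persists one block as <dir>/block_<round>.json, which is
// roughly the idea behind a file-based exporter.
func writeBlock(dir string, b blockData) error {
	buf, err := json.MarshalIndent(b, "", "  ")
	if err != nil {
		return err
	}
	name := filepath.Join(dir, fmt.Sprintf("block_%d.json", b.Round))
	return os.WriteFile(name, buf, 0o644)
}

func main() {
	dir, err := os.MkdirTemp("", "conduit-demo")
	if err != nil {
		panic(err)
	}
	if err := writeBlock(dir, blockData{Round: 42, Txns: []string{"pay"}}); err != nil {
		panic(err)
	}
	fmt.Println("wrote blocks to", dir)
}
```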

Developers can also build their own plugins for Conduit that can be used in conjunction with the Indexer or as part of a new system.
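
For a flavor of what building your own plugin involves, here is a sketch of a hypothetical custom exporter that, instead of persisting blocks, keeps a running transaction count. The Exporter interface shown is the same illustrative stand-in used in the pipeline sketch above, not Conduit's real plugin registration API:

```go
package main

import "fmt"

// Minimal stand-ins for the pipeline types; Conduit's real plugin API differs.
type BlockData struct {
	Round uint64
	Txns  []string
}

type Exporter interface {
	Receive(b BlockData) error
	Close() error
}

// statsExporter is a hypothetical custom plugin: it accumulates a running
// transaction count rather than writing blocks to storage.
type statsExporter struct {
	total uint64
}

func (s *statsExporter) Receive(b BlockData) error {
	s.total += uint64(len(b.Txns))
	fmt.Printf("round %d: running total %d txn(s)\n", b.Round, s.total)
	return nil
}

func (s *statsExporter) Close() error {
	fmt.Printf("final total: %d txn(s)\n", s.total)
	return nil
}

func main() {
	var exp Exporter = &statsExporter{}
	exp.Receive(BlockData{Round: 1, Txns: []string{"pay", "appl"}})
	exp.Receive(BlockData{Round: 2, Txns: []string{"axfer"}})
	exp.Close()
}
```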

Call to Action

We are asking the community to test and experiment with the plugins we’ve built, and to try building their own. Our documentation explains Conduit, lists the current plugins, and walks through building a plugin.

Conduit enables users to do things like:

  • filter for only the transactions you want (e.g., all transactions by sender x, all transactions greater than 100,000 Algos, or even all transactions by sender x that are greater than 100,000 Algos; see the sketch after this list)
  • prune data by automatically discarding old transactions (e.g., keep only transactions after a certain round)
  • build a plugin that works with the filter processor to trigger on specific transactions (e.g., detect whenever an NFT is sold on a marketplace and send a transaction that pays royalties to the creator)
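
As a sketch of the first use case, the snippet below combines a sender check and an amount check into one composable predicate. In the real filter processor these rules live in the pipeline configuration rather than in Go code; txn, txnFilter, and the and combinator here are purely illustrative:

```go
package main

import "fmt"

// txn is a simplified transaction carrying only the fields we filter on;
// real Algorand transactions have many more, and amounts are in microAlgos.
type txn struct {
	Sender string
	Algos  uint64
}

// txnFilter reports whether a transaction should be kept.
type txnFilter func(txn) bool

// and builds a filter that keeps a transaction only if every sub-filter does.
func and(filters ...txnFilter) txnFilter {
	return func(t txn) bool {
		for _, f := range filters {
			if !f(t) {
				return false
			}
		}
		return true
	}
}

func main() {
	// Keep transactions by sender x that are greater than 100,000 Algos.
	keep := and(
		func(t txn) bool { return t.Sender == "SENDER_X" },
		func(t txn) bool { return t.Algos > 100_000 },
	)
	block := []txn{
		{Sender: "SENDER_X", Algos: 500_000},
		{Sender: "SENDER_Y", Algos: 500_000},
		{Sender: "SENDER_X", Algos: 10},
	}
	for _, t := range block {
		if keep(t) {
			fmt.Println("keep:", t.Sender, t.Algos)
		}
	}
}
```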

With these possible use cases and many others, we hope that the ecosystem will start to experiment with Conduit and provide feedback in the #conduit-preview Discord channel. We are seeking feedback related to bugs, usability, features, and documentation.