This is the official registry of actions supported on Rabbithole.gg & Terminal.
- Clone the repo
- Run `pnpm install`
- Navigate to a package or create a new one
To run all tests:

```sh
pnpm test
```

To run tests for an individual package:

```sh
pnpm test --filter=@rabbitholegg/questdk-plugin-connext
```

Replace the filter param with the package you're trying to target.
To build all projects except `@rabbitholegg/questdk-plugin-project`, run the following command:

```sh
pnpm build
```

This command utilizes Turbo to optimize the build process, ensuring that only relevant changes trigger rebuilds and significantly speeding up development.
Maintain code quality and consistency with our linting commands. To lint all projects except `@rabbitholegg/questdk-plugin-project`, run:

```sh
pnpm lint
```

To automatically fix linting issues, run:

```sh
pnpm lint:fix
```

These commands help ensure that your code adheres to our coding standards and styles.
Encountering build issues or need to start from scratch? The `nuke` command is your go-to solution. It cleans your working directory by removing all untracked files and directories, then reinstalls dependencies:

```sh
pnpm nuke
```

This powerful command is ideal for resetting your project to a clean state.

Remember to replace the filter parameter with the specific package you're targeting for individual commands as needed.
In order to publish, you need to make sure the pull request you're submitting has a changeset (if you don't want to publish, this isn't needed). To generate a changeset, run `pnpm changeset`, select a change type (major, minor, or patch) based on semantic versioning, and draft a short summary of the change.
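For reference, the generated changeset is a small markdown file in the `.changeset/` directory; it looks something like this (the package name and summary here are illustrative):

```md
---
"@rabbitholegg/questdk-plugin-connext": patch
---

Fix recipient filtering for bridge withdrawals
```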
After this, all you need to do is push and merge the pull request, and the GitHub Action will handle versioning and publishing.
We use Conventional Commits to make changelogs and versioning a breeze.
Oftentimes when testing it's necessary to link the local questdk repo into this one. We handle this through package linking. We recommend using direct linking, not global linking. If local questdk changes don't seem to be recognized, try deleting the directory `node_modules/.pnpm/@rabbitholegg+questdk` (there may be several version-specific copies; delete all of them). Also remember that the package must be built for changes to take effect in this repo. It should not be necessary to re-link after builds. At times it may be necessary to restart the TypeScript server when working with a linked package in VSCode.
In order to link the questdk package, run this from the root of this package:

```sh
pnpm link path/to/questdk
```
If you encounter `Module not found: Can't resolve 'viem/chains'` or another dependency-related issue immediately after linking, you most likely need to rebuild the package that was linked into the host package.
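For example (the path is illustrative, and this assumes questdk also builds with pnpm):

```sh
# Rebuild the linked package so its fresh output is picked up here
cd path/to/questdk && pnpm build
```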
Quest Plugins are how we translate transaction data into actions. It’s how our platform can take information from smart contracts deployed for any EVM native protocol and quickly parse it into a standardized format. Using this standardized action information we determine if a given user has transactions that meet the specific criteria for a given quest.
For example, writing a plugin for Uniswap that translates a `swap` action allows users to create and complete Quests on Uniswap. The plugin is used by our indexing service to parse transaction data into specific information about that action, allowing quests to target certain amounts, certain recipients, or even certain tokens.
At this time we only support plugins for `swap`, `mint`, and `bridge`, but we're constantly adding new actions (`stake` forthcoming). If there's a specific action you'd like supported, please reach out. Some protocol integrations are more complex than others.
Plugins are necessary because there is no standard for how protocols should describe actions, so implementations vary across protocols. This makes it difficult to understand the type of action being performed just from transaction data. Plugins serve two purposes: integrate protocol actions using a standard schema, and provide an open source registry of curated protocol actions approved for use on Rabbithole and Terminal.
Without a plugin, it’s not possible to create quests for a specific action, since we don’t have an ability to parse the transaction data and understand the actions that are occurring in those transactions. These plugins allow us to store action information in a standardized way. Once the action has been standardized we can use any of these standard elements as criteria for quests on the platform. As we progress, this also unlocks the ability to perform advanced analytics on a given class of actions.
In the example given above, Uniswap uses an extremely specialized pattern for their Swap contract, relying on a bytes array for all input data. Without a plugin to translate this information into a standardized format that includes `chainId`, `amountIn`, or `recipient`, it wouldn't be possible for us to establish specific success criteria for quests deployed through our terminal. In this particularly obtuse example, all of the inputs are tucked into a bytes array, so without our plugin it's impossible to know these common attributes that are shared between all swaps.
Once a protocol has a plugin merged into our plugin repo, that protocol will automatically be a valid option for quest creation on our platform through the quest creation terminal.
Plugin implementation is relatively simple, although it requires a strong understanding of the project you're integrating with. Oftentimes we rely on a project's API to get fresh and consistent information about it. Each plugin has a user-defined `pluginId` that needs to be added to the plugin registry and should be descriptive of the project you're integrating. We also require functions that return the list of all supported chains [`getSupportedChainIds`] and the list of supported tokens [`getSupportedTokenAddresses`] for each supported chain. The supported chains and tokens are often where a project-specific API can come in handy. The most complex aspect of plugin implementation is the `bridge`, `mint`, and `swap` functions, which are used to return a `TransactionFilter`. On our backend, we store this filter in the quest's `actionSpec`, which we later read when applying filters in our indexer. The filter generally provides a way of finding or filtering out transactions that fulfill the requirements of a quest's `action` property.
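For a sense of shape, the helper functions can be as simple as the following sketch (the token map here is a placeholder; real plugins often fetch this data from a project-specific API, and the exact signatures should be checked against the questdk types):

```ts
import { type Address } from 'viem'

// Placeholder data — a project API or constants file would supply this
const CHAIN_ID_TO_TOKENS: Record<number, Address[]> = {
  1: ['0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'], // USDC on mainnet
}

export const getSupportedChainIds = async (): Promise<number[]> =>
  Object.keys(CHAIN_ID_TO_TOKENS).map(Number)

export const getSupportedTokenAddresses = async (
  chainId: number,
): Promise<Address[]> => CHAIN_ID_TO_TOKENS[chainId] ?? []
```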
When developing a plugin the steps are as follows:
- Create a new package in `packages` by duplicating the template folder, following the naming convention of `questdk-plugin-<project>`
- Fill out the appropriate test transaction information in the `test-transaction.ts` file. If you find an example of a transaction, this tool can help generate the test-transaction data. These transactions are meant to cover all of the edge cases for a given action. Oftentimes the contract and function signature will differ between these transaction types. In extreme situations there may be additional edge cases (handling of Matic on Polygon, for example), but ensure that every situation where a different contract or a different function signature is used is fully captured in the file. These are the transactions you should be testing against, and you can allow these edge cases to influence how the plugin is developed.
```ts
// BRIDGE TEST TRANSACTIONS
const BRIDGE_ETH_L1_TO_L2 = {}
const BRIDGE_ETH_L2_TO_L1 = {}
const BRIDGE_TOKEN_L1_TO_L2 = {}
const BRIDGE_TOKEN_L2_TO_L1 = {}

// SWAP TEST TRANSACTIONS
const SWAP_TOKEN_TO_TOKEN = {}
const SWAP_TOKEN_TO_ETH = {}
const SWAP_ETH_TO_TOKEN = {}

// MINT TEST TRANSACTIONS
const MINT = {}
```
- Implement `IActionPlugin` and export the interface as a top-level package export. In the case that not all actions are implemented, return `PluginActionNotImplementedError` for the missing ones (see the sketch after this list):
  - Implement the previously mentioned helper functions [`getSupportedChainIds`, `getSupportedTokenAddresses`]
  - Implement any action functions you expect the plugin to support [`swap`, `mint`, `bridge`]
- Add your plugin by `id` in the plugin registry `questdk-plugin-registry`. The `id` should be listed using kebab-case, with any versioning appended (uniswap-v2, mirror-world, etc.)
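As noted in the list above, a plugin that doesn't implement every action can return `PluginActionNotImplementedError` for the missing ones. A minimal sketch (import and type names should be checked against `@rabbitholegg/questdk`; `Project` is a placeholder):

```ts
import {
  type IActionPlugin,
  PluginActionNotImplementedError,
} from '@rabbitholegg/questdk'
import { bridge, getSupportedChainIds, getSupportedTokenAddresses } from './Project'

export const Project: IActionPlugin = {
  pluginId: 'project', // kebab-case, with versioning appended if needed
  getSupportedChainIds,
  getSupportedTokenAddresses,
  bridge, // the action this plugin supports
  // Unsupported actions return the not-implemented error instead of a filter
  swap: async () => new PluginActionNotImplementedError(),
  mint: async () => new PluginActionNotImplementedError(),
}
```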
When implementing the action function, the parameters coming in are the criteria set through the terminal:
```ts
const {
  sourceChainId,
  destinationChainId,
  tokenAddress, // For native tokens (ETH, Matic, etc.) this is sometimes 0x0 or blank
  amount,
  recipient,
} = bridge
```
Given these, we map the supplied expected values against the actual ABI of the transaction we'll be parsing:
```ts
return compressJson({
  chainId: sourceChainId, // The chainId of the source chain
  to: CHAIN_TO_CONTRACT[sourceChainId], // The contract address of the bridge
  input: {
    $abi: ACROSS_BRIDGE_ABI, // The ABI of the bridge contract
    recipient: recipient, // The recipient of the funds
    destinationChainId: destinationChainId, // The chainId of the destination chain
    amount: amount, // The amount of tokens to send
    originToken: tokenAddress, // The token address of the token to send
  }, // The input object is where we put the ABI and the parameters
})
```
`chainId` and `to` both map directly to those params on the transaction object. Any param on the transaction object can be used in your filter (`from` is often useful). For the input object you need to supply an ABI that contains the function signature of the relevant function. This can be a modified ABI that holds the function signatures of multiple contracts; the filter will correctly pull out the one with the right signature, just watch out for poor handling of overloaded functions. The keys of the JSON object (e.g. `originToken`) need to key into the expected value based on the parameters passed in from the action (`tokenAddress`).
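For instance, a combined ABI can be assembled from fragments of several contracts (the names here are hypothetical):

```ts
// Merge fragments from two contracts so a single filter can match
// whichever function signature actually appears in the transaction input.
const COMBINED_BRIDGE_ABI = [...L1_BRIDGE_ABI, ...L2_BRIDGE_ABI]
```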
In the registry's `index.ts`, import your new plugin and add it to the `plugins` object as follows:

```ts
[Project.pluginId]: Project,
```

Replace `Project` with the name of your project.
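Put together, the registry entry looks roughly like this (the import path is a placeholder):

```ts
import { Project } from '@rabbitholegg/questdk-plugin-project'

export const plugins = {
  // ...existing plugins
  [Project.pluginId]: Project,
}
```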
Also remember to add the new package within the monorepo to the `package.json` of the registry.
If you're ready to publish, remove the `private` field from your `package.json` file.
We've made significant updates to our build process to enhance modularity, improve integration times, and set a foundation for reducing the overall package size. Although the package remains larger than our target, these changes pave the way for future optimizations through deduplication and exclusions.
- Enhanced Portability: The new structure improves the portability of modules, facilitating their use in diverse environments.
- Foundation for Slimming Down: By breaking down the build process and migrating away from unsupported tools, we've laid the groundwork for future size reductions.
- Improved Integration and Performance: Anecdotal evidence suggests a smoother and faster integration process, likely due to more efficient tree-shaking capabilities.
Our build process now consists of three discrete steps, designed to ensure both type safety and build speed:
- Type Verification: We start by verifying types and building type definitions. This step ensures the type safety of our code before it moves to transpilation.
- Transpilation: The code is then transpiled into unbundled CommonJS (CJS) and ECMAScript Module (ESM) formats using Babel. This step focuses on speed, bypassing the checks performed in the first step.
- Bundling: Finally, we bundle the transpiled code into ESM and Universal Module Definition (UMD) formats. This step also includes the minification of each version to reduce the size further.
- Third-party SDKs: If a plugin depends on a third-party SDK, like `@zoralabs/protocol-sdk`, please ensure that a matching package-name regular expression, e.g. `/@zoralabs/`, is added to the plugin's `vite.config.js` under `build.rollupOptions.external` to prevent unnecessary inclusion in the compiled output (see the sketch after this list).
- Migration Away from Rome: Recognizing that Rome is no longer maintained, we've started to migrate our tooling away from it to ensure future compatibility and support.
- Utility Script for File Management: To facilitate cleaner updates, a helper script has been added for copying files out to every directory. This approach allows for a more straightforward "delete all and replace" strategy when updating modules.
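As referenced above, a minimal sketch of the relevant `vite.config.js` section (the `/@zoralabs/` pattern is just the example from this doc):

```ts
import { defineConfig } from 'vite'

export default defineConfig({
  build: {
    rollupOptions: {
      // Imports whose package names match are left external (not bundled)
      external: [/@zoralabs/],
    },
  },
})
```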
While the new build process marks a significant improvement, some tasks remain, particularly concerning node polyfills:
- Node Polyfills with Rollup: Issues have arisen around `TextDecoder`, which is unavailable in certain environments. While we've managed to polyfill node utilities for most packages, exceptions exist for those utilizing Axios due to its reliance on `TextDecoder`.
- Web Environment Compatibility: As `TextDecoder` should be available in web environments, further investigation is needed to selectively polyfill node utilities. This consideration is crucial for web browser deployment, and webpack may offer a solution for filling these gaps (one possible shape is sketched after this list).
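One possible shape for such a selective polyfill, purely as a sketch of the idea rather than what the build currently does:

```ts
// Only reach for Node's util implementation when the global is missing,
// so web environments keep their native TextDecoder.
if (typeof globalThis.TextDecoder === 'undefined') {
  const { TextDecoder } = await import('node:util')
  globalThis.TextDecoder = TextDecoder as unknown as typeof globalThis.TextDecoder
}
```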
We are committed to addressing these challenges and will provide updates as we refine the build process further. Your feedback and contributions are welcome as we continue to improve the efficiency and effectiveness of our package.
If you'd like to build a plugin and get support for your protocol on RabbitHole, all you need to do is submit a PR with the finished plugin. Here are some useful tips to assist, and when in doubt please join our Discord or reach out by email [arthur@rabbithole.gg] for assistance building a plugin.