From b063822663043bda2b71099d1d5da7aeefd76a7c Mon Sep 17 00:00:00 2001 From: Giuliano Francescangeli Date: Fri, 13 Sep 2024 12:14:00 -0400 Subject: [PATCH 01/12] StreamingFast docs on Substreams-powered subgraphs --- website/pages/en/sps/sps_intro.mdx | 19 ++++ website/pages/en/sps/triggers.mdx | 10 ++ website/pages/en/sps/triggers_example.mdx | 107 ++++++++++++++++++++++ website/pages/en/substreams.mdx | 6 +- 4 files changed, 140 insertions(+), 2 deletions(-) create mode 100644 website/pages/en/sps/sps_intro.mdx create mode 100644 website/pages/en/sps/triggers.mdx create mode 100644 website/pages/en/sps/triggers_example.mdx diff --git a/website/pages/en/sps/sps_intro.mdx b/website/pages/en/sps/sps_intro.mdx new file mode 100644 index 000000000000..407be94e35c6 --- /dev/null +++ b/website/pages/en/sps/sps_intro.mdx @@ -0,0 +1,19 @@ + +--- +title: Introduction +--- + +By leveraging a Substreams package (`.yaml`) as a data source, your subgraph gains access to pre-extracted, indexed blockchain data, enabling more efficient and scalable data handling, especially when dealing with large or complex blockchain networks. + +This technology opens up more efficient and versatile indexing for diverse blockchain environments. For more information on how to build a Substreams-powered subgraph, [click here](./triggers.mdx). 
You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end to end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) + +**Public Substreams packages** + +A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs. + +Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs. + diff --git a/website/pages/en/sps/triggers.mdx b/website/pages/en/sps/triggers.mdx new file mode 100644 index 000000000000..07fef12cad3e --- /dev/null +++ b/website/pages/en/sps/triggers.mdx @@ -0,0 +1,10 @@ + +---- +title: Substreams Triggers +---- + +Substreams triggers allow you to integrate Substreams data directly into your subgraph. By importing the [Protobuf definitions](https://substreams.streamingfast.io/documentation/develop/creating-protobuf-schemas#protobuf-overview) emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This enables efficient and streamlined data handling within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps_intro) to scaffold your first project in the devcontainer. 
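The decode-and-handle flow described above can be sketched in plain TypeScript. This is an illustrative stand-in only: the `TransferEvent` shape and `decodeEvents` helper are hypothetical, and a real Substreams-powered subgraph uses the AssemblyScript decoders generated from your module's Protobuf schema instead of JSON.

```typescript
// Hypothetical event shape standing in for a generated Protobuf type.
interface TransferEvent {
  txnId: string;
  amount: string;
}

// Stand-in decoder: real subgraph code calls Protobuf.decode with the
// decoder generated from the module's .proto files, not JSON.parse.
function decodeEvents(bytes: Uint8Array): TransferEvent[] {
  return JSON.parse(Buffer.from(bytes).toString("utf8")) as TransferEvent[];
}

// The handler receives the module's raw output, decodes it, and derives
// one entity id per record (the `${txnId}-${i}` convention used in the
// Solana example elsewhere in these docs).
function handleTriggers(bytes: Uint8Array): string[] {
  const events = decodeEvents(bytes);
  const entityIds: string[] = [];
  for (let i = 0; i < events.length; i++) {
    entityIds.push(`${events[i].txnId}-${i}`);
  }
  return entityIds;
}

const payload = new Uint8Array(Buffer.from(JSON.stringify([{ txnId: "abc", amount: "5" }])));
console.log(handleTriggers(payload)); // [ 'abc-0' ]
```

The point of the sketch is the division of labor: the Substreams module produces the bytes, and the subgraph handler owns decoding and entity creation.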
+ +To go through a coded example of a trigger based Subgraph, [click here](./triggers_example). diff --git a/website/pages/en/sps/triggers_example.mdx b/website/pages/en/sps/triggers_example.mdx new file mode 100644 index 000000000000..f23c4bbc43d0 --- /dev/null +++ b/website/pages/en/sps/triggers_example.mdx @@ -0,0 +1,107 @@ +Here you’ll walk through a Solana based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project. + +Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to the `subgraph.yaml`, using the SPL token program Id: + +```graphql +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: #Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + + - name: map_spl_transfers + use: solana:map_block #Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: #Modify the param fields to meet your needs + #For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + + + +Now see the corresponding subgraph manifest **(`subgraph.yaml`)** using a Substreams package as the data source: + +```yaml +specVersion: 1.0.0 +description: my-project-sol Substreams-powered-Subgraph +indexerHints: + prune: auto +schema: + file: ./schema.graphql +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers + file: 
./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +Once your manifests are created, define in the `schema.graphql` the data fields you’d like saved in your subgraph entities: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! +} +``` + +The Protobuf object is generated in AssemblyScript by running `npm run protogen` after running `substreams codegen subgraph` in the devcontainer, so you can import it in the subgraph code. Then transform your decoded Substreams data within the `src/mappings.ts` file, just like in a standard subgraph: + +```tsx +import { Protobuf } from "as-proto/assembly"; +import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events"; +import { MyTransfer } from "../generated/schema"; + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode); + + for (let i=0; i<input.data.length; i++) { + const event = input.data[i]; + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}`; + const entity = new MyTransfer(entity_id); + entity.amount = (event.transfer!.instruction!.amount).toString(); + entity.source = event.transfer!.accounts!.source; + entity.designation = event.transfer!.accounts!.destination; + + if (event.transfer!.accounts!.signer!.single != null){ + entity.signers = [event.transfer!.accounts!.signer!.single.signer]; + } + else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers; + } + entity.save(); + } + } +} +``` + +Here's what you’re seeing in the `mappings.ts`: + +1. Decoding the raw Substreams bytes into a `protoEvents` object +2. Looping over the transactions +3. Create a new subgraph entity for every transaction + +> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`. +> diff --git a/website/pages/en/substreams.mdx index 710e110012cc..0dbb67853c83 100644 --- a/website/pages/en/substreams.mdx +++ b/website/pages/en/substreams.mdx @@ -4,9 +4,11 @@ title: Substreams ![Substreams Logo](/img/substreams-logo.png) -Substreams is a powerful blockchain indexing technology developed for The Graph Network. It enables developers to write Rust modules, compose data streams alongside the community, and provide extremely high-performance indexing due to parallelization in a streaming-first approach. +Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network.
It offers the following features: -With Substreams, developers can quickly extract data from different blockchains (Ethereum, BNB, Solana, ect.) and send it to various locations of their choice, such as a Postgres database, a Mongo database, or a Subgraph. Additionally, Substreams packages enable developers to specify which data they want to extract from the blockchain. +- **Accelerated Indexing**: Substreams reduces subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing. +- **Multi-Chain Support**: Substreams expand indexing capabilities beyond EVM-based chains, supporting ecosystems like Solana, Injective, Starknet, and Vara. +- **Multi-Sink Support:** Subgraph, Postgres database, Clickhouse, Mongo database ## How Substreams Works in 4 Steps From 8f923216d05094d993d0aacec4b1d4343bc2dd59 Mon Sep 17 00:00:00 2001 From: Giuliano Francescangeli Date: Fri, 13 Sep 2024 12:18:39 -0400 Subject: [PATCH 02/12] added a title --- website/pages/en/sps/triggers_example.mdx | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/website/pages/en/sps/triggers_example.mdx b/website/pages/en/sps/triggers_example.mdx index f23c4bbc43d0..e7093a3ee5db 100644 --- a/website/pages/en/sps/triggers_example.mdx +++ b/website/pages/en/sps/triggers_example.mdx @@ -1,3 +1,8 @@ + +--- +title: Example Susbtreams Trigger +--- + Here you’ll walk through a Solana based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project. 
Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to the `subgraph.yaml`, using the SPL token program Id: From 0c10608b3f92706705b2834efb7748d12e3909f7 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Enol=20=C3=81lvarez?= Date: Fri, 13 Sep 2024 18:34:47 +0200 Subject: [PATCH 03/12] Add SpS documentation to the menu --- website/pages/en/_meta.js | 1 + website/pages/en/sps/_meta.js | 5 +++++ website/pages/en/sps/{sps_intro.mdx => sps-intro.mdx} | 1 - .../en/sps/{triggers_example.mdx => triggers-example.mdx} | 1 - website/pages/en/sps/triggers.mdx | 1 - 5 files changed, 6 insertions(+), 3 deletions(-) create mode 100644 website/pages/en/sps/_meta.js rename website/pages/en/sps/{sps_intro.mdx => sps-intro.mdx} (99%) rename website/pages/en/sps/{triggers_example.mdx => triggers-example.mdx} (99%) diff --git a/website/pages/en/_meta.js b/website/pages/en/_meta.js index 38ea74ac7720..1933137f8d5c 100644 --- a/website/pages/en/_meta.js +++ b/website/pages/en/_meta.js @@ -33,6 +33,7 @@ export default { title: 'Substreams', }, substreams: '', + sps: 'Substreams-powered Subgraphs', '---4': { type: 'separator', }, diff --git a/website/pages/en/sps/_meta.js b/website/pages/en/sps/_meta.js new file mode 100644 index 000000000000..9799789a8a03 --- /dev/null +++ b/website/pages/en/sps/_meta.js @@ -0,0 +1,5 @@ +export default { + 'sps-intro': '', + 'triggers': '', + 'triggers-example': '' +} \ No newline at end of file diff --git a/website/pages/en/sps/sps_intro.mdx b/website/pages/en/sps/sps-intro.mdx similarity index 99% rename from website/pages/en/sps/sps_intro.mdx rename to website/pages/en/sps/sps-intro.mdx index 407be94e35c6..c6e0f8e06808 100644 --- a/website/pages/en/sps/sps_intro.mdx +++ b/website/pages/en/sps/sps-intro.mdx @@ -1,4 +1,3 @@ - --- title: Introduction --- diff --git a/website/pages/en/sps/triggers_example.mdx b/website/pages/en/sps/triggers-example.mdx similarity index 99% rename from 
website/pages/en/sps/triggers_example.mdx rename to website/pages/en/sps/triggers-example.mdx index e7093a3ee5db..6546ad71d7d4 100644 --- a/website/pages/en/sps/triggers_example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -1,4 +1,3 @@ - --- title: Example Susbtreams Trigger --- diff --git a/website/pages/en/sps/triggers.mdx b/website/pages/en/sps/triggers.mdx index 07fef12cad3e..c2c1bd69faa2 100644 --- a/website/pages/en/sps/triggers.mdx +++ b/website/pages/en/sps/triggers.mdx @@ -1,4 +1,3 @@ - ---- title: Substreams Triggers ---- From 4b0f462466ff252883ad7873ffdbc50a2c5d5c82 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Enol=20=C3=81lvarez?= Date: Fri, 13 Sep 2024 18:38:24 +0200 Subject: [PATCH 04/12] Fix format --- website/pages/en/sps/_meta.js | 8 +-- website/pages/en/sps/sps-intro.mdx | 7 ++- website/pages/en/sps/triggers-example.mdx | 66 +++++++++++------------ website/pages/en/sps/triggers.mdx | 4 +- website/pages/en/substreams.mdx | 2 +- 5 files changed, 42 insertions(+), 45 deletions(-) diff --git a/website/pages/en/sps/_meta.js b/website/pages/en/sps/_meta.js index 9799789a8a03..3a73aad27134 100644 --- a/website/pages/en/sps/_meta.js +++ b/website/pages/en/sps/_meta.js @@ -1,5 +1,5 @@ export default { - 'sps-intro': '', - 'triggers': '', - 'triggers-example': '' -} \ No newline at end of file + 'sps-intro': '', + triggers: '', + 'triggers-example': '', +} diff --git a/website/pages/en/sps/sps-intro.mdx b/website/pages/en/sps/sps-intro.mdx index c6e0f8e06808..31cad7a271b8 100644 --- a/website/pages/en/sps/sps-intro.mdx +++ b/website/pages/en/sps/sps-intro.mdx @@ -4,7 +4,7 @@ title: Introduction By leveraging a Substreams package (`.yaml`) as a data source, your subgraph gains access to pre-extracted, indexed blockchain data, enabling more efficient and scalable data handling, especially when dealing with large or complex blockchain networks. -This technology opens up more efficient and versatile indexing for diverse blockchain environments. 
For more information on how to build a Substreams-powered subgraph, [click here](./triggers.mdx). You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end to end project quickly: +This technology opens up more efficient and versatile indexing for diverse blockchain environments. For more information on how to build a Substreams-powered subgraph, [click here](./triggers.mdx). You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end to end project quickly: - [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) - [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) @@ -12,7 +12,6 @@ This technology opens up more efficient and versatile indexing for diverse block **Public Substreams packages** -A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs. - -Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs. +A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs. +Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. 
If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs. diff --git a/website/pages/en/sps/triggers-example.mdx b/website/pages/en/sps/triggers-example.mdx index 6546ad71d7d4..c67ed079e75f 100644 --- a/website/pages/en/sps/triggers-example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -2,7 +2,7 @@ title: Example Susbtreams Trigger --- -Here you’ll walk through a Solana based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project. +Here you’ll walk through a Solana based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project. 
Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to the `subgraph.yaml`, using the SPL token program Id: @@ -12,17 +12,17 @@ package: name: my_project_sol version: v0.1.0 -imports: #Pass your spkg of interest +imports: #Pass your spkg of interest solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg modules: - + - name: map_spl_transfers use: solana:map_block #Select corresponding modules available within your spkg initialBlock: 260000082 - + - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes + use: solana:solana:transactions_by_programid_without_votes network: solana-mainnet-beta @@ -31,8 +31,6 @@ params: #Modify the param fields to meet your needs map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE ``` - - Now see the corresponding subgraph manifest **(`subgraph.yaml`)** using a Substreams package as the data source: ```yaml @@ -57,45 +55,44 @@ dataSources: handler: handleTriggers ``` -Once your manifests are created, define in the `schema.graphql` the data fields you’d like saved in your subgraph entities: +Once your manifests are created, define in the `schema.graphql` the data fields you’d like saved in your subgraph entities: ```graphql type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! } ``` The Protobuf object is generated in AssemblyScript by running `npm run protogen` after running `substreams codegen subgraph` in the devcontainer, so you can import it in the subgraph code. 
Then transform your decoded Substreams data within the `src/mappings.ts` file, just like in a standard subgraph: ```tsx -import { Protobuf } from "as-proto/assembly"; -import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events"; -import { MyTransfer } from "../generated/schema"; +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode); - - for (let i=0; i<input.data.length; i++) { - const event = input.data[i]; - + const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}`; - const entity = new MyTransfer(entity_id); - entity.amount = (event.transfer!.instruction!.amount).toString(); - entity.source = event.transfer!.accounts!.source; - entity.designation = event.transfer!.accounts!.destination; - - if (event.transfer!.accounts!.signer!.single != null){ - entity.signers = [event.transfer!.accounts!.signer!.single.signer]; - } - else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers; + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers } - entity.save(); + entity.save() } } } @@ -107,5 +104,4 @@ Here's what you’re seeing in the `mappings.ts`: 2. Looping over the transactions 3.
Create a new subgraph entity for every transaction -> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`. -> +> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`. diff --git a/website/pages/en/sps/triggers.mdx b/website/pages/en/sps/triggers.mdx index c2c1bd69faa2..9bbba5efa5ac 100644 --- a/website/pages/en/sps/triggers.mdx +++ b/website/pages/en/sps/triggers.mdx @@ -1,6 +1,8 @@ ---- title: Substreams Triggers ----- +--- + +- Substreams triggers allow you to integrate Substreams data directly into your subgraph. By importing the [Protobuf definitions](https://substreams.streamingfast.io/documentation/develop/creating-protobuf-schemas#protobuf-overview) emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This enables efficient and streamlined data handling within the subgraph framework. diff --git a/website/pages/en/substreams.mdx b/website/pages/en/substreams.mdx index 0dbb67853c83..456c9728c364 100644 --- a/website/pages/en/substreams.mdx +++ b/website/pages/en/substreams.mdx @@ -4,7 +4,7 @@ title: Substreams ![Substreams Logo](/img/substreams-logo.png) -Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network. It offers the following features: +Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network. It offers the following features: - **Accelerated Indexing**: Substreams reduces subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing. - **Multi-Chain Support**: Substreams expand indexing capabilities beyond EVM-based chains, supporting ecosystems like Solana, Injective, Starknet, and Vara. 
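As a rough, self-contained illustration of the parallelized engine mentioned above (not how Substreams or `graph-node` is actually implemented), splitting a block range into independent segments and merging the per-segment results gives the same answer as one linear pass, which is what makes order-independent indexing work safe to parallelize:

```typescript
// Split a half-open block range [start, end) into independent segments
// that could be processed in any order, or in parallel.
function segmentRange(start: number, end: number, size: number): Array<[number, number]> {
  const segments: Array<[number, number]> = [];
  for (let lo = start; lo < end; lo += size) {
    segments.push([lo, Math.min(lo + size, end)]);
  }
  return segments;
}

// Stand-in for per-segment indexing work: count the blocks handled.
function processSegment(segment: [number, number]): number {
  return segment[1] - segment[0];
}

// Merging per-segment results matches a single linear pass over the range.
const parallelTotal = segmentRange(0, 1000, 300).map(processSegment).reduce((a, b) => a + b, 0);
console.log(parallelTotal); // 1000
```

Trigger handlers, by contrast, behave like the single linear pass: each record must be consumed in order by `graph-node`.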
From 34b2a0c05c880de544175b984f0317b485870c9a Mon Sep 17 00:00:00 2001 From: Giuliano Francescangeli Date: Sun, 15 Sep 2024 05:46:02 -0400 Subject: [PATCH 05/12] Addressing feedback --- website/pages/en/new-chain-integration.mdx | 31 +---- website/pages/en/sps/_meta.js | 2 +- website/pages/en/sps/sps-intro.mdx | 21 ++-- website/pages/en/sps/triggers-example.mdx | 133 +++++++++++++-------- website/pages/en/sps/triggers.mdx | 36 +++++- website/pages/en/substreams.mdx | 6 +- 6 files changed, 133 insertions(+), 96 deletions(-) diff --git a/website/pages/en/new-chain-integration.mdx b/website/pages/en/new-chain-integration.mdx index bc4f247011c3..dc91caa64d89 100644 --- a/website/pages/en/new-chain-integration.mdx +++ b/website/pages/en/new-chain-integration.mdx @@ -76,33 +76,4 @@ Graph Node should be syncing the deployed subgraph if there are no errors. Give ## Substreams-powered Subgraphs -For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams-powered subgraph codegen tools are included (check out [Injective](https://substreams.streamingfast.io/documentation/intro-getting-started/intro-injective/injective-first-sps) for an example). - -There are two options to consume Substreams data through a subgraph: - -- **Using Substreams triggers:** Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. -- **Using EntityChanges:** By writing more of the logic into Substreams, you can consume the module's output directly into `graph-node`. In `graph-node`, you can use the Substreams data to create your subgraph entities. - -It is really a matter of where you put your logic, in the subgraph or the Substreams. 
Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in `graph-node`. Consider the following example implementing a subgraph handler: - -```ts -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).trasanctions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -The `handleTransactions` function is a subgraph handler that receives the raw Substreams bytes as parameter and decodes them into a `Transactions` object. Then, for every transaction, a new subgraph entity is created. For more information about Substreams triggers, visit the [StreamingFast documentation](https://substreams.streamingfast.io/documentation/consume/subgraph/triggers) or check out community modules at [substreams.dev](https://substreams.dev/). +For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams codegen tools are included. These tools enable [Substreams-powered subgraphs](./sps/sps-intro). Follow the [How-To Guide](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application) and run `substreams codegen subgraph` to experience the codegen tools for yourself.
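For contrast with the trigger handler above, the alternative of emitting entity changes from the Substreams module itself can be modeled like this. The record shape here is hypothetical, not the actual `EntityChanges` Protobuf schema; it is meant only to show the division of labor, where the module emits ready-made writes and no per-record mapping logic runs in the subgraph:

```typescript
// Hypothetical entity-change record; the real EntityChanges schema that
// Substreams ships differs, this only models the idea.
interface EntityChange {
  entity: string;                 // entity type, e.g. "Transaction"
  id: string;                     // entity id, here the transaction hash
  operation: "CREATE";
  fields: Record<string, string>;
}

// Module-side logic: turn decoded transactions into ready-made entity
// writes, so the consumer only has to apply them.
function toEntityChanges(transactions: Array<{ hash: string; from: string; to: string }>): EntityChange[] {
  return transactions.map((tx) => ({
    entity: "Transaction",
    id: tx.hash,
    operation: "CREATE",
    fields: { from: tx.from, to: tx.to },
  }));
}

const changes = toEntityChanges([{ hash: "0x1", from: "alice", to: "bob" }]);
console.log(changes[0].id); // 0x1
```

Because the writes are computed inside the module, they benefit from the parallelized engine before `graph-node` ever sees them.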
diff --git a/website/pages/en/sps/_meta.js b/website/pages/en/sps/_meta.js index 3a73aad27134..0d4faf0f7f4f 100644 --- a/website/pages/en/sps/_meta.js +++ b/website/pages/en/sps/_meta.js @@ -1,5 +1,5 @@ export default { - 'sps-intro': '', + 'sps-intro': 'Introduction', triggers: '', 'triggers-example': '', } diff --git a/website/pages/en/sps/sps-intro.mdx b/website/pages/en/sps/sps-intro.mdx index 31cad7a271b8..369e6ca512ea 100644 --- a/website/pages/en/sps/sps-intro.mdx +++ b/website/pages/en/sps/sps-intro.mdx @@ -1,17 +1,20 @@ --- -title: Introduction +title: Introduction to Substreams-powered Subgraphs --- -By leveraging a Substreams package (`.yaml`) as a data source, your subgraph gains access to pre-extracted, indexed blockchain data, enabling more efficient and scalable data handling, especially when dealing with large or complex blockchain networks. +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. -This technology opens up more efficient and versatile indexing for diverse blockchain environments. For more information on how to build a Substreams-powered subgraph, [click here](./triggers.mdx). 
You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end to end project quickly: +There are two methods of enabling this technology: -- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) -- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) -- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. -**Public Substreams packages** +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. -A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs. -Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs. 
+Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/en/sps/triggers-example.mdx b/website/pages/en/sps/triggers-example.mdx index c67ed079e75f..2b198b499a0d 100644 --- a/website/pages/en/sps/triggers-example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -1,52 +1,70 @@ --- -title: Example Susbtreams Trigger +title: Tutorial: Set Up a Substreams-Powered Subgraph on Solana --- -Here you’ll walk through a Solana based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project. +## Prerequisites -Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to the `subgraph.yaml`, using the SPL token program Id: +Before starting, make sure to: -```graphql +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml specVersion: v0.1.0 package: name: my_project_sol version: v0.1.0 -imports: #Pass your spkg of interest +imports: # Pass your spkg of interest solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg modules: - + - name: map_spl_transfers - use: solana:map_block #Select corresponding modules available within your spkg + use: solana:map_block # Select corresponding modules available within your spkg initialBlock: 260000082 - + - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes + use: solana:solana:transactions_by_programid_without_votes network: solana-mainnet-beta -params: #Modify the param fields to meet your needs - #For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA +params: #M odify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE ``` -Now see the corresponding subgraph manifest **(`subgraph.yaml`)** using a Substreams package as the data source: +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: ```yaml -specVersion: 1.0.0 -description: my-project-sol Substreams-powered-Subgraph -indexerHints: - prune: auto -schema: - file: ./schema.graphql +... 
+ dataSources: - kind: substreams name: my_project_sol network: solana-mainnet-beta source: package: - moduleName: map_spl_transfers + moduleName: map_spl_transfers # Module defined in the substreams.yaml file: ./my-project-sol-v0.1.0.spkg mapping: apiVersion: 0.0.7 @@ -55,53 +73,68 @@ dataSources: handler: handleTriggers ``` -Once your manifests are created, define in the `schema.graphql` the data fields you’d like saved in your subgraph entities: +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: ```graphql type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! } ``` -The Protobuf object is generated in AssemblyScript by running `npm run protogen` after running `substreams codegen subgraph` in the devcontainer, so you can import it in the subgraph code. Then transform your decoded Substreams data within the `src/mappings.ts` file, just like in a standard subgraph: +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. -```tsx -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' +## Step 4: Generate Protobuf Files -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. 
+
+## Step 5: Handle Substreams Data in `mappings.ts`
 
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```bash
+import { Protobuf } from "as-proto/assembly";
+import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events";
+import { MyTransfer } from "../generated/schema";
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode);
+
+  for (let i=0; i<input.data.length; i++) {
+    const event = input.data[i];
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`;
+      const entity = new MyTransfer(entity_id);
+      entity.amount = (event.transfer!.instruction!.amount).toString();
+      entity.source = event.transfer!.accounts!.source;
+      entity.designation = event.transfer!.accounts!.destination;
+
+      if (event.transfer!.accounts!.signer!.single != null){
+        entity.signers = [event.transfer!.accounts!.signer!.single.signer];
+      }
+      else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers;
+      }
+      entity.save();
+    }
+  }
+}
+```
+
+> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`.
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/en/sps/triggers.mdx b/website/pages/en/sps/triggers.mdx
index 9bbba5efa5ac..281dd0e39bc2 100644
--- a/website/pages/en/sps/triggers.mdx
+++ b/website/pages/en/sps/triggers.mdx
@@ -1,11 +1,37 @@
-----
+---
 title: Substreams Triggers
-----
+---
-
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps-intro) to scaffold your first project in the Development Container.
+
+The following example demonstrates how to implement a `handleTransactions` function in a subgraph handler.
This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
 
-Substreams triggers allow you to integrate Substreams data directly into your subgraph. By importing the [Protobuf definitions](https://substreams.streamingfast.io/documentation/develop/creating-protobuf-schemas#protobuf-overview) emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This enables efficient and streamlined data handling within the subgraph framework.
+Here's what you’re seeing in the `mappings.ts` file:
 
-> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps_intro) to scaffold your first project in the devcontainer.
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
+2. Loop over the transactions
+3. Create a new subgraph entity for every transaction
 
-To go through a coded example of a trigger based Subgraph, [click here](./triggers_example).
+To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).
diff --git a/website/pages/en/substreams.mdx b/website/pages/en/substreams.mdx index 456c9728c364..a838a6924e2f 100644 --- a/website/pages/en/substreams.mdx +++ b/website/pages/en/substreams.mdx @@ -6,7 +6,7 @@ title: Substreams Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network. It offers the following features: -- **Accelerated Indexing**: Substreams reduces subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing. +- **Accelerated Indexing**: Substreams reduce subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing. - **Multi-Chain Support**: Substreams expand indexing capabilities beyond EVM-based chains, supporting ecosystems like Solana, Injective, Starknet, and Vara. - **Multi-Sink Support:** Subgraph, Postgres database, Clickhouse, Mongo database @@ -46,3 +46,7 @@ To learn about the latest version of Substreams CLI, which enables developers to ### Expand Your Knowledge - Take a look at the [Ethereum Explorer Tutorial](https://substreams.streamingfast.io/tutorials/evm) to learn about the basic transformations you can create with Substreams. + +### Substreams Registry + +A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain, similar to the `mapping.ts` file in traditional subgraphs. Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks. 
From 04ef801dc60354a51de9c9cbb486775eec1d82da Mon Sep 17 00:00:00 2001 From: Giuliano Francescangeli Date: Sun, 15 Sep 2024 05:50:28 -0400 Subject: [PATCH 06/12] editing the _meta.js --- website/pages/en/sps/_meta.js | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/pages/en/sps/_meta.js b/website/pages/en/sps/_meta.js index 0d4faf0f7f4f..87dcc88ad5b1 100644 --- a/website/pages/en/sps/_meta.js +++ b/website/pages/en/sps/_meta.js @@ -1,5 +1,5 @@ export default { 'sps-intro': 'Introduction', triggers: '', - 'triggers-example': '', + 'triggers-example': 'Tutorial', } From 3be01428256884a90d6c667da765cfa1c1c77c57 Mon Sep 17 00:00:00 2001 From: Giuliano Francescangeli Date: Sun, 15 Sep 2024 06:00:08 -0400 Subject: [PATCH 07/12] fix --- website/pages/en/sps/triggers-example.mdx | 6 +++--- website/pages/en/sps/triggers.mdx | 2 +- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/website/pages/en/sps/triggers-example.mdx b/website/pages/en/sps/triggers-example.mdx index 2b198b499a0d..afe6c1dfaeac 100644 --- a/website/pages/en/sps/triggers-example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -40,8 +40,8 @@ modules: network: solana-mainnet-beta -params: #M odify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA +params: # Modify the param fields to meet your needs +# For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE ``` @@ -103,7 +103,7 @@ This command converts the Protobuf definitions into AssemblyScript, allowing you With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. 
The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
 
-```bash
+```ts
 import { Protobuf } from "as-proto/assembly";
 import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events";
 import { MyTransfer } from "../generated/schema";
diff --git a/website/pages/en/sps/triggers.mdx b/website/pages/en/sps/triggers.mdx
index 281dd0e39bc2..fbd97809f4db 100644
--- a/website/pages/en/sps/triggers.mdx
+++ b/website/pages/en/sps/triggers.mdx
@@ -6,7 +6,7 @@ Custom triggers allow you to send data directly into your subgraph mappings file
 
 > Note: If you haven’t already, visit one of the How-To Guides found [here](./sps-intro) to scaffold your first project in the Development Container.
 
-The following example demonstrates how to implement a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
```tsx export function handleTransactions(bytes: Uint8Array): void { From 99da12b6394202a6ed832e9e2d90d0ba957cf3e2 Mon Sep 17 00:00:00 2001 From: Giuliano Francescangeli Date: Sun, 15 Sep 2024 11:58:07 -0400 Subject: [PATCH 08/12] trying to fix prettier --- website/pages/en/sps/sps-intro.mdx | 1 - website/pages/en/sps/triggers-example.mdx | 78 +++++++++++------------ 2 files changed, 38 insertions(+), 41 deletions(-) diff --git a/website/pages/en/sps/sps-intro.mdx b/website/pages/en/sps/sps-intro.mdx index 369e6ca512ea..3e50521589af 100644 --- a/website/pages/en/sps/sps-intro.mdx +++ b/website/pages/en/sps/sps-intro.mdx @@ -12,7 +12,6 @@ Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. - Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: - [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) diff --git a/website/pages/en/sps/triggers-example.mdx b/website/pages/en/sps/triggers-example.mdx index afe6c1dfaeac..abe7c311c573 100644 --- a/website/pages/en/sps/triggers-example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -12,11 +12,11 @@ Before starting, make sure to: ## Step 1: Initialize Your Project 1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - + + ```bash + substreams init + ``` + 2. Select the "minimal" project option. 3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: @@ -26,22 +26,21 @@ package: name: my_project_sol version: v0.1.0 -imports: # Pass your spkg of interest +imports: # Pass your spkg of interest solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg modules: - - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes network: solana-mainnet-beta params: # Modify the param fields to meet your needs -# For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE ``` @@ -56,8 +55,8 @@ substreams codegen subgraph You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: ```yaml -... +--- dataSources: - kind: substreams name: my_project_sol @@ -79,11 +78,11 @@ Define the fields you want to save in your subgraph entities by updating the `sc ```graphql type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! } ``` @@ -104,30 +103,29 @@ This command converts the Protobuf definitions into AssemblyScript, allowing you With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. 
The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
 
 ```ts
-import { Protobuf } from "as-proto/assembly";
-import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events";
-import { MyTransfer } from "../generated/schema";
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
 
 export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode);
-
-  for (let i=0; i<input.data.length; i++) {
-    const event = input.data[i];
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
     if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`;
-      const entity = new MyTransfer(entity_id);
-      entity.amount = (event.transfer!.instruction!.amount).toString();
-      entity.source = event.transfer!.accounts!.source;
-      entity.designation = event.transfer!.accounts!.destination;
-
-      if (event.transfer!.accounts!.signer!.single != null){
-        entity.signers = [event.transfer!.accounts!.signer!.single.signer];
-      }
-      else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers;
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
       }
-      entity.save();
+      entity.save()
     }
   }
 }

From 12527333fc8d814f42779ea5d102affd8b6ef4f3 Mon Sep 17 00:00:00 2001
From: Giuliano Francescangeli
Date: 
Sun, 15 Sep 2024 12:05:08 -0400 Subject: [PATCH 09/12] fixing prettier --- website/pages/en/sps/triggers-example.mdx | 1 - 1 file changed, 1 deletion(-) diff --git a/website/pages/en/sps/triggers-example.mdx b/website/pages/en/sps/triggers-example.mdx index abe7c311c573..d0edf1e74d58 100644 --- a/website/pages/en/sps/triggers-example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -55,7 +55,6 @@ substreams codegen subgraph You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: ```yaml - --- dataSources: - kind: substreams From a40c3aa0461f6b5c72fdeaecb6abb997e1f3979f Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Enol=20=C3=81lvarez?= Date: Mon, 16 Sep 2024 01:53:03 +0200 Subject: [PATCH 10/12] Fix title --- website/pages/en/sps/triggers-example.mdx | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/pages/en/sps/triggers-example.mdx b/website/pages/en/sps/triggers-example.mdx index d0edf1e74d58..d8d61566295e 100644 --- a/website/pages/en/sps/triggers-example.mdx +++ b/website/pages/en/sps/triggers-example.mdx @@ -1,5 +1,5 @@ --- -title: Tutorial: Set Up a Substreams-Powered Subgraph on Solana +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' --- ## Prerequisites From 3817c7a977aad7378a237331da860c8fa1681bda Mon Sep 17 00:00:00 2001 From: benface Date: Mon, 16 Sep 2024 10:48:32 -0400 Subject: [PATCH 11/12] `sps-intro` => `introduction` --- website/pages/en/new-chain-integration.mdx | 2 +- website/pages/en/sps/_meta.js | 2 +- website/pages/en/sps/{sps-intro.mdx => introduction.mdx} | 0 website/pages/en/sps/triggers.mdx | 2 +- 4 files changed, 3 insertions(+), 3 deletions(-) rename website/pages/en/sps/{sps-intro.mdx => introduction.mdx} (100%) diff --git a/website/pages/en/new-chain-integration.mdx b/website/pages/en/new-chain-integration.mdx index dc91caa64d89..a8a3c88c7250 100644 --- a/website/pages/en/new-chain-integration.mdx +++ b/website/pages/en/new-chain-integration.mdx @@ 
-76,4 +76,4 @@ Graph Node should be syncing the deployed subgraph if there are no errors. Give
 
 ## Substreams-powered Subgraphs
 
-For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams codegen tools are included. These tools enable the ability to enable [Substreams-powered subgraphs](./sps/sps-intro). Follow the [How-To Guide](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application) and run `substreams codegen subgraph` to experience the codegen tools for yourself.
+For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams codegen tools are included. These tools enable [Substreams-powered subgraphs](/sps/introduction). Follow the [How-To Guide](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application) and run `substreams codegen subgraph` to experience the codegen tools for yourself.
diff --git a/website/pages/en/sps/_meta.js b/website/pages/en/sps/_meta.js index 87dcc88ad5b1..a8b84287610e 100644 --- a/website/pages/en/sps/_meta.js +++ b/website/pages/en/sps/_meta.js @@ -1,5 +1,5 @@ export default { - 'sps-intro': 'Introduction', + introduction: 'Introduction', triggers: '', 'triggers-example': 'Tutorial', } diff --git a/website/pages/en/sps/sps-intro.mdx b/website/pages/en/sps/introduction.mdx similarity index 100% rename from website/pages/en/sps/sps-intro.mdx rename to website/pages/en/sps/introduction.mdx diff --git a/website/pages/en/sps/triggers.mdx b/website/pages/en/sps/triggers.mdx index fbd97809f4db..ed19635d4768 100644 --- a/website/pages/en/sps/triggers.mdx +++ b/website/pages/en/sps/triggers.mdx @@ -4,7 +4,7 @@ title: Substreams Triggers Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. -> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps-intro) to scaffold your first project in the Development Container. +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. 
From 6b97445d1333d81b4fb0a2ccc2df6c9d9eb6a663 Mon Sep 17 00:00:00 2001 From: benface Date: Mon, 16 Sep 2024 10:57:16 -0400 Subject: [PATCH 12/12] Duplicate new pages in every language --- website/pages/ar/sps/_meta.js | 5 + website/pages/ar/sps/introduction.mdx | 19 +++ website/pages/ar/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ar/sps/triggers.mdx | 37 ++++++ website/pages/cs/sps/_meta.js | 5 + website/pages/cs/sps/introduction.mdx | 19 +++ website/pages/cs/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/cs/sps/triggers.mdx | 37 ++++++ website/pages/de/sps/_meta.js | 5 + website/pages/de/sps/introduction.mdx | 19 +++ website/pages/de/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/de/sps/triggers.mdx | 37 ++++++ website/pages/es/sps/_meta.js | 5 + website/pages/es/sps/introduction.mdx | 19 +++ website/pages/es/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/es/sps/triggers.mdx | 37 ++++++ website/pages/fr/sps/_meta.js | 5 + website/pages/fr/sps/introduction.mdx | 19 +++ website/pages/fr/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/fr/sps/triggers.mdx | 37 ++++++ website/pages/ha/sps/_meta.js | 5 + website/pages/ha/sps/introduction.mdx | 19 +++ website/pages/ha/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ha/sps/triggers.mdx | 37 ++++++ website/pages/hi/sps/_meta.js | 5 + website/pages/hi/sps/introduction.mdx | 19 +++ website/pages/hi/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/hi/sps/triggers.mdx | 37 ++++++ website/pages/it/sps/_meta.js | 5 + website/pages/it/sps/introduction.mdx | 19 +++ website/pages/it/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/it/sps/triggers.mdx | 37 ++++++ website/pages/ja/sps/_meta.js | 5 + website/pages/ja/sps/introduction.mdx | 19 +++ website/pages/ja/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ja/sps/triggers.mdx | 37 ++++++ 
website/pages/ko/sps/_meta.js | 5 + website/pages/ko/sps/introduction.mdx | 19 +++ website/pages/ko/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ko/sps/triggers.mdx | 37 ++++++ website/pages/mr/sps/_meta.js | 5 + website/pages/mr/sps/introduction.mdx | 19 +++ website/pages/mr/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/mr/sps/triggers.mdx | 37 ++++++ website/pages/nl/sps/_meta.js | 5 + website/pages/nl/sps/introduction.mdx | 19 +++ website/pages/nl/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/nl/sps/triggers.mdx | 37 ++++++ website/pages/pl/sps/_meta.js | 5 + website/pages/pl/sps/introduction.mdx | 19 +++ website/pages/pl/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/pl/sps/triggers.mdx | 37 ++++++ website/pages/pt/sps/_meta.js | 5 + website/pages/pt/sps/introduction.mdx | 19 +++ website/pages/pt/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/pt/sps/triggers.mdx | 37 ++++++ website/pages/ro/sps/_meta.js | 5 + website/pages/ro/sps/introduction.mdx | 19 +++ website/pages/ro/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ro/sps/triggers.mdx | 37 ++++++ website/pages/ru/sps/_meta.js | 5 + website/pages/ru/sps/introduction.mdx | 19 +++ website/pages/ru/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ru/sps/triggers.mdx | 37 ++++++ website/pages/sv/sps/_meta.js | 5 + website/pages/sv/sps/introduction.mdx | 19 +++ website/pages/sv/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/sv/sps/triggers.mdx | 37 ++++++ website/pages/tr/sps/_meta.js | 5 + website/pages/tr/sps/introduction.mdx | 19 +++ website/pages/tr/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/tr/sps/triggers.mdx | 37 ++++++ website/pages/uk/sps/_meta.js | 5 + website/pages/uk/sps/introduction.mdx | 19 +++ website/pages/uk/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/uk/sps/triggers.mdx | 37 ++++++ 
website/pages/ur/sps/_meta.js | 5 + website/pages/ur/sps/introduction.mdx | 19 +++ website/pages/ur/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/ur/sps/triggers.mdx | 37 ++++++ website/pages/vi/sps/_meta.js | 5 + website/pages/vi/sps/introduction.mdx | 19 +++ website/pages/vi/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/vi/sps/triggers.mdx | 37 ++++++ website/pages/yo/sps/_meta.js | 5 + website/pages/yo/sps/introduction.mdx | 19 +++ website/pages/yo/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/yo/sps/triggers.mdx | 37 ++++++ website/pages/zh/sps/_meta.js | 5 + website/pages/zh/sps/introduction.mdx | 19 +++ website/pages/zh/sps/triggers-example.mdx | 137 ++++++++++++++++++++++ website/pages/zh/sps/triggers.mdx | 37 ++++++ website/route-lockfile.txt | 103 ++++++++++++++-- 93 files changed, 4650 insertions(+), 7 deletions(-) create mode 100644 website/pages/ar/sps/_meta.js create mode 100644 website/pages/ar/sps/introduction.mdx create mode 100644 website/pages/ar/sps/triggers-example.mdx create mode 100644 website/pages/ar/sps/triggers.mdx create mode 100644 website/pages/cs/sps/_meta.js create mode 100644 website/pages/cs/sps/introduction.mdx create mode 100644 website/pages/cs/sps/triggers-example.mdx create mode 100644 website/pages/cs/sps/triggers.mdx create mode 100644 website/pages/de/sps/_meta.js create mode 100644 website/pages/de/sps/introduction.mdx create mode 100644 website/pages/de/sps/triggers-example.mdx create mode 100644 website/pages/de/sps/triggers.mdx create mode 100644 website/pages/es/sps/_meta.js create mode 100644 website/pages/es/sps/introduction.mdx create mode 100644 website/pages/es/sps/triggers-example.mdx create mode 100644 website/pages/es/sps/triggers.mdx create mode 100644 website/pages/fr/sps/_meta.js create mode 100644 website/pages/fr/sps/introduction.mdx create mode 100644 website/pages/fr/sps/triggers-example.mdx create mode 100644 
website/pages/fr/sps/triggers.mdx create mode 100644 website/pages/ha/sps/_meta.js create mode 100644 website/pages/ha/sps/introduction.mdx create mode 100644 website/pages/ha/sps/triggers-example.mdx create mode 100644 website/pages/ha/sps/triggers.mdx create mode 100644 website/pages/hi/sps/_meta.js create mode 100644 website/pages/hi/sps/introduction.mdx create mode 100644 website/pages/hi/sps/triggers-example.mdx create mode 100644 website/pages/hi/sps/triggers.mdx create mode 100644 website/pages/it/sps/_meta.js create mode 100644 website/pages/it/sps/introduction.mdx create mode 100644 website/pages/it/sps/triggers-example.mdx create mode 100644 website/pages/it/sps/triggers.mdx create mode 100644 website/pages/ja/sps/_meta.js create mode 100644 website/pages/ja/sps/introduction.mdx create mode 100644 website/pages/ja/sps/triggers-example.mdx create mode 100644 website/pages/ja/sps/triggers.mdx create mode 100644 website/pages/ko/sps/_meta.js create mode 100644 website/pages/ko/sps/introduction.mdx create mode 100644 website/pages/ko/sps/triggers-example.mdx create mode 100644 website/pages/ko/sps/triggers.mdx create mode 100644 website/pages/mr/sps/_meta.js create mode 100644 website/pages/mr/sps/introduction.mdx create mode 100644 website/pages/mr/sps/triggers-example.mdx create mode 100644 website/pages/mr/sps/triggers.mdx create mode 100644 website/pages/nl/sps/_meta.js create mode 100644 website/pages/nl/sps/introduction.mdx create mode 100644 website/pages/nl/sps/triggers-example.mdx create mode 100644 website/pages/nl/sps/triggers.mdx create mode 100644 website/pages/pl/sps/_meta.js create mode 100644 website/pages/pl/sps/introduction.mdx create mode 100644 website/pages/pl/sps/triggers-example.mdx create mode 100644 website/pages/pl/sps/triggers.mdx create mode 100644 website/pages/pt/sps/_meta.js create mode 100644 website/pages/pt/sps/introduction.mdx create mode 100644 website/pages/pt/sps/triggers-example.mdx create mode 100644 
website/pages/pt/sps/triggers.mdx create mode 100644 website/pages/ro/sps/_meta.js create mode 100644 website/pages/ro/sps/introduction.mdx create mode 100644 website/pages/ro/sps/triggers-example.mdx create mode 100644 website/pages/ro/sps/triggers.mdx create mode 100644 website/pages/ru/sps/_meta.js create mode 100644 website/pages/ru/sps/introduction.mdx create mode 100644 website/pages/ru/sps/triggers-example.mdx create mode 100644 website/pages/ru/sps/triggers.mdx create mode 100644 website/pages/sv/sps/_meta.js create mode 100644 website/pages/sv/sps/introduction.mdx create mode 100644 website/pages/sv/sps/triggers-example.mdx create mode 100644 website/pages/sv/sps/triggers.mdx create mode 100644 website/pages/tr/sps/_meta.js create mode 100644 website/pages/tr/sps/introduction.mdx create mode 100644 website/pages/tr/sps/triggers-example.mdx create mode 100644 website/pages/tr/sps/triggers.mdx create mode 100644 website/pages/uk/sps/_meta.js create mode 100644 website/pages/uk/sps/introduction.mdx create mode 100644 website/pages/uk/sps/triggers-example.mdx create mode 100644 website/pages/uk/sps/triggers.mdx create mode 100644 website/pages/ur/sps/_meta.js create mode 100644 website/pages/ur/sps/introduction.mdx create mode 100644 website/pages/ur/sps/triggers-example.mdx create mode 100644 website/pages/ur/sps/triggers.mdx create mode 100644 website/pages/vi/sps/_meta.js create mode 100644 website/pages/vi/sps/introduction.mdx create mode 100644 website/pages/vi/sps/triggers-example.mdx create mode 100644 website/pages/vi/sps/triggers.mdx create mode 100644 website/pages/yo/sps/_meta.js create mode 100644 website/pages/yo/sps/introduction.mdx create mode 100644 website/pages/yo/sps/triggers-example.mdx create mode 100644 website/pages/yo/sps/triggers.mdx create mode 100644 website/pages/zh/sps/_meta.js create mode 100644 website/pages/zh/sps/introduction.mdx create mode 100644 website/pages/zh/sps/triggers-example.mdx create mode 100644 
website/pages/zh/sps/triggers.mdx diff --git a/website/pages/ar/sps/_meta.js b/website/pages/ar/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/ar/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/ar/sps/introduction.mdx b/website/pages/ar/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/ar/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ar/sps/triggers-example.mdx b/website/pages/ar/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ar/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
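The handler above keys each entity by the transaction ID plus the event's index in the payload. As a rough sketch (the helper name and sample transaction ID below are illustrative, not part of the generated code), the scheme guarantees that two transfers inside the same transaction never collide:

```typescript
// Hypothetical helper (not part of the tutorial's generated code): it
// reproduces the `${event.txnId}-${i}` entity ID scheme from the handler,
// so multiple transfers within one transaction get distinct IDs.
function makeEntityId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// Two transfers in the same transaction map to different entity IDs.
const first = makeEntityId('5UfDu', 0) // '5UfDu-0'
const second = makeEntityId('5UfDu', 1) // '5UfDu-1'
```

Any scheme works as long as it is deterministic; re-running the same blocks must always produce the same IDs.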
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/ar/sps/triggers.mdx b/website/pages/ar/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/ar/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/cs/sps/_meta.js b/website/pages/cs/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/cs/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/cs/sps/introduction.mdx b/website/pages/cs/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/cs/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
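Since triggers are consumed linearly, each handler invocation receives one decoded module output and turns its items into entities. The shape of that flow can be sketched in plain TypeScript; the types below are placeholders standing in for the Protobuf bindings that codegen produces, and graph-node's real entity API is not used:

```typescript
// Placeholder types: stand-ins for generated Protobuf bindings, not the
// actual AssemblyScript classes produced by `substreams codegen`.
interface Tx { hash: string; from: string; to: string }
interface TxBatch { transactions: Tx[] }

// A trigger-style handler: one decoded module output in, one entity saved
// per item. Real handlers construct `new Transaction(...)` and `.save()`.
function handleTxBatch(decoded: TxBatch): string[] {
  const savedEntityIds: string[] = []
  for (const tx of decoded.transactions) {
    savedEntityIds.push(tx.hash) // the transaction hash doubles as entity ID
  }
  return savedEntityIds
}
```

Because this loop runs inside graph-node, heavy filtering or transformation is better pushed upstream into the Substreams modules, where it can be parallelized.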
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/cs/sps/triggers-example.mdx b/website/pages/cs/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/cs/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
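One detail worth noting in the handler above is how the single-signer and multisig cases are both normalized into the `signers` string array. That branching can be sketched in isolation; the interfaces below are simplified placeholders, not the actual generated Protobuf classes:

```typescript
// Simplified stand-ins for the generated Protobuf signer types; they only
// mirror the branching logic in the handler, not the real class shapes.
interface SingleSigner { signer: string }
interface MultiSigner { signers: string[] }
interface SignerInfo { single: SingleSigner | null; multisig: MultiSigner | null }

// Normalize either signer shape into the string array stored on the entity.
function extractSigners(info: SignerInfo): string[] {
  if (info.single != null) {
    return [info.single.signer] // single signer becomes a one-element array
  } else if (info.multisig != null) {
    return info.multisig.signers // multisig already carries the full list
  }
  return [] // defensive default; the schema requires a non-null array
}
```

Collapsing both cases into one array field keeps the GraphQL schema simple: queries filter on `signers` the same way regardless of how the transfer was signed.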
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/cs/sps/triggers.mdx b/website/pages/cs/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/cs/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/de/sps/_meta.js b/website/pages/de/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/de/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/de/sps/introduction.mdx b/website/pages/de/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/de/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/de/sps/triggers-example.mdx b/website/pages/de/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/de/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/de/sps/triggers.mdx b/website/pages/de/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/de/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/es/sps/_meta.js b/website/pages/es/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/es/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/es/sps/introduction.mdx b/website/pages/es/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/es/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/es/sps/triggers-example.mdx b/website/pages/es/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/es/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/es/sps/triggers.mdx b/website/pages/es/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/es/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/fr/sps/_meta.js b/website/pages/fr/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/fr/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/fr/sps/introduction.mdx b/website/pages/fr/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/fr/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
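+For the Entity Changes approach, a Substreams package conventionally exposes a `graph_out` map module whose output type is the `EntityChanges` Protobuf, which graph-node consumes directly. A minimal module definition might look like the sketch below (the `map_events` input module name is illustrative, not part of this guide):
+
+```yaml
+modules:
+  - name: graph_out
+    kind: map
+    inputs:
+      - map: map_events
+    output:
+      type: proto:sf.substreams.sink.entity.v1.EntityChanges
+```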
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/fr/sps/triggers-example.mdx b/website/pages/fr/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/fr/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with the fields `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  // Decode the raw Substreams bytes into the generated Events object
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      // Derive a unique entity ID from the transaction ID and the event index
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      // A transfer is authorized either by a single signer or by a multisig
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/fr/sps/triggers.mdx b/website/pages/fr/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/fr/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/ha/sps/_meta.js b/website/pages/ha/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/ha/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/ha/sps/introduction.mdx b/website/pages/ha/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/ha/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ha/sps/triggers-example.mdx b/website/pages/ha/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ha/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with the fields `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  // Decode the raw Substreams bytes into the generated Events object
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      // Derive a unique entity ID from the transaction ID and the event index
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      // A transfer is authorized either by a single signer or by a multisig
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/ha/sps/triggers.mdx b/website/pages/ha/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/ha/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/hi/sps/_meta.js b/website/pages/hi/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/hi/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/hi/sps/introduction.mdx b/website/pages/hi/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/hi/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/hi/sps/triggers-example.mdx b/website/pages/hi/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/hi/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with the fields `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  // Decode the raw Substreams bytes into the generated Events object
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      // Derive a unique entity ID from the transaction ID and the event index
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      // A transfer is authorized either by a single signer or by a multisig
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/hi/sps/triggers.mdx b/website/pages/hi/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/hi/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/it/sps/_meta.js b/website/pages/it/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/it/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/it/sps/introduction.mdx b/website/pages/it/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/it/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/it/sps/triggers-example.mdx b/website/pages/it/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/it/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with the fields `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  // Decode the raw Substreams bytes into the generated Events object
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      // Derive a unique entity ID from the transaction ID and the event index
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      // A transfer is authorized either by a single signer or by a multisig
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/it/sps/triggers.mdx b/website/pages/it/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/it/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/ja/sps/_meta.js b/website/pages/ja/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/ja/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/ja/sps/introduction.mdx b/website/pages/ja/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/ja/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ja/sps/triggers-example.mdx b/website/pages/ja/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ja/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can handle the decoded Substreams data in the `mappings.ts` file in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}` // Transaction ID plus index keeps each entity ID unique
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
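+
+Once deployed, you can verify the indexed data by querying the subgraph’s GraphQL endpoint. The query below is a sketch based on the example `MyTransfer` entity above; graph-node exposes a pluralized collection field (`myTransfers`) for each entity type:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```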
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/ja/sps/triggers.mdx b/website/pages/ja/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/ja/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/ko/sps/_meta.js b/website/pages/ko/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/ko/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/ko/sps/introduction.mdx b/website/pages/ko/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/ko/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ko/sps/triggers-example.mdx b/website/pages/ko/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ko/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can handle the decoded Substreams data in the `mappings.ts` file in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}` // Transaction ID plus index keeps each entity ID unique
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
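+
+Once deployed, you can verify the indexed data by querying the subgraph’s GraphQL endpoint. The query below is a sketch based on the example `MyTransfer` entity above; graph-node exposes a pluralized collection field (`myTransfers`) for each entity type:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```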
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/ko/sps/triggers.mdx b/website/pages/ko/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/ko/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/mr/sps/_meta.js b/website/pages/mr/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/mr/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/mr/sps/introduction.mdx b/website/pages/mr/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/mr/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/mr/sps/triggers-example.mdx b/website/pages/mr/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/mr/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can handle the decoded Substreams data in the `mappings.ts` file in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}` // Transaction ID plus index keeps each entity ID unique
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
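+
+Once deployed, you can verify the indexed data by querying the subgraph’s GraphQL endpoint. The query below is a sketch based on the example `MyTransfer` entity above; graph-node exposes a pluralized collection field (`myTransfers`) for each entity type:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```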
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/mr/sps/triggers.mdx b/website/pages/mr/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/mr/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/nl/sps/_meta.js b/website/pages/nl/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/nl/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/nl/sps/introduction.mdx b/website/pages/nl/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/nl/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/nl/sps/triggers-example.mdx b/website/pages/nl/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/nl/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest that imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
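One detail from Step 5 is worth spelling out: a single Solana transaction can contain several transfers, which is why the handler builds entity IDs as `${event.txnId}-${i}`. Here is a minimal sketch of that scheme in plain TypeScript (the `buildTransferId` helper is hypothetical, shown for illustration only, and is not part of the generated code):

```typescript
// Hypothetical helper illustrating the composite entity-ID scheme used in
// handleTriggers: one transaction (txnId) can carry several transfers, so the
// loop index is appended to keep each entity ID unique.
function buildTransferId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// Two transfers in the same transaction still get distinct IDs:
const first = buildTransferId('5xAbc', 0) // '5xAbc-0'
const second = buildTransferId('5xAbc', 1) // '5xAbc-1'
console.log(first, second)
```

Without the index suffix, a second transfer in the same transaction would be saved under the same ID and overwrite the first entity.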
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/nl/sps/triggers.mdx b/website/pages/nl/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/nl/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction
+
+To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).
diff --git a/website/pages/pl/sps/_meta.js b/website/pages/pl/sps/_meta.js
new file mode 100644
index 000000000000..4ebd7d55a84f
--- /dev/null
+++ b/website/pages/pl/sps/_meta.js
@@ -0,0 +1,5 @@
+import meta from '../../en/sps/_meta.js'
+
+export default {
+  ...meta,
+}
diff --git a/website/pages/pl/sps/introduction.mdx b/website/pages/pl/sps/introduction.mdx
new file mode 100644
index 000000000000..3e50521589af
--- /dev/null
+++ b/website/pages/pl/sps/introduction.mdx
@@ -0,0 +1,19 @@
+---
+title: Introduction to Substreams-powered Subgraphs
+---
+
+By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
+
+There are two methods of enabling this technology:
+
+Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler, moving all of your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
+
+Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities.
+
+It is really a matter of where you put your logic: in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node.
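A back-of-the-envelope sketch of that tradeoff, with entirely made-up numbers (these are illustrative assumptions, not benchmarks of graph-node or Substreams):

```typescript
// Toy model of the parallel-vs-linear tradeoff described above.
// Assumes work shards evenly across workers with no coordination overhead.
function backfillMs(blocks: number, msPerBlock: number, workers: number): number {
  return (blocks * msPerBlock) / workers
}

const blocks = 1_000_000
const msPerBlock = 2 // hypothetical per-block processing cost

// Triggers are consumed linearly in graph-node: effectively one consumer.
const linearMs = backfillMs(blocks, msPerBlock, 1)
// Logic pushed into Substreams can backfill in parallel, e.g. 10 workers.
const parallelMs = backfillMs(blocks, msPerBlock, 10)

console.log(linearMs, parallelMs) // 2000000 vs 200000 under these assumptions
```

The point of the sketch is only that historical backfill dominated by per-block work scales with the number of parallel workers, which is where moving logic into Substreams pays off.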
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/pl/sps/triggers-example.mdx b/website/pages/pl/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/pl/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/pl/sps/triggers.mdx b/website/pages/pl/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/pl/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction
+
+To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).
diff --git a/website/pages/pt/sps/_meta.js b/website/pages/pt/sps/_meta.js
new file mode 100644
index 000000000000..4ebd7d55a84f
--- /dev/null
+++ b/website/pages/pt/sps/_meta.js
@@ -0,0 +1,5 @@
+import meta from '../../en/sps/_meta.js'
+
+export default {
+  ...meta,
+}
diff --git a/website/pages/pt/sps/introduction.mdx b/website/pages/pt/sps/introduction.mdx
new file mode 100644
index 000000000000..3e50521589af
--- /dev/null
+++ b/website/pages/pt/sps/introduction.mdx
@@ -0,0 +1,19 @@
+---
+title: Introduction to Substreams-powered Subgraphs
+---
+
+By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
+
+There are two methods of enabling this technology:
+
+Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler, moving all of your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
+
+Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities.
+
+It is really a matter of where you put your logic: in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node.
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/pt/sps/triggers-example.mdx b/website/pages/pt/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/pt/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/pt/sps/triggers.mdx b/website/pages/pt/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/pt/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction
+
+To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).
diff --git a/website/pages/ro/sps/_meta.js b/website/pages/ro/sps/_meta.js
new file mode 100644
index 000000000000..4ebd7d55a84f
--- /dev/null
+++ b/website/pages/ro/sps/_meta.js
@@ -0,0 +1,5 @@
+import meta from '../../en/sps/_meta.js'
+
+export default {
+  ...meta,
+}
diff --git a/website/pages/ro/sps/introduction.mdx b/website/pages/ro/sps/introduction.mdx
new file mode 100644
index 000000000000..3e50521589af
--- /dev/null
+++ b/website/pages/ro/sps/introduction.mdx
@@ -0,0 +1,19 @@
+---
+title: Introduction to Substreams-powered Subgraphs
+---
+
+By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
+
+There are two methods of enabling this technology:
+
+Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler, moving all of your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
+
+Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities.
+
+It is really a matter of where you put your logic: in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node.
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ro/sps/triggers-example.mdx b/website/pages/ro/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ro/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/ro/sps/triggers.mdx b/website/pages/ro/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/ro/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction
+
+To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).
diff --git a/website/pages/ru/sps/_meta.js b/website/pages/ru/sps/_meta.js
new file mode 100644
index 000000000000..4ebd7d55a84f
--- /dev/null
+++ b/website/pages/ru/sps/_meta.js
@@ -0,0 +1,5 @@
+import meta from '../../en/sps/_meta.js'
+
+export default {
+  ...meta,
+}
diff --git a/website/pages/ru/sps/introduction.mdx b/website/pages/ru/sps/introduction.mdx
new file mode 100644
index 000000000000..3e50521589af
--- /dev/null
+++ b/website/pages/ru/sps/introduction.mdx
@@ -0,0 +1,19 @@
+---
+title: Introduction to Substreams-powered Subgraphs
+---
+
+By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
+
+There are two methods of enabling this technology:
+
+Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler, moving all of your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
+
+Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities.
+
+It is really a matter of where you put your logic: in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node.
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ru/sps/triggers-example.mdx b/website/pages/ru/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ru/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
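As a quick sanity check after deployment, you can query the indexed entities over GraphQL. The sketch below is illustrative, not part of the tutorial output: the `myTransfers` query field is the plural name Graph Node conventionally derives from the `MyTransfer` entity, and the endpoint URL is a placeholder you would replace with your own:

```typescript
// Illustrative only: the `myTransfers` field name and the endpoint URL
// are assumptions, not generated by the tutorial.
const MY_TRANSFERS_QUERY = `{
  myTransfers(first: 5) {
    id
    amount
    source
    designation
    signers
  }
}`

// Mirrors the `${txnId}-${i}` entity id scheme used in mappings.ts.
function myTransferId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// POST the query to a subgraph GraphQL endpoint (Node 18+ global fetch).
async function fetchTransfers(endpoint: string): Promise<unknown> {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: MY_TRANSFERS_QUERY }),
  })
  return ((await response.json()) as { data?: unknown }).data
}
```

With a deployed endpoint (for example `fetchTransfers('https://<your-query-endpoint>')`), this would return the five most recent transfers; `myTransferId` lets you look a single transfer back up by id.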
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/ru/sps/triggers.mdx b/website/pages/ru/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/ru/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/sv/sps/_meta.js b/website/pages/sv/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/sv/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/sv/sps/introduction.mdx b/website/pages/sv/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/sv/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
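To make the trade-off concrete, the trigger path can be modeled in a few lines of plain TypeScript. This is a simplified, self-contained sketch — the `Transfer` type and the JSON "decoder" below are stand-ins for the generated Protobuf code, not part of any real package:

```typescript
// Simplified model of the trigger flow: graph-node hands the handler the
// serialized output of a Substreams module, and all entity logic runs
// linearly inside the subgraph itself.
type Transfer = { txnId: string; amount: string }

// Stand-in for the generated Protobuf decoder.
function decodeTransfers(bytes: Uint8Array): Transfer[] {
  return JSON.parse(new TextDecoder().decode(bytes)) as Transfer[]
}

// Stand-in for a subgraph trigger handler: decode, then derive one
// entity id per transfer, exactly where entity creation would happen.
function handleTriggers(bytes: Uint8Array): string[] {
  return decodeTransfers(bytes).map((t, i) => `${t.txnId}-${i}`)
}
```

By contrast, with Entity Changes the decoding and filtering would already have happened inside the parallelized Substreams engine, and graph-node would only apply ready-made entity updates.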
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/sv/sps/triggers-example.mdx b/website/pages/sv/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/sv/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
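As a quick sanity check after deployment, you can query the indexed entities over GraphQL. The sketch below is illustrative, not part of the tutorial output: the `myTransfers` query field is the plural name Graph Node conventionally derives from the `MyTransfer` entity, and the endpoint URL is a placeholder you would replace with your own:

```typescript
// Illustrative only: the `myTransfers` field name and the endpoint URL
// are assumptions, not generated by the tutorial.
const MY_TRANSFERS_QUERY = `{
  myTransfers(first: 5) {
    id
    amount
    source
    designation
    signers
  }
}`

// Mirrors the `${txnId}-${i}` entity id scheme used in mappings.ts.
function myTransferId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// POST the query to a subgraph GraphQL endpoint (Node 18+ global fetch).
async function fetchTransfers(endpoint: string): Promise<unknown> {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: MY_TRANSFERS_QUERY }),
  })
  return ((await response.json()) as { data?: unknown }).data
}
```

With a deployed endpoint (for example `fetchTransfers('https://<your-query-endpoint>')`), this would return the five most recent transfers; `myTransferId` lets you look a single transfer back up by id.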
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/sv/sps/triggers.mdx b/website/pages/sv/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/sv/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/tr/sps/_meta.js b/website/pages/tr/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/tr/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/tr/sps/introduction.mdx b/website/pages/tr/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/tr/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
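To make the trade-off concrete, the trigger path can be modeled in a few lines of plain TypeScript. This is a simplified, self-contained sketch — the `Transfer` type and the JSON "decoder" below are stand-ins for the generated Protobuf code, not part of any real package:

```typescript
// Simplified model of the trigger flow: graph-node hands the handler the
// serialized output of a Substreams module, and all entity logic runs
// linearly inside the subgraph itself.
type Transfer = { txnId: string; amount: string }

// Stand-in for the generated Protobuf decoder.
function decodeTransfers(bytes: Uint8Array): Transfer[] {
  return JSON.parse(new TextDecoder().decode(bytes)) as Transfer[]
}

// Stand-in for a subgraph trigger handler: decode, then derive one
// entity id per transfer, exactly where entity creation would happen.
function handleTriggers(bytes: Uint8Array): string[] {
  return decodeTransfers(bytes).map((t, i) => `${t.txnId}-${i}`)
}
```

By contrast, with Entity Changes the decoding and filtering would already have happened inside the parallelized Substreams engine, and graph-node would only apply ready-made entity updates.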
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/tr/sps/triggers-example.mdx b/website/pages/tr/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/tr/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
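As a quick sanity check after deployment, you can query the indexed entities over GraphQL. The sketch below is illustrative, not part of the tutorial output: the `myTransfers` query field is the plural name Graph Node conventionally derives from the `MyTransfer` entity, and the endpoint URL is a placeholder you would replace with your own:

```typescript
// Illustrative only: the `myTransfers` field name and the endpoint URL
// are assumptions, not generated by the tutorial.
const MY_TRANSFERS_QUERY = `{
  myTransfers(first: 5) {
    id
    amount
    source
    designation
    signers
  }
}`

// Mirrors the `${txnId}-${i}` entity id scheme used in mappings.ts.
function myTransferId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// POST the query to a subgraph GraphQL endpoint (Node 18+ global fetch).
async function fetchTransfers(endpoint: string): Promise<unknown> {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: MY_TRANSFERS_QUERY }),
  })
  return ((await response.json()) as { data?: unknown }).data
}
```

With a deployed endpoint (for example `fetchTransfers('https://<your-query-endpoint>')`), this would return the five most recent transfers; `myTransferId` lets you look a single transfer back up by id.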
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/tr/sps/triggers.mdx b/website/pages/tr/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/tr/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/uk/sps/_meta.js b/website/pages/uk/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/uk/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/uk/sps/introduction.mdx b/website/pages/uk/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/uk/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
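To make the trade-off concrete, the trigger path can be modeled in a few lines of plain TypeScript. This is a simplified, self-contained sketch — the `Transfer` type and the JSON "decoder" below are stand-ins for the generated Protobuf code, not part of any real package:

```typescript
// Simplified model of the trigger flow: graph-node hands the handler the
// serialized output of a Substreams module, and all entity logic runs
// linearly inside the subgraph itself.
type Transfer = { txnId: string; amount: string }

// Stand-in for the generated Protobuf decoder.
function decodeTransfers(bytes: Uint8Array): Transfer[] {
  return JSON.parse(new TextDecoder().decode(bytes)) as Transfer[]
}

// Stand-in for a subgraph trigger handler: decode, then derive one
// entity id per transfer, exactly where entity creation would happen.
function handleTriggers(bytes: Uint8Array): string[] {
  return decodeTransfers(bytes).map((t, i) => `${t.txnId}-${i}`)
}
```

By contrast, with Entity Changes the decoding and filtering would already have happened inside the parallelized Substreams engine, and graph-node would only apply ready-made entity updates.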
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/uk/sps/triggers-example.mdx b/website/pages/uk/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/uk/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: + +```yaml +specVersion: v0.1.0 +package: + name: my_project_sol + version: v0.1.0 + +imports: # Pass your spkg of interest + solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg + +modules: + - name: map_spl_transfers + use: solana:map_block # Select corresponding modules available within your spkg + initialBlock: 260000082 + + - name: map_transactions_by_programid + use: solana:solana:transactions_by_programid_without_votes + +network: solana-mainnet-beta + +params: # Modify the param fields to meet your needs + # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA + map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE +``` + +## Step 2: Generate the Subgraph Manifest + +Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container: + +```bash +substreams codegen subgraph +``` + +You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source: + +```yaml +--- +dataSources: + - kind: substreams + name: my_project_sol + network: solana-mainnet-beta + source: + package: + moduleName: map_spl_transfers # Module defined in the substreams.yaml + file: ./my-project-sol-v0.1.0.spkg + mapping: + apiVersion: 0.0.7 + kind: substreams/graph-entities + file: ./src/mappings.ts + handler: handleTriggers +``` + +## Step 3: Define Entities in `schema.graphql` + +Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example: + +```graphql +type MyTransfer @entity { + id: ID! + amount: String! + source: String! + designation: String! + signers: [String!]! 
+} +``` + +This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. + +## Step 4: Generate Protobuf Files + +To generate Protobuf objects in AssemblyScript, run the following command: + +```bash +npm run protogen +``` + +This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler. + +## Step 5: Handle Substreams Data in `mappings.ts` + +With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities: + +```ts +import { Protobuf } from 'as-proto/assembly' +import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' +import { MyTransfer } from '../generated/schema' + +export function handleTriggers(bytes: Uint8Array): void { + const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode) + + for (let i = 0; i < input.data.length; i++) { + const event = input.data[i] + + if (event.transfer != null) { + let entity_id: string = `${event.txnId}-${i}` + const entity = new MyTransfer(entity_id) + entity.amount = event.transfer!.instruction!.amount.toString() + entity.source = event.transfer!.accounts!.source + entity.designation = event.transfer!.accounts!.destination + + if (event.transfer!.accounts!.signer!.single != null) { + entity.signers = [event.transfer!.accounts!.signer!.single!.signer] + } else if (event.transfer!.accounts!.signer!.multisig != null) { + entity.signers = event.transfer!.accounts!.signer!.multisig!.signers + } + entity.save() + } + } +} +``` + +## Conclusion + +You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case. 
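As a quick sanity check after deployment, you can query the indexed entities over GraphQL. The sketch below is illustrative, not part of the tutorial output: the `myTransfers` query field is the plural name Graph Node conventionally derives from the `MyTransfer` entity, and the endpoint URL is a placeholder you would replace with your own:

```typescript
// Illustrative only: the `myTransfers` field name and the endpoint URL
// are assumptions, not generated by the tutorial.
const MY_TRANSFERS_QUERY = `{
  myTransfers(first: 5) {
    id
    amount
    source
    designation
    signers
  }
}`

// Mirrors the `${txnId}-${i}` entity id scheme used in mappings.ts.
function myTransferId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// POST the query to a subgraph GraphQL endpoint (Node 18+ global fetch).
async function fetchTransfers(endpoint: string): Promise<unknown> {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: MY_TRANSFERS_QUERY }),
  })
  return ((await response.json()) as { data?: unknown }).data
}
```

With a deployed endpoint (for example `fetchTransfers('https://<your-query-endpoint>')`), this would return the five most recent transfers; `myTransferId` lets you look a single transfer back up by id.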
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/uk/sps/triggers.mdx b/website/pages/uk/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/uk/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/ur/sps/_meta.js b/website/pages/ur/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/ur/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/ur/sps/introduction.mdx b/website/pages/ur/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/ur/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/ur/sps/triggers-example.mdx b/website/pages/ur/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/ur/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
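+
+Once the subgraph is deployed, you can spot-check the indexed data through its GraphQL endpoint. The query below is a quick sketch, assuming graph-node's default pluralized query field `myTransfers` derived from the `MyTransfer` entity:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```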
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/ur/sps/triggers.mdx b/website/pages/ur/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/ur/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; it can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/vi/sps/_meta.js b/website/pages/vi/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/vi/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/vi/sps/introduction.mdx b/website/pages/vi/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/vi/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/vi/sps/triggers-example.mdx b/website/pages/vi/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/vi/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
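+
+Once the subgraph is deployed, you can spot-check the indexed data through its GraphQL endpoint. The query below is a quick sketch, assuming graph-node's default pluralized query field `myTransfers` derived from the `MyTransfer` entity:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```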
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/vi/sps/triggers.mdx b/website/pages/vi/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/vi/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; it can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/yo/sps/_meta.js b/website/pages/yo/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/yo/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/yo/sps/introduction.mdx b/website/pages/yo/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/yo/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/yo/sps/triggers-example.mdx b/website/pages/yo/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/yo/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
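+
+Once the subgraph is deployed, you can spot-check the indexed data through its GraphQL endpoint. The query below is a quick sketch, assuming graph-node's default pluralized query field `myTransfers` derived from the `MyTransfer` entity:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```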
+
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/pages/yo/sps/triggers.mdx b/website/pages/yo/sps/triggers.mdx
new file mode 100644
index 000000000000..ed19635d4768
--- /dev/null
+++ b/website/pages/yo/sps/triggers.mdx
@@ -0,0 +1,37 @@
+---
+title: Substreams Triggers
+---
+
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.
+
+The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
+
+Here's what you’re seeing in the `mappings.ts` file:
+
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; it can then be used like any other AssemblyScript object
+2. Looping over the transactions
+3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/pages/zh/sps/_meta.js b/website/pages/zh/sps/_meta.js new file mode 100644 index 000000000000..4ebd7d55a84f --- /dev/null +++ b/website/pages/zh/sps/_meta.js @@ -0,0 +1,5 @@ +import meta from '../../en/sps/_meta.js' + +export default { + ...meta, +} diff --git a/website/pages/zh/sps/introduction.mdx b/website/pages/zh/sps/introduction.mdx new file mode 100644 index 000000000000..3e50521589af --- /dev/null +++ b/website/pages/zh/sps/introduction.mdx @@ -0,0 +1,19 @@ +--- +title: Introduction to Substreams-powered Subgraphs +--- + +By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. + +There are two methods of enabling this technology: + +Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph. + +Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly into graph-node. In graph-node, you can use the Substreams data to create your subgraph entities. + +It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in graph-node. 
+ +Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly: + +- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana) +- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm) +- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective) diff --git a/website/pages/zh/sps/triggers-example.mdx b/website/pages/zh/sps/triggers-example.mdx new file mode 100644 index 000000000000..d8d61566295e --- /dev/null +++ b/website/pages/zh/sps/triggers-example.mdx @@ -0,0 +1,137 @@ +--- +title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' +--- + +## Prerequisites + +Before starting, make sure to: + +- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. +- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. + +## Step 1: Initialize Your Project + +1. Open your Dev Container and run the following command to initialize your project: + + ```bash + substreams init + ``` + +2. Select the "minimal" project option. +3. 
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
+specVersion: v0.1.0
+package:
+  name: my_project_sol
+  version: v0.1.0
+
+imports: # Pass your spkg of interest
+  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
+
+modules:
+  - name: map_spl_transfers
+    use: solana:map_block # Select corresponding modules available within your spkg
+    initialBlock: 260000082
+
+  - name: map_transactions_by_programid
+    use: solana:solana:transactions_by_programid_without_votes
+
+network: solana-mainnet-beta
+
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
+```
+
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
+
+```yaml
+---
+dataSources:
+  - kind: substreams
+    name: my_project_sol
+    network: solana-mainnet-beta
+    source:
+      package:
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
+        file: ./my-project-sol-v0.1.0.spkg
+    mapping:
+      apiVersion: 0.0.7
+      kind: substreams/graph-entities
+      file: ./src/mappings.ts
+      handler: handleTriggers
+```
+
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+
+```graphql
+type MyTransfer @entity {
+  id: ID!
+  amount: String!
+  source: String!
+  designation: String!
+  signers: [String!]!
+}
+```
+
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
+
+## Step 4: Generate Protobuf Files
+
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
+
+```ts
+import { Protobuf } from 'as-proto/assembly'
+import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
+import { MyTransfer } from '../generated/schema'
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i]
+
+    if (event.transfer != null) {
+      let entity_id: string = `${event.txnId}-${i}`
+      const entity = new MyTransfer(entity_id)
+      entity.amount = event.transfer!.instruction!.amount.toString()
+      entity.source = event.transfer!.accounts!.source
+      entity.designation = event.transfer!.accounts!.destination
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      }
+      entity.save()
+    }
+  }
+}
+```
+
+## Conclusion
+
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
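+
+Once the subgraph is deployed, you can spot-check the indexed data through its GraphQL endpoint. The query below is a quick sketch, assuming graph-node's default pluralized query field `myTransfers` derived from the `MyTransfer` entity:
+
+```graphql
+{
+  myTransfers(first: 5) {
+    id
+    amount
+    source
+    designation
+    signers
+  }
+}
+```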
+ +For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/pages/zh/sps/triggers.mdx b/website/pages/zh/sps/triggers.mdx new file mode 100644 index 000000000000..ed19635d4768 --- /dev/null +++ b/website/pages/zh/sps/triggers.mdx @@ -0,0 +1,37 @@ +--- +title: Substreams Triggers +--- + +Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework. + +> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container. + +The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created. + +```tsx +export function handleTransactions(bytes: Uint8Array): void { + let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. + if (transactions.length == 0) { + log.info('No transactions found', []) + return + } + + for (let i = 0; i < transactions.length; i++) { + // 2. + let transaction = transactions[i] + + let entity = new Transaction(transaction.hash) // 3. + entity.from = transaction.from + entity.to = transaction.to + entity.save() + } +} +``` + +Here's what you’re seeing in the `mappings.ts` file: + +1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is then used like any other AssemblyScript object +2. Looping over the transactions +3. 
Create a new subgraph entity for every transaction + +To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example). diff --git a/website/route-lockfile.txt b/website/route-lockfile.txt index ac7f9bf98564..37beda87e6f5 100644 --- a/website/route-lockfile.txt +++ b/website/route-lockfile.txt @@ -60,9 +60,13 @@ /ar/quick-start/ /ar/release-notes/assemblyscript-migration-guide/ /ar/release-notes/graphql-validations-migration-guide/ +/ar/sps/introduction/ +/ar/sps/triggers-example/ +/ar/sps/triggers/ /ar/substreams/ /ar/sunrise/ /ar/supported-network-requirements/ +/ar/tap/ /ar/tokenomics/ /cs/ /cs/404/ @@ -125,9 +129,13 @@ /cs/quick-start/ /cs/release-notes/assemblyscript-migration-guide/ /cs/release-notes/graphql-validations-migration-guide/ +/cs/sps/introduction/ +/cs/sps/triggers-example/ +/cs/sps/triggers/ /cs/substreams/ /cs/sunrise/ /cs/supported-network-requirements/ +/cs/tap/ /cs/tokenomics/ /de/about/ /de/arbitrum/arbitrum-faq/ @@ -188,14 +196,17 @@ /de/quick-start/ /de/release-notes/assemblyscript-migration-guide/ /de/release-notes/graphql-validations-migration-guide/ +/de/sps/introduction/ +/de/sps/triggers-example/ +/de/sps/triggers/ /de/substreams/ /de/sunrise/ /de/supported-network-requirements/ +/de/tap/ /de/tokenomics/ /en/ /en/404/ /en/about/ -/en/arbitrum-faq/ -> /en/arbitrum/arbitrum-faq/ /en/arbitrum/arbitrum-faq/ /en/arbitrum/l2-transfer-tools-faq/ /en/arbitrum/l2-transfer-tools-guide/ @@ -203,7 +214,6 @@ /en/chain-integration-overview/ /en/cookbook/arweave/ /en/cookbook/avoid-eth-calls/ -/en/cookbook/base-testnet/ /en/cookbook/cosmos/ /en/cookbook/derivedfrom/ /en/cookbook/grafting/ @@ -214,15 +224,11 @@ /en/cookbook/subgraph-debug-forking/ /en/cookbook/subgraph-uncrashable/ /en/cookbook/substreams-powered-subgraphs/ -/en/cookbook/upgrading-a-subgraph/ /en/deploying/deploying-a-subgraph-to-studio/ /en/deploying/subgraph-studio-faqs/ /en/deploying/subgraph-studio/ -/en/developer/assemblyscript-api/ -> 
/en/developing/graph-ts/api/ -/en/developing/assemblyscript-api/ -> /en/developing/graph-ts/api/ /en/developing/creating-a-subgraph/ /en/developing/developer-faqs/ -/en/developing/graph-ts/ -> /en/developing/graph-ts/README/ /en/developing/graph-ts/CHANGELOG/ /en/developing/graph-ts/README/ /en/developing/graph-ts/api/ @@ -247,7 +253,6 @@ /en/operating-graph-node/ /en/publishing/publishing-a-subgraph/ /en/querying/distributed-systems/ -/en/querying/graph-client/ -> /en/querying/graph-client/README/ /en/querying/graph-client/README/ /en/querying/graph-client/architecture/ /en/querying/graph-client/live/ @@ -261,9 +266,13 @@ /en/quick-start/ /en/release-notes/assemblyscript-migration-guide/ /en/release-notes/graphql-validations-migration-guide/ +/en/sps/introduction/ +/en/sps/triggers-example/ +/en/sps/triggers/ /en/substreams/ /en/sunrise/ /en/supported-network-requirements/ +/en/tap/ /en/tokenomics/ /es/ /es/404/ @@ -326,9 +335,13 @@ /es/quick-start/ /es/release-notes/assemblyscript-migration-guide/ /es/release-notes/graphql-validations-migration-guide/ +/es/sps/introduction/ +/es/sps/triggers-example/ +/es/sps/triggers/ /es/substreams/ /es/sunrise/ /es/supported-network-requirements/ +/es/tap/ /es/tokenomics/ /fr/ /fr/404/ @@ -391,9 +404,13 @@ /fr/quick-start/ /fr/release-notes/assemblyscript-migration-guide/ /fr/release-notes/graphql-validations-migration-guide/ +/fr/sps/introduction/ +/fr/sps/triggers-example/ +/fr/sps/triggers/ /fr/substreams/ /fr/sunrise/ /fr/supported-network-requirements/ +/fr/tap/ /fr/tokenomics/ /ha/about/ /ha/arbitrum/arbitrum-faq/ @@ -448,9 +465,13 @@ /ha/quick-start/ /ha/release-notes/assemblyscript-migration-guide/ /ha/release-notes/graphql-validations-migration-guide/ +/ha/sps/introduction/ +/ha/sps/triggers-example/ +/ha/sps/triggers/ /ha/substreams/ /ha/sunrise/ /ha/supported-network-requirements/ +/ha/tap/ /ha/tokenomics/ /hi/ /hi/404/ @@ -513,9 +534,13 @@ /hi/quick-start/ /hi/release-notes/assemblyscript-migration-guide/ 
/hi/release-notes/graphql-validations-migration-guide/ +/hi/sps/introduction/ +/hi/sps/triggers-example/ +/hi/sps/triggers/ /hi/substreams/ /hi/sunrise/ /hi/supported-network-requirements/ +/hi/tap/ /hi/tokenomics/ /it/ /it/404/ @@ -578,9 +603,13 @@ /it/quick-start/ /it/release-notes/assemblyscript-migration-guide/ /it/release-notes/graphql-validations-migration-guide/ +/it/sps/introduction/ +/it/sps/triggers-example/ +/it/sps/triggers/ /it/substreams/ /it/sunrise/ /it/supported-network-requirements/ +/it/tap/ /it/tokenomics/ /ja/ /ja/404/ @@ -643,9 +672,13 @@ /ja/quick-start/ /ja/release-notes/assemblyscript-migration-guide/ /ja/release-notes/graphql-validations-migration-guide/ +/ja/sps/introduction/ +/ja/sps/triggers-example/ +/ja/sps/triggers/ /ja/substreams/ /ja/sunrise/ /ja/supported-network-requirements/ +/ja/tap/ /ja/tokenomics/ /ko/about/ /ko/arbitrum/arbitrum-faq/ @@ -706,9 +739,13 @@ /ko/quick-start/ /ko/release-notes/assemblyscript-migration-guide/ /ko/release-notes/graphql-validations-migration-guide/ +/ko/sps/introduction/ +/ko/sps/triggers-example/ +/ko/sps/triggers/ /ko/substreams/ /ko/sunrise/ /ko/supported-network-requirements/ +/ko/tap/ /ko/tokenomics/ /mr/ /mr/404/ @@ -771,9 +808,13 @@ /mr/quick-start/ /mr/release-notes/assemblyscript-migration-guide/ /mr/release-notes/graphql-validations-migration-guide/ +/mr/sps/introduction/ +/mr/sps/triggers-example/ +/mr/sps/triggers/ /mr/substreams/ /mr/sunrise/ /mr/supported-network-requirements/ +/mr/tap/ /mr/tokenomics/ /nl/about/ /nl/arbitrum/arbitrum-faq/ @@ -834,9 +875,13 @@ /nl/quick-start/ /nl/release-notes/assemblyscript-migration-guide/ /nl/release-notes/graphql-validations-migration-guide/ +/nl/sps/introduction/ +/nl/sps/triggers-example/ +/nl/sps/triggers/ /nl/substreams/ /nl/sunrise/ /nl/supported-network-requirements/ +/nl/tap/ /nl/tokenomics/ /pl/about/ /pl/arbitrum/arbitrum-faq/ @@ -897,9 +942,13 @@ /pl/quick-start/ /pl/release-notes/assemblyscript-migration-guide/ 
/pl/release-notes/graphql-validations-migration-guide/ +/pl/sps/introduction/ +/pl/sps/triggers-example/ +/pl/sps/triggers/ /pl/substreams/ /pl/sunrise/ /pl/supported-network-requirements/ +/pl/tap/ /pl/tokenomics/ /pt/ /pt/404/ @@ -962,9 +1011,13 @@ /pt/quick-start/ /pt/release-notes/assemblyscript-migration-guide/ /pt/release-notes/graphql-validations-migration-guide/ +/pt/sps/introduction/ +/pt/sps/triggers-example/ +/pt/sps/triggers/ /pt/substreams/ /pt/sunrise/ /pt/supported-network-requirements/ +/pt/tap/ /pt/tokenomics/ /ro/about/ /ro/arbitrum/arbitrum-faq/ @@ -1025,9 +1078,13 @@ /ro/quick-start/ /ro/release-notes/assemblyscript-migration-guide/ /ro/release-notes/graphql-validations-migration-guide/ +/ro/sps/introduction/ +/ro/sps/triggers-example/ +/ro/sps/triggers/ /ro/substreams/ /ro/sunrise/ /ro/supported-network-requirements/ +/ro/tap/ /ro/tokenomics/ /ru/ /ru/404/ @@ -1090,9 +1147,13 @@ /ru/quick-start/ /ru/release-notes/assemblyscript-migration-guide/ /ru/release-notes/graphql-validations-migration-guide/ +/ru/sps/introduction/ +/ru/sps/triggers-example/ +/ru/sps/triggers/ /ru/substreams/ /ru/sunrise/ /ru/supported-network-requirements/ +/ru/tap/ /ru/tokenomics/ /sv/ /sv/404/ @@ -1155,9 +1216,13 @@ /sv/quick-start/ /sv/release-notes/assemblyscript-migration-guide/ /sv/release-notes/graphql-validations-migration-guide/ +/sv/sps/introduction/ +/sv/sps/triggers-example/ +/sv/sps/triggers/ /sv/substreams/ /sv/sunrise/ /sv/supported-network-requirements/ +/sv/tap/ /sv/tokenomics/ /tr/ /tr/404/ @@ -1220,9 +1285,13 @@ /tr/quick-start/ /tr/release-notes/assemblyscript-migration-guide/ /tr/release-notes/graphql-validations-migration-guide/ +/tr/sps/introduction/ +/tr/sps/triggers-example/ +/tr/sps/triggers/ /tr/substreams/ /tr/sunrise/ /tr/supported-network-requirements/ +/tr/tap/ /tr/tokenomics/ /uk/about/ /uk/arbitrum/arbitrum-faq/ @@ -1283,9 +1352,13 @@ /uk/quick-start/ /uk/release-notes/assemblyscript-migration-guide/ 
/uk/release-notes/graphql-validations-migration-guide/ +/uk/sps/introduction/ +/uk/sps/triggers-example/ +/uk/sps/triggers/ /uk/substreams/ /uk/sunrise/ /uk/supported-network-requirements/ +/uk/tap/ /uk/tokenomics/ /ur/ /ur/404/ @@ -1348,9 +1421,13 @@ /ur/quick-start/ /ur/release-notes/assemblyscript-migration-guide/ /ur/release-notes/graphql-validations-migration-guide/ +/ur/sps/introduction/ +/ur/sps/triggers-example/ +/ur/sps/triggers/ /ur/substreams/ /ur/sunrise/ /ur/supported-network-requirements/ +/ur/tap/ /ur/tokenomics/ /vi/about/ /vi/arbitrum/arbitrum-faq/ @@ -1411,9 +1488,13 @@ /vi/quick-start/ /vi/release-notes/assemblyscript-migration-guide/ /vi/release-notes/graphql-validations-migration-guide/ +/vi/sps/introduction/ +/vi/sps/triggers-example/ +/vi/sps/triggers/ /vi/substreams/ /vi/sunrise/ /vi/supported-network-requirements/ +/vi/tap/ /vi/tokenomics/ /yo/about/ /yo/arbitrum/arbitrum-faq/ @@ -1474,9 +1555,13 @@ /yo/quick-start/ /yo/release-notes/assemblyscript-migration-guide/ /yo/release-notes/graphql-validations-migration-guide/ +/yo/sps/introduction/ +/yo/sps/triggers-example/ +/yo/sps/triggers/ /yo/substreams/ /yo/sunrise/ /yo/supported-network-requirements/ +/yo/tap/ /yo/tokenomics/ /zh/ /zh/404/ @@ -1539,7 +1624,11 @@ /zh/quick-start/ /zh/release-notes/assemblyscript-migration-guide/ /zh/release-notes/graphql-validations-migration-guide/ +/zh/sps/introduction/ +/zh/sps/triggers-example/ +/zh/sps/triggers/ /zh/substreams/ /zh/sunrise/ /zh/supported-network-requirements/ +/zh/tap/ /zh/tokenomics/