\ No newline at end of file
+404: This page could not be found
404
This page could not be found.
\ No newline at end of file
diff --git a/404/index.html b/404/index.html
index 1b118c4c..61be445d 100644
--- a/404/index.html
+++ b/404/index.html
@@ -1 +1 @@
-404: This page could not be found
404
This page could not be found.
\ No newline at end of file
+404: This page could not be found
404
This page could not be found.
\ No newline at end of file
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/bounty_process.json b/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/bounty_process.json
deleted file mode 100644
index d19cdb5c..00000000
--- a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/bounty_process.json
+++ /dev/null
@@ -1 +0,0 @@
-{"pageProps":{"frontmatter":{"title":"Bounty Procedures","description":"The process for handling bounties on issues"},"body":"\nThis page explains how companies can place bounties on issues,\nhow developers can work on them,\nand how the [Comunica Association](/association/) manages such bounties.\n\n## Placing a bounty\n\nCompanies (or other types of organizations) that are interested in placing bounties on issues (features requests, bug reports, ...) must follow this procedure:\n\n1. Company lets the association know they are interested in placing a bounty on an issue, by mailing us.\n2. The association finds one or more suitable developers, and reports back to the company on their expected time frame and cost.\n3. All parties (association, company, developer) negotiate the final time frame and cost, after which one developer is assigned to the issue (if all parties agree).\n4. The company pays the full bounty cost to the association, from which the association claims an overhead of 15%.\n5. After completion (or when the reserved time runs out), all parties (association, company, developer) evaluate the work.\n6. The association pays the bounty to the developer (minus 15% overhead).\n\n## Working on a bounty\n\nDevelopers that are interested in working on issue bounties must follow this procedure:\n\n1. Based on the [list of bounties](/association/bounties/), developers can click on any issue to notify the association that they are interested in working on this issue.\n2. The association discusses with the developer to learn about previous experiences, and what the expected time frame and at what price the developer is willing to work for.\n3. If the company agrees with the developer's conditions, they jointly negotiate the final time frame and cost, after which the developer is assigned to the issue (if all parties agree), and the developer can start the work.\n4. 
After completion (or when the reserved time runs out), the developer presents the work to the company and the association for evaluation.\n5. The association pays the bounty to the developer (minus 15% overhead).\n\n**The developer should not start working on the issue, before the company and association have confirmed the assignment.**\n\n## Management of bounties\n\nThe association manages issues as follows:\n\n1. A company sends a mail to the association to place a bounty on one or more issues.\n2. The association marks the issue with the `comunica-association-bounty` label, and adds a footer to the issue to mark that a bounty has been placed, after which the issue will appear automatically in [the list of bounties](/association/bounties/). Optionally, a budget for the bounty can be added.\n3. If applicable, the association directly contacts potentially interested developers.\n4. The association awaits offers from developers with their estimated time frame and cost.\n5. Depending on the urgency of the issue, the association sends all offers from developers to the company, together with any previous experiences the association had with each developer.\n6. The company and association negotiate with at least one developer to agree on a fixed time frame and cost (taking into account the 15% overhead).\n7. The association sends an invoice to the company for the agreed upon price.\n8. After payment of the invoice, the developer can start with the work.\n9. The association assigns the issue to the developer, which will make the issue marked as *\"claimed\"* in [the list of bounties](/association/bounties/).\n10. Once the deadline is reached, the association contacts the company and developer to schedule a review meeting.\n11. During the review meeting, all parties discuss the outcome, and potential next steps.\n12. 
The association pays the bounty to the developer (minus 15% overhead).\n\nDepending on the specific needs of certain issues or use cases, deviations from these procedures may take place.\n\n## Claiming a bounty\n\nOnce a bounty has been fully finalized, you can request your payment by _submitting an expense_ via [Open Collective](https://opencollective.com/comunica-association/).\nWhen submitting an expense, you must attach an invoice, which must be a valid fiscal document.\nThis document must at least contain your VAT ID and your address and the Comunica Association's address:\n\n```\nComunica Association\nCantorsteen 10 \n1000 Bruxelles \nBelgië\n```\n\nAll expenses are handled by [Open Collective Europe](https://docs.opencollective.com/oceurope).\nMore details on expenses can be found on [Open Collective Europe's wiki](https://docs.opencollective.com/oceurope/how-it-works/expenses).\n\n## Rules\n\n1. While anyone is allowed to take up bounties, if board members want to take up bounties, all other board members have to agree, to avoid conflicts of interest.\n2. Once assigned, bounties are expected to be delivered in a timely manner. 
If the developer does not communicate any progress for more than a week (without prior notification of unavailability), the bounty may become unassigned.\n","excerpt":"","path":"/association/bounty_process","paths":["/about/","/ask/","/association/","/association/board/","/association/bounty_process/","/blog/","/blog/2020-08-19-intro/","/blog/2020-08-24-release_1_16/","/blog/2020-09-25-release_1_17/","/blog/2020-11-02-release_1_18/","/blog/2021-01-18-release_1_19/","/blog/2021-03-30-release_1_20/","/blog/2021-04-27-release_1_21/","/blog/2021-06-21-comunica_association_bounties/","/blog/2021-08-30-release_1_22/","/blog/2021-11-08-comunica_association_members/","/blog/2022-03-03-release_2_0/","/blog/2022-06-29-release_2_3/","/blog/2022-07-14-association_launch/","/blog/2022-08-24-release_2_4/","/blog/2022-11-09-release_2_5/","/blog/2023-05-24-release_2_7/","/blog/2023-07-04-release_2_8/","/blog/2024-03-19-release_3_0/","/blog/2024-05-11-release_3_1/","/blog/2024-07-05-release_3_2/","/contribute/","/docs/","/docs/query/","/docs/query/getting_started/","/docs/query/getting_started/query_cli/","/docs/query/getting_started/update_cli/","/docs/query/getting_started/query_cli_file/","/docs/query/getting_started/query_app/","/docs/query/getting_started/update_app/","/docs/query/getting_started/query_browser_app/","/docs/query/getting_started/query_docker/","/docs/query/getting_started/setup_endpoint/","/docs/query/getting_started/setup_web_client/","/docs/query/getting_started/query_dev_version/","/docs/query/usage/","/docs/query/faq/","/docs/query/advanced/","/docs/query/advanced/basic_auth/","/docs/query/advanced/bindings/","/docs/query/advanced/caching/","/docs/query/advanced/context/","/docs/query/advanced/destination_types/","/docs/query/advanced/explain/","/docs/query/advanced/extension_functions/","/docs/query/advanced/federation/","/docs/query/advanced/graphql_ld/","/docs/query/advanced/hdt/","/docs/query/advanced/logging/","/docs/query/advanced/memento/","/docs/quer
y/advanced/proxying/","/docs/query/advanced/rdfjs/","/docs/query/advanced/rdfjs_querying/","/docs/query/advanced/rdfjs_updating/","/docs/query/advanced/result_formats/","/docs/query/advanced/solid/","/docs/query/advanced/source_types/","/docs/query/advanced/sparql_query_types/","/docs/query/advanced/specifications/","/docs/modify/","/docs/modify/getting_started/","/docs/modify/getting_started/custom_config_cli/","/docs/modify/getting_started/custom_config_app/","/docs/modify/getting_started/custom_init/","/docs/modify/getting_started/custom_web_client/","/docs/modify/getting_started/contribute_actor/","/docs/modify/getting_started/actor_parameter/","/docs/modify/extensions/","/docs/modify/faq/","/docs/modify/advanced/","/docs/modify/advanced/actor_patterns/","/docs/modify/advanced/algebra/","/docs/modify/advanced/architecture_core/","/docs/modify/advanced/architecture_sparql/","/docs/modify/advanced/browser_builds/","/docs/modify/advanced/buses/","/docs/modify/advanced/componentsjs/","/docs/modify/advanced/custom_cli_arguments/","/docs/modify/advanced/expression-evaluator/","/docs/modify/advanced/hypermedia/","/docs/modify/advanced/joins/","/docs/modify/advanced/linking_local_version/","/docs/modify/advanced/logging/","/docs/modify/advanced/mediators/","/docs/modify/advanced/metadata/","/docs/modify/advanced/observers/","/docs/modify/advanced/query_operation_result_types/","/docs/modify/advanced/rdf_parsing_serializing/","/docs/modify/advanced/sparqlee/","/docs/modify/advanced/testing/","/docs/modify/benchmarking/","/events/","/events/2019-06-03-eswc/","/events/2019-10-26-iswc/","/events/2022-09-07-association_launch/","/events/2022-09-13-semantics_conference/","/logos/","/research/","/research/amf/","/research/link_traversal/","/research/versioning/","/roadmap/"],"mattersData":{"/about/":{"title":"About","description":"Learn more about Comunica."},"/ask/":{"title":"Ask","description":"Ask questions about Comunica."},"/association/":{"title":"Comunica 
Association","description":"Organization for ensuring the maintenance and development of the Comunica"},"/association/board/":{"title":"Board of Directors","description":"The board makes decisions regarding the Comunica Association"},"/association/bounty_process/":{"title":"Bounty Procedures","description":"The process for handling bounties on issues"},"/blog/":{"title":"Blog","description":"Blog posts, containing announcements or other news.","blog_index":true},"/blog/2020-08-19-intro/":{"title":"A New Website for Comunica","excerpt":"\nWe're happy to present a brand new website for Comunica! 🎉\n_Don't know that Comunica is? [Read about it here](/about/)._\n\nThis new version contains all **basic information** around Comunica.\nAdditionally, it contains **guides** on how to [query with Comunica](/docs/query/),\nand how to [modify or extend it](/docs/modify/). \n\n"},"/blog/2020-08-24-release_1_16/":{"title":"Release 1.16.0: Full spec compliance, property paths, CSV/TSV, basic auth, and fixes","excerpt":"\nWith the latest release of Comunica, we have achieved the major milestone of **full compliance to the SPARQL 1.1 specification**.\nWhile Comunica has had support for all SPARQL 1.1 operators for a while,\nsome small parts were not always fully handled according to the spec,\nand property paths were not fully supported.\n\nThanks to the help of several students over the summer, these issues have been resolved,\nand all tests from [the SPARQL 1.1 test suite](https://w3c.github.io/rdf-tests/sparql11/) now pass.\n\n"},"/blog/2020-09-25-release_1_17/":{"title":"Hacktoberfest and Release 1.17.0","excerpt":"\nIn this post, we give an overview of\ncontribution possibilities during [Hacktoberfest](https://hacktoberfest.digitalocean.com/),\nand the newly released 1.17.0 version. 
\n\n"},"/blog/2020-11-02-release_1_18/":{"title":"Release 1.18.0: Smaller Web bundles and Microdata parsing","excerpt":"\nThis post gives a brief overview of the new 1.18.0 release.\n\n"},"/blog/2021-01-18-release_1_19/":{"title":"Release 1.19.0: Simplifications for extensions","excerpt":"\nThe 1.19.0 release focuses on simplications for developing Comunica extension.\nIt contains no significant fixes or changes for end-users.\n\n"},"/blog/2021-03-30-release_1_20/":{"title":"Release 1.20.0: SPARQL Update support","excerpt":"\nWith this new 1.20.0 release, we bring support for [SPARQL Update](https://www.w3.org/TR/sparql11-update/) queries to Comunica.\nNext to this, several enhancements were made to improve developer experience,\nminor new features, and important bug fixes.\n\n"},"/blog/2021-04-27-release_1_21/":{"title":"Release 1.21.0: Hypermedia-based SPARQL Updating","excerpt":"\nThe 1.21.0 version is a smaller release,\nthat mainly introduces the necessary wiring to enable hypermedia-driven SPARQL update querying,\nwhich lays the foundations for highly flexible updating of heterogeneous destinations, such as Solid data pods.\n\nIn other words, this provides the necessary ✨_magic_✨ for updating many different types of things. 
\n\n"},"/blog/2021-06-21-comunica_association_bounties/":{"title":"Announcing the Comunica Association, and a Bounty Program","excerpt":"\nIn this post, we announce the creation of the [Comunica Association](/association/),\nand the introduction of a new bounty system using which **organizations** and companies\ncan **fund development** of new features and the fixing of bugs,\nand through which **developers** can take up these bounties and **get paid**.\n\n"},"/blog/2021-08-30-release_1_22/":{"title":"Release 1.22.0: Improved update support, extension functions, and improved CLI handling","excerpt":"\nThe 1.22.0 version features some major additions, and a bunch of smaller internal fixes and performance improvements 🚀!\nThe primary changes that are discussed in this post are\nsupport for more SPARQL update destination types,\nSPARQL extension functions,\nand rewritten CLI handling.\n\n"},"/blog/2021-11-08-comunica_association_members/":{"title":"Comunica Association Memberships","excerpt":"\n[Earlier this year](/blog/2021-06-21-comunica_association_bounties/),\nwe announced the [Comunica Association](/association/),\nwhich is a non-profit organization that aims to make Comunica sustainable in the long term.\nIn this post, we announce the possibility to become a _member_ or _sponsor_ to the association,\nallowing organizations to drive the future roadmap of Comunica.\nWe plan an **official launch in fall 2022**, up until when organizations can choose\nto become a **founding member** of the Comunica Association.\n\n"},"/blog/2022-03-03-release_2_0/":{"title":"Release 2.0.0: A new major release with radical simplifications and performance improvements","excerpt":"\nSince its initial release a couple of years ago, Comunica has grown a lot,\nbut it has always remained fully backwards-compatible with every update.\nHowever, as with every software project, there is sometimes a need to make breaking changes\nso that old mechanisms can be replaced with better, newer 
ones.\nWith this update, we have aggregated several breaking changes into one large update,\nall of which should improve the lives of users one way or another.\nBelow, the primary changes are listed.\n\n"},"/blog/2022-06-29-release_2_3/":{"title":"Release 2.3.0: Better timeout support and minor enhancements","excerpt":"\nIt's been a while since our latest blog post,\nso here's a small announcement on the latest 2.3.0 release.\n\n"},"/blog/2022-07-14-association_launch/":{"title":"Official launch of the Comunica Association","excerpt":"\nAs previously announced, we will be officially launching the Comunica Association during the fall of this year.\nMore concretely, we are organizing an online launch event on the 7th of September,\nand we will be physically present at the Semantics conference in Vienna the week afterwards.\n\n"},"/blog/2022-08-24-release_2_4/":{"title":"Release 2.4.0: Better browser support and performance improvements","excerpt":"\nWe just released a new minor version of Comunica.\nHere's an overview of the main changes.\n\n"},"/blog/2022-11-09-release_2_5/":{"title":"Release 2.5.0: Fixes, string sources, and HTTP error handling","excerpt":"\nWe just released a new small update. 
Here's an overview of the main changes.\n\n"},"/blog/2023-05-24-release_2_7/":{"title":"Release 2.7.0: Better date support, better performance over SPARQL endpoints, and internal fixes","excerpt":"\nToday, we released a new minor update, which brings exciting new features, performance improvements, and bug fixes.\nBelow, you can find an overview of the main changes.\n\n"},"/blog/2023-07-04-release_2_8/":{"title":"Release 2.8.0: Support for quoted triples (RDF-star and SPARQL-star)","excerpt":"\nThis minor release focuses on a single but significant new feature: support for quoted triples.\n\n"},"/blog/2024-03-19-release_3_0/":{"title":"Release 3.0: 🔥 Blazingly fast federation over heterogeneous sources","excerpt":"\nMore than 2 years ago, we released [Comunica version 2.0](/blog/2022-03-03-release_2_0/),\nwhich featured many internal and external API changes that significantly simplified its usage.\nToday, we release version 3.0, which focuses more on internal changes, with limited changes to the external API.\nMost of the changes relate to the handling of data sources during query planning,\nwhich allows **more efficient query plans to be produced when querying over federations of heterogeneous sources**.\nThis means that for people using Comunica, the number of breaking changes in this update are very limited.\nThings will simplify be faster in general, and some small convenience features have been added,\nsuch as results being [async iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols). 
\nTo developers extending Comunica with custom actors, there will be some larger breaking changes.\n\n"},"/blog/2024-05-11-release_3_1/":{"title":"Release 3.1: 🌱 New package with tiny bundle size","excerpt":"\nThe primary addition in this release is the new [`@comunica/query-sparql-rdfjs-lite`](https://www.npmjs.com/package/@comunica/query-sparql-rdfjs-lite) package,\nwhich is optimized for small browser bundle size.\nCurrently, the minified size of this package is 648,88 KB (145,79 KB when gzipped).\nThis is about as small as you can get without removing required functionality from the SPARQL 1.1 spec\nBut if you don't need everything from SPARQL 1.1, it could get much smaller even!\n\n"},"/blog/2024-07-05-release_3_2/":{"title":"Release 3.2: 🔎 Knowing what to optimize","excerpt":"\nFor this release, we mainly focused on improving tooling to more easily track down performance issues.\nConcretely, we improved our query explain output,\nstarted running multiple benchmarks in our CI to avoid performance regressions,\nand applied several performance improvements that were identified following these changes.\n\n"},"/contribute/":{"title":"Contribute","description":"Contribute to the development of Comunica."},"/docs/":{"title":"Documentation","description":"Overview of all Comunica documentation.","index":true},"/docs/query/":{"title":"Query with Comunica","description":"Learn how to execute queries in different environments. 
Such as live in the browser, in JavaScript applications, or the CLI.","index":true},"/docs/query/getting_started/":{"title":"Getting started with querying","description":"Basic guides on how to easily get started with querying.","index":true},"/docs/query/getting_started/query_cli/":{"title":"Querying from the command line","description":"Execute SPARQL queries directly from the command line."},"/docs/query/getting_started/update_cli/":{"title":"Updating from the command line","description":"Execute SPARQL Update queries directly from the command line."},"/docs/query/getting_started/query_cli_file/":{"title":"Querying local files from the command line","description":"Execute SPARQL queries over local RDF files directly from the command line."},"/docs/query/getting_started/query_app/":{"title":"Querying in a JavaScript app","description":"Execute SPARQL queries from within your application using the JavaScript API."},"/docs/query/getting_started/update_app/":{"title":"Updating in a JavaScript app","description":"Execute SPARQL Update queries from within your application using the JavaScript API."},"/docs/query/getting_started/query_browser_app/":{"title":"Querying in a JavaScript browser app","description":"Execute SPARQL queries from within your client-side browser application using the JavaScript API."},"/docs/query/getting_started/query_docker/":{"title":"Querying from a Docker container","description":"Execute SPARQL queries within a Docker container."},"/docs/query/getting_started/setup_endpoint/":{"title":"Setting up a SPARQL endpoint","description":"Allow querying over HTTP via the SPARQL protocol"},"/docs/query/getting_started/setup_web_client/":{"title":"Setting up a Web client","description":"Set up a user-friendly static Web page where SPARQL queries can be executed client-side"},"/docs/query/getting_started/query_dev_version/":{"title":"Query using the latest development version","description":"If you want to make use of the latest changes that are not 
released yet"},"/docs/query/usage/":{"title":"Usage showcase","description":"Examples of where Comunica is used."},"/docs/query/faq/":{"title":"Querying FAQ","description":"Frequently asked questions about using Comunica."},"/docs/query/advanced/":{"title":"Advanced querying","description":"Advanced guides on how to get the most out of Comunica.","index":true},"/docs/query/advanced/basic_auth/":{"title":"HTTP Basic Authentication","description":"Send authenticated HTTP requests by including username and password."},"/docs/query/advanced/bindings/":{"title":"Bindings","description":"Bindings objects are used to represent results of SPARQL SELECT queries"},"/docs/query/advanced/caching/":{"title":"Caching","description":"When remote sources are requested, caching allows them to be reused in the future."},"/docs/query/advanced/context/":{"title":"Passing a context","description":"A context can be passed to a query engine to tweak its runtime settings."},"/docs/query/advanced/destination_types/":{"title":"Destination types","description":"Comunica detects and handles different types of destinations."},"/docs/query/advanced/explain/":{"title":"Explain","description":"Display information about the logical and physical query plan"},"/docs/query/advanced/extension_functions/":{"title":"Extension Functions","description":"Providing implementations for SPARQL extension functions."},"/docs/query/advanced/federation/":{"title":"Federated Querying","description":"Query over the union of data within any number of sources"},"/docs/query/advanced/graphql_ld/":{"title":"GraphQL-LD","description":"Using the power of JSON-LD contexts, GraphQL queries can be executed by Comunica"},"/docs/query/advanced/hdt/":{"title":"HDT","description":"HDT offers highly compressed immutable RDF storage."},"/docs/query/advanced/logging/":{"title":"Logging","description":"Loggers can be set to different logging levels to inspect what Comunica is doing behind the 
scenes."},"/docs/query/advanced/memento/":{"title":"Memento","description":"Using the Memento protocol, time travel queries can be executed."},"/docs/query/advanced/proxying/":{"title":"HTTP Proxy","description":"All HTTP requests can optionally go through a proxy."},"/docs/query/advanced/rdfjs/":{"title":"RDF/JS","description":"To achieve maximum interoperability between different JavaScript libraries, Comunica builds on top of the RDF/JS specifications."},"/docs/query/advanced/rdfjs_querying/":{"title":"Querying over RDF/JS sources","description":"If the built-in source types are not sufficient, you can pass a custom JavaScript object implementing a specific interface."},"/docs/query/advanced/rdfjs_updating/":{"title":"Updating RDF/JS stores","description":"If the built-in destination types are not sufficient, you can pass a custom JavaScript object implementing a specific interface."},"/docs/query/advanced/result_formats/":{"title":"Result formats","description":"Query results can be serialized in different formats."},"/docs/query/advanced/solid/":{"title":"Solid","description":"Solid – the Web-based decentralization ecosystem – can be queried with Comunica."},"/docs/query/advanced/source_types/":{"title":"Source types","description":"Comunica detects and handles different types of sources."},"/docs/query/advanced/sparql_query_types/":{"title":"SPARQL query types","description":"Different SPARQL query types are possible, such as SELECT, CONSTRUCT, ASK, ..."},"/docs/query/advanced/specifications/":{"title":"Supported specifications","description":"Comunica supports several RDF-related specifications"},"/docs/modify/":{"title":"Modify Comunica","description":"Learn how to configure your own Comunica engine, or extend Comunica by implementing new components.","index":true},"/docs/modify/getting_started/":{"title":"Getting started with modification","description":"Basic guides on how to easily get started with Comunica 
modification.","index":true},"/docs/modify/getting_started/custom_config_cli/":{"title":"Querying with a custom configuration from the command line","description":"Create a custom configuration of Comunica modules with reduced features, and query with it from the command line."},"/docs/modify/getting_started/custom_config_app/":{"title":"Querying with a custom configuration in a JavaScript app","description":"Create a custom configuration of Comunica modules with changed features, and query with it from within your application using the JavaScript API."},"/docs/modify/getting_started/custom_init/":{"title":"Exposing your custom config as an npm package","description":"Wrap your config in an npm package, and expose a CLI tool and a JavaScript API."},"/docs/modify/getting_started/custom_web_client/":{"title":"Exposing your custom config in a Web client","description":"Demonstrate your query engine as a static Web page."},"/docs/modify/getting_started/contribute_actor/":{"title":"Contributing a new query operation actor to the Comunica repository","description":"Setup a development environment, implement a new actor, and create a pull request."},"/docs/modify/getting_started/actor_parameter/":{"title":"Adding a config parameter to an actor","description":"For an existing actor, add a parameter that can be customized in the config file."},"/docs/modify/extensions/":{"title":"Extensions","description":"Existing extensions of Comunica."},"/docs/modify/faq/":{"title":"Modify FAQ","description":"Frequently asked question about Comunica modification."},"/docs/modify/advanced/":{"title":"Advanced modification","description":"Advanced guides on how to get the most out of Comunica modification.","index":true},"/docs/modify/advanced/actor_patterns/":{"title":"Actor Patterns","description":"Overview of common design patterns for actors"},"/docs/modify/advanced/algebra/":{"title":"Algebra","description":"The internal representation of queries during query 
execution."},"/docs/modify/advanced/architecture_core/":{"title":"Core Architecture","description":"The low-level software architecture of Comunica for achieving modularity."},"/docs/modify/advanced/architecture_sparql/":{"title":"SPARQL Architecture","description":"The high-level software architecture of Comunica for implementing SPARQL."},"/docs/modify/advanced/browser_builds/":{"title":"Browser builds","description":"All modules in Comunica can be built for the browser."},"/docs/modify/advanced/buses/":{"title":"Buses and Actors","description":"An overview of all buses in Comunica and their actors."},"/docs/modify/advanced/componentsjs/":{"title":"Components.js","description":"Components.js is the dependency injection framework that Comunica uses to wire components via config files."},"/docs/modify/advanced/custom_cli_arguments/":{"title":"Custom CLI arguments","description":"Adding custom arguments to CLI tools"},"/docs/modify/advanced/expression-evaluator/":{"title":"Expression Evaluator","description":"The expression evaluation engine of Comunica."},"/docs/modify/advanced/hypermedia/":{"title":"Hypermedia","description":"Discovery of data source capabilities during query execution."},"/docs/modify/advanced/joins/":{"title":"Joins","description":"Overview of how join operations are handled during query planning"},"/docs/modify/advanced/linking_local_version/":{"title":"Linking local Comunica versions to other projects","description":"Guide on how to use a local development version of Comunica with another local project"},"/docs/modify/advanced/logging/":{"title":"Logging","description":"How to log messages from within actors."},"/docs/modify/advanced/mediators/":{"title":"Mediators","description":"An overview of all mediators in Comunica."},"/docs/modify/advanced/metadata/":{"title":"Metadata","description":"Information for adaptive planning of query operations."},"/docs/modify/advanced/observers/":{"title":"Observers","description":"Passively observe actions 
executed by actors on a given bus."},"/docs/modify/advanced/query_operation_result_types/":{"title":"Query operation result types","description":"An overview of the different output types for query operations."},"/docs/modify/advanced/rdf_parsing_serializing/":{"title":"RDF Parsing and Serializing","description":"Basic concepts behind parsing and serializing RDF."},"/docs/modify/advanced/sparqlee/":{"title":"Sparqlee","description":"The SPARQL expression evaluation engine of Comunica. (DEPRECATED)"},"/docs/modify/advanced/testing/":{"title":"Testing","description":"The unit and integration tests that lead to a more stable codebase."},"/docs/modify/benchmarking/":{"title":"Benchmarking","description":"Guidelines on running experiments with Comunica."},"/events/":{"title":"Events","description":"Overview of all Comunica-related events.","index":true,"reverse":true},"/events/2019-06-03-eswc/":{"title":"2019-06-03: Tutorial at ESWC 2019","description":"Comunica tutorial at the ESWC 2019 conference"},"/events/2019-10-26-iswc/":{"title":"2019-10-26: Tutorial at ISWC 2019","description":"Comunica and Solid tutorial at the ISWC 2019 conference"},"/events/2022-09-07-association_launch/":{"title":"2022-09-07: Comunica Association Launch","description":"An online event for the official launch of the Comunica Association"},"/events/2022-09-13-semantics_conference/":{"title":"2022-09-13/15: Semantics Conference","description":"The Comunica Association will have a booth and talk at the Semantics Conference in Vienna"},"/logos/":{"title":"Logos","description":"Free to use logos of Comunica."},"/research/":{"title":"Research","description":"An overview of these research surrounding Comunica."},"/research/amf/":{"title":"Approximate Membership Functions","description":"An overview of research that has been done on AMFs during query execution."},"/research/link_traversal/":{"title":"Link Traversal","description":"An overview of research that has been done on Link-Traversal-based 
Query Processing."},"/research/versioning/":{"title":"Versioning","description":"An overview of research that has been done on Query Processing for RDF archives."},"/roadmap/":{"title":"Roadmap","description":"The long-term goals of Comunica"}}},"__N_SSG":true}
\ No newline at end of file
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/about.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/about.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/about.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/about.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/ask.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/ask.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/ask.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/ask.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/board.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/board.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/board.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/board.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/bounties.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/bounties.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/association/bounties.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/bounties.json
diff --git a/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/bounty_process.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/bounty_process.json
new file mode 100644
index 00000000..f36b6a34
--- /dev/null
+++ b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/association/bounty_process.json
@@ -0,0 +1 @@
+{"pageProps":{"frontmatter":{"title":"Bounty Procedures","description":"The process for handling bounties on issues"},"body":"\nThis page explains how companies can place bounties on issues,\nhow developers can work on them,\nand how the [Comunica Association](/association/) manages such bounties.\n\n## Placing a bounty\n\nCompanies (or other types of organizations) that are interested in placing bounties on issues (feature requests, bug reports, ...) must follow this procedure:\n\n1. The company lets the association know they are interested in placing a bounty on an issue by mailing us.\n2. The association finds one or more suitable developers, and reports back to the company on their expected time frame and cost.\n3. All parties (association, company, developer) negotiate the final time frame and cost, after which one developer is assigned to the issue (if all parties agree).\n4. The company pays the full bounty cost to the association, from which the association claims an overhead of 15%.\n5. After completion (or when the reserved time runs out), all parties (association, company, developer) evaluate the work.\n6. The association pays the bounty to the developer (minus 15% overhead).\n\n## Working on a bounty\n\nDevelopers that are interested in working on issue bounties must follow this procedure:\n\n1. Based on the [list of bounties](/association/bounties/), developers can click on any issue to notify the association that they are interested in working on this issue.\n2. The association discusses with the developer to learn about their previous experience, the expected time frame, and the price at which they are willing to work.\n3. If the company agrees with the developer's conditions, they jointly negotiate the final time frame and cost, after which the developer is assigned to the issue (if all parties agree), and the developer can start the work.\n4. 
After completion (or when the reserved time runs out), the developer presents the work to the company and the association for evaluation.\n5. The association pays the bounty to the developer (minus 15% overhead).\n\n**The developer should not start working on the issue before the company and association have confirmed the assignment.**\n\n## Management of bounties\n\nThe association manages issues as follows:\n\n1. A company sends an email to the association to place a bounty on one or more issues.\n2. The association marks the issue with the `comunica-association-bounty` label, and adds a footer to the issue to mark that a bounty has been placed, after which the issue will appear automatically in [the list of bounties](/association/bounties/). Optionally, a budget for the bounty can be added.\n3. If applicable, the association directly contacts potentially interested developers.\n4. The association awaits offers from developers with their estimated time frame and cost.\n5. Depending on the urgency of the issue, the association sends all offers from developers to the company, together with any previous experiences the association had with each developer.\n6. The company and association negotiate with at least one developer to agree on a fixed time frame and cost (taking into account the 15% overhead).\n7. The association sends an invoice to the company for the agreed-upon price.\n8. After payment of the invoice, the developer can start the work.\n9. The association assigns the issue to the developer, which will mark the issue as *\"claimed\"* in [the list of bounties](/association/bounties/).\n10. Once the deadline is reached, the association contacts the company and developer to schedule a review meeting.\n11. During the review meeting, all parties discuss the outcome and potential next steps.\n12. 
The association pays the bounty to the developer (minus 15% overhead).\n\nDepending on the specific needs of certain issues or use cases, deviations from these procedures may take place.\n\n## Claiming a bounty\n\nOnce a bounty has been fully finalized, you can request your payment by _submitting an expense_ via [Open Collective](https://opencollective.com/comunica-association/).\nWhen submitting an expense, you must attach an invoice, which must be a valid fiscal document.\nThis document must at least contain your VAT ID and address, the name of the Comunica Association (Open Collective Europe must not be mentioned), and the Comunica Association's address:\n\n```\nAA Tower (Ghent University – imec)\nTechnologiepark-Zwijnaarde 122\n9052 Ghent\nBelgium\n```\n\nAll expenses are handled by [Open Collective Europe](https://docs.opencollective.com/oceurope).\nMore details on expenses can be found on [Open Collective Europe's wiki](https://docs.opencollective.com/oceurope/how-it-works/expenses).\n\n## Rules\n\n1. While anyone is allowed to take up bounties, if board members want to take up bounties, all other board members have to agree, to avoid conflicts of interest.\n2. Once assigned, bounties are expected to be delivered in a timely manner. 
If the developer does not communicate any progress for more than a week (without prior notification of unavailability), the bounty may become unassigned.\n","excerpt":"","path":"/association/bounty_process","paths":["/about/","/ask/","/association/","/association/board/","/association/bounty_process/","/blog/","/blog/2020-08-19-intro/","/blog/2020-08-24-release_1_16/","/blog/2020-09-25-release_1_17/","/blog/2020-11-02-release_1_18/","/blog/2021-01-18-release_1_19/","/blog/2021-03-30-release_1_20/","/blog/2021-04-27-release_1_21/","/blog/2021-06-21-comunica_association_bounties/","/blog/2021-08-30-release_1_22/","/blog/2021-11-08-comunica_association_members/","/blog/2022-03-03-release_2_0/","/blog/2022-06-29-release_2_3/","/blog/2022-07-14-association_launch/","/blog/2022-08-24-release_2_4/","/blog/2022-11-09-release_2_5/","/blog/2023-05-24-release_2_7/","/blog/2023-07-04-release_2_8/","/blog/2024-03-19-release_3_0/","/blog/2024-05-11-release_3_1/","/blog/2024-07-05-release_3_2/","/contribute/","/docs/","/docs/query/","/docs/query/getting_started/","/docs/query/getting_started/query_cli/","/docs/query/getting_started/update_cli/","/docs/query/getting_started/query_cli_file/","/docs/query/getting_started/query_app/","/docs/query/getting_started/update_app/","/docs/query/getting_started/query_browser_app/","/docs/query/getting_started/query_docker/","/docs/query/getting_started/setup_endpoint/","/docs/query/getting_started/setup_web_client/","/docs/query/getting_started/query_dev_version/","/docs/query/usage/","/docs/query/faq/","/docs/query/advanced/","/docs/query/advanced/basic_auth/","/docs/query/advanced/bindings/","/docs/query/advanced/caching/","/docs/query/advanced/context/","/docs/query/advanced/destination_types/","/docs/query/advanced/explain/","/docs/query/advanced/extension_functions/","/docs/query/advanced/federation/","/docs/query/advanced/graphql_ld/","/docs/query/advanced/hdt/","/docs/query/advanced/logging/","/docs/query/advanced/memento/","/docs/quer
y/advanced/proxying/","/docs/query/advanced/rdfjs/","/docs/query/advanced/rdfjs_querying/","/docs/query/advanced/rdfjs_updating/","/docs/query/advanced/result_formats/","/docs/query/advanced/solid/","/docs/query/advanced/source_types/","/docs/query/advanced/sparql_query_types/","/docs/query/advanced/specifications/","/docs/modify/","/docs/modify/getting_started/","/docs/modify/getting_started/custom_config_cli/","/docs/modify/getting_started/custom_config_app/","/docs/modify/getting_started/custom_init/","/docs/modify/getting_started/custom_web_client/","/docs/modify/getting_started/contribute_actor/","/docs/modify/getting_started/actor_parameter/","/docs/modify/extensions/","/docs/modify/faq/","/docs/modify/advanced/","/docs/modify/advanced/actor_patterns/","/docs/modify/advanced/algebra/","/docs/modify/advanced/architecture_core/","/docs/modify/advanced/architecture_sparql/","/docs/modify/advanced/browser_builds/","/docs/modify/advanced/buses/","/docs/modify/advanced/componentsjs/","/docs/modify/advanced/custom_cli_arguments/","/docs/modify/advanced/expression-evaluator/","/docs/modify/advanced/hypermedia/","/docs/modify/advanced/joins/","/docs/modify/advanced/linking_local_version/","/docs/modify/advanced/logging/","/docs/modify/advanced/mediators/","/docs/modify/advanced/metadata/","/docs/modify/advanced/observers/","/docs/modify/advanced/query_operation_result_types/","/docs/modify/advanced/rdf_parsing_serializing/","/docs/modify/advanced/sparqlee/","/docs/modify/advanced/testing/","/docs/modify/benchmarking/","/events/","/events/2019-06-03-eswc/","/events/2019-10-26-iswc/","/events/2022-09-07-association_launch/","/events/2022-09-13-semantics_conference/","/logos/","/research/","/research/amf/","/research/link_traversal/","/research/versioning/","/roadmap/"],"mattersData":{"/about/":{"title":"About","description":"Learn more about Comunica."},"/ask/":{"title":"Ask","description":"Ask questions about Comunica."},"/association/":{"title":"Comunica 
Association","description":"Organization for ensuring the maintenance and development of the Comunica"},"/association/board/":{"title":"Board of Directors","description":"The board makes decisions regarding the Comunica Association"},"/association/bounty_process/":{"title":"Bounty Procedures","description":"The process for handling bounties on issues"},"/blog/":{"title":"Blog","description":"Blog posts, containing announcements or other news.","blog_index":true},"/blog/2020-08-19-intro/":{"title":"A New Website for Comunica","excerpt":"\nWe're happy to present a brand new website for Comunica! 🎉\n_Don't know what Comunica is? [Read about it here](/about/)._\n\nThis new version contains all **basic information** around Comunica.\nAdditionally, it contains **guides** on how to [query with Comunica](/docs/query/),\nand how to [modify or extend it](/docs/modify/). \n\n"},"/blog/2020-08-24-release_1_16/":{"title":"Release 1.16.0: Full spec compliance, property paths, CSV/TSV, basic auth, and fixes","excerpt":"\nWith the latest release of Comunica, we have achieved the major milestone of **full compliance with the SPARQL 1.1 specification**.\nWhile Comunica has had support for all SPARQL 1.1 operators for a while,\nsome small parts were not always fully handled according to the spec,\nand property paths were not fully supported.\n\nThanks to the help of several students over the summer, these issues have been resolved,\nand all tests from [the SPARQL 1.1 test suite](https://w3c.github.io/rdf-tests/sparql11/) now pass.\n\n"},"/blog/2020-09-25-release_1_17/":{"title":"Hacktoberfest and Release 1.17.0","excerpt":"\nIn this post, we give an overview of\ncontribution possibilities during [Hacktoberfest](https://hacktoberfest.digitalocean.com/),\nand the newly released 1.17.0 version. 
\n\n"},"/blog/2020-11-02-release_1_18/":{"title":"Release 1.18.0: Smaller Web bundles and Microdata parsing","excerpt":"\nThis post gives a brief overview of the new 1.18.0 release.\n\n"},"/blog/2021-01-18-release_1_19/":{"title":"Release 1.19.0: Simplifications for extensions","excerpt":"\nThe 1.19.0 release focuses on simplifications for developing Comunica extensions.\nIt contains no significant fixes or changes for end-users.\n\n"},"/blog/2021-03-30-release_1_20/":{"title":"Release 1.20.0: SPARQL Update support","excerpt":"\nWith this new 1.20.0 release, we bring support for [SPARQL Update](https://www.w3.org/TR/sparql11-update/) queries to Comunica.\nNext to this, several enhancements were made to improve developer experience,\nalong with minor new features and important bug fixes.\n\n"},"/blog/2021-04-27-release_1_21/":{"title":"Release 1.21.0: Hypermedia-based SPARQL Updating","excerpt":"\nThe 1.21.0 version is a smaller release\nthat mainly introduces the necessary wiring to enable hypermedia-driven SPARQL update querying,\nwhich lays the foundations for highly flexible updating of heterogeneous destinations, such as Solid data pods.\n\nIn other words, this provides the necessary ✨_magic_✨ for updating many different types of things. 
\n\n"},"/blog/2021-06-21-comunica_association_bounties/":{"title":"Announcing the Comunica Association, and a Bounty Program","excerpt":"\nIn this post, we announce the creation of the [Comunica Association](/association/),\nand the introduction of a new bounty system using which **organizations** and companies\ncan **fund development** of new features and the fixing of bugs,\nand through which **developers** can take up these bounties and **get paid**.\n\n"},"/blog/2021-08-30-release_1_22/":{"title":"Release 1.22.0: Improved update support, extension functions, and improved CLI handling","excerpt":"\nThe 1.22.0 version features some major additions, and a bunch of smaller internal fixes and performance improvements 🚀!\nThe primary changes that are discussed in this post are\nsupport for more SPARQL update destination types,\nSPARQL extension functions,\nand rewritten CLI handling.\n\n"},"/blog/2021-11-08-comunica_association_members/":{"title":"Comunica Association Memberships","excerpt":"\n[Earlier this year](/blog/2021-06-21-comunica_association_bounties/),\nwe announced the [Comunica Association](/association/),\nwhich is a non-profit organization that aims to make Comunica sustainable in the long term.\nIn this post, we announce the possibility to become a _member_ or _sponsor_ to the association,\nallowing organizations to drive the future roadmap of Comunica.\nWe plan an **official launch in fall 2022**, up until when organizations can choose\nto become a **founding member** of the Comunica Association.\n\n"},"/blog/2022-03-03-release_2_0/":{"title":"Release 2.0.0: A new major release with radical simplifications and performance improvements","excerpt":"\nSince its initial release a couple of years ago, Comunica has grown a lot,\nbut it has always remained fully backwards-compatible with every update.\nHowever, as with every software project, there is sometimes a need to make breaking changes\nso that old mechanisms can be replaced with better, newer 
ones.\nWith this update, we have aggregated several breaking changes into one large update,\nall of which should improve the lives of users one way or another.\nBelow, the primary changes are listed.\n\n"},"/blog/2022-06-29-release_2_3/":{"title":"Release 2.3.0: Better timeout support and minor enhancements","excerpt":"\nIt's been a while since our latest blog post,\nso here's a small announcement on the latest 2.3.0 release.\n\n"},"/blog/2022-07-14-association_launch/":{"title":"Official launch of the Comunica Association","excerpt":"\nAs previously announced, we will be officially launching the Comunica Association during the fall of this year.\nMore concretely, we are organizing an online launch event on the 7th of September,\nand we will be physically present at the Semantics conference in Vienna the week afterwards.\n\n"},"/blog/2022-08-24-release_2_4/":{"title":"Release 2.4.0: Better browser support and performance improvements","excerpt":"\nWe just released a new minor version of Comunica.\nHere's an overview of the main changes.\n\n"},"/blog/2022-11-09-release_2_5/":{"title":"Release 2.5.0: Fixes, string sources, and HTTP error handling","excerpt":"\nWe just released a new small update. 
Here's an overview of the main changes.\n\n"},"/blog/2023-05-24-release_2_7/":{"title":"Release 2.7.0: Better date support, better performance over SPARQL endpoints, and internal fixes","excerpt":"\nToday, we released a new minor update, which brings exciting new features, performance improvements, and bug fixes.\nBelow, you can find an overview of the main changes.\n\n"},"/blog/2023-07-04-release_2_8/":{"title":"Release 2.8.0: Support for quoted triples (RDF-star and SPARQL-star)","excerpt":"\nThis minor release focuses on a single but significant new feature: support for quoted triples.\n\n"},"/blog/2024-03-19-release_3_0/":{"title":"Release 3.0: 🔥 Blazingly fast federation over heterogeneous sources","excerpt":"\nMore than 2 years ago, we released [Comunica version 2.0](/blog/2022-03-03-release_2_0/),\nwhich featured many internal and external API changes that significantly simplified its usage.\nToday, we release version 3.0, which focuses more on internal changes, with limited changes to the external API.\nMost of the changes relate to the handling of data sources during query planning,\nwhich allows **more efficient query plans to be produced when querying over federations of heterogeneous sources**.\nThis means that for people using Comunica, the number of breaking changes in this update is very limited.\nThings will simply be faster in general, and some small convenience features have been added,\nsuch as results being [async iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols). 
\nFor developers extending Comunica with custom actors, there will be some larger breaking changes.\n\n"},"/blog/2024-05-11-release_3_1/":{"title":"Release 3.1: 🌱 New package with tiny bundle size","excerpt":"\nThe primary addition in this release is the new [`@comunica/query-sparql-rdfjs-lite`](https://www.npmjs.com/package/@comunica/query-sparql-rdfjs-lite) package,\nwhich is optimized for small browser bundle size.\nCurrently, the minified size of this package is 648.88 KB (145.79 KB when gzipped).\nThis is about as small as you can get without removing required functionality from the SPARQL 1.1 spec.\nBut if you don't need everything from SPARQL 1.1, it could get even smaller!\n\n"},"/blog/2024-07-05-release_3_2/":{"title":"Release 3.2: 🔎 Knowing what to optimize","excerpt":"\nFor this release, we mainly focused on improving tooling to more easily track down performance issues.\nConcretely, we improved our query explain output,\nstarted running multiple benchmarks in our CI to avoid performance regressions,\nand applied several performance improvements that were identified following these changes.\n\n"},"/contribute/":{"title":"Contribute","description":"Contribute to the development of Comunica."},"/docs/":{"title":"Documentation","description":"Overview of all Comunica documentation.","index":true},"/docs/query/":{"title":"Query with Comunica","description":"Learn how to execute queries in different environments, 
such as live in the browser, in JavaScript applications, or the CLI.","index":true},"/docs/query/getting_started/":{"title":"Getting started with querying","description":"Basic guides on how to easily get started with querying.","index":true},"/docs/query/getting_started/query_cli/":{"title":"Querying from the command line","description":"Execute SPARQL queries directly from the command line."},"/docs/query/getting_started/update_cli/":{"title":"Updating from the command line","description":"Execute SPARQL Update queries directly from the command line."},"/docs/query/getting_started/query_cli_file/":{"title":"Querying local files from the command line","description":"Execute SPARQL queries over local RDF files directly from the command line."},"/docs/query/getting_started/query_app/":{"title":"Querying in a JavaScript app","description":"Execute SPARQL queries from within your application using the JavaScript API."},"/docs/query/getting_started/update_app/":{"title":"Updating in a JavaScript app","description":"Execute SPARQL Update queries from within your application using the JavaScript API."},"/docs/query/getting_started/query_browser_app/":{"title":"Querying in a JavaScript browser app","description":"Execute SPARQL queries from within your client-side browser application using the JavaScript API."},"/docs/query/getting_started/query_docker/":{"title":"Querying from a Docker container","description":"Execute SPARQL queries within a Docker container."},"/docs/query/getting_started/setup_endpoint/":{"title":"Setting up a SPARQL endpoint","description":"Allow querying over HTTP via the SPARQL protocol"},"/docs/query/getting_started/setup_web_client/":{"title":"Setting up a Web client","description":"Set up a user-friendly static Web page where SPARQL queries can be executed client-side"},"/docs/query/getting_started/query_dev_version/":{"title":"Query using the latest development version","description":"If you want to make use of the latest changes that are not 
released yet"},"/docs/query/usage/":{"title":"Usage showcase","description":"Examples of where Comunica is used."},"/docs/query/faq/":{"title":"Querying FAQ","description":"Frequently asked questions about using Comunica."},"/docs/query/advanced/":{"title":"Advanced querying","description":"Advanced guides on how to get the most out of Comunica.","index":true},"/docs/query/advanced/basic_auth/":{"title":"HTTP Basic Authentication","description":"Send authenticated HTTP requests by including username and password."},"/docs/query/advanced/bindings/":{"title":"Bindings","description":"Bindings objects are used to represent results of SPARQL SELECT queries"},"/docs/query/advanced/caching/":{"title":"Caching","description":"When remote sources are requested, caching allows them to be reused in the future."},"/docs/query/advanced/context/":{"title":"Passing a context","description":"A context can be passed to a query engine to tweak its runtime settings."},"/docs/query/advanced/destination_types/":{"title":"Destination types","description":"Comunica detects and handles different types of destinations."},"/docs/query/advanced/explain/":{"title":"Explain","description":"Display information about the logical and physical query plan"},"/docs/query/advanced/extension_functions/":{"title":"Extension Functions","description":"Providing implementations for SPARQL extension functions."},"/docs/query/advanced/federation/":{"title":"Federated Querying","description":"Query over the union of data within any number of sources"},"/docs/query/advanced/graphql_ld/":{"title":"GraphQL-LD","description":"Using the power of JSON-LD contexts, GraphQL queries can be executed by Comunica"},"/docs/query/advanced/hdt/":{"title":"HDT","description":"HDT offers highly compressed immutable RDF storage."},"/docs/query/advanced/logging/":{"title":"Logging","description":"Loggers can be set to different logging levels to inspect what Comunica is doing behind the 
scenes."},"/docs/query/advanced/memento/":{"title":"Memento","description":"Using the Memento protocol, time travel queries can be executed."},"/docs/query/advanced/proxying/":{"title":"HTTP Proxy","description":"All HTTP requests can optionally go through a proxy."},"/docs/query/advanced/rdfjs/":{"title":"RDF/JS","description":"To achieve maximum interoperability between different JavaScript libraries, Comunica builds on top of the RDF/JS specifications."},"/docs/query/advanced/rdfjs_querying/":{"title":"Querying over RDF/JS sources","description":"If the built-in source types are not sufficient, you can pass a custom JavaScript object implementing a specific interface."},"/docs/query/advanced/rdfjs_updating/":{"title":"Updating RDF/JS stores","description":"If the built-in destination types are not sufficient, you can pass a custom JavaScript object implementing a specific interface."},"/docs/query/advanced/result_formats/":{"title":"Result formats","description":"Query results can be serialized in different formats."},"/docs/query/advanced/solid/":{"title":"Solid","description":"Solid – the Web-based decentralization ecosystem – can be queried with Comunica."},"/docs/query/advanced/source_types/":{"title":"Source types","description":"Comunica detects and handles different types of sources."},"/docs/query/advanced/sparql_query_types/":{"title":"SPARQL query types","description":"Different SPARQL query types are possible, such as SELECT, CONSTRUCT, ASK, ..."},"/docs/query/advanced/specifications/":{"title":"Supported specifications","description":"Comunica supports several RDF-related specifications"},"/docs/modify/":{"title":"Modify Comunica","description":"Learn how to configure your own Comunica engine, or extend Comunica by implementing new components.","index":true},"/docs/modify/getting_started/":{"title":"Getting started with modification","description":"Basic guides on how to easily get started with Comunica 
modification.","index":true},"/docs/modify/getting_started/custom_config_cli/":{"title":"Querying with a custom configuration from the command line","description":"Create a custom configuration of Comunica modules with reduced features, and query with it from the command line."},"/docs/modify/getting_started/custom_config_app/":{"title":"Querying with a custom configuration in a JavaScript app","description":"Create a custom configuration of Comunica modules with changed features, and query with it from within your application using the JavaScript API."},"/docs/modify/getting_started/custom_init/":{"title":"Exposing your custom config as an npm package","description":"Wrap your config in an npm package, and expose a CLI tool and a JavaScript API."},"/docs/modify/getting_started/custom_web_client/":{"title":"Exposing your custom config in a Web client","description":"Demonstrate your query engine as a static Web page."},"/docs/modify/getting_started/contribute_actor/":{"title":"Contributing a new query operation actor to the Comunica repository","description":"Setup a development environment, implement a new actor, and create a pull request."},"/docs/modify/getting_started/actor_parameter/":{"title":"Adding a config parameter to an actor","description":"For an existing actor, add a parameter that can be customized in the config file."},"/docs/modify/extensions/":{"title":"Extensions","description":"Existing extensions of Comunica."},"/docs/modify/faq/":{"title":"Modify FAQ","description":"Frequently asked question about Comunica modification."},"/docs/modify/advanced/":{"title":"Advanced modification","description":"Advanced guides on how to get the most out of Comunica modification.","index":true},"/docs/modify/advanced/actor_patterns/":{"title":"Actor Patterns","description":"Overview of common design patterns for actors"},"/docs/modify/advanced/algebra/":{"title":"Algebra","description":"The internal representation of queries during query 
execution."},"/docs/modify/advanced/architecture_core/":{"title":"Core Architecture","description":"The low-level software architecture of Comunica for achieving modularity."},"/docs/modify/advanced/architecture_sparql/":{"title":"SPARQL Architecture","description":"The high-level software architecture of Comunica for implementing SPARQL."},"/docs/modify/advanced/browser_builds/":{"title":"Browser builds","description":"All modules in Comunica can be built for the browser."},"/docs/modify/advanced/buses/":{"title":"Buses and Actors","description":"An overview of all buses in Comunica and their actors."},"/docs/modify/advanced/componentsjs/":{"title":"Components.js","description":"Components.js is the dependency injection framework that Comunica uses to wire components via config files."},"/docs/modify/advanced/custom_cli_arguments/":{"title":"Custom CLI arguments","description":"Adding custom arguments to CLI tools"},"/docs/modify/advanced/expression-evaluator/":{"title":"Expression Evaluator","description":"The expression evaluation engine of Comunica."},"/docs/modify/advanced/hypermedia/":{"title":"Hypermedia","description":"Discovery of data source capabilities during query execution."},"/docs/modify/advanced/joins/":{"title":"Joins","description":"Overview of how join operations are handled during query planning"},"/docs/modify/advanced/linking_local_version/":{"title":"Linking local Comunica versions to other projects","description":"Guide on how to use a local development version of Comunica with another local project"},"/docs/modify/advanced/logging/":{"title":"Logging","description":"How to log messages from within actors."},"/docs/modify/advanced/mediators/":{"title":"Mediators","description":"An overview of all mediators in Comunica."},"/docs/modify/advanced/metadata/":{"title":"Metadata","description":"Information for adaptive planning of query operations."},"/docs/modify/advanced/observers/":{"title":"Observers","description":"Passively observe actions 
executed by actors on a given bus."},"/docs/modify/advanced/query_operation_result_types/":{"title":"Query operation result types","description":"An overview of the different output types for query operations."},"/docs/modify/advanced/rdf_parsing_serializing/":{"title":"RDF Parsing and Serializing","description":"Basic concepts behind parsing and serializing RDF."},"/docs/modify/advanced/sparqlee/":{"title":"Sparqlee","description":"The SPARQL expression evaluation engine of Comunica. (DEPRECATED)"},"/docs/modify/advanced/testing/":{"title":"Testing","description":"The unit and integration tests that lead to a more stable codebase."},"/docs/modify/benchmarking/":{"title":"Benchmarking","description":"Guidelines on running experiments with Comunica."},"/events/":{"title":"Events","description":"Overview of all Comunica-related events.","index":true,"reverse":true},"/events/2019-06-03-eswc/":{"title":"2019-06-03: Tutorial at ESWC 2019","description":"Comunica tutorial at the ESWC 2019 conference"},"/events/2019-10-26-iswc/":{"title":"2019-10-26: Tutorial at ISWC 2019","description":"Comunica and Solid tutorial at the ISWC 2019 conference"},"/events/2022-09-07-association_launch/":{"title":"2022-09-07: Comunica Association Launch","description":"An online event for the official launch of the Comunica Association"},"/events/2022-09-13-semantics_conference/":{"title":"2022-09-13/15: Semantics Conference","description":"The Comunica Association will have a booth and talk at the Semantics Conference in Vienna"},"/logos/":{"title":"Logos","description":"Free-to-use logos of Comunica."},"/research/":{"title":"Research","description":"An overview of the research surrounding Comunica."},"/research/amf/":{"title":"Approximate Membership Functions","description":"An overview of research that has been done on AMFs during query execution."},"/research/link_traversal/":{"title":"Link Traversal","description":"An overview of research that has been done on Link-Traversal-based 
Query Processing."},"/research/versioning/":{"title":"Versioning","description":"An overview of research that has been done on Query Processing for RDF archives."},"/roadmap/":{"title":"Roadmap","description":"The long-term goals of Comunica"}}},"__N_SSG":true}
\ No newline at end of file
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-08-19-intro.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-08-19-intro.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-08-19-intro.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-08-19-intro.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-08-24-release_1_16.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-08-24-release_1_16.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-08-24-release_1_16.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-08-24-release_1_16.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-09-25-release_1_17.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-09-25-release_1_17.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-09-25-release_1_17.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-09-25-release_1_17.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-11-02-release_1_18.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-11-02-release_1_18.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2020-11-02-release_1_18.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2020-11-02-release_1_18.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-01-18-release_1_19.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-01-18-release_1_19.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-01-18-release_1_19.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-01-18-release_1_19.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-03-30-release_1_20.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-03-30-release_1_20.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-03-30-release_1_20.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-03-30-release_1_20.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-04-27-release_1_21.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-04-27-release_1_21.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-04-27-release_1_21.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-04-27-release_1_21.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-06-21-comunica_association_bounties.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-06-21-comunica_association_bounties.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-06-21-comunica_association_bounties.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-06-21-comunica_association_bounties.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-08-30-release_1_22.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-08-30-release_1_22.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-08-30-release_1_22.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-08-30-release_1_22.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-11-08-comunica_association_members.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-11-08-comunica_association_members.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2021-11-08-comunica_association_members.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2021-11-08-comunica_association_members.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-03-03-release_2_0.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-03-03-release_2_0.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-03-03-release_2_0.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-03-03-release_2_0.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-06-29-release_2_3.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-06-29-release_2_3.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-06-29-release_2_3.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-06-29-release_2_3.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-07-14-association_launch.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-07-14-association_launch.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-07-14-association_launch.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-07-14-association_launch.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-08-24-release_2_4.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-08-24-release_2_4.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-08-24-release_2_4.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-08-24-release_2_4.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-11-09-release_2_5.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-11-09-release_2_5.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2022-11-09-release_2_5.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2022-11-09-release_2_5.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2023-05-24-release_2_7.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2023-05-24-release_2_7.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2023-05-24-release_2_7.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2023-05-24-release_2_7.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2023-07-04-release_2_8.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2023-07-04-release_2_8.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2023-07-04-release_2_8.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2023-07-04-release_2_8.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2024-03-19-release_3_0.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2024-03-19-release_3_0.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2024-03-19-release_3_0.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2024-03-19-release_3_0.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2024-05-11-release_3_1.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2024-05-11-release_3_1.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2024-05-11-release_3_1.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2024-05-11-release_3_1.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2024-07-05-release_3_2.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2024-07-05-release_3_2.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/blog/2024-07-05-release_3_2.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/blog/2024-07-05-release_3_2.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/contribute.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/contribute.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/contribute.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/contribute.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/actor_patterns.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/actor_patterns.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/actor_patterns.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/actor_patterns.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/algebra.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/algebra.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/algebra.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/algebra.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/architecture_core.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/architecture_core.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/architecture_core.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/architecture_core.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/architecture_sparql.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/architecture_sparql.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/architecture_sparql.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/architecture_sparql.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/browser_builds.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/browser_builds.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/browser_builds.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/browser_builds.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/buses.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/buses.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/buses.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/buses.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/componentsjs.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/componentsjs.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/componentsjs.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/componentsjs.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/custom_cli_arguments.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/custom_cli_arguments.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/custom_cli_arguments.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/custom_cli_arguments.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/expression-evaluator.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/expression-evaluator.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/expression-evaluator.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/expression-evaluator.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/hypermedia.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/hypermedia.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/hypermedia.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/hypermedia.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/joins.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/joins.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/joins.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/joins.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/linking_local_version.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/linking_local_version.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/linking_local_version.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/linking_local_version.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/logging.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/logging.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/logging.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/logging.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/mediators.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/mediators.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/mediators.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/mediators.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/metadata.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/metadata.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/metadata.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/metadata.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/observers.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/observers.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/observers.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/observers.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/query_operation_result_types.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/query_operation_result_types.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/query_operation_result_types.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/query_operation_result_types.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/rdf_parsing_serializing.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/rdf_parsing_serializing.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/rdf_parsing_serializing.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/rdf_parsing_serializing.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/sparqlee.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/sparqlee.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/sparqlee.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/sparqlee.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/testing.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/testing.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/advanced/testing.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/advanced/testing.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/benchmarking.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/benchmarking.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/benchmarking.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/benchmarking.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/extensions.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/extensions.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/extensions.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/extensions.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/faq.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/faq.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/faq.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/faq.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/actor_parameter.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/actor_parameter.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/actor_parameter.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/actor_parameter.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/contribute_actor.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/contribute_actor.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/contribute_actor.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/contribute_actor.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_config_app.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_config_app.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_config_app.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_config_app.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_config_cli.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_config_cli.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_config_cli.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_config_cli.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_init.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_init.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_init.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_init.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_web_client.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_web_client.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/modify/getting_started/custom_web_client.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/modify/getting_started/custom_web_client.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/basic_auth.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/basic_auth.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/basic_auth.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/basic_auth.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/bindings.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/bindings.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/bindings.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/bindings.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/caching.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/caching.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/caching.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/caching.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/context.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/context.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/context.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/context.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/destination_types.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/destination_types.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/destination_types.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/destination_types.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/explain.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/explain.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/explain.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/explain.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/extension_functions.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/extension_functions.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/extension_functions.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/extension_functions.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/federation.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/federation.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/federation.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/federation.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/graphql_ld.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/graphql_ld.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/graphql_ld.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/graphql_ld.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/hdt.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/hdt.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/hdt.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/hdt.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/logging.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/logging.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/logging.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/logging.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/memento.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/memento.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/memento.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/memento.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/proxying.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/proxying.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/proxying.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/proxying.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/rdfjs.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/rdfjs.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/rdfjs.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/rdfjs.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/rdfjs_querying.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/rdfjs_querying.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/rdfjs_querying.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/rdfjs_querying.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/rdfjs_updating.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/rdfjs_updating.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/rdfjs_updating.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/rdfjs_updating.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/result_formats.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/result_formats.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/result_formats.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/result_formats.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/solid.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/solid.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/solid.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/solid.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/source_types.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/source_types.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/source_types.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/source_types.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/sparql_query_types.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/sparql_query_types.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/sparql_query_types.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/sparql_query_types.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/specifications.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/specifications.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/advanced/specifications.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/advanced/specifications.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/faq.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/faq.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/faq.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/faq.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_app.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_app.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_app.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_app.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_browser_app.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_browser_app.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_browser_app.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_browser_app.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_cli.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_cli.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_cli.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_cli.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_cli_file.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_cli_file.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_cli_file.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_cli_file.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_dev_version.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_dev_version.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_dev_version.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_dev_version.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_docker.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_docker.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/query_docker.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/query_docker.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/setup_endpoint.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/setup_endpoint.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/setup_endpoint.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/setup_endpoint.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/setup_web_client.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/setup_web_client.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/setup_web_client.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/setup_web_client.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/update_app.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/update_app.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/update_app.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/update_app.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/update_cli.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/update_cli.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/getting_started/update_cli.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/getting_started/update_cli.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/usage.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/usage.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/docs/query/usage.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/docs/query/usage.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2019-06-03-eswc.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2019-06-03-eswc.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2019-06-03-eswc.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2019-06-03-eswc.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2019-10-26-iswc.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2019-10-26-iswc.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2019-10-26-iswc.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2019-10-26-iswc.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2022-09-07-association_launch.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2022-09-07-association_launch.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2022-09-07-association_launch.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2022-09-07-association_launch.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2022-09-13-semantics_conference.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2022-09-13-semantics_conference.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/events/2022-09-13-semantics_conference.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/events/2022-09-13-semantics_conference.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/logos.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/logos.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/logos.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/logos.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research/amf.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research/amf.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research/amf.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research/amf.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research/link_traversal.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research/link_traversal.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research/link_traversal.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research/link_traversal.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research/versioning.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research/versioning.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/research/versioning.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/research/versioning.json
diff --git a/_next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/roadmap.json b/_next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/roadmap.json
similarity index 100%
rename from _next/data/git-sha-4b1fc6e10c3a27d59f77ff6a0de5c0651ba4ad4c/roadmap.json
rename to _next/data/git-sha-89f60b82ca518e8859555b713791b6acf2433ff1/roadmap.json
diff --git a/_next/static/chunks/pages/[...slug]-1954c3051d754a28.js b/_next/static/chunks/pages/[...slug]-1954c3051d754a28.js
new file mode 100644
index 00000000..367865d9
--- /dev/null
+++ b/_next/static/chunks/pages/[...slug]-1954c3051d754a28.js
@@ -0,0 +1 @@
+(self.webpackChunk_N_E=self.webpackChunk_N_E||[]).push([[330],{70010:function(e,n,t){(window.__NEXT_P=window.__NEXT_P||[]).push(["/[...slug]",function(){return t(69559)}])},91903:function(e,n,t){"use strict";var a=t(85893),o=t(9008),i=t.n(o);n.Z=e=>{let{title:n,description:t}=e;return(0,a.jsxs)(i(),{children:[(0,a.jsxs)("title",{children:["Comunica – ",n]}),(0,a.jsx)("link",{rel:"icon",href:"/favicon.ico"}),(0,a.jsx)("link",{rel:"foaf:primaryTopic",href:"/#software"}),(0,a.jsx)("link",{rel:"foaf:maker",href:"https://www.rubensworks.net/#me"}),(0,a.jsx)("link",{rel:"alternate",type:"application/rss+xml",title:"Comunica – Blog",href:"/rss-feed.xml"}),(0,a.jsx)("meta",{property:"og:image",content:"/img/comunica_red.svg"}),(0,a.jsx)("meta",{property:"og:title",content:"Comunica – ".concat(n)}),(0,a.jsx)("meta",{property:"og:description",content:"".concat(t.replace(/\n/g," "))}),(0,a.jsx)("meta",{property:"og:url",content:"/"}),(0,a.jsx)("meta",{property:"og:locale",content:"en_US"}),(0,a.jsx)("meta",{property:"og:site_name",content:"Comunica – ".concat(n)}),(0,a.jsx)("meta",{property:"og:type",content:"website"}),(0,a.jsx)("meta",{name:"twitter:site",content:"@comunicajs"}),(0,a.jsx)("meta",{name:"twitter:card",content:"summary"}),(0,a.jsx)("meta",{name:"twitter:title",content:"Comunica – ".concat(n)}),(0,a.jsx)("meta",{name:"twitter:description",content:"".concat(t.replace(/\n/g," "))}),(0,a.jsx)("meta",{name:"twitter:image",content:"https://comunica.dev/img/comunica_red.png"})]})}},57533:function(e,n,t){"use strict";t.d(n,{Z:function(){return p}});var a=t(85893),o=t(53951),i=t(38456),s=t.n(i),r=t(67294),c=t(10043),u=t.n(c),d=t(76388),l=t.n(d);function p(e){let{body:n}=e;return(0,a.jsx)(s(),{rehypePlugins:[l()],plugins:[u()],children:n,components:{code:m,h1:h,h2:h,h3:h,h4:h,h5:h,h6:h}})}t(1667);let m=e=>e.inline?(0,a.jsx)("code",{children:e.children}):(0,a.jsx)(o.default,{className:e.className,children:e.children}),h=e=>{let 
n=r.Children.toArray(e.children),t=n.reduce(g,""),a=t.toLowerCase().replace(/\W/g,"-");return r.createElement("h"+e.level,{id:a},e.children)};function g(e,n){return"string"==typeof n?e+n:r.Children.toArray(n.props.children).reduce(g,e)}},69559:function(e,n,t){"use strict";t.r(n),t.d(n,{__N_SSG:function(){return p},default:function(){return m},getStaticData:function(){return h}});var a=t(85893),o=t(91903),i=t(9675),s=t.n(i);function r(e){let{path:n,paths:t,mattersData:o,reverse:i}=e,s=t.filter(e=>e.startsWith(n)&&e!==n+"/").map(e=>e.slice(n.length+1,e.length)).filter(e=>(e.match(/\//g)||[]).length<=2).map(e=>({path:e,title:o[n+"/"+e].title,description:o[n+"/"+e].description,indent:(e.match(/\//g)||[]).length-1})).map(e=>(0,a.jsxs)("a",{href:e.path,className:"index-entry indent-"+e.indent,children:[(0,a.jsx)("h3",{children:e.title}),(0,a.jsx)("p",{children:e.description})]},e.path));return i&&(s=s.reverse()),(0,a.jsx)("div",{className:"index",children:s})}var c=t(57533);function u(e){let{path:n,paths:t,mattersData:o}=e,i=t.filter(e=>e.startsWith(n)&&e!==n+"/").map(e=>e.slice(n.length+1,e.length)).filter(e=>1===(e.match(/\//g)||[]).length).reverse().map(e=>{let[t,a,i,s]=/^([0-9][0-9][0-9][0-9])-([0-9][0-9])-([0-9][0-9])-/.exec(e);return{path:e,date:"".concat(new Date("".concat(a,"-").concat(i,"-").concat(s)).toLocaleDateString("en-US",{weekday:"long",year:"numeric",month:"long",day:"numeric"})),title:o[n+"/"+e].title,excerpt:o[n+"/"+e].excerpt}}).map(e=>(0,a.jsxs)("a",{href:e.path,className:"blog-entry",children:[(0,a.jsx)("h3",{children:e.title}),(0,a.jsx)("p",{className:"date",children:e.date}),(0,a.jsxs)("div",{className:"excerpt",children:[(0,a.jsx)(c.Z,{body:e.excerpt}),(0,a.jsx)("p",{className:"read-more",children:"Read more..."})]})]},e.path));return(0,a.jsx)("div",{className:"index",children:i})}function 
d(e){let{frontmatter:n,path:t,paths:o,mattersData:i}=e,s=o.filter(e=>t.startsWith(e)).map(e=>({path:e,title:i[e].title})).map(e=>(0,a.jsx)("li",{children:(0,a.jsx)("a",{href:e.path,children:e.title})},e.path));return s.length>0&&s.push((0,a.jsx)("li",{children:n.title},"_")),(0,a.jsx)("ul",{className:"breadcrumbs",children:s})}var l=t(67294),p=!0;class m extends l.Component{render(){let{frontmatter:e,body:n,path:t,paths:i,mattersData:s,excerpt:l}=this.props,p="";if(t.startsWith("/blog/")){let[e,n,o,i]=/^\/blog\/([0-9][0-9][0-9][0-9])-([0-9][0-9])-([0-9][0-9])-/.exec(t),s=new Date("".concat(o," ").concat(i," ").concat(n)).toLocaleDateString("en-US",{weekday:"long",year:"numeric",month:"long",day:"numeric"});p=(0,a.jsx)("p",{className:"date",children:s})}return(0,a.jsxs)("div",{className:"container-page",children:[(0,a.jsx)(o.Z,{title:e.title,description:l||e.description}),(0,a.jsxs)("main",{children:[(0,a.jsx)(d,{frontmatter:e,path:t,paths:i,mattersData:s}),(0,a.jsx)("h1",{children:e.title}),p,(0,a.jsx)("hr",{}),e.wip&&(0,a.jsxs)("div",{className:"wip",children:[(0,a.jsx)("h2",{children:"\uD83D\uDEA7 Under construction \uD83D\uDEA7️"}),(0,a.jsxs)("p",{children:["This section still needs to be created \uD83D\uDD28.",(0,a.jsx)("br",{}),"In the meantime, you can read our ",(0,a.jsx)("a",{href:"https://comunica.readthedocs.io/en/latest/",children:"old documentation"})," and check our ",(0,a.jsx)("a",{href:"https://github.com/comunica?utf8=%E2%9C%93&q=topic%3Atutorial&type=&language=",children:"tutorials"}),"."]}),(0,a.jsx)("p",{children:(0,a.jsx)("a",{href:"/contribute/",children:"You can contribute by helping to write guides like this."})})]}),(0,a.jsxs)("div",{className:"headers-overview",children:[(0,a.jsx)("p",{children:"On this 
page"}),(0,a.jsx)("ol",{className:"headers-overview-elements"})]}),(0,a.jsx)(c.Z,{body:n}),e.index&&(0,a.jsx)(r,{path:t,paths:i,mattersData:s,reverse:e.reverse}),e.blog_index&&(0,a.jsx)(u,{path:t,paths:i,mattersData:s})]})]})}componentDidMount(){let e=document.querySelector(".headers-overview-elements"),n=document.querySelector(".container-page"),t=n.querySelectorAll("h2");for(let n of t){let t=document.createElement("li"),a=document.createElement("a");a.textContent=n.innerText,a.setAttribute("href","#"+n.id),a.setAttribute("class","headers-overview-element"),t.appendChild(a),e.appendChild(t)}function a(){let n=document.querySelectorAll("a.headers-overview-element");for(let e=0;e0&&(e.parentNode.style.display="block"),window.addEventListener("load",a),window.addEventListener("scroll",a)}}async function h(){let e=(e=>{let n=e.keys(),t=n.map((e,n)=>e.slice(1,-3)+"/");return t})(t(78049)),n=e.map(e=>{let n;for(;n=/\/[0-9]*_/.exec(e);)e=e.replace(n,"/");return e}),a=(await Promise.all(e.map(e=>t(59176)(".".concat(e.slice(0,-1),".md"))))).map(e=>s()(e.default,{excerpt_separator:""})).reduce((e,t,a)=>(e[n[a]]=t,e),{});return{paths:n,matters:a,fallback:!1}}},73823:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'About'\ndescription: 'Learn more about Comunica.'\n---\n\nComunica is a knowledge graph querying framework. \nThis page provides more details about its goals and motivations.\n\nRelated pages:\n* [Roadmap](/roadmap/)\n* [Contribute](/contribute/)\n* [Logos](/logos/)\n\n
\n\n## Flexible querying of Linked Data\n\n[**Linked Data**](https://www.w3.org/standards/semanticweb/data) on the Web exists in **many shapes and forms**.\nLinked Data can be published using plain [RDF](https://www.w3.org/TR/rdf11-concepts/) files\nin various **syntaxes**, such as [JSON-LD](https://json-ld.org/), [Turtle](https://www.w3.org/TR/turtle/), [HTML+RDFa](https://www.w3.org/TR/html-rdfa/), and more.\nNext to that, different forms of **queryable Web interfaces** exist, such as [SPARQL endpoints](https://www.w3.org/TR/sparql11-protocol/) and [Triple Pattern Fragments (TPF) interfaces](https://linkeddatafragments.org/in-depth/#tpf).\nIf we want to **query** Linked Data from the Web, we need to be able to cope with this heterogeneity.\n\n**Comunica** is a **querying framework** that has been designed to handle different types of Linked Data interfaces in a **flexible** manner.\nIts primary goal is _executing [SPARQL](https://www.w3.org/TR/sparql11-query/) queries over one or more interfaces_.\n\n## Comunica is a meta-query engine\n\nComunica should not be seen as a query engine.\nInstead, Comunica is a _meta_ query engine with which query engines can be created.\nIt does this by providing a set of **modules** that can be **wired** together in a flexible manner.\n\nWhile we provide default configurations of Comunica to easily [get started with querying](/docs/query/getting_started/),\nanyone can [configure their own query engine](/docs/modify/getting_started/).\nThis allows Comunica to be fine-tuned to suit your own needs, while avoiding the overhead of modules that are not needed.\n\n## For and on the Web\n\nWe strongly believe in the existence of **open Web standards**, such as those provided by [W3C](https://www.w3.org/) and [WhatWG](https://whatwg.org/).\nAs such, [Comunica **implements** several specifications](/docs/query/advanced/specifications/) such as [RDF](https://www.w3.org/TR/rdf11-concepts/) and\n
[SPARQL](https://www.w3.org/TR/sparql11-query/).\nFurthermore, Comunica is implemented using Web-based technologies in **JavaScript**, which enables usage through browsers,\nthe command line, the SPARQL protocol, or any Web or JavaScript application.\n\n## Open\n\nComunica is an **open-source** software project that is available under the [MIT license](https://github.com/comunica/comunica/blob/master/LICENSE.txt),\nwhich means that it is allowed to be used in both open and commercial projects.\nNext to the source code, our development process is also open, which you can read or contribute to on [GitHub](https://github.com/orgs/comunica/projects),\nor read our [high-level roadmap](/roadmap/).\n\n## Research and Education\n\nComunica is designed as a flexible platform for research on query execution.\nAs such, our goal is to make it sufficiently easy for researchers\nto investigate alternative query algorithms and techniques by [modifying engines](/docs/modify/).\nNext to this, we also aim to educate researchers and developers on [how to use](/docs/) Comunica.\n\n## Linked Data Fragments\n\nOne of the motivations behind Comunica is to be a [**Linked Data Fragments Client**](https://linkeddatafragments.org/concept/).\nLinked Data Fragments is a theoretical framework to analyse different Linked Data interfaces.\n\nWhile software used to exist to query over specific types of Linked Data interfaces,\nit used to be impossible to query over **combinations of different interfaces**.\nComunica solves this need by being independent of specific types of interfaces,\nas support for new interfaces can be plugged in.\n\n## Stability\n\nA primary goal of Comunica is to act as a **stable** querying framework.\nFor this, we spend extra effort in [continuous testing](/docs/modify/advanced/testing/) at different levels.\n\n## Supporting the JavaScript ecosystem\n\nComunica relies on many dependencies to achieve its goals,\nsuch as spec-compliant RDF parsers and\n
serializers.\nWe support these libraries, and contribute to them.\n\n## Who works on Comunica?\n\nFirst and foremost, Comunica is an **open-source** framework.\nThe Comunica project has been initiated by [IDLab](https://www.ugent.be/ea/idlab/en) at Ghent University – imec,\nand is being actively developed and maintained by a variety of [contributors](https://github.com/comunica/comunica/graphs/contributors).\nAll development happens publicly via GitHub [project boards](https://github.com/orgs/comunica/projects), [issues](https://github.com/comunica/comunica/issues), and [pull requests](https://github.com/comunica/comunica/pulls).\nAnyone is welcome to [contribute](/contribute/) to this project.\n\nAs of recently, the [Comunica Association](/association/) has been founded as a non-profit organization\nto make Comunica development more sustainable in the long term.\n"},9518:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Ask'\ndescription: 'Ask questions about Comunica.'\n---\n\nShould you not find the right information on this website,\nwe would be happy to help you out via any of the methods below.\n\nRelated pages:\n* [Roadmap](/roadmap/)\n* [Contribute](/contribute/)\n\n## Questions\n\nThe easiest way to get an answer to small questions is via our [Gitter channel](https://gitter.im/comunica/Lobby).\nThere, we have an active community of Comunica developers, contributors and enthusiasts.\n\nAlternatively, if you want a place to talk about your question (or discussion topic),\nyou can make use of the [discussions tab on GitHub](https://github.com/comunica/comunica/discussions).\n\nIn case you have a more general question related to SPARQL or RDF in JavaScript,\nthe [RDF/JS Gitter channel](https://gitter.im/rdfjs/public) should be of help.\n\n## GitHub issues\n\nIf you experience bugs with Comunica, or if you have suggestions for new features,\nfeel free to report them in our [issue tracker on 
GitHub](https://github.com/comunica/comunica/issues).\n\nPlease take into account that this is an open-source effort,\nso we may not be able to solve all issues, but we do our best!\nShould you be interested in helping out with fixing or implementing any of these issues,\nyou are very welcome to [contribute](/contribute/).\n\n## Twitter\n\nTo stay updated with the latest news on Comunica, find us on [Twitter](https://twitter.com/comunicajs).\n\n## Email\n\nFor any other matters, such as research collaborations or commercial support, you can send an email to [Ruben Taelman](mailto:ruben.taelman@ugent.be).\n"},54841:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Comunica Association\'\ndescription: \'Organization for ensuring the maintenance and development of Comunica\'\n---\n\nThe Comunica Association is a **non-profit organization** for establishing a roadmap,\nand ensuring the maintenance and development of the Comunica framework and its dependencies.\n\n## Members and sponsors\n\nIf your organization is using Comunica, and you want to support its continued maintenance and future development,\nyou may consider [**donating or becoming a sponsor via Open Collective**](https://opencollective.com/comunica-association).\nThis will allow the association to fund core maintainers of Comunica to manage issues and pull requests, and to fund overall development.\nFurthermore, your organization will have the option to prioritize certain issues.\nAnother option is to become a **board member**, which will give your organization access to [board meetings](/association/board/) of the Comunica Association,\nwhich will enable your organization to collaboratively determine the long-term vision and roadmap of Comunica and the Association.\n\n
\n \n
\n\nFeel free to [contact us](mailto:ruben.taelman@ugent.be) if you want to discuss alternative forms of support,\nor if you have any related questions.\n\n
\n\n## Bounties\n\nAnother goal of the Comunica Association is to\n**connect organizations** that are in **need of improvements or features**, to **developers** seeking funding.\n\n
\n \n
\n\nUsing our Bounty Program,\norganizations can place [**bounties on issues**](/association/bounties/),\nand developers may work on them for an agreed-upon price.\nThese bounties are primarily useful for issues that have a clearly defined scope and are not too large.\nLarger issues with an unclear scope may be better suited for becoming part of the general roadmap,\nwhich is decided by Board Members,\nof which [your organization can also become a part](#members-and-sponsors).\n\n
\n\n## Learn more\n\nIf you want to be notified about future developments around this association, submit your email address below!\n\n\n\nThe Comunica Association is hosted by [Open Collective Europe](https://opencollective.com/europe),\nand our budget is visible on [Open Collective](https://opencollective.com/comunica-association).\n\n
\n * Sponsors that want to have an issue prioritized should contact us.\n The board will decide the final order of issue handling based on historical sponsorship contribution and developer availability.\n
\n'},4074:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Board of Directors'\ndescription: 'The board makes decisions regarding the Comunica Association'\n---\n\nThe [Comunica Association](/association/) has a Board of Directors\nthat makes decisions with respect to the Comunica Association.\nThis page describes details on who are the members of this board, what it does, and how it works.\n\n## Members\n\n* [Ruben Taelman](https://www.rubensworks.net/) [(IDLab, Ghent University – imec)](https://knows.idlab.ugent.be/) - Codebase curator, Core maintainer\n* [Pieter Colpaert](https://pietercolpaert.be/) [(IDLab, Ghent University – imec)](https://knows.idlab.ugent.be/) - Strategic coordinator\n* [Jesse Wright](https://github.com/jeswr/) [(Australian National University)](https://cecs.anu.edu.au/) - Core maintainer\n\n## Goals\n\nThe Board of Directors makes decisions concerning the following topics:\n\n* Determine long-term goals via the [roadmap](/roadmap/).\n* Suggest priorities of issues to the maintainers for short-term development via the [project boards](https://github.com/orgs/comunica/projects).\n* Coordinate future of the Comunica Association\n\nFurthermore, board meetings can be used to evaluate the maintenance and development of Comunica and its related dependencies,\nwhich includes development by externals via the [Bounty Program](/association/bounties/).\n\n## Becoming a Board Member\n\nThere are two ways to become a Board Member:\n\n1. Become a financial contributor via [Open Collective](https://opencollective.com/comunica-association) of the the Board Member tier\n2. 
Become a regular [contributor](/contribute/) in any other way, with a dedication of at least four hours per week on average.\n\n## Decision-making Process\n\nAt least once every year, the board virtually meets for a board meeting.\nNot all members are required to be present at each meeting.\nThe chair is expected to prepare an agenda ahead of time on https://github.com/comunica/association/blob/master/board-meetings/next.md,\nwhich should contain points raised by the board members.\nA meeting may be skipped if there are no objections from members.\n\nThe chair is appointed by the board members, and may be changed at any time through a decision.\nThe title of \"codebase curator\" is reserved for one person,\nand can only be passed on to someone else by the current codebase curator.\n\nDuring the meeting, decisions can be made,\nand every member can place exactly one vote.\nIn case of a tie, the final decision is up to the chair.\nThe codebase curator may optionally overrule any (final) vote if this person considers this decision to be detrimental to the future of Comunica or the Comunica Association.\nNon-attending members may raise their vote for up to two weeks after the meeting after reading the meeting minutes.\nOnce a vote is final, an action will be carried out by the executive contributors.\n\nMinutes are scribed for each meeting by a volunteer,\nand are to appear afterwards on https://github.com/comunica/association/tree/master/board-meetings\nThe minutes are sent to all board members shortly after each meeting.\n"},43831:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Bounty Procedures\'\ndescription: \'The process for handling bounties on issues\'\n---\n\nThis page explains how companies can place bounties on issues,\nhow developers can work on them,\nand how the [Comunica Association](/association/) manages such bounties.\n\n## Placing a bounty\n\nCompanies (or other types of organizations) that are interested in placing bounties on issues 
(feature requests, bug reports, ...) must follow this procedure:\n\n1. Company lets the association know they are interested in placing a bounty on an issue by mailing us.\n2. The association finds one or more suitable developers, and reports back to the company on their expected time frame and cost.\n3. All parties (association, company, developer) negotiate the final time frame and cost, after which one developer is assigned to the issue (if all parties agree).\n4. The company pays the full bounty cost to the association, from which the association claims an overhead of 15%.\n5. After completion (or when the reserved time runs out), all parties (association, company, developer) evaluate the work.\n6. The association pays the bounty to the developer (minus 15% overhead).\n\n## Working on a bounty\n\nDevelopers that are interested in working on issue bounties must follow this procedure:\n\n1. Based on the [list of bounties](/association/bounties/), developers can click on any issue to notify the association that they are interested in working on this issue.\n2. The association discusses with the developer to learn about previous experiences, the expected time frame, and the price at which the developer is willing to work.\n3. If the company agrees with the developer\'s conditions, they jointly negotiate the final time frame and cost, after which the developer is assigned to the issue (if all parties agree), and the developer can start the work.\n4. After completion (or when the reserved time runs out), the developer presents the work to the company and the association for evaluation.\n5. The association pays the bounty to the developer (minus 15% overhead).\n\n**The developer should not start working on the issue before the company and association have confirmed the assignment.**\n\n## Management of bounties\n\nThe association manages issues as follows:\n\n1. A company sends a mail to the association to place a bounty on one or more issues.\n2. 
The association marks the issue with the `comunica-association-bounty` label, and adds a footer to the issue to mark that a bounty has been placed, after which the issue will appear automatically in [the list of bounties](/association/bounties/). Optionally, a budget for the bounty can be added.\n3. If applicable, the association directly contacts potentially interested developers.\n4. The association awaits offers from developers with their estimated time frame and cost.\n5. Depending on the urgency of the issue, the association sends all offers from developers to the company, together with any previous experiences the association had with each developer.\n6. The company and association negotiate with at least one developer to agree on a fixed time frame and cost (taking into account the 15% overhead).\n7. The association sends an invoice to the company for the agreed-upon price.\n8. After payment of the invoice, the developer can start the work.\n9. The association assigns the issue to the developer, which will mark the issue as *"claimed"* in [the list of bounties](/association/bounties/).\n10. Once the deadline is reached, the association contacts the company and developer to schedule a review meeting.\n11. During the review meeting, all parties discuss the outcome, and potential next steps.\n12. 
The association pays the bounty to the developer (minus 15% overhead).\n\nDepending on the specific needs of certain issues or use cases, deviations from these procedures may take place.\n\n## Claiming a bounty\n\nOnce a bounty has been fully finalized, you can request your payment by _submitting an expense_ via [Open Collective](https://opencollective.com/comunica-association/).\nWhen submitting an expense, you must attach an invoice, which must be a valid fiscal document.\nThis document must at least contain your VAT ID and your address, the name of the Comunica Association (Open Collective Europe must not be mentioned), and the Comunica Association\'s address:\n\n```\nAA Tower (Ghent University – imec)\nTechnologiepark-Zwijnaarde 122\n9052 Ghent, Belgium\nBelgi\xeb\n```\n\nAll expenses are handled by [Open Collective Europe](https://docs.opencollective.com/oceurope).\nMore details on expenses can be found on [Open Collective Europe\'s wiki](https://docs.opencollective.com/oceurope/how-it-works/expenses).\n\n## Rules\n\n1. While anyone is allowed to take up bounties, if board members want to take up bounties, all other board members have to agree, to avoid conflicts of interest.\n2. Once assigned, bounties are expected to be delivered in a timely manner. If the developer does not communicate any progress for more than a week (without prior notification of unavailability), the bounty may become unassigned.\n'},1716:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Blog'\ndescription: 'Blog posts, containing announcements or other news.'\nblog_index: true\n---\n"},19744:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'A New Website for Comunica'\n---\n\nWe're happy to present a brand new website for Comunica! \uD83C\uDF89\n_Don't know what Comunica is? 
[Read about it here](/about/)._\n\nThis new version contains all **basic information** around Comunica.\nAdditionally, it contains **guides** on how to [query with Comunica](/docs/query/),\nand how to [modify or extend it](/docs/modify/). \n\n\n\nWhile this website is still very much a **work in progress** at the time of writing,\na whole lot of pages have been added already.\nFor instance, the section on [querying with Comunica](/docs/query/) contains some extensive guides.\nIn the near future, more advanced guides on [modifying Comunica](/docs/modify/) will be added.\nIf you're interested in **helping out** with this effort, be sure to have a look at the [contribution guide](/contribute/).\n\nIn the future, this blog will be used for **announcing news** around Comunica,\nwhich can include significant new releases,\nand other things.\nSo be sure to keep your \uD83D\uDC40 on this!\nIf you want to be notified of new blog posts, you can [follow Comunica on **Twitter**](https://twitter.com/comunicajs).\n"},97463:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 1.16.0: Full spec compliance, property paths, CSV/TSV, basic auth, and fixes\'\n---\n\nWith the latest release of Comunica, we have achieved the major milestone of **full compliance with the SPARQL 1.1 specification**.\nWhile Comunica has had support for all SPARQL 1.1 operators for a while,\nsome small parts were not always fully handled according to the spec,\nand property paths were not fully supported.\n\nThanks to the help of several students over the summer, these issues have been resolved,\nand all tests from [the SPARQL 1.1 test suite](https://w3c.github.io/rdf-tests/sparql11/) now pass.\n\n\n\n## SPARQL 1.1 Query compliance\n\nOur continuous integration tool has been configured to continuously check correctness\nusing unit tests, integration tests, and the SPARQL 1.1 query test suite.\nUntil now, some tests from this test suite failed, primarily due to the lack of full property path 
support.\nThanks to the help of [several](https://github.com/comunica/comunica/commits?author=stephaniech97) [students](https://github.com/comunica/comunica/commits?author=FlorianFV) that [contributed](/contribute/)\nduring the summer, all of these issues have been resolved,\nwhich makes Comunica fully compliant to the [SPARQL 1.1 Query](https://www.w3.org/TR/sparql11-query/) specification.\n\nThe next major goal will now be to implement the [SPARQL 1.1 Update](https://www.w3.org/TR/sparql11-update/) specification.\n\nInterested in helping out? Let us know via [GitHub](https://github.com/comunica/comunica/issues/435).\n\n## Property paths\n\nSPARQL 1.1 provides the [property paths syntax](https://www.w3.org/TR/sparql11-query/#propertypaths),\nwhich is a power-user feature that allows complex paths between two resources to be expressed.\nAs of now, Comunica implements all property paths functionality according to the specification.\n\nFor example, property paths allow you to define alternative predicates:\n```sparql\nSELECT ?person WHERE {\n [ rdfs:label "Bruce Willis"@en ] (dbpedia-owl:spouse|dbpedia-owl:child) ?person.\n}\n```\n\nTry out some example queries live via our Web client:\n\n* [Spouses and children of Bruce Willis](http://query.linkeddatafragments.org/#transientDatasources=http%3A%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen&query=SELECT%20%3Fperson%0AWHERE%20%7B%0A%20%20%5B%20rdfs%3Alabel%20%22Bruce%20Willis%22%40en%20%5D%0A%20%20%20%20(dbpedia-owl%3Aspouse%7Cdbpedia-owl%3Achild)%20%3Fperson.%0A%7D)\n* [In-laws of Brad Pitt](http://query.linkeddatafragments.org/#transientDatasources=http%3A%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen&query=SELECT%20%3Fperson%0AWHERE%20%7B%0A%20%20dbpedia%3ABrad_Pitt%20dbpedia-owl%3Aspouse*%20%3Fperson.%0A%7D)\n* [Movies from directors who have directed movies with Brad 
Pitt](http://query.linkeddatafragments.org/#transientDatasources=http%3A%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen&query=SELECT%20%3Fmovie%0AWHERE%20%7B%0A%20%20%5B%20rdfs%3Alabel%20%22Brad%20Pitt%22%40en%20%5D%0A%20%20%20%20%5Edbpedia-owl%3Astarring%2Fdbpedia-owl%3Adirector%2F%5Edbpedia-owl%3Adirector%20%3Fmovie.%0A%7D)\nShould you run into any bugs related to property paths, \nbe sure to [report them on our issue tracker](https://github.com/comunica/comunica/issues).\n\n## CSV/TSV Serializers\n\nWhile there was already support for many [result formats](/docs/query/advanced/result_formats/) in Comunica,\n[CSV and TSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) support was missing.\nAs of this release, this gap has been filled.\nThey can be used by requesting the `text/csv` or `text/tab-separated-values` media types.\n\nFor example, try it out as follows from the command line:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100" \\\n -t \'text/csv\'\n```\n\n## Basic authentication\n\nSometimes, access to data on the Web requires [HTTP Basic Authentication](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication).\nAs of this release, you can [configure Comunica to pass the required credentials](/docs/query/advanced/basic_auth/) to access sources that require authentication.\n\nFor example, username and password can be passed from the command line:\n```bash\n$ comunica-sparql https://username:password@example.org/page \\\n "SELECT * WHERE { ?s ?p ?o }"\n```\n\n## And more\n\nAside from the main features above, several fixes have been made.\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1160---2020-08-24) to read more about them.\n'},76007:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Hacktoberfest and Release 1.17.0'\n---\n\nIn this post, we give an overview of\ncontribution possibilities during 
[Hacktoberfest](https://hacktoberfest.digitalocean.com/),\nand the newly released 1.17.0 version. \n\n\n\n## Hacktoberfest\n\n[Hacktoberfest](https://hacktoberfest.digitalocean.com/) is a yearly event during the month of October to celebrate open-source projects,\nwhere everyone is invited to contribute to projects by submitting pull requests.\nOnce a certain number of pull requests has been made, you will receive some goodies.\n\nIf you're interested in participating in this event,\nwe have marked several issues with the label [`hacktoberfest`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Ahacktoberfest),\nwhich are well suited for first-time contributors.\n\nHappy hacking! \uD83E\uDE93\n\n## Release 1.17.0\n\nAs of today, version 1.17.0 has been released.\nIt mainly contains [a fix for the bug where some queries would never terminate without producing further results](https://github.com/comunica/comunica/commit/3095b269f1d98d706d1056495123a69bffe3b457).\nNext to this, it offers some convenience features such as\n[making the logger data argument lazy](https://github.com/comunica/comunica/commit/e6d7cee1f7622e4bcb73188a0060d5d9823958f0),\n[ensuring the internal SPARQL endpoint defaults to application/json when no content type is requested](https://github.com/comunica/comunica/commit/cdde3559b51825eaebb686fffe0a9edf7c8ef238),\nand a fix for [http-based JSON-LD contexts not being retrievable within browsers](https://github.com/comunica/comunica/commit/2d0818c64e5bfbbb334ecbccb7b5a98a69263d1c).\nIt also lays the groundwork for [RDF* support](https://github.com/comunica/comunica/issues/594) in the near future.\n\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1170---2020-09-25) to read more about them.\n"},29396:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.18.0: Smaller Web bundles and Microdata parsing'\n---\n\nThis post gives a brief overview of the new 1.18.0 
release.\n\n\n\n## Smaller Web bundle sizes\n\nThanks to [Jacopo Scazzosi](https://github.com/jacoscaz),\nthe **Webpack bundle size** of the default Comunica config has been reduced from **1.47 MiB to 1.15 MiB**.\nThis reduction is mainly caused by swapping to smaller and more Web-friendly dependencies.\n\nThese changes were applied in preparation for the new release of [Quadstore](https://github.com/beautifulinteractions/node-quadstore),\na Comunica-powered RDF graph database where small bundle sizes are crucial.\n\n## Microdata parsing\n\nComunica already supported parsing RDFa from HTML (and other XML-like) documents.\nSince Microdata is [the most popular form of structured information on the Web](http://webdatacommons.org/structureddata/2019-12/stats/stats.html),\nit makes a lot of sense to be able to query over this as RDF.\nAs such, we plugged in the recently created [Microdata to RDF Streaming Parser](https://github.com/rubensworks/microdata-rdf-streaming-parser.js) into the default Comunica SPARQL config.\n\nShould you not need this parser in your querying use case,\nno worries, you can easily exclude it by creating a [custom config](https://comunica.dev/docs/modify/).\n\n## Fixes and enhancements\n\nNext to the changes above, several other smaller fixes and enhancements (such as [Emoji-support in query expressions](https://github.com/comunica/sparqlee/commit/4b873834a38c35329495d142eaf1c59f56fc0038)) were applied.\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1180---2020-11-02) to read more about them.\n"},14741:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 1.19.0: Simplifications for extensions\'\n---\n\nThe 1.19.0 release focuses on simplifications for developing Comunica extensions.\nIt contains no significant fixes or changes for end-users.\n\n\n\n## Components.js 4\n\nComunica\'s modules are wired together using the [Components.js](/docs/modify/advanced/componentsjs/) dependency 
injection framework.\nRecently, Components.js [has been updated](https://github.com/LinkedSoftwareDependencies/Components.js)\nto major release version 4, which features several simplifications for developers.\n\nWhile this release is backwards-compatible,\nwe do recommend that developers of Comunica modifications make the following tweaks.\n\n### Reduce clutter in `package.json`\n\nAll Comunica modules would typically contain the following entries in their `package.json` files:\n\n```json\n{\n ...\n "lsd:module": "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-abstract-bindings-hash",\n "lsd:components": "components/components.jsonld",\n "lsd:contexts": {\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-abstract-bindings-hash/^1.0.0/components/context.jsonld": "components/context.jsonld"\n },\n "lsd:importPaths": {\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-abstract-bindings-hash/^1.0.0/components/": "components/"\n },\n ...\n}\n```\n\nThis can now be simplified to:\n\n```json\n{\n ...\n "lsd:module": true\n ...\n}\n```\n\n### Update Components.js context version\n\nIf you define your own JSON-LD contexts,\nit is recommended to update to the latest version of the Components.js context:\n\n```text\n- "https://linkedsoftwaredependencies.org/bundles/npm/componentsjs/^3.0.0/components/context.jsonld",\n+ "https://linkedsoftwaredependencies.org/bundles/npm/componentsjs/^4.0.0/components/context.jsonld",\n```\n\nWhile this change is optional, skipping it will result in a startup warning mentioning the use of a deprecated context URL.\n\n## Next steps\n\nIn the future, we plan further simplifications to Comunica modifications.\nConcretely, we intend to enable the automatic generation of module and component files based on TypeScript source code.\n'},88917:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.20.0: SPARQL Update support'\n---\n\nWith this new 1.20.0 release, we bring support for 
[SPARQL Update](https://www.w3.org/TR/sparql11-update/) queries to Comunica.\nNext to this, several enhancements were made to improve the developer experience,\nalong with minor new features and important bug fixes.\n\n\n\n## SPARQL Update\n\nUp until now, Comunica only supported performing read-only queries over one or more sources.\nWith this update, it is possible to execute [SPARQL Update queries](https://www.w3.org/TR/sparql11-update/)\nto modify data inside a _source_, or to direct changes to a separate _destination_.\n\nThe current implementation is fully compliant with the SPARQL Update specification,\nand it passes all tests of the test suite.\n\nCurrently, Update support is limited to [RDF/JS stores](/docs/query/advanced/rdfjs_updating/).\nSupport for updating other types of destinations is planned,\nsuch as local RDF files, [Linked Data Platform](https://www.w3.org/TR/ldp/),\n[SPARQL endpoints](https://www.w3.org/TR/2013/REC-sparql11-protocol-20130321/),\n[SPARQL Graph Store protocol](https://www.w3.org/TR/2013/REC-sparql11-http-rdf-update-20130321/), ...\n\nNo explicit support for transactions is available at the moment,\nas we assume that RDF/JS stores handle this on their own.\nProper support for this at the engine level is planned.\n\n## SPARQL endpoint worker threads\n\nIf you [use Comunica to expose a SPARQL endpoint](/docs/query/getting_started/setup_endpoint/),\nyou can now set the number of parallel worker threads using the `-w` flag:\n\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en -w 4\n```\n\nThis will result in better performance when your endpoint serves many parallel requests.\n\nTogether with this change, the timeout handling has been improved,\nas the old implementation would sometimes not terminate query executions even if the timeout was exceeded.\n\n## Features, fixes and enhancements\n\nNext to the changes above, several other features, fixes and enhancements were applied,\nsuch as the new 
[`@comunica/types`](https://github.com/comunica/comunica/commit/3f46a233883b699df87fcee3215516f97e15e346)\nand [`@comunica/context-entries`](https://github.com/comunica/comunica/commit/12b9ee3e8e5bc2d0fadd662a3d6aeef838b87619) packages,\nenabling [blank node correlation across results](https://github.com/comunica/comunica/commit/d9b93b4608c69e6c8b710b664c37e47a1c0d41c7),\nand a new [link queue bus](https://github.com/comunica/comunica/commit/8de44d1da8e63c9b3a15c26dadcb003c2c00f136).\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1200---2021-03-30) to read more about them.\n"},75399:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.21.0: Hypermedia-based SPARQL Updating'\n---\n\nThe 1.21.0 version is a smaller release\nthat mainly introduces the necessary wiring to enable hypermedia-driven SPARQL update querying,\nwhich lays the foundations for highly flexible updating of heterogeneous destinations, such as Solid data pods.\n\nIn other words, this provides the necessary ✨_magic_✨ for updating many different types of things. 
\n\n\n\n## Hypermedia-based updates\n\nA key feature of Comunica is its ability to [automatically detect the type of source via hypermedia](/docs/modify/advanced/hypermedia/),\nand alter its query process based on the source's capabilities.\nWith this new update, this hypermedia-based logic has also been added to the handling of update queries.\n\nConcretely, if you pass a destination by URL to Comunica,\nthe capabilities of this destination will be detected,\nand an appropriate destination handler will be used.\n\nWith this update, we provide support for [a single hypermedia destination type](/docs/query/advanced/destination_types/):\nthe [SPARQL Update-based PATCH API](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-update-hypermedia-patch-sparql-update).\nSuch a destination is an HTTP API accepting PATCH requests containing SPARQL Update queries (`application/sparql-update`),\nsuch as [Solid servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1).\n\nIn future updates, we intend to support more types of hypermedia-based destinations as well,\nsuch as [SPARQL endpoints](https://www.w3.org/TR/2013/REC-sparql11-protocol-20130321/),\nand [Linked Data Platform](https://www.w3.org/TR/ldp/).\n\nLearn more about updating from the [command line](/docs/query/getting_started/update_cli/)\nor from a [JavaScript application](/docs/query/getting_started/update_app/) in the documentation. 
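The PATCH interaction described above boils down to a simple HTTP request shape. Below is a minimal sketch in plain JavaScript; the resource URL and triple are hypothetical examples, and Comunica constructs such requests internally rather than exposing this helper:

```javascript
// Sketch of the request shape a SPARQL-Update-based PATCH destination
// accepts: the update query is sent verbatim as the request body,
// tagged with the application/sparql-update media type.
function buildSparqlUpdatePatch(resourceUrl, updateQuery) {
  return {
    url: resourceUrl, // the destination resource, e.g. a Solid document
    method: 'PATCH', // PATCH, as accepted by e.g. Solid servers
    headers: { 'Content-Type': 'application/sparql-update' },
    body: updateQuery, // the raw SPARQL Update query
  };
}

// Hypothetical usage; the resulting object can be issued with any HTTP client.
const request = buildSparqlUpdatePatch(
  'https://pod.example.org/profile.ttl',
  'INSERT DATA { <urn:ex:s> <urn:ex:p> "o" . }',
);
```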
\n\n## Features, fixes and enhancements\n\nNext to the changes above, several minor features, fixes and enhancements were applied,\nsuch as [more expressive configuration of JSON-LD parsing](https://github.com/comunica/comunica/commit/199710d70b01d22ea40fe5e12e16a9d8800f32fc),\nproper [CLI exit codes](https://github.com/comunica/comunica/commit/00aa446cc8d2fd713711787b8a59f45c266947ea),\nand [changing the context in the `optimize-query-operation` bus](https://github.com/comunica/comunica/commit/81373206a17d0fcb8d3af701e5266287113d545c).\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1210---2021-04-27) to read more about them.\n"},80705:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Announcing the Comunica Association, and a Bounty Program\'\n---\n\nIn this post, we announce the creation of the [Comunica Association](/association/),\nand the introduction of a new bounty system using which **organizations** and companies\ncan **fund development** of new features and the fixing of bugs,\nand through which **developers** can take up these bounties and **get paid**.\n\n\n\n## The need for an association\n\nComunica started out as a small software project to drive query-related research.\nBy now, it has grown into a project that is being widely used not only within research,\nbut also within companies and organizations as stable software.\n\nThe original research-driven development approach is running into its limits,\nsince features and bugs are reported regularly that do not fit into a strict research agenda.\nTherefore, there is a need to broaden the development scope of Comunica,\nwhich is the purpose of the **Comunica Association**.\n\n## Short-term goals\n\nAs of now, the Comunica Association is a **non-profit organization** (activity within [Open Knowledge Belgium](https://openknowledge.be/))\nthat as a first step will act as an intermediary between people in need of development,\nand people that want 
to offer development at a price.\nFor instance, a certain company may be in need of a specific feature in Comunica,\nbut may not have the required expertise to implement it.\nVia the Comunica Association, this company may place a bounty on this issue,\nso that other companies or freelance developers (that do have this expertise)\nmay take up this effort for the bounty price.\n\n
\n\nVia this bounty program, we intend to grow a network of organizations and individuals that\ncan offer services to each other around the topic of Web-scale querying of Knowledge Graphs.\n\n**Several bounties have already been placed on issues!**\nSo if you\'re a developer willing to take up such work, have a look at [the list of bounties](/association/bounties/).\nIf you\'re an organization interested in placing new bounties, have a look at [the bounty procedures](/association/bounty_process/).\n\n## Long-term goals\n\nThis bounty program is a first step in the creation of the Comunica Association.\nAs a next step, we intend to bring this network of interested organizations and individuals\neven closer by allowing everyone to collaboratively determine the future roadmap of Comunica through memberships.\n\nThe Association will be as open and transparent as possible.\nThis will mean that important decisions will be shared on this website,\nand that all finances will be visible to everyone via the [Open Collective platform](https://opencollective.com/).\n\nEven though the Comunica Association is a non-profit organization,\nit will raise funds through the bounty program and memberships\nin order to secure funding for hiring dedicated developers.\nSuch developers can then become dedicated maintainers of Comunica,\nin order to make the open-source development of Comunica and related RDF/JS libraries more sustainable in the long term.\n\n[Click here to learn more about the Comunica Association.](/association/) \n'},25615:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.22.0: Improved update support, extension functions, and improved CLI handling'\n---\n\nThe 1.22.0 version features some major additions, and a bunch of smaller internal fixes and performance improvements \uD83D\uDE80!\nThe primary changes that are discussed in this post are\nsupport for more SPARQL update destination types,\nSPARQL extension functions,\nand rewritten CLI 
handling.\n\n\n\n## Updating Solid, Linked Data Platform, and SPARQL endpoints\n\nIn the previous release of Comunica, [basic support for updating documents in Solid/LDP data pods was already added by enabling PATCH requests](/blog/2021-04-27-release_1_21/).\nIn this release, we improve this support by also adding an actor that can handle PUT requests,\nwhich will allow resources to be created that do not exist yet.\n\nFor example, the following query will work whether or not the destination resource already exists,\nand Comunica will automatically determine if it should send a PUT or PATCH request:\n```bash\n$ comunica-sparql http://mypod.example.org/file.ttl \\\n -q \"INSERT DATA { }\"\n```\n\nIn the future, it will also become possible to update _private_ resources via Solid authentication.\n\nFurthermore, this release also makes it possible to forward update queries to SPARQL endpoints.\n\nLearn more about updating from the [command line](/docs/query/getting_started/update_cli/)\nor from a [JavaScript application](/docs/query/getting_started/update_app/) in the documentation.\n\n## SPARQL extension functions\n\nSPARQL allows non-standard, [custom extension functions](https://www.w3.org/TR/sparql11-query/#extensionFunctions) to be used within queries.\nSince this release, Comunica allows developers to plug in custom implementations for such functions.\n\nFor example, this allows you to plug in an implementation for the custom `func:reverse` function in the following query:\n```text\nPREFIX func: \nSELECT ?caps WHERE {\n ?s ?p ?o.\n BIND (func:reverse(?o) AS ?caps)\n}\n```\n\nLearn more about [configuring SPARQL extension functions here](/docs/query/advanced/extension_functions/).\n\n## Improved CLI arguments handling\n\nUp until this release, the internal mechanics of declaring and handling command-line arguments for `comunica-sparql` was hardcoded.\nThis caused some problems for custom init actors such as `comunica-sparql-hdt`,\nwhere custom handling of these 
arguments was required.\n\nIn order to meet these needs, the internals of CLI handling have been completely rewritten using the [`yargs`](https://www.npmjs.com/package/yargs) library.\nOther init actors can now easily plug in custom argument handlers to modify how the CLI tool behaves.\nFor the end-user, no significant changes are apparent, as the CLI tools remain fully backwards-compatible.\n\nYou can learn more about this in the [custom CLI arguments guide](/docs/modify/advanced/custom_cli_arguments/).\n\n## Features, fixes and enhancements\n\nNext to the changes above, several minor features, fixes and enhancements were applied,\nsuch as [migration to the more flexible Fetch-based HTTP actor](https://github.com/comunica/comunica/commit/a96547be4b112887a4e164496e2c6540737d8391),\n[allowing custom Fetch functions to be provided via the context](https://github.com/comunica/comunica/commit/a89f88fc1bf63c6e5d8ec7d5aee4199cd8b01e58),\n[logging filter errors as warnings in the logger](https://github.com/comunica/comunica/commit/cf12a9af63078917c0577f1d4b7d023506eda9e5),\n[reducing memory usage during query execution](https://github.com/comunica/comunica/commit/b0aeb67743eb187ddfb4e6fe8b42df240f3a9de7),\n[better error reporting for HTTP errors](https://github.com/comunica/comunica/commit/f6c2d5b2fe920808cf9ab98071da769f763c0515),\nand more.\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1220---2021-08-30) to read more about them.\n"},60190:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Comunica Association Memberships\'\n---\n\n[Earlier this year](/blog/2021-06-21-comunica_association_bounties/),\nwe announced the [Comunica Association](/association/),\nwhich is a non-profit organization that aims to make Comunica sustainable in the long term.\nIn this post, we announce the possibility of becoming a _member_ or _sponsor_ of the association,\nallowing organizations to drive the future roadmap of Comunica.\nWe 
plan an **official launch in fall 2022**, up until when organizations can choose\nto become a **founding member** of the Comunica Association.\n\n\n\n## \uD83C\uDFC6 Status of the bounty program\n\nThe [bounty program](/association/bounties/) has now been running for a couple of months,\nand so far it is working exactly as intended.\nAt the time of writing, two organizations ([Triply](https://triply.cc/) and [Netwerk Digitaal Erfgoed](https://netwerkdigitaalerfgoed.nl/))\nhave placed a total of six bounties, with a varying scope.\nOne of these bounties has already been completed, and two of them are being worked on. \n\nAn important finding is that bounties are **best applied on issues that have a clearly defined scope**, and are not too large.\nFor example, a bounty for a specific and easily reproducible bug is ideal.\nOn the other hand, more high-level issues such as the need to [improve overall performance](https://github.com/comunica/comunica/issues/846)\nseem to be less suited for bounties, as the scope is large or infinite, and the required effort is hard to predict.\nSuch issues are better suited for being part of the general roadmap of Comunica,\nwhich is the main motivation for introducing a membership structure.\n\n## \uD83C\uDFC5 Members and sponsors\n\nUp until now, Comunica primarily had a research-driven [roadmap](/roadmap/),\nbecause it grew out of a research project.\nTo allow more organizations and individuals to determine what this roadmap should look like,\nthe Comunica Association now allows [_members_ to become part of the board](/association/board/).\n\n**Board members are able to determine Comunica\'s roadmap**, and the future of the association.\nOne can become part of the board by either contributing time or via a financial contribution,\nwhich will both be invested in core maintenance of Comunica,\nsuch as managing issues and pull requests, and working towards the roadmap.\n\nFurthermore, for organizations that want to support the 
association,\nbut do not have the desire to become part of the board,\nthere is the option to become a _sponsor_, for which three tiers currently exist.\nThe **budget provided by sponsors will also go directly towards funding core maintenance of Comunica**,\nwith the option for sponsors to prioritize certain issues.\n\n
\n\nSince the Comunica Association has a commitment to work as publicly and transparently as possible,\nall financial contributions from members and sponsors will go via our [Open Collective](https://opencollective.com/comunica-association) page.\nThis will allow everyone to see who contributed to the project, and how the budget is being spent.\n\n## \uD83D\uDE80 Next steps\n\nOrganizations that are interested in **supporting Comunica** can do so **starting from today**.\nBecoming a board member or a sponsor can be done via our [Open Collective](https://opencollective.com/comunica-association) page,\nafter which we will contact you about the practical next steps.\nIf you want to become a board member by contributing time, you can [contact us](mailto:ruben.taelman@ugent.be) directly.\n\nAll members and sponsors that are active by our launch date in the fall of 2022 (exact date will be announced later)\nwill be considered **founding members and sponsors**, and will receive a permanent mention on this website.\nBased on the active members and sponsors, we will be actively looking for dedicated core maintainers\nwho want to be funded by the Comunica Association (be sure to [contact us](mailto:ruben.taelman@ugent.be) if you\'re interested in this!).\n\n[Click here to learn more about the Comunica Association,](/association/)\nor [contact us](mailto:ruben.taelman@ugent.be) regarding any specific questions\nyou may have about the association.\n'},30494:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 2.0.0: A new major release with radical simplifications and performance improvements\'\n---\n\nSince its initial release a couple of years ago, Comunica has grown a lot,\nbut it has always remained fully backwards-compatible with every update.\nHowever, as with every software project, there is sometimes a need to make breaking changes\nso that old mechanisms can be replaced with better, newer ones.\nWith this update, we have aggregated several breaking
changes into one large update,\nall of which should improve the lives of users one way or another.\nBelow, the primary changes are listed.\n\n\n\n## New query API\n\nFor most people, the biggest change will be in the way you use Comunica for query execution,\nas the package names of the default query engines have been renamed,\nand the JavaScript API has been improved.\n\n### New package names\n\nUp until now, you may have been using `@comunica/actor-init-sparql` (or a variant) as your main entry point for query execution.\n**This main entrypoint has been moved to `@comunica/query-sparql`** (or `@comunica/query-sparql-file` and `@comunica/query-sparql-rdfjs`).\nThis means that your imports and the dependencies in your `package.json` file will require updates.\n\nThe first reason for this renaming is the fact that the new names are shorter and easier to remember.\nThe second reason is mainly for people that want to configure their own Comunica engines,\nwhere the Query Init actor has been decoupled from the query engine entrypoints to simplify the creation of new engines.\n\n### Improved JavaScript query API\n\nAnother major change is related to the way you create and use a Comunica query engine in JavaScript.\nMainly, the following changes have been made:\n\n- `newEngine` has been replaced with the class `QueryEngine` that can be instantiated with the `new` keyword.\n- New result-dependent query methods have been added for simpler result consumption:\n - `queryBindings` for SELECT queries\n - `queryQuads` for CONSTRUCT and DESCRIBE queries.\n - `queryBoolean` for ASK queries\n - `queryVoid` for update queries\n- The general `query` method still exists, but has been changed.\n- The methods on a Bindings object have been changed and improved, and obtaining values for variables does not require the `?` prefix anymore.\n- If you are using TypeScript, make sure to bump `@rdfjs/types` to at least `1.1.0`.\n\nLearn more about the new API in the [guide on querying in a 
JavaScript app](/docs/query/getting_started/query_app/).\n\nBelow, you can see an example of a simple SPARQL SELECT query execution in the old and new versions of Comunica.\n\n**Before (Comunica 1.x):**\n```typescript\nconst newEngine = require(\'@comunica/actor-init-sparql\').newEngine;\nconst myEngine = newEngine();\n\nconst result = await myEngine.query(`\n  SELECT ?s ?p ?o WHERE {\n    ?s ?p .\n    ?s ?p ?o\n  } LIMIT 100`, {\n  sources: [\'https://fragments.dbpedia.org/2015/en\'],\n});\n\nresult.bindingsStream.on(\'data\', (binding) => {\n  console.log(binding.get(\'?s\').value);\n});\nresult.bindingsStream.on(\'end\', () => {});\nresult.bindingsStream.on(\'error\', (error) => console.error(error));\n```\n\n**After (Comunica 2.x):**\n```typescript\nconst QueryEngine = require(\'@comunica/query-sparql\').QueryEngine;\nconst myEngine = new QueryEngine();\n\nconst bindingsStream = await myEngine.queryBindings(`\n  SELECT ?s ?p ?o WHERE {\n    ?s ?p .\n    ?s ?p ?o\n  } LIMIT 100`, {\n  sources: [\'https://fragments.dbpedia.org/2015/en\'],\n});\n\nbindingsStream.on(\'data\', (binding) => {\n  console.log(binding.toString()); // New: quick way to print bindings\n  console.log(binding.get(\'s\').value);\n});\nbindingsStream.on(\'end\', () => {});\nbindingsStream.on(\'error\', (error) => console.error(error));\n```\n\nThis new query API is largely aligned with the recently created [RDF/JS query specification](https://rdf.js.org/query-spec/),\nwhich makes Comunica more interoperable and interchangeable within the RDF JavaScript ecosystem.\n\n## Easier engine modifications\n\nBased on the feedback we received from developers that configure their own Comunica engines or implement their own Comunica packages,\nwe have refactored the internals of Comunica in several places to simplify these processes.\n\n### Automatic generation of components files\n\nComunica makes use of the dependency injection framework [Components.js](/docs/modify/advanced/componentsjs/)\nto load its configuration files.\nA
requirement for this framework is that each package should expose a semantic description of its classes, i.e., the _components files_.\nThese components files are located within the `components/` directory of each package.\nWhile these files had to be manually created before,\nthese files can now be automatically generated from the TypeScript sources\nusing [Components-Generator.js](https://github.com/LinkedSoftwareDependencies/Components-Generator.js/).\nThis significantly reduces the effort when creating new Comunica packages.\nLearn more about this in the [getting started with modification guides](/docs/modify/getting_started/).\n\n### Config restructuring\n\nUp until now, all configuration files were split up in smaller fragments, but using an arbitrary fragmentation strategy.\nWith this update, all configuration files now use a consistent fragmentation strategy,\nwhere a separate sub-directory exists for each Comunica bus, in which one or more files can exist per actor.\nFurthermore, all configuration files have been moved to a new dedicated (zero-dependency) package\n[`@comunica/config-query-sparql`](https://github.com/comunica/comunica/tree/master/engines/config-query-sparql/),\nwhich simplifies reuse and extension of these config fragments.\nLearn more about this new config structure in the [README of `@comunica/config-query-sparql`](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/README.md).\n\n## Internal changes for better performance\n\nOne primary aspect of [our roadmap](/roadmap/) is to [improve overall performance](https://github.com/comunica/comunica/issues/846).\nIn this update, we refactored the way in which [join operations](/docs/modify/advanced/joins/) are handled,\nbecause these were not flexible enough before, which hindered optimizations.\n\nConcretely, Comunica used to handle most join operations within the Basic Graph Pattern actor,\nwhich made it impossible to use these join operators for joins with 
other types of operations,\nsuch as property paths, which thereby made these operations very slow.\nWith this refactoring, the join operator implementations have been fully decoupled from the Basic Graph Pattern actor,\nwhich for example makes joins between triple patterns and property paths much more efficient.\n\nWhile performance will be much better in many cases,\nthere are still a lot of [opportunities open for further optimization](https://github.com/comunica/comunica/issues/846).\nWe welcome [contributions](/contribute/) for making these optimizations a reality.\n\nLearn more about [joins in Comunica](/docs/modify/advanced/joins/).\n\n## Explaining query plans\n\nMost large-scale query engines offer some way of inspecting _how_ exactly a query engine will execute a given query,\nwhich is something Comunica has been lacking so far.\n\nWith this update, you can inspect in detail the exact query plan and actors that were used for executing a given query.\nThis functionality exists both on the command-line (via `--explain`), as in the JavaScript API.\nFor example, the command below shows an example of a physical plan that is printed for a given query:\n\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o. 
?s a ?o } LIMIT 100\' --explain physical\n\n{\n "logical": "slice",\n "children": [\n {\n "logical": "project",\n "variables": [\n "s",\n "p",\n "o"\n ],\n "children": [\n {\n "logical": "join",\n "children": [\n {\n "logical": "pattern",\n "pattern": "?s ?p ?o"\n },\n {\n "logical": "pattern",\n "pattern": "?s http://www.w3.org/1999/02/22-rdf-syntax-ns#type ?o"\n },\n {\n "logical": "join-inner",\n "physical": "bind",\n "bindIndex": 1,\n "bindOrder": "depth-first",\n "cardinalities": [\n {\n "type": "estimate",\n "value": 1040358853\n },\n {\n "type": "estimate",\n "value": 100022186\n }\n ],\n "joinCoefficients": {\n "iterations": 6404592831613.728,\n "persistedItems": 0,\n "blockingItems": 0,\n "requestTime": 556926378.1422498\n },\n...\n```\n\nLearn more about [explaining query plans in Comunica](/docs/query/advanced/explain/).\n\n## Webinar\n\nDue to all of these changes and simplifications,\nwe are planning a public webinar in which the basic usage of Comunica will be explained.\nThis will be useful for new developers that want to get started with Comunica,\nand developers that have used Comunica before, but want to learn about the new ways of using it.\nThis is also a perfect time for new contributors to become part of the community,\nor possibly even the [Comunica Association](/association/).\nMore news on this webinar will follow later.\n\n## Full changelog\n\nWhile this blog post explained the primary changes in Comunica 2.x,\nthere are actually many more smaller changes internally that will make your lives easier.\nIf you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v201---2022-03-02).\n'},60212:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.3.0: Better timeout support and minor enhancements'\n---\n\nIt's been a while since our latest blog post,\nso here's a small announcement on the latest 2.3.0 release.\n\n\n\n## Better timeout support\n\nWhen 
doing queries over slow sources, it may sometimes be desired to have requests time out if they run for too long.\nAs of this release, it is possible to [configure such timeouts](/docs/query/advanced/context/#16--http-timeout).\n\nFor example, configuring a timeout of 60 seconds when querying over a TPF endpoint can be done as follows:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n  sources: ['http://fragments.dbpedia.org/2015/en'],\n  httpTimeout: 60_000,\n});\n```\n\nThis functionality was implemented by [@Tpt](https://github.com/Tpt), as it was requested via a [bounty](https://comunica.dev/association/bounties/).\n\n## Union default graph\n\nBy default, Comunica will only query over the [default graph](https://www.w3.org/TR/sparql11-query/#unnamedGraph).\nIf you want to query over triples in other named graphs, you need to specify this via the `GRAPH`, `FROM`, or `FROM NAMED` clauses.\nHowever, by setting the `unionDefaultGraph` context option to `true`, triple patterns will also apply to triples in non-default graphs.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n  sources: ['https://fragments.dbpedia.org/2015/en'],\n  unionDefaultGraph: true,\n});\n```\n\n## Improved ordering of terms\n\nWe recently noticed that ordering of terms in Comunica (as used by `ORDER BY`) did not fully implement total ordering.\nThis caused [issues](https://github.com/comunica/comunica/issues/892) where certain terms would be ordered in an inconsistent manner.\nThanks to [@Tpt](https://github.com/Tpt), Comunica (and the underlying [Sparqlee expressions evaluator](https://github.com/comunica/sparqlee)) now has proper total ordering support.\n\n## Full changelog\n\nAs always, if you want to learn more about these changes, check out the [full
changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v230---2022-06-29).\n"},39243:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Official launch of the Comunica Association'\n---\n\nAs previously announced, we will be officially launching the Comunica Association during the fall of this year.\nMore concretely, we are organizing an online launch event on the 7th of September,\nand we will be physically present at the Semantics conference in Vienna the week afterwards.\n\n\n\n## \uD83D\uDCD6 How we got here\n\nLast year, [we announced the Comunica Association](/blog/2021-06-21-comunica_association_bounties/),\nto make Comunica sustainable in the long term,\nand to advance the [long-term roadmap](/roadmap/).\nUp until now, we had a soft-launch period during which a bounty program and membership structure were set up.\nThe association has grown a lot since then,\nwith [multiple developers actively working on bounties](/association/bounties/),\nand [multiple contributors supporting us via Open Collective](https://opencollective.com/comunica-association).\n\nWe thank the following founding members, who have supported the association for this launch:\n\n- [IDLab - Ghent University](https://www.ugent.be/ea/idlab/en)\n- [Australian National University](https://cecs.anu.edu.au)\n- [Dutch Digital Heritage Network (NDE)](https://netwerkdigitaalerfgoed.nl/)\n\n## \uD83D\uDE80 Online launch event\n\nOn Wednesday 7 September at 16:00 (Brussels time), we will livestream the launch of the Comunica Association.\nDuring this event, several invited speakers from various companies will talk about their experiences with Comunica, and show off some demos.\nSpeaker profiles during this event range from commercial users of Comunica,\nto academics using Comunica for their research.\n\nIf you want to learn more about this event,\nyou can find more details on the [event page](/events/2022-09-07-association_launch/).\n\n## \uD83E\uDDD1\uD83C\uDFEB Semantics
conference\n\nIn the week after the online launch event,\nthe [Semantics conference](https://2022-eu.semantics.cc/) takes place in Vienna, Austria, from September 13 until September 15.\nWe will be present at this conference with a booth, and we will give a talk at the main conference.\nIf you plan to attend this conference, be sure to come find us there!\n"},80850:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.4.0: Better browser support and performance improvements'\n---\n\nWe just released a new minor version of Comunica.\nHere's an overview of the main changes.\n\n\n\n## Better browser support\n\nWhen using Comunica in browser bundling tools such as Webpack,\npolyfills had to be configured since Comunica made use of Node.js builtins.\nAs of this release, Comunica does not depend directly on these Node.js builtins anymore,\nwhich means that Comunica can be bundled directly with tools such as Webpack without having to configure polyfills in a custom config.\n\nThis change was implemented by [@Tpt](https://github.com/Tpt) via a [bounty](https://comunica.dev/association/bounties/).\n\n## Performance improvements\n\nThanks to some [internal changes inside AsyncIterator](https://github.com/comunica/comunica/commit/b16e18888b0e93821c76e01a6efd9bcb3c4f9523), Comunica now runs slightly faster in general.\n\nFurthermore, [some property path logic was rewritten](https://github.com/comunica/comunica/commit/0ad833f8f32f7e3c2de1b22a0424da027656bf6a),\nwhich makes * and + path queries significantly faster for large datasets.\n\n## Tweaks to the HTTP service\n\nThe [HTTP service](https://comunica.dev/docs/query/getting_started/setup_endpoint/) of Comunica (which exposes a SPARQL endpoint) has been polished.\nOn the one hand, several bugfixes have been applied to make the endpoint more stable when there are timeouts and long-running queries.\nOn the other hand,
[some](https://github.com/comunica/comunica/commit/4958206f6b042239efe2218ce268e4b981ce9e2c)\n[features](https://github.com/comunica/comunica/commit/4dd99fee904c64e9ef700eb5080197c4a03a36fa)\nhave been added that are useful when benchmarking with Comunica.\n\n## Full changelog\n\nAs always, if you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v240---2022-08-24).\n"},64546:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.5.0: Fixes, string sources, and HTTP error handling'\n---\n\nWe just released a new small update. Here's an overview of the main changes.\n\n\n\n## String sources\n\nIf you have an RDF dataset available in a JavaScript string in some RDF serialization,\nyou can now immediately query over it by passing it as a `stringSource` as follows:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n  sources: [\n    {\n      type: 'stringSource',\n      value: '. .',\n      mediaType: 'text/turtle',\n      baseIRI: 'http://example.org/',\n    },\n  ],\n});\n```\n\nThis feature has been contributed by [@constraintAutomaton](https://github.com/constraintAutomaton).\n\n## HTTP error handling\n\nWith this update, query engines can become more robust against unstable or unavailable servers.\n\nUsing the `httpRetryOnServerError`, `httpRetryCount`, and `httpRetryDelay` options,\nyou can make your engine retry requests a number of times if the server produces an error.\n\nUsing the `recoverBrokenLinks` option, you can make your engine fall back to the [WayBack Machine](https://archive.org/web/) if a document has become unavailable.\n\nLearn more about using these options on the [command line](https://comunica.dev/docs/query/getting_started/query_cli/)\nand [query context](https://comunica.dev/docs/query/advanced/context/).\n\nThese features were contributed by [@Laurin-W](https://github.com/Laurin-W/) and [@jeswr](https://github.com/jeswr/).\n\n## Full
changelog\n\nAs always, if you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v250---2022-11-09).\n"},63367:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 2.7.0: Better date support, better performance over SPARQL endpoints, and internal fixes\'\n---\n\nToday, we released a new minor update, which brings exciting new features, performance improvements, and bug fixes.\nBelow, you can find an overview of the main changes.\n\n\n\n## \uD83D\uDCC5 Durations, Dates, and Times in Filters\n\nThe SPARQL 1.1 specification prescribes only a very limited set of operations that can be done over literals with datatype `xsd:dateTime`.\nFor example, it is not possible to add/subtract durations, compute differences between times, and so on. \nRecently, [a suggestion was made](https://github.com/w3c/sparql-12/blob/main/SEP/SEP-0002/sep-0002.md) to extend\nthe number of operations that can be done over `xsd:dateTime`\'s and related datatypes.\nThis Comunica release implements this [proposal](https://github.com/w3c/sparql-12/blob/main/SEP/SEP-0002/sep-0002.md),\nwhich means that queries such as the following are now possible:\n\n```text\nPREFIX xsd: \nSELECT ?id ?lt ?gt WHERE {\n VALUES (?id ?l ?r) {\n (1 "PT1H"^^xsd:dayTimeDuration "PT63M"^^xsd:dayTimeDuration)\n (2 "PT3S"^^xsd:dayTimeDuration "PT2M"^^xsd:dayTimeDuration)\n (3 "-PT1H1M"^^xsd:dayTimeDuration "-PT62M"^^xsd:dayTimeDuration)\n (4 "PT0S"^^xsd:dayTimeDuration "-PT0.1S"^^xsd:dayTimeDuration)\n }\n BIND(?l < ?r AS ?lt)\n BIND(?l > ?r AS ?gt)\n}\n```\n\nThis functionality was implemented by [@jitsedesmet](https://github.com/jitsedesmet).\n\n## \uD83D\uDE80 Improved performance over SPARQL endpoints\n\nComunica aims to enable query execution over [different types of query interfaces](/about/#flexible-querying-of-linked-data), which includes SPARQL endpoints.\nWhile participating in a [recent workshop on federated 
querying over SPARQL endpoints](https://github.com/MaastrichtU-IDS/federatedQueryKG),\nwe encountered several performance issues that were caused by implementation bugs when querying over multiple SPARQL endpoints.\nWith this update, these performance issues have been resolved, and many queries that would either timeout or crash due to memory issues now run efficiently.\n\nThis functionality was implemented by [@surilindur](https://github.com/surilindur/),\n[@constraintAutomaton](https://github.com/constraintAutomaton/), and [@rubensworks](https://github.com/rubensworks/).\n\n## \uD83D\uDDC3️ Refactored internal metadata\n\nAs Comunica follows a [hypermedia-driven query execution model](/docs/modify/advanced/hypermedia/)\nto allow source capabilities to be detected and exploited on-the-fly,\nthere is a need for keeping track of the _metadata_ of such sources.\n\nTo enable more adaptive and efficient query execution in the future,\nwe have refactored this internal metadata so that it can be updated during query execution.\nThis allows operators to adaptively act upon newly discovered information in sources.\n\nMore details on these metadata changes can be read in the [documentation](/docs/modify/advanced/metadata/).\n\n## Full changelog\n\nAs always, if you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v270---2023-05-24).\n'},57145:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.8.0: Support for quoted triples (RDF-star and SPARQL-star)'\n---\n\nThis minor release focuses on a single but significant new feature: support for quoted triples.\n\n\n\n## \uD83E\uDE86 Quoted triples support\n\nRecently, the RDF-star community group has produced extensions to RDF and SPARQL: [RDF-star and SPARQL-star](https://www.w3.org/2021/12/rdf-star.html).\nThese extensions allow statements to be made about other statements,\nwhich previously only used to be possible using 
inconvenient workarounds such as RDF reification and named graphs.\nThe [RDF-star W3C working group](https://www.w3.org/groups/wg/rdf-star/) is now working on preparing new versions of the RDF and SPARQL recommendations,\nwhich are scheduled to be finalized in the second half of 2024.\n\nConcretely, this functionality allows triples to be _quoted_ in subject and object positions of other triples.\nFor example, the statement _\"Alice says that Violets are Blue\"_ could be expressed in Turtle as follows:\n```text\n@prefix : .\n:Alice :says << :Violets :haveColor :Blue >> .\n```\nFurthermore, this could be queried in SPARQL as follows:\n```text\nPREFIX : \nSELECT ?person ?color WHERE {\n ?person :says << :Violets :haveColor ?color >> .\n}\n```\n\nThis Comunica update adds support to this new functionality, following the [RDF-star community group report](https://www.w3.org/2021/12/rdf-star.html).\nConcretely, most RDF parsers and serializers, all SPARQL result parsers and serializers,\nand the SPARQL query parser and processing have been updated to handle quoted triples.\nFurthermore, for storing quoted triples in-memory, we recommend the optimized [`rdf-stores`](https://www.npmjs.com/package/rdf-stores) package,\nwhich is also being used internally for handling quoted triples.\n\nThis functionality is fully backwards-compatible, meaning that existing applications that do not make use of quoted triples will experience no differences.\nFurthermore, breaking changes in our RDF-star support _may_ occur if the RDF-star W3C working group decides to deviate from the RDF-star community group report.\n\n## Full changelog\n\nAs always, if you want to learn more about all changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v280---2023-07-04).\n"},84831:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 3.0: \uD83D\uDD25 Blazingly fast federation over heterogeneous sources\'\n---\n\nMore than 2 years ago, we 
released [Comunica version 2.0](/blog/2022-03-03-release_2_0/),\nwhich featured many internal and external API changes that significantly simplified its usage.\nToday, we release version 3.0, which focuses more on internal changes, with limited changes to the external API.\nMost of the changes relate to the handling of data sources during query planning,\nwhich allows **more efficient query plans to be produced when querying over federations of heterogeneous sources**.\nThis means that for people using Comunica, the number of breaking changes in this update is very limited.\nThings will simply be faster in general, and some small convenience features have been added,\nsuch as results being [async iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols).\nFor developers extending Comunica with custom actors, there will be some larger breaking changes.\n\n\n\n## \uD83D\uDD01 Async iterable results\n\nSince recent JavaScript versions, it has been possible to use the new _for-await_ syntax over [async iterables](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols).\nComunica has been using the [AsyncIterator library](https://github.com/RubenVerborgh/AsyncIterator/) since its initial release,\nwhich requires users to consume results as streams using on-data listeners, as follows:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n  SELECT ?s ?p ?o WHERE {\n    ?s ?p .\n    ?s ?p ?o\n  } LIMIT 100`, {\n  sources: [ \'http://fragments.dbpedia.org/2015/en\' ],\n});\n\nbindingsStream.on(\'data\', (bindings) => {\n  console.log(bindings.toString());\n});\n```\n\nAs of Comunica 3.x, **results can now also be consumed via the async iterable interface**, as follows:\n```javascript\nfor await (const bindings of bindingsStream) {\n  console.log(bindings.toString());\n}\n```\n\nIn performance-critical
cases, we still recommend the on-data listener approach.\nBut in most cases, the async iterable interface will provide sufficient levels of performance.\n\n## \uD83D\uDE4B Performance improvements for end-users\n\nIn Comunica version 2.x, federated queries (i.e. queries across multiple sources)\nwould essentially be split at triple pattern level,\neach triple pattern would be sent to each source,\nand results would be combined together locally.\nWhile this way of working is semantically correct, it is not always the most performant,\nespecially when working with sources such as SPARQL endpoints that can accept way more than just triple patterns.\n\nIn Comunica version 3.x, the internal architecture has been refactored\nto enable query planning to not just happen at triple pattern level,\nbut to **enable any kind of query operation to be sent to any kind of source that supports it**.\nWhile this new architecture will enable better query optimizations to be implemented in the future,\nwe already implemented some optimizations in this release.\nFirst, if Comunica detects that multiple operations _exclusively_ apply to one source,\nthen these **operations will be grouped and sent in bulk to this source** ([`@comunica/actor-optimize-query-operation-group-sources`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-group-sources)).\nRoughly, this corresponds to the [FedX optimization techniques](http://iswc2011.semanticweb.org/fileadmin/iswc/Papers/Research_Paper/05/70310592.pdf),\nbut extended to apply to heterogeneous sources instead of only SPARQL endpoints.\nSecond, if a join is done between two sources,\nwhere one of these sources accepts bindings to be pushed down into the source (such as SPARQL endpoints and brTPF interfaces) ([`@comunica/actor-rdf-join-inner-multi-bind-source`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-bind-source)),\nthe **_bound-join_ technique is applied**
(FedX).\nThird, if sources accept `FILTER` operations, then these **`FILTER` operations can be pushed down into the sources** that accept them ([`@comunica/actor-optimize-query-operation-filter-pushdown`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-filter-pushdown)).\nFourth, if some operations will not produce any results based on prior `COUNT` or `ASK` queries,\nthen these **empty source-specific operations will be pruned away** ([`@comunica/actor-optimize-query-operation-prune-empty-source-operations`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-prune-empty-source-operations)).\n\nEnd-users of Comunica will see a significant performance improvement when federating across multiple sources,\nespecially if some of those sources would be SPARQL endpoints.\nBelow, you can find some high-level performance comparisons of queries in Comunica 2.x vs 3.x.\n\n| Query | Comunica 2.x | Comunica 3.x |\n|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------|--------------------------|\n| [Books by San Franciscans in Harvard Library (DBpedia 
TPF)](http://query.linkeddatafragments.org/#transientDatasources=%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen;%2F%2Fdata.linkeddatafragments.org%2Fviaf;%2F%2Fdata.linkeddatafragments.org%2Fharvard&query=SELECT%20%3Fperson%20%3Fname%20%3Fbook%20%3Ftitle%20%7B%0A%20%20%3Fperson%20dbpedia-owl%3AbirthPlace%20%5B%20rdfs%3Alabel%20%22San%20Francisco%22%40en%20%5D.%0A%20%20%3FviafID%20schema%3AsameAs%20%3Fperson%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20schema%3Aname%20%3Fname.%0A%20%20%3Fbook%20dc%3Acontributor%20%5B%20foaf%3Aname%20%3Fname%20%5D%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20dc%3Atitle%20%3Ftitle.%0A%7D%20LIMIT%20100) | 5774.32 ms (669 requests) | 4923.86 ms (334 requests) |\n| [Books by San Franciscans in Harvard Library (DBpedia SPARQL)](http://query.linkeddatafragments.org/#datasources=https%3A%2F%2Fdbpedia.org%2Fsparql&transientDatasources=%2F%2Fdata.linkeddatafragments.org%2Fviaf;%2F%2Fdata.linkeddatafragments.org%2Fharvard&query=SELECT%20%3Fperson%20%3Fname%20%3Fbook%20%3Ftitle%20%7B%0A%20%20%3Fperson%20dbpedia-owl%3AbirthPlace%20%5B%20rdfs%3Alabel%20%22San%20Francisco%22%40en%20%5D.%0A%20%20%3FviafID%20schema%3AsameAs%20%3Fperson%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20schema%3Aname%20%3Fname.%0A%20%20%3Fbook%20dc%3Acontributor%20%5B%20foaf%3Aname%20%3Fname%20%5D%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20dc%3Atitle%20%3Ftitle.%0A%7D%20LIMIT%20100) | Timeout | 8469.86 ms (632 requests) |\n| [Compounds in Lindas and 
Rhea](http://query.linkeddatafragments.org/#datasources=sparql%40https%3A%2F%2Flindas.admin.ch%2Fquery;https%3A%2F%2Fsparql.rhea-db.org%2Fsparql&query=PREFIX%20schema%3A%20%3Chttp%3A%2F%2Fschema.org%2F%3E%0ASELECT%20*%20WHERE%20%7B%0A%20%20%3Fsubstance%20a%20schema%3ADefinedTerm%20%3B%0A%20%20%20%20schema%3Aidentifier%20%3Fidentifier%20%3B%0A%20%20%20%20schema%3AinDefinedTermSet%20%3Chttps%3A%2F%2Fld.admin.ch%2Fcube%2Fdimension%2Fel01%3E%20.%0A%20%20%3Fcompound%20%3Chttp%3A%2F%2Frdf.rhea-db.org%2Fformula%3E%20%3Fidentifier%20%3B%0A%20%20%20%20%3Chttp%3A%2F%2Frdf.rhea-db.org%2Fname%3E%20%3Fname%20%3B%0A%20%20%20%20%3Chttp%3A%2F%2Frdf.rhea-db.org%2Faccession%3E%20%3Faccession%20.%0A%7D%0A) | Timeout | 424.57 ms(41 requests) |\n\n### Inspecting source selection results\n\nIf you are interested in understanding how Comunica will split up queries across multiple sources,\nyou can make use of the [logical explain mode](/docs/query/advanced/explain/).\n\nFor example, if we want to execute the following query across three sources\n(https://dbpedia.org/sparql (SPARQL), http://data.linkeddatafragments.org/viaf (TPF), http://data.linkeddatafragments.org/harvard (TPF)),\nthe logical explain mode will show us how this query is split up and assigned to each source.\n\n**Query:**\n```txt\nSELECT ?person ?name ?book ?title {\n ?person dbpedia-owl:birthPlace [ rdfs:label "San Francisco"@en ].\n ?viafID schema:sameAs ?person;\n schema:name ?name.\n ?book dc:contributor [ foaf:name ?name ];\n dc:title ?title.\n} LIMIT 100\n```\n\n**Explain:**\n```txt\ncomunica-sparql \\\n https://dbpedia.org/sparql http://data.linkeddatafragments.org/viaf http://data.linkeddatafragments.org/harvard \\\n -f query.sparql --explain logical\n{\n "type": "slice",\n "input": {\n "type": "project",\n "input": {\n "type": "join",\n "input": [\n {\n "type": "join",\n "input": [\n {\n "type": "union",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": 
"viafID"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://schema.org/sameAs"\n },\n "object": {\n "termType": "Variable",\n "value": "person"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "viafID"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://schema.org/sameAs"\n },\n "object": {\n "termType": "Variable",\n "value": "person"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/viaf)(SkolemID:1)"\n }\n }\n ]\n },\n {\n "type": "union",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "g_1"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://xmlns.com/foaf/0.1/name"\n },\n "object": {\n "termType": "Variable",\n "value": "name"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "g_1"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://xmlns.com/foaf/0.1/name"\n },\n "object": {\n "termType": "Variable",\n "value": "name"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/harvard)(SkolemID:2)"\n }\n }\n ]\n },\n {\n "type": "union",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "book"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": 
"http://purl.org/dc/terms/title"\n },\n "object": {\n "termType": "Variable",\n "value": "title"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "book"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://purl.org/dc/terms/title"\n },\n "object": {\n "termType": "Variable",\n "value": "title"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/harvard)(SkolemID:2)"\n }\n }\n ]\n }\n ]\n },\n {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "person"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://dbpedia.org/ontology/birthPlace"\n },\n "object": {\n "termType": "Variable",\n "value": "g_0"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "g_0"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://www.w3.org/2000/01/rdf-schema#label"\n },\n "object": {\n "termType": "Literal",\n "value": "San Francisco",\n "language": "en",\n "datatype": {\n "termType": "NamedNode",\n "value": "http://www.w3.org/1999/02/22-rdf-syntax-ns#langString"\n }\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ],\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "viafID"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": 
"http://schema.org/name"\n },\n "object": {\n "termType": "Variable",\n "value": "name"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/viaf)(SkolemID:1)"\n }\n }\n ]\n },\n {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "book"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://purl.org/dc/terms/contributor"\n },\n "object": {\n "termType": "Variable",\n "value": "g_1"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/harvard)(SkolemID:2)"\n }\n }\n ]\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "person"\n },\n {\n "termType": "Variable",\n "value": "name"\n },\n {\n "termType": "Variable",\n "value": "book"\n },\n {\n "termType": "Variable",\n "value": "title"\n }\n ]\n },\n "start": 0,\n "length": 100\n}\n```\n\nThe `scopedSource` annotations on operations show which sources apply to which sources.\nThe above shows that most of the query will be split at triple pattern level to the different sources,\nexcept for the patterns `?person dbpedia-owl:birthPlace [ rdfs:label "San Francisco"@en ].`,\nwhich have been identified as exclusively applying to https://dbpedia.org/sparql,\nwhich can therefore be sent as-is to the SPARQL endpoint.\n\nHereafter, this post will discuss the internal changes in more detail for developers\nthat want to update their implementations to this new architecture.\n\n## \uD83D\uDD0D Query Source Identify bus\n\n[`@comunica/bus-query-source-identify`](https://github.com/comunica/comunica/tree/master/packages/bus-query-source-identify) is a new bus that roughly\nreplace the `@comunica/bus-rdf-resolve-quad-pattern` and 
`@comunica/bus-rdf-resolve-quad-pattern-hypermedia` buses.\nThe main difference is that `@comunica/bus-query-source-identify` runs _before_ query execution within the `@comunica/bus-context-preprocess` bus,\nwhile the old buses ran _during_ query execution.\nRunning this logic before query execution opens up more optimization opportunities,\nwhich enabled actors such as [`@comunica/actor-optimize-query-operation-filter-pushdown`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-filter-pushdown) and [`@comunica/actor-optimize-query-operation-prune-empty-source-operations`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-prune-empty-source-operations).\n\nIf you had an actor on the `@comunica/bus-rdf-resolve-quad-pattern` or `@comunica/bus-rdf-resolve-quad-pattern-hypermedia` bus,\nit can now be moved to the `@comunica/bus-query-source-identify` or `@comunica/bus-query-source-identify-hypermedia` bus.\nThe main API change here is that sources now need to implement the `IQuerySource` interface,\nthat they need to announce the shape of query operations they support (instead of only quad patterns),\nand that these operations need to be executable within the source.\n\n## \uD83D\uDE8C Query Process bus\n\n[`@comunica/bus-query-process`](https://github.com/comunica/comunica/tree/master/packages/bus-query-process) is a new bus that contains all logic for fully processing a query,\nwhich usually involves steps such as parsing, optimizing, and evaluating, which can be delegated to other buses.\nAll of this logic was previously contained within [`@comunica/actor-init-query`](https://github.com/comunica/comunica/tree/master/packages/actor-init-query),\ntogether with a lot of other boilerplate logic,\nwhich made it very difficult for developers to modify a small part of the query process.\nWith this new bus, developers can more easily plug in custom query process 
actors,\nsuch as _adaptive_ query planners.\n\n## Full changelog\n\nWhile this blog post explained the primary changes in Comunica 3.x,\nthere are many more small internal changes that will make your life easier.\nIf you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v301---2024-03-19).\n'},45609:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 3.1: \uD83C\uDF31 New package with tiny bundle size'\n---\n\nThe primary addition in this release is the new [`@comunica/query-sparql-rdfjs-lite`](https://www.npmjs.com/package/@comunica/query-sparql-rdfjs-lite) package,\nwhich is optimized for small browser bundle size.\nCurrently, the minified size of this package is 648.88 KB (145.79 KB when gzipped).\nThis is about as small as you can get without removing required functionality from the SPARQL 1.1 spec.\nBut if you don't need everything from SPARQL 1.1, it could get even smaller!\n\n\n\nBesides this, several fixes were applied, and we made some internal changes to our CI to better\n[track the browser bundle size](https://github.com/comunica/comunica/commit/f212b9262f5d2a12a40848f01132299904dc132c)\nand [overall query performance](https://github.com/comunica/comunica/commit/1d8b0d202a7d4728e3692764b33d8795686ce5a0) over time.\n\n## Full changelog\n\nIf you want to learn more about all changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v310---2024-05-11).\n"},72212:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 3.2: \uD83D\uDD0E Knowing what to optimize\'\n---\n\nFor this release, we mainly focused on improving tooling to more easily track down performance issues.\nConcretely, we improved our query explain output,\nstarted running multiple benchmarks in our CI to avoid performance regressions,\nand applied several performance improvements that were identified following these 
changes.\n\n\n\n## \uD83D\uDD0E Query explain improvements\n\nComunica has had several [query explain functionalities](/docs/query/advanced/explain/) for a while now,\nwhich show how a query is parsed, optimized (logical), and executed (physical).\nHowever, the physical plan output tended to be very verbose, which made it difficult to draw conclusions from it.\n\nIn this update, the physical plan output has undergone three main changes:\n\n1. The output of joins (especially bind joins) is _compacted_, so that recurring patterns in sub-plans are not repeated. Instead, a counter is added showing how many times a certain sub-plan was executed.\n2. The default output is a compact text representation instead of the previous JSON output. (The old JSON representation is still available by passing the `physical-json` explain value.)\n3. Additional metadata is emitted, such as cardinalities and execution times.\n\nFor example, outputs such as the following can now be obtained:\n\n```bash\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT ?movie ?title ?name\nWHERE {\n ?movie dbpedia-owl:starring [ rdfs:label "Brad Pitt"@en ];\n rdfs:label ?title;\n dbpedia-owl:director [ rdfs:label ?name ].\n FILTER LANGMATCHES(LANG(?title), "EN")\n FILTER LANGMATCHES(LANG(?name), "EN")\n}\' --explain physical\n```\n```text\nproject (movie,title,name)\n join\n join-inner(bind) bindOperation:(?g_0 http://www.w3.org/2000/01/rdf-schema#label "Brad Pitt"@en) bindCardEst:~2 cardReal:43 timeSelf:2.567ms timeLife:667.726ms\n join compacted-occurrences:1\n join-inner(bind) bindOperation:(?movie http://dbpedia.org/ontology/starring http://dbpedia.org/resource/Brad_Pitt) bindCardEst:~40 cardReal:43 timeSelf:6.011ms timeLife:641.139ms\n join compacted-occurrences:38\n join-inner(bind) bindOperation:(http://dbpedia.org/resource/12_Monkeys http://dbpedia.org/ontology/director ?g_1) bindCardEst:~1 cardReal:1 timeSelf:0.647ms timeLife:34.827ms\n filter compacted-occurrences:1\n 
join\n join-inner(nested-loop) cardReal:1 timeSelf:0.432ms timeLife:4.024ms\n pattern (http://dbpedia.org/resource/12_Monkeys http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (http://dbpedia.org/resource/Terry_Gilliam http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~1 src:0\n join compacted-occurrences:2\n join-inner(multi-empty) timeSelf:0.004ms timeLife:0.053ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://dbpedia.org/ontology/director ?g_1) cardEst:~0 src:0\n filter cardEst:~5,188,789.667\n join\n join-inner(nested-loop) timeLife:0.6ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n join compacted-occurrences:1\n join-inner(multi-empty) timeSelf:0.053ms timeLife:0.323ms\n pattern (?movie http://dbpedia.org/ontology/director ?g_1) cardEst:~118,505 src:0\n pattern (?movie http://dbpedia.org/ontology/starring http://wikidata.dbpedia.org/resource/Q35332) cardEst:~0 src:0\n filter cardEst:~242,311,843,844,161\n join\n join-inner(symmetric-hash) timeLife:36.548ms\n pattern (?movie http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~20,013,903 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n\nsources:\n 0: QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)\n```\n\n## ⚙️ Continuous performance tracking\n\nIn order to keep better track of the evolution of Comunica\'s performance,\nwe have added continuous performance tracking into our continuous integration.\nFor various benchmarks, we can now see the evolution of execution times across our commit history.\nThis allows us to easily identify which changes have a positive or negative impact on performance.\n\nFor considering the performance for different aspects, we have included the following benchmarks:\n\n- 
WatDiv (in-memory)\n- WatDiv (TPF)\n- Berlin SPARQL Benchmark (in-memory)\n- Berlin SPARQL Benchmark (TPF)\n- Custom web queries: manually crafted queries to test for specific edge cases over the live Web\n\nThis allows us to inspect performance as follows:\n\n
\n_(Graphs showing the evolution of benchmark execution times across commits.)_\n
\n\n_Fluctuations in the graph are mainly caused by confounding variables in the GitHub Actions environment, such as running on different hardware and runner versions._\n\nThese results can be inspected in [closer detail](https://github.com/comunica/comunica-performance-results) together with per-query execution times.\n\n## \uD83C\uDFCE️ Performance improvements\n\nThanks to the improvements to our physical query plan output and the continuous performance tracking,\nwe identified several low-hanging opportunities for improving performance:\n\n- [Addition of a hash-based optional join actor](https://github.com/comunica/comunica/commit/de90db0140cd10e2bfdf23c26f9eeff5e94f3ef2)\n- [Tweaking constants of our internal join cost model](https://github.com/comunica/comunica/commit/50333c92ed1cf5410f172f608a213424e510986e)\n- [Making optional hash and bind join only work with common variables](https://github.com/comunica/comunica/commit/df40c20e001121cd0ae9a9adf67ed221dc2966ba)\n\nBesides these changes, we have many more performance-impacting changes in the pipeline for upcoming releases!\n\n## Full changelog\n\nBesides this, several fixes were applied, as well as various changes and additions.\nIf you want to learn more about all changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v320---2024-07-05).\n'},4950:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Contribute'\ndescription: 'Contribute to the development of Comunica.'\n---\n\n## Report bugs or request features\n\nThe easiest way to contribute to Comunica is by **reporting the bugs** you encounter,\nand **requesting new features** or enhancements.\n\nBoth of these should be done via [**GitHub issues**](https://github.com/comunica/comunica/issues).\nMake sure to be as descriptive as possible, and completely fill in the requested template.\n\n## Fix bugs or implement new features\n\nIf there is a certain bug that annoys you,\nor if you see the 
opportunity for a new feature that would make your life easier,\nyou are welcome to contribute by submitting a **pull request**.\nBefore you open a pull request, it is considered good practice to first\n[open an issue](https://github.com/comunica/comunica/issues) or [discuss it with the community](/ask/).\n\nDon't know where to get started? Have a look at issues tagged with the [`good-first-issue`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue) label\nor the [`dev-ready`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Adev-ready) label.\nIssues tagged with [`good-first-issue`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue) are issues that should be implementable by new contributors.\nIssues tagged with [`dev-ready`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Adev-ready) are potentially harder issues, but they are directly implementable without research.\n\nWhen contributing, make sure to keep in mind the following:\n* Read how to [set up a development environment](https://github.com/comunica/comunica#development-setup).\n* Read the guide on [contributing an actor](/docs/modify/getting_started/contribute_actor/).\n* Commit messages:\n * [Use descriptive, imperative commit messages](https://chris.beams.io/posts/git-commit/). These commit messages will be used as input for release changelogs. Have a look at the [commit history](https://github.com/comunica/comunica/commits/master) for examples.\n * Commit messages should include a reference to relevant issues. 
For example, `Closes #123`, or `Related to #456`.\n* Pull requests should pass all checks:\n * Unit tests with 100% branching coverage (`yarn test`)\n * Clean code with passing linter (`yarn run lint`)\n * Code documentation\n * [Pass all spec and integration tests](/docs/modify/advanced/testing/)\n * Signing the [Contributor License Agreement](https://cla-assistant.io/comunica/comunica)\n* Only add the files that are needed, so don't blindly do a `git add -A`. (avoid adding editor-specific files)\n* A good editor can make your life a lot easier. For example, [WebStorm](https://www.jetbrains.com/community/education/#students) can be used for free with an academic license.\n* All JSDoc can be found at https://comunica.github.io/comunica/\n\nTips and tricks:\n* Only do `yarn install` in the repo root, and *never* in one of the sub-packages, as this can break your repo.\n* `yarn run build` will (re)build all TypeScript to JavaScript and generate Components.js files. These can also be invoked separately via `yarn run build:ts` and `yarn run build:components`. These can also be executed at package level.\n* `yarn run build-watch` will continuously build TypeScript to JavaScript and generate Components.js files, which is useful during development. These can also be invoked separately via `yarn run build-watch:ts` and `yarn run build-watch:components`.\n* `yarn test` and `yarn run lint` execute the tests and linter checks locally. Before a PR is opened, these must always pass, and testing coverage must be 100%.\n* When editing configuration files in packages like `query-sparql`, `yarn run prepare` can be executed to compile the JSON files to JavaScript before they can be executed. 
(not needed when executing dynamically)\n* When modifying a dependency package such as [sparqlee](https://github.com/comunica/sparqlee), [Yarn's link functionality](https://classic.yarnpkg.com/en/docs/cli/link/) can be used to force your local version of that dependency to be used in Comunica.\n\n## Write documentation\n\nThis website aims to provide detailed documentation on how to use and modify Comunica.\nIf you see an opportunity for improving this documentation, fixing mistakes, or adding new guides,\nyou are welcome to contribute via [GitHub](https://github.com/comunica/website).\n\n## Create example code\n\nThe [Comunica examples repository](https://github.com/comunica/examples) contains several example packages that modify Comunica,\nwith details on how they are created and how they work.\nAnyone is more than welcome to contribute new example packages to this repository.\nFor inspiration, you can have a look at the [example requests](https://github.com/comunica/examples/issues?q=is%3Aissue+is%3Aopen+label%3Aexample-request).\n\n## Guidelines for core developers\n\nThe following guidelines only apply to people with push access to the Comunica repositories.\n\n### Branching Strategy\n\nThe `master` branch is the main development branch.\n\nReleases are `tags` on the `master` branch.\n\nAll changes (features and bugfixes) must be done in a separate branch, and PR'd to `master`.\n\nRecursive features must be PR'd to their parent feature branches, as a feature can consist of multiple smaller features.\n\nThe naming strategy of branches is as follows:\n* Features: `feature/short-name-of-feature`\n* Bugfixes: `fix/short-name-of-fix`\n\n### Issue Strategy\n\nIssues should be assigned to people when possible, and must be progressed using the applicable GitHub project boards:\n\n* [Maintenance](https://github.com/orgs/comunica/projects/2)\n* [Development](https://github.com/orgs/comunica/projects/3)\n* 
[Documentation](https://github.com/orgs/comunica/projects/4)\n\nGeneral issue progress:\n\n1. Triage: If the issue is not yet accepted or assigned.\n2. To Do (3 levels of priority): When the issue is accepted and assigned, but not in progress yet.\n3. In Progress: When the issue is being worked on by the assignee, or is under review.\n4. Done: When the issue is resolved and reviewed. If attached to a PR, this can be merged, or closed otherwise.\n5. On hold: If the issue is awaiting external input.\n\n### Merging Pull Requests\n\nAll PRs must pass the following checklist:\n\n* All CI checks must pass. For unit tests, this includes 100% coverage, and coverage lines should not be skipped.\n* The PR must be approved by at least 2 [core maintainers](https://comunica.dev/association/board/).\n * If more than a week goes by, then the approval of 1 core maintainer is sufficient, unless another core maintainer explicitly indicated the desire for later review.\n * The codebase curator can always merge immediately.\n* If commits don't meet the commit message guidelines from above, the \"Squash and merge\" functionality of GitHub must be used, and a new commit message must be created. Otherwise, PRs can be merged via the \"Rebase\" button.\n\n### Making a new release\n\nMaking a new release only requires invoking `yarn run publish-release` from the repository root, which does the following using [lerna](https://github.com/lerna/lerna):\n\n* Prompts you for the new version (major, minor, patch).\n* Bumps the versions of all changed packages.\n* [Generates a changelog](https://github.com/rubensworks/manual-git-changelog.js) from all commits since the last release. The process will halt until you modify (and save) the changelog where needed (remove unneeded commits, and categorize them), and confirm by pressing any key in the console.\n* Releases all changed packages to npm.\n* Pushes the tag to GitHub.\n* Pushes to master.\n\n
\nIf publication fails due to a random NPM server error,\nyou can invoke the [`retry-publish.sh`](https://github.com/comunica/comunica/blob/master/.github/retry-publish.sh) script to retry the publication.\nThis script can safely be called multiple times.\nYou may have to stash your repo first.\n
\n\n### Making a new pre-release\n\nMaking a new pre-release only requires invoking `yarn run publish-canary` from the repository root, which does the following using [lerna](https://github.com/lerna/lerna):\n\n* Temporarily do a patch release increment on all packages in the form of `-alpha..0`.\n* Release all packages to npm with the `next` tag.\n* Undo the temporary changes.\n\nPre-releases do not trigger changelog changes, git commits, or pushes.\n\nIf the lerna script exits with an error, you may notice some issues with git. In that case, make sure to execute the following:\n\n```bash\ngit update-index --no-assume-unchanged $(git ls-files | tr '\\\\n' ' ') && git checkout .\n```\n"},75835:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Documentation'\ndescription: 'Overview of all Comunica documentation.'\nindex: true\n---\n\nYou can either use Comunica to execute queries, or modify it to suit your specific goals.\n\nLooking for the [code documentation](https://comunica.github.io/comunica/) instead?\n\n
\nWatch some of these guides in action live within this Webinar recording.\n
\n"},17642:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Query with Comunica'\ndescription: 'Learn how to execute queries in different environments. Such as live in the browser, in JavaScript applications, or the CLI.'\nindex: true\n---\n\nThe following guides explain how to execute queries in different environments,\nsuch as live in the browser, in JavaScript applications, or the CLI.\n\n
\nWatch some of these guides in action live within this Webinar recording.\n
\n"},62712:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Getting started with querying'\ndescription: 'Basic guides on how to easily get started with querying.'\nindex: true\n---\n\nThe following guides explain some basic ways in which you can use Comunica for querying.\n"},97750:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying from the command line\'\ndescription: \'Execute SPARQL queries directly from the command line.\'\n---\n\nThe default Comunica query engine that exposes most standard features is Comunica SPARQL,\nwhich uses the package name `@comunica/query-sparql`.\nIn this guide, we will install it _globally_, and show how it can be invoked from the command line.\n\n
\nWatch part of this guide in action live within this Webinar recording.\n
\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install Comunica SPARQL on our machine:\n```bash\n$ npm install -g @comunica/query-sparql\n```\n\n## 2. SPARQL querying over one source\n\nAfter installing Comunica SPARQL, you will be given access to several commands, including `comunica-sparql`,\nwhich allows you to execute SPARQL queries from the command line.\n\nThis command requires one or more URLs to be provided as **sources** to query over.\nAs the last argument, a **SPARQL query string** can be provided.\n\nFor example, the following query retrieves the first 100 triples from [DBpedia](https://fragments.dbpedia.org/2016-04/en):\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n
\nGiven a URL, Comunica will automatically detect the type of source and handle it accordingly.\n
\n\nAs output, a JSON array of bindings for the selected variables will be returned:\n```\n[\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/1999/02/22-rdf-syntax-ns#type","?o":"http://rdfs.org/ns/void#datasource"},\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/1999/02/22-rdf-syntax-ns#type","?o":"http://www.w3.org/ns/hydra/core#Collection"},\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/ns/hydra/core#search","?o":"https://fragments.dbpedia.org/2016-04/en#triplePattern"}\n...\n``` \n\n## 3. Query file input\n\nSince SPARQL queries can sometimes become very large, it is possible to supply them via a local file using the `-f` option.\n\nAssuming a file `path/myquery.sparql` exists, we can query over it as follows:\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en -f path/myquery.sparql\n```\n\n## 4. SPARQL querying over multiple sources\n\nOne key feature of Comunica is its ability to query over **multiple sources**.\nFor this, you can just supply any number of URLs as arguments.\nJust make sure that the last argument remains your query.\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/ \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n## 5. SPARQL CONSTRUCT and ASK\n\nNext to SPARQL `SELECT` queries,\nit is also possible to execute `CONSTRUCT` queries to produce RDF triples:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n```text\n "2010-04-21"^^;\n "1939-01-02"^^;\n "PDF";\n ;\n "Sheboygan, Wisconsin";\n "1";\n...\n```\n\n`ASK` queries will produce a boolean output:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "ASK { ?s ?p ?o }"\n```\n```\ntrue\n```\n\n## 6. 
Changing result format\n\n`SELECT` queries will be printed as JSON by default, and `CONSTRUCT` queries as [RDF TriG](https://www.w3.org/TR/trig/).\nThis can be overridden using the `-t` option.\n\nFor example, displaying results as SPARQL JSON results:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100" \\\n -t \'application/sparql-results+json\'\n```\n```json\n{"head": {"vars":["s","p","o"]},\n"results": { "bindings": [\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/date","type":"uri"},"o":{"value":"1899-05-06","type":"literal","datatype":"http://www.w3.org/2001/XMLSchema#date"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/isCitedBy","type":"uri"},"o":{"value":"http://dbpedia.org/resource/Tierce_(unit)","type":"uri"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/newspaper","type":"uri"},"o":{"value":"Biloxi Daily 
Herald","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/page","type":"uri"},"o":{"value":"6","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/title","type":"uri"},"o":{"value":"A New System of Weights and Measures","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/url","type":"uri"},"o":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"}},\n...\n``` \n\n
\nAll available formats can be printed via `comunica-sparql --listformats`.\n
\n\n## 7. Printing the query plan\n\nUsing the `--explain` option, the query plan can be printed via [different explain modes](/docs/query/advanced/explain/).\n\n## 8. Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql`.\nYou can learn more options by invoking the _help_ command:\n```text\n$ comunica-sparql evaluates SPARQL queries\n\nRecommended options:\n -q, --query Evaluate the given SPARQL query string [string]\n -f, --file Evaluate the SPARQL query in the given file [string]\n -i, --inputType Query input format (e.g., graphql, sparql) [string] [default: "sparql"]\n -t, --outputType MIME type of the output (e.g., application/json) [string]\n\nOptions:\n -c, --context Use the given JSON context string or file (e.g., config.json) [string]\n --to Destination for update queries [string]\n -b, --baseIRI base IRI for the query (e.g., http://example.org/) [string]\n -d, --dateTime Sets a datetime for querying Memento-enabled archives [string]\n -l, --logLevel Sets the log level (e.g., debug, info, warn, ...) [string] [default: "warn"]\n --lenient If failing requests and parsing errors should be logged instead of causing a hard crash [boolean]\n -v, --version Prints version information [boolean]\n --showStackTrace Prints the full stacktrace when errors are thrown [boolean]\n --httpTimeout HTTP requests timeout in milliseconds [number]\n --httpBodyTimeout Makes the HTTP timeout take into account the response body stream read [boolean]\n --httpRetryCount The number of retries to perform on failed fetch requests [number]\n --httpRetryDelay The number of milliseconds to wait between fetch retries [number]\n --httpRetryOnServerError If fetch should be retried on 5xx server error responses, instead of being resolved. 
[boolean]\n --unionDefaultGraph If the default graph should also contain the union of all named graphs [boolean]\n --noCache If the cache should be disabled [boolean]\n --distinctConstruct If the query engine should deduplicate resulting triples [boolean]\n -p, --proxy Delegates all HTTP traffic through the given proxy (e.g. http://myproxy.org/?uri=) [string]\n --listformats Prints the supported MIME types [boolean]\n --explain Print the query plan [string] [choices: "parsed", "logical", "physical"]\n --localizeBlankNodes If blank nodes should be localized per bindings entry [boolean]\n -r, --recoverBrokenLinks Use the WayBack machine to recover broken links [boolean] [default: false]\n\nExamples:\n comunica-sparql https://fragments.dbpedia.org/2016-04/en -q \'SELECT * { ?s ?p ?o }\'\n comunica-sparql https://fragments.dbpedia.org/2016-04/en -f query.sparql\n comunica-sparql https://fragments.dbpedia.org/2016-04/en https://query.wikidata.org/sparql ...\n comunica-sparql hypermedia@https://fragments.dbpedia.org/2016-04/en sparql@https://query.wikidata.org/sparql ...\n```\n'},20919:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Updating from the command line\'\ndescription: \'Execute SPARQL Update queries directly from the command line.\'\n---\n\nComunica SPARQL (`@comunica/query-sparql`) allow you to initiate queries to _update_ data in a certain store.\nIn this guide, we will build upon [the guide on querying from the command line](/docs/query/getting_started/query_cli/),\nand show how you can not only read, but also update data.\n\n
\nAt the time of writing, not all destination types may be supported yet.\n
\n\n## 1. Updating one source\n\nUsing the `comunica-sparql` command line tool,\nyou can invoke not only read queries, but also update queries.\n\nAssuming you pass just one source,\nthis source will also be assumed to be the destination for update queries.\n\nFor example, the following query appends a single triple to `https://example.org/myfile.ttl`:\n```bash\n$ comunica-sparql https://example.org/myfile.ttl \\\n "INSERT DATA { }"\n```\n\n
\nGiven a URL, Comunica will automatically detect the type of destination and handle it accordingly.\n
\n\nAs output, `ok` will be printed if the update was successful:\n```\nok\n``` \n\n## 2. Updating a different destination\n\nWhile Comunica supports querying over **multiple sources**,\nit only supports updating **a single destination**.\n\nTherefore, if you are querying over multiple sources,\nbut you want to pass the results to a single destination,\nthen you must explicitly define this destination using the `--to` option.\n\nFor example, the following query takes the first 100 triples from 3 sources,\nand inserts them into `https://example.org/myfile.ttl`:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/ \\\n --to https://example.org/myfile.ttl \\\n "INSERT { ?s ?p ?o. } WHERE { SELECT * WHERE { ?s ?p ?o } LIMIT 100 }"\n```\n\n
\nHere, the type of destination is also detected automatically,\nand it can likewise be overridden.\n
\n'},48016:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying local files from the command line\'\ndescription: \'Execute SPARQL queries over local RDF files directly from the command line.\'\n---\n\nUsing Comunica SPARQL File, you can query over RDF files that are stored on your local machine.\n\n
\nWhile Comunica SPARQL allows you to query sources exposed via URLs on the command line,\nit does not allow you to query local RDF files.\nThis is because Comunica SPARQL can be used in a variety of use cases, of which deployment on a public server is one.\nIn some of these cases, the ability to access the local file system can pose a major security risk,\nwhich is why a separate package is required.\n
\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install Comunica SPARQL File on our machine:\n```bash\n$ npm install -g @comunica/query-sparql-file\n```\n\n## 2. SPARQL querying over one local file\n\nAfter installing Comunica SPARQL File, you will be given access to several commands including `comunica-sparql-file`,\nwhich allows you to execute SPARQL queries from the command line.\n\nJust like `comunica-sparql`, this command requires one or more URLs or file paths to be provided as **sources** to query over.\nAs the last argument, a **SPARQL query string** can be provided.\n\nFor example, the following query retrieves the first 100 triples from `path/to/my/file.ttl`:\n```bash\n$ comunica-sparql-file path/to/my/file.ttl \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n## 3. SPARQL querying over one remote file\n\nNext to local files, _remote_ files identified by a URL can also be queried:\n```bash\n$ comunica-sparql-file https://www.rubensworks.net/ \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n## 4. Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql-file`.\nYou can learn more options by invoking the _help_ command, or by [reading the Comunica SPARQL documentation](/docs/query/getting_started/query_cli/):\n```text\n$ comunica-sparql-file --help\n```\n'},95276:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying in a JavaScript app'\ndescription: 'Execute SPARQL queries from within your application using the JavaScript API.'\n---\n\nThe default Comunica query engine that exposes most standard features is Comunica SPARQL,\nwhich uses the package name `@comunica/query-sparql`.\nIn this guide, we will install it as a dependency in a [Node.js](https://nodejs.org/en/) JavaScript application,\nand show how it can be used to execute queries.\n\n
\nWatch part of this guide in action live within this Webinar recording.\n
\n\n## 1. Installation\n\n
\nThis assumes you already have an npm package.\nIf you don't have one yet, create one using `npm init`.\nYou will also need a JavaScript file to write in, such as `main.js`.\n
\n\nIn order to add Comunica SPARQL as a _dependency_ to your [Node.js](https://nodejs.org/en/) application,\nwe can execute the following command:\n```bash\n$ npm install @comunica/query-sparql\n```\n\n## 2. Creating a new query engine\n\nThe easiest way to create an engine is as follows:\n\n```javascript\nconst QueryEngine = require('@comunica/query-sparql').QueryEngine;\n\nconst myEngine = new QueryEngine();\n```\n\nYou should reuse an engine as much as possible.\nThis is especially valuable if you repeatedly query over the same sources,\nas [caching](/docs/query/advanced/caching/) can be performed.\n\n## 3. Executing SPARQL SELECT queries\n\nOnce your engine has been created, you can use it to execute any SPARQL query, such as a `SELECT` query:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n});\n```\n\nThe first argument of `queryBindings()` is a SPARQL query string,\nand the second argument is a [query context](/docs/query/advanced/context/) containing options,\nwhich must at least contain an array of sources to query over.\n\nThe resulting `bindingsStream` is a stream of **bindings**,\nwhere each binding contains values for the selected variables (`?s ?p ?o`).\n\n
\nWhile `sources` is the only required option in the query context,\nadditional options can be passed\nto tweak how the engine executes the query.\n
\n\n### 3.1 Consuming binding results as a stream\n\nThe most efficient way to make use of the result,\nis by adding a **data-listener** to the `bindingsStream`:\n```javascript\nbindingsStream.on('data', (binding) => {\n console.log(binding.toString()); // Quick way to print bindings for testing\n\n console.log(binding.has('s')); // Will be true\n \n // Obtaining values\n console.log(binding.get('s').value);\n console.log(binding.get('s').termType);\n console.log(binding.get('p').value);\n console.log(binding.get('o').value);\n});\n```\n\nThe data-listener will be invoked _for each resulting binding_,\nas soon as the query engine has detected it.\nThis means that the data-listener can be invoked many times during query execution,\neven if not all results are available yet.\n\nEach `binding` is an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object\nthat contains mappings from variables to RDF terms.\nVariable names can either be obtained by string label (without the `?` prefix) or via [RDF/JS](/docs/query/advanced/rdfjs/) variable objects,\nand bound RDF terms are represented as [RDF/JS](/docs/query/advanced/rdfjs/) terms.\nLearn more about the usage of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\nTo find out when the query execution has **ended**,\nand all results are passed to the data-listener,\nan **end-listener** can be attached as well.\n```javascript\nbindingsStream.on('end', () => {\n // The data-listener will not be called anymore once we get here.\n});\n```\n\nIt is also considered good practise to add an **error-listener**,\nso you can detect any problems that have occurred during query execution:\n```javascript\nbindingsStream.on('error', (error) => {\n console.error(error);\n});\n```\n\n### 3.2 Consuming binding results as an async iterable\n\nUsing a for-await loop, you can consume bindings as an [async 
iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols).\nWhile this is more compact than the stream-based approach, it may lead to a slightly lower level of performance:\n\n```javascript\nfor await (const bindings of bindingsStream) {\n console.log(bindings.get('s').value);\n console.log(bindings.get('s').termType);\n}\n```\n\n### 3.3 Consuming binding results as an array\n\nIf performance is not an issue in your application,\nor you just want the results in a simple array,\nthen you can call the asynchronous `toArray()` method on the `bindingsStream`:\n\n```javascript\nconst bindings = await bindingsStream.toArray();\n\nconsole.log(bindings[0].get('s').value);\nconsole.log(bindings[0].get('s').termType);\n```\n\nThis method will return asychronously (using `await`) as soon as _all_ results have been found.\nIf you have many results, it is recommended to consume results iteratively via a data listener instead.\n\nEach binding in the array is again an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object.\n\nIf you want to limit the number of results in the array, you can optionally pass a limit:\n```javascript\nconst bindings = await bindingsStream.toArray({ limit: 100 });\n```\n\n## 4. Executing queries over multiple sources\n\nQuerying over more than one source is trivial,\nas any number of sources can easily be passed via an array:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\n 'http://fragments.dbpedia.org/2015/en',\n 'https://www.rubensworks.net',\n 'https://ruben.verborgh.org/profile/',\n ],\n});\n```\n\n## 5. 
Executing SPARQL CONSTRUCT queries\n\nNext to `SELECT` queries, you can also execute a `CONSTRUCT` query to generate RDF quads/triples:\n```javascript\nconst quadStream = await myEngine.queryQuads(`\n CONSTRUCT WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n```\n\n### 5.1 Consuming quad results as a stream\n\nThe most efficient way to make use of the resulting RDF quads,\nis by adding a **data-listener** to the `quadStream`:\n```javascript\nquadStream.on('data', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\nThe data-listener will be invoked _for each constructed RDF triple/quad_,\nas soon as the query engine has created it.\nThis means that the data-listener can be invoked many times during query execution,\neven if not all results are available yet.\n\nEach `quad` is an [RDF/JS](/docs/query/advanced/rdfjs/) quad,\nwhich contain `subject`, `predicate`, `object` and `graph` terms.\n\nJust like `bindingsStream`, **end-listener** and **error-listener** can also be attached:\n\n```javascript\nquadStream.on('end', () => {\n // The data-listener will not be called anymore once we get here.\n});\nquadStream.on('error', (error) => {\n console.error(error);\n});\n```\n\n### 5.2 Consuming quad results as an async iterable\n\nJust like with binding results,\nquads can also be consumed using for-await.:\n\n```javascript\nfor await (const quad of quadStream) {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n}\n\n```\n\n### 5.3 Consuming quad results as an array\n\nJust like with binding results,\nif performance is not an issue in your application,\nor you just want the results in a simple array,\nthen you can call the asynchronous `toArray()` method on the `bindingsStream`:\n\n```javascript\nconst quads = await 
quadStream.toArray();\n\nconsole.log(quads[0].subject.value);\nconsole.log(quads[0].predicate.value);\nconsole.log(quads[0].object.value);\nconsole.log(quads[0].graph.value);\n```\n\nThis method will return asychronously (using `await`) as soon as _all_ results have been found.\nIf you have many results, it is recommended to consume results iteratively via a data listener instead.\n\nEach `quad` is again an [RDF/JS](/docs/query/advanced/rdfjs/) quad,\nwhich contain `subject`, `predicate`, `object` and `graph` terms.\n\n## 6. Executing SPARQL ASK queries\n\nOne of the simplest forms SPARQL is the ASK query,\nwhich can be executed in Comunica as follows:\n```javascript\nconst hasMatches = await myEngine.queryBoolean(`\n ASK {\n ?s ?p \n }`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n})\n```\n\nThe value of `hasMatches` indicates if the query has at least one result. \n\n## 7. Executing a generic query\n\nIf you don't know beforehand if your query is a `SELECT`, `CONSTRUCT`, or `ASK` (e.g. if your app accepts queries via user input),\nthen you can make use of the generic `query` method that supports all query types:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n\nif (result.resultType === 'bindings') {\n const bindingsStream = await result.execute();\n\n bindingsStream.on('data', (binding) => {\n console.log(binding.toString());\n });\n}\n```\n\nThe resulting object represents a _future_ to the query results.\nIf has a field `resultType` that indicates the query and result type, which can be `'bindings'`, `'quads'`, `'boolean'`, or `'void'`.\nThe asynchronous `execute` method effectively executes the query, and returns a result depending on the `resultType`, corresponding to the `queryBindings`, `queryQuads`, ... 
methods.\nFor example, if the result type is `'bindings'`, then the return type of `execute` will be a bindings stream.\n\nOptionally, you can also obtain metadata about the results via this `query` method for the `'bindings'` and `'quads'` result types:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n\nif (result.resultType === 'bindings') {\n const metadata = await result.metadata();\n console.log(metadata.cardinality);\n console.log(metadata.canContainUndefs);\n}\n```\n\n## 8. Serializing to a specific result format\n\nIf you want your application to output query results in a certain text-based format,\njust like [executing Comunica on the command line](/docs/query/getting_started/query_cli/),\nthen you can make use of the `resultToString()` method.\n\nFor example, serializing to SPARQL JSON can be done as follows:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\nconst { data } = await myEngine.resultToString(result,\n 'application/sparql-results+json');\ndata.pipe(process.stdout); // Print to standard output\n```\n\nThe `resultToString()` method accepts a query result and a result format media type.\nThe media type is optional, and will default to `application/json` for bindings, `application/trig` for quads, and `simple` for booleans.\n\n
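These defaults can be captured in a tiny lookup, shown here purely as an illustration; the helper itself is not part of the Comunica API:

```javascript
// Illustrative only: the default media types that resultToString() falls
// back to when no media type is given (this helper is not a Comunica API).
function defaultMediaType(resultType) {
  switch (resultType) {
    case 'bindings': return 'application/json';
    case 'quads': return 'application/trig';
    case 'boolean': return 'simple';
    default: throw new Error(`No default serialization for: ${resultType}`);
  }
}

console.log(defaultMediaType('quads')); // => application/trig
```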
\nAll available result formats can be retrieved programmatically\nby invoking the asynchronous `getResultMediaTypes()` method.\n
\n"},92421:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Updating in a JavaScript app\'\ndescription: \'Execute SPARQL Update queries from within your application using the JavaScript API.\'\n---\n\nComunica SPARQL (`@comunica/query-sparql`) allow you to initiate queries to _update_ data in a certain store.\nIn this guide, we will build upon [the guide on querying in a JavaScript app](/docs/query/getting_started/query_app/),\nand show how you can not only read, but also update data.\n\n## 1. Creating a new query engine and store\n\nThe easiest way to create an engine and store is as follows:\n\n```javascript\nconst QueryEngine = require(\'@comunica/query-sparql\').QueryEngine;\nconst N3 = require(\'n3\');\n\nconst myEngine = new QueryEngine();\n\nconst store = new N3.Store();\n```\n\nWe make use of the [`Store` from `N3.js`](https://github.com/rdfjs/N3.js#storing) for these examples.\n\n## 2. Executing INSERT DATA queries\n\nOnce you engine has been created, you can use it to execute any SPARQL Update query, such as a `INSERT DATA` query:\n```javascript\n// Initiate the update\nawait myEngine.queryVoid(`\n PREFIX dc: \n INSERT DATA\n { \n dc:title "A new book" ;\n dc:creator "A.N.Other" .\n }`, {\n sources: [ store ],\n});\n\n// Prints \'2\' => the store is updated\nconsole.log(store.size);\n```\n\n## 3. 
Executing DELETE/INSERT WHERE queries\n\n`DELETE/INSERT WHERE` queries allow you to delete and insert new quads,\nbased on quads that are already available:\n\n```javascript\n// Insert initial data\nawait myEngine.queryVoid(`\n PREFIX foaf: \n INSERT DATA\n { \n foaf:givenName "Bill" .\n foaf:familyName "McKinley" .\n foaf:givenName "Bill" .\n foaf:familyName "Taft" .\n foaf:givenName "Bill" .\n foaf:familyName "Clinton" .\n }`, {\n sources: [ store ],\n});\n\n// Rename all occurrences of "Bill" to "William"\nawait myEngine.queryVoid(`\n PREFIX foaf: \n DELETE { ?person foaf:givenName \'Bill\' }\n INSERT { ?person foaf:givenName \'William\' }\n WHERE\n {\n ?person foaf:givenName \'Bill\' \n }`, {\n sources: [ store ],\n});\n```\n\n
\nFor more information on the types of update queries that are possible, \nplease refer to the SPARQL Update specification.\n
\n\n## 4. Configure a custom destination\n\nBy default, update queries will modify data within the given source.\nIn some cases, you may want to direct changes to another place.\nFor example, if you have multiple sources, but you want to direct all changes to a single source.\n\nThis can be done by passing a `destination` into the query context:\n```javascript\n// Insert friends based on common friends from Ruben\'s\nawait myEngine.queryVoid(`\n PREFIX foaf: \n INSERT\n {\n foaf:knows ?friend\n }\n WHERE\n {\n foaf:knows ?friend .\n foaf:knows ?friend . \n }`, {\n sources: [\n \'https://www.rubensworks.net/\',\n \'https://ruben.verborgh.org/profile/\',\n ],\n destination: store,\n});\n```\n\n
\n'},51839:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying in a JavaScript browser app'\ndescription: 'Execute SPARQL queries from within your client-side browser application using the JavaScript API.'\n---\n\nComunica can run in both [Node.js JavaScript applications](/docs/query/getting_started/query_app/),\nand as **client-side applications in Web browsers**.\n\n## 1. Using a pre-built version\n\nThe easiest way to use Comunica in your Web app,\nis by using a pre-built Comunica SPARQL version that is served via a GitHub CDN:\n```html\n\n\n```\n\n
\nThe code example above will always make use of the latest Comunica version in the 2.x.x range.\nInstead, you can also pin a specific version.\n
\n\nThe full API of Comunica is available under the `Comunica` namespace.\nMore information on its usage can be found in the guide on\n[using Comunica in a JavaScript app](/docs/query/getting_started/query_app/).\n\n## 2. Bundling for the browser\n\nComunica is compatible with browser bundler tools such as [Webpack](https://www.npmjs.com/package/webpack)\nand [browserify](http://browserify.org/).\nIf you are not familiar with these tools,\nyou can read the following guides:\n* [Webpack: Creating a Bundle – getting started](https://webpack.js.org/guides/getting-started/#creating-a-bundle)\n\nYou will need to create a \"UMD bundle\" and supply a name (e.g. with the -s Comunica option in browserify).\n\n
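For Webpack, a minimal configuration along the following lines produces such a UMD bundle. This is only a sketch: the entry file and output file name are assumptions, and the entry is assumed to re-export what you need from `@comunica/query-sparql`.

```javascript
// webpack.config.js (sketch): build a UMD bundle exposing the global name
// "Comunica", analogous to browserify's -s option. The entry file name and
// output file name below are assumptions for this example.
module.exports = {
  mode: 'production',
  entry: './index.js',
  output: {
    filename: 'comunica-browser.js',
    library: {
      name: 'Comunica',
      type: 'umd',
    },
  },
};
```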
\nRefer to our specific guide on\nbuilding for the browser\nif you want to build specific configurations of Comunica for the browser.\n
\n"},26884:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying from a Docker container\'\ndescription: \'Execute SPARQL queries within a Docker container.\'\n---\n\n
\n\nIf for whatever reason you are unable or unwilling to install Node.js,\nthen you can make use of Comunica via [**Docker containers**](https://www.docker.com/) instead.\n\nComunica SPARQL can be used via Docker through the [`comunica/query-sparql` Docker image](https://hub.docker.com/r/comunica/query-sparql):\n```bash\n$ docker run -it --rm comunica/query-sparql \\\n https://fragments.dbpedia.org/2015-10/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nThe signature of this command is identical to the [`comunica-sparql` command](/docs/query/getting_started/query_cli/).\n\nBy default, the latest (stable) released version will be pulled and started.\nIf you want to make use of the latest development version,\nwhich is updated upon each new commit in the [Comunica GitHub repository](https://github.com/comunica/comunica),\nthen the `dev` tag can be used:\n```bash\n$ docker run -it --rm comunica/query-sparql:dev \\\n https://fragments.dbpedia.org/2015-10/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nA new Docker tag is also created upon each new release,\nso you can select a fixed version of Comunica if needed,\nsuch as version 1.14.0:\n```bash\n$ docker run -it --rm comunica/query-sparql:1.14.0 \\\n https://fragments.dbpedia.org/2015-10/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nA list of all available tags can be found on the [Docker hub](https://hub.docker.com/r/comunica/query-sparql/tags).\n'},64942:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Setting up a SPARQL endpoint\'\ndescription: \'Allow querying over HTTP via the SPARQL protocol\'\n---\n\nThe [SPARQL protocol](https://www.w3.org/TR/sparql11-protocol/) allows clients to send SPARQL queries to Web servers over HTTP,\nand query results to be sent back to the client.\nComunica SPARQL can be used to set up a **SPARQL endpoint** on top of any number of sources you want.\n\n## 1. 
Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install Comunica SPARQL on our machine:\n```bash\n$ npm install -g @comunica/query-sparql\n```\n\n## 2. SPARQL endpoint over one source\n\nAfter installing Comunica SPARQL, you will be given access to several commands including `comunica-sparql-http`,\nwhich allows you to start a SPARQL endpoint from the command line.\n\nThis command requires one or more URLs to be provided as **sources** to query over.\n\nFor example, the following command starts a SPARQL endpoint over [DBpedia](https://fragments.dbpedia.org/2016-04/en):\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en\n```\n\n
\nGiven a URL, Comunica will automatically detect the type of source and handle it accordingly.\n
\n\nBy default, the endpoint will be exposed on port 3000.\nYour endpoint will now be live on `http://localhost:3000/sparql`.\nAny client that understands the SPARQL protocol will now be able to send queries to this URL,\nsuch as [`fetch-sparql-endpoint`](https://github.com/rubensworks/fetch-sparql-endpoint.js/), or even Comunica itself.\n\n
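The HTTP GET form of the SPARQL protocol can also be reproduced in a few lines of plain JavaScript. The sketch below (assuming the locally started endpoint from above) shows how a query string is URL-encoded into the `?query=` parameter:

```javascript
// Sketch: construct a SPARQL protocol GET request URL by hand.
// The endpoint URL assumes the locally started endpoint from above.
const endpoint = 'http://localhost:3000/sparql';
const query = 'CONSTRUCT WHERE { ?s ?p ?o. } LIMIT 100';

// The SPARQL protocol expects the URL-encoded query in the ?query= parameter.
const requestUrl = `${endpoint}?query=${encodeURIComponent(query)}`;

console.log(requestUrl);
// => http://localhost:3000/sparql?query=CONSTRUCT%20WHERE%20%7B%20%3Fs%20%3Fp%20%3Fo.%20%7D%20LIMIT%20100
```

Any HTTP client, such as `fetch(requestUrl)`, can then retrieve the query results.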
\nThe URL `http://localhost:3000/` will automatically redirect to `http://localhost:3000/sparql`.\n
\n\nYou can easily test query execution over your endpoint using a tool such as `curl`.\nThe SPARQL protocol allows sending queries via HTTP GET by passing a URL-encoded SPARQL query via the `?query=` parameter:\n```bash\n$ curl -v "http://localhost:3000/sparql?query=CONSTRUCT%20WHERE%20%7B%3Fs%20%3Fp%20%3Fo.%7DLIMIT%20100"\n```\n\n## 3. SPARQL endpoint over multiple sources\n\nOne key feature of Comunica is its ability to query over **multiple sources**.\nFor this, you can just supply any number of URLs as arguments.\n\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/\n```\n\n## 4. SPARQL endpoint over local files\n\nFirst install Comunica SPARQL for files:\n\n```bash\n$ npm install -g @comunica/query-sparql-file\n```\n\nThen start the SPARQL server:\n\n```bash\n$ comunica-sparql-file-http path/to/my/file.ttl\n```\n\n## 5. Changing the port\n\nUsing the `-p` option, the port can be changed:\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en \\\n -p 3001\n```\n\n## 6. Increasing the number of worker threads\n\nUsing the `-w` option, the number of parallel worker threads can be set:\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en \\\n -w 4\n```\n\nSetting this to the number of available CPU cores tends to give the best performance.\n\n## 7. 
Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql-http`.\nYou can learn more options by invoking the _help_ command:\n```text\n$ comunica-sparql-http --help\ncomunica-sparql-http exposes a SPARQL endpoint\n\nRecommended options:\n -p, --port HTTP port to run on [number] [default: 3000]\n -w, --workers Number of worker threads [number] [default: 1]\n -t, --timeout Query execution timeout in seconds [number] [default: 60]\n -u, --update Enable update queries (otherwise, only read queries are enabled) [boolean] [default: false]\n\nOptions:\n -c, --context Use the given JSON context string or file (e.g., config.json) [string]\n --to Destination for update queries [string]\n -b, --baseIRI base IRI for the query (e.g., http://example.org/) [string]\n -d, --dateTime Sets a datetime for querying Memento-enabled archives [string]\n -l, --logLevel Sets the log level (e.g., debug, info, warn, ...) [string] [default: "warn"]\n --lenient If failing requests and parsing errors should be logged instead of causing a hard crash [boolean]\n -v, --version Prints version information [boolean]\n --showStackTrace Prints the full stacktrace when errors are thrown [boolean]\n -i, --invalidateCache Enable cache invalidation before each query execution [boolean] [default: false]\n\nExamples:\n comunica-sparql-http https://fragments.dbpedia.org/2016-04/en\n comunica-sparql-http https://fragments.dbpedia.org/2016-04/en https://query.wikidata.org/sparql\n comunica-sparql-http hypermedia@https://fragments.dbpedia.org/2016-04/en sparql@https://query.wikidata.org/sparql\n```\n'},12214:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Setting up a Web client'\ndescription: 'Set up a user-friendly static Web page where SPARQL queries can be executed client-side'\n---\n\nIf you want to easily **demonstrate** a couple of SPARQL queries on a **Web page**,\nor if you want to show off your custom built Comunica engine,\nthen you can do this using the [Comunica 
jQuery widget](https://github.com/comunica/jQuery-Widget.js/).\n\nAs an example, a public instance of this widget is available at http://query.linkeddatafragments.org/.\n\n## 1. Install from npm\n\n### 1.1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install [`@comunica/web-client-generator`](https://github.com/comunica/jQuery-Widget.js/):\n```bash\n$ npm install -g @comunica/web-client-generator\n```\n\n### 1.2. Building a static Website for production\n\nAfter installing, you can build a production-ready version of [Comunica SPARQL](https://github.com/comunica/comunica/tree/master/engines/query-sparql):\n```bash\n$ comunica-web-client-generator\n```\n\nThe resulting `build` directory can be deployed on a Web server\nusing something like [NGINX](https://www.nginx.com/) or [GitHub pages](https://pages.github.com/).\n\n### 1.3. Build a custom config\n\nIn order to override the [default config](https://github.com/comunica/jQuery-Widget.js/blob/master/config/config-default.json), you can pass one as argument.\n\n```bash\n$ comunica-web-client-generator config/config-default.json\n```\n\nThis assumes that your engine's dependencies are available in your working directory.\nIf this is not the case, provide a path to your engine's directory via the `-c` option:\n\n```bash\n$ comunica-web-client-generator path/to/engine/config/config-default.json -c path/to/engine/\n```\n\n### 1.4. Change settings and queries\n\nThe default datasources and queries can be changed as follows:\n\n```bash\n$ comunica-web-client-generator -s settings.json -q queries\n```\n\nExamples for the [`settings.json`](https://github.com/comunica/jQuery-Widget.js/blob/master/settings.json) file\nand the [`queries`](https://github.com/comunica/jQuery-Widget.js/tree/master/queries) directory.\n\n### 1.5. 
Show all available options\n\nAll available options for this command are:\n\n```bash\n$ comunica-web-client-generator -h\ncomunica-web-client-generator generates Comunica Web clients\n Usage:\n comunica-web-client-generator config/config-default.json\n comunica-web-client-generator config/config-default.json -d my-build/ -s my-settings.json\n comunica-web-client-generator config/config-default.json -q my-queries/\n comunica-web-client-generator config/config-default.json -w my-webpack.config.js\n\n Options:\n -d Destination of the built output (defaults to build)\n -m The compilation mode (defaults to production, can also be development)\n -c Path to the main Comunica module (defaults to cwd)\n -q Path to custom queries directory\n -s Path to custom settings file\n -w Path to custom Webpack config\n --help Print this help message\n```\n\n## 2. Install from GitHub\n\n### 2.1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can [clone the Comunica jQuery widget repo](https://github.com/comunica/jQuery-Widget.js/), and install it:\n```bash\n$ git clone https://github.com/comunica/jQuery-Widget.js.git\n$ cd jQuery-Widget.js\n$ npm install\n```\n\n### 2.2. Starting the built-in Web server\n\nThe widget comes with its own (optional) Web server,\nwhich can be started as follows:\n```bash\n$ npm run dev\n```\n\nNow, your page will be live at `http://localhost:8080`.\n\n
\nThis port can be changed by adding the `--port` option\nto the `dev` script in `package.json`.\n
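As a purely hypothetical sketch (the actual command invoked by the `dev` script in your checkout may differ, so only append the option to whatever is already there):

```json
{
  "scripts": {
    "dev": "webpack-dev-server --port 8081"
  }
}
```

After restarting the dev server, the page would then be served at `http://localhost:8081` instead.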
\n\n### 2.3. Building a static Website for production\n\nThe built-in Web server should primarily be used for testing.\nIf you want to deploy this page on a Web server,\nsomething like [NGINX](https://www.nginx.com/) or [GitHub pages](https://pages.github.com/) is recommended.\n\nYou can build a production-ready version of this page as follows:\n```bash\n$ npm run build\n```\n\nThe contents of the `build` folder can now be deployed onto any Web server.\n\n### 2.4. Changing the default queries and datasets\n\nYou'll notice that the page contains some example queries and datasets by default.\nYou can change these by modifying the contents of the `queries/` folder and the `settings.json` file.\n\n
\nWhen running the built-in dev server, the process will have to be restarted after every change to the queries or settings.\n
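For example, a hypothetical query file in the `queries/` folder could contain a plain SPARQL query such as:

```sparql
SELECT ?person ?name WHERE {
  ?person <http://xmlns.com/foaf/0.1/name> ?name.
} LIMIT 100
```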
\n"},40971:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Query using the latest development version\'\ndescription: \'If you want to make use of the latest changes that are not released yet\' \n---\n\nWhile the [Comunica GitHub repository](https://github.com/comunica/comunica) receives regular commits,\nwith fixes for bugs or new features,\nthese changes are not always immediately published as a new release on the npm package manager.\n\nWhile we always recommend using a released version of Comunica,\nthere are situations where you may want to make use of the **latest development version** from GitHub instead.\nFor example, if your application depends on a new _feature_ or _fix_ in Comunica,\nand you already want to develop or test your application before the new Comunica release is available.\n\nIn this guide, we will do this by setting up the **Comunica development environment**.\n\n## 1. Setup the Comunica development environment\n\nIf you want to make use of the latest development version,\nyou will have to **clone** the GitHub repository,\nand **install** it via the [Yarn package manager](https://yarnpkg.com/):\n```bash\n$ git clone https://github.com/comunica/comunica.git\n$ cd comunica\n$ yarn install\n```\n\n
\nSetting up the development environment via the npm package manager will not work,\nbecause the Comunica repository makes use of the Yarn workspaces functionality.\n
\n\n## 2. Querying from the command line\n\nIf installation is successful, you can navigate to any package and make use of it\nsimilar to how you would when it has been installed via npm.\n\nFor example, executing a SPARQL query from the command line with Comunica SPARQL\ncan be done by navigating to `engines/query-sparql`, and invoking `bin/query.js`:\n```bash\n$ cd engines/query-sparql\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nYou can execute any of the commands explained in the [CLI guide](/docs/query/getting_started/query_cli/)\nby simply replacing `comunica-sparql` with `node bin/query.js`.\n\nIf you want to [set up a SPARQL endpoint](/docs/query/getting_started/setup_endpoint/),\nyou can use `node bin/http.js` instead of `comunica-sparql-http`.\n\n## 3. Linking Comunica SPARQL to your package\n\nIf you have a [JavaScript application that makes use of Comunica SPARQL](/docs/query/getting_started/query_app/),\nthen you can **link** it to your local Comunica development environment.\n\nThis can be done by first indicating that Comunica SPARQL can be linked (starting from the Comunica development environment folder):\n```bash\n$ cd engines/query-sparql\n$ yarn link\n```\n\nNext, in the folder of your JavaScript package,\nwe can link Comunica SPARQL as follows:\n```bash\n$ yarn link "@comunica/query-sparql"\n```\n\nNow, your application will use the development version of Comunica instead.\n\n
\nIf you want to go back to the npm version of Comunica SPARQL,\nyou first have to unlink it from your application by running `yarn unlink "@comunica/query-sparql"`.\n
\n'},13302:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Usage showcase'\ndescription: 'Examples of where Comunica is used.'\n---\n\nComunica is being used in a variety of places for its querying and RDF-related capabilities.\nBelow, a couple of these uses are listed.\nFeel free to [contact us](/ask/) if you want your use of Comunica added to this list.\n\n## LDflex\n\n[LDflex](https://github.com/LDflex/LDflex) is a JavaScript library that provides a convenient syntax for quickly writing and executing queries in a developer-friendly way.\nUsing the power of Comunica and JSON-LD contexts, you can write expressions like `person.friends.firstName` to get a list of your friends.\n\nLDflex is used within the [Solid](https://solidproject.org/) community to easily [interact with one or more Solid data pods](https://github.com/solid/query-ldflex/).\nUsing the compact syntax of LDflex, it is very simple to query from within [React components](https://github.com/solid/react-components).\n\n## GraphQL-LD\n\n[GraphQL-LD](https://github.com/rubensworks/graphql-ld.js) is a JavaScript library\nthat allows Linked Data to be queried via [GraphQL](https://graphql.org/) queries and a JSON-LD context.\nThe approach involves converting a GraphQL query and JSON-LD context to a SPARQL query,\nwhich can then be executed by any SPARQL query engine [such as Comunica](https://github.com/rubensworks/graphql-ld-comunica.js).\n\nIt can also be used to execute [authenticated queries over Solid data pods](https://github.com/rubensworks/GraphQL-LD-Comunica-Solid.js),\nfor which [reusable React components](https://github.com/rubensworks/solid-react-graphql-ld.js) are available.\n\n## Quadstore\n\n[Quadstore](https://github.com/belayeng/quadstore) is a [LevelDB](https://github.com/google/leveldb)-based graph database for Node.js and the browser.\n[Quadstore Comunica](https://github.com/belayeng/quadstore-comunica) is a SPARQL engine on top of Quadstore that is powered by Comunica.\n\n## 
LDkit\n\n[LDkit](https://ldkit.io) is a Linked Data query toolkit for TypeScript developers. It provides an ORM-like abstraction over RDF data: you define a data source and a data schema, and let LDkit handle SPARQL queries, data fetching, and conversion of RDF to JS/TS native types in the background.\n\nLDkit provides built-in support to query SPARQL endpoints, but it is [fully compatible with Comunica](https://ldkit.io/docs/how-to/query-with-comunica) in case you need to access other RDF data sources.\n\n## RDF Parse\n\n[RDF Parse](https://github.com/rubensworks/rdf-parse.js) is a JavaScript library that parses RDF based on content type or file name in a streaming manner.\nIt supports all of the major RDF serializations.\nInternally, this library makes use of the `rdf-parse` bus and actors from Comunica.\n\n## RDF Dereference\n\n[RDF Dereference](https://github.com/rubensworks/rdf-dereference.js) is a JavaScript library that dereferences URLs to get their RDF contents.\nThis tool is useful in situations where you have a URL, and you just need the parsed triples/quads, without having to concern yourself with determining the correct content type and picking the correct parser.\nInternally, this library makes use of the `rdf-dereference` bus and actors from Comunica.\n\n## RDF Play\n\n[RDF Play](https://rdf-play.rubensworks.net/) is a Web-based tool for performing simple RDF operations, such as parsing, serializing and dereferencing from URLs.\nInternally, this tool makes use of RDF parsers from the Comunica framework, which enable streaming processing of RDF.\n\n## ESWC Conference 2020\n\nAll metadata of the [ESWC Conference (2020)](https://2020.eswc-conferences.org/) is [queryable](https://query.2020.eswc-conferences.org/)\nvia a jQuery widget instance of Comunica.\nIt features several example queries over a [Triple Pattern Fragments](https://linkeddatafragments.org/concept/) interface through which the ESWC 2020 metadata is published.\n\n## 
Walder\n\n[Walder](https://github.com/KNowledgeOnWebScale/walder) offers an easy way\nto set up a website or Web API on top of decentralized knowledge graphs.\nIt uses Comunica for querying these knowledge graphs,\nwhich can be hosted via Solid PODs, SPARQL endpoints, Triple Pattern Fragments interfaces, RDF files, and so on.\nUsing content negotiation, Walder makes the data in these knowledge graphs available to clients via HTML, RDF, and JSON-LD.\nUsers define in a configuration file which data Walder uses and how it processes this data.\n"},6572:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying FAQ'\ndescription: 'Frequently asked questions about using Comunica.'\n---\n\nFeel free to [ask us](/ask/), or have a look at the\n[example](https://github.com/comunica/examples) repository.\n\n## How can I query over RDF documents on my local file system?\n\nInstead of using Comunica SPARQL, you can use [Comunica SPARQL File](/docs/query/getting_started/query_cli_file/)\nto query over files on your local file system.\n\nComunica SPARQL by default does not allow you to query over local files for security reasons.\n\n## How to query over sources in memory?\n\n[Comunica SPARQL RDF/JS](/docs/query/advanced/rdfjs_querying/) can be used for in-memory querying.\n\n## How are result bindings and quads represented in JavaScript?\n\nSELECT query results will be contained in a `bindingsStream`,\nwhere each data element is a `Binding`.\nEach `binding` is an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object\nthat contains mappings from variables to RDF terms.\nVariable names can either be obtained by string label (without the `?` prefix) or via [RDF/JS](/docs/query/advanced/rdfjs/) variable objects,\nand bound RDF terms are represented as [RDF/JS](/docs/query/advanced/rdfjs/) terms.\nFor example:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT ...`, {...});\nbindingsStream.on('data', (binding) => {\n 
console.log(binding.get('s').value);\n console.log(binding.get('s').termType);\n});\n```\nLearn more about the usage of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\nCONSTRUCT query results will be contained in a `quadStream`,\nwhere each data element is an [RDF/JS](/docs/query/advanced/rdfjs/) quad.\nFor example:\n```javascript\nconst quadStream = await myEngine.queryQuads(`CONSTRUCT ...`, {...});\nquadStream.on('data', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\nRead more about this in the [guide on executing SPARQL queries in JavaScript applications](/docs/query/getting_started/query_app/).\n\n## What data structure is behind `bindingsStream` and `quadStream`?\n\nQuery results can be returned via `bindingsStream` (SELECT queries) and `quadStream` (CONSTRUCT queries).\n\nThese streams are backed by an [AsyncIterator](https://github.com/RubenVerborgh/AsyncIterator),\nwhich is a lightweight JavaScript implementation of demand-driven object streams.\nAs opposed to Node's `Stream`, you cannot push anything into an `AsyncIterator`;\ninstead, an iterator pulls things from another iterator.\n\nFurthermore, these streams are _lazy_,\nwhich means that the results will only be calculated once you request them,\nand an `'end'` event will only be emitted when all of them have been consumed.\n\n## I need a specific feature, how do I get it into Comunica?\n\nSince Comunica is an open-source project,\nthe best way to get new features in is by [contributing yourself](/contribute/).\n\nAlternatively, you can delegate implementation work to a third party via the [Comunica Association](/association/).\n\n## How to measure query performance with Comunica?\n\n### Simple statistics\n\nThe easiest way to get statistics on the performance of a specific query\nis by using the `'stats'` [result 
format](/docs/query/advanced/result_formats/).\nThis will print the number of results, their delay from query start,\nand the number of HTTP requests that have been executed up until the result was available.\n\nFor example, stats can be printed via the command line as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -t stats \\\n 'SELECT * WHERE { ?s ?p ?o } LIMIT 10'\nResult,Delay (ms),HTTP requests\n1,265.488428,2\n2,265.7177,2\n3,265.889677,2\n4,266.141152,2\n5,266.332423,2\n6,266.496283,2\n7,266.674167,2\n8,266.861855,2\n9,268.330294,2\n10,268.51177,2\nTOTAL,268.816168,2\n```\n\n### Enabling production-mode\n\nIf you want to do benchmarking with Comunica in Node.js, make sure to run Node.js in production mode as follows:\n\n```\n$ NODE_ENV=production comunica-sparql ...\n```\n\nThe reason for this is that Comunica extensively generates internal Error objects. In non-production mode, these also produce long stacktraces, which may in some cases impact performance.\n\n### More advanced experiments\n\nA more advanced tool for setting up large-scale reproducible Comunica experiments is [Comunica Bencher](https://github.com/comunica/comunica-bencher).\n"},75770:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Advanced querying'\ndescription: 'Advanced guides on how to get the most out of Comunica.'\nindex: true\n---\n\nThe following guides explore some of the more advanced concepts when querying using Comunica.\n"},36323:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'HTTP Basic Authentication'\ndescription: 'Send authenticated HTTP requests by including username and password.'\n---\n\nVia [HTTP Basic Authentication](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication)\none can include **username and password** credentials in HTTP requests.\nIf you want to query such protected resources,\nyou can include this authentication information for _all_ HTTP requests,\nor only for requests to _specific 
sources_. \n\n## Authentication on the command line\n\nVia the command line, username and password can be included in the URL as follows:\n```bash\n$ comunica-sparql https://username:password@example.org/page \\\n \"SELECT * WHERE { ?s ?p ?o }\"\n```\n\n## Authentication in an application\n\nWhen using [Comunica SPARQL in an application](/docs/query/getting_started/query_app/), authentication information can be set using the `httpAuth` [context entry](/docs/query/advanced/context/):\n\nEnabling basic authentication for _all_ HTTP requests:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n httpAuth: 'username:password',\n});\n```\n\nEnabling basic authentication for _a specific source_:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://username:password@example.org/page'],\n});\n```\n"},76759:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Bindings'\ndescription: 'Bindings objects are used to represent results of SPARQL SELECT queries'\n---\n\nSPARQL `SELECT` query results are represented as a stream of _bindings_ (sometimes also referred to as `BindingsStream`),\nwhere each bindings object represents a mapping from zero or more variables to RDF terms.\n\n
\nThe SPARQL specification uses the term _solution mapping_ to refer to bindings.\nThis means that a bindings object is equivalent to a solution mapping,\nand a solution sequence is equivalent to a bindings stream.\n
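As a toy illustration of this concept (plain JavaScript objects, not the actual Comunica or RDF/JS API), a single solution mapping can be pictured as a map from variable names to terms:

```javascript
// Toy illustration only: conceptually, a bindings object is a mapping
// from variable names (without the '?' prefix) to RDF terms.
const binding = new Map([
  ['s', { termType: 'NamedNode', value: 'http://example.org/alice' }],
  ['p', { termType: 'NamedNode', value: 'http://xmlns.com/foaf/0.1/name' }],
  ['o', { termType: 'Literal', value: 'Alice' }],
]);

// A bindings stream is then conceptually a sequence of such mappings.
console.log(binding.get('o').value); // Alice
console.log(binding.has('unbound')); // false
```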
\n\nBindings objects are represented using the [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) interface,\nand can be created using any RDF/JS [`BindingsFactory`](http://rdf.js.org/query-spec/#bindingsfactory-interface).\nComunica provides the [`@comunica/bindings-factory`](https://github.com/comunica/comunica/tree/master/packages/bindings-factory) package that implements these interfaces.\n\nBelow, several examples show how these bindings objects can be used.\nPlease refer to [the README of `@comunica/bindings-factory`](https://github.com/comunica/comunica/tree/master/packages/bindings-factory) for a complete overview of its operations.\n\n## Reading values of bindings\n\n### `Bindings.has()`\n\nThe `has()` method is used to check if a value exists for the given variable.\nThe variable can either be supplied as a string (without the `?` prefix), or as an RDF/JS variable.\n\n```typescript\nif (bindings.has('var1')) {\n console.log('Has var1!');\n}\nif (bindings.has(DF.variable('var2'))) {\n console.log('Has var2!');\n}\n```\n\n### `Bindings.get()`\n\nThe `get()` method is used to read the bound value of a variable.\nThe variable can either be supplied as a string (without the `?` prefix), or as an RDF/JS variable.\n\n```typescript\nimport * as RDF from '@rdfjs/types';\n\nconst term1: RDF.Term | undefined = bindings.get('var1');\nconst term2: RDF.Term | undefined = bindings.get(DF.variable('var2'));\n```\n\n### Entry iteration\n\nEach bindings object is an Iterable over its key-value entries,\nwhere each entry is a tuple of type `[RDF.Variable, RDF.Term]`.\n\n```typescript\n// Iterate over all entries\nfor (const [ key, value ] of bindings) {\n console.log(key);\n console.log(value);\n}\n\n// Save the entries in an array\nconst entries = [ ...bindings ];\n```\n\n### `Bindings.toString`\n\nThe `toString()` method returns a compact string representation of the bindings object,\nwhich can be useful for 
debugging.\n\n```typescript\nconsole.log(bindings.toString());\n\n/*\nCan output in the form of:\n{\n \"a\": \"ex:a\",\n \"b\": \"ex:b\",\n \"c\": \"ex:c\"\n}\n */\n```\n\n## Creating bindings\n\nFirst, a bindings factory must be created:\n```typescript\nimport * as RDF from '@rdfjs/types';\nimport { DataFactory } from 'rdf-data-factory';\nimport { BindingsFactory } from '@comunica/bindings-factory';\n\nconst DF = new DataFactory();\nconst BF = new BindingsFactory(DF);\n```\n\nBindings can be created in different ways:\n```typescript\nconst bindings1: RDF.Bindings = BF.bindings([\n [ DF.variable('var1'), DF.literal('abc') ],\n [ DF.variable('var2'), DF.literal('def') ],\n]);\n\nconst bindings2: RDF.Bindings = BF.fromRecord({\n var1: DF.literal('abc'),\n var2: DF.literal('def'),\n});\n```\n"},11986:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Caching'\ndescription: 'When remote sources are requested, caching allows them to be reused in the future.'\n---\n\nWhen remote documents are fetched over HTTP, a Comunica engine can cache documents to optimize future reuse.\nIf [your application](/docs/query/getting_started/query_app/) works over volatile resources, then you may want to invalidate this cache,\nwhich can be done as follows:\n\n```javascript\n// Invalidate the full cache\nmyEngine.invalidateHttpCache();\n\n// Invalidate a single document\nmyEngine.invalidateHttpCache('http://example.org/page.html');\n```\n\nOptionally, you can also pass the `noCache: true` flag to your context to invalidate the cache before query execution starts:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://xmlns.com/foaf/spec/20140114.rdf'],\n noCache: true,\n});\n```\n"},22249:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Passing a context'\ndescription: 'A context can be passed to a query engine to tweak its runtime settings.'\n---\n\nWhen passing a query to a Comunica query 
engine,\nyou can pass additional information to the engine using a **context** object.\n\n## 1. How to use the context\n\nWhen [querying in a JavaScript application](/docs/query/getting_started/query_app/),\nthe context must be passed as the second argument to the `query()` method of a Comunica engine.\n\nFor example, a context that defines the `sources` to query over is passed as follows:\n```javascript\nconst QueryEngine = require('@comunica/query-sparql').QueryEngine;\nconst myEngine = new QueryEngine();\n\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n});\n```\n\nThe `sources` field is the only entry that is required in the context.\nAll other entries that are discussed hereafter are optional.\n\n
\nDuring query execution, the context is converted into an immutable object,\nwhich ensures that the original context entries remain unchanged for the whole query execution.\n
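Conceptually, the effect resembles freezing a copy of the entries you pass in (Comunica internally uses its own immutable context class rather than `Object.freeze`; this sketch is only an analogy):

```javascript
'use strict';
// Analogy only: the engine works on an immutable copy of the entries,
// so later changes to your own object cannot affect a running query.
const entries = { lenient: true };
const immutableContext = Object.freeze({ ...entries });

// Mutating the original entries afterwards does not affect the frozen copy.
entries.lenient = false;
console.log(immutableContext.lenient); // true

// And the frozen copy itself rejects changes (throws in strict mode).
let threw = false;
try {
  immutableContext.lenient = false;
} catch (e) {
  threw = true;
}
console.log(threw); // true
```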
\n\n## 2. Overview\n\nThe following table gives an overview of all possible context entries that can be passed.\n\n| **Key** | **Description** |\n| ------- | --------------- |\n| `sources` | An array of data sources |\n| `destination` | A data destination for update queries |\n| `lenient` | If HTTP and parsing failures are ignored |\n| `initialBindings` | Variables that have to be pre-bound to values in the query |\n| `queryFormat` | The provided query's format |\n| `baseIRI` | Base IRI for relative IRIs in SPARQL queries |\n| `log` | A custom logger instance |\n| `datetime` | Specify a custom date |\n| `httpProxyHandler` | A proxy for all HTTP requests |\n| `httpIncludeCredentials` | (_browser-only_) If current credentials should be included for HTTP requests |\n| `httpAuth` | HTTP basic authentication value |\n| `httpTimeout` | HTTP timeout in milliseconds |\n| `httpBodyTimeout` | Makes the HTTP timeout apply until the response is fully consumed |\n| `httpRetryCount` | The number of retries to perform on failed fetch requests |\n| `httpRetryDelay` | The number of milliseconds to wait between fetch retries |\n| `httpRetryOnServerError` | If fetch should be retried on 5xx server error responses, instead of being resolved. |\n| `recoverBrokenLinks`| Use the WayBack machine to recover broken links |\n| `extensionFunctions` or `extensionFunctionCreator` | SPARQL extension functions |\n| `fetch` | A custom [`fetch`](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) function |\n| `readOnly` | If update queries may not be executed |\n| `explain` | The query explain mode |\n| `unionDefaultGraph` | If the default graph should also contain the union of all named graphs |\n| `localizeBlankNodes` | If blank nodes should be localized per bindings entry |\n\nWhen developing Comunica modules, all context entry keys can be found in [`@comunica/context-entries`](https://comunica.github.io/comunica/modules/_comunica_context_entries.html). \n\n## 3. 
Defining sources\n\nUsing the `sources` context entry, data sources can be defined that Comunica should query over.\nThe value of this must be an array, where the array may contain both strings and objects:\n* Array elements that are strings are interpreted as URLs, such as `'https://www.rubensworks.net/'` or `'https://fragments.dbpedia.org/2016-04/en'`.\n* Object array elements can be different things:\n * A hash containing `type` and `value`, such as `{ type: 'sparql', value: 'https://dbpedia.org/sparql' }`.\n * An [RDF/JS](/docs/query/advanced/rdfjs/) source object, such as [`new N3Store()`](https://github.com/rdfjs/N3.js#storing).\n\nString-based sources will lead to Comunica trying to determine their source type automatically.\nHash-based sources allow you to enforce a specific source type.\n\n
\nSome SPARQL endpoints may be recognised as a file instead of a SPARQL endpoint because they do not support SPARQL Service Description,\nwhich may produce incorrect results. In these cases, the `sparql` type must be set explicitly.\n
\n\nFor example, all of the following source elements are valid:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n 'https://fragments.dbpedia.org/2015/en',\n { type: 'hypermedia', value: 'https://fragments.dbpedia.org/2016/en' },\n { type: 'file', value: 'https://www.rubensworks.net/' },\n new N3Store(),\n { type: 'sparql', value: 'https://dbpedia.org/sparql' },\n ],\n});\n```\n\n## 4. Defining an update destination\n\nIf you are executing an update query over more than one source,\nthen you need to specify the `destination` of the resulting update.\nMore details on this can be found in the guide on [updating in a JavaScript app](/docs/query/getting_started/update_app/).\n\n## 5. Lenient execution\n\nBy default, Comunica will throw an error when it encounters an invalid **RDF document** or **HTTP URL**.\nIt is possible to **ignore these errors** and make Comunica ignore such invalid documents and URLs\nby setting `lenient` to `true`:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n lenient: true,\n});\n```\n\n## 6. 
Binding variables\n\nUsing the `initialBindings` context entry, it is possible to **bind** certain variables in the given query to terms before the query execution starts.\nThis may be valuable in case your SPARQL query is used as a template with some variables that need to be filled in.\n\nThis can be done by passing an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object as the value of the `initialBindings` context entry:\n```javascript\nimport { BindingsFactory } from '@comunica/bindings-factory';\nimport { DataFactory } from 'rdf-data-factory';\n\nconst DF = new DataFactory();\nconst BF = new BindingsFactory();\n\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE {\n { ?s ?p ?template1 } UNION { ?s ?p ?template2 }\n}`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n initialBindings: BF.fromRecord({\n template1: DF.literal('Value1'),\n template2: DF.literal('Value2'),\n }),\n});\n```\n\n`Bindings` can be created using any [RDF/JS `BindingsFactory`](http://rdf.js.org/query-spec/#bindingsfactory-interface),\nsuch as [`@comunica/bindings-factory`](https://www.npmjs.com/package/@comunica/bindings-factory).\nLearn more about the creation of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\n## 7. Setting the query format\n\nBy default, queries in Comunica are interpreted as SPARQL queries.\nAs such, the `queryFormat` entry defaults to `{ language: 'sparql', version: '1.1' }`.\n\nSince Comunica is not tied to any specific **query format**, it is possible to change this to something else, such as `{ language: 'graphql', version: '1.0' }`.\nMore information on this can be found in the [GraphQL-LD guide](/docs/query/advanced/graphql_ld/).\n\n## 8. 
Setting a Base IRI\n\nTerms in SPARQL queries can be relative to a certain **Base IRI**.\nTypically, you would use the `BASE` keyword in a SPARQL query to set this Base IRI.\nIf you want to set this Base IRI without modifying the query,\nthen you can define it in the context using `baseIRI`:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE {\n ?s <relative> ?o\n}`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n baseIRI: 'http://example.org/',\n});\n```\n\n## 9. Enabling a logger\n\nA logger can be set using `log`.\nMore information on this can be found in the [logging guide](/docs/query/advanced/logging/).\n\n## 10. Setting a custom date\n\nUsing `datetime`, a custom **date** can be set in Comunica.\nThe value of this field must always be a JavaScript `Date` object:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n datetime: new Date(),\n});\n```\n\nThis date is primarily used for the SPARQL `NOW()` operator.\nIt is also used when performing time travel querying using the [Memento protocol](/docs/query/advanced/memento/).\n\n## 11. Enabling an HTTP proxy\n\nAll HTTP requests can be run through a proxy using `httpProxyHandler`.\nMore information on this can be found in the [HTTP proxy guide](/docs/query/advanced/proxying/).\n\n## 12. Include credentials in HTTP requests\n\n_Only applicable when running in the browser_\n\nIf this option is enabled, then all cross-site requests will be made using credentials of the current page.\nThis includes cookies, authorization headers or TLS client certificates.\n\nEnabling this option has no effect on same-site requests.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpIncludeCredentials: true,\n});\n```\n\n## 13. 
Send requests via HTTP basic authentication\n\nVia HTTP Basic Authentication one can include **username and password** credentials in HTTP requests.\nMore information on this can be found in the [HTTP basic authentication guide](/docs/query/advanced/basic_auth/).\n\n## 14. SPARQL extension functions\n\nSPARQL allows non-standard, [custom extension functions](https://www.w3.org/TR/sparql11-query/#extensionFunctions) to be used within queries.\nIn order to provide an implementation for these extension functions,\nComunica allows developers to plug them in via the context.\nMore information on this can be found in the [SPARQL extension functions guide](/docs/query/advanced/extension_functions/).\n\n## 15. Using a custom fetch function\n\nBy default, Comunica will use the built-in [`fetch` function](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) to make HTTP requests.\nIt is however possible to pass a custom function that will be used instead for making HTTP requests,\nas long as it follows the [Fetch API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API).\n\nThis can be done as follows:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n fetch: myFetchFunction,\n});\n```\n\n_If you want to perform authenticated HTTP requests for Solid, you may want to consider using [Comunica Solid](https://comunica.dev/docs/query/advanced/solid/)._\n\n## 16. HTTP Timeout\n\nBy default, Comunica does not apply any timeout on HTTP requests to external services. It is possible to add a timeout using the `httpTimeout` option, whose value is the timeout delay in milliseconds. 
For example, to add an HTTP timeout of 60 seconds:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpTimeout: 60_000,\n});\n```\n\nBy default, this timeout only applies until the response starts streaming in.\nUsing the `httpBodyTimeout` boolean option, you can make it apply until the response body is fully consumed,\nwhich is useful to guard against very long response streams:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpTimeout: 60_000,\n httpBodyTimeout: true\n});\n```\n\n## 17. Union Default Graph\n\nBy default, Comunica will only query over the [default graph](https://www.w3.org/TR/sparql11-query/#unnamedGraph).\nIf you want to query over triples in other named graphs, you need to specify this via the `GRAPH`, `FROM`, or `FROM NAMED` clauses.\nHowever, by setting the `unionDefaultGraph` context option to `true`, triple patterns will also apply to triples in the non-default graph.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n unionDefaultGraph: true,\n});\n```\n\n## 18. HTTP Retries\n\nUsing the `httpRetryOnServerError`, `httpRetryCount`, and `httpRetryDelay` options,\nyou can make your engine retry failed requests a number of times if the server produces an error.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpRetryOnServerError: true,\n httpRetryCount: 3,\n httpRetryDelay: 100,\n});\n```\n\n## 19. 
Broken link recovery\n\nThe `recoverBrokenLinks` option can make your engine fall back to the [Wayback Machine](https://archive.org/web/) if a document has become unavailable.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://xmlns.com/foaf/spec/20140114.rdf'],\n recoverBrokenLinks: true,\n});\n```\n\n## 20. Deduplicate quads in construct queries\n\nThe `distinctConstruct` option can remove duplicate quads from CONSTRUCT query outputs.\nThis corresponds to placing a `DISTINCT` onto a `CONSTRUCT` operator (which is not allowed by the SPARQL specification).\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`CONSTRUCT WHERE { ?s1 ?p1 ?o1. ?s2 ?p2 ?o2 }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n distinctConstruct: true,\n});\n```\n"},65625:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Destination types'\ndescription: 'Comunica detects and handles different types of destinations.'\n---\n\nComunica SPARQL supports _update_ queries to add, delete, or change data\non both the [command line](/docs/query/getting_started/update_cli/)\nand when [calling Comunica from a JavaScript application](/docs/query/getting_started/update_app/).\n\nUpdate queries typically consist of two parts:\n\n1. Query pattern to select data from a [_source_](/docs/query/advanced/source_types/);\n2. Quads to add to or delete from a _destination_, based on the query pattern.\n\nIn most cases, the _source_ and _destination_ are equal,\nsuch as when modifying data in [an in-memory RDF/JS Store](/docs/query/advanced/rdfjs_updating/).\n\nSince Comunica decouples _source_ and _destination_,\nit is possible to _read_ data from one place, and _apply changes_ in another place.\n\nUsually, destinations are passed as URLs that point to Web resources.\nBased on what is returned when _dereferencing_ this URL,\nComunica can apply different update algorithms.\n\nInstead of relying on Comunica's detection algorithms,\nyou can **enforce** the use of a certain type.\n\n
\nSome SPARQL endpoints may be recognised as a file instead of a SPARQL endpoint because they do not support SPARQL Service Description,\nwhich may produce incorrect results. In such cases, the `sparql` type MUST be set explicitly.\n
\n\n
\nWhen enabling the info logger,\nyou can derive what type Comunica has determined for each destination.\n
\n\n## Setting destination type on the command line\n\nDestination types can optionally be enforced by prefixing the URL with the type name followed by `@`, such as\n\n```bash\n$ comunica-sparql https://example.org/file-in.ttl \\\n --to patchSparqlUpdate@https://example.org/file-out.ttl \\\n \"INSERT DATA { }\"\n```\n\n## Setting destination type in an application\n\nVia a [JavaScript application](/docs/query/getting_started/query_app/),\nthe destination type can be set by using an object containing `type` and `value`:\n```javascript\nawait myEngine.queryVoid(`...`, {\n sources: [\n { type: 'file', value: 'https://example.org/file-in.ttl' },\n ],\n destination: { type: 'patchSparqlUpdate', value: 'https://example.org/file-out.ttl' },\n});\n```\n\n## Supported destination types\n\nThe table below summarizes the different destination types that Comunica supports by default:\n\n| **Type name** | **Description** |\n| ------- | --------------- |\n| `rdfjsStore` | JavaScript objects implementing the [RDF/JS `store` interface](/docs/query/advanced/rdfjs_updating/) |\n| `sparql` | [SPARQL endpoint](https://www.w3.org/TR/sparql11-protocol/) |\n| `putLdp` | [Linked Data Platform](https://www.w3.org/TR/ldp/) HTTP APIs accepting `PUT` requests containing an RDF document, such as [Solid servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1). |\n| `patchSparqlUpdate` | [Linked Data Platform](https://www.w3.org/TR/ldp/) HTTP APIs accepting `PATCH` requests containing SPARQL Update queries (`application/sparql-update`), such as [Solid servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1). 
|\n\nThe default destination type is `auto`,\nwhich will automatically detect the proper destination type.\nFor example, if an `Accept-Patch: application/sparql-update` header\nis detected, the `patchSparqlUpdate` type is used.\n"},61042:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Explain\'\ndescription: \'Display information about the logical and physical query plan\'\n---\n\nThe explain functionality allows you to extract information about the query plan of a Comunica query engine.\n\nThere are three explain modes available:\n\n- `parsed`: The [SPARQL Algebra](/docs/modify/advanced/algebra/) tree as parsed from the input query.\n- `logical`: The optimized logical query plan in SPARQL Algebra.\n- `physical`: A hierarchical log of which logical operations have been executed by which (physical) actors.\n\nWhile the `parsed` and `logical` explain modes happen before query execution,\nthe `physical` explain mode requires query execution to be completed.\nThis is because Comunica is an adaptive query engine that alters its query plan dynamically based on the sources it discovers at runtime.\nThis means that query execution must be completed before the final (physical) query plan can be inspected.\n\n
\nIf you require more insight into what operations are being executed at runtime,\nyou can make use of the built-in logging functionality.\n
\n\n
\nThe output of the physical mode is an experimental feature,\nwhich means that its format may still improve and change between major updates.\n
\n\n## Explaining on the command line\n\nIf you have [installed Comunica SPARQL for the command line](/docs/query/getting_started/query_cli/),\nthen you will have immediate access to the query explain functionality via the `--explain` option.\n\nBelow, you can see examples on how the different explain modes can be invoked.\n\n### Explain parsed on the command line\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o } LIMIT 100\' --explain parsed\n\n{\n "type": "slice",\n "input": {\n "type": "project",\n "input": {\n "type": "bgp",\n "patterns": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "s"\n },\n "predicate": {\n "termType": "Variable",\n "value": "p"\n },\n "object": {\n "termType": "Variable",\n "value": "o"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "s"\n },\n {\n "termType": "Variable",\n "value": "p"\n },\n {\n "termType": "Variable",\n "value": "o"\n }\n ]\n },\n "start": 0,\n "length": 100\n}\n```\n\n### Explain logical on the command line\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o } LIMIT 100\' --explain logical\n\n{\n "type": "slice",\n "input": {\n "type": "project",\n "input": {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "s"\n },\n "predicate": {\n "termType": "Variable",\n "value": "p"\n },\n "object": {\n "termType": "Variable",\n "value": "o"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "s"\n },\n {\n "termType": "Variable",\n "value": "p"\n },\n {\n "termType": "Variable",\n "value": "o"\n }\n ]\n },\n "start": 0,\n "length": 100\n}\n```\n\n### Explain physical on the command 
line\n\n```bash\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT ?movie ?title ?name\nWHERE {\n ?movie dbpedia-owl:starring [ rdfs:label "Brad Pitt"@en ];\n rdfs:label ?title;\n dbpedia-owl:director [ rdfs:label ?name ].\n FILTER LANGMATCHES(LANG(?title), "EN")\n FILTER LANGMATCHES(LANG(?name), "EN")\n}\' --explain physical\n\nproject (movie,title,name)\n join\n join-inner(bind) bindOperation:(?g_0 http://www.w3.org/2000/01/rdf-schema#label "Brad Pitt"@en) bindCardEst:~2 cardReal:43 timeSelf:2.567ms timeLife:667.726ms\n join compacted-occurrences:1\n join-inner(bind) bindOperation:(?movie http://dbpedia.org/ontology/starring http://dbpedia.org/resource/Brad_Pitt) bindCardEst:~40 cardReal:43 timeSelf:6.011ms timeLife:641.139ms\n join compacted-occurrences:38\n join-inner(bind) bindOperation:(http://dbpedia.org/resource/12_Monkeys http://dbpedia.org/ontology/director ?g_1) bindCardEst:~1 cardReal:1 timeSelf:0.647ms timeLife:34.827ms\n filter compacted-occurrences:1\n join\n join-inner(nested-loop) cardReal:1 timeSelf:0.432ms timeLife:4.024ms\n pattern (http://dbpedia.org/resource/12_Monkeys http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (http://dbpedia.org/resource/Terry_Gilliam http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~1 src:0\n join compacted-occurrences:2\n join-inner(multi-empty) timeSelf:0.004ms timeLife:0.053ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://dbpedia.org/ontology/director ?g_1) cardEst:~0 src:0\n filter cardEst:~5,188,789.667\n join\n join-inner(nested-loop) timeLife:0.6ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n join compacted-occurrences:1\n join-inner(multi-empty) timeSelf:0.053ms timeLife:0.323ms\n pattern (?movie http://dbpedia.org/ontology/director ?g_1) 
cardEst:~118,505 src:0\n pattern (?movie http://dbpedia.org/ontology/starring http://wikidata.dbpedia.org/resource/Q35332) cardEst:~0 src:0\n filter cardEst:~242,311,843,844,161\n join\n join-inner(symmetric-hash) timeLife:36.548ms\n pattern (?movie http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~20,013,903 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n\nsources:\n 0: QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)\n```\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o. ?s a ?o } LIMIT 100\' --explain physical-json\n\n{\n "logical": "slice",\n "children": [\n {\n "logical": "project",\n "variables": [\n "o",\n "p",\n "s"\n ],\n "children": [\n {\n "logical": "join",\n "children": [\n {\n "logical": "join-inner",\n "physical": "bind",\n "bindIndex": 1,\n "bindOperation": {\n "source": "QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)",\n "pattern": "?s http://www.w3.org/1999/02/22-rdf-syntax-ns#type ?o"\n },\n "bindOperationCardinality": {\n "type": "estimate",\n "value": 100022186,\n "dataset": "https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type"\n },\n "bindOrder": "depth-first",\n "cardinalities": [\n {\n "type": "estimate",\n "value": 1040358853,\n "dataset": "https://fragments.dbpedia.org/2016-04/en"\n },\n {\n "type": "estimate",\n "value": 100022186,\n "dataset": "https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type"\n }\n ],\n "joinCoefficients": {\n "iterations": 6404592831613.728,\n "persistedItems": 0,\n "blockingItems": 0,\n "requestTime": 8902477556686.99\n },\n "childrenCompact": [\n {\n "occurrences": 100,\n "firstOccurrence": {\n "logical": "pattern",\n "source": "QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)",\n "pattern": 
"http://commons.wikimedia.org/wiki/Special:FilePath/!!!善福寺.JPG ?p http://dbpedia.org/ontology/Image"\n }\n }\n ]\n }\n ]\n }\n ]\n }\n ]\n}\n```\n\n## Explaining in JavaScript\n\nIf you have [installed Comunica SPARQL in a JavaScript app](/docs/query/getting_started/query_app/),\nthen you can invoke the `explain` method on your query engine with a certain explain mode.\n\nBelow, you can see examples on how the different explain modes can be invoked.\n\n### Explain parsed in JavaScript\n\n```typescript\nconsole.log(await engine.explain(`SELECT * WHERE {\n ?s ?p ?o.\n }`, {\n sources: [ \'https://www.rubensworks.net/\' ],\n}, \'parsed\'));\n\n/*\nWill print:\n\n{\n explain: true,\n type: \'parsed\',\n data: {\n input: {\n patterns: [\n factory.createPattern(\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ),\n ],\n type: \'bgp\',\n },\n type: \'project\',\n variables: [\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ],\n },\n}\n\nwith DF being an RDF data factory, and factory being a SPARQL algebra factory.\n */\n```\n\n### Explain logical in JavaScript\n\n```typescript\nconsole.log(await engine.explain(`SELECT * WHERE {\n ?s ?p ?o.\n }`, {\n sources: [ \'https://www.rubensworks.net/\' ],\n}, \'logical\'));\n\n/*\nWill print:\n\n{\n explain: true,\n type: \'logical\',\n data: {\n input: {\n input: [\n factory.createPattern(\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ),\n ],\n type: \'join\',\n },\n type: \'project\',\n variables: [\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ],\n },\n}\n\nwith DF being an RDF data factory, and factory being a SPARQL algebra factory.\n */\n```\n\n### Explain physical in JavaScript\n\n```typescript\nconsole.log(await engine.explain(`SELECT * WHERE {\n ?s ?p ?o.\n }`, {\n sources: [ \'https://www.rubensworks.net/\' ],\n}, \'physical\'));\n\n/*\nWill print:\n\n{\n explain: true,\n type: \'physical\',\n data: `slice\n project (o,p,s)\n join\n 
join-inner(bind) bindOperation:(?s http://www.w3.org/1999/02/22-rdf-syntax-ns#type ?o) bindCardEst:~100,022,186\n pattern (http://commons.wikimedia.org/wiki/Special:FilePath/!!!善福寺.JPG ?p http://dbpedia.org/ontology/Image) src:0 compacted-occurrences:100\n\nsources:\n 0: QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)\n`,\n}\n */\n```\n'},10205:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Extension Functions'\ndescription: 'Providing implementations for SPARQL extension functions.'\n---\n\nSPARQL allows non-standard, [custom extension functions](https://www.w3.org/TR/sparql11-query/#extensionFunctions) to be used within queries.\nIn order to provide an implementation to these extension functions,\nComunica allows developers to plug them in via the context.\n\n
\nTake into account that when writing SPARQL queries with extension functions,\nthese queries will no longer be portable to other types of query engines,\nas these extension functions may not be standardized.\n
\n\n## Dictionary-based extension functions\n\nThe easiest way to plug in extension functions to Comunica is by using\nthe `extensionFunctions` [context entry](/docs/query/advanced/context/)\nin a [JavaScript application](/docs/query/getting_started/query_app/):\n\n```typescript\nimport {DataFactory} from \"rdf-data-factory\";\n\nconst DF = new DataFactory();\n\nconst bindingsStream = await myEngine.queryBindings(`\nPREFIX func: \nSELECT ?caps WHERE {\n ?s ?p ?o.\n BIND (func:to-upper-case(?o) AS ?caps)\n}\n`, {\n sources: ['https://www.rubensworks.net/'],\n extensionFunctions: {\n 'http://example.org/functions#to-upper-case'(args: RDF.Term[]) {\n const arg = args[0];\n if (arg.termType === 'Literal' && arg.datatype.value === 'http://www.w3.org/2001/XMLSchema#string') {\n return DF.literal(arg.value.toUpperCase(), arg.datatype);\n }\n return arg;\n },\n },\n});\n```\n\nWithin this `extensionFunctions` dictionary, you can provide any number of extension functions.\nThese functions may even be `async`.\n\n## Callback-based extension functions\n\nIf function names are not known beforehand,\nor the dictionary-based format is not usable for whatever reason,\nthen the callback-based `extensionFunctionCreator` entry may be used:\n\n```typescript\nimport {DataFactory} from \"rdf-data-factory\";\n\nconst DF = new DataFactory();\n\nconst bindingsStream = await myEngine.queryBindings(`\nPREFIX func: \nSELECT ?caps WHERE {\n ?s ?p ?o.\n BIND (func:to-upper-case(?o) AS ?caps)\n}\n`, {\n sources: ['https://www.rubensworks.net/'],\n extensionFunctionCreator: (funcTerm: RDF.NamedNode) => {\n if (funcTerm.value === 'http://example.org/functions#to-upper-case') {\n return (args: RDF.Term[]) => {\n const arg = args[0];\n if (arg.termType === 'Literal' && arg.datatype.value === 'http://www.w3.org/2001/XMLSchema#string') {\n return DF.literal(arg.value.toUpperCase(), arg.datatype);\n }\n return arg;\n };\n }\n },\n});\n```\n\nThe `extensionFunctionCreator` is invoked upon any 
occurrence of an extension function,\nand is called with the extension function name, wrapped within an [RDF/JS named node](/docs/query/advanced/rdfjs/).\nThe return type of this function is expected to be a function with the same signature\nas the values of the `extensionFunctions` dictionary, or `undefined`.\n"},60711:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Federated Querying\'\ndescription: \'Query over the union of data within any number of sources\'\n---\n\nOne of the key features of Comunica\nis the ability to query over **multiple sources** of different types.\nThis concept of querying over multiple sources is called _federated querying_.\n\nThis functionality can be exploited on both\nthe [CLI](/docs/query/getting_started/query_cli/) and the [JavaScript API](/docs/query/getting_started/query_app/).\nIn this guide, we will make use of the CLI as an example.\n\n
\nFederated query execution does not just send the query to each source separately.\nInstead, the triples from all sources are considered one large virtual dataset, which can then be queried over.\n
\n\n## Distributed Knowledge\n\nA fundamental concept of Linked Data and the Semantic Web\nis that data can be spread over different sources across the Web.\nThis means that querying over this data potentially involves more than one source.\n\nWhile some knowledge graphs such as\n[DBpedia](https://wiki.dbpedia.org/) and [Wikidata](https://www.wikidata.org/wiki/Wikidata:Main_Page)\naim to accumulate as much data as possible in one place,\nthey always have limitations in scope.\nAs such, federated querying may be needed for some queries.\n\n## Federated Querying in Comunica\n\nComunica\'s ability to execute federated queries is enabled by default.\nThis can be invoked by simply passing more than one source to the engine.\n\nFor example, the following query will retrieve all triples from DBpedia and two RDF documents:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/ \\\n "SELECT * WHERE { ?s ?p ?o }"\n```\n\nThe example above shows that sources do not necessarily have to be of [the same type](/docs/query/advanced/source_types/).\n\n## Real-world federation example\n\nOne example of a real-world federated query\nis the task of linking people in DBpedia to library datasets.\nFor this, the [Virtual International Authority File](http://viaf.org/) can be used as a source to provide this linking.\n\nThe query below will retrieve all books in the Harvard Library written by people born in San Francisco:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n http://data.linkeddatafragments.org/viaf \\\n http://data.linkeddatafragments.org/harvard \\\n \'SELECT ?person ?name ?book ?title {\n ?person dbpedia-owl:birthPlace [ rdfs:label "San Francisco"@en ].\n ?viafID schema:sameAs ?person;\n schema:name ?name.\n ?book dc:contributor [ foaf:name ?name ];\n dc:title ?title.\n }\'\n```\n\n
\nThe TPF-based source https://fragments.dbpedia.org/2016-04/en is interchangeable with SPARQL-endpoint-based source https://dbpedia.org/sparql.\nThe engine will produce similar results as the sources represent the same dataset.\n
\n'},33889:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'GraphQL-LD\'\ndescription: \'Using the power of JSON-LD contexts, GraphQL queries can be executed by Comunica\'\n---\n\nInstead of SPARQL queries, you can also provide [**GraphQL-LD**](https://github.com/rubensworks/graphql-ld.js) queries,\nwhich are [GraphQL](https://graphql.org/) queries\nenhanced with a [JSON-LD](https://json-ld.org/) context.\nGraphQL-LD is a developer-friendly alternative to SPARQL that allows querying Linked Data and using the results in a straightforward way.\n\n## What is GraphQL-LD?\n\nAssuming the following SPARQL query:\n\n```sparql\nSELECT ?id ?starring WHERE {\n OPTIONAL {\n ?id ;\n ?starring.\n ?starring "Brad Pitt"@en.\n }\n}\n```\n\nThis could be written in a more compact way in GraphQL:\n\n```graphql\n{\n id\n ... on Film {\n starring(label: "Brad Pitt")\n }\n}\n```\n\nAnd this can be based on the following JSON-LD context:\n\n```json\n{\n "@context": {\n "Film": "http://dbpedia.org/ontology/Film",\n "label": { "@id": "http://www.w3.org/2000/01/rdf-schema#label", "@language": "en" },\n "starring": "http://dbpedia.org/ontology/starring"\n }\n}\n```\n\nLearn more about the **features** of GraphQL-LD on [GitHub](https://github.com/rubensworks/GraphQL-LD.js),\nor read [an article about GraphQL-LD](https://comunica.github.io/Article-ISWC2018-Demo-GraphQlLD/).\n\n## Using GraphQL-LD on the command line\n\nTo run GraphQL queries with [Comunica SPARQL from the command line](/docs/query/getting_started/query_cli/),\nset the `-i` flag to `graphql` and refer to your config file with the JSON-LD context (`@context`) through the `-c` flag.\nTo output your results as a GraphQL tree, set the MIME type of the output with `-t` to `tree`.\n\nFor example, the first 100 labels in DBpedia can be retrieved as follows:\n```bash\n$ comunica-sparql http://fragments.dbpedia.org/2015-10/en \\\n -q "{ label(first: 100) @single }" \\\n -c "{ \\"@context\\": { \\"label\\" : 
\\"http://www.w3.org/2000/01/rdf-schema#label\\" } }" \\\n -i graphql \\\n -t tree\n```\n\nSince the queries and contexts can be inconvenient to pass on the command line, they can also be supplied as files:\n```bash\n$ comunica-sparql http://fragments.dbpedia.org/2015-10/en \\\n -f query.graphql \\\n -c config-with-context.json \\\n -i graphql \\\n -t tree\n```\n\n## Using GraphQL-LD in an application\n\nIf you want to execute GraphQL-LD queries in [your application](/docs/query/getting_started/query_app/),\nyou can do this as follows:\n```javascript\nconst QueryEngine = require(\'@comunica/query-sparql\').QueryEngine;\nconst bindingsStreamToGraphQl = require(\'@comunica/actor-query-result-serialize-tree\').bindingsStreamToGraphQl;\n\nconst myEngine = new QueryEngine();\nconst result = await myEngine.query(`\n{\n label @single\n writer(label_en: \\"Michael Jackson\\") @single\n artist @single {\n label @single\n }\n}\n`, {\n sources: [\'http://fragments.dbpedia.org/2016-04/en\'],\n queryFormat: {\n language: \'graphql\',\n version: \'1.0\'\n },\n "@context": {\n "label": { "@id": "http://www.w3.org/2000/01/rdf-schema#label" },\n "label_en": { "@id": "http://www.w3.org/2000/01/rdf-schema#label", "@language": "en" },\n "writer": { "@id": "http://dbpedia.org/ontology/writer" },\n "artist": { "@id": "http://dbpedia.org/ontology/musicalArtist" }\n }\n});\n// Converts raw Comunica results to GraphQL objects\nconst data = await bindingsStreamToGraphQl(await result.execute(), result.context, {materializeRdfJsTerms: true});\n```\n'},85945:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'HDT'\ndescription: 'HDT offers highly compressed immutable RDF storage.'\n---\n\n[HDT](http://www.rdfhdt.org/) is a highly compressed RDF dataset format that enables efficient triple pattern querying.\nComunica enables executing SPARQL queries over HDT files,\nas it is one of the supported [source types](/docs/query/advanced/source_types/).\n\nQuerying over HDT requires [Comunica 
SPARQL HDT package (`@comunica/query-sparql-hdt`)](https://github.com/comunica/comunica-feature-hdt/tree/master/engines/query-sparql-hdt#readme).\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\nHDT requires GCC 4.9 or higher to be available.\n\nNext, we can install Comunica SPARQL HDT on our machine:\n```bash\n$ npm install -g @comunica/query-sparql-hdt\n```\n\n## 2. SPARQL querying over one HDT file\n\nAfter installing Comunica SPARQL HDT, you will be given access to several commands including `comunica-sparql-hdt`,\nwhich allows you to execute SPARQL queries from the command line.\n\nJust like `comunica-sparql`,\nthis command requires one or more URLs to be provided as **sources** to query over.\nAs the last argument, a **SPARQL query string** can be provided.\n\nFor example, the following query retrieves the first 100 triples from a local HDT file:\n```bash\n$ comunica-sparql-hdt hdt@path/to/myfile.hdt \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\"\n```\n\n## 3. SPARQL querying over multiple HDT files\n\nJust like `comunica-sparql`, querying over multiple sources simply requires you to pass them after each other:\n```bash\n$ comunica-sparql-hdt hdt@path/to/myfile1.hdt \\\n hdt@path/to/myfile2.hdt \\\n hdt@path/to/myfile3.hdt \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\"\n```\n\n## 4. 
Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql-hdt`.\nYou can learn more options by invoking the _help_ command, or by [reading the Comunica SPARQL documentation](/docs/query/getting_started/query_cli/):\n```text\n$ comunica-sparql-hdt --help\n```\n\nThe API for [querying over HDT files in JavaScript apps is identical to Comunica SPARQL](/docs/query/getting_started/query_app/),\nand just requires importing `@comunica/query-sparql-hdt` instead of `@comunica/query-sparql`.\n\nIn order to [set up a SPARQL endpoint, `comunica-sparql-hdt-http` can be used, just like Comunica SPARQL](/docs/query/getting_started/setup_endpoint/).\n"},50974:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Logging'\ndescription: 'Loggers can be set to different logging levels to inspect what Comunica is doing behind the scenes.'\n---\n\nIf you want to inspect what is going on during query execution,\nyou can enable a logger in Comunica.\n\n
\nThis guide focuses on configuring logging levels and printing output.\nIf you want to learn more about invoking a logger from within an actor implementation, refer to the corresponding guide.\n
\n\n## Logging on the command line\n\nUsing Comunica SPARQL on the command line, logging can be enabled via the `-l` option.\nFor example, printing debug-level logs can be done as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\" \\\n -l debug\n```\n```text\n[2022-02-23T09:46:17.615Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n[2022-02-23T09:46:17.756Z] INFO: Identified as qpf source: https://fragments.dbpedia.org/2016-04/en { actor: 'urn:comunica:default:rdf-resolve-hypermedia/actors#qpf' }\n[2022-02-23T09:46:17.761Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n[2022-02-23T09:46:17.785Z] DEBUG: Determined physical join operator 'inner-bind' {\n entries: 2,\n variables: [ [ 's', 'p', 'o' ], [ 's', 'o' ] ],\n costs: {\n 'inner-none': undefined,\n 'inner-single': undefined,\n 'inner-multi-empty': undefined,\n 'inner-bind': 6458426063925.053,\n 'inner-hash': undefined,\n 'inner-symmetric-hash': 
undefined,\n 'inner-nested-loop': 104059105829280600,\n 'optional-bind': undefined,\n 'optional-nested-loop': undefined,\n 'minus-hash': undefined,\n 'minus-hash-undef': undefined,\n 'inner-multi-smallest': undefined\n },\n coefficients: {\n 'inner-none': undefined,\n 'inner-single': undefined,\n 'inner-multi-empty': undefined,\n 'inner-bind': {\n iterations: 6404592831613.728,\n persistedItems: 0,\n blockingItems: 0,\n requestTime: 538332323.1132541\n },\n 'inner-hash': {\n iterations: 1140381039,\n persistedItems: 1040358853,\n blockingItems: 1040358853,\n requestTime: 1391277679.44\n },\n 'inner-symmetric-hash': {\n iterations: 1140381039,\n persistedItems: 1140381039,\n blockingItems: 0,\n requestTime: 1391277679.44\n },\n 'inner-nested-loop': {\n iterations: 104058966701512660,\n persistedItems: 0,\n blockingItems: 0,\n requestTime: 1391277679.44\n },\n 'optional-bind': undefined,\n 'optional-nested-loop': undefined,\n 'minus-hash': undefined,\n 'minus-hash-undef': undefined,\n 'inner-multi-smallest': undefined\n }\n}\n[2022-02-23T09:46:17.786Z] DEBUG: First entry for Bind Join: {\n entry: Quad {\n termType: 'Quad',\n value: '',\n subject: Variable { termType: 'Variable', value: 's' },\n predicate: NamedNode {\n termType: 'NamedNode',\n value: 'http://www.w3.org/1999/02/22-rdf-syntax-ns#type'\n },\n object: Variable { termType: 'Variable', value: 'o' },\n graph: DefaultGraph { termType: 'DefaultGraph', value: '' },\n type: 'pattern'\n },\n metadata: {\n requestTime: 18,\n pageSize: 100,\n cardinality: { type: 'estimate', value: 100022186 },\n first: 'https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type&page=1',\n next: 'https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type&page=2',\n previous: null,\n last: null,\n searchForms: { values: [Array] },\n canContainUndefs: false,\n order: undefined,\n availableOrders: undefined,\n variables: [ 
[Variable], [Variable] ]\n },\n actor: 'urn:comunica:default:rdf-join/actors#inner-multi-bind'\n}\n[2022-02-23T09:46:17.794Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en?subject=http%3A%2F%2Fcommons.wikimedia.org%2Fwiki%2FSpecial%3AFilePath%2F%21%21%21%E5%96%84%E7%A6%8F%E5%AF%BA.JPG&object=http%3A%2F%2Fdbpedia.org%2Fontology%2FImage {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n[2022-02-23T09:46:17.795Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en?subject=http%3A%2F%2Fcommons.wikimedia.org%2Fwiki%2FSpecial%3AFilePath%2F%21%21%21%E5%96%84%E7%A6%8F%E5%AF%BA.JPG&object=http%3A%2F%2Fwikidata.dbpedia.org%2Fontology%2FImage {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n```\n\nAll log messages will be printed to standard error (`stderr`).\n\nIf you only want to print the logs, you can void all query results as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\" \\\n -l debug > /dev/null\n```\n\nIf you want to redirect all logs to a file, you can forward them like this:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s 
?p ?o } LIMIT 100\" \\\n -l debug 2> /path/to/log.txt\n```\n\n## Logging levels\n\nThe following logging levels are available in Comunica:\n\n* `trace`\n* `debug`\n* `info`\n* `warn`\n* `error`\n* `fatal`\n\n
\nWhen enabling a level, all levels below it in the list above are also enabled.\nFor example, enabling `error` also enables `fatal`.\n
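This threshold behaviour amounts to a simple numeric comparison per level. The sketch below is an illustrative stand-in, not Comunica's actual `Logger` implementation; the `ThresholdLogger` class name is hypothetical:

```javascript
// Illustrative sketch of level-based log filtering; not Comunica's real logger.
const LEVELS = { trace: 0, debug: 1, info: 2, warn: 3, error: 4, fatal: 5 };

class ThresholdLogger {
  constructor(level) {
    // Everything at or above this numeric threshold is emitted.
    this.threshold = LEVELS[level];
    this.messages = [];
  }

  log(level, message) {
    // Messages below the configured threshold are silently dropped.
    if (LEVELS[level] >= this.threshold) {
      this.messages.push(`[${level.toUpperCase()}] ${message}`);
    }
  }
}

const logger = new ThresholdLogger('error');
logger.log('debug', 'dropped');
logger.log('error', 'kept');
logger.log('fatal', 'also kept');
console.log(logger.messages.length); // 2
```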
\n\n## Logging in an application\n\nUsing the `log` [context entry](/docs/query/advanced/context/), you can enable logging in a [JavaScript application that uses Comunica](/docs/query/getting_started/query_app/):\n```javascript\nimport {LoggerPretty} from \"@comunica/logger-pretty\";\n\nconst bindingsStream = await myEngine.queryBindings('SELECT * WHERE { ?s ?p ?o }', {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n log: new LoggerPretty({ level: 'debug' }),\n});\n```\n\nThis logger makes use of `LoggerPretty`, which will print everything to standard error (`stderr`),\njust like Comunica SPARQL on the command line.\n\nAlternatively, more advanced logging can be achieved by making use of [`@comunica/logger-bunyan`](https://github.com/comunica/comunica/tree/master/packages/logger-bunyan/),\nor by implementing your own logger that implements the [`Logger` interface](https://github.com/comunica/comunica/blob/master/packages/core/lib/Logger.ts).\n"},82329:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Memento'\ndescription: 'Using the Memento protocol, time travel queries can be executed.'\n---\n\nUsing the [Memento protocol](https://tools.ietf.org/html/rfc7089),\nit is possible to perform **time-based content negotiation** over HTTP.\nThis allows servers to expose different temporal versions of resources next to each other,\nand clients to retrieve these versions at different times.\n\nComunica has built-in support for the Memento protocol\n([`actor-http-memento`](https://github.com/comunica/comunica/tree/master/packages/actor-http-memento)).\nTo enable Memento, one simply passes a date to the query engine via the [context](/docs/query/advanced/context/),\nand Comunica will perform time-based negotiation for that date.\n\nFor example, the [DBpedia TPF interface supports the Memento protocol](https://ruben.verborgh.org/blog/2016/06/22/querying-history-with-linked-data/).\nIn order to query over it at version 2010 from the command line, a custom date 
can be passed with `-d`:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -d 'June 1, 2010' \\\n 'SELECT ?name ?deathDate WHERE {\n ?person a dbpedia-owl:Artist;\n rdfs:label ?name;\n dbpedia-owl:birthPlace [ rdfs:label \"York\"@en ].\n FILTER LANGMATCHES(LANG(?name), \"EN\")\n OPTIONAL { ?person dbpprop:dateOfDeath ?deathDate. }\n }'\n```\n\nDates can also be passed via the JavaScript API, via the [query engine context](/docs/query/advanced/context/).\n"},13330:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'HTTP Proxy'\ndescription: 'All HTTP requests can optionally go through a proxy.'\n---\n\nOptionally, you can configure a proxy to redirect all HTTP(S) traffic.\nThis is for example useful when Comunica is used in a Web browser\nwhere a [proxy enables CORS headers on all responses](https://www.npmjs.com/package/cors-anywhere).\n\n## Proxying on the command line\n\nVia the command line, a proxy can be enabled via the `-p` option as follows:\n```bash\n$ comunica-sparql http://fragments.dbpedia.org/2015-10/en \"SELECT * WHERE { ?s ?p ?o }\" \\\n -p http://myproxy.org/?uri=\n```\n\n## Proxying in an application\n\nWhen using [Comunica SPARQL in an application](/docs/query/getting_started/query_app/), a proxy can be set using the `httpProxyHandler` [context entry](/docs/query/advanced/context/):\n```javascript\nimport { ProxyHandlerStatic } from \"@comunica/actor-http-proxy\";\n\nconst bindingsStream = await myEngine.queryBindings('SELECT * WHERE { ?s ?p ?o }', {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n httpProxyHandler: new ProxyHandlerStatic('http://myproxy.org/?uri='),\n});\n```\n\nIn the example above, a `ProxyHandlerStatic` is passed,\nwhich will simply put the URL `http://myproxy.org/?uri=` in front of all URLs that would be requested.\n\nIf you need a more advanced proxy behaviour,\nthen you can implement your own proxy handler.\nAll proxy handlers must implement the [`IProxyHandler` 
interface](https://github.com/comunica/comunica/blob/master/packages/actor-http-proxy/lib/IProxyHandler.ts).\n"},68577:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'RDF/JS'\ndescription: 'To achieve maximum interoperability between different JavaScript libraries, Comunica builds on top of the RDF/JS specifications.'\n---\n\n
\n\nRDF/JS offers a set of RDF specifications for JavaScript libraries\nthat are defined by the [RDF JavaScript Libraries W3C community group](https://www.w3.org/community/rdfjs/).\nMost of the popular JavaScript libraries adhere to these specifications, which makes it possible to use them interchangeably, and in any combination.\nThis allows you to, for example, use an RDF parser from one developer and pipe its output into an RDF store from another developer.\n\nFor most of these specifications, corresponding [TypeScript typings exist](https://www.npmjs.com/package/@types/rdf-js),\nand many libraries ship with their own typings as well,\nwhich makes RDF/JS especially useful if you want to develop more strongly-typed JavaScript applications.\n\nComunica conforms to the following RDF/JS specifications.\n\n## Data model specification\n\nThe foundational part of RDF/JS is its [low-level **data model** specification](http://rdf.js.org/data-model-spec/),\nin which JavaScript interfaces are described for representing **RDF terms** and **RDF quads**.\nFive types of terms exist:\n\n* [Named Node](http://rdf.js.org/data-model-spec/#namednode-interface): Represents a thing by IRI, such as `https://www.rubensworks.net/#me`.\n* [Blank Node](http://rdf.js.org/data-model-spec/#blanknode-interface): Represents a thing without an explicit name.\n* [Literal](http://rdf.js.org/data-model-spec/#literal-interface): Represents a raw value of a certain datatype, such as `\"Ruben\"` or `1992`.\n* [Variable](http://rdf.js.org/data-model-spec/#variable-interface): Represents a variable, which can be used for matching values within queries.\n* [Default Graph](http://rdf.js.org/data-model-spec/#defaultgraph-interface): Represents the default graph in RDF. 
Other graphs can be represented with named or blank nodes.\n\n[RDF quads](http://rdf.js.org/data-model-spec/#quad-interface) are defined as an object with RDF terms for **subject**, **predicate**, **object** and **graph**.\nAn RDF triple is an alias of a quad,\nwhere the graph is set to the default graph.\nFor the remainder of this document, I will just refer to RDF quads.\n\nFinally, a [Data Factory](http://rdf.js.org/data-model-spec/#datafactory-interface) interface is defined,\nwhich allows you to easily create terms and quads that conform to this interface.\nDifferent Data Factory implementations exist, such as [`rdf-data-factory`](https://www.npmjs.com/package/rdf-data-factory)\nand the factory from [`N3.js`](https://github.com/rdfjs/N3.js#interface-specifications).\nFor example, creating a quad for representing someone's name with a data factory can be done like this:\n\n```javascript\nimport { DataFactory } from 'rdf-data-factory';\n\nconst factory = new DataFactory();\n\nconst quad = factory.quad(\n factory.namedNode('https://www.rubensworks.net/#me'), // subject\n factory.namedNode('http://schema.org/name'), // predicate\n factory.literal('Ruben') // object\n);\n```\n\nReading raw values from the quad can be done as follows:\n\n```javascript\nquad.subject.value === 'https://www.rubensworks.net/#me';\nquad.predicate.value === 'http://schema.org/name';\nquad.object.value === 'Ruben';\n```\n\nFor checking whether or not quads and terms are equal to each other, the `equals` method can be used:\n\n```javascript\nfactory.literal('Ruben').equals(factory.literal('Ruben')); // true\nfactory.literal('Ruben').equals(factory.literal('Ruben2')); // false\nquad.equals(quad); // true\n```\n\n## Stream interfaces\n\nComunica handles most parts of query execution in a **streaming** manner,\nwhich means that some query results may already be returned\neven though other results are still being processed.\n\nNext to the RDF/JS data model, a dedicated specification exists for 
handling [RDF streams](http://rdf.js.org/stream-spec/),\nwhich is highly important to Comunica.\n\nOne interface of high importance is the [RDF/JS `Source` interface](http://rdf.js.org/stream-spec/#source-interface).\nYou can [pass a custom `Source` to Comunica to execute queries over it](/docs/query/advanced/rdfjs_querying/).\n\nThe [RDF/JS `Store` interface](http://rdf.js.org/stream-spec/#store-interface) is an extension of `Source`\nthat also allows quads to be added and removed.\nYou can [pass a custom `Store` to Comunica to execute update queries over it](/docs/query/advanced/rdfjs_updating/).\n\n## Query interfaces\n\nThe [RDF/JS query spec](http://rdf.js.org/query-spec/) is a specification that provides\nhigh-level and low-level interfaces that are common to query engines.\nFor example, query engines implementing these high-level interfaces are mostly interchangeable when used within applications.\n\nThe most important high-level interfaces that are implemented by Comunica\nare the [Queryable](https://rdf.js.org/query-spec/#queryable-interfaces)\nand [SparqlQueryable](https://rdf.js.org/query-spec/#sparql-queryable-interfaces) interfaces.\nCompared to these standard interfaces, the only additional requirement that Comunica places is the usage\nof a [source-based context](https://rdf.js.org/query-spec/#querysourcecontext-interface) as second argument to the query methods.\n\nNext to that, Comunica also implements the [`BindingsFactory`](http://rdf.js.org/query-spec/#bindingsfactory-interface)\nand [`Bindings`](http://rdf.js.org/query-spec/#bindings-interface) interfaces via the\n[`@comunica/bindings-factory`](https://github.com/comunica/comunica/tree/master/packages/bindings-factory) package.\nLearn more about the usage of these bindings [here](/docs/query/advanced/bindings/).\n"},82075:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying over RDF/JS sources'\ndescription: 'If the built-in source types are not sufficient, you can pass a 
custom JavaScript object implementing a specific interface.'\n---\n\nOne of the [different types of sources](/docs/query/advanced/source_types/) that is supported by Comunica\nis the [RDF/JS `Source` interface](http://rdf.js.org/stream-spec/#source-interface).\nThis allows you to pass objects as source to Comunica as long as they implement this interface.\n\nAn RDF/JS `Source` exposes the [`match`](http://rdf.js.org/stream-spec/#source-interface) method\nthat allows quad pattern queries to be executed,\nand matching quads to be returned as a stream.\n\n
\n\nSeveral implementations of this `Source` interface exist.\nIn the example below, we make use of the [`Store` from `N3.js`](https://github.com/rdfjs/N3.js#storing)\nthat offers one possible implementation when you want to [query over it with Comunica within a JavaScript application](/docs/query/getting_started/query_app/):\n```javascript\nconst { namedNode } = N3.DataFactory;\nconst store = new N3.Store();\nstore.addQuad(\n namedNode('http://ex.org/Pluto'),\n namedNode('http://ex.org/type'),\n namedNode('http://ex.org/Dog')\n);\nstore.addQuad(\n namedNode('http://ex.org/Mickey'),\n namedNode('http://ex.org/type'),\n namedNode('http://ex.org/Mouse')\n);\n\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: [store],\n});\n```\n\n
\nInstead of the default Comunica SPARQL package (@comunica/query-sparql),\nthe Comunica SPARQL RDF/JS package (@comunica/query-sparql-rdfjs)\ncan be used as a more lightweight alternative\nthat only allows querying over RDF/JS sources.\n
\n\n
\nIf the RDF/JS `Source` also implements the RDF/JS `Store` interface,\nthen it also supports update queries to add, change, or delete quads in the store.\n
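The read/write split hinted at above can be illustrated with a toy in-memory store. Note this is a deliberate simplification: the real RDF/JS `Store` interface is stream-based (its `import` and `remove` methods accept quad streams), while this sketch uses synchronous arrays and hypothetical method names purely to show the idea:

```javascript
// Toy illustration of a Source that also supports updates.
// The real RDF/JS Store interface is stream-based (import/remove);
// this sketch uses synchronous arrays and hypothetical method names.
class ToyStore {
  constructor() {
    this.quads = [];
  }

  // Source side: read matching quads (undefined terms are wildcards).
  match(subject, predicate, object) {
    return this.quads.filter(q =>
      (subject === undefined || q.subject === subject) &&
      (predicate === undefined || q.predicate === predicate) &&
      (object === undefined || q.object === object));
  }

  // Store side: quads can be added...
  add(quad) {
    this.quads.push(quad);
  }

  // ...and removed again, which is what makes update queries possible.
  remove(subject, predicate, object) {
    const doomed = new Set(this.match(subject, predicate, object));
    this.quads = this.quads.filter(q => !doomed.has(q));
  }
}

const store = new ToyStore();
store.add({ subject: 'ex:Pluto', predicate: 'ex:type', object: 'ex:Dog' });
store.add({ subject: 'ex:Mickey', predicate: 'ex:type', object: 'ex:Mouse' });
store.remove('ex:Mickey');
console.log(store.quads.length); // 1
```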
\n\n## Optional: query optimization\n\nThe RDF/JS [Source interface](http://rdf.js.org/#source-interface) by default only exposes the `match` method.\nIn order to allow Comunica to produce more efficient query plans,\nyou can optionally expose a `countQuads` method that has the same signature as `match`,\nbut returns a `number` or `Promise` that represents (an estimate of)\nthe number of quads that would match the given quad pattern.\nCertain `Source` implementations may be able to provide an efficient implementation of this method,\nwhich would lead to better query performance.\n\nIf Comunica does not detect a `countQuads` method, it will fall back to a sub-optimal counting mechanism\nwhere `match` will be called again to manually count the number of matches.\n"},27124:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Updating RDF/JS stores\'\ndescription: \'If the built-in destination types are not sufficient, you can pass a custom JavaScript object implementing a specific interface.\'\n---\n\nOne of the [different types of destinations](/docs/query/advanced/destination_types/) that is supported by Comunica\nis the [RDF/JS `Store` interface](http://rdf.js.org/stream-spec/#store-interface).\nThis allows you to pass objects as destination to Comunica as long as they implement this interface.\n\n
\n\nSeveral implementations of this `Store` interface exist.\nIn the example below, we make use of the [`Store` from `N3.js`](https://github.com/rdfjs/N3.js#storing)\nthat offers one possible implementation when you want to [query over it with Comunica within a JavaScript application](/docs/query/getting_started/query_app/):\n```javascript\nconst store = new N3.Store();\n\nconst query = `\nPREFIX dc: \nINSERT DATA\n{ \n dc:title "A new book" ;\n dc:creator "A.N.Other" .\n}\n`;\n\n// Execute the update\nawait myEngine.queryVoid(query, {\n sources: [store],\n});\n\n// Prints \'2\' => the store is updated\nconsole.log(store.size);\n```\n'},54924:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Result formats\'\ndescription: \'Query results can be serialized in different formats.\'\n---\n\nBy default, Comunica has support for the following result formats:\n\n| **Media type** | **Description** |\n| ------- | --------------- |\n| [`application/json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-json) | A custom, simplified JSON result format. |\n| [`simple`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-simple) | A custom, text-based result format. |\n| [`application/sparql-results+json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-json) | The [SPARQL/JSON](https://www.w3.org/TR/sparql11-results-json/) results format. |\n| [`application/sparql-results+xml`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-xml) | The [SPARQL/XML](https://www.w3.org/TR/rdf-sparql-XMLres/) results format. |\n| [`text/csv`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-csv) | The [SPARQL/CSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) results format. 
|\n| [`text/tab-separated-values`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-tsv) | The [SPARQL/TSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) results format. |\n| [`stats`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-stats) | A custom results format for testing and debugging. |\n| [`table`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-table) | A text-based visual table result format. |\n| [`tree`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-tree) | A tree-based result format for GraphQL-LD result compacting. |\n| [`application/trig`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [TriG](https://www.w3.org/TR/trig/) RDF serialization. |\n| [`application/n-quads`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [N-Quads](https://www.w3.org/TR/n-quads/) RDF serialization. |\n| [`text/turtle`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [Turtle](https://www.w3.org/TR/turtle/) RDF serialization. |\n| [`application/n-triples`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [N-Triples](https://www.w3.org/TR/n-triples/) RDF serialization. |\n| [`text/n3`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [Notation3](https://www.w3.org/TeamSubmission/n3/) serialization. |\n| [`application/ld+json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [JSON-LD](https://json-ld.org/) RDF serialization. 
|\n\n## Querying from the command line\n\nWhen using [Comunica from the command line](/docs/query/getting_started/query_cli/),\nthe result format can be set using the `-t` option:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100" \\\n -t "application/sparql-results+json"\n```\n```json\n{"head": {"vars":["s","p","o"]},\n"results": { "bindings": [\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/date","type":"uri"},"o":{"value":"1899-05-06","type":"literal","datatype":"http://www.w3.org/2001/XMLSchema#date"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/isCitedBy","type":"uri"},"o":{"value":"http://dbpedia.org/resource/Tierce_(unit)","type":"uri"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/newspaper","type":"uri"},"o":{"value":"Biloxi Daily Herald","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/page","type":"uri"},"o":{"value":"6","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/title","type":"uri"},"o":{"value":"A New System of Weights and 
Measures","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/url","type":"uri"},"o":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"}},\n...\n```\n\n
\nAll available formats can be printed via `comunica-sparql --listformats`.\n
\n\n### Querying in a JavaScript app\n\nWhen using [Comunica in a JavaScript application](/docs/query/getting_started/query_app/),\nresults can be serialized to a certain format using `resultToString()`:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\nconst { data } = await myEngine.resultToString(result,\n \'application/sparql-results+json\');\ndata.pipe(process.stdout); // Print to standard output\n```\n\nThe `resultToString()` method accepts a query result and a result format media type.\nThe media type is optional, and will default to `application/json` for bindings, `application/trig` for quads, and `simple` for booleans.\n\n
\nAll available result formats can be retrieved programmatically\nby invoking the asynchronous `getResultMediaTypes()` method.\n
\n'},89366:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Solid'\ndescription: 'Solid – the Web-based decentralization ecosystem – can be queried with Comunica.'\n---\n\n## What is Solid\n\n[Solid](https://solidproject.org/) is a Web-based decentralization ecosystem\nwhere people are in control over their own data.\n\nSolid achieves this by giving everyone control over their own **personal data pod**.\nApplications are completely separate, and have to ask permission to access your data.\n\nSince Solid and Comunica have a compatible technology stack,\nComunica can be used to query over Solid data pods.\nThe default [Comunica SPARQL engine](/docs/query/getting_started/query_cli/)\ncan directly be used to query over public Solid data pods as long as you are querying over public data.\nIf you want to **query over data pods that require authentication**,\nyou can use one of the approaches mentioned below.\n\n## Query pods with a custom fetch function\n\nLibraries such as [@inrupt/solid-client-authn-node](https://www.npmjs.com/package/@inrupt/solid-client-authn-node)\nand [@inrupt/solid-client-authn-browser](https://www.npmjs.com/package/@inrupt/solid-client-authn-browser)\nallow you to authenticate with your Solid WebID.\nThese libraries provide a custom `fetch` function, using which you can execute authenticated HTTP requests.\n\nYou can forward this fetch function to Comunica SPARQL to make it perform authenticated queries to pods as shown below.\n\n```typescript\nimport { QueryEngine } from '@comunica/query-sparql-solid';\nimport { Session } from '@inrupt/solid-client-authn-node';\n\nconst session = new Session();\nconst myEngine = new QueryEngine();\n\nawait session.login({ ... 
}); // Log in as explained in https://docs.inrupt.com/developer-tools/javascript/client-libraries/tutorial/authenticate-nodejs-web-server/\n\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT * WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n // Set your profile as query source\n sources: [session.info.webId],\n // Pass the authenticated fetch function\n fetch: session.fetch,\n});\n```\n\n## Query pods with an existing Solid session\n\n[Comunica SPARQL Solid](https://github.com/comunica/comunica-feature-solid/tree/master/engines/query-sparql-solid)\nallows you to pass your authenticated Solid session object.\nHereafter, we list some examples on how to use it from JavaScript and the command line.\nPlease refer to the [README of Comunica SPARQL Solid](https://github.com/comunica/comunica-feature-solid/tree/master/engines/query-sparql-solid#readme)\nfor more details.\n\n**Querying from JavaScript**:\n```typescript\nimport { QueryEngine } from '@comunica/query-sparql-solid';\nimport { Session } from '@inrupt/solid-client-authn-node';\n\nconst session = new Session();\nconst myEngine = new QueryEngine();\n\nawait session.login({ ... 
}); // Log in as explained in https://docs.inrupt.com/developer-tools/javascript/client-libraries/tutorial/authenticate-nodejs-web-server/\n\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT * WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n // Set your profile as query source\n sources: [session.info.webId],\n // Pass your authenticated session\n '@comunica/actor-http-inrupt-solid-client-authn:session': session,\n});\n```\n\n**Querying an existing document**:\n```bash\n$ comunica-sparql-solid --idp https://solidcommunity.net/ \\\n http://example.org/existing-document.ttl \\\n \"SELECT * { ?s ?p ?o }\"\n```\n\n**Creating a new document**:\n```bash\n$ comunica-sparql-solid --idp https://solidcommunity.net/ \\\n http://example.org/new-document.ttl \\\n \"INSERT DATA { }\"\n```\n\n**Updating an existing document**:\n```bash\n$ comunica-sparql-solid --idp https://solidcommunity.net/ \\\n http://example.org/existing-document.ttl \\\n \"INSERT DATA { }\"\n```\n\nPlease be aware that there are several [open known issues](https://github.com/comunica/comunica-feature-solid/tree/master/engines/query-sparql-solid#known-issues) relating to other software.\n\n[LDflex](/docs/query/usage/#ldflex) and [GraphQL-LD](/docs/query/usage/#graphql-ld) are examples of tools that ship with Comunica SPARQL Solid.\n\n## Query pods using link traversal\n\nThe approaches for querying Solid mentioned above require you to know upfront in which pod and in which documents\nyour data resides before you can query over it.\n[_Comunica SPARQL Link Traversal Solid_](https://github.com/comunica/comunica-feature-link-traversal/tree/master/engines/query-sparql-link-traversal-solid#comunica-sparql-link-traversal)\nprovides a way to query over Solid pods without having to know beforehand in which documents the necessary data resides.\nIt does this by following links between documents _during query execution_.\n\nThis is still an experimental query approach, which does not yet work well for 
complex queries.\nLearn more about active [research on link traversal in Solid](https://comunica.dev/research/link_traversal/).\n\nThe example below executes a query across multiple simulated Solid pods to find all messages by a certain creator:\n\n```typescript\nimport { QueryEngine } from '@comunica/query-sparql-solid';\n\nconst myEngine = new QueryEngine();\nconst bindingsStream = await myEngine.queryBindings(`\n PREFIX snvoc: \n SELECT DISTINCT ?forumId ?forumTitle WHERE {\n ?message snvoc:hasCreator .\n ?forum snvoc:containerOf ?message;\n snvoc:id ?forumId;\n snvoc:title ?forumTitle.\n }`, {\n // Sources field is optional. Will be derived from query if not provided.\n //sources: [session.info.webId], // Sets your profile as query source\n // Session is optional for authenticated requests\n //'@comunica/actor-http-inrupt-solid-client-authn:session': session,\n // The lenient flag will make the engine not crash on invalid documents\n lenient: true,\n});\n```\n\nTry out this query above in our [live demo](https://comunica.github.io/comunica-feature-link-traversal-web-clients/builds/solid-default/#query=PREFIX%20snvoc%3A%20%3Chttps%3A%2F%2Fsolidbench.linkeddatafragments.org%2Fwww.ldbc.eu%2Fldbc_socialnet%2F1.0%2Fvocabulary%2F%3E%0ASELECT%20DISTINCT%20%3FforumId%20%3FforumTitle%20WHERE%20%7B%0A%20%20%3Fmessage%20snvoc%3AhasCreator%20%3Chttps%3A%2F%2Fsolidbench.linkeddatafragments.org%2Fpods%2F00000006597069767117%2Fprofile%2Fcard%23me%3E.%0A%20%20%3Fforum%20snvoc%3AcontainerOf%20%3Fmessage%3B%0A%20%20%20%20snvoc%3Aid%20%3FforumId%3B%0A%20%20%20%20snvoc%3Atitle%20%3FforumTitle.%0A%7D).\n"},42473:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Source types'\ndescription: 'Comunica detects and handles different types of sources.'\n---\n\nComunica SPARQL enables query execution over one or more sources\non both the [command line](/docs/query/getting_started/query_cli/)\nand when [calling Comunica from a JavaScript 
application](/docs/query/getting_started/query_app/).\n\nUsually, sources are passed as URLs that point to Web resources.\nBased on what is returned when _dereferencing_ this URL,\nComunica can apply different query algorithms.\n\nInstead of relying on Comunica's detection algorithms,\nyou can **enforce** the use of a certain type.\n\n
\nSome SPARQL endpoints may be detected as a plain file instead of a SPARQL endpoint because they do not support the SPARQL Service Description,\nwhich may produce incorrect results. In such cases, the `sparql` type must be set explicitly.\n
\n\n
\nWhen logging at the `info` level is enabled,\nthe logs will show which type Comunica has determined for each source.\n
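As described in the next section, the command line uses a `type@url` notation for enforcing a source type, which corresponds to the `{ type, value }` hash used in JavaScript. A toy parser for that notation (not Comunica's actual implementation) could look like:

```javascript
// Illustrative parser for the CLI source notation; not Comunica's actual code.
// 'sparql@https://dbpedia.org/sparql' -> { type: 'sparql', value: 'https://...' }
// Plain URLs without a prefix are left for automatic type detection.
// (Caveat: URLs containing userinfo like 'user@host' would need extra care.)
function parseSourceArg(arg) {
  const match = /^([a-zA-Z]+)@(.*)$/.exec(arg);
  if (match) {
    return { type: match[1], value: match[2] };
  }
  return { value: arg };
}

console.log(parseSourceArg('sparql@https://dbpedia.org/sparql'));
```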
\n\n## Setting source type on the command line\n\nOn the [command line](/docs/query/getting_started/query_cli/), source types can optionally be enforced by prefixing the URL with `@`, such as:\n```bash\n$ comunica-sparql sparql@https://dbpedia.org/sparql \\\n \"CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100\"\n```\n\n## Setting source type in an application\n\nVia a [JavaScript application](/docs/query/getting_started/query_app/),\nthe source type can be set by using a hash containing `type` and `value`:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n { type: 'sparql', value: 'https://dbpedia.org/sparql' },\n ],\n});\n```\n\n## Supported source types\n\nThe table below summarizes the different source types that Comunica supports by default:\n\n| **Type name** | **Description** |\n|---------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| `file` | plain RDF file in any RDF serialization, such as [Turtle](https://www.w3.org/TR/turtle/), [TriG](https://www.w3.org/TR/trig/), [JSON-LD](https://json-ld.org/), [RDFa](https://www.w3.org/TR/rdfa-primer/), ... 
|\n| `sparql` | [SPARQL endpoint](https://www.w3.org/TR/sparql11-protocol/) |\n| `hypermedia` | Sources that expose query capabilities via hypermedia metadata, such as [Triple Pattern Fragments](https://linkeddatafragments.org/specification/triple-pattern-fragments/) and [Quad Pattern Fragments](https://linkeddatafragments.org/specification/quad-pattern-fragments/) |\n| `qpf` | A hypermedia source that is enforced as [Triple Pattern Fragments](https://linkeddatafragments.org/specification/triple-pattern-fragments/) or [Quad Pattern Fragments](https://linkeddatafragments.org/specification/quad-pattern-fragments/) |\n| `brtpf` | A hypermedia source that is enforced as [bindings-restricted Triple Pattern Fragments](https://arxiv.org/abs/1608.08148) |\n| `rdfjs` | JavaScript objects implementing the [RDF/JS `source` interface](/docs/query/advanced/rdfjs_querying/) |\n| `serialized` | An RDF dataset serialized as a string in a certain format. |\n| `hdt` | [HDT files](/docs/query/advanced/hdt/) |\n| `ostrichFile` | Versioned [OSTRICH archives](https://github.com/rdfostrich/comunica-query-sparql-ostrich) |\n\nThe default source type is `auto`,\nwhich will automatically detect the proper source type.\nFor example, if a [SPARQL Service Description](https://www.w3.org/TR/sparql11-service-description/)\nis detected, the `sparql` type is used.\n\n## RDF serializations\n\nComunica will interpret the `Content-Type` header of HTTP responses to determine used RDF serialization.\nIf the server did not provide such a header, Comunica will attempt to derive the serialization based on the extension.\n\nThe following RDF serializations are supported:\n\n| **Name** | **Content type** | **Extensions** |\n| -------- | ---------------- | ------------- |\n| [TriG](https://www.w3.org/TR/trig/) | `application/trig` | `.trig` |\n| [N-Quads](https://www.w3.org/TR/n-quads/) | `application/n-quads` | `.nq`, `.nquads` |\n| [Turtle](https://www.w3.org/TR/turtle/) | `text/turtle` | `.ttl`, `.turtle` 
|\n| [N-Triples](https://www.w3.org/TR/n-triples/) | `application/n-triples` | `.nt`, `.ntriples` |\n| [Notation3](https://www.w3.org/TeamSubmission/n3/) | `text/n3` | `.n3` |\n| [JSON-LD](https://json-ld.org/) | `application/ld+json`, `application/json` | `.json`, `.jsonld` |\n| [RDF/XML](https://www.w3.org/TR/rdf-syntax-grammar/) | `application/rdf+xml` | `.rdf`, `.rdfxml`, `.owl` |\n| [RDFa](https://www.w3.org/TR/rdfa-in-html/) and script RDF data tags [HTML](https://html.spec.whatwg.org/multipage/)/[XHTML](https://www.w3.org/TR/xhtml-rdfa/) | `text/html`, `application/xhtml+xml` | `.html`, `.htm`, `.xhtml`, `.xht` |\n| [RDFa](https://www.w3.org/TR/2008/REC-SVGTiny12-20081222/metadata.html#MetadataAttributes) in [SVG](https://www.w3.org/TR/SVGTiny12/)/[XML](https://html.spec.whatwg.org/multipage/) | `image/svg+xml`,`application/xml` | `.xml`, `.svg`, `.svgz` |\n\n## String source\n\nString-based sources allow you to query over sources that are represented as a string in a certain RDF serialization.\n\nFor example, querying over a Turtle-based datasource:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n {\n type: 'serialized',\n value: '. 
.',\n mediaType: 'text/turtle',\n baseIRI: 'http://example.org/',\n },\n ],\n});\n```\n"},72821:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'SPARQL query types\'\ndescription: \'Different SPARQL query types are possible, such as SELECT, CONSTRUCT, ASK, ...\'\n---\n\nThe [SPARQL 1.1 query specification](https://www.w3.org/TR/sparql11-query/)\nintroduces four query types:\n\n* `SELECT`: Return matches as a collection of solution bindings.\n* `CONSTRUCT`: Create RDF triples from matches.\n* `DESCRIBE`: Create RDF triples about a resource.\n* `ASK`: Check if at least one match exists.\n\nThe [SPARQL 1.1 update specification](https://www.w3.org/TR/sparql11-update/)\nalso introduces query types that modify data, but return no output.\n\nThis guide shows how to handle these query types from the [command line](/docs/query/getting_started/query_cli/)\nand via [a JavaScript application](/docs/query/getting_started/query_app/).\n\n
\nQuery results for each of these query types can be represented in different formats.\n
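The four query types map onto different result shapes. The RDF/JS Query specification, which Comunica follows, tags every result with a `resultType` of `'bindings'`, `'quads'`, `'boolean'`, or `'void'`, so an application can dispatch generically. The sketch below is illustrative only; the `describeResult()` helper is our own, not part of any API:

```javascript
// Sketch: dispatching on the result type of a generic query() result,
// following the RDF/JS Query specification's { resultType, execute() } shape.
// describeResult() is purely illustrative.
function describeResult(result) {
  switch (result.resultType) {
    case 'bindings':
      return 'SELECT: stream of solution bindings';
    case 'quads':
      return 'CONSTRUCT/DESCRIBE: stream of RDF quads';
    case 'boolean':
      return 'ASK: a single boolean';
    case 'void':
      return 'update: no output';
    default:
      throw new Error(`Unknown result type: ${result.resultType}`);
  }
}

// Example with a mocked ASK result:
console.log(describeResult({ resultType: 'boolean' }));
```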
\n\n## 1. Command line\n\nBelow, the different query type usages are summarized.\nMore information can be found in the [command line guide](/docs/query/getting_started/query_cli/).\n\n### 1.1. `SELECT`\n\nThe following query retrieves the first 100 triples from [DBpedia](https://fragments.dbpedia.org/2016-04/en):\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nAs output, a JSON array of bindings for the selected variables will be returned:\n```\n[\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/1999/02/22-rdf-syntax-ns#type","?o":"http://rdfs.org/ns/void#datasource"},\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/1999/02/22-rdf-syntax-ns#type","?o":"http://www.w3.org/ns/hydra/core#Collection"},\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/ns/hydra/core#search","?o":"https://fragments.dbpedia.org/2016-04/en#triplePattern"}\n...\n```\n\n### 1.2. `CONSTRUCT`\n\nNext to SPARQL `SELECT` queries,\nit is also possible to execute `CONSTRUCT` queries to produce RDF triples:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n```text\n "2010-04-21"^^;\n "1939-01-02"^^;\n "PDF";\n ;\n "Sheboygan, Wisconsin";\n "1";\n...\n```\n\n### 1.3. `DESCRIBE`\n\nSimilar to `CONSTRUCT`, `DESCRIBE` will output triples that are connected to a given resource by any predicate:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "DESCRIBE "\n```\n```text\n, , , , , , , , , , ;\n "4945528"^^;\n "14830"^^;\n "71"^^;\n "697541030"^^;\n...\n```\n\n### 1.4. `ASK`\n\n`ASK` queries will produce a boolean output:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "ASK { ?s ?p ?o }"\n```\n```\ntrue\n```\n\n### 1.5. 
Update\n\nUpdate queries will produce no output, unless an error occurs:\n```bash\n$ comunica-sparql https://example.org/file.ttl \\\n "INSERT DATA { }"\n```\n\n## 2. Application\n\nBelow, the different query type usages are summarized.\nMore information can be found in the [application guide](/docs/query/getting_started/query_app/).\n\n### 2.1. `SELECT`\n\nThe following query retrieves the first 100 triples from [DBpedia](https://fragments.dbpedia.org/2016-04/en):\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\nbindingsStream.on(\'data\', (binding) => {\n console.log(binding.get(\'s\').value);\n console.log(binding.get(\'p\').value);\n console.log(binding.get(\'o\').value);\n});\n```\n\n### 2.2. `CONSTRUCT`\n\nNext to SPARQL `SELECT` queries,\nit is also possible to execute `CONSTRUCT` queries to produce RDF triples:\n```javascript\nconst quadStream = await myEngine.queryQuads(`\n CONSTRUCT WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\n```\n```javascript\nquadStream.on(\'data\', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\n### 2.3. `DESCRIBE`\n\nSimilar to `CONSTRUCT`, `DESCRIBE` will output triples that are connected to a given resource by any predicate:\n```javascript\nconst quadStream = await myEngine.queryQuads(`\n DESCRIBE `, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\n```\n```javascript\nquadStream.on(\'data\', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\n### 2.4. 
`ASK`\n\n`ASK` queries will produce a boolean output:\n```javascript\nconst hasMatches = await myEngine.queryBoolean(`\n ASK {\n ?s ?p \n }`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\n```\n\n### 2.5. Update\n\nUpdate queries will produce a void output:\n```javascript\nawait myEngine.queryVoid(`\n PREFIX dc: \n INSERT DATA\n { \n dc:title "A new book" ;\n dc:creator "A.N.Other" .\n }`, {\n sources: [ store ],\n});\n```\n'},60187:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Supported specifications\'\ndescription: \'Comunica supports several RDF-related specifications\'\n---\n\nThis page summarizes the specifications Comunica implements.\n\n## Query standards\n\nThe following standard query specifications are supported:\n\n| **Description** |\n|-------------------------------------------------------------------------------------------------------------------------|\n| [SPARQL 1.1 Query Language](https://www.w3.org/TR/sparql11-query/) |\n| [SPARQL 1.1 Update](https://www.w3.org/TR/sparql11-update/) |\n| [SPARQL 1.1 Service Description](https://www.w3.org/TR/sparql11-service-description/) |\n| [SPARQL 1.1 Federated Query](https://www.w3.org/TR/sparql11-federated-query/) |\n| [SPARQL 1.1 Query Results JSON Format](https://www.w3.org/TR/sparql11-results-json/) |\n| [SPARQL Query Results XML Format (Second Edition)](https://www.w3.org/TR/rdf-sparql-XMLres/) |\n| [SPARQL 1.1 Query Results CSV and TSV Formats](https://www.w3.org/TR/sparql11-results-csv-tsv/) |\n| [SPARQL 1.1 Protocol](https://www.w3.org/TR/sparql11-protocol/) |\n| [SPARQL next SEP 0002 - Excluding ADJUST function](https://github.com/w3c/sparql-12/blob/main/SEP/SEP-0002/sep-0002.md) |\n| [RDF-star and SPARQL-star](https://www.w3.org/2021/12/rdf-star.html) | \n\nThe following notable specifications are not supported _yet_:\n\n| **Description** |\n| ------- |\n| [SPARQL 1.1 Entailment Regimes](https://www.w3.org/TR/sparql11-entailment/) |\n| [SPARQL 1.1 Graph Store HTTP 
Protocol](https://www.w3.org/TR/sparql11-http-rdf-update/) |\n\n## Serializing SPARQL results\n\nSPARQL query results can be serialized in [different formats](/docs/query/advanced/result_formats/).\nFor all of these supported formats, the following are standards:\n\n| **Media type** | **Description** |\n| ------- | --------------- |\n| [`application/sparql-results+json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-json) | The [SPARQL/JSON](https://www.w3.org/TR/sparql11-results-json/) results format. |\n| [`application/sparql-results+xml`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-xml) | The [SPARQL/XML](https://www.w3.org/TR/rdf-sparql-XMLres/) results format. |\n| [`text/csv`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-csv) | The [SPARQL/CSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) results format. |\n| [`text/tab-separated-values`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-tsv) | The [SPARQL/TSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) results format. |\n\n
\nAll serializers work in a streaming manner.\n
\n\nNext to these, RDF serializations are supported, as shown below.\n\n## Serializing RDF\n\nRDF triples/quads can be serialized via the following RDF serializations:\n\n| **Media type** | **Description** |\n| ------- | --------------- |\n| [`application/trig`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [TriG](https://www.w3.org/TR/trig/) RDF serialization. |\n| [`application/n-quads`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [N-Quads](https://www.w3.org/TR/n-quads/) RDF serialization. |\n| [`text/turtle`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [Turtle](https://www.w3.org/TR/turtle/) RDF serialization. |\n| [`application/n-triples`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [N-Triples](https://www.w3.org/TR/n-triples/) RDF serialization. |\n| [`text/n3`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [Notation3](https://www.w3.org/TeamSubmission/n3/) serialization. |\n| [`application/ld+json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [JSON-LD](https://json-ld.org/) RDF serialization. |\n\n
\n\n## RDF/JS\n\nAlignment with other JavaScript libraries is achieved via the following RDF/JS specifications:\n\n| **Description** |\n| ------- |\n| [RDF/JS Query specification](https://rdf.js.org/query-spec/) |\n| [RDF/JS Stream interfaces specification](https://rdf.js.org/stream-spec/) |\n| [RDF/JS Data model specification](https://rdf.js.org/data-model-spec/) |\n'},51527:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Modify Comunica'\ndescription: 'Learn how to configure your own Comunica engine, or extend Comunica by implementing new components.'\nindex: true\n---\n\nThe following guides show how to configure your own Comunica engine, or extend Comunica by implementing new components.\nIf you want to see some full examples,\nhave a look at our dedicated [examples repository](https://github.com/comunica/examples).\n"},22111:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Getting started with modification'\ndescription: 'Basic guides on how to easily get started with Comunica modification.'\nindex: true\n---\n"},4228:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying with a custom configuration from the command line\'\ndescription: \'Create a custom configuration of Comunica modules with reduced features, and query with it from the command line.\'\n---\n\nWhile packages such as [Comunica SPARQL](https://github.com/comunica/comunica/tree/master/engines/query-sparql)\nship with a default configuration that offer specific querying functionality,\nit is possible to **override these configurations**,\nso that you can modify the internal capabilities of your query engine.\n\nIn this guide, we will keep it simple,\nand we will just **remove some parts of the config file** to create a more lightweight query engine,\nand query it from the command line.\nIn a next guide, we will look into [querying with a custom config from a JavaScript app](/docs/modify/getting_started/custom_config_app/). \n\n
\n\n## 1. Requirements of a config file\n\nComunica is composed of a **set of _[actors](/docs/modify/advanced/architecture_core/)_**\nthat execute specific tasks.\nFor example, all SPARQL query operators (`DISTINCT`, `FILTER`, `ASK`, ...)\nhave a corresponding actor that implements them in a certain way.\n\nBy modifying the Comunica config file,\nit is possible to **plug in** different implementations for certain SPARQL query operators,\nin case you for example have a more efficient implementation yourself. \n\n### Main config file\n\nA **Comunica config is written in JSON**, and typically looks something like this:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/config-query-sparql/^3.0.0/components/context.jsonld"\n ],\n "@id": "urn:comunica:my",\n "@type": "Runner",\n "import": [\n "ccqs:config/context-preprocess/actors.json",\n "ccqs:config/context-preprocess/mediators.json",\n "ccqs:config/http/actors.json",\n "ccqs:config/http/mediators.json",\n "ccqs:config/init/actors.json",\n "ccqs:config/optimize-query-operation/actors.json",\n "ccqs:config/optimize-query-operation/mediators.json",\n "ccqs:config/query-operation/actors.json",\n "ccqs:config/query-operation/mediators.json"\n ]\n}\n``` \n\nEssentially, this config file contains a list of imports to smaller config files,\nwhich are loaded in when Comunica reads this config file.\n\nThese imported config files each represent a component on a particular bus.\nFor example `ccqs:config/query-operation/actors.json` refers to all actors that are registered on the query operation bus,\nand `ccqs:config/query-operation/mediators.json` refers to the mediators that are defined over the query operation bus.\n\n
\n\nThe `ccqs:` prefix refers to the scope of the `@comunica/config-query-sparql` package,\nwhich means that all paths following it refer to files within this package.\n\n### Imported config file\n\nFor example, the imported config file `ccqs:config/query-operation/actors.json` could look something like this:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/config-query-sparql/^3.0.0/components/context.jsonld"\n ],\n "import": [\n "ccqs:config/query-operation/actors/query/ask.json",\n "ccqs:config/query-operation/actors/query/bgp.json",\n "ccqs:config/query-operation/actors/query/construct.json",\n "ccqs:config/query-operation/actors/query/describe.json",\n "ccqs:config/query-operation/actors/query/distinct.json",\n "ccqs:config/query-operation/actors/query/extend.json",\n "ccqs:config/query-operation/actors/query/filter.json",\n "ccqs:config/query-operation/actors/query/from.json",\n "ccqs:config/query-operation/actors/query/group.json",\n "ccqs:config/query-operation/actors/query/join.json",\n "ccqs:config/query-operation/actors/query/leftjoin.json",\n "ccqs:config/query-operation/actors/query/minus.json",\n "ccqs:config/query-operation/actors/query/nop.json",\n "ccqs:config/query-operation/actors/query/orderby.json",\n "ccqs:config/query-operation/actors/query/project.json",\n "ccqs:config/query-operation/actors/query/quadpattern.json",\n "ccqs:config/query-operation/actors/query/reduced.json",\n "ccqs:config/query-operation/actors/query/service.json",\n "ccqs:config/query-operation/actors/query/slice.json",\n "ccqs:config/query-operation/actors/query/sparql-endpoint.json",\n "ccqs:config/query-operation/actors/query/union.json",\n "ccqs:config/query-operation/actors/query/values.json"\n ]\n}\n```\n\nThis example config file imports several smaller config files,\nwhere each config file contains a single _[actor](/docs/modify/advanced/architecture_core/)_ that will be loaded into Comunica.\n\nFor example, the 
`ccqs:config/query-operation/actors/query/ask.json` file could look as follows:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/runner/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-ask/^3.0.0/components/context.jsonld"\n ],\n "@id": "urn:comunica:default:Runner",\n "@type": "Runner",\n "actors": [\n {\n "@id": "urn:comunica:default:query-operation/actors#ask",\n "@type": "ActorQueryOperationAsk",\n "mediatorQueryOperation": { "@id": "urn:comunica:default:query-operation/mediators#main" }\n }\n ]\n}\n```\n\nEach configured actor fulfills a specific task, e.g.:\n\n* `ActorQueryOperationAsk`: Executes SPARQL `ASK` queries.\n* `ActorQueryOperationDistinctHash`: Executes the SPARQL `DISTINCT` operator.\n* `ActorQueryOperationFilterSparqlee`: Executes SPARQL `FILTER` expressions.\n\n
\nWhile the exact meaning of these config files is not important for this guide,\nif you want to learn more about their details,\nhave a look at the guide on\nconfiguration files.\n
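As a mental model for the `ccqs:` prefix, it behaves like a plain string expansion into the `@comunica/config-query-sparql` package. The sketch below is purely illustrative; the real resolution is performed by the Components.js dependency injection framework, not by a function like this:

```javascript
// Illustrative sketch only: Comunica's actual prefix resolution is handled
// by Components.js, not by this function.
const PREFIXES = {
  'ccqs:': '@comunica/config-query-sparql/',
};

function expandConfigIri(iri) {
  for (const [prefix, base] of Object.entries(PREFIXES)) {
    if (iri.startsWith(prefix)) {
      return base + iri.slice(prefix.length);
    }
  }
  return iri; // No known prefix: leave as-is.
}

console.log(expandConfigIri('ccqs:config/query-operation/actors.json'));
```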
\n\n## 2. Install Comunica SPARQL\n\nSince we want to override the default config of **Comunica SPARQL**,\nwe have to make sure its package is installed first:\n\n```bash\n$ npm install -g @comunica/query-sparql\n```\n\n## 3. Start from an existing config file\n\nThe easiest way to create a custom config, is to start from an existing one, and add/remove things to fit your needs.\n\nLet\'s start by creating a new empty directory,\nand create a file called `config.json`.\n\nIn this guide, we will start from\nthe [Comunica SPARQL default config file](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/config-default.json).\nLet\'s **copy its contents entirely into our `config.json`**:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/config-query-sparql/^3.0.0/components/context.jsonld"\n ],\n "import": [\n "ccqs:config/context-preprocess/actors.json",\n "ccqs:config/context-preprocess/mediators.json",\n "ccqs:config/hash-bindings/actors.json",\n "ccqs:config/hash-bindings/mediators.json",\n "ccqs:config/http/actors.json",\n "ccqs:config/http/mediators.json",\n "ccqs:config/http-invalidate/actors.json",\n "ccqs:config/http-invalidate/mediators.json",\n "ccqs:config/init/actors.json",\n "ccqs:config/merge-bindings-context/actors.json",\n "ccqs:config/merge-bindings-context/mediators.json",\n "ccqs:config/optimize-query-operation/actors.json",\n "ccqs:config/optimize-query-operation/mediators.json",\n "ccqs:config/query-operation/actors.json",\n "ccqs:config/query-operation/mediators.json",\n "ccqs:config/query-parse/actors.json",\n "ccqs:config/query-parse/mediators.json",\n "ccqs:config/query-process/actors.json",\n "ccqs:config/query-process/mediators.json",\n "ccqs:config/query-result-serialize/actors.json",\n "ccqs:config/query-result-serialize/mediators.json",\n "ccqs:config/query-source-identify/actors.json",\n "ccqs:config/query-source-identify/mediators.json",\n 
"ccqs:config/query-source-identify-hypermedia/actors.json",\n "ccqs:config/query-source-identify-hypermedia/mediators.json",\n "ccqs:config/dereference/actors.json",\n "ccqs:config/dereference/mediators.json",\n "ccqs:config/dereference-rdf/actors.json",\n "ccqs:config/dereference-rdf/mediators.json",\n "ccqs:config/rdf-join/actors.json",\n "ccqs:config/rdf-join/mediators.json",\n "ccqs:config/rdf-join-entries-sort/actors.json",\n "ccqs:config/rdf-join-entries-sort/mediators.json",\n "ccqs:config/rdf-join-selectivity/actors.json",\n "ccqs:config/rdf-join-selectivity/mediators.json",\n "ccqs:config/rdf-metadata/actors.json",\n "ccqs:config/rdf-metadata/mediators.json",\n "ccqs:config/rdf-metadata-accumulate/actors.json",\n "ccqs:config/rdf-metadata-accumulate/mediators.json",\n "ccqs:config/rdf-metadata-extract/actors.json",\n "ccqs:config/rdf-metadata-extract/mediators.json",\n "ccqs:config/rdf-parse/actors.json",\n "ccqs:config/rdf-parse/mediators.json",\n "ccqs:config/rdf-parse-html/actors.json",\n "ccqs:config/rdf-resolve-hypermedia-links/actors.json",\n "ccqs:config/rdf-resolve-hypermedia-links/mediators.json",\n "ccqs:config/rdf-resolve-hypermedia-links-queue/actors.json",\n "ccqs:config/rdf-resolve-hypermedia-links-queue/mediators.json",\n "ccqs:config/rdf-serialize/actors.json",\n "ccqs:config/rdf-serialize/mediators.json",\n "ccqs:config/rdf-update-hypermedia/actors.json",\n "ccqs:config/rdf-update-hypermedia/mediators.json",\n "ccqs:config/rdf-update-quads/actors.json",\n "ccqs:config/rdf-update-quads/mediators.json"\n ]\n}\n```\n\n## 4. 
Execute with Comunica SPARQL\n\nWhile we usually use `comunica-sparql` to invoke Comunica SPARQL on the command line,\nwe can instead call `comunica-dynamic-sparql` with exactly the same arguments\nto allow **loading in a custom config file**.\n\nIn order to specify a custom config file,\nwe have to set the path to our config file via the `COMUNICA_CONFIG` environment variable:\n```bash\n$ export COMUNICA_CONFIG="config.json"\n```\n\nIf you now execute `comunica-dynamic-sparql`,\nit will load in your `config.json` file.\n\nLet\'s try a simple query to see if this works:\n```bash\n$ comunica-dynamic-sparql http://fragments.dbpedia.org/2016-04/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n
\nIf you don\'t define the COMUNICA_CONFIG environment variable,\ncomunica-dynamic-sparql will fall back to the default Comunica SPARQL config file.\n
\n\n
\ncomunica-dynamic-sparql has a significant startup delay compared to comunica-sparql,\nsince it now has to load in, parse, and interpret a config file.\ncomunica-dynamic-sparql should therefore only be used for simple testing\nbefore you ship your query engine in a separate package.\n
\n\n## 5. Removing RDF serialization actors\n\nAs an example, we will **remove all actors that can output results in any RDF format**.\nAll of these actors are defined in the `ccqs:config/rdf-serialize/actors.json` config file.\n\nBefore we make any changes to our config file,\nlet us inspect the result formats that are currently available:\n```bash\n$ comunica-dynamic-sparql --listformats\napplication/ld+json\napplication/trig\napplication/n-quads\ntext/turtle\napplication/n-triples\ntext/n3\nstats\ntree\ntable\napplication/sparql-results+xml\ntext/tab-separated-values\napplication/sparql-results+json\ntext/csv\nsimple\napplication/json\n```\n\nThe first 6 of those formats are RDF serialization formats,\nwhich are mainly used for outputting `CONSTRUCT` query results.\n\nIf we want to remove those actors from the config file,\nwe can remove the following line from our `config.json`:\n```diff\n- "ccqs:config/rdf-serialize/actors.json",\n```\n\nIf we now inspect the available result formats, we get the following:\n```bash\n$ comunica-dynamic-sparql --listformats\nstats\ntree\ntable\napplication/sparql-results+xml\ntext/tab-separated-values\napplication/sparql-results+json\ntext/csv\nsimple\napplication/json\n```\n\nAs you can see, the 6 RDF serialization formats are not present anymore.\nThis is because Comunica has not loaded them in because we have removed them from our config file.\n\n## 6. 
Only allowing `SELECT` queries\n\nLet\'s take our config modifications a step further,\nand let\'s say our goal is to build a query engine that can **_only_ execute `SELECT`** queries,\nand we don\'t want to be able to execute `CONSTRUCT` and `DESCRIBE` queries.\nThis will require us to remove some more actors.\n\nWhile the actors for `CONSTRUCT` and `DESCRIBE` are defined in `ccqs:config/query-operation/actors.json`,\nwe can not just simply remove that file from our imports,\nbecause it also contains actors for other SPARQL query operators which we don\'t want to remove, such as `SELECT`.\nInstead of _just_ removing `ccqs:config/query-operation/actors.json`,\nwe will remove it _and_ copy its contents directly into our config file.\n\n### 6.1. Inline an imported config\n\nTo do this, first **remove** the following line from our `config.json`:\n```text\n- "ccqs:config/query-operation/actors.json",\n```\n\nNext, **copy the `"import"` entries** from [`ccqs:config/query-operation/actors.json`](https://raw.githubusercontent.com/comunica/comunica/master/engines/config-query-sparql/config/query-operation/actors.json) ([GitHub](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/query-operation/actors.json)),\nand paste it after the current `"import"` entries in our `config.json`.\n\nYour `config.json` file should have the following structure now:\n```text\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/config-query-sparql/^3.0.0/components/context.jsonld"\n ],\n "import": [\n "ccqs:config/context-preprocess/actors.json",\n "ccqs:config/context-preprocess/mediators.json",\n ...\n "ccqs:config/rdf-update-quads/actors.json",\n "ccqs:config/rdf-update-quads/mediators.json",\n \n "ccqs:config/query-operation/actors/query/ask.json",\n "ccqs:config/query-operation/actors/query/bgp.json",\n "ccqs:config/query-operation/actors/query/construct.json",\n ...\n "ccqs:config/query-operation/actors/update/load.json",\n 
"ccqs:config/query-operation/actors/update/move.json"\n ]\n}\n\n```\n\n
\nAt this point, your config file should still be valid.\nConfirm this by executing comunica-dynamic-sparql.\n
\n\n### 6.2. Remove actors\n\nNext, we will remove the query operation actors we don\'t need.\nConcretely, we will remove the following imports to actors:\n\n* `ccqs:config/query-operation/actors/query/construct.json`: Handles `CONSTRUCT` queries.\n* `ccqs:config/query-operation/actors/query/describe.json`: Handles `DESCRIBE` queries.\n\nFor this, remove the following lines:\n```diff\n- "ccqs:config/query-operation/actors/query/construct.json",\n- "ccqs:config/query-operation/actors/query/describe.json",\n```\n\n### 6.3. Test changes\n\nAfter this change, you should now be unable to execute `CONSTRUCT` or `DESCRIBE` queries.\nTry this out by executing the following:\n```bash\n$ comunica-dynamic-sparql http://fragments.dbpedia.org/2016-04/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nExecuting a `SELECT` query will still work:\n```bash\n$ comunica-dynamic-sparql http://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nYou have now successfully built your own custom Comunica engine that is a bit more lightweight than the default one.\nJust like the `CONSTRUCT` and `DESCRIBE` actors,\nyou can remove any other actors you don\'t want to make it even more lightweight.\n\n
\nLoading custom configs from the command line is limited to loading from a single custom config file.\nIf you want to split up your config file over different parts, you have to load it via the JavaScript API.\n
\n'},40492:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying with a custom configuration in a JavaScript app\'\ndescription: \'Create a custom configuration of Comunica modules with changed features, and query with it from within your application using the JavaScript API.\'\n---\n\nIn the previous guide, we looked into\n[querying with a custom config from the command line](/docs/modify/getting_started/custom_config_cli/).\nIn this guide, we\'ll do the same from within a JavaScript application,\nbut we will **split up our config across different files** for convenience.\n\n
\nThis assumes you already have an npm package.\nIf you don\'t have one yet, create one using npm init.\nYou will also need a JavaScript file to write in, such as main.js.\n
\n\nIn order to add Comunica SPARQL as a _dependency_ to your [Node.js](https://nodejs.org/en/) application,\nwe can execute the following command:\n```bash\n$ npm install @comunica/query-sparql\n```\n\n## 2. Creating a new query engine\n\nWhile [`QueryEngine` is used to import Comunica SPARQL\'s default config](/docs/query/getting_started/query_app/),\nwe can load a custom config by creating our engine via `newEngineDynamic()`:\n```javascript\nconst QueryEngineFactory = require(\'@comunica/query-sparql\').QueryEngineFactory;\n\nconst myEngine = await new QueryEngineFactory().create({\n configPath: \'config.json\', // Relative or absolute path \n});\n```\n\n`configPath` refers to a config file, which we will create in the next step.\n\n## 3. Start from an existing config file\n\nThe easiest way to create a custom config, is to start from an existing one, and add/remove things to fit your needs.\n\nLet\'s create a file called `config.json` in your package.\n\nIn this guide, we will start from\nthe [Comunica SPARQL default config file](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/config-default.json).\nLet\'s **copy its contents entirely into our `config.json`**:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/config-query-sparql/^3.0.0/components/context.jsonld"\n ],\n "import": [\n "ccqs:config/context-preprocess/actors.json",\n "ccqs:config/context-preprocess/mediators.json",\n "ccqs:config/hash-bindings/actors.json",\n "ccqs:config/hash-bindings/mediators.json",\n "ccqs:config/http/actors.json",\n "ccqs:config/http/mediators.json",\n "ccqs:config/http-invalidate/actors.json",\n "ccqs:config/http-invalidate/mediators.json",\n "ccqs:config/init/actors.json",\n "ccqs:config/merge-bindings-context/actors.json",\n "ccqs:config/merge-bindings-context/mediators.json",\n "ccqs:config/optimize-query-operation/actors.json",\n "ccqs:config/optimize-query-operation/mediators.json",\n 
"ccqs:config/query-operation/actors.json",\n "ccqs:config/query-operation/mediators.json",\n "ccqs:config/query-parse/actors.json",\n "ccqs:config/query-parse/mediators.json",\n "ccqs:config/query-process/actors.json",\n "ccqs:config/query-process/mediators.json",\n "ccqs:config/query-result-serialize/actors.json",\n "ccqs:config/query-result-serialize/mediators.json",\n "ccqs:config/query-source-identify/actors.json",\n "ccqs:config/query-source-identify/mediators.json",\n "ccqs:config/query-source-identify-hypermedia/actors.json",\n "ccqs:config/query-source-identify-hypermedia/mediators.json",\n "ccqs:config/dereference/actors.json",\n "ccqs:config/dereference/mediators.json",\n "ccqs:config/dereference-rdf/actors.json",\n "ccqs:config/dereference-rdf/mediators.json",\n "ccqs:config/rdf-join/actors.json",\n "ccqs:config/rdf-join/mediators.json",\n "ccqs:config/rdf-join-entries-sort/actors.json",\n "ccqs:config/rdf-join-entries-sort/mediators.json",\n "ccqs:config/rdf-join-selectivity/actors.json",\n "ccqs:config/rdf-join-selectivity/mediators.json",\n "ccqs:config/rdf-metadata/actors.json",\n "ccqs:config/rdf-metadata/mediators.json",\n "ccqs:config/rdf-metadata-accumulate/actors.json",\n "ccqs:config/rdf-metadata-accumulate/mediators.json",\n "ccqs:config/rdf-metadata-extract/actors.json",\n "ccqs:config/rdf-metadata-extract/mediators.json",\n "ccqs:config/rdf-parse/actors.json",\n "ccqs:config/rdf-parse/mediators.json",\n "ccqs:config/rdf-parse-html/actors.json",\n "ccqs:config/rdf-resolve-hypermedia-links/actors.json",\n "ccqs:config/rdf-resolve-hypermedia-links/mediators.json",\n "ccqs:config/rdf-resolve-hypermedia-links-queue/actors.json",\n "ccqs:config/rdf-resolve-hypermedia-links-queue/mediators.json",\n "ccqs:config/rdf-serialize/actors.json",\n "ccqs:config/rdf-serialize/mediators.json",\n "ccqs:config/rdf-update-hypermedia/actors.json",\n "ccqs:config/rdf-update-hypermedia/mediators.json",\n "ccqs:config/rdf-update-quads/actors.json",\n 
"ccqs:config/rdf-update-quads/mediators.json"\n ]\n}\n```\n\n## 4. Executing SPARQL SELECT queries\n\nOnce your engine has been created based on your custom config,\nyou can use it to execute any SPARQL query, such as a `SELECT` query:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\n\nbindingsStream.on(\'data\', (binding) => {\n console.log(binding.get(\'s\').value);\n console.log(binding.get(\'p\').value);\n console.log(binding.get(\'o\').value);\n});\n```\n\nIf you wrote this in a file called `main.js`, you can invoke it by executing `node main.js`.\n\n
\nIf you run into config loading problems,\nmake sure your app has a package.json file;\notherwise, config loading will fail.\n
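For reference, a minimal `package.json` along these lines is sufficient for config loading; the `name` and version numbers below are placeholders for your own values:

```json
{
  "name": "my-package",
  "version": "1.0.0",
  "dependencies": {
    "@comunica/query-sparql": "^3.0.0"
  }
}
```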
\n\n## 5. Only allowing `SELECT` queries\n\nOur goal in this step is to build a query engine that can **_only_ execute `SELECT`** queries,\nand we don\'t want to be able to execute `CONSTRUCT` and `DESCRIBE` queries.\nThis will require us to remove some more actors.\n\nWhile the actors for `CONSTRUCT` and `DESCRIBE` are defined in `ccqs:config/query-operation/actors.json`,\nwe can not just simply remove that file from our imports,\nbecause it also contains actors for other SPARQL query operators which we don\'t want to remove, such as `SELECT`.\n\nIn the [guide on querying with a custom config from the command line](/docs/modify/getting_started/custom_config_cli/),\nwe achieved this by inlining `ccqs:config/query-operation/actors.json` into our main config file.\nIn this guide, we\'ll do this in a cleaner way by **redefining** the contents of `ccqs:config/query-operation/actors.json`\nin a **separate local file**, and applying our changes there.\n\n### 5.1. Declare config options in `package.json`\n\nBefore we can refer to other files within our config file,\nwe have to add some entries to our `package.json` file\nso that the config files can be found during engine initialization.\n\nConcretely, we need to **add the following entry to `package.json`**:\n```text\n{\n ...\n "lsd:module": true\n ...\n}\n```\n\n
\nIf you want to learn more about what this entry means,\nread our [guide on Components.js](/docs/modify/advanced/componentsjs/),\na dependency injection framework that Comunica uses.\n
\n\n### 5.2. Create a context\n\nIn order to allow our config file to import other files,\nwe need to create a JSON-LD context file.\n\nCreate the file **`components/context.jsonld`** with the following contents:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/componentsjs/^5.0.0/components/context.jsonld",\n {\n "npmd": "https://linkedsoftwaredependencies.org/bundles/npm/",\n "my": "npmd:my-package/^1.0.0/"\n }\n ]\n}\n```\n\nAgain, make sure to replace `my-package` with your package `name`.\n\n
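To see what this context buys us, JSON-LD prefix expansion essentially boils down to string concatenation of the terms defined above. The sketch below is illustrative only and does not reproduce the actual Components.js resolution logic:

```javascript
// Prefix definitions taken from the context file above.
const npmd = 'https://linkedsoftwaredependencies.org/bundles/npm/';
const my = npmd + 'my-package/^1.0.0/';
const prefixes = { npmd, my };

// Expanding a prefixed term such as "my:config/query-operation/actors.json"
// yields the full IRI, which is then resolved against the local package.
function expand(term) {
  const index = term.indexOf(':');
  const prefix = term.slice(0, index);
  const suffix = term.slice(index + 1);
  return prefixes[prefix] + suffix;
}

console.log(expand('my:config/query-operation/actors.json'));
```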
\nTo avoid collisions with other packages, it is recommended to use a prefix other than "my" in your context.\n
\n\n### 5.3. Copying `config/query-operation/actors.json`\n\nNext, we will create a local copy of `ccqs:config/query-operation/actors.json`.\n\nFor this, create a file **`config/query-operation/actors.json`**,\nand paste in the contents of [`ccqs:config/query-operation/actors.json`](https://raw.githubusercontent.com/comunica/comunica/master/engines/config-query-sparql/config/query-operation/actors.json) ([GitHub](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/query-operation/actors.json)).\n\n### 5.4. Make config refer to local `config/query-operation/actors.json`\n\nNow that we have declared config options in our `package.json`,\ncreated a context,\nand created a local copy of `config/query-operation/actors.json`,\neverything is ready to **modify our `config.json` to refer to our local `config/query-operation/actors.json`**.\n\nFor this, remove the following line from `config.json`:\n```diff\n- "ccqs:config/query-operation/actors.json",\n```\nAnd replace it with the following line:\n```diff\n+ "my:config/query-operation/actors.json",\n```\n\nAlso add the newly created config to the contexts of the config file (again replacing `my-package` with your package `name`):\n```diff\n "@context": [\n+ "https://linkedsoftwaredependencies.org/bundles/npm/my-package/^1.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/config-query-sparql/^3.0.0/components/context.jsonld"\n ],\n```\n\nThis change means that Comunica will load its query operators from our local `config/query-operation/actors.json` file,\ninstead of the default `ccqs:config/query-operation/actors.json` file.\n\nIf you run your app again, things should still function like before at this point.\n\n### 5.5. 
Remove actors\n\nNext, we will remove the actors we don\'t need.\nConcretely, we will remove the following actor imports:\n\n* `ccqs:config/query-operation/actors/query/construct.json`: Handles `CONSTRUCT` queries.\n* `ccqs:config/query-operation/actors/query/describe.json`: Handles `DESCRIBE` queries.\n\nFor this, remove the following lines:\n```diff\n- "ccqs:config/query-operation/actors/query/construct.json",\n- "ccqs:config/query-operation/actors/query/describe.json",\n```\n\n### 5.6. Test changes\n\nAfter this change, you should now be unable to execute `CONSTRUCT` or `DESCRIBE` queries.\nTry this out by executing the following:\n```javascript\nconst quadStream = await myEngine.queryQuads(`\n CONSTRUCT WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\n\nquadStream.on(\'data\', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\nExecuting a `SELECT` query will still work:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p <http://dbpedia.org/resource/Belgium>.\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\n\nbindingsStream.on(\'data\', (binding) => {\n console.log(binding.get(\'s\').value);\n console.log(binding.get(\'p\').value);\n console.log(binding.get(\'o\').value);\n});\n```\n\nYou have now successfully built your own custom Comunica engine that is a bit more lightweight than the default one.\nJust like the `CONSTRUCT` and `DESCRIBE` actors,\nyou can remove any other actors you don\'t need to make it even more lightweight.\n\nIf you want, you can create additional config file parts in `config/`\nand refer to them from our main `config.json` with the `my:` prefix.\n\n
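Since unsupported operations surface as rejected promises, you may want to guard such calls with a `try`/`catch`. The following sketch uses a stub engine whose `queryQuads` rejects, standing in for the `SELECT`-only engine built in this guide; the stub and its error message are assumptions for illustration:

```javascript
// Stub standing in for the SELECT-only engine built in this guide:
// with the CONSTRUCT actor removed, queryQuads rejects.
const myEngine = {
  async queryQuads() {
    // Hypothetical error message; the real one comes from the mediator.
    throw new Error('CONSTRUCT is not supported by this engine');
  },
};

async function tryConstruct() {
  try {
    await myEngine.queryQuads('CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100');
    return 'supported';
  } catch (error) {
    return `unsupported: ${error.message}`;
  }
}

tryConstruct().then((result) => console.log(result));
```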
\n'},84338:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Exposing your custom config as an npm package\'\ndescription: \'Wrap your config in an npm package, and expose a CLI tool and a JavaScript API.\'\n---\n\nIn this guide, we will wrap our custom config into a new npm package,\nso that we can **expose it as a proper reusable query engine**.\nThis package will be able to do everything packages such as Comunica SPARQL (`@comunica/query-sparql`) can do.\nThis means that this package will have a CLI tool, and that it will expose a JavaScript API for use in other packages.\n\n
\nA fully functional example can be found\nhere.\n
\n\n## 1. Initialize a new package\n\nInitialize a new **empty npm package** as follows:\n```bash\n$ npm init\n```\n\nThe simplest way to include all required Comunica dependencies is to extend from **Comunica SPARQL**.\nAs such, add it as a dependency as follows:\n```bash\n$ npm install @comunica/query-sparql\n```\n\n
\nIf you want to create a more lightweight package by selecting only those dependencies that are absolutely required,\nyou can make use of the Comunica Packager.\n
\n\nWe recommend to also **install TypeScript** as a dev dependency:\n```bash\n$ npm install -D typescript\n```\n\nAdd a `tsconfig.json` file with the following contents:\n```text\n{\n "compileOnSave": true,\n "compilerOptions": {\n "target": "es2021",\n "lib": [\n "es2021",\n "dom"\n ],\n "module": "commonjs",\n "resolveJsonModule": true,\n\n "strict": true,\n "strictFunctionTypes": true,\n "strictPropertyInitialization": false,\n "noImplicitOverride": true,\n "declaration": true,\n "downlevelIteration": true,\n "inlineSources": true,\n "preserveConstEnums": true,\n "removeComments": false,\n "sourceMap": true\n },\n "include": [\n "bin/**/*",\n "lib/**/*"\n ],\n "exclude": [\n "**/node_modules"\n ]\n}\n\n```\n\n\n
\nIf your custom config also depends on other packages\nthat are not included in Comunica SPARQL,\nyou have to install them here as well.\n
\n\n## 2. Add a config file\n\n### 2.1. Create config file\n\nWe assume here that **you already have created a custom config file**.\n[Click here to learn how to create one](/docs/modify/getting_started/custom_config_app/) should you not have done this already.\n\nCreate a `config/` folder, and add your config file (`config-default.json`) in here.\n\nIf your config file includes other config sets, you can include them in this folder as well.\nIn this case, you also have to make sure to include the context file in `components/context.jsonld`.\n\nThe only requirement here is that there is at least a file **`config/config-default.json`**.\n\n### 2.2. Declare config options in `package.json`\n\n
\nIf your config file is decomposed into several files,\nyou may already have done this step.\n
\n\nBefore we can refer to other files within our config file,\nwe have to add the `"lsd:module"` entry to our `package.json` file\nso that the config files can be found during engine initialization.\n\nConcretely, we need to **add the following entry to `package.json`**:\n```text\n{\n ...\n "lsd:module": true\n ...\n}\n```\n\n
\nIf you want to learn more about what this config entry means,\nread our [guide on Components.js](/docs/modify/advanced/componentsjs/),\na dependency injection framework that Comunica uses.\n
\n\n## 3. Compiling the config into JavaScript\n\nIn order to make the query engine start as fast as possible,\nwe will pre-compile our config file into a JavaScript file.\n\nWe will configure this in such a way that we can still modify our config file if needed,\nand recompile the JavaScript file easily.\n\nFor this, add the following **scripts to our `package.json`** file:\n```text\n{\n ...\n "scripts": {\n ...\n "build:engine": "comunica-compile-config config/config-default.json > engine-default.js",\n "build:lib": "tsc",\n "build": "npm run build:lib && npm run build:engine",\n "prepare": "npm run build"\n },\n}\n```\n\nYou can use the build script later as follows:\n```bash\n$ npm run build\n```\n\nAt this moment, however, the full build will still fail due to missing files, but you can already do this now:\n```bash\n$ npm run build:engine\n```\n\nAfterwards, you should have an `engine-default.js` file in your folder.\n\n## 4. Creating command line tools\n\nIn this step, we will create three command line tools:\n\n* `bin/query.js`: The main CLI tool.\n* `bin/http.js`: Script for starting a SPARQL endpoint.\n* `bin/query-dynamic.js`: A [CLI tool in which you can load a custom config](/docs/modify/getting_started/custom_config_cli/).\n\nEach of these CLI tools is optional, and you only have to create those you want.\nFor this, **create the following files**:\n\n`bin/query.js`:\n```typescript\n#!/usr/bin/env node\nimport { runArgsInProcessStatic } from \'@comunica/runner-cli\';\nrunArgsInProcessStatic(require(\'../engine-default.js\')());\n```\n\n`bin/http.js`:\n```typescript\n#!/usr/bin/env node\nimport { HttpServiceSparqlEndpoint } from \'@comunica/actor-init-query\';\nconst defaultConfigPath = `${__dirname}/../config/config-default.json`;\nHttpServiceSparqlEndpoint.runArgsInProcess(process.argv.slice(2), process.stdout, process.stderr, `${__dirname}/../`, process.env, defaultConfigPath, code => process.exit(code))\n .catch(error => 
process.stderr.write(`${error.message}\\n`));\n```\n\n`bin/query-dynamic.js`:\n```typescript\n#!/usr/bin/env node\nimport { runArgsInProcess } from \'@comunica/runner-cli\';\nrunArgsInProcess(`${__dirname}/../`, `${__dirname}/../config/config-default.json`);\n```\n\nAs a final step, we have to make sure that we expose our CLI tools from the package.\nAs such, add the following **bin entries to `package.json`**:\n```text\n{\n ...\n "bin": {\n "my-comunica": "./bin/query.js",\n "my-comunica-http": "./bin/http.js",\n "my-comunica-dynamic": "./bin/query-dynamic.js"\n },\n}\n```\n_You can replace `my-comunica` with any name you want._\n\nIf needed, [custom arguments may be added to CLI tools](/docs/modify/advanced/custom_cli_arguments/).\n\n## 5. Exposing a JavaScript API\n\nIn order to use your query engine as a dependency in other packages,\nwe have to expose its JavaScript API.\nWe will also immediately make it browser-friendly.\n\nFor this, create the following files:\n\n**`lib/QueryEngine.ts`**:\n```typescript\nimport { QueryEngineBase } from \'@comunica/actor-init-query\';\nimport type { ActorInitQueryBase } from \'@comunica/actor-init-query\';\nconst engineDefault = require(\'../engine-default.js\');\n\n/**\n * A Comunica SPARQL query engine.\n */\nexport class QueryEngine extends QueryEngineBase {\n public constructor(engine: ActorInitQueryBase = engineDefault()) {\n super(engine);\n }\n}\n```\n\n**`lib/QueryEngineFactory.ts`**:\n```typescript\nimport { QueryEngineFactoryBase } from \'@comunica/actor-init-query\';\nimport { QueryEngine } from \'./QueryEngine\';\n\n/**\n * A factory that can create query engines dynamically based on a given config.\n */\nexport class QueryEngineFactory extends QueryEngineFactoryBase {\n public constructor() {\n super(\n `${__dirname}/../`,\n `${__dirname}/../config/config-default.json`,\n actorInitQuery => new QueryEngine(actorInitQuery),\n );\n }\n}\n```\n\n**`lib/index.ts`**:\n```typescript\nexport * from 
\'./QueryEngine\';\nexport * from \'./QueryEngineFactory\';\n```\n\n**`lib/index-browser.ts`**:\n```typescript\nexport * from \'./QueryEngine\';\n```\n\nAs a final step,\nmake sure to expose the following entries in your **`package.json`** file:\n```text\n{\n ...\n "main": "lib/index.js",\n "types": "lib/index",\n "browser": {\n "./lib/index.js": "./lib/index-browser.js",\n "./lib/index.js.map": "./lib/index-browser.js.map"\n }\n}\n```\n\n## 6. Indicating what files should be published\n\nNot all files should be published to npm when releasing the package,\nand not all files should be added to git repositories.\n\nFor this, **create the following files**:\n\n`.npmignore`\n```text\n```\n_`.npmignore` MUST exist and MUST be empty._\n\n`.gitignore`\n```text\nengine-default.js\nnode_modules\nlib/**/*.js\nlib/**/*.js.map\nlib/**/*.d.ts\ntest/**/*.js\ntest/**/*.js.map\ntest/**/*.d.ts\nbin/**/*.js\nbin/**/*.js.map\nbin/**/*.d.ts\n```\n\nAs a final step, **add the following entries to `package.json`**:\n```text\n{\n ...\n "files": [\n "components",\n "config",\n "bin/**/*.d.ts",\n "bin/**/*.js",\n "bin/**/*.js.map",\n "lib/**/*.d.ts",\n "lib/**/*.js",\n "lib/**/*.js.map",\n "engine-default.js"\n ],\n}\n```\n\n## 7. Publish to npm\n\nNow, you are ready to [publish your package to npm](https://docs.npmjs.com/creating-and-publishing-scoped-public-packages),\nand allow others to use it via the CLI or via the JavaScript API.\n'},64265:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Exposing your custom config in a Web client\'\ndescription: \'Demonstrate your query engine as a static Web page.\'\n---\n\nIn this guide, we use the [Comunica Web Client](https://github.com/comunica/jQuery-Widget.js)\nto run our engine client-side as a static Web page,\njust like http://query.linkeddatafragments.org/.\n\nThis guide assumes you already [expose your custom config as an npm package](/docs/modify/getting_started/custom_init/).\n\n## 1. Cloning the repo\n\n1. 
Go to https://github.com/comunica/jQuery-Widget.js/\n2. Make sure you are logged into your GitHub account.\n3. Click on the "Fork" button.\n\nAfter this, a copy of the jQuery-Widget.js repo will be available for your account\nin which you can make all the changes you want.\n\nNext, we will **clone** your fork to the local file system as follows: \n```bash\n$ git clone https://github.com/<my-username>/jQuery-Widget.js.git\n```\n_Make sure you replace `<my-username>` with your GitHub username._\n\nAs a final setup step, we can install all dependencies as follows:\n```bash\n$ cd jQuery-Widget.js\n$ yarn install\n```\n\n## 2. Plugging in your custom config\n\nBy default, the Web client is configured with Comunica SPARQL (`@comunica/query-sparql`).\nIn this step, we will modify it so that our custom engine is configured instead.\n\nFirst, install our package as a dependency:\n```bash\n$ npm install my-package\n```\nMake sure to replace `my-package` with the name of [the package you created before](/docs/modify/getting_started/custom_init/).\n\nNext, replace the `import` in `config/config-default.json` as follows:\n```text\n{\n ...\n "import": [\n "my:config/config-default.json"\n ]\n}\n```\nMake sure to replace the `"my"` prefix, so that it refers to the scope of your package.\n\n## 3. Build and run\n\nThese were the only changes required to plug your package into the Web client.\n\nTo start a local Web server to test your engine, run the following:\n```bash\n$ yarn run dev\n```\n\nTo create an actual build in the `build/` folder that can be deployed to any Web server, run the following:\n```bash\n$ yarn run build\n```\n\nOptionally, you can now [tweak the default datasources and queries](https://github.com/comunica/jQuery-Widget.js#readme). 
\n'},11652:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Contributing a new query operation actor to the Comunica repository\'\ndescription: \'Setup a development environment, implement a new actor, and create a pull request.\'\n---\n\nThis guide focuses on all the required steps for contributing a new query operation actor to Comunica.\nConcretely, we will focus on implementing a custom actor for the SPARQL `REDUCED` operator.\n\n
\nOnce you have followed this guide and actually want to contribute,\nhave a look at our contribution guide.\n
\n\n## 1. Requirements\n\nYou will need the following to follow this guide:\n\n* [git](https://git-scm.com/)\n* [Node.js](https://nodejs.org/en/) (version 8.0 or higher)\n* [Yarn](https://yarnpkg.com/en/) package manager\n* Any kind of editor that can be used to edit JavaScript files (We recommend [WebStorm](https://www.jetbrains.com/community/education/#students))\n* A [GitHub](https://github.com/) account\n\n## 2. Cloning the repo\n\nSince you do not have access to the Comunica repository by default,\nyou will have to **fork** the Comunica repo first.\n\n1. Go to https://github.com/comunica/comunica\n2. Make sure you are logged into your GitHub account.\n3. Click on the "Fork" button.\n\nAfter this, a copy of the Comunica repo will be available for your account\nin which you can make all the changes you want.\n\nNext, we will **clone** your fork to the local file system as follows: \n```bash\n$ git clone https://github.com/<my-username>/comunica.git\n```\n_Make sure you replace `<my-username>` with your GitHub username._\n\nAs a final setup step, we can install all dependencies as follows:\n```bash\n$ cd comunica\n$ yarn install\n```\n\nThis will install the dependencies of all modules.\nAfter that, all [Comunica packages](https://github.com/comunica/comunica/tree/master/packages) are available in the `packages/` folder\nand can be used in a development environment.\nAll pre-built [Comunica engines and configs](https://github.com/comunica/comunica/tree/master/engines) are available in the `engines/` folder\nsuch as querying with [Comunica SPARQL (`engines/query-sparql`)](https://github.com/comunica/comunica/tree/master/engines/query-sparql).\n\nA good git practice is to develop on **feature branches**.\nFor this, branch from `master` as follows:\n```bash\n$ git checkout -b feature/my-feature\n```\n_Replace `my-feature` with a short name (without spaces) of the feature you want to implement._\n\n
\nIf you fix a bug, you can name your branch something like fix/my-fix.\n
\n\nIf you want to make sure that everything has been installed correctly,\nnavigate to `engines/query-sparql`, and try out a simple query from the command line:\n```bash\n$ cd engines/query-sparql\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n \'SELECT * WHERE { ?s ?p ?o } LIMIT 100\'\n```\n\nIf this command produces valid output, your development environment has been successfully set up.\n\nLet\'s navigate back to the repo root, so we\'re ready for the next step:\n```bash\n$ cd ../..\n```\n\n## 3. Creating a new package\n\nThe Comunica monorepo contains a large collection of packages in the [`packages/`](https://github.com/comunica/comunica/tree/master/packages) directory.\nThis contains different types of packages: _actors, mediators and buses_.\n\n
\n\nFor each type of package, we provide a **generator tool** to initialize a template repo.\nFor this, you can use the [generator-comunica](https://github.com/comunica/generate-comunica) project (a [Yo](https://www.npmjs.com/package/yo) generator).\n\nTo install this generator, start a _new terminal session_ outside of the Comunica repo directory,\nand execute the following commands:\n```bash\n$ npm i -g yo\n$ git clone git@github.com:comunica/generate-comunica.git\n$ cd generate-comunica\n$ npm install\n$ npm link\n```\n\nThis will expose the `comunica:bus`, `comunica:mediator`, `comunica:actor`, and `comunica:actor-query-operation` generators for initializing projects of the respective types.\n`comunica:actor-query-operation` is a special type of the `comunica:actor` generator that has been preconfigured to the `query-operation` bus,\nwhich we will make use of in this guide.\nIf you want to create an actor on a bus other than `query-operation`, you will have to invoke `comunica:actor` instead.\n\nIn this case, we want to create an actor on the `query-operation` bus for the `REDUCED` query operation.\nAs such, we can **execute the generator** as follows in the repo root:\n```bash\n$ yo comunica:actor-query-operation\n? The SPARQL Algebra type name of the operator (lowercase) reduced\n? The SPARQL Algebra interface name Reduced\n? Actor name (without actor-bus- prefix, lowercase) reduced-my\n? The full readable name of the actor Reduced My\n? The component base name of the actor (without Bus part) ReducedMy\n? A description of the actor A comunica Reduced My Query Operation Actor.\n? 
The component context prefix caqorm\n create packages/actor-query-operation-reduced-my/components/Actor/QueryOperation/ReducedMy.jsonld\n create packages/actor-query-operation-reduced-my/components/components.jsonld\n create packages/actor-query-operation-reduced-my/components/context.jsonld\n create packages/actor-query-operation-reduced-my/lib/ActorQueryOperationReducedMy.ts\n create packages/actor-query-operation-reduced-my/test/ActorQueryOperationReducedMy-test.ts\n create packages/actor-query-operation-reduced-my/.npmignore\n create packages/actor-query-operation-reduced-my/index.ts\n create packages/actor-query-operation-reduced-my/package.json\n create packages/actor-query-operation-reduced-my/README.md\n```\n\nAfter answering the required questions, a new package will be initialized at `packages/actor-query-operation-reduced-my/`.\n\nIn order to **link the dependencies of this new package**, make sure to run `yarn install` again in the monorepo root.\nYou will see some compilation errors, which you can ignore, as your new actor has not been implemented yet.\n\n## 4. Implementing your actor\n\nIn this step, we will implement our actor in `packages/actor-query-operation-reduced-my/lib/ActorQueryOperationReducedMy.ts`.\n\nThe generated class extends from `ActorQueryOperationTypedMediated`,\nwhich abstracts away many of the commonly required tasks for operators.\nThis class requires you to override two methods: `testOperation` and `runOperation`.\nThese two methods correspond to the [test and run phases that will be called by mediators](/docs/modify/advanced/architecture_core/#run-and-test-phases-for-selecting-an-actor).\n\n### 4.1. 
Test phase\n\nSince the `ActorQueryOperationTypedMediated` class already implements the test phase by checking if the incoming operation is a `REDUCED` operation,\nwe can just implement `testOperation` as follows:\n```typescript\n public async testOperation(pattern: Algebra.Reduced, context: IActionContext): Promise<IActorTest> {\n return true;\n }\n```\n\n
\nIf you want to make your actor only handle specific types of this operation,\nyou can add additional checks in here.\nIf you want to fail the test in certain cases, you will have to throw an error.\n
\n\n### 4.2. Run phase\n\nThe `runOperation` method will contain the actual logic for evaluating the `REDUCED` operator.\n\nBefore we start, change the return type of this method from `Promise<IQueryOperationResult>` to `Promise<IQueryOperationResultBindings>`,\nbecause this method will always [return bindings as query result](/docs/modify/advanced/query_operation_result_types/).\n\nThe first step of implementing the REDUCED actor\nrequires evaluating the sub-operation that this REDUCED operation exists over.\n\nFor example, `REDUCED` can be applied over the following BGP:\n```\nSELECT REDUCED * WHERE {\n ?s ?p <http://dbpedia.org/resource/Belgium>.\n ?s ?p ?o.\n}\n```\n\nAs such, we first have to evaluate this BGP (or whatever other sub-operator is defined).\n\nThis sub-operation is stored in the `input` field of our `pattern`.\nBy using the query operation mediator (`this.mediatorQueryOperation`),\nwe can evaluate this sub-operation.\nThe sub-operator can be evaluated by the mediator as follows:\n```javascript\n// Delegate resolving the input operation to the mediator.\nconst output = ActorQueryOperation.getSafeBindings(await this\n .mediatorQueryOperation.mediate({ operation: pattern.input, context }));\n```\n\nSince the `REDUCED` operator is very loosely defined in the SPARQL specification,\nit is valid to filter _nothing_ from the results, and just return the child operator\'s results as-is.\n\nAs such, we can return the following:\n```typescript\nreturn {\n type: \'bindings\',\n bindingsStream: output.bindingsStream,\n metadata: output.metadata,\n};\n```\n\n
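To make the "filter nothing is valid" point concrete, the sketch below shows the two extremes that the loose `REDUCED` semantics allow, using plain arrays instead of bindings streams (an assumption for illustration):

```javascript
// REDUCED permits, but does not require, duplicate elimination.
// Both functions below are therefore valid interpretations.

// Extreme 1: pass everything through unchanged (what this guide's actor does).
function reducedPassThrough(solutions) {
  return solutions;
}

// Extreme 2: eliminate all duplicates (what DISTINCT would require).
function reducedDeduplicate(solutions) {
  return [...new Set(solutions.map(s => JSON.stringify(s)))].map(s => JSON.parse(s));
}

const solutions = [{ s: 'a' }, { s: 'a' }, { s: 'b' }];
console.log(reducedPassThrough(solutions).length); // 3
console.log(reducedDeduplicate(solutions).length); // 2
```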
\nHave a look at the other query operation actors if you want to do something more complex with the output\'s bindingsStream.\n
\n\n## 5. Unit-testing your actor\n\nSince [testing is very important in Comunica](/docs/modify/advanced/testing/),\nthe generator will automatically generate some unit tests for your class in `packages/actor-query-operation-reduced-my/test/ActorQueryOperationReducedMy-test.ts`.\n\nSince we don\'t actually do anything in our actor, all default unit tests should already pass.\nCheck this by executing in the repo root:\n```bash\nyarn run test ActorQueryOperationReducedMy-test.ts\n```\n\nHere, it is important that every class in your package reaches a code coverage of 100%.\nTherefore, if you have a different actor implementation,\nyou may have to add additional unit tests to check different cases.\n\n## 6. Configuring your actor\n\nIf you want to make it so that your actor is enabled by default in Comunica SPARQL,\nthen you\'ll have to make sure it is present in the default config.\n\nFor this, first **add your package as a dependency** in `engines/query-sparql/package.json`:\n```text\n{\n ...\n "dependencies": {\n ...\n "@comunica/actor-query-operation-reduced-my": "^1.0.0"\n }\n ...\n}\n```\n\n
\nWhen creating a new actor, you can leave the version fixed at "^1.0.0".\nThis version will be incremented automatically upon each new Comunica release.\n
\n\nNext, we have to **configure the actor** by replacing the existing `REDUCED` actor in the default config file `engines/config-query-sparql/config/query-operation/actors/query/reduced.json`:\n```text\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/runner/^3.0.0/components/context.jsonld",\n\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-reduced-my/^3.0.0/components/context.jsonld"\n ],\n "@id": "urn:comunica:default:Runner",\n "@type": "Runner",\n "actors": [\n {\n "@id": "urn:comunica:default:query-operation/actors#reduced",\n "@type": "ActorQueryOperationReducedMy",\n "mediatorQueryOperation": { "@id": "urn:comunica:default:query-operation/mediators#main" }\n }\n ]\n}\n```\n\n
\nWhen adding non-query-operator actors, you may have to include your actor in a different config set.\n
\n\n## 7. Testing with Comunica SPARQL\n\nBefore we make our pull request,\nwe have to make sure that our actor actually works in practice.\n\nFor this, we have to make sure our TypeScript is properly compiled to JavaScript,\nand that our configuration file has been compiled:\n```bash\n$ yarn run build # Compile typescript and the components files at the ROOT OF THE REPO\n$ cd engines/query-sparql\n$ yarn run prepare # Compiles config\n```\n\n
\nYou can also just run yarn install again from the root package, which will take care of all of this, and more.\n
\n\nAfter that, we should now be able to execute Comunica SPARQL from the command line with a given `REDUCED` query:\n```bash\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n \'SELECT REDUCED * WHERE { ?s ?p ?o } LIMIT 100\'\n```\n\n## 8. Creating a pull request\n\nOnce everything has been tested, we can commit our **code and create a pull request**.\n\nFirst, add the changed files, and commit your code.\n\n```bash\n$ git add packages/actor-query-operation-reduced-my \\\n engines/query-sparql/config \\\n engines/query-sparql/package.json\n$ git commit -m "Add my custom reduced operator" \n```\n\n
\nBefore making the commit, make sure you are not committing any unneeded files. You can use git status for this.\n
\n\nSeveral [pre-commit checks](/contribute/#report-bugs-or-request-features) will be done, such as linting and unit testing.\nShould any of these checks fail, your commit will not go through,\nand you will have to retry after fixing the problems.\n\nAlso make sure to check in your new package if there are any `TODO`s remaining,\nsuch as in the `README.md` file.\n\nOnce your commit is done, you can push your changes to your fork:\n```bash\n$ git push origin feature/my-feature\n```\n\nThe only thing that\'s left to do is making the pull request\nfrom your branch to the Comunica master branch at https://github.com/comunica/comunica/pulls.\nOnce you\'ve opened the pull request, several [automated checks](/contribute/#report-bugs-or-request-features)\nwill be run, and someone will have a look at your contribution very soon!\n'},98997:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Adding a config parameter to an actor\'\ndescription: \'For an existing actor, add a parameter that can be customized in the config file.\'\n---\n\nIn this guide, we will add a parameter to an existing actor,\nand show how to set values for this parameter via the config file.\n\nWe will start from the actor we have created in the [guide on contributing a new actor](/docs/modify/getting_started/contribute_actor/).\n\n## 1. 
Modifying the constructor\n\nWe want to add a parameter that is set at configuration/startup time.\nFor this, we need to make sure that our actor accepts this parameter via the constructor.\n\nFirst, create a **new interface** that is used as the single argument in the constructor:\n```typescript\nexport interface IActorQueryOperationReducedMyArgs extends IActorQueryOperationTypedMediatedArgs {\n myParam: number;\n}\n```\nHere, `IActorQueryOperationTypedMediatedArgs` is the default constructor argument\nfor query operation actors that contains common parameters that will automatically be set behind the scenes.\n\nNext, **replace our constructor** with the following:\n```typescript\npublic constructor(args: IActorQueryOperationReducedMyArgs) {\n super(args, \'reduced\');\n}\n```\n\nIn order to use the passed parameter values,\nadd the following field to your class:\n\n```typescript\nprivate readonly myParam: number;\n```\n\nIn order to temporarily check the passed parameter value,\nwe can add a `console.log` statement in the `runOperation` method.\n\nMake sure to run `yarn run build` in the repo root to make sure that your modifications\nto the TypeScript files have been compiled to JavaScript.\n\n## 2. 
Set values in our config file\n\nEverything has now been set up to define values for our parameter via the config file.\n\nAs such, we can **modify the declaration of our actor in `engines/config-query-sparql/config/query-operation/actors/query/reduced.json`** by adding a value for `"myParam"`:\n```text\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/runner/^3.0.0/components/context.jsonld",\n\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-reduced-my/^3.0.0/components/context.jsonld"\n ],\n "@id": "urn:comunica:default:Runner",\n "@type": "Runner",\n "actors": [\n {\n "@id": "urn:comunica:default:query-operation/actors#reduced",\n "@type": "ActorQueryOperationReducedMy",\n "mediatorQueryOperation": { "@id": "urn:comunica:default:query-operation/mediators#main" },\n "myParam": 123\n }\n ]\n}\n```\n\nAs a test, you can now attempt a [query execution with our config](/docs/modify/getting_started/contribute_actor/#7--testing-with-comunica-sparql).\nIf you placed a `console.log` statement in your actor,\nyou should now see the value `123` on stdout.\n\n
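The temporary `console.log` check from step 1 could look like the following simplified stub; the class here is a stand-in, as the real actor extends `ActorQueryOperationTypedMediated` and receives its arguments via dependency injection:

```javascript
// Simplified stand-in for the actor class: the constructor receives
// the config-injected args, and runOperation logs the parameter value.
class ActorQueryOperationReducedMyStub {
  constructor(args) {
    this.myParam = args.myParam;
  }

  runOperation() {
    console.log(this.myParam); // temporary check; remove before committing
  }
}

// With "myParam": 123 in the config, Components.js effectively does:
new ActorQueryOperationReducedMyStub({ myParam: 123 }).runOperation();
```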
\nIn this guide, we showed how to define an integer parameter.\nYou can instead also define other parameter types,\nwhere parameters can even accept other components (such as mediators).\n
\n\n
\nWhen running yarn run build, a JSON-LD representation of your TypeScript files\nwill be created in the components/ directory of your package.\nThe components/context.jsonld will list all discovered parameters that you can pass within the config file. \n
\n'},31516:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Extensions'\ndescription: 'Existing extensions of Comunica.'\n---\n\nDifferent default configurations of Comunica are provided,\nsuch as [Comunica SPARQL](https://github.com/comunica/comunica/tree/master/engines/query-sparql#readme),\n[Comunica SPARQL File](https://github.com/comunica/comunica/tree/master/engines/query-sparql-file#readme),\nand [Comunica SPARQL RDF/JS](https://github.com/comunica/comunica/tree/master/engines/query-sparql-rdfjs#readme).\nNext to those, several extensions and modified versions of Comunica exist that offer specific functionality.\n\nFeel free to [contact us](/ask/) if you want your extension of Comunica added to this list.\n\n## Examples\n\nThe [Comunica Examples](https://github.com/comunica/examples) repository\ncontains a number of example packages that explain and illustrate how to create customized Comunica packages.\n\n## Solid\n\n[`@comunica/query-sparql-solid`](https://github.com/comunica/comunica-feature-solid)\nis a Comunica SPARQL query engine that allows queries to be executed using your [Solid account](https://solidproject.org/).\n\nRead more about this in [our guide on Solid](/docs/query/advanced/solid/).\n\n## Link Traversal\n\n[`@comunica/query-sparql-link-traversal`](https://github.com/comunica/comunica-feature-link-traversal) and\n[`@comunica/query-sparql-link-traversal-solid`](https://github.com/comunica/comunica-feature-link-traversal-solid)\nare Comunica SPARQL query engines that follow links between documents during query execution.\n\nRead more about this in [our guide on Link Traversal](/research/link_traversal/).\n\n## AMF\n\n[Comunica AMF](https://github.com/comunica/comunica-feature-amf)\nprovides a set of experimental actors that handle _approximate membership functions_, such as Bloom filters.\nRead more about this in [this article](https://comunica.github.io/Article-SSWS2020-AMF/).\n\n## 
HDT\n\n[`@comunica/comunica-actor-rdf-resolve-quad-pattern-hdt`](https://github.com/comunica/comunica-actor-rdf-resolve-quad-pattern-hdt)\nis a package that enables [resolving a quad pattern](/docs/modify/advanced/buses/#rdf-resolve-quad-pattern) over HDT files.\nThe [Comunica SPARQL HDT package](https://github.com/comunica/comunica-query-sparql-hdt#readme)\nprovides a default configuration that adds full SPARQL query support using other actors from Comunica SPARQL.\n\nRead more about this in [our guide on querying over HDT](/docs/query/advanced/hdt/).\n\n## OSTRICH\n\n[OSTRICH](https://github.com/rdfostrich) is a versioned RDF triple store.\n\n[`@comunica/actor-rdf-resolve-quad-pattern-ostrich`](https://github.com/rdfostrich/comunica-actor-rdf-resolve-quad-pattern-ostrich)\nis a package that enables [resolving a quad pattern](/docs/modify/advanced/buses/#rdf-resolve-quad-pattern) over OSTRICH files.\nIt determines the version to query over from the context.\n\n[`@comunica/actor-query-operation-contextify-version`](https://github.com/rdfostrich/comunica-actor-query-operation-contextify-version)\nis a package that detects graph-based version [operations](/docs/modify/advanced/buses/#query-operation)\nand rewrites them to operations with a version context.\n\nThe [Comunica SPARQL OSTRICH package](https://github.com/rdfostrich/comunica-query-sparql-ostrich#readme)\nprovides a default configuration that adds full SPARQL query support using other actors from Comunica SPARQL.\n\n## SPARQL-OTFC\n\n[SPARQL-OTFC](https://github.com/Flanders-Make-vzw/sparql-otfc#readme) extends the SPARQL query language with on-the-fly computations. It enables developers to host special predicates that do not exist in a queried data source yet are computed at runtime. 
To the end-user asking a query, these predicates behave just like regular predicates.\n"},53844:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Modify FAQ'\ndescription: 'Frequently asked question about Comunica modification.'\n---\n\nCan't find an answer to your question?\nFeel free to [ask us](/ask/), or have a look at the\n[example](https://github.com/comunica/examples) repository.\n\n## Should I publish my package to npm before I can use it?\n\nWhile it is recommended to publish your reusable Comunica packages to npm, this is not required.\nThe [Components.js](/docs/modify/advanced/componentsjs/) dependency injection framework is able to work with packages that are locally linked to each other, as long as they are available in the `node_modules/` directory.\n\nIf you receive warnings in the form of `Detected remote context lookup for...`,\nthis usually means that Components.js was not able to find the corresponding package locally, and will [fallback to a remote context lookup](https://github.com/LinkedSoftwareDependencies/Components.js/discussions/82).\nThis can either be caused by an incorrect context URL, or a missing dependency in the `node_modules/` directory.\n\n## How to query over a non-RDF source?\n\nAdding support for new types of sources is typically done by adding a new actor to\nthe [RDF Resolve Quad Pattern bus](/docs/modify/advanced/buses/#rdf-resolve-quad-pattern).\n[Click here](https://github.com/comunica/examples/tree/master/packages/actor-rdf-resolve-quad-pattern-api-weather)\nto find an example on how to query over a JSON weather API source.\n\n## How to count all triples that are received by the query engine?\n\n[Click here](https://github.com/comunica/examples/tree/master/packages/actor-observe-rdf-dereference)\nto find an example on how this can be done.\n"},77646:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Advanced modification'\ndescription: 'Advanced guides on how to get the most out of Comunica 
modification.'\nindex: true\n---\n"},94512:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Actor Patterns\'\ndescription: \'Overview of common design patterns for actors\'\n---\n\nBelow, you can find several actor design patterns that are used within Comunica.\n\n## Wrapper\n\nActors such as [`@comunica/actor-http-proxy`](https://github.com/comunica/comunica/tree/master/packages/actor-http-proxy)\nand [`@comunica/actor-http-memento`](https://github.com/comunica/comunica/tree/master/packages/actor-http-memento)\nfollow a wrapper-based design.\nThis means that they wrap around existing functionality in the bus without precisely knowing what that behaviour is.\nThe actor can then invoke this existing functionality, and optionally modify the input and output.\n\nThe wrapper design can be achieved by giving the actor a mediator reference to the same bus that the actor is registered to.\nIn this case, the proxy actor exists on the HTTP bus, but it also has a reference to an HTTP mediator.\nFurthermore, wrapper actors usually need to run before all other actors on the bus,\nwhich can be achieved in the Components.js config using `"beforeActors": { "@id": "urn:comunica:default:http/actors#fetch" }`.\n\nThe `run()` method of a wrapper usually involves modifying the input action,\nannotating the context with a key to avoid the same actor being re-invoked in infinite recursion,\ninvoking the mediator, and modifying the output.\n\nAn example of the wrapper approach can be found in [`ActorHttpProxy`](https://github.com/comunica/comunica/blob/master/packages/actor-http-proxy/lib/ActorHttpProxy.ts).\n'},4471:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Algebra\'\ndescription: \'The internal representation of queries during query execution.\'\n---\n\nLike most query engines,\ninstead of internally working directly with a SPARQL query string,\nComunica works with an algebraic representation of a SPARQL query,\ncorresponding to the [SPARQL 1.1 
algebra](https://www.w3.org/TR/sparql11-query/#sparqlQuery).\nThis SPARQL algebra makes it easier for operating on SPARQL operators in a consistent manner,\nand for applying transformations during query optimization.\n\n## Query Operation Actors\n\nAll actors on the [Query Operation bus](/docs/modify/advanced/buses/#query-operation)\ncorrespond to exactly one SPARQL algebra operator type.\nFor example, [`@comunica/actor-query-operation-construct`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-construct)\nhandles algebra operations with type `\'construct\'`.\n\n## SPARQLAlgebra.js\n\nConverting a query string to SPARQL algebra\nhappens in the [SPARQL Parse bus](/docs/modify/advanced/buses/#query-parse).\nThe [`@comunica/actor-query-parse-sparql`](https://github.com/comunica/comunica/tree/master/packages/actor-query-parse-sparql) actor\non this bus makes use of the [SPARQLAlgebra.js](https://github.com/joachimvh/SPARQLAlgebra.js) package.\n\nExamples on how the conversion between SPARQL query string and SPARQL algebra happens can be found in the tests: https://github.com/joachimvh/SPARQLAlgebra.js/tree/master/test\n\n## Converting a SPARQL query into algebra\n\nIf you want to quickly check what the algebra of a given SPARQL query string looks like,\nyou can make use of Comunica\'s [explain functionality](/docs/query/advanced/explain/) as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en -q \'SELECT * { ?s ?p ?o }\' --explain parsed\n\n{\n "type": "project",\n "input": {\n "type": "bgp",\n "patterns": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "s"\n },\n "predicate": {\n "termType": "Variable",\n "value": "p"\n },\n "object": {\n "termType": "Variable",\n "value": "o"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "s"\n },\n {\n "termType": 
"Variable",\n "value": "p"\n },\n {\n "termType": "Variable",\n "value": "o"\n }\n ]\n}\n```\n\nThis tool is therefore useful if you want to implement support for a SPARQL operator,\nbut you need to find out to what algebra operation this corresponds.\n\n## Converting algebra into a SPARQL query\n\nYou can also apply the reverse transformation from algebra to SPARQL query string,\nfor which you will need to globally install [SPARQLAlgebra.js](https://github.com/joachimvh/SPARQLAlgebra.js):\n```bash\n$ npm install -g sparqlalgebrajs\n$ sparqlalgebrajs -q -r \'\n{\n "type": "project",\n "input": {\n "type": "bgp",\n "patterns": [\n {\n "type": "pattern",\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "x"\n },\n "predicate": {\n "termType": "Variable",\n "value": "y"\n },\n "object": {\n "termType": "Variable",\n "value": "z"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n }\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "x"\n },\n {\n "termType": "Variable",\n "value": "y"\n },\n {\n "termType": "Variable",\n "value": "z"\n }\n ]\n}\n\'\n\nSELECT ?x ?y ?z WHERE { ?x ?y ?z }\n```\n'},69763:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Core Architecture\'\ndescription: \'The low-level software architecture of Comunica for achieving modularity.\'\n---\n\nThis document gives an overview of the core architecture of Comunica,\nwhich gives us the desired **modularity** and **flexibility**.\n\nThis core architecture has been implemented in [`@comunica/core`](https://github.com/comunica/comunica/tree/master/packages/core).\n\nOn top of this architecture, the more high-level [SPARQL architecture](/docs/modify/advanced/architecture_sparql/) has been defined.\n\n
\nWatch a Webinar recording to gain a high-level overview of the core architecture.\n
\n\n## Core components: Actor, Mediator, and Bus\n\nComunica\'s architecture has been designed with flexibility and loose coupling of components as main goals.\nFor this, Comunica consists of **three types of components**: **actors**, **mediators**, and **buses**.\n\nAll logic in Comunica is separated into different **actors** ([`Actor`](https://comunica.github.io/comunica/classes/_comunica_core.Actor.html)),\nfollowing the [actor model](https://en.wikipedia.org/wiki/Actor_model).\nEach actor independently performs a specific task.\nFor example, one actor can implement the SPARQL `UNION` operator,\nanother actor can parse JSON-LD documents,\nand another actor can parse JSON-LD documents _in a different way_.\n\nAll actors are subscribed onto task-specific **buses** ([`Bus`](https://comunica.github.io/comunica/classes/_comunica_core.Bus.html)),\nfollowing the [publish-subscribe pattern](https://en.wikipedia.org/wiki/Publish%E2%80%93subscribe_pattern).\nFor example, a SPARQL query operator bus could contain actors for `UNION`, `SELECT`, `FILTER`, and more.\nAn RDF parsing bus could contain actors for JSON-LD, RDFa, Turtle, and more.\n\nSince multiple actors can exist for solving a specific task\n(for example if we have two actors for parsing JSON-LD documents),\n**mediators** ([`Mediator`](https://comunica.github.io/comunica/classes/_comunica_core.Mediator.html)) are used for determining the "best" actor on a bus for executing a certain action,\nfollowing the [mediator pattern](https://en.wikipedia.org/wiki/Mediator_pattern).\n\nTo ensure loose coupling of components, actors never communicate with each other directly.\nInstead, they always communicate via mediators and buses, as shown in the following figure:\n\n
\n(Figure: actors communicating indirectly through mediators and buses)\n
\n\n
\nWith Observers, you can passively observe actions executed by actors on a given bus.\n
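This passive observation can be illustrated with a minimal, self-contained sketch (this is not Comunica's actual `Bus` API; the class and method names below only mirror the pattern):\n\n```javascript\n// Minimal bus that, next to subscribed actors, notifies passive observers.\nclass Bus {\n  constructor() {\n    this.actors = [];\n    this.observers = [];\n  }\n  subscribe(actor) { this.actors.push(actor); }\n  subscribeObserver(observer) { this.observers.push(observer); }\n  // Observers are notified of every executed action,\n  // but never influence the result.\n  onRun(actor, action) {\n    for (const observer of this.observers) {\n      observer.onRun(actor, action);\n    }\n  }\n}\n\n// An observer that counts how many actions were executed on the bus.\nconst counter = { count: 0, onRun() { this.count++; } };\n\nconst bus = new Bus();\nbus.subscribeObserver(counter);\nbus.onRun(null, { input: 'some action' });\nconsole.log(counter.count); // 1\n```\n\nBecause observers are invoked outside of the test/run selection, they are useful for cross-cutting concerns such as logging or counting, without affecting which actor handles an action.\n\n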
\n\n## Run and test phases for selecting an actor\n\nDifferent mediators can select actors in different ways.\nFor this, the **mediator** will go through **two phases**:\n\n1. **Test phase**: The action is sent onto the bus to all subscribed actors. The actors return the estimated conditions under which the action could be executed, without actually executing the action.\n2. **Run phase**: The action is sent to a single actor for execution, where this actor is chosen by the mediator based on the returned test conditions.\n\nFor instance, the following figure shows an example of a mediator that will always pick the fastest actor on the bus.\n\n
\n(Figure: a mediator polling all actors in the test phase, then running the fastest one)\n
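The two phases can be sketched as follows (a simplified, self-contained illustration, not Comunica's actual mediator API; the `time` estimate is an assumed test condition):\n\n```javascript\n// Each actor estimates its cost in test(), and only does real work in run().\nconst actorA = {\n  name: 'A',\n  test: async () => ({ time: 50 }), // estimated execution time\n  run: async action => `A handled ${action.input}`,\n};\nconst actorB = {\n  name: 'B',\n  test: async () => ({ time: 10 }),\n  run: async action => `B handled ${action.input}`,\n};\n\n// A mediator that sends the action to all actors in the test phase,\n// then runs only the actor with the lowest estimated time.\nasync function mediate(actors, action) {\n  const tests = await Promise.all(\n    actors.map(async actor => ({ actor, estimate: await actor.test(action) })),\n  );\n  tests.sort((a, b) => a.estimate.time - b.estimate.time);\n  return tests[0].actor.run(action);\n}\n\nmediate([ actorA, actorB ], { input: 'an action' })\n  .then(result => console.log(result)); // "B handled an action"\n```\n\nNote that only the winning actor's `run()` is ever invoked; the test phase is side-effect-free.\n\n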
\n\nTherefore, an actor must at least expose the following methods:\n```typescript\nexport interface Actor {\n test(action: IAction): Promise<IActorTest>;\n run(action: IAction): Promise<IActorOutput>;\n}\n```\n\nThe interfaces `IAction`, `IActorTest`, and `IActorOutput` depend on the bus this actor is subscribed to.\n\nLearn more about the [actors, buses](/docs/modify/advanced/buses/) and [mediators](/docs/modify/advanced/mediators/) that exist in Comunica.\n\n## Wiring of components\n\nAll Comunica actors, buses, and mediators are implemented as [separate npm packages](https://github.com/comunica/comunica/tree/master/packages).\nIn order to _wire_ these different components with each other in a single application,\nwe make use of the **dependency injection** framework [Components.js](/docs/modify/advanced/componentsjs/).\nComponents.js allows us to wire components with each other using one or more [configuration files](/docs/modify/advanced/componentsjs/#creating-configurations-in-json-ld).\nPlugging in different components therefore does not require any code changes, only a config change.\n\nConsidering these different types of components,\nwe make use of the following naming conventions for packages:\n\n* Buses: `@comunica/bus-[name-of-bus-type]`\n* Mediators: `@comunica/mediator-[name-of-mediator]`\n* Actors: `@comunica/actor-[name-of-bus-type]-[name-of-actor]`\n* Mediator types: `@comunica/mediatortype-[name-of-mediator-type]`\n'},62413:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'SPARQL Architecture\'\ndescription: \'The high-level software architecture of Comunica for implementing SPARQL.\'\n---\n\nThis document gives an overview of the architecture that implements SPARQL query execution in Comunica.\nThis builds upon the [core architecture](/docs/modify/advanced/architecture_core/) of _actors_, _mediators_, and _buses_.\n\n## Overview\n\nThe figure below shows an overview of the most relevant _buses_ and _actors_ that are used in Comunica SPARQL.\nSome buses 
such as _Query Operation_ contain a large number of subscribed actors,\nwhich is why the figure below only shows a few as an illustration.\n\n[Click on the figure](/img/architecture_sparql.svg) to view it in full screen, or view the [PDF version](/img/architecture_sparql.pdf).\n\n
\n(Figure: overview of the main buses and actors in Comunica SPARQL)\n
\n\n## Data flow for a query execution\n\nFor a given SPARQL query, the following logic flow occurs: (_some parts are omitted for simplicity_)\n\n* **Init:** All Comunica engines start here. This is where they accept generic input parameters, such as CLI arguments.\n * **Query Process:** Extracts things like query and output format from input arguments.\n * **Context Preprocess:** A bus in which actors can optionally modify the [query context](/docs/query/advanced/context/).\n * **Query Source Identify:** Identifies sources using the *Query Source Identify* bus, where query sources can accept query operations.\n * **RDF/JS:** Translates the array of sources in the [query context](/docs/query/advanced/context/) into the union of quad streams by resolving each source separately in the *RDF Resolve Quad Pattern* bus.\n * **Hypermedia:** Resolves query operations by interpreting hypermedia links and controls.\n * **Dereference RDF:** Dereferences a path or URL into a stream of quads, which internally makes use of several parsers in the *RDF Parse* bus, and it uses data lookup actors from the *Dereference* bus.\n * **RDF Metadata:** Extracts the quads relevant for metadata from the stream of data quads.\n * **RDF Metadata Extract:** Create an object with metadata for a given metadata quad stream.\n * **RDF Metadata Accumulate:** Merge the metadata object with any previous metadata (only applies if multiple links are being followed).\n * **RDF Resolve Hypermedia Links:** Determines which links should be followed from the metadata of the current source.\n * **RDF Resolve Hypermedia Links Queue:** Creates a link queue that enables different strategies for queueing links.\n * **Query Source Identify Hypermedia:** Handle a source based on the extracted metadata.\n * **None:** The source is considered a raw RDF file, for which all data quads matching the query operation are returned.\n * **SPARQL:** The source is considered a SPARQL endpoint if it has a service description, 
for which we use the SPARQL protocol.\n * **QPF:** The source is considered a [Triple/Quad Pattern Fragments](https://linkeddatafragments.org/) interface.\n * **SPARQL Parse:** Parses the SPARQL query into SPARQL algebra.\n * **Optimize Query Operation:** Applies optional optimizations to the SPARQL algebra before actual execution.\n * **Query Operation:** Executes the query operation.\n * **Join:** Handles joins between multiple query operations via its own separate bus.\n * **SPARQL Serialize:** Serializes the query result into a text-based serialization.\n\n[Click here for a full list of buses and actors](/docs/modify/advanced/buses/).\n'},96848:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Browser builds\'\ndescription: \'All modules in Comunica can be built for the browser.\'\n---\n\nUsing bundlers such as [Webpack](https://www.npmjs.com/package/webpack)\nand [browserify](http://browserify.org/),\nyou can bundle your custom Comunica engine for the browser.\n\nFor this, you have to do the following.\n\n
\nIf you just want to make use of default Comunica engines such as Comunica SPARQL,\nrefer to the guide on querying in a JavaScript browser app.\n
\n\n
\nA full example of a custom Comunica engine that is browser-ready can be found\nhere.\n
\n\n## 1. Compile the config to JavaScript\n\nNot all parts of Comunica can be executed in the browser.\nIn particular, the dynamic version of Comunica, which reads its config from the local file system, cannot.\n\nAs such, if we want to expose our engine in the browser,\nwe have to **compile our config to a JavaScript file**.\nThis can be done using the `comunica-compile-config` tool from [`@comunica/runner`](https://github.com/comunica/comunica/tree/master/packages/runner).\n\nFor this, add `@comunica/runner` as a dev dependency to your `package.json`,\nand add the following script (assuming your config exists at `config/config-default.json`):\n\n```text\n{\n ...\n "scripts": {\n ...\n "prepublishOnly": "npm run build:engine",\n "build:engine": "comunica-compile-config config/config-default.json > engine-default.js"\n }\n}\n```\n\n## 2. Create a browser-specific entrypoint\n\nNext, create a file called **`lib/index-browser.ts`**, which will become the browser variant of `lib/index.ts`.\n`lib/index-browser.ts` should at least contain the following:\n```typescript\nexport * from \'./QueryEngine\';\n```\n\n## 3. Expose the browser-specific entrypoint\n\nAfter that, we have to **tell the browser bundling tools that they need to look at `index-browser.js`**\ninstead of `index.js` for browser apps.\nFor this, add the following to your `package.json`:\n```text\n{\n ...\n "browser": {\n "./lib/index.js": "./lib/index-browser.js"\n }\n}\n```\n\n## 4. Building for the browser\n\nNow you\'re ready to compile your application for the browser using tools such as [Webpack](https://www.npmjs.com/package/webpack).\n\n
\nWhile Comunica previously required polyfilling using tools such as node-polyfill-webpack-plugin,\nthis is not required anymore as of Comunica 2.4.0.\n
\n\nPlease refer to the documentation of [Webpack](https://www.npmjs.com/package/webpack) on how to configure this build process.\n\nBelow you can find an example configuration file for Webpack, which may require some fine-tuning depending on your use case:\n\n```javascript\nconst path = require(\'path\');\nconst ProgressPlugin = require(\'webpack\').ProgressPlugin;\n\nmodule.exports = {\n entry: [ \'@babel/polyfill\', path.resolve(__dirname, \'my-app.js\') ],\n output: {\n filename: \'my-app-browser.js\',\n path: __dirname, \n libraryTarget: \'window\',\n },\n devtool: \'source-map\',\n module: {\n rules: [\n {\n test: /\\.js$/,\n loader: \'babel-loader\',\n exclude: /node_modules/,\n },\n ]\n },\n plugins: [\n new ProgressPlugin(),\n ]\n};\n\n```\n'},18096:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Buses and Actors'\ndescription: 'An overview of all buses in Comunica and their actors.'\n---\n\nThis page gives an **overview of all _buses_ and _actors_**\nthat are used in the default Comunica engines,\nsuch as [Comunica SPARQL](https://github.com/comunica/comunica/tree/master/engines/query-sparql)\nand [Comunica SPARQL File](https://github.com/comunica/comunica/tree/master/engines/query-sparql-file)\nOther configurations such as [Comunica SPARQL HDT](https://github.com/comunica/comunica-query-sparql-hdt) contain additional actors and buses.\n\nThis builds upon the [core architecture](/docs/modify/advanced/architecture_core/) of _actors_, _mediators_, and _buses_.\nAn overview of how these buses and actors are connected can be found in the [SPARQL architecture](/docs/modify/advanced/architecture_sparql/).\n\n## Context Preprocess\n\n_Package: [`@comunica/bus-context-preprocess`](https://github.com/comunica/comunica/tree/master/packages/bus-context-preprocess)_\n\nA bus in which actors can optionally modify the [query context](/docs/query/advanced/context/).\n\nSubscribed actors need to implement 
[`ActorContextPreprocess`](https://comunica.github.io/comunica/classes/_comunica_bus_context_preprocess.ActorContextPreprocess.html).\n\n| Name | Package | Description |\n|------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| Convert Shortcuts | [`@comunica/actor-context-preprocess-convert-shortcuts`](https://github.com/comunica/comunica/tree/master/packages/actor-context-preprocess-convert-shortcuts) | Expands shortcuts in the context to full context keys. |\n| Query Source Identify | [`@comunica/actor-context-preprocess-query-source-identify`](https://github.com/comunica/comunica/tree/master/packages/actor-context-preprocess-query-source-identify) | Identifies all query sources in the context using the [Query Source Identify bus](https://github.com/comunica/comunica/tree/master/packages/bus-query-source-identify). |\n| Query Source Skolemize | [`@comunica/actor-context-preprocess-query-source-skolemize`](https://github.com/comunica/comunica/tree/master/packages/actor-context-preprocess-query-source-skolemize) | Places all identified query sources in a skolemization wrapper. |\n| Set Defaults | [`@comunica/actor-context-preprocess-set-defaults`](https://github.com/comunica/comunica/tree/master/packages/actor-context-preprocess-set-defaults) | Will set default context values for query engines, such as the logger, timestamp, function arguments cache, ... |\n| Source To Destination | [`@comunica/actor-context-preprocess-source-to-destination`](https://github.com/comunica/comunica/tree/master/packages/actor-context-preprocess-source-to-destination) | Defines the write destination only if a single query source has been defined. 
|\n\n\n## Dereference\n\n_Package: [`@comunica/bus-dereference`](https://github.com/comunica/comunica/tree/master/packages/bus-dereference)_\n\nDereferences a path or URL into a (generic) stream.\n\nSubscribed actors need to implement [`ActorDereference`](https://comunica.github.io/comunica/classes/_comunica_bus_dereference.ActorDereference.html).\n\n### Actors\n\n| Name | Package | Description |\n|----------|----------------------------------------------------------------------------------------------------------------------------|--------------------------------------------|\n| File | [`@comunica/actor-dereference-file`](https://github.com/comunica/comunica/tree/master/packages/actor-dereference-file) | Dereferences a local file. |\n| HTTP | [`@comunica/actor-dereference-http`](https://github.com/comunica/comunica/tree/master/packages/actor-dereference-http) | Dereferences a remote file. |\n| Fallback | [`@comunica/actor-dereference-fallback`](https://github.com/comunica/comunica/tree/master/packages/actor-dereference-fallback) | A fallback actor with the lowest priority. |\n\n\n## Dereference RDF\n\n_Package: [`@comunica/bus-dereference-rdf`](https://github.com/comunica/comunica/tree/master/packages/bus-dereference-rdf)_\n\nDereferences a path or URL into a stream of quads.\n\nSubscribed actors need to implement [`ActorDereferenceRdf`](https://comunica.github.io/comunica/classes/_comunica_bus_dereference_rdf.ActorDereferenceRdf.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Parse | [`@comunica/actor-dereference-rdf-parse`](https://github.com/comunica/comunica/tree/master/packages/actor-dereference-rdf-parse) | Dereferences RDF using [`@comunica/bus-dereference`](https://github.com/comunica/comunica/tree/master/packages/bus-dereference). Invokes parsing with [`@comunica/bus-rdf-parse`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-parse). 
|\n\n\n## Hash Bindings\n\n_Package: [`@comunica/bus-hash-bindings`](https://github.com/comunica/comunica/tree/master/packages/bus-hash-bindings)_\n\nA bus for hashing `Bindings`.\n\nSubscribed actors need to implement [`ActorHashBindings`](https://comunica.github.io/comunica/classes/_comunica_bus_hash_bindings.ActorHashBindings.html).\n\n### Actors\n\n| Name | Package | Description |\n|------|-------------------------------------------------------------------------------------------------------------------------------|-----------------------------|\n| SHA1 | [`@comunica/actor-hash-bindings-sha1`](https://github.com/comunica/comunica/tree/master/packages/actor-hash-bindings-sha1) | Hashes bindings using SHA1. |\n\n\n## Hash Quads\n\n_Package: [`@comunica/bus-hash-quads`](https://github.com/comunica/comunica/tree/master/packages/bus-hash-quads)_\n\nA bus for hashing `RDF.Quad`.\n\nSubscribed actors need to implement [`ActorHashQuads`](https://comunica.github.io/comunica/classes/_comunica_bus_hash_quads.ActorHashQuads.html).\n\n### Actors\n\n| Name | Package | Description |\n|------|----------------------------------------------------------------------------------------------------------------------|--------------------------|\n| SHA1 | [`@comunica/actor-hash-quads-sha1`](https://github.com/comunica/comunica/tree/master/packages/actor-hash-quads-sha1) | Hashes quads using SHA1. 
|\n\n\n## HTTP\n\n_Package: [`@comunica/bus-http`](https://github.com/comunica/comunica/tree/master/packages/bus-http)_\n\nPerforms HTTP(S) requests.\n\nSubscribed actors need to implement [`ActorHttp`](https://comunica.github.io/comunica/classes/_comunica_bus_http.ActorHttp.html).\n\n### Actors\n\n| Name | Package | Description |\n|---------|----------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|\n| Memento | [`@comunica/actor-http-memento`](https://github.com/comunica/comunica/tree/master/packages/actor-http-memento) | Implements the [Memento protocol](/docs/query/advanced/memento/). |\n| Native | [`@comunica/actor-http-native`](https://github.com/comunica/comunica/tree/master/packages/actor-http-native) | Performs requests using Node's request library. |\n| Fetch | [`@comunica/actor-http-fetch`](https://github.com/comunica/comunica/tree/master/packages/actor-http-fetch) | Performs requests using the fetch API. |\n| Proxy | [`@comunica/actor-http-proxy`](https://github.com/comunica/comunica/tree/master/packages/actor-http-proxy) | Run requests through a proxy. |\n| Wayback | [`@comunica/actor-http-wayback`](https://github.com/comunica/comunica/tree/master/packages/actor-http-wayback) | Run requests through the Wayback machine. |\n\n\n## HTTP Invalidate\n\n_Package: [`@comunica/bus-http-invalidate`](https://github.com/comunica/comunica/tree/master/packages/bus-http-invalidate)_\n\nA bus for HTTP invalidation events.\n\nSubscribed actors need to implement [`ActorHttpInvalidate`](https://comunica.github.io/comunica/classes/_comunica_bus_http.ActorHttp.html).\n\n\n## Init\n\n_Package: [`@comunica/bus-init`](https://github.com/comunica/comunica/tree/master/packages/bus-init)_\n\nAll Comunica engines start here. 
This is where they accept generic input parameters, such as CLI arguments.\n\nSubscribed actors need to implement [`ActorInit`](https://comunica.github.io/comunica/classes/_comunica_bus_init.ActorInit.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Query | [`@comunica/actor-init-query`](https://github.com/comunica/comunica/tree/master/packages/actor-init-query) | Initializes query execution by parsing a given query, optimizing, executing, and serializing results. |\n\n\n## Merge Bindings Context\n\n_Package: [`@comunica/bus-merge-bindings-context`](https://github.com/comunica/comunica/tree/master/packages/bus-init)_\n\nA bus for creating merge handlers that are responsible for merging context entries in bindings with different values.\n\nSubscribed actors need to implement [`ActorMergeBingsContext`](https://comunica.github.io/comunica/classes/_comunica_bus_merge_bindings_context.ActorMergeBingsContext.html).\n\n### Actors\n\n| Name | Package | Description |\n|-------| ------- |------------------------------------------------------|\n| Union | [`@comunica/actor-actor-merge-binding-factory-context-union`](https://github.com/comunica/comunica/tree/master/packages/actor-merge-binding-factory-context-union) | Merges context entry values by taking the set-union. 
|\n\n\n## Optimize Query Operation\n\n_Package: [`@comunica/bus-optimize-query-operation`](https://github.com/comunica/comunica/tree/master/packages/bus-optimize-query-operation)_\n\nApply optional optimizations to the SPARQL algebra before actual execution.\nOptionally, a modified context can be returned.\n\nSubscribed actors need to implement [`ActorOptimizeQueryOperation`](https://comunica.github.io/comunica/classes/_comunica_bus_optimize_query_operation.ActorOptimizeQueryOperation.html).\n\n### Actors\n\n| Name | Package | Description |\n|--------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------|\n| Assign Sources Exhaustive | [`@comunica/actor-optimize-query-operation-assign-sources-exhaustive`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-assign-sources-exhaustive) | Converts every quad pattern in the query to a union of quad patterns per source. |\n| BGP to Join | [`@comunica/actor-optimize-query-operation-bgp-to-join`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-bgp-to-join) | Converts BGPs into join operations. |\n| Construct Distinct | [`@comunica/actor-optimize-query-operation-construct-distinct`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-construct-distinctv) | Wraps the top-level Construct clause in Distinct if --distinct flag is on. 
|\n| Describe To Constructs Subject | [`@comunica/actor-optimize-query-operation-describe-to-constructs-subject`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-assign-sources-exhaustive) | Converts [SPARQL `DESCRIBE`](https://www.w3.org/TR/sparql11-query/#describe) operations to construct queries with all triples related to a given subject. |\n| Filter Pushdown | [`@comunica/actor-optimize-query-operation-filter-pushdown`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-filter-pushdown) | Pushes down filter expressions into the query plan as deep as possible. |\n| Join BGP | [`@comunica/actor-optimize-query-operation-join-bgp`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-join-bgp) | Merges joins of multiple BGPs into a single BGP. |\n| Join Connected | [`@comunica/actor-optimize-query-operation-join-connected`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-join-connected) | Clusters entries within a join operation into separate sub-joins if they are connected by variables. |\n| Group Sources | [`@comunica/actor-optimize-query-operation-group-sources`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-group-sources) | Groups exclusive groups of query operations into sources only if those sources support those grouped operations. |\n| Prune Empty Source Operations | [`@comunica/actor-optimize-query-operation-prune-empty-source-operations`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-prune-empty-source-operations) | Removes operations from the query plan that are guaranteed to produce empty results. 
|\n| Rewrite Add | [`@comunica/actor-optimize-query-operation-rewrite-add`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-rewrite-add) | Rewrites ADD operators as DELETEINSERT operations. |\n| Rewrite Copy | [`@comunica/actor-optimize-query-operation-rewrite-copy`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-rewrite-copy) | Rewrites COPY operators as DELETEINSERT operations. |\n| Rewrite Move | [`@comunica/actor-optimize-query-operation-rewrite-move`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-rewrite-move) | Rewrites MOVE operators as DELETEINSERT operations. |\n\n\n## Query Operation\n\n_Package: [`@comunica/bus-query-operation`](https://github.com/comunica/comunica/tree/master/packages/bus-query-operation)_\n\nEvaluates [SPARQL algebra operations](/docs/modify/advanced/algebra/).\n\nSubscribed actors need to implement [`ActorQueryOperation`](https://comunica.github.io/comunica/classes/_comunica_bus_query_operation.ActorQueryOperation.html)\nor [`ActorQueryOperationTyped`](https://comunica.github.io/comunica/classes/_comunica_bus_query_operation.ActorQueryOperationTyped.html).\n\n### Actors\n\n| Name | Package | Description |\n|-------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| Ask | [`@comunica/actor-query-operation-ask`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-ask) | Handles `ASK` operations. 
|\n| BGP join | [`@comunica/actor-query-operation-bgp-join`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-bgp-join) | Handles BGPs by delegating to [`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join). |\n| Construct | [`@comunica/actor-query-operation-construct`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-construct) | Handles `CONSTRUCT` operations. |\n| Describe subject | [`@comunica/actor-query-operation-describe-subject`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-describe-subject) | Handles `DESCRIBE` operations. |\n| Distinct hash | [`@comunica/actor-query-operation-distinct-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-distinct-hash) | Handles `DISTINCT` operations through hashing. |\n| Extend | [`@comunica/actor-query-operation-extend`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-extend) | Handles `EXTEND` operations. |\n| Filter | [`@comunica/actor-query-operation-filter`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-filter) | Handles `FILTER` operations. |\n| From quad | [`@comunica/actor-query-operation-from-quad`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-from-quad) | Handles `FROM` operations by considering `FROM` and `FROM NAMED` as target graph elements in quads. |\n| Group | [`@comunica/actor-query-operation-group`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-group) | Handles `GROUP BY` operations. |\n| Join | [`@comunica/actor-query-operation-join`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-join) | Handles join operations by delegating as inner join to [`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join). 
|\n| Left join | [`@comunica/actor-query-operation-leftjoin`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-leftjoin) | Handles `OPTIONAL` operations by delegating as optional join to [`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join). |\n| Minus | [`@comunica/actor-query-operation-minus`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-minus) | Handles `MINUS` operations by delegating as minus join to [`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join). |\n| Nop | [`@comunica/actor-query-operation-nop`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-nop) | Handles `NOP` operations. |\n| Order by | [`@comunica/actor-query-operation-orderby`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-orderby) | Handles `ORDER BY` operations. |\n| Path Alt | [`@comunica/actor-query-operation-path-alt`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-alt) | Handles `alt` property path expressions. |\n| Path Inv | [`@comunica/actor-query-operation-path-inv`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-inv) | Handles `inv` property path expressions. |\n| Path Link | [`@comunica/actor-query-operation-path-link`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-link) | Handles `link` property path expressions. |\n| Path Nps | [`@comunica/actor-query-operation-path-nps`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-nps) | Handles `nps` property path expressions. |\n| Path One or more | [`@comunica/actor-query-operation-path-one-or-more`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-one-or-more) | Handles `one-or-more` property path expressions. 
|\n| Path Seq | [`@comunica/actor-query-operation-path-seq`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-seq) | Handles `seq` property path expressions. |\n| Path Zero or more | [`@comunica/actor-query-operation-path-zero-or-more`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-zero-or-more) | Handles `zero-or-more` property path expressions. |\n| Path Zero or one | [`@comunica/actor-query-operation-path-zero-or-one`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-path-zero-or-one) | Handles `zero-or-one` property path expressions. |\n| Project | [`@comunica/actor-query-operation-project`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-project) | Handles `SELECT` operations. |\n| Reduced hash | [`@comunica/actor-query-operation-reduced-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-reduced-hash) | Handles `REDUCED` operations through hashing. |\n| Service | [`@comunica/actor-query-operation-service`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-service) | Handles `SERVICE` operations. |\n| Slice | [`@comunica/actor-query-operation-slice`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-slice) | Handles `LIMIT` and `OFFSET` operations. |\n| Source | [`@comunica/actor-query-operation-source`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-source) | Delegates operations annotated with a query source towards that source. |\n| Union | [`@comunica/actor-query-operation-union`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-union) | Handles `UNION` operations. |\n| Values | [`@comunica/actor-query-operation-values`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-values) | Handles `VALUES` operations. 
|\n| Update Clear | [`@comunica/actor-query-operation-update-clear`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-update-clear) | Handles `CLEAR` operations. |\n| Update Composite Update | [`@comunica/actor-query-operation-update-compositeupdate`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-update-compositeupdate) | Handles composition of multiple SPARQL update operations. |\n| Update Create | [`@comunica/actor-query-operation-update-create`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-update-create) | Handles `CREATE` operations. |\n| Update Delete Insert | [`@comunica/actor-query-operation-update-deleteinsert`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-update-deleteinsert) | Handles `INSERT DATA`, `DELETE DATA`, and `INSERT/DELETE` operations. |\n| Update Drop | [`@comunica/actor-query-operation-update-drop`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-update-drop) | Handles `DROP` operations. |\n| Update Load | [`@comunica/actor-query-operation-update-load`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-update-load) | Handles `LOAD` operations. |\n\n\n## Query Parse\n\n_Package: [`@comunica/bus-query-parse`](https://github.com/comunica/comunica/tree/master/packages/bus-query-parse)_\n\nParses an input query into (SPARQL) algebra.\n\nSubscribed actors need to implement [`ActorQueryParse`](https://comunica.github.io/comunica/classes/_comunica_bus_query_parse.ActorQueryParse.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| SPARQL | [`@comunica/actor-query-parse-sparql`](https://github.com/comunica/comunica/tree/master/packages/actor-query-parse-sparql) | Uses [SPARQLAlgebra.js](https://github.com/joachimvh/SPARQLAlgebra.js) for parsing SPARQL query strings into SPARQL algebra. 
|\n| GraphQL | [`@comunica/actor-query-parse-graphql`](https://github.com/comunica/comunica/tree/master/packages/actor-query-parse-graphql) | Parses GraphQL strings into SPARQL algebra following the [GraphQL-LD](/docs/query/advanced/graphql_ld/) approach. |\n\n\n## Query Process\n\n_Package: [`@comunica/bus-query-process`](https://github.com/comunica/comunica/tree/master/packages/bus-query-process)_\n\nA bus for fully processing a query. This usually involves parsing, optimizing, and evaluating, which can be delegated to other buses.\n\nSubscribed actors need to implement [`ActorQueryProcess`](https://comunica.github.io/comunica/classes/_comunica_bus_query_process.ActorQueryProcess.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Annotate Source Binding | [`@comunica/actor-query-process-annotate-source-binding`](https://github.com/comunica/comunica/tree/master/packages/actor-query-process-annotate-source-binding) | Annotates bindings with their sources. |\n| Explain Logical | [`@comunica/actor-query-process-explain-logical`](https://github.com/comunica/comunica/tree/master/packages/actor-query-process-explain-logical) | Explains the logical query plan after parsing and optimizing. |\n| Explain Parsed | [`@comunica/actor-query-process-explain-parsed`](https://github.com/comunica/comunica/tree/master/packages/actor-query-process-explain-parsed) | Explains the parsed query. |\n| Explain Physical | [`@comunica/actor-query-process-explain-physical`](https://github.com/comunica/comunica/tree/master/packages/actor-query-process-explain-physical) | Explains the physical query plan after parsing, optimizing, and evaluating. 
|\n| Sequential | [`@comunica/actor-query-process-sequential`](https://github.com/comunica/comunica/tree/master/packages/actor-query-process-sequential) | Processes a query in a sequential manner. It first parses the query, optimizes it, and then evaluates it. |\n\n\n## Query Result Serialize\n\n_Package: [`@comunica/bus-query-result-serialize`](https://github.com/comunica/comunica/tree/master/packages/bus-query-result-serialize)_\n\nSerializes the query result into a text-based serialization.\n\nSubscribed actors need to implement [`ActorQueryResultSerialize`](https://comunica.github.io/comunica/classes/_comunica_bus_query_result_serialize.ActorQueryResultSerialize.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| JSON | [`@comunica/actor-query-result-serialize-json`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-json) | Serializes to a simple JSON format. |\n| RDF | [`@comunica/actor-query-result-serialize-rdf`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-rdf) | Serializes to an RDF format by delegating to [`@comunica/bus-rdf-serialize`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-serialize). |\n| Simple | [`@comunica/actor-query-result-serialize-simple`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-simple) | Serializes to a simple format. |\n| SPARQL CSV | [`@comunica/actor-query-result-serialize-sparql-csv`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-sparql-csv) | Serializes to SPARQL/CSV. |\n| SPARQL JSON | [`@comunica/actor-query-result-serialize-sparql-json`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-sparql-json) | Serializes to SPARQL/JSON. 
|\n| SPARQL TSV | [`@comunica/actor-query-result-serialize-sparql-tsv`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-sparql-tsv) | Serializes to SPARQL/TSV. |\n| SPARQL XML | [`@comunica/actor-query-result-serialize-sparql-xml`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-sparql-xml) | Serializes to SPARQL/XML. |\n| Stats | [`@comunica/actor-query-result-serialize-stats`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-stats) | Serializes basic statistics. |\n| Table | [`@comunica/actor-query-result-serialize-table`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-table) | Serializes in a simple table format. |\n| Tree | [`@comunica/actor-query-result-serialize-tree`](https://github.com/comunica/comunica/tree/master/packages/actor-query-result-serialize-tree) | Serializes to a JSON tree. |\n\n\n## Query Source Identify\n\n_Package: [`@comunica/bus-query-source-identify`](https://github.com/comunica/comunica/tree/master/packages/bus-query-source-identify)_\n\nIdentifies the types of query sources.\n\nSubscribed actors need to implement [`ActorQuerySourceIdentify`](https://comunica.github.io/comunica/classes/_comunica_bus_query_source_identify.ActorQuerySourceIdentify.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Serialized | [`@comunica/actor-query-source-identify-serialized`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-serialized) | Handles serialized sources. 
|\n| Hypermedia | [`@comunica/actor-query-source-identify-hypermedia`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-hypermedia) | Handles [hypermedia-based sources](/docs/modify/advanced/hypermedia/). |\n| RDF/JS Source | [`@comunica/actor-query-source-identify-rdfjs`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-rdfjs) | Handles [RDF/JS Sources](https://comunica.dev/docs/query/advanced/rdfjs_querying/). |\n\n\n## Query Source Identify Hypermedia\n\n_Package: [`@comunica/bus-query-source-identify-hypermedia`](https://github.com/comunica/comunica/tree/master/packages/bus-query-source-identify-hypermedia)_\n\nIdentifies a query source based on the extracted metadata.\n\nSubscribed actors need to implement [`ActorQuerySourceIdentifyHypermedia`](https://comunica.github.io/comunica/classes/_comunica_bus_query_source_identify_hypermedia.ActorQuerySourceIdentifyHypermedia.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Annotate Source | [`@comunica/actor-query-source-identify-hypermedia-annotate-source`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-hypermedia-annotate-source) | This actor wraps around other hypermedia sources and adds the URL from which the bindings are derived to the binding's context. |\n| None | [`@comunica/actor-query-source-identify-hypermedia-none`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-hypermedia-none) | The source is considered a raw RDF file, for which all data quads matching the quad pattern are returned. 
|\n| QPF | [`@comunica/actor-query-source-identify-hypermedia-qpf`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-hypermedia-qpf) | The source is considered a [Triple/Quad Pattern Fragments](https://linkeddatafragments.org/) interface. |\n| SPARQL | [`@comunica/actor-query-source-identify-hypermedia-sparql`](https://github.com/comunica/comunica/tree/master/packages/actor-query-source-identify-hypermedia-sparql) | The source is considered a SPARQL endpoint if it has a service description, for which we use the SPARQL protocol. |\n\n\n## RDF Join\n\n_Package: [`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join)_\n\nHandles joining of bindings streams.\n\nIt supports different logical join types, such as inner, optional, and minus joins.\n\nSubscribed actors need to implement [`ActorRdfJoin`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_join.ActorRdfJoin.html).\n\n### Actors\n\n| Name | Package | Description |\n|--------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| Inner Hash | [`@comunica/actor-rdf-join-inner-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-hash) | Inner hash join of two entries. |\n| Inner Nested loop | [`@comunica/actor-rdf-join-inner-nestedloop`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-nestedloop) | Inner nested loop join of two entries. 
|\n| Inner None | [`@comunica/actor-rdf-join-inner-none`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-none) | Inner join between zero entries, and returns a single binding. |\n| Inner Single | [`@comunica/actor-rdf-join-inner-single`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-single) | Inner join of a single entry, and returns the entry itself. |\n| Inner Symmetric hash | [`@comunica/actor-rdf-join-inner-symmetrichash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-symmetrichash) | Inner symmetric hash join of two entries. |\n| Inner Multi empty | [`@comunica/actor-rdf-join-inner-multi-empty`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-empty) | Inner multi-join that accepts any number of inner-join entries of which at least one is empty and returns an empty stream. |\n| Inner Multi Bind | [`@comunica/actor-rdf-join-inner-multi-bind`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-bind) | Inner multi-join that inner-joins 2 or more streams by picking the one with the lowest cardinality, binding each item with the remaining operations, and recursively resolving those operations by delegating to [`@comunica/bus-query-operation`](https://github.com/comunica/comunica/tree/master/packages/bus-query-operation). |\n| Inner Multi Bind Source | [`@comunica/actor-rdf-join-inner-multi-bind-source`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-bind-source) | Inner multi-join that inner-joins 2 or more streams by picking the one with the lowest cardinality, chunking it according to a certain block size, and joining each chunk with the remaining query by pushing it into the source. 
|\n| Inner Multi sequential | [`@comunica/actor-rdf-join-inner-multi-sequential`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-sequential) | Inner multi-join that hierarchically joins two entries at a time. |\n| Inner Multi smallest | [`@comunica/actor-rdf-join-inner-multi-smallest`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-smallest) | Inner multi-join by always picking the first two streams with the smallest estimated cardinality. |\n| Inner Multi smallest filter bindings | [`@comunica/actor-rdf-join-inner-multi-smallest-filter-bindings`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-smallest-filter-bindings) | Inner multi-join that inner-joins 2 or more streams by joining the smallest two, and joining the result with the remaining streams by delegating back to the [RDF Join bus](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join). While joining the smallest two, the first stream is pushed down as a filter into the second stream. |\n| Minus Hash | [`@comunica/actor-rdf-join-minus-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-minus-hash) | Anti-join (minus) of 2 streams using the hash join algorithm. This actor does _not_ support streams that can have undefined values. |\n| Minus Hash undef | [`@comunica/actor-rdf-join-minus-hash-undef`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-minus-hash-undef) | Anti-join (minus) of 2 streams using the hash join algorithm. This actor supports streams that can have undefined values. |\n| Optional Bind | [`@comunica/actor-rdf-join-optional-bind`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-optional-bind) | Left-join (optional) 2 streams using the bind join algorithm. 
It binds each item of the first stream with the second operation, and recursively resolves that operation by delegating to [`@comunica/bus-query-operation`](https://github.com/comunica/comunica/tree/master/packages/bus-query-operation). |\n| Optional Hash | [`@comunica/actor-rdf-join-optional-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-optional-hash) | Left-join (optional) 2 streams using the hash join algorithm. |\n| Optional Nested loop | [`@comunica/actor-rdf-join-optional-nestedloop`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-optional-nestedloop) | Left-join (optional) 2 streams using the nested loop join algorithm. |\n| Optional Opt Plus | [`@comunica/actor-rdf-join-optional-optplus`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-optional-optplus) | Left-join (optional) 2 streams using the [OPT+](https://www.researchgate.net/publication/333627321_OPT_A_Monotonic_Alternativeto_OPTIONAL_in_SPARQL) algorithm. |\n\n\n## RDF Join Entries Sort\n\n_Package: [`@comunica/bus-rdf-join-entries-sort`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join-entries-sort)_\n\nDetermines the order in which join entries should be joined.\n\nSubscribed actors need to implement [`ActorRdfJoinEntriesSort`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_join_entries_sort.ActorRdfJoinEntriesSort.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Cardinality | [`@comunica/actor-rdf-join-entries-sort-cardinality`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-entries-sort-cardinality) | Orders join entries by increasing cardinality. 
|\n\n\n## RDF Join Selectivity\n\n_Package: [`@comunica/bus-rdf-join-selectivity`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join-selectivity)_\n\nCalculates or estimates the selectivity of joins.\n\nSubscribed actors need to implement [`ActorRdfJoinSelectivity`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_join_selectivity.ActorRdfJoinSelectivity.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Variable Counting | [`@comunica/actor-rdf-join-selectivity-variable-counting`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-selectivity-variable-counting) | Estimates the selectivity by counting the overlap of variables and non-variables in patterns. |\n\n\n## RDF Metadata\n\n_Package: [`@comunica/bus-rdf-metadata`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-metadata)_\n\nExtracts the quads relevant for metadata from the stream of data quads.\n\nSubscribed actors need to implement [`ActorRdfMetadata`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_metadata.ActorRdfMetadata.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| All | [`@comunica/actor-rdf-metadata-all`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-all) | Considers all incoming quads as both data and metadata quads. |\n| Primary topic | [`@comunica/actor-rdf-metadata-primary-topic`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-primary-topic) | Splits off the metadata based on the existence of a `foaf:primaryTopic` link. 
|\n\n\n## RDF Metadata Accumulate\n\n_Package: [`@comunica/bus-rdf-metadata-accumulate`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-metadata-accumulate)_\n\nA bus for aggregating metadata objects together.\n\nSubscribed actors need to implement [`ActorRdfMetadataAccumulate`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_metadata_accumulate.ActorRdfMetadataAccumulate.html).\n\n### Actors\n\n| Name | Package | Description |\n|--------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------|\n| Can Contain Undefs | [`@comunica/actor-rdf-metadata-accumulate-cancontainundefs`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-accumulate-cancontainundefs) | Accumulate the `canContainUndefs` field. |\n| Cardinality | [`@comunica/actor-rdf-metadata-accumulate-cardinality`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-accumulate-cardinality) | Accumulate the `cardinality` field. |\n| Page Size | [`@comunica/actor-rdf-metadata-accumulate-pagesize`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-accumulate-pagesize) | Accumulate the `pageSize` field. |\n| Request Time | [`@comunica/actor-rdf-metadata-accumulate-requesttime`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-accumulate-requesttime) | Accumulate the `requestTime` field. 
|\n\n## RDF Metadata Extract\n\n_Package: [`@comunica/bus-rdf-metadata-extract`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-metadata-extract)_\n\nCreate an object with metadata for a given metadata quad stream.\n\nSubscribed actors need to implement [`ActorRdfMetadataExtract`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_metadata_extract.ActorRdfMetadataExtract.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Allow HTTP Methods | [`@comunica/actor-rdf-metadata-extract-allow-http-methods`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-allow-http-methods) | Extract the `Allow` HTTP response header. |\n| Hydra Controls | [`@comunica/actor-rdf-metadata-extract-hydra-controls`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-hydra-controls) | Extract controls using the Hydra vocabulary. |\n| Hydra Count | [`@comunica/actor-rdf-metadata-extract-hydra-count`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-hydra-count) | Extract count estimates using the Hydra vocabulary. |\n| Hydra Page size | [`@comunica/actor-rdf-metadata-extract-hydra-pagesize`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-hydra-pagesize) | Extract page sizes using the Hydra vocabulary. |\n| Patch SPARQL Update | [`@comunica/actor-rdf-metadata-extract-patch-sparql-update`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-patch-sparql-update) | Checks for the presence of `application/sparql-update` in the `Accept-Patch` header. |\n| Put Accepted | [`@comunica/actor-rdf-metadata-extract-put-accepted`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-put-accepted) | Extracts the [`Accept-Put`](https://solidproject.org/TR/protocol#accept-put) HTTP response header. 
|\n| Request Time | [`@comunica/actor-rdf-metadata-extract-request-time`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-request-time) | Extracts the time it took to request the page in milliseconds. |\n| SPARQL Service | [`@comunica/actor-rdf-metadata-extract-sparql-service`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-sparql-service) | Extract SPARQL service description metadata. |\n\n\n## RDF Parse\n\n_Package: [`@comunica/bus-rdf-parse`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-parse)_\n\nParses quads from a serialization format.\n\nSubscribed actors need to implement [`ActorRdfParse`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_parse.ActorRdfParse.html).\n\n### Actors\n\n| Name | Package | Description |\n|----------|----------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------|\n| HTML | [`@comunica/actor-rdf-parse-html`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-html) | Parses HTML documents by delegating to [`@comunica/bus-rdf-parse-html`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-parse-html). |\n| JSON-LD | [`@comunica/actor-rdf-parse-jsonld`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-jsonld) | Parses JSON-LD. |\n| N3 | [`@comunica/actor-rdf-parse-n3`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-n3) | Parses Turtle, Trig, N-triples, or N-Quads. |\n| RDF/XML | [`@comunica/actor-rdf-parse-rdfxml`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-rdfxml) | Parses RDF/XML. 
|\n| XML RDFa | [`@comunica/actor-rdf-parse-xml-rdfa`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-xml-rdfa) | Parses RDFa in XML. |\n| SHACLC | [`@comunica/actor-rdf-parse-shaclc`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-shaclc) | Parses SHACLC. |\n\n\n## RDF Parse HTML\n\n_Package: [`@comunica/bus-rdf-parse-html`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-parse-html)_\n\nParses quads from an HTML document.\n\nSubscribed actors need to implement [`ActorRdfParseHtml`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_parse_html.ActorRdfParseHtml.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| RDFa | [`@comunica/actor-rdf-parse-html-rdfa`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-html-rdfa) | Parses RDFa. |\n| Microdata | [`@comunica/actor-rdf-parse-html-microdata`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-html-microdata) | Parses Microdata to RDF. |\n| Script | [`@comunica/actor-rdf-parse-html-script`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-parse-html-script) | Parses script tags and attempts to parse them by delegating to [`@comunica/bus-rdf-parse`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-parse). 
|\n\n\n## RDF Resolve Hypermedia Links\n\n_Package: [`@comunica/bus-rdf-resolve-hypermedia-links`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-resolve-hypermedia-links)_\n\nDetermines which links should be followed from the metadata of the current source.\n\nSubscribed actors need to implement [`ActorRdfResolveHypermediaLinks`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_resolve_hypermedia_links.ActorRdfResolveHypermediaLinks.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Next | [`@comunica/actor-rdf-resolve-hypermedia-links-next`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-resolve-hypermedia-links-next) | Follow next page links. |\n\n## RDF Resolve Hypermedia Links Queue\n\n_Package: [`@comunica/bus-rdf-resolve-hypermedia-links-queue`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-resolve-hypermedia-links-queue)_\n\nCreates [`ILinkQueue`](https://comunica.github.io/comunica/interfaces/_comunica_bus_rdf_resolve_hypermedia_links_queue.ilinkqueue.html) instances,\nwhich enables different strategies for queueing links.\n\nSubscribed actors need to implement [`ActorRdfResolveHypermediaLinksQueue`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_resolve_hypermedia_links_queue.ActorRdfResolveHypermediaLinksQueue.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| FIFO | [`@comunica/actor-rdf-resolve-hypermedia-links-queue-fifo`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-resolve-hypermedia-links-queue-fifo) | Provides a link queue following the first in, first out strategy |\n\n\n## RDF Serialize\n\n_Package: [`@comunica/bus-rdf-serialize`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-serialize)_\n\nSerializes quads to an RDF serialization format.\n\nSubscribed actors need to implement 
[`ActorRdfSerialize`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_serialize.ActorRdfSerialize.html).\n\n### Actors\n\n| Name | Package | Description |\n|---------|--------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|\n| JSON-LD | [`@comunica/actor-rdf-serialize-jsonld`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-serialize-jsonld) | Serializes to JSON-LD. |\n| N3 | [`@comunica/actor-rdf-serialize-n3`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-serialize-n3) | Serializes to Turtle, Trig, N-triples, or N-Quads. |\n| SHACLC | [`@comunica/actor-rdf-serialize-shaclc`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-serialize-shaclc) | Serializes to SHACLC. |\n\n\n## RDF Update Hypermedia\n\n_Package: [`@comunica/bus-rdf-update-hypermedia`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-update-hypermedia)_\n\nHandles a destination based on the extracted metadata.\n\nSubscribed actors need to implement [`ActorRdfUpdateHypermedia`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_update_hypermedia.ActorRdfUpdateHypermedia.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| SPARQL | [`@comunica/actor-rdf-update-hypermedia-patch-sparql-update`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-update-hypermedia-patch-sparql-update) | The destination is considered an HTTP API accepting `PATCH` requests containing SPARQL Update queries (`application/sparql-update`), such as [Solid servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1). 
|\n\n\n## RDF Update Quads\n\n_Package: [`@comunica/bus-rdf-update-quads`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-update-quads)_\n\nHandles the insertion and deletion of streams of quads for a given destination type.\n\nSubscribed actors need to implement [`ActorRdfUpdateQuads`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_update_quads.ActorRdfUpdateQuads.html)\nor [`ActorRdfUpdateQuadsDestination`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_update_quads.ActorRdfUpdateQuadsDestination.html).\n\n### Actors\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| RDF/JS Store | [`@comunica/actor-rdf-update-quads-rdfjs-store`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-update-quads-rdfjs-store) | The destination is considered an [RDF/JS Store](https://comunica.dev/docs/query/advanced/rdfjs_querying/). |\n| Hypermedia | [`@comunica/actor-rdf-update-quads-hypermedia`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-update-quads-hypermedia) | The destination handles updates by interpreting hypermedia links and controls. 
|\n"},38271:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Components.js\'\ndescription: \'Components.js is the dependency injection framework that Comunica uses to wire components via config files.\'\n---\n\nA direct consequence of the high modularity of Comunica is that it leads to\na high number of **modules that need to be wired together** before they can be used.\n\nComunica makes use of the **dependency injection framework [Components.js](https://componentsjs.readthedocs.io/en/latest/)**\nto take care of this wiring of modules.\nIn essence, Components.js allows you to create [JSON-LD](https://json-ld.org/) configuration files\nin which you _declaratively_ define which components you want to instantiate using what parameters.\nComponents.js can then _read_ these configuration files, and instantiate them as runtime JavaScript objects.\n\nWhile there is [detailed documentation available for Components.js](https://componentsjs.readthedocs.io/en/latest/),\nwe summarize the most important parts for Comunica on this page.\n\n
\n\n## Terminology\n\nBefore you continue reading this guide,\nit is important to understand the three following concepts:\n\n* **Module:** A collection of **components**. _For example, an npm package._\n* **Component:** Something that can be instantiated. _For example, a JavaScript/TypeScript class._\n* **Instance:** An instantiated **component**. _For example, a JavaScript/TypeScript class instance._\n\nFor example, the npm package `@comunica/actor-query-operation-reduced-hash` is a **module**\nthat exposes a single **component** `ActorQueryOperationReducedHash`,\nwhich implements the SPARQL `REDUCED` operator.\nDuring dependency injection, any number of **instances** of the component `ActorQueryOperationReducedHash`\ncan be created, possibly with different parameter values.\n\n## Describing modules in JSON-LD\n\nThe `components/` directory of each package contains JSON-LD representations of the module and its components,\nwhich **describe how components can be instantiated**.\nAs of Comunica version 2.x, the contents of this directory are automatically generated\nusing [Components-Generator.js](https://github.com/LinkedSoftwareDependencies/Components-Generator.js/),\nwhich is invoked when running `yarn run build`.\n\nWhile **these files should never be created or modified manually**,\nsome examples below are shown to explain their most important parts.\n\n`components/components.jsonld`: (_root components file_)\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-reduced-hash/^3.0.0/components/context.jsonld"\n ],\n "@id": "npmd:@comunica/actor-query-operation-reduced-hash",\n "@type": "Module",\n "requireName": "@comunica/actor-query-operation-reduced-hash",\n "import": [\n "caqorh:components/ActorQueryOperationReducedHash.jsonld"\n ]\n}\n```\n\n`components/ActorQueryOperationReducedHash.jsonld` (simplified):\n```json\n{\n "@context": [\n 
"https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-reduced-hash/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/core/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/bus-query-operation/^3.0.0/components/context.jsonld"\n ],\n "@id": "npmd:@comunica/actor-query-operation-reduced-hash",\n "components": [\n {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash",\n "@type": "Class",\n "requireElement": "ActorQueryOperationReducedHash",\n "extends": "cbqo:components/ActorQueryOperationTypedMediated.jsonld#ActorQueryOperationTypedMediated",\n "comment": "A comunica Reduced Hash Query Operation Actor.",\n "parameters": [\n {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args_cacheSize",\n "range": "xsd:integer",\n "default": "100"\n },\n {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args_mediatorQueryOperation",\n "range": "cc:components/Mediator.jsonld#Mediator"\n }\n ],\n "constructorArguments": [\n {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args__constructorArgument",\n "fields": [\n {\n "keyRaw": "cacheSize",\n "value": {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args_cacheSize"\n }\n },\n {\n "keyRaw": "mediatorQueryOperation",\n "value": {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args_mediatorQueryOperation"\n }\n }\n ]\n }\n ]\n }\n ]\n}\n```\n\nThe `import` key allows components to be defined across different files,\nwhere its values internally translate into a local file path.\nFor example, `"caqorh:components/ActorQueryOperationReducedHash.jsonld"`\ncorresponds to the local file 
`components/ActorQueryOperationReducedHash.jsonld`.\n\nThe prefix `caqorh:` identifies the scope of this package.\nInternally, this gives all files a unique URL\nthat makes all modules and components _semantic_ and fully dereferenceable.\nFor example, `"caqorh:components/ActorQueryOperationReducedHash.jsonld"`\nexpands to the URL https://linkedsoftwaredependencies.org/bundles/npm/%40comunica%2Factor-query-operation-reduced-hash/^3.0.0/components/ActorQueryOperationReducedHash.jsonld.\n\n
\nLinked Software Dependencies is a service\nthat exposes all npm packages as JSON-LD,\nwhich forms a key element in Components.js.\n
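To make this concrete, the prefix expansion works like any JSON-LD prefix mapping; a minimal sketch (an illustration only, not the actual Components.js implementation):

```typescript
// Hypothetical prefix table; 'caqorh' is the scope prefix of the example package above.
const prefixes: Record<string, string> = {
  caqorh: 'https://linkedsoftwaredependencies.org/bundles/npm/%40comunica%2Factor-query-operation-reduced-hash/^3.0.0/',
};

// Expand a compact IRI such as 'caqorh:components/X.jsonld' into a full URL.
// Terms without a known prefix (including full URLs) are returned unchanged.
function expandTerm(term: string): string {
  const index = term.indexOf(':');
  if (index < 0) {
    return term;
  }
  const prefix = term.slice(0, index);
  return prefix in prefixes ? prefixes[prefix] + term.slice(index + 1) : term;
}
```

For example, `expandTerm('caqorh:components/ActorQueryOperationReducedHash.jsonld')` yields the dereferenceable URL mentioned above.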
\n\nLearn more in the Components.js documentation on [modules](https://componentsjs.readthedocs.io/en/latest/configuration/modules/)\nand [components](https://componentsjs.readthedocs.io/en/latest/configuration/components/general/).\n\n## Context files\n\nThe so-called context is another file in the `components/` directory that will be automatically generated using\n[Components-Generator.js](https://github.com/LinkedSoftwareDependencies/Components-Generator.js/)\nwhen invoking `yarn run build`.\n\nThis context is needed because\nour components and config files always make use of URLs as identifiers for things (`@id` in JSON-LD).\nSince URLs sometimes can become long, we make use of _JSON-LD context files_\nto **define shortcuts and prefixes for some URLs**.\n\nFor example, the context for our reduced actor (defined in `components/context.jsonld`) could look as follows (simplified):\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/componentsjs/^5.0.0/components/context.jsonld",\n {\n "npmd": "https://linkedsoftwaredependencies.org/bundles/npm/",\n "caqorh": "npmd:@comunica/actor-query-operation-reduced-hash/^2.0.0/",\n "ActorQueryOperationReducedHash": {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash",\n "@prefix": true,\n "@context": {\n "cacheSize": {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args_cacheSize"\n },\n "mediatorQueryOperation": {\n "@id": "caqorh:components/ActorQueryOperationReducedHash.jsonld#ActorQueryOperationReducedHash_args_mediatorQueryOperation"\n }\n }\n }\n }\n ]\n}\n```\n\nThe relevant entries in this file that become reusable are `caqorh`, `ActorQueryOperationReducedHash`, `cacheSize`, and `mediatorQueryOperation`.\nDo note that `cacheSize` and `mediatorQueryOperation` will _only_ be usable within instances of `ActorQueryOperationReducedHash`, i.e., when instantiating `ActorQueryOperationReducedHash` via 
`"@type"`.\n\nIf you want to use these prefixes in any other file,\nthe full URL of this context has to be used in `"@context"`.\nThis URL will always be in the form of `"https://linkedsoftwaredependencies.org/bundles/npm//^.0.0/components/context.jsonld"`.\n\n## Creating configurations in JSON-LD\n\nConfiguration files are used to **instantiate components**.\nWhile modules and components are defined in the `components/` folder,\nwe typically create our config files in `config/`.\nWe also define these as JSON-LD files, with pointers to our components files.\n\nThe instantiation of a Comunica engine could look like this:\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/runner/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-init-query/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-reduced-hash/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-construct/^3.0.0/components/context.jsonld" \n ],\n "@id": "urn:comunica:my",\n "@type": "Runner",\n "actors": [\n {\n "@id": "urn:comunica:default:init/actors#query",\n "@type": "ActorInitQuery"\n },\n {\n "@id": "urn:comunica:default:query-operation/actors#reduced",\n "@type": "ActorQueryOperationReducedHash",\n "mediatorQueryOperation": { "@id": "urn:comunica:default:query-operation/mediators#main" },\n "mediatorHashBindings": { "@id": "urn:comunica:default:hash-bindings/mediators#main" }\n },\n {\n "@id": "urn:comunica:default:query-operation/actors#construct",\n "@type": "ActorQueryOperationConstruct",\n "mediatorQueryOperation": { "@id": "urn:comunica:default:query-operation/mediators#main" }\n }\n ]\n}\n```\n\nLearn more in the full Components.js documentation on [configs](https://componentsjs.readthedocs.io/en/latest/configuration/configurations/semantic/).\n\n## Package.json 
contents\n\nIf you want to expose components or use modular configs in your npm package,\n**you must enable a flag in your `package.json` file so that Components.js can find your npm package**:\n```text\n{\n ...\n "lsd:module": true\n ...\n}\n```\n\nLearn more in the full Components.js documentation on [exposing components](https://componentsjs.readthedocs.io/en/latest/getting_started/basics/exposing_components/).\n\n### More control over Components.js configuration (optional)\n\nWhile this is optional,\nyou can **configure where Components.js can find the required files (components, contexts, configs) in your npm package**.\n\nFor this, you can add the following entries to your `package.json` file:\n```text\n{\n ...\n "lsd:module": "https://linkedsoftwaredependencies.org/bundles/npm/my-package",\n "lsd:components": "components/components.jsonld",\n "lsd:contexts": {\n "https://linkedsoftwaredependencies.org/bundles/npm/my-package/^1.0.0/components/context.jsonld": "components/context.jsonld"\n },\n "lsd:importPaths": {\n "https://linkedsoftwaredependencies.org/bundles/npm/my-package/^1.0.0/components/": "components/",\n "https://linkedsoftwaredependencies.org/bundles/npm/my-package/^1.0.0/config/": "config/"\n }\n ...\n}\n```\n\n_On each line, make sure to replace `my-package` with your package `name`._\n\nThese entries have the following meaning:\n\n* `lsd:module`: The URL that corresponds to your npm package. This will mostly be `https://linkedsoftwaredependencies.org/bundles/npm/` followed by your package name.\n* `lsd:components`: Local path to your root components file. This will mostly be `components/components.jsonld`.\n* `lsd:contexts`: The mapping of context URLs to local context files. This will typically contain only one entry for `components/context.jsonld`, but can be empty. 
This is used by Components.js when looking up contexts to first look in the local file system, to avoid expensive HTTP(S) lookups if the file already exists locally.\n* `lsd:importPaths`: The mapping of component and config files to local files. This will typically contain entries for `components/` and `config/`. This is used by Components.js when looking up components or config imports to first look in the local file system, to avoid expensive HTTP(S) lookups if the file already exists locally.\n'},11434:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Custom CLI arguments'\ndescription: 'Adding custom arguments to CLI tools'\n---\n\nAs explained within the guide to [expose your custom config as an npm package](/docs/modify/getting_started/custom_init/),\ncustom command line tools can be created as follows:\n\n`bin/query.js`:\n```typescript\n#!/usr/bin/env node\nimport { runArgsInProcessStatic } from '@comunica/runner-cli';\nrunArgsInProcessStatic(require('../engine-default.js'));\n```\n\n`bin/http.js`:\n```typescript\n#!/usr/bin/env node\nimport { HttpServiceSparqlEndpoint } from '@comunica/actor-init-query';\nconst defaultConfigPath = `${__dirname}/../config/config-default.json`;\nHttpServiceSparqlEndpoint.runArgsInProcess(process.argv.slice(2), process.stdout, process.stderr, `${__dirname}/../`, process.env, defaultConfigPath, code => process.exit(code))\n .catch(error => process.stderr.write(`${error.message}\\n`));\n```\n\n`bin/query-dynamic.js`:\n```typescript\n#!/usr/bin/env node\nimport { runArgsInProcess } from '@comunica/runner-cli';\nrunArgsInProcess(`${__dirname}/../`, `${__dirname}/../config/config-default.json`);\n```\n\nThis will cause the built-in CLI arguments from `comunica-sparql` to be inherited.\nIt is however also possible to _extend_ these arguments so that you can add additional ones,\nwhich can be processed in any way.\n\n## Creating CLI Arguments Handlers\n\nThis argument handling can be done using one or more instances of 
[`ICliArgsHandler`](https://comunica.github.io/comunica/interfaces/_comunica_actor_init_query.ICliArgsHandler.html),\nwhich may be implemented as follows:\n```typescript\nexport class MyCliArgsHandler implements ICliArgsHandler {\n public populateYargs(argumentsBuilder: Argv): Argv {\n return argumentsBuilder\n .options({\n myOption: {\n alias: 'm',\n type: 'string',\n describe: 'Just some option',\n default: 'A default value',\n },\n });\n }\n\n public async handleArgs(args: Record<string, any>, context: Record<string, any>): Promise<void> {\n context['this-is-a-context-key'] = args.myOption;\n }\n}\n```\n\nThe `populateYargs` method allows you to declare options within the `argumentsBuilder` using the [yargs API](https://www.npmjs.com/package/yargs).\nThen, `handleArgs` is invoked after the CLI tool has been invoked with some options,\nso that you can extract the defined option, and modify the [query context](/docs/query/advanced/context/) if needed (which is still mutable at this stage).\n\n## Passing CLI Arguments Handlers\n\nThen, in order to pass your instances of `ICliArgsHandler` to the CLI tools,\nyou can do this as follows:\n\n`bin/query.js`:\n```typescript\n#!/usr/bin/env node\nimport { runArgsInProcessStatic } from \"@comunica/runner-cli\";\nimport { KeysInitSparql } from '@comunica/context-entries';\nimport { ActionContext } from '@comunica/core';\nrunArgsInProcessStatic(require('../engine-default.js'), {\n context: ActionContext({\n [KeysInitSparql.cliArgsHandlers]: [ new MyCliArgsHandler() ],\n }),\n});\n```\n\n`bin/http.js`:\n```typescript\n#!/usr/bin/env node\nimport {HttpServiceSparqlEndpoint} from \"@comunica/actor-init-query\";\nHttpServiceSparqlEndpoint.runArgsInProcess(process.argv.slice(2), process.stdout, process.stderr,\n __dirname + '/../', process.env, __dirname + '/../config/config-default.json', () => process.exit(1), [ new MyCliArgsHandler() ]);\n```\n\n`bin/query-dynamic.js`:\n```typescript\n#!/usr/bin/env node\nimport { runArgsInProcess } from 
\"@comunica/runner-cli\";\nimport { KeysInitSparql } from '@comunica/context-entries';\nimport { ActionContext } from '@comunica/core';\nrunArgsInProcess(__dirname + '/../', __dirname + '/../config/config-default.json', {\n context: ActionContext({\n [KeysInitSparql.cliArgsHandlers]: [ new MyCliArgsHandler() ],\n }),\n});\n```\n"},66425:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Expression Evaluator'\ndescription: 'The expression evaluation engine of Comunica.'\n---\n\nThe expression evaluator package of Comunica is used by different Comunica actors for evaluating expressions.\n\nConcretely, the following actors make use of this:\n* [`@comunica/actor-query-operation-extend`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-extend): Implements the extent operator.\n* [`@comunica/actor-query-operation-filter-sparqlee`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-filter-sparqlee): Implements the filter operator.\n* [`@comunica/actor-query-operation-group`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-group): Implements the group operator.\n* [`@comunica/actor-query-operation-leftjoin`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-leftjoin): Implements the left join operator.\n* [`@comunica/actor-query-operation-orderby-sparqlee`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-extend): Implements the order by operator.\n\n## Using The Expression Evaluator\n\n```ts\nimport { translate } from \"sparqlalgebrajs\";\nimport { stringToTerm } from \"rdf-string\";\n\n// An example SPARQL query with an expression in a FILTER statement.\n// We translate it to SPARQL Algebra format ...\nconst query = translate(`\n SELECT * WHERE {\n ?s ?p ?o\n FILTER langMatches(lang(?o), \"FR\")\n }\n`);\n\n// ... 
and get the part corresponding to \"langMatches(...)\".\nconst expression = query.input.expression;\n\n// We create an evaluator for this expression.\n// A sync version exists as well.\nconst evaluator = new AsyncEvaluator(expression);\n\n// We can now evaluate some bindings as a term, ...\nconst result: RDF.Term = await evaluator.evaluate(\n Bindings({\n ...\n '?o': stringToTerm(\"\\\"Ceci n'est pas une pipe\\\"@fr\"),\n ...\n })\n);\n\n// ... or as an Effective Boolean Value (e.g. for use in FILTER)\nconst ebvResult: boolean = await evaluator.evaluateAsEBV(bindings);\n```\n\n## Config\n\nThe expression evaluator accepts an optional config argument that is not required for simple use cases,\nbut for feature completeness and spec compliance it should receive `now`, `baseIRI`, `exists`, `aggregate`, and `bnode`.\n\nFor the extended date functionality (see later), an additional context item has been added: `implicitTimezone`.\nThe choice was made to default to the timezone `now` has.\nIt can be desired to set it explicitly so `implicitTimezone` does not change over time (i.e., it is not dependent on daylight saving time).\n\n```ts\ninterface AsyncEvaluatorContext {\n now?: Date;\n baseIRI?: string;\n\n exists?: (expression: Alg.ExistenceExpression, mapping: Bindings) => Promise<boolean>;\n aggregate?: (expression: Alg.AggregateExpression) => Promise<RDF.Term>;\n bnode?: (input?: string) => Promise<RDF.BlankNode>;\n extensionFunctionCreator?: (functionNamedNode: RDF.NamedNode) => ((args: RDF.Term[]) => Promise<RDF.Term>) | undefined;\n overloadCache?: LRUCache;\n typeCache?: LRUCache;\n getSuperType?: (unknownType: string) => string;\n implicitTimezone?: { zoneHours: number; zoneMinutes: number; };\n}\n```\n\n## Errors\n\nThis package exports an Error class called `ExpressionError` from which all SPARQL-related errors inherit.\nThese might include unbound variables, wrong types, invalid lexical forms, and much more.\nThese errors can be caught, and may impact program execution in an expected way.\nAll other errors are unexpected, 
and are thus programmer mistakes or mistakes in this package.\n\nThere is also the utility function `isExpressionError` for detecting these cases.\n\n```ts\n// Make sure to catch errors if you don't control binding input\ntry {\n const result = await evaluator.evaluate(bindings);\n consumeResult(result);\n} catch (error) {\n if (isExpressionError(error)) {\n console.log(error); // SPARQL-related errors\n ... // Move on, ignore result, ...\n } else {\n throw error; // Programming errors or missing features.\n }\n}\n```\n\n## Exists\n\n'Exists' operations are an annoying problem to tackle in the context of an expression evaluator,\nsince they make the operation stateful and context-dependent.\nThey might span entire streams and, depending on the use case, have very different requirements for speed and memory consumption.\nThis package has therefore decided to delegate this responsibility back to you.\n\nYou can, if you want, pass hooks to the evaluators of the shape:\n\n```ts\nexists?: (expression: Alg.ExistenceExpression, mapping: Bindings) => Promise<boolean>;\n```\n\nIf this package encounters any existence expression, it will call this hook with the relevant information, so you can resolve it yourself.\nIf these hooks are not present, but an existence expression is encountered, then an error is thrown.\n\nAn example consumer/hook can be found in [Comunica](https://github.com/comunica/comunica/blob/master/packages/actor-query-operation-filter-sparqlee/lib/ActorQueryOperationFilterSparqlee.ts).\n\n## Aggregates\n\nAn `AggregateEvaluator` to which you can pass the individual bindings in the stream, and ask the aggregated result back, is provided.\nIt uses the internal type system for operations such as `sum` and `avg`.\n\n```ts\nconst stream = [bindings1, bindings2, bindings3];\n\nif (stream.length === 0) {\n return AggregateEvaluator.emptyValue(aggregateExpression);\n} else {\n const evaluator = new AggregateEvaluator(aggregateExpression, stream[0]);\n 
stream.slice(1).forEach((bindings) => evaluator.put(bindings));\n return evaluator.result();\n}\n```\n\nWe have not found any SPARQL Algebra for which this occurs,\nbut if we happen to find any aggregate expressions nested in the expression (or even at the top level),\nwe will call (similarly to EXISTS) an aggregate hook you might have provided.\n\n```ts\naggregate?: (expression: Alg.AggregateExpression) => Promise<RDF.Term>;\n```\n\nYou can probably ignore this.\n\nWe also provide an `AsyncAggregateEvaluator` that works the same way as `AggregateEvaluator`.\nOnly the signature of the `put` method changes to be async. It is up to you to handle this correctly.\nYou are for example expected to await all puts before you ask for `result`.\nYou should also note the order of calling and awaiting `put` while using the `GroupConcat` aggregator.\n\n## Extension functions\n\nThis section explains how to pass extension functions to the evaluator.\nYou don't need to do this directly. If you want to provide extension functions to a\nComunica engine, follow the [extension function docs](https://comunica.dev/docs/query/advanced/extension_functions/).\n\nExtension functions can be added by providing the `extensionFunctionCreator` in the config.\nExample:\n```ts\nconfig.extensionFunctionCreator = (functionNamedNode: RDF.NamedNode) => {\n if (functionNamedNode.value === 'https://example.org/functions#equal') {\n return async (args: RDF.Term[]) => {\n return literal(String(args[0].equals(args[1])), 'http://www.w3.org/2001/XMLSchema#boolean'); \n }\n }\n}\n```\n\n## Overload function caching\n\nA `functionArgumentsCache` allows the evaluator to cache the implementation of a function based on the argument types.\nWhen not providing a cache in the context, the evaluator will create one.\n\nThis cache can be reused across multiple evaluators. 
Manual modification is not recommended.\n\n## Context-dependent functions\n\nSome functions (BNODE, NOW, IRI) need a (stateful) context from the caller to function correctly according to the spec.\nThis context can be passed as an argument to the evaluator (see the [config section](#config) for exact types).\nIf they are not passed, the evaluator will use a naive implementation that might do the trick for simple use cases.\n\n### BNODE\n\n[spec](https://www.w3.org/TR/sparql11-query/#func-bnode)\n\nBlank nodes are very dependent on the rest of the SPARQL query; therefore,\nwe provide the option of delegating the entire responsibility back to you by accepting a blank node constructor callback.\nIf no callback is provided, we create a blank node with the given label,\nor we use UUID (v4) for argument-less calls to generate unique blank nodes of the shape `blank_uuid`.\n\n`bnode(input?: string) => RDF.BlankNode`\n\n### Now\n\n[spec](https://www.w3.org/TR/sparql11-query/#func-now)\n\nAll calls to NOW in a query must return the same value. Since we aren't aware of the rest of the query,\nyou can provide a timestamp (`now: Date`). 
If it is not present, the evaluator will use the timestamp of evaluator creation;\nthis at least allows evaluation with multiple bindings to have the same `now` value.\n\n### IRI\n\n[spec](https://www.w3.org/TR/sparql11-query/#func-iri)\n\nTo be fully spec compliant, the IRI/URI functions should take into account the base IRI of the query,\nwhich you can provide as `baseIRI: string` to the config.\n\n## SPARQL 1.2\n\nThe expression evaluator package already implements some functions from the SPARQL 1.2 specification.\n\nCurrently, this is restricted to the [extended date](https://github.com/w3c/sparql-12/blob/main/SEP/SEP-0002/sep-0002.md) functionality.\nPlease note that the new SPARQL built-in `ADJUST` function has not been implemented due to package dependencies.\n\n## Type System\n\nThe type system of the expression evaluator is tailored for (supposedly) quick evaluation of overloaded functions.\n\nA function definition object consists of a tree-like structure with a type (e.g. `xsd:float`) at each internal node.\nEach level of the tree represents an argument of the function\n(e.g. a function with arity two has a tree of depth two).\nThe leaves contain a function implementation matching the concrete types defined by the path of the tree.\n\nWhen a function is called with some arguments, a depth-first search is performed in the tree\nto find an implementation among all overloads matching the types of the arguments.\n\n**[Subtype substitution](https://www.w3.org/TR/xpath-31/#dt-subtype-substitution)** is handled for literal terms.\nThis means that when a function accepts a type for an argument, it also accepts all subtypes of that type for that argument.\nThese sub/super-type relations define the following type tree:\n\n
\n_(Figure: the sub/super-type tree of the XSD datatypes.)_\n
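As an illustration of how subtype substitution can be checked against such a tree, here is a minimal sketch with a hypothetical, heavily truncated fragment of the relations (not the evaluator's actual data structure):

```typescript
// Hypothetical fragment of the sub/super-type relations: child -> direct super-type.
const superTypes: Record<string, string> = {
  'xsd:long': 'xsd:integer',
  'xsd:int': 'xsd:long',
  'xsd:integer': 'xsd:decimal',
};

// A term of type `actual` may substitute an argument expecting type `expected`
// if `expected` is reachable by walking up the super-type chain from `actual`.
function canSubstitute(actual: string, expected: string): boolean {
  let current: string | undefined = actual;
  while (current !== undefined) {
    if (current === expected) {
      return true;
    }
    current = superTypes[current];
  }
  return false;
}
```

With this fragment, `canSubstitute('xsd:long', 'xsd:integer')` holds while the reverse does not, matching the substitution behavior for literal terms.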
\n\nSo, when expecting an argument of type `xsd:integer` we could provide `xsd:long` instead and the\nfunction call would still succeed. The type of the term does not change in this operation.\n\nThe expression evaluator also handles **[type promotion](https://www.w3.org/TR/xpath-31/#promotion)**.\nType promotion defines some rules where a type can be promoted to another, even if there is no super-type relation.\nExamples include `xsd:float` and `xsd:decimal` to `xsd:double`, and `xsd:anyURI` to `xsd:string`.\nIn this case, the datatype of the term will change to the type it is promoted to.\n\n"},76240:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Hypermedia\'\ndescription: \'Discovery of data source capabilities during query execution.\'\n---\n\nComunica enables **[hypermedia](https://en.wikipedia.org/wiki/HATEOAS)-driven query execution**.\nThis allows users to provide data sources by URL,\nand Comunica will automatically detect the querying capabilities of this source\nto determine an efficient query execution plan.\n\nThis strategy means that when providing a link to a SPARQL endpoint (e.g. https://dbpedia.org/sparql),\ncommunication will be done using SPARQL queries,\nwhile when providing a link to a plain RDF file (e.g. http://ruben.verborgh.org/profile/),\nthe whole file will be downloaded and queried in-memory.\n\n
\nThis page only describes the handling of hypermedia for read queries.\nThe handling of hypermedia for update queries happens in a very similar manner,\nwith the main difference that the RDF Resolve Hypermedia bus\nis replaced by the RDF Update Hypermedia bus.\n
\n\n## Hypermedia actor\n\nThe actor in Comunica that drives hypermedia handling is\n[`@comunica/actor-query-source-identify-hypermedia`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-resolve-quad-pattern-hypermedia).\nThis actor is registered to the [Query Source Identify bus](/docs/modify/advanced/buses/#query-source-identify).\nThis actor will be invoked once for each query during context preprocessing,\nand the identified source will be assigned to suboperations within the query.\n\n
\nThe SPARQL architecture\nshows how this hypermedia actor relates to all other actors and buses.\n
\n\n## Steps for handling hypermedia\n\nFor each URL-based data source,\nthe hypermedia actor will always go through the following steps:\n\n1. Dereference RDF ([Dereference RDF bus](/docs/modify/advanced/buses/#dereference-rdf))\n2. Split data and metadata streams ([RDF Metadata bus](/docs/modify/advanced/buses/#rdf-metadata))\n3. Extract metadata as object ([RDF Metadata Extract bus](/docs/modify/advanced/buses/#rdf-metadata-extract))\n4. Determine links to other sources ([RDF Resolve Hypermedia Links bus](/docs/modify/advanced/buses/#rdf-resolve-hypermedia-links))\n5. Create a queue for managing links ([RDF Resolve Hypermedia Links Queue bus](/docs/modify/advanced/buses/#rdf-resolve-hypermedia-links-queue))\n6. Handle source based on metadata ([Query Source Identify Hypermedia bus](/docs/modify/advanced/buses/#rdf-resolve-hypermedia))\n\nHereafter, we go over these six steps using three example sources:\n\n1. https://dbpedia.org/sparql\n2. http://fragments.dbpedia.org/2016-04/en\n3. https://ruben.verborgh.org/profile/\n\n## 1. Dereference RDF\n\nAn HTTP(S) request is done to retrieve the RDF data at the given location\nvia [content negotiation](https://developer.mozilla.org/en-US/docs/Web/HTTP/Content_negotiation).\nDifferent ways of doing this may exist in the [Dereference RDF bus](/docs/modify/advanced/buses/#dereference-rdf).\nConcretely, the input is a URL, and the output is a stream of parsed RDF triples/quads.\n\nFor example:\n\n1. https://dbpedia.org/sparql\n```turtle\nns1:sparql rdf:type sd:Service ;\n sd:endpoint ns1:sparql ;\n sd:feature sd:UnionDefaultGraph ,\n sd:DereferencesURIs .\n@prefix ns3: .\nns1:sparql sd:resultFormat ns3:SPARQL_Results_JSON ,\n ns3:SPARQL_Results_XML ,\n ns3:Turtle ,\n ns3:N-Triples ,\n ns3:N3 ,\n ns3:RDF_XML ,\n ns3:SPARQL_Results_CSV ,\n ns3:RDFa ;\n sd:supportedLanguage sd:SPARQL10Query ;\n sd:url ns1:sparql .\n```\n2. 
http://fragments.dbpedia.org/2016-04/en\n```turtle\n hydra:member .\n a void:Dataset, hydra:Collection;\n void:subset ;\n hydra:search _:triplePattern.\n_:triplePattern hydra:template "https://fragments.dbpedia.org/2016-04/en{?subject,predicate,object}";\n hydra:variableRepresentation hydra:ExplicitRepresentation;\n hydra:mapping _:subject, _:predicate, _:object.\n_:subject hydra:variable "subject";\n hydra:property rdf:subject.\n_:predicate hydra:variable "predicate";\n hydra:property rdf:predicate.\n_:object hydra:variable "object";\n hydra:property rdf:object.\n void:subset ;\n a hydra:PartialCollectionView;\n dcterms:title "Linked Data Fragment of DBpedia 2016-04"@en;\n dcterms:description "Triple Pattern Fragment of the \'DBpedia 2016-04\' dataset containing triples matching the pattern { ?s ?p ?o }."@en;\n dcterms:source ;\n hydra:totalItems "1040358853"^^xsd:integer;\n void:triples "1040358853"^^xsd:integer;\n hydra:itemsPerPage "100"^^xsd:integer;\n hydra:first ;\n hydra:next .\n dbpprop:date "1899-05-06"^^xsd:date;\n dbpprop:isCitedBy ;\n dbpprop:newspaper "Biloxi Daily Herald";\n dbpprop:page "6";\n dbpprop:title "A New System of Weights and Measures";\n dbpprop:url .\n...\n```\n3. https://ruben.verborgh.org/profile/\n```turtle\n\n a foaf:Document, foaf:PersonalProfileDocument;\n rdfs:label "Ruben Verborgh’s FOAF profile"@en;\n foaf:maker :me;\n foaf:primaryTopic :me.\n:me a foaf:Person;\n foaf:name "Ruben Verborgh"@en, "Ruben Verborgh"@nl;\n rdfs:label "Ruben Verborgh"@en, "Ruben Verborgh"@nl;\n vcard:fn "Ruben Verborgh"@en, "Ruben Verborgh"@nl;\n con:preferredURI "https://ruben.verborgh.org/profile/#me";\n foaf:givenName "Ruben"@en, "Ruben"@nl;\n foaf:familyName "Verborgh"@en, "Verborgh"@nl;\n...\n```\n\n## 2. 
Split data and metadata streams\n\nSome RDF sources may include metadata inside the document,\nsuch as [Triple Pattern Fragments](https://linkeddatafragments.org/specification/triple-pattern-fragments/).\nAs such, there needs to be a way to distinguish between data and metadata triples,\nfor which different strategies exist in the [RDF Metadata bus](/docs/modify/advanced/buses/#rdf-metadata).\n\n
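One such strategy could be sketched as follows. This is a naive illustration, not the logic of the actual bus actors: it classifies quads by predicate vocabulary, treating a few well-known metadata vocabularies (an assumption for this sketch) as metadata.

```typescript
// Naive illustration of a data/metadata split (NOT the actual actors' logic):
// quads whose predicate belongs to a known metadata vocabulary are treated
// as metadata; all remaining quads are treated as data.
interface Quad { subject: string; predicate: string; object: string; }

// Prefixes assumed to indicate metadata; purely illustrative.
const METADATA_PREFIXES = ['hydra:', 'void:', 'dcterms:', 'sd:'];

function splitStreams(quads: Quad[]): { data: Quad[]; metadata: Quad[] } {
  const isMetadata = (q: Quad): boolean =>
    METADATA_PREFIXES.some(prefix => q.predicate.startsWith(prefix));
  return {
    data: quads.filter(q => !isMetadata(q)),
    metadata: quads.filter(isMetadata),
  };
}
```

In the Triple Pattern Fragments example below, such a split would route the `hydra:`/`void:` triples to the metadata stream and the `dbpprop:` triples to the data stream.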
\n\nFor example:\n\n1. https://dbpedia.org/sparql\n\n**Data:** _empty_\n\n**Metadata:**\n```turtle\nns1:sparql rdf:type sd:Service ;\n sd:endpoint ns1:sparql ;\n sd:feature sd:UnionDefaultGraph ,\n sd:DereferencesURIs .\n@prefix ns3: .\nns1:sparql sd:resultFormat ns3:SPARQL_Results_JSON ,\n ns3:SPARQL_Results_XML ,\n ns3:Turtle ,\n ns3:N-Triples ,\n ns3:N3 ,\n ns3:RDF_XML ,\n ns3:SPARQL_Results_CSV ,\n ns3:RDFa ;\n sd:supportedLanguage sd:SPARQL10Query ;\n sd:url ns1:sparql .\n```\n2. http://fragments.dbpedia.org/2016-04/en\n\n**Data:**\n```turtle\n dbpprop:date "1899-05-06"^^xsd:date;\n dbpprop:isCitedBy ;\n dbpprop:newspaper "Biloxi Daily Herald";\n dbpprop:page "6";\n dbpprop:title "A New System of Weights and Measures";\n dbpprop:url .\n...\n```\n**Metadata:**\n```turtle\n hydra:member .\n a void:Dataset, hydra:Collection;\n void:subset ;\n hydra:search _:triplePattern.\n_:triplePattern hydra:template "https://fragments.dbpedia.org/2016-04/en{?subject,predicate,object}";\n hydra:variableRepresentation hydra:ExplicitRepresentation;\n hydra:mapping _:subject, _:predicate, _:object.\n_:subject hydra:variable "subject";\n hydra:property rdf:subject.\n_:predicate hydra:variable "predicate";\n hydra:property rdf:predicate.\n_:object hydra:variable "object";\n hydra:property rdf:object.\n void:subset ;\n a hydra:PartialCollectionView;\n dcterms:title "Linked Data Fragment of DBpedia 2016-04"@en;\n dcterms:description "Triple Pattern Fragment of the \'DBpedia 2016-04\' dataset containing triples matching the pattern { ?s ?p ?o }."@en;\n dcterms:source ;\n hydra:totalItems "1040358853"^^xsd:integer;\n void:triples "1040358853"^^xsd:integer;\n hydra:itemsPerPage "100"^^xsd:integer;\n hydra:first ;\n hydra:next .\n```\n\n3. 
https://ruben.verborgh.org/profile/\n\n**Data:**\n```turtle\n\n a foaf:Document, foaf:PersonalProfileDocument;\n rdfs:label "Ruben Verborgh’s FOAF profile"@en;\n foaf:maker :me;\n foaf:primaryTopic :me.\n:me a foaf:Person;\n foaf:name "Ruben Verborgh"@en, "Ruben Verborgh"@nl;\n rdfs:label "Ruben Verborgh"@en, "Ruben Verborgh"@nl;\n vcard:fn "Ruben Verborgh"@en, "Ruben Verborgh"@nl;\n con:preferredURI "https://ruben.verborgh.org/profile/#me";\n foaf:givenName "Ruben"@en, "Ruben"@nl;\n foaf:familyName "Verborgh"@en, "Verborgh"@nl;\n...\n```\n\n**Metadata:** _empty_\n\n## 3. Extract metadata as object\n\nUsing actors on the [RDF Metadata Extract bus](/docs/modify/advanced/buses/#rdf-metadata-extract),\nrelevant parts of the metadata stream are identified,\nand a convenient metadata object is constructed for later use.\n\nFor example:\n\n1. https://dbpedia.org/sparql\n```json\n{\n "sparqlService": "https://dbpedia.org/sparql"\n}\n```\n2. http://fragments.dbpedia.org/2016-04/en\n```json\n{\n "first": "https://fragments.dbpedia.org/2016-04/en?page=1",\n "next": "https://fragments.dbpedia.org/2016-04/en?page=2",\n "searchForms": {\n "values": [\n {\n "mappings": {\n "subject": "http://www.w3.org/1999/02/22-rdf-syntax-ns#subject",\n "predicate": "http://www.w3.org/1999/02/22-rdf-syntax-ns#predicate",\n "object": "http://www.w3.org/1999/02/22-rdf-syntax-ns#object"\n },\n "template": "https://fragments.dbpedia.org/2016-04/en{?subject,predicate,object}"\n }\n ]\n },\n "cardinality": { "type": "estimate", "value": 1040358853, "dataset": "https://fragments.dbpedia.org/2016-04/en" }\n}\n```\n3. https://ruben.verborgh.org/profile/\n```json\n{}\n```\n\n## 4. Determine links to other sources\n\nBased on the detected metadata, links are extracted that can optionally be followed.\nThese links are determined using actors on the [RDF Resolve Hypermedia Links bus](/docs/modify/advanced/buses/#rdf-resolve-hypermedia-links).\n\nFor example:\n\n1. https://dbpedia.org/sparql: _None_\n2. 
http://fragments.dbpedia.org/2016-04/en: https://fragments.dbpedia.org/2016-04/en?page=2\n3. https://ruben.verborgh.org/profile/: _None_\n\n## 5. Create a queue for managing links\n\nUsing the [RDF Resolve Hypermedia Links Queue bus](/docs/modify/advanced/buses/#rdf-resolve-hypermedia-links-queue),\nan [`ILinkQueue`](https://comunica.github.io/comunica/interfaces/_comunica_bus_rdf_resolve_hypermedia_links_queue.ilinkqueue.html) instance is created\nthat determines the order in which links are processed.\n\nBy default, this will be a queue that processes links in FIFO order.\n\n## 6. Handle source based on metadata\n\nFinally, the [Query Source Identify Hypermedia bus](/docs/modify/advanced/buses/#rdf-resolve-hypermedia)\ncontains actors that can handle sources based on the extracted metadata.\n\nConcretely, the detected metadata will be given to each actor on the bus,\nand the actor that offers the best _filtering capabilities_ for it\nwill be allowed to handle it.\n\nFor example:\n\n1. https://dbpedia.org/sparql: SPARQL query to https://dbpedia.org/sparql\n2. http://fragments.dbpedia.org/2016-04/en: Fill in `https://fragments.dbpedia.org/2016-04/en{?subject,predicate,object}`, and follow all subsequent next-page links.\n3. https://ruben.verborgh.org/profile/: No hypermedia, so fall back to querying over all triples in the returned data stream.\n\n
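The default FIFO behaviour of step 5 could be sketched as follows. This is a simplified sketch: the real `ILinkQueue` interface carries more information per link, and the class name here is hypothetical.

```typescript
// Minimal FIFO link queue, mirroring the default link-processing order.
interface Link { url: string; }

class LinkQueueFifo {
  private readonly links: Link[] = [];

  // Append a newly discovered link at the back of the queue.
  public push(link: Link): boolean {
    this.links.push(link);
    return true;
  }

  // Take the oldest link from the front of the queue.
  public pop(): Link | undefined {
    return this.links.shift();
  }

  public isEmpty(): boolean {
    return this.links.length === 0;
  }
}
```

Other actors on the same bus can supply different orderings (e.g. priority-based) without changing the rest of the pipeline.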
\nIf multiple links are being followed, the metadata object corresponding to the current quad pattern will be\nincrementally updated after each followed link.\nThis is done using the rdf-metadata-accumulate bus, which has dedicated actors that determine how specific\nmetadata fields are merged together.\n
\n'},66705:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Joins\'\ndescription: \'Overview of how join operations are handled during query planning\'\n---\n\nJoin operations form a critical part of most query engines.\nThe _order_ in which operations are joined, and the _algorithms_ that are used to execute those joins,\ndetermine in large part the overall efficiency of query execution.\nDetermining this order and selecting the join algorithms are both part of _query planning_.\n\n## Adaptive query planning\n\nWhile most query engines perform query planning _before_ query execution,\nComunica does (part of) its query planning _during_ query execution,\nwhich makes it an _adaptive_ query engine.\nThis is because Comunica aims to query over remote data sources,\nwhich makes it difficult to determine the optimal query plan ahead of query execution.\nInstead, the choices for query planning are taken as soon as they are required\nand the relevant information about the sources is available.\n\n## What is a join\n\nSPARQL queries typically consist of many joins.\nFor example, the following SPARQL query requires two triple patterns to be joined:\n\n```text\nSELECT * WHERE {\n ?s ?link.\n ?link ?o.\n}\n```\n\nA query engine can represent this as two join entries that each can produce bindings:\n\n- Join entry 1 with bindings for variables `?s` and `?link`\n- Join entry 2 with bindings for variables `?link` and `?o`\n\nThe join of these two entries will result in a new intermediary operation that produces bindings for the variables `?s`, `?link`, and `?o`.\nThe bindings in this intermediary operation will contain all existing combinations of these variables based on the two underlying join entries. 
\n\nFor example, we assume the following bindings for the two join entries:\n\n```text\njoin entry 1:\n { s: "ex:s1", link: "ex:link1" }\n { s: "ex:s2", link: "ex:link2" }\n { s: "ex:s3", link: "ex:link3" }\n\njoin entry 2:\n { link: "ex:link1", o: "ex:o1" }\n { link: "ex:link1", o: "ex:o2" }\n { link: "ex:link3", o: "ex:o3" }\n```\n\nIf we determine the possible combinations of these join entries following the _inner join_ semantics,\nthen we will obtain the following bindings:\n\n```text\njoined bindings:\n { s: "ex:s1", link: "ex:link1", o: "ex:o1" }\n { s: "ex:s1", link: "ex:link1", o: "ex:o2" }\n { s: "ex:s3", link: "ex:link3", o: "ex:o3" }\n```\n\nNote that the second binding of the first join entry does not appear in the final results,\nbecause the value for `?link` (`"ex:link2"`) does not exist in the second join entry\'s bindings.\n\n## Logical and physical joins\n\nA _logical join_ type indicates the semantics of a join operation,\nand is under the control of the query writer.\nThe example above explains how the so-called **inner join** works,\nwhich is the most common logical join within SPARQL queries.\n\nThere are however also two other logical join types that can occur within SPARQL queries:\n\n- **Optional join** (or _left join_): a join with two entries where all bindings from the left entry are matched with the bindings from the right entry. 
If no matching bindings are found in the right entry, undefined values are used for those.\n- **Minus join** (or _anti join_): a join with two entries where all bindings from the left entry are returned that have no corresponding bindings in the right entry.\n\nEach logical join can be implemented via different _physical join_ algorithms.\nThe selection of these algorithms is usually done internally within query engines during query planning,\nand is therefore not under the control of the query writer.\n\nFor example, two popular algorithms for the inner join are the nested-loop-join and hash-join algorithms,\nwhere the former is based on a nested for-loop, and the latter makes use of a hash-dictionary to achieve a lower computational complexity.\n\n## Join actors\n\nThe [`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join) bus in Comunica accepts join actions,\nwhere each action determines the entries that require joining, and the logical join that is to be used.\nFor example, this bus will be invoked for the inner-join type when more than one operation (e.g. 
triple pattern) occurs in the query.\n\nCurrently, the following join actors are available in Comunica:\n\n- **Inner join**\n - [`@comunica/actor-rdf-join-inner-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-hash): Hash join of two entries.\n - [`@comunica/actor-rdf-join-inner-nestedloop`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-nestedloop): Nested loop join of two entries.\n - [`@comunica/actor-rdf-join-inner-none`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-none): Join between zero entries, and returns a single binding.\n - [`@comunica/actor-rdf-join-inner-single`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-single): Join of a single entry, and returns the entry itself.\n - [`@comunica/actor-rdf-join-inner-symmetrichash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-symmetrichash): Symmetric hash join of two entries.\n - [`@comunica/actor-rdf-join-inner-multi-empty`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-empty): Multi-join that accepts any number of inner-join entries of which at least one is empty and returns an empty stream.\n - [`@comunica/actor-rdf-join-inner-multi-bind`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-bind): Multi-join that inner-joins 2 or more streams by picking the one with the lowest cardinality, binding each item with the remaining operations, and recursively resolving those operations by delegating to [`@comunica/bus-query-operation`](https://github.com/comunica/comunica/tree/master/packages/bus-query-operation).\n - [`@comunica/actor-rdf-join-inner-multi-sequential`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-sequential): Multi-join by just picking the two of them hierarchically.\n - 
[`@comunica/actor-rdf-join-inner-multi-smallest`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-smallest): Multi-join by always picking the first two streams with the smallest estimated cardinality.\n- **Optional join**\n - [`@comunica/actor-rdf-join-optional-bind`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-optional-bind): Join 2 streams using the bind join algorithm. It binds each item of the first stream with the second operation, and recursively resolves that operation by delegating to [`@comunica/bus-query-operation`](https://github.com/comunica/comunica/tree/master/packages/bus-query-operation).\n - [`@comunica/actor-rdf-join-optional-nestedloop`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-optional-nestedloop): Join 2 streams using the nested loop join algorithm.\n- **Minus join**\n - [`@comunica/actor-rdf-join-minus-hash`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-minus-hash): Join 2 streams using the hash join algorithm. This actor does _not_ support streams that can have undefined values.\n - [`@comunica/actor-rdf-join-minus-hash-undef`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-minus-hash-undef): Join 2 streams using the hash join algorithm. This actor supports streams that can have undefined values.\n\n## Selecting physical joins\n\n
\nActor selection in Comunica is done using mediators.\nLearn more about mediators in the core architecture.\n
\n\nThe [Join Coefficients Mediator](https://github.com/comunica/comunica/tree/master/packages/mediator-join-coefficients-fixed) is a mediator that will select the "optimal" join actor based on their join coefficients (cost estimates).\nEach join actor can calculate their join coefficients based on metadata that is provided by data sources.\n\nThe available join coefficients that are calculated by each join actor are:\n\n- `iterations`: An estimation of how many iterations over items are executed. This is used to determine the CPU cost.\n- `persistedItems`: An estimation of how many items are stored in memory. This is used to determine the memory cost.\n- `blockingItems`: An estimation of how many items block the stream. This is used to determine the time the stream is not progressing anymore.\n- `requestTime`: An estimation of the time to request items from sources. This is used to determine the I/O cost.\n\nThe Join Coefficients Mediator\ncan be configured with weights to calculate an overall cost based on these join coefficients,\nafter which the actor with the lowest overall cost will be allowed to execute the action.\n\n
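Such a weighted cost calculation could be sketched as follows. This is an illustrative sketch: the field names follow the coefficients listed above, but the function and type names are hypothetical and the mediator's actual implementation may differ.

```typescript
// Join coefficients as reported by a join actor for a given action.
interface JoinCoefficients {
  iterations: number;     // CPU cost
  persistedItems: number; // memory cost
  blockingItems: number;  // time the stream is not progressing
  requestTime: number;    // I/O cost
}

// One configurable weight per coefficient.
type Weights = JoinCoefficients;

// Overall cost: the weighted sum of all coefficients.
function overallCost(c: JoinCoefficients, w: Weights): number {
  return c.iterations * w.iterations +
    c.persistedItems * w.persistedItems +
    c.blockingItems * w.blockingItems +
    c.requestTime * w.requestTime;
}

// The mediator picks the candidate actor with the lowest overall cost.
function selectActor(
  candidates: { name: string; coefficients: JoinCoefficients }[],
  weights: Weights,
): string {
  return candidates.reduce((best, cur) =>
    overallCost(cur.coefficients, weights) < overallCost(best.coefficients, weights) ? cur : best).name;
}
```

Raising the weight of `blockingItems` relative to the others makes the mediator favour non-blocking algorithms, as worked out numerically in the example below.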
\nIf you want to inspect or debug the chosen physical joins,\nyou can use the explain functionality,\nor make use of the logger.\n
\n\n### Physical join selection example\n\nWe assume two join entries with the following cardinalities (a.k.a. estimated number of bindings):\n\n- Join entry 1: 10\n- Join entry 2: 1.000\n\nAssuming the availability of the nested-loop-join and hash-join actors,\nthese will calculate the join coefficients as follows:\n\n- Nested-loop-join\n - `iterations = 10 * 1.000 = 10.000`\n - `persistedItems = 0`\n - `blockingItems = 0`\n- Hash-join\n - `iterations = 10 + 1.000 = 1.010`\n - `persistedItems = 10`\n - `blockingItems = 10`\n\n_The `requestTime` join coefficient is omitted for simplicity._\n\nIf the Join Coefficients Mediator gives equal weights to all join coefficients,\nthen it can come up with the following overall costs, which would make hash-join the selected physical actor:\n\n- Nested-loop-join: `10.000 + 0 + 0 = 10.000`\n- Hash-join: `1.010 + 10 + 10 = 1.030`\n\nHowever, if the Join Coefficients Mediator would be configured to give a much higher weight (`1.000`)\nto the number of blocking items (e.g. when early results are prioritized),\nthen the overall costs would become the following, which would make nested-loop-join the selected physical actor:\n\n- Nested-loop-join: `10.000 + 0 * 1.000 + 0 = 10.000`\n- Hash-join: `1.010 + 10 * 1.000 + 10 = 11.020`\n'},15152:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Linking local Comunica versions to other projects\'\ndescription: \'Guide on how to use a local development version of Comunica with another local project\'\n---\n\nIn cases where a local development version of Comunica is consumed as a dependency of another project, linking the local development version of Comunica to the project is required. For example, various extensions of Comunica exist. These extensions utilize packages from the base Comunica framework and add additional packages or engine configurations. 
When working on local changes to base Comunica and needing to use these changes in an extension, the local development version of Comunica needs to be somehow installed in the extension project.\n\nThere exist several methods for installing non-published packages as dependencies, including the support for installation from local paths or git repositories in [yarn add itself](https://yarnpkg.com/cli/add), so the best solution for a given use case may vary. On this page, we introduce two methods to connect a local development version of Comunica to a project depending on it: Yarn workspaces and lerna-linker.\n\nThese methods do not require publishing the development version of Comunica to NPM, thus being useful for testing changes before they are made public.\n\n## Yarn Workspaces\n\nThe [workspaces functionality of Yarn](https://yarnpkg.com/features/workspaces) can be used to automatically handle the interlinking process of multiple packages. This approach is already used within the various Comunica monorepositories to manage package interdependencies, and can be extended to link local Comunica packages from a monorepository into another local project without the use of `yarn link`.\n\n
\nThis approach involves editing package.json to use local relative paths, and will likely cause automated modifications to yarn.lock upon install. Such changes will need to be reverted prior to publishing the target project anywhere.\n
\n\n### Using with a simple local project\n\nFor example, given a local project and the Comunica base repository cloned next to each other as follows,\n\n```text\n/path/to/comunica\n/path/to/project\n```\n\nit is possible to include the local versions of Comunica base packages in the project by editing the `package.json` of the local project to include workspace references to the Comunica workspace packages:\n\n```json\n{\n "name": "project",\n "private": true,\n "workspaces": [\n "../comunica/engines/*",\n "../comunica/packages/*"\n ],\n ...\n}\n```\n\nAfterwards, running `yarn install` in the local project directory should result in Yarn simply linking the local Comunica packages in it.\n\n\n### Using with a local monorepository with Comunica dependencies\n\nThe process is identical to that of a simple project structure, except the `package.json` workspaces paths should be added alongside existing ones. For example, to set up the Comunica base and a feature repository for local development, one could clone them next to each other,\n\n```text\n/path/to/comunica\n/path/to/comunica-feature-repository\n```\n\nafter which the `package.json` of the feature repository could be modified to include both the existing packages and the local Comunica ones:\n\n```json\n{\n "name": "comunica-feature-repository",\n "private": true,\n "workspaces": [\n "../comunica/engines/*",\n "../comunica/packages/*",\n "engines/*",\n "packages/*"\n ],\n ...\n}\n```\n\n
\nBecause Yarn will use symbolic links for the workspaces packages, they will be linked as they are on disk, rather than through an emulated package install process. This means the packages must have their own dependencies installed and their code built at their source directory.\n
\n\n## Lerna-linker\n\nThe [lerna-linker](https://www.npmjs.com/package/lerna-linker) script is designed to facilitate package linking in a Lerna monorepo. It iterates over all packages, executing `yarn unlink` and `yarn link` on each. It then saves all linked packages and runs `yarn link ` for each linked package in the Comunica extension.\n\n### Installation\n\nInstall the script globally using the following:\n\n```bash\n$ npm install -g lerna-linker\n```\n\n### Usage\n\nAssume the local version of Comunica is located at `path/to/comunica` and the extension at `path/to/comunica-extension`.\n\n1\\. Link Source Packages by navigating to the base Comunica directory and running:\n\n```bash\n$ cd path/to/comunica\n$ lerna-linker linkSource \n```\n\nThis command links all packages in the base repository.\n\n2\\. Link source packages to target by moving to the Comunica extension directory and running:\n\n```bash\n$ cd path/to/comunica-extension\n$ lerna-linker linkTarget\n```\n\nThis command links the base Comunica packages to the extension.\n\n3\\. Undo Linking of the base Comunica packages by navigating to the Comunica extension directory and running:\n\n```bash\n$ cd path/to/comunica-extension\n$ lerna-linker unlinkTarget\n$ yarn install\n```\n\nThis command will unlink the base Comunica packages from the extension.\n\nBy following these steps, you can effectively manage local changes to the base Comunica framework and ensure they are utilized within the extensions. \n\n
\nLinking multiple different development versions simultaneously will not work, as running lerna-linker linkSource will overwrite all previously made links.\n
\n'},40674:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Logging'\ndescription: 'How to log messages from within actors.'\n---\n\nActors can log messages at different levels,\nwhich may be useful for debugging,\nor emitting basic information.\n\n
\nThis guide focuses on invoking a logger from within an actor implementation.\nClick here if you want to learn more about configuring logging levels and printing output.\n
\n\n## Logging methods\n\nAll actors ([`Actor`](https://comunica.github.io/comunica/classes/_comunica_core.Actor.html)) expose the following methods:\n\n* `logTrace(context, message, dataCb?)`\n* `logDebug(context, message, dataCb?)`\n* `logInfo(context, message, dataCb?)`\n* `logWarn(context, message, dataCb?)`\n* `logError(context, message, dataCb?)`\n* `logFatal(context, message, dataCb?)`\n\nThese methods allow a log message to be emitted at the different [logging levels](/docs/query/advanced/logging/#logging-levels).\n\nThese methods require the [context](/docs/query/advanced/context/) to be passed,\nand a string message.\nOptionally, you can pass a callback to a JSON data hash.\n\n## Example\n\nEmitting a log message in an actor's `run` method can be done as follows:\n```typescript\npublic run(action: IAction): Promise {\n this.logInfo(action.context, 'This is a message');\n this.logInfo(action.context, 'This is another message, with data',\n () => ({ someParam: 'someValue' }));\n}\n```\n\n\n\n"},90719:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Mediators\'\ndescription: \'An overview of all mediators in Comunica.\'\n---\n\nMediators form a critical part of [Comunica\'s core architecture](/docs/modify/advanced/architecture_core/).\nThey are responsible for **selecting one actor from a bus**, based on a given task.\n\nWhile actors perform the actual logic, they never communicate with each other directly.\nInstead, they always communicate through mediators, to reduce coupling between actors.\nIf a different actor selection technique is needed,\na different mediator can be configured without having to change any buses or actors.\n\n## Mediator implementations\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| Race | [`@comunica/mediator-race`](https://github.com/comunica/comunica/tree/master/packages/mediator-race) | Picks the first actor that resolves its test. 
|\n| Number | [`@comunica/mediator-number`](https://github.com/comunica/comunica/tree/master/packages/mediator-number) | Mediates over a single number field. It can either choose the actor with the maximum or with the minimum value. |\n| All | [`@comunica/mediator-all`](https://github.com/comunica/comunica/tree/master/packages/mediator-all) | Special mediator that runs _all_ actors that resolve their test in parallel. |\n| Combine Pipeline | [`@comunica/mediator-combine-pipeline`](https://github.com/comunica/comunica/tree/master/packages/mediator-combine-pipeline) | Special mediator that goes over all actors in sequence and forwards I/O. This requires the action input and the actor output to be of the same type. |\n| Combine Union | [`@comunica/mediator-combine-union`](https://github.com/comunica/comunica/tree/master/packages/mediator-combine-union) | Special mediator that takes the union of all actor results. |\n| Join Coefficients Fixed | [`@comunica/mediator-join-coefficients-fixed`](https://github.com/comunica/comunica/tree/master/packages/mediator-join-coefficients-fixed) | Mediates over join actors implementing the [Join Coefficients mediator type](https://github.com/comunica/comunica/tree/master/packages/mediatortype-join-coefficients). |\n\n## Mediator types\n\nComunica contains several packages named `@comunica/mediatortype-*`\nthat expose interfaces that extend the `IActorTest` interface.\nThese interfaces can be reused in different actors to indicate what properties can be mediated over.\n\nThe following mediator types are available:\n\n| Name | Package | Description |\n| ---- | ------- | ----------- |\n| HTTP Requests | [`@comunica/mediatortype-httprequests`](https://github.com/comunica/comunica/tree/master/packages/mediatortype-httprequests) | Number of HTTP requests required for an action. 
|\n| Iterations | [`@comunica/mediatortype-iterations`](https://github.com/comunica/comunica/tree/master/packages/mediatortype-iterations) | Number of iterations that are needed for joining streams. |\n| Priority | [`@comunica/mediatortype-priority`](https://github.com/comunica/comunica/tree/master/packages/mediatortype-priority) | Priority of an actor, for example used for parsers and serializers in content negotiation. |\n| Time | [`@comunica/mediatortype-time`](https://github.com/comunica/comunica/tree/master/packages/mediatortype-time) | Estimated time an action will take. |\n| Join Coefficients | [`@comunica/mediatortype-join-coefficients`](https://github.com/comunica/comunica/tree/master/packages/mediatortype-join-coefficients) | Represents the cost of a join operation on the [RDF Join bus](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join). |\n\n## Configuring and using a mediator\n\n### Defining a component mediator parameter\n\nThe following components file shows how a `mediatorJoin` parameter is added to [`@comunica/actor-query-operation-join`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-join):\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-join/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/bus-query-operation/^3.0.0/components/context.jsonld"\n ],\n "@id": "npmd:@comunica/actor-query-operation-join",\n "components": [\n {\n "@id": "caqoj:Actor/QueryOperation/Join",\n "@type": "Class",\n "extends": "cbqo:Actor/QueryOperationTypedMediated",\n "requireElement": "ActorQueryOperationJoin",\n "comment": "A comunica Join Query Operation Actor.",\n "parameters": [\n {\n "@id": "caqoj:mediatorJoin",\n "comment": "A mediator for joining Bindings streams",\n "required": true,\n "unique": true\n }\n ],\n "constructorArguments": [\n {\n "extends": 
"cbqo:Actor/QueryOperationTypedMediated/constructorArgumentsObject",\n "fields": [\n {\n "keyRaw": "mediatorJoin",\n "value": "caqoj:mediatorJoin"\n }\n ]\n }\n ]\n }\n ]\n}\n```\n\n### Instantiating a component mediator\n\nThe following config file shows how we instantiate an actor with a race mediator over the RDF join bus ([`@comunica/bus-rdf-join`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-join)):\n```json\n{\n "@context": [\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-query-operation-join/^3.0.0/components/context.jsonld",\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/mediator-race/^3.0.0/components/context.jsonld"\n ],\n "@id": "urn:comunica:my",\n "actors": [\n {\n "@id": "config-sets:sparql-queryoperators.json#myJoinQueryOperator",\n "@type": "ActorQueryOperationJoin",\n "caqoj:mediatorJoin": {\n "@id": "config-sets:sparql-queryoperators.json#mediatorRdfJoin",\n "@type": "MediatorRace",\n "cc:Mediator/bus": { "@id": "cbrj:Bus/RdfJoin" }\n }\n }\n ]\n}\n``` \n\n### Invoking a mediator in TypeScript\n\nInvoking the mediator in a TypeScript actor implementation is done like this:\n```typescript\nimport { IActionContext } from \'@comunica/types\';\nimport { AIActorTest, Mediator } from \'@comunica/core\';\nimport { ActorRdfJoin, IActionRdfJoin } from \'@comunica/bus-rdf-join\';\nimport { IMediatorTypeIterations } from \'@comunica/mediatortype-iterations\';\n\nexport class ActorQueryOperationJoin extends ActorQueryOperationTypedMediated {\n\n public readonly mediatorJoin: Mediator;\n\n public constructor(args: IActorQueryOperationJoinArgs) {\n super(args, \'join\');\n }\n\n public async testOperation(pattern: Algebra.Join, context: IActionContext): Promise {\n return true;\n }\n\n public async runOperation(pattern: Algebra.Join, context: IActionContext): Promise {\n const myAction: IActionRdfJoin = { ... 
}; \n return this.mediatorJoin.mediate(myAction);\n }\n}\n\nexport interface IActorQueryOperationJoinArgs extends IActorQueryOperationTypedMediatedArgs {\n mediatorJoin: Mediator;\n}\n```\n'},40300:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Metadata'\ndescription: 'Information for adaptive planning of query operations.'\n---\n\nAs Comunica follows a [hypermedia-driven query execution model](/docs/modify/advanced/hypermedia/)\nto allow source capabilities to be detected and exploited on-the-fly,\nthere is a need for keeping track of the _metadata_ of such sources.\nThis metadata can then be used to determine how the remaining query execution should happen.\n\n## Interface\n\nAll bindings streams and quad streams are coupled with an [`IMetadata`](https://comunica.github.io/comunica/interfaces/_comunica_types.IMetadata.html) object,\nwhich could look as follows:\n```json\n{\n \"cardinality\": { \n \"type\": \"estimate\",\n \"value\": 10403,\n \"dataset\": \"https://fragments.dbpedia.org/2016-04/en\"\n },\n \"canContainUndefs\": false,\n \"pageSize\": 100,\n \"requestTime\": 1056,\n \"order\": [\n { \"variable\": \"keyA\", \"order\": \"asc\" },\n { \"variable\": \"keyB\", \"order\": \"desc\" }\n ]\n}\n```\n\nThe `cardinality` is one of the most important fields in this metadata object,\nas it provides an estimate or exact count of the number of entries in the current bindings or quad stream.\nThis information is crucial for [join query planning](/docs/modify/advanced/joins/).\n\n## Extraction\n\nThe fields in metadata objects are determined by a combination of actors that are active on the\n[`@comunica/bus-rdf-metadata-extract`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-metadata-extract) bus.\nThese actors will inspect the current HTTP response (body and headers) to determine what fields to populate the metadata object with.\n\nFor example, if the `hydra:count` predicate is present in the response,\nthe 
[`@comunica/actor-rdf-metadata-extract-hydra-count`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-metadata-extract-hydra-count)\nactor can use this value to determine the cardinality.\n\n
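To make this extraction pattern concrete, here is a minimal standalone sketch. It is not the actual actor code: the simplified `Term`/`Quad` stand-in types, the `extractCardinality` function name, and the use of the `hydra:totalItems` predicate as the count source are assumptions for illustration only.

```typescript
// Minimal stand-ins for RDF/JS terms; real actors receive a stream of RDF/JS quads.
interface Term { value: string; }
interface Quad { subject: Term; predicate: Term; object: Term; }

// Assumed count predicate from the Hydra Core vocabulary.
const HYDRA_TOTAL_ITEMS = 'http://www.w3.org/ns/hydra/core#totalItems';

interface CardinalityMetadata {
  cardinality: { type: 'estimate' | 'exact'; value: number; dataset?: string };
}

// Hypothetical extractor: derive the cardinality field from response quads.
function extractCardinality(quads: Quad[], dataset: string): CardinalityMetadata {
  for (const quad of quads) {
    if (quad.predicate.value === HYDRA_TOTAL_ITEMS) {
      const value = Number.parseInt(quad.object.value, 10);
      if (!Number.isNaN(value)) {
        // Counts advertised by a source are treated as estimates.
        return { cardinality: { type: 'estimate', value, dataset } };
      }
    }
  }
  // Without any count information, fall back to an unbounded estimate.
  return { cardinality: { type: 'estimate', value: Number.POSITIVE_INFINITY, dataset } };
}

const metadata = extractCardinality(
  [{
    subject: { value: 'https://fragments.dbpedia.org/2016-04/en' },
    predicate: { value: HYDRA_TOTAL_ITEMS },
    object: { value: '10403' },
  }],
  'https://fragments.dbpedia.org/2016-04/en',
);
console.log(metadata.cardinality.value); // 10403
```

The resulting object mirrors the `cardinality` field of the `IMetadata` example shown earlier; a real actor on the bus would additionally merge its result with fields contributed by other metadata-extract actors.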
\n\n## Accumulation\n\nSometimes, metadata objects need to be merged together.\nThis is required in two places:\n* [`@comunica/actor-rdf-resolve-quad-pattern-hypermedia`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-resolve-quad-pattern-hypermedia): Merging the metadata objects discovered when following multiple links in a hypermedia source.\n* [`@comunica/actor-rdf-resolve-quad-pattern-federated`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-resolve-quad-pattern-federated): Merging the metadata objects of a quad pattern federated over different sources.\n\nThe [`@comunica/bus-rdf-metadata-accumulate`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-metadata-accumulate) bus\nconsists of actors that determine how each field in metadata objects needs to be merged.\nMost fields (such as `requestTime` and `pageSize`) involve a simple addition.\nHowever, the `cardinality` field involves more complexity, as will be explained below.\n\n### Accumulating cardinalities\n\nIf cardinalities are exact, then accumulation is a simple addition:\n\n```text\nCardinality1: { type: 'exact', value: 100 }\nCardinality2: { type: 'exact', value: 5 }\nCardinalityOut: { type: 'exact', value: 105 }\n```\n\nIf one of the cardinalities is an estimate, then the accumulated cardinality will also be an estimate, but we can still add them:\n\n```text\nCardinality1: { type: 'exact', value: 100 }\nCardinality2: { type: 'estimate', value: 5 }\nCardinalityOut: { type: 'estimate', value: 105 }\n```\n\nIf one of the cardinalities is a dataset-wide cardinality, while the other is not dataset-wide (e.g. during link traversal),\nthen the first cardinality is kept:\n\n```text\nCardinality1: { type: 'exact', value: 100, dataset: 'ex:dataset1' }\nCardinality2: { type: 'estimate', value: 5 }\nCardinalityOut: { type: 'exact', value: 100, dataset: 'ex:dataset1' }\n```\n\nIf a cardinality is a subset of a dataset (e.g. 
when performing a specific TPF request), then the subset cardinality is kept:\n\n```text\nCardinality1: { type: 'exact', value: 100, dataset: 'ex:dataset1' }\nCardinality2: { type: 'exact', value: 5, subsetOf: 'ex:dataset1' }\nCardinalityOut: { type: 'exact', value: 5, dataset: 'ex:dataset1' }\n```\n\nIf cardinalities with different datasets are accumulated (e.g. during federation),\nthen they are directly added, without their dataset scope:\n\n```text\nCardinality1: { type: 'exact', value: 100, dataset: 'ex:dataset1' }\nCardinality2: { type: 'estimate', value: 5, dataset: 'ex:dataset2' }\nCardinalityOut: { type: 'estimate', value: 105 }\n```\n\n## States\n\nAll metadata objects have a `state` field, which refers to an [`IMetadataValidationState`](https://comunica.github.io/comunica/interfaces/_comunica_types.IMetadataValidationState.html).\nThis state allows you to inspect if this metadata is still valid, or to listen to metadata invalidations.\nIf a metadata object is invalid, it should not be used anymore, and a new version should be requested from the bindings or quad stream.\n\nMetadata states can for example be updated if a series of links is being followed during link traversal of a source,\nwith the cardinality being continuously incremented for each additional document that is found after following a link.\n\nThese metadata states enable actors to adaptively act upon newly discovered information in sources.\n"},19177:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Observers'\ndescription: 'Passively observe actions executed by actors on a given bus.'\n---\n\nObservers are an optional element in [Comunica's core architecture](/docs/modify/advanced/architecture_core/).\nThey allow you to **listen to all actions on a bus**, without modifying the action's input or output.\n\nObservers ([`ActionObserver`](https://comunica.github.io/comunica/classes/_comunica_core.actionobserver.html)) require a `bus` parameter, which should be supplied in the config 
file.\nYour observer implementation must override the following `onRun` method:\n```typescript\ninterface ActionObserver {\n onRun(\n actor: Actor,\n action: IAction,\n output: Promise,\n ): void;\n}\n```\nThis method allows you to see the handling actor, the executed action, and a promise resolving to the action output.\n\n[Click here to find an example of a full observer implementation and configuration.](https://github.com/comunica/examples/tree/master/packages/actor-observe-rdf-dereference)\n"},1913:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Query operation result types'\ndescription: 'An overview of the different output types for query operations.'\n---\n\nComunica supports different [SPARQL query types](/docs/query/advanced/sparql_query_types/),\neach of which may require different kinds of output.\nFor example, `SELECT` queries return a stream of bindings,\n`CONSTRUCT` and `DESCRIBE` return a stream of quads,\nand `ASK` returns a boolean.\n\nThis document gives an overview of how these different output types are represented internally by Comunica actors.\n\n## Query operation output type\n\nAll relevant types and interfaces are exposed by the\n[Comunica types package](https://github.com/comunica/comunica/tree/master/packages/types).\n\n[`IQueryOperationResult`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResult.html)\nis a TypeScript union type over the following interfaces:\n\n* [`IQueryOperationResultBindings`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultBindings.html): Represents a stream of bindings.\n* [`IQueryOperationResultQuads`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultQuads.html): Represents a stream of quads.\n* [`IQueryOperationResultBoolean`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultBoolean.html): Represents a boolean result.\n* 
[`IQueryOperationResultVoid`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultVoid.html): Represents a void result.\n\n## Bindings output\n\nAn output of type [`IQueryOperationResultBindings`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultBindings.html)\nlooks as follows:\n\n```typescript\ninterface IQueryOperationResultBindings {\n type: 'bindings';\n context: ActionContext;\n metadata: () => Promise;\n bindingsStream: BindingsStream;\n}\n```\n\nThe most important field here is `bindingsStream`, which is of type [`BindingsStream`](https://comunica.github.io/comunica/modules/_comunica_types.BindingsStream.html).\nThis is a stream containing bindings.\nLearn more about the usage of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\n## Quads output\n\nAn output of type [`IQueryOperationResultQuads`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultQuads.html)\nlooks as follows:\n\n```typescript\ninterface IQueryOperationResultQuads {\n type: 'quads';\n context: ActionContext;\n metadata: () => Promise;\n quadStream: RDF.Stream & AsyncIterator;\n}\n```\n\nThe most important field here is `quadStream`, which is of type [`RDF.Stream`](/docs/query/advanced/rdfjs/)\ncontaining [RDF/JS quads](/docs/query/advanced/rdfjs/).\n\n## Boolean output\n\nAn output of type [`IQueryOperationResultBoolean`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultBoolean.html)\nlooks as follows:\n\n```typescript\ninterface IQueryOperationResultBoolean {\n type: 'boolean';\n context: ActionContext;\n execute: () => Promise;\n}\n```\n\nThe most important method here is `execute`, which returns a promise resolving to a boolean.\n\n## Void output\n\nAn output of type [`IQueryOperationResultVoid`](https://comunica.github.io/comunica/modules/_comunica_types.IQueryOperationResultVoid.html)\nlooks as 
follows:\n\n```typescript\ninterface IQueryOperationResultVoid {\n type: 'void';\n context: ActionContext;\n execute: () => Promise;\n}\n```\n\nThe most important method here is `execute`, which returns a void promise.\n\n## Casting an unknown output type\n\nIf your actor calls a query operation mediator, it will receive an output of type `IQueryOperationResult`.\nIf you want to operate on the results directly,\nand if you are not certain of the output type,\nyou will have to check the `type` field of the output,\nand handle it accordingly.\n\nIf, however, you know beforehand what the type will be,\nyou can safely cast the output type with the following helper functions:\n\n* `ActorQueryOperation.getSafeBindings`: Returns `IQueryOperationResultBindings`.\n* `ActorQueryOperation.getSafeQuads`: Returns `IQueryOperationResultQuads`.\n* `ActorQueryOperation.getSafeBoolean`: Returns `IQueryOperationResultBoolean`.\n* `ActorQueryOperation.getSafeVoid`: Returns `IQueryOperationResultVoid`.\n\nFor example, the minus query operation actor ([`@comunica/actor-query-operation-minus`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-minus))\ncan only operate on bindings streams.\nAs such, it can safely cast outputs as follows:\n\n```typescript\nconst leftResult: IQueryOperationResultBindings = ActorQueryOperation.getSafeBindings(\n await this.mediatorQueryOperation.mediate({ operation: pattern.left, context }),\n);\nconst rightResult: IQueryOperationResultBindings = ActorQueryOperation.getSafeBindings(\n await this.mediatorQueryOperation.mediate({ operation: pattern.right, context }),\n);\n\nleftResult.bindingsStream.filter(...);\n```\n"},3195:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'RDF Parsing and Serializing'\ndescription: 'Basic concepts behind parsing and serializing RDF.'\n---\n\nParsing from and serializing to RDF is of great importance within Comunica,\nas Comunica needs to be able to query over RDF files in 
different formats,\nand produce RDF query results in different formats.\n\nFor this, Comunica provides the\n[RDF Parse](/docs/modify/advanced/buses/#rdf-parse) ([`@comunica/bus-rdf-parse`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-parse))\nand\n[RDF Serialize](/docs/modify/advanced/buses/#rdf-serialize) ([`@comunica/bus-rdf-serialize`](https://github.com/comunica/comunica/tree/master/packages/bus-rdf-serialize)) bus.\nThese buses respectively contain spec-compliant **streaming** [parsers](/docs/query/advanced/specifications/#parsing-rdf)\nand [serializers](/docs/query/advanced/specifications/#serializing-rdf) for the most important RDF formats.\n\n## Calling a parser\n\nRDF parsing actors implement the [`ActorRdfParse`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_parse.ActorRdfParse.html) abstract class,\nwhich can handle two types of actions:\n\n* Retrieval of supported media types (`mediaTypes`), such as `'text/turtle'`, `application/ld+json`, ...\n* Parsing for a given media type (`handle`).\n\nWhile the first action can be used to determine all available media types that can be parsed across all actors in a bus,\nthe second action is typically used afterwards to parse RDF for a specific media type.\n\nSince there are two types of actions, calling an RDF parser involves two respective mediators.\nAn example of such two mediators can be found in [`dereference-rdf/actors.json`](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/dereference-rdf/actors.json).\nIn TypeScript, these mediators will correspond to the following fields:\n```typescript\npublic readonly mediatorRdfParseMediatypes: MediateMediaTypes;\npublic readonly mediatorRdfParseHandle: MediateMediaTyped<\n IActionParse,\n IActorTest,\n IActorParseOutput\n>;\n```\n\nAll available media types can be retrieved as follows:\n```typescript\nconst { mediaTypes } = await this.mediatorRdfParseMediatypes.mediate(\n { context, mediaTypes: 
true },\n);\n```\n\nParsing for a specific media type can be done as follows:\n```typescript\nconst { quads } = (await this.mediatorRdfParseHandle.mediate(\n {\n context,\n handle: {\n context,\n headers: undefined, // Optional HTTP fetch headers\n input: textStream,\n metadata: { baseIRI: 'http://example.org/' },\n },\n handleMediaType: 'text/turtle',\n },\n)).handle;\n```\nThe `input` must always be a text stream,\nand the output `quads` is an [RDF/JS stream](/docs/query/advanced/rdfjs/).\n\nMore examples of how these parsers are used can be found\nin actors on the [Dereference RDF bus](/docs/modify/advanced/buses/#dereference-rdf)\nor in the [rdf-parse.js package](https://github.com/rubensworks/rdf-parse.js).\n\n## Calling a serializer\n\nRDF serialization actors implement the [`ActorRdfSerialize`](https://comunica.github.io/comunica/classes/_comunica_bus_rdf_serialize.ActorRdfSerialize.html) abstract class,\nwhich can handle three types of actions:\n\n* Retrieval of supported media types (`mediaTypes`), such as `'text/turtle'`, `application/ld+json`, ...\n* Retrieval of supported media types as URLs (`mediaTypeFormats`), such as `http://www.w3.org/ns/formats/N3`, `http://www.w3.org/ns/formats/JSON-LD`, ...\n* Serializing for a given media type (`handle`).\n\nThe first action can be used to determine all available media types that can be serialized across all actors in a bus,\nthe second action is used to identify media types by URL in things like SPARQL service descriptions,\nand the third action is typically used afterwards to serialize RDF for a specific media type.\n\nSince there are three types of actions, calling an RDF serializer involves three respective mediators.\nAn example of these three mediators can be found in [`sparql-serializers.json`](https://github.com/comunica/comunica/blob/master/engines/query-sparql/config/sets/sparql-serializers.json).\nIn TypeScript, these mediators will correspond to the following fields:\n```typescript\npublic readonly mediatorRdfSerialize: 
MediatorRdfSerializeHandle;\npublic readonly mediatorMediaTypeCombiner: MediatorRdfSerializeMediaTypes;\npublic readonly mediatorMediaTypeFormatCombiner: MediatorRdfSerializeMediaTypeFormats;\n```\n\nAll available media types can be retrieved as follows:\n```typescript\nconst { mediaTypes } = await this.mediatorMediaTypeCombiner.mediate(\n { context, mediaTypes: true },\n);\n```\n\nAll available media type URLs can be retrieved as follows:\n```typescript\nconst { mediaTypeFormats } = await this.mediatorMediaTypeFormatCombiner.mediate(\n { context, mediaTypeFormats: true },\n);\n```\n\nSerializing for a specific media type can be done as follows:\n```typescript\nconst { data } = (await this.mediatorRdfSerialize.mediate({\n context,\n handle: {\n type: 'quads',\n quadStream, // An RDF/JS Stream of RDF/JS quads.\n },\n handleMediaType: 'text/turtle',\n})).handle;\n```\nThe input `quadStream` must always be an [RDF/JS stream](/docs/query/advanced/rdfjs/),\nand the output `data` is a text stream.\n\nMore examples of how these serializers are used can be found\nin the [SPARQL RDF Serialize actor](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf).\n"},58938:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Sparqlee'\ndescription: 'The SPARQL expression evaluation engine of Comunica. 
(DEPRECATED)'\n---\n\nSparqlee has been deprecated in favor of the expression evaluator package.\nThis package has its own [docs page](/docs/modify/advanced/expression-evaluator).\n\nSparqlee was an [open-source](https://github.com/comunica/sparqlee) SPARQL 1.1 expression engine\nused by different Comunica actors for evaluating expressions.\n\nConcretely, the following actors made use of it:\n* [`@comunica/actor-query-operation-extend`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-extend): Implements the extend operator.\n* [`@comunica/actor-query-operation-filter-sparqlee`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-filter-sparqlee): Implements the filter operator.\n* [`@comunica/actor-query-operation-group`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-group): Implements the group operator.\n* [`@comunica/actor-query-operation-leftjoin`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-leftjoin): Implements the left join operator.\n* [`@comunica/actor-query-operation-orderby-sparqlee`](https://github.com/comunica/comunica/tree/master/packages/actor-query-operation-orderby-sparqlee): Implements the order by operator.\n"},23871:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Testing'\ndescription: 'The unit and integration tests that lead to a more stable codebase.'\n---\n\nSince [code stability is one of the main goals of Comunica](/about/#stability),\nwe spend a lot of effort on testing our code.\nFor this, we continuously execute different kinds of tests in our [continuous integration setup](https://github.com/comunica/comunica/blob/master/.travis.yml).\nThis means that every change to the codebase goes through\na large number of tests to make sure that no existing logic breaks.\n\n## Unit tests\n\nUsing the [Jest testing framework](https://jestjs.io/),\nwe test each actor in isolation.\nWe require 100% code and branch coverage.\n\nAll unit tests can be executed in the development environment using the following command:\n```bash\n$ yarn run test\n```\n\n## Integration tests\n\nUsing [rdf-test-suite-ldf.js](https://github.com/comunica/rdf-test-suite-ldf.js),\nwe check the correctness of a collection of SPARQL queries over the different default Comunica configurations.\nThis tool makes use of [declarative test manifests](https://github.com/comunica/manifest-ldf-tests)\nthat are inspired by the SPARQL 1.1 test suite.\n\nAll integration tests can be executed in the development environment using the following command:\n```bash\n$ npx lerna run integration\n```\n\n## Specification tests\n\nTo ensure compliance with [specifications](/docs/query/advanced/specifications/),\nwe continuously execute their test suites using [rdf-test-suite.js](https://github.com/rubensworks/rdf-test-suite.js).\n\nAll specification tests can be executed in the development environment using the following command:\n```bash\n$ npx lerna run spec\n```\n\n## Sanity checks\n\nCertain things such as [browser builds](/docs/modify/advanced/browser_builds/) are not fully tested yet.\nIn order to at least check that they build successfully,\nwe run these steps as well.\n\nFor example:\n```bash\n$ npx lerna run browser\n```\n\n## Next steps\n\nThere is still a lot more we want to do regarding testing to improve stability.\nInterested in helping out? 
Have a look at [this issue](https://github.com/comunica/comunica/issues/167).\n"},81569:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Benchmarking'\ndescription: 'Guidelines on running experiments with Comunica.'\n---\n\nThis page lists guidelines on how to run experiments with Comunica.\nThis can be useful for researchers that want to evaluate their modification,\nor for Comunica core developers that want to check performance.\n\n## Considerations when benchmarking\n\n### Running Node in production mode\n\nIf you want to do benchmarking with Comunica in Node.js, make sure to **run Node.js in production** mode as follows:\n\n```bash\nNODE_ENV=production node packages/some-package/bin/some-bin.js\n```\n\nThe reason for this is that Comunica extensively generates internal `Error` objects.\nIn non-production mode, these also produce long stacktraces, which may in some cases impact performance.\n\n### Taking into account startup time of the engine\n\nIf you want to run experiments, it is important to take into account the **time it takes for the query engine to start**.\nWhen measuring execution time, one should _only_ measure the actual time it takes for the engine to execute the query,\nexcluding the query engine's startup time.\n\nAs such, simply measuring the execution time via the command line is not advised.\nInstead, one should either make use of a [SPARQL endpoint](/docs/query/getting_started/setup_endpoint/),\nthe `stats` writer on the command line,\nor measure query execution via JavaScript.\n\n### Warming up the JavaScript engine\n\nSince most modern JavaScript engines (such as the V8 engine used by Node.js) are based on Just In Time (JIT) compilation,\nthey take some time to compile and to learn about the application's structure to apply optimizations.\nAs such, it is important to warm up your query engine before doing measurements over it,\nunless you specifically want to measure the cold-start performance.\nThe **recommended way to do this 
is to set up [a Comunica SPARQL endpoint](/docs/query/getting_started/setup_endpoint/)**,\ndo some warmup queries over it, and only then execute the actual benchmark.\n\nEngines such as V8 tend to reach an optimal state rather quickly,\nso not too many warmup rounds are required for execution time to stabilize.\nThe number of warmup rounds can depend on your engine's version, machine, and query set.\n\n## Simple benchmarking using the stats writer\n\nThe easiest way to do simple benchmarking is to make use of the `-t stats` [result format](/docs/query/advanced/result_formats/).\n\n```bash\n$ NODE_ENV=production \\\n comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\" \\\n -t stats\n```\n\nThis will output CSV in the form of:\n```csv\nResult,Delay (ms),HTTP requests\n1,136.638436,2\n2,137.211264,2\n3,137.385467,2\n...\n98,151.781901,2\n99,151.838555,2\n100,151.898222,2\nTOTAL,152.175256,2\n```\n\nThis tells us:\n\n* The number of query results\n* The cumulative time for each result to be emitted\n* The cumulative number of HTTP requests required up until each result\n\n## Simple benchmarking in JavaScript\n\nWhen [creating a Comunica query engine from a JavaScript application](/docs/query/getting_started/query_app/),\nmeasuring a query's execution time can be done as follows:\n```javascript\n// Start a timer\nconsole.time(\"myTimer\");\n\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p <http://dbpedia.org/resource/Belgium>.\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n\nbindingsStream.on('data', (binding) => {\n // Optionally do some logging\n});\nbindingsStream.on('end', () => {\n // End the timer\n console.timeEnd(\"myTimer\");\n});\n```\n\nMeasuring execution time from JavaScript gives you more flexibility compared to the command line.\n\nExamples for more advanced benchmarking in JavaScript can be found in the [examples 
repo](https://github.com/comunica/examples/).\n\n## Reproducible benchmarking via JBR\n\n[JBR](https://github.com/rubensworks/jbr.js)\nis a JavaScript-based benchmarking framework\nfor easily creating and running various benchmarks with engines such as Comunica and [LDF Server](https://github.com/LinkedDataFragments/Server.js).\nIt is useful if you want to compare different configurations of Comunica or other engines with each other.\n\nTogether with the (semantic) configuration files of Comunica and LDF Server,\nthis tool completes the whole provenance chain of experimental results:\n\n* **Setup** of software based on configuration\n* **Generating** experiment input data\n* **Execution** of experiments based on parameters\n* Description of environment **dependencies** during experiments\n* **Reporting** of results\n* **Archiving** results into a single file for easy exchange\n"},92646:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Events'\ndescription: 'Overview of all Comunica-related events.'\nindex: true\nreverse: true\n---\n\nBelow, all events are listed that are related to, or organized by members of the Comunica community.\n"},38831:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'2019-06-03: Tutorial at ESWC 2019\'\ndescription: \'Comunica tutorial at the ESWC 2019 conference\'\n---\n\n
\n \n
\n\nOn June 3rd 2019, a tutorial about Comunica was given at the ESWC 2019 conference, in Portoroz, Slovenia.\n\nAll materials and slides can be found on [the tutorial\'s web page](https://comunica.github.io/Tutorial-ESWC2019-Comunica/).\n'},2447:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'2019-10-26: Tutorial at ISWC 2019\'\ndescription: \'Comunica and Solid tutorial at the ISWC 2019 conference\'\n---\n\n
\n \n
\n\nOn October 26th 2019, a tutorial about Solid and Comunica was given at the ISWC 2019 conference, in Auckland, New Zealand.\n\nAll materials and slides can be found on [the tutorial\'s web page](https://comunica.github.io/Tutorial-ISWC2019-Solid-Comunica/).\n'},3468:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'2022-09-07: Comunica Association Launch\'\ndescription: \'An online event for the official launch of the Comunica Association\'\n---\n\nOn [Wednesday 7 September 16:00 (Brussels time)](https://www.timeanddate.com/worldclock/converter.html?iso=20220907T140000&p1=48),\nthe official launch of the [Comunica Association](/association/) takes place online as a digital event.\n\nDuring this event, several invited speakers from various companies will talk about their experiences with Comunica, and show off some demo\'s.\n\nThis event will be open for everyone to watch.\nIf you want to be notified about updates (such as when the livestream link becomes available),\n**be sure to [register for free](https://www.eventbrite.com/e/comunica-association-launch-tickets-383969391787)**.\n\nIn the week after this event, we will be physically attending the [Semantics conference](/events/2022-09-13-semantics_conference/),\nwhere we will have a booth and talk at the main conference.\n\n## Watch live\n\nFollow the launch event live on Wednesday 7 September at 16:00 (Brussels time).\nYou can either watch on this page, or on [YouTube](https://www.youtube.com/watch?v=seXvRI-XtiM).\n\n
\n\n
\n\n## Schedule\n\nBelow, you can find the preliminary schedule of the launch event.\n\n### Introduction\n\n*By [Ruben Taelman](https://www.rubensworks.net/) - [IDLab - Ghent University](https://knows.idlab.ugent.be/)*\n\n*[Slides](https://www.rubensworks.net/raw/slides/2022/comunica-launch-intro/)*\n\n### Using Comunica for building foundational heritage network services\n\n*By [Enno Meijers](https://www.linkedin.com/in/ennomeijers/) - [Dutch Digital Heritage Network (NDE)](https://netwerkdigitaalerfgoed.nl/)*\n\n*[Slides](https://docs.google.com/presentation/d/14MGwBoz-x-9XCk5v1OlnxLF5UGFMYxIObVohpX8uqOc/edit#slide=id.p5)*\n\n
\n\nAt the Dutch Digital Heritage Network we’ve been building services that facilitate the discovery, use and sharing of linked data. Two examples are a realtime federated search engine (Network of Terms) and a dataset index (Dataset Register). Comunica is a great tool that allows us to efficiently query a variety of RDF sources. Plans for the future include a browser-based version and better support for fulltext SPARQL. These require some changes in Comunica, so I’ll take the opportunity to plug our wishlist.\n
\n\n### Building a linked data multi-store with Comunica Association components\n\n*By [Wouter Beek](https://github.com/wouterbeek/) - [Triply B.V.](https://triply.cc/)*\n\n
\n\nTriplyDB is a commercial multi-store that offers a wide variety of\nservices over linked data knowledge graphs.\nServices include Comunica Association components like Linked Data\nFragments, Linked Data Event Streams, and the Comunica SPARQL engine.\nIn this talk we show how Comunica Association components can be used\nto strengthen the offering of commercial linked data products.\n
\n\n### ~~Using Comunica to query Pods hosted on PodSpaces 2.0~~\n\n_**Cancelled due to availability issues, will take place at a later date**_\n\n*By [Pat McBennett](https://github.com/pmcb55/) - Technical Architect at [Inrupt](https://inrupt.com/)*\n\n
\n\nWe see Comunica as a great example of the growing Linked Data and Solid developer community and that\'s why we actively contribute to it. In particular, we find it to be a great showcase for querying Solid Pods.\nWe recently released an update that allows Comunica to query Pods in PodSpaces 2.0 and we are looking forward to continuing our commitment to Comunica and the rest of the Solid community.\nWe are going to demo using Comunica to query Pods hosted on PodSpaces 2.0.\n
\n\n### The more you know - easy access to enriched RDF using LDflex + Comunica\n\n*By [Jesse Wright](https://github.com/jeswr/) - [University Medallist Alumnus, Australian National University](https://cecs.anu.edu.au/) & Software Engineer at [Inrupt](https://inrupt.com/)*\n\n*[Slides](http://jeswr.me/slides-2022-comunica-talk/) and [Examples](https://github.com/jeswr/slides-2022-comunica-talk/tree/main/examples)*\n\n
\n\nClient-side querying and RDF reasoning have the capacity to unlock a plethora of powerful Web applications. In this talk we demonstrate how Comunica can be used to quickly query across multiple decentralised data sources and receive results that have been enriched by RDF reasoning in real-time. We will dive into real-world applications and demonstrate the power of Comunica and LDflex in creating rich user apps with minimal code.\n
\n\n### A querying SDK from research to practice\n\n*By [Pieter Colpaert](https://pietercolpaert.be/) - [IDLab - Ghent University](https://knows.idlab.ugent.be/)*\n\n
\n\nComunica was conceived within IDLab as a next-generation Linked Data Fragments client that would be able to query over heterogeneous data sources. It has now grown into our flagship product, which we certainly don’t want to keep within academic circles only. We believe a Comunica Association is the right step forward to also involve other universities, industry and start-ups, governments and hobbyists in a true quadruple helix collaboration. Today, we use Comunica ourselves to build a governmental data space, and as the querying engine behind Solid applications.\n
\n'},53199:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'2022-09-13/15: Semantics Conference\'\ndescription: \'The Comunica Association will have a booth and talk at the Semantics Conference in Vienna\'\n---\n\n
\n\nThe week after [the online launch event of the Comunica Association](/events/2022-09-07-association_launch/),\nwe will be present at the European [Semantics Conference](https://2022-eu.semantics.cc/) in Vienna from September 13 until September 15.\n\nBe sure to attend the conference if you want to talk to us there.\nWe will have a **booth** at which you can find us during the breaks,\nand we will have a **talk about Comunica** at the main conference.\n'},22154:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Logos\'\ndescription: \'Free to use logos of Comunica.\'\n---\n\n
\n\nIf you want to indicate that you use Comunica in your project,\nyou are free to use any form of the Comunica logo.\n\n[**Download the logo pack**](https://www.dropbox.com/s/s7xmy6ednifgm9v/comunica-logos.zip?dl=1).\n\nYou are not allowed to use these logos directly as your application\'s logo.\n'},88809:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Research'\ndescription: 'An overview of the research surrounding Comunica.'\n---\n\n## Working with us\n\nThe Comunica project has been initiated by [IDLab](https://www.ugent.be/ea/idlab/en) at Ghent University – imec\nas a framework for researching query execution over decentralized knowledge graphs on the Web.\nSince Comunica is open-source, anyone is free to use it to perform any research they want.\n\nIf you are interested in **collaborating**, or working on Comunica as a **Bachelor/Master/PhD student**,\nfeel free to [contact us](/ask/#email).\nWe always have interesting projects to offer on both applied development and fundamental research.\n\n## Cite\n\nIf you are using or extending Comunica as part of a scientific publication,\nwe would appreciate a citation of our [article](https://comunica.github.io/Article-ISWC2018-Resource/).\n\n```bibtex\n@inproceedings{taelman_iswc_resources_comunica_2018,\n author = {Taelman, Ruben and Van Herwegen, Joachim and Vander Sande, Miel and Verborgh, Ruben},\n title = {Comunica: a Modular SPARQL Query Engine for the Web},\n booktitle = {Proceedings of the 17th International Semantic Web Conference},\n year = {2018},\n month = oct,\n url = {https://comunica.github.io/Article-ISWC2018-Resource/}\n}\n```\n\n## Experiments\n\nThe following experiments have been done with Comunica:\n\n* [Link Traversal](/research/link_traversal/)\n* [Versioning](/research/versioning/)\n* [Approximate Membership Functions](/research/amf/)\n\n## Publications\n\nThe following publications make significant use of Comunica:\n\n* [**Optimizing Approximate Membership Metadata in Triple 
Pattern Fragments for Clients and Servers**. Taelman, R., Van Herwegen, J., Vander Sande, M., & Verborgh, R. (2020)](https://comunica.github.io/Article-SSWS2020-AMF/) ([Learn more](/research/amf/))\n* [**Discovering Data Sources in a Distributed Network of Heritage Information**. M., de Valk, S., Meijers, E., Taelman, R., Van De Sompel, H., & Verborgh, R. (2019)](https://biblio.ugent.be/publication/8629105/file/8629106.pdf)\n* [**Computational integrity for outsourced execution of SPARQL queries**. Morel, S. (2019)](https://www.scriptiebank.be/sites/default/files/thesis/2019-10/main_0.pdf)\n* [**Querying heterogeneous linked building datasets with context-expanded GraphQL queries**. Werbrouck, J., Senthilvel, M., Beetz, J., & Pauwels, P. (2019)](https://biblio.ugent.be/publication/8623179/file/8623180)\n* [**Using an Existing Website as a Queryable Low-Cost LOD Publishing Interface**. Van de Vyvere, B., Taelman, R., Colpaert, P., & Verborgh, R. (2019, June).](https://link.springer.com/chapter/10.1007/978-3-030-32327-1_35)\n* [**SAD Generator: eating our own dog food to generate KGs and websites for academic events**. Heyvaert, P., Chaves-Fraga, D., Priyatna, F., Sequeda, J., & Dimou, A. (2019, June).](https://link.springer.com/chapter/10.1007/978-3-030-32327-1_19)\n* [**Versioned Querying with OSTRICH and Comunica in MOCHA 2018**. Taelman, R., Vander Sande, M., & Verborgh, R. (2018, June)](https://biblio.ugent.be/publication/8566999/file/8567001.pdf)\n\nAlso using Comunica in our work? [Let us know](/ask/#email) so we can add a reference to this list.\n\n## Tutorials\n\nThe following conference tutorials make use of Comunica:\n\n* [**Building Decentralized Applications with Solid and Comunica**](https://comunica.github.io/Tutorial-ISWC2019-Solid-Comunica/). Ruben Taelman, Joachim Van Herwegen, Ruben Verborgh. 
Full-day tutorial at the [18th International Semantic Web Conference (ISWC 2019)](https://iswc2019.semanticweb.org), Auckland, New Zealand, 2019.\n* [**Querying Linked Data with Comunica**](https://comunica.github.io/Tutorial-ESWC2019-Comunica/). Ruben Taelman, Joachim Van Herwegen. Half-day tutorial at the [16th Extended Semantic Web Conference (ESWC2019)](https://2019.eswc-conferences.org/), Portoroz, Slovenia, 2019.\n* [**Knowledge Representation as Linked Data: Tutorial**](https://www.cikm2018.units.it/tutorial2.html). Van Herwegen, J., Heyvaert, P., Taelman, R., De Meester, B. and Dimou, A. Tutorial at the [27th ACM International Conference on Information and Knowledge Management](https://www.cikm2018.units.it/). \n"},99496:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Approximate Membership Functions\'\ndescription: \'An overview of research that has been done on AMFs during query execution.\'\n---\n\nApproximate Membership Functions (AMFs) are probabilistic data structures\nthat can efficiently determine membership of a set,\nat the cost of false positives.\nThey are typically much smaller than a full dataset,\nmaking them a useful pre-filtering method.\n\nAMFs have been investigated in the context of reducing the number of HTTP requests\nwhen querying over a [Triple Pattern Fragments](https://linkeddatafragments.org/specification/triple-pattern-fragments/) interface.\n\n## 1. Materials\n\n* [Academic article](https://comunica.github.io/Article-SSWS2020-AMF/) ([Initial work that was built upon](https://linkeddatafragments.org/publications/iswc2015-amf.pdf))\n* [Reproducible experiments](https://github.com/comunica/Experiments-AMF)\n* [AMF-enabled Comunica engine](https://github.com/comunica/comunica-feature-amf/)\n* [AMF-enabled LDF Server](https://github.com/LinkedDataFragments/Server.js/tree/feature-handlers-amf-2)\n\n## 2. 
Main findings\n\n[_Learn more in our academic article._](https://comunica.github.io/Article-SSWS2020-AMF/)\n\n### AMFs lead to faster complete results\n\nDue to the reduction of HTTP requests, complete results come in earlier.\nIn some cases, the first result can be delayed.\n\n
\n\n### Caching significantly speeds up query execution\n\nAn HTTP cache like NGINX achieves the best results, but additionally caching AMF filters server-side is not worth the effort.\n\n
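The effect of such a cache can be illustrated with a minimal memoizing request wrapper (a hypothetical sketch for illustration only; the experiments used an off-the-shelf HTTP cache such as NGINX, not custom code like this):

```javascript
// Hypothetical sketch: wrap a request function so that repeated requests
// for the same fragment URL are served from an in-memory cache.
function withCache(requestFn) {
  const cache = new Map();
  let misses = 0;
  const cachedRequest = (url) => {
    if (!cache.has(url)) {
      misses += 1; // only cache misses reach the server
      cache.set(url, requestFn(url));
    }
    return cache.get(url);
  };
  cachedRequest.misses = () => misses;
  return cachedRequest;
}

// Two identical fragment requests trigger only one "real" request.
const request = withCache((url) => `response for ${url}`);
request('https://fragments.example/?subject=ex%3As');
request('https://fragments.example/?subject=ex%3As');
```

Because many clients tend to request the same fragments, a shared cache in front of the server absorbs most of the request load, which is why additionally caching AMF filters server-side adds little on top.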
\n\n### Extreme false-positive probabilities slow down query execution\n\nOn average, a false-positive probability of 1/64 leads to the lowest overall query evaluation times for this experiment.\n\n
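The trade-off behind this false-positive probability can be made concrete with a toy Bloom filter (a hypothetical illustration; this is not the AMF implementation used in the experiments):

```javascript
// Standard Bloom filter sizing: m bits and k hash functions for n items
// at false-positive probability p. For p = 1/64, k works out to 6.
function bloomParameters(n, p) {
  const m = Math.ceil((-n * Math.log(p)) / Math.log(2) ** 2);
  const k = Math.max(1, Math.round((m / n) * Math.log(2)));
  return { m, k };
}

// Toy filter: each added value sets k bit positions; membership checks
// can yield false positives, but never false negatives.
class BloomFilter {
  constructor(m, k) {
    this.bits = new Uint8Array(m);
    this.k = k;
  }
  hash(value, seed) {
    // Simple FNV-style hash, salted per hash function (illustrative only).
    let h = seed + 0x9e3779b9;
    for (const ch of value) {
      h = Math.imul(h ^ ch.charCodeAt(0), 0x01000193) >>> 0;
    }
    return h % this.bits.length;
  }
  add(value) {
    for (let i = 0; i < this.k; i++) this.bits[this.hash(value, i)] = 1;
  }
  has(value) {
    for (let i = 0; i < this.k; i++) {
      if (!this.bits[this.hash(value, i)]) return false; // definitely absent
    }
    return true; // probably present
  }
}

const { m, k } = bloomParameters(1000, 1 / 64);
const filter = new BloomFilter(m, k);
filter.add('<ex:s> <ex:p> <ex:o>');
```

A lower probability means larger filters to download, while a higher probability lets more useless HTTP requests slip through the pre-filter; in this experiment, 1/64 balanced the two.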
\n\n## 3. Recommendations for data publishers\n\nBased on the conclusions of our experimental results,\nwe derived the following guidelines for publishers who aim to use the AMF feature:\n\n* Enable **HTTP caching** with a tool such as [NGINX](https://www.nginx.com/).\n* **Pre-compute** (or at least cache) AMFs of size 10,000 or higher.\n* If AMFs can be cached, prefer **Bloom filters** over GCS.\n* Use a false-positive **probability of 1/64**.\n'},17051:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Link Traversal\'\ndescription: \'An overview of research that has been done on Link-Traversal-based Query Processing.\'\n---\n\n[Link-Traversal-based Query Processing (LTQP)](https://link.springer.com/content/pdf/10.1007/s13222-013-0122-1.pdf) is a querying paradigm\nthat enables querying over an interlinked set of Linked Data documents\nby following links between them.\n\nIf you\'re mainly interested in Link Traversal from a Solid perspective, you can find details [here](/docs/query/advanced/solid/).\n\nResearch is being done on LTQP through various implementations in Comunica.\nThis page summarizes ongoing work, and provides links to demos.\n\n## Experimental Implementations\n\nA [dedicated (mono)repository](https://github.com/comunica/comunica-feature-link-traversal) has been created\nthat contains actors for enabling LTQP inside Comunica.\n\nSince there are multiple approaches for handling LTQP,\nmultiple [configurations](https://github.com/comunica/comunica-feature-link-traversal/tree/master/engines/config-query-sparql-link-traversal/config) are provided.\nWe have configurations for the following use cases:\n- Linked Open Data: [`config-default.json`](https://github.com/comunica/comunica-feature-link-traversal/blob/master/engines/config-query-sparql-link-traversal/config/config-default.json)\n - Available as an npm package: [`@comunica/query-sparql-link-traversal`](https://www.npmjs.com/package/@comunica/query-sparql-link-traversal)\n- Solid pods: 
[`config-solid-default.json`](https://github.com/comunica/comunica-feature-link-traversal/blob/master/engines/config-query-sparql-link-traversal/config/config-solid-default.json)\n - Available as an npm package: [`@comunica/query-sparql-link-traversal-solid`](https://www.npmjs.com/package/@comunica/query-sparql-link-traversal-solid)\n- TREE sources: [`config-tree.json`](https://github.com/comunica/comunica-feature-link-traversal/blob/master/engines/config-query-sparql-link-traversal/config/config-tree.json)\n\n## Main findings\n\nBelow, you can read the high-level findings of our link traversal experiments.\n\n### Link traversal over Solid pods\n\nWe have implemented link discovery actors dedicated to the structural properties of Solid data pods,\nsuch as their reliance on [LDP containers](https://www.w3.org/TR/ldp/), and the [Solid type index](https://solid.github.io/type-indexes/).\nWe have evaluated their performance using the [SolidBench](https://github.com/SolidBench/SolidBench.js) benchmark.\n\n[_Learn more in our academic article._](https://comunica.github.io/Article-EDBT2023-SolidQuery/)\n\n#### Structural assumptions about Solid pods significantly boost performance\n\nThe table below shows a subset of the aggregated query results when using the dedicated LDP and Solid type index actors.\n\nWe can observe that the traditional reachability semantics for link traversal (`cNone`, `cMatch`, `cAll`)\nare either unable to find all necessary documents in Solid pods to answer queries (low result accuracy acc) (`cNone` and `cMatch`),\nor they follow so many links that they result in a timeout (∑to) (`cAll`).\n\nHowever, when we add the Solid-specific actors (`cNone-solid`, `cMatch-solid`, `cAll-solid`),\nwe gain higher levels of accuracy.\nThe best combination is `cMatch` with the Solid actors,\nwhich achieves an accuracy of more than 99% in this case.\n\n| | t | ~t | t1 | ~t1 | req | ∑ans | acc | ∑to |\n| --- | ---: | ---: | ---: | ---: | ---: | ---: | 
---: | ---: |\n| cNone | 40 | 0 | N/A | N/A | 8 | 0.00 | 0.00% | 0 |\n| cMatch | 1,791 | 0 | 22,946 | 24,439 | 1,275 | 0.00 | 0.00% | 1 |\n| cAll | 128,320 | 127,021 | 28,448 | 10,554 | 0 | 0.63 | 3.13% | 8 |\n| cNone-solid | 1,552 | 1,006 | 425 | 331 | 357 | 20.50 | 74.14% | 0 |\n| **cMatch-solid** | **12,483** | **2,372** | **2,309** | **925** | **2,708** | **39.13** | **99.14%** | **0** |\n| cAll-solid | 123,979 | 125,235 | 48,382 | 10,368 | 16,623 | 3.13 | 17.40% | 7 |\n\n#### Even if queries are slow, first results can arrive quickly\n\nSome queries might take multiple seconds to finish.\nSince all query algorithms have been designed to process results in a streaming manner,\nresults can arrive iteratively.\nThis means that results can arrive after a few milliseconds, even if the final result only arrives after multiple seconds,\nas can be seen in the figure below.\n\n
\n*(Figure: query result arrival times, showing first results arriving within milliseconds.)*\n
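The reachability semantics compared in the table above can be sketched as a link-pruning rule in a toy traversal loop (illustrative JavaScript only; the `traverse` function and its in-memory "web" are hypothetical, not Comunica's actual actors):

```javascript
// Toy link-traversal loop over an in-memory "web" of documents.
// 'cNone' follows no links, 'cAll' follows every link, and 'cMatch'
// follows only links that a query-dependent predicate deems relevant.
function traverse(web, seed, criterion, matchesQuery = () => false) {
  const queue = [seed];
  const visited = new Set();
  const triples = [];
  while (queue.length > 0) {
    const url = queue.shift();
    if (visited.has(url) || !(url in web)) continue;
    visited.add(url); // one HTTP request per dereferenced document
    triples.push(...web[url].triples);
    for (const link of web[url].links) {
      if (criterion === 'cAll' || (criterion === 'cMatch' && matchesQuery(link))) {
        queue.push(link);
      }
    }
  }
  return { triples, requests: visited.size };
}

// Hypothetical three-document web: a profile linking to a friend and a stylesheet.
const web = {
  'ex:profile': { triples: ['ex:me foaf:knows ex:friend'], links: ['ex:friend', 'ex:style'] },
  'ex:friend': { triples: ['ex:friend foaf:name "A"'], links: [] },
  'ex:style': { triples: [], links: [] },
};
const isRelevant = (link) => link === 'ex:friend';
```

`cNone` dereferences only the seed, `cAll` dereferences everything reachable, and `cMatch` sits in between — mirroring the request counts (req) and timeout behaviour reported in the table above.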
\n\n#### Type index discovery is slightly better than LDP discovery\n\nAs shown in the figure below, using the Solid type index for discovering data in pods results in\na significantly lower number of HTTP requests compared to LDP-based discovery.\n\n
\n*(Figure: number of HTTP requests for type index vs. LDP-based discovery.)*\n
\n\nEven though this difference in number of HTTP requests is significant,\nthis results in only a minor difference in execution time, as shown below.\n\n
\n*(Figure: query execution times for type index vs. LDP-based discovery.)*\n
\n\n#### Pod size and fragmentation impact performance\n\nWhen we fragment data inside our pods in different ways (`composite`, `separate`, `single`, `location`, `time`),\nor we increase the amount of data inside pods by a given factor (`1`, `5`),\nwe see a significant impact on performance, as shown in the query result arrival times of a query below.\n
\n*(Figure: query result arrival times for different pod fragmentation strategies and sizes.)*\n
\n\n#### Limitations and future work\n\nThe current main limitation of this approach is that it only works well for non-complex queries.\nAs soon as query complexity increases, query execution times become too high to be practical.\nThe root cause of this problem is the lack of proper query planning,\nwhich would need to happen adaptively as soon as pod-specific information is discovered.\n\n## Try it out\n\nBelow, we list links to several example configurations for LTQP\nthat have been built as a Web client.\n\n\n'},56308:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Versioning'\ndescription: 'An overview of research that has been done on Query Processing for RDF archives.'\n---\n\nVersioned querying enables query execution over RDF archives _at_ specific versions, _between_ certain versions, and _across_ all versions.\n\nResearch is being done on versioning through various implementations in Comunica,\nin particular on [OSTRICH](https://github.com/rdfostrich/ostrich)-based RDF archives.\nThis page summarizes ongoing work. \n\n## Experimental Implementations\n\nA [dedicated (mono)repository](https://github.com/comunica/comunica-feature-versioning) has been created\nthat contains actors for enabling versioned querying inside Comunica.\n\nThe default configuration ([`config-default.json`](https://github.com/comunica/comunica-feature-versioning/blob/master/engines/config-query-sparql-versioning/config/config-default.json))\ncontains actor configurations for querying [OSTRICH](https://github.com/rdfostrich/ostrich) archives.\n"},78217:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Roadmap'\ndescription: 'The long-term goals of Comunica'\n---\n\nThis page gives an overview of the long-term goals of Comunica in order of priority,\nwhich is determined by the [Comunica Association](/association/).\n\n[Interested in helping out? 
Find out more on how to contribute](/contribute/).\n\n## Improving overall performance\n\n_[All performance-related issues](https://github.com/comunica/comunica/labels/performance%20%F0%9F%90%8C)_\n\nComunica has been designed with modularity and flexibility as primary requirements,\nwhile performance was a lower priority.\nNevertheless, [experiments](https://comunica.github.io/Article-ISWC2018-Resource/#comparison-tpf-client)\nshow that the performance of Comunica is still very similar to that of equivalent engines.\n\nAs Comunica is being used increasingly in more use cases,\nfor larger datasets and more complex queries,\nspecific [performance issues](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3A%22performance+%F0%9F%90%8C%22) are being identified.\nIn order to [resolve these](https://github.com/comunica/comunica/issues/846), new algorithms may need to be implemented,\nupstream packages may need to be evaluated,\nor even some architectural changes may be required in some cases.\nNext to that, issues related to lowering the browser bundle size are also of interest.\n\n## Developer experience\n\n_[All devex-related issues](https://github.com/comunica/comunica/labels/devx%20%F0%9F%8E%A8)_\n\nCode-specific improvements are possible\nto make it easier for developers to work with and in Comunica.\nFor example, errors can sometimes be too cryptic, which hinders development.\n\nA list of all open developer experience issues can be found\n[here](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3A%22devx+%F0%9F%8E%A8%22).\n\n## Outreach\n\n_[All outreach-related issues](https://github.com/comunica/comunica/labels/outreach)_\n\nWe intend to connect with different communities that may have overlapping interests,\nwhich we can do by lowering the barrier to entry for developers from other communities.\nThis can for example be achieved by providing pre-packaged versions of Comunica that work out-of-the-box in other 
environments,\nsuch as [rollup.js](https://rollupjs.org/guide/en/).\n\n## Future-oriented development\n\n_[All future-oriented issues](https://github.com/comunica/comunica/labels/future-oriented)_\n\nIn addition to specification compliance, Comunica is being built with possible future specifications in mind.\nComunica should become a testbed for easily testing out new query features and techniques.\nFor instance, for efforts such as [RDF\\*/SPARQL\\*](https://blog.liu.se/olafhartig/2019/01/10/position-statement-rdf-star-and-sparql-star/)\nand [SPARQL 1.2](https://github.com/w3c/sparql-12/).\nThis also includes making Comunica ready for new technologies such as [ESM](https://nodejs.org/api/esm.html) and [WebAssembly](https://webassembly.org/).\n\nWhile the architecture of Comunica has been built with this flexibility in mind,\nsome specific changes will need to be made before this is possible.\nFor instance, testing new SPARQL 1.2 query features will require the development of a new SPARQL query parser,\nsince our current parser ([SPARQL.js](https://github.com/RubenVerborgh/SPARQL.js/)) is [not flexible enough in that respect](https://github.com/comunica/comunica/issues/403).\n\n## Tangents\n\nBelow, you can find several topics that parts of the community are working on, but are not part of the general roadmap.\n\n### Different forms of query execution\n\n_Point of contact: [Ruben Taelman](https://www.rubensworks.net/contact/)_\n\nComunica's current query execution model relies on defining a set of _data sources_ to query over.\nWhile this traditional form of query execution works well in many cases,\nit can be too constrained in cases where _data is spread over many sources across the Web_, which are interlinked.\n\nOne alternative form of query execution is [Link-Traversal-based Query Execution](https://arxiv.org/abs/1108.6328),\nwhere _links are followed_ on the Web to find data.\n\nA future goal of Comunica is the integration of such alternative 
forms of query execution.\n\nYou can learn more about this work on our [experiments page](/research/#experiments).\n\n### Alternative query languages\n\n_Point of contact: [Ruben Taelman](https://www.rubensworks.net/contact/)_\n\nSPARQL is currently the (only) recommended way of querying knowledge graphs that are represented in RDF.\nHowever, there is a wide range of new graph query languages emerging, such as GraphQL, Cypher and GQL, each having their own advantages.\nAs such, being able to express queries over knowledge graphs in different languages may be valuable for different use cases.\n\nFor instance, [GraphQL-LD](/docs/query/advanced/graphql_ld/) already offers one alternative language in which queries can be expressed.\nCompared to SPARQL, GraphQL-LD is less complex, but also less expressive.\n\nGeoSPARQL is another language that may be investigated in the future.\n\n\n"},59176:function(e,n,t){var a={"./about.md":73823,"./ask.md":9518,"./association.md":54841,"./association/board.md":4074,"./association/bounty_process.md":43831,"./blog.md":1716,"./blog/2020-08-19-intro.md":19744,"./blog/2020-08-24-release_1_16.md":97463,"./blog/2020-09-25-release_1_17.md":76007,"./blog/2020-11-02-release_1_18.md":29396,"./blog/2021-01-18-release_1_19.md":14741,"./blog/2021-03-30-release_1_20.md":88917,"./blog/2021-04-27-release_1_21.md":75399,"./blog/2021-06-21-comunica_association_bounties.md":80705,"./blog/2021-08-30-release_1_22.md":25615,"./blog/2021-11-08-comunica_association_members.md":60190,"./blog/2022-03-03-release_2_0.md":30494,"./blog/2022-06-29-release_2_3.md":60212,"./blog/2022-07-14-association_launch.md":39243,"./blog/2022-08-24-release_2_4.md":80850,"./blog/2022-11-09-release_2_5.md":64546,"./blog/2023-05-24-release_2_7.md":63367,"./blog/2023-07-04-release_2_8.md":57145,"./blog/2024-03-19-release_3_0.md":84831,"./blog/2024-05-11-release_3_1.md":45609,"./blog/2024-07-05-release_3_2.md":72212,"./contribute.md":4950,"./docs.md":75835,"./docs/1_query.md":17642,
"./docs/1_query/1_getting_started.md":62712,"./docs/1_query/1_getting_started/1_query_cli.md":97750,"./docs/1_query/1_getting_started/1_update_cli.md":20919,"./docs/1_query/1_getting_started/2_query_cli_file.md":48016,"./docs/1_query/1_getting_started/3_query_app.md":95276,"./docs/1_query/1_getting_started/3_update_app.md":92421,"./docs/1_query/1_getting_started/4_query_browser_app.md":51839,"./docs/1_query/1_getting_started/5_query_docker.md":26884,"./docs/1_query/1_getting_started/6_setup_endpoint.md":64942,"./docs/1_query/1_getting_started/7_setup_web_client.md":12214,"./docs/1_query/1_getting_started/8_query_dev_version.md":40971,"./docs/1_query/2_usage.md":13302,"./docs/1_query/3_faq.md":6572,"./docs/1_query/advanced.md":75770,"./docs/1_query/advanced/basic_auth.md":36323,"./docs/1_query/advanced/bindings.md":76759,"./docs/1_query/advanced/caching.md":11986,"./docs/1_query/advanced/context.md":22249,"./docs/1_query/advanced/destination_types.md":65625,"./docs/1_query/advanced/explain.md":61042,"./docs/1_query/advanced/extension_functions.md":10205,"./docs/1_query/advanced/federation.md":60711,"./docs/1_query/advanced/graphql_ld.md":33889,"./docs/1_query/advanced/hdt.md":85945,"./docs/1_query/advanced/logging.md":50974,"./docs/1_query/advanced/memento.md":82329,"./docs/1_query/advanced/proxying.md":13330,"./docs/1_query/advanced/rdfjs.md":68577,"./docs/1_query/advanced/rdfjs_querying.md":82075,"./docs/1_query/advanced/rdfjs_updating.md":27124,"./docs/1_query/advanced/result_formats.md":54924,"./docs/1_query/advanced/solid.md":89366,"./docs/1_query/advanced/source_types.md":42473,"./docs/1_query/advanced/sparql_query_types.md":72821,"./docs/1_query/advanced/specifications.md":60187,"./docs/2_modify.md":51527,"./docs/2_modify/1_getting_started.md":22111,"./docs/2_modify/1_getting_started/1_custom_config_cli.md":4228,"./docs/2_modify/1_getting_started/2_custom_config_app.md":40492,"./docs/2_modify/1_getting_started/3_custom_init.md":84338,"./docs/2_modify/1_getting
_started/4_custom_web_client.md":64265,"./docs/2_modify/1_getting_started/5_contribute_actor.md":11652,"./docs/2_modify/1_getting_started/6_actor_parameter.md":98997,"./docs/2_modify/2_extensions.md":31516,"./docs/2_modify/3_faq.md":53844,"./docs/2_modify/advanced.md":77646,"./docs/2_modify/advanced/actor_patterns.md":94512,"./docs/2_modify/advanced/algebra.md":4471,"./docs/2_modify/advanced/architecture_core.md":69763,"./docs/2_modify/advanced/architecture_sparql.md":62413,"./docs/2_modify/advanced/browser_builds.md":96848,"./docs/2_modify/advanced/buses.md":18096,"./docs/2_modify/advanced/componentsjs.md":38271,"./docs/2_modify/advanced/custom_cli_arguments.md":11434,"./docs/2_modify/advanced/expression-evaluator.md":66425,"./docs/2_modify/advanced/hypermedia.md":76240,"./docs/2_modify/advanced/joins.md":66705,"./docs/2_modify/advanced/linking_local_version.md":15152,"./docs/2_modify/advanced/logging.md":40674,"./docs/2_modify/advanced/mediators.md":90719,"./docs/2_modify/advanced/metadata.md":40300,"./docs/2_modify/advanced/observers.md":19177,"./docs/2_modify/advanced/query_operation_result_types.md":1913,"./docs/2_modify/advanced/rdf_parsing_serializing.md":3195,"./docs/2_modify/advanced/sparqlee.md":58938,"./docs/2_modify/advanced/testing.md":23871,"./docs/2_modify/benchmarking.md":81569,"./events.md":92646,"./events/2019-06-03-eswc.md":38831,"./events/2019-10-26-iswc.md":2447,"./events/2022-09-07-association_launch.md":3468,"./events/2022-09-13-semantics_conference.md":53199,"./logos.md":22154,"./research.md":88809,"./research/amf.md":99496,"./research/link_traversal.md":17051,"./research/versioning.md":56308,"./roadmap.md":78217};function o(e){return Promise.resolve().then(function(){if(!t.o(a,e)){var n=Error("Cannot find module '"+e+"'");throw n.code="MODULE_NOT_FOUND",n}return t(a[e])})}o.keys=function(){return Object.keys(a)},o.id=59176,e.exports=o},78049:function(e,n,t){var 
a={"./about.md":73823,"./ask.md":9518,"./association.md":54841,"./association/board.md":4074,"./association/bounty_process.md":43831,"./blog.md":1716,"./blog/2020-08-19-intro.md":19744,"./blog/2020-08-24-release_1_16.md":97463,"./blog/2020-09-25-release_1_17.md":76007,"./blog/2020-11-02-release_1_18.md":29396,"./blog/2021-01-18-release_1_19.md":14741,"./blog/2021-03-30-release_1_20.md":88917,"./blog/2021-04-27-release_1_21.md":75399,"./blog/2021-06-21-comunica_association_bounties.md":80705,"./blog/2021-08-30-release_1_22.md":25615,"./blog/2021-11-08-comunica_association_members.md":60190,"./blog/2022-03-03-release_2_0.md":30494,"./blog/2022-06-29-release_2_3.md":60212,"./blog/2022-07-14-association_launch.md":39243,"./blog/2022-08-24-release_2_4.md":80850,"./blog/2022-11-09-release_2_5.md":64546,"./blog/2023-05-24-release_2_7.md":63367,"./blog/2023-07-04-release_2_8.md":57145,"./blog/2024-03-19-release_3_0.md":84831,"./blog/2024-05-11-release_3_1.md":45609,"./blog/2024-07-05-release_3_2.md":72212,"./contribute.md":4950,"./docs.md":75835,"./docs/1_query.md":17642,"./docs/1_query/1_getting_started.md":62712,"./docs/1_query/1_getting_started/1_query_cli.md":97750,"./docs/1_query/1_getting_started/1_update_cli.md":20919,"./docs/1_query/1_getting_started/2_query_cli_file.md":48016,"./docs/1_query/1_getting_started/3_query_app.md":95276,"./docs/1_query/1_getting_started/3_update_app.md":92421,"./docs/1_query/1_getting_started/4_query_browser_app.md":51839,"./docs/1_query/1_getting_started/5_query_docker.md":26884,"./docs/1_query/1_getting_started/6_setup_endpoint.md":64942,"./docs/1_query/1_getting_started/7_setup_web_client.md":12214,"./docs/1_query/1_getting_started/8_query_dev_version.md":40971,"./docs/1_query/2_usage.md":13302,"./docs/1_query/3_faq.md":6572,"./docs/1_query/advanced.md":75770,"./docs/1_query/advanced/basic_auth.md":36323,"./docs/1_query/advanced/bindings.md":76759,"./docs/1_query/advanced/caching.md":11986,"./docs/1_query/advanced/context.md":22249,".
/docs/1_query/advanced/destination_types.md":65625,"./docs/1_query/advanced/explain.md":61042,"./docs/1_query/advanced/extension_functions.md":10205,"./docs/1_query/advanced/federation.md":60711,"./docs/1_query/advanced/graphql_ld.md":33889,"./docs/1_query/advanced/hdt.md":85945,"./docs/1_query/advanced/logging.md":50974,"./docs/1_query/advanced/memento.md":82329,"./docs/1_query/advanced/proxying.md":13330,"./docs/1_query/advanced/rdfjs.md":68577,"./docs/1_query/advanced/rdfjs_querying.md":82075,"./docs/1_query/advanced/rdfjs_updating.md":27124,"./docs/1_query/advanced/result_formats.md":54924,"./docs/1_query/advanced/solid.md":89366,"./docs/1_query/advanced/source_types.md":42473,"./docs/1_query/advanced/sparql_query_types.md":72821,"./docs/1_query/advanced/specifications.md":60187,"./docs/2_modify.md":51527,"./docs/2_modify/1_getting_started.md":22111,"./docs/2_modify/1_getting_started/1_custom_config_cli.md":4228,"./docs/2_modify/1_getting_started/2_custom_config_app.md":40492,"./docs/2_modify/1_getting_started/3_custom_init.md":84338,"./docs/2_modify/1_getting_started/4_custom_web_client.md":64265,"./docs/2_modify/1_getting_started/5_contribute_actor.md":11652,"./docs/2_modify/1_getting_started/6_actor_parameter.md":98997,"./docs/2_modify/2_extensions.md":31516,"./docs/2_modify/3_faq.md":53844,"./docs/2_modify/advanced.md":77646,"./docs/2_modify/advanced/actor_patterns.md":94512,"./docs/2_modify/advanced/algebra.md":4471,"./docs/2_modify/advanced/architecture_core.md":69763,"./docs/2_modify/advanced/architecture_sparql.md":62413,"./docs/2_modify/advanced/browser_builds.md":96848,"./docs/2_modify/advanced/buses.md":18096,"./docs/2_modify/advanced/componentsjs.md":38271,"./docs/2_modify/advanced/custom_cli_arguments.md":11434,"./docs/2_modify/advanced/expression-evaluator.md":66425,"./docs/2_modify/advanced/hypermedia.md":76240,"./docs/2_modify/advanced/joins.md":66705,"./docs/2_modify/advanced/linking_local_version.md":15152,"./docs/2_modify/advanced/logging.md":
40674,"./docs/2_modify/advanced/mediators.md":90719,"./docs/2_modify/advanced/metadata.md":40300,"./docs/2_modify/advanced/observers.md":19177,"./docs/2_modify/advanced/query_operation_result_types.md":1913,"./docs/2_modify/advanced/rdf_parsing_serializing.md":3195,"./docs/2_modify/advanced/sparqlee.md":58938,"./docs/2_modify/advanced/testing.md":23871,"./docs/2_modify/benchmarking.md":81569,"./events.md":92646,"./events/2019-06-03-eswc.md":38831,"./events/2019-10-26-iswc.md":2447,"./events/2022-09-07-association_launch.md":3468,"./events/2022-09-13-semantics_conference.md":53199,"./logos.md":22154,"./research.md":88809,"./research/amf.md":99496,"./research/link_traversal.md":17051,"./research/versioning.md":56308,"./roadmap.md":78217};function o(e){return t(i(e))}function i(e){if(!t.o(a,e)){var n=Error("Cannot find module '"+e+"'");throw n.code="MODULE_NOT_FOUND",n}return a[e]}o.keys=function(){return Object.keys(a)},o.resolve=i,e.exports=o,o.id=78049},33596:function(){}},function(e){e.O(0,[146,464,675,774,888,179],function(){return e(e.s=70010)}),_N_E=e.O()}]);
\ No newline at end of file
diff --git a/_next/static/chunks/pages/[...slug]-5b8472a9f9d567b0.js b/_next/static/chunks/pages/[...slug]-5b8472a9f9d567b0.js
deleted file mode 100644
index baa31107..00000000
--- a/_next/static/chunks/pages/[...slug]-5b8472a9f9d567b0.js
+++ /dev/null
@@ -1 +0,0 @@
-(self.webpackChunk_N_E=self.webpackChunk_N_E||[]).push([[330],{70010:function(e,n,t){(window.__NEXT_P=window.__NEXT_P||[]).push(["/[...slug]",function(){return t(69559)}])},91903:function(e,n,t){"use strict";var a=t(85893),o=t(9008),i=t.n(o);n.Z=e=>{let{title:n,description:t}=e;return(0,a.jsxs)(i(),{children:[(0,a.jsxs)("title",{children:["Comunica – ",n]}),(0,a.jsx)("link",{rel:"icon",href:"/favicon.ico"}),(0,a.jsx)("link",{rel:"foaf:primaryTopic",href:"/#software"}),(0,a.jsx)("link",{rel:"foaf:maker",href:"https://www.rubensworks.net/#me"}),(0,a.jsx)("link",{rel:"alternate",type:"application/rss+xml",title:"Comunica – Blog",href:"/rss-feed.xml"}),(0,a.jsx)("meta",{property:"og:image",content:"/img/comunica_red.svg"}),(0,a.jsx)("meta",{property:"og:title",content:"Comunica – ".concat(n)}),(0,a.jsx)("meta",{property:"og:description",content:"".concat(t.replace(/\n/g," "))}),(0,a.jsx)("meta",{property:"og:url",content:"/"}),(0,a.jsx)("meta",{property:"og:locale",content:"en_US"}),(0,a.jsx)("meta",{property:"og:site_name",content:"Comunica – ".concat(n)}),(0,a.jsx)("meta",{property:"og:type",content:"website"}),(0,a.jsx)("meta",{name:"twitter:site",content:"@comunicajs"}),(0,a.jsx)("meta",{name:"twitter:card",content:"summary"}),(0,a.jsx)("meta",{name:"twitter:title",content:"Comunica – ".concat(n)}),(0,a.jsx)("meta",{name:"twitter:description",content:"".concat(t.replace(/\n/g," "))}),(0,a.jsx)("meta",{name:"twitter:image",content:"https://comunica.dev/img/comunica_red.png"})]})}},57533:function(e,n,t){"use strict";t.d(n,{Z:function(){return p}});var a=t(85893),o=t(53951),i=t(38456),s=t.n(i),r=t(67294),c=t(10043),u=t.n(c),d=t(76388),l=t.n(d);function p(e){let{body:n}=e;return(0,a.jsx)(s(),{rehypePlugins:[l()],plugins:[u()],children:n,components:{code:m,h1:h,h2:h,h3:h,h4:h,h5:h,h6:h}})}t(1667);let m=e=>e.inline?(0,a.jsx)("code",{children:e.children}):(0,a.jsx)(o.default,{className:e.className,children:e.children}),h=e=>{let 
n=r.Children.toArray(e.children),t=n.reduce(g,""),a=t.toLowerCase().replace(/\W/g,"-");return r.createElement("h"+e.level,{id:a},e.children)};function g(e,n){return"string"==typeof n?e+n:r.Children.toArray(n.props.children).reduce(g,e)}},69559:function(e,n,t){"use strict";t.r(n),t.d(n,{__N_SSG:function(){return p},default:function(){return m},getStaticData:function(){return h}});var a=t(85893),o=t(91903),i=t(9675),s=t.n(i);function r(e){let{path:n,paths:t,mattersData:o,reverse:i}=e,s=t.filter(e=>e.startsWith(n)&&e!==n+"/").map(e=>e.slice(n.length+1,e.length)).filter(e=>(e.match(/\//g)||[]).length<=2).map(e=>({path:e,title:o[n+"/"+e].title,description:o[n+"/"+e].description,indent:(e.match(/\//g)||[]).length-1})).map(e=>(0,a.jsxs)("a",{href:e.path,className:"index-entry indent-"+e.indent,children:[(0,a.jsx)("h3",{children:e.title}),(0,a.jsx)("p",{children:e.description})]},e.path));return i&&(s=s.reverse()),(0,a.jsx)("div",{className:"index",children:s})}var c=t(57533);function u(e){let{path:n,paths:t,mattersData:o}=e,i=t.filter(e=>e.startsWith(n)&&e!==n+"/").map(e=>e.slice(n.length+1,e.length)).filter(e=>1===(e.match(/\//g)||[]).length).reverse().map(e=>{let[t,a,i,s]=/^([0-9][0-9][0-9][0-9])-([0-9][0-9])-([0-9][0-9])-/.exec(e);return{path:e,date:"".concat(new Date("".concat(a,"-").concat(i,"-").concat(s)).toLocaleDateString("en-US",{weekday:"long",year:"numeric",month:"long",day:"numeric"})),title:o[n+"/"+e].title,excerpt:o[n+"/"+e].excerpt}}).map(e=>(0,a.jsxs)("a",{href:e.path,className:"blog-entry",children:[(0,a.jsx)("h3",{children:e.title}),(0,a.jsx)("p",{className:"date",children:e.date}),(0,a.jsxs)("div",{className:"excerpt",children:[(0,a.jsx)(c.Z,{body:e.excerpt}),(0,a.jsx)("p",{className:"read-more",children:"Read more..."})]})]},e.path));return(0,a.jsx)("div",{className:"index",children:i})}function 
d(e){let{frontmatter:n,path:t,paths:o,mattersData:i}=e,s=o.filter(e=>t.startsWith(e)).map(e=>({path:e,title:i[e].title})).map(e=>(0,a.jsx)("li",{children:(0,a.jsx)("a",{href:e.path,children:e.title})},e.path));return s.length>0&&s.push((0,a.jsx)("li",{children:n.title},"_")),(0,a.jsx)("ul",{className:"breadcrumbs",children:s})}var l=t(67294),p=!0;class m extends l.Component{render(){let{frontmatter:e,body:n,path:t,paths:i,mattersData:s,excerpt:l}=this.props,p="";if(t.startsWith("/blog/")){let[e,n,o,i]=/^\/blog\/([0-9][0-9][0-9][0-9])-([0-9][0-9])-([0-9][0-9])-/.exec(t),s=new Date("".concat(o," ").concat(i," ").concat(n)).toLocaleDateString("en-US",{weekday:"long",year:"numeric",month:"long",day:"numeric"});p=(0,a.jsx)("p",{className:"date",children:s})}return(0,a.jsxs)("div",{className:"container-page",children:[(0,a.jsx)(o.Z,{title:e.title,description:l||e.description}),(0,a.jsxs)("main",{children:[(0,a.jsx)(d,{frontmatter:e,path:t,paths:i,mattersData:s}),(0,a.jsx)("h1",{children:e.title}),p,(0,a.jsx)("hr",{}),e.wip&&(0,a.jsxs)("div",{className:"wip",children:[(0,a.jsx)("h2",{children:"\uD83D\uDEA7 Under construction \uD83D\uDEA7️"}),(0,a.jsxs)("p",{children:["This section still needs to be created \uD83D\uDD28.",(0,a.jsx)("br",{}),"In the meantime, you can read our ",(0,a.jsx)("a",{href:"https://comunica.readthedocs.io/en/latest/",children:"old documentation"})," and check our ",(0,a.jsx)("a",{href:"https://github.com/comunica?utf8=%E2%9C%93&q=topic%3Atutorial&type=&language=",children:"tutorials"}),"."]}),(0,a.jsx)("p",{children:(0,a.jsx)("a",{href:"/contribute/",children:"You can contribute by helping to write guides like this."})})]}),(0,a.jsxs)("div",{className:"headers-overview",children:[(0,a.jsx)("p",{children:"On this 
page"}),(0,a.jsx)("ol",{className:"headers-overview-elements"})]}),(0,a.jsx)(c.Z,{body:n}),e.index&&(0,a.jsx)(r,{path:t,paths:i,mattersData:s,reverse:e.reverse}),e.blog_index&&(0,a.jsx)(u,{path:t,paths:i,mattersData:s})]})]})}componentDidMount(){let e=document.querySelector(".headers-overview-elements"),n=document.querySelector(".container-page"),t=n.querySelectorAll("h2");for(let n of t){let t=document.createElement("li"),a=document.createElement("a");a.textContent=n.innerText,a.setAttribute("href","#"+n.id),a.setAttribute("class","headers-overview-element"),t.appendChild(a),e.appendChild(t)}function a(){let n=document.querySelectorAll("a.headers-overview-element");for(let e=0;e0&&(e.parentNode.style.display="block"),window.addEventListener("load",a),window.addEventListener("scroll",a)}}async function h(){let e=(e=>{let n=e.keys(),t=n.map((e,n)=>e.slice(1,-3)+"/");return t})(t(78049)),n=e.map(e=>{let n;for(;n=/\/[0-9]*_/.exec(e);)e=e.replace(n,"/");return e}),a=(await Promise.all(e.map(e=>t(59176)(".".concat(e.slice(0,-1),".md"))))).map(e=>s()(e.default,{excerpt_separator:""})).reduce((e,t,a)=>(e[n[a]]=t,e),{});return{paths:n,matters:a,fallback:!1}}},73823:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'About'\ndescription: 'Learn more about Comunica.'\n---\n\nComunica is a knowledge graph querying framework. \nThis page provides more details about its goals and motivations.\n\nRelated pages:\n* [Roadmap](/roadmap/)\n* [Contribute](/contribute/)\n* [Logos](/logos/)\n\n
\n\n## Flexible querying of Linked Data\n\n[**Linked Data**](https://www.w3.org/standards/semanticweb/data) on the Web exists in **many shapes and forms**.\nLinked Data can be published using plain [RDF](https://www.w3.org/TR/rdf11-concepts/) files\nin various **syntaxes**, such as [JSON-LD](https://json-ld.org/), [Turtle](https://www.w3.org/TR/turtle/), [HTML+RDFa](https://www.w3.org/TR/html-rdfa/), and more.\nNext to that, different forms of **queryable Web interfaces** exist, such as [SPARQL endpoints](https://www.w3.org/TR/sparql11-protocol/) and [Triple Pattern Fragments (TPF) interfaces](https://linkeddatafragments.org/in-depth/#tpf).\nIf we want to **query** Linked Data from the Web, we need to be able to cope with this heterogeneity.\n\n**Comunica** is a **querying framework** that has been designed to handle different types of Linked Data interfaces in a **flexible** manner.\nIts primary goal is _executing [SPARQL](https://www.w3.org/TR/sparql11-query/) queries over one or more interfaces_.\n\n## Comunica is a meta-query engine\n\nComunica should not be seen as a query engine.\nInstead, Comunica is a _meta_ query engine using which query engines can be created.\nIt does this by providing a set of **modules** that can be **wired** together in a flexible manner.\n\nWhile we provide default configurations of Comunica to easily [get started with querying](/docs/query/getting_started/),\nanyone can [configure their own query engine](/docs/modify/getting_started/).\nThis allows fine-tuning Comunica to suit your own needs, while avoiding the overhead of modules that are not needed.\n\n## For and on the Web\n\nWe strongly believe in the existence of **open Web standards**, such as those provided by [W3C](https://www.w3.org/) and [WhatWG](https://whatwg.org/).\nAs such, [Comunica **implements** several specifications](/docs/query/advanced/specifications/) such as [RDF](https://www.w3.org/TR/rdf11-concepts/) and 
[SPARQL](https://www.w3.org/TR/sparql11-query/).\nFurthermore, Comunica is implemented using Web-based technologies in **JavaScript**, which enables usage through browsers,\nthe command line, the SPARQL protocol, or any Web or JavaScript application.\n\n## Open\n\nComunica is an **open-source** software project that is available under the [MIT license](https://github.com/comunica/comunica/blob/master/LICENSE.txt),\nwhich means that it is allowed to be used in both open and commercial projects.\nNext to the source code, our development process is also open, which you can read or contribute to on [GitHub](https://github.com/orgs/comunica/projects),\nor read our [high-level roadmap](/roadmap/).\n\n## Research and Education\n\nComunica is designed as a flexible platform for research on query execution.\nAs such, our goal is to make it sufficiently easy for researchers\nto investigate alternative query algorithms and techniques by [modifying engines](/docs/modify/).\nNext to this, we also aim to educate researchers and developers on [how to use](/docs/) Comunica.\n\n## Linked Data Fragments\n\nOne of the motivations behind Comunica is to be a [**Linked Data Fragments Client**](https://linkeddatafragments.org/concept/).\nLinked Data Fragments is a theoretical framework to analyse different Linked Data interfaces.\n\nWhile software used to exist to query over specific types of Linked Data interfaces,\nit used to be impossible to query over **combinations of different interfaces**.\nComunica addresses this need by being independent of specific types of interfaces,\nas support for new interfaces can be plugged in.\n\n## Stability\n\nA primary goal of Comunica is to act as a **stable** querying framework.\nFor this, we spend extra effort in [continuous testing](/docs/modify/advanced/testing/) at different levels.\n\n## Supporting the JavaScript ecosystem\n\nComunica relies on many dependencies to achieve its goals,\nsuch as spec-compliant RDF parsers and 
serializers.\nWe support these libraries, and contribute to them.\n\n## Who works on Comunica?\n\nFirst and foremost, Comunica is an **open-source** framework.\nThe Comunica project has been initiated by [IDLab](https://www.ugent.be/ea/idlab/en) at Ghent University – imec,\nand is being actively developed and maintained by a variety of [contributors](https://github.com/comunica/comunica/graphs/contributors).\nAll development happens publicly via GitHub [project boards](https://github.com/orgs/comunica/projects), [issues](https://github.com/comunica/comunica/issues), and [pull requests](https://github.com/comunica/comunica/pulls).\nAnyone is welcome to [contribute](/contribute/) to this project.\n\nAs of recently, the [Comunica Association](/association/) has been founded as a non-profit organization\nto make Comunica development more sustainable in the long term.\n"},9518:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Ask'\ndescription: 'Ask questions about Comunica.'\n---\n\nShould you not find the right information on this website,\nwe would be happy to help you out via any of the methods below.\n\nRelated pages:\n* [Roadmap](/roadmap/)\n* [Contribute](/contribute/)\n\n## Questions\n\nThe easiest way to get an answer to small questions is via our [Gitter channel](https://gitter.im/comunica/Lobby).\nThere, we have an active community of Comunica developers, contributors and enthusiasts.\n\nAlternatively, if you want a place to talk about your question (or discussion topic),\nyou can make use of the [discussions tab on GitHub](https://github.com/comunica/comunica/discussions).\n\nIn case you have a more general question related to SPARQL or RDF in JavaScript,\nthe [RDF/JS Gitter channel](https://gitter.im/rdfjs/public) should be of help.\n\n## GitHub issues\n\nIf you experience bugs with Comunica, or if you have suggestions for new features,\nfeel free to report them in our [issue tracker on 
GitHub](https://github.com/comunica/comunica/issues).\n\nPlease take into account that this is an open-source effort,\nso we may not be able to solve all issues, but we do our best!\nShould you be interested in helping out with fixing or implementing any of these issues,\nyou are very welcome to [contribute](/contribute/).\n\n## Twitter\n\nTo keep updated with the latest news on Comunica, find us on [Twitter](https://twitter.com/comunicajs).\n\n## Email\n\nFor any other matters, such as research collaborations or commercial support, you can send an email to [Ruben Taelman](mailto:ruben.taelman@ugent.be).\n"},54841:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Comunica Association\'\ndescription: \'Organization for ensuring the maintenance and development of the Comunica framework\'\n---\n\nThe Comunica Association is a **non-profit organization** for establishing a roadmap,\nand ensuring the maintenance and development of the Comunica framework and its dependencies.\n\n## Members and sponsors\n\nIf your organization is using Comunica, and you want to support its continued maintenance and future development,\nyou may consider [**donating or becoming a sponsor via Open Collective**](https://opencollective.com/comunica-association).\nThis will allow the association to fund core maintainers of Comunica to manage issues and pull requests, and to fund overall development.\nFurthermore, your organization will have the option to prioritize certain issues.\nAnother option is to become a **board member**, which will give your organization access to [board meetings](/association/board/) of the Comunica Association,\nwhich will enable your organization to collaboratively determine the long-term vision and roadmap of Comunica and the Association.\n\n
\n \n
\n\nFeel free to [contact us](mailto:ruben.taelman@ugent.be) if you want to discuss alternative forms of support,\nor regarding any related questions.\n\n
\n\n## Bounties\n\nAnother goal of the Comunica Association, is to\n**connect organizations** that are in **need of improvements or features**, to **developers** seeking funding.\n\n
\n \n
\n\nUsing our Bounty Program,\norganizations can place [**bounties on issues**](/association/bounties/),\nand developers may work on them for an agreed upon price.\nThese bounties are primarily useful for issues that have a clearly defined scope, and are not too large.\nLarger issues with an unclear scope may be better suited for becoming part of the general roadmap,\nwhich is decided by Board Members,\nof which [your organization can also become a part of](#members-and-sponsors).\n\n
\n\n## Learn more\n\nIf you want to be notified about future developments around this association, submit your email address below!\n\n\n\nThe Comunica Association is hosted by [Open Collective Europe](https://opencollective.com/europe),\nand our budget is visible on [Open Collective](https://opencollective.com/comunica-association).\n\n
\n * Sponsors that want to have an issue prioritized should contact us.\n The board will decide the final order of issue handling based on historical sponsorship contribution and developer availability.\n
\n'},4074:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Board of Directors'\ndescription: 'The board makes decisions regarding the Comunica Association'\n---\n\nThe [Comunica Association](/association/) has a Board of Directors\nthat makes decisions with respect to the Comunica Association.\nThis page describes who the members of this board are, what it does, and how it works.\n\n## Members\n\n* [Ruben Taelman](https://www.rubensworks.net/) [(IDLab, Ghent University – imec)](https://knows.idlab.ugent.be/) - Codebase curator, Core maintainer\n* [Pieter Colpaert](https://pietercolpaert.be/) [(IDLab, Ghent University – imec)](https://knows.idlab.ugent.be/) - Strategic coordinator\n* [Jesse Wright](https://github.com/jeswr/) [(Australian National University)](https://cecs.anu.edu.au/) - Core maintainer\n\n## Goals\n\nThe Board of Directors makes decisions concerning the following topics:\n\n* Determine long-term goals via the [roadmap](/roadmap/).\n* Suggest priorities of issues to the maintainers for short-term development via the [project boards](https://github.com/orgs/comunica/projects).\n* Coordinate the future of the Comunica Association\n\nFurthermore, board meetings can be used to evaluate the maintenance and development of Comunica and its related dependencies,\nwhich includes development by externals via the [Bounty Program](/association/bounties/).\n\n## Becoming a Board Member\n\nThere are two ways to become a Board Member:\n\n1. Become a financial contributor via [Open Collective](https://opencollective.com/comunica-association) of the Board Member tier\n2. 
Become a regular [contributor](/contribute/) in any other way, with a dedication of at least four hours per week on average.\n\n## Decision-making Process\n\nAt least once every year, the board virtually meets for a board meeting.\nNot all members are required to be present at each meeting.\nThe chair is expected to prepare an agenda ahead of time on https://github.com/comunica/association/blob/master/board-meetings/next.md,\nwhich should contain points raised by the board members.\nA meeting may be skipped if there are no objections from members.\n\nThe chair is appointed by the board members, and may be changed at any time through a decision.\nThe title of \"codebase curator\" is reserved for one person,\nand can only be passed on to someone else by the current codebase curator.\n\nDuring the meeting, decisions can be made,\nand every member can place exactly one vote.\nIn case of a tie, the final decision is up to the chair.\nThe codebase curator may optionally overrule any (final) vote if this person considers this decision to be detrimental to the future of Comunica or the Comunica Association.\nNon-attending members may raise their vote for up to two weeks after the meeting after reading the meeting minutes.\nOnce a vote is final, an action will be carried out by the executive contributors.\n\nMinutes are scribed for each meeting by a volunteer,\nand are to appear afterwards on https://github.com/comunica/association/tree/master/board-meetings\nThe minutes are sent to all board members shortly after each meeting.\n"},43831:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Bounty Procedures\'\ndescription: \'The process for handling bounties on issues\'\n---\n\nThis page explains how companies can place bounties on issues,\nhow developers can work on them,\nand how the [Comunica Association](/association/) manages such bounties.\n\n## Placing a bounty\n\nCompanies (or other types of organizations) that are interested in placing bounties on issues 
(feature requests, bug reports, ...) must follow this procedure:\n\n1. Company lets the association know they are interested in placing a bounty on an issue by mailing us.\n2. The association finds one or more suitable developers, and reports back to the company on their expected time frame and cost.\n3. All parties (association, company, developer) negotiate the final time frame and cost, after which one developer is assigned to the issue (if all parties agree).\n4. The company pays the full bounty cost to the association, from which the association claims an overhead of 15%.\n5. After completion (or when the reserved time runs out), all parties (association, company, developer) evaluate the work.\n6. The association pays the bounty to the developer (minus 15% overhead).\n\n## Working on a bounty\n\nDevelopers that are interested in working on issue bounties must follow this procedure:\n\n1. Based on the [list of bounties](/association/bounties/), developers can click on any issue to notify the association that they are interested in working on this issue.\n2. The association discusses with the developer to learn about previous experiences, the expected time frame, and the price at which the developer is willing to work.\n3. If the company agrees with the developer\'s conditions, they jointly negotiate the final time frame and cost, after which the developer is assigned to the issue (if all parties agree), and the developer can start the work.\n4. After completion (or when the reserved time runs out), the developer presents the work to the company and the association for evaluation.\n5. The association pays the bounty to the developer (minus 15% overhead).\n\n**The developer should not start working on the issue before the company and association have confirmed the assignment.**\n\n## Management of bounties\n\nThe association manages issues as follows:\n\n1. A company sends a mail to the association to place a bounty on one or more issues.\n2. 
The association marks the issue with the `comunica-association-bounty` label, and adds a footer to the issue to mark that a bounty has been placed, after which the issue will appear automatically in [the list of bounties](/association/bounties/). Optionally, a budget for the bounty can be added.\n3. If applicable, the association directly contacts potentially interested developers.\n4. The association awaits offers from developers with their estimated time frame and cost.\n5. Depending on the urgency of the issue, the association sends all offers from developers to the company, together with any previous experiences the association had with each developer.\n6. The company and association negotiate with at least one developer to agree on a fixed time frame and cost (taking into account the 15% overhead).\n7. The association sends an invoice to the company for the agreed upon price.\n8. After payment of the invoice, the developer can start with the work.\n9. The association assigns the issue to the developer, which will make the issue marked as *"claimed"* in [the list of bounties](/association/bounties/).\n10. Once the deadline is reached, the association contacts the company and developer to schedule a review meeting.\n11. During the review meeting, all parties discuss the outcome, and potential next steps.\n12. 
The association pays the bounty to the developer (minus 15% overhead).\n\nDepending on the specific needs of certain issues or use cases, deviations from these procedures may take place.\n\n## Claiming a bounty\n\nOnce a bounty has been fully finalized, you can request your payment by _submitting an expense_ via [Open Collective](https://opencollective.com/comunica-association/).\nWhen submitting an expense, you must attach an invoice, which must be a valid fiscal document.\nThis document must at least contain your VAT ID, your address, and the Comunica Association\'s address:\n\n```\nComunica Association\nCantorsteen 10 \n1000 Bruxelles \nBelgi\xeb\n```\n\nAll expenses are handled by [Open Collective Europe](https://docs.opencollective.com/oceurope).\nMore details on expenses can be found on [Open Collective Europe\'s wiki](https://docs.opencollective.com/oceurope/how-it-works/expenses).\n\n## Rules\n\n1. While anyone is allowed to take up bounties, if a board member wants to take up a bounty, all other board members have to agree, to avoid conflicts of interest.\n2. Once assigned, bounties are expected to be delivered in a timely manner. If the developer does not communicate any progress for more than a week (without prior notification of unavailability), the bounty may become unassigned.\n'},1716:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Blog'\ndescription: 'Blog posts, containing announcements or other news.'\nblog_index: true\n---\n"},19744:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'A New Website for Comunica'\n---\n\nWe're happy to present a brand new website for Comunica! \uD83C\uDF89\n_Don't know what Comunica is? [Read about it here](/about/)._\n\nThis new version contains all **basic information** around Comunica.\nAdditionally, it contains **guides** on how to [query with Comunica](/docs/query/),\nand how to [modify or extend it](/docs/modify/). 
\n\n\n\nWhile this website is still very much a **work in progress** at the time of writing,\na whole lot of pages have been added already.\nFor instance, the section on [querying with Comunica](/docs/query/) contains some extensive guides.\nIn the near future, more advanced guides on [modifying Comunica](/docs/modify/) will be added.\nIf you're interested in **helping out** with this effort, be sure to have a look at the [contribution guide](/contribute/).\n\nIn the future, this blog will be used for **announcing news** around Comunica,\nwhich can include significant new releases,\nand other things.\nSo be sure to keep your \uD83D\uDC40 on this!\nIf you want to be notified of new blog posts, you can [follow Comunica on **Twitter**](https://twitter.com/comunicajs).\n"},97463:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 1.16.0: Full spec compliance, property paths, CSV/TSV, basic auth, and fixes\'\n---\n\nWith the latest release of Comunica, we have achieved the major milestone of **full compliance to the SPARQL 1.1 specification**.\nWhile Comunica has had support for all SPARQL 1.1 operators for a while,\nsome small parts were not always fully handled according to the spec,\nand property paths were not fully supported.\n\nThanks to the help of several students over the summer, these issues have been resolved,\nand all tests from [the SPARQL 1.1 test suite](https://w3c.github.io/rdf-tests/sparql11/) now pass.\n\n\n\n## SPARQL 1.1 Query compliance\n\nOur continuous integration tool has been configured to continuously check correctness\nusing unit tests, integration tests, and the SPARQL 1.1 query test suite.\nSo far, some tests from this test suite used to fail, primarily due to the lack of full property path support.\nThanks to the help of [several](https://github.com/comunica/comunica/commits?author=stephaniech97) [students](https://github.com/comunica/comunica/commits?author=FlorianFV) that [contributed](/contribute/)\nduring the summer, 
all of these issues have been resolved,\nwhich makes Comunica fully compliant to the [SPARQL 1.1 Query](https://www.w3.org/TR/sparql11-query/) specification.\n\nThe next major goal will now be to implement the [SPARQL 1.1 Update](https://www.w3.org/TR/sparql11-update/) specification.\n\nInterested in helping out? Let us know via [GitHub](https://github.com/comunica/comunica/issues/435).\n\n## Property paths\n\nSPARQL 1.1 provides the [property paths syntax](https://www.w3.org/TR/sparql11-query/#propertypaths),\nwhich is a power-user feature that allows complex paths between two resources to be expressed.\nAs of now, Comunica implements all property paths functionality according to the specification.\n\nFor example, property paths allow you to define alternative predicates:\n```sparql\nSELECT ?person WHERE {\n [ rdfs:label "Bruce Willis"@en ] (dbpedia-owl:spouse|dbpedia-owl:child) ?person.\n}\n```\n\nTry out some example queries live via our Web client:\n\n* [Spouses and children of Bruce Willis](http://query.linkeddatafragments.org/#transientDatasources=http%3A%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen&query=SELECT%20%3Fperson%0AWHERE%20%7B%0A%20%20%5B%20rdfs%3Alabel%20%22Bruce%20Willis%22%40en%20%5D%0A%20%20%20%20(dbpedia-owl%3Aspouse%7Cdbpedia-owl%3Achild)%20%3Fperson.%0A%7D)\n* [In-laws of Brad Pitt](http://query.linkeddatafragments.org/#transientDatasources=http%3A%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen&query=SELECT%20%3Fperson%0AWHERE%20%7B%0A%20%20dbpedia%3ABrad_Pitt%20dbpedia-owl%3Aspouse*%20%3Fperson.%0A%7D)\n* [Movies from directors who have directed movies with Brad Pitt](http://query.linkeddatafragments.org/#transientDatasources=http%3A%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen&query=SELECT%20%3Fmovie%0AWHERE%20%7B%0A%20%20%5B%20rdfs%3Alabel%20%22Brad%20Pitt%22%40en%20%5D%0A%20%20%20%20%5Edbpedia-owl%3Astarring%2Fdbpedia-owl%3Adirector%2F%5Edbpedia-owl%3Adirector%20%3Fmovie.%0A%7D)\n\nShould you run into any bugs related to property paths, \nbe sure 
to [report them on our issue tracker](https://github.com/comunica/comunica/issues).\n\n## CSV/TSV Serializers\n\nWhile there already was support for many [result formats](/docs/query/advanced/result_formats/) in Comunica,\n[CSV and TSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) support was missing.\nAs of this release, this lack has been resolved.\nThey can be used by requesting the `text/csv` or `text/tab-separated-values` media types.\n\nFor example, try it out as follows from the command line:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100" \\\n -t \'text/csv\'\n```\n\n## Basic authentication\n\nSometimes, access to data on the Web requires [HTTP Basic Authentication](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication).\nAs of this release, you can [configure Comunica to pass the required credentials](/docs/query/advanced/basic_auth/) to access these sources that require authentication.\n\nFor example, username and password can be passed from the command line:\n```bash\n$ comunica-sparql https://username:password@example.org/page \\\n "SELECT * WHERE { ?s ?p ?o }"\n```\n\n## And more\n\nAside from the main features above, several fixes have been done.\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1160---2020-08-24) to read more about them.\n'},76007:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Hacktoberfest and Release 1.17.0'\n---\n\nIn this post, we give an overview of\ncontribution possibilities during [Hacktoberfest](https://hacktoberfest.digitalocean.com/),\nand the newly released 1.17.0 version. 
\n\n\n\n## Hacktoberfest\n\n[Hacktoberfest](https://hacktoberfest.digitalocean.com/) is a yearly event during the month of October to celebrate open-source projects,\nwhere everyone is invited to contribute to projects by submitting pull requests.\nOnce a certain number of pull requests has been made, you will receive some goodies.\n\nIf you're interested to participate in this event,\nwe have marked several issues with the label [`hacktoberfest`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Ahacktoberfest),\nwhich are well suited for first-time contributors.\n\nHappy hacking! \uD83E\uDE93\n\n## Release 1.17.0\n\nAs of today, version 1.17.0 has been released.\nIt mainly contains [a fix for the bug where some queries would never terminate without producing further results](https://github.com/comunica/comunica/commit/3095b269f1d98d706d1056495123a69bffe3b457).\nNext to this, it features some convenience features such as\n[making the logger data argument lazy](https://github.com/comunica/comunica/commit/e6d7cee1f7622e4bcb73188a0060d5d9823958f0),\n[ensuring the internal SPARQL endpoint defaults to application/json when no content type is requested](https://github.com/comunica/comunica/commit/cdde3559b51825eaebb686fffe0a9edf7c8ef238),\nand a fix for [http-based JSON-LD contexts not being retrievable within browsers](https://github.com/comunica/comunica/commit/2d0818c64e5bfbbb334ecbccb7b5a98a69263d1c).\nIt also lays the groundwork for [RDF* support](https://github.com/comunica/comunica/issues/594) in the near future.\n\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1170---2020-09-25) to read more about them.\n"},29396:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.18.0: Smaller Web bundles and Microdata parsing'\n---\n\nThis post gives a brief overview of the new 1.18.0 release.\n\n\n\n## Smaller Web bundle sizes\n\nThanks to [Jacopo 
Scazzosi](https://github.com/jacoscaz),\nthe **Webpack bundle size** of the default Comunica config has been reduced from **1.47 MiB to 1.15 MiB**.\nThis reduction is mainly caused by swapping to smaller and more Web-friendly dependencies.\n\nThese changes were applied in preparation for the new release of [Quadstore](https://github.com/beautifulinteractions/node-quadstore),\na Comunica-powered RDF graph database where small bundle sizes are crucial.\n\n## Microdata parsing\n\nComunica already supported parsing RDFa from HTML (and other XML-like) documents.\nSince Microdata is [the most popular form of structured information on the Web](http://webdatacommons.org/structureddata/2019-12/stats/stats.html),\nit makes a lot of sense to be able to query over this as RDF.\nAs such, we plugged in the recently created [Microdata to RDF Streaming Parser](https://github.com/rubensworks/microdata-rdf-streaming-parser.js) into the default Comunica SPARQL config.\n\nShould you not need this parser in your querying use case,\nno worries, you can easily exclude this by creating a [custom config](https://comunica.dev/docs/modify/).\n\n## Fixes and enhancements\n\nNext to the changes above, several other smaller fixes and enhancements (such as [Emoji-support in query expressions](https://github.com/comunica/sparqlee/commit/4b873834a38c35329495d142eaf1c59f56fc0038)) were applied.\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1180---2020-11-02) to read more about them.\n"},14741:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 1.19.0: Simplifications for extensions\'\n---\n\nThe 1.19.0 release focuses on simplifications for developing Comunica extensions.\nIt contains no significant fixes or changes for end-users.\n\n\n\n## Components.js 4\n\nComunica\'s modules are wired together using the [Components.js](/docs/modify/advanced/componentsjs/) dependency injection framework.\nAs of recently, Components.js [has been 
updated](https://github.com/LinkedSoftwareDependencies/Components.js)\nto major release version 4, which features several simplifications for developers.\n\nWhile this release is backwards-compatible,\nwe do recommend that developers of Comunica modifications make the following tweaks.\n\n### Reduce clutter in `package.json`\n\nAll Comunica modules would typically contain the following entries in their `package.json` files:\n\n```json\n{\n ...\n "lsd:module": "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-abstract-bindings-hash",\n "lsd:components": "components/components.jsonld",\n "lsd:contexts": {\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-abstract-bindings-hash/^1.0.0/components/context.jsonld": "components/context.jsonld"\n },\n "lsd:importPaths": {\n "https://linkedsoftwaredependencies.org/bundles/npm/@comunica/actor-abstract-bindings-hash/^1.0.0/components/": "components/"\n },\n ...\n}\n```\n\nThis can now be simplified to:\n\n```json\n{\n ...\n "lsd:module": true\n ...\n}\n```\n\n### Update Components.js context version\n\nIf you define your own JSON-LD contexts,\nit is recommended to update to the latest version of the Components.js context:\n\n```text\n- "https://linkedsoftwaredependencies.org/bundles/npm/componentsjs/^3.0.0/components/context.jsonld",\n+ "https://linkedsoftwaredependencies.org/bundles/npm/componentsjs/^4.0.0/components/context.jsonld",\n```\n\nWhile this change is optional, skipping it will result in a startup warning mentioning the use of a deprecated context URL.\n\n## Next steps\n\nIn the future, we plan further simplifications to Comunica modifications.\nConcretely, we intend to enable the automatic generation of module and component files based on TypeScript source code.\n'},88917:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.20.0: SPARQL Update support'\n---\n\nWith this new 1.20.0 release, we bring support for [SPARQL Update](https://www.w3.org/TR/sparql11-update/) queries to 
Comunica.\nNext to this, several enhancements were made to improve developer experience,\nalong with minor new features and important bug fixes.\n\n\n\n## SPARQL Update\n\nUp until now, Comunica only supported performing read-only queries over one or more sources.\nWith this update, it is possible to execute [SPARQL Update queries](https://www.w3.org/TR/sparql11-update/)\nto modify data inside a _source_, or to direct changes to a separate _destination_.\n\nThe current implementation is fully compliant with the SPARQL Update specification,\nand it passes all tests of the test suite.\n\nCurrently, Update support is limited to [RDF/JS stores](/docs/query/advanced/rdfjs_updating/).\nSupport for updating other types of destinations is planned,\nsuch as local RDF files, [Linked Data Platform](https://www.w3.org/TR/ldp/),\n[SPARQL endpoints](https://www.w3.org/TR/2013/REC-sparql11-protocol-20130321/),\n[SPARQL Graph Store protocol](https://www.w3.org/TR/2013/REC-sparql11-http-rdf-update-20130321/), ...\n\nNo explicit support for transactions is available at the moment,\nas we assume that RDF/JS stores handle this on their own.\nProper support for this at engine-level is planned.\n\n## SPARQL endpoint worker threads\n\nIf you [use Comunica to expose a SPARQL endpoint](/docs/query/getting_started/setup_endpoint/),\nyou can now set the number of parallel worker threads using the `-w` flag:\n\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en -w 4\n```\n\nThis will result in better performance when your endpoint serves many parallel requests.\n\nTogether with this change, the timeout handling has been improved,\nas the old implementation would sometimes not terminate query executions even if the timeout was exceeded.\n\n## Features, fixes and enhancements\n\nNext to the changes above, several other features, fixes and enhancements were applied,\nsuch as the new [`@comunica/types`](https://github.com/comunica/comunica/commit/3f46a233883b699df87fcee3215516f97e15e346)\nand 
[`@comunica/context-entries`](https://github.com/comunica/comunica/commit/12b9ee3e8e5bc2d0fadd662a3d6aeef838b87619) packages,\nenabling [blank node correlation across results](https://github.com/comunica/comunica/commit/d9b93b4608c69e6c8b710b664c37e47a1c0d41c7),\nand a new [link queue bus](https://github.com/comunica/comunica/commit/8de44d1da8e63c9b3a15c26dadcb003c2c00f136).\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1200---2021-03-30) to read more about them.\n"},75399:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.21.0: Hypermedia-based SPARQL Updating'\n---\n\nThe 1.21.0 version is a smaller release\nthat mainly introduces the necessary wiring to enable hypermedia-driven SPARQL update querying,\nwhich lays the foundations for highly flexible updating of heterogeneous destinations, such as Solid data pods.\n\nIn other words, this provides the necessary ✨_magic_✨ for updating many different types of things. \n\n\n\n## Hypermedia-based updates\n\nA key feature of Comunica is its ability to [automatically detect the type of source via hypermedia](/docs/modify/advanced/hypermedia/),\nand alter its query process based on the source's capabilities.\nWith this new update, this hypermedia-based logic has also been added to the handling of update queries.\n\nConcretely, if you pass a destination by URL to Comunica,\nthe capabilities of this destination will be detected,\nand an appropriate destination handler will be used.\n\nWith this update, we provide support for [a single hypermedia destination type](/docs/query/advanced/destination_types/):\nthe [SPARQL Update-based PATCH API](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-update-hypermedia-patch-sparql-update).\nSuch a destination is an HTTP API accepting PATCH requests containing SPARQL Update queries (`application/sparql-update`),\nsuch as [Solid 
servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1).\n\nIn future updates, we intend to support more types of hypermedia-based destinations as well,\nsuch as [SPARQL endpoints](https://www.w3.org/TR/2013/REC-sparql11-protocol-20130321/),\nand [Linked Data Platform](https://www.w3.org/TR/ldp/).\n\nLearn more about updating from the [command line](/docs/query/getting_started/update_cli/)\nor from a [JavaScript application](/docs/query/getting_started/update_app/) in the documentation. \n\n## Features, fixes and enhancements\n\nNext to the changes above, several minor features, fixes and enhancements were applied,\nsuch as [more expressive configuration of JSON-LD parsing](https://github.com/comunica/comunica/commit/199710d70b01d22ea40fe5e12e16a9d8800f32fc),\nproper [CLI exit codes](https://github.com/comunica/comunica/commit/00aa446cc8d2fd713711787b8a59f45c266947ea),\nand [changing the context in the `optimize-query-operation` bus](https://github.com/comunica/comunica/commit/81373206a17d0fcb8d3af701e5266287113d545c).\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1210---2021-04-27) to read more about them.\n"},80705:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Announcing the Comunica Association, and a Bounty Program\'\n---\n\nIn this post, we announce the creation of the [Comunica Association](/association/),\nand the introduction of a new bounty system using which **organizations** and companies\ncan **fund development** of new features and the fixing of bugs,\nand through which **developers** can take up these bounties and **get paid**.\n\n\n\n## The need for an association\n\nComunica started out as a small software project to drive query-related research.\nBy now, it has grown into a project that is being widely used not only within research,\nbut also within companies and organizations as stable software.\n\nThe original research-driven development 
approach is running into its limits,\nsince features and bugs are reported regularly that do not fit into a strict research agenda.\nTherefore, there is a need to broaden the development scope of Comunica,\nwhich is the purpose of the **Comunica Association**.\n\n## Short-term goals\n\nAs of now, the Comunica Association is a **non-profit organization** (activity within [Open Knowledge Belgium](https://openknowledge.be/))\nthat as a first step will act as an intermediary between people in need of development,\nand people that want to offer development at a price.\nFor instance, a certain company may be in need of a specific feature in Comunica,\nbut may not have the required expertise to implement it.\nVia the Comunica Association, this company may place a bounty on this issue,\nso that other companies or freelance developers (that do have this expertise)\nmay take up this effort for the bounty price.\n\n
\n\nVia this bounty program, we intend to grow a network of organizations and individuals that\ncan offer services to each other around the topic of Web-scale querying of Knowledge Graphs.\n\n**Several bounties have already been placed on issues!**\nSo if you\'re a developer willing to take up such work, have a look at [the list of bounties](/association/bounties/).\nIf you\'re an organization interested in placing new bounties, have a look at [the bounty procedures](/association/bounty_process/).\n\n## Long-term goals\n\nThis bounty program is a first step in the creation of the Comunica Association.\nAs a next step, we intend to bring this network of interested organizations and individuals\neven closer by allowing everyone to collaboratively determine the future roadmap of Comunica through memberships.\n\nThe Association will be as open and transparent as possible.\nThis will mean that important decisions will be shared on this website,\nand that all finances will be visible to everyone via the [Open Collective platform](https://opencollective.com/).\n\nEven though the Comunica Association is a non-profit organization,\nit will raise funds through the bounty program and memberships\nin order to secure funding for hiring dedicated developers.\nSuch developers can then become dedicated maintainers of Comunica,\nin order to make the open-source development of Comunica and related RDF/JS libraries more sustainable in the long-term.\n\n[Click here to learn more about the Comunica Association.](/association/) \n'},25615:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 1.22.0: Improved update support, extension functions, and improved CLI handling'\n---\n\nThe 1.22.0 version features some major additions, and a bunch of smaller internal fixes and performance improvements \uD83D\uDE80!\nThe primary changes that are discussed in this post are\nsupport for more SPARQL update destination types,\nSPARQL extension functions,\nand rewritten CLI 
handling.\n\n\n\n## Updating Solid, Linked Data Platform, and SPARQL endpoints\n\nIn the previous release of Comunica, [basic support for updating documents in Solid/LDP data pods was already added by enabling PATCH requests](/blog/2021-04-27-release_1_21/).\nIn this release, we improve this support by also adding an actor that can handle PUT requests,\nwhich will allow resources to be created that do not exist yet.\n\nFor example, the following query will work whether or not the destination resource already exists,\nand Comunica will automatically determine if it should send a PUT or PATCH request:\n```bash\n$ comunica-sparql http://mypod.example.org/file.ttl \\\n -q \"INSERT DATA { }\"\n```\n\nIn the future, it will also become possible to update _private_ resources via Solid authentication.\n\nFurthermore, this release also makes it possible to forward update queries to SPARQL endpoints.\n\nLearn more about updating from the [command line](/docs/query/getting_started/update_cli/)\nor from a [JavaScript application](/docs/query/getting_started/update_app/) in the documentation.\n\n## SPARQL extension functions\n\nSPARQL allows non-standard, [custom extension functions](https://www.w3.org/TR/sparql11-query/#extensionFunctions) to be used within queries.\nSince this release, Comunica allows developers to plug in custom implementations for such functions.\n\nFor example, this allows you to plug in an implementation for the custom `func:reverse` function in the following query:\n```text\nPREFIX func: \nSELECT ?caps WHERE {\n ?s ?p ?o.\n BIND (func:reverse(?o) AS ?caps)\n}\n```\n\nLearn more about [configuring SPARQL extension functions here](/docs/query/advanced/extension_functions/).\n\n## Improved CLI arguments handling\n\nUp until this release, the internal mechanics of declaring and handling command-line arguments for `comunica-sparql` was hardcoded.\nThis caused some problems for custom init actors such as `comunica-sparql-hdt`,\nwhere custom handling of these 
arguments was required.\n\nIn order to meet these needs, the internals of CLI handling have been completely rewritten using the [`yargs`](https://www.npmjs.com/package/yargs) library.\nOther init actors can now easily plug in custom argument handlers to modify how the CLI tool behaves.\nFor the end-user, no significant changes are apparent, as the CLI tools remain fully backwards-compatible.\n\nYou can learn more about this in the [custom CLI arguments guide](/docs/modify/advanced/custom_cli_arguments/).\n\n## Features, fixes and enhancements\n\nNext to the changes above, several minor features, fixes and enhancements were applied,\nsuch as [migration to the more flexible Fetch-based HTTP actor](https://github.com/comunica/comunica/commit/a96547be4b112887a4e164496e2c6540737d8391),\n[allowing custom Fetch functions to be provided via the context](https://github.com/comunica/comunica/commit/a89f88fc1bf63c6e5d8ec7d5aee4199cd8b01e58),\n[logging filter errors as warnings in the logger](https://github.com/comunica/comunica/commit/cf12a9af63078917c0577f1d4b7d023506eda9e5),\n[reducing memory usage during query execution](https://github.com/comunica/comunica/commit/b0aeb67743eb187ddfb4e6fe8b42df240f3a9de7),\n[better error reporting for HTTP errors](https://github.com/comunica/comunica/commit/f6c2d5b2fe920808cf9ab98071da769f763c0515),\nand more.\nCheck out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v1220---2021-08-30) to read more about them.\n"},60190:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Comunica Association Memberships\'\n---\n\n[Earlier this year](/blog/2021-06-21-comunica_association_bounties/),\nwe announced the [Comunica Association](/association/),\nwhich is a non-profit organization that aims to make Comunica sustainable in the long term.\nIn this post, we announce the possibility to become a _member_ or _sponsor_ of the association,\nallowing organizations to drive the future roadmap of Comunica.\nWe 
plan an **official launch in fall 2022**, up until which organizations can choose\nto become a **founding member** of the Comunica Association.\n\n\n\n## \uD83C\uDFC6 Status of the bounty program\n\nThe [bounty program](/association/bounties/) has now been running for a couple of months,\nand so far it is working exactly as intended.\nAt the time of writing, two organizations ([Triply](https://triply.cc/) and [Netwerk Digitaal Erfgoed](https://netwerkdigitaalerfgoed.nl/))\nhave placed a total of six bounties, with varying scopes.\nOne of these bounties has already been completed, and two of them are being worked on. \n\nAn important finding is that bounties are **best applied to issues that have a clearly defined scope**, and are not too large.\nFor example, a bounty for a specific and easily reproducible bug is ideal.\nOn the other hand, more high-level issues such as the need to [improve overall performance](https://github.com/comunica/comunica/issues/846)\nseem to be less suited for bounties, as the scope is large or infinite, and the required effort is hard to predict.\nSuch issues are better suited for being part of the general roadmap of Comunica,\nwhich is the main motivation for introducing a membership structure.\n\n## \uD83C\uDFC5 Members and sponsors\n\nUp until now, Comunica primarily had a research-driven [roadmap](/roadmap/),\nbecause it grew out of a research project.\nTo allow more organizations and individuals to determine what this roadmap should look like,\nthe Comunica Association now allows [_members_ to become part of the board](/association/board/).\n\n**Board members are able to determine Comunica\'s roadmap**, and the future of the association.\nOne can become part of the board either by contributing time or by making a financial contribution,\nwhich will both be invested in core maintenance of Comunica,\nsuch as managing issues and pull requests, and working towards the roadmap.\n\nFurthermore, for organizations that want to support the 
association,\nbut do not have the desire to become part of the board,\nthere is the option to become a _sponsor_, for which three tiers currently exist.\nThe **budget provided by sponsors will also go directly towards funding core maintenance of Comunica**,\nwith the option for sponsors to prioritize certain issues.\n\n
\n\nSince the Comunica Association has a commitment to work as publicly and transparently as possible,\nall financial contributions from members and sponsors will go via our [Open Collective](https://opencollective.com/comunica-association) page.\nThis will allow everyone to see who contributed to the project, and how the budget is being spent.\n\n## \uD83D\uDE80 Next steps\n\nOrganizations that are interested in **supporting Comunica** can do so **starting from today**.\nBecoming a board member or a sponsor can be done via our [Open Collective](https://opencollective.com/comunica-association) page,\nafter which we will contact you about the practical next steps.\nIf you want to become a board member by contributing time, you can [contact us](mailto:ruben.taelman@ugent.be) directly.\n\nAll members and sponsors that are active by our launch date in the fall of 2022 (exact date will be announced later)\nwill be considered **founding members and sponsors**, and will receive a permanent mention on this website.\nBased on the active members and sponsors, we will be actively looking for dedicated core maintainers\nthat want to be funded by the Comunica Association (be sure to [contact us](mailto:ruben.taelman@ugent.be) if you\'re interested in this!).\n\n[Click here to learn more about the Comunica Association,](/association/)\nor [contact us](mailto:ruben.taelman@ugent.be) regarding any specific questions\nyou may have about the association.\n'},30494:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 2.0.0: A new major release with radical simplifications and performance improvements\'\n---\n\nSince its initial release a couple of years ago, Comunica has grown a lot,\nbut it has always remained fully backwards-compatible with every update.\nHowever, as with every software project, there is sometimes a need to make breaking changes\nso that old mechanisms can be replaced with better, newer ones.\nWith this update, we have aggregated several breaking 
changes into one large update,\nall of which should improve the lives of users one way or another.\nBelow, the primary changes are listed.\n\n\n\n## New query API\n\nFor most people, the biggest change will be in the way you use Comunica for query execution,\nas the package names of the default query engines have been renamed,\nand the JavaScript API has been improved.\n\n### New package names\n\nUp until now, you may have been using `@comunica/actor-init-sparql` (or a variant) as your main entry point for query execution.\n**This main entry point has been moved to `@comunica/query-sparql`** (or `@comunica/query-sparql-file` and `@comunica/query-sparql-rdfjs`).\nThis means that your imports and the dependencies in your `package.json` file will require updates.\n\nThe first reason for this renaming is the fact that the new names are shorter and easier to remember.\nThe second reason is mainly for people that want to configure their own Comunica engines,\nwhere the Query Init actor has been decoupled from the query engine entry points to simplify the creation of new engines.\n\n### Improved JavaScript query API\n\nAnother major change is related to the way you create and use a Comunica query engine in JavaScript.\nMainly, the following changes have been made:\n\n- `newEngine` has been replaced with the class `QueryEngine` that can be instantiated with the `new` keyword.\n- New result-dependent query methods have been added for simpler result consumption:\n - `queryBindings` for SELECT queries\n - `queryQuads` for CONSTRUCT and DESCRIBE queries\n - `queryBoolean` for ASK queries\n - `queryVoid` for update queries\n- The general `query` method still exists, but has been changed.\n- The methods on a Bindings object have been changed and improved, and obtaining values for variables does not require the `?` prefix anymore.\n- If you are using TypeScript, make sure to bump `@rdfjs/types` to at least `1.1.0`.\n\nLearn more about the new API in the [guide on querying in a 
JavaScript app](/docs/query/getting_started/query_app/).\n\nBelow, you can see an example of a simple SPARQL SELECT query execution in the old and new versions of Comunica.\n\n**Before (Comunica 1.x):**\n```typescript\nconst newEngine = require(\'@comunica/actor-init-sparql\').newEngine;\nconst myEngine = newEngine();\n\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'https://fragments.dbpedia.org/2015/en\'],\n});\n\nresult.bindingsStream.on(\'data\', (binding) => {\n console.log(binding.get(\'?s\').value);\n});\nresult.bindingsStream.on(\'end\', () => {});\nresult.bindingsStream.on(\'error\', (error) => console.error(error));\n```\n\n**After (Comunica 2.x):**\n```typescript\nconst QueryEngine = require(\'@comunica/query-sparql\').QueryEngine;\nconst myEngine = new QueryEngine();\n\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'https://fragments.dbpedia.org/2015/en\'],\n});\n\nbindingsStream.on(\'data\', (binding) => {\n console.log(binding.toString()); // New: quick way to print bindings\n console.log(binding.get(\'s\').value);\n});\nbindingsStream.on(\'end\', () => {});\nbindingsStream.on(\'error\', (error) => console.error(error));\n```\n\nThis new query API is largely aligned with the recently created [RDF/JS query specification](https://rdf.js.org/query-spec/),\nwhich makes Comunica more interoperable and interchangeable within the RDF JavaScript ecosystem.\n\n## Easier engine modifications\n\nBased on the feedback we received from developers that configure their own Comunica engines or implement their own Comunica packages,\nwe have refactored the internals of Comunica in several places to simplify these processes.\n\n### Automatic generation of components files\n\nComunica makes use of the dependency injection framework [Components.js](/docs/modify/advanced/componentsjs/)\nto load its configuration files.\nA 
requirement for this framework is that each package should expose a semantic description of its classes, i.e., the _components files_.\nThese components files are located within the `components/` directory of each package.\nWhile these files had to be manually created before,\nthese files can now be automatically generated from the TypeScript sources\nusing [Components-Generator.js](https://github.com/LinkedSoftwareDependencies/Components-Generator.js/).\nThis significantly reduces the effort when creating new Comunica packages.\nLearn more about this in the [getting started with modification guides](/docs/modify/getting_started/).\n\n### Config restructuring\n\nUp until now, all configuration files were split up in smaller fragments, but using an arbitrary fragmentation strategy.\nWith this update, all configuration files now use a consistent fragmentation strategy,\nwhere a separate sub-directory exists for each Comunica bus, in which one or more files can exist per actor.\nFurthermore, all configuration files have been moved to a new dedicated (zero-dependency) package\n[`@comunica/config-query-sparql`](https://github.com/comunica/comunica/tree/master/engines/config-query-sparql/),\nwhich simplifies reuse and extension of these config fragments.\nLearn more about this new config structure in the [README of `@comunica/config-query-sparql`](https://github.com/comunica/comunica/blob/master/engines/config-query-sparql/config/README.md).\n\n## Internal changes for better performance\n\nOne primary aspect of [our roadmap](/roadmap/) is to [improve overall performance](https://github.com/comunica/comunica/issues/846).\nIn this update, we refactored the way in which [join operations](/docs/modify/advanced/joins/) are handled,\nbecause these were not flexible enough before, which hindered optimizations.\n\nConcretely, Comunica used to handle most join operations within the Basic Graph Pattern actor,\nwhich made it impossible to use these join operators for joins with 
other types of operations,\nsuch as property paths, which thereby made these operations very slow.\nWith this refactoring, the join operator implementations have been fully decoupled from the Basic Graph Pattern actor,\nwhich for example makes joins between triple patterns and property paths much more efficient.\n\nWhile performance will be much better in many cases,\nthere are still a lot of [opportunities open for further optimization](https://github.com/comunica/comunica/issues/846).\nWe welcome [contributions](/contribute/) for making these optimizations a reality.\n\nLearn more about [joins in Comunica](/docs/modify/advanced/joins/).\n\n## Explaining query plans\n\nMost large-scale query engines offer some way of inspecting _how_ exactly a query engine will execute a given query,\nwhich is something Comunica has been lacking so far.\n\nWith this update, you can inspect in detail the exact query plan and actors that were used for executing a given query.\nThis functionality exists both on the command line (via `--explain`) and in the JavaScript API.\nFor example, the command below shows the physical plan that is printed for a given query:\n\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o. 
?s a ?o } LIMIT 100\' --explain physical\n\n{\n "logical": "slice",\n "children": [\n {\n "logical": "project",\n "variables": [\n "s",\n "p",\n "o"\n ],\n "children": [\n {\n "logical": "join",\n "children": [\n {\n "logical": "pattern",\n "pattern": "?s ?p ?o"\n },\n {\n "logical": "pattern",\n "pattern": "?s http://www.w3.org/1999/02/22-rdf-syntax-ns#type ?o"\n },\n {\n "logical": "join-inner",\n "physical": "bind",\n "bindIndex": 1,\n "bindOrder": "depth-first",\n "cardinalities": [\n {\n "type": "estimate",\n "value": 1040358853\n },\n {\n "type": "estimate",\n "value": 100022186\n }\n ],\n "joinCoefficients": {\n "iterations": 6404592831613.728,\n "persistedItems": 0,\n "blockingItems": 0,\n "requestTime": 556926378.1422498\n },\n...\n```\n\nLearn more about [explaining query plans in Comunica](/docs/query/advanced/explain/).\n\n## Webinar\n\nDue to all of these changes and simplifications,\nwe are planning a public webinar in which the basic usage of Comunica will be explained.\nThis will be useful for new developers that want to get started with Comunica,\nand developers that have used Comunica before, but want to learn about the new ways of using it.\nThis is also a perfect time for new contributors to become part of the community,\nor possibly even the [Comunica Association](/association/).\nMore news on this webinar will follow later.\n\n## Full changelog\n\nWhile this blog post explained the primary changes in Comunica 2.x,\nthere are actually many more smaller changes internally that will make your lives easier.\nIf you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v201---2022-03-02).\n'},60212:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.3.0: Better timeout support and minor enhancements'\n---\n\nIt's been a while since our latest blog post,\nso here's a small announcement on the latest 2.3.0 release.\n\n\n\n## Better timeout support\n\nWhen 
doing queries over slow sources, it may sometimes be desired to have requests time out if they run for too long.\nAs of this release, it is possible to [configure such timeouts](/docs/query/advanced/context/#16--http-timeout).\n\nFor example, configuring a timeout of 60 seconds when querying over a TPF endpoint can be done as follows:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n httpTimeout: 60_000,\n});\n```\n\nThis functionality was implemented by [@Tpt](https://github.com/Tpt), after it was requested via a [bounty](https://comunica.dev/association/bounties/).\n\n## Union default graph\n\nBy default, Comunica will only query over the [default graph](https://www.w3.org/TR/sparql11-query/#unnamedGraph).\nIf you want to query over triples in other named graphs, you need to specify this via the `GRAPH`, `FROM`, or `FROM NAMED` clauses.\nHowever, by setting the `unionDefaultGraph` context option to `true`, triple patterns will also apply to triples in the non-default graph.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n unionDefaultGraph: true,\n});\n```\n\n## Improved ordering of terms\n\nWe recently noticed that ordering of terms in Comunica (as used by `ORDER BY`) did not fully implement total ordering.\nThis caused [issues](https://github.com/comunica/comunica/issues/892) where certain terms would be ordered in an inconsistent manner.\nThanks to [@Tpt](https://github.com/Tpt), Comunica (and the underlying [Sparqlee expressions evaluator](https://github.com/comunica/sparqlee)) now have proper total ordering support.\n\n## Full changelog\n\nAs always, if you want to learn more about these changes, check out the [full 
changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v230---2022-06-29).\n"},39243:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Official launch of the Comunica Association'\n---\n\nAs previously announced, we will be officially launching the Comunica Association during the fall of this year.\nMore concretely, we are organizing an online launch event on the 7th of September,\nand we will be physically present at the Semantics conference in Vienna the week afterwards.\n\n\n\n## \uD83D\uDCD6 How we got here\n\nLast year, [we announced the Comunica Association](/blog/2021-06-21-comunica_association_bounties/),\nto make Comunica sustainable in the long-term,\nand to advance the [long-term roadmap](/roadmap/).\nUp until now, we had a soft-launch period during which a bounty program and membership structure were being set up.\nThe association has grown a lot since then,\nwith [multiple developers actively working on bounties](/association/bounties/),\nand [multiple contributors supporting us via Open Collective](https://opencollective.com/comunica-association).\n\nWe thank the following founding members, which have supported the association for this launch:\n\n- [IDLab - Ghent University](https://www.ugent.be/ea/idlab/en)\n- [Australian National University](https://cecs.anu.edu.au)\n- [Dutch Digital Heritage Network (NDE)](https://netwerkdigitaalerfgoed.nl/)\n\n## \uD83D\uDE80 Online launch event\n\nOn Wednesday 7 September at 16:00 (Brussels time), we will livestream the launch of the Comunica Association.\nDuring this event, several invited speakers from various companies will talk about their experiences with Comunica, and show off some demos.\nSpeaker profiles during this event range from commercial users of Comunica,\nto academics using Comunica for their research.\n\nIf you want to learn more about this event,\nyou can find more details on the [event page](/events/2022-09-07-association_launch/).\n\n## \uD83E\uDDD1\uD83C\uDFEB Semantics 
conference\n\nIn the week after the online launch event,\nthe [Semantics conference](https://2022-eu.semantics.cc/) takes place in Vienna, Austria, from September 13 until September 15.\nWe will be present at this conference with a booth, and we will give a talk at the main conference.\nIf you plan to attend this conference, be sure to come find us there!\n"},80850:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.4.0: Better browser support and performance improvements'\n---\n\nWe just released a new minor version of Comunica.\nHere's an overview of the main changes.\n\n\n\n## Better browser support\n\nWhen using Comunica in browser bundling tools such as Webpack,\npolyfills had to be configured since Comunica made use of Node.js builtins.\nAs of this release, Comunica does not depend directly on these Node.js builtins anymore,\nwhich means that Comunica can be bundled directly with tools such as Webpack without having to configure polyfills in a custom config.\n\nThis change was implemented by [@Tpt](https://github.com/Tpt) via a [bounty](https://comunica.dev/association/bounties/).\n\n## Performance improvements\n\nThanks to some [internal changes inside AsyncIterator](https://github.com/comunica/comunica/commit/b16e18888b0e93821c76e01a6efd9bcb3c4f9523), Comunica now runs slightly faster in general.\n\nFurthermore, [some property path logic was rewritten](https://github.com/comunica/comunica/commit/0ad833f8f32f7e3c2de1b22a0424da027656bf6a),\nwhich makes `*` and `+` path queries significantly faster for large datasets.\n\n## Tweaks to the HTTP service\n\nThe [HTTP service](https://comunica.dev/docs/query/getting_started/setup_endpoint/) of Comunica (which exposes a SPARQL endpoint) has been polished.\nSeveral bugfixes have been applied to make the endpoint more stable when there are timeouts and long-running queries.\nFurthermore, 
[some](https://github.com/comunica/comunica/commit/4958206f6b042239efe2218ce268e4b981ce9e2c)\n[features](https://github.com/comunica/comunica/commit/4dd99fee904c64e9ef700eb5080197c4a03a36fa)\nhave been added that are useful when benchmarking with Comunica.\n\n## Full changelog\n\nAs always, if you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v240---2022-08-24).\n"},64546:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.5.0: Fixes, string sources, and HTTP error handling'\n---\n\nWe just released a new small update. Here's an overview of the main changes.\n\n\n\n## String sources\n\nIf you have an RDF dataset available in a JavaScript string in some RDF serialization,\nyou can now immediately query over it by passing it as a `stringSource` as follows:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n {\n type: 'stringSource',\n value: '. .',\n mediaType: 'text/turtle',\n baseIRI: 'http://example.org/',\n },\n ],\n});\n```\n\nThis feature has been contributed by [@constraintAutomaton](https://github.com/constraintAutomaton).\n\n## HTTP error handling\n\nWith this update, query engines can become more robust against unstable or unavailable servers.\n\nUsing the `httpRetryOnServerError`, `httpRetryCount`, and `httpRetryDelay` options,\nyou can make your engine retry requests a number of times if the server produces an error.\n\nUsing the `recoverBrokenLinks` option, you can make your engine fall back to the [WayBack Machine](https://archive.org/web/) if a document has become unavailable.\n\nLearn more about using these options on the [command line](https://comunica.dev/docs/query/getting_started/query_cli/)\nand [query context](https://comunica.dev/docs/query/advanced/context/).\n\nThese features were contributed by [@Laurin-W](https://github.com/Laurin-W/) and [@jeswr](https://github.com/jeswr/).\n\n## Full 
changelog\n\nAs always, if you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v250---2022-11-09).\n"},63367:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 2.7.0: Better date support, better performance over SPARQL endpoints, and internal fixes\'\n---\n\nToday, we released a new minor update, which brings exciting new features, performance improvements, and bug fixes.\nBelow, you can find an overview of the main changes.\n\n\n\n## \uD83D\uDCC5 Durations, Dates, and Times in Filters\n\nThe SPARQL 1.1 specification prescribes only a very limited set of operations that can be done over literals with datatype `xsd:dateTime`.\nFor example, it is not possible to add/subtract durations, compute differences between times, and so on.\nRecently, [a suggestion was made](https://github.com/w3c/sparql-12/blob/main/SEP/SEP-0002/sep-0002.md) to extend\nthe number of operations that can be done over `xsd:dateTime` and related datatypes.\nThis Comunica release implements this [proposal](https://github.com/w3c/sparql-12/blob/main/SEP/SEP-0002/sep-0002.md),\nwhich means that queries such as the following are now possible:\n\n```text\nPREFIX xsd: <http://www.w3.org/2001/XMLSchema#>\nSELECT ?id ?lt ?gt WHERE {\n VALUES (?id ?l ?r) {\n (1 "PT1H"^^xsd:dayTimeDuration "PT63M"^^xsd:dayTimeDuration)\n (2 "PT3S"^^xsd:dayTimeDuration "PT2M"^^xsd:dayTimeDuration)\n (3 "-PT1H1M"^^xsd:dayTimeDuration "-PT62M"^^xsd:dayTimeDuration)\n (4 "PT0S"^^xsd:dayTimeDuration "-PT0.1S"^^xsd:dayTimeDuration)\n }\n BIND(?l < ?r AS ?lt)\n BIND(?l > ?r AS ?gt)\n}\n```\n\nThis functionality was implemented by [@jitsedesmet](https://github.com/jitsedesmet).\n\n## \uD83D\uDE80 Improved performance over SPARQL endpoints\n\nComunica aims to enable query execution over [different types of query interfaces](/about/#flexible-querying-of-linked-data), which includes SPARQL endpoints.\nWhile participating in a [recent workshop on federated 
querying over SPARQL endpoints](https://github.com/MaastrichtU-IDS/federatedQueryKG),\nwe encountered several performance issues that were caused by implementation bugs when querying over multiple SPARQL endpoints.\nWith this update, these performance issues have been resolved, and many queries that would either timeout or crash due to memory issues now run efficiently.\n\nThis functionality was implemented by [@surilindur](https://github.com/surilindur/),\n[@constraintAutomaton](https://github.com/constraintAutomaton/), and [@rubensworks](https://github.com/rubensworks/).\n\n## \uD83D\uDDC3️ Refactored internal metadata\n\nAs Comunica follows a [hypermedia-driven query execution model](/docs/modify/advanced/hypermedia/)\nto allow source capabilities to be detected and exploited on-the-fly,\nthere is a need for keeping track of the _metadata_ of such sources.\n\nTo enable more adaptive and efficient query execution in the future,\nwe have refactored this internal metadata so that it can be updated during query execution.\nThis allows operators to adaptively act upon newly discovered information in sources.\n\nMore details on these metadata changes can be read in the [documentation](/docs/modify/advanced/metadata/).\n\n## Full changelog\n\nAs always, if you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v270---2023-05-24).\n'},57145:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 2.8.0: Support for quoted triples (RDF-star and SPARQL-star)'\n---\n\nThis minor release focuses on a single but significant new feature: support for quoted triples.\n\n\n\n## \uD83E\uDE86 Quoted triples support\n\nRecently, the RDF-star community group has produced extensions to RDF and SPARQL: [RDF-star and SPARQL-star](https://www.w3.org/2021/12/rdf-star.html).\nThese extensions allow statements to be made about other statements,\nwhich previously only used to be possible using 
inconvenient workarounds such as RDF reification and named graphs.\nThe [RDF-star W3C working group](https://www.w3.org/groups/wg/rdf-star/) is now working on preparing new versions of the RDF and SPARQL recommendations,\nwhich are scheduled to be finalized in the second half of 2024.\n\nConcretely, this functionality allows triples to be _quoted_ in subject and object positions of other triples.\nFor example, the statement _\"Alice says that Violets are Blue\"_ could be expressed in Turtle as follows:\n```text\n@prefix : <http://example.org/> .\n:Alice :says << :Violets :haveColor :Blue >> .\n```\nFurthermore, this could be queried in SPARQL as follows:\n```text\nPREFIX : <http://example.org/>\nSELECT ?person ?color WHERE {\n ?person :says << :Violets :haveColor ?color >> .\n}\n```\n\nThis Comunica update adds support for this new functionality, following the [RDF-star community group report](https://www.w3.org/2021/12/rdf-star.html).\nConcretely, most RDF parsers and serializers, all SPARQL result parsers and serializers,\nand the SPARQL query parser and processing have been updated to handle quoted triples.\nFurthermore, for storing quoted triples in memory, we recommend the optimized [`rdf-stores`](https://www.npmjs.com/package/rdf-stores) package,\nwhich is also being used internally for handling quoted triples.\n\nThis functionality is fully backwards-compatible, meaning that existing applications that do not make use of quoted triples will experience no differences.\nHowever, breaking changes in our RDF-star support _may_ occur if the RDF-star W3C working group decides to deviate from the RDF-star community group report.\n\n## Full changelog\n\nAs always, if you want to learn more about all changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v280---2023-07-04).\n"},84831:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 3.0: \uD83D\uDD25 Blazingly fast federation over heterogeneous sources\'\n---\n\nMore than 2 years ago, we 
released [Comunica version 2.0](/blog/2022-03-03-release_2_0/),\nwhich featured many internal and external API changes that significantly simplified its usage.\nToday, we release version 3.0, which focuses more on internal changes, with limited changes to the external API.\nMost of the changes relate to the handling of data sources during query planning,\nwhich allows **more efficient query plans to be produced when querying over federations of heterogeneous sources**.\nThis means that for people using Comunica, the number of breaking changes in this update is very limited.\nThings will simply be faster in general, and some small convenience features have been added,\nsuch as results being [async iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols).\nFor developers extending Comunica with custom actors, there will be some larger breaking changes.\n\n\n\n## \uD83D\uDD01 Async iterable results\n\nIn recent JavaScript versions, it has been possible to use the new _for-await_ syntax over [async iterables](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols).\nComunica has been using the [AsyncIterator library](https://github.com/RubenVerborgh/AsyncIterator/) since its initial release.\nThis requires users to consume results as streams using on-data listeners, as follows:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [ \'http://fragments.dbpedia.org/2015/en\' ],\n});\n\nbindingsStream.on(\'data\', (bindings) => {\n console.log(bindings.toString());\n});\n```\n\nAs of Comunica 3.x, **results can now also be consumed via the async iterable interface**, as follows:\n```javascript\nfor await (const bindings of bindingsStream) {\n console.log(bindings.toString());\n}\n```\n\nIn performance-critical 
cases, we still recommend the on-data listener approach.\nBut in most cases, the async iterable interface will provide sufficient levels of performance.\n\n## \uD83D\uDE4B Performance improvements for end-users\n\nIn Comunica version 2.x, federated queries (i.e. queries across multiple sources)\nwould essentially be split at triple pattern level:\neach triple pattern would be sent to each source,\nand results would be combined together locally.\nWhile this way of working is semantically correct, it is not always the most performant,\nespecially when working with sources such as SPARQL endpoints that can accept way more than just triple patterns.\n\nIn Comunica version 3.x, the internal architecture has been refactored\nto enable query planning to not just happen at triple pattern level,\nbut to **enable any kind of query operation to be sent to any kind of source that supports it**.\nWhile this new architecture will enable better query optimizations to be implemented in the future,\nwe already implemented some optimizations in this release.\nFirst, if Comunica detects that multiple operations _exclusively_ apply to one source,\nthen these **operations will be grouped and sent in bulk to this source** ([`@comunica/actor-optimize-query-operation-group-sources`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-group-sources)).\nRoughly, this corresponds to the [FedX optimization techniques](http://iswc2011.semanticweb.org/fileadmin/iswc/Papers/Research_Paper/05/70310592.pdf),\nbut extended to apply to heterogeneous sources instead of only SPARQL endpoints.\nSecond, if a join is done between two sources,\nwhere one of these sources accepts bindings to be pushed down into the source (such as SPARQL endpoints and brTPF interfaces) ([`@comunica/actor-rdf-join-inner-multi-bind-source`](https://github.com/comunica/comunica/tree/master/packages/actor-rdf-join-inner-multi-bind-source)),\nthe **_bound-join_ technique is applied** 
(FedX).\nThird, if sources accept `FILTER` operations, then these **`FILTER` operations can be pushed down into the sources** that accept them ([`@comunica/actor-optimize-query-operation-filter-pushdown`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-filter-pushdown)).\nFourth, if some operations will not produce any results based on prior `COUNT` or `ASK` queries,\nthen these **empty source-specific operations will be pruned away** ([`@comunica/actor-optimize-query-operation-prune-empty-source-operations`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-prune-empty-source-operations)).\n\nEnd-users of Comunica will see a significant performance improvement when federating across multiple sources,\nespecially if some of those sources would be SPARQL endpoints.\nBelow, you can find some high-level performance comparisons of queries in Comunica 2.x vs 3.x.\n\n| Query | Comunica 2.x | Comunica 3.x |\n|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------|--------------------------|\n| [Books by San Franciscans in Harvard Library (DBpedia 
TPF)](http://query.linkeddatafragments.org/#transientDatasources=%2F%2Ffragments.dbpedia.org%2F2016-04%2Fen;%2F%2Fdata.linkeddatafragments.org%2Fviaf;%2F%2Fdata.linkeddatafragments.org%2Fharvard&query=SELECT%20%3Fperson%20%3Fname%20%3Fbook%20%3Ftitle%20%7B%0A%20%20%3Fperson%20dbpedia-owl%3AbirthPlace%20%5B%20rdfs%3Alabel%20%22San%20Francisco%22%40en%20%5D.%0A%20%20%3FviafID%20schema%3AsameAs%20%3Fperson%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20schema%3Aname%20%3Fname.%0A%20%20%3Fbook%20dc%3Acontributor%20%5B%20foaf%3Aname%20%3Fname%20%5D%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20dc%3Atitle%20%3Ftitle.%0A%7D%20LIMIT%20100) | 5774.32 ms (669 requests) | 4923.86 ms (334 requests) |\n| [Books by San Franciscans in Harvard Library (DBpedia SPARQL)](http://query.linkeddatafragments.org/#datasources=https%3A%2F%2Fdbpedia.org%2Fsparql&transientDatasources=%2F%2Fdata.linkeddatafragments.org%2Fviaf;%2F%2Fdata.linkeddatafragments.org%2Fharvard&query=SELECT%20%3Fperson%20%3Fname%20%3Fbook%20%3Ftitle%20%7B%0A%20%20%3Fperson%20dbpedia-owl%3AbirthPlace%20%5B%20rdfs%3Alabel%20%22San%20Francisco%22%40en%20%5D.%0A%20%20%3FviafID%20schema%3AsameAs%20%3Fperson%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20schema%3Aname%20%3Fname.%0A%20%20%3Fbook%20dc%3Acontributor%20%5B%20foaf%3Aname%20%3Fname%20%5D%3B%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20dc%3Atitle%20%3Ftitle.%0A%7D%20LIMIT%20100) | Timeout | 8469.86 ms (632 requests) |\n| [Compounds in Lindas and 
Rhea](http://query.linkeddatafragments.org/#datasources=sparql%40https%3A%2F%2Flindas.admin.ch%2Fquery;https%3A%2F%2Fsparql.rhea-db.org%2Fsparql&query=PREFIX%20schema%3A%20%3Chttp%3A%2F%2Fschema.org%2F%3E%0ASELECT%20*%20WHERE%20%7B%0A%20%20%3Fsubstance%20a%20schema%3ADefinedTerm%20%3B%0A%20%20%20%20schema%3Aidentifier%20%3Fidentifier%20%3B%0A%20%20%20%20schema%3AinDefinedTermSet%20%3Chttps%3A%2F%2Fld.admin.ch%2Fcube%2Fdimension%2Fel01%3E%20.%0A%20%20%3Fcompound%20%3Chttp%3A%2F%2Frdf.rhea-db.org%2Fformula%3E%20%3Fidentifier%20%3B%0A%20%20%20%20%3Chttp%3A%2F%2Frdf.rhea-db.org%2Fname%3E%20%3Fname%20%3B%0A%20%20%20%20%3Chttp%3A%2F%2Frdf.rhea-db.org%2Faccession%3E%20%3Faccession%20.%0A%7D%0A) | Timeout | 424.57 ms(41 requests) |\n\n### Inspecting source selection results\n\nIf you are interested in understanding how Comunica will split up queries across multiple sources,\nyou can make use of the [logical explain mode](/docs/query/advanced/explain/).\n\nFor example, if we want to execute the following query across three sources\n(https://dbpedia.org/sparql (SPARQL), http://data.linkeddatafragments.org/viaf (TPF), http://data.linkeddatafragments.org/harvard (TPF)),\nthe logical explain mode will show us how this query is split up and assigned to each source.\n\n**Query:**\n```txt\nSELECT ?person ?name ?book ?title {\n ?person dbpedia-owl:birthPlace [ rdfs:label "San Francisco"@en ].\n ?viafID schema:sameAs ?person;\n schema:name ?name.\n ?book dc:contributor [ foaf:name ?name ];\n dc:title ?title.\n} LIMIT 100\n```\n\n**Explain:**\n```txt\ncomunica-sparql \\\n https://dbpedia.org/sparql http://data.linkeddatafragments.org/viaf http://data.linkeddatafragments.org/harvard \\\n -f query.sparql --explain logical\n{\n "type": "slice",\n "input": {\n "type": "project",\n "input": {\n "type": "join",\n "input": [\n {\n "type": "join",\n "input": [\n {\n "type": "union",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": 
"viafID"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://schema.org/sameAs"\n },\n "object": {\n "termType": "Variable",\n "value": "person"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "viafID"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://schema.org/sameAs"\n },\n "object": {\n "termType": "Variable",\n "value": "person"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/viaf)(SkolemID:1)"\n }\n }\n ]\n },\n {\n "type": "union",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "g_1"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://xmlns.com/foaf/0.1/name"\n },\n "object": {\n "termType": "Variable",\n "value": "name"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "g_1"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://xmlns.com/foaf/0.1/name"\n },\n "object": {\n "termType": "Variable",\n "value": "name"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/harvard)(SkolemID:2)"\n }\n }\n ]\n },\n {\n "type": "union",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "book"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": 
"http://purl.org/dc/terms/title"\n },\n "object": {\n "termType": "Variable",\n "value": "title"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "book"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://purl.org/dc/terms/title"\n },\n "object": {\n "termType": "Variable",\n "value": "title"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/harvard)(SkolemID:2)"\n }\n }\n ]\n }\n ]\n },\n {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "person"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://dbpedia.org/ontology/birthPlace"\n },\n "object": {\n "termType": "Variable",\n "value": "g_0"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n },\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "g_0"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://www.w3.org/2000/01/rdf-schema#label"\n },\n "object": {\n "termType": "Literal",\n "value": "San Francisco",\n "language": "en",\n "datatype": {\n "termType": "NamedNode",\n "value": "http://www.w3.org/1999/02/22-rdf-syntax-ns#langString"\n }\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ],\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(https://dbpedia.org/sparql)(SkolemID:0)"\n }\n },\n {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "viafID"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": 
"http://schema.org/name"\n },\n "object": {\n "termType": "Variable",\n "value": "name"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/viaf)(SkolemID:1)"\n }\n }\n ]\n },\n {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "book"\n },\n "predicate": {\n "termType": "NamedNode",\n "value": "http://purl.org/dc/terms/contributor"\n },\n "object": {\n "termType": "Variable",\n "value": "g_1"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern",\n "metadata": {\n "scopedSource": "QuerySourceHypermedia(http://data.linkeddatafragments.org/harvard)(SkolemID:2)"\n }\n }\n ]\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "person"\n },\n {\n "termType": "Variable",\n "value": "name"\n },\n {\n "termType": "Variable",\n "value": "book"\n },\n {\n "termType": "Variable",\n "value": "title"\n }\n ]\n },\n "start": 0,\n "length": 100\n}\n```\n\nThe `scopedSource` annotations on operations show which sources apply to which sources.\nThe above shows that most of the query will be split at triple pattern level to the different sources,\nexcept for the patterns `?person dbpedia-owl:birthPlace [ rdfs:label "San Francisco"@en ].`,\nwhich have been identified as exclusively applying to https://dbpedia.org/sparql,\nwhich can therefore be sent as-is to the SPARQL endpoint.\n\nHereafter, this post will discuss the internal changes in more detail for developers\nthat want to update their implementations to this new architecture.\n\n## \uD83D\uDD0D Query Source Identify bus\n\n[`@comunica/bus-query-source-identify`](https://github.com/comunica/comunica/tree/master/packages/bus-query-source-identify) is a new bus that roughly\nreplace the `@comunica/bus-rdf-resolve-quad-pattern` and 
`@comunica/bus-rdf-resolve-quad-pattern-hypermedia` buses.\nThe main difference is that `@comunica/bus-query-source-identify` runs _before_ query execution within the `@comunica/bus-context-preprocess` bus,\nwhile the old buses ran _during_ query execution.\nRunning things before query execution enables more optimization opportunities,\nwhich enabled the existence of actors such as [`@comunica/actor-optimize-query-operation-filter-pushdown`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-filter-pushdown) and [`@comunica/actor-optimize-query-operation-prune-empty-source-operations`](https://github.com/comunica/comunica/tree/master/packages/actor-optimize-query-operation-prune-empty-source-operations).\n\nIf you had an actor on the `@comunica/bus-rdf-resolve-quad-pattern` or `@comunica/bus-rdf-resolve-quad-pattern-hypermedia` bus,\nit can now be moved to the `@comunica/bus-query-source-identify` or `@comunica/bus-query-source-identify-hypermedia` bus.\nThe main API change here is that sources now need to implement the `IQuerySource` interface,\nthat they need to announce the shape of query operations they support (instead of only quad patterns),\nand that these operations need to be executable within the source.\n\n## \uD83D\uDE8C Query Process bus\n\n[`@comunica/bus-query-process`](https://github.com/comunica/comunica/tree/master/packages/bus-query-process) is a new bus that contains all logic for fully processing a query,\nwhich usually involves steps such as parsing, optimizing, and evaluating, which can be delegated to other buses.\nAll of this logic was previously contained within [`@comunica/actor-init-query`](https://github.com/comunica/comunica/tree/master/packages/actor-init-query),\ntogether with much other boilerplate logic,\nwhich made it very difficult for developers to modify a small part of the query process.\nWith this new bus, developers can more easily plug in custom query process 
actors,\nsuch as _adaptive_ query planners.\n\n## Full changelog\n\nWhile this blog post explained the primary changes in Comunica 3.x,\nthere are many more smaller internal changes that will make your lives easier.\nIf you want to learn more about these changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v301---2024-03-19).\n'},45609:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Release 3.1: \uD83C\uDF31 New package with tiny bundle size'\n---\n\nThe primary addition in this release is the new [`@comunica/query-sparql-rdfjs-lite`](https://www.npmjs.com/package/@comunica/query-sparql-rdfjs-lite) package,\nwhich is optimized for small browser bundle size.\nCurrently, the minified size of this package is 648.88 KB (145.79 KB when gzipped).\nThis is about as small as you can get without removing required functionality from the SPARQL 1.1 spec.\nBut if you don't need everything from SPARQL 1.1, it could get even smaller!\n\n\n\nBesides this, several fixes were applied, and some internal changes were made to our CI to better\n[track the browser bundle size](https://github.com/comunica/comunica/commit/f212b9262f5d2a12a40848f01132299904dc132c)\nand [overall query performance](https://github.com/comunica/comunica/commit/1d8b0d202a7d4728e3692764b33d8795686ce5a0) over time.\n\n## Full changelog\n\nIf you want to learn more about all changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v310---2024-05-11).\n"},72212:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Release 3.2: \uD83D\uDD0E Knowing what to optimize\'\n---\n\nFor this release, we mainly focused on improving tooling to more easily track down performance issues.\nConcretely, we improved our query explain output,\nstarted running multiple benchmarks in our CI to avoid performance regressions,\nand applied several performance improvements that were identified following these 
changes.\n\n\n\n## \uD83D\uDD0E Query explain improvements\n\nComunica has had several [query explain functionalities](/docs/query/advanced/explain/) for a while now,\nto show how a query is parsed, optimized (logical), and executed (physical).\nHowever, the physical plan output tended to be very verbose, which made it difficult to draw conclusions from.\n\nIn this update, the physical plan output has undergone three main changes:\n\n1. The output from joins (especially Bind Joins) is _compacted_, so that recurring patterns in sub-plans are not repeated. Instead, a counter is added showing how many times a certain sub-plan was executed.\n2. The default output is a compact text representation instead of the previous JSON output. (The old JSON representation is still available when passing the `physical-json` explain value.)\n3. Additional metadata is emitted, such as cardinalities and execution times.\n\nFor example, outputs such as the following can now be obtained:\n\n```bash\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT ?movie ?title ?name\nWHERE {\n ?movie dbpedia-owl:starring [ rdfs:label "Brad Pitt"@en ];\n rdfs:label ?title;\n dbpedia-owl:director [ rdfs:label ?name ].\n FILTER LANGMATCHES(LANG(?title), "EN")\n FILTER LANGMATCHES(LANG(?name), "EN")\n}\' --explain physical\n```\n```text\nproject (movie,title,name)\n join\n join-inner(bind) bindOperation:(?g_0 http://www.w3.org/2000/01/rdf-schema#label "Brad Pitt"@en) bindCardEst:~2 cardReal:43 timeSelf:2.567ms timeLife:667.726ms\n join compacted-occurrences:1\n join-inner(bind) bindOperation:(?movie http://dbpedia.org/ontology/starring http://dbpedia.org/resource/Brad_Pitt) bindCardEst:~40 cardReal:43 timeSelf:6.011ms timeLife:641.139ms\n join compacted-occurrences:38\n join-inner(bind) bindOperation:(http://dbpedia.org/resource/12_Monkeys http://dbpedia.org/ontology/director ?g_1) bindCardEst:~1 cardReal:1 timeSelf:0.647ms timeLife:34.827ms\n filter compacted-occurrences:1\n 
join\n join-inner(nested-loop) cardReal:1 timeSelf:0.432ms timeLife:4.024ms\n pattern (http://dbpedia.org/resource/12_Monkeys http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (http://dbpedia.org/resource/Terry_Gilliam http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~1 src:0\n join compacted-occurrences:2\n join-inner(multi-empty) timeSelf:0.004ms timeLife:0.053ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://dbpedia.org/ontology/director ?g_1) cardEst:~0 src:0\n filter cardEst:~5,188,789.667\n join\n join-inner(nested-loop) timeLife:0.6ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n join compacted-occurrences:1\n join-inner(multi-empty) timeSelf:0.053ms timeLife:0.323ms\n pattern (?movie http://dbpedia.org/ontology/director ?g_1) cardEst:~118,505 src:0\n pattern (?movie http://dbpedia.org/ontology/starring http://wikidata.dbpedia.org/resource/Q35332) cardEst:~0 src:0\n filter cardEst:~242,311,843,844,161\n join\n join-inner(symmetric-hash) timeLife:36.548ms\n pattern (?movie http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~20,013,903 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n\nsources:\n 0: QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)\n```\n\n## ⚙️ Continuous performance tracking\n\nIn order to keep better track of the evolution of Comunica\'s performance,\nwe have added continuous performance tracking into our continuous integration.\nFor various benchmarks, we can now see the evolution of execution times across our commit history.\nThis allows us to easily identify which changes have a positive or negative impact on performance.\n\nFor considering the performance for different aspects, we have included the following benchmarks:\n\n- 
WatDiv (in-memory)\n- WatDiv (TPF)\n- Berlin SPARQL Benchmark (in-memory)\n- Berlin SPARQL Benchmark (TPF)\n- Custom web queries: manually crafted queries to test for specific edge cases over the live Web\n\nThis allows us to inspect performance as follows:\n\n
\n \n
\n\n_Fluctuations in the graph are mainly caused by confounding variables in the GitHub Actions environment, such as running on different hardware and runner versions._\n\nThese results can be inspected in [closer detail](https://github.com/comunica/comunica-performance-results), together with separate execution times per query.\n\n## \uD83C\uDFCE️ Performance improvements\n\nThanks to the improvements to our physical query plan output and the continuous performance tracking,\nwe identified several low-hanging opportunities for improving performance:\n\n- [Addition of a hash-based optional join actor](https://github.com/comunica/comunica/commit/de90db0140cd10e2bfdf23c26f9eeff5e94f3ef2)\n- [Tweaking constants of our internal join cost model](https://github.com/comunica/comunica/commit/50333c92ed1cf5410f172f608a213424e510986e)\n- [Making optional hash and bind join only work with common variables](https://github.com/comunica/comunica/commit/df40c20e001121cd0ae9a9adf67ed221dc2966ba)\n\nBesides these changes, we have many more performance-impacting changes in the pipeline for upcoming releases!\n\n## Full changelog\n\nBesides this, several fixes were applied, as well as various changes and additions.\nIf you want to learn more about all changes, check out the [full changelog](https://github.com/comunica/comunica/blob/master/CHANGELOG.md#v320---2024-07-05).\n'},4950:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Contribute'\ndescription: 'Contribute to the development of Comunica.'\n---\n\n## Report bugs or request features\n\nThe easiest way to contribute to Comunica is by **reporting the bugs** you encounter,\nand **requesting new features** or enhancements.\n\nBoth of these should be done via [**GitHub issues**](https://github.com/comunica/comunica/issues).\nMake sure to be as descriptive as possible, and completely fill in the requested template.\n\n## Fix bugs or implement new features\n\nIf there is a certain bug that annoys you,\nor if you see the 
opportunity for a new feature that would make your life easier,\nyou are welcome to contribute by submitting a **pull request**.\nBefore you open a pull request, it is considered good practice to first\n[open an issue](https://github.com/comunica/comunica/issues) or [discuss it with the community](/ask/).\n\nDon't know where to get started? Have a look at issues tagged with the [`good-first-issue`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue) label\nor the [`dev-ready`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Adev-ready) label.\nIssues tagged with [`good-first-issue`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Agood-first-issue) are issues that should be implementable by new contributors.\nIssues tagged with [`dev-ready`](https://github.com/comunica/comunica/issues?q=is%3Aissue+is%3Aopen+label%3Adev-ready) are potentially harder issues, but they are directly implementable without research.\n\nWhen contributing, make sure to keep in mind the following:\n* Read how to [set up a development environment](https://github.com/comunica/comunica#development-setup).\n* Read the guide on [contributing an actor](/docs/modify/getting_started/contribute_actor/).\n* Commit messages:\n * [Use descriptive, imperative commit messages](https://chris.beams.io/posts/git-commit/). These commit messages will be used as input for release changelogs. Have a look at the [commit history](https://github.com/comunica/comunica/commits/master) for examples.\n * Commit messages should include a reference to relevant issues. 
For example, `Closes #123`, or `Related to #456`.\n* Pull requests should pass all checks\n * Unit tests with 100% branching coverage (`yarn test`)\n * Clean code with passing linter (`yarn run lint`)\n * Code documentation\n * [Pass all spec and integration tests](/docs/modify/advanced/testing/)\n * Signing the [Contributor License Agreement](https://cla-assistant.io/comunica/comunica)\n* Only add the files that are needed, so don't blindly do a `git add -A`. (avoid adding editor-specific files)\n* A good editor can make your life a lot easier. For example, [WebStorm](https://www.jetbrains.com/community/education/#students) can be used for free with an academic license.\n* All JSdoc can be found at https://comunica.github.io/comunica/\n\nTips and tricks:\n* Only do `yarn install` in the repo root, and *never* in one of the sub-packages, as this can break your repo.\n* `yarn run build` will (re)build all TypeScript to JavaScript and generate Components.js files. These can also be invoked separately via `yarn run build:ts` and `yarn run build:components`. These can also be executed at the package level.\n* `yarn run build-watch` will continuously build TypeScript to JavaScript and generate Components.js files, which is useful during development. These can also be invoked separately via `yarn run build-watch:ts` and `yarn run build-watch:components`.\n* `yarn test` and `yarn run lint` execute the tests and linter checks locally. Before a PR is opened, these must always pass, and testing coverage must be 100%.\n* When editing configuration files in packages like `query-sparql`, `yarn run prepare` can be executed to compile the JSON files to JavaScript before they can be executed. 
(not needed when executing dynamically)\n* When modifying a dependency package such as [sparqlee](https://github.com/comunica/sparqlee), [Yarn's link functionality](https://classic.yarnpkg.com/en/docs/cli/link/) can be used to force your local version of that dependency to be used in Comunica.\n\n## Write documentation\n\nThis website aims to provide detailed documentation on how to use and modify Comunica.\nIf you see an opportunity for improving this documentation, fixing mistakes, or adding new guides,\nyou are welcome to contribute via [GitHub](https://github.com/comunica/website).\n\n## Create example code\n\nThe [Comunica examples repository](https://github.com/comunica/examples) contains several example packages that modify Comunica,\nwith details on how they are created and how they work.\nAnyone is more than welcome to contribute new example packages to this repository.\nFor inspiration, you can have a look at the [example requests](https://github.com/comunica/examples/issues?q=is%3Aissue+is%3Aopen+label%3Aexample-request).\n\n## Guidelines for core developers\n\nThe following guidelines only apply to people with push access to the Comunica repositories.\n\n### Branching Strategy\n\nThe `master` branch is the main development branch.\n\nReleases are `tags` on the `master` branch.\n\nAll changes (features and bugfixes) must be done in a separate branch, and PR'd to `master`.\n\nRecursive features must be PR'd to their parent feature branches, as a feature can consist of multiple smaller features.\n\nThe naming strategy of branches is as follows:\n* Features: `feature/short-name-of-feature`\n* Bugfixes: `fix/short-name-of-fix`\n\n### Issue Strategy\n\nIssues should be assigned to people when possible, and must be progressed using the applicable GitHub project boards:\n\n* [Maintenance](https://github.com/orgs/comunica/projects/2)\n* [Development](https://github.com/orgs/comunica/projects/3)\n* 
[Documentation](https://github.com/orgs/comunica/projects/4)\n\nGeneral issue progress:\n\n1. Triage: If the issue is not yet accepted or assigned.\n2. To Do (3 levels of priority): When the issue is accepted and assigned, but not in progress yet.\n3. In Progress: When the issue is being worked on by the assignee, or is under review.\n4. Done: When the issue is resolved and reviewed. If attached to a PR, this can be merged, or closed otherwise.\n5. On hold: If the issue is awaiting external input.\n\n### Merging Pull Requests\n\nAll PRs must pass the following checklist:\n\n* All CI checks must pass. For unit tests, this includes 100% coverage, and coverage lines should not be skipped.\n* The PR must be approved by at least 2 [core maintainers](https://comunica.dev/association/board/).\n * If more than a week goes by, then the approval of 1 core maintainer is sufficient, unless another core maintainer explicitly indicated the desire for later review.\n * The codebase curator can always merge immediately.\n* If commits don't meet the commit message guidelines from above, the \"Squash and merge\" functionality of GitHub must be used, and a new commit message must be created. Otherwise, PRs can be merged via the \"Rebase\" button.\n\n### Making a new release\n\nMaking a new release only requires invoking `yarn run publish-release` from the repository root, which does the following using [lerna](https://github.com/lerna/lerna):\n\n* Prompts you for the new version (major, minor, patch).\n* Bumps the versions of all changed packages.\n* [Generates a changelog](https://github.com/rubensworks/manual-git-changelog.js) from all commits since the last release. The process will halt until you modify (and save) the changelog where needed (remove unneeded commits, and categorize them), and confirm by pressing any key in the console.\n* Releases all changed packages to npm.\n* Pushes the tag to GitHub.\n* Pushes to master.\n\n
\nIf publication fails due to a random npm server error,\nyou can invoke the [`retry-publish.sh`](https://github.com/comunica/comunica/blob/master/.github/retry-publish.sh) script to retry the publication.\nThis script can be safely called multiple times.\nYou may have to stash your repo first.\n
\n\n### Making a new pre-release\n\nMaking a new pre-release only requires invoking `yarn run publish-canary` from the repository root, which does the following using [lerna](https://github.com/lerna/lerna):\n\n* Temporarily does a patch release increment on all packages in the form of `-alpha..0`.\n* Releases all packages to npm with the `next` tag.\n* Undoes the temporary changes.\n\nPre-releases do not trigger changelog changes, git commits, or pushes.\n\nIf the lerna script exits with an error, you may notice some issues with git. In that case, make sure to execute the following:\n\n```bash\ngit update-index --no-assume-unchanged $(git ls-files | tr '\\n' ' ') && git checkout .\n```\n"},75835:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Documentation'\ndescription: 'Overview of all Comunica documentation.'\nindex: true\n---\n\nYou can either use Comunica for executing queries, or modify it to suit your specific goals.\n\nLooking for the [code documentation](https://comunica.github.io/comunica/) instead?\n\n
\nWatch some of these guides in action live within this Webinar recording.\n
\n"},17642:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Query with Comunica'\ndescription: 'Learn how to execute queries in different environments, such as live in the browser, in JavaScript applications, or the CLI.'\nindex: true\n---\n\nThe following guides explain how to execute queries in different environments,\nsuch as live in the browser, in JavaScript applications, or the CLI.\n\n
\nWatch some of these guides in action live within this Webinar recording.\n
\n"},62712:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Getting started with querying'\ndescription: 'Basic guides on how to easily get started with querying.'\nindex: true\n---\n\nThe following guides explain some basic ways in which you can use Comunica for querying.\n"},97750:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying from the command line\'\ndescription: \'Execute SPARQL queries directly from the command line.\'\n---\n\nThe default Comunica query engine that exposes most standard features is Comunica SPARQL,\nwhich uses the package name `@comunica/query-sparql`.\nIn this guide, we will install it _globally_, and show how it can be invoked from the command line.\n\n
\nWatch part of this guide in action live within this Webinar recording.\n
\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install Comunica SPARQL on our machine:\n```bash\n$ npm install -g @comunica/query-sparql\n```\n\n## 2. SPARQL querying over one source\n\nAfter installing Comunica SPARQL, you will be given access to several commands including `comunica-sparql`,\nwhich allows you to execute SPARQL queries from the command line.\n\nThis command requires one or more URLs to be provided as **sources** to query over.\nAs the last argument, a **SPARQL query string** can be provided.\n\nFor example, the following query retrieves the first 100 triples from [DBpedia](https://fragments.dbpedia.org/2016-04/en):\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n
\nGiven a URL, Comunica will automatically detect the type of source and handle it accordingly.\n
\n\nAs output, a JSON array of bindings for the selected variables will be returned:\n```\n[\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/1999/02/22-rdf-syntax-ns#type","?o":"http://rdfs.org/ns/void#datasource"},\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/1999/02/22-rdf-syntax-ns#type","?o":"http://www.w3.org/ns/hydra/core#Collection"},\n{"?s":"https://fragments.dbpedia.org/2016-04/en#dataset","?p":"http://www.w3.org/ns/hydra/core#search","?o":"https://fragments.dbpedia.org/2016-04/en#triplePattern"}\n...\n``` \n\n## 3. Query file input\n\nSince SPARQL queries can sometimes become very large, it is possible to supply them via a local file using the `-f` option.\n\nAssuming a file `path/myquery.sparql` exists, we can query over it as follows:\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en -f path/myquery.sparql\n```\n\n## 4. SPARQL querying over multiple sources\n\nOne key feature of Comunica is its ability to query over **multiple sources**.\nFor this, you can just supply any number of URLs as arguments.\nJust make sure that the last argument remains your query.\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/ \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n## 5. SPARQL CONSTRUCT and ASK\n\nNext to SPARQL `SELECT` queries,\nit is also possible to execute `CONSTRUCT` queries to produce RDF triples:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n```text\n "2010-04-21"^^;\n "1939-01-02"^^;\n "PDF";\n ;\n "Sheboygan, Wisconsin";\n "1";\n...\n```\n\n`ASK` queries will produce a boolean output:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "ASK { ?s ?p ?o }"\n```\n```\ntrue\n```\n\n## 6. 
Changing result format\n\n`SELECT` queries will be printed as JSON by default, and `CONSTRUCT` queries as [RDF TriG](https://www.w3.org/TR/trig/).\nThis can be overridden using the `-t` option.\n\nFor example, displaying results as SPARQL JSON results:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100" \\\n -t \'application/sparql-results+json\'\n```\n```json\n{"head": {"vars":["s","p","o"]},\n"results": { "bindings": [\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/date","type":"uri"},"o":{"value":"1899-05-06","type":"literal","datatype":"http://www.w3.org/2001/XMLSchema#date"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/isCitedBy","type":"uri"},"o":{"value":"http://dbpedia.org/resource/Tierce_(unit)","type":"uri"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/newspaper","type":"uri"},"o":{"value":"Biloxi Daily 
Herald","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/page","type":"uri"},"o":{"value":"6","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/title","type":"uri"},"o":{"value":"A New System of Weights and Measures","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/url","type":"uri"},"o":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"}},\n...\n``` \n\n
\nAll available formats can be printed via `comunica-sparql --listformats`.\n
\n\n## 7. Printing the query plan\n\nUsing the `--explain` option, the query plan can be printed via [different explain modes](/docs/query/advanced/explain/).\n\n## 8. Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql`.\nYou can learn more options by invoking the _help_ command:\n```text\n$ comunica-sparql evaluates SPARQL queries\n\nRecommended options:\n -q, --query Evaluate the given SPARQL query string [string]\n -f, --file Evaluate the SPARQL query in the given file [string]\n -i, --inputType Query input format (e.g., graphql, sparql) [string] [default: "sparql"]\n -t, --outputType MIME type of the output (e.g., application/json) [string]\n\nOptions:\n -c, --context Use the given JSON context string or file (e.g., config.json) [string]\n --to Destination for update queries [string]\n -b, --baseIRI base IRI for the query (e.g., http://example.org/) [string]\n -d, --dateTime Sets a datetime for querying Memento-enabled archives [string]\n -l, --logLevel Sets the log level (e.g., debug, info, warn, ...) [string] [default: "warn"]\n --lenient If failing requests and parsing errors should be logged instead of causing a hard crash [boolean]\n -v, --version Prints version information [boolean]\n --showStackTrace Prints the full stacktrace when errors are thrown [boolean]\n --httpTimeout HTTP requests timeout in milliseconds [number]\n --httpBodyTimeout Makes the HTTP timeout take into account the response body stream read [boolean]\n --httpRetryCount The number of retries to perform on failed fetch requests [number]\n --httpRetryDelay The number of milliseconds to wait between fetch retries [number]\n --httpRetryOnServerError If fetch should be retried on 5xx server error responses, instead of being resolved. 
[boolean]\n --unionDefaultGraph If the default graph should also contain the union of all named graphs [boolean]\n --noCache If the cache should be disabled [boolean]\n --distinctConstruct If the query engine should deduplicate resulting triples [boolean]\n -p, --proxy Delegates all HTTP traffic through the given proxy (e.g. http://myproxy.org/?uri=) [string]\n --listformats Prints the supported MIME types [boolean]\n --explain Print the query plan [string] [choices: "parsed", "logical", "physical"]\n --localizeBlankNodes If blank nodes should be localized per bindings entry [boolean]\n -r, --recoverBrokenLinks Use the WayBack machine to recover broken links [boolean] [default: false]\n\nExamples:\n comunica-sparql https://fragments.dbpedia.org/2016-04/en -q \'SELECT * { ?s ?p ?o }\'\n comunica-sparql https://fragments.dbpedia.org/2016-04/en -f query.sparql\n comunica-sparql https://fragments.dbpedia.org/2016-04/en https://query.wikidata.org/sparql ...\n comunica-sparql hypermedia@https://fragments.dbpedia.org/2016-04/en sparql@https://query.wikidata.org/sparql ...\n```\n'},20919:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Updating from the command line\'\ndescription: \'Execute SPARQL Update queries directly from the command line.\'\n---\n\nComunica SPARQL (`@comunica/query-sparql`) allow you to initiate queries to _update_ data in a certain store.\nIn this guide, we will build upon [the guide on querying from the command line](/docs/query/getting_started/query_cli/),\nand show how you can not only read, but also update data.\n\n
\nAt the time of writing, not all possible destination types may be supported yet.\n
\n\n## 1. Updating one source\n\nUsing the `comunica-sparql` command line tool,\nyou can invoke not only read queries, but also update queries.\n\nAssuming you pass just one source,\nthis source will also be assumed to be the destination for update queries.\n\nFor example, the following query appends a single triple to `https://example.org/myfile.ttl`:\n```bash\n$ comunica-sparql https://example.org/myfile.ttl \\\n "INSERT DATA { }"\n```\n\n
\nGiven a URL, Comunica will automatically detect the type of destination and handle it accordingly.\n
\n\nAs output, `ok` will be printed if the update was successful:\n```\nok\n``` \n\n## 2. Updating a different destination\n\nWhile Comunica supports querying over **multiple sources**,\nit only supports updating **a single destination**.\n\nTherefore, if you are querying over multiple sources,\nbut you want to pass the results to a single destination,\nthen you must explicitly define this destination using the `--to` option.\n\nFor example, the following query takes the first 100 triples from 3 sources,\nand inserts them into `https://example.org/myfile.ttl`:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/ \\\n --to https://example.org/myfile.ttl \\\n "INSERT { ?s ?p ?o. } WHERE { SELECT * WHERE { ?s ?p ?o } LIMIT 100 }"\n```\n\n
\nHere too, the type of destination is automatically detected,\nand it can also be overridden.\n
\n'},48016:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying local files from the command line\'\ndescription: \'Execute SPARQL queries over local RDF files directly from the command line.\'\n---\n\nUsing Comunica SPARQL File, you can query over RDF files that are stored on your local machine.\n\n
\nWhile Comunica SPARQL allows you to query sources exposed via URLs on the command line,\nit does not allow you to query local RDF files.\nThis is because Comunica SPARQL can be used in a variety of use cases, of which deployment on a public server is one.\nIn some of these cases, the ability to access the local file system can pose a major security risk,\nwhich is why we require the use of a separate package.\n
\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install Comunica SPARQL File on our machine:\n```bash\n$ npm install -g @comunica/query-sparql-file\n```\n\n## 2. SPARQL querying over one local file\n\nAfter installing Comunica SPARQL File, you will be given access to several commands including `comunica-sparql-file`,\nwhich allows you to execute SPARQL queries from the command line.\n\nJust like `comunica-sparql`, this command requires one or more URLs to be provided as **sources** to query over.\nAs the last argument, a **SPARQL query string** can be provided.\n\nFor example, the following query retrieves the first 100 triples from `path/to/my/file.ttl`:\n```bash\n$ comunica-sparql-file path/to/my/file.ttl \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n## 3. SPARQL querying over one remote file\n\nNext to local files, _remote_ files identified by a URL can also be queried:\n```bash\n$ comunica-sparql-file https://www.rubensworks.net/ \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\n## 4. Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql-file`.\nYou can learn more options by invoking the _help_ command, or by [reading the Comunica SPARQL documentation](/docs/query/getting_started/query_cli/):\n```text\n$ comunica-sparql-file --help\n```\n'},95276:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying in a JavaScript app'\ndescription: 'Execute SPARQL queries from within your application using the JavaScript API.'\n---\n\nThe default Comunica query engine that exposes most standard features is Comunica SPARQL,\nwhich uses the package name `@comunica/query-sparql`.\nIn this guide, we will install it as a dependency in a [Node.js](https://nodejs.org/en/) JavaScript application,\nand show how it can be used to execute queries.\n\n
\nWatch part of this guide in action live within this Webinar recording.\n
\n\n## 1. Installation\n\n
\nThis assumes you already have an npm package.\nIf you don't have one yet, create one using `npm init`.\nYou will also need a JavaScript file to write in, such as `main.js`.\n
\n\nIn order to add Comunica SPARQL as a _dependency_ to your [Node.js](https://nodejs.org/en/) application,\nwe can execute the following command:\n```bash\n$ npm install @comunica/query-sparql\n```\n\n## 2. Creating a new query engine\n\nThe easiest way to create an engine is as follows:\n\n```javascript\nconst QueryEngine = require('@comunica/query-sparql').QueryEngine;\n\nconst myEngine = new QueryEngine();\n```\n\nYou can reuse an engine as often as you want.\nThis is especially valuable if you repeatedly query over the same sources,\nas [caching](/docs/query/advanced/caching/) can be performed.\n\n## 3. Executing SPARQL SELECT queries\n\nOnce your engine has been created, you can use it to execute any SPARQL query, such as a `SELECT` query:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n});\n```\n\nThe first argument of `queryBindings()` is a SPARQL query string,\nand the second argument is a [query context](/docs/query/advanced/context/) containing options,\nwhich must at least contain an array of sources to query over.\n\nThe resulting `bindingsStream` is a stream of **bindings**,\nwhere each binding contains values for the selected variables (`?s ?p ?o`).\n\n
\nWhile `sources` is the only required option in the query context,\nadditional options can be passed\nto tweak how the engine executes the query.\n
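As an illustration, a query context is just a plain JavaScript object. This is a sketch, not an exhaustive list of options; the `lenient` field shown here mirrors the `--lenient` CLI flag and is only one example of an optional field:

```javascript
// A query context is a plain JavaScript object.
// 'sources' is the only required field; all other fields are optional.
// 'lenient' (mirroring the --lenient CLI flag) logs failing requests
// and parsing errors instead of causing a hard crash.
const context = {
  sources: ['https://fragments.dbpedia.org/2015/en'],
  lenient: true,
};

// The context is passed as the second argument:
// myEngine.queryBindings(queryString, context);
console.log(Object.keys(context).join(',')); // prints "sources,lenient"
```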
\n\n### 3.1 Consuming binding results as a stream\n\nThe most efficient way to make use of the result,\nis by adding a **data-listener** to the `bindingsStream`:\n```javascript\nbindingsStream.on('data', (binding) => {\n console.log(binding.toString()); // Quick way to print bindings for testing\n\n console.log(binding.has('s')); // Will be true\n \n // Obtaining values\n console.log(binding.get('s').value);\n console.log(binding.get('s').termType);\n console.log(binding.get('p').value);\n console.log(binding.get('o').value);\n});\n```\n\nThe data-listener will be invoked _for each resulting binding_,\nas soon as the query engine has detected it.\nThis means that the data-listener can be invoked many times during query execution,\neven if not all results are available yet.\n\nEach `binding` is an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object\nthat contains mappings from variables to RDF terms.\nVariable names can either be obtained by string label (without the `?` prefix) or via [RDF/JS](/docs/query/advanced/rdfjs/) variable objects,\nand bound RDF terms are represented as [RDF/JS](/docs/query/advanced/rdfjs/) terms.\nLearn more about the usage of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\nTo find out when the query execution has **ended**,\nand all results are passed to the data-listener,\nan **end-listener** can be attached as well.\n```javascript\nbindingsStream.on('end', () => {\n // The data-listener will not be called anymore once we get here.\n});\n```\n\nIt is also considered good practise to add an **error-listener**,\nso you can detect any problems that have occurred during query execution:\n```javascript\nbindingsStream.on('error', (error) => {\n console.error(error);\n});\n```\n\n### 3.2 Consuming binding results as an async iterable\n\nUsing a for-await loop, you can consume bindings as an [async 
iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#the_async_iterator_and_async_iterable_protocols).\nWhile this is more compact than the stream-based approach, it may lead to a slightly lower level of performance:\n\n```javascript\nfor await (const bindings of bindingsStream) {\n console.log(bindings.get('s').value);\n console.log(bindings.get('s').termType);\n}\n```\n\n### 3.3 Consuming binding results as an array\n\nIf performance is not an issue in your application,\nor you just want the results in a simple array,\nthen you can call the asynchronous `toArray()` method on the `bindingsStream`:\n\n```javascript\nconst bindings = await bindingsStream.toArray();\n\nconsole.log(bindings[0].get('s').value);\nconsole.log(bindings[0].get('s').termType);\n```\n\nThis method will return asychronously (using `await`) as soon as _all_ results have been found.\nIf you have many results, it is recommended to consume results iteratively via a data listener instead.\n\nEach binding in the array is again an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object.\n\nIf you want to limit the number of results in the array, you can optionally pass a limit:\n```javascript\nconst bindings = await bindingsStream.toArray({ limit: 100 });\n```\n\n## 4. Executing queries over multiple sources\n\nQuerying over more than one source is trivial,\nas any number of sources can easily be passed via an array:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\n 'http://fragments.dbpedia.org/2015/en',\n 'https://www.rubensworks.net',\n 'https://ruben.verborgh.org/profile/',\n ],\n});\n```\n\n## 5. 
Executing SPARQL CONSTRUCT queries\n\nNext to `SELECT` queries, you can also execute a `CONSTRUCT` query to generate RDF quads/triples:\n```javascript\nconst quadStream = await myEngine.queryQuads(`\n CONSTRUCT WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n```\n\n### 5.1 Consuming quad results as a stream\n\nThe most efficient way to make use of the resulting RDF quads,\nis by adding a **data-listener** to the `quadStream`:\n```javascript\nquadStream.on('data', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\nThe data-listener will be invoked _for each constructed RDF triple/quad_,\nas soon as the query engine has created it.\nThis means that the data-listener can be invoked many times during query execution,\neven if not all results are available yet.\n\nEach `quad` is an [RDF/JS](/docs/query/advanced/rdfjs/) quad,\nwhich contains `subject`, `predicate`, `object` and `graph` terms.\n\nJust like `bindingsStream`, an **end-listener** and **error-listener** can also be attached:\n\n```javascript\nquadStream.on('end', () => {\n // The data-listener will not be called anymore once we get here.\n});\nquadStream.on('error', (error) => {\n console.error(error);\n});\n```\n\n### 5.2 Consuming quad results as an async iterable\n\nJust like with binding results,\nquads can also be consumed using for-await:\n\n```javascript\nfor await (const quad of quadStream) {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n}\n```\n\n### 5.3 Consuming quad results as an array\n\nJust like with binding results,\nif performance is not an issue in your application,\nor you just want the results in a simple array,\nthen you can call the asynchronous `toArray()` method on the `quadStream`:\n\n```javascript\nconst quads = await 
quadStream.toArray();\n\nconsole.log(quads[0].subject.value);\nconsole.log(quads[0].predicate.value);\nconsole.log(quads[0].object.value);\nconsole.log(quads[0].graph.value);\n```\n\nThis method will return asychronously (using `await`) as soon as _all_ results have been found.\nIf you have many results, it is recommended to consume results iteratively via a data listener instead.\n\nEach `quad` is again an [RDF/JS](/docs/query/advanced/rdfjs/) quad,\nwhich contain `subject`, `predicate`, `object` and `graph` terms.\n\n## 6. Executing SPARQL ASK queries\n\nOne of the simplest forms SPARQL is the ASK query,\nwhich can be executed in Comunica as follows:\n```javascript\nconst hasMatches = await myEngine.queryBoolean(`\n ASK {\n ?s ?p \n }`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n})\n```\n\nThe value of `hasMatches` indicates if the query has at least one result. \n\n## 7. Executing a generic query\n\nIf you don't know beforehand if your query is a `SELECT`, `CONSTRUCT`, or `ASK` (e.g. if your app accepts queries via user input),\nthen you can make use of the generic `query` method that supports all query types:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n\nif (result.resultType === 'bindings') {\n const bindingsStream = await result.execute();\n\n bindingsStream.on('data', (binding) => {\n console.log(binding.toString());\n });\n}\n```\n\nThe resulting object represents a _future_ to the query results.\nIf has a field `resultType` that indicates the query and result type, which can be `'bindings'`, `'quads'`, `'boolean'`, or `'void'`.\nThe asynchronous `execute` method effectively executes the query, and returns a result depending on the `resultType`, corresponding to the `queryBindings`, `queryQuads`, ... 
methods.\nFor example, if the result type is `'bindings'`, then the return type of `execute` will be a bindings stream.\n\nOptionally, you can also obtain metadata about the results via this `query` method for the `'bindings'` and `'quads'` result types:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\n\nif (result.resultType === 'bindings') {\n const metadata = await result.metadata();\n console.log(metadata.cardinality);\n console.log(metadata.canContainUndefs);\n}\n```\n\n## 8. Serializing to a specific result format\n\nIf you want your application to output query results in a certain text-based format,\njust like [executing Comunica on the command line](/docs/query/getting_started/query_cli/),\nthen you can make use of the `resultToString()` method.\n\nFor example, serializing to SPARQL JSON can be done as follows:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p .\n ?s ?p ?o\n } LIMIT 100`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n});\nconst { data } = await myEngine.resultToString(result,\n 'application/sparql-results+json');\ndata.pipe(process.stdout); // Print to standard output\n```\n\nThe `resultToString()` method accepts a query result and a result format media type.\nThe media type is optional, and will default to `application/json` for bindings, `application/trig` for quads, and `simple` for booleans.\n\n
\nAll available result formats can be retrieved programmatically\nby invoking the asynchronous `getResultMediaTypes()` method.\n
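`getResultMediaTypes()` resolves to a record that maps media type identifiers to numerical priorities. As a sketch of how such a record can be used (the record below is hypothetical, not actual Comunica output), you could pick the highest-priority format like this:

```javascript
// Hypothetical priority record, shaped like the object that
// getResultMediaTypes() resolves to (media type -> priority).
const mediaTypePriorities = {
  'application/sparql-results+json': 0.8,
  'application/json': 1,
  'text/csv': 0.75,
  'stats': 0.5,
};

// Sort descending by priority and take the best media type.
const preferred = Object.entries(mediaTypePriorities)
  .sort(([, a], [, b]) => b - a)[0][0];

console.log(preferred); // 'application/json'
```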
\n"},92421:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Updating in a JavaScript app\'\ndescription: \'Execute SPARQL Update queries from within your application using the JavaScript API.\'\n---\n\nComunica SPARQL (`@comunica/query-sparql`) allow you to initiate queries to _update_ data in a certain store.\nIn this guide, we will build upon [the guide on querying in a JavaScript app](/docs/query/getting_started/query_app/),\nand show how you can not only read, but also update data.\n\n## 1. Creating a new query engine and store\n\nThe easiest way to create an engine and store is as follows:\n\n```javascript\nconst QueryEngine = require(\'@comunica/query-sparql\').QueryEngine;\nconst N3 = require(\'n3\');\n\nconst myEngine = new QueryEngine();\n\nconst store = new N3.Store();\n```\n\nWe make use of the [`Store` from `N3.js`](https://github.com/rdfjs/N3.js#storing) for these examples.\n\n## 2. Executing INSERT DATA queries\n\nOnce you engine has been created, you can use it to execute any SPARQL Update query, such as a `INSERT DATA` query:\n```javascript\n// Initiate the update\nawait myEngine.queryVoid(`\n PREFIX dc: \n INSERT DATA\n { \n dc:title "A new book" ;\n dc:creator "A.N.Other" .\n }`, {\n sources: [ store ],\n});\n\n// Prints \'2\' => the store is updated\nconsole.log(store.size);\n```\n\n## 3. 
Executing DELETE/INSERT WHERE queries\n\n`DELETE/INSERT WHERE` queries allow you to delete and insert new quads,\nbased on quads that are already available:\n\n```javascript\n// Insert initial data\nawait myEngine.queryVoid(`\n PREFIX foaf: \n INSERT DATA\n { \n foaf:givenName "Bill" .\n foaf:familyName "McKinley" .\n foaf:givenName "Bill" .\n foaf:familyName "Taft" .\n foaf:givenName "Bill" .\n foaf:familyName "Clinton" .\n }`, {\n sources: [ store ],\n});\n\n// Rename all occurrences of "Bill" to "William"\nawait myEngine.queryVoid(`\n PREFIX foaf: \n DELETE { ?person foaf:givenName \'Bill\' }\n INSERT { ?person foaf:givenName \'William\' }\n WHERE\n {\n ?person foaf:givenName \'Bill\' \n }`, {\n sources: [ store ],\n});\n```\n\n
\nFor more information on the types of update queries that are possible, \nplease refer to the SPARQL Update specification.\n
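Conceptually, a `DELETE`/`INSERT` `WHERE` update first matches the `WHERE` clause against the store, and then applies the `DELETE` and `INSERT` templates for every match. A minimal plain-JavaScript sketch of that semantics over an array of triples (purely illustrative, with made-up identifiers; not Comunica's actual implementation):

```javascript
// A tiny in-memory "store" of subject/predicate/object triples.
let store = [
  { s: 'ex:p25', p: 'foaf:givenName', o: 'Bill' },
  { s: 'ex:p25', p: 'foaf:familyName', o: 'McKinley' },
  { s: 'ex:p42', p: 'foaf:givenName', o: 'Bill' },
];

// WHERE { ?person foaf:givenName 'Bill' } -> one binding per match
const bindings = store
  .filter(t => t.p === 'foaf:givenName' && t.o === 'Bill')
  .map(t => ({ person: t.s }));

// DELETE { ?person foaf:givenName 'Bill' } for every binding
for (const { person } of bindings) {
  store = store.filter(
    t => !(t.s === person && t.p === 'foaf:givenName' && t.o === 'Bill'));
}

// INSERT { ?person foaf:givenName 'William' } for every binding
for (const { person } of bindings) {
  store.push({ s: person, p: 'foaf:givenName', o: 'William' });
}

const givenNames = store.filter(t => t.p === 'foaf:givenName').map(t => t.o);
console.log(givenNames); // [ 'William', 'William' ]
```

The key point is that the `WHERE` bindings are computed once, before any deletions or insertions take place, so the update cannot match its own inserted data.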
\n\n## 4. Configure a custom destination\n\nBy default, update queries will modify data within the given source.\nIn some cases, you may want to direct changes to another place.\nFor example, if you have multiple sources, but you want to direct all changes to a single source.\n\nThis can be done by passing a `destination` into the query context:\n```javascript\n// Insert friends based on common friends from Ruben\'s profiles\nawait myEngine.queryVoid(`\n PREFIX foaf: <http://xmlns.com/foaf/0.1/>\n INSERT\n {\n <https://www.rubensworks.net/#me> foaf:knows ?friend\n }\n WHERE\n {\n <https://www.rubensworks.net/#me> foaf:knows ?friend .\n <https://ruben.verborgh.org/profile/#me> foaf:knows ?friend . \n }`, {\n sources: [\n \'https://www.rubensworks.net/\',\n \'https://ruben.verborgh.org/profile/\',\n ],\n destination: store,\n});\n```\n\n
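The update above effectively computes the friends both profiles have in common and writes them to the destination. The same logic in plain JavaScript, with hypothetical friend lists, just to illustrate what the engine evaluates:

```javascript
// Hypothetical foaf:knows values from the two source profiles.
const knowsA = ['ex:alice', 'ex:bob', 'ex:carol'];
const knowsB = ['ex:bob', 'ex:carol', 'ex:dave'];

// The WHERE clause only matches ?friend values present in both profiles.
const knowsBSet = new Set(knowsB);
const commonFriends = knowsA.filter(friend => knowsBSet.has(friend));

// The INSERT template produces one new quad per match for the destination.
const inserted = commonFriends.map(friend =>
  ({ subject: 'ex:me', predicate: 'foaf:knows', object: friend }));

console.log(commonFriends); // [ 'ex:bob', 'ex:carol' ]
```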
\n'},51839:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying in a JavaScript browser app'\ndescription: 'Execute SPARQL queries from within your client-side browser application using the JavaScript API.'\n---\n\nComunica can run in both [Node.js JavaScript applications](/docs/query/getting_started/query_app/),\nand as **client-side applications in Web browsers**.\n\n## 1. Using a pre-built version\n\nThe easiest way to use Comunica in your Web app,\nis by using a pre-built Comunica SPARQL version that is served via a GitHub CDN:\n```html\n\n\n```\n\n
\nThe code example above will always make use of the latest Comunica version in the 2.x.x range.\nInstead, you can use a specific version.\n
\n\nThe full API of Comunica is available under the `Comunica` namespace.\nMore information on its usage can be found in the guide on\n[using Comunica in a JavaScript app](/docs/query/getting_started/query_app/).\n\n## 2. Bundling for the browser\n\nComunica is compatible with browser bundler tools such as [Webpack](https://www.npmjs.com/package/webpack)\nand [browserify](http://browserify.org/).\nIf you are not familiar with these tools,\nyou can read the following guides:\n* [Webpack: Creating a Bundle – getting started](https://webpack.js.org/guides/getting-started/#creating-a-bundle)\n\nYou will need to create a \"UMD bundle\" and supply a name (e.g. with the -s Comunica option in browserify).\n\n
\nRefer to our specific guide on\nbuilding for the browser\nif you want to build specific configurations of Comunica for the browser.\n
\n"},26884:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Querying from a Docker container\'\ndescription: \'Execute SPARQL queries within a Docker container.\'\n---\n\n
\n \n
\n\nIf for whatever reason you are unable or unwilling to install Node.js,\nthen you can make use of Comunica via [**Docker containers**](https://www.docker.com/) instead.\n\nComunica SPARQL can be used via Docker through the [`comunica/query-sparql` Docker image](https://hub.docker.com/r/comunica/query-sparql):\n```bash\n$ docker run -it --rm comunica/query-sparql \\\n https://fragments.dbpedia.org/2015-10/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nThe signature of this command is identical to the [`comunica-sparql` command](/docs/query/getting_started/query_cli/).\n\nBy default, the latest (stable) released version will be pulled and started.\nIf you want to make use of the latest development version,\nwhich is updated upon each new commit in the [Comunica GitHub repository](https://github.com/comunica/comunica),\nthen the `dev` tag can be used:\n```bash\n$ docker run -it --rm comunica/query-sparql:dev \\\n https://fragments.dbpedia.org/2015-10/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nA new Docker tag is also created upon each new release,\nso you can select a fixed version of Comunica if needed,\nsuch as version 1.14.0:\n```bash\n$ docker run -it --rm comunica/query-sparql:1.14.0 \\\n https://fragments.dbpedia.org/2015-10/en \\\n "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nA list of all available tags can be found on the [Docker hub](https://hub.docker.com/r/comunica/query-sparql/tags).\n'},64942:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Setting up a SPARQL endpoint\'\ndescription: \'Allow querying over HTTP via the SPARQL protocol\'\n---\n\nThe [SPARQL protocol](https://www.w3.org/TR/sparql11-protocol/) allows clients to send SPARQL queries to Web servers over HTTP,\nand query results to be sent back to the client.\nComunica SPARQL can be used to set up a **SPARQL endpoint** on top of any number of sources you want.\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install Comunica SPARQL on our machine:\n```bash\n$ npm install -g @comunica/query-sparql\n```\n\n## 2. SPARQL endpoint over one source\n\nAfter installing Comunica SPARQL, you will be given access to several commands, including `comunica-sparql-http`,\nwhich allows you to start a SPARQL endpoint from the command line.\n\nThis command requires one or more URLs to be provided as **sources** to query over.\n\nFor example, the following command starts a SPARQL endpoint over [DBpedia](https://fragments.dbpedia.org/2016-04/en):\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en\n```\n\n
\nGiven a URL, Comunica will automatically detect the type of source and handle it accordingly.\n
\n\nBy default, the endpoint will be exposed on port 3000.\nYour endpoint will now be live on `http://localhost:3000/sparql`.\nAny client that understands the SPARQL protocol will now be able to send queries to this URL,\nsuch as [`fetch-sparql-endpoint`](https://github.com/rubensworks/fetch-sparql-endpoint.js/), or even Comunica itself.\n\n
\nThe URL http://localhost:3000/ will automatically redirect to http://localhost:3000/sparql.\n
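When sending a query via HTTP GET, the SPARQL query string must be URL-encoded for the `?query=` parameter. In JavaScript, `URLSearchParams` takes care of that encoding; a small sketch against the default local endpoint:

```javascript
const endpoint = 'http://localhost:3000/sparql';
const query = 'CONSTRUCT WHERE { ?s ?p ?o. } LIMIT 100';

// URLSearchParams percent-encodes the query for the ?query= parameter.
const url = `${endpoint}?${new URLSearchParams({ query })}`;

console.log(url);
// http://localhost:3000/sparql?query=CONSTRUCT+WHERE+%7B+%3Fs+%3Fp+%3Fo.+%7D+LIMIT+100
```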
\n\nYou can easily test query execution over your endpoint using a tool such as `curl`.\nThe SPARQL protocol allows sending queries via HTTP GET by passing a URL-encoded SPARQL query via the `?query=` parameter:\n```bash\n$ curl -v "http://localhost:3000/sparql?query=CONSTRUCT%20WHERE%20%7B%3Fs%20%3Fp%20%3Fo.%7DLIMIT%20100"\n```\n\n## 3. SPARQL endpoint over multiple sources\n\nOne key feature of Comunica is its ability to query over **multiple sources**.\nFor this, you can just supply any number of URLs as arguments.\n\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/\n```\n\n## 4. SPARQL endpoint over local files\n\nFirst install Comunica SPARQL for files:\n\n```bash\n$ npm install -g @comunica/query-sparql-file\n```\n\nThen start the SPARQL server:\n\n```bash\n$ comunica-sparql-file-http path/to/my/file.ttl\n```\n\n## 5. Changing the port\n\nUsing the `-p` option, the port can be changed:\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en \\\n -p 3001\n```\n\n## 6. Increasing the number of worker threads\n\nUsing the `-w` option, the number of parallel worker threads can be set:\n```bash\n$ comunica-sparql-http https://fragments.dbpedia.org/2016-04/en \\\n -w 4\n```\n\nSetting this to the number of available CPU cores tends to give the best performance.\n\n## 7. 
Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql-http`.\nYou can learn more options by invoking the _help_ command:\n```text\n$ comunica-sparql-http --help\ncomunica-sparql-http exposes a SPARQL endpoint\n\nRecommended options:\n -p, --port HTTP port to run on [number] [default: 3000]\n -w, --workers Number of worker threads [number] [default: 1]\n -t, --timeout Query execution timeout in seconds [number] [default: 60]\n -u, --update Enable update queries (otherwise, only read queries are enabled) [boolean] [default: false]\n\nOptions:\n -c, --context Use the given JSON context string or file (e.g., config.json) [string]\n --to Destination for update queries [string]\n -b, --baseIRI base IRI for the query (e.g., http://example.org/) [string]\n -d, --dateTime Sets a datetime for querying Memento-enabled archives [string]\n -l, --logLevel Sets the log level (e.g., debug, info, warn, ...) [string] [default: "warn"]\n --lenient If failing requests and parsing errors should be logged instead of causing a hard crash [boolean]\n -v, --version Prints version information [boolean]\n --showStackTrace Prints the full stacktrace when errors are thrown [boolean]\n -i, --invalidateCache Enable cache invalidation before each query execution [boolean] [default: false]\n\nExamples:\n comunica-sparql-http https://fragments.dbpedia.org/2016-04/en\n comunica-sparql-http https://fragments.dbpedia.org/2016-04/en https://query.wikidata.org/sparql\n comunica-sparql-http hypermedia@https://fragments.dbpedia.org/2016-04/en sparql@https://query.wikidata.org/sparql\n```\n'},12214:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Setting up a Web client'\ndescription: 'Set up a user-friendly static Web page where SPARQL queries can be executed client-side'\n---\n\nIf you want to easily **demonstrate** a couple of SPARQL queries on a **Web page**,\nor if you want to show off your custom built Comunica engine,\nthen you can do this using the [Comunica 
jQuery widget](https://github.com/comunica/jQuery-Widget.js/).\n\nAs an example, a public instance of this widget is available at http://query.linkeddatafragments.org/.\n\n## 1. Install from npm\n\n### 1.1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can install [`@comunica/web-client-generator`](https://github.com/comunica/jQuery-Widget.js/):\n```bash\n$ npm install -g @comunica/web-client-generator\n```\n\n### 1.2. Building a static Website for production\n\nAfter installing, you can build a production-ready version of [Comunica SPARQL](https://github.com/comunica/comunica/tree/master/engines/query-sparql):\n```bash\n$ comunica-web-client-generator\n```\n\nThe resulting `build` directory can be deployed on a Web server\nusing something like [NGINX](https://www.nginx.com/) or [GitHub pages](https://pages.github.com/).\n\n### 1.3. Build a custom config\n\nIn order to override the [default config](https://github.com/comunica/jQuery-Widget.js/blob/master/config/config-default.json), you can pass one as argument.\n\n```bash\n$ comunica-web-client-generator config/config-default.json\n```\n\nThis assumes that your engine's dependencies are available in your working directory.\nIf this is not the case, provide a path to your engine's directory via the `-c` option:\n\n```bash\n$ comunica-web-client-generator path/to/engine/config/config-default.json -c path/to/engine/\n```\n\n### 1.4. Change settings and queries\n\nThe default datasources and queries can be changed as follows:\n\n```bash\n$ comunica-web-client-generator -s settings.json -q queries\n```\n\nExamples for the [`settings.json`](https://github.com/comunica/jQuery-Widget.js/blob/master/settings.json) file\nand the [`queries`](https://github.com/comunica/jQuery-Widget.js/tree/master/queries) directory.\n\n### 1.5. 
Show all available options\n\nAll available options for this command are:\n\n```bash\n$ comunica-web-client-generator -h\ncomunica-web-client-generator generates Comunica Web clients\n Usage:\n comunica-web-client-generator config/config-default.json\n comunica-web-client-generator config/config-default.json -d my-build/ -s my-settings.json\n comunica-web-client-generator config/config-default.json -q my-queries/\n comunica-web-client-generator config/config-default.json -w my-webpack.config.js\n\n Options:\n -d Destination of the built output (defaults to build)\n -m The compilation mode (defaults to production, can also be development)\n -c Path to the main Comunica module (defaults to cwd)\n -q Path to custom queries directory\n -s Path to custom settings file\n -w Path to custom Webpack config\n --help Print this help message\n```\n\n## 2. Install from GitHub\n\n### 2.1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\n\nNext, we can [clone the Comunica jQuery widget repo](https://github.com/comunica/jQuery-Widget.js/), and install it:\n```bash\n$ git clone https://github.com/comunica/jQuery-Widget.js.git\n$ cd jQuery-Widget.js\n$ npm install\n```\n\n### 2.2. Starting the built-in Web server\n\nThe widget comes with its own (optional) Web server,\nwhich can be started as follows:\n```bash\n$ npm run dev\n```\n\nNow, your page will be live at `http://localhost:8080`.\n\n
\nThis port can be changed to something else by adding the --port option\nwithin the dev script in package.json.\n
\n\n### 2.3. Building a static Website for production\n\nThe built-in Web server should primarily be used for testing.\nIf you want to deploy this page on a Web server,\nsomething like [NGINX](https://www.nginx.com/) or [GitHub pages](https://pages.github.com/) is recommended.\n\nYou can build a production-ready version of this page as follows:\n```bash\n$ npm run build\n```\n\nThe contents of the `build` folder can now be deployed on to any Web server.\n\n### 2.4. Changing the default queries and datasets\n\nYou'll notice that the page contains some example queries and datasets by default.\nYou can change these by modifying the contents of the `queries/` folder and the `settings.json` file.\n\n
\nWhen running the built-in dev server, the process will have to be restarted after every change to the queries or settings.\n
\n"},40971:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Query using the latest development version\'\ndescription: \'If you want to make use of the latest changes that are not released yet\' \n---\n\nWhile the [Comunica GitHub repository](https://github.com/comunica/comunica) receives regular commits,\nwith fixes for bugs or new features,\nthese changes are not always immediately published as a new release on the npm package manager.\n\nWhile we always recommend using a released version of Comunica,\nthere are situations where you may want to make use of the **latest development version** from GitHub instead.\nFor example, if your application depends on a new _feature_ or _fix_ in Comunica,\nand you already want to develop or test your application before the new Comunica release is available.\n\nIn this guide, we will do this by setting up the **Comunica development environment**.\n\n## 1. Setup the Comunica development environment\n\nIf you want to make use of the latest development version,\nyou will have to **clone** the GitHub repository,\nand **install** it via the [Yarn package manager](https://yarnpkg.com/):\n```bash\n$ git clone https://github.com/comunica/comunica.git\n$ cd comunica\n$ yarn install\n```\n\n
\nSetting up the development via the npm package manager will not work due to the Comunica repository making use\nof the Yarn workspaces functionality.\n
\n\n## 2. Querying from the command line\n\nIf installation is successful, you can navigate to any package and make use of it\nsimilar to how you would when it has been installed via npm.\n\nFor example, executing a SPARQL query from the command line with Comunica SPARQL\ncan be done by navigating to `engines/query-sparql`, and invoking `bin/query.js`:\n```bash\n$ cd engines/query-sparql\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100"\n```\n\nYou can execute any of the commands explained in the [CLI guide](/docs/query/getting_started/query_cli/)\nby simply replacing `comunica-sparql` with `node bin/query.js`.\n\nIf you want to [set up a SPARQL endpoint](/docs/query/getting_started/setup_endpoint/),\nyou can use `node bin/http.js` instead of `comunica-sparql-http`.\n\n## 3. Linking Comunica SPARQL to your package\n\nIf you have a [JavaScript application that makes use of Comunica SPARQL](/docs/query/getting_started/query_app/),\nthen you can **link** it to your local Comunica development environment.\n\nThis can be done by first indicating that Comunica SPARQL can be linked (starting from the Comunica development environment folder):\n```bash\n$ cd engines/query-sparql\n$ yarn link\n```\n\nNext, in the folder of your JavaScript package,\nwe can link Comunica SPARQL as follows:\n```bash\n$ yarn link "@comunica/query-sparql"\n```\n\nNow, your application will use the development version of Comunica instead.\n\n
\nIf you want to go back to the npm version of Comunica SPARQL,\nthen you first have to unlink it from your application by running yarn unlink "@comunica/query-sparql".\n
\n'},13302:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Usage showcase'\ndescription: 'Examples of where Comunica is used.'\n---\n\nComunica is being used in a variety of places for its querying and RDF-related capabilities.\nBelow, a couple of these uses are listed.\nFeel free to [contact us](/ask/) if you want your use of Comunica added to this list.\n\n## LDflex\n\n[LDflex](https://github.com/LDflex/LDflex) is a JavaScript library that provides a convenient syntax for quickly writing and executing queries in a developer-friendly way.\nUsing the power of Comunica and JSON-LD contexts, you can write expressions like `person.friends.firstName` to get a list of your friends.\n\nLDflex is used within the [Solid](https://solidproject.org/) community to easily [interact with one or more Solid data pods](https://github.com/solid/query-ldflex/).\nUsing the compact syntax of LDflex, it is very simple to query from within [React components](https://github.com/solid/react-components).\n\n## GraphQL-LD\n\n[GraphQL-LD](https://github.com/rubensworks/graphql-ld.js) is a JavaScript library\nthat allows Linked Data to be queried via [GraphQL](https://graphql.org/) queries and a JSON-LD context.\nThe approach involves converting a GraphQL query and JSON-LD context to a SPARQL query,\nwhich can then be executed by any SPARQL query engine [such as Comunica](https://github.com/rubensworks/graphql-ld-comunica.js).\n\nIt can also be used to execute [authenticated queries over Solid data pods](https://github.com/rubensworks/GraphQL-LD-Comunica-Solid.js),\nfor which [reusable React components](https://github.com/rubensworks/solid-react-graphql-ld.js) are available.\n\n## Quadstore\n\n[Quadstore](https://github.com/belayeng/quadstore) is a [LevelDB](https://github.com/google/leveldb)-based graph database for Node.js and the browser.\n[Quadstore Comunica](https://github.com/belayeng/quadstore-comunica) is a SPARQL engine on top of Quadstore that is powered by Comunica.\n\n## LDkit\n\n[LDkit](https://ldkit.io) is a Linked Data query toolkit for TypeScript developers. It provides an ORM-like abstraction over RDF data: you define a data source and a data schema, and let LDkit handle SPARQL queries, data fetching, and conversion of RDF to JS/TS native types in the background.\n\nLDkit provides built-in support to query SPARQL endpoints, but it is [fully compatible with Comunica](https://ldkit.io/docs/how-to/query-with-comunica) in case you need to access other RDF data sources.\n\n## RDF Parse\n\n[RDF Parse](https://github.com/rubensworks/rdf-parse.js) is a JavaScript library that parses RDF based on content type or file name in a streaming manner.\nIt supports all of the major RDF serializations.\nInternally, this library makes use of the `rdf-parse` bus and actors from Comunica.\n\n## RDF Dereference\n\n[RDF Dereference](https://github.com/rubensworks/rdf-dereference.js) is a JavaScript library that dereferences URLs to get their RDF contents.\nThis tool is useful in situations where you have a URL, and you just need the parsed triples/quads, without having to concern yourself with determining the correct content type and picking the correct parser.\nInternally, this library makes use of the `rdf-dereference` bus and actors from Comunica.\n\n## RDF Play\n\n[RDF Play](https://rdf-play.rubensworks.net/) is a Web-based tool for performing simple RDF operations, such as parsing, serializing and dereferencing from URLs.\nInternally, this tool makes use of RDF parsers from the Comunica framework, which enable streaming processing of RDF.\n\n## ESWC Conference 2020\n\nAll metadata of the [ESWC Conference (2020)](https://2020.eswc-conferences.org/) is [queryable](https://query.2020.eswc-conferences.org/)\nvia a jQuery widget instance of Comunica.\nIt features several example queries over a [Triple Pattern Fragments](https://linkeddatafragments.org/concept/) interface through which the ESWC 2020 metadata is published.\n\n## Walder\n\n[Walder](https://github.com/KNowledgeOnWebScale/walder) offers an easy way\nto set up a website or Web API on top of decentralized knowledge graphs\nhosted via Solid pods, SPARQL endpoints, Triple Pattern Fragments interfaces, RDF files, and so on.\nIt uses Comunica for querying these knowledge graphs.\nUsing content negotiation, Walder makes the data in these knowledge graphs available to clients via HTML, RDF, and JSON-LD.\nUsers define in a configuration file which data Walder uses and how it processes this data.\n"},6572:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying FAQ'\ndescription: 'Frequently asked questions about using Comunica.'\n---\n\nFeel free to [ask us](/ask/), or have a look at the\n[example](https://github.com/comunica/examples) repository.\n\n## How can I query over RDF documents on my local file system?\n\nInstead of using Comunica SPARQL, you can use [Comunica SPARQL File](/docs/query/getting_started/query_cli_file/)\nto query over files on your local file system.\n\nComunica SPARQL by default does not allow you to query over local files for security reasons.\n\n## How to query over sources in memory?\n\n[Comunica SPARQL RDF/JS](/docs/query/advanced/rdfjs_querying/) can be used for in-memory querying.\n\n## How are result bindings and quads represented in JavaScript?\n\nSELECT query results will be contained in a `bindingsStream`,\nwhere each data element is a `Binding`.\nEach `binding` is an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object\nthat contains mappings from variables to RDF terms.\nVariable names can either be obtained by string label (without the `?` prefix) or via [RDF/JS](/docs/query/advanced/rdfjs/) variable objects,\nand bound RDF terms are represented as [RDF/JS](/docs/query/advanced/rdfjs/) terms.\nFor example:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT ...`, {...});\nbindingsStream.on('data', (binding) => {\n 
console.log(binding.get('s').value);\n console.log(binding.get('s').termType);\n});\n```\nLearn more about the usage of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\nCONSTRUCT query results will be contained in a `quadStream`,\nwhere each data element is an [RDF/JS](/docs/query/advanced/rdfjs/) quad.\nFor example:\n```javascript\nconst quadStream = await myEngine.queryQuads(`CONSTRUCT ...`, {...});\nquadStream.on('data', (quad) => {\n console.log(quad.subject.value);\n console.log(quad.predicate.value);\n console.log(quad.object.value);\n console.log(quad.graph.value);\n});\n```\n\nRead more about this in the [guide on executing SPARQL queries in JavaScript applications](/docs/query/getting_started/query_app/).\n\n## What datastructure is behind `bindingsStream` and `quadStream`?\n\nQuery results can be returned via `bindingsStream` (SELECT queries) and `quadStream` (CONSTRUCT queries).\n\nThese streams are backed by an [AsyncIterator](https://github.com/RubenVerborgh/AsyncIterator),\nwhich is a lightweight JavaScript implementation of demand-driven object streams.\nAs opposed to Node's `Stream`, you cannot push anything into an `AsyncIterator`;\ninstead, an iterator pulls things from another iterator.\n\nFurthermore, these streams are _lazy_,\nwhich means that the results will only be calculated once you request them,\nand an `'end'` event will only be emitted when all of them have been consumed.\n\n## I need a specific feature, how do I get it into Comunica?\n\nSince Comunica is an open-source project,\nthe best way to get new features in is by [contributing yourself](/contribute/).\n\nAlternatively, you can delegate implementation work to a third-party via the [Comunica Association](/association/).\n\n## How to measure query performance with Comunica?\n\n### Simple statistics\n\nThe easiest way to get statistics on the performance of a specific query\nis by using the `'stats'` [result format](/docs/query/advanced/result_formats/).\nThis will print the number of results, their delay from query start,\nand the number of HTTP requests that have been executed up until the result was available.\n\nFor example, stats can be printed via the command line as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -t stats \\\n 'SELECT * WHERE { ?s ?p ?o } LIMIT 10'\nResult,Delay (ms),HTTP requests\n1,265.488428,2\n2,265.7177,2\n3,265.889677,2\n4,266.141152,2\n5,266.332423,2\n6,266.496283,2\n7,266.674167,2\n8,266.861855,2\n9,268.330294,2\n10,268.51177,2\nTOTAL,268.816168,2\n```\n\n### Enabling production-mode\n\nIf you want to do benchmarking with Comunica in Node.js, make sure to run Node.js in production mode as follows:\n\n```bash\n$ NODE_ENV=production comunica-sparql ...\n```\n\nThe reason for this is that Comunica extensively generates internal Error objects. In non-production mode, these also produce long stacktraces, which may in some cases impact performance.\n\n### More advanced experiments\n\nA more advanced tool for setting up large-scale reproducible Comunica experiments is [Comunica Bencher](https://github.com/comunica/comunica-bencher).\n"},75770:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Advanced querying'\ndescription: 'Advanced guides on how to get the most out of Comunica.'\nindex: true\n---\n\nThe following guides explore some of the more advanced concepts when querying using Comunica.\n"},36323:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'HTTP Basic Authentication'\ndescription: 'Send authenticated HTTP requests by including username and password.'\n---\n\nVia [HTTP Basic Authentication](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication)\none can include **username and password** credentials in HTTP requests.\nIf you want to query such protected resources,\nyou can include this authentication information for _all_ HTTP requests,\nor only for requests to _specific sources_.\n\n## Authentication on the command line\n\nVia the command line, username and password can be included in the URL as follows:\n```bash\n$ comunica-sparql https://username:password@example.org/page \\\n "SELECT * WHERE { ?s ?p ?o }"\n```\n\n## Authentication in an application\n\nWhen using [Comunica SPARQL in an application](/docs/query/getting_started/query_app/), authentication information can be set using the `httpAuth` [context entry](/docs/query/advanced/context/):\n\nEnabling basic authentication for _all_ HTTP requests:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n httpAuth: 'username:password',\n});\n```\n\nEnabling basic authentication for _a specific source_:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://username:password@example.org/page'],\n});\n```\n"},76759:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Bindings'\ndescription: 'Bindings objects are used to represent results of SPARQL SELECT queries'\n---\n\nSPARQL `SELECT` query results are represented as a stream of _bindings_ (sometimes also referred to as `BindingsStream`),\nwhere each bindings object represents a mapping from zero or more variables to RDF terms.\n\n
\nThe SPARQL specification uses the term _solution mapping_ to refer to bindings.\nThis means that a bindings object is equivalent to a solution mapping,\nand a solution sequence is equivalent to a bindings stream.\n
\n\nBindings objects are represented using the [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) interface,\nand can be created using any RDF/JS [`BindingsFactory`](http://rdf.js.org/query-spec/#bindingsfactory-interface).\nComunica provides the [`@comunica/bindings-factory`](https://github.com/comunica/comunica/tree/master/packages/bindings-factory) package that implements these interfaces.\n\nBelow, several examples are shown of how these bindings objects can be used.\nPlease refer to [the README of `@comunica/bindings-factory`](https://github.com/comunica/comunica/tree/master/packages/bindings-factory) for a complete overview of its operations.\n\n## Reading values of bindings\n\n### `Bindings.has()`\n\nThe `has()` method is used to check if a value exists for the given variable.\nThe variable can either be supplied as a string (without `?` prefix), or as an RDF/JS variable.\n\n```typescript\nif (bindings.has('var1')) {\n console.log('Has var1!');\n}\nif (bindings.has(DF.variable('var2'))) {\n console.log('Has var2!');\n}\n```\n\n### `Bindings.get()`\n\nThe `get()` method is used to read the bound value of a variable.\nThe variable can either be supplied as a string (without `?` prefix), or as an RDF/JS variable.\n\n```typescript\nimport * as RDF from '@rdfjs/types';\n\nconst term1: RDF.Term | undefined = bindings.get('var1');\nconst term2: RDF.Term | undefined = bindings.get(DF.variable('var2'));\n```\n\n### Entry iteration\n\nEach bindings object is an Iterable over its key-value entries,\nwhere each entry is a tuple of type `[RDF.Variable, RDF.Term]`.\n\n```typescript\n// Iterate over all entries\nfor (const [ key, value ] of bindings) {\n console.log(key);\n console.log(value);\n}\n\n// Save the entries in an array\nconst entries = [ ...bindings ];\n```\n\n### `Bindings.toString()`\n\nThe `toString()` method returns a compact string representation of the bindings object,\nwhich can be useful for 
debugging.\n\n```typescript\nconsole.log(bindings.toString());\n\n/*\nCan output in the form of:\n{\n \"a\": \"ex:a\",\n \"b\": \"ex:b\",\n \"c\": \"ex:c\"\n}\n */\n```\n\n## Creating bindings\n\nFirst, a bindings factory must be created:\n```typescript\nimport * as RDF from '@rdfjs/types';\nimport { DataFactory } from '@comunica/data-factory';\nimport { BindingsFactory } from '@comunica/bindings-factory';\n\nconst DF = new DataFactory();\nconst BF = new BindingsFactory(DF);\n```\n\nBindings can be created in different ways:\n```typescript\nconst bindings1: RDF.Bindings = BF.bindings([\n [ DF.variable('var1'), DF.literal('abc') ],\n [ DF.variable('var2'), DF.literal('def') ],\n]);\n\nconst bindings2: RDF.Bindings = BF.fromRecord({\n var1: DF.literal('abc'),\n var2: DF.literal('def'),\n});\n```\n"},11986:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Caching'\ndescription: 'When remote sources are requested, caching allows them to be reused in the future.'\n---\n\nWhen remote documents are fetched over HTTP, a Comunica engine can cache documents to optimize future reuse.\nIf [your application](/docs/query/getting_started/query_app/) works over volatile resources, then you may want to invalidate this cache,\nwhich can be done as follows:\n\n```javascript\n// Invalidate the full cache\nmyEngine.invalidateHttpCache();\n\n// Invalidate a single document\nmyEngine.invalidateHttpCache('http://example.org/page.html');\n```\n\nOptionally, you can also pass the `noCache: true` flag to your context to invalidate the cache before query execution starts:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://xmlns.com/foaf/spec/20140114.rdf'],\n noCache: true,\n});\n```\n"},22249:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Passing a context'\ndescription: 'A context can be passed to a query engine to tweak its runtime settings.'\n---\n\nWhen passing a query to a Comunica query 
engine,\nyou can pass additional information to the engine using a **context** object.\n\n## 1. How to use the context\n\nWhen [querying in a JavaScript application](/docs/query/getting_started/query_app/),\nthe context must be passed as second argument to the `query()` method of a Comunica engine.\n\nFor example, a context that defines the `sources` to query over is passed as follows:\n```javascript\nconst QueryEngine = require('@comunica/query-sparql').QueryEngine;\nconst myEngine = new QueryEngine();\n\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n});\n```\n\nThe `sources` field is the only entry that is required in the context.\nAll other entries that are discussed hereafter are optional.\n\n
\nDuring query execution, the context is converted into an immutable object\nto ensure that the original context entries remain unchanged throughout execution.\n
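The effect of this immutability can be illustrated with a minimal sketch (this is not Comunica's actual implementation): the engine can work on a frozen copy of the context, so the caller's original object stays untouched.

```javascript
// Minimal sketch (not Comunica's actual implementation) of how working on
// a frozen copy keeps the caller's context entries unchanged.
function toImmutableContext(context) {
  // Shallow-copy the entries into a new object and freeze it,
  // so later mutation attempts cannot leak back to the caller.
  return Object.freeze({ ...context });
}

const context = { sources: ['https://fragments.dbpedia.org/2015/en'] };
const frozen = toImmutableContext(context);

// Mutating the frozen copy silently fails (or throws in strict mode) ...
try { frozen.lenient = true; } catch (e) { /* TypeError in strict mode */ }

// ... and the caller's original context is unaffected.
console.log('lenient' in frozen);  // false
console.log('lenient' in context); // false
```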
\n\n## 2. Overview\n\nThe following table gives an overview of all possible context entries that can be passed.\n\n| **Key** | **Description** |\n| ------- | --------------- |\n| `sources` | An array of data sources |\n| `destination` | A data destination for update queries |\n| `lenient` | If HTTP and parsing failures are ignored |\n| `initialBindings` | Variables that have to be pre-bound to values in the query |\n| `queryFormat` | The provided query's format |\n| `baseIRI` | Base IRI for relative IRIs in SPARQL queries |\n| `log` | A custom logger instance |\n| `datetime` | Specify a custom date |\n| `httpProxyHandler` | A proxy for all HTTP requests |\n| `httpIncludeCredentials` | (_browser-only_) If current credentials should be included for HTTP requests |\n| `httpAuth` | HTTP basic authentication value |\n| `httpTimeout` | HTTP timeout in milliseconds |\n| `httpBodyTimeout` | Makes the HTTP timeout apply until the response is fully consumed |\n| `httpRetryCount` | The number of retries to perform on failed fetch requests |\n| `httpRetryDelay` | The number of milliseconds to wait between fetch retries |\n| `httpRetryOnServerError` | If fetch should be retried on 5xx server error responses, instead of being resolved. |\n| `recoverBrokenLinks`| Use the WayBack machine to recover broken links |\n| `extensionFunctions` or `extensionFunctionCreator` | SPARQL extension functions |\n| `fetch` | A custom [`fetch`](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) function |\n| `readOnly` | If update queries may not be executed |\n| `explain` | The query explain mode |\n| `unionDefaultGraph` | If the default graph should also contain the union of all named graphs |\n| `localizeBlankNodes` | If blank nodes should be localized per bindings entry |\n\nWhen developing Comunica modules, all context entry keys can be found in [`@comunica/context-entries`](https://comunica.github.io/comunica/modules/_comunica_context_entries.html). \n\n## 3. 
Defining sources\n\nUsing the `sources` context entry, data sources can be defined that Comunica should query over.\nThe value of this must be an array, which may contain strings, objects, or a mixture of both:\n* Array elements that are strings are interpreted as URLs, such as `'https://www.rubensworks.net/'` or `'https://fragments.dbpedia.org/2016-04/en'`.\n* Object array elements can be different things:\n * A hash containing `type` and `value`, such as `{ type: 'sparql', value: 'https://dbpedia.org/sparql' }`.\n * An [RDF/JS](/docs/query/advanced/rdfjs/) source object, such as [`new N3Store()`](https://github.com/rdfjs/N3.js#storing).\n\nString-based sources will lead to Comunica trying to determine their source type automatically.\nHash-based sources allow you to enforce a specific source type.\n\n
\nSome SPARQL endpoints may be recognised as a file instead of a SPARQL endpoint due to them not supporting SPARQL Service Description,\nwhich may produce incorrect results. For these cases, the sparql type MUST be set.\n
\n\nFor example, all of the following source elements are valid:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n 'https://fragments.dbpedia.org/2015/en',\n { type: 'hypermedia', value: 'https://fragments.dbpedia.org/2016/en' },\n { type: 'file', value: 'https://www.rubensworks.net/' },\n new N3Store(),\n { type: 'sparql', value: 'https://dbpedia.org/sparql' },\n ],\n});\n```\n\n## 4. Defining an update destination\n\nIf you are executing an update query over more than one source,\nthen you need to specify the `destination` of the resulting update.\nMore details on this can be found in the guide on [updating in a JavaScript app](/docs/query/getting_started/update_app/).\n\n## 5. Lenient execution\n\nBy default, Comunica will throw an error when it encounters an invalid **RDF document** or **HTTP URL**.\nIt is possible to **ignore these errors** and make Comunica ignore such invalid documents and URLs\nby setting `lenient` to `true`:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n lenient: true,\n});\n```\n\n## 6. 
Binding variables\n\nUsing the `initialBindings` context entry, it is possible to **bind** certain variables in the given query to terms before the query execution starts.\nThis may be valuable in case your SPARQL query is used as a template with some variables that need to be filled in.\n\nThis can be done by passing an [RDF/JS `Bindings`](http://rdf.js.org/query-spec/#bindings-interface) object as value to the `initialBindings` context entry:\n```javascript\nimport { BindingsFactory } from '@comunica/bindings-factory';\nimport { DataFactory } from 'rdf-data-factory';\n\nconst DF = new DataFactory();\nconst BF = new BindingsFactory();\n\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE {\n {?s ?p ?template1 } UNION { ?s ?p ?template2 }\n}`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n initialBindings: BF.fromRecord({\n template1: DF.literal('Value1'),\n template2: DF.literal('Value2'),\n }),\n});\n```\n\n`Bindings` can be created using any [RDF/JS `BindingsFactory`](http://rdf.js.org/query-spec/#bindingsfactory-interface),\nsuch as [`@comunica/bindings-factory`](https://www.npmjs.com/package/@comunica/bindings-factory).\nLearn more about the creation of these bindings objects in the [bindings guide](/docs/query/advanced/bindings/).\n\n## 7. Setting the query format\n\nBy default, queries in Comunica are interpreted as SPARQL queries.\nAs such, the `queryFormat` entry defaults to `{ language: 'sparql', version: '1.1' }`.\n\nSince Comunica is not tied to any specific **query format**, it is possible to change this to something else, such as `{ language: 'graphql', version: '1.0' }`.\nMore information on this can be found in the [GraphQL-LD guide](/docs/query/advanced/graphql_ld/).\n\n## 8. 
Setting a Base IRI\n\nTerms in SPARQL queries can be relative to a certain **Base IRI**.\nTypically, you would use the `BASE` keyword in a SPARQL query to set this Base IRI.\nIf you want to set this Base IRI without modifying the query,\nthen you can define it in the context using `baseIRI`:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE {\n ?s ?o\n}`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n baseIRI: 'http://example.org/',\n});\n```\n\n## 9. Enabling a logger\n\nA logger can be set using `log`.\nMore information on this can be found in the [logging guide](/docs/query/advanced/logging/).\n\n## 10. Setting a custom date\n\nUsing `datetime`, a custom **date** can be set in Comunica.\nThe range of this field must always be a JavaScript `Date` object:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n date: new Date(),\n});\n```\n\nThis date is primarily used for the SPARQL `NOW()` operator.\nIt is also used when performing time travel querying using the [Memento protocol](/docs/query/advanced/memento/).\n\n## 11. Enabling an HTTP proxy\n\nAll HTTP requests can be run through a proxy using `httpProxyHandler`.\nMore information on this can be found in the [HTTP proxy guide](/docs/query/advanced/proxying/).\n\n## 12. Include credentials in HTTP requests\n\n_Only applicable when running in the browser_\n\nIf this option is enabled, then all cross-site requests will be made using credentials of the current page.\nThis includes cookies, authorization headers or TLS client certificates.\n\nEnabling this option has no effect on same-site requests.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpIncludeCredentials: true,\n});\n```\n\n## 13. 
Send requests via HTTP basic authentication\n\nVia HTTP Basic Authentication one can include **username and password** credentials in HTTP requests.\nMore information on this can be found in the [HTTP basic authentication guide](/docs/query/advanced/basic_auth/).\n\n## 14. SPARQL extension functions\n\nSPARQL allows non-standard, [custom extension functions](https://www.w3.org/TR/sparql11-query/#extensionFunctions) to be used within queries.\nIn order to provide an implementation to these extension functions,\nComunica allows developers to plug them in via the context.\nMore information on this can be found in the [SPARQL extension functions guide](/docs/query/advanced/extension_functions/).\n\n## 15. Using a custom fetch function\n\nBy default, Comunica will use the built-in [`fetch` function](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) to make HTTP requests.\nIt is however possible to pass a custom function that will be used instead for making HTTP requests,\nas long as it follows the [Fetch API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API).\n\nThis can be done as follows:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n fetch: myfetchFunction,\n});\n```\n\n_If you want to perform authenticated HTTP requests for Solid, you may want to consider using [Comunica Solid](https://comunica.dev/docs/query/advanced/solid/)._\n\n\n## 16. HTTP Timeout\n\nBy default, Comunica does not apply any timeout to the HTTP requests it sends to external services. A timeout can be added using the `httpTimeout` option, whose value is the timeout delay in milliseconds. 
For example, to add an HTTP timeout of 60 seconds:\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpTimeout: 60_000,\n});\n```\n\nUsing the boolean `httpBodyTimeout` option, you can make this timeout apply not only until the response starts streaming in, but until the response body is fully consumed. This is useful to limit cases such as very long response streams:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpTimeout: 60_000,\n httpBodyTimeout: true\n});\n```\n\n## 17. Union Default Graph\n\nBy default, Comunica will only query over the [default graph](https://www.w3.org/TR/sparql11-query/#unnamedGraph).\nIf you want to query over triples in other named graphs, you need to specify this via the `GRAPH`, `FROM`, or `FROM NAMED` clauses.\nHowever, by setting the `unionDefaultGraph` context option to `true`, triple patterns will also apply to triples in the non-default graph.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n unionDefaultGraph: true,\n});\n```\n\n## 18. HTTP Retries\n\nUsing the `httpRetryOnServerError`, `httpRetryCount`, and `httpRetryDelay` options,\nyou can make your engine retry failed requests a number of times when the server produces an error.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n httpRetryOnServerError: true,\n httpRetryCount: 3,\n httpRetryDelay: 100,\n});\n```\n\n## 19. 
Broken link recovery\n\nThe `recoverBrokenLinks` option can make your engine fall back to the [WayBack Machine](https://archive.org/web/) if a document has become unavailable.\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: ['http://xmlns.com/foaf/spec/20140114.rdf'],\n recoverBrokenLinks: true,\n});\n```\n\n## 20. Deduplicate quads in construct queries\n\nThe `distinctConstruct` option can remove duplicate quads from CONSTRUCT query outputs.\nThis corresponds to placing a `DISTINCT` onto a `CONSTRUCT` operator (which is not allowed by the SPARQL specification).\n\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`CONSTRUCT WHERE { ?s1 ?p1 ?o1. ?s2 ?p2 ?o2 }`, {\n sources: ['https://fragments.dbpedia.org/2015/en'],\n distinctConstruct: true,\n});\n```\n"},65625:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Destination types'\ndescription: 'Comunica detects and handles different types of destinations.'\n---\n\nComunica SPARQL supports _update_ queries to add, delete, or change data\non both the [command line](/docs/query/getting_started/update_cli/)\nand when [calling Comunica from a JavaScript application](/docs/query/getting_started/update_app/).\n\nUpdate queries typically consist of two parts:\n\n1. Query pattern to select data from a [_source_](/docs/query/advanced/source_types/);\n2. 
Quads to add or delete based on the query pattern into a _destination_.\n\nIn most cases, the _source_ and _destination_ are equal,\nsuch as when modifying data in [an in-memory RDF/JS Store](/docs/query/advanced/rdfjs_updating/).\n\nSince Comunica decouples _source_ and _destination_,\nit is possible to _read_ data from one place, and _apply changes_ in another place.\n\nUsually, destinations are passed as URLs that point to Web resources.\nBased on what is returned when _dereferencing_ this URL,\nComunica can apply different update algorithms.\n\nInstead of relying on Comunica's detection algorithms,\nyou can **enforce** the use of a certain type.\n\n
\nSome SPARQL endpoints may be recognised as a file instead of a SPARQL endpoint due to them not supporting SPARQL Service Description,\nwhich may produce incorrect results. For these cases, the sparql type MUST be set.\n
\n\n
\nWhen enabling the info logger,\nyou can derive what type Comunica has determined for each destination.\n
\n\n## Setting destination type on the command line\n\nDestination types can optionally be enforced by prefixing the URL with the type name followed by `@`, such as\n\n```bash\n$ comunica-sparql https://example.org/file-in.ttl \\\n --to patchSparqlUpdate@https://example.org/file-out.ttl \\\n \"INSERT DATA { }\"\n```\n\n## Setting destination type in an application\n\nVia a [JavaScript application](/docs/query/getting_started/query_app/),\nthe destination type can be set by using a hash containing `type` and `value`:\n```javascript\nawait myEngine.queryVoid(`...`, {\n sources: [\n { type: 'file', value: 'https://example.org/file-in.ttl' },\n ],\n destination: { type: 'patchSparqlUpdate', value: 'https://example.org/file-out.ttl' },\n});\n```\n\n## Supported destination types\n\nThe table below summarizes the different destination types that Comunica supports by default:\n\n| **Type name** | **Description** |\n| ------- | --------------- |\n| `rdfjsStore` | JavaScript objects implementing the [RDF/JS `store` interface](/docs/query/advanced/rdfjs_updating/) |\n| `sparql` | [SPARQL endpoint](https://www.w3.org/TR/sparql11-protocol/) |\n| `putLdp` | [Linked Data Platform](https://www.w3.org/TR/ldp/) HTTP APIs accepting `PUT` requests containing an RDF document, such as [Solid servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1). |\n| `patchSparqlUpdate` | [Linked Data Platform](https://www.w3.org/TR/ldp/) HTTP APIs accepting `PATCH` requests containing SPARQL Update queries (`application/sparql-update`), such as [Solid servers](https://github.com/solid/solid-spec/blob/master/api-rest.md#alternative-using-sparql-1). 
|\n\nThe default destination type is `auto`,\nwhich will automatically detect the proper destination type.\nFor example, if an `Accept-Patch: application/sparql-update` header\nis detected, the `patchSparqlUpdate` type is used.\n"},61042:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Explain\'\ndescription: \'Display information about the logical and physical query plan\'\n---\n\nThe explain functionality allows you to extract information about the query plan of a Comunica query engine.\n\nThere are three explain modes available:\n\n- `parsed`: The [SPARQL Algebra](/docs/modify/advanced/algebra/) tree as parsed from the input query.\n- `logical`: The optimized logical query plan in SPARQL Algebra.\n- `physical`: A hierarchical log of which logical operations have been executed by which (physical) actors.\n\nWhile the `parsed` and `logical` explain modes happen before query execution,\nthe `physical` explain mode requires query execution to be completed.\nThis is because Comunica is an adaptive query engine that alters its query plan dynamically based on the sources it discovers at runtime.\nThis means that query execution must be completed before the final (physical) query plan can be inspected.\n\n
\nIf you require more insight into what operations are being executed at runtime,\nyou can make use of the built-in logging functionality.\n
\n\n
\nThe output of the physical mode is an experimental feature,\nwhich means that its format may still improve and change in between major updates.\n
\n\n## Explaining on the command line\n\nIf you have [installed Comunica SPARQL for the command line](/docs/query/getting_started/query_cli/),\nthen you will have immediate access to the query explain functionality via the `--explain` option.\n\nBelow, you can see examples on how the different explain modes can be invoked.\n\n### Explain parsed on the command line\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o } LIMIT 100\' --explain parsed\n\n{\n "type": "slice",\n "input": {\n "type": "project",\n "input": {\n "type": "bgp",\n "patterns": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "s"\n },\n "predicate": {\n "termType": "Variable",\n "value": "p"\n },\n "object": {\n "termType": "Variable",\n "value": "o"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "s"\n },\n {\n "termType": "Variable",\n "value": "p"\n },\n {\n "termType": "Variable",\n "value": "o"\n }\n ]\n },\n "start": 0,\n "length": 100\n}\n```\n\n### Explain logical on the command line\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o } LIMIT 100\' --explain logical\n\n{\n "type": "slice",\n "input": {\n "type": "project",\n "input": {\n "type": "join",\n "input": [\n {\n "termType": "Quad",\n "value": "",\n "subject": {\n "termType": "Variable",\n "value": "s"\n },\n "predicate": {\n "termType": "Variable",\n "value": "p"\n },\n "object": {\n "termType": "Variable",\n "value": "o"\n },\n "graph": {\n "termType": "DefaultGraph",\n "value": ""\n },\n "type": "pattern"\n }\n ]\n },\n "variables": [\n {\n "termType": "Variable",\n "value": "s"\n },\n {\n "termType": "Variable",\n "value": "p"\n },\n {\n "termType": "Variable",\n "value": "o"\n }\n ]\n },\n "start": 0,\n "length": 100\n}\n```\n\n### Explain physical on the command 
line\n\n```bash\n$ node bin/query.js https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT ?movie ?title ?name\nWHERE {\n ?movie dbpedia-owl:starring [ rdfs:label "Brad Pitt"@en ];\n rdfs:label ?title;\n dbpedia-owl:director [ rdfs:label ?name ].\n FILTER LANGMATCHES(LANG(?title), "EN")\n FILTER LANGMATCHES(LANG(?name), "EN")\n}\' --explain physical\n\nproject (movie,title,name)\n join\n join-inner(bind) bindOperation:(?g_0 http://www.w3.org/2000/01/rdf-schema#label "Brad Pitt"@en) bindCardEst:~2 cardReal:43 timeSelf:2.567ms timeLife:667.726ms\n join compacted-occurrences:1\n join-inner(bind) bindOperation:(?movie http://dbpedia.org/ontology/starring http://dbpedia.org/resource/Brad_Pitt) bindCardEst:~40 cardReal:43 timeSelf:6.011ms timeLife:641.139ms\n join compacted-occurrences:38\n join-inner(bind) bindOperation:(http://dbpedia.org/resource/12_Monkeys http://dbpedia.org/ontology/director ?g_1) bindCardEst:~1 cardReal:1 timeSelf:0.647ms timeLife:34.827ms\n filter compacted-occurrences:1\n join\n join-inner(nested-loop) cardReal:1 timeSelf:0.432ms timeLife:4.024ms\n pattern (http://dbpedia.org/resource/12_Monkeys http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (http://dbpedia.org/resource/Terry_Gilliam http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~1 src:0\n join compacted-occurrences:2\n join-inner(multi-empty) timeSelf:0.004ms timeLife:0.053ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://dbpedia.org/ontology/director ?g_1) cardEst:~0 src:0\n filter cardEst:~5,188,789.667\n join\n join-inner(nested-loop) timeLife:0.6ms\n pattern (http://dbpedia.org/resource/Contact_(1992_film) http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~1 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n join compacted-occurrences:1\n join-inner(multi-empty) timeSelf:0.053ms timeLife:0.323ms\n pattern (?movie http://dbpedia.org/ontology/director ?g_1) 
cardEst:~118,505 src:0\n pattern (?movie http://dbpedia.org/ontology/starring http://wikidata.dbpedia.org/resource/Q35332) cardEst:~0 src:0\n filter cardEst:~242,311,843,844,161\n join\n join-inner(symmetric-hash) timeLife:36.548ms\n pattern (?movie http://www.w3.org/2000/01/rdf-schema#label ?title) cardEst:~20,013,903 src:0\n pattern (?g_1 http://www.w3.org/2000/01/rdf-schema#label ?name) cardEst:~20,013,903 src:0\n\nsources:\n 0: QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)\n```\n\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -q \'SELECT * { ?s ?p ?o. ?s a ?o } LIMIT 100\' --explain physical-json\n\n{\n "logical": "slice",\n "children": [\n {\n "logical": "project",\n "variables": [\n "o",\n "p",\n "s"\n ],\n "children": [\n {\n "logical": "join",\n "children": [\n {\n "logical": "join-inner",\n "physical": "bind",\n "bindIndex": 1,\n "bindOperation": {\n "source": "QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)",\n "pattern": "?s http://www.w3.org/1999/02/22-rdf-syntax-ns#type ?o"\n },\n "bindOperationCardinality": {\n "type": "estimate",\n "value": 100022186,\n "dataset": "https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type"\n },\n "bindOrder": "depth-first",\n "cardinalities": [\n {\n "type": "estimate",\n "value": 1040358853,\n "dataset": "https://fragments.dbpedia.org/2016-04/en"\n },\n {\n "type": "estimate",\n "value": 100022186,\n "dataset": "https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type"\n }\n ],\n "joinCoefficients": {\n "iterations": 6404592831613.728,\n "persistedItems": 0,\n "blockingItems": 0,\n "requestTime": 8902477556686.99\n },\n "childrenCompact": [\n {\n "occurrences": 100,\n "firstOccurrence": {\n "logical": "pattern",\n "source": "QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)",\n "pattern": 
"http://commons.wikimedia.org/wiki/Special:FilePath/!!!善福寺.JPG ?p http://dbpedia.org/ontology/Image"\n }\n }\n ]\n }\n ]\n }\n ]\n }\n ]\n}\n```\n\n## Explaining in JavaScript\n\nIf you have [installed Comunica SPARQL in a JavaScript app](/docs/query/getting_started/query_app/),\nthen you can invoke the `explain` method on your query engine with a certain explain mode.\n\nBelow, you can see examples on how the different explain modes can be invoked.\n\n### Explain parsed in JavaScript\n\n```typescript\nconsole.log(await engine.explain(`SELECT * WHERE {\n ?s ?p ?o.\n }`, {\n sources: [ \'https://www.rubensworks.net/\' ],\n}, \'parsed\'));\n\n/*\nWill print:\n\n{\n explain: true,\n type: \'parsed\',\n data: {\n input: {\n patterns: [\n factory.createPattern(\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ),\n ],\n type: \'bgp\',\n },\n type: \'project\',\n variables: [\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ],\n },\n}\n\nwith DF being an RDF data factory, and factory being a SPARQL algebra factory.\n */\n```\n\n### Explain logical in JavaScript\n\n```typescript\nconsole.log(await engine.explain(`SELECT * WHERE {\n ?s ?p ?o.\n }`, {\n sources: [ \'https://www.rubensworks.net/\' ],\n}, \'logical\'));\n\n/*\nWill print:\n\n{\n explain: true,\n type: \'logical\',\n data: {\n input: {\n input: [\n factory.createPattern(\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ),\n ],\n type: \'join\',\n },\n type: \'project\',\n variables: [\n DF.variable(\'s\'),\n DF.variable(\'p\'),\n DF.variable(\'o\'),\n ],\n },\n}\n\nwith DF being an RDF data factory, and factory being a SPARQL algebra factory.\n */\n```\n\n### Explain physical in JavaScript\n\n```typescript\nconsole.log(await engine.explain(`SELECT * WHERE {\n ?s ?p ?o.\n }`, {\n sources: [ \'https://www.rubensworks.net/\' ],\n}, \'physical\'));\n\n/*\nWill print:\n\n{\n explain: true,\n type: \'physical\',\n data: `slice\n project (o,p,s)\n join\n 
join-inner(bind) bindOperation:(?s http://www.w3.org/1999/02/22-rdf-syntax-ns#type ?o) bindCardEst:~100,022,186\n pattern (http://commons.wikimedia.org/wiki/Special:FilePath/!!!善福寺.JPG ?p http://dbpedia.org/ontology/Image) src:0 compacted-occurrences:100\n\nsources:\n 0: QuerySourceHypermedia(https://fragments.dbpedia.org/2016-04/en)(SkolemID:0)\n`,\n}\n */\n```\n'},10205:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Extension Functions'\ndescription: 'Providing implementations for SPARQL extension functions.'\n---\n\nSPARQL allows non-standard, [custom extension functions](https://www.w3.org/TR/sparql11-query/#extensionFunctions) to be used within queries.\nIn order to provide an implementation to these extension functions,\nComunica allows developers to plug them in via the context.\n\n
\nTake into account that when writing SPARQL queries with extension functions,\nthese queries will not be portable to other types of query engines anymore,\nas these extension functions may not be standardized.\n
\n\n## Dictionary-based extension functions\n\nThe easiest way to plug in extension functions to Comunica is by using\nthe `extensionFunctions` [context entry](/docs/query/advanced/context/)\nin a [JavaScript application](/docs/query/getting_started/query_app/):\n\n```typescript\nimport {DataFactory} from \"rdf-data-factory\";\n\nconst DF = new DataFactory();\n\nconst bindingsStream = await myEngine.queryBindings(`\nPREFIX func: <http://example.org/functions#>\nSELECT ?caps WHERE {\n ?s ?p ?o.\n BIND (func:to-upper-case(?o) AS ?caps)\n}\n`, {\n sources: ['https://www.rubensworks.net/'],\n extensionFunctions: {\n 'http://example.org/functions#to-upper-case'(args: RDF.Term[]) {\n const arg = args[0];\n if (arg.termType === 'Literal' && arg.datatype.value === 'http://www.w3.org/2001/XMLSchema#string') {\n return DF.literal(arg.value.toUpperCase(), arg.datatype);\n }\n return arg;\n },\n },\n});\n```\n\nWithin this `extensionFunctions` dictionary, you can provide any number of extension functions.\nThese functions may even be `async`.\n\n## Callback-based extension functions\n\nIf function names are not known beforehand,\nor the dictionary-based format is not usable for whatever reason,\nthen the callback-based `extensionFunctionCreator` entry may be used:\n\n```typescript\nimport {DataFactory} from \"rdf-data-factory\";\n\nconst DF = new DataFactory();\n\nconst bindingsStream = await myEngine.queryBindings(`\nPREFIX func: <http://example.org/functions#>\nSELECT ?caps WHERE {\n ?s ?p ?o.\n BIND (func:to-upper-case(?o) AS ?caps)\n}\n`, {\n sources: ['https://www.rubensworks.net/'],\n extensionFunctionCreator: (funcTerm: RDF.NamedNode) => {\n if (funcTerm.value === 'http://example.org/functions#to-upper-case') {\n return (args: RDF.Term[]) => {\n const arg = args[0];\n if (arg.termType === 'Literal' && arg.datatype.value === 'http://www.w3.org/2001/XMLSchema#string') {\n return DF.literal(arg.value.toUpperCase(), arg.datatype);\n }\n return arg;\n };\n }\n },\n});\n```\n\nThe `extensionFunctionCreator` is invoked upon any 
occurrence of an extension function,\nand is called with the extension function name, wrapped within an [RDF/JS named node](/docs/query/advanced/rdfjs/).\nThe return type of this function is expected to be a function with the same signature\nas the values of the `extensionFunctions` dictionary, or `undefined`.\n"},60711:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Federated Querying\'\ndescription: \'Query over the union of data within any number of sources\'\n---\n\nOne of the key features of Comunica\nis the ability to query over **multiple sources** of different types.\nThis concept of querying over multiple sources is called _federated querying_.\n\nThis functionality can be exploited on both\nthe [CLI](/docs/query/getting_started/query_cli/) and the [JavaScript API](/docs/query/getting_started/query_app/).\nIn this guide, we will make use of the CLI as an example.\n\n
\nFederated query execution does not just send the query to each source separately.\nInstead, the triples from all sources are considered one large virtual dataset, which can then be queried over.\n
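\nThe "one large virtual dataset" idea can be sketched in a few lines: matching a triple pattern against a federation conceptually yields the union of the matches against each individual source. The snippet below is only an illustration of that idea with toy in-memory sources, not Comunica's actual federation algorithm, which plans and optimizes per source.\n\n```javascript\n// Two toy in-memory "sources", each a plain list of triples.\nconst sourceA = [\n  { s: "ex:alice", p: "foaf:knows", o: "ex:bob" },\n];\nconst sourceB = [\n  { s: "ex:bob", p: "foaf:knows", o: "ex:carol" },\n  { s: "ex:bob", p: "rdf:type", o: "foaf:Person" },\n];\n\n// A pattern term set to null acts as a variable and matches anything.\nfunction matchFederated(sources, pattern) {\n  const matches = triple =>\n    ["s", "p", "o"].every(key => pattern[key] === null || pattern[key] === triple[key]);\n  // The federation behaves as the union of all sources.\n  return sources.flatMap(source => source.filter(matches));\n}\n\nconst knows = matchFederated([sourceA, sourceB], { s: null, p: "foaf:knows", o: null });\n// `knows` now contains the matching triples from both sources.\n```\n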
\n\n## Distributed Knowledge\n\nA fundamental concept of Linked Data and the Semantic Web\nis that data can be spread over different sources across the Web.\nThis means that querying over this data potentially involves more than one source.\n\nWhile some knowledge graphs such as\n[DBpedia](https://wiki.dbpedia.org/) and [Wikidata](https://www.wikidata.org/wiki/Wikidata:Main_Page)\naim to accumulate as much data as possible in one place,\nthese always have limitations in scope.\nAs such, federated querying may be needed for some queries.\n\n## Federated Querying in Comunica\n\nComunica\'s ability to execute federated queries is enabled by default.\nThis can be invoked by simply passing more than one source to the engine.\n\nFor example, the following query will retrieve all triples from DBpedia and two RDF documents:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n https://www.rubensworks.net/ \\\n https://ruben.verborgh.org/profile/ \\\n "SELECT * WHERE { ?s ?p ?o }"\n```\n\nThe example above shows that sources do not necessarily have to be of [the same type](/docs/query/advanced/source_types/).\n\n## Real-world federation example\n\nOne example of a real-world federated query\nis the task of linking people in DBpedia to library datasets.\nFor this, the [Virtual International Authority File](http://viaf.org/) can be used as a source to provide this linking.\n\nThe query below will retrieve all books in the Harvard Library written by people born in San Francisco:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n http://data.linkeddatafragments.org/viaf \\\n http://data.linkeddatafragments.org/harvard \\\n \'SELECT ?person ?name ?book ?title {\n ?person dbpedia-owl:birthPlace [ rdfs:label "San Francisco"@en ].\n ?viafID schema:sameAs ?person;\n schema:name ?name.\n ?book dc:contributor [ foaf:name ?name ];\n dc:title ?title.\n }\'\n```\n\n
\nThe TPF-based source https://fragments.dbpedia.org/2016-04/en is interchangeable with the SPARQL-endpoint-based source https://dbpedia.org/sparql.\nThe engine will produce similar results, as both sources represent the same dataset.\n
\n'},33889:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'GraphQL-LD\'\ndescription: \'Using the power of JSON-LD contexts, GraphQL queries can be executed by Comunica\'\n---\n\nInstead of SPARQL queries, you can also provide [**GraphQL-LD**](https://github.com/rubensworks/graphql-ld.js) queries,\nwhich are [GraphQL](https://graphql.org/) queries\nenhanced with a [JSON-LD](https://json-ld.org/) context.\nGraphQL-LD is a developer-friendly alternative to SPARQL that allows querying Linked Data and using the results in a straightforward way.\n\n## What is GraphQL-LD?\n\nAssuming the following SPARQL query:\n\n```sparql\nSELECT ?id ?starring WHERE {\n OPTIONAL {\n ?id <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://dbpedia.org/ontology/Film>;\n <http://dbpedia.org/ontology/starring> ?starring.\n ?starring <http://www.w3.org/2000/01/rdf-schema#label> "Brad Pitt"@en.\n }\n}\n```\n\nThis could be written in a more compact way in GraphQL:\n\n```graphql\n{\n id\n ... on Film {\n starring(label: "Brad Pitt")\n }\n}\n```\n\nAnd this can be based on the following JSON-LD context:\n\n```json\n{\n "@context": {\n "Film": "http://dbpedia.org/ontology/Film",\n "label": { "@id": "http://www.w3.org/2000/01/rdf-schema#label", "@language": "en" },\n "starring": "http://dbpedia.org/ontology/starring"\n }\n}\n```\n\nLearn more about the **features** of GraphQL-LD on [GitHub](https://github.com/rubensworks/GraphQL-LD.js),\nor read [an article about GraphQL-LD](https://comunica.github.io/Article-ISWC2018-Demo-GraphQlLD/).\n\n## Using GraphQL-LD on the command line\n\nTo run GraphQL queries with [Comunica SPARQL from the command line](/docs/query/getting_started/query_cli/),\nset the `-i` flag to `graphql` and refer to your config file with the JSON-LD context (`@context`) through the `-c` flag.\nTo output your results as a GraphQL tree, set the MIME type of the output with `-t` to `tree`.\n\nFor example, the first 100 labels in DBpedia can be retrieved as follows:\n```bash\n$ comunica-sparql http://fragments.dbpedia.org/2015-10/en \\\n -q "{ label(first: 100) @single }" \\\n -c "{ \\"@context\\": { \\"label\\" : 
\\"http://www.w3.org/2000/01/rdf-schema#label\\" } }" \\\n -i graphql \\\n -t tree\n```\n\nSince the queries and contexts can be inconvenient to pass on the command line, they can also be supplied as files:\n```bash\n$ comunica-sparql http://fragments.dbpedia.org/2015-10/en \\\n -f query.graphql \\\n -c config-with-context.json \\\n -i graphql \\\n -t tree\n```\n\n## Using GraphQL-LD in an application\n\nIf you want to execute GraphQL-LD queries in [your application](/docs/query/getting_started/query_app/),\nyou can do this as follows:\n```javascript\nconst QueryEngine = require(\'@comunica/query-sparql\').QueryEngine;\nconst bindingsStreamToGraphQl = require(\'@comunica/actor-query-result-serialize-tree\').bindingsStreamToGraphQl;\n\nconst myEngine = new QueryEngine();\nconst result = await myEngine.query(`\n{\n label @single\n writer(label_en: \\"Michael Jackson\\") @single\n artist @single {\n label @single\n }\n}\n`, {\n sources: [\'http://fragments.dbpedia.org/2016-04/en\'],\n queryFormat: {\n language: \'graphql\',\n version: \'1.0\'\n },\n "@context": {\n "label": { "@id": "http://www.w3.org/2000/01/rdf-schema#label" },\n "label_en": { "@id": "http://www.w3.org/2000/01/rdf-schema#label", "@language": "en" },\n "writer": { "@id": "http://dbpedia.org/ontology/writer" },\n "artist": { "@id": "http://dbpedia.org/ontology/musicalArtist" }\n }\n});\n// Converts raw Comunica results to GraphQL objects\nconst data = await bindingsStreamToGraphQl(await result.execute(), result.context, {materializeRdfJsTerms: true});\n```\n'},85945:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'HDT'\ndescription: 'HDT offers highly compressed immutable RDF storage.'\n---\n\n[HDT](http://www.rdfhdt.org/) is a highly compressed RDF dataset format that enables efficient triple pattern querying.\nComunica enables executing SPARQL queries over HDT files,\nas it is one of the supported [source types](/docs/query/advanced/source_types/).\n\nQuerying over HDT requires [Comunica 
SPARQL HDT package (`@comunica/query-sparql-hdt`)](https://github.com/comunica/comunica-feature-hdt/tree/master/engines/query-sparql-hdt#readme).\n\n## 1. Installation\n\nSince Comunica runs on Node.js, make sure you have [Node.js installed](https://nodejs.org/en/) on your machine.\nHDT requires GCC 4.9 or higher to be available.\n\nNext, we can install Comunica SPARQL HDT on our machine:\n```bash\n$ npm install -g @comunica/query-sparql-hdt\n```\n\n## 2. SPARQL querying over one HDT file\n\nAfter installing Comunica SPARQL HDT, you will be given access to several commands including `comunica-sparql-hdt`,\nwhich allows you to execute SPARQL queries from the command line.\n\nJust like `comunica-sparql`,\nthis command requires one or more URLs to be provided as **sources** to query over.\nAs the last argument, a **SPARQL query string** can be provided.\n\nFor example, the following query retrieves the first 100 triples from a local HDT file:\n```bash\n$ comunica-sparql-hdt hdt@path/to/myfile.hdt \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\"\n```\n\n## 3. SPARQL querying over multiple HDT files\n\nJust like `comunica-sparql`, querying over multiple sources simply requires you to pass them after each other:\n```bash\n$ comunica-sparql-hdt hdt@path/to/myfile1.hdt \\\n hdt@path/to/myfile2.hdt \\\n hdt@path/to/myfile3.hdt \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\"\n```\n\n## 4. 
Learn more\n\nThis guide only discussed the basic functionality of `comunica-sparql-hdt`.\nYou can learn more options by invoking the _help_ command, or by [reading the Comunica SPARQL documentation](/docs/query/getting_started/query_cli/):\n```text\n$ comunica-sparql-hdt --help\n```\n\nThe API for [querying over HDT files in JavaScript apps is identical to Comunica SPARQL](/docs/query/getting_started/query_app/),\nand just requires importing `@comunica/query-sparql-hdt` instead of `@comunica/query-sparql`.\n\nIn order to [set up a SPARQL endpoint, `comunica-sparql-hdt-http` can be used, just like Comunica SPARQL](/docs/query/getting_started/setup_endpoint/).\n"},50974:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Logging'\ndescription: 'Loggers can be set to different logging levels to inspect what Comunica is doing behind the scenes.'\n---\n\nIf you want to inspect what is going on during query execution,\nyou can enable a logger in Comunica.\n\n
\nThis guide focuses on configuring logging levels and printing output.\nClick here if you want to learn more about invoking a logger from within an actor implementation.\n
\n\n## Logging on the command line\n\nUsing Comunica SPARQL on the command line, logging can be enabled via the `-l` option.\nFor example, printing debug-level logs can be done as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\" \\\n -l debug\n```\n```text\n[2022-02-23T09:46:17.615Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n[2022-02-23T09:46:17.756Z] INFO: Identified as qpf source: https://fragments.dbpedia.org/2016-04/en { actor: 'urn:comunica:default:rdf-resolve-hypermedia/actors#qpf' }\n[2022-02-23T09:46:17.761Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n[2022-02-23T09:46:17.785Z] DEBUG: Determined physical join operator 'inner-bind' {\n entries: 2,\n variables: [ [ 's', 'p', 'o' ], [ 's', 'o' ] ],\n costs: {\n 'inner-none': undefined,\n 'inner-single': undefined,\n 'inner-multi-empty': undefined,\n 'inner-bind': 6458426063925.053,\n 'inner-hash': undefined,\n 'inner-symmetric-hash': 
undefined,\n 'inner-nested-loop': 104059105829280600,\n 'optional-bind': undefined,\n 'optional-nested-loop': undefined,\n 'minus-hash': undefined,\n 'minus-hash-undef': undefined,\n 'inner-multi-smallest': undefined\n },\n coefficients: {\n 'inner-none': undefined,\n 'inner-single': undefined,\n 'inner-multi-empty': undefined,\n 'inner-bind': {\n iterations: 6404592831613.728,\n persistedItems: 0,\n blockingItems: 0,\n requestTime: 538332323.1132541\n },\n 'inner-hash': {\n iterations: 1140381039,\n persistedItems: 1040358853,\n blockingItems: 1040358853,\n requestTime: 1391277679.44\n },\n 'inner-symmetric-hash': {\n iterations: 1140381039,\n persistedItems: 1140381039,\n blockingItems: 0,\n requestTime: 1391277679.44\n },\n 'inner-nested-loop': {\n iterations: 104058966701512660,\n persistedItems: 0,\n blockingItems: 0,\n requestTime: 1391277679.44\n },\n 'optional-bind': undefined,\n 'optional-nested-loop': undefined,\n 'minus-hash': undefined,\n 'minus-hash-undef': undefined,\n 'inner-multi-smallest': undefined\n }\n}\n[2022-02-23T09:46:17.786Z] DEBUG: First entry for Bind Join: {\n entry: Quad {\n termType: 'Quad',\n value: '',\n subject: Variable { termType: 'Variable', value: 's' },\n predicate: NamedNode {\n termType: 'NamedNode',\n value: 'http://www.w3.org/1999/02/22-rdf-syntax-ns#type'\n },\n object: Variable { termType: 'Variable', value: 'o' },\n graph: DefaultGraph { termType: 'DefaultGraph', value: '' },\n type: 'pattern'\n },\n metadata: {\n requestTime: 18,\n pageSize: 100,\n cardinality: { type: 'estimate', value: 100022186 },\n first: 'https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type&page=1',\n next: 'https://fragments.dbpedia.org/2016-04/en?predicate=http%3A%2F%2Fwww.w3.org%2F1999%2F02%2F22-rdf-syntax-ns%23type&page=2',\n previous: null,\n last: null,\n searchForms: { values: [Array] },\n canContainUndefs: false,\n order: undefined,\n availableOrders: undefined,\n variables: [ 
[Variable], [Variable] ]\n },\n actor: 'urn:comunica:default:rdf-join/actors#inner-multi-bind'\n}\n[2022-02-23T09:46:17.794Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en?subject=http%3A%2F%2Fcommons.wikimedia.org%2Fwiki%2FSpecial%3AFilePath%2F%21%21%21%E5%96%84%E7%A6%8F%E5%AF%BA.JPG&object=http%3A%2F%2Fdbpedia.org%2Fontology%2FImage {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n[2022-02-23T09:46:17.795Z] INFO: Requesting https://fragments.dbpedia.org/2016-04/en?subject=http%3A%2F%2Fcommons.wikimedia.org%2Fwiki%2FSpecial%3AFilePath%2F%21%21%21%E5%96%84%E7%A6%8F%E5%AF%BA.JPG&object=http%3A%2F%2Fwikidata.dbpedia.org%2Fontology%2FImage {\n headers: {\n accept: 'application/n-quads,application/trig;q=0.95,application/ld+json;q=0.9,application/n-triples;q=0.8,text/turtle;q=0.6,application/rdf+xml;q=0.5,application/json;q=0.45,text/n3;q=0.35,application/xml;q=0.3,image/svg+xml;q=0.3,text/xml;q=0.3,text/html;q=0.2,application/xhtml+xml;q=0.18',\n 'user-agent': 'Comunica/actor-http-fetch (Node.js v14.17.0; darwin)'\n },\n method: 'GET',\n actor: 'urn:comunica:default:http/actors#fetch'\n}\n```\n\nAll log messages will be printed to standard error (`stderr`).\n\nIf you only want to print the logs, you can void all query results as follows:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s ?p ?o } LIMIT 100\" \\\n -l debug > /dev/null\n```\n\nIf you want to redirect all logs to a file, you can forward them like this:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n \"SELECT * WHERE { ?s 
?p ?o } LIMIT 100\" \\\n -l debug 2> /path/to/log.txt\n```\n\n## Logging levels\n\nThe following logging levels are available in Comunica:\n\n* `trace`\n* `debug`\n* `info`\n* `warn`\n* `error`\n* `fatal`\n\n
\nWhen enabling a level, all levels below are also enabled.\nFor example, when enabling error, then fatal will also be enabled.\n
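\nThis threshold behaviour can be sketched with a minimal logger: a message is kept only when its level is at least as severe as the configured one. This is an illustrative model only, not Comunica's actual `Logger` implementation.\n\n```javascript\nconst LEVELS = ['trace', 'debug', 'info', 'warn', 'error', 'fatal'];\n\n// Minimal sketch of level filtering: configuring 'error' also enables 'fatal'.\nclass MiniLogger {\n  constructor(level) {\n    this.threshold = LEVELS.indexOf(level);\n    this.lines = [];\n  }\n  log(level, message) {\n    // Keep the message only if it is at least as severe as the threshold.\n    if (LEVELS.indexOf(level) >= this.threshold) {\n      this.lines.push(`[${level.toUpperCase()}] ${message}`);\n    }\n  }\n}\n\nconst logger = new MiniLogger('error');\nlogger.log('debug', 'dropped');\nlogger.log('error', 'kept');\nlogger.log('fatal', 'also kept');\n```\n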
\n\n## Logging in an application\n\nUsing the `log` [context entry](/docs/query/advanced/context/), you can enable logging in a [JavaScript application that uses Comunica](/docs/query/getting_started/query_app/):\n```javascript\nimport {LoggerPretty} from \"@comunica/logger-pretty\";\n\nconst bindingsStream = await myEngine.queryBindings('SELECT * WHERE { ?s ?p ?o }', {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n log: new LoggerPretty({ level: 'debug' }),\n});\n```\n\nThis logger makes use of `LoggerPretty`, which will print everything to standard error (`stderr`),\njust like Comunica SPARQL on the command line.\n\nAlternatively, more advanced logging can be achieved by making use of [`@comunica/logger-bunyan`](https://github.com/comunica/comunica/tree/master/packages/logger-bunyan/),\nor by implementing your own logger that implements the [`Logger` interface](https://github.com/comunica/comunica/blob/master/packages/core/lib/Logger.ts).\n"},82329:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Memento'\ndescription: 'Using the Memento protocol, time travel queries can be executed.'\n---\n\nUsing the [Memento protocol](https://tools.ietf.org/html/rfc7089),\nit is possible to perform **time-based content negotiation** over HTTP.\nThis allows servers to expose different temporal versions of resources next to each other,\nand clients to retrieve these versions at different times.\n\nComunica has built-in support for the Memento protocol\n([`actor-http-memento`](https://github.com/comunica/comunica/tree/master/packages/actor-http-memento)).\nTo enable Memento, one simply passes a date to the query engine via the [context](/docs/query/advanced/context/),\nand Comunica will perform time-based negotiation for that date.\n\nFor example, the [DBpedia TPF interface supports the Memento protocol](https://ruben.verborgh.org/blog/2016/06/22/querying-history-with-linked-data/).\nIn order to query over it at version 2010 from the command line, a custom date 
can be passed with `-d`:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n -d 'June 1, 2010' \\\n 'SELECT ?name ?deathDate WHERE {\n ?person a dbpedia-owl:Artist;\n rdfs:label ?name;\n dbpedia-owl:birthPlace [ rdfs:label \"York\"@en ].\n FILTER LANGMATCHES(LANG(?name), \"EN\")\n OPTIONAL { ?person dbpprop:dateOfDeath ?deathDate. }\n }'\n```\n\nDates can also be passed via the JavaScript API, via the [query engine context](/docs/query/advanced/context/).\n"},13330:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'HTTP Proxy'\ndescription: 'All HTTP requests can optionally go through a proxy.'\n---\n\nOptionally, you can configure a proxy to redirect all HTTP(S) traffic.\nThis is for example useful when Comunica is used in a Web browser\nwhere a [proxy enables CORS headers on all responses](https://www.npmjs.com/package/cors-anywhere).\n\n## Proxying on the command line\n\nVia the command line, a proxy can be enabled via the `-p` option as follows:\n```bash\n$ comunica-sparql http://fragments.dbpedia.org/2015-10/en \"SELECT * WHERE { ?s ?p ?o }\" \\\n -p http://myproxy.org/?uri=\n```\n\n## Proxying in an application\n\nWhen using [Comunica SPARQL in an application](/docs/query/getting_started/query_app/), a proxy can be set using the `httpProxyHandler` [context entry](/docs/query/advanced/context/):\n```javascript\nimport { ProxyHandlerStatic } from \"@comunica/actor-http-proxy\";\n\nconst bindingsStream = await myEngine.queryBindings('SELECT * WHERE { ?s ?p ?o }', {\n sources: ['http://fragments.dbpedia.org/2015/en'],\n httpProxyHandler: new ProxyHandlerStatic('http://myproxy.org/?uri='),\n});\n```\n\nIn the example above, a `ProxyHandlerStatic` is passed,\nwhich will simply put the URL `http://myproxy.org/?uri=` in front of all URLs that would be requested.\n\nIf you need a more advanced proxy behaviour,\nthen you can implement your own proxy handler.\nAll proxy handlers must implement the [`IProxyHandler` 
interface](https://github.com/comunica/comunica/blob/master/packages/actor-http-proxy/lib/IProxyHandler.ts).\n"},68577:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'RDF/JS'\ndescription: 'To achieve maximum interoperability between different JavaScript libraries, Comunica builds on top of the RDF/JS specifications.'\n---\n\n
\n\nRDF/JS offers a set of RDF specifications for JavaScript libraries\nthat are defined by the [RDF JavaScript Libraries W3C community group](https://www.w3.org/community/rdfjs/).\nMost of the popular JavaScript libraries adhere to these specifications, which makes it possible to use them interchangeably, and in any combination.\nThis allows you to for example use an RDF parser from one developer, and pipe its output into an RDF store from another developer.\n\nFor most of these specifications, corresponding [TypeScript typings exist](https://www.npmjs.com/package/@types/rdf-js),\nand many libraries ship with their own typings as well,\nwhich makes RDF/JS especially useful if you want to develop more strongly-typed JavaScript applications.\n\nComunica is conformant to the following RDF/JS specifications. \n\n## Data model specification\n\nThe foundational part of RDF/JS is its [low-level **data model** specification](http://rdf.js.org/data-model-spec/),\nin which JavaScript interfaces are described for representing **RDF terms** and **RDF quads**.\nFive types of terms exist:\n\n* [Named Node](http://rdf.js.org/data-model-spec/#namednode-interface): Represents a thing by IRI, such as `https://www.rubensworks.net/#me`.\n* [Blank Node](http://rdf.js.org/data-model-spec/#blanknode-interface): Represents a thing without an explicit name.\n* [Literal](http://rdf.js.org/data-model-spec/#literal-interface): Represents a raw value of a certain datatype, such as `\"Ruben\"` or `1992`.\n* [Variable](http://rdf.js.org/data-model-spec/#variable-interface): Represents a variable, which can be used for matching values within queries.\n* [Default Graph](http://rdf.js.org/data-model-spec/#defaultgraph-interface): Represents the default graph in RDF. 
Other graphs can be represented with named or blank nodes.\n\n[RDF quads](http://rdf.js.org/data-model-spec/#quad-interface) are defined as an object with RDF terms for **subject**, **predicate**, **object** and **graph**.\nAn RDF triple is an alias of a quad,\nwhere the graph is set to the default graph.\nFor the remainder of this document, I will just refer to RDF quads.\n\nFinally, a [Data Factory](http://rdf.js.org/data-model-spec/#datafactory-interface) interface is defined,\nwhich allows you to easily create terms and quads that conform to this interface.\nDifferent Data Factory implementations exist, such as [`rdf-data-factory`](https://www.npmjs.com/package/rdf-data-factory)\nand the factory from [`N3.js`](https://github.com/rdfjs/N3.js#interface-specifications).\nFor example, creating a quad for representing someone's name with a data factory can be done like this:\n\n```javascript\nimport { DataFactory } from 'rdf-data-factory';\n\nconst factory = new DataFactory();\n\nconst quad = factory.quad(\n factory.namedNode('https://www.rubensworks.net/#me'), // subject\n factory.namedNode('http://schema.org/name'), // predicate\n factory.literal('Ruben') // object\n);\n```\n\nReading raw values from the quad can be done as follows:\n\n```javascript\nquad.subject.value === 'https://www.rubensworks.net/#me';\nquad.predicate.value === 'http://schema.org/name';\nquad.object.value === 'Ruben';\n```\n\nFor checking whether or not quads and terms are equal to each other, the `equals` method can be used:\n\n```javascript\nfactory.literal('Ruben').equals(factory.literal('Ruben')); // true\nfactory.literal('Ruben').equals(factory.literal('Ruben2')); // false\nquad.equals(quad); // true\n```\n\n## Stream interfaces\n\nComunica handles most parts of query execution in a **streaming** manner,\nwhich means that some query results may already be returned\neven though other results are still being processed.\n\nNext to the RDF/JS data model, a dedicated specification exists for 
handling [RDF streams](http://rdf.js.org/stream-spec/),\nwhich is highly important to Comunica.\n\nOne interface of high importance is the [RDF/JS `Source` interface](http://rdf.js.org/stream-spec/#source-interface).\nYou can [pass a custom `Source` to Comunica to execute queries over it](/docs/query/advanced/rdfjs_querying/).\n\nThe [RDF/JS `Store` interface](http://rdf.js.org/stream-spec/#store-interface) is an extension of `Source`\nthat also allows quads to be added and removed.\nYou can [pass a custom `Store` to Comunica to execute update queries over it](/docs/query/advanced/rdfjs_updating/).\n\n## Query interfaces\n\nThe [RDF/JS query spec](http://rdf.js.org/query-spec/) is a specification that provides\nhigh-level and low-level interfaces that are common to query engines.\nFor example, query engines implementing these high-level interfaces are mostly interchangeable when used within applications.\n\nThe most important high-level interfaces that are implemented by Comunica\nare the [Queryable](https://rdf.js.org/query-spec/#queryable-interfaces)\nand [SparqlQueryable](https://rdf.js.org/query-spec/#sparql-queryable-interfaces) interfaces.\nCompared to these standard interfaces, the only additional requirement that Comunica places is the usage\nof a [source-based context](https://rdf.js.org/query-spec/#querysourcecontext-interface) as second argument to the query methods.\n\nNext to that, Comunica also implements the [`BindingsFactory`](http://rdf.js.org/query-spec/#bindingsfactory-interface)\nand [`Bindings`](http://rdf.js.org/query-spec/#bindings-interface) interfaces via the\n[`@comunica/bindings-factory`](https://github.com/comunica/comunica/tree/master/packages/bindings-factory) package.\nLearn more about the usage of these bindings [here](/docs/query/advanced/bindings/).\n"},82075:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Querying over RDF/JS sources'\ndescription: 'If the built-in source types are not sufficient, you can pass a 
custom JavaScript object implementing a specific interface.'\n---\n\nOne of the [different types of sources](/docs/query/advanced/source_types/) that is supported by Comunica\nis the [RDF/JS `Source` interface](http://rdf.js.org/stream-spec/#source-interface).\nThis allows you to pass objects as source to Comunica as long as they implement this interface.\n\nAn RDF/JS `Source` exposes the [`match`](http://rdf.js.org/stream-spec/#source-interface) method\nthat allows quad pattern queries to be executed,\nand matching quads to be returned as a stream.\n\n
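\nTo illustrate the shape of this interface, here is a minimal in-memory `Source` sketch. The `ArraySource` name and the value-only term comparison are our own simplifications; a real implementation should honour full RDF/JS term equality (term type, datatype, language).\n\n```javascript\nconst { Readable } = require('stream');\n\n// Minimal sketch of an RDF/JS-style Source backed by an array of quads.\n// A null/undefined pattern term acts as a wildcard.\nclass ArraySource {\n  constructor(quads) {\n    this.quads = quads;\n  }\n\n  match(subject, predicate, object, graph) {\n    const termMatches = (pattern, term) => !pattern || pattern.value === term.value;\n    const matches = this.quads.filter(quad =>\n      termMatches(subject, quad.subject) &&\n      termMatches(predicate, quad.predicate) &&\n      termMatches(object, quad.object));\n    // Return the matching quads as a stream, as the interface requires.\n    return Readable.from(matches);\n  }\n}\n```\n\nPassing an instance of such a class in the `sources` array would then work the same way as the `N3.Store` example below.\n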
\n\nSeveral implementations of this `Source` interface exist.\nIn the example below, we make use of the [`Store` from `N3.js`](https://github.com/rdfjs/N3.js#storing)\nthat offers one possible implementation when you want to [query over it with Comunica within a JavaScript application](/docs/query/getting_started/query_app/):\n```javascript\nconst store = new N3.Store();\nstore.addQuad(\n namedNode('http://ex.org/Pluto'),\n namedNode('http://ex.org/type'),\n namedNode('http://ex.org/Dog')\n);\nstore.addQuad(\n namedNode('http://ex.org/Mickey'),\n namedNode('http://ex.org/type'),\n namedNode('http://ex.org/Mouse')\n);\n\nconst bindingsStream = await myEngine.queryBindings(`SELECT * WHERE { ?s ?p ?o }`, {\n sources: [store],\n});\n```\n\n
\nInstead of the default Comunica SPARQL package (@comunica/query-sparql),\nthe Comunica SPARQL RDF/JS package (@comunica/query-sparql-rdfjs)\ncan also be used as a more lightweight alternative\nthat only allows querying over RDF/JS sources.\n
\n\n
\nIf the RDF/JS `Source` also implements the RDF/JS Store interface,\nthen it also supports update queries to add, change or delete quads in the store.\n
\n\n## Optional: query optimization\n\nThe RDF/JS [Source interface](http://rdf.js.org/#source-interface) by default only exposes the `match` method.\nIn order to allow Comunica to produce more efficient query plans,\nyou can optionally expose a `countQuads` method that has the same signature as `match`,\nbut returns a `number` or a `Promise` of a number that represents (an estimate of)\nthe number of quads that would match the given quad pattern.\nCertain `Source` implementations may be able to provide an efficient implementation of this method,\nwhich would lead to better query performance.\n\nIf Comunica does not detect a `countQuads` method, it will fall back to a sub-optimal counting mechanism\nwhere `match` will be called again to manually count the number of matches.\n"},27124:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Updating RDF/JS stores\'\ndescription: \'If the built-in destination types are not sufficient, you can pass a custom JavaScript object implementing a specific interface.\'\n---\n\nOne of the [different types of destinations](/docs/query/advanced/destination_types/) that is supported by Comunica\nis the [RDF/JS `Store` interface](http://rdf.js.org/stream-spec/#store-interface).\nThis allows you to pass objects as destination to Comunica as long as they implement this interface.\n\n
\n\nSeveral implementations of this `Store` interface exist.\nIn the example below, we make use of the [`Store` from `N3.js`](https://github.com/rdfjs/N3.js#storing)\nthat offers one possible implementation when you want to [query over it with Comunica within a JavaScript application](/docs/query/getting_started/query_app/):\n```javascript\nconst store = new N3.Store();\n\nconst query = `\nPREFIX dc: <http://purl.org/dc/elements/1.1/>\nINSERT DATA\n{ \n <http://example/book1> dc:title "A new book" ;\n dc:creator "A.N.Other" .\n}\n`;\n\n// Execute the update\nawait myEngine.queryVoid(query, {\n sources: [store],\n});\n\n// Prints \'2\' => the store is updated\nconsole.log(store.size);\n```\n'},54924:function(e,n,t){"use strict";t.r(n),n.default='---\ntitle: \'Result formats\'\ndescription: \'Query results can be serialized in different formats.\'\n---\n\nBy default, Comunica has support for the following result formats:\n\n| **Media type** | **Description** |\n| ------- | --------------- |\n| [`application/json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-json) | A custom, simplified JSON result format. |\n| [`simple`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-simple) | A custom, text-based result format. |\n| [`application/sparql-results+json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-json) | The [SPARQL/JSON](https://www.w3.org/TR/sparql11-results-json/) results format. |\n| [`application/sparql-results+xml`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-xml) | The [SPARQL/XML](https://www.w3.org/TR/rdf-sparql-XMLres/) results format. |\n| [`text/csv`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-csv) | The [SPARQL/CSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) results format. 
|\n| [`text/tab-separated-values`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-sparql-tsv) | The [SPARQL/TSV](https://www.w3.org/TR/sparql11-results-csv-tsv/) results format. |\n| [`stats`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-stats) | A custom results format for testing and debugging. |\n| [`table`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-table) | A text-based visual table result format. |\n| [`tree`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-tree) | A tree-based result format for GraphQL-LD result compacting. |\n| [`application/trig`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [TriG](https://www.w3.org/TR/trig/) RDF serialization. |\n| [`application/n-quads`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [N-Quads](https://www.w3.org/TR/n-quads/) RDF serialization. |\n| [`text/turtle`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [Turtle](https://www.w3.org/TR/turtle/) RDF serialization. |\n| [`application/n-triples`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [N-Triples](https://www.w3.org/TR/n-triples/) RDF serialization. |\n| [`text/n3`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [Notation3](https://www.w3.org/TeamSubmission/n3/) serialization. |\n| [`application/ld+json`](https://github.com/comunica/comunica/tree/master/packages/actor-sparql-serialize-rdf) | The [JSON-LD](https://json-ld.org/) RDF serialization. 
|\n\n## Querying from the command line\n\nWhen using [Comunica from the command line](/docs/query/getting_started/query_cli/),\nthe result format can be set using the `-t` option:\n```bash\n$ comunica-sparql https://fragments.dbpedia.org/2016-04/en \\\n "SELECT * WHERE { ?s ?p ?o } LIMIT 100" \\\n -t "application/sparql-results+json"\n```\n```json\n{"head": {"vars":["s","p","o"]},\n"results": { "bindings": [\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/date","type":"uri"},"o":{"value":"1899-05-06","type":"literal","datatype":"http://www.w3.org/2001/XMLSchema#date"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/isCitedBy","type":"uri"},"o":{"value":"http://dbpedia.org/resource/Tierce_(unit)","type":"uri"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/newspaper","type":"uri"},"o":{"value":"Biloxi Daily Herald","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/page","type":"uri"},"o":{"value":"6","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/title","type":"uri"},"o":{"value":"A New System of Weights and 
Measures","type":"literal"}},\n{"s":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"},"p":{"value":"http://dbpedia.org/property/url","type":"uri"},"o":{"value":"http://0-access.newspaperarchive.com.lib.utep.edu/us/mississippi/biloxi/biloxi-daily-herald/1899/05-06/page-6?tag=tierce+wine&rtserp=tags/tierce-wine?page=2","type":"uri"}},\n...\n```\n\n
\nAll available formats can be printed via `comunica-sparql --listformats`.\n
\n\n## Querying in a JavaScript app\n\nWhen using [Comunica in a JavaScript application](/docs/query/getting_started/query_app/),\nresults can be serialized to a certain format using `resultToString()`:\n```javascript\nconst result = await myEngine.query(`\n SELECT ?s ?p ?o WHERE {\n ?s ?p <http://dbpedia.org/resource/Belgium>.\n ?s ?p ?o\n } LIMIT 100`, {\n sources: [\'http://fragments.dbpedia.org/2015/en\'],\n});\nconst { data } = await myEngine.resultToString(result,\n \'application/sparql-results+json\');\ndata.pipe(process.stdout); // Print to standard output\n```\n\nThe `resultToString()` method accepts a query result and a result format media type.\nThe media type is optional, and will default to `application/json` for bindings, `application/trig` for quads, and `simple` for booleans.\n\n
\nAll available result formats can be retrieved programmatically\nby invoking the asynchronous `getResultMediaTypes()` method.\n
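The default media type selection described above can be sketched as a small helper. This is a simplified illustration of the documented defaults (`application/json` for bindings, `application/trig` for quads, `simple` for booleans), not Comunica's actual implementation:

```javascript
// Hypothetical helper mirroring the documented defaults of resultToString().
function defaultMediaType(resultType) {
  switch (resultType) {
    case 'bindings':
      return 'application/json';
    case 'quads':
      return 'application/trig';
    case 'boolean':
      return 'simple';
    default:
      throw new Error(`Unknown result type: ${resultType}`);
  }
}

console.log(defaultMediaType('bindings')); // application/json
```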
\n'},89366:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Solid'\ndescription: 'Solid – the Web-based decentralization ecosystem – can be queried with Comunica.'\n---\n\n## What is Solid\n\n[Solid](https://solidproject.org/) is a Web-based decentralization ecosystem\nwhere people are in control of their own data.\n\nSolid achieves this by giving everyone control over their own **personal data pod**.\nApplications are completely separate, and have to ask permission to access your data.\n\nSince Solid and Comunica have a compatible technology stack,\nComunica can be used to query over Solid data pods.\nThe default [Comunica SPARQL engine](/docs/query/getting_started/query_cli/)\ncan directly be used to query over Solid data pods, as long as you are querying over public data.\nIf you want to **query over data pods that require authentication**,\nyou can use one of the approaches mentioned below.\n\n## Query pods with a custom fetch function\n\nLibraries such as [@inrupt/solid-client-authn-node](https://www.npmjs.com/package/@inrupt/solid-client-authn-node)\nand [@inrupt/solid-client-authn-browser](https://www.npmjs.com/package/@inrupt/solid-client-authn-browser)\nallow you to authenticate with your Solid WebID.\nThese libraries provide a custom `fetch` function with which you can execute authenticated HTTP requests.\n\nYou can forward this fetch function to Comunica SPARQL to make it perform authenticated queries to pods, as shown below.\n\n```typescript\nimport { QueryEngine } from '@comunica/query-sparql-solid';\nimport { Session } from '@inrupt/solid-client-authn-node';\n\nconst session = new Session();\nconst myEngine = new QueryEngine();\n\nawait session.login({ ... 
}); // Log in as explained in https://docs.inrupt.com/developer-tools/javascript/client-libraries/tutorial/authenticate-nodejs-web-server/\n\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT * WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n // Set your profile as query source\n sources: [session.info.webId],\n // Pass the authenticated fetch function\n fetch: session.fetch,\n});\n```\n\n## Query pods with an existing Solid session\n\n[Comunica SPARQL Solid](https://github.com/comunica/comunica-feature-solid/tree/master/engines/query-sparql-solid)\nallows you to pass your authenticated Solid session object.\nBelow, we list some examples of how to use it from JavaScript and the command line.\nPlease refer to the [README of Comunica SPARQL Solid](https://github.com/comunica/comunica-feature-solid/tree/master/engines/query-sparql-solid#readme)\nfor more details.\n\n**Querying from JavaScript**:\n```typescript\nimport { QueryEngine } from '@comunica/query-sparql-solid';\nimport { Session } from '@inrupt/solid-client-authn-node';\n\nconst session = new Session();\nconst myEngine = new QueryEngine();\n\nawait session.login({ ... 
}); // Log in as explained in https://docs.inrupt.com/developer-tools/javascript/client-libraries/tutorial/authenticate-nodejs-web-server/\n\nconst bindingsStream = await myEngine.queryBindings(`\n SELECT * WHERE {\n ?s ?p ?o\n } LIMIT 100`, {\n // Set your profile as query source\n sources: [session.info.webId],\n // Pass your authenticated session\n '@comunica/actor-http-inrupt-solid-client-authn:session': session,\n});\n```\n\n**Querying an existing document**:\n```bash\n$ comunica-sparql-solid --idp https://solidcommunity.net/ \\\n http://example.org/existing-document.ttl \\\n \"SELECT * { ?s ?p ?o }\"\n```\n\n**Creating a new document**:\n```bash\n$ comunica-sparql-solid --idp https://solidcommunity.net/ \\\n http://example.org/new-document.ttl \\\n \"INSERT DATA { }\"\n```\n\n**Updating an existing document**:\n```bash\n$ comunica-sparql-solid --idp https://solidcommunity.net/ \\\n http://example.org/existing-document.ttl \\\n \"INSERT DATA { }\"\n```\n\nPlease be aware that there are several [open known issues](https://github.com/comunica/comunica-feature-solid/tree/master/engines/query-sparql-solid#known-issues) relating to other software.\n\n[LDflex](/docs/query/usage/#ldflex) and [GraphQL-LD](/docs/query/usage/#graphql-ld) are examples of tools that ship with Comunica SPARQL Solid.\n\n## Query pods using link traversal\n\nThe approaches for querying Solid mentioned above require you to know upfront in which pod and in which documents\nyour data resides before you can query over it.\n[_Comunica SPARQL Link Traversal Solid_](https://github.com/comunica/comunica-feature-link-traversal/tree/master/engines/query-sparql-link-traversal-solid#comunica-sparql-link-traversal)\nprovides a way to query over Solid pods without having to know beforehand in which documents the necessary data resides.\nIt does this by following links between documents _during query execution_.\n\nThis is still an experimental query approach, which does not yet work well for 
complex queries.\nLearn more about active [research on link traversal in Solid](https://comunica.dev/research/link_traversal/).\n\nThe example below executes a query across multiple simulated Solid pods to find all forums containing messages by a certain creator:\n\n```typescript\nimport { QueryEngine } from '@comunica/query-sparql-solid';\n\nconst myEngine = new QueryEngine();\nconst bindingsStream = await myEngine.queryBindings(`\n PREFIX snvoc: <https://solidbench.linkeddatafragments.org/www.ldbc.eu/ldbc_socialnet/1.0/vocabulary/>\n SELECT DISTINCT ?forumId ?forumTitle WHERE {\n ?message snvoc:hasCreator <https://solidbench.linkeddatafragments.org/pods/00000006597069767117/profile/card#me>.\n ?forum snvoc:containerOf ?message;\n snvoc:id ?forumId;\n snvoc:title ?forumTitle.\n }`, {\n // Sources field is optional. Will be derived from query if not provided.\n //sources: [session.info.webId], // Sets your profile as query source\n // Session is optional for authenticated requests\n //'@comunica/actor-http-inrupt-solid-client-authn:session': session,\n // The lenient flag will make the engine not crash on invalid documents\n lenient: true,\n});\n```\n\nTry out the query above in our [live demo](https://comunica.github.io/comunica-feature-link-traversal-web-clients/builds/solid-default/#query=PREFIX%20snvoc%3A%20%3Chttps%3A%2F%2Fsolidbench.linkeddatafragments.org%2Fwww.ldbc.eu%2Fldbc_socialnet%2F1.0%2Fvocabulary%2F%3E%0ASELECT%20DISTINCT%20%3FforumId%20%3FforumTitle%20WHERE%20%7B%0A%20%20%3Fmessage%20snvoc%3AhasCreator%20%3Chttps%3A%2F%2Fsolidbench.linkeddatafragments.org%2Fpods%2F00000006597069767117%2Fprofile%2Fcard%23me%3E.%0A%20%20%3Fforum%20snvoc%3AcontainerOf%20%3Fmessage%3B%0A%20%20%20%20snvoc%3Aid%20%3FforumId%3B%0A%20%20%20%20snvoc%3Atitle%20%3FforumTitle.%0A%7D).\n"},42473:function(e,n,t){"use strict";t.r(n),n.default="---\ntitle: 'Source types'\ndescription: 'Comunica detects and handles different types of sources.'\n---\n\nComunica SPARQL enables query execution over one or more sources\non both the [command line](/docs/query/getting_started/query_cli/)\nand when [calling Comunica from a JavaScript 
application](/docs/query/getting_started/query_app/).\n\nUsually, sources are passed as URLs that point to Web resources.\nBased on what is returned when _dereferencing_ this URL,\nComunica can apply different query algorithms.\n\nInstead of relying on Comunica's detection algorithms,\nyou can **enforce** the use of a certain type.\n\n
\nSome SPARQL endpoints may be recognised as a file instead of a SPARQL endpoint because they do not support SPARQL Service Description,\nwhich may produce incorrect results. For these cases, the `sparql` type must be set explicitly.\n
\n\n
\nWhen enabling the info logger,\nyou can see which source type Comunica has determined for each source.\n
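As a rough illustration of what such detection involves, the toy helper below keys off a response's `Content-Type` header. This is a gross simplification and purely hypothetical: Comunica's actual detection also inspects the response body, hypermedia metadata, and SPARQL Service Descriptions.

```javascript
// Hypothetical sketch of content-type-based source type detection.
// Comunica's real algorithm is far more involved.
function guessSourceType(contentType) {
  // SPARQL result media types hint at a SPARQL endpoint.
  if (contentType === 'application/sparql-results+json'
    || contentType === 'application/sparql-results+xml') {
    return 'sparql';
  }
  // Plain RDF serializations hint at a dereferenceable RDF file.
  const rdfTypes = ['text/turtle', 'application/trig', 'application/n-quads', 'application/ld+json'];
  if (rdfTypes.includes(contentType)) {
    return 'file';
  }
  // Otherwise, fall back to hypermedia-based detection.
  return 'hypermedia';
}

console.log(guessSourceType('text/turtle')); // file
```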
\n\n## Setting source type on the command line\n\nOn the [command line](/docs/query/getting_started/query_cli/), source types can optionally be enforced by prefixing the URL with `@`, such as:\n```bash\n$ comunica-sparql sparql@https://dbpedia.org/sparql \\\n \"CONSTRUCT WHERE { ?s ?p ?o } LIMIT 100\"\n```\n\n## Setting source type in an application\n\nVia a [JavaScript application](/docs/query/getting_started/query_app/),\nthe source type can be set by using a hash containing `type` and `value`:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n { type: 'sparql', value: 'https://dbpedia.org/sparql' },\n ],\n});\n```\n\n## Supported source types\n\nThe table below summarizes the different source types that Comunica supports by default:\n\n| **Type name** | **Description** |\n|---------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| `file` | plain RDF file in any RDF serialization, such as [Turtle](https://www.w3.org/TR/turtle/), [TriG](https://www.w3.org/TR/trig/), [JSON-LD](https://json-ld.org/), [RDFa](https://www.w3.org/TR/rdfa-primer/), ... 
|\n| `sparql` | [SPARQL endpoint](https://www.w3.org/TR/sparql11-protocol/) |\n| `hypermedia` | Sources that expose query capabilities via hypermedia metadata, such as [Triple Pattern Fragments](https://linkeddatafragments.org/specification/triple-pattern-fragments/) and [Quad Pattern Fragments](https://linkeddatafragments.org/specification/quad-pattern-fragments/) |\n| `qpf` | A hypermedia source that is enforced as [Triple Pattern Fragments](https://linkeddatafragments.org/specification/triple-pattern-fragments/) or [Quad Pattern Fragments](https://linkeddatafragments.org/specification/quad-pattern-fragments/) |\n| `brtpf` | A hypermedia source that is enforced as [bindings-restricted Triple Pattern Fragments](https://arxiv.org/abs/1608.08148) |\n| `rdfjs` | JavaScript objects implementing the [RDF/JS `source` interface](/docs/query/advanced/rdfjs_querying/) |\n| `serialized` | An RDF dataset serialized as a string in a certain format. |\n| `hdt` | [HDT files](/docs/query/advanced/hdt/) |\n| `ostrichFile` | Versioned [OSTRICH archives](https://github.com/rdfostrich/comunica-query-sparql-ostrich) |\n\nThe default source type is `auto`,\nwhich will automatically detect the proper source type.\nFor example, if a [SPARQL Service Description](https://www.w3.org/TR/sparql11-service-description/)\nis detected, the `sparql` type is used.\n\n## RDF serializations\n\nComunica will interpret the `Content-Type` header of HTTP responses to determine the RDF serialization used.\nIf the server did not provide such a header, Comunica will attempt to derive the serialization based on the file extension.\n\nThe following RDF serializations are supported:\n\n| **Name** | **Content type** | **Extensions** |\n| -------- | ---------------- | ------------- |\n| [TriG](https://www.w3.org/TR/trig/) | `application/trig` | `.trig` |\n| [N-Quads](https://www.w3.org/TR/n-quads/) | `application/n-quads` | `.nq`, `.nquads` |\n| [Turtle](https://www.w3.org/TR/turtle/) | `text/turtle` | `.ttl`, `.turtle` 
|\n| [N-Triples](https://www.w3.org/TR/n-triples/) | `application/n-triples` | `.nt`, `.ntriples` |\n| [Notation3](https://www.w3.org/TeamSubmission/n3/) | `text/n3` | `.n3` |\n| [JSON-LD](https://json-ld.org/) | `application/ld+json`, `application/json` | `.json`, `.jsonld` |\n| [RDF/XML](https://www.w3.org/TR/rdf-syntax-grammar/) | `application/rdf+xml` | `.rdf`, `.rdfxml`, `.owl` |\n| [RDFa](https://www.w3.org/TR/rdfa-in-html/) and script RDF data tags [HTML](https://html.spec.whatwg.org/multipage/)/[XHTML](https://www.w3.org/TR/xhtml-rdfa/) | `text/html`, `application/xhtml+xml` | `.html`, `.htm`, `.xhtml`, `.xht` |\n| [RDFa](https://www.w3.org/TR/2008/REC-SVGTiny12-20081222/metadata.html#MetadataAttributes) in [SVG](https://www.w3.org/TR/SVGTiny12/)/[XML](https://html.spec.whatwg.org/multipage/) | `image/svg+xml`,`application/xml` | `.xml`, `.svg`, `.svgz` |\n\n## String source\n\nString-based sources allow you to query over sources that are represented as a string in a certain RDF serialization.\n\nFor example, querying over a Turtle-based datasource:\n```javascript\nconst bindingsStream = await myEngine.queryBindings(`...`, {\n sources: [\n {\n type: 'serialized',\n value: '.