Update Documentation Deployment (#256)
* Remove old file

* Minor doc change

* Update html documentation page

* Remove unused files

* Move file

* Change image link

* Update mkdocs.yml

* Update docs deployment

* Add deployment test

* Change trigger

* Change python action

* Change python action 2

* Add ontology deployment

* Fix ontology deployment

* Fix test

* Fix test 2

* Fix test 3

* Fix test 4

* Fix test 5

* Fix test 6

* Fix test 7

* Fix test 8

* Fix test 9

* Fix test 10

* Remove test workflow

* Fix python setup

* Test release files

* Fix workflow

* Fix workflow 2

* Fix workflow 3

* Remove test workflow
nck-mlcnv authored Aug 7, 2024
1 parent 695e5e1 commit 1083cda
Showing 11 changed files with 160 additions and 184 deletions.
4 changes: 0 additions & 4 deletions .bettercodehub.yml

This file was deleted.

9 changes: 0 additions & 9 deletions .github/pages/javadoc-latest.html

This file was deleted.

9 changes: 0 additions & 9 deletions .github/pages/latest.html

This file was deleted.

47 changes: 32 additions & 15 deletions .github/workflows/deploy.yml
@@ -81,52 +81,69 @@ jobs:
distribution: 'adopt'
cache: 'maven'
- name: Set up Python
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: 3.x
cache: 'pip'
- run: pip install mkdocs-material
- run: pip install mkdocs-macros-plugin
- run: sed -i "s/\$VERSION/${{ env.RELEASE_VERSION }}/g" mkdocs.yml
- run: sed -i "s/\$RELEASE_VERSION/${{ env.RELEASE_VERSION }}/g" mkdocs.yml

- run: mkdocs build -d site/${{ env.RELEASE_VERSION }}
- run: mvn javadoc:javadoc
- run: sed -i "s/\$VERSION/${{ env.RELEASE_VERSION }}/g" .github/pages/latest.html
- run: sed -i "s/\$VERSION/${{ env.RELEASE_VERSION }}/g" .github/pages/javadoc-latest.html

- name: Deploy Site
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./site/${{ env.RELEASE_VERSION }}
destination_dir: ./docs/${{ env.RELEASE_VERSION }}
- name: Deploy Site
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./site/${{ env.RELEASE_VERSION }}
destination_dir: ./docs/latest

- name: Deploy Javadoc
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./javadoc/${{ env.RELEASE_VERSION }}
publish_dir: ./javadoc/${{ env.RELEASE_VERSION }}/apidocs
destination_dir: ./javadoc/${{ env.RELEASE_VERSION }}
- name: Deploy latest.html
- name: Deploy Javadoc
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .github/pages/
keep_files: true
destination_dir: ./docs/
- name: Deploy latest.html
publish_dir: ./javadoc/${{ env.RELEASE_VERSION }}/apidocs
destination_dir: ./javadoc/latest

- name: Find Ontology Version
run: echo "ONTOLOGY_VERSION=$(grep 'versionIRI' schema/iguana.owx | grep -Po '[0-9]+.[0-9]+.[0-9]+')" >> $GITHUB_OUTPUT
id: find_ontology_version

- name: Fetch Ontologies
run: git fetch && git checkout origin/gh-pages ontology/
- run: mkdir -p ontology/${{ steps.find_ontology_version.outputs.ONTOLOGY_VERSION }}
- run: cp schema/iguana.owx ontology/${{ steps.find_ontology_version.outputs.ONTOLOGY_VERSION }}/iguana.owx
- run: cp schema/iguana.owx ontology/iguana.owx

- name: Deploy Ontology
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: .github/pages/
keep_files: true
destination_dir: ./docs/
publish_dir: ./ontology/
destination_dir: ./ontology/


deploy_gh_release:
name: Publish GitHub Release
runs-on: ubuntu-latest
needs: [compile-jar, deploy_to_maven, find_version]
needs: [compile_native, deploy_to_maven, find_version]
env:
RELEASE_VERSION: ${{ needs.find_version.outputs.RELEASE_VERSION }}

steps:
- uses: actions/checkout@v4
- name: Download artifacts from previous jobs
uses: actions/download-artifact@v4
with:
Binary file removed customs/images/Iguana_new_logo6.png
Binary file not shown.
Binary file removed customs/images/iguana-result-schema.png
Binary file not shown.
212 changes: 106 additions & 106 deletions docs/README.md
@@ -1,107 +1,107 @@
<p align="center">
<img src="https://github.com/dice-group/IGUANA/raw/develop/images/IGUANA_logo.png" alt="IGUANA Logo" width="200">
</p>

# IGUANA
Iguana is a benchmarking framework for testing the read performances of HTTP endpoints.
It is mostly designed for benchmarking triplestores by using the SPARQL protocol.
Iguana stresstests endpoints by simulating users which send a set of queries independently of each other.

Benchmarks are configured using a YAML-file, this allows them to be easily repeated and adjustable.
Results are stored in RDF-files and can also be exported as CSV-files.

## Features
- Benchmarking of (SPARQL) HTTP endpoints
- Reusable configuration
- Calculation of various metrics for better comparisons
- Processing of HTTP responses (e.g., results counting)

## Setup

### Prerequisites

If you're using the native version of IGUANA, you need to have at least a `x86-64-v3` (Intel Haswell and AMD Excavator or newer) system that is running Linux.

If you're using the Java version of IGUANA, you need to have `Java 17` or higher installed.
On Ubuntu it can be installed by executing the following command:

```bash
sudo apt install openjdk-17-jre
```

### Download
The latest release can be downloaded at https://github.com/dice-group/IGUANA/releases/latest.
The zip file contains three files:

* `iguana`
* `iguana.jar`
* `example-suite.yml`
* `start-iguana.sh`

The `iguana` file is a native executable for IGUANA that has been compiled with GraalVM.
The `iguana.jar` file is the standard Java executable for IGUANA.
The `start-iguana.sh` script is a helper script to start IGUANA with the `iguana.jar` file.

### Configuration
The `example-suite.yml` file contains an extensive configuration for a benchmark suite.
It can be used as a starting point for your own benchmark suite.
For a detailed explanation of the configuration, see the [configuration](./configuration/overview.md) documentation.

## Usage

### Native Version

Start Iguana with a benchmark suite (e.g., the `example-suite.yml`) by executing the binary:

```bash
./iguana example-suite.yml
```

### Java Version

Start Iguana with a benchmark suite (e.g., the `example-suite.yml`) either by using the start script:

```bash
./start-iguana.sh example-suite.yml
```

or by directly executing the jar-file:

```bash
java -jar iguana.jar example-suite.yml
```

If you're using the script, you can use JVM arguments by setting the environment variable `IGUANA_JVM`.
For example, to let Iguana use 4GB of RAM you can set `IGUANA_JVM` as follows:

```bash
export IGUANA_JVM=-Xmx4g
```

# How to Cite

```bibtex
@InProceedings{10.1007/978-3-319-68204-4_5,
author="Conrads, Lixi
and Lehmann, Jens
and Saleem, Muhammad
and Morsey, Mohamed
and Ngonga Ngomo, Axel-Cyrille",
editor="d'Amato, Claudia
and Fernandez, Miriam
and Tamma, Valentina
and Lecue, Freddy
and Cudr{\'e}-Mauroux, Philippe
and Sequeda, Juan
and Lange, Christoph
and Heflin, Jeff",
title="Iguana: A Generic Framework for Benchmarking the Read-Write Performance of Triple Stores",
booktitle="The Semantic Web -- ISWC 2017",
year="2017",
publisher="Springer International Publishing",
address="Cham",
pages="48--65",
abstract="The performance of triples stores is crucial for applications driven by RDF. Several benchmarks have been proposed that assess the performance of triple stores. However, no integrated benchmark-independent execution framework for these benchmarks has yet been provided. We propose a novel SPARQL benchmark execution framework called Iguana. Our framework complements benchmarks by providing an execution environment which can measure the performance of triple stores during data loading, data updates as well as under different loads and parallel requests. Moreover, it allows a uniform comparison of results on different benchmarks. We execute the FEASIBLE and DBPSB benchmarks using the Iguana framework and measure the performance of popular triple stores under updates and parallel user requests. We compare our results (See https://doi.org/10.6084/m9.figshare.c.3767501.v1) with state-of-the-art benchmarking results and show that our benchmark execution framework can unveil new insights pertaining to the performance of triple stores.",
isbn="978-3-319-68204-4"
}
<p align="center">
<img src="https://github.com/dice-group/IGUANA/raw/main/images/IGUANA_logo.png" alt="IGUANA Logo" width="200">
</p>

# IGUANA
Iguana is a benchmarking framework for testing the read performance of HTTP endpoints.
It is primarily designed for benchmarking triplestores via the SPARQL protocol.
Iguana stress-tests endpoints by simulating users that send sets of queries independently of each other.

Benchmarks are configured with a YAML file, which makes them easy to repeat and adjust.
Results are stored as RDF files and can also be exported as CSV files.

## Features
- Benchmarking of (SPARQL) HTTP endpoints
- Reusable configuration
- Calculation of various metrics for better comparisons
- Processing of HTTP responses (e.g., results counting)

## Setup

### Prerequisites

If you're using the native version of IGUANA, you need at least an `x86-64-v3` system (Intel Haswell or AMD Excavator or newer) running Linux.

If you're using the Java version of IGUANA, you need `Java 17` or higher installed.
On Ubuntu, it can be installed with the following command:

```bash
sudo apt install openjdk-17-jre
```
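
If you want to verify these requirements up front, the following is a minimal sketch; checking for the `avx2` CPU flag is only a rough proxy for full `x86-64-v3` support, not an exact test:

```bash
# Rough prerequisite check (assumption: AVX2 as a proxy for x86-64-v3 support).
grep -qw avx2 /proc/cpuinfo \
  && echo "CPU likely supports x86-64-v3" \
  || echo "CPU may be too old for the native binary"

# The Java version of IGUANA needs Java 17 or higher on the PATH.
java -version
```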

### Download
The latest release can be downloaded at https://github.com/dice-group/IGUANA/releases/latest.
The zip file contains four files:

* `iguana`
* `iguana.jar`
* `example-suite.yml`
* `start-iguana.sh`

The `iguana` file is a native executable for IGUANA that has been compiled with GraalVM.
The `iguana.jar` file is the standard Java executable for IGUANA.
The `start-iguana.sh` script is a helper script to start IGUANA with the `iguana.jar` file.
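
As a sketch, downloading and unpacking a release from the command line could look like this; the archive name `iguana.zip` is an assumption and may differ per release, so check the releases page for the actual asset name:

```bash
# Hypothetical asset name -- verify it on the releases page before downloading.
wget https://github.com/dice-group/IGUANA/releases/latest/download/iguana.zip
unzip iguana.zip
ls
# expected contents: iguana  iguana.jar  example-suite.yml  start-iguana.sh
```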

### Configuration
The `example-suite.yml` file contains an extensive configuration for a benchmark suite.
It can be used as a starting point for your own benchmark suite.
For a detailed explanation of the configuration, see the [configuration](./configuration/overview.md) documentation.
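
For example, a common workflow is to copy the example suite and adapt it; the file name `my-suite.yml` is just an illustration:

```bash
cp example-suite.yml my-suite.yml   # my-suite.yml is an illustrative name
# edit my-suite.yml to point at your own endpoint and queries,
# then run it as described under Usage below
./iguana my-suite.yml
```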

## Usage

### Native Version

Start Iguana with a benchmark suite (e.g., the `example-suite.yml`) by executing the binary:

```bash
./iguana example-suite.yml
```

### Java Version

Start Iguana with a benchmark suite (e.g., the `example-suite.yml`) either by using the start script:

```bash
./start-iguana.sh example-suite.yml
```

or by executing the jar file directly:

```bash
java -jar iguana.jar example-suite.yml
```

If you're using the script, you can pass JVM arguments by setting the environment variable `IGUANA_JVM`.
For example, to let Iguana use 4 GB of RAM, set `IGUANA_JVM` as follows:

```bash
export IGUANA_JVM=-Xmx4g
```
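
Put together, a benchmark run with a 4 GB heap could look like this:

```bash
# Set the JVM arguments, then start Iguana through the helper script.
export IGUANA_JVM=-Xmx4g
./start-iguana.sh example-suite.yml
```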

# How to Cite

```bibtex
@InProceedings{10.1007/978-3-319-68204-4_5,
author="Conrads, Lixi
and Lehmann, Jens
and Saleem, Muhammad
and Morsey, Mohamed
and Ngonga Ngomo, Axel-Cyrille",
editor="d'Amato, Claudia
and Fernandez, Miriam
and Tamma, Valentina
and Lecue, Freddy
and Cudr{\'e}-Mauroux, Philippe
and Sequeda, Juan
and Lange, Christoph
and Heflin, Jeff",
title="Iguana: A Generic Framework for Benchmarking the Read-Write Performance of Triple Stores",
booktitle="The Semantic Web -- ISWC 2017",
year="2017",
publisher="Springer International Publishing",
address="Cham",
pages="48--65",
abstract="The performance of triples stores is crucial for applications driven by RDF. Several benchmarks have been proposed that assess the performance of triple stores. However, no integrated benchmark-independent execution framework for these benchmarks has yet been provided. We propose a novel SPARQL benchmark execution framework called Iguana. Our framework complements benchmarks by providing an execution environment which can measure the performance of triple stores during data loading, data updates as well as under different loads and parallel requests. Moreover, it allows a uniform comparison of results on different benchmarks. We execute the FEASIBLE and DBPSB benchmarks using the Iguana framework and measure the performance of popular triple stores under updates and parallel user requests. We compare our results (See https://doi.org/10.6084/m9.figshare.c.3767501.v1) with state-of-the-art benchmarking results and show that our benchmark execution framework can unveil new insights pertaining to the performance of triple stores.",
isbn="978-3-319-68204-4"
}
```
1 change: 1 addition & 0 deletions docs/configuration/workers.md
@@ -132,5 +132,6 @@ If the property is set to `false`,
the worker will not parse the response bodies and will not calculate hash values for the response bodies.

Setting the property to `false` can improve the performance of the worker.
As a result, the worker can measure performance more accurately.
If the property is set to `true`, the worker will temporarily store the whole response bodies in memory for processing.
If the property is set to `false`, the worker will discard any received bytes from the response.
Binary file removed images/iguana3-logo.png
Binary file not shown.
File renamed without changes
