diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
new file mode 100644
index 00000000..bbfd3453
--- /dev/null
+++ b/CONTRIBUTING.md
@@ -0,0 +1,41 @@
+# Contributing
+
+Before submitting a pull request, please make sure your code passes the following checks locally:
+
+- `cargo test` passes without any errors
+- `cargo fmt` has properly formatted all files
+- `cargo clippy` has been run on all files without any errors or warnings in pedantic mode
+
+These can be added to your pre-commit hooks to automate the checks. Beyond these checks, it is recommended to develop with standard Rust tooling like rust-analyzer. Once your code is passing locally, you can submit a pull request and a maintainer can pass it through the continuous integration checks.
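+
+As a rough sketch, a `.git/hooks/pre-commit` script along the following lines would run all three checks before each commit (the `--workspace` flag and the exact way pedantic lints are enabled here are illustrative choices, not project requirements):
+
+```bash
+#!/bin/sh
+# Illustrative pre-commit hook: abort the commit if any check fails.
+# The flag choices below are examples only; adapt them to your workflow.
+set -e
+
+cargo fmt --all -- --check
+cargo clippy --workspace -- -W clippy::pedantic -D warnings
+cargo test --workspace
+```
+
+Remember to make the hook executable, for example with `chmod +x .git/hooks/pre-commit`.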
+
+Besides this, we do not have any specific contribution guidelines or codes of conduct for now; however, these will most likely be fleshed out as Odilia matures.
+
+## Performance Benchmarking
+
+If you'd like detailed performance benchmarks, we recommend using the `flamegraph` package to show performance bottlenecks.
+There is also `hotspot`, a C++ program available in the AUR and some major package repos, which can display how much time is spent in various portions of the program in an accessible (GUI) way.
+
+First, install the subcommand with:
+
+```bash
+$ cargo install flamegraph
+```
+
+If needed, install Hotspot from the AUR/your package repo, as well as `perf`, which is required to produce the flame graph.
+
+```bash
+$ paru/yay -S hotspot perf
+```
+
+Finally, add the following to the root `Cargo.toml` so that benchmark builds keep debug information, which the flame graph needs to resolve symbol names:
+
+```toml
+[profile.bench]
+debug = true
+```
+
+Now you can produce a flame graph for an individual benchmark with the following command:
+
+```bash
+cargo flamegraph --bench load_test -- --bench [individual_bench_name]
+```
diff --git a/README.md b/README.md
index 6677204d..7ddea335 100644
--- a/README.md
+++ b/README.md
@@ -12,7 +12,7 @@ It's written in [Rust](https://rust-lang.org), for maximum performance and stabi
 This is **absolutely not production ready in any way!** Everything is in a fairly early stage and we're changing things on a daily basis.
-However, Odilia is *somewhat* useable, and will not crash randomly or cause weird behaviour in other applications.
+However, Odilia is _somewhat_ useable, and will not crash randomly or cause weird behaviour in other applications.
 Try it out! See if it works for you!
 
 ## Prerequisites
@@ -29,7 +29,7 @@ spd-say "hello, world!"
 if you heard a voice saying "hello, world!", you can proceed to installing. Otherwise, check if sound is working on the computer in general.
 
-## build and install
+## Build and install
 
 To build odilia, copy paste the following on your command line . The following snippet will clone, build and install it for you, all at once without user interaction.
 The final binaries will be located in `~/.cargo/bin`
@@ -48,51 +48,21 @@ Simply type `odilia` in your terminal!
 You can find us in the following places:
 
-* [Discord](https://discord.gg/RVpRb9nS6K)
-* IRC: irc.libera.chat
-  * #odilia-dev (development)
-  * #odilia (general)
-  * #odilia-offtopic (off-topic)
-* Matrix: stealthy.club
-  * #odilia-dev (development)
-  * #odilia (general)
-  * #odilia-offtopic (off-topic)
+- [Discord](https://discord.gg/RVpRb9nS6K)
+- IRC: irc.libera.chat
+  - #odilia-dev (development)
+  - #odilia (general)
+  - #odilia-offtopic (off-topic)
+- Matrix: stealthy.club
+  - #odilia-dev (development)
+  - #odilia (general)
+  - #odilia-offtopic (off-topic)
 
 ## Contributing
 
-We are excited to accept new contributions to this project; in fact, we already have! Sometimes there may be missing documentation or lack of examples. Please, reach out to us, [make an issue](https://github.com/odilia-app/odilia), or a [pull request](https://github.com/odilia-app/odilia/pulls) and we will continue to improve Odilia with your help. By the way, a huge thank you to all who have contributed so far, and who will continue to do so in the future!
+We are excited to accept new contributions to this project; in fact, we already have! Sometimes there may be missing documentation or lack of examples. Please, reach out to us, [make an issue](https://github.com/odilia-app/odilia), or a [pull request](https://github.com/odilia-app/odilia/pulls) and we will continue to improve Odilia with your help. By the way, a huge thank you to all who have contributed so far, and who will continue to do so in the future!
 
-We do not have any specific contribution guidelines or codes of conduct for now, however most likely these will be fleshed out as Odilia matures more.
-
-### Performance Benchmarking
-
-If you'd like detailed performance benchmarks, we recommend using the `flamegraph` package to show performance bottlenecks.
-There is also `hotspot`, a C++ program available in the AUR, and some major package repos, which can display how much time is spent in various portions of the program in an accessible (GUI) way.
-
-First, install the subcommand with:
-
-```bash
-$ cargo install flamegraph
-```
-
-If needed, install Hotspot from the AUR/your package repo, as well as `perf` which is required to produce the flame graph.
-
-```bash
-$ paru/yay -S hotspot perf
-```
-
-Finally, add the following to the root `Cargo.toml`:
-
-```toml
-[profile.bench]
-debug = true
-```
-
-Now, you can run the following commands to produce flamegraphes for individual benchmarks with the following command:
-
-```bash
-cargo flamegraph --bench load_test -- --bench [individual_bench_name]
-```
+See [CONTRIBUTING.md](./CONTRIBUTING.md) for more detail on how to contribute.
 
 ## License