Commit

update readme
C-Loftus committed Feb 24, 2024
1 parent 026b7f4 commit 4a50789
Showing 2 changed files with 54 additions and 43 deletions.
41 changes: 41 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,41 @@
# Contributing

Before submitting a pull request, please make sure your code passes the following checks locally:

- `cargo test` passes without any errors
- `cargo fmt` has properly formatted all files
- `cargo clippy` has been run on all files without any errors or warnings in pedantic mode

These can be added to your pre-commit hooks to automate the checks; a sample hook is sketched below. Beyond these checks, it is recommended to develop with standard Rust tooling such as rust-analyzer. Once your code passes locally, you can submit a pull request and a maintainer will run it through the continuous integration checks.
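
For example, a minimal pre-commit hook covering these checks might look like the sketch below (the exact Clippy flags are an assumption; adjust them to match however you run pedantic mode locally):

```bash
#!/usr/bin/env sh
# Sketch of .git/hooks/pre-commit: run the same checks CI expects before each commit.
set -e

cargo fmt --all -- --check                                    # fail if any file is unformatted
cargo clippy --all-targets -- -W clippy::pedantic -D warnings # pedantic lints, warnings treated as errors
cargo test --workspace                                        # run the full test suite
```

Save it as `.git/hooks/pre-commit` and make it executable with `chmod +x .git/hooks/pre-commit`.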

Besides this, we do not have any specific contribution guidelines or code of conduct for now; these will most likely be fleshed out as Odilia matures.

## Performance Benchmarking

If you'd like detailed performance benchmarks, we recommend using the `flamegraph` package to find performance bottlenecks.
There is also `hotspot`, a C++ program available in the AUR and in some major package repositories, which can display how much time is spent in the various parts of the program in an accessible (GUI) way.

First, install the subcommand with:

```bash
$ cargo install flamegraph
```

If needed, install Hotspot from the AUR or your distribution's package repository, along with `perf`, which is required to produce the flame graph.

```bash
$ paru -S hotspot perf # or: yay -S hotspot perf
```

Finally, add the following to the root `Cargo.toml` so that benchmark builds keep debug symbols (without them, the flame graph cannot show readable function names):

```toml
[profile.bench]
debug = true
```

Now you can produce a flame graph for an individual benchmark with the following command:

```bash
cargo flamegraph --bench load_test -- --bench [individual_bench_name]
```
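
By default, `cargo flamegraph` writes its result to `flamegraph.svg` in the current directory and, when it drives `perf` on Linux, typically leaves a `perf.data` file behind as well; exact file names can vary between versions. Assuming those defaults, you can inspect the output like this:

```bash
$ xdg-open flamegraph.svg   # interactive flame graph in your browser
$ hotspot perf.data         # the same profile explored in Hotspot's GUI
```
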
56 changes: 13 additions & 43 deletions README.md
@@ -12,7 +12,7 @@ It's written in [Rust](https://rust-lang.org), for maximum performance and stabi

This is **absolutely not production ready in any way!**
Everything is in a fairly early stage and we're changing things on a daily basis.
However, Odilia is _somewhat_ useable, and will not crash randomly or cause weird behaviour in other applications.
Try it out! See if it works for you!

## Prerequisites
@@ -29,7 +29,7 @@ spd-say "hello, world!"
If you heard a voice saying "hello, world!", you can proceed to installation.
Otherwise, check whether sound is working on the computer in general.

## Build and install

To build Odilia, copy and paste the following into your command line. The snippet will clone, build, and install it for you, all at once and without user interaction. The final binaries will be located in `~/.cargo/bin`.
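
The full snippet is not shown in this diff. Purely as a hypothetical sketch, with the repository URL taken from elsewhere on this page and the exact commands assumed rather than quoted from the README, it would look roughly like this:

```bash
# Hypothetical sketch, not the README's exact snippet:
git clone https://github.com/odilia-app/odilia.git
cd odilia
cargo install --path odilia   # assumed crate path; binaries land in ~/.cargo/bin
```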

@@ -48,51 +48,21 @@ Simply type `odilia` in your terminal!

You can find us in the following places:

- [Discord](https://discord.gg/RVpRb9nS6K)
- IRC: irc.libera.chat
- #odilia-dev (development)
- #odilia (general)
- #odilia-offtopic (off-topic)
- Matrix: stealthy.club
- #odilia-dev (development)
- #odilia (general)
- #odilia-offtopic (off-topic)

## Contributing

We are excited to accept new contributions to this project; in fact, we already have! Sometimes there may be missing documentation or lack of examples. Please, reach out to us, [make an issue](https://github.com/odilia-app/odilia), or a [pull request](https://github.com/odilia-app/odilia/pulls) and we will continue to improve Odilia with your help. By the way, a huge thank you to all who have contributed so far, and who will continue to do so in the future!

See [CONTRIBUTING.md](./CONTRIBUTING.md) for more detail on how to contribute.

## License
