Incorporate suggestions from review

Thiago Arrais 2020-10-26 15:37:01 -03:00 committed by David Peter
parent e1fea57182
commit 8704e76d36


@@ -40,26 +40,6 @@ option:
 hyperfine --min-runs 5 'sleep 0.2' 'sleep 3.2'
 ```
-### Tip: aliases and functions
-Aliases and shell functions can't be benchmarked directly, but you may try to put them in a separate
-file and benchmark _that_:
-```bash
-echo 'my_function() { sleep 1 }' > /tmp/my_function.sh
-echo 'alias my_alias="sleep 1"' > /tmp/my_alias.sh
-hyperfine 'source /tmp/my_function.sh; eval my_function'
-hyperfine 'source /tmp/my_alias.sh; eval my_alias'
-```
-If you're using bash, you may also want to `export` functions (can't do that with aliases, though):
-```
-$ my_function() { sleep 1; }
-$ export -f my_function
-$ hyperfine my_function
-```
 ### Warmup runs and preparation commands
 If the program execution time is limited by disk I/O, the benchmarking results can be heavily
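
That context line opens the README's warmup/preparation discussion; in practice those features look roughly like this (a minimal sketch — `--warmup` and `--prepare` are hyperfine flags, while the `grep` invocation is an arbitrary I/O-bound stand-in):

```bash
# Warm-cache benchmark: perform three untimed warmup runs before measuring
hyperfine --warmup 3 'grep -rn TODO src/'

# Cold-cache benchmark: run a preparation command before every timed run
# (dropping the page cache this way is Linux-specific and requires root)
hyperfine --prepare 'sync; echo 3 | sudo tee /proc/sys/vm/drop_caches' 'grep -rn TODO src/'
```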
@@ -97,6 +77,26 @@ hyperfine --parameter-scan delay 0.3 0.7 -D 0.2 'sleep {delay}'
 ```
 This runs `sleep 0.3`, `sleep 0.5` and `sleep 0.7`.
+### Shell functions and aliases
+If you are using bash, you can export shell functions to directly benchmark them with hyperfine:
+```
+$ my_function() { sleep 1; }
+$ export -f my_function
+$ hyperfine my_function
+```
+If you are using a different shell, or if you want to benchmark shell aliases, you may try to put
+them in a separate file:
+```bash
+echo 'my_function() { sleep 1; }' > /tmp/my_function.sh
+echo 'alias my_alias="sleep 1"' > /tmp/my_alias.sh
+hyperfine 'source /tmp/my_function.sh; eval my_function'
+hyperfine 'source /tmp/my_alias.sh; eval my_alias'
+```
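
For shells other than bash, the separate-file approach added above can be pointed at a specific shell via hyperfine's `--shell` option; a sketch, assuming zsh is installed and the alias file from the example exists:

```bash
# Run the benchmarked command line under zsh instead of the default shell
hyperfine --shell zsh 'source /tmp/my_alias.sh; eval my_alias'
```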
 ### Export results
 Hyperfine has multiple options for exporting benchmark results: CSV, JSON, Markdown (see `--help`
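
The export options in that trailing context line can be combined in a single invocation (a minimal sketch — the `--export-csv`, `--export-json`, and `--export-markdown` flags are listed in hyperfine's `--help`, and the output filenames are arbitrary):

```bash
# Write the benchmark results in the three formats mentioned above at once
hyperfine --export-csv results.csv \
          --export-json results.json \
          --export-markdown results.md \
          'sleep 0.3' 'sleep 0.5'
```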