# Semantic Index

## Evaluation

### Metrics

nDCG@k:

  • "The value of NDCG is determined by comparing the relevance of the items returned by the search engine to the relevance of the item that a hypothetical "ideal" search engine would return.
  • "The relevance of result is represented by a score (also known as a 'grade') that is assigned to the search query. The scores of these results are then discounted based on their position in the search results -- did they get recommended first or last?"

MRR@k:

  • "Mean reciprocal rank quantifies the rank of the first relevant item found in teh recommendation list."

MAP@k:

  • "Mean average precision averages the precision@k metric at each relevant item position in the recommendation list.

Resources: