
AI Models Overview

Sercha supports optional integration with external AI models for enhanced search capabilities. These integrations are entirely optional: Sercha works without any AI services, falling back to pure keyword (BM25) search.

Model Categories

Embedding Models

Embedding models convert text into numerical vectors, enabling semantic search. When configured, Sercha can find documents by meaning, not just keywords.

Use cases:

  • Semantic search ("find documents about machine learning" matches "ML algorithms")
  • Similar document discovery
  • Hybrid search combining keywords and meaning
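To make the idea concrete, here is a minimal sketch of how embedding-based semantic search works in general. It assumes a hypothetical embed() callable standing in for whichever embedding model is configured; it is not Sercha's actual API.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Compare two embedding vectors by the angle between them."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def semantic_search(query: str, documents: list[str], embed) -> list[tuple[str, float]]:
    """Rank documents by semantic similarity to the query.

    `embed` is a placeholder for the configured embedding model: it takes a
    string and returns a vector of floats.
    """
    query_vec = embed(query)
    scored = [(doc, cosine_similarity(query_vec, embed(doc))) for doc in documents]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

With this in place, a query like "machine learning" can rank a document mentioning "ML algorithms" highly, provided the embedding model places the two phrases close together in vector space.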

Large Language Models

LLMs enhance query understanding and document processing. When configured, Sercha can rewrite queries and generate summaries.

Use cases:

  • Query expansion (adding synonyms, fixing typos)
  • Document summarisation
  • Natural language query understanding
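As an illustration only, the following sketch shows what LLM-driven query expansion can look like. The complete() callable is a hypothetical stand-in for whichever LLM endpoint is configured and is not part of Sercha's API.

```python
def expand_query(query: str, complete) -> str:
    """Ask the LLM to fix typos and add synonyms, then use its output as the search query.

    `complete` is a placeholder: it takes a prompt string and returns the
    model's text completion.
    """
    prompt = (
        "Rewrite this search query with corrected spelling and a few helpful "
        "synonyms, keeping it short:\n" + query
    )
    return complete(prompt).strip()
```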

Graceful Degradation

Both model types are optional. Without them:

Feature            With AI Models   Without AI Models
Keyword search     Yes              Yes
Semantic search    Yes              No
Query rewriting    Yes              No
Summarisation      Yes              No

Sercha degrades gracefully: core keyword search is always available, whether or not any AI models are configured.
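The sketch below illustrates the degradation pattern: semantic scoring is blended in only when an embedding model is available, and a toy keyword_score() stands in for BM25. It reuses cosine_similarity from the embedding sketch above and is an assumption about the general approach, not Sercha's implementation.

```python
def keyword_score(query: str, doc: str) -> float:
    """Toy term-overlap score standing in for BM25."""
    terms = set(query.lower().split())
    return sum(1.0 for word in doc.lower().split() if word in terms)

def search(query: str, documents: list[str], embed=None) -> list[str]:
    """Rank documents; use embeddings only if a model is configured."""
    def score(doc: str) -> float:
        base = keyword_score(query, doc)
        if embed is None:
            return base  # keyword-only mode: always works
        # Blend in semantic similarity when an embedding model is configured.
        return base + cosine_similarity(embed(query), embed(doc))

    return sorted(documents, key=score, reverse=True)
```

Calling search(query, docs) with no embed argument gives pure keyword ranking; passing an embedding model upgrades the same call to hybrid ranking without any other changes.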