swh.search.elasticsearch module
- swh.search.elasticsearch.token_encode(index_to_tokenize: Dict[bytes, Any]) → str
  Serialize an index page result from a search into an opaque string page token.
- swh.search.elasticsearch.token_decode(page_token: str) → Dict[bytes, Any]
  Decode a page_token produced by token_encode back into the index page result it encodes.
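Together, these two helpers turn an index page result into an opaque string that clients pass back (as page_token) to resume pagination. A minimal round-trip sketch, assuming msgpack serialization wrapped in base64; the concrete serialization format is an assumption here, not taken from this page:

```python
import base64
from typing import Any, Dict

import msgpack


def token_encode(index_to_tokenize: Dict[bytes, Any]) -> str:
    # Pack the index page result with msgpack, then base64-encode it so the
    # token travels as a plain string that clients can echo back verbatim.
    return base64.b64encode(msgpack.dumps(index_to_tokenize)).decode()


def token_decode(page_token: str) -> Dict[bytes, Any]:
    # Inverse of token_encode: strip the base64 layer, then unpack msgpack.
    return msgpack.loads(base64.b64decode(page_token.encode()))
```

Under this assumed format, token_decode(token_encode(index)) == index for any msgpack-serializable mapping.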
- class swh.search.elasticsearch.ElasticSearch(hosts: List[str], indexes: Dict[str, Dict[str, str]] = {})
  Bases: object

  - origin_update(documents: Iterable[OriginDict]) → None
  - origin_search(*, query: str = '', url_pattern: str | None = None, metadata_pattern: str | None = None, with_visit: bool = False, visit_types: List[str] | None = None, min_nb_visits: int = 0, min_last_visit_date: str = '', min_last_eventful_visit_date: str = '', min_last_revision_date: str = '', min_last_release_date: str = '', min_date_created: str = '', min_date_modified: str = '', min_date_published: str = '', programming_languages: List[str] | None = None, licenses: List[str] | None = None, keywords: List[str] | None = None, fork_weight: float | None = 0.5, sort_by: List[str] | None = None, page_token: str | None = None, limit: int = 50) → PagedResult[OriginDict, str]
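A hypothetical client-side sketch of how this class might be used. It assumes a single local Elasticsearch node at http://localhost:9200, relies on the default indexes mapping, and only populates the url field of each origin document; it also assumes PagedResult exposes results and next_page_token attributes, as in swh.core.api.classes.

```python
from swh.search.elasticsearch import ElasticSearch

# Hypothetical setup: one local Elasticsearch node, default index configuration.
search = ElasticSearch(hosts=["http://localhost:9200"])

# Index (or re-index) some origin documents. Only "url" is shown here; real
# OriginDict documents can carry visit and intrinsic-metadata fields as well.
search.origin_update(
    [
        {"url": "https://github.com/example/project"},
        {"url": "https://gitlab.com/example/other"},
    ]
)
# Note: newly indexed documents may only become searchable after the index
# refreshes; this sketch does not wait for that.

# Page through all origins whose URL matches a pattern and that have been visited.
page_token = None
while True:
    page = search.origin_search(
        url_pattern="github.com/example",
        with_visit=True,
        limit=10,
        page_token=page_token,
    )
    for origin in page.results:  # assumes PagedResult exposes .results
        print(origin["url"])
    page_token = page.next_page_token  # assumed None on the last page
    if page_token is None:
        break
```

The url_pattern and with_visit arguments mirror the keyword-only parameters in the signature above; the other filters (min_nb_visits, licenses, sort_by, and so on) combine in the same call.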