Multistep Query Engine

class llama_index.query_engine.multistep_query_engine.MultiStepQueryEngine(query_engine: BaseQueryEngine, query_transform: StepDecomposeQueryTransform, response_synthesizer: Optional[BaseSynthesizer] = None, num_steps: Optional[int] = 3, early_stopping: bool = True, index_summary: str = 'None', stop_fn: Optional[Callable[[Dict], bool]] = None)

Multi-step query engine.

This query engine wraps an existing base query engine and applies a multi-step query transform, decomposing the original query into a sequence of sub-queries and synthesizing the intermediate answers into a final response. A usage sketch follows the parameter list below.

Parameters
  • query_engine (BaseQueryEngine) – The base query engine used to answer each decomposed sub-query.

  • query_transform (StepDecomposeQueryTransform) – The transform that decomposes the original query into sequential sub-queries.

  • response_synthesizer (Optional[BaseSynthesizer]) – Synthesizer used to combine the intermediate answers into the final response.

  • num_steps (Optional[int]) – Maximum number of steps (sub-queries) to run.

  • early_stopping (bool) – Whether to stop early if the stop function returns True.

  • index_summary (str) – A string summary of the index, used when formulating sub-queries.

  • stop_fn (Optional[Callable[[Dict], bool]]) – A stop function that takes in a dictionary of information and returns a boolean indicating whether to stop.
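A minimal construction sketch based on the parameters above. Import paths follow the module path shown in the signature but may differ across llama_index versions, and the ./data directory, the verbose flag, and the example question are illustrative assumptions:

    from llama_index import SimpleDirectoryReader, VectorStoreIndex
    from llama_index.indices.query.query_transform.base import StepDecomposeQueryTransform
    from llama_index.query_engine.multistep_query_engine import MultiStepQueryEngine

    # Build a base index and query engine over local documents (assumed ./data directory).
    documents = SimpleDirectoryReader("./data").load_data()
    index = VectorStoreIndex.from_documents(documents)

    # Wrap the base query engine with the step-decompose transform.
    query_engine = MultiStepQueryEngine(
        query_engine=index.as_query_engine(),
        query_transform=StepDecomposeQueryTransform(verbose=True),
        index_summary="Used to answer questions about the loaded documents",
        num_steps=3,
    )

    response = query_engine.query(
        "Who was the author's mentor, and where did that mentor study?"
    )
    print(response)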

get_prompts() → Dict[str, BasePromptTemplate]

Get a dictionary of prompts keyed by prompt name.

update_prompts(prompts_dict: Dict[str, BasePromptTemplate]) → None

Update the prompts for the keys present in prompts_dict.

Other prompts will remain in place.
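A sketch of inspecting and overriding prompts with these two methods. The available keys depend on the configured sub-components, so the dictionary is printed first; the key name passed to update_prompts below is an assumption and should be checked against the actual output of get_prompts:

    from llama_index.prompts import PromptTemplate

    # Inspect the prompts currently used by the engine and its sub-components.
    prompts_dict = query_engine.get_prompts()
    print(list(prompts_dict.keys()))

    # Override a single prompt by key; prompts whose keys are not present in
    # the dict passed to update_prompts remain unchanged.
    new_qa_prompt = PromptTemplate(
        "Context information:\n{context_str}\n\n"
        "Answer the question concisely: {query_str}\n"
    )
    query_engine.update_prompts(
        {"response_synthesizer:text_qa_template": new_qa_prompt}  # assumed key name
    )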

llama_index.query_engine.multistep_query_engine.default_stop_fn(stop_dict: Dict) → bool

Default stop function for the multi-step query combiner; returns True when no further sub-query should be run.
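A sketch of a custom stop function that can be supplied via the stop_fn parameter. It assumes the stop dictionary carries a "query_bundle" entry whose query_str is the next generated sub-query (the same convention the bundled default appears to rely on); adjust the check to your own stopping criterion:

    from typing import Dict

    def custom_stop_fn(stop_dict: Dict) -> bool:
        """Return True to halt further query steps."""
        query_bundle = stop_dict.get("query_bundle")
        if query_bundle is None:
            return True  # nothing left to ask, so stop
        # Stop once the transform signals it has no further sub-query to ask.
        return "none" in query_bundle.query_str.lower()

    engine = MultiStepQueryEngine(
        query_engine=index.as_query_engine(),
        query_transform=StepDecomposeQueryTransform(),
        index_summary="Used to answer questions about the loaded documents",
        early_stopping=True,
        stop_fn=custom_stop_fn,
    )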