Knowledge Graph Query Engine

class llama_index.query_engine.knowledge_graph_query_engine.KnowledgeGraphQueryEngine(service_context: Optional[ServiceContext] = None, storage_context: Optional[StorageContext] = None, graph_query_synthesis_prompt: Optional[BasePromptTemplate] = None, graph_response_answer_prompt: Optional[BasePromptTemplate] = None, refresh_schema: bool = False, verbose: bool = False, response_synthesizer: Optional[BaseSynthesizer] = None, **kwargs: Any)

Knowledge graph query engine.

Query engine that translates a natural-language question into a graph store query, runs it against the knowledge graph, and synthesizes a response from the result.

Parameters
  • service_context (Optional[ServiceContext]) – A service context to use.

  • storage_context (Optional[StorageContext]) – A storage context to use.

  • graph_query_synthesis_prompt (Optional[BasePromptTemplate]) – A prompt template for synthesizing the graph store query from the question.

  • graph_response_answer_prompt (Optional[BasePromptTemplate]) – A prompt template for synthesizing the final answer from the graph query response.

  • refresh_schema (bool) – Whether to refresh the graph schema.

  • verbose (bool) – Whether to print intermediate results.

  • response_synthesizer (Optional[BaseSynthesizer]) – A BaseSynthesizer object.

  • **kwargs – Additional keyword arguments.
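A minimal usage sketch, assuming a NebulaGraph-backed graph store and default LLM settings; the space name, edge/tag schema, and query text are illustrative, not part of this API.

from llama_index import ServiceContext, StorageContext
from llama_index.graph_stores import NebulaGraphStore
from llama_index.query_engine import KnowledgeGraphQueryEngine

# Connect to an existing knowledge graph (connection details are illustrative).
graph_store = NebulaGraphStore(
    space_name="llamaindex",
    edge_types=["relationship"],
    rel_prop_names=["relationship"],
    tags=["entity"],
)
storage_context = StorageContext.from_defaults(graph_store=graph_store)
service_context = ServiceContext.from_defaults()

# Build the engine and run a natural-language query against the graph.
query_engine = KnowledgeGraphQueryEngine(
    storage_context=storage_context,
    service_context=service_context,
    verbose=True,
)
response = query_engine.query("Tell me about Peter Quill.")
print(response)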

async agenerate_query(query_str: str) → str

Asynchronously generate a graph store query from the query string.
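A sketch of calling the async variant from a script; query_engine is assumed to be the engine constructed above, and the question text is illustrative.

import asyncio

# Generate (but do not execute) the graph store query for a question.
graph_query = asyncio.run(query_engine.agenerate_query("Tell me about Peter Quill."))
print(graph_query)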

generate_query(query_str: str) → str

Generate a graph store query from the query string.
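The synchronous variant can be used to inspect the generated graph store query (for example nGQL or Cypher, depending on the configured graph store) before executing it; the question text is illustrative.

graph_query = query_engine.generate_query("Tell me about Peter Quill.")
print(graph_query)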

get_prompts() → Dict[str, BasePromptTemplate]

Get a dictionary of prompts used by this component.
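For example, the returned dictionary can be inspected to find the prompt keys available for customization; key names vary by version, so check the output rather than hard-coding them.

prompts = query_engine.get_prompts()
for name in prompts:
    print(name)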

update_prompts(prompts_dict: Dict[str, BasePromptTemplate]) → None

Update prompts.

Other prompts will remain in place.
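A sketch of swapping in a custom graph-query synthesis prompt; the key name "graph_query_synthesis_prompt" is an assumption and should be verified against get_prompts() for your installed version.

from llama_index.prompts import PromptTemplate

# Hypothetical custom prompt; the key below must match one returned by get_prompts().
custom_prompt = PromptTemplate(
    "Write a graph query that answers the question.\n"
    "Question: {query_str}\n"
)
query_engine.update_prompts({"graph_query_synthesis_prompt": custom_prompt})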