Summarizer API

This page details the API for the Summarizer class, used for interacting with LLMs for code summarization tasks.

The Summarizer is typically initialized via repo.get_summarizer() rather than constructed directly.
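
For example, a minimal sketch assuming kit's Repository class is used to open the repository (the path is a placeholder):

from kit import Repository

repo = Repository("/path/to/your/repo")  # placeholder path to a local checkout
summarizer = repo.get_summarizer()       # uses the default OpenAIConfig described below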

summarize_file(file_path: str) -> str

Summarizes the content of the specified file.

  • Parameters:
    • file_path (str): The path to the file within the repository.
  • Returns:
    • str: The summary generated by the LLM.
  • Raises:
    • FileNotFoundError: If the file_path does not exist in the repo.
    • LLMError: If there’s an issue communicating with the LLM.
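
For example, assuming summarizer was obtained as above and using a placeholder path:

try:
    summary = summarizer.summarize_file("src/utils.py")  # placeholder path
    print(summary)
except FileNotFoundError:
    print("src/utils.py is not present in the repository")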

summarize_function(file_path: str, function_name: str) -> str

Summarizes a specific function within the specified file.

  • Parameters:
    • file_path (str): The path to the file containing the function.
    • function_name (str): The name of the function to summarize.
  • Returns:
    • str: The summary generated by the LLM.
  • Raises:
    • FileNotFoundError: If the file_path does not exist in the repo.
    • SymbolNotFoundError: If the function cannot be found in the file.
    • LLMError: If there’s an issue communicating with the LLM.
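
For example, with placeholder file and function names:

summary = summarizer.summarize_function("src/utils.py", function_name="parse_config")  # placeholders
print(summary)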

summarize_class(file_path: str, class_name: str) -> str

Summarizes a specific class within the specified file.

  • Parameters:
    • file_path (str): The path to the file containing the class.
    • class_name (str): The name of the class to summarize.
  • Returns:
    • str: The summary generated by the LLM.
  • Raises:
    • FileNotFoundError: If the file_path does not exist in the repo.
    • SymbolNotFoundError: If the class cannot be found in the file.
    • LLMError: If there’s an issue communicating with the LLM.
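
For example, with placeholder file and class names:

summary = summarizer.summarize_class("src/models.py", class_name="UserModel")  # placeholders
print(summary)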

Configuration options (OpenAIConfig, etc.) are typically supplied when calling repo.get_summarizer(config=...), or picked up from environment variables read by the default OpenAIConfig.

The Summarizer currently uses OpenAIConfig for its LLM settings. When a Summarizer is initialized without a specific config object, it creates a default OpenAIConfig with the following parameters:

  • api_key (str, optional): Your OpenAI API key. Defaults to the OPENAI_API_KEY environment variable. If not found, an error will be raised.
  • model (str): The OpenAI model to use. Defaults to "gpt-4o".
  • temperature (float): Sampling temperature for the LLM. Defaults to 0.7.
  • max_tokens (int): The maximum number of tokens to generate in the summary. Defaults to 1000.
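
For reference, the sketch below constructs an OpenAIConfig equivalent to those defaults explicitly (it assumes repo already exists and OPENAI_API_KEY is set in the environment):

import os
from kit.summaries import OpenAIConfig

explicit_defaults = OpenAIConfig(
    api_key=os.environ["OPENAI_API_KEY"],  # same variable the default config reads
    model="gpt-4o",
    temperature=0.7,
    max_tokens=1000,
)
summarizer = repo.get_summarizer(config=explicit_defaults)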

You can customize this by creating an OpenAIConfig instance and passing it to repo.get_summarizer():

from kit.summaries import OpenAIConfig

# repo is an existing kit Repository instance (see initialization above)
# Example: customize model and temperature
my_config = OpenAIConfig(model="o3-mini", temperature=0.2)
summarizer = repo.get_summarizer(config=my_config)

# The summarizer now uses o3-mini with temperature 0.2
summary = summarizer.summarize_file("path/to/your/file.py")