Summarizer API

This page details the API for the `Summarizer` class, used for interacting with LLMs for code summarization tasks.

Initialization

The `Summarizer` is typically initialized via `repo.get_summarizer()` rather than constructed directly.

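A minimal sketch of the typical setup is shown below. It assumes the repository object comes from kit's `Repository` class, and the local path is purely illustrative:

```python
from kit import Repository

# Load a local checkout (path is illustrative).
repo = Repository("/path/to/your/project")

# Obtain a Summarizer. Called with no arguments, it falls back to the
# default OpenAIConfig described in the Configuration section below.
summarizer = repo.get_summarizer()
```
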
Methods

summarize_file(file_path: str) -> str

Summarizes the content of the specified file.

- Parameters:
  - `file_path` (str): The path to the file within the repository.
- Returns:
  - `str`: The summary generated by the LLM.
- Raises:
  - `FileNotFoundError`: If the `file_path` does not exist in the repo.
  - `LLMError`: If there's an issue communicating with the LLM.

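As a usage sketch, the call below handles the documented failure modes. Because this page does not show the import path for `LLMError`, the example catches it through a broad `except` and names it only in a comment:

```python
# Assumes `repo` and `summarizer` were created as in the Initialization
# section above; the file path is illustrative.
try:
    summary = summarizer.summarize_file("src/app/main.py")
    print(summary)
except FileNotFoundError:
    print("That path does not exist in the repository.")
except Exception as exc:  # e.g. kit's LLMError if the LLM call fails
    print(f"Summarization failed: {exc}")
```
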
summarize_function(file_path: str, function_name: str) -> str

Summarizes a specific function within the specified file.

- Parameters:
  - `file_path` (str): The path to the file containing the function.
  - `function_name` (str): The name of the function to summarize.
- Returns:
  - `str`: The summary generated by the LLM.
- Raises:
  - `FileNotFoundError`: If the `file_path` does not exist in the repo.
  - `SymbolNotFoundError`: If the function cannot be found in the file.
  - `LLMError`: If there's an issue communicating with the LLM.

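For example, a sketch with illustrative file and function names, assuming the `summarizer` from the Initialization section:

```python
# Summarize a single function; both arguments match the signature above.
summary = summarizer.summarize_function(
    file_path="src/app/config.py",
    function_name="load_settings",
)
print(summary)
```
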
summarize_class(file_path: str, class_name: str) -> str

Summarizes a specific class within the specified file.

- Parameters:
  - `file_path` (str): The path to the file containing the class.
  - `class_name` (str): The name of the class to summarize.
- Returns:
  - `str`: The summary generated by the LLM.
- Raises:
  - `FileNotFoundError`: If the `file_path` does not exist in the repo.
  - `SymbolNotFoundError`: If the class cannot be found in the file.
  - `LLMError`: If there's an issue communicating with the LLM.

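A sketch of the class-level variant follows. The import path for `SymbolNotFoundError` is not documented here, so the example catches a broad exception and names the specific error only in a comment:

```python
# Class and file names are illustrative.
try:
    summary = summarizer.summarize_class("src/app/models.py", "UserRepository")
    print(summary)
except Exception as exc:  # e.g. SymbolNotFoundError if the class is missing
    print(f"Could not summarize class: {exc}")
```
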
Configuration

Configuration options (such as `OpenAIConfig`) are typically supplied when calling `repo.get_summarizer(config=...)` or via environment variables read by the default `OpenAIConfig`.

The `Summarizer` currently uses `OpenAIConfig` for its LLM settings. When a `Summarizer` is initialized without a specific config object, it creates a default `OpenAIConfig` with the following parameters:

- `api_key` (str, optional): Your OpenAI API key. Defaults to the `OPENAI_API_KEY` environment variable. If not found, an error will be raised.
- `model` (str): The OpenAI model to use. Defaults to `"gpt-4o"`.
- `temperature` (float): Sampling temperature for the LLM. Defaults to `0.7`.
- `max_tokens` (int): The maximum number of tokens to generate in the summary. Defaults to `1000`.

You can customize this by creating an `OpenAIConfig` instance and passing it to `repo.get_summarizer()`:

```python
from kit.summaries import OpenAIConfig

# Example: Customize model and temperature
my_config = OpenAIConfig(model="o3-mini", temperature=0.2)
summarizer = repo.get_summarizer(config=my_config)

# Now summarizer will use o3-mini with temperature 0.2
summary = summarizer.summarize_file("path/to/your/file.py")
```