Bedrock Llama prompt model driver
BedrockLlamaPromptModelDriver
Bases: BasePromptModelDriver
Source code in griptape/drivers/prompt_model/bedrock_llama_prompt_model_driver.py
prompt_driver: Optional[AmazonBedrockPromptDriver] = field(default=None, kw_only=True)
class-attribute
instance-attribute
tokenizer: BedrockLlamaTokenizer
property
Returns the tokenizer for this driver.
We need to pass the session
field from the Prompt Driver to the
Tokenizer. However, the Prompt Driver is not initialized until after
the Prompt Model Driver is initialized. To resolve this, we make the tokenizer
field a @property that is only initialized when it is first accessed.
This ensures that by the time we need to initialize the Tokenizer, the
Prompt Driver has already been initialized.
See this thread for more information: https://github.com/griptape-ai/griptape/issues/244
Returns:

| Name | Type | Description |
|---|---|---|
| BedrockLlamaTokenizer | `BedrockLlamaTokenizer` | The tokenizer for this driver. |
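The lazy-initialization pattern described above can be sketched in plain Python. This is an illustrative stand-in, not griptape's actual classes: `SessionTokenizer` and the string session value are hypothetical, and only the `@property` deferral mechanism mirrors the real driver.

```python
from typing import Optional


class SessionTokenizer:
    """Hypothetical stand-in for BedrockLlamaTokenizer."""

    def __init__(self, session: object) -> None:
        self.session = session


class PromptModelDriver:
    """Sketch of the lazy tokenizer property described above.

    The tokenizer is NOT built in __init__; it is created on first
    access, by which time prompt_driver has already been assigned.
    """

    def __init__(self) -> None:
        self.prompt_driver = None  # assigned later, after this object exists
        self._tokenizer: Optional[SessionTokenizer] = None

    @property
    def tokenizer(self) -> SessionTokenizer:
        # Initialize lazily so the now-available prompt_driver (and its
        # session) can be passed through to the tokenizer.
        if self._tokenizer is None:
            self._tokenizer = SessionTokenizer(session=self.prompt_driver)
        return self._tokenizer


driver = PromptModelDriver()
driver.prompt_driver = "boto3-session"  # set after construction
print(driver.tokenizer.session)
```

Because the property caches its result in `_tokenizer`, repeated accesses return the same tokenizer instance.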
top_p: float = field(default=0.9, kw_only=True)
class-attribute
instance-attribute
process_output(output)
Source code in griptape/drivers/prompt_model/bedrock_llama_prompt_model_driver.py
prompt_stack_to_model_input(prompt_stack)
Converts a PromptStack to a string that can be used as the input to the model.
Prompt structure adapted from https://huggingface.co/blog/llama2#how-to-prompt-llama-2
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `prompt_stack` | `PromptStack` | The `PromptStack` to convert. | *required* |
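The Llama 2 prompt structure referenced above (from the linked Hugging Face post) wraps the system prompt in `<<SYS>>` tags inside the first `[INST]` block and alternates user/assistant turns between `[INST] ... [/INST]` markers. The sketch below is an assumption-laden illustration of that format, not griptape's actual implementation; the function name and signature are hypothetical.

```python
def llama2_prompt(system: str, turns: list[tuple[str, str]], user: str) -> str:
    """Build a Llama 2 chat prompt string (illustrative sketch only).

    `turns` holds prior (user, assistant) pairs; `user` is the new message.
    """
    users = [u for u, _ in turns] + [user]
    assistants = [a for _, a in turns]
    # The system prompt lives inside <<SYS>> tags within the first [INST] block.
    prompt = f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{users[0]} [/INST]"
    # Each completed assistant turn is closed with </s> before the next [INST].
    for assistant, next_user in zip(assistants, users[1:]):
        prompt += f" {assistant} </s><s>[INST] {next_user} [/INST]"
    return prompt


print(llama2_prompt("Be brief.", [("Hi", "Hello!")], "Bye"))
```

The key design point is that the conversation history is serialized into a single flat string, which is why the driver needs a conversion step from a structured `PromptStack` at all.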