Monday, July 15, 2024

The Basics of Function Calling in LLMs

from langchain_community.chat_models import BedrockChat
from langchain_experimental.llms.anthropic_functions import AnthropicFunctions

llm = BedrockChat(
    model_id="anthropic.claude-v2",
    model_kwargs={"temperature": 0.1},
    region_name="us-east-1",
)

base_model = AnthropicFunctions(llm=llm)

model = base_model.bind(
    functions=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                    },
                },
                "required": ["location"],
            },
        }
    ],
    function_call={"name": "get_current_weather"},
)

res = model.invoke("What's the weather in San Francisco?")

In this case, get_current_weather is passed to the model only as a schema: the model never executes it. The response (res) carries the function name and the JSON arguments the model chose, and it is the application's job to call the real function with them.
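To make that dispatch step concrete, here is a minimal sketch of what the application side might look like. The function_call dict mirrors the shape returned in res.additional_kwargs["function_call"]; the weather implementation is a hypothetical stand-in (a real one would hit a weather API):

```python
import json

def get_current_weather(location: str, unit: str = "celsius") -> str:
    # Stand-in implementation for illustration; a real version
    # would query an actual weather service.
    return json.dumps({"location": location, "temperature": 22, "unit": unit})

# Shape of the call the model returns in res.additional_kwargs["function_call"]
function_call = {
    "name": "get_current_weather",
    "arguments": '{"location": "San Francisco, CA"}',
}

# Dispatch: look up the function by name, parse the JSON arguments, call it.
available_functions = {"get_current_weather": get_current_weather}
fn = available_functions[function_call["name"]]
result = fn(**json.loads(function_call["arguments"]))
print(result)
```

The result string can then be fed back to the model in a follow-up message so it can compose a natural-language answer.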

References:

https://github.com/langchain-ai/langchain/discussions/18541



