Friday, December 13, 2024

Making LangChain work with Google Gemini
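Before the snippets below will run, the langchain-google-genai package needs to be installed and a Google AI API key made available. A minimal setup sketch, assuming the key is supplied through the GOOGLE_API_KEY environment variable (here it is prompted for interactively if not already set):

import getpass
import os

# Assumes: pip install -U langchain-google-genai
if "GOOGLE_API_KEY" not in os.environ:
    # Prompt for the key only when it is not already in the environment
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Google AI API key: ")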

from langchain_google_genai import ChatGoogleGenerativeAI

llm = ChatGoogleGenerativeAI(
    model="gemini-1.5-pro",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # other params...
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

ai_msg = llm.invoke(messages)
# In a notebook, the bare expression displays the full AIMessage; print just the text:
ai_msg

print(ai_msg.content)
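invoke returns the whole reply at once. The same llm object also supports streaming through the standard Runnable interface; a minimal sketch, assuming the messages list defined above:

# Stream the reply chunk by chunk instead of waiting for the full message
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
print()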



Chaining works like this:


from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
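The chain above returns an AIMessage. If only the translated text is needed, an output parser can be piped onto the end; a short sketch using StrOutputParser from langchain_core, assuming the prompt and llm defined above:

from langchain_core.output_parsers import StrOutputParser

# prompt | llm | parser returns a plain string instead of an AIMessage
chain = prompt | llm | StrOutputParser()
result = chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
print(result)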

References:

https://python.langchain.com/docs/integrations/chat/google_generative_ai/

