•   about 2 years ago

How do I call the Arctic model as an LLM?

Hi, how do I call the Arctic model as an LLM? I need to pass the resulting LLM object to SQLDatabaseChain.from_llm() later:

```
llm = model("Snowflake/snowflake-arctic-instruct")  # ??? this is not working

SQLDatabaseChain.from_llm(
    llm,
    db,
    prompt=few_shot_prompt,
    use_query_checker=False,
    verbose=True,
    return_intermediate_steps=True,
)
```

  • 2 comments

  • Manager   •   about 2 years ago

    Hey Netravati,
    Thanks for sharing your question! I'm assuming based on the code snippet you shared that you're using LangChain.
    LangChain has a guide on using it with Replicate (https://python.langchain.com/docs/integrations/llms/replicate/), which is an easy way to access Arctic:

    ```
    import os

    from langchain_community.llms import Replicate

    # Authenticate with Replicate (replace with your actual token)
    os.environ["REPLICATE_API_TOKEN"] = 'my_api_token'

    # Wrap the hosted Arctic model as a LangChain LLM
    llm = Replicate(
        model="snowflake/snowflake-arctic-instruct",
        model_kwargs={"temperature": 0.75, "max_length": 500, "top_p": 1},
    )
    ```
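
    You can then pass that `llm` straight into your existing chain. Here is a minimal sketch, assuming `SQLDatabaseChain` is imported from `langchain_experimental` (where it lives in recent LangChain releases) and that `db` and `few_shot_prompt` are defined as in your snippet; the SQLite URI below is a placeholder:

    ```
    from langchain_community.utilities import SQLDatabase
    from langchain_experimental.sql import SQLDatabaseChain

    # db is a placeholder here -- point this at your own database
    db = SQLDatabase.from_uri("sqlite:///example.db")

    # Wire the Replicate-backed Arctic LLM into the SQL chain,
    # reusing the options from your original snippet
    chain = SQLDatabaseChain.from_llm(
        llm,
        db,
        prompt=few_shot_prompt,
        use_query_checker=False,
        verbose=True,
        return_intermediate_steps=True,
    )

    result = chain.invoke({"query": "How many rows are in my table?"})
    ```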

  •   about 2 years ago

    Thanks!
