Transformers Library by Hugging Face: Hugging Face is a company that specializes in artificial intelligence and natural language processing. It maintains an open-source Python library called “Transformers,” which provides access to a wide range of pre-trained models based on the Transformer architecture, such as BERT, GPT, RoBERTa, and T5. The library lets researchers and developers apply these pre-trained models to many NLP tasks, including text classification, text generation, sentiment analysis, and more.
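As a quick illustration, the high-level pipeline API covers several of these tasks out of the box. The snippet below is a minimal sketch of sentiment analysis; the default model it uses is chosen and downloaded automatically by the library, so treat the exact output as indicative.

from transformers import pipeline

# Build a sentiment-analysis pipeline; the default model is downloaded on first use
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts, each with a 'label' and a confidence 'score'
print(classifier("Running LLMs locally with Transformers is surprisingly easy."))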
Beyond the high-level pipelines, Transformers lets you load a specific model directly from your code; the library automatically downloads the weights from the Hugging Face Hub (and caches them locally) the first time they are requested.
# Load the model and tokenizer directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForCausalLM

# padding_side="left" is recommended for decoder-only models when batching prompts
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2", padding_side="left")
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")
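Once the model and tokenizer are loaded, generating a reply takes only a few more lines. The sketch below continues from the snippet above and uses the tokenizer's chat template (available in recent Transformers releases) to format a prompt for the instruct model; keep in mind that a 7B model loaded in full precision needs a substantial amount of RAM, so a GPU or a quantized variant is usually preferable.

# Continuing from the snippet above: format a chat prompt and generate a reply
messages = [
    {"role": "user", "content": "Explain what the Transformers library does in one sentence."}
]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

# Generate up to 100 new tokens with the model's default decoding settings
output_ids = model.generate(input_ids, max_new_tokens=100)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))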
References:
https://levelup.gitconnected.com/an-ultimate-guide-to-run-any-llm-locally-eb1a43052053