How to use Huggingface models offline?

Asked on 2024-01-25 18:01:57

1 Answer

King Of Kings
Specializes in: AI

Hugging Face is an American company, founded in 2016, that develops tools for building machine learning applications. Its flagship products are the Transformers library, built for natural language processing, and a platform where users share machine learning models and datasets. This answer shows how to use models from Hugging Face in Python.


To use models from the Hugging Face Hub, you need the company's transformers package. Transformers is a Python library containing open-source implementations of Transformer models for text, image, and audio tasks. It is compatible with the PyTorch, TensorFlow, and JAX deep learning frameworks and includes well-known models such as BERT and GPT-2. The library was originally named "pytorch-pretrained-bert", later renamed "pytorch-transformers", and finally "transformers".
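As a minimal sketch of online loading, the snippet below pulls a pretrained model from the Hub with the `Auto*` classes (`bert-base-uncased` is used here just as a small, well-known example; any Hub model ID works the same way). It assumes `transformers` and `torch` are installed.

```python
# Minimal sketch: load a pretrained model online via the Hugging Face Hub.
# Assumes: pip install transformers torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"  # example model ID; any Hub ID works

# The first call downloads the weights from the Hub and caches them
# (by default under ~/.cache/huggingface); later calls reuse the cache.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Hello, Hugging Face!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```

The same `from_pretrained` call also accepts a local directory path, which is what makes offline use possible, as described next.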


With transformers, a model such as BLIP-2 can be loaded either online or offline. Online loading automatically downloads the weights from the Hugging Face Hub. Offline loading requires downloading the weights from Hugging Face manually and then pointing the loader at the local weights folder. A third option is to load the model online once, save the weights to a folder, and reuse that folder offline. BLIP-2 is a multi-modal large model with fairly large weights; if GPU memory is limited, you can load it in half precision or INT8 (the official documentation provides example code for both). This answer uses half precision as the example.
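The three loading modes above can be sketched as follows. The model ID and the local path `./blip2-local` are illustrative assumptions; note that downloading `Salesforce/blip2-opt-2.7b` fetches several gigabytes of weights.

```python
# Sketch: online loading, saving for offline reuse, and offline loading,
# using BLIP-2 in half precision. Paths and model ID are assumptions.
import torch
from transformers import Blip2ForConditionalGeneration, Blip2Processor

model_id = "Salesforce/blip2-opt-2.7b"

# 1) Online: weights are downloaded from the Hub automatically.
processor = Blip2Processor.from_pretrained(model_id)
model = Blip2ForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit smaller GPUs
)

# 2) Save the downloaded processor and weights to a local folder.
processor.save_pretrained("./blip2-local")
model.save_pretrained("./blip2-local")

# 3) Offline: load from the local folder; no network access is needed.
processor = Blip2Processor.from_pretrained("./blip2-local")
model = Blip2ForConditionalGeneration.from_pretrained(
    "./blip2-local",
    torch_dtype=torch.float16,
)
```

To guarantee that transformers never touches the network, you can additionally set the environment variables `HF_HUB_OFFLINE=1` or `TRANSFORMERS_OFFLINE=1` before loading.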
