Instructions for using Salesforce/codet5p-2b with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Salesforce/codet5p-2b with Transformers:
```python
# Load model directly
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained(
    "Salesforce/codet5p-2b", trust_remote_code=True, dtype="auto"
)
```
- Notebooks
- Google Colab
- Kaggle
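The one-line load above only fetches the model. A fuller inference sketch, following the pattern CodeT5+ checkpoints use (the checkpoint name is from this page; the prompt, device choice, dtype, and generation length are assumptions for illustration):

```python
# Hypothetical end-to-end sketch of code completion with CodeT5+ 2b.
def run_codet5p_completion(prompt: str, checkpoint: str = "Salesforce/codet5p-2b") -> str:
    import torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    device = "cuda" if torch.cuda.is_available() else "cpu"
    # trust_remote_code=True is required: the repo ships custom model/config code.
    tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(
        checkpoint, torch_dtype=torch.float16, trust_remote_code=True
    ).to(device)

    encoding = tokenizer(prompt, return_tensors="pt").to(device)
    # CodeT5+ 2b is an encoder-decoder model; the prompt is also fed to the
    # decoder so generation continues the input code.
    encoding["decoder_input_ids"] = encoding["input_ids"].clone()
    outputs = model.generate(**encoding, max_length=64)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `run_codet5p_completion("def print_hello_world():")` would return a decoded continuation of the function stub. The 2b checkpoint is large, so a GPU with float16 support is strongly recommended.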
Inference Problem (#9), opened by AngeloCurti22
When trying to run inference with this model (and also with the 6b and 16b versions), the following error is raised:

`AssertionError: Config has to be initialized with encoder and decoder config`
Hope someone can help!
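A likely cause, consistent with the usage snippet at the top of this page: the checkpoint was loaded without `trust_remote_code=True`. Without that flag, Transformers falls back to its generic `EncoderDecoderConfig`, whose constructor asserts that `encoder` and `decoder` sub-configs were passed, producing exactly the message quoted above. A minimal sketch of the fix (the helper name is hypothetical; the checkpoint name is from this page):

```python
# Hypothetical helper showing the fix: pass trust_remote_code=True so the
# repo's custom CodeT5+ model/config classes are used instead of the generic
# EncoderDecoderConfig, whose __init__ asserts that encoder/decoder
# sub-configs are present (the AssertionError quoted above).
def load_codet5p(checkpoint: str = "Salesforce/codet5p-2b"):
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, trust_remote_code=True)
    return tokenizer, model
```

The same flag applies to the 6b and 16b checkpoints, since they ship the same custom model code.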