Statements (23)
| Predicate | Object |
|---|---|
| gptkbp:instanceOf | gptkb:machine_learning_technique |
| gptkbp:appliesTo | gptkb:T5, gptkb:LLaMA, gptkb:BERT, gptkb:GPT, transformer models |
| gptkbp:enables | efficient transfer learning, fine-tuning with fewer parameters |
| gptkbp:hasLibrary | gptkb:Hugging_Face_PEFT_library |
| gptkbp:popularizedBy | research in 2021-2023 |
| gptkbp:reduces | computational cost, memory usage |
| gptkbp:relatedTo | gptkb:LoRA, Prefix Tuning, Prompt Tuning, Adapter Tuning |
| gptkbp:standsFor | Parameter-Efficient Fine-Tuning |
| gptkbp:usedFor | adapting large language models |
| gptkbp:usedIn | computer vision, natural language processing |
| gptkbp:bfsParent | gptkb:LoRA |
| gptkbp:bfsLayer | 8 |
| https://www.w3.org/2000/01/rdf-schema#label | PEFT |
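The `gptkbp:enables` and `gptkbp:relatedTo` statements above (fine-tuning with fewer parameters; LoRA) can be illustrated with a minimal sketch of the low-rank-adapter idea behind LoRA. This is an illustrative toy in NumPy, not the Hugging Face PEFT library API: a frozen pretrained weight `W` is adapted by adding a trainable low-rank product `B @ A`, so only `r * (d_in + d_out)` parameters are trained instead of `d_in * d_out`. All dimensions and names here are assumed for the example.

```python
import numpy as np

def lora_param_counts(d_in, d_out, r):
    """Trainable-parameter counts: full fine-tuning vs a rank-r LoRA update."""
    full = d_in * d_out          # every entry of W is trainable
    lora = r * (d_in + d_out)    # only A (r x d_in) and B (d_out x r)
    return full, lora

rng = np.random.default_rng(0)
d_in, d_out, r = 768, 768, 8                 # assumed sizes; r << d

W = rng.standard_normal((d_out, d_in))        # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01     # trainable down-projection
B = np.zeros((d_out, r))                      # trainable up-projection, zero-init

x = rng.standard_normal(d_in)
y = W @ x + B @ (A @ x)   # adapted forward pass; equals W @ x at initialization

full, lora = lora_param_counts(d_in, d_out, r)
print(full, lora)  # 589824 vs 12288 trainable parameters
```

Because `B` is zero-initialized, the adapted model starts out identical to the pretrained one, and training touches roughly 2% of the parameters in this toy setting, which is the computational-cost and memory reduction the table's `gptkbp:reduces` statements refer to.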