trainable parameters.
- Low-rank subspace finetuning: gradient descent is performed in a small random subspace of the full parameter space.
- Part of the model's input embeddings is fine-tuned via gradient descent (soft prompt tuning).
- Fastfood transform to reparametrize the update to the NN parameters.
- LoRA: a simple low-rank matrix decomposition (or Kronecker product decomposition) to parametrize the weight update.
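The LoRA idea fits in a few lines: freeze the pretrained weight W and learn only a low-rank update BA, so the layer computes Wx + (alpha/r)·BAx. A minimal sketch in PyTorch (class name, rank, and scaling are illustrative assumptions, not the reference implementation):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Sketch of a LoRA layer: y = W x + (alpha/r) * B A x, W frozen."""
    def __init__(self, in_features, out_features, r=2, alpha=4):
        super().__init__()
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)       # frozen pretrained weight
        self.A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, r))  # zero init: update starts at 0
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(16, 16, r=2)
x = torch.randn(1, 16)
out = layer(x)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
# trainable = 2*16 + 16*2 = 64, vs 16*16 = 256 entries in the frozen base weight
```

Because B is zero-initialized, the layer reproduces the frozen base exactly at step 0, and only the 64 low-rank parameters receive gradients during finetuning.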
😶 #dailyreport #llm #ai #architect #architecture #peft

