3A: Accelerate FSDP Fine-Tuning
This directory contains code for fine-tuning large language models using Accelerate with FSDP (Fully Sharded Data Parallel) on multiple GPUs.
Code
- `config_FSDP.yaml`: Configuration file for the Accelerate FSDP setup
- `finetune.py`: Main training script that uses Accelerate and PEFT to run supervised fine-tuning (SFTTrainer) with a low-rank adapter (LoRA)
- `job.sh`: SLURM job script for the Leonardo Booster HPC cluster
- `requirements.txt`: Python dependencies
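For orientation, an Accelerate FSDP configuration typically looks like the sketch below. This is not the repository's actual `config_FSDP.yaml`; the values (process count, sharding strategy, precision) are illustrative assumptions and should be adapted to your cluster.

```yaml
# Illustrative Accelerate FSDP config (not the repo's actual config_FSDP.yaml)
compute_environment: LOCAL_MACHINE
distributed_type: FSDP
mixed_precision: bf16
num_machines: 1
num_processes: 4          # one process per GPU
fsdp_config:
  fsdp_sharding_strategy: FULL_SHARD          # shard params, grads, and optimizer state
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_state_dict_type: SHARDED_STATE_DICT
```

A config like this is then passed to the training script at launch time, e.g. `accelerate launch --config_file config_FSDP.yaml finetune.py`.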