3A: Accelerate FSDP Fine-Tuning

This directory contains code for fine-tuning large language models using Accelerate with FSDP (Fully Sharded Data Parallel) on multiple GPUs.

Code

  • config_FSDP.yaml: Configuration file for Accelerate FSDP setup

  • finetune.py: Main training script that uses Accelerate and PEFT to run SFTTrainer with low-rank adaptation (LoRA)

  • job.sh: SLURM job script for the Leonardo Booster HPC cluster

  • requirements.txt: Python dependencies
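For orientation, a minimal sketch of what an Accelerate FSDP configuration like config_FSDP.yaml typically contains; the values below (process count, sharding strategy, precision) are illustrative assumptions, not the repo's actual settings:

```yaml
# Illustrative Accelerate FSDP config (values are assumptions, not the repo's)
compute_environment: LOCAL_MACHINE
distributed_type: FSDP
fsdp_config:
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP  # wrap each transformer block
  fsdp_sharding_strategy: FULL_SHARD             # shard params, grads, optimizer state
  fsdp_state_dict_type: SHARDED_STATE_DICT       # checkpoint shards per rank
  fsdp_offload_params: false
mixed_precision: bf16
num_machines: 1
num_processes: 4  # one process per GPU
```

With such a config in place, training is started via Accelerate's launcher, e.g. `accelerate launch --config_file config_FSDP.yaml finetune.py` (inside the SLURM allocation on the cluster).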