# gpt

Generative Pre-trained Transformer in PyTorch from scratch

## Train

### CLI

```sh
python src/train.py
```

Options:

```sh
--batch_size 64
--num-epochs 100
--lr 0.0001
--from-checkpoint checkpoint_path.pth
```
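A full training invocation combining these flags might look like the following sketch (the values are the defaults listed above, shown for illustration):

```shell
# Train for 100 epochs with batch size 64 and learning rate 1e-4
python src/train.py --batch_size 64 --num-epochs 100 --lr 0.0001
```

To resume an interrupted run, append `--from-checkpoint` with the path of a checkpoint saved in a previous session.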

The model is checkpointed after each epoch and saved to the `checkpoints/` directory.

### Code

```python
from train import train

train()
```

## Run

### CLI

```sh
python src/run.py --from-checkpoint checkpoint_path.pth
```

### Code

```python
from run import run

run(model_path="checkpoint_path.pth", prompt="Rick:\nMorty, where are you?")
```

## License

GPL v3

Jędrzej Maczan, 2024