fix(checkpoint): allow fine-tuning again for checkpoints without paths
- Override the model object's configuration with the loaded config.
- Call trainer.fit with ckpt_path in all cases except those where the
  optimizer configuration has changed.
- Raise an error if the model configuration has changed.
- Change checkpointing so that the monitored value is the validation
  loss, and add a checkpoint callback that saves the last checkpoint
  regardless of the validation loss.

Fixes https://github.com/roedoejet/EveryVoice/issues/238