After testing, I found this to be the best ZIT FP8 checkpoint currently available. After checking with the author, I decided to repost it.
Recommended settings: heunpp2 sampler + linear_quadratic scheduler, CFG 1.5-4, 20+ steps.
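For reference, here is a minimal sketch of those settings collected in one place, using ComfyUI's sampler and scheduler identifiers (which is where names like heunpp2 and linear_quadratic come from). The dict itself is just an illustration for plugging into a KSampler node or an equivalent API, not something shipped with the checkpoint:

```python
# Illustrative only: the recommended generation settings for this checkpoint,
# keyed by ComfyUI's KSampler parameter names. Assumes you supply these
# values to a KSampler node (or equivalent) yourself.
recommended_settings = {
    "sampler_name": "heunpp2",        # recommended sampler
    "scheduler": "linear_quadratic",  # matching scheduler
    "cfg": 2.5,                       # anywhere in the 1.5-4 range
    "steps": 20,                      # 20 or more
}
```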
The Tensor workbench can only run FP8 models, and forcing the ZIT FP16 checkpoint to run as FP8 introduces varying degrees of error. That makes this native FP8 model well suited to pairing with LoRAs trained on Tensor using a distillation-around approach.
This checkpoint is by the excellent author @FASCIUM on Civitai.