Full checkpoint with an improved text encoder (TE); do not load an additional CLIP/TE.
FLUX.1 (Base UNET) + Google FLAN
This model takes the 42 GB FP32 Google FLAN T5xxl, quantizes it, and pairs it with an improved CLIP-L for Flux. To my knowledge, no one else has posted or attempted this.
Quantized from the FP32 T5xxl (42 GB, 11B parameters)
Base UNET with no baked-in LoRAs or other changes
A full FP16 version is also available.