mistralai/Mixtral-8x7B-v0.1

Last scanned: Never scanned
Status: Unscanned

Versions

| Version | Commit message | Commit author | Last scanned |
|---------|----------------|---------------|--------------|
| | Release 231211 | pstock | Never scanned |
| | HF Transformers integration (#3) | pstock | Never scanned |
| | Update config.json (#5) | pstock | Never scanned |
| | Update config.json | timlacroix | Never scanned |
| | Update config.json (#19) | timlacroix | Never scanned |
| | Update config.json | timlacroix | Never scanned |
| | Update config.json | TimeRobber | Never scanned |
| | Update config.json | TimeRobber | Nov 12, 2024 |
| | Add MoE tag to Mixtral (#29) | pstock | Nov 12, 2024 |
| | Adding Evaluation Results | leaderboard-pr-bot | Oct 23, 2024 |
| | remove old disclaimer | LuckiestOne | Nov 7, 2024 |
| | Update config.json | ybelkada | Nov 7, 2024 |
| | Align tokenizer with mistral-common | Rocketknight1 | Never scanned |
| | Remove chat template (this is a base model) | Rocketknight1 | Nov 12, 2024 |
| | Upload MixtralForCausalLM | ybelkada | Never scanned |
| | Update config.json | ybelkada | Never scanned |
| | Update README.md | ybelkada | Nov 12, 2024 |
| | Delete consolidated.00.pt | pstock | Never scanned |
| | Upload consolidated.04.pth with huggingface_hub | pstock | Never scanned |
| | Upload consolidated.06.pth with huggingface_hub | pstock | Never scanned |
Protect AI's security scanner detects threats in model files. With Protect AI's Guardian, you can scan models for threats before ML developers download them, and apply policies based on your risk tolerance.