New interesting AI models

Apr 10, 2024 21:12


MaziyarPanahi/Mixtral-8x22B-v0.1-GGUF


https://huggingface.co/MaziyarPanahi/Mixtral-8x22B-v0.1-GGUF

On April 10th, @MistralAI released a model called "Mixtral 8x22B", a 176B MoE, via a magnet link (torrent):

  • 176B MoE with ~40B active
  • Context length of 65k tokens
  • The base model can be fine-tuned
  • Requires ~260GB VRAM in fp16, 73GB in int4
  • Licensed under Apache 2.0, according to their Discord
  • Available on @huggingface (community)
  • Utilizes a tokenizer similar to previous models

The GGUF and quantized models here are based on the v2ray/Mixtral-8x22B-v0.1 model.
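
A minimal sketch of loading one of these GGUF quantizations with llama-cpp-python; the file name below is an assumption, so check the repo for the actual quantization names and whether the files are split into parts:

from llama_cpp import Llama

llm = Llama(
    model_path="Mixtral-8x22B-v0.1.Q4_K_M.gguf",  # hypothetical local file name
    n_ctx=65536,       # the 65k-token context window
    n_gpu_layers=-1,   # offload all layers to GPU; int4 needs roughly the 73GB mentioned above
)

out = llm("Q: What is a mixture-of-experts model?\nA:", max_tokens=128)
print(out["choices"][0]["text"])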

google/codegemma-7b-it-GGUF


https://huggingface.co/google/codegemma-7b-it-GGUF

Settings for running the model are available on the model card linked above.
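
A minimal sketch of prompting the instruction-tuned CodeGemma through transformers, using the non-GGUF repo google/codegemma-7b-it; it assumes the Gemma license has been accepted on Hugging Face and that a 7B model fits in available GPU memory:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/codegemma-7b-it"  # non-GGUF repo; gated behind the Gemma license
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tok.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
out = model.generate(inputs, max_new_tokens=128)
print(tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))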

StarCoder2



https://ai.sber.ru/post/vypushcheny-tri-novye-modeli-starcoder2-dlya-generacii-koda
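
A minimal sketch of plain code completion with one of the StarCoder2 checkpoints (bigcode/starcoder2-3b is an assumption here; the family also ships 7B and 15B variants):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigcode/starcoder2-3b"  # smallest checkpoint of the family
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

prompt = "def fibonacci(n):\n"
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))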

mixtral, AI
