Capybara 200K 34B Q5 2025 specs. After a lot of hard work, here it is: my latest (and biggest, considering model sizes) LLM comparison/test. Innovative training and unprecedented context length:
Long context on 16 GB cards may be possible. This is the first 34B Nous model and the first 200K-context Nous model!
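As a rough illustration of how a quantized 34B might be run with extended context on a 16 GB card, here is a minimal sketch using llama-cpp-python with partial GPU offload. The model filename, context size, layer count, and prompt format are assumptions to tune for your own hardware, not the exact setup tested here.

```python
# Minimal sketch: long-context inference with a GGUF quant and partial GPU offload.
# Assumptions: a local Q5 GGUF file and values tuned for roughly 16 GB of VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="nous-capybara-34b.Q5_K_M.gguf",  # hypothetical local filename
    n_ctx=16384,      # assumed context window; the model supports up to 200K,
                      # but KV-cache memory grows with context, so start smaller
    n_gpu_layers=30,  # assumed: offload only as many layers as VRAM allows
    n_threads=8,      # CPU threads for the layers that stay on the CPU
)

# Simple completion call; adjust the prompt format to the model's template.
out = llm("USER: Summarize the following document...\nASSISTANT:", max_tokens=256)
print(out["choices"][0]["text"])
```

If generation runs out of VRAM, lowering `n_gpu_layers` or `n_ctx` trades speed for memory; the rest of the model simply runs on the CPU.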