
DeepSeek V2

DeepSeek V2 is a powerful open-source Mixture-of-Experts (MoE) language model from DeepSeek AI. It has 236 billion total parameters, of which 21 billion are activated for each token, so per-token compute stays well below that of a dense model of the same total size while performance remains strong.
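To make the MoE trade-off concrete, the arithmetic below (plain Python, using the parameter counts quoted above) shows that only about 9% of the model's parameters participate in any single forward pass, even though all of them must be stored:

total_params = 236e9    # every expert has to be stored
active_params = 21e9    # parameters routed to for one token
print(f"parameters used per token: {active_params / total_params:.1%}")  # ~8.9%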

deepseek-v2

236B parameters · Base

Pull this model

Use the following command with the HoML CLI:

homl pull deepseek-v2:base

Resource Requirements

Quantization    Disk Space    GPU Memory
FP16            472 GB        543 GB
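The disk figure follows directly from the parameter count. A rough estimate, assuming 2 bytes per parameter for FP16/BF16 weights and ignoring file-format overhead, reproduces the table values for both the full and Lite checkpoints; the GPU memory figure is higher because serving also needs room for activations and the KV cache, whose size depends on the runtime and is not derived here.

# Rough checkpoint size: parameter count x bytes per parameter (assumption: 2 bytes for FP16/BF16)
for name, params in [("deepseek-v2", 236e9), ("deepseek-v2-lite", 16e9)]:
    print(f"{name}: ~{params * 2 / 1e9:.0f} GB of weights")
# deepseek-v2: ~472 GB, deepseek-v2-lite: ~32 GB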

deepseek-v2-chat

236B parameters · Chat

Pull this model

Use the following command with the HoML CLI:

homl pull deepseek-v2:chat

Resource Requirements

Quantization    Disk Space    GPU Memory
FP16            472 GB        543 GB
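Once the chat variant is pulled and served, it can be queried programmatically. The sketch below assumes HoML exposes an OpenAI-compatible endpoint on localhost; the base URL, port, and model identifier used here are assumptions rather than documented values, so adjust them to match your local setup.

from openai import OpenAI

# Assumption: HoML serves an OpenAI-compatible API locally; URL, port, and model name may differ.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")
response = client.chat.completions.create(
    model="deepseek-v2:chat",
    messages=[{"role": "user", "content": "Explain Mixture-of-Experts in two sentences."}],
)
print(response.choices[0].message.content)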

deepseek-v2-lite-base

16B parameters · Base

Pull this model

Use the following command with the HoML CLI:

homl pull deepseek-v2:lite-base

Resource Requirements

Quantization    Disk Space    GPU Memory
BF16            32 GB         32 GB

deepseek-v2-lite-chat

16B parameters · Chat

Pull this model

Use the following command with the HoML CLI:

homl pull deepseek-v2:lite-chat

Resource Requirements

Quantization    Disk Space    GPU Memory
BF16            32 GB         32 GB