Netmind AI

LiteLLM supports all models on Netmind AI.

API Keys

import os
os.environ["NETMIND_API_KEY"] = "your-api-key"

Sample Usage

chat

import os
from litellm import completion

os.environ["NETMIND_API_KEY"] = "your-api-key"

messages = [{"role": "user", "content": "Write me a poem about the blue sky"}]

completion(model="netmind/meta-llama/Llama-3.3-70B-Instruct", messages=messages)
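
The returned object follows the OpenAI chat-completion format, so the generated text can be read from the first choice. A minimal sketch continuing the example above:

response = completion(model="netmind/meta-llama/Llama-3.3-70B-Instruct", messages=messages)

# Access the assistant's reply (OpenAI-style response shape).
print(response.choices[0].message.content)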

embedding

import os
from litellm import embedding

os.environ["NETMIND_API_KEY"] = "your-api-key"

response = embedding(
    model="netmind/nvidia/NV-Embed-v2", input=["I love programming."]
)
print(response)
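
The embedding response mirrors the OpenAI embeddings format, with one entry in response.data per input string. A minimal sketch continuing the example above (field layout assumed OpenAI-compatible):

# Extract the vector for the first (and only) input string.
vector = response.data[0]["embedding"]
print(len(vector))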

Netmind AI Models

LiteLLM supports non-streaming and streaming requests to all models on https://www.netmind.ai/.
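
For a streaming request, pass stream=True and iterate over the chunks as they arrive; a minimal sketch assuming OpenAI-style delta fields on each chunk:

import os
from litellm import completion

os.environ["NETMIND_API_KEY"] = "your-api-key"

response = completion(
    model="netmind/meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Write me a poem about the blue sky"}],
    stream=True,
)
for chunk in response:
    # content can be None on some chunks (e.g. the final one), hence the fallback.
    print(chunk.choices[0].delta.content or "", end="")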

Example Netmind usage - Note: LiteLLM supports all models deployed on Netmind.

LLM models

| Model Name | Function Call |
| --- | --- |
| netmind/deepseek-ai/DeepSeek-R1 | completion('netmind/deepseek-ai/DeepSeek-R1', messages) |
| netmind/deepseek-ai/DeepSeek-V3 | completion('netmind/deepseek-ai/DeepSeek-V3', messages) |
| netmind/meta-llama/Llama-3.3-70B-Instruct | completion('netmind/meta-llama/Llama-3.3-70B-Instruct', messages) |
| netmind/meta-llama/Meta-Llama-3.1-405B | completion('netmind/meta-llama/Meta-Llama-3.1-405B', messages) |
| netmind/Llama3.1-8B-Chinese-Chat | completion('netmind/Llama3.1-8B-Chinese-Chat', messages) |
| netmind/Qwen/Qwen2.5-72B-Instruct | completion('netmind/Qwen/Qwen2.5-72B-Instruct', messages) |
| netmind/Qwen/QwQ-32B | completion('netmind/Qwen/QwQ-32B', messages) |
| netmind/deepseek-ai/Janus-Pro-7B | completion('netmind/deepseek-ai/Janus-Pro-7B', messages) |

Embedding models

| Model Name | Function Call |
| --- | --- |
| netmind/BAAI/bge-m3 | embedding('netmind/BAAI/bge-m3', input) |
| netmind/nvidia/NV-Embed-v2 | embedding('netmind/nvidia/NV-Embed-v2', input) |
| netmind/dunzhang/stella_en_1.5B_v5 | embedding('netmind/dunzhang/stella_en_1.5B_v5', input) |
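
Any of the embedding models above can be called the same way as in the sample usage; input also accepts a list of strings, so several texts can be embedded in one call. A minimal sketch:

import os
from litellm import embedding

os.environ["NETMIND_API_KEY"] = "your-api-key"

# One vector is returned per input string, in the same order.
response = embedding(
    model="netmind/BAAI/bge-m3",
    input=["I love programming.", "LiteLLM makes switching providers easy."],
)
print(len(response.data))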