LocalAI

LocalAI is the free, open-source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that is compatible with the OpenAI API specification for local inferencing. It allows you to run LLMs, generate images, audio, and more, locally or on-prem with consumer-grade hardware, supporting multiple model families and architectures.
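
Because LocalAI mirrors the OpenAI API, any OpenAI-style client can target it just by changing the base URL. A minimal sketch of building such a request; the port, path, and model name below are assumptions for illustration (LocalAI's default port is typically 8080, but your deployment may differ):

```python
import json

# Assumed local endpoint; adjust host/port to your LocalAI deployment.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request for a LocalAI server."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "payload": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_chat_request("llama-3.2-1b-instruct", "Hello!")
print(json.dumps(req, indent=2))
```

Against a running server, the same request could be sent with `requests.post(req["url"], json=req["payload"])`, or by pointing an OpenAI SDK client's `base_url` at the LocalAI instance.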

OpenAI Proxy

A proxy server to call 100+ LLMs through a unified interface, track spend, and set budgets per virtual key or user.

Features:

When adopting LLMs, an organization may end up using many different models, spanning commercial and open-source licenses and coming from different providers. To manage and build applications on these heterogeneous models in one place, a platform such as OpenAI Proxy is recommended, in order to achieve the following:
  • A single API entry point and request format
  • Cost tracking
  • Load balancing
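
The single-entry-point idea above can be sketched as follows: client code always sends one request shape to the proxy, and the proxy routes to the upstream provider based on the model name. The proxy URL, virtual key, and model names below are hypothetical values for illustration:

```python
# Sketch of the unified-interface idea behind an OpenAI-compatible proxy.
# PROXY_URL and VIRTUAL_KEY are assumptions, not real deployment values.
PROXY_URL = "http://localhost:4000/v1"
VIRTUAL_KEY = "sk-team-a"  # a per-team virtual key, enabling spend tracking

def proxied_chat_request(model: str, prompt: str) -> dict:
    """One request shape for every upstream model; the proxy routes by name."""
    return {
        "url": f"{PROXY_URL}/chat/completions",
        "headers": {"Authorization": f"Bearer {VIRTUAL_KEY}"},
        "payload": {"model": model,
                    "messages": [{"role": "user", "content": prompt}]},
    }

# Commercial and open-source models are addressed identically:
reqs = [proxied_chat_request(m, "ping")
        for m in ("gpt-4o", "claude-3-5-sonnet", "local-llama")]
```

Because every request carries the virtual key and goes through one URL, the proxy can attribute cost per key and balance load across upstream deployments without any change to client code.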
Xinference

Xorbits Inference (Xinference) is an open-source platform to streamline the operation and integration of a wide array of AI models. With Xinference, you’re empowered to run inference using any open-source LLMs, embedding models, and multimodal models either in the cloud or on your own premises, and create robust AI-driven applications.
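
Since Xinference serves embedding models as well as LLMs, and exposes an OpenAI-compatible HTTP endpoint, an embedding call can be sketched the same way. The host, port, and model name are assumptions for illustration:

```python
# Assumed Xinference endpoint; the default supervisor port is commonly 9997,
# but verify against your own deployment.
BASE_URL = "http://localhost:9997/v1"

def build_embedding_request(model: str, texts: list) -> dict:
    """Build an OpenAI-style /v1/embeddings request for a Xinference server."""
    return {
        "url": f"{BASE_URL}/embeddings",
        "payload": {"model": model, "input": texts},
    }

req = build_embedding_request("bge-m3", ["hello world", "你好"])
```

A model must first be launched on the Xinference server (via its web UI, CLI, or client library) before such a request will resolve.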

NVIDIA NIM

Explore the latest community-built AI models with an API optimized and accelerated by NVIDIA, then deploy anywhere with NVIDIA NIM inference microservices.
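
NIM microservices likewise speak an OpenAI-compatible API, whether hosted in NVIDIA's API catalog or self-deployed as a container. The hosted URL, model name, and environment-variable name below are assumptions for illustration; a self-hosted NIM would expose its own local URL instead:

```python
import os

# Assumed hosted endpoint for NVIDIA's API catalog; a self-deployed NIM
# container would expose its own local OpenAI-compatible URL instead.
NIM_URL = "https://integrate.api.nvidia.com/v1"

def build_nim_request(model: str, prompt: str, api_key: str) -> dict:
    """Build an OpenAI-style chat request carrying an NVIDIA API key."""
    return {
        "url": f"{NIM_URL}/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "payload": {"model": model,
                    "messages": [{"role": "user", "content": prompt}]},
    }

req = build_nim_request("meta/llama-3.1-8b-instruct", "Hello",
                        os.environ.get("NVIDIA_API_KEY", "demo-key"))
```

The key point is portability: because the request shape is the OpenAI one, moving between the hosted catalog and an on-prem NIM deployment only changes the URL and credentials.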

text-generation-webui

A Gradio web UI for Large Language Models.

Runs local models only; external model APIs are not supported.

AI platforms that support multiple of the features above

Tutorials



Revision #4
Created 11 November 2024 09:39:31 by Admin
Updated 11 November 2024 09:49:52 by Admin