Generative AI Scripting
Updated Jun 11, 2024 - TypeScript
An easy-to-use, high-performance NLP and LLM framework based on MindSpore, compatible with 🤗 Hugging Face models and datasets.
Shire offers a straightforward AI coding and agents language that lets an LLM communicate with and control the IDE for automated programming.
OSWorld: Benchmarking Multimodal Agents for Open-Ended Tasks in Real Computer Environments
Private chat with local GPT with document, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://codellama.h2o.ai/
User-friendly WebUI for LLMs (formerly Ollama WebUI)
A simple-to-use Ollama autocompletion engine with exposed options and streaming functionality
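The two Ollama-based entries above talk to a local Ollama server over its HTTP API. As a minimal sketch of how such a client is wired up (the model name `codellama` is illustrative, and `build_request`/`join_stream` are hypothetical helper names, not part of any listed project), the request body for Ollama's `/api/generate` endpoint and the reassembly of its streamed response fragments look roughly like this:

```python
import json

# Default local Ollama endpoint; assumes a server is running on this port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "codellama", stream: bool = True) -> str:
    """Serialize the JSON body the /api/generate endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

def join_stream(lines: list[str]) -> str:
    """Each streamed line is a JSON object carrying a "response" fragment;
    concatenating the fragments yields the full completion."""
    return "".join(json.loads(line)["response"] for line in lines)
```

Streaming is what makes autocompletion feel responsive: the editor can render each fragment as it arrives instead of waiting for the whole completion.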
A cloud-native vector database: storage for next-generation AI applications
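At its core, a vector database answers nearest-neighbor queries over embeddings. Real systems use approximate indexes (HNSW, IVF) for scale, but the operation they accelerate can be sketched as brute-force cosine-similarity search (the function names here are illustrative, not any particular database's API):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot product normalized by both vector norms."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query: list[float], index: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """index: list of (doc_id, vector). Return ids of the k most similar vectors."""
    ranked = sorted(index, key=lambda item: cosine(query, item[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

A production vector database replaces the `sorted` scan with an index that trades a little recall for sub-linear query time.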
A high-throughput and memory-efficient inference and serving engine for LLMs
RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
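The RAG loop that engines like RAGFlow industrialize has a simple shape: score stored chunks against the query, keep the best few, and prepend them to the prompt. A generic sketch (this is not RAGFlow's actual API; real systems score with embeddings rather than the crude word-overlap used here):

```python
def score(query: str, chunk: str) -> int:
    """Crude relevance score: number of lowercase words shared with the query.
    Stands in for the embedding-similarity scoring a real RAG engine uses."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def build_prompt(query: str, chunks: list[str], k: int = 2) -> str:
    """Retrieve the k best-scoring chunks and assemble the augmented prompt."""
    best = sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]
    context = "\n".join(best)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
```

The "deep document understanding" part of RAGFlow lives upstream of this loop: parsing PDFs, tables, and layout into chunks worth retrieving in the first place.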
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
🤘 TT-NN operator library, and TT-Metalium low level kernel programming model.
The easiest way to serve AI/ML models in production - build model inference services, LLM APIs, multi-model inference graphs/pipelines, LLM/RAG apps, and more!
⏩ Continue enables you to create your own AI code assistant inside your IDE. Keep your developers in flow with open-source VS Code and JetBrains extensions
Implementation of the paper "LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens"
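LongRoPE extends rotary position embeddings (RoPE), the positional scheme used by most current open LLMs, in which each pair of vector dimensions is rotated by a position-dependent angle. A minimal sketch of plain RoPE (pure Python, standard base of 10000; this illustrates the mechanism being extended, not the paper's rescaling method):

```python
import math

def rope(vec: list[float], pos: int, base: float = 10000.0) -> list[float]:
    """Apply rotary position embedding at position `pos`.
    Dimension pairs (2i, 2i+1) are rotated by theta_i = pos * base^(-2i/d)."""
    d = len(vec)  # assumed even
    out = []
    for i in range(0, d, 2):
        theta = pos * base ** (-i / d)
        c, s = math.cos(theta), math.sin(theta)
        x, y = vec[i], vec[i + 1]
        out += [x * c - y * s, x * s + y * c]  # 2-D rotation of the pair
    return out
```

Because each pair undergoes a pure rotation, the vector's norm is preserved, and dot products between rotated queries and keys depend only on their relative positions; context-extension methods like LongRoPE work by rescaling the per-pair angles so long positions stay within the range the model saw in training.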
Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.