LLM-Stack
Portkey's AI Gateway is the interface between your app and hosted LLMs. It streamlines API requests to OpenAI, Anthropic, Mistral, Llama2, Anyscale, Google Gemini, and more with a unified API.
✅ Blazing fast (9.9x faster) with a tiny footprint (~45kb installed)
✅ Load balance across multiple models, providers, and keys
✅ Fallbacks make sure your app ...
Portkey-AI • GitHub - Portkey-AI/gateway: A Blazing Fast AI Gateway. Route to 100+ LLMs with 1 fast & friendly API.
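As a rough sketch of what a call through the gateway can look like from Python (assuming a locally running gateway on port 8787 with an OpenAI-compatible chat route; the port, path, and `x-portkey-provider` header are assumptions, so check the repo's README for the current interface):

```python
# Sketch: send an OpenAI-style chat request through a locally running
# Portkey gateway. The URL, route, and provider header are assumptions.
import os

import requests

GATEWAY_URL = "http://localhost:8787/v1/chat/completions"  # assumed local gateway route

response = requests.post(
    GATEWAY_URL,
    headers={
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "x-portkey-provider": "openai",  # assumed header selecting the upstream provider
        "Content-Type": "application/json",
    },
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Say hello in one word."}],
    },
    timeout=30,
)
print(response.json())
```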
Original creator: Jesse Zhang (GH: emptycrown, Twitter: @thejessezhang), who courteously donated the repo to LlamaIndex!
This is a simple library of all the data loaders / readers / tools / llama-packs that have been created by the community. The goal is to make it extremely easy to connect large language models to a large variety of knowledge sour...
GitHub - run-llama/llama-hub: A library of data loaders for LLMs made by the community -- to be used with LlamaIndex and/or LangChain
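A small sketch of how one of these community loaders is typically pulled in through llama_index's `download_loader` helper (the `WikipediaReader` loader and page title are illustrative choices, and `download_loader` is the helper documented for older llama_index releases; newer versions package loaders differently):

```python
# Sketch: fetch a community data loader from LlamaHub and turn external
# content into documents that an LLM pipeline can index.
# WikipediaReader and the page title are illustrative placeholders.
from llama_index import download_loader

WikipediaReader = download_loader("WikipediaReader")  # downloads the loader from llama-hub

reader = WikipediaReader()
documents = reader.load_data(pages=["Large language model"])
print(f"Loaded {len(documents)} document(s)")
```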
Overview
AIConfig saves prompts, models and model parameters as source control friendly configs. This allows you to iterate on prompts and model parameters separately from your application code.
- Prompts as configs: a standardized JSON format to store generative AI model settings, prompt inputs/outputs, and flexible metadata.
- Model-agnostic SDK:
lastmile-ai • GitHub - lastmile-ai/aiconfig: aiconfig -- config-driven, source control friendly AI application development
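A minimal sketch of the config-driven flow with the Python SDK, assuming its `AIConfigRuntime.load()` / `run()` interface; the file name `travel.aiconfig.json`, the prompt name, and the parameters are placeholders:

```python
# Sketch: keep prompts and model parameters in an .aiconfig.json file and run
# a named prompt from application code. File name, prompt name, and params
# are placeholders.
import asyncio

from aiconfig import AIConfigRuntime


async def main() -> None:
    # The config holds the prompt templates and model settings, so they can be
    # edited and versioned separately from this code.
    config = AIConfigRuntime.load("travel.aiconfig.json")
    result = await config.run("get_activities", params={"city": "Lisbon"})
    print(result)


asyncio.run(main())
```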
LLM evaluation metrics powered by ANY LLM of your choice, statistical methods, or NLP models that run locally on your machine:
- G-Eval
- Summarization
- Answer Relevancy
- Faithfulness
- Contextual Recall
- Contextual Precision
- RAGAS
- Hallucination
- Toxicity
- Bias
- etc.
GitHub - confident-ai/deepeval: The LLM Evaluation Framework
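A short sketch of scoring one of these metrics with DeepEval's test-case API, following the shape of the repo's quickstart at the time (exact argument names may have shifted between versions):

```python
# Sketch: score a single LLM response for answer relevancy with DeepEval.
# The input/output strings are toy examples.
from deepeval import evaluate
from deepeval.metrics import AnswerRelevancyMetric
from deepeval.test_case import LLMTestCase

test_case = LLMTestCase(
    input="What if these shoes don't fit?",
    actual_output="We offer a 30-day full refund at no extra cost.",
    retrieval_context=["All customers get a 30-day full refund at no extra cost."],
)

metric = AnswerRelevancyMetric(threshold=0.7)  # uses an LLM judge under the hood
evaluate([test_case], [metric])
```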
Introducing Prompts: LLM Monitoring
W&B Prompts – LLM Monitoring provides large language model usage monitoring and diagnostics. Start simply, then customize and evolve your monitoring analytics over time.
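The snippet below is only a generic sketch of logging LLM calls to a W&B project with the core `wandb` client; the dedicated Prompts tooling has its own richer tracing APIs, and the project name and logged fields here are made up for illustration:

```python
# Sketch: log basic LLM usage (prompt, response, latency, token count) to
# Weights & Biases for inspection in the UI. Field names and the project
# name are illustrative, not a prescribed schema.
import time

import wandb

run = wandb.init(project="llm-monitoring-demo")
table = wandb.Table(columns=["prompt", "response", "latency_s", "tokens"])

start = time.time()
prompt = "Summarize the launch notes in one sentence."
response = "The release adds streaming support and fixes two routing bugs."  # stand-in for a real model call
table.add_data(prompt, response, round(time.time() - start, 3), 14)

run.log({"llm_calls": table})
run.finish()
```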
Monitoring Tools
dstack is an open-source toolkit and orchestration engine for running GPU workloads. It's designed for development, training, and deployment of gen AI models on any cloud.
Supported providers: AWS, GCP, Azure, Lambda, TensorDock, Vast.ai, and DataCrunch.
Latest news ✨
- [2024/01] dstack 0.14.0: OpenAI-compatible endpoints preview (Release)
- [2023/12] dst
dstackai • GitHub - dstackai/dstack: dstack is an open-source toolkit for running GPU workloads on any cloud. It works seamlessly with any cloud GPU providers. Discord: https://discord.gg/u8SmfwPpMd
dstack will start to support autoscaling in March. You can configure multiple clouds, and it deploys to the cheapest one.
Higher performance and lower cost than any single LLM
We invented the first LLM router. By dynamically routing between multiple models, Martian can beat GPT-4 on performance, reduce costs by 20%-97%, and simplify the process of using AI.
Martian's Model Router: Optimize AI Performance and Reduce Costs
SkyPilot is a framework for running LLMs, AI, and batch jobs on any cloud, offering maximum cost savings, highest GPU availability, and managed execution.
SkyPilot abstracts away cloud infra burdens:
- Launch jobs & clusters on any cloud
- Easy scale-out: queue and run many jobs, automatically managed
- Easy access to object stores (S3, GCS, R2)
skypilot-org • GitHub - skypilot-org/skypilot: SkyPilot: Run LLMs, AI, and Batch jobs on any cloud. Get maximum savings, highest GPU availability, and managed execution—all with a simple interface.
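A small sketch of launching a job through SkyPilot's Python API, following the shape of its quickstart; the setup/run commands, accelerator spec, and cluster name are placeholders:

```python
# Sketch: define a task and let SkyPilot provision a cluster on one of the
# configured clouds to run it. Commands, accelerator, and cluster name are
# placeholders.
import sky

task = sky.Task(
    setup="pip install -r requirements.txt",
    run="python train.py",
)
task.set_resources(sky.Resources(accelerators="A100:1"))

sky.launch(task, cluster_name="dev-a100")
```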
LanceDB
LanceDB is an open-source vector database for AI that's designed to store, manage, query and retrieve embeddings on large-scale multi-modal data. The core of LanceDB is written in Rust 🦀 and is built on top of Lance, an open-source columnar data format designed for performant ML workloads and fast random access.
Both the database and the un...
LanceDB - LanceDB
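A minimal sketch of the embedded Python workflow (connect to a local directory, create a table from records with a vector column, then run a nearest-neighbour search); the path, table name, and toy vectors are illustrative:

```python
# Sketch: store a few embeddings in a local LanceDB table and query the
# nearest neighbour of a query vector. The vectors are toy values; real
# usage would store model-generated embeddings.
import lancedb

db = lancedb.connect("./lancedb-demo")  # data lives in a local directory

table = db.create_table(
    "docs",
    data=[
        {"vector": [0.1, 0.2, 0.3], "text": "intro to vector search"},
        {"vector": [0.9, 0.8, 0.7], "text": "columnar storage with Lance"},
    ],
)

results = table.search([0.1, 0.2, 0.25]).limit(1).to_list()
print(results[0]["text"])
```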
🐂 What is Oxen?
Oxen is a lightning fast data version control system for structured and unstructured machine learning datasets. We aim to make versioning datasets as easy as versioning code.
The interface mirrors git, but shines in many areas where git or git-lfs fall short. Oxen is built from the ground up for data, and is optimized to handle large...
Oxen-AI • GitHub - Oxen-AI/oxen-release: Official repository for docs and releases of the Oxen CLI
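Because the interface mirrors git, versioning a first cut of a dataset looks roughly like the commands below (wrapped in Python here only to keep the examples in one language; the directory and file names are placeholders, and the commands assume the `oxen` CLI is installed):

```python
# Sketch: version a dataset directory with Oxen's git-like CLI.
# The commands mirror git (init/add/commit); paths are placeholders and the
# directory is assumed to already contain the data files.
import subprocess


def oxen(*args: str, cwd: str = "my-dataset") -> None:
    """Run an oxen CLI command inside the dataset directory."""
    subprocess.run(["oxen", *args], cwd=cwd, check=True)


oxen("init")                         # turn the directory into an Oxen repo
oxen("add", "train.csv", "images/")  # stage structured and unstructured data
oxen("commit", "-m", "first cut of the training set")
```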
Data Structuring & Storage