AI Use Cases
When building LLM-powered applications, you typically need the following capabilities from external apps:
- Importing Documents from files, knowledge bases, and other sources. This content is then used to provide the LLM with context via retrieval-augmented generation (RAG); a minimal ingestion sketch follows after this list.
- Importing Data Objects and Their Metadata from external apps. This lets you query the data using LLMs.
- Using Tools to make requests and take actions in external apps dynamically, based on the LLM's decisions; a tool-calling sketch is also shown after this list.
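The sketch below illustrates the document-import flow for RAG: pull documents from an external source, split them into chunks, embed the chunks, and retrieve the most relevant ones as context for the LLM. This is a minimal sketch only; the `fetch_documents` helper and the `embed` stub are hypothetical placeholders for a real connector and a real embedding model.

```python
import numpy as np

def fetch_documents() -> list[str]:
    # Hypothetical placeholder: in a real app this would pull documents from
    # the external app's files or knowledge base via its API.
    return [
        "Refunds are processed within 5 business days.",
        "Enterprise customers get a dedicated support channel.",
    ]

def embed(text: str) -> np.ndarray:
    # Stub embedding seeded from the text so it is deterministic.
    # Replace with a real embedding model or embeddings API in practice.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def chunk(doc: str, size: int = 500) -> list[str]:
    # Naive fixed-size chunking; production pipelines usually split on structure.
    return [doc[i : i + size] for i in range(0, len(doc), size)]

# Tiny in-memory index of (chunk text, embedding) pairs.
index = [(c, embed(c)) for doc in fetch_documents() for c in chunk(doc)]

def retrieve(query: str, k: int = 2) -> list[str]:
    # Rank chunks by cosine similarity to the query embedding.
    q = embed(query)
    scored = sorted(
        index,
        key=lambda item: float(
            np.dot(q, item[1]) / (np.linalg.norm(q) * np.linalg.norm(item[1]))
        ),
        reverse=True,
    )
    return [text for text, _ in scored[:k]]

# The retrieved chunks are passed to the LLM as context alongside the question.
context = "\n".join(retrieve("How long do refunds take?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How long do refunds take?"
```

In a production pipeline you would typically persist the embeddings in a vector database rather than an in-memory list, and re-sync the index as the source documents change.

For the Tools use case, here is a minimal sketch of LLM-driven tool calling. It assumes the OpenAI Python SDK; the `create_ticket` function is a hypothetical stand-in for a real action in an external app (for example, creating a ticket in a helpdesk tool).

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def create_ticket(title: str, priority: str) -> dict:
    # Hypothetical action: a real implementation would call the external app's API.
    return {"id": "TCK-123", "title": title, "priority": priority}

# Describe the tool so the model can decide when and how to call it.
tools = [
    {
        "type": "function",
        "function": {
            "name": "create_ticket",
            "description": "Create a support ticket in the external helpdesk app.",
            "parameters": {
                "type": "object",
                "properties": {
                    "title": {"type": "string"},
                    "priority": {"type": "string", "enum": ["low", "medium", "high"]},
                },
                "required": ["title", "priority"],
            },
        },
    }
]

messages = [{"role": "user", "content": "Open a high-priority ticket: checkout page is down."}]
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
message = response.choices[0].message

# If the model decided to call the tool, execute the action and return the result.
if message.tool_calls:
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)
    result = create_ticket(**args)
    messages.append(message)
    messages.append({"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)})
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```

The key loop is the same regardless of provider: the model decides whether to call a tool, your code executes the action in the external app, and the result is passed back so the model can compose its final response.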
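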
In addition to these foundational use cases, your AI app will often need the same integration capabilities as any non-LLM-powered app. To explore them, check out the Use Cases directory.