
Use Cases

The following sections introduce common txtai use cases. A comprehensive set of over 50 example notebooks and applications is also available.

Semantic Search

Build semantic/similarity/vector/neural search applications.


Traditional search systems use keywords to find data. Semantic search has an understanding of natural language and identifies results that have the same meaning, not necessarily the same keywords.
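As a minimal sketch of this pattern (assuming a recent txtai version; the model name and sample texts are illustrative placeholders), an Embeddings instance indexes text as vectors and answers queries by meaning.

```python
from txtai import Embeddings

# Build an embeddings index; content=True stores the original text for retrieval
embeddings = Embeddings(path="sentence-transformers/all-MiniLM-L6-v2", content=True)
embeddings.index([
    "US tops 5 million confirmed virus cases",
    "Beijing mobilises invasion craft along coast as Taiwan tensions escalate",
    "Maine man wins $1M from $25 lottery ticket"
])

# Query by meaning rather than keywords
print(embeddings.search("public health crisis", 1))
```

The query shares no keywords with the first document, yet it is returned as the top match because the meanings are close.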


Get started with the following examples.

| Notebook | Description |
|----------|-------------|
| Introducing txtai ▶️ | Overview of the functionality provided by txtai |
| Similarity search with images | Embed images and text into the same space for search |
| Build a QA database | Question matching with semantic search |
| Semantic Graphs | Explore topics, data connectivity and run network analysis |

LLM Orchestration

LLM chains, retrieval augmented generation (RAG), chat with your data, and pipelines and workflows that interface with large language models (LLMs).

Chains

Integrate LLM chains (known as workflows in txtai), multiple LLM agents and self-critique.
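The sketch below shows the basic idea, assuming a recent txtai version: an LLM pipeline is wrapped in workflow tasks that run in sequence. The model name is only a placeholder.

```python
from txtai import LLM
from txtai.workflow import Task, Workflow

# LLM pipeline; the model below is a placeholder, any supported model works
llm = LLM("meta-llama/Llama-3.1-8B-Instruct")

# Chain tasks: each task receives a batch of elements and returns transformed elements
workflow = Workflow([
    Task(lambda texts: [llm(f"Summarize in one sentence: {x}") for x in texts]),
    Task(lambda texts: [llm(f"Translate to French: {x}") for x in texts])
])

print(list(workflow(["txtai is an all-in-one embeddings database for semantic search and LLM workflows."])))
```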


See below to learn more.

| Notebook | Description |
|----------|-------------|
| Prompt templates and task chains | Build model prompts and connect tasks together with workflows |
| Integrate LLM frameworks | Integrate llama.cpp, LiteLLM and custom generation frameworks |
| Build knowledge graphs with LLMs | Build knowledge graphs with LLM-driven entity extraction |

Retrieval augmented generation

Retrieval augmented generation (RAG) reduces the risk of LLM hallucinations by constraining the output with a knowledge base as context. RAG is commonly used to "chat with your data".


A novel feature of txtai is that it can provide both an answer and a source citation.
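The underlying pattern can be sketched by hand: retrieve the closest matches from an embeddings index, pass them to an LLM as context and keep the retrieved records as citations. The snippet below is an illustrative sketch, not the full RAG pipeline covered in the notebooks; the model name and documents are placeholders.

```python
from txtai import Embeddings, LLM

# Index a small knowledge base (placeholder documents)
embeddings = Embeddings(content=True)
embeddings.index([
    "txtai is an all-in-one embeddings database",
    "txtai supports retrieval augmented generation (RAG) pipelines"
])

llm = LLM("meta-llama/Llama-3.1-8B-Instruct")  # placeholder model

question = "What is txtai?"

# Retrieve the best matches and keep them as source citations
sources = embeddings.search(question, 3)
context = "\n".join(x["text"] for x in sources)

answer = llm(f"Answer the question using only this context.\nQuestion: {question}\nContext: {context}")
print(answer, sources)
```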

| Notebook | Description |
|----------|-------------|
| Build RAG pipelines with txtai | Guide on retrieval augmented generation including how to create citations |
| Advanced RAG with graph path traversal | Graph path traversal to collect complex sets of data for advanced RAG |
| Advanced RAG with guided generation | Retrieval Augmented and Guided Generation |

Language Model Workflows

Language model workflows, also known as semantic workflows, connect language models together to build intelligent applications.


While LLMs are powerful, there are plenty of smaller, more specialized models that work better and faster for specific tasks. This includes models for extractive question-answering, automatic summarization, text-to-speech, transcription and translation.
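As a rough sketch of combining such specialized models, the example below chains summarization and translation pipelines in a single workflow. It assumes the default pipeline models; the input text is a placeholder.

```python
from txtai.pipeline import Summary, Translation
from txtai.workflow import Task, Workflow

# Specialized pipelines with their default models
summary = Summary()
translate = Translation()

# Summarize each input document, then translate the summaries to French
workflow = Workflow([
    Task(summary),
    Task(lambda texts: translate(texts, "fr"))
])

print(list(workflow(["<long article text goes here>"])))
```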

| Notebook | Description |
|----------|-------------|
| Run pipeline workflows ▶️ | Simple yet powerful constructs to efficiently process data |
| Building abstractive text summaries | Run abstractive text summarization |
| Transcribe audio to text | Convert audio files to text |
| Translate text between languages | Streamline machine translation and language detection |