
Tracing txtai

txtai Tracing via autolog

txtai is an all-in-one embeddings database for semantic search, LLM orchestration and language model workflows.

MLflow Tracing provides automatic tracing capability for txtai. Auto tracing for txtai can be enabled by calling the mlflow.txtai.autolog function; MLflow will then capture traces for LLM invocations, embeddings, and vector searches and log them to the active MLflow Experiment.

To get started, install the MLflow txtai extension:

pip install mlflow-txtai

Then, enable autologging in your Python code:

import mlflow

mlflow.txtai.autolog()

Examples

The simplest way to demonstrate the tracing integration is to instrument a Textractor pipeline.

import mlflow
from txtai.pipeline import Textractor

# Enable MLflow auto-tracing for txtai
mlflow.txtai.autolog()

# Optional: Set a tracking URI and an experiment
mlflow.set_tracking_uri("http://localhost:5000")
mlflow.set_experiment("txtai")

# Define and run a simple Textractor pipeline.
textractor = Textractor()
textractor("https://github.com/neuml/txtai")

txtai Textractor Tracing via autolog
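
The same autolog call also captures embeddings and vector search operations. Below is a minimal sketch, assuming a recent txtai version where Embeddings is importable from the top-level package and defaults to a small sentence-transformers model; the sample documents and query are illustrative only.

import mlflow
from txtai import Embeddings

# Enable MLflow auto-tracing for txtai
mlflow.txtai.autolog()

# Build a small in-memory index (embedding calls are traced)
embeddings = Embeddings(content=True)
embeddings.index(
    [
        "MLflow Tracing records txtai operations",
        "txtai is an all-in-one embeddings database",
    ]
)

# Run a vector search (the search call is traced)
embeddings.search("What does MLflow Tracing record?", 1)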

More Information

For more examples and guidance on using txtai with MLflow, please refer to the MLflow txtai extension documentation.