# Qdrant Vector Store Example with Ollama Embeddings

This example demonstrates how to use the Qdrant vector store with Ollama for semantic search. The program initializes the store, adds a sample dataset of cities, performs several types of queries, and then cleans up the created resources.
## Features Demonstrated
- Initializing the Qdrant store with an Ollama embedder.
- Creating a collection automatically.
- Adding documents and generating embeddings with `nomic-embed-text`.
- Performing similarity searches.
- Applying metadata filters to narrow down search results.
- Using a score threshold to filter by relevance.
- Deleting documents from the collection.
## Prerequisites
Before running the example, ensure you have the following installed and running:
- Go: Version 1.21 or later.
- Ollama: The Ollama server must be running.
- Qdrant: The easiest way to run Qdrant is with Docker:

  ```bash
  docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
  ```
## Setup
1. **Pull the Embedding Model:**
   The example uses the `nomic-embed-text` model. Pull it from Ollama:

   ```bash
   ollama pull nomic-embed-text
   ```
2. **Install Go Dependencies:**
   Navigate to the root of the goframe project and run:

   ```bash
   go mod tidy
   ```
## How to Run
Execute the `main.go` file from the root of the project:

```bash
go run ./examples/ollama-qdrant-vectorstore-example/main.go
```
## Expected Output
You will see a series of logs indicating the progress of the demo:
- Initialization of Ollama and Qdrant components.
- Creation of a new collection (e.g., `cities_demo_...`).
- Successful addition of 7 documents.
- A series of search scenarios, each printing its own header and results to the console (e.g., `=== Basic Capital Search ===`).
- Logs showing that the metadata filter for "Europe" worked correctly.
- A demonstration of document deletion.
- Finally, a cleanup message indicating the test collection has been deleted.
- The program will exit with the message: `Vector store demo completed successfully.`