LlamaFile Completion Example
Welcome to this cheerful example of using LlamaFile for text completion with LangChain Go!
What This Example Does
This fun little program demonstrates how to use the LlamaFile model to generate text completions. Here's what it does:
- Sets up a LlamaFile model with custom options:
  - Uses an embedding size of 2048
  - Sets the temperature to 0.8 for more creative outputs
- Prepares a simple question: "Brazil is a country? answer yes or no"
- Sends the question to the LlamaFile model for completion
- Streams the generated response, printing it to the console as it's received
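The snippet below is a minimal sketch of what the steps above could look like in code. It assumes the `llamafile.WithEmbeddingSize` and `llamafile.WithTemperature` options exist in your version of the `llms/llamafile` package, and it uses the general-purpose `llms.GenerateFromSinglePrompt` and `llms.WithStreamingFunc` helpers; see the example's source file for the exact code.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/llamafile"
)

func main() {
	// Configure the LlamaFile client. The option names here mirror the
	// description above; double-check them against the llamafile package
	// in your langchaingo version.
	llm, err := llamafile.New(
		llamafile.WithEmbeddingSize(2048),
		llamafile.WithTemperature(0.8),
	)
	if err != nil {
		log.Fatal(err)
	}

	ctx := context.Background()

	// Send the prompt and stream completion chunks to stdout as they arrive.
	_, err = llms.GenerateFromSinglePrompt(ctx, llm,
		"Brazil is a country? answer yes or no",
		llms.WithStreamingFunc(func(_ context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}),
	)
	if err != nil {
		log.Fatal(err)
	}
}
```

The streaming callback is what makes the response appear incrementally: each chunk is printed as soon as the model produces it, rather than waiting for the full completion.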
How to Run
- Make sure you have Go installed on your system
- Have a llamafile model running locally; the client talks to the llamafile's built-in HTTP server (typically at http://127.0.0.1:8080)
- Clone the repository and navigate to this example's directory
- Run the example with `go run llamafile_completion_example.go`
What to Expect
When you run this example, you'll see the LlamaFile model's response to the question about Brazil streamed to your console. The answer should be a simple "yes" or "no", but remember that with the temperature set to 0.8, there might be some variation or additional context in the response!
Have fun exploring language models with this example!