# LangChain
This example is a basic "hello world" of using LangChain with Ollama.
## Running the Example
1. Ensure you have the `llama3.2` model installed:

   ```shell
   ollama pull llama3.2
   ```

2. Install the Python requirements:

   ```shell
   pip install -r requirements.txt
   ```

3. Run the example:

   ```shell
   python main.py
   ```
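For reference, the script itself only needs a few lines of LangChain code. The snippet below is a minimal sketch of what a script like `main.py` could look like, assuming the `langchain_community` package and a locally running Ollama server; the exact import path and prompt used in the real example may differ.

```python
# Minimal sketch (not necessarily the exact contents of main.py) of a
# "hello world" LangChain + Ollama script, assuming langchain_community.
from langchain_community.llms import Ollama

# Connect to the local Ollama server (default: http://localhost:11434)
# and use the model pulled in step 1.
llm = Ollama(model="llama3.2")

# `invoke` is the standard LangChain Runnable entry point for a single call.
response = llm.invoke("Why is the sky blue?")
print(response)
```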