rdescartes ·19 days ago
https://github.com/ggerganov/llama.cpp/blob/master/grammars/...
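(For context: that directory holds GBNF grammar files that llama.cpp can use to constrain sampling, so the model can only emit strings the grammar accepts. Below is a minimal sketch of the idea from Python via the llama-cpp-python bindings; the tiny CSV grammar and the model path are my own illustration, not taken from the linked directory.)

    # A minimal sketch, assuming the llama-cpp-python package and a local
    # GGUF model; "model.gguf" and this CSV grammar are illustrative.
    from llama_cpp import Llama, LlamaGrammar

    # GBNF: one or more lines of comma-separated fields, nothing else.
    CSV_GBNF = r"""
    root  ::= line+
    line  ::= field ("," field)* "\n"
    field ::= [^,\n]*
    """

    llm = Llama(model_path="model.gguf")  # hypothetical path
    grammar = LlamaGrammar.from_string(CSV_GBNF)

    out = llm(
        "List three fruits and their colors as CSV:\n",
        grammar=grammar,  # sampling is restricted to strings the grammar accepts
        max_tokens=128,
    )
    print(out["choices"][0]["text"])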
chirau ·19 days ago
I was actually scratching my head over how to structure a regular prompt to produce CSV data without extra nonsense like "Here is your data" and "Please note blah blah" at the beginning and end, so this is very welcome: I can define exactly what I want returned and just push the structured output to CSV.
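(For anyone wanting to do exactly this, here is a minimal sketch using Ollama's Python client, assuming the ollama and pydantic packages and a local "llama3.2" model; the model name, schema, and prompt are my choices, not from the thread. You pass a JSON schema as the format argument, validate the reply, and write it out with the standard csv module.)

    # A minimal sketch, assuming the ollama and pydantic packages and a
    # local "llama3.2" model (swap in whatever model you run).
    import csv
    from pydantic import BaseModel
    from ollama import chat

    class City(BaseModel):
        name: str
        country: str
        population: int

    class CityList(BaseModel):
        cities: list[City]

    resp = chat(
        model="llama3.2",
        messages=[{"role": "user", "content": "List three large cities."}],
        format=CityList.model_json_schema(),  # constrain the reply to this schema
    )

    # The reply body is schema-valid JSON, so parsing is straightforward.
    rows = CityList.model_validate_json(resp.message.content)

    with open("cities.csv", "w", newline="") as f:
        w = csv.DictWriter(f, fieldnames=["name", "country", "population"])
        w.writeheader()
        w.writerows(c.model_dump() for c in rows.cities)

(Because decoding is constrained to the schema, there is no preamble or trailing note to strip before the CSV step.)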
quaintdev ·19 days ago
No way. This is amazing and one of the things I actually wanted. I love ollama because it makes using an LLM feel like using any other UNIX program. It makes LLMs feel like they belong on UNIX.
guerrilla ·19 days ago
Question though. Has anyone had luck running it on AMD GPUs? I've heard it's harder, but I really want to support the competition when I get cards next year.
bluechair ·19 days ago
In some instances, I'd rather parse Markdown or plain text if it means the quality of the output is higher.