simonw 9 days ago

Prompting LLMs to turn search queries like "red loveseat" into structured search filters like {"item_type": "loveseat", "color": "red"} is a neat trick.
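
A rough sketch of the idea using the llm Python library (the prompt wording and model ID here are assumptions for illustration, not Doug's actual prompt):

    import json
    import llm

    # Sketch: ask the model to rewrite a search query as JSON filters.
    # Assumes llm (https://llm.datasette.io) with the llm-gemini plugin installed
    # and a Gemini API key configured; the model ID is an assumption.
    model = llm.get_model("gemini-1.5-flash-8b-latest")

    query = "red loveseat"
    response = model.prompt(
        "Turn this furniture search query into a JSON object with the keys "
        f'"item_type" and "color". Reply with JSON only, no markdown.\n\n{query}'
    )
    print(json.loads(response.text()))  # e.g. {"item_type": "loveseat", "color": "red"}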

I tried Doug's prompt out on a few other LLMs:

Gemini 1.5 Flash 8B handles it well and costs about 1/1000th of a cent: https://gist.github.com/simonw/cc825bfa7f921ca9ac47d7afb6eab...

Llama 3.2 3B is a very small local model (a 2GB file) which can handle it too: https://gist.github.com/simonw/d18422ca24528cdb9e5bd77692531...

An even smaller model, the 1.1GB deepseek-r1:1.5b, thought about it at length and confidently spat out the wrong answer! https://gist.github.com/simonw/c37eca96dd6721883207c99d25aec...

All three tests were run with https://llm.datasette.io using the llm-gemini or llm-ollama plugins.
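
Roughly, the same prompt can be pointed at all three models from Python (a sketch; the model IDs and prompt wording are assumptions, not the exact gist contents):

    import llm

    # Assumes `llm install llm-gemini` and `llm install llm-ollama` have been run,
    # a Gemini API key is set, and the two Ollama models have been pulled locally;
    # the model IDs are assumptions about how the plugins register them.
    prompt = (
        "Turn this furniture search query into a JSON object with the keys "
        '"item_type" and "color". Reply with JSON only, no markdown.\n\nred loveseat'
    )

    for model_id in (
        "gemini-1.5-flash-8b-latest",  # via llm-gemini
        "llama3.2:3b",                 # via llm-ollama
        "deepseek-r1:1.5b",            # via llm-ollama
    ):
        print(model_id, llm.get_model(model_id).prompt(prompt).text())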

hamelsmu 9 days ago

Doug is the OG of search. His book "Relevant Search" is great. Glad to see that he is teaching again.