Can a 1.4GB LLM Be My Local Search Engine? (spoiler: yes)

One of the main LLM use cases often gets pitched as a search engine replacement, and it got me thinking: can a small, locally hosted LLM replace Google when I'm on the go?

TL;DR - Yes! 

Though LLM stands for Large Language Model, some of them are surprisingly small.

To get started, I installed Ollama, which makes tinkering with a huge number of open-source models incredibly easy. I wanted a model small enough to run smoothly on my laptop while still being actually useful for coding.
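
For reference, the setup is roughly this (the install script is the one from ollama.com; on macOS you can also grab the app from the website or use the Homebrew package instead):

```
# Install Ollama (Linux one-liner; macOS users can use `brew install ollama`
# or download the app from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download the model weights (~1.4GB for the 1.7b variant)
ollama pull qwen3:1.7b

# Sanity check: the model should show up along with its size on disk
ollama list
```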

After some tinkering, I settled on qwen3:1.7b. The 1.7b version is only 1.4GB, so it fits comfortably in my laptop's RAM, and it's good enough for real coding tasks!

Now, I can just spin up a local LLM in my terminal (!!) and ask it random stuff while coding:
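
A typical session looks something like this (the prompts here are just examples):

```
# Start an interactive chat session with the model
ollama run qwen3:1.7b
>>> what's the syntax for a Python list comprehension again?

# Or fire off a one-shot question without entering the REPL
ollama run qwen3:1.7b "how do I undo my last git commit but keep the changes?"
```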

 

For simple search tasks (and even a bit of debugging), it's as good as the full-blown web-based LLMs. It won't tell me today's news, but for quick code snippets, syntax reminders and library definitions, it's perfect.

(Side note - I recommend disabling thinking mode, because this model is an overthinker.)
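
A couple of ways to turn it off, both hedged: the /set nothink command needs a fairly recent Ollama release, and /no_think is Qwen3's own soft switch in the prompt:

```
# Inside an `ollama run qwen3:1.7b` session, this should toggle thinking off
# (added in newer Ollama releases, so check `ollama -v` if it's not recognised)
>>> /set nothink

# Qwen3 also honours a soft switch appended to the prompt itself
>>> explain Python's walrus operator /no_think
```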
