
Tuesday, May 27, 2025

News: I tested local artificial intelligence with gemma2:2b and the Void editor.

I had a freshly installed Windows 10 machine running gemma2:2b as a local artificial intelligence, with only 4 GB of RAM!
Ollama's gemma2:2b can work with minimal RAM!
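Getting the model running is just a couple of Ollama CLI commands; a minimal sketch, assuming Ollama is already installed on the machine:

```shell
# Download the gemma2:2b model (about 1.6 GB, fits in 4 GB of RAM)
ollama pull gemma2:2b

# Start an interactive chat session with the model
ollama run gemma2:2b

# List the models installed locally
ollama list
```

The `ollama run` command also starts the local server in the background if it is not already running.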
The netstat command can show you whether the Ollama server is running, see the image below:
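By default the Ollama server listens on port 11434, so on Windows you can filter the netstat output for that port; a quick check along these lines:

```shell
# Show listening TCP connections and filter for Ollama's default port (11434)
netstat -ano | findstr 11434
```

If the server is up, you should see a line with `127.0.0.1:11434` in the `LISTENING` state, together with the process ID of the Ollama server.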
I have done some tests with source code generation and it is a fairly functional artificial intelligence, although at a minimal level. If you want something functional with a minimum of 4 GB of RAM, then it works very well!
The settings for local Ollama in the Void editor can be configured on the first run:
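The Void editor talks to Ollama over its local HTTP API. Before pointing the editor at it, you can verify that the API endpoint responds; a small check, assuming the default address `http://localhost:11434`:

```shell
# Ask the Ollama API which models are available locally;
# a JSON listing that includes gemma2:2b means the editor can use it
curl http://localhost:11434/api/tags
```

If this returns a JSON document listing `gemma2:2b`, the same address can be entered as the Ollama endpoint in the editor's provider settings.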