Basic Desktop AI integration with Ollama

Francesco Manghi
2 min read · Apr 30, 2024


For the software, more info, and contributions, visit the GitHub page:
https://github.com/wonka929/Ollama_AI_Desktop_Assistant

Small but capable LLMs are emerging in the public domain.
I was pleasantly surprised by the release of Phi3 from Microsoft, a small LLM that can run on smartphones and has proven to be really good for its size.

You can find more info about Phi3 online.

I thought it was time to start integrating LLMs a little more into my daily routine. I considered what I usually use LLMs for and decided to create a completely LOCAL integration of Phi3 for my desktop.

Features:

  • completely LOCAL (privacy and autonomy oriented)
  • built with standard Python libraries
  • fast to use while doing your daily work
  • scalable (you can use any of the LLM models supported by Ollama)
  • adaptable (all the “functions” are nothing more than specific prompts)
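The last point can be made concrete: a “function” can be nothing more than a named prompt template that gets filled with the user's text. Here is a minimal sketch of the idea; the template names and wording are my own illustration, not the ones used in the repository.

```python
# A minimal sketch of "functions as prompts": each feature of the
# assistant is just a named prompt template filled with the user's text.
# The template names and wording here are illustrative, not from the project.
PROMPT_TEMPLATES = {
    "summarize": "Summarize the following text in three sentences:\n\n{text}",
    "translate": "Translate the following text into English:\n\n{text}",
    "explain": "Explain the following code step by step:\n\n{text}",
}

def build_prompt(task: str, text: str) -> str:
    """Fill the template for the chosen task with the user's input."""
    return PROMPT_TEMPLATES[task].format(text=text)
```

Adding a new “function” is then just a matter of adding a new entry to the dictionary, with no other code changes.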

What to do

Download Ollama

First of all, you need to download and install Ollama: https://ollama.com/download

Follow instructions on the website to complete the installation.
Once installed, download an LLM model. I chose Phi3 and downloaded it by running the following command in a terminal:

ollama run phi3

Check that the model is working properly and continue.
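One quick way to check the model from Python is Ollama's local HTTP API, which by default listens on port 11434. The following is a sketch using only the standard library; the endpoint and request fields follow Ollama's /api/generate API, and it assumes the Ollama server is running.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes a standard installation)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "phi3") -> dict:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str, model: str = "phi3") -> str:
    """Send the prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server and model up, `print(ask_ollama("Say hello in one short sentence."))` should return a short greeting.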

Install Python3

Now you just have to download and install Python 3, if you haven't already.

During the installation, make sure to include pip and tcl/tk.
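You can verify that the installation includes everything needed with a few terminal commands; the tkinter import simply confirms that tcl/tk support is present.

```shell
# Check that Python 3 and pip are available on the PATH
python3 --version
python3 -m pip --version || echo "pip not found"
# Check that tcl/tk support (tkinter) was included in the installation
python3 -c "import tkinter" 2>/dev/null && echo "tcl/tk OK" || echo "tcl/tk missing"
```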

Run the script

Now you should be able to run the script by assigning it a keyboard shortcut in your OS.
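How the shortcut is created depends on your desktop environment; on GNOME, for example, Settings → Keyboard lets you bind a key combination to an arbitrary command. The path and script name below are placeholders for wherever you cloned the repository, not the project's actual entry point.

```shell
# Command to bind to the shortcut (path and file name are placeholders)
python3 /path/to/Ollama_AI_Desktop_Assistant/assistant.py
```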

Notes

Every OS has its own specifics to keep in mind for installation and configuration; more details can be found on the GitHub page.
In any case, these are simple tasks that anyone willing to integrate an LLM into their system should be able to perform.


Written by Francesco Manghi

Energy and Mechatronics Engineer. I have learnt Machine Learning and Data Science for work and passion. I love handcrafting and hiking in my freetime.
