When on your wifi, try navigating in your browser to your Windows computer's local address with a colon and the port 11434 at the end. It would look something like this:
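http://192.168.1.50:11434

(That IP is just a placeholder for your Windows machine's actual LAN address, which ipconfig will show you. If nothing loads, note that Ollama listens on localhost only by default; setting the OLLAMA_HOST environment variable to 0.0.0.0 on the Windows machine exposes it to the rest of your network.)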
If it works, your browser will just load the text "Ollama is running".
From there you just need to figure out how you want to interact with it. I personally pair it with OpenWebUI for the web interface.
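If you'd rather script against it than use a web UI, Ollama also exposes a plain HTTP API on that same port. A minimal sketch in Python (the IP and model name are placeholders; use your machine's address and any model you've actually pulled):

    import requests  # pip install requests

    # Placeholder LAN address of the Windows machine running Ollama
    OLLAMA_URL = "http://192.168.1.50:11434"

    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": "llama3.1:8b",  # placeholder: any model you've pulled
            "prompt": "Why is the sky blue?",
            "stream": False,  # one JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])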
I initially installed Ollama/OpenWebUI on my HP G4 Mini, but it obviously has no GPU, so even with 16GB of RAM I could run 7B models at only 2 or 3 tokens/sec.
It definitely made me regret not buying a bigger case that could accommodate a GPU. Instead I ended up installing the same Ollama/OpenWebUI pair on my Windows desktop with a 3060 12GB, and it runs great: 14B models at 15+ tokens/sec.
Even better, I figured out that the reverse proxy on my server can proxy to other addresses on my network, so now I have a dedicated subdomain URL for my desktop instance. Its OpenWebUI is now just as accessible remotely as my server's.
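For what it's worth, the proxy rule itself is short if your reverse proxy is plain nginx; something like the sketch below (subdomain, IP, and port are all placeholders, and OpenWebUI's port depends on how you mapped it). Caddy, Traefik, or Nginx Proxy Manager can do the same thing through their own configs.

    server {
        listen 443 ssl;
        server_name chat.example.com;  # placeholder subdomain

        # usual ssl_certificate / ssl_certificate_key lines go here

        location / {
            # Placeholder LAN address of the desktop running OpenWebUI
            proxy_pass http://192.168.1.50:3000;

            # OpenWebUI uses WebSockets, so forward the upgrade headers
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
        }
    }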