ChatGopherPT: talking to LLMs over Gopher
___ _ _ ___ _ ___ _____ _
/ __| | |_ __ _ | |_ / __| ___ _ __ | |_ ___ _ _ | _ \ |_ _| (_)
| (__ | ' \ / _` | | _| | (_ | / _ \ | '_ \ | ' \ / -_) | '_| | _/ | | _
\___| |_||_| \__,_| \__| \___| \___/ | .__/ |_||_| \___| |_| |_| |_| (_)
|_|
_ _ _ _
| |_ __ _ | | | |__ (_) _ _ __ _
| _| / _` | | | | / / | | | ' \ / _` |
\__| \__,_| |_| |_\_\ |_| |_||_| \__, |
|___/
_
| |_ ___
| _| / _ \
\__| \___/
_ _ __ __
| | | | | \/ | ___
| |__ | |__ | |\/| | (_-<
|____| |____| |_| |_| /__/
___ __ __ ___ _ _
/ _ \ \ V / / -_) | '_|
\___/ \_/ \___| |_|
___ _
/ __| ___ _ __ | |_ ___ _ _
| (_ | / _ \ | '_ \ | ' \ / -_) | '_|
\___| \___/ | .__/ |_||_| \___| |_|
|_|
╔─*──*──*──*──*──*──*──*──*──*──*──*──*──*──*──*─╗
║1 ........................................ 1║
║2* ........................................ *2║
║3 ........................................ 3║
║1 ...........Posted: 2026-01-20........... 1║
║2* ....Tags: gopher my_warez my_servs ..... *2║
║3 ........................................ 3║
║1 ........................................ 1║
╚────────────────────────────────────────────────╝
I'm the bastard who brought Chinese AI to the pre-web Internet.
I wanted a way to talk to a language model that felt closer to a daemon than a
platform.
So I put `ollama` behind Gopher, using DeepSeek's models.
Available here: gopher://gopher.someodd.zip/1/gateway/ollama/list
ChatGopherPT runs on my own server. It exposes local models as a plain text
Gopher menu, accepts a prompt, and streams the reply line by line. Port 70 in,
text out. That is the whole interface.
There are no accounts to create and nothing to install beyond a Gopher client.
There is no analytics pipeline, no consent banner, no invisible third party
deciding how fast you get an answer. When it responds, it is because my machine
did the work.
Gopher keeps things honest. The protocol is small enough to understand fully.
You can watch the traffic, reason about the behavior, and write a client in an
afternoon. The result feels calm and legible.
I like that this works from a terminal. I like that it works from Lynx. I like
that it works from something you hacked together after dinner. It fits naturally
alongside the other quiet services that make up a personal server.
This is not a startup and not a demo. It is a thing I wanted, so I built it and
left it running.
If you miss small servers, plain text, and systems that stay out of your way,
you might enjoy talking to a language model over Gopher.
It's open source: https://github.com/someodd/small-gopher-applets
The source is a single README: a Literate Haskell file written in Markdown. You
literally mark the file as executable and run it.