The way of Geminispace consumption
I read [Free as in Freedom] many years ago, when I first became interested in GNU and Richard Stallman. The book contains a lot of information, but one thing stuck in my head, and it must still be relevant because it's included in [How I do my computing] on RMS's personal site: he browses the WWW by downloading web pages through e-mail.
That seemed strange to me. I understood the stated purpose of working that way, but it looked so uncomfortable.
I generally do not connect to web sites from my own machine, aside from a few sites I have some special relationship with. I usually fetch web pages from other sites by sending mail to a program (see https://git.savannah.gnu.org/git/womb/hacks.git) that fetches them, much like wget, and then mails them back to me.
It's about security and avoiding tracking, which fits the way of life RMS has chosen. That much is obvious. But after several months of browsing Geminispace (and the Gophersphere) I realised there is something more to it: the way content is consumed, which can be different on the WWW than in Geminispace or the Gophersphere.
The complex structure of information on an average webpage makes us work in a different way. We look for information, sometimes semi-automatically (for example via RSS/Atom), but in most cases manually. Our daily routine is to check some list of webpages (a news portal, a weather podcast, a social-media service, etc.). We would like to automate this. But that would only be possible on a WWW with an uncomplicated information structure, and with publishers who don't insist on making us visit them.
But Geminispace is different. We have a simple information structure, and most users are familiar with another style of working. There is no need to hand all our applications over to remote servers, and there is no problem synchronising work across devices (for example over SSH). Our own server can be the central point for the pages we want to follow.
The obvious thing is to save time by:
- Automatically browsing and gathering content in one place, rather than browsing manually
- Automatically checking for new content
- Highlighting changes in new content
- Delivering new content in a preferred way
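The steps above can be sketched as a small shell script. A raw Gemini request is just the URL plus CRLF over TLS, so `openssl s_client` is enough to fetch a page; the cache directory, URL list file, mail recipient, and use of mail(1) here are my own assumptions for illustration, not details of the author's gmidiff.sh.

```shell
#!/bin/sh
# Sketch: fetch each followed capsule page, diff it against a cached
# copy, and mail any changes. All paths and the recipient are hypothetical.
CACHE="${HOME}/.cache/gmiwatch"
URLS="${HOME}/.config/gmiwatch/urls"   # one gemini:// URL per line
MAILTO="user@localhost"

mkdir -p "$CACHE"

fetch() {  # fetch one Gemini URL to stdout, stripping the status header
    host=$(printf '%s' "$1" | sed 's|gemini://\([^/]*\).*|\1|')
    printf '%s\r\n' "$1" \
        | openssl s_client -quiet -connect "${host}:1965" 2>/dev/null \
        | sed '1d'
}

if [ -f "$URLS" ]; then
    while read -r url; do
        key=$(printf '%s' "$url" | tr -c 'A-Za-z0-9' '_')
        new="${CACHE}/${key}.new"; old="${CACHE}/${key}"
        fetch "$url" > "$new"
        if [ -f "$old" ] && ! diff -u "$old" "$new" > /dev/null; then
            diff -u "$old" "$new" | mail -s "Changed: ${url}" "$MAILTO"
        fi
        mv "$new" "$old"
    done < "$URLS"
fi
```

Run from cron, this gathers everything in one place, checks for new content, highlights the changes, and delivers them to a mailbox.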
I made [Gemini diff script - gmidiff.sh] for my own purposes, and after a while I realised that it fulfils the requirements above. At first I thought only about tracing changes for [backlinks]. Later I figured out that it's a convenient way of tracking changes in capsules without Atom feeds. For some pages it's even better than [Comitium], because I can see what the point of the new content is (my script analyses only headings and links, not paragraph text). And if it seems interesting, I can read the rest (in the current version of my script the rest of the content has to be fetched manually).
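A minimal version of that filtering step, keeping only gemtext heading lines (`#`) and link lines (`=>`) so that a diff shows structure rather than prose, might look like this; the function name is my own, not taken from gmidiff.sh:

```shell
# Keep only a gemtext page's structural lines: headings ("#", "##", "###")
# and links ("=>"). Diffing this outline reveals what changed on a page
# without the paragraph text getting in the way.
gmi_outline() {
    grep -E '^(#|=>)' "$1"
}
```

Yesterday's and today's outlines can then be compared with plain diff(1).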
This leads us back to the starting point, [How I do my computing]. [Gemini diff script - gmidiff.sh] provides automatic browsing and sends highlighted changes to my local mailbox. So I'm browsing Gemini capsules by downloading them through e-mail. I was surprised to arrive at this point.
[How I do my computing]
[Free as in Freedom]
[Gemini diff script - gmidiff.sh]
szczezuja.space CC BY-SA
@ Sun 27 Jun 2021 09:11:14 PM CEST