RSS feeds
I believe it’s time to switch back to using RSS feeds instead of email subscriptions, but this option may not suit everyone, as some may find the setup a bit complex and unnecessary.
I enjoy reading interesting thoughts, papers, blogs, and philosophies, so I subscribed to many newsletters. It was okay and manageable until a few months ago, when I started postponing some of them.
At first, I thought I’d catch up later, telling myself, “It’s fine, I’ll read them when I have time.” Well, the truth is, I never went back. My inbox piled up with unread emails, and I eventually did a mass deletion without opening a single one, because all those unread emails had started to give me anxiety and a fear of missing out.
During this time, I noticed a spike in spam and phishing emails. Either a newsletter I subscribed to shared my email, or a platform was hacked.
What I’m getting at is that content distribution and consumption are becoming cumbersome. With so many newsletter platforms making it easy to set up a newsletter, everyone wants our email address. I started to dislike this, so I began looking for other options. I’ve disabled every pop-up on my blog except for the welcome page, since there’s no option to do otherwise.
I know this feeling is common among readers. That’s why many of them use disposable or masked emails, and in some cases, email plus addressing. In response, platforms have started to flag and ban these types of addresses, pressuring users to give up their real emails. In today’s world, our email address is like a private key: anyone who has it can easily use it for scams. I think this needs to change.
Most newsletter platforms price their plans by subscriber count. Here’s the pain point: if a newsletter has 10,000 subscribers, 2,000 of them won’t even open the email. In these cases, who ends up paying for these unwanted expenses?
I get it. It’s a business, and it’s fine if it’s a paid subscription. But an inbox flooded with content I want to read but can’t get to leaves me feeling overwhelmed.
There are only two things to consider from a creator’s point of view:
Make sure the article is distributed and it is reachable.
Ensure it reaches mainly the target audience and is consumed.
Aren't you tired of the pile of unread email newsletters?
I think it’s time to go back to using RSS feeds, though this doesn’t apply to everyone: to my knowledge, the creator may not get view counts and statistics if the reader uses an in-app reader. This approach is the best fit if you’re currently overwhelmed by the content, the ads, and the state of your inbox.
RSS might not be a familiar term for everyone, and that also needs to change. RSS stands for Really Simple Syndication. All Substack publications have a valid RSS feed; you just need to add "/feed" to their publication URL. You can use a service like feedsearch.dev to find a publication’s feed. When you visit any RSS feed in a browser, you can see data like the title, author, description, publication date, and links for each item in its raw form. If it's well-designed, you can even view the post itself. That’s the role of RSS aggregators—they gather this information for easy access.
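To make the “raw form” concrete, here is a minimal sketch of pulling the title, link, and publication date out of each item in a simple RSS 2.0 document. A real aggregator uses a proper XML parser (this regex approach won’t handle CDATA sections or Atom feeds); the function name is mine, purely for illustration.

```javascript
// Minimal, illustrative RSS 2.0 item extraction. Not robust XML parsing:
// it only handles plain, well-formed <item> blocks.
function parseRssItems(xml) {
  const items = [];
  const itemRe = /<item>([\s\S]*?)<\/item>/g;
  // Grab the text content of a single tag inside one <item> block.
  const field = (block, tag) => {
    const m = block.match(new RegExp(`<${tag}>([\\s\\S]*?)<\\/${tag}>`));
    return m ? m[1].trim() : null;
  };
  let m;
  while ((m = itemRe.exec(xml)) !== null) {
    items.push({
      title: field(m[1], "title"),
      link: field(m[1], "link"),
      pubDate: field(m[1], "pubDate"),
    });
  }
  return items;
}
```

Feeding it a feed’s XML yields an array of `{title, link, pubDate}` objects, which is essentially what an aggregator displays.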
Well, now let’s come to the point. You have choices: pick an RSS aggregator like omnivore.app, build an alert system such as a personal Telegram bot that sends RSS feed updates to a Telegram channel, or use a service like https://kill-the-newsletter.com when a newsletter demands an email address to access the content.
I created a Telegram bot running on Google Apps Script that checks for new posts every morning and evening; the script runs on a time-driven trigger. I’ve saved the RSS feed URLs I want to follow in a Google Sheet. Use the service mentioned above to find the feeds you’d like to add to this sheet, and run them through a feed validation service to make sure they work properly.
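The delivery side of such a bot is a single HTTP call to the standard Telegram Bot API `sendMessage` method. Here is a sketch of building that request URL; the token and channel values are placeholders, and in Apps Script you would pass the resulting URL to `UrlFetchApp.fetch`.

```javascript
// Build a Telegram Bot API sendMessage URL. botToken and chatId are
// placeholders you get from @BotFather and your channel setup.
function buildSendMessageUrl(botToken, chatId, text) {
  const params = new URLSearchParams({ chat_id: chatId, text: text });
  return `https://api.telegram.org/bot${botToken}/sendMessage?${params}`;
  // In Google Apps Script: UrlFetchApp.fetch(url);
}
```

For a channel, `chatId` can be the channel’s `@username` (the bot must be an admin of the channel to post).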
This script runs on Google Apps Script, which has certain limitations; please refer to the Google Apps Script quotas for more information. Alternatively, you can port the code to another language (for example, Python) and run it on your own server.
On each run, the script goes through these feeds and records the last sent post and its metadata on another sheet, to cross-check against in the next run. If a new post is found in any feed, it is sent to the Telegram channel configured in the script. It all works privately for me now, but you can make your channel public and run it yourself as a well-curated feed. I prefer self-hosted setups these days because I want control over what I can manage. If you’re interested in setting it up yourself, I’ve shared the script as open source on my GitHub.
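The cross-check step above can be sketched as a small pure function: given a feed’s items (newest first, as most RSS feeds are ordered) and the link of the last post already sent, return only the items that appeared since then. The names are illustrative, not taken from my actual script.

```javascript
// Return the feed items newer than the last one we already sent.
// items: array of {link, ...}, newest first. lastSentLink: string or null.
function newItemsSince(items, lastSentLink) {
  if (!lastSentLink) return items; // first run: everything is new
  const idx = items.findIndex((it) => it.link === lastSentLink);
  // If the last sent post is no longer in the feed, treat all items as new.
  return idx === -1 ? items : items.slice(0, idx);
}
```

After sending, the bot would write the newest item’s link back to the tracking sheet so the next run starts from there.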
Requirements
Basic knowledge of copying and pasting code.
A Google account to run the script on Google Apps Script.
A Telegram account, a channel, and a bot.
A Google Sheets document to track Feed URLs and Sent Posts.
I have also created a README file to make the process easy if you're new to this kind of thing and want to test it out.
Why this system works for me
The whole point of this setup is that these newsletters no longer clutter my inbox or eat into storage space, which is costly these days and only getting more expensive. Since I’m the admin of the channel, I can decide whether to read posts now or delete them. I can also open the links, scan through them, and send anything that seems worth a read to a read-it-later app of my choice. I’ve set up bots for two platforms so far and shared both scripts as open source, so you can create one and run it for yourself.
Currently, I’ve set one up for Pocket, which also offers a read-aloud feature.
And one for Raindrop.io. Tip: with Raindrop’s Pro plan, you get a permanent library: even if a page you’ve saved is taken down, you’ll still have a copy of it in Raindrop, with no limit on count or storage.
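For the Raindrop.io side, saving a link amounts to one authenticated POST. This sketch only builds the request descriptor; the endpoint and payload follow Raindrop’s public REST API as I understand it (`POST /rest/v1/raindrop` with a Bearer token), so verify against their current documentation before relying on it. The token is a placeholder.

```javascript
// Build a request descriptor for saving one link to Raindrop.io.
// apiToken is a placeholder for your Raindrop test/access token.
// Endpoint and body shape are assumptions based on Raindrop's REST API docs.
function buildRaindropRequest(apiToken, link) {
  return {
    url: "https://api.raindrop.io/rest/v1/raindrop",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiToken}`,
    },
    body: JSON.stringify({ link: link }),
  };
}
```

In Apps Script the descriptor maps almost directly onto the options object of `UrlFetchApp.fetch`; in Node it maps onto `fetch(url, options)`.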
Footnote
All this workflow and system building is part of my 10,000 hours, an idea shared by Andrej Karpathy. He highlights the ‘snowball’ effect, where small projects can unexpectedly grow into significant endeavors. Even seemingly insignificant projects contribute to the 10,000 hours of practice believed to be essential for achieving expertise. I’m planning to write about it more in my next blog.
Note: All the scripts I mentioned in this blog were created using LLMs, including Claude, Mistral, Meta AI, ChatGPT, and Gemini. Keep in mind that I don’t have formal coding knowledge, so the code structure, syntax, and best practices may not be optimal or conventional. I’ve also created a public Bluesky feed to share my experience with using LLMs.