I got shadowbanned twice before I figured it out.
The first time, I thought it was bad luck. The second time, I knew it was me. I was building a small tool to share updates about my indie project across a few subreddits, and I was posting too fast, too often, too mechanically. Reddit's spam filters are ruthless, and they don't care that your intentions were good.
The fix wasn't clever. It was embarrassingly simple: I needed to slow down and act like a human.
## What Was Going Wrong
My original script looped through a list of subreddits and fired off posts one after another. No delays. No randomness. Just a tight loop hitting the API as fast as it could. To Reddit's systems, that looks exactly like a bot — because it is.
The problems were:
- No delay between posts — humans don't post five times in four seconds
- No randomness — identical timing patterns are a red flag
- No queue — if something failed, I had no way to retry gracefully
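The original script looked roughly like this (a reconstructed sketch, not the actual code; `post_to_subreddit` is a placeholder for whatever API call you use):

```python
posted = []

def post_to_subreddit(name, title, body):
    # Placeholder for the real Reddit API call; records the post instead.
    posted.append((name, title))

subreddits = ["python", "learnprogramming", "indiedev"]
for sub in subreddits:
    # No sleep, no jitter, no retry: every iteration fires immediately,
    # which is exactly the timing signature spam filters look for.
    post_to_subreddit(sub, "My project update", "details here")
```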
## The Fix: A Simple Rate-Limited Queue
Here's the core of what I built. It's not fancy, but it works:
```python
import time
import random
from collections import deque


class RateLimitedQueue:
    def __init__(self, min_delay=25, max_delay=45):
        self.queue = deque()
        self.min_delay = min_delay
        self.max_delay = max_delay

    def add(self, task):
        self.queue.append(task)

    def run(self):
        while self.queue:
            task = self.queue.popleft()
            try:
                task()
            except Exception as e:
                # Requeue on failure so a transient error doesn't drop a post.
                print(f"Task failed: {e} — requeueing")
                self.queue.append(task)
            # Randomized delay: humans don't act at fixed intervals.
            delay = random.uniform(self.min_delay, self.max_delay)
            print(f"Waiting {delay:.1f}s before next action...")
            time.sleep(delay)
```
You wrap each Reddit action in a callable and drop it in the queue. The loop pulls tasks one at a time, runs them, then waits a random delay between 25 and 45 seconds before the next one. If a task fails, it goes to the back of the queue for another try (note there's no retry cap here, so a task that always fails will cycle forever; add a max-attempts counter if that matters for your use case).
That random delay is the key part. Humans don't post at perfectly regular intervals. Mimicking that unpredictability is what kept me off the radar.
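Using the queue looks something like this. The delays are shrunk so the example runs quickly (keep the 25–45 second defaults in practice), and `post` is a stand-in for a real API call; the class is restated so the snippet runs on its own:

```python
import functools
import random
import time
from collections import deque

# Minimal restatement of the queue above so this snippet is self-contained.
class RateLimitedQueue:
    def __init__(self, min_delay=25, max_delay=45):
        self.queue = deque()
        self.min_delay = min_delay
        self.max_delay = max_delay

    def add(self, task):
        self.queue.append(task)

    def run(self):
        while self.queue:
            task = self.queue.popleft()
            try:
                task()
            except Exception as e:
                self.queue.append(task)
            time.sleep(random.uniform(self.min_delay, self.max_delay))

results = []

def post(subreddit, title):
    # Stand-in for the real Reddit posting call.
    results.append((subreddit, title))

q = RateLimitedQueue(min_delay=0.01, max_delay=0.02)  # tiny delays for the demo
q.add(functools.partial(post, "indiedev", "Show-off Saturday: my tool"))
q.add(functools.partial(post, "SideProject", "I built a scheduling queue"))
q.run()
```

`functools.partial` is a convenient way to bundle the arguments into a zero-argument callable; a lambda works just as well.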
## A Few Other Things That Helped
Respecting subreddit rules actually matters. I added a check that reads each subreddit's sidebar before posting. If it says "no self-promotion," I skip it. This isn't just good ethics — subreddits with active mods will report you manually if you ignore their rules, and that overrides any technical cleverness.
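A crude version of that check is just a substring scan of the sidebar text. The phrases below are my guesses at common wordings, not an exhaustive list; if you use PRAW, the sidebar text comes from `reddit.subreddit(name).description`:

```python
# Phrases that usually signal a promotion ban. A starting point only --
# mods phrase their rules in many different ways.
BANNED_PHRASES = (
    "no self-promotion",
    "no self promotion",
    "no advertising",
    "no promotional content",
)

def allows_promotion(sidebar_text):
    """Return False if the sidebar appears to forbid self-promotion."""
    text = sidebar_text.lower()
    return not any(phrase in text for phrase in BANNED_PHRASES)
```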
Posting from an aged account helps. New accounts are watched more closely. If you're building something that involves Reddit posting, use an account that's been active and legitimate for at least a few months.
Don't post identical text everywhere. I started generating slight variations for each subreddit — different opening line, different framing. Same core message, but not copy-pasted. Reddit's duplicate detection is solid.
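One low-effort way to vary the text is a handful of opening templates, with the choice keyed to the subreddit name so each community gets a stable, distinct opener. This is a sketch with placeholder templates, not the exact scheme I use:

```python
import hashlib

OPENINGS = (
    "I've been building a small tool and wanted to share it here.",
    "After a few months of evenings and weekends, I finally shipped this.",
    "Sharing a side project I built to scratch my own itch.",
)

def build_post(subreddit, core_message):
    # Hash the subreddit name so each community deterministically gets
    # its own opening line instead of identical copy-paste text.
    digest = hashlib.sha256(subreddit.encode()).digest()
    opening = OPENINGS[digest[0] % len(OPENINGS)]
    return f"{opening}\n\n{core_message}"
```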
Track what you've already posted. I keep a simple SQLite log. Before posting anywhere, I check if I've hit that subreddit in the last 72 hours. This alone probably cut my shadowban risk in half.
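The log can be as small as one table and two functions. A minimal sketch of the 72-hour cooldown check, using an in-memory database so it runs standalone (point it at a file in practice; the table and function names are illustrative):

```python
import sqlite3
import time

# In-memory DB for the demo; use a file path like "posts.db" in practice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS posts (subreddit TEXT, posted_at REAL)")

COOLDOWN_SECONDS = 72 * 3600  # 72 hours

def recently_posted(subreddit, now=None):
    """True if we've hit this subreddit within the cooldown window."""
    now = time.time() if now is None else now
    cutoff = now - COOLDOWN_SECONDS
    row = conn.execute(
        "SELECT 1 FROM posts WHERE subreddit = ? AND posted_at > ? LIMIT 1",
        (subreddit, cutoff),
    ).fetchone()
    return row is not None

def record_post(subreddit, now=None):
    conn.execute(
        "INSERT INTO posts VALUES (?, ?)",
        (subreddit, time.time() if now is None else now),
    )
```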
## The Mindset Shift
The real lesson wasn't technical. It was that automation doesn't get to skip the social contract. Reddit communities are built on trust and shared norms. When I started treating those communities like real places instead of distribution channels, my posts got more engagement anyway — because I was posting things that actually fit.
The rate limiter didn't just keep me safe. It forced me to slow down and think about whether each post was actually worth making.
That's a useful constraint for any indie dev doing their own distribution.
More tools at forge.closerhub.app — practical playbooks for indie devs.