52-Week Indie Challenge: One Revenue Experiment Per Week

I woke up on January 1st, 2026, with a hangover and an idea. What if I spent the entire year running one small revenue experiment per week? Not building full products. Not launching startups. Just testing one monetization hypothesis every seven days, tracking what worked, and being brutally honest about the results.

It's now March, and I'm twelve weeks in. Some experiments worked. Most didn't. A few surprised me completely. And I've learned more about making money as an independent developer in twelve weeks of structured experimentation than I did in the previous two years of winging it.

Here's the framework, the results so far, and why I think every indie dev should try something like this.


The Rules

Before I started, I set some ground rules. Without constraints, "experiments" become "unfinished side projects," and I have enough of those cluttering up my GitHub already.

Rule 1: Each experiment must be completable in under 10 hours. If it takes more than 10 hours, it's not an experiment — it's a project. The whole point is rapid testing, not deep building.

Rule 2: Each experiment must have a measurable revenue outcome. "I learned something" doesn't count. The question is always: did this generate revenue, or does it have a clear path to generating revenue within 30 days?

Rule 3: Document everything. Every experiment gets a write-up: hypothesis, what I built, what happened, what I learned. No revisionist history. No pretending the failures were "actually successes in disguise."

Rule 4: No repeating the same experiment type in consecutive weeks. This forces variety. If I try a digital product in week 5, week 6 has to be something different — maybe a service, an affiliate play, or a content experiment.

Rule 5: Budget cap of $50 per experiment. This keeps things lean and forces creativity. You'd be amazed what you can test for under $50 when you're not allowed to throw money at problems.

Here's how I track them:

var Experiment = function(weekNumber, hypothesis, type) {
    this.week = weekNumber;
    this.hypothesis = hypothesis;
    this.type = type;
    this.hoursSpent = 0;
    this.cost = 0;
    this.revenue = 0;
    this.status = 'planned';   // 'planned' | 'revenue_generated' | 'no_revenue'
    this.notes = '';
};

// Record the outcome once the week is over. Returns the experiment
// so calls can be chained.
Experiment.prototype.complete = function(hours, cost, revenue, notes) {
    this.hoursSpent = hours;
    this.cost = cost;
    this.revenue = revenue;
    this.status = revenue > 0 ? 'revenue_generated' : 'no_revenue';
    this.notes = notes;
    return this;
};

// Return on investment as a percentage string. A zero-cost experiment
// has no meaningful ROI percentage, so those return Infinity or 0 instead.
Experiment.prototype.roi = function() {
    if (this.cost === 0) return this.revenue > 0 ? Infinity : 0;
    return ((this.revenue - this.cost) / this.cost * 100).toFixed(1) + '%';
};

Nothing fancy. Just enough structure to keep myself honest.
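To show the tracker in action, here's how a completed week gets logged — using the numbers from the week-5 SEO audit experiment described below. The definitions are repeated from above so the snippet runs standalone:

```javascript
// Tracker definition repeated from above so this snippet runs standalone
var Experiment = function(weekNumber, hypothesis, type) {
    this.week = weekNumber;
    this.hypothesis = hypothesis;
    this.type = type;
    this.hoursSpent = 0;
    this.cost = 0;
    this.revenue = 0;
    this.status = 'planned';
    this.notes = '';
};

Experiment.prototype.complete = function(hours, cost, revenue, notes) {
    this.hoursSpent = hours;
    this.cost = cost;
    this.revenue = revenue;
    this.status = revenue > 0 ? 'revenue_generated' : 'no_revenue';
    this.notes = notes;
    return this;
};

Experiment.prototype.roi = function() {
    if (this.cost === 0) return this.revenue > 0 ? Infinity : 0;
    return ((this.revenue - this.cost) / this.cost * 100).toFixed(1) + '%';
};

// Logging week 5 (the SEO audit reports) with the numbers from that write-up
var week5 = new Experiment(5,
    'Small business owners will pay $29 for an automated SEO audit',
    'product');
week5.complete(6, 12, 203, '7 sales via freelance platforms');

console.log(week5.status); // 'revenue_generated'
console.log(week5.roi());  // '1591.7%' -- ($203 - $12) / $12 * 100
```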


Weeks 1-4: The Naive Phase

Week 1: Sell a Code Template Pack

Hypothesis: Developers will pay $19 for a well-documented Express.js starter template with auth, database setup, and deployment configs.

I spent 8 hours packaging up patterns I use in every project — middleware setup, database connection pooling, error handling, logging, the stuff that takes two hours to set up on every new project. I put it on Gumroad.

Result: 3 sales. $57 revenue. $0 cost.

Not life-changing, but three people paid money for something I built in a day. That's validation. I later realized the template was too generic — experienced devs already have their own starter kits, and beginners don't know enough to appreciate what makes mine good. The sweet spot is mid-level developers who know enough to be productive but haven't built their own tooling yet.

Week 2: Affiliate Review Article

Hypothesis: A detailed, honest review of a developer tool I actually use will generate affiliate commissions through organic search traffic.

I wrote a 3,000-word review of a monitoring tool I use in production. Not a puff piece — I included what I didn't like about it alongside what I liked. Published it on Grizzly Peak Software with my affiliate link.

Result: $0 revenue in week one. But here's the thing — this article is still generating traffic three months later. It's earned about $180 in affiliate commissions total. Some experiments have delayed payoffs, and that's why Rule 2 includes the "clear path to revenue within 30 days" clause.

Week 3: One-Hour Consulting Slots

Hypothesis: Senior developers will pay $150/hour for focused architecture review sessions.

I posted availability on LinkedIn and in a couple of developer communities. No fancy sales page. Just "I have 30 years of experience, I'll review your architecture for an hour, $150."

Result: 2 bookings. $300 revenue. $0 cost.

Both sessions went well. Both clients said they'd come back. I learned something important: selling time is the fastest path to revenue, but it's also the least scalable. Still, as an experiment, it proved that my experience has direct monetary value — which sounds obvious but is something a lot of experienced developers doubt about themselves.

Week 4: Premium Newsletter Issue

Hypothesis: My email subscribers will pay $5 for a special deep-dive newsletter issue on a topic they've been asking about.

I surveyed my list, identified the most requested topic, wrote a comprehensive 5,000-word guide, and offered it as a one-time purchase.

Result: 34 sales. $170 revenue. $0 cost.

This was the first experiment that really surprised me. 34 out of roughly 800 subscribers paid for a single email. That's a 4.25% conversion rate on a cold offer with no sales page, no testimonials, no urgency tactics. It confirmed something I'd suspected: a small, engaged list is worth more than a large, indifferent one.


Weeks 5-8: Getting Smarter

Week 5: Automated SEO Audit Reports

Hypothesis: Small business owners will pay $29 for an automated SEO audit of their website, delivered as a PDF report.

I built a simple Node.js script that crawls a site, checks for common SEO issues, and generates a report. The technical work was about 6 hours. I listed it on a couple of freelance platforms.

// Run every check against the URL in parallel, then assemble the
// report once the last one finishes (Node-style error-first callbacks).
var generateAuditReport = function(url, callback) {
    var checks = [
        { name: 'Title tags', fn: require('./checks/titleTags') },
        { name: 'Meta descriptions', fn: require('./checks/metaDescriptions') },
        { name: 'Heading structure', fn: require('./checks/headings') },
        { name: 'Image alt text', fn: require('./checks/imageAlts') },
        { name: 'Page speed', fn: require('./checks/pageSpeed') },
        { name: 'Mobile friendliness', fn: require('./checks/mobile') },
        { name: 'Internal linking', fn: require('./checks/internalLinks') }
    ];

    var results = [];
    var completed = 0;

    checks.forEach(function(check) {
        check.fn(url, function(err, result) {
            // A failed check is recorded, not fatal -- the report still ships
            results.push({ check: check.name, result: result, error: err });
            completed++;
            if (completed === checks.length) {
                callback(null, {
                    url: url,
                    date: new Date().toISOString(),
                    results: results,
                    score: calculateScore(results)
                });
            }
        });
    });
};

Result: 7 sales. $203 revenue. $12 cost (API fees for page speed testing).

This one had legs. The key insight was that small business owners don't want to learn SEO — they want someone (or something) to tell them what's wrong and how to fix it. The automation means I can deliver reports with minimal ongoing effort.
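The calculateScore helper referenced in the report builder isn't shown above. A minimal version — my naming, and assuming each check result carries a passed flag, which is an assumption rather than what I actually shipped — could be as simple as a pass rate:

```javascript
// Hypothetical scoring helper: percentage of checks that passed cleanly.
// Assumes each entry looks like the ones generateAuditReport pushes:
// { check: 'Title tags', result: { passed: true }, error: null }
var calculateScore = function(results) {
    if (results.length === 0) return 0;
    var passed = results.filter(function(r) {
        return !r.error && r.result && r.result.passed;
    }).length;
    return Math.round(passed / results.length * 100);
};

console.log(calculateScore([
    { check: 'Title tags', result: { passed: true }, error: null },
    { check: 'Meta descriptions', result: { passed: false }, error: null },
    { check: 'Page speed', result: { passed: true }, error: null }
])); // 67
```

A real version would probably weight the checks — a broken page-speed score hurts more than a missing alt tag — but a flat pass rate is enough for a $29 report.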

Week 6: Sponsored Content Post

Hypothesis: A developer tool company will pay me to write an honest article featuring their product.

I reached out to three companies whose products I actually use, offering to write a technical article that incorporates their tool in a real-world scenario. Not a review — a tutorial that happens to use their product.

Result: 1 company said yes. $500 flat fee for a 2,000-word article that I could also publish on my own site.

This was the highest single-week revenue so far. The key was being selective — I only pitched companies whose products I genuinely use and like. The article reads like my normal content because it essentially is. No forced promotion, no awkward product placement.

Week 7: Open Source Sponsorship

Hypothesis: Companies will sponsor an open source tool I maintain if I add a "Sponsors" section to the README.

I have a small utility library on GitHub with about 400 stars. I added a sponsorship section and reached out to five companies in the relevant ecosystem.

Result: $0 revenue. Zero responses.

Complete failure. I think 400 stars isn't enough for companies to care. The threshold is probably in the thousands. This experiment taught me that open source sponsorship is a long game that doesn't fit the 10-hour-per-week constraint.

Week 8: Technical Writing for Other Publications

Hypothesis: Developer-focused publications will pay for original technical content.

I pitched three online publications with article ideas. Two responded. One offered $200 for a 1,500-word piece. I wrote it in about 4 hours.

Result: $200 revenue. $0 cost. Effective rate of $50/hour.

Not bad for writing, which is something I enjoy doing anyway. The hidden benefit: publication on a site with higher domain authority drives traffic back to my own properties. It's revenue plus marketing.


Weeks 9-12: Pattern Recognition

By this point, patterns were emerging. Here's what I noticed:

Things that work fast: Selling time (consulting), selling to existing audience (newsletter), writing for others (technical content).

Things that work slow but compound: Affiliate content, SEO audit tools, template products.

Things that didn't work at all: Open source sponsorship (at my scale), cold outreach for services, anything requiring paid advertising.

Week 9: Mini-Course (Video)

Hypothesis: A 90-minute video course on a specific technical topic will sell for $49.

I recorded a focused course on setting up production Node.js deployments. Screen recording, my voice, no fancy editing. Published on Gumroad.

Result: 5 sales in the first week. $245 revenue. $0 cost (used free recording software).

The effort-to-revenue ratio here was interesting. It took about 8 hours to prepare and record, but the content continues selling without additional work. By now it's generated about $800 total.

Week 10: "Build in Public" Thread with CTA

Hypothesis: A detailed build-in-public thread on social media will drive email signups that convert to product sales.

I documented a weekend project in real-time on LinkedIn and Twitter, with a link to my email list at the end.

Result: 23 new subscribers. $0 direct revenue, but those subscribers are now in the pipeline for future offers. Call it a deferred win.

Week 11: Bug Bounty Sprint

Hypothesis: Spending a focused day on bug bounty programs can generate quick revenue.

I spent 8 hours on HackerOne looking for vulnerabilities in programs I'm familiar with. Found and reported two valid issues.

Result: $0 immediate revenue. Bug bounties take weeks to validate and pay out. I eventually received $250 for one of them and the other was marked as a duplicate. Not repeatable or predictable enough for my taste.

Week 12: Paid Technical Q&A Session

Hypothesis: Developers will pay $20 each for a live group Q&A session on a specific topic.

I announced a 90-minute live session on "Deploying Node.js Apps Without DevOps Experience" to my email list. $20 per seat, capped at 30 people.

Result: 18 attendees. $360 revenue. $0 cost (used free video conferencing).

The engagement was incredible. People stayed for the full 90 minutes. Several bought my course afterward. And I now have a recording I can sell as an on-demand product. This experiment had the best combination of immediate revenue, relationship building, and content creation of anything I've tried.


The Scoreboard So Far

After 12 weeks:

  • Total experiments: 12
  • Experiments generating revenue: 10
  • Total revenue: $2,235
  • Total cost: $62
  • Hours invested: ~96 (average 8 per experiment)
  • Effective hourly rate: about $23.28 ($2,235 ÷ 96 hours)

That hourly rate looks low, and it is — for now. But several of these experiments created assets that continue generating revenue without additional work. The affiliate article, the course, the template pack, the SEO audit tool. When I factor in trailing revenue, the effective rate is probably closer to $40/hour and climbing.
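To make the trailing-revenue math concrete, here's a small aggregation helper in the same style as the tracker. The trailingRevenue field and the two example experiments are illustrative, not my full twelve-week dataset:

```javascript
// Aggregate tracked experiments into scoreboard numbers.
// trailingRevenue is the extra field I wish I'd tracked from day one.
var scoreboard = function(experiments) {
    var totals = experiments.reduce(function(acc, e) {
        acc.revenue += e.revenue;
        acc.trailing += e.trailingRevenue || 0;
        acc.cost += e.cost;
        acc.hours += e.hoursSpent;
        return acc;
    }, { revenue: 0, trailing: 0, cost: 0, hours: 0 });

    // Net hourly rates, with and without trailing revenue
    totals.hourlyRate = totals.hours > 0
        ? ((totals.revenue - totals.cost) / totals.hours).toFixed(2)
        : '0.00';
    totals.hourlyRateWithTrailing = totals.hours > 0
        ? ((totals.revenue + totals.trailing - totals.cost) / totals.hours).toFixed(2)
        : '0.00';
    return totals;
};

// Two illustrative experiments: one immediate win, one slow burner
console.log(scoreboard([
    { revenue: 203, cost: 12, hoursSpent: 6, trailingRevenue: 0 },
    { revenue: 0, cost: 0, hoursSpent: 8, trailingRevenue: 180 }
]));
// { revenue: 203, trailing: 180, cost: 12, hours: 14,
//   hourlyRate: '13.64', hourlyRateWithTrailing: '26.50' }
```

The point of the second rate: the week-2 affiliate article looks like a dead loss on its own row, but it nearly doubles the blended hourly rate once its trailing commissions are counted.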

More importantly, I now have data. I know what works for my audience, my skills, and my lifestyle. I'm not guessing anymore. The remaining 40 weeks of the year can be spent doubling down on the winning patterns instead of spraying effort in random directions.


What I'd Do Differently

If I were starting this challenge over, I'd change three things:

Start with your existing audience. My most successful experiments leveraged my email list and existing website traffic. The experiments that required finding new customers from scratch performed worst. If you have any kind of audience, start there.

Front-load the service experiments. Consulting and paid sessions generated the most revenue per hour and also generated the most learning about what people actually want. That knowledge makes the product experiments better.

Track trailing revenue from day one. Some experiments look like failures in week one but turn profitable over the next month. I didn't set up proper tracking until week 6, so I'm estimating for the early experiments.


Your Turn

You don't need to commit to 52 weeks. Try four. One experiment per week for a month. Total investment: about 40 hours and $200 or less.

Here's a starter sequence I'd recommend for any developer:

  • Week 1: Sell a one-hour consulting session to someone in your network
  • Week 2: Write and publish an affiliate review of a tool you use daily
  • Week 3: Package something you've already built into a purchasable template
  • Week 4: Run a paid group Q&A session for your network or community

If even one of those works, you've learned something worth more than any business book or online course could teach you. You've learned that people will pay you for your skills and knowledge outside the context of a salary.

That's the real experiment. Not whether any specific tactic works. Whether you can make money independently. And the only way to find out is to try, fail, adjust, and try again. Fifty-two times if necessary.

I'll report back at week 26. If the whole thing crashes and burns, I'll tell you about that too. Honesty about failure is the one thing the indie dev world needs more of, and the one thing it gets least.


Shane Larson is a software engineer and writer living in Caswell Lakes, Alaska. He runs Grizzly Peak Software and AutoDetective.ai, and is currently on week 12 of an experiment that's either brilliant or insane. The data will decide.
