Rishi Kumar

How I Built a 2,053-Page Next.js Site Without Knowing How to Code

Hi all. Quick warning before you read this - I am not a developer. I have never written code professionally in my life. I am posting this on dev.to because I think some of you will find the technical decisions interesting, and some of you (the ones who get asked by friends "can you build me a website?") might want to know what is actually possible now with AI coding tools, without being patronising about it.

I run edifyedu.in, an independent comparison platform for India's online universities. 2,053 pages live. Built solo, no dev team, no agency. Took me 1.9 years.

Here is what the stack looks like and what I learned.

The stack

Frontend - Next.js 14, App Router, Tailwind CSS, deployed on Vercel.

CMS - Google Sheets, connected through Google Apps Script.

Why this combo? Because I needed:

  • Dynamic page generation for 125+ universities, each with semester fees, syllabi, eligibility, specialisations
  • Easy data editing without touching code (because I can't code)
  • SEO-friendly static pages (this is an SEO-driven site)
  • Free or near-free hosting because I had no money

Next.js with ISR plus Google Sheets as the database hit all four. WordPress could not. Webflow could not at this scale. A custom backend would have needed a developer I could not afford.

The Google Sheets CMS pattern

This is the part I think devs will find interesting because it goes against best practices but works really well for content-heavy SEO sites with limited budget.

Architecture:

  • Google Sheet has one row per university with columns for every data point (fee, NIRF rank, UGC approval, specialisations, etc)
  • Google Apps Script exposes the sheet as a JSON API endpoint
  • Next.js fetches at build time using getStaticProps
  • ISR with revalidate set to false initially, switched to on-demand revalidation via API route after I hit Vercel Hobby plan ISR Write limits (more on that disaster below)
  • Page templates are React components that render based on the row data
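The Apps Script side is small. Roughly this, as a sketch (the sheet name "Universities" and the rowsToObjects helper name are simplified for illustration, not necessarily what my actual script uses):

```javascript
// Google Apps Script: expose the sheet as a JSON endpoint.
// Deployed as a web app, this becomes the API the Next.js build fetches.
function doGet(e) {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Universities");
  var values = sheet.getDataRange().getValues(); // 2-D array, first row = headers
  var data = rowsToObjects(values);
  return ContentService.createTextOutput(JSON.stringify(data))
    .setMimeType(ContentService.MimeType.JSON);
}

// Pure helper: turn [[header...], [row...], ...] into an array of objects,
// one per university, keyed by the header row.
function rowsToObjects(values) {
  var headers = values[0];
  return values.slice(1).map(function (row) {
    var obj = {};
    headers.forEach(function (h, i) { obj[h] = row[i]; });
    return obj;
  });
}
```

One row per university in, one JSON object per university out. That is the whole "CMS".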

Updating any of the 2,053 pages now means changing one cell in a Google Sheet and triggering a revalidate.

What this means in practice - I never open my code editor anymore for content updates. I open Google Sheets. The site updates within minutes. For a non-coder running a site solo, this is the only architecture that works.
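The build-time fetch on the Next.js side is roughly this (pages router, since I use getStaticProps; the SHEET_API_URL env var and the slugify helper are simplified names for the sketch, exported from a [slug].js page file):

```javascript
// Sketch of the Next.js side: fetch the Apps Script JSON at build time
// and generate one static page per sheet row.
const SHEET_API_URL = process.env.SHEET_API_URL; // the deployed Apps Script web app URL

// Pure helper: turn a university name into a URL slug.
function slugify(name) {
  return name
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

async function getStaticPaths() {
  const res = await fetch(SHEET_API_URL);
  const universities = await res.json();
  return {
    paths: universities.map((u) => ({ params: { slug: slugify(u.name) } })),
    fallback: "blocking", // new sheet rows become pages on first request
  };
}

async function getStaticProps({ params }) {
  const res = await fetch(SHEET_API_URL);
  const universities = await res.json();
  const university = universities.find((u) => slugify(u.name) === params.slug);
  if (!university) return { notFound: true };
  return { props: { university } }; // no time-based revalidate; on-demand only
}
```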

The hard parts - what I actually struggled with

Claude Code did most of the writing. But it was not a magic wand. Here is what was hard:

  1. Schema markup at scale

Indian education search results live or die by schema: FAQPage, Course, AggregateOffer, AggregateRating. Getting these right across 2,053 pages, in valid JSON-LD, without duplicates, took weeks of iteration.

The worst bug I had - FAQPage schema was firing twice on every page because two different components were emitting it. Google was ignoring both. Found this through Search Console's Rich Results report. Fixed it with Claude Code across 336 affected pages in maybe 40 minutes. That fix alone moved more rankings than any content update.
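The fix boils down to building the FAQPage JSON-LD in exactly one place from the row data, so no two components can each emit their own copy. A sketch of that single source of truth (buildFaqSchema is an illustrative name):

```javascript
// Build FAQPage JSON-LD from row data in ONE function, rendered once
// per page template. If only one thing can emit it, it cannot fire twice.
function buildFaqSchema(faqs) {
  return {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map((f) => ({
      "@type": "Question",
      name: f.question,
      acceptedAnswer: { "@type": "Answer", text: f.answer },
    })),
  };
}

// Rendered once in the page template, e.g.:
// <script type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: JSON.stringify(buildFaqSchema(faqs)) }} />
```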

  2. The ISR write limit disaster

I had ISR set to revalidate every 60 seconds initially. Felt smart at the time. Vercel Hobby plan has limits on ISR Writes I did not know about. Hit the limit on day 8 of my big content push. Site froze for 10 days while I figured out what happened. Lessons:

  • Read your hosting plan limits BEFORE shipping
  • For SEO sites with infrequent content changes, time-based ISR is wasteful. Use revalidate false and trigger on-demand revalidation through a webhook
  • Add a manual revalidate button to your admin panel

I now have a /api/revalidate endpoint with Bearer token auth that Google Apps Script can trigger when I save a sheet. Solid setup. Should have built it from day one.
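The endpoint itself is short. A sketch of the shape (pages router style; REVALIDATE_SECRET and the URL in the Apps Script comment are illustrative names, not my actual values):

```javascript
// Sketch of /pages/api/revalidate.js: on-demand ISR behind Bearer token auth.
function isAuthorized(authHeader, secret) {
  return typeof authHeader === "string" && authHeader === "Bearer " + secret;
}

async function handler(req, res) {
  if (!isAuthorized(req.headers.authorization, process.env.REVALIDATE_SECRET)) {
    return res.status(401).json({ ok: false });
  }
  const path = req.query.path || "/";
  await res.revalidate(path); // Next.js on-demand ISR (pages router)
  return res.json({ ok: true, revalidated: path });
}

// On the Sheets side, a trigger in Apps Script can call it on save:
// UrlFetchApp.fetch("https://example.com/api/revalidate?path=/universities/ignou", {
//   method: "post",
//   headers: { Authorization: "Bearer " + SECRET },
// });
```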

  3. Crawl budget

Pushed a sitemap of 2,878 URLs. Watched only half get indexed. Lost a month figuring out that Google was choosing what to crawl based on internal links, not based on sitemap inclusion. Pages with internal links from already-indexed pages got crawled. Orphan pages did not.

Solution was internal linking automation. Every new university page gets internal links from 5 already-indexed related pages, inserted in the body copy in the first 300 words, with varied anchor text. Wrote a script in Claude Code that suggests where to add these links based on topical overlap. Half my crawl budget problem went away.
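The core of that script is just a topical-overlap score. A stripped-down sketch (real version works over the page corpus; the tokeniser and names here are simplified for illustration):

```javascript
// Crude keyword set: lowercase words of 4+ letters.
function keywords(text) {
  return new Set(text.toLowerCase().match(/[a-z]{4,}/g) || []);
}

// Topical overlap = number of shared keywords between two texts.
function overlapScore(a, b) {
  const ka = keywords(a);
  const kb = keywords(b);
  let shared = 0;
  for (const w of ka) if (kb.has(w)) shared++;
  return shared;
}

// Suggest the top N already-indexed pages most similar to a new page,
// i.e. the best candidates to link FROM.
function suggestLinks(newPageText, indexedPages, n) {
  return indexedPages
    .map((p) => ({ slug: p.slug, score: overlapScore(newPageText, p.text) }))
    .filter((p) => p.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, n);
}
```

Crude, but enough to tell me where a new university page should get its five inbound links.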

Tools I actually used daily

  • Claude Pro (around 20 dollars a month) - for thinking, drafting, debugging, schema validation
  • Claude Code - for actual code edits across the site
  • Vercel - hosting (free tier at first, Pro after I hit limits)
  • Google Sheets - the CMS
  • Google Apps Script - the bridge between Sheets and Next.js
  • Google Search Console - the actual brain of an SEO site, free, often ignored

Total monthly tooling cost - maybe 25 dollars, including the Vercel Pro upgrade after I hit limits. It replaced what would have been a 50,000 INR per month dev contractor.

What I would tell another non-coder thinking about this

Three honest things:

First, the AI tools are good enough now. Two years ago this was not true. Claude Code can write production-quality React, Apps Script, and SQL if you describe what you want clearly. The bottleneck is not the code. It is whether you can articulate the requirement precisely.

Second, the boring engineering parts matter more than the design. I wasted 2 months in the beginning making the site look pretty. None of that moved rankings. The schema, the internal linking, the ISR setup, the crawl optimisation - that is what got me to 122k monthly impressions. Build the plumbing first. Design can wait.

Third, AI does not replace the thinking. Every fee on my site I verified by hand from official UGC notifications and university portals. Every claim I make I can defend. AI can write your code. It cannot carry your credibility. In an information-driven niche like education comparison, that distinction is the entire game.

Honestly I am still learning. The site has problems. Some pages do not rank yet, some data needs updating monthly, some sections need rewriting. But it exists, helps working professionals make better admission decisions, and was built by someone who could not write a for-loop 1.9 years ago.

If you have something you want to build, the tools are ready now. You probably are too.

You can see the site at edifyedu.in if you want to see what 2053 pages of independent university data, built without code, looks like.

Happy to answer questions in the comments. Especially the technical ones - I might not know the answer but I will tell you honestly what I tried.
