DEV Community

Pratik Goswami

Posted on • Originally published at Medium

Your Portfolio Is Invisible. Here's How I Fixed Mine.

Everyone is building. AI tools have made it trivially easy to spin up a beautiful portfolio or product site in an afternoon and deploy it to Vercel or Netlify before dinner. But here is the thing nobody talks about: Vercel gives you a URL. It does not give you an audience.

I learned this firsthand. I have had a personal website at pratikgoswami.dev for years. It looked great - it had my projects, experience, contact details - but the only people who ever visited it were recruiters who already had my resume or connections who clicked through from my LinkedIn. The site was essentially a private document with a public URL.

Then, in late April 2026, I spent two days fixing that. Here is exactly what I did and what happened after.


First, Understand Who You Are Actually Building For

Most developers optimize their site for one audience: humans. But your site has two types of readers: humans and machines. And increasingly, it is the machines that decide whether the humans ever find you.

  • SEO (Search Engine Optimization) is about helping Search Engines (like Google) crawl, understand, and rank your content so people find you through search.
  • AEO (Answer Engine Optimization) is newer, and arguably more important today. It is about helping AI assistants like ChatGPT, Gemini, and Claude form a confident, accurate understanding of who you are, so they can mention you when someone asks "who are good full-stack developers with fintech experience?"

Think of it this way: SEO gets you on the map. AEO makes you a landmark.

With LLMs becoming the next search interface, both matter. I decided to tackle them together.


The Honest Baseline

Before I made any changes, my Google Search Console data told an honest story. The site had a tiny trickle of impressions starting in early April, the result of some rudimentary metadata I had set up previously. But it was flat, sparse, and averaging a position of 36+ in search results: effectively invisible.

I deployed my SEO and AEO changes on April 28th, 2026. By April 30th, impressions spiked sharply. In the 12 days since, the site has accumulated 52 impressions, with 2 actual clicks (a CTR of roughly 3.8%). Small numbers, but the trajectory is the story. The spike is real and it came immediately after the changes.

Impression & Click Performance on Google Search Console

Now let me walk you through exactly what I changed.


Part 1: Traditional SEO - Making It Readable for Google

My site is built with Next.js (App Router), TypeScript, and Sanity CMS. It looked clean to any human visitor. To a crawler, it was a meaningless soup of <div> tags.

1. Semantic HTML Refactoring

The simplest change with the highest impact. I replaced generic container divs with meaningful HTML elements.

Before:

<section id="skills-section">
  <div className="skills-section-title">Skills</div>
  <div>
    {skillList.map(({title, skills}) => (
      <div className="skill-container" key="{title}">
        <div className="skill-title">{title}</div>
        <div className="skill-item-container">
          {skills && skills.map((skill, index) => (
            <div className="skill-item" key="{`${title}-${index}`}">{skill}</div>
          ))}
        </div>
      </div>
    ))}
  </div>
</section>

After:

<section id="skills-section" aria-label="Skills">
  <h2 className="skills-section-title">Technical Skills</h2>
  <div className="skills-grid">
    {skillList.map(({ title, skills }) => (
      <article className="skill-container" key="{title}">
        <h3 className="skill-title">{title}</h3>
        <ul className="skill-item-container">
          {skills && skills.map((skill, index) => (
            <li className="skill-item" key="{`${title}-${index}`}">{skill}</li>
          ))}
        </ul>
      </article>
    ))}
  </div>
</section>

Why does this matter? A crawler reads your HTML like an outline. <h1> through <h3> tags are chapter headings. <section> and <article> are meaningful containers. Without them, Googlebot sees a flat document with no hierarchy and has to guess what is important.

2. Fixing Content Discoverability

This one surprised me. I had project details that rendered conditionally, after a user interaction: a click to expand, a tab switch, a state toggle. It looked fine in the browser, but Googlebot crawls the initial DOM, not the post-interaction state. If your content only renders after a click, it effectively does not exist to Google.

I refactored these components to render all content in the initial HTML, using CSS to control visibility rather than React state.
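In sketch form, the pattern looks like this (the component, prop, and class names here are illustrative, not my actual code): the details stay in the initial HTML, and React state only toggles a CSS class.

```typescript
import { useState } from "react";

// Hypothetical ProjectCard: the details are always rendered, so they are
// present in the HTML that Googlebot crawls; state only toggles a class.
function ProjectCard({ title, details }: { title: string; details: string }) {
  const [open, setOpen] = useState(false);
  return (
    <article className={open ? "project-card is-open" : "project-card"}>
      <h3 onClick={() => setOpen((prev) => !prev)}>{title}</h3>
      {/* Hidden with CSS, e.g.
          .project-card:not(.is-open) .project-details { display: none; }
          Conditional rendering (`open && <p>…</p>`) would instead drop
          this paragraph from the DOM entirely. */}
      <p className="project-details">{details}</p>
    </article>
  );
}
```

The details paragraph never leaves the DOM; only its visibility changes, so a crawler reading the initial HTML sees the full content.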

3. Adding Guides for Crawlers

I added robots.ts and sitemap.ts to my project - two small files that do an important job. robots.ts tells crawlers which pages they are allowed to visit. sitemap.ts hands them a complete map of every page on your site so nothing gets missed. Next.js App Router makes this clean.

// app/robots.ts
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: [], // routes you don't want crawlers to see, e.g. ["/admin"]
    },
    sitemap: "https://www.pratikgoswami.dev/sitemap.xml",
  };
}

// app/sitemap.ts
import type { MetadataRoute } from "next";

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: "https://www.pratikgoswami.dev",
      lastModified: new Date(),
      changeFrequency: "monthly",
      priority: 1,
    },
  ];
}
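My site is a single page, so one entry is enough. For a multi-page site, generating the entries from a route list keeps the sitemap complete as pages are added. A minimal sketch (the route list, change frequencies, and priorities are illustrative; in the real file the return type would come from Next.js's MetadataRoute.Sitemap):

```typescript
// Local stand-in for Next.js's sitemap entry shape.
type SitemapEntry = {
  url: string;
  lastModified: Date;
  changeFrequency: "monthly" | "weekly";
  priority: number;
};

const BASE_URL = "https://www.pratikgoswami.dev";
const ROUTES = ["", "/projects", "/blog"]; // hypothetical route list

function buildSitemap(routes: string[]): SitemapEntry[] {
  return routes.map((path) => ({
    url: `${BASE_URL}${path}`,
    lastModified: new Date(),
    // Homepage changes less often but carries the most weight.
    changeFrequency: path === "" ? "monthly" : "weekly",
    priority: path === "" ? 1 : 0.7,
  }));
}
```

Adding a page then means adding one string to the list, and the sitemap can never drift out of sync with the site.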

The last step is submitting your sitemap in Google Search Console. That single action transforms your site from something Google might eventually find into something Google knows exists today.

4. Performance and Core Web Vitals

Google uses Core Web Vitals, specifically LCP (Largest Contentful Paint), as a ranking signal. LCP measures how long it takes for the largest visible element on your page to load - the lower the number, the faster your site feels to a visitor. 

I had been importing FontAwesome icons as an external dependency, which added unnecessary weight to the bundle. Replacing the animated canvas background with a native Canvas API implementation cut that external dependency entirely and improved load performance.

Core Web Vitals


Part 2: AEO - Making It Readable for AI

Most developers fix their SEO and think the job is done. It is not. Search behavior has fundamentally shifted. People are using ChatGPT, Gemini, and Claude as search engines now, asking them to recommend developers, tools, and products. If an AI model does not know who you are, you are invisible to an entirely new class of search. AEO is how you fix that.

1. JSON-LD Structured Data

Think of JSON-LD as your verified ID card for the internet - it tells search engines which queries you are relevant to, and gives AI models the structured facts they need to represent you accurately.

I added a <script type="application/ld+json"> block to my site's <head> using the Person and SoftwareApplication schemas from Schema.org. This gives AI models explicit, machine-readable facts rather than forcing them to infer who you are from your prose.

// Script comes from next/script
<Script
  id="schema-person"
  type="application/ld+json"
  dangerouslySetInnerHTML={{
    __html: JSON.stringify({
      "@context": "https://schema.org",
      "@type": "Person",
      name: "Pratik Goswami",
      jobTitle: "Frontend Software Engineer",
      url: "https://www.pratikgoswami.dev",
      sameAs: [
        "https://www.linkedin.com/in/prtkgoswami",
        "https://github.com/prtkgoswami",
      ],
      worksFor: [
        { "@type": "Organization", name: "TikTok" },
        { "@type": "Organization", name: "IBM" },
      ],
      description:
        "Frontend Engineer specializing in React, TypeScript, and scalable UI systems.",
    }),
  }}
/>

Without this, an LLM has to guess who you are based on scattered text. With it, you are handing the model a verified fact sheet.

2. The Triangle of Trust - Entity Linking via sameAs

The sameAs property is subtle but powerful. By linking your personal domain to your LinkedIn and GitHub profiles, you are creating a web of verification across multiple authoritative sources.

LLMs build a confidence score around named entities. If "Pratik Goswami" appears on a personal website, a LinkedIn profile, and a GitHub account, and those sources all link to each other, the model develops a high-confidence entity. It knows who you are. Without these links, you are just an unnamed node in a vast graph.

This is the triangle of trust: your domain, LinkedIn, and GitHub. Each point of the triangle reinforces the others.

3. Writing for RAG - Declarative, Specific Descriptions

Retrieval-Augmented Generation (RAG) is how many LLMs fetch context before answering. They pull relevant chunks of text from indexed sources. That means how you write about your projects matters as much as the schema you add.

Vague (bad for RAG):

"Passionate developer who loves building impactful things."

Specific (good for RAG):

"Built Gears Connect, a B2B automotive parts marketplace using Next.js, PostgreSQL, and Stripe, enabling real-time inventory management for 50+ vendors."

The second version is indexable. It is full of specific, searchable facts. An LLM retrieving context for a query like "full-stack developers who have built marketplace apps" can actually use it.
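The same project can then be described in structured data too. A sketch using Schema.org's SoftwareApplication type (the field values here are illustrative, echoing the description above):

```typescript
// Hypothetical JSON-LD for a project page; the facts mirror the
// declarative prose so schema and copy reinforce each other.
const projectSchema = {
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  name: "Gears Connect",
  applicationCategory: "BusinessApplication",
  description:
    "B2B automotive parts marketplace built with Next.js, PostgreSQL, and Stripe, with real-time inventory management for 50+ vendors.",
  author: { "@type": "Person", name: "Pratik Goswami" },
};

// Serialized form that goes inside <script type="application/ld+json">
const projectJsonLd = JSON.stringify(projectSchema);
```

Now a retriever gets the facts twice: once as prose it can quote, once as structure it can trust.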

4. JSON-LD Over llms.txt

You may have seen chatter about llms.txt, a proposed standard for helping LLMs read your site, but it is a shortcut, not a foundation. 

llms.txt is a plain-text file placed at the root of your domain that gives AI crawlers a human-readable summary of your site, similar to how robots.txt works for search crawlers. Low effort to set up, but also low density. It carries no schema, no entity relationships, and zero influence over search rankings.

JSON-LD does all of that and more. It is a W3C standard understood by Google, Bing, LinkedIn, Perplexity, and SearchGPT simultaneously. It defines explicit relationships - that "Pratik Goswami" is a Person, worksFor specific organizations, and is the author of specific projects. It also unlocks Google Rich Results, putting your job title and links directly in the search listing. llms.txt cannot do any of this.


Part 3: Monitoring - Closing the Loop

Shipping SEO and AEO changes without monitoring is like sending a message and never checking if it was delivered. Monitoring closes the feedback loop - it tells you what Google has indexed, which queries are surfacing your name, and which parts of your site are actually holding a visitor's attention. Without it, you are optimizing blind.

Google Search Console was the first thing I set up, before deploying any changes. Verifying your domain and submitting your sitemap establishes a direct line of communication with Google.

The LLM test is simple and satisfying: ask ChatGPT, Gemini, and Claude "Who is Pratik Goswami?" My site now surfaces accurate, confident responses across all three. That is the AEO working.

Test on Gemini & GPT

Google Search Test

Custom Google Analytics tracking using IntersectionObserver lets me track which sections of my site recruiters actually read, not just that they visited. Knowing that the "Job Application Tracker" project card gets more dwell time than others is actionable data, not a vanity metric.

// Runs inside a useEffect. activeSection and enterTime are useRef values;
// sendSectionView and sendTimeSpent forward events to Google Analytics.
const observer = new IntersectionObserver(
  (entries) => {
    entries.forEach((entry) => {
      const sectionId = entry.target.id;

      if (entry.isIntersecting) {
        // User entered a section
        activeSection.current = sectionId;
        enterTime.current = Date.now();
        sendSectionView(sectionId);
      } else if (activeSection.current === sectionId && enterTime.current) {
        // User left the section → calculate time spent
        const duration = Date.now() - enterTime.current;
        sendTimeSpent(sectionId, duration);
        activeSection.current = null;
        enterTime.current = null;
      }
    });
  },
  { threshold: 0.5 } // Trigger when 50% of a section is visible
);

SECTIONS.forEach((id) => {
  const el = document.getElementById(id);
  if (el) observer.observe(el);
});
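sendSectionView and sendTimeSpent are small helpers whose implementation is not shown here. A minimal sketch of the time-spent side, assuming Google Analytics 4 custom events (the event and parameter names are hypothetical):

```typescript
// Hypothetical payload builder for the sendTimeSpent helper above.
// Keeping the shaping pure makes it easy to test outside the browser.
function buildTimeSpentEvent(sectionId: string, durationMs: number) {
  return {
    name: "section_time_spent",
    params: {
      section_id: sectionId,
      duration_seconds: Math.round(durationMs / 1000),
    },
  };
}

// In the browser, dispatch via the gtag function that the GA snippet loads:
//   const e = buildTimeSpentEvent("projects", 12400);
//   window.gtag?.("event", e.name, e.params);
```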

Was It Worth It?

In two days of work, the site went from effectively invisible to ranking on the first page of Google for my name, accumulating 52 impressions in 12 days, and being recognized by ChatGPT, Gemini, and Claude when asked who I am.

The numbers are early, but the direction is clear. The spike in the Google Search Console (GSC) chart is immediate and unambiguous.

Here is what I want you to take away:

  • A beautiful site and a discoverable site are not the same thing. Most portfolios are the former, not the latter.
  • AEO is not the future. It is right now. LLMs are already answering career questions. If you are not structured for them, you are invisible to them.
  • The implementation is smaller than you think. Two files, one schema block, and a few hours of refactoring later, my site went from invisible to indexed. The effort was minimal. The compounding visibility was not.
  • Treat your portfolio like a product. Ship, measure, iterate.

Your Quick-Start Checklist

Pick one item from this list and do it this weekend:

[ ] Replace <div> wrappers with <section>, <article>, and a proper <h1> through <h3> heading hierarchy
[ ] Audit for content hidden behind React state and render it in the initial HTML
[ ] Add robots.ts and sitemap.ts in Next.js App Router
[ ] Submit your sitemap in Google Search Console
[ ] Run PageSpeed Insights and check your LCP score
[ ] Add a Person JSON-LD schema to your <head>
[ ] Add sameAs links to LinkedIn and GitHub in your schema
[ ] Rewrite your project descriptions to be specific, declarative, and technical
[ ] Ask ChatGPT or Gemini who you are and see what they say today


So… What Is Next?

This is just the starting point. The real value of setting up monitoring is that the data now tells me where to focus next - which projects are getting attention, which queries are surfacing my name, and where there is still room to improve. I will keep the structured data updated as my experience grows, revisit the content as AEO standards evolve, and let the analytics guide the next round of changes. The foundation is built. Now it gets to compound.
