The Uncomfortable Truth About SEO Nobody Admits
When I launched [conversor-iae-cnae](https://www.conversoriaecnae.es) in November 2024, I had a list of 47 things I "had to do" to rank on Google.
XML sitemaps ✓ Meta descriptions ✓ Image alt text ✓ Page speed ✓
The typical checklist everyone repeats.
Three months later, the tool reached #2 on Google for its main keyword. 63,600 clicks and 1.83M impressions according to Google Search Console.
But here's what's interesting: 80% of that traffic came from just three technical decisions. The rest of the SEO checklist barely moved the needle.
This is the real breakdown of what worked (with GitHub commits so you can verify it).
Decision #1: 2,247 Detail Pages with ISR (The Game Changer)
The context: Conversor IAE CNAE is a tool that converts between Spanish tax codes. I had two options:
1. A single page with dynamic search (easier to build)
2. 2,247 individual static pages, one per code
I chose option 2. And here's the technical trick that made the difference:
```typescript
// Each detail page uses ISR with hourly revalidation
export const revalidate = 3600;

export async function generateStaticParams() {
  // Generate paths for 2,247 IAE + CNAE codes
  const codes = await getAllCodes();
  return codes.map((code) => ({ slug: code.codigo_stripped }));
}
```
Why it worked:
- Google can crawl and index static HTML (no need to execute JavaScript)
- Each code has its own clean URL (`/iae/5013` instead of `/?search=5013`)
- ISR gives me the best of both worlds: static speed + dynamic freshness
- **1.83M impressions in 3 months** came mainly from these detail pages
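To make the clean-URL point concrete, here's a minimal sketch of how a slug like `5013` can be derived from the official dotted code. The helper names and the stripping rule are my assumptions for illustration, not the repo's actual code:

```typescript
// Hypothetical helper (assumption): derive the slug used in clean URLs
// like /iae/5013 from the official dotted code "501.3".
export function codigoStripped(codigo: string): string {
  // Drop everything that is not a letter or digit.
  return codigo.replace(/[^0-9A-Za-z]/g, "");
}

// Build the detail-page path that generateStaticParams would emit.
export function iaeDetailPath(codigo: string): string {
  return `/iae/${codigoStripped(codigo)}`;
}
```

Each generated path gets its own static HTML at build time, which is exactly what Google's crawler wants.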
Most developers would have built a SPA search. I generated 2,247 static pages. That was the decision.
Decision #2: Structured Data For AI Bots (Not Just Google)
The problem: I had indexed pages, but my indexation rate was 21.5%. Only 2,430 pages out of 10,000+ were in Google's index.
I investigated. The problem wasn't crawling, it was perceived relevance.
The solution: I implemented specific JSON-LD schemas for AI bots:
```json
{
  "@context": "https://schema.org",
  "@type": "DefinedTerm",
  "name": "IAE 501.3",
  "description": "Venta en hipermercados",
  "inDefinedTermSet": "https://www.conversoriaecnae.es/iae",
  "termCode": "501.3"
}
```
Real commits showing the evolution:
- **7d9c4a6** (Feb 10): "fix(seo): remove deprecated schemas, add DefinedTerm"
- **4b18d86** (Feb 10): "fix(seo): remove duplicate FAQPage JSON-LD, add dynamic FAQs"
These aren't generic "WebPage" schemas. They're DefinedTerm, specifically designed for ChatGPT, Perplexity, and Gemini to understand this is an official code with equivalencies.
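With 2,247 codes, these schemas have to be generated per page, not hand-written. A minimal sketch of that generator (the `IaeCode` shape and field names are assumptions about the data model):

```typescript
interface IaeCode {
  codigo: string;      // e.g. "501.3"
  descripcion: string; // e.g. "Venta en hipermercados"
}

// Build the DefinedTerm JSON-LD object for a single IAE code.
export function definedTermSchema(code: IaeCode): Record<string, string> {
  return {
    "@context": "https://schema.org",
    "@type": "DefinedTerm",
    name: `IAE ${code.codigo}`,
    description: code.descripcion,
    inDefinedTermSet: "https://www.conversoriaecnae.es/iae",
    termCode: code.codigo,
  };
}
```

The page then serializes the object with `JSON.stringify` into a `<script type="application/ld+json">` tag, so every detail page ships its own schema in the static HTML.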
Result: The indexation rate went up, but more importantly, I started appearing in AI bot responses when someone asks "What's the CNAE equivalent to IAE 501.3?"
I didn't just optimize for Google. I optimized for the future of search.
Decision #3: PostgreSQL Full-Text Search With Weighting (Not Elasticsearch)
Everyone told me: "For real-time search, you need Elasticsearch or Algolia."
I implemented full-text search with Supabase native PostgreSQL:
```sql
CREATE FUNCTION search_iae_codes(
  search_query TEXT,
  limit_count INT DEFAULT 10
) RETURNS TABLE (...) AS $$
BEGIN
  RETURN QUERY
  SELECT * FROM iae_codes
  WHERE search_vector @@ to_tsquery('spanish', search_query)
  ORDER BY ts_rank_cd(search_vector, to_tsquery('spanish', search_query)) DESC
  LIMIT limit_count;
END;
$$ LANGUAGE plpgsql;
```
Why this decision was critical:
1. Spanish language configuration: `to_tsquery('spanish', ...)` does proper stemming ("empresas" → "empresa")
2. Relevance ranking: `ts_rank_cd` uses cover density, so tight matches rank above loose partial ones
3. Simplified infrastructure: one less database, one less service to maintain
4. <100ms latency: GIN indexes make searches instantaneous
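On the frontend, the function is exposed through Supabase RPC. One detail worth showing: raw user input has to be turned into something `to_tsquery` accepts without syntax errors. The sanitizer below is my own assumption about how that could work, not the tool's actual implementation:

```typescript
// Hypothetical sanitizer (assumption): AND the words of a free-text
// query so to_tsquery('spanish', ...) parses it without syntax errors.
export function toTsQuery(input: string): string {
  return input
    .trim()
    .split(/\s+/)
    .filter(Boolean)
    .join(" & ");
}
```

The client then calls `supabase.rpc("search_iae_codes", { search_query: toTsQuery(q), limit_count: 10 })` and renders the ranked rows as the user types.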
Real-time search doesn't just improve UX. It indirectly improves SEO by reducing bounce rate and increasing time on page.
Google sees people interacting with your site. It ranks you better.
The 7 Things That (Almost) Didn't Matter
And now the part nobody tells you:
1. Perfect meta descriptions → very little impact on rankings (though they improve CTR)
2. Alt text on all images → the tool is almost pure text, so irrelevant
3. Perfect XML sitemap → Google ignored it for 3 weeks, then crawled everything at once
4. Internal linking strategy → with 2,247 pages, internal linking builds itself
5. Perfect Core Web Vitals → helps, but wasn't a differentiator (already in "good")
6. Manual backlinks → I got 0 external backlinks; it was all technical SEO
7. Content freshness obsession → ISR revalidates hourly, but Google doesn't crawl that often
These things help. But if I had to start from scratch with 10 hours, I'd spend them all on the 3 decisions above.
The Real Numbers (No Filters)
Because this is built in public, here are the complete Google Search Console numbers:
- **63,600 total clicks** (last 3 months)
- **1.83M impressions** (organic reach)
- **Position #2** for "conversor iae cnae" (main keyword)
- **21.5% indexation rate** (improvable, but functional)
- **~20K clicks/month** since March 2025
The curve wasn't linear. It was:
- Weeks 1-4: ~500 clicks/month (sandbox phase)
- Weeks 5-8: ~5K clicks/month (Google starts trusting)
- Weeks 9-12: ~20K clicks/month (exponential momentum)
This isn't consumer SaaS. It's an ultra-niche B2B tool for Spanish freelancers and accounting firms. 20K monthly clicks in this niche is massive.
The Framework You Can Replicate
If you're building a similar tool, here's the playbook:
1. Generate static pages massively → use ISR if your content changes occasionally
2. Implement structured data aggressively → prioritize schemas AI bots understand
3. Optimize internal search with full-text search → lower bounce rates indirectly improve SEO
4. Ignore 80% of generic checklists → focus on what moves the needle for YOUR use case
And most importantly: ship fast, measure, iterate.
My commit history shows I touched the schemas 3 times ([092675f](https://github.com), [7d9c4a6](https://github.com), [4b18d86](https://github.com)) before finding the optimal configuration.
It wasn't perfect planning. It was fast iteration with real data.
The Trick Nobody Mentions
One last thing.
The most important factor for ranking fast wasn't any technical tactic.
It was solving a real problem people actively search for.
There are 3.3 million freelancers in Spain. They all need to convert between IAE and CNAE for tax authority and social security procedures. And before my tool, there was no decent online solution.
Existing demand + competent technical solution = fast rankings.
You don't need backlinks if you have this.
---
Building SEO-first tools? The principles are replicable: massive pages with ISR, structured data for AI, and optimized search. The rest is noise.
Keep building.
