Home Features FAQ Get Started

Frequently Asked Questions

Everything you need to know about Fyxit Now SEO Analysis

Scoring & Results

Why can't I achieve 100/100? Is it even possible?

While a perfect 100/100 is theoretically possible, it's extremely rare, and for most real-world websites some deductions are unavoidable. Here's why:

Multiple Weighted Factors: Your health score evaluates 30+ different SEO factors, each with different weights. Even small issues across many pages compound quickly. For example, if 10% of your pages have heading hierarchy issues (-20% weight), that alone deducts 2 points from your score.

Real-World Trade-offs:

  • Comprehensive content (good for SEO) often means longer pages, making image lazy-loading optimization harder
  • Third-party integrations (ads, analytics, chat widgets) introduce JavaScript dependencies you can't fully control
  • User-generated content may not follow perfect SEO practices
  • Some CMS platforms make certain optimizations difficult without major custom development

What's a Good Score?

Instead of chasing an impossible perfect score, aim for these realistic targets:

90-100: Excellent Industry-leading SEO, competitive advantage

70-89: Good Solid foundation, above average

50-69: Fair Functional but needs improvement

Below 50: Poor Critical issues need immediate attention

Focus on Impact, Not Perfection: Fixing every critical issue on a site scoring 85/100 does more for your rankings than grinding from 95 to 100. Prioritize critical issues (missing titles, error pages, HTTPS) over micro-optimizations.

How exactly is my health score calculated?

Your score starts at 100, and points are deducted for each issue in proportion to the percentage of pages affected, multiplied by the issue's severity weight:

🔴 Critical Issues (Highest Impact):

  • Error pages (404, 500): -50% per affected page percentage
  • JavaScript errors: -40% per affected page percentage
  • Missing titles: -30% per affected page percentage
  • Non-HTTPS pages: -30% per affected page percentage
  • Missing H1 tags: -25% per affected page percentage

⚠️ Warnings (Medium Impact):

  • Keyword stuffing: -20% per affected page percentage
  • Missing descriptions: -20% per affected page percentage
  • Heading hierarchy issues: -20% per affected page percentage
  • Missing alt text: -20% per affected page percentage
  • Thin content (<300 words): -15% per affected page percentage
  • Missing/invalid Open Graph: -15%/10% per affected page percentage
  • Canonical tag issues: -15% per affected page percentage
  • Duplicate titles/descriptions: -15%/10% per affected item

💡 Suggestions (Lower Impact):

  • Missing structured data: -10% per affected page percentage
  • Orphan pages: -10% per affected page percentage
  • Redirect chains: -10% per affected page percentage
  • Missing lazy loading: -5% per affected page percentage
  • Old image formats: -5% per affected page percentage
  • Missing compression/caching: -5% each per affected page percentage
  • Deep pages (4+ clicks): -5% per affected page percentage

🌐 Infrastructure:

  • Overly restrictive robots.txt: -5 points
  • Broken sitemap URLs: -5 points
  • Missing sitemap.xml: -3 points
  • Missing robots.txt: -2 points

Example Calculation:

Site with 100 pages:

  • 10 pages missing descriptions = 10% × -20% = -2 points
  • 25 pages with heading issues = 25% × -20% = -5 points
  • 100 pages missing lazy loading = 100% × -5% = -5 points
  • Restrictive robots.txt = -5 points
  • Final Score: 83/100 (Good range)

Important: Scores have a floor of 10/100. Even sites with severe issues keep at least 10 points, to distinguish functioning sites from completely broken ones.
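The deduction model above can be sketched in a few lines of code. This is a simplified illustration of the math in the example, not Fyxit Now's exact production algorithm:

```python
def health_score(issues, infra_penalties=()):
    """Simplified sketch of the scoring model.

    issues: (severity_weight, affected_fraction) pairs, where the weight
            is the percentage listed for that issue type.
    infra_penalties: flat point deductions (robots.txt, sitemap issues).
    """
    score = 100.0
    for weight, affected_fraction in issues:
        score -= weight * affected_fraction
    score -= sum(infra_penalties)
    return max(10, round(score))  # scores are floored at 10/100

# The 100-page example above:
issues = [
    (20, 0.10),  # 10 pages missing descriptions  -> -2 points
    (20, 0.25),  # 25 pages with heading issues   -> -5 points
    (5, 1.00),   # all pages missing lazy loading -> -5 points
]
print(health_score(issues, infra_penalties=(5,)))  # restrictive robots.txt -> prints 83
```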

My score went down after I fixed issues. Why?

This can happen for several legitimate reasons:

  1. More pages discovered: The crawler may have found additional pages this time (through improved JavaScript rendering, sitemap, or deeper crawling). New pages bring new issues that affect your score.
  2. Different pages crawled: Dynamic content, A/B tests, or random crawl paths may result in analyzing a different subset of pages each time.
  3. Unintended side effects: While fixing one issue, you may have inadvertently introduced another. For example:
    • Restructuring headings might have broken the hierarchy
    • Adding new content might have introduced keyword stuffing
    • New images might be missing alt text or lazy loading
  4. Issues on new content: If you published new pages since the last analysis, those pages' issues now factor into your score.

How to Troubleshoot:

  • Check "Page-by-Page Analysis" to see exactly which pages have issues
  • Compare the total number of pages analyzed between runs
  • Look for new issue types that weren't present before
  • After fixing specific pages, use the "Recrawl These Pages" button to verify improvements immediately

What should I fix first?

Always tackle issues in order of severity and impact:

1. Critical Issues (Red 🔴) - Fix Immediately:

  • Error pages (404s, 500s)
  • Missing page titles
  • Enable HTTPS site-wide
  • JavaScript errors breaking functionality
  • Missing H1 tags

2. Warnings (Orange ⚠️) - Fix Next:

  • Missing meta descriptions
  • Heading hierarchy issues
  • Missing alt text on images
  • Thin content or keyword stuffing
  • Canonical tag problems

3. Suggestions (Blue 💡) - Optimize When Ready:

  • Image lazy loading
  • Modern image formats (WebP)
  • Compression and caching
  • Structured data (schema.org)
  • Internal linking improvements

Quick Wins: Some issues affect many pages but have simple, scalable fixes:

  • Missing meta descriptions → Add template-based descriptions in your CMS
  • Missing alt text → Batch update with descriptive text
  • Missing lazy loading → Add loading="lazy" attribute globally
  • No compression → Enable gzip/brotli in server config (5-minute fix)
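The lazy-loading quick win, for instance, is a one-attribute change using standard HTML supported by all modern browsers (filenames and alt text below are placeholders):

```html
<!-- Before: the browser fetches the image immediately, even below the fold -->
<img src="product-photo.jpg" alt="Red canvas sneakers, side view">

<!-- After: the browser defers the fetch until the image nears the viewport -->
<img src="product-photo.jpg" alt="Red canvas sneakers, side view" loading="lazy">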

Crawling & Analysis

Why isn't the crawler finding all my pages?

There are several possible reasons pages might be missed:

1. Robots.txt Blocking:

Check yourdomain.com/robots.txt. If you're blocking crawlers or specific paths, those pages won't be analyzed. Fyxit Now respects robots.txt by default (just like Google does).
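As an illustration, a robots.txt like this blocks every crawler from the entire site, so no pages can be discovered or analyzed:

```txt
# Blocks ALL crawlers from ALL paths - nothing will be analyzed
User-agent: *
Disallow: /
```

A narrower rule such as `Disallow: /private/` blocks only that path while leaving the rest of the site crawlable.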

2. No Internal Links:

Pages must be linked from somewhere on your site. "Orphan pages" (not linked from any other page) won't be discovered unless they're in your sitemap.xml.

3. JavaScript-Only Navigation:

If your site uses client-side routing (React Router, Vue Router, Next.js) without server-side rendering, some links may only appear after JavaScript executes. While we use Puppeteer to render JavaScript, complex SPAs may need additional configuration.

4. Authentication Required:

The crawler can't access pages behind login walls, paywalls, or member-only areas.

5. Depth Limits:

Free trial: 2 levels deep, up to 10 pages
Paid: 3 levels deep, up to 1,000 pages

Pages buried deeper than your depth limit won't be reached.

6. Missing from Sitemap:

If you have important pages not linked internally, they must be in your sitemap.xml to be discovered.

Solution: Create or update sitemap.xml to include all important pages. Fyxit Now automatically checks sitemaps and queues those URLs. For large sites, consider submitting your sitemap to Google Search Console as well.
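A minimal sitemap.xml follows the standard Sitemap protocol. The URLs below are placeholders; list every important page, especially ones with no internal links:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
  </url>
  <url>
    <loc>https://yourdomain.com/orphaned-landing-page</loc>
  </url>
</urlset>
```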

What does "JavaScript-rendered content" mean and why does it matter?

Some modern websites load content dynamically using JavaScript instead of including it in the initial HTML response. This is common with:

  • Single Page Applications (React, Vue, Angular, Svelte)
  • Infinite scroll social feeds
  • Lazy-loaded product catalogs
  • "Load More" buttons that fetch content via AJAX

The SEO Problem: Without JavaScript execution, search engines see blank pages or "Loading..." placeholders instead of your actual content. This severely hurts SEO because your valuable content is invisible to crawlers.

Our Solution: Fyxit Now uses Puppeteer (headless Chrome) to:

  • Execute JavaScript and wait for content to load
  • Scroll pages to trigger lazy-loaded content
  • Extract links that only appear after JavaScript execution
  • Analyze pages on depth 0 (homepage) and depth 1 (direct children)

Your Long-term Solution: For optimal SEO, implement:

  • Server-Side Rendering (SSR) - Frameworks: Next.js, Nuxt.js, SvelteKit
  • Static Site Generation (SSG) - Pre-render at build time
  • Hybrid rendering - SSR for content pages, client-side for interactive features

Detected Issue: If you see warnings like "Page shows 'Loading...' instead of content," this means your site heavily relies on JavaScript. Search engines may not see your content at all. Consider implementing SSR for better SEO.

How often should I run analysis?

Recommended frequency depends on your situation:

After Major Changes: Run immediately after deploying:

  • Site redesigns or rebrands
  • Platform migrations (WordPress → Shopify, etc.)
  • URL structure changes
  • Template or theme updates

Active SEO Work: Every few days if you're actively fixing issues to track progress and verify improvements.

Regular Content Publishing: Weekly for blogs, news sites, or e-commerce sites adding products frequently.

Stable Sites: Monthly for established sites that rarely change.

Note: Free accounts are limited to 1 domain. Paid accounts ($79) get 5 domain slots and can rerun each domain unlimited times without restrictions.

Why does it show issues I already fixed?

If you've fixed issues but they still appear:

  1. Viewing cached results: Click "Run Analysis" again to get fresh data.
  2. Changes not deployed: Verify fixes are live on production (not just in staging or development).
  3. CDN/Cache not purged: If using Cloudflare, CloudFront, or similar, purge the cache after deploying changes.
  4. Fixed some pages, not all: Check the "Affected Pages" list - you may have missed some pages.
  5. Partial fixes: You may have fixed part of the issue but not completely. For example, adding some alt text but not all, or fixing heading hierarchy on some pages.

Pro Tip: Use the "Recrawl Page" button on individual pages in the analysis to immediately verify fixes on that specific page, rather than waiting for a full site re-crawl.

Technical Details

What's the difference between Critical, Warnings, and Suggestions?

🔴 Critical Issues: These severely impact SEO and user experience. They prevent search engines from properly indexing your content or cause functional problems.

Examples: Missing titles, error pages, no HTTPS, JavaScript errors, missing H1 tags

Impact: Can prevent pages from ranking entirely or cause significant ranking penalties.

⚠️ Warnings: These negatively affect SEO and should be addressed, but your site still functions.

Examples: Missing meta descriptions, heading hierarchy issues, missing alt text, thin content, keyword stuffing

Impact: Reduces ranking potential and click-through rates from search results.

💡 Suggestions: Performance and user experience optimizations that provide incremental SEO benefits.

Examples: Image lazy loading, modern formats (WebP), compression, caching, structured data

Impact: Improves page speed, user experience, and enhances rich search results (rich snippets).

Does this analyze mobile vs desktop separately?

Currently, Fyxit Now analyzes the desktop version of your site. However, many checks apply equally to mobile:

  • Page performance (compression, caching, image optimization)
  • Content quality and structure
  • HTML semantics and heading hierarchy
  • Meta tags and Open Graph data
  • Structured data

Mobile-Specific Recommendations:

  • Ensure responsive design (viewport meta tag)
  • Test on real devices or Chrome DevTools mobile emulation
  • Use Google's Mobile-Friendly Test for mobile-specific issues
  • Check Core Web Vitals in Google Search Console
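The viewport meta tag mentioned above is a single line of standard HTML in your page's <head>:

```html
<!-- Without this tag, mobile browsers render the page at desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```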

How does this compare to Google Search Console and other SEO tools?

Google Search Console:

  • ✅ Shows how Google actually sees and indexes your site
  • ✅ Real search performance data (clicks, impressions, rankings)
  • ✅ Authoritative source for Google-specific issues
  • ❌ Only shows Google's perspective
  • ❌ Retroactive (shows issues after they affect rankings)
  • ❌ Limited technical SEO analysis

Fyxit Now:

  • ✅ Proactive analysis before issues affect rankings
  • ✅ Comprehensive technical SEO audit (30+ checks)
  • ✅ Immediate verification of fixes (recrawl specific pages)
  • ✅ Page-by-page breakdown with specific issues
  • ✅ JavaScript rendering to see what crawlers see
  • ❌ Can't show actual Google rankings or search performance

Use Both Together:

  • Use Fyxit Now to identify and fix issues proactively
  • Verify fixes immediately with Fyxit Now's recrawl feature
  • Monitor real-world impact in Google Search Console
  • Cross-reference issues found in both tools

vs. Other SEO Tools:

  • Screaming Frog: Similar crawling, desktop-only, expensive annual licensing
  • Ahrefs/SEMrush: Broader competitive analysis, much higher cost ($100+/month), less detailed technical audits
  • Lighthouse: Single-page analysis only, no site-wide crawling
  • Fyxit Now: Comprehensive site-wide crawling with JavaScript rendering, affordable one-time payment, actionable recommendations

What user agent does the crawler use?

Fyxit Now uses two crawling methods:

1. Standard HTTP Requests:
User-Agent: Mozilla/5.0 (compatible; WebCrawler/1.0; +https://fyxit.now)
Used for fast HTML parsing and initial link discovery.

2. JavaScript Rendering (Puppeteer):
Uses headless Chrome with full JavaScript execution for depth 0 and depth 1 pages. This mimics how Googlebot renders JavaScript.

For Developers: Our crawler respects robots.txt. To allow/block Fyxit Now specifically, use user-agent WebCrawler in your robots.txt file.
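For example, to let Fyxit Now crawl everything except one section while leaving other crawlers unrestricted (the /staging/ path is just an illustration):

```txt
# Rules for Fyxit Now's crawler specifically
User-agent: WebCrawler
Disallow: /staging/

# Rules for all other crawlers (empty Disallow = allow everything)
User-agent: *
Disallow:
```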

Will the crawler slow down my website?

No. Fyxit Now is designed to be a polite, respectful crawler:

  • Rate limiting: 1-2 second delays between requests
  • Concurrent limits: Maximum 3 parallel requests per domain
  • Timeout protection: Requests time out after 15-20 seconds
  • Respectful: Follows robots.txt rules

This is similar to how Googlebot and other major search engines crawl websites. Your server should handle it easily.

Getting Started

What is Fyxit Now?

Fyxit Now is a comprehensive SEO analysis tool that crawls your website like search engines do and identifies issues hurting your rankings. It analyzes 30+ factors including heading hierarchy, Open Graph tags, meta descriptions, internal links, JavaScript errors, page performance, structured data, and much more.

How do I get started?

  1. Create Account: Sign up with email and password on the homepage
  2. Add Domain: Register your domain in the dashboard
  3. Verify Ownership: Choose DNS, HTML, or JavaScript verification
  4. Run Analysis: Click "Run Analysis" to crawl your site
  5. Review Results: Check your score and detailed issue reports
  6. Fix Issues: Follow Priority Actions to improve your score
  7. Recrawl: Verify improvements with targeted recrawls

Do I need to install anything on my website?

No installation required! Fyxit Now crawls your publicly accessible website externally, just like search engines. However, you must verify domain ownership using a DNS TXT record (recommended), HTML meta tag, or JavaScript snippet to unlock full features.

Pricing & Features

What's included in the free trial?

Free accounts include:

  • ✅ 1 domain slot
  • ✅ Up to 10 pages per domain
  • ✅ Crawl depth: 2 levels
  • ✅ Full SEO analysis (all 30+ checks)
  • ✅ Health score and recommendations
  • ✅ Page-by-page breakdown
  • ✅ Unlimited reruns of your domain
  • ⚠️ Limited to 3 total page recrawls

Perfect for small sites, portfolios, landing pages, or testing before upgrading.

How much does Fyxit Now cost?

Fyxit Now costs just $79 one-time for 5 domain slots. No monthly subscription!

Paid accounts get:

  • ✅ 5 domain slots (pay $79 again to add 5 more)
  • ✅ Up to 1,000 pages per domain
  • ✅ Crawl depth: 3 levels
  • ✅ Unlimited reruns for each domain
  • ✅ Unlimited page recrawls
  • ✅ Auto-generated sitemaps (download as sitemap.xml)
  • ✅ All current and future features
  • ✅ Lifetime access

Important: Each subdomain counts as a separate domain slot. For example, blog.example.com, shop.example.com, and example.com would use 3 of your 5 domain slots.

Need more domains? Simply pay $79 again to add 5 more domain slots to your account. No limits on how many times you can upgrade!

What happens if I delete a domain?

You can delete domains from your account at any time, which will remove all associated data (pages, analysis results, etc.).

Important: Deleting a domain permanently consumes that slot. You do NOT get it back, and you do NOT receive a refund.

How it works: When you pay $79 for 5 domain slots, you can register up to 5 domains total (including any you later delete). Once deleted, that domain still counts as one of your 5 slots.

Example: You have 5 slots and register 3 domains → 2 slots remaining. Delete 1 domain → you now have 2 domains registered, but only 2 slots remaining (not 3). The deleted domain permanently consumed 1 slot.

Think carefully before deleting! Each domain you register, whether you keep it or delete it, permanently uses one of your slots.

Can Fyxit Now generate a sitemap for my website?

Yes! Paid users can download auto-generated sitemap.xml files based on their crawl results. This feature:

  • Automatically generates valid XML sitemaps
  • Excludes error pages (404s, 500s)
  • Sets priorities based on page depth (homepage = 1.0, deeper pages = lower priority)
  • Includes all successfully crawled pages
  • Ready to upload to your server and submit to Google Search Console

After running an analysis, paid users will see a "Download Sitemap" button in their results. The generated sitemap follows Google's XML Sitemap Protocol and can save you hours of manual work!
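The depth-based priority scheme can be illustrated roughly like this. The mapping below is an assumption for illustration; the exact values Fyxit Now assigns may differ:

```python
def sitemap_priority(depth):
    """Assumed depth-to-priority mapping: homepage (depth 0) gets 1.0,
    each additional click of depth lowers priority, floored at 0.1.
    Illustration only - not necessarily the exact scheme used.
    """
    return round(max(0.1, 1.0 - 0.2 * depth), 1)

for depth in range(4):
    print(depth, sitemap_priority(depth))  # 1.0, 0.8, 0.6, 0.4
```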

What payment methods do you accept?

We accept all major payment methods through PayPal, including credit cards, debit cards, and PayPal balance. You don't need a PayPal account to complete payment.

Domain Verification

Why do I need to verify my domain?

Verification proves you own or control the website. This prevents unauthorized users from analyzing domains they don't own and ensures only legitimate site owners can run unlimited analyses and recrawls.

What verification methods are available?

1. DNS TXT Record (Recommended):

  • Add TXT record to your domain's DNS settings
  • Name: @ or your domain
  • Value: fyxit-verification=YOUR_TOKEN
  • Best for: Technical users with DNS access
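In BIND zone-file syntax, the record looks like this (YOUR_TOKEN stands for the token shown in your dashboard):

```txt
; TXT record at the zone apex (@) for domain verification
@   IN   TXT   "fyxit-verification=YOUR_TOKEN"
```

You can confirm the record has propagated with `dig TXT yourdomain.com +short` before clicking "Verify Domain."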

2. HTML Meta Tag:

  • Add meta tag to homepage <head>
  • Easiest for most users with CMS access
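The tag likely follows the usual verification pattern shown below; the `fyxit-verification` name is an assumption, so copy the exact snippet from your dashboard:

```html
<!-- Assumed format - use the exact snippet from your dashboard -->
<meta name="fyxit-verification" content="YOUR_TOKEN">
```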

3. JavaScript Snippet:

  • Add JavaScript verification code to homepage
  • Alternative if meta tag method doesn't work

How long does verification take?

Verification is instant once you've added the code. For DNS TXT records, allow 5-10 minutes for DNS propagation before clicking "Verify Domain." HTML and JavaScript verification methods work immediately.

Support

What if I need help?

Contact us at support@fyxit.now for questions, technical issues, or feedback. We typically respond within 24 hours.