From 9d765095b953b9853a53cba8e0227e99f9e0e467 Mon Sep 17 00:00:00 2001
From: guergabo <65991626+guergabo@users.noreply.github.com>
Date: Mon, 11 May 2026 17:21:14 +0000
Subject: [PATCH 1/6] Link to Scraperly's per-site difficulty index
---
browsers/bot-detection/overview.mdx | 4 ++++
1 file changed, 4 insertions(+)
diff --git a/browsers/bot-detection/overview.mdx b/browsers/bot-detection/overview.mdx
index 2fdbeea..21fb1e9 100644
--- a/browsers/bot-detection/overview.mdx
+++ b/browsers/bot-detection/overview.mdx
@@ -64,6 +64,10 @@ Before you start automating your workflow, we recommend that you manually test y
Once you have a stable baseline, replicate those conditions in your automations.
+
+ Wondering how rough a given site tends to be? Scrapers keep an independent, per-site difficulty index at [Scraperly](https://scraperly.com).
+
+
## Recommended Practices
| Category | Recommendation |
From a9b1f33bdb3f1944dc0243a01badfedf753c8035 Mon Sep 17 00:00:00 2001
From: guergabo <65991626+guergabo@users.noreply.github.com>
Date: Mon, 11 May 2026 17:22:47 +0000
Subject: [PATCH 2/6] Drop 'Scrapers' framing
---
browsers/bot-detection/overview.mdx | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/browsers/bot-detection/overview.mdx b/browsers/bot-detection/overview.mdx
index 21fb1e9..3654458 100644
--- a/browsers/bot-detection/overview.mdx
+++ b/browsers/bot-detection/overview.mdx
@@ -65,7 +65,7 @@ Before you start automating your workflow, we recommend that you manually test y
Once you have a stable baseline, replicate those conditions in your automations.
- Wondering how rough a given site tends to be? Scrapers keep an independent, per-site difficulty index at [Scraperly](https://scraperly.com).
+ Wondering how rough a given site tends to be? [Scraperly](https://scraperly.com) maintains an independent, per-site difficulty index.
## Recommended Practices
From 686795554d00ff8a744e90d2716a4524d234b5a3 Mon Sep 17 00:00:00 2001
From: guergabo <65991626+guergabo@users.noreply.github.com>
Date: Mon, 11 May 2026 17:23:11 +0000
Subject: [PATCH 3/6] Reframe Scraperly callout as a guide
---
browsers/bot-detection/overview.mdx | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/browsers/bot-detection/overview.mdx b/browsers/bot-detection/overview.mdx
index 3654458..102eb32 100644
--- a/browsers/bot-detection/overview.mdx
+++ b/browsers/bot-detection/overview.mdx
@@ -65,7 +65,7 @@ Before you start automating your workflow, we recommend that you manually test y
Once you have a stable baseline, replicate those conditions in your automations.
- Wondering how rough a given site tends to be? [Scraperly](https://scraperly.com) maintains an independent, per-site difficulty index.
+ Looking for site-specific guidance? [Scraperly](https://scraperly.com) is an independent guide with per-site difficulty ratings and recommendations.
## Recommended Practices
From b0f11a592d0790438a4a8d91053b8ef5ccad7775 Mon Sep 17 00:00:00 2001
From: guergabo <65991626+guergabo@users.noreply.github.com>
Date: Mon, 11 May 2026 17:23:48 +0000
Subject: [PATCH 4/6] Lean on Scraperly's own framing
---
browsers/bot-detection/overview.mdx | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/browsers/bot-detection/overview.mdx b/browsers/bot-detection/overview.mdx
index 102eb32..ce730c0 100644
--- a/browsers/bot-detection/overview.mdx
+++ b/browsers/bot-detection/overview.mdx
@@ -65,7 +65,7 @@ Before you start automating your workflow, we recommend that you manually test y
Once you have a stable baseline, replicate those conditions in your automations.
- Looking for site-specific guidance? [Scraperly](https://scraperly.com) is an independent guide with per-site difficulty ratings and recommendations.
+ [Scraperly](https://scraperly.com) is an independent guide built by scrapers, for scrapers — per-site difficulty ratings and recommendations grounded in real-world testing.
## Recommended Practices
From d71ba2252c24687f7526ea8b31f2166173d2e9a3 Mon Sep 17 00:00:00 2001
From: guergabo <65991626+guergabo@users.noreply.github.com>
Date: Mon, 11 May 2026 17:24:42 +0000
Subject: [PATCH 5/6] Use Scraperly's own framing verbatim
---
browsers/bot-detection/overview.mdx | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/browsers/bot-detection/overview.mdx b/browsers/bot-detection/overview.mdx
index ce730c0..116c47b 100644
--- a/browsers/bot-detection/overview.mdx
+++ b/browsers/bot-detection/overview.mdx
@@ -65,7 +65,7 @@ Before you start automating your workflow, we recommend that you manually test y
Once you have a stable baseline, replicate those conditions in your automations.
- [Scraperly](https://scraperly.com) is an independent guide built by scrapers, for scrapers — per-site difficulty ratings and recommendations grounded in real-world testing.
+ [Scraperly](https://scraperly.com) — an independent guide built by scrapers, for scrapers. Per-site difficulty ratings for 683+ sites, grounded in real-world testing rather than sponsorships.
## Recommended Practices
From 23689d7b6d49c881da8dae55d4e37fc64a37ff4c Mon Sep 17 00:00:00 2001
From: guergabo <65991626+guergabo@users.noreply.github.com>
Date: Mon, 11 May 2026 17:25:54 +0000
Subject: [PATCH 6/6] Tier unsupported sites by difficulty; drop Scraperly
callout
---
browsers/bot-detection/overview.mdx | 4 ----
browsers/faq.mdx | 25 +++++++++++++++++--------
2 files changed, 17 insertions(+), 12 deletions(-)
diff --git a/browsers/bot-detection/overview.mdx b/browsers/bot-detection/overview.mdx
index 116c47b..2fdbeea 100644
--- a/browsers/bot-detection/overview.mdx
+++ b/browsers/bot-detection/overview.mdx
@@ -64,10 +64,6 @@ Before you start automating your workflow, we recommend that you manually test y
Once you have a stable baseline, replicate those conditions in your automations.
-
- [Scraperly](https://scraperly.com) — an independent guide built by scrapers, for scrapers. Per-site difficulty ratings for 683+ sites, grounded in real-world testing rather than sponsorships.
-
-
## Recommended Practices
| Category | Recommendation |
diff --git a/browsers/faq.mdx b/browsers/faq.mdx
index 47651e5..dcfa4fa 100644
--- a/browsers/faq.mdx
+++ b/browsers/faq.mdx
@@ -21,11 +21,20 @@ If you're experiencing slower-than-expected browser creation times, review your
## Unsupported Websites
-There are some websites that are not supported by Kernel browsers due to their restrictions around automation and associated bot detection. These include:
-
-- LinkedIn
-- Facebook
-- Instagram
-- X (Twitter)
-- Amazon
-- Reddit
+Some websites deploy bot detection aggressive enough that reliable automation isn't feasible today, even with stealth mode, residential proxies, and persistent profiles. The list below is not exhaustive; it reflects what we've seen in practice.
+
+### Very Hard
+
+Effectively unscrapable at the moment. Expect login walls, rapid CAPTCHA challenges, and device-level fingerprinting that doesn't yield to residential proxies or stealth defaults.
+
+- **LinkedIn**
+- **Facebook**
+- **Instagram**
+
+### Hard
+
+Sometimes workable with the right combination of stealth mode, persistent profiles, and residential proxies, but expect frequent friction and rate-limiting.
+
+- **X (Twitter)**
+- **Amazon**
+- **Reddit**
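
The "Hard" tier added in PATCH 6 pairs three mitigations: stealth mode, a persistent profile, and a residential proxy. A minimal sketch of how those might be assembled as launch options for Playwright's `launch_persistent_context` — the helper name, profile path, and proxy URL are hypothetical placeholders, and this is not Kernel's own API:

```python
# Hypothetical helper assembling the three mitigations the FAQ tier mentions:
# a persistent profile, a residential proxy, and a stealth-leaning launch flag.
# The proxy URL and profile directory are placeholders, not real endpoints.

def hard_site_launch_options(profile_dir: str, proxy_url: str) -> dict:
    """Build kwargs for Playwright's chromium.launch_persistent_context()."""
    return {
        "user_data_dir": profile_dir,       # persistent profile: keeps cookies and session state
        "proxy": {"server": proxy_url},     # route traffic through a residential exit
        "headless": False,                  # headful browsing presents a more ordinary fingerprint
        "args": ["--disable-blink-features=AutomationControlled"],  # widely used stealth flag
    }

opts = hard_site_launch_options("./profiles/amazon", "http://user:pass@proxy.example:8080")

# Usage (requires the playwright package; sketch only):
#   from playwright.sync_api import sync_playwright
#   with sync_playwright() as p:
#       ctx = p.chromium.launch_persistent_context(**opts)
```

Keeping the options in one helper makes it easy to reuse the same baseline conditions across runs, which is exactly what the overview's "replicate those conditions in your automations" advice calls for.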