[Reopening #1267] Fix proxy authentication ERR_INVALID_AUTH_CREDENTIALS in crawl4ai 0.6.1-0.6.3 (#1281)
Conversation
- Fix dict-to-`ProxyConfig` conversion in `BrowserConfig` and `CrawlerRunConfig`
- Fix JSON serialization of `ProxyConfig` objects in `to_dict` methods
- Fix context proxy to use `ProxySettings` instead of a plain dict
- Resolves proxy authentication issues
Walkthrough

The changes update the handling of proxy configuration across configuration classes and browser context creation. Configuration classes now consistently convert proxy configuration dictionaries to `ProxyConfig` objects and serialize them back to dictionaries when exporting.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant ConfigClass
    participant ProxyConfig
    User->>ConfigClass: from_kwargs(kwargs)
    alt proxy_config is dict
        ConfigClass->>ProxyConfig: from_dict(proxy_config)
        ProxyConfig-->>ConfigClass: ProxyConfig instance
        ConfigClass-->>User: ConfigClass instance with ProxyConfig
    else proxy_config is not dict
        ConfigClass-->>User: ConfigClass instance with original proxy_config
    end
    User->>ConfigClass: to_dict()
    alt proxy_config has to_dict()
        ConfigClass->>ProxyConfig: to_dict()
        ProxyConfig-->>ConfigClass: dict
        ConfigClass-->>User: dict with serialized proxy_config
    end
```

```mermaid
sequenceDiagram
    participant BrowserManager
    participant PW as Playwright.ProxySettings
    BrowserManager->>BrowserManager: create_browser_context()
    alt proxy_config present
        BrowserManager->>PW: new(server, username, password)
        PW-->>BrowserManager: ProxySettings object
        BrowserManager->>BrowserManager: set context_settings["proxy"] = ProxySettings
    end
    BrowserManager-->>BrowserManager: create context with settings
```
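The configuration flow above can be sketched with a minimal stand-in for `ProxyConfig`. This is a simplified sketch, not the library's actual class; the real implementation lives in `crawl4ai/async_configs.py` and handles more fields.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProxyConfig:
    """Simplified stand-in for crawl4ai's ProxyConfig (sketch only)."""
    server: str
    username: Optional[str] = None
    password: Optional[str] = None

    @staticmethod
    def from_dict(data: dict) -> "ProxyConfig":
        # Accepts the raw dict a user passes as proxy_config
        return ProxyConfig(
            server=data.get("server"),
            username=data.get("username"),
            password=data.get("password"),
        )

    def to_dict(self) -> dict:
        # Back to a JSON-serializable dict for to_dict()/export paths
        return {
            "server": self.server,
            "username": self.username,
            "password": self.password,
        }


raw = {"server": "http://proxy.example.com:8080", "username": "u", "password": "p"}
# The dict branch of the diagram: promote a plain dict to a ProxyConfig
cfg = ProxyConfig.from_dict(raw) if isinstance(raw, dict) else raw
assert cfg.to_dict() == raw  # round-trip preserves the credentials
```

The round-trip property (`from_dict` followed by `to_dict` returns the original dict) is what keeps credentials intact between config construction and JSON export.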
Actionable comments posted: 0
🔭 Outside diff range comments (1)
crawl4ai/async_configs.py (1)
211-319: Consolidate ProxyConfig definitions to eliminate duplication

The `ProxyConfig` class is defined in both `crawl4ai/async_configs.py` and `crawl4ai/proxy_strategy.py`, which risks divergence and makes maintenance harder. Please centralize the implementation in one module (for example, `async_configs.py`) and have all other code import it from there.

- Remove the `class ProxyConfig` block from `crawl4ai/proxy_strategy.py`
- Add `from .async_configs import ProxyConfig` at the top of `crawl4ai/proxy_strategy.py`
- Update `crawl4ai/__init__.py`, `docs/examples/*`, `deploy/docker/c4ai-code-context.md`, etc., to import `ProxyConfig` only from its canonical location
- Run a full pass on tests and examples to confirm nothing breaks after consolidation
🧹 Nitpick comments (1)
crawl4ai/async_configs.py (1)
513-513: Fix inconsistent default value handling.

The static analysis hint is correct: `kwargs.get("proxy_config", None)` should be `kwargs.get("proxy_config")`, since `get()` returns `None` by default when the key doesn't exist.

```diff
- proxy_config=ProxyConfig.from_dict(kwargs.get("proxy_config")) if isinstance(kwargs.get("proxy_config"), dict) else kwargs.get("proxy_config", None),
+ proxy_config=ProxyConfig.from_dict(kwargs.get("proxy_config")) if isinstance(kwargs.get("proxy_config"), dict) else kwargs.get("proxy_config"),
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- `crawl4ai/async_configs.py` (4 hunks)
- `crawl4ai/browser_manager.py` (1 hunk)
🧰 Additional context used
🧬 Code Graph Analysis (1)
crawl4ai/async_configs.py (1)
crawl4ai/proxy_strategy.py (3)
- `ProxyConfig` (10-118)
- `from_dict` (68-75)
- `to_dict` (98-105)
🪛 Ruff (0.11.9)
crawl4ai/async_configs.py
513-513: Use `kwargs.get("proxy_config")` instead of `kwargs.get("proxy_config", None)`

Replace `kwargs.get("proxy_config", None)` with `kwargs.get("proxy_config")`
(SIM910)
🔇 Additional comments (4)
crawl4ai/async_configs.py (3)
549-549: Proper proxy configuration serialization implemented.

The implementation correctly handles both `ProxyConfig` objects and raw dictionaries by checking for the `to_dict` method before calling it.
1124-1124: Good implementation of proxy config conversion.The logic correctly converts dictionary proxy configurations to ProxyConfig objects while preserving existing ProxyConfig instances. This ensures consistent handling across the codebase.
1237-1237: Consistent serialization pattern maintained.The serialization logic matches the pattern used in BrowserConfig.to_dict(), ensuring consistent behavior across configuration classes.
crawl4ai/browser_manager.py (1)
861-866: Excellent improvement using Playwright's ProxySettings.

The change from manually constructing proxy dictionaries to using Playwright's `ProxySettings` object improves type safety and ensures proper validation of proxy credentials. This directly addresses the proxy authentication issues mentioned in the PR objectives.
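In Playwright's Python API, `ProxySettings` is a `TypedDict`, so constructing it amounts to building a dict with the expected keys. Below is a hedged sketch of the kind of helper `create_browser_context()` could use; the helper name is hypothetical, and the local `ProxySettings` class only mirrors the shape of Playwright's type so the snippet runs without Playwright installed.

```python
from typing import Optional, TypedDict


class ProxySettings(TypedDict, total=False):
    # Mirrors the shape of Playwright's ProxySettings TypedDict (sketch)
    server: str
    username: str
    password: str


def build_proxy_settings(proxy_config) -> Optional[ProxySettings]:
    """Turn a ProxyConfig-like object or dict into Playwright-shaped settings."""
    if proxy_config is None:
        return None
    cfg = proxy_config.to_dict() if hasattr(proxy_config, "to_dict") else dict(proxy_config)
    settings: ProxySettings = {"server": cfg["server"]}
    # Only attach credentials when both are present, so unauthenticated
    # proxies don't receive empty username/password fields.
    if cfg.get("username") and cfg.get("password"):
        settings["username"] = cfg["username"]
        settings["password"] = cfg["password"]
    return settings


context_settings = {}
proxy = build_proxy_settings({"server": "http://proxy.example.com:8080",
                              "username": "user", "password": "pass"})
if proxy is not None:
    context_settings["proxy"] = proxy
```

Passing a dict shaped like this to `browser.new_context(proxy=...)` is what lets Playwright validate and apply the credentials, instead of silently mis-parsing an arbitrary dict.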
Merged into develop. Thanks for fixing the proxy auth issue. We resolved a minor merge conflict along the way.

Continuation of PR #1267
This PR implements the proxy authentication fixes originally proposed in #1267. Due to repository state conflicts, the original PR could not be reopened, necessitating this resubmission with identical technical changes.
Issues addressed:
Current Impact: Multiple users continue experiencing these proxy authentication failures in versions 0.6.1-0.6.3, with recent reports confirming the ongoing need for these fixes in production environments.
🐛 Problem
The proxy authentication feature in `crawl4ai` versions 0.6.1-0.6.3 fails with `ERR_INVALID_AUTH_CREDENTIALS` errors for all proxy configurations, even when valid credentials are provided.

🔍 Root Cause

The core issue was inconsistent proxy credential handling across the `crawl4ai` codebase:

- `BrowserConfig.load()` and `CrawlerRunConfig.load()` failed to convert dictionary `proxy_config` inputs into proper `ProxyConfig` objects
- The browser context passed plain dictionaries rather than Playwright `ProxySettings` objects, leading to incorrect credential parsing
- `to_dict()` methods didn't properly handle `ProxyConfig` objects during JSON serialization

🛠️ Solution
The fix involved updates to two key files:
File 1: `crawl4ai/async_configs.py`

- Fixed dict-to-`ProxyConfig` conversion in `from_kwargs` methods
- Fixed JSON serialization in `to_dict` methods

File 2: `crawl4ai/browser_manager.py`

- Fixed the context proxy to use a proper `ProxySettings` object
✅ Testing
Before Fix:
After Fix:
```json
{
  "success": true,
  "detected_ip": "185.240.64.202",
  "proxy_working": true,
  "message": "Successfully crawled via proxy"
}
```

Verified with real proxy credentials: the detected IP successfully changed from the server's IP to the proxy's IP.
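The verification described above amounts to comparing the IP seen through the crawl against the server's own IP. A minimal sketch of that check follows; the result payload matches the "After Fix" output, while the server IP is a hypothetical TEST-NET address, since real verification needs live proxy credentials.

```python
import json

# Simulated crawl result matching the "After Fix" output above
result_json = '{"success": true, "detected_ip": "185.240.64.202", "proxy_working": true}'
server_ip = "203.0.113.10"  # hypothetical origin-server IP (TEST-NET address)

result = json.loads(result_json)
# The proxy is in use when the crawl succeeded and the detected IP
# differs from the machine the crawler is running on
proxy_in_use = result["success"] and result["detected_ip"] != server_ip
assert proxy_in_use
```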
📝 Usage
Proxy authentication now works as documented:
Python:
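The original Python snippet was stripped during extraction. A hedged reconstruction follows: the `proxy_config` dict mirrors the JSON API payload below, while the commented lines show the assumed crawl4ai call shape (treat the exact API usage as an assumption; the fence only builds the config so it runs standalone).

```python
# Assumed crawl4ai usage (commented out so this sketch runs without crawl4ai):
# from crawl4ai import AsyncWebCrawler, CrawlerRunConfig
# config = CrawlerRunConfig(proxy_config=proxy_config)
# async with AsyncWebCrawler() as crawler:
#     result = await crawler.arun("https://example.com", config=config)

proxy_config = {
    "server": "proxy.example.com:8080",
    "username": "your_username",
    "password": "your_password",
}
```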
JSON API:
```json
{
  "urls": ["https://example.com"],
  "crawler_config": {
    "proxy_config": {
      "server": "proxy.example.com:8080",
      "username": "your_username",
      "password": "your_password"
    }
  }
}
```

📋 Checklist
🎯 Impact
Restores working proxy authentication for `crawl4ai` versions 0.6.1-0.6.3.

Note
This is a continuation of the work from PR #1267. The fix addresses the core authentication issues that users are actively experiencing.