Issues · Search Results · repo:apify/crawlee-python language:Python
415 results

Current behaviour
When running the crawlee create --help command, several options that accept text parameters are shown with the
placeholder TEXT only, without listing their valid values:
$ crawlee create ...
enhancement
t-tooling
vdusek
- Opened yesterday
- #1295
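A minimal sketch of how such options could advertise their valid values in --help, assuming the CLI is built on Typer; the command name, option, and template values below are hypothetical, not crawlee's actual flags. An Enum-typed option is rendered as a choice list (e.g. [playwright|beautifulsoup]) instead of the bare TEXT placeholder.

```python
from enum import Enum

import typer

app = typer.Typer()


class Template(str, Enum):
    # Hypothetical template names, for illustration only.
    playwright = 'playwright'
    beautifulsoup = 'beautifulsoup'


@app.command()
def create(
    template: Template = typer.Option(Template.playwright, help='Crawler template to use.'),
) -> None:
    """Create a new project; --help now lists the allowed template values instead of TEXT."""
    typer.echo(f'Using template: {template.value}')


if __name__ == '__main__':
    app()
```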
The current group names are not very helpful:
- Classes
- Abstract classes
- Data
- Errors
- Event
- Functions
It should be more descriptive/self-explanatory:
- Crawlers
- Storages
...
documentation
t-tooling
vdusek
- Opened yesterday
- #1293
Currently, there are 3 different implementations of HttpClient. All of them have their own test file with duplicated
test cases. Refactor the same tests to accept HttpClient (or some factory method) as ...
debt
t-tooling
Pijukatel
- Opened 2 days ago
- #1292
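A sketch of the suggested refactor: a single parametrized pytest fixture that yields each HttpClient implementation, so shared test bodies are written once and run against all of them. The imports and class names are assumptions about crawlee's public API and may need adjusting.

```python
import pytest

# Assumed imports; adjust to the actual HttpClient implementations in crawlee.
from crawlee.http_clients import CurlImpersonateHttpClient, HttpxHttpClient


@pytest.fixture(params=[CurlImpersonateHttpClient, HttpxHttpClient])
def http_client(request):
    """Instantiate each HttpClient implementation so one test body covers all of them."""
    return request.param()


def test_client_can_be_constructed(http_client) -> None:
    # Shared assertions written once; pytest repeats them for every implementation.
    assert http_client is not None
```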
Since merging https://github.com/apify/crawlee-python/pull/1070, the
test_memory_estimation_does_not_overestimate_due_to_shared_memory has been failing intermittently on Linux with
Python 3.11:
- PR ...
t-tooling
vdusek
- Opened 5 days ago
- #1288
- Stagehand - The AI Browser Automation Framework
- https://www.stagehand.dev/
- Python binding - https://github.com/browserbase/stagehand-python
- Explore it and integrate it - maybe in a similar ...
enhancement
t-tooling
vdusek
- Opened 9 days ago
- #1278
I tried adding a session or cookies using user_data_dir, but it's not working at all in Python.
t-tooling
ahmedemadAimTech
- 3
- Opened 14 days ago
- #1271
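For context, a plain-Playwright sketch of the mechanism this report relies on: a persistent context launched with user_data_dir stores cookies and other session state on disk, so they survive across runs. This is not Crawlee-specific code, only an illustration of the expected behaviour; the profile path and URL are placeholders.

```python
from playwright.sync_api import sync_playwright

# Session state is written to the './profile' directory, so a second run of this
# script starts with the cookies saved by the first run.
with sync_playwright() as p:
    context = p.chromium.launch_persistent_context('./profile', headless=True)
    page = context.new_page()
    page.goto('https://example.com/')
    print(context.cookies())
    context.close()
```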
- Follow-up to https://github.com/apify/crawlee-python/pull/1169#discussion_r2134152443
t-tooling
janbuchar
- Opened 15 days ago
- #1269
Does Crawlee support per-domain concurrency?
In this example, the first domain (paklap.pk) can't handle much load, but the second domain (centurycomputerpk.com) can.
Does Crawlee allow setting concurrency ...
t-tooling
Ehsan-U
- 2
- Opened 19 days ago
- #1263
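A generic illustration of the concept being asked about, not a built-in Crawlee API: one asyncio semaphore per domain, so a slow host can be throttled independently of a fast one. The per-domain limits below reuse the domains from the issue and are placeholders.

```python
import asyncio
from urllib.parse import urlparse

# Per-domain concurrency limits; unknown domains fall back to a default.
DOMAIN_LIMITS = {'paklap.pk': 2, 'centurycomputerpk.com': 10}
DEFAULT_LIMIT = 5
_semaphores: dict[str, asyncio.Semaphore] = {}


def semaphore_for(url: str) -> asyncio.Semaphore:
    domain = urlparse(url).netloc
    limit = DOMAIN_LIMITS.get(domain, DEFAULT_LIMIT)
    return _semaphores.setdefault(domain, asyncio.Semaphore(limit))


async def fetch_with_domain_limit(url: str) -> None:
    # Acquire the domain's semaphore before performing the actual request.
    async with semaphore_for(url):
        ...  # perform the HTTP request here
```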
A new feature would allow crawlers to dynamically archive specific crawled pages as WARC files.
WARC files are used for archiving pages and all their resources, and can serve various purposes. For example ...
enhancement
t-tooling
Pijukatel
- 3
- Opened 19 days ago
- #1262
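A minimal sketch of writing a crawled response as a WARC record with the warcio library; the URL, headers, and payload are placeholders, and this is only an illustration of the format, not the proposed Crawlee integration.

```python
from io import BytesIO

from warcio.statusandheaders import StatusAndHeaders
from warcio.warcwriter import WARCWriter

# Append one HTTP response record to a gzipped WARC file.
with open('archive.warc.gz', 'wb') as output:
    writer = WARCWriter(output, gzip=True)
    http_headers = StatusAndHeaders('200 OK', [('Content-Type', 'text/html')], protocol='HTTP/1.1')
    record = writer.create_warc_record(
        'https://example.com/',
        'response',
        payload=BytesIO(b'<html>...</html>'),
        http_headers=http_headers,
    )
    writer.write_record(record)
```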
