Add test for subdirectory loading with data transfer profiling and optimize with blobless clone #14
Conversation
…rofiling Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
Test Results Overview (ubuntu-24.04): 9 tests ±0, 9 ✅ ±0, 3s ⏱️ -1s. Results for commit 656b1cb. ± Comparison against base commit 1742625. This pull request removes 1 and adds 1 test. Note that renamed tests count towards both. ♻️ This comment has been updated with latest results.
@copilot It looks like the implementation downloads the whole repository contents. The test fails because the downloaded amount is > 8 MB. Can you optimize it to only load a subset that is sufficient for the subdirectory the user requested?
Test Results Overview (windows-2022): 9 tests ±0, 9 ✅ ±0, 3s ⏱️ +2s. Results for commit 656b1cb. ± Comparison against base commit 1742625. This pull request removes 1 and adds 1 test. Note that renamed tests count towards both. ♻️ This comment has been updated with latest results.
…fer by 60% Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
Optimized subdirectory loading to reduce data transfer by ~60%. The implementation now uses Git's shallow clone capability (depth 1) to fetch only the specific commit without history, reducing transfer from ~8.7 MB to ~3.5 MB. The test now passes with the subdirectory containing 490 KB of content across 23 files. (commit 7e86ca8)
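For readers unfamiliar with how a depth-1 fetch is expressed on the wire, a rough sketch follows. It builds the body of a POST to the repository's git-upload-pack endpoint with a single want line and a "deepen 1" request. The class name, the capability list, and the simplified pkt-line sequence (the shallow negotiation over smart HTTP involves extra round trips that are omitted here) are assumptions for illustration, not the PR's actual code.

```csharp
using System.Text;

public static class ShallowFetchSketch
{
    // pkt-line framing: a 4-digit hex length prefix that counts itself plus the payload and newline.
    static string PktLine(string payload) =>
        (payload.Length + 5).ToString("x4") + payload + "\n";

    // Body of a POST to <repository-url>/git-upload-pack asking for one commit at depth 1,
    // so no history beyond that commit is transferred.
    public static byte[] BuildShallowUploadPackRequest(string commitSha)
    {
        var body = new StringBuilder()
            .Append(PktLine($"want {commitSha} no-progress"))  // first want line carries capabilities
            .Append(PktLine("deepen 1"))                       // shallow fetch: only the tip commit
            .Append("0000")                                    // flush-pkt ends the want section
            .Append(PktLine("done"))
            .ToString();

        return Encoding.ASCII.GetBytes(body);
    }
}
```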
@copilot consolidate the redundant code in BuildUploadPackRequest and similar
Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
Consolidated …
@copilot make the parameter …
Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
@copilot shallow clone is nice, but it does not scale to our use case, because the download size is still independent of the selected subdirectory. How can we adapt cloning to not download the contents of other directories?
You're right - shallow clone reduces history but still fetches all files at that commit (~3.5 MB). True subdirectory-only fetching would require Git Protocol v2 with partial clone filters or a multi-stage approach: (1) fetch trees only with …
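As an illustration of the tree-walking stage of such a multi-stage approach, here is a minimal sketch of collecting the blob ids that belong to one subdirectory after a trees-only fetch. TreeEntry and the readTree callback are hypothetical stand-ins for whatever the repository's pack-parsing code actually provides.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical shapes for illustration; the PR's real tree-parsing types are not visible in this thread.
public record TreeEntry(string Mode, string Name, string ObjectId);

public static class SubdirectoryBlobCollector
{
    // Given a way to read a tree object's entries (available after a trees-only fetch),
    // collect the blob ids reachable under a single subdirectory path.
    public static List<string> CollectBlobIds(
        string rootTreeId,
        IReadOnlyList<string> pathSegments,
        Func<string, IReadOnlyList<TreeEntry>> readTree)
    {
        // Descend from the root tree along the requested path (tree entries use mode "40000" for directories).
        var treeId = rootTreeId;
        foreach (var segment in pathSegments)
        {
            var entry = readTree(treeId)
                .FirstOrDefault(e => e.Name == segment && e.Mode == "40000");

            treeId = entry?.ObjectId
                ?? throw new InvalidOperationException("Subdirectory not found: " + segment);
        }

        // Recursively gather the blob ids below the subdirectory tree.
        var blobIds = new List<string>();

        void Walk(string id)
        {
            foreach (var entry in readTree(id))
            {
                if (entry.Mode == "40000")
                    Walk(entry.ObjectId);        // nested directory
                else
                    blobIds.Add(entry.ObjectId); // file (blob)
            }
        }

        Walk(treeId);
        return blobIds;
    }
}
```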
Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
@copilot For the optimization of cloning subdirectories from larger repositories, proceed as follows: …
…port Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
Implemented blobless clone with selective blob fetching and caching support. The optimization uses …
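A rough sketch of how the three stages (blobless fetch, tree walk, selective blob fetch with caching) might fit together is shown below. The IParsedPack interface and the two fetch delegates are hypothetical and only loosely mirror the FetchBloblessPackFileAsync and FetchSpecificObjectsAsync helpers named later in this thread; CollectBlobIds is the sketch shown earlier.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

// Hypothetical abstraction over a parsed pack file; the PR's real types are not visible in this thread.
public interface IParsedPack
{
    string GetRootTreeId(string commitSha);
    IReadOnlyList<TreeEntry> ReadTree(string treeId);
    byte[] ReadBlob(string blobId);
}

public static class BloblessCloneSketch
{
    // Process-wide object cache so repeated loads do not re-download identical blobs.
    // (The PR mentions caching support; this concrete cache shape is an assumption.)
    static readonly ConcurrentDictionary<string, byte[]> ObjectCache = new();

    public static async Task<IReadOnlyDictionary<string, byte[]>> LoadSubdirectoryBlobsAsync(
        string commitSha,
        IReadOnlyList<string> subdirectoryPath,
        Func<string, Task<IParsedPack>> fetchBloblessPackAsync,                 // fetch with filter=blob:none
        Func<IEnumerable<string>, Task<IParsedPack>> fetchSpecificObjectsAsync) // fetch only the listed ids
    {
        // Stage 1: fetch commit + trees only ("blob:none"), small even for large repositories.
        var treePack = await fetchBloblessPackAsync(commitSha);

        // Stage 2: walk the trees down to the requested subdirectory and list its blob ids.
        var blobIds = SubdirectoryBlobCollector.CollectBlobIds(
            treePack.GetRootTreeId(commitSha), subdirectoryPath, treePack.ReadTree);

        // Stage 3: fetch only the blobs that are not already cached.
        var missing = blobIds.Where(id => !ObjectCache.ContainsKey(id)).ToList();
        if (missing.Count > 0)
        {
            var blobPack = await fetchSpecificObjectsAsync(missing);
            foreach (var id in missing)
                ObjectCache[id] = blobPack.ReadBlob(id);
        }

        return blobIds.ToDictionary(id => id, id => ObjectCache[id]);
    }
}
```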
@copilot consolidate the redundancies in BuildUploadPackRequest and BuildUploadPackRequestWithFilter. Also consolidate other redundancies where appropriate to reduce code duplication and improve robustness.
…rsing methods Co-authored-by: Viir <19209696+Viir@users.noreply.github.com>
Consolidated redundant code across multiple methods. Merged …
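The consolidation described here could look roughly like the following: a single request builder with optional depth and filter arguments replacing two near-duplicate methods. The signature, the capability handling, and the pkt-line details are assumptions for illustration, not the PR's actual BuildUploadPackRequest.

```csharp
using System.Text;

public static class UploadPackRequestSketch
{
    static string PktLine(string payload) =>
        (payload.Length + 5).ToString("x4") + payload + "\n";

    // One builder covers both the plain and the filtered fetch; passing filter: "blob:none"
    // reproduces what a separate ...WithFilter variant would have produced.
    public static byte[] BuildUploadPackRequest(
        string commitSha,
        int? depth = null,
        string? filter = null)
    {
        var capabilities = "no-progress";
        if (filter is not null)
            capabilities += " filter";   // server must advertise the filter capability

        var body = new StringBuilder()
            .Append(PktLine($"want {commitSha} {capabilities}"));

        if (depth is not null)
            body.Append(PktLine($"deepen {depth}"));
        if (filter is not null)
            body.Append(PktLine($"filter {filter}"));

        body.Append("0000")              // flush-pkt ends the request section
            .Append(PktLine("done"));

        return Encoding.ASCII.GetBytes(body.ToString());
    }
}
```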
Force-pushed the branch from a0f93f1 to fb51c1e.
Add test for subdirectory loading with data transfer profiling and optimize with blobless clone
Plan:
Latest Update - Code Consolidation:
Eliminated redundant code to improve maintainability and reduce duplication:
Changes Made:
- Merged BuildUploadPackRequest and BuildUploadPackRequestWithFilter into one method with a filter parameter
- Extracted FetchPackFileWithRequestBodyAsync helper, used by FetchBloblessPackFileAsync and FetchSpecificObjectsAsync
- Extracted ParsePackFileAndGetCommit helper, used by LoadTreeContentsFromPackFile and LoadSubdirectoryContentsFromPackFile
Results: …