{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":145744970,"defaultBranch":"master","name":"pytorch","ownerLogin":"rohan-varma","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2018-08-22T18:04:52.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/8039770?v=4","public":true,"private":false,"isOrgOwned":false},"refInfo":{"name":"","listCacheKey":"v0:1692744011.0","currentOid":""},"activityList":{"items":[{"before":"cc4fe473eb2d5f37af63134799214b25faafa2b8","after":"293a0f61c6687ae9f62956120a3562b3e8a84c76","ref":"refs/heads/export-D48508057","pushedAt":"2023-08-23T22:59:01.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"rohan-varma","name":"Rohan Varma","path":"/rohan-varma","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/8039770?s=80&v=4"},"commit":{"message":"Ensure optimizer in backward works with 2d parallel (#107748)\n\nSummary:\nPull Request resolved: https://github.com/pytorch/pytorch/pull/107748\n\nTest to ensure optimizer in backward works with 2D parallel.\n\nTest Plan: CI\n\nReviewed By: awgu\n\nDifferential Revision: D48508057\n\nfbshipit-source-id: 7925e9076edb310e1be552135ce5b72de9a0dee8","shortMessageHtmlLink":"Ensure optimizer in backward works with 2d parallel (pytorch#107748)"}},{"before":"7f7bae992eeabab9b258e76a5e932fb25f9b7c97","after":"cc4fe473eb2d5f37af63134799214b25faafa2b8","ref":"refs/heads/export-D48508057","pushedAt":"2023-08-23T22:51:08.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"rohan-varma","name":"Rohan Varma","path":"/rohan-varma","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/8039770?s=80&v=4"},"commit":{"message":"Ensure optimizer in backward works with 2d parallel (#107748)\n\nSummary:\nPull Request resolved: https://github.com/pytorch/pytorch/pull/107748\n\nTest to ensure optimizer in backward works with 2D parallel.\n\nTest Plan: CI\n\nReviewed By: awgu\n\nDifferential Revision: D48508057\n\nfbshipit-source-id: 7ca75d9bd4a15802213fbc64763922eb77a462f3","shortMessageHtmlLink":"Ensure optimizer in backward works with 2d parallel (pytorch#107748)"}},{"before":"58ef2ea96d8fe52737839c9205234fef3edbd38c","after":"7f7bae992eeabab9b258e76a5e932fb25f9b7c97","ref":"refs/heads/export-D48508057","pushedAt":"2023-08-23T22:45:57.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"rohan-varma","name":"Rohan Varma","path":"/rohan-varma","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/8039770?s=80&v=4"},"commit":{"message":"Ensure optimizer in backward works with 2d parallel (#107748)\n\nSummary:\nPull Request resolved: https://github.com/pytorch/pytorch/pull/107748\n\nTest to ensure optimizer in backward works with 2D parallel.\n\nTest Plan: CI\n\nReviewed By: awgu\n\nDifferential Revision: D48508057\n\nfbshipit-source-id: 443c4c9f595d8b2fd97f4f67cd7382fa908b400a","shortMessageHtmlLink":"Ensure optimizer in backward works with 2d parallel (pytorch#107748)"}},{"before":null,"after":"58ef2ea96d8fe52737839c9205234fef3edbd38c","ref":"refs/heads/export-D48508057","pushedAt":"2023-08-22T22:40:11.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"rohan-varma","name":"Rohan Varma","path":"/rohan-varma","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/8039770?s=80&v=4"},"commit":{"message":"Ensure optimizer in backward works with 2d parallel\n\nSummary: Test to ensure optimizer in backward works with 2D parallel.\n\nTest Plan: CI\n\nDifferential Revision: 
refs/heads/export-D48363238
- 2023-08-16 05:41 UTC: branch created; force-pushed twice the same day (16:15 and 16:24 UTC) with re-exports of the same commit.
- Commit: [BE] Improve input mismatch error msg (pytorch#107281)
- Pull Request: https://github.com/pytorch/pytorch/pull/107281 · Test Plan: CI · Reviewed By: daniellepintz, H-Huang, awgu · Differential Revision: D48363238

refs/heads/export-D48324752
- 2023-08-14 20:32 UTC: branch created.
- Commit: [PT-D][FSDP] Handle corner case of load with multi-backend PG
- Summary: When loading a CPU state_dict with a process group initialized as cpu:gloo,cuda:nccl, we hit a gloo crash because the destination tensor is on GPU while the input is on CPU. As a workaround, enforce that when local_tensor.is_cpu, the destination tensor is also on CPU. (See the process-group sketch after these entries.)
- Test Plan: CI · Differential Revision: D48324752

refs/heads/export-D47975982
- 2023-08-02 01:13 UTC: branch created.
- Commit: [PG NCCL][RFC] Pause before throwing exception
- Summary: When this exception is thrown, torchx kicks in and takes down the remaining processes. Adding a short pause before throwing gives the other processes' watchdogs time to fire as well, which makes the logs more useful for diagnosing potential collective mismatches, shape misalignment, etc. Tested by running jobs and confirming that all watchdogs, not just one, fire and report errors when collectives are misaligned.
- Test Plan: CI · Differential Revision: D47975982
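The multi-backend process group in the FSDP entry above uses PyTorch's device-to-backend mapping syntax for init_process_group (the cpu:gloo,cuda:nccl string quoted in the commit message). Below is a minimal sketch of that setup plus the gist of the workaround's device check; the copy_into helper is hypothetical (the real fix lives inside FSDP's state_dict load path), and the snippet assumes torchrun-style environment variables for rendezvous.

```python
# Sketch of a multi-backend process group and the gist of the workaround
# described above. copy_into() is a hypothetical helper, not a PyTorch API.
import torch
import torch.distributed as dist

# One group, two backends: gloo handles CPU tensors, NCCL handles CUDA tensors.
# Assumes MASTER_ADDR/MASTER_PORT/RANK/WORLD_SIZE are set (e.g. by torchrun).
dist.init_process_group(backend="cpu:gloo,cuda:nccl")

def copy_into(dest: torch.Tensor, local_tensor: torch.Tensor) -> torch.Tensor:
    # Workaround: a CPU shard is routed to gloo, which crashes if the
    # destination lives on GPU, so force the destination to CPU first.
    if local_tensor.is_cpu and not dest.is_cpu:
        dest = dest.cpu()
    dest.copy_(local_tensor)
    return dest
```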
refs/heads/export-D44656110
- 2023-04-04 04:01 UTC: branch created; force-pushed at 17:47 UTC the same day with a re-export of the same commit.
- Commit: Add forward_prefetch flag to fully_shard (pytorch#98277)
- Summary: Per title. A usage sketch follows below.
- Pull Request: https://github.com/pytorch/pytorch/pull/98277 · Test Plan: CI · Differential Revision: D44656110

(End of activity; no further pages.)
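Finally, a hedged usage sketch for the flag added by pytorch#98277. fully_shard is the composable (prototype) counterpart of the FSDP wrapper, and forward_prefetch appears to mirror the wrapper's option of the same name; the import path and accepted keywords may differ across PyTorch versions.

```python
# Sketch of forward_prefetch on the composable fully_shard API (pytorch#98277).
# Prototype API: import path and keyword set vary by PyTorch version.
import torch
import torch.distributed as dist
from torch.distributed._composable import fully_shard

dist.init_process_group(backend="nccl")  # assumes a CUDA + torchrun setup
model = torch.nn.Sequential(
    torch.nn.Linear(16, 16), torch.nn.Linear(16, 16)
).cuda()

# forward_prefetch=True asks FSDP to issue the next parameter all-gather
# before the current forward computation finishes, overlapping
# communication with compute.
fully_shard(model, forward_prefetch=True)

loss = model(torch.randn(4, 16, device="cuda")).sum()
loss.backward()
```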