{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":199081707,"defaultBranch":"master","name":"pytorch","ownerLogin":"anandj91","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2019-07-26T21:25:02.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/5725157?v=4","public":true,"private":false,"isOrgOwned":false},"refInfo":{"name":"","listCacheKey":"v0:1696274024.0","currentOid":""},"activityList":{"items":[{"before":null,"after":"618a731dd727e83218c64136d6427b0988126713","ref":"refs/heads/pytorch-200","pushedAt":"2023-10-02T19:13:44.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"anandj91","name":"Anand J","path":"/anandj91","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5725157?s=80&v=4"},"commit":{"message":"Set env vars","shortMessageHtmlLink":"Set env vars"}},{"before":"e9ebda29d87ce0916ab08c06ab26fd3766a870e5","after":"1dcaa5599072b3fd6d4df3c95bb0c5d4bd7c8a12","ref":"refs/heads/pytorch-201","pushedAt":"2023-09-24T20:54:01.000Z","pushType":"push","commitsCount":1,"pusher":{"login":"anandj91","name":"Anand J","path":"/anandj91","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5725157?s=80&v=4"},"commit":{"message":"Set env vars for build from source","shortMessageHtmlLink":"Set env vars for build from source"}},{"before":"e9ebda29d87ce0916ab08c06ab26fd3766a870e5","after":null,"ref":"refs/heads/test","pushedAt":"2023-09-22T23:04:18.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"anandj91","name":"Anand J","path":"/anandj91","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5725157?s=80&v=4"}},{"before":null,"after":"e9ebda29d87ce0916ab08c06ab26fd3766a870e5","ref":"refs/heads/pytorch-201","pushedAt":"2023-09-22T23:04:07.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"anandj91","name":"Anand J","path":"/anandj91","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5725157?s=80&v=4"},"commit":{"message":"[2.0.1] Disable SDPA FlashAttention backward and mem eff attention on sm86+ for head_dim above 64 (#99736)\n\n* Disable SDPA FlashAttention backward and mem eff attention on sm86+ for head_dim above 64 (#99105)\r\n\r\nExpand sdpa_utils.h check to disable FlashAttention when using autograd and mem eff attention for the following cases\r\n- head_dim > 64\r\n- sm86 or newer\r\n\r\nPreviously we only disable these kernels on sm86 and for head_dim equal to 128.\r\n\r\nPull Request resolved: https://github.com/pytorch/pytorch/pull/99105\r\nApproved by: https://github.com/malfet\r\n\r\n* remove master only test\r\n\r\n---------\r\n\r\nCo-authored-by: albanD ","shortMessageHtmlLink":"[2.0.1] Disable SDPA FlashAttention backward and mem eff attention on…"}},{"before":"db3205b1c0db1dc31a27d20c9702f3d69c3ce1a0","after":"e9ebda29d87ce0916ab08c06ab26fd3766a870e5","ref":"refs/heads/test","pushedAt":"2023-09-22T23:02:31.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"anandj91","name":"Anand J","path":"/anandj91","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5725157?s=80&v=4"},"commit":{"message":"[2.0.1] Disable SDPA FlashAttention backward and mem eff attention on sm86+ for head_dim above 64 (#99736)\n\n* Disable SDPA FlashAttention backward and mem eff attention on sm86+ for head_dim above 64 (#99105)\r\n\r\nExpand sdpa_utils.h check to disable FlashAttention when using autograd and mem eff attention for the following cases\r\n- head_dim > 64\r\n- sm86 or newer\r\n\r\nPreviously we only disable 
these kernels on sm86 and for head_dim equal to 128.\r\n\r\nPull Request resolved: https://github.com/pytorch/pytorch/pull/99105\r\nApproved by: https://github.com/malfet\r\n\r\n* remove master only test\r\n\r\n---------\r\n\r\nCo-authored-by: albanD ","shortMessageHtmlLink":"[2.0.1] Disable SDPA FlashAttention backward and mem eff attention on…"}},{"before":"190d255d2ee180e2bb8fa617ece3cc9ecbddb3fb","after":"e42d450a555c1577fb8cee3f7ccde3532833f9a7","ref":"refs/heads/master","pushedAt":"2023-09-22T22:33:44.000Z","pushType":"push","commitsCount":10000,"pusher":{"login":"anandj91","name":"Anand J","path":"/anandj91","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/5725157?s=80&v=4"},"commit":{"message":"[core IR] Add div.Tensor_mode, div.Scalar_mode, and copy as core operators (#109812)\n\nPull Request resolved: https://github.com/pytorch/pytorch/pull/109812\nApproved by: https://github.com/kirklandsign","shortMessageHtmlLink":"[core IR] Add div.Tensor_mode, div.Scalar_mode, and copy as core oper…"}}],"hasNextPage":false,"hasPreviousPage":false,"activityType":"all","actor":null,"timePeriod":"all","sort":"DESC","perPage":30,"cursor":"djE6ks8AAAADjbDnzAA","startCursor":null,"endCursor":null}},"title":"Activity · anandj91/pytorch"}
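
The [2.0.1] backport entry above describes a constraint on the fused SDPA backends: on sm86 or newer GPUs with head_dim above 64, the FlashAttention backward and memory-efficient kernels are disabled and scaled_dot_product_attention falls back to the math path. A minimal sketch of what that looks like from the caller's side, assuming a CUDA device and illustrative tensor shapes (head_dim = 128) that are not taken from the commit itself:

```python
# Sketch only: shapes, dtype, and the explicit backend restriction below are
# illustrative assumptions, not part of the anandj91/pytorch commits.
import torch
import torch.nn.functional as F

# batch=2, heads=8, seq_len=128, head_dim=128 (> 64, the case the check targets)
q = torch.randn(2, 8, 128, 128, device="cuda", dtype=torch.float16, requires_grad=True)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Restrict SDPA to the math backend, which is available regardless of head_dim
# or compute capability; on sm86+ with head_dim > 64 and autograd enabled this
# is the backend a 2.0.1 build would fall back to anyway.
with torch.backends.cuda.sdp_kernel(enable_flash=False, enable_mem_efficient=False, enable_math=True):
    out = F.scaled_dot_product_attention(q, k, v)
    out.sum().backward()
```

torch.backends.cuda.sdp_kernel is the 2.0.x-era context manager for backend selection; later releases expose torch.nn.attention.sdpa_kernel instead.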