fix: update allanime key and ct_len #1667
Conversation
Force-pushed 27a68ef to a8df215
The Allanime API changed encryption on 2026-04-22.
- Base key: `SimtVuagFbGR2K7P` -> `Xot36i3lK3:v1` (now includes a ":v1" suffix; derived via SHA-256)
- Blob format: `[version:1][IV:12][ciphertext][auth_tag:16]` (was: `[IV:12][plaintext]`; the version byte was skipped)
- Ciphertext length: explicit `ct_len = file_size - 13 - 16` (was: read the entire rest, including the GCM auth tag)
- Version bump: 4.12.0 -> 4.13.0
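A minimal sketch of the new layout in Python, based only on the summary above: the AES key is the SHA-256 digest of the base key (which now carries the ":v1" suffix), and `ct_len` is computed explicitly so the 16-byte GCM auth tag is no longer swallowed into the ciphertext. The function names and the sample blob are illustrative, not from the actual patch.

```python
import hashlib

BASE_KEY = "Xot36i3lK3:v1"  # new base key, ":v1" suffix included


def derive_key() -> bytes:
    # SHA-256 of the base key yields a 32-byte AES-256 key
    return hashlib.sha256(BASE_KEY.encode()).digest()


def parse_blob(blob: bytes):
    # New format: [version:1][IV:12][ciphertext][auth_tag:16]
    version = blob[0]
    iv = blob[1:13]
    # explicit ct_len: total size minus 13-byte header and 16-byte auth tag
    ct_len = len(blob) - 13 - 16
    ciphertext = blob[13:13 + ct_len]
    tag = blob[13 + ct_len:]
    return version, iv, ciphertext, tag


# Illustrative blob: 1 version byte + 12-byte IV + 5-byte ciphertext + 16-byte tag
blob = bytes([1]) + b"A" * 12 + b"hello" + b"T" * 16
version, iv, ct, tag = parse_blob(blob)
assert (version, len(iv), ct, len(tag)) == (1, 12, b"hello", 16)
assert len(derive_key()) == 32
```

The old code read everything after the IV as ciphertext, which breaks under GCM because the trailing 16 bytes are the authentication tag, not data.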
Force-pushed a8df215 to e5523a9
Three questions:
These are some of the observations I had after cloning the fix (do let me know if I made any errors anywhere)
@sahil-k-george bro, you did it the correct way; now with just a small change in the code it will run. Refer to this:
Ok, since I'm on Android right now, I'll also do an equivalent fix for termux and let you know in #1665
Update:

(and yes I had to look up
well @justchokingaround, @sahil-k-george, I dunno about this stuff that much, I just shared what I did : ). Hope it gets fixed soon, but it really works on Windows after my change, so I shared it...
nethriondev left a comment
where does this key come from? I can't find it in https://cdn.allanime.day/all/manga/a10191a.js
Isn't it better, then, to just parse the file and recover some keys?

```sh
curl -s https://cdn.allanime.day/all/manga/83d4166.js \
  | npx prettier --parser babel --stdin-filepath temp.js \
  | awk '/importKey/{p=1} p; /\];/{p=0}'
```

```
"importKey",
"WUMcT",
"hXKMM",
"MJbkd",
"NEPNI",
"iqdXa",
"oWlug",
"cdjBR",
"map",
"OvVQp",
"Unsupporte",
"FNMqB",
"nqvob",
"izUlL",
"set",
"xDxPr",
"9587304pPDucs",
"uIqtx",
"690336osMA",
"jkDlP",
"hJvPG",
```
no
@justchokingaround I think you should plan for the updated keys instead of just saying "no". You said the key is not likely to change (#1650 (comment)), and the change still happened, so it's better to future-proof if there is any way. It seems @BishopWolf was suggesting something along similar lines (I'm not sure whether that can just work or not, but I feel it's better to assume the key changes more frequently than expected, imo, to avoid PRs like this).
I know you said this, but I feel it's worth a try at least instead of just assuming it's gonna break just as often. But yeah, I would let @Derisis13 decide how to keep ani-cli functional in case things like this happen.
they didn't change just the key, they also changed the offset, so even if there was a key extractor, it would still fail. It would just be a big waste of time to spend hours writing a key extractor just for it to break overnight. If you wanna do that, be my guest, but I'm not gonna keep elaborating on why that's a bad idea
So you'd keep updating keys overnight for it to be functional? (Just trying to see how you envision it working as well..)
Huh, I feel like both @justchokingaround and @manorit2001 are right.
Also, how is the offset calculated in the first place, from
My 2 cents: for dynamic flows where things keep changing, there are LLMs around, and maybe that could be incorporated into this repo. If that's a no-go (not sure what the project's take on LLM-assisted workflows is), I'd say document how the key should be found and make it a user argument. But I feel this is where LLMs would come in very handy, when you can't solve things programmatically due to their dynamic nature.
(Correct me if I am wrong) I think this repo follows a strict no-AI/no-LLM policy, so that might be a no-go. Also, regarding this:
I was thinking of something else; once I know how the offset is calculated, I'll share my idea here. (It might be a waste of time discussing it if the idea can't be implemented in the first place.)
...
Listen. The extraction does not change frequently. Regarding LLMs, I do welcome LLM-assisted contributions. I appreciate your eagerness to contribute.
If it is not a frequent change, then yeah, a dedicated extraction mechanism is just bloat at this point.
That is a great insight which I didn't know. I was a bit skeptical about where the line was drawn, so now it's clearer what is okay and what is not.
It's not just the keys they change; it's the encrypted text padding, the algorithm, anything. The possibilities are infinite, and we should not write a parser for something that's uncertain.
I think we differ on what I was suggesting, actually. It wasn't about LLM-assisted contributions; it was about an LLM going and finding this padding structure in the code itself (an LLM SDK being part of the ani-cli repo), to handle the dynamic nature that we can't programmatically solve today. But I understand that at that point it's much better to just move providers; it was only a suggestion, as I mentioned. The reason I came to this PR was that it was somewhat troublesome that ani-cli hadn't been working even after I updated to the last commit, so I came here instead of opening another issue. Seeing that it can still potentially fail within a few days even after this PR, I would be happiest if that doesn't happen, and I'm really thankful for the contributions of @justchokingaround and you all. I just erred on the side of it failing again, with them changing things frequently, and suggested things based on that.
Didn't test, just approved 😎
Several users are reporting it as working. Merge timing is up to your discretion.
Merged as it works right now |




Pull Request Template
Type of change
Description
ramble here
Checklist
- `-c` history and continue work
- `-d` downloads work
- `-s` syncplay works
- `-q` quality works
- `-v` vlc works
- `-e` (select episode) aka `-r` (range selection) works
- `-S` select index works
- `--skip` ani-skip works
- `--skip-title` ani-skip title argument works
- `--no-detach` no detach works
- `--exit-after-play` auto exit after playing works
- `--nextep-countdown` countdown to next ep works
- `--dub` and regular (sub) mode both work
- `-h` help info is up to date
Additional Testcases