HTTP upload - Akamai #939
Comments
Hi. I can't vouch for what was merged into the master branch, as I haven't tested it from master myself yet. I'm running my own branch, which supports setting the user agent. You definitely need to set a user agent that isn't rejected by Akamai's ingest servers. I'm also not using `put+http` in the URL, but this might be required with how it works on the master branch.
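In case it helps, setting the user agent boils down to libcurl's standard CURLOPT_USERAGENT option; a minimal sketch of what a branch like mine does (the helper function and its plumbing are illustrative, not the actual packager source):

```cpp
#include <string>

#include <curl/curl.h>

// Sketch only: apply a caller-supplied user agent to a curl handle.
// CURLOPT_USERAGENT is standard libcurl; the helper itself is hypothetical.
void SetUserAgent(CURL* curl, const std::string& user_agent) {
  if (!user_agent.empty())
    curl_easy_setopt(curl, CURLOPT_USERAGENT, user_agent.c_str());
}
```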
@termoose I would like to use this ticket to confirm, and maybe even fix, HTTP upload to Akamai using Google's master branch. As a side note, regarding Akamai rejecting user agents: I believe HLS should not be rejected by user agent validations. Akamai's docs say that's the case only for HDS and MPEG-DASH, so maybe we don't even need user agent changing for HLS. In any case, I'll do some tests and note the results here. Thanks.
I have been debugging this issue, and have some news. Please take a look at this diff:
With that modification, Akamai push works. The problem is as follows:
The problem here is, I believe, the wrong assumption that throwing errors is the correct behaviour for this particular case. I think it's OK to throw errors when calling those functions, but I don't believe this should force so many code checks in so many random places: that looks like an unwanted side effect to me. So, I believe we should consider a less aggressive implementation of the error reporting. For clarification, these are the "functionalities generating errors" I'm talking about:

So, how do you people think we should handle this?
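For context, this is roughly the shape of one of those error-generating functions, sketched from the "HttpFile does not support Size()" message (illustrative only; the actual upstream source may differ):

```cpp
// Sketch, not the exact packager code: HttpFile cannot know the remote size
// up front, so Size() reports an error even on code paths that are perfectly
// normal for HTTP uploads, and every caller has to special-case the -1.
int64_t HttpFile::Size() {
  LOG(ERROR) << "HttpFile does not support Size().";
  return -1;
}
```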
@Canta Thanks for looking into it. Can you share the command you use? It could be a bug in the handling of the returned data in HttpFile. Can you try the patch below to see if it fixes the issue?

```diff
diff --git i/packager/file/http_file.cc w/packager/file/http_file.cc
index 8ba7b471..268a6a64 100644
--- i/packager/file/http_file.cc
+++ w/packager/file/http_file.cc
@@ -45,8 +45,14 @@ constexpr const int kMinLogLevelForCurlDebugFunction = 2;
size_t CurlWriteCallback(char* buffer, size_t size, size_t nmemb, void* user) {
IoCache* cache = reinterpret_cast<IoCache*>(user);
- size_t length = cache->Write(buffer, size * nmemb);
- VLOG(3) << "CurlWriteCallback length=" << length;
+ size_t length = size * nmemb;
+ if (cache) {
+ length = cache->Write(buffer, length);
+ VLOG(3) << "CurlWriteCallback length=" << length;
+ } else {
+ // For the case of HTTP Put, the returned data may not be consumed. Return
+ // the size of the data to avoid curl errors.
+ }
return length;
}
@@ -284,7 +290,8 @@ void HttpFile::SetupRequest() {
curl_easy_setopt(curl, CURLOPT_FAILONERROR, 1L);
curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, &CurlWriteCallback);
- curl_easy_setopt(curl, CURLOPT_WRITEDATA, &download_cache_);
+ curl_easy_setopt(curl, CURLOPT_WRITEDATA,
+ method_ == HttpMethod::kPut ? nullptr : &download_cache_);
if (method_ != HttpMethod::kGet) {
curl_easy_setopt(curl, CURLOPT_READFUNCTION, &CurlReadCallback);
curl_easy_setopt(curl, CURLOPT_READDATA, &upload_cache_);
```

You'll still see errors like below, but they can be ignored.
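For background, this relies on libcurl's documented contract for CURLOPT_WRITEFUNCTION: if the callback returns any count other than size * nmemb, curl aborts the transfer with CURLE_WRITE_ERROR. A minimal standalone illustration of the discard case (hypothetical helper, mirroring the null-cache branch in the patch above):

```cpp
#include <cstddef>

// Discard the response body. Returning size * nmemb tells curl that every
// byte was consumed, so the transfer is not aborted with CURLE_WRITE_ERROR.
size_t DiscardWriteCallback(char* /*buffer*/, size_t size, size_t nmemb,
                            void* /*user*/) {
  return size * nmemb;
}
```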
Thanks @kqyang :)
@Canta Thanks for the quick response!
Great. Thanks!
Have you tried whether updating the hard-coded user agent fixes the issue? If not, can you try?
Certainly. Would you like to create another PR to bring it back?
Good point. I think it should just be a VLOG, i.e. change

```cpp
LOG(ERROR) << "HttpFile does not support Size().";
```

to

```cpp
VLOG(1) << "HttpFile does not support Size().";
```

You are welcome to submit another PR to update it. Thanks!
…ject#939:
* Fixing a bug regarding "content length" and "file size" that was impeding normal HTTP operations. See issue comment 4.
* Adding back the previously lost ability to change the HTTP user agent from a command line parameter (mostly taken from termoose's code).
* Changing the "HttpFile does not support Size()" message to VLOG(1) log level.
Couldn't do it yesterday, but here's the PR with the following modifications:
Of course, I've already tested the changes, without any new issues.
I did, but I don't actually have any qualified user agent available, so I still don't have a successful MPEG-DASH test. I've started talks with Akamai support about how to handle this, as I don't think it's sustainable to just "lie" about the user agent: changing it is fine for testing and other use cases, but the issue here is Akamai's policies, not any Shaka Packager technical problem. Shaka does what it does just fine; its user agent should not be its qualification parameter. BTW, I don't want to be picky, but this ticket is good-enough context for this other detail: I believe Shaka Packager's default HTTP user agent should change.
@Canta Thanks for the PR! Appreciated.
Good catch! How about changing it to:

```cpp
#include "packager/version/version.h"
...
user_agent_ = "ShakaPackager/" + GetPackagerVersion();
```

The file.gyp needs to be updated to include an additional dependency.
@kqyang These are Akamai's guidelines for MPEG-DASH: Here's the thing I would like to ask you about: I've done some tests, and Shaka tries to push the chunks before the manifest. That's OK by me, and actually makes sense, but it seems to be against Akamai's guidelines. So I would like to add some command line flag along the lines of the sketch below.
Thanks.
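Something like this is the kind of flag I mean; the name and default are invented purely for illustration:

```cpp
#include "gflags/gflags.h"

// Hypothetical flag, not an existing packager option: upload the manifest or
// playlist to the ingest endpoint before the first media segment, as some
// endpoints (e.g. Akamai) seem to expect.
DEFINE_bool(push_manifest_first, false,
            "Push the manifest before the first media segment.");
```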
Another thing regarding Akamai's qualification procedure. Take a look at this link: I was thinking of starting the qualification process by myself. Yet, I don't think that's OK, as the guidelines ask for business formalities that I just can't provide: I'm not a Google or Shaka representative. Giving my contact as a formal business representative for this project would be... well... kinda scammy, frankly. Is there any chance that some Shaka Packager representative starts that process? I could be a technical contact of sorts for Akamai, but I don't feel comfortable telling them something like "yeah, I'm the Shaka Packager representative": I'm just not that person.
Interesting. Is it a hard requirement? Does it allow the manifest to be updated later?
A proper place to do it is in
I don't know what the best way to approach it is. I'll check with the Google open source team / Google legal on how we can approach it. For now, feel free to tell Akamai that you or your company is using the open source packager.
- Do not write the HTTP PUT response to cache, which can potentially overflow the cache buffer as it is not consumed.
- VLOG(1) instead of LOG(ERROR) on HttpFile::Size(), as it can be called during normal code execution.
- Add a command line flag `--user_agent` to allow users to specify their custom user agent string.

Fixes shaka-project#939.
System info
Operating System:
Ubuntu 18.04.5 LTS (dockerized)
Shaka Packager Version:
c1f64e5-release
Issue and steps to reproduce the problem
I'm struggling to push (actually PUT) some live streams to Akamai HTTP endpoints, based on these instructions.
It says here that the fresh HTTP PUT upload functionality was "Tested with ingesting encrypted DASH segments against Akamai MSL4 EP's". Yet, several things are weird:

- I can push with ffmpeg, but so far I can't with shaka packager. I have some logs regarding this, but would like to clear up some other stuff first.
- Shaka logs errors about the Size() function when it's not available, or wrongly interprets the playlist URL. I'm actually confused right now about when to add the put+ prefix to which parameters.
- Akamai's docs state "Note: HDS and DASH live streams published from non-qualified encoders will be rejected", and shaka packager is not listed there. ffmpeg IS there, but only qualified for HLS. I haven't tried ffmpeg+DASH yet, but the message is actually scary.

So, with that in mind, before creating more precise tickets or doing a million extra tests, I would like to ask you people for some clarification.
Can anybody confirm packager's current ability to push HLS and/or MPEG-DASH to Akamai? Some command line example would be nice in order to reproduce it.
Packager Command:
If you execute this, it works fine:
This other combination of commands doesn't:
What is the expected result?
Shaka packager should push to Akamai.
What happens instead?
Different combinations of URLs/settings (like using or not the put+ prefix in the hls_master_playlist_output, segment_template, or playlist_name parameters) end up in different situations, but in any case it doesn't work.