lower level postdata #4576

Open
jimmywarting opened this issue Jun 12, 2019 · 5 comments

jimmywarting commented Jun 12, 2019

postData only takes a string, and I think that is too high an abstraction for me. I would like to try posting binary data, preferably as an ArrayBuffer, an ArrayBufferView, or better yet a byte stream, so that I'm able to pipe a large file.

// Requires page.setRequestInterception(true) to be set beforehand.
page.on('request', request => {
  const overrides = {}
  if (request.url() === 'https://httpbin.org/post') {
    overrides.method = 'POST'
    // Wished-for: accept a stream, a typed array, or a string as the body.
    overrides.postData = stream || uint8Array || string
  }
  request.continue(overrides)
})

It would be awesome if one could utilize node-fetch's Request class (or any other WHATWG fetch implementation), which follows a spec, and have something that also follows the Service Worker request-interception naming convention:

const fetch = require('node-fetch')
const { Request, Headers } = fetch

// Similar to the Service Worker `evt.respondWith` (but for the request).
// A promise that resolves to a Request could also be passed in,
// just like the Service Worker respondWith().
something.requestWith(
  new Request(url, { method, headers, body })
)
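
In the meantime, a rough approximation is already possible for text bodies with the API that exists today: read the WHATWG Request into a string and hand its parts to the string-based continue() overrides. A minimal sketch, assuming node-fetch's Request and request interception already enabled; toOverrides is just an illustrative helper name, not a Puppeteer API:

const { Request } = require('node-fetch')

// Illustrative helper: flatten a WHATWG Request into the string-only
// overrides that continue() accepts today.
async function toOverrides(req) {
  return {
    method: req.method,
    headers: Object.fromEntries(req.headers),
    postData: await req.text(), // binary bodies would be lossy here
  }
}

// Inside an async context with an existing `page`:
await page.setRequestInterception(true)
page.on('request', async request => {
  if (request.url() === 'https://httpbin.org/post') {
    const req = new Request(request.url(), { method: 'POST', body: 'hello' })
    return request.continue(await toOverrides(req))
  }
  return request.continue()
})
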
@treemonster

I have the same need and hope it can get official support. It would be great if a large postData could be read through a Readable; not supporting it at all is a bit inconvenient.


stale bot commented Jun 27, 2022

We're marking this issue as unconfirmed because it has not had recent activity and we weren't able to confirm it yet. It will be closed if no further activity occurs within the next 30 days.

stale bot added the unconfirmed label Jun 27, 2022

jimmywarting commented Jun 27, 2022

Now that the Fetch API is part of Node.js, I really wish for something like this to work:

page.on('request', async evt => {
  /** @type {Request} */
  const request = evt.request
  const formData = await request.formData() // Or:
  for await (const uint8 of request.body) {
    console.log(uint8)
  }
  evt.respondWith(new Response('foo bar', { status: 200 }))
})
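
For reference, the closest thing the current API offers is still string-based: HTTPRequest.postData() returns the body as a string (if the browser reports one) and respond() fulfills the request in place. A minimal sketch of that existing surface, not the wished-for evt.respondWith():

await page.setRequestInterception(true)
page.on('request', request => {
  if (request.method() === 'POST') {
    // postData() is a string or undefined; no FormData, no byte stream.
    console.log(request.postData())
    return request.respond({
      status: 200,
      contentType: 'text/plain',
      body: 'foo bar',
    })
  }
  return request.continue()
})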

stale bot removed the unconfirmed label Jun 27, 2022

stale bot commented Aug 30, 2022

We're marking this issue as unconfirmed because it has not had recent activity and we weren't able to confirm it yet. It will be closed if no further activity occurs within the next 30 days.

@guest271314

While it requires multiple requests, it is possible to pipe multiple streams into a single stream with a cooperative client. Using an extension with chrome.debugger, I do something like this:

background.js

import { bytesArrToBase64, base64ToBytesArr } from './base64-encode-decode.js';
const encoder = new TextEncoder();
chrome.debugger.onEvent.addListener(async ({ tabId }, message, params) => {
  console.log(tabId, message, params);
  if (
    message === 'Fetch.requestPaused' &&
    /^chrome-extension|ext_stream/.test(params.request.url)
  ) {
    await chrome.debugger.sendCommand({ tabId }, 'Fetch.fulfillRequest', {
      responseCode: 200,
      requestId: params.requestId,
      // Fetch.fulfillRequest takes responseHeaders as { name, value } entries.
      responseHeaders: Object.entries(params.request.headers).map(
        ([name, value]) => ({ name, value })
      ),
      body: bytesArrToBase64(
        encoder.encode(
          JSON.stringify([...Uint8Array.from({ length: 1764 }, () => 255)])
        )
      ),
    });
  } else {
    await chrome.debugger.sendCommand({ tabId }, 'Fetch.continueRequest', {
      requestId: params.requestId,
    });
  }
});

chrome.action.onClicked.addListener(async (tab) => {
  const tabId = tab.id;
  await chrome.debugger.attach({ tabId }, '1.3');
  // await chrome.debugger.sendCommand({tabId}, 'Network.enable');
  await chrome.debugger.sendCommand({ tabId }, 'Fetch.enable', {
    patterns: [
      {
        requestStage: 'Request',
        resourceType: 'XHR',
        urlPattern: '*ext_stream',
      },
    ],
  });
});
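
The ./base64-encode-decode.js module imported above is not included in the comment; a stand-in with the same signature might look like this (an assumption about the original helper, not the actual file):

// Hypothetical stand-in for bytesArrToBase64: Fetch.fulfillRequest expects
// body to be base64-encoded, so the Uint8Array is converted byte by byte.
export function bytesArrToBase64(bytes) {
  let binary = '';
  for (const byte of bytes) {
    binary += String.fromCharCode(byte);
  }
  return btoa(binary);
}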

On an arbitrary web page:

{
  let abortable = new AbortController();
  let { signal } = abortable;
  let { readable, writable } = new TransformStream();
  // Repeatedly fetch the intercepted URL; each fulfilled response body is
  // decoded, parsed, and piped into the shared writable side.
  (async (_) => {
    while (!abortable.signal.aborted) {
      try {
        (
          await fetch('./?ext_stream', {
            method: 'post',
            body: '',
            cache: 'no-store',
          })
        ).body
          .pipeThrough(new TextDecoderStream())
          .pipeThrough(
            new TransformStream({
              transform(v, c) {
                c.enqueue(JSON.parse(v));
              },
              flush() {
              //  console.log('flush');
              },
            })
          )
          .pipeTo(writable, {
            preventClose: true,
          })
          .catch(console.log);
      } catch (e) {
        console.warn(e);
        break;
      }
    }
  })();

  // Consume the combined stream until aborted.
  readable
    .pipeTo(
      new WritableStream({
        write(v) {
          console.log(v);
        },
        close() {
          console.log('close');
        },
        abort(reason) {
          console.log(reason);
        },
      }),
      { signal }
    )
    .catch(console.log);

  setTimeout(() => {
    writable.abort();
    abortable.abort('Done streaming.');
  }, 1000);
}

Since we can write a script directly to the page, it is also possible to do something like creating a TransformStream on the page and then writing to its writable side in subsequent script executions on the page:

// Runs in the page's MAIN world: the first call creates the shared
// TransformStream, later calls write each chunk of data to it.
async function transform(data = [...new Array(1764)].map(() => 0)) {
  if (
    !globalThis.hasOwnProperty('extension_readable') &&
    !globalThis.hasOwnProperty('extension_writable')
  ) {
    ({
      readable: globalThis.extension_readable,
      writable: globalThis.extension_writable,
    } = new TransformStream());
    globalThis.extension_writer = globalThis.extension_writable.getWriter();
    extension_readable.pipeTo(
      new WritableStream({
        write(v) {
          console.log(v.length);
        },
        close() {
          console.log('close');
        },
      })
    );
  } else {
    await globalThis.extension_writer.write(new Uint8Array(data));
  }
  // console.log(globalThis.extension_readable, globalThis.extension_writable);
  // return globalThis.hasOwnProperty('extension_readable') && globalThis.hasOwnProperty('extension_writable')
  return Promise.resolve();
}

globalThis.streaming = false;
globalThis.port = null;

// Extension service worker: toggle streaming from the native messaging host
// (system audio via parec) into the page on toolbar icon clicks.
chrome.action.onClicked.addListener(async (tab) => {
  if (!globalThis.streaming) {
    await chrome.scripting.executeScript({
      target: {
        tabId: tab.id,
      },
      world: 'MAIN',
      args: [],
      func: transform,
    });
    globalThis.port = chrome.runtime.connectNative('capture_system_audio');
    port.onMessage.addListener(async (message) => {
      await chrome.scripting.executeScript({
        target: {
          tabId: tab.id,
        },
        world: 'MAIN',
        args: [message],
        func: transform,
      });
    });
    port.onDisconnect.addListener((p) => {
      console.log(p);
    });
    port.postMessage('parec -d @DEFAULT_MONITOR@');
    streaming = true;
  } else {
    globalThis.port.disconnect();
    await chrome.scripting.executeScript({
      target: {
        tabId: tab.id,
      },
      world: 'MAIN',
      args: [],
      func: async () => {
        await globalThis.extension_writer.close();
        delete globalThis['extension_readable'];
        delete globalThis['extension_writable'];
      },
    });
    streaming = false;
  }
});

Neither approach is a substitute for the direct functionality of onfetch and respondWith() in a Service Worker, which is the design pattern that can be used for streaming responses.
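
For comparison, the Service Worker pattern referred to here looks roughly like this; the ext_stream query check only mirrors the URL used in the examples above and is purely illustrative:

// sw.js: a minimal sketch of onfetch + respondWith() streaming a response.
self.addEventListener('fetch', (evt) => {
  if (new URL(evt.request.url).search.includes('ext_stream')) {
    const encoder = new TextEncoder();
    const stream = new ReadableStream({
      start(controller) {
        controller.enqueue(encoder.encode('chunk 1'));
        controller.enqueue(encoder.encode('chunk 2'));
        controller.close();
      },
    });
    evt.respondWith(new Response(stream, { status: 200 }));
  }
});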

Related #7863.
