
Out of proc languages stream support #1361

Open
mathewc opened this issue Apr 3, 2017 · 51 comments

Comments

@mathewc
Member

mathewc commented Apr 3, 2017

For all Node.js bindings (trigger/input/output), streaming is not supported - the data is put into an in-memory buffer first. This can cause out-of-memory issues for people attempting to process large blobs, for example.

We should investigate ways to support a streaming model, as you can optionally do in C# (by binding to a Stream). E.g. we might have a binding hint that allows the function to request a stream, and we provide the user code with a stream over the underlying SDK stream.

Note that if we move out of process for languages, streaming is complicated. We need to consider this when making any new programming model decisions.
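To make the proposal concrete, here is a purely hypothetical TypeScript sketch of what a stream-based binding could look like from Node.js user code; the "stream" dataType hint and the stream-valued blob parameter do not exist today and are illustration only.

// Hypothetical sketch only: assumes the host could hand the function a Node.js Readable
// over the underlying SDK stream instead of a fully buffered Buffer. None of this is a
// real Azure Functions API today.
import { Readable } from "node:stream";
import { createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";

// function.json (hypothetical): { "name": "blob", "type": "blobTrigger", "dataType": "stream", ... }
export async function run(context: { log: (msg: string) => void }, blob: Readable): Promise<void> {
  // Pipe the incoming blob straight through without holding it all in memory.
  await pipeline(blob, createWriteStream("/tmp/blob-copy.bin"));
  context.log("blob streamed to disk");
}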

@mathewc mathewc changed the title from "Node.js input buffering" to "Node.js binding Stream support" on Apr 3, 2017
@christopheranderson
Contributor

Related: #1319

@SteveALee

This will also allow realistic static web serving of file-based resources like HTML, CSS, images, etc.

@securityvoid

I'd love to see this.

@goofballLogic

node without streams? really?

@SteveALee

Agreed, this should be a very high priority for JS/TS to have first-class support.

@securityvoid

@goofballLogic Yep :-). Based on my interactions it seems like there isn't a very large base of people using NodeJS with Azure Functions (it at least seems to always be the same 5-6 names on any GitHub ticket). Given the small number of people, the product group is doing an amazing job working through issues anyway. It's gotten a TON better in the last ~year. With that said, some silly things like this seem to take time.

If I had noticed https://github.com/lambci/lambci earlier I probably would have stuck with AWS Lambda. The big thing I was missing on the Lambda side was CI/CD, and I didn't want to pay for another service to get it. However, at this point things work well enough, and the larger file support that Azure Functions offers vs. Lambda makes several things I'm doing a lot easier.

Add in streaming support, however, and the larger file support would be an even bigger reason to choose Azure Functions.

Azure Functions has the potential to be an amazing offering for NodeJS CI/CD function development. It's getting there, but isn't quite 100% there yet.

@goofballLogic

goofballLogic commented Aug 10, 2017 via email

@securityvoid

@goofballLogic It all depends what you're doing. If you're dealing with small files, then Lambda is probably faster / more consistent, etc. However, take a close look at these Lambda limits:
http://docs.aws.amazon.com/lambda/latest/dg/limits.html

One of our projects was uploading/processing a ~6.5 MB XML file. Lambda couldn't handle that. I'd say that's even more core than streaming. This is kind of what happens when you're on the bleeding edge, I guess.

I would say in general Lambda still seems more polished than Azure Functions, but that gap is quickly closing, and even Lambda has its stupid stuff (e.g. file limits).

Now there are ways around this in Lambda. You can upload to S3 instead of sending the file to the Lambda function, then retrieve the file from S3 inside the Lambda function and process it. However, this is a LOT more code than just handling the request.

With both platforms being pretty horrible to debug once deployed, more code = more trouble. As a result, any serverless stack is definitely a place where the KISS principle applies.
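For reference, a minimal sketch of that S3-first workaround, assuming the AWS SDK v3 packages (@aws-sdk/client-s3 and @aws-sdk/s3-request-presigner); the bucket and key names are placeholders.

// Sketch of the workaround: the client uploads the large file straight to S3 via a
// presigned URL, and the Lambda later reads it back from S3 instead of receiving it
// in the size-limited invocation payload.
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({});

// Step 1: hand the browser a presigned PUT URL (valid for 15 minutes) so the upload
// bypasses the Lambda entirely.
export async function getUploadUrl(key: string): Promise<string> {
  return getSignedUrl(s3, new PutObjectCommand({ Bucket: "my-uploads", Key: key }), {
    expiresIn: 900,
  });
}

// Step 2: inside the Lambda, pull the object back down and process it (e.g. the ~6.5 MB XML).
export async function processUpload(key: string): Promise<string> {
  const obj = await s3.send(new GetObjectCommand({ Bucket: "my-uploads", Key: key }));
  return obj.Body!.transformToString();
}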

@securityvoid

@goofballLogic I should mention, things got a lot faster and more reliable with NodeJS on Azure when I WebPacked things into their own orphaned git branch (See https://github.com/securityvoid/pack-git ). If you play with Azure much you should check it out.

@SteveALee

I decided not to clutter up my main repo with build artifacts. I was also thinking that every function will have a full copy of all its deps in its bundle file. The storage is limited, but at least we don't pay for it ;)

@SteveALee

There is also https://github.com/Azure/azure-functions-pack

@jstuparitz

Is there any update on this request? Would love to be able to stream files

@pgraham

pgraham commented Feb 28, 2018

This lack of support kind of makes the blob storage service unfeasible for Node projects; would love to see this implemented.

@anton-bot

Can someone clarify - is there still no way to serve a binary file from an Azure Function using Node?

@markmaynard

Would really like this functionality in the node stack. Having to pull a blob into memory seems like a big miss.

@Bomret

Bomret commented Aug 17, 2018

Is there any update on this?

@opensourcekam

Update?

@Adithya1894

Any update on this? I am trying to achieve the exact same thing, to stream the blob.

@asavaritayal asavaritayal changed the title from "Node.js binding Stream support" to "Out of proc languages stream support" on Jan 18, 2019
@gentksb

gentksb commented Feb 4, 2019

Any update?
Streaming is very important for FaaS, since memory and storage are limited...

@restfulhead

Is there any update on this issue? I can just repeat @adesmet's question: how in the world is this not a priority? IMHO streaming is basic functionality of a web framework. Directing users to .NET is not a good look if you're promoting JAMstack support on Azure...

@danielgary

Just came across this as well while trying to support Remix on Azure. Either Azure Static Apps needs to support fallback Function calls or Functions need to support streaming.

@thomasgauvin

thomasgauvin commented Dec 15, 2021

Is there any update regarding this? This issue seems to have been on the backlog for 4 years; are there any plans to support streams for languages other than C#?

(EDIT: I made this comment as an individual developer, I'm now on the Azure Static Web Apps team and am advocating for this with the Azure Functions team)

@Sreini

Sreini commented Mar 3, 2022

I agree this needs to be addressed immediately. I am deep into a Java project with Azure Functions and am only now finding out that core functionality like this is not supported.

@ChuckJonas

Any update? This is just another fundamental feature missing from Azure Functions....

@sig9

sig9 commented Nov 20, 2022

I've been watching this for soooo long. Was excited to see a github notification on this topic. alas... another bump.

Any word from dev team or product managers?

We can't even do short-lived SSE responses. FWIW, Cloudflare Workers support this.

If this ever makes it to production, please also support it in azure static web app functions without needing BYO functions.

@cwilliams7

This is now supported in AWS Lambda: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/

Hope Azure catches up.

@thomasgauvin

thomasgauvin commented Jun 2, 2023

> I've been watching this for soooo long. Was excited to see a github notification on this topic. alas... another bump.
>
> Any word from dev team or product managers?
>
> We can't even do short-lived SSE responses. FWIW, Cloudflare Workers support this.
>
> If this ever makes it to production, please also support it in azure static web app functions without needing BYO functions.

(EDIT: PM of Azure Static Web Apps here)

@sig9 I'm very aware of the need for this in Azure Static Web Apps, and hopefully we can draw attention to this so we can enable it for Azure Static Web Apps as well.

Thanks everyone for responding & contributing to this thread!

@sig9

sig9 commented Jun 4, 2023

@thomasgauvin The super compelling use case for this is streaming OpenAI responses back to the browser.
@azure/openai RequestOptions has a stream?: boolean property. If you set it to true, the completion responses will stream back via SSE. Using Nuxt 3, I was happy with myself having proxied those responses back to the browser for a great user experience. Unfortunately, it only worked with the Node dev server. When I deployed to Azure SWA, my joy was short-lived.

Then I remembered, oh yeah, I couldn't stream back a large file blob unless I used C#, so why would SSE magically work...
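For anyone wondering what that proxying looks like, here is a rough sketch of the pattern on a plain Node server (the part that works with a local dev server but not behind a buffering host); the upstream URL is a placeholder for whatever endpoint emits text/event-stream.

// Rough sketch only: forwards an upstream SSE response to the browser chunk by chunk.
// A host that buffers the whole response body defeats this, which is the problem
// described in this thread. The upstream URL is a placeholder.
import { createServer } from "node:http";
import { Readable } from "node:stream";

createServer(async (req, res) => {
  const upstream = await fetch("https://example.invalid/openai-sse"); // placeholder endpoint
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });
  // Convert the web ReadableStream to a Node stream and pipe chunks as they arrive.
  Readable.fromWeb(upstream.body as any).pipe(res);
}).listen(3000);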

@m14t

m14t commented Jul 19, 2023

Similar to @sig9, this is a crucial feature for using OpenAI on Azure Functions. Since OpenAI's responses are somewhat slow, streaming this data to the user is critical for an acceptable experience.

Linking: Azure/azure-functions-nodejs-library#97

With @azure/functions version 4.0.0-alpha.9, I noticed that it was using undici under the hood and tried the following, which results in a successful but buffered / non-streaming response:

import {
  app,
  HttpRequest,
  HttpResponse,
  InvocationContext,
} from "@azure/functions";
import { ReadableStream, TextEncoderStream } from "node:stream/web";

const DELAY = 2000;

export function handler(
  request: HttpRequest,
  context: InvocationContext
): HttpResponse {
  context.log(`Http function processed request for url "${request.url}"`);

  const stream = new ReadableStream({
    start(controller: any) {
      controller.enqueue(`
        <!doctype html>
        <html lang=en-US>
        <body>
          The time is: ${new Date().toISOString()}<br /><br />
      `);
      setTimeout(() => {
        controller.enqueue(`
            ${DELAY}ms later it is now ${new Date().toISOString()}
          </body>
        `);
        controller.close();
      }, DELAY);
    },
  });

  return new HttpResponse({
    body: stream.pipeThrough(new TextEncoderStream()),
    status: 200,
    headers: {
      "Content-Type": "text/html; charset=utf-8",
    },
  });
}

app.http("test-streaming", {
  methods: ["GET", "POST"],
  authLevel: "anonymous",
  handler,
});

@thomasgauvin

thomasgauvin commented Jul 19, 2023

> @thomasgauvin The super compelling use case for this is streaming OpenAI responses back to the browser. @azure/openai RequestOptions has a stream?: boolean property. If you set it to true, the completion responses will stream back via SSE. Using Nuxt 3, I was happy with myself having proxied those responses back to the browser for a great user experience. Unfortunately, it only worked with the Node dev server. When I deployed to Azure SWA, my joy was short-lived.
>
> Then I remembered, oh yeah, I couldn't stream back a large file blob unless I used C#, so why would SSE magically work...

Yep, I'm aware that this is a limiting factor when building OpenAI apps (I'm blocked by this myself). The Functions team has confirmed that they're working on this, starting with out-of-proc .NET, and this should come to JS after that. All the feedback collected in this thread is invaluable to help push for this, especially as it is seemingly a blocker for building OpenAI apps that depend on Functions.

@c-eiser13

@m14t I was facing the same limitation with a Node function, so I created a C# function and utilized this library to implement streaming, and it is working correctly. I call this from a React application and the data is streamed properly. Some of the function implementation was omitted for brevity.

OpenAIAPI api = new OpenAIAPI("get from vault or environment variable");
//convert JSON string of messages into an array.
JArray messages = JArray.Parse(strMessages);
List<ChatMessage> chatMessages = messages.ToObject<List<ChatMessage>>();
//List<ChatMessage> chatMessages = new List<ChatMessage>();
var modelVal = Model.ChatGPTTurbo;
var temp = 0.7;
ChatRequest request = new ChatRequest()
{
    Model = modelVal,
    Temperature = temp,
    Messages = chatMessages
};
var response = req.HttpContext.Response;
response.Headers.Add("Content-Type", "text/event-stream");
response.Headers.Add("Cache-Control", "no-cache");
response.Headers.Add("Connection", "keep-alive");
response.StatusCode = 200;
await using var stream = response.Body;
var responseTokens = 0;
await foreach (var token in api.Chat.StreamChatEnumerableAsync(request))
{
    Console.Write(token.Choices.First().Delta.Content);
    var content = token.Choices.First().Delta.Content;
    if (content != null)
    {
        var contentBytes = Encoding.UTF8.GetBytes(token.Choices.First().Delta.Content);
        await stream.WriteAsync(contentBytes);
        await stream.FlushAsync();
        responseTokens++;
    }
}
return new EmptyResult();

@thomasgauvin

Related issue that focuses on returning streams from JS Functions: Azure/azure-functions-nodejs-library#97

@Petryxasport

@c-eiser13 Sounds good!!! What version of the function app did you create in .NET?
[screenshot]

@c-eiser13

@Petryxasport this is the runtime version in the Azure Portal:
[screenshot]

When creating the solution in VS, I used the Azure Functions C# template.

@Petryxasport

@c-eiser13 So, I have created the same function and we can get answers from OpenAI on the Azure Function; however, I still have an issue with the front-end part, which is based on a React application. We are using fetchEventSource from Microsoft and cannot get streaming.
Could you please share details about your React application or share your code?

@c-eiser13

@Petryxasport on the React side, when a message is typed into the chat window and the send button is clicked, here is how that is handled. Messages is my array of {role: string, content: string} objects.

const response = await service.StreamCompletion(
  messages,
  client,
  props.model ?? "gpt-3.5-turbo",
  props.temperature ?? 0.7
);
const reader = response.body.getReader();
const decoder = new TextDecoder("utf-8");
let sseData = "";
const responseObj = {
  role: "assistant",
  content: "",
  time: format(new Date(), "h:mm aaa"),
  id: getRandomString(8)
};
const copy = cloneDeep(messages);
copy.push(responseObj);
const readStream = () => {
  reader.read().then(({ done, value }) => {
    if (done) {
      setState((prev) => mergeState(prev, { loading: false }));
      console.log("SSE stream closed");
      return;
    }

    sseData += decoder.decode(value, { stream: true });
    responseObj.content = sseData;
    setState((prev) => mergeState(prev, { messages: copy }));
    setTimeout(() => {
      if (layout === "SingleWebPartAppPage") {
        const element = document.querySelector('[data-automation-id="contentScrollRegion"]');
        if (element) {
          element.scrollTo({ top: element.scrollHeight, behavior: "smooth" });
        }
      } else {
        ref?.current?.scrollTo({ top: ref.current.scrollHeight, behavior: "smooth" });
      }
    }, 100);

    readStream();
  });
};
readStream();

service.StreamCompletion is where I call my Azure Function, and it looks like this:

public StreamCompletion = async (messages: IChatMessage[], client: string, model: string, temperature: number) => {
    try {
      const response = await fetch(this.streamApi, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          messages: JSON.stringify(messages),
          client: client,
          model: model,
          temperatur: temperature,
        }),
      });
      return response;
    } catch (error) {
      console.error(`ChatService: StreamCompletion --> error streaming completion: ${error}`);
      throw Error(error);
    }
  };

@jeffzi19

Just pinging the thread to see if there are any updates here. I can use the async HTTP status URI after hacking around a bit, but it would really be nice to just stream the response.

@ejizba
Contributor

ejizba commented Jan 3, 2024

Hi @jeffzi19 which language are you targeting? This issue is generally used to track all languages, which each have a different status. I personally work on Node.js, which you can track using our Roadmap for general timelines or this issue for more specific details.

If you're using Python, here is their repo. I don't know if they have an issue tracking this, but I know they're working on it. If you don't see an issue feel free to create one.

I'm pretty sure .NET Isolated is already done. I don't think other languages are working on this yet but let me know if you have another language in mind and I can forward you to the right people who might know more.

@jeffzi19

@ejizba Thank you for the response! I am using Functions 4.0 with Python right now; I will take a look at their site. We looked into using .NET Isolated, but some of the libraries needed by our Python side are not compatible or don't have a solid enough .NET implementation to make it an option for us to change right now.

Thanks again!

@waynebaby

@jeffzi19 @ejizba

I want to confess that I am using .NET Isolated, and I had to use a Service Bus queue to simulate the streaming experience of OpenAI. When I get the streaming output, I send the generated tokens, grouped by second, to the SBQ, and return the full result after the completion is done.

SBQ supports AMQP over WebSockets; just don't put every token into a new message and the throughput will be fine. It should work with most development stacks.
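Here is a minimal TypeScript sketch of that batching idea, assuming @azure/service-bus; the queue name, connection string variable, and one-second flush window are placeholders mirroring the grouping described above.

// Sketch: buffer streamed tokens and flush them to a Service Bus queue roughly once per
// second, so each message carries a batch of tokens rather than a single token.
// Queue name and connection string are placeholders.
import { ServiceBusClient } from "@azure/service-bus";

export async function relayTokens(tokens: AsyncIterable<string>): Promise<string> {
  // AMQP over WebSockets can be configured via the client options if needed.
  const client = new ServiceBusClient(process.env.SERVICE_BUS_CONNECTION_STRING!);
  const sender = client.createSender("openai-stream");

  let batch = "";
  let full = "";
  let lastFlush = Date.now();

  try {
    for await (const token of tokens) {
      batch += token;
      full += token;
      if (Date.now() - lastFlush >= 1000 && batch.length > 0) {
        await sender.sendMessages({ body: batch }); // one message per ~second of tokens
        batch = "";
        lastFlush = Date.now();
      }
    }
    if (batch.length > 0) {
      await sender.sendMessages({ body: batch });
    }
  } finally {
    await sender.close();
    await client.close();
  }
  return full; // full completion returned once streaming is done, as described above
}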

@ejizba
Contributor

ejizba commented Feb 28, 2024

For those of you on Node.js, we just announced preview support for HTTP streams! 🎉 Learn more in our blog post
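For anyone landing here, a minimal sketch of what the preview looks like in the Node.js v4 programming model; this assumes @azure/functions with HTTP streaming enabled via app.setup, and details may change while the feature is in preview.

// Minimal sketch of the Node.js HTTP streams preview (assumes the v4 programming model
// with streaming enabled; preview behavior and options may change).
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";
import { Readable } from "node:stream";

// Opt in to HTTP streaming for the whole app.
app.setup({ enableHttpStream: true });

async function* ticks(): AsyncGenerator<string> {
  for (let i = 0; i < 5; i++) {
    yield `data: tick ${i} at ${new Date().toISOString()}\n\n`; // SSE-style chunks
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
}

app.http("stream-demo", {
  methods: ["GET"],
  authLevel: "anonymous",
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    context.log(`Streaming response for "${request.url}"`);
    // Returning a Node.js Readable as the body lets the host flush chunks as they are
    // produced instead of buffering the whole payload.
    return {
      status: 200,
      headers: { "Content-Type": "text/event-stream" },
      body: Readable.from(ticks()),
    };
  },
});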

@thomasgauvin

Super excited for the Azure Functions announcement of support for streaming of responses for Node.js Functions! 🎉 This will be a mission-critical feature for building AI apps with Azure Static Web Apps and using streams for returning large payloads, etc. Thanks everyone for the feedback in this thread!

@SteveALee

SteveALee commented Mar 5, 2024 via email
