
Support Dynamic Binding scenarios (non .NET languages) #85

Open
pjeannet opened this issue Mar 11, 2016 · 44 comments

@pjeannet

At first I understood that we could pass some parameters to the output bindings in function.json ({name} is used in your samples) by adding them to the object passed to the done function, but it seems these parameters are taken from the original input message. Do you think it would be possible to use the object passed to done as parameters for output bindings? (This could be used to define the blob name, the content, etc.)

@mathewc

This comment was marked as outdated.

@christopheranderson christopheranderson added this to the backlog milestone Mar 12, 2016
@pjeannet
Author

I just tried, and though the property is in the bindingData object (checked with context.log(context.bindingData)), an exception is thrown: System.InvalidOperationException: No value for named parameter.

Full error stack:

Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.getFilesFromTar ---> System.InvalidOperationException: No value for named parameter 'fileName'.
   at Microsoft.Azure.WebJobs.Host.Bindings.Path.BindingTemplate.Bind(IReadOnlyDictionary`2 parameters)
   at Microsoft.Azure.WebJobs.Script.Binding.BlobBinding.<BindAsync>d__1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Script.Description.NodeFunctionInvoker.<ProcessOutputBindingsAsync>d__1b.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Script.Description.NodeFunctionInvoker.<Invoke>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`1.<InvokeAsync>d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithWatchersAsync>d__31.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithLoggingAsync>d__2c.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithLoggingAsync>d__13.MoveNext()
--- End of inner exception stack trace ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<ExecuteWithLoggingAsync>d__13.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.<TryExecuteAsync>d__1.MoveNext()

@pjeannet
Author

Did some other tests: even if I change the original property in bindingData, it does not change anything. I think the property in bindingData.QueueTrigger is the one being used, and it cannot be changed (or I did not succeed in changing its value).

@mathewc
Member

mathewc commented Mar 14, 2016

Yeah, you're right: it turns out you can't change those values. So this scenario isn't currently enabled, I'm afraid. I'll leave this open so we can investigate further.

@pjeannet
Author

ATM I'm adding a nameOut property to the messages; a bit ugly, but it works until the scenario is enabled :)

@mathewc mathewc changed the title Binding out pass variables Support Dynamic Binding scenarios Apr 5, 2016
@mathewc mathewc modified the milestones: Next, backlog Apr 5, 2016
@mathewc
Member

mathewc commented Apr 5, 2016

@fabiocav
Member

fabiocav commented Apr 5, 2016

Yes! @paulbatum and I were just having a conversation about that!
@davidebbo ran into an issue with a simple scenario that would benefit from that as well.

@mathewc
Member

mathewc commented Apr 5, 2016

It's been on my radar - just haven't had time to finish it. I'll get to it soon. What is @davidebbo's scenario?

@davidebbo
Contributor

My scenario was to add a queue trigger and blob output via the UI (keeping everything default). Then the code has out string myBlob:

public static void Run(string myQueueItem, TraceWriter log, out string myBlob)
{
    myBlob = "Hello";
}

resulting in:

Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.QueueTriggerCSharp1'. Microsoft.Azure.WebJobs.Host: No binding parameter exists for 'name'.

While I now sort of understand why it doesn't work, the error handling made it difficult to figure out. It's probably more a case of needing to improve the errors.

@mathewc
Member

mathewc commented Apr 7, 2016

For Node.js, the dynamic binding capability will allow the storage account (i.e. the "connection") value to be specified along with the binding, something like the snippet below, where the same set of function.json values for a binding can be specified:

context.bindings.bind({
    connection: 'myConnection',
    type: 'blob',
    path: 'documents/1234'
});

However, currently for C#, IBinder doesn't allow one to specify any additional parameter attributes for the bind; it always targets the default storage account. To enable multi-account support for Functions, we've introduced IBinderEx in the core SDK, which is what we're leveraging for all the non-dynamic bindings currently across languages. That means for C#, IBinder scenarios can currently only target the default storage account (the one pointed to by AzureWebJobsStorage). C# code can upcast to IBinderEx to do this, but that is messy for users. We had to introduce IBinderEx to avoid breaking changes to IBinder.

Our options are either to embrace IBinderEx for these scenarios, or to add a Connection property to all the inbuilt SDK attributes (yuck). Need to gauge how common multi-account IBinder scenarios are for C#.

@jamesdixon

Just wanted to check in on this feature to see if there is any timetable or workaround.

I'm using Node.js and my scenario is pretty simple: taking the canonical image example, I'd like to be able to generate a new thumbnail image that lives in the same container. For example, if the input image is image.jpg and the output container is images, the generated thumbnail would live at images/image_thumb.jpg. Currently, I can't seem to find a way to alter the {name} parameter to allow this to work.

@jamesdixon

@pierre-weceipt could you be more specific on how you got this to work?

Thanks!

@pjeannet
Author

@jamesdixon To be honest, I don't remember much of these tests. We put this project aside and it's been a while since I worked on this code. After looking at it again, what I understand is that I added a fileNameOut property to the message passed to the next queue. Here is an extract of the code, but unless I find some time to rerun the whole project in debug mode I can't explain much more, sorry :(

var message = { container: 'fileslist-to-process', fileNameIn: item.fileNameOut, fileNameOut: 'fileName' };
context.done(null, {
    blob: blob,
    message: message
});

and the function.json

{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "tar-to-process"
    },
    {
      "type": "blob",
      "name": "blob",
      "direction": "out",
      "path": "fileslist-to-process/{fileNameOut}"
    },
    {
      "type": "queue",
      "name": "message",
      "direction": "out",
      "queueName": "fileslist-to-process"
    }
  ]
} 

@solvingj

We're also interested in dynamic out bindings, but for a slightly different use case.

Our preliminary application roadmap has more than 20 functions, all of which can and will be chained together in multiple workflows. There is very little guidance beyond very simple use cases, so the vision we came up with is to use an approach that is rooted in functional programming.

We wish to treat each function like a "monad" which will output to one of two associated channels: whether they be Storage Queues, ServiceBus Topics, or EventHubs. One will be for success, the other for fail. This will allow us to build sensible and efficient chains of functions, where each function only executes and processes an input when it has work to do. Currently, the design for chaining functions seems to encourage a pipeline where functions have to receive and analyze input in order to determine whether the previous function was successful, and then have multiple behaviors depending on the content of the input. This does not feel like an efficient paradigm.

What our vision would require is to be able to specify "queueName" or the like during execution. Alternatively, we'll just need to skip the output bindings altogether and do everything from within the function. To be honest, it's perfectly reasonable to do it that way, but I thought it was worth bringing up in this thread.
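
A minimal sketch of the success/failure routing described above, using today's declarative bindings (the queue names, binding names, and doWork helper are hypothetical): each function declares two queue output bindings and assigns only one of them per invocation.

function.json (excerpt):

{
  "bindings": [
    { "type": "queueTrigger", "direction": "in", "name": "workItem", "queueName": "work-items" },
    { "type": "queue", "direction": "out", "name": "successQueue", "queueName": "work-succeeded" },
    { "type": "queue", "direction": "out", "name": "failureQueue", "queueName": "work-failed" }
  ]
}

index.js:

// Placeholder for the function's actual processing step.
async function doWork(item) {
    return { processed: item };
}

module.exports = async function (context, workItem) {
    try {
        const result = await doWork(workItem);
        context.bindings.successQueue = result;                                   // written only on success
    } catch (err) {
        context.bindings.failureQueue = { input: workItem, error: err.message };  // written only on failure
    }
};

The limitation the rest of this thread discusses still applies: the queue names themselves are fixed in function.json and cannot be chosen at execution time.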

@sabbour

sabbour commented Sep 28, 2017

Hello, is there an ETA for this?

@lifan0127

Any update?

@paulbatum
Member

No update, no ETA.

@mathewc mathewc changed the title Support Dynamic Binding scenarios Support Dynamic Binding scenarios (non .NET languages) Mar 8, 2018
@scottadmi

scottadmi commented May 1, 2018

If dynamic binding is not yet supported, is it at least possible to determine the name of the file path that is generated for you, i.e. to log the UUID for the stored blob in a different channel? It seems odd that feature parity is so distant for JavaScript bindings.

@GuyHarwood

Are dynamic/overridable input bindings supported in JavaScript?
I would like to override the queue input binding via local.settings.json for local development, without modifying function.json and accidentally checking in changes to that file.
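
One option that might cover this, as a sketch (assuming the standard %AppSetting% binding-expression resolution in function.json; MyQueueName is a made-up setting name): point queueName at an app setting, then give that setting a local value in local.settings.json.

function.json (trigger binding excerpt):

{
  "type": "queueTrigger",
  "direction": "in",
  "name": "myQueueItem",
  "queueName": "%MyQueueName%",
  "connection": "AzureWebJobsStorage"
}

local.settings.json (local development only, not checked in by default):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "MyQueueName": "my-local-dev-queue"
  }
}

In Azure, the same MyQueueName app setting can be given the production queue name, so function.json never needs to change between environments.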

@donniekerr

Any 2019 updates on this?

@adriennn

> Any 2019 updates on this?

This is how you do this in Node.js (as per the docs). The example below assumes you have the binding extensions installed.

Let's assume your function is an HTTP trigger; send a POST request with the body:

{
  "name": "yes"
}

in function.json

    {
      "name": "InternalBlobReference",
      "type": "blob",
      "path": "blobpath/{name}.json",
      "connection": "AzureWebJobsStorage", //default, tells about the storage account to be used
      "direction": "out"
    }

in index.js

module.exports = async function (context, req) {
    context.bindings.InternalBlobReference = { "something": "This is dog." };
    context.done();
};

and you will have the value {"something":"This is dog."} in a file named yes.json in your blob container.

@donniekerr

Thank you adriennn,
This is good information. Before I saw this, I ended up just using the @azure/storage-blob uploadStreamToBlob in the function directly instead of bindings.

Easier dynamic bindings would be a good feature to add to Azure Functions in the future. Ideally, any variable set at any time on context would work in {} in the function.json file.

Thanks!
Donnie

@alexgman

This may or may not have relevance here, but is there a way to conditionally apply output binding attributes to a function? Let's say I want to bind output to blob storage only if today is Tuesday, or only if some parameter meets some condition. How do I accomplish this?

@paulbatum
Member

@alexgman The way you do this is to configure the output binding for blob storage, but then in your code only assign a value if it's Tuesday.

@alexgman

alexgman commented Mar 30, 2019 via email

@paulbatum
Member

Sure, here's an unmodified code snippet from our reference doc:

module.exports = async function(context) {
    let retMsg = 'Hello, world!';
    context.bindings.httpResponse = {
        body: retMsg
    };
    context.bindings.queueOutput = retMsg;
    return;
};

Here is a modified version, based on your Tuesday rule:

module.exports = async function(context) {
    let retMsg = 'Hello, world!';
    context.bindings.httpResponse = {
        body: retMsg
    };
    if (getToday() == "tuesday") { // getToday is a made-up function, you get the idea
      context.bindings.queueOutput = retMsg;
    }
    return;
};
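
For context, a function.json that could pair with the snippet above might look roughly like this (the binding names httpResponse and queueOutput are taken from the code; the trigger and queue details are assumptions):

{
  "bindings": [
    { "type": "httpTrigger", "direction": "in", "name": "req", "methods": [ "get", "post" ] },
    { "type": "http", "direction": "out", "name": "httpResponse" },
    { "type": "queue", "direction": "out", "name": "queueOutput", "queueName": "outqueue", "connection": "AzureWebJobsStorage" }
  ]
}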

@alexgman

alexgman commented Mar 31, 2019 via email

@paulbatum
Member

paulbatum commented Apr 1, 2019 via email

@alexgman

alexgman commented Apr 1, 2019 via email

@paulbatum
Member

@alexgman I suggest you file a new issue. The images you pasted didn't come through and it really sounds like you're asking something different to what is being discussed here.

@r-tanner-f

Did I misunderstand adriennn's post or is that still a static binding?

My use case is unzipping files. I need to unzip, rename, and place in blob storage. Could be anywhere from 1 to 100 files in the zip.

@adriennn

adriennn commented May 9, 2019

@r-tanner-f it's 'static' within the lifetime of your function, as the bindings are defined by the triggers. So in your case, if you have hundreds of files, this means running the same function hundreds of times with a single new binding for each file. I wouldn't use the bindings, and not JS or Java either, since the files would be handled in-memory. I would write a Python function and use the Azure blob Python SDK to read all the file paths in the blob container directly and process them one at a time, writing to disk with Python and then putting them back into the blob. If you are going to have many more files, then Functions are not a scalable solution. Lastly, a more technical solution would be to use the Azure CLI to access the blob metadata or files and process the files with Linux and put them back to the blob; this however requires a VM. It would be nice if Microsoft developed a CLaaS for Azure (Command Line as a Service).

PS: there is a lot of talk on MSDN about having a *.file binding, meaning access to many files with a wildcard match; the solution to this is using Event Grid, which brings you back to running the function n times for n files.

@adriennn

adriennn commented May 9, 2019

> Paul, I'm attempting to use some out parameters, and they work AWESOME when I assign a value to

@alexgman I don't know much about C#, but if a function expects a parameter in order to work, then it's going to throw an error if the parameter is not passed and its absence is not handled; you need to try/catch.

@GuyHarwood

> a more technical solution would be to use the Azure CLI to access the blob metadata or files and process the files with Linux and put them back to the blob; this however requires a VM. It would be nice if Microsoft developed a CLaaS for Azure (Command Line as a Service).

You could also do this as a build job / pipeline task in Azure DevOps.

@eaquiler

This has been around for a long time, but I must ask again since I cannot see an answer to @scottadmi's question. Since I cannot find a way to set the output file name, is it possible to get the generated UUID so I can pass it forward?

@MaxHogervorst

Any ETA on this?

@alexgman

alexgman commented Jun 21, 2019 via email

@MaxHogervorst

Setting a dynamic name for non-.NET languages.

@Vijay-Karthick

> this is how you do this in nodejs (as per the docs). […]

Thank you @adriennn. This worked for me.

@d3vAdv3ntur3s

Any idea on how to use the dynamic naming for multiple blob outputs in the Java functions API, if it is even supported as of yet?

I've tried a combination of @BlobAttribute with @BlobOutput with no luck.

Thanks :)

@axof

axof commented Aug 27, 2021

So, after 5 years of this issue, there's still no way of doing dynamic bindings with non .NET languages... I'd also like to be able to change the name of the blob I want to output with Node.js. The need to use a random guid makes no sense at all.

@aleromano92

I searched the internet for two hours because I couldn't believe this wasn't possible, then I stumbled here.
Do you have any estimate of when naming a blob will be possible?

@adriennn

adriennn commented Dec 7, 2021

@axof and @Axel92Dev these threads are a bit difficult to navigate, but it's indeed possible; there's an example for JS, see #85 (comment)

@aleromano92

aleromano92 commented Dec 7, 2021

> @axof and @Axel92Dev these threads are a bit difficult to navigate, but it's indeed possible; there's an example for JS, see #85 (comment)

Not at all, unfortunately: in that example, the value of the {name} variable comes from an HTTP request parameter.
You are not able to set a value for that {name} from inside the function body.

My Azure Function is invoked on a new message on a Service Bus subscription, so I cannot invoke it with different parameters.

I ended up importing the Azure Blob Storage SDK inside my function and programmatically storing the file, customizing whatever I want.
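
For anyone landing here later, a minimal sketch of that SDK-based workaround (assuming @azure/storage-blob v12; the container name, blob-naming rule, and message shape are placeholders):

const { BlobServiceClient } = require("@azure/storage-blob");

module.exports = async function (context, serviceBusMessage) {
    // Name the blob however you like, e.g. derived from the incoming message.
    const blobName = `${serviceBusMessage.id}.json`;

    // Reuses the default storage connection string; any other connection-string app setting works too.
    const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AzureWebJobsStorage);
    const containerClient = blobServiceClient.getContainerClient("images");
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);

    const content = JSON.stringify(serviceBusMessage);
    await blockBlobClient.upload(content, Buffer.byteLength(content));

    context.log(`Wrote ${blobName} without an output binding`);
};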
