Binding/trigger for Azure Data Lake Store #353
Comments
Yes. Can we get a trigger for files placed in Data Lake Store, similar to blob triggers for Azure Blob Storage? |
Yes, longing for this one also. |
Yes, yes, yes!!!! Do custom WebJobs trigger extensions work for Functions? If so, how? |
+1 |
This would be incredibly helpful. Any updates on this? |
Also agree that this would be super useful. Any progress / updates? |
This feature is not a high priority for us right now, but I will note that the announcement for Azure Event Grid listed Data Lake as one of the integrations they are building. Once you can subscribe to Data Lake updates through Event Grid, running an Azure Function would be trivial (see here for some info). So I think this is the most likely way that the scenario would be enabled. I noticed that Event Grid integration for Data Lake is tracked in UserVoice here - I would encourage everyone following this issue to comment there to help the Data Lake team prioritize this work. |
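To make the Event Grid route concrete: once Data Lake updates flow through Event Grid, the function side would mostly be parsing the delivered event batch and reacting to file-created events. A minimal sketch of that parsing, using only the standard library (the event shape follows Event Grid's `Microsoft.Storage.BlobCreated` schema; the helper name and the account/container names are hypothetical):

```python
import json

def extract_created_paths(event_grid_payload: str) -> list:
    """Pull the created-file URLs out of an Event Grid batch.

    Event Grid delivers events as a JSON array; for storage events the
    file URL is in data.url. (Hypothetical helper, not an SDK API.)
    """
    events = json.loads(event_grid_payload)
    return [
        e["data"]["url"]
        for e in events
        if e.get("eventType") == "Microsoft.Storage.BlobCreated"
    ]

# Example payload in the shape Event Grid sends for a file creation:
payload = json.dumps([
    {
        "eventType": "Microsoft.Storage.BlobCreated",
        "subject": "/blobServices/default/containers/landing/blobs/input.csv",
        "data": {"url": "https://mylake.dfs.core.windows.net/landing/input.csv"},
    },
    {"eventType": "Microsoft.Storage.BlobDeleted", "data": {"url": "https://mylake.dfs.core.windows.net/landing/old.csv"}},
])

print(extract_created_paths(payload))
# → ['https://mylake.dfs.core.windows.net/landing/input.csv']
```

The deleted-file event is filtered out, so the function only reacts to new arrivals, mirroring how a blob trigger behaves. |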
Hi, I want to follow up on this thread: is Azure Functions integrated with ADL yet, or do we still need to handle this via Event Grid? |
@zhangruiskyline There has been some progress, courtesy of @joescars, see here. Note that there are no official releases of this yet (you would need to clone and build yourself, and there is no ETA for this to change). |
is this possible to do yet? |
@alexgman The state is unchanged, see above for the relevant links. |
Does anyone have a solution for this yet? |
You can now use an ADLS Gen2 directory (namespace) as a trigger for an Azure Function. The problem comes in with permissions for the Function: I have not found a way to grant the Function permissions to ONLY the directory from which it's triggered. The Function requires permissions to the ENTIRE Data Lake. Example:
Problems:
Desired state:
|
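Until directory-scoped permissions are possible, one mitigation is to filter inside the function so its code only ever acts on files under its own directory. This is a sketch only (the directory name is an assumption); it does not shrink the permission scope the comment above describes, since the Function identity can still see the whole lake, but it keeps the function's logic from touching anything outside its directory:

```python
from urllib.parse import urlparse

# The one directory this function is meant to handle (assumed name).
WATCHED_PREFIX = "/landing/incoming/"

def should_process(file_url: str) -> bool:
    """Return True only for files under the watched directory.

    Defense-in-depth guard, not a security boundary: the identity's
    RBAC/ACL grant on the whole lake is unchanged.
    """
    path = urlparse(file_url).path
    return path.startswith(WATCHED_PREFIX)

print(should_process("https://mylake.dfs.core.windows.net/landing/incoming/a.csv"))  # → True
print(should_process("https://mylake.dfs.core.windows.net/other/dir/b.csv"))         # → False
```

|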
Much like it's useful to process incoming blobs on their way into blob storage, it could be useful to process files landing in Azure Data Lake Store. For example, for a file format that has no ADLA extractor, an Azure Function could process the file and create a metadata file more suitable for the purpose.
Anything in the pipeline for this?
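The metadata-file idea in the issue body could be sketched as a small sidecar generator: given a landed file's bytes, emit a JSON summary that a downstream ADLA job could read instead of parsing the raw file. The sidecar format here is invented for illustration:

```python
import hashlib
import json

def build_metadata(name: str, data: bytes) -> str:
    """Build a JSON sidecar describing a landed file: size, line
    count, and an MD5 checksum. (Hypothetical format, not an ADLA
    or Functions convention.)"""
    line_count = data.count(b"\n")
    if data and not data.endswith(b"\n"):
        line_count += 1  # count a trailing partial line
    meta = {
        "name": name,
        "size_bytes": len(data),
        "line_count": line_count,
        "md5": hashlib.md5(data).hexdigest(),
    }
    return json.dumps(meta, indent=2)

print(build_metadata("input.csv", b"a,b\nc,d\n"))
```

A trigger handler would call `build_metadata` on the incoming file and write the result next to it (e.g. `input.csv.meta.json`).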