
New resource: azurerm_stream_analytics_reference_input_blob #3633

Merged
14 commits merged Dec 17, 2019

Conversation

praneetloke
Contributor

I noticed that it was not possible to create inputs of type reference for a Stream Analytics job. The code to represent such a resource already existed, but it was never exposed as a usable resource.

I also added an example, which I tested using my own subscription. Here's a screenshot of how the "reference" input looks.

[Screenshot: Microsoft Azure portal, 2019-06-10]

@@ -0,0 +1,243 @@
package azurerm
Contributor Author

This code is pretty much an exact replica of the ..._stream_analytics_**stream**_input_blob.go.

props := streamanalytics.Input{
    Name: utils.String(name),
    Properties: &streamanalytics.ReferenceInputProperties{
        Type: streamanalytics.TypeReference,
Contributor Author

This value is what distinguishes a reference input from a streaming input to the Stream Analytics job.
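To illustrate the idea with a minimal stand-alone sketch (these are simplified stand-ins, not the actual streamanalytics SDK types, which carry serialization details omitted here): both input kinds implement a common properties interface, and the Type discriminator is what the service uses to tell them apart.

```go
package main

import "fmt"

// InputProperties is a simplified stand-in for the SDK's input
// properties interface.
type InputProperties interface {
	InputType() string
}

type StreamInputProperties struct{}
type ReferenceInputProperties struct{}

// The Type discriminator is what tells the two input kinds apart.
func (StreamInputProperties) InputType() string    { return "Stream" }
func (ReferenceInputProperties) InputType() string { return "Reference" }

func describe(p InputProperties) string {
	return fmt.Sprintf("input kind: %s", p.InputType())
}

func main() {
	fmt.Println(describe(ReferenceInputProperties{})) // input kind: Reference
	fmt.Println(describe(StreamInputProperties{}))    // input kind: Stream
}
```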

@@ -37,8 +37,8 @@ resource "azurerm_storage_container" "example" {
container_access_type = "private"
}

resource "azurerm_stream_analytics_stream_input_eventhub" "test" {
name = "eventhub-stream-input"
resource "azurerm_stream_analytics_stream_input_blob" "test" {
Contributor Author

Noticed the typos while reading the docs.

@praneetloke praneetloke changed the title Add blob reference input as a new resource for stream analytics input resources New resource: Add blob reference input as a new resource for stream analytics input resources Jun 17, 2019
@praneetloke
Contributor Author

Ping. Anyone?

@katbyte
Collaborator

katbyte commented Sep 14, 2019

As mentioned above, would it just make sense to add this ability to the other stream input blob resource?

@praneetloke
Contributor Author

@katbyte I am going to take another look at this and get back to you. As you can see, it has been a while since I opened this PR and I have to refresh my memory a bit :)

@ghost ghost removed the waiting-response label Sep 14, 2019
@praneetloke
Contributor Author

As mentioned above, would it just make sense to add this ability to the other stream input blob resource?

OK, I looked at this again. I do believe this needs to be its own resource. Unfortunately, there is some overlap in how this resource is parsed vs. how the stream input is parsed, but there are two places where we must explicitly use the reference input type: in the *CreateUpdate and the *Read functions. Maybe I don't know enough about adding a new resource to see how this could be part of the existing streaming input resource.

Just to be clear, the difference between the two types of input is that the existing one is a streaming input, whereas what this PR adds is a reference input. A reference input is a static source of data for a Stream Analytics job, whereas a streaming input continuously produces data.

Please let me know if this helps. I am happy to rework this if you have other ideas. I think for a Stream Analytics job, being able to add reference data inputs is an important feature.

@praneetloke
Contributor Author

ping @katbyte

Collaborator

@katbyte katbyte left a comment

Thanks for answering my questions; I've given this a review @praneetloke. Nothing major, but I have left some comments inline that should be addressed before merge.

Stream Analytics Reference Input Blob's can be imported using the `resource id`, e.g.

```shell
terraform import azurerm_stream_analytics_stream_input_blob.test /subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/group1/providers/Microsoft.StreamAnalytics/streamingjobs/job1/inputs/input1
```
Collaborator

I think you'll need to update this line

"path_pattern": {
Type: schema.TypeString,
Required: true,
},
Collaborator

Could we get a validate.NoEmptyStrings here?

Contributor Author

Done.

Contributor Author

Just curious, though. With Required: true, isn't NoEmptyStrings enforced automatically?

Collaborator

Nope, you can pass an empty string; all that is required is that something is assigned to the property.

Contributor Author

Ah gotcha. Would you like me to add validate.NoEmptyStrings here then?
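For context: Required only guarantees the attribute is set in the configuration; it does not reject a set-but-empty value. A minimal validator in the spirit of validate.NoEmptyStrings might look like the following (an illustrative reimplementation, not the provider's actual helper):

```go
package main

import (
	"fmt"
	"strings"
)

// noEmptyStrings rejects values that are set but empty (or whitespace-only).
// Illustrative sketch in the spirit of the provider's validate.NoEmptyStrings;
// not the actual implementation.
func noEmptyStrings(v interface{}, k string) (warnings []string, errors []error) {
	s, ok := v.(string)
	if !ok {
		errors = append(errors, fmt.Errorf("expected type of %q to be string", k))
		return
	}
	if strings.TrimSpace(s) == "" {
		errors = append(errors, fmt.Errorf("%q must not be empty", k))
	}
	return
}

func main() {
	// Required alone would accept this value; the validator does not.
	_, errs := noEmptyStrings("", "path_pattern")
	fmt.Println(len(errs)) // 1

	_, errs = noEmptyStrings("some/{date}/pattern", "path_pattern")
	fmt.Println(len(errs)) // 0
}
```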

Comment on lines 145 to 148
if d.IsNewResource() {
if _, err := client.CreateOrReplace(ctx, props, resourceGroup, jobName, name, "", ""); err != nil {
return fmt.Errorf("Error Creating Stream Analytics Reference Input Blob %q (Job %q / Resource Group %q): %+v", name, jobName, resourceGroup, err)
}

read, err := client.Get(ctx, resourceGroup, jobName, name)
if err != nil {
return fmt.Errorf("Error retrieving Stream Analytics Reference Input Blob %q (Job %q / Resource Group %q): %+v", name, jobName, resourceGroup, err)
}
if read.ID == nil {
return fmt.Errorf("Cannot read ID of Stream Analytics Reference Input Blob %q (Job %q / Resource Group %q)", name, jobName, resourceGroup)
}

d.SetId(*read.ID)
} else {
if _, err := client.Update(ctx, props, resourceGroup, jobName, name, ""); err != nil {
return fmt.Errorf("Error Updating Stream Analytics Reference Input Blob %q (Job %q / Resource Group %q): %+v", name, jobName, resourceGroup, err)
}
}
Collaborator

Given there is vastly different logic for create & update, could we maybe split this function into separate create and update functions?
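A sketch of what that split might look like, using a stub in place of the real streamanalytics inputs client (the stub's method signatures are simplified; the real client's CreateOrReplace/Update take more parameters):

```go
package main

import (
	"errors"
	"fmt"
)

// stubClient stands in for the streamanalytics inputs client.
type stubClient struct {
	existing map[string]bool
}

func (c *stubClient) CreateOrReplace(name string) error {
	c.existing[name] = true
	return nil
}

func (c *stubClient) Update(name string) error {
	if !c.existing[name] {
		return errors.New("input not found")
	}
	return nil
}

// Splitting the combined CreateUpdate into two functions removes the
// d.IsNewResource() branching: each function does exactly one thing.
func resourceCreate(c *stubClient, name string) error {
	if err := c.CreateOrReplace(name); err != nil {
		return fmt.Errorf("creating Stream Analytics Reference Input Blob %q: %+v", name, err)
	}
	return nil
}

func resourceUpdate(c *stubClient, name string) error {
	if err := c.Update(name); err != nil {
		return fmt.Errorf("updating Stream Analytics Reference Input Blob %q: %+v", name, err)
	}
	return nil
}

func main() {
	c := &stubClient{existing: map[string]bool{}}
	fmt.Println(resourceCreate(c, "input1"))         // <nil>
	fmt.Println(resourceUpdate(c, "input1"))         // <nil>
	fmt.Println(resourceUpdate(c, "missing") != nil) // true
}
```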

examples/stream-analytics/main.tf (outdated; resolved)
examples/stream-analytics/main.tf (outdated; resolved)
examples/stream-analytics/variables.tf (resolved)

* `path_pattern` - (Required) The blob path pattern. Not a regular expression. It represents a pattern against which blob names will be matched to determine whether or not they should be included as input or output to the job.

* `storage_account_name` - (Required) The name of the Storage Account.
Collaborator

The name of the storage account to... ?

@danielrbradley

Hello, I'm personally very interested in getting reference sources working for a project. I've read through the conversation so far and can't quite work out what the current blocker is (beyond fixing merge conflicts). I'd be happy to lend a hand to get this over the line and merged if you can summarise the outstanding issues that need to be resolved.

@ghost ghost removed the waiting-response label Dec 3, 2019
@praneetloke
Contributor Author

@danielrbradley

beyond fixing merge conflicts

I believe it's this, and also getting the tests to pass. I haven't kept my fork up to date with the latest from the upstream master. I'll get this restored today.

@danielrbradley

@praneetloke @katbyte I've rebased and got the build working on this second PR: #5129

@praneetloke, I'm quite happy for you to pull my changes and force-push to your original branch to continue the conversation here.

praneetloke and others added 4 commits December 12, 2019 09:46
…put blob datasource. Add an example for stream analytics. Add the new datasource to provider.go.
Co-Authored-By: kt <kt@katbyte.me>
@praneetloke
Contributor Author

praneetloke commented Dec 12, 2019

@danielrbradley I am taking a look at restoring this PR right now. Thanks for opening yours. I'll check that too. I would prefer to fix up this PR rather than open a new one, since the comments in this PR would be easier to follow up on.

EDIT: Your PR was helpful in getting things all sorted out. Thank you for that!

@praneetloke
Contributor Author

@katbyte @tombuildsstuff I have addressed the comments and fixed up this PR with the latest from upstream (this repo). Can you please take another look at this?

@praneetloke
Contributor Author

praneetloke commented Dec 12, 2019

Almost all of the checks passed, except for the lintstatic target, which seems to have failed with a transient error:

```
# cd .; git clone -- https://github.com/client9/misspell /home/travis/gopath/src/github.com/client9/misspell
Cloning into '/home/travis/gopath/src/github.com/client9/misspell'...
fatal: unable to access 'https://github.com/client9/misspell/': Failed to connect to github.com port 443: Connection timed out
package github.com/client9/misspell/cmd/misspell: exit status 128
GNUmakefile:13: recipe for target 'tools' failed
```

I don't have access to restart the build. Could you please restart just this job?

EDIT: Never mind. I made a small update to trigger another build. I didn't want to do that initially, but due to the lack of response I wanted to make sure that if someone does open this PR to merge it, it is ready to go.

Collaborator

@katbyte katbyte left a comment

Thanks for finishing this up @praneetloke! LGTM now

@katbyte katbyte added this to the v1.40.0 milestone Dec 17, 2019
@katbyte katbyte changed the title New resource: Add blob reference input as a new resource for stream analytics input resources New resource: azurerm_stream_analytics_reference_input_blob Dec 17, 2019
@katbyte katbyte merged commit 8a5813a into hashicorp:master Dec 17, 2019
katbyte added a commit that referenced this pull request Dec 17, 2019
@ghost

ghost commented Jan 8, 2020

This has been released in version 1.40.0 of the provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading. As an example:

```hcl
provider "azurerm" {
  version = "~> 1.40.0"
}

# ... other configuration ...
```

@ghost

ghost commented Mar 28, 2020

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you feel this issue should be reopened, we encourage creating a new issue linking back to this one for added context. If you feel I made an error 🤖 🙉 , please reach out to my human friends 👉 hashibot-feedback@hashicorp.com. Thanks!

@ghost ghost locked and limited conversation to collaborators Mar 28, 2020