
[Enhancement]: FSx for NetApp ONTAP scale-out support #34953

Closed
awsaxeman opened this issue Dec 15, 2023 · 5 comments · Fixed by #34993
Labels
  • enhancement — Requests to existing resources that expand the functionality or scope.
  • service/fsx — Issues and PRs that pertain to the fsx service.
Milestone

Comments

@awsaxeman
Contributor

Description

AWS released scale-out support for Amazon FSx for NetApp ONTAP. This enhancement adds support for deploying FSx for ONTAP file systems with multiple HA pairs.

Affected Resource(s) and/or Data Source(s)

  • aws_fsx_ontap_file_system

Potential Terraform Configuration

resource "aws_fsx_ontap_file_system" "test" {
  storage_capacity                = 2048
  subnet_ids                      = [aws_subnet.test[0].id]
  preferred_subnet_id             = aws_subnet.test[0].id
  deployment_type                 = "SINGLE_AZ_2"
  ha_pairs                        = 2
  throughput_capacity_per_ha_pair = 3072
}
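As a rough sanity check (an assumption based on the configuration above, not stated in this issue): with scale-out, the aggregate file system throughput is the per-HA-pair throughput multiplied by the number of HA pairs. A quick check of the example values:

```python
# Values taken from the potential configuration above.
ha_pairs = 2
throughput_capacity_per_ha_pair = 3072  # MBps per HA pair

# Aggregate throughput across all HA pairs (assumed to scale linearly).
total_throughput = ha_pairs * throughput_capacity_per_ha_pair
print(total_throughput)  # 6144 MBps
```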

References

https://aws.amazon.com/about-aws/whats-new/2023/11/amazon-fsx-netapp-ontap-scale-out-file-systems/

Would you like to implement a fix?

Yes

@awsaxeman awsaxeman added the enhancement Requests to existing resources that expand the functionality or scope. label Dec 15, 2023

Community Note

Voting for Prioritization

  • Please vote on this issue by adding a 👍 reaction to the original post to help the community and maintainers prioritize this request.
  • Please see our prioritization guide for information on how we prioritize.
  • Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request.

Volunteering to Work on This Issue

  • If you are interested in working on this issue, please leave a comment.
  • If this would be your first contribution, please review the contribution guide.

@github-actions github-actions bot added the service/fsx Issues and PRs that pertain to the fsx service. label Dec 15, 2023
@terraform-aws-provider terraform-aws-provider bot added the needs-triage Waiting for first response or review from a maintainer. label Dec 15, 2023
@awsaxeman
Contributor Author

Started work on this enhancement

@justinretzolk justinretzolk removed the needs-triage Waiting for first response or review from a maintainer. label Dec 18, 2023
@github-actions github-actions bot added this to the v5.32.0 milestone Jan 3, 2024
@pannujat

pannujat commented Jan 4, 2024

When would v5.32.0 be GA?


This functionality has been released in v5.32.0 of the Terraform AWS Provider. Please see the Terraform documentation on provider versioning or reach out if you need any assistance upgrading.

For further feature requests or bug reports with this functionality, please create a new GitHub issue following the template. Thank you!


I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Feb 13, 2024