PyCharm autocompletion for properties #1078
I've tried to implement alternative 1, which outputs classes like this:

```python
class Resource(AWSObject):
    resource_type = "AWS::ApiGateway::Resource"

    ParentId = None  # type: basestring
    PathPart = None  # type: basestring
    RestApiId = None  # type: basestring

    props = {
        'ParentId': (basestring, True),
        'PathPart': (basestring, True),
        'RestApiId': (basestring, True),
    }
```

The problem now is that 2to3 completely ignores the type hints.
Does the stub file approach work for both Python 2 and Python 3? Those could likely be auto-generated similarly to the way I'm generating code from the resource specifications.
@markpeek Stubs should work for both Python 2 and Python 3. The problem is that the two have different syntax for type hinting (Python 2 doesn't really have one at all), so you can't use the exact same file for both. Conversion seems to be out of the question too. Would having two extra scripts for generating Python 2/Python 3 stubs be an acceptable solution?
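For reference, a generated stub could look roughly like this (an illustrative `.pyi` sketch of the PEP 484 stub approach; the file name and the `__init__` signature are assumptions, not files the project ships):

```python
# apigateway.pyi -- hypothetical stub for the Resource class shown above
from troposphere import AWSObject

class Resource(AWSObject):
    resource_type: str
    ParentId: str
    PathPart: str
    RestApiId: str
    def __init__(self, title: str, *, ParentId: str = ...,
                 PathPart: str = ..., RestApiId: str = ...) -> None: ...
```

Type checkers and PyCharm pick a `.pyi` file up automatically when it sits next to the corresponding `.py` module, so no runtime changes are needed.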
I've created PR #1079.
Any further comments on this? Does the PR need more work?
Sorry for the delay; I've been contemplating the right way to handle it. Here are some thoughts:
Cheers for the feedback!
Changes are done. See PR #1079.
Any update on this? I'd really love to avoid the cmd-click into props every time.
I looked at it today and I think we need to make a few changes. I'll follow up in PR #1079. I hope to get it merged and then fix it up soon after to get it out.
I saw the PR has already been merged, but I don't get autocompletion. Do I need to do something else? I tried searching the docs but couldn't find anything. Thanks!
The generator was merged, but sadly the generated files were never added to the repo/package.
Are there any plans to add those files to the package? In the meantime I'll see if I can generate them myself and have PyCharm detect the stub files. Thanks!
That is up to @markpeek! I'd love to see my labour bear fruit :)
@DrLuke, @xyvark, @grkking, @markpeek I built a thin wrapper layer on top of troposphere. It comes with a feature called auto-reference/associate: you don't need to remember the detailed properties and the reference syntax to wire resources to each other, you just use this and it automatically does the work for you:

```python
associate(lambda_func, iam_role)
associate(lambda_func, subnet1)
associate(lambda_func, subnet2)
associate(lambda_func, security_group)
```

It also gives you auto-suggestions on types and property names. If my work is helpful, I'd be happy to see you star my project :)
Thank you @MacHu-GWU
Check how Pydantic does this: it does validation with type hints alone. Also, you get autocomplete for constructor keyword arguments with the "pydantic" PyCharm plugin.
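The annotation-driven idea can be sketched without pydantic itself. This is a minimal stdlib illustration of validating constructor kwargs purely from type hints; `Validated` and the field names are hypothetical, not troposphere's or pydantic's real API:

```python
class Validated:
    """Validate constructor kwargs against the subclass's type hints."""

    def __init__(self, **kwargs):
        # Field names and expected types come from the annotations alone.
        for name, expected in self.__class__.__annotations__.items():
            if name not in kwargs:
                raise TypeError(f"missing required field: {name}")
            value = kwargs[name]
            if not isinstance(value, expected):
                raise TypeError(
                    f"{name} must be {expected.__name__}, "
                    f"got {type(value).__name__}"
                )
            setattr(self, name, value)


class Resource(Validated):
    ParentId: str
    PathPart: str
    RestApiId: str


r = Resource(ParentId="p-1", PathPart="items", RestApiId="api-1")
```

Because the fields are plain class-level annotations, PyCharm can suggest them without evaluating any code, which is exactly what a dynamically built `props` dict prevents.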
Troposphere no longer supports Python 2, which makes this problem easier now.
I have a bit of time through the end of this year that I can put towards implementing this if we can decide on a path forward. I put together a PoC. So far there is one significant breaking change I see for this route.

The least-effort path forward, if this route is chosen, would likely be to manually update all classes to use the new format. We can then look at automatically building the classes using Jinja as a future step, unless someone else has time in the next few weeks to get that started; figuring that out may burn too much of the time I have available right now.

Open to any suggestions/feedback on my PoC. I'll hold off on working on it further until a maintainer chimes in.
@ITProKyle AWS maintains spec files that are regularly updated with all resource and property information. I created an open-source project (cottonformation) that automatically keeps up with that spec. It gives you "Required Property", "Optional Property", and "Return Value" hints and auto-complete, with built-in type hints and built-in validators. More design patterns along those lines are included.
I wish troposphere could also provide those productivity features. Feel free to borrow my work that automatically generates source code based on the spec file.
@MacHu-GWU Yeah, this is neat (cottonformation) and will be useful for pulling the schemas. Some user feedback on cottonformation, though: the problem, and typically the reason why I really dislike CDK as well, is that you renamed every single attribute from the CFN properties.
@JohnPreston yes. Every time you declare a resource, some options are required and some are optional, and the attribute names are prefixed to show which. In my experience, this fully leverages the power of a modern IDE; it frees you from looking up documents and lets you focus on logic.
I get that. But my point is that one cannot then just map property name to property name. The renaming of the attributes, although it makes sense, is really annoying IMHO. With troposphere following the names of the properties strictly, and still being able to define validation and required/optional, things work just fine. And as a CFN engineer you should know that required/not-required is not nearly enough to tell whether two options are compatible. In my experience, reading the docs saves kittens, and there is no shortcut for it, however hard you try.
@JohnPreston I totally agree with you. Type prefixes can be annoying for experienced engineers. My initiative is to let less-experienced engineers develop CloudFormation templates more easily, with less of a learning curve, and the trade-off is using a magic prefix.
You don't need those prefixes for IDE support; see e.g. the PyCharm Pydantic plugin.
@MacHu-GWU - Just want to echo the feedback that everyone else has provided. Changing the names of the properties kills its usability for me and my teams. I get that you are trying to differentiate required vs optional parameters, but the appropriate way to do that would be by making the type annotation of the property optional.

That's not to say cottonformation isn't a nice library. It's just not a replacement for Troposphere for a company with hundreds of Troposphere templates (updating EVERY property would take a significant amount of time), and it is not a solution to this problem.
I've been busy but keeping an eye on this thread; here are some comments.

Anyway, wanted to provide a quick update from my viewpoint. The key for me is to rely more on the generator to keep the library up to date with the resource spec, to ensure reproducible results (due to renames that are required), and to generate the corresponding stubs accurately.
To expand on what @ITProKyle said and to give my 2 cents on the whole pydantic story: I got introduced to pydantic mostly through these conversations, then went off and created projects where I would use the JSON schema to generate my classes, just because it feels easy to deal with objects and their properties instead of dictionaries/lists etc. Very cool, and very familiar to using the CFN CLI tool to create resources/modules.

However, although generating the Troposphere properties from the JSON schema exposed by AWS CFN sounds amazing in principle, I think that moving from the current object model, which is superbly easy to read and understand, to auto-generated classes, even with the ability to override the base class, feels like a giant step back and would break so much code logic implemented so far. I rely on the current structure of troposphere, and the logic to go from a root AWS Resource down to its nested AWS Property is fairly straightforward. I might have already mentioned that, with the current model, you can pass a whole AWS Resource from an existing template and get its Troposphere version in a flash. For my usage, the compatibility is such that I can take the example for a given resource type from the AWS docs and it works without a miss (modulo intrinsic functions).

Where I do like what cottonformation does, which I implemented my own way for my own purposes, is providing a way to know what the valid return values are (from Ref/GetAtt etc.), but that again is fairly straightforward to implement.

So, in my opinion, going the pydantic way instead of good old **kwargs is going to be a nightmare for anyone using Troposphere as a pure library.
However, for the JSON definitions of the CFN resources as exposed by the CFN team, I would much rather see them used purely for JSON-schema validation in the to_dict() render. If, for example, we had a Python library with all the CFN schemas in there (it would probably be huge, but pydantic is not light anyway), we could use that as a source for jsonschema validation very easily (that's what I do for compose-spec among many other things).

To your point @markpeek, the one advantage I see with the generator is avoiding human error (which I know I introduced in one of my PRs), but I find the community has been pretty great at catching just that. So if we combined JSON validation from the official CFN models with validating PR changes, we could keep up with the changes very fast and very nicely with a high grade of confidence.

Thanks for reading. Go Troposphere!

PS: If someone knows a way to open PRs to the AWS CFN team to improve on the JSON definitions, can someone point me in the right direction?
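The to_dict()-time validation being proposed could look roughly like this. This is a stdlib-only sketch; a real implementation would feed the AWS-published schema to the `jsonschema` package, and every name here is hypothetical:

```python
# Hypothetical, trimmed-down schema mimicking the shape of the
# CFN-published resource definitions (only required/type info here).
RESOURCE_SCHEMA = {
    "required": ["ParentId", "PathPart", "RestApiId"],
    "properties": {"ParentId": str, "PathPart": str, "RestApiId": str},
}


def validate_rendered(rendered: dict, schema: dict) -> None:
    """Minimal stand-in for jsonschema validation, run at to_dict() time."""
    props = rendered.get("Properties", {})
    for name in schema["required"]:
        if name not in props:
            raise ValueError(f"missing required property: {name}")
    for name, value in props.items():
        expected = schema["properties"].get(name)
        if expected is None:
            raise ValueError(f"unknown property: {name}")
        if not isinstance(value, expected):
            raise TypeError(f"{name} should be {expected.__name__}")


rendered = {
    "Type": "AWS::ApiGateway::Resource",
    "Properties": {"ParentId": "p-1", "PathPart": "items", "RestApiId": "a-1"},
}
validate_rendered(rendered, RESOURCE_SCHEMA)  # passes silently
```

The point of validating the rendered dict, rather than generating classes from the schema, is that the existing object model stays untouched.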
@ITProKyle Thank you for the feedback! Great to hear different voices with very solid reasoning.
On the topic of validators, there's much less need to maintain basic validation functions with pydantic, as it's all done automatically by the library. It even handles converting a dict into an object if the user wants to provide a dict instead.
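That dict-to-object coercion can be sketched in a few lines (stdlib only, hypothetical names; pydantic's real machinery handles nesting, aliases, and much more):

```python
class SubnetRef:
    """Illustrative target type the user could also supply as a dict."""

    def __init__(self, SubnetId: str):
        self.SubnetId = SubnetId


def coerce(value, target_type):
    # Already the right type: pass it through untouched.
    if isinstance(value, target_type):
        return value
    # A plain dict: build the object from its keys, as described above.
    if isinstance(value, dict):
        return target_type(**value)
    raise TypeError(
        f"cannot coerce {type(value).__name__} to {target_type.__name__}"
    )


ref = coerce({"SubnetId": "subnet-123"}, SubnetRef)
```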
Adding types for kwargs is quite easy, but I'm not sure how we would go about auto-generating them for troposphere stubs; doing it would require changing the class format.
The "title" change is partially pydantic and partially a new PEP being worked on to unify how type checkers treat custom classes that work like dataclasses (see the discussion). As for vscode support, it is supported natively in the next release of pydantic (currently available from the master branch), which implements the PEP being worked on by pyright. Pyright/pylance is used by default by vscode's Python extension.
This is something that would remain viable with pydantic. Looking at my PoC, only one base class changes.
How would it be a nightmare? It would work exactly the same, but it would actually define the kwargs it wants instead of letting the user guess.

As nice as it would be from a maintainer standpoint to have troposphere auto-generated from the CFN spec, it adds little value to the end user for the large time investment it would require. Not to say that it doesn't need to be done; there are just more impactful projects for the users of troposphere, such as this one.

A possible stopgap solution would be to validate troposphere against the CFN spec instead of generating from it. For example: for each resource in the spec, check that a class exists and that its properties match. This would ensure that troposphere matches the spec (at least loosely) now, and that it continues to match the spec as full auto-generation is worked towards.
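The stopgap could be prototyped roughly like this; the spec shape, the registry, and the miniature class are all assumptions for illustration, not troposphere's real internals:

```python
# Hypothetical miniature of the CFN resource spec's shape.
SPEC = {"AWS::ApiGateway::Resource": {"ParentId", "PathPart", "RestApiId"}}


# Hypothetical stand-in for a troposphere class.
class Resource:
    resource_type = "AWS::ApiGateway::Resource"
    props = {
        "ParentId": (str, True),
        "PathPart": (str, True),
        "RestApiId": (str, True),
    }


REGISTRY = {cls.resource_type: cls for cls in [Resource]}


def check_against_spec(spec: dict, registry: dict) -> list:
    """Return mismatches between the spec and the implemented classes."""
    problems = []
    for resource_type, expected_props in spec.items():
        cls = registry.get(resource_type)
        if cls is None:
            problems.append(f"missing class for {resource_type}")
        elif set(cls.props) != expected_props:
            problems.append(f"property mismatch in {resource_type}")
    return problems


assert check_against_spec(SPEC, REGISTRY) == []
```

Run as a CI check, this would flag any new or changed resource in the spec that has no matching class yet.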
Sorry, I was mistaken when I wrote this. That's why I should not be doing this at 1 AM (and yet here I am replying at 2 AM ^^).

I really disagree on the point that defined keywords vs passing kwargs would make users' lives easier: there is no shortcut to reading the docs and knowing/learning which parameters do what. The best we can do is implement the heck out of validators for resources that we know have incompatible properties and that sort of thing (I will admit I have quite a few to backport from another project into this one). Validators generated from the JSON definitions of the CFN resources into pydantic objects won't be able to do much more than what JSON conditions allow one to define/expect. And unless we can open PRs to CFN to improve the validation logic in the schema definitions as much as possible, it will always come down to "us" to implement these in code.

Now, I might be missing something in what's been done so far to go the pydantic way, and I am more than happy to take a fork/branch that uses the new model for a spin to validate backward compatibility. But if all this is only to give users a better IDE "help" experience, and the way to do that is to change the whole class model, then we are trying to fix something that needs no fixing at all and just works.
Can't agree more. I think that validating against the model when troposphere performs the to_dict() render of the objects, instead of trying to generate the classes from the schema definition, is much easier and requires a ton less work.
Again, can't agree more. I'd volunteer to go over AWSProperty in the code to add the links to the CFN docs for people to refer to, if that gives users a shortcut to open the definition by clicking on it from the IDE. (Most likely this is something one wants to script, considering the number of properties :D) Which actually brings me to a question/suggestion: should we look into defining a standard docstring format?
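One possible shape for such a docstring, purely as a suggestion (the class, wording, and link are illustrative; the URL just follows AWS's documentation naming pattern):

```python
class Resource:
    """ApiGateway Resource.

    AWS CloudFormation documentation:
    https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-apigateway-resource.html

    :param ParentId: (required) The parent resource's identifier.
    :param PathPart: (required) The last path segment for this resource.
    :param RestApiId: (required) The identifier of the associated RestApi.
    """
```

A fixed format like this would let the generator emit the doc link and per-property notes mechanically, and PyCharm renders it in the quick-documentation popup.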
Per my above comments, I pushed some updates to the code generator along with changes to a few classes to prove it out. @JohnPreston this also includes adding doc links to the generated classes. There is an update to the CONTRIBUTING doc attempting to document the changes needed for each class. @DrLuke the code generator should produce corresponding stub files if you want to verify it manually. I'm just doing file redirection with some simple shell aliases at this time, as I still need to verify correctness and backward compatibility.
Not sure if this is related, but just in case: I just pulled from the main branch, installed it in my venv, and realized that S3/ECR and a few other resources now have auto-generated code. It looks slick to have the docstrings linking to the resource docs etc., and I'm very happy to see it. @markpeek please see #1995 to fix import issues though :)

Edit: out of curiosity, why use the CFN specification file instead of the JSON schemas (apart from the obvious fact that it contains the link to the documentation for each resource)? Might be a question for an AWS CFN engineer though.
Because the classes' fields for the AWS resource properties are generated dynamically, IDEs like PyCharm don't suggest those fields during autocompletion.
This could be remedied in two ways:
1. Add class-level type hints (e.g. `# type:` comments on the class attributes)
2. Generate stub files
This is outlined in PEP 484 (stub files).