
Warn that old-style ready_fn and test attributes will be deprecated #346

Merged

Conversation

pbaughman (Contributor)

I'd like to get rid of ready_fn and the 'magic' creation of self.proc_info, self.proc_output, and other self.whatever attributes on test classes in launch_testing.

This PR issues a warning when you use the 'old' way.

If your generate_test_description function looks like this instead of like this, you will get the following warning when you run the test:

root@93575d1ee932:/launch# ros2 test launch_testing/test/launch_testing/examples/good_proc_launch_test.py 
Running with ROS_DOMAIN_ID 1
ROS_DOMAIN_ID 1
[WARNING] [launch_testing.loader]: Passing ready_fn as an argument to
generate_test_description will be deprecated in a future release.  Include a
launch_testing.actions.ReadyToTest action in the LaunchDescription instead.
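For context, the two styles being contrasted ("like this instead of like this", links elided above) look roughly as follows. This is an illustrative sketch assuming the launch / launch_testing APIs; it is not runnable outside a ROS 2 environment, and in a real test file only one of the two definitions exists:

```python
import launch
import launch.actions
import launch_testing.actions


# Old style: the loader passes ready_fn in, and the test wires it
# into the launch description itself
def generate_test_description(ready_fn):
    return launch.LaunchDescription([
        # ... processes under test ...
        launch.actions.OpaqueFunction(function=lambda context: ready_fn()),
    ])


# New style: no ready_fn argument; include a ReadyToTest action instead
def generate_test_description():  # noqa: F811 (redefined here only for contrast)
    return launch.LaunchDescription([
        # ... processes under test ...
        launch_testing.actions.ReadyToTest(),
    ])
```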

If you access proc_output or proc_info via self like this instead of like this, you will get the following warning when you run the test:

test_exit_code (good_proc_launch_test.TestProcessOutput) ... 
[WARNING] [launch_testing.loader]: Automatically adding attributes like self.proc_info and
self.proc_output to the test class will be deprecated in a future release.  Instead, add
proc_info or proc_output to the test method argument list to access the test object you need
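The attribute-access styles being contrasted look roughly like this; a sketch assuming the launch_testing test loader, with do_some_assertion standing in as a hypothetical helper:

```python
import unittest


class TestProcessOutput(unittest.TestCase):

    # Old style: proc_output is 'magically' injected onto the test
    # class, which now triggers the deprecation warning above
    def test_old_style(self):
        do_some_assertion(self.proc_output)  # hypothetical helper

    # New style: name proc_output in the argument list and the
    # launch_testing loader injects it, much like a pytest fixture
    def test_new_style(self, proc_output):
        do_some_assertion(proc_output)  # hypothetical helper
```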

Signed-off-by: Pete Baughman pete.baughman@apex.ai

@pbaughman (Contributor, Author)

@hidmic ready_fn and the self.proc_whatever attributes are very old and have both been replaced by something better and more similar to regular pytest tests. I'm hoping we can issue a warning for a while, and then remove these altogether in a future release. I'll also prepare a branch with these features removed so we can run CI and figure out which ROS packages need to be updated to work the 'new' way.

@mabelzhang mabelzhang added the enhancement New feature or request label Oct 17, 2019
@hidmic (Contributor) left a comment

Overall LGTM, but for a few comments.

The thing is, though, that today's the API and feature freeze for Eloquent. If we merge this but we don't change all the places throughout the code base where ready_fn is being used, we'll release code with deprecation warnings. Would you be able to contribute those changes?

)
self.__warned = True

data = _warn_wrapper(data)
@hidmic (Contributor) Oct 18, 2019

@pbaughman meta: wouldn't it be simpler to turn injected attributes into properties that warn on access, instead of trying to cover every possible way these attributes may be used?

Contributor Author

@hidmic I will try. I'm not sure what it takes to dynamically add a property to a class, but I'm sure I can figure it out.
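For reference, the approach @hidmic suggests can be sketched in plain Python: a descriptor attached to the class warns on every read. This is an illustrative sketch (the names _DeprecatedAttr and TestCaseStub are hypothetical), not the code that landed in the PR:

```python
import warnings


class _DeprecatedAttr:
    """Descriptor that warns whenever the attribute is read via self."""

    def __init__(self, name, value):
        self._name = name
        self._value = value

    def __get__(self, obj, objtype=None):
        warnings.warn(
            'Accessing self.{0} will be deprecated; add {0} to the test '
            'method argument list instead'.format(self._name),
            FutureWarning, stacklevel=2)
        return self._value


class TestCaseStub:
    """Stand-in for a launch_testing test class (hypothetical)."""


# Dynamically attach the warning property to the test class
setattr(TestCaseStub, 'proc_info', _DeprecatedAttr('proc_info', object()))

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    _ = TestCaseStub().proc_info  # triggers the FutureWarning
```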

Contributor

See here.

'Automatically adding attributes like self.proc_info and self.proc_output '
'to the test class will be deprecated in a future release. '
'Instead, add proc_info or proc_output to the test method argument list to '
'access the test object you need'
Contributor

@pbaughman I think it'd be nice to use attr_name in the warning to give the user more precise directions. WDYT?

Contributor Author

That's a good idea

from .actions import ReadyToTest


_logger = launch.logging.get_logger(__name__)
Contributor

@pbaughman meta: I wonder if we shouldn't use the warnings module instead.

Contributor Author

I'm not sure I've used that module before, so I can't speak to the pros and cons. You're referring to this right?

Contributor

Right.

Contributor Author

@hidmic Using the warnings module changes

test_count_to_four (good_proc_launch_test.TestGoodProcess) ... [WARNING] [launch_testing.loader]: Automatically adding attributes like self.proc_output to the test class will be deprecated in a future release.  Instead, add proc_output to the test method argument list to access the test object you need

To

test_count_to_four (good_proc_launch_test.TestGoodProcess) ... /launch/install/launch_testing/lib/python3.6/site-packages/launch_testing/loader.py:251: FutureWarning: Automatically adding attributes like self.proc_output to the test class will be deprecated in a future release.  Instead, add proc_output to the test method argument list to access the test object you need
  FutureWarning,

It tries to print one line of source code at the end; that's why it says "FutureWarning,". I'm not sure how to get it to stop doing that, but I also didn't look very hard.

It doesn't seem like using the 'warnings' module is a clear winner here.

Contributor

I don't know how you're using the module. I think that by using the stacklevel parameter appropriately you may get the warning to refer to the exact line that triggered it (which is cool IMO).
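As a plain-Python illustration of the stacklevel suggestion (the function name deprecated_api is hypothetical): with stacklevel=2, the warning is attributed to the caller's line rather than to the helper that raises it.

```python
import warnings


def deprecated_api():
    # stacklevel=2 makes the warning point at the caller's line,
    # not at this helper's body
    warnings.warn(
        'ready_fn will be deprecated in a future release',
        FutureWarning, stacklevel=2)


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    deprecated_api()  # the reported location is this call site
```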

Member

I do see the warning with -s, albeit buried in the rest of the test output. If we use the warnings module then we also see warning messages in the pytest summary. Using the warnings module also lets users handle the warnings differently if they want, for example ignoring them or treating them as errors. Perhaps this is also configurable with the launch logger; I don't know.

Contributor Author

Yeah, I really dislike how the pytest wrapper combines all the tests in a file into one mega-test; it's quite difficult to interpret the output. I can switch this to the 'warnings' module as long as we don't go around and around about the message that's printed. I didn't take that route originally because I couldn't figure out how to get it to display a warning without a bunch of extra stuff that's meaningless to the end user.

Member

If you use the warnings module with the default warning (UserWarning), I don't think it displays "extra stuff". Even if it is ugly, I think it gets the message across. IMO, the warnings module is more visible to users compared to the launch logger warning. If someone else wants to weigh in, I'm not going to die on this hill. @hidmic ?

Contributor

Had to parse the discussion again, it's been a while.

To let the user know a feature is deprecated, I'd certainly stick to the warnings module. Not only for launch, but for every other ROS 2 core Python package out there too. @pbaughman I still think you may be able to get better output for ready_fn deprecation warnings if you do so when you bind them to the test function instead.

Contributor

And about pytest aggregating all output in a rather confusing way, I agree. That's because there's a clear boundary between launch_testing and pytest (see here), and thus an entire launch test (with all its test cases) appears as a single pytest test. To get finer-grained output we need deeper integration. Turning launch_testing into an extension of pytest has been, and still is, a path to achieve that.

@pbaughman (Contributor, Author)

pbaughman commented Oct 18, 2019

@hidmic Here is a checklist of packages that use launch_testing and do things the "old" way. I will check them off as they're updated:

This list was generated by doing:

wget https://raw.githubusercontent.com/ros2/ros2/master/ros2.repos
vcs import src < ros2.repos
grep -R "ready_fn"

and then ignoring matches in launch_testing itself.

@hidmic (Contributor)

hidmic commented Oct 18, 2019

I think that's a pretty accurate recollection. Watch out, some new ros2cli tests are hopefully coming along with ros2/ros2cli#304.

@pbaughman (Contributor, Author)

Note, this PR is on-hold until I fix #347

Otherwise the existing ros2cli tests can't be updated to use the ReadyToTest action because it's not discoverable by launch_testing

@wjwwood (Member)

wjwwood commented Jan 2, 2020

What's the status of this pull request? Seems like the issue referenced in the most recent comment is closed.

@pbaughman (Contributor, Author)

Hey @wjwwood. This PR causes warnings in other packages, and I haven't been able to resolve them all.

This PR in ros2/system_tests has test failures when I make the change
This PR in ros2/ros2cli is even more bizarre - the test times out waiting for the processes to start up

The situation is slightly complicated because I'm not doing any new work for Apex right now. I can tackle these with @dejanpan's approval, or maybe someone else from Apex will take over this PR. Hopefully he can chime in.

@wjwwood (Member)

wjwwood commented Jan 3, 2020

Ok, sounds good and thanks for the update. I'll leave it as-is for now.

@ivanpauno (Member)

@pbaughman @dejanpan I've added this to the backlog, I can remove the label if you're planning to work on it in the near future.

@pbaughman (Contributor, Author)

@ivanpauno I've reached out to Apex again to see if they want me to tackle this. Forgive my ignorance of your process but what does the "backlog" label mean, functionally, and what effect does it have on me if it's added or removed?

@mjcarroll (Member)

mjcarroll commented Jan 24, 2020 via email

@pbaughman (Contributor, Author)

@ivanpauno I get to finish this after all! Unless I find some awful underlying problem I should have this ready to go by Monday or Tuesday next week.

@ivanpauno ivanpauno added help wanted Extra attention is needed and removed backlog labels Jan 24, 2020
@pbaughman (Contributor, Author)

pbaughman commented Feb 20, 2020

Just a quick update that I'm still working on this. I got the system_tests PR merged when we determined the test failures were also on the nightlies.

The ros2cli test failures appear to be a real race condition in the test that is taking some effort to resolve. Progress is being made here: ros2/ros2cli#454 and here: ros2/ros2cli#376

Unless I find some awful underlying problem I should have this ready to go by Monday or Tuesday next week.

^^^^ sad-trombone noise

@jacobperron (Member)

Confirming flake8 failure is resolved:

@jacobperron (Member) left a comment

I'm leaning towards using the warnings module for logging the deprecation notices, since they seem more visible to me (e.g. when running tests with pytest).

Regardless, our CI isn't catching the warnings one way or the other (ros2/ci#418). I noticed that usage of ready_fn and other deprecated API was introduced in ros2/ros2cli#463, but CI doesn't report the warnings. I don't think it's something to worry about to resolve this PR.

@pbaughman (Contributor, Author)

@jacobperron Is this good to merge then? Not sure if you're waiting on an action from me or not.

@jacobperron (Member)

@pbaughman see #346 (comment)

Pete Baughman and others added 6 commits March 26, 2020 09:29
  - Using a ReadyToTest action allows more future flexibility than
    a ready_fn passed to generate_test_description.  Warn that
    ReadyToTest action will be required in the future
  - Passing proc_info and proc_output and test args as arguments to
    test methods is much cleaner than relying on self.proc_info and self.proc_output
    'magically' being added to the tests and works more like pytest fixtures.
    Warn that this will go away in the future

Signed-off-by: Pete Baughman <pete.baughman@apex.ai>
  - It's a little wasteful to set this on every test instance, because they share the same
    __class__ object, but it's fast and temporary until this is deprecated

Signed-off-by: Pete Baughman <pete.baughman@apex.ai>
Signed-off-by: Pete Baughman <pete.baughman@apex.ai>
Signed-off-by: Pete Baughman <peter.c.baughman@gmail.com>
Signed-off-by: Pete Baughman <peter.c.baughman@gmail.com>
  - This may make it easier to see the warnings in CI logs

Signed-off-by: Pete Baughman <peter.c.baughman@gmail.com>
@pbaughman pbaughman force-pushed the deprecate_old_launch_testing_features branch from 327d8b5 to 5cf0409 Compare March 26, 2020 17:20
@pbaughman (Contributor, Author)

pbaughman commented Mar 26, 2020

@jacobperron
Ok, here's what I can make the output from the example test good_proc.test.py look like:

root@4841d43f6eb4:/launch# launch_test launch_testing/test/launch_testing/examples/good_proc_launch_test.py                                                                                      
Warning: Passing ready_fn as an argument to generate_test_description will be removed in a future release.  Include a launch_testing.actions.ReadyToTest action in the LaunchDescription instead.
[INFO] [launch]: All log files can be found below /root/.ros/log/2020-03-26-17-11-32-354584-4841d43f6eb4-1027
[INFO] [launch]: Default logging verbosity is set to INFO                                                                                                                                        
test_count_to_four (good_proc_launch_test.TestGoodProcess) ... Warning: Automatically adding attributes like self.proc_output to the test class will be deprecated in a future release.  Instead,
add proc_output to the test method argument list to access the test object you need
[INFO] [python3-1]: process started with pid [1030]               
[python3-1] Starting Up                                           
[python3-1] Loop 0                                                    

Hopefully this is satisfactory. @hidmic I'm sorry, I just don't understand what you're getting at when you say

I still think you may be able to get better output for ready_fn deprecation warnings if you do so when you bind them to the test function instead.

Unless I somehow inject code into the generate_test_description function, I can't make the stack trace point inside the generate_test_description function.

Hopefully the latest commit produces an output that works for everybody though.

@hidmic (Contributor)

hidmic commented Mar 26, 2020

Unless I somehow inject code into the generate_test_description function, I can't make the stack trace point inside the generate_test_description function

Well, there's warnings.warn_explicit() with which you can refer to the test source file, but consider that an FYI.
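A minimal sketch of the warnings.warn_explicit() mechanism mentioned above, with a hypothetical filename and line number standing in for the user's test source:

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    # warn_explicit lets the caller pin the warning to an arbitrary
    # file and line, e.g. the user's test source (values hypothetical)
    warnings.warn_explicit(
        'Passing ready_fn to generate_test_description will be '
        'deprecated in a future release',
        FutureWarning,
        filename='good_proc_launch_test.py',
        lineno=12)
```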

Hopefully the latest commit produces an output that works for everybody though.

What you have is good enough I guess. @jacobperron ?

@jacobperron (Member) left a comment

Works for me, thanks for iterating!

@pbaughman (Contributor, Author)

Not sure how I ended up with a flake8 error in launch_context.py - that's not part of the changes in this PR

AssertionError: Found 1 code style errors / warnings:   ./launch/launch_context.py:124:5: A003 "locals" is a python builtin, consider renaming the class attribute assert 1 == 0

Maybe a new version of flake8 added this error?

@jacobperron (Member)

We've switched our builds over to Ubuntu Focal, which recently had an update to flake8 I believe. The same error exists on master and shouldn't block this PR.

@jacobperron (Member)

Things look fine compared to the nightlies. I'll just hold off so I can retrigger the Windows build once the fix is merged.

@jacobperron jacobperron merged commit b655230 into ros2:master Mar 27, 2020
@pbaughman pbaughman deleted the deprecate_old_launch_testing_features branch March 27, 2020 21:15
@pbaughman (Contributor, Author)

@jacobperron @wjwwood I'll put together a branch that actually removes these features, and I guess you can hold onto it until after Foxy is done and out the door. Does that work for you guys?

@jacobperron (Member)

Sounds good to me. Maybe you can open it as a draft PR so it doesn't get accidentally merged before Foxy?

@pbaughman (Contributor, Author)

pbaughman commented Mar 27, 2020

@jacobperron No problem - How do I make something a 'draft PR' ?

Edit: nvm, I see it's a standard thing now (https://github.blog/2019-02-14-introducing-draft-pull-requests/). I thought you meant an OSRF-specific process.

Labels
enhancement New feature or request help wanted Extra attention is needed
7 participants