Test the Message Spec #1627

Merged
merged 12 commits

2 participants: Min RK, Fernando Perez

Min RK
Owner

This adds a few preliminary tests for the message spec.

It uses Traitlets to perform validation of keys.

Checks right now are not very strict, as (almost) any key is allowed to be None, as long as it is defined. This is because I simply do not know which keys are allowed to be None, and this is not discussed in the specification. If no keys are allowed to be None, we violate that all over the place.

Parametric tests are used, so every key validation counts as a test (147!).
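
To give a flavor of the validation, here is a condensed sketch of the Reference pattern used in the new test file (the full versions, one class per message type, are in IPython/zmq/tests/test_message_spec.py in the diff below). Yielding one assertion per key under @dec.parametric is what makes each key check count as a separate test:

    # Condensed sketch of the validation pattern; the real classes are in
    # IPython/zmq/tests/test_message_spec.py, shown further down this page.
    import nose.tools as nt
    from IPython.utils.traitlets import HasTraits, TraitError, Integer, Enum

    class Reference(HasTraits):
        def check(self, d):
            """yield one assertion per expected key in the dict d"""
            for key in self.trait_names():
                # every key named in the reference must be present...
                yield nt.assert_true(key in d, "Missing key: %r" % key)
                # ...but for now any key may be None, since the spec does not
                # say which keys are required to have a value
                if d[key] is None:
                    continue
                try:
                    # assigning to the trait validates the value's type
                    setattr(self, key, d[key])
                except TraitError as e:
                    yield nt.assert_true(False, str(e))

    class ExecuteReply(Reference):
        execution_count = Integer()
        status = Enum((u'ok', u'error'))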

Message spec doc was found to misrepresent code in a few points, and some changes were made:

  • spec had error keys as exc_name/exc_value, but we are actually using ename/evalue (docs updated to match code)
  • payloads were inaccurate: a list of dicts rather than a single dict, and transformed_output is a payload, not top-level in execute_reply (docs updated to match code)
  • in object_info_request, detail_level was in the message spec but not actually implemented (code updated to match docs)

History messages are not yet tested, but I think I get at least elementary coverage of everything else in the doc.

Fernando Perez
Owner

Are you getting these tests run by a simple iptest? I think IPython.zmq is still in our exclusions list... If I run it manually via iptest IPython.zmq, I get dropped into an interactive editor a couple of times... It seems that the zmq subpackage has a couple of nasty tests in it and maybe initially we just punted and disabled the lot. As part of this PR, we should fix that so that iptest picks up all of IPython.zmq for testing as well...

Min RK
Owner

No, I'd been running the individual test file manually (iptest IPython.zmq.tests.test_message_spec).

I'll get on fixing IPython.zmq for general testing.

Fernando Perez
Owner

Awesome. That way this PR will really have the (fantastic) impact of getting all of our zmq stuff into the regular testing workflow.

minrk added some commits

  • close KernelManager channel sockets when they stop (df2592c)
    Otherwise there are dangling sockets on the Context, which cannot terminate.
  • skip magic_edit doctest in zmqshell (c35ed97)
    it opens a GUI editor, which is obviously inappropriate
  • include IPython.zmq in iptest groups (bf52088)

Min RK
Owner

okay, iptest now runs IPython.zmq (with gtk/matplotlib exclusions, as appropriate, I believe).

Fernando Perez
Owner

Awesome! All tests pass on my box; we may get a few odd failures tomorrow from the buildbots, but if that's the case we'll track them one by one with the info from Shining Pandas. Heads-up to @takluyver in case anything out of the ordinary happens with py3. But let's merge this as-is, you did a terrific job and I'm thrilled to have zmq and messaging test coverage; we can fine-tune things later as needed.

fperez merged commit 232fa81
Commits on Apr 18, 2012

  1. Min RK
  2. begin testing message spec (minrk authored)
  3. correct keys in pyerr messages (minrk authored)
  4. add detail_level to object_info requests, as described in message spec (minrk authored)
  5. a couple more oinfo tests (minrk authored)
  6. use parametric tests in message_spec: now each key check counts as a test (minrk authored)
  7. Min RK
  8. more detail in oinfo tests (minrk authored)
  9. mark IOPub channel tests (minrk authored)
  10. close KernelManager channel sockets when they stop: otherwise there are dangling sockets on the Context, which cannot terminate (minrk authored)
  11. skip magic_edit doctest in zmqshell: it opens a GUI editor, which is obviously inappropriate (minrk authored)
  12. Min RK

6 IPython/core/interactiveshell.py
@@ -1456,11 +1456,13 @@ def _inspect(self, meth, oname, namespaces=None, **kw):
print 'Object `%s` not found.' % oname
return 'not found' # so callers can take other action
- def object_inspect(self, oname):
+ def object_inspect(self, oname, detail_level=0):
with self.builtin_trap:
info = self._object_find(oname)
if info.found:
- return self.inspector.info(info.obj, oname, info=info)
+ return self.inspector.info(info.obj, oname, info=info,
+ detail_level=detail_level
+ )
else:
return oinspect.object_info(name=oname, found=False)
6 IPython/testing/iptest.py
@@ -233,6 +233,7 @@ def make_exclude():
# We do this unconditionally, so that the test suite doesn't import
# gtk, changing the default encoding and masking some unicode bugs.
exclusions.append(ipjoin('lib', 'inputhookgtk'))
+ exclusions.append(ipjoin('zmq', 'gui', 'gtkembed'))
# These have to be skipped on win32 because the use echo, rm, cd, etc.
# See ticket https://github.com/ipython/ipython/issues/87
@@ -263,7 +264,9 @@ def make_exclude():
if not have['matplotlib']:
exclusions.extend([ipjoin('core', 'pylabtools'),
- ipjoin('core', 'tests', 'test_pylabtools')])
+ ipjoin('core', 'tests', 'test_pylabtools'),
+ ipjoin('zmq', 'pylab'),
+ ])
if not have['tornado']:
exclusions.append(ipjoin('frontend', 'html'))
@@ -385,6 +388,7 @@ def make_runners():
'scripts', 'testing', 'utils', 'nbformat' ]
if have['zmq']:
+ nose_pkg_names.append('zmq')
nose_pkg_names.append('parallel')
# For debugging this code, only load quick stuff
5 IPython/zmq/ipkernel.py
@@ -364,7 +364,10 @@ def complete_request(self, ident, parent):
self.log.debug(str(completion_msg))
def object_info_request(self, ident, parent):
- object_info = self.shell.object_inspect(parent['content']['oname'])
+ content = parent['content']
+ object_info = self.shell.object_inspect(content['oname'],
+ detail_level = content.get('detail_level', 0)
+ )
# Before we send this object over, we scrub it for JSON usage
oinfo = json_clean(object_info)
msg = self.session.send(self.shell_socket, 'object_info_reply',
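
For clarity, the net effect of this handler change on the wire: an object_info_request may now carry an optional detail_level, and the kernel falls back to 0 when the key is absent, so older frontends are unaffected. A hand-written example of such a request's content (the object name is illustrative only):

    # object_info_request content after this change (illustrative values)
    content = {
        'oname' : 'my_function',  # hypothetical object name
        'detail_level' : 2,       # optional; the kernel uses content.get('detail_level', 0)
    }
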
23 IPython/zmq/kernelmanager.py
@@ -200,6 +200,10 @@ def run(self):
self.stream = zmqstream.ZMQStream(self.socket, self.ioloop)
self.stream.on_recv(self._handle_recv)
self._run_loop()
+ try:
+ self.socket.close()
+ except:
+ pass
def stop(self):
self.ioloop.stop()
@@ -297,19 +301,21 @@ def complete(self, text, line, cursor_pos, block=None):
self._queue_send(msg)
return msg['header']['msg_id']
- def object_info(self, oname):
+ def object_info(self, oname, detail_level=0):
"""Get metadata information about an object.
Parameters
----------
oname : str
A string specifying the object name.
+ detail_level : int, optional
+ The level of detail for the introspection (0-2)
Returns
-------
The msg_id of the message sent.
"""
- content = dict(oname=oname)
+ content = dict(oname=oname, detail_level=detail_level)
msg = self.session.msg('object_info_request', content)
self._queue_send(msg)
return msg['header']['msg_id']
@@ -388,6 +394,10 @@ def run(self):
self.stream = zmqstream.ZMQStream(self.socket, self.ioloop)
self.stream.on_recv(self._handle_recv)
self._run_loop()
+ try:
+ self.socket.close()
+ except:
+ pass
def stop(self):
self.ioloop.stop()
@@ -450,6 +460,11 @@ def run(self):
self.stream = zmqstream.ZMQStream(self.socket, self.ioloop)
self.stream.on_recv(self._handle_recv)
self._run_loop()
+ try:
+ self.socket.close()
+ except:
+ pass
+
def stop(self):
self.ioloop.stop()
@@ -573,6 +588,10 @@ def run(self):
# and close/reopen the socket, because the REQ/REP cycle has been broken
self._create_socket()
continue
+ try:
+ self.socket.close()
+ except:
+ pass
def pause(self):
"""Pause the heartbeat."""
408 IPython/zmq/tests/test_message_spec.py
@@ -7,34 +7,424 @@
# the file COPYING.txt, distributed as part of this software.
#-----------------------------------------------------------------------------
+import re
import sys
import time
+from subprocess import PIPE
+from Queue import Empty
import nose.tools as nt
from ..blockingkernelmanager import BlockingKernelManager
+
+from IPython.testing import decorators as dec
from IPython.utils import io
+from IPython.utils.traitlets import (
+ HasTraits, TraitError, Bool, Unicode, Dict, Integer, List, Enum,
+)
+
+#-----------------------------------------------------------------------------
+# Global setup and utilities
+#-----------------------------------------------------------------------------
def setup():
global KM
KM = BlockingKernelManager()
- KM.start_kernel()
+ KM.start_kernel(stdout=PIPE, stderr=PIPE)
KM.start_channels()
- # Give the kernel a chance to come up.
- time.sleep(1)
+
def teardown():
- io.rprint('Entering teardown...') # dbg
- io.rprint('Stopping channels and kernel...') # dbg
KM.stop_channels()
- KM.kill_kernel()
+ KM.shutdown_kernel()
+
+
+def flush_channels():
+ """flush any messages waiting on the queue"""
+ for channel in (KM.shell_channel, KM.sub_channel):
+ while True:
+ try:
+ msg = channel.get_msg(block=True, timeout=0.1)
+ except Empty:
+ break
+ else:
+ validate_message(msg)
+
+
+def execute(code='', **kwargs):
+ """wrapper for doing common steps for validating an execution request"""
+ shell = KM.shell_channel
+ sub = KM.sub_channel
+
+ msg_id = shell.execute(code=code, **kwargs)
+ reply = shell.get_msg(timeout=2)
+ validate_message(reply, 'execute_reply', msg_id)
+ busy = sub.get_msg(timeout=2)
+ validate_message(busy, 'status', msg_id)
+ nt.assert_equals(busy['content']['execution_state'], 'busy')
+
+ if not kwargs.get('silent'):
+ pyin = sub.get_msg(timeout=2)
+ validate_message(pyin, 'pyin', msg_id)
+ nt.assert_equals(pyin['content']['code'], code)
+
+ return msg_id, reply['content']
+
+#-----------------------------------------------------------------------------
+# MSG Spec References
+#-----------------------------------------------------------------------------
+
+
+class Reference(HasTraits):
+
+ def check(self, d):
+ """validate a dict against our traits"""
+ for key in self.trait_names():
+ yield nt.assert_true(key in d, "Missing key: %r, should be found in %s" % (key, d))
+ # FIXME: always allow None, probably not a good idea
+ if d[key] is None:
+ continue
+ try:
+ setattr(self, key, d[key])
+ except TraitError as e:
+ yield nt.assert_true(False, str(e))
+
+
+class RMessage(Reference):
+ msg_id = Unicode()
+ msg_type = Unicode()
+ header = Dict()
+ parent_header = Dict()
+ content = Dict()
+
+class RHeader(Reference):
+ msg_id = Unicode()
+ msg_type = Unicode()
+ session = Unicode()
+ username = Unicode()
+
+class RContent(Reference):
+ status = Enum((u'ok', u'error'))
+
+
+class ExecuteReply(Reference):
+ execution_count = Integer()
+ status = Enum((u'ok', u'error'))
+
+ def check(self, d):
+ for tst in Reference.check(self, d):
+ yield tst
+ if d['status'] == 'ok':
+ for tst in ExecuteReplyOkay().check(d):
+ yield tst
+ elif d['status'] == 'error':
+ for tst in ExecuteReplyError().check(d):
+ yield tst
+
+
+class ExecuteReplyOkay(Reference):
+ payload = List(Dict)
+ user_variables = Dict()
+ user_expressions = Dict()
+
+class ExecuteReplyError(Reference):
+ ename = Unicode()
+ evalue = Unicode()
+ traceback = List(Unicode)
-# Actual tests
+class OInfoReply(Reference):
+ name = Unicode()
+ found = Bool()
+ ismagic = Bool()
+ isalias = Bool()
+ namespace = Enum((u'builtin', u'magics', u'alias', u'Interactive'))
+ type_name = Unicode()
+ string_form = Unicode()
+ base_class = Unicode()
+ length = Integer()
+ file = Unicode()
+ definition = Unicode()
+ argspec = Dict()
+ init_definition = Unicode()
+ docstring = Unicode()
+ init_docstring = Unicode()
+ class_docstring = Unicode()
+ call_def = Unicode()
+ call_docstring = Unicode()
+ source = Unicode()
+
+ def check(self, d):
+ for tst in Reference.check(self, d):
+ yield tst
+ if d['argspec'] is not None:
+ for tst in ArgSpec().check(d['argspec']):
+ yield tst
+
+
+class ArgSpec(Reference):
+ args = List(Unicode)
+ varargs = Unicode()
+ varkw = Unicode()
+ defaults = List()
+
+
+class Status(Reference):
+ execution_state = Enum((u'busy', u'idle'))
+
+
+class CompleteReply(Reference):
+ matches = List(Unicode)
+
+
+# IOPub messages
+
+class PyIn(Reference):
+ code = Unicode()
+ execution_count = Integer()
+
+
+PyErr = ExecuteReplyError
+
+
+class Stream(Reference):
+ name = Enum((u'stdout', u'stderr'))
+ data = Unicode()
+
+
+mime_pat = re.compile(r'\w+/\w+')
+
+class DisplayData(Reference):
+ source = Unicode()
+ metadata = Dict()
+ data = Dict()
+ def _data_changed(self, name, old, new):
+ for k,v in new.iteritems():
+ nt.assert_true(mime_pat.match(k))
+ nt.assert_true(isinstance(v, basestring), "expected string data, got %r" % v)
+
+
+references = {
+ 'execute_reply' : ExecuteReply(),
+ 'object_info_reply' : OInfoReply(),
+ 'status' : Status(),
+ 'complete_reply' : CompleteReply(),
+ 'pyin' : PyIn(),
+ 'pyerr' : PyErr(),
+ 'stream' : Stream(),
+ 'display_data' : DisplayData(),
+}
+
+
+def validate_message(msg, msg_type=None, parent=None):
+ """validate a message"""
+ RMessage().check(msg)
+ if msg_type:
+ yield nt.assert_equals(msg['msg_type'], msg_type)
+ if parent:
+ yield nt.assert_equal(msg['parent_header']['msg_id'], parent)
+ content = msg['content']
+ ref = references[msg['msg_type']]
+ for tst in ref.check(content):
+ yield tst
+
+
+#-----------------------------------------------------------------------------
+# Tests
+#-----------------------------------------------------------------------------
+
+# Shell channel
+
+@dec.parametric
def test_execute():
- KM.shell_channel.execute(code='x=1')
- KM.shell_channel.execute(code='print 1')
+ flush_channels()
+ shell = KM.shell_channel
+ msg_id = shell.execute(code='x=1')
+ reply = shell.get_msg(timeout=2)
+ for tst in validate_message(reply, 'execute_reply', msg_id):
+ yield tst
+
+
+@dec.parametric
+def test_execute_silent():
+ flush_channels()
+ msg_id, reply = execute(code='x=1', silent=True)
+
+ # flush status=idle
+ status = KM.sub_channel.get_msg(timeout=2)
+ for tst in validate_message(status, 'status', msg_id):
+ yield tst
+ nt.assert_equals(status['content']['execution_state'], 'idle')
+
+ yield nt.assert_raises(Empty, KM.sub_channel.get_msg, timeout=0.1)
+ count = reply['execution_count']
+
+ msg_id, reply = execute(code='x=2', silent=True)
+
+ # flush status=idle
+ status = KM.sub_channel.get_msg(timeout=2)
+ for tst in validate_message(status, 'status', msg_id):
+ yield tst
+ yield nt.assert_equals(status['content']['execution_state'], 'idle')
+
+ yield nt.assert_raises(Empty, KM.sub_channel.get_msg, timeout=0.1)
+ count_2 = reply['execution_count']
+ yield nt.assert_equals(count_2, count)
+
+
+@dec.parametric
+def test_execute_error():
+ flush_channels()
+
+ msg_id, reply = execute(code='1/0')
+ yield nt.assert_equals(reply['status'], 'error')
+ yield nt.assert_equals(reply['ename'], 'ZeroDivisionError')
+
+ pyerr = KM.sub_channel.get_msg(timeout=2)
+ for tst in validate_message(pyerr, 'pyerr', msg_id):
+ yield tst
+
+
+def test_execute_inc():
+ """execute request should increment execution_count"""
+ flush_channels()
+
+ msg_id, reply = execute(code='x=1')
+ count = reply['execution_count']
+
+ flush_channels()
+
+ msg_id, reply = execute(code='x=2')
+ count_2 = reply['execution_count']
+ nt.assert_equals(count_2, count+1)
+
+
+def test_user_variables():
+ flush_channels()
+
+ msg_id, reply = execute(code='x=1', user_variables=['x'])
+ user_variables = reply['user_variables']
+ nt.assert_equals(user_variables, {u'x' : u'1'})
+
+
+def test_user_expressions():
+ flush_channels()
+
+ msg_id, reply = execute(code='x=1', user_expressions=dict(foo='x+1'))
+ user_expressions = reply['user_expressions']
+ nt.assert_equals(user_expressions, {u'foo' : u'2'})
+
+
+@dec.parametric
+def test_oinfo():
+ flush_channels()
+
+ shell = KM.shell_channel
+
+ msg_id = shell.object_info('a')
+ reply = shell.get_msg(timeout=2)
+ for tst in validate_message(reply, 'object_info_reply', msg_id):
+ yield tst
+
+
+@dec.parametric
+def test_oinfo_found():
+ flush_channels()
+
+ shell = KM.shell_channel
+
+ msg_id, reply = execute(code='a=5')
+
+ msg_id = shell.object_info('a')
+ reply = shell.get_msg(timeout=2)
+ for tst in validate_message(reply, 'object_info_reply', msg_id):
+ yield tst
+ content = reply['content']
+ yield nt.assert_true(content['found'])
+ argspec = content['argspec']
+ yield nt.assert_true(argspec is None, "didn't expect argspec dict, got %r" % argspec)
+
+
+@dec.parametric
+def test_oinfo_detail():
+ flush_channels()
+
+ shell = KM.shell_channel
+
+ msg_id, reply = execute(code='ip=get_ipython()')
+
+ msg_id = shell.object_info('ip.object_inspect', detail_level=2)
+ reply = shell.get_msg(timeout=2)
+ for tst in validate_message(reply, 'object_info_reply', msg_id):
+ yield tst
+ content = reply['content']
+ yield nt.assert_true(content['found'])
+ argspec = content['argspec']
+ yield nt.assert_true(isinstance(argspec, dict), "expected non-empty argspec dict, got %r" % argspec)
+ yield nt.assert_equals(argspec['defaults'], [0])
+
+
+@dec.parametric
+def test_oinfo_not_found():
+ flush_channels()
+
+ shell = KM.shell_channel
+
+ msg_id = shell.object_info('dne')
+ reply = shell.get_msg(timeout=2)
+ for tst in validate_message(reply, 'object_info_reply', msg_id):
+ yield tst
+ content = reply['content']
+ yield nt.assert_false(content['found'])
+
+
+@dec.parametric
+def test_complete():
+ flush_channels()
+
+ shell = KM.shell_channel
+
+ msg_id, reply = execute(code="alpha = albert = 5")
+
+ msg_id = shell.complete('al', 'al', 2)
+ reply = shell.get_msg(timeout=2)
+ for tst in validate_message(reply, 'complete_reply', msg_id):
+ yield tst
+ matches = reply['content']['matches']
+ for name in ('alpha', 'albert'):
+ yield nt.assert_true(name in matches, "Missing match: %r" % name)
+
+
+# IOPub channel
+
+
+@dec.parametric
+def test_stream():
+ flush_channels()
+
+ msg_id, reply = execute("print('hi')")
+
+ stdout = KM.sub_channel.get_msg(timeout=2)
+ for tst in validate_message(stdout, 'stream', msg_id):
+ yield tst
+ content = stdout['content']
+ yield nt.assert_equals(content['name'], u'stdout')
+ yield nt.assert_equals(content['data'], u'hi\n')
+
+
+@dec.parametric
+def test_display_data():
+ flush_channels()
+
+ msg_id, reply = execute("from IPython.core.display import display; display(1)")
+
+ display = KM.sub_channel.get_msg(timeout=2)
+ for tst in validate_message(display, 'display_data', parent=msg_id):
+ yield tst
+ data = display['content']['data']
+ yield nt.assert_equals(data['text/plain'], u'1')
+
2  IPython/zmq/zmqshell.py
@@ -34,6 +34,7 @@
from IPython.lib.kernel import (
get_connection_file, get_connection_info, connect_qtconsole
)
+from IPython.testing.skipdoctest import skip_doctest
from IPython.utils import io
from IPython.utils.jsonutil import json_clean
from IPython.utils.path import get_py_filename
@@ -256,6 +257,7 @@ def magic_doctest_mode(self,parameter_s=''):
mode=dstore.mode)
self.payload_manager.write_payload(payload)
+ @skip_doctest
def magic_edit(self,parameter_s='',last_call=['','']):
"""Bring up an editor and execute the resulting code.
28 docs/source/development/messaging.txt
@@ -324,22 +324,17 @@ Message type: ``execute_reply``::
When status is 'ok', the following extra fields are present::
{
- # The execution payload is a dict with string keys that may have been
+ # 'payload' will be a list of payload dicts.
+ # Each execution payload is a dict with string keys that may have been
# produced by the code being executed. It is retrieved by the kernel at
# the end of the execution and sent back to the front end, which can take
# action on it as needed. See main text for further details.
- 'payload' : dict,
+ 'payload' : list(dict),
# Results for the user_variables and user_expressions.
'user_variables' : dict,
'user_expressions' : dict,
-
- # The kernel will often transform the input provided to it. If the
- # '---->' transform had been applied, this is filled, otherwise it's the
- # empty string. So transformations like magics don't appear here, only
- # autocall ones.
- 'transformed_code' : str,
- }
+ }
.. admonition:: Execution payloads
@@ -347,20 +342,19 @@ When status is 'ok', the following extra fields are present::
given set of code, which normally is just displayed on the pyout stream
through the PUB socket. The idea of a payload is to allow special types of
code, typically magics, to populate a data container in the IPython kernel
- that will be shipped back to the caller via this channel. The kernel will
- have an API for this, probably something along the lines of::
+ that will be shipped back to the caller via this channel. The kernel
+ has an API for this in the PayloadManager::
- ip.exec_payload_add(key, value)
+ ip.payload_manager.write_payload(payload_dict)
- though this API is still in the design stages. The data returned in this
- payload will allow frontends to present special views of what just happened.
+ which appends a dictionary to the list of payloads.
When status is 'error', the following extra fields are present::
{
- 'exc_name' : str, # Exception name, as a string
- 'exc_value' : str, # Exception value, as a string
+ 'ename' : str, # Exception name, as a string
+ 'evalue' : str, # Exception value, as a string
# The traceback will contain a list of frames, represented each as a
# string. For now we'll stick to the existing design of ultraTB, which
@@ -853,7 +847,7 @@ Message type: ``crash``::
content = {
# Similarly to the 'error' case for execute_reply messages, this will
- # contain exc_name, exc_type and traceback fields.
+ # contain ename, etype and traceback fields.
# An additional field with supplementary information such as where to
# send the crash message
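
To make the corrected execute_reply wording concrete, here is a hand-written example of the two content shapes under the updated spec (all values, including the payload dict's keys, are illustrative only):

    # status == 'ok': 'payload' is a list of payload dicts (often empty)
    content = {
        'status' : u'ok',
        'execution_count' : 3,
        'payload' : [ {'source' : 'page', 'text' : 'some text to page'} ],  # keys illustrative
        'user_variables' : {},
        'user_expressions' : {},
    }

    # status == 'error': the exception keys are ename/evalue, matching the code
    content = {
        'status' : u'error',
        'execution_count' : 4,
        'ename' : u'ZeroDivisionError',
        'evalue' : u'integer division or modulo by zero',
        'traceback' : [u'...'],
    }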