Add a test for pickling (Python 3 only; not supported on Python 2)
  - Added __eq__ to the Parser class
  - Added test_parser_pickling test (skipped on Py2 and for parsers with a schema)
  - Updated changelog
brunato committed Jun 13, 2018
1 parent 5d10df1 commit a6f4f41
Showing 3 changed files with 59 additions and 12 deletions.
49 changes: 37 additions & 12 deletions CHANGELOG.rst
@@ -2,29 +2,54 @@
CHANGELOG
*********

v1.0.5
======
`1.0.8`_ (2018-06-13)
=====================
* Fixed token classes creation for parsers serialization

`1.0.7`_ (2018-05-07)
=====================
* Added autodoc based manual with Sphinx

`1.0.6`_ (2018-05-02)
=====================
* Added tox testing
* Improved the parser class with raw_advance method

`1.0.5`_ (2018-03-31)
=====================
* Added 10 XPath 2.0 functions for strings
* Fixed README.rst for correct rendering on PyPI
* Added ElementPathMissingContextError exception for correct
  handling of static context evaluation

v1.0.4
======
`1.0.4`_ (2018-03-27)
=====================
* Fixed packaging ('packages' argument in setup.py).

v1.0.3
======
`1.0.3`_ (2018-03-27)
=====================
* Fixed the effective boolean value for a list containing an empty string.

v1.0.2
======
`1.0.2`_ (2018-03-27)
=====================
* Added QName parsing as in the ElementPath library (usage regulated by a *strict* flag).

v1.0.1
======
`1.0.1`_ (2018-03-27)
=====================
* Some bug fixes for attributes selection.

v1.0
====
`1.0.0`_ (2018-03-26)
=====================
* First stable version.


.. _1.0.0: https://github.com/brunato/elementpath/commit/b28da83
.. _1.0.1: https://github.com/brunato/elementpath/compare/1.0.0...1.0.1
.. _1.0.2: https://github.com/brunato/elementpath/compare/1.0.1...1.0.2
.. _1.0.3: https://github.com/brunato/elementpath/compare/1.0.2...1.0.3
.. _1.0.4: https://github.com/brunato/elementpath/compare/1.0.3...1.0.4
.. _1.0.5: https://github.com/brunato/elementpath/compare/1.0.4...1.0.5
.. _1.0.6: https://github.com/brunato/elementpath/compare/1.0.5...1.0.6
.. _1.0.7: https://github.com/brunato/elementpath/compare/1.0.6...1.0.7
.. _1.0.8: https://github.com/brunato/elementpath/compare/1.0.7...1.0.8

10 changes: 10 additions & 0 deletions elementpath/tdop_parser.py
@@ -247,6 +247,16 @@ def __init__(self):
        self.tokens = iter(())
        self.source = ''

    def __eq__(self, other):
        if self.token_base_class != other.token_base_class:
            return False
        elif self.SYMBOLS != other.SYMBOLS:
            return False
        elif self.symbol_table != other.symbol_table:
            return False
        else:
            return True

    def parse(self, source):
        """
        Parses a source code of the formal language. This is the main method that has to be
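
The __eq__ added above compares the attributes that define a parser's grammar: the token base class, the SYMBOLS set and the symbol_table. A minimal sketch of the resulting behaviour, assuming the public XPath1Parser and XPath2Parser classes exported by elementpath and default constructor arguments:

# Sketch only: assumes elementpath at this commit (1.0.8), where the
# Parser base class defines the __eq__ shown above.
from elementpath import XPath1Parser, XPath2Parser

# Instances of the same parser class share token_base_class, SYMBOLS
# and symbol_table, so the new __eq__ reports them as equal.
assert XPath1Parser() == XPath1Parser()

# A different parser class registers its own token classes and symbol
# table, so the comparison fails.
assert XPath1Parser() != XPath2Parser()
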
12 changes: 12 additions & 0 deletions tests/test_elementpath.py
@@ -13,6 +13,7 @@
import sys
import io
import math
import pickle
from decimal import Decimal
from collections import namedtuple
from xml.etree import ElementTree
@@ -120,6 +121,8 @@ def setUpClass(cls):
        cls.parser = XPath1Parser(namespaces=cls.namespaces, variables=cls.variables, strict=True)
        cls.etree = ElementTree

    #
    # Helper methods
    def check_tokenizer(self, path, expected):
        """Check the list of tokens generated by the tokenizer."""
        self.assertEqual([
@@ -201,6 +204,15 @@ def wrong_type(self, path):
    def wrong_name(self, path):
        self.assertRaises(ElementPathNameError, self.parser.parse, path)

    #
    # Test methods
    @unittest.skipIf(sys.version_info < (3,), "Python 2 pickling is not supported.")
    def test_parser_pickling(self):
        if getattr(self.parser, 'schema', None) is None:
            obj = pickle.dumps(self.parser)
            parser = pickle.loads(obj)
            self.assertEqual(self.parser, parser)

    def test_xpath_tokenizer(self):
        # tests from the XPath specification
        self.check_tokenizer("*", ['*'])
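
The new test only exercises a plain pickle round trip and relies on the Parser.__eq__ added above to check the result. A rough stand-alone equivalent, assuming Python 3 and a parser without an attached schema (the case the test guards for):

# Sketch of the round trip performed by test_parser_pickling (Python 3
# only); parsers bound to a schema are skipped by the test.
import pickle

from elementpath import XPath1Parser

parser = XPath1Parser()
restored = pickle.loads(pickle.dumps(parser))

# Equality here means: same token base class, SYMBOLS and symbol table.
assert restored == parser
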
