Background

In #63, the function can only remove duplicate elements from a list, and because it uses a set, the elements of the deduplicated list come out in no particular order.

Steps

- Add a key parameter that specifies a function for converting each element of the sequence into a hashable value, so that duplicate elements can be detected.
- Maintain a set of the keys seen so far; if an element is not a duplicate, add its key to the set and keep the element, so the original order is preserved.

Example

>>> list1 = [1, 23, 4, 5, 6, 1, 7, 23]
>>> list(remove_duplicate_elements(list1))
[1, 23, 4, 5, 6, 7]
>>> list2 = [{'x': 1, 'y': 2}, {'x': 1, 'y': 3}, {'x': 1, 'y': 2}]
>>> list(remove_duplicate_elements(list2, key=lambda d: (d['x'], d['y'])))
[{'x': 1, 'y': 2}, {'x': 1, 'y': 3}]

Referenced commit 45466d3: 2018.7.7 v1.0.15 add function fish_common.remove_duplicate_elements doc and unittest from issue chinapnr#63 and chinapnr#77
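The steps above can be sketched as a small generator. This is a minimal reimplementation based on the issue's description (the name `remove_duplicate_elements` and the `key` parameter come from the issue text; the actual fishbase implementation may differ):

```python
def remove_duplicate_elements(items, key=None):
    """Yield items in their original order, skipping duplicates.

    `key` maps an element to a hashable value used for the duplicate
    check; by default the element itself must be hashable.
    """
    seen = set()  # hashable keys of the elements already yielded
    for item in items:
        k = item if key is None else key(item)
        if k not in seen:
            seen.add(k)       # remember this key
            yield item        # first occurrence: keep it

# Mirrors the doctest above:
print(list(remove_duplicate_elements([1, 23, 4, 5, 6, 1, 7, 23])))
# → [1, 23, 4, 5, 6, 7]
```

Because only the first occurrence of each key is yielded, the input order is preserved, unlike the plain `set()` approach in #63; the `key` callable also makes unhashable elements such as dicts deduplicable.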