concat on axis=1 and ignore_index=True raises TypeError #871

Closed
lbeltrame opened this issue Mar 6, 2012 · 1 comment

@lbeltrame
Contributor

Test case:

>>> frame1 = pandas.DataFrame({"test1":["a", "b", "c"], "test2":[1,2,3], "test3": [4.5, 3.2, 1.2]})
>>> frame2 = pandas.DataFrame({"test4": [5.2, 2.2, 4.3]})
>>> pandas.concat([frame1, frame2], ignore_index=True, axis=1)

/usr/lib64/python2.7/site-packages/pandas/tools/merge.pyc in concat(objs, axis, join, join_axes, ignore_index, keys, levels, names,
    689                        keys=keys, levels=levels, names=names,
    690                        verify_integrity=verify_integrity)
--> 691     return op.get_result()
    692 
    693 

/usr/lib64/python2.7/site-packages//pandas/tools/merge.pyc in get_result(self)
    759                              columns=self.new_axes[1])
    760         else:
--> 761             new_data = self._get_concatenated_data()
    762             return self.objs[0]._from_axes(new_data, self.new_axes)
    763 

/usr/lib64/python2.7/site-packages/pandas/tools/merge.pyc in _get_concatenated_data(self)
    776             for kind in kinds:
    777                 klass_blocks = [mapping.get(kind) for mapping in blockmaps]
--> 778                 stacked_block = self._concat_blocks(klass_blocks)
    779                 new_blocks.append(stacked_block)
    780             new_data = BlockManager(new_blocks, self.new_axes)

/usr/lib64/python2.7/site-packages/pandas/tools/merge.pyc in _concat_blocks(self, blocks)
    832                 concat_items = _concat_indexes(all_items)
    833 
--> 834             return make_block(concat_values, concat_items, self.new_axes[0])
    835 
    836     def _concat_single_item(self, item):

/usr/lib64/python2.7/site-packages/pandas/core/internals.pyc in make_block(values, items, ref_items, do_integrity_check)
    287 
    288     return klass(values, items, ref_items, ndim=values.ndim,
--> 289                  do_integrity_check=do_integrity_check)
    290 
    291 # TODO: flexible with index=None and/or items=None
/usr/lib64/python2.7/site-packages/pandas/core/internals.pyc in __init__(self, values, items, ref_items, ndim, do_integrity_check)
     28         self.ndim = ndim
     29         self.items = _ensure_index(items)
---> 30         self.ref_items = _ensure_index(ref_items)
     31 
     32         if do_integrity_check:

/usr/lib64/python2.7/site-packages/pandas/core/index.pyc in _ensure_index(index_like)
   2152     if hasattr(index_like, 'name'):
   2153         return Index(index_like, name=index_like.name)
-> 2154     return Index(index_like)
   2155 
   2156 def _validate_join_method(method):

/usr/lib64/python2.7/site-packages/pandas/core/index.pyc in __new__(cls, data, dtype, copy, name)
     70         else:
     71             # other iterable of some kind
---> 72             subarr = com._asarray_tuplesafe(data, dtype=object)
     73 
     74         if lib.is_integer_array(subarr) and dtype is None:

/usr/lib64/python2.7/site-packages/pandas/core/common.pyc in _asarray_tuplesafe(values, dtype)
    493 def _asarray_tuplesafe(values, dtype=None):
    494     if not isinstance(values, (list, tuple, np.ndarray)):
--> 495         values = list(values)
    496 
    497     if isinstance(values, list) and dtype in [np.object_, object]:

TypeError: 'NoneType' object is not iterable

From today's git.
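
For anyone hitting this in the meantime, a possible workaround (my own sketch, reusing frame1 and frame2 from the test case above, and assuming the intent of ignore_index=True with axis=1 is simply to get renumbered columns) is to concatenate without ignore_index and reset the column labels by hand:

>>> result = pandas.concat([frame1, frame2], axis=1)  # succeeds without ignore_index
>>> result.columns = range(len(result.columns))       # relabel columns as 0..N-1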

@adamklein
Contributor

Thx for reporting
