There is an inconsistency in the returned Python types when indexing .iloc with an integer array. The result should always be a sequence (a Series), but if the array contains exactly one index, a scalar value is returned instead:
>>> import pandas, numpy
>>> column = pandas.Series(numpy.arange(10))
>>> indices, = (column > 4).nonzero()
>>> column.iloc[indices]
5 5
6 6
7 7
8 8
9 9
dtype: int64
>>> indices, = (column == 5).nonzero()
>>> column.iloc[indices]
5
>>> indices, = (column == 99).nonzero()
>>> column.iloc[indices]
Series([], dtype: int64)
I am working around that using numpy.atleast_1d, but it's ugly.
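For reference, the workaround can be sketched as a small helper like the one below (the name iloc_as_array is hypothetical, introduced here for illustration). It wraps the result of .iloc in numpy.atleast_1d so a scalar becomes a length-1 array while an already-sequence result is left alone; on pandas versions where .iloc consistently returns a Series, the wrapper is simply a no-op. Note that the sketch calls .to_numpy().nonzero() rather than Series.nonzero, since the latter has since been removed from pandas.

```python
import numpy as np
import pandas as pd

column = pd.Series(np.arange(10))

def iloc_as_array(series, boolean_mask):
    """Select by boolean mask via .iloc, always returning a 1-d array.

    np.atleast_1d promotes a scalar result to a length-1 array and
    leaves a multi-element result unchanged, papering over the
    inconsistency described above.
    """
    indices, = boolean_mask.to_numpy().nonzero()
    return np.atleast_1d(np.asarray(series.iloc[indices]))

print(iloc_as_array(column, column == 5))   # [5]
print(iloc_as_array(column, column > 4))    # [5 6 7 8 9]
print(iloc_as_array(column, column == 99))  # []
```

With this helper, callers can iterate over the result unconditionally, regardless of how many positions matched.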