.iloc with 1-element integer array behaves badly #5006

Closed
@hmeine

Description

There is an inconsistency in the returned (Python) types when indexing with .iloc and an integer array. The result should always be a sequence (a Series), but if the array contains exactly one index, a scalar value is returned instead:

>>> import pandas, numpy
>>> column = pandas.Series(numpy.arange(10))
>>> indices, = (column > 4).nonzero()
>>> column.iloc[indices]
5    5
6    6
7    7
8    8
9    9
dtype: int64
>>> indices, = (column == 5).nonzero()
>>> column.iloc[indices]
5
>>> indices, = (column == 99).nonzero()
>>> column.iloc[indices]
Series([], dtype: int64)

I am working around this using numpy.atleast_1d, but it's ugly.
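For reference, a sketch of that workaround (note that Series.nonzero, used in the transcript above, has since been removed from pandas, so numpy.flatnonzero stands in for it here):

```python
import numpy
import pandas

column = pandas.Series(numpy.arange(10))

# numpy.flatnonzero returns the 1-d array of positions where the
# boolean mask is True (same result as the old Series.nonzero()).
indices = numpy.flatnonzero(column == 5)

# numpy.atleast_1d guarantees the index array is at least 1-d, so
# .iloc returns a Series even when exactly one position matches.
result = column.iloc[numpy.atleast_1d(indices)]
print(type(result).__name__)
```

With the workaround applied, the single-match case yields a one-element Series rather than a bare scalar, matching the multi-match and empty-match cases.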

Metadata

    Labels

    Bug · Dtype Conversions (Unexpected or buggy dtype conversions) · Indexing (Related to indexing on series/frames, not to indexes themselves)
