Closed
Labels
api: bigtable (Issues related to the Bigtable API)
type: feature request (‘Nice-to-have’ improvement, new feature or different behavior or design)
Description
It appears to me that if I want to scan through all rows of a Bigtable, Table.read_rows()
is the function to use. This returns a PartialRowsData.
Why is PartialRowsData not iterable? I'd like to consume all the rows for a "hello bigtable" sample by iterating over them, something like:
print('Scan for all greetings:')
for row in table.read_rows():
    print(row.cells[COLUMN_FAMILY_NAME][COLUMN_NAME.encode('UTF-8')][0].decode('UTF-8'))
Instead, it appears that I need to use a while loop and watch a dictionary that gets populated (roughly like the sketch below). I think this means that even if I don't need to keep all the rows in memory at once, the library still does. That's both inefficient and not idiomatic Python, which would use generators / iterables here.
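For comparison, this is roughly the workaround I end up writing today. It's only a sketch: consume_next() and the .rows dict are my reading of the current PartialRowsData API, and the cell lookup just mirrors the snippet above.

partial_rows = table.read_rows()
while True:
    try:
        partial_rows.consume_next()   # pull one more chunk of the response stream
    except StopIteration:
        break                         # stream exhausted
# At this point every row is sitting in the partial_rows.rows dict at once.
for row_key, row in partial_rows.rows.items():
    print(row.cells[COLUMN_FAMILY_NAME][COLUMN_NAME.encode('UTF-8')][0].decode('UTF-8'))

If read_rows() returned an iterator (or PartialRowsData implemented __iter__), the while loop and the accumulating dictionary would disappear and rows could be yielded one at a time as they arrive.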