I'm using csv.DictReader
to read in a CSV file because accessing items by fieldname is very convenient, but the csv.DictWriter
class is very finicky about how it deals with fieldnames. I get lots of "dict contains fields not in fieldnames"
exceptions when I deliberately provide fewer fieldnames than the dicts contain. I also want to be able to provide a list of fieldnames with keys that might not appear in every row of the dictionary list.
So I've created a way to convert a list of dictionaries into a 2D array that I can use with the csv.writer.writerow
function.
Question:
What I'm wondering is whether my method is good, bad, or ugly. Is there a better/more Pythonic way of converting a list of dictionaries with arbitrary fieldnames to a 2D array? Am I missing something obvious with csv.DictWriter?
Code:
What it does is:
The output skips fieldnames that you don't provide, puts a blank cell wherever a provided fieldname doesn't appear in a given row (or any row), and still includes every provided fieldname in the header at the top of the CSV file.
def csvdict_to_array(dictlist, fieldnames):
    # Start with header row
    csv_array = [fieldnames]
    for row in dictlist:
        csv_array.append(dictlist_row_to_list(row, fieldnames))
    return csv_array

def dictlist_row_to_list(dictlist_row, fieldnames):
    csv_row = []
    for field in fieldnames:
        if field not in dictlist_row:
            csv_row.append('')
        else:
            csv_row.append(dictlist_row[field])
    return csv_row
Sample input/output:
fieldnames = ["one", "three", "ten"]
dictlist = [{"one": "bob", "two": "bill", "three": "cat"},
            {"one": "john", "two": "jack", "ten": "dog"}]
Output:
one,three,ten
bob,cat,
john,,dog
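For context, here is a self-contained sketch of how the function above feeds csv.writer (writing to an io.StringIO here instead of a file, and using dict.get for the blank-cell lookup):

```python
import csv
import io

def csvdict_to_array(dictlist, fieldnames):
    # Start with the header row, then one list per dictionary.
    csv_array = [fieldnames]
    for row in dictlist:
        csv_array.append(dictlist_row_to_list(row, fieldnames))
    return csv_array

def dictlist_row_to_list(dictlist_row, fieldnames):
    # Blank cell for any fieldname missing from this row.
    return [dictlist_row.get(field, '') for field in fieldnames]

fieldnames = ["one", "three", "ten"]
dictlist = [{"one": "bob", "two": "bill", "three": "cat"},
            {"one": "john", "two": "jack", "ten": "dog"}]

buf = io.StringIO()
csv.writer(buf).writerows(csvdict_to_array(dictlist, fieldnames))
print(buf.getvalue())
```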
Thanks for your time
This produces your output:
fieldnames = ["one", "three", "ten"]
dictlist = [{"one": "bob", "two": "bill", "three": "cat"},
            {"one": "john", "two": "jack", "ten": "dog"}]
res = [[item.get(key, '') for key in fieldnames] for item in dictlist]
res.insert(0, fieldnames)
print(res)
Result:
[['one', 'three', 'ten'], ['bob', 'cat', ''], ['john', '', 'dog']]
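As for the "missing something obvious with csv.DictWriter" part: DictWriter already supports this case through its extrasaction='ignore' argument (silently drop keys not in fieldnames instead of raising) and restval='' (blank cell for fieldnames missing from a row). A minimal sketch writing to an io.StringIO:

```python
import csv
import io

fieldnames = ["one", "three", "ten"]
dictlist = [{"one": "bob", "two": "bill", "three": "cat"},
            {"one": "john", "two": "jack", "ten": "dog"}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames,
                        extrasaction='ignore',  # drop keys not in fieldnames
                        restval='')             # blank cell for missing keys
writer.writeheader()
writer.writerows(dictlist)
print(buf.getvalue())
```

This produces the same CSV as the hand-rolled 2D array, without the "dict contains fields not in fieldnames" exceptions.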