Below is a scraper that loops through two websites, scrapes each team's roster information, puts the information into lists, and exports the lists to a CSV file. Everything works great; the only problem is that the writerow header repeats in the CSV file every time the scraper moves on to the next website. Is it possible to adjust the CSV portion of the code so the header appears only once while the scraper loops through multiple websites? Thanks in advance!
import requests
import csv
from bs4 import BeautifulSoup

team_list = {'yankees', 'redsox'}

for team in team_list:
    page = requests.get('http://m.{}.mlb.com/roster/'.format(team))
    soup = BeautifulSoup(page.text, 'html.parser')
    soup.find(class_='nav-tabset-container').decompose()
    soup.find(class_='column secondary span-5 right').decompose()
    roster = soup.find(class_='layout layout-roster')
    names = [n.contents[0] for n in roster.find_all('a')]
    ids = [n['href'].split('/')[2] for n in roster.find_all('a')]
    number = [n.contents[0] for n in roster.find_all('td', index='0')]
    handedness = [n.contents[0] for n in roster.find_all('td', index='3')]
    height = [n.contents[0] for n in roster.find_all('td', index='4')]
    weight = [n.contents[0] for n in roster.find_all('td', index='5')]
    DOB = [n.contents[0] for n in roster.find_all('td', index='6')]
    team = [soup.find('meta', property='og:site_name')['content']] * len(names)

    with open('MLB_Active_Roster.csv', 'a', newline='') as fp:
        f = csv.writer(fp)
        f.writerow(['Name', 'ID', 'Number', 'Hand', 'Height', 'Weight', 'DOB', 'Team'])
        f.writerows(zip(names, ids, number, handedness, height, weight, DOB, team))
Using a flag variable to track whether the header has already been written may be helpful. Once the header has been added, it will not be added a second time:
header_added = False

for team in team_list:
    # ... scrape the roster exactly as before ...

    with open('MLB_Active_Roster.csv', 'a', newline='') as fp:
        f = csv.writer(fp)
        # Write the header only on the first pass through the loop.
        if not header_added:
            f.writerow(['Name', 'ID', 'Number', 'Hand', 'Height', 'Weight', 'DOB', 'Team'])
            header_added = True
        f.writerows(zip(names, ids, number, handedness, height, weight, DOB, team))
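Alternatively, since all the writing happens in one run of the script, you can avoid the flag entirely by opening the file once before the loop, writing the header a single time, and reusing the same writer for every team. This is a minimal sketch that restructures your code under that assumption; the scraping section is elided with a comment and is unchanged from your original:

import requests
import csv
from bs4 import BeautifulSoup

team_list = {'yankees', 'redsox'}

# Open once in 'w' mode: the header is written exactly once, and rows left
# over from a previous run are overwritten instead of appended to.
with open('MLB_Active_Roster.csv', 'w', newline='') as fp:
    f = csv.writer(fp)
    f.writerow(['Name', 'ID', 'Number', 'Hand', 'Height', 'Weight', 'DOB', 'Team'])
    for team in team_list:
        # ... scrape names, ids, number, handedness, height, weight,
        #     DOB, and team for the current site, exactly as before ...
        f.writerows(zip(names, ids, number, handedness, height, weight, DOB, team))

If you genuinely need append mode (for example, to accumulate rosters across separate runs), a common alternative to the flag is to check whether the file is still empty right after opening it, e.g. if fp.tell() == 0: before writing the header.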