
I want to scrape sales information for aeroplanes listed for sale in Australia with BeautifulSoup:

One such site -

The updated Python script below writes the data to a CSV file after concatenating the lists of results.

I think a similar method could be used to pull more detailed information from the linked pages: once a list of valid hrefs has been collected, a second scrape of those pages can begin, and the results can then be merged into the table (see the sketch after the script below).


BeautifulSoup - scraping help [updated]

#! python
import urllib.request
import csv

from bs4 import BeautifulSoup

def make_soup(url):
	# Download a page and parse it with BeautifulSoup
	thepage = urllib.request.urlopen(url)
	soupdata = BeautifulSoup(thepage, "html.parser")
	return soupdata

var1 = []
var2 = []
# Use a list, not a set, so the listing pages are fetched in order
for number in ["1", "2", "3", "4", "5", "6", "7", "8", "9", "10", "11", "12"]:
	# The base listing URL is omitted here, as in the original post
	soup = make_soup("" + number + "&total=101")
	for name in soup.findAll('div', {"class": "details"}):
		var1.append(name.get_text(strip=True))
	for price in soup.findAll('div', {"class": "summary"}):
		var2.append(price.get_text(strip=True))

print(var1)
print(var2)
print("the number is", len(var1))
print("the number is", len(var2))

# Pair each details entry with its matching summary entry
rows = zip(var1, var2)

with open('some2.csv', 'w', newline='') as f:
	writer = csv.writer(f)
	writer.writerows(rows)
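
For the second pass over the linked pages, something like the sketch below could work. It is only a rough outline under a few assumptions I can't verify without the site URL (which is omitted above): each "details" div is assumed to contain an <a> whose href points at the advert's own page, SITE stands in for the site's root address, and the class names in detail_classes are made-up examples that would need to be replaced with the real ones from the linked pages.

#! python
import csv
import urllib.parse
import urllib.request

from bs4 import BeautifulSoup

def make_soup(url):
	thepage = urllib.request.urlopen(url)
	return BeautifulSoup(thepage, "html.parser")

SITE = ""                                        # placeholder: the site's root URL (omitted, as above)
detail_classes = ["price", "location", "hours"]  # hypothetical class names, for illustration only

rows = []
for number in ["1", "2", "3"]:                    # listing pages to walk, as in the script above
	soup = make_soup("" + number + "&total=101")  # same omitted listing URL as above
	for details in soup.findAll('div', {"class": "details"}):
		link = details.find('a')
		if link is None or not link.get('href'):
			continue                              # skip entries without a valid href
		# Resolve relative hrefs against the site root, then scrape the linked page
		advert_url = urllib.parse.urljoin(SITE, link['href'])
		advert = make_soup(advert_url)
		row = [details.get_text(strip=True), advert_url]
		for cls in detail_classes:
			field = advert.find('div', {"class": cls})
			row.append(field.get_text(strip=True) if field else "")
		rows.append(row)                          # nest the extra columns into the table

with open('some3.csv', 'w', newline='') as f:
	writer = csv.writer(f)
	writer.writerow(["details", "url"] + detail_classes)
	writer.writerows(rows)

A short time.sleep() between requests would also keep the second scrape from hammering the server, and wrapping urlopen in a try/except would let the loop skip any advert page that fails to load.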