This article shows how to crawl second-hand housing listing data with Python. It is a practical walkthrough, shared here for reference.
1. Find where the data is:
Open Lianjia's official website, go to the second-hand housing (ershoufang) section, and select a city. The page shows the total number of listings in that city along with the listing data itself.
2. Determine where the data is stored:
Some websites keep their data in the HTML itself, some serve it through an API, and some is even encrypted in JavaScript. Fortunately, Lianjia's housing data is rendered directly in the HTML.
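One quick way to confirm this is to request a listing page and check whether the listing container already appears in the raw response. The sketch below is only an illustration: the domain is masked (as elsewhere in this article), and the User-Agent header and the keyword checked are assumptions, not part of the original code.

import requests

# Quick check: if the listing container class shows up in the raw HTML,
# the data is rendered server-side and can be parsed directly.
headers = {'User-Agent': 'Mozilla/5.0'}  # browser-like UA to avoid simple blocking
resp = requests.get('https://nj.***.com/ershoufang/', headers=headers, timeout=10)
print('sellListContent' in resp.text)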
3. Obtain the HTML data:
Request each listing page with requests to get its HTML:
# crawled url; by default, crawl the real estate listings of Lianjia Nanjing
url = 'https://nj.***.com/ershoufang/pg{}/'.format(page)
# request the url
resp = requests.get(url, headers=headers, timeout=10)
Note: the URL in the code is masked and is not a real address, so the snippet cannot be run as-is.
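For context, a fuller, self-contained version of this step might look like the sketch below. The domain is still masked, and the User-Agent header, the fetch_page helper, and the page range are assumptions for illustration, not part of the original code.

import requests

# Browser-like headers; many listing sites reject requests without a User-Agent.
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'
}

def fetch_page(page):
    """Fetch one listing page and return its HTML bytes, or None on failure."""
    # Domain masked as in the article; substitute the real one before running.
    url = 'https://nj.***.com/ershoufang/pg{}/'.format(page)
    resp = requests.get(url, headers=headers, timeout=10)
    return resp.content if resp.status_code == 200 else None

# Example: fetch the first three pages (the page range is an arbitrary choice).
pages_html = [fetch_page(p) for p in range(1, 4)]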
4. Parse the HTML and extract the useful data:
Parse the HTML with BeautifulSoup and extract the fields of interest:
from bs4 import BeautifulSoup

soup = BeautifulSoup(resp.content, 'lxml')
# select all li tags in the listing container
sellListContent = soup.select('.sellListContent li.LOGCLICKDATA')
# loop over each house entry
for sell in sellListContent:
    # title
    title = sell.select('div.title a')[0].string
    # grab all the div.houseInfo text first, then extract each field
    houseInfo = list(sell.select('div.houseInfo')[0].stripped_strings)
    # real estate (loupan) name
    loupan = houseInfo[0]
    # the remaining fields sit in one string separated by '|'
    info = houseInfo[1].split('|')
    # house type
    house_type = info[1].strip()
    # area size
    area = info[2].strip()
    # which direction the rooms face
    toward = info[3].strip()
    # decoration type
    renovation = info[4].strip()
    # house address
    positionInfo = ''.join(list(sell.select('div.positionInfo')[0].stripped_strings))
    # total house price
    totalPrice = ''.join(list(sell.select('div.totalPrice')[0].stripped_strings))
    # housing unit price
    unitPrice = list(sell.select('div.unitPrice')[0].stripped_strings)[0]

Thank you for reading! That concludes this article on how to crawl second-hand housing data with Python. I hope it has been helpful; if you found it useful, feel free to share it.