
I'm trying to scrape 6pm.com and I'm running into an issue: my loop seems to be returning duplicate results, i.e. it keeps repeating the same product multiple times when each distinct product should appear only once.

Here's my code:

import requests
from bs4 import BeautifulSoup

# One filter page, paginated: the bare URL, then the same URL with ?&p=1 through ?&p=45
base_url = ('https://www.6pm.com/filters/men-shoes/'
            'CK_XAVpbwAOoCe4VJIQl5AeABqkZoB4_9RFN-hasCYIahQlrmQO7H4gkb7sa5RV9uQ-LIf8DgAG2BqcfmAGBD-IEoAGTBqcHoALTEqsBjBsBgw_SBs0Z4QbsHK0UyiHvJMABAuICAwELGA.zso')

url_list1 = [base_url] + [base_url + '?&p={}'.format(p) for p in range(1, 46)]


url_list2 = []

for url1 in url_list1:
    data1 = requests.get(url1)
    soup1 = BeautifulSoup(data1.text, 'html.parser')

    # Each product card on a listing page is an <article> element
    productUrls = soup1.findAll('article')

    # Build the absolute product URL from each card's <a itemprop="url"> link
    for url2 in productUrls:
        get_urls = "https://www.6pm.com" + url2.find('a', attrs={'itemprop': 'url'})['href']
        url_list2.append(get_urls)

print(url_list2)

So the first part (url_list1) is basically a link list. Each link leads to a page with 100 products from the selected brands. When I open each link in my browser, each page shows different products and there are no duplicates (that I'm aware of).

Next up, I initialize an empty list (url_list2) where I want to store all the actual product URLs (so this list should end up holding 46 pages × 100 products ≈ 4,600 product URLs).

The first "for" loop iterates through each link in url_list1. The productUrls variable is a list that is supposed to store all "article" elements on each of the 46 pages.

The second, nested "for" loop iterates through the productUrls list and constructs the actual product URL, which is then appended to the empty list I initialized earlier, url_list2.

Testing the results with the print statement, I noticed that many products appear multiple times instead of once.
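A quick way to quantify the duplication (this sketch only assumes url_list2 has been filled by the loop above) is to compare the total count against the unique count and list the most-repeated entries:

from collections import Counter

# Total scraped URLs vs. how many are actually distinct
print(len(url_list2), "total URLs,", len(set(url_list2)), "unique URLs")

# The five most-repeated product URLs and how often each appears
for product_url, count in Counter(url_list2).most_common(5):
    print(count, "x", product_url)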

Why would this happen, when opening each URL from url_list1 manually in my browser shows different products on each page with no noticeable duplicates?

Any and all help is much appreciated.

Comments:
  • How could you see the duplicates? I saw different products from the first 3 URLs.
  • From what I understand, there were inconsistent duplicates due to the session expiring at different points of running the program.

2 Answers


There's a better way to handle this scenario: you don't need to collect all the URLs in a list up front. Try the code below, which is a simpler way to achieve the same result.

from bs4 import BeautifulSoup
import requests

headers = {'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 '
                         '(KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36'}

page = "https://www.6pm.com/filters/men-shoes/CK_XAVpbwAOoCe4VJIQl5AeABqkZoB4_9RFN-hasCYIahQlrmQO7H4gkb7sa5RV9uQ-LIf8DgAG2BqcfmAGBD-IEoAGTBqcHoALTEqsBjBsBgw_SBs0Z4QbsHK0UyiHvJMABAuICAwELGA.zso"

url_list2 = []
page_num = 1
session = requests.Session()  # one session reused for every request

while page_num < 47:  # the bare URL plus ?p=1 through ?p=45 = 46 pages
    pageTree = session.get(page, headers=headers)
    pageSoup = BeautifulSoup(pageTree.content, 'html.parser')
    productUrls = pageSoup.findAll('article')
    for url2 in productUrls:
        get_urls = "https://www.6pm.com" + url2.find('a', attrs={'itemprop': 'url'})['href']
        url_list2.append(get_urls)

    # Build the URL of the next page before the next iteration
    page = "https://www.6pm.com/filters/men-shoes/CK_XAVpbwAOoCe4VJIQl5AeABqkZoB4_9RFN-hasCYIahQlrmQO7H4gkb7sa5RV9uQ-LIf8DgAG2BqcfmAGBD-IEoAGTBqcHoALTEqsBjBsBgw_SBs0Z4QbsHK0UyiHvJMABAuICAwELGA.zso?p={}".format(page_num)
    page_num += 1

print(url_list2)
print(len(url_list2))
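One optional hardening, my addition rather than part of the answer above: if an article element ever lacks the a tag with itemprop="url", url2.find(...) returns None and the ['href'] lookup raises a TypeError. A small guard in the inner loop avoids that:

for url2 in productUrls:
    link = url2.find('a', attrs={'itemprop': 'url'})
    # Skip cards without the expected link instead of crashing the run
    if link is not None and link.has_attr('href'):
        url_list2.append("https://www.6pm.com" + link['href'])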

Let me know if that helps.


1 Comment

Thank you very much for your time and effort! It seems to be working well with this code. It's quite elegant too, much cleaner code than my initial version.

What happens is that the pages you see in your browser are not the same ones that requests gets. To solve the problem you must keep the requests session alive.

Try this, it worked for me. Replace your big for loop with:

with requests.Session() as s:  # <--- here we create a session that stays alive
    for url1 in url_list1:
        data1 = s.get(url1)  # <--- here we call the links with the same session
        soup1 = BeautifulSoup(data1.text, 'html.parser')

        productUrls = soup1.findAll('article')

        for url2 in productUrls:
            get_urls = "https://www.6pm.com" + url2.find('a', attrs={'itemprop': 'url'})['href']
            url_list2.append(get_urls)
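If you want to see exactly what requests receives (as opposed to what your browser renders), one simple sketch is to dump the response body to a file and open it locally; debug_page.html is just an illustrative file name:

# Save the raw HTML that requests receives so it can be inspected in a browser
with requests.Session() as s:
    resp = s.get(url_list1[0])
    with open('debug_page.html', 'w', encoding='utf-8') as f:
        f.write(resp.text)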

Good luck!

2 Comments

Thank you very much for your assistance! Could you please teach me how to check what requests sees? I imagine this will be very useful for my future troubleshooting.
Kajal posted an elegant solution that also addressed the pagination issue, I simply found his response most helpful. That isn't to say your solution didn't work or help me, but I can only accept one answer (if I can accept multiple, please let me know).
