I want to handle strings of up to 100K characters and write them into a CSV file across several columns (basically trying to work around Excel's cell limit of ~32K characters).
Below is sample code:
soup = BeautifulSoup(r.content, 'html5lib')
html = str(soup.select('div.DocumentText'))
if len(html) > 32000:
    # How do I split the string here and assign the pieces to
    # separate variables, e.g. html_1, html_2, ...? That is the question.
    x.writerow([html_1,......, html_5])
Example flow I'm trying to achieve:
- Scrape the website
- If the scraped data is longer than 32000 characters and shorter than 100K
- split the scraped string into several variables
- write each variable into a different column of the CSV file
In other words, how do I split c.case_html into items of size 32K each?