
I am still new to Scrapy and Python.

I was able to connect to my remote DB through an SSH tunnel, but I would like to prevent this error from happening on every item being scraped.

Error: 2055: Lost connection to MySQL server at '127.0.0.1:3306', system error: 10053 An established connection was aborted by the software in your host machine

Below is my MySQL pipeline object:

import mysql.connector
import sshtunnel

class MySQLStorePipeline(object):

    def __init__(self):
        with sshtunnel.SSHTunnelForwarder(
            ('13.***.***.***', 22),
            ssh_username='***',
            ssh_password='***',
            remote_bind_address=('db1.prod.***.***.net.***', 3306),
            local_bind_address=('127.0.0.1', 3306)
        ) as tunnel:
            self.dbcon = mysql.connector.connect(
                host='127.0.0.1',
                port=3306,
                user='***',
                database='***',
                password='***',
                charset='utf8'
            )
            self.cursor = self.dbcon.cursor()

    def process_item(self, item, spider):
        try:
            # cursor.execute() returns None in mysql.connector;
            # results are fetched from the cursor itself
            self.cursor.execute('SHOW TABLES')
            print(self.cursor.fetchall())

            self.dbcon.commit()
        except mysql.connector.Error as err:
            print('Error: {}'.format(err))

        return item

I just don't know how to keep the database connection open inside the process_item function.

1 Answer


You're using with ...; that's why you get this behavior. The SSH tunnel opened from Python closes automatically as soon as the with block in __init__ exits, so the forwarded port is already gone by the time items are processed.


1 Comment

Can you structure the code for me? I am still new to Python. Thanks!
