
I have a problem with SqlConnection in C#. I execute a large number of INSERTs via ExecuteNonQuery, but in every case only the first 573 rows are saved to the database. This is the method I use for the queries; it contains a lock because I use several threads to save the data.

    public void InsertElement(string link, string titolo, string text)
    {
        string conString = "*****************";
        using (SqlConnection connection = new SqlConnection(conString))
        {
            connection.Open();

            text = text.Replace("\"", "");
            DateTime localDate = DateTime.Now;

            lock (thisLock)
            {
                string query = "IF (NOT EXISTS(SELECT * FROM Result " +
                " WHERE Link = '" + link + "')) " +
                " BEGIN " +
                " INSERT INTO Result ([Titolo],[Link],[Descrizione],[DataRicerca],[FKDatiRicercheID]) " +
                " VALUES('" + titolo + "', '" + link + "', '" + text + "', '" + localDate + "', 1) " +
                " END";

                if (connection != null)
                {
                    SqlCommand cmd = new SqlCommand(query, connection);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }

This is the code of the loop that calls the method InsertElement():

public void Save()
{
    string[] DatiLetti;
    string url = "";

    while (result.Count > 0)
    {
        try
        {
            url = result.Last();
            result.RemoveAt(result.Count - 1);

            DatiLetti = ex.DirectExtractText(url);

            if (DatiLetti[0].Length > 2)
            {
                ssc.InsertGare(url, DatiLetti[0], DatiLetti[1]);
            }
        }
        catch (Exception exc)
        {
            logger.Error("Exception SpiderSave> " + exc);
        }
    }
}

result is a volatile list that other threads fill progressively. I'm sure the list contains more than 573 items.

I've searched for a solution, but all the answers say that SQL Server allows over 32K database connections at a time, and I've already checked that limit on my database. Can anyone help me understand the problem?

  • It seems like a bad idea to call this method in a loop and recreate the connection every time. Create the connection once and pass it to the method doing the database transaction. Commented Aug 3, 2018 at 9:28
  • How many records should be saved? Does the loop finish due to an error/exception? If so, can you give details? Can you also show the loop where you call this method. Commented Aug 3, 2018 at 10:01
  • @PaulF I save an indefinite number of records; it depends on the search results. Commented Aug 3, 2018 at 10:37
  • Look into MERGE ... this is basically what @TomTom suggests and combine with the advice from Suman and WynDiesel. Commented Aug 3, 2018 at 10:59
  • You didn't say whether or not you get an exception that terminates the loop prematurely. Have you stepped through with the debugger (a bit tedious for a large number of records, I know - but a counter & conditional breakpoint can help) to see if records after the 573rd appear to be added? "I'm sure that the array contains more than 573 items." - have you confirmed that more than 573 of the result set need adding? How do you know that only 573 records are added - is it possible that all are added but only 573 are returned to you for some reason? Commented Aug 3, 2018 at 11:22

2 Answers


Don't open a connection for every insert. Use one connection and pass it through to your insert, like this:

public void InsertElement(string link, string titolo, string text, SqlConnection conn)
{
    text = text.Replace("\"", "");
    DateTime localDate = DateTime.Now;

    lock (thisLock)
    {
        string query = "IF (NOT EXISTS(SELECT * FROM Result " +
                        " WHERE Link = '" + link + "')) " +
                        " BEGIN " +
                        " INSERT INTO Result ([Titolo],[Link],[Descrizione],[DataRicerca],[FKDatiRicercheID]) " +
                            " VALUES('" + titolo + "', '" + link + "', '" + text + "', '" + localDate + "', 1) " +
                            " END";

        if (conn != null)
        {
            SqlCommand cmd = new SqlCommand(query, conn);
            cmd.ExecuteNonQuery();
        }
    }
}

I also recommend parameterizing your query, and using bulk inserts rather than individual inserts.
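The parameterization can be sketched like this. It is a hedged rewrite of the method above, assuming the caller opens a single connection and passes it in; it is written against the ADO.NET interfaces in System.Data so it compiles without a provider package, but an open SqlConnection can be passed straight in:

```csharp
using System;
using System.Data;

// Hedged sketch: a parameterized version of InsertElement. Table and column
// names come from the question; the ResultInserter class name is invented.
public static class ResultInserter
{
    // Values travel as @-parameters, never as concatenated text,
    // so quotes in the description no longer need stripping.
    public const string InsertSql =
        "IF NOT EXISTS (SELECT 1 FROM Result WHERE Link = @Link) " +
        "INSERT INTO Result ([Titolo],[Link],[Descrizione],[DataRicerca],[FKDatiRicercheID]) " +
        "VALUES (@Titolo, @Link, @Descrizione, @DataRicerca, 1)";

    // Assumes the caller has already opened the connection and reuses it
    // for the whole batch; returns the number of rows affected (0 or 1).
    public static int InsertElement(IDbConnection connection,
                                    string link, string titolo, string text)
    {
        using (IDbCommand cmd = connection.CreateCommand())
        {
            cmd.CommandText = InsertSql;
            AddParam(cmd, "@Link", link);
            AddParam(cmd, "@Titolo", titolo);
            AddParam(cmd, "@Descrizione", text);
            AddParam(cmd, "@DataRicerca", DateTime.Now);
            return cmd.ExecuteNonQuery();
        }
    }

    static void AddParam(IDbCommand cmd, string name, object value)
    {
        IDbDataParameter p = cmd.CreateParameter();
        p.ParameterName = name;
        p.Value = value;
        cmd.Parameters.Add(p);
    }
}
```

Because the command text is now a constant, SQL Server can also reuse the query plan across all the inserts instead of compiling a new ad-hoc statement per row.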


6 Comments

"... dramatize your query, ..." made my day. Auto-correct is adorable.
@Filburt, I blindly accepted google's spelling correction. Fail.
Thanks for your answer. I passed the connection variable, but I have the same problem.
@Stefano, what if you remove the lock? Are you still getting an error that there's too many connections to the database?
@WynDiesel I removed the lock, but I have the same problem.

If you are executing InsertElement() once for each row of data, execution will be too slow for a large number of rows. (You are also creating a SqlConnection for each query execution.) Try adding many rows at once using a single INSERT query:

INSERT INTO tablename
(c1,c2,c3)
VALUES
(v1,v2,v3),
(v4,v5,v6)
...
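If each row must first be checked for existence (the concern raised in the comments below), the insert can still happen in one round trip: stage the batch, then insert only the missing rows. A hedged T-SQL sketch; the Result table and its columns come from the question, while the #Staging table, its column types, and the parameter names are assumptions:

```sql
-- Stage the batch in a temp table (name and column types are assumptions).
CREATE TABLE #Staging
    (Titolo nvarchar(400), Link nvarchar(400), Descrizione nvarchar(max));

INSERT INTO #Staging (Titolo, Link, Descrizione)
VALUES (@t1, @l1, @d1),      -- one VALUES row per scraped page,
       (@t2, @l2, @d2);      -- passed in as parameters

-- Move only the rows whose Link is not already present.
INSERT INTO Result ([Titolo],[Link],[Descrizione],[DataRicerca],[FKDatiRicercheID])
SELECT s.Titolo, s.Link, s.Descrizione, GETDATE(), 1
FROM #Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM Result AS r WHERE r.Link = s.Link);

DROP TABLE #Staging;
```

A MERGE statement can express the same check-then-insert in a single statement, which one of the comments on the question also suggests.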

3 Comments

Doing individual inserts will be slow & your solution is an improvement - but it does not account for only 573 rows being inserted. Each insert command is treated separately by the server, so it should not time out.
Thanks for your answer. I can't do one query for multiple values, because for each value I check whether it already exists in the database.
Actually you can - insert into a temporary table, then move rows into the final table while checking. This is SQL basics, only a little beyond "write a SQL statement that inserts rows".
