I'm trying to bulk import a very large table (75 GB) into an Azure SQL database (pricing tier P6 Premium, 1000 DTUs). The import failed with the following error message: "Msg 40544, Level 17, State 2, Line 179: The database 'tempdb' has reached its size quota. Partition or delete data, drop indexes, or consult the documentation for possible resolutions."

I looked at a few blogs that suggest increasing the tier. I was wondering whether I could just scale up tempdb without having to increase the tier itself.

Right now I'm splitting the file into smaller chunks to load it, but if I have to build an index on this table after loading, I'm fairly sure it would fail again with the same error. A rough sketch of what the load and index build look like is below.
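For illustration only (the table name, blob path, external data source, and batch size are placeholders, not my actual script), this is the general shape of a batched load plus an index build that keeps the sort out of tempdb:

    -- Hypothetical target table and source file; adjust names and options as needed.
    BULK INSERT dbo.BigTable
    FROM 'bigtable.csv'
    WITH (
        DATA_SOURCE = 'MyAzureBlobStorage', -- external data source pointing at the blob container
        FORMAT = 'CSV',
        FIRSTROW = 2,
        BATCHSIZE = 100000,                 -- commit in batches instead of one huge transaction
        TABLOCK
    );

    -- Build the index after the load; SORT_IN_TEMPDB = OFF keeps the sort
    -- in the user database rather than in tempdb.
    CREATE INDEX IX_BigTable_Key
        ON dbo.BigTable (KeyColumn)
        WITH (SORT_IN_TEMPDB = OFF, ONLINE = ON);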

Any thoughts??

  • Did you manage to solve this issue? I am facing the same thing in Azure Data Factory. Commented Jun 6, 2021 at 12:42

1 Answer


No. You have no direct control over tempdb or its behavior. However, as you scale up your service tier, my understanding is that the tempdb size limit goes up with it.
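If you want to see the current tempdb size and per-file cap for your tier, something like this should work (a quick sketch using the standard tempdb.sys.database_files catalog view; sizes are in 8 KB pages, and a max_size of -1 means no per-file cap):

    -- Current tempdb file sizes and maximum sizes, converted from 8 KB pages to MB.
    SELECT [name],
           [type_desc],
           [size] * 8 / 1024     AS current_size_mb,
           [max_size] * 8 / 1024 AS max_size_mb   -- -1 here means unlimited for that file
    FROM tempdb.sys.database_files;

That lets you compare what your load actually consumes against the quota before and after a tier change.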
