
I am using SQL Server 2008 and I need to copy all the data from a table in one database into a table in another database on the same SQL Server instance.

The script below makes the server run out of memory. The data is big: the table is about 50 GB on disk. Any easy alternative, or any way to lower memory consumption, is fine. The server has 16 GB of physical RAM and is x64.

Here is the statement I am using:

INSERT INTO [TargetDB].[dbo].[Orders]
SELECT *
FROM [SourceDB].[dbo].[Orders];

Any quick and simple solutions?

Thanks in advance, George

3 Answers


Add some partitioning so that you don't have to take it all at once. Get the data for one month at a time, or all IDs ending with a specific digit.

That way each batch gets a bit smaller.
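
For example, with a GUID key you could bucket on the last character of the ID. A minimal sketch, assuming the primary key is a uniqueidentifier column named ID (substitute your own column name):

-- A GUID's last character is one of 16 hex digits, so this splits the
-- copy into 16 roughly even batches; repeat with '1' through 'f'.
INSERT INTO [TargetDB].[dbo].[Orders]
SELECT *
FROM [SourceDB].[dbo].[Orders]
WHERE RIGHT(CAST(ID AS char(36)), 1) = '0';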


1 Comment

Thanks idstam, currently the only index is the clustered primary key on ID, and it is a GUID. I have an idea to write a loop that copies one batch (e.g. 10,000 records) per iteration?

Copy in batches

INSERT INTO [TargetDB].[dbo].[Orders]
SELECT TOP 100 *
FROM [SourceDB].[dbo].[Orders] S
WHERE NOT EXISTS
(
    SELECT 1 FROM [TargetDB].[dbo].[Orders] T1
    WHERE T1.OrderId = S.OrderId
)

That should do it in batches of 100, which you can tweak to suit the number of records you want to process per pass. This code assumes you have some form of unique value, such as OrderId, to key off during the copy.

Copy in ranges

If you have a field you can use to choose "ranges" such as an OrderDate, start off by running a query like

SELECT OrderDate, COUNT(1)
FROM [SourceDB].[dbo].[Orders]
GROUP BY OrderDate

to see how many distinct values there are and how many records there are per value. That should allow you to choose some ranges (e.g. 2009-01-01 to 2009-01-31) and then use ranged queries to copy the data across:

INSERT INTO [TargetDB].[dbo].[Orders]
SELECT *
FROM [SourceDB].[dbo].[Orders] 
WHERE OrderDate BETWEEN '2009-01-01 00:00:00.000' AND '2009-01-31 23:59:59.997'

Comments

Thanks Rob, if I need to copy all the data to the destination table, how do I write a loop so it completes everything in batches?
@George2 - If it's a one-off process, it's probably easier to do it in large chunks manually to give the database a chance to "breathe" between batches, particularly as you suffered out-of-memory problems previously =)
@George2 - use the first "copy in batches" approach, scale the "TOP" up to a couple of thousand or so, and run it until the data copy is complete
@George2 - use the code above (the first block of code), substituting the name of your Guid ID column for "OrderId" in the penultimate line and changing "TOP 100" in the second line to "TOP 2000", or however many you want. Then keep running the script until the "Rows Affected" is returned as ZERO :)
@George2 - yes, it's possible - but I refer you to my 2nd comment on this answer where I suggest doing it manually :) (a looped version is sketched below)
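
For reference, the looped version those comments describe might look like the sketch below, assuming OrderId is the unique key and a batch size of 10,000 (adjust to taste):

DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    -- Copy the next batch of rows not yet present in the target
    INSERT INTO [TargetDB].[dbo].[Orders]
    SELECT TOP (10000) *
    FROM [SourceDB].[dbo].[Orders] S
    WHERE NOT EXISTS
    (
        SELECT 1 FROM [TargetDB].[dbo].[Orders] T1
        WHERE T1.OrderId = S.OrderId
    );
    SET @rows = @@ROWCOUNT;  -- zero once everything has been copied
END;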

You might want to look into using BCP to bulk copy the data.
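
A sketch of what that might look like from the command line, assuming Windows authentication and made-up server and file names:

rem Export in native format (-n), then bulk load the target.
rem -T = trusted connection, -b = rows per committed batch
bcp SourceDB.dbo.Orders out C:\temp\orders.dat -n -T -S YOURSERVER
bcp TargetDB.dbo.Orders in C:\temp\orders.dat -n -T -S YOURSERVER -b 10000

Exporting once and loading in committed batches keeps memory and transaction log pressure low, which suits a 50 GB table better than a single INSERT ... SELECT.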

