
I am new to the scripting world, so I might be asking a novice question, but I could not find the exact code even after much googling.

My requirement is to write a script that copies X number of files from one folder to another. The number of files to be copied, the source folder, and the target folder should all be configurable in the script.

I tried Xcopy and Robocopy, but in neither did I find a parameter to restrict the number of files copied. The script will run on Windows 7 or Windows Server 2008.

Please help.


4 Answers


I must say that I agree with vonPryz here. It seems odd. However it's very easy to do what you need in PowerShell. Just grab the number of items you want using:

$sourceFolder = "D:\source"
$destinationFolder = "D:\copiedfiles"
$maxItems = 200
Get-ChildItem $sourceFolder | Select-Object -First $maxItems | Copy-Item -Destination $destinationFolder

Note that Copy-Item will not create the destination folder for you; if it might be missing, create it first (for example with New-Item -ItemType Directory).
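A slightly fuller sketch of the same approach (the paths here are demo placeholders under %TEMP%, and the sample-file setup is only for illustration; in real use, point the three variables at your actual folders and count):

```powershell
# Sketch only: demo paths under %TEMP%; adjust the three values for real use.
$sourceFolder      = Join-Path $env:TEMP "copydemo-src"
$destinationFolder = Join-Path $env:TEMP "copydemo-dst"
$maxItems          = 3

# Demo setup: create a source folder with five sample files (skip in real use)
New-Item -ItemType Directory -Force -Path $sourceFolder | Out-Null
1..5 | ForEach-Object {
    Set-Content -Path (Join-Path $sourceFolder "file$_.txt") -Value $_
}

# Create the destination folder if it is missing; Copy-Item will not do this
if (-not (Test-Path $destinationFolder)) {
    New-Item -ItemType Directory -Path $destinationFolder | Out-Null
}

# Copy only the first $maxItems files (-File skips subfolders; PowerShell 3.0+)
Get-ChildItem $sourceFolder -File |
    Select-Object -First $maxItems |
    Copy-Item -Destination $destinationFolder
```

On a stock Windows 7 box still running PowerShell 2.0, replace `-File` with `Where-Object { -not $_.PSIsContainer }`.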


3 Comments

I did it with PowerShell, but the requirement is for Windows Script.
You tagged the question with PowerShell.
Get-Childitem $sourceFolder | Select-Object -First $maxItems | Copy-Item -Destination $destinationFolder

For the batch-file tag:

@echo off

    setlocal enableextensions enabledelayedexpansion

    rem Configurable values: source folder, target folder, number of files to copy
    set "source=d:\temp\input"
    set "target=d:\temp\output"
    set num=10

    rem findstr /n prefixes each file name with a counter ("1:file.ext");
    rem splitting on the colon gives the counter in %%f and the name in %%g
    for /F "tokens=1,2 delims=:" %%f in ('dir /b /a-d "%source%\*" ^| findstr /n "^"') do (
        if %%f leq %num% (
            copy "%source%\%%g" "%target%" /y > nul
        ) else goto endCopy
    )

:endCopy
    endlocal



Discuss with whoever gave the requirement and find out the exact filtering needs.

It makes little if any sense to copy N files from location X to location Y. Usually there is a more sensible criterion, like: copy all files that have changed after DATETIME from location X to location Y; if there are more than N files, pick only the N oldest/newest. Only the stakeholder can tell you what kind of behaviour is expected. Without more information, the script is very unlikely to provide the expected functionality.

After you know the exact requirement, do some more googling for a solution. It shouldn't take much effort to find out how to achieve the task in PowerShell. Hint: use gci with appropriate parameters/filters and store its results in an array.
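As an illustration of that hint only: the cutoff, the count, and the demo paths below are assumptions, not part of the original requirement, and the sample files exist purely so the sketch runs end to end:

```powershell
# Assumed criteria: copy the N newest files modified after a cutoff date.
# Demo paths under %TEMP%; point $source/$target at real folders in use.
$source = Join-Path $env:TEMP "filterdemo-src"
$target = Join-Path $env:TEMP "filterdemo-dst"
New-Item -ItemType Directory -Force -Path $source, $target | Out-Null

# Demo data: five files with staggered timestamps (not needed in real use)
1..5 | ForEach-Object {
    $f = Join-Path $source "file$_.txt"
    Set-Content -Path $f -Value $_
    (Get-Item $f).LastWriteTime = (Get-Date).AddDays(-$_)
}

$cutoff = (Get-Date).AddDays(-7)   # assumed filter
$n = 2                             # assumed count

# gci with filters, results stored in an array as the hint suggests
$files = @(Get-ChildItem -Path $source -File |
    Where-Object { $_.LastWriteTime -gt $cutoff } |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First $n)

$files | Copy-Item -Destination $target
```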

2 Comments

I completely agree with you, but you hardly have any choice when the customer sets the requirement, especially when he is not ready to share the complete picture with you.
I have a case where I need to split 45k files into 16 folders for later processing (parallel processing). You cannot possibly know every scenario, so it can make sense to someone else. In other words: answer the question, without judgment =)

Btw... the PowerShell above will throw an exception about the command not taking pipeline input, because the positional $destinationFolder argument binds to -Path and then conflicts with the objects arriving from the pipeline. Pass the destination explicitly instead: Get-ChildItem $sourceFolder | Select-Object -First $maxItems | Copy-Item -Destination $destinationFolder

