
Questions tagged [parallelism]

Performing tasks in parallel to make use of multiple processors.

4 votes
4 answers
550 views

Given this code: #!/bin/bash set -euo pipefail function someFn() { local input_string="$1" echo "$input_string start" sleep 3 echo "$input_string end" } function ...
asked by k0pernikus (16.7k)
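The pattern behind questions like this one — running a bash function several times concurrently — is usually to launch each call in the background, collect the PIDs, and wait on each one. A minimal sketch; `someFn` here is a simplified stand-in, not the asker's exact function:

```shell
#!/bin/bash
set -euo pipefail

someFn() {
    local input_string="$1"
    echo "$input_string start"
    sleep 1                 # simulate the work
    echo "$input_string end"
}

pids=()
for name in a b c; do
    someFn "$name" &        # one background job per call
    pids+=("$!")            # remember its PID
done

# Wait on each PID individually so a failing job is not silently ignored
for pid in "${pids[@]}"; do
    wait "$pid"
done
status="all done"
echo "$status"
```

Waiting on each PID (rather than a bare `wait`) means a non-zero exit status from any job is visible to the script.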
1 vote
1 answer
606 views

I am brand new to zstd/pzstd, trying out its features, compression, benchmarking it, and so on. (I run Linux Mint 22 Cinnamon.) This computer has 32 GB RAM. The basic command appears to be working, ...
asked by Vlastimil Burián
0 votes
1 answer
201 views

I created some pigz (parallel gzip, compiled version 2.8) compressed archives of my SSD disk drives. I called one of them 4TB-SATA-disk--Windows10--2024-Jan-21.img.gz, which says the ...
asked by Vlastimil Burián
1 vote
1 answer
123 views

I have been trying to understand lscpu's output and came across several threads dedicated to concepts of CPUs, physical cores, and threads. Based on those threads, to get the total number of CPUs (...
asked by Jason (13)
0 votes
1 answer
91 views

At work we have a few systems, each running on its own hardware, and the hardware is identical across all of the systems. If one piece of hardware fails, one of the running systems will take over and start the system that ...
asked by Travis Hunt
0 votes
3 answers
161 views

I am trying to parallelise my sample Bash script and have tried commands like & and wait. Please let me know an effective way to make it parallel. My current code is working fine for ...
asked by user96368
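The `&` and `wait` commands the asker mentions are indeed the basic tools: push each loop body into the background with `&`, then block once with a bare `wait`. A runnable sketch, with `process_item` as a hypothetical stand-in for the per-iteration work:

```shell
#!/bin/bash
set -euo pipefail
results=$(mktemp)

# Hypothetical stand-in for the asker's per-iteration work
process_item() {
    sleep 0.2
    echo "done: $1" >> "$results"
}

for item in 1 2 3 4; do
    process_item "$item" &   # each iteration runs concurrently
done
wait                          # a bare 'wait' blocks until all jobs finish

count=$(wc -l < "$results")
echo "finished $count items"
```

This launches everything at once; when the iteration count is large, a batching or job-slot scheme is needed to cap concurrency.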
1 vote
2 answers
2k views

I'm running openssl dhparam -out dhparam4096.pem 4096 and it pegs a single core at 100% for the duration of the task (which can be considerable on some processors). I have 1 or more additional cores ...
asked by Pete Cooper
3 votes
2 answers
1k views

I have two scripts running in parallel, and they are echoing to the same file. One script is echoing +++++++++++++++ to the file while the other script is echoing =========== to the file. Below is the ...
asked by Anubhav Rai
1 vote
1 answer
181 views

I would like to run optipng in parallel on my 8-thread CPU using the shell. I know the program itself is not multicore-optimized; the only way is to run it on 8 files at once. I have 500+ PNG images ...
asked by Vlastimil Burián
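For a single-threaded, CPU-bound tool like optipng, the standard answer is to let `xargs` fan the file list out over N parallel workers. A sketch using dummy files and a harmless `sh -c` stand-in in place of optipng (swap in `optipng` on real data):

```shell
#!/bin/bash
set -euo pipefail
workdir=$(mktemp -d)

# Dummy files standing in for the 500+ PNGs
for i in 1 2 3 4 5; do touch "$workdir/img$i.png"; done

# -P 8: up to 8 workers at once; -n 1: one file per invocation.
# On real data, replace the sh -c stand-in with: xargs -0 -P 8 -n 1 optipng
find "$workdir" -name '*.png' -print0 |
    xargs -0 -P 8 -n 1 sh -c 'echo "optimizing $1"' _ > "$workdir/log.txt"

processed=$(wc -l < "$workdir/log.txt")
echo "processed $processed files"
```

The `-print0`/`-0` pairing keeps filenames with spaces or newlines intact.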
3 votes
1 answer
994 views

Question: Since HTTP supports resuming at an offset, are there any tools (or existing options for commands like wget or curl) that will launch multiple threads to fetch the file in parallel with ...
asked by KJ7LNW (535)
0 votes
0 answers
793 views

I was trying to run multiple machine learning experiments simultaneously to save time. My OS is Fedora 36, and I use emacs. To automate the experiment I used runs.py import subprocess subprocess.run(&...
asked by Schach21 (103)
4 votes
4 answers
460 views

I am trying to speed up a find command which accesses files on multiple harddrives by utilizing parallelization. Unfortunately, either the parallelization is ignored or the variable is not filled. ...
asked by Ocean (278)
1 vote
1 answer
370 views

I have nearly 400 git repos on my machine, and this is a script that I use to find their status collectively: function Check() { gitFolder="$1" parent=$(dirname $gitFolder); ...
asked by Saeed Neamati
1 vote
1 answer
291 views

I have a bash script that more or less looks like this: N=32 for i in $(seq -f "%06g" 0 ${LAST_NUM}) # LAST_NUM is an env variable do ((j=j%N)); ((j++==0)) && wait # Wait for all ...
asked by PrinceWalnut
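The `((j=j%N)); ((j++==0)) && wait` idiom in this question launches jobs in batches: every N-th launch, it waits for the whole previous batch to finish. A runnable sketch with small illustrative numbers in place of the asker's N=32 and LAST_NUM:

```shell
#!/bin/bash
# Note: no 'set -e' here -- (( )) returns a nonzero status whenever the
# expression evaluates to 0, which would kill the script under set -e.
N=4                     # batch size (the asker uses N=32)
results=$(mktemp)
j=0

for i in $(seq -f "%02g" 1 8); do    # small stand-in for 0..LAST_NUM
    ((j=j%N)); ((j++==0)) && wait    # every N-th launch, wait for the batch
    { sleep 0.1; echo "task $i" >> "$results"; } &
done
wait                                  # also drain the final partial batch

lines=$(wc -l < "$results")
echo "$lines tasks completed"
```

One caveat of batching: a single slow job holds up the entire next batch, unlike a job-slot scheme such as `wait -n`.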
2 votes
1 answer
2k views

How can I get reasonable parallelisation on multi-core nodes without saturating resources? As in many other similar questions, the question is really how to learn to tweak GNU Parallel to get ...
asked by Nikos Alexandris
1 vote
1 answer
196 views

Consider this code: job() { local id=$1 sleep $id } do_job_in_parallel() { local pids=() # run subshells for id in $(seq 4) do job $id & pids=("${pids[@]}" $!) ...
asked by pmor (757)
0 votes
0 answers
918 views

To install all the updates on a Red Hat or CentOS machine, we use the following command: yum -y update --skip-broken --nobest We are looking for ways to improve the total running time of installing the ...
asked by Gaurav Ramrakhyani
0 votes
1 answer
78 views

I'm running into an issue with CPU throttling that only seems to trigger under a specific workload of running the Kythe indexer. Detailed repro steps at the end of the question. I'm going to give a ...
asked by typesanitizer
3 votes
1 answer
98 views

I have this code to check the status of all of my git folders. find / -maxdepth 3 -not -path / -path '/[[:upper:]]*' -type d -name .git -not -path "*/Trash/*" -not -path "*/Temp/*" ...
asked by Saeed Neamati
7 votes
6 answers
8k views

In a bash script, I have a program like this: for i in {1..1000} do foo i done where I call the function foo 1000 times with parameter i. I want to run it multi-process, but not all at once, ...
asked by urningod (173)
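For "run foo 1000 times, but only N at once", bash 4.3+ offers `wait -n`, which blocks until any single background job exits, so the concurrency cap stays full instead of draining in batches. A sketch with `foo` as a stand-in workload and small numbers for illustration:

```shell
#!/bin/bash
max_jobs=3
results=$(mktemp)

foo() {                            # stand-in for the asker's function
    sleep 0.1
    echo "foo $1" >> "$results"
}

for i in $(seq 1 10); do           # the real loop runs 1..1000
    # If max_jobs are already running, block until one of them exits
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
        wait -n                    # bash 4.3+: wait for any single job
    done
    foo "$i" &
done
wait                               # drain the remaining jobs

completed=$(wc -l < "$results")
echo "$completed calls completed"
```

`jobs -rp` counts only still-running background jobs, so finished slots are reused immediately.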
0 votes
1 answer
420 views

TL;DR: A program draws the same random seed when started twice simultaneously. How did it happen? Details I'm running an MCMC statistical analysis, so I execute the program (phylobayes) twice to get 2 ...
asked by PlasmaBinturong
0 votes
0 answers
129 views

I'm trying to run an experiment that involves transforming a lot of files through a pipeline like this. A and B each take a file as input and produce a file, or some text through stdout. C takes those ...
asked by sigh (101)
1 vote
0 answers
65 views

This is the command, run at the root of an NTFS partition from a Debian 12 VM (VirtualBox) $ (find . -type f -exec cat {} \;) | pv | wc -c I was trying to check that all files are readable. It's a ...
asked by golimar (457)
-1 votes
2 answers
1k views

I have a C++ program I'd like to run in parallel on a Linux machine, with each instance being a completely independent process that doesn't communicate with the others. Is there a straightforward way ...
asked by Quavo (1)
1 vote
2 answers
2k views

#!/usr/bin/bash TARGETS=( "81.176.235.2" "81.176.70.2" "78.41.109.7" ) myIPs=( "185.164.100.1" "185.164.100.2" "185.164.100.3" "185....
asked by John Smith
3 votes
2 answers
362 views

I've run into a couple of similar situations where I can break a single-core-bound task up into multiple parts and run each part as a separate job in bash to parallelize it, but I struggle with ...
asked by guest (77)
5 votes
1 answer
1k views

I have a bunch of bzipped JSON files that I read with xargs in parallel, do some light processing on with jq, and redirect the output to a file as follows: # Number of workers is one less than the number ...
asked by abhinavkulkarni
1 vote
1 answer
2k views

I have a makefile and want to make sure that all the rules are executed sequentially, that is, that no parallel execution is performed. I believe I have three ways of achieving this: With ....
asked by Clément (410)
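On forcing make to run sequentially: the usual levers are the `.NOTPARALLEL` special target inside the makefile, invoking make with `-j1`, or declaring prerequisites so targets order themselves. A small demonstration of the first option (this assumes GNU make is installed; the makefile and target names are illustrative):

```shell
#!/bin/bash
set -euo pipefail
workdir=$(mktemp -d)

# A makefile that opts out of parallel execution entirely
cat > "$workdir/Makefile" <<'EOF'
.NOTPARALLEL:
all: step1 step2
step1: ; @echo step1 >> order
step2: ; @echo step2 >> order
EOF

# Even with -j8 requested, the steps run one at a time, in order
make -C "$workdir" -j8 all
cat "$workdir/order"
```

`.NOTPARALLEL` is the only one of the three that is enforced by the makefile itself, regardless of what `-j` value callers pass.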
1 vote
1 answer
365 views

I'm using the wait -n technique to perform max_jobs parallel tasks: #!/usr/bin/env bash cleanup() { echo "cleaning up..." } trap "cleanup" EXIT do_task() { echo "...
asked by Zeta.Investigator
5 votes
0 answers
1k views

I basically want to unlock any sudo authentication with any of these criteria met (whichever completes first successfully): a USB security key matched (custom script), fingerprint matched, ...
asked by Animesh Sahu
1 vote
1 answer
168 views

Does the following script #!/usr/bin/bash while true do ab -c 700 -n 100000000000 -r http://crimea-post.ru/ >>crimea-post & ab -c 700 -n 100000000000 -r http://nalog.gov.ru/ >>nalog ...
asked by John Smith
0 votes
0 answers
182 views

I have some programs that need to run in order, with a common task between each program that modifies the data and sends it to another location for processing. Each modify_data script spawns ...
asked by guest (77)
2 votes
1 answer
318 views

I'm a complete newbie, so please excuse my ignorance and/or potentially wrong terminology. I'm using an Ubuntu server through ssh for brain image processing (one command with several programs, ...
asked by Jana (29)
1 vote
1 answer
3k views

I have a command which I want to run on all free cores to speed up the execution time. Specifically I am running the Pitman-Yor Adaptor-Grammar Sampler software I downloaded from here ./py-cfg/py-cfg-...
asked by M.A.G (271)
0 votes
1 answer
1k views

I'm inserting a lot of CSV files into a database. I want to do it in parallel, for example run 4 processes. Right now I do it with a script like this: find . -name "*.csv" | xargs -n 1 -P 4 ....
asked by Arzybek (135)
1 vote
2 answers
3k views

I have this script to go through a list of URLs and then check the return codes using curl. The links file goes like this: https://link1/... https://link2/... https://link200/... (...) The script: INDEX=0 DIR=...
asked by markfree (455)
5 votes
1 answer
1k views

I am trying to write a script that has the purpose to parallelize an execution (a program that creates some files) running the processes in background and, when all commands in the for loop are done, ...
asked by Robb1 (297)
1 vote
2 answers
879 views

I have this command here for batch-converting PDF documents (first 2 pages) to TIFF files using pdftoppm. The goal is to put each document's TIFF images into their own folder, with the folder name matching the original ...
asked by an0nhi11
0 votes
2 answers
2k views

If I source the same shell script from multiple other scripts, which I run in parallel, and modify a shell variable defined in the sourced script, will the sourcing scripts mess up each other's value ...
asked by Fiz (1)
1 vote
1 answer
801 views

On Linux Mint 20.2 Cinnamon I would like to create a disk image of my secondary (SATA) disk drive containing Windows 10, not that it matters now, gzip'ed directly using parallel gzip (pigz) onto ...
asked by Vlastimil Burián
1 vote
0 answers
1k views

I have written a for loop and parallelized it with &, limiting it to running 3 jobs at one time. Below is my script. I am reserving 32 cores and 256 GB of memory through the BSUB command. The ...
asked by botloggy (137)
1 vote
1 answer
269 views

I have a bash function that mainly curls an endpoint for a set of links, and again curls each of those links (for another set of links) recursively. task() { link="$1" response=$(curl &...
asked by Zeta.Investigator
0 votes
3 answers
2k views

I have a script like this: #!/bin/csh command 1 \ -f \"input1\" \ -l input2 -other_swithes1 command 2 \ -f \"input1\" \ -m input2 \ -l input3 -other_swithes1 ...
asked by inman (9)
1 vote
1 answer
61 views

Let's say that I have access to a high-performance Linux cluster equipped with a scheduler (e.g. LSF, Slurm, etc.) that will allow me to have up to M jobs either running or pending at any one time, of ...
asked by kjo (16.4k)
1 vote
1 answer
156 views

I would like to split an input file on character count (ASCII is fine), combined with newlines as well. That is, every group of 10000 characters should be seen as one record to be piped into the child ...
asked by Frazier Thien
2 votes
1 answer
746 views

Let's say I have a file listing the path of multiple files like the following: /home/user/file1.txt /home/user/file2.txt /home/user/file3.txt /home/user/file4.txt /home/user/file5.txt /home/user/file6....
asked by raylight (541)
2 votes
2 answers
4k views

I have a large input file which contains 30M lines, new lines in \r\n. I want to process this file in parallel by sending chunks of 1000 lines (or less, for the remainder of the file) to a REST API ...
asked by Frazier Thien
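One way to approach this kind of question — feeding a huge file to parallel workers in fixed-size chunks — is to cut it into 1000-line pieces with `split -l` and process the pieces concurrently with `xargs -P`. A sketch with a small sample file and `wc -l` standing in for the REST POST (the `\r\n` line-ending detail is left out here):

```shell
#!/bin/bash
set -euo pipefail
workdir=$(mktemp -d)
export sizes="$workdir/sizes"

# Sample input (the real file has 30M lines)
seq 1 2500 > "$workdir/input.txt"

# Cut into chunks of at most 1000 lines: chunk_aa, chunk_ab, chunk_ac
split -l 1000 "$workdir/input.txt" "$workdir/chunk_"

# Up to 4 chunks in flight at once; 'wc -l' stands in for the REST call
find "$workdir" -name 'chunk_*' -print0 |
    xargs -0 -P 4 -n 1 sh -c 'wc -l < "$1" >> "$sizes"' _

total=$(awk '{s+=$1} END {print s}' "$sizes")
chunks=$(find "$workdir" -name 'chunk_*' | wc -l)
echo "processed $total lines in $chunks chunks"
```

Writing the chunks to disk first trades some I/O for a very simple retry story: a failed chunk can be re-sent on its own.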
0 votes
1 answer
113 views

I have to execute the same program nearly 1700 times with different data. The program does several calculations using iterations, and the answer is obtained about 5 hours later ...
asked by David López Carbonell
2 votes
1 answer
871 views

I have a function A, which takes an argument fileName and then makes a curl POST call with that file to the employment server. The function pseudocode looks like: function A(filename) { // read ...
asked by rahul sharma
0 votes
2 answers
372 views

I have multiple R scripts to read (up to 3 i.e. tr1.R, tr2.R, tr3.R). The bash script for reading a single script is given below #!/bin/bash #PBS -l nodes=1:ppn=10,walltime=00:05:00 #PBS -M #PBS -m e ...
asked by b_takhel
