
Commit f8e3b45 (first commit, 0 parents)

34 files changed, +786 −0 lines

.gitignore (3 additions, 0 deletions)

```
# git ignore
**/.DS_Store
.DS_Store
```

README.md (55 additions, 0 deletions)

```markdown
# Python ProcessPoolExecutor Jump-Start

![Python ProcessPoolExecutor Jump-Start](cover.png)

* <https://github.com/SuperFastPython/PythonProcessPoolExecutorJumpStart>

This repository provides all source code for the book:

* **Python ProcessPoolExecutor Jump-Start**: _Execute CPU-Bound Tasks in Parallel With Modern Process Pools_, Jason Brownlee, 2022.

## Source Code

You can access all Python .py files directly here:

* [src/](src/)

## Get the Book

You can learn more about the book here:

* Coming soon

### Book Blurb

> How much faster could your Python code run (if it used all CPU cores)?
>
> The ProcessPoolExecutor class provides modern process pools for CPU-bound tasks.
>
> This is not some random third-party library; it is a class provided in the Python standard library (already installed on your system).
>
> This is the class you need to make your code run faster.
>
> There's just one problem. No one knows about it (or how to use it well).
>
> Introducing: "Python ProcessPoolExecutor Jump-Start". A new book designed to teach you modern process pools in Python, super fast!
>
> You will get a rapid-paced, 7-part course to get you started and make you awesome at using the ProcessPoolExecutor.
>
> Including:
>
> * How to create process pools and when to use them.
> * How to configure process pools, including the number of workers.
> * How to execute tasks with worker processes and handle their results.
> * How to execute tasks in the process pool asynchronously.
> * How to query and get results from handles on asynchronous tasks, called futures.
> * How to wait on and manage diverse collections of asynchronous tasks.
> * How to develop a parallel Fibonacci calculator 4x faster than the sequential version.
>
> Each of the 7 lessons was carefully designed to teach one critical aspect of the ProcessPoolExecutor, with explanations, code snippets, and worked examples.
>
> Each lesson ends with an exercise for you to complete to confirm you understood the topic, a summary of what was learned, and links for further reading if you want to go deeper.
>
> Stop copy-pasting code from StackOverflow answers.
>
> Learn Python concurrency correctly, step-by-step.
```
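The blurb's last bullet mentions a parallel Fibonacci calculator. A minimal sketch of that idea, my own illustration rather than the book's actual implementation, maps a deliberately slow recursive `fib` over several inputs so each term runs on its own worker:

```python
# sketch: parallel Fibonacci terms with a process pool (illustrative, not the book's code)
from concurrent.futures import ProcessPoolExecutor

# deliberately naive recursive fibonacci (CPU-bound)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# protect the entry point
if __name__ == '__main__':
    # compute several terms in parallel, one task per worker
    with ProcessPoolExecutor(4) as exe:
        for n, result in zip(range(25, 29), exe.map(fib, range(25, 29))):
            print(f'fib({n}) = {result}')
```

Because each `fib(n)` call is independent and CPU-bound, the speedup scales with the number of workers, which is the effect the blurb's "4x faster" claim refers to.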

cover.png (683 KB)
src/lesson01_process.py (17 additions, 0 deletions)

```python
# SuperFastPython.com
# example of running a function in a new process
from multiprocessing import Process

# custom function to be executed in a child process
def task():
    # report a message
    print('This is another process', flush=True)

# protect the entry point
if __name__ == '__main__':
    # define a task to run in a new process
    process = Process(target=task)
    # start the task in a new process
    process.start()
    # wait for the child process to terminate
    process.join()
```
(filename missing from capture; 18 additions, 0 deletions)

```python
# SuperFastPython.com
# example of running a function in the process pool
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task():
    # report a message
    print('This is another process', flush=True)

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as exe:
        # issue the task
        future = exe.submit(task)
        # wait for the task to finish
        future.result()
    # close the process pool automatically
```
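As a small extension of the pool example above (my sketch, not part of this commit): `submit()` also accepts task arguments, and the returned future carries the task's return value:

```python
# sketch: submit a task with an argument and read its return value (not this commit's code)
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(value):
    # return a transformed value to the caller
    return value * 2

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor() as exe:
        # issue the task with an argument
        future = exe.submit(task, 21)
        # retrieve the return value (blocks until the task is done)
        print(future.result())  # prints 42
```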

src/lesson02_default_config.py (12 additions, 0 deletions)

```python
# SuperFastPython.com
# example reporting the details of a default pool
from concurrent.futures import ProcessPoolExecutor

# protect the entry point
if __name__ == '__main__':
    # create a process pool
    exe = ProcessPoolExecutor()
    # report the status of the process pool
    print(exe._max_workers)
    # shutdown the process pool
    exe.shutdown()
```
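Note that `_max_workers` is a private attribute. In current CPython the documented default is derived from the machine's CPU count (capped at 61 on Windows), which can be checked directly; a sketch of mine, not part of this commit:

```python
# sketch: compare the pool's default worker count with the CPU count (not this commit's code)
import os
from concurrent.futures import ProcessPoolExecutor

# protect the entry point
if __name__ == '__main__':
    # create a process pool with the default configuration
    with ProcessPoolExecutor() as exe:
        # the default number of workers tracks the number of CPUs
        print(exe._max_workers, os.cpu_count())
```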
(filename missing from capture; 24 additions, 0 deletions)

```python
# SuperFastPython.com
# example initializing worker processes in the pool
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    # report a message
    print(f'Worker task {number}...', flush=True)
    # block for a moment
    sleep(1)

# initialize a worker in the process pool
def init():
    # report a message
    print('Initializing worker...', flush=True)

# protect the entry point
if __name__ == '__main__':
    # create and configure the process pool
    with ProcessPoolExecutor(2, initializer=init) as exe:
        # issue tasks to the process pool
        _ = exe.map(task, range(4))
```
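The initializer can also receive arguments via `initargs`, a common way to seed per-worker state such as a module-level global. A sketch of mine, not part of this commit:

```python
# sketch: pass arguments to the worker initializer via initargs (not this commit's code)
from concurrent.futures import ProcessPoolExecutor

# initialize a worker, storing shared data in a per-process global
def init(value):
    global data
    data = value

# custom function that reads the per-process global
def task(number):
    return data + number

# protect the entry point
if __name__ == '__main__':
    # each worker runs init(100) once before executing any tasks
    with ProcessPoolExecutor(2, initializer=init, initargs=(100,)) as exe:
        print(list(exe.map(task, range(4))))  # prints [100, 101, 102, 103]
```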

src/lesson02_num_processes.py (19 additions, 0 deletions)

```python
# SuperFastPython.com
# example of setting a large number of workers
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom task function executed in the process pool
def task(number):
    # block for a moment
    sleep(1)
    # report a message
    if number % 10 == 0:
        print(f'>task {number} done', flush=True)

# protect the entry point
if __name__ == '__main__':
    # create a process pool
    with ProcessPoolExecutor(50) as exe:
        # issue many tasks to the pool
        _ = exe.map(task, range(50))
```

src/lesson03_map_chunksize.py (16 additions, 0 deletions)

```python
# SuperFastPython.com
# example of executing multiple tasks in chunks
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number):
    return number * 2

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # issue tasks to execute concurrently
        _ = exe.map(task, range(10000), chunksize=500)
    # wait for all tasks to complete
    print('All done')
```
(filename missing from capture; 25 additions, 0 deletions)

```python
# SuperFastPython.com
# example of executing tasks concurrently with multiple arguments
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor

# custom function to be executed in a worker process
def task(number, value):
    # report a message
    print(f'Task using {value}', flush=True)
    # block for a moment to simulate work
    sleep(value)
    # return a new value
    return number + value

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # prepare random numbers between 0 and 1
        values = [random() for _ in range(10)]
        # issue tasks to execute concurrently
        for result in exe.map(task, range(10), values):
            # report results
            print(result)
```
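The blurb also promises handling futures and diverse collections of asynchronous tasks. A minimal sketch of that pattern (my assumption of the approach, not this commit's code) uses `as_completed()` to process results in completion order rather than submission order:

```python
# sketch: handle futures in completion order with as_completed (not this commit's code)
from random import random
from time import sleep
from concurrent.futures import ProcessPoolExecutor, as_completed

# custom function to be executed in a worker process
def task(number):
    # block for a random fraction of a second
    sleep(random())
    return number

# protect the entry point
if __name__ == '__main__':
    # create the process pool
    with ProcessPoolExecutor(4) as exe:
        # issue tasks asynchronously, keeping the future handles
        futures = [exe.submit(task, i) for i in range(10)]
        # report results as tasks finish, not in submission order
        for future in as_completed(futures):
            print(future.result())
```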
