295

I want to change the following code

for directory, dirs, files in os.walk(directory_1):
    do_something()

for directory, dirs, files in os.walk(directory_2):
    do_something()

to this code:

for directory, dirs, files in os.walk(directory_1) + os.walk(directory_2):
    do_something()

I get the error:

unsupported operand type(s) for +: 'generator' and 'generator'

How to join two generators in Python?


15 Answers

387

itertools.chain() should do it. It takes multiple iterables and yields from each one by one, roughly equivalent to:

def chain(*iterables):
    for it in iterables:
        for element in it:
            yield element

Usage example:

from itertools import chain

g = (c for c in 'ABC')  # Dummy generator, just for example
c = chain(g, 'DEF')  # Chain the generator and a string
for item in c:
    print(item)

Output:

A
B
C
D
E
F
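
Applied to the question's code, a minimal sketch (reusing the directory_1, directory_2 and do_something names from the question) might look like:

import os
from itertools import chain

for directory, dirs, files in chain(os.walk(directory_1), os.walk(directory_2)):
    do_something()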

6 Comments

One should keep in mind that itertools.chain() does not return a types.GeneratorType instance, just in case the exact type is crucial.
See @andrew-pate's answer for an itertools.chain.from_iterable() reference that returns a types.GeneratorType instance.
itertools.chain() would give all the elements in one directory and then shift to the other directory. Now, how do we pick the first elements of both directories and perform some operations, and then shift to the next pair and so on? Any idea would be appreciated.
@yash Iterate over those directories manually using the built-in function next.
@yash you might like zip. It does precisely that, pick out the first, second etc. values and put them in tuples.
120

An example:

from itertools import chain

def generator1():
    for item in 'abcdef':
        yield item

def generator2():
    for item in '123456':
        yield item

generator3 = chain(generator1(), generator2())
for item in generator3:
    print(item)

Comments

97

In Python (3.3 or greater) you can do:

def concat(a, b):
    yield from a
    yield from b
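
Applied to the question, a sketch might be:

import os

for directory, dirs, files in concat(os.walk(directory_1), os.walk(directory_2)):
    do_something()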

8 Comments

So much pythonic.
More general: def chain(*iterables): for iterable in iterables: yield from iterable (Put the def and for on separate lines when you run it.)
Is everything from a yielded before anything from b is yielded or are they being alternated?
@problemofficer Yup. Only a is checked until everything is yielded from it, even if b isn't an iterator. The TypeError for b not being an iterator will come up later.
@Karolius Oh OK, I see what you're saying. It looks like you made a typo, which confused me: def chain(iterable) should be def chain(iterables). (Also, x for x in is redundant.) Anyway, there's already a tool in the stdlib that does that: itertools.chain.from_iterable. And beyond performance, if you had an infinite iterable of iterables, it wouldn't be possible to use unpacking.
41

Simple example:

from itertools import chain
x = iter([1, 2, 3])    # an iterator over a list (list_iterator)
y = iter([3, 4, 5])    # another one
result = chain(x, y)   # chain x and y into a single iterator
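
To see the combined sequence, you can consume the chained iterator, for example:

print(list(result))  # [1, 2, 3, 3, 4, 5]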

5 Comments

Why not add this example to the already existing, highly upvoted itertools.chain() answer?
This isn't quite right, since itertools.chain returns an iterator, not a generator.
Can't you just do chain([1, 2, 3], [3, 4, 5])?
To be pedantic, a list_iterator isn't a generator, but it is an iterator, which is what OP's effectively actually asking about, since generators don't behave any differently from iterators in this context.
An example has been added to the top answer, making this redundant.
16

Here it is using a generator expression with nested fors:

range_a = range(3)
range_b = range(5)
result = ( item
           for one_range in (range_a, range_b)
           for item in one_range )
assert list(result) == [0, 1, 2, 0, 1, 2, 3, 4]

The for ... in ... clauses are evaluated left to right. The identifier after each for establishes a new variable: one_range is used in the following for ... in ..., while item from the second clause is used in the "final" expression at the very beginning, of which there is only one.

Related question: How do I make a flat list out of a list of lists?
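
Applied to the question's os.walk calls, a sketch of the same pattern might look like:

import os

result = ( item
           for one_walk in (os.walk(directory_1), os.walk(directory_2))
           for item in one_walk )
for directory, dirs, files in result:
    do_something()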

Comments

13

With itertools.chain.from_iterable you can do things like:

import itertools

def genny(start):
    for x in range(start, start + 3):
        yield x

y = [1, 2]
ab = [o for o in itertools.chain.from_iterable(genny(x) for x in y)]
print(ab)
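
Applied to the question, where there are exactly two generators, a sketch could be:

import itertools
import os

walks = [os.walk(directory_1), os.walk(directory_2)]  # illustrative list of the two walk generators
for directory, dirs, files in itertools.chain.from_iterable(walks):
    do_something()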

7 Comments

You're using an unnecessary list comprehension. You're also using an unnecessary generator expression on genny when it already returns a generator. list(itertools.chain.from_iterable(genny(x))) is much more concise.
The list comprehension was an easy way to create the two generators, as per the question. Maybe my answer is a little convoluted in that respect.
I guess the reason I added this answer to the existing ones was to help those who happen to have lots of generators to deal with.
It isn't an easy way, there are many easier ways. Using generator expressions on an existing generator will lower performance, and the list constructor is much more readable then the list comprehension. Your method is much more unreadable in those regards.
Corman, I agree your list constructor is indeed more readable. It would be good to see your 'many easier ways' though ... I think wjandrea's comment above looks to do the same as itertools.chain.from_iterable; it would be good to race them and see who's fastest.
8

2020 update: Works in both Python 3 and Python 2

import itertools

iterA = range(10,15)
iterB = range(15,20)
iterC = range(20,25)

First option:

for i in itertools.chain(iterA, iterB, iterC):
    print(i)

# 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

Alternative option, introduced in Python 2.6:

for i in itertools.chain.from_iterable( [iterA, iterB, iterC] ):
    print(i)

# 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

itertools.chain() is the basic tool.

itertools.chain.from_iterable() is handy if you have an iterable of iterables. For example a list of files per subdirectory like [ ["src/server.py", "src/readme.txt"], ["test/test.py"] ].
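
For instance, a small sketch of that case (the file names are taken from the example above and are just illustrative):

import itertools

files_per_subdir = [["src/server.py", "src/readme.txt"], ["test/test.py"]]
for path in itertools.chain.from_iterable(files_per_subdir):
    print(path)
# src/server.py
# src/readme.txt
# test/test.py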

1 Comment

Python 2 went EOL on January 1, 2020, so I'm surprised you mention it
4

One can also use the unpacking operator *:

concat = (*gen1(), *gen2())

NOTE: This works best for 'non-lazy' iterables, since the unpacking consumes both generators immediately. It can also be used with different kinds of comprehensions. The preferred way to concatenate generators remains the one from the answer by @Uduse.
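
To illustrate the eager evaluation, a small sketch with two stand-in generators (gen1 and gen2 here are just illustrative):

def gen1():
    yield from 'AB'

def gen2():
    yield from 'CD'

concat = (*gen1(), *gen2())  # both generators are fully consumed at this point
print(concat)  # ('A', 'B', 'C', 'D')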

2 Comments

It's sad that there is no lazy evaluation of *generator, because it would have made this a marvelous solution...
–1 this will immediately consume both generators into a tuple!
2

If you want to keep the generators separate but still iterate over them at the same time you can use zip():

NOTE: Iteration stops at the shorter of the two generators

For example:

for (root1, dir1, files1), (root2, dir2, files2) in zip(os.walk(path1), os.walk(path2)):

    for file in files1:
        do_something()  # do something with the first list of files

    for file in files2:
        do_something()  # do something with the second list of files
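
A minimal illustration of that behaviour with two small stand-in generators:

g1 = (c for c in 'AB')
g2 = (n for n in '123')
print(list(zip(g1, g2)))  # [('A', '1'), ('B', '2')], stops at the shorter generator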

Comments

2

(Disclaimer: Python 3 only!)

Something with syntax similar to what you want is to use the splat operator to expand the two generators:

for directory, dirs, files in (*os.walk(directory_1), *os.walk(directory_2)):
    do_something()

Explanation:

This effectively performs a single-level flattening of the two generators into an N-tuple of 3-tuples (from os.walk) that looks like:

((directory1, dirs1, files1), (directory2, dirs2, files2), ...)

Your for-loop then iterates over this N-tuple.

Of course, by simply replacing the outer parentheses with brackets, you can get a list of 3-tuples instead of an N-tuple of 3-tuples:

for directory, dirs, files in [*os.walk(directory_1), *os.walk(directory_2)]:
    do_something()

This yields something like:

[(directory1, dirs1, files1), (directory2, dirs2, files2), ...]

Pro:

The upside to this approach is that you don't have to import anything and it's not a lot of code.

Con:

The downside is that you dump two generators into a collection and then iterate over that collection, effectively doing two passes and potentially using a lot of memory.

2 Comments

This is not flattening at all. Rather, it is a zip.
A bit puzzled by your comment @jpaugh. This concatenates two iterables. It doesn't create pairs from them. Maybe the confusion is from the fact that os.walk already yields 3-tuples?
2

I would say that, as suggested in comments by user "wjandrea", the best solution is

def concat_generators(*gens):
    for gen in gens:
        yield from gen

It does not change the returned type and is really Pythonic.
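
Applied to the question, a sketch might be:

import os

for directory, dirs, files in concat_generators(os.walk(directory_1), os.walk(directory_2)):
    do_something()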

3 Comments

Which is what itertools.chain.from_iterable() will do for you. See @andrew-pate's answer.
Don't reinvent the wheel, use itertools.chain. My comment wasn't meant to suggest "the best solution", it was just to improve a mediocre solution. Anyway, you also changed the names and made them confusing: concat_generators can work on any iterable, not just generators, so it should be renamed along with gen; and args is vague, so I'd use iterables instead (or gens, following your incorrect naming scheme).
Oops, actually, I take most of that back. If you're using generator-specific features, like .send(), .throw(), and .close(), then this is the better solution because it actually lets you use them, which itertools.chain doesn't. But in OP's case, they're not using any of those features, so it's simpler to use chain. (Also, I should have linked generator iterator instead of generator. The glossary is arguably wrong for this term.)
0

Let's say that we have two generators (gen1 and gen2) and we want to perform some extra calculation that requires the output of both. We can compute the result of such a function/calculation through map(), which returns an iterator that we can loop over.

In this scenario, the function/calculation can be passed as a lambda (or any callable). The tricky part is what we aim to do inside map() and its lambda.

General form of proposed solution:

def function(gen1, gen2):
    for item in map(lambda x, y: do_something(x, y), gen1, gen2):
        yield item
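
For instance, a small illustration with two numeric generators and a hypothetical pairwise sum as the calculation:

g1 = (x for x in range(3))        # yields 0, 1, 2
g2 = (x for x in range(10, 13))   # yields 10, 11, 12
sums = map(lambda x, y: x + y, g1, g2)
print(list(sums))  # [10, 12, 14]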

Comments

0

If you would like to get lists of file paths from known 'before' and 'after' directories, you can do this:

import os

after_flist = []
before_flist = []

for r, d, f in os.walk(current_dir):
    for dir in d:
        if dir == 'after':
            after_dir = os.path.abspath(os.path.join(current_dir, dir))
            for r2, d2, f2 in os.walk(after_dir):
                after_flist.append([os.path.join(r2, file) for file in f2 if file.endswith('json')])
        elif dir == 'before':
            before_dir = os.path.abspath(os.path.join(current_dir, dir))
            for r2, d2, f2 in os.walk(before_dir):
                before_flist.append([os.path.join(r2, file) for file in f2 if file.endswith('json')])

I know there are better answers; this just felt like simple code to me.

Comments

-1

You can put any generator into a list, and while you can't combine generators with +, you can combine lists. The con is that this actually creates three lists in memory, but the pros are that it is very readable, requires no imports, and is a single-line idiom.

Solution for the OP.

for directory, dirs, files in list(os.walk(directory_1)) + list(os.walk(directory_2)):
    do_something()

A generic example:

a = range(20)
b = range(10, 99, 3)
for v in list(a) + list(b):
    print(v)

Comments

-2

If you just need to do it once and do not wish to import one more module, there is a simple solution...

just do:

for dir in directory_1, directory_2:
    for directory, dirs, files in os.walk(dir):
        do_something()

If you really want to "join" both generators, then do :

for directory, dirs, files in (
        x for osw in [os.walk(directory_1), os.walk(directory_2)]
        for x in osw
        ):
    do_something()

3 Comments

The second snippet of code gives an indentation error. It can be fixed with surrounding the list comprehension with parentheses: the opening parenthesis should be on the same line as in and the closing after the list comp ends. Regardless of this error, I think this is a bad example to follow. It reduces readability by mixing up indentation. The itertools.chain answers are massively more readable and easier to use.
You don't need to add parenthesis. I just moved the opening bracket on the previous line to solve this. by the way, you may not like my example, but I still think it's a good idea to know how to do things by yourself, because it makes you able to write the library yourself instead of resorting to someone else's work when you need it.
sure, it is a good idea to learn how to do things by yourself. I never debated that. Sorry if I was unclear. The use of a list comprehension here reduces readability and is not really needed. List comprehensions are cool, long list comprehensions become hard to read & fix. The code could be improved by creating the list before and then iterating over it. Sorry about my parenthesis comment if it was incorrect.
