What does the yield keyword do in Python?


What is the use of the yield keyword in Python? What does it do?

For example, I'm trying to understand this code (**):

def _get_child_candidates(self, distance, min_dist, max_dist):
    if self._leftchild and distance - max_dist < self._median:
        yield self._leftchild
    if self._rightchild and distance + max_dist >= self._median:
        yield self._rightchild  

And this is the caller:

result, candidates = list(), [self]
while candidates:
    node = candidates.pop()
    distance = node._get_dist(obj)
    if distance <= max_dist and distance >= min_dist:
        result.extend(node._values)
    candidates.extend(node._get_child_candidates(distance, min_dist, max_dist))
return result

What happens when the method _get_child_candidates is called? Is a list returned? A single element? Is it called again? When do subsequent calls stop?

** The code comes from Jochen Schulz (jrschulz), who made a great Python library for metric spaces. This is the link to the complete source: Module mspace.

Check this answer too: stackoverflow.com/a/23530101/736037 –  Giri May 8 '14 at 0:00

Accepted answer: To understand what yield does, you must understand what generators are. And before generators come iterables.

Iterables

When you create a list, you can read its items one by one, and it's called iteration:

>>> mylist = [1, 2, 3]
>>> for i in mylist:
...    print(i)
1
2
3

mylist is an iterable. When you use a list comprehension, you create a list, and so an iterable:

>>> mylist = [x*x for x in range(3)]
>>> for i in mylist:
...    print(i)
0
1
4

Everything you can use "for... in..." on is an iterable: lists, strings, files... These iterables are handy because you can read them as much as you wish, but you store all the values in memory and it's not always what you want when you have a lot of values.

Generators

Generators are iterators, but you can only iterate over them once. It's because they do not store all the values in memory; they generate the values on the fly:

>>> mygenerator = (x*x for x in range(3))
>>> for i in mygenerator:
...    print(i)
0
1
4

It is just the same except you used () instead of []. BUT, you cannot iterate with for i in mygenerator a second time, since generators can only be used once: they calculate 0, forget it, then calculate 1, and finally 4, one value at a time.
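
You can see this one-shot behavior directly (a tiny transcript of my own, using the same generator expression as above):

>>> mygenerator = (x*x for x in range(3))
>>> list(mygenerator)  # the first pass consumes all the values
[0, 1, 4]
>>> list(mygenerator)  # a second pass finds the generator already exhausted
[]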

Yield

Yield is a keyword that is used like return, except the function will return a generator.

>>> def createGenerator():
...    mylist = range(3)
...    for i in mylist:
...        yield i*i
...
>>> mygenerator = createGenerator() # create a generator
>>> print(mygenerator) # mygenerator is an object!
<generator object createGenerator at 0xb7555c34>
>>> for i in mygenerator:
...     print(i)
0
1
4

This example is not very useful by itself, but yield is handy when you know your function will return a huge set of values that you will only need to read once.

To master yield, you must understand that when you call the function, the code you have written in the function body does not run; the function only returns the generator object. This is a bit tricky :-)

Then, your code will be run each time the for uses the generator.

Now the hard part:

The first time the for calls the generator object created from your function, it will run the code in your function from the beginning until it hits yield, and then it returns the first value of the loop. Each subsequent call resumes right after the yield, runs the loop body one more time, and returns the next value, until there is no value left to return.

The generator is considered empty once the function runs but does not hit yield anymore. That can be because the loop came to an end, or because an if/else condition is no longer satisfied.
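
For example, here is a tiny generator of my own (pick_even is a hypothetical name) whose yield is gated by an if, just like the code explained below; once the body finishes without hitting another yield, the generator is empty:

>>> def pick_even(numbers):
...     for n in numbers:
...         if n % 2 == 0:  # yield only when the condition holds
...             yield n
...
>>> list(pick_even([1, 3, 4, 6, 7]))
[4, 6]
>>> list(pick_even([1, 3, 5]))  # the condition never held: empty generator
[]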

Your code explained

Generator:

# Here you create the method of the node object that will return the generator
def _get_child_candidates(self, distance, min_dist, max_dist):

  # Here is the code that will be called each time you use the generator object:

  # If there is still a child of the node object on its left
  # AND if distance is ok, return the next child
  if self._leftchild and distance - max_dist < self._median:
      yield self._leftchild

  # If there is still a child of the node object on its right
  # AND if distance is ok, return the next child
  if self._rightchild and distance + max_dist >= self._median:
      yield self._rightchild

  # If the function arrives here, the generator will be considered empty
  # there are no more than two values: the left and the right children

Caller:

# Create an empty list and a list with the current object reference
result, candidates = list(), [self]

# Loop on candidates (they contain only one element at the beginning)
while candidates:

    # Get the last candidate and remove it from the list
    node = candidates.pop()

    # Get the distance between obj and the candidate
    distance = node._get_dist(obj)

    # If distance is ok, then you can fill the result
    if distance <= max_dist and distance >= min_dist:
        result.extend(node._values)

    # Add the children of the candidate in the candidates list
    # so the loop will keep running until it will have looked
    # at all the children of the children of the children, etc. of the candidate
    candidates.extend(node._get_child_candidates(distance, min_dist, max_dist))

return result

This code contains several smart parts:

  • The loop iterates over a list, but the list expands while the loop is running :-) It's a concise way to go through all this nested data, even if it's a bit dangerous since you can end up with an infinite loop. In this case, candidates.extend(node._get_child_candidates(distance, min_dist, max_dist)) exhausts all the values of the generator, but while keeps creating new generator objects, each of which produces different values from the previous ones since it's not applied to the same node.

  • The extend() method is a list object method that expects an iterable and adds its values to the list.

Usually we pass a list to it:

>>> a = [1, 2]
>>> b = [3, 4]
>>> a.extend(b)
>>> print(a)
[1, 2, 3, 4]

But in your code it gets a generator, which is good because:

  1. You don't need to read the values twice.
  2. You can have a lot of children and you don't want them all stored in memory.

And it works because Python does not care whether the argument of a method is a list or not. Python expects iterables, so it will work with strings, lists, tuples, and generators! This is called duck typing and is one of the reasons why Python is so cool. But this is another story, for another question...
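
For example, extend() consumes a generator just as happily as a list:

>>> a = [1, 2]
>>> a.extend(x*x for x in range(3))  # a generator, not a list
>>> print(a)
[1, 2, 0, 1, 4]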

You can stop here, or read a little further to see an advanced use of generators:

Controlling a generator exhaustion

>>> class Bank(): # let's create a bank, building ATMs
...    crisis = False
...    def create_atm(self):
...        while not self.crisis:
...            yield "$100"
>>> hsbc = Bank() # when everything's ok the ATM gives you as much as you want
>>> corner_street_atm = hsbc.create_atm()
>>> print(corner_street_atm.next())
$100
>>> print(corner_street_atm.next())
$100
>>> print([corner_street_atm.next() for cash in range(5)])
['$100', '$100', '$100', '$100', '$100']
>>> hsbc.crisis = True # crisis is coming, no more money!
>>> print(corner_street_atm.next())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
>>> wall_street_atm = hsbc.create_atm() # it's even true for new ATMs
>>> print(wall_street_atm.next())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
>>> hsbc.crisis = False # trouble is, even post-crisis the ATM remains empty
>>> print(corner_street_atm.next())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
>>> brand_new_atm = hsbc.create_atm() # build a new one to get back in business
>>> for cash in brand_new_atm:
...    print cash
$100
$100
$100
$100
$100
$100
$100
$100
$100
...

It can be useful for various things like controlling access to a resource.

Itertools, your best friend

The itertools module contains special functions to manipulate iterables. Ever wish to duplicate a generator? Chain two generators? Group values in a nested list with a one-liner? Map/zip without creating another list?

Then just import itertools.

An example? Let's see the possible orders of arrival for a 4 horse race:

>>> import itertools
>>> horses = [1, 2, 3, 4]
>>> races = itertools.permutations(horses)
>>> print(races)
<itertools.permutations object at 0xb754f1dc>
>>> print(list(itertools.permutations(horses)))
[(1, 2, 3, 4),
 (1, 2, 4, 3),
 (1, 3, 2, 4),
 (1, 3, 4, 2),
 (1, 4, 2, 3),
 (1, 4, 3, 2),
 (2, 1, 3, 4),
 (2, 1, 4, 3),
 (2, 3, 1, 4),
 (2, 3, 4, 1),
 (2, 4, 1, 3),
 (2, 4, 3, 1),
 (3, 1, 2, 4),
 (3, 1, 4, 2),
 (3, 2, 1, 4),
 (3, 2, 4, 1),
 (3, 4, 1, 2),
 (3, 4, 2, 1),
 (4, 1, 2, 3),
 (4, 1, 3, 2),
 (4, 2, 1, 3),
 (4, 2, 3, 1),
 (4, 3, 1, 2),
 (4, 3, 2, 1)]

Understanding the inner mechanisms of iteration

Iteration is a process involving iterables (objects implementing the __iter__() method) and iterators (objects implementing the __next__() method). Iterables are any objects you can get an iterator from. Iterators are objects that let you iterate over iterables.

More about it in this article about how the for loop works.

Apart from being in only one line, generators are often faster than regular iterations, and, while they produce the same end result, generators do so very differently than for loops. Also, as some of the later parts of the answer details, generators give much more control over iterations. The yield command uses a generator, for example. –  someone-or-other Apr 23 '14 at 4:50
I think your explanation of how yield works is a little confusing - a function is not executed from the beginning every time a new value is requested; instead, it is resumed from the same place where it left off last time, which is the line after yield. –  riv Apr 27 '14 at 20:06
"Generators are iterators, but you can only iterate over them once." - is this a desired feature of a limitation of the language. To me, it seems like a limitation. –  jco Sep 1 '14 at 20:04
@jco It's a feature. For example, if I had a database with 10k names in it and I wanted to print everyone's name, using a generator I could, knowing I wouldn't run out of memory, since once it prints a name it forgets it. If I tried to pull it all into a list, I could run out of memory. –  triunenature Sep 5 '14 at 8:19
As a remark, glob() returns a list, while iglob() returns an iterator (using yield), 'without actually storing them all simultaneously.' See glob. –  booksee Sep 7 '14 at 2:19

Shortcut to Grokking yield

When you see a function with yield statements, apply this easy trick to understand what will happen:

  1. Insert a line result = [] at the start of the function.
  2. Replace each yield expr with result.append(expr).
  3. Insert a line return result at the bottom of the function.
  4. Yay - no more yield statements! Read and figure out code.
  5. Compare function to original definition.
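
For example, applying the trick to a small generator function (even_squares is a hypothetical example of mine, not from the question):

# Original version, with yield:
def even_squares(n):
    for i in range(n):
        if i % 2 == 0:
            yield i * i

# After applying the trick -- an ordinary list-building function:
def even_squares(n):
    result = []                    # step 1: empty list at the start
    for i in range(n):
        if i % 2 == 0:
            result.append(i * i)   # step 2: yield expr -> result.append(expr)
    return result                  # step 3: return the list at the bottom

# Both give the same values: list(even_squares(5)) == [0, 4, 16]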

This trick may give you an idea of the logic behind the function, but what actually happens with yield is significantly different from what happens in the list-based approach. In many cases the yield approach will be a lot more memory-efficient, and faster too. In other cases this trick will get you stuck in an infinite loop, even though the original function works just fine. Read on to learn more...

Don't confuse your Iterables, Iterators and Generators

First, the iterator protocol - when you write

for x in mylist:
    ...loop body...

Python performs the following two steps:

  1. Gets an iterator for mylist:

    Call iter(mylist) -> this returns an object with a next() method (or __next__() in Python 3).

    [This is the step most people forget to tell you about]

  2. Uses the iterator to loop over items:

    Keep calling the next() method on the iterator returned from step 1. The return value from next() is assigned to x and the loop body is executed. If an exception StopIteration is raised from within next(), it means there are no more values in the iterator and the loop is exited.

The truth is Python performs the above two steps anytime it wants to loop over the contents of an object - so it could be a for loop, but it could also be code like otherlist.extend(mylist) (where otherlist is a Python list).
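
Spelled out in code, those two steps look roughly like this (a sketch, using the Python 3 spellings iter() and next()):

mylist = [1, 2, 3]

# What "for x in mylist: ...loop body..." boils down to:
iterator = iter(mylist)        # step 1: get an iterator
while True:
    try:
        x = next(iterator)     # step 2: ask the iterator for the next item
    except StopIteration:      # no more values: leave the loop
        break
    print(x)                   # ...loop body...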

Here mylist is an iterable because it implements the iterator protocol. In a user defined class, you can implement the __iter__() method to make instances of your class iterable. This method should return an iterator. An iterator is an object with a next() method. It is possible to implement both __iter__() and next() on the same class, and have __iter__() return self. This will work for simple cases, but not when you want two iterators looping over the same object at the same time.

So that's the iterator protocol, many objects implement this protocol:

  1. Built-in lists, dictionaries, tuples, sets, files.
  2. User defined classes that implement __iter__().
  3. Generators.

Note that a for loop doesn't know what kind of object it's dealing with - it just follows the iterator protocol, and is happy to get item after item as it calls next(). Built-in lists return their items one by one, dictionaries return the keys one by one, files return the lines one by one, etc. And generators return... well that's where yield comes in:

def f123():
    yield 1
    yield 2
    yield 3

for item in f123():
    print item

Instead of yield statements, if you had three return statements in f123() only the first would get executed, and the function would exit. But f123() is no ordinary function. When f123() is called, it does not return any of the values in the yield statements! It returns a generator object. Also, the function does not really exit - it goes into a suspended state. When the for loop tries to loop over the generator object, the function resumes from its suspended state, runs until the next yield statement and returns that as the next item. This happens until the function exits, at which point the generator raises StopIteration, and the loop exits.

So the generator object is sort of like an adapter - at one end it exhibits the iterator protocol, by exposing __iter__() and next() methods to keep the for loop happy. At the other end however, it runs the function just enough to get the next value out of it, and puts it back in suspended mode.

Why Use Generators?

Usually you can write code that doesn't use generators but implements the same logic. One option is to use the temporary list 'trick' I mentioned before. That will not work in all cases, e.g. if you have infinite loops, or it may make inefficient use of memory when you have a really long list. The other approach is to implement a new iterable class SomethingIter that keeps state in instance members and performs the next logical step in its next() (or __next__() in Python 3) method. Depending on the logic, the code inside the next() method may end up looking very complex and be prone to bugs. Here generators provide a clean and easy solution.
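
As a rough sketch of that second approach (SquaresIter is a hypothetical class of mine, written with the Python 3 method name __next__):

class SquaresIter:
    # Iterator equivalent of: def squares(n): for i in range(n): yield i*i
    def __init__(self, n):
        self.i = 0            # the iteration state lives in instance members
        self.n = n
    def __iter__(self):
        return self
    def __next__(self):       # would be named next() in Python 2
        if self.i >= self.n:
            raise StopIteration
        value = self.i * self.i
        self.i += 1
        return value

# list(SquaresIter(4)) == [0, 1, 4, 9]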

nice explanation! found it better than the most upvoted answer –  Rushil May 31 '14 at 15:40
Why is 'yield' harder to understand than most Python constructs? Because of its non-local effect: Normally, when you write 'def xy():' you define a function and bind it to the name xy in the current scope. And this does of course not depend on what is in the body of that function. Not so with 'yield' in the body: Suddenly your definition is NOT a function definition anymore, it is a generator definition. And a generator is a very different thing. (One of the few really ugly aspects of Python in my view.) –  Lutz Prechelt Sep 16 '14 at 12:33
   
@LutzPrechelt, it still binds a function that returns a generator to the name xy in the scope. This function returns a generator. This is why a generator can be used with for x in (y for y in generator): and a function with yield in it needs to be called as for x in xy():. This question leaves out "Don't confuse your generators with generator functions," which can be observed in the ATM example in the accepted answer. It is fair to say it makes the keyword def ambiguous to anyone who didn't write the function, but it always is until you find return or yield in the code. –  Poik Feb 12 at 14:42

Think of it this way:

An iterator is just a fancy sounding term for an object that has a next() method. So a yield-ed function ends up being something like this:

Original version:

def some_function():
    for i in xrange(4):
        yield i

for i in some_function():
    print i

This is basically what the python interpreter does with the above code:

class it:
    def __init__(self):
        #start at -1 so that we get 0 when we add 1 below.
        self.count = -1
    #the __iter__ method will be called once by the for loop.
    #the rest of the magic happens on the object returned by this method.
    #in this case it is the object itself.
    def __iter__(self):
        return self
    #the next method will be called repeatedly by the for loop
    #until it raises StopIteration.
    def next(self):
        self.count += 1
        if self.count < 4:
            return self.count
        else:
            #a StopIteration exception is raised
            #to signal that the iterator is done.
            #This is caught implicitly by the for loop.
            raise StopIteration 

def some_func():
    return it()

for i in some_func():
    print i

For more insight as to what's happening behind the scenes, the for loop can be rewritten to this:

iterator = some_func()
try:
    while 1:
        print iterator.next()
except StopIteration:
    pass

Does that make more sense or just confuse you more? :)

EDIT: I should note that this IS an oversimplification for illustrative purposes. :)

EDIT 2: Forgot to throw the StopIteration exception

for-loop expects an iterable (not iterator), therefore an __iter__ method must be defined. In case of an iterator object it is very simple: __iter__ = lambda self: self –  J.F. Sebastian Oct 25 '08 at 1:55
   
__getitem__ could be defined instead of __iter__. For example: class it: pass; it.__getitem__ = lambda self, i: i*10 if i < 10 else [][0]; for i in it(): print(i), It will print: 0, 10, 20, ..., 90 –  J.F. Sebastian Oct 25 '08 at 2:03

The yield keyword is reduced to two simple facts:

  1. If the compiler detects the yield keyword anywhere inside a function, that function no longer returns via the return statement. Instead, it immediately returns a lazy "pending list" object called a generator
  2. A generator is iterable. What is an iterable? It's anything like a list or set or range or dict-view, with a built-in protocol for visiting each element in a certain order.

In a nutshell: a generator is a lazy, incrementally-pending list, and yield statements allow you to use function notation to program the list values the generator should incrementally spit out.

generator = myYieldingFunction(...)
x = list(generator)

   generator
       v
[x[0], ..., ???]

         generator
             v
[x[0], x[1], ..., ???]

               generator
                   v
[x[0], x[1], x[2], ..., ???]

                       StopIteration exception
[x[0], x[1], x[2]]     done

list==[x[0], x[1], x[2]]

Example

Let's define a function makeRange that's just like Python's range. Calling makeRange(n) RETURNS A GENERATOR:

def makeRange(n):
    # return 0,1,2,...,n-1
    i = 0
    while i < n:
        yield i
        i += 1

>>> makeRange(5)
<generator object makeRange at 0x19e4aa0>

To force the generator to immediately return its pending values, you can pass it into list() (just like you could any iterable):

>>> list(makeRange(5))
[0, 1, 2, 3, 4]

Comparing example to "just returning a list"

The above example can be thought of as merely creating a list which you append to and return:

# list-version                   #  # generator-version
def makeRange(n):                #  def makeRange(n):
    """return [0,1,2,...,n-1]""" #~     """return 0,1,2,...,n-1"""
    TO_RETURN = []               #>
    i = 0                        #      i = 0
    while i < n:                 #      while i < n:
        TO_RETURN += [i]         #~         yield i
        i += 1                   #      i += 1
    return TO_RETURN             #>

>>> makeRange(5)
[0, 1, 2, 3, 4]

There is one major difference though; see the last section.


How you might use generators

The source of a list comprehension is an iterable, and all generators are iterable, so they're often used like so:

#                   _ITERABLE_
>>> [x+10 for x in makeRange(5)]
[10, 11, 12, 13, 14]

To get a better feel for generators, you can play around with the itertools module (be sure to use chain.from_iterable rather than chain when warranted). For example, you might even use generators to implement infinitely-long lazy lists like itertools.count(). You could implement your own def enumerate(iterable): zip(count(), iterable), or alternatively do so with the yield keyword in a while-loop.
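
For instance, here is a sketch of both spellings (my_enumerate is a made-up name, to avoid shadowing the built-in):

from itertools import count

# enumerate via zip + count, as suggested above:
def my_enumerate(iterable):
    return zip(count(), iterable)

# ...or the same thing with the yield keyword in a while-loop:
def my_enumerate2(iterable):
    i = 0
    it = iter(iterable)
    while True:
        try:
            item = next(it)
        except StopIteration:
            return             # a bare return ends the generator cleanly
        yield (i, item)
        i += 1

# Both agree: list(my_enumerate2("ab")) == [(0, 'a'), (1, 'b')]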

Please note: generators can actually be used for many more things, such as implementing coroutines or non-deterministic programming or other elegant things. However, the "lazy lists" viewpoint I present here is the most common use you will find.


Behind the scenes

This is how the "Python iteration protocol" works - that is, what is going on when you do list(makeRange(5)). This is what I described earlier as a "lazy, incremental list".

>>> x=iter(range(5))
>>> next(x)
0
>>> next(x)
1
>>> next(x)
2
>>> next(x)
3
>>> next(x)
4
>>> next(x)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration

The built-in function next() just calls the object's .next() method, which is part of the "iteration protocol" and is found on all iterators. You can manually use the next() function (and other parts of the iteration protocol) to implement fancy things, usually at the expense of readability, so try to avoid doing that...


Minutiae

Normally, most people would not care about the following distinctions and probably want to stop reading here.

In Python-speak, an iterable is any object which "understands the concept of a for-loop" like a list [1,2,3], and an iterator is a specific instance of the requested for-loop like [1,2,3].__iter__(). A generator is exactly the same as any iterator, except for the way it was written (with function syntax).

When you request an iterator from a list, it creates a new iterator. However, when you request an iterator from an iterator (which you would rarely do), it just gives you itself back.
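
You can check this behavior directly:

>>> lst = [1, 2, 3]
>>> iter(lst) is iter(lst)  # a list hands out a fresh iterator each time
False
>>> it = iter(lst)
>>> iter(it) is it          # an iterator just hands itself back
True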

Thus, in the unlikely event that you are failing to do something like this...

> x = makeRange(5)
> list(x)
[0, 1, 2, 3, 4]
> list(x)
[]

... then remember that a generator is an iterator; that is, it is one-time-use. If you want to reuse it, you should call makeRange(...) again. Those who absolutely need to clone a generator (for example, doing terrifyingly hackish metaprogramming) can use itertools.tee if absolutely necessary, since the copyable-iterator Python PEP standards proposal has been deferred.


I feel like I post a link to this presentation every day: David M. Beazley's Generator Tricks for Systems Programmers. If you're a Python programmer and you're not extremely familiar with generators, you should read this. It's a very clear explanation of what generators are, how they work, what the yield statement does, and it answers the question "Do you really want to mess around with this obscure language feature?"

SPOILER ALERT. The answer is: Yes. Yes, you do.


There's one extra thing to mention: a function that yields doesn't actually have to terminate. I've written code like this:

def fib():
    last, cur = 0, 1
    while True: 
        yield cur
        last, cur = cur, last + cur

Then I can use it in other code like this:

for f in fib():
    if some_condition: break
    coolfuncs(f)

It really helps simplify some problems, and makes some things easier to work with.

The same thing but with 3 lines less: "def fib(): a, b = 0, 1 while 1: yield b a, b = b, a+b". –  hhh Jan 14 '11 at 11:04
   
@hhh: Good point, I've finally decided to update the answer to make it a bit nicer. –  Claudiu Apr 21 '13 at 15:42

yield is just like return. It returns whatever you tell it to. The only difference is that the next time you call the function, execution starts from the last call to the yield statement.

In the case of your code, the function get_child_candidates is acting like an iterator so that when you extend your list, it adds one element at a time to the new list.

list.extend calls an iterator until it's exhausted. In the case of the code sample you posted, it would be much clearer to just return a tuple and append that to the list.

This is close, but not correct. Every time you call a function with a yield statement in it, it returns a brand new generator object. It's only when you call that generator's .next() method that execution resumes after the last yield. –  kurosch Oct 24 '08 at 18:11
@kurosch: golden comment! That clear, concise explanation of when such functions "resume" or "reset" is enough to use yield adequately without the need to deeply understand generators' inner mechanics. – MestreLion Nov 7 '13 at 4:42

Yield gives you a generator.

>>> def get_odd_numbers(i):
...     return range(1, i, 2)
>>> def yield_odd_numbers(i):
...     for x in range(1, i, 2):
...         yield x
>>> foo = get_odd_numbers(10)
>>> bar = yield_odd_numbers(10)
>>> foo
[1, 3, 5, 7, 9]
>>> bar
<generator object yield_odd_numbers at 0x1029c6f50>
>>> bar.next()
1
>>> bar.next()
3
>>> bar.next()
5

As you can see, in the first case foo holds the entire list in memory at once. It's not a big deal for a list with 5 elements, but what if you want a list of 5 million? Not only is this a huge memory eater, it also costs a lot of time to build at the time that the function is called. In the second case, bar just gives you a generator. A generator is an iterable--which means you can use it in a for loop, etc, but each value can only be accessed once. All the values are also not stored in memory at the same time; the generator object "remembers" where it was in the looping the last time you called it--this way, if you're using an iterable to (say) count to 50 billion, you don't have to count to 50 billion all at once and store the 50 billion numbers to count through. Again, this is a pretty contrived example, you probably would use itertools if you really wanted to count to 50 billion. :)

This is the most simple use case of generators. As you said, it can be used to write efficient permutations, using yield to push things up through the call stack instead of using some sort of stack variable. Generators can also be used for specialized tree traversal, and all manner of other things.

   
This example is especially useful because it compares and contrasts usage of yield and return. Out of all the answers this question received, I thought yours was the best. –  JesseBikman Mar 12 '14 at 16:59
   
Actually, this example doesn't quite demonstrate the difference perfectly, because in yield_odd_numbers you're using range, which builds up the entire list anyway (in Python 2). If you used xrange, your yield_odd_numbers would be efficient. In Python 3, range is lazy by default and xrange doesn't exist, so in that case even get_odd_numbers would not build the whole list. –  Widjet Feb 25 at 0:20

For those who prefer a minimal working example, meditate on this interactive Python session:

>>> def f():
...   yield 1
...   yield 2
...   yield 3
... 
>>> g = f()
>>> for i in g:
...   print i
... 
1
2
3
>>> for i in g:
...   print i
... 
>>> # Note that this time nothing was printed
 
   
This was the example that removed the brainfog. –  RPGillespie Dec 23 '14 at 20:53

It's returning a generator. I'm not particularly familiar with Python, but I believe it's the same kind of thing as C#'s iterator blocks if you're familiar with those.

There's an IBM article which explains it reasonably well (for Python) as far as I can see.

The key idea is that the compiler/interpreter/whatever does some trickery so that as far as the caller is concerned, they can keep calling next() and it will keep returning values - as if the generator method was paused. Now obviously you can't really "pause" a method, so the compiler builds a state machine for you to remember where you currently are and what the local variables etc look like. This is much easier than writing an iterator yourself.


An example in plain language. I will provide a correspondence between high-level human concepts and low-level Python concepts.

I want to operate on a sequence of numbers, but I don't want to bother myself with the creation of that sequence; I want only to focus on the operation I want to perform. So, I do the following:

  • I call you and tell you that I want a sequence of numbers which is produced in a specific way, and I let you know what the algorithm is.
    This step corresponds to defining the generator function, i.e. the function containing a yield.
  • Sometime later, I tell you, "ok, get ready to tell me the sequence of numbers".
    This step corresponds to calling the generator function, which returns a generator object. Note that you don't tell me any numbers yet; you just grab your paper and pencil.
  • I ask you, "tell me the next number", and you tell me the first number; after that, you wait for me to ask you for the next number. It's your job to remember where you were, what numbers you have already said, what is the next number. I don't care about the details.
    This step corresponds to calling .next() on the generator object.
  • … repeat previous step, until…
  • eventually, you might come to an end. You don't tell me a number, you just shout, "hold your horses! I'm done! No more numbers!"
    This step corresponds to the generator object ending its job, and raising a StopIteration exception. The generator function does not need to raise the exception; it's raised automatically when the function ends or issues a return.

This is what a generator does (a function that contains a yield); it starts executing, pauses whenever it does a yield, and when asked for a .next() value it continues from the point where it last stopped. It fits perfectly, by design, with the iterator protocol of Python, which describes how to sequentially request values.

The most famous user of the iterator protocol is the for command in python. So, whenever you do a:

for item in sequence:

it doesn't matter if sequence is a list, a string, a dictionary or a generator object like described above; the result is the same: you read items off a sequence one by one.

Note that defining a function which contains a yield keyword is not the only way to create a generator; it's just the easiest way to create one.
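
For example, a generator expression creates a generator object with no def and no yield at all:

>>> gen = (n * n for n in range(3))  # a generator expression
>>> next(gen)
0
>>> next(gen)
1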

For more accurate information, read about iterator types, the yield statement and generators in the Python documentation.

   
"it's raised automatically when the function ends or issues a return" - just notice that in functions containing yield you can only have return without arguments. Trying to return something throws SyntaxError. –  MestreLion Nov 7 '13 at 4:50

There is one type of answer that I don't feel has been given yet, among the many great answers that describe how to use generators. Here is the PL theory answer:

The yield statement in Python returns a generator. A generator in Python is a function that returns continuations (and specifically a type of coroutine, but continuations represent the more general mechanism for understanding what is going on).

Continuations in programming languages theory are a much more fundamental kind of computation, but they are not often used because they are extremely hard to reason about and also very difficult to implement. But the idea of what a continuation is, is straightforward: it is the state of a computation that has not yet finished. In this state are saved the current values of variables and the operations that have yet to be performed, and so on. Then at some point later in the program the continuation can be invoked, such that the program's variables are reset to that state and the operations that were saved are carried out.

Continuations, in this more general form, can be implemented in two ways. In the call/cc way, the program's stack is literally saved and then when the continuation is invoked, the stack is restored.

In continuation passing style (CPS), continuations are just normal functions (only in languages where functions are first class) which the programmer explicitly manages and passes around to subroutines. In this style, program state is represented by closures (and the variables that happen to be encoded in them) rather than variables that reside somewhere on the stack. Functions that manage control flow accept continuations as arguments (in some variations of CPS, functions may accept multiple continuations) and manipulate control flow by invoking them, simply calling them and returning afterwards. A very simple example of continuation passing style is as follows:

def save_file(filename):
  def write_file_continuation():
    write_stuff_to_file(filename)

  check_if_file_exists_and_user_wants_to_overwrite( write_file_continuation )

In this (very simplistic) example, the programmer saves the operation of actually writing the file into a continuation (which can potentially be a very complex operation with many details to write out), and then passes that continuation (i.e, as a first-class closure) to another operator which does some more processing, and then calls it if necessary. (I use this design pattern a lot in actual GUI programming, either because it saves me lines of code or, more importantly, to wait for GUI events and then execute saved continuations repeatedly)

The rest of this post will, without loss of generality, conceptualize continuations as CPS, because it is a hell of a lot easier to understand and read.

 

Now let's talk about generators in Python. Generators are a specific subtype of continuation. Whereas continuations are able in general to save the state of a computation (i.e., the program's call stack), generators are only able to save the state of iteration over an iterator. Although, this definition is slightly misleading for certain use cases of generators. For instance:

def f():
  while True:
    yield 4

This is clearly a reasonable iterable whose behavior is well defined -- each time the generator iterates over it, it returns 4 (and does so forever). But it probably isn't the prototypical type of iterable that comes to mind when thinking of iterators (i.e., for x in collection: do_something(x)). This example illustrates the power of generators: if anything is an iterator, a generator can save the state of its iteration.

To reiterate: continuations can save the state of a program's stack, and generators can save the state of iteration. This means that continuations are a lot more powerful than generators, but also that generators are a lot, lot easier. They are easier for the language designer to implement, and they are easier for the programmer to use (if you have some time to burn, try to read and understand this page about continuations and call/cc).

But you could easily implement (and conceptualize) generators as a simple, specific case of continuation passing style:

Whenever yield is called, it tells the function to return a continuation. When the function is called again, it starts from wherever it left off. So, in pseudo-pseudocode (i.e., not pseudocode but not code) the generator's next method is basically as follows:

class Generator():
  def __init__(self,iterable,generatorfun):
    self.next_continuation = lambda:generatorfun(iterable)

  def next(self):
    value, next_continuation = self.next_continuation()
    self.next_continuation = next_continuation
    return value

where yield keyword is actually syntactic sugar for the real generator function, basically something like:

def generatorfun(iterable):
  if len(iterable) == 0:
    raise StopIteration
  else:
    return (iterable[0], lambda:generatorfun(iterable[1:]))

Remember that this is just pseudocode and the actual implementation of generators in python is more complex. But as an exercise to understand what is going on, try to use continuation passing style to implement generator objects without use of the yield keyword.

   
Saying that "generators save the state of iterables" is somewhat misleading due to terminology. In Python, aniterable is an object that can be converted into an iterator (i.e. has an __iter__ or __getitem__methods). When comparing a generator to a continuation, one could say that the generator encapsulates a single execution frame, as opposed to the whole stack, as required by continuation. In that sense, generator is equivalent in power to a coroutine, and post-PEP-342 generators are often used in exactly that way. – user4815162342 Jul 15 '14 at 5:43
   
While your objection is technically correct that the example provided is not an iterable, as it does not have an __iter__, it can be thought of as a "virtual" iterable whose __iter__ simply continues returning items forever. I prefer this way of explaining it, by noting the similarity of this construct with an iterable. –  aestrivex Jul 18 '14 at 20:45
   
At this point I no longer understand what you are calling an iterable. A Python iterable is any object whose type implements an __iter__ method. This includes generator instances, but also files, and of course every iterator is by definition an iterable because its __iter__ returns itself. You could say that a generator saves the state of iteration over an iterator. Saying that generator f() "isn't a conventional iterable" makes no sense whatsoever. –  user4815162342 Jul 18 '14 at 21:39
   
Instead of say "iterable" -- say "state of iteration over an iterator." That is a good suggestion. I will try to figure out how to work it into this answer. –  aestrivex Jul 23 '14 at 15:38 

Here are some Python examples of how to actually implement generators as if Python did not provide syntactic sugar for them (or for a language without native syntax, like JavaScript). The snippets from that link are below.

As a Python generator:

from itertools import islice

def fib_gen():
    a, b = 1, 1
    while True:
        yield a
        a, b = b, a + b

assert [1, 1, 2, 3, 5] == list(islice(fib_gen(), 5))

Using lexical closures instead of generators

def ftake(fnext, last):
    return [fnext() for _ in xrange(last)]

def fib_gen2():
    #funky scope due to python2.x workaround
    #for python 3.x use nonlocal
    def _():
        _.a, _.b = _.b, _.a + _.b
        return _.a
    _.a, _.b = 0, 1
    return _

assert [1,1,2,3,5] == ftake(fib_gen2(), 5)

Using object closures instead of generators (because ClosuresAndObjectsAreEquivalent)

class fib_gen3:
    def __init__(self):
        self.a, self.b = 1, 1

    def __call__(self):
        r = self.a
        self.a, self.b = self.b, self.a + self.b
        return r

assert [1,1,2,3,5] == ftake(fib_gen3(), 5)
   
I'm completely misled by your example fib_gen2. Could you elaborate a little on 'funky scope due to python2.x workaround; for python 3.x use nonlocal'? I don't grasp the sense of the _, and if there is a reason you use _ and return the function itself, why all that? –  Stephane Rolland May 3 '13 at 11:07
   
He uses an "anonymous" inner function so that multiple instances of the generator can be instantiated. He takes advantage of the fact that functions, as objects, can be assigned instance variables. In this way, the state of each individual pseudo-generator is stored between invocations. Internally, a generator function works a lot like the third example--you can imagine that Python creates a generator object, and the local variable of the function become the attributes of that object so that they are stored between invocations. –  acjay Jul 19 '13 at 0:12

While a lot of answers show why you'd use a yield to create a generator, there are more uses for yield. It's quite easy to make a coroutine, which enables the passing of information between two blocks of code. I won't repeat any of the fine examples that have already been given about using yield to create a generator.

To help understand what a yield does in the following code, you can use your finger to trace the cycle through any code that has a yield. Every time your finger hits the yield, you have to wait for a next or a send to be entered. When a next is called, you trace through the code until you hit the yield… the code on the right of the yield is evaluated and returned to the caller… then you wait. When next is called again, you perform another loop through the code. However, you'll note that in a coroutine, yield can also be used with a send… which will send a value from the caller into the yielding function. If a send is given, then yield receives the value sent, and spits it out the left hand side… then the trace through the code progresses until you hit the yield again (returning the value at the end, as if next was called).

For example:

>>> def coroutine():
...     i = -1
...     while True:
...         i += 1
...         val = (yield i)
...         print("Received %s" % val)
...
>>> sequence = coroutine()
>>> sequence.next()
0
>>> sequence.next()
Received None
1
>>> sequence.send('hello')
Received hello
2
>>> sequence.close()
 
This is what I was looking for. Sometimes Stackoverflow is filled with same answers which became noise. Only two people mentioned about coroutine and send. –  Kenji Noguchi Nov 25 '14 at 0:21

I was going to post "read page 19 of Beazley's 'Python: Essential Reference' for a quick description of generators", but so many others have posted good descriptions already.

Also, note that yield can be used in coroutines as the dual of their use in generator functions. Although it isn't the same use as your code snippet, (yield) can be used as an expression in a function. When a caller sends a value to the method using the send() method, then the coroutine will execute until the next (yield) statement is encountered.
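
A minimal sketch of that pattern (echoer is a hypothetical coroutine of mine; Python 3 spelling, where advancing uses the next() built-in):

def echoer():
    while True:
        received = (yield)   # pause here until a value is sent in
        print("got:", received)

co = echoer()
next(co)                     # "prime" the coroutine: run to the first (yield)
co.send("hello")             # prints: got: hello
co.send("world")             # prints: got: world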

Generators and coroutines are a cool way to set up data-flow type applications. I thought it would be worthwhile knowing about the other use of the yield statement in functions.


There is another yield use and meaning (since Python 3.3):

yield from <expr>

http://legacy.python.org/dev/peps/pep-0380/

A syntax is proposed for a generator to delegate part of its operations to another generator. This allows a section of code containing 'yield' to be factored out and placed in another generator. Additionally, the subgenerator is allowed to return with a value, and the value is made available to the delegating generator.

The new syntax also opens up some opportunities for optimisation when one generator re-yields values produced by another.
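
For example (Python 3.3+; inner and outer are illustrative names):

def inner():
    yield 1
    yield 2
    return "inner done"          # becomes the value of the yield from expression

def outer():
    result = yield from inner()  # re-yields 1 and 2, then captures "inner done"
    yield result

print(list(outer()))             # [1, 2, 'inner done']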


Here is a mental image of what yield does.

I like to think of a thread as having a stack (even if it's not implemented that way).

When a normal function is called, it puts its local variables on the stack, does some computation, returns and clears the stack. The values of its local variables are never seen again.

With a yield function, when it's called first, it similarly adds its local variables to the stack, but then takes its local variables to a special hideaway instead of clearing them, when it returns via yield. A possible place to put them would be somewhere in the heap.

Note that it's not the function any more; it's a kind of imprint or ghost of the function that the for loop is hanging onto.

When it is called again, it retrieves its local variables from its special hideaway and puts them back on the stack and computes, then hides them again in the same way.


Here is a simple example with its result:

def isPrimeNumber(n):
    print "isPrimeNumber({}) call".format(n)
    if n == 1:
        return False
    for x in range(2, n):
        if n % x == 0:
            return False
    return True


def primes(n=1):
    while True:
        print "loop step ---------------- {}".format(n)
        if isPrimeNumber(n):
            yield n
        n += 1

for n in primes():
    if n > 10:
        break
    print "writing result {}".format(n)

Output:

loop step ---------------- 1
isPrimeNumber(1) call
loop step ---------------- 2
isPrimeNumber(2) call
writing result 2
loop step ---------------- 3
isPrimeNumber(3) call
writing result 3
loop step ---------------- 4
isPrimeNumber(4) call
loop step ---------------- 5
isPrimeNumber(5) call
writing result 5
loop step ---------------- 6
isPrimeNumber(6) call
loop step ---------------- 7
isPrimeNumber(7) call
writing result 7
loop step ---------------- 8
isPrimeNumber(8) call
loop step ---------------- 9
isPrimeNumber(9) call
loop step ---------------- 10
isPrimeNumber(10) call
loop step ---------------- 11
isPrimeNumber(11) call

I am not a Python developer, but it looks to me like yield holds the position of the program flow, and the next time the loop starts from the yield position. It seems to be waiting at that position, and just before that returning a value outside, and the next time it continues to work.

It seems like an interesting and nice ability :D


From a programming viewpoint, the iterators are implemented as thunks:

http://en.wikipedia.org/wiki/Thunk_(functional_programming)

To implement thunks (also called anonymous functions), one uses messages sent to a closure object, which has a dispatcher, and the dispatcher answers to "messages".

http://en.wikipedia.org/wiki/Message_passing

"next" is a message sent to a closure, created by "iter" call.



Source: http://stackoverflow.com/questions/231767/what-does-the-yield-keyword-do-in-python

Everything I know about Python..


Prior to beginning tutoring sessions, I ask new students to fill out a brief self-assessment where they rate their understanding of various Python concepts. Some topics ("control flow with if/else" or "defining and using functions") are understood by a majority of students before ever beginning tutoring. There are a handful of topics, however, that almost all students report having no knowledge or very limited understanding of. Of these, "generators and the yield keyword" is one of the biggest culprits. I'm guessing this is the case for most novice Python programmers.

Many report having difficulty understanding generators and the yield keyword even after making a concerted effort to teach themselves the topic. I want to change that. In this post, I'll explain what the yield keyword does, why it's useful, and how to use it.

Note: In recent years, generators have grown more powerful as features have been added through PEPs. In my next post, I'll explore the true power of yield with respect to coroutines, cooperative multitasking and asynchronous I/O (especially their use in the "tulip" prototype implementation GvR has been working on). Before we get there, however, we need a solid understanding of how the yield keyword and generators work.

Coroutines and Subroutines

When we call a normal Python function, execution starts at the function's first line and continues until a return statement, exception, or the end of the function (which is seen as an implicit return None) is encountered. Once a function returns control to its caller, that's it. Any work done by the function and stored in local variables is lost. A new call to the function creates everything from scratch.

This is all very standard when discussing functions (more generally referred to as subroutines) in computer programming. There are times, though, when it's beneficial to have the ability to create a "function" which, instead of simply returning a single value, is able to yield a series of values. To do so, such a function would need to be able to "save its work," so to speak.

I said, "yield a series of values" because our hypothetical function doesn't "return" in the normal sense. returnimplies that the function is returning control of execution to the point where the function was called. "Yield," however, implies that the transfer of control is temporary and voluntary, and our function expects to regain it in the future.

In Python, "functions" with these capabilities are called generators, and they're incredibly useful. generators(and the yield statement) were initially introduced to give programmers a more straightforward way to write code responsible for producing a series of values. Previously, creating something like a random number generator required a class or module that both generated values and kept track of state between calls. With the introduction of generators, this became much simpler.

To better understand the problem generators solve, let's take a look at an example. Throughout the example, keep in mind the core problem being solved: generating a series of values.

Note: Outside of Python, all but the simplest generators would be referred to as coroutines. I'll use the latter term later in the post. The important thing to remember is, in Python, everything described here as a coroutine is still a generator. Python formally defines the term generator; coroutine is used in discussion but has no formal definition in the language.

Example: Fun With Prime Numbers

Suppose our boss asks us to write a function that takes a list of ints and returns some Iterable containing the elements which are prime1 numbers.

Remember, an Iterable is just an object capable of returning its members one at a time.

"Simple," we say, and we write the following:

def get_primes(input_list):
    result_list = list()
    for element in input_list:
        if is_prime(element):
            result_list.append(element)

    return result_list

# or better yet...

def get_primes(input_list):
    return (element for element in input_list if is_prime(element))

# not germane to the example, but here's a possible implementation of
# is_prime...

import math

def is_prime(number):
    if number > 1:
        if number == 2:
            return True
        if number % 2 == 0:
            return False
        for current in range(3, int(math.sqrt(number) + 1), 2):
            if number % current == 0: 
                return False
        return True
    return False

Either get_primes implementation above fulfills the requirements, so we tell our boss we're done. She reports our function works and is exactly what she wanted.

Dealing With Infinite Sequences

Well, not quite exactly. A few days later, our boss comes back and tells us she's run into a small problem: she wants to use our get_primes function on a very large list of numbers. In fact, the list is so large that merely creating it would consume all of the system's memory. To work around this, she wants to be able to call get_primes with a start value and get all the primes larger than start (perhaps she's solving Project Euler problem 10).

Once we think about this new requirement, it becomes clear that it requires more than a simple change to get_primes. Clearly, we can't return a list of all the prime numbers from start to infinity (operating on infinite sequences, though, has a wide range of useful applications). The chances of solving this problem using a normal function seem bleak.

Before we give up, let's determine the core obstacle preventing us from writing a function that satisfies our boss's new requirements. Thinking about it, we arrive at the following: functions only get one chance to return results, and thus must return all results at once. It seems pointless to make such an obvious statement; "functions just work that way," we think. The real value lies in asking, "but what if they didn't?"

Imagine what we could do if get_primes could simply return the next value instead of all the values at once. It wouldn't need to create a list at all. No list, no memory issues. Since our boss told us she's just iterating over the results, she wouldn't know the difference.

Unfortunately, this doesn't seem possible. Even if we had a magical function that allowed us to iterate from n to infinity, we'd get stuck after returning the first value:

def get_primes(start):
    for element in magical_infinite_range(start):
        if is_prime(element):
            return element

Imagine get_primes is called like so:

def solve_number_10():
    # She *is* working on Project Euler #10, I knew it!
    total = 2
    for next_prime in get_primes(3):
        if next_prime < 2000000:
            total += next_prime
        else:
            print(total)
            return

Clearly, in get_primes, we would immediately hit the case where number = 3 and return at line 4. Instead of return, we need a way to generate a value and, when asked for the next one, pick up where we left off.

Functions, though, can't do this. When they return, they're done for good. Even if we could guarantee a function would be called again, we have no way of saying, "OK, now, instead of starting at the first line like we normally do, start up where we left off at line 4." Functions have a single entry point: the first line.

Enter the Generator

This sort of problem is so common that a new construct was added to Python to solve it: the generator. A generator "generates" values. Creating generators was made as straightforward as possible through the concept of generator functions, introduced simultaneously.

A generator function is defined like a normal function, but whenever it needs to generate a value, it does so with the yield keyword rather than return. If the body of a def contains yield, the function automatically becomes a generator function (even if it also contains a return statement). There's nothing else we need to do to create one.

Generator functions create generator iterators. That's the last time you'll see the term generator iterator, though, since they're almost always referred to as "generators". Just remember that a generator is a special type of iterator. To be considered an iterator, generators must define a few methods, one of which is __next__(). To get the next value from a generator, we use the same built-in function as for iterators: next().

This point bears repeating: to get the next value from a generator, we use the same built-in function as for iterators: next().

(next() takes care of calling the generator's __next__() method). Since a generator is a type of iterator, it can be used in a for loop.

So whenever next() is called on a generator, the generator is responsible for passing back a value to whomever called next(). It does so by calling yield along with the value to be passed back (e.g. yield 7). The easiest way to remember what yield does is to think of it as return (plus a little magic) for generator functions.**

Again, this bears repeating: yield is just return (plus a little magic) for generator functions.

Here's a simple generator function:

>>> def simple_generator_function():
...     yield 1
...     yield 2
...     yield 3

And here are two simple ways to use it:

>>> for value in simple_generator_function():
...     print(value)
1
2
3
>>> our_generator = simple_generator_function()
>>> next(our_generator)
1
>>> next(our_generator)
2
>>> next(our_generator)
3

Magic?

What's the magic part? Glad you asked! When a generator function calls yield, the "state" of the generator function is frozen; the values of all variables are saved and the next line of code to be executed is recorded until next() is called again. Once it is, the generator function simply resumes where it left off. If next() is never called again, the state recorded during the yield call is (eventually) discarded.

Let's rewrite get_primes as a generator function. Notice that we no longer need the magical_infinite_range function. Using a simple while loop, we can create our own infinite sequence:

def get_primes(number):
    while True:
        if is_prime(number):
            yield number
        number += 1

If a generator function calls return or reaches the end of its definition, a StopIteration exception is raised. This signals to whoever was calling next() that the generator is exhausted (this is normal iterator behavior). It is also the reason the while True: loop is present in get_primes. If it weren't, the first time next() was called we would check if the number is prime and possibly yield it. If next() were called again, we would uselessly add 1 to number and hit the end of the generator function (causing StopIteration to be raised). Once a generator has been exhausted, calling next() on it will result in an error, so you can only consume all the values of a generator once. The following will not work:

>>> our_generator = simple_generator_function()
>>> for value in our_generator:
...     print(value)

>>> # our_generator has been exhausted...
>>> print(next(our_generator))
Traceback (most recent call last):
  File "<ipython-input-13-7e48a609051a>", line 1, in <module>
    next(our_generator)
StopIteration

>>> # however, we can always create a new generator
>>> # by calling the generator function again...

>>> new_generator = simple_generator_function()
>>> print(next(new_generator)) # perfectly valid
1

Thus, the while loop is there to make sure we never reach the end of get_primes. It allows us to generate a value for as long as next() is called on the generator. This is a common idiom when dealing with infinite series (and generators in general).
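To make the contrast concrete, here's a hypothetical variant of my own that drops the while loop; it yields at most one value and then falls off the end of its definition:

>>> def get_one_prime(number):
...     if is_prime(number):   # assumes the is_prime helper used by get_primes
...         yield number
...     number += 1            # useless: we hit the end of the function right after
...
>>> gen = get_one_prime(3)
>>> next(gen)
3
>>> next(gen)  # the generator body has ended
Traceback (most recent call last):
  ...
StopIteration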

Visualizing the flow

Let's go back to the code that was calling get_primes: solve_number_10.

def solve_number_10():
    # She *is* working on Project Euler #10, I knew it!
    total = 2
    for next_prime in get_primes(3):
        if next_prime < 2000000:
            total += next_prime
        else:
            print(total)
            return

It's helpful to visualize how the first few elements are created when we call get_primes in solve_number_10's for loop. When the for loop requests the first value from get_primes, we enter get_primes as we would in a normal function.

  1. We enter the while loop
  2. The if condition holds (3 is prime)
  3. We yield the value 3, handing control back to solve_number_10.

Then, back in solve_number_10:

  1. The value 3 is passed back to the for loop
  2. The for loop assigns next_prime to this value
  3. next_prime is added to total
  4. The for loop requests the next element from get_primes

This time, though, instead of entering get_primes back at the top, we resume at the number += 1 line (marked below), right where we left off.

def get_primes(number):
    while True:
        if is_prime(number):
            yield number
        number += 1 # <<<<<<<<<<

Most importantly, number still has the same value it did when we yielded (i.e. 3). Remember, yield both passes a value to whoever called next() and saves the "state" of the generator function. So number is incremented to 4, we hit the top of the while loop, and keep incrementing number until we hit the next prime number (5). Again we yield the value of number to the for loop in solve_number_10. This cycle continues until the for loop stops (at the first prime greater than 2,000,000).
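If you'd like to watch this hand-off yourself, one quick trick (my own debugging sketch, not part of the original code) is to put prints around the yield:

def get_primes_traced(number):
    while True:
        if is_prime(number):                       # same is_prime as before
            print('about to yield', number)
            yield number
            print('resumed after yielding', number)
        number += 1

Calling next() twice on get_primes_traced(3) prints "about to yield 3", then "resumed after yielding 3" followed by "about to yield 5", tracing exactly the pause-and-resume cycle described above.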

Moar Power

In PEP 342, support was added for passing values into generators. PEP 342 gave generators the power to yield a value (as before), receive a value, or both yield a value and receive a (possibly different) value in a single statement.

To illustrate how values are sent to a generator, let's return to our prime number example. This time, instead of simply printing every prime number greater than number, we'll find the smallest prime number greater than successive powers of a number (i.e. for 10, we want the smallest prime greater than 10, then 100, then 1000, etc.). We start in the same way as get_primes:

def print_successive_primes(iterations, base=10):
    # like normal functions, a generator function
    # can be assigned to a variable

    prime_generator = get_primes(base)
    # missing code...
    for power in range(iterations):
        # missing code...

def get_primes(number):
    while True:
        if is_prime(number):
        # ... what goes here?

The next line of get_primes takes a bit of explanation. While yield number would yield the value of number, a statement of the form other = yield foo means, "yield foo and, when a value is sent to me, set other to that value." You can "send" values to a generator using the generator's send method.
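Before filling in the prime example, here's a minimal toy of mine showing the two-way traffic:

>>> def echo():
...     received = None
...     while True:
...         received = yield received   # yield out, then wait for a send()
...
>>> e = echo()
>>> next(e)          # advance to the first yield; nothing to receive yet
>>> e.send('hello')  # 'hello' becomes the value of the yield expression
'hello'
>>> e.send(42)
42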

def get_primes(number):
    while True:
        if is_prime(number):
            number = yield number
        number += 1

In this way, we can set number to a different value each time the generator yields. We can now fill in the missing code in print_successive_primes:

def print_successive_primes(iterations, base=10):
    prime_generator = get_primes(base)
    prime_generator.send(None)
    for power in range(iterations):
        print(prime_generator.send(base ** power))

Two things to note here: First, we're printing the result of generator.send, which is possible because send both sends a value to the generator and returns the value yielded by the generator (mirroring how yield works from within the generator function).

Second, notice the prime_generator.send(None) line. When you're using send to "start" a generator (that is, execute the code from the first line of the generator function up to the first yield statement), you must send None. This makes sense, since by definition the generator hasn't gotten to the first yield statement yet, so if we sent a real value there would be nothing to "receive" it. Once the generator is started, we can send values as we do above.
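A small detail worth knowing: sending None is exactly what next() does, so either line below primes the generator (a sketch reusing prime_generator from above):

# Either of these runs the generator up to its first yield:
prime_generator.send(None)
# next(prime_generator)    # equivalent alternative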

Round-up

In the second half of this series, we'll discuss the various ways in which generators have been enhanced and the power they gained as a result. yield has become one of the most powerful keywords in Python. Now that we've built a solid understanding of how yield works, we have the knowledge necessary to understand some of the more "mind-bending" things that yield can be used for.

Believe it or not, we've barely scratched the surface of the power of yield. For example, while send does work as described above, it's almost never used when generating simple sequences like our example. Below, I've pasted a small demonstration of one common way send is used. I'll not say any more about it as figuring out how and why it works will be a good warm-up for part two.

import random

def get_data():
    """Return 3 random integers between 0 and 9"""
    return random.sample(range(10), 3)

def consume():
    """Displays a running average across lists of integers sent to it"""
    running_sum = 0
    data_items_seen = 0

    while True:
        data = yield
        data_items_seen += len(data)
        running_sum += sum(data)
        print('The running average is {}'.format(running_sum / float(data_items_seen)))

def produce(consumer):
    """Produces a set of values and forwards them to the pre-defined consumer
    function"""
    while True:
        data = get_data()
        print('Produced {}'.format(data))
        consumer.send(data)
        yield

if __name__ == '__main__':
    consumer = consume()
    consumer.send(None)
    producer = produce(consumer)

    for _ in range(10):
        print('Producing...')
        next(producer)

Remember...

There are a few key ideas I hope you take away from this discussion:

  • generators are used to generate a series of values
  • yield is like the return of generator functions
  • The only other thing yield does is save the "state" of a generator function
  • A generator is just a special type of iterator
  • Like iterators, we can get the next value from a generator using next()
    • for gets values by calling next() implicitly

I hope this post was helpful. If you had never heard of generators, I hope you now understand what they are, why they're useful, and how to use them. If you were somewhat familiar with generators, I hope any confusion is now cleared up.

As always, if any section is unclear (or, more importantly, contains errors), by all means let me know. You can leave a comment below, email me at jeff@jeffknupp.com, or hit me up on Twitter @jeffknupp.


  1. Quick refresher: a prime number is a positive integer greater than 1 that has no divisors other than 1 and itself. 3 is prime because there are no numbers that evenly divide it other than 1 and 3 itself. 


来自 http://www.jeffknupp.com/blog/2013/04/07/improve-your-python-yield-and-generators-explained/


Iterators, generators and decorators


In this chapter we will learn about iterators, generators and decorators.

Iterators

Python iterator objects are required to support two methods as part of the iterator protocol.

__iter__ returns the iterator object itself. This is used in for and in statements.

The next method returns the next value from the iterator. If there are no more items to return, it should raise the StopIteration exception.

class Counter(object):
    def __init__(self, low, high):
        self.current = low
        self.high = high

    def __iter__(self):
        'Returns itself as an iterator object'
        return self

    def next(self):
        'Returns the next value till current is lower than high'
        if self.current > self.high:
            raise StopIteration
        else:
            self.current += 1
            return self.current - 1

Now we can use this iterator in our code.

>>> c = Counter(5,10)
>>> for i in c:
...   print i,
...
5 6 7 8 9 10

Remember that an iterator object can be used only once. That means after it raises StopIteration once, it will keep raising the same exception.

>>> c = Counter(5,6)
>>> next(c)
5
>>> next(c)
6
>>> next(c)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 11, in next
StopIteration
>>> next(c)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 11, in next
StopIteration

The following example shows what happens behind the scenes of the for loop we used above:

>>> iterator = iter(c)
>>> while True:
...     try:
...         x = iterator.next()
...         print x,
...     except StopIteration as e:
...         break
...
5 6 7 8 9 10

Generators

In this section we learn about Python generators. They were introduced in Python 2.3. Generators are an easier way to create iterators: you use the yield keyword from within a function.

>>> def my_generator():
...     print "Inside my generator"
...     yield 'a'
...     yield 'b'
...     yield 'c'
...
>>> my_generator()
<generator object my_generator at 0x7fbcfa0a6aa0>

In the above example we create a simple generator using yield statements. We can use it in a for loop just like any other iterator.

>>> for char in my_generator():
...     print char
...
Inside my generator
a
b
c

In the next example we will create the same Counter class using a generator function and use it in a for loop.

>>> def counter_generator(low, high):
...     while low <= high:
...         yield low
...         low += 1
...
>>> for i in counter_generator(5,10):
...     print i,
...
5 6 7 8 9 10

Inside the while loop, when execution reaches the yield statement, the value of low is returned and the generator state is suspended. During the second next call, the generator resumes where it froze before, and then the value of low is increased by one. It continues with the while loop and comes to the yield statement again.

When you call a generator function it returns a *generator* object. If you call *dir* on this object you will find that it contains __iter__ and *next* methods among the other methods.

>>> c = counter_generator(5,10)
>>> dir(c)
['__class__', '__delattr__', '__doc__', '__format__', '__getattribute__', '__hash__', '__init__', '__iter__',
 '__name__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__',
  '__subclasshook__', 'close', 'gi_code', 'gi_frame', 'gi_running', 'next', 'send', 'throw']

We mostly use generators for lazy evaluation. This makes generators a good approach for working with lots of data: if you don't want to load all the data into memory at once, you can use a generator that passes you one piece of data at a time.

A good example of this is the old os.path.walk() function, which uses a callback function, versus the current os.walk() generator. Using the generator implementation saves memory.
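As a quick sketch (the path '/tmp' is just a stand-in), os.walk hands you one directory at a time rather than materializing the whole tree:

import os

# os.walk is a generator: each step produces one
# (dirpath, dirnames, filenames) tuple on demand, so even a very large
# directory tree never has to sit in memory all at once.
walker = os.walk('/tmp')
dirpath, dirnames, filenames = next(walker)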

We can have generators which produce infinite values. The following is one such example.

>>> def infinite_generator(start=0):
...     while True:
...         yield start
...         start += 1
...
>>> for num in infinite_generator(4):
...     print num,
...     if num > 20:
...         break
...
4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21

If we go back to the example of my_generator, we notice one feature of generators: they are not re-usable.

>>> g = my_generator()
>>> for c in g:
...     print c
...
Inside my generator
a
b
c
>>> for c in g:
...     print c
...

One way to create a reusable generator is the object-based generator, which does not hold iteration state on the object itself. Any class with an __iter__ method which yields data can be used as an object generator, because each call to __iter__ creates a fresh generator. In the following example we will recreate our counter generator.

>>> class Counter(object):
...     def __init__(self, low, high):
...         self.low = low
...         self.high = high
...     def __iter__(self):
...          counter = self.low
...          while self.high >= counter:
...              yield counter
...              counter += 1
...
>>> gobj = Counter(5, 10)
>>> for num in gobj:
...     print num,
...
5 6 7 8 9 10
>>> for num in gobj:
...     print num,
...
5 6 7 8 9 10

Generator expressions


In this section we will learn about generator expressions which is a high performance, memory efficient generalization of list comprehensions and generators.

For example, let's sum the squares of all numbers from 1 to 9.

>>> sum([x*x for x in range(1,10)])
285

This first creates the whole list of square values in memory, then iterates over it, and finally, after summing, frees the memory. You can imagine the memory usage for a big list.

We can save memory usage by using a generator expression.

sum(x*x for x in range(1,10))

The syntax of generator expressions requires that they always be directly inside a set of parentheses and cannot have a comma on either side; the parentheses of a function call count. This means both of the examples below are valid generator expression usage:

>>> sum(x*x for x in range(1,10))
285
>>> g = (x*x for x in range(1,10))
>>> g
<generator object <genexpr> at 0x7fc559516b90>

We can chain generators or generator expressions. In the following example we will read the file */var/log/cron* and find out whether a particular job (in the example we are searching for anacron) has run successfully or not.

We can do the same using a shell command tail -f /var/log/cron |grep anacron

>>> jobtext = 'anacron'
>>> all = (line for line in open('/var/log/cron', 'r') )
>>> job = ( line for line in all if line.find(jobtext) != -1)
>>> text = next(job)
>>> text
"May  6 12:17:15 dhcp193-104 anacron[23052]: Job `cron.daily' terminated\n"
>>> text = next(job)
>>> text
'May  6 12:17:15 dhcp193-104 anacron[23052]: Normal exit (1 job run)\n'
>>> text = next(job)
>>> text
'May  6 13:01:01 dhcp193-104 run-parts(/etc/cron.hourly)[25907]: starting 0anacron\n'

You can also write a for loop over the remaining lines.
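For instance (a sketch reusing the job generator built above), the loop picks up wherever the next() calls left off:

>>> for line in job:
...     print line          # Python 2 print, as in the rest of this chapter
...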

Closures

Closures are nothing but functions that are returned by another function. We use closures to remove code duplication. In the following example we create a simple closure for adding numbers.

>>> def add_number(num):
...     def adder(number):
...         'adder is a closure'
...         return num + number
...     return adder
...
>>> a_10 = add_number(10)
>>> a_10(21)
31
>>> a_10(34)
44
>>> a_5 = add_number(5)
>>> a_5(3)
8

adder is a closure which adds a given number to a pre-defined one.

Decorators

A decorator is a way to dynamically add new behavior to objects. In Python we achieve this by using closures.

In the following example we will create a simple decorator that prints a statement before and after the execution of a function.

>>> def my_decorator(func):
...     def wrapper(*args, **kwargs):
...         print "Before call"
...         result = func(*args, **kwargs)
...         print "After call"
...         return result
...     return wrapper
...
>>> @my_decorator
... def add(a, b):
...     "Our add function"
...     return a + b
...
>>> add(1, 3)
Before call
After call
4
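One refinement worth knowing (my addition, not from the chapter): the wrapper hides the original function's name and docstring, and the standard library's functools.wraps copies them over:

>>> import functools
>>> def my_decorator(func):
...     @functools.wraps(func)           # copy func's name and docstring onto wrapper
...     def wrapper(*args, **kwargs):
...         print "Before call"
...         result = func(*args, **kwargs)
...         print "After call"
...         return result
...     return wrapper
...
>>> @my_decorator
... def add(a, b):
...     "Our add function"
...     return a + b
...
>>> add.__doc__
'Our add function'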

来自  http://pymbook.readthedocs.org/en/latest/igd.html

Python Yields are Fun! 



While you can optimize the heck out of your Python code with generators and generator expressions, I'm more interested in goofing around and solving classic programming questions with the yield statement.

note: For this article, since it's easier to explain things as they happen, I'll be including a lot of inline comments.

Let's start with a simple function that returns a sequence of some of my favorite values:

# yielding.py
def pydanny_selected_numbers():

    # If you multiply 9 by any other number you can easily play with
    #   numbers to get back to 9.
    #   Ex: 2 * 9 = 18. 1 + 8 = 9
    #   Ex: 15 * 9 = 135. 1 + 3 + 5 = 9
    #   See https://en.wikipedia.org/wiki/Digital_root
    yield 9

    # A pretty prime.
    yield 31

    # What's 6 * 7?
    yield 42

    # The string representation of my first date with Audrey Roy
    yield "2010/02/20"

note: When a function uses the yield keyword it's now called a generator.

Let's do a test drive in the REPL:

>>> from yielding import pydanny_selected_numbers  # import ye aulde code

>>> pydanny_selected_numbers()  # create the iterator object
<generator object pydanny_selected_numbers at 0x1038a03c0>

>>> for i in pydanny_selected_numbers():  # iterate through the iterator
...     print(i)
...
9
31
42
"2010/02/20"

>>> iterator = pydanny_selected_numbers() # create the iterator object
>>> for i in iterator:  # iterate through the iterator object
...     print(i)
...
9
31
42
"2010/02/20"

Of course, if you know anything about generator expressions, you know I could do this more tersely with the following:

>>> iterator = (x for x in [9, 31, 42, "2010/02/20"])
>>> for i in iterator:
...     print(i)
...
9
31
42
"2010/02/20"

While that is more terse, it doesn't give us the amount of control we get by defining our own generator function. For example, what if I want to present the Fibonacci sequence in a loop rather than with recursion?

# fun.py
def fibonacci(max):
    result = 0
    base = 1
    while result <= max:

        # This yield statement is where the execution leaves the function.
        yield result
        # This is where the execution comes back into the function. This is
        # just whitespace, but that it came back while preserving the state
        # of the function is pretty awesome.

        # Fibonacci code to increase the number according to
        #   https://en.wikipedia.org/wiki/Fibonacci_number
        n = result + base
        result = base
        base = n

if __name__ == "__main__":

    for x in fibonacci(144):
        print(x)

Let's try this out in the REPL:

>>> from fun import fibonacci
>>> fibonacci(10)
<generator object fibonacci at 0x10d49e460>
>>> for x in fibonacci(10):
...     print(x)
0
1
1
2
3
5
8
>>> iterator = fibonacci(5)
>>> iterator
<generator object fibonacci at 0x10d63c550>
>>> iterator.next()
0
>>> iterator.next()
1

What's nice about this is about so much more than Fibonacci logic in a generator function. Imagine that instead of a lightweight calculation, the function did something performance intensive. Using a generator, I can readily control the execution with the iterator object's next() method, computing values only on demand and saving processor cycles.
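For instance (a sketch combining the fibonacci generator above with the standard library), itertools.islice pulls just the first few values and never computes the rest:

>>> from itertools import islice
>>> from fun import fibonacci
>>> list(islice(fibonacci(10**9), 5))  # only five values are ever computed
[0, 1, 1, 2, 3]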

Very nifty.

Summary

I admit it. Like many Python developers, I find using tools like yield and generators to optimize the heck out of performance-intensive code a lot of fun.

If you are like me and like this sort of stuff, I recommend the following resources:

In the next article I'll demonstrate how to use the yield statement to create context managers.

Update: Nicholas Tollervey pointed me at wikipedia's Digital root article, so I added it to the comments of the first code sample.

Update: Oddthinking pointed out that I forgot a print statement. In the REPL it's not really needed, but if this is translated to a script then it's necessary.
 

Comments

  • I got a clear picture of yield here. Thanks.

  • This is pretty cool. Nice and concise explanation. The key for me was "note: When a function uses the yield keyword it's now called a generator."

    The StackEx answer is overdone, this is just right. Thanks.

  • Thanks for the discussion and examples on Python "generators" and the "yield" statement. I just about had it and needed just a little more to get it down. Your discussion and examples here did it.

    The Fibonacci sequence with the clear comments was great. I could see clearly that the generator yielded with one value, went out and printed it and then went back into the generator to modify the value before yielding again.

    Good stuff. Thanks.

  • Hi all, I am somewhat new to Python - I understood most of it. Very nice! I need a bit of help to understand the following:

    iterator = (x for x in [9, 31, 42, "2010/02/20"])

    Is this a kind of for comprehension? I played with it in the REPL - but would like to understand it a little better ;-) Thanks for the lovely article ;-)

    --- cheerio atul

  • Just in case you haven't seen this, and/or wanted more on generators/coroutines: http://www.dabeaz.com/coroutin...

  • Two quick suggestions: I think Box #3 is missing the "print" function. Box #4 could be simplified by factoring out n:

    result, base = base, result + base

来自 http://www.pydanny.com/python-yields-are-fun.html

This is part one of a talk I gave January 24, 2013 at the Ottawa Python Authors Group

Part Two is now also available.

Both parts of this presentation are also available as a single IPython Notebook which you can download and run locally, or view with nbviewer.ipython.org. The complete source is available at https://github.com/wardi/iterables-iterators-generators

A Gentle Introduction

The first few examples are from Ned Batchelder's Python Iteration Talk http://bit.ly/pyiter

Coming from another language you might find it natural to create a counter and increment it to iterate over a list in Python

my_list = [17, 23, 47, 51, 101, 173, 999, 1001]

i = 0
while i < len(my_list):
    v = my_list[i]
    print v
    i += 1
17
23
47
51
101
173
999
1001

You might also have been told about range() which almost lets you write a C-style for loop

for i in range(len(my_list)):
    v = my_list[i]
    print v
17
23
47
51
101
173
999
1001

But neither of the above is a natural way of iterating in Python. We do this instead:

for v in my_list:
    print v
17
23
47
51
101
173
999
1001

Many types of objects may be iterated over this way. Iterating over strings produces single characters:

for v in "Hello":
    print v
H
e
l
l
o

Iterating over a dict produces its keys (in no particular order):

d = {
    'a': 1,
    'b': 2,
    'c': 3,
    }

for v in d:
    print v
# Note the strange order!
a
c
b

Iterating over a file object produces lines from that file, including the line termination:

f = open("suzuki.txt")
for line in f:
    print ">", line
> On education

> "Education has failed in a very serious way to convey the most important lesson science can teach: skepticism."

> "An educational system isn't worth a great deal if it teaches young people how to make a living but doesn't teach them how to make a life."

Objects that can be iterated over in python are called "Iterables", and a for loop isn't the only thing that accepts iterables.

The list constructor takes any iterable. We can use this to make a list of the keys in a dict:

list(d)
['a', 'c', 'b']

Or the characters in a string:

list("Hello")
['H', 'e', 'l', 'l', 'o']

List comprehensions take iterables.

ascii = [ord(x) for x in "Hello"]
ascii
[72, 101, 108, 108, 111]

The sum() function takes any iterable that produces numbers.

sum(ascii)
500

The str.join() method takes any iterable that produces strings.

"-".join(d)
'a-c-b'

Iterables can produce any Python object. The re.finditer() function returns an iterable that produces re.match objects.

import re
suzuki = open("suzuki.txt").read()
for match in re.finditer(r'\bs\w+', suzuki):
    print match.group(0)
serious
science
skepticism
system

A "classic" iterable class

Very old versions of Python supported iteration through the __getitem__() method, and this is still supported.

class Lucky(object):
    def __getitem__(self, index):
        if index > 3:
            raise IndexError
        return 7

Lucky is a class that will return 7 for any index less than or equal to 3:

lucky = Lucky()

lucky[0]
7

And raise an IndexError for larger indexes:

lucky[6]
---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
/home/ian/git/iterables-iterators-generators/<ipython-input-15-5c0a87559915> in <module>()
----> 1 lucky[6]

/home/ian/git/iterables-iterators-generators/<ipython-input-13-bd578dbfbead> in __getitem__(self, index)
      2     def __getitem__(self, index):
      3         if index > 3:
----> 4             raise IndexError
      5         return 7

IndexError:

This is a perfectly well-behaved python iterable. We can loop over it:

for number in lucky:
    print number
7
7
7
7

Or pass it to functions that take iterables:

list(lucky)
[7, 7, 7, 7]

Even the in operator works with it:

7 in lucky
True

But writing this sort of class is difficult. You need to be able to return a value for any index passed to __getitem__(), but most of the time you really only want to produce items in order from first to last.

Enter "Iterators".

Iterators

The naming is confusingly similar here, but it's important to understand the difference between iterables and iterators.

Iterators are iterables with some kind of 'position' state and a .next() method. The .next() method may be called to produce the next item and update the internal state.

Iterables are objects that produce an iterator when they are passed to the iter() builtin.

Iterators are Iterables with .next()

Calling iter() on our "classic" iterable object produces a plain iterator instance

i = iter(lucky)
i
<iterator at 0x288cc10>

This plain iterator has a counter and the original object as its internal state. Calling .next() advances the counter and calls our "classic" iterable's .__getitem__() method.

print i.next()
print i.next()
print i.next()
print i.next()
7
7
7
7

When we get to the end however, our IndexError exception is turned into a StopIteration exception. This is the iterator protocol: when .next() raises StopIteration there are no more items to be produced.

print i.next() # raises StopIteration, *not* IndexError
---------------------------------------------------------------------------
StopIteration                             Traceback (most recent call last)
/home/ian/git/iterables-iterators-generators/<ipython-input-21-b3e04e043095> in <module>()
----> 1 print i.next() # raises StopIteration, *not* IndexError

StopIteration:

Remember that an iterator is an iterable, so it can be passed to anything that takes an iterable.

Be careful, though. Iterators may only be iterated over once, since they are updating their internal state as they go. If we try to iterate twice, the second time will produce no more items:

i = iter(lucky)
print list(i)
print list(i)
[7, 7, 7, 7]
[]

Calling iter() on an iterable will produce a different iterator object each time.

i is iter(lucky)
False

Also, like other iterables, calling iter() on an iterator works, but it behaves differently!

Calling iter() on an iterator typically returns the exact same iterator. If you think about it, that's all that can be done because you can't rewind or duplicate an iterator in the general case.

i is iter(i)
True

Iterators come in all shapes and sizes.

xrange() has a rangeiterator:

iter(xrange(20))
<rangeiterator at 0x2896900>

dict has a dictionary-keyiterator:

iter({'a': 1, 'b': 2})
<dictionary-keyiterator at 0x2894aa0>

list doesn't even use the plain iterator type, and instead uses its own more efficient listiterator:

iter([4, 5, 6])
<listiterator at 0x28973d0>

And some have names that provide no clue what they iterate over:

re.finditer(r'\bs\w+', "some text with swords")
<callable-iterator at 0x28971d0>

A better iterable class

You can choose the iterator that will be returned by iter() by defining your own .__iter__() method:

class Countdown(object):
    def __iter__(self): # must return an iterator!
        return iter([5, 4, 3, 2, 1, 'launch'])

The for loop and other places that take iterables internally use iter(), which calls our new .__iter__() method to create an iterator:

for n in Countdown():
    print n
5
4
3
2
1
launch

Iterators the hard way

The example above is fine if we want to reuse an existing iterator (like the listiterator above), but what if we want to write a new iterator?

We know the protocol, so one approach is to just implement it:

class CountdownIterator(object):
    def __init__(self):
        self._remaining = [5, 4, 3, 2, 1, 'launch']

    def __iter__(self):
        return self

    def next(self):
        if not self._remaining:
            raise StopIteration
        return self._remaining.pop(0)

Our internal 'position' state is a list of items we pop one at a time when .next() is called. We implement .__iter__() in the normal way for an iterator: "return self". We raise StopIteration when we have nothing left to produce.

This works as expected, but it's rather a lot of code for a simple result.

for n in CountdownIterator():
    print n
5
4
3
2
1
launch

Generators

A generator function is a simpler way to create an iterator.

Generator functions let you use local variables and the position of the program counter as state for a generator object. A new generator object is created and returned each time you call a generator function. The generator object is an iterator.

Generators are Iterators created with a generator function or expression

Here is a generator function that only uses the program counter for state:

def countdown_generator():
    yield 5
    yield 4
    yield 3
    yield 2
    yield 1
    yield 'launch'

When we call the generator function it does nothing except create a new generator object. None of the code in the generator function has been executed yet.

countdown_generator()
<generator object countdown_generator at 0x289d0f0>

As the generator object is iterated over, execution starts and follows the generator function definition until the next yield statement.

When it reaches the yield statement execution is paused (the program counter is stored) and the value on the right of the yield statement is produced as a value from the generator object. Execution is resumed from the stored program counter position when iteration continues.

When the generator function reaches the end, the generator raises a StopIteration exception just like a normal iterator. And it behaves just like a normal iterator:

for n in countdown_generator():
    print n
5
4
3
2
1
launch
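Because every call to countdown_generator() builds a brand-new generator object, two generators advance independently (a quick sketch of mine, in the same Python 2 style as the rest of this talk):

a = countdown_generator()
b = countdown_generator()
print a.next(), a.next()   # 5 4  -- 'a' has advanced...
print b.next()             # 5    -- ...while 'b' still starts from the top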

Now we have a much more concise way of defining our own iterator for an iterable class. The .__iter__() method of our class can be written as a generator function:

class Countdown(object):
    def __iter__(self):
        for n in [5, 4, 3, 2, 1, 'launch']:
            yield n
for n in Countdown():
    print n
5
4
3
2
1
launch

But, enough about classes. Let's dig further into how these generators work.

Recall that execution of the code in a generator function does not proceed until the generator object returned is iterated over. That lets us put things in a generator that might be expensive, knowing that we will only have to pay that cost when we actually ask it to produce the next item.

This generator causes a for loop to slow down between iterations, first waiting 5 seconds, then counting down from "5" to "1" with 1-second intervals in between:

import time

def slow_generator():
    time.sleep(5)
    yield 5
    time.sleep(1)
    yield 4
    time.sleep(1)
    yield 3
    time.sleep(1)
    yield 2
    time.sleep(1)
    yield 1
    time.sleep(1)

print "starting"
for n in slow_generator():
    print n
print "done"
starting
5
4
3
2
1
done

Another way of writing this code is to turn the generator inside-out.

Instead of sleeping inside the generator we can yield the amount of time we want to sleep. And instead of yield-ing the countdown we can use a function passed in to display values to the user.

def countdown_generator(fn):
    yield 5
    fn(5)
    yield 1
    fn(4)
    yield 1
    fn(3)
    yield 1
    fn(2)
    yield 1
    fn(1)
    yield 1

A show() function takes the place of the print inside the loop, and the time.sleep() call is done by the code iterating over the generator. This puts the code driving the generator in charge of how (or if) it sleeps for the given time.

def show(n):
    print n

print "starting"
for s in countdown_generator(show):
    time.sleep(s)
print "done"
starting
5
4
3
2
1
done

Generators as coroutines

While a generator object is an iterator, it can also be used for much more.

When paused at a yield statement, generator objects can receive data by using .send() instead of .next().

When we use yield as an expression or assign it to a variable, the value passed to .send() is available inside the generator.

def knock_knock():
    name = yield "Who's there?"
    yield "%s who?" % name
    yield "That's not funny at all"

We have to switch to manually calling .next() on our generator object, because a for loop or function that takes an iterable won't be able to call .send() when we need to.

k = knock_knock()
k.next()
"Who's there?"

At this point execution is paused at the first yield. The assignment to the variable name hasn't happened yet. But when we .send() a value execution continues:

k.send("David")
'David who?'

And in the generator object we are at the second yield with "David" assigned to name.

If we send something to a yield that isn't being used as an expression, the value we send will be ignored:

k.send("David the environmentalist")
"That's not funny at all"

But execution continues the same as if we called .next().
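Generator objects also have .throw() and .close() methods (you can spot them in a dir() listing of any generator). As a small sketch of my own, .close() shuts a paused generator down for good:

k2 = knock_knock()
k2.next()    # "Who's there?"
k2.close()   # raises GeneratorExit inside the generator; it stops quietly
k2.next()    # now raises StopIteration: the generator is finished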

This is the end of part 1.

In part 2 we build a simple interactive network game with 90% of the code written as generators. I will show how breaking down asynchronous code into generators can make it easy to test, easy to reuse, and (with some practice) easy to understand.

Tags: Ottawa Software Python OPAG


来自  http://excess.org/article/2013/02/itergen1/




 

Generators are a fabulous feature of Python, and an essential step on the way to mastering the language. Once you understand them, you will never want to do without them again.

A refresher on iterables

When you read the items of a list one by one, it is called iteration:

>>> lst = [1, 2, 3]
>>> for i in lst:
...     print(i)
1
2
3

And when you use a list comprehension, you create a list, and therefore an iterable. Once again, with a for loop, you take its items one by one, so you iterate over it:

>>> lst = [x*x for x in range(3)]
>>> for i in lst:
...     print(i)
0
1
4

Every time you can use "for... in..." on something, it is an iterable: lists, strings, files...

These iterables are handy because you can read them as many times as you want, but that is not always ideal, since you have to store all the items in memory.

Generators

If you remember the article on list comprehensions, you can also create generator expressions:

>>> generateur = (x*x for x in range(3))
>>> for i in generateur:
...     print(i)
0
1
4

The only difference from before is that we used () instead of []. But we cannot read generateur a second time, because the whole point of generators is precisely that they generate everything on the fly: here it computes 0, then forgets it, then computes 1, forgets it, and computes 4. All of it one by one.

The yield keyword

yield is a keyword used in place of return, with the difference that you get back a generator.

>>> def creerGenerateur():
...     mylist = range(3)
...     for i in mylist:
...         yield i*i
...
>>> generateur = creerGenerateur() # creates a generator
>>> print(generateur) # generateur is an object!
<generator object creerGenerateur at 0x2b484b9addc0>
>>> for i in generateur:
...     print(i)
0
1
4

Here it is a useless example, but in real life it is handy when you know your function will return a large number of values that you only want to read once.

The secret of the Zen masters who have attained transcendental understanding of yield is knowing that when you call the function, the code of the function is not executed. Instead, the function returns a generator object.

This is not obvious to grasp, so reread this part several times.

creerGenerateur() does not execute the code of creerGenerateur.

creerGenerateur() returns a generator object.

In fact, as long as you do not touch the generator, nothing happens. Then, as soon as you start iterating over the generator, the code of the function runs.

The first time the code runs, it starts at the beginning of the function, reaches yield, and returns the first value. Then, on each new turn of the loop, the code resumes right where it stopped (yes, Python saves the state of the generator's code between calls) and runs until it hits yield again. So in our case, it does one turn of the loop.

It continues like that until the code no longer reaches a yield, at which point there is no value left to return. The generator is then considered definitively empty. It cannot be "rewound"; you have to create another one.

The reason the code no longer reaches a yield is yours to choose: an if/else condition, a loop, recursion... You can even yield forever.

A concrete example and a coffee, please

yield not only saves memory; above all, it lets you hide the complexity of an algorithm behind an ordinary iteration API.

Suppose you have a function that - tada! - extracts the words longer than 3 characters from all the files in a directory.

It could look like this:

import os

def extraire_mots(dossier):
    for fichier in os.listdir(dossier):
        with open(os.path.join(dossier, fichier)) as f:
            for ligne in f:
                for mot in ligne.split():
                    if len(mot) > 3:
                        yield mot

There you have an algorithm whose complexity is completely hidden, because from the user's point of view, they just do this:

for mot in extraire_mots(dossier):
    print mot

And for them it is transparent. Better yet, they can use all the tools we usually use on iterables. Thanks to the magic of duck typing, every function that accepts an iterable also accepts the result of this function as a parameter. You thereby build a wonderful toolbox.
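For instance (a minimal sketch, assuming dossier points at an existing directory of text files), the generator plugs straight into the usual iterable-consuming builtins:

# Each call to extraire_mots() creates a fresh generator, so each
# line below walks the files independently, without building a list:
nb_mots = sum(1 for mot in extraire_mots(dossier))           # count words lazily
mots_uniques = sorted(set(extraire_mots(dossier)), key=len)  # unique words by length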

Controlling yield

>>> class DistributeurDeCapote():
...     stock = True
...     def allumer(self):
...         while self.stock:
...             yield "capote"
...

As long as there is stock, you can get as many condoms as you want.

>>> distributeur_en_bas_de_la_rue = DistributeurDeCapote()
>>> distribuer = distributeur_en_bas_de_la_rue.allumer()
>>> print distribuer.next()
capote
>>> print distribuer.next()
capote
>>> print([distribuer.next() for c in range(4)])
['capote', 'capote', 'capote', 'capote']

As soon as the stock runs out...

>>> distributeur_en_bas_de_la_rue.stock = False
>>> distribuer.next()
Traceback (most recent call last):
  File "<ipython-input-22-389e61418395>", line 1, in <module>
    distribuer.next()
StopIteration
<type 'exceptions.StopIteration'>

And that is true for any new generator:

>>> distribuer = distributeur_en_bas_de_la_rue.allumer()
>>> distribuer.next()
Traceback (most recent call last):
  File "<ipython-input-24-389e61418395>", line 1, in <module>
    distribuer.next()
StopIteration

Turning on an empty machine has never refilled the stock ;-) But refilling the stock is enough to get going again at full speed:

>>> distributeur_en_bas_de_la_rue.stock = True
>>> distribuer = distributeur_en_bas_de_la_rue.allumer()
>>> for c in distribuer :
...     print c
capote
capote
capote
capote
capote
capote
capote
capote
capote
capote
capote
capote
...

itertools: your new favorite module

The thing with generators is that you have to handle them according to their nature: they can only be read once, and you cannot know their length in advance. itertools is a module specialized in just that: map, zip, slice... It contains functions that work on all iterables, including generators.

And remember: strings, lists, sets and even files are iterable.

Chaining two iterables and taking the first 10 items? Piece of cake!

>>> import itertools
>>> d = DistributeurDeCapote().allumer()
>>> generateur = itertools.chain("12345", d)
>>> generateur = itertools.islice(generateur, 0, 10)
>>> for x in generateur:
...     print x
...     
1
2
3
4
5
capote
capote
capote
capote
capote

Under the hood of iteration

Under the hood, every iterable relies on an object called an "iterator". You can get the iterator by calling the iter() function on an iterable:

>>> iter([1, 2, 3])
<listiterator object at 0x7f58b9735dd0>
>>> iter((1, 2, 3))
<tupleiterator object at 0x7f58b9735e10>
>>> iter(x*x for x in (1, 2, 3))
<generator object <genexpr> at 0x7f58b9723820>

Iterators have a next() method that returns a value on each call. When there are no values left, they raise the StopIteration exception:

>>> gen = iter([1, 2, 3])
>>> gen.next()
1
>>> gen.next()
2
>>> gen.next()
3
>>> gen.next()
Traceback (most recent call last):
  File "< stdin>", line 1, in < module>
StopIteration

A message to all those who think I am making things up when I say that in Python we use exceptions to control a program's flow (sacrilege!): this is the internal mechanism of loops in Python. for loops use iter() to create an iterator, then catch an exception to stop. On every for loop, you raise an exception without knowing it.

For the record, the current implementation is that iter() calls the __iter__() method on the object passed as a parameter. That means you can create your own iterables:

>>> class MonIterableRienQuaMoi(object):
...     def __iter__(self):
...         yield 'Python'
...         yield "ça"
...         yield 'déchire'
...
>>> gen = iter(MonIterableRienQuaMoi())
>>> gen.next()
'Python'
>>> gen.next()
'ça'
>>> gen.next()
'déchire'
>>> gen.next()
Traceback (most recent call last):
  File "< stdin>", line 1, in < module>
StopIteration
>>> for x in MonIterableRienQuaMoi():
...     print x
...
Python
ça
déchire

12 thoughts on "How to use yield and generators in Python?"

  • Poulet 2.0

    Small typo in the concrete example (thanks for the coffee):

            for mot in ligne.split():

  • Sam Post author

    :-)

    Thanks to both of you.

    (too lazy to get out the stamp...)

  • Titus Crow

    Another small typo:
    for mot in extraire_mots(dossier):

    (for the people who copy-paste your examples ^^')

  • Feadurn

    A huge thank you for this blog. I'm learning Python on the fly for the needs of my research, and thanks to you I can finally manage to do slightly more complex things. Right now I'm on the tutorial about classes, and I may finally manage to understand OOP.

    So that this message is at least a tiny bit useful: in the sentence "creerGenerateur() does not execute the code of creerGenerateur. creerGenerateur() returns a generator object." a "but" may be missing (OK, that wasn't so useful after all).

  • Sam Post author

    Indeed, there is a way to make that clearer. I made an edit :-)

  • Krikor

    If I understood correctly, the point of yield is to avoid storing a big list of items in memory: you fetch the piece of data you need, use it, and immediately drop it from memory?
    I hope I'm not talking nonsense; I just want to understand when to use yield rather than returning a list.

  • Sam Post author

    Exactly. yield also lets you flatten complex algorithms and expose them as a simple walk through a list.

  • policier moustachu

    Yo! I wrote a little recursive algorithm to compute all the combinations of n integers whose sum is m. And to avoid blowing up the stack I was wondering whether it was possible to use yield. But it's really not obvious...

    def pilepoile(n, taille):
        if taille == 1:
            return [[n]]
        else:
            toutes_les_listes = []
            for i in range(n + 1):
                intermediates = pilepoile(n - i, taille - 1)
                for l in intermediates:
                    l.insert(0, i)
                toutes_les_listes.extend(intermediates)
            return toutes_les_listes

  • policier moustachu

    Sorry, I can't manage to use the code tags properly; stamp me hard.

  • GUILLAUME LE GALL

    Very good explanation that really helps grasp the subtleties of generators!
来自  http://sametmax.com/comment-utiliser-yield-et-les-generateurs-en-python/
 


 



 