Generators let you produce values on the fly instead of building a whole sequence up front. In Python 2, the range() function created a full list in memory; in Python 3, range() and similar constructs return lazy, generator-like objects instead. With a generator you don't have to worry about where each element is stored once you've iterated past it. The difference in the code is that a generator uses the yield keyword instead of returning a value or appending to a list, which is what produces values on the fly:
yield some_val_here
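As a quick sanity check of the Python 3 behaviour described above (a minimal sketch in the interpreter, using only built-ins), range() no longer hands you a list; the concrete elements only appear when you ask for them:

r = range(5)
print(r)        # range(0, 5) -- a lazy object, not a list of elements
print(list(r))  # [0, 1, 2, 3, 4] -- materialized only when explicitly requested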
This doesn't make memory and speed issues disappear entirely, but it largely sidesteps them: a generator makes each element available immediately, without ever storing the whole sequence.
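To make the memory argument concrete, here is a minimal sketch (using only the standard library's sys.getsizeof) comparing a fully materialized list with an equivalent generator expression; the exact byte counts will vary by Python version:

import sys

# The list stores every element up front...
numbers_list = [i for i in range(1_000_000)]
# ...while the generator only stores the machinery needed to produce the next value.
numbers_gen = (i for i in range(1_000_000))

print(sys.getsizeof(numbers_list))  # several megabytes
print(sys.getsizeof(numbers_gen))   # a couple of hundred bytes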
Example 1: Output the first 10 numbers (0-9) with a traditional list-based approach, and then with a generator.
In [2]:
# With a List
def traditional_numbers(n):
    lst = []
    for i in range(n):
        lst.append(i)
    print(lst)

traditional_numbers(10)
In [6]:
# With a generator
def generator_numbers(n):
    for i in range(n):
        yield i

for num in generator_numbers(10):
    print(num, end=' ')
The generator lets us consume as many numbers as we like without worrying about memory. With the traditional approach, using the values outside the function requires a container such as a list. With a generator, you produce values on demand and iterate over them (because a generator is an iterator) whenever you need them.
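Because a generator is an iterator, you can also drive it by hand with next() instead of a for loop. This short sketch reuses the generator_numbers function defined above to show the values being produced one at a time:

gen = generator_numbers(3)

print(next(gen))  # 0
print(next(gen))  # 1
print(next(gen))  # 2
# One more next(gen) would raise StopIteration, which is how a for loop knows to stop.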