Sumeet Sarkar

Trinity of Context Managers, Generators & Decorators

Python provides three elegant concepts: Context Managers, Generators & Decorators. Understanding these concepts is fundamental to writing elegant constructs in Python.

What matters, however, is that all three tie together in a unique way. Before we get to that, let us have a quick refresher on each of these concepts.

Context Managers

Context managers are an efficient way to wrap enter and exit logic around a given resource.

They are a clean way of wrapping code to make sure the entry code always executes and, no matter what operation is performed on the resource, whether it completes or raises an exception, the exit logic always runs.

For any class to be a context manager, it should implement the __enter__ and __exit__ methods.

class MyContextManager:

    def __enter__(self):
        pass

    def __exit__(self, *args):  # *args receives exc_type, exc_value, traceback
        pass

A context manager is invoked using the with keyword. It may or may not return the resource it operates on.

In the example below, the context manager takes the filePath and mode on object creation, and its __enter__ opens and returns the file object. __exit__ is called once execution leaves the context manager's scope.

class FileContextManager:
    def __init__(self, filePath, mode):
        print('Init called')
        self.__filePath = filePath
        self.__mode = mode

    def __enter__(self):
        print('Enter called, attempting to open File')
        self.__f = open(self.__filePath, self.__mode)
        return self.__f

    def __exit__(self, *args):
        self.__f.close()
        print('Exit called and File closed')

The context manager is invoked using with:

with FileContextManager('/some/dir/file.txt', 'r') as f:
    print('Inside context manager')
    # Instead of reading the file, let us raise an Exception
    raise Exception('some exception')
    print('After exception')

# Output...
# 
# Init called
# Enter called, attempting to open File
# Inside context manager
# Exit called and File closed
# Exception: some exception

Context managers guarantee the execution of start & stop code

Even though an exception is raised inside the context manager block, __exit__ is still executed and the file is closed. However, nothing after the exception gets executed.

Note: a file object in Python is already a context manager; there is no need to create another wrapper context manager.
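
For instance, the same behaviour as FileContextManager can be had by using the built-in open() directly in a with statement (a minimal sketch, reusing the placeholder path from above):

with open('/some/dir/file.txt', 'r') as f:
    print(f.read())
# the file is closed here, even if an exception was raised inside the block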

Generators

Generators are a neat way to slice up a function definition into chunks of logical code that can be iterated over. This is achieved using the yield keyword. If a function contains at least one yield statement, it becomes a generator function.

When such a function is called, it returns an iterator object but does not start executing immediately. __iter__() and __next__() are implemented implicitly, so items can be iterated using next().

When yield is encountered, execution of the function is paused, all of its state is saved and control is returned to the caller, unlike a return, which terminates the function. Successive calls to next() resume from where it paused, until the iterator raises StopIteration.

Let us see a basic Generator example. The function gen_odd_numbers yields odd numbers.

def gen_odd_numbers(seed, limit):
    if seed % 2 == 0:
        seed += 1
    while seed <= limit:
        yield seed
        seed += 2

for n in gen_odd_numbers(4, 10):
    print(n)
# Output...
#
# 5
# 7
# 9
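
Since __iter__() and __next__() are implemented implicitly, the same generator can also be driven by hand with next(); a minimal sketch using the gen_odd_numbers defined above:

numbers = gen_odd_numbers(4, 10)
print(next(numbers))    # 5
print(next(numbers))    # 7
print(next(numbers))    # 9
next(numbers)           # raises StopIteration, the generator is exhausted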

Let us see another example below, do_task. Note that here we iterate using next(), so StopIteration needs to be handled. More importantly, what we can observe is that generators using yield help us execute code blocks in a guaranteed order.

Generators guarantee that code executes in order

Hence, Task 1 is always performed before Task 2. Any number of code blocks that need to run in a particular order can leverage generators, which, thanks to yield, resume right where they left off.

def do_task():
    print('Task 1 performed')
    yield
    print('Task 2 performed')

task_list = do_task()
while True:
    try:
        next(task_list)
    except StopIteration:
        break
# Output...
#
# Task 1 performed
# Task 2 performed

Decorators

Decorators are an interesting way of augmenting the functions you write, extending their capabilities beyond their core intent.

Decorators can, in principle, completely alter a function's intent, but that is not what a decorator is meant for. Instead, it should restrict itself to decorating the actual intent of a function, or in other words, simply augmenting it.

Decorators augment the existing behaviour of a function

Below is a simple logtime decorator which passes formatted date-time info as the first parameter to the function it decorates. Here the function logger is decorated. As a result, when logger is executed with arguments, it is the wrapper function returned by logtime that actually runs; after preparing the formatted date-time string, it calls logger with that string as the first argument.

from datetime import datetime

def logtime(fn):
    def wrapper(*args):
        datetimeinfo = datetime.now().strftime('%c')
        return fn(f'{datetimeinfo}', *args)
    return wrapper

@logtime
def logger(*msg):
    s = ''
    for m in msg:
        s += (str(m) + ' ')
    print(s)

logger('hello world')       # Fri Feb 15 01:51:43 2019 hello world
logger('hello', 'world')    # Fri Feb 15 01:51:43 2019 hello world
logger()                    # Fri Feb 15 01:51:44 2019

A full four-part series on decorators is linked below this example; I would recommend reading it for a better understanding.

A class-based decorator: sample syntax and execution.

class Decorator:
    def __init__(self, fn):
        print('__init__')
        self.__fn = fn

    def __call__(self, *args, **kwargs):
        print('__call__')
        return self.__fn(*args, **kwargs)


@Decorator
def my_function(param):
    return f'my_function called with {param}'

print(my_function(10))

# Output...
#
# __init__   (called when the @Decorator is applied to a function, once for each function)
# __call__   (called each time, the decorated function is called)
# my_function called with 10

If you would like to learn more about decorators, I have published a Python decorator series that you can check out.

Python decorators [Part 1] (Primer)
Python decorators [Part 2] (Nested Decorators)
Python decorators [Part 3] (Higher Order Decorators)
Python decorators [Part 4] (Decorators as classes)

Bringing it all together

Let us revise the benefits of each quickly.

Context managers guarantee the execution of start & stop code
Generators guarantee that code executes in order
Decorators augment the existing behaviour of a function

Marrying these qualities together, we can have code that is guaranteed to execute a particular set of instructions in order (generators), whose start and stop conditions are also guaranteed to execute no matter what exceptions are raised by intermediate steps (context managers), and all of this can be wrapped around a function seamlessly (decorators).

Example:

  1. Let us write a context manager which accepts a generator; see __init__.
  2. The context manager's __enter__ and __exit__ are implemented. Each of them performs a next on the generator.
  3. We also make this class a decorator, hence __call__ is implemented, which invokes the function received by __init__.

class ContextManager:

    def __init__(self, gen):
        self.__gen = gen

    def __call__(self, *args, **kwargs):
        # calling the decorated generator function produces the generator object
        self.__gen = self.__gen(*args, **kwargs)
        return self

    def __enter__(self):
        return next(self.__gen, None)

    def __exit__(self, *args):
        next(self.__gen, None)

Now, apply this combined context manager + decorator ContextManager to the file_helper function below. file_helper should be a generator function, hence a yield must be provided for the code to break into steps. After opening the file, file_helper yields the file object.

@ContextManager
def file_helper(filePath, mode):
    try:
        f = open(filePath, mode)
        yield f
    except Exception as e:
        print('File does not exist', e)
    else:
        f and f.close()

The final result

Now, reading the file. file_helper is now a context manager, thanks to the decorator. When with file_helper(...) is invoked, __call__ runs and the actual function self.__gen saved in __init__ is executed, which returns a generator, since the file_helper function has a yield in it. Then, due to the with keyword, __enter__ is executed, which performs a next on the generator; the file is opened inside file_helper and the file object is yielded.

with file_helper('/some/dir/file.txt', 'r') as f:
    f and print(f.read())

Once code execution leaves the context manager block, __exit__ is executed, which performs another next on the generator; this returns None, as the generator has completed.

This way, context managers, generators and decorators all come together to accomplish a single goal in an elegant manner.

Such a decorator is already provided by Python as contextlib.contextmanager, and we just understood how it works!
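
As a rough sketch of that standard library equivalent, the earlier file_helper could be rewritten with contextlib.contextmanager. Note that contextlib re-raises exceptions from the with block at the yield point inside the generator, which is why a try/finally is used here instead of the try/except/else from the version above:

from contextlib import contextmanager

@contextmanager
def file_helper(filePath, mode):
    f = open(filePath, mode)
    try:
        # hand the open file to the with block
        yield f
    finally:
        # runs whether the with block completed or raised
        f.close()

with file_helper('/some/dir/file.txt', 'r') as f:
    print(f.read())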


If you are interested in learning a few more Python concepts, you might find my GitHub repository Art of Python handy.


Looking to set up an all-star IDE for Python? Check out Ultimate Sublime for Python.
