Python Decorators: Wrap Functions Without Touching Them
You have a function that works fine. Now you need to time it. Or retry it on failure. Or log every call. The naive solution is to edit the function body, but then you have to do it again for the next function, and the next.
Decorators solve this by wrapping behavior around a function without changing the function itself.
Free: AI Publishing Checklist (7 steps in Python) · Full pipeline: germy5.gumroad.com/l/xhxkzz (pay what you want, min $9.99)
The mental model
A decorator is a function that takes a function and returns a (usually enhanced) function.
```python
# This:
@timer
def generate_chapter():
    ...

# Is exactly the same as:
def generate_chapter():
    ...

generate_chapter = timer(generate_chapter)
```
That's the whole mechanism. @timer is syntactic sugar for reassigning the name. Understanding this one line removes all the mystery.
Functions are first-class objects
Before you can write a decorator, you need to know that Python functions are objects: you can pass them as arguments, store them in variables, and return them from other functions.
```python
def greet(name):
    return f"Hello, {name}!"

# Assign to a variable
say_hi = greet
print(say_hi("Alice"))  # Hello, Alice!

# Pass as an argument
def call_twice(fn, value):
    return fn(value), fn(value)

call_twice(greet, "Bob")  # ('Hello, Bob!', 'Hello, Bob!')

# Return from a function
def make_multiplier(n):
    def multiply(x):
        return x * n
    return multiply  # returns the inner function

double = make_multiplier(2)
print(double(5))  # 10
```
The inner `multiply` function above is a closure: it remembers `n` from the enclosing scope even after `make_multiplier` has returned. Decorators use this pattern heavily.
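You can see the closure machinery directly: each captured variable lives in a cell on the function's `__closure__` attribute. A quick sketch, reusing the factory pattern above:

```python
# A closure keeps a reference to each enclosing variable in a "cell".
def make_multiplier(n):
    def multiply(x):
        return x * n
    return multiply

triple = make_multiplier(3)

# The captured n is stored on triple.__closure__, not looked up globally
print(triple.__closure__[0].cell_contents)  # 3
print(triple(10))                           # 30
```

This is why the decorator's `wrapper` can still call `fn` long after the decorator itself has returned.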
Writing your first decorator
Here is a timer decorator that measures how long a function takes:
```python
import time

def timer(fn):
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{fn.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def slow_task():
    time.sleep(0.5)
    return "done"

slow_task()
# slow_task took 0.5003s
```
Three things happen in `timer`:

- It receives `fn` (the original function)
- It defines `wrapper`, which adds timing around a call to `fn`
- It returns `wrapper`, which replaces `fn` under the same name

`*args, **kwargs` lets `wrapper` accept any arguments and forward them unchanged to `fn`.
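To see the forwarding in action, here is a minimal sketch (hypothetical `passthrough` decorator) that prints exactly what it forwards:

```python
def passthrough(fn):
    def wrapper(*args, **kwargs):
        # args collects positionals as a tuple, kwargs collects keywords as a dict
        print(f"forwarding args={args} kwargs={kwargs}")
        return fn(*args, **kwargs)
    return wrapper

@passthrough
def describe(name, *, excited=False):
    return f"{name}{'!' if excited else '.'}"

print(describe("decorators", excited=True))
# forwarding args=('decorators',) kwargs={'excited': True}
# decorators!
```

Because nothing in `wrapper` names specific parameters, the same decorator works on any signature.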
functools.wraps: why you need it
There is a problem with the decorator above:
```python
print(slow_task.__name__)  # 'wrapper' (wrong)
print(slow_task.__doc__)   # None (lost)
```
The wrapper replaced the original function, so its `__name__`, `__doc__`, and other metadata are gone. This breaks introspection, documentation tools, and logging.
Fix it with functools.wraps:
```python
import functools
import time

def timer(fn):
    @functools.wraps(fn)  # copy metadata from fn to wrapper
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{fn.__name__} took {elapsed:.4f}s")
        return result
    return wrapper

@timer
def slow_task():
    """Simulates a slow operation."""
    time.sleep(0.5)

print(slow_task.__name__)  # slow_task ✓
print(slow_task.__doc__)   # Simulates a slow operation. ✓
```
Always use @functools.wraps(fn) on the inner wrapper. There is no good reason to skip it.
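A side benefit worth knowing: `functools.wraps` also stores the original function on the wrapper as `__wrapped__`, so you can reach past the decorator when needed. A small sketch (same `timer` shape as above, shorter sleep so it runs fast):

```python
import functools
import time

def timer(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} took {time.perf_counter() - start:.4f}s")
        return result
    return wrapper

@timer
def slow_task():
    """Simulates a slow operation."""
    time.sleep(0.01)
    return "done"

# wraps stores the undecorated function as __wrapped__,
# so you can bypass the timing when you need to:
print(slow_task.__wrapped__())  # done (no timing line printed)
```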
Stacking decorators (order matters)
You can apply multiple decorators to one function. They apply bottom-up:
```python
@timer
@log_call
def process():
    ...

# Equivalent to:
process = timer(log_call(process))
```
`log_call` wraps `process` first, then `timer` wraps the result. The outermost decorator (`timer`) runs first when the function is called.
```python
import functools

def log_call(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"Calling {fn.__name__}")
        result = fn(*args, **kwargs)
        print(f"{fn.__name__} returned {result!r}")
        return result
    return wrapper

@timer
@log_call
def add(a, b):
    return a + b

add(2, 3)
# Calling add
# add returned 5
# add took 0.0000s
```
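Note that the wrapping itself happens once, at definition time. A quick way to see the bottom-up order is with decorators that record when they run (hypothetical `outer`/`inner` decorators that add no behavior, just record their application order):

```python
applied = []

def outer(fn):
    applied.append("outer")  # runs when @outer is applied
    return fn

def inner(fn):
    applied.append("inner")  # runs when @inner is applied
    return fn

@outer
@inner
def task():
    ...

print(applied)  # ['inner', 'outer'] — the bottom decorator wraps first
```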
Decorators with arguments (decorator factories)
Sometimes you want to configure a decorator: @retry(max_attempts=3). This requires one more layer โ a function that returns a decorator:
```python
import functools
import time

def retry(max_attempts=3, delay=1.0, exceptions=(Exception,)):
    """Decorator factory: retry a function up to max_attempts times."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            last_error = None
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions as e:
                    last_error = e
                    if attempt < max_attempts:
                        print(f"{fn.__name__} failed (attempt {attempt}), retrying in {delay}s...")
                        time.sleep(delay)
            raise last_error
        return wrapper
    return decorator

@retry(max_attempts=3, delay=0.5, exceptions=(ConnectionError,))
def fetch_data(url):
    raise ConnectionError("Network down")

fetch_data("https://example.com")
# fetch_data failed (attempt 1), retrying in 0.5s...
# fetch_data failed (attempt 2), retrying in 0.5s...
# ConnectionError: Network down
```
The shape: `retry(args)` returns `decorator`, which takes `fn` and returns `wrapper`. Three levels of nesting, but each level has one job.
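A common convenience on top of this shape is a decorator that works both with and without arguments. A sketch of the pattern (hypothetical `announce` decorator; the trick is checking whether the factory received a function directly):

```python
import functools

def announce(fn=None, *, prefix="CALL"):
    """Usable both as @announce and as @announce(prefix=...)."""
    def decorator(f):
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            print(f"{prefix}: {f.__name__}")
            return f(*args, **kwargs)
        return wrapper
    if fn is not None:    # bare @announce: fn is the decorated function
        return decorator(fn)
    return decorator      # @announce(...): return the real decorator

@announce
def a():
    return 1

@announce(prefix="DEBUG")
def b():
    return 2
```

`a()` prints `CALL: a`; `b()` prints `DEBUG: b`. The keyword-only `prefix` prevents the ambiguous call `announce(some_value)` from silently being treated as configuration.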
Common built-in decorators
Python ships with several decorators you have probably already used:
```python
import math

class Circle:
    def __init__(self, radius):
        self._radius = radius

    @property
    def area(self):
        """Access like an attribute, not a method call."""
        return math.pi * self._radius ** 2

    @staticmethod
    def from_diameter(d):
        """No self or cls: just a function namespaced to the class."""
        return Circle(d / 2)

    @classmethod
    def unit(cls):
        """Receives the class itself: useful for alternative constructors."""
        return cls(radius=1)

c = Circle(5)
print(c.area)  # 78.539... (no parentheses)
c2 = Circle.from_diameter(10)
c3 = Circle.unit()
```
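`@property` also supports a setter, which lets you validate assignments. A sketch using a simplified variant of the `Circle` class above (the `radius` property here is illustrative, not part of the earlier example):

```python
class Circle:
    def __init__(self, radius):
        self._radius = radius

    @property
    def radius(self):
        return self._radius

    @radius.setter
    def radius(self, value):
        # Validation runs on every assignment to c.radius
        if value <= 0:
            raise ValueError("radius must be positive")
        self._radius = value

c = Circle(5)
c.radius = 10      # goes through the setter
print(c.radius)    # 10
# c.radius = -1    # would raise ValueError
```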
And functools.cache (Python 3.9+) memoizes a function's results automatically:
```python
import functools

@functools.cache
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

fibonacci(100)  # instant: each value computed only once
```
@functools.lru_cache(maxsize=128) is the bounded version for functions with many unique inputs.
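Both variants expose `cache_info()` and `cache_clear()` for inspecting and resetting the cache; a quick sketch:

```python
import functools

@functools.lru_cache(maxsize=128)
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

fibonacci(30)
info = fibonacci.cache_info()
print(info.hits, info.misses)  # one miss per distinct n, the rest are hits
fibonacci.cache_clear()        # reset the cache (e.g. between tests)
```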
Real patterns for pipeline automation
These three decorators cover most automation needs:
```python
import functools
import logging
import time

log = logging.getLogger(__name__)

# 1. Timer: measure every LLM call
def timer(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        log.info("%s completed in %.2fs", fn.__name__, time.perf_counter() - start)
        return result
    return wrapper

# 2. Retry: handle transient API errors
def retry(max_attempts=3, delay=2.0, exceptions=(Exception,)):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions as e:
                    if attempt == max_attempts:
                        raise
                    log.warning("%s attempt %d failed: %s", fn.__name__, attempt, e)
                    time.sleep(delay)
        return wrapper
    return decorator

# 3. Log call: audit every function entry
def log_call(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        log.debug("→ %s called", fn.__name__)
        result = fn(*args, **kwargs)
        log.debug("← %s done", fn.__name__)
        return result
    return wrapper

# Stack them: log_call wraps first, timer wraps last (outermost)
@timer
@retry(max_attempts=3, delay=2.0, exceptions=(ConnectionError, TimeoutError))
@log_call
def generate_chapter(topic: str) -> str:
    # call the LLM API here
    ...
```
Three decorators, zero changes to generate_chapter's body. When you add a fourth function to the pipeline, you apply the same decorators and move on.
Class decorators (brief mention)
A class can also be a decorator if it implements __call__. This is useful when you need to store state across calls:
```python
import functools

class CountCalls:
    def __init__(self, fn):
        functools.update_wrapper(self, fn)  # copy metadata onto the instance
        self.fn = fn
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        print(f"{self.fn.__name__} has been called {self.count} time(s)")
        return self.fn(*args, **kwargs)

@CountCalls
def greet(name):
    return f"Hello, {name}!"

greet("Alice")      # greet has been called 1 time(s)
greet("Bob")        # greet has been called 2 time(s)
print(greet.count)  # 2
```
Use class decorators when a closure would need `nonlocal` to mutate state: a class with instance variables is cleaner.
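For contrast, here is a closure-based version of the same counter. It works, but mutating the count requires `nonlocal`, and exposing the state requires extra plumbing (the `call_count` accessor below is a hypothetical workaround, not a standard idiom):

```python
import functools

def count_calls(fn):
    count = 0

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        nonlocal count  # required to rebind the enclosing variable
        count += 1
        print(f"{fn.__name__} has been called {count} time(s)")
        return fn(*args, **kwargs)

    wrapper.call_count = lambda: count  # expose the hidden closure state
    return wrapper

@count_calls
def greet(name):
    return f"Hello, {name}!"

greet("Alice")
greet("Bob")
print(greet.call_count())  # 2
```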
Quick reference
```python
import functools

# Basic decorator (no arguments)
def my_decorator(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # before
        result = fn(*args, **kwargs)
        # after
        return result
    return wrapper

# Usage: no arguments, apply directly
@my_decorator
def fn_a(): ...
```

```python
import functools

# Decorator factory (with arguments)
def my_decorator(option=True):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            return fn(*args, **kwargs)
        return wrapper
    return decorator

# Usage: with arguments, the factory runs first
@my_decorator(option=False)
def fn_b(): ...
```
The pattern is always the same. Once you have memorized it, every decorator in the wild becomes readable.
The full pipeline uses decorators for retry logic and timing, wrapping every LLM call without touching the call site: germy5.gumroad.com/l/xhxkzz (pay what you want, min $9.99).