The goal of this document is to help you get running quickly and with as little fuss as possible. Because huey works with python in general but also has some special django integration, this guide is broken up into two parts. Read the general guide first, then continue on to the django guide; the general guide covers material that is not repeated in the django sections.
There are three main components (or processes) to consider when running huey: the producer (your application code, which enqueues tasks), the consumer (which reads tasks from the queue and executes them), and the queue itself (e.g. Redis), where tasks are stored.
These three processes are shown in the screenshot below. The left-hand pane shows the producer: a simple program that asks the user for input on how many “beans” to count. In the top-right, the consumer is running; it does the actual “computation”, simply printing the number of beans counted. In the bottom-right is the queue, Redis in this example, which we are monitoring; it shows tasks being enqueued (LPUSH) and read off for execution (BRPOP).
Assuming you’ve got huey installed, let’s look at the code from this example.
The first step is to configure your queue. The consumer needs to be pointed at a subclass of BaseConfiguration, which specifies things like the backend to use, where to log activity, etc.
# config.py
from huey.backends.redis_backend import RedisBlockingQueue
from huey.bin.config import BaseConfiguration
from huey.queue import Invoker
queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
invoker = Invoker(queue)
class Configuration(BaseConfiguration):
    QUEUE = queue
    THREADS = 4
The interesting parts of this configuration module are the Invoker object and the RedisBlockingQueue object. The queue is responsible for storing and retrieving messages, and the invoker is used by your application code to coordinate function calls with a queue backend. We’ll see how the invoker is used when looking at the actual function responsible for counting beans:
# commands.py
from huey.decorators import queue_command
from config import invoker # import the invoker we instantiated in config.py
@queue_command(invoker)
def count_beans(num):
    print 'Counted %s beans' % num
The above example shows the API for writing “commands” that are executed by the queue consumer: simply decorate the function you want executed by the consumer with the queue_command() decorator. When the decorated function is called, the calling process enqueues the function call and returns immediately. The invoker is passed to the decorator, which tells huey where to send the message.
The main executable is very simple. It imports both the configuration and the commands; this ensures that when we run the consumer by pointing it at the configuration, the commands are also imported and loaded into memory.
# main.py
from config import Configuration # import the configuration class
from commands import count_beans # import our command
if __name__ == '__main__':
    beans = raw_input('How many beans? ')
    count_beans(int(beans))
    print 'Enqueued job to count %s beans' % beans
To run these scripts you need three things running: a local Redis server (the queue), the huey consumer pointed at the Configuration class, and the main program to enqueue some work, as sketched below.
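Concretely, a session might look like the following. This is only a sketch: it assumes Redis is installed and running locally and that huey's consumer script is available on your path as huey_consumer.py, taking a dotted path to the Configuration class, so double-check the exact invocation against your installed version.
$ redis-server                           # terminal 1: the queue
$ huey_consumer.py main.Configuration    # terminal 2: the consumer
$ python main.py                         # terminal 3: the producer
How many beans? 100
Enqueued job to count 100 beans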
The above example illustrates a “send and forget” approach, but what if your application needs to do something with the results of a task? To get results from your tasks, we’ll set up the RedisDataStore by adding the following lines to the config.py module:
from huey.backends.redis_backend import RedisBlockingQueue, RedisDataStore
from huey.bin.config import BaseConfiguration
from huey.queue import Invoker
queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
result_store = RedisDataStore('results', host='localhost', port=6379) # new
invoker = Invoker(queue, result_store=result_store) # added result store
class Configuration(BaseConfiguration):
    QUEUE = queue
    RESULT_STORE = result_store  # added
    THREADS = 4
To better illustrate getting results, we’ll also modify the commands.py module to return a string rather than simply printing to stdout:
from huey.decorators import queue_command
from config import invoker
@queue_command(invoker)
def count_beans(num):
    return 'Counted %s beans' % num  # changed "print" to "return"
We’re ready to fire up the consumer. Instead of simply executing the main program, though, we’ll start an interpreter and run the following:
>>> from main import count_beans
>>> res = count_beans(100)
>>> res # <--- what is "res" ?
<huey.queue.AsyncData object at 0xb7471a4c>
>>> res.get() # <--- get the result of this task
'Counted 100 beans'
Following the same layout as our last example, here is a screenshot of the three main processes at work:
It is often useful to enqueue a particular task to execute at some arbitrary time in the future, for example, marking a blog entry as published at a certain time.
This is very simple to do with huey. Returning to the interpreter session from the last section, let’s schedule a bean counting to happen one minute in the future and see how huey handles it. Execute the following:
>>> import datetime
>>> in_a_minute = datetime.datetime.now() + datetime.timedelta(seconds=60)
>>> res = count_beans.schedule(args=(100,), eta=in_a_minute)
>>> res
<huey.queue.AsyncData object at 0xb72915ec>
>>> res.get() # <--- this returns None, no data is ready
>>> res.get() # <--- still no data...
>>> res.get(blocking=True) # <--- ok, let's just block until it's ready
'Counted 100 beans'
Looking at the redis output, we see the following (simplified for readability):
+1325563365.910640 "LPUSH" count_beans(100)
+1325563365.911912 "BRPOP" wait for next job
+1325563365.912435 "HSET" store 'Counted 100 beans'
+1325563366.393236 "HGET" retrieve result from task
+1325563366.393464 "HDEL" delete result after reading
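If you would like to watch this traffic yourself, the standard redis-cli client can stream every command the server receives:
$ redis-cli monitor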
Here is a screenshot showing the same:
Huey supports retrying tasks a finite number of times. If an exception is raised during the execution of the task and retries have been specified, the task will be re-queued and tried again, up to the number of retries specified.
Here is a task that will be retried 3 times and will blow up every time:
# commands.py
from huey.decorators import queue_command
from config import invoker
@queue_command(invoker)
def count_beans(num):
    return 'Counted %s beans' % num  # changed "print" to "return"

@queue_command(invoker, retries=3)
def try_thrice():
    print 'trying....'
    raise Exception('nope')
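With the consumer running, the task can be kicked off from an interpreter session just like count_beans; a minimal sketch:
>>> from commands import try_thrice
>>> try_thrice()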
The console output shows our task being called in the main interpreter session, and then when the consumer picks it up and executes it we see it failing and being retried:
Oftentimes it is a good idea to wait a certain amount of time between retries. You can specify a delay between retries, in seconds, which is the minimum time before the task will be retried. Here we’ve modified the command to include a delay, and also to print the current time to show that it’s working.
# commands.py
from datetime import datetime

from huey.decorators import queue_command
from config import invoker

@queue_command(invoker, retries=3, retry_delay=10)
def try_thrice():
    print 'trying....%s' % datetime.now()
    raise Exception('nope')
The console output below shows the task being retried; in between retries I’ve also “counted some beans”, which executes normally.
The final usage pattern supported by huey is the execution of tasks at regular intervals. This is modeled after crontab behavior, and even follows similar syntax. Tasks run at regular intervals and should not return meaningful results, nor should they accept any parameters.
Let’s add a new task that prints the time every minute – we’ll use this to test that the consumer is executing the tasks on schedule.
# commands.py
from datetime import datetime

from huey.decorators import queue_command, periodic_command, crontab
from config import invoker

@queue_command(invoker)
def count_beans(num):
    return 'Counted %s beans' % num

@queue_command(invoker, retries=3, retry_delay=10)
def try_thrice():
    print 'trying....%s' % datetime.now()
    raise Exception('nope')

@periodic_command(invoker, crontab(minute='*'))
def print_time():
    print datetime.now()
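The crontab() helper is not limited to running a task every minute. As an illustration only, here is a sketch of a more selective schedule; it assumes crontab() also accepts an hour argument mirroring the standard crontab fields, so check the crontab() signature in your version of huey for the exact keyword names and syntax it supports:
@periodic_command(invoker, crontab(minute='0', hour='3'))
def nightly_task():
    # assumed schedule: once a day at 3:00 AM
    print 'Running the nightly task at %s' % datetime.now()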
Additionally, we need to indicate in the Configuration object that we want to run periodic tasks. The reason this is configurable is that if you run multiple consumer processes, only one of them should be responsible for enqueueing periodic commands. The configuration now looks like this:
# config.py excerpt
class Configuration(BaseConfiguration):
    QUEUE = queue
    RESULT_STORE = result_store
    PERIODIC = True  # <-- new
Now, when we run the consumer it will start printing the time every minute:
That sums up the basic usage patterns of huey. If you plan on using huey with django, read on; otherwise, check out the rest of the detailed documentation.
Configuring huey to work with django is actually simpler, thanks to the centralized nature of django’s configuration and conventions. Rather than maintaining a Configuration object, as in the above example, everything is configured automatically from the django settings. Following the previous example, we’ll re-create the bean counting task using django.
First let’s get the settings in place. In the interest of focusing on the bare minimum needed to get things running, here are the only settings you need. This assumes, in addition to the huey.djhuey app, a single app called test_app:
INSTALLED_APPS = [
    'huey.djhuey',
    'test_app',
]

HUEY_CONFIG = {
    'QUEUE': 'huey.backends.redis_backend.RedisBlockingQueue',
    'QUEUE_NAME': 'test-queue',
    'QUEUE_CONNECTION': {
        'host': 'localhost',
        'port': 6379,
    },
    'THREADS': 4,
}
The test_app will be as simple as possible:
__init__.py (empty)
manage.py (standard)
settings.py
test_app/
    __init__.py (empty)
    commands.py
The only file with any code in it is test_app.commands:
from huey.djhuey.decorators import queue_command
@queue_command
def count_beans(number):
    print 'Counted %s beans' % number
If you’re comparing against the example described in the previous section, there are a couple of key differences: the queue_command decorator is not passed an invoker, since the invoker is created automatically from the django settings, and there is no Configuration class to maintain, because the HUEY_CONFIG setting plays that role.
Let’s test it out:
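A rough sketch of such a test session, assuming the huey.djhuey app registers a run_huey management command to start the consumer (run ./manage.py help to confirm the name in your version):
$ ./manage.py run_huey    # terminal 1: start the consumer
$ ./manage.py shell       # terminal 2: enqueue a task
>>> from test_app.commands import count_beans
>>> count_beans(100)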
To enable support for task results, define a RESULT_STORE in the django settings module:
HUEY_CONFIG = {
    'QUEUE': 'huey.backends.redis_backend.RedisBlockingQueue',
    'QUEUE_NAME': 'test-queue',
    'QUEUE_CONNECTION': {
        'host': 'localhost',
        'port': 6379,
    },
    'RESULT_STORE': 'huey.backends.redis_backend.RedisDataStore',
    'RESULT_STORE_CONNECTION': {
        'host': 'localhost',
        'port': 6379,
    },
    'THREADS': 4,
}
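As in the general example, if count_beans is changed to return its message rather than print it, the result can then be read back from the result store. A hedged sketch, assuming the django integration hands back the same kind of AsyncData result handle as before:
>>> from test_app.commands import count_beans
>>> res = count_beans(100)
>>> res.get(blocking=True)
'Counted 100 beans'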