
Accessing Celery Worker Instance Inside The Task

I want to use Jupyter kernels inside a Celery worker, with one Jupyter kernel per Celery worker. To achieve this I am overriding the default Worker class of Celery.

Solution 1:

Adding the worker instance's information to the request object solved my problem. To do that, I overrode the _process_task method of the worker class.

import traceback

from celery.exceptions import TaskRevokedError
from celery.utils.log import get_logger

logger = get_logger(__name__)


def _process_task(self, req):
    try:
        # Make the worker's kernel client available to the task via its kwargs.
        req.kwargs['kernel_client'] = self.kernel_client
        print("printing from _process_task {}".format(req.kwargs))
        req.execute_using_pool(self.pool)
    except TaskRevokedError:
        try:
            self._quick_release()   # Issue 877
        except AttributeError:
            pass
    except Exception as exc:
        logger.critical('Internal error: %r\n%s',
                        exc, traceback.format_exc(), exc_info=True)
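The mechanism above is just attribute injection: the worker holds a per-worker resource and copies it into each request's kwargs before the task executes. A minimal, self-contained sketch of that pattern follows; the FakeKernelClient, FakeRequest, and SoloWorker classes are illustrative stand-ins I'm assuming for demonstration, not Celery or Jupyter APIs.

```python
class FakeKernelClient:
    """Stand-in for a per-worker Jupyter kernel client (assumption)."""
    def execute(self, code):
        # A real client would send the code to the kernel and return a message id.
        return "msg-id-for:{}".format(code)


class FakeRequest:
    """Stand-in for a Celery task request carrying keyword arguments."""
    def __init__(self, kwargs):
        self.kwargs = kwargs


class SoloWorker:
    """Holds one kernel client and injects it into every request's kwargs."""
    def __init__(self, kernel_client):
        self.kernel_client = kernel_client

    def _process_task(self, req):
        # Same idea as the override above: enrich kwargs, then execute.
        req.kwargs['kernel_client'] = self.kernel_client
        return req.kwargs


worker = SoloWorker(FakeKernelClient())
req = FakeRequest({'code': "print(1)"})
kwargs = worker._process_task(req)
print(sorted(kwargs))  # → ['code', 'kernel_client']
```

Because the client object never leaves the worker process here, nothing needs to be serialized, which is why this only works cleanly with an in-process pool.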

Here is my task where I access the kernel_client

@app.task(bind=True)
def pythontask(self, code, kernel_client=None):
    # kernel_client is injected by the _process_task override, not by the caller.
    mid = kernel_client.execute(code)

    print("{}".format(kernel_client))
    print("{}".format(mid))

This works only when I start workers in solo mode; otherwise it throws a pickling error. In any case, solo workers are what my setup requires, so this solution works for me.
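The pickling error is unsurprising: a kernel client holds live sockets that cannot be serialized to prefork child processes, whereas the solo pool runs tasks in the worker's own process, so the injected object is used in place. Starting a worker with the solo pool looks like this (assuming the app module is named tasks):

```shell
celery -A tasks worker --pool=solo --loglevel=info
```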
