python - How can I configure a Celery task?


I am using Celery in a project as a scheduler (in the form of a periodic task).

My Celery task looks like this:

```python
@periodic_task(run_every=timedelta(seconds=300))
def update_all_feed():
    feed_1()
    feed_2()
    ...
    feed_n()
```

But as the number of feeds grows, it takes longer and longer to get to the later feeds: while Celery is working on feed n, feed n+1 has to wait. I want to fetch multiple feeds concurrently.

After going through the docs, I found that I can call a Celery task asynchronously like this:

```python
feed.delay()
```

How can I configure Celery so that it gets all the feed IDs and dispatches tasks for them in batches (for example, 5 feeds at a time)? I understand that to achieve this I have to run Celery as a daemon.

NB: I'm using MongoDB as the broker. I installed it and added it to my Celery config.
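For reference, a minimal MongoDB broker setting might look like the sketch below. The database name `celery` and the uppercase `BROKER_URL` setting name are assumptions (old Celery 3.x style, matching the `@periodic_task` decorator used here); adjust them to your setup.

```python
# celeryconfig.py -- MongoDB as the message broker.
# Setting name assumed (Celery 3.x style); newer versions use
# lowercase `broker_url` on the app object instead.
BROKER_URL = 'mongodb://localhost:27017/celery'
```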

You can schedule all of your feeds like this:

```python
@periodic_task(run_every=timedelta(seconds=300))
def update_all_feed():
    feed_1.delay()
    feed_2.delay()
    ...
    feed_n.delay()
```

Or you can make it simpler by using a group:

```python
from celery import group

@periodic_task(run_every=timedelta(seconds=300))
def update_all_feed():
    # group() takes task signatures (feed.s(i), not feed.delay(i));
    # calling the group sends all of them at once
    group(feed.s(i) for i in range(10))()
```
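If you specifically want batches of a fixed size (the 5-at-a-time case from the question), one option is to chunk the feed IDs first and dispatch each chunk as its own group. A minimal sketch of the chunking itself, in plain Python with no Celery dependency (the helper name `chunk` is mine, not from the answer):

```python
def chunk(ids, size):
    """Split a list of feed IDs into consecutive batches of at most `size`."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]

# 12 feed IDs in batches of 5; each batch could become one Celery group.
batches = chunk(list(range(12)), 5)
# batches == [[0, 1, 2, 3, 4], [5, 6, 7, 8, 9], [10, 11]]
```

Note that Celery's worker concurrency (the `-c` option shown below) already limits how many tasks run at once, so explicit chunking is only needed if you want to control how many tasks are *sent* per batch.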

Now you can start a worker to run the tasks:

```shell
celery worker -A your_app -l info --beat
```

It starts executing your task every five minutes. However, the default concurrency equals the number of CPU cores on your machine. If you want to execute 10 tasks at a time, change the concurrency like this:

```shell
celery worker -A your_app -l info --beat -c 10
```
