[PATCH] D61115: Parallel: only allow the first TaskGroup to run tasks parallelly

Rui Ueyama via Phabricator via llvm-commits llvm-commits at lists.llvm.org
Thu Apr 25 00:28:27 PDT 2019


ruiu added a comment.

Don't you think we can just get rid of the notion of TaskGroup? It looks like we don't need TaskGroup to implement a thread pool or a parallel for-each.

Generally, it doesn't make much sense to have more than one thread pool in a process (spawning more threads than necessary defeats the purpose of a thread pool), so we can make the thread pool a singleton class. Let's call the class ThreadPool. It would define `getInstance()` to return the singleton instance. The only other member function would be `void execute(std::function<void()> Fn)`, which runs `Fn` on some worker thread. Internally, that function would add the given function object to a queue and wake up a waiting thread using a condition variable.

I believe it shouldn't be too hard to implement such a thread pool class.
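A minimal sketch of what I have in mind (the `getInstance()`/`execute()` names are from the proposal above; everything else is just one possible way to fill in the details):

```cpp
#include <algorithm>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class ThreadPool {
public:
  static ThreadPool &getInstance() {
    static ThreadPool Instance;
    return Instance;
  }

  // Enqueue Fn and wake up one sleeping worker to run it.
  void execute(std::function<void()> Fn) {
    {
      std::lock_guard<std::mutex> Lock(Mu);
      Queue.push(std::move(Fn));
    }
    Cond.notify_one();
  }

private:
  ThreadPool() {
    unsigned N = std::max(1u, std::thread::hardware_concurrency());
    for (unsigned I = 0; I < N; ++I)
      Threads.emplace_back([this] { work(); });
  }

  ~ThreadPool() {
    {
      std::lock_guard<std::mutex> Lock(Mu);
      Done = true;
    }
    Cond.notify_all();
    for (std::thread &T : Threads)
      T.join();
  }

  // Worker loop: sleep until a task is queued (or we are shutting
  // down), then run tasks one at a time.
  void work() {
    for (;;) {
      std::function<void()> Fn;
      {
        std::unique_lock<std::mutex> Lock(Mu);
        Cond.wait(Lock, [this] { return Done || !Queue.empty(); });
        if (Done && Queue.empty())
          return;
        Fn = std::move(Queue.front());
        Queue.pop();
      }
      Fn(); // Run the task outside the lock.
    }
  }

  std::vector<std::thread> Threads;
  std::queue<std::function<void()>> Queue;
  std::mutex Mu;
  std::condition_variable Cond;
  bool Done = false;
};
```

The destructor drains the queue before joining, so tasks submitted before process exit still run.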

Once we have the thread pool, we can implement parallel-for-each like this:

  template <class IterTy, class FuncTy>
  void parallel_for_each(IterTy Begin, IterTy End, FuncTy Fn) {
    // Split the range into tasks of roughly equal size.
    ptrdiff_t TaskSize = std::distance(Begin, End) / 1024;
    if (TaskSize == 0)
      TaskSize = 1;

    size_t NumTasks = 0;
    std::mutex Mu;
    std::condition_variable Cond;

    // Submit jobs to the thread pool. Begin must be captured by
    // value because the loop keeps advancing it.
    while (TaskSize < std::distance(Begin, End)) {
      {
        std::lock_guard<std::mutex> Lock(Mu);
        ++NumTasks;
      }
      ThreadPool::getInstance().execute([&, Begin] {
        std::for_each(Begin, Begin + TaskSize, Fn);
        std::lock_guard<std::mutex> Lock(Mu);
        if (--NumTasks == 0)
          Cond.notify_all();
      });
      Begin += TaskSize;
    }

    // Run the last chunk on the calling thread.
    std::for_each(Begin, End, Fn);

    // Wait for everybody to complete.
    std::unique_lock<std::mutex> Lock(Mu);
    Cond.wait(Lock, [&] { return NumTasks == 0; });
  }

This parallel_for_each should be completely reentrant.


Repository:
  rL LLVM

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D61115/new/

https://reviews.llvm.org/D61115

More information about the llvm-commits mailing list