Hi!
I have been playing with this lately:
#!/usr/bin/newlisp -n -c
(context 'pool)
; Run one task on the pool: wait for a free slot, then spawn it.
; For side effects only: every spawn reuses the same result symbol 's.
(define (pool:pool task (poolSize 5) (syncDelay 50))
  ; wait for an available "slot" in the pool
  (while (>= (length (sync)) poolSize)
    (sync syncDelay))
  (spawn 's (eval task)))
; Map task over liste in parallel, keeping at most poolSize children running.
(define (pool:map task liste (poolSize 5) (syncDelay 50))
  (let (size (- (length liste) 1)
        out '())
    ; spawn processes
    (for (i 0 size)
      ; wait for an available "slot" in the pool
      (while (>= (length (sync)) poolSize)
        (sync syncDelay))
      (spawn (sym i) (task (liste i))))
    ; wait up to 10 s for the remaining children
    (sync 10000)
    ; collect the results back in order
    (for (i 0 size)
      (push (eval (sym i)) out -1))))
(context 'MAIN)
; dummy worker: sleep a random amount of time, then return the square
(define (test-func x)
  (sleep (+ 5 (rand 100)))
  (* x x))
(println
  "Map without pooling : "
  (time (map test-func (sequence 1 100)))
  "ms")

(println
  "Pool map, default pool size : "
  (time (pool:map test-func (sequence 1 100)))
  "ms")

(println
  "Pool:map, pool size = 50 : "
  (time (pool:map test-func (sequence 1 100) 50))
  "ms")

(println
  "Just to check results : "
  (pool:map test-func (sequence 1 100) 50))

(println
  "Pool:map on string : "
  (join (pool:map (lambda (c) (upper-case c)) "A beautiful string") ""))
(exit)
This is a very basic attempt at creating higher-level functions for multiprocessing. I think it would be good to have functions similar to these two (more optimised and fault-tolerant) available in the newLISP core:
- a pool-map to map a function over a list (or several lists, though my attempt handles only one) in parallel.
- a pool-pool to distribute tasks from a loop over a pool of worker processes (for side effects only, since in my attempt the results are discarded); a quick usage sketch follows below.
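Here is a quick, untested sketch of how pool:pool could be driven from a loop for side effects (the task count and pool size are just made up for illustration):
(dolist (i (sequence 1 20))
  ; build the task expression, then hand it to the pool (at most 4 children at a time)
  (pool:pool (list 'println "task " i " done") 4))
; give the last children time to finish before continuing
(sync 10000)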
What do you think?
Nice. I have been looking for something like this. Something along the lines of Clojure agents.