Details
Type: Question
Status: Resolved
Priority: Trivial
Resolution: Invalid
Affects Version/s: 1.3.0
Fix Version/s: None
Environment: CentOS 6
Description
Hi,
I am using Scala to write a socket program that accepts multiple requests at the same time and then calls a function that uses Spark to process each request. I have a multi-threaded server that handles the incoming requests and passes each one to Spark, but there is a bottleneck: Spark does not start a sub-task for each new request. Is it even possible to do parallel processing within a single Spark job?
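For reference, here is a minimal sketch of the setup described above, assuming a single shared SparkContext submitting jobs from multiple threads (the port number, thread-pool size, and per-request workload below are placeholder assumptions, not the real code):

{code:scala}
import java.net.ServerSocket
import java.util.concurrent.Executors
import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext}

object SocketSparkServer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("socket-spark-server")
      // FAIR scheduling lets jobs submitted from separate threads
      // share executors instead of queueing one after another.
      .set("spark.scheduler.mode", "FAIR")
    val sc = new SparkContext(conf) // master is set via spark-submit

    val pool = Executors.newFixedThreadPool(8) // one thread per in-flight request
    val server = new ServerSocket(9999)        // placeholder port

    while (true) {
      val socket = server.accept() // blocks until a client connects
      pool.submit(new Runnable {
        override def run(): Unit = {
          val request = Source.fromInputStream(socket.getInputStream).getLines().next()
          // Each handler thread submits its own Spark job on the shared context.
          val result = sc.textFile(request).count() // placeholder workload
          socket.getOutputStream.write(s"$result\n".getBytes)
          socket.close()
        }
      })
    }
  }
}
{code}

My understanding is that SparkContext is thread-safe for job submission, so each handler thread can trigger its own job, but I am not sure whether this actually runs the jobs in parallel in practice.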
Best Regards,