This example is independent of the major Python version, i.e. Python 2.7.x and Python 3.6.x will both behave the same in this regard. There is a slight difference between the Intel optimized Python modules and the regular Python modules, which will be described below.
When submitting a job to the queue, multiple cores are requested using the -pe omp N flag. When the job runs, an environment variable called NSLOTS will be automatically set to the number of requested cores.
The Numpy library's thread behavior is controlled by two environment variables. The OMP_NUM_THREADS variable affects the multi-threading of higher-level algorithms (e.g. root finding), while OPENBLAS_NUM_THREADS affects the number of threads used for lower-level linear algebra calculations (e.g. matrix-vector multiplication). If the Intel version of the Python modules is used (e.g. python/3.6_intel-2018.0.018 or python/2.7_intel-2018.0.018), then a variable called MKL_NUM_THREADS is used in place of OPENBLAS_NUM_THREADS.
Higher level multi-threaded algorithms can call lower-level routines which can in turn create additional threads. The correct way to limit a Python job to the requested number of cores is to make sure that the product of OMP_NUM_THREADS and OPENBLAS_NUM_THREADS (or MKL_NUM_THREADS for Intel Python) is equal to the requested number of cores, NSLOTS.
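As a concrete illustration of this constraint, the following shell sketch splits a hypothetical 8-core request between the two variables. The value 8 and the 2/4 split are illustrative assumptions; in a real job NSLOTS is set by the scheduler, not exported by hand.

```shell
# Hypothetical example: with 8 requested cores (NSLOTS=8), choose thread
# counts so that OMP_NUM_THREADS * OPENBLAS_NUM_THREADS = NSLOTS.
export NSLOTS=8              # set automatically by the scheduler in a real job
export OMP_NUM_THREADS=2     # threads for higher-level algorithms
export OPENBLAS_NUM_THREADS=4  # threads for low-level BLAS routines

# The product matches the requested core count:
echo $(( OMP_NUM_THREADS * OPENBLAS_NUM_THREADS ))
```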
The following is an example of a simple Python calculation with Numpy that can take advantage of multiple cores. The example is shown for the regular Python modules and it sets the OPENBLAS_NUM_THREADS variable. If using the Intel Python modules just swap MKL_NUM_THREADS for OPENBLAS_NUM_THREADS.
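The original numpy_dot.py script is not reproduced here; the following is a minimal sketch of the kind of calculation it describes. The matrix size is an illustrative assumption. The np.dot call dispatches to the multi-threaded BLAS backend (OpenBLAS or MKL), so its thread count is governed by OPENBLAS_NUM_THREADS or MKL_NUM_THREADS.

```python
import numpy as np

# Hypothetical stand-in for the numpy_dot.py example.
# The matrix product below runs in the low-level BLAS library, whose
# thread count is set by OPENBLAS_NUM_THREADS (or MKL_NUM_THREADS for
# the Intel Python modules).
n = 2000  # illustrative size; large enough for multi-threading to matter
rng = np.random.default_rng(0)
a = rng.random((n, n))
b = rng.random((n, n))
c = np.dot(a, b)  # multi-threaded BLAS matrix-matrix multiplication
print(c.shape)
```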
This is the example queue submission script to match the simple Python script. Since the sample script spends all of its time in low-level routines the OMP_NUM_THREADS variable is set to 1, and the OPENBLAS_NUM_THREADS variable is set to the number of requested cores, NSLOTS.
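The original numpy_dot.qsub file is not reproduced here; a minimal sketch of such a submission script, assuming a bash batch script with Grid Engine directives and an illustrative core count of 4, might look like the following. The module name and runtime limit are assumptions.

```shell
#!/bin/bash -l
#$ -pe omp 4          # request 4 cores; NSLOTS will be set to 4
#$ -l h_rt=1:00:00    # illustrative runtime limit

module load python/3.6

# All the work happens in low-level BLAS routines, so give them all
# of the requested cores and leave the higher-level layer single-threaded.
export OMP_NUM_THREADS=1
export OPENBLAS_NUM_THREADS=$NSLOTS   # MKL_NUM_THREADS for Intel Python

python numpy_dot.py
```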