Untitled

By: a guest on Oct 2nd, 2013
[jallan@hpc27 ~]$ `which  mpirun` --report-bindings --bind-to-core -mca btl openib,sm,self -np 8   -hostfile ./hplnodes2 -d --debug-daemons  /cm/shared/apps/hpl/2.0/xhpl
[hpc27:06010] System has detected external process binding to cores 0080
[hpc27:06010] procdir: /tmp/openmpi-sessions-jallan@hpc27_0/50313/0/0
[hpc27:06010] jobdir: /tmp/openmpi-sessions-jallan@hpc27_0/50313/0
[hpc27:06010] top: openmpi-sessions-jallan@hpc27_0
[hpc27:06010] tmp: /tmp
[hpc27:06010] mpirun: reset PATH: /cm/shared/apps/openmpi/gcc/64/1.4.4/bin:/opt/vdt/globus/sbin:/opt/vdt/globus/bin:/opt/vdt/apache-maven/bin:/opt/vdt/apache-ant/bin:/opt/vdt/gpt/sbin:/opt/vdt/jdk1.6/bin:/opt/moab_lambda/bin:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:/opt/bin:/sbin:/usr/sbin:/cm/shared/apps/hpl/2.0:/cm/shared/apps/openmpi/gcc/64/1.4.4/bin:/chome/jallan/bin
[hpc27:06010] mpirun: reset LD_LIBRARY_PATH: /cm/shared/apps/openmpi/gcc/64/1.4.4/lib64:/opt/vdt/globus/lib:/opt/vdt/jdk1.6/jre/lib/amd64client:/opt/vdt/jdk1.6/jre/lib/amd64/server:/opt/vdt/jdk1.6/jre/lib/amd64:/cm/shared/apps/gotoblas/core/64/1.26:/cm/shared/apps/openmpi/gcc/64/1.4.4/lib64
aklog: Couldn't determine realm of user:aklog: unknown RPC error (-1765328189)  while getting realm
aklog: Couldn't determine realm of user:aklog: unknown RPC error (-1765328189)  while getting realm
Daemon was launched on hpc23 - beginning to initialize
Daemon was launched on hpc24 - beginning to initialize
[hpc23:22184] procdir: /tmp/openmpi-sessions-jallan@hpc23_0/50313/0/1
[hpc23:22184] jobdir: /tmp/openmpi-sessions-jallan@hpc23_0/50313/0
[hpc23:22184] top: openmpi-sessions-jallan@hpc23_0
[hpc23:22184] tmp: /tmp
[hpc24:09206] procdir: /tmp/openmpi-sessions-jallan@hpc24_0/50313/0/2
[hpc24:09206] jobdir: /tmp/openmpi-sessions-jallan@hpc24_0/50313/0
[hpc24:09206] top: openmpi-sessions-jallan@hpc24_0
[hpc24:09206] tmp: /tmp
Daemon [[50313,0],2] checking in as pid 9206 on host hpc24
Daemon [[50313,0],2] not using static ports
[hpc24:09206] [[50313,0],2] orted: up and running - waiting for commands!
Daemon [[50313,0],1] checking in as pid 22184 on host hpc23
Daemon [[50313,0],1] not using static ports
[hpc27:06010] [[50313,0],0] node[0].name hpc27 daemon 0 arch ffc91200
[hpc27:06010] [[50313,0],0] node[1].name hpc23 daemon 1 arch ffc91200
[hpc27:06010] [[50313,0],0] node[2].name hpc24 daemon 2 arch ffc91200
[hpc27:06010] [[50313,0],0] node[3].name hpc12 daemon INVALID arch ffc91200
[hpc27:06010] [[50313,0],0] orted_cmd: received add_local_procs
[hpc23:22184] [[50313,0],1] orted: up and running - waiting for commands!
[hpc23:22184] [[50313,0],1] node[0].name hpc27 daemon 0 arch ffc91200
[hpc23:22184] [[50313,0],1] node[1].name hpc23 daemon 1 arch ffc91200
[hpc23:22184] [[50313,0],1] node[2].name hpc24 daemon 2 arch ffc91200
[hpc23:22184] [[50313,0],1] node[3].name hpc12 daemon INVALID arch ffc91200
[hpc23:22184] [[50313,0],1] orted_cmd: received add_local_procs
[hpc24:09206] [[50313,0],2] node[0].name hpc27 daemon 0 arch ffc91200
[hpc24:09206] [[50313,0],2] node[1].name hpc23 daemon 1 arch ffc91200
[hpc24:09206] [[50313,0],2] node[2].name hpc24 daemon 2 arch ffc91200
[hpc24:09206] [[50313,0],2] node[3].name hpc12 daemon INVALID arch ffc91200
[hpc24:09206] [[50313,0],2] orted_cmd: received add_local_procs
--------------------------------------------------------------------------
Not enough processors were found on the local host to meet the requested
binding action:

  Local host:        hpc23
  Action requested:  bind-to-core
  Application name:  /cm/shared/apps/hpl/2.0/xhpl

Please revise the request and try again.
--------------------------------------------------------------------------
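One plausible reading of the failure above: the second line of the log reports "external process binding to cores 0080", i.e. the launching environment already carried a restricted CPU affinity mask (0x0080 is a single core), perhaps set by a batch scheduler or taskset; if a similarly restricted mask applies on hpc23, --bind-to-core cannot find free cores for the requested ranks. A minimal sketch, assuming the reported value is a plain hex cpuset mask (my interpretation, not confirmed by the log), that decodes how many cores such a mask allows:

```shell
# Decode a cpuset mask like the "0080" reported above.
# NOTE: treating the value as a hex bitmask is an assumption for illustration.
mask=0x0080
m=$((mask)); bit=0; count=0
while [ "$m" -ne 0 ]; do
  if [ $((m & 1)) -eq 1 ]; then
    echo "core $bit is allowed"     # bit position = logical core number
    count=$((count + 1))
  fi
  m=$((m >> 1)); bit=$((bit + 1))
done
echo "$count core(s) available"
```

For 0x0080 this yields a single allowed core (core 7), consistent with "Not enough processors were found on the local host" when multiple bound ranks were requested.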
[hpc23:22184] [[50313,0],1] orted_cmd: received iof_complete cmd
--------------------------------------------------------------------------
mpirun was unable to start the specified application as it encountered an error:

Error name: Unknown error: 1
Node: hpc23.ib.ptc

when attempting to start process rank 0.
--------------------------------------------------------------------------
[hpc27:06010] [[50313,0],0] orted_cmd: received exit
/cm/shared/apps/hpl/2.0/xhpl: error while loading shared libraries: libgoto.so: cannot open shared object file: No such file or directory
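There is a second, separate failure here: xhpl could not load libgoto.so, even though a GotoBLAS directory appears in the "reset LD_LIBRARY_PATH" line earlier in the log. A common cause is that the remote orted daemons do not inherit the launching shell's environment. A hedged sketch of one fix (the directory is taken verbatim from the log above; whether it actually contains libgoto.so on the remote nodes is an assumption):

```shell
# Prepend the GotoBLAS directory shown in the "reset LD_LIBRARY_PATH" line
# above, so the dynamic linker can resolve libgoto.so for xhpl.
goto_dir=/cm/shared/apps/gotoblas/core/64/1.26
export LD_LIBRARY_PATH="$goto_dir${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH" | tr ':' '\n' | head -n 1
# Then forward the variable to the remote daemons with Open MPI's -x flag:
#   mpirun -x LD_LIBRARY_PATH ... /cm/shared/apps/hpl/2.0/xhpl
```

Running `ldd /cm/shared/apps/hpl/2.0/xhpl` on the failing node would confirm whether libgoto.so resolves after the change.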
[hpc23:22184] [[50313,0],1] orted_cmd: received iof_complete cmd
[hpc23:22184] [[50313,0],1] orted_cmd: received exit
[hpc23:22184] [[50313,0],1] orted: finalizing
[hpc27:06010] sess_dir_finalize: job session dir not empty - leaving
[hpc27:06010] 1 more process has sent help message help-odls-default.txt / odls-default:not-enough-resources
[hpc27:06010] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[hpc27:06010] sess_dir_finalize: proc session dir not empty - leaving
orterun: exiting with status 1
[jallan@hpc27 ~]$ [hpc23:22184] sess_dir_finalize: job session dir not empty - leaving
[hpc23:22184] sess_dir_finalize: job session dir not empty - leaving
[hpc24:09206] [[50313,0],2] orted_cmd: received iof_complete cmd
[hpc24:09206] [[50313,0],2] orted_cmd: received exit
[hpc24:09206] [[50313,0],2] orted: finalizing
[hpc24:09206] sess_dir_finalize: job session dir not empty - leaving
[hpc24:09206] sess_dir_finalize: job session dir not empty - leaving