[0] DAPL startup(): trying to open default DAPL provider from dat registry: ibnic0v2
[1] DAPL startup(): trying to open default DAPL provider from dat registry: ibnic0v2
[0] MPI startup(): DAPL provider ibnic0v2
[1] MPI startup(): DAPL provider ibnic0v2
[0] MPI startup(): dapl data transfer mode
[1] MPI startup(): dapl data transfer mode
[0] MPI startup(): Internal info: pinning initialization was done
[0] MPI startup(): Rank    Pid    Node name    Pin cpu
[0] MPI startup(): 0       3700   CN01         {0,1,2,3,4,5,6,7}
[0] MPI startup(): 1       2632   CN02         {0,1,2,3,4,5,6,7}
[1] MPI startup(): Internal info: pinning initialization was done
[0] MPI startup(): I_MPI_DEBUG=5
[0] MPI startup(): I_MPI_PIN_MAPPING=1:0 0
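Note: the block above is Intel MPI's I_MPI_DEBUG=5 startup report. Both ranks picked the DAPL provider ibnic0v2 from the DAT registry, and each process was pinned to cores 0-7 on its node. The Rank/Pid/Node-name triple in that table is exactly what a minimal C program like the following would print (an illustrative sketch, not part of the paste; the original launch command is not preserved here):

    #include <mpi.h>
    #include <stdio.h>
    #ifdef _WIN32
    #include <process.h>          /* _getpid() on Windows */
    #define getpid _getpid
    #else
    #include <unistd.h>           /* getpid() elsewhere */
    #endif

    int main(int argc, char **argv)
    {
        int rank, len;
        char node[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Get_processor_name(node, &len);
        /* same triple as the debug table: Rank, Pid, Node name */
        printf("%d %d %s\n", rank, (int)getpid(), node);
        MPI_Finalize();
        return 0;
    }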
#---------------------------------------------------
# Intel (R) MPI Benchmark Suite V3.2.3, MPI-1 part
#---------------------------------------------------
# Date : Thu Jan 17 08:56:05 2013
# Machine : Intel(R) 64 Family 6 Model 26 Stepping 5, GenuineIntel
# Release : 6.1.7601
# Version : Service Pack 1
# MPI Version : 2.2
# MPI Thread Environment: MPI_THREAD_MULTIPLE
# New default behavior from Version 3.2 on:
# the number of iterations per message size is cut down
# dynamically when a certain run time (per message size sample)
# is expected to be exceeded. Time limit is defined by variable
# "SECS_PER_SAMPLE" (=> IMB_settings.h)
# or through the flag => -time
# Calling sequence was:
# C:\Users\sg\Desktop\imb_3.2.3\WINDOWS\IMB-MPI1_VS_2010\x64\Release\IMB-MPI1.exe
# Minimum message length in bytes: 0
# Maximum message length in bytes: 4194304
#
# MPI_Datatype : MPI_BYTE
# MPI_Datatype for reductions : MPI_FLOAT
# MPI_Op : MPI_SUM
#
#
# List of Benchmarks to run:
# PingPong
# PingPing
# Sendrecv
# Exchange
# Allreduce
# Reduce
# Reduce_scatter
# Allgather
# Allgatherv
# Gather
# Gatherv
# Scatter
# Scatterv
# Alltoall
# Alltoallv
# Bcast
# Barrier
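Note: the header's remark about iterations being "cut down dynamically" explains the #repetitions column in the tables below. Besides the SECS_PER_SAMPLE time limit it mentions, IMB also caps the total data volume per sample, which is what produces the 1000 / 640 / 320 / ... / 10 progression seen in this run. A hedged C sketch of that volume cap (constant names follow IMB 3.2's IMB_settings.h defaults as an assumption; the logic approximates IMB's, it is not copied from it):

    #include <stdio.h>

    #define MSGSPERSAMPLE 1000           /* max repetitions per message size */
    #define OVERALL_VOL (40 * 1048576)   /* per-sample volume cap in bytes   */

    static int repetitions(long bytes)
    {
        long n = MSGSPERSAMPLE;
        if (bytes > 0 && OVERALL_VOL / bytes < n)
            n = OVERALL_VOL / bytes;     /* shrink reps so bytes*reps <= cap */
        return n < 1 ? 1 : (int)n;
    }

    int main(void)
    {
        long sizes[] = { 32768, 65536, 131072, 4194304 };
        for (int i = 0; i < 4; i++)
            printf("%7ld bytes -> %4d repetitions\n",
                   sizes[i], repetitions(sizes[i]));
        return 0;  /* prints 1000, 640, 320, 10 -- the #repetitions column */
    }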
#---------------------------------------------------
# Benchmarking PingPong
# #processes = 2
#---------------------------------------------------
    #bytes #repetitions   t[usec]  Mbytes/sec
         0         1000      3.99        0.00
         1         1000      3.99        0.24
         2         1000      3.76        0.51
         4         1000      3.77        1.01
         8         1000      3.78        2.02
        16         1000      3.81        4.01
        32         1000      3.93        7.77
        64         1000      3.93       15.52
       128         1000      4.05       30.12
       256         1000      4.10       59.57
       512         1000      4.41      110.62
      1024         1000      4.99      195.63
      2048         1000      6.22      314.13
      4096         1000      8.30      470.55
      8192         1000     10.63      735.28
     16384         1000     15.31     1020.76
     32768         1000     21.21     1473.49
     65536          640     31.53     1982.50
    131072          320     52.39     2385.87
    262144          160     94.76     2638.22
    524288           80    185.22     2699.49
   1048576           40    356.92     2801.73
   2097152           20    699.45     2859.38
   4194304           10   1393.73     2870.00
...
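Note on reading the PingPong numbers above: IMB reports t[usec] as half the average round-trip time between the two ranks, and Mbytes/sec as (bytes / 2^20) / t. For the last row that is (4194304 / 1048576) MB / 1393.73e-6 s = 2869.9 ~ 2870.00 Mbytes/sec, matching the printed value: roughly 2.9 GB/s of bandwidth over ibnic0v2, with about 3.8-4.0 usec small-message latency. A minimal C sketch of the same measurement pattern (illustrative, not IMB's code):

    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, reps = 1000, bytes = 4194304;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        char *buf = malloc(bytes);

        MPI_Barrier(MPI_COMM_WORLD);
        double t0 = MPI_Wtime();
        for (int i = 0; i < reps; i++) {
            if (rank == 0) {          /* rank 0 sends, then waits for echo */
                MPI_Send(buf, bytes, MPI_BYTE, 1, 0, MPI_COMM_WORLD);
                MPI_Recv(buf, bytes, MPI_BYTE, 1, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
            } else if (rank == 1) {   /* rank 1 echoes everything back */
                MPI_Recv(buf, bytes, MPI_BYTE, 0, 0, MPI_COMM_WORLD,
                         MPI_STATUS_IGNORE);
                MPI_Send(buf, bytes, MPI_BYTE, 0, 0, MPI_COMM_WORLD);
            }
        }
        double t = (MPI_Wtime() - t0) / reps / 2.0;  /* half round trip, s */
        if (rank == 0)
            printf("%d bytes: %.2f usec, %.2f Mbytes/sec\n",
                   bytes, t * 1e6, bytes / 1048576.0 / t);

        free(buf);
        MPI_Finalize();
        return 0;
    }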
#----------------------------------------------------------------
# Benchmarking Bcast
# #processes = 2
#----------------------------------------------------------------
    #bytes #repetitions t_min[usec] t_max[usec] t_avg[usec]
         0         1000        0.05        0.46        0.26
         1         1000        3.41        3.41        3.41
         2         1000        3.42        3.42        3.42
         4         1000        3.42        3.42        3.42
         8         1000        3.42        3.42        3.42
        16         1000        3.43        3.44        3.43
        32         1000        3.50        3.51        3.50
        64         1000        3.55        3.55        3.55
       128         1000        3.57        3.58        3.57
       256         1000        3.79        3.80        3.80
       512         1000        4.08        4.08        4.08
      1024         1000        4.74        4.75        4.74
      2048         1000        5.89        5.90        5.89
      4096         1000        8.12        8.13        8.13
      8192         1000       10.31       10.32       10.31
     16384         1000       14.74       14.75       14.74
     32768         1000       20.05       20.05       20.05
Fatal error in PMPI_Bcast: Other MPI error, error stack:
PMPI_Bcast(2112)........: MPI_Bcast(buf=00000000030C0040, count=65536, MPI_BYTE, root=0, comm=0x84000000) failed
MPIR_Bcast_impl(1670)...:
I_MPIR_Bcast_intra(1887): Failure during collective
MPIR_Bcast_intra(1461)..:
MPIR_Bcast_binomial(156): message sizes do not match across processes in the collective
[0:CN01.cual.local] unexpected disconnect completion event from [1:CN02.****.*****]
Assertion failed in file .\dapl_conn_rc.c at line 1128: 0
internal ABORT - process 0
job aborted:
rank: node: exit code[: error message]
0: CN01.cual.local: 1: process 0 exited without calling finalize
1: CN02: 1: process 1 exited without calling finalize
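Note on the failure: the run dies at the 65536-byte Bcast step (count=65536 in the error stack). "message sizes do not match across processes in the collective" is MPICH-derived wording for a consistency check inside the binomial broadcast: the byte count arriving at a receiver disagrees with what the sender announced. The classic application-level trigger is ranks passing different counts to MPI_Bcast, sketched below for illustration. IMB itself passes identical sizes on every rank, though, and the subsequent "unexpected disconnect completion event" plus the assertion in dapl_conn_rc.c suggest a DAPL/interconnect-level failure between CN01 and CN02 rather than a bug in the benchmark.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank;
        char buf[65536];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        /* BUG (deliberate, for illustration): the root broadcasts 65536
         * bytes while the other rank expects only 32768 -- erroneous MPI
         * usage that MPICH-derived implementations typically report as
         * "message sizes do not match across processes in the collective". */
        int count = (rank == 0) ? 65536 : 32768;
        MPI_Bcast(buf, count, MPI_BYTE, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("bcast returned\n");   /* not reached on the abort path */
        MPI_Finalize();
        return 0;
    }

Run on two ranks, an MPICH-derived MPI (Intel MPI included) typically aborts this sketch with the same "message sizes do not match" stack as above, which is why a healthy fabric makes the application-bug explanation the first thing to rule out.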