- GROMACS is written by:
- Emile Apol Rossen Apostolov Herman J.C. Berendsen Par Bjelkmar
- Aldert van Buuren Rudi van Drunen Anton Feenstra Sebastian Fritsch
- Gerrit Groenhof Christoph Junghans Peter Kasson Carsten Kutzner
- Per Larsson Justin A. Lemkul Magnus Lundborg Pieter Meulenhoff
- Erik Marklund Teemu Murtola Szilard Pall Sander Pronk
- Roland Schulz Alexey Shvetsov Michael Shirts Alfons Sijbers
- Peter Tieleman Christian Wennberg Maarten Wolf
- and the project leaders:
- Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel
- Copyright (c) 1991-2000, University of Groningen, The Netherlands.
- Copyright (c) 2001-2014, The GROMACS development team at
- Uppsala University, Stockholm University and
- the Royal Institute of Technology, Sweden.
- check out http://www.gromacs.org for more information.
- GROMACS is free software; you can redistribute it and/or modify it
- under the terms of the GNU Lesser General Public License
- as published by the Free Software Foundation; either version 2.1
- of the License, or (at your option) any later version.
- GROMACS: gmx mdrun, VERSION 5.0.2
- Executable: /usr/bin/gmx_mpi
- Library dir: /usr/share/gromacs/top
- Command line:
- gmx_mpi mdrun -v -deffnm /home/diego-m/hd42/fluorosceina/box_building/solvent_test/solvent_test
- Back Off! I just backed up /home/diego-m/hd42/fluorosceina/box_building/solvent_test/solvent_test.log to /home/diego-m/hd42/fluorosceina/box_building/solvent_test/#solvent_test.log.6#
- Number of hardware threads detected (8) does not match the number reported by OpenMP (1).
- Consider setting the launch configuration manually!
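The warning above says mdrun detected 8 hardware threads while the OpenMP runtime reported only 1, so the rank/thread split was chosen automatically. One way to set the launch configuration manually, as the message suggests, is to pin the MPI rank count and OpenMP threads per rank explicitly. A minimal sketch (the 4×2 split and the `solvent_test` name are illustrative choices for this 8-thread machine, not values taken from the log):

```shell
# Launch 4 MPI ranks with 2 OpenMP threads each (4 x 2 = 8 hardware threads).
# -ntomp tells gmx mdrun how many OpenMP threads to start per MPI rank;
# OMP_NUM_THREADS keeps the OpenMP runtime consistent with that choice.
export OMP_NUM_THREADS=2
mpirun -np 4 gmx_mpi mdrun -ntomp 2 -v -deffnm solvent_test
```

This is a launch-configuration fragment: it requires an MPI installation and a GROMACS build, so it is shown for illustration rather than as a runnable test.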
- Reading file /home/diego-m/hd42/fluorosceina/box_building/solvent_test/solvent_test.tpr, VERSION 5.0.2 (single precision)
- Using 8 MPI processes
- Using 1 OpenMP thread per MPI process
- Non-default thread affinity set probably by the OpenMP library,
- disabling internal thread affinity
- Back Off! I just backed up /home/diego-m/hd42/fluorosceina/box_building/solvent_test/solvent_test.trr to /home/diego-m/hd42/fluorosceina/box_building/solvent_test/#solvent_test.trr.5#
- Back Off! I just backed up /home/diego-m/hd42/fluorosceina/box_building/solvent_test/solvent_test.edr to /home/diego-m/hd42/fluorosceina/box_building/solvent_test/#solvent_test.edr.5#
- Steepest Descents:
- Tolerance (Fmax) = 1.00000e+03
- Number of steps = 50000
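The tolerance and step count above are set in the run's .mdp file before `grompp` builds the .tpr. A minimal energy-minimization fragment that would produce exactly these two settings (the parameter names are standard GROMACS .mdp options; `emstep` here is an illustrative value not shown in the log):

```
; Steepest-descent energy minimization
integrator  = steep
emtol       = 1000.0   ; stop when Fmax < 1000, matches "Tolerance (Fmax) = 1.00000e+03"
nsteps      = 50000    ; maximum number of minimization steps, matches the log
emstep      = 0.01     ; initial step size in nm (illustrative, not from this log)
```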
- nan Fmax= 2.35818e+10, atom= 214
- [slave:26344] *** Process received signal ***
- [slave:26345] *** Process received signal ***
- [slave:26345] Signal: Segmentation fault (11)
- [slave:26345] Signal code: Address not mapped (1)
- [slave:26345] Failing at address: 0xfffffffe00d2ab40
- [slave:26344] Signal: Segmentation fault (11)
- [slave:26344] Signal code: Address not mapped (1)
- [slave:26344] Failing at address: 0xfffffffe00dd75b0
- [slave:26344] [ 0] /usr/bin/../lib64/libpthread.so.0(+0x10200)[0x7f931113a200]
- [slave:26344] [ 1] /usr/bin/../lib64/libgromacs_mpi.so.0(+0xd1794c)[0x7f931268594c]
- [slave:26344] [ 2] /usr/bin/../lib64/libgromacs_mpi.so.0(+0xd21895)[0x7f931268f895]
- [slave:26344] [ 3] /usr/bin/../lib64/../lib64/libgomp.so.1(GOMP_parallel+0x3f)[0x7f930f920b7f]
- [slave:26344] [ 4] /usr/bin/../lib64/libgromacs_mpi.so.0(nbnxn_put_on_grid+0xe7d)[0x7f9312691efd]
- [slave:26344] [ 5] /usr/bin/../lib64/libgromacs_mpi.so.0(dd_partition_system+0x4549)[0x7f93126d1b39]
- [slave:26344] [ 6] /usr/bin/../lib64/libgromacs_mpi.so.0(do_steep+0x12d1)[0x7f93126eb0a1]
- [slave:26344] [ 7] gmx_mpi(mdrunner+0x1348)[0x433e18]
- [slave:26344] [ 8] gmx_mpi(_Z9gmx_mdruniPPc+0x18ba)[0x43670a]
- [slave:26344] [ 9] /usr/bin/../lib64/libgromacs_mpi.so.0(_ZN3gmx24CommandLineModuleManager3runEiPPc+0xa2)[0x7f9311b81592]
- [slave:26344] [10] gmx_mpi(main+0x8c)[0x41154c]
- [slave:26344] [11] /usr/bin/../lib64/libc.so.6(__libc_start_main+0xf0)[0x7f931057d040]
- [slave:26344] [12] gmx_mpi[0x41168e]
- [slave:26344] *** End of error message ***
- [slave:26345] [ 0] /usr/bin/../lib64/libpthread.so.0(+0x10200)[0x7f25d2d5d200]
- [slave:26345] [ 1] /usr/bin/../lib64/libgromacs_mpi.so.0(+0xd1794c)[0x7f25d42a894c]
- [slave:26345] [ 2] /usr/bin/../lib64/libgromacs_mpi.so.0(+0xd21895)[0x7f25d42b2895]
- [slave:26345] [ 3] /usr/bin/../lib64/../lib64/libgomp.so.1(GOMP_parallel+0x3f)[0x7f25d1543b7f]
- [slave:26345] [ 4] /usr/bin/../lib64/libgromacs_mpi.so.0(nbnxn_put_on_grid+0xe7d)[0x7f25d42b4efd]
- [slave:26345] [ 5] /usr/bin/../lib64/libgromacs_mpi.so.0(dd_partition_system+0x4549)[0x7f25d42f4b39]
- [slave:26345] [ 6] /usr/bin/../lib64/libgromacs_mpi.so.0(do_steep+0x12d1)[0x7f25d430e0a1]
- [slave:26345] [ 7] gmx_mpi(mdrunner+0x1348)[0x433e18]
- [slave:26345] [ 8] gmx_mpi(_Z9gmx_mdruniPPc+0x18ba)[0x43670a]
- [slave:26345] [ 9] /usr/bin/../lib64/libgromacs_mpi.so.0(_ZN3gmx24CommandLineModuleManager3runEiPPc+0xa2)[0x7f25d37a4592]
- [slave:26345] [10] gmx_mpi(main+0x8c)[0x41154c]
- [slave:26345] [11] /usr/bin/../lib64/libc.so.6(__libc_start_main+0xf0)[0x7f25d21a0040]
- [slave:26345] [12] gmx_mpi[0x41168e]
- [slave:26345] *** End of error message ***