Problem: Running the examples using the parallel version #3
Comments
Hi Kamra. Well yes, we use a slightly different layout for the boundary files, as in the serial version. If you don't manage to make it work, I will send you the correct ones on Monday, OK?
I am working on a new version of the code, so I haven't updated the GitHub repo in a while. I will keep you updated.
What are you interested in investigating in general?
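For illustration only (this is not code from the repository): the runtime error quoted below is what a Fortran list-directed read produces when it hits an OpenFOAM-style boundary dictionary instead of a plain layout. A minimal sketch of that failure mode, with a hypothetical read statement and hypothetical file layouts:

```fortran
! Hypothetical sketch of the failure mode only -- the actual read statement at
! line 436 of mesh_geometry_and_topology.f90 may look different.
program read_boundary_sketch
  implicit none
  character(len=80) :: bcName
  integer :: nFaces, startFace, ios

  open(unit=7, file='processor0/constant/polyMesh/boundary', status='old', iostat=ios)
  if (ios /= 0) stop 'could not open boundary file'

  ! A list-directed read like this expects a plain line such as
  !   movingWall  20  760
  ! With no iostat= present, gfortran aborts with
  !   "Bad integer for item 2 in list input"
  ! as soon as it meets an OpenFOAM dictionary entry such as
  !   movingWall { type wall; nFaces 20; startFace 760; }
  read(7, *, iostat=ios) bcName, nFaces, startFace
  if (ios /= 0) then
    print *, 'boundary file is not in the expected plain layout, iostat =', ios
  else
    print *, trim(bcName), nFaces, startFace
  end if

  close(7)
end program read_boundary_sketch
```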
On 31 May 2019 09:56, mo7ammedmostafa wrote:
Hello, my name is Kamra.
I came across your code while looking for MPI-parallel research code that I could use to test new schemes and algorithms.
I downloaded the code and managed to compile it, but it seems there is a problem with the mesh reader when reading the boundary file for the cavity and pitzDaily examples.
I get the following error:
```
At line 436 of file mesh_geometry_and_topology.f90 (unit = 7, file = 'processor0/constant/polyMesh/boundary')
Fortran runtime error: Bad integer for item 2 in list input
Error termination. Backtrace:
At line 436 of file mesh_geometry_and_topology.f90 (unit = 7, file = 'processor1/constant/polyMesh/boundary')
Fortran runtime error: Bad integer for item 2 in list input
Error termination. Backtrace:
#0  0x7ff209cd131a
#1  0x7ff209cd1ec5
#2  0x7ff209cd268d
#3  0x7ff209e3e924
#4  0x7ff209e41c1a
#5  0x7ff209e430f9
#6  0x55a6efbdfb0b
#7  0x55a6efbd61e8
#8  0x55a6efbb006e
#9  0x7ff209101b96
#10  0x55a6efbb00a9
#11  0xffffffffffffffff
#0  0x7f504480731a
#1  0x7f5044807ec5
#2  0x7f504480868d
#3  0x7f5044974924
#4  0x7f5044977c1a
#5  0x7f50449790f9
#6  0x559a1359eb0b
#7  0x559a135951e8
#8  0x559a1356f06e
#9  0x7f5043c37b96
#10  0x559a1356f0a9
#11  0xffffffffffffffff
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[49949,1],1]
  Exit code:    2
--------------------------------------------------------------------------
```
So do I just need to modify the boundary file from OpenFOAM format to your native format?
Thanks,
Kamra
I am planning to implement a new scheme for high-order Flux Reconstruction. Based on a quick read of the code, it looks like the necessary data structures will work, but I wanted to evaluate the parallel efficiency, because most of my test problems are 3D.
I will be able to help you out, but the algorithm layout will be different in your case. It will probably be an explicit code based on Runge-Kutta time stepping. The starting point should maybe be a simple second-order 3D explicit Euler or Navier-Stokes code based on the sequence reconstruction (gradient evaluation) - fluxes - evolution. I am interested in higher-order methods but haven't had time to do anything about it.
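A minimal sketch of that sequence, not taken from this code: a two-stage Runge-Kutta loop in which each stage does gradient reconstruction, face fluxes, and an explicit evolution step. It is shown for 1-D linear advection purely to illustrate the structure; all names and the model problem are illustrative.

```fortran
program rk_reconstruction_sketch
  implicit none
  integer, parameter :: n = 100, nsteps = 200
  real, parameter :: dx = 1.0/n, a = 1.0
  real :: u(n), u1(n), dt
  integer :: i, step

  ! initial condition: a smooth bump on a periodic domain
  do i = 1, n
    u(i) = exp(-100.0*((i - 0.5)*dx - 0.5)**2)
  end do
  dt = 0.3*dx/a                          ! CFL-limited explicit time step

  do step = 1, nsteps
    ! two-stage SSP Runge-Kutta; each stage is reconstruction -> fluxes -> evolution
    u1 = u + dt*residual(u)
    u  = 0.5*u + 0.5*(u1 + dt*residual(u1))
  end do
  print *, 'final max value:', maxval(u)

contains

  function residual(q) result(r)
    ! r = -(F_{i+1/2} - F_{i-1/2})/dx for linear advection with speed a
    real, intent(in) :: q(n)
    real :: r(n), grad(n), flux(0:n), ql
    integer :: i

    ! 1) reconstruction: central gradient estimate (periodic neighbours)
    do i = 1, n
      grad(i) = (q(modulo(i, n) + 1) - q(modulo(i - 2, n) + 1))/(2.0*dx)
    end do
    ! 2) fluxes: upwind state extrapolated to face i+1/2
    do i = 1, n
      ql = q(i) + 0.5*dx*grad(i)
      flux(i) = a*ql
    end do
    flux(0) = flux(n)                    ! periodic closure
    ! 3) evolution: finite-volume update of each cell
    do i = 1, n
      r(i) = -(flux(i) - flux(i-1))/dx
    end do
  end function residual

end program rk_reconstruction_sketch
```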
I will send you the dev version of the code soon as well.
I understand the difference between the two algorithms, but what I am interested in is the parallel efficiency of the implementation and data structure.
I didn't forget about you. At the moment I'm changing the mesh reading function, which will result in a slightly different spec of the boundary data and of the way it is structured. The way boundary faces are looped over has changed too.
I don't have access to a bigger machine at the moment, but I will get some CPU hours somewhere to test the code. As with many other codes, the main scalability bottleneck is the dot products and the residual calculation in the solution of the linear systems (mpi_allreduce). This is what I have found so far, but further testing will be necessary. During the year I want to do hybrid MPI/OpenMP, where the OpenMP parallelisation won't just be at loop level, but also of SPMD domain-decomposition type, focused on shared storage on the node.
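For reference, a minimal sketch (not from this code, names illustrative) of the kind of global reduction being described: in a Krylov-type linear solver every dot product and residual norm ends in one mpi_allreduce, which becomes a synchronisation point at large rank counts.

```fortran
program dot_product_allreduce
  use mpi
  implicit none
  integer, parameter :: nloc = 100000          ! local vector length per rank (illustrative)
  real(8) :: x(nloc), y(nloc), local_dot, global_dot
  integer :: ierr, rank

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

  x = 1.0d0
  y = 2.0d0

  local_dot = dot_product(x, y)                ! purely local work, scales well
  ! The global sum is a collective: all ranks must arrive before any can proceed.
  call MPI_Allreduce(local_dot, global_dot, 1, MPI_DOUBLE_PRECISION, &
                     MPI_SUM, MPI_COMM_WORLD, ierr)

  if (rank == 0) print *, 'global dot product =', global_dot

  call MPI_Finalize(ierr)
end program dot_product_allreduce
```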
Do you have any thoughts on this?
Are you at a university or in industry?
Regards