
Buffer access issue in high-resolution IBM grids #1125

@SpeedyTurtle599

Describe the bug
When running examples/2D_forward_facing_step/case.py, MFC crashes during the simulation step (pre-processing completes successfully). The terminal shows the following log:

+ mpirun -np 14 /Users/speedyturtle/Developer/mfc/build/install/51b5b52d6c/bin/simulation
 Simulating a regular 1499x499x0 case on 14 rank(s) on CPUs.
At line 514 of file /Users/speedyturtle/Developer/mfc/src/simulation/m_ibm.fpp
Fortran runtime error: Index '-11' of dimension 1 of array 's_cc' below lower bound of -10

Error termination. Backtrace:

Could not print backtrace: executable file is not an executable
#0  0x1052a2103
#1  0x1052a2cd7
#2  0x1052a3077
#3  0x10476ea63
#4  0x104778b2b
#5  0x10486e76f
#6  0x10494a377
--------------------------------------------------------------------------
prterun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

   Process name: [prterun-BGC-MBP14-14574@1,2]
   Exit code:    2
--------------------------------------------------------------------------

mfc: ERROR > :( /Users/speedyturtle/Developer/mfc/build/install/51b5b52d6c/bin/simulation failed with exit code 2.

 

Error: Submitting batch file for Interactive failed. It can be found here: /Users/speedyturtle/Developer/mfc/examples/2D_forward_facing_step/MFC.sh. Please check the file for errors.

./mfc.sh: line 74: 98008 Terminated: 15          python3 "$(pwd)/toolchain/main.py" "$@"

mfc: ERROR > main.py finished with a 143 exit code.
mfc: (venv) Exiting the Python virtual environment.
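
For context, the failing access is one cell outside the ghost-cell region: an array whose first dimension starts at -10 is indexed at -11. A minimal standalone illustration of the same failure mode (this is not MFC source; the array name and bounds are chosen only to mirror the error message):

    ! Standalone illustration of the failure mode, not MFC code.
    ! s_cc mimics a field with a 10-cell ghost buffer; indexing it at
    ! -11 reproduces the "below lower bound of -10" runtime error when
    ! compiled with bounds checking, e.g. gfortran -fcheck=bounds.
    program ghost_bounds
        implicit none
        real :: s_cc(-10:100)
        integer :: i
        s_cc = 0.0
        i = -11              ! one cell past the 10-cell ghost region
        print *, s_cc(i)     ! out-of-bounds access
    end program ghost_bounds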

To Reproduce

  1. Clone the latest MFC from source
  2. Build with ./mfc.sh build -j14
  3. Run the example with ./mfc.sh run examples/2D_forward_facing_step/case.py -n14
  4. Observe the error in the simulation executable

Expected behavior
The simulation correctly accesses ghost points in high-resolution grids.

Environment

  • OS: macOS 26.2
  • Compiler versions: Apple clang-1700.6.3.2, gfortran 15.2.0, Python 3.14.3

Additional context
My best guess is that this is caused by a hardcoded buffer-size limit for IBM simulations. I was able to sidestep the problem by increasing the buffer limit from 10 to 20 in /src/m_helper_basic.fpp (as of writing, lines 147 to 149).
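
To make the workaround concrete, here is a compilable sketch of the kind of change involved. The names ib and buff_size, and the surrounding logic, are my assumptions about what m_helper_basic.fpp does near those lines, not a verbatim quote; the point is only that the fixed IBM buffer of 10 (which matches the lower bound of -10 in the runtime error) is raised to 20:

    ! Sketch of the workaround with assumed names, not verbatim MFC source.
    program ibm_buffer_sketch
        implicit none
        logical :: ib
        integer :: buff_size
        ib = .true.               ! IBM enabled for this case
        buff_size = 10            ! stock value, matches the -10 lower bound
        if (ib) buff_size = 20    ! the workaround: enlarge the IBM buffer
        print *, 'IBM buff_size =', buff_size
    end program ibm_buffer_sketch

A fix in MFC itself would presumably size this buffer from the IBM stencil requirements rather than a fixed constant.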
