No option for Intel LLVM compilers for WRF-Hydro #731

Open
HathewayWill opened this issue Dec 31, 2023 · 15 comments
@HathewayWill
Intel has archived the Intel Classic compilers (ifort/icc/icpc) and replaced them with the new LLVM-based compilers (icx/ifx/icpx), along with new Intel MPI compiler wrappers.

wrf_hydro_nwm_public does not offer a compiler option for these new compilers.

Expected Behavior

[screenshot: configure menu listing the available compiler options]

A new configure option for the Intel LLVM compilers:

```
cd $WRFHYDRO_FOLDER/Downloads
wget -c https://github.com/NCAR/wrf_hydro_nwm_public/archive/refs/tags/v5.2.0.tar.gz -O WRFHYDRO.5.2.tar.gz
tar -xvzf WRFHYDRO.5.2.tar.gz -C $WRFHYDRO_FOLDER/

# Modify the WRF-Hydro environment
# (settings are appended to setEnvar.sh with echo)
cd $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS/template

sed -i 's/SPATIAL_SOIL=0/SPATIAL_SOIL=1/g' setEnvar.sh
echo " " >>setEnvar.sh
echo "# Large netcdf file support: 0=Off, 1=On." >>setEnvar.sh
echo "export WRFIO_NCD_LARGE_FILE_SUPPORT=1" >>setEnvar.sh
ln setEnvar.sh $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS

# Configure & compile WRF-Hydro in standalone mode
# (offline build with NoahMP)
cd $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS
source setEnvar.sh

echo 3 | ./configure 2>&1 | tee configure.log

# Swap the Fortran MPI wrapper on line 63 of the macros file
sed -i '63s/mpif90/mpiifort/g' $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS/macros

./compile_offline_NoahMP.sh setEnvar.sh 2>&1 | tee compile_offline_NoahMP.log
```
## Possible Solution

```
cd $WRFHYDRO_FOLDER/Downloads
wget -c https://github.com/NCAR/wrf_hydro_nwm_public/archive/refs/tags/v5.2.0.tar.gz -O WRFHYDRO.5.2.tar.gz
tar -xvzf WRFHYDRO.5.2.tar.gz -C $WRFHYDRO_FOLDER/

# Modify the WRF-Hydro environment
# (settings are appended to setEnvar.sh with echo)
cd $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS/template

sed -i 's/SPATIAL_SOIL=0/SPATIAL_SOIL=1/g' setEnvar.sh
echo " " >>setEnvar.sh
echo "# Large netcdf file support: 0=Off, 1=On." >>setEnvar.sh
echo "export WRFIO_NCD_LARGE_FILE_SUPPORT=1" >>setEnvar.sh
ln setEnvar.sh $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS

# Configure & compile WRF-Hydro in standalone mode
# (offline build with NoahMP)
cd $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS
source setEnvar.sh

echo 3 | ./configure 2>&1 | tee configure.log

# Swap the Fortran MPI wrapper on line 63 of the macros file
sed -i '63s/mpif90/mpiifx/g' $WRFHYDRO_FOLDER/wrf_hydro_nwm_public-5.2.0/trunk/NDHMS/macros

./compile_offline_NoahMP.sh setEnvar.sh 2>&1 | tee compile_offline_NoahMP.log
```
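The sed edits above patch the macros file by absolute line number (63), which silently does nothing if the compiler line moves between releases. A pattern-based replacement is more robust; a minimal sketch, using a hypothetical `demo_macros` file standing in for `trunk/NDHMS/macros`:

```shell
# Replace the Fortran MPI wrapper by matching the COMPILER90 assignment
# rather than a hard-coded line number, so the edit survives upstream
# reshuffling of the macros file. demo_macros is a stand-in for the
# real trunk/NDHMS/macros file.
printf 'RMD = rm -f\nCOMPILER90 = mpif90\nFORMAT_FREE = -FR\n' > demo_macros
sed -i 's/^\(COMPILER90[[:space:]]*=[[:space:]]*\).*/\1mpiifx/' demo_macros
grep '^COMPILER90' demo_macros   # verify the change took effect
```

Applied to the real file, only the `sed` and `grep` lines are needed.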

macros:

```
.IGNORE:

ifeq ($(SPATIAL_SOIL),1)
SPATIAL_SOIL = -DSPATIAL_SOIL
else
SPATIAL_SOIL =
endif

ifeq ($(HYDRO_REALTIME),1)
HYDRO_REALTIME = -DHYDRO_REALTIME
else
HYDRO_REALTIME =
endif

ifeq ($(WRF_HYDRO),1)
WRF_HYDRO = -DWRF_HYDRO $(HYDRO_REALTIME)
else
WRF_HYDRO =
endif

ifeq ($(WRF_HYDRO_RAPID),1)
WRF_HYDRO = -DWRF_HYDRO -DWRF_HYDRO_RAPID $(HYDRO_REALTIME)
endif

ifeq ($(HYDRO_D),1)
HYDRO_D = -DHYDRO_D $(WRF_HYDRO)
else
HYDRO_D = $(WRF_HYDRO)
endif

ifeq ($(WRF_HYDRO_NUDGING),1)
WRF_HYDRO_NUDGING = -DWRF_HYDRO_NUDGING
else
WRF_HYDRO_NUDGING =
endif

ifeq ($(OUTPUT_CHAN_CONN),1)
OUTPUT_CHAN_CONN = -DOUTPUT_CHAN_CONN
else
OUTPUT_CHAN_CONN =
endif

ifeq ($(PRECIP_DOUBLE),1)
PRECIP_DOUBLE = -DPRECIP_DOUBLE
else
PRECIP_DOUBLE =
endif

ifeq ($(NWM_META),1)
NWM_META = -DNWM_META
else
NWM_META =
endif

ifeq ($(NCEP_WCOSS),1)
NCEP_WCOSS = -DNCEP_WCOSS
else
NCEP_WCOSS =
endif

RMD = rm -f
COMPILER90 = mpiifx
FORMAT_FREE = -FR
BYTESWAPIO = -convert big_endian
F90FLAGS = -O2 -g -w -c -ftz -align all -fno-alias -fp-model precise $(FORMAT_FREE) $(BYTESWAPIO)
MODFLAG = -I./ -I ../../MPP -I ../MPP -I ../mod
LDFLAGS =
CPPINVOKE = -fpp
CPPFLAGS = -DMPP_LAND -I ../Data_Rec $(HYDRO_D) $(SPATIAL_SOIL) $(NWM_META) $(WRF_HYDRO_NUDGING) $(OUTPUT_CHAN_CONN) $(PRECIP_DOUBLE) $(NCEP_WCOSS)
LIBS =
NETCDFINC = $(NETCDF_INC)
NETCDFLIB = -L$(NETCDF_LIB) -lnetcdff -lnetcdf
```


This fails to build using the method above, despite the same approach working for WRF.



## Your Environment
* Version: 5.2
* Operating System and version: Ubuntu 22.04.3
* Compiler and version: Intel LLVM 2024
@HathewayWill (Author)

Having problems building WRF-Hydro with the new Intel LLVM compilers.

For reference, I used to use the configure method and just change the compiler to Intel Classic (ifort/icc), and it would work just fine. With the new Intel LLVM compilers, something has changed.

Attached is the build folder with all the CMake logs. The only thing I can think of is that some flags are no longer supported by Intel LLVM, and that is the source of the issue.

I ask that someone from NCAR take a look and see whether Intel LLVM can be added to the configure and macros files, since Intel Classic is now a paid feature for new users and Intel LLVM is the way forward for all HPC, according to Intel.

build_intel_llvm_cmake.zip

Thanks,
WH

@rcabell (Collaborator) commented Sep 19, 2024

Hi Will,

My local HPC cluster doesn't have Intel oneAPI 2021 available for testing, but with 2023.2.1 it builds and runs successfully. You may want to try a newer version and see if you are still having issues.

EDIT: I also tried with Intel OneAPI v2024.2.1 and it works for me as well.

@HathewayWill (Author)

Hi @rcabell

Can you tell me the steps you took and the commands you issued?

Maybe I can reproduce your results; there's a good chance I'm missing a step.

@HathewayWill (Author)

@rcabell For reference, I'm using the 2024 version of the compilers.

@rcabell (Collaborator) commented Sep 19, 2024

Hi @HathewayWill -- I noticed the /opt/intel/oneapi/mpi/2021.13/ in your logs, hence my assumption that you were using an older version.

To build, I followed the steps in docs/BUILD.md for a CMake build:

```
$ mkdir build
$ cd build
$ cmake ..
$ make -j 4
```

Intel OneAPI v2024.2.1, netCDF v4.9.2, MPICH v3.4a2

@HathewayWill (Author) commented Sep 19, 2024

> Hi @HathewayWill -- I noticed the /opt/intel/oneapi/mpi/2021.13/ in your logs, hence my assumption you were using an older version.
>
> To build, I followed the steps in docs/BUILD.md for a CMake build:
>
> ```
> $ mkdir build
> $ cd build
> $ cmake ..
> $ make -j 4
> ```
>
> Intel OneAPI v2024.2.1, netCDF v4.9.2, MPICH v3.4a2

Here's my environment:

ADVISOR_2024_DIR=/opt/intel/oneapi/advisor/2024.3
APM=/opt/intel/oneapi/advisor/2024.3/perfmodels
CC=icx
CCL_CONFIGURATION=cpu_gpu_dpcpp
CCL_CONFIGURATION_PATH=
CCL_ROOT=/opt/intel/oneapi/ccl/2021.13
_CE_CONDA=
_CE_M=
CFLAGS=-fPIC -fPIE -O3 -Wno-implicit-function-declaration -Wno-incompatible-function-pointer-types -Wno-unused-command-line-argument
CLASSPATH=/opt/intel/oneapi/mpi/2021.13/share/java/mpi.jar
CMAKE_PREFIX_PATH=/opt/intel/oneapi/tbb/2021.13/env/..:/opt/intel/oneapi/mkl/2024.2/lib/cmake:/opt/intel/oneapi/ipp/2021.12/lib/cmake/ipp:/opt/intel/oneapi/dpl/2022.6/lib/cmake/oneDPL:/opt/intel/oneapi/dnnl/2024.2/lib/cmake:/opt/intel/oneapi/dal/2024.7:/opt/intel/oneapi/compiler/2024.2
CMPLR_ROOT=/opt/intel/oneapi/compiler/2024.2
COLORTERM=truecolor
CONDA_DEFAULT_ENV=intelpython-python3.9
CONDA_EXE=/opt/intel/oneapi/intelpython/python3.9/bin/conda
CONDA_PREFIX=/opt/intel/oneapi/intelpython/python3.9
CONDA_PROMPT_MODIFIER=(intelpython-python3.9) 
CONDA_PYTHON_EXE=/opt/intel/oneapi/intelpython/python3.9/bin/python
CONDA_SHLVL=1
CPATH=/opt/intel/oneapi/tbb/2021.13/env/../include:/opt/intel/oneapi/mpi/2021.13/include:/opt/intel/oneapi/mkl/2024.2/include:/opt/intel/oneapi/ippcp/2021.12/include:/opt/intel/oneapi/ipp/2021.12/include:/opt/intel/oneapi/dpl/2022.6/include:/opt/intel/oneapi/dpcpp-ct/2024.2/include:/opt/intel/oneapi/dnnl/2024.2/include:/opt/intel/oneapi/dev-utilities/2024.2/include:/opt/intel/oneapi/dal/2024.7/include:/opt/intel/oneapi/ccl/2021.13/include
CPU_6CORE=6
CPU_CORE=32
CPU_QUARTER=8
CPU_QUARTER_EVEN=8
CXX=icpx
DAL_MAJOR_BINARY=2
DAL_MINOR_BINARY=0
DALROOT=/opt/intel/oneapi/dal/2024.7
DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1000/bus
DEBUGINFOD_URLS=https://debuginfod.ubuntu.com 
DESKTOP_SESSION=ubuntu
DIAGUTIL_PATH=/opt/intel/oneapi/dpcpp-ct/2024.2/etc/dpct/sys_check/sys_check.sh:/opt/intel/oneapi/debugger/2024.2/etc/debugger/sys_check/sys_check.py:/opt/intel/oneapi/compiler/2024.2/etc/compiler/sys_check/sys_check.sh
DISPLAY=:1
DNNLROOT=/opt/intel/oneapi/dnnl/2024.2
DPL_ROOT=/opt/intel/oneapi/dpl/2022.6
F77=ifx
F90=ifx
FCFLAGS=-m64
FC=ifx
FFLAGS=-m64
FI_PROVIDER_PATH=/opt/intel/oneapi/mpi/2021.13/opt/mpi/libfabric/lib/prov:/usr/lib/x86_64-linux-gnu/libfabric
GDB_INFO=/opt/intel/oneapi/debugger/2024.2/share/info/
GDMSESSION=ubuntu
GNOME_DESKTOP_SESSION_ID=this-is-deprecated
GNOME_SHELL_SESSION_MODE=ubuntu
GNOME_TERMINAL_SCREEN=/org/gnome/Terminal/screen/b4d30638_3788_429a_ae1e_b8020f033c65
GNOME_TERMINAL_SERVICE=:1.104
GPG_AGENT_INFO=/run/user/1000/gnupg/S.gpg-agent:0:1
GSM_SKIP_SSH_AGENT_WORKAROUND=true
GTK_MODULES=gail:atk-bridge
HDF5_Sub_Version=3
HDF5_Version=1.14.4
HOME=/home/workhorse
I_MPI_ROOT=/opt/intel/oneapi/mpi/2021.13
INFOPATH=/opt/intel/oneapi/debugger/2024.2/share/info
INTEL_PYTHONHOME=/opt/intel/oneapi/debugger/2024.2/opt/debugger
IPPCP_TARGET_ARCH=intel64
IPPCRYPTOROOT=/opt/intel/oneapi/ippcp/2021.12
IPPROOT=/opt/intel/oneapi/ipp/2021.12
IPP_TARGET_ARCH=intel64
Jasper_Version=1.900.1
LANG=en_US.UTF-8
LD_LIBRARY_PATH=/opt/intel/oneapi/tbb/2021.13/env/../lib/intel64/gcc4.8:/opt/intel/oneapi/mpi/2021.13/opt/mpi/libfabric/lib:/opt/intel/oneapi/mpi/2021.13/lib:/opt/intel/oneapi/mkl/2024.2/lib:/opt/intel/oneapi/ippcp/2021.12/lib/:/opt/intel/oneapi/ipp/2021.12/lib:/opt/intel/oneapi/dpl/2022.6/lib:/opt/intel/oneapi/dnnl/2024.2/lib:/opt/intel/oneapi/debugger/2024.2/opt/debugger/lib:/opt/intel/oneapi/dal/2024.7/lib:/opt/intel/oneapi/compiler/2024.2/opt/compiler/lib:/opt/intel/oneapi/compiler/2024.2/lib:/opt/intel/oneapi/ccl/2021.13/lib/
LESSCLOSE=/usr/bin/lesspipe %s %s
LESSOPEN=| /usr/bin/lesspipe %s
Libpng_Version=1.6.39
LIBRARY_PATH=/opt/intel/oneapi/tbb/2021.13/env/../lib/intel64/gcc4.8:/opt/intel/oneapi/mpi/2021.13/lib:/opt/intel/oneapi/mkl/2024.2/lib/:/opt/intel/oneapi/ippcp/2021.12/lib/:/opt/intel/oneapi/ipp/2021.12/lib:/opt/intel/oneapi/dpl/2022.6/lib:/opt/intel/oneapi/dnnl/2024.2/lib:/opt/intel/oneapi/dal/2024.7/lib:/opt/intel/oneapi/compiler/2024.2/lib:/opt/intel/oneapi/ccl/2021.13/lib/
LOGNAME=workhorse
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=00:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.avif=01;35:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:*~=00;90:*#=00;90:*.bak=00;90:*.crdownload=00;90:*.dpkg-dist=00;90:*.dpkg-new=00;90:*.dpkg-old=00;90:*.dpkg-tmp=00;90:*.old=00;90:*.orig=00;90:*.part=00;90:*.rej=00;90:*.rpmnew=00;90:*.rpmorig=00;90:*.rpmsave=00;90:*.swp=00;90:*.tmp=00;90:*.ucf-dist=00;90:*.ucf-new=00;90:*.ucf-old=00;90:
MANPATH=/opt/intel/oneapi/mpi/2021.13/share/man:/opt/intel/oneapi/debugger/2024.2/share/man:/opt/intel/oneapi/compiler/2024.2/share/man:
MEMORY_PRESSURE_WATCH=/sys/fs/cgroup/user.slice/user-1000.slice/[email protected]/session.slice/[email protected]/memory.pressure
MEMORY_PRESSURE_WRITE=c29tZSAyMDAwMDAgMjAwMDAwMAA=
METPLUS_DATA=5.1
METPLUS_Version=5.1.0
met_VERSION_number=11.1
met_Version_number=11.1.1
MKLROOT=/opt/intel/oneapi/mkl/2024.2
MPICC=mpiicx
Mpich_Version=4.2.2
MPICXX=mpiicpc
MPIF77=mpiifx
MPIF90=mpiifx
MPIFC=mpiifx
Netcdf_C_Version=4.9.2
Netcdf_Fortran_Version=4.6.1
NLSPATH=/opt/intel/oneapi/mkl/2024.2/share/locale/%l_%t/%N:/opt/intel/oneapi/compiler/2024.2/lib/compiler/locale/%l_%t/%N
OCL_ICD_FILENAMES=libintelocl.so
OCL_ICD_FILENAMES_RESET=1
OCL_ICD_FILENAMES_SAVED=/opt/intel/oneapi/compiler/2024.2/lib/libintelocl.so
OCL_ICD_VENDORS=/opt/intel/oneapi/intelpython/python3.9/etc/OpenCL/vendors
OCL_ICD_VENDORS_RESET=1
ONEAPI_ROOT=/opt/intel/oneapi
PASSWD=
PATH=/opt/intel/oneapi/vtune/2024.3/bin64:/opt/intel/oneapi/mpi/2021.13/bin:/opt/intel/oneapi/mkl/2024.2/bin/:/opt/intel/oneapi/intelpython/python3.9/bin:/opt/intel/oneapi/dpcpp-ct/2024.2/bin:/opt/intel/oneapi/dev-utilities/2024.2/bin:/opt/intel/oneapi/debugger/2024.2/opt/debugger/bin:/opt/intel/oneapi/compiler/2024.2/bin:/opt/intel/oneapi/advisor/2024.3/bin64:/home/workhorse/WRF_Intel/miniconda3/condabin:/home/workhorse/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin
PKG_CONFIG_PATH=/opt/intel/oneapi/vtune/2024.3/include/pkgconfig/lib64:/opt/intel/oneapi/tbb/2021.13/env/../lib/pkgconfig:/opt/intel/oneapi/mpi/2021.13/lib/pkgconfig:/opt/intel/oneapi/mkl/2024.2/lib/pkgconfig:/opt/intel/oneapi/ippcp/2021.12/lib/pkgconfig:/opt/intel/oneapi/dpl/2022.6/lib/pkgconfig:/opt/intel/oneapi/dnnl/2024.2/lib/pkgconfig:/opt/intel/oneapi/dal/2024.7/lib/pkgconfig:/opt/intel/oneapi/compiler/2024.2/lib/pkgconfig:/opt/intel/oneapi/ccl/2021.13/lib/pkgconfig/:/opt/intel/oneapi/advisor/2024.3/include/pkgconfig/lib64:
Pnetcdf_Version=1.13.0
PWD=/home/workhorse/Desktop
PYTHONPATH=/opt/intel/oneapi/advisor/2024.3/pythonapi
QT_ACCESSIBILITY=1
QT_IM_MODULE=ibus
SESSION_MANAGER=local/workhorse-MS-7D91:@/tmp/.ICE-unix/4119,unix/workhorse-MS-7D91:/tmp/.ICE-unix/4119
SETVARS_COMPLETED=1
SHELL=/bin/bash
SHLVL=1
SSH_AUTH_SOCK=/run/user/1000/keyring/ssh
SYSTEMD_EXEC_PID=4151
TBBROOT=/opt/intel/oneapi/tbb/2021.13/env/..
TERM=xterm-256color
USERNAME=workhorse
USER=workhorse
_=/usr/bin/env
VTE_VERSION=7600
VTUNE_PROFILER_2024_DIR=/opt/intel/oneapi/vtune/2024.3
VTUNE_PROFILER_DIR=/opt/intel/oneapi/vtune/2024.3
WINDOWPATH=2
WPS_VERSION=4.6.0
WRF_VERSION=4.6.0
XAUTHORITY=/run/user/1000/gdm/Xauthority
XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/etc/xdg
XDG_CURRENT_DESKTOP=ubuntu:GNOME
XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/:/var/lib/snapd/desktop
XDG_MENU_PREFIX=gnome-
XDG_RUNTIME_DIR=/run/user/1000
XDG_SESSION_CLASS=user
XDG_SESSION_DESKTOP=ubuntu
XDG_SESSION_TYPE=x11
XML_CATALOG_FILES=file:///opt/intel/oneapi/intelpython/python3.9/etc/xml/catalog file:///etc/xml/catalog
XMODIFIERS=@im=ibus
Zlib_Version=1.3.1

The commands I'm using are:

```
source /opt/intel/oneapi/setvars.sh --force

export NETCDF_INC=$DIR/NETCDF/include
export NETCDF_LIB=$DIR/NETCDF/lib
mkdir "${WRF_FOLDER}"/Hydro-Basecode
cd "${WRF_FOLDER}"/Downloads

wget -c https://github.com/NCAR/wrf_hydro_nwm_public/archive/refs/tags/v5.3.0.tar.gz -O WRFHYDRO.5.3.tar.gz
tar -xvzf WRFHYDRO.5.3.tar.gz -C "${WRF_FOLDER}"/Hydro-Basecode

cd "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public-5.3.0/trunk/NDHMS

mkdir build
cd build
cmake .. -DSPATIAL_SOIL=1 -DWRF_HYDRO=1 -DWRF_HYDRO_NUDGING=1 -DWRFIO_NCD_LARGE_FILE_SUPPORT=1 2>&1 | tee cmake.log
make -j $CPU_QUARTER_EVEN 2>&1 | tee make.log
```

Am I doing something wrong or issuing the commands in the wrong order?

WRF_HYDRO.txt

@rcabell (Collaborator) commented Sep 19, 2024

@HathewayWill - if you are still using the older v5.3.0 tag, you will need to change the cmake line to:

```
cmake ../trunk/NDHMS/ <ANY_OTHER_CMAKE ARGS>
```

The current main branch (which will soon be released as v5.4.0) simplifies the build somewhat.

@HathewayWill (Author)

@rcabell

Does cloning directly from GitHub fix that? If so, which method should I use for cloning?

@HathewayWill (Author)

> @HathewayWill - if you are still using the older v5.3.0 tag, you will need to change the cmake line to:
>
> cmake ../trunk/NDHMS/ <ANY_OTHER_CMAKE ARGS>
>
> The current main branch (which will soon be released as v5.4.0) simplifies the build somewhat.

```
export NETCDF_INC=$DIR/NETCDF/include
export NETCDF_LIB=$DIR/NETCDF/lib
mkdir "${WRF_FOLDER}"/Hydro-Basecode
cd "${WRF_FOLDER}"/Downloads

wget -c https://github.com/NCAR/wrf_hydro_nwm_public/archive/refs/tags/v5.3.0.tar.gz -O WRFHYDRO.5.3.tar.gz
tar -xvzf WRFHYDRO.5.3.tar.gz -C "${WRF_FOLDER}"/Hydro-Basecode

cd "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public-5.3.0/trunk/NDHMS

mkdir -p "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public-5.3.0/trunk/NDHMS/build
cd "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public-5.3.0/trunk/NDHMS/build

cmake .. -DSPATIAL_SOIL=1 -DWRF_HYDRO=1 -DWRF_HYDRO_NUDGING=1 -DWRFIO_NCD_LARGE_FILE_SUPPORT=1 2>&1 | tee cmake.log
make -j $CPU_QUARTER_EVEN 2>&1 | tee make.log
```

cmake.log
make.log

Still failing; not sure what I'm doing wrong. I may have to wait until 5.4 comes out.

@HathewayWill (Author)

@rcabell

Good news and bad news.

Good news: I found the error with the .tar.gz file. The tagged release cannot recognize the Intel LLVM compilers, which is what is causing the issue. When I issue these commands, it can see the compiler:

```
export NETCDF_INC=$DIR/NETCDF/include
export NETCDF_LIB=$DIR/NETCDF/lib
mkdir "${WRF_FOLDER}"/Hydro-Basecode
cd "${WRF_FOLDER}"/Hydro-Basecode
git clone https://github.com/NCAR/wrf_hydro_nwm_public.git
cd "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public/trunk/NDHMS

mkdir -p "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public/trunk/NDHMS/build
cd "${WRF_FOLDER}"/Hydro-Basecode/wrf_hydro_nwm_public/trunk/NDHMS/build
cmake .. -DSPATIAL_SOIL=1 -DWRF_HYDRO=1 -DWRF_HYDRO_NUDGING=1 -DWRFIO_NCD_LARGE_FILE_SUPPORT=1 2>&1 | tee cmake.log
```

Bad news:
There are a lot of -D<option> flags for CMake that seem to be unrecognized with the git-cloned version. See the log file below.

cmake.log

@scrasmussen (Member) commented Sep 19, 2024

I would try the following with the main branch.

```
$ git clone https://github.com/NCAR/wrf_hydro_nwm_public.git
$ cd wrf_hydro_nwm_public
$ mkdir build
$ cd build
$ cmake .. -DSPATIAL_SOIL=1 -DWRF_HYDRO=1 -DWRF_HYDRO_NUDGING=1 -DWRFIO_NCD_LARGE_FILE_SUPPORT=1
$ make -j
```

I think the issue with your main-branch build is that the main CMake file (the one that finds the MPI library, etc.) is in the top source directory, while you are using the CMakeLists.txt from trunk/NDHMS. (And the older releases don't have the newer, improved CMake files, so those might have issues with ifx.)

Note: I was able to get this building and running a test case with ifx. My build wants to default to ifort, but it did build with ifx when I added -DCMAKE_Fortran_COMPILER=ifx.
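One way to confirm which Fortran compiler CMake actually selected is to inspect CMakeCache.txt in the build directory after the configure step; a sketch (the printf just fakes a cache entry so the demo is self-contained — in a real build directory, only the grep is needed):

```shell
# CMakeCache.txt records the compiler CMake chose during configuration.
# Fake one cache entry for the demo; in a real build directory, skip the
# printf and grep the existing CMakeCache.txt instead.
printf 'CMAKE_Fortran_COMPILER:FILEPATH=/opt/intel/oneapi/compiler/2024.2/bin/ifx\n' > CMakeCache.txt
grep 'CMAKE_Fortran_COMPILER' CMakeCache.txt
```

If the path ends in ifort rather than ifx, re-run cmake in a clean build directory with -DCMAKE_Fortran_COMPILER=ifx.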

@HathewayWill (Author)

> note: I was able to get this building and running a test case with ifx. My build wants to default to ifort, but it did build with ifx when I added -DCMAKE_Fortran_COMPILER=ifx

Is that because there isn't a configure or macros entry for ifx?

@scrasmussen (Member)

> note: I was able to get this building and running a test case with ifx. My build wants to default to ifort, but it did build with ifx when I added -DCMAKE_Fortran_COMPILER=ifx
>
> is that because there isn't a configure or macros for ifx?

Good question, I'm not sure. I would guess that because Derecho has both ifort and ifx, the CMake build automatically defaults to the older one (which seems like the safer thing to do). CMake handles the compiler choice in the `project(WRF_Hydro LANGUAGES Fortran)` line.
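Since CMake picks whichever Fortran compiler it finds first, one hedged workaround when both toolchains are installed is to choose the compiler explicitly before configuring; a minimal sketch (the gfortran fallback is just an assumption for the demo, and the cmake command is echoed rather than run):

```shell
# Prefer the LLVM compiler (ifx) when it is on PATH, falling back to the
# classic compiler (ifort), then to gfortran as a generic demo fallback.
if command -v ifx >/dev/null 2>&1; then
  FCOMP=ifx
elif command -v ifort >/dev/null 2>&1; then
  FCOMP=ifort
else
  FCOMP=gfortran
fi
echo "configure with: cmake .. -DCMAKE_Fortran_COMPILER=$FCOMP"
```

Setting the FC environment variable before the first cmake run (e.g. `FC=ifx cmake ..`) also steers CMake's compiler detection.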

@HathewayWill (Author)

intel.llvm.zip
Not sure if that would be of interest to you, but that's what I made when I was trying to use the tagged version; it kinda worked.

Good news again: the commands you gave me seem to get farther along than before, but the build still fails.

hydro.llvm.log

@HathewayWill (Author) commented Sep 21, 2024

@rcabell

Found the solution:

```
# Source Intel oneAPI environment
source /opt/intel/oneapi/setvars.sh --force

# Set up NETCDF environment variables
export NETCDF_INC="$DIR/NETCDF/include"
export NETCDF_LIB="$DIR/NETCDF/lib"

# Create directories for Hydro Basecode and navigate to it
mkdir -p "${WRF_FOLDER}/Hydro-Basecode"
cd "${WRF_FOLDER}/Hydro-Basecode"

# Clone the WRF-Hydro repository and set up the build
git clone https://github.com/NCAR/wrf_hydro_nwm_public.git
cd wrf_hydro_nwm_public
mkdir -p build
cd build

# Run CMake configuration for WRF-Hydro with specified options
cmake .. \
  -DSPATIAL_SOIL=1 \
  -DWRF_HYDRO=1 \
  -DWRF_HYDRO_NUDGING=1 \
  -DWRFIO_NCD_LARGE_FILE_SUPPORT=1 \
  -DCMAKE_Fortran_COMPILER=ifx

# Compile using specified CPU settings
make -j "$CPU_QUARTER_EVEN" 2>&1 | tee make.log
```
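After a build like the one above, it's worth confirming from the log that the LLVM MPI wrapper actually compiled the sources; a sketch (the echo fakes one make.log line so the snippet is self-contained, and the .F90 filename is made up — against a real make.log from the tee above, only the grep is needed):

```shell
# Fake one compile line for the demo; a real make.log would contain many.
echo 'mpiifx -O2 -c some_module.F90' >> make.log
# Check that the LLVM wrapper (mpiifx), not mpiifort, ran the compiles.
grep -q 'mpiifx' make.log && echo "built with mpiifx"
```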
