
Add Fortran bindings #28

Merged (15 commits, Sep 19, 2016)
8 changes: 8 additions & 0 deletions ci_support/run_docker_build.sh
@@ -41,6 +41,14 @@ conda clean --lock
conda install --yes --quiet conda-forge-build-setup
source run_conda_forge_build_setup


# Install the yum requirements defined canonically in the
# "recipe/yum_requirements.txt" file. After updating that file,
# run "conda smithy rerender" and this line be updated
# automatically.
yum install -y devtoolset-2-gcc-gfortran


# Embarking on 1 case(s).
conda build /recipe_root --quiet || exit 1
/feedstock_root/ci_support/upload_or_check_non_existence.py /recipe_root conda-forge --channel=main || exit 1
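
For reference, the yum package installed above is declared in recipe/yum_requirements.txt rather than hard-coded in this script. That file is not shown in this diff; as a sketch, assuming conda-forge's one-package-name-per-line format and that this PR only needs the gfortran toolchain, it would contain:

    devtoolset-2-gcc-gfortran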
6 changes: 6 additions & 0 deletions recipe/README.md
@@ -0,0 +1,6 @@
About
=====

The C, C++, and Fortran files in this directory are used to test the
installation and are copied from the HDF5 library. The copyright and license
for these files are contained in the TEST_FILES_LICENSE.md file.
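
This diff does not show how the recipe drives these test files; a minimal sketch of the kind of test compile the build could run, where the exact commands are an assumption and only the h5cc/h5fc compiler wrappers shipped by HDF5 are taken as given, might be:

    # Compile and run the C and Fortran examples against the freshly built HDF5
    h5cc h5_cmprss.c -o h5_cmprss_c && ./h5_cmprss_c
    h5fc h5_cmprss.f90 -o h5_cmprss_f90 && ./h5_cmprss_f90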
41 changes: 41 additions & 0 deletions recipe/TEST_FILES_LICENSE.md
@@ -0,0 +1,41 @@
HDF5 (Hierarchical Data Format 5) Software Library and Utilities
Copyright 2006-2016 by The HDF Group.

NCSA HDF5 (Hierarchical Data Format 5) Software Library and Utilities
Copyright 1998-2006 by the Board of Trustees of the University of Illinois.

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted for any purpose (including commercial purposes)
provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice,
this list of conditions, and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions, and the following disclaimer in the documentation
and/or materials provided with the distribution.

3. In addition, redistributions of modified forms of the source or binary
code must carry prominent notices stating that the original code was
changed and the date of the change.

4. All publications or advertising materials mentioning features or use of
this software are asked, but not required, to acknowledge that it was
developed by The HDF Group and by the National Center for Supercomputing
Applications at the University of Illinois at Urbana-Champaign and
credit the contributors.

5. Neither the name of The HDF Group, the name of the University, nor the
name of any Contributor may be used to endorse or promote products derived
from this software without specific prior written permission from
The HDF Group, the University, or the Contributor, respectively.

DISCLAIMER:
THIS SOFTWARE IS PROVIDED BY THE HDF GROUP AND THE CONTRIBUTORS
"AS IS" WITH NO WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED. In no
event shall The HDF Group or the Contributors be liable for any damages
suffered by the users arising out of the use of this software, even if
advised of the possibility of such damage.

13 changes: 9 additions & 4 deletions recipe/build.sh 100644 → 100755
@@ -11,13 +11,18 @@ source activate "${CONDA_DEFAULT_ENV}"
if [ "$(uname)" == "Darwin" ]
then
export CXX="${CXX} -stdlib=libc++"
export DYLD_FALLBACK_LIBRARY_PATH=$PREFIX/lib
fi

Review comment (Member): Please add export LIBRARY_PATH="${PREFIX}/lib".

Reply (Member): Done below.

export LIBRARY_PATH="${PREFIX}/lib"

 ./configure --prefix="${PREFIX}" \
-            --enable-linux-lfs \
-            --with-zlib="${PREFIX}" \
-            --with-pthread=yes --enable-cxx \
-            --with-default-plugindir="${PREFIX}/lib/hdf5/plugin"
+            --enable-linux-lfs \
+            --with-zlib="${PREFIX}" \
+            --with-pthread=yes \
+            --enable-cxx \
+            --enable-fortran \
+            --with-default-plugindir="${PREFIX}/lib/hdf5/plugin"

make
make check
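
After the install step, a build configured with --enable-fortran should also ship the Fortran library and module files. A quick sanity check, which is not part of this diff and assumes HDF5's default autotools install layout for the file names, could be:

    # Confirm the Fortran bindings were built and installed (hypothetical check)
    ls "${PREFIX}"/lib/libhdf5_fortran* "${PREFIX}"/include/hdf5.mod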
125 changes: 125 additions & 0 deletions recipe/h5_cmprss.c
@@ -0,0 +1,125 @@
/* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
* Copyright by The HDF Group. *
* Copyright by the Board of Trustees of the University of Illinois. *
* All rights reserved. *
* *
* This file is part of HDF5. The full HDF5 copyright notice, including *
* terms governing use, modification, and redistribution, is contained in *
* the files COPYING and Copyright.html. COPYING can be found at the root *
* of the source code distribution tree; Copyright.html can be found at the *
* root level of an installed copy of the electronic HDF5 document set and *
* is linked from the top-level documents page. It can also be found at *
* http://hdfgroup.org/HDF5/doc/Copyright.html. If you do not have *
* access to either file, you may request a copy from help@hdfgroup.org. *
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * */

/*
* This example illustrates how to create a compressed dataset.
* It is used in the HDF5 Tutorial.
*/

#include "hdf5.h"

#define FILE "cmprss.h5"
#define RANK 2
#define DIM0 100
#define DIM1 20

int main () {

hid_t file_id, dataset_id, dataspace_id; /* identifiers */
hid_t plist_id;

size_t nelmts;
unsigned flags, filter_info;
H5Z_filter_t filter_type;

herr_t status;
hsize_t dims[2];
hsize_t cdims[2];

int idx;
int i,j, numfilt;
int buf[DIM0][DIM1];
int rbuf [DIM0][DIM1];

/* Uncomment these variables to use SZIP compression
unsigned szip_options_mask;
unsigned szip_pixels_per_block;
*/

/* Create a file. */
file_id = H5Fcreate (FILE, H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);


/* Create dataset "Compressed Data" in the group using absolute name. */
dims[0] = DIM0;
dims[1] = DIM1;
dataspace_id = H5Screate_simple (RANK, dims, NULL);

plist_id = H5Pcreate (H5P_DATASET_CREATE);

/* Dataset must be chunked for compression */
cdims[0] = 20;
cdims[1] = 20;
status = H5Pset_chunk (plist_id, 2, cdims);

/* Set ZLIB / DEFLATE Compression using compression level 6.
* To use SZIP Compression comment out these lines.
*/
status = H5Pset_deflate (plist_id, 6);

/* Uncomment these lines to set SZIP Compression
szip_options_mask = H5_SZIP_NN_OPTION_MASK;
szip_pixels_per_block = 16;
status = H5Pset_szip (plist_id, szip_options_mask, szip_pixels_per_block);
*/

dataset_id = H5Dcreate2 (file_id, "Compressed_Data", H5T_STD_I32BE,
dataspace_id, H5P_DEFAULT, plist_id, H5P_DEFAULT);

for (i = 0; i< DIM0; i++)
for (j=0; j<DIM1; j++)
buf[i][j] = i+j;

status = H5Dwrite (dataset_id, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

status = H5Sclose (dataspace_id);
status = H5Dclose (dataset_id);
status = H5Pclose (plist_id);
status = H5Fclose (file_id);

/* Now reopen the file and dataset in the file. */
file_id = H5Fopen (FILE, H5F_ACC_RDWR, H5P_DEFAULT);
dataset_id = H5Dopen2 (file_id, "Compressed_Data", H5P_DEFAULT);

/* Retrieve filter information. */
plist_id = H5Dget_create_plist (dataset_id);

numfilt = H5Pget_nfilters (plist_id);
printf ("Number of filters associated with dataset: %i\n", numfilt);

for (i=0; i<numfilt; i++) {
nelmts = 0;
filter_type = H5Pget_filter2 (plist_id, 0, &flags, &nelmts, NULL, 0, NULL,
&filter_info);
printf ("Filter Type: ");
switch (filter_type) {
case H5Z_FILTER_DEFLATE:
printf ("H5Z_FILTER_DEFLATE\n");
break;
case H5Z_FILTER_SZIP:
printf ("H5Z_FILTER_SZIP\n");
break;
default:
printf ("Other filter type included.\n");
}
}

status = H5Dread (dataset_id, H5T_NATIVE_INT, H5S_ALL, H5S_ALL,
H5P_DEFAULT, rbuf);

status = H5Dclose (dataset_id);
status = H5Pclose (plist_id);
status = H5Fclose (file_id);
}
131 changes: 131 additions & 0 deletions recipe/h5_cmprss.f90
@@ -0,0 +1,131 @@
! * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
! Copyright by The HDF Group. *
! Copyright by the Board of Trustees of the University of Illinois. *
! All rights reserved. *
! *
! This file is part of HDF5. The full HDF5 copyright notice, including *
! terms governing use, modification, and redistribution, is contained in *
! the files COPYING and Copyright.html. COPYING can be found at the root *
! of the source code distribution tree; Copyright.html can be found at the *
! root level of an installed copy of the electronic HDF5 document set and *
! is linked from the top-level documents page. It can also be found at *
! http://hdfgroup.org/HDF5/doc/Copyright.html. If you do not have *
! access to either file, you may request a copy from help@hdfgroup.org. *
! * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
!
! This example illustrates how to create a compressed dataset.
! It is used in the HDF5 Tutorial.
!
PROGRAM h5_cmprss

USE HDF5 ! This module contains all necessary modules

IMPLICIT NONE
!
! The dataset is stored in file "h5_cmprss.h5"
!
CHARACTER(LEN=12), PARAMETER :: filename = "h5_cmprss.h5"
INTEGER, PARAMETER :: rank = 2 ! Rank of the data set
INTEGER, PARAMETER :: dim0 = 100 ! Data set sizes
INTEGER, PARAMETER :: dim1 = 20

INTEGER(hid_t) :: file_id, dataset_id, dataspace_id ! Identifiers
INTEGER(hid_t) :: plist_id ! Property list identifier

INTEGER :: error
INTEGER(hsize_t), DIMENSION(1:rank) :: dims ! dimensions of data
INTEGER(hsize_t), DIMENSION(1:rank) :: cdims ! sizes of chunked data

INTEGER :: i,j, numfilt
INTEGER, DIMENSION(1:dim0,1:dim1) :: buf ! write buffer
INTEGER, DIMENSION(1:dim0,1:dim1) :: rbuf ! read buffer
INTEGER(HSIZE_T), DIMENSION(1:rank) :: data_dims ! dimensions of data buffers

INTEGER, DIMENSION(1:1) :: cd_values ! Auxiliary data for the filter
INTEGER(size_t) :: nelmts ! Number of elements in cd_values
INTEGER :: flags ! Bit vector specifying certain general properties of the filter
INTEGER(SIZE_T) :: namelen = 180 ! Anticipated number of characters in name
CHARACTER(LEN=180) :: name ! Name of the filter
INTEGER :: filter_id ! Filter identification number

! Uncomment these variables to use SZIP compression
!INTEGER :: szip_options_mask
!INTEGER :: szip_pixels_per_block

!
!Initialize FORTRAN predefined datatypes
!
CALL h5open_f(error)
!
! Create a file
CALL h5fcreate_f(filename, H5F_ACC_TRUNC_F, file_id, error)
!
! Create dataset "Compressed Data" in the group using absolute name.
dims(1:2) = (/dim0, dim1/)
CALL h5screate_simple_f(rank, dims, dataspace_id, error)
CALL h5pcreate_f(H5P_DATASET_CREATE_F, plist_id, error)
!
! Dataset must be chunked for compression
cdims(1:2) = 20
CALL h5pset_chunk_f(plist_id, 2, cdims, error)

! Set ZLIB / DEFLATE Compression using compression level 6.
! To use SZIP Compression comment out these lines.
CALL h5pset_deflate_f(plist_id, 6, error)

! Uncomment these lines to set SZIP Compression
!szip_options_mask = H5_SZIP_NN_OM_F
!szip_pixels_per_block = 16
!CALL H5Pset_szip_f(plist_id, szip_options_mask, szip_pixels_per_block, error)

! Create data set
CALL h5dcreate_f(file_id, "Compressed_Data", H5T_NATIVE_INTEGER, dataspace_id, &
dataset_id, error, dcpl_id=plist_id)

DO j = 1, dim1
DO i = 1, dim0
buf(i,j) = i+j
ENDDO
ENDDO

data_dims(1:2) = (/dim0,dim1/)
CALL h5dwrite_f(dataset_id, H5T_NATIVE_INTEGER, buf, data_dims, error)

! Close resources
CALL h5sclose_f(dataspace_id, error)
CALL h5pclose_f(plist_id, error)
CALL h5dclose_f(dataset_id, error)
CALL h5fclose_f(file_id, error)

! Now reopen the file and dataset in the file.
CALL h5fopen_f(filename, H5F_ACC_RDONLY_F, file_id, error)
CALL h5dopen_f(file_id, "Compressed_Data", dataset_id, error)

! Retrieve filter information.
CALL h5dget_create_plist_f(dataset_id, plist_id, error)

CALL h5pget_nfilters_f(plist_id, numfilt, error)
WRITE(*,'(A, I0)') "Number of filters associated with dataset: ", numfilt

DO i = 1, numfilt
nelmts = 1
CALL h5pget_filter_f(plist_id, 0, flags, nelmts, cd_values, &
namelen, name, filter_id, error)

WRITE(*,'(30X,A)', ADVANCE='NO')"Filter Type: "
IF(filter_id.EQ.H5Z_FILTER_DEFLATE_F)THEN
WRITE(*,'(A)') "H5Z_FILTER_DEFLATE"
ELSEIF (filter_id.EQ.H5Z_FILTER_SZIP_F)THEN
WRITE(*,'(A)') "H5Z_FILTER_SZIP"
ELSE
WRITE(*,'(A)') "Other filter type included"
ENDIF
ENDDO
data_dims(1:2) = (/dim0,dim1/)
CALL h5dread_f(dataset_id, H5T_NATIVE_INTEGER, rbuf, data_dims, error)

CALL h5dclose_f(dataset_id, error)
CALL h5pclose_f(plist_id, error)
CALL h5fclose_f(file_id, error)

END PROGRAM h5_cmprss