
hdf5-feedstock's Introduction

About hdf5-feedstock

Feedstock license: BSD-3-Clause

Home: https://www.hdfgroup.org/solutions/hdf5/

Package license: BSD-3-Clause

Summary: HDF5 is a data model, library, and file format for storing and managing data

Development: https://github.com/HDFGroup/hdf5

Documentation: https://portal.hdfgroup.org/display/HDF5/HDF5

Current build status

Azure
Variant    Status
linux_64_mpimpich variant
linux_64_mpinompi variant
linux_64_mpiopenmpi variant
linux_aarch64_mpimpich variant
linux_aarch64_mpinompi variant
linux_aarch64_mpiopenmpi variant
linux_ppc64le_mpimpich variant
linux_ppc64le_mpinompi variant
linux_ppc64le_mpiopenmpi variant
osx_64_mpimpich variant
osx_64_mpinompi variant
osx_64_mpiopenmpi variant
osx_arm64_mpimpich variant
osx_arm64_mpinompi variant
osx_arm64_mpiopenmpi variant
win_64_mpiimpi variant
win_64_mpinompi variant

Current release info

Badges: Conda Recipe, Conda Downloads, Conda Version, Conda Platforms

Installing hdf5

Installing hdf5 from the conda-forge channel can be achieved by adding conda-forge to your channels with:

conda config --add channels conda-forge
conda config --set channel_priority strict

Once the conda-forge channel has been enabled, hdf5 can be installed with conda:

conda install hdf5

or with mamba:

mamba install hdf5

It is possible to list all of the versions of hdf5 available on your platform with conda:

conda search hdf5 --channel conda-forge

or with mamba:

mamba search hdf5 --channel conda-forge

Alternatively, mamba repoquery may provide more information:

# Search all versions available on your platform:
mamba repoquery search hdf5 --channel conda-forge

# List packages depending on `hdf5`:
mamba repoquery whoneeds hdf5 --channel conda-forge

# List dependencies of `hdf5`:
mamba repoquery depends hdf5 --channel conda-forge

About conda-forge

Powered by NumFOCUS

conda-forge is a community-led conda channel of installable packages. In order to provide high-quality builds, the process has been automated into the conda-forge GitHub organization. The conda-forge organization contains one repository for each of the installable packages. Such a repository is known as a feedstock.

A feedstock is made up of a conda recipe (the instructions on what and how to build the package) and the necessary configurations for automatic building using freely available continuous integration services. Thanks to the awesome service provided by Azure, GitHub, CircleCI, AppVeyor, Drone, and TravisCI, it is possible to build and upload installable packages to the conda-forge anaconda.org channel for Linux, Windows, and OSX.

To manage the continuous integration and simplify feedstock maintenance conda-smithy has been developed. Using the conda-forge.yml within this repository, it is possible to re-render all of this feedstock's supporting files (e.g. the CI configuration files) with conda smithy rerender.

For more information please check the conda-forge documentation.

Terminology

feedstock - the conda recipe (raw material), supporting scripts and CI configuration.

conda-smithy - the tool which helps orchestrate the feedstock. Its primary use is in the construction of the CI .yml files and the simplification of managing many feedstocks.

conda-forge - the place where the feedstock and smithy live and work to produce the finished article (built conda distributions)

Updating hdf5-feedstock

If you would like to improve the hdf5 recipe or build a new package version, please fork this repository and submit a PR. Upon submission, your changes will be run on the appropriate platforms to give the reviewer an opportunity to confirm that the changes result in a successful build. Once merged, the recipe will be re-built and uploaded automatically to the conda-forge channel, whereupon the built conda packages will be available for everybody to install and use from the conda-forge channel. Note that all branches in the conda-forge/hdf5-feedstock are immediately built and any created packages are uploaded, so PRs should be based on branches in forks and branches in the main repository should only be used to build distinct package versions.

In order to produce a uniquely identifiable distribution:

  • If the version of a package is not being increased, please add or increase the build/number.
  • If the version of a package is being increased, please remember to return the build/number back to 0.
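
For reference, a minimal sketch of that workflow under the PR process described above; the branch name is a placeholder, and the build number lives under build/number in recipe/meta.yaml:

```
# Sketch only: bump the build number in a fork, re-render, then open a PR.
git checkout -b bump-build-number
# edit recipe/meta.yaml: increase build/number (or reset it to 0 on a version bump)
conda smithy rerender
git commit -am "Bump build number and rerender"
git push origin bump-build-number   # then open the PR against conda-forge/hdf5-feedstock
```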

Feedstock Maintainers

hdf5-feedstock's People

Contributors

adamcpovey, astrofrog, ax3l, beckermr, conda-forge-admin, conda-forge-curator[bot], conda-forge-webservices[bot], davidbrochart, djhoese, erykoff, gillins, github-actions[bot], hmaarrfk, isuruf, jakirkham, jjhelmus, jschueller, marcelotrevisani, mardiehl, mariusvniekerk, minrk, mkitti, ocefpaf, paulkmueller, qwhelan, regro-cf-autotick-bot, scopatz, varlackc, visr, wasade


hdf5-feedstock's Issues

Update HDF5 ecosystem to 1.8.18

After merging #68, I realized that almost all conda-forge packages have pinned their recipes to 1.8.17|1.8.17.*. As hdf5 is cautious and checks that version numbers are the same for headers and shared objects, we'll need to update the pinned versions to allow a user to use 1.8.18.

I identified all conda-forge packages using hdf5 by checking out https://github.com/conda-forge/feedstocks, running git submodule init && git submodule update, and then executing the following:

grep hdf5 * -R | awk -F/ '{print $1}' | sort | uniq > hdf5.txt

I will be creating PRs for each library below and checking them off as they get merged:

Add patch to support Unicode filenames in windows

Hello,

The HDF5 library currently does not support filenames with unicode characters.
This can be seen on these threads: here and here.

This can be easily done through a patch. At the company that I work for, we have our own patched version of HDF5, but we're moving to conda-forge and we want to contribute it back.

Please ignore references to our issue tracker. The patch we use is this:

From d8e1bde709d643cabc2433164ca274fc77602a6c Mon Sep 17 00:00:00 2001
From: Tiago de Holanda Cunha Nobrega <[email protected]>
Date: Fri, 21 Nov 2014 14:46:46 -0200
Subject: [PATCH] Patch code to use _wopen() on windows

Reference:
http://hdf-forum.184993.n3.nabble.com/Problems-when-using-hdf5-on-non-English-windows-td4027373.html#a4027377

EDEN-851

---
 src/H5FDwindows.c    | 15 +++++++++++++++
 src/H5win32defs.h    |  6 +++++-
 windows/copy_hdf.bat |  4 ++--
 3 files changed, 22 insertions(+), 3 deletions(-)

diff --git src/H5FDwindows.c src/H5FDwindows.c
index 28434ec..28a3c81 100644
--- src/H5FDwindows.c
+++ src/H5FDwindows.c
@@ -24,8 +24,23 @@
 #include "H5MMprivate.h"    /* Memory management        */
 #include "H5Pprivate.h"     /* Property lists           */

+#include <stdio.h>
+
 #ifdef H5_HAVE_WINDOWS

+int win_open_patched(const char *name, int oflag, int pmode)
+{
+    // Patch to enable the opening of unicode paths
+    // See: EDEN-851
+    int fd = -1;
+    int name_len = strlen(name);
+    wchar_t* wname = (wchar_t*)malloc( sizeof(wchar_t)*(name_len + 1) );
+    MultiByteToWideChar( CP_UTF8, 0, name, -1, wname, name_len + 1 );
+    fd=_wopen(wname, oflag, pmode);
+    free(wname);
+    return fd;
+}
+
 
 /*-------------------------------------------------------------------------
  * Function:    H5Pset_fapl_windows
diff --git src/H5win32defs.h src/H5win32defs.h
index 5f886d1..9ca0e72 100644
--- src/H5win32defs.h
+++ src/H5win32defs.h
@@ -47,7 +47,11 @@ typedef __int64             h5_stat_size_t;
 /* _O_BINARY must be set in Windows to avoid CR-LF <-> LF EOL
  * transformations when performing I/O.
  */
-#define HDopen(S,F,M)       _open(S,F|_O_BINARY,M)
+//#define HDopen(S,F,M)       _open(S,F|_O_BINARY,M)
+// EDEN-851: Patch hdf5 to open unicode file paths.
+H5_DLL int win_open_patched(const char *name, int oflag, int pmode);
+#define HDopen(S,F,M)       win_open_patched(S,F|_O_BINARY,M)
+
 #define HDread(F,M,Z)       _read(F,M,Z)
 #define HDsetvbuf(F,S,M,Z)  setvbuf(F,S,M,(Z>1?Z:2))
 #define HDsleep(S)          Sleep(S*1000)
-- 
2.6.3.windows.1

Would you be willing to add this to the feedstock? I can contribute it with a PR.

gzip unavailable

@ericdill wrote:

Hi All,

I am running into an issue with the h5py and hdf5 packages from anaconda.org/conda-forge. It appears that the gzip compression filter is not behaving properly, as I am getting exceptions in my test suite if I use h5py and hdf5 from anaconda.org/conda-forge.

Packages from conda-forge:
    h5py:            2.6.0-np110py34_1             conda-forge 
    hdf5:            1.8.16-0                      conda-forge 

Errors are:

    ValueError: Compression filter "gzip" is unavailable   

And the relevant travis-ci log: https://travis-ci.org/NSLS-II/suitcase/jobs/119284041

If I remove conda-forge from my ~/.condarc and install h5py and hdf5 from anaconda.org then my test suite is happy again.

Relevant travis-ci log for working build after I comment out conda-forge in my condarc: https://travis-ci.org/NSLS-II/suitcase/builds/119289079

I would imagine that this is a problem with the conda-forge h5py and hdf5 packages?

Not really sure what to do here...

Thoughts?

Best,

Eric

Use CMake build for Linux as well

If CMake is used for downstream projects, find_package(HDF5) is inconsistent across platforms.

  1. On Windows, since CMake is used to build the package, a CMake package config file is generated and find_package(HDF5) uses the hdf5-config.cmake that exports an HDF5 target. Using HDF5 is simple:
target_link_libraries(my_program
    hdf5::hdf5-static
)
  2. On Linux, since autoconf is used, find_package(HDF5) works the old way and uses FindHDF5.cmake, which does not export a target. Using HDF5 requires more verbose CMake code:
target_include_directories(my_program PRIVATE
    ${HDF5_INCLUDE_DIRS}
)

target_compile_definitions(my_program PRIVATE
    ${HDF5_DEFINITIONS}
)

target_link_libraries(my_program
    ${HDF5_LIBRARIES}
)

Is there any reason not to use CMake for both platforms in the first place?
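
A hedged way to check which of these situations applies to an installed hdf5 package (paths assume a conda environment; the file names are the usual CMake package-config files):

```
# Does the installed hdf5 ship a CMake package config (exported targets),
# or only libraries that FindHDF5.cmake will discover the old way?
find "$CONDA_PREFIX" \( -iname "hdf5-config.cmake" -o -iname "hdf5-targets.cmake" \) 2>/dev/null
```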


Environment (conda list):
```
$ conda list
hdf5                      1.10.4               hb1b8bf9_0
```

Details about conda and system ( conda info ):
$ conda info
          conda version : 4.6.2
    conda-build version : not installed
         python version : 3.7.1.final.0
               platform : linux-64
             user-agent : conda/4.6.2 requests/2.21.0 CPython/3.7.1 Linux/4.4.0-17763-Microsoft debian/9 glibc/2.24
                UID:GID : 1000:1000
             netrc file : None
           offline mode : False

install failing due to links

Install is failing at:

IOError: [Errno 2] No such file or directory: '/Users/xxx/anaconda/pkgs/hdf5-1.8.17-1/bin/activate'

A quick look shows that these have been installed as links (to nonexistent directories)
lrwxr-xr-x  1 xxxx  staff      37 Jun 21 19:03 activate -> /Users/travis/miniconda3/bin/activate
lrwxr-xr-x  1 xxxx  staff      34 Jun 21 19:03 conda -> /Users/travis/miniconda3/bin/conda
lrwxr-xr-x  1 xxxx  staff      39 Jun 21 19:03 deactivate -> /Users/travis/miniconda3/bin/deactivate

conda info
Using Anaconda Cloud api site https://api.anaconda.org
Current conda install:

             platform : osx-64
        conda version : 4.1.2
  conda-build version : 0+unknown
       python version : 2.7.11.final.0
     requests version : 2.9.1
     root environment : /Users/xxxx/anaconda  (writable)
  default environment : /Users/xxxx/anaconda
     envs directories : /Users/xxxx/anaconda/envs
        package cache : /Users/xxxx/anaconda/pkgs
         channel URLs : https://conda.anaconda.org/conda-forge/osx-64/
                        https://conda.anaconda.org/conda-forge/noarch/
                        https://repo.continuum.io/pkgs/free/osx-64/
                        https://repo.continuum.io/pkgs/free/noarch/
                        https://repo.continuum.io/pkgs/pro/osx-64/
                        https://repo.continuum.io/pkgs/pro/noarch/
          config file : /Users/xxxx/.condarc
    is foreign system : False

Add szip support

Issue: szip is a compression module available in the HDF5 distribution. Please enable it.

Enabling it should be fairly straightforward when using CMake, by adding this flag:
-D HDF5_ENABLE_S_LIB_SUPPORT:BOOL=ON
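
A minimal sketch of what the corresponding configure step in the recipe's build script might look like; the exact option name should be verified against the HDF5 release being built (recent HDF5 CMake builds call it HDF5_ENABLE_SZIP_SUPPORT), and $SRC_DIR/$PREFIX follow conda-build conventions:

```
# Sketch only, assuming the recipe's CMake-based build path.
cmake "$SRC_DIR" \
  -DCMAKE_INSTALL_PREFIX="$PREFIX" \
  -DHDF5_ENABLE_Z_LIB_SUPPORT:BOOL=ON \
  -DHDF5_ENABLE_SZIP_SUPPORT:BOOL=ON
```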

Add "external" dummy builds

Comment:

This feedstock has been tremendously valuable for building stacks for Dedalus, but in some cases when building on HPC systems, it may be preferable to use an existing HDF5 build from the cluster. However, this currently makes it hard to use other conda builds that rely on HDF5, since conda tries to install its own HDF5 binaries.

I think this could be avoided by creating "external" dummy builds on this feedstock like what has been done with MPI. Then upstream conda builds that rely on HDF5 should work with either the true builds from the feedstock, or just pass through to the existing system libraries if the user pre-installs the dummy build.

I'm still pretty new to conda forge builds, so I might eventually be able to try this myself and open a PR, but if it's trivial for anyone else to do in the meantime, it's a feature that I think would be greatly appreciated!
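
For context, conda-forge's MPI packages already follow this pattern; a hedged sketch of how such an "external" build is selected via the build string (the convention below is the one mpich uses, and an hdf5 dummy build would presumably be selected the same way):

```
# Install the "external" dummy build of mpich, which defers to a system MPI;
# downstream packages then link against whatever the cluster provides.
conda install "mpich=*=external_*"
```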

Read-only S3 driver (ros3) doesn't seem to work

I created #122 and #124 so I could play with the S3 reading ability of HDF5. I tried using a public S3 bucket to inspect a NetCDF4 file with no luck. I then started looking at one of the files here: https://www.hdfgroup.org/solutions/enterprise-support/cloud-amazon-s3-storage-hdf5-connector/ but got the same errors. Here's a command from the example video that should work:

h5dump -pBH --filedriver=ros3 "https://s3.amazonaws.com/hdfgroup/data/hdf5demo/snp500.h5"  

But it just produces:

h5dump error: unable to open file "https://s3.amazonaws.com/hdfgroup/data/hdf5demo/snp500.h5"

I know this is probably an upstream issue, but I wanted to file this so that if anyone else tries this feature they know it isn't working. Also, is there a way to install the built package from a PR so I could try enabling some debug flags to further debug this issue?
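
As a quick, hedged sanity check, HDF5 installs a build-configuration summary next to the library; whether the ros3 driver was compiled in should show up there (the exact wording of the line varies between HDF5 versions):

```
# Look for the read-only S3 VFD entry in the build summary of the active environment.
grep -i "s3" "$CONDA_PREFIX/lib/libhdf5.settings"
```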


Environment (conda list):
$ conda list
# packages in environment at /home/davidh/miniconda3/envs/hdf5_test:
#
# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                 conda_forge    conda-forge
_openmp_mutex             4.5                       1_gnu    conda-forge
c-ares                    1.16.1               h516909a_0    conda-forge
ca-certificates           2020.6.20            hecda079_0    conda-forge
hdf5                      1.12.0          nompi_h54c07f9_101    conda-forge
krb5                      1.17.1               hfafb76e_2    conda-forge
libcurl                   7.71.1               hcdd3856_4    conda-forge
libedit                   3.1.20191231         h46ee950_1    conda-forge
libev                     4.33                 h516909a_0    conda-forge
libgcc-ng                 9.3.0               h24d8f2e_14    conda-forge
libgfortran-ng            7.5.0               hdf63c60_14    conda-forge
libgomp                   9.3.0               h24d8f2e_14    conda-forge
libnghttp2                1.41.0               hab1572f_1    conda-forge
libssh2                   1.9.0                hab1572f_5    conda-forge
libstdcxx-ng              9.3.0               hdf63c60_14    conda-forge
ncurses                   6.2                  he1b5a44_1    conda-forge
openssl                   1.1.1g               h516909a_1    conda-forge
tk                        8.6.10               hed695b0_0    conda-forge
zlib                      1.2.11            h516909a_1007    conda-forge

Details about conda and system ( conda info ):
$ conda info

     active environment : hdf5_test
    active env location : /home/davidh/miniconda3/envs/hdf5_test
            shell level : 2
       user config file : /home/davidh/.condarc
 populated config files : /home/davidh/.condarc
          conda version : 4.8.3
    conda-build version : not installed
         python version : 3.7.3.final.0
       virtual packages : __cuda=10.2
                          __glibc=2.31
       base environment : /home/davidh/miniconda3  (writable)
           channel URLs : https://conda.anaconda.org/conda-forge/linux-64
                          https://conda.anaconda.org/conda-forge/noarch
                          https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /home/davidh/miniconda3/pkgs
                          /home/davidh/.conda/pkgs
       envs directories : /home/davidh/miniconda3/envs
                          /home/davidh/.conda/envs
               platform : linux-64
             user-agent : conda/4.8.3 requests/2.23.0 CPython/3.7.3 Linux/5.4.0-7634-generic pop/20.04 glibc/2.31
                UID:GID : 53807:1000
             netrc file : None
           offline mode : False

`h5perf_serial` is missing on Windows

It's not clear to me whether this should be present, but it appears that it is missing on Windows. Would be nice to figure out why and, if it is supposed to be there, address it.

Link problems on Windows for 1.8.17

When I try to link against HDF5 on Windows, I get unresolved externals errors like the following:

tree_sequence.obj : error LNK2001: unresolved external symbol H5T_NATIVE_INT32_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5P_CLS_DATASET_CREATE_ID_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_IEEE_F64LE_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_NATIVE_DOUBLE_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_NATIVE_UINT32_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_C_S1_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_STD_I8LE_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_STD_I32LE_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_STD_U32LE_g
tree_sequence.obj : error LNK2001: unresolved external symbol H5T_NATIVE_SCHAR_g
build\lib.win-amd64-3.5\_msprime.cp35-win_amd64.pyd : fatal error LNK1120: 11 unresolved externals
error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\VC\\BIN\\x86_amd64\\link.exe' failed with exit status 1120

See here for the full run on AppVeyor.

This problem does not happen if I pin HDF5 to version 1.8.15 (i.e., there are no unresolved externals reported). This is similar to an issue I've been having with GSL over at conda-forge/gsl-feedstock#15, and is possibly related.

I'll have a look to see if I can figure out what changed between the two versions. I'm not very good with windows coding though, so I thought I'd report the issue in case someone else knows what the problem is.

HDF5 CMake targets

The HDF5 CMake targets changed between 1.10.5 and 1.10.6.

  • 1.10.5
    • hdf5::hdf5-shared
  • 1.10.6
    • hdf5-shared

Defining the CMake variable HDF_PACKAGE_NAMESPACE to "hdf5::" restores the old namespace.
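
A hedged illustration of defining that variable on the CMake command line; where it needs to be set (the feedstock's own build versus a downstream configure) depends on how hdf5-config.cmake consumes it in the version at hand:

```
# Restore the hdf5:: namespace on the exported targets, per the workaround above.
cmake -S . -B build -DHDF_PACKAGE_NAMESPACE:STRING="hdf5::"
```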

Rebuild with conda-build 2.0

Ok, it looks like the 💩 has finally hit the fan--I just got this on a CI build for MetPy on AppVeyor (which is using conda-forge):

PaddingError: Placeholder of length '20' too short in package conda-forge::hdf5-1.8.17-vc9_3.
The package must be rebuilt with conda-build > 2.0

So, how do we need to proceed?

vtk reports problems with hdf5 on win10

Compiling vtk with hdf5 causes some problems on Windows 10.
The compilation proceeds, but we also encounter runtime problems with hdf5 on Windows 10 and 8 (a missing DLL, which may be related to this issue).

-- HDF5: Performing CXX Test OLD_HEADER_FILENAME - Failed
-- HDF5: Performing CXX Test H5_NO_NAMESPACE - Failed
-- HDF5: Performing CXX Test H5_NO_STD - Failed
-- HDF5: Performing CXX Test BOOL_NOTDEFINED - Failed
-- HDF5: Performing CXX Test NO_STATIC_CAST - Failed
-- HDF5: Checking for InitOnceExecuteOnce:
-- HDF5: Performing Test InitOnceExecuteOnce - Failed
-- HDF5: Performing Other Test INLINE_TEST_inline - Success
-- HDF5: Performing Other Test INLINE_TEST___inline__ - Failed
-- HDF5: Performing Other Test INLINE_TEST___inline - Success
-- HDF5: Checking for appropriate format for 64 bit long:
HDF5: Width test failed with result: FAILED_TO_RUN
-- HDF5: Checking for appropriate format for 64 bit long: not found
-- HDF5: checking IF converting from long double to integers is accurate... yes
-- HDF5: checking IF accurately converting from integers to long double... yes
-- HDF5: Checking IF accurately converting unsigned long to float values... no
-- HDF5: Checking IF accurately roundup converting floating-point to unsigned long long values... no
-- HDF5: Checking IF right maximum converting floating-point to unsigned long long values... no
-- HDF5: Checking IF correctly converting long double to unsigned int values... no
-- HDF5: Checking IF compiling unsigned long long to floating-point typecasts work... yes
-- HDF5: Checking IF compiling long long to floating-point typecasts work... yes
-- HDF5: Checking IF overflows normally converting floating-point to integer values... no
-- HDF5: Checking IF your system converts long double to (unsigned) long values with special algorithm... no
-- HDF5: Checking IF your system can convert (unsigned) long to long double values with special algorithm... no
-- HDF5: Checking IF correctly converting long double to (unsigned) long long values... no
-- HDF5: Checking IF correctly converting (unsigned) long long to long double values... no
-- HDF5: Checking IF your system generates wrong code for log2 routine... no
-- HDF5: Checking IF alignment restrictions are strictly enforced... no

Further, there is this output (screenshot not reproduced here).

Btw.: this is related to python3.5 -> visualstudio 15

build dependence on hdf5 leads to undefined reference errors # [win]

Problem occurring in this recipe (https://github.com/looooo/FreeCAD_Conda/tree/experimenting/libMed) when building on Windows. This is related to the zlib library, because if I build hdf5 without zlib support, libMed builds:
https://github.com/conda-forge/hdf5-feedstock/blob/master/recipe/bld.bat#L8 (setting -DHDF5_ENABLE_Z_LIB_SUPPORT:BOOL=OFF)

The linking problems can be seen here; I only have them in German (error LNK2019: unresolved external symbol).

Bibliothek "src\medC.lib" und Objekt "src\medC.exp" werden erstellt.
LINK : warning LNK4098: Standardbibliothek "MSVCRT" steht in Konflikt mit anderen Bibliotheken; /NODEFAULTLIB:Bibliothek verwenden.
libhdf5.lib(H5Zdeflate.c.obj) : error LNK2019: Verweis auf nicht aufgelöstes externes Symbol "inflate" in Funktion "H5Z_filter_deflate".
libhdf5.lib(H5Zdeflate.c.obj) : error LNK2019: Verweis auf nicht aufgelöstes externes Symbol "inflateEnd" in Funktion "H5Z_filter_deflate".
libhdf5.lib(H5Zdeflate.c.obj) : error LNK2019: Verweis auf nicht aufgelöstes externes Symbol "compress2" in Funktion "H5Z_filter_deflate".
libhdf5.lib(H5Zdeflate.c.obj) : error LNK2019: Verweis auf nicht aufgelöstes externes Symbol "inflateInit_" in Funktion "H5Z_filter_deflate".

how to integrate blosc filter as plugin?

There is a long discussion in the h5py repository on how to get support for BLOSC filtering (h5py/h5py#611). The consensus so far is that users of the library import pytables prior to importing h5py, because it enables the blosc plugin.
Since blosc needs to be built against a specific version of hdf5, I thought it should be an additional feature of this package. Please point me in the right direction here.

I wonder how we would provide blosc as a plugin built against the current version of this library.
Unfortunately I don't know how backward compatible the hdf5 ABI is. Might it be sufficient to build the BLOSC plugin against a fairly old version of HDF5, with it remaining forward compatible until the API breaks (e.g. like in NumPy)?

This very tiny C wrapper provides the plugin interface, which resides in $PREFIX/lib/hdf5/plugins:
https://github.com/PyTables/PyTables/tree/master/hdf5-blosc
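
For context, a hedged sketch of how such a separately packaged filter plugin is picked up at run time: HDF5 searches the directories in HDF5_PLUGIN_PATH for dynamically loaded filters. The directory below is the one mentioned above, and the filter ID is the registered blosc ID, stated here as an assumption:

```
# Point HDF5 at the directory a hypothetical blosc-plugin package would install into,
# then ask h5py whether the filter (blosc is registered as ID 32001) is available.
export HDF5_PLUGIN_PATH="$CONDA_PREFIX/lib/hdf5/plugins"
python -c "import h5py; print(h5py.h5z.filter_avail(32001))"
```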

Should we create a new repository containing just this wrapper and build it against a specific version of this library to get it right? If we are required to build the wrapper against every version of HDF5, how would we be able to trigger new builds for the plugin?

Any feedback greatly appreciated.

ClobberError with hdf4

Although it seems harmless enough, I thought I'd report it.

(ClobberError: This transaction has incompatible packages due to a shared path.
  packages: conda-forge::hdf5-1.10.1-vc14_2, conda-forge::hdf4-4.2.13-vc14_0
  path: 'library/copying'

ClobberError: This transaction has incompatible packages due to a shared path.
  packages: conda-forge::hdf5-1.10.1-vc14_2, conda-forge::hdf4-4.2.13-vc14_0
  path: 'library/release.txt'
λ conda create -v -n deleteme999 python=3.6* conda-forge::hdf5 conda-forge::hdf4
Solving environment: ...working... done

## Package Plan ##

  environment location: C:\Miniconda3\envs\deleteme999

  added / updated specs:
    - conda-forge::hdf4
    - conda-forge::hdf5
    - python=3.6


The following NEW packages will be INSTALLED:

    certifi:        2018.4.16-py36_0
    hdf4:           4.2.13-vc14_0      conda-forge [vc14]
    hdf5:           1.10.1-vc14_2      conda-forge [vc14]
    jpeg:           9b-vc14_2          conda-forge [vc14]
    pip:            10.0.1-py36_0
    python:         3.6.5-h0c2934d_0
    setuptools:     39.1.0-py36_0
    vc:             14-h0510ff6_3
    vs2015_runtime: 14.0.25420-0
    wheel:          0.31.1-py36_0
    wincertstore:   0.2-py36h7fe50ca_0
    zlib:           1.2.11-vc14_0      conda-forge [vc14]

Proceed ([y]/n)? y

Preparing transaction: ...working... done
Verifying transaction: ...working... (ClobberError: This transaction has incompatible packages due to a shared path.
  packages: conda-forge::hdf5-1.10.1-vc14_2, conda-forge::hdf4-4.2.13-vc14_0
  path: 'library/copying'

, ClobberError: This transaction has incompatible packages due to a shared path.
  packages: conda-forge::hdf5-1.10.1-vc14_2, conda-forge::hdf4-4.2.13-vc14_0
  path: 'library/release.txt'

)

done
Executing transaction: ...working... ===> LINKING PACKAGE: defaults::vs2015_runtime-14.0.25420-0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\vs2015_runtime-14.0.25420-0


===> LINKING PACKAGE: defaults::vc-14-h0510ff6_3 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\vc-14-h0510ff6_3


===> LINKING PACKAGE: conda-forge::zlib-1.2.11-vc14_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\zlib-1.2.11-vc14_0


===> LINKING PACKAGE: conda-forge::hdf5-1.10.1-vc14_2 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\hdf5-1.10.1-vc14_2


===> LINKING PACKAGE: conda-forge::jpeg-9b-vc14_2 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\jpeg-9b-vc14_2


===> LINKING PACKAGE: defaults::python-3.6.5-h0c2934d_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\python-3.6.5-h0c2934d_0


===> LINKING PACKAGE: defaults::certifi-2018.4.16-py36_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\certifi-2018.4.16-py36_0


===> LINKING PACKAGE: conda-forge::hdf4-4.2.13-vc14_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\hdf4-4.2.13-vc14_0


file exists, but clobbering: 'C:\\Miniconda3\\envs\\deleteme999\\Library\\COPYING'

file exists, but clobbering: 'C:\\Miniconda3\\envs\\deleteme999\\Library\\RELEASE.txt'

===> LINKING PACKAGE: defaults::wincertstore-0.2-py36h7fe50ca_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\wincertstore-0.2-py36h7fe50ca_0


===> LINKING PACKAGE: defaults::setuptools-39.1.0-py36_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\setuptools-39.1.0-py36_0


===> LINKING PACKAGE: defaults::wheel-0.31.1-py36_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\wheel-0.31.1-py36_0


===> LINKING PACKAGE: defaults::pip-10.0.1-py36_0 <===
  prefix=C:\Miniconda3\envs\deleteme999
  source=C:\Miniconda3\pkgs\pip-10.0.1-py36_0


done

@conda-forge-admin please rerender

Solution to issue cannot be found in the documentation.

  • I checked the documentation.

Issue

df

Installed packages

asdf

Environment info

fdsa

Not correctly finding zlib on Windows

As noted by @gillins on netcdf, the zlib referenced by the hdf5-targets.cmake file is incorrect. This can be seen by the warning that is thrown by HDF5 here. We need to make sure the correct zlib is being picked up, as other packages that link to HDF5 may also fail due to this - see here.

BUG: New linking issue on 1.12.2 for macOS x86_64 CPython

Solution to issue cannot be found in the documentation.

  • I checked the documentation.

Issue

In a dependent conda-forge library (OpenMEEG), the tickbot's update to 1.12.2 failed on OSX x86_64 CPython only (PyPy okay, arm64 okay, Windows/linux okay):

conda-forge/openmeeg-feedstock#28

https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=543348&view=logs&jobId=9c5ef928-2cd6-52e5-dbe6-9d173a7d951b&j=9c5ef928-2cd6-52e5-dbe6-9d173a7d951b&t=20c71c51-4b27-578b-485d-06ade2de1d00

[11/86] Linking CXX shared library OpenMEEGMaths/libOpenMEEGMaths.1.1.0.dylib
FAILED: OpenMEEGMaths/libOpenMEEGMaths.1.1.0.dylib 
: && $BUILD_PREFIX/bin/x86_64-apple-darwin13.4.0-clang++ -lgfortran -fopenmp=libomp -O3 -DNDEBUG -isysroot /Applications/Xcode_12.4.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk -mmacosx-version-min=10.9 -dynamiclib -Wl,-headerpad_max_install_names -Wl,-pie -Wl,-headerpad_max_install_names -Wl,-dead_strip_dylibs -Wl,-rpath,$PREFIX/lib -L$PREFIX/lib   -compatibility_version 1.0.0 -current_version 1.1.0 -o OpenMEEGMaths/libOpenMEEGMaths.1.1.0.dylib -install_name @rpath/libOpenMEEGMaths.1.dylib OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/vector.cpp.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/matrix.cpp.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/symmatrix.cpp.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/sparse_matrix.cpp.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/fast_sparse_matrix.cpp.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/MathsIO.C.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/MatlabIO.C.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/AsciiIO.C.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/BrainVisaTextureIO.C.o OpenMEEGMaths/CMakeFiles/OpenMEEGMaths.dir/src/TrivialBinIO.C.o  $PREFIX/lib/libmatio.a  $PREFIX/lib/libopenblas.a  $PREFIX/lib/libomp.dylib  $PREFIX/lib/libhdf5.a  $PREFIX/lib/libcrypto.dylib  $PREFIX/lib/libcurl.dylib  /Applications/Xcode_12.4.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/usr/lib/libpthread.dylib  $PREFIX/lib/libz.dylib  /Applications/Xcode_12.4.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/usr/lib/libdl.dylib  /Applications/Xcode_12.4.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.9.sdk/usr/lib/libm.dylib && :
ld: warning: -pie being ignored. It is only used when linking a main executable
Undefined symbols for architecture x86_64:
  "_MPI_Allgather", referenced from:
      _H5_mpio_gatherv_alloc_simple in libhdf5.a(H5mpi.o)
      _H5D__mpio_redistribute_shared_chunks in libhdf5.a(H5Dmpio.o)
  "_MPI_Allgatherv", referenced from:
      _H5_mpio_gatherv_alloc in libhdf5.a(H5mpi.o)
...

I can look into adding an extra linker flag on osx x86_64 to overcome this, but it seems like it might be a bug with the packaging here, so I figured I'd raise an issue!
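
A hedged way to confirm the symptom locally is to list the undefined symbols of the static archive that gets linked (the path assumes the conda environment used for the build):

```
# Any MPI_* entries here mean the static library references MPI symbols,
# which a serial (nompi) consumer cannot resolve.
nm -u "$CONDA_PREFIX/lib/libhdf5.a" | grep "MPI_" | head
```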

Installed packages

Recipe:

https://github.com/conda-forge/openmeeg-feedstock/blob/main/recipe/meta.yaml

Environment info

See recipe and build logs

Issue with placeholder path

A user reported the following issue:

ERROR: placeholder 'C:/conda/envs/_build' too short in:
hdf5-1.8.17-vc14_6

This happened when doing:

conda install -c conda-forge glueviz

on Windows.

Windows builds are failing

HDF5 is failing on Windows.

BUILD END: hdf5-1.8.16-vc9_0 
TEST START: hdf5-1.8.16-vc9_0 
Fetching package metadata: ........
Solving package specifications: ....
Error: The following specifications were found to be in conflict:
  - hdf5 1.8.16 vc9_0
Use "conda info <package>" to see the dependencies for each package.
Command exited with code 1

Include Fortran bindings

A number of scientific codes use the Fortran bindings for HDF5. Would it be possible to update this recipe to also make sure the Fortran bindings are installed?

CMake package has a hardcoded zlib path (Windows)

I am looking at version 1.10.5 build 1103 on Windows (built with CMake) that provides a cmake package.

In hdf5-targets.cmake, the exported target hdf5::hdf5-static has a hardcoded INTERFACE_LINK_LIBRARIES entry: C:/bld/hdf5_split_1566414186589/_h_env/Library/lib/z.lib;

I checked the previous 3 builds (down to 1100) and they have the same issue.

This is probably an issue with the upstream HDF5 project though...
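
A hedged way to confirm this from an affected environment (the location of the exported CMake files under a Windows conda prefix is an assumption here):

```
# Look for absolute build-time paths (e.g. .../_h_env/Library/lib/z.lib)
# baked into the exported hdf5 targets.
grep -rn "_h_env" "$CONDA_PREFIX/Library" --include="hdf5-targets.cmake"
```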


As a side note, there is also some discrepancy between Windows and Unix packages:

Feature                                   Windows   Linux
CMake package                             Yes       No
Separate package for static libraries     No        Yes

Support for Apple's arm64 architecture (osx-arm64)

Issue: I'm unable to install hdf5, and other packages that rely on HDF5, using a MacBook with the new Apple Silicon processors, osx-arm64 architecture. Would it be possible to add support for this?


Environment (conda list):
$ conda list
# Name                    Version                   Build  Channel
apscheduler               3.6.3                    pypi_0    pypi
arrow                     0.10.0                     py_1    conda-forge
attrs                     20.3.0             pyhd3deb0d_0    conda-forge
blosc                     1.9.2                    pypi_0    pypi
brotlipy                  0.7.0           py38h51573d8_1001    conda-forge
bzip2                     1.0.8                h27ca646_4    conda-forge
c-ares                    1.17.1               h27ca646_0    conda-forge
ca-certificates           2020.11.8            h4653dfc_0    conda-forge
cachetools                4.1.1                      py_0    conda-forge
certifi                   2020.11.8        py38h10201cd_0    conda-forge
cffi                      1.14.4           py38he62ddd7_1    conda-forge
chardet                   3.0.4           py38h045b1e1_1008    conda-forge
click                     7.1.2                    pypi_0    pypi
cmake                     3.19.1               h60611b0_0    conda-forge
colorama                  0.4.4                    pypi_0    pypi
cryptography              3.2.1            py38h943ba7b_0    conda-forge
cycler                    0.10.0                     py_2    conda-forge
decorator                 4.4.2                    pypi_0    pypi
expat                     2.2.9                hc88da5d_2    conda-forge
filelock                  3.0.12             pyh9f0ad1d_0    conda-forge
flake8                    3.8.4                      py_0    conda-forge
flake8-tidy-imports       4.1.0                    pypi_0    pypi
flask                     1.1.2                    pypi_0    pypi
flask-cors                3.0.9                    pypi_0    pypi
flask-jwt-extended        3.25.0                   pypi_0    pypi
freetype                  2.10.4               h17b34a0_0    conda-forge
idna                      2.10               pyh9f0ad1d_0    conda-forge
importlib-metadata        3.1.1              pyhd8ed1ab_0    conda-forge
iniconfig                 1.1.1              pyh9f0ad1d_0    conda-forge
itsdangerous              1.1.0                    pypi_0    pypi
jinja2                    2.11.2                   pypi_0    pypi
joblib                    0.17.0                     py_0    conda-forge
jpeg                      9d                   h27ca646_0    conda-forge
jsonschema                3.2.0                    pypi_0    pypi
kiwisolver                1.3.1            py38h12a6f45_0    conda-forge
krb5                      1.17.2               h17618d6_0    conda-forge
lcms2                     2.11                 h2f7874f_1    conda-forge
libblas                   3.9.0                3_openblas    conda-forge
libcblas                  3.9.0                3_openblas    conda-forge
libcurl                   7.71.1               hd2aec06_8    conda-forge
libcxx                    11.0.0               h7cf67bf_1    conda-forge
libedit                   3.1.20191231         hc8eb9b7_2    conda-forge
libev                     4.33                 h642e427_1    conda-forge
libffi                    3.3                  h9f76cd9_1    conda-forge
libgfortran               5.0.0.dev0          h181927c_13    conda-forge
libgfortran5              11.0.0.dev0         h181927c_13    conda-forge
liblapack                 3.9.0                3_openblas    conda-forge
libnghttp2                1.41.0               h87e4072_2    conda-forge
libopenblas               0.3.12          openmp_h2ecc587_1    conda-forge
libpng                    1.6.37               hf7e6567_2    conda-forge
libssh2                   1.9.0                h1c49ba1_5    conda-forge
libtiff                   4.1.0                h70663a0_6    conda-forge
libuv                     1.40.0               h1f0153e_0    conda-forge
libwebp-base              1.1.0                h27ca646_3    conda-forge
llvm-openmp               11.0.0               hdb94862_1    conda-forge
lz4-c                     1.9.2                hc88da5d_3    conda-forge
markupsafe                1.1.1                    pypi_0    pypi
matplotlib-base           3.3.3            py38ha029820_0    conda-forge
mccabe                    0.6.1                      py_1    conda-forge
more-itertools            8.6.0              pyhd8ed1ab_0    conda-forge
multidict                 5.1.0                    pypi_0    pypi
mypy                      0.790                    pypi_0    pypi
mypy-extensions           0.4.3                    pypi_0    pypi
ncurses                   6.2                  h9aa5885_4    conda-forge
numpy                     1.19.4           py38h9e6c65a_1    conda-forge
olefile                   0.46               pyh9f0ad1d_1    conda-forge
openssl                   1.1.1h               h642e427_0    conda-forge
packaging                 20.7               pyhd3deb0d_0    conda-forge
pandas                    1.1.4            py38h9b9bf68_0    conda-forge
pillow                    8.0.1            py38h6b236ce_0    conda-forge
pip                       20.3.1             pyhd8ed1ab_0    conda-forge
plotly                    4.13.0             pyhd3deb0d_0    conda-forge
pluggy                    0.13.1           py38h045b1e1_3    conda-forge
progressbar2              3.53.1             pyh9f0ad1d_0    conda-forge
prompt-toolkit            3.0.8                    pypi_0    pypi
py                        1.9.0              pyh9f0ad1d_0    conda-forge
py-find-1st               1.1.4                    pypi_0    pypi
pyaml                     20.4.0             pyh9f0ad1d_0    conda-forge
pycodestyle               2.6.0              pyh9f0ad1d_0    conda-forge
pycoingecko               1.4.0                    pypi_0    pypi
pycparser                 2.20               pyh9f0ad1d_2    conda-forge
pyflakes                  2.2.0              pyh9f0ad1d_0    conda-forge
pyjwt                     1.7.1                    pypi_0    pypi
pyopenssl                 20.0.0             pyhd8ed1ab_0    conda-forge
pyparsing                 2.4.7              pyh9f0ad1d_0    conda-forge
pyrsistent                0.17.3                   pypi_0    pypi
pysocks                   1.7.1            py38h045b1e1_2    conda-forge
pytest                    6.1.2            py38h10201cd_0    conda-forge
python                    3.8.6           h12cc5a1_1_cpython    conda-forge
python-dateutil           2.8.1                      py_0    conda-forge
python-rapidjson          0.9.4                    pypi_0    pypi
python-telegram-bot       13.1                     pypi_0    pypi
python-utils              2.4.0                      py_0    conda-forge
python_abi                3.8                      1_cp38    conda-forge
pytz                      2020.4             pyhd8ed1ab_0    conda-forge
pyyaml                    5.3.1            py38h51573d8_1    conda-forge
questionary               1.8.1                    pypi_0    pypi
readline                  8.0                  hc8eb9b7_2    conda-forge
requests                  2.25.0             pyhd3deb0d_0    conda-forge
retrying                  1.3.3                      py_2    conda-forge
rhash                     1.3.6             h1f0153e_1001    conda-forge
scikit-learn              0.23.2           py38h397cc00_3    conda-forge
scikit-optimize           0.8.1              pyh9f0ad1d_0    conda-forge
scipy                     1.5.3            py38hdf044fb_0    conda-forge
sdnotify                  0.3.2                    pypi_0    pypi
setuptools                49.6.0           py38h045b1e1_2    conda-forge
six                       1.15.0             pyh9f0ad1d_0    conda-forge
sqlalchemy                1.3.20                   pypi_0    pypi
sqlite                    3.34.0               h6d56c25_0    conda-forge
tabulate                  0.8.7                    pypi_0    pypi
threadpoolctl             2.1.0              pyh5ca1d4c_0    conda-forge
tk                        8.6.10               h99d78ee_1    conda-forge
toml                      0.10.2             pyhd8ed1ab_0    conda-forge
tornado                   6.1              py38h30f7421_0    conda-forge
typed-ast                 1.4.1                    pypi_0    pypi
typing-extensions         3.7.4.3                  pypi_0    pypi
tzlocal                   2.1                      pypi_0    pypi
urllib3                   1.25.11                    py_0    conda-forge
wcwidth                   0.2.5                    pypi_0    pypi
werkzeug                  1.0.1                    pypi_0    pypi
wheel                     0.36.0             pyhd3deb0d_0    conda-forge
wrapt                     1.12.1           py38h30f7421_2    conda-forge
xz                        5.2.5                h642e427_1    conda-forge
yaml                      0.2.5                h642e427_0    conda-forge
yarl                      1.1.0                    pypi_0    pypi
zipp                      3.4.0                      py_0    conda-forge
zlib                      1.2.11            h31e879b_1009    conda-forge
zstd                      1.4.5                hc019d7c_2    conda-forge


Details about conda and system ( conda info ):
$ conda info

     active environment : ft
    active env location : /Users/ryan/miniforge3/envs/ft
            shell level : 2
       user config file : /Users/ryan/.condarc
 populated config files : /Users/ryan/miniforge3/.condarc
          conda version : 4.9.2
    conda-build version : not installed
         python version : 3.9.1.candidate.1
       virtual packages : __osx=11.0.1=0
                          __unix=0=0
                          __archspec=1=arm64
       base environment : /Users/ryan/miniforge3  (writable)
           channel URLs : https://conda.anaconda.org/conda-forge/osx-arm64
                          https://conda.anaconda.org/conda-forge/noarch
          package cache : /Users/ryan/miniforge3/pkgs
                          /Users/ryan/.conda/pkgs
       envs directories : /Users/ryan/miniforge3/envs
                          /Users/ryan/.conda/envs
               platform : osx-arm64
             user-agent : conda/4.9.2 requests/2.25.0 CPython/3.9.1rc1 Darwin/20.1.0 OSX/11.0.1
                UID:GID : 501:20
             netrc file : None
           offline mode : False

Installing hdf5 with mpi

Issue:

The following doesn't install hdf5 compiled with mpi(ch):

$ conda install hdf5 mpich
Collecting package metadata (current_repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /home/david/miniconda3

  added / updated specs:
    - hdf5
    - mpich


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    hdf5-1.10.5                |nompi_h3c11f04_1104         3.1 MB  conda-forge
    ------------------------------------------------------------
                                           Total:         3.1 MB

The following NEW packages will be INSTALLED:

  hdf5               conda-forge/linux-64::hdf5-1.10.5-nompi_h3c11f04_1104
  libgfortran-ng     conda-forge/linux-64::libgfortran-ng-7.3.0-hdf63c60_2
  mpi                conda-forge/linux-64::mpi-1.0-mpich
  mpich              conda-forge/linux-64::mpich-3.3.1-hc856adb_1

The following packages will be UPDATED:

  ca-certificates    pkgs/main::ca-certificates-2019.8.28-0 --> conda-forge::ca-certificates-2019.9.11-hecc5488_0

The following packages will be SUPERSEDED by a higher-priority channel:

  certifi                                         pkgs/main --> conda-forge
  conda                                           pkgs/main --> conda-forge
  openssl              pkgs/main::openssl-1.1.1d-h7b6447c_2 --> conda-forge::openssl-1.1.1c-h516909a_0

But the following does:

$ conda install hdf5=*=*mpich*
Collecting package metadata (current_repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /home/david/miniconda3

  added / updated specs:
    - hdf5[build=*mpich*]


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    hdf5-1.10.5                |mpi_mpich_ha7d0aea_1004         3.3 MB  conda-forge
    ------------------------------------------------------------
                                           Total:         3.3 MB

The following NEW packages will be INSTALLED:

  hdf5               conda-forge/linux-64::hdf5-1.10.5-mpi_mpich_ha7d0aea_1004
  libgfortran-ng     conda-forge/linux-64::libgfortran-ng-7.3.0-hdf63c60_2
  mpi                conda-forge/linux-64::mpi-1.0-mpich
  mpich              conda-forge/linux-64::mpich-3.3.1-hc856adb_1

The following packages will be UPDATED:

  ca-certificates    pkgs/main::ca-certificates-2019.8.28-0 --> conda-forge::ca-certificates-2019.9.11-hecc5488_0

The following packages will be SUPERSEDED by a higher-priority channel:

  certifi                                         pkgs/main --> conda-forge
  conda                                           pkgs/main --> conda-forge
  openssl              pkgs/main::openssl-1.1.1d-h7b6447c_2 --> conda-forge::openssl-1.1.1c-h516909a_0

Is this supposed to be the correct behavior?
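
For reference, a hedged sketch of the variant selection this feedstock uses: the nompi builds carry a higher build number (1104 vs. 1004 above), which makes the solver prefer them unless an MPI variant is requested explicitly via the build string. The mpi_mpich_* / mpi_openmpi_* prefixes below follow the conda-forge MPI-variant convention:

```
# Request the MPI-enabled variants explicitly via the build string:
conda install "hdf5=*=mpi_mpich_*"
# or, for the OpenMPI flavour:
conda install "hdf5=*=mpi_openmpi_*"
```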


Environment (conda list):

$ conda list
# packages in environment at /home/david/miniconda3:
#
# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                        main  
asn1crypto                1.0.1                    py37_0  
ca-certificates           2019.8.28                     0  
certifi                   2019.9.11                py37_0  
cffi                      1.12.3           py37h2e261b9_0  
chardet                   3.0.4                 py37_1003  
conda                     4.7.12                   py37_0  
conda-package-handling    1.6.0            py37h7b6447c_0  
cryptography              2.7              py37h1ba5d50_0  
idna                      2.8                      py37_0  
libedit                   3.1.20181209         hc058e9b_0  
libffi                    3.2.1                hd88cf55_4  
libgcc-ng                 9.1.0                hdf63c60_0  
libstdcxx-ng              9.1.0                hdf63c60_0  
ncurses                   6.1                  he6710b0_1  
openssl                   1.1.1d               h7b6447c_2  
pip                       19.2.3                   py37_0  
pycosat                   0.6.3            py37h14c3975_0  
pycparser                 2.19                     py37_0  
pyopenssl                 19.0.0                   py37_0  
pysocks                   1.7.1                    py37_0  
python                    3.7.4                h265db76_1  
readline                  7.0                  h7b6447c_5  
requests                  2.22.0                   py37_0  
ruamel_yaml               0.15.46          py37h14c3975_0  
setuptools                41.4.0                   py37_0  
six                       1.12.0                   py37_0  
sqlite                    3.30.0               h7b6447c_0  
tk                        8.6.8                hbc83047_0  
tqdm                      4.36.1                     py_0  
urllib3                   1.24.2                   py37_0  
wheel                     0.33.6                   py37_0  
xz                        5.2.4                h14c3975_4  
yaml                      0.1.7                had09818_2  
zlib                      1.2.11               h7b6447c_3  

Details about conda and system ( conda info ):
$ conda info
     active environment : base
    active env location : /home/david/miniconda3
            shell level : 1
       user config file : /home/david/.condarc
 populated config files : /home/david/.condarc
          conda version : 4.7.12
    conda-build version : not installed
         python version : 3.7.4.final.0
       virtual packages : __cuda=10.0
       base environment : /home/david/miniconda3  (writable)
           channel URLs : https://conda.anaconda.org/conda-forge/linux-64
                          https://conda.anaconda.org/conda-forge/noarch
                          https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /home/david/miniconda3/pkgs
                          /home/david/.conda/pkgs
       envs directories : /home/david/miniconda3/envs
                          /home/david/.conda/envs
               platform : linux-64
             user-agent : conda/4.7.12 requests/2.22.0 CPython/3.7.4 Linux/4.15.0-65-generic ubuntu/18.04.3 glibc/2.27
                UID:GID : 1003:1003
             netrc file : None
           offline mode : False

parallel hdf5

Is it possible to get a parallel-enabled build of hdf5? Assuming a serial build is still desirable, how should they be separated? Different package (e.g. hdf5-parallel) or feature tracking in the hdf5 package (like numpy+blas)?

I'm looking at building fenics packages, and one impediment that has been pointed out is that it needs parallel hdf5 and vtk, but vtk pulls in this serial hdf5.

macOS CI builds failing

Bug with imported CMake target 1.14.0

Solution to issue cannot be found in the documentation.

  • I checked the documentation.

Issue

The error in the automated OpenMEEG rebuild PR conda-forge/openmeeg-feedstock#42 seems to indicate a problem with the HDF5 feedstock:

https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=654485&view=logs&j=d0d954b5-f111-5dc4-4d76-03b6c9d0cf7e&t=841356e0-85bb-57d8-dbbc-852e683d1642&l=271

Running CMAKE
-- The C compiler identification is GNU 11.3.0
...
-- Found HDF5: $PREFIX/lib/libhdf5.so;$PREFIX/lib/libcrypto.so;$PREFIX/lib/libcurl.so;$BUILD_PREFIX/x86_64-conda-linux-gnu/sysroot/usr/lib/librt.so;$BUILD_PREFIX/x86_64-conda-linux-gnu/sysroot/usr/lib/libpthread.so;$PREFIX/lib/libsz.so;$PREFIX/lib/libz.so;$BUILD_PREFIX/x86_64-conda-linux-gnu/sysroot/usr/lib/libdl.so;$BUILD_PREFIX/x86_64-conda-linux-gnu/sysroot/usr/lib/libm.so (found version "1.14.0")  
...
-- Configuring done
CMake Error in OpenMEEGMaths/CMakeLists.txt:
  Imported target "HDF5::HDF5" includes non-existent path

    "/home/conda/feedstock_root/build_artifacts/hdf5_1675740981212/work/src/H5FDsubfiling"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.

Note that my manual migration PR to 1.12.2 went smoothly and we haven't changed OpenMEEG infra/code at all, so it seems like a new problem with 1.14.0.
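
A hedged way to check for the stray build-tree path from a downstream environment (the exact install layout of the hdf5 CMake metadata under the prefix is an assumption):

```
# Look for the non-existent H5FDsubfiling include path in the shipped CMake files.
grep -rn "H5FDsubfiling" "$CONDA_PREFIX" --include="*.cmake" 2>/dev/null
```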

Installed packages

Copied from https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=654485&view=logs&j=d0d954b5-f111-5dc4-4d76-03b6c9d0cf7e&t=841356e0-85bb-57d8-dbbc-852e683d1642&l=271


    _libgcc_mutex:            0.1-conda_forge         conda-forge
    _openmp_mutex:            4.5-2_gnu               conda-forge
    binutils_impl_linux-64:   2.39-he00db2b_1         conda-forge
    binutils_linux-64:        2.39-h5fc0e48_11        conda-forge
    bzip2:                    1.0.8-h7f98852_4        conda-forge
    c-ares:                   1.18.1-h7f98852_0       conda-forge
    ca-certificates:          2022.12.7-ha878542_0    conda-forge
    cmake:                    3.25.2-h077f3f9_0       conda-forge
    expat:                    2.5.0-h27087fc_0        conda-forge
    gcc_impl_linux-64:        11.3.0-hab1b70f_19      conda-forge
    gcc_linux-64:             11.3.0-he6f903b_11      conda-forge
    gfortran_impl_linux-64:   11.3.0-he34c6f7_19      conda-forge
    gfortran_linux-64:        11.3.0-h3c55166_11      conda-forge
    gxx_impl_linux-64:        11.3.0-hab1b70f_19      conda-forge
    gxx_linux-64:             11.3.0-hc203a17_11      conda-forge
    kernel-headers_linux-64:  2.6.32-he073ed8_15      conda-forge
    keyutils:                 1.6.1-h166bdaf_0        conda-forge
    krb5:                     1.20.1-h81ceb04_0       conda-forge
    ld_impl_linux-64:         2.39-hcc3a1bd_1         conda-forge
    libcurl:                  7.87.0-hdc1c0ab_0       conda-forge
    libedit:                  3.1.20191231-he28a2e2_2 conda-forge
    libev:                    4.33-h516909a_1         conda-forge
    libgcc-devel_linux-64:    11.3.0-h210ce93_19      conda-forge


Environment info

Standard conda-forge build, see https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=654485&view=logs&j=d0d954b5-f111-5dc4-4d76-03b6c9d0cf7e&t=841356e0-85bb-57d8-dbbc-852e683d1642&l=271

HDF5 1.10.0

HDF5 1.10.0 was released a few weeks back. It includes a number of new features as well as some backward-incompatible changes with the 1.8.x library. Any plans to package up this new version?

MacOS Fortran shared libraries

Issue:

I am working on a project that uses libhdf5hl_fortran and I realized from looking into this feedstock that the Fortran shared libraries are not being built.
While I have no problem depending on both hdf5 and hdf5-static, it would be great to have the shared library available as well.

I came across this comment #69 (comment) from 2017, and before I go down the rabbit hole of trying to remove the patch and build shared libraries for Fortran, I decided to check whether that is something the maintainers of this feedstock would like to see done.

Thank you

Linker issues against 1.8.18-1 but not against 1.8.17.

I'm receiving linker errors against 1.8.18-1 (conda-forge) that I was not receiving against 1.8.17-2 (defaults). The linker errors suggest that the HDF5 shared libraries are not being built; however, unpacking the 1.8.18-1 conda package does show the .so files, and my read of the ./configure call in build.sh suggests they are being built, as it is enabled by default (at least when I last checked the HDF5 source). Is there any known reason why linking would break between these builds?

EDITED: the issue is actually a discrepancy between defaults and conda-forge feedstocks.

# 1.8.18-1 from conda-forge
# ...snip...
h5c++ -shared -o libssu.so tree.o biom.o unifrac.o cmd.o unifrac_task.o api.o -lc -lhdf5_cpp -L/home/mcdonadt/miniconda3/conda-bld/unifrac_1506038888908/_h_env_placehold_placehold_placehold_placehold_p
lacehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_/lib
/usr/bin/ld: /home/mcdonadt/miniconda3/conda-bld/unifrac_1506038888908/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pla
cehold_placehold_placehold_placehold_placehold_placehold_placehold_/lib/libhdf5_cpp.a(H5IdComponent.o): relocation R_X86_64_32 against `.rodata' can not be used when making a shared object; recompile w
ith -fPIC
/home/mcdonadt/miniconda3/conda-bld/unifrac_1506038888908/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh
old_placehold_placehold_placehold_placehold_placehold_/lib/libhdf5_cpp.a: could not read symbols: Bad value
collect2: error: ld returned 1 exit status

# 1.8.17-2 from defaults
# ...snip...
h5c++ -shared -o libssu.so tree.o biom.o unifrac.o cmd.o unifrac_task.o api.o -lc -lhdf5_cpp -L/home/mcdonadt/miniconda3/conda-bld/unifrac_1506053505626/_h_env_placehold_placehold_placehold_placehold_p
lacehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_/lib
cp libssu.so /home/mcdonadt/miniconda3/conda-bld/unifrac_1506053505626/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pla
cehold_placehold_placehold_placehold_placehold_placehold_placehold_/lib/

I'm getting similar linking issues on 1.10.1.

OpenSSL 1 build?

Comment:

Would it be possible to get a build that links against openssl 1 rather than openssl 3?

1.10.x+osx+Fortran: possible?

Issue:
I am trying to build a package for Linux and MacOS. Due to dependencies of other packages, I'm forced to use HDF5 1.10.x. Unfortunately, this package is built without Fortran libraries. Is it possible to rebuild the old 1.10.6 version with support for Fortran? The 1.12.1 version has Fortran support already.
