Failure creating a larger hdf5 dataset #131

Open
@Vynikal

Description

Whenever I create a larger HDF5 dataset, i.e. roughly more than 4 Lidar HD tiles for the training set, the resulting HDF5 file collapses to a few kilobytes without any explanation. RAM size might play a role: I have 32 GB, and the process has to use the swap partition to build a larger dataset. But even when it completes without any apparent error, the HDF5 file is tiny, and when I then run the RandLa experiment it attempts to create the dataset again, after which an error follows.
What is happening? If it is caused by the RAM size, is there any way to circumvent it?
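
For context, a minimal sketch of the kind of workaround I have in mind, assuming h5py: writing tiles into a resizable, chunked dataset one at a time so that only a single tile needs to be in RAM at once. The dataset name, file name, and the `load_tile` helper here are hypothetical and not taken from this repository.

```python
import h5py
import numpy as np

def load_tile(path):
    # Hypothetical helper: load one Lidar HD tile as an (N, 3) float32 array.
    # In practice this would come from the project's own tile reader.
    return np.zeros((1000, 3), dtype=np.float32)

tile_paths = ["tile_0001.las", "tile_0002.las"]  # hypothetical inputs

with h5py.File("train.hdf5", "w") as f:
    # Resizable, chunked dataset: points are appended tile by tile,
    # so the whole training set never has to fit in RAM.
    ds = f.create_dataset(
        "points",
        shape=(0, 3),
        maxshape=(None, 3),
        chunks=(65536, 3),
        dtype="float32",
    )
    for path in tile_paths:
        pts = load_tile(path)
        ds.resize(ds.shape[0] + pts.shape[0], axis=0)
        ds[-pts.shape[0]:] = pts
        f.flush()  # push each tile to disk before loading the next one
```

Is something like this possible with the current dataset-creation code, or is there a recommended way to build the HDF5 file without exhausting RAM?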
