
H5T_SET_PRECISION

Sets the precision of an atomic datatype

Procedure:

H5T_SET_PRECISION ( dtype_id, precision )

Signature:

herr_t H5Tset_precision( hid_t dtype_id, size_t precision )

SUBROUTINE h5tset_precision_f(type_id, precision, hdferr) 
  IMPLICIT NONE
  INTEGER(HID_T), INTENT(IN) :: type_id    ! Datatype identifier 
  INTEGER(SIZE_T), INTENT(IN) :: precision ! Datatype precision
  INTEGER, INTENT(OUT) :: hdferr           ! Error code
END SUBROUTINE h5tset_precision_f

Parameters:
hid_t dtype_id        IN: Identifier of the datatype to set
size_t precision      IN: Number of bits of precision for the datatype

Description:

H5T_SET_PRECISION sets the precision of an atomic datatype. The precision is the number of significant bits which, unless padding is present, is 8 times the value returned by H5T_GET_SIZE.

If the precision is increased, the offset is decreased and then the size is increased as needed to ensure that significant bits do not "hang over" the edge of the datatype.

Changing the precision of an H5T_STRING datatype automatically changes the size as well; in this case, the precision must be a multiple of 8.

When decreasing the precision of a floating-point type, set the locations and sizes of the sign, mantissa, and exponent fields first (see H5T_SET_FIELDS).
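For example, the sketch below derives a 16-bit floating-point type from H5T_IEEE_F32LE, calling H5T_SET_FIELDS before H5T_SET_PRECISION. The 16-bit layout, bit positions, and exponent bias used here are illustrative assumptions, not values defined by this function.

#include "hdf5.h"

/* Sketch: narrow a copy of a 32-bit IEEE float to a 16-bit layout. */
static hid_t make_float16(void)
{
    hid_t f16 = H5Tcopy(H5T_IEEE_F32LE);
    if (f16 < 0)
        return -1;

    /* Define the narrower field layout first: sign bit at 15,
       5-bit exponent at bit 10, 10-bit mantissa at bit 0. */
    if (H5Tset_fields(f16, 15, 10, 5, 0, 10) < 0)
        goto error;

    /* With the fields in place, the precision can be reduced to 16 bits, */
    if (H5Tset_precision(f16, 16) < 0)
        goto error;

    /* and the size (in bytes) and exponent bias adjusted to match. */
    if (H5Tset_size(f16, 2) < 0 || H5Tset_ebias(f16, 15) < 0)
        goto error;

    return f16;

error:
    H5Tclose(f16);
    return -1;
}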

Returns:

Returns a non-negative value if successful; otherwise returns a negative value.

Example:

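A minimal sketch of a typical call is shown below; the 16-bit value and the use of H5T_NATIVE_INT are assumptions chosen for illustration.

#include "hdf5.h"
#include <stdio.h>

int main(void)
{
    /* Work on a copy; the predefined datatypes themselves are read-only. */
    hid_t dtype = H5Tcopy(H5T_NATIVE_INT);
    if (dtype < 0)
        return 1;

    /* Restrict the copy to 16 significant bits. */
    if (H5Tset_precision(dtype, 16) < 0) {
        H5Tclose(dtype);
        return 1;
    }

    printf("precision: %zu bits\n", H5Tget_precision(dtype));

    H5Tclose(dtype);
    return 0;
}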
