H5T_SET_PRECISION sets the precision of an atomic datatype. The precision is the number of significant bits, which, unless padding is present, is 8 times the value returned by H5T_GET_SIZE.
If the precision is increased, the offset is decreased and then the size is increased to ensure that significant bits do not "hang over" the edge of the datatype.
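As a minimal C sketch of this behavior, the code below copies a native integer type and adjusts its precision; the calls are the standard C API, but the 24-bit and 48-bit values are illustrative, and the printed results assume a 4-byte native int with no initial padding or offset.

    #include "hdf5.h"
    #include <stdio.h>

    int main(void)
    {
        /* Copy a native int type so it can be modified */
        hid_t dtype = H5Tcopy(H5T_NATIVE_INT);

        /* Shrink the precision to 24 significant bits; the size is not
         * reduced, so the remaining 8 bits become padding */
        H5Tset_precision(dtype, 24);
        printf("precision = %zu bits, size = %zu bytes\n",
               H5Tget_precision(dtype), H5Tget_size(dtype));   /* 24, 4 */

        /* Grow the precision to 48 bits; the library enlarges the size
         * (here to 6 bytes) so the significant bits still fit */
        H5Tset_precision(dtype, 48);
        printf("precision = %zu bits, size = %zu bytes\n",
               H5Tget_precision(dtype), H5Tget_size(dtype));   /* 48, 6 */

        H5Tclose(dtype);
        return 0;
    }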
Changing the precision of an H5T_STRING automatically changes the size as well. The precision must be a multiple of 8.
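For example, widening a copy of the one-byte C string type H5T_C_S1 to 128 bits (a multiple of 8) should grow its size to 16 bytes; this fragment is a sketch, with the 128-bit value chosen purely for illustration.

    /* Copy the 1-byte C string type and widen it to 16 characters */
    hid_t str_type = H5Tcopy(H5T_C_S1);
    H5Tset_precision(str_type, 128);            /* must be a multiple of 8 */
    printf("string size = %zu bytes\n", H5Tget_size(str_type));  /* 16 */
    H5Tclose(str_type);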
When decreasing the precision of a floating-point type, set the locations and sizes of the sign, mantissa, and exponent fields first.
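The following sketch shows that ordering by reducing a copy of H5T_NATIVE_FLOAT to a hypothetical 20-bit layout (1 sign bit, 6-bit exponent, 13-bit mantissa); H5Tset_fields is the standard call for placing the fields, but the 20-bit layout itself is only an illustration.

    /* Hypothetical 20-bit float: sign at bit 19, 6-bit exponent
     * starting at bit 13, 13-bit mantissa starting at bit 0 */
    hid_t f20 = H5Tcopy(H5T_NATIVE_FLOAT);
    H5Tset_fields(f20, 19, 13, 6, 0, 13);   /* spos, epos, esize, mpos, msize */
    H5Tset_precision(f20, 20);              /* fields fit, safe to shrink */
    H5Tclose(f20);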