
Thread
2006.04.03 13:08 "Re: [Tiff] WORDS_BIGENDIAN makes libtiff platform dependent", by Edward Lam
Hi,
Earlier versions were using a define named 'HOST_BIGENDIAN' in libtiff/tiffconf.h:
    HOST_BIGENDIAN  native cpu byte order: 1 if big-endian (Motorola)
                    or 0 if little-endian (Intel); this may be used
                    in codecs to optimize code
so this does not seem like a big change to me. Maybe this had already been figured out in your build environment before and you just did not realize it?
>
> Libtiff really does need to know the endian order, the size of 'short',
> 'int', 'long', bit order, etc., in order to work correctly for a given
> target.
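Fair enough on the type sizes: the on-disk TIFF structures are fixed-width, so the library does need integer types that are exactly 16 and 32 bits wide. Roughly what I mean, as a sketch (the typedef and struct names are only illustrative, not quotes from tiff.h):

    /* Hypothetical sketch of why exact integer widths matter for TIFF:
       the classic file header is a 16-bit byte-order mark, a 16-bit
       version number (42), and a 32-bit offset to the first IFD. */
    typedef unsigned short my_uint16;   /* assumes sizeof(short) == 2 */
    typedef unsigned int   my_uint32;   /* assumes sizeof(int)   == 4 */

    struct my_tiff_header {
        my_uint16 byte_order;   /* 'II' (little-endian) or 'MM' (big-endian) */
        my_uint16 version;      /* 42 for classic TIFF */
        my_uint32 dir_offset;   /* file offset of the first directory */
    };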
Curious. I just checked my copy of libtiff 3.7.0. In tif_config.h, I have HOST_BIGENDIAN defined to 0, and WORDS_BIGENDIAN is mentioned only in a comment:
/* Native cpu byte order: 1 if big-endian (Motorola) or 0 if little-endian
(Intel) */
#define HOST_BIGENDIAN 0
/* Define to 1 if your processor stores words with the most significant byte
first (like Motorola and SPARC, unlike Intel and VAX). */
/* #undef WORDS_BIGENDIAN */
Note that when building, I do *not* have WORDS_BIGENDIAN defined as a compiler command line option.
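If HOST_BIGENDIAN (or WORDS_BIGENDIAN) were actually consumed anywhere, I would expect the usual compile-time branch, something like this hypothetical snippet (not actual libtiff code):

    #include <string.h>

    /* Hypothetical use of a compile-time byte-order macro in a codec:
       read a big-endian 16-bit value, skipping the swap when the host
       is already big-endian. Nothing like this shows up in 3.7.0. */
    static unsigned short get_be16(const unsigned char *p)
    {
    #if HOST_BIGENDIAN
        unsigned short v;
        memcpy(&v, p, 2);                 /* native order already matches */
        return v;
    #else
        return (unsigned short)((p[0] << 8) | p[1]);   /* swap while reading */
    #endif
    }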
Now here's the kicker:
% grep HOST_BIGENDIAN *.[ch]
tif_config.h:#define HOST_BIGENDIAN 0
% grep WORDS_BIGENDIAN *.[ch]
tif_config.h:/* #undef WORDS_BIGENDIAN */
Although they're mentioned, they don't seem to be used at all! Taken together with your statement that libtiff requires knowing the host endian order, how does libtiff 3.7.0 work at all? (Or maybe it doesn't, since I'm only ever on Intel CPUs. :)
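My best guess is that the library just determines the host byte order at run time instead of relying on a configure-time macro; the standard trick is only a few lines, along the lines of this sketch (illustrative, I have not traced the actual 3.7.0 code):

    /* Run-time byte-order test that needs no configure macro: store a
       known 32-bit value and look at which byte ends up first in memory. */
    static int host_is_bigendian(void)
    {
        union { unsigned int i; unsigned char c[4]; } u;
        u.i = 1;
        return u.c[0] == 0;    /* on a big-endian host the 1 lands in c[3] */
    }

Combined with the fact that the TIFF header itself says whether a file is 'II' or 'MM', a run-time test like that would be enough to decide when byte swapping is needed.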
Regards,
-Edward