2006.04.01 04:44 "[Tiff] WORDS_BIGENDIAN makes libtiff platform dependent (3.8.2) and you have to undefine it on MacTel", by ping shu

2006.04.03 13:14 "Re: [Tiff] WORDS_BIGENDIAN makes libtiff platform dependent", by Edward Lam

Sorry, I should have taken a close look first. Looking in tif_open.c, we have:

        { union { int32 i; char c[4]; } u; u.i = 1; bigendian = u.c[0] == 0; }

Thus, libtiff was automatically detecting the endianness as Joris mentioned.

So the real question is: Why did we change it? Andrey/Frank?


Earlier versions were using a define named 'HOST_BIGENDIAN' in libtiff/tiffconf.h:

        /* HOST_BIGENDIAN native cpu byte order: 1 if big-endian (Motorola)
         * or 0 if little-endian (Intel); this may be used
         * in codecs to optimize code */

so it does not seem like there was a big change to me. Maybe this had already been figured out before in your build environment and you just did not realize it?

 > Libtiff really does need to know the endian order, the size of 'short',
 > 'int', 'long', bit order, etc., in order to work correctly for a given
 > target.

Curious. I just checked my copy of libtiff 3.7.0. In tif_config.h, I have HOST_BIGENDIAN defined to 0:

    /* Native cpu byte order: 1 if big-endian (Motorola) or 0 if little-endian
       (Intel) */
    #define HOST_BIGENDIAN 0

I also have WORDS_BIGENDIAN mentioned:

    /* Define to 1 if your processor stores words with the most significant byte
       first (like Motorola and SPARC, unlike Intel and VAX). */
    /* #undef WORDS_BIGENDIAN */

Note that when building, I do *not* have WORDS_BIGENDIAN defined as a compiler command line option.

Now here's the kicker:

% grep HOST_BIGENDIAN *.[ch]
tif_config.h:#define HOST_BIGENDIAN 0
% grep WORDS_BIGENDIAN *.[ch]
tif_config.h:/* #undef WORDS_BIGENDIAN */

Although they're mentioned, they don't seem to be used at all! Taken with your statement that libtiff requires knowing the host endian order, how does libtiff 3.7.0 work at all? (Or maybe it doesn't, since I'm only ever on Intel CPUs. :)