1994.09.16 04:17 "TIFF Bit Ordering Versus Fill Order", by John M Davison

1994.09.19 11:48 "Re: TIFF Bit Ordering Versus Fill Order", by Niles Ritter

Bit order and byte order are not necessarily related. I have experience working with devices whose byte order differed from their bit order.

What sort of devices?

I noticed that the libtiff code contains a comment to the effect of, "How can bit-order be determined at runtime?" Are we talking here about the host CPU or an independent display device of some sort? If the latter, then it is hopeless for libtiff to try to determine "bit-order" at runtime.
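
As for the bit order of the data itself, that is a separate question again: TIFF's FillOrder tag only says whether the bits within each data byte run MSB-first (FillOrder=1) or LSB-first (FillOrder=2), and converting between the two is a per-byte bit reversal that does not depend on the host at all. A throwaway sketch of that operation (not libtiff's actual code; a real implementation would likely use a 256-entry lookup table):

#include <stdio.h>

/* Reverse the bits of one byte: bit 0 <-> bit 7, bit 1 <-> bit 6, etc.
 * This is what a reader has to apply to every data byte when the file's
 * FillOrder does not match the order it wants internally. */
static unsigned char reverse_bits(unsigned char b)
{
    unsigned char r = 0;
    int i;

    for (i = 0; i < 8; i++)
        r = (r << 1) | ((b >> i) & 1);
    return r;
}

int main(void)
{
    printf("0x01 reversed is 0x%02x\n", reverse_bits(0x01)); /* prints 0x80 */
    return 0;
}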

Byte order is naturally defined by the addressing mechanism, but bits are not addressed directly. However, a possible way to define host bit order would be to use C bit-field structs. As an experiment, I compiled and ran the following code on an SGI, an HP, a Sun, and a Macintosh, and under VAX/VMS, Ultrix, and the Alpha OS. (Sorry, I couldn't find anyone around here who uses PCs -- could someone run this on a PC and report back?)

#include <stdio.h>

/* Two bit-fields covering one byte: the first-declared field is the
 * "high" bit, the remaining seven are the "low" bits.  Where the compiler
 * puts highbit within the byte reveals the bit-field fill order. */
struct HighLowBits {
    unsigned int highbit: 1;
    unsigned int lowbits: 7;
};

int main(void)
{
    struct HighLowBits testbits;
    int one = 1;
    int bigendian = (*(char *)&one == 0);   /* byte-order test from libtiff */

    printf("This is a %s machine\n", bigendian ? "BigEndian" : "LittleEndian");

    /* Determine bit order: set only the first-declared field and look at
     * the resulting byte.  128 means the field landed in the MSB,
     * 1 means it landed in the LSB. */
    testbits.highbit = 1;
    testbits.lowbits = 0;
    printf("Byte with highbit set has value %d\n", *(unsigned char *)&testbits);

    return 0;
}

On the Sun, HP, Macintosh and SGI the code returned:

   This is a BigEndian machine
   Byte with highbit set has value 128

While under VAX/VMS, Ultrix and Alpha the code returned:

   This is a LittleEndian machine
   Byte with highbit set has value 1

So, there does appear to be a strong correlation between bit and byte order. Are there any platforms on which this code returns any combination other than these two? Or does the result depend on the compiler rather than the hardware? (Strictly speaking, the order in which a compiler allocates bit-fields within a unit is implementation-defined.) In any case, if bit-order is to have any practical meaning to programmers, then this would appear to be a way to determine it objectively at runtime; the simplest form of the test would be something like:

  struct HighLowBits {
     unsigned int highbit:  1;
     unsigned int lowbits:  7;
  };
  unsigned char char_one = 1;
  int msb_to_lsb;

  /* If the set low-order bit of char_one does NOT show up in the
   * first-declared field, the fields are filled MSB-to-LSB. */
  msb_to_lsb = (*(struct HighLowBits *)&char_one).highbit == 0;
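
If this test holds up across compilers, the two probes could be folded into a single runtime check and cached at startup. A self-contained sketch along those lines (the function names here are my own invention, not anything that exists in libtiff):

#include <stdio.h>

struct HighLowBits {
    unsigned int highbit: 1;
    unsigned int lowbits: 7;
};

/* Nonzero if the first-declared bit-field lands in the most significant
 * bit of the byte, i.e. bit-fields are filled MSB-to-LSB. */
static int host_fills_msb_to_lsb(void)
{
    struct HighLowBits testbits;

    testbits.highbit = 1;
    testbits.lowbits = 0;
    return (*(unsigned char *)&testbits & 0x80) != 0;
}

/* Nonzero on a big-endian host (the same byte-order test libtiff uses). */
static int host_is_bigendian(void)
{
    int one = 1;
    return *(char *)&one == 0;
}

int main(void)
{
    printf("byte order: %s, bit-field order: %s\n",
           host_is_bigendian() ? "BigEndian" : "LittleEndian",
           host_fills_msb_to_lsb() ? "MSB-to-LSB" : "LSB-to-MSB");
    return 0;
}

(This version writes into a struct the program owns and inspects its first byte, rather than reinterpreting the address of a lone char, so it never has to read past the end of char_one.)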

I would be interested to see whether anyone can come up with a non-equivalent alternative (programmer's-level) definition.

--Niles