I was able to replicate the problem using the file on the FTP server (and I think Kris and I have the same MacBook Pro model/config). One thing to be aware of is that these machines have two video cards:
Intel HD Graphics 4000 (512 MB VRAM).
NVIDIA GeForce GT 650M (1024 MB VRAM).
I noticed two interesting details though:
Selecting a smaller subset worked as expected.
When I tried selecting a subset that was fairly close to the actual "size" of the previewed data, I apparently became like Neo in The Matrix. I've attached some screenshots of what I saw.
This is how things looked immediately after displaying.
I noticed that the bizarro texture(?) has some structure to it. At the exact same moment the X-Files theme song began to play in the background.
Th-that's a mangled version of this very thread, which I was looking at in Chrome! It's like a visual representation of dereferencing a pointer that's aimed at the wrong chunk of memory (in C or C++).
I found a useful app called "gfxCardStatus" that lets you force the system into integrated/Intel or discrete/NVIDIA graphics mode via a menu bar option:
I was able to confirm that the full image displays fine in discrete/NVIDIA mode but appears corrupted in integrated/Intel mode (though not in such an interesting way as on Jon's machine!). See screenshots:
The automatic graphics switching is pretty reliable about enabling the discrete card whenever the current process/application is issuing OpenGL commands… so I'm pretty sure the discrete card was already active for my previous post.
However, I went ahead and "forced" the discrete graphics card[1] and have attached the result.
1: Using http://gfx.io/, which will also tell you whether you are using the integrated or discrete card (along with the process that triggered the discrete card).
We'll need to verify that we're not using the version of Java3D that ships with Apple's Java, and that Kris's and Jon's machines are using the 1.5 (or possibly 1.6) version bundled with McV. A quick way to check is sketched below.
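For what it's worth, here's a minimal sketch (the class name is mine) that dumps which Java3D implementation the JVM actually loaded; if you run it with the same JVM and classpath that McV uses, the vendor/version strings should make it obvious whether it's Apple's Java3D or the bundled one:

    import java.util.Map;
    import javax.media.j3d.VirtualUniverse;

    public class WhichJava3D {
        public static void main(String[] args) {
            // Global properties of the Java3D implementation that got loaded.
            Map props = VirtualUniverse.getProperties();
            System.out.println("j3d.version  = " + props.get("j3d.version"));
            System.out.println("j3d.vendor   = " + props.get("j3d.vendor"));
            System.out.println("j3d.renderer = " + props.get("j3d.renderer"));
        }
    }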
I did some testing to verify that multi-tiling (invoked when an image exceeds the GPU's maximum texture width) and NPOT vs. POT texture handling work correctly. There have been some cases, which I've only seen on ATI cards, where these properties are reported incorrectly by the driver, so we might try overriding them manually on the command line. The sketch below dumps what Java3D itself thinks the limits are.
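Something along these lines should show the texture limits as Java3D sees them, rather than what an external profiler reports; Canvas3D.queryProperties() exposes keys like "textureWidthMax" and "textureNonPowerOfTwoAvailable" (the class name is made up, and I'm assuming queryProperties() will create a GL context behind the scenes if one doesn't exist yet):

    import java.awt.GraphicsConfiguration;
    import java.util.Map;
    import javax.media.j3d.Canvas3D;
    import com.sun.j3d.utils.universe.SimpleUniverse;

    public class TextureCaps {
        public static void main(String[] args) {
            GraphicsConfiguration config = SimpleUniverse.getPreferredConfiguration();
            // An offscreen canvas is enough for querying rendering properties.
            Canvas3D canvas = new Canvas3D(config, true);
            Map props = canvas.queryProperties();
            for (String key : new String[] {"native.version", "textureWidthMax",
                    "textureHeightMax", "textureNonPowerOfTwoAvailable"}) {
                System.out.println(key + " = " + props.get(key));
            }
        }
    }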
Yeah, I was poking around the same sorts of things… according to various OpenGL profilers, the textures went bad any time the texture width and/or height reached 8192, even though the driver reports GL_MAX_TEXTURE_SIZE as 16384.
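One cheap way to sanity-check that mismatch is a proxy-texture probe, which asks the driver whether it would actually accept a given texture size. Here's a rough sketch assuming JOGL 2.3.x on the classpath (older JOGL releases used the javax.media.opengl package and a slightly different factory signature; class name is made up). Note a proxy texture only catches outright rejection, so corruption like we're seeing could still slip through, but it would tell us whether the 16384 claim is even allocatable:

    import com.jogamp.opengl.*;

    public class TextureSizeProbe {
        public static void main(String[] args) {
            GLProfile profile = GLProfile.getDefault();
            GLAutoDrawable d = GLDrawableFactory.getFactory(profile)
                    .createOffscreenAutoDrawable(null, new GLCapabilities(profile),
                                                 null, 64, 64);
            d.display();                      // forces context creation
            d.getContext().makeCurrent();
            GL2 gl = d.getGL().getGL2();

            int[] max = new int[1];
            gl.glGetIntegerv(GL.GL_MAX_TEXTURE_SIZE, max, 0);
            System.out.println("GL_MAX_TEXTURE_SIZE: " + max[0]);

            // Ask the driver, via a proxy texture, whether each size would
            // actually be accepted; a rejected size comes back with width 0.
            for (int size = 1024; size <= max[0]; size *= 2) {
                gl.glTexImage2D(GL2.GL_PROXY_TEXTURE_2D, 0, GL.GL_RGBA, size, size,
                                0, GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, null);
                int[] w = new int[1];
                gl.glGetTexLevelParameteriv(GL2.GL_PROXY_TEXTURE_2D, 0,
                                            GL2.GL_TEXTURE_WIDTH, w, 0);
                System.out.println(size + "x" + size + " -> "
                                   + (w[0] == 0 ? "rejected" : "ok"));
            }
            d.getContext().release();
            d.destroy();
        }
    }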