The graphics shown on-screen still need to use Chip RAM; it's only off-screen storage that gains a benefit. So an application loading the JPEG can decode it into Fast RAM, but to actually display it, it will need to open a window — whose bitmap lives in Chip RAM — and copy the pixel data there.
It could also be that the software you're using is forcing the image to be decoded into Chip RAM for some reason, though it shouldn't... What are you using to view them? What size and depth are they? What depth is the screen you're displaying them on?