All:
I'm finally getting around to playing with compression in 14.10, and I'm a bit puzzled. I've got a really large, wide table -- 88 million rows, 276 columns, 2,299 bytes per record -- that seems like an excellent candidate for compression. But when I ran a compression estimate on it, the savings were less than 8%. The majority of the columns are numeric, but still, I'd expect better than that.
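For reference, the estimate came from the standard sysadmin task API, something along these lines (table/database/owner names genericized here, not my actual ones):

```sql
-- Run from the sysadmin database; reports estimated compression ratios
-- for the named table without actually compressing anything.
EXECUTE FUNCTION task('table estimate_compression',
                      'mybigtable', 'mydatabase', 'informix');
```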
When I unload the table data and pipe it through 'gzip -1', I get something on the order of 73% savings.
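In case the methodology matters, the gzip comparison was along these lines -- a sketch with a synthetic stand-in for the unloaded data (file paths and the generated rows are illustrative, not my actual table):

```shell
#!/bin/sh
# Generate a stand-in for an unloaded table: pipe-delimited, mostly numeric rows.
awk 'BEGIN { for (i = 0; i < 100000; i++)
               printf "%d|%d|%.2f|SOMETEXT|\n", i, i % 100, i / 7 }' > /tmp/unload.dat

orig=$(wc -c < /tmp/unload.dat)
gzip -1 -c /tmp/unload.dat > /tmp/unload.dat.gz
comp=$(wc -c < /tmp/unload.dat.gz)

# Percentage saved by gzip at its fastest (least aggressive) setting.
echo "savings: $(( (orig - comp) * 100 / orig ))%"
```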
Anyone have experience with this?
Note: For testing, I made both the extent size and the next size very small, so we shouldn't be looking at a lot of empty tablespace.
TIA,
- TJG
------------------------------
TOM GIRSCH
------------------------------
#Informix