English Amiga Board


Coders > Coders. General
Old 09 July 2018, 10:15   #21
meynaf
son of 68k
meynaf's Avatar
 
Join Date: Nov 2007
Location: Lyon / France
Age: 46
Posts: 3,591
Quote:
Originally Posted by ross View Post
You can try Crunch-Mania LZ-H and if I'm not mistaken also RNC Propack have a LZH variant.
But sure Arj m7 is much more effective (depack speed like ZIP Deflate).
In addition Crunch-Mania has a sample mode so this can be tested directly.


Quote:
Originally Posted by NorthWay View Post
Your numbers do not sound(!) right. IIRC you typically see 40-50% reduction in size.
As said, LZ isn't efficient on samples. Furthermore, some samples will be very hard to compress because they contain high frequencies (relative to their replay freq).


Quote:
Originally Posted by SKOLMAN_MWS View Post
Even the fastest of these will be too slow for 68000 usage.
meynaf is offline  
Old 09 July 2018, 14:26   #22
phx
Natteravn

phx's Avatar
 
Join Date: Nov 2009
Location: Herford / Germany
Posts: 1,479
Coming back to Mr. Huffman: Huffman on deltas, followed by an LZ compression, seems to be the best approach.


The Huffman algorithm is straightforward, but as I understand it you have to store, in the output file, a table of the bit patterns that were used to encode each byte. This requires some additional space, so the gain on small samples will be limited, right?



I'm currently thinking about the optimal format for such a Huffman encoded file...
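To see why the deltas help the Huffman pass at all, here is a minimal Python sketch (the synthetic sine is only a stand-in for a real 8-bit sample): the byte histogram of a smooth sample is spread out, while its delta histogram clusters around zero, and the Shannon entropy is the lower bound a Huffman coder approaches.

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte
    (a lower bound on what a Huffman coder can achieve)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def delta_encode(samples: bytes) -> bytes:
    """Replace each 8-bit sample by the difference to its predecessor
    (modulo 256); for smooth waveforms the differences cluster near 0."""
    out = bytearray(len(samples))
    prev = 0
    for i, s in enumerate(samples):
        out[i] = (s - prev) & 0xFF
        prev = s
    return bytes(out)

# A smooth synthetic "sample": a slow sine, 8-bit signed stored as unsigned.
raw = bytes(int(127 * math.sin(i / 40.0)) & 0xFF for i in range(4096))
print(entropy_bits_per_byte(raw))                # many spread-out values
print(entropy_bits_per_byte(delta_encode(raw)))  # deltas cluster near 0
```

On this input the delta stream needs clearly fewer bits per byte, which is exactly the headroom the Huffman stage exploits; samples with strong high-frequency content lose this advantage, as noted earlier in the thread.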
phx is offline  
Old 09 July 2018, 14:57   #23
ross
Per aspera ad astra

ross's Avatar
 
Join Date: Mar 2017
Location: Crossing the Rubicon
Age: 49
Posts: 2,154
Quote:
Originally Posted by phx View Post
Coming back to Mr. Huffman: Huffman on deltas, followed by an LZ compression, seems to be the best approach.
No, this is not the best approach.

A much better approach is what the LZH-family compressors usually do: in addition to the 256 literal symbols, add the LZ token references to the Huffman tree.
If you want, you can chop the input into blocks and regenerate the Huffman tree (wholly or in part) to optimize the output code, but choosing these 'split points' is a compression art in itself.

-
EDIT, to try to give a quick explanation: compressing a file is a form of redundancy reduction.
Two possible methods are: use statistics to encode the symbols more efficiently (Huffman), or use a better encoding for repeated strings via a dictionary (pardon the over-simplified LZ description..).
Obviously, if used together they influence each other. Huffman has the bigger impact, since it transforms symbols into a variable number of bits; and as LZ needs groups of repeated bytes to work, if it is applied after Huffman it finds a destroyed and completely unusable context.
Therefore all compressors that also use an entropy pass (Huffman, arithmetic/range coding, ANS) should apply it last, perhaps inserting among the statistical data also that obtained from the LZ tokens, giving them significant weight compared to the normal symbols.
-

But instead of complicating your life, it is much better to use an already available algorithm, also because the subject becomes very complex..
LHA, ZIP, ARJ, LZX.. all are of the LZH family.
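To make the "LZ tokens in the Huffman tree" idea concrete, here is a toy Python sketch. The LZ77 parse is a naive greedy matcher, and the alphabet layout (literals 0-255, match-length tokens from 256 up), window size and length limits are invented for illustration, not any real format's:

```python
import heapq
from collections import Counter

def lz_tokenize(data: bytes, window=4096, min_len=3, max_len=18):
    """Greedy LZ77 parse: yields either a literal byte (symbol 0-255)
    or a (256 + length - min_len, distance) match token."""
    i, tokens = 0, []
    while i < len(data):
        best_len, best_dist = 0, 0
        for j in range(max(0, i - window), i):  # naive O(n^2) search
            l = 0
            while l < max_len and i + l < len(data) and data[j + l] == data[i + l]:
                l += 1
            if l > best_len:
                best_len, best_dist = l, i - j
        if best_len >= min_len:
            tokens.append((256 + best_len - min_len, best_dist))
            i += best_len
        else:
            tokens.append((data[i], None))
            i += 1
    return tokens

def huffman_code_lengths(freqs):
    """Code length per symbol for ONE Huffman tree spanning the merged
    literal + match-token alphabet."""
    heap = [[f, [sym]] for sym, f in freqs.items()]
    heapq.heapify(heap)
    lengths = Counter()
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        for sym in a[1] + b[1]:
            lengths[sym] += 1          # every merge adds one bit of depth
        heapq.heappush(heap, [a[0] + b[0], a[1] + b[1]])
    return dict(lengths)

data = b"abracadabra abracadabra abracadabra"
tokens = lz_tokenize(data)
freqs = Counter(sym for sym, _ in tokens)  # literals and matches share one table
lengths = huffman_code_lengths(freqs)
```

The point is the single frequency table: frequent match tokens get short codes exactly like frequent literals, which is the "significant weight" mentioned above.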

Quote:
The Huffman algorithm is straight-forward, but as I understand you have to save a table of bit-patterns in the output file, which were used to encode every byte. This requires some additional space, so the gain on small samples will be limited, right?

I'm currently thinking about the optimal format for such a Huffman encoded file...
Yes, straightforward, but not trivial to implement in a fast manner.
And yes, you need extra space for the tree in the output file (search for 'canonical Huffman' for an optimal encoding of it).
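For reference, the canonical-Huffman trick in a few lines of Python (the symbols and lengths below are invented for illustration): codes are assigned in a fixed order derived from the lengths alone, so the output file only needs to store one length per symbol instead of the full bit patterns.

```python
def canonical_codes(lengths):
    """Assign canonical Huffman codes from code lengths alone:
    symbols are sorted by (length, symbol) and given consecutive
    code values, left-shifted whenever the length increases."""
    code, prev_len, codes = 0, 0, {}
    for sym, l in sorted(lengths.items(), key=lambda kv: (kv[1], kv[0])):
        code <<= (l - prev_len)
        codes[sym] = (code, l)   # (code value, bit length)
        code += 1
        prev_len = l
    return codes

codes = canonical_codes({'a': 1, 'b': 2, 'c': 3, 'd': 3})
# -> a=0 (1 bit), b=10 (2 bits), c=110 (3 bits), d=111 (3 bits)
```

The decoder rebuilds exactly the same table from the lengths, which is why formats like Deflate store only length lists in the header.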

Last edited by ross; 09 July 2018 at 19:34.
ross is offline  
Old 09 July 2018, 15:00   #24
meynaf
son of 68k
meynaf's Avatar
 
Join Date: Nov 2007
Location: Lyon / France
Age: 46
Posts: 3,591
Quote:
Originally Posted by phx View Post
Coming back to Mr.Huffman on Deltas, followed by an LZ-compression. That seems to be the best approach.
In theory this is all done together, so that the sequence (= repeat-area command) is Huffman encoded as well.


Quote:
Originally Posted by phx View Post
The Huffman algorithm is straight-forward, but as I understand you have to save a table of bit-patterns in the output file, which were used to encode every byte. This requires some additional space, so the gain on small samples will be limited, right?
Right. For this reason it may be interesting to merge several small files together.


Quote:
Originally Posted by phx View Post
I'm currently thinking about the optimal format for such a Huffman encoded file...
That's a tricky task. LZX seems to be state of the art in regard to this, but probably too complex.
meynaf is offline  
Old 09 July 2018, 16:03   #25
ross
Per aspera ad astra

ross's Avatar
 
Join Date: Mar 2017
Location: Crossing the Rubicon
Age: 49
Posts: 2,154
Quote:
Originally Posted by meynaf View Post
That's a tricky task. LZX seems to be state of the art in regard to this, but probably too complex.
Yes, LZX is very well constructed (and can be considered the progenitor of the modern LZH-family compressors).
A great enhancement was the previous-match concept (a fast selection from a queue of 3 match offsets already encountered).
This is extended in LZMA, with other great improvements (moreover, entropy encoding with arithmetic/range coding rather than Huffman).
The concept is also used in aplib and many others (e.g. our Blueberry's Shrinkler, but that is a CM compressor, so another planet..).

Unfortunately here we enter the world of NP-hard problems, so a perfect solution is not computable in usable time.

But some optimal solutions exist.
ross is offline  
Old 17 July 2018, 09:20   #26
SKOLMAN_MWS
Registered User

 
Join Date: Jan 2014
Location: Poland
Posts: 145
Quote:
Originally Posted by meynaf View Post
Even the fastest of these will be too slow for 68000 usage.
We do not know how TTA 2.0 performs on the 68000, because there is no port.
Attached Files
File Type: rar tta20.rar (15.0 KB, 35 views)
SKOLMAN_MWS is offline  
Old 17 July 2018, 09:51   #27
meynaf
son of 68k
meynaf's Avatar
 
Join Date: Nov 2007
Location: Lyon / France
Age: 46
Posts: 3,591
Quote:
Originally Posted by SKOLMAN_MWS View Post
We do not know how TTA 2.0 performs on the 68000, because there is no port.
We don't need a port to know. Merely looking at the compression methods shows it's far out of reach.
It would be really naive to think IIR (an Infinite Impulse Response model) is doable in a small enough time on the 68000. This kind of computation needs many multiplies even for a single sample (check the formulas on the TTA website).
Then come the Rice codes, which alone are enough to take 100% of the CPU time and more.
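For reference, a Rice code in a few lines of Python (the parameter k and the zigzag mapping are the usual conventions, not TTA's exact layout). The decoder has to walk the unary prefix one bit at a time, which is where the per-sample CPU cost comes from:

```python
def zigzag(n: int) -> int:
    """Map signed residuals to non-negative: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return 2 * n if n >= 0 else -2 * n - 1

def rice_encode(value: int, k: int) -> str:
    """Rice code: quotient value >> k in unary ('1' * q + '0'),
    followed by the low k bits of the remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def rice_decode(bits: str, k: int) -> int:
    """Decoding scans the unary prefix bit by bit -- this loop is
    what eats the CPU on a plain 68000."""
    q = 0
    while bits[q] == "1":
        q += 1
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

# e.g. value 9 with k = 2: quotient 2, remainder 1 -> "110" + "01"
```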
meynaf is offline  
Old 17 July 2018, 10:39   #28
SKOLMAN_MWS
Registered User

 
Join Date: Jan 2014
Location: Poland
Posts: 145
[ Show youtube player ]

SKOLMAN_MWS is offline  
Old 17 July 2018, 12:09   #29
meynaf
son of 68k
meynaf's Avatar
 
Join Date: Nov 2007
Location: Lyon / France
Age: 46
Posts: 3,591
Do you think it will magically solve the lack of power of the 68000?
A 50 MHz 68030 is much faster than that, and is barely able to do FLAC, which is simpler than TTA (because IIR is tougher than LPC).

Anyway this is not at all the OP's target.
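To illustrate the LPC point: FLAC's fixed predictors are pure add/subtract work, while TTA's adaptive IIR filter needs several multiplies per sample. A minimal Python sketch of the order-2 fixed predictor (the sample values are made up):

```python
def fixed_predict_residuals(samples, order=2):
    """FLAC-style order-2 fixed predictor: residual
    e[n] = s[n] - 2*s[n-1] + s[n-2], i.e. only adds, subtracts
    and a shift -- cheap even on a 68000."""
    res = list(samples[:order])
    for n in range(order, len(samples)):
        res.append(samples[n] - 2 * samples[n - 1] + samples[n - 2])
    return res

def fixed_predict_restore(res, order=2):
    """Inverse: s[n] = e[n] + 2*s[n-1] - s[n-2]."""
    out = list(res[:order])
    for n in range(order, len(res)):
        out.append(res[n] + 2 * out[n - 1] - out[n - 2])
    return out

smooth = [0, 3, 7, 12, 18, 25, 31, 36, 40, 41, 39, 35]
residuals = fixed_predict_residuals(smooth)  # small values, cheap to entropy-code
```

An adaptive IIR stage replaces those constant coefficients with ones updated every sample, hence the multiply count meynaf refers to.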
meynaf is offline  
Old 17 July 2018, 12:17   #30
SKOLMAN_MWS
Registered User

 
Join Date: Jan 2014
Location: Poland
Posts: 145
But you are probably thinking about playing 16-bit 44 kHz music files, and I did not write anything like that. It is about compressing 8-bit samples.
SKOLMAN_MWS is offline  
Old 17 July 2018, 12:21   #31
meynaf
son of 68k
meynaf's Avatar
 
Join Date: Nov 2007
Location: Lyon / France
Age: 46
Posts: 3,591
You will see little or no speed change going from 16 to 8 bits. Ok, you can make it 4 times faster by going to 10 kHz samples, but the 68000 is still out of its league.

And again this is not the OP's target.
meynaf is offline  
Old 17 July 2018, 13:51   #32
phx
Natteravn

phx's Avatar
 
Join Date: Nov 2009
Location: Herford / Germany
Posts: 1,479
I have to say that I didn't make any further experiments with Huffman encoding for now. I still have about 40K left on the disk and will probably only need some more samples for the menu. Also, meynaf's hint that I have to merge several small sample files to get the best compression ratio out of Huffman is a problem for me, as I have multiple small samples (usually 6-10K) which must be loaded in various arrangements for every game world.
phx is offline  
Old 17 July 2018, 15:34   #33
Thorham
Computer Nerd

Thorham's Avatar
 
Join Date: Sep 2007
Location: Rotterdam/Netherlands
Age: 43
Posts: 3,086
Quote:
Originally Posted by meynaf View Post
Methods such as ADPCM are good for real time, but worthless if you decrunch only while loading.
Depends on the resulting sound quality, space saving and speed.
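For a feel of that tradeoff, a toy IMA-style 4-bit ADPCM codec in Python (the step table is truncated and the clamping is for signed 8-bit samples; this illustrates the scheme, not any Amiga player's exact codec). The ratio is a fixed 2:1 on 8-bit input, and quality depends on how well the step adaptation tracks the waveform:

```python
import math

STEP_TABLE = [7, 8, 9, 10, 11, 12, 13, 14, 16, 17, 19, 21, 23, 25, 28, 31,
              34, 37, 41, 45, 50, 55, 60, 66, 73, 80, 88, 97, 107, 118]
INDEX_ADJ = [-1, -1, -1, -1, 2, 4, 6, 8]  # indexed by the 3 magnitude bits

def adpcm_encode(samples):
    """Encode signed 8-bit samples to 4-bit codes (sign + 3 magnitude
    bits). Also returns the reconstruction the decoder will produce,
    since the encoder predicts from its own decoded output."""
    pred, idx, codes, recon = 0, 0, [], []
    for s in samples:
        step = STEP_TABLE[idx]
        diff = s - pred
        code = 8 if diff < 0 else 0
        diff = abs(diff)
        delta = step >> 3                       # rounding term
        if diff >= step:
            code |= 4; diff -= step; delta += step
        if diff >= step >> 1:
            code |= 2; diff -= step >> 1; delta += step >> 1
        if diff >= step >> 2:
            code |= 1; delta += step >> 2
        pred = max(-128, min(127, pred - delta if code & 8 else pred + delta))
        idx = max(0, min(len(STEP_TABLE) - 1, idx + INDEX_ADJ[code & 7]))
        codes.append(code)
        recon.append(pred)
    return codes, recon

def adpcm_decode(codes):
    """Mirror of the encoder's reconstruction path."""
    pred, idx, out = 0, 0, []
    for code in codes:
        step = STEP_TABLE[idx]
        delta = step >> 3
        if code & 4: delta += step
        if code & 2: delta += step >> 1
        if code & 1: delta += step >> 2
        pred = max(-128, min(127, pred - delta if code & 8 else pred + delta))
        idx = max(0, min(len(STEP_TABLE) - 1, idx + INDEX_ADJ[code & 7]))
        out.append(pred)
    return out

# demo: a slow sine as a stand-in for an 8-bit sample
samples = [int(100 * math.sin(i / 10.0)) for i in range(200)]
codes, recon = adpcm_encode(samples)
```

Decoding is a handful of shifts, adds and two table lookups per sample, which is why ADPCM is viable in real time on a 68000 where Rice/IIR schemes are not.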
Thorham is offline  
Old 14 October 2018, 06:09   #34
NorthWay
Registered User
 
Join Date: May 2013
Location: Grimstad / Norway
Posts: 612
How did the compression work out for you?

I just looked at a few of my MODs and saw that SQSH reduced the size by 35-40%, though I don't know how much comes from non-sample data.
NorthWay is offline  
 

