Just read this Wikipedia article and didn't understand anything.

An alternative view is that compression algorithms implicitly map strings into vectors in an implicit feature space, and that compression-based similarity measures compute similarity within these feature spaces. For each compressor C(·) one defines an associated vector space ℵ, such that C(·) maps an input string x to a vector whose norm is ||x̃||. An exhaustive examination of the feature spaces underlying all compression algorithms is impractical; instead, three representative lossless compression methods are examined: LZW, LZ77, and PPM.[29]
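To make that concrete, here is a minimal sketch (my own, not from the article) of a compression-based similarity measure, the normalized compression distance (NCD), using Python's zlib (a DEFLATE/LZ77-family compressor) as the C(·). The function names and test strings are purely illustrative.

```python
import zlib

def c(data: bytes) -> int:
    """C(.): length of the zlib-compressed (DEFLATE, LZ77-family) representation."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).

    Roughly 0 for very similar strings, approaching 1 for unrelated ones.
    """
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

if __name__ == "__main__":
    x = b"the quick brown fox jumps over the lazy dog " * 20
    y = b"the quick brown fox jumps over the lazy cat " * 20
    z = bytes(range(256)) * 4  # unrelated, hard-to-compress data
    print(ncd(x, y))  # small: the two texts share almost all of their structure
    print(ncd(x, z))  # larger: little shared structure for the compressor to exploit
```

The point is that the compressor never sees an explicit feature vector; the "features" are whatever repeated substrings and statistics it can exploit, and similarity falls out of how much extra space the concatenation xy needs beyond the more compressible of the two inputs.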

According to AIXI theory (a connection explained more directly in the context of the Hutter Prize), the best possible compression of x is the smallest possible program that generates x.
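In other words, the ideal compressed size of x is its Kolmogorov complexity, which is uncomputable in general. As a hedged toy illustration (again my own, with zlib standing in for an ideal compressor), a string that obviously has a short generating program compresses far better than an incompressible one of the same length:

```python
import random
import zlib

def compressed_length(data: bytes) -> int:
    """Computable upper bound standing in for 'length of the shortest program that outputs data'."""
    return len(zlib.compress(data, 9))

# Highly regular string: a tiny program ("repeat 'ab' 5000 times") could generate it,
# and a general-purpose compressor shrinks it accordingly.
regular = b"ab" * 5000

# Pseudo-random string of the same length: no short generating program is expected,
# so it barely compresses at all.
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10_000))

print(len(regular), compressed_length(regular))  # 10000 bytes -> a few dozen bytes
print(len(noisy), compressed_length(noisy))      # 10000 bytes -> close to 10000
```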
 
Smart yap
 
What kind of autists write these articles?
 
