26 points | by ganitam 2 days ago
8 comments
Hasn't this been the case for a while? I vaguely remember using the optimal e8 packing to try to get denser product quantization for vector embeddings.
She proved it
How did you apply e8 to vector embeddings?
Chunk up your 128D embedding into 16 8D pieces, then quantize using the optimal packing points. Performs worse than k-means in my experience.
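A minimal sketch of what that scheme looks like, assuming the standard E8 nearest-point algorithm via the two D8 cosets (E8 = D8 ∪ (D8 + ½)); the function names and the `scale` parameter are illustrative, not from any particular library:

```python
import numpy as np

def nearest_d8(x):
    # D8 = integer vectors with even coordinate sum.
    f = np.round(x)
    if int(f.sum()) % 2 != 0:
        # Parity is wrong: move the coordinate with the largest
        # rounding error to its second-nearest integer.
        i = int(np.argmax(np.abs(x - f)))
        f[i] += 1.0 if x[i] > f[i] else -1.0
    return f

def nearest_e8(x):
    # E8 = D8 union (D8 + 1/2): quantize in both cosets, keep the closer.
    a = nearest_d8(x)
    b = nearest_d8(x - 0.5) + 0.5
    return a if np.sum((x - a) ** 2) <= np.sum((x - b) ** 2) else b

def e8_product_quantize(v, scale=1.0):
    # Split a 128-D embedding into 16 blocks of 8 and snap each block
    # to the nearest (scaled) E8 lattice point.
    blocks = v.reshape(-1, 8) / scale
    return np.array([nearest_e8(b) for b in blocks]).reshape(-1) * scale
```

The `scale` knob trades resolution against code-length, since the lattice itself is fixed; picking it per-dataset is roughly the part k-means gets for free by fitting its centroids to the data.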
I always think it's funny when someone says "I am a chair" and really means it
I think it technically means they have a permanent endowed position.
Ceci n'est pas une chaise
Thank god, I need to go rearrange the 8-dimensional cannonballs in the trunk of my car