jb55 on Nostr:
Same. In this case, it’s the idea of neural networks taking advantage of the sparsity of the embeddings to encode more features than just the dimensionality of the vector space (the set of orthogonal vectors). I probably can’t do it justice in a nostr note after a few whiskeys.
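
This is the "superposition" picture: an n-dimensional space has only n exactly orthogonal directions, but far more nearly orthogonal ones, so if features are sparse (few active at once) a network can give each feature its own almost-orthogonal direction and still read them back with little interference. A minimal sketch of that idea in Python with NumPy; the dimension and feature counts here are arbitrary choices for illustration, not anything from the note:

import numpy as np

rng = np.random.default_rng(0)
d, n_features = 256, 1024   # 1024 feature directions packed into a 256-dim space

# Random unit vectors in high dimensions are nearly orthogonal:
# pairwise cosines concentrate around 0 (roughly 1/sqrt(d)).
W = rng.normal(size=(n_features, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)
off_diag = W @ W.T - np.eye(n_features)
print("max |cos| between distinct feature directions:", np.abs(off_diag).max())

# Encode a sparse feature vector (only k features active) by summing their
# directions, then decode each feature with a dot-product readout. With
# high probability the top-k scores are exactly the active features.
k = 4
active = rng.choice(n_features, size=k, replace=False)
x = W[active].sum(axis=0)          # superposed representation in R^d
scores = W @ x
print("active:   ", np.sort(active))
print("recovered:", np.sort(np.argsort(scores)[-k:]))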
Published at 2024-11-20 03:20:20
Event JSON
{
  "id": "d47c84a99c916afddfe71ca62e7473c68c37f66be489f72095d96645ed3b282d",
  "pubkey": "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245",
  "created_at": 1732072820,
  "kind": 1,
  "tags": [
    ["e", "86c27c56ba6be1d0f4e5f9b0591d41692aa6d853cbb1194dd3ae3e85799fa3a1", "", "root"],
    ["e", "870fcd6259bc8303284bf3b83c5732d2e2757a69e90c3c6496c69c0ac2938703", "", "reply"],
    ["p", "39cd6acd40e0301b250ebff26c7b181ee0a039b769f59e1ee9158fb5a2427731"],
    ["p", "416ca193aa5448b8cca1f09642807765cc0ee299609f972df0614cfb8ea2f2b1"]
  ],
  "content": "Same. In this case, it’s the idea of neural networks taking advantage of the sparsity of the embeddings to encode more features than just the dimensionality of the vector space (the set of orthogonal vectors). I probably can’t do it justice in a nostr note after a few whiskeys.",
  "sig": "ae90b63a9f013e56ae501f9052b4d0441b02f6f4ef55a5d005905e3da463e1f0ddf19bb04f7abb33c65f58a704e4b43bc5636b0de5ba4f8937327060ce5861a6"
}
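
The "id" field above is not arbitrary: under NIP-01 it is the SHA-256 hash of the event's canonical serialization [0, pubkey, created_at, kind, tags, content], and "sig" is a Schnorr signature over that hash. A minimal sketch of the id computation using only Python's standard json and hashlib modules; it should reproduce the id above if the fields are copied exactly:

import hashlib
import json

# The event fields, copied verbatim from the JSON above.
event = {
    "pubkey": "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245",
    "created_at": 1732072820,
    "kind": 1,
    "tags": [
        ["e", "86c27c56ba6be1d0f4e5f9b0591d41692aa6d853cbb1194dd3ae3e85799fa3a1", "", "root"],
        ["e", "870fcd6259bc8303284bf3b83c5732d2e2757a69e90c3c6496c69c0ac2938703", "", "reply"],
        ["p", "39cd6acd40e0301b250ebff26c7b181ee0a039b769f59e1ee9158fb5a2427731"],
        ["p", "416ca193aa5448b8cca1f09642807765cc0ee299609f972df0614cfb8ea2f2b1"],
    ],
    "content": "Same. In this case, it’s the idea of neural networks taking advantage of the sparsity of the embeddings to encode more features than just the dimensionality of the vector space (the set of orthogonal vectors). I probably can’t do it justice in a nostr note after a few whiskeys.",
}

# NIP-01: serialize [0, pubkey, created_at, kind, tags, content] as compact
# JSON (no whitespace, UTF-8, non-ASCII characters kept verbatim), then hash.
payload = [0, event["pubkey"], event["created_at"], event["kind"],
           event["tags"], event["content"]]
serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
print(hashlib.sha256(serialized.encode("utf-8")).hexdigest())
# Expected: d47c84a99c916afddfe71ca62e7473c68c37f66be489f72095d96645ed3b282d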