brittenedbor on Nostr:
My suggestion is to go for ~low epoch training and iterate on your dataset / parameters frequently instead of wasting compute on big training sessions; you will learn much more and get much better results from 10 small training sessions than from 3 medium ones, for example.
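As a rough sketch of that loop (build_dataset and train_model below are hypothetical stand-ins for your own training code, and the values are only illustrative):

# Sketch of the "many small runs" workflow: each run uses a low epoch count,
# and the dataset / parameters change between runs instead of within one long run.
# build_dataset and train_model are hypothetical placeholders for your own code.

from itertools import product

learning_rates = [1e-4, 3e-4]                                        # illustrative
dataset_variants = ["v1", "v1_cleaned", "v1_augmented", "v2", "v2_cleaned"]

results = []
for lr, data_version in product(learning_rates, dataset_variants):   # 10 short runs
    dataset = build_dataset(data_version)            # hypothetical helper
    metrics = train_model(dataset, lr=lr, epochs=2)  # low epoch count per run
    results.append((data_version, lr, metrics))
    print(data_version, lr, metrics)                 # inspect after every short run

# Ten short runs like these surface dataset / parameter problems quickly;
# a few long runs would burn the same compute before revealing them.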
Published at 2024-11-26 19:20:37
Event JSON
{
  "id": "a121483814f3f80513548e146a402d2dcb156d4f0a23d3cb55b2c04109753dc4",
  "pubkey": "0d62b2224829d50a3a4050ca3ccdf85b5a3e102b86a6dd9dcbe5828503ce7e1f",
  "created_at": 1732648837,
  "kind": 1,
  "tags": [
    ["e", "f5ff89024ac0c8a9ae7d137a5991f1eb2af9e13af00d7af397f446de6e0004bc", "", "root"],
    ["e", "280f547688d91cbac632cb86d7dc1ccc9d58284a98fa0092d0467205f0713b82", "", "reply"],
    ["p", "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245", "", "mention"]
  ],
  "content": "My suggestion is to go for ~low epoch training and iterate your dataset / parameters frequently instead of wasting compute on big training sessions, you will learn so much more and get much better results doing 10x small training sessions than 3x medium for example",
  "sig": "01c070eca0d7305e178a2130b1c56355acb390ce7cc5d96c6edc895822bc6e6fba8399d378eff025f2a2d1a81951060480ce4251520fcb9119b741957f4a7feb"
}
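For context on the fields above: per NIP-01, the id is the SHA-256 hash of a canonical serialization of [0, pubkey, created_at, kind, tags, content]. A minimal Python sketch of that check (it approximates NIP-01's exact string-escaping rules with json.dumps; verifying sig would additionally need a secp256k1 Schnorr library):

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: id = sha256 of the UTF-8 JSON serialization of
    # [0, pubkey, created_at, kind, tags, content] with no extra whitespace.
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Recomputing the id for the event above should reproduce its "id" field.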