dk on Nostr:
all nostr clients use algorithms
most just choose the algorithm of sorting messages in timestamp order
this is the most neutral algorithm, and it’s attractive ideologically
it’s a stark contrast to the corporate-controlled engagement algorithms we’re accustomed to on the rest of the web. It feels like a breath of fresh air. it makes nostr feel peaceful
but it also makes it hard to find areas of activity and energy which might match my interests. what happened while I was gone? what don’t I know about yet that I might find interesting?
if an algorithm is designed to attempt to bait me into anger/rage for the purposes of serving me an ad, then this is an algorithm I would like the power to opt out of. It’s designed to serve me an ad, not to inform me
I want to choose algorithms that serve my interest to learn from people I trust (e.g. show me the notes from people I don’t follow, but who are most zapped by people I have zapped; surface notes that are replies to notes I may have seen and zapped but which are gaining a lot of zaps/replies themselves)
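to make the first idea concrete, here’s a minimal sketch of a zap-weighted discovery score. everything in it is hypothetical (the types, the data sources, the function names); it assumes zap receipts have already been fetched from relays and flattened into simple records, and it only shows the ranking logic, not a real client:

```typescript
// Simplified zap receipt: who zapped whom, on which note, and for how much.
// (Hypothetical shape; a real client would parse these out of zap receipt events.)
interface Zap {
  senderPubkey: string;    // who paid the zap
  recipientPubkey: string; // author being zapped
  noteId: string;          // note the zap was attached to
  amountSats: number;
}

// Score authors I don't follow by how many sats they receive
// from the people I myself have zapped (my "trusted" set).
function scoreUnfollowedAuthors(
  myZaps: Zap[],         // zaps I have sent
  networkZaps: Zap[],    // zaps sent by others, fetched from relays
  followed: Set<string>, // pubkeys I already follow
): Map<string, number> {
  const trusted = new Set(myZaps.map((z) => z.recipientPubkey));

  const scores = new Map<string, number>();
  for (const zap of networkZaps) {
    if (!trusted.has(zap.senderPubkey)) continue;     // only count zaps from people I trust
    if (followed.has(zap.recipientPubkey)) continue;  // skip people already in my feed
    if (zap.recipientPubkey === zap.senderPubkey) continue;
    scores.set(
      zap.recipientPubkey,
      (scores.get(zap.recipientPubkey) ?? 0) + zap.amountSats,
    );
  }
  return scores;
}

// Surface the top-N authors outside my follow list,
// ranked by zaps from the people I have zapped.
function topRecommendedAuthors(scores: Map<string, number>, n = 10): string[] {
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, n)
    .map(([pubkey]) => pubkey);
}
```

the same pattern extends to the second example: weight replies by the zaps/reply activity they’re gathering, restricted to threads rooted in notes I’ve already zapped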
we can create algorithms that serve the needs of users and not the needs of corporations. They can be transparent (how do they work?) and pluggable (add or remove whichever algorithms you see fit). This is a different approach with different incentives from any existing algorithms in social media. “Show me the incentives and I’ll show you the outcome.” We’ve never seen the outcome with this set of incentives.
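here’s a rough sketch of what “transparent and pluggable” could look like in a client. the interface and the example algorithms are assumptions, not any existing client’s API: each algorithm is a small, named, human-readable ranking function the user can enable or disable, and the feed is just a sort by whichever one is active:

```typescript
interface FeedNote {
  id: string;
  authorPubkey: string;
  createdAt: number;    // unix seconds
  zapTotalSats: number;
  replyCount: number;
}

interface FeedAlgorithm {
  name: string;
  description: string;  // shown to the user: "how does it work?"
  score(note: FeedNote, now: number): number;
}

// The "neutral" default: plain reverse-chronological order.
const chronological: FeedAlgorithm = {
  name: "chronological",
  description: "Newest notes first, nothing else considered.",
  score: (note) => note.createdAt,
};

// An opt-in alternative: boost notes gathering zaps and replies,
// discounted by age so old notes fall away. Weights are arbitrary.
const risingActivity: FeedAlgorithm = {
  name: "rising-activity",
  description: "Ranks notes by zaps and replies, discounted by age in hours.",
  score: (note, now) => {
    const ageHours = Math.max((now - note.createdAt) / 3600, 1);
    return (note.zapTotalSats + 50 * note.replyCount) / ageHours;
  },
};

// The client just sorts by the user's chosen algorithm; adding or
// removing an algorithm is adding or removing an entry in a list.
function buildFeed(
  notes: FeedNote[],
  algorithm: FeedAlgorithm,
  now = Math.floor(Date.now() / 1000),
): FeedNote[] {
  return [...notes].sort((a, b) => algorithm.score(b, now) - algorithm.score(a, now));
}

// The user's current selection, editable in settings.
const enabledAlgorithms: FeedAlgorithm[] = [chronological, risingActivity];
```

the point isn’t these particular scoring rules; it’s that the rules are small enough to read, described in plain language, and swappable by the person whose feed they shape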
I want a nostr that’s more alive and points me towards hubs of energy/activity, but is structured in service of me