That's incredible. Within the first 5 posts we've got "Javascript is overrated", "tabs are better than spaces", something about trains/public transit, and a link to a hentai site. Talk about speed-running social media.
I've always thought that a problem with sites like Reddit and Hacker News is that a very small percentage of users engage with "new"; most people only see the posts that have been curated by that small minority, which creates _some_ sort of bias (arguably, a positive one).
I've wanted to try something like a Hacker News where your homepage shows a random smattering of posts where the probability you'll see any particular one depends on its number of likes.
In other words, rather than having a firehose of "new" posts from which a few are elevated to the home page (masses), give everyone a dynamic home page which is mostly items that have been liked by many, but includes a mix of some that haven't made that threshold yet. Maybe instead of pure likes it could be a ratio of likes to views.
But the point is some way to engage everyone in the selection of what makes the homepage. It could even be as simple as "keep HN as is, but include 5 posts randomly chosen from recent submissions and tag them as such."
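For concreteness, here's a minimal sketch of that kind of weighted sampling, assuming nothing more than a list of posts with like counts (the names and numbers are made up for illustration):

    import random

    # Hypothetical post data: (post_id, likes). The +1 gives brand-new
    # posts a small but nonzero chance of landing on someone's homepage.
    posts = [("p1", 120), ("p2", 3), ("p3", 0), ("p4", 45)]

    def sample_homepage(posts, k=3):
        weights = [likes + 1 for _, likes in posts]
        # random.choices samples with replacement, so over-sample and
        # dedupe; a real feed would sample without replacement.
        picks = random.choices(posts, weights=weights, k=k * 3)
        page, seen = [], set()
        for post in picks:
            if post[0] not in seen:
                seen.add(post[0])
                page.append(post)
            if len(page) == k:
                break
        return page

    print(sample_homepage(posts))

Swapping the weight for something like likes / (views + 1) would give the ratio variant.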
I've been working on this kind of thing for the past several years (for a while full time as an attempted entrepreneur, now on the side for the past couple of years). The latest iteration is https://yakread.com -- hit "take a look around" and you can see the "home page"/a list of recommendations without signing up. If you've signed up, the recommendations are personalized, i.e. the probability you'll see any particular post depends on your individual interactions with past posts (it does collaborative filtering with Spark MLlib). So that may be a bit different from what you had in mind, since your comment sounds more like an unpersonalized system with some extra exploration thrown in. However, in practice I suspect the biggest thing the collaborative filtering is doing at Yakread's current scale (not much) is learning which items are good/bad in general.
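For anyone curious what that looks like in practice, here's a rough sketch of implicit-feedback collaborative filtering with Spark MLlib's ALS -- the schema and numbers are invented for illustration, not Yakread's actual pipeline:

    from pyspark.sql import SparkSession
    from pyspark.ml.recommendation import ALS

    spark = SparkSession.builder.appName("recs-sketch").getOrCreate()

    # Hypothetical interaction log: one row per (user, item) read/click.
    interactions = spark.createDataFrame(
        [(1, 10, 1.0), (1, 11, 1.0), (2, 10, 1.0), (2, 12, 1.0)],
        ["userId", "itemId", "strength"],
    )

    # implicitPrefs=True treats the values as confidence in an implicit
    # signal (views/clicks) rather than explicit star ratings.
    als = ALS(userCol="userId", itemCol="itemId", ratingCol="strength",
              implicitPrefs=True, rank=10, coldStartStrategy="drop")
    model = als.fit(interactions)

    # Top 5 recommendations per user.
    model.recommendForAllUsers(5).show(truncate=False)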
I also have some exploration methods baked in. "Epsilon-greedy" is a common, simple approach where x% of the recommendations are purely random. I do a bit more of a linear thing: I rank all the posts by how many times they've been recommended, pick a percentage from 0 to 100, and throw out the top x% most popular (previously recommended) items. That also gives you some flexibility to try out different distributions for the x% variable.
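If I'm reading the linear approach right, it's roughly this -- a sketch under my own assumptions, not Yakread's actual code, with a uniform draw as just one possible distribution for x:

    import random

    def exploration_pool(ranked_posts, cutoff_dist=random.random):
        """ranked_posts is sorted most-recommended first. Draw x in
        [0, 1) from some distribution and drop the top x% most-exposed
        items, leaving the tail to recommend from."""
        x = cutoff_dist()
        start = int(len(ranked_posts) * x)
        return ranked_posts[start:]

    ranked = ["a", "b", "c", "d", "e", "f", "g", "h", "i", "j"]
    print(random.choice(exploration_pool(ranked)))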
Curation IS arguably one of the most critical factors. A library is useful because of what it doesn't contain, not because of what it does contain. The hypothetical "library of all possible books" isn't useful to anyone.
That's a long way of agreeing with you that there's something positive in the curation bias of Hacker News and other sites.
Of course anything can be hijacked, and metrics proverbially tend to become targets (and hence a dumb arms race), but the general concept of the value of curation is sound.
Thanks -- totally agree that curation is essential, and I suspect my original point may have come across as advocating against curation, which I wasn’t.
My goal isn’t to randomize the homepage or flatten quality, but to involve a broader swath of users in the curation process. It’s currently dominated by the few who browse “new”, essentially a self-selected minority of curators.
Concretely, I was imagining something like:
* Every new post is shown to a small % of users as part of their regular homepage (not in a “new” tab they’d have to seek out).
* Posts that get engagement from that slice are shown to more users, and so on — a gradual ramp-up based on actual interest rather than early-bird luck.
So it’s not removing filtering; it’s just moving from a binary gate (past the goalpost = homepage) to a more continuous, probabilistic exposure curve.
Curation still happens, but more people get to participate in it, and the system becomes more robust to time-of-day luck or early vote pile-ons.
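To make the shape of that concrete, here's a toy sketch of such an exposure curve (my own assumptions and numbers, not a proposal for HN's actual ranking): every post gets a small floor of homepage impressions and earns more as its engagement-per-impression improves.

    import random

    def exposure_probability(engagements, impressions, floor=0.02, prior=5):
        """Chance a given user's homepage includes this post. New posts
        start near `floor`; the `prior` smooths the ratio so one early
        upvote doesn't dominate."""
        rate = (engagements + 1) / (impressions + prior)
        return min(1.0, max(floor, rate))

    def should_show(post):
        return random.random() < exposure_probability(
            post["engagements"], post["impressions"])

    print(should_show({"engagements": 8, "impressions": 120}))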
Anyway, I mostly wanted to clarify that I’m not against filtering -- just having a thought experiment about how we might make it more adaptive and inclusive.
Does that clarify my point? Any thoughts? I appreciate your engagement!
Serious question: how often would you tolerate randomly displayed posts that are completely outside your interests? Would you click or skip? Plenty of Fish (the Canadian dating site programmed by Markus Frind) had a feature like this: during onboarding you could choose the types of people you thought you preferred (e.g. brunette/blonde, etc.), and if you didn't click on them later, the algorithm started showing you different results…
Interesting anecdote on Plenty of Fish. It’s definitely interesting how people aren’t really good at telling you what they like; empirical evidence is far better. I believe Paul Graham has an essay on a similar topic where if you ask people if they like an idea you have for a product, they are likely to say yes even if they wouldn’t actually use it. But if you ask them how much they would pay for access, or if they’d pay a certain amount, you’d get a more accurate response.
FWIW, I wasn’t suggesting pure randomness, though; it’s more like weighted randomness. Rather than a binary threshold a post must pass to make the homepage (which divides the community into curators and consumers), this would show you posts probabilistically, with a probability proportional to the likes each one has garnered.
Btw, I’m not sure what you meant by randomness being an underdog. Are you implying it’s a nice goal but it rarely works out in practice, perhaps because people actually do fall into natural curator/consumer buckets?
This is a common idea, and it is a super-hard problem. Sites like Reddit are increasingly sampling this way by mixing more new posts into hot. That kind of solution may improve HN without really changing it. Ultimately, we don't all want the same feeds, and user majorities tend to stifle key minorities such as early adopters.
I'm building such a system for PrizeForge and am several steps further along on this. You need a decent background in probability and a more robust understanding of what a "good" outcome is to work on this kind of thing.
https://positron.solutions/careers if you do Rust and care a whole lot about this problem and can deal with the challenges we're going to have with PrizeForge.
I loaded it up to find myself presented with a completely broken visual appearance, screaming audio, and all sorts of hate content and other horrors. Absolutely terrible.
Every attempt at anonymous social fails terribly. Unfortunately, without good moderation tooling this doesn't work. You have to limit the blast radius of what people put out there, because on the internet, without repercussions, a lot of people say terrible things. After having tried a few things like this, it's clear identity matters. Yes, you can find clever ways to get around it and surface the best content or have pseudonyms and all that, but essentially you must verify individuals. People need to know they can't just say anything, i.e. they can't say terrible, hurtful things without repercussions. You need a unique identifier per browser, per device, etc., and then you permaban anyone who uses vulgar language, racial slurs, or anything of the sort immediately. You use AI before the fact to actually moderate and check things, and anything that is attempted gets a warning.
Someone else said, let's go a step further and not post at all. You know what, YES. We have X, we have Facebook, we have many tools where you can create an account with any random email address, with any random name, and say anything you want. Let's leave behind the two decades of public social and go back to the real world: maybe there's a world in which you own what you say, it's there forever, you have to be thoughtful before you say it, and we also put it in an appropriate place, in a category of communication that makes sense. Blogging was way better than microblogging, but obviously opening the door to everyone made social media far more viral and addictive. Adding pictures and video made it even more viral and addictive. It would just be nice to go back to something a bit more real, something that's not going to be horribly abused, and it feels like part of that might mean less public social.
> You have to limit the blast radius of what people put out there because on the internet, without repercussions, a lot of people say terrible things
And yet those same repercussions are the reason why social media is full of inoffensive slop. No one wants to be the one who gets fired for leaking their employer's unethical practices, after all.
Pseudonymity is not enough, sadly. Given enough time, you'll leak enough data points to be identified.
> Let's leave behind the two decades of public social and go back to the real world
The idea that the "internet" and the "real world" are separate has been outdated for a long time.
> There are no 'likes', no followers, no algorithms here. There is only the freedom to express yourself with radical anonymity and connect purely.
> Start being FREE
Love the concept, let’s go one step further. No publishing at all. Only you can see what you wrote. No need to connect to anything, even if it’s “pure”.
Indeed. I'm already free to express myself with radical anonymity. Does that site make me any freer to do so and have it reach the people I want it to reach? No, not at all.
To be anonymous you cannot use gstatic.com, because that way you're just telling Google "I was there."
Also, the Tailwind CSS and unpkg CDNs can sell data to others, and there goes your anonymity.
I'm sure you can fix this.
Thankfully, uBlock Origin in advanced mode can block those requests by default. It's just very irritating, and I'd say disrespectful, when you HAVE to unblock them for the site to work.
You might as well not use the web / internet at this point.
Unless you use BusyBox or some esoteric OS to browse the web, almost every browser or OS (macOS, Linux, Windows) will ping Google or some other spyware-ridden site.
Your site about "pure freedom of expression" and "raw thoughts" stops me from posting about buying new golf balls because of a badwords.js word filter.
Why are your Terms of Service and Privacy Policy a collection of empty bullet points?
Even taking into account that the list seems biased towards certain Hispanic regions rather than English, you still can't talk about the Final Fantasy currency ("gil"), beans ("judías"), black coffee or any black beverage/object/thing at all ("negro"), taking/catching/picking/grabbing things ("coger"), seashells ("concha", which is also literally a person's name[1]), slugs ("babosa"), Python snakes ("piton"), ladybugs ("mariquita"), or china porcelain ("china", and words related to China in general).
And they apparently don't like Elden Ring, since they don't want anyone talking about Queen Marika the Eternal ("marika").
[1]: https://en.wikipedia.org/wiki/Concha_(name)
Well, badwords.js is client side, so if you just block it from loading you can post about balls and butts and slurs as much as you like. Absolutely incredible stuff.
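For the curious, a single uBlock Origin static filter should be enough to block it, assuming the script stays at the path linked elsewhere in the thread:

    ||libreantisocial.com/badwords.js^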
To the first point, could there not be an algorithm that nevertheless still works without followers, likes, or ads? TikTok seems kind of similar, at least when it initially started. Its core algorithm doesn't seem to need likes; it can detect whether you engage with a video via other metrics, like time spent watching it.
I suppose you'd want to allow people to self select based on topics, and then, well, you essentially get Reddit.
You can follow accounts, which is a form of curation. I dropped all social media except this one because it was a time-sink, but I used to find that following accounts resulted in a useful feed on Instagram.
Ah, the good old "freedom to" that totally forgets that freedom should also include a "freedom from". Free like in a chaotic war zone, where you can commit any kind of horrible atrocity, but if you just want to have a nice time, that isn't an option, because everybody behaves like an idiot. No thanks.
Freedom in reality isn't just a thing you increase by reducing the rules. There comes a point where fewer rules result in less freedom. So it is always a balancing act between your freedom to do X and others' freedom not to be subjected to X. Example: giving up the freedom to murder random people is a small price to pay if it means reducing the risk of being murdered yourself, especially since decent people wouldn't have the actionable urge to murder each other anyway. If you're a murderer, however, that may reduce your freedom in ways you dislike. But then, maybe, your freedom shouldn't matter as much.
Maybe I would care more about that specific idea of active freedom if I routinely witnessed real repression of any idea that isn't just the mean bullshit of egocentrics who have lost all touch with humanity and just want to see the world burn.
As someone who loves dark mode, I hate that extension. I get the appeal of forcing every website into dark mode, but that ends up breaking half of them or just making them look like trash. People need to custom-design their dark modes for each site's particular design.
If you don't care about breaking designs and things looking the way they're supposed to, I guess the extension is fine, but I'd rather use something like Stylus, where you can use custom-designed stylesheets people have made for most well-known sites.
No way to filter by subjects one may be interested in, no way to find like-minded individuals, no way to mute people, so yeah, not much value can be derived from it. I think there is space for some pseudo-anonymous social network, but this is not it, at least not in its current state.
I hope TW Körner doesn't have that last footnote on the topic..
(Fwiw left/right (meta-)division seems to me to encompass all the moral taxonomies of the day, TWK just happened to settle for rich/smart vs social wrt UK)
>Social Darwinism applies the Darwinian doctrine of survival of the fittest to human society. Rich social Darwinists take wealth as the best indication of fitness to survive, academic social Darwinists take intellectual achievements as the best indication and so on. They are often haunted by the fear that the unfit do not understand this and may outbreed the fit.
>Curious to see how it evolves and what kind of conversations happen when metrics are removed.
I've seen a ton of projects like this. It will get deluged with spam and shitposts; people will stop caring once the novelty wears off and will leave once it becomes a cesspool. Probably for 4chan, which at least has a culture and an established userbase. And porn.
Looks like you only do frontend validation and have XSS vulnerabilities all over the place. I also see you aren't taking down Nazi posts and CP? I went ahead and reported you to the FBI; hopefully your vibe lawyer is good.
I've always thought there was a space for a social media platform that has upvotes and followers and all that, with the intention of creating echo chambers and segregating users into their own groups and that sort of thing, but with the ability for anyone to go on a 'holiday' into the other echo chambers, to see what those other users experience.
Likes are not one-dimensional. Likes flow from one person to another. If you like someone's posts, you're more likely to enjoy the things they like. A network of endorsement emerges, and the subgroups can become clear.
The important thing would be for moderators to be able to police "what is liked/disliked". Hear me out.
If you've got this space where dozens or hundreds of people all have a high overlap of favorable content, but there's this one turd who comes in and downvotes everything, always... he's not just a little different, and he's not assimilating. He's trying to sabotage. If this was visible to a moderator, that moderator could decide he doesn't belong to the group. I don't advocate that he no longer be able to view the content, but maybe his votes just stop counting. Maybe he's no longer able to post content of his own (would be up to the moderator, I think, perhaps his content was always good enough, but his voting is counterproductive).
I think that on places like reddit they avoided this functionality because it would give moderators too much control over their communities, and outsiders would be unable to come in and eventually take over and force the original group out. Being admins, they could of course have done this anyway, but it would require them to be heavy-handed and obvious.
I think if you have a saboteur, they're probably not part of your 'network'. The people you've endorsed probably won't have endorsed the saboteur, so the saboteur's activity should not affect your feed in any meaningful way. This is how trust works in real social circles.
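A tiny sketch of that intuition: weight each vote by how close the voter is to you in the endorsement graph, so a saboteur nobody in your network endorses contributes roughly nothing (the graph and weights below are invented for illustration):

    # Hypothetical endorsement graph: who each user has endorsed.
    endorsements = {
        "alice": {"bob", "carol"},
        "bob": {"carol"},
        "carol": {"alice"},
        "troll": set(),
    }

    def trust(viewer, voter, depth=2):
        """Full weight for people you endorse directly, half weight for
        people they endorse, zero for anyone outside your network."""
        if voter == viewer:
            return 1.0
        frontier, weight = {viewer}, 1.0
        for _ in range(depth):
            frontier = set().union(*(endorsements.get(u, set()) for u in frontier))
            if voter in frontier:
                return weight
            weight *= 0.5
        return 0.0

    def personalized_score(viewer, votes):
        """votes: list of (voter, +1 or -1), weighted by the viewer's trust."""
        return sum(v * trust(viewer, voter) for voter, v in votes)

    votes = [("bob", 1), ("carol", 1), ("troll", -1)]
    print(personalized_score("alice", votes))  # the troll's downvote counts for nothing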
Moderation clearly works, but it's a shame it relies on individual people doing a good job. It's a shame the moderation can't be done by everyone, all the time, unconsciously.
I've always liked the idea of being able to just post anonymously without any votes, but it always quickly devolves into incoherent spam and endless slurs.
The website has been active for an hour and I'm already seeing some of that. It always turns into 4chan.
This is kind of the same idea as Yik Yak, which was an interesting app. Some good aspects (authenticity) and some bad aspects (lots of bomb threats and bullying).
All of us humans are fundamentally racist, spammy, and have some form of hate in us. Depending on its state, society can suppress that or bring it out. Anonymous boards like this bring out the worst in all of us.
Just as the current administration has catered to our bigoted, racist selves. Anonymous boards have more or less the same effect.
As a developer on the ARPANET I've been on the net since the moment it existed. This is the arch opposite of the early internet, which was dominated by intelligent academics sharing technical articles and dense information.
That's exactly it. I would say that the experiment is not so novel, but non-novel experiments are good, too. We retread illuminating experiments from physics in school for good reason.
There are quite a lot of message boards and imageboards, large and small, which check the same boxes. Many of them have not gotten enough traction to catch the attention of the mainstream (e.g. lainchan), and have a distinct vibe.
I think that as soon as your userbase expands into a representative sample of the Internet user population, your platform's culture will become increasingly similar (i.e. "average out") to the largest platforms, e.g. 4chan.
With none of those features, there is no excuse for the site working so badly with JS disabled. Look at HN for comparison: it degrades gracefully without JS.
I like the project, but the words "Libre" and "experiment" made me assume it would be open source. Of course, it's the owner's decision, but the title biased my expectation.
The problem with openness and anonymity is that it invites bad actors. Social media is an unsolved problem, and any platform that gets sufficiently large becomes more valuable as a tool for disseminating misinformation and propaganda than as a tool for people to actually communicate freely and openly.
I asked you to post the word that you were not allowed to post, not a list of words that are forbidden on the site. For some reason, you don't want to post your comment or the words here.
"I didn't say they are ... just some set of words that the creator doesn't personally approve of."
"And what are those words? Post them here."
"I don't have the list of word that are forbidden, only the creator does, so ask him."
You say "I asked you to post the word that you were not allowed to post, not a list of words that are forbidden on the site" but that is mistaken ... see the above context.
"For some reason, you don't want to post your comment or the words here."
Dang, has anything like this been considered?
The source is at https://github.com/jacobobryant/yakread
> A library is useful because of what it doesn't contain, not because of what it does contain.
That's an archive, and it has its own uses for researchers, especially historians.
> The hypothetical "library of all possible books" isn't useful to anyone.
That's not an archive, and has no uses even for researchers, especially not for historians.
No profiles, usernames, or followers
No likes, trending topics, or algorithms
No ads, no data collection
Just anonymous thoughts from people around the world. Curious to see how it evolves and what kind of conversations happen when metrics are removed.
The badwords.js isn't even particularly hidden if you check the page's source: https://libreantisocial.com/badwords.js
Also no freedom, with all your content restrictions.
That is trivially disproven by observing that massive numbers of people use Twitter, Facebook, etc.
There's nothing about your site that grants me the sort of freedoms I want, including the freedom to locate useful content.
It currently leaks cross-site user tracking information to Google (www.gstatic.com).
Being a haven for sharing very illegal content is probably how it's going to evolve... unfortunately.
https://news.ycombinator.com/item?id=24154247
(For some reason the Algolia DB wasn't properly normalised.)
You talk about evolution ... that's change over time. How it evolves is how you decide to change it.
https://en.wikipedia.org/wiki/Shadow_banning
Seems legit.
“Could not send the report. Please try again.”
It seems pretty clear to me no one that cares is on the other end of this.
Perhaps the more fake and curated the internet becomes, the more people will act out when given the chance?
I run into false positives on like every third post I try to make
"I didn't say they are ... just some set of words that the creator doesn't personally approve of."
"And what are those words? Post them here."
"I don't have the list of word that are forbidden, only the creator does, so ask him."
You say "I asked you to post the word that you were not allowed to post, not a list of words that are forbidden on the site" but that is mistaken ... see the above context.
"For some reason, you don't want to post your comment or the words here."
I do have reasons not to accede to your demands, and I won't respond further to https://en.wikipedia.org/wiki/Sealioning