I don't think it's any less science, inasmuch as science seeks to explain the natural world. It's just at a higher level of complexity and a different point in the learning curve than more externally observable levels of science.
Would we say that Copernicus was a charlatan or not a scientist because the heliocentric model turned out to be wrong? As you acknowledge, Freud pushed the collective understanding further.
Then how do we know that Freud was "wrong" or "inaccurate" or "just made up" or "a fraud"?
It's definitely a lot more murky and there are massive gaps in our understanding between biology and neuroscience and psychology, and fundamental differences and limits on methodology that we may never surpass, but his work still has its place on the timeline of progress does it not?
What about something like phrenology? It's easy to laugh at it now and consider its proponents charlatans or lunatics, but at the time it was considered a worthy avenue of exploration. It turned out to be a dead end, but that's part of science.
Maybe let's not leap to medicalizing large swathes of the human condition and just accept eccentrics as part of life.
I agree that a healthy dose of skepticism and acknowledgement of our rudimentary understanding is warranted, but it does start to sound a little anti-science. I don't think there's anything wrong with continuing to explore and attempting to explain or put words to these things even though they are near the highest level of complexity in nature and the hardest to empirically evaluate.
Are NSAIDs considered to be medicalizing large swathes of the human condition (or caffeine, or alcohol for that matter)? Where is the line between a universally accepted and ubiquitous pharmaceutical and an overmedicalized one? I think we should be moving more towards the question of "do you feel like this medication benefits you or would benefit you?" than "do you check these boxes in the DSM and officially receive this diagnosis".
I too experience many of these things, and I have been called autistic by numerous people independently, but in that tongue-in-cheek manner of our generation that has watered it down a lot. I'm nothing even close to the people on Love on the Spectrum, or the kids in grade school that were essentially in special ed.
I think yeah the language has gotten very ambiguous and the "spectrum" is so wide and ill-defined that we need more and better words, but, I do also feel like it isn't just everyone's shared experience. I do feel like there are a lot of people who don't really experience these things, that aren't stuck in a constant self-conscious hyper-analysis and reflection loop, and are able to just kind of go with the flow a bit more (which is not to say that they don't have troubles or anxieties).
Edit: I will also note that I did have a similar reaction to you to this game. I didn't even go past the intro because I felt like I knew what it was. I would call this something like autism-lite, and it probably is pretty widespread, particularly in HN-like circles. It does feel a little bit confusing and even offensive to compare it to "capital A autism," an actual disability, but that's where our lexicon is right now.
> I think yeah the language has gotten very ambiguous and the "spectrum" is so wide and ill-defined that we need more and better words, but, I do also feel like it isn't just everyone's shared experience.
We used to have other words. Asperger's used to be a separate condition but was merged into one diagnosis. I wonder if there was a reason the experts who study this decided to go with fewer words?
Have you tried adding additional adjectives? That's usually what I do when the word I want is too general, and isn't as specific as I want to be.
In the DSM-5, they merged several conditions into one diagnosis called Autism Spectrum Disorder. At the same time, they defined ASD as having levels 1, 2, and 3. Those levels are defined by how much support the individual needs. Level 1 is "requiring support", level 2 is "requiring substantial support", and level 3 is "requiring very substantial support". Asperger's diagnosis would generally correspond with Level 1 ASD.
That doesn't really help with the social side of describing it though.
There's a small problem with the definition of "requires support," because growing up, I was smart enough, and good enough at masking, that I never "required support." Arguably, I still probably don't. But once I grew up and started to look for ways to improve my mental health, my life very quickly shifted from surviving ok-ish to thriving and improving.
So many people insist that it doesn't count unless you're completely or meaningfully incapacitated. But that's a stupid place to put the bar.
"Requires support" does not mean you are completely incapacitated without that support, nor does it mean you will always require the same support. If your life shifted from surviving ok-ish to thriving and improving when you found tools to help yourself, to me that sounds like you were meaningfully incapacitated before.
There are many conditions for which a key point of diagnosis is the impact on your life, and that's a conversation to have with the practitioner doing the diagnosis. It's a starting point, not a bar; unfortunately, nuance gets lost a lot once it's talked about in the social sphere and used as common parlance.
I think it's because it did kind of used to mean that. It described people that couldn't mask, couldn't totally function in society, couldn't have the kind of job depicted in the Autism Simulator. It's been expanded officially and colloquially which may not have been the right direction with the terminology. I think the DSM and the approach of trying to follow and fit in with more concretely diagnosable medical conditions may be considered harmful and too rigid.
For more mild and gray-area cases, it's really more akin to personality and it should be about understanding the particular combination of traits or symptoms of an individual. I wouldn't be officially diagnosed with OCD, or depression, or BPD, or maybe even ADD, but I can relate to all of those on some level and I feel like learning about them helps me understand myself better (with a grain of salt just like any health thing). It doesn't make me go around telling people I'm disabled and how they need to accommodate or support me, that's just narcissism.
> I think it's because it did kind of used to mean that. It described people that couldn't mask, couldn't totally function in society, couldn't have the kind of job depicted in the Autism Simulator.
If you mean "Autism", that might be true. But I don't think "Asperger's" meant that. So we might have taken a step backwards there.
I don't really talk about it, I don't go around telling people I'm autistic, whatever it is is minor enough that I'm able to mask easily, if anything I casually reference "my ADD."
I sometimes jokingly refer to "my spectrum," but I think that word is not great either because it implies a linear gradation, when I think it's a higher dimensional space like a personality star chart.
This doesn’t really seem to touch on the problem I have with news, which is that it is all doom and gloom, FUD and outrage. The headlines I saw:
Trump, Congress deadlock as shutdown deadline nears
Taliban cuts internet nationwide, flights grounded in Afghanistan
Indonesia school collapse leaves 38 missing, 77 hurt
YouTube settles Trump suspension lawsuit for $24.5m
German court jails AfD aide for China spying
US deports 120 Iranians after deal
Russian drone strike kills family of four
Is this really what I need to know in the world? Am I staying "informed"? This is not helping with the anxiety from reading news described in the article. This is not good for people.
The default settings are interesting. Are there people doing this every 15 minutes, daily? And otherwise live a "modern" life?
I don't meditate or actively practice mindfulness, the thought of it kind of gives me anxiety in a "I don't have time for that, gotta keep moving" way (though I do daydream and stare into space a lot, but is that mindfulness or the opposite?). In that sense I would almost consider this a form of torture, it's such a short interval, feels like it would always be on my mind (but that's the point?).
What is the relationship between mindfulness and deep focus, are they at odds or complementary?
As far as I understand it, the purpose of this is to avoid daydreaming, getting lost in thoughts, or falling into auto-pilot mode. It's very easy to just start dreaming and go off on some tangent inside your head. So every few minutes there's a little wake-up call to remind you whether you're still present.
I am deeply concerned about the apathy people have towards the idea of ownership, openness and interoperability. It gives the idea that people just want to be fed TikTok and Instagram reels.
Can you expand on this feeling? Why is it deeply concerning? Why should people care about the abstract concept of data ownership? People were totally fine when they had zero ownership or agency over media and they were fed TV, books, movies, radio, etc. Most people do just want that, their primary motivation to engage with media is just to be entertained in that moment.
Now that they have places where they can publish stuff and their friends and family and maybe even some other people might see it, why should they care that they don't "own" their Instagram post, whatever that means?
It matters because your posts aren’t just entertainment in the moment — they’re your history, your proof of existence online. Platforms treat them as disposable. If Instagram dies or bans you, your years of photos, writing, and connections vanish. Owning your data means your work and identity survive these issues, if you want.
I think a lot of people treat their own content as disposable also though. I don't know if most people would really care to save or dig through their entire Twitter history, for example. The rise of Stories is evidence of this. We're moving from a culture of preserving ancient pieces of paper to swimming in a never-ending river of data where there's so many things coming at you that you just move forward and don't have a ton of time to look back.
People that really want to preserve and archive their content find a way to do it and manage it separately. I have all the pictures that I've posted to Instagram. I have anything I've written that I cared enough to keep. If and when IG dies or I move onto the next thing, am I really going to want to meaningfully preserve and transfer the specific contents of that walled garden somewhere else? Maybe. I can definitely see the value, but it doesn't seem super compelling to me yet.
There is something to be said for the uniquely curated walled gardens and the centralized trust and organization and opinions they bring. When I started an Instagram account, I didn't want to transfer my Facebook world, it's a new world with a fresh start. I didn't want the same friends, the same voice for myself, etc. I certainly wouldn't have wanted to dig through all of that to figure out what made sense to carry over.
An example: I have been a Swarm user for like, fifteen years. As soon as atproto has private records, I'll want to set up syncing that data into my PDS. It's kept track of a huge part of my life, and losing that would be sad.
You asked me to explain why this matters. I did. I think your answer is fairly dismissive. Not everyone who cares about this is going to be some terminally online edge case. It's unclear why you'd ask a question if you're not curious about the answer. Probably not an effective use of our time.
I mean... if you can still find the archives (pretty sure they're out there, but getting harder and harder to find), I have my name on lots of usenet posts from the 90s. But I'm pretty sure all my BBS posts, GEnie posts, etc. from before that are gone - they would stretch back as far as December '84, IIRC. And there's probably very little left from before 2000.
And yet, I don't lament that 10-15 years of my online life have "vanished" - I was an ignorant little snot back then, and actually, am VERY glad they HAVE vanished. And thankfully I've generally used aliases / usernames instead of my actual name in most places (other than the usenet posts that were from my university account) so that wayback can't be used against me easily. Heck - I wish I could assert/enforce a "right to be forgotten" (vanish) on some websites. Rarely have I wished (especially in this current administration) that I was MORE visible / persistent online.
> People were totally fine when they had zero ownership or agency over media
Disagree. The punk phenomenon was largely about reclaiming that ownership and agency over cultural output, and it was massive in the 70s/80s/90s. The early web was very punk in attitude, with people basically self-publishing. Even in the '00s, there was still a clear distinction between "corporate" portals and grassroots.
This phenomenon where even creatives and intellectuals are Just Fine with playing in someone else's heavily-tweaked, hyper-monetized sandbox, is a new development.
idk if the normal user should necessarily care about data ownership, but I think the incentive structure it creates would be immediately legible to most people
Did we read the same article? It spends so many words answering these exact questions with examples and helpful illustrations!
Your question:
> why should they care that they don't "own" their Instagram post, whatever that means?
From the article:
> The web Alice created—who she follows, what she likes, what she has posted—is trapped in a box that’s owned by somebody else. To leave it is to leave it behind.
On an individual level, it might not be a huge deal. However, collectively, the net effect is that social platforms—at first, gradually, and then suddenly—turn their backs on their users. If you can’t leave without losing something important, the platform has no incentives to respect you as a user.
Your question:
> can you give examples of good and bad incentive structures in this context?
From the article:
> Maybe the app gets squeezed by investors, and every third post is an ad. Maybe it gets bought by a conglomerate that wanted to get rid of competition, and is now on life support. Maybe it runs out of funding, and your content goes down in two days. Maybe the founders get acquihired—an exciting new chapter. Maybe the app was bought by some guy, and now you’re slowly getting cooked by the algorithm.
> Luckily, web’s decentralized design avoids this. Because it’s easy to walk away, hosting providers are forced to compete, and hosting is now a commodity.
I think you’re right that the average person doesn’t care so much as they just want to be entertained or reach a large network, but apathy is not an argument in favor of the status quo.
In fairness to you, I had originally skimmed the article and did later realize that some of my points had been addressed. In fairness to me, in this subthread I was responding to other commenters and asking them questions rather than commenting directly on the article itself.
At this point my argument is that the ability to switch providers is not a major concern to most users of these platforms. I don't want a generic social media hosting provider. I want the Facebook experience, or the Instagram experience, or the Twitter experience. I'm happy to be in the garden and on the rails because it's easy and tightly curated. I don't want some Frankenstein amalgamation of data from all these things. I don't want to shoehorn my Instagram world into something else.
It’s pithy because the request is pithy - if I have to explain the mechanisms at work here, I doubt you’re ever going to buy into the theory at all. A short version is what Dan already said: the entire economic foundation of social media is predicated on high exit costs. ATProto takes substantive steps to lower them. The theory in turn is that new businesses will need to develop less extractive models of viability to survive, which will in turn read legibly to users as less exploitative (you decide your feed, you can switch providers, you can choose moderation layers, etc.)
> the entire economic foundation of social media is predicated on high exit costs
No I think it's predicated on creating a product that people like to use. That's the Step 1 that OSS zealots miss when they focus entirely on these niche lofty ideals. I highly doubt the average Instagram user is yearning for - or would even be enticed by - a version of that same experience that has a lower exit cost.
That's the problem with these Twitter clones. "It's just like Twitter, but RESPECTS your data ownership" is not compelling. Just create a freaking compelling and original user experience (the actual hard part that made the big platforms successful) and secretly do whatever you want on the back end.
The reason I like Bluesky is that they understand this, and that's why the protocol stuff isn't front and center. They're focused on product first, technology second. The tech serves to create a good product, they don't build the tech first and then hope people find the product acceptable.
There’s nothing compelling and original about the twitter UX compared to all the clones. Pretty much across the board it’s just posting short messages and following others.
The entire value of a social media platform is in the network. Accumulating and maintaining one is the actual hard part that made the big players successful.
It was compelling and original when the concept didn't exist, or at least hadn't been successfully brought to market like they did. In a world where Twitter exists, and has the network, there is nothing compelling about a Twitter clone.
None of these platforms started with a network. They weren't cooked up by evil investors and MBAs looking for a rent-extraction scheme. Nor were they designed by a committee of philosophical experts saying "oh we'll just copy their thing and make it more esoteric and confusing so that maybe one day we can aggregate content from 14 competing Twitter-like platforms and you can switch between them whenever you like!" They were started largely by kids goofing around and making fun things for people.
This is clearly a wild claim that almost undermines the rest of the argument, but to the extent that we can accept that there are open source software packages that decision-makers deep in that industry will reliably choose for their business...it's not clear how this revolution will extend to "regular people." They just want easy. Make something as easy and fun as Instagram. They don't give a crap about all this, they don't want to think about it.
In the tech industry, open source has clearly won. You're right that most end users don't particularly care. The engineers building solutions definitely care, and prefer to build on top of open source dependencies.
I just don’t understand that assertion when there is very clearly still a massive proprietary closed-source software industry. How are you measuring? A lot of people use open source packages, because they’re free and easy, to build closed source products. Does that count as a win for open source? In my experience most engineers aren’t that fervent about it.
Either way, with social, and the network effects required, you’re targeting end users. The widest net with the lowest common denominator. Whatever success open source has had among people that live and breathe these issues every day is not replicable there.
I’m probably coming off as just a hater or anti-open source or something but I’m really not, I just feel like there’s a certain perspective that is a lot more niche and esoteric than its proponents realize.
I wouldn't say words are metaphors, I think of a metaphor as like a pointer, words are the actual mapping of arbitrary sounds to concepts that are relatively universally relatable to human experience. Children can learn language easily by associating those sounds with contemporaneous objects and situations from their experience. I feel like AIs will continue to be regurgitative borg-brains-in-a-vat until they have some semblance of a relatable, "lived" experience that they can map words to instead of just analyzing the patterns.
Of course I've studied logic, particularly Eastern lack of requirement of logic. Language is inherently paradoxical, its inception, development defies logic in its endless contradictions. It can't be revealed by logic. The very idea that language has something called words that are scientifically visible in the label conduit metaphor paradox indicates that language is probably our biggest contradiction.
Logic shows that you can prove anything from a contradiction. No logic means everything is true. So I don't think the rejection of logic is conducive to constructive conversation.
"Multiple contradictions in logic are complex statements where two or more propositions, taken together, assert the logical impossibility of their own truth, often resulting in two mutually exclusive conclusions"
If you understand logic, you may grasp that logic was not required to reach any analytic conclusions in multiple contradictions; scientific discourse does just fine without it and probably needs to jettison logic en route to new formats.
Why have none of the major players tried integrating a case? Making a ruggedized version? They could probably do a lot better and find ways to innovate with something integrated.
Why do they ignore the fact that so many people use cases (and the market opportunity)? It's almost a defect at this point. Some people like the personalization but I think a lot of people just want something that won't break when you drop it...
> Why do they ignore the fact that so many people use cases (and the market opportunity)?
Apple sells cases.
There are ruggedized phones available. The market is small.
You can get away without a case with a modern iPhone for longer than most people assume.
The average person does better with a $10 sacrificial case layer that snaps on to their phone that can be replaced whenever they want or if it gets damaged.
Lots of reasons. The most obvious ones that come to mind:
1. People like a variety of custom cases that themselves have features (e.g. wallet cases, random designs, etc.). If it’s built into the phone, that customization capability is worse, because you now have two layers of protection, making for a very thick and heat-insulating design.
2. It’s valuable to have partners that make accessories for your device. If you kill that line of business for them, other things may go away and those partners will want to work with you less.
3. An integrated case will still suffer cosmetic damage. But now without the option to replace, you’re stuck with that damage.
I’ve had phones before that were rugged enough to not want a case. It’s great, and I wish there were good options like that today. But I think I prefer a separate case. When mine gets enough wear and tear, I replace it and it feels like I have a whole new phone.