> We're still super early, but already these agents are showing flashes of brilliance, and we're gaining more and more conviction that this is the right form factor
Slow down, cowboy; we're seeing "flashes of brilliance" and conviction that "this is the right form factor" for writing code only!
I'm still waiting for AI/LLMs to pose a danger to jobs other than those in software development and the arts.
two thoughts:
1) did you read the linked post on malleable software (https://nothingisnttrivial.com/vines.html)? it has a very good distinction between “functional risk” (is the business logic right) and “technical risk” (was the logic implemented correctly), and says—correctly IMO—that the more control we give to end users, the lower the functional risk is.
2) sometimes it’s ok for software not to scale (https://gwern.net/doc/technology/2004-03-30-shirky-situateds...). sometimes people are solving a problem just for themselves, not for a business. i want it to be possible to build situated software in an afternoon without spending a month studying how terminals and stacks and data structures work.
macOS (since version 10) is Unix. You could say that most macOS users never use the terminal, or that back in the 1980s and 1990s none of the popular desktop OSes were based on Unix, and that would probably be more accurate.
The massive enterprise scale part is more complicated.
First of all, we need to clarify that the "people who should know how to use Unix" here are developers and system administrators. Most people don't need to know Unix, and that's fine. You sometimes see people (I get the feeling the OP might be lowkey one of them) mourning the fact that not everyone is running Linux and doing everything through the terminal. This is like saying everyone should be driving manual transmission, baking their own bread, growing vegetables in their backyard, building their own computer from parts, sewing their own clothes... you get the idea. All of these things can be cool and rewarding, but we lack the time and resources to become proficient at everything. GUIs are good for most people.
Now, the deal with developers using Unix is a much more complex story. Back in the 1970s Unix wasn't very enterprise-y at all, but it gained traction in universities and research labs and started spreading to the business world. Even well into the 1980s, the "real" enterprise was IBM mainframes, with Unix still being somewhat of a rebel, but it was clearly the dominant OS for minicomputers, which were later replaced by (microcomputer-sized but far more expensive) servers and workstations. There were other competitors, such as Lisp Machines and BeOS, but nothing ever came close to displacing Unix.
Back in the 1980s, people were not using Unix on their home computers, because their home computers were _just not powerful enough_ to run Unix. Developers who had the money to spare certainly did prefer an expensive Unix workstation. So it makes sense that large (for that time) microcomputer software vendors often used Unix workstations to develop software that would later run on cheaper microcomputer OSes. Microsoft famously used its own version of Unix (Xenix) as its main development platform during the 1980s.
This shows the enterprise made a great contribution to popularizing Unix. Back in the 1980s and 1990s there were a few disgruntled users[1] who saw the competition dying before their eyes and had to switch to the dominant Unix monoculture (if by "monoculture" you mean a nation going through a 100-sided, 20-front post-apocalyptic civil war). But nobody complained about having to ditch DOS and use an expensive Unix workstation, except, perhaps, for the fact that their choice of games to play got a lot slimmer.
This is all great and nice, but back in the 1990s most enterprise development moved back to Windows. Or maybe it's more precise to say that the industry grew larger and new developers were using Windows (with the occasional Windows command prompt), since it was cheap and good enough. Windows was very much entrenched in the enterprise, as was Unix, but their spheres of market dominance were different. There were two major battlegrounds where Windows was gaining traction (medium-sized servers and workstations). Eventually Windows almost entirely lost the servers but decisively won the workstations (only to lose half of them again to Apple later on). The interesting part is that Windows was slowly winning against the enterprise versions of Unix, but eventually lost to the open-source Linux.
Looking at this, I think the explanation that Unix won out over DOS/Windows CMD/PowerShell (or Mac OS 9, if we want to be criminally anachronistic) is waaaay too simplistic. Sure, Unix's enterprise dominance killed Lisp Machines and didn't leave any breathing space for BeOS, but that's not the claim. DOS was never a real competitor to Unix, and when it comes to newer versions of Windows, they were probably the dominant development platform for a while.
I think Unix won out over pure Windows-based flows (whether GUI-only or supplemented by the Windows command line and even PowerShell) because of these things:
1. It was the dominant OS (except for a short period where Windows servers managed to dominate a sizable chunk of the market), so you needed to know Unix if you wrote server-side code, and it was useful to run Unix locally.
2. Unix tools were generally more reliable. Back in the 1990s and 2000s, Windows did have some powerful GUI tools, but GUI tools suffer when it comes to reproducibility, knowledge transfer and productivity. It's a bit counterintuitive, but it's quite obvious if you think about it: having to locate some feature in a deeply nested menu or settings dialog and turn it on is more complex than just adding a command-line flag or setting an environment variable.
3. Unix tools are more composable. The story of small tools doing-one-thing-well and piping output is well known, but it's not just that. For instance, compare Apache httpd, which had a textual config file format, with IIS on Windows, which had a proprietary configuration database that often got corrupted. This meant that third-party tool integration, version control, automation and configuration review were all simpler on Apache httpd (see the sketch after this list). This is just one example, but it applies to the vast majority of Windows tools back then. Windows tools were islands built on shaky foundations, while Unix tools were reliable mountain fortresses. They were often rough around the edges, but they turned out to be better suited for the job.
4. Unix was always dominant in teaching computer science. Almost all universities taught Unix classes and very few taught Windows. Students were often writing their code on Windows and later uploading it to a Unix server to compile (and dealing with all those pesky line endings that were all wrong), but they did have to familiarize themselves with Unix.
I think all of these factors (and probably a couple of others) drove the popularization and standardization of Unix tools as the basis for software development in the late 2000s and early 2010s.
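To make points 2 and 3 a bit more concrete, here's a toy sketch in Go (the directive names and the simplistic line-based handling are made up for illustration, not real httpd parsing): with a textual config, toggling a setting is a few lines of scriptable, diffable text manipulation, which is exactly what a deeply nested GUI dialog or a binary configuration database takes away.

```go
// Toy illustration: flipping a directive in a plain-text, httpd-style
// config is a scriptable, version-controllable edit.
package main

import (
	"fmt"
	"strings"
)

// enableDirective sets "directive value" in a textual config,
// replacing an existing line or appending a new one.
func enableDirective(config, directive, value string) string {
	lines := strings.Split(config, "\n")
	found := false
	for i, line := range lines {
		if strings.HasPrefix(strings.TrimSpace(line), directive+" ") {
			lines[i] = directive + " " + value
			found = true
		}
	}
	if !found {
		lines = append(lines, directive+" "+value)
	}
	return strings.Join(lines, "\n")
}

func main() {
	cfg := "Listen 8080\nKeepAlive Off"
	// The result is plain text again: easy to diff, review and automate.
	fmt.Println(enableDirective(cfg, "KeepAlive", "On"))
}
```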
I agree that files solve some rudimentary cases, but they do not even allow simple conflict resolution. E.g. compressed files, including container formats like OpenOffice's (text files in a ZIP archive IIRC): it might be simple to apply changes from two sides if they touch distant parts of the document, but syncing full files simply barfs.
Note that this does not even need two users: I hit this problem myself with a desktop, a laptop and a self-hosted NextCloud.
In general, imagine a filesystem that stored both the raw data (to fail over to) and a per-format event log, and maybe even app-specific events. For instance, when a PNG changes, we could have the change recorded as raw bytes, as a generic bitmap operation like "modify pixels at x,y to ...", and as an app-specific log entry like "gimp: apply sharpen filter on polygon area ...".
This would allow the other side to attempt the smartest sync it can: if it has a compatible version of gimp, it could decide to apply the filter; otherwise fall back to the raw pixel changes if there are no conflicts; and finally fall back to full-file-contents reconciliation.
Just like MIME handlers get registered today, if filesystems provided such change logs, sync tools could register per-format handlers and offer very advanced syncing with this support from the filesystem.
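A minimal sketch of what such a layered change record might look like, in Go (every type, field and handler name here is hypothetical; no existing filesystem exposes this API):

```go
// Hypothetical sketch: one edit to a file stored as a stack of fallback
// representations, from most specific (app-level) to most generic (raw bytes).
package main

import "fmt"

// ChangeLayer is one representation of the same edit.
type ChangeLayer struct {
	Kind    string // e.g. "app:gimp", "generic:bitmap", "raw:bytes"
	Payload []byte // serialized operation or raw byte diff
}

// Change records an edit as ordered fallback layers.
type Change struct {
	Path   string
	Layers []ChangeLayer // app-specific first, raw bytes last
}

// Apply runs the first layer this peer has a registered handler for,
// mirroring how MIME handlers are looked up today.
func (c Change) Apply(handlers map[string]func([]byte) error) error {
	for _, l := range c.Layers {
		if h, ok := handlers[l.Kind]; ok {
			return h(l.Payload)
		}
	}
	return fmt.Errorf("no handler for any layer of %s", c.Path)
}

func main() {
	ch := Change{
		Path: "photo.png",
		Layers: []ChangeLayer{
			{Kind: "app:gimp", Payload: []byte("apply sharpen filter on polygon area ...")},
			{Kind: "generic:bitmap", Payload: []byte("modify pixels at x,y to ...")},
			{Kind: "raw:bytes", Payload: []byte("full byte diff ...")},
		},
	}
	// This peer lacks gimp, so it falls back to the generic bitmap operation.
	handlers := map[string]func([]byte) error{
		"generic:bitmap": func(p []byte) error {
			fmt.Printf("applying: %s\n", p)
			return nil
		},
	}
	_ = ch.Apply(handlers)
}
```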
Uhh, I see that I really made a very poor word choice when I wrote “‘real’ racism”. My bad, words are hard.
Given that you’ve chosen “fake” as the antonym, I think I can see where we differ, because in my mind I would’ve picked something like “less intentional”. I’m not even sure I can imagine what fake racism could possibly be like: racism is an idea, and I don’t really get what fakeness means for ideas.
I will try to remember and be more careful with the word “real” in the future. Appreciate your comments.
And, yes, that was real racism, of course - in the way you’ve used the expression.
I'm happy to be convinced, but so far this isn't really helping the point. What you've described applies to an extraordinarily small percentage of games. I'm looking at my Steam library of ~170 games. I see ~8 that feature real brand names: 7 are shooters that contain gun brands - which have never cared about these things, given they already appear in the context of people killing each other in the first place - and the other one is Football Manager (an offline game).
> I can already hear people thinking: but, most games don’t have any third party intellectual property. But that’s less true than you think, even fantasy games will inevitably wind up copying something from our world that is not completely generic.
Then please give us some proper examples we can learn from.
> Rockstar for example will almost assuredly have issues with using the shapes of famous buildings and licensing issues if they make their radio stations too easy to pirate.
GTA is hardly a "fantasy game", its entire schtick is getting as close to the real-world setting as possible, going as far as to parody real-life brands. They're quite unique in doing so, an extreme outlier.
Take a look at the current top 10 games on Steam by player count. You'll see that indeed the only real-world brands featured are potentially gun brands, and none of them have things like famous buildings. DOTA, Apex Legends, Stardew Valley, Rust, Palworld, Elden Ring and a bunch of idlers and shooters (CS2, PUBG, Delta Force).
I think this is what struck me as well. Hearing what I can only describe as radical anti-capitalism coming from George Hotz was not what I expected when I opened that link.
That said, I have felt the same feelings expressed by Hotz in this post. I commend him for saying it.
Also, part of the issue is the death of private servers. Game publishers have chosen to revert to centralized servers, rather than allowing private servers. Thus they have also taken on the additional cost of running those servers. Older games can be easily played on private servers to this day, as the community of any moderately popular game will almost always step up to provide the service. Even games you might not expect would be that popular or games that never had private servers - for example, Rock Band 3 only ever supported connecting to Harmonix servers in an official capacity. This support is also discontinued (they still operate the Rock Central servers, but only for Rock Band 4). Yet right now, thanks to reverse engineering, there is a fan-operated server that you can connect to with a slightly modified game. You can even download the fan-created server software (written in Go) and stand up your own server for your friends or for whatever other reason (maybe you want to run a small tournament and use a private GoCentral server to record statistics and have a private leaderboard).
I've always been a bit flummoxed we haven't expanded a ton on this given how long it's been.
I'm not sure if it's wrong or right, and I'm not smart enough to posit much, other than that it -feels- wrong. But it wouldn't take a ton to convince me otherwise.
You'd think by now we'd have more supporting evidence of such a concept.
I keep saying this will be the second coming of how netbooks went down, but people are too busy praising Valve for translating Windows APIs instead of actually building a gaming ecosystem on Linux technologies.
The same kind of technologies that power the Android NDK and, to a lesser extent, PlayStation's Orbis OS, with its FreeBSD roots.
AAA game studios don't see a reason to support Linux directly, and if Valve is willing to do that for free, even better.
No title bars or menus... now you can't tell what application a Window belongs to.
I hate when applications stuff other controls (like browser tabs) into the title bar --- leaving you with no place to grab and move the window.
The irony is that we had title bars when monitors were only 640x480, yet now that they have multiplied many times in resolution and become much bigger, UIs somehow use the excuse of "saving space" to remove title bars while introducing even more useless whitespace.
What you consider fictional and what anyone else considers fictional don't have to align. The point was clear when interpreted in good faith. I understand you disagree, but that doesn't change what I think about the nature of persona.
There are subreddits dedicated to these kinds of topics, for example /r/AmerExit, /r/movingtojapan and /r/expats (that I know of), with a wealth of info. I am sure there are dedicated Facebook groups too. You will most likely find people and orgs that can help with migration in such targeted groups. As I understand it, most desirable countries have seen a multi-fold increase in inquiries and applications from the US.
We should not underestimate the timeless human response to being manipulated: disengagement.
This isn't theoretical; it's happening right now. The boom in digital detoxes, the dumbphone revival among young people, the shift from public feeds to private DMs, and the "Do Not Disturb" generation are all symptoms of the same thing. People are feeling the manipulation and are choosing to opt out, one notification at a time.
A backend can be part of the functionality though, such as for real-time collaboration and syncing. But you can have ownership and longevity guarantees for both the data and the service as long as you can eject [1] from the cloud and switch to self-hosting or back at any time, which is what we do for our notes/tasks IDE.
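A rough sketch of the indirection that makes such ejecting possible (the interface and names below are invented for illustration, not the actual product's API): if the app codes against a sync interface, moving between the vendor cloud and a self-hosted server is a change of endpoint, not a rewrite.

```go
// Hypothetical sketch: the app depends only on a SyncBackend interface,
// so the vendor cloud and a self-hosted server are interchangeable.
package main

import "fmt"

type SyncBackend interface {
	Push(doc string, data []byte) error
	Pull(doc string) ([]byte, error)
}

// HTTPBackend would sync against any server speaking the same protocol.
type HTTPBackend struct{ BaseURL string }

func (b HTTPBackend) Push(doc string, data []byte) error {
	// A real implementation would issue the HTTP request here.
	fmt.Printf("PUT %s/%s (%d bytes)\n", b.BaseURL, doc, len(data))
	return nil
}

func (b HTTPBackend) Pull(doc string) ([]byte, error) {
	fmt.Printf("GET %s/%s\n", b.BaseURL, doc)
	return nil, nil
}

func main() {
	// Ejecting from the cloud is just swapping the endpoint.
	cloud := HTTPBackend{BaseURL: "https://cloud.example.com"}
	selfHosted := HTTPBackend{BaseURL: "http://localhost:8080"}
	for _, backend := range []SyncBackend{cloud, selfHosted} {
		_ = backend.Push("notes/today.md", []byte("hello"))
	}
}
```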