It’s impractical in that it doesn’t make much sense to use them from a “productivity” standpoint. I’m not going to code up my website on an original Macintosh. They only really make sense from a historical standpoint (but productivity in the form of making videos about them does exist).
Voyager 1 is still practical because it would take too long to get something better than it to where it is.
He could've just as well used a typewriter - literally just writing doesn't require a computer at all. Writers in particular don't have to care about words-per-minute or error correction, grammar, or style (there are editors for that anyway), and they don't do form letters or mail merge either.
There's no good metric for measuring "productivity" for writers with respect to the tools used.
> He could've just as well used a typewriter - literally just writing doesn't require a computer at all.
That's completely silly and strange.
One thing about retrocomputing that is hinted at in the article, and is genuinely interesting, is that people used to pay as much as you might pay for a car to get a computer that could, for example, run WordStar. For those who write for a living, the productivity boost over something like an electric typewriter was that big.
I'm not trying to be offensive, but I am guessing you've either never written much with an electric typewriter, such that you would be capable of making the comparison to writing (and more importantly - revising and editing!) with a word processor, or you've never written much period.
> I'm not trying to be offensive, but I am guessing you've either never written much with an electric typewriter, such that you would be capable of making the comparison to writing (and more importantly - revising and editing!) with a word processor, or you've never written much period.
Right back at you - in fact electronic typewriters came with a similar feature set to that of early word processing programs (of the 8-bit era): from monitor connections to integrated digital storage to the ability to run programs.
Modern electronic typewriters have error correction buffers, too.
Some authors (and that's who we're talking about here!) prefer typewriters for other reasons, too: seeing their work directly on paper and not having to worry about any kind of leaks (unless someone breaks into their house).
With authors in particular, what matters just as much as - if not more than - typing their work down is making notes and keeping track of story arcs, characters, world-building, etc.
A word processing program doesn't help with that; it requires either specialised software or a different workflow altogether, one that doesn't benefit from traditional word processing functionality anyway.
I actually used some of the hardware you must be referring to. I never found any of it to be very good at all, and I grew up with access to both 8-bit (and later, 16-bit and 32-bit) computers and a few electronic typewriters. (contrariwise, I do have fond memories of the IBM Selectric, as does anyone who learned to touch type on one, I'm sure)
But I am utterly interested in what typewriter model you'd nominate as equal to the DOS + WordStar/Word/Sprint + PC combo of its era! If it's hardware you used and liked for some reason, that'd be something to hear about... not a lot of people reminisce about that hardware. (because it was horrible... ahem)
You might find it baffling because some authors were early adopters of word processors and PCs in general, but others were not and some prefer to not use computers to this day.
I don't argue that there is no use for word processors in writing, what I'm (obviously very poorly) trying to communicate is that productivity (measured by what exactly in this context?) doesn't depend on using these tools.
Quentin Tarantino once told Reuters that he prefers to use pens and a notebook (the paper kind). Similarly with Joyce Carol Oates:
> I always sketch out material “by hand.” Why is this so unusual? Every writer has written “by hand” until relatively recent times. Writing is a consequence of thinking, planning, dreaming — this is the process that results in “writing,” rather than the way in which the writing is recorded. [1]
Creativity doesn't seem to suffer when not using a PC.
Neil Gaiman shares this sentiment [2], so even quality sci-fi doesn't require much tech.
Danielle Steel managed to write 179 books without using a computer [3] - quantity isn't it either.
George Clooney apparently has his writing partner do the typing as he himself prefers to write everything out by hand as well:
> I'm probably the least computer literate writer there is... Literally when I cut and paste, I cut pages and tape them together. [4]
So the superiority of word processing software simply doesn't materialise for everyone who's in the business of writing.
Writing is, after all, first and foremost a creative process, and everyone has a different approach to getting the most out of their creativity. The tools used are the least important part in determining whether that process is productive (i.e. successful) or not.
For some (including me, most of the time) technology is a welcome helper and improvement, while others do just fine without.
I see what you mean! I thought you were strictly arguing a point about word processor hardware.
It’s true enough that some fairly productive authors use longhand. I believe Neal Stephenson and Joe Haldeman both use fountain pens and good paper. I think Stephenson pointed out that the best paper money can buy will never be a ruinously expensive luxury for an author. People just cannot write that fast.
I think if I were to try this it would be to test myself, to see if I could avoid the habit of constant, on the fly revision... but I’ve never loved my own handwriting.
That's why I explicitly wrote electronic typewriter.
Electronic office typewriters did have interfaces for monitors and some came with integrated digital storage (e.g. disks) and had a similar feature set to that of early word processing programs.
I think a distinction needs to be made between electric typewriters (basically electrified mechanical typewriters to not tire out the fingers) and electronic typewriters (more advanced devices, usually with a buffer to correct typos before they are committed to paper, might have spell check or other intelligence, sometimes feature a full-blown built-in display). The latter are basically computers, but they are self-contained with regards to the peripherals and printing.
On the other hand, a DOS operating environment, while slightly more complex, is far easier to integrate into a modern workflow. The environment can be emulated (= easier to lug around a 4MB DOS disk image than a 20kg chunky suitcase, easy to use modern peripherals like 4K screens, KVM switches/Alt+Tab, and WiFi printers), data backup is very straightforward (vs a machine-specific process or paper photocopy), and the data is far easier to convert to modern formats for incorporation into modern workflows (US ASCII + Markdown can be turned to PDF, HTML, ODT, or kept as is).
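As a rough illustration of that last point, here is a minimal sketch (the file names and the use of the Python "markdown" package are assumptions of mine, not part of any particular workflow) of turning a plain US-ASCII Markdown file pulled out of a DOS disk image into HTML:

    # Minimal sketch: plain US-ASCII Markdown -> HTML on a modern machine.
    # Assumes the third-party "markdown" package (pip install markdown);
    # "chapter01.txt" and "chapter01.html" are hypothetical file names.
    import markdown

    with open("chapter01.txt", encoding="ascii") as src:
        text = src.read()

    html_body = markdown.markdown(text)  # Markdown source -> HTML fragment

    with open("chapter01.html", "w", encoding="utf-8") as dst:
        dst.write('<!doctype html>\n<meta charset="utf-8">\n' + html_body)

The same plain text can just as easily be fed to something like pandoc for PDF or ODT; that's the point of keeping it as US ASCII + Markdown in the first place.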
If I had to incur the complexity cost of a more advanced platform, a DOS editor would be preferable over an electronic typewriter.
Do you think, perhaps, if he switched to some modern novel writing software, with character and plot tracking, that he might actually finish the "Song of Ice and Fire" series in this lifetime?
>I’m not going to code up my website on an original Macintosh.
Yet people still make games, and hardware, for cold hard cash for vintage systems like the Commodore and Atari computer lines.
For example, https://atariage.com/store/ is full of newer titles, and in the various forums you can find all sorts of modern hardware and titles being discussed, and sold, for various vintage systems: https://atariage.com/forums/
Some of us find these machines very practical, and some are even still being used in commercial settings. At some point I recall one of the people being interviewed on ANTIC The Atari 8-bit Podcast ( https://ataripodcast.libsyn.com/ ) mentioning they had either recently stopped using, or were still using, an Atari computer in a commercial setting, because if it isn't broke, don't fix it.
I mean, people even still make peripherals for these vintage machines. I had a brand-spanking-new-from-the-factory joystick show up just this past week for my Atari machines from https://retroradionics.co.uk/
Sure, you might not create a flashy website for a client, do some machine learning, design the next app to IPO at a billion, or develop the next social media platform on one, but they absolutely still have a practical, functional use for many users.
> It’s impractical in that it doesn’t make much sense to use them from a “productivity” standpoint.
I think it depends on what you're trying to do. If you're creating pixel art for a game, working on a "retro" computer might be more practical. E.g., if the software you're using does everything you need, you get some nice benefits like fast startup time, often lower latency feedback, not having to worry about automatic updates rebooting your machine while you're working on something, having a machine that's much easier to understand/modify to your needs, etc.
It may not be practical to have a "retro" computer as your only computer, but for certain tasks I can see them being more practical than a modern computer. And it's honestly kinda sad that modern computers provide such a subpar experience in a lot of areas.
It's worth remembering that a large part of the reason that's impractical is that society shifts to expect the new thing. In turn, part of the reason society gets to expect the new thing much more instantly nowadays is that past new things started including “always the newest by default”, so it became an active choice to want to keep anything around or not instantly change your behavior to match. I'm not sure how much of this is ‘actual’ values versus power conflicts versus broken equilibria, though I know at least one influence was the rolling infosecpocalypse.
There's a post elsewhere on “The Amish, and Strategic Norms around Technology” which gives a not-bad short description of ways this isn't universal, and of its potential application to current-day (potentially America-centric, etc.) society, without going into too much depth: https://www.lesswrong.com/posts/36Dhz325MZNq3Cs6B/the-amish-...
They downplayed the actual amount of time that went into these changes and the upcoming changes. Here's the history:
Matz[1] released the first version of Ruby in Dec 1995.
DHH was a major player in getting Ruby into the global spotlight with Rails[2] in 2004. Rails got very popular as a framework for developing new applications, with Basecamp being a novel showcase that it could work well, and it introduced people to REST (in a flexible interpretation) as well as to ActiveRecord, whose ease of use and migrations became a model for modern web development.
Rails v3 divided the community, specifically around how and what Rails would support for the server and request-handling. This hinted at problems to come, but Rails was still strong, and many took it with a grain of salt and upgraded.
However, Twitter, which had been built on Rails, became popular, and the "fail whale" emerged as they were unable to handle all of the requests. This was not really a problem with scaling Rails itself, but with knowing how to scale Rails without much greater expense; since they had to rewrite things anyway and there was pressure to get scaling done right, they switched to Scala and Java, since Scala was functional and fast and there was a lot of support for the JVM. Functional programming had already been making a comeback in popularity in the 2000s, because it often required a lower memory footprint and was fast, and at that point in time many teams and developers were looking into it.
Though it wasn't the first time he'd done optimization work, in 2012 Matz released mruby[1][3], an embeddable, lightweight Ruby implementation.
Around the same time, with functional programming having become cool, Elixir was born and some of the Rails community left to write Ruby/Rails-ish code on the Erlang VM.
Some had been trying to slim down Rails in core, so that there would be less code needed to serve requests.
Tenderlove, who came from the systems programming side of things, joined the Rails core team with a focus on optimization, did work on Rack, and eventually started working to help speed up Ruby itself.
For years, Matz and others had focused on speeding up and slimming down Ruby. Ruby apps had run behind Lighttpd, and Ruby on Rails could run on it as well.
All of these things have been driving Ruby to get better, and now it is.
So, no, I don't think it's realistic to say they put just a year into it. At least 9+ calendar years led to this point, and it's been 26+ calendar years since the initial release. And this isn't the end of it. It's not trying to compete with or tank your favorite framework or language of choice; it's just been improving, and its team, as good as it already was, has been improving too.
P.S.- Ruby is not Rails. But not talking about the history of Rails in the scope of things would be remiss. I can't think of anything in the history of Ruby that has been bad, but certainly Rails has had its "fun". But right now, it's coming together. I also didn't mention Sinatra's influence on slimming things down, or Puppet, Chef, etc.'s contribution to the Ruby community, or Crystal, which has been a valiant effort at a compiled Ruby-like ecosystem. There is so much that happened leading to today that shaped where things are and where they are going. I'm totally psyched about this.
You missed the part of the history when Ruby could have been Swift, but eventually things went sour, the creator left Apple, and he ended up selling his work as a tool for mobile app development.
> Reality is hard. It's easier out here in europe, we've been lucky enough not to suffer too much from the awful american media diet.
I live in the U.S. and stay away from the media. I recommend it. I used to think it was important to stay abreast of all of the comings and goings, but that was B.S. It's a stressful waste of time for me. I don't really get into sports either.
> There's a lot of propaganda in your country. I sympathize with you, truly; it's difficult to tell what's real and what isn't when you've been gaslit for years.
What you say is true about it being difficult to tell what is real. But the only people I've spoken with that know what it's like in America are those that have lived here recently for a year or more, and preferably those that have moved around some. It's a large country, with different cultures; I don't even know them all. A lot of it is annoying (the parts of the country where people act really fake, and the parts where people seem nice but are passive-aggressive, isolating backstabbers), and a lot more of it is honest, loud, quiet, funny, and cool.
I sometimes act as if I understand the people of this or that country because I work with people from different countries each day at work, so I know I act the same way; I don't really know what it's like either.
Or, possibly, it stays the same without variation and the measurement was flawed. A recent study indicates our most accurate clock may not be infallible[1].
I had problems with an older (Intel) Mac Pro and using monitors on both HDMI and DisplayPort at once. That wasn't an Intel thing, and that sort of thing could happen with any hardware that isn't tested with every combination of other hardware.
I'm excited that the M1 is good, too - that it seems fast with everything, even the old Intel-specific code via Rosetta. I don't subscribe to Apple's NIH[1] philosophy, but what they have made historically, barring the problem I mentioned above and a few macOS and iOS issues over the years, has been awesome. The M1 seems to follow Jobs' philosophy of making the best product.
In the U.S., my experience with Whatsapp was that I created an account and never used it once to communicate with anyone, then I deleted it.
I've also withdrawn from social media.
The exception for now is HN, because it's more of a forum, even when bad information sometimes establishes itself as reality for a large conversation, like a big gathering of fans talking about their team that will inevitably fail to win, or perhaps a bad STD.
I learn what others are doing through direct and intentional communication, even if technology is used or if the information is second-hand. I don't text back or call back immediately, which my friends and family forgive, but it sometimes seems to hurt my relationships.
I still worry about dependence on large companies, big data companies gathering more information about me than I know myself, and the potential of out-of-control AIs. However, I attribute these worries in part to my own paranoid thinking, which draws on my memories of large-company layoffs, privacy concerns raised in the tech community, and mostly fiction.
While I've come to the realization that the act of trying to be happy and successful is the very thing that makes me unhappy, and I just need to exist, maybe becoming better at whatever I'm naturally good at, while being here and now with those I'm with, giving my service to them... I still keep wasting time replying about things that don't matter.
WA is not particularly good, it's just that I don't know anyone who doesn't use it (in the Netherlands), even when you want to contact helpdesks it is sometimes the preferred way. I mean, we have this in many streets: [0]
Without kids I could see myself getting away with not using WA, but with kids you are really setting yourself up for a very hard time (and prepare to be judged by other (annoyed) parents; your kid will feel the consequences at some point, as the kids will miss out on critical and fun information).
WA has almost become what email used to be. Except that it's a controlled platform and we are locked into a single provider, a provider that once promised a focus on privacy and an ad-free app, forever...
yep, here in the UK everyone I know uses whatsapp. Some people have telegram as well, but WA is the baseline. The only SMS texts I get are marketing and automatic notifications.
It's more reliable than SMS - I often used to not receive texts people sent me, which caused all kinds of misunderstandings. I ended up doing experiments with friends sitting beside me just to prove my point. The same thing happened to family members.
I'm not sure what the problem was, but WhatsApp solved it.
I don't actually use SMS, but I don't think most people get delivery/read confirmations. The little check-mark system in WA is a big step forward compared to plain texting. Of course, similar features exist in other chat applications, but if the comparison is just between WA and SMS, that's a big difference.
At one point I had unlimited data (2011-ish?) for 5 eur/month and a text was 20 euro cents per 160 chars or so... So I guess providers wanted SMS to disappear here.
It's strange to feel the emptiness I associate with a robot leading a prayer while still imagining it eventually becoming a truly spiritual experience for more and more people.
A materialistic interpretation may be that spirituality is nothing but a personal physical experience.
However, a non-materialistic interpretation could be that the robot is considered to be an extension of another person that created it.
But then again, I can perceive the robot as being created by an advanced AI, where the AI learned about spirituality and created the robot in a creative act of manufacturing not influenced by any former teacher, much like monkeys eventually randomly typing out the entire text of Hamlet[1].
However, the random act of unspiritual creation of a robot spiritual leader by AI doesn't prove that the robot entity is unspiritual. The spirituality of the robot could, in theory, have been created ex nihilo or from some pre-existing form that was or was not part of the material world.
So there seem to be three logical options:
1. The robot may not be spiritual, only material.
2. The robot may be spiritual, but only in its material form.
3. The robot, whether it exists or not in the material world, could be linked to a pre-existing or post-existing spiritual form, neither defining the robot itself as having a spirit nor as not having one, but also not negating that it could.
The third, I think, may be closest to the Buddhist interpretation, and leaves most options open metaphysically.
In that, we arrive where we started. We know more than when we started, such that we're enlightened, but at the same time we may know no more for certain than when we considered the robot.
My understanding of Buddhism is that there is one spirit and the world is false.
The notion of self and all feelings, etc. are false, since the only thing that exists at all for real is the spirit.
Only in absence of the spirit is there not the spirit, but the spirit exists (everywhere, though there are no locations, because location is of the world). The realization of the spirit is peace, but in the false world it is the total opposite, perceived as suffering and detachment from the world.
All feelings and realizations (both of the false world, associated with the path to the spirit) that bring peace even with detachment from the world are seen as part of the path.
As part of the false world's absence of spirit, the path may seem to be wandering and involve suffering. Even those that are on the path to knowing the spirit may feel suffering.
Christianity's origin was a belief that the one spirit (God) was in Jesus, who said that spirit is within all of us, and that recognizing that (through faith) and realizing that the world is false was the path, just as Buddhists believe. Jesus's death on the cross was a final attempt to bring home that attachment to the false world is death, and that even those who understand may greatly suffer.