Of course he would. He was a regular hacker. If you've read Steve Jobs' biographies they all describe Woz wanting to keep things open and hackable and Jobs pushing through for the vertical integration they have now. Woz made the altruistic decision most of us would've "preferred" but Jobs made the decision to create the most valuable company in the world instead. Hard to say who was right, really.
In terms of providing value to Apple shareholders I think Steve Jobs clearly made the right decision.
In terms of providing value to the world at large, they should have gone with Wozniak.
Of course Apple has provided a lot of value to users and the world, but far less than if everything were hackable and repairable.
I guess my point is that I suspect, if you could add up the additional value of an 'open Apple' for every individual, you would end up with a considerably larger amount than the current market cap of Apple.
Let's do some math. Apple market cap is $2.4T right now.
Let's say a third of the world could use their products in some form, that's $2.4T divided by 2.7B people, or roughly $900 per potential user.
If I could extend the life of my iPhone by 4 years with repairs, I've already recouped "my share". And that's without mentioning any other product, or the added value of being able to customize and repurpose old stuff.
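For what it's worth, here's the same back-of-envelope arithmetic as a quick Python sketch (the inputs are just the round figures above, nothing authoritative):

```python
market_cap = 2.4e12      # Apple's market cap in USD (~$2.4T, the figure above)
potential_users = 2.7e9  # roughly a third of the world's population

share_per_user = market_cap / potential_users
print(f"${share_per_user:,.0f} per potential user")  # ~$889, i.e. roughly $900
```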
Of course some of Apple's achievements wouldn't have been possible without their business model, but I think there should be pressure on companies to open source technology after a certain period of time. It would be to the benefit of society.
I love how hypocritical these companies are. Environmentally friendly, they say:
- Force people to buy a part separately that could easily be included in the box (wasting less packaging in the process).
- Force people to replace unnecessary parts when repairing their products (my gf's MacBook screen broke internally without much force being applied, and Apple's repair solution is to replace the entire top end of the laptop for €600).
- Make it stupidly expensive for people to repair their products, and almost impossible for third-party shops to repair them.
And this isn't just Apple; most brands are going the same route. Meanwhile the world is sinking in garbage, but shareholders are happy, I guess.
> Force people to buy a part separately that could easily be included in the box (wasting less packaging in the process).
A part that 90% of people probably already own 10 of? Sounds like 10% more packaging and 90% less e-waste?
I mean I get the whole "non repairable is bad" argument but Apple stuff has great lifespan and good resale value, thereby being reused instead of junked when upgrading. And they accept devices for recycling, don't they?
Of course it could be better, but there are legitimate ways of looking at these things that aren't as unambiguously "Hypocritical" as you are asserting.
Commoditization is almost always a big compromise on form factor, which, given that laptops are by their very nature a compromise of power for form factor, isn't exactly desirable for everyone. Stacking PCBs, which is exactly what you're doing when you add slots and sockets to your motherboard, is absolutely a compromise on form factor.
If you don't like glued on or soldered on parts, then don't buy a computer that has them. There's still a ton of options that have the features that you want. Why be bothered when manufacturers are making things that other people want?
When I bought my 2012 macbook pro 15 retina, the retina screen's high dpr was non-negotiable, and there were no other comparable options without glue + solder.
That means that you made a choice of what matters most to you. In any buying decision unless you're manufacturing it exactly to your specifications (and even then), you're probably going to be compromising on something. In this case you prioritized the screen over anything else, which is perfectly valid.
It's also perfectly valid to pine for a macbook pro with a socketed CPU, dimm slots, and a standard nvme drive. That doesn't mean that things should be that way just because you want them to be, though, which is the point that you made.
My 2013 MBP was the same, and eventually crapped out (power/charging circuit on the main board). I couldn't upgrade to one with more RAM, due to some Apple like-for-like policy.
So I did vote with my wallet, and got a beefy ThinkPad running Linux. I wasn't that wedded to the Mac ecosystem, and this thing is a tank. If it needs upgrades or repairs, I can do them myself. So there are choices out there, if you're looking.
Are you sure they could? I can't imagine how you could possibly fit a connector within my great Macbook (and its thinness is a principal component of that greatness), and having multiple ways of doing one thing doesn't seem like good business or good for nature - especially if phones and tablets are the major product of the company.
Yes, they could. Look at the Dell XPS 13 9310. It has a battery that's not glued in (it uses screws), for which you can get a replacement for 70-150 bucks and change yourself. It has a trackpad and keyboard that are replaceable separately. And the SSD is slotted, so you can upgrade the drive or recover data when stuff happens and your motherboard dies. Meanwhile, it has a great screen and is very light (1.1kg) and thin. The battery life is on par with Intel Macs as well.
In 15" laptops you additionally now get a slot for a second hard drive and upgradeable RAM without sacrificing much in terms of thickness (look at the XPS 15 9500).
I had an XPS 15 before my current M1 Macbook, and before that I had the previous-gen XPS 13 for a very short while - I had to return that one; I couldn't work on it at all due to overheating/thermal throttling. XPS laptops are incomparably worse machines, and opting for one instead of a Macbook was my greatest mistake, one I'll never repeat (I also really tried to like Lenovo machines, but that ship has sailed too - I'm going Apple-only since this Dell experiment).
I never lose any stuff on my Mac thanks to the seamless iCloud integration. And the motherboard fire that happened to my XPS 15 destroyed the SSD anyway...
About the screen... don't even get me started. I was so angry when I first saw it - I paid big bucks for the best screen Dell offers, and yet it's so much worse than the screen on the cheapest Macbooks (I moved to the XPS from a 2015 Macbook - even that old machine has a much better screen). It was flickering when I enabled dark mode in my editor (not a faulty machine - I tried to return it and they showed me that every machine does it)!!!
And no, the battery life is nowhere near the 2015 Macbook Pro's, and absolutely nowhere near the M1 Macbook's. The ads say so, but it's totally untrue - my MBP 2015 can still sustain 6 hours of work on the original battery, while the XPS was dead after 6 hours of doing nothing.
And to top it all off, the whole XPS creaked even if I just put my hand on it, and lifting it into the air made sounds so terrible that people around me gave amused looks! None of my Macbooks ever made any sound like that - or any other unpleasant sound whatsoever.
Yeah, I kind of tuned the issues out, because for me the XPS is the only machine that combines slickness with repairability, and that's what I value. I have the skill to resolve some of the issues (I always do a wipe and a fresh install) and the rest I learn to work around over time.
I agree that the XPS is a terrible computer for a person who just wants stuff to work, but my point was that nothing prevents Apple from following the same repairability practices as Dell while having better QA. Having a removable SSD or a battery that's not glued in doesn't automatically make your laptop as bad as a Dell or as thick as a brick. If tomorrow these practices are written into law, Apple will still be producing very good machines, which will also be more maintainable.
> A part that 90% of people probably already own 10 of? Sounds like 10% more packaging and 90% less e-waste?
No, not everyone has those parts, since they purposely changed the way the part works. I mean, look at Apple: the included cable used to be USB Type-A -> Lightning and then it became USB Type-C -> Lightning.
The move also made no sense, since these chargers may not even outlast the person's device. At the end of the day you're causing more waste.
And yes, they do accept devices for recycling; what they do with them is unknown.
> the included cable used to be USB Type-A -> Lightning and then it became USB Type-C -> Lightning.
If you have a bunch of old USB-A chargers, chances are you also have a suitable cable. Really, it makes less sense to give people yet another USB-A cable, considering the Mac lineup is now all USB-C, so you don't have to buy a cable to charge from/connect to your Mac.
Everyone I know, myself included, already has plenty of USB-A and USB-C bricks lying around. The things last forever and build up. So it's more like every other company gives you a bit of junk you will not use.
Yes, they ask you if you need one at the time of purchase and if you don't ask for one you don't pay the $19 for one.
Trying to account for what the original BoM would have been with one in the box is an impossible effort and not very useful. At the end of the day all you need to do is decide if the value provided by the product matches the price the company sells it at. If not, don't buy it. For me the brick adds no extra value so I don't have to account for it.
If I needed the brick and the extra $19 pushed me over the edge to not worth it then I would consider another phone.
I really wish they had US support. They'll work if you can get one, but they don't ship to the US so you have to use a third party re-shipper (both for the phone and for replacement parts). Even then their frequency bands aren't optimized for the US so you're likely to have reception issues.
I don't think they care about the environment at all. It's plain marketing, like all the other LGBT causes they "support". It plays well with consumers, so obviously they act as if they supported those causes. The only thing companies want is profit. Is that wrong? No. But don't think they care about anything else.
At least the rest of us can be less hypocritical. If we have decided to call out companies for their lackluster climate action, then contributing to the e-waste disaster by selling black boxes should be on the list.
It's a bit disingenuous to lay blame on companies for hypocrisy, when they have to play by the system set up by politicians.
Capitalism doesn't give you any extra points for being sustainable. If anything, it can interfere with or be in direct opposition to making profits. You win at capitalism by making profits. A non-sustainable competitor will eat you if you start worrying about the environment too much. So greenwashing is the way to go for most companies, in order to avoid boycotts, and people are ready to believe it because these problems are complex and people don't want to think that the stuff they buy is hurting the environment. This is a very foreseeable result, given the incentives.
Don't hate the player, hate the game. Demand change from the politicians.
The efficacy of different kinds of action is highly dependent on the existing structures, of course. That's not to say that nothing can be done, it's just a matter of choosing the right tool for the job and gathering up people to join the cause.
You can vote, run for office, participate in demonstrations, strikes or any kind of direct action. The possibilities are endless. No single method guarantees success for a movement and nobody knows what will happen in advance, but in general the bigger the mass of people participating, the higher the likelihood of success. But sitting on your ass and blaming companies is guaranteed to fail, if meaningful change to the system is what you want.
No, that is a stupid saying and people need to stop using it. Unethical actions are unethical even if the "game" allows for it.
>> Demand change from the politicians.
No again. The solution to this problem is not some authoritarian government response, or even a socialist economic model (which is implied by your indirect blaming of capitalism for all the problems in the world).
>Capitalism doesn't give you any extra points for being sustainable.
Capitalism does not care about sustainability one way or the other; non-sustainable companies are NOT given a "competitive advantage" by capitalism.
In reality, it is current government regulations that provide non-sustainable companies with that advantage, in the form of liability shields and various other government programs written by big business, for big business, to ensure the status quo.
Your appeal to government authority is as misplaced as your blaming of capitalism for all the problems.
> Unethical actions are unethical even if the "game" allows for it.
Yeah, but nobody cares what you think is unethical. If the game allows unethical moves to be made, they will be made, because people play to win. And you can cry and complain all you want, but people strive to play optimally, and unless you change the rules of the game, people will continue to play in ways that upset you if it suits them to.
> non-sustainable companies are NOT given a "competitive advantage" by capitalism.
Yes they are. Or rather, companies that care one way or another are at a disadvantage relative to companies who will make the optimal choice independent of whether or not it is sustainable. If we want to encourage sustainability, we have to use legislation to re-align incentives such that sustainability is the optimal choice. Otherwise, corporations will continue to be unsustainable whenever it suits them.
> government programs written by big business for big business to ensure the status quo
xD
Dominant corporations don't need the government to help them stay on top. All they need is for the government to get out of the way. When you have money, you can use it to influence the market to make more money. That's how advertising works. That's how vertical integration and walled gardens like Apple's app store work. That's how mergers and corporate consolidation work. Money is power, and market share is power. The state is the only thing powerful enough to compete with corporations, which is why corporations spend so much money lobbying the government to de-regulate and back down.
>Unethical actions are unethical even if the "game" allows for it.
Nobody says what Apple is doing is ethical, at least I sure didn't. The point is, the problem runs deeper than one company. The system has an incentive structure where companies benefit by doing as Apple does. It's like blaming a ball for rolling down a hill. If you don't want the ball to roll, go play on a level field.
>Your appeal to government authority
You make this sound like I'm for some kind of dictatorship.
I firmly believe in a government democratically elected by the people. Even if the system is capitalist in nature, it should always be subservient to the will of the people. Governments should be tied to the will and interests of the people. Especially in the US, it seems the government acts for corporate special interests. In that case, the solution is more democracy, not less.
>non-sustainable companies are NOT given a "competitive advantage" by capitalism
So why are all the big companies ruining our climate then? What's the explanation? Random chance?
Democracy is two wolves and a lamb voting on what to have for dinner.
I firmly believe in individualism and individual rights; governments are instituted by people to guard individual rights, nothing more. A government's just power and authority comes from that defense of rights, not from majority rule.
If 51% agree that the other 49% should be enslaved, that does not make it ethical or right, but in your worldview that democratic government would be "tied to the will of the people".
No, government, like fire, is a useful tool but a dangerous master, and should never be left to the whims of the "majority".
>If 51% agree that the other 49% should be enslaved, that does not make it ethical or right, but in your worldview that democratic government would be "tied to the will of the people"
Of course democracy requires a constitution and a stable society to work. If there is no constitution that protects human rights and a majority thinks enslavement is okay, democracy isn't the right tool anymore. That's a description of a failed society at war with itself. I'm not suggesting that all problems can be solved by vote, just that it's much better to solve problems by vote than by bloodshed or by who has the most money, if you have that option. In a failed society, such an option does not exist. That doesn't mean that democracy doesn't work. It clearly does in several countries.
Anything else is one wolf deciding to eat two lambs. Also, if you enslave almost half the population you definitely don't have a democracy anymore -- unless your slaves are allowed and able to vote and participate in public discourse like anyone else, which likely would not quite be slavery anymore.
What a perfect analogy — wolves and lambs voting on what to have for dinner. In the absence of democratic governance, the wolves will just eat the lambs because nobody's stopping them.
The wolf here is Apple. I can't make my own phone. Neither can you. And most users are substantially less tech-savvy than us, to the point where all they can really do is configure the settings on the factory-installed OS. The technology we use is overwhelmingly under the control of tech giants; there is no viable phone OS other than iOS (Apple-controlled) and Android + Play Store (Google-controlled).
What do we do? Let ourselves get eaten? Or do we, the lambs (who are overwhelmingly in the majority in our society), exert democratic power to counterbalance the capitalist power that controls the tech in our lives?
> If 51% agree that the other 49% should be enslaved, that does not make it ethical or right
Right, which is why the constitution exists: to protect certain individual rights from state overreach. But it absolutely does not follow that, since it is desirable that the government be constrained in some ways, it is always better for the government to do less.
> everything Apple has been marketing spin for decades.
After using Android since around 2010, getting a midrange iPhone around 18 or so months ago was almost a revelation for me, so no, it is clearly not all marketing spin.
(Why? Even on a Note II or S7 Edge, something as trivial as opening the camera would have me waiting. On my iPhone XR, pressing the camera button brings up the camera more or less instantaneously. And there are also a number of small conveniences that are hard to really pinpoint, like it actually understanding when it is in my pocket and then not turning on and draining my battery.)
> On my ancient and overloaded S8, the camera loads in under a second after double tapping power.
Lucky you.
> These anecdotal "I switched to x and its waaay better" things always reek of bias.
Well, here I am. I don't think I touched an Apple product from 2012 to summer 2018 because I disliked OS X so intensely. So not exactly the biggest Apple fan.
> That a 2017 phone is slower than a 2018 phone is obvious - plus you'd need to reset the s7 to factory defaults for fair(er) comparison.
I'm talking about normal steady-state usage after a month or two. My iPhone is still smooth. My Androids were hardly ever smooth, even shortly after installation. YMMV. If it works for you, more power to you.
Edit: I know Android devices can be good. My Samsung S II was amazing for its time.
Your phone is the exception, not the norm. Apple is well known for choosing hardware that delivers a great user experience. Their choices may not cater to your specific requirements, but their sales figures strongly indicate that the majority of people disagree with you.
If you want to see how much value there is in Apple's phones, look at the used phone market. The competition isn't even close, and iPhones hold their value much better than the vast majority of Android phones.
Are you sure it's really the exception? I had an S6 that still functions really well with minimal battery degradation. I used it really heavily until I upgraded to an S9+, which is still going strong under pretty heavy use.
I've used Apple products too, but it sounds to me like the differences in quality are deeply exaggerated. I happen to like Android mostly because I have access to the filesystem and like to tinker with settings (and I like using my headphone jack).
As far as aftermarket value goes, I'm not convinced the used market is completely rational... Or rather, there are plenty of confounding factors that make it a poor argument for which phone is built better.
Yes, your phone is the exception. There are numerically more exceptions because of the sheer number of Android models out there, but a phone lasting longer than two or three years is the exception. You can see this in the OS share charts if you combine them with the knowledge that most Android phones don't receive more than one OS update. I've had several Android phones over the years and know several others who owned them as well, both in developed and developing markets. Android phones are cheap and suffer on two counts - software and hardware longevity.
The used market may not be totally rational, but there's a good case to be made for why Apple devices tend to hold their value better - they are often built better. You cannot simply dismiss the higher price of used Apple hardware as the market being irrational.
Even my coworker had been using an S6 until a couple of months ago, when he jumped to a Pixel. I've known multiple people who were using S5s until at least last year.
So my personal experience makes me doubt that my phones are some kind of exception. These things seem to be plenty durable enough to last several years. I think there's not as much difference in hardware and software as you'd like to think.
If you want to make definitive statements, then I have to ask for your data.
Some basic searching I've done unearthed a paper[1] in the Journal of Industrial Ecology[2] which concludes that economic lifespan (how long a phone is actually used, and thus depreciation rates) is only marginally affected by functional durability (including hardware and software quality). Instead, they suggest that lifespan is more affected by brand equity and related intangibles. People choose to use certain products longer regardless of whether other products have similar functional qualities.
Thinking further on what could cause intangible factors to have such a large impact on the secondary market and depreciation, I can't help but wonder if each brand is attracting different kinds of people with commensurately different attitudes towards their smartphones. That certainly could drive a difference in behavior, and could even be a self-reinforcing trend where the users more likely to retain their products longer are drawn to the brand with the users who are more likely to retain their products longer.
This would mean that the S5s and S6s I've been talking about aren't the exception to the trend. Their users are the exception. That's something I'd be happy to accept. There definitely is a difference in behavior between iPhone and Android users.
P.S. it's worth noting that the paper itself was seeking to determine if repairability would significantly increase the economic lifespan of smartphones. That's why they were looking at what factors caused people to use their phones for longer or shorter periods of time.
You can supplement this with the OS share stats of Android phones, combined with the safe assumption that the majority of Android phones are lucky to see one OS update. Consumentenbond in the Netherlands estimated an average life of around 2.5 years. Their source is "Android: beperkt houdbaar" ("Android: limited shelf life"), Digitaal Gids.
Add to this the resale value number.
I think you're making this more complicated than it needs to be. Android phones don't retain value as well because support is poor. There are few Samsung shops you can walk into to get OEM support, and Samsung is king of the Android hill. Same with Google, and their name is behind the OS. Apple, meanwhile, supports six-year-old phones with software updates and even replaces batteries inexpensively.
The point I was aiming for was that Android phones aren't any less durable than Apple's phones. They aren't used as long, and this establishes a feedback loop where Android OEMs must cut corners to maintain margins on devices, simply because there isn't a revenue stream once they're sold.
At the end of the day, your old Android phone working well is the exception. Apple knows that what sells is the appearance of speed. That is a major reason why their phones sell, not GHz or GB or megapixels.
> your old Android phone working well is the exception
[citation needed]
This isn't supported at all by the article you linked. It's your own hypothesis, and I simply can't find any basis for it.
The peer-reviewed article I linked explicitly shows that behavior (like how long a phone gets used for) is driven primarily by intangibles like brand equity, rather than hardware and software durability.
The fact that you're ignoring that and still pushing your hypothesis makes it feel like you're clinging to Apple for some reason. I've never understood this kind of brand loyalty. I've stated a couple of reasons for my decision to use Android, but I'm not attached to Samsung, and I'm even thinking of picking up one of Sony's new phones. Hell, I've used Apple phones in the past, and even tried a Windows phone for a while (that tile interface they had was excellent, BTW).
It’s been a minute since I used Android, but I definitely felt the same when I switched to the iPhone. I found it a much more refined experience over all.
That said, if there was a decent Linux phone, I’d hop on it, warts and all. Pinephone or Librem are getting close.
The Note II was released in 2012, and the S7 was released in 2016.
To be fair, transitioning from any phone around the S7 era to an iPhone XR bought in 2020 would probably give you the same feeling of revelation.
The longest I've held on to a phone was the iPhone 7 Plus, for ~4 years, but even after two and a half years it was starting to show its age. By the time I got rid of it, a charge would last me a little over half a day of moderate use.
Sorry, soft buttons. One on the lock screen and one on the home screen.
And yes, as far as I remember the Xperia was good; it just failed physically (I later realized it was my fault, as I used it as an alarm clock and ended up applying force to the charging cable each morning). They also lost me as a customer when they included Amazon ads in an OS upgrade.
I went the other way. I got an iPhone in 2016 and was shocked at how poor the quality of hardware, os, and software was. Admittedly opening the box and giving Apple my personal information was fun.
Maybe it was a bad time for Apple, but it was almost traumatic for me. They really did just use marketing to sell phones.
Which iPhone was that in 2016, and what were the quality issues with the hardware, OS, and software?
Consistently, Apple has been the leader in all of the above. Even now: superior chip, camera, battery life, pixel density... it's hard to find better.
I've used Pixels since they became a thing, and Apple iPhones. Other than quirky App Store bugs, the quality issues almost always occur on the Pixel phones ("OK Google" just stopping, gestures just stopping, a ringing phone not responding to touch, etc.).
Hardware might be related to software. The swiping from screen to screen was slow. I posted about this and people said to turn off animations. Annoyingly, this was not straightforward, and it didn't solve the problem. Not to mention that having a unique charger meant an extra device to pack on trips.
The OS annoyance was the relentless "type in your Apple ID password", and updates multiple times per week. A few users have spun the narrative that updates are good, but these were annoying and didn't have any user-facing benefits. The lack of widgets really sucked; it was regressive not to have my next alarm time on my home screen.
Finally, Apple Maps sucked and the podcast app was buggy - I'd hit play and nothing would happen. I'd hit play a few more times and nothing would happen. Then finally something would. I can't remember other software bugs; it's been years.
Sure, these might be fixed today, but I wonder what other things are bad today. I have ad blocking and a few other non-Play-Store apps on my phone; given the App Store, I'm not sure Apple would let such apps through.
> In terms of providing value to the world at large, they should have gone with Wozniak.
In theory yes, but it also could have tanked Apple resulting in Jobs/Woz going elsewhere and Apple never becoming what it's become, never pushing Mobile, Music, thinner / better quality laptops, etc.
I think in an ideal world we would have a balance between Jobs's vision and Woz's vision: the highly profitable company, but with the hackable, open world.
We don’t know what we would have had. I could imagine Apple having solved the aesthetics while also keeping it hackable. My guess is they’d have come up with a LEGO-like approach to internals, using crazy materials science to find some way to get the blocks thin and strong.
I can just as easily imagine Apple failing to remain an independent going concern through some of its trough periods in the inevitable cycles that companies go through. As a user and an engineer, I’m 100% Woz, but I think Woz without Jobs probably doesn’t give us the Apple of today (just as much as the converse is also true).
The move towards mobile, music, thinner seems an inevitable progression in this vein of technology - analogous to transistor miniaturization that occurs no matter who the players are.
The build quality is something we have to credit Apple for though, not to be taken for granted.
It does seem like there's a sweet spot in a balance towards Woz's vision of hackability.
> The move towards mobile, music, thinner seems an inevitable progression in this vein of technology - analogous to transistor miniaturization that occurs no matter who the players are.
The iPhone was the first of its kind. We had smartphones, we had touch screens, but Apple pushed the boundary. Everyone followed.
We had mp3 players, we had mp3 players that played video (I had a Creative Zen), but Apple pushed the boundary with the iPod's high capacity and ease of use. Everyone... Apple stole the market on this one.
We had laptops, decent looking ones, thin...ish... ones, etc, but Apple pushed the boundary with the Macbook Air. Everyone followed.
Apple is great at marketing, build quality, and user experience. (It doesn't matter who disagrees with the last one; it's a fact when a non-tech-savvy person can pick up an iPhone and use it, but struggles with Android.)
I don't think Apple is much of an 'inventor' company, but they do take existing things and make them better or push them in ways others can't or didn't think were possible.
----
In terms of the laptop market, I feel Lenovo is the /only/ company that's close to right to repair. Pretty much all of their laptops can be opened up, and you can swap RAM and SSDs. On their much older laptops, back around the T440, you could even replace the CPU.
But if any part of the laptop breaks, you can order any part.
I bought a Lenovo Legion 5 Pro for my wife, but in Singapore you can't get the keyboard with Traditional Chinese; you can only get it in Taiwan or Hong Kong...
Navigate to the Lenovo website, find the part number, contact the local distributor in Singapore; they ordered the part for me and I'm just waiting for it to arrive.
Bought a 4K screen and think, damn, I wish I had the 1080p screen instead...
> The iPhone was the first of its kind. We had smartphones, we had touch screens, but Apple pushed the boundary. Everyone followed.
I agree that Apple took the lead, pushed the boundary, and that everyone else followed. But I think it was an inevitable evolution in the tech. In the absence of Apple, another company would have carried the torch, maybe 6 months later, maybe a couple of years later. But it was going to happen regardless.
> In terms of providing value to the world at large, they should have gone with Wozniak.
As a personality, I like Wozniak more. If I ever had a chance to meet Jobs and Wozniak in person, I suspect that I would find Wozniak to be a much better person than Jobs. That being said, I think that Jobs provided much more value to the world.
Wozniak's chief contribution to the world is the Apple II. It is a wonderful computer with fun stories behind its development, but the computer industry would have gone on without it. Apple's early years are culturally significant since it was one of the few success stories that wasn't corporate (in contrast to the Commodore PET and Tandy TRS-80), but that story is probably the most significant part about the company.
Contrast that with Jobs. At a minimum, the Apple II and Macintosh can be attributed to him. Without his drive, the Apple II would likely be remembered as one of the multitude of personal computers that didn't make it in the marketplace. Without his drive, the GUI as a consumer product would have been set back years and would probably have looked very different. As expensive as the original Macintosh was, it was far less expensive than many of its contemporaries. As crude as the original Macintosh user interface was, it did provide a model for later products. Perhaps his antagonistic attitude towards user serviceability takes away from that, but it isn't all that different from how appliances were treated in the mid-1980s.
The catch is that in this hypothetical scenario, most people would have had to decide that they wanted a big clunky hackable computer for Apple to have been successful enough to have all this money... Judging by what people chose to buy, this doesn't seem to be the case. Personally, I buy Raspberry Pis for hacking and an iPhone for reliable, secure, and frictionless pocket internet. YMMV.
Arguments like these are used pretty often to defend the practices of corporations, and they imply that manufacturers would have to adapt their designs to make their devices repairable. That is false in the majority of cases.
Reasons many devices are not repairable include DRM built into parts to prevent replacements and the absence of repair manuals (see ThinkPads compared to MacBooks). Then there are measures to prevent people from flashing the firmware or updating the software once a company decides to drop support, despite the hardware still being functional, or easy to bring back up to speed by changing a battery.
Forcing companies to stop these anti-consumer strategies has zero impact on the designs while making electronics far more repairable.
I find the "right to repair will make my devices big and ugly just to appease the evil tinkerers and hackers" angle really dishonest.
I’m pro-repair (and a regular purchaser of 18-36 month old iPhones and 2-4 year old [often just off-lease] computers).
Even given that, I don’t see how “end users must be able to change batteries in a practical way” would not negatively impact the quality of the iPhone I’m holding in my hand right now.
From a water-resistance standpoint alone, I think my phone would suffer in at least one dimension that matters to me. (It’s an Xs Max and yes, I bought used, confident that the water resistance was intact. If I sold it now, the next buyer has that same assurance.) Phones get wet at some low (but not insignificant) rate; it’s hard to avoid that across the entire population.
The "right to repair" doesn't say anything about "ability to repair" or "easy to repair." What's at issue is people making modifications to firmware and then getting sued by the manufacturers. This is what's going on between John Deere and American Farmers, which was the original impetus of the "right to repair" movement. What "right to repair" is addressing is when you purchase a device then that device is yours - you can make any modification to it without fear of reprisal from the manufacturer. What "right to repair" does not mean is that you'll be able to repair your device, that it will be easy to repair, or that your warranty won't be voided if repairs are made by an unauthorized repairer.
I think the biggest issue here is there's not a colloquially accepted understanding of what Right to Repair is.
What you're describing, and I agree with you, is what many legal right to repair advocates are fighting for.
However you'll see many people conflate it with easy to repair and easy to get parts, or repair without voiding warranty. Which IMHO should be separate talking points, but alas, they're all jumbled up when these discussions happen.
It's also why I don't think right to repair will please most people. The scenarios you described are limited and outside what most people are thinking of, e.g. home repairs of cell phones.
The reason I doubt this is that Apple has gone to great lengths to make their phone as sleek as possible. How hard would it be to have batteries slide into the side of the phone, or to have a removable panel on the back? If any device company could find an innovative solution to this problem, it's Apple.
I don't have an issue with making certain design changes to make water resistance possible. Specialist repair services for these devices will always be in demand. I just don't want these companies to go out of their way to make things harder for me.
Oh yes, I completely agree with being able to work with the hardware and software without getting sued. I was thinking of it purely from the point of view of hardware design for repairability and adaptability. My point was that the original all-in-one Mac was competing with all kinds of weird and clunky computers when it came out, and it was successful because it streamlined the complexity of using a computer; despite the bad period in the late '90s, Apple has shown that people want this streamlined experience. I want this streamlined experience most of the time, but I would like to be able to unlock an advanced mode on my iPhone so I can hack on it. E.g., years ago I wanted to write some software for my phone to communicate with my laser measure over Bluetooth, and I couldn't, because there was no way to communicate with a non-approved Bluetooth device without signing up for some special hardware developer program, which I would never get access to. I was pissed off by this arbitrary limitation.
The problem is, "repairable" can have different meanings.
One is the availability of original spare parts like screens and backside covers (aka the stuff that breaks very often) for ordinary people and repair centers. A phone that's glued together and absolutely waterproof as a result can still be called "repairable" under that definition as long as there is non-discriminatory (aka external customers get charged the same as manufacturer repair centers) access to spare parts.
Another is the accessibility of repair without special tools (cough Pentalobe) or the need to discard a fully functional component to access a defective one, e.g. when the screen is to be replaced you need to remove the glued in back cover first, a step that risks permanently damaging it by bending or breaking it. Most waterproof designs have a really hard time here as it's hard to make a waterproof design that is still easily disassemblable and slim/optically pleasing.
The third definition of repairable is if an end-user can replace a common wear item on their own: batteries most obviously, but also outward-facing connectors for headsets and charging/USB.
The fourth definition of repairable is the firmware side - i.e., can people repair defects in the firmware, like security flaws, on their own after the manufacturer has ceased support, without risking a loss of functionality. Apple is the worst offender of them all, not allowing "rooting" at all, but in most of the Android world the situation isn't much better - root your device and you'll lose KnoxGuard/TrustZone functionality (sometimes breaking apps relying on it; half the apps on my rooted Samsung don't do fingerprint auth anymore since rooting), and SafetyNet attestation will also fail, leading to apps either stopping working entirely (banking apps, just f..k off, I know what I'm doing) or offering reduced functionality (Netflix).
And with all of the various definitions of repairability in mind, you will always have some trade-offs to make.
> Of course some of Apple's achievements wouldn't have been possible without their business model
It's better to be more fine grained here. Apple's highest profits wouldn't be possible without creating walled gardens. It's entirely possible, though, that their gains would diminish very little if they hadn't made the decision to glue everything and make their after-2012 computing devices non-upgradeable.
Screw Apple, but the only time I was glad my phone came with a cable was when the phone had a USB-C port. I have more than enough cables in place anywhere I need them. I also don't see the need for another fucking USB charger. Seriously, I have way too many already and don't know what to do with them other than throw them away. And the earphones that often come in the package are worse than useless.
I'd be very happy if phones and other devices didn't include already-ubiquitous extras. It creates more waste than the extra packaging for something you buy only once anyway. It's different if you buy a printer, because you need it to be plugged in and replugging cables behind the desk would be a hassle, but phones only use the cable for charging (sometimes data) and don't need it to be usable. So if you have a cable and a charger, they will work for any other phone just as well - just not at the same time if you have multiple phones.
If you have incompatible plugs, there are adapters too. At least for micro-USB to USB-C, which I use instead of buying more new cables. Dunno if you can do that with the connector iPhones use. If not, fuck them, but Apple is right that bundling too much crap is wasteful, even if they really just wanted to decrease their costs. Ironically, those two things actually mean the same thing! Or at least they should, if all externalities were factored in. So they definitely did the right thing, even if for the wrong reasons. But that's capitalism, and at least the incentive points in the right direction.
I understand it's important to get this out to the world, but at the same time, this is as surprising as, "The Internet turns out to be a popular way for computers to interconnect!"
Woz is the king of hackers, of course he supports right-to-repair.
Personally, I totally agree with Woz's sentiment on right to repair. Absolutely right.
However from a software POV I absolutely love the Apple ecosystem and the level of integration that comes with it. I love that I don't need to think of my phone as a computer that needs protecting or configuring extensively.
So it's hard to say. I can talk all day about how I agree with Woz in principle, but ultimately I really enjoy using macOS, and Steve Jobs's vision is basically why it ended up how it has.
Really? I think it is easy to point out which one was right. I guess for some, the mind programming is too strong and they cannot break free of it. Greed and profit are eating our planet's resources, and there is a huge number of people who are not the beneficiaries of those monsters yet are willing to protect the millionaires, just because they have been sold propaganda they cannot get out of their heads.
Jobs was right, though that's not what I believed at the time! Eliminating expansion slots from the Apple II spurred development in SCSI, FireWire, and ultimately USB. Now we're used to taking any kind of device, connecting it to a USB port, and having it simply work. Jobs's obsession with thinness has also eliminated several other wires: wireless networking is universal and most of us are aghast when we need to use an ethernet cable; the same goes for wireless printing, keyboards, and mice/trackpads, and more recently we've grown accustomed to wireless headphones. I think all of this would have happened eventually, but you have to admit Jobs drove it.
Your counterargument is flawed - nobody banned anything. These were the choices made by a private company to its own product line in order to better serve its customers and boost sales. Apple wasn't even the first company to choose this route. In 1984 when the Mac was released the most successful personal computer of the time was the Commodore 64 - which had no expansion slots, relying instead on a daisy-chained external serial bus for expansion. The computer makers of the time realized simplicity was the key to mass consumer adoption and thus increased sales. I wouldn't be surprised if Jobs pointed to the Commodore 64 when insisting expansion slots not be present in the Mac.
That was the dealer price, not the retail price. A C64 system with the monitor and floppy drive retailed for $1,000. It was the highest-selling computer at the time (technically, of all time) and yes, Jobs was obsessed with it. Jobs believed that taking the C64, with its serial expansion bus, and adding a mouse and a GUI would make a computer "for the rest of us." Also, the price you quoted is from after the release of the Mac, which had subdued C64 sales.
The Mac was released in January of 1984. These are prices from Christmas of 1984 - several months after the Mac was released. The design of the Mac began in 1982, when the C64 was retailing for $595 and was seriously affecting Apple II sales. What these catalog prices reveal is the impact the Mac had on C64 sales in a very short period of time, which forced Commodore to respond with the Amiga. But that's a story for another day.
Yeah, you just neglected to mention it was the arrival of the Mac that forced Commodore to slash their prices by over 66%. That "bottom feeder" you spoke of is the top-selling machine in history and was a major factor in Apple's pivot. Maybe it's just me, but I think those facts are important in understanding the whys and wherefores of how we got to where we are now.
Exactly. Plus, Apple wasn't exactly a market leader when USB became ubiquitous. I'd imagine the camera industry had a much larger impact on USB adoption and the associated data-rate increases.
The reason not every cynic can replace Steve Jobs is that Steve Jobs knew to force things only when they actually could be forced. It didn't always work out, but often enough.
Correct. Jobs didn't recommend eliminating expansion slots without having an alternative on hand. The same thing happened when they eliminated the floppy drive from the iMac in the '90s - they had an alternative on hand, the USB drive. They were new at the time and certainly not widespread, but their adoption by the iMac changed all that.
Woz also got into hacking on computers at an ideal time. Not long before then, it would have been impossible to build a sophisticated computer in your garage. In the years since, miniaturization has continued to progress, and many things that were once relatively simple to replace have become much harder.
When most parts on a computer were through-hole components and relatively large integrated circuits that could easily be hand soldered, the skill required to participate in the repair process was much lower. Today, even relatively open hardware like a desktop PC has a ton of added complexity that would make people far less likely to ever want to attempt a repair on their own. While I've reflowed solder on a faulty GPU in the oven before, that's not exactly a good idea. A person is usually going to replace the things that have been component-ized like the RAM, GPU, SSD, motherboard, etc. rather than try to actually repair them.
Granted, when a component like that breaks on a PC you can just pop it open and replace it, rather than having to find a specialist to repair a laptop or phone where all the parts are glued in or soldered. While I can sympathize with people that do want that (there are projects out there trying to bring products in this category to market), I'm personally okay with that being the niche that it is. Most people didn't hack their computers back then and to this day most people don't hack on their computers.
There probably should be components available for experts like Louis Rossmann who can replace these parts. There probably should be schematics available so they can more easily make these repairs, and there can be businesses like his that specialize in doing these sorts of repairs (they already exist, clearly). Companies like Apple absolutely should not be using any sort of DRM to prevent the use of third-party hardware components in repairs. Going out of your way to make your devices difficult to repair is unethical. But I think we're well past the point where someone without specialized skills should be able to expect to repair any device they purchase.
During a public Q&A back in 2017 I asked Woz what he thought about control Apple asserts on their "i-device" hardware. My question was to the effect of "What do you think about how I don't own my iPhone or iPad the way I owned my Apple II?" His answer referenced the openness of the Apple II but also included (not verbatim) "sometimes proprietary is the right thing" in reference to the Apple "app store" and the locked-down nature of the iOS device ecosystem.
To have Woz-- the hacker's hacker-- answer like that was really shocking to me. (I was really, really sad to hear one of my childhood heroes answer in that way. I know, I know-- don't meet your heroes...)
In this recording his position sounds a lot more reasonable. It makes me glad to have such a well-respected voice out there driving conversation about this topic.
As an aside: I recorded my question and Woz's answer (albeit via my phone in my breast pocket, so the audio quality isn't very good) back in 2017. This was from his October 30, 2017 visit to Miami University in Oxford, Ohio. I didn't find a recording available online with a quick search, so here's this:
I listened to your MP3, and thanks for sharing it. I think Woz's point is more about what the App Store enabled, not the fact that it was a walled garden. The rules Apple put in the App Store are what made it feasible for it to exist at all, and then you can create any app you want for the hardware. I think that tracks with his excitement over where the first Tesla Superchargers were - write apps for this hardware that make your life better, because they're important to you.
For what it's worth, when I think about the openness of the first computers I had, and how many crappy search bars and garbage things snuck past my unsuspecting parents while they used the computer, I'm surprisingly fine with the App Store too. I don't have to worry about them installing garbage, because it's gated by Apple. This is what made it "life changing" in Woz's words, and I agree with him.
I think even 5 years ago many hackers (especially the pragmatic ones) thought Apple might have been doing the right thing. Woz tends to be on the more pragmatic/accepting side, IMO; you have to be if you want to play with/create the newest things.
People bring this up but forget that a cellphone is designed for the lowest common denominator, i.e. the folks on HN are not the target audience.
There is a lot of benefit to Apple's curation of the App Store, and the locked-down nature of iOS brings a lot of security upsides. I have plenty of criticisms of Apple's scummy behaviour (e.g. antennagate), but the App Store is not one of them.
A little too late here. But better late than never.
The annoying part is how HN users want right to repair, but not enough to pick such options when buying. I hate to say it, but corporate marketing is stronger than the human brain.
I was just about to comment about this video that was in my feed.
Louis Rossmann publicly asking Steve Wozniak to back right to repair, because it's faster than finding someone in his social circle etc. to pass the message.
The DMCA is often used to take down service manuals. Try to find the service/repair manual for a Pentax K20D; you'll end up in the darkest corners of the internet.
But even without the DMCA, having unauthorized copies of copyrighted documents is still illegal, right?
The DMCA creates a liability shield for sites that accept uploads and creates a procedure for takedowns. But without the DMCA, the company's lawyers could still send a nasty letter to the web hosting site, and they'd be even more incentivized to prevent unreviewed uploads from users in the first place because they wouldn't have the liability shield.
Also, the takedown / liability shield parts of the DMCA are also only relevant when the people running the site are not the people who uploaded the content. If you want to host your own website and claim that you're republishing the manuals because of fair use, that part of the DMCA doesn't affect you at all.
(The DMCA does meaningfully interact with right-to-repair in its anti-circumvention provisions, though.)
I don't think the original poster is talking about the DMCA copyright takedown notice/counter-notice procedure. They're probably talking about the DMCA's anti-circumvention bits, which make distributing tools to circumvent copyright protections (even on hardware you own) illegal.
On 30 November 1999, the U.S. District Court in Seattle, Washington, dismissed Mackie's claims that Behringer had infringed on Mackie copyrights with its MX 8000 mixer, noting that circuit schematics are not covered by copyright laws.
Data has never been copyrightable. Copyright has only ever protected creative expression. Draw your own diagrams and write your own explanations and you're fine. (See iFixit, for instance.)
It's just like how you can't post a copy of The Matrix online but you can definitely write a plot summary without infringing anyone's copyright. Whether it makes sense that manuals (and firmware, for that matter) are considered as worthy of copyright protection as The Matrix is a separate question, but the law sees them the same way.
In the future, spare parts will no longer connect only mechanically but also digitally, requiring the part and the system to perform a cryptographic handshake. If you make an "unofficial" part and reverse-engineer the connection, the OEM could argue you are in violation of the DMCA's DRM clause. Think of printer ink, phone batteries, car tires - anything replaceable, really.
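To make the concern concrete, here's a minimal sketch of what such parts pairing could look like, assuming a simple HMAC challenge-response with a factory-provisioned secret (the names and the scheme itself are hypothetical; a real implementation would likely use certificates and a dedicated security chip):

```python
import hashlib
import hmac
import os

OEM_SECRET = b"factory-provisioned-key"  # hypothetical key burned into official parts only

def part_respond(challenge: bytes, part_key: bytes) -> bytes:
    # The spare part proves it knows the factory key by signing the host's challenge.
    return hmac.new(part_key, challenge, hashlib.sha256).digest()

def host_accepts(challenge: bytes, response: bytes) -> bool:
    # The device only enables the part if the response matches its own computation.
    expected = hmac.new(OEM_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)
print(host_accepts(challenge, part_respond(challenge, OEM_SECRET)))    # True: official part enabled
print(host_accepts(challenge, part_respond(challenge, b"clone-key")))  # False: third-party part rejected
```

The legal worry is that a third-party part maker has to recover or emulate that key to interoperate at all, which the OEM can then frame as circumvention.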
The current legal approaches are based around requiring companies to provide manuals and such. I am in favor of that, but I also have no doubt that companies will do their best to provide the bare minimum or even provide inaccurate information. You're relying on them to do the right thing and comply with the law.
Meanwhile, if someone manages to reverse engineer a product, there is no reason why they shouldn't be able to fix their own product or offer a service for others to fix their products. But currently that is illegal due to the DMCA, and it should not be.
Of course he would. He was the brains behind Apple, the guy who could design a computer out of his bedroom and seems to have kept that tinkerer mindset.
What about the right to repair software? Or the software that runs somewhere on the hardware to make it work? There is so much firmware running on our hardware in some closed off environment and it plays an integral role in making the hardware work. Sometimes it even emulates other hardware components to save costs. To the OS it appears like the real thing, but it's just implemented in firmware code.
Today, if there is a bug in your firmware you are pretty much hosed, even if there is nothing physically wrong with the hardware. Some manufacturers are better than others, but some are so bad that flashing anything but a signed firmware image will just produce a brick. Most devices that require firmware to be uploaded at runtime reject unsigned firmware. So again, if there is a bug in there and whoever made it doesn't exist anymore or doesn't care, there is basically nothing you can do.
For that reason I think that anything that requires firmware, or has a mechanism to update firmware, should be required to allow loading custom firmware somehow. Maybe make it a switch, or require blowing a fuse and voiding the warranty, but it should absolutely be possible.
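As a rough illustration of that proposal, here's a hypothetical boot-time check (all names invented; real secure boot uses asymmetric signatures verified against a public key in ROM, with HMAC standing in here just to keep the sketch self-contained):

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"  # hypothetical; real bootloaders hold a public key in ROM

def boot_allowed(image: bytes, signature: bytes, owner_fuse_blown: bool) -> bool:
    # Vendor-signed firmware always boots, exactly as today.
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    if hmac.compare_digest(expected, signature):
        return True
    # Unsigned/custom firmware boots only after the owner has deliberately
    # blown the override fuse: irreversible, voids the warranty, and could
    # also erase any DRM key material held by the device.
    return owner_fuse_blown
```

On a locked device, anything the vendor didn't sign is rejected; after the fuse is blown, the owner can load their own patched firmware.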
The problem is that this is fundamentally incompatible with the way DRM is implemented on a lot of hardware. For example, on Intel systems the motherboard firmware is involved in handling HDCP, and there is something similar going on with HD Audio. Same with the GuC/HuC firmware for the iGPU: it's involved in hardware decoding and in handling HDCP. There is no way that could be opened up without making the DRM scheme useless.
So how do you repair that vulnerability in your CPU when it's already out of support? If you could at least load unsigned microcode you could patch it yourself, but you can't, so your CPU remains unfixably broken even though it's physically fine.
Similar things could be argued for regular software, but in the case of hardware this goes, or should go, hand in hand with repairability. Otherwise you are quite limited in what kinds of things you can repair.
Woz seems like a good guy, IMO. Too bad the company he started is pretty much the worst offender in this case, but it is a good move by him, one that gives the industry a strong signal that they're doing something very unethical.
I think of it as right to existence and functioning for the device...
And right of access for the device-human configuration.
It's feasible, and not even very difficult, to make information on a website accessible to every browser since Mosaic, for example. And that's exactly what I do.
If your site has a "browser not good enough" message on it for some visitors, I think you should be just a little bit ashamed of yourself for giving up.
I think it is the difference between hack and craft.
It's not nearly as popular as you might think it is, IMO. Louis Rossmann probably won't get to do a direct ballot initiative like he wanted, because he'll be underfunded. Politicians don't want to take up this fight, because they don't feel there's enough backing from the public. The issue needs clear-cut support from the layperson, not just unanimous approval from the tech literate.
Cynical orange site commenter takes cheapest pot shot at nice guy agreeing with reasonable thing, to..... actually I don't know why you did that, but you do you, I guess.
Here, have a re-up. It seems weird that Woz is treated like a god yet kept at arm's length; you'd think everybody would beg him to be CEO of their company just for the street cred. Alas, even posing the question pisses people off (although I was thinking HN pushes new questions a little way upwards so that it doesn't become one endless argument about the first thing posted).
Because I have severe ADHD (to which I attribute the fact that I'm not nearly as awesome as Steve Jobs was, even though we have much in common otherwise) and little-to-no knowledge of Apple internals. If I had co-founded the company, I most probably would.
What I meant with my comment is that your question sounded like "Why didn't Albert Einstein become the president of the United States?", and I believe it's not a "good" question, just like "why doesn't water explode when you dissolve salt in it?".
I doubt Apple stakeholders would have wanted Woz. And I doubt Woz would have wanted to do it.
The same is incredibly likely for most people (including you and me).
I probably came across as rude, and I apologize about that. I hope expanding my thoughts as I did above helps.
> "why doesn't water explode when you dissolve salt in it?".
Because you have to melt the salt first.
> I doubt Apple stakeholders would have wanted Woz.
They didn't want Steve Jobs either, but it turned out they were wrong.
> I probably came across as rude
Not at all; you came across as original and fun. Boring and/or rude people just downvote silently; cool people say something interesting or provocative. You did. I ended up thinking about it in more detail, and although I'm not taking the idea of taking over something that big seriously, I feel inspired to fix some things and achieve more.