<<Tesla pulled a fast one on the morons at GM and Ford, who signed stupid deals that handed the future back to Tesla>>
It's always curious to see someone with such strong conviction that, without access to any of the information involved in negotiating this partnership, they somehow have an insight that the various folks at GM and Ford failed to see.
This comment is neither constructive criticism of him, nor does it contribute anything substantive to the discussion. Please review the site guidelines against name-calling: https://news.ycombinator.com/item?id=26687330
“The events that have transpired demonstrate that Jack Ma experienced a disconnect between the power, freedom, and relevance he believed himself to have had and the reality exerted by the Chinese government. This could be viewed as a lack of foresight, exceptional hubris, and inability to interpret ground truth on Ma’s part.”
I must add that OP asserts the Chinese authorities iniquitously and deliberately allowed Jack Ma to acquire power through the success of his enterprise, which was then struck down swiftly, as we are witnessing today. Thus, OP's insight is that the CCP employed an insidious bait-and-switch tactic, exemplified by the unfolding situation, in order to further their authoritarian interests. Jack Ma's inability to excogitate his own demise is why OP thinks that his mental state is abnormal.
Personally I believe it just took the CCP some time to figure out how to deal with these uppity billionaires. They couldn't use their normal playbook for dealing with political opposition, because he wasn't opposing the politics of the CCP, or for dealing with the powerless, because he wasn't powerless.
Now that they do, heads will roll quickly. The smarter ones will come voluntarily hat in hand and offer tributes. The dumber ones will disappear for a month, and then offer tributes.
This is by far the most important point in this discussion. According to this article (https://www.greencarreports.com/news/1107109_teslas-own-numb...), Tesla's statistics are deliberately wrong, and Tesla compared its Autopilot crash rate to the overall U.S. traffic fatality rate—which includes bicyclists, pedestrians, buses, and 18-wheelers.
You could argue that Tesla should be compared to luxury vehicles (which have more safety features), but even if you just compare their crash rate against U.S. drivers of cars and light trucks, the Tesla Autopilot driver fatality rate is almost four times higher than typical passenger vehicles...
<<Actually it is not what we're talking about, because they're not blind; they can use the usual means of inquiry, and the technology. How come there are still no firearm gunshot-sound detectors, or remote-controlled drones with tasers in major streets to react instantly, etc.? There are ways to act which don't strip citizens of their power, as information nowadays is power.>>
That's actually exactly what we are talking about. The police's job is to enforce the law, and as technology changes, they need to adapt, the same way firearms now have serial numbers, and they collect DNA samples / fingerprints (all of which have met similar resistance before they became a broadly adopted technology, and are also open to abuse).
It's totally fine to have a rational discussion of where, as a society, we want to draw the line, but let's please keep our own rhetoric in check. There are lots of scenarios (outside the high visibility of this case) where, with proper legal controls, enabling the police to access someone's phone is the equivalent of a traditional search warrant that allows the police to enter someone's premises and search for evidence.
The point is that the government is already not respecting the rule of law, with all the 'secret FISA court' stuff, nor does their foreign policy seem capable of making us safer. I am loath to give them more weapons against activists, politicians, journalists, and the discontented when they have greatly earned our distrust.
Even if "searching X is like searching Y" were true, which it rarely is, the issue is that the "proper legal controls" don't get executed, even in the traditional realm.
Once we have transparency and controls in place to make sure traditional laws are being used properly we can begin a careful deliberation on expanding those laws.
<<Was hoping Gates would be less naive in thinking this legal precedent has anything to do with this specific case which has close to 0% chance of providing any real-world information and more to do with the fact that this tragedy is used as political theatre and being exploited for maximum PR and political influence.>>
Regardless of the merit of the case, I think it's pretty inappropriate that you are calling BillG naive. You really think that somehow you have more insight into the situation than BillG, who has access to pretty much any resource and source of information?
> Regardless of the merit of the case, I think it's pretty inappropriate that you are calling BillG naive.
Actions speak louder than words, I don't care who you are. In this case his words are adding to the dangerous narrative the US Govt wants this debate to be framed on: exploiting a tragic case of terrorism to unlock the legal precedent with 175 other phones waiting in the wings:
If he's not naive, he's been actively complicit as part of the "Old Microsoft" (before the security of user data affected their global Azure business model), which was more than happy to provide whatever access it could to the NSA, which saw "Outlook.com encryption unlocked even before official launch" and "Skype worked to enable Prism collection of video calls":
To play the devil's advocate, LE can get a court order to search your house, your car, your file folder, tap your phone, and so on, and that's viewed as fair by most people, since there are some checks and balances: they have to convince a judge (we're not talking about the secret NSA court stuff), and have to stick to some rules when they do it.
Why should they not be able to search phones on a case by case basis, with a court order?
That's something reasonable people are going to ask.
Edit: This is a pretty good analysis of why turning over a tool to the FBI is a terrible idea: http://www.zdziarski.com/blog/?p=5645 - but the FBI is saying they don't want that. The guy in the blog disagrees.
Let's say they meant what they said and everything stays inside Apple. Why shouldn't a court be able to order a search?
> Let's say they meant what they said and everything stays inside Apple. Why shouldn't a court be able to order a search?
It should be more than clear by now that they're not asking for a single iPhone's data (the iPhone data in this case is useless). They're asking Apple to create a new version of iOS (i.e., one that doesn't yet exist) that weakens its own security protections to provide a backdoor allowing the FBI to hack into the iPhone themselves, which a) sets a legal precedent, b) allows them to keep going back to Apple to crack new phones, and c) gives them access to software with a back door they can study and reverse engineer.
The FBI have carefully chosen this case to go public on (specifically denying Apple's request to have the case sealed) precisely because, out of all its pending court orders to unlock iPhones, this is the one that stands the best chance of gaining the political and public influence necessary to set the legal precedent.
Once set, it will compel technology companies to create tools to weaken their own security, using their own resources against them, forcing them to include themselves as an adversary from whom they need to protect their customers' data. Not to mention that if Apple is forced to concede to the US Govt, it will be forced to concede to other governments as well. China has previously demanded a master key for all electronics sold in China, which it had to back down from due to political and public pressure; if Apple concedes to the US Govt, other governments will undoubtedly demand the same.
He says it's BS, but it's kind of his word against theirs. Presumably, if the FBI says one thing and then asks for another, Apple could get the court to put a stop to it.
Like others have mentioned, Apple already has a way to get into any phone, and so far has kept it safe.
<<The FBI is the opposite in every way, mostly because of budget constraints and the subsequent lack of training. I hope that this is a good learning opportunity for them and a chance for them to increase their training budget in this area.>>
Besides the legal precedents and other associated drama, I think this is one of Apple's major concerns, and one of the reasons they implemented the "we don't have the keys" approach to their encryption. If the FBI can always just call on Apple (or Google) to fix whatever mistakes they made, there is little motivation for training / getting better on this front, effectively making Apple the computer forensics arm of the government.
The request they submitted to Apple was clearly written by competent people. They knew exactly what they wanted to do and why, how Apple could help, and why only Apple could help.
I think part of that is Apple is/was actively working with the FBI to find alternative solutions. I would bet that the engineers described what would need to happen, i.e. the new OS. As is often the case, the Apple engineers probably documented alternative solutions. The FBI took that "solution" and ran with what they described. It's the "well Apple told us this is the only way to do this, but they won't do it for us" scenario.
> I would bet that the engineers described what would need to happen. ... The FBI took that "solution" and ran with what they described.
I think you absolutely nailed it!
For a high-profile investigation like this, Apple would have given the FBI access to the key developers in the security group. The developers are smart guys trying to be helpful. They are not thinking about Apple policy, or constitutional law, or the big picture of world liberty and privacy. They are tasked with finding the solution to a technical problem: How to get access to protected data.
What likely happened--exactly as you already suggested--is that the FBI asked the developers to explain how the security system could have been designed so as to permit easy government access in cases like this. The FBI was asking "hypothetically" of course. The developers happily gave a blueprint of how the system could have been designed.
The FBI now demands that blueprint be implemented.
Apple should have talked to the FBI through lawyers only.
> For a high-profile investigation like this, Apple would have given the FBI access to the key developers in the security group.
> Apple should have talked to the FBI through lawyers only.
You went from "would have" to "should have", turning your hypothesis into a certainty...
Why wouldn't the developers in the security group think about constitutional law? Have you ever seen an internet forum that talked about computer security regularly, yet did not talk about constitutional law regularly? If not, how would those developers have possibly avoided regular reminders about the 4th amendment?
They didn't have to avoid any reminders. They were most likely just asked "how" it could be done, not to do it. The law comes into effect now, where the FBI is trying to get the courts to order them to comply. Simply telling someone how to potentially do something illegal is not illegal itself, and really doesn't cross any boundaries in my opinion. A white hat hacker uses many of the same techniques that a black hat hacker uses, but in one instance it is legal and in the other it is illegal.
Well, that's the overhead of selling closed-source devices.
If you think about it, consulting vendors is probably a better use of taxpayer money than RE-ing every stupid crypto system on the market.
They contacted Apple, did their homework, and came up with specific and generally sane demands. They even went as far as suggesting performing the hacking at Apple's site to ensure that the insecure firmware doesn't leak outside.
BTW, this last part looks very much like a response to concerns voiced by Apple, which means that the official statements from both sides are just the tip of the iceberg.
Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware. Then the only thing they would need Apple for is signing it once it's complete. And if each user could sign their own firmware updates with a key based on their password or provide their own key then it's game over.
They've put themselves in a weird legal situation because they've made it so that they are the only ones who can actually write and sign the firmware the FBI is demanding. A judge would laugh them out of the courtroom if the FBI was technically capable of writing the firmware and demanded Apple's help because it was too hard.
> Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware. Then the only thing they would need Apple for is signing it once it's complete.
This is an example of a non-free software feature. Why are the keys baked in, with no way to disable them? And "write your own firmware" doesn't solve this problem -- they could just pay a developer $X an hour to do it. A better security model should have been used -- one where updates have to be confirmed (read: signed) by the user before they are applied, as in the sketch below.
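To illustrate what such a model could look like, here is a minimal sketch (my own illustration, not anything Apple ships; all names are hypothetical, and it assumes the Python cryptography package is installed): the device only applies an update if it carries a signature from a key the owner enrolled, so a vendor-signed image alone is never enough.

```python
# Sketch of a user-confirmed update model (hypothetical names throughout):
# the device refuses any firmware that is not signed by a key the *owner*
# enrolled, so neither the vendor nor a court order against the vendor can
# push an image the owner never approved.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# At setup time the owner generates a key pair; the public half becomes part
# of the device's update policy, the private half stays with the owner.
owner_key = Ed25519PrivateKey.generate()
enrolled_public_key = owner_key.public_key()

def apply_update(firmware_image: bytes, owner_signature: bytes) -> bool:
    """Apply the update only if the owner's signature over the image verifies."""
    try:
        enrolled_public_key.verify(owner_signature, firmware_image)
    except InvalidSignature:
        return False  # refuse vendor-only or tampered images
    # ... flash firmware_image here ...
    return True

# The owner explicitly approves an update by signing the exact image bytes.
image = b"example firmware payload"
assert apply_update(image, owner_key.sign(image))
assert not apply_update(b"unapproved image", owner_key.sign(image))
```

The point of the sketch is only the trust relationship: the vendor can still co-sign for authenticity, but without the owner's signature the update never applies.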
> Spivak:
> Sure it does. If all the hardware and software associated with iPhones was open-source Apple could tell the FBI to fuck off and write their own firmware.
No, not based on the interpretation of the All Writs Act that the FBI is attempting to use. As far as the FBI is concerned, they could force my grandma to write a backdoor if they deemed her the best person to do so. Given that she can't answer the phone most days it'd be a long wait, but I wouldn't put it past them.
Poor choice of words; I meant the general "closedness" of the platform -- from undocumented design, through lack of source code, to centralized code signing.
The only reasonable way for law enforcement to deal with even a single one of those factors is to request help from the device vendor.
There is one thing the FBI is very good at, and that's writing a compelling narrative. It's possible that there are highly competent people who know everything, but it's also possible there are moderately but not dazzlingly competent people who are really good at writing a story that feels complete and keeps one from asking questions outside the narrative.
Though, on second thought, I have to add that we don't know how many back-and-forth mail exchanges happened before they were able to come up with the officially published request.
Maybe they were just competent at working around excuses from Apple.
Exactly. Yet another reason to fight the court order. We should expect the FBI to be competent, to be embarrassed when they aren't, and to fix the problem. It's not a good state of affairs when a company is more trusted to do forensics.
Software update signing keys, which can't be disabled by the end user. This is what most people would consider "a flawed security model". Even UEFI lets you change the trusted booting keys.
Please enlighten me. Is this not exactly what the FBI is asking for? For Apple to flash a custom version of iOS that doesn't have the software rate-limiting and auto-wipe, which only someone with Apple's private key can do. A four-digit PIN is only secure in combination with those features. Having Apple's code-signing key is in fact "having the keys", except in the most pedantic literal sense.
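To make that last point concrete, here is a toy sketch (my own illustration with made-up numbers, not Apple's actual key-derivation scheme): a four-digit passcode is only 10,000 possibilities, so once the rate-limiting and auto-wipe are stripped out of the firmware, exhausting the whole space is trivial.

```python
# Toy illustration of why a 4-digit PIN only holds up when retry limits and
# auto-wipe are enforced: without them, the keyspace is just 10,000 guesses.
import hashlib

def derive_key(pin: str, salt: bytes) -> bytes:
    # Stand-in for the device's real, hardware-entangled key derivation.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000)

salt = b"per-device-salt"
target = derive_key("4831", salt)  # the key that protects the data

# With the retry limits and auto-wipe removed, just try every PIN in order;
# this finishes in a few seconds of hashing.
for i in range(10_000):
    guess = f"{i:04d}"
    if derive_key(guess, salt) == target:
        print("PIN recovered:", guess)
        break
```

On a real device the derivation is slower and tied to the hardware, but the arithmetic is the same: the escalating delays and the 10-attempt wipe are what make those 10,000 guesses impractical, which is exactly what the requested firmware would remove.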
I think you are mixing up the law enforcement and intelligence community.
These requests are coming from the law enforcement side of the government, not the intelligence community. Given its mission, the intelligence community has no interest in having a public discussion around backdoors, since the last thing it wants is for the people it is targeting to be aware of any backdoors in their mobile devices.
The line between the two is gone today. The CIA now both collects information (spy drones) and acts on it (armed drones). The FBI both investigates crimes and acts as a go-between between federal and local agencies via fusion centers. Since 9/11 everyone is all about information sharing. That means law enforcement now give intel to the intel community and that the intel community does likewise. Even lines between military and civilian worlds are blurred as contractors bounce between the two. We really are slipping into a world where feds are feds and the letters on the hat no longer matter.
Regardless of the merit of the FBI's request and Apple's refusal, this is a rather hypocritical response from the FBI, since their insistence on making the request public is what triggered Apple's public response in the first place:
<<Apple had asked the F.B.I. to issue its application for the tool under seal. But the government made it public, prompting Mr. Cook to go into bunker mode to draft a response, according to people privy to the discussions, who spoke on condition of anonymity.>>
In this case, the Justice department is clearly the one with the bigger 'marketing strategy' agenda here by trying to market the merit of their request to the American public and congress.
(As a side note, most of the non-tech folks I talk to seem to be siding with the FBI's request, so at least at some level, the government seems to be winning the marketing battle.)
Consider the possibility that the government is using this all as a means to convince the public phones can be secure, and that apple's cooperation actually matters.
<<Haha, really? Isn't that how Uber started out, and the quote could well be verbatim from a taxi driver in 2011?>>
Yes, this is completely disingenuous... Not only did they start out like this in the US, it is also the ONLY way they were able to buy their way into China. When Uber started in China, their competitors had most of the Chinese local drivers, and it was Uber who started the subsidy war to gain market share.
Anybody with common sense. Do you really think that Apple, who prides themselves on having outstanding customer satisfaction, would deliberately try to brick their customers' phones through something this obvious?
Apple definitely looks out for its own interests, but they jump through a lot of hoops to protect their customers from bad experiences, especially since the obvious implementation would have been to just disable the fingerprint sensor if it can't be trusted.
Edit: additional info from the TC article: <<The update is not for users who update their iPhones over the air (OTA) via iCloud. If you update your phone that way, you should never have encountered Error 53 in the first place.>> Your conspiracy theory would really require that they brick phones through both the OTA and iTunes update.
>> they jump through a lot of hoops to protect their customers from bad experiences,
I would say Apple is one of the best, if not the best company for customer service, but I wouldn't go so far as to make a blanket statement like that.
There are a lot of 2011 Macbook Pro owners (me being one of them) who would disagree.
There were thousands of complaints in the Apple support forums, several articles in major Apple/Tech blogs, etc. It took well over a year for the issue to be acknowledged. In that regard, I'm kinda jealous how fast this Error 53 thing got resolved.
> Do you really think that Apple, who prides themselves on having outstanding customer satisfaction, would deliberately try to brick their customers' phones through something this obvious?
Yes I do think that they would attempt to discourage unauthorized repairs in such a way for less-than-noble reasons.
If you think it was a mistake, then can you explain why Apple wasn't bothered to do anything until someone ran an article on it and publicized it?
A thought experiment: suppose you are in charge of handling repairs for a multinational fleet of hundreds of millions of devices.
One thing you'll probably do is triage: by looking at the numbers of devices that fail in various ways, you can optimize your parts channels, training, processes, etc. in various ways. This is business 101.
Now try to guess how many people have been experiencing this error. My guess is that it is a pretty small percentage of several hundred million. I also guess that there are a number of other failure modes affecting similarly small groups of users. In a device as complex as the iPhone, with a population that large, there have to be.
But wait! Now the press is hammering you over one of those small-population failure modes. Everything else equal, you're an idiot if you don't handle that one first.
Of course, though, this is Apple. So the reasonable, simple explanation makes no sense, and instead Occam's Second Exception indicates that when Apple is involved, skullduggery and shenanigans are the only reasonable explanation.
Triaging by data is just the first step. Once you decide it's an actual problem you have to be able to reproduce it. To confirm this is happening, you have to get production phones, then do an out-of-process rework, then do this for different OS versions, OS upgrade methods, iTunes version, etc... Reworking this sensor is not an easy task so you have to have someone do it for you and get their time, etc... It's actually a pretty big project to do this correctly.
> Yes I do think that they would attempt to discourage unauthorized repairs in such a way for less-than-noble reasons.
Repairs are not really a revenue stream. Apple Care is a revenue stream but the incentive is to not repair. Since every repair logged against Apple Care is a cost, it doesn't make sense that Apple would want to do this themselves from a purely economic perspective.
I've been in HW all my working life. Any field return is expensive and resource intensive to handle and you cannot pass all those costs onto your customers. You do it to provide good service to your customers. You eat the repair cost as part of internal warranty cost which is built into the pricing of every unit sold.
What you are saying just doesn't make sense to me.
As a user I don't want any yahoo being able to replace my touch ID sensor. I have tons of sensitive information on my phone. I want that thing disabled if touch ID breaks or has been tampered with.
>If you think it was a mistake, then can you explain why Apple wasn't bothered to do anything until someone ran an article on it and publicized it?
How do you know that they didn't "bother to do anything" on this issue until someone ran an article?
Precisely. Repairs are a cost. Turning people away because of the pink dot, visible signs of third-party repair, or Error 53 avoids that cost and provides an opportunity for a new sale.
> If you think it was a mistake, then can you explain why Apple wasn't bothered to do anything until someone ran an article on it and publicized it?
That hardly takes too much imagination. One possible explanation is that higher-ups in Apple read the news, but not necessarily every single "the Apple Store won't replace my broken phone" complaint.
> That hardly takes too much imagination. One possible explanation is that higher-ups in Apple read the news, but not necessarily every single "the Apple Store won't replace my broken phone" complaint.
For a company that takes so much pride in supporting its customers, you'd think the stores would have been able to contact Apple internally to find out what the error even is before saying they won't fix it, right? Which would have led them to realize it was not meant to be running in production?
It seems entirely probable to me that most of the Apple Store incidents for this went along the lines of "I got this repaired at a repair shop and now it doesn't work!" "Unfortunately, we can't fix an issue caused by an unauthorized repair. Go back to them." I certainly don't think I'd have dug much further if I worked retail.
<<Yes I do think that they would attempt to discourage unauthorized repairs in such a way for less-than-noble reasons.>>
I have added some additional information to the parent comment (i.e., only updates via iTunes bricked phones, not OTA updates) that further undermines the theory that this was a deliberate change. I realize that you could still argue that the actual bug was that the OTA update did not brick the phones, but Occam's razor really starts applying...
<<If you think it was a mistake, then can you explain why Apple wasn't bothered to do anything until someone ran an article on it and publicized it?>>
I do agree with you that Apple has a long record of dragging their feet on issuing fixes for pretty significant bugs, so it is possible that the press coverage caused them to issue the patch faster.
> Do you really think that Apple, who prides themselves on having outstanding customer satisfaction, would deliberately try to brick their customers' phones through something this obvious?
I don't buy Apple products for this specific reason. My answer is that Apple would, and has done it. I am not a typical Apple customer; I love to hack and take things apart.
> "Warning: Apple has discovered that some of the unauthorized unlocking programs available on the Internet may cause irreparable damage to the iPhone's software," the message read. "If you have modified your iPhone's software, applying this software update may result in your iPhone becoming permanently inoperable."
I did not downvote you, but I think people might feel that you seem to let your personal disdain for Apple color your response regardless of the facts of the case vs. having a real discussion.
Even this point is a bit disingenuous - your quote from the article in no way supports your position that Apple would deliberately brick your phone - and in fact, there is a quote further down in the article you cited that further undermines this claim:
<<a user identified as ansuz07 said, "The percentage of iPhones that have become bricked from hacks is very low. Even those that experienced problems could be fixed by a simple restore. Apple is going to make it sound a lot worse than it actually is since they are the ones who don't want you to do it in the first place">>