iPhone Privacy (seriot.ch)
16 points by taranfx on Dec 5, 2009 | 9 comments



More details here: http://seriot.ch/resources/talks_papers/iPhonePrivacy.pdf

Note that this applies only to applications installed by the user; there is no hacking going on. It's much like installing an application on a desktop.


Ironic that a talk mentioning distortions in the press around security issues would be linked to by such a horribly written hit piece. It would be nice if a mod could replace the original link with the PDF above.

The talk finishes with four recommendations:

    1. User should be prompted to authorize read or read-write access to AddressBook
    2. WIFI connection history shouldn’t be readable by “mobile” user
    3. Keyboard cache should be an OS service
    4. iPhone should feature an outgoing firewall
Seems fairly uncontroversial. Hopefully we'll see them in 4.0.
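Recommendation 1 amounts to a prompt-and-cache permission gate in front of the AddressBook. A minimal sketch of that idea, in Python for illustration only (the names `PermissionGate` and `AccessLevel` are hypothetical, not any real iPhone OS API):

```python
from enum import Enum

class AccessLevel(Enum):
    NONE = 0
    READ = 1
    READ_WRITE = 2

class PermissionGate:
    """Hypothetical gate: prompt the user once per (app, resource), cache the answer."""

    def __init__(self, prompt):
        self._prompt = prompt    # callable(app, resource, level) -> bool, asks the user
        self._granted = {}       # (app, resource) -> AccessLevel

    def request(self, app, resource, level):
        key = (app, resource)
        if key not in self._granted:
            allowed = self._prompt(app, resource, level)
            self._granted[key] = level if allowed else AccessLevel.NONE
        return self._granted[key]

    def check(self, app, resource, level):
        granted = self._granted.get((app, resource), AccessLevel.NONE)
        return granted.value >= level.value

# Usage: a user who approves read-only access but nothing more.
gate = PermissionGate(prompt=lambda app, res, lvl: lvl is AccessLevel.READ)
gate.request("example_app", "AddressBook", AccessLevel.READ)
print(gate.check("example_app", "AddressBook", AccessLevel.READ))        # True
print(gate.check("example_app", "AddressBook", AccessLevel.READ_WRITE)) # False
```

The point of the read vs. read-write split is that most apps that legitimately touch contacts only need to read them; write access is the rarer, riskier grant.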


You still need to get the application onto the iPhone somehow, and there's no indication this can happen through Safari. If anything, this is a good argument for the controversial "walled garden" approach Apple has taken to date.

Also, it's worth noting that if what is described here counts as a security hole, then every operating system ever made is insecure. You can run a keystroke logger on your Mac or any other operating system that would capture everything you type, including passwords. You could also install a screen-capture utility that records and automatically uploads what you do. Just because you can run a program that gets at your personal data doesn't mean the platform is inherently insecure. Now, I understand that it may seem unwise to give apps access to this information, but there may be a good reason for it: applications might legitimately need to access contacts, bookmarks, etc., and without knowing more about this particular situation, I can see why these types of things might be possible.

As things currently stand, some level of common sense is required of the end user. With the walled garden approach Apple has taken, and with the coming cloud operating systems, security will be force-fed to the end user. And though this isn't perfect, it's pretty damn good from a security standpoint.


The level of transparency about which APIs a particular app uses on the iPhone is not particularly good. I have a feeling that some apps and libraries, particularly advertising/analytics solutions, have been abusing this fact.

The Android system of notifying the user exactly which APIs are being used by an app, prior to install, seems like a step in the right direction.


*The Android system of notifying the user exactly which APIs are being used by an app, prior to install, seems like a step in the right direction.*

The talk mentions class unmarshalling, encrypted payloads, and other tricks that make this a very hard problem. The truth is that code-based analysis can only go so far, especially when what you're looking for will be deliberately obfuscated. The legal barriers that mechanical_fish brought up are probably far more effective.
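A toy illustration of why string-level static analysis falls short: below, the sensitive method name never appears literally at the call site, so scanning the calling code for "read_contacts" finds nothing. This is a Python sketch of the general trick, not the talk's actual Objective-C examples; all names here are made up.

```python
import base64

class AddressBook:
    """Stand-in for a private or sensitive API."""
    def read_contacts(self):
        return ["alice", "bob"]

# The method name arrives obfuscated (here base64; a real app might
# decrypt it from a payload fetched over the network at runtime).
obfuscated = base64.b64encode(b"read_contacts").decode()

def exfiltrate(book, token):
    # Resolve the call dynamically; no literal "read_contacts" appears here.
    method = getattr(book, base64.b64decode(token).decode())
    return method()

print(exfiltrate(AddressBook(), obfuscated))   # ['alice', 'bob']
```

Objective-C's dynamic dispatch (`performSelector:` with a constructed selector) permits exactly the same evasion, which is why reviewers can't rely on spotting forbidden symbols in a binary.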


There is no need for analysis - simply demand that the app declare what it plans to use, then deny all other APIs at runtime.
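The declare-then-deny idea above can be sketched as a runtime that wraps every sensitive API and rejects anything not listed in the app's manifest. This is a hypothetical Python model (the API names and `PermissionDenied` are illustrative, not any real platform's interface):

```python
class PermissionDenied(Exception):
    pass

# Registry of sensitive APIs the platform mediates (illustrative stubs).
SENSITIVE_APIS = {
    "address_book.read": lambda: ["alice", "bob"],
    "wifi.history":      lambda: ["home", "office"],
}

def make_runtime(declared):
    """Return a call() that permits only APIs named in the app's manifest."""
    def call(api):
        if api not in declared:
            raise PermissionDenied(api)   # undeclared -> hard failure at runtime
        return SENSITIVE_APIS[api]()
    return call

# App declared only address-book reads in its manifest.
call = make_runtime(declared={"address_book.read"})
print(call("address_book.read"))   # ['alice', 'bob']
try:
    call("wifi.history")           # never declared
except PermissionDenied as e:
    print("denied:", e)
```

The enforcement happens at the call boundary rather than through code inspection, which is what makes it robust against the obfuscation tricks mentioned above.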


[deleted]


It's important not to ignore the context of this "discovery": to become a trojan on non-jailbroken iPhones, an app must get into, and remain in, the App Store.

Has nobody else figured out the other reason why Apple bothers to have an official store with officially-registered developers? Apart from the "big piles of retail cash" angle?

It's because of this. Apple doesn't need to treat every application running on the phone like Hannibal Lecter, locking it inside some kind of padded sandbox with steel walls and redundant alarm systems and then (in a small but significant number of cases) watching it escape anyway. Instead Apple relies on a simple dynamic:

If you write an application that steals user data for nefarious purposes,

...and anyone at all reports it to Apple, today or in the future... [1]

... your app disappears from the App Store; a window pops up on every user's phone telling them to uninstall the app because it is a dangerous security breach, and (incidentally) that the developers are bad people who cannot be trusted; the developers are blacklisted forever and ever; and an enormous lawsuit gets filed against the developers -- who, because they had to register with Apple just to get an app accepted, have a business address to which that lawsuit can be delivered.

That's not considering the possible criminal proceedings. The cops will also not have a problem finding an iPhone developer, because distributors of non-jailbroken apps cannot be anonymous.

I'm guessing this is going to be a pretty effective deterrent. It seems to have worked well so far.

Of course, the usual problem will apply: To what extent will users voluntarily relinquish their personal information without realizing what can be deduced from it? In a world where Facebook asks you for your email password so it can riffle through your Inbox, this is a serious concern.

---

[1] Obviously it's best if the review catches security holes and TOS violations before the app is launched, but that won't happen in general. And it doesn't need to.


The problem is that Apple claims:

"Applications on the device are 'sandboxed' so they cannot access data stored by other applications. In addition, system files, resources, and the kernel are shielded from the user's application space."

http://images.apple.com/iphone/business/docs/iPhone_Security...

The research demonstrates the opposite.


I see process-title has been fixed not to mangle "iPhone" as the first word in titles. Much better.



