We are doing that, but another issue is that you may not even be aware these "phantom apps" exist, if the cloner puts in a little extra effort.
The new Android License Server should help a bit. You embed your public key inside the app, and at runtime it checks against your private key on the License Server. If someone merely changes the resources to claim the app as his own, users running it would still be checked against your private key and fail as non-paying users. Of course, if the code containing the public key is reverse-engineered and replaced with a new public key, you're still screwed. Obfuscation would help in that case.
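The underlying mechanism can be sketched in plain Java (this is an illustration of the signature check, not the actual Android LVL API; the class and method names are mine). The server signs the license response with the private key; the app ships only the public key, so editing resources can't forge a valid response:

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;

public class LicenseCheckSketch {
    // Server side: sign the license response with the private key.
    static byte[] sign(PrivateKey key, byte[] data) throws Exception {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initSign(key);
        s.update(data);
        return s.sign();
    }

    // Client side: the app embeds only the public key and verifies.
    static boolean verify(PublicKey key, byte[] data, byte[] sig) throws Exception {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initVerify(key);
        s.update(data);
        return s.verify(sig);
    }

    public static void main(String[] args) throws Exception {
        KeyPair pair = KeyPairGenerator.getInstance("RSA").generateKeyPair();
        byte[] response = "LICENSED|com.example.app|nonce=12345"
                .getBytes(StandardCharsets.UTF_8);
        byte[] sig = sign(pair.getPrivate(), response);
        System.out.println("genuine:  " + verify(pair.getPublic(), response, sig));

        // A cloner who only edits resources can't produce a valid signature
        // for a response naming his own package.
        byte[] tampered = "LICENSED|com.cloner.app|nonce=12345"
                .getBytes(StandardCharsets.UTF_8);
        System.out.println("tampered: " + verify(pair.getPublic(), tampered, sig));
    }
}
```

This is also exactly where the weakness sits: replace the embedded public key with your own and the check passes again, which is why obfuscating that code matters.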
Yes, this is possible, and it was on our "solution list" (not necessarily checking against Google's server; it could be our own server). But it's not much harder (if harder at all) to rip out the authentication code than it is to change a string name in the app...
One way to raise the bar is to download critical pieces of code from a trusted code server at runtime. The client first authenticates with the License Server using its public key and gets back an authorization token that expires over time. It submits the token to download the code from the code server, which checks the token against the License Server.
To raise the bar even higher, make the downloaded code itself expire, or embed the public key in it so it re-authenticates once downloaded.
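The token half of that handshake could look something like this (a minimal sketch, assuming the License Server and the code server share an HMAC secret; the token format and names are invented for illustration):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class DownloadToken {
    static byte[] hmac(byte[] secret, String msg) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret, "HmacSHA256"));
        return mac.doFinal(msg.getBytes(StandardCharsets.UTF_8));
    }

    // License Server: issue "appId|expiresAtMillis|mac" after the client
    // authenticates with its public key.
    static String issue(byte[] secret, String appId, long expiresAt) throws Exception {
        String payload = appId + "|" + expiresAt;
        return payload + "|" + Base64.getEncoder().encodeToString(hmac(secret, payload));
    }

    // Code server: reject expired or forged tokens before serving any code.
    static boolean valid(byte[] secret, String token, long now) throws Exception {
        String[] parts = token.split("\\|");
        if (parts.length != 3) return false;
        long expiresAt = Long.parseLong(parts[1]);
        if (now > expiresAt) return false;
        String expected = Base64.getEncoder()
                .encodeToString(hmac(secret, parts[0] + "|" + parts[1]));
        // Constant-time comparison to avoid leaking the MAC byte by byte.
        return MessageDigest.isEqual(
                expected.getBytes(StandardCharsets.UTF_8),
                parts[2].getBytes(StandardCharsets.UTF_8));
    }
}
```

Because the token expires, a captured one is only useful for a short window, and tampering with the app ID invalidates the MAC.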
Of course, a persistent cracker can still run a sniffer to capture the runtime code, reverse-engineer it, remove all the checks, and stitch the code back together. It just makes the job harder.
Even if you obfuscate your code, the resources can still be decoded from the application package and are thus subject to manipulation. It's often enough to modify a few strings in the XML files to steal an app and rebrand it as your own.
There's always some way to do it, but in general Java itself is pretty easy to reverse-engineer (although the DVM has a different bytecode format than "real" Java).
Another problem with obfuscation is that the time spent protecting the app could have been spent improving it. I don't think iOS devs spend much time obfuscating their code...
I was amazed how easily Java bytecode can be 'decompiled' back to source. At work we once lost the source for some changes, but still had the binary. I decompiled it and with the exception of about 2 comments (and the change we were looking for), the source was identical to the last version in source control.
It's always made me wary of shipping anything that I'd consider 'secret sauce' (code that solves hard problems) in Java out of concern that our competitors could so easily see exactly how we do it. Similarly, I've been afraid of a competitor or even customer decompiling our stuff and finding either sloppy code or finding security issues.
Yes, I realize all of this is somewhat possible with C/C++ and other languages, and that there are obfuscators, but it is surprisingly easy with Java.
Java decompiling includes variable names (and comments)? That is terrifying. (Assuming you're sane and your variables aren't named a01, a02..a99, b01....)
I'm familiar with decompiling C, but in C, you're left guessing variable names. (Not that variable names 'secret-ify' the sauce at all, but it goes to time spent.) Bin-patching C programs is surprisingly 'easy' to learn these days if you're rather technically inclined.
I don't remember off the top of my head whether local variables within a function will have their names preserved (probably not), but all class variables and methods will keep their names unless you use an obfuscator.
I haven't played around with Java decompilers, but I have sat down with a hex editor before and played with Java class files, and I could see all the strings!
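You can reproduce that hex-editor experience in a few lines: class files keep every string literal, class name, and method name as plain UTF-8 in the constant pool. This sketch (names are mine) scans raw bytes for printable ASCII runs, the same thing a hex editor shows you for free:

```java
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public class ClassStrings {
    // Collect printable ASCII runs of at least minLen characters.
    static List<String> strings(byte[] bytes, int minLen) {
        List<String> out = new ArrayList<>();
        StringBuilder run = new StringBuilder();
        for (byte b : bytes) {
            if (b >= 0x20 && b < 0x7f) {
                run.append((char) b);
            } else {
                if (run.length() >= minLen) out.add(run.toString());
                run.setLength(0);
            }
        }
        if (run.length() >= minLen) out.add(run.toString());
        return out;
    }

    public static void main(String[] args) throws Exception {
        String secret = "my hardcoded API key"; // lands verbatim in the constant pool
        try (InputStream in = ClassStrings.class.getResourceAsStream("ClassStrings.class")) {
            byte[] bytes = in.readAllBytes();
            // Prints the literal above, plus every class and method name.
            for (String s : strings(bytes, 6)) System.out.println(s);
        }
    }
}
```

That's why hardcoded keys and URLs in Java apps are the first thing a cloner finds, decompiler or not.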
I've done manipulations of binary executables myself (on Windows). For release-level software, where debug info such as symbol names is nowhere to be found, all you can use is a disassembler and a hex editor. A slight mistake in changing the bits and bytes of the binary will just crash the software without telling you what the problem is.
I actually really admire those who can make sense of obfuscated native binary code without debug info... but decompiling Java code using off-the-shelf tools is totally different. The difference is like playing Mozart on a piano vs. ... on your iPod...
No, I'm saying there were only about 2 comments in the whole class to begin with, so those would obviously be lost. I don't remember having any issue with any variable names, even locals, so it's possible that I had to patch those up if there were any.
Absolutely, and that's what we as Android devs are hoping for. Or, if Google won't provide system-level support, maybe someone can provide a solution and even make a business out of it? Maybe that's YC '11 material~ ;)
Regarding decompiling Flash-to-iPhone apps, CS5's iPhone app packaging feature produces binary executables[1], not embedded SWFs - so they should not be vulnerable to decompiling.
"Is the Flash Player runtime bundled along with the application?
No. iPhone applications built with Flash Platform tools are compiled into standard, native iPhone executable packages and there is no runtime interpreter that could be used to run Flash byte-code within the application."
Google should be able to detect this by looking for very similar byte code. They could whitelist shared libraries, or rule them out with some other heuristic. It wouldn't need to be perfect, just good enough to make the human part of enforcement manageable.
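A simple version of "very similar byte code" is the Jaccard similarity of the k-grams two code sequences share; this is a crude stand-in (names and parameters are mine) for the kind of near-duplicate detection Google could run over dex files, not anything they're known to use:

```java
import java.util.HashSet;
import java.util.Set;

public class CloneScore {
    // Every window of k consecutive byte values, encoded as a string key.
    static Set<String> kgrams(byte[] code, int k) {
        Set<String> grams = new HashSet<>();
        for (int i = 0; i + k <= code.length; i++) {
            StringBuilder sb = new StringBuilder();
            for (int j = 0; j < k; j++) sb.append(code[i + j]).append(',');
            grams.add(sb.toString());
        }
        return grams;
    }

    // Jaccard similarity: |intersection| / |union| of the two k-gram sets.
    // 1.0 means identical code; near 1.0 flags a likely clone for human review.
    static double similarity(byte[] a, byte[] b, int k) {
        Set<String> ga = kgrams(a, k), gb = kgrams(b, k);
        Set<String> inter = new HashSet<>(ga);
        inter.retainAll(gb);
        Set<String> union = new HashSet<>(ga);
        union.addAll(gb);
        return union.isEmpty() ? 0.0 : (double) inter.size() / union.size();
    }
}
```

Shared libraries would score high against every app that bundles them, which is where the whitelist the parent mentions would come in.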
That could be one way for Google to handle it, but then they'd have to maintain a huge bytecode DB of all apps, which doesn't seem very efficient.
Providing obfuscation support in the build tools, or adding an extra security layer in the Android platform itself, seems more manageable, I guess...
This is the company that maintains a huge copy of the entire internet, and runs queries (including "is this site just a clone of another one?" type queries) on it, so on a technical level, they can probably do it. Of course, whether it's worth it to them to do so is another story.
That's definitely true; technically it could be done, similar to the plagiarism-detection software used in academia to measure similarity between publications. But I still think tweaking the client side would be a more cost-efficient solution... though I agree, they're Google; they always have spare servers and CPU power to throw at it :)
Altering desktop apps seems like it wouldn't have as much impact, since those aren't marketed through integrated app stores. Attempts at selling counterfeit copies online seem to be the biggest problem there. They generally get removed from eBay, but have turned up on Craigslist occasionally.
Still, there are quite a few programs out there that use open-source code in violation of the GPL. In particular, there are quite a few video-conversion or DVD apps using ffmpeg improperly on both Windows and OS X. (Use on Linux is proper, and users get needed patches.) Aside from denying users the ability to make further changes with the relevant source, the apps are also generally poorly maintained. Even some with frequent GUI updates have failed to stay current with the core code. There have been several ffmpeg updates to address exploits using malformed videos, but users of many of these utilities are still vulnerable. The true open-source projects are maintained very well; it's the apps where people are out to make a quick buck with a GUI on some free code and ignore the license that are the offenders.
Projects such as VLC saw immediate updates.
The many utilities that hide their use of ffmpeg and related open-source generally don't get listed in the security alerts such as this.
http://www.securityfocus.com/bid/15743
Even pre-OS X Mac applications had many resources that were easily modified with ResEdit and other tools. It would have been more work to alter code with security features, but changing the about box, splash screens, icons, and menu/dialog text was trivial and didn't require programming skills. Of course, that sort of resource modularity was also what made internationalization and localization so easy.
I never did hear of any apps being hijacked, but those were comparatively innocent times.
Interesting question... but we are not exactly petitioning Google for anything here. We're just pointing the problem out; maybe they'll care enough about it to fix it. Otherwise we'll just have to live with it, or leave for another platform... I think we would have done the same if it were Apple or MSFT.
For all the downsides to iOS development, this is one area where an iOS developer is better off than with Android. Harder to have your code pirated or forked. (Not impossible, just harder.)