How to write an iOS app purely in C (stackoverflow.com)
136 points by rubyn00bie on June 6, 2014 | 40 comments



Author of the answer in question here. Feel free to ask me any questions you'd like about this!

I also did the same thing in ARMv7 assembly, if you're interested in that:

https://github.com/richardjrossiii/iOSAppInAssembly
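For anyone who wants the flavor of the technique without reading the whole answer: Objective-C is implemented on top of a plain-C runtime, so everything you'd normally write with @interface and message syntax can be done through the C functions in <objc/runtime.h> and <objc/message.h>. A rough sketch of the core idea (simplified from the full answer, which also creates a view and manages memory; the function-pointer casts on objc_msgSend are required by the real ABI):

    // Building and launching a UIKit app from plain C via the
    // Objective-C runtime's C API. Compile against the iOS SDK,
    // linking UIKit and the objc runtime.
    #include <objc/runtime.h>
    #include <objc/message.h>

    // UIApplicationMain is an ordinary C function exported by UIKit;
    // declared here with id in place of NSString *.
    extern int UIApplicationMain(int argc, char *argv[],
                                 id principalClassName, id delegateClassName);

    // Our delegate method, registered below. (The real answer also
    // keeps a reference to the window; omitted here for brevity.)
    static void AppDidFinishLaunching(id self, SEL _cmd, id application) {
        id window = ((id (*)(id, SEL))objc_msgSend)(
            (id)objc_getClass("UIWindow"), sel_registerName("new"));
        ((void (*)(id, SEL))objc_msgSend)(
            window, sel_registerName("makeKeyAndVisible"));
    }

    int main(int argc, char *argv[]) {
        // Build the app-delegate class at runtime instead of with @interface.
        Class cls = objc_allocateClassPair(
            objc_getClass("NSObject"), "AppDelegate", 0);
        class_addMethod(cls,
            sel_registerName("applicationDidFinishLaunching:"),
            (IMP)AppDidFinishLaunching, "v@:@");
        objc_registerClassPair(cls);

        // UIKit instantiates the delegate by name.
        id name = ((id (*)(id, SEL, const char *))objc_msgSend)(
            (id)objc_getClass("NSString"),
            sel_registerName("stringWithUTF8String:"), "AppDelegate");
        return UIApplicationMain(argc, argv, nil, name);
    }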


Reminds me of writing Win32 applications in C + Asm, although this looks even more difficult because the documentation is lacking and there's more that has to be done. I find it somewhat sad that a simple "do nothing" native iOS app is much bigger than some very featureful apps on other platforms... I don't know of any 64k/4k demos on iOS either.

Great work for trying it, though. Are all those bytes actually needed? E.g. I see a lot of "restore VFP/Advanced SIMD registers" when you don't seem to be using those at all. It would be interesting to see how much you can cut out; currently it looks more like a mostly mechanical ObjC -> C -> Asm translation, the kind a compiler would produce rather than a human. As a point of reference, on Win32 an app that just shows a window with some text in it requires <1KB of binary.


The only reason I'm storing the VFP registers is that it's the 'official' iOS ABI. And yes, I could probably trim it down significantly; however, the binary itself currently sits at about 63k when compiled in debug mode, so I didn't deem it necessary to trim it further at the time.

Some good ideas though, and one day I may just go ahead and do it!


Well done. I'm curious whether this would be possible on Android. I'm not talking about OpenGL and using the NDK, but about creating Android activities. Would it be much, much more difficult?


"I'm not talking about OpenGL and using the ndk but creating Android activities"

I'm not really sure what you mean. Any program for Android that is 100% C code is going to be using the NDK in most cases, including ones that create an activity natively ( http://developer.android.com/reference/android/app/NativeAct... ).
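For concreteness, the C-side entry point the NDK gives you looks like this; a minimal sketch (hedged: the activity must still be declared as android.app.NativeActivity in AndroidManifest.xml, and real apps usually go through the android_native_app_glue helper rather than raw callbacks):

    // Pure-C Android activity entry point via the NDK's
    // <android/native_activity.h>. The framework loads your shared
    // library and looks up this exact symbol.
    #include <android/native_activity.h>
    #include <android/log.h>

    static void onStart(ANativeActivity *activity) {
        __android_log_print(ANDROID_LOG_INFO, "native", "onStart");
    }

    void ANativeActivity_onCreate(ANativeActivity *activity,
                                  void *savedState, size_t savedStateSize) {
        // Fill in whichever lifecycle callbacks you need.
        activity->callbacks->onStart = onStart;
    }

So no Java source is written, but unlike the iOS case there is still a Java class (android.app.NativeActivity) driving things under the hood.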


Seemingly, the Marmalade SDK creates Android apps without using the NDK...

Apparently what it does is compile valid ARM code against the bare ARM ABI (not even the Linux ABI), and then there is a component that does use Android OS facilities to load the code you compiled for that ABI into memory...

Funky stuff. And patented. (The Marmalade website makes it VERY CLEAR that their "secret sauce" is their loaders, which are also very much patented...)


Well I suppose the Android equivalent would be writing raw Dalvik bytecode. Certainly possible, but probably not a lot of fun. It wouldn't be nearly as informative about the underlying architecture in any case.


I'm not sure how JNI works on Android, but theoretically, yes. I'm not a huge Android fan, however, so I probably won't be the one to attempt it.


There's a category of questions on Stack Overflow like,

> How do I do (complicated task X) without using (class of solutions Y)?

It's like madlibs. I'll fill in the blanks for you:

> How do I write a portable function for searching the filesystem in pure C++?

Here, X is "search the filesystem portably" and Y is "just use Boost, dummy". (A sketch of why X is the hard part appears after these examples.)

> How do I create an iOS application without using Xcode?

> How do I do AES encryption in my network code? (instead of just using a library to handle TLS)
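(To make the first blank concrete: here's roughly where the "without Boost" answer starts, sketched in C in keeping with the thread's theme. It's hypothetical and POSIX-only, and the point is everything it doesn't handle: Windows, symlink loops, filesystems where d_type comes back DT_UNKNOWN, long paths, encodings.)

    // POSIX-only recursive directory walk. Looks nearly done;
    // "portable" is the part that takes the weeks.
    #include <dirent.h>
    #include <stdio.h>
    #include <string.h>

    static void walk(const char *path) {
        DIR *dir = opendir(path);
        if (!dir) return;
        struct dirent *entry;
        while ((entry = readdir(dir)) != NULL) {
            if (!strcmp(entry->d_name, ".") || !strcmp(entry->d_name, ".."))
                continue;
            char child[4096];
            snprintf(child, sizeof child, "%s/%s", path, entry->d_name);
            printf("%s\n", child);
            if (entry->d_type == DT_DIR)   // DT_UNKNOWN on some filesystems!
                walk(child);
        }
        closedir(dir);
    }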

Programmers develop highly specialized taste as they mature, with preferences like "C is good, Java is bad" or "everything should be done with the command line" or "I never want to leave Visual Studio again". I'm amazed by how many people say, for example, that they absolutely can't use Boost but when pressed, they can't give a good reason why. That's because Boost offends their subjective sensibilities for some ill-understood reason, not because Boost is flawed in any particular way (although Boost certainly has its flaws).

I have a ton of rep on Stack Overflow from answering questions, and so every time I answer a question like this, I remember all of the discussions I have had with people which go like this:

> Me: (complicated task X) is a complicated task. It will take you weeks to do it by yourself.

> Asker: But shouldn't it be easy?

> Me: (reasons 1, 2, 3 why it's not easy)

> Asker: But I don't care about 2 and 3, and I don't think 1 is a problem.

> Me: (list of ways that things go horribly wrong if you do it yourself) If I may ask, why can't you use libraries?

> Asker: I can't use libraries because (misconception about how libraries work).

> Me: Oh, that's not true at all.

The exchange typically takes place over the course of a couple of hours, since Stack Overflow is not designed to allow a question to be used as a chat.

So sometimes, if I'm feeling like a show-off, I post some ridiculously complicated way of solving the question just as it was asked, then say at the bottom:

> Or you can just do (easy solution Y), which is super easy, but you apparently can't do that for unspecified reasons.


There are a lot of good reasons not to use Boost. It has had version issues in the past, where code that compiles with one version does not compile with a later version. If you are creating a library, using Boost forces the users of your library to also depend on Boost... in particular, to depend on the SAME version of Boost as what you used, which could be a problem if they are already using a specific version.

In general, Boost is just a huge library. Using it may make your code unfamiliar to people who don't know the whole huge library, because there is a strong temptation to start using random parts of it. Additionally, the style of Boost doesn't always mesh well with the coding style you are using. This is why Google, for example, bans Boost except for a small number of approved functions.

When I'm on Stack Overflow, I always end up downvoting a lot of answers that don't answer the question being asked. "You are dumb for wanting to know the answer to this question," no matter how politely it's phrased, is not an answer.


> Me: (complicated task X) is a complicated task. It will take you weeks to do it by yourself.

It sounds like you're arguing against learning. When you're a beginner, one of the most frustrating things is when people won't answer your questions due to an air of superiority. I know you feel like you're guiding people onto the best path, but it can't be true that you're correct in every case. So it must be true that at least some of the time you're being deliberately evasive for incorrect reasons. In other words, you believe you're being helpful, but you're not.

Answer their question, choose not to answer, suggest an alternate way of doing something, but please don't go on for hours about why their original plan is a bad idea. It's often the quickest way to kill the fun and their will to learn.


You can learn a lot by writing your own code to solve problems solved by existing libraries. I always used to do these kinds of things when I was in high school, just because I was bored and wanted to code. Plus, there are many reasons you might need to write your own library to replicate the functionality of an existing one. A great example is when you can't use an available library because of its license (it's closed source and you're an open-source project, or it's open source and you're a closed-source project).


These are people who want to learn how something is actually done, to experience that sense of accomplishment themselves, and not just use a solution someone else has already written; and I don't think that's necessarily a bad thing.


>These are people who want to learn how something is actually done

While this is true sometimes, many times it is simply that they don't want to learn a new way to do something. You can tell which side of the fence they lean towards in the way they phrase the question.


Surely the popularity of his answer has something to do with Apple's unwillingness to open their platform to other languages? For example http://toucharcade.com/2009/06/20/full-commodore-64-emulator...


>Programmers develop highly specialized taste as they mature, with preferences like "C is good, Java is bad" or "everything should be done with the command line" or "I never want to leave Visual Studio again". I'm amazed by how many people say, for example, that they absolutely can't use Boost but when pressed, they can't give a good reason why. That's because Boost offends their subjective sensibilities for some ill-understood reason, not because Boost is flawed in any particular way (although Boost certainly has its flaws).

I've personally wrestled with this, and many times it's most certainly a case of premature optimization. There are times when I've bent over backwards to avoid calling `new` inside a Java loop because some project, some years ago, was running slowly, and when I profiled it, it came down to too many allocations in a loop. I try to hammer "benchmark & profile" into my head, however.


AND there's a perfect example of the grief StackOverflow heaps upon both questioners and answerers. 80% of the comments were "you idiot, just use ObjC" or "ok that worked but you're an idiot for not using ObjC". Exactly one response to the correct answer bothered to read it or expand on it.

Is all of StackOverflow overrun by smartasses? Or is it just the iOS crowd? What excuse can be given for all the unprofessional junk responses? How can they be suppressed? Moderators can be draconian but something is needed.


The author of the accepted answer was 15 at the time of writing.

At that age I had a good grasp of Turbo Pascal, but not much that would be of any use professionally.


StackOverflow: CURATION BY HUMILIATION.


Sometimes I start thinking I'm a really good programmer.

These kinds of things are great for keeping me humble as hell.


One day, nobody will remember Objective-C.

Let Apple have its 15 minutes of fame.


As someone recently noted in the discussion of Swift, Objective-C is basically older than the CD-ROM. I'd say the language has had several decades of life, not 15 minutes of fame.

(Actually, it appears Objective-C is indeed older than the CD-ROM and is from about the same era as the audio CD.)


Looks like Objective-C finally surpassed Fortran and Cobol somewhere in 2011. Good luck to you in the future ;-)

http://www.google.com/trends/explore#q=%22objective%20C%22%2...


And the CD-ROM wasn't a blip in history between the Compact Cassette and now? BTW, the CD-ROM is still way more popular than Objective-C.

http://www.google.com/trends/explore#q=cd%20rom%2C%20objecti...

But nice try anyway.


Funny thing, you saying "15 minutes," since Objective-C had existed for 15 years by the time Apple first used it, eighteen years ago.

As a multi-decade PHP developer, I would have guessed you'd be used to people deriding your language of choice and, based on that experience, developed enough common sense to not make a fool of yourself so publicly. I guess I expect too much of people.


I don't expect much wisdom from the youngsters here.

If you were even able to read back then, nobody had heard of Objective-C until your idol Steve touched it with his golden finger. If you remember, Apple was a sad joke of a failure from the 80s till the iPod launched. And I'm sure Objective-C was really integral to the success of the Apple IIe and the iPod.

Thanks for proving my point.


You're clearly ignorant of the history, and should stop commenting on it until you actually learn some.

In particular: Objective-C was created in 1983 and its first big success was when NeXT decided to adopt it as its main language for the NeXTstep operating system in 1988. And guess who was CEO of NeXT? That's right, none other than "your idol Steve".

So unless you're saying that nobody had heard of Objective-C until 1988, which is a pretty reasonable timeframe for a language invented in 1983, you're way off base.


Even if your assumption that Objective-C was only relevant after the release of iOS were correct, Objective-C would still have been relevant since 2007. That's 7 years.


Do you think I literally meant 15 minutes? 7 years is nothing.

Check this out. Objective-C wasn't even Apple's first choice.

"Apple CEO Gil Amelio started negotiations to buy Be Inc., but negotiations stalled when Be CEO Jean-Louis Gassée wanted $200 million; Apple was unwilling to offer any more than $125 million. Apple's board of directors decided NeXTSTEP was a better choice and purchased NeXT in 1996 for $429 million, bringing back Apple co-founder Steve Jobs." https://en.wikipedia.org/wiki/BeOS

So you're all using Objective-C because Apple couldn't afford to buy the OS they wanted. Kinda sad!


So Apple is willing to pay $125 million for Be, Gassée wants $200 million, Apple ends up buying NeXT for $429 million (at least $319 million in cash, according to the citations), and your conclusion is that Apple "couldn't afford to buy [Be]"?


NeXT was a reverse acquisition. The company known today as "Apple Inc." is essentially the result of NeXT being paid $400 million to acquire Apple.


> So you're all using Objective-C because Apple couldn't afford to buy the OS they wanted. Kinda sad!

In the early Mac OS X days, developers could choose between Objective-C and Java as the main Mac OS X language; the majority went for Objective-C, and the JavaBridge was dropped.


It's a language. Get over it.


The way I remember it is that it was the iMac that really started the Apple turnaround, in 1998.


@mikeash I don't need the history; I lived through that era. Did you even read the whole Wikipedia entry?

"The NeXT Computer was not a great commercial success" https://en.wikipedia.org/wiki/NeXT_Computer


It might not have been a "commercial success", but one has to appreciate the irony of you arguing against the impact of NeXT while using a medium that was invented/first developed on this platform:

http://en.m.wikipedia.org/wiki/File:First_Web_Server.jpg

Disclaimer: I am younger than any technology mentioned in this thread.


Wow I lost 20 hitpoints (and falling rapidly) over this opinion. Did I touch a nerve?

I can just imagine $AAPL shareholders quivering. "What is this guy talking about?"

Betting it all on Objective-C, an unknown language based on a superseded language. Steve was so ballsy. Google at least had a clue. Java + Android actually makes sense.


[deleted]


How about you use your real name and explain why you dislike my opinion?

It's a fact that people would love to write apps for the iPhone in other languages. But that's not possible, because Apple is a tyrannical company. Do you endorse this tyranny, and do you believe it's sustainable? I don't believe it's sustainable, and I believe history agrees with me, based on my 30 years of following programming languages.

Who is the child now?



Just want to point out that you're complaining about Apple being a tyrannical company, while at the same time you're praising Google for using Java.



