Swift Language Highlights: An Objective-C Developer’s Perspective (raywenderlich.com)
54 points by davidbarker on June 8, 2014 | 16 comments



On containers: while it's entirely possible to constrain the type, it's also possible to use AnyObject:

    var example = Dictionary<Int,AnyObject>()
I'll miss the duck typing, but having bought my house off C# work, I can work with this.
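
For example, a minimal sketch in current Swift syntax (today's compiler wants the explicit `as AnyObject` casts that the 2014 beta bridged implicitly):

    var example = Dictionary<Int, AnyObject>()
    example[1] = "a string" as AnyObject   // value types get boxed/bridged
    example[2] = 42 as AnyObject

    // Getting a value back out requires a conditional downcast.
    if let number = example[2] as? Int {
        print(number + 1)  // 43
    }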


The first time I saw this idea was in Rust, which calls its type Any. I haven't seen the need to use it yet, but this seems like a decent case.

What other languages have this? How is it often used? I've never heard of a similar type in, say, Haskell.


In Java, everything extends Object, so something like List<Object> will let you put anything you want into it. I should play with more languages; I didn't realize that was an uncommon feature.
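
A rough Swift analogue, as a sketch: Swift's `Any` covers value types too, whereas Java's `Object` only covers reference (boxed) types.

    var list: [Any] = []
    list.append(42)          // a value type
    list.append("hello")     // a String
    list.append([1, 2, 3])   // even another array

    // Elements come back out via downcasts.
    for element in list {
        if let n = element as? Int {
            print("found an Int: \(n)")
        }
    }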


Oh, duh, of course that's true. For some reason "the parent class of every class" is different than "a type explicitly for any type" in my brain.


>For some reason "the parent class of every class" is different than "a type explicitly for any type" in my brain.

Good, because types and classes are different things, despite what languages like Java and my beloved C# would have us think.


To clarify:

"Object" as in Java is a type for any tagged box type. Primitives, such as "int", don't derive from Object, but can be promoted to a corresponding boxed type, such as "Integer".

The story is a little more complex in C#, where ValueType derives from Object, and "value types", both primitive and user-defined, are true subclasses of Object. Types not derived from ValueType are reference types.


This is a consequence of the lack of generics, and the idea comes from Smalltalk.

In languages with support for genericity, you don't need a common base class with a pre-defined set of methods, because you can express type constraints instead.
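
A quick sketch of that in Swift (the `Describable` protocol and the names here are hypothetical, just to illustrate):

    // No common base class needed: the constraint says what the type
    // must be able to do, and the compiler enforces it.
    protocol Describable {
        var describe: String { get }
    }

    struct Point: Describable {
        let x, y: Int
        var describe: String { return "(\(x), \(y))" }
    }

    func printAll<T: Describable>(_ items: [T]) {
        for item in items {
            print(item.describe)
        }
    }

    printAll([Point(x: 0, y: 0), Point(x: 1, y: 2)])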

Java could also have done it with interfaces, but those were the days when OO was becoming mainstream and interface- (component-)based programming wasn't yet well understood.


Haskell does have an Any type, but because of its functional nature it's not as useful as in an OO language.

http://stackoverflow.com/questions/6479444/is-there-a-type-a...


Scala also has a unified type system. The root is Any; AnyVal (primitives) and AnyRef (a.k.a. Java's Object) extend it.

Int is always Int; you never have to think about int vs. Integer. Scala handles boxing when the JVM requires it, but it's invisible to the programmer.


Julia also has a useful Any type.


Did I read that right about the range operators: triple-dot (...) is the inclusive range that Perl, Python, Ruby (etc.?) denote with double-dot (..), while in Swift double-dot (..) is a range operator that excludes the upper limit?

If so, that seems backwards, and too late to fix :-(

(not huge, but unfortunate)


I always thought that Ruby had it backwards, so I’m glad it’s different in Swift. I think it makes more sense for the triple-dot ... range to mean an inclusive range, because you type an extra dot at the end compared to the double-dot .. range, and that extra dot means that the range includes an extra number on the end.


* Python has no dots for ranges; it has a range() builtin and slices, which use colons (e.g. some_array[1:42:2]). Both range and slices exclude the upper bound.

* The Swift operators make a surprising amount of sense (where Ruby's make very little): the longer operator yields the longer range, and `0..10` yields a range of length 10. Think of it like this: the "center" dot is the range itself, the left dot includes the left bound, and the right dot includes the right bound. Thus `..` is "range with the left bound included" and `...` is "range with the right bound included too".


Python has much more sensible range operations (based on the slice syntax for arrays, or the arguments to the built-in 'range' function) that are unrelated to this.

Ruby (and CoffeeScript, and perhaps Perl) uses a..b to indicate a closed range (i.e. including b), and a...b to indicate a half-open range (not including b). Swift seems to do the opposite. My guess for how Ruby (or its predecessors) came up with this is that it copied bash (or sh?), which has only the .. version.

To be honest, the .. vs. ... distinction is really confusing and error-prone, and a bad bit of syntax to include in any language, period.


First thing I looked for in Swift was support for Python's slicing syntax. It's a shame they decided against using that.
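
The closest thing, for what it's worth, is subscripting with a range (a quick sketch in current syntax; there's no equivalent of Python's step argument):

    let numbers = [10, 20, 30, 40, 50]
    let slice = numbers[1..<3]   // [20, 30], an ArraySlice, not a new Array
    let tail  = numbers[2...]    // [30, 40, 50], via a one-sided range
    print(Array(slice), Array(tail))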


I can't speak to the comparison with Perl/Python/Ruby, but yes, using the triple-dot includes the upper limit, and using the double-dot excludes it. For the typical (for var i = 0; i < count; i++) scenario, you'd use the double-dot (i in 0..count).
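
A minimal sketch, assuming current Swift syntax (later betas renamed the half-open `..` to `..<`, which is the spelling that stuck):

    // Closed range: includes the upper bound.
    for i in 0...3 {
        print(i)  // 0, 1, 2, 3
    }

    // Half-open range: excludes the upper bound, i.e. the replacement
    // for the classic `for i = 0; i < count; i++` loop.
    let count = 3
    for i in 0..<count {
        print(i)  // 0, 1, 2
    }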



