
> Note that I've not written any significant code in either language, so I'm just writing based on what I've learnt by reading.

So he’s basically comparing 20+ years of Python usage with marketing material from Go and Rust. Right.

I wish people could be honest with their motivations, instead of making up excuses to justify this sort of change. Here it’s a classic case of “I got bored and I don’t like the new features, want new shiny”, wrapped in a bit of whining about problems that have well-known solutions (use a good IDE, use type hints or one of the many libs that enforce types...). I have no problem with that, just don’t try to sell it to me on technical grounds, because it looks very much like you have none.

Python is still one of the best ways to get stuff done, quickly and with a minimum of boilerplate. Its limitations are well-known and have not really increased for about 15 years - if anything they have shrunk in many, many areas. Adoption rates keep exploding, despite a lot of doom forecasted by some alpha geeks about 10 years ago: “There is no mobile story! We will all be using Ruby/Go/NodeJS tomorrow! Nobody will ever go to Py3!” Etc etc etc.

You got bored by it? That’s fine, it happens, no harm done. But you don’t need to convince anyone, including yourself, that you’re moving on because of this or that flimsy excuse.




> I wish people could be honest with their motivations, instead of making up excuses to justify this sort of change

Why do you assume motivations other than what the author claims? I very much started on Python, but as my career progressed, I got tired of being frustrated with poor library documentation and having no static types, making it extremely difficult to figure out even the parameter types to pass in, especially after it's been a while.

After switching to static typing, things have been a lot less brittle and even faster to develop. It is often said that Python is fast to develop in, but this is really only true for prototyping. Making a Python program and making a robust Python program are entirely different tasks, whereas the two are much less separate in a statically typed language.

Also, statically typed does not imply FactoryFactory or no type inference, especially in languages like Rust, Swift and, to a certain extent, Go as well. Code written in Rust, Swift etc. can often be as elegant and short as the Python equivalent, without losing the benefits of static typing.


Wow. It's amazing the lengths people go to to maintain their delusions. Show me one Python IDE in action doing effective refactoring across a large, nontrivial Python code base. And you skipped over concurrency altogether!

> use type hints or one of the many libs that enforce types...

As if that's pythonic. Heard of duck typing? And those are things you have to do yourself, as additional work and maintenance, rather than getting them for free because they're a feature of the language. And you don't always have full control over the stack.

Is there an IDE that will guarantee that no runtime errors will take place due to typing?

> Adoption rates keep exploding

Because Python made its way into the rapidly growing machine learning market. Python is great for writing glue code/scripts because it's easy to extend with C. Everyone and their grandma getting into data science uses Python.

> I have no problem with that, just don’t try to sell it to me on technical grounds, because it looks very much like you have none.

Looks like you are just not willing to accept or even recognize any valid criticism of Python. This isn't the first time I've seen this kind of fanaticism towards the language.


You crossed into personal attack here, and incivility elsewhere in this thread. That's not ok on HN. Would you please review https://news.ycombinator.com/newsguidelines.html and follow the rules when posting here from now on, regardless of how wrong someone is, or you feel they are?


With regard to refactoring, I’ve never had a problem using PyCharm on Python code. But that is literally the only tool I can say that for.


i’ve used pycharm to refactor 100k+ LOC codebases.

pycharm also uses the type hints that are PART OF THE LANGUAGE (aka pythonic) as of python 3.x and enforces them.
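
e.g., a toy sketch of the PEP 484 annotations that PyCharm (and mypy) pick up and check:

    # standard PEP 484 annotations; a checker flags the mismatch below
    def greet(name: str, times: int = 1) -> str:
        return ("Hello, " + name + "! ") * times

    greet("world", times="3")  # flagged: expected "int", got "str"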


I've used Pycharm extensively in production. It's hands down the best Python IDE, but the refactoring is nowhere near as reliable as with IDEA, Golang etc. Having a static type system provides refactoring certainty that simply cannot be achieved with a dynamic language.

You cannot do with heuristics what you can do with static types.


Pycharm is built on top of IDEA.


Yes, I know, but it doesn't provide the same quality of refactoring as IDEA can for Java due to the dynamic nature of Python.


[flagged]


ok. tell me what i’ve done and haven’t done. you’re the expert.


> pycharm also uses the type hints that are PART OF THE LANGUAGE (aka pythonic) as of python 3.x and enforces them.

Just goes to show you how incredibly messy the transition from 2 to 3 really is. Duck typing and type hinting are two fundamentally different approaches to doing the same thing, and they rub against one another. That's one more thing to add to the long list of things that make it a pain in the ass to migrate from python2 to python3.


They're optional, and it's possible (and not unusual) to use them in Python 2-compatible code with an alternative comment-based syntax and a backport of the typing module.

It's possible to start using type hints without migrating to Python 3, and it's possible to migrate to Python 3 without using type hints. They're almost orthogonal.
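
For reference, the Python 2-compatible spelling is just a structured comment (with the typing backport installed from PyPI on 2.7):

    from typing import List  # backported "typing" package on Python 2.7

    def mean(xs):
        # type: (List[float]) -> float
        return sum(xs) / len(xs)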

It's weird to bring it up like this - do you have any experience migrating code to Python 3?


I’ve done all of that. I guess it depends on the codebase, but transitioning to Python 3 wasn’t so difficult; I suppose it should only be easier now than it was in 2016/17.

Coming back to Python in 2019, it seems almost mandatory to be on Python 3.4+. I cannot imagine someone still using Python 2.7. Except for this one place I worked that was stuck in the last century, God bless their little souls.


You missed the point, so I'll repeat it. The parent I was responding to made the comment that Python has type hinting, responding to OP's criticism of Python's lack of static typing. My point was that there was no notion of type hinting in Python until PEP 484 or whenever it was released, long after the initial release of the language. Python was released in 1991 and type hinting was introduced around 2014. So for the 23 years the language existed prior to type hinting, people made all sorts of arguments/excuses for why that's not needed/unnecessary for Python. Hell, even now, as a consequence of this, type hinting in Python isn't all that common. It may have to do with the fact that for 85% of the language's lifespan there was no type hinting and duck typing was rampant, which, like I said, is an approach that goes against type hinting.

So if you want people to be pythonic in py3, they have to make massive changes like replacing duck types with abstract classes.

I think it's weird that you don't question why the vast majority of Python code has no type hints. Do you have experience in Python?


It makes sense now, thank you. I think the word "pythonic" was misused in the parent - usually it's normative, but type hints are strictly optional.

I don't think I've seen anyone argue that Python 3 code without type hints is unpythonic. Except maybe the parent, but I think that's just unfortunate wording. It's explicitly using a non-standard definition of pythonic ("part of the language").


You know that protocols and static duck typing are possible, right?

They're not at odds at all.
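
Rough sketch of what I mean (typing.Protocol landed in 3.8 via PEP 544; before that it lives in the typing_extensions package):

    from typing import Protocol

    class Quacks(Protocol):
        def quack(self) -> str: ...

    class Duck:  # note: no inheritance from Quacks
        def quack(self) -> str:
            return "quack"

    def make_noise(d: Quacks) -> str:
        return d.quack()

    make_noise(Duck())  # OK: checked structurally, i.e. static duck typing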


Not telling you what you have done or not done. But this is the internet; you need to provide proof in order for your statements to be taken seriously. I am not telling you to do anything; by all means, don't provide any proof.


> But this is the internet

I find that on the internet, after proof is given, more disagreement ensues. When one's mind is set, it is set in stone, proof or no proof.


Perhaps there should be apostasy punishments for Python defectors. /s

Python's typing story is far from perfect and rather annoying compared to other languages. Python is good for interacting with the operating system, networking and other things.

But wanting a proper compiler is an honest motivation.


There are type checkers now.

Personally I find type systems a lot more annoying, because they lead to more verbose code and more cognitive load. There are studies that show bugs being proportional to the number of lines of code, regardless of language. And that effect seems to be more significant than the difference between dynamic and static type systems.

Defensive programming can be achieved through various means. Type checking is one of them, and it often can't replace unit and integration testing. Validation of inputs and other data is another issue.


A proper type system offers a level of safety and correctness that simply cannot be achieved without one, especially as the complexity of an application grows.

Also, the supposed lack of cognitive overhead without a type system is actually a subtle form of technical debt: you move faster because you eschew thinking deeply about your data structures up front, but inevitably pay a bigger price in production when you have to debug data type errors at run time. The debt has to be paid again by every new developer tasked with working on the project, because they have to organically absorb an understanding of the application's data structures that could have been explicitly defined and enforced by the computer.

Finally, the extent of "defensive programming" required to begin to make up for a type system actually adds a ton of extra code itself (e.g. tons of guard blocks testing brittle application values at run time). Not all type systems are created equal, but a code base without types is pretty much always worse than a comparable one with types (the exception being shell scripts and other relatively simple programs that aren't maintained by a team of developers).
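
A tiny illustration of the run-time-debt point (a made-up find_user, checked with mypy): the annotated version is rejected before it ever runs, while the unannotated equivalent fails only on the unlucky code path in production.

    from typing import Optional

    def find_user(user_id: int) -> Optional[str]:
        return None if user_id < 0 else "alice"

    name = find_user(-1)
    print(name.upper())  # a checker flags this: "None" has no attribute "upper";
                         # without the annotation, an AttributeError at run time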


> Defensive programming can be achieved through various means. Type checking is one of them and often can't replace Unit and Integration testing.

Of course they won’t replace unit or integration tests, but they will replace a lot of damn trivial tests you have to write to make sure you exercise all paths of the program, because there are no types to help you.

Also, at least when I think of defensive programming, I don’t think of how Haskell et al. allow me to more clearly express my intents in the type system. I think of the times where I write JS/TS and cannot rely on any data anywhere being in the format it says it’s in.


> Personally I find type systems a lot more annoying because they lead to more verbose code and more cognitive load.

Have you forgotten the cognitive load of "what does this return and what can I call or access on it?"

> There are studies that show bugs being proportional to the number of lines of code, regardless of language

But that value isn't the same across all languages, only the trend.


No, that was explicitly true across languages. That the number of bugs increases with code length within a particular language is a trivial fact.


I don’t think Haskell or OCaml are more verbose than Python.

Other traditional languages are adopting type inference, etc. We could be near an inflection point where we get statically typed languages that are as convenient as Python.


> I don’t think Haskell or OCaml are more verbose than Python.

I love Haskell's computing model, but holy hell, why is it so difficult to use things like the State monad, or even a trivial thing like a global counter [1]? Why should something so simple be so difficult? Don't say it's because of functional programming; Scheme/Racket is not that difficult!

Haskell and OCaml have their own problems. I agree with you that both languages are more expressive, but code density is not necessarily a good thing either.

[1]: https://github.com/scotty-web/scotty/blob/master/examples/gl...


A global counter is difficult because shared global mutable state should be difficult: it can wreak havoc on your code. That said, if you're really set on it, it's not particularly bad:

    import Data.IORef
    import System.IO.Unsafe (unsafePerformIO)

    counter :: IORef Int
    counter = unsafePerformIO (newIORef 0)
    {-# NOINLINE counter #-}  -- keep GHC from inlining the IORef creation

    usesGlobalState :: IO ()
    usesGlobalState = do
      count <- readIORef counter
      putStrLn ("the counter was " ++ show count)
      modifyIORef counter (+1)
It’s even easier in OCaml, where you’d use a ref and not need to worry about the IO monad.

Anecdotally, I find it incredibly rare that such a thing is desirable, when you could instead create the IORef in IO and pass it through to the functions that need it. But it’s there when necessary. And making this kind of anti-pattern hard is a strength of these languages, not a weakness.


Oh sure, but Haskell and OCaml are much less popular than Python, with consequences for community building, recruiting, reusing existing code bases etc.

The popular statically typed languages which are actually Python's competitors are for example C++, C#, Java and the like.

Sure as hell they are more verbose. Even TypeScript was extremely intimidating to me when I first tried to understand TypeScript code bases.

And I am NOT going to try and teach Haskell or OCaml as a first programming language...


If one were to write perfectly type-annotated Python, and a hypothetical perfectly annotated standard library existed, what’s preventing a faster runtime being developed, with performance similar to C#, for example? That will never happen, so maybe the “what if” isn’t even worth asking.


It's already being done: https://github.com/mypyc/mypyc


Yet the entire data science world is built on Python. I don't understand this blatant disregard of reality.


Python is ideally suited for data science and machine learning because 80% of the work tends to be data exploration, transformations and feature engineering, and 19% of the work is iterative development of models. In such workflows, the lack of type checking plus a simple REPL is a huge feature. Most of the underlying code for ML is written in Fortran or C anyway, and Python becomes a very convenient front-end glue.

For the 1% work that is actually engineering the model's deployment in production, it's not uncommon to use faster languages with the exported model.

If what you're doing is writing a vanilla CRUD server that gets thousands or hundreds of thousands of hits a second, Python is probably a bad choice.


I’m not sure how many people will ever even work on CRUD apps of that scale.

Even a mere 1,000 req/sec adds up to 86.4M per day. That sounds like something at least as popular as Stack Overflow, for example. How many web apps are there, really, that have millions of users?


Or almost any advertisement tracker server.


Isn't that market dominated by a few, or at the max, no more than 10-15 companies? My point is just that are there really more than a few dozen or maybe a hundred companies that see more than 1000 req/sec (on web apps)?


Depends if you're talking maximum or average. Getting to the front page of HN could easily give you a burst of 1000 req/sec. I would agree there's very few CRUD apps that deal with 1000 req/sec on average.

On the other hand, there are various APIs and tracking-type services that probably do handle well in excess of that, run by not-large companies none of us have ever heard of.


Uh, why? Thousands of hits per second will generally be IO- or DB-bound, and can be handled with clear code and a minimum of boilerplate.


I think Python is at a weird place regarding data science.

If you need easy-to-use tools, R libraries cover significantly more ground than Python's. You don't have to worry about 2/3 compatibility of niche libraries. Plus, the functional language forces the libraries to have similar APIs. You also get to use tidyverse and ggplot2 in all of their majestic beauty. At most you have to write a "%>% do() %>%" to get around weird library API choices.

However, if you are on the frontier, you could avoid so much awkward notation with an array-first language like Matlab or Julia. You have "^" instead of "np.linalg.matrix_power", "/" and "\" instead of "np.linalg.lstsq", the "." operator, and so many more life-saving tools.
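
For instance, the numpy spellings of things that are single operators in Matlab/Julia:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0])

    A5 = np.linalg.matrix_power(A, 5)          # A^5 in Matlab/Julia
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # A\b in Matlab/Julia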


R has so many problems itself! Attempting to run it at any sort of scale is basically impossible. Its dependency management is atrocious. You can't thread or multi-process effectively. The amount of overhead to even install R on a system is incredible, and amazingly its documentation is even worse than Python's. We use R as our foundational data science language and it has caused nothing but immense pain for everyone attempting to wrap the packages in scalable software.


None of the libraries that do the computations are written in python. None of the GUIs or matplotlib are written in python. If you want to build large projects or need performance, python is not the right language. I do like python a lot - for the things it's good at.


no, the trivial frontend is built in Python; the real code is usually C++ or C.

Ignore that reality if you want to, but it is a fact.

Big, complicated Python projects are seldom pure Python; they are usually a friendly Python frontend to a serious application written in something else.

It seems in no way remarkable that someone wanting to build a serious backend-type piece of functionality would pick another language that was, just for example, multithreaded.


The CPython interpreter itself is a C program. Acting like extension modules “aren’t Python” is highly disingenuous.


Oh please, go read the source code for tensorflow and then come back and we can have a real conversation.


I don't understand this logic. The users are learning Python, not C++, when they're trying to learn data science or implement a machine learning model. Should I say TensorFlow isn't written in C++ but in CUDA or OpenCL? Any self-respecting researcher is training their models on a GPU or FPGA, not a CPU.

The point is that Python is the entry point for the large majority of data scientists currently, and it's absolutely disingenuous to try to dispute that reality.


> The users are learning Python, not C++

The key word here is "users". Python is fantastic for users. It allows users to get things done without having to worry about types or memory or the underlying hardware in any way.


I’d say it’s very disingenuous to claim that Python users don’t have to care about types. Many types are directly related to semantics, so of course they have to care about types!

I think Python made some really good UX choices with its types that lead to you caring more about your logic than about the machine. Only one integer type, for example! For most users it’s perfect UX.

However, I think (possibly owing to how early Python was made) some of the decisions made practical but regrettable trade-offs. Duck typing is ultimately just type inference with strong performance penalties and really weak static tooling, for example. _Many_ static type systems are still very far from the user’s actual domain, but the core techniques don’t have to be.


Not just data science: the entire machine learning and scientific computing world, if we’re honest. Need to do huge numerical linear algebra operations very fast with great memory safety and ease of use? Python.


Technically, you'd use a C/C++ library with a Python wrapper.


Yes, that is called Python.


No it isn't? The substantive portions of the software aren't written in Python. Python is a thin interface to the actual library. Users of the library who need to rapidly prototype can leverage the library using its Python frontend, but the authors of the library did not write the library in Python.


To program in CPython is to use thin wrappers around C-level implemented extension types and modules. For example, all major built-in container types.

If you program in Python for performance-critical applications, as I do, then you will frequently hand-write C-level extension modules, write extension modules in Cython, use JIT tools like numba.

All of this stuff “is Python.”
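
As a concrete sketch of the JIT route (assuming numba is installed; @njit is its standard decorator):

    import numpy as np
    from numba import njit

    @njit  # compiles the loop below to machine code on first call
    def total(xs):
        acc = 0.0
        for x in xs:
            acc += x
        return acc

    total(np.arange(1_000_000.0))  # subsequent calls run at C-like speed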


Even for the CRUD apps so frequently referred to in this thread, the equivalent 'real code' is the SQL RDBMS written in C or C++, no matter what the web-app language is. I don't advocate using Python in million-line monoliths, despite it being my favorite language in general at the moment, but the volume of cognitive dissonance and false dichotomies in this thread is crushing.


Wrong: the data science world is built on R, which was designed from the ground up to be the freeware alternative to SAS and Mathematica.


Oddly, in biology I’m seeing an uptick in R use, as it’s easier for non-programmers to install and run tools in R. The Python 2/3 confusion (why doesn’t this Google result work?) and the various install tools (conda/pip...) seem to have made R more popular. This is despite biopython, which is quite good. The newest single-cell RNA-seq analysis tools are in R.

R is a strange thing to me, though it has its moments (graphing). I still think pandas, which gives Python R-like data frames, is one of the great tools out there.


pandas gives base R a run for its money, but I find the multi-indexing really confusing, and R tidyverse is just way way nicer than pandas.

The sentiment that R is easier than python to install/manage is common among some R users but I disagree. With R I'm constantly facing dependency hell problems -- one package wants an old version of R, while another needs the newest. Conda/venv solves this problem very nicely in python.

Recently I've been using Julia more and really like it. One nonstandard case where I've found it really shines is parsing large bioinformatics data, such as pileup files. Python is just so slowww here, but neither do I want to write a C program to do the text parsing. Julia is perfect in this case.


Wrestling with borrow checker is enough punishment already. /s


See the details here about Obnam, the project he maintained in Python for years (since 2006):

https://blog.liw.fi/posts/2017/08/13/retiring_obnam/

I can really understand his:

"Obnam has not turned out well, from a maintainability point of view. It seems that every time I try to fix something, I break something else. Usually what breaks is speed or memory use: Obnam gets slower or starts using even more memory."

Also from his current post:

"I could perhaps have been more diligent in how I used Python, and more careful in how I structured my code, but that's my point: a language like Python requires so much self-discipline that at some point it gets too much."


These two stuck out as well. I wonder if the issue is with their coding practices and not necessarily a fault with the language.

If the issue is one of fundamentals, it will only follow them to the next project or language.


Give me two lines of an innocent man's hand...

Really, changing a Python project is not pretty, because speed and memory usage guarantees vary wildly between releases. One factor is that CPython is constantly being rewritten.

And that does not even cover projects that depend on 20 PyPI packages, which introduces practically guaranteed breakage.


> One factor is that CPython is constantly being rewritten.

Exactly. I have the impression that programmers as a whole have already spent significant time developing and maintaining various solutions just to manage different Python versions and dependencies in production environments, which would perhaps have been much less necessary with some more "discipline" in the development of Python itself.

On the other side, Python won many hearts over Perl 5, which is much more stable but "appears to be" harder. Interestingly, Perl actually forces a programmer to "care more" (i.e. be more precise): a reference to something requires different notation than the direct use of something, the use of an array requires different notation than the use of a scalar, etc. It actually has "some kind" of typing, enforced by the language and explicit in every line. In my personal experience, I have much more "trust" in a biggish Perl program than in an equally big Python program to behave "exactly how I'd expect it" (e.g. in the sense that Perl doesn't produce "exceptions" as easily, and whoever wrote it had to think more about correctness than in Python, especially if "use strict" was used, which gives stronger guarantees against typos in variable names than Python has).

Going further, however, C-like (actually Algol-like) compiled languages simply allow much more than the scripting languages regarding the smallness of what is eventually produced. E.g. Busybox ( https://busybox.net/ ) has to be written in C or something very close to C. Free Pascal ( http://wiki.freepascal.org/Platform_list ) is also a very practical language for many use cases.


Are you making that comment by speculating or by having looked at the Obnam codebase? I'm not a Python developer, but I've looked at the Obnam code enough to have written a replacement crypto plugin for it. AFAICT, if Obnam was holding Python wrong, it's too hard to hold it right.


I haven’t looked at the code, but I hope someone who has been using Python for 20+ years knows how to write and use it properly... so yeah, I’m inclined to agree with you.


Languages encourage various ways or styles of solving problems, though, and some of those ways require more self-discipline or care than others. For example, many languages with static types make it so that you can make sweeping changes to the interfaces between things and the compiler will catch any mismatches for you, but in a language like Python, you'd better hope you have a unit test that will catch it.

I like Python well enough, have been using it since mid-2001 and was very much a hardcore Python enthusiast for a number of years, but nowadays I prefer languages that encourage immutability and functional programming styles, as I find that they make it much easier to avoid bugs. I've had plenty of Python experiences where bugs were caused because something somewhere else was mutating stuff behind my back, messing up my mostly-unrelated code. Usually this other code was also written by someone else, so I had no way of knowing about it. This kind of thing can be prevented in other languages, but requires a lot of discipline to avoid in Python (and other languages, I'm only picking on Python here since that's the topic of the article). These problems won't follow you to the next language, if the next language is more suited to avoid them.
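
A toy version of the kind of aliasing/mutation bug I mean (nothing here is from Obnam, just an illustration):

    DEFAULT_OPTS = {"retries": 3}

    def make_client(opts=DEFAULT_OPTS):
        opts["retries"] -= 1      # silently mutates the shared dict
        return opts

    make_client()
    make_client()
    print(DEFAULT_OPTS)           # {'retries': 1}: changed behind your back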


Even on larger code bases I've found that the only real discipline needed is maintaining a test suite, and that's true for every language.



You don't need to convince anyone, including yourself, that python's well-known limitations couldn't possibly have inspired this guy to look at other languages.


I was a year-long Python user when I started learning Rust. Reason: I knew just enough C/C++ to know that, time-wise, I wouldn’t be able to pull through learning either of them properly. But I wanted a strongly typed, fast alternative to Python without garbage collection.

I had read some stuff about Rust and it seemed like an interesting language, mature enough for my applications. I tried it out and came to like it more than I would’ve expected. In fact, I tend to use it over Python for many things nowadays. One of these things is definitely how the module system works, but I agree with you: if you fuck up your code structure in Python, the problem is definitely between chair and desk.

However: Rust won’t let you get away cheap if you don’t think about structure, and this could certainly help some people adopt better practice (which in the end is good for all of us).

Interestingly enough, my appreciation for and interest in languages like C and C++ grew a lot after I had learned Rust. It gave me a good new perspective on C++ and taught me a ton of good patterns and concepts that will definitely end up being useful in other languages. Rust certainly is well thought out, and good ideas are never bad to look at.


May I ask what your use case was that required a no-GC language?


> Here it’s a classic case of “I got bored and I don’t like the new features, want new shiny”, wrapped in a bit of whining

This seems to be a rather shallow reading of the blog post, and the way you phrased your reaction is not exactly polite either. If anything, the "adoption rates" of Python (like those of ECMAScript, a language of a similar vintage, all things considered) are most easily explained by non-technical factors, so I have quite a bit of trouble making sense of your comment.


There was absolutely nothing impolite in the way he phrased his response. Rather, what I'm responding to is passive-aggressive.


Wanting new shiny is a helpful healthy way to keep exploring and growing.

I've worked with the opposite: people who absolutely refuse to ever learn Python/Go/whatever, producing these awful thousand-line bash scripts.


Type checking as an add-on is okay but it is not the same thing as a statically typed language.


There is no proof or study that statically typed languages are safer in any regard than others.


Wrong. http://ttendency.cs.ucl.ac.uk/projects/type_study/documents/...

"our central finding is that both static type systems find an important percentage of public bugs: both Flow 0.30 and TypeScript 2.0 successfully detect 15%!"

In fact, there has been more than one study with the same conclusion.


I didn’t say anything about safety. I find that it makes writing code and refactoring and collaborating a lot easier.


There was a paper linked here on HN a week or two ago that proved a number of safety guarantees using something similar to Rust. Sometimes there is truth behind the hype.


True. However, there is fairly strong evidence that it helps with understanding code.


Type hints throughout the code give you exactly the same understanding.


Only if they are correct. I am pretty sure that in an old code base they will not be correct unless automatically checked before every commit.


That's trivially settled with a static type checker.

So the argument is analogous to saying "how can I trust the types in this Java program are correct, if it has never been compiled".

Well, whether some "old code base" has them correct or not, just run the type checker and find out.
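
Concretely (assuming mypy; the exact message wording may differ by version):

    # legacy.py - maybe the hints rotted, maybe they didn't:
    def save(user_id: int) -> None: ...

    save("42")

    # $ mypy legacy.py
    # legacy.py:4: error: Argument 1 to "save" has incompatible
    #   type "str"; expected "int"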


If you do that, you will probably hurt your productivity. There will still be possible errors coming from third-party code.

In my opinion, optional type checking is not comparable to the guarantees of a statically typed language. So if I do not need types, I use Python. If I need/want type specifications, I use a statically typed language.


Please cite that evidence.

To me it sounds like circular reasoning, because in my subjective perception, dynamically typed code is a lot easier to understand to begin with. Also, tooling has become a lot better, with and without type checkers/annotations.


For example: Do Static Type Systems Improve the Maintainability of Software Systems? An Empirical Study

"Despite their importance, whether static type systems influence human software devel- opment capabilities remains an open question. One frequently mentioned argument for static type systems is that they improve the maintainability of software systems—an often used claim for which there is no empirical evidence. This paper describes an experiment which tests whether static type systems improve the maintainability of software systems. The results show rigorous empirical evidence that static type are indeed beneficial to these activities, except for fixing semantic errors."

https://pleiad.cl/papers/2012/kleinschmagerAl-icpc2012.pdf

Follow on:

An empirical study on the impact of static typing on software maintainability

"We further conduct an exploratory analysis of the data in order to understand possible reasons for the effect of type systems on the three kinds of tasks used in this experiment. From the exploratory analysis, we conclude that developers using a dynamic type system tend to look at different files more frequently when doing programming tasks—which is a potential reason for the observed differences in time."

https://link.springer.com/article/10.1007/s10664-013-9289-1

Note that I wrote "evidence", not "proof". Also, there is no robust evidence for safety improvements.

Are there potential confounders? You bet. Are there other aspects that can help? You bet.

I do find that types help me with older code, despite the fact that my preferred languages (Objective-C, Smalltalk) both have keyword syntax that helps tag arguments even if there are no types or documentation. In particular, protocols help me define the interactions that I expect in my system, which otherwise tend to be defined only implicitly through the objects. I'd prefer real connectors.

When I try to understand Smalltalk code, that's a lot of browsers open and digging etc. When I look at some of my older Objective-C frameworks, all the id types do leave me with a bit of head-scratching. A lot of the times the types are trivial to add, and just make things obvious at a glance.

However, I also totally agree that these languages help me write cleaner code, and that enforced type systems get in the way of useful architectural abstractions. Sometimes I need that id.

¯\_(ツ)_/¯


I feel exactly the same way with all the Rubyists moving to Elixir.



