My path to programming, as a product of the internet generation:

As a child, I loved the idea of computer programming. I got books at the library that told you how, but they always needed equipment I didn't have. I actually wrote fake programs in Ami Pro from sample code I found in books! My parents were cutting-edge with computer purchases, but were otherwise nontechnical. They didn't know that toy programming languages existed, and probably didn't care. They just wanted me outside!

Time passed, and in 8th grade I needed to sign up for high school classes. I saw "Computer programming" in the class listing and I registered immediately. I didn't want to enter the class knowing nothing, so I researched C++ (the language taught by the class) and came across DJGPP. I spent a lot of nights that summer careening through internet tutorials and wrote a mountain of crap, but it was fun!

The first day of school was very disappointing - I got the syllabus, and saw that I'd already learned everything on the sheet. I was equally disappointed 2 years later when I signed up for the AB C++ exam and realized I knew nothing on the sheet. The next year was the worst yet - I knew everything on the AB Java exam, but I had to spend the year relearning it all in Java! Yuck!


There are plenty of problems with CSS. You can't specify the width of an inline element, for one. That alone prevents a lot of reasonable layouts, like tables. Also problematic is a standards body that ignores this use case! Presumably the restriction protects the writer from over-filling some boxes with more boxes, but overflow would at least be visible - the workarounds end up far worse.


If you want a table layout, why not just use a table? Or you can use display: inline-block if you want to give a width to an inline element.


I run into plenty of situations where I have multiple lines and I'd like the respective elements to align. It's not limited to tabular data, but that's the easiest example.


Tables FTW. It's the HTML element for controlling horizontal and vertical relationships at the same time.


> You can't specify the width of an inline element, for one.

  display: inline-block;
  width: ...

?


Unfortunately, bad browser support :/

http://www.quirksmode.org/css/display.html


That was primarily in the context of the "Also problematic is a standards body that ignores this use case!" part of your comment.

Besides, if you don't care about antiquated browsers (e.g. IE6), inline-block is a viable option. In IE it will require some massaging, but it can be made to work quite easily.


The police investigate crimes that have already been observed. Journalists have the freedom to investigate potential crimes based on gut feeling. Recall Enron - a police officer standing anywhere in headquarters would have missed the crime. However, some pesky reporters asking too many questions brought the whole operation to its knees.


Most application developers won't worry about concurrent program design. Ever. More and more applications are becoming layered each year, and all of the hard work is done on the server side. Drawing an application's chrome takes little power in comparison.

Concurrent programming ends up in the data center, fussed over by the (relatively) small core of engineers and software developers. Everyone else just queries this data and makes it look good. Concurrently fetching data doesn't need a paradigm shift, just a good library. If you still need a few threads, you can use the same crummy techniques we've always used.
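To make "just a good library" concrete, here's a minimal sketch using Haskell's async package; the fetchProfile function and the list of names are hypothetical stand-ins for real remote calls:

  import Control.Concurrent.Async (mapConcurrently)

  -- Hypothetical stand-in for an HTTP request to some remote service
  fetchProfile :: String -> IO String
  fetchProfile name = return ("profile data for " ++ name)

  main :: IO ()
  main = do
    -- One library call fans the fetches out across threads and
    -- collects the results; no concurrency paradigm shift required
    profiles <- mapConcurrently fetchProfile ["alice", "bob", "carol"]
    mapM_ putStrLn profiles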

Cloud computing will change how Moore's Law plays out on most devices. Applications may still demand twice as many transistors every 18 months, but the transistors are no longer on your desktop. They're mostly in some data center.


Someone has seen a vision of the future; and it is bright, shiny, orderly, and ... beautiful!!!

In my 30 years in and out of the software development world, I've seen _many_ visions of the future. In particular, I've been reading about the death of the desktop application since, well, since before desktop applications were around. Anytime someone starts to tell me that the future is going to be X, my response is: yes, the future may include X, but it will also include a bunch of old stuff, and a bunch of stuff that no one has foreseen. Entropy increases until a given system falls apart and is replaced by something better that works ... at least as well. Usually. (See Ted Nelson's vision of Xanadu, which was supposed to _prevent_ the World Wide Web.)

And the duct tape and baling wire holding it all together is ... wait for it ... faster processing speeds, more and better storage, faster networks, desktop apps, plug-ins, scripts, prayer, and lots and lots of consulting fees. (edit: I forgot to add faith, hope, and charity as well.)

Don't get me wrong, your vision _is_ beautiful. It's worth believing in and probably worth working toward. Some version of it will probably crawl, writhing noisily and messily, from the sea of change. Just don't bet the farm on a particular version of it.

Edit: I will also add this in direct response. My desktop computers are quickly _becoming_ my data center. I'm spending most of my face time with mobile devices: laptops, a smartphone/PDA/music player, and, of course, my beloved beautiful iPad...

So I hereby create the new buzz term PDC. Personal Desktop Cloud. Bask in its glory and power.


It's not some silly vision of the future, but an oversimplified version of today! A brief and incomplete list of applications I use in a typical day:

Remote storage and/or processing: GMail, Google Docs, Weather.com, Reddit.com, Hacker News, Outlook web client, Google.com, DuckDuckGo, Delicious, Facebook, Github, tens of blogs/articles, online help documentation for, well, everything

Local storage and/or processing: Windows+Linux, Firefox, Chrome, Outlook, Visual Studio, Emacs, Python (or other dynamic languages), Acrobat, Amarok, random Unix utilities, various games

Most of my applications exist solely to present data stored elsewhere. I see no reason the trend won't continue: for instance, why would I compile C++ code on my machine when I can farm it out? Why would I store flat code files on my machine when I can have synthesized views of the code I need to see at one time?

Split up by time, most of my attention is spent manipulating or displaying data from somewhere else (or that could be stored somewhere else).

Games show that there are exceptions.


Even if all that is true, the server apps have to be written by somebody. They don't get written by themselves. It's not like whole-program optimization of your C++ application in the cloud just magically happens. The same team that wrote the C++ app on your desktop is going to need to figure out how to optimize across your application, and in some cases it is more difficult, as they'll be dealing with multithreading within a box and multiprocessing across boxes on the server. And then they have to work on optimizing data transfer from one cloud to another (since presumably the cloud you build on isn't the cloud you debug or edit on).

The world just got more complex, not simpler for devs.


But my original point is that there's one dev team in the middle of this dealing with concurrency, and any number of remote applications that can use it through a library because someone else worried about the hard parts. There isn't a day of reckoning where developers as a group worry about efficient concurrent computation; it's the few guys in the center.


You are describing a project, not the world. We break the world up into projects so we can manage them. No one has been able to come up with a way to manage the world (so far).

The next step in understanding "the cloud" is that it is actually "clouds." Some connected deeply, others loosely, some held in jealous, secretive isolation. There are clouds within clouds, and some clouds are outside the light cones of other clouds.

I will describe a place known as "the pit." The pit has power wires going in, and that is only out of compelling necessity. (If they could make carrying in batteries work, they would.) Equipment, data and people go in, but only people come out. The pit is a crowded place. And there is much processing of the data; decisions are made, the world changes.

We all have our own personal pits. Or at least we should.


Where is this one dev team that is dealing with concurrency? I don't see how you could do this, at least not with the current state of the art in concurrency technology. They can provide some basic tools like concurrent collections, but the hard work is still app-dependent. I still need to figure out where concurrency makes sense. I still need to figure out which data races are OK and which ones are a problem. I still need to be the one to put locks in my code. It's not like I can just push up the source code to Photoshop and say, "Make it concurrent now please".
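Even when a library hands me the primitives, deciding what to lock is still my job. A tiny sketch in Haskell (the shared counter is a made-up example):

  import Control.Concurrent.MVar

  -- The library provides MVar, but only I know that this particular
  -- update has to be serialized; that judgment is app-dependent
  bumpCounter :: MVar Int -> IO ()
  bumpCounter counter = modifyMVar_ counter (\n -> return (n + 1))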


So, you centralize the real work, so the average computer just has to worry about eye candy.

Scary. http://www.softwarefreedom.org/news/2010/feb/01/freedom-clou...

On the other hand, free parallel libraries that execute on my computer are perfectly OK.


Rejewski's and Turing's Enigma-cracking accomplishments were staggeringly significant, but on their own they won nothing. If the Allies had bombed every submarine that radioed a position, the Germans would have realized the machine was compromised and thrown it into the ocean. Enigma let the Allies stack the overall war strategy in their favor, but cracking it was far from winning the war.

I do agree that brainpower won the war - our cryptology was stronger, resource management between the Pacific and European fronts was superb, and the Allies had a knack for picking battles and battlefields where they could excel.


I disagree with your perspective. Scientists don't need to understand everything about life to make meaningful contributions to our lifespan. This stuff isn't magic - there are sound, repeatable principles at the bottom of everything you mention, and the principles are possibly within our grasp. The technology needed for making these discoveries only gets better with time, never worse.

Look at it this way: crude principles applied at the macro level have extended our lifespan by decades. As scientists get better and better at piecing together the building blocks from the bottom, they'll likely find principles at the micro level that improve our lifespan as well.


Red-Black Trees in a Functional Setting

Chris Okasaki

http://www.eecs.usma.edu/webs/people/okasaki/jfp99.ps

It shows how to construct Red-Black trees in an extremely simple manner in Haskell. I tried this technique in C++ and I was finished within an hour!
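For a flavor of the paper, the heart of Okasaki's insertion algorithm is a single balance function; condensed from the paper, it looks roughly like this:

  data Color = R | B
  data Tree a = E | T Color (Tree a) a (Tree a)

  -- All four red-red violations rotate into the same balanced shape
  balance :: Color -> Tree a -> a -> Tree a -> Tree a
  balance B (T R (T R a x b) y c) z d = T R (T B a x b) y (T B c z d)
  balance B (T R a x (T R b y c)) z d = T R (T B a x b) y (T B c z d)
  balance B a x (T R (T R b y c) z d) = T R (T B a x b) y (T B c z d)
  balance B a x (T R b y (T R c z d)) = T R (T B a x b) y (T B c z d)
  balance color a x b = T color a x b

  insert :: Ord a => a -> Tree a -> Tree a
  insert x t = blacken (ins t)
    where
      ins E = T R E x E
      ins s@(T color a y b)
        | x < y     = balance color (ins a) y b
        | x > y     = balance color a y (ins b)
        | otherwise = s
      blacken (T _ a y b) = T B a y b
      blacken E = E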


If I feel extra contact with someone is inevitable, I usually try to request that the person email me some helpful info that I need. If they say they'll "just call," I point out that [list of people] might find it useful, and joke that I can't forward them a phone conversation we've had. Some people still insist on calling, and I let them.


A friend of mine gave up drugs completely after watching "Requiem for a Dream."


That was one of the saddest movies I have seen. A great film, very well shot, good sound track, good actors. However, I don't have a desire to see it very often, because it just depresses the hell out of me.

Darren Aronofsky's other movie, Pi, is also great. Highly recommended.


You could take the beginning of habit 5 as complimentary to Emacs. Has another program ever been extended to do more?

Towards the end of point 5, Bram really points out the strength of Vim over Emacs - it is small enough that it could be refactored to be embeddable. This is the real slight towards Emacs - Emacs will never be embeddable, not in a hundred years. It is the all-purpose consumer that provides little benefit to outside programs. I think Bram saw a niche in creating a 'libvim' that other programs could use.


Take a look at ezbl.

Emacs doesn't even attempt to become embeddable. It instead takes the strategy of assimilating everything into a unified, tweakable UI.

Two completely different strategies, both useful in their own niches, and incredibly powerful for those who have mastered their esoteric incantations.

That being said, resistance is futile.

