Hacker News | jntrnr's comments

Or:

  open foo.txt out> bar.txt


This is how Nushell works. If the command isn't an internal command, Nushell will run it from your path.

Curious what you tried that didn't work.


There are some questions around Servo as a result of the announcement, so I just wanted to jump in and talk a bit more about what’s going on. tl;dr - Servo is Mozilla’s vehicle for web engine research, and continues to be a major source of new tech for Firefox. But we’re also adding a more direct path to product for Mixed Reality.

Servo has produced lots of great browser tech. We're seeing that tech make its way into Firefox with Stylo and the Quantum release, and there's ongoing work to bring even more tech into Firefox, like the WebRender work that you can try out in the latest nightlies. Servo's goal is to create the best technology for working with both current and upcoming web standards, which means collaborating with multiple product teams, from Firefox to Mixed Reality to other, future explorations.

There’s still a long road to full compatibility with the existing web in all of Servo’s components, and it will take time to get there. In the meantime, though, there are emerging technologies where Mozilla believes it is vital for the open web to play a central role. One of these is Mixed Reality (which refers to both Virtual and Augmented Reality), a space that’s getting a ton of attention from all of today’s tech giants.

Mixed Reality is interesting for Servo in two different respects. First, it’s a huge opportunity: it’s early days and content is brand new, so there’s no long tail of web compat to worry about; we can get products built on Servo to market relatively quickly. By putting open web tech on the cutting edge, Mozilla can help ensure that Mixed Reality doesn’t become yet another siloed technology.

But second, the constraints of Mixed Reality will help us push Servo technology to the limit: we need to achieve 75 or 90 frames per second per eye to make a workable product. The research advances here will pay huge dividends back in traditional browser engines like Firefox.
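To put those frame rates in perspective, here's a back-of-the-envelope frame-budget calculation (my own arithmetic, not from the post):

```rust
fn main() {
    // Frame-time budget at the refresh rates mentioned above:
    // budget_ms = 1000 / fps.
    for fps in [75.0_f64, 90.0] {
        let budget_ms = 1000.0 / fps;
        println!("{} fps -> {:.1} ms per frame", fps, budget_ms);
    }
    // At 90 fps the engine has roughly 11.1 ms to produce each frame,
    // including style, layout, and paint.
    assert!((1000.0_f64 / 90.0 - 11.1).abs() < 0.1);
}
```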

In short, this is a “yes, and” shift. Servo continues to be about building the best browser tech, period, for use across Mozilla’s products. The increased emphasis on Mixed Reality represents an opportunity to push that tech further, sooner. And the organizational change within Mozilla Research reflects a closer collaboration between the teams needed to make that happen.


I distinctly recall a Servo presentation in 2012. It started: "web browsers are written in C++. It is bad for humanity."

Web browsers are still written in C++. It is still bad for humanity.

Emphasis on mixed reality may or may not solve that problem sooner. Sometimes the quickest way can seem roundabout. But I am skeptical.


The OP blog entry is so inundated with PR speak that it's hard to read the actual message in it.

What I'm reading from the top comment here is that the actual point is to focus Servo on VR and Mixed Reality, because that will pull Servo even further in the direction it was always supposed to go: creating a browser that outperforms everything else by leaps and bounds by more easily taking advantage of multicore processors. VR and Mixed Reality require 90 fps to work properly for users, so this is a very high and completely hard baseline for Servo to hit. It's about setting a loftier goal and a higher bar to meet, to make sure Servo reaches its original goal of great performance by way of parallelism.


I think that does actually work out.

The problem for humanity is not that browsers are written in C++, it's that browsers have lots of security vulnerabilities.

Being written in C++ doesn't help with that, but it's not integral to the problem. It's not impossible to produce C++ code without vulnerabilities; it just requires a lot of effort and often years of battle-testing to close all of them.

But Firefox's source code has, for the most part, had those years of battle-testing. It's probably safer than a complete rewrite in Rust would be, at least in the short term.

Where the use of Rust can prevent most vulnerabilities is in new code. And that's what Mixed Reality is: it's going to need to be in the browser at some point in the near future, and it's a big chunk of new code. It also has harsh performance requirements, meaning they'll have to work with parallelism, which is where C++ is particularly error-prone.
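As a minimal sketch of that last point (my own toy example, not Servo code): handing mutable state to several threads without synchronization won't compile in Rust, whereas the equivalent C++ compiles and silently races. The compiler forces the `Arc<Mutex<_>>` pattern:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared counter; a plain `&mut i32` across threads is rejected
    // at compile time, so we must use Arc (shared ownership) + Mutex
    // (exclusive access).
    let counter = Arc::new(Mutex::new(0));
    let mut handles = Vec::new();

    for _ in 0..4 {
        let counter = Arc::clone(&counter);
        handles.push(thread::spawn(move || {
            *counter.lock().unwrap() += 1;
        }));
    }

    for h in handles {
        h.join().unwrap();
    }

    // Every increment is accounted for; no data race is possible here.
    assert_eq!(*counter.lock().unwrap(), 4);
}
```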


So we continue to play whack-a-mole with the C++ codebase rather than develop in a language that makes whole classes of exploits impossible?

Firefox is not safe. It’s been routinely exploited by law enforcement and hackers alike.


Trust me, if Mozilla actually had a choice in the matter, they would opt for just having it all in Rust, too.

But there is no choice. Rewriting Firefox from scratch would take decades, and Firefox has to continue to function in the meantime. They do occasionally replace components with equivalent Rust components from Servo, and that's been a great success so far, but it's still scary as all heck to take software that millions of people depend on in their daily lives and wholesale replace its CSS engine, URL parser, or media decoder.

Besides that, it's not like Chrome/Opera, IE/Edge or Safari are bastions of security. Users can't go anywhere that's decisively safer.


It was wrong then, it remains wrong now.

Plenty of warts in C++, but at the end of the day, along with C, it is the systems language that powers 99% of the world's computing infrastructure at any level that's not a CRUD app or a throwaway script.


There's some coverage of the "why I chose not to use Rust" in the Rust community survey that happened earlier this year: https://blog.rust-lang.org/2016/06/30/State-of-Rust-Survey-2...


We're considering moving the new Rust language service over to this protocol, or possibly having it as the default protocol while supporting other protocols.

Since I've been learning it recently, maybe some points to help understand it:

It's just the protocol piece. Currently, VSCode will stand up the language server and communicate using the protocol over stdin/stdout. There's nothing to say they couldn't support HTTP servers in the future, but currently it seems lower level than that.
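For the curious, the protocol frames each JSON-RPC message over stdin/stdout with a `Content-Length` header followed by a blank line; here's a minimal sketch (the helper name `frame_message` is mine, not from any reference implementation):

```rust
// Frame a JSON-RPC body the way the language server protocol expects
// it on the wire: "Content-Length: N\r\n\r\n" followed by the body.
fn frame_message(body: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

fn main() {
    let body = r#"{"jsonrpc":"2.0","id":1,"method":"textDocument/hover","params":{}}"#;
    let framed = frame_message(body);
    // The header always reflects the byte length of the body.
    assert!(framed.starts_with(&format!("Content-Length: {}\r\n\r\n", body.len())));
    println!("{}", framed);
}
```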

They do support hovers, which you can use for seemingly anything. We're currently using them for type information and showing API docs.

I'm hoping this catches on with other editors, as once they support it, they'll get the stronger language support the servers provide: hovers, on-the-fly error checking, code navigation, and more. It's not perfect, but as a baseline, it's a pretty good starting set of features.


Website owner here: odd, what configuration of Firefox gives you that warning? When I connect with a recent OS X version, it seems to work okay.


Firefox on Windows here. Looks like it's getting the certificate for github, but throwing an error because your domain isn't listed in that certificate. https://imgur.com/a/UeXtC


Yeah, it's not set up for https. Do you see the same thing with http://www.jonathanturner.org/?


Nope, that's fine. Not sure why 0xmohit posted the HTTPS link in the first place.


Same error for Firefox on Linux.


Author here.

Hi all - thanks for the interest in the survey! The Rust community team is still working on the blog post (this link was to an early draft), and we're looking forward to posting it when it's done.


I think this survey was very poorly publicized. As somebody who only dabbles in Rust, this is the first I have heard of this survey. I am sure that others are in the same boat as I am. We would have participated, but we learned about it weeks after it closed!

Only now do I see the blog article, and a submission here [https://news.ycombinator.com/item?id=11661056] that got no traction. I don't remember seeing any sort of a notice on the main Rust web site.

The fact that many participants were excluded, albeit unintentionally, makes me really question the completeness, and hence the validity and usefulness, of this survey data.


This survey was publicized in all official channels, including This Week in Rust, the one-year-of-Rust announcement, Hacker News (we can't control traction), the subreddit, the users forum, our Twitter account (where it also got huge traction), and literally all our other venues. We got a huge number of replies.

I'm not sure how this is "poorly publicized".


Why was a link apparently not in the most obvious place, on the https://www.rust-lang.org/ web site? A small notice across the top would not have been disruptive, yet I and others would have seen it.


We put a great deal of effort into publicizing the survey, with a special focus on public spaces and piggybacking on highly-visible announcements. The website's front page sees less traffic than e.g. HN or the Rust subreddit, so it's not hard to imagine why nobody considered putting it there. It certainly wouldn't have hurt to have had it there, and I'm sure we'll consider putting it there for next year's survey, but we're quite happy with the several thousand respondents that we received.

