Hacker News
Mark Russinovich, Microsoft Critic, Is Now Building Azure (wired.com)
242 points by omnibrain on May 27, 2014 | hide | past | favorite | 113 comments



There are a handful of people worth watching at Microsoft conferences - no matter what they're doing. My list is:

* Anders Hejlsberg

* Mark Russinovich

* Scott Hanselman

They have the clout and the job security to speak their mind - and they do. They all work on neat projects and are interested in talking about the internals. I saw Mark at Build this year and his talk was about the "fails" of Azure. It was an honest breakdown of how they've had failures that have taken down customers, taken down Microsoft services, and hurt the reputation of Azure.

I've followed Mark's blog since before he became a Technical Fellow at Microsoft, and he deserved it, no doubt. Very few people could write or speak with such precision about the internals of Microsoft kit. His interviews on Channel9 are all fascinating; before Azure he worked on the NT kernel (and who knows what else), and he spoke candidly about dealing with issues scaling up the operating system. He has given interviews solely about removing global locks from the kernel, because that's his domain and he's good at it. If he reminds me of anyone else, it's Linus Torvalds and his intimate knowledge of the Linux kernel.


> Very few people could write or speak with such precision about the internals of Microsoft kit

Indeed, he "wrote the book" on it, which is apparently freely available as PDF now [1].

As someone who knows nothing about OS's, I picked up a cheap copy out of curiosity, and found it highly readable and enlightening (until it got rained on somewhere around chapter 10).

EDIT: Do you have a link to the "fails" talk? I've been using Azure for the last few months and have been extremely impressed so far.

[1] https://www.google.com/#q=windows+internals+russinovich+pdf



I don't think it's freely available. A search like that will often turn up a good PDF copy though.


I have to agree that Hanselman is one of the best speakers.


What on earth is Erik Meijer doing these days? I always used to enjoy his educational rants about Java and standard OOP.


According to his Linkedin page, he's a founder at http://www.applied-duality.com/.


Can't believe you put Scott in the same bucket as Anders and Mark. There are many more insightful and innovative people (including other Technical Fellows) at MS Research and MS itself that are worth following. Scott Hanselman's role is generally evangelizing existing technologies.


IS that my role? O_o


Don't mean to hijack the thread, but Scott, any plans to bring back This Developer's Life? It was (is?) one of the best dev podcasts out there.

ps: hope the things shared in the episode before last are now completely over. My best wishes.


heh, funny :)


I'll let shanselman's response to your first sentence stand on its own :)

Regarding other people, certainly many are worth following. It'd be remiss not to mention Erik Meijer's group or the work done by others in MSR. They make for great Channel9 videos and great papers when they publish. But when I go to or watch a MSFT conference, those 3 are the top names I look for. Fair enough?


List of Mark's videos, including that one:

http://channel9.msdn.com/Events/Speakers/Mark-Russinovich

Thanks for the reference -- always interested in watching frank talks about software failure.


> Anders Hejlsberg

Gosh I have such a nerd crush on him. Easily my favourite language designer, by far. I wish I could be him, and funnily enough am learning language design outside of work, but I highly doubt I'll ever be as good!


> “I ranted at some of the architects when I was at Microsoft. They were constraining the sorts of things you could do,” Brown told us in 2012. “Microsoft likes to do a really big up-front design, where they define the physics of a new universe. They birth this new universe, and they say: ‘This is how you do it’–instead of starting out with something simple and letting people show them how it should be done.”

I really hope this advice is taken by others, particularly the Windows team. Powershell is a golden example of this. Great idea, piping around objects instead of text, but wait, they must be .NET objects. So nothing outside the Microsoft language ecosystem can participate.

I'm hoping they revitalize the command line taking this advice, start with something simple and let others build on top of it.


Absolutely.

MS is full of architecture astronauts (along with super smart people who do useful things - but the arch people are in charge) who sit around and contemplate all these bizarre use cases that no one actually has. Then, since MSFT believes they have little/no competition in the space, they spend years developing this grand vision and dumping it on their customers, who then attempt to figure out what parts of this vision are useful and what parts are bunk.

Microsoft then looks at what happens, sees the workarounds, and rewrites the whole thing again to accommodate not only those use cases, but a whole host of other imagined ones, too.

Lather, rinse, repeat.

Open source generally doesn't have this problem, as a) some developer was scratching their own itch, so the problem is real, b) they don't have unlimited time and budget, so they have to keep it small and evolve it, and c) they rely on developers' contributions, so it has to be marketable.

Open source : Microsoft :: Lean startup : Waterfall


Ironic since Microsoft is more agile now than it's ever been. Perhaps a backhanded compliment, but it's true. Release cycles are getting shorter and shorter throughout the company. They're playing catch-up, but they're still in the game, particularly for web services and hosted applications.


SO much this

MFC was bad, but there is even worse stuff

Every time I have to check a new API (thankfully, not that often) it's a mess. They keep "re-solving" the same problems with new APIs that are a pain to use.

The .NET stuff is not too bad though (but yes, it has some warts)


PowerShell also plays nicely with COM and WMI. It's not all .NET objects you interact with. (Well, technically it is, due to PSObjects wrapping everything, but that's another matter.)

How would you do it, though? Pipe around JSON or XML? Then you just have data structures, but not objects. They can't have methods and if they could, what language would they be written in?


> How would you do it, though? Pipe around JSON or XML? Then you just have data structures, but not objects

"just data structures" is a subset of objects - they are objects with data but no methods. i.e. DTOs. They're not always useful, but to say that they can't be accommodated in a pipeline because they "aren't objects" is not quite right.
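As a sketch of that point: a pipeline of method-less DTOs can still be filtered and composed much like PowerShell objects, as long as the records carry structure. A minimal Python illustration (the process records and field names here are made up, not any real tool's output):

```python
import json

def filter_records(lines, field, value):
    """Parse JSON-lines DTOs and keep those whose `field` equals `value`,
    roughly what Where-Object does with PowerShell objects."""
    out = []
    for line in lines:
        record = json.loads(line)  # a DTO: data, no methods
        if record.get(field) == value:
            out.append(record)
    return out

# Any language that can emit JSON can participate in such a pipeline.
procs = [
    '{"name": "svchost", "status": "running"}',
    '{"name": "notepad", "status": "stopped"}',
]
running = filter_records(procs, "status", "running")
print([p["name"] for p in running])  # → ['svchost']
```

The records stay inert data end to end; the behavior lives in the filters, which is exactly the trade-off the parent comment describes.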


Granted, but a lot of niceness in PowerShell comes from the fact that you're not just piping around data. As in, you don't need special commands to handle certain sorts of output (though there are things like Stop-Service, Stop-Process, etc.) because you can just as well call the appropriate methods.


I don't think we can design this in HN comments... all I'm saying is that it should have been a Day 1 requirement that it is low-level enough to work with any modern widely-used programming language, on a first-class basis.

That's what the comment I quoted is saying: Microsoft likes to do big up-front designs that restrict their platform to uses that they have already thought about.


I'm having trouble understanding what is needed. The 'shell' in PowerShell is an interface to an operating system interface, .NET. While Microsoft's implementations of C#, &rest can call PowerShell scripts and cmdlets, it's a bit kludgey compared to calling the API directly. Other than effort, there is nothing preventing creation of a library with equivalent function in any language, and the effort is what killed IronRuby and IronPython as continuously developed projects...and they didn't use powershell.

Can you provide an example?


You can write cmdlets in C# and VB, not just call Powershell cmdlets/scripts. You should be able to write cmdlets in Ruby, or Node.js or Brainfuck.


OK, I get what you want. My understanding is that the reason you can write a cmdlet in C# is because C# provides a facility to return .NET objects [more or less] and that's only in so far as it interops with the .NET API.

Ruby implementations will tend to return a byte stream rather than a value against the .NET API. Even if Microsoft provided another API on top of the .NET API, the same situation would exist because Ruby implementations usually don't write to that API by default. It's turtles all the way down on the .NET side.

Programming .NET in Ruby requires writing to the .NET interface from the Ruby program. If one wants to program to the .NET interface through Powershell from Ruby, a Ruby program can always output text that can be interpreted as a Powershell script. One way or another, the responsibility is on the program to write to an API.

What would be the output of a Brainfuck program that should operate as a PowerShell cmdlet? Would it ever be more than text which can be interpreted as a PowerShell command or script?


You're entirely missing my point. Powershell shouldn't have been written to require .NET. Full stop. That's a deal-breaker and highlights the quote I originally referenced. Stop building high-level stuff on top of other high-level stuff where its use is restricted to only a small subset of potential developers.


I think you can write cmdlets in anything that compiles down to IL. So I am guessing you could write them in IronRuby or IronPython. Might be mistaken about that though.

Also, you can use the '&' operator to call anything executable, so you could take an exe and call it from powershell without issue.


It's still a shell. No need to use

    & 'externalProgram.exe'
when you can just use

    externalProgram.exe
just as well. I see a lot of weird things on Stack Overflow with & or even Invoke-Expression, especially when it involves passing parameters to programs and the lack of understanding how parameter passing works. The results usually aren't pretty, then.


Pretty sure you need & to call a variable, like & $git if you've a var pointing to git instead of having it in the path.


Powershell was an internal grass-roots product, I think. I read it somewhere so it may or may not be true.


Snover specifically set out to design "the next generation platform for administrative automation" in 2002. See the Monad Manifesto:

http://www.jsnover.com/Docs/MonadManifesto.pdf

And Snover's description of the process almost a decade later:

http://www.jsnover.com/blog/2011/10/01/monad-manifesto/


Was there a connection to the math/CS concept of "monad", or did Snover intentionally pick a name that would guarantee everyone would be afraid of it?

From http://www.jsnover.com/blog/2011/10/01/monad-manifesto/ :

> I had had a number of conversations with our team in India where everyone shook their heads and smiled and seemed to get it but then the code clearly demonstrated that they had absolutely no clue what I was saying but weren’t telling me that.

> We tried a number of times to get very precise in our documentation but time and time again, it was clear that they just weren’t getting the concept

Wow, Microsoft Architecture Astronaut in a nutshell. He handwaves an idea, and then complains when someone actually writes some code and it doesn't match his imagination.


It's explained in Windows PowerShell in Action that is written by Bruce Payette, one of the PowerShell designers:

"before the first public release in 2006, the codename for this project was Monad. The name Monad comes from The Monadology by Gottfried Wilhelm Leibniz, one of the inventors of calculus. Here’s how Leibniz defined the Monad:

The Monad, of which we shall here speak, is nothing but a simple substance, which enters into compounds. By “simple” is meant “without parts.” —From The Monadology by Gottfried Wilhelm Leibniz (translated by Robert Latta)

In The Monadology, Leibniz described a world of irreducible components from which all things could be composed. This captures the spirit of the project: to create a toolkit of simple pieces that you compose to create complex solutions."


> Pipe around JSON or XML? Then you just have data structures, but not objects. They can't have methods and if they could, what language would they be written in?

You could pipe around DTOs like that, and as long as they had a structure that attached typing information to them, you could use something like intents (similar to those in Android or Web Intents) to attach "methods" to them. The executables implementing behavior could be written in any language that the platform could run.
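A toy sketch of that intent idea, in Python with hypothetical type names and handlers: behavior is registered against the DTO's declared type rather than living on the object itself, so the handlers could in principle be separate executables written in any language.

```python
# A toy intent registry: behavior lives outside the data, keyed by the
# DTO's declared type. Handlers here are plain functions, but they could
# just as well be paths to executables implemented in any language.
HANDLERS = {}

def register(dto_type):
    """Decorator that attaches a handler to a DTO type name."""
    def wrap(fn):
        HANDLERS.setdefault(dto_type, []).append(fn)
        return fn
    return wrap

@register("service")
def stop_service(dto):
    return f"stopping service {dto['name']}"

def dispatch(dto, action_index=0):
    """Look up the handlers registered for this DTO's type and invoke one."""
    handlers = HANDLERS.get(dto["type"], [])
    if not handlers:
        raise LookupError(f"no handler for type {dto['type']!r}")
    return handlers[action_index](dto)

print(dispatch({"type": "service", "name": "spooler"}))
# → stopping service spooler
```

The data stays a plain structure; the typing information is what connects it to behavior, which is roughly how Android intents resolve an action to an activity.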


Well, if it was me...

1. Replace PowerShell with a .Net interpreter so you can "just run" a .cs_exe file or whatever. No one wants to learn/deal with yet another throwaway proprietary scripting language.

2. Replace all the PowerShell cmdlet garbage with .Net libraries. You'd also be able to then trivially automate any scripting from existing .Net programs/tests.

3. Have a JavaScript -> .Net compiler so people don't have to deal with C# or VB.Net if they don't want to.

Scripts look something like (C# in this case):

  using Microsoft.Scripting;
  
  void Main()
  {
    SQLServer.Configure(<SOME JSON>);
    var result = IIS.Start("MyWebPage");
    if (!Installed(<some windows package>))
    {
        File.Install();
    }
    if (!UserExists(server, user))
    {
        CreateUser(user);
    }
  }
You can't do all of this today (at least not easily or without installing many extraneous or 3rd-party libs which create deployment headaches).

EDIT: for clarity.


Microsoft has had a JavaScript compiler for a while (JScript.NET). I don't think it gets much use. They also had a Java compiler.

PowerShell is far better for scripting than C#. C# is terribly verbose and lacks scripting-ish stuff like string interpolation. I'm not sure it even had dynamic when PowerShell was made.

It is pretty dumb that C# still hasn't shipped a proper REPL or script runtime though.


This is an amazing REPL, recently released:

http://channel9.msdn.com/coding4fun/blog/REPL-for-the-masses...


JScript.net has never been a first-class citizen, and Microsoft never really made an effort to make it viable in all of the domains C# is. JScript feels like an attempt to embrace/extend JavaScript that never took hold.

The problem is that rather than adding libraries to improve .Net for scripting-ish stuff like string interpolation, you have to deal with an entirely different platform that shares little with the command line, and little with .Net.


> It is pretty dumb that C# still hasn't shipped a proper REPL or script runtime though.

Workin' on it.


> Great idea, piping around objects instead of text, but wait, they must be .NET objects. So the entirety of the language community cannot participate other than Microsoft languages.

Also just a ton of missing UI / tooling / integration work meant that most people heard about it, tried it, didn't see what the hype was about and probably have never looked at it again. This happens a lot where people who work on big projects just assume everyone else will look past the initial experience and grasp the cool vision.


"Also just a ton of missing UI / tooling / integration work meant that most people heard about it, tried it, didn't see what the hype was about"

I think this depends on what component/system you're trying to use. Recently I found myself using PowerShell (mostly via copy/paste) for what I consider simple, one-time system administration duties. In the past I don't think Microsoft would have shipped a product without a first-class GUI to manage it (ignoring registry editing for "rare" tweaks). Now, it seems the prevalence of PowerShell allows some groups inside Microsoft to write a bunch of code without a functional UI, leaving users to grub through what is effectively programming documentation to perform simple things like enabling/disabling a core feature of the product. I would expect, at an architectural level, that Microsoft would ensure that the major functionality in their products is controllable both from the GUI and from the command line. That doesn't appear to be the case, any more than Windows 8 can be used entirely from the Metro/Modern interface.

Frankly, it's yet another erosion of what I considered to be Microsoft's strong points against Linux.


It's somewhat ironic in that one of Microsoft's aims has been to solve the problem of having some features that were only GUI-accessible, meaning they were difficult/impossible to automate.

So PSh gets treated as the first-class interface with the GUI being a frontend to underlying PSh functionality. I've gotten away from Windows sysadmin work, but from what you're describing it sounds like they've swung the pendulum a bit too far in the other direction and now some of the functionality isn't easily accessible from the GUI.

Having both would obviously be ideal, but if you were to put a gun to my head and make me choose, I'd take the current situation of a lacking GUI every time, personally. All the times I'd run into situations where the answer was essentially "nope, can't automate that" in the past were beyond infuriating.


"All the times I'd run into situations where the answer was essentially "nope, can't automate that" in the past were beyond infuriating."

BTW: In the past (because I don't do a lot of this anymore either), GUI operations were automatable with one of the dozens of clones of the windows macro recorder. The ones I remember using had their own little scripting languages (which could be automatically generated/recorded) with simple variable substitution and control flows. The resulting scripts were callable from the command line/batch files. Those products overwhelmingly relied on the ability to walk the GUI resource trees and find things by id/name/some attribute and then send messages like clicking or typing to them. How well applications like this still work in a metro/modern environment I don't know. Although a google search indicates a number that are still available.

BTW: For the linux/X users out there, check out http://www.gnu.org/software/xnee/


PowerShell's specification is publicly available. Has been for several years.

http://www.powershellmagazine.com/tag/language-specification...


You don't even need the specification to implement cmdlets. Heck, you could extend PowerShell exclusively in PowerShell code (though performance makes that somewhat undesirable most of the time).


The downsides to powershell are mostly around the fact that along with some good innovation they also threw out a lot of other standard conventions. Which makes it tedious to integrate powershell with other command line tools.

Also, another thing I have a problem with is the startup time. For a lot of quick hacks, I can start up, execute, and close down the terminal (Cygwin, let's say) in the time I am still waiting for PowerShell to initialize. Not a huge problem, but for me it holds me back from using PowerShell somewhat.


As for command-line tools, the things that come to mind right now (that can be tedious) are

1. Result of a command-line tool is its output, not its exit code, so you have to use $LastExitCode when treating command-line invocations as boolean values

2. Stream behaviour other than stdout gets weird when redirecting one stream into another (e.g. 2>&1 causes every line that's output on stderr to be thrown as an exception).

3. Encodings are a mess, sadly. They mangle Unicode output from external programs. This was a compatibility decision, but it's unfortunate, especially because the appropriate .NET APIs are a bit clunky to use.

1 is just different from bash, 2 and 3 can be dealbreakers depending on what you do, and the workaround is the same, but annoying, admittedly.
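Point 1 is easy to see from outside PowerShell too: an external program's exit status and its stdout are separate channels, and a shell has to pick which one the pipeline carries. A small Python demonstration of the distinction (nothing PowerShell-specific assumed):

```python
import subprocess
import sys

# Run a child process that prints something and exits with status 3.
# The "result" of an external command is really two things: its output
# stream and its exit code. PowerShell's pipeline carries the former,
# while the latter ends up in $LastExitCode.
result = subprocess.run(
    [sys.executable, "-c", "print('hello'); raise SystemExit(3)"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # → hello  (the output)
print(result.returncode)      # → 3      (the exit status)
```

A bash `if` keys off the second channel; a PowerShell pipeline expression evaluates to the first, hence the need to consult $LastExitCode explicitly.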

The one-second startup time has been awful, I know. Although for me at least since Windows 8 it has been instantaneous, making this a non-issue. (Pash is worse at about two seconds currently, though.) But from the time when Powershell was slow to start I still have a habit of just running cmd for things that can be done in cmd.


Why MS never developed a truly powerful Unix-compatible terminal is beyond me (not Cygwin). Developer buy-in and kudos should be incentive enough.

The feeling of being restricted on windows as a developer always gets me running back to linux sharpish.


(I'm assuming you don't mean a terminal emulator, but a command-line shell when you say »unix compatible terminal«.)

The team that developed PowerShell actually wanted to do just that. They didn't, because Unix was built mostly around text files, so tools that work with text files are a good way to manage such a system. Windows, however, is not, so Unix tools are a particularly poor way of managing Windows.

That's also why PowerShell treats file system, registry, WMI, etc. as first-class constructs: it has to in order to be effective.


> That's also why PowerShell treats file system, registry, WMI, etc. as first-class constructs, because it has to to be effective.

Also- it exposes them as filesystems & files.


Powershell is pretty powerful. You can do basically the same things as in bash.

However because of its youth, some equivalent tools are missing.


"Basically the same" and yet absolutely not the same.

The world did not need PowerShell.


The world didn't, but Windows admins did. It's allowed a lot of people to migrate away from legacy VBScript. You don't want to program in VBScript, trust me.


PowerShell is cool, but it requires learning PowerShell. Why should there be an entirely separate learning process to be able to work normally on just one OS?


Well, because Windows is not Unix. It's not even "not Unix" the way Linux isn't Unix. Its heritage is mostly VMS and CP/M, if you want to compare it to anything. Trying to treat Windows as a Unix is a very leaky abstraction that is going to cause many more problems than it solves.


Why do you think all operating systems should follow one single architecture? Why should there be an entirely separate learning process from C to learn Lisp or Haskell?


What would it take for a homebrew language to interact with the OS as a CLR language?



I feel like this article mischaracterizes sysinternals as something of a Microsoft critic and watchdog which just wasn't the case. They were more like a valued and vital member of the Microsoft ecosystem.

If you did development on a Windows machine in the last 20 years, you probably downloaded a sysinternals tool at one point. Many Microsoft KB articles pointed to their tools and articles and when Microsoft announced their acquisition, most people were like "That makes sense".


I completely agree, but I can at least imagine where the critic characterization comes from. Mark's articles and the Sysinternals tools gave us wonderful insight into many parts of Windows. They showed the good things, but oftentimes they also pointed out the flaws.

I think Mark is just honest about the technical aspects of Windows, and thankfully his attitude has not changed now that Microsoft pays his checks.

One example of this is his videos about Windows memory management (from 2011, hosted on MSDN)[1], which, besides being a detailed technical explanation, don't skip over the shortcomings of Windows' memory subsystem. Sometimes his statements seem harsh, but I think it's more sincerity, and he has been as sincere about his own tools as well[2].

[1] http://channel9.msdn.com/Events/TechEd/NorthAmerica/2011/WCL...

[2] http://blogs.technet.com/b/markrussinovich/archive/2009/11/0...


I too had always felt that sysinternals was a champion of Microsoft and Windows. Perhaps there were some politics going on that only Microsoft employees would know?

Every system has pros and cons, and learning how to work with those constraints is how you become a master. From what I understood, Mark is one of a small number of people who have an extremely deep understanding of Windows' inner workings.


Exactly!

Sysinternals gave lectures about Windows internals to Microsoft employees.


Mark Russinovich is the author of the Sysinternals tools, which have long been essential utilities for troubleshooting Microsoft environments.

His blog has some pretty good reads for anyone that wants an in-depth view of Microsoft products:

http://blogs.technet.com/b/markrussinovich/


Every time I set up a new dev system, literally the 2nd thing after installing Windows was installing all the Sysinternals tools. I've since moved on to OS X as my main dev system, but even my Windows VMs all have the Sysinternals tools installed.


For background, this is Mark Russinovich's original blog post from 2006 about Microsoft acquisition of Sysinternals/Winternals.

http://blogs.technet.com/b/markrussinovich/archive/2006/07/1...

"I’m joining Microsoft as a technical fellow in the Platform and Services Division, which is the division that includes the Core Operating Systems Division, Windows Client and Windows Live, and Windows Server and Tools. I’ll therefore be working on challenging projects that span the entire Windows product line and directly influence subsequent generations of the most important operating system on the planet. From security to virtualization to performance to a more manageable application model, there’s no end of interesting areas to explore and innovate."


If I remember correctly, Mark Russinovich has been an Azure architect since at least 2011, and I think he had a leading role even back then. He has been with Microsoft since Sysinternals was bought in 2006.

A little off topic, but if you ever wondered what all those numbers shown by Task Manager really mean, Mark did a video[1] that really explains it well.

[1] http://channel9.msdn.com/Events/TechEd/NorthAmerica/2011/WCL...


I recommend watching "The Case of the Unexplained". Mark shows practical examples of using the Sysinternals tools; it really gives a glimpse of how skilled he is.

http://technet.microsoft.com/en-us/sysinternals/bb963887.asp...


It's well worth the time to go and watch these. It teaches you all kinds of things about how Windows works.


I really like him for his contribution towards creating the Sysinternals set of utilities. I'm surprised that Windows doesn't include them by default nowadays.


The sysinternals tools are always available on a webdav-enabled share on live.sysinternals.com. Just run pushd \\live.sysinternals.com from a command prompt, and then run any of the sysinternals tools without any kind of installation - or, I guess, security or validation of the executable. YMMV.


Validation is provided by Authenticode signatures embedded in the EXEs themselves.


What is interesting is that the kernel-mode drivers buried inside procexp are still signed with Sysinternals certificates, not MS ones, sometimes with no timestamp!


I didn't know pushd worked like that in Windows. Thanks!


It's about client support. People will break their systems or call MS support for info about what these tools do, etc. Plus they don't concern a lot of people, and proper integration of them would undermine the general user experience.


> Cloud computing was invented by Amazon

That's a bit of a journalistic stretch.


I don't know....

The S3 stuff was kicked off in the late 90s and everyone just kinda-sorta said why? But it seems to have been the first modern cloud-type service.

Now, I personally hate the term "Cloud" because it's pretty meaningless, and you can look back throughout the entire history of computing and find examples of distributed, remote storage, compute power etc etc.

But if we're going to call anything "Cloud" in the sense we try to use it now, then I reckon Amazon were first to market by a mile.


“If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry.”

— John McCarthy, speaking at the MIT Centennial in 1961

(As quoted at http://en.wikipedia.org/wiki/Utility_computing)


Although that was no doubt in part due to the extreme expense of computers then. He said that 4 years before Moore published his famous Moore's Law, which many refused to believe for decades, in a period when computers were transitioning from vacuum tubes to transistors, which cost many hundreds of dollars when adjusted for inflation, and probably still did when they were shipped with serial numbers. And of course memory was still made by hand out of magnetic cores.

Multics was the result of this vision of McCarthy and others at MIT, hence its strong focus on security.


Computer "service bureaus" offered outsourced computing services in the 1960's and 70's. Do some research on "Tymshare", for example, and you'll see that "the Cloud" was alive and well back in the 1960's.


> The S3 stuff was kicked off in the late 90s

What? As far as I know, S3 launched in 2006.


I can find reference to AWS based on EC2 going back as far as 2002 - http://en.wikipedia.org/wiki/Amazon.com#Computing_services

But I'll happily admit my memory might be faulty! And yes, that's not S3 as it's known now.


Late 90s?

Wikipedia: Amazon launched S3, its first publicly available web service, in the United States in March 2006[2] and in Europe in November 2007.[3]


I can find reference to AWS based on EC2 going back as far as 2002 - http://en.wikipedia.org/wiki/Amazon.com#Computing_services

But I'll happily admit my memory might be faulty!


The first EC2 beta launched in 2006, not 2002. Here's the official announcement from the AWS blog:

http://aws.typepad.com/aws/2006/08/amazon_ec2_beta.html

The Wikipedia article is organized confusingly. It talks about the 2002 launch of "Amazon Web Services" which at the time was just an API for the Amazon retail catalog. Then for some reason it links to a press release about the 2009 release of "EC2 for Windows", then goes back to talk about earlier launches like MechTurk (2005), S3 (2006), and EC2 (2006). I'll see if I can clean up this mess... [Update: cleaned up the article and added some citations.]

Note: I worked at Amazon.com from 2005 to 2008, the period when the company launched its first cloud computing services.


Softlayer was ahead of Amazon by about 6 months. They just didn't use the term cloud. Or in the founder's words: "We knew it was awesome, but we didn't have a name for it".

(Disclaimer: pre-merger employee)


The prior strategy (meme) was called "grid computing".

AWS' strategy of virtualization honors interfaces (cut points) between the major OS and network stack functional areas.

Grid computing was about portable code running on compute farms, more like a time share. Like an app server.

I shamefully admit I was slow to grok AWS. I had been using VMware for testing and cohabitation and such for years. I just didn't see how AWS was any different, or better. I really missed the boat on this one.


This was my first take-away ... at least Wired linked to an article they wrote backing it up (even if you still don't agree): http://www.wired.com/2012/11/amazon-3/


I definitely found Sysinternals useful, and I was impressed when I saw Azure with easy to run Linux images way back when. I didn't realize they are becoming a better player. Props to them for hiring the guy.


> Props to them for hiring the guy.

They hired him (effectively) quite some years ago when they bought Sysinternals (2007 or thereabouts, IIRC).

Props to them for neither neutering him nor making him leave though, either of which is common with a buy-out that includes a critic. A good critic can be a powerful resource for reflection and positive change.


I could use a summary.

(New idea for facebook: allow users to submit abstracts/summaries of articles, and show me the one written by someone least removed from me by friendships.)


Microsoft hired a security researcher and developer named Mark Russinovich, who is famous for developing a powerful set of tools called Sysinternals, and for finding issues with Windows and publishing them. He is now considered a "loose cannon" inside Microsoft, breaking the company of its monolithic ways in favor of a customer-focused approach and more open-mindedness (citing Linux availability on Azure as a demonstration).


Politely asking for one on HN ought to do the trick, I hope.

(I'd suggest not downmodding this, folks. Politely asking for a summary is nice, as is providing one, and let's not discourage it. It's not the same as just squirting out a "TL;DR?".)


Yeah, I don't really get why folks have a problem with asking for a summary. As gedrap said, most writing on sites like Wired is 90% fluff, and I don't see why we should all be reading through that to find out whether the core 10% is worthwhile. There's a reason journal articles have abstracts.

So maybe I'll just start asking for an "abstract" in the future, since that sounds fancier.


Yup. Whenever I see a link to something like Wired (i.e. 90% of the text more or less meaningless, 10% real information), I just go straight to the HN comments. Doesn't feel like I'd be losing anything.



Without an attractive, desirable suite of cloud services, their endpoint devices and OSs are going nowhere.

WITH world-class cloud services, Microsoft could derive revenue from Apple and Android users.

It appears that Microsoft's biggest problem was egos. People wanted to "leave their mark" rather than play to strengths and achieve the possible. Microsoft could readily be 3x bigger if they focused on what they are good at and on the customers most likely to welcome Microsoft products.


>People wanted to "leave their mark"

TBF this had a lot to do with the stack-ranking MS was famous for. When you've got to justify your job every year making a name for yourself is mandatory.


So what is their new method of measuring performance?


I think what benjaminpv is getting at is that a system like stacked ranking drives people within an organization to build fiefdoms of their own creation, where they are permanently on top. That way you don't get reorged into some group with established cliques and you end up on the bottom of the stack.


> So he’s now working to merge the platform service and the infrastructure service, giving people the power to run any software while still ensuring this software operates in an automatic way. “We want to blend the two worlds,” he says.

That sounds very much like App Engine Managed VMs.

https://developers.google.com/cloud/managed-vms


I don't know how closely Nadella and Russinovich are working, but all being well this could be the start of a new collaboration and wave of dominance not unlike the Jobs-Ive duo. For new product development, that is, not for aesthetic design or hardware brilliance.

I had been following Mark for quite some time in the early 2000s and was a big fan of his tools, but I hadn't seen him speak or anything like that. He then did a talk on the work he was doing for the next Microsoft OS, which at the time was Windows 7. His talk was easily the best that day. He was confident, knew his subject, and was passionate about what he was doing. No one else that day was as authentic and enthusiastic as he was. I'm not sure how many people in that particular audience recognised his drive and talent, though.

I'm glad to see him written about, and if he's a big part of driving Microsoft into the future, then Microsoft could very well become exciting and relevant again.


Can anyone please suggest a more neutral, less baity title?

Edit: No takers? Ok, I'll try to come up with one.


As someone who worked at Microsoft for years (I left a few years back): everybody at Microsoft criticizes Microsoft; there are no red lines to cross. The bad things usually come from the fanboys of Microsoft inside Microsoft, especially when one of them is guarding a pretty strategic area. The good thing is that these people have been leaving (or being asked to leave) for quite some time now, the likes of Steve Ballmer and Sinofsky, while other super geeks like Satya Nadella, Scott Guthrie, and Qi Lu are taking over.


I find it strange that Wired would spend several paragraphs describing standard software licensing practices as a "fraud".


Interested to read the comments about IaaS and revisiting PaaS. I expect PaaS services to be the differentiator that makes people ship their new open source .NET vNext stack to Azure / Microsoft servers as opposed to alternative clouds / hosts. Seamless scaling of web servers, persistence stores, monitoring, analytics, etc.


He's a very smart man, a true hacker in the technical sense. His set of free tools revealed amazing details about Windows internals. Those helped me tremendously back in the days when I worked on Windows kernel internals as an outsider. Thanks to Mark.


It was interesting to find out who the people behind Sysinternals were.

The article, though, has an embarrassing number of typos...


He also auctions rides in his Ferrari 458 for charity!


His Sysinternals stuff was pretty solid.


procmon/procexp are my power tools. Thank you, Mr. Russinovich.


Cleverest?


Well, those Sysinternals tools were open source before MSFT acquired them.

If you want to see the magic before MSFT forced it behind the wall, here you go.

https://code.google.com/p/akeo/downloads/detail?name=sysinte...

And fuck you, Microsoft, and long live Russinovich. I could not have done without those tools in sysadmin work, open or closed.


This reminds me of how michkap had access to Jet source code and was able to get permission to release the TSI utilities.



