The binary blob in question is hotword-x86-64.nexe with sha256sum 8530e7b11122c4bd7568856ac6e93f886bd34839bd91e79e28e8370ee8421d5a.
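For anyone who wants to compare a locally downloaded copy against that digest, here is a short Python sketch (the filename is the one quoted above; nothing in this snippet is specific to Chromium):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

EXPECTED = "8530e7b11122c4bd7568856ac6e93f886bd34839bd91e79e28e8370ee8421d5a"

# A copy that hashes differently is not the blob being discussed here:
# sha256_of("hotword-x86-64.nexe") == EXPECTED
```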
This is labelled as a "hotword" implementation, i.e., something that will monitor the microphone until someone says "OK Google", then start listening to and transmitting the following words for a search. However, there is no guarantee that it does what it says it does; in particular, it might instead accept instructions to transmit audio from particular parties that Google wants to spy on.
I understand there are likely to be many uninvolved engineers within Google who have access to the source code. It would do a lot to restore trust if a few such engineers could take a look through the source code and find out whether it has a remote trigger, and whether the source code in Google's repo matches the file that's being distributed.
This is not the first time Google has taken an open-source project and added closed-source components to it. They did the same thing to Android, twice: once with the "Play Service Framework", which is a collection of APIs added to Android but theoretically independent of it, and again with Google Glass, which ran an entirely closed-source fork. In the case of Glass, I did some reverse-engineering and found that it would take all photos captured with Glass, and all text messages stored on a paired phone, and transmit them to Google, with no feasible way to stop it even with root. This was not documented, and I don't think this behavior was well understood even within Google.
> I understand there are likely to be many uninvolved engineers within Google who have access to the source code. It would do a lot to restore trust if a few such engineers could take a look through the source code and find out whether it has a remote trigger, and whether the source code in Google's repo matches the file that's being distributed.
That would prove nothing, since there'd be no evidence to back up such a statement, and the statement would originate from someone on Google's payroll to begin with.
If you're really that paranoid about closed source components within Chromium then the only recourse is not to use Chromium. Thankfully the alternatives are plentiful.
edit: s/Google Chrome/Chromium/g
> This is not the first time Google has taken an open-source project and added closed-source components to it. They did the same thing to Android
Android is Google's project to begin with, and the closed components which are part of the Play Service Framework have been a part of Android since its initial release.
> In the case of Glass, I did some reverse-engineering and found that it would take all photos captured with Glass, and all text messages stored on a paired phone, and transmit them to Google, with no feasible way to stop it even with root. This was not documented, and I don't think this behavior was well understood even within Google.
Did you do a write up of this study? I'd be interested to read it :)
It wasn't always; it was an independent company that was acquired by Google in the mid 2000s.[1]
> and the closed components which are part of the Play Service Framework have been a part of Android since its initial release.
No, Google Play Services was first released in 2012, whereas Google's first Android release was in 2008[2], so it most certainly has not been a part of Android from the beginning.
> It wasn't always; it was an independent company that was acquired by Google in the mid 2000s.[1]
Fair point, but AFAIK Android was never released as an open source project until it was Google-owned.
> No, Google Play Services was first released in 2012, whereas Google's first Android release was in 2008[2], so it most certainly has not been a part of Android from the beginning.
You were emphasising the wrong part of my sentence. Many proprietary components that are now part of Google Play Services have existed separately for longer than the "Play" brand has: https://en.wikipedia.org/wiki/Google_Mobile_Services
> Fair point, but AFAIK Android was never released as an open source project until it was Google owned.
Since we're both being pedantic here, I never said that it was. :) I believe the GP did though (the person you were initially replying to, that is).
> Many proprietary components that are now part of Google Play Services have existed separately for longer than the "Play" brand has
You're right, and that's one of the things that bothers me about Android's reputation for being an "open" or "open source friendly" OS. Yes, AOSP is open source software (if you leave out the binary blobs necessary for the radios and GPU to work), but even plain vanilla Android as shipped by Google is far from open source. Google has steadily been moving towards a closed/locked down model in many of their projects.
> If you're really that paranoid about closed source components within Google Chrome then the only recourse is not to use Google Chrome. Thankfully the alternatives are plentiful.
The bug in question was spotted in Chromium, not Google Chrome. That would leave Firefox as the only cross-platform and sufficiently up-to-date alternative. Not exactly "plentiful".
There are other WebKit browsers without Chromium's extended libraries, such as Surf and Web (Epiphany). Konqueror was still KHTML last time I checked, but there are WebKit ports as well if that's really what you want. Then there's Opera, which on some platforms (e.g. Linux) is still using its older renderer rather than Blink (see footnote); and Otter as well. There are quite a few Firefox forks too (e.g. Pale Moon)... and if all else fails, you can always run lynx or elinks :p
So there are definitely quite a few alternatives (the last two were obviously a joke though). Granted many are not as feature rich, but they'll still be HTML5 compliant.
Thank you for the correction on the Google Chrome/Chromium point though. Updated my post to reflect that.
Footnote: has anyone checked if this is a Blink issue or just Chromium? Because Opera, Vivaldi and other browsers use Blink but likely wouldn't have hotword. So that would be even more alternatives available.
> Granted many are not as feature rich, but they'll still be HTML5 compliant.
This is a meaningless statement. HTML5 is a moving target. And on top of that, webpage design has deteriorated to the levels we saw around 2000 again: to be usable, your browser has to mirror the most popular engines well enough that sites work.
Most of the browsers I listed as examples use popular engines (Blink, WebKit, Gecko).
And if you want to get pedantic about HTML5 being a moving target, technically it's not. People often lump the other web front-end components (CSS, SVG, ECMAScript, etc.) under the HTML5 heading; those components obviously have their own specification versions. Furthermore, a lot of the tertiary technologies that are a moving target are either experimental features / proposed drafts (i.e. not part of the final specification) or browser-specific extensions. Most sites tend to avoid using these without fallback code for non-supporting browsers (demo sites being the obvious exception).
> There are other WebKit browsers without Chromium's extended libraries, such as Surf and Web (Epiphany).
What about Midori (LGPL 2.1)? For some reason it's not available for jessie, but 0.4.3 is available for wheezy, stretch, and sid: https://packages.debian.org/stretch/midori
Firefox does auto-download an OpenH264 binary on systems without a supported H.264 decoder library (if this feature is enabled, which it isn't currently in Debian's iceweasel packages). But note that OpenH264 is free software available under the BSD license:
Firefox downloads binaries from Cisco because Cisco can legally distribute this software in binary form in countries where H.264 patents apply, while Mozilla can't do so directly.
There was also a plan discussed to make it easy to automatically verify that the binary corresponds with the published source code, but as far as I know that work hasn't been done yet:
> That would prove nothing, since there'd be no evidence to back up such a statement, and the statement would originate from someone on Google's payroll to begin with.
Are you new or unfamiliar with free software? Open source software and web browsers of all things shouldn't have any need for secret code.
It's highly suggestive.
> If you're really that paranoid about closed source components within Google Chrome then the only recourse is not to use Google Chrome. Thankfully the alternatives are plentiful.
Sure, why don't we just leave our countries to go live somewhere else when things don't go our way? Why not just give up?
> Are you new or unfamiliar with free software? Open source software and web browsers of all things shouldn't have any need for secret code. It's highly suggestive.
Indeed. But that doesn't change my statement.
> Sure, why don't we just leave our countries to go live somewhere else when things don't go our way? Why not just give up?
Because you'd still have the same browser choices if you did move :p
In all seriousness though, what are your options:
1. fork Chromium and remove the closed components
2. use another browser
3. moan on the internet
You've got 3 nailed, but that doesn't seem to be helping the situation. So maybe you should start a fork instead? Or perhaps go with my suggestion of boycotting Chromium, since it actually turns out to be the easiest practical solution despite your hyperbolic remark.
> An obvious one, getting an explanation from the vendor / upstream before we proceed to any decision.
I'd already addressed that. In fact you quoted it when you posted your condescending reply. An explanation is worthless if the code cannot be reviewed. Such a feature should either be opt-in and/or open source.
I couldn't care less what explanation Google give, I just don't want this built into my browser.
> It's normal to have a collection of patches in the package file / port.
It is, but then you're relying on your package maintainers to patch Chromium (or compile the software yourself). Thus personally I think it's easier just to use another browser which doesn't need to be patched to remove an unwanted feature.
Hi, I'm an engineer from Google responsible for the hotword module.
I understand the concern that a proprietary component may be performing unknown instructions, and indeed Chromium does download hotword-x86-64.nexe on startup, but it has been carefully designed as an opt-in feature. If you do not turn on "Enable "Ok Google" to start a voice search" (in chrome://settings), Chromium will not run the plugin. You do not need to trust Google engineers to tell you this; the open source Chromium code has the logic to decide whether to run the plugin.
I have posted a detailed response (including the link to the place in the Chromium source code where the module gets run) on our bug tracker at http://crbug.com/500922#c6.
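To illustrate the claim being made here with a toy model (this is not Chromium code; every name below is made up): the settings check lives on the open source side, and the downloaded module only ever runs behind it.

```python
class HotwordGate:
    """Toy model of the opt-in gate described above: the (auditable)
    browser code decides whether the downloaded module ever runs."""

    def __init__(self, settings):
        self.settings = settings
        self.plugin_started = False  # stand-in for the NaCl module's state

    def maybe_start_hotword(self):
        # The module may already be on disk, but it is only executed
        # if the user has turned the voice-search setting on.
        if self.settings.get("hotword_search_enabled", False):
            self.plugin_started = True
        return self.plugin_started

# With the setting off, the blob sits on disk but is never run:
HotwordGate({"hotword_search_enabled": False}).maybe_start_hotword()  # False
```

The point of the sketch is only that the gating condition is in reviewable code, even though the gated module is not.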
Why do you need to unconditionally download a binary blob? Can't you just download it when "Enable "Ok Google" to start a voice search" is turned on? Besides, it's not really open source anymore, even if you only delay downloading the non-open parts until runtime.
We probably could delay it until the setting is enabled. I wasn't on the team when that decision was made, but I would imagine it's because a) latency (we want the feature to be enabled right away when you turn it on), and b) just the way it happened and nobody really thought much about it at the time.
The fact is that an end user should not care if software downloads a "binary blob" without running it. This is functionally equivalent to downloading anything from the Internet, a JPG file for example. Chromium downloads a bunch of things on startup, and nobody seems to mind. Just because hotword.nexe happens to be an executable blob doesn't really make a difference.
"The fact is that an end user should not care if software downloads a "binary blob" without running it."
Is that the official position of Google? Reminiscent of when Thomas Hesse from Sony stated: "Most people, I think, don't even know what a rootkit is, so why should they care about it?"
Functionally equivalent to a .jpg how? You mean by the fact that the .jpg is an executable set of instructions telling the system to eavesdrop on me?
Great point. Sure, I get it. By that logic, sending me a mail bomb (and of course, not activating it, because I mean who would do such a thing after painstakingly creating it?) is the same as sending me flowers.
So a picture isn't a picture until you look at it?
Whether or not people should care, some people do care. If Debian has to edit what you distribute to remove proprietary parts of it, you're probably not distributing something that is open source (which means 100% open source).
So, if the article was titled "Chromium downloads and activates closed-source eavesdropping software on all its devices, bypassing any OS alerts", would that be too wordy? It's meant to be a little tongue-in-cheek, admittedly, but it seems to me that's exactly what they did.
Isn't Chromium behind the enterprise Chromebox/Chromebook stuff too? And does this mean that Chrome itself may install, or has already installed, eavesdropping software and activated it without my knowledge?
Edit: I see from a sibling comment that OS X has this eavesdropping software installed, so that leads me to believe that everyone running chromium devices will have this activated, and that it's going to be part of Chrome soon, if it isn't already.
I know it's hyperbole to call it "eavesdropping software", but I also know how many people here were unsettled by "OK Google" and "Alexa!" (Amazon Echo), and I really do want to understand how folks here feel about the intrusion.
A bit surprised that there is no security CVE report attached. Debian policy is that binaries are vetted by a Debian developer, sorted into main, contrib and non-free, cryptographically signed, and later verified by the client package system. The bug could allow arbitrary code to be installed and run without any of the above process if someone MitMs the connection between the download server and the client.
Isn't the blob downloaded from Google's servers over a HSTS and cert-pinned TLS connection?[0] If someone has MitM'd Google, it's gonna be a bad day for a lot of people.
From my POV, the thing that's actually bothersome about this issue is that a closed-source blob is automatically inserted into a project that I (and others) had understood to be completely open-source.
The fact that the Chromium Google Hotword code was later made opt-out (rather than opt-in) through a build-time configuration option is similarly troubling.
[0] IIRC, Chromium does support enterprise TLS snooping/interception devices, but those certs have to be loaded into Chromium before such devices will work.
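For readers unfamiliar with pinning: the mechanism referenced in [0] amounts to refusing a TLS connection unless a key in the presented chain hashes to a value baked into the client. A simplified sketch follows; the key bytes and pin set are invented for illustration, and real pins are SHA-256 hashes of SPKI structures rather than raw key bytes.

```python
import hashlib

# Hypothetical pin set shipped inside the browser binary (made-up values).
PINNED_SPKI_HASHES = {
    hashlib.sha256(b"example-google-intermediate-key").hexdigest(),
}

def connection_allowed(presented_key: bytes) -> bool:
    """Accept the connection only if the presented key matches a pin,
    so a MitM certificate issued by some other trusted CA is rejected."""
    return hashlib.sha256(presented_key).hexdigest() in PINNED_SPKI_HASHES
```

This is why "a CA was compromised" is not enough to MitM a pinned connection: the attacker needs the pinned key itself.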
They spied on unencrypted data as it was transferred between data centers. They can't decrypt or MITM anything because they don't have Google's keys, and Chrome using HSTS cert-pinning means that the cert is fixed and can't be faked with one for Google from another top-level CA.
This isn't proof that the NSA has Google's keys, but it outlines how the NSA uses stolen keys to decrypt information. I'd imagine Google would be one of their main targets.
Yeah. Spies have regularly provided data, for free or minimal compensation, to nation-state actors. Sometimes, this is information they know will result in the deaths of others. Often, the very act of doing it may result in the death of the perpetrator if caught.
Appeal to patriotism, a few million bucks, and immunity from prosecution? Surely someone highly placed at AppGoogAzonSoft is susceptible to that.
As far as we know, only on insecure channels. Google had "private" pipes between datacenters that they thought they didn't need to encrypt, so they didn't encrypt that data. That was the MITM we knew about. I don't believe we know of them MITM'ing a cryptographically secure channel.
Although Bruce Schneier suspects new leakers behind recent reports, for now at least most data we have about NSA capabilities comes from the Snowden documents. From this data it indeed follows that the NSA hadn't broken cryptography two years ago. But it would be plain unprofessional of them not to have raised their game by now, especially given the world's backlash against the leaks.
I'm not saying that the NSA nowadays has the means to break strong crypto. But they surely will have responded to the growing usage of crypto in some way. My money is on increasingly employing insiders.
Actually, I'd say the probability of three-letter agencies planting backdoors has increased since the Snowden leaks: the developer community hasn't responded with radically new tools and techniques that would allow us to detect and root them out at mass scale, while at the same time journalists burned lots of the NSA's precious toys and IT companies rendered others useless by mass-deploying crypto and modernizing their infrastructure.
Since it doesn't use the normal ways to download extensions, I would assume it does not use a TLS connection. Secret downloads have the problem of displaying certificate errors and similar things to the user, so they would likely have had to reimplement quite a few code paths to make that work properly. HTTPS and TLS are also not mentioned in either bug thread, which is a worrying sign.
If it uses TLS, then the bug is less exploitable but still violates the security policy of the vetting process, signed code, and license classification (closed-source blob). At a minimum, it leaves everyone vulnerable to arbitrary code injection from Google.
> Since it doesn't use the normal ways to download extensions, I would assume it does not use a TLS connection
If this was some random enterprise Java app, yes, that would be a reasonable assumption but you're talking about one of the most heavily audited codebases in existence, which has one of the best security teams in the world working on making TLS stronger and aggressively pinning certificates. Their track record would merit actually looking at the source rather than simply speculating.
It really looks like it uses the normal way to download extensions; the Web Store.
I'd be more concerned that the only thing the patch seems to do is not activate the hotword extension. It seems to still download and install it. I mean, maybe I'm wrong here. I think that checking through https://chromium.googlesource.com/chromium/src/+/f269d3b5482... for ENABLE_HOTWORDING is the right thing to do if one wants to understand the change that was made.
But this isn't a 'secret' download. Google isn't trying to hide it at all; it's just a feature of the browser. So there is no reason to try to avoid the problems associated with certificate errors.
And their system still does this for everything they distribute. This is really no different to, for example, Python's pip, at least as far as the policy around signing all Debian archive content goes.
Granted, this is a binary blob downloaded without consent, which is an issue; it's just not an issue of the Debian archive distributing unvetted code.
Note that although this bug report was forcibly closed, the fix is "This change adds an "enable_hotwording" build flag that is enabled by default, but can be disabled at compile time."
Consider what this backdoor does. It listens to any conversation in the vicinity of the device and reports it to a remote site. You can't see its keyword list. You can't tell when it's transmitting to the mothership.
Has anyone filed a US-CERT report with Homeland Security on this?
Strange how they "fixed" it by making it opt-out rather than opt-in, given the culture collision here. Google really likes NaCl a lot for a feature with almost no third-party adoption.
While there's "almost no third party adoption", there are two pretty significant uses: The Flash player and the PDF viewer. Browsers that rely on NPAPI for these get all of Adobe's security bugs on top of their own. You may say that Flash and PDF doesn't exist in your view of the web, but it definitely does for many people.
It does, and I enjoy MP4 videos and I'm not bothered much by DRM on Netflix either, but should a component like NaCl, implying a binary blob, be part of open source software... by default? It feels weird to have it included with Chromium by default. Weren't Chrome and Chromium originally separated in order to make one compatible with open source distributions, so that this sort of thing would be avoided? A conflict with Debian sounds like a pretty big one.
NaCl doesn't imply proprietary any more than a JavaScript engine does. The downloaded code is bytecode at an abstraction level a bit below C but quite a bit above assembly. It's not substantially different from a freedom perspective than code compiled to asm.js or just minified JavaScript. Both are usually proprietary and require reverse engineering work to decipher.
It's really hard to create vendor lock-in on a feature nobody uses.
But even if they did go down that route, NaCl is licensed under BSD, so even Microsoft could add NaCl to IE if they wanted to. That's some pretty weak vendor lock-in.
Licensing is not the issue. Microsoft could not add NaCl to IE in the same sense that it could not add Firefox's DevTools to IE. It would require a major rewrite and cost tremendous heaps of money.
Even if that's true (and I share the sibling's skepticism), vendor A's incompetence and poor product quality is not meaningfully described as "lock-in" to vendor B - especially so when there's a readily available vendor C (Firefox) that doesn't share these ills.
I've gotta raise an eyebrow on that one. If it's a major rewrite to support a new plugin authoring language, your plugin architecture was a terrible mess to begin with.
Given that we're supposed to believe that New Internet Explorer was pretty much a from-the-ground-up rewrite, I can't imagine that their plugin architecture is a terrible mess.
NaCl isn't simply a plugin architecture. It is effectively the entire Chrome sandbox and large parts of Chrome architecture made available to binary plugins.
You aren't pulling it into your project without also pulling in half of Chrome.
According to this [0], NaCl is a Pepper plugin. [1] This would strongly imply that all you'd need to do to use NaCl is to implement PPAPI. Care to point out how I'm wrong about that?
The PPAPI is very closely tied to Chrome's inner workings and is extremely complicated to implement because, compared to the old plugin APIs, it doesn't allow native code any access to the local system. So it needs to provide plugins with all the possible hooks they will ever need.
For other browsers to support PPAPI, they'd have to implement all of this, which, by the way, is also a moving target that moves forward in lockstep with Chrome releases.
> The PPAPI is very closely tied to Chrome's inner workings and is extremely complicated to implement
I've looked at the API, and can't agree with your statement. The API is similar to a typical game engine API. There are classes for handling input devices, audio, OpenGL, hardware video decoding, filesystem access, and basic networking. The PPAPI does not even touch the DOM, so it's not tied to being in a web browser.
It's also not spec'd; there are many edge cases in the implementation that are undocumented and would have to be specified precisely in order to be implementable by others. Nobody has done that work so far.
By "tied to chrome's inner workings" I meant the implementation itself which can't be lifted off of Chrome because of that. So PPAPI would need to be re-implemented by other browsers which is difficult as it's very much a moving target without an official standards process.
Basically what happens is that their flash plugin or some internal chrome app needs feature X at which point they extend PPAPI to have feature X. Trying to play catch-up with this kind of development is frustrating and difficult.
And aside from that, browser vendors don't like the fact that PPAPI is more or less re-implementing other existing web technologies. Here's a writeup by Robert O'Callahan from 2010 that goes into this reasoning: https://mail.mozilla.org/pipermail/plugin-futures/2010-April...
PPAPI just provides an alternate, lower-level interface to the same web technologies accessible via JavaScript APIs. There's a bit more power (OpenGL ES 2.0, not the crippled WebGL standard based upon it) but nothing very significant. It's not browser-specific, but it's also not going beyond what browsers provide.
Unfortunately, that fix disables the functionality completely, rather than making it so that the sandboxed hotwording module can be used if enabled by the user.
This fix is an opt-out with a compilation flag. Also, I don't know much about Chromium development process, so it might be irrelevant, but I only see source updates, without any updates in the documentation.
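For reference, the quoted fix means a distributor can switch the feature off when generating the build. Something like the following, where the flag name comes from the fix itself but the exact invocation is illustrative and depends on whether the tree uses GN or GYP at that point:

```
# GN-style build configuration
gn gen out/Release --args="enable_hotwording=false"

# GYP-style equivalent
GYP_DEFINES="enable_hotwording=0" build/gyp_chromium
```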
In a web browser implementation with NaCl support, downloading and executing arbitrary binary blobs is very much a feature, not a bug. The issue here seems to be that Chromium was configured, by default, to download and execute a particular Google-provided binary blob. And now it isn't.
Note that as soon as you go to ANY WEBSITE using Chromium, you are trusting that site to send you arbitrary data, which could include NaCl binaries, which you're then going to trust Chromium to execute.
If the default is to allow those arbitrary pieces of code to obtain audio input from the microphone, then that is a major issue. I very much doubt it allows that by default.
The problem here is not mainly that it downloads and executes code via NaCl, though it's iffy that it does so with no simple way to disable it.
The problem is if it does so and grants that code access to APIs that should be privileged and something the user ought to be aware of.
(Names removed as don't think they're important to repost. And not intending this as incendiary. At some point someone needs to ask "How much are we impacting usability by stripping features?")
"1 week ago (2015-06-09 02:17:42 UTC) #19
On 2015/06/09 01:34:04, _ wrote:
> On 2015/06/09 01:13:10, _ wrote:
> > Done, but I'm going to voice my opposition to this. When I took over the
> > hotwording code, there were a number of obscure hoops to jump through to get
> > this to work locally. They've been eliminated. Having to find this pair of
> > #defines and change them is more obscure to a newcomer than a build flag
> > (that can easily be traced through the code).
>
> What about having a run time flag for enabling hotwording? We would always
> enable hotwording for Google Chrome builds, and check for it in Chromium
> builds. That way, Chromium users can make the decision to use the feature,
> rather than the Chromium distributor. And it won't require hotwording
> developers to build with GOOGLE_CHROME_BUILD or do any #ifdef hacking.

We shouldn't be exposing feature-enables to end users as runtime switches.

Honestly, I think the right thing to do is for Chromium users to get this by default but have a build switch that can be used to disable it for projects like Debian that feel more strongly about not having any external binary blobs used for anything. I don't think simply disabling this in general for Chromium is correct. It's not true that Chromium by definition should not make use of such blobs, and I think the damage to developers and to more pragmatic users of Chromium-based projects is greater than we should be willing to pay."
A major difference is that web content does not have silent access to your microphone, and the ability to monitor you and send information back home.
There are large security differences between what NaCl allows for web content, and what it allows for what Google considers internal parts of Chromium, even if they run in NaCl. Like this blob here, and also Flash, the PDF plugin, and others. All those do a lot more than what a random website would be allowed to.
I've run "arecord" on hw:1,0 (via PulseAudio, oh my...) to show that one of the devices (pcm0c) is currently capturing audio. Below is the "hardware status" as reported by ALSA. In the pavucontrol UI application (again PulseAudio), the recording application is also shown.
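The same check can be scripted by reading ALSA's procfs status files instead of eyeballing arecord. A sketch, where the path follows the hw:1,0 / pcm0c naming above and the parser assumes only the "state:" line format (an idle substream's status file just says "closed"):

```python
def is_capturing(status_text: str) -> bool:
    """Return True if an ALSA substream status dump shows an active capture.

    On Linux, per-substream status lives under e.g.
    /proc/asound/card1/pcm0c/sub0/status (the 'c' suffix = capture device).
    An idle substream reports just 'closed'; an active one includes a
    'state: RUNNING' line among other fields.
    """
    for line in status_text.splitlines():
        if line.strip().lower().startswith("state:"):
            return "running" in line.lower()
    return False

# Example (Linux only):
# with open("/proc/asound/card1/pcm0c/sub0/status") as f:
#     print(is_capturing(f.read()))
```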
JavaScript isn't far off binary in terms of readability nowadays, given the level of packing/minification, so legibility isn't a deciding factor. Therefore if you can't trust the Native Client sandbox, why trust JavaScript, or even HTML, from third parties? Native Client is part of Chromium; you can audit its source code just as much as for any other language your browser speaks, so why make this distinction?
I didn't even know this "native client sandbox" existed! Why the hell should I trust it, or anything executed inside it?
I'm tentatively willing to believe that Chrome is probably not trying to pwn my box, because I don't think Google has a compelling reason to do that which would outweigh the flak they would get if they were caught. Allowing them to run arbitrary compiled executables on my machine, however, would require me to transitively extend trust to everyone using their technology, and to do that I would have to be confident that there are not and never will be any security holes in their sandbox. That is an unlikely proposition to say the least and therefore I want nothing to do with NaCl.
In some instances when served with an NSL (or some other mechanism we don't even yet know about), they can be forced through legal policy to cooperate in building something that pwns your box. Google's compelling reason is that they are under the jurisdiction of the American government. Though I share your tentative belief that Chrome/ium isn't necessarily a "pwn vector" per se, I am 100% willing to believe that they are compelled to cooperate in building some kind of vector for the NSA.
You say "they" as in Google, but it would be much more effective to persuade a single developer (and maybe his manager), who could implement such a feature in an open, transparent way (which would display the standard "recording" icon in the omnibox) or in a closed, subversive way (like this).
Basically lean on "Never attribute to malice that which is adequately explained by stupidity" as much as possible to fly under the radar as long as possible.
Yes, of course it is. Why would it be a thing in the open source version but not the almost-identical closed source version with extra Google goodies? Unless one of those goodies was "remove one of the key aspects of NaCl", which, I can assure you, it is not.
It seemed plausible, but I don't know what "NaCl" is, have no familiarity with Chrome internals, and had never heard of this before. Seemed like it was worth checking before making such a decision. I am amazed that Google has repeated Microsoft's mistake with ActiveX - who would think this was a good idea?
I did; the ticket you linked has been closed with 'adding an opt-out flag' as the solution. I opened another one to discuss its behaviour in general, i.e. whether it should:
1) be opted in by default
2) not ask the user for permission (or notify them in any way)
There are always microphones, everywhere, all the time.
I remember when I was a toddler and they taught me that Santa's elves and God's angels were spying on me to make sure I was good. They later told me it was a lie. It became true soon enough.
Fighting for privacy is impossible. We should fight instead for transparency, that is, spying on those with power. And we should fight for equal rights, that is, making it hard for people to hurt you with the private knowledge they have about you.
It's an extension running in the usual extension sandbox with permission to access the microphone. It's the "Ok, Google" implementation. It's a proprietary blob so you'd need to reverse engineer it to figure out exactly what it's doing.
There's a comment there indicating that FF has done the same in the past with an H.264 blob.
The conspiracist in me wonders why both these major browsers have downloaded and maybe executed binary blobs. Is it purely a convenience feature in the browser? Is it a secret order? That last question would have been silly a decade ago but we all know it's entirely possible now.
the open h.264 blob thing is annoying, but it's supposed to be a reproducible build of open source software.
The reason there's a blob is that, for that particular binary, Cisco pays the patent licenses.
So you can verify the source for any issues, verify whether it matches the binary, and work around MPEG-LA licensing at the same time (there are caps, and Cisco seems to have calculated that even when running into them, they're still better off with having webrtc support h.264 everywhere).
Wasn't Firefox recently called out for including proprietary integration from Pocket and Hello on their new versions by default which cannot be removed but only disabled? [1]
I wonder if I should just switch back to IE6 that has no microphone and webcam support, but then there is ActiveX! :(
Can you please cite where you read that proprietary blobs are used? IIRC the Pocket client is open-source, and so is the Hello client (it's basically a webapp that uses WebRTC)
You may be right; that was bad wording on my side, thanks, and corrected. I meant to write "proprietary integration", since it is compatible only with the respective company's applications.
The client side code for Pocket integration is open source, so you can look at it if you'd like. You can disable it just by removing the Pocket icon from the toolbar. Plus, as Firefox uses lazy loading, once the Pocket icon is removed, the integration code will never be run.
Hello is just a thin wrapper around WebRTC (an open protocol)
Pocket is just a button that does a couple of AJAX calls to the Pocket site.
Both are backed by closed source code on the server side, but when you click them it's pretty obvious that they are talking to some online service which may or may not snoop. You even have this "danger" when using Sync in any browser. In all these cases it's very clear what's going on.
When you use Pocket you know that the URL of the page you were visiting was sent to some service. When you use Hello you know that some routing service might be able to snoop on your call (I believe there's some encryption here though, but I'm not sure). When you use Sync you know that you're sending data to the server.
When you enable "Ok Google" detection in an open source software one would expect that the "Ok Google" detection is done locally in open source, verifiable code, and only after this detection is triggered, will sound be sent to the server. If this blob was instead some open source code, one would be able to verify that sound is only sent to the server when it is expected. But now that it's a blob, you don't have this guarantee. It could theoretically send periodic sound snippets to the server without you noticing, since it's listening on the microphone all the time.
That's the difference. Firefox's proprietary integration has verifiable triggers. It won't talk to a proprietary service unless you ask it to, and when it does you can verify what data it is sending.
On the other hand, this blob has no verifiable triggers. Yes, it is disabled by default (verifiably, apparently), but when enabled the data it collects and sends is not verifiable.
(Firefox also does have some blobs -- one for H.264, but the code behind it is open source, the blob is distributed for licensing reasons, and one for EME, but the EME blob is downloaded only with a confirmation which informs the user what is going on)
Another reason to switch to Iridium Browser. It has Google search disabled by default and even if you switch search to Google, Voice search and hot-words stay off until you manually enable it.
Hm, when I try to download Iron's source code ("for Coder"), a Rapidshare page saying "Our services have been closed. Thank you for your understanding!" pops up. How convenient.
"I seriously question the good faith of such an upstream which does these kinds of things"
"But basically secretly downloading it leads to the question of possible malicious intent (and everyone knows that Google&Co. do voluntarily and/or forcibly cooperate with NSA and friends)."
"while I haven't looked at the code, I wouldn't even be surprised if the downloading itself is done insecurely."
"Worse, chromium isn't the only such rootkit-downloader,... e.g. FF which secretly downloaded the OpenH264 blob."
Really, if you condone this attitude then I can only say... well, I won't say it, but it isn't nice. Not only that, everyone seemingly ignores the "Note that the binary blob is executed through Native Client, which is not enabled by default" part.
You people are so beyond reasonableness I find myself defending Chrome/Google. I can't believe this.
The tone was inflammatory but the sentiment is valid. Quoting:
> Since no one really knows which binaries have been downloaded there and what they actually do, and since it cannot be excluded that they were actually executed, such systems are basically to be considered compromised
A closed source binary being silently downloaded and executed without explicit action by the user or notification to the same is a security incident.
Many people are used to it because of all the training received from "Java Auto Update", "Google Update Helper" and similar software being given blanket permission to monitor, download and execute closed source software with the same permissions as the logged-in user.
Despite that, a person who goes to the lengths of using Debian (instead of Ubuntu) and Chromium (instead of Chrome) certainly expects more from their sources than to allow this kind of behaviour.
It is a security incident and should be treated as one both by Debian and by the community in general.
"A closed source binary being silently downloaded and executed without explicit action by the user or notification to the same is a security incident."
Whereas source code being downloaded, compiled and run is not? Or a script being downloaded and run?
Source code being downloaded, compiled and run, or a script being downloaded and run, would be as much a security incident as what happened.
In this context (Chromium on Debian) having a closed source binary downloaded and executed is an additional problem to the security incident and that's the reason it is mentioned in the statement. There are two problems conflated in the same sentence:
1. A binary was downloaded and executed without explicit user intervention or consent.
2. A closed source binary was downloaded and executed by a primarily free and open source software in a free and open source distribution without explicit user intervention or consent.
So, answering the questions, having the source available would not make it ok but being closed source in this context is a problem on its own.
Yes. If opening a webpage downloaded a script that permanently altered the browser, adding or removing functionality without explicit user intervention or consent, it would be a security incident. There is even a class of scripts that warrants a special name because of exactly this behaviour: malware.
Considering the more general case of scripts being downloaded and executed in the browser (javascript, for instance) the more apt analogy would be one being downloaded and executed in a system with NoScript installed.
Just like NoScript is a tool that gives its users the power to decide on a case by case basis which scripts are executed by the browser, Debian is a tool that gives its users the power to decide on a case by case basis which closed source binaries are executed by their system.
Preventing this choice in this context is a security incident.
I think it's totally reasonable, from the Debian mindset.
Debian is an open-source zero-binary-blob distribution. You have to deliberately choose to install binary blobs, such as driver firmware etc.
It is the dismissive response which is not appropriate. The responder asserts that the blob is safe and trustworthy; how would anybody know?
This guy is actually perfectly right. Google doesn't do things like that by accident. I'm glad there are some sharp guys around to disclose such potentially malicious behavior.
Do what by accident? Of course the intent was to include the blob. If Debian wants to get rid of it it should patch Chromium or request it to be made configurable (which is what happened).
But you do understand that Chromium is supposed to be open source, right?
So, if the intent was to include a binary, closed source blob into an open source project, that could be called malicious.
No, he means Chromium (like Android) in practice are read-only, hostile projects that respond only to Google's needs.
Yes, you are free to create a fork.
In reality, it's nearly impossible to keep up with Google's development pace and their behavior of dumping huge changesets and lack of documentation and communication wears everyone out. If you have some exposure to biology/ecology you'll recognize the behavior as very effective at killing off diversity in ecosystems. It's like trying to co-exist on a lake with someone that keeps deliberately causing giant algal blooms.
> No, he means Chromium (like Android) in practice are read-only, hostile projects that respond only to Google's needs.
Huh? You've obviously never involved yourself in Chromium development. It's easy to get started and to stay up to date. As with all massive projects it takes work to do so, but no more so than any of the other open source browsers.
Unlike Android, Chromium is mostly developed in the open. As someone who has contributed to both projects, I wouldn't say Chromium is any less welcoming to contributors than Firefox. Mozilla is just a lot better at presenting themselves in a positive light. Firefox even has similar automated downloading of binary blobs, like the EME plugin.
>As someone who has contributed to both projects, I wouldn't say Chromium is any less welcoming to contributors than Firefox.
You're far more likely to run into hidden discussions about features, or to have patches made obsolete by out-of-the-blue code drops, when trying to upstream to Chromium.
>Firefox even has similar automated downloaded of binary blobs like the EME plugin.
Mozilla's EME stuff was widely discussed, announced in advance, and coordinated with distros.
Even if malice instead of incompetence is involved, it goes pretty far to call Chromium a "rootkit-downloader" just because it downloads a binary blob.
It could theoretically be a rootkit, but without any evidence to support it this is like calling someone a murderer because he went to the same high school as a murderer.
> without any evidence to support it this is like calling someone a murderer because he went to the same high school as a murderer.
That's a pretty flawed analogy. If you're going to examine it from a criminal act point of view, let's really look at it that way. If installing a rootkit is equivalent to premeditated murder, then the murderer must have motive, means, and opportunity. Let's take Sony as a good example of a company guilty of installing a rootkit. Motive: Prevent unauthorized copying of their music CDs. Means: Rootkit is embedded in the audio CD and uses Windows' Autoplay feature to install itself. Opportunity: They didn't disclose this rootkit so anyone who bought a Sony audio CD during that era was vulnerable.
Now, let's look at what we know about this Chromium binary blob silent install (note I'm not calling it a rootkit, as I agree with you that's taking it a bit far, but it would theoretically be possible to install one via the same method). Motive: Google wants to put the same always-listening "feature" on Chromium installs as well as plain old Chrome. Means: Google writes and publishes the Chromium source code. Opportunity: Just guessing here, but Google releases this change without announcing it (otherwise why didn't the Debian packagers see it right away?).
Now, once again I'm in agreement with you that calling this a rootkit downloader is a bit much. But what if it had actually been a rootkit, inserted by Google either intentionally (I don't trust them, but honestly why would they do something that nefarious?), or without their knowledge or consent (which would mean they are compromised by an outside actor). That is why this is such a big deal, and kudos to the Debian team for finding it.
It also bothers me that this binary blob, while not actually a rootkit, did have the ability to listen to the computer's microphone 24/7 (yes, that is a "feature" as it is part of Google Now), and can't be audited because there is no publicly available source code. That's quite a security hole; I recall discussing all kinds of financial and personal matters with my wife right in front of our computers. Thankfully they are both desktop machines without built in microphones, but many people these days use laptops as their main computer.
To sum up, I don't like it and I think it's a shitty thing for Google to do. Whether it was intended to be a silent install instead of public knowledge, or just a major gaffe, remains to be seen.
The sandbox binary uses setuid root if user namespaces aren't available, but that's a necessity for making the empty chroot and process/network namespaces used to sandbox tabs. The layer-2 sandboxing code (seccomp-bpf) doesn't require anything like that, but they're meant to be complementary (although both are strict enough that they could act as a meaningful sandbox alone).
>this is like calling someone a murderer because he went to the same high school as a murderer //
I'd say a closer analogy would be because he had the same blood splatter pattern on his clothes as a murderer. But neither is a useful analogy; they're just biased by our perspective on Google.
Think about these people's starting point. They want a complete OS and application set, built from source, that is completely auditable.
This is direct circumvention of that.
Additionally, you can never prove you have removed all of your security flaws. The best you can do is search harder to become more confident that none exist. There is no way to prove that a privilege escalation flaw does not exist. Now arbitrary code has run against your wishes and without your permission.
No one can prove it didn't run a privilege escalation attack. If you think your digital security is of paramount importance you must assume you have been compromised and take steps to remedy this potential issue.
It is not that google is guilty, it is that moments before this happened security was provable and now it is not.
I think you are underestimating how much you can actually prove. There's an entire field of software verification which deals with formalizing aspects of the execution environment and proving the absence of certain classes of bugs in a system. Sure, your proofs might rely on some assumptions, but testing, a mere search for bugs, is surely not the only way to ensure correctness.
It's not about what I am thinking. I never said anything about my thoughts on it. Perhaps I think computer security can be improved with liberal applications of butter and jelly, or something equally ridiculous.
The people in question think it is an issue. They do this because of their starting point.
Whether we agree or not is not really relevant. Advising people to not read a bug report because one of the comments in the discussion associated with the report doesn't appeal to you is... strange, to say the least.
How do you manage any kind of discussion format if opinions (however wrong they may be) you don't like make you leave the discussion entirely?
On the contrary, sometimes ignoring the discussion is the only way to stay sane and make progress because there are an infinite number of people with "opinions".
In this case the original report is informative, what comes after (up to 2 posts now) probably isn't going to be.