Hacker News | teknopaul's comments

"This is nuts: it’s akin to saying that the milli- prefix should have different meanings depending on whether we’re talking about meters or liters."

We were here recently with "mega": sometimes mega is squared, as in megapixels; sometimes not, as in megabytes.

No biggie.

dB in audio is a relative scale, and that makes perfect sense. If your mixer goes +6 or -6 dB, that makes sense but can't be measured as power: your mixer might not be plugged into any speakers, so the relation to real power is moot in the digital realm.

3 EQ bands with ±6 dB make sense too. It doesn't need to be precisely specified to be of immediate value; ±12 dB is clearly something else, and users know what.


> Sometimes mega is squared as in megapixels. Sometime not as in megabytes.

Even worse, the mega in megabytes could mean 1,000,000 or 1,048,576, and it's more or less up to you to know which is which.

(Yeah, formally there are megabytes/mebibytes/MB/MiB, but I honestly cannot recall the last time I heard anyone use anything other than just "megabyte" for 2^20 bytes... or even want to refer to exactly 1,000,000 bytes, other than decades ago when disk manufacturers wanted to make their hardware seem higher capacity than it really was.)
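The two conventions differ by about 5%, which is easy to check with plain shell arithmetic (a throwaway sketch, nothing package-specific):

```shell
# SI megabyte vs binary mebibyte, in bytes:
echo $(( 1000 * 1000 ))   # MB:  1000000
echo $(( 1024 * 1024 ))   # MiB: 1048576
```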


I see MiB in usage a lot and I always make the distinction myself


Wait, when did they stop ?!


I think the biggest issue by far is that many of the different contextual uses for dB are all in the same domain, or very close.

When you're talking about the loudness of sound, in the same exact context you might care about SPL, perceived loudness, AND gain.

If it was just a matter of "in electrical engineering / physics, dB implies this unit + baseline, when dealing with acoustics, it implies this other unit + baseline", it would be less problematic.


Megapixels aren’t ‘squared’ though?

Ah, unless you’re trying to make ‘pixels’ the same unit as in ‘pixels per inch’…

The problem there isn’t how ‘mega’ is applied but how ‘pixel’ means both an area pixel as well as the linear size of a pixel.


> Sometimes mega is squared as in megapixels.

Is that right? A pixel is a 2D object already. It's not like e.g. with centimeters, where it's a 1D unit, so it becomes centimeters squared to form a 2D unit.


It's (cm)^2, not c(m^2). If there were a one-dimensional equivalent to pixels, say pix, then 1 pixel = 1 pix^2, and 1 megapixel = (kpix)^2.


After putting significantly more effort than perhaps socially acceptable into this, I completely agree so far, but I'm still horribly confused about GP's point about "mega" being a "squared unit in megapixels".

That to me implies that "mega" in "megapixels" is a planar ("[pre?-]squared") scaling factor, but it's... not really? Are those even a thing?

I think this is what the debate is about at least. Mega does line up with kilo squared, but that's not because mega becomes a planar scaling factor, but because it just so happens that 1000 times 1000 is 1 million. It's kind of a coincidence? Like it's literally 1 million pixels, that's what's being meant. Just like with cm squared, the... ohhhh.


I believe the point is that in "centimeters squared" you also square the cm. So it's 1/100 of a meter, then squared. While the mega in megapixels applies to the area. So presumably there are two lengths multiplied to make a pixel, and then you have a million of those.

I don't think it's a very deep point, and I would say the mega is not squared, but the centi is squared.


What finally clicked it into place for me was trying to perform a unit conversion. It's kind of annoying cause we don't really use mega with anything squared usually (I don't recall anything at the moment at least), which added to the confusion.

When one converts from square kilometers (km2) to square meters (m2), one needs to undo the kilo (x1000) not once, but twice, accounting for both dimensions. So as you say, it's actually here where the scaling factor is secretly squared, it's k2, just not written out. Hence despite kilo being a 1000x bump, you need to divide by 1 million, because it's actually squared in kilometers squared.

So if mega in megapixels was behaving "normally", that would imply similar semantics, so to convert into kilopixels, you'd divide by 1000 not just once, but twice. But no, 1 megapixel is 1000 kilopixels. I guess the idea is that instead of being "secretly" squared it's "explicitly" already "square" as a result? So instead of resolving to mega^2 pixels^2, it's just mega pixels, since a pixel is 2D, and so squaring it is unnecessary.

I think I get it now at least, but yeah, I agree with your sentiment. It actually reminds me: when I first learned converting between units of area in primary school, I had quite the trouble wrapping my head around it, exactly because of this.
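The two conversions discussed above can be checked with plain shell arithmetic (a quick sketch; the variable names are made up for illustration):

```shell
# km^2 -> m^2: the kilo prefix gets squared along with the metre,
# so the factor is 1000 * 1000, not 1000.
area_km2=3
area_m2=$(( area_km2 * 1000 * 1000 ))
echo "$area_km2 km^2 = $area_m2 m^2"    # 3 km^2 = 3000000 m^2

# megapixels -> kilopixels: "pixel" is already the 2D unit,
# so the prefix scales the count linearly and the factor is just 1000.
mp=1
kpix=$(( mp * 1000 ))
echo "$mp MP = $kpix kpix"              # 1 MP = 1000 kpix
```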


Yes, and I think it just depends on how you read the square. For cm^2, we mean that we square a centimeter, not that we have a centi of a meter^2. But a hectare is really a hecto of ares, so 100 ares. Here the square is "baked in" to what comes after the prefix.


The pixel ain't no problem.

A "megapixel" is simply a million pixels: roughly 1000 pixels squared, give or take.

There is no kilopixel. Or exapixel.

Does anyone actually not understand this?


This.

Trump will pardon anyone on his team.

The existence of Presidential pardons is a disgrace. There is no pretence of the rule of law.


Presidential pardons were a crucial check on the power of the courts. The Constitution was written to curb excesses of 18th century England.

I'd say we didn't use them nearly enough. And now they're being used exclusively for crime. Yet another sound idea turned against us. There just isn't any way to govern a nation which has a majority in favor of destroying democracy.


You mean, his friends and the top levels

Make no mistake, the 'kids' in DOGE will be the first to be thrown under the bus.


There is a law in Europe that says this should always be the case.

Unfortunately the USA tends to set the global standard for greed, deception, dark patterns, accidental bugs that live for many years and just happen to prevent users from getting their data, and often flagrant disregard for international law, backed by Washington.


It's a pet gripe of mine with CLI apps that do "first wins" parsing as well.

You should be able to set

  alias foo='foo -p 80'
And still write

  foo -p 81

espeak suffers this affliction
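A minimal "last one wins" sketch in plain sh, using getopts (the default port and the script's behaviour are made up for illustration):

```shell
#!/bin/sh
# "Last one wins": getopts visits every occurrence of -p in order,
# so a later -p simply overwrites the earlier value.
port=80                       # built-in default
while getopts "p:" opt; do
  case "$opt" in
    p) port="$OPTARG" ;;      # each -p replaces the previous port
    *) exit 2 ;;
  esac
done
echo "using port $port"
```

With `alias foo='foo -p 80'`, `foo -p 81` expands to `foo -p 80 -p 81`, and a parser like this prints 81 instead of ignoring or rejecting the second flag.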


Hammer, nut.

Clever trick tho if you are in a bind.


We need one of these for .debs. The answer files are easy to generate after installing once, but it would be better to have an HTML UI that catered for every annoying .deb that can't think up sensible defaults for itself.


For Debian it's trivial to do: `debconf-set-selections` is your friend. If you want something for unattended installations, you can embed that into `live-boot` or into FAI.

[1] https://manpages.debian.org/testing/debconf/debconf-set-sele...
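A minimal sketch of the preseed workflow; each line of the answer file is `<package> <question> <type> <value>`, and the tzdata questions below are just an illustrative example:

```shell
# Record the answers debconf would otherwise prompt for:
cat > preseed.txt <<'EOF'
tzdata tzdata/Areas select Europe
tzdata tzdata/Zones/Europe select London
EOF
debconf-set-selections < preseed.txt

# Then install without any prompts:
DEBIAN_FRONTEND=noninteractive apt-get install -y tzdata
```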


If you are a white US citizen you have nothing to be concerned about.

Unless you have friends, family or empathy.


Brown green card holders have it worst, but white, native-born US citizens are fucked too. The agent can decide you look Mexican to him; the agent can decide you're not sufficiently enthusiastic about Trump.

And if there's no due process for non-citizens, there's no due process for citizens either. That congress is allowing this to happen is deeply, deeply fucked.


Yep, without going into too much detail. My sister had a green card, is white, and clearly got a bored customs agent, because she got held in the little room with all the 'problem cases' for a couple of hours. It doesn't take much..


Just package node_modules subdirectories as tar files.

I stopped using npm a while back and push and pull tar files instead.

Naturally I get JS modules from npm in the first place, but I never run code with it after the initial install and testing of a library for my own use.
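The workflow amounts to something like this (a sketch; the tarball name is made up):

```shell
# Snapshot a vetted dependency tree as a plain tarball:
tar czf node_modules.tgz node_modules/

# On a clean checkout, restore it instead of running `npm install`:
rm -rf node_modules
tar xzf node_modules.tgz
```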


This is a valid choice, but you must accept some serious trade-offs. For one thing, anyone wanting to trust you must now scrutinize all of your dependencies for modification. Anyone wanting to contribute must learn whatever ad hoc method you used to fetch and package deps, and never be sure of fully reproducing your build.

The de facto compromise is to use package.json for deps, but your distributable blob is a docker image, which serializes a concrete node_modules. Something similar (and perhaps more elegant) is Java's "fat jar" approach where all dependencies are put into a single jar file (and a jar file is just a renamed zip so it's much like a tarball).


It may not be a well-known feature, but npm can unpack tarballs as part of the install process; that's how packages are served from the CDN.

If you vendor and tar your dependencies correctly, you could functionally build a system around trust layers by inspecting hashes before allowing unpacking, for instance.

It’s a thought exercise, certainly, but there might be legs to this idea.
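One way to sketch the "inspect hashes before unpacking" idea with standard tools (the filenames are made up):

```shell
# Record the hash of a vetted tarball once, at review time:
sha256sum deps.tgz > deps.tgz.sha256

# Later, refuse to unpack unless the blob still matches:
sha256sum -c deps.tgz.sha256 && tar xzf deps.tgz
```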


I think Yarn zero install is now the default, and does the same thing you're advocating? I'm not really a JS person, but it looks like it's done reasonably competently (validating checksums etc).


I don't see why system-wide DNS caching would be of use.

How many different programs in the same process space hit so many common external services that individual caching of names is not sufficient?

The article lists a bunch of fun with systemd running junk in containers that seems counterproductive to me. A lot of systemd stuff seems useful on a laptop but ends up where it's really not wanted.

Local DNS caching seems like a solution looking for a problem to me. I disable it wherever I can. I have local(ish) DNS caches on the network, but not inside LXC containers or Linux hosts.

