As one example, if the Internet Archive goes offline, a massive corpus of the last two decades is gone forever. As another, I had a friend who bought hundreds of dollars of PlaysForSure music only to have the store shut down and the license revoked within the span of 12 months. Hope you didn’t care too much about those 3DS, Wii and Wii-U eshop games. (And on and on.)
We currently live in a world of abundance and access. Even with that, there are movies that can no longer be seen, music that can’t be listened to, and books that can’t be read because they were never widely or publicly available.
There’s a claim that it was a marketing scheme in the 1940s to reduce the usefulness of hand-me-downs in families. My grandmother would have lived through that and I may see if she remembers anything about it. She was definitely babysitting or watching children by 1940.
That doesn't make sense to me. US textiles were in high demand (though perhaps not in pink and blue) starting around 1940, and by the end of the decade US consumers were getting quite wealthy. I'm not saying it didn't happen, but I'd like to see a better reason than "companies love money," since if you loved money in the 1940s there were better ways to get it than mounting a marketing campaign to reverse a social standard (using a marketing industry that was much less advanced and pervasive, no less).
Unless you’re writing instructions for a Turing machine, the impedance mismatch between the real world and “computing” is always going to have idiosyncrasies. You don’t have to like a language to understand its design goals and trade-offs. There are some very popular languages with constraints or designs that I feel are absurd, redundant, or counterproductive, but I cannot think of a (mainstream) language where I haven’t seen someone much smarter than me do amazing things.
The language I consider the lamest, biggest impediment to learning computer science is used by some of the smartest people on the planet to build amazing things.
I typically use those when the following code does something unexpected, relies on external state, or is setting state relied on by something else.
There are times when you can’t rewrite or refactor something, or have to use an API with non-obvious behavior. Plus, NOTE is shorter than HERE BE DRAGONS.
I read that as: take a bunch of pictures of a static scene, each with a different filter capturing specific frequency bands individually. Merge afterwards with whatever weights or algorithms you want.
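That merge step can be sketched in a few lines of NumPy. Everything here is an assumption for illustration: three made-up intensity arrays stand in for the per-filter captures, and the weights are arbitrary.

```python
import numpy as np

# Hypothetical stand-ins for three exposures of the same static scene,
# each taken through a different band-pass filter (2-D intensity arrays).
rng = np.random.default_rng(0)
band_r = rng.random((4, 4))
band_g = rng.random((4, 4))
band_b = rng.random((4, 4))

# Merge afterwards with whatever weights (or fancier algorithm) you want.
weights = [0.5, 0.3, 0.2]
merged = sum(w * band for w, band in zip(weights, [band_r, band_g, band_b]))
```

Because each band is captured separately, the weighting can be changed after the fact without retaking the photo.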
So you get a header only implementation by forcing all users to do something unexpected instead of having a single C file. Seems silly and not really the point of a header only implementation.
It's not that weird, and it's explained first thing in the header and shown in the example of how to use it. You do need to read how to use the thing, and this is a simpler detail than any of the function signatures you'll have to look at.
Personally I would probably add a gif.c to my project which does nothing but include the header with the define set, at least if I'm going to need a gif decoder in more than one place. Probably many (most?) projects only need this library in one file anyway, in which case I'd just include it from that file and be done.
Telcos were given billions to expand and improve broadband for decades and never did it. If the FCC has scrapped the goal are they also scrapping the handouts? If so, it’s long overdue.
I had a Powercore III Sense 10K (I think), which doesn’t appear on that list, swell on me recently. It’s one of my newer batteries, and I bought it from Anker trusting their reputation and my past purchases. I also purchased their Soundcore earbuds, which failed to charge after a few weeks. I don’t think I have any cables from them anymore, as they’ve all failed as quickly as (or more quickly than) less expensive options.
My current perspective, recall or not, is their quality is no different from the alphabet soup companies selling identical looking (and possibly identical) items.
It seems like the overheating issue is “not their fault,” but part of being a trusted brand isn’t just issuing recalls; it’s vetting suppliers and the components they receive.
I enjoyed our metrics systems at Amazon, wrote one with a similar API at Okta, and should really look at writing another one to open-source.
One of the huge missing things in metrics systems, imho, is keeping granular metrics in the context of a business operation and then using late aggregation for trends. Last I looked, nearly every metrics system either logged individual events and required processing for any rollup, or aggregated too early so you couldn’t determine the effect on any individual operation/request. There’s a happy medium where you can get per-request counts, stats, and timing and still roll those up at host/data-center/region granularity to get higher-level trends.
Most metrics APIs are incompatible with this idea, however.
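A minimal sketch of that happy medium, with an entirely hypothetical API: keep one granular record per business operation, and only aggregate afterwards, at whatever granularity you need.

```python
from collections import defaultdict

class RequestMetrics:
    """Granular metrics kept in the context of one business operation.
    (Hypothetical API -- a sketch of the late-aggregation idea, not a
    real library.)"""
    def __init__(self, operation):
        self.operation = operation
        self.counters = defaultdict(int)
        self.timings = defaultdict(list)

    def count(self, name, n=1):
        self.counters[name] += n

    def time(self, name, millis):
        self.timings[name].append(millis)

    def emit(self):
        # One record per request; nothing is pre-aggregated, so the
        # per-request view is never lost.
        return {"operation": self.operation,
                "counters": dict(self.counters),
                "timings": dict(self.timings)}

def rollup(records, name):
    """Late aggregation: roll per-request records up into a trend."""
    values = [ms for r in records for ms in r["timings"].get(name, [])]
    return {"count": len(values),
            "avg": sum(values) / len(values) if values else 0.0}

# Per-request detail survives...
m1 = RequestMetrics("checkout"); m1.time("db_query", 12); m1.time("db_query", 30)
m2 = RequestMetrics("checkout"); m2.time("db_query", 18)
records = [m1.emit(), m2.emit()]
# ...and the same records roll up into the higher-level trend.
print(rollup(records, "db_query"))
```

The same `records` list could be grouped by host, data center, or region before calling `rollup`, which is exactly what early-aggregating APIs make impossible.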
You're likely talking about "wide events," which is essentially as many dimensions as possible attached to a single event.
I believe Meta was the one to develop an internal tool for handling this named Scuba.
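As a sketch, a wide event is just one record per request with many dimensions attached; all the field names below are illustrative, not from any particular system.

```python
# A "wide event": one record per request, with as many dimensions as
# possible attached (field names here are made up for illustration).
event = {
    "timestamp": "2024-01-01T00:00:00Z",
    "operation": "checkout",
    "user_tier": "premium",
    "region": "us-east-1",
    "host": "web-42",
    "duration_ms": 184,
    "db_queries": 3,
    "cache_hit": False,
    "status": 200,
}
# Any dimension can become a group-by key at query time -- p50 latency
# by region, error rate by user_tier -- with no pre-aggregation needed.
```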