It's interesting how people come and mock without any framework for understanding the thing. It's almost like a lost language.
Consider air - it is immaterial, the spirit/principle/reason/meaning/pattern of things. It's also the vehicle for speech, and when we stop breathing it we die.
I don't think the quote (or the broader text) means a single concrete thing - it's saying something about how the world works, and should be applicable in multiple ways. Underappreciated rabbit hole!
A purpose of a religious text is to control people. They do that through well-known means. It says “blah-blah, but look at this fallacy you aren’t aware of, so believe in god”, at different zoom levels. Every one of these is trivially deconstructible, because their main target was the uneducated masses, who applied no scrutiny. Those who had it were religiously “educated” and accounted for. Religions that didn’t do that didn’t survive. That’s the framework of understanding. This thing wasn’t written by “god”; it’s the work of a few scammers, sadly the biggest in our history.
It seems to me that your stand is analogous to anarchists' about law and government. Sure, there's a tyrannical aspect that can get out of hand, but it's far from the whole story.
I don’t think this is a good analogy, since laws and government don’t tell you how the world works; the world is either observable without explanation or left unexplained. In religion there’s no whole story: it’s all made up. It may contain some real-life parts, but it could do so without the religious parts. Real-life stories don’t make it more credible in sentences containing “god”. In fact, the quote of this subthread is wrong, false, debunked. There’s no need to look at it in context, because whatever role it plays there can’t make it look good. Looking at falsehoods “in context” and appealing to “deeper knowledge and proper understanding” is a beloved theme of religious manipulation.
I think the disconnect is that you seem to consider religious texts as dry statements of fact. That doesn't make any sense; they're clearly not that.
Would you say the same about great works of fiction, or old fairy tales that for some reason keep grabbing our attention and we repeat them for generations? That they're just falsehoods because duh, frogs obviously can't talk? Or can they have some deeper meaning? Stating facts is not the only way to describe the world.
> Would you say the same about great works of fiction, or old fairy tales
Do you think religious followers, such as Matthew, see god/heaven/etc as being merely a metaphor?
> That they're just falsehoods because duh
per previous poster: "laws and government don’t tell you how the world works"
Works of fiction don't purport to either. They might have morals, or subtexts, as much of the contents of the Bible does - but some things in there are meant to be at least partially literal, such as the existence of a divine being that created the world.
What's the greater message behind "God takes care of lesser creatures" when there's no proof of such a thing? That things will generally turn out alright if you don't plan ahead (demonstrably bad advice)?
> Do you think religious followers, such as Matthew, see god/heaven/etc as being merely a metaphor?
No, I'm not suggesting that. The alternatives to just reporting facts are more than "merely a metaphor".
> works of fiction don't purport to either. They might have morals, or subtexts
Disagree - I think they distill patterns from the factual and present them in the form of stories, encoded in the structure of the story. If you're a materialist you might say that the story is less true than the factual manifestations of the patterns, I'd say it's more true; and that it's telling something about the world.
> What's the greater message
We're debating if zero even exists, don't ask me about analysis ;)
Actually the purpose of that whole chapter is about not being a hypocrite, being authentic, not being greedy, and having faith. It's a quick < 5-minute read.
Yeah, a bunch of dudes created a book (which cost a fortune or two before the typewriter age) to tell everyone to be good just for the sake of it. As plausible as it can get. /s
It’s a medieval gaslightenment and it would be great if people kept it private at least.
PS. purpose is different from meaning, the latter is just a medium for purpose and may be arbitrary.
That would be fair if OO/FP posts had always mentioned that the ideas were borne out of academia and don't necessarily easily apply in other domains, but that ship has sailed..
> That would be fair if OO/FP posts had always mentioned that the ideas were borne out of academia and don't necessarily easily apply in other domains, but that ship has sailed..
OO has been widely adopted because it maps almost perfectly to most domains.
FP is starting to be adopted because first-class support from programming languages and frameworks lowered the barrier to entry by eliminating the need to implement all the primitives and infrastructure. FP is being adopted because it maps almost perfectly to some domains.
Not sure if you’re being serious here: the whole point of academia is to eventually be useful to the outside world. Something that applies to academia and nowhere else is kind of useless. The very reason we take science so seriously comes from how it helped us completely transform the world.
So the idea that something would only (easily) apply to academia because it was born out of it feels kind of ridiculous to be honest.
The "methodology industry-complex" thrives on consultants, authors, certification experts, courseware, etc. By definition it is funded by those who literally do not know and are looking for something better. So while we can be honest, there is a lot of money preventing that discussion from ever happening.
Huh? Coffee shops optimize for people not bumping into each other and having related items close together, and don't pretend to not know what kind of gear they have.. that's not a terrible analogy to the exact opposite argument.
I think the sentiment is that order and organisation are helpful in achieving goals and cultivating a good working environment, as opposed to a big mess. Analogies, just like abstractions, are leaky.
Yeah, but this one leaks a smart-matter paint that self-assembles into a shape of text saying "the order and organization is not the goal, but a consequence of ruthlessly optimizing for performance above all".
I'm wondering if the premise was that people compiling other people's code should be prioritized over "actual" users who develop their software using the compiler (who would then need to resort to external tools to do more complicated things, nullifying the protection anyway), or that developers are expected to routinely include code they don't trust?
Sure, many people do, and some don't. I suppose the point I'm trying to make is that decisions like that make one consider what the assumptions and target audience of a project are. In this case, as a potential user, if I'm looking for a language that's a deadly tool for doing difficult things, intentional limitations feel almost patronizing.
I've never done any audio programming, and I still can't make heads or tails of it. Is it expected that an application sets up some processing graph for an engine to play a sound? [1] Instead of writing samples to a buffer in a loop? Who wants that?
Also, while I don’t know the project yet I can see the benefit in building something from the ground-up in a way that embraces multi-process “in/out” pipes. Someone could have dozens of effects all connected to the audio signal, then run that to dozens or hundreds of outputs (streams online, multiple speakers, recording applications, analog mixer inputs...). Traditionally that kind of usage is fragile as memory corruption can happen at any stage since everything is writing/reading the same buffer. If they have a true graph implementation then applications could interact with the graph edges and the overall system would be much more robust.
And doesn't processing necessitate buffers? How does this approach reduce the number of buffers involved? If anything, it sounds like less sharing would mean more buffers.
I don't know of very many applications that use PipeWire directly (at least for audio); usually a program would target some other Linux audio API (ALSA, PulseAudio, JACK), for which PipeWire would provide the necessary plumbing and libraries to be a drop-in replacement for those older APIs.
> When we make statements such as the size of the set of all natural numbers 1, 2, 3... is the same as the size of the set of all natural even numbers 2, 4, 6..., despite the former containing the latter but not vice-versa... it seems the word "size" -- and associated terminology "larger than", "smaller than", etc. -- is a particularly unhelpful set of words to have chosen for this.
It seems to me that when you're counting things, you wouldn't care what specifically the things you're counting are; while in your example, it does matter for determining the subset relation. Any way of counting where it mattered would be kind of weird.
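To make that concrete (my own sketch, not anything from the thread): counting-by-pairing ignores what the elements actually are, which is exactly why the even naturals can be matched one-to-one with all the naturals despite being a strict subset.

```python
# Sketch: the map n -> 2n pairs each natural with a distinct even
# natural, so counting-by-pairing says both sets have the same "size",
# even though the evens form a strict subset of the naturals.
def pair(n):
    return 2 * n

# check on a finite prefix: the pairing never reuses an even number
images = [pair(n) for n in range(1, 1001)]
assert len(set(images)) == len(images)

# and every even natural produced is hit by exactly one n
assert all(m % 2 == 0 and pair(m // 2) == m for m in images)
```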
> it seems like an unwarranted leap to go from this formal comparison of cardinality of infinite sets, to the intuitive English-sentence idea that "almost all" real numbers are irrational
But the article uses "almost all" in the formal sense? Which, by the way, also has pretty intuitive meaning, in my opinion.
> But the article uses "almost all" in the formal sense? Which, by the way, also has pretty intuitive meaning, in my opinion.
By formal sense, do you mean everywhere but a finite-measure set? Zero-measure?
I'll use finite-measure because it allows for intuitive constructions like "almost all real numbers are outside the closed unit interval 0 <= x <= 1".
But, there are still some constructions that a layperson might expect to hold, like: "almost all real numbers have fractional part < 1e-100", or "almost all positive numbers are of the form x.y with 0.y < 1/x" (thanks, harmonic series).
I think that without formal training, we're especially bad at reasoning about dense sets such as the set of rational numbers, compared to, say, the reals.
> ... intuitive constructions like "almost all real numbers are outside the closed unit interval 0 <= x <= 1".
But that would be wrong, wouldn't it? I can produce all real numbers by pairing a finite number (in this case 3) of real numbers outside the set with each real number inside the set.
For each real x, in 0 <= x <= 1, we also have:
1/x (covers all real x, 1 <= x <= +infinity)
-x (covers all real x, -1 <= x <= 0)
-1/x (covers all real x, -infinity <= x <= -1)
The cardinality of all those reals outside of 0 <= x <= 1 is therefore 3x the cardinality of those inside 0 <= x <= 1, in this construction. But for infinite cardinalities the 3 can be discarded.
So there are exactly as many real numbers in 0 <= x <= 1 as outside it.
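For what it's worth, the three maps can be sketched as code (my own construction following the comment; boundary points like 1 and -1 overlap between the pieces, which doesn't affect the cardinality argument):

```python
# Each real y outside [0, 1] is matched with an x inside [0, 1]
# by inverting one of the three maps from the comment above.
def to_unit(y):
    if y > 1:
        return 1 / y    # inverts x -> 1/x, which covers (1, +inf)
    if -1 <= y < 0:
        return -y       # inverts x -> -x, which covers [-1, 0)
    if y < -1:
        return -1 / y   # inverts x -> -1/x, which covers (-inf, -1)
    raise ValueError("y already lies inside [0, 1]")

# every sampled y outside the unit interval lands back inside it
for y in (2.5, 1e9, -0.25, -7.0):
    assert 0 <= to_unit(y) <= 1
```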
I don't disagree that these sets are the same cardinality. But cardinality isn't the only way to describe the "size" of a set.
I suppose the typical measure-theoretic definition of "almost all" / "almost everywhere" insists on "everywhere but a zero-measure set", and you can't define a sigma-additive measure that treats intervals of finite Lebesgue measure as such while ascribing nonzero measure to sets of infinite measure.
But even so, the Lebesgue measure of R is infinite, while the same measure of the unit interval is 1.
If I do it the "normal" way (repeating the [0, 1] interval infinitely many times), there are infinite times as many reals outside the [0, 1] interval as there are in it.
But that "infinite times" is a countable infinity - the number of integers. How does "the number of reals in [0, 1] times the number of integers" compare to "the number of reals in [0, 1]"? Are they the "same" infinity?
What if we use rationals instead of reals? We can do the same x 3 thing, right? But the number of rationals is countably infinite, and "3 times countably infinite" is the same as "countably infinite times countably infinite", isn't it?
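(The "countably infinite times countably infinite is still countably infinite" part, at least, can be made concrete with the Cantor pairing function, a bijection N x N -> N - a quick sketch, names mine:)

```python
# The Cantor pairing function walks the grid N x N diagonal by
# diagonal, giving each pair (i, j) a unique natural number - so a
# countable product of countable sets is still countable.
def pair(i, j):
    return (i + j) * (i + j + 1) // 2 + j

def unpair(n):
    # invert by finding the diagonal w with w*(w+1)/2 <= n
    w = int(((8 * n + 1) ** 0.5 - 1) // 2)
    t = w * (w + 1) // 2
    j = n - t
    return w - j, j

# check: distinct pairs get distinct codes, and the codes invert
codes = {pair(i, j) for i in range(100) for j in range(100)}
assert len(codes) == 100 * 100
assert all(unpair(pair(i, j)) == (i, j)
           for i in range(100) for j in range(100))
```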
Maybe not the most formal of meanings, but my favorite is a probabilistic one: given a random element, how likely it is that it satisfies a predicate? If some elements don't, but it's still satisfied with probability 1, that's pretty clearly almost always.
EDIT: yeah you guys are right, I wouldn't worry too much about the prior not being a proper distribution, but still - this doesn't seem related to the cardinality of sets in a simple way after all!
If you are referring to a measure, then 'almost all' pretty much exclusively means everywhere except for a zero-measure set. I've never encountered another definition when dealing with measures.
But consider a different, entirely valid context of "almost all":
"Almost all natural numbers are greater than 10."
"Almost all prime numbers are odd."
If we wanted to extend this intuitively, we might want to support the statement that "almost all positive reals are greater than 10". One option of doing that is by using the nonstandard definition of "everywhere but on a finite-measure set".
Is the meaning unintuitive, or is it in fact simply that the world doesn't match your intuitions?
'cos I mean, I'm not a betting man, but "I bet mathematical facts are correct despite being unintuitive" makes my (winning, profitable) bet in December 2020 that Trump lost look like a true gamble by comparison.
Is it just me? This website is super laggy on my PC (which can run modern games just fine). Hardly hits 50 fps when scrolling. How slow can one get and still claim to be fast?
Further update - it turns out I'd turned off Hardware Acceleration in FF to avoid issues that Windows has when you have GPU accelerated content on two separate monitors with different refresh rates.
It's really bad with Hardware Accel turned off, but it's still a bit laggier with it on compared to chrome/edge.
Yeah, it's not really optimized for that. The floating labels over the code examples are offset in 3d space (for a subtle parallax effect while scrolling), so when HW acceleration is off it may end up repainting the page on every scroll. The effect is probably not worth the tradeoffs :)
That's a program that you can run on your machine to interoperate with your peripherals. If we have concerns about this being legal, our standards for ownership are way too low.
It effectively computes the bispectrum as B(p, q) = F(p)F(q)F*(p+q) and runs the inverse 2D FFT to recover the triple autocorrelation. The results are interesting, but not impressive, and very GPU intensive (NxNxlog(N) per frame is slow). In any case, I strongly believe that the bispectrum is hiding something interesting and I just haven't figured out how to see it.
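In numpy terms the computation would look roughly like this (a sketch under my own naming and normalization assumptions, not the actual shader code; it assumes a 1D real signal):

```python
import numpy as np

def bispectrum(x):
    """B(p, q) = F(p) * F(q) * conj(F(p + q)) for a length-N signal."""
    f = np.fft.fft(x)
    n = len(x)
    # index (p + q) modulo N to stay inside the spectrum
    idx = (np.arange(n)[:, None] + np.arange(n)[None, :]) % n
    return f[:, None] * f[None, :] * np.conj(f[idx])

# the inverse 2D FFT recovers the triple autocorrelation, which is
# real-valued for a real input signal
x = np.sin(2 * np.pi * 5 * np.arange(64) / 64)
tac = np.fft.ifft2(bispectrum(x))
assert np.allclose(tac.imag, 0, atol=1e-6)
```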
Cool! Maybe it's because it contains phase information, which is not that relevant to hearing. FWIW, I remember using it for image processing a long time ago, and the trick was to look at sections of it, i.e. TC(p1, p2) where one of p1 or p2 was fixed.
It feels like lisp - neat, maybe has pedagogical value, doesn't really solve any hard problems, and is usually not worth the overhead. Yet I still want to like it.
Not knowing what you consider a hard problem, I think most of them are not of the kind that a language can do much about. It can easily be a drag though, for example by being too high-level and taking control away from the programmer when it's needed.
I'm not saying it's necessarily a bad tool (lisp or GA). They are just tools, not magic, and tools don't solve hard problems.
Or perhaps, choosing the right tool can reveal that certain, select problems are not actually that hard.
And perhaps certain other problems, like computational geometry, are in fact just really fucking hard, no matter how you express them. Having some familiarity with the space, where the only really high-quality implementations are a massive GPL research code base in C++ (CGAL) and commercial C libraries (Parasolid, SMLib), I lean towards this view.
I have some familiarity with computational geometry myself. I've built myself the tools I need for it in pure Apple Metal, and it beats anything you can buy or get for free (for my particular needs).
Do you mean computational geometry for graphics applications (your mention of Metal suggests this)?
I'm referring to computational geometry for manufacturing and engineering simulation applications, which is an entirely different beast (in particular, accurately tracking topology is much more important, and generally requires arbitrary-precision floats for degenerate cases).
No, manufacturing and engineering, for example, computing the offset of a 3D-body represented by a surface. This benefited heavily from massive parallel computation via Metal.
I also implemented other algorithms from scratch for solid body operations, and here indeed arbitrary precision rationals were needed first, but then I could get it working with normal double arithmetic in a lot of cases later on; I didn't use Metal here though.
I find that libraries like CGAL etc. are just either too slow or not general enough for my purposes. The whole sector seems ripe for disruption via proper algorithms implemented on graphics cards.