Interestingly, I've gotten used to checking $TERM and making sure it's set due to entering Docker containers to poke around them: in a lot of images, it's unset, and you get really old-school terminal emulator behaviour without it!
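A quick sanity check along those lines, as a sketch (the fallback value is just a common choice, not the only correct one):

```shell
# Inside a container shell: with TERM unset, line editing, colors, and
# full-screen programs like less/vim fall back to dumb-terminal behaviour.
if [ -z "$TERM" ]; then
  export TERM=xterm-256color  # widely supported default
fi
echo "TERM is: $TERM"
```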
Nice screens and reasonable processing power, but they limit the RAM to make you want to spend the $500-750 for the "power" of... 8 GB of RAM, bundled with the bloat of a (to me) useless Windows license.
How long is the Chromebook spec going to stay like this? It's awful and it's holding them back.
It's nonsensical in 2018 to have this little RAM on a laptop, Chromebooks included, but there it is still.
There are models with 8 GB. With the expanded scope (Android, Linux), there will be pressure to increase RAM.
Just as Android already increased the available storage: three years ago people would have laughed about a Chromebook with 256 GB of storage (what's the point?), and now there's the Pixelbook.
HP Chromebook G1 goes up to 16GB with the M7 model. I have the M3 one with 4GB RAM and it works well for me with either ChromeOS + crouton or plain Ubuntu 18.04. It should get Linux app support too, it's a fairly recent device.
It was a shame that most people, the Eve devs seemingly included, focused on making Eve a language for easy programming for the masses. That's a huge wall of a problem to climb with a limited funding runway.
I think that if there had been more upfront work and focus on the distributed programming/database possibilities, Eve would have found an actual community to push it along.
Eve as a language for doing business logic against, say, Kubernetes? As a language to wire up ETL processes?
I think it could have found a niche that would have made it grow in power before trying to take on the mass market.
I assume the Lineage-Driven Fault Injection work [1] has some overlap with Eve's ability to tell you "Why is this blank?": the datalog model allows you to find the logical dependencies of results.
Some other Bloom-related links:
- Anna KVS [2] showed up recently on Hacker News [3] and the Morning Paper [4]
- Lasp lang is in the same space [5]; Christopher Meiklejohn has a comparison with Bloom [6]
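The "why is this blank?" idea above can be sketched as a toy rule evaluator that records which input facts support each derived fact, so an absent result can be traced to the missing input. This is a minimal illustration with hypothetical names, not Eve's (or the LDFI tooling's) actual implementation:

```python
# Toy datalog-style derivation with provenance tracking (hypothetical sketch).
facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

def grandparents(facts):
    """Derive grandparent facts, recording the parent facts that support each."""
    derived = {}
    for (p1, a, b) in facts:
        for (p2, b2, c) in facts:
            if p1 == p2 == "parent" and b == b2:
                derived[("grandparent", a, c)] = [("parent", a, b), ("parent", b, c)]
    return derived

def why(goal, derived, facts):
    """Explain why a derived fact is present, or why it is blank."""
    if goal in derived:
        return f"derived from {derived[goal]}"
    _, a, c = goal
    mids = [b for (p, x, b) in facts if p == "parent" and x == a]
    if not mids:
        return f"blank: no ('parent', {a!r}, _) fact exists"
    return f"blank: have parent links from {a!r} to {mids}, but none from there to {c!r}"
```

Asking `why(("grandparent", "alice", "dave"), ...)` then reports which half of the join is missing, which is the flavor of answer Eve's debugger and lineage analysis can give.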
I agree that the incidentals like tracking and form will matter a lot.
I think however there is no "proof of existence" of any software or capability that would compel even early adopters to use a 100x100 display, even with all the other physical issues being polished to a high sheen.
Looking down at your phone or smartwatch seems far more likely.
The "maybes" that people frequently suggest require a staggering amount of unwritten software infrastructure (sometimes up to and including sentient AI).
I suggest you spend some time reading his pretty extensive set of high quality articles on many companies and aspects of AR.
This is an industry where outsiders have a very limited understanding of the very serious problems in front of AR.
It's also an industry that is taking advantage of the very limited understanding of just about everyone not in it to hype things far beyond their capabilities.
If you care about AR in even an offhand way you should spend some time self educating and Karl Guttag's site is a good way to get started.
It's a very interesting field, but there are tremendous challenges in front of it, and it serves no purpose to just assume the companies involved have actual magic leaps that solve the problems.
I've read his site, but he focuses way too much on aspects of AR that have little to do with consumer needs or concerns. A "worse is better" AR display that violates pretty much every concern he has could win the market as long as it was done really well in terms of comfort and applications, the same way the limited Apple Watch pretty much dominates the market today, even though it falls far short of the kinds of magic people thought it would be capable of originally.
It's also entirely possible they offer partial solutions to some problems, no solution to other problems, and still produce an experience worth paying money for.
That's a natural argument extrapolating from the phone, electronics and laptop market.
I think it is wrong in this case, however. As I understand it, the problem is that "half-working" in the case of VR/AR/etc. isn't just "less convenient", something early adopters can simply put up with. Half-working VR can sicken a person and injure the visual system.
That is a pretty bad article about VR that might have been excusable in 2013 but is not on firm ground in 2017 much less 2018.
Broadly, almost everything in it has either not been borne out (vision concerns) or turned out to be less of a problem than first assumed (motion sickness) once the limitations of different users were understood.
Many people using VR find that the vestibular concerns are not an issue over time as long as the framerate of the system is high enough (90fps+). People actually do get their "sea legs" over time and many (largely gamers so far) ask for locomotion systems that were considered terrible ideas even 2 years ago.
Yes, this article seems to be redefining "web brutalism" as a simple recoloring of the same wretched, over-designed, unusable webpages that "web brutalism" was a reaction to.
Thank you for writing this, it was just what I wanted :)
What is your sense of the latency guarantees/scheduling for user-space graphics and input? This is an area where everything else kind of fails. "No, drawing the ground plane every possible frame for the user you have in your VR grasp is not optional."
No idea about the latency guarantees or scheduling. I've mostly just read and summarized some of the overview docs, and there's been nothing written on scheduling or latency that I've found yet.
It looks like the isolation of different applications, and capability-based security, is pretty baked in. There do seem to be some TBD parts: for example, right now, just as on Android, apps can have access to either all of /data or none of it, and fixing that is something they list as wanting to do but haven't yet.
It seems like it will cause a lot of drama as this actually rolls out.
https://github.com/golang/go/issues/33980