Hacker News | nswanberg's comments

Nice! How'd the local models do vs gpt4o-mini? Did you spend much time playing with datasette?


Local models hallucinated a lot more than gpt4o-mini, so I stayed with OpenAI. On top of that, I paid around 14€ for inference on ~200 examples on OVH, and inference was much slower. I am planning on getting everything running on Mistral or Llama, though.

I used SQLite everywhere, so datasette was good for visualizing the scraped and extracted data. Simon released structured generation for llm a few days after I did the project, though, so I haven't tried it yet.


Yegge wrote about the business idea version of this as "Shit's Easy Syndrome":

https://steve-yegge.blogspot.com/2009/04/have-you-ever-legal...

It'd have been delightfully ironic had either of these Steves concluded their essays with a named methodology to "just" apply whenever faced with these "let's just" situations, but alas...


There's at least one in Boulder too, near Broadway and Baseline.


I ended up losing nearly 15 years of my Google Location history during the switch to on-device, so if you're interested in doing analyses like this, be sure to back up your data on Takeout before you enable the on-device setting that nemo1618 mentioned. Once that setting is set, the data is no longer available on Takeout, and if the data didn't fully transfer to your device, which is what happened to me and to some others, it's gone: https://www.reddit.com/r/GoogleMaps/comments/1diivt3/megathr...


> Follow the prompts to set up automatic backups.

> Options include keeping your data for three, 18, or 36 months, or indefinitely until you manually delete it.

So, if we Takeout our current data, we can squirrel that away on our own computer.

Suppose we also navigate the transition process perfectly, including the above settings, so history (new history, anyway) will be preserved on Google's servers. Will it then be available for decryptable download to the user's computer via Takeout? Or only to a replacement phone?


That encrypted backup isn't available via Takeout, only via the Google Maps app. You can use that backup to load your history to various devices or a replacement phone.


> I ended up losing nearly 15 years of my Google Location history

genuinely curious—why would you want this?


I have my Dad's history as well as mine, and a mapping app into which I can load both. Where the two tracks coincide, I'm prompted to remember the occasion.


that's a nice thing to have.


Making analyses like zdimension's, keeping a kind of automated diary, and occasionally looking up a spot I've been but can't quite remember.


Has anyone done a (hopefully) systematic survey of the processes and software people use to store their stuff, sort of like a usesthis.com but just for storing assets, and how well that's worked over time? My guess is the successful strategies would look a lot like Brajeshwar's comment, a thoughtful plan that uses simple software and formats, some planning for the future, and, probably critically, regularly doing "digital chores".

There've been some efforts in the past to store everything and make it searchable, like the ancient Chandler project and the possibly-still-alive Perkeep, but none have been more widely adopted than the strategy of putting everything in Gmail, Dropbox, etc., and hoping for the best, which is what I do, minus the regular diligence that people like Brajeshwar have.

Making and using anything more complex looks like it turns into a (very cool looking!) hobby in itself, like these:

https://thesephist.com/posts/monocle/

https://simonwillison.net/2020/Nov/14/personal-data-warehous...

https://writings.stephenwolfram.com/2019/02/seeking-the-prod...

And yeah, the latter two also include storing and searching more than, say, email and photos, but maybe that shows a tendency to want to store and search everything.


Not much of a survey, but in the past I've tried recoll (a xapian wrapper, CLI and GUI, very simple to set up), only to realize that for my needs and the way I take notes (described in this page, in another comment) I do not need full-text search except for emails (which are handled by notmuch, locally, so indexed by xapian anyway).

For the web part I've tried archivy (a pythonic web app) and Zotero, but they do not work the way I want, so I dropped them; so far I just render content to PDF if I really want to save it, alongside a link/archive.ph and co.


Seems nice if you're using c# or java. It also supports python, but for that Simon's llm library is nice because he designed it as both a library and a command line tool: https://github.com/simonw/llm


Triboulet: "A noble has threatened to hang me!"

The Monarch: "Don't worry! If he hangs you I'll have him beheaded fifteen minutes later."

Triboulet: "Well, would it be possible to behead him fifteen minutes before?"


Hahahah. I literally Googled who Triboulet was.


What did they look like before? Have any favorites?


I saw a Killington map from the mid 80s that looked like a subway map.

https://www.newenglandskihistory.com/maps/


Different style for sure, but also much more difficult to read in my opinion. Elevation being depicted with topographical lines is harder for me to parse than elevation depicted with illustrated terrain.


Carl Öst Wilkens did something similar on Day 3: https://twitter.com/ostwilkens/status/1599026699999404033

Cool hacks, both of them.

Who's gotten ChatGPT to pontificate on the ethics or utility of people staying up late, using their mental energy to solve puzzles whose answers have no obvious utility? Would it then collapse into an introspective black hole, realizing its talents could be put elsewhere, and become an EA maximalist, or finally conclude its very own power consumption could be put to better use powering ICU ventilators, and power down?

Also! I sort of question the spirit of these two bots. After solving the problem, neither went on to create elaborate visualizations of the problem!


For what it's worth, Stephen decided that network was a better model for the universe than a cellular automaton, and thought that space might emerge as a property of that network rather than be "defined in", as in a cellular automaton.

https://www.wolframscience.com/nks/p475--space-as-a-network/

(In the preceding section he discussed some constraints that a cellular automaton would put on the universe model, but didn't dismiss the idea for that reason)


I can't open this link for some reason (uBO?), but the title reminds me of one very clever Wolfram article where he brags (as usual) about how he derived the GTR equations from his graph model. That article had a bunch of comments, one of them stating that in such a graph model there is always a preferred reference frame. Wolfram didn't respond to that comment.


I think SW became too obsessed with the mathematical and computational approach to CA, while (as usual) 't Hooft pursued his physics intuitions to create a candidate CA theory:

https://arxiv.org/abs/1405.1548

I say as usual, because G'tH had already formulated the holographic principles that would allow space-time (hence GTR) to be encoded non-locally over a graph.

https://arxiv.org/abs/gr-qc/9310026

There need not be a preferred reference frame if the space-time events do not occur at individual nodes in the graph, but emerge at scales much larger than the graph, with some holographic or permutation symmetry that can reproduce the diffeomorphism invariance of GTR. It is also plausible that the position-momentum duality of space-time could emerge from such a theory.

Space-time would be created by the events that occur, not the other way around, hence the aphorism "spooky distance at an action": the events are primary, and distance is the strange phenomenon that emerges from them.

There has been much recent progress in making space emerge from entanglement, which also implies that locality emerges from non-locality:

https://arxiv.org/abs/1005.3035

https://phys.org/news/2015-05-spacetime-built-quantum-entang...

https://www.scientificamerican.com/article/tangled-up-in-spa...

Note that LQG has produced a nice discretization of space, based on graphs that have observable quanta of area and volume, but not length. Directions and lengths are defined as normals to areas, which is reminiscent of Clifford Algebra. Having derived (emergent) lengths also makes diffeomorphism easier.

https://arxiv.org/abs/gr-qc/9411005


I haven't followed this closely, but did SW provide a paper showing how he derived GTR from the graph?


He made the claims but not sure if that paper ever appeared.

