If you want to read more about the topic and about the readiness of the Swiss army, I can recommend a short book by John McPhee called La Place de la Concorde Suisse.
Creator here. I just woke up, and it's funny to see PizzaGPT on the front page of HN.
I had the idea yesterday, when reading about ChatGPT being blocked in Italy. I’m an Italian living abroad so I still have access to it, but my parents and friends living there don’t.
I believe AI is a revolutionary tool that should be available to anyone, the same way the internet has been.
The website was coded yesterday in a couple of hours, so it’s full of bugs and potential improvements. But I just wanted to ship something quickly.
For the stack I used Nuxt 3, Tailwind and DaisyUI.
Regarding the AI, it’s literally just a wrapper of the OpenAI completions API (gpt-3.5-turbo model) and a chat interface, so it should give the same answers as the free version of ChatGPT.
I am still a bit afraid of the costs if many people start using it, so I’ve added a donate button. You can donate the equivalent of a pizza (pizzaware model?) to help keep it running
> Regarding the AI, it’s literally just a wrapper of the OpenAI completions API (gpt-3.5-turbo model) and a chat interface, so it should give the same answers as the free version of ChatGPT.
Have you tested this? My impression after a couple of tests is that its answers are worse than free ChatGPT (no idea why).
Nice to meet you, George. How can I assist you today?
> What is my name
As an AI language model, I don't have access to your personal information such as your name. Can you please tell me your name?
You need to pass the whole chat history in your request:
[{:role "assistant", :content "Ciao, sono PizzaGPT come posso aiutarti?"}
{:role "user", :content "My name is George"}
{:role "assistant", :content "Nice to meet you, George. How can I assist you today?"}
{:role "user", :content "What is my name"}
{:role "assistant", :content "Your name is George."}]
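In case it helps anyone reading along, here's the same idea as a minimal Python sketch (names and the request-building are illustrative, not PizzaGPT's actual code): you keep the whole message list and resend it with every request.

```python
# Illustrative sketch: the full conversation is resent on every call,
# which is how the model "remembers" earlier turns.
history = [
    {"role": "assistant", "content": "Ciao, sono PizzaGPT come posso aiutarti?"},
    {"role": "user", "content": "My name is George"},
    {"role": "assistant", "content": "Nice to meet you, George. How can I assist you today?"},
]

def build_payload(history, user_text):
    """Append the new user turn and build the request body."""
    history.append({"role": "user", "content": user_text})
    return {"model": "gpt-3.5-turbo", "messages": history}

payload = build_payload(history, "What is my name")
# POST this as JSON to https://api.openai.com/v1/chat/completions
# with an "Authorization: Bearer <your API key>" header.
```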
I haven't really had much time to test the results.
Yeah it should be possible to tweak the settings (mostly temperature and prompt) to get better results.
Also, I think I’m not sending the full history right now, to avoid consuming too many tokens.
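For what it's worth, one simple way to cap token usage (purely a hypothetical sketch, not how PizzaGPT actually does it) is to keep only the most recent turns under a rough character budget:

```python
def trim_history(messages, max_chars=4000):
    """Keep the most recent messages whose combined length fits the budget.

    Always keeps at least the latest message, even if it alone exceeds
    the budget, so every request still contains the current user turn.
    """
    kept, total = [], 0
    for msg in reversed(messages):
        total += len(msg["content"])
        if total > max_chars and kept:
            break
        kept.append(msg)
    return list(reversed(kept))
```

A real implementation would count tokens (not characters) and probably always keep the system prompt, but the shape is the same.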
It's confusing because it's actually called the Chat Completions API :) https://platform.openai.com/docs/guides/chat
It's the one used by ChatGPT, and it supports gpt-3.5-turbo and gpt-4.
Recently I started using colima[0], a drop-in replacement for Docker Desktop on Mac, and have seen an increase in performance and battery life. You can use all the normal docker and docker compose commands.
It does not have a GUI but you can use the Docker extension on VS Code to have an overview of running containers.
I replaced Docker Desktop with `colima` as well a few months ago, and I've been using it daily since then. I haven't had any issues; sometimes I just delete / start an instance to upgrade the docker version, which only takes a few minutes.
I like the fact that I decide when I upgrade, not Docker Desktop nagging me every week.
Looking to convert, but I still can't understand how this is more performant. Docker Desktop has lots of engineering going into performance crossing the host/VM barrier. IIRC lima just pipes over SSH. How could that be faster?
Docker Desktop is an Electron app, which might explain some of the performance and battery differences. The containers don't run in Electron, but that extra copy of Chromium is always running in the background.
I only use Docker Desktop for one thing - to see if one of my containers has accidentally started itself as amd64 instead of arm64. Sadly Colima doesn't seem to provide a way to do that.
I found that even with ARM images, Docker containers were already slow as it was.
I also never understood the justification for the added complexity it created, but I also don't have a dedicated ops team at my job to solve my problems.
My personal experience: earlier this year at work, we migrated everyone to colima, and I had to support devs with their issues. So many small issues kept popping up; it was definitely not a drop-in replacement for us.
The higher ups eventually let us just buy docker desktop and we are all happier now.
Please help me understand: why is `brew install docker` not sufficient, and why do you also need Colima or Docker Desktop? Is it that they provide the docker _daemon_, which `docker` doesn't ship with?
macOS does not support running docker containers (or vice versa, depending on your point of view). Instead you need a VM running Linux; Docker Desktop / Colima runs this VM for you.
I assume `brew install docker` just installs the docker CLI/etc, which can run on non-Linux OSes. However the docker daemon can only run on Linux, so something needs to setup a VM for it.
Homebrew handles this kinda poorly, so people are often confused. `brew install docker` installs the Docker CLI. `brew install --cask docker` installs Docker Desktop, and if you've permanently tapped homebrew-cask you'll get that instead of the Docker CLI.
Same. My requirements are very basic, so the switch to colima was basically seamless. I also appreciated being able to avoid Docker Desktop constantly trying to update itself (which is what ultimately motivated me to make the switch).
As a bonus, you can install the Docker CLI (e.g. `brew install --formula docker`) and use that to interact with any containers you start with colima.
Switched to colima a while back after the licensing debacle and have been mostly happy with it. Only real issue has been with some tools making assumptions about the docker socket location, which was easy to fix in the config file.
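For anyone hitting the same thing: one common fix (besides editing the tool's config file) is to point tools at colima's socket via the standard `DOCKER_HOST` environment variable. The path below is colima's usual default; check `colima status` for the actual value on your machine.

```shell
# Tools that hardcode /var/run/docker.sock can usually be redirected
# with the standard DOCKER_HOST environment variable:
export DOCKER_HOST="unix://${HOME}/.colima/default/docker.sock"
```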
Thanks for sharing; I was missing something like that. Docker Desktop is too enterprisey for my taste. I'm using a VPS with docker right now, which works well enough, but having no volume mounts is not very nice.
It’s been a real doozy for me too, doubly so for local k8s in Docker (kind/k3s). I’ve tried a whole lot of variations; it’s hard, and even harder to scale across hundreds of devices.
Not OP, but if it's helpful, I use containers for all 3rd party services I need to run for development. (e.g. Postgres, Redis, Localstack, etc). This makes it easy to onboard new developers as they just have to run `docker-compose up` and not worry about those. It also allows me to easily use different versions of those services in different projects or even branches of a project.
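For example, a hypothetical `docker-compose.yml` for that kind of setup might look like this (service names and versions are just illustrative; pin whatever each project or branch needs):

```yaml
services:
  postgres:
    image: postgres:15          # pin the version per project/branch
    environment:
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```

New developers just run `docker-compose up` and all the third-party services come up together.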
Somehow colima and my corporate VPN (using vpnc) keep deleting each other's routes (colima loses network access when you turn off vpnc); neither podman machine nor Rancher Desktop has this issue.
“improves battery life by up to 0.5%”
To me this line is quite ridiculous.
We all know that Chrome is now the main culprit for battery consumption on any laptop, and it could be optimized much more.
You mean the total energy saved across all devices? The only context in which anybody could give a fig about that is as it relates to total global energy consumption, of which it is an utterly negligible speck.
Edit: Had a go at putting some numbers on that for fun, and will partially retract my comment (no I don't, see below).
If it takes [1] around 0.01kWh to charge a smartphone, there are [2] around 6 billion smartphones, each is charged once a day and this saves 0.05 (wrong, see below) of that usage, the saving is on the order of 3GWh/day, i.e. around 125MW.
Total global energy consumption is [3] around 20 TW, so the saving is around 0.0006% of it, and this is a very generous estimate (Chrome isn't all the consumption of a smartphone, not all phones are in active use and charging once a day, etc).
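As a sanity check, the arithmetic above (same assumptions, including the 5% figure that's corrected below) works out like this:

```python
# Back-of-envelope check of the figures above.
kwh_per_charge = 0.01       # [1] rough energy per smartphone charge
phones = 6e9                # [2] smartphones in use
saving_fraction = 0.05      # the 5% used above (0.5% would be 0.005)

daily_saving_kwh = kwh_per_charge * phones * saving_fraction  # ~3 GWh/day
avg_power_mw = daily_saving_kwh / 24 / 1000                   # kWh/day -> MW
share = (avg_power_mw * 1e6) / 20e12  # [3] fraction of ~20 TW global consumption
print(f"{avg_power_mw:.0f} MW, {share:.4%} of global consumption")
# prints "125 MW, 0.0006% of global consumption"
```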
That said, in absolute terms it's more than I would have guessed - comparable to the electricity consumption of a small town (no, much less, probably less than the output of a single wind turbine - see below).
Everything is negligible in comparison to global energy consumption, though, because we use energy for so many different things.
If everyone who is in a position to significantly reduce the energy consumption of their own small part does so, it will add up to a significant effect.
I don't disagree at all, but the comment I was responding to was saying that globally it's "massive" which it really isn't. Every little helps, but this is little.
Yes, Google (and other global companies) actually care about that. They work on fleet-level changes that make such differences worth it. It might not matter to you, though.
What fleet-level changes are you referring to? The energy saving is certainly a nice thing (environmentally if nothing else) but it's not energy they are paying for. I don't see how it would change Google's operations in any way.
The remote controller is just radio, not Bluetooth. But inside the board there is also a Bluetooth module that connects just to a smartphone app. Data remains in the app.
You can then decide to upload it to a web service in case you want to see the data on an interactive map or you want to share it.
Anyway, it's just riding data; it's not associated with any of your personal information.