I've used both. GoAccess tends to be more useful to me when I have a set of log files and want to track down something specific to the web server config. That's particularly handy when working on a server I'm not already familiar with, or when I have logs but no access to the server, since I don't have to run GoAccess on the server itself. It's also the only IIS log analyzer I bother using.
netdata [0] is more useful for tracking live stats across live servers that I'm actively maintaining (especially multiple servers), and it pulls a broader set of information from more sources, which is handier for tracking down intermittent networking issues.
They can both do a little bit of each other's job, which is nice. I tend to use GoAccess more often but think they're both pretty good tools for visualizing server data.
I'm having the hardest time getting GoAccess to work with IIS 8.0 W3C logs. It keeps complaining about the date/time format. What's your secret sauce?
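For reference, the GoAccess man page documents a predefined W3C format that covers IIS's W3C extended logs, which is usually what fixes the date/time complaints. A minimal goaccess.conf sketch (the date/time values below are the usual IIS fields per the docs; your field order may differ depending on which IIS logging fields you enabled):

```
# Minimal goaccess.conf sketch for IIS W3C extended logs.
# "W3C" is a predefined log-format name in GoAccess; the date/time
# formats below match the typical IIS W3C fields (IIS logs in UTC).
time-format %H:%M:%S
date-format %Y-%m-%d
log-format W3C
```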
This looks quite useful, and I think I'll be using it. I'm impressed that it's written in C, especially since it can also run in the browser. Also, the default storage is hash tables; you don't see many modern projects like this! I actually feel like diving into the code.
I've been using GoAccess for a year now as a traffic analysis tool on my blog. It's been great. Easy to use, and great for real-time analysis using the command-line interface.
Yes, it supports custom log formatting, but you'll likely have to write a bit of parsing code in the config; see the docs here: https://goaccess.io/man#custom-log
There are also a lot of log formatting q&a in the github issues (both closed & open) if you're using a popular format.
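As a concrete sketch, here is the predefined COMBINED format spelled out with GoAccess's own format specifiers (from the man page linked above), which is a good starting point to adapt for a custom format:

```
# goaccess.conf: a custom log-format equivalent to the predefined
# COMBINED format. %h = host, %d = date, %t = time, %r = request line,
# %s = status, %b = bytes, %R = referrer, %u = user-agent,
# %^ = ignore this field.
date-format %d/%b/%Y
time-format %H:%M:%S
log-format %h %^[%d:%t %^] "%r" %s %b "%R" "%u"
```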
Just the other day, combined with AS/IP-block info and `rgxg cidr`, it helped me see at a single glance (via the relative count of 3xx vs. 2xx responses) some redirect deficiencies in a platform's HTTP image fetchers. Since you can pipe into it, it complements the other text utilities for filtering logs very well.
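A sketch of that approach: `rgxg cidr` expands a CIDR block into a regex, and grep uses it to pull that range's requests out of an access log before the lines ever reach goaccess. The CIDR, sample lines, and paths here are made up for illustration:

```shell
# Made-up sample log: one request from inside 203.0.113.0/24, one from outside.
printf '%s\n' \
  '203.0.113.7 - - [10/Jun/2023:12:00:01 +0000] "GET /img/a.png HTTP/1.1" 302 0 "-" "bot/1.0"' \
  '198.51.100.9 - - [10/Jun/2023:12:00:02 +0000] "GET /img/b.png HTTP/1.1" 200 512 "-" "bot/1.0"' \
  > /tmp/access.log

# rgxg cidr prints a regex matching every address in the block; anchor it
# to the client-IP field at the start of the line. (Skipped if rgxg is
# not installed; the filtered output would then be piped into goaccess.)
if command -v rgxg >/dev/null 2>&1; then
  grep -E "^$(rgxg cidr 203.0.113.0/24) " /tmp/access.log
fi
```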
I use `find` to choose specific months and pipe the log files into goaccess (if you have lots of logs, that saves time instead of piping everything). Assuming the logs are in a bunch of gzipped files:
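A sketch of that find-and-pipe approach; the directory, rotation naming, and month are assumptions, so adjust them to your own log layout:

```shell
# Made-up sample: one gzipped rotated log file for June 2023.
mkdir -p /tmp/goaccess-demo
printf '127.0.0.1 - - [10/Jun/2023:12:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "curl/8.0"\n' \
  | gzip > /tmp/goaccess-demo/access.log-20230610.gz

# Match only June 2023 files and decompress them; with lots of rotated
# logs this avoids piping every month through goaccess.
find /tmp/goaccess-demo -name 'access.log-202306*.gz' -print0 \
  | xargs -0 zcat > /tmp/goaccess-demo/june.log

# Feed the selected month to goaccess (skipped if it isn't installed).
if command -v goaccess >/dev/null 2>&1; then
  goaccess /tmp/goaccess-demo/june.log --log-format=COMBINED \
    -o /tmp/goaccess-demo/report.html
fi
```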
Indeed, most of the GoAccess docs around pipes provide examples for using them to backfill missing expected features, like filtering by date range, file/MIME type, status code, etc.
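One example of that pipe-filtering pattern: in a combined-format line the status code is whitespace-separated field 9, so awk can select one status class before GoAccess ever sees the data. The paths and sample lines are made up for illustration:

```shell
# Made-up sample log with one 200 and one 404.
printf '%s\n' \
  '10.0.0.1 - - [10/Jun/2023:12:00:00 +0000] "GET /a HTTP/1.1" 200 100 "-" "-"' \
  '10.0.0.2 - - [10/Jun/2023:12:00:01 +0000] "GET /b HTTP/1.1" 404 0 "-" "-"' \
  > /tmp/access-sample.log

# Keep only the 404s ($9 is the status field in combined format).
awk '$9 == 404' /tmp/access-sample.log > /tmp/404s.log

# Hand the filtered subset to goaccess (skipped if it isn't installed).
if command -v goaccess >/dev/null 2>&1; then
  goaccess /tmp/404s.log --log-format=COMBINED -o /tmp/404s.html
fi
```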
Should I keep log files if I host a static (very high-traffic) site? Everything is static. I've turned logging off; I have no idea what performance gain that brings, if any, but is there a reason I should write them to a file? I don't see any need.
[0] - https://github.com/firehol/netdata