
The user agent is such a mess; why should any website know all that? Why should a website know anything about the visiting guest? They should be using feature detection instead. Let's get rid of the user agent, or just put "Mobile/phone", "Desktop" or similar in it. Maybe OS and a short browser name and major version number for statistics.
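For anyone unfamiliar with the idea, feature detection means asking the browser at runtime whether it supports a capability, instead of guessing from the UA string. A minimal sketch (the particular probes here are illustrative examples, not a complete capability list):

```typescript
// A minimal feature-detection sketch: ask the environment what it can do
// instead of parsing navigator.userAgent. The window object is passed in
// so the checks are testable; the probe list is illustrative only.
function detectFeatures(win: any): Record<string, boolean> {
  return {
    intersectionObserver: typeof win.IntersectionObserver === "function",
    webrtc: typeof win.RTCPeerConnection === "function",
    localStorage: (() => {
      try {
        // Access can throw in some private-browsing modes, hence the try/catch.
        return typeof win.localStorage === "object" && win.localStorage !== null;
      } catch {
        return false;
      }
    })(),
  };
}
```

In a real page you would call `detectFeatures(window)` and branch on the result, rather than matching browser names.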


> why should any website know all that?

As a developer:

Without the user agent: how would I easily detect which browser breaks a certain feature on my project?

If I deploy a new feature and see through logging that browser X is not able to do Y, then I can install X on my machine, test, and fix it.

If I don't have a user agent, I can only detect that after the deploy there are more cases where Y fails, but I don't know which browser is responsible.
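The workflow described above amounts to grouping client error reports by browser family. A sketch of that aggregation (the report shape and the regexes are assumptions for illustration; real UA parsing has many more cases):

```typescript
// Sketch: group client-side error reports by browser family so a
// post-deploy failure in feature Y can be traced to a specific browser.
interface ErrorReport {
  feature: string;
  userAgent: string;
}

function browserFamily(ua: string): string {
  // Order matters: Edge's UA contains "Chrome", and Chrome's contains "Safari".
  if (/Edg\//.test(ua)) return "Edge";
  if (/Firefox\//.test(ua)) return "Firefox";
  if (/Chrome\//.test(ua)) return "Chrome";
  if (/Safari\//.test(ua)) return "Safari";
  return "Other";
}

function failuresByBrowser(reports: ErrorReport[], feature: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of reports) {
    if (r.feature !== feature) continue;
    const fam = browserFamily(r.userAgent);
    counts.set(fam, (counts.get(fam) ?? 0) + 1);
  }
  return counts;
}
```

If one family dominates the counts for a broken feature, that's the browser to install and debug against.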


As a developer: if we actually pushed browsers to fix things, you wouldn't need to worry about that. Why should the job fall to you to work around their shitty implementation of the spec?


Because when management asks you why their site that they paid hundreds of thousands of dollars for doesn't work on <insert major browser here>, your answer can't be "the browser's implementation of the spec is shitty, blame them." Your answer is going to be, "Yeah, sure, let me fix that."


> your answer can't be "the browser's implementation of the spec is shitty, blame them."

If it's a major browser which management cares about, then you should be testing with it already. If you're not, then logging user agent strings isn't going to help.

Logging user agent strings would help if, for example, an unexpectedly large proportion of users are using a "non-major" browser on which your site is broken.

If the proportion is small, management won't care.

If the proportion is expected, then market/demographic research is partly to blame; update the spec.

If the browser is "major", you should be testing with it anyway.

If the site isn't broken, there's no problem.


I see what you're saying, but unfortunately, especially in enterprise, the browser version is often locked to something quite old. One of our clients has locked to Chrome 48.

Even if Chrome followed the spec to a T, programmers still write bugs. So, I'm not going to expect a browser (at least) 15 versions old to behave perfectly. And we all know that the spec isn't perfectly implemented.

So, no. Unfortunately sometimes there are things that will make management care a lot about a browser that they really shouldn't.


> Unfortunately sometimes there are things that will make management care a lot about a browser that they really shouldn't.

I never said management should or shouldn't care about this or that browser. I never said anything about browsers being new or old.

I said that developers should be testing with whatever browsers management cares about. If management cares about a browser, and there's some justification, then add it to the spec.

> unfortunately, especially in enterprise, the browser version is often locked to something quite old. One of our clients has locked to Chrome 48.

That's an excellent justification for having Chrome 48 compatibility as part of the spec, so you should already be testing your sites with it. What has that got to do with user agent strings?

Is Chrome 48 even old? I tend to ensure IE6 compatibility, unless I have a good reason otherwise (e.g. voice calls over WebRTC, or something). When I'm using w3m, e.g. to read documentation inside Emacs, I occasionally play around with my sites to ensure they still degrade gracefully.


Well, your answer should be: I'll reimplement it using known, simple, stable technologies. But for some reason our industry hates those things.


So... not the web, then? :-P


Management is not some all-powerful, all-knowing spectre impervious to persuasion. Don't give up so easily.


Don’t forget “Also, it’s going to cost $X more.”


Because 100% of implementations are differently shitty. There's no amount of "pushing browsers to fix things" that is going to catch 100% of novel interactions resulting from different combinations of the declarative HTML and CSS languages out in the wild (especially when JavaScript then comes along and moves all those declarations around anyway).


Sure, and the browsers that stray further off will get used less and die off.

And if you are using the latest and "greatest" JS features, you have to expect the failures that happen. If you enjoy sitting on the bleeding edge, don't complain about getting cut.

If you implement features using known, simple and stable tech, things will generally work great without needing to worry about special cases.


"Yeah sure thing boss. I'll get on the phone to Microsoft and ask them to fix that issue in IE8 that you insist needs to be supported."


So you think a better way is spending your evening trying to fix your square pegs so that they fit in round holes?

Why would you willingly do that to yourself? If we pushed browser developers to actually do their job, they wouldn't be throwing their weight around like they do now.


Yeah why ask to be empowered to fix your own problems when you could just beg someone else to fix them?


How is a browser not rendering correctly not their problem?


Who said it's not their problem? But waiting on them means relying on someone else to fix something that you could fix yourself. If I need to get somewhere, it doesn't matter that my car's engine is broken because the manufacturer botched the assembly; I just need a working car. I can sit around whining about how awful the car company is, but it doesn't get shit done fast.

Extreme ownership of problems. It's a really helpful concept. You'll stop trying to blame people all of the time for things that you can control and find solutions for them instead. On top of that, if you can't control it you can let it go as something that you can't fix.


If getting shit done fast is your goal, then you are gonna get burned, and I have very little sympathy for you. We should be focusing on getting shit done solid. If it's such a big deal that something works, why build unstable systems in the first place?

If you need your car to be reliable, don't bolt experimental features onto it, and test it before you need to take it on the road.


Not exactly related to your point about the user agent giving all kinds of arguably unnecessary information, but there's an interesting write-up about why the core user agent string is the mess it is, for those who haven't seen it already.

https://webaim.org/blog/user-agent-string-history/


The user agent string definitely has a place on the web, the problem is that it's been used and abused by web developers in the 90s and 2000s when trying to deal with the utter mess that was "browser compatibility" back then.

I run whatismybrowser.com and it's a perfect case of why user agents are useful information. It'll tell you what browser you've got, what OS, and whether you're up to date or not. It's extremely useful to know this info when helping non-tech users - you would not believe how many people still reply "I just click the internet" when you ask them what browser they're using. My site helps answer all those complicated "first" questions.

I completely agree that using user agents for feature detection/browser compatibility is a terrible idea, but apparently enough websites still do it to warrant keeping all that useless, contradictory mumbo jumbo in it too - it isn't what they should be used for any more!

And I don't think there's any problem with including "too much" information in the user agent either. Case in point: Firefox used to include the full version number in the user agent, but now it only shows the major version number, not the exact revision. The problem is that I can no longer tell users whether they're actually up to date or not.

The reasoning given for this is a security concern, which I still don't understand. If there's a security problem in a particular point-revision of Firefox which can be exploited by a malicious web server, odds are the attacker will just try that exploit against any version of Firefox and it either will or won't work. How does the malicious site knowing the exact version make the situation any worse?!
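The up-to-date check described above is straightforward to sketch. Note that the "latest" major version has to be supplied by the caller, and the helper names here are hypothetical:

```typescript
// Sketch: extract the Firefox major version from a user agent string and
// compare it against a caller-supplied latest major version. As noted
// above, modern Firefox UAs only expose the major version, so a finer
// "are you on the latest point release?" check is no longer possible.
function firefoxMajor(ua: string): number | null {
  const m = ua.match(/Firefox\/(\d+)/);
  return m ? parseInt(m[1], 10) : null;
}

function isUpToDate(ua: string, latestMajor: number): boolean | null {
  const major = firefoxMajor(ua);
  return major === null ? null : major >= latestMajor;
}
```

A `null` result means the UA wasn't recognisably Firefox, which a helper site would report separately.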


Feature detection is equivalent to user agent string, from a fingerprinting perspective.
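The equivalence is easy to see: the answers to a fixed batch of feature probes concatenate into a stable identifier, much like the UA string does. A sketch with a hypothetical probe list:

```typescript
// Sketch: the boolean results of feature probes form a fingerprint bit
// string, so exhaustive feature detection leaks identifying bits much
// like the user agent string does. The probe set is illustrative.
function fingerprintFromFeatures(features: Record<string, boolean>): string {
  return Object.keys(features)
    .sort() // stable key order so identical browsers produce identical strings
    .map((k) => (features[k] ? "1" : "0"))
    .join("");
}
```

Two visitors whose browsers answer every probe the same way are indistinguishable; every differing answer adds a distinguishing bit.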


Google makes the OS, but they also sell ads. Seems pretty advantageous to them to make their devices easy to fingerprint.


Not possible in the short term. Many sites freak out when accessed with a non-standard user agent.


I've always thought this. Just code to the standard, and if the browser doesn't render it correctly, then tell the user to fuck off and fix their browser.

I don't know why we ever thought sending all this data to the server was a good idea.


If 99% of websites you visit work great, and one website tells you to fuck off and fix your browser, are you going to do that, or are you just going to stop using that site?

Remember: incentives. The goal of a web developer is to make sites people use.


Yeah, I would just leave that site. But if you implement that feature using known, simple, and stable tech, you won't really have that problem.

> Remember: incentives. The goal of a web developer is to make sites people use.

The goal should be to empower users. Anyone can make a site that people "use".


Does empowering users get the web developer paid?

I mean, in an ideal world, of course it does. But again: incentives. Keep in mind: search engines themselves are extremely empowering, and they are not generally considered to be something a person pays directly for.


Yeah, empowering users does get developers paid. I get paid to do that myself, and know a lot of other people who also get paid to do that. Of course it's sometimes easier to get paid by treating users like cattle. But if someone doesn't intuitively understand why screwing their users is a bad idea, I'm not sure I can help them.



