
> The semantic information is first present not in markup but in natural language.

Accurate natural language processing is a very hard problem, though, and today it's handled best by AI/LLMs. But doesn't that cut against the article's point that we shouldn't need AI if the semantic web had been done properly?

For example, RDF (https://en.wikipedia.org/wiki/Resource_Description_Framework) and OWL (https://en.wikipedia.org/wiki/Web_Ontology_Language) are markup approaches that came out of the semantic web effort.
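
To make that concrete, here's a minimal sketch (Python, using the rdflib library) of how a couple of infobox-style facts could be stated as explicit RDF triples instead of being left in prose. The person, the dates, and the EX.birthDate / EX.educatedAt property names are made up purely for illustration:

    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import FOAF, RDF, XSD

    # Hypothetical vocabulary and resources, purely for illustration.
    EX = Namespace("http://example.org/")
    person = URIRef("http://example.org/Jane_Doe")

    g = Graph()
    g.add((person, RDF.type, FOAF.Person))
    g.add((person, FOAF.name, Literal("Jane Doe")))
    g.add((person, EX.birthDate, Literal("1970-01-01", datatype=XSD.date)))
    g.add((person, EX.educatedAt, URIRef("http://example.org/Example_University")))

    # Serialize to Turtle: the facts are now machine-readable, no NLP needed.
    print(g.serialize(format="turtle"))

The point is that the machine never has to guess what the sentence meant; each relationship is an unambiguous subject-predicate-object statement.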

Isn't complex NLP the opposite of what the semantic web was advocating? Imagine asking the computer to buy a certain product and it ordering the wrong thing because the natural language it parsed was ambiguous.

> Additionally infoboxes also hold relationships, you might find when a person was born in an infobox, or where they studied.

That's not a lot of semantic information compared to the contents of a Wikipedia article that's several pages long. Imagine a version of Wikipedia that only included the infoboxes and links within them.



Yeah. Wikidata
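
For anyone curious what that looks like in practice, here's a rough sketch of querying Wikidata's public SPARQL endpoint with Python (requests). If I'm reading the identifiers right, Q7259 is Ada Lovelace, P569 is date of birth, and P69 is educated at:

    import requests

    # Ask Wikidata for structured facts instead of parsing article prose.
    # Q7259 = Ada Lovelace, P569 = date of birth, P69 = educated at.
    QUERY = """
    SELECT ?birthDate ?schoolLabel WHERE {
      wd:Q7259 wdt:P569 ?birthDate .
      OPTIONAL { wd:Q7259 wdt:P69 ?school . }
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    """

    resp = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "semantic-web-example/0.1 (demo)"},
        timeout=30,
    )
    resp.raise_for_status()

    for row in resp.json()["results"]["bindings"]:
        print(row["birthDate"]["value"],
              row.get("schoolLabel", {}).get("value", ""))

Which is basically the "Wikipedia with only the infoboxes" idea from upthread, already running as a queryable database.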



