Transforming Wikipedia into a Large Scale Multilingual Concept Network

Aug 04, 2012 19:57

I seem to have a long list of things I want to blog about; hopefully I'll actually manage to get down to it properly this week!

Anyway, to start, another of my (obviously not remotely weekly) 100 papers in AI.

100 Current Papers in Artificial Intelligence, Automated Reasoning and Agent Programming. Number 6: Vivi Nastase and Michael Strube. ( Read more... )

ai:knowledge representation, ai, 100 papers


Comments (3)

wellinghall August 4 2012, 20:07:18 UTC
That is interesting - thank you.



daniel_saunders August 5 2012, 15:18:51 UTC
Would Wikipedia be the main source of this Artificial Intelligence's knowledge base? Because, regardless of the technical advantages, I can see serious practical flaws there.


louisedennis August 5 2012, 17:38:04 UTC
Well, on one level, they are only extracting the structure rather than the text - i.e. they don't analyse any of the free text, just the categories and infoboxes - which at least should remove some of the wilder nonsense.

That said, it doesn't seem, per se, to be any less error-prone than most other ways of doing it - at least in the absence of reliable natural language processing (and for that you probably need a good knowledge base to start out with - a bit chicken and egg really).
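To give a flavour of the kind of structure extraction being discussed (this is a minimal illustrative sketch, not the authors' actual pipeline - the wikitext sample and field names are made-up assumptions):

```python
import re

def infobox_triples(title, wikitext):
    """Extract (subject, attribute, value) triples from an infobox,
    ignoring the article's free text entirely."""
    # Grab everything between '{{Infobox ...' and the closing '}}'.
    match = re.search(r"\{\{Infobox[^|]*\|(.*)\}\}", wikitext, re.DOTALL)
    if not match:
        return []
    triples = []
    # Infobox fields are '|'-separated 'key = value' pairs.
    for field in match.group(1).split("|"):
        if "=" in field:
            key, _, value = field.partition("=")
            key, value = key.strip(), value.strip()
            if key and value:
                triples.append((title, key, value))
    return triples

# Hypothetical article snippet, just for demonstration.
sample = """{{Infobox scientist
| name = Alan Turing
| field = Computer science
| known_for = Turing machine
}}"""

print(infobox_triples("Alan Turing", sample))
```

Even a toy extractor like this yields clean relational facts without ever touching the prose, which is why the approach sidesteps (some of) the need for full natural language processing.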


