Wikipedia, the collaborative and multilingual encyclopedia project, has a lot of useful terms defined in its database: you can find information on artists, cities, medical terms, cars, brands… pretty much everything.
If you need to add some content to your pages without storing it in your own database, you can use the Wikipedia API or a Google define query (there is probably also a Google API for that). For example, you might want to automatically add a short description next to a city name or a band name, or add the definition of some technological term. You can do all of these things with Wikipedia, since it has an API that makes it easy.
The PHP job is simple: we use cURL to call the API, which returns an XML response, then we parse it and get the definition.
Here is the code for the Italian Wikipedia; you can modify the URL to match the Wikipedia site for your language:
function wikidefinition($s) {
    // Build the opensearch API URL for the searched term (limit=1 keeps only the best match)
    $url = "http://it.wikipedia.org/w/api.php?action=opensearch&search=" . urlencode($s) . "&format=xml&limit=1";

    // Fetch the XML response with cURL
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HTTPGET, TRUE);
    curl_setopt($ch, CURLOPT_POST, FALSE);
    curl_setopt($ch, CURLOPT_HEADER, FALSE);
    curl_setopt($ch, CURLOPT_NOBODY, FALSE);
    curl_setopt($ch, CURLOPT_VERBOSE, FALSE);
    curl_setopt($ch, CURLOPT_REFERER, "");
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 4);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows; U; Windows NT 6.1; he; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8");
    $page = curl_exec($ch);
    curl_close($ch);

    // Parse the XML and return title, description and URL of the first result
    $xml = simplexml_load_string($page);
    if ((string)$xml->Section->Item->Description) {
        return array(
            (string)$xml->Section->Item->Text,
            (string)$xml->Section->Item->Description,
            (string)$xml->Section->Item->Url
        );
    } else {
        return "";
    }
}
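For example, once the function above is defined, you could call it like this (the search term here is just an illustration):

// Illustrative usage: look up a term and print its short description
$result = wikidefinition("Roma");
if (is_array($result)) {
    list($title, $description, $url) = $result;
    echo $title . ": " . $description . " (" . $url . ")";
} else {
    echo "No definition found.";
}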
This code will be added to the MINI BOTS CLASS.
Good idea, thanks man ;)
Hey there,
I’ve made a modified version of this code that can get multiple results back. You can find it here http://adamzwakk.com/?p=383 :)
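For reference, one possible way to get multiple results (not necessarily how the linked version does it) is to raise the limit parameter and loop over all the Item nodes; a rough sketch, with the function name and default limit chosen just for illustration:

// Hypothetical variant: fetch up to $limit suggestions instead of just one
function wikidefinitions($s, $limit = 5) {
    $url = "http://it.wikipedia.org/w/api.php?action=opensearch&search=" . urlencode($s) . "&format=xml&limit=" . (int)$limit;
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (compatible; example bot)");
    $page = curl_exec($ch);
    curl_close($ch);

    $xml = simplexml_load_string($page);
    $results = array();
    if ($xml && $xml->Section) {
        foreach ($xml->Section->Item as $item) {
            $results[] = array(
                (string)$item->Text,
                (string)$item->Description,
                (string)$item->Url
            );
        }
    }
    return $results;
}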
hey. Nice post. Thanks for sharing…
Good script. Thx man.
Wikipedia may block you for making too many requests this way, which would really suck for you. The best way I found around it is to use Freebase.com, which has Wikipedia abstracts/images/URLs stored inside of it. In Ruby there’s a gem for this at: https://github.com/cbron/basuco , using that you can make stuff like this: http://wikidefs.com
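If you stick with the Wikipedia API, one simple way to reduce the number of requests (and so the risk of being blocked) is to cache responses locally. A rough sketch using a file cache on top of the wikidefinition() function from the post; the cache directory and 24-hour lifetime are arbitrary choices:

// Sketch: cache wikidefinition() results on disk to avoid repeated API calls
function cached_wikidefinition($s, $cacheDir = "/tmp/wikicache", $ttl = 86400) {
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }
    $file = $cacheDir . "/" . md5($s) . ".json";
    if (file_exists($file) && (time() - filemtime($file)) < $ttl) {
        return json_decode(file_get_contents($file), true); // cache hit
    }
    $result = wikidefinition($s); // cache miss: call the API
    file_put_contents($file, json_encode($result));
    return $result;
}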
Awesome!
Thanks guy