- Wikiversity Article with Mathematical Expression
- Export format RevealJS presentation: currently a static export with PanDocElectron; it can be generated on the fly with `wtf_wikipedia` and the RevealJS export format.
- Wiki2Reveal as an online demo that creates a RevealJS presentation dynamically from the Wiki source text.
If you just want the source text of a MediaWiki article, you can also call the API directly in the browser. The following examples download the article about Toronto from the English Wikipedia and the learning resource about Water from the English Wikiversity.
```js
var wtf = require('wtf_wikipedia');

// call the API and process the parsed document in the callback function of the API
wtf.fetch('Toronto', 'en', function(err, doc) {
  // do something with the parsed document returned in doc by the callback function
  console.log("JSON: doc=" + JSON.stringify(doc, null, 2));
});
```
You can fetch an article from Wikiversity, e.g. from the URL [https://en.wikiversity.org/wiki/Water](https://en.wikiversity.org/wiki/Water), with:
```js
wtf.fetch('Water', 'enwikiversity', function(pError, pDoc) {
  // do something with the parsed document returned in pDoc by the callback function
  if (pError) {
    console.error("ERROR: " + pError);
  } else {
    console.log("JSON: doc=" + JSON.stringify(pDoc, null, 2));
  }
});
```
Remark: To distinguish local variables from function parameters, a preceding `p` can be used to mark a parameter (e.g. `pWikiSource`). This naming of parameters is only a reading aid and has no impact on the JavaScript interpretation of the code.
You can retrieve the Wiki markup from the different MediaWiki products of the Wikimedia Foundation. The domain name consists of the Wiki product (e.g. Wikipedia or Wikiversity) and a language. The WikiID encodes the language and the Wiki product and thus determines the domain of the API that is called for fetching the Wiki source. The following WikiIDs refer to the following domain names (an example using one of these WikiIDs follows the list):
- `en`: https://en.wikipedia.org
- `enwiki`: https://en.wikipedia.org
- `enwikipedia`: https://en.wikipedia.org
- `enwikibooks`: https://en.wikibooks.org
- `enwikinews`: https://en.wikinews.org
- `enwikiquote`: https://en.wikiquote.org
- `enwikisource`: https://en.wikisource.org
- `enwikiversity`: https://en.wikiversity.org
- `enwikivoyage`: https://en.wikivoyage.org
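Any of the WikiIDs above can be used in the same `fetch` call shown earlier. The sketch below uses `enwikibooks`; the page title `Cookbook` is only a placeholder:

```js
var wtf = require('wtf_wikipedia');

// Fetch a page from the English Wikibooks (https://en.wikibooks.org)
// by using the WikiID 'enwikibooks'; the page title is only a placeholder
wtf.fetch('Cookbook', 'enwikibooks', function(pError, pDoc) {
  if (pError) {
    console.error("ERROR: " + pError);
  } else {
    console.log("JSON: doc=" + JSON.stringify(pDoc, null, 2));
  }
});
```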
If you use just the language identifier `en`, then `wtf_wikipedia` assumes that the wiki is Wikipedia. The same applies if you just use `enwiki`. Even though the Wikimedia Foundation runs several MediaWikis with different content scopes, `enwiki` is therefore also mapped to the English Wikipedia.
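The following sketch illustrates this equivalence: the WikiIDs `en`, `enwiki` and `enwikipedia` all resolve to https://en.wikipedia.org, so each call fetches the same article about Toronto.

```js
var wtf = require('wtf_wikipedia');

// 'en', 'enwiki' and 'enwikipedia' all resolve to https://en.wikipedia.org,
// so each of these calls fetches the same article about Toronto
['en', 'enwiki', 'enwikipedia'].forEach(function(pWikiID) {
  wtf.fetch('Toronto', pWikiID, function(pError, pDoc) {
    if (pError) {
      console.error("ERROR (" + pWikiID + "): " + pError);
    } else {
      console.log(pWikiID + ": fetch from https://en.wikipedia.org succeeded");
    }
  });
});
```

The next example fetches the learning resource '3D Modelling' from the English Wikiversity by using the WikiID `enwikiversity`: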
```js
var wtf = require('wtf_wikipedia');

// Fetch the article about '3D Modelling' in the English Wikiversity from the domain https://en.wikiversity.org
// call the API and process the parsed document in the callback function of the API
wtf.fetch('3D Modelling', 'enwikiversity', function(pError, pDoc) {
  // do something with the parsed document returned by the callback function in the parameter pDoc
});
```
If you want to fetch Wiki markup in a different language (e.g. from the German Wikiversity), use the appropriate language ID (e.g. `de` for German); see the example after the following list.
- `de`: https://de.wikipedia.org
- `dewiki`: https://de.wikipedia.org
- `dewikipedia`: https://de.wikipedia.org
- `dewikibooks`: https://de.wikibooks.org
- `dewikinews`: https://de.wikinews.org
- `dewikiquote`: https://de.wikiquote.org
- `dewikisource`: https://de.wikisource.org
- `dewikiversity`: https://de.wikiversity.org
- `dewikivoyage`: https://de.wikivoyage.org
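A minimal sketch for the German Wikiversity, using the WikiID `dewikiversity`; the page title `Kurs:Mathematik` is only a placeholder:

```js
var wtf = require('wtf_wikipedia');

// Fetch a learning resource from the German Wikiversity (https://de.wikiversity.org)
// by using the WikiID 'dewikiversity'; the page title is only a placeholder
wtf.fetch('Kurs:Mathematik', 'dewikiversity', function(pError, pDoc) {
  if (pError) {
    console.error("ERROR: " + pError);
  } else {
    console.log("JSON: doc=" + JSON.stringify(pDoc, null, 2));
  }
});
```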
In previous versions of `wtf_wikipedia` the wiki identifier (`wikiid`) for Wikipedia used the domain specification `wiki` (e.g. `enwiki`). To be consistent with the Wiki product part `wikipedia` in the domain name, the following Wiki IDs were added to the Wiki ID mapping (defined as a hash in the file `/src/data/site_map.js`).
- `enwikipedia`: https://en.wikipedia.org
- `dewikipedia`: https://de.wikipedia.org
The additional entries for a consistent WikiID mapping were added with a (Perl-like) regular expression; a sketch of this idea is shown below.
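As an illustration only (not the actual content of `/src/data/site_map.js`), such a hash and the regex-based extension could look like the following sketch:

```js
// Simplified sketch only: the WikiID -> domain hash in /src/data/site_map.js
// contains many more entries and may be structured differently.
var site_map = {
  enwiki: 'https://en.wikipedia.org',
  enwikiversity: 'https://en.wikiversity.org',
  dewiki: 'https://de.wikipedia.org',
  dewikiversity: 'https://de.wikiversity.org'
};

// Add a consistent '<lang>wikipedia' alias for every '<lang>wiki' entry
// by matching the existing keys with a (Perl-like) regular expression.
Object.keys(site_map).forEach(function(pWikiID) {
  var match = pWikiID.match(/^([a-z]+)wiki$/);
  if (match) {
    site_map[match[1] + 'wikipedia'] = site_map[pWikiID];
  }
});

console.log(site_map.enwikipedia); // https://en.wikipedia.org
console.log(site_map.dewikipedia); // https://de.wikipedia.org
```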
If you set up your own MediaWiki or want to fetch from another MediaWiki that is not hosted by the Wikimedia Foundation, then you have to pass the wiki URL as an option to `wtf.fetch`.
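The exact option name and fetch signature depend on the installed `wtf_wikipedia` version; the following is only a hypothetical sketch in which the option name `wikiUrl` and the domain `wiki.example.org` are assumptions, so check the fetch documentation of your version:

```js
var wtf = require('wtf_wikipedia');

// Hypothetical sketch: fetch from a self-hosted MediaWiki that is not run by the
// Wikimedia Foundation by passing the wiki URL as an option to wtf.fetch.
// The option name 'wikiUrl' and the domain 'wiki.example.org' are assumptions
// for illustration; consult the documentation of your wtf_wikipedia version.
var options = { wikiUrl: 'https://wiki.example.org/w/api.php' };

wtf.fetch('Main_Page', options, function(pError, pDoc) {
  if (pError) {
    console.error("ERROR: " + pError);
  } else {
    console.log("JSON: doc=" + JSON.stringify(pDoc, null, 2));
  }
});
```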
- Parsing: the parsing concepts are based on Parsoid - https://www.mediawiki.org/wiki/Parsoid
- Output: based on the concepts of Pandoc, the swiss-army knife of document conversion developed by John MacFarlane - https://www.pandoc.org