wikidata link

contents of wiki page:

    street address 1855, 10001 Old Georgetown Rd, Bethesda, MD 20814 (English)
    official website https://www.ratnermuseum.org/

https://www.wikidata.org/w/api.php?action=help&modules=wbsearchentities

says to do this:

Search for "alphabet" in English language for type property api.php?action=wbsearchentities&search=alphabet&language=en&type=property

so that would mean (but with type=item, since the museum is an item rather than a property, and format=json so it isn't the HTML-wrapped default output):

https://www.wikidata.org/w/api.php?action=wbsearchentities&search=Ratner+Museum&language=en&type=item&format=json

{
    "searchinfo": {
        "search": "Ratner Museum"
    },
    "search": [
        {
            "id": "Q65054804",
            "title": "Q65054804",
            "pageid": 64696664,
            "repository": "local",
            "url": "//www.wikidata.org/wiki/Q65054804",
            "concepturi": "http://www.wikidata.org/entity/Q65054804",
            "label": "The Ratner Museum",
            "description": "museum in Bethesda, Md",
            "match": {
                "type": "alias",
                "language": "en",
                "text": "Ratner Museum"
            },
            "aliases": [
                "Ratner Museum"
            ]
        }
    ],
    "success": 1
}

Yes, it gets results, but seems difficult to scrape...
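
(Scraping it isn't actually bad once format=json is on the URL, though; a minimal sketch, assuming the requests package:)

    import requests

    # Same wbsearchentities call as above, with format=json and type=item.
    resp = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbsearchentities",
            "search": "Ratner Museum",
            "language": "en",
            "type": "item",
            "format": "json",
        },
    )
    resp.raise_for_status()

    for hit in resp.json().get("search", []):
        # e.g. Q65054804  The Ratner Museum  museum in Bethesda, Md
        print(hit["id"], hit["label"], hit.get("description", ""))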

No, you doofus, use the correct API:

ok: https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&titles=Berlin&props=descriptions&languages=en&format=json

ok (sites is ignored when you pass ids): https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&ids=Q65054804&props=descriptions&languages=en&format=json

gets the full entity (all props):
https://www.wikidata.org/w/api.php?action=wbgetentities&sites=enwiki&ids=Q65054804&props=info|sitelinks|aliases|labels|descriptions|claims|datatype&languages=en&format=json
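
Walking that response in Python is straightforward; a sketch assuming requests, and assuming P6375 is the street-address property behind the value quoted at the top:

    import requests

    # Fetch the full entity and pull out the English label, description,
    # and (if present) the street-address claim.
    resp = requests.get(
        "https://www.wikidata.org/w/api.php",
        params={
            "action": "wbgetentities",
            "ids": "Q65054804",
            "props": "labels|descriptions|claims",
            "languages": "en",
            "format": "json",
        },
    )
    resp.raise_for_status()

    entity = resp.json()["entities"]["Q65054804"]
    print(entity["labels"]["en"]["value"])        # The Ratner Museum
    print(entity["descriptions"]["en"]["value"])  # museum in Bethesda, Md

    # P6375 ("street address") is my assumption; it's a monolingual-text
    # value, hence the ["text"] key.
    for claim in entity["claims"].get("P6375", []):
        print(claim["mainsnak"]["datavalue"]["value"]["text"])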

python wikipedia

https://github.com/martin-majlis/Wikipedia-API last update: Jan 2020
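
A minimal sketch with it (the constructor has changed since 2020 — current releases want a user-agent string — and the page title here is a guess):

    import wikipediaapi

    # pip install wikipedia-api
    wiki = wikipediaapi.Wikipedia(user_agent="ratner-notes/0.1", language="en")
    page = wiki.page("Dennis & Phillip Ratner Museum")  # title is a guess

    if page.exists():
        print(page.fullurl)
        print(page.summary[:300])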

https://github.com/goldsmith/Wikipedia (OLD) last update: 2016; writeup: https://stackabuse.com/getting-started-with-pythons-wikipedia-api/

"Note: this library was designed for ease of use and simplicity, not for advanced use. If you plan on doing serious scraping or automated requests, please use Pywikipediabot (or one of the other more advanced Python MediaWiki API wrappers), which has a larger API, rate limiting, and other features so we can be considerate of the MediaWiki infrastructure."

python wikipedia libraries

listed on https://en.wikipedia.org/wiki/Help:Creating_a_bot#Python

will try: https://github.com/earwig/mwparserfromhell
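
A quick sketch of what it does — it only parses wikitext you already have (no fetching), so the infobox snippet below is made up:

    import mwparserfromhell

    # pip install mwparserfromhell
    text = "{{Infobox museum|name=The Ratner Museum|location=Bethesda, MD}} Some article text."
    wikicode = mwparserfromhell.parse(text)

    for template in wikicode.filter_templates():
        print(str(template.name).strip())
        for param in template.params:
            print(" ", str(param.name).strip(), "=", str(param.value).strip())

    print(wikicode.strip_code())  # plain text with markup stripped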