WikiGraph With FreeLinks
Will have to extend WikiGraph to support WikiEngines with FreeLinks, because that's what almost all spaces have.
- TwinPages across sites
- BackLinks within sites
- WikiGraphBrowser, probably just for me
2016 work
Basic concept:
- For pages table:
    - add url field for the (Camel-Case for-me) page-bit of the URL (don't include the path part)
    - make name lower-case, so effectively get case-insensitive matching
    - (do I also want a clean separated version of this value for a title tag that would be nicely word-searchable, to make more general SearchEngine? Nah, can always just do regex.)
- now when widget calls the function (rough sketch after this list)
    - have to convert Referer to SmashedTogetherWords format if it's not already
    - then make sure using the url value in the return, not the name
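A minimal sketch of that lookup, assuming a SQLite db and hypothetical column names (pages.space, pages.name, pages.url); the real WikiGraph schema and function names may differ:

```python
# Minimal sketch of the pages-table lookup; table/column names are assumptions.
import re
import sqlite3

def smash_together(title):
    # Collapse a free-link title like "cold start" into "ColdStart".
    return "".join(w.capitalize() for w in re.split(r"[\s_-]+", title) if w)

def twin_pages(db_path, referer_page, own_space):
    # Match on the lower-cased name, but return each space's stored url
    # value so the widget links with the right page-bit per engine.
    name = smash_together(referer_page).lower()
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT space, url FROM pages WHERE name = ? AND space != ?",
        (name, own_space),
    ).fetchall()
    conn.close()
    return rows
```

The widget would then render each (space, url) pair as a TwinPages link, so FreeLink spaces and CamelCase spaces both resolve to the right URL.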
Adoption plan
- ask key engine-runners: Ward Cunningham and Mike Caulfield
- do their engines have an AllPages page to scrape?
- do they have a directory of spaces? naming convention per space (with uniqueness)?
- wikity
- has all-spaces FrontPage which you can crawl through Prev pages
- smells like it doesn't enforce a space name, but can generate one from hostname+domain (drop the .cc)
- scrape some data
- build (update) system
- start using on WebSeitz
- then let them call TwinPages widget
- then SearchEngine
- then copy from SearchEngine to local space
- see MikeCaulfield BootstrappingLibrary/ColdStart post
- store copied-from URL?
- allow copy to different name?
- hmm does my WikiEngine need Edit button for guests, so they can copy?
Mar07'2016: realize http://wikity.cc/ covers all Wikity sites, goes back 18 pages. So I can grab each, and regex out the links.
Mar08'2016: write parser for (1st page of) links
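Roughly what that crawl does, as a hedged sketch: the regexes are guesses at the Wikity markup (not the actual parser code), and wikity.cc with 18 pages of Prev links comes from the Mar07 note.

```python
# Hedged sketch of the Wikity crawl: fetch the all-spaces front page,
# regex out the links, follow "Prev" pages.
import re
import urllib.request

LINK_RE = re.compile(r'href="(https?://[^"]+)"')
PREV_RE = re.compile(r'<a href="([^"]+)"[^>]*>\s*Prev')

def scrape_wikity(start_url="http://wikity.cc/", max_pages=18):
    urls, page_url, fetched = [], start_url, 0
    while page_url and fetched < max_pages:
        html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
        urls.extend(LINK_RE.findall(html))
        prev = PREV_RE.search(html)
        page_url = prev.group(1) if prev else None
        fetched += 1
    return urls
```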
Mar09: create old db in local postgres, import old records. (Was in SQLite before.)
Mar09: tweak Wikity parser code to make pages.url lower-case
Mar09: scrape all wikity pages, parse (1766 urls, 75 spaces)
Mar09: get old/current wikigraph db/app working
- WikiGraph/WebPy won't talk to psql; realize none of my old WebPy apps ever used Postgres, they always used SQLite
- Launch WikiGraph using SQLite, tweak WikiWeb footer to call it. It works!
- change db structure, fill past fields, import couple records
- adjust code to use new db
Mar10: map Referer to its space, to support Wikitys and to avoid linking back to the calling space itself
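A sketch of that mapping, assuming a hypothetical spaces table keyed by hostname; the result can be passed as own_space to the twin-pages lookup above so the calling space is excluded from its own widget.

```python
# Sketch of Referer -> space mapping; the spaces table and hostname column
# are assumptions, not the real schema.
import sqlite3
from urllib.parse import urlsplit

def space_for_referer(db_path, referer_url):
    host = urlsplit(referer_url).hostname or ""
    conn = sqlite3.connect(db_path)
    row = conn.execute(
        "SELECT space FROM spaces WHERE hostname = ?", (host,)
    ).fetchone()
    conn.close()
    return row[0] if row else None
```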
Mar11: adjust import code for extra field, and for spaces table
Mar11: native search form - WikiSearch
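The native search could be as simple as a case-insensitive substring match over the already lower-cased name column (same assumed schema as above); a sketch:

```python
# Hedged sketch of WikiSearch over the assumed pages table.
import sqlite3

def wiki_search(db_path, query):
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT space, url FROM pages WHERE name LIKE ?",
        ("%" + query.lower() + "%",),
    ).fetchall()
    conn.close()
    return rows
```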
Mar12: deploy
- ugh, bug when spaces.page_pattern includes a path, like for me! Hacked it for my own space, so it's working, but need to fix properly (sketch below)
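The likely shape of that bug, sketched: link-building that just glues a page-bit onto a hostname breaks when page_pattern carries a path (like /wiki/ for my space). The "{page}" placeholder is an assumption about the column format, not the real one.

```python
# Hypothetical link builder that tolerates a path inside page_pattern.
def page_link(page_pattern, page_url_bit):
    if "{page}" in page_pattern:
        return page_pattern.replace("{page}", page_url_bit)
    # Fallback: plain concatenation, keeping any path the pattern carries.
    return page_pattern.rstrip("/") + "/" + page_url_bit
```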
Mar12: tell MikeCaulfield
Mar12: add custom SearchEngine! https://cse.google.com/cse/all
Mar14: add Phil Jones SFW space after he emails me a list. Need to fix the Mar12 bug before he can use the widget too
Later: handle SFW sites
- scrape Google? or use Ward's scraper code?