mirror of https://github.com/urbit/shrub.git
synced 2024-12-01 06:35:32 +03:00
Merge pull request #929 from urbit/delete-all-the-things
Delete all the things
This commit is contained in: commit 03a90bd17e
329
app/gh.hoon
@ -1,329 +0,0 @@
:: This is a connector for the Github API v3.
::
:: You can interact with this in a few different ways:
::
:: - .^({type} %gx /=gh={/endpoint}) to read data or
:: .^(arch %gy /=gh={/endpoint}) to explore the possible
:: endpoints.
::
:: - subscribe to /listen/{owner}/{repo}/{events...} for
:: webhook-powered event notifications. For event list, see
:: https://developer.github.com/webhooks/.
::
:: This is written with the standard structure for api
:: connectors, as described in lib/connector.hoon.
::
/? 314
/- gh, plan-acct
/+ gh-parse, connector
::
::
=, html
=, eyre
=> |%
++ move (pair bone card)
++ card
$% {$diff sub-result}
{$them wire (unit hiss)}
{$hiss wire {~ ~} $httr {$hiss hiss}}
==
::
:: Types of results we produce to subscribers.
::
++ sub-result
$% {$arch arch}
{$gh-issue issue:gh}
{$gh-list-issues (list issue:gh)}
{$gh-issues issues:gh}
{$gh-issue-comment issue-comment:gh}
{$json json}
{$null ~}
==
::
:: Types of webhooks we expect.
::
++ hook-response
$% {$gh-issues issues:gh}
{$gh-issue-comment issue-comment:gh}
==
--
=+ connector=(connector move sub-result) :: Set up connector library
::
=, gall
|_ $: hid/bowl
hook/(map @t {id/@t listeners/(set bone)}) :: map events to listeners
==
:: ++ prep _`. :: Clear state when code changes
::
:: List of endpoints
::
++ places
|= wir/wire
^- (list place:connector)
=+ (helpers:connector ost.hid wir "https://api.github.com")
=> |% :: gh-specific helpers
++ read-sentinel
|=(pax/path [ost %diff %arch `0vsen.tinel ~])
::
++ sigh-list-issues-x
|= jon/json
%+ bind ((ar:jo issue:gh-parse) jon)
|= issues/(list issue:gh)
gh-list-issues+issues
::
++ sigh-list-issues-y
|= jon/json
%+ bind ((ar:jo issue:gh-parse) jon)
|= issues/(list issue:gh)
:- `(shax (jam issues))
%- malt ^- (list {@ta ~})
:- [%gh-list-issues ~]
(turn issues |=(issue:gh [(rsh 3 2 (scot %ui number)) ~]))
--
:~ ^- place :: /
:* guard=~
read-x=read-null
read-y=(read-static %issues ~)
sigh-x=sigh-strange
sigh-y=sigh-strange
==
^- place :: /issues
:* guard={$issues ~}
read-x=read-null
read-y=(read-static %mine %by-repo ~)
sigh-x=sigh-strange
sigh-y=sigh-strange
==
^- place :: /issues/mine
:* guard={$issues $mine ~}
read-x=(read-get /issues)
read-y=(read-static %gh-list-issues ~)
sigh-x=sigh-list-issues-x
sigh-y=sigh-list-issues-y
==
^- place :: /issues/mine/<mark>
:* guard={$issues $mine @t ~}
read-x=read-null
read-y=read-sentinel
sigh-x=sigh-list-issues-x
sigh-y=sigh-list-issues-y
==
^- place :: /issues/by-repo
:* guard={$issues $by-repo ~}
read-x=read-null
^= read-y
|= pax/path
=+ /(scot %p our.hid)/home/(scot %da now.hid)/web/plan
=+ .^({* acc/(map knot plan-acct)} %cx -)
::
((read-static usr:(~(got by acc) %github) ~) pax)
sigh-x=sigh-strange
sigh-y=sigh-strange
==
^- place :: /issues/by-repo/<user>
:* guard={$issues $by-repo @t ~}
read-x=read-null
read-y=|=(pax/path (get /users/[-.+>.pax]/repos))
sigh-x=sigh-strange
^= sigh-y
|= jon/json
%+ bind ((ar:jo repository:gh-parse) jon)
|= repos/(list repository:gh)
[~ (malt (turn repos |=(repository:gh [name ~])))]
==
^- place :: /issues/by-repo/<user>/<repo>
:* guard={$issues $by-repo @t @t ~}
read-x=|=(pax/path (get /repos/[-.+>.pax]/[-.+>+.pax]/issues))
read-y=|=(pax/path (get /repos/[-.+>.pax]/[-.+>+.pax]/issues))
sigh-x=sigh-list-issues-x
sigh-y=sigh-list-issues-y
==
^- place :: /issues/by-repo/<user>/<repo>/<number>
:* guard={$issues $by-repo @t @t @t ~}
^= read-x
|=(pax/path (get /repos/[-.+>.pax]/[-.+>+.pax]/issues/[-.+>+>.pax]))
::
^= read-y
|= pax/path
%. pax
?: ((sane %tas) -.+>+>.pax)
read-sentinel
(read-static %gh-issue ~)
::
^= sigh-x
|= jon/json
%+ bind (issue:gh-parse jon)
|= issue/issue:gh
gh-issue+issue
::
sigh-y=sigh-strange
==
^- place :: /issues/by-repo/<u>/<r>/<n>/<mark>
:* guard={$issues $by-repo @t @t @t @t ~}
read-x=read-null
read-y=read-sentinel
sigh-x=sigh-strange
sigh-y=sigh-strange
==
==
::
:: When a peek on a path blocks, ford turns it into a peer on
:: /scry/{care}/{path}. You can also just peer to this
:: directly.
::
:: We hand control to ++scry.
::
++ peer-scry
|= pax/path
^- {(list move) _+>.$}
?> ?=({care:clay *} pax)
:_ +>.$ :_ ~
(read:connector ost.hid (places %read pax) i.pax t.pax)
::
:: HTTP response. We make sure the response is good, then
:: produce the result (as JSON) to whoever sent the request.
::
++ sigh-httr
|= {way/wire res/httr}
^- {(list move) _+>.$}
?. ?=({$read care:clay @ *} way)
~& res=res
[~ +>.$]
=* style i.way
=* ren i.t.way
=* pax t.t.way
:_ +>.$ :_ ~
:+ ost.hid %diff
(sigh:connector (places ren style pax) ren pax res)
::
:: HTTP error. We just print it out, though maybe we should
:: also produce a result so that the request doesn't hang?
::
++ sigh-tang
|= {way/wire tan/tang}
^- {(list move) _+>.$}
%- (slog >%gh-sigh-tang< tan)
[[ost.hid %diff null+~]~ +>.$]
::
:: We can't actually give the response to pretty much anything
:: without blocking, so we just block unconditionally.
::
++ peek
|= {ren/@tas tyl/path}
^- (unit (unit (pair mark *)))
~ ::``noun/[ren tyl]
::
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Webhook-powered event streams (/listen) ::
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
::
:: To listen to a webhook-powered stream of events, subscribe
:: to /listen/<user>/<repo>/<events...>
::
:: We hand control to ++listen.
::
++ peer-listen
|= pax/path
^- {(list move) _+>.$}
?. ?=({@ @ *} pax)
~& [%bad-listen-path pax]
[~ +>.$]
(listen pax)
::
:: This core handles event subscription requests by starting or
:: updating the webhook flow for each event.
::
++ listen
|= pax/path
=| mow/(list move)
=< abet:listen
|%
++ abet :: Resolve core.
^- {(list move) _+>.$}
[(flop mow) +>.$]
::
++ send-hiss :: Send a hiss
|= hiz/hiss
^+ +>
=+ wir=`wire`[%x %listen pax]
+>.$(mow [[ost.hid %hiss wir `~ %httr [%hiss hiz]] mow])
::
:: Create or update a webhook to listen for a set of events.
::
++ listen
^+ .
=+ pax=pax :: TMI-proofing
?> ?=({@ @ *} pax)
=+ events=t.t.pax
|- ^+ +>+.$
?~ events
+>+.$
?: (~(has by hook) i.events)
$(+>+ (update-hook i.events), events t.events)
$(+>+ (create-hook i.events), events t.events)
::
:: Set up a webhook.
::
++ create-hook
|= event/@t
^+ +>
?> ?=({@ @ *} pax)
=+ clean-event=`tape`(turn (trip event) |=(a/@tD ?:(=('_' a) '-' a)))
=. hook
%+ ~(put by hook) (crip clean-event)
=+ %+ fall
(~(get by hook) (crip clean-event))
*{id/@t listeners/(set bone)}
[id (~(put in listeners) ost.hid)]
%- send-hiss
:* %+ scan
=+ [(trip i.pax) (trip i.t.pax)]
"https://api.github.com/repos/{-<}/{->}/hooks"
auri:de-purl
%post ~ ~
%- as-octt:mimes %- en-json %- pairs:enjs:format :~
name+s+%web
active+b+&
events+a+~[s+event] ::(turn `(list ,@t)`t.t.pax |=(a=@t s/a))
:- %config
%- pairs:enjs:format :~
=+ =+ clean-event
"http://107.170.195.5:8443/~/to/gh/gh-{-}.json?anon&wire=/"
[%url s+(crip -)]
[%'content_type' s+%json]
==
==
==
::
:: Add current bone to the list of subscribers for this event.
::
++ update-hook
|= event/@t
^+ +>
=+ hok=(~(got by hook) event)
%_ +>.$
hook
%+ ~(put by hook) event
hok(listeners (~(put in listeners.hok) ost.hid))
==
--
::
:: Pokes that aren't caught in more specific arms are handled
:: here. These should be only from webhooks firing, so if we
:: get any mark that we shouldn't get from a webhook, we reject
:: it. Otherwise, we spam out the event to everyone who's
:: listening for that event.
::
++ poke
|= response/hook-response
^- {(list move) _+>.$}
=+ hook-data=(~(get by hook) (rsh 3 3 -.response))
?~ hook-data
~& [%strange-hook hook response]
[~ +>.$]
:: ~& response=response
:_ +>.$
%+ turn ~(tap in listeners.u.hook-data)
|= ost/bone
[ost %diff response]
--
@ -1,50 +0,0 @@
:: This is a command-line ui for the %gh Github driver.
::
:: Usage:
:: :github &path /read{/endpoint}
:: :github &path /listen/{owner}/{repo}/{events...}
::
/- gh
::
=> |%
++ diff-result
$% {$gh-issue issues:gh}
{$gh-issue-comment issue-comment:gh}
==
--
=, gall
|_ {hid/bowl *}
++ poke-path
|= pax/path
:_ +>.$ :_ ~
[ost.hid %peer /into-the-mist [our.hid %gh] scry+x+pax]
++ diff-gh-issues
|= {way/wire issues:gh}
%- %- slog :~
leaf+"in repository {(trip login.owner.repository)}/{(trip name.repository)}:"
leaf+"{(trip login.sender)} {(trip -.action)} issue #{<number.issue>} {<title.issue>}"
?+ -.action *tank
?($assigned $unassigned)
leaf+"to {(trip login.assignee.action)}"
?($labeled $unlabeled)
leaf+"with {(trip name.label.action)}"
==
==
[~ +>.$]
++ diff-gh-issue-comment
|= {way/wire issue-comment:gh}
%- %- slog :~
leaf+"in repository {(trip login.owner.repository)}/{(trip name.repository)}:"
leaf+"{(trip login.sender)} commented on issue #{<number.issue>} {<title.issue>}:"
leaf+(trip body.comment)
==
[~ +>.$]
++ diff-json
|= {way/wire jon/json}
~& jon
[~ +>.$]
++ peek
|= {ren/@tas tyl/path}
^- (unit (unit (pair mark *)))
``noun+[ren tyl]
--
@ -1,78 +0,0 @@
|%
++ results (map mark (each vase tang))
++ show-results
=, format
|= a/results ^- json
:- %o
%- ~(run by a)
|= b/(each vase tang)
?- -.b
%& (tape:enjs (text p.b))
%| (tape:enjs (of-wall (wush 160 (flop p.b))))
==
++ wush
|= {wid/@u tan/tang} ^- wall
(zing (turn tan |=(a/tank (wash 0^wid a))))
--
::
=, gall
=, ford
|_ {bowl ~}
++ peek _~
++ peer-scry-x
|= path
[[ost %exec /all-marks our `build-marks]~ +>]
::
++ made-all-marks
|= {path @uvH a/gage}
:_ +>.$
?> ?=($tabl -.a)
=; res/results
[ost %diff [%json (show-results res)]]~
%- malt
%+ turn p.a
|= {k/gage v/gage} ^- {mark (each vase tang)}
:- ?>(?=({%& $mark * @tas} k) q.q.p.k)
?- -.v
$tabl !!
%& [%& q.p.v]
%| v
==
::
++ build-marks
^- {beak silk}
:- now-beak
:- %tabl
%+ turn (weld list-marks list-sub-marks)
|= {a/mark ~} ^- {silk silk}
:- [%$ %mark !>(a)]
[%bunt a]
::
++ poke-noun
|= *
~& have+list-marks
`+>
::
++ now-beak %_(byk r [%da now])
++ list-marks
=, space:userlib
=, format
=+ .^(arch %cy (en-beam now-beak /mar))
%+ skim ~(tap by dir)
|= {a/mark ~}
?=(^ (file (en-beam now-beak /hoon/[a]/mar)))
::
++ list-sub-marks
=, space:userlib
=, format
^- (list {mark ~})
%- zing ^- (list (list {mark ~}))
=/ top .^(arch %cy (en-beam now-beak /mar))
%+ turn ~(tap by dir.top)
|= {sub/knot ~}
=+ .^(arch %cy (en-beam now-beak /[sub]/mar))
%+ murn ~(tap by dir)
|= {a/mark ~} ^- (unit {mark ~})
?~ (file (en-beam now-beak /hoon/[a]/[sub]/mar)) ~
`[(rap 3 sub '-' a ~) ~]
--
@ -1,93 +0,0 @@
/+ hall
::
=> |%
++ move (pair bone card)
++ card
$% {$peel wire dock mark path}
{$poke wire dock $hall-command command:hall}
==
--
::
=, gall
|_ {hid/bowl connections/(set {app/term source/path station/knot})}
++ poke-noun
|= arg/*
^- {(list move) _+>.$}
?: ?=($list arg)
(poke-pipe-list ~)
=+ ((soft {$cancel app/term source/path station/knot}) arg)
?^ -
(poke-pipe-cancel app.u source.u station.u)
=+ ((hard {app/term source/path station/knot}) arg)
(poke-pipe-connect app source station)
::
++ poke-pipe-list
|= ~
^- {(list move) _+>.$}
%- %- slog
%+ turn ~(tap in connections)
|= {app/term source/path station/knot}
leaf+"{(trip app)}{<`path`source>} ---> {(trip station)}"
[~ +>.$]
::
++ poke-pipe-cancel
|= {app/term source/path station/knot}
^- {(list move) _+>.$}
?. (~(has in connections) [app source station])
%- %- slog :~
leaf+"no connection:"
leaf+"{(trip app)}{<`path`source>} ---> {(trip station)}"
==
[~ +>.$]
%- %- slog :~
leaf+"canceling:"
leaf+"{(trip app)}{<`path`source>} ---> {(trip station)}"
==
[~ +>.$(connections (~(del in connections) [app source station]))]
::
++ poke-pipe-connect
|= {app/term source/path station/knot}
^- {(list move) _+>.$}
:_ +>.$(connections (~(put in connections) [app source station]))
:_ ~
~& [%peeling app source station]
:* ost.hid %peel [%subscribe app station source]
[our.hid app] %hall-speeches source
==
::
++ diff-hall-speeches
|= {way/wire speeches/(list speech:hall)}
^- {(list move) _+>.$}
?> ?=({$subscribe @ @ *} way)
=+ app=(slav %tas i.t.way)
=+ station=i.t.t.way
=+ source=t.t.t.way
?. (~(has in connections) [app source station])
%- %- slog :~
leaf+"pipe dropping:"
leaf+"{(trip app)}{<`path`source>} ---> {(trip station)}"
==
[~ +>.$]
:_ +>.$ :_ ~
:* ost.hid %poke [%relay app station source]
[our.hid %hall] %hall-command
%publish
|- ^- (list thought:hall)
?~ speeches
~
:_ $(speeches t.speeches, eny.hid (shax (cat 3 %pipe eny.hid)))
:* `@uvH`(end (sub 'H' 'A') 1 eny.hid)
[[[%& our.hid station] *envelope:hall %pending] ~ ~]
now.hid *(set flavor:hall) i.speeches
==
==
::
++ coup-relay
|= {way/wire saw/(unit tang)}
^- {(list move) _+>.$}
?> ?=({@ @ @ *} way)
?~ saw
[~ +>.$]
%- (slog leaf+"pipe relay failure in:" >way< u.saw)
[~ +>.$]
--
@ -159,8 +159,6 @@
:- /ren/run "not meant to be called except on a (different) hoon file"
:- /ren/collections "temporarily disabled"
:- /ren/test-gen "temporarily disabled"
:- /ren/tree/index "temporarily disabled"
:- /ren/tree/elem "temporarily disabled"
:- /ren/urb "temporarily disabled"
:- /ren/x-urb "temporarily disabled"
:- /ren/x-htm "temporarily disabled"
@ -173,22 +171,6 @@
++ failing
^~ ^- (map path tape)
%- my :~ ::TODO don't hardcode
::
:- /app/pipe "wants 'flavor:hall' to exist"
:- /app/mark-dashboard "wants old ford"
:- /app/static "wants old ford"
:- /gen/capitalize "wants unicode-data/txt"
::
:- /app/twit "depends on sur/twitter"
:- /gen/twit/as "depends on sur/twitter"
:- /gen/twit/feed "depends on sur/twitter"
:- /mar/twit/cred "depends on sur/twitter"
:- /mar/twit/feed "depends on sur/twitter"
:- /mar/twit/post "depends on sur/twitter"
:- /mar/twit/req "depends on sur/twitter"
:- /mar/twit/usel "depends on sur/twitter"
:- /lib/twitter "depends on sur/twitter"
:- /sur/twitter "crashes with new type system"
::
:- /gen/al "compiler types out-of-date"
:- /gen/musk "compiler types out-of-date"
@ -196,13 +178,5 @@
:- /gen/cosmetic "incomplete"
:- /gen/lust "incomplete"
:- /gen/scantastic "incomplete"
::
:- /app/gh "crashes with new type system"
:- /mar/gh/issue-comment "wants old 'speech:hall'"
:- /mar/gh/issues "wants old 'speech:hall'"
::
:- /lib/down-jet "depends on lib/down-jet/parse"
:- /mar/down "depends on lib/down-jet/parse"
:- /lib/down-jet/parse "obsolete syntax"
==
--
@ -1,9 +1,8 @@
---
comments: true
---
:- ~[comments+&]
;>

# Static

You can put static files in here to serve them to the web. Actually, you can put static files anywhere in `/web` and see them in a browser.

Docs on static publishing with urbit are forthcoming — but feel free to drop markdown files in `/web` to try it out.
319
app/twit.hoon
@ -1,319 +0,0 @@
|
||||
:: Twitter daemon
|
||||
::
|
||||
:::: /hoon/twit/app
|
||||
::
|
||||
/- plan-acct
|
||||
/+ twitter, hall
|
||||
::
|
||||
:::: ~fyr
|
||||
::
|
||||
=, eyre
|
||||
=, html
|
||||
|%
|
||||
++ twit-path :: valid peer path
|
||||
$% {$cred ~} :: credential info
|
||||
{$home p/@t ~} :: home timeline
|
||||
{$user p/@t ~} :: user's tweets
|
||||
{$post p/@taxuv ~} :: status of status
|
||||
==
|
||||
::
|
||||
++ axle :: app state
|
||||
$: $0
|
||||
out/(map @uvI (each {knot cord} stat)) :: sent tweets
|
||||
ran/(map path {p/@ud q/@da}) :: polls active
|
||||
fed/(jar path stat) :: feed cache
|
||||
ced/(unit (pair @da json)) :: credentials
|
||||
==
|
||||
::
|
||||
++ gift :: subscription action
|
||||
$% {$quit ~} :: terminate
|
||||
{$diff gilt} :: send data
|
||||
==
|
||||
++ gilt
|
||||
$% {$twit-feed p/(list stat)} :: posts in feed
|
||||
{$twit-post p/stat} :: tweet accepted
|
||||
{$ares term (list tank)} :: error
|
||||
{$json json} :: unspecialized
|
||||
==
|
||||
::
|
||||
++ move {bone card}
|
||||
++ card :: arvo request
|
||||
$? gift
|
||||
$% {$hiss wire (unit user:eyre) api-call} :: api request
|
||||
{$poke wire app-message} ::
|
||||
{$wait wire p/@da} :: timeout
|
||||
== ==
|
||||
::
|
||||
++ api-call {response-mark $twit-req {endpoint quay}} :: full hiss payload
|
||||
++ response-mark ?($twit-post $twit-feed $twit-cred) :: sigh options
|
||||
++ app-message
|
||||
$? {{ship $hall} $hall-action action:hall} :: chat message
|
||||
{{ship $hood} $write-plan-account user:eyre plan-acct} :: registration
|
||||
== ::
|
||||
++ sign :: arvo response
|
||||
$% {$e $thou p/httr} :: HTTP result
|
||||
{$t $wake ~} :: timeout ping
|
||||
==
|
||||
::
|
||||
:: XX =*
|
||||
++ stat post:twitter :: received tweet
|
||||
++ command command:twitter :: incoming command
|
||||
++ endpoint endpoint:reqs:twitter :: outgoing target
|
||||
++ param param:reqs:twitter :: twit-req parameters
|
||||
++ print print:twitter :: their serialization
|
||||
::
|
||||
--
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bowl:gall axle}
|
||||
::
|
||||
++ prep
|
||||
|= a/(unit axle) ^- (quip move _+>)
|
||||
?^ a [~ +>(+<+ u.a)]
|
||||
(peer-scry-x /cred)
|
||||
::
|
||||
++ cull :: remove seen tweets
|
||||
|= {pax/path rep/(list stat)} ^+ rep
|
||||
=+ pev=(silt (turn (~(get ja fed) pax) |=(stat id)))
|
||||
(skip rep |=(stat (~(has in pev) id)))
|
||||
::
|
||||
++ done [*(list move) .]
|
||||
++ dely :: next polling timeout
|
||||
|= pax/path
|
||||
^- {(unit time) _ran}
|
||||
=+ cur=(~(get by ran) pax)
|
||||
=+ tym=(add now (mul ~s8 (bex ?~(cur 0 p.u.cur))))
|
||||
:: ~& dely/`@dr`(sub tym now)
|
||||
?: &(?=(^ cur) (gte tym q.u.cur) (gth q.u.cur now))
|
||||
[~ ran]
|
||||
[`tym (~(put by ran) pax ?~(cur 0 (min 5 +(p.u.cur))) tym)]
|
||||
::
|
||||
++ wait-new :: poll with min delay
|
||||
|= {pax/path mof/(list move)}
|
||||
(wait(ran (~(del by ran) pax)) pax mof)
|
||||
::
|
||||
++ wait :: ensure poll by path
|
||||
|= {pax/path mof/(list move)} ^+ done
|
||||
=^ tym ran (dely pax)
|
||||
:_ +>.$
|
||||
?~ tym
|
||||
:: ~& no-wait/ran
|
||||
mof
|
||||
:: ~& will-wait/u.tym
|
||||
:- [ost %wait pax u.tym]
|
||||
mof
|
||||
::
|
||||
++ poke-twit-do :: recieve request
|
||||
|= {usr/user:eyre act/command} ^+ done
|
||||
?- -.act
|
||||
$post
|
||||
=. out (~(put by out) p.act %& usr q.act)
|
||||
%+ wait-new /peer/home/[usr]
|
||||
=+ req=[%twit-req `endpoint`update+[%status q.act]~ ~]
|
||||
[ost %hiss post+(dray:wired ~[%uv] p.act) `usr %twit-post req]~
|
||||
==
|
||||
::
|
||||
++ wake-peer
|
||||
|= {pax/path ~} ^+ done
|
||||
~& twit-wake+peer+pax
|
||||
:_ +>.$
|
||||
?. (~(has by ran) peer+pax) :: ignore if retracted
|
||||
~
|
||||
=+ => |=({a/bone @ b/path} [b a])
|
||||
pus=(~(gas ju *(jug path bone)) (turn ~(tap by sup) .))
|
||||
?~ (~(get ju pus) pax)
|
||||
~
|
||||
~& peer-again+[pax ran]
|
||||
(pear | `~. pax) ::(user-from-path pax))
|
||||
::
|
||||
++ sigh-recoverable-error :: Rate-limit
|
||||
|= {pax/path $429 $rate-limit lim/(unit @da)}
|
||||
=. ran (~(put by ran) pax 6 now)
|
||||
=+ tym=?~(lim (add ~m7.s30 now) (add ~1970.1.1 (mul ~s1 u.lim)))
|
||||
~& retrying-in+`@dr`(sub tym now)
|
||||
:_(+>.$ [ost %wait pax tym]~)
|
||||
::
|
||||
++ sigh-twit-cred-scry-cred sigh-twit-cred-cred :: alias
|
||||
++ sigh-twit-cred-cred
|
||||
|= {wir/wire acc/plan-acct raw/json} ^+ done
|
||||
?> ?=(~ wir)
|
||||
=+ pax=`twit-path`cred+wir
|
||||
:_ +>.$(ced `[now raw])
|
||||
:- [ost %poke pax [our %hood] %write-plan-account ~.twitter acc]
|
||||
(spam-with-scry-x pax json+raw)
|
||||
::
|
||||
++ sigh-twit-post-post :: status acknowledged
|
||||
|= {wir/wire rep/stat} ^+ done
|
||||
=+ (raid:wired wir mez=%uv ~)
|
||||
=. out (~(put by out) mez %| rep)
|
||||
:_ +>.$
|
||||
=+ pax=/[who.rep]/status/(rsh 3 2 (scot %ui id.rep))
|
||||
:- (show-url [& ~ &+/com/twitter] `pax ~)
|
||||
(spam-with-scry-x post+wir twit-post+rep)
|
||||
::
|
||||
++ sigh-twit-feed :: feed data
|
||||
|= {wir/wire rep/(list stat)} ^+ done
|
||||
?> ?=({?($peer $scry) *} wir)
|
||||
=* pax t.wir
|
||||
:: ~& got-feed+[(scag 5 (turn rep |=(stat id))) fed]
|
||||
=+ ren=(cull pax rep) :: new messages
|
||||
=. rep (weld ren (~(get ja fed) pax))
|
||||
=. fed (~(put by fed) pax rep) :: save full list
|
||||
?: ?=($scry -.wir)
|
||||
[(spam scry+x+pax [%diff twit-feed+(flop rep)] [%quit ~] ~) +>.$]
|
||||
?~ ren
|
||||
(wait wir ~) :: pump polling
|
||||
:: ~& spam-feed+ren
|
||||
(wait-new wir (spam pax [%diff twit-feed+(flop ren)] ~))
|
||||
::
|
||||
++ sigh-tang :: Err
|
||||
|= {pax/path tan/tang} ^+ done
|
||||
~& sigh-tang+pax
|
||||
%- (slog (flop tan))
|
||||
=+ ^- git/gift
|
||||
=+ err='' ::%.(q:(need r.hit) ;~(biff de-json mean:reparse:twitter)) :: XX parse?
|
||||
:^ %diff %ares %bad-http
|
||||
tan
|
||||
:: [leaf/"HTTP Code {<p.hit>}" (turn (need err) mean:render:twit)]
|
||||
?+ pax [[ost git]~ +>.$]
|
||||
{$post @ ~}
|
||||
[(spam pax git ~) +>.$]
|
||||
==
|
||||
::
|
||||
:: ++ user-to-path |=(a/(unit iden) ?~(a '~' (scot %ta u.a)))
|
||||
:: ++ user-from-path
|
||||
:: |= pax/path ^- {(unit iden) path}
|
||||
:: ~| %bad-user
|
||||
:: ?~ pax ~|(%empty-path !!)
|
||||
:: ~| i.pax
|
||||
:: ?: =('~' i.pax) [~ t.pax]
|
||||
:: [`(slav %ta i.pax) t.pax]
|
||||
::
|
||||
::
|
||||
++ compat
|
||||
|= {usr/(unit user:eyre) req/(unit user:eyre)}
|
||||
?~(req & =(usr req))
|
||||
::
|
||||
:: /+ twitter
|
||||
:: .^((list post:twitter) %gx /=twit=/home/urbit_test/twit-feed)
|
||||
:: .^(post:twitter %gx /=twit=/post/0vv0old.0post.hash0.0000/twit-feed)
|
||||
++ peek-x
|
||||
|= pax/path ^- (unit (unit gilt))
|
||||
=+ usr=`~. :: =^ usr pax (user-from-path pax)
|
||||
?. ?=(twit-path pax)
|
||||
~|([%missed-path pax] !!)
|
||||
=+ gil=(pear-scry pax)
|
||||
?- -.gil
|
||||
$none ~
|
||||
$part ~ :: stale data
|
||||
$full ``p.gil
|
||||
==
|
||||
::
|
||||
++ peer-scry-x
|
||||
|= pax/path ^+ done
|
||||
:_ +>
|
||||
=+ pek=(peek-x pax)
|
||||
?^ pek
|
||||
?~ u.pek ~|(bad-scry+x+pax !!)
|
||||
~[[ost %diff u.u.pek] [ost %quit ~]]
|
||||
=+ usr=`~. :: =^ usr pax (user-from-path pax)
|
||||
?. ?=(twit-path pax)
|
||||
~|([%missed-path pax] !!)
|
||||
=+ hiz=(pear-hiss pax)
|
||||
?~ hiz ~ :: already in flight
|
||||
::?> (compat usr -.u.hiz) :: XX better auth
|
||||
[ost %hiss scry+pax usr +.u.hiz]~
|
||||
::
|
||||
++ peer |=(pax/path :_(+> (pear & `~. pax))) :: accept subscription
|
||||
++ pear :: poll, possibly returning current data
|
||||
|= {ver/? usr/(unit user:eyre) pax/path}
|
||||
^- (list move)
|
||||
?. ?=(twit-path pax)
|
||||
~|([%missed-path pax] !!)
|
||||
=+ gil=(pear-scry pax)
|
||||
%+ welp
|
||||
^- (list move)
|
||||
?: ?=($full -.gil) ~ :: permanent result
|
||||
=+ hiz=(pear-hiss pax)
|
||||
?~ hiz ~
|
||||
::?> (compat usr -.u.hiz) :: XX better auth
|
||||
[ost %hiss peer+pax usr +.u.hiz]~
|
||||
^- (list move)
|
||||
?. ver ~
|
||||
?- -.gil
|
||||
$none ~
|
||||
$part [ost %diff p.gil]~
|
||||
$full ~[[ost %diff p.gil] [ost %quit ~]]
|
||||
==
|
||||
::
|
||||
++ pear-scry
|
||||
|= pax/twit-path ^- $%({$none ~} {$part p/gilt} {$full p/gilt})
|
||||
?- -.pax
|
||||
$post
|
||||
=+ (raid:wired +.pax mez=%uv ~)
|
||||
=+ sta=(~(get by out) mez)
|
||||
?. ?=({~ %| *} sta)
|
||||
[%none ~]
|
||||
[%full twit-post+p.u.sta]
|
||||
::
|
||||
?($user $home)
|
||||
[%part twit-feed+(flop (~(get ja fed) pax))]
|
||||
::
|
||||
$cred
|
||||
?~ ced [%none ~]
|
||||
?: (gth now (add p.u.ced ~m1)) :: stale
|
||||
[%none ~]
|
||||
[%full %json q.u.ced]
|
||||
==
|
||||
::
|
||||
++ pear-hiss
|
||||
|= pax/twit-path ^- (unit {(unit user:eyre) api-call})
|
||||
?- -.pax
|
||||
$post ~ :: future/unacked
|
||||
$cred
|
||||
`[`~. %twit-cred twit-req+[test-login+~ ['skip_status'^%t]~]]
|
||||
::
|
||||
$user
|
||||
=+ ole=(~(get ja fed) pax)
|
||||
=+ opt=?~(ole ~ ['since_id' (tid:print id.i.ole)]~)
|
||||
`[`~. [%twit-feed twit-req+[posts-by+[(to-sd p.pax)]~ opt]]]
|
||||
::
|
||||
$home
|
||||
=+ ole=(~(get ja fed) pax)
|
||||
=+ opt=?~(ole ~ ['since_id' (tid:print id.i.ole)]~)
|
||||
`[`p.pax [%twit-feed twit-req+[timeline+~ opt]]]
|
||||
==
|
||||
::
|
||||
++ to-sd :: parse user name/numb
|
||||
|= a/knot ^- sd:param
|
||||
~| [%not-user a]
|
||||
%+ rash a
|
||||
;~(pose (stag %user-id dem) (stag %screen-name user:parse:twitter))
|
||||
::
|
||||
:: ++ pull :: release subscription
|
||||
:: |= ost/bone
|
||||
:: ?. (~(has by sup) ost) `+>.$ :: XX should not occur
|
||||
:: =+ [his pax]=(~(got by sup) ost)
|
||||
:: ?: (lth 1 ~(wyt in (~(get ju pus) pax)))
|
||||
:: `+>.$
|
||||
:: =: ran (~(del by ran) [%peer pax])
|
||||
:: fed (~(del by fed) pax)
|
||||
:: ==
|
||||
:: `+>.$
|
||||
::
|
||||
++ spam-with-scry-x :: recieve final
|
||||
|= {a/path b/gilt} ^- (list move)
|
||||
=+ mof=~[[%diff b] [%quit ~]]
|
||||
(weld (spam a mof) (spam scry+x+a mof))
|
||||
::
|
||||
++ spam :: send by path
|
||||
|= {a/path b/(list gift)} ^- (list move)
|
||||
%- zing ^- (list (list move))
|
||||
%+ turn ~(tap by sup)
|
||||
|= {ost/bone @ pax/path}
|
||||
?. =(pax a) ~
|
||||
(turn b |=(c/gift [ost c]))
|
||||
::
|
||||
++ show-url ~(said-url hall `bowl:gall`+<-)
|
||||
--
|
@ -1,293 +0,0 @@
|
||||
:: to use, download UnicodeData.txt and place it in `%/lib/unicode-data/txt`.
|
||||
::
|
||||
::::
|
||||
::
|
||||
:: part 1: parse the file into {uppers}
|
||||
::
|
||||
/- unicode-data
|
||||
/= case-table
|
||||
/; !:
|
||||
=>
|
||||
|%
|
||||
+$ case-fold
|
||||
:: state that's part of the fold which generates the list of case-nodes
|
||||
$: :: resulting data to pass to treeify.
|
||||
out=(list case-node:unicode-data)
|
||||
:: the start of a run of characters; ~ for not active.
|
||||
start=(unit case-state)
|
||||
:: previous character state
|
||||
prev=case-state
|
||||
==
|
||||
::
|
||||
+$ case-state
|
||||
:: a temporary model which we compress later in a second pass.
|
||||
$: point=@c
|
||||
case=case-class
|
||||
upper=case-offset:unicode-data
|
||||
lower=case-offset:unicode-data
|
||||
title=case-offset:unicode-data
|
||||
==
|
||||
::
|
||||
+$ case-class
|
||||
:: classification of an individual character.
|
||||
$? $upper
|
||||
$lower
|
||||
$title
|
||||
$none
|
||||
$missing
|
||||
==
|
||||
--
|
||||
|= a=(list line:unicode-data)
|
||||
::
|
||||
|^ %- build-tree
|
||||
%- flop
|
||||
(build-case-nodes a)
|
||||
::
|
||||
:: #
|
||||
:: # %case-nodes
|
||||
:: #
|
||||
:: transforms raw unicode data into sequential case nodes.
|
||||
+| %case-nodes
|
||||
++ build-case-nodes
|
||||
:: raw list of unicode data lines to a compact list of chardata
|
||||
|= lines=(list line:unicode-data)
|
||||
^- (list case-node:unicode-data)
|
||||
::
|
||||
:: todo: we don't have the final case range in the output of this
|
||||
:: gate. this is because this algorithm doesn't work when the last
|
||||
:: char is part of a range. this doesn't happen with the real one,
|
||||
:: only the excerpts i was using for testing.
|
||||
::
|
||||
=< out
|
||||
=| =case-fold
|
||||
|- ^+ case-fold
|
||||
?~ lines case-fold
|
||||
::
|
||||
=/ state=case-state (line-to-case-state i.lines)
|
||||
::
|
||||
?: (is-adjacent state prev.case-fold)
|
||||
case-fold(prev state)
|
||||
::
|
||||
=. case-fold (add-range case-fold)
|
||||
::
|
||||
%_ case-fold
|
||||
prev state
|
||||
start ?.(?=(?(%missing %none) case.state) ~ `state)
|
||||
==
|
||||
::
|
||||
++ line-to-case-state
|
||||
:: creates an easy to merge form.
|
||||
|= line:unicode-data
|
||||
^- case-state
|
||||
=/ out=case-state
|
||||
[code %none [%none ~] [%none ~] [%none ~]]
|
||||
?: =(code `@c`0)
|
||||
=. case.out %missing
|
||||
out
|
||||
=. case.out
|
||||
?+ gen %none
|
||||
$lu %upper
|
||||
$ll %lower
|
||||
$lt %title
|
||||
==
|
||||
::
|
||||
:: several characters aren't described as $lu or $ll but have lower or
|
||||
:: upper state, such as u+2161. detect this and fix it up.
|
||||
::
|
||||
=? case.out &(=(case.out %none) !=(low ~)) %upper
|
||||
=? case.out &(=(case.out %none) !=(up ~)) %lower
|
||||
::
|
||||
:: calculate offsets
|
||||
::
|
||||
=? upper.out !=(up ~) (calculate-offset (need up) code)
|
||||
=? lower.out !=(low ~)
|
||||
(calculate-offset (need low) code)
|
||||
=? title.out !=(title ~) (calculate-offset (need title) code)
|
||||
out
|
||||
::
|
||||
++ calculate-offset
|
||||
|= [src=@c dst=@c]
|
||||
^- case-offset:unicode-data
|
||||
?: =(src dst)
|
||||
[%none ~]
|
||||
?: (gth src dst)
|
||||
[%add (sub src dst)]
|
||||
[%sub (sub dst src)]
|
||||
::
|
||||
++ is-adjacent
|
||||
:: is {rhs} a continuation of {lhs}?
|
||||
|= [lhs=case-state rhs=case-state]
|
||||
^- ?
|
||||
?: (lth point.rhs point.lhs)
|
||||
$(lhs rhs, rhs lhs)
|
||||
?: !=(point.rhs +(point.lhs))
|
||||
%.n
|
||||
?: !=(case.rhs case.lhs)
|
||||
(upper-lower-adjacent lhs rhs)
|
||||
?: =(case.lhs %none)
|
||||
%.n
|
||||
?: =(case.lhs %missing)
|
||||
%.n
|
||||
?: !=(upper.lhs upper.rhs)
|
||||
%.n
|
||||
?: !=(lower.lhs lower.rhs)
|
||||
%.n
|
||||
?: !=(title.lhs title.rhs)
|
||||
%.n
|
||||
%.y
|
||||
::
|
||||
++ upper-lower-adjacent
|
||||
:: detects %upper-lower spans.
|
||||
::
|
||||
:: is {lhs} the same as {rhs}, but with opposite case?
|
||||
|= [lhs=case-state rhs=case-state]
|
||||
?: &(=(case.lhs %upper) !=(case.rhs %lower))
|
||||
%.n
|
||||
?: &(=(case.lhs %lower) !=(case.rhs %upper))
|
||||
%.n
|
||||
::
|
||||
:: to simplify detection, if things are in the opposite order, redo
|
||||
:: things flipped.
|
||||
::
|
||||
?: =(case.lhs %lower)
|
||||
$(lhs rhs, rhs lhs)
|
||||
?& (is-upper-lower lhs)
|
||||
(is-lower-upper rhs)
|
||||
==
|
||||
::
|
||||
++ is-upper-lower
|
||||
|= i=case-state
|
||||
=(+.+.i [[%none ~] [%add 1] [%none ~]])
|
||||
::
|
||||
++ is-lower-upper
|
||||
|= i=case-state
|
||||
=(+.+.i [[%sub 1] [%none ~] [%sub 1]])
|
||||
::
|
||||
++ is-none
|
||||
|= i=case-state
|
||||
=(+.+.i [[%none ~] [%none ~] [%none ~]])
|
||||
::
|
||||
++ add-range
|
||||
|= c=case-fold
|
||||
^+ c
|
||||
?~ start.c
|
||||
c
|
||||
?: (is-none u.start.c)
|
||||
c
|
||||
?: ?& (gth point.prev.c point.u.start.c)
|
||||
(is-upper-lower u.start.c)
|
||||
==
|
||||
=/ node=case-node:unicode-data
|
||||
[`@ux`point.u.start.c `@ux`point.prev.c [%uplo ~] [%uplo ~] [%uplo ~]]
|
||||
c(out [node out.c])
|
||||
=/ node=case-node:unicode-data
|
||||
[`@ux`point.u.start.c `@ux`point.prev.c +.+.u.start.c]
|
||||
c(out [node out.c])
|
||||
::
|
||||
:: #
|
||||
:: # %tree-building
|
||||
:: #
|
||||
:: builds a binary search tree out of the list
|
||||
+| %tree-building
|
||||
++ build-tree
|
||||
|= a=(list case-node:unicode-data)
|
||||
^- case-tree:unicode-data
|
||||
:: there's probably a bottom up approach that doesn't require walking
|
||||
:: a list over and over again.
|
||||
::
|
||||
:: use ?: instead of ?~ to prevent the TMI problem.
|
||||
::
|
||||
?: =(~ a)
|
||||
~
|
||||
=+ len=(lent a)
|
||||
=/ split-at=@ (div len 2)
|
||||
=/ lhs (scag split-at a)
|
||||
=/ rhs (slag split-at a)
|
||||
?~ rhs
|
||||
?~ lhs
|
||||
~
|
||||
[i.lhs ~ ~]
|
||||
=+ x=[i.rhs $(a lhs) $(a t.rhs)]
|
||||
x
|
||||
--
|
||||
/: /===/lib/unicode-data /&unicode-data&/txt/
|
||||
::
|
||||
:: part 2: utility core
|
||||
::
|
||||
|%
|
||||
++ transform
|
||||
|= [a=tape fun=$-(@c @c)]
|
||||
%- tufa
|
||||
(turn (tuba a) fun)
|
||||
::
|
||||
++ to-upper
|
||||
:: returns the uppercase of unicode codepoint {a}
|
||||
|= a=@c
|
||||
^- @c
|
||||
:: special case ascii to not perform map lookup.
|
||||
?: (lte a max-ascii)
|
||||
?: &((gte a 'a') (lte a 'z'))
|
||||
(sub a 32)
|
||||
a
|
||||
(apply-table a case-table %upper)
|
||||
::
|
||||
++ to-lower
|
||||
:: returns the lowercase of unicode codepoint {a}
|
||||
|= a=@c
|
||||
^- @c
|
||||
?: (lte a max-ascii)
|
||||
?: &((gte a 'A') (lte a 'Z'))
|
||||
(add 32 a)
|
||||
a
|
||||
(apply-table a case-table %lower)
|
||||
::
|
||||
++ apply-table
|
||||
:: searches {table} and applies {type} to {a}.
|
||||
::
|
||||
:: this recursively walks the case tree {table}. if it finds an entry which
|
||||
:: matches on {a}, it will apply the offset. otherwise, returns {a}.
|
||||
|= [a=@c table=case-tree:unicode-data type=?($upper $lower $title)]
|
||||
^- @c
|
||||
?~ table
|
||||
a
|
||||
?: (lth a start.n.table)
|
||||
$(table l.table)
|
||||
?: (gth a end.n.table)
|
||||
$(table r.table)
|
||||
?. &((lte start.n.table a) (lte a end.n.table))
|
||||
a
|
||||
%^ apply-offset a type
|
||||
?- type
|
||||
$upper upper.n.table
|
||||
$lower lower.n.table
|
||||
$title title.n.table
|
||||
==
|
||||
::
|
||||
++ apply-offset
|
||||
:: applies an character offset to {a}.
|
||||
|= [a=@c type=?($upper $lower $title) offset=case-offset:unicode-data]
|
||||
^- @c
|
||||
?- offset
|
||||
{$add *} (add a a.offset)
|
||||
{$sub *} (sub a s.offset)
|
||||
{$none *} a
|
||||
::
|
||||
{$uplo *}
|
||||
?- type
|
||||
$upper (sub a 1)
|
||||
$lower (add a 1)
|
||||
$title (sub a 1)
|
||||
==
|
||||
==
|
||||
::
|
||||
++ max-ascii `@c`0x7f
|
||||
--
|
||||
::
|
||||
:: part 3: generator
|
||||
::
|
||||
:- %say
|
||||
|= $: [now=@da eny=@uvJ bec=beak]
|
||||
[n=tape ~]
|
||||
~
|
||||
==
|
||||
:- %tape (transform n to-upper)
|
1039
gen/cosmetic.hoon
File diff suppressed because it is too large
486
gen/lust.hoon
@ -1,486 +0,0 @@
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
=> |%
|
||||
::
|
||||
++ system
|
||||
$: rec/(map @ud theory)
|
||||
say/theory
|
||||
==
|
||||
++ library
|
||||
|
||||
::
|
||||
++ theory
|
||||
$@ $? $void
|
||||
$path
|
||||
$noun
|
||||
$hoon
|
||||
$wall
|
||||
$text
|
||||
$tape
|
||||
$cord
|
||||
$null
|
||||
$term
|
||||
$type
|
||||
$tank
|
||||
==
|
||||
$% {$list item/theory}
|
||||
{$pole item/theory}
|
||||
{$set item/theory}
|
||||
{$map key/theory value/theory}
|
||||
{$soft type/type data/theory}
|
||||
{$tuple items/(list theory)}
|
||||
{$label name/term data/theory}
|
||||
{$tree item/theory}
|
||||
{$help writ/writ theory/theory}
|
||||
{$gate from/theory to/theory}
|
||||
:: {$core library/}
|
||||
{$unit item/theory}
|
||||
{$atom aura/aura}
|
||||
{$choice cases/(list theory)}
|
||||
{$branch atom/theory cell/theory}
|
||||
{$bridge double/theory single/theory}
|
||||
{$switch cases/(list {stem/theory bulb/theory})}
|
||||
{$constant aura/aura value/@}
|
||||
{$pair p/theory q/theory}
|
||||
{$trel p/theory q/theory r/theory}
|
||||
{$qual p/theory q/theory r/theory s/theory}
|
||||
{$quil p/theory q/theory r/theory s/theory t/theory}
|
||||
--
|
||||
|%
|
||||
++ py
|
||||
|
||||
|
||||
|
||||
++ us :: prettyprinter
|
||||
=> |%
|
||||
++ cape {p/(map @ud wine) q/wine} ::
|
||||
++ wine ::
|
||||
$@ $? $noun ::
|
||||
$path ::
|
||||
$type ::
|
||||
$void ::
|
||||
$wall ::
|
||||
$wool ::
|
||||
$yarn ::
|
||||
== ::
|
||||
$% {$mato p/term} ::
|
||||
{$core p/(list @ta) q/wine} ::
|
||||
{$face p/term q/wine} ::
|
||||
{$list p/term q/wine} ::
|
||||
{$pear p/term q/@} ::
|
||||
{$bcwt p/(list wine)} ::
|
||||
{$plot p/(list wine)} ::
|
||||
{$stop p/@ud} ::
|
||||
{$tree p/term q/wine} ::
|
||||
{$unit p/term q/wine} ::
|
||||
== ::
|
||||
--
|
||||
|_ sut/type
|
||||
++ dash
|
||||
|= {mil/tape lim/char} ^- tape
|
||||
:- lim
|
||||
|- ^- tape
|
||||
?~ mil [lim ~]
|
||||
?: =(lim i.mil) ['\\' i.mil $(mil t.mil)]
|
||||
?: =('\\' i.mil) ['\\' i.mil $(mil t.mil)]
|
||||
?: (lte ' ' i.mil) [i.mil $(mil t.mil)]
|
||||
['\\' ~(x ne (rsh 2 1 i.mil)) ~(x ne (end 2 1 i.mil)) $(mil t.mil)]
|
||||
::
|
||||
++ deal |=(lum/* (dish dole lum))
|
||||
++ dial
|
||||
|= ham/cape
|
||||
=+ gid=*(set @ud)
|
||||
=< `tank`-:$
|
||||
|%
|
||||
++ many
|
||||
|= haz/(list wine)
|
||||
^- {(list tank) (set @ud)}
|
||||
?~ haz [~ gid]
|
||||
=^ mor gid $(haz t.haz)
|
||||
=^ dis gid ^$(q.ham i.haz)
|
||||
[[dis mor] gid]
|
||||
::
|
||||
++ $
|
||||
^- {tank (set @ud)}
|
||||
?- q.ham
|
||||
$noun :_(gid [%leaf '*' ~])
|
||||
$path :_(gid [%leaf '/' ~])
|
||||
$type :_(gid [%leaf '#' 't' ~])
|
||||
$void :_(gid [%leaf '#' '!' ~])
|
||||
$wool :_(gid [%leaf '*' '"' '"' ~])
|
||||
$wall :_(gid [%leaf '*' '\'' '\'' ~])
|
||||
$yarn :_(gid [%leaf '"' '"' ~])
|
||||
{$mato *} :_(gid [%leaf '@' (trip p.q.ham)])
|
||||
{$core *}
|
||||
=^ cox gid $(q.ham q.q.ham)
|
||||
:_ gid
|
||||
:+ %rose
|
||||
[[' ' ~] ['<' ~] ['>' ~]]
|
||||
|- ^- (list tank)
|
||||
?~ p.q.ham [cox ~]
|
||||
[[%leaf (rip 3 i.p.q.ham)] $(p.q.ham t.p.q.ham)]
|
||||
::
|
||||
{$face *}
|
||||
=^ cox gid $(q.ham q.q.ham)
|
||||
:_(gid [%palm [['/' ~] ~ ~ ~] [%leaf (trip p.q.ham)] cox ~])
|
||||
::
|
||||
{$list *}
|
||||
=^ cox gid $(q.ham q.q.ham)
|
||||
:_(gid [%rose [" " (weld (trip p.q.ham) "(") ")"] cox ~])
|
||||
::
|
||||
{$bcwt *}
|
||||
=^ coz gid (many p.q.ham)
|
||||
:_(gid [%rose [[' ' ~] ['?' '(' ~] [')' ~]] coz])
|
||||
::
|
||||
{$plot *}
|
||||
=^ coz gid (many p.q.ham)
|
||||
:_(gid [%rose [[' ' ~] ['{' ~] ['}' ~]] coz])
|
||||
::
|
||||
{$pear *}
|
||||
:_(gid [%leaf '$' ~(rend co [%$ p.q.ham q.q.ham])])
|
||||
::
|
||||
{$stop *}
|
||||
=+ num=~(rend co [%$ %ud p.q.ham])
|
||||
?: (~(has in gid) p.q.ham)
|
||||
:_(gid [%leaf '#' num])
|
||||
=^ cox gid
|
||||
%= $
|
||||
gid (~(put in gid) p.q.ham)
|
||||
q.ham (~(got by p.ham) p.q.ham)
|
||||
==
|
||||
:_(gid [%palm [['.' ~] ~ ~ ~] [%leaf ['^' '#' num]] cox ~])
|
||||
::
|
||||
{$tree *}
|
||||
=^ cox gid $(q.ham q.q.ham)
|
||||
:_(gid [%rose [" " (weld (trip p.q.ham) "(") ")"] cox ~])
|
||||
::
|
||||
{$unit *}
|
||||
=^ cox gid $(q.ham q.q.ham)
|
||||
:_(gid [%rose [" " (weld (trip p.q.ham) "(") ")"] cox ~])
|
||||
==
|
||||
--
|
||||
::
|
||||
++ dish
|
||||
|= {ham/cape lum/*} ^- tank
|
||||
~| [%dish-h ?@(q.ham q.ham -.q.ham)]
|
||||
~| [%lump lum]
|
||||
~| [%ham ham]
|
||||
%- need
|
||||
=| gil/(set {@ud *})
|
||||
|- ^- (unit tank)
|
||||
?- q.ham
|
||||
$noun
|
||||
%= $
|
||||
q.ham
|
||||
?: ?=(@ lum)
|
||||
[%mato %$]
|
||||
:- %plot
|
||||
|- ^- (list wine)
|
||||
[%noun ?:(?=(@ +.lum) [[%mato %$] ~] $(lum +.lum))]
|
||||
==
|
||||
::
|
||||
$path
|
||||
:- ~
|
||||
:+ %rose
|
||||
[['/' ~] ['/' ~] ~]
|
||||
|- ^- (list tank)
|
||||
?~ lum ~
|
||||
?@ lum !!
|
||||
?> ?=(@ -.lum)
|
||||
[[%leaf (rip 3 -.lum)] $(lum +.lum)]
|
||||
::
|
||||
$type
|
||||
=+ tyr=|.((dial dole))
|
||||
=+ vol=tyr(sut lum)
|
||||
=+ cis=((hard tank) .*(vol -:vol))
|
||||
:^ ~ %palm
|
||||
[~ ~ ~ ~]
|
||||
[[%leaf '#' 't' '/' ~] cis ~]
|
||||
::
|
||||
$wall
|
||||
:- ~
|
||||
:+ %rose
|
||||
[[' ' ~] ['<' '|' ~] ['|' '>' ~]]
|
||||
|- ^- (list tank)
|
||||
?~ lum ~
|
||||
?@ lum !!
|
||||
[[%leaf (trip ((hard @) -.lum))] $(lum +.lum)]
|
||||
::
|
||||
$wool
|
||||
:- ~
|
||||
:+ %rose
|
||||
[[' ' ~] ['<' '<' ~] ['>' '>' ~]]
|
||||
|- ^- (list tank)
|
||||
?~ lum ~
|
||||
?@ lum !!
|
||||
[(need ^$(q.ham %yarn, lum -.lum)) $(lum +.lum)]
|
||||
::
|
||||
$yarn
|
||||
[~ %leaf (dash (tape lum) '"')]
|
||||
::
|
||||
$void
|
||||
~
|
||||
::
|
||||
{$mato *}
|
||||
?. ?=(@ lum)
|
||||
~
|
||||
:+ ~
|
||||
%leaf
|
||||
?+ (rash p.q.ham ;~(sfix (cook crip (star low)) (star hig)))
|
||||
~(rend co [%$ p.q.ham lum])
|
||||
$$ ~(rend co [%$ %ud lum])
|
||||
$t (dash (rip 3 lum) '\'')
|
||||
$tas ['%' ?.(=(0 lum) (rip 3 lum) ['$' ~])]
|
||||
==
|
||||
::
|
||||
{$core *}
|
||||
:: XX needs rethinking for core metal
|
||||
:: ?. ?=(^ lum) ~
|
||||
:: => .(lum `*`lum)
|
||||
:: =- ?~(tok ~ [~ %rose [[' ' ~] ['<' ~] ['>' ~]] u.tok])
|
||||
:: ^= tok
|
||||
:: |- ^- (unit (list tank))
|
||||
:: ?~ p.q.ham
|
||||
:: =+ den=^$(q.ham q.q.ham)
|
||||
:: ?~(den ~ [~ u.den ~])
|
||||
:: =+ mur=$(p.q.ham t.p.q.ham, lum +.lum)
|
||||
:: ?~(mur ~ [~ [[%leaf (rip 3 i.p.q.ham)] u.mur]])
|
||||
[~ (dial ham)]
|
||||
::
|
||||
{$face *}
|
||||
=+ wal=$(q.ham q.q.ham)
|
||||
?~ wal
|
||||
~
|
||||
[~ %palm [['=' ~] ~ ~ ~] [%leaf (trip p.q.ham)] u.wal ~]
|
||||
::
|
||||
{$list *}
|
||||
?: =(~ lum)
|
||||
[~ %leaf '~' ~]
|
||||
=- ?~ tok
|
||||
~
|
||||
[~ %rose [[' ' ~] ['~' '[' ~] [']' ~]] u.tok]
|
||||
^= tok
|
||||
|- ^- (unit (list tank))
|
||||
?: ?=(@ lum)
|
||||
?.(=(~ lum) ~ [~ ~])
|
||||
=+ [for=^$(q.ham q.q.ham, lum -.lum) aft=$(lum +.lum)]
|
||||
?. &(?=(^ for) ?=(^ aft))
|
||||
~
|
||||
[~ u.for u.aft]
|
||||
::
|
||||
{$bcwt *}
|
||||
|- ^- (unit tank)
|
||||
?~ p.q.ham
|
||||
~
|
||||
=+ wal=^$(q.ham i.p.q.ham)
|
||||
?~ wal
|
||||
$(p.q.ham t.p.q.ham)
|
||||
wal
|
||||
::
|
||||
{$plot *}
|
||||
=- ?~ tok
|
||||
~
|
||||
[~ %rose [[' ' ~] ['[' ~] [']' ~]] u.tok]
|
||||
^= tok
|
||||
|- ^- (unit (list tank))
|
||||
?~ p.q.ham
|
||||
~
|
||||
?: ?=({* ~} p.q.ham)
|
||||
=+ wal=^$(q.ham i.p.q.ham)
|
||||
?~(wal ~ [~ [u.wal ~]])
|
||||
?@ lum
|
||||
~
|
||||
=+ gim=^$(q.ham i.p.q.ham, lum -.lum)
|
||||
?~ gim
|
||||
~
|
||||
=+ myd=$(p.q.ham t.p.q.ham, lum +.lum)
|
||||
?~ myd
|
||||
~
|
||||
[~ u.gim u.myd]
|
||||
::
|
||||
{$pear *}
|
||||
?. =(lum q.q.ham)
|
||||
~
|
||||
=. p.q.ham
|
||||
(rash p.q.ham ;~(sfix (cook crip (star low)) (star hig)))
|
||||
=+ fox=$(q.ham [%mato p.q.ham])
|
||||
?> ?=({~ $leaf ^} fox)
|
||||
?: ?=(?($n $tas) p.q.ham)
|
||||
fox
|
||||
[~ %leaf '%' p.u.fox]
|
||||
::
|
||||
{$stop *}
|
||||
?: (~(has in gil) [p.q.ham lum]) ~
|
||||
=+ kep=(~(get by p.ham) p.q.ham)
|
||||
?~ kep
|
||||
~|([%stop-loss p.q.ham] !!)
|
||||
$(gil (~(put in gil) [p.q.ham lum]), q.ham u.kep)
|
||||
::
|
||||
{$tree *}
|
||||
=- ?~ tok
|
||||
~
|
||||
[~ %rose [[' ' ~] ['{' ~] ['}' ~]] u.tok]
|
||||
^= tok
|
||||
=+ tuk=*(list tank)
|
||||
|- ^- (unit (list tank))
|
||||
?: =(~ lum)
|
||||
[~ tuk]
|
||||
?. ?=({n/* l/* r/*} lum)
|
||||
~
|
||||
=+ rol=$(lum r.lum)
|
||||
?~ rol
|
||||
~
|
||||
=+ tim=^$(q.ham q.q.ham, lum n.lum)
|
||||
?~ tim
|
||||
~
|
||||
$(lum l.lum, tuk [u.tim u.rol])
|
||||
::
|
||||
{$unit *}
|
||||
?@ lum
|
||||
?.(=(~ lum) ~ [~ %leaf '~' ~])
|
||||
?. =(~ -.lum)
|
||||
~
|
||||
=+ wal=$(q.ham q.q.ham, lum +.lum)
|
||||
?~ wal
|
||||
~
|
||||
[~ %rose [[' ' ~] ['[' ~] [']' ~]] [%leaf '~' ~] u.wal ~]
|
||||
==
|
||||
::
|
||||
++ doge
|
||||
|= ham/cape
|
||||
=- ?+ woz woz
|
||||
{$list * {$mato $'ta'}} %path
|
||||
{$list * {$mato $'t'}} %wall
|
||||
{$list * {$mato $'tD'}} %yarn
|
||||
{$list * $yarn} %wool
|
||||
==
|
||||
^= woz
|
||||
^- wine
|
||||
?. ?=({$stop *} q.ham)
|
||||
?: ?& ?= {$bcwt {$pear $n $0} {$plot {$pear $n $0} {$face *} ~} ~}
|
||||
q.ham
|
||||
=(1 (met 3 p.i.t.p.i.t.p.q.ham))
|
||||
==
|
||||
[%unit =<([p q] i.t.p.i.t.p.q.ham)]
|
||||
q.ham
|
||||
=+ may=(~(get by p.ham) p.q.ham)
|
||||
?~ may
|
||||
q.ham
|
||||
=+ nul=[%pear %n 0]
|
||||
?. ?& ?=({$bcwt *} u.may)
|
||||
?=({* * ~} p.u.may)
|
||||
|(=(nul i.p.u.may) =(nul i.t.p.u.may))
|
||||
==
|
||||
q.ham
|
||||
=+ din=?:(=(nul i.p.u.may) i.t.p.u.may i.p.u.may)
|
||||
?: ?& ?=({$plot {$face *} {$face * $stop *} ~} din)
|
||||
=(p.q.ham p.q.i.t.p.din)
|
||||
=(1 (met 3 p.i.p.din))
|
||||
=(1 (met 3 p.i.t.p.din))
|
||||
==
|
||||
:+ %list
|
||||
(cat 3 p.i.p.din p.i.t.p.din)
|
||||
q.i.p.din
|
||||
?: ?& ?= $: $plot
|
||||
{$face *}
|
||||
{$face * $stop *}
|
||||
{{$face * $stop *} ~}
|
||||
==
|
||||
din
|
||||
=(p.q.ham p.q.i.t.p.din)
|
||||
=(p.q.ham p.q.i.t.t.p.din)
|
||||
=(1 (met 3 p.i.p.din))
|
||||
=(1 (met 3 p.i.t.p.din))
|
||||
=(1 (met 3 p.i.t.t.p.din))
|
||||
==
|
||||
:+ %tree
|
||||
%^ cat
|
||||
3
|
||||
p.i.p.din
|
||||
(cat 3 p.i.t.p.din p.i.t.t.p.din)
|
||||
q.i.p.din
|
||||
q.ham
|
||||
::
|
||||
++ dole
|
||||
^- cape
|
||||
=+ gil=*(set type)
|
||||
=+ dex=[p=*(map type @) q=*(map @ wine)]
|
||||
=< [q.p q]
|
||||
|- ^- {p/{p/(map type @) q/(map @ wine)} q/wine}
|
||||
=- [p.tez (doge q.p.tez q.tez)]
|
||||
^= tez
|
||||
^- {p/{p/(map type @) q/(map @ wine)} q/wine}
|
||||
?: (~(meet ut sut) -:!>(*type))
|
||||
[dex %type]
|
||||
?- sut
|
||||
$noun [dex sut]
|
||||
$void [dex sut]
|
||||
{$atom *} [dex ?~(q.sut [%mato p.sut] [%pear p.sut u.q.sut])]
|
||||
{$cell *}
|
||||
=+ hin=$(sut p.sut)
|
||||
=+ yon=$(dex p.hin, sut q.sut)
|
||||
:- p.yon
|
||||
:- %plot
|
||||
?:(?=({$plot *} q.yon) [q.hin p.q.yon] [q.hin q.yon ~])
|
||||
::
|
||||
{$core *}
|
||||
=+ yad=$(sut p.sut)
|
||||
:- p.yad
|
||||
=+ ^= doy ^- {p/(list @ta) q/wine}
|
||||
?: ?=({$core *} q.yad)
|
||||
[p.q.yad q.q.yad]
|
||||
[~ q.yad]
|
||||
:- %core
|
||||
:_ q.doy
|
||||
:_ p.doy
|
||||
%^ cat 3
|
||||
%~ rent co
|
||||
:+ %$ %ud
|
||||
%- ~(rep by (~(run by q.s.q.sut) |=(tomb ~(wyt by q))))
|
||||
|=([[@ a=@u] b=@u] (add a b))
|
||||
==
|
||||
%^ cat 3
|
||||
?-(p.q.sut $gold '.', $iron '|', $lead '?', $zinc '&')
|
||||
=+ gum=(mug q.s.q.sut)
|
||||
%+ can 3
|
||||
:~ [1 (add 'a' (mod gum 26))]
|
||||
[1 (add 'a' (mod (div gum 26) 26))]
|
||||
[1 (add 'a' (mod (div gum 676) 26))]
|
||||
==
|
||||
::
|
||||
{$help *}
|
||||
$(sut q.sut)
|
||||
::
|
||||
{$face *}
|
||||
=+ yad=$(sut q.sut)
|
||||
?^(q.p.sut yad [p.yad [%face q.p.sut q.yad]])
|
||||
::
|
||||
{$fork *}
|
||||
=+ yed=~(tap in p.sut)
|
||||
=- [p [%bcwt q]]
|
||||
|- ^- {p/{p/(map type @) q/(map @ wine)} q/(list wine)}
|
||||
?~ yed
|
||||
[dex ~]
|
||||
=+ mor=$(yed t.yed)
|
||||
=+ dis=^$(dex p.mor, sut i.yed)
|
||||
[p.dis q.dis q.mor]
|
||||
::
|
||||
{$hold *}
|
||||
=+ hey=(~(get by p.dex) sut)
|
||||
?^ hey
|
||||
[dex [%stop u.hey]]
|
||||
?: (~(has in gil) sut)
|
||||
=+ dyr=+(~(wyt by p.dex))
|
||||
[[(~(put by p.dex) sut dyr) q.dex] [%stop dyr]]
|
||||
=+ rom=$(gil (~(put in gil) sut), sut ~(repo ut sut))
|
||||
=+ rey=(~(get by p.p.rom) sut)
|
||||
?~ rey
|
||||
rom
|
||||
[[p.p.rom (~(put by q.p.rom) u.rey q.rom)] [%stop u.rey]]
|
||||
==
|
||||
::
|
||||
++ duck (dial dole)
|
||||
--
|
366
gen/musk.hoon
@ -1,366 +0,0 @@
|
||||
::
|
||||
::::
|
||||
::
|
||||
:- %say
|
||||
|= {^ {{typ/type gen/hoon ~} ~}}
|
||||
=< :- %noun
|
||||
=+ pro=(~(mint ut typ) %noun gen)
|
||||
~_ (~(dunk ut typ) 'blow-subject')
|
||||
=+ bus=(bran:musk typ)
|
||||
~& [%subject-mask mask.bus]
|
||||
=+ jon=(apex:musk bus q.pro)
|
||||
?~ jon
|
||||
~& %constant-stopped
|
||||
!!
|
||||
?. ?=(%& -.u.jon)
|
||||
~& %constant-blocked
|
||||
!!
|
||||
:: [p.pro [%1 p.u.jon]]
|
||||
p.u.jon
|
||||
|%
|
||||
++ musk :: nock with block set
|
||||
=> |%
|
||||
++ block
|
||||
:: identity of resource awaited
|
||||
:: XX parameterize
|
||||
noun
|
||||
::
|
||||
++ result
|
||||
:: internal interpreter result
|
||||
::
|
||||
$@(~ seminoun)
|
||||
::
|
||||
++ seminoun
|
||||
:: partial noun; blocked subtrees are ~
|
||||
::
|
||||
{mask/stencil data/noun}
|
||||
::
|
||||
++ stencil
|
||||
:: noun knowledge map
|
||||
::
|
||||
$% :: no; noun has partial block substructure
|
||||
::
|
||||
{%| left/stencil rite/stencil}
|
||||
:: yes; noun is either fully complete, or fully blocked
|
||||
::
|
||||
{%& blocks/(set block)}
|
||||
==
|
||||
::
|
||||
++ output
|
||||
:: nil; interpreter stopped
|
||||
::
|
||||
%- unit
|
||||
:: yes, complete noun; no, list of blocks
|
||||
::
|
||||
(each noun (list block))
|
||||
--
|
||||
|%
|
||||
++ bran
|
||||
|= sut/type
|
||||
=+ gil=*(set type)
|
||||
|- ^- seminoun
|
||||
?- sut
|
||||
$noun [&+[~ ~ ~] ~]
|
||||
$void [&+[~ ~ ~] ~]
|
||||
{$atom *} ?~(q.sut [&+[~ ~ ~] ~] [&+~ u.q.sut])
|
||||
{$cell *} (combine $(sut p.sut) $(sut q.sut))
|
||||
{$core *} %+ combine:musk
|
||||
?~ p.s.q.sut [&+[~ ~ ~] ~]
|
||||
[&+~ p.s.q.sut]
|
||||
$(sut p.sut)
|
||||
{$face *} $(sut ~(repo ut sut))
|
||||
{$fork *} [&+[~ ~ ~] ~]
|
||||
{$help *} $(sut ~(repo ut sut))
|
||||
{$hold *} ?: (~(has in gil) sut)
|
||||
[&+[~ ~ ~] ~]
|
||||
$(sut ~(repo ut sut), gil (~(put in gil) sut))
|
||||
==
|
||||
++ abet
|
||||
:: simplify raw result
|
||||
::
|
||||
|= $: :: noy: raw result
|
||||
::
|
||||
noy/result
|
||||
==
|
||||
^- output
|
||||
:: propagate stop
|
||||
::
|
||||
?~ noy ~
|
||||
:- ~
|
||||
:: merge all blocking sets
|
||||
::
|
||||
=/ blocks (squash mask.noy)
|
||||
?: =(~ blocks)
|
||||
:: no blocks, data is complete
|
||||
::
|
||||
&+data.noy
|
||||
:: reduce block set to block list
|
||||
::
|
||||
|+~(tap in blocks)
|
||||
::
|
||||
++ apex
|
||||
:: execute nock on partial subject
|
||||
::
|
||||
|= $: :: bus: subject, a partial noun
|
||||
:: fol: formula, a complete noun
|
||||
::
|
||||
bus/seminoun
|
||||
fol/noun
|
||||
==
|
||||
^- output
|
||||
:: simplify result
|
||||
::
|
||||
%- abet
|
||||
:: interpreter loop
|
||||
::
|
||||
|- ^- result
|
||||
:: ~& [%apex-fol fol]
|
||||
:: ~& [%apex-mac mask.bus]
|
||||
:: =- ~& [%apex-pro-mac ?@(foo ~ ~!(foo mask.foo))]
|
||||
:: foo
|
||||
:: ^= foo
|
||||
:: ^- result
|
||||
?@ fol
|
||||
:: bad formula, stop
|
||||
::
|
||||
~
|
||||
?: ?=(^ -.fol)
|
||||
:: hed: interpret head
|
||||
::
|
||||
=+ hed=$(fol -.fol)
|
||||
:: propagate stop
|
||||
::
|
||||
?~ hed ~
|
||||
:: tal: interpret tail
|
||||
::
|
||||
=+ tal=$(fol +.fol)
|
||||
:: propagate stop
|
||||
::
|
||||
?~ tal ~
|
||||
:: combine
|
||||
::
|
||||
(combine hed tal)
|
||||
?+ fol
|
||||
:: bad formula; stop
|
||||
::
|
||||
~
|
||||
:: 0; fragment
|
||||
::
|
||||
{$0 b/@}
|
||||
:: if bad axis, stop
|
||||
::
|
||||
?: =(0 b.fol) ~
|
||||
:: reduce to fragment
|
||||
::
|
||||
(fragment b.fol bus)
|
||||
::
|
||||
:: 1; constant
|
||||
::
|
||||
{$1 b/*}
|
||||
:: constant is complete
|
||||
::
|
||||
[&+~ b.fol]
|
||||
::
|
||||
:: 2; recursion
|
||||
::
|
||||
{$2 b/* c/*}
|
||||
:: require complete formula
|
||||
::
|
||||
%+ require
|
||||
:: compute formula with current subject
|
||||
::
|
||||
$(fol c.fol)
|
||||
|= :: ryf: next formula
|
||||
::
|
||||
ryf/noun
|
||||
:: lub: next subject
|
||||
::
|
||||
=+ lub=^$(fol b.fol)
|
||||
:: propagate stop
|
||||
::
|
||||
?~ lub ~
|
||||
:: recurse
|
||||
::
|
||||
^$(fol ryf, bus lub)
|
||||
::
|
||||
:: 3; probe
|
||||
::
|
||||
{$3 b/*}
|
||||
%+ require
|
||||
$(fol b.fol)
|
||||
|= :: fig: probe input
|
||||
::
|
||||
fig/noun
|
||||
:: yes if cell, no if atom
|
||||
::
|
||||
[&+~ .?(fig)]
|
||||
::
|
||||
:: 4; increment
|
||||
::
|
||||
{$4 b/*}
|
||||
%+ require
|
||||
$(fol b.fol)
|
||||
|= :: fig: increment input
|
||||
::
|
||||
fig/noun
|
||||
:: stop for cells, increment for atoms
|
||||
::
|
||||
?^(fig ~ [&+~ +(fig)])
|
||||
::
|
||||
:: 5; compare
|
||||
::
|
||||
{$5 b/*}
|
||||
%+ require
|
||||
$(fol b.fol)
|
||||
|= :: fig: operator input
|
||||
::
|
||||
fig/noun
|
||||
:: stop for atoms, compare cells
|
||||
::
|
||||
?@(fig ~ [&+~ =(-.fig +.fig)])
|
||||
::
|
||||
:: 6; if-then-else
|
||||
::
|
||||
{$6 b/* c/* d/*}
|
||||
:: use standard macro expansion (slow)
|
||||
::
|
||||
$(fol =>(fol [2 [0 1] 2 [1 c d] [1 0] 2 [1 2 3] [1 0] 4 4 b]))
|
||||
::
|
||||
:: 7; composition
|
||||
::
|
||||
{$7 b/* c/*}
|
||||
:: use standard macro expansion (slow)
|
||||
::
|
||||
$(fol =>(fol [2 b 1 c]))
|
||||
::
|
||||
:: 8; declaration
|
||||
::
|
||||
{$8 b/* c/*}
|
||||
:: use standard macro expansion (slow)
|
||||
::
|
||||
$(fol =>(fol [7 [[7 [0 1] b] 0 1] c]))
|
||||
::
|
||||
:: 9; invocation
|
||||
::
|
||||
{$9 b/* c/*}
|
||||
:: use standard macro expansion (slow)
|
||||
::
|
||||
$(fol =>(fol [7 c 2 [0 1] 0 b]))
|
||||
::
|
||||
:: 10; static hint
|
||||
::
|
||||
{$10 @ c/*}
|
||||
:: ignore hint
|
||||
::
|
||||
$(fol c.fol)
|
||||
::
|
||||
:: 10; dynamic hint
|
||||
::
|
||||
{$10 {b/* c/*} d/*}
|
||||
:: noy: dynamic hint
|
||||
::
|
||||
=+ noy=$(fol c.fol)
|
||||
:: propagate stop
|
||||
::
|
||||
?~ noy ~
|
||||
:: otherwise, ignore hint
|
||||
::
|
||||
$(fol d.fol)
|
||||
==
|
||||
::
|
||||
++ combine
|
||||
:: combine a pair of seminouns
|
||||
::
|
||||
|= $: :: hed: head of pair
|
||||
:: tal: tail of pair
|
||||
::
|
||||
hed/seminoun
|
||||
tal/seminoun
|
||||
==
|
||||
^- seminoun
|
||||
?. ?& &(?=(%& -.mask.hed) ?=(%& -.mask.tal))
|
||||
=(=(~ blocks.mask.hed) =(~ blocks.mask.tal))
|
||||
==
|
||||
:: default merge
|
||||
::
|
||||
[|+[mask.hed mask.tal] [data.hed data.tal]]
|
||||
:: both sides total
|
||||
::
|
||||
?: =(~ blocks.mask.hed)
|
||||
:: both sides are complete
|
||||
::
|
||||
[&+~ data.hed data.tal]
|
||||
:: both sides are blocked
|
||||
::
|
||||
[&+(~(uni in blocks.mask.hed) blocks.mask.tal) ~]
|
||||
::
|
||||
++ fragment
|
||||
:: seek to an axis in a seminoun
|
||||
::
|
||||
|= $: :: axe: tree address of subtree
|
||||
:: bus: partial noun
|
||||
::
|
||||
axe/axis
|
||||
bus/seminoun
|
||||
==
|
||||
|- ^- result
|
||||
:: 1 is the root
|
||||
::
|
||||
?: =(1 axe) bus
|
||||
:: now: 2 or 3, top of axis
|
||||
:: lat: rest of axis
|
||||
::
|
||||
=+ [now=(cap axe) lat=(mas axe)]
|
||||
?- -.mask.bus
|
||||
:: subject is fully blocked or complete
|
||||
::
|
||||
%& :: if fully blocked, produce self
|
||||
::
|
||||
?^ blocks.mask.bus bus
|
||||
:: descending into atom, stop
|
||||
::
|
||||
?@ data.bus ~
|
||||
:: descend into complete cell
|
||||
::
|
||||
$(axe lat, bus [&+~ ?:(=(2 now) -.data.bus +.data.bus)])
|
||||
:: subject is partly blocked
|
||||
::
|
||||
%| :: descend into partial cell
|
||||
::
|
||||
%= $
|
||||
axe lat
|
||||
bus ?: =(2 now)
|
||||
[left.mask.bus -.data.bus]
|
||||
[rite.mask.bus +.data.bus]
|
||||
== ==
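:: illustrative example (not from the original source): the cap/mas step
:: above peels one step of the axis per iteration, e.g. for axis 5
::   [(cap 5) (mas 5)]  :: -> [%2 3], i.e. descend into the head, then take axis 3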
|
||||
:: require complete intermediate step
|
||||
::
|
||||
++ require
|
||||
|= $: noy/result
|
||||
yen/$-(noun result)
|
||||
==
|
||||
^- result
|
||||
:: propagate stop
|
||||
::
|
||||
?~ noy ~
|
||||
:: if partial block, squash blocks and stop
|
||||
::
|
||||
?: ?=(%| -.mask.noy) [&+(squash mask.noy) ~]
|
||||
:: if full block, propagate block
|
||||
::
|
||||
?: ?=(^ blocks.mask.noy) [mask.noy ~]
|
||||
:: otherwise use complete noun
|
||||
::
|
||||
(yen data.noy)
|
||||
::
|
||||
++ squash
|
||||
:: convert stencil to block set
|
||||
::
|
||||
|= tyn/stencil
|
||||
^- (set block)
|
||||
?- -.tyn
|
||||
%& blocks.tyn
|
||||
%| (~(uni in $(tyn left.tyn)) $(tyn rite.tyn))
|
||||
==
|
||||
--
|
||||
--
|
194
gen/p2.hoon
194
gen/p2.hoon
@ -1,194 +0,0 @@
|
||||
/? 310
|
||||
::
|
||||
/+ pprint
|
||||
::
|
||||
!:
|
||||
::
|
||||
:- %say
|
||||
::
|
||||
=< |= {^ {{=arg ~} ~}}
|
||||
^- [%txt wain]
|
||||
::
|
||||
=/ v=vase
|
||||
?- target.arg
|
||||
^ target.arg
|
||||
%all !>(all-examples)
|
||||
%demo !>(demo-example)
|
||||
%test !>(test-example)
|
||||
%type !>(type-example)
|
||||
%xml !>(xml-example)
|
||||
%kernel !>(xray-the-kernel-example)
|
||||
%parser !>(xray-the-parser-example)
|
||||
==
|
||||
::
|
||||
:- %txt
|
||||
?- print.arg
|
||||
%type (render-type:pprint p.v)
|
||||
%val (render-vase:pprint v)
|
||||
%both (render-vase-with-type:pprint v)
|
||||
==
|
||||
::
|
||||
|%
|
||||
::
|
||||
+$ arg
|
||||
$: print=?(%type %val %both)
|
||||
target=$@(?(%all %demo %test %type %xml %kernel %parser) vase)
|
||||
==
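:: illustrative usage (not from the original source), assuming the usual
:: %say-generator calling convention for this file at gen/p2.hoon:
::   +p2 [%val %demo]     :: pretty-print demo-example as a value
::   +p2 [%type %kernel]  :: pretty-print the type of the kernel example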
|
||||
::
|
||||
+$ option $?(%a %b %c)
|
||||
::
|
||||
+$ junct $@(@ {@ cord})
|
||||
::
|
||||
+$ union $%([%list (list ~)] [%unit (unit ~)])
|
||||
::
|
||||
+$ conjunct $^ [[@ @] cord]
|
||||
[@ cord]
|
||||
::
|
||||
+$ misjunct $^([~ @] [cord @])
|
||||
::
|
||||
++ forks-example
|
||||
:* :- %junct ^- (list junct) ~[3 [4 '5']]
|
||||
:- %conjunct ^- (list conjunct) ~[[3 '4'] [[5 6] '7']]
|
||||
:- %union ^- (list union) ~[[%list [~ ~]] [%unit [~ ~]]]
|
||||
:- %option ^- (list option) ~[%a %a %b %c]
|
||||
:- %misjunct ^- (list misjunct) ~[[~ 3] [~ 4]]
|
||||
%nice
|
||||
==
|
||||
::
|
||||
++ all-examples
|
||||
:*
|
||||
:- %type type-example
|
||||
:- %cores core-example
|
||||
:- %add ..add
|
||||
:- zuse-example
|
||||
:- %demo demo-example
|
||||
:- %forks forks-example
|
||||
%eof
|
||||
==
|
||||
::
|
||||
++ type-example
|
||||
^- type
|
||||
-:!>(`(map ? (unit (list cord)))`~)
|
||||
::
|
||||
++ xray-the-parser-example
|
||||
=> ..musk
|
||||
|% ++ x ~ --
|
||||
::
|
||||
++ xray-the-kernel-example
|
||||
|% ++ x ~ --
|
||||
::
|
||||
++ zuse-example
|
||||
[%zuse ..zuse]
|
||||
::
|
||||
++ cores-example
|
||||
|^ :*
|
||||
[%trivial trivial-core-example]
|
||||
[%gate gate-example]
|
||||
[%core core-example]
|
||||
==
|
||||
::
|
||||
--
|
||||
::
|
||||
++ trivial-core-example
|
||||
=> ~
|
||||
|% ++ x 3 --
|
||||
::
|
||||
++ core-example
|
||||
=> [=gate-example]
|
||||
|%
|
||||
++ dup gate-example
|
||||
++ const
|
||||
|= x=* ^- $-(* *)
|
||||
|= * ^- *
|
||||
x
|
||||
--
|
||||
::
|
||||
++ gate-example
|
||||
=> ~
|
||||
|= x=@ud
|
||||
^- [@ud @ud]
|
||||
[x x]
|
||||
::
|
||||
++ test-example
|
||||
:*
|
||||
`(list ?)`~[%.y %.n]
|
||||
`(list ~)`~[~ ~]
|
||||
`(unit ~)``~
|
||||
/a/path
|
||||
==
|
||||
::
|
||||
++ hoon-example
|
||||
^- hoon
|
||||
:+ %brcn ~
|
||||
%- ~(gas by *(map term tome))
|
||||
^- (list (pair term tome))
|
||||
:_ ~
|
||||
^- (pair term tome)
|
||||
:- 'chapter'
|
||||
^- tome
|
||||
:- `what`~
|
||||
%- ~(gas by *(map term hoon))
|
||||
^- (list (pair term hoon))
|
||||
:_ ~
|
||||
:- 'arm'
|
||||
:+ %brts `spec`[%bsts 'x' [%base [%atom ~.ud]]]
|
||||
:- %clsg
|
||||
~[[%wing ~['x']] [%$ 0]]
|
||||
::
|
||||
++ demo-example
|
||||
:* [~ %.y %.n 1 0x2 ~ ~.knot 'cord' %const]
|
||||
:* [%tape "a tape"]
|
||||
[%path /path/literal `path`/typed/path]
|
||||
[%unit `(unit @)`[~ 9]]
|
||||
[%list [`?`%.y `(list ?)`~[%.y %.n %.y]]]
|
||||
%nice
|
||||
==
|
||||
[%hoon hoon-example]
|
||||
[%type -:!>(`(unit (list tape))`~)]
|
||||
[%json-and-xml json-example xml-example]
|
||||
%cool
|
||||
==
|
||||
::
|
||||
++ xml-example
|
||||
|^ ^- manx
|
||||
:- ['json' ~]
|
||||
:~ (json-to-xml json-example)
|
||||
==
|
||||
++ json-to-xml
|
||||
|= j=json
|
||||
^- manx
|
||||
?- j
|
||||
~ [['nil' ~] ~]
|
||||
[%a *] [['array' ~] (turn p.j json-to-xml)]
|
||||
[%b *] [['bool' ~[['' ?:(p.j "true" "false")]]] ~]
|
||||
[%o *] [['obj' ~] (turn ~(tap by p.j) pair)]
|
||||
[%n *] [['num' ~[[['n' 'val'] (trip p.j)]]] ~]
|
||||
[%s *] [['str' ~[['' (trip p.j)]]] ~]
|
||||
==
|
||||
++ pair
|
||||
|= [t=@t j=json]
|
||||
^- manx
|
||||
[['slot' ~[['key' (trip t)]]] ~[(json-to-xml j)]]
|
||||
--
|
||||
::
|
||||
++ json-example
|
||||
^- json
|
||||
|^ ob2
|
||||
++ nil ~
|
||||
++ yes [%b %.y]
|
||||
++ nah [%b %.n]
|
||||
++ str [%s 'Very long test string. Test test test test test test test.']
|
||||
++ foo 'foo'
|
||||
++ bar 'bar'
|
||||
++ baz 'baz'
|
||||
++ one [%n '1']
|
||||
++ ten [%n '10']
|
||||
++ mil [%n '100000']
|
||||
++ arr [%a ~[one ten mil]]
|
||||
++ ar2 [%a ~[arr yes nah nil str]]
|
||||
++ obj [%o (~(gas by *(map @t json)) ~[[foo mil] [baz arr]])]
|
||||
++ ob2 [%o (~(gas by *(map @t json)) ~[[foo ar2] [bar obj] [baz yes]])]
|
||||
++ ar3 [%a ~[arr obj ob2 one ten mil yes nah nil]]
|
||||
--
|
||||
::
|
||||
--
|
@ -1,6 +0,0 @@
::
:- %say
|= $: {now/@da eny/@uvJ bec/beak}
{{app/term source/path station/knot ~} ~}
==
[%pipe-cancel app source station]
@ -1,6 +0,0 @@
::
:- %say
|= $: {now/@da eny/@uvJ bec/beak}
{{app/term source/path station/knot ~} ~}
==
[%pipe-connect app source station]
@ -1,6 +0,0 @@
::
:- %say
|= $: {now/@da eny/@uvJ bec/beak}
{~ ~}
==
[%pipe-list ~]
@ -1,14 +0,0 @@
|
||||
:: Send tweet from an account
|
||||
::
|
||||
:::: /hoon/as/twit/gen
|
||||
::
|
||||
/- twitter
|
||||
::
|
||||
::::
|
||||
::
|
||||
=, twitter
|
||||
:- %say
|
||||
|= $: {now/@da eny/@uvJ bec/beak}
|
||||
{{who/knot msg/cord ~} ~}
|
||||
==
|
||||
[%twit-do [who %post `@uvI`(rsh 8 1 eny) msg]]
|
@ -1,18 +0,0 @@
|
||||
:: Display twitter feed
|
||||
::
|
||||
:::: /hoon/feed/twit/gen
|
||||
::
|
||||
/? 310
|
||||
/- twitter
|
||||
::
|
||||
:::: ~fyr
|
||||
::
|
||||
:- %say
|
||||
|= $: {now/@da eny/@uvJ bek/beak}
|
||||
{{who/iden ~} typ/?($user $home)}
|
||||
==
|
||||
=+ pax=/(scot %p p.bek)/twit/(scot %da now)/[typ]/[who]/twit-feed
|
||||
:- %tang
|
||||
%+ turn (flop .^((list post:twitter) %gx pax))
|
||||
|= post:twitter ^- tank
|
||||
rose+[": " `~]^~[leaf+"{<now>} @{(trip who)}" leaf+(trip txt)]
|
@ -1,17 +0,0 @@
|
||||
::
|
||||
:::: /hoon/down-jet/lib
|
||||
::
|
||||
/? 310
|
||||
/+ *down-jet-parse, *down-jet-rend
|
||||
::
|
||||
::::
|
||||
::
|
||||
~% %down ..is ~
|
||||
|%
|
||||
++ mark
|
||||
~/ %mark
|
||||
|= p/@t
|
||||
(normalize (rash p parse))
|
||||
::
|
||||
++ print sing
|
||||
--
|
File diff suppressed because one or more lines are too long
File diff suppressed because it is too large
@ -1,249 +0,0 @@
|
||||
:: ++down rendering arms
|
||||
::
|
||||
:::: /hoon/rend/down-jet/lib
|
||||
::
|
||||
/? 310
|
||||
/- *markdown
|
||||
::
|
||||
=, format
|
||||
=, html
|
||||
::
|
||||
|%
|
||||
++ into-inner
|
||||
|= {a/marl b/manx}
|
||||
?~ c.b b(c a)
|
||||
$(b i.c.b)
|
||||
::
|
||||
++ flat
|
||||
|= a/marl
|
||||
^- tape
|
||||
?~ a ~
|
||||
%+ weld
|
||||
^- tape
|
||||
?~ n.g.i.a
|
||||
?>(?=(_;/(**) i.a) v.i.a.g.i.a)
|
||||
?+ n.g.i.a $(a c.i.a)
|
||||
$img
|
||||
%- zing ^- wall
|
||||
%+ murn a.g.i.a |= {a/mane b/tape}
|
||||
^- (unit tape)
|
||||
?+ a ~
|
||||
$alt [~ b]
|
||||
==
|
||||
==
|
||||
$(a t.a)
|
||||
::
|
||||
++ sanitize
|
||||
|= a/marl ^- tape
|
||||
=- (zing `wall`(scan (flat a) fel))
|
||||
=< fel=;~(sfix (star ;~(plug (cold '-' -) (plus +))) (star next))
|
||||
[(star ;~(less aln prn)) ;~(pose nud low (cook |=(a/@ (add a ' ')) hig))]
|
||||
::
|
||||
++ sang :: tight item children
|
||||
|= a/(list elem)
|
||||
^- marl
|
||||
?~ a ~
|
||||
%+ weld
|
||||
?. ?=($para -.i.a)
|
||||
(sing i.a ~)
|
||||
(sung p.i.a)
|
||||
$(a t.a)
|
||||
::
|
||||
++ sing :: elem to manx
|
||||
=, html
|
||||
=> |%
|
||||
++ first-word
|
||||
|= a/tape
|
||||
=. a (trip (crip a)) :: XX valid tapes
|
||||
^- (unit tape)
|
||||
=. a q.q:(need q:((star ace) [1 1] a))
|
||||
=+ vex=((plus ;~(less ace prn)) [1 1] a)
|
||||
?~ q.vex ~
|
||||
(some (wonk vex))
|
||||
--
|
||||
=+ [tig=| had=*(unit mane)]
|
||||
|= lum/(list elem)
|
||||
|^ ^- marl
|
||||
=+ a=apex
|
||||
?~ q.a
|
||||
p.a
|
||||
(weld p.a $(lum q.a))
|
||||
::
|
||||
++ apex
|
||||
^- {p/marl q/_lum}
|
||||
?~ lum
|
||||
?~ had [~ ~]
|
||||
(lose "unclosed {<u.had>}")
|
||||
=> [ele=i.lum .(lum t.lum)]
|
||||
?. ?=($html -.ele)
|
||||
(push (reso ele) ~)
|
||||
:: begin reparsing of html that the spec jankily lets through ::
|
||||
=+ tex=(trip (of-wain p.ele))
|
||||
=^ mar lum (chomp tex (sear |=(a/marl ?~(a ~ (some a))) many:de-xml))
|
||||
?^ mar
|
||||
(push u.mar)
|
||||
=^ hed lum (chomp tex head:de-xml)
|
||||
?^ hed
|
||||
=+ max=`marx`u.hed
|
||||
(push(lum q) [max p] ~):[apex(had `n.max) .]
|
||||
=^ tal lum (chomp tex tail:de-xml)
|
||||
?~ tal
|
||||
=^ cha lum (chomp tex prn)
|
||||
?^ cha
|
||||
(push ;/([u.cha]~) ~)
|
||||
(push ;lost:"{tex}" ~)
|
||||
?: =(had tal)
|
||||
[~ lum]
|
||||
?^ had
|
||||
=. lum [ele lum]
|
||||
(lose "unclosed {<u.had>}")
|
||||
(lose "close {<u.tal>}")
|
||||
:: end reparsing of html that the spec jankily lets through ::
|
||||
::
|
||||
++ lose |=(a/tape [[;lost:"{a}"]~ lum])
|
||||
++ chomp
|
||||
|* {tap/tape fel/rule}
|
||||
^- {(unit _(wonk *fel)) _lum}
|
||||
=+ vex=(fel 1^1 tap)
|
||||
?~ q.vex [~ lum]
|
||||
:- [~ (wonk vex)]
|
||||
?~(q.q.u.q.vex lum [[%html (to-wain (crip q.q.u.q.vex))] lum])
|
||||
::
|
||||
++ push
|
||||
|= a/marl
|
||||
^+ apex
|
||||
?~ a apex
|
||||
[[b p] q]:[b=i.a (push t.a)]
|
||||
::
|
||||
++ reso
|
||||
|= a/elem
|
||||
?^ -.a
|
||||
=. tig ?.(?=($list -.p.a) tig p.p.a)
|
||||
?: &(tig ?=($item -.p.a))
|
||||
[/li (sang q.a)]
|
||||
%+ into-inner ^$(lum q.a)
|
||||
?- -.p.a
|
||||
$bloq ;blockquote;
|
||||
$item ;li;
|
||||
$list ?@ q.p.a ;ul;
|
||||
?: =(1 p.q.p.a) ;ol;
|
||||
=+ num=(en-json (numb:enjs p.q.p.a))
|
||||
;ol(start num);
|
||||
==
|
||||
?- -.a :: ;/("unimplemented {<p.a>}")
|
||||
$html !! :: handled earlier XX do type stuff
|
||||
$para [/p (sung p.a)]
|
||||
$head
|
||||
=+ [hed=(add %h0 (lsh 3 1 p.a)) kid=(sung q.a)]
|
||||
[[hed id+(sanitize kid) ~] kid]
|
||||
::
|
||||
$hrul ;hr;
|
||||
$meta ?: =(~ p.a) ;/(~)
|
||||
=+ jon=`json`o+(~(run by p.a) |=(cord s++<))
|
||||
;meta(value "{(en-json jon)}", name "frontmatter", urb_front "");
|
||||
:: %html
|
||||
::=+ tex=(of-wain (turn p.a crip))
|
||||
::=+ (de-xml tex)
|
||||
::?^ - u.-
|
||||
::=+ (rush tex (star ;~(pose gah comt:de-xml)))
|
||||
::?^ - ;/(~)
|
||||
::;lost: {<p.a>}
|
||||
:: ;/([(of-wain (turn p.a crip))]~) :: XX haaaaaaack
|
||||
$defn ;/(~)
|
||||
$code =+ lan=?~(p.a ~ (first-word r.u.p.a))
|
||||
=+ tex=(trip (of-wain q.a))
|
||||
?~ lan ;pre:code:"{tex}"
|
||||
;pre:code(class "language-{u.lan}"):"{tex}"
|
||||
|
||||
==
|
||||
--
|
||||
::
|
||||
++ sung
|
||||
|= lim/kids
|
||||
=+ had=*(unit mane)
|
||||
|^ ^- marl
|
||||
=+ a=apex
|
||||
?~ q.a
|
||||
p.a
|
||||
(weld p.a $(lim q.a))
|
||||
::
|
||||
++ apex
|
||||
^- {p/marl q/_lim}
|
||||
?~ lim
|
||||
?~ had [~ ~]
|
||||
(lose "unclosed {<u.had>}")
|
||||
=> [ele=i.lim .(lim t.lim)]
|
||||
?. ?=($htmt -.ele)
|
||||
?: &(?=($$ -.ele) ?=({{$$ *} *} lim))
|
||||
apex(p.i.lim (weld p.ele p.i.lim))
|
||||
(push (reso ele) ~)
|
||||
=+ tex=(trip p.ele)
|
||||
=^ emo lim (chomp tex empt:de-xml)
|
||||
?^ emo
|
||||
=+ man=`manx`u.emo
|
||||
(push man ~)
|
||||
=^ hed lim (chomp tex head:de-xml)
|
||||
?^ hed
|
||||
=+ max=`marx`u.hed
|
||||
(push(lim q) [max p] ~):[apex(had `n.max) .]
|
||||
=^ tal lim (chomp tex tail:de-xml)
|
||||
?~ tal
|
||||
(push ;lost:"{tex}" ~)
|
||||
?: =(had tal)
|
||||
[~ lim]
|
||||
?^ had
|
||||
=. lim [ele lim]
|
||||
(lose "unclosed {<u.had>}")
|
||||
(lose "unopened {<u.tal>}")
|
||||
::
|
||||
++ lose |=(a/tape [[;lost:"{a}"]~ lim])
|
||||
++ chomp
|
||||
|* {tap/tape fel/rule}
|
||||
^- {(unit _(wonk *fel)) _lim}
|
||||
=+ vex=(fel 1^1 tap)
|
||||
?~ q.vex [~ lim]
|
||||
:- [~ (wonk vex)]
|
||||
?~(q.q.u.q.vex lim [[%htmt (crip q.q.u.q.vex)] lim])
|
||||
::
|
||||
++ push
|
||||
|= a/marl
|
||||
^+ apex
|
||||
?~ a apex
|
||||
[[b p] q]:[b=i.a (push t.a)]
|
||||
::
|
||||
++ urly
|
||||
|= a/tape ^- tape
|
||||
?~ a ~
|
||||
?: ?| [?=(^ q)]:(alp 1^1 a)
|
||||
(~(has in (silt "#!*'();:@&=+$,/?/%.~_")) i.a) :: XX reparse
|
||||
==
|
||||
[i.a $(a t.a)]
|
||||
(weld (en-urlt:html (trip i.a)) $(a t.a))
|
||||
::
|
||||
++ reso
|
||||
|= b/inline
|
||||
^- manx
|
||||
?@ -.b
|
||||
?- -.b
|
||||
$$ ;/(p.b)
|
||||
$line ;br;
|
||||
$code ;code:"{p.b}"
|
||||
$htmt !! ::p.b :: handled earlier :: XX do type stuff
|
||||
==
|
||||
?: ?=($blot -.p.b)
|
||||
=+ res=`manx`;img(src (urly p.p.b), alt (flat (turn q.b ..$)));
|
||||
:: ;img@"{p.p.b}";
|
||||
?~ q.p.b res
|
||||
res(a.g (welp a.g.res title+u.q.p.b ~))
|
||||
=+ kid=(sung q.b)
|
||||
%+ into-inner kid
|
||||
?- p.b
|
||||
{$emph ?} ?.(p.p.b ;em; ;strong;)
|
||||
{$delt ~} ;del;
|
||||
{$link ^} =+ url=(urly p.p.b)
|
||||
=. url ?^(url url "#{(sanitize kid)}")
|
||||
?~ q.p.b ;a/"{url}";
|
||||
;a/"{url}"(title u.q.p.b);
|
||||
==
|
||||
--
|
||||
--
|
157
lib/twitter.hoon
157
lib/twitter.hoon
@ -1,157 +0,0 @@
|
||||
:: A Twitter API library.
|
||||
::
|
||||
:::: /hoon/twitter/lib
|
||||
::
|
||||
/? 314
|
||||
/- twitter
|
||||
/+ interpolate, hep-to-cab
|
||||
=+ sur-twit:^twitter :: XX
|
||||
=, eyre
|
||||
=, mimes:html
|
||||
=, html
|
||||
=, format
|
||||
=, html
|
||||
=, chrono:userlib
|
||||
::
|
||||
:::: functions
|
||||
::
|
||||
|%
|
||||
++ join
|
||||
|= {a/char b/(list @t)} ^- @t
|
||||
%+ rap 3
|
||||
?~ b ~
|
||||
|-(?~(t.b b [i.b a $(b t.b)]))
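:: illustrative example (not from the original source):
::   (join ',' ~['a' 'b' 'c'])  :: -> 'a,b,c'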
|
||||
::
|
||||
++ valve :: produce request
|
||||
|= {med/?($get $post) pax/path quy/quay}
|
||||
^- hiss
|
||||
=+ url=(scan "https://api.twitter.com/1.1/.json" auri:de-purl) :: base path
|
||||
=. q.q.url (welp q.q.url pax)
|
||||
=. r.url quy
|
||||
^- hiss
|
||||
?- med
|
||||
$get [url med *math ~]
|
||||
$post
|
||||
=+ hed=(my:nl content-type+['application/x-www-form-urlencoded']~ ~)
|
||||
[url(r ~) med hed ?~(r.url ~ (some (as-octt +:(tail:en-purl r.url))))]
|
||||
==
|
||||
::
|
||||
++ find-req
|
||||
=+ all=doc-data-dry:reqs
|
||||
|: a=-:$:endpoint:reqs ^- {?($get $post) path}
|
||||
?~ all ~|(endpoint-lost+a !!) :: type error, should never happen
|
||||
?: =(a -:$:typ.i.all)
|
||||
+.i.all
|
||||
$(all t.all)
|
||||
--
|
||||
::
|
||||
:::: library
|
||||
::
|
||||
|%
|
||||
++ render :: response printers
|
||||
=+ args:reqs
|
||||
|%
|
||||
++ mean
|
||||
|= {msg/@t num/@ud} ^- tank
|
||||
rose+[": " `~]^~[leaf+"Error {<num>}" leaf+(trip msg)]
|
||||
::
|
||||
++ user-url
|
||||
|: a=$:scr ^- purf
|
||||
:_ ~
|
||||
%^ into-url:interpolate 'https://twitter.com/:scr'
|
||||
~
|
||||
~[scr+a]
|
||||
::
|
||||
++ post-url
|
||||
|: $:{a/scr b/tid} ^- purf
|
||||
:_ ~
|
||||
%^ into-url:interpolate 'https://twitter.com/:scr/status/:tid'
|
||||
~
|
||||
~[scr+a tid+(tid:print b)]
|
||||
--
|
||||
++ parse ^? :: text parsers
|
||||
|%
|
||||
++ user (cook crip (plus ;~(pose aln cab)))
|
||||
--
|
||||
::
|
||||
++ reparse :: json reparsers
|
||||
=, parse
|
||||
|%
|
||||
++ ce |*({a/$-(* *) b/fist:dejs} (cu:dejs |:(c=$:a c) b)) :: output type
|
||||
++ fasp |*(a/{@tas *} [(hep-to-cab -.a) +.a])
|
||||
++ mean (ot errors+(ar (ot message+so code+ni ~)) ~):dejs
|
||||
++ post
|
||||
=, ^?(dejs)
|
||||
%+ ce post:sur-twit
|
||||
%- ot
|
||||
:~ id+ni
|
||||
user+(ot (fasp screen-name+(su user)) ~)
|
||||
(fasp created-at+(cu year (ci stud so)))
|
||||
:: parse html escapes and newlines
|
||||
text+(cu crip (su (star ;~(pose (just `@`10) escp:de-xml))))
|
||||
==
|
||||
++ usel
|
||||
=, ^?(dejs)
|
||||
%+ ce (list who/@ta)
|
||||
=- (ot users+(ar -) ~)
|
||||
(ot (fasp screen-name+(su user)) ~)
|
||||
--
|
||||
++ print
|
||||
=+ args:reqs
|
||||
|%
|
||||
++ tid |=(@u `@t`(rsh 3 2 (scot %ui +<)))
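:: illustrative example (not from the original source): unlike %ud, the %ui
:: rendering has no dot separators, so dropping the two-byte '0i' prefix
:: leaves plain decimal digits, e.g.
::   (scot %ui 123)            :: -> '0i123'
::   (rsh 3 2 (scot %ui 123))  :: -> '123'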
|
||||
++ scr |=(@t +<)
|
||||
++ lsc
|
||||
|: a=$:$@(^scr ^lsc) ^- @t
|
||||
?@(a `@t`a (join ',' a))
|
||||
::
|
||||
++ lid
|
||||
|: a=$:$@(^tid (list ^tid)) ^- @t
|
||||
?~ a ~|(%nil-id !!)
|
||||
?@(a (tid a) (join ',' (turn `(list ^tid)`a tid)))
|
||||
--
|
||||
++ request
|
||||
=< apex
|
||||
=+ args:reqs
|
||||
|%
|
||||
++ apex
|
||||
|: $:{a/endpoint b/quay} ^- hiss
|
||||
=+ [med pax]=(find-req -.a)
|
||||
(valve med (cowl pax +.a b))
|
||||
::
|
||||
++ lutt |=(@u `@t`(rsh 3 2 (scot %ui +<)))
|
||||
++ llsc
|
||||
:: => args:reqs
|
||||
|: a=$:$@(scr (list scr)) ^- @t
|
||||
?@(a `@t`a (join ',' a))
|
||||
::
|
||||
++ llst
|
||||
|= a/$@(@t (list @t)) ^- @t
|
||||
?@(a `@t`a (join ',' a))
|
||||
::
|
||||
++ llid
|
||||
:: =+ args:reqs
|
||||
|: a=$:$@(tid (list tid)) ^- @t
|
||||
?~ a ~|(%nil-id !!)
|
||||
?@(a (lutt a) (join ',' (turn `(list tid)`a lutt)))
|
||||
::
|
||||
++ cowl :: handle parameters
|
||||
|= $: pax/path
|
||||
ban/(list param)
|
||||
quy/quay
|
||||
==
|
||||
^- {path quay}
|
||||
%+ into-path-partial:interpolate
|
||||
(path:hep-to-cab pax)
|
||||
=- (weld - quy)
|
||||
%+ turn ban
|
||||
|: p=$:param
|
||||
^- {@t @t}
|
||||
:- (hep-to-cab -.p)
|
||||
?+ -.p p.p :: usually plain text
|
||||
?($source-id $target-id) (tid:print p.p)
|
||||
?($id $name $user-id) (lid:print p.p)
|
||||
$screen-name (lsc:print p.p)
|
||||
==
|
||||
--
|
||||
--
|
@ -1,17 +0,0 @@
|
||||
::
|
||||
:::: /hoon/coffee/mar
|
||||
::
|
||||
/? 310
|
||||
=, mimes:html
|
||||
|_ mud/@t
|
||||
++ grow
|
||||
|%
|
||||
++ mime [/text/coffeescript (as-octs mud)]
|
||||
--
|
||||
++ grab
|
||||
|%
|
||||
++ mime |=({p/mite q/octs} (@t q.q))
|
||||
++ noun @t
|
||||
--
|
||||
++ grad %mime
|
||||
--
|
@ -1,38 +0,0 @@
|
||||
::
|
||||
:::: /hoon/down/mar
|
||||
::
|
||||
/? 310
|
||||
/- markdown
|
||||
/+ down-jet, frontmatter
|
||||
::
|
||||
::::
|
||||
::
|
||||
=, format
|
||||
=, markdown
|
||||
|_ don/down
|
||||
++ grab :: convert from
|
||||
|%
|
||||
++ noun down :: clam from %noun
|
||||
++ md
|
||||
|= src/@t
|
||||
=+ [atr mud]=(parse:frontmatter (to-wain src))
|
||||
[[%meta atr] (mark:down-jet mud)]
|
||||
--
|
||||
::
|
||||
++ grow :: convert into
|
||||
|%
|
||||
++ front ?~(don ~ ?:(?=($meta -.i.don) p.i.don front(don t.don)))
|
||||
++ hymn :: convert to %hymn
|
||||
;html
|
||||
;head:title:"Untitled"
|
||||
;body
|
||||
;* (print:down-jet don)
|
||||
==
|
||||
==
|
||||
++ elem :: convert to %elem
|
||||
;div
|
||||
;* (print:down-jet don)
|
||||
==
|
||||
:: ++ react elem
|
||||
--
|
||||
--
|
@ -1,10 +0,0 @@
/- gh
/+ gh-parse, httr-to-json
|_ commit/commit:gh
++ grab
|%
++ noun commit:gh
++ httr (cork httr-to-json json)
++ json commit:gh-parse
--
--
@ -1,42 +0,0 @@
|
||||
:: Converts the result of an 'issue_comment' event into an issue-comment:gh.
|
||||
/- gh
|
||||
/+ gh-parse, hall
|
||||
|_ issue-comment/issue-comment:gh
|
||||
++ grow
|
||||
|%
|
||||
++ hall-speeches
|
||||
^- (list speech:hall)
|
||||
:_ ~
|
||||
=+ ^= txt
|
||||
;: (cury cat 3)
|
||||
'on issue #'
|
||||
`@t`(rsh 3 2 (scot %ui number.issue.issue-comment))
|
||||
': '
|
||||
body.comment.issue-comment
|
||||
==
|
||||
:* %api %github
|
||||
login.sender.issue-comment
|
||||
(rash html-url.sender.issue-comment aurf:urlp)
|
||||
txt
|
||||
txt
|
||||
(rash html-url.comment.issue-comment aurf:urlp)
|
||||
%- jobe :~
|
||||
repository+s+name.repository.issue-comment
|
||||
number+(numb:enjs:format number.issue.issue-comment)
|
||||
title+s+title.issue.issue-comment
|
||||
==
|
||||
==
|
||||
--
|
||||
++ grab
|
||||
|%
|
||||
++ json
|
||||
=; jop |=(jon/^json `issue-comment:gh`(need (jop jon)))
|
||||
%- ot:dejs-soft:format
|
||||
:~ repository+repository:gh-parse
|
||||
sender+user:gh-parse
|
||||
action+so:dejs-soft:format
|
||||
issue+issue:gh-parse
|
||||
comment+comment:gh-parse
|
||||
==
|
||||
--
|
||||
--
|
@ -1,17 +0,0 @@
|
||||
/- gh
|
||||
/+ gh-parse, httr-to-json
|
||||
=, mimes:html
|
||||
|_ issue/issue:gh
|
||||
++ grab
|
||||
|%
|
||||
++ noun issue:gh
|
||||
++ httr (cork httr-to-json json)
|
||||
++ json issue:gh-parse
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ json raw.issue
|
||||
++ mime [/txt/plain (as-octs (crip <issue>))]
|
||||
++ txt (print-issue:gh-parse issue)
|
||||
--
|
||||
--
|
@ -1,139 +0,0 @@
|
||||
:: Converts the result of an 'issues' event into an issues:gh.
|
||||
/- gh
|
||||
/+ gh-parse, hall
|
||||
|_ issues/issues:gh
|
||||
++ grow
|
||||
|%
|
||||
++ hall-speeches
|
||||
^- (list speech:hall)
|
||||
:_ ~
|
||||
=+ ^= txt
|
||||
?- -.action.issues
|
||||
$assigned
|
||||
;: (cury cat 3)
|
||||
'assigned issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
' to '
|
||||
login.assignee.action.issues
|
||||
' ('
|
||||
title.issue.issues
|
||||
')'
|
||||
==
|
||||
::
|
||||
$unassigned
|
||||
;: (cury cat 3)
|
||||
'unassigned issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
' from '
|
||||
login.assignee.action.issues
|
||||
' ('
|
||||
title.issue.issues
|
||||
')'
|
||||
==
|
||||
::
|
||||
$labeled
|
||||
;: (cury cat 3)
|
||||
'labeled issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
' as '
|
||||
name.label.action.issues
|
||||
' ('
|
||||
title.issue.issues
|
||||
')'
|
||||
==
|
||||
::
|
||||
$unlabeled
|
||||
;: (cury cat 3)
|
||||
'unlabeled issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
' as '
|
||||
name.label.action.issues
|
||||
' ('
|
||||
title.issue.issues
|
||||
')'
|
||||
==
|
||||
::
|
||||
$opened
|
||||
;: (cury cat 3)
|
||||
'opened issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
': '
|
||||
title.issue.issues
|
||||
==
|
||||
::
|
||||
$closed
|
||||
;: (cury cat 3)
|
||||
'closed issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
': '
|
||||
title.issue.issues
|
||||
==
|
||||
::
|
||||
$reopened
|
||||
;: (cury cat 3)
|
||||
'reopened issue #'
|
||||
(rsh 3 2 (scot %ui number.issue.issues))
|
||||
': '
|
||||
title.issue.issues
|
||||
==
|
||||
==
|
||||
^- speech:hall
|
||||
:* %api %github
|
||||
login.sender.issues
|
||||
(rash html-url.sender.issues aurf:urlp)
|
||||
txt txt
|
||||
(rash html-url.issue.issues aurf:urlp)
|
||||
%- jobe
|
||||
%+ welp
|
||||
:~ repository+s+name.repository.issues
|
||||
number+(jone number.issue.issues)
|
||||
title+s+title.issue.issues
|
||||
action+s+-.action.issues
|
||||
==
|
||||
?- -.action.issues
|
||||
$assigned
|
||||
:~ assignee+s+login.assignee.action.issues
|
||||
assignee-url+s+url.assignee.action.issues
|
||||
==
|
||||
::
|
||||
$unassigned
|
||||
:~ assignee+s+login.assignee.action.issues
|
||||
assignee-url+s+url.assignee.action.issues
|
||||
==
|
||||
::
|
||||
$labeled
|
||||
:~ label+s+name.label.action.issues
|
||||
==
|
||||
::
|
||||
$unlabeled
|
||||
:~ label+s+name.label.action.issues
|
||||
==
|
||||
::
|
||||
$opened ~
|
||||
$closed ~
|
||||
$reopened ~
|
||||
==
|
||||
==
|
||||
--
|
||||
++ grab
|
||||
|%
|
||||
++ json
|
||||
|= jon/^json
|
||||
^- issues:gh
|
||||
=+ top=(need ((om:dejs-soft:format some) jon))
|
||||
:* (need (repository:gh-parse (~(got by top) %repository)))
|
||||
(need (user:gh-parse (~(got by top) %sender)))
|
||||
=+ action=(need (so:dejs-soft:format (~(got by top) %action)))
|
||||
?+ action ~|([%bad-action action] !!)
|
||||
$assigned [action (need (user:gh-parse (~(got by top) %assignee)))]
|
||||
$unassigned [action (need (user:gh-parse (~(got by top) %assignee)))]
|
||||
$labeled [action (need (label:gh-parse (~(got by top) %label)))]
|
||||
$unlabeled [action (need (label:gh-parse (~(got by top) %label)))]
|
||||
$opened [action ~]
|
||||
$closed [action ~]
|
||||
$reopened [action ~]
|
||||
==
|
||||
(need (issue:gh-parse (~(got by top) %issue)))
|
||||
==
|
||||
--
|
||||
--
|
@ -1,18 +0,0 @@
|
||||
/- gh
|
||||
/+ gh-parse
|
||||
=, mimes:html
|
||||
|_ issues/(list issue:gh)
|
||||
++ grab
|
||||
|%
|
||||
++ noun (list issue:gh)
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ json [%a (turn issues |=(issue:gh raw))]
|
||||
++ mime [/txt/plain (as-octs (crip <issues>))]
|
||||
++ txt =- ?~ - - ->
|
||||
%+ roll (turn issues print-issue:gh-parse)
|
||||
|= {a/wain b/wain}
|
||||
:(welp b ~['----------------------------------------'] a)
|
||||
--
|
||||
--
|
@ -1,6 +0,0 @@
|_ {method/meth:eyre endpoint/(list @t) jon/json}
++ grab
|%
++ noun {method/meth:eyre endpoint/(list @t) jon/json}
--
--
@ -1,10 +0,0 @@
/- gh
/+ gh-parse, httr-to-json
|_ repo/repository:gh
++ grab
|%
++ noun repository:gh
++ httr (cork httr-to-json json)
++ json repository:gh-parse
--
--
@ -1,18 +0,0 @@
|
||||
::
|
||||
:::: /hoon/jam-crub/mar
|
||||
::
|
||||
/? 310
|
||||
::
|
||||
=, mimes:html
|
||||
|_ mud/@
|
||||
++ grow
|
||||
|%
|
||||
++ mime [/application/octet-stream (as-octs mud)]
|
||||
--
|
||||
++ grab
|
||||
|% :: convert from
|
||||
++ noun @ :: clam from %noun
|
||||
++ mime |=({* octs} q)
|
||||
--
|
||||
++ grad %mime
|
||||
--
|
@ -1,24 +0,0 @@
|
||||
::
|
||||
:::: /hoon/markdown/mar
|
||||
::
|
||||
/? 310
|
||||
::
|
||||
=, mimes:html
|
||||
=, format
|
||||
|_ mud/@t
|
||||
++ grow
|
||||
|%
|
||||
++ mime [/text/x-markdown (as-octs mud)]
|
||||
++ md mud
|
||||
++ txt
|
||||
(to-wain mud)
|
||||
--
|
||||
++ grab
|
||||
|%
|
||||
++ mime |=({p/mite q/octs} q.q)
|
||||
++ noun @t
|
||||
++ md |=(@t +<)
|
||||
++ txt of-wain
|
||||
--
|
||||
++ grad %txt
|
||||
--
|
21
mar/md.hoon
21
mar/md.hoon
@ -1,21 +0,0 @@
|
||||
::
|
||||
:::: /hoon/md/mar
|
||||
::
|
||||
/? 310
|
||||
::
|
||||
|_ mud/@t
|
||||
++ grow
|
||||
|%
|
||||
++ mime [/text/x-markdown (as-octs:mimes:html mud)]
|
||||
++ txt
|
||||
(to-wain:format mud)
|
||||
--
|
||||
++ grab
|
||||
|%
|
||||
++ mime |=({p/mite:eyre q/octs:eyre} q.q)
|
||||
++ noun @t
|
||||
++ txt of-wain:format
|
||||
--
|
||||
++ grad %txt
|
||||
++ garb /down
|
||||
--
|
@ -1,26 +0,0 @@
|
||||
::
|
||||
:::: /hoon/comments/tree/mar
|
||||
::
|
||||
/? 310
|
||||
/+ elem-to-react-json, time-to-id
|
||||
=, format
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ all/(list (pair time {ship marl}))
|
||||
::
|
||||
++ grow :: convert to
|
||||
|%
|
||||
++ json
|
||||
:- %a
|
||||
%+ turn
|
||||
(sort all |=({a/* b/*} (lor b a)))
|
||||
|= {a/time b/ship c/marl} ^- ^json
|
||||
=+ bod=[[%div id+(time-to-id a) ~] c]
|
||||
=, enjs
|
||||
(pairs time+(time a) user+(ship b) body+(elem-to-react-json bod) ~)
|
||||
--
|
||||
++ grab |% :: convert from
|
||||
++ noun (list {time manx}) :: clam from %noun
|
||||
::++ elem |=(a=manx `_all`[[/ ((getall %h1) a)] ~ ~])
|
||||
-- --
|
@ -1,8 +0,0 @@
::
:::: /hoon/elem/tree/mar
::
/? 310
|_ own/manx
::
++ grow |% ++ elem own :: alias
-- --
@ -1,15 +0,0 @@
|
||||
::
|
||||
:::: /hoon/hymn/tree/mar
|
||||
::
|
||||
/? 310
|
||||
=, mimes:html
|
||||
|_ own/manx
|
||||
::
|
||||
++ grow :: convert to
|
||||
|%
|
||||
++ html (crip (en-xml:^html own)) :: convert to %html
|
||||
++ mime [/text/html (as-octs html)] :: convert to %mime
|
||||
--
|
||||
++ grab |% :: convert from
|
||||
++ noun manx :: clam from %noun
|
||||
-- --
|
@ -1,8 +0,0 @@
::
:::: /hoon/include/tree/mar
::
/? 310
/- tree-include
|_ tree-include
++ grab |% ++ noun tree-include
-- --
@ -1,23 +0,0 @@
|
||||
::
|
||||
:::: /hoon/index/tree/mar
|
||||
::
|
||||
/? 310
|
||||
/+ tree,map-to-json,elem-to-react-json
|
||||
[. tree]
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ all/(map path marl)
|
||||
::
|
||||
++ grow :: convert to
|
||||
|%
|
||||
++ json
|
||||
%. all
|
||||
%+ map-to-json
|
||||
|=(a/path (crip (spud a)))
|
||||
|=(a/marl [%a (turn a elem-to-react-json)])
|
||||
--
|
||||
++ grab |% :: convert from
|
||||
++ noun (map path marl) :: clam from %noun
|
||||
::++ elem |=(a=manx `_all`[[/ ((getall %h1) a)] ~ ~])
|
||||
-- --
|
@ -1,20 +0,0 @@
|
||||
::
|
||||
:::: /hoon/json/tree/mar
|
||||
::
|
||||
/? 310
|
||||
::
|
||||
:::: compute
|
||||
::
|
||||
=, mimes:html
|
||||
=, html
|
||||
|_ jon/json
|
||||
::
|
||||
++ grow :: convert to
|
||||
|%
|
||||
++ mime [/text/json (as-octt (en-json jon))] :: convert to %mime
|
||||
--
|
||||
++ grab
|
||||
|% :: convert from
|
||||
++ noun json :: clam from %noun
|
||||
--
|
||||
--
|
@ -1,22 +0,0 @@
|
||||
:: Twitter credentials
|
||||
::
|
||||
:::: /hoon/cred/twit/mar
|
||||
::
|
||||
/- plan-acct
|
||||
/+ httr-to-json, twitter
|
||||
|_ {acc/plan-acct raw/json}
|
||||
++ grab
|
||||
|%
|
||||
++ noun {plan-acct ^json}
|
||||
++ httr (cork httr-to-json json) :: XX mark translation
|
||||
++ json
|
||||
|= jon/^json ^- {plan-acct ^json}
|
||||
=+ usr=(need ((ot 'screen_name'^so ~):dejs-soft:format jon))
|
||||
=+ url=(user-url:render:twitter usr)
|
||||
[[usr (some url)] jon]
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ tank >[+<.+]<
|
||||
--
|
||||
--
|
@ -1,31 +0,0 @@
|
||||
:: Twitter statuses
|
||||
::
|
||||
:::: /hoon/feed/twit/mar
|
||||
::
|
||||
/- hall
|
||||
/+ twitter, httr-to-json
|
||||
=, format
|
||||
|_ fed/(list post:twitter)
|
||||
++ grab
|
||||
|%
|
||||
++ noun (list post:twitter)
|
||||
++ json (ar:dejs post:reparse:twitter)
|
||||
++ httr (cork httr-to-json json) :: XX mark translation
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ tank >[fed]<
|
||||
++ hall-speeches
|
||||
=+ r=render:twitter
|
||||
%+ turn fed
|
||||
|= a/post:twitter ^- speech:hall
|
||||
:* %api %twitter
|
||||
who.a
|
||||
(user-url.r who.a)
|
||||
txt.a
|
||||
txt.a
|
||||
(post-url.r who.a id.a)
|
||||
(joba now+(jode now.a))
|
||||
==
|
||||
--
|
||||
--
|
@ -1,17 +0,0 @@
|
||||
:: Twitter status
|
||||
::
|
||||
:::: /hoon/post/twit/mar
|
||||
::
|
||||
/+ twitter, httr-to-json
|
||||
|_ post:twitter
|
||||
++ grab
|
||||
|%
|
||||
++ noun post:twitter
|
||||
++ json post:reparse:twitter
|
||||
++ httr (cork httr-to-json json) :: XX mark translation
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ tank >[+<]<
|
||||
--
|
||||
--
|
@ -1,15 +0,0 @@
|
||||
:: Twitter api request
|
||||
::
|
||||
:::: /hoon/req/twit/mar
|
||||
::
|
||||
/+ twitter
|
||||
|_ {req/endpoint:reqs:twitter quy/quay}
|
||||
++ grab
|
||||
|%
|
||||
++ noun {endpoint:reqs:twitter quay}
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ hiss (request:twitter req quy)
|
||||
--
|
||||
--
|
@ -1,17 +0,0 @@
|
||||
:: List of twitter users
|
||||
::
|
||||
:::: /hoon/usel/twit/mar
|
||||
::
|
||||
/+ twitter, httr-to-json
|
||||
|_ (list who/@ta)
|
||||
++ grab
|
||||
|%
|
||||
++ noun (list who/@ta)
|
||||
++ json usel:reparse:twitter
|
||||
++ httr (cork httr-to-json json) :: XX mark translation
|
||||
--
|
||||
++ grow
|
||||
|%
|
||||
++ tank >[+<]<
|
||||
--
|
||||
--
|
@ -1,79 +0,0 @@
|
||||
/- unicode-data
|
||||
=, eyre
|
||||
=, format
|
||||
::
|
||||
|_ all/(list line:unicode-data)
|
||||
++ grab
|
||||
:: converts from mark to unicode-data.
|
||||
|%
|
||||
++ mime |=([* a=octs] (txt (to-wain q.a))) :: XX mark translation
|
||||
++ txt
|
||||
|^ |= a=wain
|
||||
^+ all
|
||||
%+ murn a
|
||||
|= b=cord
|
||||
^- (unit line:unicode-data)
|
||||
?~ b ~
|
||||
`(rash b line)
|
||||
::
|
||||
:: parses a single character information line of the unicode data file.
|
||||
++ line
|
||||
;~ (glue mic)
|
||||
hex :: code/@c codepoint in hex format
|
||||
name-string :: name/tape character name
|
||||
general-category :: gen/general type of character
|
||||
(bass 10 (plus dit)) :: can/@ud canonical combining class
|
||||
bidi-category :: bi/bidi bidirectional category
|
||||
decomposition-mapping :: de/decomp decomposition mapping
|
||||
::
|
||||
:: todo: decimal/digit/numeric need to be parsed.
|
||||
::
|
||||
string-number :: decimal/tape decimal digit value (or ~)
|
||||
string-number :: digit/tape digit value, even if non-decimal
|
||||
string-number :: numeric/tape numeric value, including fractions
|
||||
::
|
||||
(fuss 'Y' 'N') :: mirrored/? is char mirrored in bidi text?
|
||||
name-string :: old-name/tape unicode 1.0 compatibility name
|
||||
name-string :: iso/tape iso 10646 comment field
|
||||
(punt hex) :: up/(unit @c) uppercase mapping codepoint
|
||||
(punt hex) :: low/(unit @c) lowercase mapping codepoint
|
||||
(punt hex) :: title/(unit @c) titlecase mapping codepoint
|
||||
==
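:: illustrative example (not from the original source): a typical
:: UnicodeData.txt record, e.g. the one for U+0041,
::   0041;LATIN CAPITAL LETTER A;Lu;0;L;;;;;N;;;;0061;
:: should parse to a ++line with code 0x41, gen %lu, can 0, bi %l, an empty
:: decomposition, mirrored %.n, and a lowercase mapping of `0x61.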
|
||||
::
|
||||
:: parses a single name or comment string.
|
||||
++ name-string
|
||||
%+ cook
|
||||
|=(a=tape a)
|
||||
(star ;~(less mic prn))
|
||||
::
|
||||
:: parses a unicode general category abbreviation to symbol
|
||||
++ general-category
|
||||
%+ sear (soft general:unicode-data)
|
||||
:(cook crip cass ;~(plug hig low (easy ~)))
|
||||
::
|
||||
:: parses a bidirectional category abbreviation to symbol.
|
||||
++ bidi-category
|
||||
%+ sear (soft bidi:unicode-data)
|
||||
:(cook crip cass (star hig))
|
||||
::
|
||||
++ decomposition-mapping
|
||||
%- punt :: optional
|
||||
:: a tag and a list of characters to decompose to
|
||||
;~ plug
|
||||
(punt (ifix [gal ;~(plug gar ace)] decomp-tag))
|
||||
(cook |=(a=(list @c) a) (most ace hex))
|
||||
==
|
||||
::
|
||||
++ decomp-tag
|
||||
%+ sear (soft decomp-tag:unicode-data)
|
||||
:(cook crip cass (star alf))
|
||||
::
|
||||
++ string-number
|
||||
%+ cook
|
||||
|=(a=tape a)
|
||||
(star ;~(pose nud net hep))
|
||||
::
|
||||
--
|
||||
--
|
||||
++ grad %txt
|
||||
--
|
@ -1,57 +0,0 @@
|
||||
window.urb = window.urb || {}
|
||||
|
||||
urb.waspWait = []
|
||||
urb.wasp = urb.wasp || [].push.bind(urb.waspWait)
|
||||
|
||||
// debugging
|
||||
urb.verb = false
|
||||
urb.sources = {}
|
||||
urb.waspDeps = function(){
|
||||
urb.deps.map(function(a){urb.sources[a] = "dep"})
|
||||
}
|
||||
|
||||
urb.waspElem = function(ele){
|
||||
url = ele.src || ele.href
|
||||
if(!url || (new URL(url)).host != document.location.host)
|
||||
return;
|
||||
urb.waspUrl(url)
|
||||
}
|
||||
urb.waspUrl = function(url){
|
||||
var xhr = new XMLHttpRequest()
|
||||
xhr.open("HEAD", url)
|
||||
xhr.send()
|
||||
xhr.onload = urb.waspLoadedXHR
|
||||
xhr.channel = url
|
||||
}
|
||||
urb.waspLoadedXHR = function(){
|
||||
urb.sources[urb.getXHRWasp(this)] = this.channel
|
||||
urb.wasp(urb.getXHRWasp(this))
|
||||
}
|
||||
urb.getXHRWasp = function(xhr){
|
||||
var dep = xhr.getResponseHeader("etag")
|
||||
if(dep) return JSON.parse(dep.substr(2))
|
||||
}
|
||||
|
||||
urb.datadeps = {}
|
||||
urb.waspData = function(dep){
|
||||
urb.datadeps[dep] = true
|
||||
urb.wasp(dep)
|
||||
}
|
||||
|
||||
urb.onLoadUrbJS = function(){
|
||||
urb.ondataupdate = urb.ondataupdate || urb.onupdate // overridable
|
||||
|
||||
var _onupdate = urb.onupdate
|
||||
urb.onupdate = function(dep){
|
||||
if(urb.verb)
|
||||
console.log("update", urb.datadeps[dep] ? "data" : "full", dep, urb.sources[dep])
|
||||
if(urb.datadeps[dep]) urb.ondataupdate(dep)
|
||||
else _onupdate(dep)
|
||||
}
|
||||
urb.waspDeps()
|
||||
|
||||
urb.waspAll = function(sel){
|
||||
[].map.call(document.querySelectorAll(sel), urb.waspElem)
|
||||
}
|
||||
urb.waspAll('script'); urb.waspAll('link')
|
||||
}
|
@ -1,18 +0,0 @@
|
||||
::
|
||||
:::: /hoon/index/tree/ren
|
||||
::
|
||||
/? 310
|
||||
/+ tree
|
||||
/, /
|
||||
/; (getall:tree /h1/h2/h3/h4/h5/h6) /tree-elem/
|
||||
::
|
||||
/pub/docs/dev/hoon/runes
|
||||
/; |= {tip/marl sub/(map knot marl) ~}
|
||||
(zing `(list marl)`[tip (turn ~(tap by sub) tail)])
|
||||
/. /; (getall:tree %h1 ~) /tree-elem/
|
||||
/_ /; (getall:tree %h1 ~) /tree-elem/
|
||||
== ==
|
||||
::
|
||||
::::
|
||||
::
|
||||
`(map path marl)`[[/ -.-] ~ ~]
|
@ -1,39 +0,0 @@
|
||||
:: Test url +https://app.asana.com/api/1.0/users/me
|
||||
::
|
||||
:::: /hoon/asana/com/sec
|
||||
::
|
||||
/+ oauth2
|
||||
::
|
||||
::::
|
||||
::
|
||||
|%
|
||||
++ dialog-url 'https://app.asana.com/-/oauth_authorize?response_type=code'
|
||||
++ exchange-url 'https://app.asana.com/-/oauth_token'
|
||||
--
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth2) tok/token:oauth2}
|
||||
:: ++aut is a "standard oauth2" core, which implements the
|
||||
:: most common handling of oauth2 semantics. see lib/oauth2 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut (~(standard oauth2 bal tok) . |=(tok/token:oauth2 +>(tok tok)))
|
||||
++ filter-request (out-add-header:aut scope=~ dialog-url)
|
||||
::
|
||||
++ receive-auth-query-string (in-code-to-token:aut exchange-url)
|
||||
++ receive-auth-response bak-save-token:aut
|
||||
--
|
||||
:: create a developer app by logging into https://app.asana.com/, and clicking
|
||||
:: "My Profile Settings" > Apps > "Manage my developer apps"
|
||||
|
||||
:: Be sure to be on https://localhost:8443 and to have registered
|
||||
:: 'http://localhost:8443/~/ac/asana.com/~./in' as the redirect URI.
|
||||
:: (If unable to change port number of ship, change the redirect URI port in %eyre)
|
||||
|
||||
:: |init-oauth2 /com/asana
|
||||
|
||||
:: Enter this sample command to get your user information:
|
||||
:: +https://app.asana.com/api/1.0/users/me
|
||||
|
||||
:: Before you receive the response, you'll have to click on the link.
|
||||
:: If you successfully auth, you should receive the response in the dojo.
|
@ -1,38 +0,0 @@
|
||||
:: Test url +https://api.digitalocean.com/v2/account
|
||||
::
|
||||
:::: /hoon/digitalocean/com/sec
|
||||
::
|
||||
/+ oauth2
|
||||
::
|
||||
::::
|
||||
::
|
||||
|%
|
||||
++ dialog-url 'https://cloud.digitalocean.com/v1/oauth/authorize?response_type=code'
|
||||
++ exchange-url 'https://cloud.digitalocean.com/v1/oauth/token'
|
||||
--
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth2) tok/token:oauth2}
|
||||
:: ++aut is a "standard oauth2" core, which implements the
|
||||
:: most common handling of oauth2 semantics. see lib/oauth2 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut (~(standard oauth2 bal tok) . |=(tok/token:oauth2 +>(tok tok)))
|
||||
++ filter-request (out-add-header:aut scope=~[%read %write] dialog-url)
|
||||
::
|
||||
++ receive-auth-query-string (in-code-to-token:aut exchange-url)
|
||||
++ receive-auth-response bak-save-token:aut
|
||||
--
|
||||
:: create a developer app on https://cloud.digitalocean.com/settings/api/applications/new
|
||||
:: to get a client id and secret
|
||||
|
||||
:: Be sure to be on https://localhost:8443 and to have registered
|
||||
:: 'http://localhost:8443/~/ac/digitalocean.com/~./in' as the redirect URI.
|
||||
:: (If unable to change port number of ship, change the redirect URI port in %eyre)
|
||||
|
||||
:: |init-oauth2 /com/digitalocean
|
||||
|
||||
:: Enter this sample command to get your user information:
|
||||
:: +https://api.digitalocean.com/v2/account
|
||||
:: Before you receive the response, you'll have to click on the link.
|
||||
:: If you successfully auth, you should receive the response in the dojo.
|
@ -1,41 +0,0 @@
|
||||
:: Test url +https://api.dropboxapi.com/2/users/get_current_account &json ~
|
||||
::
|
||||
:::: /hoon/dropboxapi/com/sec
|
||||
::
|
||||
/+ oauth2
|
||||
::
|
||||
::::
|
||||
::
|
||||
|%
|
||||
++ dialog-url 'https://www.dropbox.com/1/oauth2/authorize?response_type=code'
|
||||
++ exchange-url 'https://api.dropboxapi.com/1/oauth2/token'
|
||||
--
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth2) tok/token:oauth2}
|
||||
:: ++aut is a "standard oauth2" core, which implements the
|
||||
:: most common handling of oauth2 semantics. see lib/oauth2 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut (~(standard oauth2 bal tok) . |=(tok/token:oauth2 +>(tok tok)))
|
||||
++ filter-request (out-add-header:aut scope=~ dialog-url)
|
||||
::
|
||||
++ receive-auth-query-string (in-code-to-token:aut exchange-url)
|
||||
++ receive-auth-response bak-save-token:aut
|
||||
--
|
||||
:: create a developer app on https://www.dropbox.com/developers-v1/apps to get a
|
||||
:: client id and secret.
|
||||
|
||||
:: Be sure to be on https://localhost:8443 and to have registered
|
||||
:: 'http://localhost:8443/~/ac/dropboxapi.com/~./in' as the redirect URI.
|
||||
:: (If unable to change port number of ship, change the redirect URI port in %eyre)
|
||||
|
||||
:: |init-oauth2 /com/dropbox
|
||||
|
||||
:: Enter this sample command to show your user info:
|
||||
:: +https://api.dropboxapi.com/2/users/get_current_account &json ~
|
||||
|
||||
:: Before you receive the response, you'll have to click on the link in the
|
||||
:: dojo to authenticate yourself.
|
||||
|
||||
:: You should receive a response listing the contents of that directory.
|
@ -1,42 +0,0 @@
|
||||
:: Test url +https://graph.facebook.com/v2.5/me
|
||||
::
|
||||
:::: /hoon/facebook/com/sec
|
||||
::
|
||||
/+ oauth2
|
||||
::
|
||||
::::
|
||||
::
|
||||
|%
|
||||
++ dialog-url 'https://www.facebook.com/dialog/oauth?response_type=code'
|
||||
++ exchange-url 'https://graph.facebook.com/v2.3/oauth/access_token'
|
||||
--
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth2) access-token/token:oauth2}
|
||||
:: ++aut is a "standard oauth2" core, which implements the
|
||||
:: most common handling of oauth2 semantics. see lib/oauth2 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut
|
||||
%+ ~(standard oauth2 bal access-token) .
|
||||
|=(access-token/token:oauth2 +>(access-token access-token))
|
||||
::
|
||||
++ filter-request
|
||||
%^ out-add-query-param:aut 'access_token'
|
||||
scope=~['user_about_me' 'user_posts']
|
||||
dialog-url
|
||||
::
|
||||
++ receive-auth-query-string (in-code-to-token:aut exchange-url)
|
||||
::
|
||||
++ receive-auth-response
|
||||
|= a/httr:eyre ^- core-move:aut
|
||||
?: (bad-response:aut p.a)
|
||||
[%give a] :: [%redo ~] :: handle 4xx?
|
||||
=+ `{access-token/@t expires-in/@u}`(grab-expiring-token:aut a)
|
||||
?. (lth expires-in ^~((div ~d7 ~s1))) :: short-lived token
|
||||
[[%redo ~] +>.$(access-token access-token)]
|
||||
:- %send
|
||||
%^ request-token:aut exchange-url
|
||||
grant-type='fb_exchange_token'
|
||||
[key='fb_exchange_token' value=access-token]~
|
||||
--
|
@ -1,10 +0,0 @@
:: Test url +https://api.github.com/user
::
:::: /hoon/github/com/sec
::
/+ basic-auth
::
|_ {bal/(bale:eyre keys:basic-auth) ~}
++ aut ~(standard basic-auth bal ~)
++ filter-request out-adding-header:aut
--
@ -1,41 +0,0 @@
|
||||
:: Test url +https://api.instagram.com/v1/users/self
|
||||
::
|
||||
:::: /hoon/instagram/com/sec
|
||||
::
|
||||
/+ oauth2
|
||||
::
|
||||
::::
|
||||
::
|
||||
|%
|
||||
++ dialog-url 'https://api.instagram.com/oauth/authorize?response_type=code'
|
||||
++ exchange-url 'https://api.instagram.com/oauth/access_token'
|
||||
--
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth2) tok/token:oauth2}
|
||||
:: ++aut is a "standard oauth2" core, which implements the
|
||||
:: most common handling of oauth2 semantics. see lib/oauth2 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut (~(standard oauth2 bal tok) . |=(tok/token:oauth2 +>(tok tok)))
|
||||
++ filter-request
|
||||
%^ out-add-query-param:aut 'access_token'
|
||||
scope=~[%basic]
|
||||
dialog-url
|
||||
::
|
||||
++ receive-auth-query-string (in-code-to-token:aut exchange-url)
|
||||
++ receive-auth-response bak-save-token:aut
|
||||
--
|
||||
:: create a developer app on https://www.instagram.com/developer/ to get a
|
||||
:: client id and secret
|
||||
|
||||
:: Be sure to be on https://localhost:8443, and to have registered
|
||||
:: http://localhost:8443/~/ac/instagram.com/~./in as the redirect URI.
|
||||
:: (If unable to change port number of ship, change the redirect URI port in %eyre)
|
||||
:: |init-oauth2 /com/instagram
|
||||
|
||||
:: Enter this sample command to get your user information:
|
||||
:: +https://api.instagram.com/v1/users/self
|
||||
|
||||
:: Before you receive the response, you'll have to click on the link to
|
||||
:: authenticate yourself. You should then receive the response.
|
@ -1,21 +0,0 @@
|
||||
:: Test url +https://slack.com/api/auth.test
|
||||
::
|
||||
:::: /hoon/slack/com/sec
|
||||
::
|
||||
/+ oauth2
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth2) tok/token:oauth2}
|
||||
:: ++aut is a "standard oauth2" core, which implements the
|
||||
:: most common handling of oauth2 semantics. see lib/oauth2 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut (~(standard oauth2 bal tok) . |=(tok/token:oauth2 +>(tok tok)))
|
||||
++ filter-request
|
||||
%^ out-add-query-param:aut 'token'
|
||||
scope=~[%client %admin]
|
||||
oauth-dialog='https://slack.com/oauth/authorize'
|
||||
::
|
||||
++ receive-auth-query-string (in-code-to-token:aut url='https://slack.com/api/oauth.access')
|
||||
++ receive-auth-response bak-save-token:aut
|
||||
--
|
@ -1,27 +0,0 @@
|
||||
:: Test url +https://api.twitter.com/1.1/account/verify_credentials.json
|
||||
::
|
||||
:::: /hoon/twitter/com/sec
|
||||
::
|
||||
/+ oauth1
|
||||
::
|
||||
::::
|
||||
::
|
||||
|_ {bal/(bale:eyre keys:oauth1) tok/token:oauth1}
|
||||
:: ++aut is a "standard oauth1" core, which implements the
|
||||
:: most common handling of oauth1 semantics. see lib/oauth1 for more details,
|
||||
:: and examples at the bottom of the file.
|
||||
++ aut (~(standard oauth1 bal tok) . |=(tok/token:oauth1 +>(tok tok)))
|
||||
++ filter-request
|
||||
%+ out-add-header:aut
|
||||
token-request='https://api.twitter.com/oauth/request_token'
|
||||
oauth-dialog='https://api.twitter.com/oauth/authorize'
|
||||
::
|
||||
++ filter-response res-handle-request-token:aut
|
||||
::
|
||||
++ receive-auth-query-string
|
||||
%- in-exchange-token:aut
|
||||
exchange-url='https://api.twitter.com/oauth/access_token'
|
||||
::
|
||||
++ receive-auth-response bak-save-token:aut
|
||||
:: ++ discard-state ~
|
||||
--
|
@ -1,6 +0,0 @@
::
:::: /hoon/down/sur
::
/? 310
/- markdown
down:markdown
@ -1,42 +0,0 @@
|
||||
::
|
||||
:::: /hoon/markdown/sur
|
||||
::
|
||||
/? 310
|
||||
|%
|
||||
++ down (list elem)
|
||||
++ kids (list inline)
|
||||
++ inline
|
||||
=+ ^= inlik
|
||||
$% {$emph p/?} :: strong?
|
||||
{$delt ~} :: strikethrough
|
||||
{$link p/tape q/(unit tape)}
|
||||
{$blot p/tape q/(unit tape)} :: image
|
||||
==
|
||||
=+ ^= inlin
|
||||
$% {$$ p/tape}
|
||||
{$line ~}
|
||||
{$code p/tape}
|
||||
{$htmt p/cord} :: XX (each marx mane)
|
||||
==
|
||||
$^({p/inlik q/kids} inlin)
|
||||
::
|
||||
::
|
||||
++ elem $^(tops node)
|
||||
++ tops :: childful block
|
||||
$: $= p
|
||||
$% {$bloq ~}
|
||||
{$list p/? q/$@(char {p/@u q/char})} :: tight ordered?
|
||||
{$item ~}
|
||||
==
|
||||
q/down
|
||||
==
|
||||
++ node :: childless block
|
||||
$% {$para p/kids}
|
||||
{$meta p/(map cord cord)} :: front matter
|
||||
{$hrul ~}
|
||||
{$head p/@u q/kids}
|
||||
{$code p/(unit {p/char q/@u r/tape}) q/wain} :: info contents
|
||||
{$html p/wain}
|
||||
{$defn ~} :: empty para
|
||||
==
|
||||
--
|
@ -1,15 +0,0 @@
|
||||
::
|
||||
:::: /hoon/tree-include/sur
|
||||
::
|
||||
/? 310
|
||||
|-
|
||||
$: mime/mime
|
||||
body/json
|
||||
head/json
|
||||
snip/json
|
||||
meta/json
|
||||
sect/json
|
||||
comt/json
|
||||
plan/json
|
||||
bump/knot
|
||||
==
|
200
sur/twitter.hoon
200
sur/twitter.hoon
@ -1,200 +0,0 @@
|
||||
|%
|
||||
++ post {id/@u who/@ta now/@da txt/@t} :: received tweet
|
||||
++ keys :: twitter-key type
|
||||
con/{tok/@t sec/@t} :: consumer (app) key pair
|
||||
acc/{tok/@t sec/@t} :: access (user) key pair
|
||||
==
|
||||
::
|
||||
++ command :: poke action
|
||||
$% {$post p/@uvI q/cord} :: post a tweet
|
||||
==
|
||||
++ sur-twit . :: XX
|
||||
::
|
||||
++ reqs
|
||||
|%
|
||||
++ args
|
||||
|%
|
||||
++ dev @t :: device name
|
||||
++ gat @t :: grant type
|
||||
++ lat @t :: latitude
|
||||
++ lid (list tid) :: screen names
|
||||
++ lon @t :: longitude
|
||||
++ lsc (list scr) ::
|
||||
++ nam @t :: location name
|
||||
++ pla @t :: place-id
|
||||
++ scr @t :: screen name
|
||||
++ slu @t :: category name
|
||||
++ tid @u :: user id
|
||||
++ tok @t :: oauth token
|
||||
++ url @t :: callback url
|
||||
--
|
||||
++ param
|
||||
=> args
|
||||
=< $? de gr id is la lo na os pl qq sc
|
||||
sd ss sl si st te ti ts ur ui us
|
||||
==
|
||||
|%
|
||||
++ de {$device p/dev}
|
||||
++ gr {$grant-type p/gat}
|
||||
++ id {$id p/tid}
|
||||
++ is {$id p/lid}
|
||||
++ la {$lat p/lat}
|
||||
++ lo {$long p/lon}
|
||||
++ na {$name p/nam}
|
||||
++ os {$source-screen-name p/scr}
|
||||
++ pl {$place-id p/pla}
|
||||
++ qq {$q p/@t}
|
||||
++ sc {$screen-name p/scr}
|
||||
++ sd ?(ui sc)
|
||||
++ ss {$screen-name p/lsc}
|
||||
++ sl {$slug p/slu}
|
||||
++ si {$source-id p/tid}
|
||||
++ st {$status p/@t}
|
||||
++ te {$text p/@t}
|
||||
++ ti {$target-id p/tid}
|
||||
++ ts {$target-screen-name p/scr}
|
||||
++ ur {$url p/url}
|
||||
++ ui {$user-id p/tid}
|
||||
++ us {$user-id p/lid}
|
||||
--
|
||||
::
|
||||
:: the head of every element in ++doc-data is a hoon type for an endpoint
|
||||
:: ++endpoint is the grand union of all of them
|
||||
++ endpoint (normalize (fork-clams (heads doc-data)))
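:: illustrative example (not from the original source): one value of
:: ++endpoint built from the doc-data table below is
::   [%posts-by [%screen-name 'urbit'] ~]
:: which ++find-req in lib/twitter should resolve to
:: [%get /statuses/user-timeline].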
|
||||
++ heads |*(a/(pole) ?~(a a [-<.a (heads +.a)]))
|
||||
++ fork-clams
|
||||
=+ $:{a/(pair _{term *} (pole _{term *}))}
|
||||
|@ ++ $
|
||||
?~ q.a p.a
|
||||
?(p.a (fork-clams q.a))
|
||||
--
|
||||
::
|
||||
++ normalize
|
||||
=+ $:{a/_{@ *}}
|
||||
|@ ++ $
|
||||
|= b/*
|
||||
^+ [?@(- . .)]:(a b)
|
||||
(a b)
|
||||
--
|
||||
::
|
||||
++ doc-data-dry :: statically typed for endpoint lookup
|
||||
=, param
|
||||
^- (list {typ/_{term (list param)} met/?($get $post) pax/path})
|
||||
doc-data
|
||||
::
|
||||
++ doc-data :: scraped from api docs, used to create types and requests
|
||||
:: ^- (pole {_{term _(pole *param)} ?($get $post) path})
|
||||
=> param
|
||||
:~
|
||||
[ {$mentions ~} %get /statuses/mentions-timeline ]
|
||||
[ {$posts-by sd ~} %get /statuses/user-timeline ]
|
||||
[ {$timeline ~} %get /statuses/home-timeline ]
|
||||
[ {$retweets-mine ~} %get /statuses/retweets-of-me ]
|
||||
[ {$retweets-of id ~} %get /statuses/retweets/':id' ]
|
||||
[ {$show-status id ~} %get /statuses/show ]
|
||||
[ {$del-status id ~} %post /statuses/destroy/':id' ]
|
||||
[ {$full-status id ~} %post /statuses/lookup ]
|
||||
[ {$update st ~} %post /statuses/update ]
|
||||
[ {$retweet id ~} %post /statuses/retweet/':id' ]
|
||||
[ {$unretweet id ~} %post /statuses/unretweet/':id' ]
|
||||
::
|
||||
[ {$oembed-from-id id ~} %get /statuses/oembed ]
|
||||
[ {$oembed-from-url ur ~} %get /statuses/oembed ]
|
||||
[ {$retweeters id ~} %get /statuses/retweeters/ids ]
|
||||
[ {$search qq ~} %get /search/tweets ]
|
||||
[ {$all-dms ~} %get /direct-messages ]
|
||||
[ {$all-dms-sent ~} %get /direct-messages/sent ]
|
||||
[ {$show-dm id ~} %get /direct-messages/show ]
|
||||
[ {$del-dm id ~} %post /direct-messages/destroy ]
|
||||
[ {$dm sd te ~} %post /direct-messages/new ]
|
||||
::
|
||||
[ {$blocked-retweeters ~} %get /friendships/no-retweets/ids ]
|
||||
[ {$followers sd ~} %get /followers/list ]
|
||||
[ {$follower-ids sd ~} %get /followers/ids ]
|
||||
[ {$friends sd ~} %get /friends/list ]
|
||||
[ {$friend-ids sd ~} %get /friends/ids ]
|
||||
[ {$friend-requests ~} %get /friendships/incoming ]
|
||||
[ {$friend-requesting ~} %get /friendships/outgoing ]
|
||||
[ {$follow sd ~} %post /friendships/create ]
|
||||
[ {$unfollow sd ~} %post /friendships/destroy ]
|
||||
[ {$set-friendship sd ~} %post /friendships/update ]
|
||||
[ {$relationships ?(us ss) ~} %get /friendships/lookup ]
|
||||
:- {$relationship ?(si os) ?(ti ts) ~}
|
||||
[%get /friendships/show]
|
||||
::
|
||||
[ {$show-settings ~} %get /account/settings ]
|
||||
[ {$test-login ~} %get /account/verify-credentials ]
|
||||
[ {$set-settings ~} %post /account/settings ]
|
||||
[ {$set-sms-target de ~} %post /account/update-delivery-device ]
|
||||
[ {$set-profile ~} %post /account/update-profile ]
|
||||
[ {$set-colors ~} %post /account/update-profile-colors ]
|
||||
[ {$del-background ~} %post /account/remove-profile-banner ]
|
||||
:- {$set-background ~}
|
||||
[%post /account/update-profile-background-image]
|
||||
::
|
||||
[ {$blocks ~} %get /blocks/list ]
|
||||
[ {$blocks-ids ~} %get /blocks/ids ]
|
||||
[ {$block sd ~} %post /blocks/create ]
|
||||
[ {$unblock sd ~} %post /blocks/destroy ]
|
||||
::
|
||||
[ {$full-users ?(us ss) ~} %get /users/lookup ]
|
||||
[ {$user sd ~} %get /users/show ]
|
||||
[ {$search-users qq ~} %get /users/search ]
|
||||
[ {$user-contributees sd ~} %get /users/contributees ] :: undoc'd
|
||||
[ {$user-contributors sd ~} %get /users/contributors ] :: undoc'd
|
||||
[ {$user-prof sd ~} %get /users/profile-banner ]
|
||||
::
|
||||
[ {$mute-user sd ~} %post /mutes/users/create ]
|
||||
[ {$unmute-user sd ~} %post /mutes/users/destroy ]
|
||||
[ {$muted ~} %get /mutes/users/list ]
|
||||
[ {$muted-ids ~} %get /mutes/users/ids ]
|
||||
::
|
||||
[ {$suggested ~} %get /users/suggestions ]
|
||||
[ {$suggestion sl ~} %get /users/suggestions/':slug' ]
|
||||
:- {$suggestion-posts sl ~}
|
||||
[%get /users/suggestions/':slug'/members]
|
||||
::
|
||||
[ {$favorites ~} %get /favorites/list ]
|
||||
[ {$del-favorite id ~} %post /favorites/destroy ]
|
||||
[ {$favorite id ~} %post /favorites/create ]
|
||||
::
|
||||
[ {$lists ~} %get /lists/list ]
|
||||
[ {$lists-of sd ~} %get /lists/memberships ]
|
||||
[ {$lists-by sd ~} %get /lists/ownerships ]
|
||||
[ {$lists-subscribed sd ~} %get /lists/subscriptions ]
|
||||
[ {$list ~} %get /lists/show ]
|
||||
[ {$list-posts ~} %get /lists/statuses ]
|
||||
[ {$list-remove ?(us ss) ~} %post /lists/members/destroy-all ]
|
||||
[ {$list-subscribers ~} %get /lists/subscribers ]
|
||||
[ {$list-subscribe ~} %post /lists/subscribers/create ]
|
||||
[ {$list-unsubscribe ~} %post /lists/subscribers/destroy ]
|
||||
[ {$list-is-subscribed sd ~} %get /lists/subscribers/show ]
|
||||
[ {$list-add ?(us ss) ~} %post /lists/members/create-all ]
|
||||
[ {$list-is-in sd ~} %get /lists/members/show ]
|
||||
[ {$list-members ~} %get /lists/members ]
|
||||
[ {$del-list ~} %post /lists/destroy ]
|
||||
[ {$config-list ~} %post /lists/update ]
|
||||
[ {$new-list na ~} %post /lists/create ]
|
||||
::
|
||||
[ {$saved-searches ~} %get /saved-searches/list ]
|
||||
[ {$full-saved-search id ~} %get /saved-searches/show/':id' ]
|
||||
[ {$save-search qq ~} %post /saved-searches/create ]
|
||||
[ {$del-saved-search id ~} %post /saved-searches/destroy/':id' ]
|
||||
::
|
||||
[ {$full-geo id ~} %get /geo/id/':id' ]
|
||||
[ {$geo-reverse la lo ~} %get /geo/reverse-geocode ]
|
||||
[ {$search-geo ~} %get /geo/search ]
|
||||
[ {$geo-similar la lo na ~} %get /geo/similar-places ]
|
||||
[ {$trend-locations ~} %get /trends/available ]
|
||||
[ {$trends-at id ~} %get /trends/place ]
|
||||
[ {$trends-near la lo ~} %get /trends/closest ]
|
||||
::
|
||||
[ {$user-report sd ~} %post /users/report-spam ]
|
||||
[ {$help-config ~} %get /help/configuration ]
|
||||
[ {$help-langs ~} %get /help/languages ]
|
||||
[ {$help-privacy ~} %get /help/privacy ]
|
||||
[ {$help-tos ~} %get /help/tos ]
|
||||
[ {$rate-limit-info ~} %get /application/rate-limit-status ]
|
||||
==
|
||||
--
|
||||
--
|
@ -1,150 +0,0 @@
|
||||
|%
|
||||
:: # %unicode-data
|
||||
:: types to represent UnicodeData.txt.
|
||||
+| %unicode-data
|
||||
++ line
|
||||
:: an individual codepoint definition
|
||||
::
|
||||
$: code=@c :: codepoint in hexadecimal format
|
||||
name=tape :: character name
|
||||
gen=general :: type of character this is
|
||||
:: canonical combining class for ordering algorithms
|
||||
can=@ud
|
||||
bi=bidi :: bidirectional category of this character
|
||||
de=decomp :: character decomposition mapping
|
||||
:: todo: decimal/digit/numeric need to be parsed.
|
||||
decimal=tape :: decimal digit value (or ~)
|
||||
digit=tape :: digit value, covering non decimal radix forms
|
||||
numeric=tape :: numeric value, including fractions
|
||||
mirrored=? :: whether char is mirrored in bidirectional text
|
||||
old-name=tape :: unicode 1.0 compatibility name
|
||||
iso=tape :: iso 10646 comment field
|
||||
up=(unit @c) :: uppercase mapping codepoint
|
||||
low=(unit @c) :: lowercase mapping codepoint
|
||||
title=(unit @c) :: titlecase mapping codepoint
|
||||
==
|
||||
::
|
||||
++ general
|
||||
:: one of the normative or informative unicode general categories
|
||||
::
|
||||
:: these abbreviations are as found in the unicode standard, except
|
||||
:: lowercased so as to be valid symbols.
|
||||
$? $lu :: letter, uppercase
|
||||
$ll :: letter, lowercase
|
||||
$lt :: letter, titlecase
|
||||
$mn :: mark, non-spacing
|
||||
$mc :: mark, spacing combining
|
||||
$me :: mark, enclosing
|
||||
$nd :: number, decimal digit
|
||||
$nl :: number, letter
|
||||
$no :: number, other
|
||||
$zs :: separator, space
|
||||
$zl :: separator, line
|
||||
$zp :: separator, paragraph
|
||||
$cc :: other, control
|
||||
$cf :: other, format
|
||||
$cs :: other, surrogate
|
||||
$co :: other, private use
|
||||
$cn :: other, not assigned
|
||||
::
|
||||
$lm :: letter, modifier
|
||||
$lo :: letter, other
|
||||
$pc :: punctuation, connector
|
||||
$pd :: punctuation, dash
|
||||
$ps :: punctuation, open
|
||||
$pe :: punctuation, close
|
||||
$pi :: punctuation, initial quote
|
||||
$pf :: punctuation, final quote
|
||||
$po :: punctuation, other
|
||||
$sm :: symbol, math
|
||||
$sc :: symbol, currency
|
||||
$sk :: symbol, modifier
|
||||
$so :: symbol, other
|
||||
==
|
||||
::
|
||||
++ bidi
|
||||
:: bidirectional category of a unicode character
|
||||
$? $l :: left-to-right
|
||||
$lre :: left-to-right embedding
|
||||
$lri :: left-to-right isolate
|
||||
$lro :: left-to-right override
|
||||
$fsi :: first strong isolate
|
||||
$r :: right-to-left
|
||||
$al :: right-to-left arabic
|
||||
$rle :: right-to-left embedding
|
||||
$rli :: right-to-left isolate
|
||||
$rlo :: right-to-left override
|
||||
$pdf :: pop directional format
|
||||
$pdi :: pop directional isolate
|
||||
$en :: european number
|
||||
$es :: european number separator
|
||||
$et :: european number terminator
|
||||
$an :: arabic number
|
||||
$cs :: common number separator
|
||||
$nsm :: non-spacing mark
|
||||
$bn :: boundary neutral
|
||||
$b :: paragraph separator
|
||||
$s :: segment separator
|
||||
$ws :: whitespace
|
||||
$on :: other neutrals
|
||||
==
|
||||
::
|
||||
++ decomp
|
||||
:: character decomposition mapping.
|
||||
::
|
||||
:: tag: type of decomposition.
|
||||
:: c: a list of codepoints this decomposes into.
|
||||
(unit {tag/(unit decomp-tag) c/(list @c)})
|
||||
::
|
||||
++ decomp-tag
|
||||
:: tag that describes the type of a character decomposition.
|
||||
$? $font :: a font variant
|
||||
$nobreak :: a no-break version of a space or hyphen
|
||||
$initial :: an initial presentation form (arabic)
|
||||
$medial :: a medial presentation form (arabic)
|
||||
$final :: a final presentation form (arabic)
|
||||
$isolated :: an isolated presentation form (arabic)
|
||||
$circle :: an encircled form
|
||||
$super :: a superscript form
|
||||
$sub :: a subscript form
|
||||
$vertical :: a vertical layout presentation form
|
||||
$wide :: a wide (or zenkaku) compatibility character
|
||||
$narrow :: a narrow (or hankaku) compatibility character
|
||||
$small :: a small variant form (cns compatibility)
|
||||
$square :: a cjk squared font variant
|
||||
$fraction :: a vulgar fraction form
|
||||
$compat :: otherwise unspecified compatibility character
|
||||
==
|
||||
::
|
||||
:: #
|
||||
:: # %case-map
|
||||
:: #
|
||||
:: types to represent fast lookups of case data
|
||||
+| %case-map
|
||||
++ case-offset
|
||||
:: case offsets can be in either direction
|
||||
$% :: add {a} to get the new character
|
||||
[%add a=@u]
|
||||
:: subtract {a} to get the new character
|
||||
[%sub s=@u]
|
||||
:: take no action; return self
|
||||
[%none ~]
|
||||
:: represents a series of alternating uppercase/lowercase characters
[%uplo ~]
|
||||
==
|
||||
::
|
||||
++ case-node
|
||||
:: a node in a case-tree.
|
||||
::
|
||||
:: represents a range of codepoints that share the same case offsets
$: start=@ux
|
||||
end=@ux
|
||||
upper=case-offset
|
||||
lower=case-offset
|
||||
title=case-offset
|
||||
==
|
||||
::
|
||||
++ case-tree
|
||||
:: a binary search tree of ++case-node items, sorted on span.
|
||||
(tree case-node)
|
||||
--
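To make the case-map structures above concrete: a lookup walks the sorted, non-overlapping codepoint ranges and applies the matching offset for the requested case. The following JavaScript sketch is illustrative only (it is not taken from this repository, and the `%uplo` alternating case is deliberately left out); it uses a binary search over a plain array standing in for the ++case-tree:

// Offsets mirror ++case-offset: {op: 'add', n}, {op: 'sub', n}, or {op: 'none'}.
function applyOffset(cp, off) {
  switch (off.op) {
    case 'add':  return cp + off.n;
    case 'sub':  return cp - off.n;
    case 'none': return cp;
    default:     return cp;  // unhandled kinds (e.g. %uplo) fall back to identity here
  }
}

// nodes: sorted by `start`, non-overlapping, e.g. ASCII lowercase -> uppercase:
//   { start: 0x61, end: 0x7a,
//     upper: {op: 'sub', n: 32}, lower: {op: 'none'}, title: {op: 'sub', n: 32} }
function lookup(nodes, cp, which /* 'upper' | 'lower' | 'title' */) {
  let lo = 0, hi = nodes.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    const node = nodes[mid];
    if (cp < node.start)    hi = mid - 1;
    else if (cp > node.end) lo = mid + 1;
    else                    return applyOffset(cp, node[which]);
  }
  return cp;  // not covered by any node: no case mapping
}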
|
@ -1,13 +0,0 @@
|
||||
::
:::: /hoon/listen/web
::
/? 310
;div.mini-module
;script@"/~/at/lib/js/urb.js";
;script@"https://cdn.rawgit.com/seatgeek/react-infinite/0.8.0/dist/react-infinite.js";
;script@"https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.11.2/moment-with-locales.js";
;script@"https://cdnjs.cloudflare.com/ajax/libs/moment-timezone/0.5.1/moment-timezone.js";
;script@"/talk/main.js";
;link/"/talk/main.css"(rel "stylesheet");
;talk(readonly "", chrono "reverse", station "comments");
==
|
@ -1,73 +0,0 @@
|
||||
var secToString = function(secs) {
|
||||
if (secs <= 0) {
|
||||
return 'Completed';
|
||||
}
|
||||
secs = Math.floor(secs)
|
||||
var min = 60;
|
||||
var hour = 60 * min;
|
||||
var day = 24 * hour;
|
||||
var week = 7 * day;
|
||||
var year = 52 * week;
|
||||
var fy = function(s) {
|
||||
if (s < year) {
|
||||
return ['', s];
|
||||
} else {
|
||||
return [Math.floor(s / year) + 'y', s % year];
|
||||
}
|
||||
}
|
||||
var fw = function(tup) {
|
||||
var str = tup[0];
|
||||
var sec = tup[1];
|
||||
if (sec < week) {
|
||||
return [str, sec];
|
||||
} else {
|
||||
return [str + ' ' + Math.floor(sec / week) + 'w', sec % week];
|
||||
}
|
||||
}
|
||||
var fd = function(tup) {
|
||||
var str = tup[0];
|
||||
var sec = tup[1];
|
||||
if (sec < day) {
|
||||
return [str, sec];
|
||||
} else {
|
||||
return [str + ' ' + Math.floor(sec / day) + 'd', sec % day];
|
||||
}
|
||||
}
|
||||
var fh = function(tup) {
|
||||
var str = tup[0];
|
||||
var sec = tup[1];
|
||||
if (sec < hour) {
|
||||
return [str, sec];
|
||||
} else {
|
||||
return [str + ' ' + Math.floor(sec / hour) + 'h', sec % hour];
|
||||
}
|
||||
}
|
||||
var fm = function(tup) {
|
||||
var str = tup[0];
|
||||
var sec = tup[1];
|
||||
if (sec < min) {
|
||||
return [str, sec];
|
||||
} else {
|
||||
return [str + ' ' + Math.floor(sec / min) + 'm', sec % min];
|
||||
}
|
||||
}
|
||||
var fs = function(tup) {
|
||||
var str = tup[0];
|
||||
var sec = tup[1];
|
||||
return str + ' ' + sec + 's';
|
||||
}
|
||||
return fs(fm(fh(fd(fw(fy(secs)))))).trim();
|
||||
}
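For reference, a few expected results of `secToString` as defined above (worked out from the code itself, not taken from the original page), plus a note on how the onload handler below uses it:

// Hypothetical spot-checks; not part of the original file.
console.assert(secToString(0) === 'Completed');
console.assert(secToString(3600) === '1h 0s');        // exactly one hour
console.assert(secToString(90061) === '1d 1h 1m 1s'); // 1 day, 1 hour, 1 minute, 1 second
// The onload handler below displays only the largest unit, because it
// splits the result on ' ' and keeps the first element.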
|
||||
|
||||
window.onload = function() {
|
||||
var das = document.querySelectorAll('[data-urb-elapsed]');
|
||||
for (var i=0; i < das.length; i ++) {
|
||||
var urbD = das[i].dataset.urbElapsed; // UTC
|
||||
var serverTime = new Date(urbD);
|
||||
var clientTime = new Date(); // local
|
||||
var interval = secToString((clientTime - serverTime) / 1000).split(' ')[0];
|
||||
document.querySelector("[data-urb-elapsed='" + urbD + "']").innerHTML = '-' + interval;
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -1,18 +0,0 @@
|
||||
document.toggleDisplay = function(id1, id2) {
|
||||
var id1 = 'show';
|
||||
var id2 = 'edit';
|
||||
var isDisplayed = function(id) {
|
||||
return document.getElementById(id).style.display != 'none';
|
||||
}
|
||||
console.log(document.getElementById(id1));
|
||||
console.log(document.getElementById(id2));
|
||||
if (isDisplayed(id1)) {
|
||||
document.getElementById(id1).style.display = 'none';
|
||||
document.getElementById(id2).style.display = 'inherit';
|
||||
} else {
|
||||
document.getElementById(id1).style.display = 'inherit';
|
||||
document.getElementById(id2).style.display = 'none';
|
||||
}
|
||||
};
|
||||
|
||||
document.getElementById('edit-btn').onclick = document.toggleDisplay;
|
@ -1,13 +0,0 @@
|
||||
::
:::: /hoon/talk/web
::
/? 310
;module(nav_title "Talk", nav_no-dpad "", nav_no-sibs "", nav_subnav "talk-station")
;script@"/~~/~/at/lib/js/urb.js";
;script@"https://cdn.rawgit.com/seatgeek/react-infinite/0.8.0/dist/react-infinite.js";
;script@"https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.11.2/moment-with-locales.js";
;script@"https://cdnjs.cloudflare.com/ajax/libs/moment-timezone/0.5.1/moment-timezone.js";
;script@"/talk/main.js";
;link/"/talk/main.css"(rel "stylesheet");
;talk;
==
|
@ -1,255 +0,0 @@
|
||||
.planet,
|
||||
.room {
|
||||
font-family: 'scp'; }
|
||||
|
||||
div.input.valid-false {
|
||||
color: #FF0808;
|
||||
border-color: #FF0808; }
|
||||
|
||||
.grams {
|
||||
list-style-type: none;
|
||||
padding: 0; }
|
||||
.grams .meta {
|
||||
display: inline-block; }
|
||||
|
||||
.grams .meta {
|
||||
max-height: 1.6rem;
|
||||
overflow: hidden;
|
||||
width: 100%; }
|
||||
.grams .meta label {
|
||||
margin-right: 0.9375rem;
|
||||
height: 0.9375rem;
|
||||
width: 0.9375rem;
|
||||
text-align: center; }
|
||||
.grams .meta label:before {
|
||||
content: attr(data-glyph);
|
||||
color: #fff;
|
||||
font-family: 'scp';
|
||||
font-size: .8rem;
|
||||
font-weight: 500;
|
||||
line-height: 1rem;
|
||||
vertical-align: middle; }
|
||||
.grams .meta label,
|
||||
.grams .meta h2,
|
||||
.grams .meta h3 {
|
||||
display: inline-block;
|
||||
vertical-align: top; }
|
||||
.grams .meta h2,
|
||||
.grams .meta h3 {
|
||||
font-family: 'scp';
|
||||
font-size: .8rem;
|
||||
font-weight: 400;
|
||||
margin-top: 0;
|
||||
padding-top: 0; }
|
||||
.grams .meta h3 {
|
||||
margin-left: 2rem;
|
||||
line-height: 1rem; }
|
||||
.grams .meta .time {
|
||||
padding-right: 2rem;
|
||||
float: right; }
|
||||
|
||||
.grams .meta:hover {
|
||||
overflow: visible; }
|
||||
|
||||
.gram:hover {
|
||||
z-index: 2;
|
||||
position: relative; }
|
||||
|
||||
div.gram.first:first-of-type {
|
||||
margin-top: 0; }
|
||||
|
||||
div.gram.first {
|
||||
margin-top: 1.875rem; }
|
||||
|
||||
div.gram.same div.meta {
|
||||
display: none; }
|
||||
|
||||
div.gram.same:hover div.meta {
|
||||
display: block;
|
||||
position: absolute;
|
||||
z-index: 0; }
|
||||
div.gram.same:hover div.meta label, div.gram.same:hover div.meta h2, div.gram.same:hover div.meta h3 {
|
||||
display: none; }
|
||||
div.gram.same:hover div.meta h3.time {
|
||||
display: block;
|
||||
opacity: .6;
|
||||
padding-right: 3.8rem; }
|
||||
|
||||
.speech {
|
||||
position: absolute;
|
||||
z-index: 1;
|
||||
margin-left: 1.875rem; }
|
||||
.speech .fat {
|
||||
max-height: 0;
|
||||
transition: max-height .1s ease-in-out;
|
||||
overflow: hidden; }
|
||||
.speech .fat pre {
|
||||
color: #fff; }
|
||||
.speech:hover .fat {
|
||||
max-height: 16rem;
|
||||
overflow: scroll;
|
||||
background-color: #000;
|
||||
color: #fff; }
|
||||
|
||||
.exp {
|
||||
font-family: 'scp';
|
||||
font-size: .9rem; }
|
||||
.exp .speech {
|
||||
max-width: 100%;
|
||||
overflow-x: scroll; }
|
||||
.exp .speech > span {
|
||||
color: #fff;
|
||||
background-color: #000;
|
||||
padding: .3rem; }
|
||||
|
||||
.comment .speech a.btn {
|
||||
background-color: transparent;
|
||||
color: #B1B7BD;
|
||||
font-size: .9rem;
|
||||
border: 0;
|
||||
border-bottom: 3px solid #b1b7bd;
|
||||
text-transform: none;
|
||||
text-decoration: none;
|
||||
padding: 0;
|
||||
line-height: 1rem;
|
||||
margin: 1rem 0 2rem 0;
|
||||
letter-spacing: 0; }
|
||||
|
||||
.gram pre {
|
||||
background-color: transparent; }
|
||||
|
||||
div.gram label {
|
||||
background-color: #000; }
|
||||
|
||||
div.gram.say .speech {
|
||||
font-style: italic; }
|
||||
|
||||
div.gram.pending .speech {
|
||||
color: #B1B7BD; }
|
||||
|
||||
.author,
|
||||
.path {
|
||||
cursor: pointer; }
|
||||
|
||||
input.action {
|
||||
background-color: transparent;
|
||||
border-color: transparent;
|
||||
font-family: 'scp';
|
||||
font-size: .8rem; }
|
||||
|
||||
input.action.valid-false {
|
||||
color: #FF0808; }
|
||||
|
||||
input.action::-webkit-input-placeholder {
|
||||
color: #000;
|
||||
font-family: 'bau';
|
||||
font-size: 1rem; }
|
||||
|
||||
input.action:-moz-placeholder {
|
||||
color: #000;
|
||||
font-family: 'bau';
|
||||
font-size: 1rem; }
|
||||
|
||||
input.action::-moz-placeholder {
|
||||
color: #000;
|
||||
font-family: 'bau';
|
||||
font-size: 1rem; }
|
||||
|
||||
input.action:-ms-input-placeholder {
|
||||
color: #000;
|
||||
font-family: 'bau';
|
||||
font-size: 1rem; }
|
||||
|
||||
input.action:focus::-webkit-input-placeholder {
|
||||
color: transparent; }
|
||||
|
||||
input.action:focus:-moz-placeholder {
|
||||
color: transparent; }
|
||||
|
||||
input.action:focus::-moz-placeholder {
|
||||
color: transparent; }
|
||||
|
||||
input.action:focus:-ms-input-placeholder {
|
||||
color: transparent; }
|
||||
|
||||
.menu {
|
||||
max-height: 100%; }
|
||||
.menu .planet,
|
||||
.menu .room {
|
||||
margin-bottom: .8rem; }
|
||||
.menu .name,
|
||||
.menu .planet {
|
||||
display: inline-block; }
|
||||
.menu .name,
|
||||
.menu .planet,
|
||||
.menu .room {
|
||||
font-size: .8rem; }
|
||||
.menu .room > div {
|
||||
display: inline;
|
||||
cursor: pointer; }
|
||||
.menu .room > div.selected {
|
||||
font-weight: 500; }
|
||||
.menu .room .close {
|
||||
display: none;
|
||||
margin: 0;
|
||||
float: none;
|
||||
margin-left: .6rem;
|
||||
font-weight: 600;
|
||||
font-size: .8rem;
|
||||
color: #FF0808; }
|
||||
.menu .room:hover .close {
|
||||
display: inline; }
|
||||
.menu .room.disabled {
|
||||
opacity: .6; }
|
||||
.menu .name {
|
||||
display: none;
|
||||
min-width: 33.333%;
|
||||
font-size: .9rem; }
|
||||
.menu .planet {
|
||||
min-width: 66.667%; }
|
||||
|
||||
.menu.depth-2 {
|
||||
overflow: scroll; }
|
||||
|
||||
.input {
|
||||
display: inline-block;
|
||||
line-height: 2rem;
|
||||
font-size: 1rem;
|
||||
padding: 0 .2rem;
|
||||
min-width: 1rem;
|
||||
min-height: 1rem;
|
||||
outline: none; }
|
||||
.input[contenteditable] {
|
||||
border-bottom: 3px solid #000; }
|
||||
|
||||
.audience,
|
||||
.message {
|
||||
margin-left: 1.875rem; }
|
||||
|
||||
.audience {
|
||||
margin-bottom: 1rem; }
|
||||
|
||||
.audience .input {
|
||||
border-color: #B1B7BD;
|
||||
font-family: 'scp';
|
||||
font-size: .8rem; }
|
||||
|
||||
.message {
|
||||
display: inline-block; }
|
||||
|
||||
.message .input {
|
||||
border-color: #373a3c;
|
||||
font-family: 'bau'; }
|
||||
|
||||
.writing {
|
||||
margin-top: 2rem; }
|
||||
|
||||
.length {
|
||||
display: inline-block;
|
||||
width: 120px;
|
||||
margin-left: 2rem;
|
||||
line-height: 2rem;
|
||||
font-family: 'bau';
|
||||
font-size: .7rem;
|
||||
font-weight: 500;
|
||||
letter-spacing: 1px; }
|
2667  web/talk/main.js   (file diff suppressed because it is too large)
1487  web/tree/main.css  (file diff suppressed because it is too large)
3595  web/tree/main.js   (file diff suppressed because it is too large)
@ -1,3 +0,0 @@
|
||||
The quick *brown fox* jumped over #(add 2 2)
their owner's "extremely lazy" dogs.
@ -1,3 +0,0 @@
|
||||
;style:'#test-style {transform: skew(25deg)}'

### Test style
|
@ -1,12 +0,0 @@
|
||||
;+
|
||||
;>
|
||||
foo *some style*
|
||||
|
||||
outdent
|
||||
|
||||
;= ;div; ==
|
||||
|
||||
;=
|
||||
moar markdown
|
||||
==
|
||||
|
@ -1,11 +0,0 @@
|
||||
The quick brown fox jumped _over
the_ extremely lazy dogs.

Then a horse arrived. It was extremely angry.
Outside, two bears [were fighting](http://google.com) each other.

Also present at the scene were:

- an Armenian.

Everything was soon back to normal.
|
@ -1,52 +0,0 @@
|
||||
#(add 2 2) is a hoon expression
|
||||
|
||||
un*bearably*
|
||||
|
||||
0b1100
|
||||
|
||||
---
|
||||
|
||||
|
||||
## This is a header
|
||||
|
||||
The quick brown fox jumped over
|
||||
the extremely lazy dogs.
|
||||
|
||||
Then a horse arrived. It was extremely angry.
|
||||
Outside, two bears [were fighting](http://google.com) each other.
|
||||
|
||||
Also present at _the intense %hoon scene_ were:
|
||||
|
||||
- an Armenian.
|
||||
|
||||
- a haberdasher.
|
||||
|
||||
A haberdasher is someone who makes hats. There are quite
|
||||
a few kinds of hats:
|
||||
|
||||
- fedoras
|
||||
|
||||
- borsalinos
|
||||
|
||||
- sombreros
|
||||
|
||||
- baseball caps
|
||||
|
||||
All these devices will protect your bald spot from the rain.
|
||||
|
||||
It is _sometimes difficult_ to be a bald man when it's raining.
|
||||
|
||||
We sometimes speak in %hoon We also say 0xdead.beef things like ~ and #`@`2.
|
||||
|
||||
We don't care if we sound funny, and sometimes we !@#$%%#^? cuss.
|
||||
|
||||
```
|
||||
We also sometimes put
|
||||
in
|
||||
code
|
||||
looks
|
||||
|
||||
like
|
||||
this.
|
||||
```
|
||||
|
@ -1,18 +0,0 @@
|
||||
## A digital home base

What you need is a digital home base. What is that computer? Is
it (a) your phone, (b) your browser, (c) your PC or laptop, (d)
your AWS instance, (e) your RasPi or other custom home computer?

Here are three obvious features your digital home base needs.
(1) it should be infinitely secure and persistent -- at the level
of Amazon S3, Gmail, your bank, etc. (2) it should be a server,
not just a client. (3) it should be usable by ordinary people.

Everything except (d) falls far short of (1) and/or (2). (d)
falls far short of (3).

The missing piece is a practical _personal server_ -- a virtual
computer in the cloud, with persistence guarantees comparable to
cloud storage services, that's as completely yours as a RasPi.
|
||||
|
@ -1,6 +0,0 @@
|
||||
*brown fox* ;{s "ignoreme"} ;{a(name "foo")} jumped over
|
||||
|
||||
;div#test: hello world
|
||||
|
||||
- - foo
|
||||
- bar
|
@ -1,37 +0,0 @@
|
||||
> xyz
|
||||
abc
|
||||
|
||||
```
|
||||
code at the beginning of the line
|
||||
```
|
||||
|
||||
zyxxy
|
||||
|
||||
> bar
|
||||
|
||||
poe
|
||||
m
|
||||
|
||||
> baz
|
||||
> bal
|
||||
|
||||
- - bleh
|
||||
- blah
|
||||
+ one
|
||||
+ two
|
||||
|
||||
1
|
||||
|
||||
> > bel
|
||||
> what did you just say about me
|
||||
|
||||
...
|
||||
|
||||
```
|
||||
code
|
||||
still code?
|
||||
```
|
||||
|
||||
> > foo
|
||||
|
||||
not-code
|
@ -1 +0,0 @@
|
||||
> - + ;div.test: nesting
|
@ -1,65 +0,0 @@
|
||||
:: Render all %%/{@u}.txt test cases
|
||||
::
|
||||
:::: /hoon/all/unmark/web
|
||||
::
|
||||
/+ cram
|
||||
::
|
||||
/= cor
|
||||
/^ (list {@ud wain})
|
||||
/: /%%/
|
||||
/; |= a/(map knot wain)
|
||||
=; kid/(list {@ud wain}) (sort kid dor)
|
||||
%+ murn ~(tap by a)
|
||||
|= {b/knot c/wain}
|
||||
%+ bind (slaw %ud b)
|
||||
|=(d/@ud [d c])
|
||||
/_ /txt/
|
||||
::
|
||||
|%
|
||||
++ rolt |=(a/wall `tape`?~(a ~ ?~(t.a i.a :(weld i.a "\0a" $(a t.a)))))
|
||||
++ wush
|
||||
|= {wid/@u tan/tang} ^- tape
|
||||
(rolt (zing (turn tan |=(a/tank (wash 0^wid a)))))
|
||||
::
|
||||
++ mads
|
||||
=, userlib
|
||||
|= a/wain ^- manx
|
||||
=/ try/(each manx tang)
|
||||
%- mule |.
|
||||
elm:(static:cram (rash (nule:unix ';>' a) apex:(sail &):vast))
|
||||
?- -.try
|
||||
%& p.try
|
||||
%| ;div
|
||||
;h3: ERROR
|
||||
;pre: {(wush 120 p.try)}
|
||||
== ==
|
||||
::
|
||||
++ split-on
|
||||
=| hed/wain
|
||||
|= {mid/@t all/wain} ^+ [hed all]
|
||||
?~ all !!
|
||||
?: =(mid i.all) [(flop hed) t.all]
|
||||
$(all t.all, hed :_(hed i.all))
|
||||
::
|
||||
++ strip
|
||||
|= a/manx ^- manx
|
||||
:_ (turn c.a ..$)
|
||||
?+ g.a g.a
|
||||
{@ {$id *} *} g.a(a t.a.g.a)
|
||||
{$$ {$$ *} ~}
|
||||
=< g.a(v.i.a (tufa (turn (tuba v.i.a.g.a) .)))
|
||||
|=(b/@c `@`?+(b b %~-~201c. '"', %~-~201d. '"'))
|
||||
==
|
||||
--
|
||||
::
|
||||
^- manx
|
||||
;ul
|
||||
;li
|
||||
;ul
|
||||
;* ^- marl
|
||||
%+ turn cor
|
||||
|= {num/@u txt/wain}
|
||||
;li: ;{p -[<num>]} +{(mads txt)} ;{hr}
|
||||
==
|
||||
==
|
||||
==
|
@ -1,348 +0,0 @@
|
||||
:: :- :* title+"urbit-flavored markdown docs"
|
||||
:: author+"ted blackman"
|
||||
:: date+~2017.8.25
|
||||
:: ==
|
||||
::
|
||||
;>
|
||||
|
||||
# udon: urbit-flavored markdown

## overview

Udon is a minimal markup language for creating and rendering text documents,
with a markdown-inspired syntax. It's integrated with the hoon programming
language, allowing it to be used as standalone prose in its own file or inside
a hoon source file, in which case it will be parsed into a tree of HTML nodes
using hoon's `sail` datatype.

Udon is stricter than markdown and generally supports only one syntax for each
type of HTML node it emits.
|
||||
|
||||
### headers
|
||||
|
||||
Headers in udon begin with one or more `#` characters, followed by a space. The
|
||||
number of leading `#`s corresponds to the resulting HTML element: `#` yields an
|
||||
`<h1>`, `##` yields an `<h2>`, and so on through `<h6>`.
|
||||
|
||||
Example:
|
||||
```
|
||||
### Header (h3)
|
||||
|
||||
##### Header (h5)
|
||||
```
|
||||
produces:
|
||||
|
||||
> ### Header (h3)
|
||||
|
||||
##### Header (h5)
|
||||
|
||||
### lists
|
||||
|
||||
A line beginning with a `-` or `+` followed by a space is interpreted as an
|
||||
element of a list. `-` means unordered list (`<ul>`) and `+` means ordered list
|
||||
(`<ol>`).
|
||||
|
||||
Example:
|
||||
|
||||
```
|
||||
- unordered 1
|
||||
text on newline shows up on same line
|
||||
- unordered 2\
|
||||
text on newline after `\` puts in <br> line break
|
||||
|
||||
- unordered after 1 blank line
|
||||
- nested
|
||||
- double-nested
|
||||
|
||||
+ leading '+'
|
||||
+ leading '+'
|
||||
- unordered '-'
|
||||
+ nested ordered '+' item 1
|
||||
+ nested ordered '+' item 2
|
||||
|
||||
+ ordered '+'
|
||||
+ nested item 1
|
||||
+ nested item 2
|
||||
```
|
||||
|
||||
produces:
|
||||
|
||||
> - unordered 1
|
||||
text on newline shows up on same line
|
||||
- unordered 2\
|
||||
text on newline after `\` puts in <br> line break
|
||||
|
||||
- unordered after 1 blank line
|
||||
- nested
|
||||
- double-nested
|
||||
|
||||
+ leading '+'
|
||||
+ leading '+'
|
||||
- unordered '-'
|
||||
+ nested ordered '+' item 1
|
||||
+ nested ordered '+' item 2
|
||||
|
||||
+ ordered '+'
|
||||
+ nested item 1
|
||||
+ nested item 2
|
||||
|
||||
### blockquotes
|
||||
|
||||
A section of text beginning with `> ` and indented by two spaces yields a
`<blockquote>` element. This blockquote can itself contain more udon,
including more blockquotes to render nested levels of quotation.
|
||||
|
||||
Example:
|
||||
|
||||
```
|
||||
> As Gregor Samsa awoke one morning from uneasy dreams
|
||||
he found himself _transformed_ in his bed into a *monstrous* vermin.
|
||||
```
|
||||
|
||||
produces:
|
||||
|
||||
> > As Gregor Samsa awoke one morning from uneasy dreams
|
||||
he found himself _transformed_ in his bed into a *monstrous* vermin.
|
||||
|
||||
### code blocks
|
||||
|
||||
By enclosing a block of text in \`\`\` on their own lines
|
||||
before and after the block, the text will be treated as a code block.
|
||||
|
||||
Example:
|
||||
|
||||
```
|
||||
> ```
|
||||
(def Y (fn [f]
|
||||
((fn [x]
|
||||
(x x))
|
||||
(fn [x]
|
||||
(f (fn [y]
|
||||
((x x) y)))))))
|
||||
```
|
||||
```
|
||||
|
||||
produces:
|
||||
|
||||
> ```
|
||||
(def Y (fn [f]
|
||||
((fn [x]
|
||||
(x x))
|
||||
(fn [x]
|
||||
(f (fn [y]
|
||||
((x x) y)))))))
|
||||
```
|
||||
|
||||
### poems
|
||||
|
||||
A poem is a section of text with meaningful newlines. Normally in udon,
newlines are treated as spaces and do not create a new line of text. If you
want to embed text where newlines are retained, then indent the text in
question with eight spaces.
|
||||
|
||||
Example:
|
||||
```
|
||||
A shape with lion body and the head of a man,
|
||||
A gaze blank and pitiless as the sun,
|
||||
Is moving its slow thighs, while all about it
|
||||
Reel shadows of the indignant desert birds.
|
||||
```
|
||||
produces:
|
||||
> A shape with lion body and the head of a man,
|
||||
A gaze blank and pitiless as the sun,
|
||||
Is moving its slow thighs, while all about it
|
||||
Reel shadows of the indignant desert birds.
|
||||
|
||||
### sail expressions
|
||||
|
||||
It's possible to use udon as an HTML templating language akin to
|
||||
PHP, ERB, JSP, or Handlebars templates. This facility derives
|
||||
in part from the support for embedding hoon code inside the markup.
|
||||
There are two ways to embed hoon in udon: inline expressions and sail.
|
||||
[Sail](https://urbit.org/fora/posts/~2017.7.6..21.27.00..bebb~/)
|
||||
is a DSL within hoon for creating XML nodes, including HTML. It can
|
||||
be used directly within udon to provide scripting capability and also to
|
||||
provide more fine-grained control over the resulting HTML.
|
||||
|
||||
Example:
|
||||
```
|
||||
;=
|
||||
;p
|
||||
;strong: Don't panic!
|
||||
;br;
|
||||
;small: [reactive publishing intensifies]
|
||||
==
|
||||
==
|
||||
```
|
||||
|
||||
produces:
|
||||
> ;=
|
||||
;p
|
||||
;strong: Don't panic!
|
||||
;br;
|
||||
;small: [reactive publishing intensifies]
|
||||
==
|
||||
==
|
||||
|
||||
_Note:
|
||||
[urbit's web publishing system](https://urbit.org/docs/arvo/web-apps/)
|
||||
currently does not apply `<style>` elements or element attributes,
|
||||
which are supported in sail syntax. Future versions of the publishing
|
||||
system will rectify this._
|
||||
|
||||
### horizontal rules
|
||||
|
||||
`---` on its own line produces an `<hr>` element, the 'horizontal rule'.
|
||||
This is rendered as a horizontal line the width of its containing paragraph.
|
||||
|
||||
Example:
|
||||
```
|
||||
Above the line
|
||||
---
|
||||
Below the line
|
||||
```
|
||||
> :: produces:\
|
||||
Above the line
|
||||
---
|
||||
Below the line
|
||||
|
||||
### inline markup
|
||||
|
||||
In addition to the above, udon includes several options for marking up
|
||||
inline text.
|
||||
|
||||
##### bold
|
||||
|
||||
Enclose some text in asterisks to boldly render it inside a `<b>` element.
|
||||
|
||||
Example:
|
||||
```
|
||||
The first rule of tautology club is
|
||||
*the first rule of tautology club*.
|
||||
```
|
||||
produces:\
|
||||
|
||||
> The first rule of tautology club is
|
||||
*the first rule of tautology club*.
|
||||
|
||||
##### italics
|
||||
|
||||
Surrounding text with `_` on each side will cause it to appear
|
||||
in italics, using an `<i>` element.
|
||||
|
||||
Example:
|
||||
```
|
||||
Bueller? _Bueller?_
|
||||
```
|
||||
|
||||
produces:
|
||||
|
||||
Bueller? _Bueller?_
|
||||
|
||||
##### double quote
|
||||
|
||||
Text enclosed in double quotes (`"`) will be rendered with
|
||||
opening and closing quotes.
|
||||
|
||||
Example:
|
||||
```
|
||||
"Yes," he said. "That is the way with him."
|
||||
```
|
||||
produces:\
|
||||
|
||||
"Yes," he said. "That is the way with him."
|
||||
|
||||
##### backslash escape
|
||||
|
||||
A backslash directly before a word (with no spaces) will be interpreted
|
||||
as an escape character, causing it to be rendered raw.
|
||||
|
||||
Example:
|
||||
```
|
||||
Here is some *bold* text.
|
||||
Here is some \*not bold\* text.
|
||||
```
|
||||
produces:
|
||||
|
||||
Here is some *bold* text.
|
||||
Here is some \*not bold\* text.
|
||||
|
||||
##### trailing backslash
|
||||
|
||||
A backslash at the end of a line inserts a line break (`<br>`)
|
||||
after that line. This contrasts with the normal udon behavior of
|
||||
converting newlines to spaces.
|
||||
|
||||
Example:
|
||||
```
|
||||
I wonder how long each line
|
||||
will be if I put backslashes\
|
||||
at the ends of the lines.
|
||||
```
|
||||
produces:
|
||||
|
||||
I wonder how long each line
|
||||
will be if I put backslashes\
|
||||
at the ends of the lines.
|
||||
|
||||
##### inline code literal
|
||||
|
||||
Enclosing some text in single backtick characters will cause it to be displayed as code,
inside a `<code>` element with monospace font and a different background color.
|
||||
|
||||
Example:
|
||||
```
|
||||
`*[a 2 b c] -> *[*[a b] *[a c]]` is like lisp's `apply`.
|
||||
```
|
||||
produces:\
|
||||
|
||||
`*[a 2 b c] -> *[*[a b] *[a c]]` is like lisp's `apply`.
|
||||
|
||||
Also, using the `++` prefix before a word will cause the word
|
||||
to be rendered as code, since that's the standard notation
|
||||
for an arm in hoon.
|
||||
|
||||
Example:
|
||||
```
|
||||
The udon parser is part of ++vast.
|
||||
```
|
||||
produces:\
|
||||
|
||||
The udon parser is part of ++vast.
|
||||
|
||||
##### hoon constants
|
||||
|
||||
Hoon has several syntactic forms for literals (numbers, strings, dates, etc.)
|
||||
that can be used in udon as well. They will appear inside a `<code>` element like
|
||||
inline code.
|
||||
|
||||
Example:
|
||||
|
||||
```
|
||||
~2017.8.29 \
|
||||
0xdead.beef \
|
||||
%term
|
||||
```
|
||||
|
||||
produces:\
|
||||
|
||||
~2017.8.29 \
|
||||
0xdead.beef \
|
||||
%term
|
||||
|
||||
##### url
|
||||
|
||||
To insert a hyperlink, put the text content of the link in `[]` brackets
|
||||
followed by the destination URL in `()` parentheses. Note that the text
|
||||
of the displayed link can contain markdown styling.
|
||||
|
||||
Example:
|
||||
|
||||
```
|
||||
A [hoon `core`](https://urbit.org/docs/hoon/concepts/#-core-object)
|
||||
is similar to an object in a traditional programming language.
|
||||
```
|
||||
produces:\
|
||||
|
||||
A [hoon `core`](https://urbit.org/docs/hoon/concepts/#-core-object)
|
||||
is similar to an object in a traditional programming language.
|
@ -1,9 +0,0 @@
|
||||
=- ;pre:"{<[-]>}"
|
||||
:- ;>
|
||||
indented
|
||||
indented
|
||||
|
||||
:- ;= *{~}
|
||||
==
|
||||
;= some *markdown*
|
||||
==
|
Loading…
Reference in New Issue
Block a user