Merge branch 'trunk' into cp/bracket-launch-resources

This commit is contained in:
Chris Penner 2022-12-05 10:07:51 -06:00
commit 5971f4f690
69 changed files with 2201 additions and 591 deletions


@ -1,4 +1,4 @@
Copyright (c) 2013-2021, Unison Computing, public benefit corp and contributors
Copyright (c) 2013-2022, Unison Computing, public benefit corp and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

chez-libs/readme.md Normal file

@ -0,0 +1,49 @@
This directory contains libraries necessary for building and running
unison programs via Chez Scheme. At the moment, they need to be
manually installed in the expected location. The default location is
the `unisonlanguage` directory in the XDG data directory.
Specifically, the `unison` subdirectory should be (recursively) copied
to:

    $XDG_DATA_DIRECTORY/unisonlanguage/scheme-libs

On Unix systems, the XDG directory is ~/.local/share by default, ~
being the home directory. On Windows, it is instead the %APPDATA%
directory. See:

https://hackage.haskell.org/package/directory/docs/System-Directory.html#t:XdgDirectory

for more information.

UCM can also be told to look in another directory by setting the
`SchemeLibs.Static` item in the unison config file. If this path is
denoted by `$CUSTOM`, then the compiler commands will look in:

    $CUSTOM/scheme-libs/

for the `unison/` directory containing the library files.

The compiler commands also expect Chez Scheme to be installed
separately, with `scheme` callable on the user's path. For
information on how to install it, see:

https://github.com/cisco/ChezScheme/blob/main/BUILDING

For more information on Chez Scheme in general, see:

https://cisco.github.io/ChezScheme/csug9.5/csug.html

There are two ways to run code via Chez Scheme. The first is to use
`run.native`, which compiles and immediately runs a given unison
definition. The second is `compile.native`, which produces a Chez
Scheme `.boot` file with a specified name. The boot file can then be
used with the `scheme` binary to run the precompiled program, like so:

    scheme -b ./my-program.boot

It is also possible to install the boot file in a particular
directory, and make a renamed copy of the `scheme` binary, which will
automatically execute the boot file with the corresponding name on
startup. For more information on how to accomplish that, see:

https://cisco.github.io/ChezScheme/csug9.5/use.html#./use:h8
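
For concreteness, given the default XDG data directory described above,
a correct installation on a Unix system places the library files at
paths like the following (illustrative, using libraries added in this
commit):

    ~/.local/share/unisonlanguage/scheme-libs/unison/boot.ss
    ~/.local/share/unisonlanguage/scheme-libs/unison/primops.ss
    ~/.local/share/unisonlanguage/scheme-libs/unison/cont.ss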

chez-libs/unison/boot.ss Normal file

@ -0,0 +1,130 @@
; This library implements various syntactic constructs and functions
; that are used in the compilation of unison (intermediate) source to
; scheme. The intent is to provide for writing scheme definitions that
; more directly match the source, so that the compiler doesn't need to
; emit all the code necessary to fix up the difference itself.
;
; Probably the best example of this is the define-unison macro, which
; looks similar to scheme's define, but the function being defined is
; allowed to be under/over applied similar to a unison function. It
; has an 'arity' at which computation happens, but the function
; automatically handles being applied to fewer or more arguments than
; that arity appropriately.
(library (unison boot)
(export
name
define-unison
func-wrap
handle
request
unison-force)
(import (chezscheme)
(unison cont))
; Helper function. Turns a list of syntax objects into a list-syntax object.
(meta define (list->syntax l) #`(#,@l))
; Concatenates name pieces into a single dash-separated symbol.
(meta define fun-sym
(case-lambda
[(pfx sfx) (string->symbol (string-append pfx "-" sfx))]
[(pfx ifx sfx)
(string->symbol (string-append pfx "-" ifx "-" sfx))]))
; Computes a symbol for automatically generated partial application
; cases, based on number of arguments applied. The partial
; application of `f` is (locally) named `f-partial-N`
(meta define (partial-symbol name m)
(fun-sym (symbol->string name) "partial" (number->string m)))
; As above, but takes a syntactic object representing the arguments
; rather than their count.
(meta define (partial-name name us)
(datum->syntax name (syntax->datum name)))
(define-syntax with-name
(syntax-rules ()
[(with-name name e) (let ([name e]) name)]))
; Builds partial application cases for unison functions. It seems
; most efficient to have a case for each possible under-application.
(meta define (build-partials name formals)
(let rec ([us formals] [acc '()])
(syntax-case us ()
[() (list->syntax (cons #`[() #,name] acc))]
[(a ... z)
(rec #'(a ...)
(cons
#`[(a ... z)
(with-name
#,(partial-name name us)
(lambda r (apply #,name a ... z r)))]
acc))])))
; Given an overall function name, a fast path name, and a list of arguments,
; builds the case-lambda body of a unison function that enables applying to
; arbitrary numbers of arguments.
(meta define (func-cases name fast args)
(syntax-case args ()
[() #`(case-lambda
[() (#,fast)]
[r (apply (#,fast) r)])]
[(a ... z)
#`(case-lambda
#,@(build-partials name #'(a ...))
[(a ... z) (#,fast a ... z)]
[(a ... z . r) (apply (#,fast a ... z) r)])]))
(meta define (func-wrap name args body)
#`(let ([fast-path (lambda (#,@args) #,@body)])
#,(func-cases name #'fast-path args)))
; function definition with slow/fast path. Slow path allows for
; under/overapplication. Fast path is exact application.
;
; The intent is for the scheme compiler to be able to recognize and
; optimize static, fast path calls itself, while still supporting
; unison-like automatic partial application and such.
(define-syntax (define-unison x)
(syntax-case x ()
[(define-unison (name a ...) e ...)
#`(define name
#,(func-wrap #'name #'(a ...) #'(e ...)))]))
; call-by-name bindings
(define-syntax name
(syntax-rules ()
((name ([v (f . args)] ...) body)
(let ([v (lambda r (apply f (append (list . args) r)))]
...)
body))))
; wrapper that more closely matches `handle` constructs
(define-syntax handle
(syntax-rules ()
[(handle [r ...] h e ...)
(prompt p (fluid-let ([r (cons p h)] ...) e ...))]))
; wrapper that more closely matches ability requests
(define-syntax request
(syntax-rules ()
[(request r t . args)
((cdr r) (list (quote r) t . args))]))
(define-record-type
data
(fields type-ref payload))
(define-syntax data-case
(syntax-rules ()
[(data-case scrut c ...)
(record-case (data-payload scrut) c ...)]))
; forces something that is expected to be a thunk, defined with
; e.g. `name` above. In some cases, we might have a normal value,
; so just do nothing in that case.
(define (unison-force x)
(if (procedure? x) (x) x))
)
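
As a point of reference (not part of this commit), here is a hedged sketch
of how a definition created with `define-unison` behaves. The names `add3`
and `konst` are hypothetical examples, and the results follow from the
`func-cases` expansion above:

; hypothetical usage sketch, assuming (unison boot) is imported
(define-unison (add3 x y z) (+ x y z))

(add3 1 2 3)     ; exact arity: takes the fast path => 6
((add3 1 2) 10)  ; under-application yields a partial application => 13

(define-unison (konst x) (lambda (y) x))
(konst 5 7)      ; over-application applies the result to the extra argument => 5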


@ -0,0 +1,35 @@
; This library implements missing bytevector functionality for unison
; builtins. The main missing bits are better support for immutable
; bytevectors. Chez does provide a way to make an immutable
; bytevector, but it copies its input, so implementing things that way
; would make many unnecessary copies. This library instead implements
; functions on immutable bytevectors by directly freezing the newly
; created mutable vector. It also provides the freezing function,
; which is itself a unison builtin.
(library (unison bytevector)
(export
freeze-bv!
ibytevector-drop
ibytevector-take
u8-list->ibytevector)
(import (chezscheme))
(define (freeze-bv! bs)
(($primitive $bytevector-set-immutable!) bs)
bs)
(define (ibytevector-drop n bs)
(let* ([l (bytevector-length bs)]
[k (max 0 (- l n))]
[br (make-bytevector k)])
(bytevector-copy! bs n br 0 k)
(freeze-bv! br)))
(define (ibytevector-take n bs)
(let* ([sz (min n (bytevector-length bs))]
[br (make-bytevector sz)])
(bytevector-copy! bs 0 br 0 sz)
(freeze-bv! br)))
(define (u8-list->ibytevector l) (freeze-bv! (u8-list->bytevector l))))
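
A brief, illustrative sketch of how these combine (not from the commit);
the slices are frozen in place rather than copied a second time:

(define bs (u8-list->ibytevector '(1 2 3 4)))
(ibytevector-take 2 bs)  ; => #vu8(1 2), immutable
(ibytevector-drop 1 bs)  ; => #vu8(2 3 4), immutable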

chez-libs/unison/cont.ss Normal file

@ -0,0 +1,47 @@
; This library is intended to contain the implementation of
; delimited continuations used in the semantics of abilities.
;
; Currently, it is a somewhat naive implementation based on call/cc.
; This has known issues that persist even after a tweak was applied
; that should fix certain space leaks. So, in
; the future it will likely need to be rewritten using some
; implementation specific machinery (possibly making use of
; continuation attachments as in the Racket implementation).
;
; Also, although the API includes prompts, they are currently ignored
; in `control` and `prompt` always uses the same prompt (0). This
; means that nesting handlers will not work in general, since requests
; will not be able to jump over an inner handler of unrelated
; abilities. It should be sufficient for testing simple examples in
; the meantime, though.
(library (unison cont)
(export prompt control)
(import (chezscheme))
(define mk (lambda (x) (raise "fell off end")))
(define (prompt-impl h)
((call/cc
(lambda (k)
(let ([ok mk])
(set! mk (lambda (x) (set! mk ok) (k x)))
; (h 0) <-- prompt = 0
(mk (let ([v (h 0)]) (lambda () v))))))))
(define-syntax prompt
(syntax-rules ()
[(prompt p e ...)
(prompt-impl (lambda (p) e ...))]))
(define (control-impl h)
(call/cc
(lambda (k)
(let* ([g (lambda () (prompt p (h k)))])
(mk g)))))
(define-syntax control
(syntax-rules ()
[(control p k e ...)
(control-impl (lambda (k) e ...))])))
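
For intuition, two illustrative uses (not from the commit); since prompts
are currently ignored, only a single enclosing `prompt` is involved:

; k is (roughly) the delimited context (+ 1 _)
(prompt p (+ 1 (control p k (k 10))))  ; => 11, k is resumed with 10
(prompt p (+ 1 (control p k 10)))      ; => 10, the captured context is discarded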

chez-libs/unison/core.ss Normal file

@ -0,0 +1,43 @@
(library (unison core)
(export
identity
describe-value
decode-value)
(import (chezscheme))
(define (identity x) x)
; Recovers the original function name from the partial
; application name.
(define (extract-name i)
(string->symbol ((i 'code) 'name)))
(define (describe-value x)
(let explain ([i (inspect/object x)])
(case (i 'type)
[(simple) (i 'value)]
[(variable) (explain (i 'ref))]
[(procedure)
(let explain-args ([j (- (i 'length) 1)] [args '()])
(if (< j 0)
(cons (extract-name i) args)
(explain-args
(- j 1)
(cons (explain (i 'ref j)) args))))])))
; partial, data, cont, lit
(define (decode-value x)
(let reify ([i (inspect/object x)])
(case (i 'type)
[(simple) (list 3 (i 'value))]
[(variable) (reify (i 'ref))]
[(procedure)
(let reify-args ([j (- (i 'length) 1)] [args '()])
(if (< j 0)
(cons* 0 (extract-name i) args)
(reify-args
(- j 1)
(cons (reify (i 'ref j)) args))))])))
)

chez-libs/unison/primops.ss Normal file

@ -0,0 +1,205 @@
; This library implements primitive operations that are used in
; builtins. There are two different sorts of primitive operations, but
; the difference is essentially irrelevant except for naming schemes.
;
; POps are part of a large enumeration of 'instructions' directly
; implemented in the Haskell runtime. These are referred to using the
; naming scheme `unison-POp-INST` where `INST` is the name of the
; instruction, which is (at the time of this writing) 4 letters.
;
; FOps are 'foreign' functions, which are allowed to be declared more
; flexibly in the Haskell runtime. Each such declaration associates a
; builtin to a Haskell function. For these, the naming scheme is
; `unison-FOp-NAME` where `NAME` is the name of the unison builtin
; associated to the declaration.
;
; Both POps and FOps are always called with exactly the right number
; of arguments, so they may be implemented as ordinary scheme
; definitions with a fixed number of arguments. By implementing the
; POp/FOp, you are expecting the associated unison function(s) to be
; implemented by code generation from the wrappers in
; Unison.Runtime.Builtin, so the POp/FOp implementation must
; take/return arguments that match what is expected in those wrappers.
(library (unison primops)
(export
; unison-FOp-Bytes.decodeNat16be
; unison-FOp-Bytes.decodeNat32be
; unison-FOp-Bytes.decodeNat64be
unison-FOp-Char.toText
; unison-FOp-Code.dependencies
; unison-FOp-Code.serialize
unison-FOp-IO.closeFile.impl.v3
unison-FOp-IO.openFile.impl.v3
unison-FOp-IO.putBytes.impl.v3
; unison-FOp-Text.fromUtf8.impl.v3
unison-FOp-Text.repeat
unison-FOp-Text.toUtf8
; unison-FOp-Value.serialize
unison-FOp-IO.stdHandle
unison-FOp-ImmutableByteArray.copyTo!
unison-FOp-ImmutableByteArray.read8
unison-FOp-MutableByteArray.freeze!
unison-FOp-MutableByteArray.write8
unison-FOp-Scope.bytearray
unison-POp-ADDN
unison-POp-ANDN
unison-POp-BLDS
unison-POp-CATS
unison-POp-CATT
unison-POp-CMPU
unison-POp-COMN
unison-POp-CONS
unison-POp-DECI
unison-POp-DIVN
unison-POp-DRPB
unison-POp-DRPS
unison-POp-EQLN
unison-POp-EQLT
unison-POp-EQLU
unison-POp-EROR
unison-POp-FTOT
unison-POp-IDXB
unison-POp-IDXS
unison-POp-IORN
unison-POp-ITOT
unison-POp-LEQN
; unison-POp-LKUP
unison-POp-MULN
unison-POp-NTOT
unison-POp-PAKT
unison-POp-SHLI
unison-POp-SHLN
unison-POp-SHRI
unison-POp-SHRN
unison-POp-SIZS
unison-POp-SIZT
unison-POp-SNOC
unison-POp-SUBN
unison-POp-TAKS
unison-POp-TAKT
unison-POp-TRCE
unison-POp-UPKT
unison-POp-VALU
unison-POp-VWLS
)
(import (chezscheme)
(unison core)
(unison string)
(unison bytevector))
; Core implemented primops, upon which primops-in-unison can be built.
(define (unison-POp-ADDN m n) (fx+ m n))
(define (unison-POp-ANDN m n) (fxlogand m n))
(define unison-POp-BLDS list)
(define (unison-POp-CATS l r) (append l r))
(define (unison-POp-CATT l r) (istring-append l r))
(define (unison-POp-CMPU l r) (equal? l r))
(define (unison-POp-COMN n) (fxlognot n))
(define (unison-POp-CONS x xs) (cons x xs))
(define (unison-POp-DECI n) (+ n 1))
(define (unison-POp-DIVN m n) (fxdiv m n))
(define (unison-POp-DRPB n bs) (ibytevector-drop n bs))
(define (unison-POp-DRPS n l)
(let ([m (max 0 (min n (length l)))]) (list-tail l m)))
(define (unison-POp-DRPT n t) (istring-drop n t))
(define (unison-POp-EQLN m n) (if (fx= m n) 1 0))
(define (unison-POp-EQLT s t) (if (string=? s t) 1 0))
(define (unison-POp-EQLU x y) (equal? x y))
(define (unison-POp-EROR fnm x) (raise fnm))
(define (unison-POp-FTOT f) (number->istring f))
(define (unison-POp-IDXB n bs) (bytevector-u8-ref bs n))
(define (unison-POp-IDXS n l) (list-ref l n))
(define (unison-POp-IORN m n) (fxlogior m n))
(define (unison-POp-ITOT i) (signed-number->istring i))
(define (unison-POp-LEQN m n) (if (fx< m n) 1 0))
(define (unison-POp-MULN m n) (* m n))
(define (unison-POp-NTOT m) (number->istring m))
(define (unison-POp-PAKB l) (u8-list->ibytevector l))
(define (unison-POp-PAKT l) (list->istring l))
(define (unison-POp-SHLI i k) (fxarithmetic-shift-left i k))
(define (unison-POp-SHLN n k) (fxarithmetic-shift-left n k))
(define (unison-POp-SHRI i k) (fxarithmetic-shift-right i k))
(define (unison-POp-SHRN n k) (fxarithmetic-shift-right n k))
(define (unison-POp-SIZS l) (length l))
(define (unison-POp-SIZT t) (string-length t))
(define (unison-POp-SNOC xs x) (append xs (list x)))
(define (unison-POp-SUBN m n) (fx- m n))
(define (unison-POp-TAKS n s) (list-head s n))
(define (unison-POp-TAKT n t) (istring-take n t))
(define (unison-POp-TRCE x) (display (describe-value x)))
(define (unison-POp-UPKT t) (string->list t))
(define (unison-POp-VWLS l)
(if (null? l)
(list 0)
(list 1 (car l) (cdr l))))
(define (unison-POp-VALU c) (decode-value c))
(define (unison-FOp-IO.putBytes.impl.v3 p bs)
(begin
(put-bytevector p bs)
(flush-output-port p)
(list 1 #f)))
(define (unison-FOp-Char.toText c) (istring c))
(define stdin (standard-input-port))
(define stdout (standard-output-port))
(define stderr (standard-error-port))
(define (unison-FOp-IO.stdHandle n)
(case n
[(0) stdin]
[(1) stdout]
[(2) stderr]))
(define (unison-FOp-Text.toUtf8 s)
(string->bytevector s utf-8-transcoder))
(define (unison-FOp-IO.closeFile.impl.v3 h)
(close-input-port h))
(define (unison-FOp-IO.openFile.impl.v3 fn mode)
(case mode
[(0) (open-file-input-port fn)]
[(1) (open-file-output-port fn)]
[(2) (open-file-output-port fn 'no-truncate)]
[else (open-file-input/output-port fn)]))
(define (unison-FOp-Text.repeat n t) (istring-repeat n t))
(define (catch-array thunk)
(with-exception-handler
(lambda (e) (list 0 '() "array index out of bounds" e))
thunk))
(define (unison-FOp-ImmutableByteArray.copyTo! dst doff src soff n)
(catch-array
(lambda ()
(bytevector-copy! src soff dst doff n)
(list 1))))
(define (unison-FOp-ImmutableByteArray.read8 arr i)
(catch-array
(lambda ()
(list 1 (bytevector-u8-ref arr i)))))
(define (unison-FOp-MutableByteArray.freeze! arr)
(freeze-bv! arr))
(define (unison-FOp-MutableByteArray.write8 arr i b)
(catch-array
(lambda ()
(bytevector-u8-set! arr i b)
(list 1))))
(define (unison-FOp-Scope.bytearray n) (make-bytevector n))
)
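
Because POps and FOps are always called fully applied, they can be
exercised directly from a Chez REPL once the libraries are installed.
A few illustrative calls (not part of the commit):

(unison-POp-EQLN 3 3)       ; => 1, numeric comparisons return 1/0
(unison-POp-VWLS '(1 2 3))  ; => (1 1 (2 3)), the non-empty "view" of a list
(unison-POp-VWLS '())       ; => (0), the empty view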


@ -0,0 +1,60 @@
; This library wraps some chez functionality to provide immutable
; strings. There is a wrapper for making immutable strings in chez,
; but it copies its input, so relying on that would involve every
; string operation building a fresh mutable string, then making an
; immutable copy. This library instead directly freezes the newly
; created mutable strings.
(library (unison string)
(export
istring
istring-append
istring-drop
istring-take
istring-repeat
list->istring
make-istring
number->istring
signed-number->istring
utf8-bytevector->istring
utf-8-transcoder)
(import (chezscheme))
(define (freeze-s! s)
(($primitive $string-set-immutable!) s)
s)
(define istring (lambda l (freeze-s! (apply string l))))
(define (make-istring n c) (freeze-s! (make-string n c)))
(define (istring-repeat n s)
(let* ([k (string-length s)]
[t (make-string (* k n))])
(let loop ([i 0])
(if (< i n)
(begin
(string-copy! s 0 t (* i k) k)
(loop (+ i 1)))
(freeze-s! t)))))
(define istring-append (lambda l (freeze-s! (apply string-append l))))
(define (istring-drop n s) (freeze-s! (substring s n (- (string-length s) n))))
(define (number->istring n) (freeze-s! (number->string n)))
(define (signed-number->istring n)
(freeze-s!
(if (>= n 0)
(string-append "+" (number->string n))
(number->string n))))
(define (list->istring l) (freeze-s! (list->string l)))
(define (istring-take n s) (freeze-s! (substring s 0 n)))
(define utf-8-transcoder (make-transcoder (utf-8-codec)))
(define (utf8-bytevector->istring bs)
(freeze-s! (bytevector->string bs utf-8-transcoder))))
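
A couple of illustrative calls (not from the commit); each result is
already frozen:

(istring-repeat 3 "ab")     ; => "ababab"
(signed-number->istring 5)  ; => "+5"
(list->istring '(#\h #\i))  ; => "hi"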


@ -41,6 +41,7 @@ dependencies:
- errors
- exceptions
- extra
- filelock
- filepath
- fingertree
- fsnotify
@ -73,7 +74,6 @@ dependencies:
- openapi3
- optparse-applicative >= 0.16.1.0
- pem
- prelude-extras
- pretty-simple
- primitive
- process
@ -91,7 +91,6 @@ dependencies:
- servant-server
- shellmet
- stm
- strings
- tagged
- temporary
- terminal-size


@ -42,7 +42,7 @@ module Unison.Codebase
SqliteCodebase.Operations.before,
getShallowBranchAtPath,
getShallowCausalAtPath,
getShallowCausalForHash,
Operations.expectCausalBranchByCausalHash,
getShallowCausalFromRoot,
getShallowRootBranch,
getShallowRootCausal,
@ -174,49 +174,53 @@ runTransaction Codebase {withConnection} action =
withConnection \conn -> Sqlite.runTransaction conn action
getShallowCausalFromRoot ::
MonadIO m =>
Codebase m v a ->
-- Optional root branch, if Nothing use the codebase's root branch.
Maybe V2.CausalHash ->
Path.Path ->
m (V2Branch.CausalBranch m)
getShallowCausalFromRoot codebase mayRootHash p = do
Sqlite.Transaction (V2Branch.CausalBranch Sqlite.Transaction)
getShallowCausalFromRoot mayRootHash p = do
rootCausal <- case mayRootHash of
Nothing -> getShallowRootCausal codebase
Just ch -> getShallowCausalForHash codebase ch
getShallowCausalAtPath codebase p (Just rootCausal)
Nothing -> getShallowRootCausal
Just ch -> Operations.expectCausalBranchByCausalHash ch
getShallowCausalAtPath p (Just rootCausal)
-- | Get the shallow representation of the root branches without loading the children or
-- history.
getShallowRootBranch :: MonadIO m => Codebase m v a -> m (V2.Branch m)
getShallowRootBranch codebase = do
getShallowRootCausal codebase >>= V2Causal.value
getShallowRootBranch :: Sqlite.Transaction (V2.Branch Sqlite.Transaction)
getShallowRootBranch = do
getShallowRootCausal >>= V2Causal.value
-- | Get the shallow representation of the root branches without loading the children or
-- history.
getShallowRootCausal :: MonadIO m => Codebase m v a -> m (V2.CausalBranch m)
getShallowRootCausal codebase = do
hash <- runTransaction codebase Operations.expectRootCausalHash
getShallowCausalForHash codebase hash
getShallowRootCausal :: Sqlite.Transaction (V2.CausalBranch Sqlite.Transaction)
getShallowRootCausal = do
hash <- Operations.expectRootCausalHash
Operations.expectCausalBranchByCausalHash hash
-- | Recursively descend into causals following the given path,
-- Use the root causal if none is provided.
getShallowCausalAtPath :: MonadIO m => Codebase m v a -> Path -> Maybe (V2Branch.CausalBranch m) -> m (V2Branch.CausalBranch m)
getShallowCausalAtPath codebase path mayCausal = do
causal <- whenNothing mayCausal (getShallowRootCausal codebase)
getShallowCausalAtPath ::
Path ->
Maybe (V2Branch.CausalBranch Sqlite.Transaction) ->
Sqlite.Transaction (V2Branch.CausalBranch Sqlite.Transaction)
getShallowCausalAtPath path mayCausal = do
causal <- whenNothing mayCausal getShallowRootCausal
case path of
Path.Empty -> pure causal
(ns Path.:< p) -> do
b <- V2Causal.value causal
case (V2Branch.childAt (Cv.namesegment1to2 ns) b) of
Nothing -> pure (Cv.causalbranch1to2 Branch.empty)
Just childCausal -> getShallowCausalAtPath codebase p (Just childCausal)
Just childCausal -> getShallowCausalAtPath p (Just childCausal)
-- | Recursively descend into causals following the given path,
-- Use the root causal if none is provided.
getShallowBranchAtPath :: MonadIO m => Codebase m v a -> Path -> Maybe (V2Branch.Branch m) -> m (V2Branch.Branch m)
getShallowBranchAtPath codebase path mayBranch = do
branch <- whenNothing mayBranch (getShallowRootCausal codebase >>= V2Causal.value)
getShallowBranchAtPath ::
Path ->
Maybe (V2Branch.Branch Sqlite.Transaction) ->
Sqlite.Transaction (V2Branch.Branch Sqlite.Transaction)
getShallowBranchAtPath path mayBranch = do
branch <- whenNothing mayBranch (getShallowRootCausal >>= V2Causal.value)
case path of
Path.Empty -> pure branch
(ns Path.:< p) -> do
@ -224,7 +228,7 @@ getShallowBranchAtPath codebase path mayBranch = do
Nothing -> pure V2Branch.empty
Just childCausal -> do
childBranch <- V2Causal.value childCausal
getShallowBranchAtPath codebase p (Just childBranch)
getShallowBranchAtPath p (Just childBranch)
-- | Get a branch from the codebase.
getBranchForHash :: Monad m => Codebase m v a -> Branch.CausalHash -> m (Maybe (Branch m))
@ -246,17 +250,15 @@ getBranchForHash codebase h =
-- | Get the metadata attached to the term at a given path and name relative to the given branch.
termMetadata ::
MonadIO m =>
Codebase m v a ->
-- | The branch to search inside. Use the current root if 'Nothing'.
Maybe (V2Branch.Branch m) ->
Maybe (V2Branch.Branch Sqlite.Transaction) ->
Split ->
-- | There may be multiple terms at the given name. You can specify a Referent to
-- disambiguate if desired.
Maybe V2.Referent ->
m [Map V2Branch.MetadataValue V2Branch.MetadataType]
termMetadata codebase mayBranch (path, nameSeg) ref = do
b <- getShallowBranchAtPath codebase path mayBranch
Sqlite.Transaction [Map V2Branch.MetadataValue V2Branch.MetadataType]
termMetadata mayBranch (path, nameSeg) ref = do
b <- getShallowBranchAtPath path mayBranch
V2Branch.termMetadata b (coerce @NameSegment.NameSegment nameSeg) ref
-- | Get the lowest common ancestor of two branches, i.e. the most recent branch that is an ancestor of both branches.
@ -266,7 +268,7 @@ lca code b1@(Branch.headHash -> h1) b2@(Branch.headHash -> h2) = do
runTransaction code do
eb1 <- SqliteCodebase.Operations.branchExists h1
eb2 <- SqliteCodebase.Operations.branchExists h2
if eb1 && eb2
if eb1 && eb2
then do
SqliteCodebase.Operations.sqlLca h1 h2 >>= \case
Just h -> pure (getBranchForHash code h)
@ -329,7 +331,7 @@ getTypeOfConstructor codebase (ConstructorReference r0 cid) =
-- MaybeT (getWatch codebase RegularWatch ref)
-- <|> MaybeT (getWatch codebase TestWatch ref))
-- @
lookupWatchCache :: (Monad m) => Codebase m v a -> Reference.Id -> m (Maybe (Term v a))
lookupWatchCache :: Codebase m v a -> Reference.Id -> Sqlite.Transaction (Maybe (Term v a))
lookupWatchCache codebase h = do
m1 <- getWatch codebase WK.RegularWatch h
maybe (getWatch codebase WK.TestWatch h) (pure . Just) m1


@ -6,6 +6,7 @@ module Unison.Codebase.Init
DebugName,
InitError (..),
CodebaseInitOptions (..),
CodebaseLockOption (..),
InitResult (..),
SpecifiedCodebase (..),
MigrationStrategy (..),
@ -42,6 +43,10 @@ data SpecifiedCodebase
= CreateWhenMissing CodebasePath
| DontCreateWhenMissing CodebasePath
data CodebaseLockOption
= DoLock
| DontLock
data MigrationStrategy
= -- | Perform a migration immediately if one is required.
MigrateAutomatically
@ -60,9 +65,9 @@ type DebugName = String
data Init m v a = Init
{ -- | open an existing codebase
withOpenCodebase :: forall r. DebugName -> CodebasePath -> MigrationStrategy -> (Codebase m v a -> m r) -> m (Either OpenCodebaseError r),
withOpenCodebase :: forall r. DebugName -> CodebasePath -> CodebaseLockOption -> MigrationStrategy -> (Codebase m v a -> m r) -> m (Either OpenCodebaseError r),
-- | create a new codebase
withCreatedCodebase :: forall r. DebugName -> CodebasePath -> (Codebase m v a -> m r) -> m (Either CreateCodebaseError r),
withCreatedCodebase :: forall r. DebugName -> CodebasePath -> CodebaseLockOption -> (Codebase m v a -> m r) -> m (Either CreateCodebaseError r),
-- | given a codebase root, and given that the codebase root may have other junk in it,
-- give the path to the "actual" files; e.g. what a forked transcript should clone.
codebasePath :: CodebasePath -> CodebasePath
@ -85,10 +90,11 @@ createCodebaseWithResult ::
Init m v a ->
DebugName ->
CodebasePath ->
CodebaseLockOption ->
(Codebase m v a -> m r) ->
m (Either (CodebasePath, InitError) r)
createCodebaseWithResult cbInit debugName dir action =
createCodebase cbInit debugName dir action <&> mapLeft \case
createCodebaseWithResult cbInit debugName dir lockOption action =
createCodebase cbInit debugName dir lockOption action <&> mapLeft \case
errorMessage -> (dir, (CouldntCreateCodebase errorMessage))
withOpenOrCreateCodebase ::
@ -96,12 +102,13 @@ withOpenOrCreateCodebase ::
Init m v a ->
DebugName ->
CodebaseInitOptions ->
CodebaseLockOption ->
MigrationStrategy ->
((InitResult, CodebasePath, Codebase m v a) -> m r) ->
m (Either (CodebasePath, InitError) r)
withOpenOrCreateCodebase cbInit debugName initOptions migrationStrategy action = do
withOpenOrCreateCodebase cbInit debugName initOptions lockOption migrationStrategy action = do
let resolvedPath = initOptionsToDir initOptions
result <- withOpenCodebase cbInit debugName resolvedPath migrationStrategy \codebase -> do
result <- withOpenCodebase cbInit debugName resolvedPath lockOption migrationStrategy \codebase -> do
action (OpenedCodebase, resolvedPath, codebase)
case result of
Right r -> pure $ Right r
@ -114,7 +121,7 @@ withOpenOrCreateCodebase cbInit debugName initOptions migrationStrategy action =
(do pure (Left (homeDir, FoundV1Codebase)))
( do
-- Create V2 codebase if neither a V1 or V2 exists
createCodebaseWithResult cbInit debugName homeDir (\codebase -> action (CreatedCodebase, homeDir, codebase))
createCodebaseWithResult cbInit debugName homeDir lockOption (\codebase -> action (CreatedCodebase, homeDir, codebase))
)
Specified specified ->
ifM
@ -124,14 +131,15 @@ withOpenOrCreateCodebase cbInit debugName initOptions migrationStrategy action =
DontCreateWhenMissing dir ->
pure (Left (dir, (InitErrorOpen OpenCodebaseDoesntExist)))
CreateWhenMissing dir ->
createCodebaseWithResult cbInit debugName dir (\codebase -> action (CreatedCodebase, dir, codebase))
createCodebaseWithResult cbInit debugName dir lockOption (\codebase -> action (CreatedCodebase, dir, codebase))
OpenCodebaseUnknownSchemaVersion {} -> pure (Left (resolvedPath, InitErrorOpen err))
OpenCodebaseRequiresMigration {} -> pure (Left (resolvedPath, InitErrorOpen err))
OpenCodebaseFileLockFailed {} -> pure (Left (resolvedPath, InitErrorOpen err))
createCodebase :: MonadIO m => Init m v a -> DebugName -> CodebasePath -> (Codebase m v a -> m r) -> m (Either Pretty r)
createCodebase cbInit debugName path action = do
createCodebase :: MonadIO m => Init m v a -> DebugName -> CodebasePath -> CodebaseLockOption -> (Codebase m v a -> m r) -> m (Either Pretty r)
createCodebase cbInit debugName path lockOption action = do
prettyDir <- P.string <$> canonicalizePath path
withCreatedCodebase cbInit debugName path action <&> mapLeft \case
withCreatedCodebase cbInit debugName path lockOption action <&> mapLeft \case
CreateCodebaseAlreadyExists ->
P.wrap $
"It looks like there's already a codebase in: "
@ -141,30 +149,31 @@ createCodebase cbInit debugName path action = do
-- previously: initCodebaseOrExit :: CodebasePath -> m (m (), Codebase m v a)
-- previously: FileCodebase.initCodebase :: CodebasePath -> m (m (), Codebase m v a)
withNewUcmCodebaseOrExit :: MonadIO m => Init m Symbol Ann -> DebugName -> CodebasePath -> (Codebase m Symbol Ann -> m r) -> m r
withNewUcmCodebaseOrExit cbInit debugName path action = do
withNewUcmCodebaseOrExit :: MonadIO m => Init m Symbol Ann -> DebugName -> CodebasePath -> CodebaseLockOption -> (Codebase m Symbol Ann -> m r) -> m r
withNewUcmCodebaseOrExit cbInit debugName path lockOption action = do
prettyDir <- P.string <$> canonicalizePath path
let codebaseSetup codebase = do
liftIO $ PT.putPrettyLn' . P.wrap $ "Initializing a new codebase in: " <> prettyDir
Codebase.runTransaction codebase (Codebase.installUcmDependencies codebase)
createCodebase cbInit debugName path (\cb -> codebaseSetup cb *> action cb)
createCodebase cbInit debugName path lockOption (\cb -> codebaseSetup cb *> action cb)
>>= \case
Left error -> liftIO $ PT.putPrettyLn' error >> exitFailure
Right result -> pure result
-- | try to init a codebase where none exists and then exit regardless (i.e. `ucm --codebase dir init`)
initCodebaseAndExit :: MonadIO m => Init m Symbol Ann -> DebugName -> Maybe CodebasePath -> m ()
initCodebaseAndExit i debugName mdir = do
initCodebaseAndExit :: MonadIO m => Init m Symbol Ann -> DebugName -> Maybe CodebasePath -> CodebaseLockOption -> m ()
initCodebaseAndExit i debugName mdir lockOption = do
codebaseDir <- Codebase.getCodebaseDir mdir
withNewUcmCodebaseOrExit i debugName codebaseDir (const $ pure ())
withNewUcmCodebaseOrExit i debugName codebaseDir lockOption (const $ pure ())
withTemporaryUcmCodebase ::
MonadUnliftIO m =>
Init m Symbol Ann ->
DebugName ->
CodebaseLockOption ->
((CodebasePath, Codebase m Symbol Ann) -> m r) ->
m r
withTemporaryUcmCodebase cbInit debugName action = do
withTemporaryUcmCodebase cbInit debugName lockOption action = do
UnliftIO.withSystemTempDirectory debugName $ \tempDir -> do
withNewUcmCodebaseOrExit cbInit debugName tempDir $ \codebase -> do
withNewUcmCodebaseOrExit cbInit debugName tempDir lockOption $ \codebase -> do
action (tempDir, codebase)


@ -15,6 +15,7 @@ data OpenCodebaseError
OpenCodebaseDoesntExist
| -- | The codebase exists, but its schema version is unknown to this application.
OpenCodebaseUnknownSchemaVersion SchemaVersion
| OpenCodebaseFileLockFailed
| -- | The codebase exists, but requires a migration before it can be used.
OpenCodebaseRequiresMigration
-- current version


@ -0,0 +1,110 @@
module Unison.Codebase.RootBranchCache
( RootBranchCache,
newEmptyRootBranchCache,
newEmptyRootBranchCacheIO,
fetchRootBranch,
withLock,
)
where
import Control.Concurrent.STM (newTVarIO)
import Control.Monad (join)
import Control.Monad.IO.Class
import Data.Coerce (coerce)
import Unison.Codebase.Branch.Type (Branch)
import qualified Unison.Sqlite as Sqlite
import UnliftIO (MonadUnliftIO, mask, onException)
import UnliftIO.STM
( STM,
TVar,
atomically,
newTVar,
readTVar,
retrySTM,
writeTVar,
)
data RootBranchCacheVal
= Empty
| -- | Another thread is updating the cache. If this value is observed
-- then the reader should wait until the value is Empty or Full. The
-- api exposed from this module guarantees that a thread cannot exit
-- and leave the cache in this state.
ConcurrentModification
| Full (Branch Sqlite.Transaction)
-- This is isomorphic to @TMVar (Maybe (Branch Sqlite.Transaction))@
newtype RootBranchCache = RootBranchCache (TVar RootBranchCacheVal)
newEmptyRootBranchCacheIO :: MonadIO m => m RootBranchCache
newEmptyRootBranchCacheIO = liftIO (coerce $ newTVarIO Empty)
newEmptyRootBranchCache :: STM RootBranchCache
newEmptyRootBranchCache = coerce (newTVar Empty)
readRbc :: RootBranchCache -> STM RootBranchCacheVal
readRbc (RootBranchCache v) = readTVar v
writeRbc :: RootBranchCache -> RootBranchCacheVal -> STM ()
writeRbc (RootBranchCache v) x = writeTVar v x
-- | Read the root branch cache, wait if the cache is currently being
-- updated
readRootBranchCache :: RootBranchCache -> STM (Maybe (Branch Sqlite.Transaction))
readRootBranchCache v =
readRbc v >>= \case
Empty -> pure Nothing
ConcurrentModification -> retrySTM
Full x -> pure (Just x)
fetchRootBranch :: forall m. MonadUnliftIO m => RootBranchCache -> m (Branch Sqlite.Transaction) -> m (Branch Sqlite.Transaction)
fetchRootBranch rbc getFromDb = mask \restore -> do
join (atomically (fetch restore))
where
fetch :: (forall x. m x -> m x) -> STM (m (Branch Sqlite.Transaction))
fetch restore = do
readRbc rbc >>= \case
Empty -> do
writeRbc rbc ConcurrentModification
pure do
rootBranch <- restore getFromDb `onException` atomically (writeRbc rbc Empty)
atomically (writeRbc rbc (Full rootBranch))
pure rootBranch
ConcurrentModification -> retrySTM
Full x -> pure (pure x)
-- | Take a cache lock so that no other thread can read or write to
-- the cache, perform an action with the cached value, then restore
-- the cache to Empty or Full
withLock ::
forall m r.
MonadUnliftIO m =>
RootBranchCache ->
-- | Perform an action with the cached value
( -- restore masking state
(forall x. m x -> m x) ->
-- value retrieved from cache
Maybe (Branch Sqlite.Transaction) ->
m r
) ->
-- | compute value to restore to the cache
(r -> Maybe (Branch Sqlite.Transaction)) ->
m r
withLock v f g = mask \restore -> do
mbranch <- atomically (takeLock v)
r <- f restore mbranch `onException` releaseLock mbranch
releaseLock (g r)
pure r
where
releaseLock :: Maybe (Branch Sqlite.Transaction) -> m ()
releaseLock mbranch =
let !val = case mbranch of
Nothing -> Empty
Just x -> Full x
in atomically (writeRbc v val)
takeLock :: RootBranchCache -> STM (Maybe (Branch Sqlite.Transaction))
takeLock v = do
res <- readRootBranchCache v
writeRbc v ConcurrentModification
pure res


@ -9,6 +9,7 @@
module Unison.Codebase.SqliteCodebase
( Unison.Codebase.SqliteCodebase.init,
MigrationStrategy (..),
CodebaseLockOption (..),
)
where
@ -23,9 +24,9 @@ import qualified Data.Set as Set
import qualified Data.Text as Text
import Data.Time (getCurrentTime)
import qualified System.Console.ANSI as ANSI
import System.FileLock (SharedExclusive (Exclusive), withTryFileLock)
import qualified System.FilePath as FilePath
import qualified System.FilePath.Posix as FilePath.Posix
import qualified U.Codebase.Branch as V2Branch
import U.Codebase.HashTags (BranchHash, CausalHash (CausalHash))
import qualified U.Codebase.Reflog as Reflog
import qualified U.Codebase.Sqlite.Operations as Ops
@ -48,12 +49,13 @@ import Unison.Codebase.Editor.RemoteRepo
writeToReadGit,
)
import qualified Unison.Codebase.GitError as GitError
import Unison.Codebase.Init (MigrationStrategy (..))
import Unison.Codebase.Init (CodebaseLockOption (..), MigrationStrategy (..))
import qualified Unison.Codebase.Init as Codebase
import qualified Unison.Codebase.Init.CreateCodebaseError as Codebase1
import Unison.Codebase.Init.OpenCodebaseError (OpenCodebaseError (..))
import qualified Unison.Codebase.Init.OpenCodebaseError as Codebase1
import Unison.Codebase.Path (Path)
import Unison.Codebase.RootBranchCache
import Unison.Codebase.SqliteCodebase.Branch.Cache (newBranchCache)
import qualified Unison.Codebase.SqliteCodebase.Branch.Dependencies as BD
import qualified Unison.Codebase.SqliteCodebase.Conversions as Cv
@ -106,13 +108,14 @@ withOpenOrCreateCodebase ::
Codebase.DebugName ->
CodebasePath ->
LocalOrRemote ->
CodebaseLockOption ->
MigrationStrategy ->
((CodebaseStatus, Codebase m Symbol Ann) -> m r) ->
m (Either Codebase1.OpenCodebaseError r)
withOpenOrCreateCodebase debugName codebasePath localOrRemote migrationStrategy action = do
createCodebaseOrError debugName codebasePath (action' CreatedCodebase) >>= \case
withOpenOrCreateCodebase debugName codebasePath localOrRemote lockOption migrationStrategy action = do
createCodebaseOrError debugName codebasePath lockOption (action' CreatedCodebase) >>= \case
Left (Codebase1.CreateCodebaseAlreadyExists) -> do
sqliteCodebase debugName codebasePath localOrRemote migrationStrategy (action' ExistingCodebase)
sqliteCodebase debugName codebasePath localOrRemote lockOption migrationStrategy (action' ExistingCodebase)
Right r -> pure (Right r)
where
action' openOrCreate codebase = action (openOrCreate, codebase)
@ -122,9 +125,10 @@ createCodebaseOrError ::
(MonadUnliftIO m) =>
Codebase.DebugName ->
CodebasePath ->
CodebaseLockOption ->
(Codebase m Symbol Ann -> m r) ->
m (Either Codebase1.CreateCodebaseError r)
createCodebaseOrError debugName path action = do
createCodebaseOrError debugName path lockOption action = do
ifM
(doesFileExist $ makeCodebasePath path)
(pure $ Left Codebase1.CreateCodebaseAlreadyExists)
@ -136,7 +140,7 @@ createCodebaseOrError debugName path action = do
Q.createSchema
void . Ops.saveRootBranch v2HashHandle $ Cv.causalbranch1to2 Branch.empty
sqliteCodebase debugName path Local DontMigrate action >>= \case
sqliteCodebase debugName path Local lockOption DontMigrate action >>= \case
Left schemaVersion -> error ("Failed to open codebase with schema version: " ++ show schemaVersion ++ ", which is unexpected because I just created this codebase.")
Right result -> pure (Right result)
@ -147,13 +151,14 @@ withCodebaseOrError ::
(MonadUnliftIO m) =>
Codebase.DebugName ->
CodebasePath ->
CodebaseLockOption ->
MigrationStrategy ->
(Codebase m Symbol Ann -> m r) ->
m (Either Codebase1.OpenCodebaseError r)
withCodebaseOrError debugName dir migrationStrategy action = do
withCodebaseOrError debugName dir lockOption migrationStrategy action = do
doesFileExist (makeCodebasePath dir) >>= \case
False -> pure (Left Codebase1.OpenCodebaseDoesntExist)
True -> sqliteCodebase debugName dir Local migrationStrategy action
True -> sqliteCodebase debugName dir Local lockOption migrationStrategy action
initSchemaIfNotExist :: MonadIO m => FilePath -> m ()
initSchemaIfNotExist path = liftIO do
@ -187,11 +192,12 @@ sqliteCodebase ::
CodebasePath ->
-- | When local, back up the existing codebase before migrating, in case there's a catastrophic bug in the migration.
LocalOrRemote ->
CodebaseLockOption ->
MigrationStrategy ->
(Codebase m Symbol Ann -> m r) ->
m (Either Codebase1.OpenCodebaseError r)
sqliteCodebase debugName root localOrRemote migrationStrategy action = do
rootBranchCache <- newTVarIO Nothing
sqliteCodebase debugName root localOrRemote lockOption migrationStrategy action = handleLockOption do
rootBranchCache <- newEmptyRootBranchCacheIO
branchCache <- newBranchCache
getDeclType <- CodebaseOps.makeCachedTransaction 2048 CodebaseOps.getDeclType
-- The v1 codebase interface has operations to read and write individual definitions
@ -264,25 +270,36 @@ sqliteCodebase debugName root localOrRemote migrationStrategy action = do
putTypeDeclarationComponent =
CodebaseOps.putTypeDeclarationComponent termBuffer declBuffer
getShallowCausalForHash :: MonadIO m => V2Branch.CausalHash -> m (V2Branch.CausalBranch m)
getShallowCausalForHash bh =
V2Branch.hoistCausalBranch runTransaction <$> runTransaction (Ops.expectCausalBranchByCausalHash bh)
getRootBranch :: m (Branch m)
getRootBranch =
Branch.transform runTransaction
<$> fetchRootBranch
rootBranchCache
(runTransaction (CodebaseOps.uncachedLoadRootBranch branchCache getDeclType))
getRootBranch :: TVar (Maybe (Sqlite.DataVersion, Branch Sqlite.Transaction)) -> m (Branch m)
getRootBranch rootBranchCache =
Branch.transform runTransaction <$> runTransaction (CodebaseOps.getRootBranch branchCache getDeclType rootBranchCache)
putRootBranch :: TVar (Maybe (Sqlite.DataVersion, Branch Sqlite.Transaction)) -> Text -> Branch m -> m ()
putRootBranch rootBranchCache reason branch1 = do
putRootBranch :: Text -> Branch m -> m ()
putRootBranch reason branch1 = do
now <- liftIO getCurrentTime
withRunInIO \runInIO -> do
runInIO do
runTransaction do
let emptyCausalHash = Cv.causalHash1to2 $ Branch.headHash Branch.empty
fromRootCausalHash <- fromMaybe emptyCausalHash <$> Ops.loadRootCausalHash
let toRootCausalHash = Cv.causalHash1to2 $ Branch.headHash branch1
CodebaseOps.putRootBranch rootBranchCache (Branch.transform (Sqlite.unsafeIO . runInIO) branch1)
Ops.appendReflog (Reflog.Entry {time = now, fromRootCausalHash, toRootCausalHash, reason})
-- this is naughty, the type says Transaction but it
-- won't run automatically with whatever Transaction
-- it is composed into unless the enclosing
-- Transaction is applied to the same db connection.
let branch1Trans = Branch.transform (Sqlite.unsafeIO . runInIO) branch1
putRootBranchTrans :: Sqlite.Transaction () = do
let emptyCausalHash = Cv.causalHash1to2 (Branch.headHash Branch.empty)
fromRootCausalHash <- fromMaybe emptyCausalHash <$> Ops.loadRootCausalHash
let toRootCausalHash = Cv.causalHash1to2 (Branch.headHash branch1)
CodebaseOps.putRootBranch branch1Trans
Ops.appendReflog (Reflog.Entry {time = now, fromRootCausalHash, toRootCausalHash, reason})
-- We need to update the database and the cached
-- value. We want to keep these in sync, so we take
-- the cache lock while updating sqlite.
withLock
rootBranchCache
(\restore _ -> restore $ runInIO $ runTransaction putRootBranchTrans)
(\_ -> Just branch1Trans)
-- if this blows up on cromulent hashes, then switch from `hashToHashId`
-- to one that returns Maybe.
@ -314,9 +331,9 @@ sqliteCodebase debugName root localOrRemote migrationStrategy action = do
Sqlite.runWriteTransaction destConn \runDest -> do
syncInternal (syncProgress progressStateRef) runSrc runDest b
getWatch :: UF.WatchKind -> Reference.Id -> m (Maybe (Term Symbol Ann))
getWatch k r =
runTransaction (CodebaseOps.getWatch getDeclType k r)
getWatch :: UF.WatchKind -> Reference.Id -> Sqlite.Transaction (Maybe (Term Symbol Ann))
getWatch =
CodebaseOps.getWatch getDeclType
termsOfTypeImpl :: Reference -> m (Set Referent.Id)
termsOfTypeImpl r =
@ -348,9 +365,8 @@ sqliteCodebase debugName root localOrRemote migrationStrategy action = do
putTypeDeclaration,
putTypeDeclarationComponent,
getTermComponentWithTypes,
getRootBranch = getRootBranch rootBranchCache,
putRootBranch = putRootBranch rootBranchCache,
getShallowCausalForHash,
getRootBranch,
putRootBranch,
getBranchForHashImpl = getBranchForHash,
putBranch,
syncFromDirectory,
@ -375,6 +391,13 @@ sqliteCodebase debugName root localOrRemote migrationStrategy action = do
runTransaction action =
withConn \conn -> Sqlite.runTransaction conn action
handleLockOption ma = case lockOption of
DontLock -> ma
DoLock -> withRunInIO \runInIO ->
withTryFileLock (lockfilePath root) Exclusive (\_flock -> runInIO ma) <&> \case
Nothing -> Left OpenCodebaseFileLockFailed
Just x -> x
syncInternal ::
forall m.
MonadUnliftIO m =>
@ -567,7 +590,7 @@ viewRemoteBranch' ReadGitRemoteNamespace {repo, sch, path} gitBranchBehavior act
then throwIO (C.GitSqliteCodebaseError (GitError.NoDatabaseFile repo remotePath))
else throwIO exception
result <- sqliteCodebase "viewRemoteBranch.gitCache" remotePath Remote MigrateAfterPrompt \codebase -> do
result <- sqliteCodebase "viewRemoteBranch.gitCache" remotePath Remote DoLock MigrateAfterPrompt \codebase -> do
-- try to load the requested branch from it
branch <- time "Git fetch (sch)" $ case sch of
-- no sub-branch was specified, so use the root.
@ -617,7 +640,7 @@ pushGitBranch srcConn repo (PushGitBranchOpts behavior _syncMode) action = Unlif
-- set up the cache dir
throwEitherMWith C.GitProtocolError . withRepo readRepo Git.CreateBranchIfMissing $ \pushStaging -> do
newBranchOrErr <- throwEitherMWith (C.GitSqliteCodebaseError . C.gitErrorFromOpenCodebaseError (Git.gitDirToPath pushStaging) readRepo)
. withOpenOrCreateCodebase "push.dest" (Git.gitDirToPath pushStaging) Remote MigrateAfterPrompt
. withOpenOrCreateCodebase "push.dest" (Git.gitDirToPath pushStaging) Remote DoLock MigrateAfterPrompt
$ \(codebaseStatus, destCodebase) -> do
currentRootBranch <-
Codebase1.runTransaction destCodebase CodebaseOps.getRootBranchExists >>= \case
@ -696,9 +719,10 @@ pushGitBranch srcConn repo (PushGitBranchOpts behavior _syncMode) action = Unlif
-- so we have to convert our expected path to test.
posixCodebasePath =
FilePath.Posix.joinPath (FilePath.splitDirectories codebasePath)
posixLockfilePath = FilePath.replaceExtension posixCodebasePath "lockfile"
statusLines = Text.unpack <$> Text.lines status
t = dropWhile Char.isSpace
okLine (t -> '?' : '?' : (t -> p)) | p == posixCodebasePath = True
okLine (t -> '?' : '?' : (t -> p)) | p == posixCodebasePath || p == posixLockfilePath = True
okLine (t -> 'M' : (t -> p)) | p == posixCodebasePath = True
okLine line = isWalDelete line || isShmDelete line
isWalDelete (t -> 'D' : (t -> p)) | p == posixCodebasePath ++ "-wal" = True


@ -6,6 +6,7 @@ import Unison.CodebasePath (CodebasePath)
data GitSqliteCodebaseError
= GitCouldntParseRootBranchHash ReadGitRepo String
| CodebaseFileLockFailed
| NoDatabaseFile ReadGitRepo CodebasePath
| UnrecognizedSchemaVersion ReadGitRepo CodebasePath SchemaVersion
| CodebaseRequiresMigration SchemaVersion SchemaVersion


@ -8,14 +8,13 @@
module Unison.Codebase.SqliteCodebase.Operations where
import Control.Lens (ifor)
import Data.Bifunctor (second)
import Data.Maybe (fromJust)
import Data.Bitraversable (bitraverse)
import Data.Either.Extra ()
import qualified Data.List as List
import qualified Data.List.NonEmpty as NEList
import Data.List.NonEmpty.Extra (NonEmpty ((:|)), maximum1)
import qualified Data.Map as Map
import Data.Maybe (fromJust)
import qualified Data.Set as Set
import qualified Data.Text as Text
import qualified U.Codebase.Branch as V2Branch
@ -377,36 +376,6 @@ tryFlushDeclBuffer termBuffer declBuffer =
h
in loop
getRootBranch ::
-- | A 'getDeclType'-like lookup, possibly backed by a cache.
BranchCache Sqlite.Transaction ->
(C.Reference.Reference -> Transaction CT.ConstructorType) ->
TVar (Maybe (Sqlite.DataVersion, Branch Transaction)) ->
Transaction (Branch Transaction)
getRootBranch branchCache doGetDeclType rootBranchCache =
Sqlite.unsafeIO (readTVarIO rootBranchCache) >>= \case
Nothing -> forceReload
Just (v, b) -> do
-- check to see if root namespace hash has been externally modified
-- and reload it if necessary
v' <- Sqlite.getDataVersion
if v == v'
then pure b
else do
newRootHash <- Ops.expectRootCausalHash
if Branch.headHash b == Cv.causalHash2to1 newRootHash
then pure b
else do
traceM $ "database was externally modified (" ++ show v ++ " -> " ++ show v' ++ ")"
forceReload
where
forceReload :: Transaction (Branch Transaction)
forceReload = do
branch1 <- uncachedLoadRootBranch branchCache doGetDeclType
ver <- Sqlite.getDataVersion
Sqlite.unsafeIO (atomically (writeTVar rootBranchCache (Just (ver, branch1))))
pure branch1
uncachedLoadRootBranch ::
BranchCache Sqlite.Transaction ->
(C.Reference.Reference -> Sqlite.Transaction CT.ConstructorType) ->
@ -420,12 +389,11 @@ getRootBranchExists :: Transaction Bool
getRootBranchExists =
isJust <$> Ops.loadRootCausalHash
putRootBranch :: TVar (Maybe (Sqlite.DataVersion, Branch Transaction)) -> Branch Transaction -> Transaction ()
putRootBranch rootBranchCache branch1 = do
putRootBranch :: Branch Transaction -> Transaction ()
putRootBranch branch1 = do
-- todo: check to see if root namespace hash has been externally modified
-- and do something (merge?) it if necessary. But for now, we just overwrite it.
void (Ops.saveRootBranch v2HashHandle (Cv.causalbranch1to2 branch1))
Sqlite.unsafeIO (atomically $ modifyTVar' rootBranchCache (fmap . second $ const branch1))
-- if this blows up on cromulent hashes, then switch from `hashToHashId`
-- to one that returns Maybe.


@ -3,6 +3,7 @@ module Unison.Codebase.SqliteCodebase.Paths
makeCodebasePath,
makeCodebaseDirPath,
backupCodebasePath,
lockfilePath,
)
where
@ -18,6 +19,9 @@ codebasePath = ".unison" </> "v2" </> "unison.sqlite3"
makeCodebasePath :: CodebasePath -> FilePath
makeCodebasePath root = makeCodebaseDirPath root </> "unison.sqlite3"
lockfilePath :: CodebasePath -> FilePath
lockfilePath root = makeCodebaseDirPath root </> "unison.lockfile"
-- | Makes a path to the location where sqlite files are stored within a codebase path.
makeCodebaseDirPath :: CodebasePath -> FilePath
makeCodebaseDirPath root = root </> ".unison" </> "v2"


@ -13,7 +13,6 @@ module Unison.Codebase.Type
)
where
import qualified U.Codebase.Branch as V2
import U.Codebase.HashTags (BranchHash)
import qualified U.Codebase.Reference as V2
import Unison.Codebase.Branch (Branch)
@ -82,7 +81,6 @@ data Codebase m v a = Codebase
Text -> -- Reason for the change, will be recorded in the reflog
Branch m ->
m (),
getShallowCausalForHash :: V2.CausalHash -> m (V2.CausalBranch m),
getBranchForHashImpl :: Branch.CausalHash -> m (Maybe (Branch m)),
-- | Put a branch into the codebase, which includes its children, its patches, and the branch itself, if they don't
-- already exist.
@ -97,7 +95,7 @@ data Codebase m v a = Codebase
-- | Push the given branch to the given repo, and optionally set it as the root branch.
pushGitBranch :: forall e. WriteGitRepo -> PushGitBranchOpts -> (Branch m -> m (Either e (Branch m))) -> m (Either GitError (Either e (Branch m))),
-- | @getWatch k r@ returns watch result @t@ that was previously put by @putWatch k r t@.
getWatch :: WK.WatchKind -> Reference.Id -> m (Maybe (Term v a)),
getWatch :: WK.WatchKind -> Reference.Id -> Sqlite.Transaction (Maybe (Term v a)),
-- | Get the set of user-defined terms-or-constructors that have the given type.
termsOfTypeImpl :: Reference -> m (Set Referent.Id),
-- | Get the set of user-defined terms-or-constructors that mention the given type anywhere in their signature.
@ -159,3 +157,4 @@ gitErrorFromOpenCodebaseError path repo = \case
UnrecognizedSchemaVersion repo path (fromIntegral v)
OpenCodebaseRequiresMigration fromSv toSv ->
CodebaseRequiresMigration fromSv toSv
OpenCodebaseFileLockFailed -> CodebaseFileLockFailed


@ -32,13 +32,13 @@ fromNames len names = PrettyPrintEnv terms' types'
--
-- 1. Prefer Relative Names to Absolute Names
-- 2. Prefer names that aren't hash qualified to those that are
-- 3. Prefer names which have fewer segments in their suffixified form (if applicable)
-- 4. Prefer names which have fewer segments in their fully-qualified form
-- 3. Prefer names which have fewer segments in their fully-qualified form
-- 4. Prefer names which have fewer segments in their suffixified form (if applicable)
prioritize :: [(HQ'.HashQualified Name, HQ'.HashQualified Name)] -> [(HQ'.HashQualified Name, HQ'.HashQualified Name)]
prioritize =
sortOn \case
(fqn, HQ'.NameOnly name) -> (Name.isAbsolute name, Nothing, Name.countSegments name, Name.countSegments (HQ'.toName fqn))
(fqn, HQ'.HashQualified name hash) -> (Name.isAbsolute name, Just hash, Name.countSegments name, Name.countSegments (HQ'.toName fqn))
(fqn, HQ'.NameOnly name) -> (Name.isAbsolute name, Nothing, Name.countSegments (HQ'.toName fqn), Name.countSegments name)
(fqn, HQ'.HashQualified name hash) -> (Name.isAbsolute name, Just hash, Name.countSegments (HQ'.toName fqn), Name.countSegments name)
fromSuffixNames :: Int -> NamesWithHistory -> PrettyPrintEnv
fromSuffixNames len names = PrettyPrintEnv terms' types'


@ -79,6 +79,7 @@ import Data.Bits (shiftL, shiftR, (.&.), (.|.))
import Data.Functor.Compose (Compose (..))
import Data.List hiding (and, or)
import qualified Data.Map as Map
import qualified Data.Primitive as PA
import qualified Data.Set as Set
import qualified Data.Text as Data.Text
import GHC.Stack (CallStack, callStack)
@ -226,12 +227,13 @@ newtype Prefix v x = Pfx (Map v [v]) deriving (Show)
instance Functor (Prefix v) where
fmap _ (Pfx m) = Pfx m
instance Ord v => Applicative (Prefix v) where
pure _ = Pfx Map.empty
Pfx ml <*> Pfx mr = Pfx $ Map.unionWith common ml mr
common :: Eq v => [v] -> [v] -> [v]
common (u:us) (v:vs)
common (u : us) (v : vs)
| u == v = u : common us vs
common _ _ = []
@ -264,13 +266,13 @@ dropPrefix v n = ABT.visitPure rw
| v == u = Just (apps' (var (ABT.annotation f) u) (drop n as))
rw _ = Nothing
dropPrefixes
:: Ord v => Semigroup a => Map v Int -> Term v a -> Term v a
dropPrefixes ::
Ord v => Semigroup a => Map v Int -> Term v a -> Term v a
dropPrefixes m = ABT.visitPure rw
where
rw (Apps' f@(Var' u) as)
| Just n <- Map.lookup u m =
Just (apps' (var (ABT.annotation f) u) (drop n as))
Just (apps' (var (ABT.annotation f) u) (drop n as))
rw _ = Nothing
-- Performs opposite transformations to those in enclose. Named after
@ -279,27 +281,28 @@ beta :: Var v => Monoid a => (Term v a -> Term v a) -> Term v a -> Maybe (Term v
beta rec (LetRecNamedTop' top (fmap (fmap rec) -> vbs) (rec -> bd)) =
Just $ letRec' top lvbs lbd
where
-- Avoid completely reducing a lambda expression, because recursive
-- lets must be guarded.
args (v, LamsNamed' vs Ann'{}) = (v, vs)
args (v, LamsNamed' vs _) = (v, init vs)
args (v, _) = (v, [])
-- Avoid completely reducing a lambda expression, because recursive
-- lets must be guarded.
args (v, LamsNamed' vs Ann' {}) = (v, vs)
args (v, LamsNamed' vs _) = (v, init vs)
args (v, _) = (v, [])
Pfx m0 = traverse_ (prefix . snd) vbs *> prefix bd
Pfx m0 = traverse_ (prefix . snd) vbs *> prefix bd
f ls rs = case common ls rs of
[] -> Nothing
vs -> Just vs
f ls rs = case common ls rs of
[] -> Nothing
vs -> Just vs
m = Map.map length $ Map.differenceWith f (Map.fromList $ map args vbs) m0
lvbs = vbs <&> \(v, b0) -> (,) v $ case b0 of
LamsNamed' vs b | Just n <- Map.lookup v m ->
lam' (ABT.annotation b0) (drop n vs) (dropPrefixes m b)
-- shouldn't happen
b -> dropPrefixes m b
lbd = dropPrefixes m bd
m = Map.map length $ Map.differenceWith f (Map.fromList $ map args vbs) m0
lvbs =
vbs <&> \(v, b0) -> (,) v $ case b0 of
LamsNamed' vs b
| Just n <- Map.lookup v m ->
lam' (ABT.annotation b0) (drop n vs) (dropPrefixes m b)
-- shouldn't happen
b -> dropPrefixes m b
lbd = dropPrefixes m bd
beta rec (Let1NamedTop' top v l@(LamsNamed' vs bd) (rec -> e))
| n > 0 = Just $ let1' top [(v, lamb)] (dropPrefix v n e)
| otherwise = Nothing
@ -310,17 +313,18 @@ beta rec (Let1NamedTop' top v l@(LamsNamed' vs bd) (rec -> e))
-- Enclosing doesn't create let-bound lambdas, so we
-- should never reduce a lambda to a non-lambda, as that
-- could affect evaluation order.
m | Ann' _ _ <- bd = length vs
m
| Ann' _ _ <- bd = length vs
| otherwise = length vs - 1
n = min m . length $ appPfx (prefix e) v vs
beta rec (Apps' l@(LamsNamed' vs body) as)
| n <- matchVars 0 vs as
, n > 0 = Just $ apps' (lam' al (drop n vs) (rec body)) (drop n as)
| n <- matchVars 0 vs as,
n > 0 =
Just $ apps' (lam' al (drop n vs) (rec body)) (drop n as)
| otherwise = Nothing
where
al = ABT.annotation l
matchVars !n (u:us) (Var' v : as) | u == v = matchVars (1+n) us as
matchVars !n (u : us) (Var' v : as) | u == v = matchVars (1 + n) us as
matchVars n _ _ = n
beta _ _ = Nothing
@ -1261,6 +1265,7 @@ data BLit
| Bytes Bytes
| Quote Value
| Code (SuperGroup Symbol)
| BArr PA.ByteArray
deriving (Show)
groupVars :: ANFM v (Set v)


@ -70,7 +70,15 @@ data LtTag
| LMT
| LYT
data BLTag = TextT | ListT | TmLinkT | TyLinkT | BytesT | QuoteT | CodeT
data BLTag
= TextT
| ListT
| TmLinkT
| TyLinkT
| BytesT
| QuoteT
| CodeT
| BArrT
data VaTag = PartialT | DataT | ContT | BLitT
@ -170,6 +178,7 @@ instance Tag BLTag where
BytesT -> 4
QuoteT -> 5
CodeT -> 6
BArrT -> 7
word2tag = \case
0 -> pure TextT
@ -179,6 +188,7 @@ instance Tag BLTag where
4 -> pure BytesT
5 -> pure QuoteT
6 -> pure CodeT
7 -> pure BArrT
t -> unknownTag "BLTag" t
instance Tag VaTag where
@ -579,6 +589,7 @@ putBLit (TyLink r) = putTag TyLinkT *> putReference r
putBLit (Bytes b) = putTag BytesT *> putBytes b
putBLit (Quote v) = putTag QuoteT *> putValue v
putBLit (Code g) = putTag CodeT *> putGroup mempty g
putBLit (BArr a) = putTag BArrT *> putByteArray a
getBLit :: MonadGet m => Version -> m BLit
getBLit v =
@ -590,6 +601,7 @@ getBLit v =
BytesT -> Bytes <$> getBytes
QuoteT -> Quote <$> getValue v
CodeT -> Code <$> getGroup
BArrT -> BArr <$> getByteArray
putRefs :: MonadPut m => [Reference] -> m ()
putRefs rs = putFoldable putReference rs
@ -775,7 +787,8 @@ deserializeValue bs = runGetS (getVersion >>= getValue) bs
where
getVersion =
getWord32be >>= \case
n | n < 1 -> fail $ "deserializeValue: unknown version: " ++ show n
n
| n < 1 -> fail $ "deserializeValue: unknown version: " ++ show n
| n < 3 -> fail $ "deserializeValue: unsupported version: " ++ show n
| n == 3 -> pure n
| otherwise -> fail $ "deserializeValue: unknown version: " ++ show n
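The BLTag change above follows the codebase's usual tag pattern: each constructor gets a stable small word, and decoding fails loudly on anything it does not recognize. A minimal standalone sketch of that pattern (the type and helper names here are illustrative, not the real Tag class):

    import Data.Word (Word8)

    -- Illustrative stand-in for a tag type; BArrD plays the role of the
    -- newly added BArrT constructor.
    data DemoTag = TextD | ListD | BytesD | BArrD
      deriving (Show, Eq, Enum, Bounded)

    tag2word :: DemoTag -> Word8
    tag2word = fromIntegral . fromEnum

    word2tag :: Word8 -> Either String DemoTag
    word2tag w
      | i <= fromEnum (maxBound :: DemoTag) = Right (toEnum i)
      | otherwise = Left ("unknown DemoTag: " ++ show w)
      where
        i = fromIntegral w

    -- Round-trip property: decoding an encoded tag gives the tag back.
    roundTrips :: Bool
    roundTrips = all (\t -> word2tag (tag2word t) == Right t) [minBound .. maxBound]
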


@ -375,7 +375,7 @@ exec !_ !denv !_activeThreads !ustk !bstk !k _ (BPrim2 CMPU i j) = do
ustk <- bump ustk
poke ustk . fromEnum $ universalCompare compare x y
pure (denv, ustk, bstk, k)
exec !_ !_ !_activeThreads !_ !bstk !k r (BPrim2 THRO i j) = do
name <- peekOffBi @Util.Text.Text bstk i
x <- peekOff bstk j
throwIO (BU (traceK r k) (Util.Text.toText name) x)
@ -1938,6 +1938,8 @@ reflectValue rty = goV
pure (ANF.Quote v)
| Just g <- maybeUnwrapForeign Rf.codeRef f =
pure (ANF.Code g)
| Just a <- maybeUnwrapForeign Rf.ibytearrayRef f =
pure (ANF.BArr a)
| otherwise = die $ err $ "foreign value: " <> (show f)
reifyValue :: CCache -> ANF.Value -> IO (Either [Reference] Closure)
@ -2010,6 +2012,7 @@ reifyValue0 (rty, rtm) = goV
goL (ANF.Bytes b) = pure . Foreign $ Wrap Rf.bytesRef b
goL (ANF.Quote v) = pure . Foreign $ Wrap Rf.valueRef v
goL (ANF.Code g) = pure . Foreign $ Wrap Rf.codeRef g
goL (ANF.BArr a) = pure . Foreign $ Wrap Rf.ibytearrayRef a
-- Universal comparison functions


@ -15,10 +15,12 @@ import Data.Bytes.VarInt
import Data.Foldable (traverse_)
import Data.Int (Int64)
import Data.Map.Strict as Map (Map, fromList, toList)
import qualified Data.Primitive as PA
import Data.Text (Text)
import Data.Text.Encoding (decodeUtf8, encodeUtf8)
import qualified Data.Vector.Primitive as BA
import Data.Word (Word64, Word8)
import GHC.Exts as IL (IsList (..))
import qualified U.Util.Hash as Hash
import Unison.ConstructorReference (ConstructorReference, GConstructorReference (..))
import qualified Unison.ConstructorType as CT
@ -156,6 +158,12 @@ getBytes = Bytes.fromChunks <$> getList getBlock
putBytes :: MonadPut m => Bytes.Bytes -> m ()
putBytes = putFoldable putBlock . Bytes.chunks
getByteArray :: MonadGet m => m PA.ByteArray
getByteArray = PA.byteArrayFromList <$> getList getWord8
putByteArray :: MonadPut m => PA.ByteArray -> m ()
putByteArray a = putFoldable putWord8 (IL.toList a)
getBlock :: MonadGet m => m Bytes.Chunk
getBlock = getLength >>= fmap Bytes.byteStringToChunk . getByteString
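The new getByteArray/putByteArray helpers above treat an immutable ByteArray as a plain list of bytes, written element by element. A small self-contained sketch of that conversion using only the primitive package (the function names here are local to the example):

    import qualified Data.Primitive as PA
    import Data.Word (Word8)
    import GHC.Exts (toList)

    -- Serialize: flatten to bytes, one Word8 per element.
    byteArrayToList :: PA.ByteArray -> [Word8]
    byteArrayToList = toList

    -- Deserialize: rebuild the array from the same list.
    listToByteArray :: [Word8] -> PA.ByteArray
    listToByteArray = PA.byteArrayFromList

    main :: IO ()
    main = print (listToByteArray (byteArrayToList arr) == arr)
      where
        arr = PA.byteArrayFromList [0x55, 0x6e, 0x69 :: Word8]
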


@ -193,6 +193,7 @@ pretty0
fmt S.LinkKeyword "termLink "
<> parenIfInfix name ic (styleHashQualified'' (fmt $ S.TermReference r) name)
where
TypeLink' r -> do
n <- getPPE
let name = elideFQN im $ PrettyPrintEnv.typeName n r
@ -200,6 +201,7 @@ pretty0
fmt S.LinkKeyword "typeLink "
<> parenIfInfix name ic (styleHashQualified'' (fmt $ S.TypeReference r) name)
where
Ann' tm t -> do
tm' <- pretty0 (ac 10 Normal im doc) tm
tp' <- TypePrinter.pretty0 im 0 t
@ -259,21 +261,17 @@ pretty0
fmt S.DelayForceChar (l "!") <> px
Delay' x
| Lets' _ _ <- x -> do
px <- pretty0 (ac 0 Block im doc) x
pure . paren (p >= 3) $
fmt S.ControlKeyword "do" `PP.hang` px
px <- pretty0 (ac 0 Block im doc) x
pure . paren (p >= 3) $
fmt S.ControlKeyword "do" `PP.hang` px
| otherwise -> do
px <- pretty0 (ac 10 Normal im doc) x
pure . paren (p >= 11 || isBlock x && p >= 3) $
fmt S.DelayForceChar (l "'")
<> ( case x of
Lets' _ _ -> id
-- Add indentation below if we're opening parens with '(
-- This is in case the contents are a long function application
-- in which case the arguments should be indented.
_ -> PP.indentAfterNewline " "
)
px
px <- pretty0 (ac 10 Normal im doc) x
pure . paren (p >= 11 || isBlock x && p >= 3) $
fmt S.DelayForceChar (l "'")
-- Add indentation below since we're opening parens with '(
-- This is in case the contents are a long function application
-- in which case the arguments should be indented.
<> PP.indentAfterNewline " " px
List' xs ->
PP.group <$> do
xs' <- traverse (pretty0 (ac 0 Normal im doc)) xs
@ -327,30 +325,30 @@ pretty0
-- See `isDestructuringBind` definition.
Match' scrutinee cs@[MatchCase pat guard (AbsN' vs body)]
| p < 1 && isDestructuringBind scrutinee cs -> do
n <- getPPE
let letIntro = case bc of
Block -> id
Normal -> \x ->
-- We don't call calcImports here, because we can't easily do the
-- corequisite step in immediateChildBlockTerms (because it doesn't
-- know bc.) So we'll fail to take advantage of any opportunity
-- this let block provides to add a use statement. Not so bad.
fmt S.ControlKeyword "let" `PP.hang` x
lhs <- do
let (lhs, _) = prettyPattern n (ac 0 Block im doc) 10 vs pat
guard' <- printGuard guard
pure $ PP.group lhs `PP.hang` guard'
let eq = fmt S.BindingEquals "="
rhs <- do
let (im', uses) = calcImports im scrutinee
uses <$> sequence [pretty0 (ac (-1) Block im' doc) scrutinee]
letIntro <$> do
prettyBody <- pretty0 (ac (-1) Block im doc) body
pure $
PP.lines
[ (lhs <> eq) `PP.hang` rhs,
prettyBody
]
n <- getPPE
let letIntro = case bc of
Block -> id
Normal -> \x ->
-- We don't call calcImports here, because we can't easily do the
-- corequisite step in immediateChildBlockTerms (because it doesn't
-- know bc.) So we'll fail to take advantage of any opportunity
-- this let block provides to add a use statement. Not so bad.
fmt S.ControlKeyword "let" `PP.hang` x
lhs <- do
let (lhs, _) = prettyPattern n (ac 0 Block im doc) 10 vs pat
guard' <- printGuard guard
pure $ PP.group lhs `PP.hang` guard'
let eq = fmt S.BindingEquals "="
rhs <- do
let (im', uses) = calcImports im scrutinee
uses <$> sequence [pretty0 (ac (-1) Block im' doc) scrutinee]
letIntro <$> do
prettyBody <- pretty0 (ac (-1) Block im doc) body
pure $
PP.lines
[ (lhs <> eq) `PP.hang` rhs,
prettyBody
]
where
printGuard Nothing = pure mempty
printGuard (Just g') = do
@ -391,9 +389,9 @@ pretty0
case (term, binaryOpsPred) of
(DD.Doc, _)
| doc == MaybeDoc ->
if isDocLiteral term
then applyPPE3 prettyDoc im term
else pretty0 (a {docContext = NoDoc}) term
if isDocLiteral term
then applyPPE3 prettyDoc im term
else pretty0 (a {docContext = NoDoc}) term
(TupleTerm' [x], _) -> do
let conRef = DD.pairCtorRef
name <- elideFQN im <$> applyPPE2 PrettyPrintEnv.termName conRef
@ -465,12 +463,12 @@ pretty0
_ -> case (term, nonForcePred) of
OverappliedBinaryAppPred' f a b r
| binaryOpsPred f ->
-- Special case for overapplied binary op
do
prettyB <- pretty0 (ac 3 Normal im doc) b
prettyR <- PP.spacedTraverse (pretty0 (ac 10 Normal im doc)) r
prettyA <- binaryApps [(f, a)] prettyB
pure $ paren True $ PP.hang prettyA prettyR
-- Special case for overapplied binary op
do
prettyB <- pretty0 (ac 3 Normal im doc) b
prettyR <- PP.spacedTraverse (pretty0 (ac 10 Normal im doc)) r
prettyA <- binaryApps [(f, a)] prettyB
pure $ paren True $ PP.hang prettyA prettyR
AppsPred' f args ->
paren (p >= 10) <$> do
f' <- pretty0 (ac 10 Normal im doc) f
@ -611,8 +609,8 @@ prettyPattern n c@AmbientContext {imports = im} p vs patt = case patt of
Pattern.Text _ t -> (fmt S.TextLiteral $ l $ show t, vs)
TuplePattern pats
| length pats /= 1 ->
let (pats_printed, tail_vs) = patterns (-1) vs pats
in (PP.parenthesizeCommas pats_printed, tail_vs)
let (pats_printed, tail_vs) = patterns (-1) vs pats
in (PP.parenthesizeCommas pats_printed, tail_vs)
Pattern.Constructor _ ref [] ->
(styleHashQualified'' (fmt $ S.TermReference conRef) name, vs)
where
@ -710,7 +708,10 @@ printCase ::
DocLiteralContext ->
[MatchCase' () (Term3 v PrintAnnotation)] ->
m (Pretty SyntaxText)
printCase im doc ms0 = PP.lines . alignGrid <$> grid
printCase im doc ms0 =
PP.orElse
<$> (PP.lines . alignGrid True <$> grid)
<*> (PP.lines . alignGrid False <$> grid)
where
ms = groupCases ms0
justify rows =
@ -718,21 +719,34 @@ printCase im doc ms0 = PP.lines . alignGrid <$> grid
where
alignPatterns (p, _, _) = (p, Just "")
gbs (_, gs, bs) = zip gs bs
alignGrid = fmap alignCase . justify
alignCase (p, gbs) =
if not (null (drop 1 gbs))
then PP.hang p guardBlock
else p <> guardBlock
nojustify = map f
where
guardBlock =
PP.lines $
fmap (\(g, (a, b)) -> PP.hang (PP.group (g <> a)) b) justified
justified = PP.leftJustify $ fmap (\(g, b) -> (g, (arrow, b))) gbs
f (p, gs, bs) = (p, zip gs bs)
alignGrid alignArrows grid =
fmap alignCase $ if alignArrows then justify grid else nojustify grid
where
alignCase (p, gbs) =
if not (null (drop 1 gbs))
then PP.hang p guardBlock
else p <> guardBlock
where
guardBlock =
PP.lines $
fmap
( \(g, (a, b)) ->
PP.hang
( PP.group
(g <> (if alignArrows then "" else " ") <> a)
)
b
)
justified
justified = PP.leftJustify $ fmap (\(g, b) -> (g, (arrow, b))) gbs
grid = traverse go ms
patLhs env vs pats =
case pats of
[pat] -> PP.group (fst (prettyPattern env (ac 0 Block im doc) (-1) vs pat))
pats -> PP.group . PP.sep ("," <> PP.softbreak)
pats -> PP.group . PP.sep (PP.indentAfterNewline " " $ "," <> PP.softbreak)
. (`evalState` vs)
. for pats
$ \pat -> do
@ -858,14 +872,14 @@ prettyBinding0 a@AmbientContext {imports = im, docContext = doc} v term =
where
defnLhs v vs
| infix' = case vs of
x : y : _ ->
PP.sep
" "
[ fmt S.Var $ PP.text (Var.name x),
styleHashQualified'' (fmt $ S.HashQualifier v) $ elideFQN im v,
fmt S.Var $ PP.text (Var.name y)
]
_ -> l "error"
x : y : _ ->
PP.sep
" "
[ fmt S.Var $ PP.text (Var.name x),
styleHashQualified'' (fmt $ S.HashQualifier v) $ elideFQN im v,
fmt S.Var $ PP.text (Var.name y)
]
_ -> l "error"
| null vs = renderName v
| otherwise = renderName v `PP.hang` args vs
args = PP.spacedMap $ fmt S.Var . PP.text . Var.name
@ -1466,7 +1480,7 @@ unLetBlock t = rec t
Just (_isTop, bindings, body) -> case rec body of
Just (innerBindings, innerBody)
| dontIntersect bindings innerBindings ->
Just (bindings ++ innerBindings, innerBody)
Just (bindings ++ innerBindings, innerBody)
_ -> Just (bindings, body)
nonrec t = case unLet t of
Nothing -> Nothing
@ -1475,7 +1489,7 @@ unLetBlock t = rec t
in case rec body of
Just (innerBindings, innerBody)
| dontIntersect bindings innerBindings ->
Just (bindings ++ innerBindings, innerBody)
Just (bindings ++ innerBindings, innerBody)
_ -> Just (bindings, body)
pattern LamsNamedMatch' ::
@ -1528,7 +1542,7 @@ unLamsMatch' t = case unLamsUntilDelay' t of
| -- if `v1'` is referenced in any of the branches, we can't use lambda case
-- syntax as we need to keep the `v1'` name that was introduced
v1 == v1' && Set.notMember v1' (Set.unions $ freeVars <$> branches) ->
Just (reverse vs, [([p], guard, body) | MatchCase p guard body <- branches])
Just (reverse vs, [([p], guard, body) | MatchCase p guard body <- branches])
-- x y z -> match (x,y,z) with (pat1, pat2, pat3) -> ...
-- becomes
-- cases pat1 pat2 pat3 -> ...`
@ -1541,7 +1555,7 @@ unLamsMatch' t = case unLamsUntilDelay' t of
all notFree (take len vs)
&& all isRightArity branches
&& len /= 0 -> -- all patterns need to match arity of scrutes
Just (reverse (drop len vs), branches')
Just (reverse (drop len vs), branches')
where
isRightArity (MatchCase (TuplePattern ps) _ _) = length ps == len
isRightArity MatchCase {} = False
@ -1793,7 +1807,7 @@ toDocExample' suffix ppe (Apps' (Ref' r) [Nat' n, l@(LamsNamed' vs tm)])
| nameEndsWith ppe suffix r,
ABT.freeVars l == mempty,
ok tm =
Just (lam' (ABT.annotation l) (drop (fromIntegral n + 1) vs) tm)
Just (lam' (ABT.annotation l) (drop (fromIntegral n + 1) vs) tm)
where
ok (Apps' f _) = ABT.freeVars f == mempty
ok tm = ABT.freeVars tm == mempty
@ -1807,9 +1821,9 @@ toDocTransclude _ _ = Nothing
toDocLink :: Ord v => PrettyPrintEnv -> Term3 v PrintAnnotation -> Maybe (Either Reference Referent)
toDocLink ppe (App' (Ref' r) tm)
| nameEndsWith ppe ".docLink" r = case tm of
(toDocEmbedTermLink ppe -> Just tm) -> Just (Right tm)
(toDocEmbedTypeLink ppe -> Just tm) -> Just (Left tm)
_ -> Nothing
(toDocEmbedTermLink ppe -> Just tm) -> Just (Right tm)
(toDocEmbedTypeLink ppe -> Just tm) -> Just (Left tm)
_ -> Nothing
toDocLink _ _ = Nothing
toDocNamedLink :: PrettyPrintEnv -> Term3 v PrintAnnotation -> Maybe (Term3 v PrintAnnotation, Term3 v PrintAnnotation)
@ -1848,7 +1862,7 @@ toDocSourceAnnotations _ppe _tm = Just [] -- todo fetch annotations
toDocSourceElement :: Ord v => PrettyPrintEnv -> Term3 v PrintAnnotation -> Maybe (Either Reference Referent, [Referent])
toDocSourceElement ppe (Apps' (Ref' r) [tm, toDocSourceAnnotations ppe -> Just annotations])
| nameEndsWith ppe ".docSourceElement" r =
(,annotations) <$> ok tm
(,annotations) <$> ok tm
where
ok tm =
Right <$> toDocEmbedTermLink ppe tm
@ -1863,9 +1877,9 @@ toDocSource' ::
Maybe [(Either Reference Referent, [Referent])]
toDocSource' suffix ppe (App' (Ref' r) (List' tms))
| nameEndsWith ppe suffix r =
case [tm | Just tm <- toDocSourceElement ppe <$> toList tms] of
tms' | length tms' == length tms -> Just tms'
_ -> Nothing
case [tm | Just tm <- toDocSourceElement ppe <$> toList tms] of
tms' | length tms' == length tms -> Just tms'
_ -> Nothing
toDocSource' _ _ _ = Nothing
toDocSource,
@ -1903,9 +1917,9 @@ toDocEmbedSignatureLink _ _ = Nothing
toDocSignature :: Ord v => PrettyPrintEnv -> Term3 v PrintAnnotation -> Maybe [Referent]
toDocSignature ppe (App' (Ref' r) (List' tms))
| nameEndsWith ppe ".docSignature" r =
case [tm | Just tm <- toDocEmbedSignatureLink ppe <$> toList tms] of
tms' | length tms' == length tms -> Just tms'
_ -> Nothing
case [tm | Just tm <- toDocEmbedSignatureLink ppe <$> toList tms] of
tms' | length tms' == length tms -> Just tms'
_ -> Nothing
toDocSignature _ _ = Nothing
toDocBulletedList :: PrettyPrintEnv -> Term3 v PrintAnnotation -> Maybe [Term3 v PrintAnnotation]


@ -29,7 +29,7 @@ test =
[ scope "a v2 codebase should be opened" do
tmp <- io (Temp.getCanonicalTemporaryDirectory >>= flip Temp.createTempDirectory "ucm-test")
cbInit <- io initMockWithCodebase
r <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Home tmp) CI.DontMigrate \case
r <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Home tmp) CI.DontLock CI.DontMigrate \case
(CI.OpenedCodebase, _, _) -> pure True
_ -> pure False
case r of
@ -38,7 +38,7 @@ test =
scope "a v2 codebase should be created when one does not exist" do
tmp <- io (Temp.getCanonicalTemporaryDirectory >>= flip Temp.createTempDirectory "ucm-test")
cbInit <- io initMockWithoutCodebase
r <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Home tmp) CI.DontMigrate \case
r <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Home tmp) CI.DontLock CI.DontMigrate \case
(CI.CreatedCodebase, _, _) -> pure True
_ -> pure False
case r of
@ -51,7 +51,7 @@ test =
tmp <- io (Temp.getCanonicalTemporaryDirectory >>= flip Temp.createTempDirectory "ucm-test")
cbInit <- io initMockWithCodebase
res <- io $
CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (DontCreateWhenMissing tmp)) CI.DontMigrate $ \case
CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (DontCreateWhenMissing tmp)) CI.DontLock CI.DontMigrate $ \case
(CI.OpenedCodebase, _, _) -> pure True
_ -> pure False
case res of
@ -61,7 +61,7 @@ test =
tmp <- io (Temp.getCanonicalTemporaryDirectory >>= flip Temp.createTempDirectory "ucm-test")
cbInit <- io initMockWithoutCodebase
res <- io $
CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (DontCreateWhenMissing tmp)) CI.DontMigrate $ \case
CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (DontCreateWhenMissing tmp)) CI.DontLock CI.DontMigrate $ \case
_ -> pure False
case res of
Left (_, CI.InitErrorOpen OpenCodebaseDoesntExist) -> expect True
@ -72,7 +72,7 @@ test =
[ scope "a v2 codebase should be created when one does not exist at the Specified dir" do
tmp <- io (Temp.getCanonicalTemporaryDirectory >>= flip Temp.createTempDirectory "ucm-test")
cbInit <- io initMockWithoutCodebase
res <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (CreateWhenMissing tmp)) CI.DontMigrate \case
res <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (CreateWhenMissing tmp)) CI.DontLock CI.DontMigrate \case
(CI.CreatedCodebase, _, _) -> pure True
_ -> pure False
case res of
@ -81,7 +81,7 @@ test =
scope "a v2 codebase should be opened when one exists" do
tmp <- io (Temp.getCanonicalTemporaryDirectory >>= flip Temp.createTempDirectory "ucm-test")
cbInit <- io initMockWithCodebase
res <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (CreateWhenMissing tmp)) CI.DontMigrate \case
res <- io $ CI.withOpenOrCreateCodebase cbInit "ucm-test" (Specified (CreateWhenMissing tmp)) CI.DontLock CI.DontMigrate \case
(CI.OpenedCodebase, _, _) -> pure True
_ -> pure False
case res of
@ -98,9 +98,9 @@ initMockWithCodebase = do
pure $
Init
{ -- withOpenCodebase :: forall r. DebugName -> CodebasePath -> (Codebase m v a -> m r) -> m (Either Pretty r),
withOpenCodebase = \_ _ _ action -> Right <$> action codebase,
withOpenCodebase = \_ _ _ _ action -> Right <$> action codebase,
-- withCreatedCodebase :: forall r. DebugName -> CodebasePath -> (Codebase m v a -> m r) -> m (Either CreateCodebaseError r),
withCreatedCodebase = \_ _ action -> Right <$> action codebase,
withCreatedCodebase = \_ _ _ action -> Right <$> action codebase,
-- CodebasePath -> CodebasePath
codebasePath = id
}
@ -110,9 +110,9 @@ initMockWithoutCodebase = do
let codebase = error "did we /actually/ need a Codebase?"
pure $
Init
{ withOpenCodebase = \_ _ _ _ -> pure (Left OpenCodebaseDoesntExist),
{ withOpenCodebase = \_ _ _ _ _ -> pure (Left OpenCodebaseDoesntExist),
-- withCreatedCodebase :: forall r. DebugName -> CodebasePath -> (Codebase m v a -> m r) -> m (Either CreateCodebaseError r),
withCreatedCodebase = \_ _ action -> Right <$> action codebase,
withCreatedCodebase = \_ _ _ action -> Right <$> action codebase,
-- CodebasePath -> CodebasePath
codebasePath = id
}


@ -59,6 +59,7 @@ library
Unison.Codebase.Path
Unison.Codebase.Path.Parse
Unison.Codebase.PushBehavior
Unison.Codebase.RootBranchCache
Unison.Codebase.Runtime
Unison.Codebase.Serialization
Unison.Codebase.ShortCausalHash
@ -208,6 +209,7 @@ library
, errors
, exceptions
, extra
, filelock
, filepath
, fingertree
, free
@ -240,7 +242,6 @@ library
, openapi3
, optparse-applicative >=0.16.1.0
, pem
, prelude-extras
, pretty-simple
, primitive
, process
@ -258,7 +259,6 @@ library
, servant-server
, shellmet
, stm
, strings
, tagged
, temporary
, terminal-size
@ -393,6 +393,7 @@ test-suite parser-typechecker-tests
, errors
, exceptions
, extra
, filelock
, filemanip
, filepath
, fingertree
@ -426,7 +427,6 @@ test-suite parser-typechecker-tests
, openapi3
, optparse-applicative >=0.16.1.0
, pem
, prelude-extras
, pretty-simple
, primitive
, process
@ -445,7 +445,6 @@ test-suite parser-typechecker-tests
, shellmet
, split
, stm
, strings
, tagged
, temporary
, terminal-size


@ -54,9 +54,7 @@ extra-deps:
- github: judah/haskeline
commit: d6c2643b0d5c19be7e440615c6f84d603d4bc648
- guid-0.1.0@sha256:a7c975be473f6f142d5cc1b39bc807a99043d20b1bb0873fdfe7a3ce84d2faf1,1078
- prelude-extras-0.4.0.3@sha256:1c10b0123ea13a6423d74a8fcbaeb2d5249b472588abde418a36b47b7c4f48c8,1163
- sandi-0.5@sha256:b278d072ca717706ea38f9bd646e023f7f2576a778fb43565b434f93638849aa,3010
- strings-1.1@sha256:0285dec4c8ab262359342b3e5ef1eb567074669461b9b38404f1cb870c881c5c,1617
- fuzzyfind-3.0.0@sha256:d79a5d3ed194dd436c6b839bf187211d880cf773b2febaca456e5ccf93f5ac65,1814
- monad-validate-1.2.0.0@sha256:9850f408431098b28806dd464b6825a88a0b56c84f380d7fe0454c1df9d6f881,3505
- NanoID-3.1.0@sha256:9118ab00e8650b5a56a10c90295d357eb77a8057a598b7e56dfedc9c6d53c77d,1524


@ -5,173 +5,159 @@
packages:
- completed:
sha256: d4fd87fb7bfc5d8e9fbc3e4ee7302c6b1500cdc00fdb9b659d0f4849b6ebe2d5
name: configurator
size: 15989
url: https://github.com/unisonweb/configurator/archive/e47e9e9fe1f576f8c835183b9def52d73c01327a.tar.gz
pantry-tree:
sha256: 90547cd983fd15ebdc803e057d3ef8735fe93a75e29a00f8a74eadc13ee0f6e9
size: 955
name: configurator
version: 0.3.0.0
sha256: d4fd87fb7bfc5d8e9fbc3e4ee7302c6b1500cdc00fdb9b659d0f4849b6ebe2d5
pantry-tree:
size: 955
sha256: 90547cd983fd15ebdc803e057d3ef8735fe93a75e29a00f8a74eadc13ee0f6e9
original:
url: https://github.com/unisonweb/configurator/archive/e47e9e9fe1f576f8c835183b9def52d73c01327a.tar.gz
- completed:
sha256: 6e642163070a217cc3363bdbefde571ff6c1878f4fc3d92e9c910db7fa88eaf2
name: shellmet
size: 10460
url: https://github.com/unisonweb/shellmet/archive/2fd348592c8f51bb4c0ca6ba4bc8e38668913746.tar.gz
pantry-tree:
sha256: 05a169a7a6b68100630e885054dc1821d31cd06571b0317ec90c75ac2c41aeb7
size: 654
name: shellmet
version: 0.0.4.0
sha256: 6e642163070a217cc3363bdbefde571ff6c1878f4fc3d92e9c910db7fa88eaf2
pantry-tree:
size: 654
sha256: 05a169a7a6b68100630e885054dc1821d31cd06571b0317ec90c75ac2c41aeb7
original:
url: https://github.com/unisonweb/shellmet/archive/2fd348592c8f51bb4c0ca6ba4bc8e38668913746.tar.gz
- completed:
sha256: a45eb3dbe7333c108aef4afce7f763c7661919b09641ef9d241c7ca4a78bf735
name: ki
subdir: ki
size: 15840
subdir: ki
url: https://github.com/awkward-squad/ki/archive/563e96238dfe392dccf68d93953c8f30fd53bec8.tar.gz
pantry-tree:
sha256: c63220c438c076818e09061b117c56055e154f6abb66ea9bc44a3900fcabd654
size: 704
name: ki
version: 1.0.0
sha256: a45eb3dbe7333c108aef4afce7f763c7661919b09641ef9d241c7ca4a78bf735
pantry-tree:
size: 704
sha256: c63220c438c076818e09061b117c56055e154f6abb66ea9bc44a3900fcabd654
original:
subdir: ki
url: https://github.com/awkward-squad/ki/archive/563e96238dfe392dccf68d93953c8f30fd53bec8.tar.gz
- completed:
sha256: ef827ea5e8581cd68da9600660b2e584877d4fcdcf1cd2eb4652e0e51d817465
name: haskeline
size: 74363
url: https://github.com/judah/haskeline/archive/d6c2643b0d5c19be7e440615c6f84d603d4bc648.tar.gz
pantry-tree:
sha256: e30301b5389893948e25d39978d09948b11479b5b2a3517b978466fde548fc48
size: 3769
name: haskeline
version: 0.8.0.0
sha256: ef827ea5e8581cd68da9600660b2e584877d4fcdcf1cd2eb4652e0e51d817465
pantry-tree:
size: 3769
sha256: e30301b5389893948e25d39978d09948b11479b5b2a3517b978466fde548fc48
original:
url: https://github.com/judah/haskeline/archive/d6c2643b0d5c19be7e440615c6f84d603d4bc648.tar.gz
- completed:
hackage: guid-0.1.0@sha256:a7c975be473f6f142d5cc1b39bc807a99043d20b1bb0873fdfe7a3ce84d2faf1,1078
pantry-tree:
sha256: a33838b7b1c54f6ac3e1b436b25674948713a4189658e4d82e639b9a689bc90d
size: 364
hackage: guid-0.1.0@sha256:a7c975be473f6f142d5cc1b39bc807a99043d20b1bb0873fdfe7a3ce84d2faf1,1078
sha256: a33838b7b1c54f6ac3e1b436b25674948713a4189658e4d82e639b9a689bc90d
original:
hackage: guid-0.1.0@sha256:a7c975be473f6f142d5cc1b39bc807a99043d20b1bb0873fdfe7a3ce84d2faf1,1078
- completed:
hackage: sandi-0.5@sha256:b278d072ca717706ea38f9bd646e023f7f2576a778fb43565b434f93638849aa,3010
pantry-tree:
sha256: a03f60a1250c9d0daaf208ba58ccf24e05e0807f2e3dc7892fad3f2f3a196f7f
size: 476
hackage: prelude-extras-0.4.0.3@sha256:1c10b0123ea13a6423d74a8fcbaeb2d5249b472588abde418a36b47b7c4f48c8,1163
original:
hackage: prelude-extras-0.4.0.3@sha256:1c10b0123ea13a6423d74a8fcbaeb2d5249b472588abde418a36b47b7c4f48c8,1163
- completed:
pantry-tree:
sha256: 5ca7ce4bc22ab9d4427bb149b5e283ab9db43375df14f7131fdfd48775f36350
size: 3455
hackage: sandi-0.5@sha256:b278d072ca717706ea38f9bd646e023f7f2576a778fb43565b434f93638849aa,3010
sha256: 5ca7ce4bc22ab9d4427bb149b5e283ab9db43375df14f7131fdfd48775f36350
original:
hackage: sandi-0.5@sha256:b278d072ca717706ea38f9bd646e023f7f2576a778fb43565b434f93638849aa,3010
- completed:
hackage: fuzzyfind-3.0.0@sha256:d79a5d3ed194dd436c6b839bf187211d880cf773b2febaca456e5ccf93f5ac65,1814
pantry-tree:
sha256: 876e5a679244143e2a7e74357427c7522a8d61f68a4cc3ae265fe4960b75a2e2
size: 212
hackage: strings-1.1@sha256:0285dec4c8ab262359342b3e5ef1eb567074669461b9b38404f1cb870c881c5c,1617
original:
hackage: strings-1.1@sha256:0285dec4c8ab262359342b3e5ef1eb567074669461b9b38404f1cb870c881c5c,1617
- completed:
pantry-tree:
sha256: 0e6c6d4f89083c8385de5adc4f36ad01b2b0ff45261b47f7d90d919969c8b5ed
size: 542
hackage: fuzzyfind-3.0.0@sha256:d79a5d3ed194dd436c6b839bf187211d880cf773b2febaca456e5ccf93f5ac65,1814
sha256: 0e6c6d4f89083c8385de5adc4f36ad01b2b0ff45261b47f7d90d919969c8b5ed
original:
hackage: fuzzyfind-3.0.0@sha256:d79a5d3ed194dd436c6b839bf187211d880cf773b2febaca456e5ccf93f5ac65,1814
- completed:
hackage: monad-validate-1.2.0.0@sha256:9850f408431098b28806dd464b6825a88a0b56c84f380d7fe0454c1df9d6f881,3505
pantry-tree:
sha256: 8e049bd12ce2bd470909578f2ee8eb80b89d5ff88860afa30e29dd4eafecfa3e
size: 713
hackage: monad-validate-1.2.0.0@sha256:9850f408431098b28806dd464b6825a88a0b56c84f380d7fe0454c1df9d6f881,3505
sha256: 8e049bd12ce2bd470909578f2ee8eb80b89d5ff88860afa30e29dd4eafecfa3e
original:
hackage: monad-validate-1.2.0.0@sha256:9850f408431098b28806dd464b6825a88a0b56c84f380d7fe0454c1df9d6f881,3505
- completed:
hackage: NanoID-3.1.0@sha256:9118ab00e8650b5a56a10c90295d357eb77a8057a598b7e56dfedc9c6d53c77d,1524
pantry-tree:
sha256: d33d603a2f0d1a220ff0d5e7edb6273def89120e6bb958c2d836cae89e788334
size: 363
hackage: NanoID-3.1.0@sha256:9118ab00e8650b5a56a10c90295d357eb77a8057a598b7e56dfedc9c6d53c77d,1524
sha256: d33d603a2f0d1a220ff0d5e7edb6273def89120e6bb958c2d836cae89e788334
original:
hackage: NanoID-3.1.0@sha256:9118ab00e8650b5a56a10c90295d357eb77a8057a598b7e56dfedc9c6d53c77d,1524
- completed:
pantry-tree:
sha256: c7f5afe70db567e2cf9f3119b49f4b402705e6bd08ed8ba98747a64a8a0bef41
size: 770
hackage: direct-sqlite-2.3.27@sha256:94207d3018da3bda84bc6ce00d2c0236ced7edb37afbd726ed2a0bfa236e149b,3771
pantry-tree:
size: 770
sha256: c7f5afe70db567e2cf9f3119b49f4b402705e6bd08ed8ba98747a64a8a0bef41
original:
hackage: direct-sqlite-2.3.27
- completed:
hackage: recover-rtti-0.4.0.0@sha256:2ce1e031ec0e34d736fa45f0149bbd55026f614939dc90ffd14a9c5d24093ff4,4423
pantry-tree:
sha256: d87d84c3f760c1b2540f74e4a301cd4e8294df891e8e4262e8bdd313bc8e0bfd
size: 2410
hackage: recover-rtti-0.4.0.0@sha256:2ce1e031ec0e34d736fa45f0149bbd55026f614939dc90ffd14a9c5d24093ff4,4423
sha256: d87d84c3f760c1b2540f74e4a301cd4e8294df891e8e4262e8bdd313bc8e0bfd
original:
hackage: recover-rtti-0.4.0.0@sha256:2ce1e031ec0e34d736fa45f0149bbd55026f614939dc90ffd14a9c5d24093ff4,4423
- completed:
hackage: lock-file-0.7.0.0@sha256:3ad84b5e454145e1d928063b56abb96db24a99a21b493989520e58fa0ab37b00,4484
pantry-tree:
sha256: 3634593ce191e82793ea0e060598ab3cf67f2ef2fe1d65345dc9335ad529d25f
size: 718
hackage: lock-file-0.7.0.0@sha256:3ad84b5e454145e1d928063b56abb96db24a99a21b493989520e58fa0ab37b00,4484
sha256: 3634593ce191e82793ea0e060598ab3cf67f2ef2fe1d65345dc9335ad529d25f
original:
hackage: lock-file-0.7.0.0@sha256:3ad84b5e454145e1d928063b56abb96db24a99a21b493989520e58fa0ab37b00,4484
- completed:
pantry-tree:
sha256: 8372e84e9c710097f4f80f2016ca15a5a0cd7884b8ac5ce70f26b3110f4401bd
size: 2547
hackage: http-client-0.7.11@sha256:3f59ac8ffe2a3768846cdda040a0d1df2a413960529ba61c839861c948871967,5756
pantry-tree:
size: 2547
sha256: 8372e84e9c710097f4f80f2016ca15a5a0cd7884b8ac5ce70f26b3110f4401bd
original:
hackage: http-client-0.7.11
- completed:
pantry-tree:
sha256: 87526822a8ffb514d355975bca3a3f5ceb9a19eaf664cbdcde2f866c4d33878c
size: 1551
hackage: lsp-1.5.0.0@sha256:1ad138526f9177965d4b5b01f9074fe0475636b2c563dcc7036fb6908f8e6189,5382
pantry-tree:
size: 1551
sha256: 87526822a8ffb514d355975bca3a3f5ceb9a19eaf664cbdcde2f866c4d33878c
original:
hackage: lsp-1.5.0.0
- completed:
pantry-tree:
sha256: e45ef86a4301beb45ae7ec527e69880944a03c2d959cb0a051bf58dd0a5579f4
size: 4160
hackage: lsp-types-1.5.0.0@sha256:7ed97bbc9290ad6ffb9b5a8e082226783c710fff9e4ca2df4c578b065997b1ea,4301
pantry-tree:
size: 4160
sha256: e45ef86a4301beb45ae7ec527e69880944a03c2d959cb0a051bf58dd0a5579f4
original:
hackage: lsp-types-1.5.0.0
- completed:
pantry-tree:
sha256: 51b22419f8d9bfd2a8aa3efa16b80a48e4b0c915a1d27fefe5f0b6d2d9e48312
size: 1180
hackage: text-rope-0.2@sha256:53b9b4cef0b278b9c591cd4ca76543acacf64c9d1bfbc06d0d9a88960446d9a7,2087
pantry-tree:
size: 1180
sha256: 51b22419f8d9bfd2a8aa3efa16b80a48e4b0c915a1d27fefe5f0b6d2d9e48312
original:
hackage: text-rope-0.2@sha256:53b9b4cef0b278b9c591cd4ca76543acacf64c9d1bfbc06d0d9a88960446d9a7,2087
- completed:
pantry-tree:
sha256: d4cc089c40c5052ee02f91eafa567e0a239908aabc561dfa6080ba3bfc8c25bd
size: 584
hackage: co-log-core-0.3.1.0@sha256:9794bdedd1391decd0e22bdfe2b11abcb42e6cff7a4531e1f8882890828f4e63,3816
pantry-tree:
size: 584
sha256: d4cc089c40c5052ee02f91eafa567e0a239908aabc561dfa6080ba3bfc8c25bd
original:
hackage: co-log-core-0.3.1.0
- completed:
pantry-tree:
sha256: 2a9669ed392657d34ec2e180ddac68c9ef657e54bf4b5fbc9b9efaa7b1d341be
size: 580
hackage: terminal-size-0.3.3@sha256:bd5f02333982bc8d6017db257b2a0b91870a295b4a37142a0c0525d8f533a48f,1255
pantry-tree:
size: 580
sha256: 2a9669ed392657d34ec2e180ddac68c9ef657e54bf4b5fbc9b9efaa7b1d341be
original:
hackage: terminal-size-0.3.3
- completed:
pantry-tree:
sha256: 1981a732d1917213de7f51d26255af733a61918c59eebb6c6f6ca939856839ef
size: 3971
hackage: network-3.1.2.7@sha256:e3d78b13db9512aeb106e44a334ab42b7aa48d26c097299084084cb8be5c5568,4888
pantry-tree:
size: 3971
sha256: 1981a732d1917213de7f51d26255af733a61918c59eebb6c6f6ca939856839ef
original:
hackage: network-3.1.2.7
snapshots:
- completed:
sha256: 428ec8d5ce932190d3cbe266b9eb3c175cd81e984babf876b64019e2cbe4ea68
size: 590100
url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/18/28.yaml
sha256: 428ec8d5ce932190d3cbe266b9eb3c175cd81e984babf876b64019e2cbe4ea68
original: lts-18.28


@ -54,6 +54,7 @@ dependencies:
- nonempty-containers
- open-browser
- pretty-simple
- process
- random >= 1.2.0
- regex-tdfa
- semialign
@ -117,7 +118,7 @@ executables:
other-modules: Paths_unison_cli
source-dirs: unison
main: Main.hs
ghc-options: -threaded -rtsopts -with-rtsopts=-I0 -optP-Wno-nonportable-include-path
ghc-options: -threaded -rtsopts "-with-rtsopts=-I0 -N -qn4 -qg1 -T" -optP-Wno-nonportable-include-path
dependencies:
- code-page
- optparse-applicative >= 0.16.1.0
@ -132,7 +133,7 @@ executables:
other-modules: Paths_unison_cli
source-dirs: transcripts
main: Transcripts.hs
ghc-options: -threaded -rtsopts -with-rtsopts=-N -v0
ghc-options: -threaded -rtsopts "-with-rtsopts=-I0 -N -qn4 -qg1" -v0
dependencies:
- code-page
- easytest


@ -251,11 +251,11 @@ assertNoBranchAtPath' path' = do
branchExistsAtPath' :: Path' -> Cli Bool
branchExistsAtPath' path' = do
absPath <- resolvePath' path'
Cli.Env {codebase} <- ask
causal <- liftIO $ Codebase.getShallowCausalFromRoot codebase Nothing (Path.unabsolute absPath)
branch <- liftIO $ V2Causal.value causal
isEmpty <- Cli.runTransaction $ V2Branch.isEmpty branch
pure (not isEmpty)
Cli.runTransaction do
causal <- Codebase.getShallowCausalFromRoot Nothing (Path.unabsolute absPath)
branch <- V2Causal.value causal
isEmpty <- V2Branch.isEmpty branch
pure (not isEmpty)
------------------------------------------------------------------------------------------------------------------------
-- Updating branches


@ -2,6 +2,7 @@ module Unison.Cli.TypeCheck
( typecheck,
typecheckHelper,
typecheckFile,
typecheckTerm,
)
where
@ -19,11 +20,13 @@ import Unison.Parser.Ann (Ann (..))
import Unison.Prelude
import qualified Unison.Result as Result
import qualified Unison.Sqlite as Sqlite
import Unison.Symbol (Symbol)
import Unison.Symbol (Symbol (Symbol))
import qualified Unison.Syntax.Lexer as L
import qualified Unison.Syntax.Parser as Parser
import Unison.Term (Term)
import Unison.Type (Type)
import qualified Unison.UnisonFile as UF
import qualified Unison.Var as Var
typecheck ::
[Type Symbol Ann] ->
@ -70,6 +73,39 @@ typecheckHelper codebase generateUniqueName ambient names sourceName source = do
(Text.unpack sourceName)
(fst source)
typecheckTerm ::
Term Symbol Ann ->
Cli
( Result.Result
(Seq (Result.Note Symbol Ann))
(Type Symbol Ann)
)
typecheckTerm tm = do
Cli.Env {codebase} <- ask
let v = Symbol 0 (Var.Inference Var.Other)
liftIO $
fmap extract
<$> Codebase.runTransaction codebase (typecheckFile' codebase [] (UF.UnisonFileId mempty mempty [(v, tm)] mempty))
where
extract tuf
| [[(_, _, ty)]] <- UF.topLevelComponents' tuf = ty
| otherwise = error "internal error: typecheckTerm"
typecheckFile' ::
Codebase m Symbol Ann ->
[Type Symbol Ann] ->
UF.UnisonFile Symbol Ann ->
Sqlite.Transaction
( Result.Result
(Seq (Result.Note Symbol Ann))
(UF.TypecheckedUnisonFile Symbol Ann)
)
typecheckFile' codebase ambient file = do
typeLookup <-
(<> Builtin.typeLookup)
<$> Codebase.typeLookupForDependencies codebase (UF.dependencies file)
pure $ synthesizeFile' ambient typeLookup file
typecheckFile ::
Codebase m Symbol Ann ->
[Type Symbol Ann] ->
@ -79,8 +115,5 @@ typecheckFile ::
(Seq (Result.Note Symbol Ann))
(Either Names (UF.TypecheckedUnisonFile Symbol Ann))
)
typecheckFile codebase ambient file = do
typeLookup <-
(<> Builtin.typeLookup)
<$> Codebase.typeLookupForDependencies codebase (UF.dependencies file)
pure . fmap Right $ synthesizeFile' ambient typeLookup file
typecheckFile codebase ambient file =
fmap Right <$> typecheckFile' codebase ambient file


@ -7,6 +7,7 @@ where
import Control.Concurrent.STM (atomically, modifyTVar', newTVarIO, readTVar, readTVarIO)
import qualified Control.Error.Util as ErrorUtil
import Control.Exception (catch)
import Control.Lens
import Control.Monad.Reader (ask)
import Control.Monad.State (StateT)
@ -27,7 +28,15 @@ import qualified Data.Text as Text
import Data.Time (UTCTime)
import Data.Tuple.Extra (uncurry3)
import qualified System.Console.Regions as Console.Regions
import System.Directory
( XdgDirectory (..),
createDirectoryIfMissing,
doesFileExist,
getXdgDirectory,
)
import System.Environment (withArgs)
import System.FilePath ((</>))
import System.Process (callCommand, readCreateProcess, shell)
import qualified Text.Megaparsec as P
import qualified U.Codebase.Branch.Diff as V2Branch
import qualified U.Codebase.Causal as V2Causal
@ -45,9 +54,9 @@ import qualified Unison.Builtin.Terms as Builtin
import Unison.Cli.Monad (Cli)
import qualified Unison.Cli.Monad as Cli
import qualified Unison.Cli.MonadUtils as Cli
import Unison.Cli.NamesUtils (basicParseNames, basicPrettyPrintNamesA, displayNames, findHistoricalHQs, getBasicPrettyPrintNames, makeHistoricalParsingNames, makePrintNamesFromLabeled', makeShadowedPrintNamesFromHQ)
import Unison.Cli.NamesUtils (basicParseNames, displayNames, findHistoricalHQs, getBasicPrettyPrintNames, makeHistoricalParsingNames, makePrintNamesFromLabeled', makeShadowedPrintNamesFromHQ)
import Unison.Cli.PrettyPrintUtils (currentPrettyPrintEnvDecl, prettyPrintEnvDecl)
import Unison.Cli.TypeCheck (typecheck)
import Unison.Cli.TypeCheck (typecheck, typecheckTerm)
import Unison.Cli.UnisonConfigUtils (gitUrlKey, remoteMappingKey)
import Unison.Codebase (Codebase, Preprocessing (..), PushGitBranchOpts (..))
import qualified Unison.Codebase as Codebase
@ -66,6 +75,11 @@ import Unison.Codebase.Editor.HandleInput.MetadataUtils (addDefaultMetadata, man
import Unison.Codebase.Editor.HandleInput.MoveBranch (doMoveBranch)
import qualified Unison.Codebase.Editor.HandleInput.NamespaceDependencies as NamespaceDependencies
import Unison.Codebase.Editor.HandleInput.NamespaceDiffUtils (diffHelper)
import Unison.Codebase.Editor.HandleInput.TermResolution
( resolveCon,
resolveMainRef,
resolveTermRef,
)
import Unison.Codebase.Editor.HandleInput.Update (doSlurpAdds, handleUpdate)
import Unison.Codebase.Editor.Input
import qualified Unison.Codebase.Editor.Input as Input
@ -166,6 +180,7 @@ import Unison.Symbol (Symbol)
import qualified Unison.Sync.Types as Share (Path (..), hashJWTHash)
import qualified Unison.Syntax.Lexer as L
import qualified Unison.Syntax.Parser as Parser
import qualified Unison.Syntax.TermPrinter as TP
import Unison.Term (Term)
import qualified Unison.Term as Term
import Unison.Type (Type)
@ -373,7 +388,7 @@ loop e = do
Cli.respond $ PrintMessage pretty
ShowReflogI -> do
let numEntriesToShow = 500
      entries <-
Cli.runTransaction do
schLength <- Codebase.branchHashLength
Codebase.getReflog numEntriesToShow <&> fmap (first $ SCH.fromHash schLength)
@ -1157,24 +1172,14 @@ loop e = do
Cli.respond (RunResult ppe mainRes)
MakeStandaloneI output main -> do
Cli.Env {codebase, runtime} <- ask
let mainType = Runtime.mainType runtime
parseNames <-
flip NamesWithHistory.NamesWithHistory mempty <$> basicPrettyPrintNamesA
ppe <- suffixifiedPPE parseNames
let resolved = toList $ NamesWithHistory.lookupHQTerm main parseNames
smain = HQ.toString main
filtered <-
Cli.runTransaction do
catMaybes
<$> traverse (\r -> fmap (r,) <$> loadTypeOfTerm codebase r) resolved
case filtered of
[(Referent.Ref ref, ty)]
| Typechecker.fitsScheme ty mainType -> do
let codeLookup = () <$ Codebase.toCodeLookup codebase
whenJustM (liftIO (Runtime.compileTo runtime codeLookup ppe ref (output <> ".uc"))) \err ->
Cli.returnEarly (EvaluationFailure err)
| otherwise -> Cli.returnEarly (BadMainFunction smain ty ppe [mainType])
_ -> Cli.returnEarly (NoMainFunction smain ppe [mainType])
(ref, ppe) <- resolveMainRef main
let codeLookup = () <$ Codebase.toCodeLookup codebase
whenJustM (liftIO (Runtime.compileTo runtime codeLookup ppe ref (output <> ".uc"))) \err ->
Cli.returnEarly (EvaluationFailure err)
CompileSchemeI output main -> doCompileScheme output main
ExecuteSchemeI main -> doRunAsScheme main
GenSchemeLibsI -> doGenerateSchemeBoot True Nothing
FetchSchemeCompilerI -> doFetchCompiler
IOTestI main -> do
Cli.Env {codebase, runtime} <- ask
-- todo - allow this to run tests from scratch file, using addRunMain
@ -1261,46 +1266,9 @@ loop e = do
patch <- Cli.getPatchAt (fromMaybe Cli.defaultPatchPath maybePath)
ppe <- suffixifiedPPE =<< makePrintNamesFromLabeled' (Patch.labeledDependencies patch)
Cli.respondNumbered $ ListEdits patch ppe
PullRemoteBranchI mayRepo path syncMode pullMode verbosity -> do
Cli.Env {codebase} <- ask
let preprocess = case pullMode of
Input.PullWithHistory -> Unmodified
Input.PullWithoutHistory -> Preprocessed $ pure . Branch.discardHistory
ns <- maybe (writePathToRead <$> resolveConfiguredUrl Pull path) pure mayRepo
remoteBranch <- case ns of
ReadRemoteNamespaceGit repo ->
Cli.ioE (Codebase.importRemoteBranch codebase repo syncMode preprocess) \err ->
Cli.returnEarly (Output.GitError err)
ReadRemoteNamespaceShare repo -> importRemoteShareBranch repo
description <- inputDescription input
let unchangedMsg = PullAlreadyUpToDate ns path
destAbs <- Cli.resolvePath' path
let printDiffPath = if Verbosity.isSilent verbosity then Nothing else Just path
case pullMode of
Input.PullWithHistory -> do
destBranch <- Cli.getBranch0At destAbs
if Branch.isEmpty0 destBranch
then do
void $ Cli.updateAtM description destAbs (const $ pure remoteBranch)
Cli.respond $ MergeOverEmpty path
else
mergeBranchAndPropagateDefaultPatch
Branch.RegularMerge
description
(Just unchangedMsg)
remoteBranch
printDiffPath
destAbs
Input.PullWithoutHistory -> do
didUpdate <-
Cli.updateAtM
description
destAbs
(\destBranch -> pure $ remoteBranch `Branch.consBranchSnapshot` destBranch)
Cli.respond
if didUpdate
then PullSuccessful ns path
else unchangedMsg
PullRemoteBranchI mRepo path sMode pMode verbosity ->
inputDescription input
>>= doPullRemoteBranch mRepo path sMode pMode verbosity
PushRemoteBranchI pushRemoteBranchInput -> handlePushRemoteBranch pushRemoteBranchInput
ListDependentsI hq -> handleDependents hq
ListDependenciesI hq -> do
@ -1413,14 +1381,13 @@ loop e = do
traceM $ show name ++ ",Type," ++ Text.unpack (Reference.toText r)
for_ (Relation.toList . Branch.deepTerms $ rootBranch0) \(r, name) ->
traceM $ show name ++ ",Term," ++ Text.unpack (Referent.toText r)
    DebugClearWatchI {} ->
Cli.runTransaction Codebase.clearWatches
DebugDoctorI {} -> do
r <- Cli.runTransaction IntegrityCheck.integrityCheckFullCodebase
Cli.respond (IntegrityCheck r)
DebugNameDiffI fromSCH toSCH -> do
Cli.Env {codebase} <- ask
      (schLen, fromCHs, toCHs) <-
Cli.runTransaction do
schLen <- Codebase.branchHashLength
fromCHs <- Codebase.causalHashesByPrefix fromSCH
@ -1432,12 +1399,13 @@ loop e = do
(_, []) -> Cli.returnEarly $ Output.NoBranchWithHash toSCH
(_, (_ : _ : _)) -> Cli.returnEarly $ Output.BranchHashAmbiguous toSCH (Set.map (SCH.fromHash schLen) toCHs)
([fromCH], [toCH]) -> pure (fromCH, toCH)
output <- liftIO do
fromBranch <- (Codebase.getShallowCausalForHash codebase $ Cv.causalHash1to2 fromCH) >>= V2Causal.value
toBranch <- (Codebase.getShallowCausalForHash codebase $ Cv.causalHash1to2 toCH) >>= V2Causal.value
treeDiff <- V2Branch.diffBranches fromBranch toBranch
let nameChanges = V2Branch.nameChanges Nothing treeDiff
pure (DisplayDebugNameDiff nameChanges)
output <-
Cli.runTransaction do
fromBranch <- (Codebase.expectCausalBranchByCausalHash $ Cv.causalHash1to2 fromCH) >>= V2Causal.value
toBranch <- (Codebase.expectCausalBranchByCausalHash $ Cv.causalHash1to2 toCH) >>= V2Causal.value
treeDiff <- V2Branch.diffBranches fromBranch toBranch
let nameChanges = V2Branch.nameChanges Nothing treeDiff
pure (DisplayDebugNameDiff nameChanges)
Cli.respond output
DeprecateTermI {} -> Cli.respond NotImplemented
DeprecateTypeI {} -> Cli.respond NotImplemented
@ -1564,6 +1532,10 @@ inputDescription input =
MergeBuiltinsI -> pure "builtins.merge"
MergeIOBuiltinsI -> pure "builtins.mergeio"
MakeStandaloneI out nm -> pure ("compile " <> Text.pack out <> " " <> HQ.toText nm)
ExecuteSchemeI nm -> pure ("run.native " <> HQ.toText nm)
CompileSchemeI fi nm -> pure ("compile.native " <> HQ.toText nm <> " " <> Text.pack fi)
GenSchemeLibsI -> pure "compile.native.genlibs"
FetchSchemeCompilerI -> pure "compile.native.fetch"
PullRemoteBranchI orepo dest0 _syncMode pullMode _ -> do
dest <- p' dest0
let command =
@ -2068,7 +2040,7 @@ handleTest TestInput {includeLibNamespace, showFailures, showSuccesses} = do
fmap Map.fromList do
Set.toList testRefs & wither \case
Reference.Builtin _ -> pure Nothing
r@(Reference.DerivedId rid) -> liftIO (fmap (r,) <$> Codebase.getWatch codebase WK.TestWatch rid)
r@(Reference.DerivedId rid) -> fmap (r,) <$> Cli.runTransaction (Codebase.getWatch codebase WK.TestWatch rid)
let stats = Output.CachedTests (Set.size testRefs) (Map.size cachedTests)
names <-
makePrintNamesFromLabeled' $
@ -2516,6 +2488,266 @@ mergeBranchAndPropagateDefaultPatch mode inputDescription unchangedMessage srcb
Cli.respondNumbered (ShowDiffAfterMerge dest0 dest ppe diff)
pure b
basicPPE :: Cli PPE.PrettyPrintEnv
basicPPE = do
parseNames <-
flip NamesWithHistory.NamesWithHistory mempty
<$> basicParseNames
suffixifiedPPE parseNames
compilerPath :: Path.Path'
compilerPath = Path.Path' {Path.unPath' = Left abs}
where
segs = NameSegment <$> ["unison", "internal"]
rootPath = Path.Path {Path.toSeq = Seq.fromList segs}
abs = Path.Absolute {Path.unabsolute = rootPath}
doFetchCompiler :: Cli ()
doFetchCompiler =
inputDescription pullInput
>>= doPullRemoteBranch
repo
compilerPath
SyncMode.Complete
Input.PullWithoutHistory
Verbosity.Silent
where
-- fetching info
ns =
ReadShareRemoteNamespace
{ server = RemoteRepo.DefaultCodeserver,
repo = "dolio",
path =
Path.fromList $ NameSegment <$> ["public", "internal", "trunk"]
}
repo = Just $ ReadRemoteNamespaceShare ns
pullInput =
PullRemoteBranchI
repo
compilerPath
SyncMode.Complete
Input.PullWithoutHistory
Verbosity.Silent
ensureCompilerExists :: Cli ()
ensureCompilerExists =
Cli.branchExistsAtPath' compilerPath
>>= flip unless doFetchCompiler
getCacheDir :: Cli String
getCacheDir = liftIO $ getXdgDirectory XdgCache "unisonlanguage"
getSchemeGenLibDir :: Cli String
getSchemeGenLibDir =
Cli.getConfig "SchemeLibs.Generated" >>= \case
Just dir -> pure dir
Nothing -> (</> "scheme-libs") <$> getCacheDir
getSchemeStaticLibDir :: Cli String
getSchemeStaticLibDir =
Cli.getConfig "SchemeLibs.Static" >>= \case
Just dir -> pure dir
Nothing ->
liftIO $
getXdgDirectory XdgData ("unisonlanguage" </> "scheme-libs")
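getSchemeGenLibDir and getSchemeStaticLibDir fall back to XDG locations when no config entry is set: generated libraries under the cache directory, static libraries under the data directory. A minimal sketch of those defaults outside the Cli monad (directory layout only; actual paths are platform-dependent):

    import System.Directory (XdgDirectory (..), getXdgDirectory)
    import System.FilePath ((</>))

    -- (generated libs dir, static libs dir) with no config overrides.
    defaultSchemeLibDirs :: IO (FilePath, FilePath)
    defaultSchemeLibDirs = do
      gen <- (</> "scheme-libs") <$> getXdgDirectory XdgCache "unisonlanguage"
      static <- getXdgDirectory XdgData ("unisonlanguage" </> "scheme-libs")
      pure (gen, static)

    main :: IO ()
    main = defaultSchemeLibDirs >>= print
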
doGenerateSchemeBoot :: Bool -> Maybe PPE.PrettyPrintEnv -> Cli ()
doGenerateSchemeBoot force mppe = do
ppe <- maybe basicPPE pure mppe
dir <- getSchemeGenLibDir
let bootf = dir </> "unison" </> "boot-generated.ss"
binf = dir </> "unison" </> "builtin-generated.ss"
dirTm = Term.text a (Text.pack dir)
liftIO $ createDirectoryIfMissing True dir
saveBase <- Term.ref a <$> resolveTermRef sbName
gen ppe saveBase bootf dirTm bootName
gen ppe saveBase binf dirTm builtinName
where
a = External
hq nm
| Just hqn <- HQ.fromString nm = hqn
| otherwise = error $ "internal error: cannot hash qualify: " ++ nm
sbName = hq ".unison.internal.compiler.scheme.saveBaseFile"
bootName = hq ".unison.internal.compiler.scheme.bootSpec"
builtinName = hq ".unison.internal.compiler.scheme.builtinSpec"
gen ppe save file dir nm =
liftIO (doesFileExist file) >>= \b -> when (not b || force) do
spec <- Term.ref a <$> resolveTermRef nm
let make = Term.apps' save [dir, spec]
typecheckAndEval ppe make
typecheckAndEval :: PPE.PrettyPrintEnv -> Term Symbol Ann -> Cli ()
typecheckAndEval ppe tm = do
Cli.Env {runtime} <- ask
let mty = Runtime.mainType runtime
typecheckTerm (Term.delay a tm) >>= \case
-- Type checking succeeded
Result.Result _ (Just ty)
| Typechecker.fitsScheme ty mty ->
() <$ evalUnisonTerm False ppe False tm
| otherwise ->
Cli.returnEarly $ BadMainFunction rendered ty ppe [mty]
Result.Result notes Nothing -> do
currentPath <- Cli.getCurrentPath
let tes = [err | Result.TypeError err <- toList notes]
Cli.returnEarly (TypeErrors currentPath (Text.pack rendered) ppe tes)
where
a = External
rendered = P.toPlainUnbroken $ TP.pretty ppe tm
ensureSchemeExists :: Cli ()
ensureSchemeExists =
liftIO callScheme >>= \case
True -> pure ()
False -> Cli.returnEarly (PrintMessage msg)
where
msg =
P.lines
[ "I can't seem to call scheme. See",
"",
P.indentN
2
"https://github.com/cisco/ChezScheme/blob/main/BUILDING",
"",
"for how to install Chez Scheme."
]
callScheme =
catch
(True <$ readCreateProcess (shell "scheme -q") "")
(\(_ :: IOException) -> pure False)
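ensureSchemeExists probes for a working scheme executable by running it on empty input and treating any IOException as "not installed". The same probe as a standalone function (the command string is just an example):

    {-# LANGUAGE ScopedTypeVariables #-}

    import Control.Exception (IOException, catch)
    import System.Process (readCreateProcess, shell)

    -- True if the command can be started and run to completion on empty input.
    executableResponds :: String -> IO Bool
    executableResponds cmd =
      (True <$ readCreateProcess (shell cmd) "")
        `catch` \(_ :: IOException) -> pure False

    main :: IO ()
    main = executableResponds "scheme -q" >>= print
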
runScheme :: String -> Cli ()
runScheme file = do
ensureSchemeExists
gendir <- getSchemeGenLibDir
statdir <- getSchemeStaticLibDir
let includes = gendir ++ ":" ++ statdir
lib = "--libdirs " ++ includes
opt = "--optimize-level 3"
cmd = "scheme -q " ++ opt ++ " " ++ lib ++ " --script " ++ file
success <-
liftIO $
(True <$ callCommand cmd) `catch` \(_ :: IOException) ->
pure False
unless success $
Cli.returnEarly (PrintMessage "Scheme evaluation failed.")
buildScheme :: String -> String -> Cli ()
buildScheme main file = do
ensureSchemeExists
statDir <- getSchemeStaticLibDir
genDir <- getSchemeGenLibDir
let cmd = shell "scheme -q --optimize-level 3"
void . liftIO $ readCreateProcess cmd (build statDir genDir)
where
surround s = '"' : s ++ "\""
parens s = '(' : s ++ ")"
lns dir nms = surround . ln dir <$> nms
ln dir nm = dir </> "unison" </> (nm ++ ".ss")
static = ["core", "cont", "bytevector", "string", "primops", "boot"]
gen = ["boot-generated", "builtin-generated"]
bootf = surround $ main ++ ".boot"
base = "'(\"scheme\" \"petite\")"
build sd gd =
parens . List.intercalate " " $
["make-boot-file", bootf, base]
++ lns sd static
++ lns gd gen
++ [surround file]
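buildScheme hands Chez a single make-boot-file expression naming the boot output, the base boot files, every static and generated library file, and the compiled .scm file. A self-contained rendering of that expression with made-up directories, just to show its shape:

    import Data.List (intercalate)
    import System.FilePath ((</>))

    bootFileExpr :: String -> FilePath -> FilePath -> FilePath -> String
    bootFileExpr progName statDir genDir scmFile =
      parens . intercalate " " $
        ["make-boot-file", quote (progName ++ ".boot"), "'(\"scheme\" \"petite\")"]
          ++ [quote (statDir </> "unison" </> (m ++ ".ss")) | m <- static]
          ++ [quote (genDir </> "unison" </> (m ++ ".ss")) | m <- gen]
          ++ [quote scmFile]
      where
        parens s = '(' : s ++ ")"
        quote s = '"' : s ++ "\""
        static = ["core", "cont", "bytevector", "string", "primops", "boot"]
        gen = ["boot-generated", "builtin-generated"]

    main :: IO ()
    main = putStrLn (bootFileExpr "myprog" "/static/libs" "/gen/libs" "myprog.scm")
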
doRunAsScheme :: HQ.HashQualified Name -> Cli ()
doRunAsScheme main = do
fullpath <- generateSchemeFile (HQ.toString main) main
runScheme fullpath
doCompileScheme :: String -> HQ.HashQualified Name -> Cli ()
doCompileScheme out main =
generateSchemeFile out main >>= buildScheme out
generateSchemeFile :: String -> HQ.HashQualified Name -> Cli String
generateSchemeFile out main = do
(comp, ppe) <- resolveMainRef main
ensureCompilerExists
doGenerateSchemeBoot False $ Just ppe
cacheDir <- getCacheDir
liftIO $ createDirectoryIfMissing True (cacheDir </> "scheme-tmp")
let scratch = out ++ ".scm"
fullpath = cacheDir </> "scheme-tmp" </> scratch
output = Text.pack fullpath
sscm <- Term.ref a <$> resolveTermRef saveNm
fprf <- resolveCon filePathNm
let toCmp = Term.termLink a (Referent.Ref comp)
outTm = Term.text a output
fpc = Term.constructor a fprf
fp = Term.app a fpc outTm
tm :: Term Symbol Ann
tm = Term.apps' sscm [toCmp, fp]
typecheckAndEval ppe tm
pure fullpath
where
a = External
hq nm
| Just hqn <- HQ.fromString nm = hqn
| otherwise = error $ "internal error: cannot hash qualify: " ++ nm
saveNm = hq ".unison.internal.compiler.saveScheme"
filePathNm = hq "FilePath.FilePath"
doPullRemoteBranch ::
Maybe ReadRemoteNamespace ->
Path' ->
SyncMode.SyncMode ->
PullMode ->
Verbosity.Verbosity ->
Text ->
Cli ()
doPullRemoteBranch mayRepo path syncMode pullMode verbosity description = do
Cli.Env {codebase} <- ask
let preprocess = case pullMode of
Input.PullWithHistory -> Unmodified
Input.PullWithoutHistory -> Preprocessed $ pure . Branch.discardHistory
ns <- maybe (writePathToRead <$> resolveConfiguredUrl Pull path) pure mayRepo
remoteBranch <- case ns of
ReadRemoteNamespaceGit repo ->
Cli.ioE (Codebase.importRemoteBranch codebase repo syncMode preprocess) \err ->
Cli.returnEarly (Output.GitError err)
ReadRemoteNamespaceShare repo -> importRemoteShareBranch repo
let unchangedMsg = PullAlreadyUpToDate ns path
destAbs <- Cli.resolvePath' path
let printDiffPath = if Verbosity.isSilent verbosity then Nothing else Just path
case pullMode of
Input.PullWithHistory -> do
destBranch <- Cli.getBranch0At destAbs
if Branch.isEmpty0 destBranch
then do
void $ Cli.updateAtM description destAbs (const $ pure remoteBranch)
Cli.respond $ MergeOverEmpty path
else
mergeBranchAndPropagateDefaultPatch
Branch.RegularMerge
description
(Just unchangedMsg)
remoteBranch
printDiffPath
destAbs
Input.PullWithoutHistory -> do
didUpdate <-
Cli.updateAtM
description
destAbs
(\destBranch -> pure $ remoteBranch `Branch.consBranchSnapshot` destBranch)
Cli.respond
if didUpdate
then PullSuccessful ns path
else unchangedMsg
loadPropagateDiffDefaultPatch ::
Text ->
Maybe Path.Path' ->
@ -2974,7 +3206,7 @@ evalUnisonFile sandbox ppe unisonFile args = do
let watchCache :: Reference.Id -> IO (Maybe (Term Symbol ()))
watchCache ref = do
maybeTerm <- Codebase.lookupWatchCache codebase ref
maybeTerm <- Codebase.runTransaction codebase (Codebase.lookupWatchCache codebase ref)
pure (Term.amap (\(_ :: Ann) -> ()) <$> maybeTerm)
Cli.with_ (withArgs args) do
@ -3000,7 +3232,7 @@ evalUnisonTermE sandbox ppe useCache tm = do
let watchCache :: Reference.Id -> IO (Maybe (Term Symbol ()))
watchCache ref = do
maybeTerm <- Codebase.lookupWatchCache codebase ref
maybeTerm <- Codebase.runTransaction codebase (Codebase.lookupWatchCache codebase ref)
pure (Term.amap (\(_ :: Ann) -> ()) <$> maybeTerm)
let cache = if useCache then watchCache else Runtime.noCache


@ -0,0 +1,123 @@
module Unison.Codebase.Editor.HandleInput.TermResolution
( lookupTermRefs,
lookupTermRefWithType,
resolveCon,
resolveTerm,
resolveTermRef,
resolveMainRef,
)
where
import Control.Monad.Reader (ask)
import Control.Monad.Trans (liftIO)
import Data.Maybe (catMaybes, fromJust)
import Data.Set (fromList, toList)
import Unison.Cli.Monad (Cli)
import qualified Unison.Cli.Monad as Cli
import Unison.Cli.NamesUtils (basicParseNames, basicPrettyPrintNamesA)
import qualified Unison.Codebase as Codebase
import Unison.Codebase.Editor.Output (Output (..))
import Unison.Codebase.Path (hqSplitFromName')
import qualified Unison.Codebase.Runtime as Runtime
import Unison.ConstructorReference
import qualified Unison.HashQualified as HQ
import Unison.Name (Name)
import Unison.Names (Names)
import Unison.NamesWithHistory
( NamesWithHistory (..),
lookupHQTerm,
)
import Unison.Parser.Ann (Ann)
import Unison.PrettyPrintEnv (PrettyPrintEnv)
import Unison.PrettyPrintEnv.Names (fromSuffixNames)
import Unison.Reference (Reference)
import Unison.Referent (Referent, pattern Con, pattern Ref)
import Unison.Symbol (Symbol)
import Unison.Type (Type)
import qualified Unison.Typechecker as Typechecker
addHistory :: Names -> NamesWithHistory
addHistory names = NamesWithHistory names mempty
lookupTerm :: HQ.HashQualified Name -> Names -> [Referent]
lookupTerm hq parseNames = toList (lookupHQTerm hq hnames)
where
hnames = addHistory parseNames
lookupCon ::
HQ.HashQualified Name ->
Names ->
([ConstructorReference], [Referent])
lookupCon hq parseNames =
unzip . catMaybes . fmap extract $ lookupTerm hq parseNames
where
extract rt@(Con rf _) = Just (rf, rt)
extract _ = Nothing
lookupTermRefs ::
HQ.HashQualified Name -> Names -> ([Reference], [Referent])
lookupTermRefs hq parseNames =
unzip . catMaybes . fmap extract $ lookupTerm hq parseNames
where
extract rt@(Ref rf) = Just (rf, rt)
extract _ = Nothing
lookupTermRefWithType ::
Codebase.Codebase IO Symbol Ann ->
HQ.HashQualified Name ->
Cli [(Reference, Type Symbol Ann)]
lookupTermRefWithType codebase name = do
nms <- basicParseNames
liftIO
. Codebase.runTransaction codebase
. fmap catMaybes
. traverse annot
. fst
$ lookupTermRefs name nms
where
annot tm =
fmap ((,) tm) <$> Codebase.getTypeOfTerm codebase tm
resolveTerm :: HQ.HashQualified Name -> Cli Referent
resolveTerm name =
basicParseNames >>= \nms ->
case lookupTerm name nms of
[] -> Cli.returnEarly (TermNotFound $ fromJust parsed)
where
parsed = hqSplitFromName' =<< HQ.toName name
[rf] -> pure rf
rfs -> Cli.returnEarly (TermAmbiguous name (fromList rfs))
resolveCon :: HQ.HashQualified Name -> Cli ConstructorReference
resolveCon name =
basicParseNames >>= \nms ->
case lookupCon name nms of
([], _) -> Cli.returnEarly (TermNotFound $ fromJust parsed)
where
parsed = hqSplitFromName' =<< HQ.toName name
([co], _) -> pure co
(_, rfts) -> Cli.returnEarly (TermAmbiguous name (fromList rfts))
resolveTermRef :: HQ.HashQualified Name -> Cli Reference
resolveTermRef name =
basicParseNames >>= \nms ->
case lookupTermRefs name nms of
([], _) -> Cli.returnEarly (TermNotFound $ fromJust parsed)
where
parsed = hqSplitFromName' =<< HQ.toName name
([rf], _) -> pure rf
(_, rfts) -> Cli.returnEarly (TermAmbiguous name (fromList rfts))
resolveMainRef :: HQ.HashQualified Name -> Cli (Reference, PrettyPrintEnv)
resolveMainRef main = do
Cli.Env {codebase, runtime} <- ask
let mainType = Runtime.mainType runtime
smain = HQ.toString main
parseNames <- basicPrettyPrintNamesA
k <- Cli.runTransaction Codebase.hashLength
let ppe = fromSuffixNames k (addHistory parseNames)
lookupTermRefWithType codebase main >>= \case
[(rf, ty)]
| Typechecker.fitsScheme ty mainType -> pure (rf, ppe)
| otherwise -> Cli.returnEarly (BadMainFunction smain ty ppe [mainType])
_ -> Cli.returnEarly (NoMainFunction smain ppe [mainType])
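lookupCon and lookupTermRefs above both use the same partition-by-shape trick: map each referent to Maybe (payload, original), drop the misses, and unzip. The trick in isolation, on a tiny stand-in type rather than Unison's Referent:

    import Data.Maybe (mapMaybe)

    data Referent r = Ref r | Con r Int deriving (Show)

    -- Keep only plain references, but remember the referent each came from.
    refsWithReferents :: [Referent r] -> ([r], [Referent r])
    refsWithReferents = unzip . mapMaybe extract
      where
        extract rt@(Ref rf) = Just (rf, rt)
        extract _ = Nothing
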


@ -155,6 +155,14 @@ data Input
IOTestI (HQ.HashQualified Name)
| -- make a standalone binary file
MakeStandaloneI String (HQ.HashQualified Name)
| -- execute an IO thunk using scheme
ExecuteSchemeI (HQ.HashQualified Name)
| -- compile to a scheme file
CompileSchemeI String (HQ.HashQualified Name)
| -- generate scheme libraries
GenSchemeLibsI
| -- fetch scheme compiler
FetchSchemeCompilerI
| TestI TestInput
| -- metadata
-- `link metadata definitions` (adds metadata to all of `definitions`)


@ -36,13 +36,14 @@ import qualified Unison.DataDeclaration as Decl
import Unison.FileParsers (synthesizeFile')
import Unison.Hash (Hash)
import qualified Unison.Hashing.V2.Convert as Hashing
import Unison.Name (Name)
import qualified Unison.Name as Name
import Unison.NameSegment (NameSegment)
import Unison.Names (Names)
import qualified Unison.Names as Names
import Unison.Parser.Ann (Ann (..))
import Unison.Prelude
import Unison.Reference (Reference (..))
import Unison.Reference (Reference (..), TermReference, TypeReference)
import qualified Unison.Reference as Reference
import Unison.Referent (Referent)
import qualified Unison.Referent as Referent
@ -58,6 +59,7 @@ import Unison.UnisonFile (UnisonFile (..))
import qualified Unison.UnisonFile as UF
import Unison.Util.Monoid (foldMapM)
import qualified Unison.Util.Relation as R
import qualified Unison.Util.Set as Set
import qualified Unison.Util.Star3 as Star3
import Unison.Util.TransitiveClosure (transitiveClosure)
import Unison.Var (Var)
@ -240,14 +242,7 @@ propagate patch b = case validatePatch patch of
-- TODO: this can be removed once patches have term replacement of type `Referent -> Referent`
rootNames <- Branch.toNames <$> Cli.getRootBranch0
let entireBranch =
Set.union
(Branch.deepTypeReferences b)
( Set.fromList
[r | Referent.Ref r <- Set.toList $ Branch.deepReferents b]
)
-- TODO: these are just used for tracing, could be deleted if we don't care
let -- TODO: these are just used for tracing, could be deleted if we don't care
-- about printing meaningful names for definitions during propagation, or if
-- we want to just remove the tracing.
refName r =
@ -270,7 +265,8 @@ propagate patch b = case validatePatch patch of
computeDirty
(Codebase.dependents Queries.ExcludeOwnComponent)
patch
(Names.contains names0)
-- Dirty reference predicate: does the reference have a name in this branch that isn't in the "lib" namespace?
(Names.contains (Names.filter nameNotInLibNamespace (Branch.toNames b)))
let initialTypeReplacements = Map.mapMaybe TypeEdit.toReference initialTypeEdits
-- TODO: once patches can directly contain constructor replacements, this
@ -278,7 +274,16 @@ propagate patch b = case validatePatch patch of
-- in the patch which have a `Referent.Con` as their LHS.
initialCtorMappings <- genInitialCtorMapping rootNames initialTypeReplacements
order <- sortDependentsGraph initialDirty entireBranch
order <-
let restrictToTypes :: Set TypeReference
restrictToTypes =
R.dom (R.filterRan nameNotInLibNamespace (Branch.deepTypes b))
restrictToTerms :: Set TermReference
restrictToTerms =
Set.mapMaybe Referent.toTermReference (R.dom (R.filterRan nameNotInLibNamespace (Branch.deepTerms b)))
in sortDependentsGraph
initialDirty
(Set.union restrictToTypes restrictToTerms)
let getOrdered :: Set Reference -> Map Int Reference
getOrdered rs =
@ -478,7 +483,6 @@ propagate patch b = case validatePatch patch of
(zip (view _1 . getReference <$> Graph.topSort graph) [0 ..])
-- vertex i precedes j whenever i has an edge to j and not vice versa.
-- vertex i precedes j when j is a dependent of i.
names0 = Branch.toNames b
validatePatch ::
Patch -> Maybe (Map Reference TermEdit, Map Reference TypeEdit)
validatePatch p =
@ -600,12 +604,20 @@ applyDeprecations patch =
-- | Things in the patch are not marked as propagated changes, but every other
-- definition that is created by the `Edits` which is passed in is marked as
-- a propagated change.
applyPropagate :: Applicative m => Patch -> Edits Symbol -> Branch0 m -> Branch0 m
applyPropagate patch Edits {..} = do
applyPropagate :: forall m. Applicative m => Patch -> Edits Symbol -> Branch0 m -> Branch0 m
applyPropagate patch Edits {newTerms, termReplacements, typeReplacements, constructorReplacements} = do
let termTypes = Map.map (Hashing.typeToReference . snd) newTerms
-- recursively update names and delete deprecated definitions
Branch.stepEverywhere (updateLevel termReplacements typeReplacements termTypes)
stepEverywhereButLib (updateLevel termReplacements typeReplacements termTypes)
where
-- Like Branch.stepEverywhere, but don't step the child named "lib"
stepEverywhereButLib :: (Branch0 m -> Branch0 m) -> (Branch0 m -> Branch0 m)
stepEverywhereButLib f branch =
let children =
Map.mapWithKey
(\name child -> if name == "lib" then child else Branch.step (Branch.stepEverywhere f) child)
(branch ^. Branch.children)
in f (Branch.branch0 (branch ^. Branch.terms) (branch ^. Branch.types) children (branch ^. Branch.edits))
isPropagated r = Set.notMember r allPatchTargets
allPatchTargets = Patch.allReferenceTargets patch
propagatedMd :: forall r. r -> (r, Metadata.Type, Metadata.Value)
@ -627,32 +639,28 @@ applyPropagate patch Edits {..} = do
terms0 = Star3.replaceFacts replaceConstructor constructorReplacements _terms
terms :: Branch.Star Referent NameSegment
terms =
updateMetadatas Referent.Ref $
updateMetadatas $
Star3.replaceFacts replaceTerm termEdits terms0
types :: Branch.Star Reference NameSegment
types =
updateMetadatas id $
updateMetadatas $
Star3.replaceFacts replaceType typeEdits _types
updateMetadatas ::
Ord r =>
(Reference -> r) ->
Star3.Star3 r NameSegment Metadata.Type (Metadata.Type, Metadata.Value) ->
Star3.Star3 r NameSegment Metadata.Type (Metadata.Type, Metadata.Value)
updateMetadatas ref s = clearPropagated $ Star3.mapD3 go s
updateMetadatas s = Star3.mapD3 go s
where
clearPropagated s = foldl' go s allPatchTargets
where
go s r = Metadata.delete (propagatedMd $ ref r) s
go (tp, v) = case Map.lookup (Referent.Ref v) termEdits of
Just (Referent.Ref r) -> (typeOf r tp, r)
_ -> (tp, v)
typeOf r t = fromMaybe t $ Map.lookup r termTypes
replaceTerm :: Referent -> Referent -> Metadata.Star Referent NameSegment -> Metadata.Star Referent NameSegment
replaceTerm r r' s =
replaceTerm _r r' s =
( if isPropagatedReferent r'
then Metadata.insert (propagatedMd r') . Metadata.delete (propagatedMd r)
then Metadata.insert (propagatedMd r')
else Metadata.delete (propagatedMd r')
)
$ s
@ -682,11 +690,9 @@ applyPropagate patch Edits {..} = do
-- | Compute the set of "dirty" references. They each:
--
-- 1. Depend directly on some reference that was edited in the given patch
-- 2. Have a name in the current namespace (the given Names)
-- 3. Are not themselves edited in the given patch.
--
-- Note: computeDirty a b c = R.dom <$> computeFrontier a b c
-- 1. Depend directly on some reference that was edited in the given patch
-- 2. Are not themselves edited in the given patch.
-- 3. Pass the given predicate.
computeDirty ::
Monad m =>
(Reference -> m (Set Reference)) -> -- eg Codebase.dependents codebase
@ -703,3 +709,7 @@ computeDirty getDependents patch shouldUpdate =
edited :: Set Reference
edited = R.dom (Patch._termEdits patch) <> R.dom (Patch._typeEdits patch)
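-- | True when the name's first segment is not "lib", i.e. the definition does not live inside a dependency namespace.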
nameNotInLibNamespace :: Name -> Bool
nameNotInLibNamespace name =
not (Name.beginsWithSegment name "lib")

View File

@ -57,6 +57,7 @@ import qualified Unison.Server.Endpoints.NamespaceListing as Server
import qualified Unison.Server.Types as Server
import qualified Unison.Share.Codeserver as Codeserver
import qualified Unison.Share.Types as Share
import qualified Unison.Sqlite as Sqlite
import qualified Unison.Util.Pretty as P
import qualified UnliftIO
import Prelude hiding (readFile, writeFile)
@ -131,18 +132,15 @@ noCompletions _ _ _ _ = pure []
-- .> view base.List.map#<Tab>
-- base.List.map#0q926sgnn6
completeWithinNamespace ::
forall m v a.
MonadIO m =>
-- | The types of completions to return
NESet CompletionType ->
-- | The portion of this argument that the user has already typed.
String ->
Codebase m v a ->
Path.Absolute ->
m [System.Console.Haskeline.Completion.Completion]
completeWithinNamespace compTypes query codebase currentPath = do
shortHashLen <- Codebase.runTransaction codebase Codebase.hashLength
b <- Codebase.getShallowBranchAtPath codebase (Path.unabsolute absQueryPath) Nothing
Sqlite.Transaction [System.Console.Haskeline.Completion.Completion]
completeWithinNamespace compTypes query currentPath = do
shortHashLen <- Codebase.hashLength
b <- Codebase.getShallowBranchAtPath (Path.unabsolute absQueryPath) Nothing
currentBranchSuggestions <- do
nib <- namesInBranch shortHashLen b
nib
@ -162,9 +160,9 @@ completeWithinNamespace compTypes query codebase currentPath = do
(queryPathPrefix, querySuffix) = parseLaxPath'Query (Text.pack query)
absQueryPath :: Path.Absolute
absQueryPath = Path.resolve currentPath queryPathPrefix
getChildSuggestions :: Int -> V2Branch.Branch m -> m [Completion]
getChildSuggestions :: Int -> V2Branch.Branch Sqlite.Transaction -> Sqlite.Transaction [Completion]
getChildSuggestions shortHashLen b = do
nonEmptyChildren <- Codebase.runTransaction codebase (V2Branch.nonEmptyChildren b)
nonEmptyChildren <- V2Branch.nonEmptyChildren b
case querySuffix of
"" -> pure []
suffix -> do
@ -180,9 +178,9 @@ completeWithinNamespace compTypes query codebase currentPath = do
& filter (\(_isFinished, match) -> List.isPrefixOf query match)
& fmap (\(isFinished, match) -> prettyCompletionWithQueryPrefix isFinished query match)
& pure
namesInBranch :: Int -> V2Branch.Branch m -> m [(Bool, Text)]
namesInBranch :: Int -> V2Branch.Branch Sqlite.Transaction -> Sqlite.Transaction [(Bool, Text)]
namesInBranch hashLen b = do
nonEmptyChildren <- Codebase.runTransaction codebase (V2Branch.nonEmptyChildren b)
nonEmptyChildren <- V2Branch.nonEmptyChildren b
let textifyHQ :: (V2Branch.NameSegment -> r -> HQ'.HashQualified V2Branch.NameSegment) -> Map V2Branch.NameSegment (Map r metadata) -> [(Bool, Text)]
textifyHQ f xs =
xs
@ -208,9 +206,9 @@ completeWithinNamespace compTypes query codebase currentPath = do
qualifyRefs :: V2Branch.NameSegment -> (Map r metadata) -> [HQ'.HashQualified V2Branch.NameSegment]
qualifyRefs n refs
| ((Text.isInfixOf "#" . NameSegment.toText) querySuffix) || length refs > 1 =
refs
& Map.keys
<&> qualify n
refs
& Map.keys
<&> qualify n
| otherwise = [HQ'.NameOnly n]
-- If we're not completing namespaces, then all namespace completions should automatically
@ -265,52 +263,37 @@ parseLaxPath'Query txt =
-- | Completes a namespace argument by prefix-matching against the query.
prefixCompleteNamespace ::
forall m v a.
(MonadIO m) =>
String ->
Codebase m v a ->
Path.Absolute -> -- Current path
m [Line.Completion]
Sqlite.Transaction [Line.Completion]
prefixCompleteNamespace = completeWithinNamespace (NESet.singleton NamespaceCompletion)
-- | Completes a term or type argument by prefix-matching against the query.
prefixCompleteTermOrType ::
forall m v a.
MonadIO m =>
String ->
Codebase m v a ->
Path.Absolute -> -- Current path
m [Line.Completion]
Sqlite.Transaction [Line.Completion]
prefixCompleteTermOrType = completeWithinNamespace (NESet.fromList (TermCompletion NE.:| [TypeCompletion]))
-- | Completes a term argument by prefix-matching against the query.
prefixCompleteTerm ::
forall m v a.
MonadIO m =>
String ->
Codebase m v a ->
Path.Absolute -> -- Current path
m [Line.Completion]
Sqlite.Transaction [Line.Completion]
prefixCompleteTerm = completeWithinNamespace (NESet.singleton TermCompletion)
-- | Completes a term or type argument by prefix-matching against the query.
prefixCompleteType ::
forall m v a.
MonadIO m =>
String ->
Codebase m v a ->
Path.Absolute -> -- Current path
m [Line.Completion]
Sqlite.Transaction [Line.Completion]
prefixCompleteType = completeWithinNamespace (NESet.singleton TypeCompletion)
-- | Completes a patch argument by prefix-matching against the query.
prefixCompletePatch ::
forall m v a.
MonadIO m =>
String ->
Codebase m v a ->
Path.Absolute -> -- Current path
m [Line.Completion]
Sqlite.Transaction [Line.Completion]
prefixCompletePatch = completeWithinNamespace (NESet.singleton PatchCompletion)
-- | Renders a completion option with the prefix matching the query greyed out.

View File

@ -193,11 +193,14 @@ displayPretty pped terms typeOf eval types tm = go tm
go = pure . P.underline . P.syntaxToColor . NP.prettyHashQualified
in case e of
DD.EitherLeft' (Term.TypeLink' ref) -> go $ PPE.typeName ppe ref
DD.EitherRight' (DD.Doc2Term (Term.Ref' ref)) -> go $ PPE.termName ppe (Referent.Ref ref)
DD.EitherRight' (DD.Doc2Term (Term.Request' ref)) ->
go $ PPE.termName ppe (Referent.Con ref CT.Effect)
DD.EitherRight' (DD.Doc2Term (Term.Constructor' ref)) ->
go $ PPE.termName ppe (Referent.Con ref CT.Data)
-- Eta-reduce the term, as the compiler may have eta-expanded it.
DD.EitherRight' (DD.Doc2Term t) -> case Term.etaNormalForm t of
Term.Ref' ref -> go $ PPE.termName ppe (Referent.Ref ref)
Term.Request' ref ->
go $ PPE.termName ppe (Referent.Con ref CT.Effect)
Term.Constructor' ref ->
go $ PPE.termName ppe (Referent.Con ref CT.Data)
_ -> P.red <$> displayTerm pped terms typeOf eval types t
_ -> P.red <$> displayTerm pped terms typeOf eval types e
-- Signature [Doc2.Term]
DD.Doc2SpecialFormSignature (Term.List' tms) ->

View File

@ -13,6 +13,7 @@ import qualified Data.Set as Set
import qualified Data.Text as Text
import System.Console.Haskeline.Completion (Completion (Completion))
import qualified Text.Megaparsec as P
import qualified Unison.Codebase as Codebase
import qualified Unison.Codebase.Branch as Branch
import qualified Unison.Codebase.Branch.Merge as Branch
import Unison.Codebase.Editor.Input (Input)
@ -332,8 +333,14 @@ view =
I.Visible
[(ZeroPlus, definitionQueryArg)]
( P.lines
[ "`view foo` prints definitions named `foo` within your current namespace.",
"`view` without arguments invokes a search to select definitions to view, which requires that `fzf` can be found within your PATH."
[ P.wrap $ makeExample view ["foo"] <> "shows definitions named `foo` within your current namespace.",
P.wrap $ makeExample view [] <> "without arguments invokes a search to select definitions to view, which requires that `fzf` can be found within your PATH.",
" ", -- hmm, this blankline seems to be ignored by pretty printer
P.wrap $
"Supports glob syntax, where ? acts a wildcard, so"
<> makeExample view ["List.?"]
<> "will show `List.map`, `List.filter`, etc, but "
<> "not `List.map.doc` (since ? only matches 1 name segment)."
]
)
( fmap (Input.ShowDefinitionI Input.ConsoleLocation Input.ShowDefinitionLocal)
@ -2116,6 +2123,96 @@ makeStandalone =
_ -> Left $ showPatternHelp makeStandalone
)
runScheme :: InputPattern
runScheme =
InputPattern
"run.native"
[]
I.Visible
[(Required, exactDefinitionTermQueryArg)]
( P.wrapColumn2
[ ( makeExample runScheme ["main"],
"Executes !main using native compilation via scheme."
)
]
)
( \case
[main] ->
Input.ExecuteSchemeI <$> parseHashQualifiedName main
_ -> Left $ showPatternHelp runScheme
)
compileScheme :: InputPattern
compileScheme =
InputPattern
"compile.native"
[]
I.Visible
[(Required, exactDefinitionTermQueryArg), (Required, noCompletionsArg)]
( P.wrapColumn2
[ ( makeExample compileScheme ["main", "file"],
"Creates stand alone executable via compilation to"
<> "scheme. The created executable will have the effect"
<> "of running `!main`."
)
]
)
( \case
[main, file] ->
Input.CompileSchemeI file <$> parseHashQualifiedName main
_ -> Left $ showPatternHelp compileScheme
)
schemeLibgen :: InputPattern
schemeLibgen =
InputPattern
"compile.native.genlibs"
[]
I.Visible
[]
( P.wrapColumn2
[ ( makeExample schemeLibgen [],
"Generates libraries necessary for scheme compilation.\n\n\
\There is no need to run this before"
<> P.group (makeExample compileScheme [])
<> "as\
\ the latter will check if the libraries are missing and\
\ auto-generate them. However, this will generate the\
\ libraries even if their files already exist, so if the\
\ compiler has been upgraded, this can be used to ensure\
\ the generated libraries are up to date."
)
]
)
( \case
[] -> pure Input.GenSchemeLibsI
_ -> Left $ showPatternHelp schemeLibgen
)
fetchScheme :: InputPattern
fetchScheme =
InputPattern
"compile.native.fetch"
[]
I.Visible
[]
( P.wrapColumn2
[ ( makeExample fetchScheme [],
"Fetches the unison library for compiling to scheme.\n\n\
\This is done automatically when"
<> P.group (makeExample compileScheme [])
<> "is run\
\ if the library is not already in the standard location\
\ (unison.internal). However, this command will force\
\ a pull even if the library already exists."
)
]
)
( \case
[] -> pure Input.FetchSchemeCompilerI
_ -> Left $ showPatternHelp fetchScheme
)
createAuthor :: InputPattern
createAuthor =
InputPattern
@ -2288,6 +2385,10 @@ validInputs =
quit,
updateBuiltins,
makeStandalone,
runScheme,
compileScheme,
schemeLibgen,
fetchScheme,
mergeBuiltins,
mergeIOBuiltins,
dependents,
@ -2331,7 +2432,7 @@ exactDefinitionArg :: ArgumentType
exactDefinitionArg =
ArgumentType
{ typeName = "definition",
suggestions = \q cb _http p -> prefixCompleteTermOrType q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompleteTermOrType q p),
globTargets = Set.fromList [Globbing.Term, Globbing.Type]
}
@ -2339,7 +2440,7 @@ fuzzyDefinitionQueryArg :: ArgumentType
fuzzyDefinitionQueryArg =
ArgumentType
{ typeName = "fuzzy definition query",
suggestions = \q cb _http p -> prefixCompleteTermOrType q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompleteTermOrType q p),
globTargets = Set.fromList [Globbing.Term, Globbing.Type]
}
@ -2350,7 +2451,7 @@ exactDefinitionTypeQueryArg :: ArgumentType
exactDefinitionTypeQueryArg =
ArgumentType
{ typeName = "type definition query",
suggestions = \q cb _http p -> prefixCompleteType q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompleteType q p),
globTargets = Set.fromList [Globbing.Type]
}
@ -2358,7 +2459,7 @@ exactDefinitionTermQueryArg :: ArgumentType
exactDefinitionTermQueryArg =
ArgumentType
{ typeName = "term definition query",
suggestions = \q cb _http p -> prefixCompleteTerm q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompleteTerm q p),
globTargets = Set.fromList [Globbing.Term]
}
@ -2366,7 +2467,7 @@ patchArg :: ArgumentType
patchArg =
ArgumentType
{ typeName = "patch",
suggestions = \q cb _http p -> prefixCompletePatch q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompletePatch q p),
globTargets = Set.fromList []
}
@ -2374,7 +2475,7 @@ namespaceArg :: ArgumentType
namespaceArg =
ArgumentType
{ typeName = "namespace",
suggestions = \q cb _http p -> prefixCompleteNamespace q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompleteNamespace q p),
globTargets = Set.fromList [Globbing.Namespace]
}
@ -2386,7 +2487,7 @@ newNameArg :: ArgumentType
newNameArg =
ArgumentType
{ typeName = "new-name",
suggestions = \q cb _http p -> prefixCompleteNamespace q cb p,
suggestions = \q cb _http p -> Codebase.runTransaction cb (prefixCompleteNamespace q p),
globTargets = mempty
}

View File

@ -665,8 +665,8 @@ notifyUser dir o = case o of
CachedTests 0 _ -> pure . P.callout "😶" $ "No tests to run."
CachedTests n n'
| n == n' ->
pure $
P.lines [cache, "", displayTestResults True ppe oks fails]
pure $
P.lines [cache, "", displayTestResults True ppe oks fails]
CachedTests _n m ->
pure $
if m == 0
@ -675,6 +675,7 @@ notifyUser dir o = case o of
P.indentN 2 $
P.lines ["", cache, "", displayTestResults False ppe oks fails, "", ""]
where
NewlyComputed -> do
clearCurrentLine
pure $
@ -1131,6 +1132,9 @@ notifyUser dir o = case o of
else pure mempty
GitError e -> pure $ case e of
GitSqliteCodebaseError e -> case e of
CodebaseFileLockFailed ->
P.wrap $
"It looks to me like another ucm process is using this codebase. Only one ucm process can use a codebase at a time."
NoDatabaseFile repo localPath ->
P.wrap $
"I didn't find a codebase in the repository at"
@ -2463,7 +2467,7 @@ showDiffNamespace ::
(Pretty, NumberedArgs)
showDiffNamespace _ _ _ _ diffOutput
| OBD.isEmpty diffOutput =
("The namespaces are identical.", mempty)
("The namespaces are identical.", mempty)
showDiffNamespace sn ppe oldPath newPath OBD.BranchDiffOutput {..} =
(P.sepNonEmpty "\n\n" p, toList args)
where

View File

@ -513,9 +513,10 @@ test =
|]
)
( \cb -> do
void . fmap (fromJust . sequence) $
traverse (Codebase.getWatch cb TestWatch)
=<< Codebase.runTransaction cb (Codebase.watches TestWatch)
Codebase.runTransaction cb do
void . fmap (fromJust . sequence) $
traverse (Codebase.getWatch cb TestWatch)
=<< Codebase.watches TestWatch
),
gistTest fmt,
pushPullBranchesTests fmt,

View File

@ -53,7 +53,7 @@ initCodebase fmt = do
tmp <-
Temp.getCanonicalTemporaryDirectory
>>= flip Temp.createTempDirectory "ucm-test"
result <- Codebase.Init.withCreatedCodebase cbInit "ucm-test" tmp (const $ pure ())
result <- Codebase.Init.withCreatedCodebase cbInit "ucm-test" tmp SC.DoLock (const $ pure ())
case result of
Left CreateCodebaseAlreadyExists -> fail $ P.toANSI 80 "Codebase already exists"
Right _ -> pure $ Codebase tmp fmt
@ -66,7 +66,7 @@ runTranscript (Codebase codebasePath fmt) transcript = do
let err e = fail $ "Parse error: \n" <> show e
cbInit = case fmt of CodebaseFormat2 -> SC.init
TR.withTranscriptRunner "Unison.Test.Ucm.runTranscript Invalid Version String" configFile $ \runner -> do
result <- Codebase.Init.withOpenCodebase cbInit "transcript" codebasePath SC.DontMigrate \codebase -> do
result <- Codebase.Init.withOpenCodebase cbInit "transcript" codebasePath SC.DoLock SC.DontMigrate \codebase -> do
Codebase.runTransaction codebase (Codebase.installUcmDependencies codebase)
let transcriptSrc = Text.pack . stripMargin $ unTranscript transcript
output <- either err Text.unpack <$> runner "transcript" transcriptSrc (codebasePath, codebase)
@ -81,7 +81,7 @@ runTranscript (Codebase codebasePath fmt) transcript = do
lowLevel :: Codebase -> (Codebase.Codebase IO Symbol Ann -> IO a) -> IO a
lowLevel (Codebase root fmt) action = do
let cbInit = case fmt of CodebaseFormat2 -> SC.init
result <- Codebase.Init.withOpenCodebase cbInit "lowLevel" root SC.DontMigrate action
result <- Codebase.Init.withOpenCodebase cbInit "lowLevel" root SC.DoLock SC.DontMigrate action
case result of
Left e -> PT.putPrettyLn (P.shown e) *> pure (error "This really should have loaded")
Right a -> pure a

View File

@ -34,7 +34,7 @@ type TestBuilder = FilePath -> [String] -> String -> Test ()
testBuilder ::
Bool -> FilePath -> [String] -> String -> Test ()
testBuilder expectFailure dir prelude transcript = scope transcript $ do
outputs <- io . withTemporaryUcmCodebase SC.init "transcript" $ \(codebasePath, codebase) -> do
outputs <- io . withTemporaryUcmCodebase SC.init "transcript" SC.DoLock $ \(codebasePath, codebase) -> do
withTranscriptRunner "TODO: pass version here" Nothing $ \runTranscript -> do
for files $ \filePath -> do
transcriptSrc <- readUtf8 filePath

View File

@ -43,6 +43,7 @@ library
Unison.Codebase.Editor.HandleInput.MoveBranch
Unison.Codebase.Editor.HandleInput.NamespaceDependencies
Unison.Codebase.Editor.HandleInput.NamespaceDiffUtils
Unison.Codebase.Editor.HandleInput.TermResolution
Unison.Codebase.Editor.HandleInput.Update
Unison.Codebase.Editor.Input
Unison.Codebase.Editor.Output
@ -166,6 +167,7 @@ library
, nonempty-containers
, open-browser
, pretty-simple
, process
, random >=1.2.0
, regex-tdfa
, semialign
@ -360,7 +362,7 @@ executable transcripts
TupleSections
TypeApplications
ViewPatterns
ghc-options: -Wall -threaded -rtsopts -with-rtsopts=-N -v0
ghc-options: -Wall -threaded -rtsopts "-with-rtsopts=-I0 -N -qn4 -qg1" -v0
build-depends:
IntervalMap
, ListLike
@ -449,6 +451,7 @@ executable unison
main-is: Main.hs
other-modules:
ArgParse
Stats
System.Path
Version
hs-source-dirs:
@ -483,7 +486,7 @@ executable unison
TupleSections
TypeApplications
ViewPatterns
ghc-options: -Wall -threaded -rtsopts -with-rtsopts=-I0 -optP-Wno-nonportable-include-path
ghc-options: -Wall -threaded -rtsopts "-with-rtsopts=-I0 -N -qn4 -qg1 -T" -optP-Wno-nonportable-include-path
build-depends:
IntervalMap
, ListLike
@ -530,6 +533,7 @@ executable unison
, open-browser
, optparse-applicative >=0.16.1.0
, pretty-simple
, process
, random >=1.2.0
, regex-tdfa
, semialign
@ -658,6 +662,7 @@ test-suite cli-tests
, nonempty-containers
, open-browser
, pretty-simple
, process
, random >=1.2.0
, regex-tdfa
, semialign

View File

@ -53,6 +53,7 @@ import qualified Options.Applicative as OptParse
import Options.Applicative.Builder.Internal (noGlobal {- https://github.com/pcapriotti/optparse-applicative/issues/461 -})
import Options.Applicative.Help (bold, (<+>))
import qualified Options.Applicative.Help.Pretty as P
import Stats
import System.Environment (lookupEnv)
import Text.Read (readMaybe)
import qualified Unison.Codebase.Path as Path
@ -116,7 +117,7 @@ data Command
| -- @deprecated in trunk after M2g. Remove the Init command completely after M2h has been released
Init
| Run RunSource [String]
| Transcript ShouldForkCodebase ShouldSaveCodebase (NonEmpty FilePath)
| Transcript ShouldForkCodebase ShouldSaveCodebase (Maybe RtsStatsPath) (NonEmpty FilePath)
deriving (Show, Eq)
-- | Options shared by sufficiently many subcommands.
@ -381,6 +382,15 @@ runCompiledParser :: Parser Command
runCompiledParser =
Run . RunCompiled <$> fileArgument "path/to/file" <*> runArgumentParser
rtsStatsOption :: Parser (Maybe RtsStatsPath)
rtsStatsOption =
let meta =
metavar "FILE.json"
<> long "rts-stats"
<> help "Write json summary of rts stats to FILE"
<> noGlobal
in optional (option OptParse.str meta)
saveCodebaseFlag :: Parser ShouldSaveCodebase
saveCodebaseFlag = flag DontSaveCodebase SaveCodebase (long "save-codebase" <> help saveHelp)
where
@ -448,15 +458,17 @@ transcriptParser :: Parser Command
transcriptParser = do
-- ApplicativeDo
shouldSaveCodebase <- saveCodebaseFlag
mrtsStatsFp <- rtsStatsOption
files <- liftA2 (NE.:|) (fileArgument "FILE") (many (fileArgument "FILES..."))
pure (Transcript DontFork shouldSaveCodebase files)
pure (Transcript DontFork shouldSaveCodebase mrtsStatsFp files)
transcriptForkParser :: Parser Command
transcriptForkParser = do
-- ApplicativeDo
shouldSaveCodebase <- saveCodebaseFlag
mrtsStatsFp <- rtsStatsOption
files <- liftA2 (NE.:|) (fileArgument "FILE") (many (fileArgument "FILES..."))
pure (Transcript UseFork shouldSaveCodebase files)
pure (Transcript UseFork shouldSaveCodebase mrtsStatsFp files)
unisonHelp :: String -> String -> P.Doc
unisonHelp (P.text -> executable) (P.text -> version) =

View File

@ -42,6 +42,7 @@ import qualified Language.Haskell.TH as TH
import qualified Language.Haskell.TH.Syntax as TH
import qualified Network.HTTP.Client as HTTP
import qualified Network.HTTP.Client.TLS as HTTP
import Stats (recordRtsStats)
import System.Directory (canonicalizePath, getCurrentDirectory, removeDirectoryRecursive)
import System.Environment (getProgName, lookupEnv, withArgs)
import qualified System.Exit as Exit
@ -240,8 +241,11 @@ main = withCP65001 . runInUnboundThread . Ki.scoped $ \scope -> do
"to produce a new compiled program \
\that matches your version of Unison."
]
Transcript shouldFork shouldSaveCodebase transcriptFiles ->
runTranscripts renderUsageInfo shouldFork shouldSaveCodebase mCodePathOption transcriptFiles
Transcript shouldFork shouldSaveCodebase mrtsStatsFp transcriptFiles -> do
let action = runTranscripts renderUsageInfo shouldFork shouldSaveCodebase mCodePathOption transcriptFiles
case mrtsStatsFp of
Nothing -> action
Just fp -> recordRtsStats fp action
Launch isHeadless codebaseServerOpts downloadBase mayStartingPath shouldWatchFiles -> do
getCodebaseOrExit mCodePathOption SC.MigrateAfterPrompt \(initRes, _, theCodebase) -> do
withRuntimes RTI.Persistent \(runtime, sbRuntime) -> do
@ -347,7 +351,7 @@ prepareTranscriptDir shouldFork mCodePathOption = do
Path.copyDir (CodebaseInit.codebasePath cbInit path) (CodebaseInit.codebasePath cbInit tmp)
DontFork -> do
PT.putPrettyLn . P.wrap $ "Transcript will be run on a new, empty codebase."
CodebaseInit.withNewUcmCodebaseOrExit cbInit "main.transcript" tmp (const $ pure ())
CodebaseInit.withNewUcmCodebaseOrExit cbInit "main.transcript" tmp SC.DoLock (const $ pure ())
pure tmp
runTranscripts' ::
@ -527,7 +531,7 @@ defaultBaseLib =
getCodebaseOrExit :: Maybe CodebasePathOption -> SC.MigrationStrategy -> ((InitResult, CodebasePath, Codebase IO Symbol Ann) -> IO r) -> IO r
getCodebaseOrExit codebasePathOption migrationStrategy action = do
initOptions <- argsToCodebaseInitOptions codebasePathOption
result <- CodebaseInit.withOpenOrCreateCodebase SC.init "main" initOptions migrationStrategy \case
result <- CodebaseInit.withOpenOrCreateCodebase SC.init "main" initOptions SC.DoLock migrationStrategy \case
cbInit@(CreatedCodebase, dir, _) -> do
pDir <- prettyDir dir
PT.putPrettyLn' ""
@ -546,6 +550,13 @@ getCodebaseOrExit codebasePathOption migrationStrategy action = do
case err of
InitErrorOpen err ->
case err of
OpenCodebaseFileLockFailed ->
pure
( P.lines
[ "Failed to obtain a file lock on the codebase. ",
"Perhaps you are running multiple ucm processes against the same codebase."
]
)
OpenCodebaseDoesntExist ->
pure
( P.lines

View File

@ -0,0 +1,39 @@
module Stats
( RtsStatsPath (..),
recordRtsStats,
)
where
import Control.Exception (finally)
import Data.Aeson (encode, object, (.=))
import qualified Data.ByteString.Lazy as BL
import Data.Function
import Data.String (IsString)
import GHC.Stats
newtype RtsStatsPath
= RtsStatsPath FilePath
deriving stock (Show, Eq)
deriving newtype (IsString)
recordRtsStats :: RtsStatsPath -> IO a -> IO a
recordRtsStats (RtsStatsPath fp) action = do
r0 <- getRTSStats
action `finally` do
r1 <- getRTSStats
BL.writeFile fp (encode (produceStats r0 r1))
where
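-- Each stat is reported as a delta over the wrapped action: on (-) f r1 r0 computes f r1 - f r0.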
produceStats r0 r1 =
object
[ "gcs" .= on (-) gcs r1 r0,
"major_gcs" .= on (-) major_gcs r1 r0,
"allocated_bytes" .= on (-) allocated_bytes r1 r0,
"max_live_bytes" .= on (-) max_live_bytes r1 r0,
"copied_bytes" .= on (-) copied_bytes r1 r0,
"mutator_cpu_ns" .= on (-) mutator_cpu_ns r1 r0,
"mutator_elapsed_ns" .= on (-) mutator_elapsed_ns r1 r0,
"gc_cpu_ns" .= on (-) mutator_cpu_ns r1 r0,
"gc_elapsed_ns" .= on (-) mutator_elapsed_ns r1 r0,
"cpu_ns" .= on (-) cpu_ns r1 r0,
"elapsed_ns" .= on (-) cpu_ns r1 r0
]
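A minimal usage sketch of this helper (the file name and action here are illustrative, not part of this change): wrap an `IO` action and a JSON summary of the RTS deltas is written when the action finishes. Meaningful numbers generally require the RTS to collect statistics, which is why `-T` is added to the `unison` executable's `-with-rtsopts` above.
```haskell
-- Hypothetical usage sketch; names below are not from the diff.
import Stats (RtsStatsPath (..), recordRtsStats)

main :: IO ()
main =
  -- Writes gc/allocation/cpu deltas for the wrapped action to rts-stats.json.
  recordRtsStats (RtsStatsPath "rts-stats.json") (putStrLn "doing some work")
```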

View File

@ -19,7 +19,6 @@ library:
- fuzzyfind
- generic-lens
- lens
- prelude-extras
- memory
- mtl
- rfc5051

View File

@ -43,7 +43,6 @@ import Control.Monad.State (evalState)
import Data.Bifunctor (bimap, first, second)
import qualified Data.Map as Map
import qualified Data.Set as Set
import Prelude.Extras (Show1)
import qualified Unison.ABT as ABT
import Unison.ConstructorReference (GConstructorReference (..))
import qualified Unison.ConstructorType as CT
@ -273,7 +272,7 @@ data F a
| LetRec [a] a
| Constructors [a]
| Modified Modifier a
deriving (Functor, Foldable, Show, Show1)
deriving (Functor, Foldable, Show)
updateDependencies :: Ord v => Map Reference Reference -> Decl v a -> Decl v a
updateDependencies typeUpdates decl =

View File

@ -21,6 +21,7 @@ module Unison.Name
countSegments,
isAbsolute,
isPrefixOf,
beginsWithSegment,
endsWithReverseSegments,
endsWithSegments,
stripReversedPrefix,
@ -164,6 +165,19 @@ countSegments :: Name -> Int
countSegments (Name _ ss) =
length ss
-- | @beginsWithSegment name segment@ returns whether @name@'s first name segment is @segment@.
--
-- >>> beginsWithSegment "abc.def" "abc"
-- True
--
-- >>> beginsWithSegment "abc.def" "ab"
-- False
--
-- /O(n)/, where /n/ is the number of name segments.
beginsWithSegment :: Name -> NameSegment -> Bool
beginsWithSegment name segment =
segment == List.NonEmpty.head (segments name)
-- | @endsWithSegments x y@ returns whether @x@ ends with @y@.
--
-- >>> endsWithSegments "a.b.c" ["b", "c"]
@ -559,7 +573,7 @@ commonPrefix :: Name -> Name -> [NameSegment]
commonPrefix x@(Name p1 _) y@(Name p2 _)
| p1 /= p2 = []
| otherwise =
commonPrefix' (toList $ segments x) (toList $ segments y)
commonPrefix' (toList $ segments x) (toList $ segments y)
where
commonPrefix' (a : as) (b : bs)
| a == b = a : commonPrefix' as bs

View File

@ -21,7 +21,6 @@ import qualified Data.Sequence as Sequence
import qualified Data.Set as Set
import qualified Data.Set.NonEmpty as NES
import qualified Data.Text as Text
import Prelude.Extras (Eq1 (..), Show1 (..))
import Text.Show
import qualified Unison.ABT as ABT
import qualified Unison.Blank as B
@ -1354,10 +1353,6 @@ fromReferent a = \case
-- mostly boring serialization code below ...
instance (Eq a, ABT.Var v) => Eq1 (F v a p) where (==#) = (==)
instance (Show v) => Show1 (F v a p) where showsPrec1 = showsPrec
instance (ABT.Var vt, Eq at, Eq a) => Eq (F vt at p a) where
Int x == Int y = x == y
Nat x == Nat y = x == y

View File

@ -16,7 +16,6 @@ import Data.List.Extra (nubOrd)
import qualified Data.Map as Map
import Data.Monoid (Any (..))
import qualified Data.Set as Set
import Prelude.Extras (Eq1 (..), Ord1 (..), Show1 (..))
import qualified Unison.ABT as ABT
import qualified Unison.Kind as K
import qualified Unison.Name as Name
@ -43,12 +42,6 @@ data F a
-- variables
deriving (Foldable, Functor, Generic, Generic1, Eq, Ord, Traversable)
instance Eq1 F where (==#) = (==)
instance Ord1 F where compare1 = compare
instance Show1 F where showsPrec1 = showsPrec
_Ref :: Prism' (F a) Reference
_Ref = _Ctor @"Ref"

View File

@ -94,7 +94,6 @@ library
, memory
, mtl
, nonempty-containers
, prelude-extras
, rfc5051
, safe
, text

View File

@ -338,12 +338,12 @@ lsAtPath ::
MonadIO m =>
Codebase m Symbol Ann ->
-- The root to follow the path from.
Maybe (V2Branch.Branch m) ->
Maybe (V2Branch.Branch Sqlite.Transaction) ->
-- Path from the root to the branch to 'ls'
Path.Absolute ->
m [ShallowListEntry Symbol Ann]
lsAtPath codebase mayRootBranch absPath = do
b <- Codebase.getShallowBranchAtPath codebase (Path.unabsolute absPath) mayRootBranch
b <- Codebase.runTransaction codebase (Codebase.getShallowBranchAtPath (Path.unabsolute absPath) mayRootBranch)
lsBranch codebase b
findShallowReadmeInBranchAndRender ::
@ -407,7 +407,7 @@ resultListType = Type.app mempty (Type.list mempty) (Type.ref mempty Decls.testR
termListEntry ::
MonadIO m =>
Codebase m Symbol Ann ->
V2Branch.Branch m ->
V2Branch.Branch n ->
ExactName NameSegment V2Referent.Referent ->
m (TermEntry Symbol Ann)
termListEntry codebase branch (ExactName nameSegment ref) = do
@ -477,7 +477,7 @@ getTypeTag codebase r = do
typeListEntry ::
Var v =>
Codebase m v Ann ->
V2Branch.Branch m ->
V2Branch.Branch n ->
ExactName NameSegment Reference ->
Sqlite.Transaction TypeEntry
typeListEntry codebase b (ExactName nameSegment ref) = do
@ -551,7 +551,7 @@ typeEntryToNamedType te@TypeEntry {typeEntryTag, typeEntryHash} =
lsBranch ::
MonadIO m =>
Codebase m Symbol Ann ->
V2Branch.Branch m ->
V2Branch.Branch n ->
m [ShallowListEntry Symbol Ann]
lsBranch codebase b0 = do
let flattenRefs :: Map V2Branch.NameSegment (Map ref v) -> [(ref, V2Branch.NameSegment)]
@ -796,13 +796,15 @@ expandShortCausalHash hash = do
throwError . AmbiguousBranchHash hash $ Set.map (SCH.fromHash len) hashSet
-- | Efficiently resolve a root hash and path to a shallow branch's causal.
getShallowCausalAtPathFromRootHash :: MonadIO m => Codebase m v a -> Maybe Branch.CausalHash -> Path -> Backend m (V2Branch.CausalBranch m)
getShallowCausalAtPathFromRootHash codebase mayRootHash path = do
getShallowCausalAtPathFromRootHash ::
Maybe Branch.CausalHash ->
Path ->
Sqlite.Transaction (V2Branch.CausalBranch Sqlite.Transaction)
getShallowCausalAtPathFromRootHash mayRootHash path = do
shallowRoot <- case mayRootHash of
Nothing -> lift (Codebase.getShallowRootCausal codebase)
Just h -> do
lift $ Codebase.getShallowCausalForHash codebase (Cv.causalHash1to2 h)
lift $ Codebase.getShallowCausalAtPath codebase path (Just shallowRoot)
Nothing -> Codebase.getShallowRootCausal
Just h -> Codebase.expectCausalBranchByCausalHash (Cv.causalHash1to2 h)
Codebase.getShallowCausalAtPath path (Just shallowRoot)
formatType' :: Var v => PPE.PrettyPrintEnv -> Width -> Type v a -> SyntaxText
formatType' ppe w =
@ -841,8 +843,11 @@ prettyDefinitionsForHQName ::
HQ.HashQualified Name ->
Backend IO DefinitionDisplayResults
prettyDefinitionsForHQName path mayRoot renderWidth suffixifyBindings rt codebase query = do
shallowRoot <- resolveCausalHashV2 codebase (fmap Cv.causalHash1to2 mayRoot)
hqLength <- lift $ Codebase.runTransaction codebase Codebase.hashLength
(shallowRoot, hqLength) <-
(lift . Codebase.runTransaction codebase) do
shallowRoot <- resolveCausalHashV2 (fmap Cv.causalHash1to2 mayRoot)
hqLength <- Codebase.hashLength
pure (shallowRoot, hqLength)
(localNamesOnly, unbiasedPPE) <- scopedNamesForBranchHash codebase (Just shallowRoot) path
-- Bias towards both relative and absolute path to queries,
-- This allows us to still bias towards definitions outside our perspective but within the
@ -858,8 +863,10 @@ prettyDefinitionsForHQName path mayRoot renderWidth suffixifyBindings rt codebas
let nameSearch :: NameSearch
nameSearch = makeNameSearch hqLength (NamesWithHistory.fromCurrentNames localNamesOnly)
DefinitionResults terms types misses <- lift (definitionsBySuffixes codebase nameSearch DontIncludeCycles [query])
causalAtPath <- lift $ Codebase.getShallowCausalAtPath codebase path (Just shallowRoot)
branchAtPath <- lift $ V2Causal.value causalAtPath
branchAtPath <- do
(lift . Codebase.runTransaction codebase) do
causalAtPath <- Codebase.getShallowCausalAtPath path (Just shallowRoot)
V2Causal.value causalAtPath
let width = mayDefaultWidth renderWidth
-- Return only references which refer to docs.
filterForDocs :: [Referent] -> Sqlite.Transaction [TermReference]
@ -982,7 +989,7 @@ renderDoc ppe width rt codebase r = do
eval (Term.amap (const mempty) -> tm) = do
let ppes = PPED.suffixifiedPPE ppe
let codeLookup = Codebase.toCodeLookup codebase
let cache r = fmap Term.unannotate <$> Codebase.lookupWatchCache codebase r
let cache r = fmap Term.unannotate <$> Codebase.runTransaction codebase (Codebase.lookupWatchCache codebase r)
r <- fmap hush . liftIO $ Rt.evaluateTerm' codeLookup cache ppes rt tm
case r of
Just tmr ->
@ -1105,7 +1112,13 @@ bestNameForType ppe width =
-- - 'local' includes ONLY the names within the provided path
-- - 'ppe' is a ppe which searches for a name within the path first, but falls back to a global name search.
-- The 'suffixified' component of this ppe will search for the shortest unambiguous suffix within the scope in which the name is found (local, falling back to global)
scopedNamesForBranchHash :: forall m v a. MonadIO m => Codebase m v a -> Maybe (V2Branch.CausalBranch m) -> Path -> Backend m (Names, PPED.PrettyPrintEnvDecl)
scopedNamesForBranchHash ::
forall m n v a.
MonadIO m =>
Codebase m v a ->
Maybe (V2Branch.CausalBranch n) ->
Path ->
Backend m (Names, PPED.PrettyPrintEnvDecl)
scopedNamesForBranchHash codebase mbh path = do
shouldUseNamesIndex <- asks useNamesIndex
hashLen <- lift $ Codebase.runTransaction codebase Codebase.hashLength
@ -1148,11 +1161,10 @@ resolveCausalHash h codebase = case h of
mayBranch <- lift $ Codebase.getBranchForHash codebase bhash
whenNothing mayBranch (throwError $ NoBranchForHash bhash)
resolveCausalHashV2 ::
MonadIO m => Codebase m v a -> Maybe V2Branch.CausalHash -> Backend m (V2Branch.CausalBranch m)
resolveCausalHashV2 codebase h = case h of
Nothing -> lift $ Codebase.getShallowRootCausal codebase
Just ch -> lift $ Codebase.getShallowCausalForHash codebase ch
resolveCausalHashV2 :: Maybe V2Branch.CausalHash -> Sqlite.Transaction (V2Branch.CausalBranch Sqlite.Transaction)
resolveCausalHashV2 h = case h of
Nothing -> Codebase.getShallowRootCausal
Just ch -> Codebase.expectCausalBranchByCausalHash ch
resolveRootBranchHash ::
MonadIO m => Maybe ShortCausalHash -> Codebase m v a -> Backend m (Branch m)
@ -1164,12 +1176,12 @@ resolveRootBranchHash mayRoot codebase = case mayRoot of
resolveCausalHash (Just h) codebase
resolveRootBranchHashV2 ::
MonadIO m => Codebase m v a -> Maybe ShortCausalHash -> Backend m (V2Branch.CausalBranch m)
resolveRootBranchHashV2 codebase mayRoot = case mayRoot of
Nothing -> lift (Codebase.getShallowRootCausal codebase)
Maybe ShortCausalHash -> Backend Sqlite.Transaction (V2Branch.CausalBranch Sqlite.Transaction)
resolveRootBranchHashV2 mayRoot = case mayRoot of
Nothing -> lift Codebase.getShallowRootCausal
Just sch -> do
h <- Cv.causalHash1to2 <$> hoistBackend (Codebase.runTransaction codebase) (expandShortCausalHash sch)
resolveCausalHashV2 codebase (Just h)
h <- Cv.causalHash1to2 <$> expandShortCausalHash sch
lift (resolveCausalHashV2 (Just h))
-- | Determines whether we include full cycles in the results, (e.g. if I search for `isEven`, will I find `isOdd` too?)
--

View File

@ -220,7 +220,10 @@ renderDoc pped terms typeOf eval types tm =
ty r = (NP.styleHashQualified'' (NP.fmt (S.TypeReference r)) . PPE.typeName ppe) r
in Link <$> case e of
DD.EitherLeft' (Term.TypeLink' r) -> (pure . formatPretty . ty) r
DD.EitherRight' (DD.Doc2Term (Term.Referent' r)) -> (pure . formatPretty . tm) r
DD.EitherRight' (DD.Doc2Term t) ->
case Term.etaNormalForm t of
Term.Referent' r -> (pure . formatPretty . tm) r
x -> source x
_ -> source e
DD.Doc2SpecialFormSignature (Term.List' tms) ->
let rs = [r | DD.Doc2Term (Term.Referent' r) <- toList tms]

View File

@ -90,8 +90,11 @@ serveTermSummary codebase referent mayName mayRoot relativeTo mayWidth = do
let relativeToPath = fromMaybe Path.empty relativeTo
let termReference = Referent.toReference referent
let v2Referent = Cv.referent1to2 referent
root <- Backend.resolveRootBranchHashV2 codebase mayRoot
sig <- lift (Codebase.runTransaction codebase (Backend.loadReferentType codebase referent))
(root, sig) <-
Backend.hoistBackend (Codebase.runTransaction codebase) do
root <- Backend.resolveRootBranchHashV2 mayRoot
sig <- lift (Backend.loadReferentType codebase referent)
pure (root, sig)
case sig of
Nothing ->
throwError (Backend.MissingSignatureForTerm termReference)

View File

@ -138,11 +138,15 @@ serveFuzzyFind ::
Backend.Backend m [(FZF.Alignment, FoundResult)]
serveFuzzyFind codebase mayRoot relativeTo limit typeWidth query = do
let path = fromMaybe Path.empty relativeTo
rootHash <- traverse (Backend.hoistBackend (Codebase.runTransaction codebase) . Backend.expandShortCausalHash) mayRoot
rootCausal <- Backend.resolveCausalHashV2 codebase (Cv.causalHash1to2 <$> rootHash)
rootCausal <-
Backend.hoistBackend (Codebase.runTransaction codebase) do
rootHash <- traverse Backend.expandShortCausalHash mayRoot
lift (Backend.resolveCausalHashV2 (Cv.causalHash1to2 <$> rootHash))
(localNamesOnly, ppe) <- Backend.scopedNamesForBranchHash codebase (Just rootCausal) path
relativeToCausal <- lift $ Codebase.getShallowCausalAtPath codebase path (Just rootCausal)
relativeToBranch <- lift $ V2Causal.value relativeToCausal
relativeToBranch <- do
(lift . Codebase.runTransaction codebase) do
relativeToCausal <- Codebase.getShallowCausalAtPath path (Just rootCausal)
V2Causal.value relativeToCausal
let alignments ::
( [ ( FZF.Alignment,
UnisonName,

View File

@ -80,9 +80,12 @@ namespaceDetails ::
namespaceDetails runtime codebase namespacePath maySCH mayWidth =
let width = mayDefaultWidth mayWidth
in do
rootCausal <- Backend.resolveRootBranchHashV2 codebase maySCH
namespaceCausal <- lift $ Codebase.getShallowCausalAtPath codebase namespacePath (Just rootCausal)
shallowBranch <- lift $ V2Causal.value namespaceCausal
(rootCausal, namespaceCausal, shallowBranch) <-
Backend.hoistBackend (Codebase.runTransaction codebase) do
rootCausal <- Backend.resolveRootBranchHashV2 maySCH
namespaceCausal <- lift $ Codebase.getShallowCausalAtPath namespacePath (Just rootCausal)
shallowBranch <- lift $ V2Causal.value namespaceCausal
pure (rootCausal, namespaceCausal, shallowBranch)
namespaceDetails <- do
(_localNamesOnly, ppe) <- Backend.scopedNamesForBranchHash codebase (Just rootCausal) namespacePath
readme <-

View File

@ -171,7 +171,7 @@ serve ::
Backend.Backend IO NamespaceListing
serve codebase maySCH mayRelativeTo mayNamespaceName = do
useIndex <- asks Backend.useNamesIndex
(mayRootHash, codebaseRootHash) <-
(mayRootHash, codebaseRootHash) <-
Backend.hoistBackend (Codebase.runTransaction codebase) do
mayRootHash <- traverse Backend.expandShortCausalHash maySCH
codebaseRootHash <- lift Operations.expectRootCausalHash
@ -200,9 +200,9 @@ serve codebase maySCH mayRelativeTo mayNamespaceName = do
serveFromIndex codebase mayRootHash path'
(True, Just rh)
| rh == causalHash2to1 codebaseRootHash ->
serveFromIndex codebase mayRootHash path'
serveFromIndex codebase mayRootHash path'
| otherwise -> do
serveFromBranch codebase path' (Cv.causalHash1to2 rh)
serveFromBranch codebase path' (Cv.causalHash1to2 rh)
(False, Just rh) -> do
serveFromBranch codebase path' (Cv.causalHash1to2 rh)
(False, Nothing) -> do
@ -222,9 +222,12 @@ serveFromBranch codebase path' rootHash = do
-- worth slowing down the request for this right now.
let ppe = PPE.empty
let listingFQN = Path.toText . Path.unabsolute . either id (Path.Absolute . Path.unrelative) $ Path.unPath' path'
causalAtPath <- liftIO $ Codebase.getShallowCausalFromRoot codebase (Just rootHash) (Path.unabsolute absPath)
(causalAtPath, branchAtPath) <-
(lift . Codebase.runTransaction codebase) do
causalAtPath <- Codebase.getShallowCausalFromRoot (Just rootHash) (Path.unabsolute absPath)
branchAtPath <- V2Causal.value causalAtPath
pure (causalAtPath, branchAtPath)
let listingHash = v2CausalBranchToUnisonHash causalAtPath
branchAtPath <- liftIO $ V2Causal.value causalAtPath
listingEntries <- liftIO $ Backend.lsBranch codebase branchAtPath
makeNamespaceListing ppe listingFQN listingHash listingEntries
@ -234,8 +237,11 @@ serveFromIndex ::
Path.Path' ->
Backend.Backend IO NamespaceListing
serveFromIndex codebase mayRootHash path' = do
listingCausal <- Backend.getShallowCausalAtPathFromRootHash codebase mayRootHash (Path.fromPath' path')
listingBranch <- liftIO $ V2Causal.value listingCausal
(listingCausal, listingBranch) <-
(lift . Codebase.runTransaction codebase) do
listingCausal <- Backend.getShallowCausalAtPathFromRootHash mayRootHash (Path.fromPath' path')
listingBranch <- V2Causal.value listingCausal
pure (listingCausal, listingBranch)
-- TODO: Currently the ppe is just used to render the types returned from the namespace
-- listing, which are currently unused because we don't show types in the side-bar.
-- If we ever show types on hover we need to build and use a proper PPE here, but it's not

View File

@ -40,6 +40,7 @@ import Unison.Prelude
import Unison.Server.Backend
import qualified Unison.Server.Backend as Backend
import Unison.Server.Types (APIGet, UnisonHash)
import qualified Unison.Sqlite as Sqlite
import Unison.Symbol (Symbol)
import Unison.Util.Monoid (foldMapM)
@ -133,13 +134,15 @@ serve codebase mayRoot mayOwner = projects
where
projects :: Backend m [ProjectListing]
projects = do
shallowRootBranch <- case mayRoot of
Nothing -> lift (Codebase.getShallowRootBranch codebase)
Just sch -> do
h <- Backend.hoistBackend (Codebase.runTransaction codebase) (Backend.expandShortCausalHash sch)
-- TODO: can this ever be missing?
causal <- lift $ Codebase.getShallowCausalForHash codebase (Cv.causalHash1to2 h)
lift $ V2Causal.value causal
shallowRootBranch <-
Backend.hoistBackend (Codebase.runTransaction codebase) do
case mayRoot of
Nothing -> lift Codebase.getShallowRootBranch
Just sch -> do
h <- Backend.expandShortCausalHash sch
-- TODO: can this ever be missing?
causal <- lift $ Codebase.expectCausalBranchByCausalHash (Cv.causalHash1to2 h)
lift $ V2Causal.value causal
ownerEntries <- lift $ Backend.lsBranch codebase shallowRootBranch
-- If an owner is provided, we only want projects belonging to them
@ -149,7 +152,7 @@ serve codebase mayRoot mayOwner = projects
Nothing -> mapMaybe entryToOwner ownerEntries
foldMapM (ownerToProjectListings shallowRootBranch) owners
ownerToProjectListings :: V2Branch.Branch m -> ProjectOwner -> Backend m [ProjectListing]
ownerToProjectListings :: V2Branch.Branch Sqlite.Transaction -> ProjectOwner -> Backend m [ProjectListing]
ownerToProjectListings root owner = do
let (ProjectOwner ownerName) = owner
ownerPath' <- (parsePath . Text.unpack) ownerName

View File

@ -0,0 +1,61 @@
This transcript executes very slowly, because the compiler has an
entire copy of base (and other stuff) within it.
```ucm:hide
.> builtins.mergeio
.> pull.without-history unison.public.base.trunk base
```
```unison
stdOut = stdHandle StdOut
print txt =
match putBytes.impl stdOut (toUtf8 txt) with
Left f -> raise f
Right _ -> ()
prints n f = print (join " " f n ++ "\n")
join sep f =
use Text ++
loop acc = cases
0 -> acc
n -> loop (!f ++ (sep ++ acc)) (drop n 1)
cases
0 -> ""
n -> loop !f (drop n 1)
addUp : Nat -> Nat -> Nat
addUp acc = cases
0 -> acc
n -> addUp (1+acc) (drop n 1)
repeat : Nat -> '{g} () -> '{g} ()
repeat n act =
loop : Nat ->{g} ()
loop = cases
0 -> ()
k ->
!act
loop (drop k 1)
'(loop n)
printAddUp : Nat ->{IO,Exception} ()
printAddUp n =
ns = [addUp 0 n, addUp 0 n, addUp 0 n, addUp 0 n, addUp 0 n]
prints 8 '(toText (addUp 0 n))
singleAddUp : '{IO,Exception} ()
singleAddUp = do printAddUp 3000000
multiAddUp : '{IO,Exception} ()
multiAddUp = repeat 35 '(printAddUp 3000000)
```
```ucm
.> add
.> run singleAddUp
.> run.native multiAddUp
```

View File

@ -484,3 +484,26 @@ Regression test for https://github.com/unisonweb/unison/pull/3548
```ucm
.> load scratch.u
```
# Indent long pattern lists to avoid virtual semicolon
Regression test for https://github.com/unisonweb/unison/issues/3627
```unison:hide
(+) a b = ##Nat.+ a b
foo = cases
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa,
bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
-> aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa + bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
```
```ucm
.> add
.> edit foo
.> undo
```
```ucm
.> load scratch.u
```

View File

@ -1469,3 +1469,63 @@ Regression test for https://github.com/unisonweb/unison/pull/3548
I loaded scratch.u and didn't find anything.
```
# Indent long pattern lists to avoid virtual semicolon
Regression test for https://github.com/unisonweb/unison/issues/3627
```unison
(+) a b = ##Nat.+ a b
foo = cases
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa,
bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
-> aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa + bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
```
```ucm
.> add
⍟ I've added these definitions:
+ : Nat -> Nat -> Nat
foo : Nat -> Nat -> Nat
.> edit foo
☝️
I added these definitions to the top of
/Users/runar/work/unison/scratch.u
foo : Nat -> Nat -> Nat
foo = cases
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa,
bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb ->
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
+ bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
You can edit them there, then do `update` to replace the
definitions currently in this namespace.
.> undo
Here are the changes I undid
Added definitions:
1. + : Nat -> Nat -> Nat
2. foo : Nat -> Nat -> Nat
```
```ucm
.> load scratch.u
I found and typechecked these definitions in scratch.u. If you
do an `add` or `update`, here's how your codebase would
change:
⍟ These new definitions are ok to `add`:
foo : Nat -> Nat -> Nat
```

View File

@ -0,0 +1,21 @@
```ucm:hide
.> builtins.mergeio
```
```unison
structural type M a = N | J a
d = {{
{{ docExample 0 '(x -> J x) }}
{J}
}}
```
```ucm
.> add
.> display d
```

View File

@ -0,0 +1,41 @@
```unison
structural type M a = N | J a
d = {{
{{ docExample 0 '(x -> J x) }}
{J}
}}
```
```ucm
I found and typechecked these definitions in scratch.u. If you
do an `add` or `update`, here's how your codebase would
change:
⍟ These new definitions are ok to `add`:
structural type M a
(also named builtin.Optional)
d : Doc2
```
```ucm
.> add
⍟ I've added these definitions:
structural type M a
(also named builtin.Optional)
d : Doc2
.> display d
`x -> J x`
J
```

View File

@ -105,8 +105,8 @@ Notice that Unison detects this as an alias of `merge`, and if we view `merge`
merge : [a] -> [a] -> [a]
merge = cases
[], ys -> ys
xs, [] -> xs
[], ys -> ys
xs, [] -> xs
h +: t, h2 +: t2 ->
if h <= h2 then h +: merge t (h2 +: t2)
else h2 +: merge (h +: t) t2

View File

@ -55,7 +55,7 @@ d = c + 10
At this point, `a3` is conflicted for symbols `c` and `d`, so those are deprioritized.
The original `a2` namespace has an unconflicted definition for `c` and `d`, but since there are multiple 'c's in scope,
`long.name.but.shortest.suffixification` is chosen because its suffixified version has the fewest segments.
`a2.c` is chosen because, although `long.name.but.shortest.suffixification` has a shorter suffixified form, `a2.c`'s fully-qualified name has the fewest segments.
```ucm
.> view a b c d
@ -88,7 +88,8 @@ other.value = 20
```ucm
.biasing> add
-- nested.value should still be preferred even if the suffixification requires more segments than `a`
-- nested.value should be preferred over the shorter name `a` due to biasing
-- because `deeply.nested.value` is close to the term being viewed.
.biasing> view deeply.nested.term
```

View File

@ -1373,7 +1373,7 @@ d = c + 10
```
At this point, `a3` is conflicted for symbols `c` and `d`, so those are deprioritized.
The original `a2` namespace has an unconflicted definition for `c` and `d`, but since there are multiple 'c's in scope,
`long.name.but.shortest.suffixification` is chosen because its suffixified version has the fewest segments.
`a2.c` is chosen because, although `long.name.but.shortest.suffixification` has a shorter suffixified form, `a2.c`'s fully-qualified name has the fewest segments.
```ucm
.> view a b c d
@ -1394,7 +1394,7 @@ The original `a2` namespace has an unconflicted definition for `c` and `d`, but
a2.d : Nat
a2.d =
use Nat +
suffixification + 10
a2.c + 10
a3.c#dcgdua2lj6 : Nat
a3.c#dcgdua2lj6 = 2
@ -1475,7 +1475,8 @@ other.value = 20
other.value : Nat
-- nested.value should still be preferred even if the suffixification requires more segments than `a`
-- nested.value should be preferred over the shorter name `a` due to biasing
-- because `deeply.nested.value` is close to the term being viewed.
.biasing> view deeply.nested.term
deeply.nested.term : Nat

View File

@ -0,0 +1,25 @@
`update` / `patch` (anything that applies a patch) ignores the namespace named "lib" at the location it's applied. This follows
the project organization convention that dependencies are put in "lib"; it's much easier to apply a patch to all of
one's own code if the "lib" namespace is simply ignored.
```ucm:hide
.> builtins.merge
```
```unison
foo = 100
lib.foo = 100
```
```ucm
.> add
```
```unison
foo = 200
```
```ucm
.> update
.> names foo
```

View File

@ -0,0 +1,66 @@
`update` / `patch` (anything that applies a patch) ignores the namespace named "lib" at the location it's applied. This follows
the project organization convention that dependencies are put in "lib"; it's much easier to apply a patch to all of
one's own code if the "lib" namespace is simply ignored.
```unison
foo = 100
lib.foo = 100
```
```ucm
I found and typechecked these definitions in scratch.u. If you
do an `add` or `update`, here's how your codebase would
change:
⍟ These new definitions are ok to `add`:
foo : Nat
lib.foo : Nat
```
```ucm
.> add
⍟ I've added these definitions:
foo : Nat
lib.foo : Nat
```
```unison
foo = 200
```
```ucm
I found and typechecked these definitions in scratch.u. If you
do an `add` or `update`, here's how your codebase would
change:
⍟ These names already exist. You can `update` them to your
new definition:
foo : Nat
(The old definition is also named lib.foo. I'll update
this name too.)
```
```ucm
.> update
⍟ I've updated these names to your new definition:
foo : Nat
(The old definition was also named lib.foo. I updated this
name too.)
.> names foo
Term
Hash: #9ntnotdp87
Names: foo
Tip: Use `names.global` to see more results.
```