mirror of https://github.com/tweag/nickel.git
synced 2024-10-03 22:57:11 +03:00

Fixes typos

This commit is contained in:
parent 3c60ef1d97
commit 0f192ea0ae
@@ -130,7 +130,7 @@ raised during evaluation.
 Each of these `.ncl` files is structured as an array of `Bool` expressions, which
 is ultimately passed to a `check` function defined in
 `tests/integration/pass/lib/assert.ncl`. This function applies an `Assert` contract
-to each value in the array, which checks that the value it is applied to evalutes
+to each value in the array, which checks that the value it is applied to evaluates
 to `true`. The benefit of using a contract for this is that if a test fails we
 can simply run the file directly using Nickel, which gives better error messages
 than the ones we get by default from `cargo test`.
@@ -157,8 +157,8 @@ never requires recursion, this is not the case with library code. Allowing
 recursion makes it possible for programmers to implement new generic
 functionalities \[2\].

-\[1\]: [Why Dhall is not Turing complete](http://neilmitchell.blogspot.com/2020/11/turing-incomplete-languages.html)\
-\[2\]: [Turing incomplete languages](http://www.haskellforall.com/2020/01/why-dhall-advertises-absence-of-turing.html)
+\[1\]: [Why Dhall is not Turing complete](https://neilmitchell.blogspot.com/2020/11/turing-incomplete-languages.html)\
+\[2\]: [Turing incomplete languages](https://www.haskellforall.com/2020/01/why-dhall-advertises-absence-of-turing.html)

 ### Side-Effects

@@ -310,7 +310,7 @@ being the lowest possible one.
 #### Forcing values

 Dually, values with the `force` annotation are given the highest priority. Such
-a value can never be overriden, and will either take precedence over another
+a value can never be overridden, and will either take precedence over another
 value or be tentatively merged if the other value is forcing as well.

 #### Recursive priorities
@@ -75,7 +75,7 @@ impl Completed {
 }

 /// Finds the index of a linearization item for a given location
-/// The linearization is a list of items that are sorted by their physical occurence.
+/// The linearization is a list of items that are sorted by their physical occurrence.
 /// - Each element has a corresponding span in the source
 /// - Spans are either equal (same starting point, same length)
 ///   or shorter but never intersecting
@@ -15,7 +15,7 @@ impl ResolutionState for Unresolved {}
 pub type Resolved = Types;
 impl ResolutionState for Resolved {}

-/// Abstact term kinds.
+/// Abstract term kinds.
 /// Currently tracks
 /// 1. Declarations
 /// 2. Usages
@@ -298,7 +298,7 @@ fn find_fields_from_term(
     let pos = term.pos;
     let span = pos.unwrap();
     let locator = (span.src_id, span.start);
-    // This unwrap is safe becuase we're geting an expression from
+    // This unwrap is safe because we're getting an expression from
     // a linearized file, so all terms in the file and all terms in its
     // dependencies must have being linearized and stored in the cache.
     let linearization = lin_cache.get(&span.src_id).unwrap();
@@ -55,7 +55,7 @@ pub fn handle_to_definition(
         TermKind::Usage(UsageState::Resolved(usage_id)) => {
             let definition = linearization.get_item(usage_id, &server.lin_cache).unwrap();
             if server.cache.is_stdlib_module(definition.id.file_id) {
-                // The standard library files are embeded in the executable,
+                // The standard library files are embedded in the executable,
                 // so we can't possibly go to their definition on disk.
                 server.reply(Response::new_ok(id, Value::Null));
                 return Ok(());
@@ -52,7 +52,7 @@ I think there are some nice properties we should try to show (at least informall

 We have two ways of proceeding, as far as I understand:
 * The first would be to increase the language in some way to add dependencies to the labels, something like `check` and `set` fields. Every time we find `blame l` we first check that everything it depends on has been set, if that's true, we set everything that depends on this.
-* The second would be to provide powerful primitives per usecase, for instance, `split_union l` would generate two sublabels (that would depend on each other) that, if raised, would blame `l`.
+* The second would be to provide powerful primitives per use case, for instance, `split_union l` would generate two sublabels (that would depend on each other) that, if raised, would blame `l`.

 There are still a few hard to solve challenges:
 * **Context tracking**: since negative blame depends on the context, in order to raise it we need to make sure it's the same label, and the same context, both papers solve this in different ways:
@@ -71,6 +71,6 @@ The most similar implementation I could find is [TreatJS], which has a nice blam



-[wadler]: http://homepages.inf.ed.ac.uk/wadler/papers/root-blame/root-blame.pdf
+[wadler]: https://homepages.inf.ed.ac.uk/wadler/papers/root-blame/root-blame.pdf
 [treatjs]: https://proglang.informatik.uni-freiburg.de/treatjs/index.html
 [keil]: http://matthias-keil.de/papers/icfp2015-blame.pdf
@@ -912,7 +912,7 @@ weakEval e = ...

 ### Semantics

-Now that we defined merge trees, we need to extract a potential merge fonction
+Now that we defined merge trees, we need to extract a potential merge function
 from it:

 ```text
@@ -962,7 +962,7 @@ tree of this expression. It needs to be accessible even after evaluation by the
 implementation.

 In practice, we can't update a thunk that contains `e1 & e2` with the result of
-the evaluation. This is already the case with `default` values currenlty
+the evaluation. This is already the case with `default` values currently
 (`default (1 + 1)` isn't updated to `2`, but to `default 2`, otherwise the
 semantics would change). A good view on this is that the semantics is inherently
 call-by-name, and that any caching mechanism (including call-by-need) is a
@@ -138,7 +138,7 @@ annotations:

 ```nickel
 {
-  serialized : Arry Str =
+  serialized : Array Str =
     let some_data = {script = "echo ${hello}", vars = ["hello"] } in
     let other_data = ["one", "two", "three"] in
     [
@@ -646,8 +646,8 @@ impl Cache {
         let CachedTerm {
             term, parse_errs, ..
         } = self.terms.get(&file_id).unwrap();
-        // okay, for now these clones are here becuase the function call below
-        // short circuts, and then we don't put back the item in cache, so the
+        // okay, for now these clones are here because the function call below
+        // short circuits, and then we don't put back the item in cache, so the
         // linearization of a file fails if we can't resolve any of it's imports
         // The current solution is not to remove the item from the cache, and
         // put it back when done, but to get a reference and clone it.
@@ -637,11 +637,11 @@ mod tests {
                 ),
                 "source"
             )
-            .expect("program should't fail")
+            .expect("program shouldn't fail")
             .eval_full()
-            .expect("evaluation should't fail")
+            .expect("evaluation shouldn't fail")
             )
-            .expect("deserialization should't fail"),
+            .expect("deserialization shouldn't fail"),
             A {
                 a: 10.0,
                 b: "test string".to_string(),
@@ -660,11 +660,11 @@ mod tests {
         assert_eq!(
             Vec::<f64>::deserialize(
                 TestProgram::new_from_source(Cursor::new(br#"[1, 2, 3, 4]"#.to_vec()), "source")
-                    .expect("program should't fail")
+                    .expect("program shouldn't fail")
                     .eval_full()
-                    .expect("evaluation should't fail")
+                    .expect("evaluation shouldn't fail")
             )
-            .expect("deserialization should't fail"),
+            .expect("deserialization shouldn't fail"),
             vec![1.0, 2.0, 3.0, 4.0]
         )
     }
@@ -678,9 +678,9 @@ mod tests {
             Cursor::new(br#"fun a b => a + b"#.to_vec()),
             "source",
         )
-        .expect("program should't fail");
+        .expect("program shouldn't fail");

-        let q = p.eval_full().expect("evaluation should't fail");
+        let q = p.eval_full().expect("evaluation shouldn't fail");

         assert_eq!(
             A::deserialize(q),
@@ -704,11 +704,11 @@ mod tests {
                 Cursor::new(br#"{ a = (10 | Num) }"#.to_vec()),
                 "source"
             )
-            .expect("program should't fail")
+            .expect("program shouldn't fail")
             .eval_full()
-            .expect("evaluation should't fail")
+            .expect("evaluation shouldn't fail")
             )
-            .expect("deserialization should't fail"),
+            .expect("deserialization shouldn't fail"),
             A { a: 10.0 }
         )
     }
@@ -14,7 +14,7 @@ use crate::types::{TypeF, Types};
 #[derive(Debug, PartialEq, Clone)]
 pub enum Match {
     /// `{..., a=b, ...}` will bind the field a of the record to variable a. Here, a is the first
-    /// field of this variant and b the optional one. The last field can actualy be a nested
+    /// field of this variant and b the optional one. The last field can actually be a nested
     /// destruct pattern.
     Assign(Ident, MetaValue, (Option<Ident>, Destruct)),
     /// Simple binding. the `Ident` is bind to a variable with the same name.
@@ -27,7 +27,7 @@ pub enum LastMatch {
     /// The last field is a normal match. In this case the pattern is "closed" so every record
     /// fields should be matched.
     Match(Box<Match>),
-    /// The pattern is "open" `, ..}`. Optionaly you can bind a record containing the remaining
+    /// The pattern is "open" `, ..}`. Optionally you can bind a record containing the remaining
     /// fields to an `Identifier` using the syntax `, ..y}`.
     Ellipsis(Option<Ident>),
 }
@@ -44,7 +44,7 @@ pub enum Destruct {
     },
     /// An array destructuring. Not implemented.
     Array { matches: Vec<Match>, span: RawSpan },
-    /// An empty destructuring. In this case, the pattern is a clasical `let var = something in
+    /// An empty destructuring. In this case, the pattern is a classical `let var = something in
     /// body` form.
     Empty,
 }
@@ -1427,7 +1427,7 @@ impl ToDiagnostic<FileId> for ParseError {
                 ])
                 .with_notes(vec![
                     String::from("Using a polymorphic tail in a record `{ ..; a}` requires the rest of the record to be only composed of type annotations, of the form `<field>: <type>`."),
-                    String::from("Value assignements, such as `<field> = <expr>`, metadata, etc. are forbidden."),
+                    String::from("Value assignments, such as `<field> = <expr>`, metadata, etc. are forbidden."),
                 ]),
             ParseError::RecursiveLetPattern(span) => Diagnostic::error()
                 .with_message("recursive destructuring is not supported")
@@ -322,7 +322,7 @@ pub fn merge<C: Cache>(
         _ => unreachable!(),
     };

-    // Finally, we also need to closurize the contracts in the final envirnment.
+    // Finally, we also need to closurize the contracts in the final environment.
     let mut contracts1: Vec<Contract> = contracts1
         .into_iter()
         .map(|ctr| ctr.closurize(cache, &mut env, env1.clone()))
@@ -358,7 +358,7 @@ pub fn merge<C: Cache>(
                 doc,
                 types,
                 contracts,
-                // If one of the record requires this field, then it musn't be optional. The
+                // If one of the record requires this field, then it mustn't be optional. The
                 // resulting field is optional iff both are.
                 opt: opt1 && opt2,
                 priority,
@@ -15,7 +15,7 @@
 //! to garbage collection (currently reference counting based)
 //! - A [callstack], mainly for error reporting purpose
 //!
-//! Depending on the shape of the current term, the following actions are preformed:
+//! Depending on the shape of the current term, the following actions are performed:
 //!
 //! ## Core calculus
 //! - **Var(id)**: the term bound to `id` in the environment is fetched, and an update thunk is
@@ -582,7 +582,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
                 ))
             }
             // Closurize the array if it's not already done.
-            // This *should* make it unecessary to call closurize in [operation].
+            // This *should* make it unnecessary to call closurize in [operation].
             // See the comment on the `BinaryOp::ArrayConcat` match arm.
             Term::Array(terms, attrs) if !attrs.closurized => {
                 let mut local_env = Environment::new();
@@ -1063,8 +1063,8 @@ pub fn is_empty_optional<C: Cache>(cache: &C, rt: &RichTerm, env: &Environment)
         }
     }

-    // The total amount of gas is rather abritrary, but in any case, it ought to stay low: remember
-    // that is_empty_optional may be called on each field of a record when evaluatinog some record
+    // The total amount of gas is rather arbitrary, but in any case, it ought to stay low: remember
+    // that is_empty_optional may be called on each field of a record when evaluating some record
     // operations.
     is_empty_optional_aux(cache, rt, env, false, &mut 8)
 }
@@ -542,7 +542,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
                 let mut shared_env = Environment::new();
                 let f_as_var = f.body.closurize(&mut self.cache, &mut env, f.env);

-                // Array elements are closurized to preserve lazyness of data structures. It
+                // Array elements are closurized to preserve laziness of data structures. It
                 // maintains the invariant that any data structure only contain thunks (that is,
                 // currently, variables).
                 let ts = ts
@@ -585,7 +585,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
                 if n < 0.0 || n.fract() != 0.0 {
                     Err(EvalError::Other(
                         format!(
-                            "generate: expected the 1st agument to be a positive integer, got {}",
+                            "generate: expected the 1st argument to be a positive integer, got {}",
                             n
                         ),
                         pos_op,
@@ -594,7 +594,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
                 let mut shared_env = Environment::new();
                 let f_as_var = f.body.closurize(&mut self.cache, &mut env, f.env);

-                // Array elements are closurized to preserve lazyness of data structures. It
+                // Array elements are closurized to preserve laziness of data structures. It
                 // maintains the invariant that any data structure only contain thunks (that is,
                 // currently, variables).
                 let ts = (0..n_int)
@@ -926,7 +926,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
             UnaryOp::CharFromCode() => {
                 if let Term::Num(code) = *t {
                     if code.fract() != 0.0 {
-                        Err(EvalError::Other(format!("charFromCode: expected the agument to be an integer, got the floating-point value {}", code), pos_op))
+                        Err(EvalError::Other(format!("charFromCode: expected the argument to be an integer, got the floating-point value {}", code), pos_op))
                     } else if code < 0.0 || code > (u32::MAX as f64) {
                         Err(EvalError::Other(format!("charFromCode: code out of bounds. Expected a value between 0 and {}, got {}", u32::MAX, code), pos_op))
                     } else if let Some(car) = std::char::from_u32(code as u32) {
@@ -2151,7 +2151,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
             (Term::Array(ts, attrs), Term::Num(n)) => {
                 let n_int = *n as usize;
                 if n.fract() != 0.0 {
-                    Err(EvalError::Other(format!("elemAt: expected the 2nd agument to be an integer, got the floating-point value {}", n), pos_op))
+                    Err(EvalError::Other(format!("elemAt: expected the 2nd argument to be an integer, got the floating-point value {}", n), pos_op))
                 } else if *n < 0.0 || n_int >= ts.len() {
                     Err(EvalError::Other(format!("elemAt: index out of bounds. Expected a value between 0 and {}, got {}", ts.len(), n), pos_op))
                 } else {
@@ -2551,7 +2551,7 @@ impl<R: ImportResolver, C: Cache> VirtualMachine<R, C> {
                 let end_int = *end as usize;

                 if start.fract() != 0.0 {
-                    Err(EvalError::Other(format!("substring: expected the 2nd agument (start) to be an integer, got the floating-point value {}", start), pos_op))
+                    Err(EvalError::Other(format!("substring: expected the 2nd argument (start) to be an integer, got the floating-point value {}", start), pos_op))
                 } else if !s.is_char_boundary(start_int) {
                     Err(EvalError::Other(format!("substring: index out of bounds. Expected the 2nd argument (start) to be between 0 and {}, got {}", s.len(), start), pos_op))
                 } else if end.fract() != 0.0 {
@@ -3029,7 +3029,7 @@ impl RecPriority {
 /// # Return
 ///
 /// If the comparison is successful, returns a bool indicating whether the values were equal,
-/// otherwise returns an [`EvalError`] indiciating that the values cannot be compared.
+/// otherwise returns an [`EvalError`] indicating that the values cannot be compared.
 fn eq<C: Cache>(
     cache: &mut C,
     env: &mut Environment,
@@ -3053,7 +3053,7 @@ fn eq<C: Cache>(
     } = c2;

     // Take a list of subequalities, and either return `EqResult::Bool(true)` if it is empty, or
-    // generate an approriate `EqResult::Eqs` variant with closurized terms in it.
+    // generate an appropriate `EqResult::Eqs` variant with closurized terms in it.
     fn gen_eqs<I, C: Cache>(
         cache: &mut C,
         mut it: I,
@@ -127,7 +127,7 @@ mod interner {

     use typed_arena::Arena;

-    /// A symbol is a correspondance between an [Ident](super::Ident) and its string representation stored in the [Interner].
+    /// A symbol is a correspondence between an [Ident](super::Ident) and its string representation stored in the [Interner].
     #[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
     pub struct Symbol(u32);

@@ -613,7 +613,7 @@ impl<'input> Iterator for Lexer<'input> {
                 NormalToken::MultiStringStart(delim_size)
                 | NormalToken::SymbolicStringStart(delim_size),
             )) => {
-                // for interpolation & closing delimeters we only care about
+                // for interpolation & closing delimiters we only care about
                 // the number of `%`s (plus the opening `"` or `{`) so we
                 // drop the "kind marker" size here (i.e. the `m` character).
                 let size_without_kind_marker = delim_size - 1;
@@ -523,7 +523,7 @@ impl FixTypeVars for Types {
                 // to set the right value for `var_kind`.
                 bound_vars.insert(*var, VarKindCell::new());
                 (*body).fix_type_vars_env(bound_vars.clone(), span)?;
-                // unwrap(): we just inseted a value for `var` above, and environment can never
+                // unwrap(): we just inserted a value for `var` above, and environment can never
                 // delete values.
                 *var_kind = bound_vars.get(var).unwrap().var_kind();

@@ -50,8 +50,8 @@ impl StringStartDelimiter {
     }
 }

-/// Distinguish between the standard string closing delimter `"` and the "special" string
-/// closing delimeter `"%`.
+/// Distinguish between the standard string closing delimiter `"` and the "special" string
+/// closing delimiter `"%`.
 #[derive(Copy, Clone, Debug, Eq, PartialEq)]
 pub enum StringEndDelimiter {
     Standard,
@@ -279,7 +279,7 @@ pub fn elaborate_field_path(
     let fst = it.next().unwrap();

     let content = it.rev().fold(content, |acc, path_elem| {
-        // unwrap is safe here becuase the initial content has a position,
+        // unwrap is safe here because the initial content has a position,
         // and we make sure we assign a position for the next field.
         let acc_span = acc.pos.unwrap();
         let pos = match path_elem {
@@ -43,7 +43,7 @@ impl RawSpan {
 #[derive(Debug, Clone, Copy, Eq, PartialEq, Hash)]
 pub enum TermPos {
     /// The term exactly corresponds to an original expression in the source, or is a construct
-    /// introduced by program transformation that correponds to an original span in the source.
+    /// introduced by program transformation that corresponds to an original span in the source.
     Original(RawSpan),
     /// The term is the result of the evaluation of an original expression in the source.
     Inherited(RawSpan),
@@ -362,7 +362,7 @@ mod doc {
                 document.append(parse_documentation(header_level, arena, md, options))
             }
             Term::Record(record) | Term::RecRecord(record, _, _) => {
-                // Sorting fields for a determinstic output
+                // Sorting fields for a deterministic output
                 let mut entries: Vec<(_, _)> = record.fields.iter().collect();
                 entries.sort_by_key(|(k, _)| *k);

@@ -160,7 +160,7 @@ impl Default for Attributes {
 /// Print the result of a metadata query, which is a "weakly" evaluated term (see
 /// [`crate::eval::VirtualMachine::eval_meta`] and [`crate::program::query`]).
 ///
-/// Wrapper around `write_query_result_` that selects an adapated query printer at compile time.
+/// Wrapper around `write_query_result_` that selects an adapted query printer at compile time.
 pub fn write_query_result(
     out: &mut impl Write,
     term: &Term,
@@ -1130,7 +1130,7 @@ impl UnaryOp {
 pub enum BinaryOp {
     /// Addition of numerals.
     Plus(),
-    /// Substraction of numerals.
+    /// Subtraction of numerals.
     Sub(),
     /// Multiplication of numerals.
     Mult(),
@@ -1144,11 +1144,11 @@ pub enum BinaryOp {
     StrConcat(),
     /// Polymorphic equality.
     Eq(),
-    /// Stricty less than comparison operator.
+    /// Strictly less than comparison operator.
     LessThan(),
     /// Less than or equal comparison operator.
     LessOrEq(),
-    /// Stricty greater than comparison operator.
+    /// Strictly greater than comparison operator.
     GreaterThan(),
     /// Greater than or equal comparison operator.
     GreaterOrEq(),
@@ -40,7 +40,7 @@ use crate::term::{BinaryOp::DynRemove, MetaValue, RichTerm, Term, UnaryOp::Stati
 /// It desugar a `RichTerm` if possible (the term is a let pattern or a function with patterns in
 /// its arguments).
 /// ## Warning:
-/// The transformation is generaly not recursive. The result can contain patterns itself.
+/// The transformation is generally not recursive. The result can contain patterns itself.
 pub fn transform_one(rt: RichTerm) -> RichTerm {
     match *rt.term {
         Term::LetPattern(..) => desugar_with_contract(rt),
@@ -50,7 +50,7 @@ pub fn transform_one(rt: RichTerm) -> RichTerm {
 }

 /// Desugar a function with patterns as arguments.
-/// This function does not perform nested transformation because internaly it's only used in a top
+/// This function does not perform nested transformation because internally it's only used in a top
 /// down traversal. This means that the return value is a normal `Term::Fun` but it can contain
 /// `Term::FunPattern` and `Term::LetPattern` inside.
 pub fn desugar_fun(rt: RichTerm) -> RichTerm {
@@ -68,7 +68,7 @@ pub trait LinearizationState {}
 impl LinearizationState for () {}
 impl LinearizationState for Uninit {}

-/// The linearizer trait is what is refered to during typechecking.
+/// The linearizer trait is what is referred to during typechecking.
 /// It is the interface to recording terms (while tracking their scope)
 /// and finalizing a linearization using generically defined external information
 ///
@@ -124,7 +124,7 @@ pub trait Linearizer {
     /// Ensures the scope structure of the source can be represented in the
     /// linearization.
     /// The specific implementations need to take care of how to represent
-    /// decending into a lower scope.
+    /// descending into a lower scope.
     /// Notice, the resulting instance is a fresh value, any resource that is
     /// required or produced in parallel instances should therefore be put
     /// into the Building State `L` which is passed
@@ -682,7 +682,7 @@ pub struct State<'a> {
 }

 /// Immutable and owned data, required by the LSP to carry out specific analysis.
-/// It is basically an owned-subset of the typecheking state.
+/// It is basically an owned-subset of the typechecking state.
 pub struct Extra {
     pub table: UnifTable,
     pub names: HashMap<VarId, Ident>,
@@ -241,7 +241,7 @@ pub fn get_bop_type(
         // Str -> Str -> Str
         BinaryOp::StrConcat() => (mk_uniftype::str(), mk_uniftype::str(), mk_uniftype::str()),
         // Ideally: Contract -> Label -> Dyn -> Dyn
-        // Currenty: Dyn -> Dyn -> (Dyn -> Dyn)
+        // Currently: Dyn -> Dyn -> (Dyn -> Dyn)
         BinaryOp::Assume() => (
             mk_uniftype::dynamic(),
             mk_uniftype::dynamic(),
@@ -80,7 +80,7 @@ pub struct RecordRowF<Ty> {
 /// a sequence of `EnumRow`s, ending potentially with a type variable tail position.
 ///
 /// `EnumRowF` is the same as `EnumRow` and doesn't have any type parameter. We introduce the alias
-/// nontheless for consistency with other parametrized type definitions. See [`TypeF`] for more
+/// nonetheless for consistency with other parametrized type definitions. See [`TypeF`] for more
 /// details.
 pub type EnumRowF = Ident;
 pub type EnumRow = EnumRowF;
@@ -1050,7 +1050,7 @@ mod test {
     /// Take a string representation of a type, parse it, and assert that formatting it gives the
     /// same string as the original argument.
     ///
-    /// Note that their are infintely many string representations of the same type since, for
+    /// Note that there are infinitely many string representations of the same type since, for
     /// example, spaces are ignored: for the outcome of this function to be meaningful, the
     /// original type must be written in the same way as types are formatted.
     fn assert_format_eq(s: &str) {
@@ -148,7 +148,7 @@

   join : Str -> Array Str -> Str
     | doc m%"
-      Joins a array of strings given a seperator.
+      Joins a array of strings given a separator.

       For example:
       ```nickel
@@ -286,7 +286,7 @@

   replace: Str -> Str -> Str -> Str
     | doc m%"
-      `replace sub repl str` replaces every occurence of `sub` in `str` with `repl`.
+      `replace sub repl str` replaces every occurrence of `sub` in `str` with `repl`.

       For example:
       ```nickel
@@ -94,7 +94,7 @@ fn serialize() {
         )
         .as_bytes(),
         ),
-        "shoud success",
+        "should success",
     )
     .unwrap();
     assert_eq!(prog.eval().map(Term::from), Ok(Term::Bool(true)));
@@ -24,7 +24,7 @@ let {check, ..} = import "lib/assert.ncl" in
   |> record.insert "foo" 1
   |> record.has_field "foo",

-  # lazyness of map
+  # laziness of map
   (record.map (fun x y => y + 1) {foo = 1, bar = "it's lazy"}).foo
     == 2,
