All Enso objects are hashable (#3878)

* Hash codes prototype

* Remove Any.hash_code

* Improve caching of hashcode in atoms

* [WIP] Add Hash_Map type

* Implement Any.hash_code builtin for primitives and vectors

* Add some values to ValuesGenerator

* Fix example docs on Time_Zone.new

* [WIP] QuickFix for HashCodeTest before PR #3956 is merged

* Fix hash code contract in HashCodeTest
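The contract that this fix enforces can be sketched in a few lines (illustrative Python, not the Enso implementation; the helper name is hypothetical): equal values must produce equal hash codes, while the converse need not hold.

```python
# Sketch of the hash code contract verified by HashCodeTest:
# equality must imply hash equality (unequal values MAY still collide).
def check_hash_contract(a, b):
    if a == b:
        # Equal values are required to hash identically.
        assert hash(a) == hash(b)

check_hash_contract(1, 1.0)        # equal across numeric representations
check_hash_contract("abc", "abc")  # equal texts
```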

* Add times and dates values to HashCodeTest

* Fix docs

* Remove hashCodeForMetaInterop specialization

* Introduce snapshotting of HashMapBuilder

* Add unit tests for EnsoHashMap

* Remove duplicate test in Map_Spec.enso

* Hash_Map.to_vector caches result

* Hash_Map_Spec is a copy of Map_Spec

* Implement some methods in Hash_Map

* Add equalsHashMaps specialization to EqualsAnyNode

* get and insert operations are able to work with polyglot values

* Implement rest of Hash_Map API

* Add test that inserts elements with keys with same hash code

* EnsoHashMap.toDisplayString uses builder storage directly

* Add separate specialization for host objects in EqualsAnyNode

* Fix specialization for host objects in EqualsAnyNode

* Add polyglot hash map tests

* EconomicMap keeps a reference to EqualsNode and HashCodeNode.

Rather than passing these nodes to the `get` and `insert` methods.
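This design choice — storing the equality and hashing strategies once in the map structure instead of threading them through every call — can be sketched in plain Python (illustrative only; `StrategyMap` is a hypothetical stand-in for EconomicMap with its cached Truffle nodes):

```python
# Illustrative sketch: the map captures its hash/equality strategies at
# construction, so `get`/`insert` callers no longer pass them in.
class StrategyMap:
    def __init__(self, hash_fn, eq_fn):
        self._hash_fn = hash_fn   # analogous to the cached HashCodeNode
        self._eq_fn = eq_fn       # analogous to the cached EqualsNode
        self._buckets = {}        # hash code -> list of (key, value) pairs

    def insert(self, key, value):
        bucket = self._buckets.setdefault(self._hash_fn(key), [])
        for i, (k, _) in enumerate(bucket):
            if self._eq_fn(k, key):
                bucket[i] = (key, value)  # overwrite the equal key
                return
        bucket.append((key, value))

    def get(self, key, if_missing=None):
        for k, v in self._buckets.get(self._hash_fn(key), []):
            if self._eq_fn(k, key):
                return v
        return if_missing

# Case-insensitive text keys, configured once at construction:
m = StrategyMap(hash_fn=lambda k: hash(k.lower()),
                eq_fn=lambda a, b: a.lower() == b.lower())
m.insert("Key", 1)
assert m.get("KEY") == 1
```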

* HashMapTest runs in a polyglot context

* Fix containsKey index handling in snapshots

* Remove snapshots field from EnsoHashMapBuilder

* Prepare polyglot hash map handling.

- Hash_Map builtin methods are separate nodes

* Some bug fixes

* Remove ForeignMapWrapper.

We would have to wrap foreign maps in assignments for this to be efficient.

* Improve performance of Hash_Map.get_builtin

Also, the `if_nothing` parameter is now suspended

* Remove to_flat_vector.

The interop API requires a nested vector (our previous to_vector implementation). It seems I misunderstood the docs the first time I read them.

- to_vector does not sort the vector by keys by default

* Fix polyglot hash maps method dispatch

* Add tests that effectively test hash code implementation.

Via hash map that behaves like a hash set.
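The testing trick described here — using the map as a set to exercise hash codes — can be sketched in Python (illustrative, not the Enso test code): keys that share a hash code but are unequal must stay separate entries, while equal keys must collapse into one.

```python
# Sketch of testing a hash map via set-like usage: forced collisions
# must not merge unequal keys; equal keys must overwrite.
class Collider:
    """Keys that deliberately share one hash code."""
    def __init__(self, label):
        self.label = label
    def __hash__(self):
        return 42  # every Collider collides
    def __eq__(self, other):
        return isinstance(other, Collider) and self.label == other.label

m = {}
m[Collider("a")] = True
m[Collider("b")] = True   # same hash, unequal -> second entry
m[Collider("a")] = True   # equal to the first key -> overwrites it
assert len(m) == 2
```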

* Remove Hashcode_Spec

* Add some polyglot tests

* Add Text.== tests for NFD normalization

* Fix NFD normalization bug in Text.java
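The class of bug referenced here can be illustrated with Python's `unicodedata` (a sketch, not the Java fix itself): a composed character and its decomposed form differ at the code-point level, so text equality and hashing must normalize before comparing.

```python
# Sketch of the NFD normalization issue: U+00E9 vs "e" + U+0301 should
# be treated as the same text, so both == and hash must normalize.
import unicodedata

composed = "\u00e9"      # é as a single code point
decomposed = "e\u0301"   # e followed by a combining acute accent

assert composed != decomposed                  # raw code points differ
nfd = lambda s: unicodedata.normalize("NFD", s)
assert nfd(composed) == nfd(decomposed)        # equal after NFD
assert hash(nfd(composed)) == hash(nfd(decomposed))
```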

* Improve performance of EqualsAnyNode.equalsTexts specialization

* Properly compute hash code for Atom and cache it

* Fix Text specialization in HashCodeAnyNode

* Add Hash_Map_Spec as part of all tests

* Remove HashMapTest.java

Providing all the infrastructure for all the needed Truffle nodes is no longer manageable.

* Remove rest of identityHashCode message implementations

* Replace old Map with Hash_Map

* Add some docs

* Add TruffleBoundaries

* Formatting

* Fix some tests to accept unsorted vector from Map.to_vector

* Delete Map.first and Map.last methods

* Add specialization for big integer hash

* Introduce proper HashCodeTest and EqualsTest.

- Use JUnit theories.
- Call nodes directly.

* Fix some specializations for primitives in HashCodeAnyNode

* Fix host object specialization

* Remove Any.hash_code

* Fix import in Map.enso

* Update changelog

* Reformat

* Add truffle boundary to BigInteger.hashCode

* Fix performance of HashCodeTest - initialize DataPoints just once

* Fix MetaIsATest

* Fix ValuesGenerator.textual - Java's char is not Text

* Fix indent in Map_Spec.enso

* Add maps to datapoints in HashCodeTest

* Add specialization for maps in HashCodeAnyNode

* Add multiLevelAtoms to ValuesGenerator

* Provide a workaround for non-linear key inserts

* Fix specializations for double and BigInteger

* Cosmetics

* Add truffle boundaries

* Add allowInlining=true to some truffle boundaries.

Increases performance a lot.

* Increase the size of vectors and the warmup time for the Vector.Distinct benchmark

* Various small performance fixes.

* Fix Geo_Spec tests to accept unsorted Map.to_vector

* Implement Map.remove

* Fix Visualization tests to accept unsorted Map.to_vector

* Treat java.util.Properties as Map

* Add truffle boundaries

* Invoke polyglot methods on java.util.Properties

* Ignore python tests if python lang is missing
Authored by Pavel Marek on 2023-01-19 10:33:25 +01:00; committed by GitHub.
Parent: d463a43633
Commit: fcc2163ae3
GPG Key ID: 4AEE18F83AFDEB23 (no known key found for this signature in database)
42 changed files with 2839 additions and 697 deletions


@@ -429,6 +429,7 @@
 [3964]: https://github.com/enso-org/enso/pull/3964
 [3967]: https://github.com/enso-org/enso/pull/3967
 [3987]: https://github.com/enso-org/enso/pull/3987
+[3878]: https://github.com/enso-org/enso/pull/3878
 [3997]: https://github.com/enso-org/enso/pull/3997
 [4013]: https://github.com/enso-org/enso/pull/4013
 [4026]: https://github.com/enso-org/enso/pull/4026
@@ -514,6 +515,7 @@
 - [Sync language server with file system after VCS restore][4020]
 - [`ArrayOverBuffer` behaves like an `Array` and `Array.sort` no longer sorts in
   place][4022]
+- [Implement hashing functionality for all objects][3878]
 - [Introducing Meta.atom_with_hole][4023]
 - [Report failures in name resolution in type signatures][4030]
 - [Attach visualizations to sub-expressions][4048]


@@ -1,181 +1,89 @@
-import project.Any.Any
 import project.Data.Numbers.Integer
-import project.Data.Ordering.Ordering
-import project.Data.Map.Internal
+import project.Data.Vector.Vector
 import project.Data.Pair.Pair
 import project.Data.Text.Text
-import project.Data.Vector.Vector
-import project.Error.Error
-import project.Error.No_Such_Key.No_Such_Key
-import project.Nothing.Nothing
 from project.Data.Boolean import Boolean, True, False
+from project import Error, Nothing, Any, Panic
+from project.Error.No_Such_Key import No_Such_Key
 
-## A key-value store. This type assumes all keys are pairwise comparable,
-   using the `<`, `>` and `==` operators.
-type Map
+## A key-value store. It is possible to use any type as keys and values and mix them in
+   one Map. Keys are checked for equality based on their hash code and `==` operator, which
+   is both an internal part of Enso. Enso is capable of computing a hash code, and checking
+   for equality any objects that can appear in Enso - primitives, Atoms, values coming from
+   different languages, etc.
+
+   A single key-value pair is called an *entry*.
+
+   It is possible to pass a Map created in Enso to foreign functions, where it will be treated
+   as appropriate map structures - in Python that is a dictionary, and in JavaScript, it is
+   a `Map`. And likewise, it is possible to pass a foreign map into Enso, where it will be
+   treated as a Map.
+@Builtin_Type
+type Map key value
     ## Returns an empty map.
+
+       > Example
+         Create an empty map.
+
+             import Standard.Base.Data.Map.Map
+
+             example_empty = Map.empty
    empty : Map
-    empty = Map.Tip
+    empty = @Builtin_Method "Map.empty"
 
-    ## Returns a single-element map with the given key and value present.
+    ## Returns a single-element map with the given key and value.
+       A Call to `Map.singleton key value` is the same as a call to
+       `Map.empty.insert key value`.
 
        Arguments:
-       - key: The key to update in the map.
-       - value: The value to store against 'key' in the map.
+       - key: The key to to use for `value` in the map.
+       - value: The value to store under 'key' in the map.
 
        > Example
-         Create a single element map storing the key 1 and the value 2.
+         Create a single element map storing the key "my_key" and the value 2.
 
             import Standard.Base.Data.Map.Map
 
-            example_singleton = Map.singleton 1 2
+            example_singleton = Map.singleton "my_key" 2
    singleton : Any -> Any -> Map
-    singleton key value = Map.Bin 1 key value Map.Tip Map.Tip
+    singleton key value = Map.empty.insert key value
 
-    ## Builds a map from a vector of key-value pairs.
+    ## Builds a map from a vector of key-value pairs, with each key-value pair
+       represented as a 2 element vector.
 
        Arguments:
-       - vec: A vector of key-value pairs.
+       - vec: A vector of key-value pairs (2 element vectors).
 
        > Example
         Building a map containing two key-value pairs.
 
            import Standard.Base.Data.Map.Map
 
-           example_from_vector = Map.from_vector [[1, 2], [3, 4]]
+           example_from_vector = Map.from_vector [["A", 1], ["B", 2]]
    from_vector : Vector Any -> Map
    from_vector vec = vec.fold Map.empty (m -> el -> m.insert (el.at 0) (el.at 1))
 
-    ## PRIVATE
-       A key-value store. This type assumes all keys are pairwise comparable,
-       using the `<`, `>` and `==` operators.
-    Tip
-
-    ## PRIVATE
-       A key-value store. This type assumes all keys are pairwise comparable,
-       using the `<`, `>` and `==` operators.
-
-       Arguments:
-       - s: The size of the tree at this node.
-       - key: The key stored at this node.
-       - value: The value stored at this node.
-       - left: The left subtree.
-       - right: The right subtree.
-    Bin s key value left right
-
-    ## Checks if the map is empty.
-
-       > Example
-         Check if a map is empty.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_is_empty = Examples.map.is_empty
+    ## Returns True iff the Map is empty, i.e., does not have any entries.
    is_empty : Boolean
-    is_empty self = case self of
-        Map.Bin _ _ _ _ _ -> False
-        Map.Tip -> True
+    is_empty self = self.size == 0
 
-    ## Checks if the map is not empty.
-
-       > Example
-         Check if a map is not empty.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_not_empty = Examples.map.not_empty
+    ## Returns True iff the Map is not empty, i.e., has at least one entry.
    not_empty : Boolean
    not_empty self = self.is_empty.not
 
    ## Returns the number of entries in this map.
-
-       > Example
-         Get the size of a map.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_size = Examples.map.size
    size : Integer
-    size self = case self of
-        Map.Bin s _ _ _ _ -> s
-        Map.Tip -> 0
-
-    ## Converts the map into a vector of `[key, value]` pairs.
-       The returned vector is sorted in the increasing order of keys.
-
-       > Example
-         Convert a map to a vector.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_to_vector = Examples.map.to_vector
-    to_vector : Vector Any
-    to_vector self =
-        builder = Vector.new_builder
-        to_vector_with_builder m = case m of
-            Map.Bin _ k v l r ->
-                to_vector_with_builder l
-                builder.append [k, v]
-                to_vector_with_builder r
-                Nothing
-            Map.Tip -> Nothing
-        to_vector_with_builder self
-        result = builder.to_vector
-        result
-
-    ## Returns a text representation of this map.
-
-       > Example
-         Convert a map to text.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_to_text = Examples.map.to_text
-    to_text : Text
-    to_text self = self.to_vector.to_text
-
-    ## Checks if this map is equal to another map.
-
-       Arguments:
-       - that: The map to compare `self` to.
-
-       Maps are equal when they contained the same keys and the values
-       associated with each key are pairwise equal.
-
-       > Example
-         Checking two maps for equality.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_equals =
-                 other = Map.empty . insert 1 "one" . insert 3 "three" . insert 5 "five"
-                 Examples.map == other
-    == : Map -> Boolean
-    == self that = case that of
-        _ : Map -> self.to_vector == that.to_vector
-        _ -> False
+    size self = @Builtin_Method "Map.size"
 
    ## Inserts a key-value mapping into this map, overriding any existing
       instance of `key` with the new `value`.
+
+       Note that since the return type is also a `Map`, multiple `insert`
+       calls can be chained, e.g., `map.insert "A" 1 . insert "B" 2`.
+
+       Due to the limitation of the current implementation, inserts with a
+       key that is already contained in the map, or insert on a map instance that
+       is re-used in other computations, have a linear time complexity.
+       For all the other cases, the time complexity of this method is constant.
 
       Arguments:
       - key: The key to insert the value for.
-       - value: The value to associate with `key`.
+       - value: The value to associate with the `key`.
 
       > Example
        Insert the value "seven" into the map for the key 7.
@@ -185,27 +93,50 @@ type Map
         example_insert = Examples.map.insert 7 "seven"
    insert : Any -> Any -> Map
-    insert self key value = Internal.insert self key value
+    insert self key value = @Builtin_Method "Map.insert"
 
-    ## Gets the value associated with `key` in this map, or throws a
-       `No_Such_Key.Error` if `key` is not present.
+    ## Removes an entry specified by the given key from this map, and
+       returns a new map without this entry. Throw `No_Such_Key.Error`
+       if `key` is not present.
+
+       Arguments:
+       - key: The key to look up in the map.
+
+       > Example
+         Remove key "A" from a map
+
+             import Standard.Data.Map.Map
+
+             Examples.map.remove "A"
+    remove : Any -> Map ! No_Such_Key
+    remove self key =
+        Panic.catch Any (self.remove_builtin key) _->
+            Error.throw No_Such_Key.Error self key
+
+    ## Gets the value associated with `key` in this map, or throws a
+       `No_Such_Key.Error` if `key` is not present.
+
+       This method has a constant time complexity.
 
       Arguments:
       - key: The key to look up in the map.
 
       > Example
-         Get the value for the key 1 in a map.
+         Looks up the value for the key "A" in a map.
 
            import Standard.Base.Data.Map.Map
            import Standard.Examples
 
-            example_at = Examples.map.at 1
+            example_at = Examples.map.at "A"
    at : Any -> Any ! No_Such_Key
    at self key = self.get key (Error.throw (No_Such_Key.Error self key))
 
    ## Gets the value associated with `key` in this map, or returns
       `if_missing` if it isn't present.
+
+       This method has a constant time complexity.
 
       Arguments:
       - key: The key to look up in the map.
       - if_missing: The value to use if the key isn't present.
@@ -219,57 +150,19 @@ type Map
         example_get = Examples.map.get 2 "zero"
    get : Any -> Any -> Any
-    get self key ~if_missing=Nothing =
-        go map = case map of
-            Map.Tip -> if_missing
-            Map.Bin _ k v l r -> case Internal.compare_allow_nothing key k of
-                Ordering.Equal -> v
-                Ordering.Less -> @Tail_Call go l
-                Ordering.Greater -> @Tail_Call go r
-        result = go self
-        result
+    get self key ~if_missing=Nothing = self.get_builtin key if_missing
 
-    ## Checks if a key is in the map.
-
-       Arguments:
-       - key: The key to look up in the map.
-
-       > Example
-         Checks the key 2 is in a map.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_contains = Examples.map.contains_key 2
+    ## Returns True iff the Map contains the given `key`.
    contains_key : Any -> Boolean
-    contains_key self key =
-        go map = case map of
-            Map.Tip -> False
-            Map.Bin _ k _ l r -> case Internal.compare_allow_nothing key k of
-                Ordering.Equal -> True
-                Ordering.Less -> @Tail_Call go l
-                Ordering.Greater -> @Tail_Call go r
-        go self
+    contains_key self key = @Builtin_Method "Map.contains_key"
 
-    ## Transforms the map's keys and values to create a new map.
-
-       Arguments:
-       - function: The function used to transform the map, taking a key and a
-         value and returning a pair of `[key, value]`.
-
-       > Example
-         Turn all keys into `Text` and append "_word" to the values in the map.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_transform =
-                 Examples.map.transform (k -> v -> [k.to_text, v + "_word"])
-    transform : (Any -> Any -> [Any, Any]) -> Map
-    transform self function =
-        func_pairs = p -> function (p.at 0) (p.at 1)
-        vec_transformed = self.to_vector.map func_pairs
-        Map.from_vector vec_transformed
+    ## Returns an unsorted vector of all the keys in this Map.
+    keys : Vector Any
+    keys self = self.to_vector.map pair-> pair.at 0
+
+    ## Returns an unsorted vector of all the values in this Map.
+    values : Vector Any
+    values self = self.to_vector.map pair-> pair.at 1
 
    ## Maps a function over each value in this map.
@@ -306,11 +199,10 @@ type Map
             Examples.map.map_with_key (k -> v -> k.to_text + "-" + v)
    map_with_key : (Any -> Any -> Any) -> Map
    map_with_key self function =
-        go map = case map of
-            Map.Bin s k v l r ->
-                Map.Bin s k (function k v) (go l) (go r)
-            Map.Tip -> Map.Tip
-        go self
+        Map.from_vector <| self.to_vector.map pair->
+            key = pair.first
+            value = pair.last
+            [key, (function key value)]
 
    ## Maps a function over each key in this map.
@@ -330,6 +222,62 @@ type Map
         trans_function = k -> v -> [function k, v]
        self.transform trans_function
 
+    ## Transforms the map's keys and values to create a new map.
+
+       Arguments:
+       - function: The function used to transform the map, taking a key and a
+         value and returning a pair of `[key, value]`.
+
+       > Example
+         Turn all keys into `Text` and append "_word" to the values in the map.
+
+             import Standard.Base.Data.Map.Map
+             import Standard.Examples
+
+             example_transform =
+                 Examples.map.transform (k -> v -> [k.to_text, v + "_word"])
+    transform : (Any -> Any -> [Any, Any]) -> Map
+    transform self function =
+        func_pairs = p -> function (p.at 0) (p.at 1)
+        vec_transformed = self.to_vector.map func_pairs
+        Map.from_vector vec_transformed
+
+    ## Combines the values in the map.
+
+       Arguments:
+       - init: The initial value for the fold.
+       - function: A binary function to apply to pairs of values in the map.
+
+       > Example
+         Find the length of the longest word in the map.
+
+             import Standard.Base.Data.Map.Map
+             import Standard.Examples
+
+             example_fold = Examples.map.fold 0 (l -> r -> Math.max l r.length)
+    fold : Any -> (Any -> Any -> Any) -> Any
+    fold self init function = self.values.fold init function
+
+    ## Combines the key-value pairs in the map.
+
+       Arguments:
+       - init: The initial value for the fold.
+       - function: A function taking the left value, the current key, and the
+         current value, and combining them to yield a single value.
+
+       > Example
+         Glue the values in the map together with the keys.
+
+             import Standard.Base.Data.Map.Map
+             import Standard.Examples
+
+             example_fold_with_key =
+                 Examples.map.fold_with_key "" (l -> k -> v -> l + k.to_text + v)
+    fold_with_key : Any -> (Any -> Any -> Any -> Any) -> Any
+    fold_with_key self init function =
+        self.to_vector.fold init acc-> pair->
+            function acc pair.first pair.last
+
    ## Applies a function to each value in the map.
 
       Arguments:
@@ -371,121 +319,20 @@ type Map
             IO.println v
    each_with_key : (Any -> Any -> Any) -> Nothing
    each_with_key self function =
-        go map = case map of
-            Map.Bin _ k v l r ->
-                go l
-                function k v
-                go r
-                Nothing
-            Map.Tip -> Nothing
-        go self
+        self.to_vector.each pair->
+            function pair.first pair.last
 
-    ## Combines the values in the map.
-
-       Arguments:
-       - init: The initial value for the fold.
-       - function: A binary function to apply to pairs of values in the map.
-
-       > Example
-         Find the length of the longest word in the map.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_fold = Examples.map.fold 0 (l -> r -> Math.max l r.length)
-    fold : Any -> (Any -> Any -> Any) -> Any
-    fold self init function =
-        go map init = case map of
-            Map.Bin _ _ v l r ->
-                y = go l init
-                z = function y v
-                go r z
-            Map.Tip -> init
-        go self init
-
-    ## Combines the key-value pairs in the map.
-
-       Arguments:
-       - init: The initial value for the fold.
-       - function: A function taking the left value, the current key, and the
-         current value, and combining them to yield a single value.
-
-       > Example
-         Glue the values in the map together with the keys.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_fold_with_key =
-                 Examples.map.fold_with_key "" (l -> k -> v -> l + k.to_text + v)
-    fold_with_key : Any -> (Any -> Any -> Any -> Any) -> Any
-    fold_with_key self init function =
-        go map init = case map of
-            Map.Bin _ k v l r ->
-                y = go l init
-                z = function y k v
-                go r z
-            Map.Tip -> init
-        go self init
-
-    ## Get a vector containing the keys in the map.
-
-       > Example
-         Get the keys from the map `m`.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_keys = Examples.map.keys
-    keys : Vector
-    keys self =
-        builder = Vector.new_builder
-        to_vector_with_builder m = case m of
-            Map.Bin _ k _ l r ->
-                to_vector_with_builder l
-                builder.append k
-                to_vector_with_builder r
-                Nothing
-            Map.Tip -> Nothing
-        to_vector_with_builder self
-        builder.to_vector
-
-    ## Get a vector containing the values in the map.
-
-       > Example
-         Get the values from the map `m`.
-
-             import Standard.Base.Data.Map.Map
-             import Standard.Examples
-
-             example_values = Examples.map.values
-    values : Vector
-    values self =
-        builder = Vector.new_builder
-        to_vector_with_builder m = case m of
-            Map.Bin _ _ v l r ->
-                to_vector_with_builder l
-                builder.append v
-                to_vector_with_builder r
-                Nothing
-            Map.Tip -> Nothing
-        to_vector_with_builder self
-        builder.to_vector
-
-    ## Get a key value pair of the lowest key in the map.
-       If the map is empty, returns Nothing.
-    first : Pair
-    first self =
-        first p m = case m of
-            Map.Bin _ k v l _ -> @Tail_Call first (Pair.new k v) l
-            Map.Tip -> p
-        first Nothing self
-
-    ## Get a key value pair of the highest key in the map.
-       If the map is empty, returns Nothing.
-    last : Pair
-    last self =
-        last p m = case m of
-            Map.Bin _ k v _ r -> @Tail_Call last (Pair.new k v) r
-            Map.Tip -> p
-        last Nothing self
+    ## Returns an unsorted vector of key-value pairs (nested 2 element vectors).
+       `Map.from_vector` method is an inverse method, so the following expression
+       is true for all maps: `Map.from_vector map.to_vector == map`.
+    to_vector : Vector Any
+    to_vector self = @Builtin_Method "Map.to_vector"
+
+    ## Returns a text representation of this Map.
+    to_text : Text
+    to_text self = @Builtin_Method "Map.to_text"
+
+    ## PRIVATE
+    get_builtin : Any -> Any -> Any
+    get_builtin self key ~if_missing = @Builtin_Method "Map.get_builtin"


@ -1,164 +0,0 @@
import project.Any.Any
import project.Data.Map.Map
import project.Data.Numbers.Integer
import project.Data.Ordering.Ordering
## PRIVATE
Compares keys allowing for the possibility that one or both keys are Nothing.
compare_allow_nothing : Any -> Any -> Ordering
compare_allow_nothing x y = if x == y then Ordering.Equal else
if x.is_nothing then Ordering.Less else
if y.is_nothing then Ordering.Greater else
x.compare_to y
## PRIVATE
A helper used in the insert operation to insert into the left subtree.
Arguments:
- key: The key to insert.
- value: The value to insert.
- k: The previous top key of the left subtree.
- v: The previous top value of the left subtree.
- l: The left subtree.
- r: The right subtree.
insert_l : Any -> Any -> Any -> Any -> Map -> Map -> Map
insert_l key value k v l r =
new_left = insert l key value
balance_left k v new_left r
## PRIVATE
A helper used in the insert operation to insert into the right subtree.
Arguments:
- key: The key to insert.
- value: The value to insert.
- k: The previous top key of the right subtree.
- v: The previous top value of the right subtree.
- l: The left subtree.
- r: The right subtree.
insert_r : Any -> Any -> Any -> Any -> Map -> Map -> Map
insert_r key value k v l r =
new_right = insert r key value
balance_right k v l new_right
## PRIVATE
Helper for inserting a new key-value pair into a map.
Arguments:
- map: The map into which the insertion is performed.
- key: The key for which to insert the value into the map.
- value: The value to insert into the map at the given key.
The algorithm used here is based on the paper "Implementing Sets Efficiently
in a Functional Language" by Stephen Adams. The implementation is based on
Haskell's `Data.Map.Strict` as implemented in the `containers` package.
insert : Map -> Any -> Any -> Map
insert map key value = case map of
Map.Bin s k v l r -> case compare_allow_nothing key k of
Ordering.Less -> @Tail_Call insert_l key value k v l r
Ordering.Greater -> @Tail_Call insert_r key value k v l r
Ordering.Equal -> Map.Bin s key value l r
_ -> Map.Bin 1 key value Map.Tip Map.Tip
## PRIVATE
Re-balances the map after the left subtree grows.
Arguments:
- k: The old top key of the left subtree.
- x: The old top value of the left subtree.
- l: The left subtree.
- r: The right subtree.
balance_left : Any -> Any -> Map -> Map -> Map
balance_left k x l r = case r of
Map.Bin rs _ _ _ _ -> case l of
Map.Bin ls lk lx ll lr ->
if ls <= delta*rs then Map.Bin 1+ls+rs k x l r else
lls = size ll
case lr of
Map.Bin lrs lrk lrx lrl lrr ->
if lrs < ratio*lls then Map.Bin 1+ls+rs lk lx ll (Map.Bin 1+rs+lrs k x lr r) else
lrls = size lrl
lrrs = size lrr
Map.Bin 1+ls+rs lrk lrx (Map.Bin 1+lls+lrls lk lx ll lrl) (Map.Bin 1+rs+lrrs k x lrr r)
_ -> Map.Bin 1+rs k x Map.Tip r
_ -> case l of
Map.Tip -> Map.Bin 1 k x Map.Tip Map.Tip
Map.Bin _ _ _ Map.Tip Map.Tip -> Map.Bin 2 k x l Map.Tip
Map.Bin _ lk lx Map.Tip (Map.Bin _ lrk lrx _ _) -> Map.Bin 3 lrk lrx (Map.Bin 1 lk lx Map.Tip Map.Tip) (Map.Bin 1 k x Map.Tip Map.Tip)
Map.Bin _ lk lx ll Map.Tip -> Map.Bin 3 lk lx ll (Map.Bin 1 k x Map.Tip Map.Tip)
Map.Bin ls lk lx ll lr -> case lr of
Map.Bin lrs lrk lrx lrl lrr ->
lls = size ll
if lrs < ratio*lls then Map.Bin 1+ls lk lx ll (Map.Bin 1+lrs k x lr Map.Tip) else
lrls = size lrl
lrrs = size lrr
Map.Bin 1+ls lrk lrx (Map.Bin 1+lls+lrls lk lx ll lrl) (Map.Bin 1+lrrs k x lrr Map.Tip)
## PRIVATE
Re-balances the map after the right subtree grows.
Arguments:
- k: The old top key of the right subtree.
- x: The old top value of the right subtree.
- l: The left subtree.
- r: The right subtree.
balance_right : Any -> Any -> Map -> Map -> Map
balance_right k x l r = case l of
Map.Bin ls _ _ _ _ -> case r of
Map.Bin rs rk rx rl rr ->
if rs <= delta*ls then Map.Bin 1+ls+rs k x l r else
case rl of
Map.Bin rls rlk rlx rll rlr ->
rrs = size rr
if rls < ratio*rrs then Map.Bin 1+ls+rs rk rx (Map.Bin 1+ls+rls k x l rl) rr else
rlls = size rll
rlrs = size rlr
Map.Bin 1+ls+rs rlk rlx (Map.Bin 1+ls+rlls k x l rll) (Map.Bin 1+rrs+rlrs rk rx rlr rr)
_ -> Map.Bin 1+ls k x l Map.Tip
_ -> case r of
Map.Tip -> Map.Bin 1 k x Map.Tip Map.Tip
Map.Bin _ _ _ Map.Tip Map.Tip -> Map.Bin 2 k x Map.Tip r
Map.Bin _ rk rx Map.Tip rr -> Map.Bin 3 rk rx (Map.Bin 1 k x Map.Tip Map.Tip) rr
Map.Bin _ rk rx (Map.Bin _ rlk rlx _ _) Map.Tip -> Map.Bin 3 rlk rlx (Map.Bin 1 k x Map.Tip Map.Tip) (Map.Bin 1 rk rx Map.Tip Map.Tip)
Map.Bin rs rk rx rl rr -> case rl of
Map.Bin rls rlk rlx rll rlr -> case rr of
Map.Bin rrs _ _ _ _ ->
if rls < ratio*rrs then Map.Bin 1+rs rk rx (Map.Bin 1+rls k x Map.Tip rl) rr else
srll = size rll
srlr = size rlr
Map.Bin 1+rs rlk rlx (Map.Bin 1+srll k x Map.Tip rll) (Map.Bin 1+rrs+srlr rk rx rlr rr)
## PRIVATE
Controls the difference between inner and outer siblings of a heavy subtree.
Used to decide between a double and a single rotation.
The choice of values for `ratio` and `delta` is taken from the Haskell
implementation.
ratio : Integer
ratio = 2
## PRIVATE
Controls the maximum size difference between subtrees.
The choice of values for `ratio` and `delta` is taken from the Haskell
implementation.
delta : Integer
delta = 3
## PRIVATE
Gets the size of a map.
Arguments:
- m: The map to get the size of.
size : Map -> Integer
size m = case m of
Map.Bin s _ _ _ _ -> s
_ -> 0


@@ -15,8 +15,7 @@ type Encoding
       Used to provide auto completion in the UI.
    all_character_sets : Vector Text
    all_character_sets =
-        java_array = Charset.availableCharsets.keySet.toArray
-        Vector.from_polyglot_array java_array
+        Charset.availableCharsets.keys
 
    ## Get all available Encodings.
    all_encodings : Vector Encoding


@@ -27,7 +27,7 @@ type Text_Ordering
       this to `True` results in a "Natural" ordering.
    Case_Sensitive (sort_digits_as_numbers:Boolean=False)
 
-    ## Case sensitive ordering of values.
+    ## Case insensitive ordering of values.
 
       It will ensure case-insensitive ordering regardless of backend defaults.
       This may make database queries more complicated and may result in being


@@ -99,9 +99,9 @@ type Time_Zone
    > Example
      Get time zone 1 hour 1 minute and 50 seconds from UTC.
 
-          from Standard.Base import Zone
+          from Standard.Base.Time.Time_Zone import Time_Zone
 
-          example_new = Zone.new 1 1 50
+          example_new = Time_Zone.new 1 1 50
    new : Integer -> Integer -> Integer -> Time_Zone
    new (hours = 0) (minutes = 0) (seconds = 0) =
        new_builtin hours minutes seconds


@@ -138,8 +138,9 @@ create : Text -> Vector -> JDBC_Connection
 create url properties = handle_sql_errors <|
    java_props = Properties.new
    properties.each pair->
-        if pair.second.is_nothing.not then java_props.setProperty pair.first pair.second else
-            java_props.remove pair.first
+        case pair.second of
+            Nothing -> Polyglot.invoke java_props "remove" [pair.first]
+            _ -> Polyglot.invoke java_props "setProperty" [pair.first, pair.second]
    java_connection = JDBCProxy.getConnection url java_props
    resource = Managed_Resource.register java_connection close_connection


@@ -9,6 +9,7 @@ import com.oracle.truffle.api.interop.InteropLibrary;
 import com.oracle.truffle.api.interop.InvalidArrayIndexException;
 import com.oracle.truffle.api.interop.TruffleObject;
 import com.oracle.truffle.api.interop.UnknownIdentifierException;
+import com.oracle.truffle.api.interop.UnknownKeyException;
 import com.oracle.truffle.api.interop.UnsupportedMessageException;
 import com.oracle.truffle.api.interop.UnsupportedTypeException;
 import com.oracle.truffle.api.library.CachedLibrary;
@@ -422,6 +423,131 @@ public final class PolyglotProxy implements TruffleObject {
     }
   }
@ExportMessage
public boolean hasHashEntries(
@CachedLibrary("this.delegate") InteropLibrary hashMaps,
@CachedLibrary("this") InteropLibrary node,
@CachedLibrary(limit = "5") InteropLibrary errors,
@Cached @Cached.Exclusive ContextRewrapExceptionNode contextRewrapExceptionNode,
@Cached @Cached.Exclusive BranchProfile profile) {
Object p = enterOrigin(node);
try {
return hashMaps.hasHashEntries(this.delegate);
} catch (Throwable e) {
profile.enter();
if (errors.isException(e)) {
// `isException` means this must be AbstractTruffleException
//noinspection ConstantConditions
throw contextRewrapExceptionNode.execute((AbstractTruffleException) e, origin, target);
} else {
throw e;
}
} finally {
leaveOrigin(node, p);
}
}
@ExportMessage
public long getHashSize(
@CachedLibrary("this.delegate") InteropLibrary hashes,
@CachedLibrary("this") InteropLibrary node,
@CachedLibrary(limit = "5") InteropLibrary errors,
@Cached @Cached.Exclusive ContextRewrapExceptionNode contextRewrapExceptionNode,
@Cached @Cached.Exclusive BranchProfile profile)
throws UnsupportedMessageException {
Object p = enterOrigin(node);
try {
return hashes.getHashSize(this.delegate);
} catch (Throwable e) {
profile.enter();
if (errors.isException(e)) {
// `isException` means this must be AbstractTruffleException
//noinspection ConstantConditions
throw contextRewrapExceptionNode.execute((AbstractTruffleException) e, origin, target);
} else {
throw e;
}
} finally {
leaveOrigin(node, p);
}
}
@ExportMessage
public boolean isHashEntryReadable(
Object key,
@CachedLibrary("this.delegate") InteropLibrary hashes,
@CachedLibrary("this") InteropLibrary node,
@CachedLibrary(limit = "5") InteropLibrary errors,
@Cached @Cached.Exclusive ContextRewrapExceptionNode contextRewrapExceptionNode,
@Cached @Cached.Exclusive BranchProfile profile) {
Object p = enterOrigin(node);
try {
return hashes.isHashEntryReadable(this.delegate, key);
} catch (Throwable e) {
profile.enter();
if (errors.isException(e)) {
// `isException` means this must be AbstractTruffleException
//noinspection ConstantConditions
throw contextRewrapExceptionNode.execute((AbstractTruffleException) e, origin, target);
} else {
throw e;
}
} finally {
leaveOrigin(node, p);
}
}
@ExportMessage
public Object readHashValue(
Object key,
@CachedLibrary("this.delegate") InteropLibrary hashes,
@CachedLibrary("this") InteropLibrary node,
@CachedLibrary(limit = "5") InteropLibrary errors,
@Cached @Cached.Exclusive ContextRewrapExceptionNode contextRewrapExceptionNode,
@Cached @Cached.Exclusive BranchProfile profile)
throws UnsupportedMessageException, UnknownKeyException {
Object p = enterOrigin(node);
try {
return hashes.readHashValue(this.delegate, key);
} catch (Throwable e) {
profile.enter();
if (errors.isException(e)) {
// `isException` means this must be AbstractTruffleException
//noinspection ConstantConditions
throw contextRewrapExceptionNode.execute((AbstractTruffleException) e, origin, target);
} else {
throw e;
}
} finally {
leaveOrigin(node, p);
}
}
@ExportMessage
public Object getHashEntriesIterator(
@CachedLibrary("this.delegate") InteropLibrary hashes,
@CachedLibrary("this") InteropLibrary node,
@CachedLibrary(limit = "5") InteropLibrary errors,
@Cached @Cached.Exclusive ContextRewrapExceptionNode contextRewrapExceptionNode,
@Cached @Cached.Exclusive BranchProfile profile)
throws UnsupportedMessageException {
Object p = enterOrigin(node);
try {
return hashes.getHashEntriesIterator(this.delegate);
} catch (Throwable e) {
profile.enter();
if (errors.isException(e)) {
// `isException` means this must be AbstractTruffleException
//noinspection ConstantConditions
throw contextRewrapExceptionNode.execute((AbstractTruffleException) e, origin, target);
} else {
throw e;
}
} finally {
leaveOrigin(node, p);
}
}
@ExportMessage
public boolean isString(
@CachedLibrary("this.delegate") InteropLibrary strings,
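Every hash message exported in `PolyglotProxy` above follows the same shape: enter the origin context, forward the call to `this.delegate`, rewrap any guest exception for the calling context, and leave the origin context in a `finally` block. A plain-Java sketch of that forwarding pattern, with placeholders standing in for Truffle's context entry and exception rewrapping (all names here are hypothetical):

```java
import java.util.Map;
import java.util.concurrent.Callable;

// Minimal sketch of the enter/delegate/rewrap/leave pattern used by every
// hash message in PolyglotProxy above. A Callable stands in for the
// delegated InteropLibrary call; enterOrigin/leaveOrigin are placeholders
// for Truffle context entry and exit.
public class ForwardingPattern {
    static <T> T forward(Callable<T> delegateCall) {
        Object prev = enterOrigin();
        try {
            return delegateCall.call();
        } catch (Exception e) {
            // in the real code: rewrap AbstractTruffleException for the target context
            throw new IllegalStateException(e);
        } finally {
            leaveOrigin(prev);
        }
    }

    static Object enterOrigin() { return null; } // placeholder for context entry
    static void leaveOrigin(Object prev) {}      // placeholder for context exit

    // analogue of getHashSize: forward the size query to the delegate
    public static long hashSize(Map<?, ?> delegate) {
        return forward(() -> (long) delegate.size());
    }
}
```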


@ -272,6 +272,28 @@ public abstract class InvokeMethodNode extends BaseNode {
return invokeFunctionNode.execute(function, frame, state, arguments);
}
@Specialization(
guards = {
"!types.hasType(self)",
"!types.hasSpecialDispatch(self)",
"getPolyglotCallType(self, symbol, interop, methodResolverNode) == CONVERT_TO_HASH_MAP",
})
Object doConvertHashMap(
VirtualFrame frame,
State state,
UnresolvedSymbol symbol,
Object self,
Object[] arguments,
@CachedLibrary(limit = "10") InteropLibrary interop,
@CachedLibrary(limit = "10") TypesLibrary types,
@Cached MethodResolverNode methodResolverNode) {
var ctx = EnsoContext.get(this);
var hashMapType = ctx.getBuiltins().map();
var function = methodResolverNode.expectNonNull(self, hashMapType, symbol);
arguments[0] = self;
return invokeFunctionNode.execute(function, frame, state, arguments);
}
@Specialization(
guards = {
"!types.hasType(self)",


@ -80,6 +80,11 @@ public abstract class HostMethodCallNode extends Node {
* Standard.Base.Data.Time.Time_Zone} and dispatching natively.
*/
CONVERT_TO_TIME_ZONE,
/**
* The method call should be handled by converting {@code self} to a {@code
* Standard.Base.Data.Map} and dispatching natively.
*/
CONVERT_TO_HASH_MAP,
/** The method call should be handled by dispatching through the {@code Any} type. */
NOT_SUPPORTED;
@ -99,7 +104,8 @@ public abstract class HostMethodCallNode extends Node {
&& this != CONVERT_TO_DURATION
&& this != CONVERT_TO_ZONED_DATE_TIME
&& this != CONVERT_TO_TIME_OF_DAY
- && this != CONVERT_TO_TIME_ZONE;
+ && this != CONVERT_TO_TIME_ZONE
+ && this != CONVERT_TO_HASH_MAP;
}
}
@ -163,6 +169,8 @@ public abstract class HostMethodCallNode extends Node {
return PolyglotCallType.CONVERT_TO_ARRAY;
}
}
} else if (library.hasHashEntries(self)) {
return PolyglotCallType.CONVERT_TO_HASH_MAP;
}
String methodName = symbol.getName();


@ -1,6 +1,7 @@
package org.enso.interpreter.node.expression.builtin.meta;
import com.ibm.icu.text.Normalizer;
import com.ibm.icu.text.Normalizer2;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.Fallback;
@ -9,7 +10,9 @@ import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.ArityException;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.StopIterationException;
import com.oracle.truffle.api.interop.UnknownIdentifierException;
import com.oracle.truffle.api.interop.UnknownKeyException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.interop.UnsupportedTypeException;
import com.oracle.truffle.api.library.CachedLibrary;
@ -31,6 +34,7 @@ import org.enso.interpreter.runtime.callable.atom.Atom;
import org.enso.interpreter.runtime.callable.atom.AtomConstructor;
import org.enso.interpreter.runtime.callable.function.Function;
import org.enso.interpreter.runtime.data.Type;
import org.enso.interpreter.runtime.data.text.Text;
import org.enso.interpreter.runtime.error.WarningsLibrary;
import org.enso.interpreter.runtime.number.EnsoBigInteger;
import org.enso.interpreter.runtime.state.State;
@ -157,18 +161,46 @@ public abstract class EqualsAnyNode extends Node {
}
}
@Specialization(limit = "3")
boolean equalsTexts(Text selfText, Text otherText,
@CachedLibrary("selfText") InteropLibrary selfInterop,
@CachedLibrary("otherText") InteropLibrary otherInterop) {
if (selfText.is_normalized() && otherText.is_normalized()) {
return selfText.toString().compareTo(otherText.toString()) == 0;
} else {
return equalsStrings(selfText, otherText, selfInterop, otherInterop);
}
}
/** Interop libraries **/
@Specialization(guards = {
- "selfInterop.isNull(selfNull)",
- "otherInterop.isNull(otherNull)"
+ "selfInterop.isNull(selfNull) || otherInterop.isNull(otherNull)",
}, limit = "3")
boolean equalsNull(
Object selfNull, Object otherNull,
@CachedLibrary("selfNull") InteropLibrary selfInterop,
@CachedLibrary("otherNull") InteropLibrary otherInterop
) {
- return true;
+ return selfInterop.isNull(selfNull) && otherInterop.isNull(otherNull);
}
@Specialization(guards = {
"isHostObject(selfHostObject)",
"isHostObject(otherHostObject)",
})
boolean equalsHostObjects(
Object selfHostObject, Object otherHostObject,
@CachedLibrary(limit = "5") InteropLibrary interop
) {
try {
return interop.asBoolean(
interop.invokeMember(selfHostObject, "equals", otherHostObject)
);
} catch (UnsupportedMessageException | ArityException | UnknownIdentifierException |
UnsupportedTypeException e) {
throw new IllegalStateException(e);
}
}
@Specialization(guards = {
@ -373,6 +405,43 @@ public abstract class EqualsAnyNode extends Node {
}
}
@Specialization(guards = {
"selfInterop.hasHashEntries(selfHashMap)",
"otherInterop.hasHashEntries(otherHashMap)"
}, limit = "3")
boolean equalsHashMaps(Object selfHashMap, Object otherHashMap,
@CachedLibrary("selfHashMap") InteropLibrary selfInterop,
@CachedLibrary("otherHashMap") InteropLibrary otherInterop,
@CachedLibrary(limit = "5") InteropLibrary entriesInterop,
@Cached EqualsAnyNode equalsNode) {
try {
int selfHashSize = (int) selfInterop.getHashSize(selfHashMap);
int otherHashSize = (int) otherInterop.getHashSize(otherHashMap);
if (selfHashSize != otherHashSize) {
return false;
}
Object selfEntriesIter = selfInterop.getHashEntriesIterator(selfHashMap);
while (entriesInterop.hasIteratorNextElement(selfEntriesIter)) {
Object selfKeyValue = entriesInterop.getIteratorNextElement(selfEntriesIter);
Object key = entriesInterop.readArrayElement(selfKeyValue, 0);
Object selfValue = entriesInterop.readArrayElement(selfKeyValue, 1);
if (otherInterop.isHashEntryExisting(otherHashMap, key)
&& otherInterop.isHashEntryReadable(otherHashMap, key)) {
Object otherValue = otherInterop.readHashValue(otherHashMap, key);
if (!equalsNode.execute(selfValue, otherValue)) {
return false;
}
} else {
return false;
}
}
return true;
} catch (UnsupportedMessageException | StopIterationException | UnknownKeyException |
InvalidArrayIndexException e) {
throw new IllegalStateException(e);
}
}
/** Equals for Atoms and AtomConstructors */
@Specialization
@ -534,24 +603,13 @@ public abstract class EqualsAnyNode extends Node {
@TruffleBoundary
boolean equalsGeneric(Object left, Object right,
@CachedLibrary(limit = "5") InteropLibrary interop) {
- EnsoContext ctx = EnsoContext.get(interop);
- if (isHostObject(ctx, left) && isHostObject(ctx, right)) {
- try {
- return interop.asBoolean(
- interop.invokeMember(left, "equals", right)
- );
- } catch (UnsupportedMessageException | ArityException | UnknownIdentifierException |
- UnsupportedTypeException e) {
- throw new IllegalStateException(e);
- }
- } else {
- return left == right
- || left.equals(right)
- || interop.isIdentical(left, right, interop);
- }
+ return left == right
+ || interop.isIdentical(left, right, interop)
+ || left.equals(right);
}
- private static boolean isHostObject(EnsoContext context, Object object) {
- return context.getEnvironment().isHostObject(object);
+ @TruffleBoundary
+ boolean isHostObject(Object object) {
+ return EnsoContext.get(this).getEnvironment().isHostObject(object);
}
}
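The `equalsHashMaps` specialization above compares map sizes first and then checks, for every entry of the first map, that the second map holds an equal value under the same key. The same algorithm over `java.util.Map`, with `Objects.equals` standing in for `EqualsAnyNode` (a sketch, not the actual runtime code):

```java
import java.util.Map;
import java.util.Objects;

// Plain-Java rendering of the algorithm in `equalsHashMaps` above: equal
// sizes, and every (key, value) of the first map must be present in the
// second map with an equal value.
public class HashMapEquality {
    public static boolean equalsHashMaps(Map<?, ?> self, Map<?, ?> other) {
        if (self.size() != other.size()) {
            return false;
        }
        for (Map.Entry<?, ?> entry : self.entrySet()) {
            if (!other.containsKey(entry.getKey())
                || !Objects.equals(entry.getValue(), other.get(entry.getKey()))) {
                return false;
            }
        }
        return true;
    }
}
```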


@ -0,0 +1,393 @@
package org.enso.interpreter.node.expression.builtin.meta;
import com.ibm.icu.text.Normalizer2;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.GenerateUncached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.ArityException;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.StopIterationException;
import com.oracle.truffle.api.interop.UnknownIdentifierException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.interop.UnsupportedTypeException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import com.oracle.truffle.api.profiles.ConditionProfile;
import com.oracle.truffle.api.profiles.LoopConditionProfile;
import java.math.BigDecimal;
import java.time.LocalDateTime;
import java.time.ZonedDateTime;
import java.util.Arrays;
import org.enso.interpreter.dsl.AcceptsError;
import org.enso.interpreter.node.expression.builtin.number.utils.BigIntegerOps;
import org.enso.interpreter.runtime.EnsoContext;
import org.enso.interpreter.runtime.callable.atom.Atom;
import org.enso.interpreter.runtime.callable.atom.AtomConstructor;
import org.enso.interpreter.runtime.data.text.Text;
import org.enso.interpreter.runtime.error.WarningsLibrary;
import org.enso.interpreter.runtime.number.EnsoBigInteger;
/**
* Implements {@code hash_code} functionality.
*
* <p>Make sure that the hashing contract is retained after any modification.
*
* <h3>Hashing contract:</h3>
*
* <ul>
* <li>Whenever two objects are equal ({@code EqualsAnyNode} returns {@code true}), their hash
* codes should be equal. More formally: {@code For all objects o1, o2: if o1 == o2 then
* hash(o1) == hash(o2)}
* <li>Whenever two hash codes are different, their associated objects are different: {@code For
* all objects o1, o2: if hash(o1) != hash(o2) then o1 != o2}.
* </ul>
*/
@GenerateUncached
public abstract class HashCodeAnyNode extends Node {
public static HashCodeAnyNode build() {
return HashCodeAnyNodeGen.create();
}
public abstract long execute(@AcceptsError Object self);
/** Specializations for primitive values **/
@Specialization
long hashCodeForShort(short s) {
return s;
}
@Specialization
long hashCodeForByte(byte b) {
return b;
}
@Specialization
long hashCodeForLong(long l) {
return Long.hashCode(l);
}
@Specialization
long hashCodeForInt(int i) {
return i;
}
@Specialization
long hashCodeForFloat(float f) {
return Float.hashCode(f);
}
@Specialization
@TruffleBoundary
long hashCodeForDouble(double d) {
if (d % 1.0 != 0.0) {
return Double.hashCode(d);
} else {
if (BigIntegerOps.fitsInLong(d)) {
return hashCodeForLong(Double.valueOf(d).longValue());
} else {
try {
return BigDecimal.valueOf(d).toBigIntegerExact().hashCode();
} catch (ArithmeticException e) {
throw new IllegalStateException(e);
}
}
}
}
@Specialization
@TruffleBoundary
long hashCodeForBigInteger(EnsoBigInteger bigInteger) {
return bigInteger.getValue().hashCode();
}
@Specialization
long hashCodeForAtomConstructor(AtomConstructor atomConstructor) {
return System.identityHashCode(atomConstructor);
}
/** How many {@link HashCodeAnyNode} nodes should be created for fields in atoms. */
static final int hashCodeNodeCountForFields = 10;
static HashCodeAnyNode[] createHashCodeNodes(int size) {
HashCodeAnyNode[] nodes = new HashCodeAnyNode[size];
Arrays.fill(nodes, HashCodeAnyNode.build());
return nodes;
}
@Specialization
long hashCodeForAtom(
Atom atom,
@Cached(value = "createHashCodeNodes(hashCodeNodeCountForFields)", allowUncached = true)
HashCodeAnyNode[] fieldHashCodeNodes,
@Cached ConditionProfile isHashCodeCached,
@Cached ConditionProfile enoughHashCodeNodesForFields,
@Cached LoopConditionProfile loopProfile) {
if (isHashCodeCached.profile(atom.getHashCode() != null)) {
return atom.getHashCode();
}
// TODO[PM]: If atom overrides hash_code, call that method (Will be done in a follow-up PR for
// https://www.pivotaltracker.com/story/show/183945328)
int fieldsCount = atom.getFields().length;
Object[] fields = atom.getFields();
// hashes stores hash codes for all fields, and for constructor.
int[] hashes = new int[fieldsCount + 1];
if (enoughHashCodeNodesForFields.profile(fieldsCount <= hashCodeNodeCountForFields)) {
loopProfile.profileCounted(fieldsCount);
for (int i = 0; loopProfile.inject(i < fieldsCount); i++) {
hashes[i] = (int) fieldHashCodeNodes[i].execute(fields[i]);
}
} else {
hashCodeForAtomFieldsUncached(fields, hashes);
}
int ctorHashCode = System.identityHashCode(atom.getConstructor());
hashes[hashes.length - 1] = ctorHashCode;
int atomHashCode = Arrays.hashCode(hashes);
atom.setHashCode(atomHashCode);
return atomHashCode;
}
@TruffleBoundary
private void hashCodeForAtomFieldsUncached(Object[] fields, int[] fieldHashes) {
for (int i = 0; i < fields.length; i++) {
fieldHashes[i] = (int) HashCodeAnyNodeGen.getUncached().execute(fields[i]);
}
}
@Specialization(
guards = {"warnLib.hasWarnings(selfWithWarning)"},
limit = "3")
long hashCodeForWarning(
Object selfWithWarning,
@CachedLibrary("selfWithWarning") WarningsLibrary warnLib,
@Cached HashCodeAnyNode hashCodeNode) {
try {
return hashCodeNode.execute(warnLib.removeWarnings(selfWithWarning));
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
/** Specializations for interop values **/
@Specialization(
guards = {"interop.isBoolean(selfBool)"},
limit = "3")
long hashCodeForBooleanInterop(
Object selfBool, @CachedLibrary("selfBool") InteropLibrary interop) {
try {
return Boolean.hashCode(interop.asBoolean(selfBool));
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@TruffleBoundary
@Specialization(
guards = {
"!interop.isDate(selfTimeZone)",
"!interop.isTime(selfTimeZone)",
"interop.isTimeZone(selfTimeZone)",
},
limit = "3")
long hashCodeForTimeZoneInterop(
Object selfTimeZone, @CachedLibrary("selfTimeZone") InteropLibrary interop) {
try {
return interop.asTimeZone(selfTimeZone).hashCode();
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@TruffleBoundary
@Specialization(
guards = {
"interop.isDate(selfZonedDateTime)",
"interop.isTime(selfZonedDateTime)",
"interop.isTimeZone(selfZonedDateTime)",
},
limit = "3")
long hashCodeForZonedDateTimeInterop(
Object selfZonedDateTime, @CachedLibrary("selfZonedDateTime") InteropLibrary interop) {
try {
return ZonedDateTime.of(
interop.asDate(selfZonedDateTime),
interop.asTime(selfZonedDateTime),
interop.asTimeZone(selfZonedDateTime))
.hashCode();
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@Specialization(
guards = {
"interop.isDate(selfDateTime)",
"interop.isTime(selfDateTime)",
"!interop.isTimeZone(selfDateTime)",
},
limit = "3")
long hashCodeForDateTimeInterop(
Object selfDateTime, @CachedLibrary("selfDateTime") InteropLibrary interop) {
try {
return LocalDateTime.of(interop.asDate(selfDateTime), interop.asTime(selfDateTime))
.hashCode();
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@Specialization(
guards = {
"!interop.isDate(selfTime)",
"interop.isTime(selfTime)",
"!interop.isTimeZone(selfTime)",
},
limit = "3")
long hashCodeForTimeInterop(Object selfTime, @CachedLibrary("selfTime") InteropLibrary interop) {
try {
return interop.asTime(selfTime).hashCode();
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@Specialization(
guards = {
"interop.isDate(selfDate)",
"!interop.isTime(selfDate)",
"!interop.isTimeZone(selfDate)",
},
limit = "3")
long hashCodeForDateInterop(Object selfDate, @CachedLibrary("selfDate") InteropLibrary interop) {
try {
return interop.asDate(selfDate).hashCode();
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@Specialization(
guards = {
"interop.isDuration(selfDuration)",
},
limit = "3")
long hashCodeForDurationInterop(
Object selfDuration, @CachedLibrary("selfDuration") InteropLibrary interop) {
try {
return interop.asDuration(selfDuration).hashCode();
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@Specialization
long hashCodeForText(Text text, @CachedLibrary(limit = "3") InteropLibrary interop) {
if (text.is_normalized()) {
return text.toString().hashCode();
} else {
return hashCodeForString(text, interop);
}
}
@TruffleBoundary
@Specialization(
guards = {"interop.isString(selfStr)"},
limit = "3")
long hashCodeForString(Object selfStr, @CachedLibrary("selfStr") InteropLibrary interop) {
String str;
try {
str = interop.asString(selfStr);
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
Normalizer2 normalizer = Normalizer2.getNFDInstance();
if (normalizer.isNormalized(str)) {
return str.hashCode();
} else {
return normalizer.normalize(str).hashCode();
}
}
@Specialization(
guards = {"interop.hasArrayElements(selfArray)"},
limit = "3")
long hashCodeForArray(
Object selfArray,
@CachedLibrary("selfArray") InteropLibrary interop,
@Cached HashCodeAnyNode hashCodeNode,
@Cached("createCountingProfile()") LoopConditionProfile loopProfile) {
try {
long arraySize = interop.getArraySize(selfArray);
loopProfile.profileCounted(arraySize);
int[] elemHashCodes = new int[(int) arraySize];
for (int i = 0; loopProfile.inject(i < arraySize); i++) {
if (interop.isArrayElementReadable(selfArray, i)) {
elemHashCodes[i] = (int) hashCodeNode.execute(interop.readArrayElement(selfArray, i));
}
}
return Arrays.hashCode(elemHashCodes);
} catch (UnsupportedMessageException | InvalidArrayIndexException e) {
throw new IllegalStateException(e);
}
}
/**
* The hash of a map does not depend on the order of its entries: maps with the same entries
* produce the same hash code regardless of iteration order.
*/
@Specialization(guards = "interop.hasHashEntries(selfMap)")
long hashCodeForMap(
Object selfMap,
@CachedLibrary(limit = "5") InteropLibrary interop,
@Cached HashCodeAnyNode hashCodeNode) {
int mapSize;
long keysHashCode = 0;
long valuesHashCode = 0;
try {
mapSize = (int) interop.getHashSize(selfMap);
Object entriesIterator = interop.getHashEntriesIterator(selfMap);
while (interop.hasIteratorNextElement(entriesIterator)) {
Object entry = interop.getIteratorNextElement(entriesIterator);
Object key = interop.readArrayElement(entry, 0);
Object value = interop.readArrayElement(entry, 1);
// We don't care about the order of keys and values, so we just sum all their hash codes.
keysHashCode += hashCodeNode.execute(key);
valuesHashCode += hashCodeNode.execute(value);
}
} catch (UnsupportedMessageException | StopIterationException | InvalidArrayIndexException e) {
throw new IllegalStateException(e);
}
return Arrays.hashCode(new long[] {keysHashCode, valuesHashCode, mapSize});
}
@Specialization(
guards = {"interop.isNull(selfNull)"},
limit = "3")
long hashCodeForNull(Object selfNull, @CachedLibrary("selfNull") InteropLibrary interop) {
return 0;
}
@Specialization(guards = "isHostObject(hostObject)")
long hashCodeForHostObject(
Object hostObject, @CachedLibrary(limit = "3") InteropLibrary interop) {
try {
Object hashCodeRes = interop.invokeMember(hostObject, "hashCode");
assert interop.fitsInInt(hashCodeRes);
return interop.asInt(hashCodeRes);
} catch (UnsupportedMessageException
| ArityException
| UnknownIdentifierException
| UnsupportedTypeException e) {
throw new IllegalStateException(e);
}
}
@TruffleBoundary
boolean isHostObject(Object object) {
return EnsoContext.get(this).getEnvironment().isHostObject(object);
}
}
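The `hashCodeForMap` specialization above makes the hash independent of entry order by summing key hashes and value hashes separately (addition commutes, so iteration order is irrelevant) and then combining the two sums with the map size. A plain-Java sketch of the same scheme:

```java
import java.util.Arrays;
import java.util.Map;

// Sketch of the order-independent hashing in `hashCodeForMap` above:
// per-entry hashes are summed, and the two sums are combined with the
// map size via Arrays.hashCode.
public class MapHashing {
    public static long hashCodeForMap(Map<?, ?> map) {
        long keysHashCode = 0;
        long valuesHashCode = 0;
        for (Map.Entry<?, ?> entry : map.entrySet()) {
            keysHashCode += entry.getKey().hashCode();
            valuesHashCode += entry.getValue().hashCode();
        }
        return Arrays.hashCode(new long[] {keysHashCode, valuesHashCode, map.size()});
    }
}
```

Because the per-entry hashes are only summed, two maps built in different insertion orders hash identically.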


@ -6,7 +6,7 @@ import org.enso.interpreter.dsl.BuiltinMethod;
import org.enso.interpreter.node.expression.builtin.number.utils.BigIntegerOps;
import org.enso.interpreter.node.expression.builtin.number.utils.ToEnsoNumberNode;
- @BuiltinMethod(type = "Small_Integer", name = "abs", description = "Negation for numbers.")
+ @BuiltinMethod(type = "Small_Integer", name = "abs", description = "Absolute value of a number")
public abstract class AbsNode extends Node {
private @Child ToEnsoNumberNode toEnsoNumberNode = ToEnsoNumberNode.build();


@ -92,6 +92,7 @@ public class Builtins {
private final Builtin text;
private final Builtin array;
private final Builtin vector;
private final Builtin map;
private final Builtin dataflowError;
private final Builtin ref;
private final Builtin managedResource;
@ -137,6 +138,7 @@ public class Builtins {
text = builtins.get(Text.class);
array = builtins.get(Array.class);
vector = builtins.get(Vector.class);
map = builtins.get(org.enso.interpreter.node.expression.builtin.Map.class);
dataflowError = builtins.get(org.enso.interpreter.node.expression.builtin.Error.class);
ref = builtins.get(Ref.class);
managedResource = builtins.get(ManagedResource.class);
@ -552,6 +554,10 @@ public class Builtins {
return vector.getType();
}
public Type map() {
return map.getType();
}
/** @return the Ref constructor. */
public Type ref() {
return ref.getType();


@ -1,7 +1,15 @@
package org.enso.interpreter.runtime.callable.atom;
import com.oracle.truffle.api.Assumption;
import com.oracle.truffle.api.CompilerDirectives;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.Truffle;
import com.oracle.truffle.api.dsl.Cached.Shared;
import com.oracle.truffle.api.dsl.Fallback;
import com.oracle.truffle.api.profiles.ConditionProfile;
import com.oracle.truffle.api.profiles.ValueProfile;
import com.oracle.truffle.api.utilities.TriState;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.exception.AbstractTruffleException;
@ -12,11 +20,16 @@ import com.oracle.truffle.api.library.ExportMessage;
import com.oracle.truffle.api.nodes.ExplodeLoop;
import com.oracle.truffle.api.nodes.UnexpectedResultException;
import com.oracle.truffle.api.profiles.BranchProfile;
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;
import org.enso.interpreter.runtime.callable.UnresolvedSymbol;
import org.enso.interpreter.runtime.callable.function.Function;
import org.enso.interpreter.runtime.data.Array;
import org.enso.interpreter.runtime.data.Type;
import org.enso.interpreter.runtime.data.text.Text;
import org.enso.interpreter.runtime.error.PanicException;
import org.enso.interpreter.runtime.library.dispatch.TypesLibrary;
import org.enso.interpreter.runtime.type.TypesGen;
@ -30,6 +43,7 @@ import org.enso.interpreter.runtime.error.WarningsLibrary;
public final class Atom implements TruffleObject {
final AtomConstructor constructor;
private final Object[] fields;
private Integer hashCode;
/**
* Creates a new Atom for a given constructor.
@ -60,6 +74,15 @@ public final class Atom implements TruffleObject {
return fields;
}
public void setHashCode(int hashCode) {
assert this.hashCode == null : "setHashCode must be called at most once";
this.hashCode = hashCode;
}
public Integer getHashCode() {
return hashCode;
}
private void toString(StringBuilder builder, boolean shouldParen, int depth) {
if (depth <= 0) {
builder.append("...");
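The `hashCode` field added to `Atom` above is a nullable `Integer` used as a compute-once cache: `null` means "not yet computed", and `setHashCode` asserts it is called at most once. The caching protocol in isolation (a sketch; in the PR the field lives on `Atom` itself):

```java
// Sketch of the hash-code caching added to Atom above: the cache starts
// as null and may be set exactly once, after which reads return the
// stored value without recomputation.
public class CachedHash {
    private Integer hashCode;

    public void setHashCode(int hashCode) {
        assert this.hashCode == null : "setHashCode must be called at most once";
        this.hashCode = hashCode;
    }

    public Integer getHashCode() {
        return hashCode; // null until setHashCode has been called
    }
}
```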


@ -10,6 +10,7 @@ import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.library.ExportLibrary;
import com.oracle.truffle.api.library.ExportMessage;
import com.oracle.truffle.api.nodes.RootNode;
import com.oracle.truffle.api.utilities.TriState;
import org.enso.interpreter.node.ClosureRootNode;
import org.enso.interpreter.node.ExpressionNode;
import org.enso.interpreter.node.callable.argument.ReadArgumentNode;


@ -0,0 +1,194 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.TruffleObject;
import com.oracle.truffle.api.interop.UnknownKeyException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.library.ExportLibrary;
import com.oracle.truffle.api.library.ExportMessage;
import com.oracle.truffle.api.profiles.ConditionProfile;
import org.enso.interpreter.dsl.Builtin;
import org.enso.interpreter.node.expression.builtin.meta.EqualsAnyNode;
import org.enso.interpreter.node.expression.builtin.meta.HashCodeAnyNode;
import org.enso.interpreter.runtime.EnsoContext;
import org.enso.interpreter.runtime.data.Type;
import org.enso.interpreter.runtime.data.Vector;
import org.enso.interpreter.runtime.data.hash.EnsoHashMapBuilder.StorageEntry;
import org.enso.interpreter.runtime.library.dispatch.TypesLibrary;
/**
* Implementation of a hash map structure, capable of holding keys and values of any type. The
* actual hash map storage is implemented in {@link EnsoHashMapBuilder}, and every {@link
* EnsoHashMap} holds just a reference to the builder and its size, which allows us to implement
* the {@code insert} operation in constant time. In other words, every map is just a snapshot of
* its builder.
*
* <p>Users should not use Enso objects as keys in Java maps, because {@code equals} won't work
* the same way as it does in Enso.
*/
@ExportLibrary(TypesLibrary.class)
@ExportLibrary(InteropLibrary.class)
@Builtin(stdlibName = "Standard.Base.Data.Map.Map", name = "Map")
public final class EnsoHashMap implements TruffleObject {
private final EnsoHashMapBuilder mapBuilder;
/**
* Size of this Map. Basically an index into {@link EnsoHashMapBuilder}'s storage. See {@link
* #isEntryInThisMap(StorageEntry)}.
*/
private final int snapshotSize;
/**
* True iff the {@code insert} method was already called. If insert was already called and we are
* calling {@code insert} again, the {@link #mapBuilder} must be duplicated for the newly
* created Map.
*/
private boolean insertCalled;
private Object cachedVectorRepresentation;
private EnsoHashMap(EnsoHashMapBuilder mapBuilder, int snapshotSize) {
this.mapBuilder = mapBuilder;
this.snapshotSize = snapshotSize;
assert snapshotSize <= mapBuilder.getSize();
}
static EnsoHashMap createWithBuilder(EnsoHashMapBuilder mapBuilder, int snapshotSize) {
return new EnsoHashMap(mapBuilder, snapshotSize);
}
static EnsoHashMap createEmpty(HashCodeAnyNode hashCodeAnyNode, EqualsAnyNode equalsNode) {
return new EnsoHashMap(EnsoHashMapBuilder.create(hashCodeAnyNode, equalsNode), 0);
}
EnsoHashMapBuilder getMapBuilder() {
return mapBuilder;
}
Object getCachedVectorRepresentation() {
return getCachedVectorRepresentation(ConditionProfile.getUncached());
}
Object getCachedVectorRepresentation(ConditionProfile isNotCachedProfile) {
if (isNotCachedProfile.profile(cachedVectorRepresentation == null)) {
Object[] keys = new Object[snapshotSize];
Object[] values = new Object[snapshotSize];
int arrIdx = 0;
for (StorageEntry entry : mapBuilder.getStorage().getValues()) {
if (entry.index() < snapshotSize) {
keys[arrIdx] = entry.key();
values[arrIdx] = entry.value();
arrIdx++;
}
}
cachedVectorRepresentation =
Vector.fromArray(HashEntriesVector.createFromKeysAndValues(keys, values));
}
return cachedVectorRepresentation;
}
public boolean isInsertCalled() {
return insertCalled;
}
public void setInsertCalled() {
assert !insertCalled : "setInsertCalled should be called at most once";
insertCalled = true;
}
@Builtin.Method
@Builtin.Specialize
public static EnsoHashMap empty(
@Cached HashCodeAnyNode hashCodeNode, @Cached EqualsAnyNode equalsNode) {
return createEmpty(hashCodeNode, equalsNode);
}
@ExportMessage
boolean hasHashEntries() {
return true;
}
@ExportMessage
int getHashSize() {
return snapshotSize;
}
@ExportMessage
boolean isHashEntryExisting(Object key) {
return isEntryInThisMap(mapBuilder.get(key));
}
@ExportMessage
boolean isHashEntryReadable(Object key) {
return isHashEntryExisting(key);
}
@ExportMessage
Object readHashValue(Object key) throws UnknownKeyException {
StorageEntry entry = mapBuilder.get(key);
if (isEntryInThisMap(entry)) {
return entry.value();
} else {
throw UnknownKeyException.create(key);
}
}
@ExportMessage
Object getHashEntriesIterator(@CachedLibrary(limit = "3") InteropLibrary interop) {
try {
return interop.getIterator(getCachedVectorRepresentation());
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@ExportMessage(library = TypesLibrary.class)
boolean hasType() {
return true;
}
@ExportMessage(library = TypesLibrary.class)
Type getType(@CachedLibrary("this") TypesLibrary thisLib) {
return EnsoContext.get(thisLib).getBuiltins().map();
}
@ExportMessage
boolean hasMetaObject() {
return true;
}
@ExportMessage
Type getMetaObject(@CachedLibrary("this") InteropLibrary thisLib) {
return EnsoContext.get(thisLib).getBuiltins().map();
}
@ExportMessage
@TruffleBoundary
Object toDisplayString(boolean allowSideEffects) {
var sb = new StringBuilder();
sb.append("{");
boolean empty = true;
for (StorageEntry entry : mapBuilder.getStorage().getValues()) {
if (isEntryInThisMap(entry)) {
empty = false;
sb.append(entry.key()).append("=").append(entry.value()).append(", ");
}
}
if (!empty) {
// Delete last comma
sb.delete(sb.length() - 2, sb.length());
}
sb.append("}");
return sb.toString();
}
@Override
public String toString() {
return (String) toDisplayString(true);
}
private boolean isEntryInThisMap(StorageEntry entry) {
return entry != null && entry.index() < snapshotSize;
}
}
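The snapshot design described in the class javadoc can be sketched without any Truffle machinery. The following is a hypothetical, simplified analogue (the names `SnapshotMapSketch`, `Builder`, and `Snapshot` are invented for illustration, not the real API): a map is just a pair of a shared builder and a size, and an entry is visible in a snapshot iff its sequential index is below that snapshot's size.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

// Hypothetical, Truffle-free analogue of EnsoHashMap/EnsoHashMapBuilder:
// every map is a snapshot (builder reference + size), so inserting a fresh
// key is O(1), and older snapshots simply do not see entries whose index
// is >= their snapshot size.
final class SnapshotMapSketch {
  record Entry(Object key, Object value, int index) {}

  static final class Builder {
    final HashMap<Object, Entry> storage = new HashMap<>();
    final List<Entry> sequential = new ArrayList<>();
    int size;

    void add(Object key, Object value) {
      Entry old = storage.get(key);
      int idx = (old != null) ? old.index() : size;
      Entry entry = new Entry(key, value, idx);
      storage.put(key, entry);
      if (old == null) {
        sequential.add(entry);
        size++;
      } else {
        sequential.set(idx, entry);
      }
    }

    Snapshot build() {
      return new Snapshot(this, size);
    }
  }

  // A snapshot sees only entries created before it was built.
  record Snapshot(Builder builder, int size) {
    Object get(Object key) {
      Entry entry = builder.storage.get(key);
      return (entry != null && entry.index() < size) ? entry.value() : null;
    }
  }
}
```

Building a snapshot, then adding a fresh key to the same builder, leaves the older snapshot unchanged, which is exactly the property the real implementation relies on for constant-time insert.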

View File

@@ -0,0 +1,188 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import java.util.ArrayList;
import java.util.List;
import org.enso.interpreter.node.expression.builtin.meta.EqualsAnyNode;
import org.enso.interpreter.node.expression.builtin.meta.HashCodeAnyNode;
import org.graalvm.collections.EconomicMap;
import org.graalvm.collections.Equivalence;
/**
* Storage for an {@link EnsoHashMap}. A single builder may back many snapshots ({@link
* EnsoHashMap}). There should be at most one snapshot for a given size, and every snapshot's
* size must be at most this builder's size.
*/
public final class EnsoHashMapBuilder {
private final EconomicMap<Object, StorageEntry> storage;
/** All entries stored by their sequential index. */
private final List<StorageEntry> sequentialEntries;
private final HashCodeAnyNode hashCodeNode;
private final EqualsAnyNode equalsNode;
private int size;
private EnsoHashMapBuilder(HashCodeAnyNode hashCodeAnyNode, EqualsAnyNode equalsNode) {
this.storage = EconomicMap.create(new StorageStrategy(equalsNode, hashCodeAnyNode));
this.sequentialEntries = new ArrayList<>();
this.hashCodeNode = hashCodeAnyNode;
this.equalsNode = equalsNode;
}
private EnsoHashMapBuilder(EnsoHashMapBuilder other, int numEntries) {
assert 0 < numEntries && numEntries <= other.size;
this.storage = EconomicMap.create(new StorageStrategy(other.equalsNode, other.hashCodeNode));
var entriesToBeDuplicated = other.sequentialEntries.subList(0, numEntries);
this.sequentialEntries = new ArrayList<>(entriesToBeDuplicated);
entriesToBeDuplicated.forEach(entry -> this.storage.put(entry.key, entry));
this.hashCodeNode = other.hashCodeNode;
this.equalsNode = other.equalsNode;
this.size = numEntries;
}
private EnsoHashMapBuilder(EnsoHashMapBuilder other) {
this.storage =
EconomicMap.create(
new StorageStrategy(other.equalsNode, other.hashCodeNode), other.storage);
this.sequentialEntries = new ArrayList<>(other.sequentialEntries);
this.hashCodeNode = other.hashCodeNode;
this.equalsNode = other.equalsNode;
this.size = other.size;
}
/**
* Create a new builder with stored nodes.
*
* @param hashCodeNode Node that will be stored in the storage for invoking `hash_code` on keys.
* @param equalsNode Node that will be stored in the storage for invoking `==` on keys.
*/
public static EnsoHashMapBuilder create(HashCodeAnyNode hashCodeNode, EqualsAnyNode equalsNode) {
return new EnsoHashMapBuilder(hashCodeNode, equalsNode);
}
/** Returns count of elements in the storage. */
public int getSize() {
return size;
}
public EconomicMap<Object, StorageEntry> getStorage() {
return storage;
}
/**
* Duplicates the MapBuilder with just first {@code numEntries} number of entries.
*
* @param numEntries Number of entries to take from this MapBuilder.
*/
public EnsoHashMapBuilder duplicatePartial(int numEntries) {
return new EnsoHashMapBuilder(this, numEntries);
}
/** Duplicates this builder with all its entries. */
@TruffleBoundary
public EnsoHashMapBuilder duplicate() {
return new EnsoHashMapBuilder(this);
}
/** Adds a key-value mapping, overriding any existing value. */
@TruffleBoundary(allowInlining = true)
public void add(Object key, Object value) {
var oldEntry = storage.get(key);
int newEntryIndex = oldEntry != null ? oldEntry.index : size;
var newEntry = new StorageEntry(key, value, newEntryIndex);
storage.put(key, newEntry);
if (oldEntry == null) {
assert newEntry.index == size;
sequentialEntries.add(newEntry);
size++;
} else {
sequentialEntries.set(newEntryIndex, newEntry);
}
}
@TruffleBoundary(allowInlining = true)
public StorageEntry get(Object key) {
return storage.get(key);
}
/**
* Removes an entry denoted by the given key.
*
* @return true if the removal was successful, i.e., the key was in the map and was removed, false
* otherwise.
*/
@TruffleBoundary
public boolean remove(Object key) {
var oldEntry = storage.removeKey(key);
if (oldEntry == null) {
return false;
} else {
sequentialEntries.remove(oldEntry.index);
// Rewrite rest of the sequentialEntries list and repair indexes in storage
for (int i = oldEntry.index; i < sequentialEntries.size(); i++) {
var entry = sequentialEntries.get(i);
StorageEntry newEntry = new StorageEntry(entry.key, entry.value, i);
sequentialEntries.set(i, newEntry);
storage.put(newEntry.key, newEntry);
}
size--;
return true;
}
}
@TruffleBoundary(allowInlining = true)
public boolean containsKey(Object key) {
return storage.containsKey(key);
}
/**
* Creates a snapshot with the current size. The created snapshot contains all the entries that
* are in the storage at this moment, i.e., all the entries whose indices are less than
* {@code size}.
*
* <p>Should be called at most once for a particular {@code size}.
*
* @return A new hash map snapshot.
*/
public EnsoHashMap build() {
return EnsoHashMap.createWithBuilder(this, size);
}
@Override
public String toString() {
return "EnsoHashMapBuilder{size = " + size + ", storage = " + storage + "}";
}
record StorageEntry(
Object key,
Object value,
/**
* A sequential index of the entry within this map. {@link EnsoHashMap} uses it for checking
* whether a certain key belongs in that map.
*/
int index) {}
/**
* Custom {@link Equivalence} used for the {@link EconomicMap} that delegates {@code equals} to
* {@link EqualsAnyNode} and {@code hash_code} to {@link HashCodeAnyNode}.
*/
private static final class StorageStrategy extends Equivalence {
private final EqualsAnyNode equalsNode;
private final HashCodeAnyNode hashCodeNode;
private StorageStrategy(EqualsAnyNode equalsNode, HashCodeAnyNode hashCodeNode) {
this.equalsNode = equalsNode;
this.hashCodeNode = hashCodeNode;
}
@Override
public boolean equals(Object a, Object b) {
return equalsNode.execute(a, b);
}
@Override
public int hashCode(Object o) {
return (int) hashCodeNode.execute(o);
}
}
}

View File

@@ -0,0 +1,133 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.TruffleObject;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.ExportLibrary;
import com.oracle.truffle.api.library.ExportMessage;
import org.enso.interpreter.runtime.data.Vector;
/**
* A vector used to hold hash map entries, where each entry is represented as a 2-element vector.
* Used both for Truffle interop, where {@code getHashEntriesIterator} expects this form of vector
* (array), and for the Enso {@code Map.to_vector} method. May be empty.
*/
@ExportLibrary(InteropLibrary.class)
final class HashEntriesVector implements TruffleObject {
private final Vector[] entryPairs;
private HashEntriesVector(Object[] keys, Object[] values) {
assert keys.length == values.length;
this.entryPairs = new Vector[keys.length];
for (int i = 0; i < keys.length; i++) {
entryPairs[i] = Vector.fromArray(new EntryPair(keys[i], values[i]));
}
}
static HashEntriesVector createFromKeysAndValues(Object[] keys, Object[] values) {
return new HashEntriesVector(keys, values);
}
static HashEntriesVector createEmpty() {
return new HashEntriesVector(new Object[] {}, new Object[] {});
}
@ExportMessage
boolean hasArrayElements() {
return true;
}
@ExportMessage
long getArraySize() {
return entryPairs.length;
}
@ExportMessage
boolean isArrayElementReadable(long idx) {
return idx < entryPairs.length;
}
@ExportMessage
boolean isArrayElementModifiable(long idx) {
return false;
}
@ExportMessage
boolean isArrayElementInsertable(long idx) {
return false;
}
@ExportMessage
Object readArrayElement(long idx) throws InvalidArrayIndexException {
if (idx < entryPairs.length) {
return entryPairs[(int) idx];
} else {
throw InvalidArrayIndexException.create(idx);
}
}
@ExportMessage
void writeArrayElement(long index, Object value) throws UnsupportedMessageException {
throw UnsupportedMessageException.create();
}
@ExportLibrary(InteropLibrary.class)
static final class EntryPair implements TruffleObject {
private final Object key;
private final Object value;
EntryPair(Object key, Object value) {
this.key = key;
this.value = value;
}
@ExportMessage
boolean hasArrayElements() {
return true;
}
@ExportMessage
long getArraySize() {
return 2;
}
@ExportMessage
boolean isArrayElementReadable(long idx) {
return idx < 2;
}
@ExportMessage
Object readArrayElement(long idx) throws InvalidArrayIndexException {
if (idx == 0) {
return key;
} else if (idx == 1) {
return value;
} else {
throw InvalidArrayIndexException.create(idx);
}
}
@ExportMessage
boolean isArrayElementModifiable(long idx) {
return false;
}
@ExportMessage
boolean isArrayElementInsertable(long idx) {
return false;
}
@ExportMessage
void writeArrayElement(long index, Object value) throws UnsupportedMessageException {
throw UnsupportedMessageException.create();
}
@TruffleBoundary
@ExportMessage
Object toDisplayString(boolean sideEffectsAllowed) {
return "(" + key + ", " + value + ")";
}
}
}

View File

@@ -0,0 +1,40 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.dsl.Fallback;
import com.oracle.truffle.api.dsl.GenerateUncached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import org.enso.interpreter.dsl.BuiltinMethod;
@BuiltinMethod(
type = "Map",
name = "contains_key",
description = """
Returns True if the hash map contains a mapping for the given key, False otherwise.
""",
autoRegister = false
)
@GenerateUncached
public abstract class HashMapContainsKeyNode extends Node {
public static HashMapContainsKeyNode build() {
return HashMapContainsKeyNodeGen.create();
}
public abstract boolean execute(Object self, Object key);
@Specialization(guards = {
"interop.hasHashEntries(foreignMap)"
}, limit = "3")
boolean doForeignHashMap(Object foreignMap, Object key,
@CachedLibrary("foreignMap") InteropLibrary interop) {
return interop.isHashEntryExisting(foreignMap, key);
}
@Fallback
boolean fallback(Object map, Object key) {
return false;
}
}

View File

@@ -0,0 +1,55 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.Fallback;
import com.oracle.truffle.api.dsl.GenerateUncached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.UnknownKeyException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import org.enso.interpreter.dsl.BuiltinMethod;
import org.enso.interpreter.dsl.Suspend;
import org.enso.interpreter.node.BaseNode.TailStatus;
import org.enso.interpreter.node.callable.thunk.ThunkExecutorNode;
import org.enso.interpreter.runtime.state.State;
@BuiltinMethod(
type = "Map",
name = "get_builtin",
description = """
Gets the value associated with the given key, or the given default if the key is absent.
""",
autoRegister = false
)
@GenerateUncached
public abstract class HashMapGetNode extends Node {
public static HashMapGetNode build() {
return HashMapGetNodeGen.create();
}
public abstract Object execute(State state, Object self, Object key, @Suspend Object defaultValue);
@Specialization(guards = "interop.hasHashEntries(self)", limit = "3")
Object hashMapGet(State state, Object self, Object key, Object defaultValue,
@CachedLibrary("self") InteropLibrary interop,
@Cached("build()") ThunkExecutorNode thunkExecutorNode) {
if (interop.isHashEntryReadable(self, key)) {
try {
return interop.readHashValue(self, key);
} catch (UnsupportedMessageException | UnknownKeyException e) {
throw new IllegalStateException(e);
}
} else {
return thunkExecutorNode.executeThunk(defaultValue, state, TailStatus.NOT_TAIL);
}
}
@Fallback
Object fallback(State state, Object self, Object key, Object defaultValue,
@Cached("build()") ThunkExecutorNode thunkExecutorNode) {
return thunkExecutorNode.executeThunk(defaultValue, state, TailStatus.NOT_TAIL);
}
}
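`HashMapGetNode` takes its default as a suspended thunk (`@Suspend`), so the default is only evaluated on a miss. A `Supplier` can play the thunk's role in a plain-Java sketch (the helper `GetSketch.getOrElse` is hypothetical, invented for illustration):

```java
import java.util.Map;
import java.util.function.Supplier;

// Hypothetical sketch of get_builtin's laziness: the defaultThunk supplier is
// only invoked when the key is missing, mirroring the suspended if_nothing
// parameter executed via ThunkExecutorNode in the real node.
final class GetSketch {
  static Object getOrElse(Map<?, ?> map, Object key, Supplier<Object> defaultThunk) {
    return map.containsKey(key) ? map.get(key) : defaultThunk.get();
  }
}
```

A hit must leave the supplier untouched; only a miss forces it.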

View File

@@ -0,0 +1,84 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.StopIterationException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import org.enso.interpreter.dsl.BuiltinMethod;
import org.enso.interpreter.node.expression.builtin.meta.EqualsAnyNode;
import org.enso.interpreter.node.expression.builtin.meta.HashCodeAnyNode;
@BuiltinMethod(
type = "Map",
name = "insert",
description = """
Returns a newly created hash map containing the given key-value mapping.
""",
autoRegister = false
)
public abstract class HashMapInsertNode extends Node {
public static HashMapInsertNode build() {
return HashMapInsertNodeGen.create();
}
public abstract EnsoHashMap execute(Object self, Object key, Object value);
@Specialization
@TruffleBoundary
EnsoHashMap doEnsoHashMap(EnsoHashMap hashMap, Object key, Object value) {
EnsoHashMapBuilder mapBuilder = hashMap.getMapBuilder();
boolean containsKey = mapBuilder.get(key) != null;
boolean insertCalledOnMap = hashMap.isInsertCalled();
if (insertCalledOnMap || containsKey) {
// insert was already called on this map => we need to duplicate the MapBuilder.
// Likewise, if the key is already contained in the map, there is no way of telling whether
// another binding points to this map, and we do not want to mutate that older binding.
var newMapBuilder = hashMap.getHashSize() < mapBuilder.getSize() ?
mapBuilder.duplicatePartial(hashMap.getHashSize()) :
mapBuilder.duplicate();
newMapBuilder.add(key, value);
return newMapBuilder.build();
} else {
// Do not duplicate the builder, just create a snapshot.
mapBuilder.add(key, value);
var newMap = mapBuilder.build();
hashMap.setInsertCalled();
return newMap;
}
}
/**
* Creates a new {@link EnsoHashMapBuilder} for the given {@code foreignMap} by iterating through
* all of its entries. The returned map is an {@link EnsoHashMap}.
*/
@Specialization(guards = "mapInterop.hasHashEntries(foreignMap)", limit = "3")
EnsoHashMap doForeign(Object foreignMap, Object keyToInsert, Object valueToInsert,
@CachedLibrary("foreignMap") InteropLibrary mapInterop,
@CachedLibrary(limit = "3") InteropLibrary iteratorInterop,
@Cached HashCodeAnyNode hashCodeNode,
@Cached EqualsAnyNode equalsNode) {
var mapBuilder = EnsoHashMapBuilder.create(hashCodeNode, equalsNode);
try {
Object entriesIterator = mapInterop.getHashEntriesIterator(foreignMap);
while (iteratorInterop.hasIteratorNextElement(entriesIterator)) {
Object keyValueArr = iteratorInterop.getIteratorNextElement(entriesIterator);
Object key = iteratorInterop.readArrayElement(keyValueArr, 0);
Object value = iteratorInterop.readArrayElement(keyValueArr, 1);
mapBuilder.add(key, value);
}
} catch (UnsupportedMessageException | StopIterationException | InvalidArrayIndexException e) {
throw new IllegalStateException(
"Polyglot hash map " + foreignMap + " implements the interop API (hash entries iterator) incorrectly",
e
);
}
mapBuilder.add(keyToInsert, valueToInsert);
return EnsoHashMap.createWithBuilder(mapBuilder, mapBuilder.getSize());
}
}
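The branch in `doEnsoHashMap` above decides when the shared builder may be extended in place: only when no snapshot has been taken from this map yet (`insertCalled == false`) and the key is fresh; otherwise mutating the builder would be observable through older maps. The predicate can be isolated as a tiny sketch (the class `InsertStrategy` is a hypothetical name, not part of the codebase):

```java
// Hypothetical predicate mirroring HashMapInsertNode.doEnsoHashMap's branch:
// duplicate the builder whenever extending it in place could be observed by
// an already-built snapshot or would overwrite an existing key's entry.
final class InsertStrategy {
  static boolean mustDuplicateBuilder(boolean insertCalled, boolean keyAlreadyPresent) {
    return insertCalled || keyAlreadyPresent;
  }
}
```

Only the fresh-key, no-prior-insert case avoids the copy; all other combinations pay for duplication to preserve the older bindings.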

View File

@@ -0,0 +1,88 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.CompilerDirectives;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.GenerateUncached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.StopIterationException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import org.enso.interpreter.dsl.BuiltinMethod;
import org.enso.interpreter.node.expression.builtin.meta.EqualsAnyNode;
import org.enso.interpreter.node.expression.builtin.meta.HashCodeAnyNode;
import org.enso.interpreter.runtime.error.DataflowError;
@BuiltinMethod(
type = "Map",
name = "remove_builtin",
description = """
Removes the entry with the given key from this map.
"""
)
@GenerateUncached
public abstract class HashMapRemoveNode extends Node {
public static HashMapRemoveNode build() {
return HashMapRemoveNodeGen.create();
}
public abstract EnsoHashMap execute(Object self, Object key);
@Specialization
EnsoHashMap removeFromEnsoMap(EnsoHashMap ensoMap, Object key) {
var oldEntry = ensoMap.getMapBuilder().get(key);
if (oldEntry == null) {
throw DataflowError.withoutTrace("No such key", null);
} else {
var newBuilder = ensoMap.getMapBuilder().duplicate();
if (!newBuilder.remove(key)) {
throw new IllegalStateException("Key '" + key + "' should be in the map");
}
return EnsoHashMap.createWithBuilder(newBuilder, newBuilder.getSize());
}
}
@Specialization(
guards = "interop.hasHashEntries(map)"
)
EnsoHashMap removeFromInteropMap(Object map, Object keyToRemove,
@CachedLibrary(limit = "5") InteropLibrary interop,
@Cached HashCodeAnyNode hashCodeNode,
@Cached EqualsAnyNode equalsNode) {
// We cannot simply call interop.isHashEntryExisting, because it would most likely
// use the default Java `hashCode` and `equals` methods. We need our EqualsAnyNode
// instead, so the check for a non-existing key is done inside the while loop.
boolean keyToRemoveFound = false;
var mapBuilder = EnsoHashMapBuilder.create(hashCodeNode, equalsNode);
try {
Object entriesIterator = interop.getHashEntriesIterator(map);
while (interop.hasIteratorNextElement(entriesIterator)) {
Object keyValueArr = interop.getIteratorNextElement(entriesIterator);
Object key = interop.readArrayElement(keyValueArr, 0);
if (equalsNode.execute(keyToRemove, key)) {
if (keyToRemoveFound) {
throw new IllegalStateException("Key " + key + " found twice");
} else {
keyToRemoveFound = true;
}
} else {
Object value = interop.readArrayElement(keyValueArr, 1);
mapBuilder.add(key, value);
}
}
} catch (UnsupportedMessageException | StopIterationException | InvalidArrayIndexException e) {
throw new IllegalStateException(
"Polyglot hash map " + map + " implements the interop API (hash entries iterator) incorrectly",
e
);
}
if (keyToRemoveFound) {
return EnsoHashMap.createWithBuilder(mapBuilder, mapBuilder.getSize());
} else {
CompilerDirectives.transferToInterpreter();
throw DataflowError.withoutTrace("No such key " + keyToRemove, interop);
}
}
}
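The foreign-map path of `removeFromInteropMap` rebuilds the map entry by entry, comparing keys with language-level equality rather than Java's `equals`. The same idea in a self-contained sketch, with a `BiPredicate` standing in for `EqualsAnyNode` (the helper `RemoveSketch.removeWith` is hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.BiPredicate;

// Sketch of the rebuild-without-key removal: copy every entry except the one
// whose key matches under the custom equality predicate, and fail if no entry
// matched (mirroring the DataflowError in the real node).
final class RemoveSketch {
  static Map<Object, Object> removeWith(
      Map<Object, Object> source, Object keyToRemove, BiPredicate<Object, Object> equals) {
    var result = new LinkedHashMap<Object, Object>();
    boolean found = false;
    for (var entry : source.entrySet()) {
      if (equals.test(keyToRemove, entry.getKey())) {
        found = true; // skip: this is the entry being removed
      } else {
        result.put(entry.getKey(), entry.getValue());
      }
    }
    if (!found) {
      throw new IllegalArgumentException("No such key " + keyToRemove);
    }
    return result;
  }
}
```

With a numeric-equality predicate, the key `"020"` removes the entry stored under `"20"` even though the two strings are not `equals` in Java, which is why the real node cannot delegate to `isHashEntryExisting`.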

View File

@@ -0,0 +1,39 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.dsl.Fallback;
import com.oracle.truffle.api.dsl.GenerateUncached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import org.enso.interpreter.dsl.BuiltinMethod;
@BuiltinMethod(
type = "Map",
name = "size",
description = "Returns the number of entries in this hash map",
autoRegister = false)
@GenerateUncached
public abstract class HashMapSizeNode extends Node {
public static HashMapSizeNode build() {
return HashMapSizeNodeGen.create();
}
public abstract long execute(Object self);
@Specialization(guards = "interop.hasHashEntries(hashMap)", limit = "3")
long getHashMapSize(Object hashMap, @CachedLibrary("hashMap") InteropLibrary interop) {
try {
return interop.getHashSize(hashMap);
} catch (UnsupportedMessageException e) {
throw new IllegalStateException(e);
}
}
@Fallback
long fallback(Object hashMap) {
return 0;
}
}

View File

@@ -0,0 +1,57 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.StopIterationException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import org.enso.interpreter.dsl.BuiltinMethod;
@BuiltinMethod(
type = "Map",
name = "to_text",
description = """
Returns a text representation of this hash map.
""",
autoRegister = false
)
public abstract class HashMapToTextNode extends Node {
public static HashMapToTextNode build() {
return HashMapToTextNodeGen.create();
}
public abstract Object execute(Object self);
@TruffleBoundary
@Specialization(guards = "interop.hasHashEntries(hashMap)")
Object hashMapToText(Object hashMap,
@CachedLibrary(limit = "5") InteropLibrary interop) {
var sb = new StringBuilder();
sb.append("{");
try {
Object entryIterator = interop.getHashEntriesIterator(hashMap);
while (interop.hasIteratorNextElement(entryIterator)) {
Object keyValuePair = interop.getIteratorNextElement(entryIterator);
Object key = interop.readArrayElement(keyValuePair, 0);
Object value = interop.readArrayElement(keyValuePair, 1);
sb.append(key).append("=").append(value).append(", ");
}
if (interop.getHashSize(hashMap) > 0) {
// Delete last comma
sb.delete(sb.length() - 2, sb.length());
}
} catch (UnsupportedMessageException | StopIterationException | InvalidArrayIndexException e) {
throw new IllegalStateException(
"hashMap " + hashMap + " probably implements the interop API incorrectly",
e
);
}
sb.append("}");
return sb.toString();
}
}
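`hashMapToText` appends `key=value, ` for every entry and then trims the trailing separator. The same formatting scheme, extracted into a standalone sketch over a plain `Map` (the class `ToTextSketch` is an invented name for illustration):

```java
import java.util.Map;

// Standalone sketch of the to_text formatting: append "key=value, " per entry,
// then delete the final ", " before closing the brace, exactly as the node does
// via sb.delete(sb.length() - 2, sb.length()).
final class ToTextSketch {
  static String render(Map<?, ?> map) {
    var sb = new StringBuilder("{");
    for (var entry : map.entrySet()) {
      sb.append(entry.getKey()).append("=").append(entry.getValue()).append(", ");
    }
    if (!map.isEmpty()) {
      sb.delete(sb.length() - 2, sb.length()); // drop the trailing ", "
    }
    return sb.append("}").toString();
  }
}
```

The empty-map guard matters: without it, the trim would throw on `{}` since there is no separator to delete.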

View File

@@ -0,0 +1,78 @@
package org.enso.interpreter.runtime.data.hash;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.dsl.Fallback;
import com.oracle.truffle.api.dsl.GenerateUncached;
import com.oracle.truffle.api.dsl.Specialization;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.InvalidArrayIndexException;
import com.oracle.truffle.api.interop.StopIterationException;
import com.oracle.truffle.api.interop.UnsupportedMessageException;
import com.oracle.truffle.api.library.CachedLibrary;
import com.oracle.truffle.api.nodes.Node;
import com.oracle.truffle.api.profiles.ConditionProfile;
import org.enso.interpreter.dsl.BuiltinMethod;
import org.enso.interpreter.runtime.data.Vector;
@BuiltinMethod(
type = "Map",
name = "to_vector",
description = """
Transforms the hash map into a vector of key-value pairs. If possible, caches
the result. Key-value pairs are represented as nested 2-element vectors.
""",
autoRegister = false
)
@GenerateUncached
public abstract class HashMapToVectorNode extends Node {
public static HashMapToVectorNode build() {
return HashMapToVectorNodeGen.create();
}
public abstract Object execute(Object self);
@Specialization
Object ensoMapToVector(EnsoHashMap hashMap,
@Cached ConditionProfile vectorReprNotCachedProfile) {
return hashMap.getCachedVectorRepresentation(vectorReprNotCachedProfile);
}
@Specialization(guards = "mapInterop.hasHashEntries(hashMap)", limit = "3")
Object foreignMapToVector(Object hashMap,
@CachedLibrary("hashMap") InteropLibrary mapInterop,
@CachedLibrary(limit = "3") InteropLibrary iteratorInterop) {
return createEntriesVectorFromForeignMap(hashMap, mapInterop, iteratorInterop);
}
@Fallback
Object fallback(Object object) {
return Vector.fromArray(HashEntriesVector.createEmpty());
}
private static Object createEntriesVectorFromForeignMap(
Object hashMap,
InteropLibrary mapInterop,
InteropLibrary iteratorInterop) {
try {
int hashSize = (int) mapInterop.getHashSize(hashMap);
Object[] keys = new Object[hashSize];
Object[] values = new Object[hashSize];
Object entryIterator = mapInterop.getHashEntriesIterator(hashMap);
int arrIdx = 0;
while (iteratorInterop.hasIteratorNextElement(entryIterator)) {
Object keyValueArr = iteratorInterop.getIteratorNextElement(entryIterator);
keys[arrIdx] = iteratorInterop.readArrayElement(keyValueArr, 0);
values[arrIdx] = iteratorInterop.readArrayElement(keyValueArr, 1);
arrIdx++;
}
return Vector.fromArray(
HashEntriesVector.createFromKeysAndValues(keys, values)
);
} catch (UnsupportedMessageException | StopIterationException | InvalidArrayIndexException e) {
throw new IllegalStateException("hashMap: " + hashMap + " probably implements the hash interop API incorrectly", e);
}
}
}
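`HashMapToVectorNode` materializes a map as a vector of nested 2-element `[key, value]` pairs, the shape that both `getHashEntriesIterator` consumers and `Map.to_vector` expect. A plain-Java sketch of the same shape, with `List<Object[]>` standing in for the interop vector of pairs (the helper `ToVectorSketch.toPairs` is hypothetical):

```java
import java.util.List;
import java.util.Map;

// Sketch of the nested-pair representation produced by HashEntriesVector:
// each entry becomes a 2-element array [key, value], in iteration order.
final class ToVectorSketch {
  static List<Object[]> toPairs(Map<?, ?> map) {
    return map.entrySet().stream()
        .map(e -> new Object[] {e.getKey(), e.getValue()})
        .toList();
  }
}
```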

View File

@@ -3,7 +3,9 @@ package org.enso.interpreter.runtime.data.text;
import com.ibm.icu.text.BreakIterator;
import com.ibm.icu.text.Normalizer2;
import com.oracle.truffle.api.CompilerDirectives;
import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary;
import com.oracle.truffle.api.dsl.Cached;
import com.oracle.truffle.api.utilities.TriState;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.TruffleObject;
import com.oracle.truffle.api.library.CachedLibrary;
@@ -83,7 +85,7 @@ public final class Text implements TruffleObject {
return false;
}
case UNKNOWN -> {
Normalizer2 normalizer = Normalizer2.getNFDInstance();
boolean isNormalized = normalizer.isNormalized(toString());
setFcdNormalized(isNormalized);
return isNormalized;

View File

@@ -9,6 +9,7 @@ import org.enso.interpreter.runtime.callable.atom.Atom;
import org.enso.interpreter.runtime.callable.atom.AtomConstructor;
import org.enso.interpreter.runtime.callable.function.Function;
import org.enso.interpreter.runtime.data.*;
import org.enso.interpreter.runtime.data.hash.EnsoHashMap;
import org.enso.interpreter.runtime.data.text.Text;
import org.enso.interpreter.runtime.error.*;
import org.enso.interpreter.runtime.number.EnsoBigInteger;
@@ -47,6 +48,7 @@ import org.enso.polyglot.data.TypeGraph;
PanicException.class,
PanicSentinel.class,
Vector.class,
EnsoHashMap.class,
Warning.class,
EnsoFile.class,
EnsoDate.class,

View File

@@ -0,0 +1,101 @@
package org.enso.interpreter.test;
import static org.junit.Assert.assertEquals;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import org.enso.interpreter.node.expression.builtin.meta.EqualsAnyNode;
import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.experimental.theories.DataPoints;
import org.junit.experimental.theories.Theories;
import org.junit.experimental.theories.Theory;
import org.junit.runner.RunWith;
@RunWith(Theories.class)
public class EqualsTest extends TestBase {
private static Context context;
private EqualsAnyNode equalsNode;
@BeforeClass
public static void initContextAndData() {
context = createDefaultContext();
unwrappedValues = fetchAllUnwrappedValues();
}
@Before
public void initNodes() {
executeInContext(
context,
() -> {
equalsNode = EqualsAnyNode.build();
return null;
});
}
@AfterClass
public static void disposeContext() {
context.close();
}
@DataPoints public static Object[] unwrappedValues;
private static Object[] fetchAllUnwrappedValues() {
var valGenerator =
ValuesGenerator.create(
context,
ValuesGenerator.Language.ENSO,
ValuesGenerator.Language.JAVA,
ValuesGenerator.Language.JAVASCRIPT,
ValuesGenerator.Language.PYTHON);
List<Value> values = new ArrayList<>();
values.addAll(valGenerator.numbers());
values.addAll(valGenerator.booleans());
values.addAll(valGenerator.textual());
values.addAll(valGenerator.arrayLike());
values.addAll(valGenerator.vectors());
values.addAll(valGenerator.maps());
values.addAll(valGenerator.multiLevelAtoms());
values.addAll(valGenerator.timesAndDates());
values.addAll(valGenerator.timeZones());
values.addAll(valGenerator.durations());
values.addAll(valGenerator.periods());
values.addAll(valGenerator.warnings());
try {
return values.stream()
.map(value -> unwrapValue(context, value))
.collect(Collectors.toList())
.toArray(new Object[] {});
} catch (Exception e) {
throw new AssertionError(e);
}
}
@Theory
public void equalsOperatorShouldBeSymmetric(Object firstValue, Object secondValue) {
executeInContext(
context,
() -> {
boolean firstResult = equalsNode.execute(firstValue, secondValue);
boolean secondResult = equalsNode.execute(secondValue, firstValue);
assertEquals("equals should be symmetric", firstResult, secondResult);
return null;
});
}
@Theory
public void equalsOperatorShouldBeConsistent(Object value) {
executeInContext(
context,
() -> {
boolean firstResult = equalsNode.execute(value, value);
boolean secondResult = equalsNode.execute(value, value);
assertEquals("equals should be consistent", firstResult, secondResult);
return null;
});
}
}
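The theory above checks that swapping the operands of `EqualsAnyNode` does not change the answer. That symmetry property can be stated as a standalone predicate over ordinary Java values (a generic sketch, independent of the Truffle node):

```java
import java.util.List;
import java.util.Objects;

// Symmetry: for all a, b in the sample, equals(a, b) == equals(b, a).
final class SymmetryCheck {
    static boolean isSymmetric(List<?> samples) {
        for (Object a : samples) {
            for (Object b : samples) {
                if (Objects.equals(a, b) != Objects.equals(b, a)) {
                    return false;
                }
            }
        }
        return true;
    }
}
```

JUnit's `Theories` runner does essentially this pairing automatically: every `@Theory` is invoked once per combination of `@DataPoints` values.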


@@ -0,0 +1,136 @@
package org.enso.interpreter.test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import com.oracle.truffle.api.interop.InteropLibrary;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import org.enso.interpreter.node.expression.builtin.meta.EqualsAnyNode;
import org.enso.interpreter.node.expression.builtin.meta.HashCodeAnyNode;
import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.experimental.theories.DataPoints;
import org.junit.experimental.theories.Theories;
import org.junit.experimental.theories.Theory;
import org.junit.runner.RunWith;
@RunWith(Theories.class)
public class HashCodeTest extends TestBase {
private static Context context;
private static final InteropLibrary interop = InteropLibrary.getUncached();
private HashCodeAnyNode hashCodeNode;
private EqualsAnyNode equalsNode;
@BeforeClass
public static void initContextAndData() {
context = createDefaultContext();
// Initialize datapoints here, to make sure that it is initialized just once.
unwrappedValues = fetchAllUnwrappedValues();
}
@Before
public void initNodes() {
executeInContext(context, () -> {
hashCodeNode = HashCodeAnyNode.build();
equalsNode = EqualsAnyNode.build();
return null;
});
}
@AfterClass
public static void disposeContext() {
context.close();
}
/**
* All values are stored in a static field rather than returned from a method. Methods annotated
* with {@code DataPoints} may be called multiple times, so that annotation should be avoided for
* expensive initialization.
*/
@DataPoints
public static Object[] unwrappedValues;
private static Object[] fetchAllUnwrappedValues() {
var valGenerator = ValuesGenerator.create(
context,
ValuesGenerator.Language.ENSO,
ValuesGenerator.Language.JAVA,
ValuesGenerator.Language.JAVASCRIPT,
ValuesGenerator.Language.PYTHON
);
List<Value> values = new ArrayList<>();
values.addAll(valGenerator.numbers());
values.addAll(valGenerator.booleans());
values.addAll(valGenerator.textual());
values.addAll(valGenerator.arrayLike());
values.addAll(valGenerator.vectors());
values.addAll(valGenerator.maps());
values.addAll(valGenerator.multiLevelAtoms());
values.addAll(valGenerator.timesAndDates());
values.addAll(valGenerator.timeZones());
values.addAll(valGenerator.durations());
values.addAll(valGenerator.periods());
values.addAll(valGenerator.warnings());
try {
return values
.stream()
.map(value -> unwrapValue(context, value))
.collect(Collectors.toList())
.toArray(new Object[]{});
} catch (Exception e) {
throw new AssertionError(e);
}
}
@Theory
public void hashCodeContractTheory(Object firstValue, Object secondValue) {
executeInContext(context, () -> {
long firstHash = hashCodeNode.execute(firstValue);
long secondHash = hashCodeNode.execute(secondValue);
boolean valuesAreEqual = equalsNode.execute(firstValue, secondValue);
// if o1 == o2 then hash(o1) == hash(o2)
if (valuesAreEqual) {
assertEquals(
String.format("""
If two objects are equal, they should have the same hash code:
firstVal = %s, secondVal = %s, firstHash = %d, secondHash = %d
""",
interop.toDisplayString(firstValue),
interop.toDisplayString(secondValue),
firstHash,
secondHash
),
firstHash,
secondHash
);
}
// if hash(o1) != hash(o2) then o1 != o2
if (firstHash != secondHash) {
assertFalse(
"Violated rule: `if hash(o1) != hash(o2) then o1 != o2`",
valuesAreEqual
);
}
return null;
});
}
@Theory
public void hashCodeIsConsistent(Object value) {
executeInContext(context, () -> {
long firstHash = hashCodeNode.execute(value);
long secondHash = hashCodeNode.execute(value);
assertEquals(
"Hash code of an object should be consistent",
firstHash,
secondHash
);
return null;
});
}
}
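The contract these theories enforce is the classic one: equal values must produce equal hash codes, and differing hash codes imply inequality. A plain-Java restatement over an arbitrary sample (a generic sketch, not the Enso nodes):

```java
import java.util.List;

// Checks the core hash-code contract over every pair in a sample:
// if a.equals(b) then a.hashCode() == b.hashCode().
// The contrapositive gives the second rule used in the theory above:
// if hash(a) != hash(b) then a must not equal b.
final class HashContract {
    static boolean holdsFor(List<?> samples) {
        for (Object a : samples) {
            for (Object b : samples) {
                if (a.equals(b) && a.hashCode() != b.hashCode()) {
                    return false;
                }
            }
        }
        return true;
    }
}
```

Note the converse is not required: unequal values may collide on the same hash, which is why the test only asserts in the two directions shown.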


@@ -15,30 +15,18 @@ import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.fail;
+import org.junit.After;
import org.junit.Before;
import org.junit.Test;
-public class MetaIsATest {
+public class MetaIsATest extends TestBase {
private Context ctx;
private Value isACheck;
@Before
public void prepareCtx() throws Exception {
-Engine eng = Engine.newBuilder()
-.allowExperimentalOptions(true)
-.logHandler(new ByteArrayOutputStream())
-.option(
-RuntimeOptions.LANGUAGE_HOME_OVERRIDE,
-Paths.get("../../distribution/component").toFile().getAbsolutePath()
-).build();
-this.ctx = Context.newBuilder()
-.engine(eng)
-.allowIO(true)
-.allowAllAccess(true)
-.build();
-final Map<String, Language> langs = ctx.getEngine().getLanguages();
-assertNotNull("Enso found: " + langs, langs.get("enso"));
+ctx = createDefaultContext();
final URI uri = new URI("memory://choose.enso");
final Source src = Source.newBuilder("enso", """
import Standard.Base.Meta
@@ -53,6 +41,11 @@ public class MetaIsATest {
assertTrue("it is a function", isACheck.canExecute());
}
+@After
+public void disposeCtx() {
+ctx.close();
+}
@Test
public void checkNumbersAreNumber() {
var g = ValuesGenerator.create(ctx);


@@ -21,25 +21,22 @@ import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.fail;
+import org.junit.After;
import org.junit.Before;
import org.junit.Test;
-public class MetaObjectTest {
+public class MetaObjectTest extends TestBase {
private Context ctx;
@Before
-public void prepareCtx() throws Exception {
-Engine eng =
-Engine.newBuilder()
-.allowExperimentalOptions(true)
-.logHandler(new ByteArrayOutputStream())
-.option(
-RuntimeOptions.LANGUAGE_HOME_OVERRIDE,
-Paths.get("../../distribution/component").toFile().getAbsolutePath())
-.build();
-this.ctx = Context.newBuilder().engine(eng).allowIO(true).allowAllAccess(true).build();
-final Map<String, Language> langs = ctx.getEngine().getLanguages();
-assertNotNull("Enso found: " + langs, langs.get("enso"));
+public void prepareCtx() {
+ctx = createDefaultContext();
+}
+@After
+public void disposeCtx() {
+ctx.close();
}
@Test


@@ -0,0 +1,91 @@
package org.enso.interpreter.test;
import static org.junit.Assert.assertNotNull;
import com.oracle.truffle.api.interop.InteropLibrary;
import com.oracle.truffle.api.interop.TruffleObject;
import com.oracle.truffle.api.library.ExportLibrary;
import com.oracle.truffle.api.library.ExportMessage;
import java.io.ByteArrayOutputStream;
import java.nio.file.Paths;
import java.util.Map;
import java.util.concurrent.Callable;
import org.enso.polyglot.RuntimeOptions;
import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Language;
import org.graalvm.polyglot.Source;
import org.graalvm.polyglot.Value;
import org.graalvm.polyglot.proxy.ProxyExecutable;
public abstract class TestBase {
protected static Context createDefaultContext() {
var context =
Context.newBuilder("enso")
.allowExperimentalOptions(true)
.allowIO(true)
.allowAllAccess(true)
.logHandler(new ByteArrayOutputStream())
.option(
RuntimeOptions.LANGUAGE_HOME_OVERRIDE,
Paths.get("../../distribution/component").toFile().getAbsolutePath())
.build();
final Map<String, Language> langs = context.getEngine().getLanguages();
assertNotNull("Enso found: " + langs, langs.get("enso"));
return context;
}
/**
* Executes the given callable in the given context. A necessity for executing artificially
* created Truffle ASTs.
*
* @return Object returned from {@code callable} wrapped in {@link Value}.
*/
protected static Value executeInContext(Context ctx, Callable<Object> callable) {
// Force initialization of the context
ctx.eval("enso", "42");
ctx.getPolyglotBindings()
.putMember(
"testSymbol",
(ProxyExecutable)
(Value... args) -> {
try {
return callable.call();
} catch (Exception e) {
throw new AssertionError(e);
}
});
return ctx.getPolyglotBindings().getMember("testSymbol").execute();
}
/**
* Unwraps the `receiver` field from the Value. This is a hack to allow us to test execute methods
* of artificially created ASTs, e.g., single nodes.
*
* <p>Does something similar to what {@link
* com.oracle.truffle.tck.DebuggerTester#getSourceImpl(Source)} does, but uses a different hack
* than reflective access.
*/
protected static Object unwrapValue(Context ctx, Value value) {
var unwrapper = new Unwrapper();
var unwrapperValue = ctx.asValue(unwrapper);
unwrapperValue.execute(value);
assertNotNull(unwrapper.args);
return unwrapper.args[0];
}
@ExportLibrary(InteropLibrary.class)
static final class Unwrapper implements TruffleObject {
Object[] args;
@ExportMessage
Object execute(Object[] args) {
this.args = args;
return this;
}
@ExportMessage
boolean isExecutable() {
return true;
}
}
}


@@ -1,7 +1,15 @@
package org.enso.interpreter.test;
+import static org.junit.Assert.assertFalse;
+import static org.junit.Assert.assertNull;
+import static org.junit.Assert.assertTrue;
+import java.time.Duration;
import java.time.LocalDate;
import java.time.LocalTime;
+import java.time.Period;
+import java.time.ZoneId;
+import java.time.ZoneOffset;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.EnumSet;
@@ -9,12 +17,10 @@ import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
+import java.util.TimeZone;
import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.PolyglotException;
import org.graalvm.polyglot.Value;
-import static org.junit.Assert.assertFalse;
-import static org.junit.Assert.assertNull;
-import static org.junit.Assert.assertTrue;
/**
* The purpose of this class is to generate various values needed for other
@@ -175,6 +181,12 @@ class ValuesGenerator {
""", "Vector").type();
}
+public Value typeMap() {
+return v("typeMap", """
+import Standard.Base.Data.Map.Map
+""", "Map").type();
+}
public Value typeWarning() {
return v("typeWarning", """
import Standard.Base.Warning.Warning
@@ -236,7 +248,9 @@ class ValuesGenerator {
fac s n = if n <= 1 then s else
@Tail_Call fac n*s n-1
""", "fac 1 100").type());
+collect.add(v(null, "", "123 * 10^40").type());
+collect.add(v(null, "", "123 * 10^40 + 0.0").type());
+collect.add(v(null, "", "123 * 10^40 + 1.0").type());
}
if (languages.contains(Language.JAVA)) {
@@ -257,6 +271,8 @@ class ValuesGenerator {
public List<Value> textual() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
+// TODO: Add once PR #3956 is merged
+//collect.add(v(null, "", "''").type());
collect.add(v(null, "", "'fourty two'").type());
collect.add(v(null, "", "'?'").type());
collect.add(v(null, "", """
@@ -269,7 +285,14 @@
if (languages.contains(Language.JAVA)) {
collect.add(ctx.asValue("fourty four from Java"));
-// collect.add(ctx.asValue('J'));
+collect.add(ctx.asValue(""));
+collect.add(ctx.asValue("吰 abcde 1"));
+collect.add(ctx.asValue("1234"));
+collect.add(ctx.asValue("\t"));
+collect.add(ctx.asValue("\n"));
+collect.add(ctx.asValue("\r"));
+collect.add(ctx.asValue("\r\t \t\r"));
+collect.add(ctx.asValue("J"));
}
for (var v : collect) {
@@ -297,7 +320,7 @@
return collect;
}
-public List<Value> times() {
+public List<Value> timesAndDates() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
collect.add(v(null, "import Standard.Base.Data.Time.Date.Date", "Date.now").type());
@@ -321,6 +344,98 @@
return collect;
}
public List<Value> timeZones() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
for (var expr : List.of(
"Time_Zone.new",
"Time_Zone.system",
"Time_Zone.local",
"Time_Zone.utc",
"Time_Zone.new 1 2 3",
"Time_Zone.parse 'Europe/Moscow'",
"Time_Zone.parse 'Europe/London'",
"Time_Zone.parse 'CET'"
)) {
collect.add(v(null, "import Standard.Base.Data.Time.Time_Zone.Time_Zone", expr).type());
}
}
if (languages.contains(Language.JAVA)) {
for (var javaValue : List.of(
TimeZone.getTimeZone("America/Los_Angeles"),
TimeZone.getTimeZone(ZoneId.systemDefault()),
TimeZone.getTimeZone(ZoneId.ofOffset("GMT", ZoneOffset.ofHours(2))),
TimeZone.getTimeZone(ZoneId.ofOffset("GMT", ZoneOffset.ofHoursMinutes(14, 45))),
TimeZone.getTimeZone(ZoneId.ofOffset("UTC", ZoneOffset.ofHours(-15)))
)) {
collect.add(ctx.asValue(javaValue));
}
}
return collect;
}
public List<Value> durations() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
for (var expr : List.of(
"Duration.zero",
"Duration.new 1",
"Duration.new 1 1",
"Duration.new nanoseconds=900",
"Duration.new minutes=900",
"Duration.between (Date_Time.new 2022 01 01) (Date_Time.new 2022 02 02)",
"Duration.between (Date_Time.new 2022 01 01) (Date_Time.new 2022 02 02) timezone_aware=False"
)) {
collect.add(v(null, """
import Standard.Base.Data.Time.Duration.Duration
import Standard.Base.Data.Time.Date_Time.Date_Time
from Standard.Base.Data.Boolean.Boolean import False
""", expr).type());
}
}
if (languages.contains(Language.JAVA)) {
for (var javaValue : List.of(
Duration.ofHours(1),
Duration.ofHours(0),
Duration.ofSeconds(600),
Duration.ofNanos(9784),
Duration.ZERO
)) {
collect.add(ctx.asValue(javaValue));
}
}
collect.forEach(value -> assertTrue("Is duration: " + value, value.isDuration()));
return collect;
}
public List<Value> periods() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
for (var expr : List.of(
"Period.new",
"Period.new 1",
"Period.new 1 14",
"Period.new days=568",
"Period.new years=23451"
)) {
collect.add(v(null, "import Standard.Base.Data.Time.Period.Period", expr).type());
}
}
if (languages.contains(Language.JAVA)) {
for (var javaValue : List.of(
Period.ZERO,
Period.ofDays(12),
Period.ofDays(65),
Period.ofMonths(13),
Period.of(12, 4, 60),
Period.ofYears(23410)
)) {
collect.add(ctx.asValue(javaValue));
}
}
return collect;
}
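The generator above keeps time-based `Duration` values separate from date-based `Period` values, matching the split in `java.time`: the two are not interchangeable, and neither normalizes across the other's units. A small illustration (helper names are invented):

```java
import java.time.Duration;
import java.time.Period;

// Duration counts exact time (seconds and nanos); Period counts calendar
// fields (years, months, days) and keeps them as entered.
final class TimeSpans {
    static long minutesIn(Duration d) {
        return d.toMinutes();
    }

    static int monthsIn(Period p) {
        // getMonths() returns the months field as-is, not a total.
        return p.getMonths();
    }
}
```

This is why `Value.isDuration` holds for the `durations()` entries but a `Period` needs its own generator method.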
public List<Value> arrayLike() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
@@ -347,6 +462,83 @@
return collect;
}
public List<Value> vectors() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
collect.add(v(null, "", "[1,2,3]").type());
collect.add(v(null, "", "[]").type());
collect.add(v(null, "", "['a', 2, 0]").type());
collect.add(v(null, "", "['a', 'b', 'c']").type());
collect.add(v(null, "from Standard.Base.Nothing import Nothing", "[Nothing, Nothing]").type());
collect.add(v(null, "from Standard.Base.Nothing import Nothing", "[Nothing, 'fff', 0, Nothing]").type());
}
return collect;
}
public List<Value> maps() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
var imports = """
import Standard.Base.Data.Map.Map
import Standard.Base.Nothing.Nothing
""";
for (var expr : List.of(
"Map.empty",
"Map.singleton Nothing Nothing",
"Map.singleton Nothing 'my_value'",
"Map.singleton 'my_value' Nothing",
"Map.singleton 1 1",
"Map.singleton 'C' 3",
"Map.singleton 'C' 43",
"Map.empty.insert 'A' 10 . insert 'B' 20",
// ((int) 'A') + ((int) 'B') = 131 ; codePoint(131) = \203
"Map.singleton '\203' 30",
"Map.singleton Map.empty 1",
"Map.singleton Map.empty Map.empty",
"Map.empty.insert 1 1 . insert 2 2",
"Map.empty.insert Nothing 'val' . insert 'key' 42",
"Map.empty.insert 'A' 1 . insert 'B' 2 . insert 'C' 3",
"Map.empty.insert 'C' 3 . insert 'B' 2 . insert 'A' 1"
)) {
collect.add(v(null, imports, expr).type());
}
}
return collect;
}
public List<Value> multiLevelAtoms() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {
var nodeTypeDef = """
type Node
C1 f1
C2 f1 f2
C3 f1 f2 f3
Nil
Value value
""";
for (var expr : List.of(
"Node.C2 Node.Nil (Node.Value 42)",
"Node.C2 (Node.Value 42) Node.Nil",
"Node.Nil",
"Node.Value 42",
"Node.Value 2",
"Node.Value 2.0",
"Node.C1 (Node.Value 42)",
"Node.C1 Node.Nil",
"Node.C3 Node.Nil (Node.Value 42) Node.Nil",
"Node.C3 (Node.Value 42) Node.Nil Node.Nil",
"Node.C3 Node.Nil Node.Nil Node.Nil",
"Node.C2 (Node.C2 (Node.C1 Node.Nil) (Node.C1 (Node.C1 Node.Nil))) (Node.C2 (Node.C3 (Node.Nil) (Node.Value 22) (Node.Nil)) (Node.C2 (Node.Value 22) (Node.Nil)))",
"Node.C2 (Node.C2 (Node.C1 Node.Nil) (Node.C1 Node.Nil)) (Node.C2 (Node.C3 (Node.Nil) (Node.Value 22) (Node.Nil)) (Node.C2 (Node.Value 22) (Node.Nil)))",
"Node.C2 (Node.C2 (Node.C1 Node.Nil) (Node.C1 Node.Nil)) (Node.C2 (Node.C3 (Node.Nil) (Node.Nil) (Node.Value 22)) (Node.C2 (Node.Value 22) (Node.Nil)))"
)) {
collect.add(v(null, nodeTypeDef, expr).type());
}
}
return collect;
}
public List<Value> functions() {
var collect = new ArrayList<Value>();
if (languages.contains(Language.ENSO)) {


@@ -43,6 +43,7 @@ public record TypeWithKind(String baseType, TypeKind kind) {
"org.enso.interpreter.runtime.callable.function.Function",
"org.enso.interpreter.runtime.data.Array",
"org.enso.interpreter.runtime.data.Vector",
+"org.enso.interpreter.runtime.data.hash.EnsoHashMap",
"org.enso.interpreter.runtime.data.ArrayOverBuffer",
"org.enso.interpreter.runtime.data.ArrayProxy",
"org.enso.interpreter.runtime.data.EnsoFile",


@@ -12,13 +12,13 @@ polyglot java import org.enso.base.Time_Utils
## Bench Utilities ============================================================
iter_size = 100
-num_iterations = 10
+num_iterations = 20
# The Benchmarks ==============================================================
bench =
-random_vec = Utils.make_random_vec 10000
-uniform_vec = Base.Vector.fill 10000 1
+random_vec = Utils.make_random_vec 100000
+uniform_vec = Base.Vector.fill 100000 1
random_text_vec = random_vec.map .to_text
uniform_text_vec = random_vec.map .to_text


@@ -32,7 +32,7 @@ spec =
Test.specify "should allow converting a GeoJSON array of features into a table" <|
fields = ['foo', 'bar', 'baz', 'longitude', 'elevation']
t = Geo.geo_json_to_table (geo_json.get "features") fields
-t.columns.map .name . should_equal fields
+t.columns.map .name . should_contain_the_same_elements_as fields
t.at 'foo' . to_vector . should_equal [1, 2]
t.at 'bar' . to_vector . should_equal ['value2', Nothing]
t.at 'baz' . to_vector . should_equal [Nothing, 3]
@@ -42,7 +42,7 @@ spec =
Test.specify "should allow converting a GeoJSON object into a table with provided fields" <|
fields = ['foo', 'bar', 'longitude']
t = Geo.geo_json_to_table geo_json fields
-t.columns.map .name . should_equal fields
+t.columns.map .name . should_contain_the_same_elements_as fields
t.at 'foo' . to_vector . should_equal [1, 2]
t.at 'bar' . to_vector . should_equal ['value2', Nothing]
t.at 'longitude' . to_vector . should_equal [-118.58, 10.11]
@@ -50,7 +50,7 @@ spec =
Test.specify "should allow converting a GeoJSON object into a table containing all available fields" <|
fields = ['bar', 'baz', 'elevation', 'foo', 'latitude', 'longitude']
t = Geo.geo_json_to_table geo_json
-t.columns.map .name . should_equal fields
+t.columns.map .name . should_contain_the_same_elements_as fields
t.at 'foo' . to_vector . should_equal [1, 2]
t.at 'bar' . to_vector . should_equal ['value2', Nothing]
t.at 'baz' . to_vector . should_equal [Nothing, 3]
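`should_contain_the_same_elements_as` replaces the order-sensitive `should_equal` above, because hash-map-backed column lookups no longer guarantee insertion order. The distinction can be sketched in Java with a multiset comparison (`SameElements` is a hypothetical helper, not part of the Enso test library):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

final class SameElements {
    // True when both lists hold the same elements with the same multiplicities,
    // ignoring order -- unlike List.equals, which is order-sensitive.
    static boolean check(List<?> a, List<?> b) {
        return counts(a).equals(counts(b));
    }

    private static Map<Object, Integer> counts(List<?> xs) {
        Map<Object, Integer> m = new HashMap<>();
        for (Object x : xs) {
            m.merge(x, 1, Integer::sum);
        }
        return m;
    }
}
```

Counting multiplicities (rather than comparing sets) keeps the check honest when a name appears more than once.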


@@ -84,7 +84,10 @@ spec setup =
action = table.select_columns selector on_problems=_
tester = expect_column_names ["foo", "bar"]
-problems = [Input_Indices_Already_Matched.Error [-7, 1]]
-Problems.test_problem_handling action problems tester
+err_checker err =
+err.catch.should_be_a Input_Indices_Already_Matched.Error
+err.catch.indices.should_contain_the_same_elements_as [-7, 1]
+Problems.test_advanced_problem_handling action err_checker (x-> x) tester
Test.specify "should correctly handle problems: duplicate names" <|
selector = By_Name ["foo", "foo"]
@ -475,22 +478,28 @@ spec setup =
map = Column_Name_Mapping.By_Name (Map.from_vector [["alpha", "FirstColumn"], ["omicron", "Another"], [weird_name, "Fixed"]])
action = table.rename_columns map on_problems=_
tester = expect_column_names ["FirstColumn", "beta", "gamma", "delta"]
-problems = [Missing_Input_Columns.Error [weird_name, "omicron"]]
-Problems.test_problem_handling action problems tester
+err_checker err =
+err.catch.should_be_a Missing_Input_Columns.Error
+err.catch.criteria.should_contain_the_same_elements_as ["omicron", weird_name]
+Problems.test_advanced_problem_handling action err_checker (x-> x) tester
Test.specify "should correctly handle problems: out of bounds indices" <|
map = Column_Name_Mapping.By_Index (Map.from_vector [[0, "FirstColumn"], [-1, "Another"], [100, "Boo"], [-200, "Nothing"], [300, "Here"]])
action = table.rename_columns map on_problems=_
tester = expect_column_names ["FirstColumn", "beta", "gamma", "Another"]
-problems = [Column_Indexes_Out_Of_Range.Error [-200, 100, 300]]
-Problems.test_problem_handling action problems tester
+err_checker err =
+err.catch.should_be_a Column_Indexes_Out_Of_Range.Error
+err.catch.indexes.should_contain_the_same_elements_as [-200, 100, 300]
+Problems.test_advanced_problem_handling action err_checker (x-> x) tester
Test.specify "should correctly handle problems: aliased indices" <|
-map = Column_Name_Mapping.By_Index (Map.from_vector [[1, "FirstColumn"], [-3, "Another"]])
+map = Column_Name_Mapping.By_Index (Map.from_vector [[1, "FirstColumn"], [-3, "FirstColumn"]])
action = table.rename_columns map on_problems=_
-tester = expect_column_names ["alpha", "Another", "gamma", "delta"]
+tester = expect_column_names ["alpha", "FirstColumn", "gamma", "delta"]
-problems = [Input_Indices_Already_Matched.Error [1]]
-Problems.test_problem_handling action problems tester
+err_checker err =
+err.catch.should_be_a Input_Indices_Already_Matched.Error
+(err.catch.indices.contains 1 || err.catch.indices.contains -3) . should_be_true
+Problems.test_advanced_problem_handling action err_checker (x-> x) tester
Test.specify "should correctly handle problems: invalid names ''" <|
map = Column_Name_Mapping.By_Index (Map.from_vector [[1, ""]])
@ -517,5 +526,7 @@ spec setup =
map = Column_Name_Mapping.By_Position ["A", "B", "C", "D", "E", "F"]
action = table.rename_columns map on_problems=_
tester = expect_column_names ["A", "B", "C", "D"]
-problems = [Too_Many_Column_Names_Provided.Error ["E", "F"]]
-Problems.test_problem_handling action problems tester
+err_checker err =
+err.catch.should_be_a Too_Many_Column_Names_Provided.Error
+err.catch.column_names.should_contain_the_same_elements_as ["E", "F"]
+Problems.test_advanced_problem_handling action err_checker (x-> x) tester


@@ -1,38 +1,191 @@
from Standard.Base import all
import Standard.Base.Error.No_Such_Key.No_Such_Key
+import Standard.Base.Data.Time.Date_Time.Date_Time
+from Standard.Base.Data.Map import Map
from Standard.Test import Test, Test_Suite
import Standard.Test.Extensions
-spec = Test.group "Maps" <|
-m = Map.empty . insert 1 2 . insert 2 4
-expected = Map.empty . insert "1" 4 . insert "2" 8
-m.transform (k -> v -> [k.to_text, v*2]) . should_equal expected
polyglot java import java.nio.file.Path as JavaPath
polyglot java import java.util.Map as JavaMap
foreign js js_str str = """
return new String(str)
foreign js js_empty_dict = """
return new Map()
foreign python py_empty_dict = """
return {}
foreign js js_dict_from_vec vec = """
dict = new Map()
for (let i = 0; i < vec.length; i += 2) {
dict.set(vec[i], vec[i+1])
}
return dict
foreign python py_dict_from_vec vec = """
d = {}
for i in range(0, len(vec), 2):
d[vec[i]] = vec[i + 1]
return d
foreign python py_dict_from_map map = """
d = dict()
for key in map.__iter__():
d[key] = map[key]
return d
foreign python py_vec_from_map map = """
vec = []
for key in map.__iter__():
value = map[key]
vec.append([key, value])
return vec
# Should throw error - updating immutable map from Enso
foreign python py_update_dict map key val = """
map[key] = val
foreign python py_wrapper obj = """
class MyClass:
def __init__(self, obj):
self.data = obj
return MyClass(obj)
pending_python_missing = if Polyglot.is_language_installed "python" then Nothing else """
Can't run Python tests, Python is not installed.
type Child
Value data
type Parent
Value child
type GrandParent
Value parent
spec =
Test.group "Enso maps" <|
Test.specify "should allow checking for emptiness" <|
empty_map = Map.empty
non_empty = Map.empty . insert "foo" 1234
empty_map.is_empty . should_be_true
non_empty.is_empty . should_be_false
Test.specify "should compare two hash maps" <|
(Map.singleton "a" 1).should_equal (Map.singleton "a" 1)
(Map.singleton "b" 2).should_not_equal (Map.singleton "a" 1)
Map.empty.should_equal Map.empty
Map.empty.should_not_equal (Map.singleton "a" 1)
(Map.empty.insert "a" 1 . insert "b" 2).should_equal (Map.empty.insert "b" 2 . insert "a" 1)
Test.specify "should allow checking for non emptiness" <|
empty_map = Map.empty
non_empty = Map.empty . insert "foo" 1234
empty_map.not_empty . should_be_false
non_empty.not_empty . should_be_true
Test.specify "should allow checking its size" <|
empty_map = Map.empty
non_empty = Map.singleton "a" "b" . insert "x" "y"
empty_map.size . should_equal 0
non_empty.size . should_equal 2
Test.specify "should support arbitrary atoms as keys" <|
map = Map.singleton (Pair.new "one" "two") 42
(map.get (Pair.new "one" "two")).should_equal 42
(map.get (Pair.new "A" "B")).should_equal Nothing
(map.get (Pair.new "two" "two")).should_equal Nothing
Test.specify "should use proper hash code for keys" <|
single_key_map key = Map.singleton key 42
grand_parent_1 = GrandParent.Value (Parent.Value (Child.Value 2))
grand_parent_2 = GrandParent.Value (Parent.Value (Child.Value 2.0))
(single_key_map 2 . at 2.0) . should_equal 42
(single_key_map -2 . at -2.0) . should_equal 42
(single_key_map 'ś' . at 's\u0301') . should_equal 42
(single_key_map 's\u0301' . at 'ś') . should_equal 42
(single_key_map 'éabc' . at 'e\u0301abc') . should_equal 42
(single_key_map 'e\u0301abc' . at 'éabc') . should_equal 42
(single_key_map grand_parent_1 . at grand_parent_2) . should_equal 42
(single_key_map (Json.parse '{"a": 1}') . at (Json.parse '{"a": 1}')) . should_equal 42
(single_key_map (Child.Value 1) . at (Child.Value 1.0)) . should_equal 42
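The Unicode cases above pin down the hash/equality contract: keys that compare equal after normalization must also produce equal hash codes, or lookups with the other spelling would miss the entry. A rough Python analogy (the `NormKey` wrapper is a hypothetical illustration, not an Enso or Python API):

```python
# Python sketch of the contract the tests above exercise: NFC and NFD
# spellings of the same text act as one dictionary key.
import unicodedata

class NormKey:
    def __init__(self, text):
        # Normalize so 'ś' and 's\u0301' compare (and hash) equal.
        self.norm = unicodedata.normalize("NFC", text)

    def __eq__(self, other):
        return isinstance(other, NormKey) and self.norm == other.norm

    def __hash__(self):
        # Equal keys must hash equal, so hash the normalized form.
        return hash(self.norm)

m = {NormKey('ś'): 42}
assert m[NormKey('s\u0301')] == 42   # decomposed spelling finds the entry
```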
Test.specify "should support vectors as keys" <|
map = Map.singleton [1, "a", 2] "Value"
map.size.should_equal 1
map.get [1, "a", 2] . should_equal "Value"
Test.specify "should support dates as keys" <|
map = Map.empty.insert (Date.new 1993) 1 . insert (Date.new 1993 2 5) 2 . insert (Date_Time.new 1993 2 5 13 45) 3
map.size.should_equal 3
map.get (Date.new 1993 6 7) . should_equal Nothing
map.get (Date.new 1993) . should_equal 1
map.get (Date_Time.new 1993) . should_equal Nothing
map.get (Date.new 1993 2 5) . should_equal 2
map.get (Date_Time.new 1993 2 5) . should_equal Nothing
map.get (Date_Time.new 1993 2 5 13 45) . should_equal 3
Test.specify "should support another hash map as key" <|
key_map = Map.singleton (Pair.new "one" "two") 42
map = Map.singleton key_map 23
map.size.should_equal 1
(map.get "A").should_equal Nothing
(map.get key_map).should_equal 23
(map.get map).should_equal Nothing
Test.specify "should handle keys with standard equality semantics" <|
map = Map.singleton 2 "Hello"
(map.get 2).should_equal "Hello"
(map.get 2.0).should_equal "Hello"
(Map.singleton 2 "Hello").should_equal (Map.singleton 2.0 "Hello")
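The numeric case works the same way: since `2 == 2.0`, both values must land on the same entry, which is only safe when they hash identically. Python's built-in numbers follow the same contract and make a convenient comparison:

```python
# Equal numbers of different types must be interchangeable as keys,
# so their hash codes must agree as well.
m = {2: "Hello"}
assert m[2.0] == "Hello"          # float key finds the int entry
assert hash(2) == hash(2.0)
m[2.0] = "World"                  # overwrites, does not add a second entry
assert len(m) == 1 and m[2] == "World"
```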
Test.specify "should handle Nothing as keys" <|
Map.singleton Nothing 3 . get Nothing . should_equal 3
Map.singleton Nothing 1 . insert Nothing 2 . get Nothing . should_equal 2
Test.specify "should handle Nothing as values" <|
Map.singleton 1 Nothing . at 1 . should_equal Nothing
Map.singleton Nothing Nothing . at Nothing . should_equal Nothing
Test.specify "should support rewriting values with same keys" <|
map = Map.empty.insert "a" 1 . insert "a" 42
map.size.should_equal 1
map.get "a" . should_equal 42
Test.specify "should allow storing atoms as values" <|
json = Json.parse '{"a": 1}'
pair = Pair.new "first" "second"
map = Map.empty.insert 0 json . insert 1 pair
map.get 0 . should_equal json
map.get 1 . should_equal pair
Test.specify "should not drop warnings from keys" <|
key = Warning.attach "my_warn" "my_key"
map = Map.singleton key 42
(Warning.get_all (map.keys.at 0)).length . should_equal 1
Test.specify "should not drop warnings from values" <|
val = Warning.attach "my_warn" "my_val"
map = Map.singleton 42 val
(Warning.get_all (map.values.at 0)).length . should_equal 1
Test.specify "should convert the whole map to a vector" <|
m = Map.empty . insert 0 0 . insert 3 -5 . insert 1 2
m.to_vector.should_equal [[0, 0], [3, -5], [1, 2]]
Test.specify "should allow building the map from a vector" <|
expected = Map.empty . insert 0 0 . insert 3 -5 . insert 1 2
vec = [[0, 0], [3, -5], [1, 2]]
Map.from_vector vec . should_equal expected
Test.specify "should define a well-defined text conversion" <|
m = Map.empty . insert 0 0 . insert 3 -5 . insert 1 2
m.to_text . should_equal "{0=0, 3=-5, 1=2}"
Test.specify "should define structural equality" <|
map_1 = Map.empty . insert "1" 2 . insert "2" "1"
map_2 = Map.empty . insert "1" 2 . insert "2" "1"
@@ -40,89 +193,284 @@ spec = Test.group "Maps" <|
map_1==map_2 . should_be_true
map_1==map_3 . should_be_false
map_2==map_3 . should_be_false
Test.specify "should allow inserting and looking up values" <|
m = Map.empty . insert "foo" 134 . insert "bar" 654 . insert "baz" "spam"
m.at "foo" . should_equal 134
m.at "bar" . should_equal 654
m.at "baz" . should_equal "spam"
(m.at "nope").should_fail_with No_Such_Key.Error
Test.specify "should support get" <|
m = Map.empty . insert 2 3
m.get 2 0 . should_equal 3
m.get 1 10 . should_equal 10
m.get 2 (Panic.throw "missing") . should_equal 3
Test.specify "should support contains_key" <|
m = Map.empty . insert 2 3
m.contains_key 2 . should_be_true
m.contains_key 1 . should_be_false
Test.specify "should allow transforming the map" <|
m = Map.empty . insert 1 2 . insert 2 4
expected = Map.empty . insert "1" 4 . insert "2" 8
m.transform (k -> v -> [k.to_text, v*2]) . should_equal expected
Test.specify "should allow mapping over values" <|
m = Map.empty . insert 1 2 . insert 2 4
expected = Map.empty . insert 1 4 . insert 2 8
m.map (v -> v*2) . should_equal expected
Test.specify "should allow mapping over keys" <|
m = Map.empty . insert 1 2 . insert 2 4
expected = Map.empty . insert 2 2 . insert 4 4
m.map_keys (k -> k*2) . should_equal expected
Test.specify "should allow mapping with keys" <|
m = Map.empty . insert 1 2 . insert 2 4
expected = Map.empty . insert 1 3 . insert 2 6
m.map_with_key (k -> v -> k + v) . should_equal expected
Test.specify "should allow iterating over each value" <|
m = Map.empty . insert 1 2 . insert 2 4
vec = Vector.new_builder
expected_vec = [2, 4]
m.each (v -> vec.append v)
vec.to_vector . should_equal expected_vec
Test.specify "should allow iterating over each key-value pair" <|
m = Map.empty . insert 1 2 . insert 2 4
vec = Vector.new_builder
expected_vec = [3, 6]
m.each_with_key (k -> v -> vec.append (k+v))
vec.to_vector . should_equal expected_vec
Test.specify "should allow folding over the values" <|
m = Map.empty . insert 1 2 . insert 2 4
m.fold 0 (+) . should_equal 6
Test.specify "should allow folding over the key-value pairs" <|
m = Map.empty . insert 1 2 . insert 2 4
m.fold_with_key 0 (l -> k -> v -> l + k + v) . should_equal 9
Test.specify "should allow getting a vector of the keys" <|
m = Map.empty . insert 1 2 . insert 2 4
m.keys . should_equal [1, 2]
Test.specify "should allow getting a vector of the values" <|
m = Map.empty . insert 1 2 . insert 2 4
m.values . should_equal [2, 4]
Test.specify "should be able to get the first key value pair" <|
m = Map.empty . insert 1 2 . insert 2 4
pair = m.first
pair.first . should_equal 1
pair.second . should_equal 2
Test.specify "should be able to get the first key value pair of an empty map" <|
m = Map.empty
m.first . should_equal Nothing
Test.specify "should be able to get the last key value pair" <|
m = Map.empty . insert 1 2 . insert 2 4
pair = m.last
pair.first . should_equal 2
pair.second . should_equal 4
Test.specify "should be able to get the last key value pair of an empty map" <|
m = Map.empty
m.last . should_equal Nothing
Test.specify "should be able to add a Nothing key to the map" <|
m = Map.empty . insert Nothing 1
m.last . should_equal (Pair.new Nothing 1)
Test.specify "should be able to add a Nothing key to the map of Text" <|
m = Map.empty . insert "A" 2 . insert Nothing 1 . insert "B" 3
m.at "A" . should_equal 2
m.at "B" . should_equal 3
m.at Nothing . should_equal 1
Test.specify "should be able to add a Nothing key to the map of Integer" <|
m = Map.empty . insert 100 2 . insert Nothing 1 . insert 200 3
m.at 100 . should_equal 2
m.at 200 . should_equal 3
m.at Nothing . should_equal 1
Test.specify "should be able to remove entries (1)" <|
m1 = Map.empty.insert "A" 1 . insert "B" 2
m2 = m1.remove "B"
m2.get "A" . should_equal 1
m2.remove "A" . should_equal Map.empty
Test.expect_panic_with (m1.remove "foo") Any
Test.specify "should be able to remove entries (2)" <|
m1 = Map.empty.insert "A" 1
m2 = m1.insert "B" 2
m3 = m1.insert "C" 3
m2.remove "A" . to_vector . should_equal [["B", 2]]
m2.remove "B" . to_vector . should_equal [["A", 1]]
m3.remove "A" . to_vector . should_equal [["C", 3]]
m3.remove "C" . to_vector . should_equal [["A", 1]]
Test.specify "should be able to remove entries (3)" <|
m = Map.empty.insert "A" 1 . insert "B" 2 . insert "C" 3
m.remove "B" . should_equal (Map.singleton "A" 1 . insert "C" 3)
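The `remove` tests also pin down the persistence guarantee: `insert` and `remove` return a new map and leave the receiver untouched. A naive copy-based Python sketch of that semantics (`PMap` is illustrative only; the real structure shares state through a snapshotting builder rather than copying, and keeps insertion order rather than sorting):

```python
# Copy-based sketch of persistent-map semantics.
class PMap:
    def __init__(self, data=None):
        self._data = dict(data or {})

    def insert(self, key, value):
        updated = dict(self._data)
        updated[key] = value
        return PMap(updated)          # receiver is left unchanged

    def remove(self, key):
        if key not in self._data:
            raise KeyError(key)       # mirrors the panic on a missing key
        updated = dict(self._data)
        del updated[key]
        return PMap(updated)

    def to_vector(self):
        return sorted(self._data.items())

m1 = PMap().insert("A", 1).insert("B", 2)
m2 = m1.remove("B")
assert m2.to_vector() == [("A", 1)]
assert m1.to_vector() == [("A", 1), ("B", 2)]   # original is intact
```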
Test.group "Polyglot keys and values" <|
Test.specify "should support polyglot keys" <|
map = Map.singleton (js_str "A") 42
map.size.should_equal 1
map.get "A" . should_equal 42
map.get (js_str "A") . should_equal 42
Test.specify "should support host objects as keys" <|
# JavaPath has proper implementation of hashCode
map = Map.singleton (JavaPath.of "/home/user/file.txt") 42
map.get "X" . should_equal Nothing
map.get "A" . should_equal Nothing
map.get (JavaPath.of "/home/user/file.txt") . should_equal 42
Test.specify "should support Python objects as keys" pending=pending_python_missing <|
py_obj = py_wrapper 42
map = Map.singleton py_obj "Value"
map.get py_obj . should_equal "Value"
Test.specify "should support Python objects as values" pending=pending_python_missing <|
map = Map.singleton "A" (py_wrapper 42)
map.get "A" . data . should_equal 42
Test.specify "should insert entries to a polyglot map" pending=pending_python_missing <|
dict = py_dict_from_vec ["A", 1, "B", 2]
dict.insert "C" 3 . keys . sort . should_equal ["A", "B", "C"]
Test.specify "should remove entries from a polyglot map" pending=pending_python_missing <|
dict = py_dict_from_vec ["A", 1, "B", 2]
dict.remove "B" . to_vector . should_equal [["A", 1]]
Test.group "non-linear inserts" <|
Test.specify "should handle inserts with different keys" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m1.insert "C" 3
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 1], ["C", 3]]
Test.specify "should handle inserts with same keys (1)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "A" 2
m3 = m1.insert "A" 3
m4 = m1.insert "B" 4
m2.to_vector.sort on=_.first . should_equal [["A", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 3]]
m4.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 4]]
Test.specify "should handle inserts with same keys (2)" <|
m1 = Map.singleton "foo" 1
m2 = m1.insert "baz" 2
m3 = m2.insert "foo" 3
m1.to_vector.sort on=_.first . should_equal [['foo', 1]]
m2.to_vector.sort on=_.first . should_equal [['baz', 2], ['foo', 1]]
m3.to_vector.sort on=_.first . should_equal [['baz', 2], ['foo', 3]]
Test.specify "should handle inserts with same keys (3)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "A" 3
m4 = m2.insert "C" 4
m1.to_vector.sort on=_.first . should_equal [["A", 1]]
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 3], ["B", 2]]
m4.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 4]]
Test.specify "should handle inserts with same keys (4)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "C" 3
m4 = m2.insert "D" 4
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 3]]
m4.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["D", 4]]
Test.specify "should handle inserts with same keys (5)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "A" 3
m4 = m2.insert "A" 4
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 3], ["B", 2]]
m4.to_vector.sort on=_.first . should_equal [["A", 4], ["B", 2]]
Test.specify "should handle inserts with same keys (6)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "C" 3
m4 = m2.insert "A" 4
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 3]]
m4.to_vector.sort on=_.first . should_equal [["A", 4], ["B", 2]]
Test.specify "should handle inserts with same keys (7)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "C" 3
m4 = m3.insert "D" 4
m5 = m2.insert "A" 5
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 3]]
m4.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 3], ["D", 4]]
m5.to_vector.sort on=_.first . should_equal [["A", 5], ["B", 2]]
Test.specify "should handle inserts with same keys (8)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "C" 3
m4 = m3.insert "A" 4
m5 = m2.insert "A" 5
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 3]]
m4.to_vector.sort on=_.first . should_equal [["A", 4], ["B", 2], ["C", 3]]
m5.to_vector.sort on=_.first . should_equal [["A", 5], ["B", 2]]
Test.specify "should handle inserts with same keys (9)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "A" 3
m4 = m2.insert "B" 4
m5 = m2.insert "C" 5
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 3], ["B", 2]]
m4.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 4]]
m5.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 5]]
Test.specify "should handle inserts with same keys (10)" <|
m1 = Map.singleton "A" 1
m2 = m1.insert "B" 2
m3 = m2.insert "C" 3
m4 = m2.insert "D" 4
m5 = m2.insert "E" 5
m2.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2]]
m3.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["C", 3]]
m4.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["D", 4]]
m5.to_vector.sort on=_.first . should_equal [["A", 1], ["B", 2], ["E", 5]]
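The branching pattern in this group is what makes the builder snapshots necessary: several maps can be derived from the same parent, and each branch must see only its own insert. A minimal Python emulation via dict copies (the `insert` helper is hypothetical, standing in for the structure-sharing builder):

```python
# Hypothetical helper emulating a persistent insert by copying.
def insert(m, key, value):
    out = dict(m)
    out[key] = value
    return out

m1 = insert({}, "A", 1)
m2 = insert(m1, "B", 2)    # branch 1
m3 = insert(m1, "C", 3)    # branch 2, derived from the same parent
assert sorted(m2.items()) == [("A", 1), ("B", 2)]
assert sorted(m3.items()) == [("A", 1), ("C", 3)]
assert sorted(m1.items()) == [("A", 1)]   # parent sees neither insert
```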
Test.group "Polyglot hash maps" <|
Test.specify "should pass maps as immutable maps to other langs" pending=pending_python_missing <|
map = Map.singleton "A" 1
# Python's KeyError should be raised
Test.expect_panic_with (py_update_dict map "A" 2) Any
map.get "A" . should_equal 1
Test.specify "should treat JavaScript maps as Enso maps" <|
js_dict = js_dict_from_vec ["A", 1, "B", 2]
map = js_dict.insert "C" 3
js_dict.to_vector.should_equal [["A", 1], ["B", 2]]
map.to_vector.should_equal [["A", 1], ["B", 2], ["C", 3]]
Test.specify "should treat Java Map as Enso map" <|
sort_by_keys vec = vec.sort by=x-> y-> x.first.compare_to y.first
jmap = JavaMap.of "A" 1 "B" 2
(sort_by_keys jmap.to_vector) . should_equal [["A", 1], ["B", 2]]
(sort_by_keys (jmap.insert "C" 3 . to_vector)) . should_equal [["A", 1], ["B", 2], ["C", 3]]
Test.specify "should treat Python dicts as Enso maps" pending=pending_python_missing <|
py_dict = py_dict_from_vec ["A", 1, "B", 2]
map = py_dict.insert "C" 3
py_dict.not_empty . should_be_true
py_dict.to_vector.should_equal [["A", 1], ["B", 2]]
map.to_vector.should_equal [["A", 1], ["B", 2], ["C", 3]]
py_empty_dict.is_empty.should_be_true
py_empty_dict.insert "A" 1 . insert "A" 2 . get "A" . should_equal 2
Test.specify "should pass maps with null keys to Python and back" pending=pending_python_missing <|
# Python supports None as keys, Enso supports Nothing as keys
py_dict = py_dict_from_map (Map.singleton Nothing 42)
py_dict.get Nothing . should_equal 42
py_dict.insert "A" 23 . get Nothing . should_equal 42
py_dict.insert Nothing 23 . get Nothing . should_equal 23
Test.specify "should treat Enso maps as Python dicts when passed to Python" pending=pending_python_missing <|
map1 = Map.empty.insert "A" 1 . insert "B" 2
py_vec_from_map map1 . should_equal [["A", 1], ["B", 2]]
map2 = Map.empty.insert "A" 1 . insert Nothing 2
py_vec_from_map map2 . should_equal [["A", 1], [Nothing, 2]]
main = Test_Suite.run_main spec


@@ -641,11 +641,6 @@ spec = Test.group "Vectors" <|
[1, 1.0, 2, 2.0].distinct . should_equal [1, 2]
[].distinct . should_equal []
Test.specify "should throw a clean error for incomparable types" <|
["a", 2].distinct . should_fail_with Incomparable_Values
[2, "a", Integer, "a", 2].distinct . should_fail_with Incomparable_Values
[Pair.new 1 2, Pair.new 3 4].distinct . should_fail_with Incomparable_Values
Test.specify "should correctly handle distinct with custom types like Atoms that implement compare_to" <|
[T.Value 1 2, T.Value 3 3, T.Value 1 2].distinct . should_equal [T.Value 1 2, T.Value 3 3]


@@ -3,7 +3,7 @@ from Standard.Base import all
from Standard.Test import Test, Test_Suite
import Standard.Test.Extensions
polyglot java import java.nio.file.Path as JavaPath
polyglot java import java.util.Random as Java_Random
type CustomEqType
@@ -113,6 +113,11 @@ spec =
(js_true == False).should_be_false
(js_text_foo == "foo").should_be_true
Test.specify "should handle Text via NFD normalization" <|
('ś' == 's\u0301') . should_be_true
('e\u0301abc' == 'éabc') . should_be_true
('e\u0301abc' == 'é') . should_be_false
((Point.Value 'ś' 23.0) == (Point.Value 's\u0301' 23)) . should_be_true
Test.specify "should dispatch to overriden `==` on atoms" <|
child1 = Child.Value 11
@@ -136,24 +141,14 @@ spec =
((CustomEqType.C1 0) == (CustomEqType.C2 7 3)).should_be_false
Test.specify "should dispatch to equals on host values" <|
path1 = JavaPath.of "home" "user" . resolve "file.txt"
path2 = JavaPath.of "home" "user" "file.txt"
(path1 == path2).should_be_true
path3 = path1.resolve "subfile.txt"
(path3 == path2).should_be_false
Test.specify "should return False for different Atoms with same fields" <|
rect = Rect.Value (Point.Value 1 2) (Point.Value 3 4)
four_field = FourFieldType.Value 1 2 3 4
(rect == four_field).should_be_false
@@ -161,8 +156,8 @@ spec =
(Child == Child).should_be_true
(Child == Point).should_be_false
(Point == Child).should_be_false
(JavaPath == Child).should_be_false
(Child == JavaPath).should_be_false
(Boolean == Any).should_be_false
(Any == Boolean).should_be_false
(Any == Any).should_be_true


@@ -110,7 +110,7 @@ spec =
json.field_names.should_equal ['data','axis']
data = json.get 'data'
data.length . should_equal 10
(data.take (First 3)).to_text . should_equal '[{"x":0,"y":225}, {"x":29,"y":196}, {"x":15,"y":0}]'
Test.specify "filter the elements" <|
vector = [0,10,20,30]