* Mitigate LS DDoS scenario on startup
For a medium-size project with a lot of visualizations, the Language Server
gets flooded with visualization requests on startup. Due to an
existing limit in the job execution engine, some of those requests
might have been silently dropped.
This change lifts that limit until a better fix can be devised.
Additionally, a slow startup might lead to a timeout when serving the open
file request. This change adds retries as a fallback mechanism and for
progress monitoring.
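The retry fallback boils down to re-issuing the request a few times with an increasing delay. A minimal sketch of the idea, assuming the Akka scheduler the Language Server already has at hand (hypothetical helper, not the actual code):
```
import scala.concurrent.{ExecutionContext, Future}
import scala.concurrent.duration._
import akka.actor.Scheduler
import akka.pattern.after

object Retry {
  /** Retries `attempt` with exponential backoff, giving the engine time to finish starting up. */
  def withRetries[A](retries: Int, delay: FiniteDuration)(attempt: () => Future[A])(
    implicit scheduler: Scheduler,
    ec: ExecutionContext
  ): Future[A] =
    attempt().recoverWith {
      case _ if retries > 0 =>
        after(delay, scheduler)(withRetries(retries - 1, delay * 2)(attempt))
    }
}
```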
* add runtime-fat-jar to the list of aggregates
Addresses one of two concerns of #5298 - adds support for the `--jvm` argument to allow switching from the _native image_-built Enso binary (as developed by #10126) to regular JVM-based Enso execution. This change _doesn't affect production builds_. The _native executable_ continues to be built only by `engine-runner/buildNativeImage`, which is tested on CI, but not in the production jobs.
* Eliminating circe-yaml
This change adds our very own YAML parser on top of SnakeYAML, replacing the
Circe parser that was previously layered on top of SnakeYAML. The advantage?
In some not-so-distant future we might actually get rid of circe and the
related performance issues.
The logic is similar to what circe does, i.e. analyzing SnakeYAML's node tree
to build our own structure.
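The decoding is roughly in this spirit, assuming a hypothetical `YamlDecoder` type class (names and shapes are illustrative, not the exact API introduced here): decoders pattern-match on SnakeYAML's composed `Node` tree instead of going through circe's JSON AST.
```
import java.io.StringReader
import org.yaml.snakeyaml.Yaml
import org.yaml.snakeyaml.nodes.{MappingNode, Node, ScalarNode}
import scala.jdk.CollectionConverters._

trait YamlDecoder[T] {
  def decode(node: Node): Either[Throwable, T]
}

object YamlDecoder {
  implicit val stringDecoder: YamlDecoder[String] = new YamlDecoder[String] {
    def decode(node: Node): Either[Throwable, String] = node match {
      case s: ScalarNode => Right(s.getValue)
      case other         => Left(new Exception(s"Expected a scalar, got $other"))
    }
  }

  implicit def mapDecoder[V: YamlDecoder]: YamlDecoder[Map[String, V]] =
    new YamlDecoder[Map[String, V]] {
      def decode(node: Node): Either[Throwable, Map[String, V]] = node match {
        case m: MappingNode =>
          m.getValue.asScala.foldLeft[Either[Throwable, Map[String, V]]](Right(Map.empty)) {
            (acc, tuple) =>
              for {
                map   <- acc
                key   <- stringDecoder.decode(tuple.getKeyNode)
                value <- implicitly[YamlDecoder[V]].decode(tuple.getValueNode)
              } yield map + (key -> value)
          }
        case other => Left(new Exception(s"Expected a mapping, got $other"))
      }
    }

  def parse[T: YamlDecoder](yamlText: String): Either[Throwable, T] =
    implicitly[YamlDecoder[T]].decode(new Yaml().compose(new StringReader(yamlText)))
}
```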
This change is not complete, as there are still some tests failing, but
most common Configs are already parseable.
We _could_ auto-generate some of the code, but some of the logic would still
have to be tweaked by hand; the current logic has a number of
special cases, as I found out the hard way.
* wip: more tests passing
* Fix remaining tests in ConfigSpec
* Fixing YAML decoder for editions
Dropping circe as a decoder for Editions revealed some problems. It turns
out the current implementation had even more special cases to deal with.
* nit
* Allow for empty exports
* Mostly complete encoding part
Replaced almost all `toYAML` locations with their SnakeYAML equivalents.
The encoding has to use Java collections, for which there is built-in
support. If we were to use Scala collections, we would have to
deal with tagging, at the very least.
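To illustrate the Java-collections point, SnakeYAML can dump a structure built from plain Java collections without any custom representers or tags (illustrative sketch only, not the actual encoder):
```
import org.yaml.snakeyaml.{DumperOptions, Yaml}

object YamlEncodingSketch {
  /** Encodes a tiny config-like structure by handing SnakeYAML plain Java collections. */
  def encode(name: String, dependencies: Seq[String]): String = {
    val root = new java.util.LinkedHashMap[String, Object]()
    root.put("name", name)
    val deps = new java.util.ArrayList[String]()
    dependencies.foreach(d => deps.add(d))
    root.put("dependencies", deps)

    val options = new DumperOptions()
    options.setDefaultFlowStyle(DumperOptions.FlowStyle.BLOCK)
    // yields e.g. "name: my-project\ndependencies:\n- Standard.Base\n"
    new Yaml(options).dump(root)
  }
}
```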
* Remove the last remaining Circe's YAML parser
* Bug fix + further loop optimization
* removal of some dependencies
* Remove circe-yaml
Added a custom SnakeYAML Node updater to mimic the JSON -> YAML -> JSON
conversion needed for updating fields. The algorithm recursively follows
the key path and inserts the desired Node. This is deliberately not
performance-oriented code.
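The updater works roughly in this spirit (a simplified, hypothetical helper; the real code handles more cases, such as creating missing intermediate mappings): it walks the key path through `MappingNode`s and replaces, or appends, the value node at the end of the path.
```
import org.yaml.snakeyaml.DumperOptions.ScalarStyle
import org.yaml.snakeyaml.nodes.{MappingNode, Node, NodeTuple, ScalarNode, Tag}
import scala.jdk.CollectionConverters._

object NodeUpdaterSketch {
  private def scalar(value: String): ScalarNode =
    new ScalarNode(Tag.STR, value, null, null, ScalarStyle.PLAIN)

  /** Recursively follows `keyPath` and inserts `newValue` at its end. Deliberately not optimized. */
  def update(node: MappingNode, keyPath: List[String], newValue: Node): MappingNode =
    keyPath match {
      case Nil => node
      case key :: rest =>
        val tuples = node.getValue.asScala.toList
        val keyExists = tuples.exists(_.getKeyNode match {
          case k: ScalarNode => k.getValue == key
          case _             => false
        })
        val updated =
          if (keyExists)
            tuples.map { t =>
              t.getKeyNode match {
                case k: ScalarNode if k.getValue == key =>
                  val replacement =
                    if (rest.isEmpty) newValue
                    // assumes intermediate path segments are mappings
                    else update(t.getValueNode.asInstanceOf[MappingNode], rest, newValue)
                  new NodeTuple(t.getKeyNode, replacement)
                case _ => t
              }
            }
          else
            // the key is missing: append it (nested paths would need freshly created mappings)
            tuples :+ new NodeTuple(scalar(key), newValue)
        new MappingNode(node.getTag, updated.asJava, node.getFlowStyle)
    }
}
```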
* Fix compilation issues
`circe-core` was marked as `provided`, but ultimately nothing included it
in the final jar, hence the `ClassNotFoundException`.
* fix licensing
* Removing obsolete circe definitions
* fmt
* nits
* s/SnakeYamlDecoder/YamlDecoder
* fmt
* Partial revert, PM needs JSON decoders/encoders
* style
* incremental compilation gone wrong
Ultimately, we want to forbid the `from ... export all` syntax. This PR starts by providing a way to explicitly export extension and conversion methods by name.
Stdlib code will be modified in an upcoming PR.
# Important Notes
A single name can refer to multiple extension or conversion methods. Exports are not qualified. For example,
```
type My_Type
type Other_Type
My_Type.ext_method x = x
Other_Type.ext_method x = x
```
```
from project.Mod export ext_method
```
will export both `My_Type.ext_method` and `Other_Type.ext_method`.
Closes #10240
Changelog:
- add: `desktop-environment` Java module to detect user environment configuration
- add: `ProjectsMigration` module containing the migration logic of the enso projects directory
- update: updated and cleaned up unused settings from the storage config
- add: `desktopEnvironment` TS module to detect user environment configuration in the `project-manager-shim`
- update: `project-manager-shim` with the new user projects directory
The current implementation contains logic that should enable us to make some backward-compatible config changes.
At the same time, that logic is tightly integrated with circe's JSON library, which we eventually want to get rid of.
Rather than keeping it around and maintaining it via hacks, this PR ditches that logic completely, as we currently have no use-case for such scenarios.
As a result, classes modelling YAML configs no longer have the extra fields, and there is a 1:1 correspondence between the config files and the classes.
Performance has also improved, although that wasn't the main objective yet. A follow-up PR will attempt to replace `circe-yaml` with `snakeyaml` directly.
In preparation for #9113. Note that the dependency upgrade is necessary because it brings the latest available `snakeyaml` (as part of `circe-yaml`).
Outline view and completions for Enso code in VSCode.
# Important Notes
This PR provides the necessary infrastructure for building a VSCode extension that includes the `enso_parser` library compiled for all supported platforms.
The VSCode extension can now use libraries from `sbt` that are `publishM2`-ready. To make that possible, documentation had to be provided and fixed for those modules, hence the many changes in `.scala` classes.
<img width="862" alt="image" src="https://github.com/enso-org/enso/assets/26887752/7374bf41-bdc6-4322-b562-85a2e761de2a">
Last but not least, the outline view and completions display something.
Reducing the number of dependencies. Explicit `cats` dependencies are almost gone (still present in `cli`). `enumeration` is completely gone. `cats` is also still included implicitly via `io.circe`, but that's a different kind of beast.
Also, `jackson` is now really removed from the dependencies by fixing the dependency on `http-test-helper`.
# Important Notes
In a number of places, importing all cats implicits could simply be replaced with one or two method calls. Not to mention that this will reduce compilation times due to a smaller implicit search space.
One example of how the changes affect performance (not only startup):
Before:
![Screenshot from 2024-06-11 12-05-24](https://github.com/enso-org/enso/assets/292128/a1a772a9-635d-4a16-a543-e2fd2124a22c)
Now:
![Screenshot from 2024-06-11 14-27-47](https://github.com/enso-org/enso/assets/292128/b17c7fcc-9a6d-48b9-8200-60708354ee03)
(frequently executed)
![Screenshot from 2024-06-12 12-46-34](https://github.com/enso-org/enso/assets/292128/31bc4dfd-4edc-45c9-9c5d-13e3472089b9)
Also appears to be gone.
This PR is by no means finished. The purge will continue in follow-up PRs.
- Follow-up to #9361
- Enables assertions and fixes `count` check
- Tests and fixes null references
- Tests and fixes serializing a deserialized structure, by saving the id of the `Persistance` corresponding to the entry
- After the change to how we determine which `Persistance` instance to use, the most specific one is now used (based on the saved id). This has the unfortunate consequence that `Seq`, which is most of the time represented by a subtype of `List`, now uses `PersistScalaList`, which is not lazy.
  - To alleviate that, we no longer use `Seq` to store such fields lazily and instead use a dedicated type for that purpose: `InlineReference` (see the sketch after this list).
- Remove remnants of deprecated Scala parser
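A minimal sketch of the lazy-reference idea mentioned above (the names are illustrative; the actual `InlineReference` lives in the Persistance framework and is integrated with its readers):
```
/** Wraps a deserialization thunk so the referenced value is only materialized on first access. */
final class LazyReference[A](deserialize: () => A) {
  lazy val get: A = deserialize()
}

object LazyReferenceExample {
  def main(args: Array[String]): Unit = {
    val ref = new LazyReference(() => { println("deserializing"); 42 })
    // nothing has been deserialized yet
    println(ref.get) // prints "deserializing", then 42
    println(ref.get) // prints 42 only; the thunk ran once
  }
}
```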
- The following projects are now JPMS modules provided on system module-path (in components directory):
- `ydoc-server`
- `profiling-utils`
- `syntax-rust-definition`
- The contents of the aforementioned modules are excluded from both `runner.jar` and `runtime.jar` fat jars.
- Suggestions are serialized and deserialized with our Persistance framework, rather than via the default Java `ObjectOutputStream`.
Add support for private methods. Most of the changes are in the parser and compiler. The runtime checking of private functions has already been present since #9692
# Important Notes
- Only top-level methods can be declared `private`.
- A private method cannot be called from a different project
- A private method cannot be accessed from polyglot code (a private method does not exist for polyglot code)
* Ensure all relevant sbt projects have annotationProcSetting
* Remove unused `syntax-rust-definition/javaModuleName` key.
* Add annotationProcSetting to more projects
Introduce a new `test-utils` project and move `TestBase` there. Moreover, `TestBase` is renamed to `TestUtils` and is no longer an abstract class.
# Important Notes
The `test-utils` project does not depend on JUnit, so it can also be used, for example, by any benchmarks.
- related #7954
Changelog:
- update: Ydoc starts with the language server on `localhost:1234` by default. The hostname and port can be configured with the `LANGUAGE_SERVER_YDOC_HOSTNAME` and `LANGUAGE_SERVER_YDOC_PORT` environment variables
- update: by default `npm run dev` uses the Node.js Ydoc server. You can control it with the `POLYGLOT_YDOC_SERVER` env variable. For example,
```
env POLYGLOT_YDOC_SERVER='true' npm --workspace=enso-gui2 run dev
```
to connect to the Ydoc server running on port 1234 (the one started with the language server), or
```
env POLYGLOT_YDOC_SERVER='ws://127.0.0.1:1235' npm --workspace=enso-gui2 run dev
```
to connect to the provided URL, which can be useful for debugging when you start a separate Ydoc process.
- update: run `npm install` before the engine build. It is required to create the Ydoc JS bundle.
Setting the execution environment to the existing one should have no effect.
Should (positively) affect startup in #9789.
# Important Notes
Cancelling jobs and triggering a fresh execute job is expensive and unnecessary, especially on startup, when the result should be the same as before.
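A self-contained sketch of the guard this boils down to (hypothetical names, not the actual runtime code):
```
object ExecutionEnvironmentGuard {
  final case class ExecutionEnvironment(name: String)

  final class ExecutionContext(initial: ExecutionEnvironment) {
    private var env: ExecutionEnvironment = initial

    /** Re-executes the program only when the environment actually changes. */
    def setEnvironment(requested: ExecutionEnvironment)(reExecute: () => Unit): Unit =
      if (env != requested) {
        env = requested
        reExecute()
      }
    // setting the same environment is a no-op: no jobs are cancelled and no
    // fresh execute job is scheduled
  }
}
```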
part of #7954
# Important Notes
The workflow is:
- `$ npm install` -- just in case
- `$ npm --workspace=enso-gui2 run build-ydoc-server-polyglot` -- build the `ydocServer.js` bundle
- `$ sbt ydoc-server/assembly` -- build the ydoc server jar
- `env POLYGLOT_YDOC_SERVER=true npm --workspace=enso-gui2 run dev` -- run the dev server with the polyglot Ydoc server. Providing the `POLYGLOT_YDOC_SERVER_DEBUG=true` env variable enables the Chrome debugger
To work around problems with `glibc`, we need to build the parser used by the backend with the C stdlib statically linked using musl. Uses the arguments presented in #9521.
# Important Notes
No problems in tests so far.
Marking the `shapeless` dependency as compile-time only.
# Important Notes
If we want to ditch the dependency completely while keeping the check in place, we would have to replace it with its runtime equivalent:
```
--- a/engine/runtime-compiler/src/main/scala/org/enso/compiler/pass/IRPass.scala
+++ b/engine/runtime-compiler/src/main/scala/org/enso/compiler/pass/IRPass.scala
@@ -148,6 +148,9 @@ object IRPass {
def unsafeAs[T <: Metadata: ClassTag]: T = {
+ if (implicitly[ClassTag[T]].runtimeClass == Metadata.getClass) {
+ throw new InternalError("Type argument must be specified")
+ }
```
Not sure if we still want to do that after this change.
Note that `project-manager` uses that and some other definitions as well.
As reported by our users, when using the AWS SSO, our code was failing with:
```
Execution finished with an error: To use Sso related properties in the 'xyz' profile, the 'sso' service module must be on the class path.
```
This PR adds the missing JARs to fix that.
Additionally, it improves the license review tool UX a bit (parts of #9122):
- sorting the report by amount of problems, so that dependencies with unresolved problems appear at the top,
- semi-automatic helper button to rename package configurations after a version bump,
- button to remove stale entries from config (files or copyrights that disappeared after update),
- button to add custom copyright notice text straight from the report UI,
- button to set a file as the license for the project (creating the `custom-license` file automatically)
- ability to filter processed projects - e.g. `openLegalReviewReport AWS` will only run on the AWS subproject - saving time processing unchanged dependencies,
- updated the license search heuristic, fixing a problem with duplicates:
- if we had dependencies `netty-http` and `netty-http2`, because of a prefix-check logic, the notices for `netty-http` would also appear again for `netty-http2`, which is not valid. I have improved the heuristic to avoid these false positives and removed them from the current report.
- WIP: button to mark a license type as reviewed (not finished in this PR).
This change replaces the SQLite-backed suggestions repo with a simple, in-memory one.
As the `completion` functionality has been implemented entirely in the GUI, there is no need to support it in the backend, which simplifies a lot of functionality.
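Conceptually, the new repo is little more than a concurrent map keyed by suggestion ids; a minimal sketch under that assumption (illustrative shape, not the actual class):
```
import java.util.concurrent.ConcurrentHashMap
import java.util.concurrent.atomic.AtomicLong
import scala.jdk.CollectionConverters._

final class InMemorySuggestionsRepoSketch[Suggestion] {
  private val ids     = new AtomicLong(0)
  private val entries = new ConcurrentHashMap[Long, Suggestion]()

  /** Inserts a suggestion and returns its freshly assigned id. */
  def insert(suggestion: Suggestion): Long = {
    val id = ids.incrementAndGet()
    entries.put(id, suggestion)
    id
  }

  def remove(id: Long): Unit = entries.remove(id)

  /** Snapshot of all suggestions, e.g. for sending to the GUI on startup. */
  def getAll: Map[Long, Suggestion] = entries.asScala.toMap
}
```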
Closes #9650 and #9471.
# Important Notes
Loading suggestions and sending them to the GUI on startup is almost instantaneous. Previously it would take ~10s just for `Standard.Base`.
This PR bumps the FlatBuffers version used by the backend to `24.3.25` (the latest version as of now).
Since the newer FlatBuffers releases come with prebuilt binaries for all platforms we target, we can simplify the build process by simply downloading the required `flatc` binary from the official FlatBuffers GitHub release page. This allows us to remove the dependency on `conda`, which was the only reliable way to get the outdated `flatc`.
The `conda` setup has been removed from the CI steps and the relevant code has been removed from the build script.
The FlatBuffers version is no longer hard-coded in the Rust build script; it is inferred from the `build.sbt` definition (similarly to GraalVM).
# Important Notes
This does not affect the GUI binary protocol implementation.
While I initially wanted to update it, it turned out to be fairly non-trivial.
As there are multiple issues with the generated TS code, it was significantly refactored by hand, and it is impossible to update it automatically. Work to address this problem is left as [a future task](https://github.com/enso-org/enso/issues/9658).
As the FlatBuffers binary protocol is guaranteed to be compatible between versions (unlike the generated sources), there should be no adverse effects from bumping `flatc` only on the backend side.
* Initial connection to Snowflake via an account, username and password.
* Fix databases and schemas in Snowflake.
Add warehouses.
* Add warehouse.
Update schema dropdowns.
* Add ability to set warehouse and pass at connect.
* Fix for NPE in license review
* scalafmt
* Separate Snowflake from Database.
* Scala fmt.
* Legal Review
* Avoid using ARROW for snowflake.
* Tidy up Entity_Naming_Properties.
* Fix for separating Entity_Naming_Properties.
* Allow some tweaking of the Postgres dialect so that Snowflake can use it as well.
* Working on reading Date, Time and Date Times.
* Changelog.
* Java format.
* Make Snowflake Time and TimeStamp stuff work.
Move some responsibilities to Type_Mapping.
* fix
* Update distribution/lib/Standard/Database/0.0.0-dev/src/Connection/Connection.enso
Co-authored-by: Radosław Waśko <radoslaw.wasko@enso.org>
* PR comments.
* Last refactor for PR.
* Fix.
---------
Co-authored-by: Radosław Waśko <radoslaw.wasko@enso.org>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
If some benchmark fails in dry-run (compileOnly) mode, the whole process now exits with a non-zero return code. Also fixes failing engine compiler benchmarks.
# Important Notes
Manually added failure:
```diff
diff --git a/engine/runtime-benchmarks/src/main/java/org/enso/interpreter/bench/benchmarks/semantic/ArrayProxyBenchmarks.java b/engine/runtime-benchmarks/src/main/java/org/enso/interpreter/bench/benchmarks/semantic/ArrayProxyBenchmarks.java
index c8d86cecc..f9f4d7cbc 100644
--- a/engine/runtime-benchmarks/src/main/java/org/enso/interpreter/bench/benchmarks/semantic/ArrayProxyBenchmarks.java
+++ b/engine/runtime-benchmarks/src/main/java/org/enso/interpreter/bench/benchmarks/semantic/ArrayProxyBenchmarks.java
@@ -95,7 +95,8 @@ public class ArrayProxyBenchmarks {
@Benchmark
public void sumOverComputingProxy(Blackhole matter) {
- performBenchmark(matter);
+ //performBenchmark(matter);
+ throw new AssertionError("My error");
}
@Benchmark
```
Run with `sbt "-Dbench.compileOnly=true runtime-benchmarks/benchOnly org.enso.interpreter.bench.benchmarks.semantic.ArrayProxyBenchmarks.sumOverComputingProxy"` fails with:
```
[info] Running benchmarks [org.enso.interpreter.bench.benchmarks.semantic.ArrayProxyBenchmarks.sumOverComputingProxy] in compileOnly mode
[info] # JMH version: 1.36
[info] # VM version: JDK 21.0.2, Java HotSpot(TM) 64-Bit Server VM, 21.0.2+13-LTS-jvmci-23.1-b30
[info] # VM invoker: /home/pavel/.sdkman/candidates/java/21.0.2-graal/bin/java
[info] # VM options: -XX:ThreadPriorityPolicy=1 -XX:+UnlockExperimentalVMOptions -XX:+EnableJVMCIProduct -XX:-UnlockExperimentalVMOptions -Dslf4j.provider=org.slf4j.nop.NOPServiceProvider -Dbench.compileOnly=true --module-path=/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/sdk/nativeimage/23.1.2/nativeimage-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/sdk/word/23.1.2/word-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/sdk/jniutils/23.1.2/jniutils-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/sdk/collections/23.1.2/collections-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/polyglot/polyglot/23.1.2/polyglot-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/truffle/truffle-api/23.1.2/truffle-api-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/truffle/truffle-runtime/23.1.2/truffle-runtime-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/truffle/truffle-compiler/23.1.2/truffle-compiler-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/js/js-language/23.1.2/js-language-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/regex/regex/23.1.2/regex-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/shadowed/icu4j/23.1.2/icu4j-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/python/python-language/23.1.2/python-language-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/python/python-resources/23.1.2/python-resources-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/bouncycastle/bcutil-jdk18on/1.76/bcutil-jdk18on-1.76.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/bouncycastle/bcpkix-jdk18on/1.76/bcpkix-jdk18on-1.76.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/bouncycastle/bcprov-jdk18on/1.76/bcprov-jdk18on-1.76.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/llvm/llvm-api/23.1.2/llvm-api-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/truffle/truffle-nfi/23.1.2/truffle-nfi-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/truffle/truffle-nfi-libffi/23.1.2/truffle-nfi-libffi-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/tools/profiler-tool/23.1.2/profiler-tool-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/graalvm/shadowed/json/23.1.2/json-23.1.2.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/tukaani/xz/1.9/xz-1.9.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/slf4j/slf4j-api/2.0.9/slf4j-api-2.0.9.jar:/home/pavel/.cache/coursier/v1/https/repo1.maven.org/maven2/org/slf4j/slf4j-nop/2.0.9/slf4j-nop-2.0.9.jar:/home/pavel/dev/enso/runtime.jar --add-modules=org.enso.runtime --add-exports=org.slf4j.nop/org.slf4j.nop=org.slf4j
[info] # Blackhole mode: compiler (auto-detected, use -Djmh.blackhole.autoDetect=false to disable)
[info] # Warmup: <none>
[info] # Measurement: 1 iterations, 1 s each
[info] # Timeout: 10 min per iteration
[info] # Threads: 1 thread, will synchronize iterations
[info] # Benchmark mode: Average time, time/op
[info] # Benchmark: org.enso.interpreter.bench.benchmarks.semantic.ArrayProxyBenchmarks.sumOverComputingProxy
[info] # Run progress: 0.00% complete, ETA 00:00:01
[info] # Fork: N/A, test runs in the host VM
[info] # *** WARNING: Non-forked runs may silently omit JVM options, mess up profilers, disable compiler hints, etc. ***
[info] # *** WARNING: Use non-forked runs only for debugging purposes, not for actual performance runs. ***
[error] SLF4J: Attempting to load provider "org.slf4j.nop.NOPServiceProvider" specified via "slf4j.provider" system property
[info] Iteration 1: <failure>
[info] java.lang.AssertionError: My error
[info] at org.enso.interpreter.bench.benchmarks.semantic.ArrayProxyBenchmarks.sumOverComputingProxy(ArrayProxyBenchmarks.java:99)
[info] at org.enso.interpreter.bench.benchmarks.semantic.jmh_generated.ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.sumOverComputingProxy_avgt_jmhStub(ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.java:232)
[info] at org.enso.interpreter.bench.benchmarks.semantic.jmh_generated.ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.sumOverComputingProxy_AverageTime(ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.java:173)
[info] at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
[info] at java.base/java.lang.reflect.Method.invoke(Method.java:580)
[info] at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:475)
[info] at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:458)
[info] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
[info] at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
[info] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
[info] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
[info] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
[error] Benchmark run failed: Benchmark caught the exception
[info] at java.base/java.lang.Thread.run(Thread.java:1583)
[error] org.openjdk.jmh.runner.RunnerException: Benchmark caught the exception
[error] at org.openjdk.jmh.runner.Runner.runBenchmarks(Runner.java:575)
[error] at org.openjdk.jmh.runner.Runner.internalRun(Runner.java:310)
[error] at org.openjdk.jmh.runner.Runner.run(Runner.java:209)
[error] at org.enso.interpreter.bench.BenchmarksRunner.runCompileOnly(BenchmarksRunner.java:93)
[error] at org.enso.interpreter.bench.BenchmarksRunner.run(BenchmarksRunner.java:36)
[error] at org.enso.interpreter.bench.benchmarks.RuntimeBenchmarksRunner.main(RuntimeBenchmarksRunner.java:8)
[error] Caused by: org.openjdk.jmh.runner.BenchmarkException: Benchmark error during the run
[error] at org.openjdk.jmh.runner.BenchmarkHandler.runIteration(BenchmarkHandler.java:424)
[error] at org.openjdk.jmh.runner.BaseRunner.runBenchmark(BaseRunner.java:281)
[error] at org.openjdk.jmh.runner.BaseRunner.runBenchmark(BaseRunner.java:233)
[error] at org.openjdk.jmh.runner.BaseRunner.doSingle(BaseRunner.java:138)
[error] at org.openjdk.jmh.runner.BaseRunner.runBenchmarksEmbedded(BaseRunner.java:110)
[error] at org.openjdk.jmh.runner.Runner.runBenchmarks(Runner.java:555)
[error] ... 5 more
[error] Suppressed: java.lang.AssertionError: My error
[error] at org.enso.interpreter.bench.benchmarks.semantic.ArrayProxyBenchmarks.sumOverComputingProxy(ArrayProxyBenchmarks.java:99)
[error] at org.enso.interpreter.bench.benchmarks.semantic.jmh_generated.ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.sumOverComputingProxy_avgt_jmhStub(ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.java:232)
[error] at org.enso.interpreter.bench.benchmarks.semantic.jmh_generated.ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.sumOverComputingProxy_AverageTime(ArrayProxyBenchmarks_sumOverComputingProxy_jmhTest.java:173)
[error] at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103)
[error] at java.base/java.lang.reflect.Method.invoke(Method.java:580)
[error] at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:475)
[error] at org.openjdk.jmh.runner.BenchmarkHandler$BenchmarkTask.call(BenchmarkHandler.java:458)
[error] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
[error] at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
[error] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
[error] at java.base/java.lang.Thread.run(Thread.java:1583)
[error] Nonzero exit code returned from runner: 1
[error] (Compile / run) Nonzero exit code returned from runner: 1
[error] Total time: 5 s, completed Mar 13, 2024, 12:49:59 PM
```
Reload of sbt used to fail with "Not a valid command: graalVMVersionCheck". This PR fixes that issue.
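For context, a check like this is typically wired in as a custom sbt command so that it remains resolvable after `reload`; a minimal sketch of such a registration (illustrative body and version, not the actual fix):
```
// build.sbt (sketch)
commands += Command.command("graalVMVersionCheck") { state =>
  val expected = "21.0.2"
  val actual   = sys.props.getOrElse("java.vm.version", "")
  if (!actual.startsWith(expected))
    state.log.error(s"Running on GraalVM version $actual. Expected GraalVM version $expected.")
  state
}
```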
# Important Notes
Checked:
- Manually running `reload` inside sbt shell
- Temporarily setting a key in the current sbt shell with `set javaOptions += "foo"` and then `reload`.
- Invoking sbt with an incompatible Java version:
```
[info] welcome to sbt 1.9.7 (GraalVM Community Java 21)
[info] loading settings for project enso-build from plugins.sbt ...
[info] loading project definition from /home/pavel/dev/enso/project
[info] loading settings for project enso from build.sbt ...
[info] resolving key references (66170 settings) ...
[info] set current project to enso (in build file:/home/pavel/dev/enso/)
[error] Running on GraalVM version 21. Expected GraalVM version 21.0.2.
[warn] Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? (default: r)
```