This change downgrades the hashing algorithm used for caching IR and library bindings to SHA-1. It is sufficient for the simple checksum purpose we use it for and significantly faster.
Additionally, don't calculate the digest for serialized bytes: if we get the expected object type on deserialization, we can be confident about the integrity.
Don't initialize Jackson's ObjectMapper for every metadata serialization/deserialization. Initialization is very costly.
Avoid unnecessary conversions between Scala and Java. The back-and-forth `asScala` and `asJava` calls are pretty expensive.
Finally, fix an SBT warning emitted when generating the library cache.
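As an illustration of the ObjectMapper point, here is a minimal sketch of reusing a single, lazily created mapper instead of constructing one per call (the object and method names are illustrative, not the actual cache code):

```scala
// Illustrative sketch only - not the actual cache implementation.
// Jackson's ObjectMapper is thread-safe once configured, so one shared
// instance can serve all metadata (de)serialization calls.
import com.fasterxml.jackson.databind.ObjectMapper

object MetadataMapper {
  // Created once; constructing a new ObjectMapper on every call is what was costly.
  private lazy val mapper: ObjectMapper = new ObjectMapper()

  def toJson(value: AnyRef): String = mapper.writeValueAsString(value)
  def fromJson[A](json: String, cls: Class[A]): A = mapper.readValue(json, cls)
}
```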
Closes https://github.com/enso-org/enso/issues/5763
# Important Notes
The change cuts roughly 0.8-1s from the overall startup.
This change will certainly invalidate existing caches; it is advised to simply start with a clean slate.
This change adds serialization and deserialization of library bindings.
To make it functional, one needs to first generate the IR and serialize the
bindings using the `--compiled <path-to-library>` command. The bindings
are stored under the library with a `.bindings` suffix.
Bindings are generated during the `buildEngineDistribution` task, so no
extra steps are required.
When resolving imports/exports, the compiler will first try to load the
module's bindings from the cache. If successful, it will not schedule the module's
imports/exports for immediate compilation, as it always did before, but will use the
bindings info to infer the dependent modules.
The current change does not yet make any optimizations when it comes to
compiling the modules. It only delays the actual
compilation/loading of IR from the cache so that it can be done in bulk.
Further optimizations, such as parallel loading of caches or lazily
inferring only the necessary modules, will build on this opportunity.
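A rough sketch of the cache-first strategy described above; every type and name here is an illustrative placeholder, not the real compiler API:

```scala
// Illustrative sketch of the described behaviour; not the actual compiler code.
object BindingsResolutionSketch {
  final case class Module(name: String)
  final case class Bindings(resolvedImports: List[Module])

  final class BindingsCache(store: Map[String, Bindings]) {
    def load(module: Module): Option[Bindings] = store.get(module.name)
  }

  /** Splits modules into those whose dependencies can be inferred from cached
    * bindings and those that still need to be scheduled for compilation. */
  def resolve(
      modules: List[Module],
      cache: BindingsCache
  ): (Map[Module, List[Module]], List[Module]) = {
    val (cached, uncached) = modules.partition(m => cache.load(m).isDefined)
    val inferredDeps =
      cached.map(m => m -> cache.load(m).get.resolvedImports).toMap
    (inferredDeps, uncached)
  }
}
```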
Part of https://github.com/enso-org/enso/issues/5568 work.
Creating two `findExceptionMessage` methods, one in `HostEnsoUtils` and one in `VisualizationResult`. Why two? Because one of them uses the `org.graalvm.polyglot` SDK, as it runs in _"normal Java"_ mode. The other one uses the Truffle API, as it runs inside a partially evaluated instrument.
There is a `FindExceptionMessageTest` to guarantee consistency between the two methods. It simulates some exceptions in Enso code and checks that both methods extract the same _"message"_ from the exception. The test verifies hosted as well as Enso exceptions. However, testing other polyglot languages is only possible in other modules, so I created `PolyglotFindExceptionMessageTest`; that one doesn't have access to the Truffle API, so it doesn't really check the consistency, just that a reasonable message is extracted from a JavaScript exception.
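For context, a minimal sketch of the "normal Java" variant based on the polyglot SDK (illustrative only, not the actual `HostEnsoUtils` code):

```scala
// Illustrative sketch: prefer the polyglot exception message when present,
// otherwise fall back to the Java message or the class name.
import org.graalvm.polyglot.PolyglotException

object FindExceptionMessageSketch {
  def findExceptionMessage(t: Throwable): String = t match {
    case pe: PolyglotException if pe.getMessage != null => pe.getMessage
    case other => Option(other.getMessage).getOrElse(other.getClass.getName)
  }
}
```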
# Important Notes
This is not a full fix of #5260: something needs to be done on the IDE side, as the IDE seems to ignore the delivered JSON message, even if it contains a properly extracted exception message.
Automating the assembly of the engine and its execution into a single task. If you are modifying standard libraries, engine sources or Enso tests, you can launch `sbt` and then just:
```
sbt:enso> runEngineDistribution --run test/Tests/src/Data/Maybe_Spec.enso
[info] Engine package created at built-distribution/enso-engine-0.0.0-dev-linux-amd64/enso-0.0.0-dev
[info] Executing built-distribution/enso-engine-...-dev/bin/enso --run test/Tests/src/Data/Maybe_Spec.enso
Maybe: [5/5, 30ms]
- should have a None variant [14ms]
- should have a Some variant [5ms]
- should provide the `maybe` function [4ms]
- should provide `is_some` [2ms]
- should provide `is_none` [3ms]
5 tests succeeded.
0 tests failed.
0 tests skipped.
```
The [runEngineDistribution](3a581f29ee/docs/CONTRIBUTING.md (running-enso)) `sbt` input task makes sure all your sources are properly compiled and only then executes your Enso source. Everything is ready at a single press of Enter.
# Important Notes
To debug in chrome dev tools, just add `--inspect`:
```
sbt:enso> runEngineDistribution --inspect --run test/Tests/src/Data/Maybe_Spec.enso
E.g. in Chrome open: devtools://devtools/bundled/js_app.html?ws=127.0.0.1:9229/7JsgjXlntK8
```
Everything gets built and one can just attach the Enso debugger.
Before, any failure of the Rust side of the parser build would be swallowed by sbt; for example, if I add gibberish to the Rust code I get:
<img width="474" alt="image" src="https://user-images.githubusercontent.com/1436948/217374050-fd9ddaca-136c-459e-932e-c4b9e630d610.png">
This is problematic, because when users are compiling in SBT they may get confusing errors about Java files not being found, while the true cause is hard to track down because it is buried deep in the logs. We've run into this silent failure when setting up SBT builds together with @GregoryTravis today.
I suggest changing it so that once cargo fails, the build fails with a helpful message; this way it will be easier to track down the issues.
With these changes we get:
<img width="802" alt="image" src="https://user-images.githubusercontent.com/1436948/217374531-707ae348-4c55-4d62-9a86-93850ad8086b.png">
Upgrading to GraalVM 22.3.0.
# Important Notes
- Removed all deprecated `FrameSlot`s and replaced them with frame indexes - plain integers.
- Added more information to `AliasAnalysis` so that it also gathers these indexes.
- Added a quick build mode option to `native-image` as the default for non-release builds.
- `graaljs` and `native-image` should now be downloaded automatically via `gu`, as dependencies.
- Removed the `engine-runner-native` project - the native image is now built straight from `engine-runner`.
  - We used to have `engine-runner-native` without `sqldf` on the classpath as a workaround for an internal native-image bug.
- Fixed the Chrome inspector integration so that it shows values of local variables both for the current stack frame and for caller stack frames.
  - There are still many issues with debugging in general; for example, when there is a polyglot value among local variables, a `NullPointerException` is thrown and no values are displayed.
- Removed some deprecated `native-image` options.
- Removed some deprecated Truffle API method calls.
- Added expression ANTLR4 grammar and sbt based build.
- Added expression support to `set` and `filter` on the Database and InMemory `Table`.
- Added expression support to `aggregate` on the Database and InMemory `Table`.
- Removed old aggregate functions (`sum`, `max`, `min` and `mean`) from `Column` types.
- Adjusted the database `Column` `+` operator to do concatenation (`||`) when given text types.
- Added the power operator `^` to both `Column` types.
- Adjusted `iif` to allow columns to be passed for the `when_true` and `when_false` parameters.
- Added `is_present` to the database `Column` type.
- Added `coalesce`, `min` and `max` functions to both `Column` types performing row-based operations.
- Added support for `Date`, `Time_Of_Day` and `Date_Time` constants in database.
- Added `read` method to InMemory `Column` returning `self` (or a slice).
# Important Notes
- Moved approximate type computation to `SQL_Type`.
- Fixed issue in `LongNumericOp` where it was always casting to a double.
- Removed `head` from InMemory Table (still has `first` method).
Make sure `libenso_parser.so`, `.dll` or `.dylib` are packaged and included when running `sbt buildEngineDistribution`.
# Important Notes
There was [a discussion](https://discord.com/channels/401396655599124480/1036562819644141598) about the proper location of the library. It was concluded that _"there's no functional difference between a dylib and a jar"_, and as such the library is placed in the `component` folder.
Currently the old parser is still used for parsing. This PR just integrates the build system changes and makes us ready for smooth flipping of the parser in the future as part of #3611.
We've had an old attempt at integrating a Rust parser with our Scala/Java projects. It seems to have been abandoned and is not used anywhere - it is also superseded by the new integration of the Rust parser. I think it was used as an experiment to see how to approach such an integration.
Since it is not used anymore, it makes sense to remove it, because it only adds some (slight, but non-zero) maintenance effort. We can always bring it back from git history if necessary.
- Reimplement the `Duration` type as a built-in type.
- `Duration` is an interop type.
- Allow Enso method dispatch on `Duration` interop coming from different languages.
# Important Notes
- The older `Duration` type should now be split into the new `Duration` builtin type and a `Period` type.
- This PR does not implement `Period` type, so all the `Period`-related functionality is currently not working, e.g., `Date - Period`.
- This PR removes `Integer.milliseconds`, `Integer.seconds`, ..., `Integer.years` extension methods.
This PR adds a possibility to generate native-image for engine-runner.
Note that due to the on-demand loading of the stdlib, programs that make use of it are currently not yet supported
(that will be resolved at a later point).
The purpose of this PR is only to make sure that we can generate a bare-minimum runner, because due to missing TruffleBoundaries or misconfiguration of the reflection config this can get broken very easily.
To generate a native image simply execute:
```
sbt> engine-runner-native/buildNativeImage
... (wait a few minutes)
```
The executable is called `runner` and can be tested via a simple test that is in the resources. To illustrate the benefits,
see the timing difference between the non-native and the native one:
```
>time built-distribution/enso-engine-0.0.0-dev-linux-amd64/enso-0.0.0-dev/bin/enso --no-ir-caches --in-project test/Tests/ --run engine/runner-native/src/test/resources/Factorial.enso 6
720
real 0m4.503s
user 0m9.248s
sys 0m1.494s
> time ./runner --run engine/runner-native/src/test/resources/Factorial.enso 6
720
real 0m0.176s
user 0m0.042s
sys 0m0.038s
```
# Important Notes
Notice that due to a [bug in GraalVM](https://github.com/oracle/graal/issues/4200), which is already fixed in 22.x while we are still on 21.x for the time being, I had to add a workaround to our sbt build that builds a different fat jar for the native image. To work around it I had to exclude the sqlite jar. Hence the native-image task is on `engine-runner-native` and not on `engine-runner`.
Will need to add the above command to CI.
New plan to [fix the `sbt` build](https://www.pivotaltracker.com/n/projects/2539304/stories/182209126) and its annoying error message:
```
log.error(
"Truffle Instrumentation is not up to date, " +
"which will lead to runtime errors\n" +
"Fixes have been applied to ensure consistent Instrumentation state, " +
"but compilation has to be triggered again.\n" +
"Please re-run the previous command.\n" +
"(If this for some reason fails, " +
s"please do a clean build of the $projectName project)"
)
```
Since it is hard to fix `sbt` incremental compilation, let's restructure our project sources so that each `@TruffleInstrument` and `@TruffleLanguage` registration lives in an individual compilation unit. Each such unit is then either compiled or not compiled as a whole - that eliminates the `sbt` incremental compilation issues without addressing them in `sbt` itself.
fa2cf6a33ec4a5b2e3370e1b22c2b5f712286a75 is the first step - it introduces `IdExecutionService` and moves all of the `IdExecutionInstrument` API up to that interface. The rest of the `runtime` project then depends only on `IdExecutionService`. Such a refactoring allows us to move `IdExecutionInstrument` out of the `runtime` project into an independent compilation unit.
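A simplified sketch of the shape of that refactoring; the real `IdExecutionService` API is much richer, and the names below are abbreviated for illustration:

```scala
// Illustrative only - not the actual interfaces.
// Lives in a project that `runtime` depends on:
trait IdExecutionService {
  def bind(moduleName: String, onExpressionComputed: String => Unit): Unit
}

// Lives in its own compilation unit/project, which also carries the
// @TruffleInstrument registration, so sbt recompiles it as a whole:
final class IdExecutionInstrumentSketch extends IdExecutionService {
  override def bind(moduleName: String, onExpressionComputed: String => Unit): Unit = ()
}
```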
This change introduces a custom LogManager for the console that allows for
excluding certain log messages. The primary reason for introducing
such a LogManager/Appender is to stop issuing hundreds of pointless
warnings coming from the analyzing compiler (a wrapper around javac) for
classes that are generated by annotation processors.
The output looks like this:
```
[info] Cannot install GraalVM MBean due to Failed to load org.graalvm.nativebridge.jni.JNIExceptionWrapperEntryPoints
[info] compiling 129 Scala sources and 395 Java sources to /home/hubert/work/repos/enso/enso/engine/runtime/target/scala-2.13/classes ...
[warn] Unexpected javac output: warning: File for type 'org.enso.interpreter.runtime.type.ConstantsGen' created in the last round will not be subject to annotation processing.
[warn] 1 warning.
[info] [Use -Dgraal.LogFile=<path> to redirect Graal log output to a file.]
[info] Cannot install GraalVM MBean due to Failed to load org.graalvm.nativebridge.jni.JNIExceptionWrapperEntryPoints
[info] foojavac Filer
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.number.decimal.CeilMethodGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.resource.TakeNodeGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.error.ThrowErrorMethodGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.number.smallInteger.MultiplyMethodGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.warning.GetWarningsNodeGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.number.smallInteger.BitAndMethodGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.error.ErrorToTextNodeGen
[warn] Could not determine source for class org.enso.interpreter.node.expression.builtin.warning.GetValueMethodGen
[warn] Could not determine source for class org.enso.interpreter.runtime.callable.atom.AtomGen$MethodDispatchLibraryExports$Cached
....
```
The output now has over 500 of those warnings and there will be more. Much more
(generated by our and Truffle's annotation processors).
There is no way to tell SBT that those are OK. One could potentially
think of splitting compilation into 3 stages (Java annotation processors, Java and
Scala), but that would further complicate the already non-trivial build definition
and we might still end up with the initial problem.
This is a fix to make it possible to get reasonable feedback from
compilation without scrolling multiple screens *every single time*.
Also fixed a spurious warning in the javac processor complaining about
creating files in the last round.
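For illustration, the filtering boils down to a predicate along these lines applied by the custom appender; the matched fragments and the wiring into the logging framework are simplified placeholders:

```scala
// Simplified illustration of the message-filtering idea; not the actual appender.
object AnnotationProcessorNoiseFilter {
  private val ignoredFragments = Seq(
    "Could not determine source for class",
    "created in the last round will not be subject to annotation processing"
  )

  /** Returns true when a log message is known annotation-processor noise. */
  def shouldDrop(message: String): Boolean =
    ignoredFragments.exists(message.contains)
}
```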
Related to https://www.pivotaltracker.com/story/show/182138198
The `provided` classifier is completely omitted from the update report, so we
cannot filter on it during `.select`.
Instead, we only consider the runtime configuration, which lists all the
dependencies needed to run things.
This is a follow-up on #182093808. With this change the frgaal compiler is no
longer packaged.
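In sbt terms, the selection looks roughly like this `build.sbt` fragment (a sketch with an illustrative task name, not the exact build code):

```scala
// Collect jars from the runtime configuration of the update report instead of
// trying to filter on the `provided` classifier, which the report does not carry.
lazy val runtimeJars = taskKey[Seq[File]]("Dependency jars needed at runtime.")

runtimeJars := update.value.select(configurationFilter(Runtime.name))
```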
Auxiliary sbt commands for building individual
stdlib packages.
The commands check whether the engine distribution was built at least once
and only copy the package files when needed (a rough sketch of what such a command might look like is included below).
So far added:
- `buildStdLibBase`
- `buildStdLibDatabase`
- `buildStdLibTable`
- `buildStdLibImage`
- `buildStdLibGoogle_Api`
Related to [#182014385](https://www.pivotaltracker.com/story/show/182014385)
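A rough sketch of one of these commands as a `build.sbt` fragment; every path, name and check below is an illustrative placeholder:

```scala
// Illustrative only - not the actual task definitions.
lazy val buildStdLibBase =
  taskKey[Unit]("Copies the Base stdlib package into the built engine distribution.")

buildStdLibBase := {
  val distRoot =
    file("built-distribution/enso-engine-0.0.0-dev-linux-amd64/enso-0.0.0-dev")
  if (!distRoot.exists())
    sys.error("Engine distribution not found - run buildEngineDistribution at least once first.")
  // Copy only the package files of the library being rebuilt.
  IO.copyDirectory(
    file("distribution/lib/Standard/Base"),
    distRoot / "lib" / "Standard" / "Base",
    overwrite = true
  )
}
```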
* Initial integration with Frgaal in sbt
Half-working, since it chokes on classes generated by annotation
processors.
* Replace AutoService with ServiceProvider
For reasons unknown, AutoService would fail to initialize and fail to
generate the required builtin method classes.
The hidden error message is not particularly revealing about the reason for
that:
```
[error] error: Bad service configuration file, or exception thrown while constructing Processor object: javax.annotation.processing.Processor: Provider com.google.auto.service.processor.AutoServiceProcessor could not be instantiated
```
The sample record is there only to demonstrate that we can now use newer Java
features.
* Cleanup + fix benchmark compilation
Bench requires jmh classes which are not available because we obviously
had to limit `java.base` modules to get Frgaal to work nicely.
For now, we default to good ol' javac for Benchmarks.
Limiting Frgaal to `runtime` for now; if it plays nicely, we can expand it
to other projects.
* Update CHANGELOG
* Remove dummy record class
* Update licenses
* New line
* PR review
* Update legal review
Co-authored-by: Radosław Waśko <radoslaw.wasko@enso.org>
Implements https://www.pivotaltracker.com/story/show/181266184
### Important Notes
Changed the example image download to only proceed if the file does not already exist, thus cutting down on build time (the build used to download it _every_ time, which completely failed the build if the network was down). A redownload can be forced by performing a fresh repository checkout.
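A sketch of such a guard, with a placeholder URL and target path rather than the real ones:

```scala
// Illustrative only - the actual URL and target path differ.
import java.net.URL
import java.nio.file.{Files, Paths, StandardCopyOption}

object ExampleImageDownload {
  def ensurePresent(): Unit = {
    val target = Paths.get("test/Examples/data/image.png") // placeholder path
    if (!Files.exists(target)) { // skip the download on subsequent builds
      Files.createDirectories(target.getParent)
      val in = new URL("https://example.org/image.png").openStream() // placeholder URL
      try Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING)
      finally in.close()
    }
  }
}
```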
Result of automatic formatting with `scalafmtAll` and `javafmtAll`.
Prerequisite for https://github.com/enso-org/enso/pull/3394
### Important Notes
This touches a lot of files and might conflict with existing PRs that are in progress. If that's the case, just run
`scalafmtAll` and `javafmtAll` after the merge and everything should be in order, since the formatters should be deterministic.