* Adapt JSON API write path to the new explicit disclosure Ledger API interface
* Address review comments
* Switch to vanilla Base64 for createdEventBlob instead of Base64Url
* update TypeScript bindings of `DisclosedContract` to use the `createdEventBlob` field instead of `payload`, `payloadBlob`, and `metadata`; short-circuit a test that depends on Canton populating the `createdEventBlob` field
* get TypeScript integration tests to use the transaction service to fetch `created_event_blob` data
---------
Co-authored-by: = <=>
Fixes https://github.com/digital-asset/daml/issues/17692
Fixes the error from the `jsonEncoder` method (which was about `javac` mis-interpreting a method reference), and the warnings in `toValue`, `valueDecoder` and the initialiser for `__enums$`, which did not need to qualify the static identifiers by class name.
* replace community canton snapshot with our bazel-built canton deploy jar
* add VERSION to the resources of the jar
* fix canton-test-runner-with-dependencies-script
* add resources to community_common
* Update test-common/canton/it-lib/src/main/com/daml/CantonFixture.scala
Co-authored-by: Remy <remy.haemmerle@daml.com>
---------
Co-authored-by: Remy <remy.haemmerle@daml.com>
To avoid excessive duplicated code-gen, I put the implementation of `toJson` into the parent interface `DefinedDataType` and leverage the `jsonEncoder()` in each generated subclass. However, custom datatypes with type parameters need a different `toJson(...)` signature, as they must take an encoder argument for each type parameter, so we generate a custom `toJson(...)` for those classes. I also statically import the `JsonLfEncoder.apply` method, as a useful uniform syntax for calling either `Function` objects or method references (e.g. `JsonLfEncoder::bool`). That required propagating the static imports up from the various methods that need them and adding them at the `JavaFile.Builder` level.
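The pattern described above can be sketched with minimal stand-in interfaces. All names here are illustrative stand-ins, not the actual daml codegen API:

```java
// Hypothetical sketch: a default toJson() on the parent interface delegates
// to each subclass's jsonEncoder(), so no per-class toJson() codegen is
// needed for types without type parameters.
public class ToJsonSketch {
    // Stand-in for JsonLfEncoder: something that knows how to emit JSON.
    interface Encoder {
        String encode();
    }

    // Stand-in for DefinedDataType, carrying the shared default toJson().
    interface DataType {
        Encoder jsonEncoder();

        default String toJson() {
            return jsonEncoder().encode();
        }
    }

    // A "generated" enum-like class only needs to supply its encoder.
    static final class Color implements DataType {
        final String name;
        Color(String name) { this.name = name; }

        @Override
        public Encoder jsonEncoder() {
            return () -> "\"" + name + "\"";
        }
    }

    public static void main(String[] args) {
        System.out.println(new Color("Red").toJson()); // prints "Red" with quotes
    }
}
```

Types with type parameters would instead generate a `toJson(...)` taking one encoder per type argument, since the shared default cannot know how to encode them.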
I changed all the integration tests that called `fromJson` on a hard-coded string into round-trip tests à la `Foo.fromJson(foo.toJson())`, now that we have `toJson`.
Also add a benchmark, which can be run with
```
bazel run //language-support/java/codegen:to-json-bench
```
Locally I currently get these results.
```
# Run complete. Total time: 00:00:56
Benchmark                          Mode  Cnt         Score        Error  Units
ToJsonBench.enummodBox            thrpt   10   8138043.644 ± 197223.048  ops/s
ToJsonBench.enummodColoredTree    thrpt   10   3132500.885 ±  23677.832  ops/s
ToJsonBench.enummodOptionalColor  thrpt   10  10762883.904 ± 116865.102  ops/s
ToJsonBench.genmapmodBox          thrpt   10   1554965.578 ±   4956.210  ops/s
```
The `Score` is the number of times the sample values can be converted to JSON per second.
* Explicit disclosure based on blobs only
* cosmetic changes
* Changes post-review
* remove buf check suppression
* Silence deprecation warnings
* more silencing of deprecation warnings
* Changes after recent round of reviews
Mostly mechanical change, pulling nested static class `JsonLfReader.Decoders` out into its own class `JsonLfDecoders`.
Previously the nested class could access private members of `JsonLfReader`. I've now made the relevant methods package-accessible, and the fully private components such as anything from the jackson library are properly private. `currentText()` becomes the single way for `JsonLfDecoders` to read the current value.
This change will mirror what we'll then do with the upcoming `JsonLfEncoders`.
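The access pattern described above can be sketched as follows; these minimal classes are illustrative stand-ins, not the real `JsonLfReader`/`JsonLfDecoders` API:

```java
// Hypothetical sketch of the refactoring: the reader keeps its parsing
// internals private and exposes a single currentText() accessor, which the
// extracted decoders use as their only way to read the current value.
public class DecodersSketch {
    // Stand-in for JsonLfReader: a cursor over pre-tokenized input.
    static final class Reader {
        private final String[] tokens; // stand-in for the private Jackson parser
        private int pos = 0;
        Reader(String... tokens) { this.tokens = tokens; }

        // The single way for decoders to read the current value.
        String currentText() { return tokens[pos++]; }
    }

    // Stand-in for the extracted JsonLfDecoders class: it only ever goes
    // through currentText(), never through the reader's private internals.
    static final class Decoders {
        static boolean bool(Reader r) { return Boolean.parseBoolean(r.currentText()); }
        static long int64(Reader r) { return Long.parseLong(r.currentText()); }
    }

    public static void main(String[] args) {
        Reader r = new Reader("true", "42");
        System.out.println(Decoders.bool(r));  // true
        System.out.println(Decoders.int64(r)); // 42
    }
}
```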
* Support record up/downgrades in Java codegen
This adds enough upgrading support to the Java codegen to use it
against the current upgrading PoCs. This is backwards compatible, so I
enabled it in all cases instead of trying to add a flag somewhere.
Also rename the private `SubmitCommandsRequest.toProto` method, which is used by the deprecated but public `toProto` overloads, to `deprecatedToProto`. These deprecated overloads do not support passing disclosed contracts.
This PR does not provide facilities for building a `DisclosedContract`, either as read from Scribe or from whatever format it may have been published or shared in.
* initial commit
* split gen-stable-packages-v_i into two targets
* rename compatibleWith to canDependOn
* update damlc integration tests annotation to allow for 2.x
* use the right package ID for preconditionFailed when compiling to V2
* fuse stable-packages-v1 and stable-packages-v2 into one single filegroup
* Fix DA.Daml.LF.Ast.Tests
* remove leftover 'undefined' in Version.hs
* progress on fixing DataDependencies.hs
* fix Cross-SDK typeclasses test for 2.dev
* Fix the 'Typeclasses and instances' data dependency test
* Update comment
* fix //compiler/damlc/tests:packaging
* Add TODO
* parameterize the machine by the language version; hardcode v1 in script v1, v2 in script v2, and v1 in exports
* get EngineTests to pass
* fix more tests
* fix canton integration tests
* formatting
* fix more tests
* fix transactionversiontest
* fix exceptiontest
* Fix ValueEnricherSpec
* Fix EngineInfoTest
* fix PartialTransactionSpec
* fix upgradetest
* fix TransactionSnapshot
* Fix ContractKeySpec
* Fix ReinterpretTest
* fix InterfaceViewSpec
* fix InterfacesTest
* fix stable package v1 names
* fix validate.sh tests
* formatting
* Fix ChoiceAuthorityTest
* fix explicit disclosure test
* Fix SpeedyTest
* formatting
* Fix integration test
* fix data dependency tests
* fix package vetting count, increased due to metadata being added
* Redact stable package IDs in error messages in order for the ExceptionSemantics test to work for both v1 and v2
* cleanup
* fix Daml2ScriptTestRunner
* fix JsonApiIT and daml-script golden tests
* fix daml3-script runner test
* enable v2 for all integration tests
* formatting
* fix NodeSeedsTest
* fix since-lf annotations
* add comments, improve consistency
* stop hardcoding V1 in runPureExpr and runPureSExpr
* formatting
* remove hardcoding of LFv1 in ConcurrentCompiledPackages.apply
* Parameterize Compiler.Config.Default with major language version
* remove global parser implicit and default package ID and language version
* daml-lf-test.sh no longer takes damlc argument
* Split //daml-lf/tests:BasicTests into engine and integration tests
* Remove unused //daml-lf/engine:Optional
* Convert //daml-lf/tests:AuthorizedDivulgence into integration test
* Convert //daml-lf/tests:DontDiscloseNonConsumingExercisesToObservers into integration test
* Convert //daml-lf/tests:ConjunctionChoices into integration test
* Convert //daml-lf/tests:ContractKeys into integration test LFContractKeys
* Move //daml-lf/tests:AuthTests to //daml-lf/engine:AuthTests
* Split //daml-lf/tests:LargeTransaction into ledger and non-ledger tests
* Remove scenarios in //language-support/java/codegen:ledger-tests-model
* Remove unused scenarios in //test-common:src/main/daml/model/Test.daml
* Remove 'enable_scenarios' param in da_scala_dar_resources_library
* Remove 'enable_scenarios' param in damlc_compile_test
* Remove '--enable-scenarios=yes' in //compiler/damlc/tests:deterministic
* Convert /daml-lf/tests/scenario/dev/experimental to integration test
* Convert /daml-lf/tests/scenario/dev/interfaces to integration test
* Convert /daml-lf/tests/scenario/stable/big-numeric to integration test
* Convert /daml-lf/tests/scenario/stable/contract-key-through-exercises to integration test
* Convert /daml-lf/tests/scenario/stable/contract-keys to integration test
* Convert /daml-lf/tests/scenario/stable/divulge-iou to integration test
* Convert /daml-lf/tests/scenario/stable/embed-abort to integration test
* Convert /daml-lf/tests/scenario/stable/eval-agreement to integration test
* Convert /daml-lf/tests/scenario/stable/exception-auth to integration test
* Convert /daml-lf/tests/scenario/stable/gen-map to integration test
* Convert /daml-lf/tests/scenario/stable/many-fields to integration test
* Convert /daml-lf/tests/scenario/stable/mustfailcommit to integration test
* Convert /daml-lf/tests/scenario/stable/mustfailinterpretation to integration test
* Convert /daml-lf/tests/scenario/stable/mustfails to integration test
* Convert /daml-lf/tests/scenario/stable/no-contract-ids-in-keys to integration test
* Convert /daml-lf/tests/scenario/stable/pass-time to integration test
* Convert /daml-lf/tests/scenario/stable/pattern-matching to integration test
* Convert /daml-lf/tests/scenario/stable/timeout to integration test
* Remove scaffolding for daml-lf/tests/scenario
* Remove unused 'enable-scenarios' flags
* Remove unused daml-lf/tests/daml-lf-test.sh
* Remove unused daml-lf/tests/scala-test-limited-stack.sh
* Remove comments about deprecated flag 'DontDivulgeContractIdsInCreateArguments' in AuthorizedDivulgence integration test
* Add 'create' in LfStableTimeout:testScriptLoop
* add TODO for nesting limits test in LfStableMustFails
* Move LfStableMustFails expected ledger files into subdir
In doing so:
- `JsonLfReader::readFieldName` now also returns the location, as that is needed for error reporting
- Renamed `JsonLfReader::Field` to `JsonLfReader::JavaArg` as for our purposes it really represents an argument to the constructor of the Java class, and this avoids confusion with the new `FieldName` class, which refers to JSON object fields.
- Fix `locationEnd` so that it always returns the end location of the current token.
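The benefit of returning the location along with the field name can be sketched as follows; the class and method names here are illustrative stand-ins, not the actual API:

```java
// Hypothetical sketch: readFieldName() now yields the field name together
// with its source location, so a decoder can report *where* an unknown or
// misplaced JSON field occurred.
public class FieldNameSketch {
    static final class Location {
        final int line, column;
        Location(int line, int column) { this.line = line; this.column = column; }
        @Override public String toString() { return "line " + line + ", column " + column; }
    }

    // Stand-in for the new FieldName class, referring to a JSON object field.
    static final class FieldName {
        final String name;
        final Location loc;
        FieldName(String name, Location loc) { this.name = name; this.loc = loc; }
    }

    // With the location in hand, error messages can be precise.
    static String unknownFieldError(FieldName f) {
        return "unknown field '" + f.name + "' at " + f.loc;
    }

    public static void main(String[] args) {
        FieldName f = new FieldName("colour", new Location(3, 14));
        System.out.println(unknownFieldError(f)); // unknown field 'colour' at line 3, column 14
    }
}
```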
As for the benchmarks, on my laptop `bazel run language-support/java/codegen:from-json-bench` currently gives me
```
# Run complete. Total time: 00:01:06
Benchmark                                           Mode  Cnt        Score       Error  Units
FromJsonBench.enummodBox                           thrpt   10  4412899.724 ± 456030.906  ops/s
FromJsonBench.enummodColoredTree                   thrpt   10  1429975.211 ±  72853.649  ops/s
FromJsonBench.enummodOptionalColor                 thrpt   10  4696006.017 ± 267894.386  ops/s
FromJsonBench.enummodOptionalColor_ValueBeforeTag  thrpt   10  1293112.352 ±  29469.946  ops/s
FromJsonBench.genmapmodBox                         thrpt   10   714561.743 ±  55421.006  ops/s
```
i.e. depending on the data itself, throughput can be on the order of millions of conversions per second.
Simple stack profiling with
`bazel run language-support/java/codegen:from-json-bench -- -prof stack`
can also provide some insight into where time is spent while decoding.
This is not complete, but it gets far enough that one can start playing around with the feature.
Done in this PR:
- pass the actual `JsonLfReader` in as the final arg to `decode`, and it can then be threaded through all the sub-decoders
- renamed `FromJson` to `JsonLfDecoder` to better match the existing `ValueDecoder`
- make the non-generic decoders simple fields rather than nullary methods
- support variants with simple type args, as well as with their own records
- add a `T fromJson(String)` to all relevant types, as the main user-facing method, rather than just `JsonLfDecoder<T> jsonDecoder()`.
To be done:
- complete testing of different combinations of types, including nested optionals
- `JsonLfEncoder`, and round-trip testing
- alternative handling of missing and unknown fields
- capability to decode from an in-memory JSON object, for frameworks where the original JSON has already been decoded and the object embedded by the time you get access to it.
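The reader-threading design from the list above can be sketched with a minimal stand-in; all names here are illustrative, not the actual generated API, and a token cursor stands in for real JSON parsing:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: the decoder takes the reader as its final argument,
// so composite decoders can thread one reader through all sub-decoders,
// and fromJson(String) is the user-facing entry point.
public class DecoderSketch {
    // Minimal stand-in reader: a cursor over pre-split tokens.
    static final class Reader {
        private final String[] toks;
        private int i = 0;
        Reader(String[] toks) { this.toks = toks; }
        String next() { return toks[i++]; }
    }

    interface JsonLfDecoder<T> {
        T decode(Reader r);
    }

    // Non-generic decoders are simple fields rather than nullary methods.
    static final JsonLfDecoder<Long> int64 = r -> Long.parseLong(r.next());

    // A composite decoder threads the same reader through its element decoder.
    static <T> JsonLfDecoder<List<T>> list(JsonLfDecoder<T> elem) {
        return r -> {
            int n = Integer.parseInt(r.next()); // stand-in for reading '[' ... ']'
            List<T> out = new ArrayList<>();
            for (int k = 0; k < n; k++) out.add(elem.decode(r));
            return out;
        };
    }

    // User-facing entry point, analogous to the generated fromJson(String).
    static List<Long> fromJson(String s) {
        return list(int64).decode(new Reader(s.trim().split("\\s+")));
    }

    public static void main(String[] args) {
        System.out.println(fromJson("3 1 2 3")); // [1, 2, 3]
    }
}
```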
- Introduces a new major version, "2", in the daml_lf proto
- Adds new major versions to the compiler and the engine
- Updates all code that assumes only one major version
- Updates all code that assumes only one dev version
Provide initial implementation of `JsonLfReader`, which can be used by Java code-gen to build the Java objects from a JSON LF formatted string, as specified here https://docs.daml.com/json-api/lf-value-specification.html
Not done yet:
* the actual code-gen to produce the `fromJson` methods on custom types
* unit tests for failure cases on parsing
* some corner-cases of rounding
* a `JsonLfWriter`
* Fix a probable typo in //daml-fl/encoder/testing-dar-*
* apply TODOs in bazel files
* remove obsolete comments in bazel files
* use 'default' instead of 'latest' for targets relying on 'latest' in order to ensure interfaces are supported
* Update to rules_haskell v0.16
* Update comments re bazel patches
* clean up bazel overrides
* Upgrade to Bazel 5.2.0
* Remove '--distinct_host_configuration=false'
* Update buildifier to 6.3.2
* Suffix macos and ubuntu caches with yyyymm
* bump windows cache to v14
* [REVERTME] bump linux/macos/darwin timeout to 4h
* [LF] make Timestamp parsing consistent between Java 11 and Java 17
Between Java 11 and Java 17 there is one bug fix on Instant.parse
that expands the range of values that can be parsed into an
Instant. See https://bugs.openjdk.org/browse/JDK-8166138
Daml-LF happened to use `Instant.parse` to parse a string into a
Daml-LF timestamp, so we observed different behavior when running
Daml on Java 11 and Java 17.
Additionally, make explicit that conversion from a Java `Instant` or
string may drop nanoseconds: we create a lenient version that may
drop the significant nanoseconds (the legacy behavior) and a strict
version that rejects instants/strings that cannot be converted without
loss of precision.
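The lenient/strict split can be sketched as follows. This is an illustrative stand-in, not the actual Daml-LF API; it assumes only that Daml-LF timestamps have microsecond resolution:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

// Hypothetical sketch: Daml-LF timestamps are microseconds since the epoch,
// so sub-microsecond digits in an Instant must either be truncated
// (lenient, the legacy behavior) or rejected (strict).
public class MicroTimestamps {
    static long lenientMicros(Instant i) {
        // Truncates any sub-microsecond digits.
        return ChronoUnit.MICROS.between(Instant.EPOCH, i);
    }

    static long strictMicros(Instant i) {
        if (i.getNano() % 1_000 != 0) {
            throw new IllegalArgumentException("cannot convert without precision loss: " + i);
        }
        return lenientMicros(i);
    }

    public static void main(String[] args) {
        Instant t = Instant.parse("1970-01-01T00:00:00.000001500Z");
        System.out.println(lenientMicros(t)); // 1 (the trailing 500ns are dropped)
        // strictMicros(t) would throw: 500ns cannot be represented in microseconds.
    }
}
```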
* exerciseArchive(Archive) was skipped before by the flattening rules
because Archive is defined in a separate package. By flattening across
packages in this PR, Archive is now a valid candidate.
* The new Exercises.Archive interface is a superinterface of most
generated exercise* method sets, and therefore generated ContractIds,
CreateAnds, and ByKeys, giving a LUB to different contract types that
contains the exerciseArchive method.
- This is most, not all, because the Daml-LF rules determine what
codegen ought to do, and Daml-LF does not mandate an Archive method
of any kind, never mind one in the exact shape that we happen to
generate.
- If an Archive is present, and it has an empty record as its
parameter and Unit as its result type, then we can safely assume
that the exerciseArchive method is sensible to include in the type.
- However, this still excludes the generic ContractId type, because
generic ContractId does not have enough data to generate a
Daml-LF-correct Archive command given the unknowns mentioned
above. So we simply exclude it from the type instead of guessing the
argument and hoping the guess is close enough. That's why the
Exercises.Archive type is available to users who want access to the
method, which is a supertype of codegenned contract IDs among other
things mentioned above:
```
SimpleTemplate.ContractId cid = new SimpleTemplate.ContractId("id");
Exercises.Archive<?> wideCid = cid;
assertEquals(
    wideCid.exerciseArchive().commands(),
    cid.exerciseArchive(new Archive()).commands());
```
* remove sandbox-on-x project
update bazel readme
update release artifacts
comment out last remaining SoX test
remove ledger-runner-common
remove participant-state-kv-errors
remove recovering-indexer-integration-tests
remove participant-integration-api
update doc pages
cleanup ledger-api-auth usage
remove participant-state
fix build
fix build
clean up ledger-api-common part I
clean up ledger-api-common part II
clean up ledger-api-common part III
remove ledger/metrics
clean up ledger-api-health and ledger-api-domain
* remove ledger-configuration and ledger-offset
* remove ledger-grpc and clean up participant-local-store
* reshuffle few more classes
* format