* Resolve release version for sdk build checks
* lint
* lint
run-full-compat: true
* Simple test for using daml-script in release versions
* Fix build issues in tests using pSdkVersion
run-full-compat: true
* Fix build issues with DamlcIntegration
* fix bad sdk version being an invalid version
run-full-compat: true
* Fix the linux "mmap 4096 bytes at (nil): Cannot allocate memory" error
* Fix compat tests on Windows
run-full-compat: true
* test windows os correctly
run-full-compat: true
* temporarily disable canton_3x
run-full-compat: true
---------
Co-authored-by: Paul Brauner <paul.brauner@digitalasset.com>
Basically we remove the dependency of different components on "//language-support/scala/bindings" by:
- replacing com.daml.ledger.api.refinements.ApiTypes.Party by com.daml.lf.data.Ref.Party
- replacing com.daml.ledger.api.refinements.ApiTypes.ApplicationId by Option[com.daml.lf.data.Ref.ApplicationId] (we use Option here because ApiTypes.ApplicationId allows the empty string while Ref.ApplicationId does not).
- adding rounding logic for timestamps in com.daml.lf.data.Time.Timestamp and using it instead of the one from com.daml.api.util.TimestampConversion
Note we did not clean up the daml-script export, as it never passed the alpha stage and will be dropped with the 3.x fork.
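The Option-based replacement of ApplicationId can be sketched as follows. This is a hedged illustration only: both ID types are modelled as plain strings (the real ones are refined/wrapped types), and Java's `Optional` stands in for Scala's `Option`. The point is just the empty-string corner case that motivates the wrapper.

```java
import java.util.Optional;

// Hypothetical stand-in conversion: ApiTypes.ApplicationId admits the empty
// string, Ref.ApplicationId does not, so the empty string maps to an absent
// value and everything else to a present one.
class ApplicationIds {
    static Optional<String> toRefApplicationId(String apiApplicationId) {
        return apiApplicationId.isEmpty()
            ? Optional.empty()
            : Optional.of(apiApplicationId);
    }
}
```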
ContractMetadata is deprecated and has been replaced with an opaque byte array, in a field called `created_event_blob` in the API protos, and a column called `metadata` in Scribe. I don't think we need a wrapper type for this data. When reading out of Scribe, we'd typically be using it to build a `DisclosedContract`, and we have a Java type for that, so I think that would be the anchor we'd use when reading.
* ensure we java-sanitise the names of type variables when building identifiers for JSON decoding and encoding these types
* avoid name clashes (obscuring) when the package-qualified name of a type begins with the same identifier as a field name
* initial implementation supporting version splitting in assistant
stubbed out implementations of resolveReleaseVersion/other resolution
* implement resolveReleaseVersionFromGithub
* continue work on fixing SDK/release version split
* First commit that successfully runs `daml-head install <split version>`
* fix tests
* Ignore snapshot/metadata for isHeadVersion
* remove log debugging
* Fix tests for getSdk
* refactor DA.Daml.Project.Types.defaultSdkPath
* enable incremental changes to version cache
* Allow resolveReleaseVersionFromGithub to fail via Either
* Split getSdkVersionFromSdkPath into get{Sdk,Release}VersionFromSdkPath
* Add resolveReleaseVersionFromDamlPath/Github to resolveReleaseVersion
* Add mock sdk config with version
* Remove getInstalledSdkVersions dependency on Cache
* add override for github version api endpoint, useful for mocking
* Add alternate-download to provide alternate tarball install resolution
* initial work on comprehensive autotester
* Copyright header
* fix build generation to have valid Main.daml
* improve error message for check_daml_install_nonzero
* Make killing miniserve processes more robust
* remove breakpoint
* Implement `daml build` tests
* Solve TODO in installExtracted to resolve sourceSdkVersion coherently
Supply useful error message with it.
* Return SdkVersion from sdkVersionFromReleaseVersion for typechecking
* Finish tests for tarball build, drop installed_already_behaviour
* Try to force-reload cache on tar failure, fix `Possible fix:` notes
* Add AllowInstallNonRelease as a flag
* Test allow_nonrelease, refactor, clean up setup_sandbox
* Move no_cache_override_github_endpoint API response into file
* Remove dead code, vestigial code
* Remove TODOs that no longer are relevant
* fix iAllowInstallNonRelease in InstallOptions for autoInstall
* Refactor alternateVersionLocation
* (Try to) lint language-support/ts/codegen/BUILD.bazel
* fix daml-assistant/test
* Enable allow_nonrelease post-build checks
* Remove writeFile debugging from DA.Daml.Assistant.Version
* Replace ../test-daml-yaml-install/test.sh with test-all-installs.sh
* lint
* more lint
run-full-compat: true
* Stop tracking API response in test-daml-yaml-install
* Move ReleaseResolution into Version
* remove extra deps, fixes daml-project-config-cabal-file-matches
* Move InstallLocation to minimize changes to DA.Daml.Project.Types
run-full-compat: true
* Create cachePath in daml-assistant tests that need it
* Bind to unused ports in test-all-installs.sh
run-full-compat: true
* Override via files instead of just URLs
run-full-compat: true
* Remove unused RELEASES_ENDPOINT
* Use dict keys instead of hardcoding in test-all-installs listing
* Refactor to remove check_daml_install_from_tarball_after_cache_reload
* Improve comment on update_cache
* Move shift before cases
* Add comment for unsafeParseReleaseVersion
* Rename unsafeParseReleaseVersion into unsafeParseOldReleaseVersion
* remove done todo
* Add some hungarian notation to resolveReleaseVersionFromDamlPath
* drop redundant let
* Define ordering over ReleaseVersion
run-full-compat: true
* use sdkVersion for codegen
* Use `urls` attribute in http_file, `url` is unsupported on Windows (!)
run-full-compat: true
* Remove unnecessary check for cache reload
run-full-compat: true
* Try use daml executable directly without daml-sdk-0.0.0
run-full-compat: true
* try force daml.exe to daml
run-full-compat: true
* Use daml.exe when windows is detected
* add windows tarballs for snapshots
* Fix most tests on windows, "line too long" breaks some tarball tests
run-full-compat: true
* Point to more recent snapshot post Moises's fixes
* Add os-specific tarball paths and alternate-download
run-full-compat: true
* Fix windows autoinstall with 0.0.0
run-full-compat: true
* Fix error message, remove daml_install_output catching
* Detect "The input line is too long" in other post_failed commands
* Fix missing releases endpoint, handle "cannot find the path specified"
run-full-compat: true
* Automated renames by bash script
This commit exclusively contains changes made by the bash script.
The bash script itself is present in the pull request.
* Manual pekko migration changes
* adapt fully qualified name references
* adapt pekko package declarations
* adapt bazel files with dependency changes
* adapt canton pekko lib shade_rule
* adapt logger configuration declarations
* pin maven dependencies
* revert incorrect changes by script to compatibility module
Workarounds for further TODOs:
* disable http-json-perf and libs-scala/gatling-utils modules to maintain clean pekko dependencies (without akka)
* disable GraphQLSchemaSpec test (sangria library needs to be upgraded)
* Formatting
* Make `TransactionFilter.toProto` public
We are in the process of enabling the usage of the Java bindings from the
Canton Console. In order to do so in a backward-compatible fashion, we
often rely on existing Scala code and use the Protobuf representation as
a bridge between the two. Unfortunately the fact that the `toProto` method
on `TransactionFilter`s is package-private forced us to temporarily add
a conversion method located within the package that re-exported it publicly
for usage as part of the Canton Console. This change should enable us to
remove this workaround. I would be happy to hear other possible approaches
to consider.
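The bridge pattern described above can be sketched as follows, with hypothetical stand-in types in place of the real Scala-bindings filter, its protobuf message, and the Java-bindings equivalent; only the shape of the Scala -> proto -> Java conversion chain is the point.

```java
// Stand-in for the protobuf message that acts as the bridge representation.
class ProtoTransactionFilter {
    final String party;
    ProtoTransactionFilter(String party) { this.party = party; }
}

// Stand-in for the Scala bindings' filter type. In the real bindings toProto
// was package-private; making it public removes the need for a re-exporting
// shim located inside that package.
class ScalaTransactionFilter {
    final String party;
    ScalaTransactionFilter(String party) { this.party = party; }
    public ProtoTransactionFilter toProto() {
        return new ProtoTransactionFilter(party);
    }
}

// Stand-in for the Java bindings' filter type, built from the proto form.
class JavaTransactionFilter {
    final String party;
    JavaTransactionFilter(String party) { this.party = party; }
    static JavaTransactionFilter fromProto(ProtoTransactionFilter p) {
        return new JavaTransactionFilter(p.party);
    }
}
```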
* Update subclass
* pin dependencies to json and add missing dep
* fix cyclic dep
* remove unused dep
* add missing dep to //ledger-api/testing-utils:testing-utils
* remove unused dep in //ledger/ledger-api-auth:ledger-api-auth
* remove more unused deps
* more dep fixes
* yet more dep fixing
* more fixing..
* more of the same
* hopefully the last deps to fix
* Bump the version of protobuf and fix everything that depends on it. Took shortcuts that I need to fix in a next commit, but would like to run the CI on this now that it compiles
* don't error out in the grpc-haskell patch
* remove obsolete patch
* patch absl to compile on mingw
* Add a patch to recognize the compiler
* Define _DNS_SD_LIBDISPATCH for macOS gRPC
* bump netty_tcnative_version according to https://github.com/grpc/grpc-java/blob/master/SECURITY.md#netty
* pin maven deps
* Fix macos linking errors 'dyld[xxx]: missing symbol called'
* Skip Darwin frameworks in package-app.sh
* pin stackage packages
* pin stackage windows deps
* use the netty version agreed on
* bump the windows global cache to try and debug the upb issue
* restart the CI after timeout
* clean up
* disable failing tests for now
* comment out unused code
* reset the windows machine name to 'default'
---------
Co-authored-by: Moisés Ackerman <6054733+akrmn@users.noreply.github.com>
* Adapt JSON API write path to the new explicit disclosure Ledger API interface
* Address review comments
* Switch to vanilla Base64 for createdEventBlob instead of Base64Url
* update TypeScript bindings of DisclosedContract to use the createdEventBlob field instead of payload, payloadBlob and metadata, and short-circuit a test which depends on canton populating the createdEventBlob field
* get TypeScript integration tests to use transaction service to get created_event_blob data
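The switch from Base64Url to vanilla Base64 changes which characters can appear in an encoded `createdEventBlob`: alphabet indices 62 and 63 encode as `+` and `/` in vanilla Base64 but as `-` and `_` in Base64Url. A small sketch using `java.util.Base64` (the byte values are illustrative, chosen so the two alphabets visibly differ):

```java
import java.util.Base64;

class BlobEncoding {
    // Three bytes whose 6-bit groups are all 62 or 63.
    static final byte[] BLOB = {(byte) 0xfb, (byte) 0xef, (byte) 0xff};

    static String vanilla() { return Base64.getEncoder().encodeToString(BLOB); }
    static String url()     { return Base64.getUrlEncoder().encodeToString(BLOB); }
}
```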
---------
Fixes https://github.com/digital-asset/daml/issues/17692
Fixes the error from the `jsonEncoder` method (which was about `javac` mis-interpreting a method reference), and the warnings in `toValue`, `valueDecoder` and the initialiser for `__enums$`, which did not need to qualify the static identifiers by class name.
* replace community canton snapshot with our bazel-built canton deploy jar
* add VERSION to the resources of the jar
* fix canton-test-runner-with-dependencies-script
* add resources to community_common
* Update test-common/canton/it-lib/src/main/com/daml/CantonFixture.scala
Co-authored-by: Remy <remy.haemmerle@daml.com>
---------
Co-authored-by: Remy <remy.haemmerle@daml.com>
To avoid excess duplicate code-gen, I put the implementation of `toJson` into the parent interface `DefinedDataType`, and leverage the `jsonEncoder()` in each generated subclass. However custom datatypes with type parameters have a different signature of `toJson(...)` as they need to pass in arguments for encoders for each type arg, so we generate a custom `toJson(...)` for those classes. I also do a static import of the `JsonLfEncoder.apply` method, as a useful uniform syntax for calling either `Function` objects or method references (e.g. `JsonLfEncoder::bool`). That required propagating the static imports up from the various methods that need them, and adding them at the `JavaFile.Builder` level.
I changed all the integration tests that did `fromJson` from a hard-coded string, into round-trip tests a la `Foo.fromJson(foo.toJson())`, now that we have `toJson`.
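The round-trip style can be sketched with a hypothetical stand-in for a generated class. The real generated classes expose a `toJson`/`fromJson` pair backed by `JsonLfEncoders`/`JsonLfDecoders`; the hand-rolled string handling below is purely illustrative.

```java
// Hypothetical stand-in for a codegen-generated data class.
class Foo {
    final long x;
    Foo(long x) { this.x = x; }

    public String toJson() { return "{\"x\":" + x + "}"; }

    public static Foo fromJson(String json) {
        // Accepts only the exact shape produced by toJson above.
        String digits = json.substring("{\"x\":".length(), json.length() - 1);
        return new Foo(Long.parseLong(digits));
    }
}
```

A round-trip check then reads a la `Foo.fromJson(foo.toJson())`, asserting the decoded value equals the original.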
Also add a benchmark, which can be run with
```
bazel run //language-support/java/codegen:to-json-bench
```
Locally I currently get these results.
```
# Run complete. Total time: 00:00:56
Benchmark                          Mode  Cnt         Score        Error  Units
ToJsonBench.enummodBox            thrpt   10   8138043.644 ± 197223.048  ops/s
ToJsonBench.enummodColoredTree    thrpt   10   3132500.885 ±  23677.832  ops/s
ToJsonBench.enummodOptionalColor  thrpt   10  10762883.904 ± 116865.102  ops/s
ToJsonBench.genmapmodBox          thrpt   10   1554965.578 ±   4956.210  ops/s
```
The `Score` is the number of times the sample values can be converted to JSON per second.
* Explicit disclosure based on blobs only
* cosmetic changes
* Changes post-review
* remove buf check suppression
* Silence deprecation warnings
* more silencing of deprecation warnings
* Changes after recent round of reviews
Mostly mechanical change, pulling nested static class `JsonLfReader.Decoders` out into its own class `JsonLfDecoders`.
Previously the nested class could access private members of `JsonLfReader`. I've now made the relevant methods package-accessible, while the genuinely private components, such as anything from the Jackson library, remain properly private. `currentText()` becomes the single way for `JsonLfDecoders` to read the current value.
This change will mirror what we'll then do with the upcoming `JsonLfEncoders`.
* Support record up/downgrades in Java codegen
This adds enough upgrading support to the Java codegen to use it
against the current upgrading PoCs. This is backwards compatible so I
enabled it in all cases instead of trying to add a flag somewhere.
Also rename the private `SubmitCommandsRequest.toProto` method which is used by the deprecated but public `toProto` overloads, to `deprecatedToProto`. The passing of disclosed contracts is not supported by these deprecated overloads.
This PR does not provide facilities for building a `DisclosedContract`, either as read from Scribe or from whatever format it may have been published or shared.
* initial commit
* split gen-stable-packages-v_i into two targets
* rename compatibleWith to canDependOn
* update damlc integration tests annotation to allow for 2.x
* use the right package ID for preconditionFailed when compiling to V2
* fuse stable-packages-v1 and stable-packages-v2 into one single filegroup
* Fix DA.Daml.LF.Ast.Tests
* remove leftover 'undefined' in Version.hs
* progress on fixing DataDependencies.hs
* fix Cross-SDK typeclasses test for 2.dev
* Fix the 'Typeclasses and instances' data dependency test
* Update comment
* fix //compiler/damlc/tests:packaging
* Add TODO
* parameterize the machine by the language version, hardcode v1 in script v2, v2 in script v2, v1 in exports
* get EngineTests to pass
* fix more tests
* fix canton integration tests
* formatting
* fix more tests
* fix transactionversiontest
* fix exceptiontest
* Fix ValueEnricherSpec
* Fix EngineInfoTest
* fix PartialTransactionSpec
* fix upgradetest
* fix TransactionSnapshot
* Fix ContractKeySpec
* Fix ReinterpretTest
* fix InterfaceViewSpec
* fix InterfacesTest
* fix stable package v1 names
* fix validate.sh tests
* formatting
* Fix ChoiceAuthorityTest
* fix explicit disclosure test
* Fix SpeedyTest
* formatting
* Fix integration test
* fix data dependency tests
* fix package vetting count, increased due to metadata being added
* Redact stable package IDs in error messages in order for the ExceptionSemantics test to work for both v1 and v2
* cleanup
* fix Daml2ScriptTestRunner
* fix JsonApiIT and daml-script golden tests
* fix daml3-script runner test
* enable v2 for all integration tests
* formatting
* fix NodeSeedsTest
* fix since-lf annotations
* add comments, improve consistency
* stop hardcoding V1 in runPureExpr and runPureSExpr
* formatting
* remove hardcoding of LFv1 in ConcurrentCompiledPackages.apply
* Parameterize Compiler.Config.Default with major language version
* remove global parser implicit and default package ID and language version