This test only worked by chance since in older JDK versions
java.time.Instant.now() didn’t have nanosecond precision. As
evidenced by the test after this one, nanoseconds are lost during a
roundtrip, so this test breaks on newer JDK versions that increased the
precision. See https://bugs.openjdk.java.net/browse/JDK-8068730 for
more information.
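For context, a minimal sketch of the difference (the truncation at the end
is just one way a test could be made independent of the platform's clock
precision, assuming microsecond-precision timestamps on the wire; it is not
necessarily what this change does):

    import java.time.Instant;
    import java.time.temporal.ChronoUnit;

    public class InstantPrecision {
        public static void main(String[] args) {
            Instant now = Instant.now();
            // JDK 8 typically reports millisecond precision here; JDK 9+ can
            // carry micro-/nanoseconds, which is what breaks the roundtrip test.
            System.out.println(now.getNano());
            // One way to make a comparison precision-independent: truncate first.
            System.out.println(now.truncatedTo(ChronoUnit.MICROS));
        }
    }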
If a variant itself is not serializable, but the synthesized record for
one of its constructors is, then the interface reader returns that
record in the set of type declarations even though the variant type
itself is not included.
When constructing the InterfaceTree in preparation for codegen, we
previously rejected such a situation.
We now generate Java code for such a synthesized record, as this is a
more generally correct way of interpreting DAML-LF (for example, the
DAML compiler could decide tomorrow to create such multi-component
record names for regular records).
In any case, we consider this an edge case, as the synthesized record
for the variant constructor cannot be used directly from either DAML or
the Ledger API.
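For illustration only, a hypothetical shape of such a generated class (class
and field names are invented here; the actual generated code may differ):

    // Hypothetical synthesized record for a constructor Bar of a variant Foo,
    // i.e. the record type the interface reader reports even when Foo itself
    // is not serializable. Field names are invented for this sketch.
    public final class Bar {
        public final Long count;
        public final String note;

        public Bar(Long count, String note) {
            this.count = count;
            this.note = note;
        }
    }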
Submitting a command via the CommandService now returns either the
transaction id (SubmitAndWaitForTransactionIdResponse), the flat
transaction (SubmitAndWaitForTransactionResponse), or the transaction
tree (SubmitAndWaitForTransactionTreeResponse).
This means that users don't have to wade through the transaction stream
to retrieve the resulting transaction. This is particularly useful in
combination with #479.
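Roughly, client code can now pick the result it needs from a single call; a
sketch, assuming the generated Java gRPC stubs follow the usual protoc
naming (the request/response and stub type names below are assumptions, not
verbatim API):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class SubmitAndWaitSketch {
        // request carries the commands to submit; the request/response and stub
        // types are assumed to come from the generated Ledger API classes.
        static void demo(SubmitAndWaitRequest request) {
            ManagedChannel channel =
                ManagedChannelBuilder.forAddress("localhost", 6865).usePlaintext().build();
            CommandServiceGrpc.CommandServiceBlockingStub commandService =
                CommandServiceGrpc.newBlockingStub(channel);

            // Just the transaction id:
            SubmitAndWaitForTransactionIdResponse byId =
                commandService.submitAndWaitForTransactionId(request);
            // The flat transaction:
            SubmitAndWaitForTransactionResponse flat =
                commandService.submitAndWaitForTransaction(request);
            // The full transaction tree:
            SubmitAndWaitForTransactionTreeResponse tree =
                commandService.submitAndWaitForTransactionTree(request);
        }
    }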
Fixes #406
This change is needed in preparation for #406, where we want to return a
transaction tree and flat transaction after a SubmitAndWaitForTransaction(Tree).
* Drop DAML-LF 1.0 support from compiler
This will enable us to add `Functor`, `Applicative` and `Monad` instances
for `(->) r` in the `daml-stdlib`. We'll do this in a separate PR.
* Remove codegen test for DAML-LF 1.0
Also move Interface and InterfaceType out of the reader subpackage; they
belong with the rest of the data model at the iface root.
The specific mechanics of reading a Dar all the way to producing an
EnvironmentInterface are left to Scala codegen's Codegen and Java
codegen's CodeGenRunner; there's no consensus or great stability on the
best way to tie these pieces together, but all the pieces might as well
be available in the interface library at least.
Inspired by a query on Slack by @leonelag regarding reading the
codegen-relevant parts of dalfs and dars; thanks!
* daml-lf: move EnvironmentInterface to interface library from Scala codegen
* daml-lf: move Interface out of reader subpackage
* language-support/java: deal with moving Interface out of reader subpackage
* Document Interface and EnvironmentInterface
* Add missed copyright header in reader package.scala
* extractor: deal with moving Interface out of reader subpackage
* navigator: deal with moving Interface out of reader subpackage
* Extract codegen-common module, #166
* Scala Codegen Main using the same option parser as Java Codegen, #166
There is one important difference: Scala Codegen does not allow mapping
dars to different package names; all dars have to be mapped to the same
package name.
Replace Scala Codegen println calls with scala-logging, respecting the
configured codegen verbosity.
* Fix bazel formatting
* Update the release dry run script
* Releasing codegen-common
* Improving Scala Codegen error reporting (code review)
* Addressing code review comments
* Make it explicit that we skip unsupported options
* Add release notes entry
* Add CreateAndExercise command to Java Bindings data layer
* Add CreateAndExercise command to DAMLe
The CreateAndExerciseCommand allows users to create a contract and
exercise a choice on it within the same transaction. Users can use this
method to implement "callable update functions" by creating a template
that calls the update function in a choice body.
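A sketch of what that looks like from the Java bindings data layer (the
package and constructor shape are assumptions mirroring the existing
CreateCommand/ExerciseCommand classes, and "Run" is a hypothetical choice
name):

    import com.daml.ledger.javaapi.data.Command;
    import com.daml.ledger.javaapi.data.CreateAndExerciseCommand;
    import com.daml.ledger.javaapi.data.Identifier;
    import com.daml.ledger.javaapi.data.Record;
    import com.daml.ledger.javaapi.data.Value;

    public class CallableUpdateSketch {
        // Creates a contract and exercises the (hypothetical) "Run" choice on it
        // within the same transaction, so the choice body acts as a callable
        // update function.
        static Command callUpdate(Identifier templateId,
                                  Record createArguments,
                                  Value choiceArgument) {
            return new CreateAndExerciseCommand(
                templateId, createArguments, "Run", choiceArgument);
        }
    }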
Fixes #382.
* Add CreateAndExercise command handling to the sandbox
* Add CreateAndExercise command to the Ledger API
The call to blockingGet can lead to no progress being made in certain
scenarios. Therefore I am removing the blocking call and replacing it
with a regular "doOnSuccess".
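The shape of the change, roughly (a sketch against an RxJava Single; the
surrounding names are illustrative, not the actual bindings code):

    import io.reactivex.Single;

    public class NonBlockingSketch {
        static void handle(Single<String> result) {
            // Before: result.blockingGet() parked the calling thread, which in
            // some scenarios meant no progress could be made at all.
            // After: register a callback instead of blocking.
            result.doOnSuccess(NonBlockingSketch::process).subscribe();
        }

        private static void process(String value) {
            System.out.println("got " + value);
        }
    }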
* Add buildifier targets.
The tool allows checking and formatting of BUILD files in the repo.
To check if files are well formatted, run:
bazel run //:buildifier
To fix badly-formatted files run:
bazel run //:buildifier-fix
* Cleanup dade-copyright-headers formatting.
* Fix dade-copyright-headers on files with just the copyright.
* Run buildifier automatically on CI via 'fmt.sh'.
* Reformat all BUILD files with buildifier.
Excludes autogenerated Bazel files.
* Fetch status.proto from remote, simplify JS gRPC codegen
Fetch the `status.proto` file (part of the standard gRPC distribution)
from a distribution channel. Moreover, use the recently introduced
`proto_gen` rule to simplify how the gRPC code for the Node.js bindings
is generated (and remove the need to have `google/rpc/status.proto`
locally in the repository).
* Add plugin_runfiles option to proto_gen
This allows us to add additional files to the bazel sandbox so that
plugins can refer to them. This will subsequently be used by the
protoc-gen-doc plugin.
Also, pass the plugin options via the --name_opt parameter.
* Add missing status.proto dependency to /language-support/java and /ledger
* Build proto docs using the proto_gen rule
To make this work, I had to turn on the bazel build flag
`--protocopt=--include_source_info` because we cannot enable this flag
only for specific build rules.
* Make /ledger-api/grpc-definitions:docs public again
* Revert to the old style of passing plugin arguments to --name_out=options:path
* Suppress output of unzipping
* Fix link for google.rpc.Status in proto-docs
Allows users to supply the argument -V or --verbosity with a number from 0 to 4 for additional logging.
Also, the first and last log messages are logged as warnings, when they
really should just be at INFO level.
changing ports to use 6865 everywhere
no need for default ports
addressing missed outdated ports
changed more ports from 7600 to 6865
dealt with more 8080s