The package compiled via daml migrate was missing the actual
Upgrade/Rollback templates. This is because we used `type` instead of
`template instance` to define those templates. Also, apparently we need
to export UpgradeInstance/RollbackInstance from DA.Upgrade in the
standard library.
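For illustration, a minimal sketch of the difference, with hypothetical template names and assuming the generic `Upgrade` template is parameterized over the old and new template types:
```
-- A plain type synonym does not produce a template in the compiled package:
-- type UpgradeMyTemplate = Upgrade MyTemplateV1 MyTemplateV2
-- A template instance does (names and type arguments are illustrative):
template instance UpgradeMyTemplate = Upgrade MyTemplateV1 MyTemplateV2
```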
* Improve UX of generic templates over Ledger API
Currently, if you write
```
template Template t => Proposal t with
    receiver: Party
    asset: t
  where ...

template Iou with ...

template instance ProposalIou = Proposal Iou
```
you'll get the following DAML-LF types:
```
record Proposal t = { receiver : Party, asset : t }
record Iou = ...
record ProposalIou = { unpack : Proposal Iou }
```
The definition of `ProposalIou` is not particularly user friendly when used
over the Ledger API.
This PR changes the definition of `ProposalIou` to
```
record ProposalIou = { receiver : Party, asset : Iou }
```
Basically, the definition of `Proposal` is copied and `t` is instantiated
with `Iou`. This should make for a much nicer UX.
* Update documentation
* Add test
* Fix docs examples
* Fix release notes
* Add some disclaimers about scenarios in Sandbox
* Something on errors
* Individual let bindings per line
* Update docs/source/daml/intro/5_Restrictions.rst
Co-Authored-By: Martin Huschenbett <martin.huschenbett@posteo.me>
* filter out exception stack traces logged at debug
* Adding logback.xml for DAML Assistant release
* Fixing readme
* Fixing readme
* Use the config file in the assistant
* language: compile everything in the source directory
This removes the need to specify a 'main'. Instead, the 'source' field in
daml.yaml should point to the source root directory.
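A hypothetical daml.yaml fragment under this scheme (project name, version, SDK version, and paths are made up):
```
# daml.yaml (illustrative): no 'main' entry; 'source' points at the
# directory whose contents are all compiled into the package.
sdk-version: 0.13.16
name: my-project
version: 1.0.0
source: daml
```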
* language: use generic templates for upgrades
Instead of generating templates, we now have generic upgrade/rollback
templates in the standard library and we just create instances of them in
the migration project. I will update the documentation in a follow-up PR.
Let's keep the correct help text in the docs until we know what to do
about MANIFEST and whether we can drop the SOURCE argument from the
command.
* language: docs: documentation for the migrate command of damlc
This adds a section on how to migrate between two different versions of
a package.
* addressing Martin's comments
* addressing Bernhard's comments
* Ledger topologies: first draft
* Participant node definition
* Regroup headings
* Marcin's comments
* Second attempt after discussion with Brian H
* Change the topic order in the navigation sidebar
* Apply suggestions from code review
Co-Authored-By: Shaul Kfir <shaul-da@users.noreply.github.com>
* Shaul's/Bernhard's comments
* Clarify the interoperability statement
* Bernhard's comments
* language: append the version to the output dar name by default.
We now output foo-1.0.0.dar by default instead of just foo.dar. Also, the
default naming based on Maven coordinates was removed.
* fixing integration tests and quickstart.dar occurrences
This fixes all flakiness in `damlc test` that I was able to
reproduce. Previously, I got it to fail in about 10% of the cases
whereas now I have successfully run tests 200 times under load without
issues.
There were two issues at play here:
1. We run scenarios in separate threads to be able to kill the running
Shake session quickly even if a scenario has an infinite loop or
something like that (there is a timeout but it’s quite long). This
could result in one of those left-over threads trying to issue a
request while we are already trying to shut down.
To fix that, we wait for the concurrency semaphore to be empty before
shutting down.
2. Just waiting for scenario executions is not quite sufficient as
`runAction` does not wait for all rules to finish (we could just use
runActionSync in `damlc test` but I’d rather make this work
properly). While we do wait for all scenario executions to finish,
there is one gRPC request in offInterest that we do not wait for:
gcCtxs.
To fix this, I’ve now routed all gRPC requests through the semaphore
which means that we will also wait for these requests to finish (or
prevent them from spawning).
This makes more sense anyway: scenario executions are mostly fairly
cheap requests, while things like setting up the context are expensive,
so we want to limit their concurrency.
We should make the concurrency limit configurable but I’ll leave that
for a separate PR.
* language: docs: added package import documentation
This adds a chapter to the documentation on DAML archives and how to
import them into other projects.
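For example, a consuming project would presumably reference the archive from its daml.yaml, roughly like this (the .dar path is made up):
```
# daml.yaml of the importing project (illustrative):
dependencies:
  - daml-prim
  - daml-stdlib
  - path/to/my-library-1.0.0.dar
```
Modules from the referenced archive can then be imported from DAML code like local ones.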
* Update docs/source/daml/reference/packages.rst
Co-Authored-By: Moritz Kiefer <moritz.kiefer@purelyfunctional.org>
* addressing comments
* removed random include in integration kit docs
* Cleanup
* WIP
* first integration test + fixture
* minor cleanup
* Implementing ContractService.lookup
* Reverting back to endpoints.all (all2 did not work)
* Cleanup
* replace ApiValue ADT with aliases to daml-lf/transaction Value ADT
* porting rest of navigator to LF Value ADT
* Command Service WIP
* CommandService WIP
* porting more of navigator to LF Value ADT
* last error, not first
* rename ApiValueImplicits file
* special conversion features for ImmArray and FrontStack
- just call .to[ImmArray] or .to[FrontStack] on any collection
* finish porting most of navigator main code
* use numeric indices for record field name fallback when pretty-printing
* tuples are not serializable
* use numeric indices for label fallback in JSON verbose encoding
* make traverseEitherStrictly more likely to preserve the seq's class
* .to shortcut for ImmArraySeq: .to[ImmArraySeq]
* compiling, passing navigator backend tests
* test traverseEitherStrictly more, er, strictly
* pass scalacopts through to scaladoc
* deal with unused warning
* remove unneeded function
* simpler error reporting, more private functions in ApiCodecCompressed
* move slowApply to FrontStack, test it so it actually works
* remove unneeded toStrings; better error from impossible ValueTuple case
* scalafmt FrontStackSpec
* support alternative, label-free record JSON encoding
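Assuming "label-free" here means that record fields are encoded positionally in declaration order, a hypothetical two-field record could then be accepted in either form (values made up):
```
{"receiver": "Alice", "amount": "10.0"}
["Alice", "10.0"]
```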
* Adding domain.CreateCommand + corresponding json formats and dummy json format for lav1.value.Record
* CommandService.create should be done... need to test it
* TODO added
* Cleanup
* move ApiCodecCompressed, ApiValueImplicits, and some aliases to new lf-value-json package
* Using tagged TemplateId type instead of Identifier + exercise command WIP
* adapt navigator to moved pieces
* start defining scalacheck extension to ApiCodecCompressedSpec
* CommandService.exercise + introducing CommandMeta
* Adding command endpoints, can't test them yet, need lf value json formats
* fuse some list operations
- suggested by @stefanobaghino-da; thanks
* blue error message
* Minor fixes after merging librify-navigator-json-compressed, #2136
* experiment with an inductive case in TypedValueGenerators
* finish a List case for TypedValueGenerators; it's revealing
* Introducing API value to LF value converter,
CommandsValidator takes IdentifierResolverLike instead of IdentifierResolver
* cleanup
* remove accidentally readded duplicate aliases
* start tying knots in TypedValueGenerators
* verbatim copy ApiCodecCompressedSpec to lf-value-json
* shift some tests from navigator to lf-value-json
* test Optional and Map for ApiCodecCompressed
* heavier random testing of ApiCodecCompressed
* remove unused dependencies from lf-value-json
* adding value json writer
* cleanup
* Revert "cleanup"
This reverts commit 2e4d153f
* fixing the build
* cleanup
* cleaning up imports
* JsValue to API value is done, needs a test
* cleanup
* use scalac -Ypartial-unification in http-json
* simplify some Traverse instances
* factor CreateCommand and ExerciseCommand traverse instances
* Command create integration test WIP
* Command create integration test WIP, got rid of the JsonReader and JsonWriter for the values, converting values explicitly
* Extracting DomainJsonDecoder and DomainJsonEncoder
* LfV refactoring
* Create command serialize/deserialize test works
* cleanup
* resolving conflicts
* More json encode/decode tests
* logging
* command/create passes integration test now
* Adding readme
* grammar
* TODO added
* GetActiveContractsResponse encoding
* identifier conversion renaming
* PackageService resolveTemplateId returns domain.TemplateId now
* Resolving LF Identifier instead of Template ID; this should also work for Exercise command decoding
* cleaning up a bit
* daml-lf: show type in TypedValueGenerators-driven errors
* exercise command json encoding/decoding works
* command/exercise IOU_Transfer integration test passes now
* avoid filter for Gens; makes many contract ID gens not fail
* test ApiCodecCompressed against 100 random types, 20 random values each
* Updating README instructions
* improving error handling: failed futures now get logged and reported to the user as 500
* [ROUTING DSL] Removing routing DSL, it did not work
* getting rid of HttpEntity.Strict match + cleanup
* fixing the merge conflict
* updating README
* use Show.shows instead of new Show
* List(_) isn't checked, but Seq(_) is slightly safer
* improving test assertions
* Adding /contracts/lookup implementation
* http-json: use ImmArraySeq instead of List; use toRightDisjunction
* http-json: .toList.toSet is shorter than fold
* http-json: replace .leftMap.map with .bimap
* http-json: use subst instead of reimplementing JsonFormat
* http-json: remove unused ExceptionHandler
* http-json: safer == comparison
* Adding two test cases for expected errors
* Adding BazelRunfiles.rlocation magic that is supposed to handle Windows paths for Bazel dependencies
* http-json: import, not extend
* Fix typo in README.md and stale link in CONTRIBUTING.md
* Fix minor typos in quickstart
* Fix various minor typos/mistakes in DAML Intro
* Make DAML code in quickstart-java example cleaner
Also update module hash as well as screenshots in docs