docs: new recommend path architecture documentation (#4795)

CHANGELOG_BEGIN
CHANGELOG_END
Robin Krom 2020-03-09 12:15:37 +01:00 committed by GitHub
parent e3c9f363bd
commit 0e046d9eca
22 changed files with 525 additions and 405 deletions


@ -35,16 +35,27 @@ Building applications
:titlesonly:
:maxdepth: 2
app-dev/index
daml-script/index
upgrade/index
app-dev/app-arch
app-dev/authentication
app-dev/ledger-api
app-dev/bindings-java/index
app-dev/bindings-scala/index
app-dev/bindings-js
app-dev/grpc/index
app-dev/bindings-x-lang/index
app-dev/app-arch
app-dev/authentication
DAML Script <daml-script/index>
upgrade/index
Deploying to DAML ledgers
-------------------------
.. toctree::
:titlesonly:
:maxdepth: 2
deploy/index
deploy/generic_ledger
deploy/ledger-topologies
SDK tools
---------
@ -67,14 +78,6 @@ Background concepts
concepts/glossary
concepts/ledger-model/index
Deploying
---------
.. toctree::
:titlesonly:
:maxdepth: 2
deploy/index
Examples
--------


@ -1,295 +1,124 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
Application architecture guide
#########################################
Application architecture
########################
This document is a guide to building applications that interact with a DA ledger deployment (the 'ledger'). It:
This section describes our recommended design of a full-stack DAML application.
- describes the characteristics of the ledger API, how this affects the way an application is built (the 'application architecture'), and why it is important to understand this when building applications
- describes the resources in the SDK to help with this task
- gives some guidelines to help you build correct, performant, and maintainable applications using all of the supported languages
.. image:: ./recommended_architecture.svg
Categories of application
*************************
The above image shows the recommended architecture. There are, of course, many ways to adapt the
architecture and technology stack to fit your needs; we mention alternatives in the corresponding
sections.
Applications that interact with the ledger normally fall into four categories:
To get started quickly with the recommended application architecture, clone the
``create-daml-app`` application template:
.. list-table:: Categories of application
:header-rows: 1
.. code-block:: bash
* - Category
- Receives transactions?
- Sends commands?
- Example
* - Source
- No
- Yes
- An injector that reads new contracts from a file and injects them into the system.
* - Sink
- Yes
- No
- A reader that pipes data from the ledger into an SQL database.
* - Automation
- Yes
- Yes, responding to transactions
- Automatic trade registration.
* - Interactive
- Yes (and displays to user)
- Yes, based on user input
- DA's :doc:`Navigator </tools/navigator/index>`, which lets you see and interact with the ledger
git clone https://github.com/digital-asset/create-daml-app
Additionally, applications can be written in two different styles:
``create-daml-app`` is a small but fully functional demo application implementing the recommended
architecture, providing you with an excellent starting point for your own application. It showcases:
- Event-driven - applications base their actions on individual ledger events only.
- State-driven - applications base their actions on some model of all contracts active on the ledger.
- using DAML React libraries
- quick iteration against the :ref:`DAML Ledger Sandbox <sandbox-manual>`
- authentication
- deploying your application in the cloud as a Docker container
Event-driven applications
=========================
Backend
~~~~~~~
**Event-driven** applications react to events on the ledger and generate commands and other outputs on a per-event basis. They do not require access to ledger state beyond the event they are reacting to.
The backend for your application can be any DAML ledger implementation running your DAR (:ref:`DAML
Archive <dar-file-dalf-file>`) file.
Examples are sink applications that read the ledger and dump events to an external store (e.g. an external (reporting) database).
We recommend using the :ref:`DAML JSON API <json-api>` as an interface to your frontend. It is
served by the HTTP JSON API server connected to the ledger API server. It provides simple HTTP
endpoints to interact with the ledger via GET/POST requests. However, if you prefer, you can also
use the :ref:`gRPC API <grpc>` directly.
State-driven applications
=========================
When you use the ``create-daml-app`` template application, you can start a local sandbox together
with a JSON API server by running
**State-driven** applications build up a real-time view of the ledger state by reading events and recording contract create and archive events. They then generate commands based on a given state, not just single events.
.. code-block:: bash
Examples of these are automation and interactive applications that let a user or code react to complex state on the ledger (e.g. the DA Navigator tool).
./daml-start.sh
Which approach to take
======================
in the root of the project. This is the simplest DAML ledger implementation. Once your
application matures and becomes ready for production, the ``daml deploy`` command helps you deploy
the frontend and DAML artifacts of your project to a production ledger. See :ref:`Deploying to DAML
Ledgers <deploy-ref_overview>` for an in-depth manual for specific ledgers.
For all except the simplest applications, we generally recommend the state-driven approach. State-driven applications are easier to reason about when determining correctness, so this makes design and implementation easier.
Frontend
~~~~~~~~
In practice, most applications are actually a mixture of the two styles, with one predominating. It is easier to add some event handling to a state-driven application, so it is better to start with that style.
We recommend building your frontend with the `React <https://reactjs.org>`_ framework. However,
you can choose virtually any language for your frontend and interact with the ledger via :ref:`HTTP
JSON <json-api>` endpoints. In addition, we provide support libraries for :ref:`Java
<java-bindings>` and :ref:`Scala <scala-bindings>` and you can also interact with the :ref:`gRPC API
<grpc>` directly.
Structuring an application
**************************
.. TODO (drsk) add and point to javascript bindings.
.. If you choose a different Javascript based frontend framework, the packages ``@daml/ledger``,
.. ``@daml/types`` and the generated ``@daml2ts`` package provide you with the necessary interface code
.. to connect and issue commands against your ledger.
Although applications that communicate with the ledger have many purposes, they generally have some common features, usually related to their style: event-driven or state-driven. This section describes these commonalities, and the major functions of each of these styles.
We provide two libraries to build your React frontend for a DAML application.
In particular, all applications need to handle the asynchronous nature of the ledger API. The most important consequence of this is that applications must be multi-threaded, because commands, transactions, and completions arrive on separate asynchronous streams.
+--------------------------------------------------------------+--------------------------------------------------------------------------+
| Name | Summary |
+==============================================================+==========================================================================+
| `@daml/react <https://www.npmjs.com/package/@daml/react>`_ | React hooks to query/create/exercise DAML contracts |
+--------------------------------------------------------------+--------------------------------------------------------------------------+
| `@daml/ledger <https://www.npmjs.com/package/@daml/ledger>`_ | DAML ledger object to connect and directly submit commands to the ledger |
+--------------------------------------------------------------+--------------------------------------------------------------------------+
Although you can choose to do this in several ways, from bare threads (such as a Java Thread) through thread libraries, generally the most effective way of handling this is by adopting a reactive architecture, often using a library such as `RxJava <https://github.com/ReactiveX/RxJava>`__.
You can install any of these libraries by running ``yarn add <library>`` in the ``ui`` directory of
your project, e.g. ``yarn add @daml/react``. Please explore the ``create-daml-app`` example project
to see the usage of these libraries.
All the language bindings support this reactive pattern as a fundamental requirement.
To make interacting with the ledger easier, the DAML assistant can generate TypeScript data
definitions corresponding to the data types declared in the deployed DAR.
.. _event-driven-applications-1:
.. code-block:: bash
Structuring event-driven applications
=====================================
daml codegen ts .daml/dist/<your-project-name.dar> -o daml-ts
Event-driven applications read a stream of transaction events from the ledger, and convert them to some other representation. This may be a record on a database, some update of a UI, or a differently formatted message that is sent to an upstream process. It may also be a command that transforms the ledger.
This command generates a TypeScript project in the ``daml-ts`` folder that needs to be connected
with your frontend code in ``ui``. To do so, navigate to ``daml-ts`` and run ``yarn install`` and
then ``yarn workspaces run build``.
The critical thing here is that each event is processed in isolation - the application does not need to keep any application-related state between each event. It is this that differentiates it from a state-driven application.
.. TODO (drsk) this process is changing right now, make sure it is documented up to date here.
To do this, the application should:
Authentication
~~~~~~~~~~~~~~
1. Create a connection to the Transaction Service, and instantiate a stream handler to handle the new event stream. By default, this will read events from the beginning of the ledger. This is usually not what is wanted, as it may replay already processed transactions. In this case, the application can request the stream from the current ledger end. This will, however, cause any events between the last read point and the current ledger end to be missed. If the application must start reading from the point it last stopped, it must record that point and explicitly restart the event stream from there.
When you deploy your application to a production ledger, you need to authenticate the identities of
your users.
2. Optionally, create a connection to the Command Submission Service to send any required commands back to the ledger.
DAML ledgers support a unified interface for authentication of commands. Some DAML ledgers, for
example https://projectdabl.com, offer an integrated authentication service, but you can also use an
external authentication provider such as https://auth0.com. The DAML React libraries support
interfacing with an authenticated DAML ledger: simply initialize your ``DamlLedger`` object with the
token obtained from an authentication service. How authentication works and the form of the
required tokens is described in the :ref:`Authentication <authentication>` section.
3. Act on the content of events (type, content) to perform any action required by the application e.g. writing a database record or generating and submitting a command.
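The per-event nature of this style can be sketched as follows. This is a hypothetical, self-contained TypeScript sketch: the event shape and the ``sink`` callback are stand-ins for illustration, not ledger API types.

```typescript
// Illustrative event shape: a create or archive event, processed in
// isolation with no state kept between events.
type LedgerEvent =
  | { kind: "created"; contractId: string; payload: unknown }
  | { kind: "archived"; contractId: string };

// An event-driven handler converts each event to some other
// representation (here, a pretend database statement) as it arrives.
function handleEvent(ev: LedgerEvent, sink: (row: string) => void): void {
  switch (ev.kind) {
    case "created":
      sink(`INSERT ${ev.contractId}`);
      break;
    case "archived":
      sink(`DELETE ${ev.contractId}`);
      break;
  }
}
```

Because each call depends only on its own event, the handler needs no memory of previous events, which is exactly what distinguishes this style from a state-driven application.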
Developer workflow
~~~~~~~~~~~~~~~~~~
.. _state-driven-applications-1:
The DAML SDK enables a local development environment with fast iteration cycles. If you run
``daml-reload-on-change.sh`` from the ``create-daml-app`` project, a local DAML sandbox ledger is
started and updated with your most recent DAML code on every change. Next, you can start your
frontend in development mode by changing to your ``ui`` directory and running ``yarn start``. This
reloads your frontend whenever you make changes to it. You can add unit tests for your DAML models
by writing :ref:`DAML scenarios <testing-using-scenarios>`; these are also re-evaluated on every
change. A typical DAML developer workflow is to
Structuring state-driven applications
=====================================
#. Make a small change to your DAML data model
#. Optionally, test your DAML code with :ref:`scenarios <testing-using-scenarios>`
#. Edit your React components to be aligned with changes made in DAML code
#. Extend the UI to make use of the newly introduced feature
#. Make further changes to your DAML and/or React code until you're happy with what you've developed
State-driven applications read a stream of events from the ledger, examine them, and build up an application-specific view of the ledger state based on each event's type and content. This involves storing some representation of existing contracts on a Create event, and removing them on an Archive event. To be able to remove a contract from the state, contracts are indexed by :ref:`contractId <com.digitalasset.ledger.api.v1.CreatedEvent.contract_id>`.
This is the most basic kind of update, but other types are also possible. For example, counting the number of a certain type of contract, and establishing relationships between contracts based on business-level keys.
The core of the application is then to write an algorithm that examines the overall state, and generates a set of commands to transform the ledger, based on that state.
If the result of this algorithm depends purely on the current ledger state (and not, for instance, on the event history), you should consider this as a pure function between ledger state and command set, and structure the design of an application accordingly. This is highlighted in the `language bindings <#application-libraries>`__.
To do this, the application should:
1. Obtain the initial state of the ledger by using the Active Contracts service, processing each event received to create an initial application state.
2. Create a connection to the Transaction Service to receive new events from that initial state, and instantiate a stream handler to process them.
3. Create a connection to the Command Submission Service to send commands.
4. Create a connection to the Command Completion Service, and set up a stream handler to handle completions.
5. Read the event stream and process each event to update its view of the ledger state.
To make accessing and examining this state easier, this often involves turning the generic description of created contracts into instances of structures (such as class instances) that are more appropriate for the language being used. This also allows the application to ignore contract data it does not need.
6. Examine the state at regular intervals (often after receiving and processing each transaction event) and send commands back to the ledger on significant changes.
7. Maintain a record of **pending contracts**: contracts that will be archived by these commands, but whose completion has not been received.
Because of the asynchronous nature of the API, these contracts will be archived on the ledger at some point after the command has been submitted, but the corresponding archive events will only arrive later. Until they do, the application must ensure that these **pending contracts** are not considered part of the usable application state. Processing and maintaining this pending set is a crucial part of a state-driven application.
8. Examine command completions, and handle any command errors. As well as application-defined needs (such as command re-submission and de-duplication), this must include handling command errors as described in `Common tasks <#common-tasks>`__, and must also consider the pending set. When an exercise command fails, the contracts it marked as pending will not be archived (the application will not receive archive events for them) and must be returned to the application state.
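The interplay between the active contract state and the pending set (steps 5 to 8 above) can be sketched with plain data structures. The class and method names below are illustrative only and are not part of any DAML library:

```typescript
type ContractId = string;

interface Contract {
  templateId: string;
  payload: unknown;
}

// A minimal application-side view of the ledger: active contracts
// indexed by contractId, plus a pending set for contracts whose
// archival has been requested but not yet confirmed.
class LedgerView {
  private active = new Map<ContractId, Contract>();
  private pending = new Set<ContractId>();

  // Create event: record the contract.
  onCreate(cid: ContractId, contract: Contract): void {
    this.active.set(cid, contract);
  }

  // Archive event: drop the contract and clear any pending mark.
  onArchive(cid: ContractId): void {
    this.active.delete(cid);
    this.pending.delete(cid);
  }

  // Before submitting an exercise, mark the input contract pending.
  markPending(cid: ContractId): void {
    this.pending.add(cid);
  }

  // A failed command means its contracts will not be archived after
  // all: return them to the usable state.
  onCommandFailure(cids: ContractId[]): void {
    for (const cid of cids) this.pending.delete(cid);
  }

  // The state the command-generating function sees excludes pending
  // contracts, even though they still exist on the ledger.
  usableContracts(): Map<ContractId, Contract> {
    const out = new Map<ContractId, Contract>();
    this.active.forEach((c, cid) => {
      if (!this.pending.has(cid)) out.set(cid, c);
    });
    return out;
  }
}
```

Keeping the pending set separate from the active set is what lets the command-generation logic treat its input as a stable snapshot.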
Common tasks
============
Both styles of application take the following steps:
- Define an **applicationId** - this identifies the application to the ledger server.
- Connect to the ledger (including handling authentication). This creates a client interface object that allows creation of the stream connection described in `Structuring an application <#structuring-an-application>`__.
- Handle execution errors. Because these are received asynchronously, the application will need to keep a record of commands in flight - those sent but not yet indicated complete (via an event). Correlate commands and completions via an application-defined :ref:`commandId <com.digitalasset.ledger.api.v1.Commands.command_id>`. Categorize different sets of commands with a :ref:`workflowId <com.digitalasset.ledger.api.v1.Commands.workflow_id>`.
- Handle lost commands. The ledger server does not guarantee that every command submitted to it will be executed, so a command submission may never produce a corresponding completion, and some other mechanism must be employed to detect this. This is done using the values of Ledger Effective Time (LET) and Maximum Record Time (MRT). The server does guarantee that if a command is executed, it is executed within the time window between the LET and MRT specified in the command submission. Since the ledger time at which a command was executed is returned with every completion, receiving a completion with a record time greater than the MRT of a pending command guarantees that the pending command will not be executed, and it can be considered lost.
- Have a policy regarding command resubmission. In what situations should failing commands be re-submitted? Duplicate commands must be avoided in some situations - what state must be kept to implement this?
- Access auxiliary services such as the time service and package service. The `time service <#time-service>`__ is used to determine the Ledger Effective Time value for command submission, and the package service is used to determine the packageId, used in creating a connection, as well as metadata that allows creation events to be turned into application domain objects.
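The lost-command detection described above can be sketched as follows. The names are hypothetical, and record times are plain numbers here rather than real ledger timestamps:

```typescript
// A command in flight: submitted, but with no completion received yet.
interface InFlight {
  commandId: string;
  maxRecordTime: number; // the MRT the command was submitted with
}

// Tracks in-flight commands and reports the ones guaranteed lost.
class CommandTracker {
  private inFlight = new Map<string, InFlight>();

  submitted(commandId: string, maxRecordTime: number): void {
    this.inFlight.set(commandId, { commandId, maxRecordTime });
  }

  completed(commandId: string): void {
    this.inFlight.delete(commandId);
  }

  // Any pending command whose MRT is below an observed completion
  // record time is guaranteed not to execute: report it as lost.
  lostAt(observedRecordTime: number): string[] {
    const lost: string[] = [];
    this.inFlight.forEach((c, cid) => {
      if (c.maxRecordTime < observedRecordTime) {
        lost.push(cid);
        this.inFlight.delete(cid);
      }
    });
    return lost;
  }
}
```

An application would call ``lostAt`` with the record time of every completion it receives, then apply its resubmission policy to the returned command IDs.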
Application libraries
*********************
We provide several libraries and tools that support the task of building applications. Some of this is provided by the API itself (e.g. the Active Contracts Service), but most is provided by the language binding libraries.
Java
====
The Java API bindings have three levels:
- A low-level Data Layer, including Java classes generated from the gRPC protocol definition files and a thin layer of support classes. These provide a builder pattern for constructing protocol items, and blocking and non-blocking interfaces for sending and receiving requests and responses.
- A Reactive Streams interface, exposing all API endpoints as `RxJava <https://github.com/ReactiveX/RxJava>`__ `Flowables <http://reactivex.io/RxJava/javadoc/io/reactivex/Flowable.html>`__.
- A Reactive Components API that uses the above to provide high-level facilities for building state-driven applications.
For more information on these, see the documentation: a :doc:`tutorial/description </app-dev/bindings-java/index>` and the `JavaDoc reference </app-dev/bindings-java/javadocs/index.html>`__.
This API allows a Java application to accomplish all the steps detailed in `Application Structure <#structuring-an-application>`__. In particular, the `Bot <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/Bot.html>`__ abstraction fully supports building of state-driven applications. This is described further in `Architectural Guidance <#architecture-guidance>`__, below.
Scala
=====
The Java libraries above are compatible with Scala and can be used directly.
gRPC
====
We provide the full details of the gRPC service and protocol definitions. These can be compiled to a variety of target languages using the open-source `protobuf and gRPC tools <https://grpc.io/docs/>`__. This allows an application to attach to an interface at the same level as the provided Data Layer Java bindings.
Architecture guidance
*********************
This section presents some suggestions and guidance for building successful applications.
Use a reactive architecture and libraries
=========================================
In general, you should consider using a reactive architecture for your application. This has a number of advantages:
- It matches well to the streaming nature of the ledger API.
- It will handle all the multi-threading issues, providing you with a sequential model in which to implement your application code.
- It allows for several implementation strategies that are inherently scalable e.g. RxJava, Akka Streams/Actors, RxJS, RxPy etc.
Prefer a state-driven approach
==============================
For all but the simplest applications, the state-driven approach has several advantages:
- It's easier to add direct event handling to state-driven applications than the reverse.
- Most applications have to keep some state.
- DigitalAsset language bindings directly support the pattern, and provide libraries that handle many of the required tasks.
Consider a state-driven application as a function of state to commands
======================================================================
As far as possible, aim to encode the core application as a function between application state and generated commands. This helps because:
- It separates the application into separate stages of event transformation, state update and command generation.
- The command generation is the core of the application - implementing as a pure function makes it easy to reason about, and thus reduces bugs and fosters correctness.
- Doing this will also require that the application is structured so that the state examined by that function is stable - that is, not subject to an update while the function is running. This is one of the things that makes the function, and hence the application, easier to reason about.
The Java Reactive Components library provides an abstraction and framework that directly supports this. It provides a `Bot <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/Bot.html>`__ abstraction that handles much of the work of doing this, and allows the command generation function to be represented as an actual Java function and wired into the framework, along with a transform function that allows the state objects to be Java classes that better represent the underlying contracts.
This allows you to reduce the work of building an application to the following tasks:
- Define the Bot function.
- Define the event transformation.
- Define setup tasks such as disposing of command failure, connecting to the ledger and obtaining ledger- and package- IDs.
The framework handles much of the work of building a state-driven application. It handles the streams of events and completions, transforming events into domain objects (via the provided event transform function) and storing them in a `LedgerView <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/LedgerViewFlowable.LedgerView.html>`__ object. This is then passed to the Bot function (provided by the application), which generates a set of commands and a pending set. The commands are sent back to the ledger, and the pending set, along with the commandId that identifies it, is held by the framework (`LedgerViewFlowable <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/LedgerViewFlowable.html>`__). This allows it to handle all command completion events.
|image0|
Full details of the framework are available in the links described in the `Java library <#java>`__ above.
Commonly used types
*******************
Primitive and structured types (records, variants and lists) appearing in the contract constructors and choice arguments are compatible with the types defined in the current version of DAML-LF (v1). They appear in the submitted commands and in the event streams.
There are some identifier fields that are represented as strings in the protobuf messages. They are opaque: you shouldn't interpret them in client code, except by comparing them for equality. They include:
- Transaction IDs
- Event IDs
- Contract IDs
- Package IDs (part of template identifiers)
There are some other identifiers that are determined by your client code. These aren't interpreted by the server, and are transparently passed to the responses. They include:
- Command IDs: used to uniquely identify a command and to match it against its response.
- Application ID: used to uniquely identify a client process talking to the server. You could use a combination of submitting party, command ID, and application ID for deduplication of commands.
- Workflow IDs: identify chains of transactions. You can use these to correlate transactions sent across time spans and by different parties.
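The suggested deduplication by submitting party, application ID, and command ID can be sketched like this. The names are illustrative, not a library API:

```typescript
// Build a deduplication key from the three identifiers the text
// above suggests combining.
function dedupKey(party: string, applicationId: string, commandId: string): string {
  return `${party}|${applicationId}|${commandId}`;
}

// Remembers which keys have been seen so replays can be rejected.
class DedupCache {
  private seen = new Set<string>();

  // Returns true the first time a command is seen, false on replays.
  firstSubmission(party: string, appId: string, cmdId: string): boolean {
    const key = dedupKey(party, appId, cmdId);
    if (this.seen.has(key)) return false;
    this.seen.add(key);
    return true;
  }
}
```

A real application would persist this set (or bound it by time) rather than keeping it only in memory.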
.. |image0| image:: images/BotFlow.png
:width: 6.5in
:height: 3.69444in
Testing
=======
Testing is fundamental to ensure correctness and improve maintainability.
Testing is usually divided into different categories according to its scope and aim:
- unit testing verifies single properties of individual components
- integration testing verifies that an aggregation of components behaves as expected
- acceptance testing checks that the overall behavior of a whole system satisfies certain criteria
Tests in the small (unit testing) and in the large (acceptance testing) both tend to be specific to the given component or system under test.
This chapter focuses on providing portable approaches and techniques to perform integration testing between your components and an actual running ledger.
Test the business logic with a ledger
*************************************
In production, your application is going to interact with a DAML model deployed on an actual ledger. Each model is usually specific to a business need and describes specific workflows.
Mocking a ledger response is usually not desirable to test the business logic, because so much of it is encapsulated in the DAML model. This makes integration testing with an actual running ledger fundamental to evaluating the correctness of an application.
This is usually achieved by running a ledger as part of the test process and running several tests against it, possibly coordinated by a test framework. Since the in-memory sandbox shipped as part of the SDK is a full-fledged implementation of a DAML ledger, it's usually the tool of choice for these tests. Please note that this does not replace acceptance tests with the actual ledger implementation that your application aims to use in production. Whatever your choice is, sharing a single ledger to run several tests is a suggested best practice.
Share the ledger
****************
Sharing a ledger is useful because booting a ledger and loading DAML code into it takes time. As you're likely to have a lot of very short tests in order to properly test your application, the total running time would be severely impacted if you ran a new ledger for every test.
Tests must thus be designed to not interfere with each other. Both the transaction and the active contract service offer the possibility of filtering by party. Parties can thus be used as a way to isolate tests.
You can use the party management service to allocate new parties and use them to test your application. You can also limit the number of transactions read from the ledger by reading the current offset of the ledger end before the test starts, since no transactions can possibly appear for the newly allocated parties before this time.
In summary:
#. Retrieve the current offset of the ledger end before the test starts
#. Use the party management service to allocate the parties needed by the test
#. Whenever you issue a command, issue it as one of the parties allocated for this test
#. Whenever you need to get the set of active contracts or a stream of transactions, always filter by one or more of the parties allocated for this test
This isolation between instances of tests also means that different tests can be run completely in parallel with respect to each other, possibly improving on the overall running time of your test suite.
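The isolation steps above can be sketched with plain data. Here ``allocateParty`` and the transaction shape are stand-ins for the party management and transaction services, purely for illustration:

```typescript
interface Transaction {
  offset: number;
  party: string;
  payload: string;
}

let partyCounter = 0;

// Stand-in for the party management service: each call yields a
// fresh, unique party for the current test.
function allocateParty(hint: string): string {
  return `${hint}-${partyCounter++}`;
}

// Stand-in for a party-filtered transaction stream, starting from
// the ledger-end offset recorded before the test began.
function transactionsFor(
  ledger: Transaction[],
  parties: Set<string>,
  fromOffset: number
): Transaction[] {
  return ledger.filter(
    tx => tx.offset >= fromOffset && parties.has(tx.party)
  );
}
```

Because a test only ever sees transactions for parties it allocated itself, two tests following this pattern cannot observe each other and can run in parallel.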
Reset if you need to
********************
It may be the case that you are running a very high number of tests, verifying the ins and outs of a very complex application interacting with an equally complex DAML model.
If that's the case, the resources leaked by the approach to test isolation mentioned above can become counterproductive, causing slow-downs or even crashes, as the ledger backing your test suite has to keep track of more and more parties and transactions that are no longer relevant after the tests that used them finish.
As a last resort for these cases, your tests can use the reset service, which ledger implementations can optionally expose for testing.
The reset service has a single ``reset`` method that causes all the accumulated state to be dropped, including all active contracts, the entire history of transactions, and all allocated parties. Only the DAML packages loaded in the ledger are preserved, thereby saving the time needed to reload them as opposed to simply spinning up a new ledger.
The reset service momentarily shuts down the gRPC channel it communicates over, so your testing infrastructure must take this into account and, when the ``reset`` is invoked, must ensure that tests are temporarily suspended as attempts to reconnect with the rebooted ledger are performed. There is no guarantee as to how long the reset will take, so this should also be taken into account when attempting to reconnect.
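Because the reset momentarily shuts down the gRPC channel, test infrastructure typically retries the connection until the ledger is back. A minimal sketch, assuming a hypothetical ``connect`` callback supplied by the test harness:

```typescript
// Retry a connection attempt with a fixed delay, since there is no
// guarantee how long the reset will take.
async function reconnectWithRetry(
  connect: () => Promise<boolean>,
  maxAttempts = 10,
  delayMs = 500
): Promise<boolean> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      if (await connect()) return true;
    } catch {
      // channel still down; fall through and wait before retrying
    }
    await new Promise(res => setTimeout(res, delayMs));
  }
  return false;
}
```

Tests would stay suspended until this resolves to ``true``, then resume against the rebooted ledger.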
.. image:: ./developer_workflow.svg


@ -0,0 +1,295 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
Application architecture guide
#########################################
This document is a guide to building applications that interact with a DA ledger deployment (the 'ledger'). It:
- describes the characteristics of the ledger API, how this affects the way an application is built (the 'application architecture'), and why it is important to understand this when building applications
- describes the resources in the SDK to help with this task
- gives some guidelines to help you build correct, performant, and maintainable applications using all of the supported languages
Categories of application
*************************
Applications that interact with the ledger normally fall into four categories:
.. list-table:: Categories of application
:header-rows: 1
* - Category
- Receives transactions?
- Sends commands?
- Example
* - Source
- No
- Yes
- An injector that reads new contracts from a file and injects them into the system.
* - Sink
- Yes
- No
- A reader that pipes data from the ledger into an SQL database.
* - Automation
- Yes
- Yes, responding to transactions
- Automatic trade registration.
* - Interactive
- Yes (and displays to user)
- Yes, based on user input
- DA's :doc:`Navigator </tools/navigator/index>`, which lets you see and interact with the ledger
Additionally, applications can be written in two different styles:
- Event-driven - applications base their actions on individual ledger events only.
- State-driven - applications base their actions on some model of all contracts active on the ledger.
Event-driven applications
=========================
**Event-driven** applications react to events on the ledger and generate commands and other outputs on a per-event basis. They do not require access to ledger state beyond the event they are reacting to.
Examples are sink applications that read the ledger and dump events to an external store (e.g. an external (reporting) database).
State-driven applications
=========================
**State-driven** applications build up a real-time view of the ledger state by reading events and recording contract create and archive events. They then generate commands based on a given state, not just single events.
Examples of these are automation and interactive applications that let a user or code react to complex state on the ledger (e.g. the DA Navigator tool).
Which approach to take
======================
For all except the simplest applications, we generally recommend the state-driven approach. State-driven applications are easier to reason about when determining correctness, so this makes design and implementation easier.
In practice, most applications are actually a mixture of the two styles, with one predominating. It is easier to add some event handling to a state-driven application, so it is better to start with that style.
Structuring an application
**************************
Although applications that communicate with the ledger have many purposes, they generally have some common features, usually related to their style: event-driven or state-driven. This section describes these commonalities, and the major functions of each of these styles.
In particular, all applications need to handle the asynchronous nature of the ledger API. The most important consequence of this is that applications must be multi-threaded. This is because of the asynchronous, separate streams of command submissions, transaction events, and completion events.
Although you can choose to do this in several ways, from bare threads (such as a Java Thread) through thread libraries, generally the most effective way of handling this is by adopting a reactive architecture, often using a library such as `RxJava <https://github.com/ReactiveX/RxJava>`__.
All the language bindings support this reactive pattern as a fundamental requirement.
.. _event-driven-applications-1:
Structuring event-driven applications
=====================================
Event-driven applications read a stream of transaction events from the ledger, and convert them to some other representation. This may be a record on a database, some update of a UI, or a differently formatted message that is sent to an upstream process. It may also be a command that transforms the ledger.
The critical thing here is that each event is processed in isolation - the application does not need to keep any application-related state between each event. It is this that differentiates it from a state-driven application.
To do this, the application should:
1. Create a connection to the Transaction Service, and instantiate a stream handler to handle the new event stream. By default, this will read events from the beginning of the ledger. This is usually not what is wanted, as it may replay already processed transactions. In this case, the application can request the stream from the current ledger end. This will, however, cause any events between the last read point and the current ledger end to be missed. If the application must start reading from the point it last stopped, it must record that point and explicitly restart the event stream from there.
2. Optionally, create a connection to the Command Submission Service to send any required commands back to the ledger.
3. Act on the content of events (type, content) to perform any action required by the application e.g. writing a database record or generating and submitting a command.
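The restart logic from step 1 can be sketched as follows. Offsets are opaque strings here, standing in for the bindings' ``LedgerOffset`` type, and ``"BEGIN"`` is a hypothetical stand-in for the ledger-begin offset:

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch of offset bookkeeping for resuming an event stream without
// missing events or replaying already-processed ones.
class OffsetTracker {
    private final AtomicReference<String> lastProcessed =
        new AtomicReference<>("BEGIN");

    // Record the offset only after the event has been fully processed.
    void markProcessed(String offset) { lastProcessed.set(offset); }

    // After a restart, subscribe to the stream from this offset.
    String resumeFrom() { return lastProcessed.get(); }
}
```

In a real application the recorded offset would be persisted (e.g. in the same database transaction that stores the event's effects), so that a crash between processing and recording cannot cause an event to be lost or double-processed.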
.. _state-driven-applications-1:
Structuring state-driven applications
=====================================
State-driven applications read a stream of events from the ledger, examine them and build up an application-specific view of the ledger state based on the events type and content. This involves storing some representation of existing contracts on a Create event, and removing them on an Archive event. To be able to remove a contract from the state, they are indexed by :ref:`contractId <com.digitalasset.ledger.api.v1.CreatedEvent.contract_id>`.
This is the most basic kind of update, but other types are also possible: for example, counting the number of contracts of a certain type, or establishing relationships between contracts based on business-level keys.
The core of the application is then to write an algorithm that examines the overall state, and generates a set of commands to transform the ledger, based on that state.
If the result of this algorithm depends purely on the current ledger state (and not, for instance, on the event history), you should consider this as a pure function between ledger state and command set, and structure the design of an application accordingly. This is highlighted in the `language bindings <#application-libraries>`__.
To do this, the application should:
1. Obtain the initial state of the ledger by using the Active Contracts service, processing each event received to create an initial application state.
2. Create a connection to the Transaction Service to receive new events from that initial state, and instantiate a stream handler to process them.
3. Create a connection to the Command Submission Service to send commands.
4. Create a connection to the Command Completion Service, and set up a stream handler to handle completions.
5. Read the event stream and process each event to update its view of the ledger state.
To make accessing and examining this state easier, this often involves turning the generic description of created contracts into instances of structures (such as class instances) that are more appropriate for the language being used. This also allows the application to ignore contract data it does not need.
6. Examine the state at regular intervals (often after receiving and processing each transaction event) and send commands back to the ledger on significant changes.
7. Maintain a record of **pending contracts**: contracts that will be archived by these commands, but whose completion has not been received.
Because of the asynchronous nature of the API, these contracts will remain in the application state for some time after the command has been submitted, and are only removed once the corresponding archive event is received. Until that happens, the application must ensure that these **pending contracts** are not considered part of the application state, even though their archive events have not yet arrived. Processing and maintaining this pending set is a crucial part of a state-driven application.
8. Examine command completions, and handle any command errors. As well as application-defined needs (such as command re-submission and de-duplication), this must include handling command errors as described in `Common tasks <#common-tasks>`__, and must also consider the pending set. When an exercise command fails, the contracts marked as pending for it will now not be archived (the application will not receive any archive events for them) and must be returned to the application state.
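The pending-set bookkeeping described in steps 7 and 8 can be sketched as follows. Contract IDs and command IDs are plain strings here; real applications would use the bindings' typed identifiers and a richer contract representation:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of active-contract state with a pending set.
class ActiveContracts {
    private final Set<String> active = new HashSet<>();
    // commandId -> contracts that the command, if executed, will archive
    private final Map<String, Set<String>> pending = new HashMap<>();

    void onCreated(String contractId) { active.add(contractId); }

    void onArchived(String contractId) {
        active.remove(contractId);
        // the expected archive arrived: the contract leaves the pending set
        pending.values().forEach(s -> s.remove(contractId));
    }

    // Before submitting a command, hide the contracts it consumes so the
    // command-generating logic cannot use them twice.
    void markPending(String commandId, Set<String> contractIds) {
        pending.put(commandId, new HashSet<>(contractIds));
    }

    // A failed command will never archive its contracts: stop hiding them.
    void onCommandFailed(String commandId) { pending.remove(commandId); }

    // The state that command generation should see: active minus pending.
    Set<String> visible() {
        Set<String> v = new HashSet<>(active);
        pending.values().forEach(v::removeAll);
        return v;
    }
}
```

Note that a failed command only needs the pending entry removed: the contracts were never taken out of ``active``, merely hidden from the visible view.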
Common tasks
============
Both styles of applications will take the following steps:
- Define an **applicationId** - this identifies the application to the ledger server.
- Connect to the ledger (including handling authentication). This creates a client interface object that allows creation of the stream connection described in `Structuring an application <#structuring-an-application>`__.
- Handle execution errors. Because these are received asynchronously, the application will need to keep a record of commands in flight - those sent but not yet indicated complete (via an event). Correlate commands and completions via an application-defined :ref:`commandId <com.digitalasset.ledger.api.v1.Commands.command_id>`. Categorize different sets of commands with a :ref:`workflowId <com.digitalasset.ledger.api.v1.Commands.workflow_id>`.
- Handle lost commands. The ledger server does not guarantee that all commands submitted to it will be executed. This means that a command submission may not result in a corresponding completion, and some other mechanism must be employed to detect this. This is done using the values of Ledger Effective Time (LET) and Maximum Record Time (MRT). The server does guarantee that if a command is executed, it will be executed within a time window between the LET and MRT specified in the command submission. Since the value of the ledger time at which a command is executed is returned with every completion, reception of a completion with a record time that is greater than the MRT of any pending command guarantees that the pending command will not be executed, and can be considered lost.
- Have a policy regarding command resubmission. In what situations should failing commands be re-submitted? Duplicate commands must be avoided in some situations - what state must be kept to implement this?
- Access auxiliary services such as the time service and package service. The `time service <#time-service>`__ will be used to determine the Ledger Effective Time value for command submission, and the package service will be used to determine the packageId, used in creating a connection, as well as metadata that allows creation events to be turned into application domain objects.
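The lost-command detection described above (comparing each completion's record time against the MRT of every in-flight command) can be sketched like this; command IDs and the hypothetical ``InFlightCommands`` class are illustrative, not part of the bindings:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of in-flight command tracking with MRT-based loss detection.
class InFlightCommands {
    private final Map<String, Instant> mrtByCommandId = new HashMap<>();

    void submitted(String commandId, Instant maximumRecordTime) {
        mrtByCommandId.put(commandId, maximumRecordTime);
    }

    void completed(String commandId) { mrtByCommandId.remove(commandId); }

    // Call with the record time of every received completion: any pending
    // command whose MRT is already in the past can never execute.
    List<String> lostAsOf(Instant recordTime) {
        List<String> lost = new ArrayList<>();
        for (Map.Entry<String, Instant> e : mrtByCommandId.entrySet())
            if (recordTime.isAfter(e.getValue())) lost.add(e.getKey());
        lost.forEach(mrtByCommandId::remove);
        return lost;
    }
}
```

Commands reported lost this way can then be resubmitted or surfaced as errors, according to the application's resubmission policy.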
Application libraries
*********************
We provide several libraries and tools that support the task of building applications. Some of this is provided by the API (e.g. the Active Contracts Service), but mostly is provided by several language binding libraries.
Java
====
The Java API bindings have three levels:
- A low-level Data Layer, including Java classes generated from the gRPC protocol definition files and a thin layer of support classes. These provide a builder pattern for constructing protocol items, and blocking and non-blocking interfaces for sending and receiving requests and responses.
- A Reactive Streams interface, exposing all API endpoints as `RxJava <https://github.com/ReactiveX/RxJava>`__ `Flowables <http://reactivex.io/RxJava/javadoc/io/reactivex/Flowable.html>`__.
- A Reactive Components API that uses the above to provide high-level facilities for building state-driven applications.
For more information on these, see the documentation: a :doc:`tutorial and description </app-dev/bindings-java/index>` and the `JavaDoc reference </app-dev/bindings-java/javadocs/index.html>`__.
This API allows a Java application to accomplish all the steps detailed in `Structuring an application <#structuring-an-application>`__. In particular, the `Bot <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/Bot.html>`__ abstraction fully supports building state-driven applications. This is described further in `Architecture guidance <#architecture-guidance>`__, below.
Scala
=====
The Java libraries above are compatible with Scala and can be used directly.
gRPC
====
We provide the full details of the gRPC service and protocol definitions. These can be compiled to a variety of target languages using the open-source `protobuf and gRPC tools <https://grpc.io/docs/>`__. This allows an application to attach to an interface at the same level as the provided Data Layer Java bindings.
Architecture guidance
*********************
This section presents some suggestions and guidance for building successful applications.
Use a reactive architecture and libraries
=========================================
In general, you should consider using a reactive architecture for your application. This has a number of advantages:
- It matches well to the streaming nature of the ledger API.
- It will handle all the multi-threading issues, providing you with a sequential model in which to implement your application code.
- It allows for several implementation strategies that are inherently scalable (e.g. RxJava, Akka Streams/Actors, RxJS, RxPy).
Prefer a state-driven approach
==============================
For all but the simplest applications, the state-driven approach has several advantages:
- It's easier to add direct event handling to state-driven applications than the reverse.
- Most applications have to keep some state.
- The Digital Asset language bindings directly support the pattern, and provide libraries that handle many of the required tasks.
Consider a state-driven application as a function of state to commands
======================================================================
As far as possible, aim to encode the core application as a function between application state and generated commands. This helps because:
- It separates the application into separate stages of event transformation, state update and command generation.
- The command generation is the core of the application - implementing as a pure function makes it easy to reason about, and thus reduces bugs and fosters correctness.
- Doing this will also require that the application is structured so that the state examined by that function is stable - that is, not subject to an update while the function is running. This is one of the things that makes the function, and hence the application, easier to reason about.
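A minimal sketch of this "state in, commands out" shape follows. The ``AppState`` and ``Command`` types are hypothetical stand-ins for the bindings' ``LedgerView`` and command classes:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

// Hypothetical application state and command types.
record AppState(Set<String> openOrders) {}
record Command(String description) {}

class Core {
    // The core of the application as a pure function: the same state
    // always produces the same commands, so it can be unit-tested without
    // any ledger connection.
    static List<Command> react(AppState state) {
        List<Command> cmds = new ArrayList<>();
        // TreeSet gives a deterministic iteration order.
        for (String order : new TreeSet<>(state.openOrders()))
            cmds.add(new Command("accept " + order));
        return cmds;
    }
}
```

Because ``react`` never mutates its input, the state it examines is stable for the duration of the call, which is exactly the property the surrounding framework has to guarantee.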
The Java Reactive Components library provides an abstraction and framework that directly supports this. It provides a `Bot <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/Bot.html>`__ abstraction that handles much of the work of doing this, and allows the command generation function to be represented as an actual Java function, wired into the framework along with a transform function that allows the state objects to be Java classes that better represent the underlying contracts.
This allows you to reduce the work of building an application to the following tasks:
- Define the Bot function.
- Define the event transformation.
- Define setup tasks such as handling command failures, connecting to the ledger, and obtaining ledger and package IDs.
The framework handles much of the work of building a state-driven application. It handles the streams of events and completions, transforming events into domain objects (via the provided event transform function) and storing them in a `LedgerView <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/LedgerViewFlowable.LedgerView.html>`__ object. This is then passed to the Bot function (provided by the application), which generates a set of commands and a pending set. The commands are sent back to the ledger, and the pending set, along with the commandId that identifies it, is held by the framework (`LedgerViewFlowable <../../app-dev/bindings-java/javadocs/com/daml/ledger/rxjava/components/LedgerViewFlowable.html>`__). This allows it to handle all command completion events.
|image0|
Full details of the framework are available in the links described in the `Java library <#java>`__ above.
Commonly used types
*******************
Primitive and structured types (records, variants and lists) appearing in the contract constructors and choice arguments are compatible with the types defined in the current version of DAML-LF (v1). They appear in the submitted commands and in the event streams.
There are some identifier fields that are represented as strings in the protobuf messages. They are opaque: you shouldn't interpret them in client code, except by comparing them for equality. They include:
- Transaction IDs
- Event IDs
- Contract IDs
- Package IDs (part of template identifiers)
There are some other identifiers that are determined by your client code. These aren't interpreted by the server, and are transparently passed to the responses. They include:
- Command IDs: used to uniquely identify a command and to match it against its response.
- Application ID: used to uniquely identify a client process talking to the server. You could use a combination of submitting party, command ID, and application ID for deduplication of commands.
- Workflow IDs: identify chains of transactions. You can use these to correlate transactions sent across time spans and by different parties.
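For example, the deduplication key suggested above (submitting party plus application ID plus command ID) could be sketched as follows; the ``DedupKey`` and ``Deduplicator`` names are illustrative, not part of the bindings:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical deduplication key built from the three client-chosen
// identifiers described above.
record DedupKey(String party, String applicationId, String commandId) {}

class Deduplicator {
    private final Set<DedupKey> seen = new HashSet<>();

    // Returns true the first time a key is seen, false for replays.
    boolean firstSubmission(DedupKey key) { return seen.add(key); }
}
```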
.. |image0| image:: images/BotFlow.png
:width: 6.5in
:height: 3.69444in
Testing
=======
Testing is fundamental to ensure correctness and improve maintainability.
Testing is usually divided into different categories according to its scope and aim:
- unit testing verifies single properties of individual components
- integration testing verifies that an aggregation of components behaves as expected
- acceptance testing checks that the overall behavior of a whole system satisfies certain criteria
Tests at both the small scale (unit testing) and the large scale (acceptance testing) tend to be specific to the given component or system under test.
This chapter focuses on providing portable approaches and techniques to perform integration testing between your components and an actual running ledger.
Test the business logic with a ledger
*************************************
In production, your application is going to interact with a DAML model deployed on an actual ledger. Each model is usually specific to a business need and describes specific workflows.
Mocking a ledger response is usually not desirable to test the business logic, because so much of it is encapsulated in the DAML model. This makes integration testing with an actual running ledger fundamental to evaluating the correctness of an application.
This is usually achieved by running a ledger as part of the test process and running several tests against it, possibly coordinated by a test framework. Since the in-memory sandbox shipped as part of the SDK is a full-fledged implementation of a DAML ledger, it's usually the tool of choice for these tests. Please note that this does not replace acceptance tests with the actual ledger implementation that your application aims to use in production. Whatever your choice is, sharing a single ledger to run several tests is a suggested best practice.
Share the ledger
****************
Sharing a ledger is useful because booting a ledger and loading DAML code into it takes time. As you're likely to have a lot of very short tests in order to properly test your application the total running time of these would be severely impacted if you ran a new ledger for every test.
Tests must thus be designed not to interfere with each other. Both the transaction service and the active contracts service offer the possibility of filtering by party. Parties can thus be used as a way to isolate tests.
You can use the party management service to allocate new parties and use them to test your application. You can also limit the number of transactions read from the ledger by reading the current offset of the ledger end before the test starts, since no transactions can possibly appear for the newly allocated parties before this time.
In summary:

#. retrieve the current offset of the ledger end before the test starts
#. use the party management service to allocate the parties needed by the test
#. whenever you issue a command, issue it as one of the parties allocated for this test
#. whenever you need to get the set of active contracts or a stream of transactions, always filter by one or more of the parties allocated for this test
This isolation between instances of tests also means that different tests can be run completely in parallel with respect to each other, possibly improving on the overall running time of your test suite.
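The allocation step can be sketched like this; the ``TestParties`` helper is hypothetical (a real test would call the ledger's party management service), but it shows the isolation property — no two tests ever share a party:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of per-test party allocation for test isolation.
class TestParties {
    private static final AtomicInteger counter = new AtomicInteger();

    // Each call mints fresh party names, so transactions visible to one
    // test's parties are invisible to every other test's parties.
    static List<String> allocate(String testName, int n) {
        List<String> parties = new ArrayList<>();
        int run = counter.incrementAndGet();
        for (int i = 0; i < n; i++)
            parties.add(testName + "-" + run + "-p" + i);
        return parties;
    }
}
```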
Reset if you need to
********************
It may be the case that you are running a very high number of tests, verifying the ins and outs of a very complex application interacting with an equally complex DAML model.
If that's the case, the resources leaked by the approach to test isolation mentioned above can become counterproductive, causing slow-downs or even crashes, as the ledger backing your test suite has to keep track of ever more parties and transactions that are no longer relevant once the tests that created them have finished.
As a last resort for these cases, your tests can use the reset service, which ledger implementations can optionally expose for testing.
The reset service has a single ``reset`` method that will cause all the accumulated state to be dropped, including all active contracts, the entire history of transactions, and all allocated parties. Only the DAML packages loaded in the ledger are preserved, thereby saving the time needed for reloading them as opposed to simply spinning up a new ledger.
The reset service momentarily shuts down the gRPC channel it communicates over, so your testing infrastructure must take this into account: when ``reset`` is invoked, it must ensure that tests are temporarily suspended while attempts to reconnect to the rebooted ledger are performed. There is no guarantee as to how long the reset will take, so this should also be taken into account when attempting to reconnect.
@ -1,6 +1,8 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _authentication:
Authentication
##############
@ -13,10 +15,9 @@ To run your application against a :doc:`deployed ledger </deploy/index>`, you wi
Introduction
************
The main way for a DAML application to interact with a DAML ledger is through the :doc:`gRPC ledger API </app-dev/grpc/index>`.
This API can be used to request changes to the ledger (e.g., "*Alice wants to exercise choice X on contract Y*),
or to read data from the ledger (e.g., "*Alice wants to see all active contracts*").
The :doc:`Ledger API </app-dev/ledger-api>` is used to request changes to the ledger (e.g., "*Alice
wants to exercise choice X on contract Y*"), or to read data from the ledger (e.g., "*Alice wants to
see all active contracts*").
What requests are valid is defined by the :ref:`integrity <da-model-integrity>` and :ref:`privacy <da-model-privacy>` parts of the :ref:`DA Ledger Model <da-ledgers>`.
This model is defined in terms of :ref:`DAML parties <glossary-party>`,
@ -1,6 +1,8 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _java-bindings:
Java bindings
#############

.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _scala-bindings:
Scala bindings
##############
@ -6,20 +6,7 @@ Creating your own bindings
This page gets you started with creating custom bindings for the Digital Asset distributed ledger.
Introduction
============
Digital Asset currently provides bindings for the following programming languages:
- :doc:`Java </app-dev/bindings-java/index>`
- :doc:`Scala </app-dev/bindings-scala/index>`
- :doc:`JavaScript (Node.js) </app-dev/bindings-js>`
You can create bindings for any programming language supported by `gRPC <https://grpc.io/docs/>`_.
What do we mean by "bindings"? Bindings for a language consist of two main components:
Bindings for a language consist of two main components:
- Ledger API
Client "stubs" for the programming language: the remote API that allows sending ledger commands and receiving ledger transactions. You have to generate the **Ledger API** from `the gRPC protobuf definitions in the daml repository on GitHub <https://github.com/digital-asset/daml/tree/master/ledger-api/grpc-definitions>`_. The **Ledger API** is documented on this page: :doc:`/app-dev/grpc/index`. The `gRPC <https://grpc.io/docs/>`_ tutorial explains how to generate client "stubs".
File diff suppressed because one or more lines are too long
@ -1,8 +1,10 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
The Ledger API using gRPC
#########################
.. _grpc:
gRPC
####
.. toctree::
:hidden:
@ -1,8 +1,8 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
Writing applications using the Ledger API
#########################################
The Ledger API
##############
.. toctree::
:hidden:
@ -10,28 +10,14 @@ Writing applications using the Ledger API
services
daml-lf-translation
DAML contracts are stored on a ledger. In order to exercise choices on those contracts, create new ones, or read from the ledger, you need to use the **Ledger API**. (Every ledger that DAML can run on exposes this same API.) To write an application around a DAML ledger, you'll need to interact with the Ledger API from another language.
Resources available to you
**************************
- **The Java bindings**: a library to help you write idiomatic applications using the Ledger API in Java.
:doc:`Read the documentation for the Java bindings </app-dev/bindings-java/index>`
- **The experimental Node.js bindings**: a library to help you write idiomatic applications using the Ledger API in JavaScript. Information about the Node.js bindings isn't available in this documentation, but is on GitHub.
`Read the documentation for the Node.js bindings <http://www.github.com/digital-asset/daml-js>`__
- **The underlying gRPC API**: if you want to interact with the ledger API from other languages, you'll need to use `gRPC <https://grpc.io>`__ directly.
:doc:`Read the documentation for the gRPC API </app-dev/grpc/index>`
- **The application architecture guide**: this documentation gives high-level guidance on designing DAML Ledger applications.
:doc:`Read the application architecture guide </app-dev/app-arch>`
To write an application around a DAML ledger, you'll need to interact with the **Ledger API** from
another language. Every ledger that DAML can run on exposes this same API.
What's in the Ledger API
************************
No matter how you're accessing it (Java bindings, Node.js bindings, or gRPC), the Ledger API exposes the same services:
You can access the Ledger API via the HTTP JSON API, Java bindings, Scala bindings or gRPC. In
all cases, the Ledger API exposes the same services:
- Submitting commands to the ledger
@ -74,7 +60,7 @@ As a user, you don't need to interact with DAML-LF directly. But inside the DAML
When you need to know about DAML-LF
===================================
DAML-LF is only really relevant when you're dealing with the objects you send to or receive from the ledger. If you use :doc:`code generation </app-dev/bindings-java/codegen>`, you don't need to know about DAML-LF at all, because this generates idiomatic representations of DAML for you.
DAML-LF is only really relevant when you're dealing with the objects you send to or receive from the ledger. If you use any of the provided language bindings for the Ledger API, you don't need to know about DAML-LF at all, because this generates idiomatic representations of DAML for you.
Otherwise, it can be helpful to know what the types in your DAML code look like at the DAML-LF level, so you know what to expect from the Ledger API.
File diff suppressed because one or more lines are too long
@ -307,7 +307,7 @@ Application, ledger client, integration
**Application**, **ledger client** and **integration** are all terms for an application that sits on top of the `ledger <#ledger-daml-ledger>`__. These usually `read from the ledger <#reading-from-the-ledger>`_, `send commands <#submitting-commands-writing-to-the-ledger>`__ to the ledger, or both.
There's a lot of information available about application development, starting with the :doc:`/app-dev/index` page.
There's a lot of information available about application development, starting with the :doc:`/app-dev/app-arch` page.
Ledger API
==========
@ -9,7 +9,7 @@ DAML Integration Kit - ALPHA
/tools/ledger-api-test-tool/index
:doc:`DAML Applications </app-dev/index>` run on DAML Ledgers.
:doc:`DAML Applications </app-dev/app-arch>` run on DAML Ledgers.
A DAML Ledger is a server serving the
:doc:`Ledger API </app-dev/grpc/index>` as per the semantics defined in
the :doc:`/concepts/ledger-model/index` and the
@ -120,8 +120,8 @@ To acquire this context, you should:
1. Complete the :doc:`/getting-started/quickstart`.
2. Get an in-depth understanding of the :doc:`/concepts/ledger-model/index`.
3. Build a mental model of how the :doc:`Ledger API </app-dev/index>`
is used to :doc:`build DAML Applications </app-dev/index>`.
3. Build a mental model of how the :doc:`Ledger API </app-dev/ledger-api>`
is used to :doc:`build DAML Applications </app-dev/app-arch>`.
.. _integration-kit_writing_code:
@ -285,7 +285,7 @@ In the diagram above:
Explaining this diagram in detail (for brevity, we drop prefixes
of their qualified names where unambiguous):
:doc:`Ledger API </app-dev/index>`
:doc:`Ledger API </app-dev/ledger-api>`
is the collection of gRPC
services that you would like your `daml-on-<X>-server` to provide.
``<X> services``
@ -367,7 +367,7 @@ Testing a DAML Ledger
You can test your DAML ledger implementation using :doc:`Ledger API Test Tool
</tools/ledger-api-test-tool/index>`, which will assess correctness of
implementation of the :doc:`Ledger API
</app-dev/index>`. For example, it will show you if
</app-dev/ledger-api>`. For example, it will show you if
there are consistency or conformance problem with your implementation.
Assuming that your Ledger API endpoint is accessible at ``localhost:6865``, you can use the tool in the following manner:
@ -15,7 +15,7 @@ DAML archives
When a DAML project is build with ``daml build``, build artifacts are generated in the hidden
directory ``.daml/dist/`` relative to the project root directory. The main build artifact of a
project is the `DAML archive`, recognized by the ``.dar`` file ending. DAML archives are platform
independent. They can be deployed on a ledger (see :ref:`deploy <deploy-ref_index>`) or can be
independent. They can be deployed on a ledger (see :ref:`deploy <deploy-ref_overview>`) or can be
imported into other projects as a package dependency.
Importing DAML archives
@ -1,6 +1,8 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _testing-using-scenarios:
Testing using scenarios
#######################
@ -0,0 +1,59 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _deploy-generic-ledger:
Deploying to a generic DAML ledger
==================================
DAML ledgers expose a unified administration API. This means that deploying to a DAML ledger is no
different from deploying to your local sandbox.
To deploy to a DAML ledger, run the following command from within your DAML project:
.. code-block:: none
$ daml deploy --host=<HOST> --port=<PORT> --access-token-file=<TOKEN-FILE>
where ``<HOST>`` and ``<PORT>`` are the hostname and port your ledger is listening on, which defaults
to port ``6865``. The ``<TOKEN-FILE>`` is needed if your ledger runs with :ref:`authentication
<authentication>` and needs to contain a JWT token with an ``admin`` claim. If your ledger is not
set up to use any authentication, it can be omitted.
Instead of passing ``--host`` and ``--port`` flags to the command above, you can add the following
section to the project's ``daml.yaml`` file:
.. code-block:: yaml
ledger:
host: <HOSTNAME>
port: <PORT>
The ``daml deploy`` command will
#. upload the project's compiled DAR file to the ledger. This will make the DAML templates defined
in the current project available to the API users of the ledger.
#. allocate the parties specified in the project's ``daml.yaml`` on the ledger if they are missing.
For further interactions with the ledger, use the ``daml ledger`` command. Try running ``daml
ledger --help`` to get a list of available ledger commands:
.. code-block:: none
$ daml ledger --help
Usage: daml ledger COMMAND
Interact with a remote DAML ledger. You can specify the ledger in daml.yaml
with the ledger.host and ledger.port options, or you can pass the --host and
--port flags to each command below. If the ledger is authenticated, you should
pass the name of the file containing the token using the --access-token-file
flag.
Available options:
-h,--help Show this help text
Available commands:
list-parties List parties known to ledger
allocate-parties Allocate parties on ledger
upload-dar Upload DAR file to ledger
navigator Launch Navigator on ledger
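
For example, after a deployment you can inspect and extend the ledger state with the individual
subcommands. A sketch of such a session, in which the host, port, party name, and DAR path are
illustrative placeholders:

.. code-block:: none

    $ daml ledger list-parties --host=localhost --port=6865
    $ daml ledger allocate-parties --host=localhost --port=6865 Carol
    $ daml ledger upload-dar --host=localhost --port=6865 .daml/dist/my-project-0.0.1.dar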

View File

@ -1,72 +1,19 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _deploy-ref_index:
.. _deploy-ref_overview:
Deploying to DAML Ledgers
*************************
Overview of DAML ledgers
========================
To run a DAML application, you'll need to deploy it to a DAML ledger.
This is an overview of DAML deployment options. Instructions on how to deploy to a specific ledger
are available in the following section.
How to Deploy
=============
Commercial Integrations
-----------------------
You can deploy to:
- The Sandbox with persistence. For information on how to do this, see the section on persistence in :doc:`/tools/sandbox` docs.
- Other available DAML ledgers. For information on these options and their stage of development, see the :ref:`tables below <deploy-ref_available>`.
To deploy a DAML project to a ledger, you will need the ledger's hostname (or IP) and the port number for the gRPC Ledger API. The default port number is 6865. Then, inside your DAML project folder, run the following command, taking care to substitute the ledger's hostname and port for ``<HOSTNAME>`` and ``<PORT>``:
Once you have retrieved your access token, you can provide it by storing it in a file and passing the path to that file via the ``--access-token-file`` command line option.
.. code-block:: none
$ daml deploy --host=<HOSTNAME> --port=<PORT> --access-token-file=<TOKEN-FILE>
This command will deploy your project to the ledger. This has two steps:
#. It will allocate the parties specified in the project's ``daml.yaml`` on the ledger if they are missing. The command looks through the list of parties known to the ledger, sees if any party is missing by comparing display names, and adds any missing party via the party management service of the Ledger API.
#. It will upload the project's compiled DAR file to the ledger via the package management service of the Ledger API. This will make the templates defined in the current project available to the users of the ledger.
Instead of passing ``--host`` and ``--port`` flags to the command above, you can add the following section to the project's ``daml.yaml`` file:
If the ledger has no authentication, the ``--access-token-file`` flag may be omitted.
.. code-block:: yaml
ledger:
host: <HOSTNAME>
port: <PORT>
You can also use the ``daml ledger`` command for more fine-grained deployment options, and to interact with the ledger more generally. Try running ``daml ledger --help`` to get a list of available ledger commands:
.. code-block:: none
$ daml ledger --help
Usage: daml ledger COMMAND
Interact with a remote DAML ledger. You can specify the ledger in daml.yaml
with the ledger.host and ledger.port options, or you can pass the --host and
--port flags to each command below. If the ledger is authenticated, you should
pass the name of the file containing the token using the --access-token-file
flag.
Available options:
-h,--help Show this help text
Available commands:
list-parties List parties known to ledger
allocate-parties Allocate parties on ledger
upload-dar Upload DAR file to ledger
navigator Launch Navigator on ledger
.. _deploy-ref_available:
Available DAML Products
=======================
The following table lists commercially supported DAML ledgers and environments that are available
for production use today.
.. list-table::
:header-rows: 1
@ -87,7 +34,7 @@ The following table lists commercially supported DAML ledgers and environments t
.. _deploy-ref_open_source:
Open Source Integrations
========================
------------------------
The following table lists open source DAML integrations.
@ -110,7 +57,7 @@ The following table lists open source DAML integrations.
.. _deploy-ref_in_development:
DAML Ledgers in Development
===========================
---------------------------
The following table lists the ledgers that are implementing support for running DAML.

View File

@ -141,7 +141,7 @@ In this section, you will run the quickstart application and get introduced to t
Initialized sandbox version 100.13.10 with ledger-id = sandbox-5e12e502-817e-41f9-ad40-1c57b8845f9d, port = 6865, dar file = DamlPackageContainer(List(target/daml/iou.dar),false), time mode = Static, ledger = in-memory, daml-engine = {}
The sandbox is now running, and you can access its :doc:`ledger API </app-dev/ledger-api>` on port ``6865``.
.. _quickstart-script:
@ -452,7 +452,7 @@ The ``submit`` function used in this scenario tries to perform a transaction and
.. Interact with the ledger through the command line
*************************************************
All interaction with the DA ledger, be it sandbox or full ledger server, happens via the :doc:`Ledger API </app-dev/ledger-api>`. It is based on `gRPC <https://grpc.io/>`_.
The Navigator uses this API, as will any :ref:`custom integration <quickstart-application>`.

View File

@ -40,16 +40,26 @@ DAML SDK documentation
:hidden:
:caption: Building applications
app-dev/app-arch
app-dev/authentication
app-dev/ledger-api
app-dev/bindings-java/index
app-dev/bindings-scala/index
app-dev/bindings-js
app-dev/grpc/index
app-dev/bindings-x-lang/index
DAML Script <daml-script/index>
upgrade/index
.. toctree::
:titlesonly:
:maxdepth: 2
:hidden:
:caption: Deploying to DAML ledgers
deploy/index
deploy/generic_ledger
deploy/ledger-topologies
.. toctree::
:titlesonly:
@ -72,15 +82,6 @@ DAML SDK documentation
concepts/ledger-model/index
concepts/identity-and-package-management
.. toctree::
:titlesonly:
:maxdepth: 2
:hidden:
:caption: Deploying
deploy/index
deploy/ledger-topologies
.. toctree::
:titlesonly:
:maxdepth: 2

View File

@ -1,6 +1,8 @@
.. Copyright (c) 2020 The DAML Authors. All rights reserved.
.. SPDX-License-Identifier: Apache-2.0
.. _json-api:
HTTP JSON API Service
#####################
@ -15,7 +17,7 @@ or `on Slack <https://hub.daml.com/slack/>`_.
Please keep in mind that the presence of the **/v1** prefix in the URLs below does not mean that the endpoint interfaces are stabilized.
The **JSON API** provides a significantly simpler way than :doc:`the Ledger
API </app-dev/ledger-api>` to interact with a ledger by providing *basic active contract set functionality*:
- creating contracts,
- exercising choices on contracts,
@ -30,7 +32,7 @@ complicating concerns, including but not limited to:
- temporal queries (e.g. active contracts *as of a certain time*), and
- ledger metaprogramming (e.g. retrieving packages and templates).
For these and other features, use :doc:`the Ledger API </app-dev/ledger-api>`
instead.
.. toctree::

View File

@ -6,7 +6,7 @@ Ledger API Test Tool
The Ledger API Test Tool is a command line tool for testing the correctness of
implementations of the :doc:`Ledger API
</app-dev/ledger-api>`, i.e. DAML ledgers. For example, it
will show you if there are consistency or conformance problems with your
implementation.

View File

@ -21,7 +21,7 @@
</div>
<div class="box box3">
<h3>Building applications</h3>
<a href="{{ safe_pathto('app-dev/app-arch') }}" class="btn">Go <img src="{{ pathto('_static/images/boxes/GoButtonArrow.svg', 1) }}"></a>
</div>
<div class="box box4">
<h3>SDK tools</h3>
@ -35,4 +35,4 @@
<h3>Examples</h3>
<a href="{{ safe_pathto('examples/examples') }}" class="btn">Go <img src="{{ pathto('_static/images/boxes/GoButtonArrow.svg', 1) }}"></a>
</div>
</div>
</div>