Merge remote-tracking branch 'origin/develop' into expose-overlay
.github/pull_request_template.md (new file)
@@ -0,0 +1,8 @@
<!-- Provide a clear overview of your changes. -->

I have:

- [ ] Based work on the latest `develop` branch
- [ ] Checked my changes for lint with `hlint .` (lint found in code you did not write can be left alone)
- [ ] Run the test suite: `$(nix-build -A selftest --no-out-link)`
- [ ] (Optional) Run CI tests locally: `nix-build release.nix -A build.x86_64-linux --no-out-link` (or `x86_64-darwin` on macOS)
.gitignore
@@ -2,14 +2,13 @@
**/.cabal-sandbox
**/cabal.sandbox.config
result
result-*
*.o
*.hi

TAGS
tags
ctags
dist-newstyle
cabal.project.local
.ghc.environment.*

**/ghcid-output.txt
.hlint.yaml (new file)
@@ -0,0 +1,65 @@
# HLint configuration file
# https://github.com/ndmitchell/hlint
##########################

# This file contains a template configuration file, which is typically
# placed as .hlint.yaml in the root of your project

# Specify additional command line arguments
#
# - arguments: [--color, --cpp-simple, -XQuasiQuotes]

# Control which extensions/flags/modules/functions can be used
#
# - extensions:
#   - default: false # all extensions are banned by default
#   - name: [PatternGuards, ViewPatterns] # only these listed extensions can be used
#   - {name: CPP, within: CrossPlatform} # CPP can only be used in a given module
#
# - flags:
#   - {name: -w, within: []} # -w is allowed nowhere
#
# - modules:
#   - {name: [Data.Set, Data.HashSet], as: Set} # if you import Data.Set qualified, it must be as 'Set'
#   - {name: Control.Arrow, within: []} # Certain modules are banned entirely
#
# - functions:
#   - {name: unsafePerformIO, within: []} # unsafePerformIO can only appear in no modules

# Add custom hints for this project
#
# Will suggest replacing "wibbleMany [myvar]" with "wibbleOne myvar"
# - error: {lhs: "wibbleMany [x]", rhs: wibbleOne x}

# Turn on hints that are off by default
#
# Ban "module X(module X) where", to require a real export list
# - warn: {name: Use explicit module export list}
#
# Replace a $ b $ c with a . b $ c
# - group: {name: dollar, enabled: true}
#
# Generalise map to fmap, ++ to <>
# - group: {name: generalise, enabled: true}

# Ignore some builtin hints
- ignore: {name: Redundant do}
- ignore: {name: Redundant flip}
- ignore: {name: Use <$>}
- ignore: {name: Use camelCase}
- ignore: {name: Use if}
- ignore: {name: Use unless}
# - ignore: {name: Use const, within: SpecialModule} # Only within certain modules

# Define some custom infix operators
# - fixity: infixr 3 ~^#^~

# To generate a suitable file for HLint do:
# $ hlint --default > .hlint.yaml
@@ -1,9 +1,79 @@
# Contribution Guide

Contributions and issue reports are encouraged and appreciated!

- [Opening Issues](#opening-issues)
- [Submitting Changes](#submitting-changes)
  - [Guidelines for Commit Messages](#guidelines-for-commit-messages)
  - [Guidelines for Pull Requests](#guidelines-for-pull-requests)
  - [Code Quality](#code-quality)
  - [Documentation](#documentation)

## Opening Issues

Before opening an issue, please check whether your issue has already been reported. Assuming it has not:

* Describe the issue you're encountering or the suggestion you're making.
* Include any relevant steps to reproduce or code samples you can. It's always easier for us to debug if we have something that demonstrates the error.
* Let us know what version of this project you were using. If you're using a GitHub checkout, provide the git hash.

## Submitting Changes

Most pull requests should target the `develop` branch. `master` is the release branch; `develop` is periodically merged into `master` after a period of testing.

### Guidelines for Commit Messages

#### Summary Line
The summary line of your commit message should summarize the changes being made. Commit messages should be written in the imperative mood and should describe what happens when the commit is applied. If your commit modifies one of the in-tree Haskell packages (found in `./lib`), please prefix your commit summary with the name of the package being modified.

One way to think about it is that your commit message should be able to complete the sentence:
"When applied, this commit will..."

##### Note on bumping dependencies
Commits that update a dependency should explain in the commit message why the dependency was updated.

#### Body
For breaking changes, new features, refactors, or other major changes, the body of the commit message should describe the motivation behind the change in greater detail and may include references to the issue tracker. The body shouldn't repeat code/comments from the diff.
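
A commit following these conventions might look like the sketch below. The package name, flag, and wording are hypothetical, invented for illustration:

```
lib/command: Add --dry-run flag to ob deploy push

Users previously had no way to preview what a deployment would do
before it touched the target machine. A --dry-run flag prints the
planned actions without executing them.
```

Note how the summary line names the modified in-tree package, is written in the imperative mood, and completes the sentence "When applied, this commit will...".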

### Guidelines for Pull Requests

Wherever possible, pull requests should add a single feature or fix a single bug. Pull requests should not bundle several unrelated changes.

### Code Quality

#### Warnings

Your pull request should add no new warnings to the project. It should also generally not disable any warnings.

#### Build and Test

Make sure the project builds and that the tests pass! This will generally also be checked by CI before merge, but trying it yourself first means you'll catch problems earlier and your contribution can be merged that much sooner!

You can run the tests like this:
```bash
$(nix-build -A selftest --no-out-link)
```

To test that your changes build across platforms, you can also try to build `release.nix`, like this:
```bash
nix-build release.nix
```

Note, however, that to build `release.nix` you must accept the Android SDK license agreement, and your machine must be configured to build both iOS and Android executables (usually via remote builders).

### Documentation

#### In the code
We're always striving to improve documentation. Please include [haddock](https://haskell-haddock.readthedocs.io/en/latest/index.html) documentation for any added code, and update the documentation for any code you modify.
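
As a sketch of the expected style, a documented definition looks like the following. The function and types here are made up for illustration, not part of Obelisk's API:

```haskell
-- | Render a user's profile URL from their identifier.
--
-- The identifier is interpolated into the route, so callers need not
-- worry about escaping it themselves.
renderProfileUrl
  :: UserId -- ^ The user whose profile page we are linking to
  -> Text   -- ^ A relative URL suitable for an @href@ attribute
```

Haddock picks up the `-- |` block as the function's description and the `-- ^` annotations as per-argument documentation.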

#### In the [Changelog](ChangeLog.md)
Add an entry to the changelog when your PR:
* Adds a feature
* Deprecates something
* Includes a breaking change
* Makes any other change that will impact users

#### In the [Readme](README.md)
The readme is the first place a lot of people look for information about the repository. Update any parts of the readme that are affected by your PR.
ChangeLog.md (new file)
@@ -0,0 +1,111 @@
# Revision history for obelisk

This project's release branch is `master`. This log is written from the perspective of the release branch: when changes hit `master`, they are considered released, and the date should reflect that release.

## v0.7.0.1

* Fix the version number for `ob`, the command-line tool. ([#679](https://github.com/obsidiansystems/obelisk/pull/679))

## v0.7.0.0

* Fully support HTTP [Range](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Range) requests on static assets, in order to support Safari. ([#664](https://github.com/obsidiansystems/obelisk/pull/664))
* Support non-EC2 deployments. ([#658](https://github.com/obsidiansystems/obelisk/pull/658))
* Fix `ob deploy test android`. ([#645](https://github.com/obsidiansystems/obelisk/pull/645))
* Fix a vulnerability where Android deployments would leave signing keys in the nix store, which is world-readable. ([#645](https://github.com/obsidiansystems/obelisk/pull/645)) (Thanks to [kmicklas](https://github.com/kmicklas) for the report.)
* Add `Obelisk.Backend.runBackendWith` to allow several customizations. ([#668](https://github.com/obsidiansystems/obelisk/pull/668), [#644](https://github.com/obsidiansystems/obelisk/pull/644))
* Add an `ob profile` command to run Obelisk projects with profiling. `ob profile` works like `ob run`, but instead of using `ghci`, it builds an executable with profiling enabled. ([#654](https://github.com/obsidiansystems/obelisk/pull/654))
* Obelisk's `default.nix` now exposes the `mkAssets` function, which is used to construct the assets served by an Obelisk application. ([#651](https://github.com/obsidiansystems/obelisk/pull/651))
* Bump reflex-platform to v0.5.2.0. ([#671](https://github.com/obsidiansystems/obelisk/pull/671))

## v0.6.0.0 - 2020-02-21

* Fix a bug in `Obelisk.Route.Frontend` where `routeLink`, `routeLinkDynAttr`, and `dynRouteLink` would not behave exactly like `<a href="...">` when run by JavaScript. These functions now scroll to the top of the page when the link is clicked. ([#540](https://github.com/obsidiansystems/obelisk/pull/540))
* Fix a bug in `ob run`/`ob watch`/`ob repl` where nested Obelisk projects would also be loaded into the session. These are now ignored. ([#652](https://github.com/obsidiansystems/obelisk/pull/652))
* Improve behavior of `ob run`/`ob watch`/`ob repl` when multiple packages with the same name are encountered. Obelisk now issues a warning and tells you which one it will use. ([#653](https://github.com/obsidiansystems/obelisk/pull/653))
* Removed `Obelisk.Backend.mkRouteToUrl` since it is easily written in terms of `Obelisk.Route.renderObeliskRoute`:

      mkRouteToUrl validFullEncoder (k :/ v) = renderObeliskRoute validFullEncoder (FullRoute_Frontend (ObeliskRoute_App k) :/ v)

* Add `Obelisk.Backend.renderAllJsPath` to expose the URL path to `ghcjs/all.js`. ([#545](https://github.com/obsidiansystems/obelisk/pull/545))
* Add an argument to `serveDefaultObeliskApp`, `serveObeliskApp`, and `serveGhcjsApp` to take the path to `all.js` instead of hard-coding it. ([#545](https://github.com/obsidiansystems/obelisk/pull/545))

## v0.5.0.0 - 2020-02-07

* Add `Obelisk.Route.(?/)`, a convenience function for constructing routes nested in `Maybe`. ([#457](https://github.com/obsidiansystems/obelisk/pull/457))
* Add local unpacked packages to the `ob run`, `ob watch`, and `ob repl` sessions. Any `.cabal` or hpack package inside the current obelisk project will be loaded into the session. For `ob run`/`ob watch` this means the session will automatically reload when you save a source file in any of those packages. For `ob repl` it means that `:r` will reload changes to any of those packages. There are some edge cases where this integration is still rough. Report any issues you encounter. ([#489](https://github.com/obsidiansystems/obelisk/pull/489))
* Add an `ob hoogle` command to start a local [Hoogle](https://hoogle.haskell.org/) server for the project. ([#628](https://github.com/obsidiansystems/obelisk/pull/628))
* `ob thunk pack` will now attempt to automatically detect whether the thunk is a private or public repo. To avoid this detection, specify `--private` or `--public` manually. ([#607](https://github.com/obsidiansystems/obelisk/pull/607))
* Fix a bug in the plain git thunk loader for thunks marked as 'private' when the revision is not in the default branch. ([#648](https://github.com/obsidiansystems/obelisk/pull/648))
* Improve handling of runtime nix dependencies. This may fix some issues encountered particularly by users on systems other than NixOS.

## v0.4.0.0 - 2020-01-10

* Bump reflex-platform which, notably, bumps nixpkgs to 19.09. ([#585](https://github.com/obsidiansystems/obelisk/pull/585))
* Add a new thunk loader for Git repositories that supports `file://` Git remotes and supports private repositories via `builtins.fetchGit` (when the `git.json` file specifies `"private": true`). ([#594](https://github.com/obsidiansystems/obelisk/pull/594))
* Add a new thunk loader for GitHub repositories that uses `builtins.fetchTarball` for public repositories to increase loader performance and uses `fetchFromGitHub` for private repositories (when the `github.json` file specifies `"private": true`). Note that `fetchFromGitHub` requires some Nix configuration for the Nix builder to access the repository. If `ob thunk pack` fails in this case, use `-v` to see Nix's helpful message. ([#594](https://github.com/obsidiansystems/obelisk/pull/594))
* Add `--public`/`--private` options to `ob thunk pack` to specify whether a repository should be treated as public or private. ([#594](https://github.com/obsidiansystems/obelisk/pull/594))
* Improve error messaging when a dependency doesn't have the expected `.cabal` or `package.yaml` file. ([#597](https://github.com/obsidiansystems/obelisk/pull/597))
* Improve the skeleton in small ways. ([#593](https://github.com/obsidiansystems/obelisk/pull/593), [#589](https://github.com/obsidiansystems/obelisk/pull/589))
* Fix `ob` commands to again support running from any subdirectory of an obelisk project. ([#591](https://github.com/obsidiansystems/obelisk/pull/591))
* Add a `reflex-platform-func` argument to Obelisk's `default.nix`. It defaults to its prior behavior of using the reflex-platform found in `dep`. ([#612](https://github.com/obsidiansystems/obelisk/pull/612))

## v0.3.0.0 - 2019-12-20

* Change the structure of Obelisk routes to use a designated `FullRoute` type. This combines frontend and backend routes into one structure. This is a **breaking** change which requires Obelisk apps to take specific migrations. They are:
  * Rewrite the implementation of `_backend_routeEncoder` in `Backend` to use `mkFullRouteEncoder` instead of `handleEncoder`. Specifically, the backend and frontend cases of the top-level `pathComponentEncoder` become the second and third arguments of `mkFullRouteEncoder` respectively, while the missing route becomes the first argument. An example of how to do this is available in [a reflex-examples commit](https://github.com/reflex-frp/reflex-examples/commits/28f566c3e7dc615578dc74297b7c620c1f13683e).
  * Replace type constructions of `InL` with `FullRoute_Backend` and `InR` with `FullRoute_Frontend`.
* Generalised `pathSegmentEncoder`; added `pathFieldEncoder`.
* Added some `Prism`s to the encoder library for manipulating `DSum`s.
* Add an `ob doc` command, which lists paths to haddock documentation for specified packages.
* Bump reflex-platform so that obelisk now uses GHC 8.6.5 and the nixos-19.03 nixpkgs set.
* Add support in `obelisk-route` for single parameters in URL paths.
* Bump reflex-platform so that obelisk now uses reflex-dom 0.5.2.0.
* Use a `--pure` nix shell in `ob run` for parity with `ob repl` and more resilience against "works on my machine".
* Pin usages of `<nixpkgs>` in obelisk thunks, etc. to the nixpkgs used by the project's obelisk impl.
* Backport ACMEv2 support in the obelisk server to regain LetsEncrypt account creation.
* Enable HTTPS in `ob run`.
* `ob run` now handles `ghci` errors better and includes a custom `ghcid` version. As a result, you no longer need to have ghcid installed to use `ob run`, as we provide one for you.
* `ob` commands now complain less on systems with umasks other than `0022`.
* Ignore package environment files in `ob run` and `ob repl`.
* Add `Obelisk.Route.Frontend.routeLinkDynAttr`.
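
The `mkFullRouteEncoder` migration described under v0.3.0.0 typically ends up looking something like the following sketch. The `BackendRoute_Missing` and `FrontendRoute_Main` constructor names are assumptions here; they vary per app:

```haskell
backend :: Backend BackendRoute FrontendRoute
backend = Backend
  { _backend_routeEncoder = mkFullRouteEncoder
      -- First argument: the route to use when nothing else matches.
      (FullRoute_Backend BackendRoute_Missing :/ ())
      -- Second argument: the backend cases of the old top-level
      -- pathComponentEncoder.
      (\case BackendRoute_Missing -> PathSegment "missing" $ unitEncoder mempty)
      -- Third argument: the frontend cases.
      (\case FrontendRoute_Main -> PathEnd $ unitEncoder mempty)
  , _backend_run = \serve -> serve $ const $ return ()
  }
```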

## v0.2.0.0 - 2019-08-17

* Configs become ByteStrings.
* `FrontendConfigsT` has been changed into `ConfigsT`, and configs are made available via `getConfig`/`getConfigs`.
* The frontend will still only have access to configs that are placed in `config/frontend` and `config/common`, while the backend has access to the entire contents of the config directory via `Obelisk.ExecutableConfig.Lookup.getConfigs`.
* The backend no longer runs in `BackendConfigsT`.
* Add the tabulation package. See `Data.Tabulation` for details.
* Add encoders for `DMap`, `HasFields` (cf. `Data.Tabulation`), and JSON.
* Use an IP address for the nginx proxy pass instead of localhost.

## v0.1.1.0 - 2019-05-17

* Fix crashes of Android apps on 32-bit ARM devices.
* Provide a way to indicate acceptance of the Android SDK license by passing `config.android_sdk.accept_license = true;` in the arguments to the import of `.obelisk/impl` in the project's `default.nix`.
* Add a `COMPLETE` pragma to `(:/)`. Using this pattern synonym should no longer generate spurious warnings about non-exhaustive pattern matching.
* Make asset path hashing strict (see `Obelisk.Asset.Gather`).
* Add the `ob shell` command to enter a nix shell for an obelisk project.
* Allow the skeleton's obelisk to be overridden. This changes the skeleton's `default.nix` interface: the arguments that it used to take are now part of the new "obelisk" argument.
* Removed `MonadIO` from `ObeliskWidget` to prevent accidental IO during prerendering. If you need to do IO in a widget, it should be on the right-hand side of a `prerender`.
* Significantly changed the interface to the "executable config" packages. `obelisk-executable-config-lookup` is a new internal package which looks up all configs in a platform-specific way. `obelisk-executable-frontend` and `obelisk-executable-backend` provide MTL-style monad classes (`HasFrontendConfigs` and `HasBackendConfigs`) which the frontend and backend, respectively, can use to look up configs. This replaces the old `get` function which ran in `IO`.
* Add a flag to force thunk packing even if there are unpushed changes in the unpacked thunk.
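
The Android SDK license acceptance mentioned above is passed where the project imports its own obelisk. A sketch of the relevant fragment of a project's `default.nix` follows; the exact surrounding structure varies by skeleton version:

```nix
import ./.obelisk/impl {
  system = builtins.currentSystem;
  # Declares that you have read and accepted the Android SDK license.
  config.android_sdk.accept_license = true;
}
```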

## v0.1.0.0 - 2019-03-29

* Use reflex-dom's `HydrationDomBuilder` to "hydrate" statically rendered DOM served by the Obelisk backend (rather than re-creating DOM and replacing it all).
* Add `HasCookies` and `CookiesT` to allow `ObeliskWidget`s to access cookies during static and "hydrated" DOM rendering.
FAQ.md
@@ -1,6 +1,9 @@
# Frequently Asked Questions

1. [How do I fix invalid entitlements?](#how-do-i-fix-invalid-entitlements)
1. [`ob thunk update` or `ob deploy update` fails](#ob-thunk-update-or-ob-deploy-update-fails)
1. [How do I fix `Ambiguous module name` errors?](#how-do-i-fix-ambiguous-module-name-errors)
1. [What does `ob run` actually do?](#what-does-ob-run-actually-do)

### How do I fix invalid entitlements?

@@ -13,3 +16,107 @@ When this happens you'll see an error something like this:

Fixing the value of `ios.bundleIdentifier` should fix the error.

### `ob thunk update` or `ob deploy update` fails
Whenever an `ob` command fails, try re-running it with `-v`.

If you're using a private repo and you get a failure in `nix-prefetch-url`, you may need to unpack and repack the thunk. Here's some example output that shows this issue:

```
Starting Obelisk </nix/store/j8wls8a89xr6s1a47lg6g83gnbdrfd0l-obelisk-command-0.1/bin/.ob-wrapped> args=["deploy","update","-v"] logging-level=Debug
Creating process: 'nix-build' './.obelisk/impl' '-A' 'command' '--no-out-link'
✔ Built on ./.obelisk/impl [command]
Handing off to /nix/store/j8wls8a89xr6s1a47lg6g83gnbdrfd0l-obelisk-command-0.1/bin/ob
Starting Obelisk </nix/store/j8wls8a89xr6s1a47lg6g83gnbdrfd0l-obelisk-command-0.1/bin/.ob-wrapped> args=["--no-handoff","deploy","update","-v"] logging-level=Debug
Creating process: 'git' 'ls-remote' '--exit-code' '--symref' 'https://github.com/obsidiansystems/some-private-repo.git' 'refs/heads/master'
git ls-remote maps: (fromList [],fromList [(GitRef_Branch "master","8546dfc8be9653f20bb26214e0991dbb957cb290")])
Latest commit in branch master from remote repo https://github.com/obsidiansystems/some-private-repo.git is 8546dfc8be9653f20bb26214e0991dbb957cb290
Creating process: 'nix-prefetch-url' '--unpack' '--type' 'sha256' 'https://github.com/obsidiansystems/some-private-repo/archive/8546dfc8be9653f20bb26214e0991dbb957cb290.tar.gz'
error: unable to download 'https://github.com/obsidiansystems/some-private-repo/archive/8546dfc8be9653f20bb26214e0991dbb957cb290.tar.gz': HTTP error 404
Process exited with code 1; 'nix-prefetch-url' '--unpack' '--type' 'sha256' 'https://github.com/obsidiansystems/some-private-repo/archive/8546dfc8be9653f20bb26214e0991dbb957cb290.tar.gz'
nix-prefetch-url: Failed to determine sha256 hash of URL https://github.com/obsidiansystems/some-private-repo/archive/8546dfc8be9653f20bb26214e0991dbb957cb290.tar.gz
✖ Updating thunk ./src to latest
```

And here's how you can fix it:

```bash
ob thunk unpack $SOME_THUNK
ob thunk pack $SOME_THUNK
```
Substitute the directory of your thunk for `$SOME_THUNK`. In the case of `ob deploy update`, `$SOME_THUNK` will be `src` in the deployment directory.

(Based on issue #351.)

### How do I fix `Ambiguous module name` errors?
Since obelisk loads the common, backend, and frontend packages into the same GHCi session, and GHCi doesn't "sandbox" them, it is possible to get conflicting-module errors inside `ob repl`/`ob watch`/`ob run` that do not appear when doing a cabal build.

You can disambiguate these via [PackageImports](https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/glasgow_exts.html#package-qualified-imports). For instance, if you see
```
error:
    Ambiguous module name ‘Crypto.Hash’:
      it was found in multiple packages:
      cryptohash-0.11.9 cryptonite-0.25
```
then specify the package you want in the import, e.g.:

`import "cryptonite" Crypto.Hash`

### What does `ob run` actually do?

#### Short version:

`ob run` starts a [`ghcid`](https://github.com/ndmitchell/ghcid) process which tries to build your project within a carefully crafted `nix-shell` with all the project's dependencies and,

* either displays compilation errors/warnings,
* or starts the Obelisk server, which serves:
  * your backend's route handlers,
  * the static assets,
  * the [JSaddle](https://github.com/ghcjs/jsaddle) frontend code (while opening its websocket).

#### Longer version:

Assuming we are in a project created with `ob init`, `ob run` calls (see `lib/command/src/Obelisk/Command.hs`):

    nix-shell --pure -A shells.ghc --run 'ob --no-handoff internal run-static-io <real-run-function>'

where

* `shells.ghc` is defined in `./default.nix` by importing `./.obelisk/impl/default.nix`, which is `default.nix` in the present (Obelisk) repository,
* `run-static-io` is a logging-enabled command wrapper (cf. `runObelisk` in `lib/command/src/Obelisk/App.hs`).

In this case, it runs the function `Obelisk.Command.Run.run` (defined in `lib/command/src/Obelisk/Command/Run.hs`):

* It creates a GHCi config from a Nix expression which:
  * loads three packages: `backend`, `common`, `frontend`,
  * obtains a free port number.
* Then it runs `ghcid` with a command that reruns `Obelisk.Run.run` at each restart (option `--test`).

`Obelisk.Run.run` is defined in `lib/run/src/Obelisk/Run.hs`:

* It creates a thread which starts the main backend. This runs `runSnapWithCommandLineArgs` and passes routes:
  * to the backend (the result of the `_backend_run` field of your implementation of the `Backend fullRoute frontendRoute` record),
  * or to `serveDefaultObeliskApp` (`lib/backend/src/Obelisk/Backend.hs`) to serve static assets.
* It starts `runWidget`, which itself runs [runSettingsSocket](https://hackage.haskell.org/package/warp-3.2.26/docs/Network-Wai-Handler-Warp.html#v:runSettingsSocket) (the *“TCP listen loop”*):
  * it binds to a TCP socket, and
  * creates the HTTP connection manager ([`newManager`](https://hackage.haskell.org/package/http-client-0.6.3/docs/Network-HTTP-Client.html#v:newManager)),
  * calls `obeliskApp`, which
    * has a route to serve the [JSaddle](https://github.com/ghcjs/jsaddle) frontend,
    * runs the frontend (*on* the server) to produce pre-rendered pages,
    * starts the JSaddle web-socket,
    * “falls back” to proxying the backend (`fallbackProxy`).
LICENSE (new file)
@@ -0,0 +1,30 @@
Copyright (c) 2020 Obsidian Systems

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright
  notice, this list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above
  copyright notice, this list of conditions and the following
  disclaimer in the documentation and/or other materials provided
  with the distribution.

* Neither the name of Elliot Cameron nor the names of other
  contributors may be used to endorse or promote products derived
  from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
README.md
@@ -1,12 +1,21 @@
# Obelisk

[![Haskell Programming Language](https://img.shields.io/badge/language-Haskell-blue.svg)](http://www.haskell.org)
[![BSD3 License](http://img.shields.io/badge/license-BSD3-brightgreen.svg)](https://tldrlegal.com/license/bsd-3-clause-license-%28revised%29)

Obelisk provides an easy way to develop and deploy your [Reflex](https://github.com/reflex-frp/reflex) project as web apps and as mobile apps.

- [Installing Obelisk](#installing-obelisk)
- [Developing an Obelisk project](#developing-an-obelisk-project)
  - [Adding Packages](#adding-packages)
  - [Adding Package Overrides](#adding-package-overrides)
  - [Running over https](#running-over-https)
- [Deploying](#deploying)
  - [Locally](#locally)
  - [EC2](#ec2)
    - [Default EC2 Deployment](#default-ec2-deployment)
    - [Custom Non-EC2 Deployment](#custom-non-ec2-deployment)
  - [VirtualBox Deployment](#virtualbox-deployment)
  - [From macOS](#from-macos)
  - [Deploying an updated version](#deploying-an-updated-version)
- [Mobile](#mobile)

@@ -19,13 +28,15 @@ Obelisk provides an easy way to develop and deploy your [Reflex](https://github.
1. Set up nix caches
    1. If you are running NixOS, add this to `/etc/nixos/configuration.nix`:
       ```
       nix.binaryCaches = [ "https://nixcache.reflex-frp.org" ];
       nix.binaryCachePublicKeys = [ "ryantrinkle.com-1:JJiAKaRv9mWgpVAz8dwewnZe0AzzEAzPkagE9SP5NWI=" ];
       ```
       and rebuild your NixOS configuration (e.g. `sudo nixos-rebuild switch`).
    1. If you are using another operating system or Linux distribution, ensure that these lines are present in your Nix configuration file (`/etc/nix/nix.conf` on most systems; [see the full list](https://nixos.org/nix/manual/#sec-conf-file)):
       ```
       binary-caches = https://cache.nixos.org https://nixcache.reflex-frp.org
       binary-cache-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= ryantrinkle.com-1:JJiAKaRv9mWgpVAz8dwewnZe0AzzEAzPkagE9SP5NWI=
       binary-caches-parallel-connections = 40
       ```
       * other Linux: enable sandboxing (see [issue 172](https://github.com/obsidiansystems/obelisk/issues/172#issuecomment-411507818) or [issue 6](https://github.com/obsidiansystems/obelisk/issues/6) if you run into build problems)
@@ -46,16 +57,38 @@ Obelisk provides an easy way to develop and deploy your [Reflex](https://github.

When developing on obelisk itself, you may launch `ghcid` for the corresponding project as follows. For example, to launch ghcid for the `lib/backend` project:

```bash
nix-shell -A obeliskEnvs.obelisk-backend --run "cd lib/backend && ghcid -c 'cabal new-repl'"
```

Or to launch ghcid for the `lib/command` project:

```bash
nix-shell -A obeliskEnvs.obelisk-command --run "cd lib/command && ghcid -c 'cabal new-repl'"
```

To re-install `ob` from source, do
```bash
nix-env -f /path/to/obelisk -iA command
```

Note that `ob` will defer to the version found in your project's `.obelisk/impl` directory. To update that version specifically:

```bash
ob thunk unpack ./.obelisk/impl
cd ./.obelisk/impl
# apply your changes
```

If you want to commit your changes, first push them to your fork of obelisk and then:

```bash
cd /your/project/root
ob thunk pack .obelisk/impl
git add .obelisk/impl
git commit -m "Bump obelisk"
```

### Accessing private repositories

To allow the Nix builder to access private git repositories, you must be set up
@ -69,44 +102,103 @@ access to:
|
||||
|
||||
To create a new Obelisk project, go to an empty directory and run:

```bash
ob init
```

Obelisk leverages `ghcid` to provide a live-reloading server that handles both frontend and backend. To run your Obelisk app and monitor the source for changes:

```bash
ob run
```

Now, with an appropriate browser, go to http://localhost:8000 (or the address/port specified in `config/common/route`) to access your app.
Firefox will not be able to run the development website properly due to [issue 460](https://github.com/obsidiansystems/obelisk/issues/460). Fortunately, this problem does not occur on a fully built website.
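The route configuration mentioned above is a single-line file. As a sketch (the exact contents are generated by `ob init`; this value is an assumption matching the default development setup), `config/common/route` holds the full URL the app is served on, and changing the port or scheme here changes what `ob run` serves:

```
http://localhost:8000
```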
Every time you change the Haskell source files in `frontend`, `common`, or `backend`, `ob run` will automatically recompile the modified files and reload the server. It will also display any compilation errors and warnings on screen.

### Adding packages

To add package dependencies, declare them under the `build-depends` field in the appropriate cabal files (`backend`, `common`, and `frontend` each have their own). The corresponding Nix packages will automatically be selected when building.
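For instance, making the `backend` package depend on `aeson` would be a one-line addition to the `build-depends` field of `backend/backend.cabal` (the surrounding fields shown here are illustrative, not the exact skeleton contents):

```cabal
library
  hs-source-dirs: src
  build-depends:
      base
    , common
    , obelisk-backend
    , aeson
```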
### Adding package overrides

To add a version override to any Haskell package, or to add a Haskell package that doesn't exist in the nixpkgs used by Obelisk, use the `overrides` attribute in your project's `default.nix`. For example, to use a specific version of the `aeson` package fetched from GitHub and a specific version of the `waargonaut` package fetched from Hackage, your `default.nix` will look like:

```nix
# ...
project ./. ({ pkgs, ... }: {
  # ...
  overrides = self: super: let
    aesonSrc = pkgs.fetchFromGitHub {
      owner = "obsidiansystems";
      repo = "aeson-gadt-th";
      rev = "ed573c2cccf54d72aa6279026752a3fecf9c1383";
      sha256 = "08q6rnz7w9pn76jkrafig6f50yd0f77z48rk2z5iyyl2jbhcbhx3";
    };
  in
  {
    aeson = self.callCabal2nix "aeson" aesonSrc {};
    waargonaut = self.callHackageDirect {
      pkg = "waargonaut";
      ver = "0.8.0.1";
      sha256 = "1zv28np3k3hg378vqm89v802xr0g8cwk7gy3mr77xrzy5jbgpa39";
    } {};
  };
  # ...
```

For further information, see [the Haskell section](https://nixos.org/nixpkgs/manual/#users-guide-to-the-haskell-infrastructure) of the nixpkgs manual.

### Adding extra local packages

If the standard packages (`frontend`, `backend`, and `common`) are not enough, you can add more local Haskell packages by defining them with the `packages` parameter. The sources of these packages will be automatically reloaded by `ob run`.

```nix
# ...
project ./. ({ pkgs, ... }: {
  # ...
  packages = {
    another = ./another;
  };
  # ...
```

### Running over HTTPS

To run your app locally over HTTPS, update the protocol in `config/common/route` to `https`, and then use `ob run` as normal.

Since Obelisk generates a self-signed certificate for serving HTTPS, the browser will warn about an invalid certificate. On Chrome, you can go to `chrome://flags/#allow-insecure-localhost` to enable invalid certificates for localhost.

## Deploying

### Locally

Build everything:

```bash
nix-build -A exe --no-out-link
```

Copy the result to a new directory, add configuration, and run:

```bash
mkdir test-app
ln -s $(nix-build -A exe --no-out-link)/* test-app/
cp -r config test-app
(cd test-app && ./backend)
```

### Default EC2 Deployment

In this section we will demonstrate how to deploy your Obelisk app to an Amazon EC2 instance. Obelisk deployments are configured for EC2 by default (see [Custom Non-EC2 Deployment](#custom-non-ec2-deployment)).

First, create a new EC2 instance:

1. Launch a NixOS 19.09 EC2 instance (we recommend [this AMI](https://console.aws.amazon.com/ec2/v2/home?region=us-east-1#LaunchInstanceWizard:ami=ami-00a8eeaf232a74f84)).
1. In the instance configuration wizard, ensure that your instance has at least 1GB of RAM and 10GB of disk space.
1. When prompted, save your AWS private key (`~/myaws.pem`) somewhere safe. We'll need it later during deployment.
1. Go to "Security Groups", select your instance's security group, and under the "Inbound" tab add a new rule for HTTP port 80 and HTTPS port 443.

@@ -129,9 +221,11 @@ ob deploy init \

  ~/code/myapp-deploy
```

HTTPS is enabled by default; to disable HTTPS, pass `--disable-https` to the `ob deploy init` command above.

This step will also require that you manually verify the authenticity of the host `$SERVER`. Obelisk will save the fingerprint in a deployment-specific configuration. **Obelisk deployments do *not* rely on the `known_hosts` file of your local machine.** This is because, in the event that you need to switch from one deploy machine or bastion host to another, you want to be absolutely sure that you are still connecting to the machines you think you are, even if that deploy machine has never connected to them before. Obelisk explicitly avoids a workflow that encourages people to accept host keys without checking them, since that could leak production secrets to anyone who manages to MITM you, e.g. via DNS spoofing or cache poisoning. (An active attack is precisely a circumstance in which you may need to switch bastion hosts quickly, e.g. because the attacker has taken one down, or you have taken it down because it was compromised, and in which you might need to deploy to production to fix an exploit or rotate keys.) When you run `ob deploy` later, it will rely on the verification saved in this step.

Next, go to the deployment directory that you just initialized and deploy!

```bash
cd ~/code/myapp-deploy
```

@@ -142,6 +236,26 @@ ob deploy push

At this point, you are done. Your app will be accessible at `${ROUTE}`. The currently deployed version, i.e. the git commit hash of the source repo, can be found at `${ROUTE}/version`.

### Custom Non-EC2 Deployment

By default, Obelisk deployments are configured for NixOS machines running on AWS EC2. To provide your own configuration, you need to write a custom `module.nix` in the deployment repository. This still requires that your server run NixOS.

`module.nix` must contain a Nix *function* that produces a [NixOS module function](https://nixos.org/nixos/manual/index.html#sec-writing-modules). The top-level function takes the deployment configuration as arguments: `hostName`, `adminEmail`, `routeHost`, `enableHttps`, `version`, `exe`, and `nixosPkgs`. Most of these are the values you specified during `ob deploy init`, stored in the deployment repository. `version` is the git hash of the app you're deploying. `exe` is the Linux build of the app (as seen in [Deploying Locally](#locally)). `nixosPkgs` is the package set used to construct the NixOS VM.

The [VirtualBox Deployment](#virtualbox-deployment) section provides an example.

#### VirtualBox Deployment

Here's a `module.nix` configured for deployment to a VirtualBox VM running NixOS:

```nix
{ nixosPkgs, ... }: {...}: {
  imports = [ (nixosPkgs.path + /nixos/modules/virtualisation/virtualbox-image.nix) ];
}
```

The `{...}:` and what follows is the [NixOS module](https://nixos.org/nixos/manual/index.html#sec-writing-modules) definition.

### From macOS

Deploying from macOS requires some extra setup:

@@ -155,7 +269,7 @@ Running `ob deploy push` will give you additional setup instructions.

If you'd like to deploy an updated version (with new commits) of your Obelisk app, simply go to the configuration directory, update the source thunk, and push:

```bash
cd ~/code/myapp-deploy
ob deploy update
ob deploy push
```
@@ -181,14 +295,14 @@ These versions will work out of the box but iOS SDKs prior to 11.3 should also w

More recent Xcodes should also work, as long as one of the SDKs mentioned above has been used.
To add another SDK to your current Xcode, [download](https://developer.apple.com/download/more/) the corresponding Xcode, extract it, and copy its SDK folder next to the installed one, e.g.

```bash
open -W Xcode_9.2.xip
sudo cp -R Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS11.2.sdk
```

You can verify that you have the correct versions by running:

```bash
xcodebuild -showsdks
```

@@ -214,7 +328,7 @@ Ensure that `bundleIdentifier` matches the App ID of the development profile, or

1. Connect the registered iPhone.
1. Find your Apple Team ID in the [developer portal](https://developer.apple.com/account/#/membership).
1. Run the deploy command with your Team ID:

```bash
result-ios/bin/deploy [TEAM_ID]
# or in debug mode via lldb:
result-ios/bin/deploy [TEAM_ID] -d
```

@@ -224,7 +338,7 @@ result-ios/bin/deploy [TEAM_ID] -d

1. Go to the [developer portal's distribution profiles](https://developer.apple.com/account/ios/profile/production) and create and download a distribution profile.
1. Run the package script with your Team ID and your distribution profile to create a `.ipa`:

```bash
result-ios/bin/package [TEAM_ID] /path/to/output/.ipa /path/to/profile/file
```

@@ -237,7 +351,10 @@ It's also possible to inspect iOS WkWebView apps once they are installed in the

### Android

NOTE: Currently, Android builds are only supported on Linux.

1. In your project's `default.nix`, set suitable values for `android.applicationId` and `android.displayName`.
1. In your project's `default.nix`, pass `config.android_sdk.accept_license = true;` in the arguments to the import of `.obelisk/impl` to indicate your acceptance of the [Android Software Development Kit License Agreement](https://developer.android.com/studio/terms), which is required to build Android apps.
1. Run `nix-build -A android.frontend -o result-android` to build the Android app.
1. A debug version of the app will be generated at `result-android/android-app-debug.apk`.
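Steps 1 and 2 above can be sketched in a project's `default.nix` as follows (a minimal sketch; the application ID and display name are placeholders, and the surrounding shape is assumed to follow the generated skeleton):

```nix
# default.nix (sketch; IDs are placeholders)
{ system ? builtins.currentSystem }:
(import ./.obelisk/impl {
  inherit system;
  # Accept the Android SDK license, required for Android builds
  config.android_sdk.accept_license = true;
}).project ./. ({ ... }: {
  android.applicationId = "com.example.myapp";
  android.displayName = "My App";
})
```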
@@ -254,9 +371,9 @@ This command will accomplish the following:

1. Build a signed Android APK for your application
1. Deploy the signed APK to your connected Android device

In the event that you change your key or keystore password, you will have to update your credentials within the JSON object found in `android_keytool_config.json`.

Additional documentation on Java key stores can be found [here](https://docs.oracle.com/javase/8/docs/technotes/tools/unix/keytool.html).

This should copy over and install the application on your device (if you see a "*signatures do not match*" error, simply uninstall the previous app from the device before retrying the deploy). The name of the installed application will be whatever you specified for `android.displayName` in `default.nix`.

all-builds.nix (new file, 104 lines)

@@ -0,0 +1,104 @@
```nix
{ self-args ? {
    config.android_sdk.accept_license = true;
    iosSdkVersion = "10.2";
  }
, local-self ? import ./. self-args
, supportedSystems ? [ builtins.currentSystem ]
}:

let
  inherit (local-self.nixpkgs) lib runCommand nix;
  cacheBuildSystems = supportedSystems;

  obeliskPackagesCommon = [
    "obelisk-frontend"
    "obelisk-route"
    "obelisk-executable-config-lookup"
  ];

  obeliskPackagesBackend = obeliskPackagesCommon ++ [
    "obelisk-asset-manifest"
    "obelisk-asset-serve-snap"
    "obelisk-backend"
    "obelisk-cliapp"
    "obelisk-command"
    "obelisk-executable-config-inject"
    "obelisk-frontend"
    "obelisk-run"
    "obelisk-route"
    "obelisk-selftest"
    "obelisk-snap-extras"
  ];

  pnameToAttrs = pkgsSet: pnames:
    lib.listToAttrs (map
      (name: { inherit name; value = pkgsSet.${name}; })
      pnames);

  collect = v:
    if lib.isDerivation v then [v]
    else if lib.isAttrs v then lib.concatMap collect (builtins.attrValues v)
    else if lib.isList v then lib.concatMap collect v
    else [];

  perPlatform = lib.genAttrs cacheBuildSystems (system: let
    reflex-platform = import ./dep/reflex-platform { inherit system; };

    mkPerProfiling = profiling: let
      obelisk = import ./. (self-args // { inherit system profiling; });
      ghc = pnameToAttrs
        obelisk.haskellPackageSets.ghc
        obeliskPackagesBackend;
      ghcjs = pnameToAttrs
        obelisk.haskellPackageSets.ghcjs
        obeliskPackagesCommon;
      command = obelisk.command;
      skeleton = import ./skeleton { inherit obelisk; };
      serverSkeletonExe = skeleton.exe;
      # TODO fix nixpkgs so it doesn't try to run the result of haskell shells as setup hooks.
      serverSkeletonShell = local-self.nixpkgs.runCommand "shell-safe-for-dep" {} ''
        touch "$out"
        echo "return" >> "$out"
        cat "${skeleton.shells.ghc}" >> "$out"
      '';
      androidSkeleton = (import ./skeleton { inherit obelisk; }).android.frontend;
      iosSkeleton = (import ./skeleton { inherit obelisk; }).ios.frontend;
      nameSuffix = if profiling then "profiled" else "unprofiled";
      packages = {
        skeletonProfiledObRun = skeleton.__unstable__.profiledObRun;
        inherit
          command
          serverSkeletonShell
          ghc
          ;
      } // lib.optionalAttrs (!profiling) {
        inherit
          ghcjs
          serverSkeletonExe
          ;
      } // lib.optionalAttrs reflex-platform.androidSupport {
        inherit androidSkeleton;
      } // lib.optionalAttrs reflex-platform.iosSupport {
        inherit iosSkeleton;
      };
    in packages // {
      cache = reflex-platform.pinBuildInputs
        "obelisk-${system}-${nameSuffix}"
        (collect packages);
    };

    perProfiling = {
      profiled = mkPerProfiling true;
      unprofiled = mkPerProfiling false;
    };
  in perProfiling // {
    cache = reflex-platform.pinBuildInputs
      "obelisk-${system}"
      (map (p: p.cache) (builtins.attrValues perProfiling));
  });

  metaCache = local-self.reflex-platform.pinBuildInputs
    "obelisk-everywhere"
    (map (a: a.cache) (builtins.attrValues perPlatform));

in perPlatform // { inherit metaCache; }
```
all-tests.nix (new file, 145 lines)

@@ -0,0 +1,145 @@

```nix
{
  supportedSystems ? [ builtins.currentSystem ]
}:
let
  nginxRoot = "/run/nginx";
  obelisk = import ./default.nix {};
  # Get a NixOS 20.03 pre-release that contains the Python-based tests and recursive Nix
  pkgs = import (builtins.fetchTarball https://github.com/nixos/nixpkgs/archive/3de5266.tar.gz) {};
  sshKeys = import (pkgs.path + /nixos/tests/ssh-keys.nix) pkgs;
  make-test = import (pkgs.path + /nixos/tests/make-test-python.nix);
  obelisk-everywhere = (import ./all-builds.nix { inherit supportedSystems; }).x86_64-linux.cache;
  snakeOilPrivateKey = sshKeys.snakeOilPrivateKey.text;
  snakeOilPublicKey = sshKeys.snakeOilPublicKey;
in
make-test ({...}: {
  name = "obelisk";
  nodes = {
    githost = {
      networking.firewall.allowedTCPPorts = [ 22 80 ];
      services.openssh = {
        enable = true;
      };
      environment.systemPackages = [
        pkgs.git
      ];
      users.users.root.openssh.authorizedKeys.keys = [
        snakeOilPublicKey
      ];
    };

    client = {
      imports = [
        (pkgs.path + /nixos/modules/installer/cd-dvd/channel.nix)
      ];
      nix.useSandbox = false;
      nix.binaryCaches = [];
      environment.systemPackages = [
        obelisk.command
        obelisk.shell
        obelisk-everywhere
        pkgs.git
      ];
    };
  };

  testScript =
    let
      privateKeyFile = pkgs.writeText "id_rsa" ''${snakeOilPrivateKey}'';
      thunkableSample = pkgs.writeText "default.nix" ''
        let pkgs = import <nixpkgs> {}; in pkgs.git
      '';
      invalidThunkableSample = pkgs.writeText "default.nix" ''
        let pkgs = import <nixpkgs> {}; in pkgtypo.git
      '';
      sshConfigFile = pkgs.writeText "ssh_config" ''
        Host *
          StrictHostKeyChecking no
          UserKnownHostsFile=/dev/null
          ConnectionAttempts=1
          ConnectTimeout=1
          IdentityFile=~/.ssh/id_rsa
          User=root
      '';
    in ''
      start_all()
      githost.wait_for_open_port("22")

      with subtest("test obelisk is installed"):
          client.succeed("ob --help")

      with subtest("test the client can access the server via ssh"):
          client.succeed("mkdir -p ~/.ssh/")
          client.succeed(
              "cp ${privateKeyFile} ~/.ssh/id_rsa"
          )
          client.succeed("chmod 600 ~/.ssh/id_rsa")
          client.wait_until_succeeds(
              "ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -i ~/.ssh/id_rsa githost true"
          )
          client.succeed(
              "cp ${sshConfigFile} ~/.ssh/config"
          )
          client.wait_until_succeeds("ssh githost true")

      with subtest("test a remote bare repo can be started"):
          githost.succeed("mkdir -p ~/myorg/myapp.git")
          githost.succeed("cd ~/myorg/myapp.git && git init --bare")

      with subtest("test a git project can be configured with a remote using ssh"):
          client.succeed("mkdir -p ~/code/myapp")
          client.succeed("cd ~/code/myapp && git init")
          client.succeed(
              "cp ${thunkableSample} ~/code/myapp/default.nix"
          )
          client.succeed("cd ~/code/myapp && git add .")
          client.succeed('git config --global user.email "you@example.com"')
          client.succeed('git config --global user.name "Your Name"')
          client.succeed('cd ~/code/myapp && git commit -m "Initial"')
          client.succeed(
              "cd ~/code/myapp && git remote add origin root@githost:/root/myorg/myapp.git"
          )

      with subtest("test pushing code to the remote"):
          client.succeed("cd ~/code/myapp && git push -u origin master")
          client.succeed("cd ~/code/myapp && git status")

      with subtest("test obelisk can pack"):
          client.succeed("ob -v thunk pack ~/code/myapp")
          client.succeed("grep -qF 'git' ~/code/myapp/default.nix")
          client.succeed("grep -qF 'myorg' ~/code/myapp/git.json")
          client.succeed("ob -v thunk unpack ~/code/myapp")

      with subtest("test obelisk can set the public / private flag"):
          client.succeed("ob -v thunk pack ~/code/myapp --private")
          client.fail("""grep -qF '"private": false' ~/code/myapp/git.json""")
          client.succeed("""grep -qF '"private": true' ~/code/myapp/git.json""")
          client.succeed("nix-build ~/code/myapp")
          client.succeed("ob -v thunk unpack ~/code/myapp")
          client.succeed("ob -v thunk pack ~/code/myapp --public")
          client.succeed("""grep -qF '"private": false' ~/code/myapp/git.json""")
          client.succeed("nix-build ~/code/myapp")
          client.succeed("ob -v thunk unpack ~/code/myapp")

      with subtest("test building an invalid thunk fails"):
          client.succeed("cd ~/code/myapp && git checkout -b bad")
          client.succeed(
              "cp ${invalidThunkableSample} ~/code/myapp/default.nix"
          )
          client.succeed("cd ~/code/myapp && git add .")
          client.succeed('git config --global user.email "you@example.com"')
          client.succeed('git config --global user.name "Your Name"')
          client.succeed('cd ~/code/myapp && git commit -m "Bad commit"')
          client.succeed("cd ~/code/myapp && git push -u origin bad")
          client.succeed("ob -v thunk pack ~/code/myapp --public")
          client.fail("nix-build ~/code/myapp")
          client.succeed("ob -v thunk unpack ~/code/myapp")
          client.succeed("cd ~/code/myapp && git checkout master")

      with subtest("test obelisk can detect private repos"):
          client.succeed("ob -v thunk pack ~/code/myapp")
          client.succeed("""grep -qF '"private": true' ~/code/myapp/git.json""")
          client.succeed("ob -v thunk unpack ~/code/myapp")
    '';
}) {}
```
default.nix (301 changed lines)

@@ -1,31 +1,31 @@

```nix
{ system ? builtins.currentSystem
, profiling ? false
, iosSdkVersion ? "10.2"
, __useLegacyCompilers ? false
, config ? {}
, reflex-platform-func ? import ./dep/reflex-platform
}:
let
  reflex-platform = getReflexPlatform { inherit system; };
  inherit (reflex-platform) hackGet nixpkgs;
  pkgs = nixpkgs;

  getReflexPlatform = getReflexPlatform' __useLegacyCompilers;
  getReflexPlatform' = __useLegacyCompilers: sys: import ./dep/reflex-platform {
    inherit iosSdkVersion __useLegacyCompilers;
    system = sys;
    enableLibraryProfiling = profiling;
  inherit (import dep/gitignore.nix { inherit (nixpkgs) lib; }) gitignoreSource;

  getReflexPlatform = { system, enableLibraryProfiling ? profiling }: reflex-platform-func {
    inherit iosSdkVersion config system enableLibraryProfiling;

    nixpkgsOverlays = [
      (import ./nixpkgs-overlays)
    ];

    haskellOverlays = [
      (import ./haskell-overlays/misc-deps.nix)
      (import ./haskell-overlays/misc-deps.nix { inherit hackGet; })
      pkgs.obeliskExecutableConfig.haskellOverlay
      (import ./haskell-overlays/obelisk.nix)
      (import ./haskell-overlays/tighten-ob-exes.nix)
    ];
  };

  reflex-platform = getReflexPlatform' false system;
  inherit (reflex-platform) hackGet nixpkgs;
  pkgs = nixpkgs;

  # The haskell environment used to build Obelisk itself, e.g. the 'ob' command
  ghcObelisk = reflex-platform.ghc;
```

@@ -43,19 +43,18 @@ in rec {

```nix
  pathGit = ./.; # Used in CI by the migration graph hash algorithm to correctly ignore files.
  path = reflex-platform.filterGit ./.;
  obelisk = ghcObelisk;
  obeliskEnvs = ghcObeliskEnvs;
  obeliskEnvs = pkgs.lib.filterAttrs (k: _: pkgs.lib.strings.hasPrefix "obelisk-" k) ghcObeliskEnvs;
  command = ghcObelisk.obelisk-command;
  shell = pinBuildInputs "obelisk-shell" ([command] ++ command.commandRuntimeDeps) [];
  shell = pinBuildInputs "obelisk-shell" ([command] ++ command.commandRuntimeDeps);

  selftest = pkgs.writeScript "selftest" ''
    #!/usr/bin/env bash
    #!${pkgs.runtimeShell}
    set -euo pipefail

    PATH="${command}/bin:$PATH"
    export OBELISK_IMPL="${hackGet ./.}"
    cd ${./.}
    "${ghcObelisk.obelisk-selftest}/bin/obelisk-selftest" +RTS -N -RTS "$@"
  '';
  #TODO: Why can't I build ./skeleton directly as a derivation? `nix-build -E ./.` doesn't work
  skeleton = pkgs.runCommand "skeleton" {
    dir = builtins.filterSource (path: type: builtins.trace path (baseNameOf path != ".obelisk")) ./skeleton;
  } ''
```

@@ -74,27 +73,42 @@ in rec {

```nix
    obelisk-asset-manifest-generate "$src" "$haskellManifest" ${packageName} ${moduleName} "$symlinked"
  '';

  compressedJs = frontend: optimizationLevel: pkgs.runCommand "compressedJs" { buildInputs = [ pkgs.closurecompiler ]; } ''
  compressedJs = frontend: optimizationLevel: pkgs.runCommand "compressedJs" {} ''
    mkdir $out
    cd $out
    # TODO profiling + static shouldn't break and need an ad-hoc workaround like that
    ln -s "${haskellLib.justStaticExecutables frontend}/bin/frontend.jsexe/all.js" all.unminified.js
    closure-compiler --externs "${reflex-platform.ghcjsExternsJs}" -O ${optimizationLevel} --jscomp_warning=checkVars --create_source_map="all.js.map" --source_map_format=V3 --js_output_file="all.js" all.unminified.js
    echo "//# sourceMappingURL=all.js.map" >> all.js
    ${if optimizationLevel == null then ''
      ln -s all.unminified.js all.js
    '' else ''
      ${pkgs.closurecompiler}/bin/closure-compiler --externs "${reflex-platform.ghcjsExternsJs}" -O ${optimizationLevel} --jscomp_warning=checkVars --create_source_map="all.js.map" --source_map_format=V3 --js_output_file="all.js" all.unminified.js
      echo "//# sourceMappingURL=all.js.map" >> all.js
    ''}
  '';

  serverModules = {
    mkBaseEc2 = { hostName, routeHost, enableHttps, adminEmail, ... }: {...}: {
    mkBaseEc2 = { nixosPkgs, ... }: {...}: {
      imports = [
        (pkgs.path + /nixos/modules/virtualisation/amazon-image.nix)
        (nixosPkgs.path + /nixos/modules/virtualisation/amazon-image.nix)
      ];
      ec2.hvm = true;
    };

    mkDefaultNetworking = { adminEmail, enableHttps, hostName, routeHost, ... }: {...}: {
      networking = {
        inherit hostName;
        firewall.allowedTCPPorts = if enableHttps then [ 80 443 ] else [ 80 ];
      };

      # `amazon-image.nix` already sets these but if the user provides their own module then
      # forgetting these can cause them to lose access to the server!
      # https://github.com/NixOS/nixpkgs/blob/fab05f17d15e4e125def4fd4e708d205b41d8d74/nixos/modules/virtualisation/amazon-image.nix#L133-L136
      services.openssh.enable = true;
      services.openssh.permitRootLogin = "prohibit-password";

      security.acme.certs = if enableHttps then {
        "${routeHost}".email = adminEmail;
      } else {};
      ec2.hvm = true;
    };

    mkObeliskApp =
```

@@ -115,7 +129,8 @@ in rec {

```nix
        enableACME = enableHttps;
        forceSSL = enableHttps;
        locations.${baseUrl} = {
          proxyPass = "http://localhost:" + toString internalPort;
          proxyPass = "http://127.0.0.1:" + toString internalPort;
          proxyWebsockets = true;
        };
      };
    };
```

@@ -123,10 +138,11 @@ in rec {

```nix
      wantedBy = [ "multi-user.target" ];
      after = [ "network.target" ];
      restartIfChanged = true;
      path = [ pkgs.gnutar ];
      script = ''
        ln -sft . '${exe}'/*
        mkdir -p log
        exec ./backend ${backendArgs} >>backend.out 2>>backend.err </dev/null
        exec ./backend ${backendArgs} </dev/null
      '';
      serviceConfig = {
        User = user;
```

@@ -149,119 +165,184 @@ in rec {

```nix
    };
  };

  inherit mkAssets;

  serverExe = backend: frontend: assets: optimizationLevel: version:
    pkgs.runCommand "serverExe" {} ''
      mkdir $out
      set -eux
      ln -s "${haskellLib.justStaticExecutables backend}"/bin/* $out/
      ln -s "${if profiling then backend else haskellLib.justStaticExecutables backend}"/bin/* $out/
      ln -s "${mkAssets assets}" $out/static.assets
      ln -s ${mkAssets (compressedJs frontend optimizationLevel)} $out/frontend.jsexe.assets
      echo ${version} > $out/version
    '';

  server = { exe, hostName, adminEmail, routeHost, enableHttps, version }@args:
  server = { exe, hostName, adminEmail, routeHost, enableHttps, version, module ? serverModules.mkBaseEc2 }@args:
    let
      nixos = import (pkgs.path + /nixos);
    in nixos {
      system = "x86_64-linux";
      configuration = {
        imports = [
          (serverModules.mkBaseEc2 args)
          (module { inherit exe hostName adminEmail routeHost enableHttps version; nixosPkgs = pkgs; })
          (serverModules.mkDefaultNetworking args)
          (serverModules.mkObeliskApp args)
        ];
      };
    };

  # An Obelisk project is a reflex-platform project with a predefined layout and role for each component
  project = base: projectDefinition:
    let projectOut = sys: (getReflexPlatform sys).project (args@{ nixpkgs, ... }:
      let mkProject = { android ? null #TODO: Better error when missing
            , ios ? null #TODO: Better error when missing
            , packages ? {}
            , overrides ? _: _: {}
            , staticFiles ? base + /static
            , tools ? _: []
            , shellToolOverrides ? _: _: {}
            , withHoogle ? false # Setting this to `true` makes shell reloading far slower
            , __closureCompilerOptimizationLevel ? "ADVANCED"
            }:
        let frontendName = "frontend";
            backendName = "backend";
            commonName = "common";
            staticName = "obelisk-generated-static";
            staticFilesImpure = if lib.isDerivation staticFiles then staticFiles else toString staticFiles;
            processedStatic = processAssets { src = staticFiles; };
            # The packages whose names and roles are defined by this package
            predefinedPackages = lib.filterAttrs (_: x: x != null) {
              ${frontendName} = nullIfAbsent (base + "/frontend");
              ${commonName} = nullIfAbsent (base + "/common");
              ${backendName} = nullIfAbsent (base + "/backend");
            };
            combinedPackages = predefinedPackages // packages;
            projectOverrides = self: super: {
              ${staticName} = haskellLib.dontHaddock (self.callCabal2nix staticName processedStatic.haskellManifest {});
              ${backendName} = haskellLib.addBuildDepend super.${backendName} self.obelisk-run;
            };
            totalOverrides = lib.composeExtensions projectOverrides overrides;
            inherit (lib.strings) hasPrefix;
            privateConfigDirs = ["config/backend"];
            injectableConfig = builtins.filterSource (path: _:
              !(lib.lists.any (x: hasPrefix (toString base + "/" + toString x) (toString path)) privateConfigDirs)
            );
            __android = configPath: {
              ${if android == null then null else frontendName} = {
                executableName = "frontend";
                ${if builtins.pathExists staticFiles then "assets" else null} =
                  nixpkgs.obeliskExecutableConfig.platforms.android.inject (injectableConfig configPath) processedStatic.symlinked;
              } // android;
            };
            __ios = configPath: {
              ${if ios == null then null else frontendName} = {
                executableName = "frontend";
                ${if builtins.pathExists staticFiles then "staticSrc" else null} =
                  nixpkgs.obeliskExecutableConfig.platforms.ios.inject (injectableConfig configPath) processedStatic.symlinked;
              } // ios;
            };
        in {
          inherit shellToolOverrides tools withHoogle;
          overrides = totalOverrides;
          packages = combinedPackages;
          shells = {
            ${if android == null && ios == null then null else "ghcSavedSplices"} = (lib.filter (x: lib.hasAttr x combinedPackages) [
              commonName
              frontendName
            ]);
            ghc = (lib.filter (x: lib.hasAttr x combinedPackages) [
              backendName
              commonName
              frontendName
            ]);
            ghcjs = lib.filter (x: lib.hasAttr x combinedPackages) [
              frontendName
              commonName
            ];
  project = base': projectDefinition:
    let
      projectOut = { system, enableLibraryProfiling ? profiling }: let reflexPlatformProject = (getReflexPlatform { inherit system enableLibraryProfiling; }).project; in reflexPlatformProject (args@{ nixpkgs, ... }:
        let
          inherit (lib.strings) hasPrefix;
          mkProject =
            { android ? null #TODO: Better error when missing
            , ios ? null #TODO: Better error when missing
            , packages ? {}
            , overrides ? _: _: {}
            , staticFiles ? null
            , tools ? _: []
            , shellToolOverrides ? _: _: {}
            , withHoogle ? false # Setting this to `true` makes shell reloading far slower
            , __closureCompilerOptimizationLevel ? "ADVANCED" # Set this to `null` to skip the closure-compiler step
            }:
            let
              allConfig = nixpkgs.lib.makeExtensible (self: {
                base = base';
                inherit args;
                userSettings = {
                  inherit android ios packages overrides tools shellToolOverrides withHoogle __closureCompilerOptimizationLevel;
                  staticFiles = if staticFiles == null then self.base + /static else staticFiles;
                };
                android = __android null;
                ios = __ios null;
                passthru = { inherit android ios packages overrides tools shellToolOverrides withHoogle staticFiles staticFilesImpure __closureCompilerOptimizationLevel processedStatic __ios __android; };
```
|
||||
};
|
||||
in mkProject (projectDefinition args));
|
||||
serverOn = sys: version: serverExe
|
||||
(projectOut sys).ghc.backend
|
||||
(projectOut system).ghcjs.frontend
|
||||
(projectOut sys).passthru.staticFiles
|
||||
(projectOut sys).passthru.__closureCompilerOptimizationLevel
|
||||
frontendName = "frontend";
|
||||
backendName = "backend";
|
||||
commonName = "common";
|
||||
staticName = "obelisk-generated-static";
|
||||
staticFilesImpure = let fs = self.userSettings.staticFiles; in if lib.isDerivation fs then fs else toString fs;
|
||||
processedStatic = processAssets { src = self.userSettings.staticFiles; };
|
||||
# The packages whose names and roles are defined by this package
|
||||
predefinedPackages = lib.filterAttrs (_: x: x != null) {
|
||||
${self.frontendName} = nullIfAbsent (self.base + "/frontend");
|
||||
${self.commonName} = nullIfAbsent (self.base + "/common");
|
||||
${self.backendName} = nullIfAbsent (self.base + "/backend");
|
||||
};
|
||||
shellPackages = {};
|
||||
combinedPackages = self.predefinedPackages // self.userSettings.packages // self.shellPackages;
|
||||
projectOverrides = self': super': {
|
||||
${self.staticName} = haskellLib.dontHaddock (self'.callCabal2nix self.staticName self.processedStatic.haskellManifest {});
|
||||
${self.backendName} = haskellLib.addBuildDepend super'.${self.backendName} self'.obelisk-run;
|
||||
};
|
||||
totalOverrides = lib.composeExtensions self.projectOverrides self.userSettings.overrides;
|
||||
privateConfigDirs = ["config/backend"];
|
||||
injectableConfig = builtins.filterSource (path: _:
|
||||
!(lib.lists.any (x: hasPrefix (toString self.base + "/" + toString x) (toString path)) self.privateConfigDirs)
|
||||
);
|
||||
__androidWithConfig = configPath: {
|
||||
${if self.userSettings.android == null then null else self.frontendName} = {
|
||||
executableName = "frontend";
|
||||
${if builtins.pathExists self.userSettings.staticFiles then "assets" else null} =
|
||||
nixpkgs.obeliskExecutableConfig.platforms.android.inject
|
||||
(self.injectableConfig configPath)
|
||||
self.processedStatic.symlinked;
|
||||
} // self.userSettings.android;
|
||||
};
|
||||
__iosWithConfig = configPath: {
|
||||
${if self.userSettings.ios == null then null else self.frontendName} = {
|
||||
executableName = "frontend";
|
||||
${if builtins.pathExists self.userSettings.staticFiles then "staticSrc" else null} =
|
||||
nixpkgs.obeliskExecutableConfig.platforms.ios.inject
|
||||
(self.injectableConfig configPath)
|
||||
self.processedStatic.symlinked;
|
||||
} // self.userSettings.ios;
|
||||
};
|
||||
|
||||
shells-ghc = builtins.attrNames (self.predefinedPackages // self.shellPackages);
|
||||
|
||||
shells-ghcjs = [
|
||||
self.frontendName
|
||||
self.commonName
|
||||
];
|
||||
|
||||
shells-ghcSavedSplices = [
|
||||
self.commonName
|
||||
self.frontendName
|
||||
];
|
||||
|
||||
project = reflexPlatformProject ({...}: self.projectConfig);
|
||||
projectConfig = {
|
||||
inherit (self.userSettings) shellToolOverrides tools withHoogle;
|
||||
overrides = self.totalOverrides;
|
||||
packages = self.combinedPackages;
|
||||
shells = {
|
||||
${if self.userSettings.android == null && self.userSettings.ios == null then null else "ghcSavedSplices"} =
|
||||
lib.filter (x: lib.hasAttr x self.combinedPackages) self.shells-ghcSavedSplices;
|
||||
ghc = lib.filter (x: lib.hasAttr x self.combinedPackages) self.shells-ghc;
|
||||
ghcjs = lib.filter (x: lib.hasAttr x self.combinedPackages) self.shells-ghcjs;
|
||||
};
|
||||
android = self.__androidWithConfig (self.base + "/config");
|
||||
ios = self.__iosWithConfig (self.base + "/config");
|
||||
|
||||
passthru = {
|
||||
__unstable__.self = allConfig;
|
||||
inherit (self)
|
||||
staticFilesImpure processedStatic
|
||||
__iosWithConfig __androidWithConfig
|
||||
;
|
||||
inherit (self.userSettings)
|
||||
android ios overrides packages shellToolOverrides staticFiles tools withHoogle
|
||||
__closureCompilerOptimizationLevel
|
||||
;
|
||||
};
|
||||
};
|
||||
});
|
||||
in allConfig;
|
||||
in (mkProject (projectDefinition args)).projectConfig);
|
||||
mainProjectOut = projectOut { inherit system; };
|
||||
serverOn = projectInst: version: serverExe
|
||||
projectInst.ghc.backend
|
||||
mainProjectOut.ghcjs.frontend
|
||||
projectInst.passthru.staticFiles
|
||||
projectInst.passthru.__closureCompilerOptimizationLevel
|
||||
version;
|
||||
linuxExe = serverOn "x86_64-linux";
|
||||
linuxExe = serverOn (projectOut { system = "x86_64-linux"; });
|
||||
dummyVersion = "Version number is only available for deployments";
|
||||
in projectOut system // {
|
||||
in mainProjectOut // {
|
||||
__unstable__.profiledObRun = let
|
||||
profiled = projectOut { inherit system; enableLibraryProfiling = true; };
|
||||
exeSource = builtins.toFile "ob-run.hs" ''
|
||||
module Main where
|
||||
|
||||
import Control.Exception
|
||||
import Reflex.Profiled
|
||||
import System.Environment
|
||||
|
||||
import qualified Obelisk.Run
|
||||
import qualified Frontend
|
||||
import qualified Backend
|
||||
|
||||
main :: IO ()
|
||||
main = do
|
||||
args <- getArgs
|
||||
let port = read $ args !! 0
|
||||
assets = args !! 1
|
||||
profileFile = (args !! 2) <> ".rprof"
|
||||
Obelisk.Run.run port (Obelisk.Run.runServeAsset assets) Backend.backend Frontend.frontend `finally` writeProfilingData profileFile
|
||||
'';
|
||||
in nixpkgs.runCommand "ob-run" {
|
||||
buildInputs = [ (profiled.ghc.ghcWithPackages (p: [ p.backend p.frontend])) ];
|
||||
} ''
|
||||
mkdir -p $out/bin/
|
||||
ghc -x hs -prof -fno-prof-auto -threaded ${exeSource} -o $out/bin/ob-run
|
||||
'';
|
||||
|
||||
linuxExeConfigurable = linuxExe;
|
||||
linuxExe = linuxExe dummyVersion;
|
||||
exe = serverOn system dummyVersion;
|
||||
server = args@{ hostName, adminEmail, routeHost, enableHttps, version }:
|
||||
exe = serverOn mainProjectOut dummyVersion;
|
||||
server = args@{ hostName, adminEmail, routeHost, enableHttps, version, module ? serverModules.mkBaseEc2 }:
|
||||
server (args // { exe = linuxExe version; });
|
||||
obelisk = import (base + "/.obelisk/impl") {};
|
||||
obelisk = import (base' + "/.obelisk/impl") {};
|
||||
};
|
||||
haskellPackageSets = {
|
||||
inherit (reflex-platform) ghc ghcjs;
|
||||
|
1
dep/.gitignore
vendored
@ -1,4 +1,5 @@
*
!*/
!.gitignore
!*/*.json
!*/default.nix
7
dep/ghcid/default.nix
Normal file
@ -0,0 +1,7 @@
# DO NOT HAND-EDIT THIS FILE
import ((import <nixpkgs> {}).fetchFromGitHub (
let json = builtins.fromJSON (builtins.readFile ./github.json);
in { inherit (json) owner repo rev sha256;
private = json.private or false;
}
))
7
dep/ghcid/github.json
Normal file
@ -0,0 +1,7 @@
{
"owner": "ndmitchell",
"repo": "ghcid",
"branch": "master",
"rev": "f572318f32b1617f6054248e5888af68222f8e50",
"sha256": "1icg3r70lg2kmd9gdc024ih1n9nrja98yav74z9nvykqygvv5w0n"
}
8
dep/gitignore.nix/default.nix
Normal file
@ -0,0 +1,8 @@
# DO NOT HAND-EDIT THIS FILE
let fetch = { private ? false, fetchSubmodules ? false, owner, repo, rev, sha256, ... }:
if !fetchSubmodules && !private then builtins.fetchTarball {
url = "https://github.com/${owner}/${repo}/archive/${rev}.tar.gz"; inherit sha256;
} else (import <nixpkgs> {}).fetchFromGitHub {
inherit owner repo rev sha256 fetchSubmodules private;
};
in import (fetch (builtins.fromJSON (builtins.readFile ./github.json)))
8
dep/gitignore.nix/github.json
Normal file
@ -0,0 +1,8 @@
{
"owner": "hercules-ci",
"repo": "gitignore.nix",
"branch": "master",
"private": false,
"rev": "7415c4feb127845553943a3856cbc5cb967ee5e0",
"sha256": "1zd1ylgkndbb5szji32ivfhwh04mr1sbgrnvbrqpmfb67g2g3r9i"
}
7
dep/hnix/default.nix
Normal file
@ -0,0 +1,7 @@
# DO NOT HAND-EDIT THIS FILE
import ((import <nixpkgs> {}).fetchFromGitHub (
let json = builtins.fromJSON (builtins.readFile ./github.json);
in { inherit (json) owner repo rev sha256;
private = json.private or false;
}
))
7
dep/hnix/github.json
Normal file
@ -0,0 +1,7 @@
{
"owner": "haskell-nix",
"repo": "hnix",
"branch": "master",
"rev": "6c9c7c310c54372b3db0fdf5a0137b395cde1bdb",
"sha256": "1i5903b7lxqn2s3jarb14h6wdq8bxiik1hp0xy43w5w1hgvvq0g5"
}
@ -1,7 +1,8 @@
# DO NOT HAND-EDIT THIS FILE
import ((import <nixpkgs> {}).fetchFromGitHub (
let json = builtins.fromJSON (builtins.readFile ./github.json);
in { inherit (json) owner repo rev sha256;
private = json.private or false;
}
))
let fetch = { private ? false, fetchSubmodules ? false, owner, repo, rev, sha256, ... }:
if !fetchSubmodules && !private then builtins.fetchTarball {
url = "https://github.com/${owner}/${repo}/archive/${rev}.tar.gz"; inherit sha256;
} else (import <nixpkgs> {}).fetchFromGitHub {
inherit owner repo rev sha256 fetchSubmodules private;
};
in import (fetch (builtins.fromJSON (builtins.readFile ./github.json)))
@ -1,7 +1,8 @@
{
"owner": "reflex-frp",
"repo": "reflex-platform",
"branch": "develop",
"rev": "a7cd9a23e7faa9c2545a4c111229e28208e42351",
"sha256": "0y52rfrhk4zszgprpyni51l0pgq18dg695k5bmpd62c3zxar5mvm"
}
"branch": "master",
"private": false,
"rev": "41be4d952b75515a037318aa344dd6b13ad29cfe",
"sha256": "132yqzyzd3c5fjy7wwnwa5d6pjxv5ap2xz0swwbv3h5pwi8jwv0a"
}
8
dep/snap-core/default.nix
Normal file
@ -0,0 +1,8 @@
# DO NOT HAND-EDIT THIS FILE
let fetch = { private ? false, fetchSubmodules ? false, owner, repo, rev, sha256, ... }:
if !fetchSubmodules && !private then builtins.fetchTarball {
url = "https://github.com/${owner}/${repo}/archive/${rev}.tar.gz"; inherit sha256;
} else (import <nixpkgs> {}).fetchFromGitHub {
inherit owner repo rev sha256 fetchSubmodules private;
};
in import (fetch (builtins.fromJSON (builtins.readFile ./github.json)))
8
dep/snap-core/github.json
Normal file
@ -0,0 +1,8 @@
{
"owner": "obsidiansystems",
"repo": "snap-core",
"branch": "ts-expose-fileserve-internals",
"private": false,
"rev": "e015fc484e38dd9389c58b277b1884395e553bdf",
"sha256": "1793x1h4lw9vnrii5akhfxx1x9yrgxf3052jf6bs8f1mhaic6mxx"
}
311
guides/app-deploy/README.md
Normal file
@ -0,0 +1,311 @@
# Obelisk Application Deployment Guide

This document serves two purposes: first, to guide new users who want to deploy Obelisk/Reflex apps, and second, to provide a walk-through for test-driving Obelisk releases.

> **Note:** To complete this *entire* guide you need access to a macOS machine running the latest macOS release, a Linux machine (it can be a VM on that Mac) with Nix installed, an iPhone or iPad, and a recent Android device. It also assumes that you have Nix correctly set up and that your system is configured to fetch from binary caches instead of building everything on your machine. If any of the commands below takes more than 5 minutes, the caches are likely not enabled.
>
> Refer to the [Obelisk Installation Documentation](https://github.com/obsidiansystems/obelisk#installing-obelisk) for instructions on configuring the binary caches for any Nix-compatible setup.
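
For a non-NixOS install, enabling those caches usually comes down to editing `/etc/nix/nix.conf`. The snippet below is only a sketch of what the result tends to look like; the actual cache URL and public key must be copied from the installation documentation linked above (`<public-key-from-docs>` is a placeholder, not a real key):

```
# /etc/nix/nix.conf -- sketch only; take the real values from the Obelisk docs
binary-caches = https://cache.nixos.org https://nixcache.reflex-frp.org
binary-cache-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= ryantrinkle.com-1:<public-key-from-docs>
```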

## a) Get Obelisk

There are several ways to install Obelisk, but the one we are going to use here lets us test a single version of Obelisk in different ways. The approach is to clone Obelisk from its GitHub repository and start a project that uses that local checkout.

Open a terminal and run the following:

```bash
export WORKDIR=~/obelisk-guide
mkdir -p "$WORKDIR"
```

Then let's get Obelisk from GitHub using a specific branch:

```bash
export OBELISK_BRANCH=develop
git clone https://github.com/obsidiansystems/obelisk.git -b "$OBELISK_BRANCH" "$WORKDIR/obelisk"
```

Once you have an Obelisk checkout, you can build it using Nix and make a shortcut (alias) for the rest of the terminal session. With this you will be able to type `ob` anywhere, as long as you do not close the terminal.

```bash
alias ob=$(nix-build "$WORKDIR/obelisk" -A command --no-out-link)/bin/ob
```

To start an Obelisk project, we need to initialize it in a new directory. Let's call it `myapp`:

```bash
mkdir -p "$WORKDIR/myapp"
cd "$WORKDIR/myapp"
ob init --branch "$OBELISK_BRANCH"
```

> **Note:** If the `--branch` parameter is not used, Obelisk sets your project up to track the `master` branch from GitHub. This is quite handy for real-life projects but distracting for our test drive, since we want to make sure any changes we make to the Obelisk codebase (like testing a pull request) are picked up immediately.

## b) Run the Obelisk app on localhost

Let's test that we can run a server on localhost. This is quite easy:

```bash
cd "$WORKDIR/myapp"
ob run
```

Now open a browser and point it to http://localhost:8000 (or just click this link). You should see the following:

![](assets/app-deploy.png)

## c) Deploy the web app on a remote machine

### Import the NixOS VirtualBox appliance

Install [VirtualBox](https://www.virtualbox.org/) on your machine.

On NixOS, do this by adding the following line to your `/etc/nixos/configuration.nix` before the last line containing `}`:

```nix
virtualisation.virtualbox.host.enable = true;
```

then run `sudo nixos-rebuild switch`.

The NixOS [download page](https://nixos.org/nixos/download.html) has a section called "VirtualBox appliances"; download the appliance for the target system. The author used the 19.09 `.ova` file.

With the file downloaded, open VirtualBox and import the `.ova` file:

![](assets/virtualbox-appliance-import.png)

Leave the default settings and finish the import. That will take a few minutes:

![](assets/virtualbox-appliance-import-loading.png)

You should now see a NixOS machine in the dashboard:

![](assets/virtualbox-dashboard-nixos.png)

If your wireless or wired network interface is on a network with DHCP, right-click on the "NixOS" image and click "Settings". Go to the "Network" section and set "Attached to:" to "Bridged Adapter".

![](assets/virtualbox-image-settings-network.png)

Finally, double-click on the "NixOS" machine and wait for it to boot up. If it asks for a password, use `demo`.

![](assets/virtualbox-nioxs-after-boot.png)

### Configure the virtual machine for SSH

Inside the virtual machine, open "Konsole" by clicking on the start menu in the bottom left corner and typing "Konsole" until you see it as an option in the menu. Then click on it to open it.

Change the system configuration using `sudo` to enable SSH access:

```bash
sudo nano /etc/nixos/configuration.nix
```

This will ask you to enter your password. Enter `demo`.

![](assets/nano-edit-nixos-configuration.png)

You need to add the following snippet right before the final line containing `}`:

```nix
services.openssh.enable = true;
services.openssh.permitRootLogin = "yes";
```

Save and close the editor by typing <kbd>Ctrl</kbd>+<kbd>O</kbd>, <kbd>Enter</kbd>, and then <kbd>Ctrl</kbd>+<kbd>X</kbd>.

Now run `sudo nixos-rebuild switch` (if it asks for a password, again use `demo`) and then set the root password using `sudo passwd root`. Pick a simple password like `root`. You'll need to use it again.

To get the virtual machine's IP address, go back to the terminal on your *host* machine (not the virtual machine) and run:

```bash
export VM_ID=$(VBoxManage list runningvms | grep "NixOS" | awk -F'[{}]' '{print $2}')
echo "$VM_ID"
```

You should see a single identifier that looks like `388bbbe5-e0a2-4b20-9c76-bbaa5746682e`. If you see anything else, you likely have multiple NixOS machines running. Stop any other machines and run this command again. (Of course, if you're a power user, just change the `grep` to pick out the right VM.)

Now get the IP address for that machine:

```bash
export VM_IP=$(VBoxManage guestproperty enumerate "$VM_ID" | grep /V4/IP | awk '{print $4}' | sed 's/,//')
echo "$VM_IP"
```
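
To see what the text-processing part of that pipeline does, you can run it on a single fabricated line shaped like typical `VBoxManage guestproperty enumerate` output (the values here are made up, and the exact format can vary between VirtualBox versions):

```bash
# Field 4 of a guestproperty line is the value, with a trailing comma
# that the final sed strips off.
sample='Name: /VirtualBox/GuestInfo/Net/0/V4/IP, value: 192.168.7.86, timestamp: 1577000000000000000, flags: '
echo "$sample" | grep /V4/IP | awk '{print $4}' | sed 's/,//'  # prints 192.168.7.86
```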

You should see an IP like `192.168.7.86`.

With that, you will be able to SSH into the machine from a terminal, using the username `root`:

```bash
ssh root@$VM_IP echo "Access granted"
```

This may ask whether you want to trust the fingerprint for this server. Answer `yes`. It will then ask you for a password; enter the one you set above. You should see `Access granted` printed after you type your password and hit <kbd>Enter</kbd>.

> **Note:** If instead you see an error like this
>
> ```
> @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
> @    WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!     @
> @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
> IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!
> Someone could be eavesdropping on you right now (man-in-the-middle attack)!
> It is also possible that a host key has just been changed.
> ```
>
> you need to remove the old fingerprint for this IP. Run the following:
>
> ```bash
> sed -i "/^$VM_IP /d" ~/.ssh/known_hosts
> ```
>
> and then try the previous `ssh` command again.
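
If you are curious what that `sed` command does before pointing it at your real `~/.ssh/known_hosts`, you can try it on a throwaway file (the host keys below are fake):

```bash
VM_IP=192.168.7.86
tmp=$(mktemp)
printf '%s\n' \
  '192.168.7.86 ssh-ed25519 AAAAfakeoldkey' \
  '10.0.0.5 ssh-ed25519 AAAAfakeotherkey' > "$tmp"
# Delete every line whose first (host) field is the VM's IP. The dots in
# the IP are regex wildcards here, which is harmless for this cleanup.
sed -i "/^$VM_IP /d" "$tmp"
cat "$tmp"  # prints only: 10.0.0.5 ssh-ed25519 AAAAfakeotherkey
rm "$tmp"
```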

Now we'll allow access with an SSH key instead of a password. Create a new SSH key with `ssh-keygen`:

```bash
ssh-keygen -t ed25519 -f "$WORKDIR/obkey" -P ""
```

Now copy your local SSH key to the server:

```bash
ssh-copy-id -i "$WORKDIR/obkey.pub" root@$VM_IP
```

This will ask you for your password again.

Now test logging in with your key instead:

```bash
ssh -i "$WORKDIR/obkey" root@$VM_IP echo "Access granted"
```

This should print `Access granted` *without* asking for a password.

### Configure the git remote

Before the project can be deployed, it needs to be a valid `git` repository.

> **Note:** You need `git` installed and configured. You can do that with:
>
> ```bash
> nix-env -i git
> git config --global user.name "John Doe"
> git config --global user.email johndoe@example.com
> ```

```bash
git init --bare "$WORKDIR/myapp-git-remote"
cd "$WORKDIR/myapp"
git init
git add --all
git commit -m "initial commit"
git remote add origin "$WORKDIR/myapp-git-remote"
git push -u origin master
```

### Deploy

With that done, we can come back to the Obelisk app and initialize the deployment:

```bash
cd "$WORKDIR/myapp"
ob deploy init --ssh-key "$WORKDIR/obkey" --admin-email a@a.a --hostname $VM_IP --route https://$VM_IP "$WORKDIR/myappdeploy" --disable-https
```

Then configure the deployment for VirtualBox:

```bash
echo '{nixosPkgs, ...}: {...}: { imports = [ (nixosPkgs.path + /nixos/modules/virtualisation/virtualbox-image.nix) ]; }' > "$WORKDIR/myappdeploy/module.nix"
```
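
That one-liner writes a small deployment module; reformatted for readability, the resulting `module.nix` contains the following. It imports the VirtualBox image module from nixpkgs so the deployed system is built for VirtualBox rather than the default server target:

```nix
{nixosPkgs, ...}: {...}: {
  imports = [
    (nixosPkgs.path + /nixos/modules/virtualisation/virtualbox-image.nix)
  ];
}
```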

And deploy:

```bash
cd "$WORKDIR/myappdeploy"
ob deploy push -v
```

Once that is complete, open the webpage with

```bash
xdg-open http://$VM_IP
```

If that fails, just get the URL by running

```bash
echo http://$VM_IP
```

and copy/paste that URL into your browser's address bar.

It should look like this:

![](assets/app-deploy-2.png)

Congratulations! You have deployed an Obelisk application to a remote server via SSH.

## d) Deploy an Android app (Linux host only)

> **Note:** This section requires that you be running a Linux *host*.

Modify `$WORKDIR/myapp/default.nix` to set `config.android_sdk.accept_license = true;` and then update your app:

```bash
cd "$WORKDIR/myapp"
sed -i 's/# config.android_sdk.accept_license = false;/config.android_sdk.accept_license = true;/g' default.nix
git add default.nix
git commit -m "Accept license"
git push
```
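
The `sed` call above is a plain textual substitution. Here it is applied to a single sample line like the commented-out one the project skeleton ships with (only this one line is shown, not the whole `default.nix`; if your skeleton lacks the commented-out line, edit the file by hand instead):

```bash
echo '  # config.android_sdk.accept_license = false;' \
  | sed 's/# config.android_sdk.accept_license = false;/config.android_sdk.accept_license = true;/g'
# prints (indentation preserved): config.android_sdk.accept_license = true;
```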

Make sure *USB debugging* is enabled on your Android device ([instructions here](https://developer.android.com/studio/debug/dev-options)) and connect the device using USB. Be sure to confirm any security prompts on the device.

Now update your deployment and deploy to Android:

```bash
cd "$WORKDIR/myappdeploy"
ob deploy update
ob deploy test android -v
```

This deployment will ask you to create a password (at least 6 characters long) and then ask you a series of questions; you can pick arbitrary answers. If the deployment fails, try different USB ports on your computer and run `ob deploy test android -v` again. The USB cable you use can also make a difference.

When connecting your Android device you may be asked to "Allow USB debugging". You need to allow it.

![](assets/android-confirm-usb-debugging.jpg)

When the deployment is complete, opening the app should look something like this:

![](assets/android-app.jpg)

Congratulations! You have deployed an Obelisk Android app via USB.

## e) Deploy an iOS app (macOS host only)

> **Note:** This section requires that you be running a macOS *host*.

Verify that you can see the device from Xcode and that you have installed a provisioning profile linking your Apple Developer ID and the device identifier. The exact workflow depends on whether you are an independent developer or part of an organization, and is out of scope for this guide.

![](assets/xcode-devices.png)

Find your `Team ID` at the following URL (Apple Developer Membership details): https://developer.apple.com/account/#/membership/

It will be something like `5B445B3WY1`. With that, you can start the deployment workflow after plugging in an iPhone or iPad via USB and setting it to trust the computer:

```bash
cd "$WORKDIR/myappdeploy"
ob deploy test ios <TEAM-ID> # Use your Team ID here
```

![](assets/ios-deploy-keychain.png)

![](assets/ios-deploy-example.png)

If there are no errors and the last line says `100% Installed Package`, you can open the device and look for the Obelisk app:

![](assets/ios-obelisk-app-icon.jpg)

![](assets/ios-obelisk-app-open.jpg)

Congratulations, you have deployed an Obelisk app on an iOS device.

You are now ready to create your own multi-platform application using Obelisk!
BIN
guides/app-deploy/assets/android-app.jpg
Normal file
After Width: | Height: | Size: 1.9 MiB |
BIN
guides/app-deploy/assets/android-confirm-usb-debugging.jpg
Normal file
After Width: | Height: | Size: 1.8 MiB |
BIN
guides/app-deploy/assets/app-deploy-2.png
Normal file
After Width: | Height: | Size: 129 KiB |
BIN
guides/app-deploy/assets/app-deploy.png
Normal file
After Width: | Height: | Size: 128 KiB |
BIN
guides/app-deploy/assets/ios-deploy-example.png
Normal file
After Width: | Height: | Size: 606 KiB |
BIN
guides/app-deploy/assets/ios-deploy-keychain.png
Normal file
After Width: | Height: | Size: 263 KiB |
BIN
guides/app-deploy/assets/ios-obelisk-app-icon.jpg
Normal file
After Width: | Height: | Size: 1.9 MiB |
BIN
guides/app-deploy/assets/ios-obelisk-app-open.jpg
Normal file
After Width: | Height: | Size: 2.3 MiB |
BIN
guides/app-deploy/assets/nano-edit-nixos-configuration.png
Normal file
After Width: | Height: | Size: 75 KiB |
BIN
guides/app-deploy/assets/virtualbox-appliance-import-loading.png
Normal file
After Width: | Height: | Size: 127 KiB |
BIN
guides/app-deploy/assets/virtualbox-appliance-import.png
Normal file
After Width: | Height: | Size: 103 KiB |
BIN
guides/app-deploy/assets/virtualbox-dashboard-nixos.png
Normal file
After Width: | Height: | Size: 148 KiB |
BIN
guides/app-deploy/assets/virtualbox-image-settings-network.png
Normal file
After Width: | Height: | Size: 37 KiB |
BIN
guides/app-deploy/assets/virtualbox-nioxs-after-boot.png
Normal file
After Width: | Height: | Size: 244 KiB |
BIN
guides/app-deploy/assets/xcode-devices.png
Normal file
After Width: | Height: | Size: 540 KiB |
@ -1,23 +1,20 @@
{ hackGet }:

# Fix misc upstream packages
self: super:

let
pkgs = self.callPackage ({ pkgs }: pkgs) {};
haskellLib = pkgs.haskell.lib;
in

{
hnix = pkgs.haskell.lib.dontCheck (self.callCabal2nix "hnix" (pkgs.fetchFromGitHub {
owner = "haskell-nix";
repo = "hnix";
rev = "42afdc21da5d9e076eab57eaa42bfdde938192b8";
sha256 = "0psw384dx9bw2dp93xrzw8rd9amvcwgzn64jzzwby7sfspj6k349";
}) {});
# Need 8.0.2 build support
# PR: https://github.com/dmwit/universe/pull/33
universe-template = self.callCabal2nix "universe-template" (pkgs.fetchFromGitHub {
owner = "obsidiansystems";
repo = "universe";
rev = "6a71119bfa5db2b9990a2491c941469ff8ef5d13";
sha256 = "0z8smyainnlzcglv3dlx6x1n9j6d2jv48aa8f2421iayfkxg3js5";
} + /template) {};
hnix = haskellLib.dontCheck (haskellLib.doJailbreak (self.callCabal2nix "hnix" (hackGet ../dep/hnix) {}));
hnix-store-core = self.callHackage "hnix-store-core" "0.1.0.0" {};

ghcid = self.callCabal2nix "ghcid" (hackGet ../dep/ghcid) {};
# Exports more internals
snap-core = haskellLib.dontCheck (self.callCabal2nix "snap-core" (hackGet ../dep/snap-core) {});

shelly = self.callHackage "shelly" "1.9.0" {};
}
@ -1,23 +1,42 @@
# Add obelisk packages

self: super:

let
pkgs = self.callPackage ({ pkgs }: pkgs) {};
inherit (pkgs) cleanHaskellSource;
inherit (pkgs) obeliskCleanSource;
haskellLib = pkgs.haskell.lib;
onLinux = pkg: f: if pkgs.stdenv.isLinux then f pkg else pkg;
in

{
obelisk-executable-config = pkgs.obeliskExecutableConfig.haskellPackage self;
obelisk-executable-config-inject = pkgs.obeliskExecutableConfig.platforms.web.inject self;

obelisk-asset-manifest = self.callCabal2nix "obelisk-asset-manifest" ../lib/asset/manifest {};
obelisk-asset-serve-snap = self.callCabal2nix "obelisk-asset-serve-snap" ../lib/asset/serve-snap {};
obelisk-backend = self.callCabal2nix "obelisk-backend" (cleanHaskellSource ../lib/backend) {};
obelisk-cliapp = self.callCabal2nix "obelisk-cliapp" (cleanHaskellSource ../lib/cliapp) {};
obelisk-command = self.callCabal2nix "obelisk-command" (cleanHaskellSource ../lib/command) {};
obelisk-frontend = self.callCabal2nix "obelisk-frontend" (cleanHaskellSource ../lib/frontend) {};
obelisk-run = self.callCabal2nix "obelisk-run" (cleanHaskellSource ../lib/run) {};
obelisk-route = self.callCabal2nix "obelisk-route" (cleanHaskellSource ../lib/route) {};
obelisk-selftest = self.callCabal2nix "obelisk-selftest" (cleanHaskellSource ../lib/selftest) {};
obelisk-snap-extras = self.callCabal2nix "obelisk-snap-extras" (cleanHaskellSource ../lib/snap-extras) {};
obelisk-asset-manifest = self.callCabal2nix "obelisk-asset-manifest" (obeliskCleanSource ../lib/asset/manifest) {};
obelisk-asset-serve-snap = self.callCabal2nix "obelisk-asset-serve-snap" (obeliskCleanSource ../lib/asset/serve-snap) {};
obelisk-backend = self.callCabal2nix "obelisk-backend" (obeliskCleanSource ../lib/backend) {};
obelisk-cliapp = self.callCabal2nix "obelisk-cliapp" (obeliskCleanSource ../lib/cliapp) {};
obelisk-command = haskellLib.overrideCabal (self.callCabal2nix "obelisk-command" (obeliskCleanSource ../lib/command) {}) {
librarySystemDepends = [
pkgs.jre
pkgs.nix
(haskellLib.justStaticExecutables self.ghcid)
];
};
obelisk-frontend = self.callCabal2nix "obelisk-frontend" (obeliskCleanSource ../lib/frontend) {};
obelisk-run = onLinux (self.callCabal2nix "obelisk-run" (obeliskCleanSource ../lib/run) {}) (pkg:
haskellLib.overrideCabal pkg (drv: { librarySystemDepends = [ pkgs.iproute ]; })
);
obelisk-route = self.callCabal2nix "obelisk-route" (obeliskCleanSource ../lib/route) {};
obelisk-selftest = haskellLib.overrideCabal (self.callCabal2nix "obelisk-selftest" (obeliskCleanSource ../lib/selftest) {}) {
librarySystemDepends = [
pkgs.cabal-install
pkgs.coreutils
pkgs.git
pkgs.nix
pkgs.rsync
];
};
obelisk-snap-extras = self.callCabal2nix "obelisk-snap-extras" (obeliskCleanSource ../lib/snap-extras) {};
tabulation = self.callCabal2nix "tabulation" (obeliskCleanSource ../lib/tabulation) {};
}
@ -3,26 +3,7 @@ self: super:
let
  pkgs = self.callPackage ({ pkgs }: pkgs) {};
  haskellLib = pkgs.haskell.lib;
  #TODO: Upstream
  # Modify a Haskell package to add completion scripts for the given
  # executable produced by it. These completion scripts will be picked up
  # automatically if the resulting derivation is installed, e.g. by
  # `nix-env -i`.
  addOptparseApplicativeCompletionScripts = exeName: pkg: haskellLib.overrideCabal pkg (drv: {
    postInstall = (drv.postInstall or "") + ''
      BASH_COMP_DIR="$out/share/bash-completion/completions"
      mkdir -p "$BASH_COMP_DIR"
      "$out/bin/${exeName}" --bash-completion-script "$out/bin/${exeName}" >"$BASH_COMP_DIR/ob"

      ZSH_COMP_DIR="$out/share/zsh/vendor-completions"
      mkdir -p "$ZSH_COMP_DIR"
      "$out/bin/${exeName}" --zsh-completion-script "$out/bin/${exeName}" >"$ZSH_COMP_DIR/_ob"

      FISH_COMP_DIR="$out/share/fish/vendor_completions.d"
      mkdir -p "$FISH_COMP_DIR"
      "$out/bin/${exeName}" --fish-completion-script "$out/bin/${exeName}" >"$FISH_COMP_DIR/ob.fish"
    '';
  });
  commandRuntimeDeps = with pkgs; [
    coreutils
    git
@ -35,17 +16,17 @@ in
  # Dynamic linking with split objects dramatically increases startup time (about
  # 0.5 seconds on a decent machine with SSD), so we do `justStaticExecutables`.
  obelisk-command = haskellLib.overrideCabal
    (addOptparseApplicativeCompletionScripts "ob"
      (haskellLib.justStaticExecutables super.obelisk-command))
    (drv: {
      buildTools = (drv.buildTools or []) ++ [ pkgs.buildPackages.makeWrapper ];
      postFixup = ''
        ${drv.postFixup or ""}
        # Make `ob` reference its runtime dependencies.
        wrapProgram "$out"/bin/ob --prefix PATH : ${pkgs.lib.makeBinPath commandRuntimeDeps}
      '';
      passthru = { inherit commandRuntimeDeps; };
    });
    (haskellLib.generateOptparseApplicativeCompletion "ob"
      (haskellLib.justStaticExecutables super.obelisk-command))
    (drv: {
      buildTools = (drv.buildTools or []) ++ [ pkgs.buildPackages.makeWrapper ];
      postFixup = ''
        ${drv.postFixup or ""}
        # Make `ob` reference its runtime dependencies.
        wrapProgram "$out"/bin/ob --prefix PATH : ${pkgs.lib.makeBinPath commandRuntimeDeps}
      '';
      passthru = { inherit commandRuntimeDeps; };
    });

  obelisk-selftest = haskellLib.justStaticExecutables super.obelisk-selftest;
}
66
jobsets.nix
@ -1,66 +0,0 @@
{ prs }:

let
  self = import ./. {};
  pkgs = self.nixpkgs;
  mkFetchGithub = value: {
    inherit value;
    type = "git";
    emailresponsible = false;
  };
in
with pkgs.lib;
let
  defaults = jobs: {
    inherit (jobs) description;
    enabled = 1;
    hidden = false;
    keepnr = 10;
    schedulingshares = 100;
    checkinterval = 120;
    enableemail = false;
    emailoverride = "";
    nixexprinput = "obelisk";
    nixexprpath = "release.nix";
    inputs = jobs.inputs // {
      nixpkgs = {
        type = "git";
        value = "https://github.com/NixOS/nixpkgs-channels nixos-unstable";
        emailresponsible = false;
      };
    };
  };
  branchJobset = branch: defaults {
    description = "obelisk-${branch}";
    inputs = {
      obelisk = {
        value = "https://github.com/obsidiansystems/obelisk ${branch}";
        type = "git";
        emailresponsible = false;
      };
    };
  };
  makePr = num: info: {
    name = "obelisk-pr-${num}";
    value = defaults {
      description = "#${num}: ${info.title}";
      inputs = {
        obelisk = {
          #NOTE: This should really use "pull/${num}/merge"; however, GitHub's
          #status checks only operate on PR heads. This creates a race
          #condition, which can currently only be solved by requiring PRs to be
          #up to date before they're merged. See
          #https://github.com/isaacs/github/issues/1002
          value = "https://github.com/obsidiansystems/obelisk pull/${num}/head";
          type = "git";
          emailresponsible = false;
        };
      };
    };
  };
  processedPrs = mapAttrs' makePr (builtins.fromJSON (builtins.readFile prs));
  jobsetsAttrs = processedPrs //
    genAttrs ["master"] branchJobset;
in {
  jobsets = pkgs.writeText "spec.json" (builtins.toJSON jobsetsAttrs);
}
@ -1,7 +1,7 @@
## obelisk-asset

The 'assets.nix' file contains nix expressions that are a mutually
recursive set of attributes used to create static asset directories
with hashable file encodings. This file should be incorporated within
your project's 'default.nix' file. (See ./example/default.nix, line 2)
```
@ -10,20 +10,20 @@ your project's 'default.nix' file. (See ./example/default.nix, line 2)
```

The 'mkAssets' and 'mkAssetsWith' functions within 'assets.nix' only
need to be passed an encoding ('zopfliEncodings', 'gzipEncodings',
'noEncodings') and a static directory of your choice as arguments.

In this example, 'mkAssets' is used on a directory that has a .png
file inside of it. (See ./example/default.nix, line 11)
```
myAssets = assets.mkAssets ./static;
```

Once you have successfully incorporated 'assets.nix' into your
project's nix file(s), use nix-build to generate a symlink of
your 'mkAssets' expression. The following command from
example/run-example (line 3) will generate an immutable symlink
that holds hashed static assets.
```bash
nix-build -o static.assets -A myAssets
```
@ -17,6 +17,7 @@ library
    base
  , bytestring
  , containers
  , deepseq
  , directory
  , SHA
  , filepath
@ -1,10 +1,12 @@
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE BangPatterns #-}
-- | Functions for collecting up files to be processed by the asset pipeline
module Obelisk.Asset.Gather
  ( gatherHashedPaths
  , toHashedPath
  ) where

import Control.DeepSeq
import Control.Monad (forM)
import Data.Bits
import qualified Data.ByteString.Lazy as LBS
@ -33,7 +35,7 @@ gatherHashedPaths root = go ""
        let relativePath = subdir </> sub
        isFile <- doesFileExist $ root </> relativePath
        if isFile
          then do hashedRelativePath <- toHashedPath root relativePath
          then do !hashedRelativePath <- force <$> toHashedPath root relativePath
                  return $ Map.singleton relativePath hashedRelativePath
          else go relativePath

@ -47,7 +49,8 @@ toHashedPath root relativePath = do
  contents <- LBS.readFile path
  let hashPrefix = T.unpack $ decodeUtf8 $ LBS.toStrict $ toNixBase32 $ bytestringDigest $ sha256 contents
      (dir, filename) = splitFileName relativePath
  return $ normalise $ dir </> (hashPrefix <> "-" <> filename)
      !hashedRelativePath = force $ normalise $ dir </> (hashPrefix <> "-" <> filename)
  return hashedRelativePath

-- | Convert a ByteString to base 32 in the way that Nix does
toNixBase32 :: LBS.ByteString -> LBS.ByteString
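The `toHashedPath` change above prefixes the file name with the digest while leaving the directory part alone. A minimal sketch of just that path surgery, using only the `filepath` package (the `"abc123"` prefix is a placeholder, not a real SHA-256/base-32 digest):

```haskell
import System.FilePath (splitFileName, normalise, (</>))

-- Prefix the file name (not the directory part) with a hash, the way
-- toHashedPath does after computing the digest of the file contents.
hashName :: String -> FilePath -> FilePath
hashName hashPrefix relativePath =
  let (dir, filename) = splitFileName relativePath
  in normalise $ dir </> (hashPrefix <> "-" <> filename)

main :: IO ()
main = do
  putStrLn $ hashName "abc123" "images/test.png"  -- images/abc123-test.png
  putStrLn $ hashName "abc123" "test.png"         -- abc123-test.png
```

`normalise` is what strips the `./` that `splitFileName` introduces for paths with no directory component.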
@ -136,16 +136,16 @@ qvalue = do
  skipMany lws
  q0 <|> q1
  where q0 = do
          char '0'
          _ <- char '0'
          decimals <- option [] $ do
            char '.'
            _ <- char '.'
            starRule (Just 0) (Just numAllowedDigits) digit
          return $ QValue $ MkFixed $ fromIntegral $ (read ('0' : decimals) :: Int) * 10 ^ (numAllowedDigits - length decimals)
        q1 = do
          char '1'
          _ <- char '1'
          option () $ do
            char '.'
            starRule (Just 0) (Just numAllowedDigits) $ char '0'
            _ <- char '.'
            _ <- starRule (Just 0) (Just numAllowedDigits) $ char '0'
            return ()
          return $ QValue 1
numAllowedDigits :: Int
@ -232,7 +232,7 @@ hashRule minNum maxNum element = do
lws :: Parser ()
lws = do
  option () $ cr >> lf
  starRule (Just 1) Nothing $ sp <|> ht
  _ <- starRule (Just 1) Nothing $ sp <|> ht
  return ()

-- | See http://www.w3.org/Protocols/rfc2616/rfc2616-sec2.html#sec2.2
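The `q0` branch above scales the digits after the decimal point up to a fixed number of places before wrapping them in `MkFixed`. A small sketch of just that arithmetic, assuming the three-digit (`Milli`) resolution implied by the RFC 2616 limit of three q-value digits:

```haskell
import Data.Fixed (Fixed (MkFixed), Milli)

-- Reproduce the arithmetic in `q0` for the digits parsed after "0.":
-- left-justify them to 10^-3 resolution, so "5" means 0.500.
qFromDecimals :: String -> Milli
qFromDecimals decimals =
  MkFixed $ fromIntegral $
    (read ('0' : decimals) :: Int) * 10 ^ (3 - length decimals)

main :: IO ()
main = do
  print (qFromDecimals "5")    -- 0.500
  print (qFromDecimals "25")   -- 0.250
  print (qFromDecimals "125")  -- 0.125
```

Prepending `'0'` before `read` is what makes the empty-digits case (`"0."` with nothing after it) parse as zero instead of crashing.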
@ -12,13 +12,15 @@ import Obelisk.Asset.Accept (Encoding (..), acceptEncodingBody, chooseEncoding,
import Obelisk.Snap.Extras

import Snap
  (MonadSnap, getHeader, getsRequest, modifyResponse, pass, redirect, sendFile, setContentLength, setContentType, setHeader, setResponseCode)
  (MonadSnap, getHeader, getRequest, getsRequest, modifyResponse, pass, redirect, sendFile, setContentLength, setContentType, setHeader, setResponseCode)
import Snap.Util.FileServe (defaultMimeTypes, fileType, getSafePath, serveFile)
import Snap.Internal.Util.FileServe (checkRangeReq)

import Control.Applicative ((<|>))
import Control.Exception (handleJust, try, throwIO)
import Control.Monad (forM, liftM)
import Control.Monad (forM, liftM, unless)
import Control.Monad.IO.Class (liftIO)
import Control.Monad.Fail (MonadFail)
import Data.Attoparsec.ByteString (parseOnly, endOfInput)
import Data.ByteString (ByteString)
import qualified Data.ByteString as BS
@ -38,7 +40,7 @@ import System.Posix (getFileStatus, fileSize)
-- current version of the asset
-- (e.g. @0rx5yvkkgkig2pcqf4ngi1l7vh89qqajdzc6aiayaibyhvj0d853-test.png@). Hashed
-- asset names will be sent with permanent caching headers.
serveAssets :: MonadSnap m => FilePath -> FilePath -> m ()
serveAssets :: (MonadSnap m, MonadFail m) => FilePath -> FilePath -> m ()
serveAssets = serveAssets' True

-- | Serve static assets from an asset directory generated via @assets.nix@ or, failing that, from a regular directory.
@ -47,15 +49,15 @@ serveAssets = serveAssets' True
-- browser for the logical asset name (e.g. @test.png@) but instead the current
-- version of the asset with to point the browser at the unique filename for the
-- current version of the asset.
serveAssetsInPlace :: MonadSnap m => FilePath -> FilePath -> m ()
serveAssetsInPlace :: (MonadSnap m, MonadFail m) => FilePath -> FilePath -> m ()
serveAssetsInPlace = serveAssets' False

-- | Like 'serveAssets', but only serves a single specified asset
serveAsset :: MonadSnap m => FilePath -> FilePath -> FilePath -> m ()
serveAsset :: (MonadSnap m, MonadFail m) => FilePath -> FilePath -> FilePath -> m ()
serveAsset = serveAsset' True

-- | Like 'serveAssetsInPlace', but only serves a single specified asset
serveAssetInPlace :: MonadSnap m => FilePath -> FilePath -> FilePath -> m ()
serveAssetInPlace :: (MonadSnap m, MonadFail m) => FilePath -> FilePath -> FilePath -> m ()
serveAssetInPlace = serveAsset' False

-- | Serve static assets from an asset directory generated via @assets.nix@ or, failing that, from a regular directory.
@ -63,13 +65,13 @@ serveAssetInPlace = serveAsset' False
-- For assets generated from @assets.nix@, the @Bool@ argument @doRedirect@ controls whether redirects will be sent to the browser if a request is made for
-- an unhashed asset name, e.g. @test.png@. For @True@, a redirect will be sent, yielding more round trips to the server but better caching behavior if the
-- asset doesn't change often. Conversely for @False@, the asset will be served "in place" but made uncacheable.
serveAssets' :: MonadSnap m => Bool -> FilePath -> FilePath -> m ()
serveAssets' :: (MonadSnap m, MonadFail m) => Bool -> FilePath -> FilePath -> m ()
serveAssets' doRedirect base fallback = do
  pRaw <- getSafePath
  serveAsset' doRedirect base fallback $ if "/" `isSuffixOf` pRaw || pRaw == "" then pRaw <> "index.html" else pRaw

-- | Serve a single static asset from an asset directory generated via @assets.nix@ or, failing that, from a regular directory.
serveAsset' :: MonadSnap m => Bool -> FilePath -> FilePath -> FilePath -> m ()
serveAsset' :: (MonadFail m, MonadSnap m) => Bool -> FilePath -> FilePath -> FilePath -> m ()
serveAsset' doRedirect base fallback p = do
  assetType <- liftIO $ try $ BS.readFile $ base </> p </> "type"
  case assetType of
@ -93,8 +95,17 @@ serveAsset' doRedirect base fallback p = do
        let finalFilename = base </> p </> "encodings" </> T.unpack (decodeUtf8 e)
        stat <- liftIO $ getFileStatus finalFilename
        modifyResponse $ setHeader "Last-Modified" "Thu, 1 Jan 1970 00:00:00 GMT"
        modifyResponse $ setResponseCode 200 . setContentLength (fromIntegral $ fileSize stat) . setContentType (fileType defaultMimeTypes p)
        sendFile finalFilename
          . setHeader "Accept-Ranges" "bytes"
          . setContentType (fileType defaultMimeTypes p)
        let size = fromIntegral $ fileSize stat
        req <- getRequest
        -- Despite the name, this function actually does all of the work for
        -- responding to range requests. We only need to handle *none* range
        -- requests ourselves.
        wasRange <- checkRangeReq req finalFilename size
        unless wasRange $ do
          modifyResponse $ setResponseCode 200 . setContentLength size
          sendFile finalFilename
      Just _ -> cachePermanently >> modifyResponse (setResponseCode 304)
    Right "redirect" -> do
      mtarget <- liftIO $ getAssetTarget $ base </> p
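`serveAssets'` keeps the rule that directory-style requests (empty path or trailing slash) fall through to an `index.html` inside that directory. The rewrite is pure and easy to check in isolation; a sketch of just that rule, not the Snap handler itself:

```haskell
import Data.List (isSuffixOf)

-- The path rewrite used by serveAssets' before delegating to serveAsset':
-- directory requests resolve to the index.html inside that directory.
resolveIndex :: FilePath -> FilePath
resolveIndex pRaw
  | "/" `isSuffixOf` pRaw || pRaw == "" = pRaw <> "index.html"
  | otherwise = pRaw

main :: IO ()
main = mapM_ (putStrLn . resolveIndex) ["", "docs/", "test.png"]
```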
@ -8,6 +8,8 @@ library
  build-depends: base,
                 bytestring,
                 categories,
                 containers,
                 cookie,
                 data-default,
                 dependent-sum,
                 dependent-sum-template,
@ -15,6 +17,7 @@ library
                 mtl,
                 obelisk-asset-serve-snap,
                 obelisk-executable-config-inject,
                 obelisk-executable-config-lookup,
                 obelisk-frontend,
                 obelisk-route,
                 obelisk-snap-extras,
@ -1,3 +1,5 @@
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GADTs #-}
@ -6,62 +8,112 @@
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}
{-# OPTIONS_GHC -fno-warn-orphans #-}
module Obelisk.Backend
  ( Backend (..)
  , BackendConfig (..)
  , defaultBackendConfig
  , StaticAssets (..)
  , defaultStaticAssets
  -- * Running a backend
  , runBackend
  , runBackendWith
  -- * Configuration of backend
  , GhcjsWidgets(..)
  , defaultGhcjsWidgets
  -- * all.js script loading functions
  , deferredGhcjsScript
  , delayedGhcjsScript
  -- * all.js preload functions
  , preloadGhcjs
  , renderAllJsPath
  -- * Re-exports
  , Default (def)
  , getPageName
  , getRouteWith
  , runSnapWithCommandLineArgs
  , runSnapWithConfig
  , serveDefaultObeliskApp
  , prettifyOutput
  , runBackend
  , staticRenderContentType
  , mkRouteToUrl
  , getPublicConfigs
  ) where

import Prelude hiding (id, (.))
import Control.Category
import Control.Monad
import Control.Monad.Except
import Control.Categorical.Bifunctor
import Control.Monad.Fail (MonadFail)
import Data.ByteString (ByteString)
import qualified Data.ByteString.Char8 as BSC8
import Data.Default (Default (..))
import Data.Dependent.Sum
import Data.Functor.Sum
import Data.Functor.Identity
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Monoid ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import Data.Text.Encoding
import GHC.Generics (Generic)
import Obelisk.Asset.Serve.Snap (serveAsset)
import Obelisk.ExecutableConfig.Inject (injectExecutableConfigs)
import qualified Obelisk.ExecutableConfig.Lookup as Lookup
import Obelisk.Frontend
import Obelisk.Route
import Obelisk.Snap.Extras (doNotCache, serveFileIfExistsAs)
import Reflex.Dom
import Snap (MonadSnap, Snap, commandLineConfig, defaultConfig, getsRequest, httpServe, modifyResponse
  , rqPathInfo, rqQueryString, setContentType, writeBS, writeText)
  , rqPathInfo, rqQueryString, setContentType, writeBS, writeText
  , rqCookies, Cookie(..) , setHeader)
import Snap.Internal.Http.Server.Config (Config (accessLog, errorLog), ConfigLog (ConfigIoLog))
import System.IO (BufferMode (..), hSetBuffering, stderr, stdout)

data Backend backendRoute frontendRoute = Backend
  { _backend_routeEncoder :: Encoder (Either Text) Identity (R (Sum backendRoute (ObeliskRoute frontendRoute))) PageName
  { _backend_routeEncoder :: Encoder (Either Text) Identity (R (FullRoute backendRoute frontendRoute)) PageName
  , _backend_run :: ((R backendRoute -> Snap ()) -> IO ()) -> IO ()
  }
  } deriving (Generic)

data BackendConfig frontendRoute = BackendConfig
  { _backendConfig_runSnap :: !(Snap () -> IO ()) -- ^ Function to run the snap server
  , _backendConfig_staticAssets :: !StaticAssets -- ^ Static assets
  , _backendConfig_ghcjsWidgets :: !(GhcjsWidgets (Text -> FrontendWidgetT (R frontendRoute) ()))
    -- ^ Given the URL of all.js, return the widgets which are responsible for
    -- loading the script.
  } deriving (Generic)

-- | The static assets provided must contain a compiled GHCJS app that corresponds exactly to the Frontend provided
data GhcjsApp route = GhcjsApp
  { _ghcjsApp_compiled :: !StaticAssets
  , _ghcjsApp_value :: !(Frontend route)
  } deriving (Generic)

-- | Widgets used to load all.js on the frontend
data GhcjsWidgets a = GhcjsWidgets
  { _ghcjsWidgets_preload :: a
    -- ^ A preload widget, placed in the document head
  , _ghcjsWidgets_script :: a
    -- ^ A script widget, placed in the document body
  } deriving (Functor, Generic)


-- | Given the URL of all.js, return the widgets which are responsible for
-- loading the script. Defaults to 'preloadGhcjs' and 'deferredGhcjsScript'.
defaultGhcjsWidgets :: GhcjsWidgets (Text -> FrontendWidgetT r ())
defaultGhcjsWidgets = GhcjsWidgets
  { _ghcjsWidgets_preload = preloadGhcjs
  , _ghcjsWidgets_script = deferredGhcjsScript
  }

-- | Serve a frontend, which must be the same frontend that Obelisk has built and placed in the default location
--TODO: The frontend should be provided together with the asset paths so that this isn't so easily breakable; that will probably make this function obsolete
serveDefaultObeliskApp :: MonadSnap m => (R appRoute -> Text) -> ([Text] -> m ()) -> Frontend (R appRoute) -> R (ObeliskRoute appRoute) -> m ()
serveDefaultObeliskApp urlEnc serveStaticAsset frontend = serveObeliskApp urlEnc serveStaticAsset frontendApp
serveDefaultObeliskApp
  :: (MonadSnap m, HasCookies m, MonadFail m)
  => (R appRoute -> Text)
  -> GhcjsWidgets (FrontendWidgetT (R appRoute) ())
  -> ([Text] -> m ())
  -> Frontend (R appRoute)
  -> Map Text ByteString
  -> R (ObeliskRoute appRoute)
  -> m ()
serveDefaultObeliskApp urlEnc ghcjsWidgets serveStaticAsset frontend =
  serveObeliskApp urlEnc ghcjsWidgets serveStaticAsset frontendApp
  where frontendApp = GhcjsApp
          { _ghcjsApp_compiled = defaultFrontendGhcjsAssets
          , _ghcjsApp_value = frontend
@ -86,45 +138,56 @@ defaultFrontendGhcjsAssets = StaticAssets
  , _staticAssets_unprocessed = "frontend.jsexe"
  }

runSnapWithCommandLineArgs :: Snap () -> IO ()
runSnapWithCommandLineArgs a = do
  -- Get the web server configuration from the command line
  cmdLineConf <- commandLineConfig defaultConfig
  let httpConf = cmdLineConf
runSnapWithConfig :: MonadIO m => Config Snap a -> Snap () -> m ()
runSnapWithConfig conf a = do
  let httpConf = conf
        { accessLog = Just $ ConfigIoLog BSC8.putStrLn
        , errorLog = Just $ ConfigIoLog BSC8.putStrLn
        }
  -- Start the web server
  httpServe httpConf a
  liftIO $ httpServe httpConf a

-- Get the web server configuration from the command line
runSnapWithCommandLineArgs :: MonadIO m => Snap () -> m ()
runSnapWithCommandLineArgs s = liftIO (commandLineConfig defaultConfig) >>= \c ->
  runSnapWithConfig c s

getPageName :: (MonadSnap m) => m PageName
getPageName = do
  p <- getsRequest rqPathInfo
  q <- getsRequest rqQueryString
  let pageNameEncoder' :: Encoder Identity Identity PageName (String, String)
      pageNameEncoder' = bimap
        (unpackTextEncoder . pathSegmentsTextEncoder . listToNonEmptyEncoder)
        (unpackTextEncoder . queryParametersTextEncoder . toListMapEncoder)
  return $ decode pageNameEncoder' (T.unpack (decodeUtf8 p), T.unpack (decodeUtf8 q))
  return $ byteStringsToPageName p q

getRouteWith :: (MonadSnap m) => Encoder Identity parse route PageName -> m (parse route)
getRouteWith e = do
  pageName <- getPageName
  return $ tryDecode e pageName

serveObeliskApp :: MonadSnap m => (R appRoute -> Text) -> ([Text] -> m ()) -> GhcjsApp (R appRoute) -> R (ObeliskRoute appRoute) -> m ()
serveObeliskApp urlEnc serveStaticAsset frontendApp = \case
  ObeliskRoute_App appRouteComponent :=> Identity appRouteRest -> serveGhcjsApp urlEnc frontendApp $ GhcjsAppRoute_App appRouteComponent :/ appRouteRest
renderAllJsPath :: Encoder Identity Identity (R (FullRoute a b)) PageName -> Text
renderAllJsPath validFullEncoder =
  renderObeliskRoute validFullEncoder $ FullRoute_Frontend (ObeliskRoute_Resource ResourceRoute_Ghcjs) :/ ["all.js"]

serveObeliskApp
  :: (MonadSnap m, HasCookies m, MonadFail m)
  => (R appRoute -> Text)
  -> GhcjsWidgets (FrontendWidgetT (R appRoute) ())
  -> ([Text] -> m ())
  -> GhcjsApp (R appRoute)
  -> Map Text ByteString
  -> R (ObeliskRoute appRoute)
  -> m ()
serveObeliskApp urlEnc ghcjsWidgets serveStaticAsset frontendApp config = \case
  ObeliskRoute_App appRouteComponent :=> Identity appRouteRest -> serveGhcjsApp urlEnc ghcjsWidgets frontendApp config $ GhcjsAppRoute_App appRouteComponent :/ appRouteRest
  ObeliskRoute_Resource resComponent :=> Identity resRest -> case resComponent :=> Identity resRest of
    ResourceRoute_Static :=> Identity pathSegments -> serveStaticAsset pathSegments
    ResourceRoute_Ghcjs :=> Identity pathSegments -> serveGhcjsApp urlEnc frontendApp $ GhcjsAppRoute_Resource :/ pathSegments
    ResourceRoute_Ghcjs :=> Identity pathSegments -> serveGhcjsApp urlEnc ghcjsWidgets frontendApp config $ GhcjsAppRoute_Resource :/ pathSegments
    ResourceRoute_JSaddleWarp :=> Identity _ -> do
      let msg = "Error: Obelisk.Backend received jsaddle request"
      liftIO $ putStrLn $ T.unpack msg
      writeText msg
    ResourceRoute_Version :=> Identity () -> doNotCache >> serveFileIfExistsAs "text/plain" "version"

serveStaticAssets :: MonadSnap m => StaticAssets -> [Text] -> m ()
serveStaticAssets :: (MonadSnap m, MonadFail m) => StaticAssets -> [Text] -> m ()
serveStaticAssets assets pathSegments = serveAsset (_staticAssets_processed assets) (_staticAssets_unprocessed assets) $ T.unpack $ T.intercalate "/" pathSegments

data StaticAssets = StaticAssets
@ -141,35 +204,95 @@ staticRenderContentType :: ByteString
|
||||
staticRenderContentType = "text/html; charset=utf-8"
|
||||
|
||||
--TODO: Don't assume we're being served at "/"
|
||||
serveGhcjsApp :: MonadSnap m => (R appRouteComponent -> Text) -> GhcjsApp (R appRouteComponent) -> R (GhcjsAppRoute appRouteComponent) -> m ()
|
||||
serveGhcjsApp urlEnc app = \case
|
||||
serveGhcjsApp
|
||||
:: (MonadSnap m, HasCookies m, MonadFail m)
|
||||
=> (R appRouteComponent -> Text)
|
||||
-> GhcjsWidgets (FrontendWidgetT (R appRouteComponent) ())
|
||||
-> GhcjsApp (R appRouteComponent)
|
||||
-> Map Text ByteString
|
||||
-> R (GhcjsAppRoute appRouteComponent)
|
||||
-> m ()
|
||||
serveGhcjsApp urlEnc ghcjsWidgets app config = \case
|
||||
GhcjsAppRoute_App appRouteComponent :=> Identity appRouteRest -> do
|
||||
modifyResponse $ setContentType staticRenderContentType
|
||||
writeBS <=< liftIO $ renderGhcjsFrontend urlEnc (appRouteComponent :/ appRouteRest) $ _ghcjsApp_value app
|
||||
modifyResponse $ setHeader "Cache-Control" "no-store private"
|
||||
writeBS <=< renderGhcjsFrontend urlEnc ghcjsWidgets (appRouteComponent :/ appRouteRest) config $ _ghcjsApp_value app
|
||||
GhcjsAppRoute_Resource :=> Identity pathSegments -> serveStaticAssets (_ghcjsApp_compiled app) pathSegments
|
||||
|
||||
runBackend :: Backend fullRoute frontendRoute -> Frontend (R frontendRoute) -> IO ()
|
||||
runBackend backend frontend = case checkEncoder $ _backend_routeEncoder backend of
|
||||
-- | Default obelisk backend configuration.
|
||||
defaultBackendConfig :: BackendConfig frontendRoute
|
||||
defaultBackendConfig = BackendConfig runSnapWithCommandLineArgs defaultStaticAssets defaultGhcjsWidgets
|
||||
|
||||
-- | Run an obelisk backend with the default configuration.
|
||||
runBackend :: Backend backendRoute frontendRoute -> Frontend (R frontendRoute) -> IO ()
|
||||
runBackend = runBackendWith defaultBackendConfig
|
||||
|
||||
-- | Run an obelisk backend with the given configuration.
|
||||
runBackendWith
|
||||
:: BackendConfig frontendRoute
|
||||
-> Backend backendRoute frontendRoute
|
||||
-> Frontend (R frontendRoute)
|
||||
-> IO ()
|
||||
runBackendWith (BackendConfig runSnap staticAssets ghcjsWidgets) backend frontend = case checkEncoder $ _backend_routeEncoder backend of
|
||||
Left e -> fail $ "backend error:\n" <> T.unpack e
|
||||
Right validFullEncoder -> _backend_run backend $ \serveRoute -> do
|
||||
runSnapWithCommandLineArgs $ do
|
||||
getRouteWith validFullEncoder >>= \case
|
||||
Identity r -> case r of
|
||||
InL backendRoute :=> Identity a -> serveRoute $ backendRoute :/ a
|
||||
InR obeliskRoute :=> Identity a ->
|
||||
serveDefaultObeliskApp (mkRouteToUrl validFullEncoder) (serveStaticAssets defaultStaticAssets) frontend $ obeliskRoute :/ a
|
||||
Right validFullEncoder -> do
|
||||
publicConfigs <- getPublicConfigs
|
||||
_backend_run backend $ \serveRoute ->
|
||||
runSnap $
|
||||
getRouteWith validFullEncoder >>= \case
|
||||
Identity r -> case r of
|
||||
FullRoute_Backend backendRoute :/ a -> serveRoute $ backendRoute :/ a
|
||||
FullRoute_Frontend obeliskRoute :/ a ->
|
||||
serveDefaultObeliskApp routeToUrl (($ allJsUrl) <$> ghcjsWidgets) (serveStaticAssets staticAssets) frontend publicConfigs $
|
||||
obeliskRoute :/ a
|
||||
where
|
||||
routeToUrl (k :/ v) = renderObeliskRoute validFullEncoder $ FullRoute_Frontend (ObeliskRoute_App k) :/ v
|
||||
allJsUrl = renderAllJsPath validFullEncoder
|
||||
|
||||
mkRouteToUrl :: Encoder Identity parse (R (Sum f (ObeliskRoute r))) PageName -> R r -> Text
|
||||
mkRouteToUrl validFullEncoder =
|
||||
let pageNameEncoder' :: Encoder Identity (Either Text) PageName PathQuery = pageNameEncoder
|
||||
in \(k :/ v) -> T.pack . uncurry (<>) . encode pageNameEncoder' . encode validFullEncoder $ (InR $ ObeliskRoute_App k) :/ v
|
||||
renderGhcjsFrontend
|
||||
:: (MonadSnap m, HasCookies m)
|
||||
=> (route -> Text)
|
||||
-> GhcjsWidgets (FrontendWidgetT route ())
|
||||
-> route
|
||||
-> Map Text ByteString
|
||||
-> Frontend route
|
||||
-> m ByteString
|
||||
renderGhcjsFrontend urlEnc ghcjsWidgets route configs f = do
|
||||
cookies <- askCookies
|
||||
renderFrontendHtml configs cookies urlEnc route f (_ghcjsWidgets_preload ghcjsWidgets) (_ghcjsWidgets_script ghcjsWidgets)
|
||||
|
||||
-- | Preload all.js in a link tag.
|
||||
-- This is the default preload method.
|
||||
preloadGhcjs :: Text -> FrontendWidgetT r ()
|
||||
preloadGhcjs allJsUrl = elAttr "link" ("rel" =: "preload" <> "as" =: "script" <> "href" =: allJsUrl) blank
|
||||
|
||||
renderGhcjsFrontend :: MonadIO m => (route -> Text) -> route -> Frontend route -> m ByteString
|
||||
renderGhcjsFrontend urlEnc route f = do
|
||||
let baseTag = elAttr "base" ("href" =: "/") blank --TODO: Figure out the base URL from the routes
|
||||
    ghcjsPreload = elAttr "link" ("rel" =: "preload" <> "as" =: "script" <> "href" =: "ghcjs/all.js") blank
    ghcjsScript = elAttr "script" ("language" =: "javascript" <> "src" =: "ghcjs/all.js" <> "defer" =: "defer") blank
liftIO $ renderFrontendHtml urlEnc route
  (_frontend_head f >> injectExecutableConfigs >> baseTag >> ghcjsPreload)
  (_frontend_body f >> ghcjsScript)
-- | Load the script from the given URL in a deferred script tag.
-- This is the default method.
deferredGhcjsScript :: Text -> FrontendWidgetT r ()
deferredGhcjsScript allJsUrl = elAttr "script" ("type" =: "text/javascript" <> "src" =: allJsUrl <> "defer" =: "defer") blank

-- | An all.js script which is loaded after waiting for some time to pass. This
-- is useful to ensure any CSS animations on the page can play smoothly before
-- blocking the UI thread by running all.js.
delayedGhcjsScript
  :: Int -- ^ The number of milliseconds to delay loading by
  -> Text -- ^ URL to GHCJS app JavaScript
  -> FrontendWidgetT r ()
delayedGhcjsScript n allJsUrl = elAttr "script" ("type" =: "text/javascript") $ text $ T.unlines
  [ "setTimeout(function() {"
  , "  var all_js_script = document.createElement('script');"
  , "  all_js_script.type = 'text/javascript';"
  , "  all_js_script.src = '" <> allJsUrl <> "';"
  , "  document.body.appendChild(all_js_script);"
  , "}, " <> T.pack (show n) <> ");"
  ]

instance HasCookies Snap where
  askCookies = map (\c -> (cookieName c, cookieValue c)) <$> getsRequest rqCookies

-- | Get configs from the canonical "public" locations (i.e., locations that obelisk expects to make available
-- to frontend applications, and hence visible to end users).
getPublicConfigs :: IO (Map Text ByteString)
getPublicConfigs = Map.filterWithKey (\k _ -> isMemberOf k ["common", "frontend"]) <$> Lookup.getConfigs
  where
    isMemberOf k = any (`T.isPrefixOf` k)

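The `delayedGhcjsScript` hunk above assembles its inline JavaScript with `T.unlines`. A self-contained sketch of that rendering step (the name `renderDelayedLoader` and the `main` driver are illustrative, not part of Obelisk; only the `text` boot library is used):

```haskell
import qualified Data.Text as T
import qualified Data.Text.IO as T

-- Mirror of the T.unlines body in delayedGhcjsScript: wait n milliseconds,
-- then inject a script tag pointing at the given all.js URL.
renderDelayedLoader :: Int -> T.Text -> T.Text
renderDelayedLoader n allJsUrl = T.unlines
  [ "setTimeout(function() {"
  , "  var all_js_script = document.createElement('script');"
  , "  all_js_script.type = 'text/javascript';"
  , "  all_js_script.src = '" <> allJsUrl <> "';"
  , "  document.body.appendChild(all_js_script);"
  , "}, " <> T.pack (show n) <> ");"
  ]

main :: IO ()
main = T.putStr (renderDelayedLoader 250 "ghcjs/all.js")
```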
@ -7,9 +7,11 @@ cabal-version: >=1.2
library
  hs-source-dirs: src
  build-depends:
    ansi-terminal
    aeson
    , ansi-terminal
    , base
    , bytestring
    , containers
    , exceptions
    , io-streams
    , lens
@ -23,11 +25,11 @@ library
    , transformers
  exposed-modules:
    Obelisk.CliApp
    Obelisk.CliApp.Demo
  other-modules:
    Obelisk.CliApp.Logging
    Obelisk.CliApp.Process
    Obelisk.CliApp.Spinner
    Obelisk.CliApp.TerminalString
    Obelisk.CliApp.Theme
    Obelisk.CliApp.Types
  ghc-options: -Wall -fobject-code

@ -33,16 +33,25 @@ module Obelisk.CliApp
  , Severity (..)

  -- .Process
  , ProcessFailure (..)
  , AsProcessFailure (..)
  , readProcessAndLogStderr
  , readProcessAndLogOutput
  , readCreateProcessWithExitCode
  , ProcessFailure (..)
  , ProcessSpec (..)
  , callCommand
  , callProcess
  , callProcessAndLogOutput
  , createProcess_
  , callProcess
  , callCommand
  , overCreateProcess
  , proc
  , readCreateProcessWithExitCode
  , readProcessAndLogOutput
  , readProcessAndLogStderr
  , readProcessJSONAndLogStderr
  , reconstructCommand
  , setCwd
  , setDelegateCtlc
  , setEnvOverride
  , shell
  , waitForProcess
  ) where

import Control.Monad.Log (Severity (..))

@ -1,52 +0,0 @@
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
module Obelisk.CliApp.Demo where

import Control.Concurrent (threadDelay)
import Control.Monad.IO.Class
import Data.Semigroup ((<>))
import qualified Data.Text as T
import System.Process (proc)

import Control.Monad.Catch (MonadMask)

import Obelisk.CliApp

cliDemo
  :: ( MonadIO m, MonadMask m
     , CliLog m, HasCliConfig e m, CliThrow e m, AsProcessFailure e, AsUnstructuredError e)
  => m ()
cliDemo = withSpinner "CLI Demo" $ do
  putLog Notice "This demo will showcase the CLI library functionality"
  _ <- withSpinner' "Searching long for something" (Just $ \c -> "Discovered " <> T.pack (show c) <> " objects") $ do
    delay
    withSpinner "Nested task" $ do
      delay
      putLog Notice "In nested task"
      -- This is a 'no trail' spinner that won't leave a trail (even its sub spinners)
      withSpinnerNoTrail "In between, doing something temporary" $ do
        withSpinner "Running something deep inside" $ do
          delay
          putLog Notice "You can still log from inside no-trail spinners"
        delay
      delay
      putLog Error "Top nested task finished"
    delay
    putLog Error "Some user error while spinning"
    delay
    putLog Notice "This is some info message"
    putLog Warning "And now a warning as well"
    delay
    return (42 :: Integer)
  putLog Notice "Now we start a 2nd spinner, run a couple of processes, the last of which fails:"
  withSpinner "Looking around" $ do
    delay
    output <- readProcessAndLogStderr Notice $ proc "ls" ["-l", "/"]
    putLog Notice $ "Output was: " <> output
    delay
    callProcessAndLogOutput (Notice, Error) $ proc "ls" ["-l", "/does-not-exist"]
    delay
    failWith "Something dangerous happened"
  where
    delay = liftIO $ threadDelay 1000000
@ -43,13 +43,15 @@ import Data.Semigroup ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as T
import GHC.IO.Encoding.Types
import System.Console.ANSI (Color (Red, Yellow), ColorIntensity (Vivid),
                            ConsoleIntensity (FaintIntensity), ConsoleLayer (Foreground),
                            SGR (SetColor, SetConsoleIntensity), clearLine)
import System.Exit (ExitCode (..))
import System.IO (BufferMode (NoBuffering), hFlush, hReady, hSetBuffering, stderr, stdin, stdout)
import System.IO

import qualified Obelisk.CliApp.TerminalString as TS
import Obelisk.CliApp.Theme
import Obelisk.CliApp.Types

newCliConfig
|
||||
lock <- newMVar False
|
||||
tipDisplayed <- newIORef False
|
||||
stack <- newIORef ([], [])
|
||||
return $ CliConfig level noColor noSpinner lock tipDisplayed stack errorLogExitCode
|
||||
textEncoding <- hGetEncoding stdout
|
||||
let theme = if maybe False supportsUnicode textEncoding
|
||||
then unicodeTheme
|
||||
else noUnicodeTheme
|
||||
return $ CliConfig level noColor noSpinner lock tipDisplayed stack errorLogExitCode theme
|
||||
|
||||
runCli :: MonadIO m => CliConfig e -> CliT e m a -> m a
|
||||
runCli c =
|
||||
@ -126,7 +132,7 @@ handleLog' noColor output = do
|
||||
hFlush stdout
|
||||
Output_Overwrite ts -> liftIO $ do
|
||||
width <- TS.getTerminalWidth
|
||||
T.putStr $ "\r" <> (TS.render (not noColor) width ts)
|
||||
T.putStr $ "\r" <> TS.render (not noColor) width ts
|
||||
hFlush stdout
|
||||
Output_ClearLine -> liftIO $ do
|
||||
-- Go to the first column and clear the whole line
|
||||
@ -172,14 +178,16 @@ withExitFailMessage msg f = f `catch` \(e :: ExitCode) -> do
|
||||
|
||||
-- | Write log to stdout, with colors (unless `noColor`)
|
||||
writeLog :: (MonadIO m, MonadMask m) => Bool -> Bool -> WithSeverity Text -> m ()
|
||||
writeLog withNewLine noColor (WithSeverity severity s)
|
||||
| noColor && severity <= Warning = liftIO $ putFn $ T.pack (show severity) <> ": " <> s
|
||||
| not noColor && severity <= Error = TS.putStrWithSGR errorColors h withNewLine s
|
||||
| not noColor && severity <= Warning = TS.putStrWithSGR warningColors h withNewLine s
|
||||
| not noColor && severity >= Debug = TS.putStrWithSGR debugColors h withNewLine s
|
||||
| otherwise = liftIO $ putFn s
|
||||
writeLog withNewLine noColor (WithSeverity severity s) = if T.null s then pure () else write
|
||||
where
|
||||
putFn = if withNewLine then (T.hPutStrLn h) else (T.hPutStr h)
|
||||
write
|
||||
| noColor && severity <= Warning = liftIO $ putFn $ T.pack (show severity) <> ": " <> s
|
||||
| not noColor && severity <= Error = TS.putStrWithSGR errorColors h withNewLine s
|
||||
| not noColor && severity <= Warning = TS.putStrWithSGR warningColors h withNewLine s
|
||||
| not noColor && severity >= Debug = TS.putStrWithSGR debugColors h withNewLine s
|
||||
| otherwise = liftIO $ putFn s
|
||||
|
||||
putFn = if withNewLine then T.hPutStrLn h else T.hPutStr h
|
||||
h = if severity <= Error then stderr else stdout
|
||||
errorColors = [SetColor Foreground Vivid Red]
|
||||
warningColors = [SetColor Foreground Vivid Yellow]
|
||||
@ -226,3 +234,20 @@ fork :: (HasCliConfig e m, MonadIO m) => CliT e IO () -> m ThreadId
fork f = do
  c <- getCliConfig
  liftIO $ forkIO $ runCli c f

-- | Conservatively determines whether the encoding supports Unicode.
--
-- Currently this uses a whitelist of known-to-work encodings. In principle it
-- could test dynamically by opening a file with this encoding, but it doesn't
-- look like base exposes any way to determine this in a pure fashion.
supportsUnicode :: TextEncoding -> Bool
supportsUnicode enc = any ((textEncodingName enc ==) . textEncodingName)
  [ utf8
  , utf8_bom
  , utf16
  , utf16be
  , utf16le
  , utf32
  , utf32be
  , utf32le
  ]

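The `supportsUnicode` hunk above compares `textEncodingName`s against a whitelist. A self-contained sketch of the same check, using only `GHC.IO.Encoding` from base (a shortened whitelist here, for illustration):

```haskell
import GHC.IO.Encoding (TextEncoding (..), latin1, utf16, utf32, utf8, utf8_bom)

-- Whitelist check, as in the hunk above: an encoding "supports Unicode"
-- if its name matches one of the known UTF encodings.
supportsUnicode :: TextEncoding -> Bool
supportsUnicode enc = any ((textEncodingName enc ==) . textEncodingName)
  [utf8, utf8_bom, utf16, utf32]

main :: IO ()
main = print (supportsUnicode utf8, supportsUnicode latin1)
```

`latin1` falls outside the whitelist, so `newCliConfig` would pick the non-Unicode theme for such a terminal.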
@ -9,43 +9,81 @@
-- | An extension of `System.Process` that integrates with logging (`Obelisk.CLI.Logging`)
-- and is thus spinner friendly.
module Obelisk.CliApp.Process
  ( ProcessFailure (..)
  , AsProcessFailure (..)
  , readProcessAndLogStderr
  , readProcessAndLogOutput
  , readCreateProcessWithExitCode
  ( AsProcessFailure (..)
  , ProcessFailure (..)
  , ProcessSpec (..)
  , callCommand
  , callProcess
  , callProcessAndLogOutput
  , createProcess
  , createProcess_
  , callProcess
  , callCommand
  , overCreateProcess
  , proc
  , readCreateProcessWithExitCode
  , readProcessAndLogOutput
  , readProcessAndLogStderr
  , readProcessJSONAndLogStderr
  , reconstructCommand
  , setCwd
  , setDelegateCtlc
  , setEnvOverride
  , shell
  , waitForProcess
  ) where

import Control.Monad ((<=<), join, void)
import Control.Monad.Except (throwError)
import Control.Monad.Fail
import Control.Monad.IO.Class (MonadIO, liftIO)
import Control.Lens (Prism', review)
import qualified Data.ByteString as BS
import qualified Data.ByteString.Char8 as BSC
import Data.Function (fix)
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Monoid ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Data.Text.Encoding.Error (lenientDecode)
import System.Environment (getEnvironment)
import System.Exit (ExitCode (..))
import System.IO (Handle)
import System.IO.Streams (InputStream, handleToInputStream)
import qualified System.IO.Streams as Streams
import System.IO.Streams.Concurrent (concurrentMerge)
import System.Process (CreateProcess, ProcessHandle, StdStream (CreatePipe), cmdspec, std_err, std_out,
                       waitForProcess)
import System.Process (CreateProcess, ProcessHandle, StdStream (CreatePipe), std_err, std_out)
import qualified System.Process as Process
import qualified Data.Aeson as Aeson

import Control.Monad.Log (Severity (..))
import Obelisk.CliApp.Logging (putLog, putLogRaw)
import Obelisk.CliApp.Types (CliLog, CliThrow)

data ProcessSpec = ProcessSpec
  { _processSpec_createProcess :: !CreateProcess
  , _processSpec_overrideEnv :: !(Maybe (Map String String -> Map String String))
  }

proc :: FilePath -> [String] -> ProcessSpec
proc cmd args = ProcessSpec (Process.proc cmd args) Nothing

shell :: String -> ProcessSpec
shell cmd = ProcessSpec (Process.shell cmd) Nothing

setEnvOverride :: (Map String String -> Map String String) -> ProcessSpec -> ProcessSpec
setEnvOverride f p = p { _processSpec_overrideEnv = Just f }

overCreateProcess :: (CreateProcess -> CreateProcess) -> ProcessSpec -> ProcessSpec
overCreateProcess f (ProcessSpec p x) = ProcessSpec (f p) x

setDelegateCtlc :: Bool -> ProcessSpec -> ProcessSpec
setDelegateCtlc b = overCreateProcess (\p -> p { Process.delegate_ctlc = b })

setCwd :: Maybe FilePath -> ProcessSpec -> ProcessSpec
setCwd fp = overCreateProcess (\p -> p { Process.cwd = fp })

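The `setCwd`/`setDelegateCtlc` helpers above are plain record updates over `ProcessSpec`. A self-contained sketch of composing them, using a trimmed-down local copy of the record (mirroring the diff; `System.Process` and `containers` ship with GHC):

```haskell
import Data.Map (Map)
import qualified System.Process as Process

-- Trimmed-down copy of the ProcessSpec record from the diff above.
data ProcessSpec = ProcessSpec
  { _processSpec_createProcess :: Process.CreateProcess
  , _processSpec_overrideEnv :: Maybe (Map String String -> Map String String)
  }

proc :: FilePath -> [String] -> ProcessSpec
proc cmd args = ProcessSpec (Process.proc cmd args) Nothing

overCreateProcess :: (Process.CreateProcess -> Process.CreateProcess) -> ProcessSpec -> ProcessSpec
overCreateProcess f (ProcessSpec p x) = ProcessSpec (f p) x

setCwd :: Maybe FilePath -> ProcessSpec -> ProcessSpec
setCwd fp = overCreateProcess (\p -> p { Process.cwd = fp })

-- Builders compose right-to-left, each one refining the spec.
example :: ProcessSpec
example = setCwd (Just "/tmp") $ proc "ls" ["-l"]

main :: IO ()
main = print (Process.cwd (_processSpec_createProcess example))
```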
-- TODO put back in `Obelisk.CliApp.Process` and use prisms for extensible exceptions
data ProcessFailure = ProcessFailure Process.CmdSpec Int -- exit code
  deriving Show
@ -59,18 +97,32 @@ instance AsProcessFailure ProcessFailure where
  asProcessFailure = id

readProcessAndLogStderr
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e)
  => Severity -> CreateProcess -> m Text
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e, MonadFail m)
  => Severity -> ProcessSpec -> m Text
readProcessAndLogStderr sev process = do
  (out, _err) <- withProcess process $ \_out err -> do
    streamToLog =<< liftIO (streamHandle sev err)
  liftIO $ T.decodeUtf8 <$> BS.hGetContents out
  liftIO $ T.decodeUtf8With lenientDecode <$> BS.hGetContents out

readProcessJSONAndLogStderr
  :: (Aeson.FromJSON a, MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e, MonadFail m)
  => Severity -> ProcessSpec -> m a
readProcessJSONAndLogStderr sev process = do
  (out, _err) <- withProcess process $ \_out err -> do
    streamToLog =<< liftIO (streamHandle sev err)
  json <- liftIO $ BS.hGetContents out
  case Aeson.eitherDecodeStrict json of
    Right a -> pure a
    Left err -> do
      putLog Error $ "Could not decode process output as JSON: " <> T.pack err
      throwError $ review asProcessFailure $ ProcessFailure (Process.cmdspec $ _processSpec_createProcess process) 0

readCreateProcessWithExitCode
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e)
  => CreateProcess -> m (ExitCode, String, String)
readCreateProcessWithExitCode process = do
  putLog Debug $ "Creating process: " <> reconstructCommand (cmdspec process)
  => ProcessSpec -> m (ExitCode, String, String)
readCreateProcessWithExitCode procSpec = do
  process <- mkCreateProcess procSpec
  putLog Debug $ "Creating process: " <> reconstructProcSpec procSpec
  liftIO $ Process.readCreateProcessWithExitCode process ""

-- | Like `System.Process.readProcess` but logs the combined output (stdout and stderr)
@ -81,22 +133,22 @@ readCreateProcessWithExitCode process = do
-- which case it is advisable to call it with a non-Error severity for stderr, like
-- `callProcessAndLogOutput (Debug, Debug)`.
readProcessAndLogOutput
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e)
  => (Severity, Severity) -> CreateProcess -> m Text
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e, MonadFail m)
  => (Severity, Severity) -> ProcessSpec -> m Text
readProcessAndLogOutput (sev_out, sev_err) process = do
  (_, Just out, Just err, p) <- createProcess $ process
    { std_out = CreatePipe , std_err = CreatePipe }
  (_, Just out, Just err, p) <- createProcess $ overCreateProcess
    (\p -> p { std_out = CreatePipe , std_err = CreatePipe }) process

  -- TODO interleave stdout and stderr in log correctly
  streamToLog =<< liftIO (streamHandle sev_err err)
  outText <- liftIO $ T.decodeUtf8 <$> BS.hGetContents out
  outText <- liftIO $ T.decodeUtf8With lenientDecode <$> BS.hGetContents out
  putLogRaw sev_out outText

  liftIO (waitForProcess p) >>= \case
  waitForProcess p >>= \case
    ExitSuccess -> pure outText
    ExitFailure code -> throwError $ review asProcessFailure $ ProcessFailure (cmdspec process) code
    ExitFailure code -> throwError $ review asProcessFailure $ ProcessFailure (Process.cmdspec $ _processSpec_createProcess process) code

-- | Like `System.Process.callProcess` but logs the combined output (stdout and stderr)
-- | Like 'System.Process.callProcess' but logs the combined output (stdout and stderr)
-- with the corresponding severity.
--
-- Usually this function is called as `callProcessAndLogOutput (Debug, Error)`. However
@ -104,9 +156,8 @@ readProcessAndLogOutput (sev_out, sev_err) process = do
-- which case it is advisable to call it with a non-Error severity for stderr, like
-- `callProcessAndLogOutput (Debug, Debug)`.
callProcessAndLogOutput

  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e)
  => (Severity, Severity) -> CreateProcess -> m ()
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e, MonadFail m)
  => (Severity, Severity) -> ProcessSpec -> m ()
callProcessAndLogOutput (sev_out, sev_err) process =
  void $ withProcess process $ \out err -> do
    stream <- liftIO $ join $ combineStream
@ -116,21 +167,31 @@ callProcessAndLogOutput (sev_out, sev_err) process =
  where
    combineStream s1 s2 = concurrentMerge [s1, s2]

-- | Like `System.Process.createProcess` but also logs (debug) the process being run
-- | Like 'System.Process.createProcess' but also logs (debug) the process being run
createProcess
  :: (MonadIO m, CliLog m)
  => CreateProcess -> m (Maybe Handle, Maybe Handle, Maybe Handle, ProcessHandle)
createProcess p = do
  putLog Debug $ "Creating process: " <> reconstructCommand (cmdspec p)
  => ProcessSpec -> m (Maybe Handle, Maybe Handle, Maybe Handle, ProcessHandle)
createProcess procSpec = do
  p <- mkCreateProcess procSpec
  putLog Debug $ "Creating process: " <> reconstructProcSpec procSpec
  liftIO $ Process.createProcess p

-- | Like `System.Process.createProcess_` but also logs (debug) the process being run
createProcess_
  :: (MonadIO m, CliLog m)
  => String -> CreateProcess -> m (Maybe Handle, Maybe Handle, Maybe Handle, ProcessHandle)
createProcess_ name p = do
  putLog Debug $ "Creating process " <> T.pack name <> ": " <> reconstructCommand (cmdspec p)
  liftIO $ Process.createProcess p
  => String -> ProcessSpec -> m (Maybe Handle, Maybe Handle, Maybe Handle, ProcessHandle)
createProcess_ name procSpec = do
  p <- mkCreateProcess procSpec
  putLog Debug $ "Creating process " <> T.pack name <> ": " <> reconstructProcSpec procSpec
  liftIO $ Process.createProcess_ name p

mkCreateProcess :: MonadIO m => ProcessSpec -> m Process.CreateProcess
mkCreateProcess (ProcessSpec p override') = do
  case override' of
    Nothing -> pure p
    Just override -> do
      procEnv <- Map.fromList <$> maybe (liftIO getEnvironment) pure (Process.env p)
      pure $ p { Process.env = Just $ Map.toAscList (override procEnv) }

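`mkCreateProcess` above applies the optional override as a function over the full environment map, so callers can insert, delete, or rewrite variables. A minimal self-contained model of that merge (the name `applyEnvOverride` is illustrative; only `containers` is used, and `OB_VERBOSE` is just a made-up variable):

```haskell
import Data.Map (Map)
import qualified Data.Map as Map

-- Model of mkCreateProcess's env handling: Nothing means inherit the
-- environment untouched; Just f rebuilds it as a Map, applies f, and
-- flattens it back to an association list.
applyEnvOverride
  :: Maybe (Map String String -> Map String String)
  -> [(String, String)]  -- ^ inherited environment
  -> [(String, String)]
applyEnvOverride Nothing env = env
applyEnvOverride (Just override) env = Map.toAscList (override (Map.fromList env))

main :: IO ()
main = print $ applyEnvOverride (Just (Map.insert "OB_VERBOSE" "1")) [("PATH", "/bin")]
```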
-- | Like `System.Process.callProcess` but also logs (debug) the process being run
callProcess
@ -149,18 +210,18 @@ callCommand cmd = do
  liftIO $ Process.callCommand cmd

withProcess
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e)
  => CreateProcess -> (Handle -> Handle -> m ()) -> m (Handle, Handle)
  :: (MonadIO m, CliLog m, CliThrow e m, AsProcessFailure e, MonadFail m)
  => ProcessSpec -> (Handle -> Handle -> m ()) -> m (Handle, Handle)
withProcess process f = do -- TODO: Use bracket.
  -- FIXME: Using `withCreateProcess` here leads to something operating illegally on closed handles.
  (_, Just out, Just err, p) <- createProcess $ process
    { std_out = CreatePipe , std_err = CreatePipe }
  (_, Just out, Just err, p) <- createProcess $ overCreateProcess
    (\x -> x { std_out = CreatePipe , std_err = CreatePipe }) process

  f out err -- Pass the handles to the passed function

  liftIO (waitForProcess p) >>= \case
  waitForProcess p >>= \case
    ExitSuccess -> return (out, err)
    ExitFailure code -> throwError $ review asProcessFailure $ ProcessFailure (cmdspec process) code
    ExitFailure code -> throwError $ review asProcessFailure $ ProcessFailure (Process.cmdspec $ _processSpec_createProcess process) code

-- Create an input stream from the file handle, associating each item with the given severity.
streamHandle :: Severity -> Handle -> IO (InputStream (Severity, BSC.ByteString))
@ -173,12 +234,20 @@ streamToLog
streamToLog stream = fix $ \loop -> do
  liftIO (Streams.read stream) >>= \case
    Nothing -> return ()
    Just (sev, line) -> putLogRaw sev (T.decodeUtf8 line) >> loop
    Just (sev, line) -> putLogRaw sev (T.decodeUtf8With lenientDecode line) >> loop

-- | Wrapper around `System.Process.waitForProcess`
waitForProcess :: MonadIO m => ProcessHandle -> m ExitCode
waitForProcess = liftIO . Process.waitForProcess

-- | Pretty print a 'CmdSpec'
reconstructCommand :: Process.CmdSpec -> Text
reconstructCommand (Process.ShellCommand str) = T.pack str
reconstructCommand (Process.RawCommand c as) = processToShellString c as
reconstructCommand p = case p of
  Process.ShellCommand str -> T.pack str
  Process.RawCommand c as -> processToShellString c as
  where
    processToShellString cmd args = T.unwords $ map quoteAndEscape (cmd : args)
    quoteAndEscape x = "'" <> T.replace "'" "'\\''" (T.pack x) <> "'"

reconstructProcSpec :: ProcessSpec -> Text
reconstructProcSpec = reconstructCommand . Process.cmdspec . _processSpec_createProcess

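`reconstructCommand` quotes each argument in single quotes for display. A standalone sketch of that quoting over plain `String`s (the replacement for an embedded quote must be the four characters `'\''`, which close the quoting, emit an escaped quote, and reopen it; the names here are illustrative):

```haskell
-- Quote a word for a POSIX shell: wrap in single quotes and turn each
-- embedded ' into '\'' (close quote, escaped quote, reopen quote).
quoteAndEscape :: String -> String
quoteAndEscape x = "'" <> concatMap escape x <> "'"
  where
    escape '\'' = "'\\''"
    escape c = [c]

reconstructCommand :: String -> [String] -> String
reconstructCommand cmd args = unwords (map quoteAndEscape (cmd : args))

main :: IO ()
main = putStrLn (reconstructCommand "echo" ["it's fine"])
```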
@ -24,6 +24,7 @@ import System.Console.ANSI (Color (Blue, Cyan, Green, Red))

import Obelisk.CliApp.Logging (allowUserToMakeLoggingVerbose, fork, putLog)
import Obelisk.CliApp.TerminalString (TerminalString (..), enquiryCode)
import Obelisk.CliApp.Theme
import Obelisk.CliApp.Types (CliLog, CliConfig (..), HasCliConfig, Output (..), getCliConfig)

-- | Run an action with a CLI spinner.
@ -51,7 +52,8 @@ withSpinner'
  -> m a
  -> m a
withSpinner' msg mkTrail action = do
  noSpinner <- _cliConfig_noSpinner <$> getCliConfig
  cliConf <- getCliConfig
  let noSpinner = _cliConfig_noSpinner cliConf
  if noSpinner
    then putLog Notice msg >> action
    else bracket' run cleanup $ const action
@ -61,8 +63,11 @@ withSpinner' msg mkTrail action = do
    modifyStack pushSpinner >>= \case
      True -> do -- Top-level spinner; fork a thread to manage output of anything on the stack
        ctrleThread <- fork $ allowUserToMakeLoggingVerbose enquiryCode
        cliConf <- getCliConfig
        let theme = _cliConfig_theme cliConf
            spinner = coloredSpinner $ _cliTheme_spinner theme
        spinnerThread <- fork $ runSpinner spinner $ \c -> do
          logs <- renderSpinnerStack c . snd <$> readStack
          logs <- renderSpinnerStack theme c . snd <$> readStack
          logMessage $ Output_Overwrite logs
        pure [ctrleThread, spinnerThread]
      False -> -- Sub-spinner; nothing to do.
@ -70,13 +75,15 @@ withSpinner' msg mkTrail action = do
    cleanup tids resultM = do
      liftIO $ mapM_ killThread tids
      logMessage Output_ClearLine
      logsM <- modifyStack $ popSpinner $ case resultM of
      cliConf <- getCliConfig
      let theme = _cliConfig_theme cliConf
      logsM <- modifyStack $ (popSpinner theme) $ case resultM of
        Nothing ->
          ( TerminalString_Colorized Red "✖"
          ( TerminalString_Colorized Red $ _cliTheme_failed $ _cliConfig_theme cliConf
          , Just msg -- Always display final message if there was an exception.
          )
        Just result ->
          ( TerminalString_Colorized Green "✔"
          ( TerminalString_Colorized Green $ _cliTheme_done $ _cliConfig_theme cliConf
          , mkTrail <*> pure result
          )
      -- Last message, finish off with newline.
@ -87,10 +94,10 @@ withSpinner' msg mkTrail action = do
      )
  where
    isTemporary = isNothing mkTrail
    popSpinner (mark, trailMsgM) (flag, old) =
    popSpinner theme (mark, trailMsgM) (flag, old) =
      ( (newFlag, new)
      -- With final trail spinner message to render
      , renderSpinnerStack mark . (: new) . TerminalString_Normal <$> (
      , renderSpinnerStack theme mark . (: new) . TerminalString_Normal <$> (
          if inTemporarySpinner then Nothing else trailMsgM
        )
      )
@ -102,19 +109,19 @@ withSpinner' msg mkTrail action = do
      =<< fmap _cliConfig_spinnerStack getCliConfig
    modifyStack f = liftIO . flip atomicModifyIORef' f
      =<< fmap _cliConfig_spinnerStack getCliConfig
    spinner = coloredSpinner defaultSpinnerTheme

-- | How nested spinner logs should be displayed
renderSpinnerStack
  :: TerminalString -- ^ That which comes before the final element in stack
  :: CliTheme
  -> TerminalString -- ^ That which comes before the final element in stack
  -> [TerminalString] -- ^ Spinner elements in reverse order
  -> [TerminalString]
renderSpinnerStack mark = L.intersperse space . go . L.reverse
renderSpinnerStack theme mark = L.intersperse space . go . L.reverse
  where
    go [] = []
    go (x:[]) = mark : [x]
    go (x:xs) = arrow : x : go xs
    arrow = TerminalString_Colorized Blue "⇾"
    arrow = TerminalString_Colorized Blue $ _cliTheme_arrow theme
    space = TerminalString_Normal " "

-- | A spinner is simply an infinite list of strings that supplant each other in a delayed loop, creating the
@ -130,18 +137,6 @@ runSpinner spinner f = forM_ spinner $ f >=> const delay
  where
    delay = liftIO $ threadDelay 100000 -- A shorter delay ensures that we update promptly.

type SpinnerTheme = [Text]

-- Find more spinners at https://github.com/sindresorhus/cli-spinners/blob/master/spinners.json
_spinnerCircleHalves :: SpinnerTheme
_spinnerCircleHalves = ["◐", "◓", "◑", "◒"]

spinnerMoon :: SpinnerTheme
spinnerMoon = ["🌑", "🌒", "🌓", "🌔", "🌕", "🌖", "🌗", "🌘"]

defaultSpinnerTheme :: SpinnerTheme
defaultSpinnerTheme = spinnerMoon

-- | Like `bracket` but the `release` function can know whether an exception was raised
bracket' :: MonadMask m => m a -> (a -> Maybe c -> m b) -> (a -> m c) -> m c
bracket' acquire release use = mask $ \unmasked -> do

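`renderSpinnerStack` above keeps the spinner stack innermost-first, reverses it, puts the themed arrow before each nested level, and the mark before the last element. A pure sketch of the same traversal over plain `String`s (the name `renderStack` is illustrative):

```haskell
import Data.List (intersperse)

-- Stack is kept in reverse order (innermost first), as in the hunk above:
-- reverse it, interleave the arrow between levels, and put the mark
-- before the final (innermost) element.
renderStack :: String -> String -> [String] -> String
renderStack arrow mark = concat . intersperse " " . go . reverse
  where
    go [] = []
    go [x] = [mark, x]
    go (x:xs) = arrow : x : go xs

main :: IO ()
main = putStrLn (renderStack "->" "DONE" ["inner task", "outer task"])
```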
29
lib/cliapp/src/Obelisk/CliApp/Theme.hs
Normal file
@ -0,0 +1,29 @@
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.CliApp.Theme where

import Data.Text (Text)

data CliTheme = CliTheme
  { _cliTheme_done :: Text
  , _cliTheme_failed :: Text
  , _cliTheme_arrow :: Text
  , _cliTheme_spinner :: SpinnerTheme
  }

type SpinnerTheme = [Text]

unicodeTheme :: CliTheme
unicodeTheme = CliTheme
  { _cliTheme_done = "✔"
  , _cliTheme_failed = "✖"
  , _cliTheme_arrow = "⇾"
  , _cliTheme_spinner = ["◐", "◓", "◑", "◒"]
  }

noUnicodeTheme :: CliTheme
noUnicodeTheme = CliTheme
  { _cliTheme_done = "DONE"
  , _cliTheme_failed = "FAILED"
  , _cliTheme_arrow = "->"
  , _cliTheme_spinner = ["|", "/", "-", "\\"]
  }

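Earlier in this diff, `newCliConfig` picks between these two themes based on whether stdout's encoding supports Unicode. A runnable sketch of the record and the selection, using `String` instead of `Text` to stay dependency-free (the `pickTheme` helper is illustrative, not part of Obelisk):

```haskell
-- Mirror of the CliTheme record from the new Theme.hs module above,
-- simplified to String fields.
data CliTheme = CliTheme
  { _cliTheme_done :: String
  , _cliTheme_failed :: String
  , _cliTheme_arrow :: String
  , _cliTheme_spinner :: [String]
  } deriving Show

unicodeTheme, noUnicodeTheme :: CliTheme
unicodeTheme = CliTheme "✔" "✖" "⇾" ["◐", "◓", "◑", "◒"]
noUnicodeTheme = CliTheme "DONE" "FAILED" "->" ["|", "/", "-", "\\"]

-- Fall back to ASCII markers when the terminal encoding cannot render Unicode.
pickTheme :: Bool -> CliTheme
pickTheme unicodeOk = if unicodeOk then unicodeTheme else noUnicodeTheme

main :: IO ()
main = putStrLn (_cliTheme_done (pickTheme False))
```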
@ -6,12 +6,14 @@
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE StandaloneDeriving #-}
{-# LANGUAGE UndecidableInstances #-}
module Obelisk.CliApp.Types where

import Control.Concurrent.MVar (MVar)
import Control.Monad.Catch (MonadCatch, MonadMask, MonadThrow)
import Control.Monad.Log (LoggingT, MonadLog, Severity (..), WithSeverity (..), logMessage)
import Control.Monad.Fail (MonadFail)
import Control.Monad.Log (LoggingT(..), MonadLog, Severity (..), WithSeverity (..), logMessage)
import Control.Monad.Reader (MonadIO, ReaderT (..), MonadReader (..), ask, mapReaderT)
import Control.Monad.Writer (WriterT)
import Control.Monad.State (StateT)
@ -23,6 +25,7 @@ import Data.Text (Text)
import System.Exit (ExitCode (..), exitWith)

import Obelisk.CliApp.TerminalString (TerminalString)
import Obelisk.CliApp.Theme (CliTheme)

--------------------------------------------------------------------------------

@ -44,9 +47,11 @@ type CliThrow e m = MonadError e m
putLog :: CliLog m => Severity -> Text -> m ()
putLog sev = logMessage . Output_Log . WithSeverity sev

deriving instance MonadFail m => MonadFail (LoggingT Output m)

newtype DieT e m a = DieT { unDieT :: ReaderT (e -> (Text, Int)) (LoggingT Output m) a }
  deriving
    ( Functor, Applicative, Monad, MonadIO
    ( Functor, Applicative, Monad, MonadIO, MonadFail
    , MonadThrow, MonadCatch, MonadMask
    , MonadLog Output
    )
@ -88,6 +93,8 @@ data CliConfig e = CliConfig
    _cliConfig_spinnerStack :: IORef ([Bool], [TerminalString])
  , -- | Failure handler. How to log error and what exit status to use.
    _cliConfig_errorLogExitCode :: e -> (Text, Int)
  , -- | Theme strings for spinners
    _cliConfig_theme :: CliTheme
  }

class Monad m => HasCliConfig e m | m -> e where
@ -111,7 +118,7 @@ newtype CliT e m a = CliT
  { unCliT :: ReaderT (CliConfig e) (DieT e m) a
  }
  deriving
    ( Functor, Applicative, Monad, MonadIO
    ( Functor, Applicative, Monad, MonadIO, MonadFail
    , MonadThrow, MonadCatch, MonadMask
    , MonadLog Output -- CliLog
    , MonadError e -- CliThrow

@ -1,5 +1,5 @@
name: obelisk-command
version: 0.1
version: 0.7.0.1
cabal-version: >= 1.8
build-type: Simple

@ -7,35 +7,33 @@ library
  hs-source-dirs: src
  build-depends:
    aeson
    , ansi-terminal
    , Cabal
    , aeson-pretty
    , ansi-terminal
    , base
    , base16-bytestring
    , binary
    , bytestring
    , obelisk-cliapp
    , Cabal
    , containers
    , data-default
    , directory
    , either
    , errors
    , exceptions
    , filepath
    , git
    , github
    , here
    , hit
    , hnix
    , hpack
    , io-streams
    , exceptions
    , logging-effect
    , lens
    , logging-effect
    , megaparsec
    , modern-uri
    , monad-loops
    , mtl
    , megaparsec
    , network
    , network-uri
    , obelisk-cliapp
    , optparse-applicative
    , placeholders
    , process
@ -46,6 +44,7 @@ library
    , transformers
    , unix
    , unordered-containers
    , which
    , yaml
  exposed-modules:
    Obelisk.App
@ -57,8 +56,8 @@ library
    Obelisk.Command.Thunk
    Obelisk.Command.Utils
    Obelisk.Command.VmBuilder
  -- -fobject-code is so that the StaticPointers extension can work in ghci
  ghc-options: -Wall -fobject-code
    Obelisk.Command.Preprocessor
  ghc-options: -Wall

executable ob
  main-is: src-bin/ob.hs

@ -12,6 +12,7 @@ module Obelisk.App where

import Control.Lens
import Control.Monad.Catch (MonadCatch, MonadMask, MonadThrow)
import Control.Monad.Fail (MonadFail)
import Control.Monad.Reader (MonadIO, ReaderT (..), ask, runReaderT)
import Control.Monad.Writer (WriterT)
import Control.Monad.State (StateT)
@ -61,10 +62,11 @@ newtype ObeliskT m a = ObeliskT
  { unObeliskT :: ReaderT Obelisk (CliT ObeliskError m) a
  }
  deriving
    ( Functor, Applicative, Monad, MonadIO, MonadThrow, MonadCatch, MonadMask
    ( Functor, Applicative, Monad, MonadIO, MonadThrow, MonadCatch, MonadMask, MonadFail
    , MonadLog Output -- CliLog
    , MonadError ObeliskError -- CliThrow ObeliskError
    , HasCliConfig ObeliskError)
    , HasCliConfig ObeliskError
    )

instance MonadTrans ObeliskT where
  lift = ObeliskT . lift . lift
@ -104,6 +106,7 @@ type MonadInfallibleObelisk m =
type MonadObelisk m =
  ( MonadInfallibleObelisk m
  , CliThrow ObeliskError m
  , MonadFail m
  )

getObeliskUserStateDir :: IO FilePath

@ -3,22 +3,16 @@
|
||||
{-# LANGUAGE MultiParamTypeClasses #-}
|
||||
{-# LANGUAGE OverloadedStrings #-}
|
||||
{-# LANGUAGE ScopedTypeVariables #-}
|
||||
{-# LANGUAGE StaticPointers #-}
|
||||
{-# LANGUAGE TypeApplications #-}
|
||||
module Obelisk.Command where

import Control.Monad
import Control.Monad.IO.Class (liftIO)
import qualified Data.Binary as Binary
import Data.Bool (bool)
import qualified Data.ByteString.Base16 as Base16
import qualified Data.ByteString.Lazy as LBS
import Data.Foldable (for_)
import Data.List
import Data.Maybe (catMaybes)
import qualified Data.Text as T
import Data.Text.Encoding
import GHC.StaticPtr
import Options.Applicative
import Options.Applicative.Help.Pretty (text, (<$$>))
import System.Directory
import System.Environment
import System.FilePath
@@ -28,13 +22,12 @@ import System.Posix.Process (executeFile)

import Obelisk.App
import Obelisk.CliApp
import Obelisk.CliApp.Demo (cliDemo)
import Obelisk.Command.Deploy
import Obelisk.Command.Project
import Obelisk.Command.Run
import Obelisk.Command.Thunk
import Obelisk.Command.Utils
import qualified Obelisk.Command.VmBuilder as VmBuilder
import qualified Obelisk.Command.Preprocessor as Preprocessor


data Args = Args
@@ -88,50 +81,43 @@ data ObCommand
= ObCommand_Init InitSource Bool
| ObCommand_Deploy DeployCommand
| ObCommand_Run
| ObCommand_Profile String [String]
| ObCommand_Thunk ThunkCommand
| ObCommand_Repl
| ObCommand_Watch
| ObCommand_Shell ShellOpts
| ObCommand_Doc String [String] -- shell and list of packages
| ObCommand_Hoogle String Int -- shell and port
| ObCommand_Internal ObInternal
deriving Show

data ObInternal
= ObInternal_RunStaticIO StaticKey
| ObInternal_CLIDemo
-- the preprocessor argument syntax is also handled outside
-- optparse-applicative, but it shouldn't ever conflict with another syntax
= ObInternal_ApplyPackages String String String [String]
deriving Show

inNixShell' :: MonadObelisk m => StaticPtr (ObeliskT IO ()) -> m ()
inNixShell' p = withProjectRoot "." $ \root -> do
cmd <- liftIO $ unwords <$> mkCmd -- TODO: shell escape instead of unwords
projectShell root False "ghc" cmd
where
mkCmd = do
argsCfg <- getArgsConfig
myArgs <- getArgs
obArgs <- parseCLIArgs argsCfg myArgs
progName <- getObeliskExe
return $ progName : catMaybes
[ Just "--no-handoff"
, bool Nothing (Just "--verbose") $ _args_verbose obArgs
, Just "internal"
, Just "run-static-io"
, Just $ encodeStaticKey $ staticKey p
]

obCommand :: ArgsConfig -> Parser ObCommand
obCommand cfg = hsubparser
(mconcat
[ command "init" $ info (ObCommand_Init <$> initSource <*> initForce) $ progDesc "Initialize an Obelisk project"
, command "deploy" $ info (ObCommand_Deploy <$> deployCommand cfg) $ progDesc "Prepare a deployment for an Obelisk project"
, command "run" $ info (pure ObCommand_Run) $ progDesc "Run current project in development mode"
, command "thunk" $ info (ObCommand_Thunk <$> thunkCommand) $ progDesc "Manipulate thunk directories"
, command "repl" $ info (pure ObCommand_Repl) $ progDesc "Open an interactive interpreter"
, command "watch" $ info (pure ObCommand_Watch) $ progDesc "Watch current project for errors and warnings"
])
<|> subparser
(mconcat
[ internal
, command "internal" (info (ObCommand_Internal <$> internalCommand) mempty)
])
(mconcat
[ command "init" $ info (ObCommand_Init <$> initSource <*> initForce) $ progDesc "Initialize an Obelisk project"
, command "deploy" $ info (ObCommand_Deploy <$> deployCommand cfg) $ progDesc "Prepare a deployment for an Obelisk project"
, command "run" $ info (pure ObCommand_Run) $ progDesc "Run current project in development mode"
, command "profile" $ info (uncurry ObCommand_Profile <$> profileCommand) $ progDesc "Run current project with profiling enabled"
, command "thunk" $ info (ObCommand_Thunk <$> thunkCommand) $ progDesc "Manipulate thunk directories"
, command "repl" $ info (pure ObCommand_Repl) $ progDesc "Open an interactive interpreter"
, command "watch" $ info (pure ObCommand_Watch) $ progDesc "Watch current project for errors and warnings"
, command "shell" $ info (ObCommand_Shell <$> shellOpts) $ progDesc "Enter a shell with project dependencies"
, command "doc" $ info (ObCommand_Doc <$> shellFlags <*> packageNames) $
progDesc "List paths to haddock documentation for specified packages"
<> footerDoc (Just $
text "Hint: To open the documentation you can pipe the output of this command like"
<$$> text "ob doc reflex reflex-dom-core | xargs -n1 xdg-open")
, command "hoogle" $ info (ObCommand_Hoogle <$> shellFlags <*> portOpt 8080) $ progDesc "Run a hoogle server locally for your project's dependency tree"
])

packageNames :: Parser [String]
packageNames = some (strArgument (metavar "PACKAGE-NAME..."))

deployCommand :: ArgsConfig -> Parser DeployCommand
deployCommand cfg = hsubparser $ mconcat
@@ -143,7 +129,7 @@ deployCommand cfg = hsubparser $ mconcat
where
platformP = hsubparser $ mconcat
[ command "android" $ info (pure (Android, [])) mempty
, command "ios" $ info ((,) <$> pure IOS <*> (fmap pure $ strArgument (metavar "TEAMID" <> help "Your Team ID - found in the Apple developer portal"))) mempty
, command "ios" $ info ((,) <$> pure IOS <*> fmap pure (strArgument (metavar "TEAMID" <> help "Your Team ID - found in the Apple developer portal"))) mempty
]

remoteBuilderParser :: Parser (Maybe RemoteBuilder)
@@ -165,8 +151,8 @@ deployCommand cfg = hsubparser $ mconcat

deployInitOpts :: Parser DeployInitOpts
deployInitOpts = DeployInitOpts
<$> strArgument (action "directory" <> metavar "DEPLOYDIR" <> help "Path to a directory that it will create")
<*> strOption (long "ssh-key" <> action "file" <> metavar "SSHKEY" <> help "Path to an ssh key that it will symlink to")
<$> strArgument (action "directory" <> metavar "DEPLOYDIR" <> help "Path to a directory where the deployment repository will be initialized")
<*> strOption (long "ssh-key" <> action "file" <> metavar "SSHKEY" <> help "Path to an SSH key that will be *copied* to the deployment repository")
<*> some (strOption (long "hostname" <> metavar "HOSTNAME" <> help "hostname of the deployment target"))
<*> strOption (long "route" <> metavar "PUBLICROUTE" <> help "Publicly accessible URL of your app")
<*> strOption (long "admin-email" <> metavar "ADMINEMAIL" <> help "Email address where administrative alerts will be sent")
@@ -193,11 +179,24 @@ data DeployInitOpts = DeployInitOpts
}
deriving Show

internalCommand :: Parser ObInternal
internalCommand = subparser $ mconcat
[ command "run-static-io" $ info (ObInternal_RunStaticIO <$> argument (eitherReader decodeStaticKey) (action "static-key")) mempty
, command "clidemo" $ info (pure ObInternal_CLIDemo) mempty
]
profileCommand :: Parser (String, [String])
profileCommand = (,)
<$> strOption
( long "output"
<> short 'o'
<> help "Base output to use for profiling output. Suffixes are added to this based on the profiling type. Defaults to a timestamped path in the profile/ directory in the project's root."
<> metavar "PATH"
<> value "profile/%Y-%m-%dT%H:%M:%S"
<> showDefault
)
<*> (words <$> strOption
( long "rts-flags"
<> help "RTS Flags to pass to the executable."
<> value "-p -hc"
<> metavar "FLAGS"
<> showDefault
))


--TODO: Result should provide normalised path and also original user input for error reporting.
thunkDirectoryParser :: Parser FilePath
@@ -207,33 +206,70 @@ thunkDirectoryParser = fmap (dropTrailingPathSeparator . normalise) . strArgument
, help "Path to directory containing thunk data"
]

thunkConfig :: Parser ThunkConfig
thunkConfig = ThunkConfig
<$>
( flag' (Just True) (long "private" <> help "Mark thunks as pointing to a private repository")
<|> flag' (Just False) (long "public" <> help "Mark thunks as pointing to a public repository")
<|> pure Nothing
)

thunkUpdateConfig :: Parser ThunkUpdateConfig
thunkUpdateConfig = ThunkUpdateConfig
<$> optional (strOption (long "branch" <> metavar "BRANCH" <> help "Use the given branch when looking for the latest revision"))
<*> thunkConfig

thunkPackConfig :: Parser ThunkPackConfig
thunkPackConfig = ThunkPackConfig
<$> switch (long "force" <> short 'f' <> help "Force packing thunks even if there are branches not pushed upstream, uncommitted changes, stashes. This will cause changes that have not been pushed upstream to be lost; use with care.")
<*> thunkConfig

data ThunkCommand
= ThunkCommand_Update [FilePath] (Maybe String)
= ThunkCommand_Update [FilePath] ThunkUpdateConfig
| ThunkCommand_Unpack [FilePath]
| ThunkCommand_Pack [FilePath]
| ThunkCommand_Pack [FilePath] ThunkPackConfig
deriving Show

thunkCommand :: Parser ThunkCommand
thunkCommand = hsubparser $ mconcat
[ command "update" $ info (ThunkCommand_Update <$> some thunkDirectoryParser <*> optional (strOption (long "branch" <> metavar "BRANCH"))) $ progDesc "Update thunk to latest revision available"
[ command "update" $ info (ThunkCommand_Update <$> some thunkDirectoryParser <*> thunkUpdateConfig) $ progDesc "Update thunk to latest revision available"
, command "unpack" $ info (ThunkCommand_Unpack <$> some thunkDirectoryParser) $ progDesc "Unpack thunk into git checkout of revision it points to"
, command "pack" $ info (ThunkCommand_Pack <$> some thunkDirectoryParser) $ progDesc "Pack git checkout into thunk that points at the current branch's upstream"
, command "pack" $ info (ThunkCommand_Pack <$> some thunkDirectoryParser <*> thunkPackConfig) $ progDesc "Pack git checkout into thunk that points at the current branch's upstream"
]

data ShellOpts
= ShellOpts
{ _shellOpts_shell :: String
, _shellOpts_command :: Maybe String
}
deriving Show

shellFlags :: Parser String
shellFlags =
flag' "ghc" (long "ghc" <> help "Enter a shell environment having ghc (default)")
<|> flag "ghc" "ghcjs" (long "ghcjs" <> help "Enter a shell having ghcjs rather than ghc")
<|> strOption (short 'A' <> long "argument" <> metavar "NIXARG" <> help "Use the environment specified by the given nix argument of `shells'")

shellOpts :: Parser ShellOpts
shellOpts = ShellOpts
<$> shellFlags
<*> optional (strArgument (metavar "COMMAND"))

portOpt :: Int -> Parser Int
portOpt dfault = option auto (long "port" <> short 'p' <> help "Port number for server" <> showDefault <> value dfault <> metavar "INT")

parserPrefs :: ParserPrefs
parserPrefs = defaultPrefs
{ prefShowHelpOnEmpty = True
}

parseCLIArgs :: ArgsConfig -> [String] -> IO Args
parseCLIArgs cfg as = pure as >>= handleParseResult . execParserPure parserPrefs (argsInfo cfg)

-- | Create an Obelisk config for the current process.
mkObeliskConfig :: IO Obelisk
mkObeliskConfig = do
cliArgs <- getArgs
-- This function should not use argument parser (full argument parsing happens post handoff)
let logLevel = toLogLevel $ "-v" `elem` cliArgs
-- TODO: See if we can use the argument parser with a subset of the parsers to get logging level out.
let logLevel = toLogLevel $ any (`elem` ["-v", "--verbose"]) cliArgs
notInteractive <- not <$> isInteractiveTerm
cliConf <- newCliConfig logLevel notInteractive notInteractive $ \case
ObeliskError_ProcessError (ProcessFailure p code) ann ->
@@ -298,6 +334,9 @@ main' argsCfg = do
liftIO $ executeFile impl False ("--no-handoff" : myArgs) Nothing
case myArgs of
"--no-handoff" : as -> go as -- If we've been told not to hand off, don't hand off
origPath:inPath:outPath:preprocessorName:packagePaths
| preprocessorName == preprocessorIdentifier && any (\c -> c == '.' || c == pathSeparator) origPath ->
ob $ ObCommand_Internal $ ObInternal_ApplyPackages origPath inPath outPath packagePaths
a:as -- Otherwise bash completion would always hand-off even if the user isn't trying to
| "--bash-completion" `isPrefixOf` a
&& "--no-handoff" `elem` as -> go (a:as)
@@ -324,14 +363,13 @@ ob = \case
Right (ThunkData_Packed ptr) -> return ptr
Right (ThunkData_Checkout (Just ptr)) -> return ptr
Right (ThunkData_Checkout Nothing) ->
getThunkPtr' False root
getThunkPtr False root Nothing
let sshKeyPath = _deployInitOpts_sshKey deployOpts
hostname = _deployInitOpts_hostname deployOpts
route = _deployInitOpts_route deployOpts
adminEmail = _deployInitOpts_adminEmail deployOpts
enableHttps = _deployInitOpts_enableHttps deployOpts
deployInit thunkPtr (root </> "config") deployDir
sshKeyPath hostname route adminEmail enableHttps
deployInit thunkPtr deployDir sshKeyPath hostname route adminEmail enableHttps
DeployCommand_Push remoteBuilder -> do
deployPath <- liftIO $ canonicalizePath "."
deployPush deployPath $ case remoteBuilder of
@@ -339,35 +377,33 @@ ob = \case
Just RemoteBuilder_ObeliskVM -> (:[]) <$> VmBuilder.getNixBuildersArg
DeployCommand_Update -> deployUpdate "."
DeployCommand_Test (platform, extraArgs) -> deployMobile platform extraArgs
ObCommand_Run -> inNixShell' $ static run
-- inNixShell ($(mkClosure 'ghcidAction) ())
ObCommand_Run -> run
ObCommand_Profile basePath rtsFlags -> profile basePath rtsFlags
ObCommand_Thunk tc -> case tc of
ThunkCommand_Update thunks mBranch -> mapM_ ((flip updateThunkToLatest) mBranch) thunks
ThunkCommand_Unpack thunks -> mapM_ unpackThunk thunks
ThunkCommand_Pack thunks -> forM_ thunks packThunk
ThunkCommand_Update thunks config -> for_ thunks (updateThunkToLatest config)
ThunkCommand_Unpack thunks -> for_ thunks unpackThunk
ThunkCommand_Pack thunks config -> for_ thunks (packThunk config)
ObCommand_Repl -> runRepl
ObCommand_Watch -> inNixShell' $ static runWatch
ObCommand_Watch -> runWatch
ObCommand_Shell so -> withProjectRoot "." $ \root ->
projectShell root False (_shellOpts_shell so) (_shellOpts_command so)
ObCommand_Doc shell' pkgs -> withProjectRoot "." $ \root ->
projectShell root False shell' (Just $ haddockCommand pkgs)
ObCommand_Hoogle shell' port -> withProjectRoot "." $ \root -> do
nixShellWithHoogle root True shell' $ Just $ "hoogle server -p " <> show port <> " --local"
ObCommand_Internal icmd -> case icmd of
ObInternal_RunStaticIO k -> liftIO (unsafeLookupStaticPtr @(ObeliskT IO ()) k) >>= \case
Nothing -> failWith $ "ObInternal_RunStaticIO: no such StaticKey: " <> T.pack (show k)
Just p -> do
c <- getObelisk
liftIO $ runObelisk c $ deRefStaticPtr p
ObInternal_CLIDemo -> cliDemo
ObInternal_ApplyPackages origPath inPath outPath packagePaths -> do
liftIO $ Preprocessor.applyPackages origPath inPath outPath packagePaths

haddockCommand :: [String] -> String
haddockCommand pkgs = unwords
[ "for p in"
, unwords [getHaddockPath p ++ "/index.html" | p <- pkgs]
, "; do echo $p; done"
]
where getHaddockPath p = "$(ghc-pkg field " ++ p ++ " haddock-html --simple-output)"

--TODO: Clean up all the magic strings throughout this codebase

getArgsConfig :: IO ArgsConfig
getArgsConfig = pure $ ArgsConfig { _argsConfig_enableVmBuilderByDefault = System.Info.os == "darwin" }

encodeStaticKey :: StaticKey -> String
encodeStaticKey = T.unpack . decodeUtf8 . Base16.encode . LBS.toStrict . Binary.encode

-- TODO: Use failWith in place of fail to be consistent.
decodeStaticKey :: String -> Either String StaticKey
decodeStaticKey s = case Base16.decode $ encodeUtf8 $ T.pack s of
(b, "") -> case Binary.decodeOrFail $ LBS.fromStrict b of
Right ("", _, a) -> pure a
Right _ -> fail "decodeStaticKey: Binary.decodeOrFail didn't consume all input"
Left (_, _, e) -> fail $ "decodeStaticKey: Binary.decodeOrFail failed: " <> show e
_ -> fail $ "decodeStaticKey: could not decode hex string: " <> show s
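The `decodeStaticKey` above rejects input that is not fully consumed at both the hex and the `Binary` layer. A stdlib-only sketch of that same round-trip pattern, using `Int` in place of `StaticKey` and `Numeric`'s hex functions in place of `Base16`/`Binary` (both substitutions are assumptions for illustration):

```haskell
import Numeric (showHex, readHex)

-- Serialize a key to a hex string (stand-in for encodeStaticKey).
encodeKey :: Int -> String
encodeKey n = showHex n ""

-- Decode, accepting the result only if the entire input was consumed,
-- mirroring the ("", _, a) / (b, "") checks in decodeStaticKey.
decodeKey :: String -> Either String Int
decodeKey s = case readHex s of
  [(n, "")] -> Right n  -- all input consumed: accept
  _ -> Left ("decodeKey: could not decode hex string: " ++ show s)
```

The full-consumption check is what makes the parser total: trailing garbage such as `"ffx"` is rejected instead of silently truncated.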

@@ -4,92 +4,101 @@
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE ViewPatterns #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE TypeApplications #-}
module Obelisk.Command.Deploy where

import Control.Lens
import Control.Monad
import Control.Monad.Catch (Exception (displayException), MonadThrow, throwM, try, bracket_)
import Control.Monad.Catch (Exception (displayException), MonadThrow, bracket, throwM, try)
import Control.Monad.IO.Class (liftIO)
import Data.Aeson (FromJSON, ToJSON, encode, eitherDecode)
import Data.Bits
import qualified Data.ByteString.Lazy as BSL
import Data.Default
import qualified Data.HashMap.Strict as HM
import qualified Data.Map as Map
import Data.Text (Text)
import qualified Data.Set as Set
import qualified Data.Text as T
import qualified Data.Text.IO as T
import GHC.Generics
import Nix.Convert
import Nix.Pretty
import Nix.Value
import System.Directory
import System.Environment (getEnvironment)
import System.FilePath
import System.IO
import System.Posix.Files
import System.Process (delegate_ctlc, env, proc, cwd)
import Text.URI (URI)
import qualified Text.URI as URI
import Text.URI.Lens

import Obelisk.App (MonadObelisk)
import Obelisk.CliApp (Severity (..), callProcessAndLogOutput, failWith, putLog, withSpinner)
import Obelisk.CliApp (
Severity (..), callProcessAndLogOutput, failWith, proc, putLog,
setCwd, setDelegateCtlc, setEnvOverride, withSpinner)
import Obelisk.Command.Nix
import Obelisk.Command.Project
import Obelisk.Command.Thunk
import Obelisk.Command.Utils


deployInit
:: MonadObelisk m
=> ThunkPtr
-> FilePath
-> FilePath
-> FilePath
-> [String] -- ^ hostnames
-> String -- ^ route
-> String -- ^ admin email
-> Bool -- ^ enable https
-> m ()
deployInit thunkPtr configDir deployDir sshKeyPath hostnames route adminEmail enableHttps = do
(hasConfigDir, localKey) <- withSpinner ("Preparing " <> T.pack deployDir) $ do
hasConfigDir <- liftIO $ do
createDirectoryIfMissing True deployDir
doesDirectoryExist configDir
deployInit thunkPtr deployDir sshKeyPath hostnames route adminEmail enableHttps = do
liftIO $ createDirectoryIfMissing True deployDir
localKey <- withSpinner ("Preparing " <> T.pack deployDir) $ do
localKey <- liftIO (doesFileExist sshKeyPath) >>= \case
False -> failWith $ T.pack $ "ob deploy init: file does not exist: " <> sshKeyPath
True -> pure $ deployDir </> "ssh_key"
callProcessAndLogOutput (Notice, Error) $
proc "cp" [sshKeyPath, localKey]
proc cp [sshKeyPath, localKey]
liftIO $ setFileMode localKey $ ownerReadMode .|. ownerWriteMode
return $ (hasConfigDir, localKey)
return localKey
withSpinner "Validating configuration" $ do
void $ getHostFromRoute enableHttps route -- make sure that hostname is present
forM_ hostnames $ \hostname -> do
putLog Notice $ "Verifying host keys (" <> T.pack hostname <> ")"
-- Note: we can't use a spinner here as this function will prompt the user.
verifyHostKey (deployDir </> "backend_known_hosts") localKey hostname
when hasConfigDir $ withSpinner "Importing project configuration" $ do
--IMPORTANT: We cannot copy config directory from the development project to
--the deployment directory. If we do, it's very likely someone will
--accidentally create a production deployment that uses development
--credentials to connect to some resources. This could result in, e.g.,
--production data backed up to a dev environment.
withSpinner "Creating project configuration directories" $ do
callProcessAndLogOutput (Notice, Error) $
proc "cp" [ "-r" , "-T" , configDir , deployDir </> "config"]
proc "mkdir" [ "-p"
, deployDir </> "config" </> "backend"
, deployDir </> "config" </> "common"
, deployDir </> "config" </> "frontend"
]

let srcDir = deployDir </> "src"
withSpinner ("Creating source thunk (" <> T.pack (makeRelative deployDir srcDir) <> ")") $ liftIO $ do
createThunk srcDir thunkPtr
setupObeliskImpl deployDir

withSpinner "Writing deployment configuration" $ do
writeDeployConfig deployDir "backend_hosts" $ unlines hostnames
writeDeployConfig deployDir "enable_https" $ show enableHttps
writeDeployConfig deployDir "admin_email" adminEmail
writeDeployConfig deployDir ("config" </> "common" </> "route") $ route
withSpinner "Creating source thunk (./src)" $ liftIO $ do
createThunk (deployDir </> "src") thunkPtr
setupObeliskImpl deployDir
writeDeployConfig deployDir ("config" </> "common" </> "route") route
writeDeployConfig deployDir "module.nix" $
"(import " <> toNixPath (makeRelative deployDir srcDir) <> " {}).obelisk.serverModules.mkBaseEc2"

withSpinner ("Initializing git repository (" <> T.pack deployDir <> ")") $
initGit deployDir

setupObeliskImpl :: FilePath -> IO ()
setupObeliskImpl deployDir = do
let implDir = toImplDir deployDir
let
implDir = toImplDir deployDir
goBackUp = foldr (</>) "" $ (".." <$) $ splitPath $ makeRelative deployDir implDir
createDirectoryIfMissing True implDir
writeFile (implDir </> "default.nix") "(import ../../src {}).obelisk"
writeFile (implDir </> "default.nix") $ "(import " <> toNixPath (goBackUp </> "src") <> " {}).obelisk"
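The `goBackUp` fold above replaces the hardcoded `../../src` with a computed chain of `..` components, one per path segment between the impl directory and the deployment root. A small self-contained sketch of that computation (the directory names in the usage note are made up for illustration):

```haskell
import System.FilePath (makeRelative, splitPath, (</>))

-- Build a relative "../.." prefix that climbs from implDir back up to
-- deployDir: one ".." per segment of the relative path between them
-- (mirrors the foldr in setupObeliskImpl above).
goBackUp :: FilePath -> FilePath -> FilePath
goBackUp deployDir implDir =
  foldr (</>) "" (".." <$ splitPath (makeRelative deployDir implDir))
```

For an impl directory two levels below the deploy directory, such as `deploy/.obelisk/impl`, this yields a `../..` prefix, so appending `"src"` points back at the deployment's source thunk regardless of how deep the impl directory is nested.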

deployPush :: MonadObelisk m => FilePath -> m [String] -> m ()
deployPush deployPath getNixBuilders = do
@@ -103,11 +112,13 @@ deployPush deployPath getNixBuilders = do
Right (ThunkData_Packed ptr) -> return ptr
Right (ThunkData_Checkout _) -> do
checkGitCleanStatus srcPath True >>= \case
True -> packThunk srcPath
True -> packThunk (ThunkPackConfig False (ThunkConfig Nothing)) srcPath
False -> failWith $ T.pack $ "ob deploy push: ensure " <> srcPath <> " has no pending changes and latest is pushed upstream."
Left err -> failWith $ "ob deploy push: couldn't read src thunk: " <> T.pack (show err)
let version = show . _thunkRev_commit $ _thunkPtr_rev thunkPtr
builders <- getNixBuilders
let moduleFile = deployPath </> "module.nix"
moduleFileExists <- liftIO $ doesFileExist moduleFile
buildOutputByHost <- ifor (Map.fromSet (const ()) hosts) $ \host () -> do
--TODO: What does it mean if this returns more or less than 1 line of output?
[result] <- fmap lines $ nixCmd $ NixCmd_Build $ def
@@ -117,13 +128,13 @@ deployPush deployPath getNixBuilders = do
, _target_expr = Nothing
}
& nixBuildConfig_outLink .~ OutLink_None
& nixCmdConfig_args .~
& nixCmdConfig_args .~ (
[ strArg "hostName" host
, strArg "adminEmail" adminEmail
, strArg "routeHost" routeHost
, strArg "version" version
, boolArg "enableHttps" enableHttps
]
] <> [rawArg "module" ("import " <> toNixPath moduleFile) | moduleFileExists ])
& nixCmdConfig_builders .~ builders
pure result
let knownHostsPath = deployPath </> "backend_known_hosts"
@@ -131,7 +142,7 @@ deployPush deployPath getNixBuilders = do
withSpinner "Uploading closures" $ ifor_ buildOutputByHost $ \host outputPath -> do
callProcess'
(Map.fromList [("NIX_SSHOPTS", unwords sshOpts)])
"nix-copy-closure" ["-v", "--to", "root@" <> host, "--gzip", outputPath]
"nix-copy-closure" ["-v", "--to", "--use-substitutes", "root@" <> host, "--gzip", outputPath]
withSpinner "Uploading config" $ ifor_ buildOutputByHost $ \host _ -> do
callProcessAndLogOutput (Notice, Warning) $
proc "rsync"
@@ -153,7 +164,7 @@ deployPush deployPath getNixBuilders = do
]
isClean <- checkGitCleanStatus deployPath True
when (not isClean) $ do
withSpinner "Commiting changes to Git" $ do
withSpinner "Committing changes to Git" $ do
callProcessAndLogOutput (Debug, Error) $ proc "git"
["-C", deployPath, "add", "."]
callProcessAndLogOutput (Debug, Error) $ proc "git"
@@ -161,25 +172,11 @@ deployPush deployPath getNixBuilders = do
putLog Notice $ "Deployed => " <> T.pack route
where
callProcess' envMap cmd args = do
processEnv <- Map.toList . (envMap <>) . Map.fromList <$> liftIO getEnvironment
let p = (proc cmd args) { delegate_ctlc = True, env = Just processEnv }
let p = setEnvOverride (envMap <>) $ setDelegateCtlc True $ proc cmd args
callProcessAndLogOutput (Notice, Notice) p

deployUpdate :: MonadObelisk m => FilePath -> m ()
deployUpdate deployPath = updateThunkToLatest (deployPath </> "src") Nothing

keytoolToAndroidConfig :: KeytoolConfig -> HM.HashMap Text (NValueNF Identity)
keytoolToAndroidConfig conf = runIdentity $ do
path <- toValue $ Path $ _keytoolConfig_keystore conf
storepass <- toValue $ T.pack $ _keytoolConfig_storepass conf
alias <- toValue $ T.pack $ _keytoolConfig_alias conf
keypass <- toValue $ T.pack $ _keytoolConfig_keypass conf
return $ HM.fromList
[ ("storeFile", path)
, ("storePassword", storepass)
, ("keyAlias", alias)
, ("keyPassword", keypass)
]
deployUpdate deployPath = updateThunkToLatest (ThunkUpdateConfig Nothing (ThunkConfig Nothing)) (deployPath </> "src")

data PlatformDeployment = Android | IOS
deriving (Show, Eq)
@@ -189,13 +186,13 @@ renderPlatformDeployment = \case
Android -> "android"
IOS -> "ios"

deployMobile :: MonadObelisk m => PlatformDeployment -> [String] -> m ()
deployMobile :: forall m. MonadObelisk m => PlatformDeployment -> [String] -> m ()
deployMobile platform mobileArgs = withProjectRoot "." $ \root -> do
let srcDir = root </> "src"
configDir = root </> "config"
exists <- liftIO $ doesDirectoryExist srcDir
unless exists $ failWith "ob test should be run inside of a deploy directory"
nixBuildTarget <- case platform of
(nixBuildTarget, extraArgs) <- case platform of
Android -> do
let keystorePath = root </> "android_keystore.jks"
keytoolConfPath = root </> "android_keytool_config.json"
@@ -214,48 +211,60 @@ deployMobile platform mobileArgs = withProjectRoot "." $ \root -> do
, _keytoolConfig_storepass = keyStorePassword
, _keytoolConfig_keypass = keyStorePassword
}
createKeystore root $ keyToolConf
createKeystore root keyToolConf
liftIO $ BSL.writeFile keytoolConfPath $ encode keyToolConf
checkKeytoolConfExist <- liftIO $ doesFileExist keytoolConfPath
unless checkKeytoolConfExist $ failWith "Missing android KeytoolConfig"
keytoolConfContents <- liftIO $ BSL.readFile keytoolConfPath
liftIO $ putStrLn $ show keytoolConfContents
releaseKey <- case eitherDecode keytoolConfContents of
keyArgs <- case eitherDecode keytoolConfContents :: Either String KeytoolConfig of
Left err -> failWith $ T.pack err
Right conf -> do
let nvset = toValue @(HM.HashMap Text (NValueNF Identity)) @Identity @(NValueNF Identity) $ keytoolToAndroidConfig conf
return $ printNix $ runIdentity nvset
Right conf -> pure
[ "--sign"
, "--store-file", _keytoolConfig_keystore conf
, "--store-password", _keytoolConfig_storepass conf
, "--key-alias", _keytoolConfig_alias conf
, "--key-password", _keytoolConfig_keypass conf
]
let expr = mconcat
[ "with (import ", srcDir, " {});"
, "android.frontend.override (drv: { "
, "releaseKey = (if builtins.isNull drv.releaseKey then {} else drv.releaseKey) // " <> releaseKey <> "; "
, "staticSrc = (passthru.__android ", configDir, ").frontend.staticSrc;"
[ "with (import ", toNixPath srcDir, " {});"
, "android.frontend.override (drv: {"
, "isRelease = true;"
, "staticSrc = (passthru.__androidWithConfig ", configDir, ").frontend.staticSrc;"
, "assets = (passthru.__androidWithConfig ", configDir, ").frontend.assets;"
, "})"
]
return $ Target
return (Target
{ _target_path = Nothing
, _target_attr = Nothing
, _target_expr = Just expr
}
}, keyArgs)
IOS -> do
let expr = mconcat
[ "with (import ", srcDir, " {});"
, "ios.frontend.override (_: { staticSrc = (passthru.__ios ", configDir, ").frontend.staticSrc; })"
[ "with (import ", toNixPath srcDir, " {});"
, "ios.frontend.override (_: { staticSrc = (passthru.__iosWithConfig ", toNixPath configDir, ").frontend.staticSrc; })"
]
return $ Target
return (Target
{ _target_path = Nothing
, _target_attr = Nothing
, _target_expr = Just expr
}
}, [])
result <- nixCmd $ NixCmd_Build $ def
& nixBuildConfig_outLink .~ OutLink_None
& nixCmdConfig_target .~ nixBuildTarget
putLog Notice $ T.pack $ "Your recently built android apk can be found at the following path: " <> (show result)
callProcessAndLogOutput (Notice, Error) $ proc (result </> "bin" </> "deploy") mobileArgs
let mobileArtifact = case platform of
IOS -> "iOS App"
Android -> "Android APK"
putLog Notice $ T.pack $ unwords ["Your recently built", mobileArtifact, "can be found at the following path:", show result]
callProcessAndLogOutput (Notice, Error) $ proc (result </> "bin" </> "deploy") (mobileArgs ++ extraArgs)
where
withEcho showEcho f = do
prevEcho <- hGetEcho stdin
bracket_ (hSetEcho stdin showEcho) (hSetEcho stdin prevEcho) f
withEcho showEcho f = bracket
(do
prevEcho <- hGetEcho stdin
hSetEcho stdin showEcho
pure prevEcho
)
(hSetEcho stdin)
(const f)
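The `bracket` rewrite of `withEcho` above fixes a subtle ordering bug in the `bracket_` version: the previous echo state must be read *before* the new one is installed, and threaded to the cleanup action so it is restored even if the body throws. A stdlib-only sketch of the same save/install/restore idiom against an `IORef` (`withSaved` is a made-up name for illustration):

```haskell
import Control.Exception (bracket)
import Data.IORef (IORef, readIORef, writeIORef)

-- Acquire: read the old value, install the new one, and return the old
-- value so that release can restore it -- exactly the shape of the
-- rewritten withEcho, with an IORef standing in for the terminal state.
withSaved :: IORef a -> a -> IO b -> IO b
withSaved ref newVal act = bracket
  (do old <- readIORef ref
      writeIORef ref newVal
      pure old)
  (writeIORef ref)  -- release: restore the saved value, even on exception
  (const act)
```

The key design point is that `bracket`'s acquire result is passed to release, so the saved state never has to live in a variable captured before the exception-safe region begins.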
|
||||
|
||||
data KeytoolConfig = KeytoolConfig
{ _keytoolConfig_keystore :: FilePath
@@ -268,19 +277,16 @@ instance FromJSON KeytoolConfig
instance ToJSON KeytoolConfig

createKeystore :: MonadObelisk m => FilePath -> KeytoolConfig -> m ()
createKeystore root config = do
let expr = "with (import " <> toImplDir root <> ").reflex-platform.nixpkgs; pkgs.mkShell { buildInputs = [ pkgs.jdk ]; }"
callProcessAndLogOutput (Notice,Notice) $ (proc "nix-shell" ["-E" , expr, "--run" , keytoolCmd]) { cwd = Just root }
where
keytoolCmd = processToShellString "keytool"
[ "-genkeypair", "-noprompt"
, "-keystore", _keytoolConfig_keystore config
, "-keyalg", "RSA", "-keysize", "2048"
, "-validity", "1000000000"
, "-storepass", _keytoolConfig_storepass config
, "-alias", _keytoolConfig_alias config
, "-keypass", _keytoolConfig_keypass config
]
createKeystore root config =
callProcessAndLogOutput (Notice, Notice) $ setCwd (Just root) $ proc jreKeyToolPath
[ "-genkeypair", "-noprompt"
, "-keystore", _keytoolConfig_keystore config
, "-keyalg", "RSA", "-keysize", "2048"
, "-validity", "1000000"
, "-storepass", _keytoolConfig_storepass config
, "-alias", _keytoolConfig_alias config
, "-keypass", _keytoolConfig_keypass config
]

-- | Simplified deployment configuration mechanism. At one point we may revisit this.
writeDeployConfig :: MonadObelisk m => FilePath -> FilePath -> String -> m ()

@@ -9,22 +9,33 @@
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE TypeApplications #-}
module Obelisk.Command.Nix
( nixCmd
, NixCmd (..)
, nixCmdConfig_target
, nixCmdConfig_args
, nixCmdConfig_builders
( Arg (..)
, NixBuildConfig (..)
, nixBuildConfig_common
, nixBuildConfig_outLink
, NixCmd (..)
, nixCmdConfig_args
, nixCmdConfig_builders
, nixCmdConfig_target
, NixCommonConfig (..)
, NixInstantiateConfig (..)
, nixInstantiateConfig_eval
, NixCommonConfig (..)
, Target (..)
, NixShellConfig (..)
, nixShellConfig_common
, nixShellConfig_pure
, nixShellConfig_run
, OutLink (..)
, Arg (..)
, Target (..)
, target_attr
, target_expr
, target_path

, boolArg
, nixCmd
, nixCmdProc
, nixCmdProc'
, rawArg
, runNixShellConfig
, strArg
) where

@@ -37,7 +48,6 @@ import Data.List (intercalate)
import Data.Maybe
import Data.Monoid ((<>))
import qualified Data.Text as T
import System.Process (proc)

import Obelisk.App (MonadObelisk)
import Obelisk.CliApp
@@ -63,10 +73,10 @@ data Arg
deriving (Eq, Show)

strArg :: String -> String -> Arg
strArg k = Arg_Str k
strArg = Arg_Str

rawArg :: String -> String -> Arg
rawArg k = Arg_Expr k
rawArg = Arg_Expr

boolArg :: String -> Bool -> Arg
boolArg k = Arg_Expr k . bool "false" "true"
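The `strArg`/`rawArg`/`boolArg` helpers simply wrap the `Arg` constructors, with `boolArg` lowering a `Bool` to a Nix boolean literal via `Data.Bool.bool`. A self-contained sketch of the same pattern; the `renderArg` function is a hypothetical illustration of how such arguments typically map to `nix-build`'s `--argstr`/`--arg` flags, not Obelisk's actual rendering code:

```haskell
import Data.Bool (bool)

data Arg
  = Arg_Str String String   -- key and literal string value
  | Arg_Expr String String  -- key and raw Nix expression
  deriving (Eq, Show)

strArg :: String -> String -> Arg
strArg = Arg_Str

rawArg :: String -> String -> Arg
rawArg = Arg_Expr

-- Booleans become the Nix expressions "true"/"false".
boolArg :: String -> Bool -> Arg
boolArg k = Arg_Expr k . bool "false" "true"

-- Hypothetical rendering to command-line flags, for illustration only.
renderArg :: Arg -> [String]
renderArg (Arg_Str k v)  = ["--argstr", k, v]
renderArg (Arg_Expr k v) = ["--arg", k, v]

main :: IO ()
main = do
  print (renderArg (boolArg "useWarp" True))
  print (renderArg (strArg "name" "frontend"))
```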
@@ -86,7 +96,7 @@ makeClassy ''NixCommonConfig
instance Default NixCommonConfig where
def = NixCommonConfig def mempty mempty

runNixCommonConfig :: NixCommonConfig -> [FilePath]
runNixCommonConfig :: NixCommonConfig -> [String]
runNixCommonConfig cfg = mconcat [maybeToList path, attrArg, exprArg, args, buildersArg]
where
path = _target_path $ _nixCmdConfig_target cfg
@@ -124,7 +134,7 @@ instance HasNixCommonConfig NixBuildConfig where
instance Default NixBuildConfig where
def = NixBuildConfig def def

runNixBuildConfig :: NixBuildConfig -> [FilePath]
runNixBuildConfig :: NixBuildConfig -> [String]
runNixBuildConfig cfg = mconcat
[ runNixCommonConfig $ cfg ^. nixCommonConfig
, case _nixBuildConfig_outLink cfg of
@@ -145,12 +155,26 @@ instance HasNixCommonConfig NixInstantiateConfig where
instance Default NixInstantiateConfig where
def = NixInstantiateConfig def False

runNixInstantiateConfig :: NixInstantiateConfig -> [FilePath]
runNixInstantiateConfig :: NixInstantiateConfig -> [String]
runNixInstantiateConfig cfg = mconcat
[ runNixCommonConfig $ cfg ^. nixCommonConfig
, "--eval" <$ guard (_nixInstantiateConfig_eval cfg)
]

data NixShellConfig = NixShellConfig
{ _nixShellConfig_common :: NixCommonConfig
, _nixShellConfig_pure :: Bool
, _nixShellConfig_run :: Maybe String
}

makeLenses ''NixShellConfig

instance HasNixCommonConfig NixShellConfig where
nixCommonConfig = nixShellConfig_common

instance Default NixShellConfig where
def = NixShellConfig def False Nothing

data NixCmd
= NixCmd_Build NixBuildConfig
| NixCmd_Instantiate NixInstantiateConfig
@@ -158,27 +182,43 @@ data NixCmd
instance Default NixCmd where
def = NixCmd_Build def

nixCmd :: MonadObelisk m => NixCmd -> m FilePath
nixCmd cmdCfg = withSpinner' ("Running " <> cmd <> desc) (Just $ const $ "Built " <> desc) $ do
output <- readProcessAndLogStderr Debug $ proc (T.unpack cmd) $ options
-- Remove final newline that Nix appends
Just (outPath, '\n') <- pure $ T.unsnoc output
pure $ T.unpack outPath
runNixShellConfig :: NixShellConfig -> [String]
runNixShellConfig cfg = mconcat
[ runNixCommonConfig $ cfg ^. nixCommonConfig
, [ "--pure" | cfg ^. nixShellConfig_pure ]
] ++ mconcat [
["--run", run] | run <- maybeToList $ cfg ^. nixShellConfig_run
]
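`runNixShellConfig` builds its flag list from optional pieces: a list-comprehension guard emits `--pure` only when the flag is set, and `maybeToList` splices in `--run CMD` only when a command is present. A self-contained sketch of that idiom; `ShellCfg` here is a simplified stand-in for `NixShellConfig`:

```haskell
import Data.Maybe (maybeToList)

data ShellCfg = ShellCfg
  { cfgPure :: Bool          -- pass --pure?
  , cfgRun  :: Maybe String  -- optional --run command
  }

renderShell :: ShellCfg -> [String]
renderShell cfg = mconcat
  [ [ "--pure" | cfgPure cfg ]                         -- emitted only when True
  ] ++ mconcat
  [ ["--run", run] | run <- maybeToList (cfgRun cfg) ] -- both flag and value, or nothing

main :: IO ()
main = do
  print (renderShell (ShellCfg True (Just "ghc --version")))
  print (renderShell (ShellCfg False Nothing))
```

Splicing the flag and its value together (rather than guarding each separately) guarantees `--run` can never appear without an argument.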

nixCmdProc :: NixCmd -> ProcessSpec
nixCmdProc = fst . nixCmdProc'

nixCmdProc' :: NixCmd -> (ProcessSpec, T.Text)
nixCmdProc' cmdCfg = (proc (T.unpack cmd) options, cmd)
where
(cmd, options, commonCfg) = case cmdCfg of
(cmd, options) = case cmdCfg of
NixCmd_Build cfg' ->
( "nix-build"
, runNixBuildConfig cfg'
, cfg' ^. nixCommonConfig
)
NixCmd_Instantiate cfg' ->
( "nix-instantiate"
, runNixInstantiateConfig cfg'
, cfg' ^. nixCommonConfig
)

nixCmd :: MonadObelisk m => NixCmd -> m FilePath
nixCmd cmdCfg = withSpinner' ("Running " <> cmd <> desc) (Just $ const $ "Built " <> desc) $ do
output <- readProcessAndLogStderr Debug cmdProc
-- Remove final newline that Nix appends
Just (outPath, '\n') <- pure $ T.unsnoc output
pure $ T.unpack outPath
where
(cmdProc, cmd) = nixCmdProc' cmdCfg
commonCfg = case cmdCfg of
NixCmd_Build cfg' -> cfg' ^. nixCommonConfig
NixCmd_Instantiate cfg' -> cfg' ^. nixCommonConfig
path = commonCfg ^. nixCmdConfig_target . target_path
desc = T.pack $ mconcat $ catMaybes
[ (" on " <>) <$> path
[ ("on " <>) <$> path
, (\a -> " [" <> a <> "]") <$> (commonCfg ^. nixCmdConfig_target . target_attr)
]

96
lib/command/src/Obelisk/Command/Preprocessor.hs
Normal file
@@ -0,0 +1,96 @@
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}

module Obelisk.Command.Preprocessor where

import qualified Data.ByteString.Lazy as BL
import qualified Data.ByteString.Lazy.Builder as BU
import Data.List (intersperse, isPrefixOf, sortOn)
import Data.Maybe (fromMaybe)
import qualified Data.Text.Lazy.Builder as TL
import qualified Data.Text.Lazy.Encoding as TL
import Distribution.Compiler (CompilerFlavor (..))
import Language.Haskell.Extension (Extension (..), Language(..))
import System.IO (IOMode (..), hClose, hPutStrLn, openFile, stderr)
import System.FilePath (hasTrailingPathSeparator, joinPath, normalise, splitPath)

import Obelisk.Command.Run (CabalPackageInfo (..), parseCabalPackage')

applyPackages :: FilePath -> FilePath -> FilePath -> [FilePath] -> IO ()
applyPackages origPath inPath outPath packagePaths = do
-- This code is intended to be executed via ghci's -pgmF preprocessor option
-- The command line arguments are passed in via ghc, which dictates the first three options and meanings
-- In order for this code to execute, origPath must contain either a '.' character or a '/' character.
-- (This is to avoid the possibility of the command line syntax conflicting with another ob command)
-- We do have control over the remaining arguments, but they must be the same for all files.
-- Thus, the fourth command line argument must be "apply-packages", which has already been handled.
-- We assume all the remaining arguments passed in are paths to cabal or hpack package specifications.

outFile <- openFile outPath WriteMode
let hPutTextBuilder h = BU.hPutBuilder h . TL.encodeUtf8Builder . TL.toLazyText

-- putStr "--------------------------------------------------------------------------------\n"
-- print args
-- putStr "--------------------------------------------------------------------------------\n"

-- Thus we must select among the packagePaths for the file we are going to parse.

let takeDirs = takeWhile hasTrailingPathSeparator
packageDirs = sortOn (negate . length . takeDirs) $ map (splitPath . normalise) packagePaths
origDir = splitPath $ normalise origPath
matches = [ joinPath d | d <- packageDirs, takeDirs d `isPrefixOf` origDir ]

-- So the first element of matches is going to be the deepest path to a package spec that contains
-- our file as a subdirectory.

case matches of
[] -> hPutTextBuilder outFile (lineNumberPragma origPath) -- TODO: probably should produce a warning
(packagePath:_) -> do
parseCabalPackage' packagePath >>= \case
Left err ->
hPutStrLn stderr $ "Error: Unable to parse cabal package " <> packagePath <> "; Skipping preprocessor on " <> origPath <> ". Error was " <> show err
Right (_warnings, packageInfo) ->
hPutTextBuilder outFile (generateHeader origPath packageInfo)

BL.readFile inPath >>= BL.hPut outFile
hClose outFile

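The package-selection logic above splits each candidate path into components, keeps only the directory components, sorts candidates deepest-first, and picks the first package spec whose directory is a prefix of the source file's path. A runnable sketch of that matching, using the same `System.FilePath` helpers (the example package paths are invented):

```haskell
import Data.List (isPrefixOf, sortOn)
import System.FilePath (hasTrailingPathSeparator, joinPath, normalise, splitPath)

-- Deepest package spec whose directory is a prefix of the given file's path.
deepestMatch :: [FilePath] -> FilePath -> Maybe FilePath
deepestMatch packagePaths origPath =
  case [ joinPath d | d <- packageDirs, takeDirs d `isPrefixOf` origDir ] of
    (m:_) -> Just m
    []    -> Nothing
  where
    takeDirs = takeWhile hasTrailingPathSeparator  -- keep "dir/" components only
    packageDirs = sortOn (negate . length . takeDirs)
      $ map (splitPath . normalise) packagePaths   -- deepest directories first
    origDir = splitPath (normalise origPath)

main :: IO ()
main = do
  let pkgs = ["frontend/frontend.cabal", "common/common.cabal", "frontend/sub/inner.cabal"]
  print (deepestMatch pkgs "frontend/sub/src/Main.hs")
  print (deepestMatch pkgs "backend/src/Main.hs")
```

Sorting by directory depth first is what makes the head of `matches` the most specific enclosing package.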
-- I'm pretty sure there's a certain amount of oversimplification in CabalPackageInfo, so I doubt this is fully robust.

generateHeader :: FilePath -> CabalPackageInfo -> TL.Builder
generateHeader origPath packageInfo =
hsExtensions <> ghcOptions <> lineNumberPragma origPath
where
hsExtensions =
if not (null extList)
then TL.fromText "{-# LANGUAGE "
<> mconcat (intersperse (TL.fromText ", ") extList)
<> TL.fromText " #-}\n"
else mempty
extList = addDefaultLanguage $ concatMap showExt $ _cabalPackageInfo_defaultExtensions packageInfo
addDefaultLanguage =
case _cabalPackageInfo_defaultLanguage packageInfo of
Nothing -> id
Just x -> case x of
UnknownLanguage ext -> ( TL.fromString ext :)
ext -> ( TL.fromString (show ext) :)
showExt = \case
EnableExtension ext -> [TL.fromString (show ext)]
DisableExtension _ -> []
UnknownExtension ext -> [TL.fromString ext]

ghcOptions =
if not (null optList)
then TL.fromText "{-# OPTIONS_GHC "
<> mconcat (intersperse (TL.fromText " ") optList)
<> TL.fromText " #-}\n"
else mempty
optList = map TL.fromString
$ filter (\x -> not (isPrefixOf "-O" x))
$ fromMaybe []
$ lookup GHC (_cabalPackageInfo_compilerOptions packageInfo)

lineNumberPragma :: FilePath -> TL.Builder
lineNumberPragma origPath =
TL.fromText "{-# LINE 1 \"" <> TL.fromString origPath <> TL.fromText "\" #-}\n"
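`generateHeader` emits a `{-# LANGUAGE ... #-}` line, a `{-# OPTIONS_GHC ... #-}` line, and finally a `{-# LINE 1 ... #-}` pragma so GHC reports errors against the original file rather than the preprocessor output. A small runnable sketch of the same builder pattern with `Data.Text.Lazy.Builder` (the helper names and example inputs are invented for illustration):

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.List (intersperse)
import qualified Data.Text.Lazy as TL
import qualified Data.Text.Lazy.Builder as B

-- Render a pragma like {-# LANGUAGE A, B #-}, or nothing when the list is empty.
pragma :: TL.Text -> TL.Text -> [TL.Text] -> B.Builder
pragma name sep items
  | null items = mempty
  | otherwise =
      B.fromLazyText ("{-# " <> name <> " ")
        <> mconcat (intersperse (B.fromLazyText sep) (map B.fromLazyText items))
        <> B.fromLazyText " #-}\n"

-- LINE pragma pointing GHC back at the original source file.
linePragma :: FilePath -> B.Builder
linePragma path = B.fromLazyText ("{-# LINE 1 \"" <> TL.pack path <> "\" #-}\n")

main :: IO ()
main = putStr $ TL.unpack $ B.toLazyText $
  pragma "LANGUAGE" ", " ["LambdaCase", "OverloadedStrings"]
    <> pragma "OPTIONS_GHC" " " ["-Wall"]
    <> linePragma "src/Main.hs"
```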
@@ -3,37 +3,55 @@
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.Command.Project
( InitSource (..)
, initProject
, findProjectObeliskCommand
, findProjectRoot
, withProjectRoot
, inProjectShell
, inImpureProjectShell
, findProjectAssets
, initProject
, nixShellRunConfig
, nixShellRunProc
, nixShellWithHoogle
, nixShellWithPkgs
, obeliskDirName
, projectShell

, toObeliskDir
, toImplDir
, toNixPath
, toObeliskDir
, withProjectRoot
) where

import Control.Concurrent.MVar (MVar, newMVar, withMVarMasked)
import Control.Lens ((.~), (?~), (<&>))
import Control.Monad
import Control.Monad.IO.Class (liftIO)
import Control.Monad.State
import qualified Data.Aeson as Json
import Data.Bits
import Data.Function (on)
import qualified Data.ByteString.Lazy as BSL
import Data.Default (def)
import Data.Function ((&), on)
import Data.List (isInfixOf)
import Data.Map (Map)
import Data.Text (Text)
import qualified Data.Text as T
import Data.Text.Encoding (decodeUtf8)
import Data.Traversable (for)
import System.Directory
import System.Environment (lookupEnv)
import System.FilePath
import System.IO.Temp
import System.Posix (FileStatus, UserID, deviceID, fileID, fileMode, fileOwner, getFileStatus, getRealUserID)
import System.IO.Unsafe (unsafePerformIO)
import System.Posix (FileStatus, FileMode, CMode (..), UserID, deviceID, fileID, fileMode, fileOwner, getFileStatus, getRealUserID)
import System.Posix.Files
import System.Process (CreateProcess, cwd, proc, waitForProcess, delegate_ctlc)

import GitHub.Data.GitData (Branch)
import GitHub.Data.Name (Name)

import Obelisk.App (MonadObelisk)
import Obelisk.CliApp
import Obelisk.Command.Nix
import Obelisk.Command.Thunk
import Obelisk.Command.Utils (nixExePath, nixBuildExePath)

--TODO: Make this module resilient to random exceptions

--TODO: Don't hardcode this
@@ -47,17 +65,21 @@ obeliskSourceWithBranch branch = ThunkSource_GitHub $ GitHubSource
{ _gitHubSource_owner = "obsidiansystems"
, _gitHubSource_repo = "obelisk"
, _gitHubSource_branch = Just branch
, _gitHubSource_private = False
}

data InitSource
= InitSource_Default
| InitSource_Branch (Name Branch)
| InitSource_Symlink FilePath
deriving Show
= InitSource_Default
| InitSource_Branch (Name Branch)
| InitSource_Symlink FilePath
deriving Show

obeliskDirName :: FilePath
obeliskDirName = ".obelisk"

-- | Path to obelisk directory in given path
toObeliskDir :: FilePath -> FilePath
toObeliskDir p = p </> ".obelisk"
toObeliskDir p = p </> obeliskDirName

-- | Path to impl file in given path
toImplDir :: FilePath -> FilePath
@@ -114,15 +136,16 @@ initProject source force = withSystemTempDirectory "ob-init" $ \tmpDir -> do
findProjectObeliskCommand :: MonadObelisk m => FilePath -> m (Maybe FilePath)
findProjectObeliskCommand target = do
myUid <- liftIO getRealUserID
processUmask <- liftIO getUmask
targetStat <- liftIO $ getFileStatus target
(result, insecurePaths) <- flip runStateT [] $ walkToProjectRoot target targetStat myUid >>= \case
(result, insecurePaths) <- flip runStateT [] $ walkToProjectRoot target targetStat processUmask myUid >>= \case
Nothing -> pure Nothing
Just projectRoot -> liftIO (doesDirectoryExist $ toImplDir projectRoot) >>= \case
False -> do
putLog Warning $ "Found obelisk directory in " <> T.pack projectRoot <> " but the implementation (impl) file is missing"
pure Nothing
True -> do
walkToImplDir projectRoot myUid -- For security check
walkToImplDir projectRoot myUid processUmask -- For security check
return $ Just projectRoot
case (result, insecurePaths) of
(Just projDir, []) -> do
@@ -133,17 +156,37 @@ findProjectObeliskCommand target = do
putLog Error $ T.unlines
[ "Error: Found a project at " <> T.pack (normalise projDir) <> ", but had to traverse one or more insecure directories to get there:"
, T.unlines $ fmap (T.pack . normalise) insecurePaths
, "Please ensure that all of these directories are owned by you and are not writable by anyone else."
, "Please ensure that all of these directories are owned by you, not world-writable, and no more group-writable than permitted by your umask."
]
return Nothing

-- | Get the umask for the Obelisk process.
--
-- Because of
-- http://man7.org/linux/man-pages/man2/umask.2.html#NOTES we have to set the
-- umask to read it. We are using 'withMVarMasked' to guarantee that setting and
-- reading isn't interrupted by any exception or interleaved with another thread.
getUmask :: IO FileMode
getUmask = withMVarMasked globalUmaskLock $ \() -> do
initialMask <- setFileCreationMask safeUmask
void (setFileCreationMask initialMask)
pure initialMask
where
safeUmask :: FileMode
safeUmask = CMode 0o22

{-# NOINLINE globalUmaskLock #-}
globalUmaskLock :: MVar ()
globalUmaskLock = unsafePerformIO (newMVar ())

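As the comment notes, POSIX only lets a process read its umask by setting it, so the code installs a safe temporary mask and immediately restores the original. A minimal runnable sketch of that set-and-restore dance with `System.Posix.Files.setFileCreationMask` (POSIX-only; it omits the MVar serialization and exception masking, so unlike the real `getUmask` it is only safe single-threaded):

```haskell
import System.Posix.Files (setFileCreationMask)
import System.Posix.Types (FileMode)

-- Read the process umask by setting a temporary safe value and restoring it.
-- Not thread-safe or async-exception-safe; the real code serializes with an MVar.
readUmask :: IO FileMode
readUmask = do
  initial <- setFileCreationMask 0o22  -- returns the previous (real) umask
  _ <- setFileCreationMask initial     -- put it back immediately
  pure initial

main :: IO ()
main = do
  a <- readUmask
  b <- readUmask
  print (a == b)  -- reading twice must not change the umask
```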
-- | Get the FilePath to the containing project directory, if there is one
findProjectRoot :: MonadObelisk m => FilePath -> m (Maybe FilePath)
findProjectRoot target = do
myUid <- liftIO getRealUserID
targetStat <- liftIO $ getFileStatus target
(result, _) <- liftIO $ runStateT (walkToProjectRoot target targetStat myUid) []
return result
umask <- liftIO getUmask
(result, _) <- liftIO $ runStateT (walkToProjectRoot target targetStat umask myUid) []
return $ makeRelative "." <$> result

withProjectRoot :: MonadObelisk m => FilePath -> (FilePath -> m a) -> m a
withProjectRoot target f = findProjectRoot target >>= \case
@@ -155,15 +198,15 @@ withProjectRoot target f = findProjectRoot target >>= \case
-- traversed in the process. Return the project root directory, if found.
walkToProjectRoot
:: (MonadState [FilePath] m, MonadIO m)
=> FilePath -> FileStatus -> UserID -> m (Maybe FilePath)
walkToProjectRoot this thisStat myUid = liftIO (doesDirectoryExist this) >>= \case
=> FilePath -> FileStatus -> FileMode -> UserID -> m (Maybe FilePath)
walkToProjectRoot this thisStat desiredUmask myUid = liftIO (doesDirectoryExist this) >>= \case
-- It's not a directory, so it can't be a project
False -> do
let dir = takeDirectory this
dirStat <- liftIO $ getFileStatus dir
walkToProjectRoot dir dirStat myUid
walkToProjectRoot dir dirStat desiredUmask myUid
True -> do
when (not $ isWritableOnlyBy thisStat myUid) $ modify (this:)
when (not $ isWellOwnedAndWellPermissioned thisStat myUid desiredUmask) $ modify (this:)
liftIO (doesDirectoryExist $ toObeliskDir this) >>= \case
True -> return $ Just this
False -> do
@@ -173,47 +216,114 @@ walkToProjectRoot this thisStat myUid = liftIO (doesDirectoryExist this) >>= \ca
isSameFileAs = (==) `on` fileIdentity
if thisStat `isSameFileAs` nextStat
then return Nothing -- Found a cycle; probably hit root directory
else walkToProjectRoot next nextStat myUid
else walkToProjectRoot next nextStat desiredUmask myUid

-- | Walk from the given project root directory to its Obelisk implementation
-- directory, accumulating potentially insecure directories that were traversed
-- in the process.
walkToImplDir :: (MonadState [FilePath] m, MonadIO m) => FilePath -> UserID -> m ()
walkToImplDir projectRoot myUid = do
walkToImplDir :: (MonadState [FilePath] m, MonadIO m) => FilePath -> UserID -> FileMode -> m ()
walkToImplDir projectRoot myUid umask = do
let obDir = toObeliskDir projectRoot
obDirStat <- liftIO $ getFileStatus obDir
when (not $ isWritableOnlyBy obDirStat myUid) $ modify (obDir:)
when (not $ isWellOwnedAndWellPermissioned obDirStat myUid umask) $ modify (obDir:)
let implThunk = obDir </> "impl"
implThunkStat <- liftIO $ getFileStatus implThunk
when (not $ isWritableOnlyBy implThunkStat myUid) $ modify (implThunk:)
when (not $ isWellOwnedAndWellPermissioned implThunkStat myUid umask) $ modify (implThunk:)

--TODO: Is there a better way to ask if anyone else can write things?
--E.g. what about ACLs?
-- | Check to see if directory is only writable by a user whose User ID matches the second argument provided
isWritableOnlyBy :: FileStatus -> UserID -> Bool
isWritableOnlyBy s uid = fileOwner s == uid && fileMode s .&. 0o22 == 0
-- | Check to see if directory is writable by a user whose User ID matches the
-- second argument provided, and if the fact that other people can write to that
-- directory is in accordance with the umask of the system, passed as the third
-- argument.
isWellOwnedAndWellPermissioned :: FileStatus -> UserID -> FileMode -> Bool
isWellOwnedAndWellPermissioned s uid umask = isOwnedBy s uid && filePermissionIsSafe s umask

-- | Run a command in the given shell for the current project
inProjectShell :: MonadObelisk m => String -> String -> m ()
inProjectShell shellName command = withProjectRoot "." $ \root ->
projectShell root True shellName command
isOwnedBy :: FileStatus -> UserID -> Bool
isOwnedBy s uid = fileOwner s == uid

inImpureProjectShell :: MonadObelisk m => String -> String -> m ()
inImpureProjectShell shellName command = withProjectRoot "." $ \root ->
projectShell root False shellName command
-- | Check to see if a directory respects the umask, but check explicitly that
-- it's not world writable in any case.
filePermissionIsSafe :: FileStatus -> FileMode -> Bool
filePermissionIsSafe s umask = not fileWorldWritable && fileGroupWritable <= umaskGroupWritable
where
fileWorldWritable = fileMode s .&. 0o002 == 0o002
fileGroupWritable = fileMode s .&. 0o020 == 0o020
umaskGroupWritable = umask .&. 0o020 == 0

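The safety check above is pure bit arithmetic on the mode word: world-writable is rejected outright, and group-writable is allowed only when the umask itself permits group write. A self-contained sketch of the same predicate over raw mode bits; `Word32` stands in for `FileMode`, and the directory mode is passed directly instead of being read from a `FileStatus`:

```haskell
import Data.Bits ((.&.))
import Data.Word (Word32)

-- Safe iff not world-writable, and group-writable only when the umask allows it.
permissionIsSafe :: Word32 -> Word32 -> Bool
permissionIsSafe mode umask = not worldWritable && groupWritable <= umaskAllowsGroup
  where
    worldWritable    = mode  .&. 0o002 == 0o002
    groupWritable    = mode  .&. 0o020 == 0o020
    umaskAllowsGroup = umask .&. 0o020 == 0  -- umask bit clear => group write allowed

main :: IO ()
main = do
  print (permissionIsSafe 0o755 0o022)  -- owner-only write: safe
  print (permissionIsSafe 0o775 0o022)  -- group-writable but umask forbids it: unsafe
  print (permissionIsSafe 0o775 0o002)  -- group-writable and umask allows it: safe
  print (permissionIsSafe 0o777 0o000)  -- world-writable: never safe
```

The `groupWritable <= umaskAllowsGroup` trick uses `Bool`'s ordering (`False <= True`) to express "group-writable implies umask-permitted" without an explicit conditional.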
projectShell :: MonadObelisk m => FilePath -> Bool -> String -> String -> m ()
-- | Nix syntax requires relative paths to be prefixed by @./@ or
-- @../@. This will make a 'FilePath' that can be embedded in a Nix
-- expression.
toNixPath :: FilePath -> FilePath
toNixPath root | "/" `isInfixOf` root = root
| otherwise = "./" <> root

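Note that `toNixPath` only adds the `./` prefix to bare names with no path separator at all; anything already containing a `/` is left unchanged on the assumption that it is already a valid Nix path. A runnable copy of that logic:

```haskell
import Data.List (isInfixOf)

-- Make a FilePath embeddable in a Nix expression: bare names get a ./ prefix.
toNixPath :: FilePath -> FilePath
toNixPath root
  | "/" `isInfixOf` root = root
  | otherwise            = "./" <> root

main :: IO ()
main = do
  putStrLn (toNixPath "default.nix")   -- bare name: gets ./ prefix
  putStrLn (toNixPath "./already/ok")  -- unchanged
  putStrLn (toNixPath "/abs/path")     -- unchanged
```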
nixShellRunConfig :: MonadObelisk m => FilePath -> Bool -> Maybe String -> m NixShellConfig
|
||||
nixShellRunConfig root isPure command = do
|
||||
nixpkgsPath <- fmap T.strip $ readProcessAndLogStderr Debug $ setCwd (Just root) $
|
||||
proc nixExePath ["eval", "(import .obelisk/impl {}).nixpkgs.path"]
|
||||
nixRemote <- liftIO $ lookupEnv "NIX_REMOTE"
|
||||
pure $ def
|
||||
& nixShellConfig_pure .~ isPure
|
||||
& nixShellConfig_common . nixCmdConfig_target .~ (def & target_path .~ Nothing)
|
||||
& nixShellConfig_run .~ (command <&> \c -> mconcat
|
||||
[ "export NIX_PATH=nixpkgs=", T.unpack nixpkgsPath, "; "
|
||||
, maybe "" (\v -> "export NIX_REMOTE=" <> v <> "; ") nixRemote
|
||||
, c
|
||||
])
|
||||
|
||||
nixShellRunProc :: NixShellConfig -> ProcessSpec
|
||||
nixShellRunProc cfg = setDelegateCtlc True $ proc "nix-shell" $ runNixShellConfig cfg
|
||||
|
||||
nixShellWithPkgs :: MonadObelisk m => FilePath -> Bool -> Bool -> Map Text FilePath -> Maybe String -> m ()
|
||||
nixShellWithPkgs root isPure chdirToRoot packageNamesAndPaths command = do
|
||||
packageNamesAndAbsPaths <- liftIO $ for packageNamesAndPaths makeAbsolute
|
||||
defShellConfig <- nixShellRunConfig root isPure command
|
||||
let setCwd_ = if chdirToRoot then setCwd (Just root) else id
|
||||
(_, _, _, ph) <- createProcess_ "nixShellWithPkgs" $ setCwd_ $ nixShellRunProc $ defShellConfig
|
||||
& nixShellConfig_common . nixCmdConfig_target . target_expr ?~
|
||||
"{root, pkgs}: ((import root {}).passthru.__unstable__.self.extend (_: _: {\
|
||||
\shellPackages = builtins.fromJSON pkgs;\
|
||||
\})).project.shells.ghc"
|
||||
& nixShellConfig_common . nixCmdConfig_args .~
|
||||
[ rawArg "root" $ toNixPath $ if chdirToRoot then "." else root
|
||||
, strArg "pkgs" (T.unpack $ decodeUtf8 $ BSL.toStrict $ Json.encode packageNamesAndAbsPaths)
|
||||
]
|
||||
void $ waitForProcess ph
|
||||
|
||||
nixShellWithHoogle :: MonadObelisk m => FilePath -> Bool -> String -> Maybe String -> m ()
|
||||
nixShellWithHoogle root isPure shell' command = do
|
||||
defShellConfig <- nixShellRunConfig root isPure command
|
||||
(_, _, _, ph) <- createProcess_ "nixShellWithHoogle" $ setCwd (Just root) $ nixShellRunProc $ defShellConfig
|
||||
& nixShellConfig_common . nixCmdConfig_target . target_expr ?~
|
||||
"{shell}: ((import ./. {}).passthru.__unstable__.self.extend (_: super: {\
|
||||
\userSettings = super.userSettings // { withHoogle = true; };\
|
||||
\})).project.shells.${shell}"
|
||||
& nixShellConfig_common . nixCmdConfig_args .~ [ strArg "shell" shell' ]
|
||||
void $ waitForProcess ph
|
||||
|
||||
projectShell :: MonadObelisk m => FilePath -> Bool -> String -> Maybe String -> m ()
|
||||
projectShell root isPure shellName command = do
|
||||
(_, _, _, ph) <- createProcess_ "runNixShellAttr" $ setCtlc $ setCwd (Just root) $ proc "nix-shell" $
|
||||
[ "--pure" | isPure ] <>
|
||||
[ "-A"
|
||||
, "shells." <> shellName
|
||||
, "--run", command
|
||||
]
|
||||
void $ liftIO $ waitForProcess ph
|
||||
defShellConfig <- nixShellRunConfig root isPure command
|
||||
(_, _, _, ph) <- createProcess_ "runNixShellAttr" $ setCwd (Just root) $ nixShellRunProc $ defShellConfig
|
||||
& nixShellConfig_common . nixCmdConfig_target . target_path ?~ "default.nix"
|
||||
& nixShellConfig_common . nixCmdConfig_target . target_attr ?~ ("shells." <> shellName)
|
||||
void $ waitForProcess ph
|
||||
|
||||
setCtlc :: CreateProcess -> CreateProcess
|
||||
setCtlc cfg = cfg { delegate_ctlc = True }
|
||||
|
||||
setCwd :: Maybe FilePath -> CreateProcess -> CreateProcess
|
||||
setCwd fp cfg = cfg { cwd = fp }
|
||||
findProjectAssets :: MonadObelisk m => FilePath -> m Text
|
||||
findProjectAssets root = do
|
||||
isDerivation <- readProcessAndLogStderr Debug $ setCwd (Just root) $
|
||||
proc nixExePath
|
||||
[ "eval"
|
||||
, "(let a = import ./. {}; in toString (a.reflex.nixpkgs.lib.isDerivation a.passthru.staticFilesImpure))"
|
||||
, "--raw"
|
||||
-- `--raw` is not available with old nix-instantiate. It drops quotation
|
||||
-- marks and trailing newline, so is very convenient for shelling out.
|
||||
]
|
||||
-- Check whether the impure static files are a derivation (and so must be built)
|
||||
if isDerivation == "1"
|
||||
then fmap T.strip $ readProcessAndLogStderr Debug $ setCwd (Just root) $ -- Strip whitespace here because nix-build has no --raw option
|
||||
proc nixBuildExePath
|
||||
[ "--no-out-link"
|
||||
, "-E", "(import ./. {}).passthru.staticFilesImpure"
|
||||
]
|
||||
else readProcessAndLogStderr Debug $ setCwd (Just root) $
|
||||
proc nixExePath ["eval", "-f", ".", "passthru.staticFilesImpure", "--raw"]
|
||||
|
@ -3,164 +3,329 @@
|
||||
{-# LANGUAGE MultiParamTypeClasses #-}
|
||||
{-# LANGUAGE OverloadedStrings #-}
|
||||
{-# LANGUAGE ScopedTypeVariables #-}
|
||||
{-# LANGUAGE TupleSections #-}
|
||||
{-# LANGUAGE QuasiQuotes #-}
|
||||
module Obelisk.Command.Run where
|
||||
|
||||
import Control.Exception (bracket)
|
||||
import Control.Monad
|
||||
import Control.Arrow ((&&&))
|
||||
import Control.Exception (Exception, bracket)
|
||||
import Control.Lens (ifor, (.~), (&))
|
||||
import Control.Monad (filterM, unless, void)
|
||||
import Control.Monad.Except (runExceptT, throwError)
|
||||
import Control.Monad.IO.Class (liftIO)
|
||||
import Control.Monad.Reader (MonadIO)
|
||||
import Data.Coerce (coerce)
|
||||
import Data.Default (def)
|
||||
import Data.Either
|
||||
import Data.List
|
||||
import Data.List.NonEmpty as NE
|
||||
import Data.Foldable (for_, toList)
|
||||
import qualified Data.List.NonEmpty as NE
|
||||
import Data.Map (Map)
|
||||
import qualified Data.Map as Map
|
||||
import Data.Maybe
|
||||
import qualified Data.Set as Set
|
||||
import Data.Text (Text)
|
||||
import qualified Data.Text as T
|
||||
import Data.Time.Clock (getCurrentTime)
|
||||
import Data.Time.Format (formatTime, defaultTimeLocale)
|
||||
import Data.Traversable (for)
|
||||
import Debug.Trace (trace)
|
||||
import Distribution.Compiler (CompilerFlavor(..))
|
||||
import Distribution.PackageDescription.Parsec (parseGenericPackageDescription)
|
||||
import Distribution.Parsec.ParseResult (runParseResult)
|
||||
import Distribution.Pretty
|
||||
import qualified Distribution.System as Dist
|
||||
import Distribution.Types.BuildInfo
|
||||
import Distribution.Types.CondTree
|
||||
import Distribution.Types.GenericPackageDescription
|
||||
import Distribution.Types.Library
|
||||
import Distribution.Utils.Generic
|
||||
import Hpack.Config
|
||||
import Hpack.Render
|
||||
import Hpack.Yaml
|
||||
import qualified Distribution.Parsec.Common as Dist
|
||||
import qualified Hpack.Config as Hpack
|
||||
import qualified Hpack.Render as Hpack
|
||||
import qualified Hpack.Yaml as Hpack
|
||||
import Language.Haskell.Extension
|
||||
import Network.Socket hiding (Debug)
|
||||
import System.Directory
|
||||
import System.Environment (getExecutablePath)
|
||||
import System.FilePath
|
||||
import System.Process (proc)
|
||||
import qualified System.Info
|
||||
import System.IO.Temp (withSystemTempDirectory)
|
||||
import Data.ByteString (ByteString)
|
||||
|
||||
import Obelisk.App (MonadObelisk, ObeliskT)
|
||||
import Obelisk.CliApp
|
||||
( CliT (..), HasCliConfig, Severity (..)
|
||||
, callCommand, failWith, getCliConfig, putLog
|
||||
, readProcessAndLogStderr, runCli)
|
||||
import Obelisk.Command.Project (inProjectShell, withProjectRoot)
|
||||
import Obelisk.App (MonadObelisk)
|
||||
import Obelisk.CliApp (
|
||||
Severity (..),
|
||||
createProcess_,
|
||||
failWith,
|
||||
proc,
|
||||
putLog,
|
||||
readCreateProcessWithExitCode,
|
||||
readProcessAndLogStderr,
|
||||
setCwd,
|
||||
setDelegateCtlc,
|
||||
waitForProcess,
|
||||
withSpinner,
|
||||
)
|
||||
import Obelisk.Command.Nix
|
||||
import Obelisk.Command.Project (nixShellWithPkgs, toImplDir, withProjectRoot, findProjectAssets)
|
||||
import Obelisk.Command.Thunk (attrCacheFileName)
|
||||
import Obelisk.Command.Utils (findExePath, ghcidExePath)
|
||||
|
||||
data CabalPackageInfo = CabalPackageInfo
  { _cabalPackageInfo_packageRoot :: FilePath
  { _cabalPackageInfo_packageFile :: FilePath
  , _cabalPackageInfo_packageName :: T.Text
  , _cabalPackageInfo_packageRoot :: FilePath
  , _cabalPackageInfo_buildable :: Bool
  , _cabalPackageInfo_sourceDirs :: NE.NonEmpty FilePath
    -- ^ List of hs src dirs of the library component
  , _cabalPackageInfo_defaultExtensions :: [Extension]
    -- ^ List of globally enable extensions of the library component
  , _cabalPackageInfo_defaultLanguage :: Maybe Language
    -- ^ List of globally set languages of the library component
  , _cabalPackageInfo_compilerOptions :: [(CompilerFlavor, [String])]
    -- ^ List of compiler-specific options (e.g., the "ghc-options" field of the cabal file)
  }

-- NOTE: `run` is not polymorphic like the rest because we use StaticPtr to invoke it.
run :: ObeliskT IO ()
run = do
  pkgs <- getLocalPkgs
  withGhciScript pkgs $ \dotGhciPath -> do
-- | Used to signal to obelisk that it's being invoked as a preprocessor
preprocessorIdentifier :: String
preprocessorIdentifier = "__preprocessor-apply-packages"

profile
  :: MonadObelisk m
  => String
  -> [String]
  -> m ()
profile profileBasePattern rtsFlags = withProjectRoot "." $ \root -> do
  putLog Debug "Using profiled build of project."

  outPath <- withSpinner "Building profiled executable" $
    fmap (T.unpack . T.strip) $ readProcessAndLogStderr Debug $ setCwd (Just root) $ nixCmdProc $
      NixCmd_Build $ def
        & nixBuildConfig_outLink .~ OutLink_None
        & nixCmdConfig_target .~ Target
          { _target_path = Just "."
          , _target_attr = Just "__unstable__.profiledObRun"
          , _target_expr = Nothing
          }
  assets <- findProjectAssets root
  putLog Debug $ "Assets impurely loaded from: " <> assets
  time <- liftIO getCurrentTime
  let profileBaseName = formatTime defaultTimeLocale profileBasePattern time
  liftIO $ createDirectoryIfMissing True $ takeDirectory $ root </> profileBaseName
  putLog Debug $ "Storing profiled data under base name of " <> T.pack (root </> profileBaseName)
  freePort <- getFreePort
  (_, _, _, ph) <- createProcess_ "runProfExe" $ setCwd (Just root) $ setDelegateCtlc True $ proc (outPath </> "bin" </> "ob-run") $
    [ show freePort
    , T.unpack assets
    , profileBaseName
    , "+RTS"
    , "-po" <> profileBaseName
    ] <> rtsFlags
    <> [ "-RTS" ]
  void $ waitForProcess ph

run
  :: MonadObelisk m
  => m ()
run = withProjectRoot "." $ \root -> do
  pkgs <- fmap toList . parsePackagesOrFail =<< getLocalPkgs root
  withGhciScript pkgs root $ \dotGhciPath -> do
    freePort <- getFreePort
    assets <- withProjectRoot "." $ \root ->
      -- `--raw` is not available with old nix-instantiate. It drops quotation
      -- marks and trailing newline, so is very convenient for shelling out.
      readProcessAndLogStderr Debug $
        proc "nix" ["eval", "-f", root, "passthru.staticFilesImpure", "--raw"]
    assets <- findProjectAssets root
    putLog Debug $ "Assets impurely loaded from: " <> assets
    runGhcid dotGhciPath $ Just $ unwords
      [ "run"
    runGhcid root True dotGhciPath pkgs $ Just $ unwords
      [ "Obelisk.Run.run"
      , show freePort
      , "(runServeAsset " ++ show assets ++ ")"
      , "(Obelisk.Run.runServeAsset " ++ show assets ++ ")"
      , "Backend.backend"
      , "Frontend.frontend"
      ]

runRepl :: MonadObelisk m => m ()
runRepl = do
  pkgs <- getLocalPkgs
  withGhciScript pkgs $ \dotGhciPath -> do
    runGhciRepl dotGhciPath
runRepl = withProjectRoot "." $ \root -> do
  pkgs <- fmap toList . parsePackagesOrFail =<< getLocalPkgs root
  withGhciScript pkgs "." $ \dotGhciPath ->
    runGhciRepl root pkgs dotGhciPath

runWatch :: MonadObelisk m => m ()
runWatch = do
  pkgs <- getLocalPkgs
  withGhciScript pkgs $ \dotGhciPath -> runGhcid dotGhciPath Nothing
runWatch = withProjectRoot "." $ \root -> do
  pkgs <- fmap toList . parsePackagesOrFail =<< getLocalPkgs root
  withGhciScript pkgs root $ \dotGhciPath ->
    runGhcid root True dotGhciPath pkgs Nothing

-- | Relative paths to local packages of an obelisk project
-- TODO a way to query this
getLocalPkgs :: Applicative f => f [FilePath]
getLocalPkgs = pure ["backend", "common", "frontend"]
-- | Relative paths to local packages of an obelisk project.
--
-- These are a combination of the obelisk predefined local packages,
-- and any packages that the user has set with the @packages@ argument
-- to the Nix @project@ function.
getLocalPkgs :: MonadObelisk m => FilePath -> m [FilePath]
getLocalPkgs root = do
  obeliskPaths <- runFind ["-L", root, "-name", ".obelisk", "-type", "d"]

  -- We do not want to find packages that are embedded inside other obelisk projects, unless that
  -- obelisk project is our own.
  let exclusions = filter (/= root) $ map takeDirectory obeliskPaths
  fmap (map (makeRelative ".")) $ runFind $
    ["-L", root, "(", "-name", "*.cabal", "-o", "-name", Hpack.packageConfig, ")", "-a", "-type", "f"]
    <> concat [["-not", "-path", p </> "*"] | p <- (toImplDir "*" </> attrCacheFileName) : exclusions]
  where
    runFind args = do
      (_exitCode, out, err) <- readCreateProcessWithExitCode $ proc findExePath args
      putLog Debug $ T.strip $ T.pack err
      pure $ map T.unpack $ T.lines $ T.strip $ T.pack out

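The `find` invocation in the new `getLocalPkgs` can be sketched as a pure argument builder. This is a minimal illustration, not the project's code: the literal `"package.yaml"` stands in for `Hpack.packageConfig`, and the attr-cache exclusion is assumed to be part of the caller-supplied exclusion list.

```haskell
import System.FilePath ((</>))

-- Pure sketch of the `find` argument list built by getLocalPkgs: match
-- *.cabal and package.yaml files, pruning anything under the excluded paths.
findArgs :: FilePath -> [FilePath] -> [String]
findArgs root exclusions =
  ["-L", root, "(", "-name", "*.cabal", "-o", "-name", "package.yaml", ")", "-a", "-type", "f"]
  <> concat [["-not", "-path", p </> "*"] | p <- exclusions]

main :: IO ()
main = print (findArgs "." ["./dep"])
```

Each exclusion contributes a `-not -path <p>/* ` triple, which is how nested obelisk projects are skipped.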
data GuessPackageFileError = GuessPackageFileError_Ambiguous [FilePath] | GuessPackageFileError_NotFound
  deriving (Eq, Ord, Show)
instance Exception GuessPackageFileError

newtype HPackFilePath = HPackFilePath { unHPackFilePath :: FilePath } deriving (Eq, Ord, Show)
newtype CabalFilePath = CabalFilePath { unCabalFilePath :: FilePath } deriving (Eq, Ord, Show)

-- | Given a directory, try to guess what the appropriate @.cabal@ or @package.yaml@ file is for the package.
guessCabalPackageFile
  :: (MonadIO m)
  => FilePath -- ^ Directory or path to search for cabal package
  -> m (Either GuessPackageFileError (Either CabalFilePath HPackFilePath))
guessCabalPackageFile pkg = do
  liftIO (doesDirectoryExist pkg) >>= \case
    False -> case cabalOrHpackFile pkg of
      (Just hpack@(Right _)) -> pure $ Right hpack
      (Just cabal@(Left (CabalFilePath cabalFilePath))) -> do
        -- If the cabal file has a sibling hpack file, we use that instead
        -- since running hpack often generates a sibling cabal file
        let possibleHpackSibling = takeDirectory cabalFilePath </> Hpack.packageConfig
        hasHpackSibling <- liftIO $ doesFileExist possibleHpackSibling
        pure $ Right $ if hasHpackSibling then Right (HPackFilePath possibleHpackSibling) else cabal
      Nothing -> pure $ Left GuessPackageFileError_NotFound
    True -> do
      candidates <- liftIO $
        filterM (doesFileExist . either unCabalFilePath unHPackFilePath)
        =<< mapMaybe (cabalOrHpackFile . (pkg </>)) <$> listDirectory pkg
      pure $ case partitionEithers candidates of
        ([hpack], _) -> Right $ Left hpack
        ([], [cabal]) -> Right $ Right cabal
        ([], []) -> Left GuessPackageFileError_NotFound
        (hpacks, cabals) -> Left $ GuessPackageFileError_Ambiguous $ coerce hpacks <> coerce cabals

cabalOrHpackFile :: FilePath -> Maybe (Either CabalFilePath HPackFilePath)
cabalOrHpackFile = \case
  x | takeExtension x == ".cabal" -> Just (Left $ CabalFilePath x)
    | takeFileName x == Hpack.packageConfig -> Just (Right $ HPackFilePath x)
    | otherwise -> Nothing

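The dispatch in `cabalOrHpackFile` can be exercised standalone. A minimal sketch, with plain `FilePath`s in place of the `CabalFilePath`/`HPackFilePath` newtypes and the literal `"package.yaml"` in place of `Hpack.packageConfig`:

```haskell
import System.FilePath (takeExtension, takeFileName)

-- Self-contained sketch of cabalOrHpackFile's classification:
-- Left marks a cabal file, Right an hpack package.yaml.
cabalOrHpack :: FilePath -> Maybe (Either FilePath FilePath)
cabalOrHpack x
  | takeExtension x == ".cabal"      = Just (Left x)
  | takeFileName x == "package.yaml" = Just (Right x)
  | otherwise                        = Nothing

main :: IO ()
main = mapM_ (print . cabalOrHpack)
  ["frontend/frontend.cabal", "common/package.yaml", "README.md"]
```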
-- | Parses the cabal package in a given directory.
-- This automatically figures out which .cabal file or package.yaml (hpack) file to use in the given directory.
parseCabalPackage
  :: (MonadObelisk m)
  => FilePath -- ^ package directory
  :: MonadObelisk m
  => FilePath -- ^ Package directory
  -> m (Maybe CabalPackageInfo)
parseCabalPackage dir = do
  let cabalFp = dir </> (takeBaseName dir <> ".cabal")
      hpackFp = dir </> "package.yaml"
  hasCabal <- liftIO $ doesFileExist cabalFp
  hasHpack <- liftIO $ doesFileExist hpackFp
parseCabalPackage dir = parseCabalPackage' dir >>= \case
  Left err -> Nothing <$ putLog Error err
  Right (warnings, pkgInfo) -> do
    for_ warnings $ putLog Warning . T.pack . show
    pure $ Just pkgInfo

  mCabalContents <- if hasCabal
    then Just <$> liftIO (readUTF8File cabalFp)
    else if hasHpack
      then do
        let decodeOptions = DecodeOptions hpackFp Nothing decodeYaml
        liftIO (readPackageConfig decodeOptions) >>= \case
          Left err -> do
            putLog Error $ T.pack $ "Failed to parse " <> hpackFp <> ": " <> err
            return Nothing
          Right (DecodeResult hpackPackage _ _) -> do
            return $ Just $ renderPackage [] hpackPackage
      else return Nothing
-- | Like 'parseCabalPackage' but returns errors and warnings directly so as to avoid 'MonadObelisk'.
parseCabalPackage'
  :: (MonadIO m)
  => FilePath -- ^ Package directory
  -> m (Either T.Text ([Dist.PWarning], CabalPackageInfo))
parseCabalPackage' pkg = runExceptT $ do
  (cabalContents, packageFile, packageName) <- guessCabalPackageFile pkg >>= \case
    Left GuessPackageFileError_NotFound -> throwError $ "No .cabal or package.yaml file found in " <> T.pack pkg
    Left (GuessPackageFileError_Ambiguous _) -> throwError $ "Unable to determine which .cabal file to use in " <> T.pack pkg
    Right (Left (CabalFilePath file)) -> (, file, takeBaseName file) <$> liftIO (readUTF8File file)
    Right (Right (HPackFilePath file)) -> do
      let
        decodeOptions = Hpack.DecodeOptions (Hpack.ProgramName "ob") file Nothing Hpack.decodeYaml
      liftIO (Hpack.readPackageConfig decodeOptions) >>= \case
        Left err -> throwError $ T.pack $ "Failed to parse " <> file <> ": " <> err
        Right (Hpack.DecodeResult hpackPackage _ _ _) -> pure (Hpack.renderPackage [] hpackPackage, file, Hpack.packageName hpackPackage)

  fmap join $ forM mCabalContents $ \cabalContents -> do
    let (warnings, result) = runParseResult $ parseGenericPackageDescription $
          toUTF8BS $ cabalContents
    mapM_ (putLog Warning) $ fmap (T.pack . show) warnings
    case result of
      Right gpkg -> do
        return $ do
          (_, lib) <- simplifyCondTree (const $ pure True) <$> condLibrary gpkg
          pure $ CabalPackageInfo
            { _cabalPackageInfo_packageRoot = takeDirectory cabalFp
            , _cabalPackageInfo_sourceDirs =
                fromMaybe (pure ".") $ NE.nonEmpty $ hsSourceDirs $ libBuildInfo lib
            , _cabalPackageInfo_defaultExtensions =
                defaultExtensions $ libBuildInfo lib
            }
      Left (_, errors) -> do
        putLog Error $ T.pack $ "Failed to parse " <> cabalFp <> ":"
        mapM_ (putLog Error) $ fmap (T.pack . show) errors
        return Nothing
  let
    (warnings, result) = runParseResult $ parseGenericPackageDescription $ toUTF8BS cabalContents
    osConfVar = case System.Info.os of
      "linux" -> Just Dist.Linux
      "darwin" -> Just Dist.OSX
      _ -> trace "Unrecgonized System.Info.os" Nothing
    archConfVar = Just Dist.X86_64 -- TODO: Actually infer this
    evalConfVar v = Right $ case v of
      OS osVar -> Just osVar == osConfVar
      Arch archVar -> Just archVar == archConfVar
      Impl GHC _ -> True -- TODO: Actually check version range
      _ -> False
  case condLibrary <$> result of
    Right (Just condLib) -> do
      let (_, lib) = simplifyCondTree evalConfVar condLib
      pure $ (warnings,) $ CabalPackageInfo
        { _cabalPackageInfo_packageName = T.pack packageName
        , _cabalPackageInfo_packageFile = packageFile
        , _cabalPackageInfo_packageRoot = takeDirectory packageFile
        , _cabalPackageInfo_buildable = buildable $ libBuildInfo lib
        , _cabalPackageInfo_sourceDirs =
            fromMaybe (pure ".") $ NE.nonEmpty $ hsSourceDirs $ libBuildInfo lib
        , _cabalPackageInfo_defaultExtensions =
            defaultExtensions $ libBuildInfo lib
        , _cabalPackageInfo_defaultLanguage =
            defaultLanguage $ libBuildInfo lib
        , _cabalPackageInfo_compilerOptions = options $ libBuildInfo lib
        }
    Right Nothing -> throwError "Haskell package has no library component"
    Left (_, errors) ->
      throwError $ T.pack $ "Failed to parse " <> packageFile <> ":\n" <> unlines (map show errors)

withUTF8FileContentsM :: (MonadIO m, HasCliConfig e m) => FilePath -> (ByteString -> CliT e IO a) -> m a
withUTF8FileContentsM fp f = do
  c <- getCliConfig
  liftIO $ withUTF8FileContents fp $ runCli c . f . toUTF8BS
parsePackagesOrFail :: MonadObelisk m => [FilePath] -> m (NE.NonEmpty CabalPackageInfo)
parsePackagesOrFail dirs = do
  (pkgDirErrs, packageInfos') <- fmap partitionEithers $ for dirs $ \dir -> do
    flip fmap (parseCabalPackage dir) $ \case
      Just packageInfo
        | _cabalPackageInfo_buildable packageInfo -> Right packageInfo
      _ -> Left dir

  let packagesByName = Map.fromListWith (<>) [(_cabalPackageInfo_packageName p, p NE.:| []) | p <- packageInfos']
  unambiguous <- ifor packagesByName $ \packageName ps -> case ps of
    p NE.:| [] -> pure p -- No ambiguity here
    p NE.:| _ -> do
      putLog Warning $ T.pack $
        "Packages named '" <> T.unpack packageName <> "' appear in " <> show (length ps) <> " different locations: "
        <> intercalate ", " (map _cabalPackageInfo_packageFile $ toList ps)
        <> "; Picking " <> _cabalPackageInfo_packageFile p
      pure p

  packageInfos <- case NE.nonEmpty $ toList unambiguous of
    Nothing -> failWith $ T.pack $ "No valid, buildable packages found in " <> intercalate ", " dirs
    Just xs -> pure xs

  unless (null pkgDirErrs) $
    putLog Warning $ T.pack $ "Failed to find buildable packages in " <> intercalate ", " pkgDirErrs

  pure packageInfos

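The duplicate-name detection in `parsePackagesOrFail` hinges on `Map.fromListWith (<>)` collecting every parsed package under its name. A minimal sketch with `String` names and `FilePath`s standing in for `CabalPackageInfo`:

```haskell
import qualified Data.List.NonEmpty as NE
import qualified Data.Map as Map

-- Sketch of the grouping step: any name mapped to a NonEmpty with more
-- than one element is ambiguous and triggers the warning above.
groupByName :: [(String, FilePath)] -> Map.Map String (NE.NonEmpty FilePath)
groupByName pkgs = Map.fromListWith (<>) [(name, path NE.:| []) | (name, path) <- pkgs]

main :: IO ()
main = print (groupByName [("common", "common/a"), ("common", "dep/common"), ("backend", "backend")])
```

Note that `fromListWith` combines later entries on the left, so the per-name order is not the input order; the original only cares about the group size and picks one representative.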
packageInfoToNamePathMap :: [CabalPackageInfo] -> Map Text FilePath
packageInfoToNamePathMap = Map.fromList . map (_cabalPackageInfo_packageName &&& _cabalPackageInfo_packageRoot)

-- | Create ghci configuration to load the given packages
withGhciScript
  :: MonadObelisk m
  => [FilePath] -- ^ List of packages to load into ghci
  -> (FilePath -> m ()) -- ^ Action to run with the path to generated temporory .ghci
  => [CabalPackageInfo] -- ^ List of packages to load into ghci
  -> FilePath -- ^ All paths written to the .ghci file will be relative to this path
  -> (FilePath -> m ()) -- ^ Action to run with the path to generated temporary .ghci
  -> m ()
withGhciScript pkgs f = do
  (pkgDirErrs, packageInfos) <- fmap partitionEithers $ forM pkgs $ \pkg -> do
    flip fmap (parseCabalPackage pkg) $ \case
      Nothing -> Left pkg
      Just packageInfo -> Right packageInfo

  when (null packageInfos) $
    failWith $ T.pack $ "No valid pkgs found in " <> intercalate ", " pkgs
  unless (null pkgDirErrs) $
    putLog Warning $ T.pack $ "Failed to find pkgs in " <> intercalate ", " pkgDirErrs

  let extensions = packageInfos >>= _cabalPackageInfo_defaultExtensions
      extensionsLine = if extensions == mempty
        then ""
        else ":set " <> intercalate " " ((("-X" <>) . prettyShow) <$> extensions)
      dotGhci = unlines $
        [ ":set -i" <> intercalate ":" (packageInfos >>= rootedSourceDirs)
        , extensionsLine
        , ":load Backend Frontend"
        , "import Obelisk.Run"
        , "import qualified Frontend"
        , "import qualified Backend"
        ]
withGhciScript packageInfos pathBase f = do
  selfExe <- liftIO getExecutablePath
  let
    packageNames = Set.fromList $ map _cabalPackageInfo_packageName packageInfos
    modulesToLoad = mconcat
      [ [ "Obelisk.Run" | "obelisk-run" `Set.member` packageNames ]
      , [ "Backend" | "backend" `Set.member` packageNames ]
      , [ "Frontend" | "frontend" `Set.member` packageNames ]
      ]
    dotGhci = unlines
      -- TODO: Shell escape
      [ ":set -F -pgmF " <> selfExe <> " -optF " <> preprocessorIdentifier <> " " <> unwords (map (("-optF " <>) . makeRelative pathBase . _cabalPackageInfo_packageFile) packageInfos)
      , ":set -i" <> intercalate ":" (packageInfos >>= rootedSourceDirs)
      , if null modulesToLoad then "" else ":load " <> unwords modulesToLoad
      , "import qualified Obelisk.Run"
      , "import qualified Frontend"
      , "import qualified Backend" ]
  withSystemTempDirectory "ob-ghci" $ \fp -> do
    let dotGhciPath = fp </> ".ghci"
    liftIO $ writeFile dotGhciPath dotGhci
@ -168,31 +333,52 @@ withGhciScript pkgs f = do
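The `:set -X…` line that `withGhciScript` writes into the generated `.ghci` can be sketched on its own. Plain `String`s stand in for Cabal's `Extension` type (on which the original calls `prettyShow`):

```haskell
-- Sketch of the extensionsLine construction: empty input yields an empty
-- line, otherwise a single ":set" with one -X flag per default extension.
extensionsLine :: [String] -> String
extensionsLine exts
  | null exts = ""
  | otherwise = ":set " <> unwords (map ("-X" <>) exts)

main :: IO ()
main = putStrLn (extensionsLine ["OverloadedStrings", "LambdaCase"])
```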
  where
    rootedSourceDirs pkg = NE.toList $
      (_cabalPackageInfo_packageRoot pkg </>) <$> _cabalPackageInfo_sourceDirs pkg
      makeRelative pathBase . (_cabalPackageInfo_packageRoot pkg </>) <$> _cabalPackageInfo_sourceDirs pkg

-- | Run ghci repl
runGhciRepl
  :: MonadObelisk m
  => FilePath -- ^ Path to .ghci
  => FilePath -- ^ Path to project root
  -> [CabalPackageInfo]
  -> FilePath -- ^ Path to .ghci
  -> m ()
runGhciRepl dotGhci = inProjectShell "ghc" $ unwords $ "ghci" : ["-no-user-package-db", "-ghci-script", dotGhci]
runGhciRepl root packages dotGhci =
  -- NOTE: We do *not* want to use $(staticWhich "ghci") here because we need the
  -- ghc that is provided by the shell in the user's project.
  nixShellWithPkgs root True False (packageInfoToNamePathMap packages) $ Just $ "ghci " <> makeBaseGhciOptions dotGhci -- TODO: Shell escape

-- | Run ghcid
runGhcid
  :: MonadObelisk m
  => FilePath -- ^ Path to .ghci
  => FilePath -- ^ Path to project root
  -> Bool -- ^ Should we chdir to root when running this process?
  -> FilePath -- ^ Path to .ghci
  -> [CabalPackageInfo]
  -> Maybe String -- ^ Optional command to run at every reload
  -> m ()
runGhcid dotGhci mcmd = callCommand $ unwords $ "ghcid" : opts
runGhcid root chdirToRoot dotGhci packages mcmd =
  nixShellWithPkgs root True chdirToRoot (packageInfoToNamePathMap packages) (Just $ unwords $ ghcidExePath : opts) -- TODO: Shell escape
  where
    opts =
      [ "-W"
      --TODO: The decision of whether to use -fwarn-redundant-constraints should probably be made by the user
      , "--command='ghci -Wall -ignore-dot-ghci -fwarn-redundant-constraints -no-user-package-db -ghci-script " <> dotGhci <> "' "
      , "--reload=config"
      , "--command='ghci -ignore-dot-ghci " <> makeBaseGhciOptions dotGhci <> "' "
      , "--outputfile=ghcid-output.txt"
      ] <> testCmd
    testCmd = maybeToList (flip fmap mcmd $ \cmd -> "--test='" <> cmd <> "'")
      ] <> map (\x -> "--reload='" <> x <> "'") reloadFiles
      <> map (\x -> "--restart='" <> x <> "'") restartFiles
      <> testCmd
    testCmd = maybeToList (flip fmap mcmd $ \cmd -> "--test='" <> cmd <> "'") -- TODO: Shell escape

    adjustRoot x = if chdirToRoot then makeRelative root x else x
    reloadFiles = map adjustRoot [root </> "config"]
    restartFiles = map (adjustRoot . _cabalPackageInfo_packageFile) packages

makeBaseGhciOptions :: FilePath -> String
makeBaseGhciOptions dotGhci =
  unwords
    [ "-no-user-package-db"
    , "-package-env -"
    , "-ghci-script " <> dotGhci
    ]

getFreePort :: MonadIO m => m PortNumber
getFreePort = liftIO $ withSocketsDo $ do

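The ghcid option assembly in `runGhcid` can be sketched as a pure function. This is an illustrative reduction, not the project's code: it keeps only the fixed flags and the optional `--test` entry, and shell escaping remains a TODO as the diff notes.

```haskell
import Data.Maybe (maybeToList)

-- Sketch of the ghcid argument list: fixed options plus an optional
-- --test='<cmd>' when a reload command is supplied.
ghcidOpts :: FilePath -> Maybe String -> [String]
ghcidOpts dotGhci mcmd =
  [ "-W"
  , "--command='ghci -ignore-dot-ghci -no-user-package-db -ghci-script " <> dotGhci <> "'"
  , "--outputfile=ghcid-output.txt"
  ] <> maybeToList (fmap (\cmd -> "--test='" <> cmd <> "'") mcmd)

main :: IO ()
main = mapM_ putStrLn (ghcidOpts ".ghci" (Just "Obelisk.Run.run"))
```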
@ -8,26 +8,31 @@
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}
module Obelisk.Command.Thunk
  ( ThunkPtr (..)
  ( GitHubSource (..)
  , GitUri (..)
  , ReadThunkError (..)
  , ThunkConfig (..)
  , ThunkData (..)
  , ThunkPackConfig (..)
  , ThunkPtr (..)
  , ThunkRev (..)
  , ThunkSource (..)
  , ThunkData (..)
  , ReadThunkError (..)
  , GitHubSource (..)
  , getThunkGitBranch
  , getLatestRev
  , updateThunkToLatest
  , ThunkUpdateConfig (..)
  , attrCacheFileName
  , createThunk
  , createThunkWithLatest
  , getLatestRev
  , getThunkGitBranch
  , getThunkPtr
  , gitUriToText
  , nixBuildAttrWithCache
  , nixBuildThunkAttrWithCache
  , unpackThunk
  , packThunk
  , readThunk
  , updateThunk
  , getThunkPtr
  , getThunkPtr'
  , parseGitUri
  , readThunk
  , unpackThunk
  , updateThunk
  , updateThunkToLatest
  , uriThunkPtr
  ) where

@ -43,6 +48,7 @@ import qualified Data.Aeson as Aeson
import Data.Aeson.Encode.Pretty
import qualified Data.Aeson.Types as Aeson
import qualified Data.ByteString.Lazy as LBS
import Data.Containers.ListUtils (nubOrd)
import Data.Default
import Data.Either.Combinators (fromRight', rightToMaybe)
import Data.Git.Ref (Ref)
@ -65,12 +71,12 @@ import GitHub
import GitHub.Data.Name
import Obelisk.Command.Nix
import System.Directory
import System.Exit
import System.FilePath
import System.IO.Error
import System.IO.Temp
import System.Posix (getSymbolicLinkStatus, modificationTime)
import System.Process (proc)
import Text.URI
import qualified Text.URI as URI

import Obelisk.App (MonadObelisk)
import Obelisk.CliApp
@ -95,7 +101,7 @@ type NixSha256 = Text --TODO: Use a smart constructor and make this actually ver

-- | A specific revision of data; it may be available from multiple sources
data ThunkRev = ThunkRev
  { _thunkRev_commit :: Ref
  { _thunkRev_commit :: Ref Ref.SHA1
  , _thunkRev_nixSha256 :: NixSha256
  }
  deriving (Show, Eq, Ord)
@ -112,37 +118,63 @@ data GitHubSource = GitHubSource
  { _gitHubSource_owner :: Name Owner
  , _gitHubSource_repo :: Name Repo
  , _gitHubSource_branch :: Maybe (Name Branch)
  , _gitHubSource_private :: Bool
  }
  deriving (Show, Eq, Ord)

newtype GitUri = GitUri { unGitUri :: URI.URI } deriving (Eq, Ord, Show)

gitUriToText :: GitUri -> Text
gitUriToText (GitUri uri)
  | (T.toLower . URI.unRText <$> URI.uriScheme uri) == Just "file"
  , Just (_, path) <- URI.uriPath uri
  = "/" <> T.intercalate "/" (map URI.unRText $ NonEmpty.toList path)
  | otherwise = URI.render uri

data GitSource = GitSource
  { _gitSource_url :: URI
  { _gitSource_url :: GitUri
  , _gitSource_branch :: Maybe (Name Branch)
  , _gitSource_fetchSubmodules :: Bool
  , _gitSource_private :: Bool
  }
  deriving (Show, Eq, Ord)

newtype ThunkConfig = ThunkConfig
  { _thunkConfig_private :: Maybe Bool
  } deriving Show

data ThunkUpdateConfig = ThunkUpdateConfig
  { _thunkUpdateConfig_branch :: Maybe String
  , _thunkUpdateConfig_config :: ThunkConfig
  } deriving Show

data ThunkPackConfig = ThunkPackConfig
  { _thunkPackConfig_force :: Bool
  , _thunkPackConfig_config :: ThunkConfig
  } deriving Show

-- | Convert a GitHub source to a regular Git source. Assumes no submodules.
forgetGithub :: Bool -> GitHubSource -> GitSource
forgetGithub useSsh s = GitSource
  { _gitSource_url = URI
    { uriScheme = Just $ fromRight' $ mkScheme $ if useSsh then "ssh" else "https"
    , uriAuthority = Right $ Authority
      { authUserInfo = UserInfo (fromRight' $ mkUsername "git") Nothing
  { _gitSource_url = GitUri $ URI.URI
    { URI.uriScheme = Just $ fromRight' $ URI.mkScheme $ if useSsh then "ssh" else "https"
    , URI.uriAuthority = Right $ URI.Authority
      { URI.authUserInfo = URI.UserInfo (fromRight' $ URI.mkUsername "git") Nothing
          <$ guard useSsh
      , authHost = fromRight' $ mkHost "github.com"
      , authPort = Nothing
      , URI.authHost = fromRight' $ URI.mkHost "github.com"
      , URI.authPort = Nothing
      }
    , uriPath = Just ( False
      , fromRight' . mkPathPiece <$>
    , URI.uriPath = Just ( False
      , fromRight' . URI.mkPathPiece <$>
          untagName (_gitHubSource_owner s)
          :| [ untagName (_gitHubSource_repo s) <> ".git" ]
      )
    , uriQuery = []
    , uriFragment = Nothing
    , URI.uriQuery = []
    , URI.uriFragment = Nothing
    }
  , _gitSource_branch = _gitHubSource_branch s
  , _gitSource_fetchSubmodules = False
  , _gitSource_private = _gitHubSource_private s
  }

getThunkGitBranch :: ThunkPtr -> Maybe Text
@ -150,26 +182,26 @@ getThunkGitBranch (ThunkPtr _ src) = fmap untagName $ case src of
  ThunkSource_GitHub s -> _gitHubSource_branch s
  ThunkSource_Git s -> _gitSource_branch s

commitNameToRef :: Name Commit -> Ref
commitNameToRef :: Name Commit -> Ref Ref.SHA1
commitNameToRef (N c) = Ref.fromHex $ encodeUtf8 c

-- TODO: Use spinner here.
getNixSha256ForUriUnpacked
  :: MonadObelisk m
  => URI
  => GitUri
  -> m NixSha256
getNixSha256ForUriUnpacked uri =
  withExitFailMessage ("nix-prefetch-url: Failed to determine sha256 hash of URL " <> render uri) $ do
  withExitFailMessage ("nix-prefetch-url: Failed to determine sha256 hash of URL " <> gitUriToText uri) $ do
    [hash] <- fmap T.lines $ readProcessAndLogOutput (Debug, Debug) $
      proc "nix-prefetch-url" ["--unpack" , "--type" , "sha256" , T.unpack $ render uri]
      proc "nix-prefetch-url" ["--unpack", "--type", "sha256", T.unpack $ gitUriToText uri]
    pure hash

nixPrefetchGit :: MonadObelisk m => URI -> Text -> Bool -> m NixSha256
nixPrefetchGit :: MonadObelisk m => GitUri -> Text -> Bool -> m NixSha256
nixPrefetchGit uri rev fetchSubmodules =
  withExitFailMessage ("nix-prefetch-git: Failed to determine sha256 hash of Git repo " <> render uri <> " at " <> rev) $ do
  withExitFailMessage ("nix-prefetch-git: Failed to determine sha256 hash of Git repo " <> gitUriToText uri <> " at " <> rev) $ do
    out <- readProcessAndLogStderr Debug $
      proc "nix-prefetch-git" $ filter (/="")
        [ "--url", T.unpack $ render uri
        [ "--url", T.unpack $ gitUriToText uri
        , "--rev", T.unpack rev
        , if fetchSubmodules then "--fetch-submodules" else ""
        , "--quiet"
@ -200,6 +232,9 @@ readThunk thunkDir = do
    False -> return $ return Nothing
    False -> liftIO $ fmap ThunkData_Packed <$> readPackedThunk thunkDir

attrCacheFileName :: FilePath
attrCacheFileName = ".attr-cache"

data ThunkType = ThunkType
  { _thunkType_loader :: FilePath
  , _thunkType_json :: FilePath
@ -212,7 +247,7 @@ gitHubThunkType :: ThunkType
gitHubThunkType = ThunkType
  { _thunkType_loader = "default.nix"
  , _thunkType_json = "github.json"
  , _thunkType_optional = Set.fromList [".attr-cache"]
  , _thunkType_optional = Set.fromList [attrCacheFileName]
  , _thunkType_loaderVersions = gitHubStandaloneLoaders
  , _thunkType_parser = parseThunkPtr $ \v ->
      ThunkSource_GitHub <$> parseGitHubSource v <|> ThunkSource_Git <$> parseGitSource v
@ -222,7 +257,7 @@ gitThunkType :: ThunkType
gitThunkType = ThunkType
  { _thunkType_loader = "default.nix"
  , _thunkType_json = "git.json"
  , _thunkType_optional = Set.fromList [".attr-cache"]
  , _thunkType_optional = Set.fromList [attrCacheFileName]
  , _thunkType_loaderVersions = plainGitStandaloneLoaders
  , _thunkType_parser = parseThunkPtr $ fmap ThunkSource_Git . parseGitSource
  }
@ -287,10 +322,12 @@ parseGitHubSource v = do
  owner <- v Aeson..: "owner"
  repo <- v Aeson..: "repo"
  branch <- v Aeson..:! "branch"
  private <- v Aeson..:? "private"
  pure $ GitHubSource
    { _gitHubSource_owner = owner
    , _gitHubSource_repo = repo
    , _gitHubSource_branch = branch
    , _gitHubSource_private = fromMaybe False private
    }

parseGitSource :: Aeson.Object -> Aeson.Parser GitSource
@ -298,10 +335,12 @@ parseGitSource v = do
  Just url <- parseGitUri <$> v Aeson..: "url"
  branch <- v Aeson..:! "branch"
  fetchSubmodules <- v Aeson..:! "fetchSubmodules"
  private <- v Aeson..:? "private"
  pure $ GitSource
    { _gitSource_url = url
    , _gitSource_branch = branch
    , _gitSource_fetchSubmodules = fromMaybe False fetchSubmodules
    , _gitSource_private = fromMaybe False private
    }

overwriteThunk :: MonadObelisk m => FilePath -> ThunkPtr -> m ()
@ -328,13 +367,15 @@ encodeThunkPtrData (ThunkPtr rev src) = case src of
    , ("branch" .=) <$> _gitHubSource_branch s
    , Just $ "rev" .= Ref.toHexString (_thunkRev_commit rev)
    , Just $ "sha256" .= _thunkRev_nixSha256 rev
    , Just $ "private" .= _gitHubSource_private s
    ]
  ThunkSource_Git s -> encodePretty' plainGitCfg $ Aeson.object $ catMaybes
    [ Just $ "url" .= render (_gitSource_url s)
    [ Just $ "url" .= gitUriToText (_gitSource_url s)
    , Just $ "rev" .= Ref.toHexString (_thunkRev_commit rev)
    , ("branch" .=) <$> _gitSource_branch s
    , Just $ "sha256" .= _thunkRev_nixSha256 rev
    , Just $ "fetchSubmodules" .= _gitSource_fetchSubmodules s
    , Just $ "private" .= _gitSource_private s
    ]
  where
    githubCfg = defConfig
@ -355,6 +396,7 @@ encodeThunkPtrData (ThunkPtr rev src) = case src of
      [ "url"
      , "rev"
      , "sha256"
      , "private"
      , "fetchSubmodules"
      ] <> compare
    , confTrailingNewline = True
@ -362,7 +404,7 @@ encodeThunkPtrData (ThunkPtr rev src) = case src of

createThunk :: MonadIO m => FilePath -> ThunkPtr -> m ()
createThunk target thunk = liftIO $ do
  createDirectoryIfMissing True (target </> ".attr-cache")
  createDirectoryIfMissing True (target </> attrCacheFileName)
  T.writeFile (target </> "default.nix") (thunkPtrLoader thunk)
  let
    jsonFileName = case _thunkPtr_source thunk of
@ -378,8 +420,8 @@ createThunkWithLatest target s = do
    , _thunkPtr_rev = rev
    }

updateThunkToLatest :: MonadObelisk m => FilePath -> Maybe String -> m ()
updateThunkToLatest target mBranch = withSpinner' ("Updating thunk " <> T.pack target <> " to latest") (pure $ const $ "Thunk " <> T.pack target <> " updated to latest") $ do
updateThunkToLatest :: MonadObelisk m => ThunkUpdateConfig -> FilePath -> m ()
updateThunkToLatest (ThunkUpdateConfig mBranch thunkConfig) target = withSpinner' ("Updating thunk " <> T.pack target <> " to latest") (pure $ const $ "Thunk " <> T.pack target <> " updated to latest") $ do
  checkThunkDirectory "ob thunk update directory cannot be '.'" target
  -- check to see if thunk should be updated to a specific branch or just update it's current branch
  case mBranch of
@ -391,7 +433,7 @@ updateThunkToLatest target mBranch = withSpinner' ("Updating thunk " <> T.pack t
      ThunkData_Checkout _ -> failWith "cannot update an unpacked thunk"
      let src = _thunkPtr_source ptr
      rev <- getLatestRev src
      overwriteThunk overwrite $ ThunkPtr
      overwriteThunk overwrite $ modifyThunkPtrByConfig thunkConfig $ ThunkPtr
        { _thunkPtr_source = src
        , _thunkPtr_rev = rev
        }
@ -399,24 +441,25 @@ updateThunkToLatest target mBranch = withSpinner' ("Updating thunk " <> T.pack t
    Left err -> failWith $ T.pack $ "thunk update: " <> show err
    Right c -> case c of
      ThunkData_Packed t -> case _thunkPtr_source t of
        ThunkSource_Git tsg -> setThunk target tsg branch
        ThunkSource_Git tsg -> setThunk thunkConfig target tsg branch
        ThunkSource_GitHub tsgh -> do
          let tsg = forgetGithub False tsgh
          setThunk target tsg branch
      ThunkData_Checkout _ -> failWith $ T.pack $ "thunk located at " <> (show target) <> " is unpacked. Use ob thunk pack on the desired directory and then try ob thunk update again."
          setThunk thunkConfig target tsg branch
      ThunkData_Checkout _ -> failWith $ T.pack $ "thunk located at " <> show target <> " is unpacked. Use ob thunk pack on the desired directory and then try ob thunk update again."

setThunk :: MonadObelisk m => FilePath -> GitSource -> String -> m ()
setThunk target gs branch = do
  newThunkPtr <- uriThunkPtr (_gitSource_url gs) (Just $ T.pack branch) Nothing
setThunk :: MonadObelisk m => ThunkConfig -> FilePath -> GitSource -> String -> m ()
setThunk thunkConfig target gs branch = do
  newThunkPtr <- uriThunkPtr (_gitSource_url gs) (_thunkConfig_private thunkConfig) (Just $ T.pack branch) Nothing
  overwriteThunk target newThunkPtr
  updateThunkToLatest target Nothing
  updateThunkToLatest (ThunkUpdateConfig Nothing thunkConfig) target

-- | All recognized github standalone loaders, ordered from newest to oldest.
-- This tool will only ever produce the newest one when it writes a thunk.
gitHubStandaloneLoaders :: NonEmpty Text
gitHubStandaloneLoaders =
  gitHubStandaloneLoaderV2 :|
  gitHubStandaloneLoaderV4 :|
  [ gitHubStandaloneLoaderV3
  , gitHubStandaloneLoaderV2
  , gitHubStandaloneLoaderV1
  ]

@ -456,10 +499,24 @@ gitHubStandaloneLoaderV3 = T.unlines
  , "in import (fetch (builtins.fromJSON (builtins.readFile ./github.json)))"
  ]

gitHubStandaloneLoaderV4 :: Text
gitHubStandaloneLoaderV4 = T.unlines
|
||||
[ "# DO NOT HAND-EDIT THIS FILE"
|
||||
, "let fetch = { private ? false, fetchSubmodules ? false, owner, repo, rev, sha256, ... }:"
|
||||
, " if !fetchSubmodules && !private then builtins.fetchTarball {"
|
||||
, " url = \"https://github.com/${owner}/${repo}/archive/${rev}.tar.gz\"; inherit sha256;"
|
||||
, " } else (import <nixpkgs> {}).fetchFromGitHub {"
|
||||
, " inherit owner repo rev sha256 fetchSubmodules private;"
|
||||
, " };"
|
||||
, "in import (fetch (builtins.fromJSON (builtins.readFile ./github.json)))"
|
||||
]
|
||||
|
||||
plainGitStandaloneLoaders :: NonEmpty Text
|
||||
plainGitStandaloneLoaders =
|
||||
plainGitStandaloneLoaderV1 :|
|
||||
[ plainGitStandaloneLoaderV2
|
||||
plainGitStandaloneLoaderV4 :|
|
||||
[ plainGitStandaloneLoaderV1
|
||||
, plainGitStandaloneLoaderV2
|
||||
, plainGitStandaloneLoaderV3
|
||||
]
|
||||
|
||||
plainGitStandaloneLoaderV1 :: Text
|
||||
@ -480,6 +537,41 @@ plainGitStandaloneLoaderV2 = T.unlines
|
||||
, "in import (fetchGit (builtins.fromJSON (builtins.readFile ./git.json)))"
|
||||
]
|
||||
|
||||
-- This loader has a bug because @builtins.fetchGit@ is not given a @ref@
|
||||
-- and will fail to find commits without this because it does shallow clones.
|
||||
plainGitStandaloneLoaderV3 :: Text
|
||||
plainGitStandaloneLoaderV3 = T.unlines
|
||||
[ "# DO NOT HAND-EDIT THIS FILE"
|
||||
, "let fetch = {url, rev, ref ? null, sha256 ? null, fetchSubmodules ? false, private ? false, ...}:"
|
||||
, " let realUrl = let firstChar = builtins.substring 0 1 url; in"
|
||||
, " if firstChar == \"/\" then /. + url"
|
||||
, " else if firstChar == \".\" then ./. + url"
|
||||
, " else url;"
|
||||
, " in if !fetchSubmodules && private then builtins.fetchGit {"
|
||||
, " url = realUrl; inherit rev;"
|
||||
, " } else (import <nixpkgs> {}).fetchgit {"
|
||||
, " url = realUrl; inherit rev sha256;"
|
||||
, " };"
|
||||
, "in import (fetch (builtins.fromJSON (builtins.readFile ./git.json)))"
|
||||
]
|
||||
|
||||
plainGitStandaloneLoaderV4 :: Text
|
||||
plainGitStandaloneLoaderV4 = T.unlines
|
||||
[ "# DO NOT HAND-EDIT THIS FILE"
|
||||
, "let fetch = {url, rev, branch ? null, sha256 ? null, fetchSubmodules ? false, private ? false, ...}:"
|
||||
, " let realUrl = let firstChar = builtins.substring 0 1 url; in"
|
||||
, " if firstChar == \"/\" then /. + url"
|
||||
, " else if firstChar == \".\" then ./. + url"
|
||||
, " else url;"
|
||||
, " in if !fetchSubmodules && private then builtins.fetchGit {"
|
||||
, " url = realUrl; inherit rev;"
|
||||
, " ${if branch == null then null else \"ref\"} = branch;"
|
||||
, " } else (import <nixpkgs> {}).fetchgit {"
|
||||
, " url = realUrl; inherit rev sha256;"
|
||||
, " };"
|
||||
, "in import (fetch (builtins.fromJSON (builtins.readFile ./git.json)))"
|
||||
]
|
||||
|
||||
thunkFileNames :: [FilePath]
|
||||
thunkFileNames = ["default.nix", "github.json", "git.json"]
|
||||
|
||||
@ -506,7 +598,7 @@ nixBuildThunkAttrWithCache thunkDir attr = do
|
||||
let cacheErrHandler e
|
||||
| isDoesNotExistError e = return Nothing -- expected from a cache miss
|
||||
| otherwise = putLog Error (T.pack $ displayException e) >> return Nothing
|
||||
cacheDir = thunkDir </> ".attr-cache"
|
||||
cacheDir = thunkDir </> attrCacheFileName
|
||||
cachePath = cacheDir </> attr <.> "out"
|
||||
latestChange <- liftIO $ do
|
||||
createDirectoryIfMissing False cacheDir
|
||||
@ -521,7 +613,7 @@ nixBuildThunkAttrWithCache thunkDir attr = do
|
||||
Just c -> return c
|
||||
Nothing -> do
|
||||
putLog Warning $ T.pack $ mconcat [thunkDir, ": ", attr, " not cached, building ..."]
|
||||
_ <- nixCmd $ NixCmd_Build$ def
|
||||
_ <- nixCmd $ NixCmd_Build $ def
|
||||
Lens.& nixBuildConfig_outLink Lens..~ OutLink_IndirectRoot cachePath
|
||||
Lens.& nixCmdConfig_target Lens..~ Target
|
||||
{ _target_path = Just thunkDir
|
||||
@ -569,13 +661,13 @@ updateThunk p f = withSystemTempDirectory "obelisk-thunkptr-" $ \tmpDir -> do
|
||||
Right (ThunkData_Packed _) -> do
|
||||
let tmpThunk = tmpDir </> "thunk"
|
||||
callProcessAndLogOutput (Notice, Error) $
|
||||
proc "cp" ["-r", "-T", thunkDir, tmpThunk]
|
||||
proc cp ["-r", "-T", thunkDir, tmpThunk]
|
||||
return tmpThunk
|
||||
Right _ -> failWith $ "Thunk is not packed"
|
||||
Right _ -> failWith "Thunk is not packed"
|
||||
updateThunkFromTmp p' = do
|
||||
_ <- packThunk' True p'
|
||||
_ <- packThunk' True (ThunkPackConfig False (ThunkConfig Nothing)) p'
|
||||
callProcessAndLogOutput (Notice, Error) $
|
||||
proc "cp" ["-r", "-T", p', p]
|
||||
proc cp ["-r", "-T", p', p]
|
||||
|
||||
finalMsg :: Bool -> (a -> Text) -> Maybe (a -> Text)
|
||||
finalMsg noTrail s = if noTrail then Nothing else Just s
|
||||
@ -607,7 +699,7 @@ unpackThunk' noTrail thunkDir = checkThunkDirectory "Can't pack/unpack from with
|
||||
let git = callProcessAndLogOutput (Notice, Notice) . gitProc tmpRepo
|
||||
git $ [ "clone" ]
|
||||
++ ("--recursive" <$ guard (_gitSource_fetchSubmodules s))
|
||||
++ [ T.unpack $ render $ _gitSource_url s ]
|
||||
++ [ T.unpack $ gitUriToText $ _gitSource_url s ]
|
||||
++ do branch <- maybeToList $ _gitSource_branch s
|
||||
[ "--branch", T.unpack $ untagName branch ]
|
||||
git ["reset", "--hard", Ref.toHexString $ _thunkRev_commit $ _thunkPtr_rev tptr]
|
||||
@ -616,35 +708,39 @@ unpackThunk' noTrail thunkDir = checkThunkDirectory "Can't pack/unpack from with
|
||||
|
||||
liftIO $ createDirectory obGitDir
|
||||
callProcessAndLogOutput (Notice, Error) $
|
||||
proc "cp" ["-r", "-T", thunkDir </> ".", obGitDir </> "orig-thunk"]
|
||||
proc cp ["-r", "-T", thunkDir </> ".", obGitDir </> "orig-thunk"]
|
||||
callProcessAndLogOutput (Notice, Error) $
|
||||
proc "rm" ["-r", thunkDir]
|
||||
callProcessAndLogOutput (Notice, Error) $
|
||||
proc "mv" ["-T", tmpRepo, thunkDir]
|
||||
|
||||
--TODO: add force mode to pack even if changes are present
|
||||
--TODO: add a rollback mode to pack to the original thunk
|
||||
packThunk :: MonadObelisk m => FilePath -> m ThunkPtr
|
||||
packThunk :: MonadObelisk m => ThunkPackConfig -> FilePath -> m ThunkPtr
|
||||
packThunk = packThunk' False
|
||||
|
||||
packThunk' :: MonadObelisk m => Bool -> FilePath -> m ThunkPtr
|
||||
packThunk' noTrail thunkDir = checkThunkDirectory "Can't pack/unpack from within the thunk directory" thunkDir >> readThunk thunkDir >>= \case
|
||||
packThunk' :: MonadObelisk m => Bool -> ThunkPackConfig -> FilePath -> m ThunkPtr
|
||||
packThunk' noTrail (ThunkPackConfig force thunkConfig) thunkDir = checkThunkDirectory "Can't pack/unpack from within the thunk directory" thunkDir >> readThunk thunkDir >>= \case
|
||||
Left err -> failWith $ T.pack $ "thunk pack: " <> show err
|
||||
Right (ThunkData_Packed _) -> failWith "pack: thunk is already packed"
|
||||
Right (ThunkData_Checkout _) -> do
|
||||
withSpinner' ("Packing thunk " <> T.pack thunkDir)
|
||||
(finalMsg noTrail $ const $ "Packed thunk " <> T.pack thunkDir) $ do
|
||||
thunkPtr <- getThunkPtr thunkDir
|
||||
thunkPtr <- modifyThunkPtrByConfig thunkConfig <$> getThunkPtr (not force) thunkDir (_thunkConfig_private thunkConfig)
|
||||
callProcessAndLogOutput (Debug, Error) $ proc "rm" ["-rf", thunkDir]
|
||||
liftIO $ createThunk thunkDir thunkPtr
|
||||
pure thunkPtr
|
||||
|
||||
getThunkPtr :: MonadObelisk m => FilePath -> m ThunkPtr
|
||||
getThunkPtr = getThunkPtr' True
|
||||
modifyThunkPtrByConfig :: ThunkConfig -> ThunkPtr -> ThunkPtr
|
||||
modifyThunkPtrByConfig (ThunkConfig markPrivate') ptr = case markPrivate' of
|
||||
Nothing -> ptr
|
||||
Just markPrivate -> ptr { _thunkPtr_source = case _thunkPtr_source ptr of
|
||||
ThunkSource_Git s -> ThunkSource_Git $ s { _gitSource_private = markPrivate }
|
||||
ThunkSource_GitHub s -> ThunkSource_GitHub $ s { _gitHubSource_private = markPrivate }
|
||||
}
|
||||
|
||||
getThunkPtr' :: forall m. MonadObelisk m => Bool -> FilePath -> m ThunkPtr
|
||||
getThunkPtr' checkClean thunkDir = do
|
||||
when checkClean $ ensureCleanGitRepo thunkDir True $
|
||||
getThunkPtr :: forall m. MonadObelisk m => Bool -> FilePath -> Maybe Bool -> m ThunkPtr
|
||||
getThunkPtr checkClean thunkDir mPrivate = do
|
||||
when checkClean $ ensureCleanGitRepo thunkDir True
|
||||
"thunk pack: thunk checkout contains unsaved modifications"
|
||||
|
||||
-- Check whether there are any stashes
|
||||
@ -657,7 +753,7 @@ getThunkPtr' checkClean thunkDir = do
|
||||
] ++ T.lines stashOutput
|
||||
True -> return ()
|
||||
|
||||
-- Get current branch ``
|
||||
-- Get current branch
|
||||
(mCurrentBranch, mCurrentCommit) <- do
|
||||
b <- listToMaybe
|
||||
<$> T.lines
|
||||
@ -666,7 +762,7 @@ getThunkPtr' checkClean thunkDir = do
|
||||
<$> T.lines
|
||||
<$> readGitProcess thunkDir ["rev-parse", "HEAD"]
|
||||
case b of
|
||||
(Just "HEAD") -> failWith $ T.unlines $
|
||||
(Just "HEAD") -> failWith $ T.unlines
|
||||
[ "thunk pack: You are in 'detached HEAD' state."
|
||||
, "If you want to pack at the current ref \
|
||||
\then please create a new branch with 'git checkout -b <new-branch-name>' and push this upstream."
|
||||
@ -738,7 +834,7 @@ getThunkPtr' checkClean thunkDir = do
|
||||
[ "thunk pack: Certain branches in the thunk have commits not yet pushed upstream:"
|
||||
, ""
|
||||
] ++
|
||||
(flip map (Map.toList nonGood) $ \(branch, (upstream, (ahead, behind))) -> mconcat
|
||||
flip map (Map.toList nonGood) (\(branch, (upstream, (ahead, behind))) -> mconcat
|
||||
[" ", branch, " ahead: ", T.pack (show ahead), " behind: ", T.pack (show behind), " remote branch ", upstream]) ++
|
||||
[ ""
|
||||
, "Please push these upstream and try again. (Or just fetch, if they are somehow \
|
||||
@ -746,7 +842,7 @@ getThunkPtr' checkClean thunkDir = do
|
||||
]
|
||||
|
||||
-- We assume it's safe to pack the thunk at this point
|
||||
putLog Informational $ "All changes safe in git remotes. OK to pack thunk."
|
||||
putLog Informational "All changes safe in git remotes. OK to pack thunk."
|
||||
|
||||
let remote = maybe "origin" snd $ flip Map.lookup headUpstream =<< mCurrentBranch
|
||||
|
||||
@ -759,7 +855,7 @@ getThunkPtr' checkClean thunkDir = do
|
||||
remoteUri <- case parseGitUri remoteUri' of
|
||||
Nothing -> failWith $ "Could not identify git remote: " <> remoteUri'
|
||||
Just uri -> pure uri
|
||||
uriThunkPtr remoteUri mCurrentBranch mCurrentCommit
|
||||
uriThunkPtr remoteUri mPrivate mCurrentBranch mCurrentCommit
|
||||
|
||||
-- | Get the latest revision available from the given source
|
||||
getLatestRev :: MonadObelisk m => ThunkSource -> m ThunkRev
|
||||
@ -778,18 +874,18 @@ getLatestRev os = do
|
||||
-- performance. If that doesn't work (e.g. authentication issue), we fall back
|
||||
-- on just doing things the normal way for git repos in general, and save it as
|
||||
-- a regular git thunk.
|
||||
uriThunkPtr :: MonadObelisk m => URI -> Maybe Text -> Maybe Text -> m ThunkPtr
|
||||
uriThunkPtr uri mbranch mcommit = do
|
||||
uriThunkPtr :: MonadObelisk m => GitUri -> Maybe Bool -> Maybe Text -> Maybe Text -> m ThunkPtr
|
||||
uriThunkPtr uri mPrivate mbranch mcommit = do
|
||||
commit <- case mcommit of
|
||||
Nothing -> gitGetCommitBranch uri mbranch >>= return . snd
|
||||
(Just c) -> return c
|
||||
(src, rev) <- case uriToThunkSource uri mbranch of
|
||||
(src, rev) <- uriToThunkSource uri mPrivate mbranch >>= \case
|
||||
ThunkSource_GitHub s -> do
|
||||
rev <- runExceptT $ githubThunkRev s commit
|
||||
case rev of
|
||||
Right r -> pure (ThunkSource_GitHub s, r)
|
||||
Left e -> do
|
||||
putLog Warning $ "\
|
||||
putLog Warning "\
|
||||
\Failed to fetch archive from GitHub. This is probably a private repo. \
|
||||
\Falling back on normal fetchgit. Original failure:"
|
||||
errorToWarning e
|
||||
@ -806,34 +902,66 @@ uriThunkPtr uri mbranch mcommit = do
|
||||
-- If the thunk is a GitHub thunk and fails, we do *not* fall back like with
|
||||
-- `uriThunkPtr`. Unlike a plain URL, a thunk src explicitly states which method
|
||||
-- should be employed, and so we respect that.
|
||||
uriToThunkSource :: URI -> Maybe Text -> ThunkSource
|
||||
uriToThunkSource u
|
||||
| Right uriAuth <- uriAuthority u
|
||||
, Just scheme <- unRText <$> uriScheme u
|
||||
uriToThunkSource :: MonadObelisk m => GitUri -> Maybe Bool -> Maybe Text -> m ThunkSource
|
||||
uriToThunkSource (GitUri u) mPrivate
|
||||
| Right uriAuth <- URI.uriAuthority u
|
||||
, Just scheme <- URI.unRText <$> URI.uriScheme u
|
||||
, case scheme of
|
||||
"ssh" -> uriAuth == Authority
|
||||
{ authUserInfo = Just $ UserInfo (fromRight' $ mkUsername "git") Nothing
|
||||
, authHost = fromRight' $ mkHost "github.com"
|
||||
, authPort = Nothing
|
||||
"ssh" -> uriAuth == URI.Authority
|
||||
{ URI.authUserInfo = Just $ URI.UserInfo (fromRight' $ URI.mkUsername "git") Nothing
|
||||
, URI.authHost = fromRight' $ URI.mkHost "github.com"
|
||||
, URI.authPort = Nothing
|
||||
}
|
||||
s -> s `L.elem` [ "git", "https", "http" ] -- "http:" just redirects to "https:"
|
||||
&& unRText (authHost uriAuth) == "github.com"
|
||||
, Just (_, owner :| [repoish]) <- uriPath u
|
||||
= \mbranch -> ThunkSource_GitHub $ GitHubSource
|
||||
{ _gitHubSource_owner = N $ unRText owner
|
||||
, _gitHubSource_repo = N $ let
|
||||
repoish' = unRText repoish
|
||||
in case T.stripSuffix ".git" repoish' of
|
||||
Just repo -> repo
|
||||
Nothing -> repoish'
|
||||
, _gitHubSource_branch = N <$> mbranch
|
||||
}
|
||||
&& URI.unRText (URI.authHost uriAuth) == "github.com"
|
||||
, Just (_, owner :| [repoish]) <- URI.uriPath u
|
||||
= \mbranch -> do
|
||||
isPrivate <- getIsPrivate
|
||||
pure $ ThunkSource_GitHub $ GitHubSource
|
||||
{ _gitHubSource_owner = N $ URI.unRText owner
|
||||
, _gitHubSource_repo = N $ let
|
||||
repoish' = URI.unRText repoish
|
||||
in fromMaybe repoish' $ T.stripSuffix ".git" repoish'
|
||||
, _gitHubSource_branch = N <$> mbranch
|
||||
, _gitHubSource_private = isPrivate
|
||||
}
|
||||
|
||||
| otherwise = \mbranch -> ThunkSource_Git $ GitSource
|
||||
{ _gitSource_url = u
|
||||
, _gitSource_branch = N <$> mbranch
|
||||
, _gitSource_fetchSubmodules = False -- TODO: How do we determine if this should be true?
|
||||
}
|
||||
| otherwise = \mbranch -> do
|
||||
isPrivate <- getIsPrivate
|
||||
pure $ ThunkSource_Git $ GitSource
|
||||
{ _gitSource_url = GitUri u
|
||||
, _gitSource_branch = N <$> mbranch
|
||||
, _gitSource_fetchSubmodules = False -- TODO: How do we determine if this should be true?
|
||||
, _gitSource_private = isPrivate
|
||||
}
|
||||
where
|
||||
getIsPrivate = maybe (guessGitRepoIsPrivate $ GitUri u) pure mPrivate
|
||||
|
||||
guessGitRepoIsPrivate :: MonadObelisk m => GitUri -> m Bool
|
||||
guessGitRepoIsPrivate uri = flip fix urisToTry $ \loop -> \case
|
||||
[] -> pure True
|
||||
uriAttempt:xs -> do
|
||||
result <- readCreateProcessWithExitCode $
|
||||
isolateGitProc $
|
||||
gitProcNoRepo
|
||||
[ "ls-remote"
|
||||
, "--quiet"
|
||||
, "--exit-code"
|
||||
, "--symref"
|
||||
, T.unpack $ gitUriToText uriAttempt
|
||||
]
|
||||
case result of
|
||||
(ExitSuccess, _, _) -> pure False -- Must be a public repo
|
||||
_ -> loop xs
|
||||
where
|
||||
urisToTry = nubOrd $
|
||||
-- Include the original URI if it isn't using SSH because SSH will certainly fail.
|
||||
[uri | fmap URI.unRText (URI.uriScheme (unGitUri uri)) /= Just "ssh"] <>
|
||||
[changeScheme "https" uri, changeScheme "http" uri]
|
||||
changeScheme scheme (GitUri u) = GitUri $ u
|
||||
{ URI.uriScheme = URI.mkScheme scheme
|
||||
, URI.uriAuthority = (\x -> x { URI.authUserInfo = Nothing }) <$> URI.uriAuthority u
|
||||
}
|
||||
|
||||
-- Funny signature indicates no effects depend on the optional branch name.
|
||||
githubThunkRev
|
||||
@ -845,19 +973,19 @@ githubThunkRev
|
||||
githubThunkRev s commit = do
|
||||
owner <- forcePP $ _gitHubSource_owner s
|
||||
repo <- forcePP $ _gitHubSource_repo s
|
||||
revTarball <- mkPathPiece $ commit <> ".tar.gz"
|
||||
let archiveUri = URI
|
||||
{ uriScheme = Just $ fromRight' $ mkScheme "https"
|
||||
, uriAuthority = Right $ Authority
|
||||
{ authUserInfo = Nothing
|
||||
, authHost = fromRight' $ mkHost "github.com"
|
||||
, authPort = Nothing
|
||||
revTarball <- URI.mkPathPiece $ commit <> ".tar.gz"
|
||||
let archiveUri = GitUri $ URI.URI
|
||||
{ URI.uriScheme = Just $ fromRight' $ URI.mkScheme "https"
|
||||
, URI.uriAuthority = Right $ URI.Authority
|
||||
{ URI.authUserInfo = Nothing
|
||||
, URI.authHost = fromRight' $ URI.mkHost "github.com"
|
||||
, URI.authPort = Nothing
|
||||
}
|
||||
, uriPath = Just ( False
|
||||
, owner :| [ repo, fromRight' $ mkPathPiece "archive", revTarball ]
|
||||
, URI.uriPath = Just ( False
|
||||
, owner :| [ repo, fromRight' $ URI.mkPathPiece "archive", revTarball ]
|
||||
)
|
||||
, uriQuery = []
|
||||
, uriFragment = Nothing
|
||||
, URI.uriQuery = []
|
||||
, URI.uriFragment = Nothing
|
||||
}
|
||||
hash <- getNixSha256ForUriUnpacked archiveUri
|
||||
putLog Debug $ "Nix sha256 is " <> hash
|
||||
@ -866,8 +994,8 @@ githubThunkRev s commit = do
|
||||
, _thunkRev_nixSha256 = hash
|
||||
}
|
||||
where
|
||||
forcePP :: Name entity -> m (RText 'PathPiece)
|
||||
forcePP = mkPathPiece . untagName
|
||||
forcePP :: Name entity -> m (URI.RText 'URI.PathPiece)
|
||||
forcePP = URI.mkPathPiece . untagName
|
||||
|
||||
gitThunkRev
|
||||
:: MonadObelisk m
|
||||
@ -876,11 +1004,11 @@ gitThunkRev
|
||||
-> m ThunkRev
|
||||
gitThunkRev s commit = do
|
||||
let u = _gitSource_url s
|
||||
protocols = ["https", "ssh", "git"]
|
||||
Just scheme <- pure $ unRText <$> uriScheme u
|
||||
protocols = ["file", "https", "ssh", "git"]
|
||||
scheme = maybe "file" URI.unRText $ URI.uriScheme $ (\(GitUri x) -> x) u
|
||||
unless (T.toLower scheme `elem` protocols) $
|
||||
failWith $ "obelisk currently only supports "
|
||||
<> T.intercalate ", " protocols <> " protocols for non-GitHub remotes"
|
||||
<> T.intercalate ", " protocols <> " protocols for plain Git remotes"
|
||||
hash <- nixPrefetchGit u commit $ _gitSource_fetchSubmodules s
|
||||
putLog Informational $ "Nix sha256 is " <> hash
|
||||
pure $ ThunkRev
|
||||
@ -893,11 +1021,12 @@ gitThunkRev s commit = do
|
||||
--
|
||||
-- If the branch name is passed in, it is returned exactly as-is. If it is not
|
||||
-- passed it, the default branch of the repo is used instead.
|
||||
|
||||
gitGetCommitBranch
|
||||
:: MonadObelisk m => URI -> Maybe Text -> m (Text, CommitId)
|
||||
:: MonadObelisk m => GitUri -> Maybe Text -> m (Text, CommitId)
|
||||
gitGetCommitBranch uri mbranch = withExitFailMessage ("Failure for git remote " <> uriMsg) $ do
|
||||
(_, bothMaps) <- gitLsRemote
|
||||
(T.unpack $ render uri)
|
||||
(T.unpack $ gitUriToText uri)
|
||||
(GitRef_Branch <$> mbranch)
|
||||
Nothing
|
||||
branch <- case mbranch of
|
||||
@ -913,18 +1042,21 @@ gitGetCommitBranch uri mbranch = withExitFailMessage ("Failure for git remote "
|
||||
pure (branch, commit)
|
||||
where
|
||||
rethrowE = either failWith pure
|
||||
uriMsg = render uri
|
||||
uriMsg = gitUriToText uri
|
||||
|
||||
parseGitUri :: Text -> Maybe URI
|
||||
parseGitUri x = parseAbsoluteURI x <|> parseSshShorthand x
|
||||
parseGitUri :: Text -> Maybe GitUri
|
||||
parseGitUri x = GitUri <$> (parseFileURI x <|> parseAbsoluteURI x <|> parseSshShorthand x)
|
||||
|
||||
parseAbsoluteURI :: Text -> Maybe URI
|
||||
parseFileURI :: Text -> Maybe URI.URI
|
||||
parseFileURI uri = if "/" `T.isPrefixOf` uri then parseAbsoluteURI ("file://" <> uri) else Nothing
|
||||
|
||||
parseAbsoluteURI :: Text -> Maybe URI.URI
|
||||
parseAbsoluteURI uri = do
|
||||
parsedUri <- mkURI uri
|
||||
guard $ isPathAbsolute parsedUri
|
||||
pure $ parsedUri
|
||||
parsedUri <- URI.mkURI uri
|
||||
guard $ URI.isPathAbsolute parsedUri
|
||||
pure parsedUri
|
||||
|
||||
parseSshShorthand :: Text -> Maybe URI
|
||||
parseSshShorthand :: Text -> Maybe URI.URI
|
||||
parseSshShorthand uri = do
|
||||
-- This is what git does to check that the remote
|
||||
-- is not a local file path when parsing shorthand.
|
||||
@ -937,4 +1069,4 @@ parseSshShorthand uri = do
|
||||
-- This check is used to disambiguate a filepath containing a colon from shorthand
|
||||
guard $ isNothing (T.findIndex (=='/') authAndHostname)
|
||||
&& not (T.null colonAndPath)
|
||||
mkURI properUri
|
||||
URI.mkURI properUri
|
||||
|
@@ -1,114 +0,0 @@
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TupleSections #-}
-- | Hash a git repo or arbitrary directory using nix-hash.
module Obelisk.Command.Upgrade.Hash
( getDirectoryHash
, getHashAtGitRevision
) where

import Control.Monad (forM, forM_)
import Control.Monad.IO.Class (liftIO)
import Data.Bool (bool)
import Data.Maybe (catMaybes)
import Data.Semigroup ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import System.Directory
import System.FilePath
import System.IO.Temp
import System.Process (cwd, proc)

import Obelisk.App (MonadObelisk)
import Obelisk.CliApp
import Obelisk.Command.Utils

import Obelisk.Migration

-- | Get the unique hash of the given directory
--
-- Excludes the following before computing the hash:
-- * the specified top-level files/ directories.
-- * .git directory
-- * untracked Git files
-- * ignored Git files
-- * empty directories
--
-- Uses the same predictive algorithm as that of Nix (`nix-hash`).
--
-- This function will do a full copy of the directory to a temporary location before
-- computing the hash. Because it will be deleting the files in exclude list, and
-- other files if the directory is a git repo, which needs to be done as `nix-hash`
-- doesn't support taking an excludes list.
getDirectoryHash :: MonadObelisk m => [FilePath] -> FilePath -> m Hash
getDirectoryHash excludes dir = withSystemTempDirectory "obelisk-hash-" $ \tmpDir -> do
withSpinnerNoTrail (T.pack $ "Copying " <> dir <> " to " <> tmpDir) $ do
runProc $ copyDir dir tmpDir
getDirectoryHashDestructive excludes tmpDir

getDirectoryHashDestructive :: MonadObelisk m => [FilePath] -> FilePath -> m Hash
getDirectoryHashDestructive excludes dir = do
liftIO (doesDirectoryExist $ dir </> ".git") >>= \case
True -> do
tidyUpGitWorkingCopy dir
withSpinnerNoTrail "Removing .git directory" $
liftIO $ removePathForcibly $ dir </> ".git"
False -> pure ()
withSpinnerNoTrail "Removing excluded paths" $ do
forM_ (fmap (dir </>) excludes) $
liftIO . removePathForcibly
nixHash dir

getHashAtGitRevision :: MonadObelisk m => [Text] -> [FilePath] -> FilePath -> m [Hash]
getHashAtGitRevision revs excludes dir = withSystemTempDirectory "obelisk-hashrev-" $ \tmpDir -> do
withSpinnerNoTrail (T.pack $ "Copying " <> dir <> " to " <> tmpDir) $ do
runProc $ copyDir dir tmpDir
tidyUpGitWorkingCopy tmpDir
-- Discard changes to tracked files
runProcSilently $ gitProc tmpDir ["reset", "--hard"]
forM revs $ \rev -> do
runProcSilently $ gitProc tmpDir ["checkout", T.unpack rev]
-- Checking out an arbitrary revision can leave untracked files (from
-- previous revison) around, so tidy them up.
tidyUpGitWorkingCopy tmpDir
withFilesStashed tmpDir (excludes <> [".git"]) $
nixHash tmpDir
where
withFilesStashed base fs m = withSystemTempDirectory "obelisk-hashrev-stash-" $ \stashDir -> do
existingPaths <- fmap catMaybes $ forM fs $ \p -> do
liftIO (doesPathExist $ base </> p) >>= pure . bool Nothing (Just p)
forM_ existingPaths $ \p ->
liftIO $ renamePath (base </> p) (stashDir </> p)
result <- m
forM_ existingPaths $ \p ->
liftIO $ renamePath (stashDir </> p) (base </> p)
return result

nixHash :: MonadObelisk m => FilePath -> m Hash
nixHash dir = withSpinnerNoTrail "Running `nix-hash`" $
readProc $ proc "nix-hash" [dir]

-- | Clean up the following files in the git working copy
--
-- * Paths ignored by .gitignore, and still present
-- * Untracked files (not added to git index)
-- * Any empty directories (these are not tracked by git)
--
-- Note that this leaves modified (staged or unstaged) files as they are.
tidyUpGitWorkingCopy :: MonadObelisk m => FilePath -> m ()
tidyUpGitWorkingCopy dir = withSpinnerNoTrail "Tidying up git working copy" $ do
ignored <- gitLsFiles dir ["--ignored", "--exclude-standard", "--others"]
untracked <- gitLsFiles dir ["--exclude-standard", "--others"]
putLog Debug $ "Ignored: " <> T.pack (show $ length ignored) <> " files."
putLog Debug $ "Untracked files:\n" <> T.unlines untracked
withSpinnerNoTrail "Removing untracked and ignored files" $ do
forM_ (fmap ((dir </>) . T.unpack) $ ignored <> untracked) $
liftIO . removePathForcibly
-- Empty directories won't be included in these lists. Git doesn't track them
-- So we must delete these separately.
runProc $ proc "find" [dir, "-depth", "-empty", "-type", "d", "-delete"]
where
gitLsFiles pwd opts = fmap T.lines $ readProc $
(proc "git" $ ["ls-files", "."] <> opts) { cwd = Just pwd }
@ -4,6 +4,7 @@
|
||||
{-# LANGUAGE ScopedTypeVariables #-}
|
||||
{-# LANGUAGE OverloadedStrings #-}
|
||||
{-# LANGUAGE PatternGuards #-}
|
||||
{-# LANGUAGE TemplateHaskell #-}
|
||||
module Obelisk.Command.Utils where
|
||||
|
||||
import Control.Applicative hiding (many)
|
||||
@ -20,18 +21,31 @@ import Data.Maybe (maybeToList)
|
||||
import Data.Text (Text)
|
||||
import qualified Data.Text as T
|
||||
import Data.Void (Void)
|
||||
import System.Directory (canonicalizePath)
|
||||
import System.Environment (getExecutablePath)
|
||||
import System.Exit (ExitCode)
|
||||
import qualified System.Process as P
|
||||
import System.Which (staticWhich)
|
||||
import Text.Megaparsec as MP
|
||||
import Text.Megaparsec.Char as MP
|
||||
|
||||
import Obelisk.App (MonadObelisk)
|
||||
import Obelisk.CliApp
|
||||
|
||||
getObeliskExe :: IO FilePath
|
||||
getObeliskExe = getExecutablePath >>= canonicalizePath
|
||||
cp :: FilePath
|
||||
cp = $(staticWhich "cp")
|
||||
|
||||
ghcidExePath :: FilePath
|
||||
ghcidExePath = $(staticWhich "ghcid")
|
||||
|
||||
findExePath :: FilePath
|
||||
findExePath = $(staticWhich "find")
|
||||
|
||||
nixExePath :: FilePath
|
||||
nixExePath = $(staticWhich "nix")
|
||||
|
||||
nixBuildExePath :: FilePath
|
||||
nixBuildExePath = $(staticWhich "nix-build")
|
||||
|
||||
jreKeyToolPath :: FilePath
|
||||
jreKeyToolPath = $(staticWhich "keytool")
|
||||
|
||||
-- Check whether the working directory is clean
|
||||
checkGitCleanStatus :: MonadObelisk m => FilePath -> Bool -> m Bool
|
||||
@ -61,20 +75,31 @@ initGit repo = do
|
||||
git ["add", "."]
|
||||
git ["commit", "-m", "Initial commit."]
|
||||
|
||||
gitProcNoRepo :: [String] -> P.CreateProcess
|
||||
gitProcNoRepo = P.proc "git"
|
||||
gitProcNoRepo :: [String] -> ProcessSpec
|
||||
gitProcNoRepo args = setEnvOverride (M.singleton "GIT_TERMINAL_PROMPT" "0" <>) $ proc "git" args
|
||||
|
||||
gitProc :: FilePath -> [String] -> P.CreateProcess
|
||||
gitProc :: FilePath -> [String] -> ProcessSpec
|
||||
gitProc repo = gitProcNoRepo . runGitInDir
|
||||
where
|
||||
runGitInDir args' = case filter (not . null) args' of
|
||||
args@("clone":_) -> args <> [repo]
|
||||
args -> ["-C", repo] <> args
|
||||
|
||||
-- | Recursively copy a directory using `cp -a`
|
||||
copyDir :: FilePath -> FilePath -> P.CreateProcess
|
||||
isolateGitProc :: ProcessSpec -> ProcessSpec
|
||||
isolateGitProc = setEnvOverride (overrides <>)
|
||||
where
|
||||
overrides = M.fromList
|
||||
[ ("HOME", "/dev/null")
|
||||
, ("GIT_CONFIG_NOSYSTEM", "1")
|
||||
, ("GIT_TERMINAL_PROMPT", "0") -- git 2.3+
|
||||
, ("GIT_ASKPASS", "echo") -- pre git 2.3 to just use empty password
|
||||
, ("GIT_SSH_COMMAND", "ssh -o PreferredAuthentications password -o PubkeyAuthentication no -o GSSAPIAuthentication no")
|
||||
]
|
||||
|
||||
-- | Recursively copy a directory using `cp -a` -- TODO: Should use -rT instead of -a
|
||||
copyDir :: FilePath -> FilePath -> ProcessSpec
|
||||
copyDir src dest =
|
||||
(P.proc "cp" ["-a", ".", dest]) { P.cwd = Just src }
|
||||
setCwd (Just src) $ proc cp ["-a", ".", dest] -- TODO: This will break if dest is relative since we change cwd
|
||||
|
||||
readGitProcess :: MonadObelisk m => FilePath -> [String] -> m Text
|
||||
readGitProcess repo = readProcessAndLogOutput (Debug, Notice) . gitProc repo
|
||||
@ -87,15 +112,15 @@ processToShellString cmd args = unwords $ map quoteAndEscape (cmd : args)
|
||||
where quoteAndEscape x = T.unpack $ "'" <> T.replace "'" "'\''" (T.pack x) <> "'"
|
||||
|
||||
-- | A simpler wrapper for CliApp's most used process function with sensible defaults.
runProc :: MonadObelisk m => P.CreateProcess -> m ()
runProc :: MonadObelisk m => ProcessSpec -> m ()
runProc = callProcessAndLogOutput (Notice, Error)

-- | Like runProc, but all output goes to Debug logging level
runProcSilently :: MonadObelisk m => P.CreateProcess -> m ()
runProcSilently :: MonadObelisk m => ProcessSpec -> m ()
runProcSilently = callProcessAndLogOutput (Debug, Debug)

-- | A simpler wrapper for CliApp's readProcessAndLogStderr with sensible defaults.
readProc :: MonadObelisk m => P.CreateProcess -> m Text
readProc :: MonadObelisk m => ProcessSpec -> m Text
readProc = readProcessAndLogOutput (Debug, Error)

tshow :: Show a => a -> Text
@ -129,13 +154,13 @@ gitLsRemote repository mRef mBranch = do
  (exitCode, out, _err) <- case mBranch of
    Nothing -> readCreateProcessWithExitCode $ gitProcNoRepo $
      ["ls-remote", "--exit-code", "--symref", repository]
      ++ (maybeToList $ T.unpack . showGitRef <$> mRef)
    Just branchName -> readCreateProcessWithExitCode $ gitProcNoRepo $
      ++ maybeToList (T.unpack . showGitRef <$> mRef)
    Just branchName -> readCreateProcessWithExitCode $ gitProcNoRepo
      ["ls-remote", "--exit-code", repository, branchName]
  let t = T.pack out
  maps <- case MP.runParser parseLsRemote "" t of
    Left err -> failWith $ T.pack $ MP.parseErrorPretty' t err
    Right table -> pure $ bimap M.fromList M.fromList $ partitionEithers $ table
    Left err -> failWith $ T.pack $ MP.errorBundlePretty err
    Right table -> pure $ bimap M.fromList M.fromList $ partitionEithers table
  putLog Debug $ "git ls-remote maps: " <> T.pack (show maps)
  pure (exitCode, maps)

@ -18,7 +18,6 @@ import System.Directory (createDirectoryIfMissing)
import System.Exit (ExitCode(..))
import System.FilePath ((<.>), (</>))
import qualified System.Info
import System.Process (proc)

import Obelisk.App (MonadObelisk, getObeliskUserStateDir)
import Obelisk.CliApp
@ -156,7 +155,8 @@ testLinuxBuild stateDir
  | System.Info.os == "linux" = failWith "Using the docker builder is not necessary on linux."
  | otherwise = do
      (exitCode, _stdout, stderr) <- readCreateProcessWithExitCode $ proc "nix-build"
        [ "-E", "(import <nixpkgs> { system = \"x86_64-linux\"; }).writeText \"test\" builtins.currentTime"
        [ "--no-out-link"
        , "-E", "(import <nixpkgs> { system = \"x86_64-linux\"; }).writeText \"test\" builtins.currentTime"
        , "--builders", nixBuildersArgString stateDir
        ]
      unless (exitCode == ExitSuccess) $ do

@ -4,7 +4,7 @@
# platform.
{ lib
, runCommand
, cleanHaskellSource
, obeliskCleanSource
}:

let
@ -23,7 +23,10 @@ let
in

{
  haskellPackage = self: self.callPackage (cleanHaskellSource ./lookup) {};
  haskellOverlay = self: super: {
    obelisk-executable-config-lookup = self.callCabal2nix "obelisk-executable-config-lookup" (obeliskCleanSource ./lookup) {};
  };

  platforms = {
    android = {
      # Inject the given config directory into an android assets folder
@ -34,7 +37,7 @@ in
      inject = injectConfig;
    };
    web = {
      inject = self: self.callCabal2nix "obelisk-executable-config-inject" (cleanHaskellSource ./inject) {};
      inject = self: self.callCabal2nix "obelisk-executable-config-inject" (obeliskCleanSource ./inject) {};
    };
  };
}

@ -9,8 +9,8 @@ library
  build-depends:
    base,
    bytestring,
    directory,
    filepath,
    base64-bytestring,
    containers,
    reflex-dom-core,
    text

@ -2,50 +2,24 @@
module Obelisk.ExecutableConfig.Inject where

import Control.Monad (mapM_)
import Control.Monad.IO.Class (MonadIO, liftIO)
import Data.Semigroup ((<>))
import Control.Monad.IO.Class (MonadIO)
import Data.ByteString (ByteString)
import qualified Data.ByteString.Base64 as B64
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Semigroup ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as T
import qualified Data.Text.Encoding as T
import Reflex.Dom.Core hiding (value)
import System.Directory
import System.FilePath ((</>))

-- | Produces a @<script>@ tag containing the contents of the config item,
-- rendered as a 'ByteString'.
inject :: FilePath -> IO ByteString
inject key = do
  value <- T.readFile key
  fmap snd $ renderStatic $ injectPure (T.pack key) value

-- | Produces a @<script>@ tag with an @id@ attribute containing the key and
-- whose value is the provided configuration
injectPure :: DomBuilder t m => Text -> Text -> m ()
-- whose value is the provided configuration, encoded in base64.
injectPure :: DomBuilder t m => Text -> ByteString -> m ()
injectPure key value =
  let attrs = ("type" =: "text/plain" <> "id" =: ("config-" <> key))
  in elAttr "script" attrs $ text value
  let attrs = ("type" =: "text/plain" <> "data-obelisk-executable-config-inject-key" =: key <> "data-hydration-skip" =: "")
  in elAttr "script" attrs $ text (T.decodeUtf8 (B64.encode value))

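The new `injectPure` base64-encodes the config value before embedding it, so arbitrary bytes (including text like `</script>` that would otherwise terminate the tag) survive the round trip through the DOM. A minimal sketch of that round trip, assuming the `base64-bytestring` package used above:

```haskell
import qualified Data.ByteString.Base64 as B64
import qualified Data.ByteString.Char8 as BC

main :: IO ()
main = do
  -- Base64 keeps arbitrary config bytes safe to embed as the text
  -- content of a <script type="text/plain"> tag.
  let value   = BC.pack "route: https://example.com </script>"
      encoded = B64.encode value
  print (B64.decode encoded == Right value)  -- True: the encoding is lossless
```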
-- | Scans the "common" and "frontend" configuration folders and produces
-- injectable @<script>@ tags containing the configuration keys (filepaths) and
-- values.
injectExecutableConfigs :: (MonadIO m, DomBuilder t m) => m ()
injectExecutableConfigs = do
  cfgC <- getConfigs "config/common"
  cfgF <- getConfigs "config/frontend"
  mapM_ (uncurry injectPure) (cfgC <> cfgF)

getConfigs :: MonadIO m => FilePath -> m [(Text, Text)]
getConfigs fp = liftIO $ do
  dir <- doesDirectoryExist fp
  if dir
    then do
      ps <- listDirectory fp
      fmap concat $ mapM (\p -> getConfigs $ fp </> p) ps
    else do
      file <- doesFileExist fp
      if file
        then do
          f <- T.readFile fp
          return [(T.pack fp, f)]
        else return []
-- | Produces injectable @<script>@ tags containing the configuration keys
-- (filepaths) and values.
injectExecutableConfigs :: (MonadIO m, DomBuilder t m) => Map Text ByteString -> m ()
injectExecutableConfigs = mapM_ (uncurry injectPure) . Map.toList

@ -1,24 +0,0 @@
{ hostPlatform, ghc
, mkDerivation, base, bytestring, filepath, stdenv, text, transformers
, android-activity ? null, jsaddle-wkwebview ? null, ghcjs-dom ? null
}:
let isAndroid = hostPlatform.libc == "bionic";
    isIOS = hostPlatform.isDarwin && hostPlatform.isAarch64;
    isGhcjs = ghc.isGhcjs or false;
in mkDerivation {
  pname = "obelisk-executable-config";
  version = "0.1";
  src = ./.;
  libraryHaskellDepends = [
    base bytestring filepath text transformers
  ] ++ (if isAndroid then [
    android-activity
  ] else if isIOS then [
    jsaddle-wkwebview
  ] else if isGhcjs then [
    ghcjs-dom
  ] else [
  ]);
  license = stdenv.lib.licenses.bsd3;
}

@ -0,0 +1,45 @@
name: obelisk-executable-config-lookup
version: 0.1.1
cabal-version: >= 1.2
build-type: Simple

library
  exposed-modules: Obelisk.ExecutableConfig.Lookup, Obelisk.Configs
  if os(ios)
    hs-source-dirs: src-ios, src
    other-modules: Obelisk.Configs.Internal.Directory
    build-depends: jsaddle-wkwebview
    build-depends: directory
  else
    if os(linux-android)
      hs-source-dirs: src-android, src
      c-sources: cbits/ExecutableConfig.c
      other-modules: Obelisk.ExecutableConfig.Internal.AssetManager
      build-depends: android-activity
      ld-options: -landroid
    else
      if impl(ghcjs)
        hs-source-dirs: src-ghcjs, src
        build-depends: ghcjs-dom
      else
        hs-source-dirs: src-other, src
        other-modules: Obelisk.Configs.Internal.Directory
        build-depends: directory

  build-depends:
    base,
    base64-bytestring,
    bytestring,
    containers,
    filepath,
    text,
    transformers,
    transformers-base,
    ref-tf,
    primitive,
    monad-control,
    reflex,
    reflex-dom,
    jsaddle

  ghc-options: -Wall
@ -1,30 +0,0 @@
name: obelisk-executable-config
version: 0.1.1
cabal-version: >= 1.2
build-type: Simple

library
  exposed-modules: Obelisk.ExecutableConfig
  if os(ios)
    hs-source-dirs: src-ios
    build-depends: jsaddle-wkwebview
  else
    if os(linux-android)
      hs-source-dirs: src-android
      c-sources: cbits/ExecutableConfig.c
      other-modules: Obelisk.ExecutableConfig.Internal.AssetManager
      build-depends: android-activity
      ld-options: -landroid
    else
      if impl(ghcjs)
        hs-source-dirs: src-ghcjs
        build-depends: ghcjs-dom
      else
        hs-source-dirs: src

  build-depends:
    base,
    bytestring,
    filepath,
    text,
    transformers
@ -1,27 +0,0 @@
module Obelisk.ExecutableConfig (get) where

import Control.Exception (bracket)
import qualified Data.ByteString as BS
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Foreign.C.String (withCString)
import Foreign.Ptr (nullPtr)
import System.FilePath.Posix ((</>))

import Obelisk.ExecutableConfig.Internal.AssetManager

get :: Text -> IO (Maybe Text)
get name = bracket getAssets freeAssetManager $ \mgrObj -> do
  mgr <- assetManagerFromJava mgrObj
  let open = do
        a <- withCString (T.unpack name) $ \fn ->
          assetManager_open mgr fn 3
        return $ if unAAsset a == nullPtr
          then Nothing
          else Just a
      close = mapM_ asset_close
  bracket open close $ mapM $ \asset -> do
    b <- asset_getBuffer asset
    l <- asset_getLength asset
    fmap T.decodeUtf8 $ BS.packCStringLen (b, fromIntegral l)
@ -0,0 +1,60 @@
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.ExecutableConfig.Lookup where

import Control.Exception (bracket)
import Control.Monad (forM)
import Data.ByteString (ByteString)
import qualified Data.ByteString as BS
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Text (Text)
import qualified Data.Text.Encoding as T
import Foreign.C.String (withCString)
import Foreign.Ptr (nullPtr)

import Obelisk.ExecutableConfig.Internal.AssetManager

getFromMgr :: AAssetManager -> BS.ByteString -> IO (Maybe ByteString)
getFromMgr mgr name = do
  let open = do
        a <- BS.useAsCString name $ \fn ->
          assetManager_open mgr fn 3
        return $ if unAAsset a == nullPtr
          then Nothing
          else Just a
      close = mapM_ asset_close
  bracket open close $ mapM $ \asset -> do
    b <- asset_getBuffer asset
    l <- asset_getLength asset
    BS.packCStringLen (b, fromIntegral l)

getConfigs :: IO (Map Text ByteString)
getConfigs = fmap (Map.mapKeys T.decodeUtf8) $ bracket getAssets freeAssetManager $ \mgrObj -> do
  mgr <- assetManagerFromJava mgrObj
  let openDir = do
        d <- withCString "config.files" $ \fn ->
          assetManager_open mgr fn 3
        return $ if unAAsset d == nullPtr
          then Nothing
          else Just d
      closeDir = mapM_ asset_close
  configPaths <- bracket openDir closeDir $ \case
    Just asset -> do
      b <- asset_getBuffer asset
      l <- asset_getLength asset
      lines0 <$> BS.packCStringLen (b, fromIntegral l)
    Nothing -> error "could not open configuration manifest 'config.files'"
  result <- fmap Map.fromList $ forM configPaths $ \fp ->
    getFromMgr mgr ("config/" <> fp) >>= \case
      Just v -> return (fp, v)
      Nothing -> error $ "Config present in config.files but not in assets: " <> show fp
  putStrLn $ "getConfigs: found " <> show result
  pure result

lines0 :: BS.ByteString -> [BS.ByteString]
lines0 ps
  | BS.null ps = []
  | otherwise = case BS.elemIndex 0 ps of
      Nothing -> [ps]
      Just n -> BS.take n ps : lines0 (BS.drop (n+1) ps)
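`lines0` splits the manifest on NUL bytes (byte 0, apparently the separator used in `config.files`), dropping a trailing NUL the way `lines` drops a trailing newline. A standalone copy, with hypothetical manifest entries:

```haskell
import qualified Data.ByteString as BS
import qualified Data.ByteString.Char8 as BC

-- Same logic as the lines0 helper above.
lines0 :: BS.ByteString -> [BS.ByteString]
lines0 ps
  | BS.null ps = []
  | otherwise = case BS.elemIndex 0 ps of
      Nothing -> [ps]
      Just n -> BS.take n ps : lines0 (BS.drop (n+1) ps)

main :: IO ()
main = do
  print (lines0 (BC.pack "common/route\0frontend/title\0"))
  -- ["common/route","frontend/title"]: the trailing NUL yields no empty entry
  print (lines0 BS.empty)  -- []
```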
@ -1,17 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.ExecutableConfig where

import Control.Monad.Trans.Maybe
import Data.Semigroup ((<>))
import Data.Text (Text)
import GHCJS.DOM
import GHCJS.DOM.Document
import GHCJS.DOM.Element
import GHCJS.DOM.NonElementParentNode

get :: Text -> IO (Maybe Text)
get item = runMaybeT $ do
  doc <- MaybeT currentDocument
  e <- MaybeT $ getElementById doc $ "config-" <> item
  getInnerHTML e

@ -0,0 +1,51 @@
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.ExecutableConfig.Lookup where

import Data.ByteString (ByteString)
import qualified Data.ByteString.Base64 as B64
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Maybe (catMaybes)
import Data.Text (Text)
import qualified Data.Text.Encoding as T
import Data.Traversable (for)
import GHCJS.DOM
import GHCJS.DOM.Document (getHead)
import GHCJS.DOM.DOMStringMap (get)
import GHCJS.DOM.Element (getInnerHTML)
import GHCJS.DOM.HTMLElement (HTMLElement(HTMLElement), getDataset)
import GHCJS.DOM.NodeList (IsNodeList, item, getLength)
import GHCJS.DOM.ParentNode (querySelectorAll)
import GHCJS.DOM.Types (MonadJSM, Node(Node), castTo)

getConfigs :: IO (Map Text ByteString)
getConfigs = do
  Just doc <- currentDocument
  Just hd <- getHead doc
  nodes <- nodeListNodes =<< querySelectorAll hd ("[data-obelisk-executable-config-inject-key]" :: Text)
  fmap Map.fromList $ for nodes $ \node -> do
    e <- castTo HTMLElement node >>= \case
      Nothing -> error "Found node with data attribute obelisk-executable-config-inject-key that is not an HTMLElement."
      Just htmlE -> return htmlE
    dataset <- getDataset e
    (,)
      -- the key is camelCased: https://html.spec.whatwg.org/multipage/dom.html#dom-dataset
      <$> get dataset ("obeliskExecutableConfigInjectKey" :: Text)
      <*> (fmap decodeOrFail (getInnerHTML e))
  where
    decodeOrFail x = case B64.decode (T.encodeUtf8 x) of
      Left e -> error ("Obelisk.ExecutableConfig.Lookup.getConfigs: error when decoding base64: " ++ e)
      Right x' -> x'

-- | Collect all nodes in the node list.
--
-- TODO: this and the version in obelisk-frontend should be
-- upstreamed to jsaddle.
nodeListNodes :: (IsNodeList l, MonadJSM m) => l -> m [Node]
nodeListNodes es = do
  len <- getLength es
  -- Warning! len is unsigned. If the NodeList is empty, we must avoid
  -- accidentally traversing over [0..maxBound::Word]
  nodes <- traverse (item es) $ if len == 0 then [] else [0..len-1]
  pure $ catMaybes nodes
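The unsigned-length guard in `nodeListNodes` matters because the length is a `Word`: a naive `[0 .. len - 1]` underflows on an empty list and enumerates every index up to `maxBound`. A standalone illustration of the wraparound:

```haskell
main :: IO ()
main = do
  let len = 0 :: Word
  -- Word subtraction wraps around, so (len - 1) is maxBound, not -1.
  print (len - 1 == maxBound)  -- True
  -- The guard used by nodeListNodes avoids enumerating 2^64 indices:
  let safeIndices = if len == 0 then [] else [0 .. len - 1]
  print safeIndices  -- []
```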
@ -1,22 +0,0 @@
module Obelisk.ExecutableConfig (get) where

import Control.Exception
import Control.Monad
import Control.Monad.Trans.Maybe
import Data.ByteString as BS
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as T
import qualified Data.Text.Encoding as T
import Language.Javascript.JSaddle.WKWebView
import System.FilePath.Posix ((</>))
import System.IO.Error

get :: Text -> IO (Maybe Text)
get name = fmap join $ mainBundleResourcePath >>= \mp -> forM mp $ \p ->
  catchDoesNotExist $ T.readFile $ T.unpack (T.decodeUtf8 p) </> T.unpack name

catchDoesNotExist :: IO a -> IO (Maybe a)
catchDoesNotExist f = catchJust doesNotExist (Just <$> f) $ const $ return Nothing
  where
    doesNotExist e = if isDoesNotExistError e then Just () else Nothing
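The removed `catchDoesNotExist` helper turns only the "file does not exist" IOError into `Nothing`, letting every other exception propagate. The pattern works standalone (the file path here is a made-up example):

```haskell
import Control.Exception (catchJust)
import System.IO.Error (isDoesNotExistError)

-- Absorb only does-not-exist errors; rethrow everything else.
catchDoesNotExist :: IO a -> IO (Maybe a)
catchDoesNotExist f = catchJust doesNotExist (Just <$> f) (const (return Nothing))
  where
    doesNotExist e = if isDoesNotExistError e then Just () else Nothing

main :: IO ()
main = do
  r <- catchDoesNotExist (readFile "/definitely/not/here")
  print (r == Nothing)  -- True: the missing file becomes Nothing, not a crash
```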
@ -0,0 +1,17 @@
{-# LANGUAGE LambdaCase #-}
module Obelisk.ExecutableConfig.Lookup where

import Data.ByteString (ByteString)
import Data.Map (Map)
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Language.Javascript.JSaddle.WKWebView
import System.FilePath.Posix

import Obelisk.Configs.Internal.Directory

getConfigs :: IO (Map Text ByteString)
getConfigs = mainBundleResourcePath >>= \case
  Nothing -> error "Could not get bundle resource path."
  Just p -> getConfigsFromDirectory $ T.unpack (T.decodeUtf8 p) </> "config"
@ -0,0 +1,11 @@
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.ExecutableConfig.Lookup where

import Data.Map (Map)
import Data.Text (Text)
import Data.ByteString (ByteString)

import Obelisk.Configs.Internal.Directory (getConfigsFromDirectory)

getConfigs :: IO (Map Text ByteString)
getConfigs = getConfigsFromDirectory "config"
129
lib/executable-config/lookup/src/Obelisk/Configs.hs
Normal file
@ -0,0 +1,129 @@
{-# LANGUAGE CPP #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE DefaultSignatures #-}
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE UndecidableInstances #-}
module Obelisk.Configs
  ( HasConfigs(..)
  , ConfigsT
  , runConfigsT
  , mapConfigsT
  , getTextConfig
  ) where

import Control.Applicative
import Control.Monad
import Control.Monad.Base
import Control.Monad.Fix
import Control.Monad.Primitive
import Control.Monad.Ref
import Control.Monad.IO.Class
import Control.Monad.Trans.Class
import Control.Monad.Trans.Control
import Control.Monad.Trans.Reader
import Data.ByteString (ByteString)
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Text (Text)
import qualified Data.Text.Encoding as T
import Reflex
import Reflex.Host.Class
import Reflex.Dom
#ifndef ghcjs_HOST_OS
import Language.Javascript.JSaddle (MonadJSM)
#endif

class Monad m => HasConfigs m where
  getConfigs :: m (Map Text ByteString)
  default getConfigs :: (HasConfigs m', m ~ t m', MonadTrans t) => m (Map Text ByteString)
  getConfigs = lift getConfigs
  getConfig :: Text -> m (Maybe ByteString)
  getConfig k = do
    configs <- getConfigs
    return $ Map.lookup k configs

instance Monad m => HasConfigs (ConfigsT m) where
  getConfigs = ConfigsT ask

getTextConfig :: HasConfigs m => Text -> m (Maybe Text)
getTextConfig k = fmap T.decodeUtf8 <$> getConfig k

instance HasConfigs m => HasConfigs (BehaviorWriterT t w m)
instance HasConfigs m => HasConfigs (DynamicWriterT t w m)
instance HasConfigs m => HasConfigs (EventWriterT t w m)
instance HasConfigs m => HasConfigs (PostBuildT t m)
instance HasConfigs m => HasConfigs (QueryT t q m)
instance HasConfigs m => HasConfigs (ReaderT r m)
instance HasConfigs m => HasConfigs (RequesterT t request response m)
instance HasConfigs m => HasConfigs (StaticDomBuilderT t m)
instance HasConfigs m => HasConfigs (TriggerEventT t m)

newtype ConfigsT m a = ConfigsT { unConfigsT :: ReaderT (Map Text ByteString) m a }
  deriving
    ( Functor
    , Applicative
    , Monad
    , MonadPlus
    , Alternative
    , MonadFix
    , MonadIO
    , MonadBase m'
    , MonadBaseControl m'
    , MonadRef
    , MonadTrans
    , DomBuilder t
    , MonadHold t
    , MonadReflexCreateTrigger t
    , MonadSample t
    , NotReady t
    , PostBuild t
    , TriggerEvent t
    , HasDocument
    , DomRenderHook t
    , HasJSContext
    , HasJS js
#ifndef ghcjs_HOST_OS
    , MonadJSM
#endif
    )

instance PerformEvent t m => PerformEvent t (ConfigsT m) where
  type Performable (ConfigsT m) = ConfigsT (Performable m)
  performEvent e = ConfigsT $ ReaderT $ \configs ->
    performEvent $ runConfigsT configs <$> e
  performEvent_ e = ConfigsT $ ReaderT $ \configs ->
    performEvent_ $ runConfigsT configs <$> e

instance Adjustable t m => Adjustable t (ConfigsT m) where
  runWithReplace a e = ConfigsT $ runWithReplace (unConfigsT a) (unConfigsT <$> e)
  traverseDMapWithKeyWithAdjust f m e = ConfigsT $ traverseDMapWithKeyWithAdjust (\k v -> unConfigsT $ f k v) m e
  traverseIntMapWithKeyWithAdjust f m e = ConfigsT $ traverseIntMapWithKeyWithAdjust (\k v -> unConfigsT $ f k v) m e
  traverseDMapWithKeyWithAdjustWithMove f m e = ConfigsT $ traverseDMapWithKeyWithAdjustWithMove (\k v -> unConfigsT $ f k v) m e

instance Prerender js t m => Prerender js t (ConfigsT m) where
  type Client (ConfigsT m) = ConfigsT (Client m)
  prerender server client = ConfigsT $ ReaderT $ \configs ->
    prerender (runConfigsT configs server) (runConfigsT configs client)

instance PrimMonad m => PrimMonad (ConfigsT m) where
  type PrimState (ConfigsT m) = PrimState m
  primitive = lift . primitive

runConfigsT
  :: Map Text ByteString
  -> ConfigsT m a
  -> m a
runConfigsT cs child = runReaderT (unConfigsT child) cs

mapConfigsT
  :: (forall x. m x -> n x)
  -> ConfigsT m a
  -> ConfigsT n a
mapConfigsT f (ConfigsT x) = ConfigsT $ mapReaderT f x
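`ConfigsT` is a thin newtype over `ReaderT (Map Text ByteString)`, so `runConfigsT` just supplies the config map once and every `getConfig` lookup reads from it. A trimmed-down analogue using plain `ReaderT` (no reflex-dom constraints; `String` stands in for `Text`/`ByteString`, and the key is a made-up example):

```haskell
import Control.Monad.Trans.Reader (ReaderT, ask, runReaderT)
import Data.Map (Map)
import qualified Data.Map as Map

-- A stand-in for ConfigsT's getConfig: lookup is just Reader access to the map.
getConfig' :: Monad m => String -> ReaderT (Map String String) m (Maybe String)
getConfig' k = Map.lookup k <$> ask

main :: IO ()
main = do
  let configs = Map.fromList [("common/route", "https://example.com")]
  mRoute <- runReaderT (getConfig' "common/route") configs
  print mRoute  -- Just "https://example.com"
```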
@ -0,0 +1,24 @@
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
module Obelisk.Configs.Internal.Directory where

import Control.Monad
import Data.ByteString (ByteString)
import qualified Data.ByteString as BS
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Text (Text)
import qualified Data.Text as T
import System.Directory
import System.FilePath.Posix

getConfigsFromDirectory :: FilePath -> IO (Map Text ByteString)
getConfigsFromDirectory base = doesDirectoryExist base >>= \case
  True -> do
    ps <- listDirectory base
    fmap mconcat $ forM ps $ \p -> do
      subdirConfigs <- getConfigsFromDirectory $ base </> p
      pure $ Map.mapKeys (T.pack . (p </>) . T.unpack) subdirConfigs
  False -> doesFileExist base >>= \case
    True -> Map.singleton "" <$> BS.readFile base
    False -> pure mempty
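At each level of the recursion, `getConfigsFromDirectory` re-keys the child map under its parent directory name with `Map.mapKeys`, so a file becomes the path of directory names leading to it. A sketch of one such re-keying step (the directory and key names are hypothetical):

```haskell
import qualified Data.Map as Map
import System.FilePath ((</>))

main :: IO ()
main = do
  let subdirConfigs = Map.fromList [("route", "https://example.com")]
      -- Prefix the child map's keys with the parent directory,
      -- as each recursion level of getConfigsFromDirectory does.
      rekeyed = Map.mapKeys ("common" </>) subdirConfigs
  print (Map.keys rekeyed)  -- ["common/route"]
```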
@ -1,13 +0,0 @@
module Obelisk.ExecutableConfig (get) where

import Control.Exception
import Data.Text (Text)
import Data.Text as T
import Data.Text.IO as T
import System.FilePath.Posix ((</>))
import System.IO.Error

get :: Text -> IO (Maybe Text)
get path = do
  let doesNotExist = \e -> if isDoesNotExistError e then Just () else Nothing
  catchJust doesNotExist (fmap Just $ T.readFile $ T.unpack path) (\_ -> pure Nothing)
@ -7,10 +7,15 @@
  hs-source-dirs: src
  build-depends: base,
                 bytestring,
                 containers,
                 cookie,
                 dependent-sum,
                 ghcjs-dom,
                 jsaddle,
                 lens,
                 mtl,
                 obelisk-executable-config-inject,
                 obelisk-executable-config-lookup,
                 obelisk-route,
                 primitive,
                 ref-tf,
@ -18,5 +23,7 @@ library
                 reflex-dom-core,
                 text,
                 transformers
  exposed-modules: Obelisk.Frontend
  exposed-modules:
    Obelisk.Frontend
    Obelisk.Frontend.Cookie
  ghc-options: -Wall

@ -1,10 +1,13 @@
{-# LANGUAGE CPP #-}
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE PackageImports #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE RecursiveDo #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE TypeFamilies #-}
@ -13,39 +16,55 @@ module Obelisk.Frontend
  ( ObeliskWidget
  , Frontend (..)
  , runFrontend
  , runFrontendWithConfigsAndCurrentRoute
  , renderFrontendHtml
  , removeHTMLConfigs
  , FrontendMode (..)
  , FrontendWidgetT
  , module Obelisk.Frontend.Cookie
  ) where

import Prelude hiding ((.))

import Control.Category
import Control.Concurrent
import Control.Lens
import Control.Monad
import Control.Monad.Fix
import Control.Monad.IO.Class
import Control.Monad.Primitive
import Control.Monad.Reader
import Control.Monad.Ref
import Control.Monad.Trans.Class
import Control.Monad.Trans.Maybe (MaybeT(..), runMaybeT)
import Data.ByteString (ByteString)
import Data.Dependent.Sum (DSum (..))
import Data.Foldable (for_)
import Data.Functor.Sum
import Data.IORef
import Data.Map (Map)
import Data.Maybe (catMaybes)
import Data.Monoid ((<>))
import Data.Text (Text)
import GHCJS.DOM hiding (bracket, catch)
import GHCJS.DOM.Document
import GHCJS.DOM.Node
import qualified GHCJS.DOM as DOM
import qualified GHCJS.DOM.Types as DOM
import Language.Javascript.JSaddle (JSM)
import qualified GHCJS.DOM.History as DOM
import qualified GHCJS.DOM.Window as DOM
import Language.Javascript.JSaddle (MonadJSM, JSM, jsNull)
import GHCJS.DOM (currentDocument)
import "ghcjs-dom" GHCJS.DOM.Document (getHead)
import GHCJS.DOM.Node (Node, removeChild_)
import GHCJS.DOM.NodeList (IsNodeList, item, getLength)
import GHCJS.DOM.ParentNode (querySelectorAll)
import Obelisk.Frontend.Cookie
import Obelisk.Route.Frontend
import Reflex.Dom.Core
import Reflex.Host.Class
import qualified Reflex.TriggerEvent.Base as TriggerEvent
import Obelisk.Configs
import Obelisk.ExecutableConfig.Inject (injectExecutableConfigs)
import qualified Obelisk.ExecutableConfig.Lookup as Lookup
import Web.Cookie

import Debug.Trace

makePrisms ''Sum

type ObeliskWidget t x route m =
type ObeliskWidget js t route m =
  ( DomBuilder t m
  , MonadFix m
  , MonadHold t m
@ -53,8 +72,6 @@ type ObeliskWidget t x route m =
  , MonadReflexCreateTrigger t m
  , PostBuild t m
  , PerformEvent t m
  , MonadIO m
  , MonadIO (Performable m)
  , TriggerEvent t m
  , HasDocument m
  , MonadRef m
@ -63,106 +80,175 @@ type ObeliskWidget t x route m =
  , Ref (Performable m) ~ Ref IO
  , MonadFix (Performable m)
  , PrimMonad m
  , Prerender x m
  , SetRoute t route m
  , Prerender js t m
  , PrebuildAgnostic t route m
  , PrebuildAgnostic t route (Client m)
  , HasConfigs m
  , HasCookies m
  , MonadIO (Performable m)
  )

type PrebuildAgnostic t route m =
  ( SetRoute t route m
  , RouteToUrl route m
  , MonadFix m
  , HasConfigs m
  , HasConfigs (Performable m)
  )

data Frontend route = Frontend
  { _frontend_head :: !(forall t m x. ObeliskWidget t x route m => RoutedT t route m ())
  , _frontend_body :: !(forall t m x. ObeliskWidget t x route m => RoutedT t route m ())
  { _frontend_head :: !(forall js t m. ObeliskWidget js t route m => RoutedT t route m ())
  , _frontend_body :: !(forall js t m. ObeliskWidget js t route m => RoutedT t route m ())
  }

type Widget' x = ImmediateDomBuilderT DomTimeline (DomCoreWidget x)
baseTag :: forall route js t m. ObeliskWidget js t route m => RoutedT t route m ()
baseTag = elAttr "base" ("href" =: "/") blank --TODO: Figure out the base URL from the routes

-- | A widget that isn't attached to any particular part of the DOM hierarchy
type FloatingWidget x = TriggerEventT DomTimeline (DomCoreWidget x)
removeHTMLConfigs :: JSM ()
removeHTMLConfigs = void $ runMaybeT $ do
  doc <- MaybeT currentDocument
  hd <- MaybeT $ getHead doc
  es <- nodeListNodes =<< querySelectorAll hd ("[data-obelisk-executable-config-inject-key]" :: Text)
  for_ es $ removeChild_ hd

type DomCoreWidget x = PostBuildT DomTimeline (WithJSContextSingleton x (PerformEventT DomTimeline DomHost))
-- | Collect all nodes in the node list.
--
-- TODO: this and the version in exe-config/ghcjs/lookup should be
-- upstreamed to jsaddle.
nodeListNodes :: (IsNodeList l, MonadJSM m) => l -> m [Node]
nodeListNodes es = do
  len <- getLength es
  -- Warning! len is unsigned. If the NodeList is empty, we must avoid
  -- accidentally traversing over [0..maxBound::Word]
  nodes <- traverse (item es) $ if len == 0 then [] else [0..len-1]
  pure $ catMaybes nodes

--TODO: Rename
|
||||
{-# INLINABLE attachWidget''' #-}
|
||||
attachWidget'''
|
||||
:: (EventChannel -> PerformEventT DomTimeline DomHost (IORef (Maybe (EventTrigger DomTimeline ()))))
|
||||
-> IO ()
|
||||
attachWidget''' w = runDomHost $ do
|
||||
events <- liftIO newChan
|
||||
(postBuildTriggerRef, fc@(FireCommand fire)) <- hostPerformEventT $ w events
|
||||
mPostBuildTrigger <- readRef postBuildTriggerRef
|
||||
forM_ mPostBuildTrigger $ \postBuildTrigger -> fire [postBuildTrigger :=> Identity ()] $ return ()
|
||||
liftIO $ processAsyncEvents events fc
|
||||
setInitialRoute :: Bool -> JSM ()
|
||||
setInitialRoute useHash = do
|
||||
traceM "setInitialRoute"
|
||||
window <- DOM.currentWindowUnchecked
|
||||
initialLocation <- DOM.getLocation window
|
||||
initialUri <- getLocationUri initialLocation
|
||||
history <- DOM.getHistory window
|
||||
DOM.replaceState history jsNull ("" :: Text) $ Just $
|
||||
show $ setAdaptedUriPath useHash "/" initialUri
|
||||
|
||||
--TODO: This is a collection of random stuff; we should make it make some more sense and then upstream to reflex-dom-core
|
||||
runWithHeadAndBody
|
||||
:: ( (forall c. Widget' () c -> FloatingWidget () c) -- "Append to head"
|
||||
-> (forall c. Widget' () c -> FloatingWidget () c) -- "Append to body"
|
||||
-> FloatingWidget () ()
|
||||
)
|
||||
data FrontendMode = FrontendMode
|
||||
{ _frontendMode_hydrate :: Bool
|
||||
-- ^ There is already a rendering of the DOM in place; hydrate it rather
|
||||
-- than building new DOM
|
||||
, _frontendMode_adjustRoute :: Bool
|
||||
-- ^ The page can't use regular routes, so encode routes into the hash
|
||||
-- instead
|
||||
}
|
||||
|
||||
-- | Run the frontend, setting the initial route to "/" on platforms where no
|
||||
-- route exists ambiently in the context (e.g. anything but web).
|
||||
-- Selects FrontendMode based on platform; this doesn't work for jsaddle-warp
|
||||
runFrontend
|
||||
:: forall backendRoute route
|
||||
. Encoder Identity Identity (R (FullRoute backendRoute route)) PageName
|
||||
-> Frontend (R route)
|
||||
-> JSM ()
|
||||
runWithHeadAndBody app = withJSContextSingletonMono $ \jsSing -> do
  globalDoc <- currentDocumentUnchecked
  headFragment <- createDocumentFragment globalDoc
  bodyFragment <- createDocumentFragment globalDoc
  unreadyChildren <- liftIO $ newIORef 0
  let commit = do
        headElement <- getHeadUnchecked globalDoc
        bodyElement <- getBodyUnchecked globalDoc
        void $ inAnimationFrame' $ \_ -> do
          replaceElementContents headElement headFragment
          replaceElementContents bodyElement bodyFragment
  liftIO $ attachWidget''' $ \events -> flip runWithJSContextSingleton jsSing $ do
    (postBuild, postBuildTriggerRef) <- newEventWithTriggerRef
    let appendImmediateDom :: DOM.DocumentFragment -> Widget' () c -> FloatingWidget () c
        appendImmediateDom df w = do
          events' <- TriggerEvent.askEvents
          lift $ do
            doc <- getOwnerDocumentUnchecked df
            let builderEnv = ImmediateDomBuilderEnv
                  { _immediateDomBuilderEnv_document = doc
                  , _immediateDomBuilderEnv_parent = toNode df
                  , _immediateDomBuilderEnv_unreadyChildren = unreadyChildren
                  , _immediateDomBuilderEnv_commitAction = commit
                  }
            runImmediateDomBuilderT w builderEnv events'
    flip runPostBuildT postBuild $ flip runTriggerEventT events $ app (appendImmediateDom headFragment) (appendImmediateDom bodyFragment)
    liftIO (readIORef unreadyChildren) >>= \case
      0 -> DOM.liftJSM commit
      _ -> return ()
    return postBuildTriggerRef
runFrontend :: forall backendRoute route. Encoder Identity Identity (R (Sum backendRoute (ObeliskRoute route))) PageName -> Frontend (R route) -> JSM ()
runFrontend validFullEncoder frontend = do
  let ve = validFullEncoder . hoistParse errorLeft (prismEncoder (rPrism $ _InR . _ObeliskRoute_App))
  let mode = FrontendMode
        { _frontendMode_hydrate =
#ifdef ghcjs_HOST_OS
            True
#else
            False
#endif
        , _frontendMode_adjustRoute =
#ifdef ghcjs_HOST_OS
            False
#else
            True
#endif
        }
  configs <- liftIO Lookup.getConfigs
  when (_frontendMode_hydrate mode) removeHTMLConfigs
  -- There's no fundamental reason that adjustRoute needs to control setting the
  -- initial route and *also* the useHash parameter; that's why these are
  -- separate here. However, currently, they are always the same.
  when (_frontendMode_adjustRoute mode) $ do
    setInitialRoute $ _frontendMode_adjustRoute mode
  runFrontendWithConfigsAndCurrentRoute mode configs validFullEncoder frontend
runFrontendWithConfigsAndCurrentRoute
  :: forall backendRoute frontendRoute
  .  FrontendMode
  -> Map Text ByteString
  -> Encoder Identity Identity (R (FullRoute backendRoute frontendRoute)) PageName
  -> Frontend (R frontendRoute)
  -> JSM ()
runFrontendWithConfigsAndCurrentRoute mode configs validFullEncoder frontend = do
  let ve = validFullEncoder . hoistParse errorLeft (prismEncoder (rPrism $ _FullRoute_Frontend . _ObeliskRoute_App))
      errorLeft = \case
        Left _ -> error "runFrontend: Unexpected non-app ObeliskRoute reached the frontend. This shouldn't happen."
        Right x -> Identity x
      runMyRouteViewT
        :: ( TriggerEvent t m
           , PerformEvent t m
           , MonadHold t m
      w :: ( RawDocument (DomBuilderSpace (HydrationDomBuilderT s DomTimeline m)) ~ DOM.Document
           , Ref (Performable m) ~ Ref IO
           , Ref m ~ Ref IO
           , DomBuilder DomTimeline (HydrationDomBuilderT s DomTimeline m)
           , MonadHold DomTimeline m
           , MonadRef m
           , MonadRef (Performable m)
           , MonadReflexCreateTrigger DomTimeline m
           , PerformEvent DomTimeline m
           , PostBuild DomTimeline m
           , PrimMonad m
           , MonadSample DomTimeline (Performable m)
           , DOM.MonadJSM m
           , DOM.MonadJSM (Performable m)
           , MonadFix m
           , MonadFix (Client (HydrationDomBuilderT s DomTimeline m))
           , MonadFix (Performable m)
           , MonadFix m
           , Prerender js DomTimeline (HydrationDomBuilderT s DomTimeline m)
           , MonadIO (Performable m)
           )
        => RoutedT t (R route) (SetRouteT t (R route) (RouteToUrlT (R route) m)) a
        -> m a
      runMyRouteViewT = runRouteViewT ve
  runWithHeadAndBody $ \appendHead appendBody -> runMyRouteViewT $ do
    mapRoutedT (mapSetRouteT (mapRouteToUrlT appendHead)) $ _frontend_head frontend
    mapRoutedT (mapSetRouteT (mapRouteToUrlT appendBody)) $ _frontend_body frontend
        => (forall c. HydrationDomBuilderT s DomTimeline m c -> FloatingWidget () c)
        -> (forall c. HydrationDomBuilderT s DomTimeline m c -> FloatingWidget () c)
        -> FloatingWidget () ()
      w appendHead appendBody = do
        rec switchover <- runRouteViewT ve switchover (_frontendMode_adjustRoute mode) $ do
              (switchover'', fire) <- newTriggerEvent
              mapRoutedT (mapSetRouteT (mapRouteToUrlT (appendHead . runConfigsT configs))) $ do
                -- The order here is important - baseTag has to be before headWidget!
                baseTag
                _frontend_head frontend
              mapRoutedT (mapSetRouteT (mapRouteToUrlT (appendBody . runConfigsT configs))) $ do
                _frontend_body frontend
                switchover' <- case _frontendMode_hydrate mode of
                  True -> lift $ lift $ lift $ lift $ HydrationDomBuilderT $ asks _hydrationDomBuilderEnv_switchover
                  False -> getPostBuild
                performEvent_ $ liftIO (fire ()) <$ switchover'
              pure switchover''
        pure ()
  if _frontendMode_hydrate mode
    then runHydrationWidgetWithHeadAndBody (pure ()) w
    else runImmediateWidgetWithHeadAndBody w
type FrontendWidgetT r = RoutedT DomTimeline r (SetRouteT DomTimeline r (RouteToUrlT r (ConfigsT (CookiesT (HydratableT (PostBuildT DomTimeline (StaticDomBuilderT DomTimeline (PerformEventT DomTimeline DomHost))))))))

renderFrontendHtml
  :: (t ~ DomTimeline)
  => (r' -> Text)
  :: MonadIO m
  => Map Text ByteString
  -> Cookies
  -> (r -> Text)
  -> r
  -> RoutedT t r (SetRouteT t r' (RouteToUrlT r' (PostBuildT DomTimeline (StaticDomBuilderT DomTimeline (PerformEventT DomTimeline DomHost))))) ()
  -> RoutedT t r (SetRouteT t r' (RouteToUrlT r' (PostBuildT DomTimeline (StaticDomBuilderT DomTimeline (PerformEventT DomTimeline DomHost))))) ()
  -> IO ByteString
renderFrontendHtml urlEnc route headWidget bodyWidget = do
  -> Frontend r
  -> FrontendWidgetT r ()
  -> FrontendWidgetT r ()
  -> m ByteString
renderFrontendHtml configs cookies urlEnc route frontend headExtra bodyExtra = do
  --TODO: We should probably have a "NullEventWriterT" or a frozen reflex timeline
  html <- fmap snd $ renderStatic $ fmap fst $ flip runRouteToUrlT urlEnc $ runSetRouteT $ flip runRoutedT (pure route) $
  html <- fmap snd $ liftIO $ renderStatic $ runHydratableT $ fmap fst $ runCookiesT cookies $ runConfigsT configs $ flip runRouteToUrlT urlEnc $ runSetRouteT $ flip runRoutedT (pure route) $
    el "html" $ do
      el "head" headWidget
      el "body" bodyWidget
      el "head" $ do
        baseTag
        injectExecutableConfigs configs
        _frontend_head frontend
        headExtra
      el "body" $ do
        _frontend_body frontend
        bodyExtra
  return $ "<!DOCTYPE html>" <> html
103
lib/frontend/src/Obelisk/Frontend/Cookie.hs
Normal file
@ -0,0 +1,103 @@
{-# LANGUAGE CPP #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE DefaultSignatures #-}
{-# LANGUAGE DeriveFunctor #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE PackageImports #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE UndecidableInstances #-}

module Obelisk.Frontend.Cookie where

import Control.Monad.Fix
import Control.Monad.Primitive
import Control.Monad.Ref
import Control.Monad.IO.Class
import Control.Monad.Trans.Class
import Control.Monad.Trans.Reader
import Data.Text.Encoding (encodeUtf8)
import Reflex
import Reflex.Host.Class
import Reflex.Dom.Core
import "ghcjs-dom" GHCJS.DOM.Document (getCookie, Document)
import GHCJS.DOM.Types (MonadJSM)
import Web.Cookie

import Obelisk.Configs
import Obelisk.Route.Frontend

class Monad m => HasCookies m where
  askCookies :: m Cookies
  default askCookies :: (HasCookies m', m ~ t m', MonadTrans t) => m Cookies
  askCookies = lift askCookies

instance HasCookies m => HasCookies (BehaviorWriterT t w m)
instance HasCookies m => HasCookies (DynamicWriterT t w m)
instance HasCookies m => HasCookies (EventWriterT t w m)
instance HasCookies m => HasCookies (PostBuildT t m)
instance HasCookies m => HasCookies (QueryT t q m)
instance HasCookies m => HasCookies (ReaderT r m)
instance HasCookies m => HasCookies (RequesterT t request response m)
instance HasCookies m => HasCookies (RouteToUrlT t m)
instance HasCookies m => HasCookies (SetRouteT t r m)
instance HasCookies m => HasCookies (StaticDomBuilderT t m)
instance HasCookies m => HasCookies (TriggerEventT t m)
instance HasCookies m => HasCookies (RoutedT t r m)
instance HasCookies m => HasCookies (ConfigsT m)
instance HasConfigs m => HasConfigs (CookiesT m)

newtype CookiesT m a = CookiesT { unCookiesT :: ReaderT Cookies m a }
  deriving
    ( Functor
    , Applicative
    , DomBuilder t
    , Monad
    , MonadFix
    , MonadHold t
    , MonadIO
#ifndef ghcjs_HOST_OS
    , MonadJSM
#endif
    , MonadRef
    , MonadReflexCreateTrigger t
    , MonadSample t
    , MonadTrans
    , NotReady t
    , PerformEvent t
    , PostBuild t
    , Prerender js t
    , TriggerEvent t
    , HasDocument
    )

instance Adjustable t m => Adjustable t (CookiesT m) where
  runWithReplace a e = CookiesT $ runWithReplace (unCookiesT a) (unCookiesT <$> e)
  traverseDMapWithKeyWithAdjust f m e = CookiesT $ traverseDMapWithKeyWithAdjust (\k v -> unCookiesT $ f k v) m e
  traverseIntMapWithKeyWithAdjust f m e = CookiesT $ traverseIntMapWithKeyWithAdjust (\k v -> unCookiesT $ f k v) m e
  traverseDMapWithKeyWithAdjustWithMove f m e = CookiesT $ traverseDMapWithKeyWithAdjustWithMove (\k v -> unCookiesT $ f k v) m e

instance PrimMonad m => PrimMonad (CookiesT m) where
  type PrimState (CookiesT m) = PrimState m
  primitive = lift . primitive

runCookiesT
  :: Cookies
  -> CookiesT m a
  -> m a
runCookiesT cs child = runReaderT (unCookiesT child) cs

instance Monad m => HasCookies (CookiesT m) where
  askCookies = CookiesT ask

mapCookiesT
  :: (forall x. m x -> n x)
  -> CookiesT m a
  -> CookiesT n a
mapCookiesT f (CookiesT x) = CookiesT $ mapReaderT f x

instance (MonadJSM m, RawDocument (DomBuilderSpace (HydrationDomBuilderT s t m)) ~ Document) => HasCookies (HydrationDomBuilderT s t m) where
  askCookies = fmap (parseCookies . encodeUtf8) $ getCookie =<< askDocument
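The `HasCookies`/`CookiesT` machinery above can be exercised as in the sketch below. It is illustrative only: `lookupSessionCookie` and the `"session"` cookie name are hypothetical and not part of this change; `Cookies` is `[(ByteString, ByteString)]` from `Web.Cookie`, so a plain `lookup` suffices.

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.ByteString (ByteString)

-- Any monad with a 'HasCookies' instance (e.g. a widget running under
-- 'CookiesT', or 'HydrationDomBuilderT' via the instance above) can ask
-- for the request's cookies and look one up by name.
lookupSessionCookie :: HasCookies m => m (Maybe ByteString)
lookupSessionCookie = lookup "session" <$> askCookies
```

On the static-rendering side, a concrete cookie list would be supplied with `runCookiesT (parseCookies rawCookieHeader) someWidget`, which is how `renderFrontendHtml` above threads cookies through.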
@ -6,12 +6,14 @@ build-type: Simple
library
  hs-source-dirs: src
  build-depends: base,
                 aeson,
                 bytestring,
                 categories,
                 constraints,
                 constraints-extras,
                 containers,
                 dependent-map,
                 dependent-sum,
                 dependent-sum-universe-orphans,
                 dependent-sum-template,
                 either,
                 ghcjs-dom,
@ -21,15 +23,18 @@ library
                 monad-control,
                 mtl,
                 network-uri,
                 obelisk-executable-config-lookup,
                 primitive,
                 ref-tf,
                 reflex,
                 reflex-dom-core,
                 template-haskell,
                 tabulation,
                 text,
                 th-extras,
                 transformers,
                 universe,
                 universe-template
                 universe-dependent-sum
  exposed-modules: Obelisk.Route
                   Obelisk.Route.TH
                   Obelisk.Route.Frontend
@ -12,12 +12,18 @@
{-# LANGUAGE PatternSynonyms #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE StandaloneDeriving #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE ViewPatterns #-}
module Obelisk.Route
  ( R
  , (:.)
  , (?/)
  , hoistR
  , pattern (:.)
  , pattern (:/)
  , PageName
  , PathQuery
@ -40,6 +46,9 @@ module Obelisk.Route
  , checkEnum1EncoderFunc
  , unitEncoder
  , pathOnlyEncoder
  , addPathSegmentEncoder
  , pathParamEncoder
  , pathLiteralEncoder
  , singletonListEncoder
  , unpackTextEncoder
  , prefixTextEncoder
@ -60,10 +69,15 @@ module Obelisk.Route
  , shadowEncoder
  , prismEncoder
  , rPrism
  , _R
  , obeliskRouteEncoder
  , obeliskRouteSegment
  , pageNameEncoder
  , handleEncoder
  , FullRoute (..)
  , _FullRoute_Frontend
  , _FullRoute_Backend
  , mkFullRouteEncoder
  , ObeliskRoute (..)
  , _ObeliskRoute_App
  , _ObeliskRoute_Resource
@ -78,6 +92,19 @@ module Obelisk.Route
  , void1Encoder
  , pathSegmentsTextEncoder
  , queryParametersTextEncoder
  , renderObeliskRoute
  , renderBackendRoute
  , renderFrontendRoute
  , readShowEncoder
  , integralEncoder
  , pathSegmentEncoder
  , queryOnlyEncoder
  , Decoder(..)
  , dmapEncoder
  , fieldMapEncoder
  , pathFieldEncoder
  , jsonEncoder
  , byteStringsToPageName
  ) where

import Prelude hiding ((.), id)
@ -88,8 +115,32 @@ import qualified Control.Categorical.Functor as Cat
import Control.Categorical.Bifunctor
import Control.Category.Associative
import Control.Category.Monoidal
import Control.Lens (Identity (..), Prism', makePrisms, itraverse, imap, prism, (^.), re, matching, (^?), _Just, _Nothing, Iso', from, view, Wrapped (..))
import Control.Category.Braided
import Control.Lens
  ( Identity (..)
  , (^.)
  , (^?)
  , _Just
  , _Nothing
  , Cons(..)
  , from
  , imap
  , iso
  , Iso'
  , itraverse
  , makePrisms
  , Prism'
  , prism'
  , re
  , view
  , Wrapped (..)
  )
import Control.Monad.Except
import Control.Monad.Writer (execWriter, tell)
import qualified Control.Monad.State.Strict as State
import Control.Monad.Trans (lift)
import qualified Data.ByteString as BS
import qualified Data.ByteString.Lazy as BSL
import Data.Dependent.Sum (DSum (..))
import Data.Dependent.Map (DMap)
import qualified Data.Dependent.Map as DMap
@ -106,16 +157,19 @@ import Data.Maybe
import Data.Monoid ((<>))
import Data.Set (Set)
import qualified Data.Set as Set
import Data.Some (Some)
import qualified Data.Some as Some
import Data.Some.Universe.Orphans ()
import Data.Some (Some(Some))
import Data.Text (Text)
import qualified Data.Text as T
import Data.Text.Encoding
import qualified Data.Text.Encoding as T
import Data.Universe
import Data.Universe.Some
import Network.HTTP.Types.URI
import qualified Numeric.Lens
import Obelisk.Route.TH
import Text.Read (readMaybe)
import Data.Tabulation
import qualified Data.Aeson as Aeson
import Data.Aeson (FromJSON, ToJSON)
-- Design goals:
-- No start-up time on the frontend (not yet met)
@ -141,13 +195,64 @@ import Text.Read (readMaybe)

type R f = DSum f Identity --TODO: Better name

--TODO: COMPLETE pragma
infixr 5 :/
-- | Convenience builder for an 'R' using 'Identity' for the functor.
pattern (:/) :: f a -> a -> R f
pattern a :/ b = a :=> Identity b
{-# COMPLETE (:/) #-}
infixr 5 :/

-- | Like '(:/)' but adds a 'Just' wrapper around the right-hand side.
(?/) :: f (Maybe a) -> a -> R f
r ?/ a = r :/ Just a
infixr 5 ?/

mapSome :: (forall a. f a -> g a) -> Some f -> Some g
mapSome f (Some.This a) = Some.This $ f a
mapSome f (Some a) = Some $ f a

hoistR :: (forall x. f x -> g x) -> R f -> R g
hoistR f (x :=> Identity y) = f x :/ y

--------------------------------------------------------------------------------
-- Dealing with pairs (i.e. non-dependently-typed subroutes/paths)
--------------------------------------------------------------------------------

infixr 5 :.
type (:.) = (,)

{-# COMPLETE (:.) #-}
pattern (:.) :: a -> b -> a :. b
pattern a :. b = (a, b)
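The pattern synonyms above are the main user-facing way to build and match routes. A small illustration, using a hypothetical `FrontendRoute` GADT that is not part of this diff:

```haskell
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}

-- Hypothetical route type, for illustration only.
data FrontendRoute :: * -> * where
  FrontendRoute_Main :: FrontendRoute ()
  FrontendRoute_Article :: FrontendRoute (Maybe Int)

mainRoute :: R FrontendRoute
mainRoute = FrontendRoute_Main :/ ()

-- '?/' saves writing the 'Just' for Maybe-carrying routes; these are equal:
articleRoute, articleRoute' :: R FrontendRoute
articleRoute = FrontendRoute_Article ?/ 42
articleRoute' = FrontendRoute_Article :/ Just 42
```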
addPathSegmentEncoder
  :: ( Applicative check
     , MonadError Text parse
     )
  => Encoder check parse (Text, PageName) PageName
addPathSegmentEncoder = unsafeMkEncoder $ EncoderImpl
  { _encoderImpl_encode = \(ph, (pt, q)) -> (ph : pt, q)
  , _encoderImpl_decode = \(p, q) -> case p of
      [] -> throwError "Expected a path segment"
      ph : pt -> pure (ph, (pt, q))
  }

pathParamEncoder
  :: forall check parse item rest.
     ( Applicative check
     , MonadError Text parse
     )
  => Encoder check parse item Text
  -> Encoder check parse rest PageName
  -> Encoder check parse (item :. rest) PageName
pathParamEncoder itemUnchecked restUnchecked = addPathSegmentEncoder . bimap itemUnchecked restUnchecked

pathLiteralEncoder
  :: ( Applicative check
     , MonadError Text parse
     )
  => Text
  -> Encoder check parse a PageName
  -> Encoder check parse a PageName
pathLiteralEncoder t e = addPathSegmentEncoder . bimap (unitEncoder t) e . coidl

--------------------------------------------------------------------------------
-- Encoder fundamentals
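These combinators compose with the category `(.)`. As a sketch (hedged: it assumes `readShowEncoder`, which this module exports, has a checked type that lines up here, and fixes both monads to `Either Text`), an encoder for paths of the form `blog/<n>`:

```haskell
-- Encodes (n, ()) as the path ["blog", show n] with an empty query,
-- and parses it back, failing on any other shape.
blogPostEncoder :: Encoder (Either Text) (Either Text) (Int :. ()) PageName
blogPostEncoder = pathLiteralEncoder "blog" $
  pathParamEncoder readShowEncoder $
  unitEncoder mempty
```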
@ -182,7 +287,7 @@ decode e x = runIdentity (tryDecode e x)
tryDecode :: Encoder Identity parse decoded encoded -> encoded -> parse decoded
tryDecode (Encoder (Identity impl)) x = _encoderImpl_decode impl x

-- | Similar to 'decode' above, once an encoder has been checked so that its check monad is Identity, it
-- | Similar to 'decode', once an encoder has been checked so that its check monad is Identity, it
-- can be used to actually encode by using this. Note that while there's no constraint on the parse monad here,
-- one should usually be applying decode and encode to the same 'Encoder'
encode :: Encoder Identity parse decoded encoded -> decoded -> encoded
@ -232,6 +337,10 @@ instance Monad parse => Bifunctor (,) (EncoderImpl parse) (EncoderImpl parse) (E
  , _encoderImpl_decode = \(a, b) -> liftA2 (,) (_encoderImpl_decode f a) (_encoderImpl_decode g b)
  }

instance (Monad parse, Applicative check) => Braided (Encoder check parse) (,) where
  braid = isoEncoder (iso swap swap)

instance (Applicative check, Monad parse) => PFunctor (,) (Encoder check parse) (Encoder check parse) where
  first f = bimap f id
instance (Applicative check, Monad parse) => QFunctor (,) (Encoder check parse) (Encoder check parse) where
@ -245,6 +354,34 @@ instance (Traversable f, Monad parse) => Cat.Functor f (EncoderImpl parse) (Enco
  , _encoderImpl_decode = traverse $ _encoderImpl_decode ve
  }

instance Monad parse => PFunctor Either (EncoderImpl parse) (EncoderImpl parse) where
  first f = bimap f id
instance Monad parse => QFunctor Either (EncoderImpl parse) (EncoderImpl parse) where
  second g = bimap id g
instance Monad parse => Bifunctor Either (EncoderImpl parse) (EncoderImpl parse) (EncoderImpl parse) where
  bimap f g = EncoderImpl
    { _encoderImpl_encode = bimap (_encoderImpl_encode f) (_encoderImpl_encode g)
    , _encoderImpl_decode = \case
        Left a -> Left <$> _encoderImpl_decode f a
        Right b -> Right <$> _encoderImpl_decode g b
    }

instance (Monad parse, Applicative check) => QFunctor Either (Encoder check parse) (Encoder check parse) where
  second g = bimap id g
instance (Monad parse, Applicative check) => PFunctor Either (Encoder check parse) (Encoder check parse) where
  first f = bimap f id
instance (Monad parse, Applicative check) => Bifunctor Either (Encoder check parse) (Encoder check parse) (Encoder check parse) where
  bimap f g = Encoder $ liftA2 bimap (unEncoder f) (unEncoder g)

instance (Applicative check, Monad parse) => Associative (Encoder check parse) Either where
  associate = isoEncoder (iso (associate @(->) @Either) disassociate)
  disassociate = isoEncoder (iso disassociate associate)

instance (Monad parse, Applicative check) => Braided (Encoder check parse) Either where
  braid = isoEncoder (iso swap swap)

instance (Traversable f, Monad check, Monad parse) => Cat.Functor f (Encoder check parse) (Encoder check parse) where
  fmap e = Encoder $ do
    ve <- unEncoder e
@ -332,14 +469,14 @@ maybeEncoder f g = shadowEncoder f g . maybeToEitherEncoder
justEncoder :: (Applicative check, MonadError Text parse) => Encoder check parse a (Maybe a)
justEncoder = prismEncoder _Just

-- |
-- | Encode () to 'Nothing'.
nothingEncoder :: (Applicative check, MonadError Text parse) => Encoder check parse () (Maybe a)
nothingEncoder = prismEncoder _Nothing

someConstEncoder :: (Applicative check, Applicative parse) => Encoder check parse (Some (Const a)) a
someConstEncoder = unsafeMkEncoder $ EncoderImpl
  { _encoderImpl_encode = \(Some.This (Const a)) -> a
  , _encoderImpl_decode = pure . Some.This . Const
  { _encoderImpl_encode = \(Some (Const a)) -> a
  , _encoderImpl_decode = pure . Some . Const
  }

-- | WARNING: This is only safe if the Show and Read instances for 'a' are
@ -367,7 +504,7 @@ checkEnum1EncoderFunc
  -> check (EncoderFunc check' parse p r)
checkEnum1EncoderFunc f = do
  (encoderImpls :: DMap p (Flip (EncoderImpl parse) r)) <- DMap.fromList <$>
    traverse (\(Some.This p) -> (p :=>) . Flip <$> unEncoder (f p)) universe
    traverse (\(Some p) -> (p :=>) . Flip <$> unEncoder (f p)) universe
  pure $ EncoderFunc $ \p -> unsafeMkEncoder . unFlip $
    DMap.findWithDefault (error "checkEnum1EncoderFunc: EncoderImpl not found (should be impossible)") p encoderImpls

@ -422,11 +559,12 @@ chainEncoder cons this rest = Encoder $ do
  pure $ EncoderImpl
    { _encoderImpl_decode = \v -> do
        (here, following) <- _encoderImpl_decode consValid v
        Some.This r <- _encoderImpl_decode thisValid here
        (r :/) <$> _encoderImpl_decode (runIdentity . unEncoder $ rest r) following
        _encoderImpl_decode thisValid here >>= \case
          Some r ->
            (r :/) <$> _encoderImpl_decode (runIdentity . unEncoder $ rest r) following
    , _encoderImpl_encode = \(r :/ s) ->
        _encoderImpl_encode consValid
          ( _encoderImpl_encode thisValid $ Some.This r
          ( _encoderImpl_encode thisValid $ Some r
          , _encoderImpl_encode (runIdentity . unEncoder $ rest r) s)
    }

@ -493,7 +631,7 @@ enum1Encoder
  , Show r
  )
  => (forall a. p a -> r) -> Encoder check parse (Some p) r
enum1Encoder f = enumEncoder $ \(Some.This p) -> f p
enum1Encoder f = enumEncoder $ \(Some p) -> f p

-- | Encode an enumerable, bounded type. WARNING: Don't use this on types that
-- have a large number of values - it will use a lot of memory.
@ -536,13 +674,10 @@ pathOnlyEncoderIgnoringQuery = unsafeMkEncoder $ EncoderImpl
  }

pathOnlyEncoder :: (Applicative check, MonadError Text parse) => Encoder check parse [Text] PageName
pathOnlyEncoder = unsafeMkEncoder $ EncoderImpl
  { _encoderImpl_decode = \(path, query) ->
      if query == mempty
        then pure path
        else throwError "pathOnlyEncoderImpl: query was provided"
  , _encoderImpl_encode = \path -> (path, mempty)
  }
pathOnlyEncoder = second (unitEncoder mempty) . coidr

queryOnlyEncoder :: (Applicative check, MonadError Text parse) => Encoder check parse (Map Text (Maybe Text)) PageName
queryOnlyEncoder = first (unitEncoder []) . coidl

singletonListEncoder :: (Applicative check, MonadError Text parse) => Encoder check parse a [a]
singletonListEncoder = unsafeMkEncoder $ EncoderImpl
@ -585,10 +720,10 @@ queryParametersTextEncoder = Encoder $ pure $ EncoderImpl
      in (urlDecodeText True k, urlDecodeText True <$> mv)

urlEncodeText :: Bool -> Text -> Text
urlEncodeText q = decodeUtf8 . urlEncode q . encodeUtf8
urlEncodeText q = T.decodeUtf8 . urlEncode q . T.encodeUtf8

urlDecodeText :: Bool -> Text -> Text
urlDecodeText q = decodeUtf8 . urlDecode q . encodeUtf8
urlDecodeText q = T.decodeUtf8 . urlDecode q . T.encodeUtf8

listToNonEmptyEncoder :: (Applicative check, Applicative parse, Monoid a, Eq a) => Encoder check parse [a] (NonEmpty a)
listToNonEmptyEncoder = Encoder $ pure $ EncoderImpl
@ -646,10 +781,47 @@ joinPairTextEncoder = Encoder . \case
    _ -> return (kt, T.drop (T.length separator) vt)
  }
--TODO: Rewrite this by composing the given prism with a lens on the first element of the DSum
-- Or, more likely, the user can compose it themselves
rPrism :: forall f g. (forall a. Prism' (f a) (g a)) -> Prism' (R f) (R g)
rPrism p = prism (\(g :/ x) -> g ^. re p :/ x) (\(f :/ x) -> bimap (:/ x) (:/ x) $ matching p f)
-- This slight generalization of 'rPrism' happens to be enough to write all our
-- prism combinators so far.
dSumPrism
  :: forall f f' g
  .  (forall a. Prism' (f a) (f' a))
  -> Prism' (DSum f g) (DSum f' g)
dSumPrism p = prism'
  (\(f' :=> x) -> f' ^. re p :=> x)
  (\(f :=> x) -> (:=> x) <$> (f ^? p))

-- already in obelisk
rPrism
  :: forall f f'
  .  (forall a. Prism' (f a) (f' a))
  -> Prism' (R f) (R f')
rPrism p = dSumPrism p

dSumPrism'
  :: forall f g a
  .  (forall b. Prism' (f b) (a :~: b))
  -> Prism' (DSum f g) (g a)
dSumPrism' p = dSumPrism p . iso (\(Refl :=> b) -> b) (Refl :=>)

dSumGEqPrism
  :: GEq f
  => f a
  -> Prism' (DSum f g) (g a)
dSumGEqPrism variant = dSumPrism' $ prism' (\Refl -> variant) (\x -> geq variant x)

-- | Given a 'tag :: f a', make a prism for 'R f'. This generalizes the usual
-- prisms for a sum type (the ones that 'makePrisms' would make), just as 'R'
-- generalizes a usual sum type.
--
-- [This is given the '_R' name of the "canonical" prism not because it is the
-- most general, but because it seems the most useful for routes, and 'R' itself
-- trades generality for route-specificity.]
_R
  :: GEq f
  => f a
  -> Prism' (R f) a
_R variant = dSumGEqPrism variant . iso runIdentity Identity
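`_R` composes with the usual lens operators. A sketch, reusing a hypothetical `FrontendRoute` GADT with `FrontendRoute_Article :: FrontendRoute (Maybe Int)` and a `GEq` instance (none of which are part of this diff):

```haskell
-- Match: Just the payload when the tag matches, Nothing otherwise.
articlePayload :: R FrontendRoute -> Maybe (Maybe Int)
articlePayload r = r ^? _R FrontendRoute_Article

-- Build: reviewing the prism with '#' is equivalent to using ':/'.
articleRoute :: R FrontendRoute
articleRoute = _R FrontendRoute_Article # Just 7
```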
-- | An encoder that only works on the items available via the prism. An error will be thrown in the parse monad
-- if the prism doesn't match.
@ -693,6 +865,45 @@ handleEncoder recover e = Encoder $ do
-- Actual obelisk route info
--------------------------------------------------------------------------------

-- | The typical full route type comprising all of an Obelisk application's routes.
-- Parameterised by the top level GADTs that define backend and frontend routes, respectively.
data FullRoute :: (* -> *) -> (* -> *) -> * -> * where
  FullRoute_Backend :: br a -> FullRoute br fr a
  FullRoute_Frontend :: ObeliskRoute fr a -> FullRoute br fr a

instance (GShow br, GShow fr) => GShow (FullRoute br fr) where
  gshowsPrec p = \case
    FullRoute_Backend x -> showParen (p > 10) (showString "FullRoute_Backend " . gshowsPrec 11 x)
    FullRoute_Frontend x -> showParen (p > 10) (showString "FullRoute_Frontend " . gshowsPrec 11 x)

instance (GEq br, GEq fr) => GEq (FullRoute br fr) where
  geq (FullRoute_Backend x) (FullRoute_Backend y) = geq x y
  geq (FullRoute_Frontend x) (FullRoute_Frontend y) = geq x y
  geq _ _ = Nothing

instance (GCompare br, GCompare fr) => GCompare (FullRoute br fr) where
  gcompare (FullRoute_Backend _) (FullRoute_Frontend _) = GLT
  gcompare (FullRoute_Frontend _) (FullRoute_Backend _) = GGT
  gcompare (FullRoute_Backend x) (FullRoute_Backend y) = gcompare x y
  gcompare (FullRoute_Frontend x) (FullRoute_Frontend y) = gcompare x y

instance (UniverseSome br, UniverseSome fr) => UniverseSome (FullRoute br fr) where
  universeSome = [Some (FullRoute_Backend x) | Some x <- universeSome]
    ++ [Some (FullRoute_Frontend x) | Some x <- universeSome]

-- | Build the typical top level application route encoder from a route for handling 404's,
-- and segment encoders for backend and frontend routes.
mkFullRouteEncoder
  :: (GCompare br, GCompare fr, GShow br, GShow fr, UniverseSome br, UniverseSome fr)
  => R (FullRoute br fr) -- ^ 404 handler
  -> (forall a. br a -> SegmentResult (Either Text) (Either Text) a) -- ^ How to encode a single backend route segment
  -> (forall a. fr a -> SegmentResult (Either Text) (Either Text) a) -- ^ How to encode a single frontend route segment
  -> Encoder (Either Text) Identity (R (FullRoute br fr)) PageName
mkFullRouteEncoder missing backendSegment frontendSegment = handleEncoder (const missing) $
  pathComponentEncoder $ \case
    FullRoute_Backend backendRoute -> backendSegment backendRoute
    FullRoute_Frontend obeliskRoute -> obeliskRouteSegment obeliskRoute frontendSegment
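A typical call to `mkFullRouteEncoder` might look like the sketch below. Everything app-specific here is hypothetical (`BackendRoute`, `FrontendRoute`, and their constructors are not part of this diff); `PathEnd`/`PathSegment` are assumed to be the `SegmentResult` constructors from this module.

```haskell
-- Sketch only: routes a 404 to the frontend's main page, "missing" paths to
-- a hypothetical backend route, and the empty path to the main frontend route.
fullRouteEncoder
  :: Encoder (Either Text) Identity (R (FullRoute BackendRoute FrontendRoute)) PageName
fullRouteEncoder = mkFullRouteEncoder
  (FullRoute_Frontend (ObeliskRoute_App FrontendRoute_Main) :/ ())
  (\case
    BackendRoute_Missing -> PathSegment "missing" $ unitEncoder mempty)
  (\case
    FrontendRoute_Main -> PathEnd $ unitEncoder mempty)
```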
-- | A type which can represent Obelisk-specific resource routes, in addition to application specific routes which serve your
-- frontend.
data ObeliskRoute :: (* -> *) -> * -> * where
@ -700,9 +911,11 @@ data ObeliskRoute :: (* -> *) -> * -> * where
  ObeliskRoute_App :: f a -> ObeliskRoute f a
  ObeliskRoute_Resource :: ResourceRoute a -> ObeliskRoute f a

instance Universe (Some f) => Universe (Some (ObeliskRoute f)) where
  universe = fmap (\(Some.This x) -> Some.This (ObeliskRoute_App x)) universe
    ++ fmap (\(Some.This x) -> Some.This (ObeliskRoute_Resource x)) universe
instance UniverseSome f => UniverseSome (ObeliskRoute f) where
  universeSome = concat
    [ (\(Some x) -> Some (ObeliskRoute_App x)) <$> universe
    , (\(Some x) -> Some (ObeliskRoute_Resource x)) <$> universe
    ]

instance GEq f => GEq (ObeliskRoute f) where
  geq (ObeliskRoute_App x) (ObeliskRoute_App y) = geq x y
@ -790,12 +1003,12 @@ indexOnlyRouteEncoder = pathComponentEncoder indexOnlyRouteSegment

someSumEncoder :: (Applicative check, Applicative parse) => Encoder check parse (Some (Sum a b)) (Either (Some a) (Some b))
someSumEncoder = Encoder $ pure $ EncoderImpl
  { _encoderImpl_encode = \(Some.This t) -> case t of
      InL l -> Left $ Some.This l
      InR r -> Right $ Some.This r
  { _encoderImpl_encode = \(Some t) -> case t of
      InL l -> Left $ Some l
      InR r -> Right $ Some r
  , _encoderImpl_decode = pure . \case
      Left (Some.This l) -> Some.This (InL l)
      Right (Some.This r) -> Some.This (InR r)
      Left (Some l) -> Some (InL l)
      Right (Some r) -> Some (InR r)
  }

data Void1 :: * -> * where {}
@ -806,7 +1019,7 @@ instance Universe (Some Void1) where
void1Encoder :: (Applicative check, MonadError Text parse) => Encoder check parse (Some Void1) a
void1Encoder = Encoder $ pure $ EncoderImpl
  { _encoderImpl_encode = \case
      Some.This f -> case f of {}
      Some f -> case f of {}
  , _encoderImpl_decode = \_ -> throwError "void1Encoder: can't decode anything"
  }

@ -820,8 +1033,150 @@ concat <$> mapM deriveRouteComponent
  ]

makePrisms ''ObeliskRoute

makePrisms ''FullRoute
deriveGEq ''Void1
deriveGCompare ''Void1

-- | Given a backend route and a checked route encoder, render the route (path
-- and query string). See 'checkEncoder' for how to produce a checked encoder.
renderBackendRoute
  :: forall br a.
     Encoder Identity Identity (R (FullRoute br a)) PageName
  -> R br
  -> Text
renderBackendRoute enc = renderObeliskRoute enc . hoistR FullRoute_Backend

-- | Renders a frontend route with the supplied checked encoder
renderFrontendRoute
  :: forall a fr.
     Encoder Identity Identity (R (FullRoute a fr)) PageName
|
||||
-> R fr
|
||||
-> Text
|
||||
renderFrontendRoute enc = renderObeliskRoute enc . hoistR (FullRoute_Frontend . ObeliskRoute_App)
|
||||
|
||||
-- | Renders a route of the form typically found in an Obelisk project
|
||||
renderObeliskRoute
|
||||
:: forall a b.
|
||||
Encoder Identity Identity (R (FullRoute a b)) PageName
|
||||
-> R (FullRoute a b)
|
||||
-> Text
|
||||
renderObeliskRoute e r =
|
||||
let enc :: Encoder Identity (Either Text) (R (FullRoute a b)) PathQuery
|
||||
enc = (pageNameEncoder . hoistParse (pure . runIdentity) e)
|
||||
in (T.pack . uncurry (<>)) $ encode enc r
|
||||
|
||||
readShowEncoder :: (MonadError Text parse, Read a, Show a, Applicative check) => Encoder check parse a PageName
|
||||
readShowEncoder = singlePathSegmentEncoder . unsafeTshowEncoder
|
||||
|
||||
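A hedged sketch of how the two rendering helpers above are typically used; `BackendRoute_Api` and `FrontendRoute_Main` are assumed constructors of a hypothetical app's route GADTs, and the argument is assumed to be the encoder produced by 'checkEncoder':

```haskell
-- Sketch only: given a checked full-route encoder, render one backend URL
-- and one frontend URL as Text.
urls
  :: Encoder Identity Identity (R (FullRoute BackendRoute FrontendRoute)) PageName
  -> (Text, Text)
urls checkedEncoder =
  ( renderBackendRoute checkedEncoder $ BackendRoute_Api :/ ()
  , renderFrontendRoute checkedEncoder $ FrontendRoute_Main :/ ()
  )
```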
integralEncoder :: (MonadError Text parse, Applicative check, Integral a) => Encoder check parse a Integer
integralEncoder = prismEncoder (Numeric.Lens.integral)

pathSegmentEncoder :: (MonadError Text parse, Applicative check, Cons as as a a) =>
  Encoder check parse (a, (as, b)) (as, b)
pathSegmentEncoder = first (prismEncoder _Cons) . disassociate

newtype Decoder check parse b a = Decoder { toEncoder :: Encoder check parse a b }

dmapEncoder :: forall check parse k k' v.
  ( Monad check
  , MonadError Text parse
  , Universe (Some k')
  , Ord k
  , GCompare k'
  , GShow k'
  )
  => Encoder check parse (Some k') k
  -> (forall v'. k' v' -> Encoder check parse v' v)
  -> Encoder check parse (DMap k' Identity) (Map k v)
dmapEncoder keyEncoder' valueEncoderFor = unsafeEncoder $ do
  keyEncoder :: Encoder Identity parse (Some k') k <- checkEncoder keyEncoder'
  valueDecoders :: DMap k' (Decoder Identity parse v) <- fmap DMap.fromList . forM universe $ \(Some (k' :: k' t)) -> do
    ve :: Encoder Identity parse t v <- checkEncoder (valueEncoderFor k')
    return $ (k' :: k' t) :=> (Decoder ve :: Decoder Identity parse v t)
  let keyError k = "dmapEncoder: key `" <> k <> "' was missing from the Universe instance for its type."
  return $ EncoderImpl
    { _encoderImpl_encode = \dm -> Map.fromList $ do
        ((k' :: k' t) :=> Identity v') <- DMap.toList dm
        return ( encode keyEncoder (Some k')
               , encode (toEncoder (DMap.findWithDefault (error . keyError $ gshow k') k' valueDecoders)) v'
               )
    , _encoderImpl_decode = \m -> fmap DMap.fromList . forM (Map.toList m) $ \(k,v) -> do
        tryDecode keyEncoder k >>= \case
          Some (k' :: k' t) -> case DMap.lookup k' valueDecoders of
            Nothing -> throwError . T.pack . keyError $ gshow k'
            Just (Decoder e) -> do
              v' <- tryDecode e v
              return (k' :=> Identity v')
    }
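To illustrate how 'dmapEncoder' pairs a key encoder with per-key value encoders, here is a hedged sketch with a hypothetical `SettingKey` GADT (not part of this diff); the `Universe`, `GCompare`, and `GShow` instances it needs are assumed to be derived, e.g. via 'deriveRouteComponent':

```haskell
-- Sketch only: a dependently-keyed settings map encoded to Map Text Text.
data SettingKey :: * -> * where
  SettingKey_Name :: SettingKey Text
  SettingKey_Age  :: SettingKey Int

settingsEncoder
  :: (Monad check, MonadError Text parse)
  => Encoder check parse (Some SettingKey) Text -- ^ assumed key encoder
  -> Encoder check parse (DMap SettingKey Identity) (Map Text Text)
settingsEncoder keyEnc = dmapEncoder keyEnc $ \case
  SettingKey_Name -> id                 -- Text values pass through
  SettingKey_Age  -> unsafeTshowEncoder -- Int values via Show/Read
```

The per-key function is where the dependent typing pays off: each key picks a value encoder at the value's own type.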
fieldMapEncoder :: forall check parse r.
  ( Applicative check
  , MonadError Text parse
  , HasFields r
  , Universe (Some (Field r))
  , GShow (Field r)
  , GCompare (Field r)
  )
  => Encoder check parse r (DMap (Field r) Identity)
fieldMapEncoder = unsafeEncoder $ do
  pure $ EncoderImpl
    { _encoderImpl_encode = \r -> DMap.fromList [ f :=> Identity (indexField r f) | Some f <- universe ]
    , _encoderImpl_decode = \dm -> tabulateFieldsA $ \f -> do
        case DMap.lookup f dm of
          Nothing -> throwError $ "fieldMapEncoder: Couldn't find key for `" <> T.pack (gshow f) <> "' in DMap."
          Just (Identity v) -> return v
    }

-- 'Ap' is in base 4.12 (GHC 8.6) and later; this local copy supports older bases.
newtype Ap f a = Ap {getAp :: f a}

instance (Applicative f, Semigroup a) => Semigroup (Ap f a) where
  Ap x <> Ap y = Ap (liftA2 (<>) x y)

instance (Applicative f, Monoid a) => Monoid (Ap f a) where
  mappend = (<>)
  mempty = Ap (pure mempty)

pathFieldEncoder :: forall a p check parse . (HasFields a, Monad check, MonadError Text parse, GCompare (Field a)) => (forall x. Field a x -> Encoder check parse x p) -> Encoder check parse (a, [p]) [p]
pathFieldEncoder fieldEncoder = unsafeEncoder $ do
  fieldEncoderPureMap :: DMap.DMap (Field a) (Decoder Identity parse p) <- getAp $ getConst $ tabulateFieldsA @a $ \f -> Const (Ap $ fmap (DMap.singleton f . Decoder) $ (checkEncoder @Identity) $ fieldEncoder f)
  let fieldEncoderPure :: forall x. Field a x -> Encoder Identity parse x p
      fieldEncoderPure f = toEncoder (DMap.findWithDefault (error "bad") f fieldEncoderPureMap)
  pure $ EncoderImpl
    { _encoderImpl_encode = \(x, rest) -> execWriter $ do
        _ <- traverseWithField (\f x_i -> tell (pure $ encode (fieldEncoderPure f) x_i) *> pure x_i) x
        tell rest
    , _encoderImpl_decode = State.runStateT $ tabulateFieldsA $ \f -> State.get >>= \case
        [] -> throwError $ T.pack "not enough path components"
        p:ps -> do
          State.put ps
          lift $ tryDecode (fieldEncoderPure f) p
    }

-- | Use ToJSON/FromJSON to encode to Text. The correctness of this encoder depends on the encoding being injective and round-tripping correctly.
jsonEncoder :: forall check parse r.
  ( ToJSON r
  , FromJSON r
  , Applicative check
  , MonadError Text parse
  )
  => Encoder check parse r Text
jsonEncoder = unsafeEncoder $ do
  pure $ EncoderImpl
    { _encoderImpl_encode = \r -> T.decodeUtf8 . BSL.toStrict $ Aeson.encode r
    , _encoderImpl_decode = \t -> case Aeson.eitherDecodeStrict $ T.encodeUtf8 t of
        Left err -> throwError ("jsonEncoder: " <> T.pack err)
        Right x -> return x
    }
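The round-tripping property the 'jsonEncoder' comment demands can be checked with a small hedged sketch, using 'checkEncoder', 'encode', and 'tryDecode' as they appear elsewhere in this diff:

```haskell
-- Sketch only: encode a value to Text and decode it back, surfacing any
-- failure from either the encoder check or the decode step as Left.
roundTripJson :: forall r. (ToJSON r, FromJSON r) => r -> Either Text r
roundTripJson x = do
  e :: Encoder Identity (Either Text) r Text <- checkEncoder jsonEncoder
  tryDecode e (encode e x)
```

For any well-behaved `ToJSON`/`FromJSON` pair, `roundTripJson x` should be `Right x`; a `Left` here indicates the instances do not round-trip.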
-- Useful for app server integration.
-- p must not start with slashes
byteStringsToPageName :: BS.ByteString -> BS.ByteString -> PageName
byteStringsToPageName p q =
  let pageNameEncoder' :: Encoder Identity Identity PageName (String, String)
      pageNameEncoder' = bimap
        (unpackTextEncoder . pathSegmentsTextEncoder . listToNonEmptyEncoder)
        (unpackTextEncoder . queryParametersTextEncoder . toListMapEncoder)
  in decode pageNameEncoder' (T.unpack (T.decodeUtf8 p), T.unpack (T.decodeUtf8 q))

--TODO: decodeURIComponent as appropriate
@@ -3,17 +3,15 @@
{-# LANGUAGE RecursiveDo #-}
{-# LANGUAGE StandaloneDeriving #-}
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE DefaultSignatures #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE PatternSynonyms #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE ViewPatterns #-}
@@ -33,9 +31,14 @@ module Obelisk.Route.Frontend
  , mapRoutedT
  , subRoute
  , subRoute_
  , subPairRoute
  , subPairRoute_
  , maybeRoute
  , maybeRoute_
  , maybeRouted
  , eitherRoute
  , eitherRoute_
  , eitherRouted
  , runRouteViewT
  , SetRouteT(..)
  , SetRoute(..)
@@ -46,6 +49,10 @@ module Obelisk.Route.Frontend
  , runRouteToUrlT
  , mapRouteToUrlT
  , routeLink
  , routeLinkDynAttr
  , dynRouteLink
  , adaptedUriPath
  , setAdaptedUriPath
  ) where

import Prelude hiding ((.), id)
@@ -53,23 +60,24 @@ import Prelude hiding ((.), id)
import Obelisk.Route

import Control.Category (Category (..), (.))
import Control.Category.Cartesian
import Control.Category.Cartesian ((&&&))
import Control.Lens hiding (Bifunctor, bimap, universe, element)
import Control.Monad ((<=<))
import Control.Monad.Fix
import Control.Monad.Primitive
import Control.Monad.Reader
import Control.Monad.Ref
import Control.Monad.Trans
import Control.Monad.Trans.Control
import Control.Monad.Trans.Reader
import Data.Coerce
import Data.Constraint (Dict (..))
import Data.Dependent.Sum (DSum (..))
import Data.GADT.Compare
import Data.Map (Map)
import Data.Monoid
import Data.Proxy
import Data.Text (Text)
import qualified Data.Text as T
import Data.Functor.Compose
import Data.Functor.Misc
import Reflex.Class
import Reflex.Host.Class
import Reflex.PostBuild.Class
@@ -82,34 +90,47 @@ import Reflex.Dom.Builder.Class
import Data.Type.Coercion
import Language.Javascript.JSaddle --TODO: Get rid of this - other platforms can also be routed
import Reflex.Dom.Core
import qualified GHCJS.DOM as DOM
import qualified GHCJS.DOM.Types as DOM
import qualified GHCJS.DOM.Window as Window
import Network.URI
#if defined(ios_HOST_OS)
import Data.Maybe (fromMaybe)
import qualified Data.List as L
#endif

import Unsafe.Coerce

import Obelisk.Configs

infixr 5 :~
pattern (:~) :: Reflex t => f a -> Dynamic t a -> DSum f (Compose (Dynamic t) Identity)
pattern a :~ b <- a :=> (coerceDynamic . getCompose -> b)

class Routed t r m | m -> t r where
  askRoute :: m (Dynamic t r)
  default askRoute :: (Monad m', MonadTrans f, Routed t r m', m ~ f m') => m (Dynamic t r)
  askRoute = lift askRoute

instance Monad m => Routed t r (RoutedT t r m) where
  askRoute = RoutedT ask

instance (Monad m, Routed t r m) => Routed t r (ReaderT r' m)

newtype RoutedT t r m a = RoutedT { unRoutedT :: ReaderT (Dynamic t r) m a }
  deriving (Functor, Applicative, Monad, MonadFix, MonadTrans, NotReady t, MonadHold t, MonadSample t, PostBuild t, TriggerEvent t, MonadIO, MonadReflexCreateTrigger t, HasDocument)
  deriving (Functor, Applicative, Monad, MonadFix, MonadTrans, NotReady t, MonadHold t, MonadSample t, PostBuild t, TriggerEvent t, MonadIO, MonadReflexCreateTrigger t, HasDocument, DomRenderHook t)

instance MonadReader r' m => MonadReader r' (RoutedT t r m) where
  ask = lift ask
  local = mapRoutedT . local

instance HasJSContext m => HasJSContext (RoutedT t r m) where
  type JSContextPhantom (RoutedT t r m) = JSContextPhantom m
  askJSContext = lift askJSContext

instance Prerender js m => Prerender js (RoutedT t r m) where
  prerenderClientDict = fmap (\Dict -> Dict) (prerenderClientDict :: Maybe (Dict (PrerenderClientConstraint js m)))
instance (Prerender js t m, Monad m) => Prerender js t (RoutedT t r m) where
  type Client (RoutedT t r m) = RoutedT t r (Client m)
  prerender server client = RoutedT $ do
    r <- ask
    lift $ prerender (runRoutedT server r) (runRoutedT client r)

instance Requester t m => Requester t (RoutedT t r m) where
  type Request (RoutedT t r m) = Request m
@@ -161,17 +182,20 @@ instance (Monad m, MonadQuery t vs m) => MonadQuery t vs (RoutedT t r m) where
  askQueryResult = lift askQueryResult
  queryIncremental = lift . queryIncremental

instance (Monad m, RouteToUrl r m) => RouteToUrl r (QueryT t q m) where
  askRouteToUrl = lift askRouteToUrl
instance HasConfigs m => HasConfigs (RoutedT t r m)

instance (Monad m, SetRoute t r m) => SetRoute t r (QueryT t q m) where
  setRoute = lift . setRoute
  modifyRoute = lift . modifyRoute
instance (Monad m, RouteToUrl r m) => RouteToUrl r (QueryT t q m)

instance (Monad m, SetRoute t r m) => SetRoute t r (QueryT t q m)

instance (Monad m, RouteToUrl r m) => RouteToUrl r (EventWriterT t w m)

instance (Monad m, SetRoute t r m) => SetRoute t r (EventWriterT t w m)

runRoutedT :: RoutedT t r m a -> Dynamic t r -> m a
runRoutedT = runReaderT . unRoutedT

mapRoutedT :: (m a -> n a) -> RoutedT t r m a -> RoutedT t r n a
mapRoutedT :: (m a -> n b) -> RoutedT t r m a -> RoutedT t r n b
mapRoutedT f = RoutedT . mapReaderT f . unRoutedT

withRoutedT :: (Dynamic t r -> Dynamic t r') -> RoutedT t r' m a -> RoutedT t r m a
@@ -181,10 +205,18 @@ subRoute_ :: (MonadFix m, MonadHold t m, GEq r, Adjustable t m) => (forall a. r
subRoute_ f = factorRouted $ strictDynWidget_ $ \(c :=> r') -> do
  runRoutedT (f c) r'

-- | Like 'subRoute_', but with a pair rather than an R
subPairRoute_ :: (MonadFix m, MonadHold t m, Eq a, Adjustable t m) => (a -> RoutedT t b m ()) -> RoutedT t (a, b) m ()
subPairRoute_ f = withRoutedT (fmap (\(a, b) -> Const2 a :/ b)) $ subRoute_ (\(Const2 a) -> f a)

subRoute :: (MonadFix m, MonadHold t m, GEq r, Adjustable t m) => (forall a. r a -> RoutedT t a m b) -> RoutedT t (R r) m (Dynamic t b)
subRoute f = factorRouted $ strictDynWidget $ \(c :=> r') -> do
  runRoutedT (f c) r'

-- | Like 'subRoute', but with a pair rather than an R
subPairRoute :: (MonadFix m, MonadHold t m, Eq a, Adjustable t m) => (a -> RoutedT t b m c) -> RoutedT t (a, b) m (Dynamic t c)
subPairRoute f = withRoutedT (fmap (\(a, b) -> Const2 a :/ b)) $ subRoute (\(Const2 a) -> f a)

maybeRoute_ :: (MonadFix m, MonadHold t m, Adjustable t m) => m () -> RoutedT t r m () -> RoutedT t (Maybe r) m ()
maybeRoute_ n j = maybeRouted $ strictDynWidget_ $ \case
  Nothing -> n
@@ -201,6 +233,20 @@ maybeRoute f = factorRouted $ strictDynWidget $ \(c :=> r') -> do
  runRoutedT (f c) r'
-}

eitherRoute_
  :: (MonadFix m, MonadHold t m, Adjustable t m)
  => RoutedT t l m ()
  -> RoutedT t r m ()
  -> RoutedT t (Either l r) m ()
eitherRoute_ l r = eitherRouted $ strictDynWidget_ $ either (runRoutedT l) (runRoutedT r)

eitherRoute
  :: (MonadFix m, MonadHold t m, Adjustable t m)
  => RoutedT t l m a
  -> RoutedT t r m a
  -> RoutedT t (Either l r) m (Dynamic t a)
eitherRoute l r = eitherRouted $ strictDynWidget $ either (runRoutedT l) (runRoutedT r)

dsumValueCoercion :: Coercion f g -> Coercion (DSum k f) (DSum k g)
dsumValueCoercion Coercion = Coercion

@@ -217,6 +263,9 @@ maybeRouted r = RoutedT $ ReaderT $ \d -> do
  d' <- maybeDyn d
  runRoutedT r d'

eitherRouted :: (Reflex t, MonadFix m, MonadHold t m) => RoutedT t (Either (Dynamic t a) (Dynamic t b)) m c -> RoutedT t (Either a b) m c
eitherRouted r = RoutedT $ ReaderT $ runRoutedT r <=< eitherDyn

-- | WARNING: The input 'Dynamic' must be fully constructed when this is run
strictDynWidget :: (MonadSample t m, MonadHold t m, Adjustable t m) => (a -> m b) -> RoutedT t a m (Dynamic t b)
strictDynWidget f = RoutedT $ ReaderT $ \r -> do
@@ -231,7 +280,7 @@ strictDynWidget_ f = RoutedT $ ReaderT $ \r -> do
  pure ()

newtype SetRouteT t r m a = SetRouteT { unSetRouteT :: EventWriterT t (Endo r) m a }
  deriving (Functor, Applicative, Monad, MonadFix, MonadTrans, MonadIO, NotReady t, MonadHold t, MonadSample t, PostBuild t, TriggerEvent t, MonadReflexCreateTrigger t, HasDocument)
  deriving (Functor, Applicative, Monad, MonadFix, MonadTrans, MonadIO, NotReady t, MonadHold t, MonadSample t, PostBuild t, TriggerEvent t, MonadReflexCreateTrigger t, HasDocument, DomRenderHook t)

instance (MonadFix m, MonadHold t m, DomBuilder t m) => DomBuilder t (SetRouteT t r m) where
  type DomBuilderSpace (SetRouteT t r m) = DomBuilderSpace m
@@ -253,16 +302,26 @@ runSetRouteT = runEventWriterT . unSetRouteT
class Reflex t => SetRoute t r m | m -> t r where
  setRoute :: Event t r -> m ()
  modifyRoute :: Event t (r -> r) -> m ()
  default modifyRoute :: (Monad m', MonadTrans f, SetRoute t r m', m ~ f m') => Event t (r -> r) -> m ()
  modifyRoute = lift . modifyRoute

  setRoute = modifyRoute . fmap const

instance (Reflex t, Monad m) => SetRoute t r (SetRouteT t r m) where
  modifyRoute = SetRouteT . tellEvent . fmap Endo

instance (Monad m, SetRoute t r m) => SetRoute t r (RoutedT t r' m) where
  modifyRoute = lift . modifyRoute
instance (Monad m, SetRoute t r m) => SetRoute t r (RoutedT t r' m)

instance Prerender js m => Prerender js (SetRouteT t r m) where
  prerenderClientDict = fmap (\Dict -> Dict) (prerenderClientDict :: Maybe (Dict (PrerenderClientConstraint js m)))
instance (Monad m, SetRoute t r m) => SetRoute t r (ReaderT r' m)

instance (PerformEvent t m, Prerender js t m, Monad m, Reflex t) => Prerender js t (SetRouteT t r m) where
  type Client (SetRouteT t r m) = SetRouteT t r (Client m)
  prerender server client = do
    d <- lift $ prerender (runSetRouteT server) (runSetRouteT client)
    let (a, r) = splitDynPure d
    -- Must be prompt here
    SetRouteT . tellEvent $ switchPromptlyDyn r
    pure a

instance Requester t m => Requester t (SetRouteT t r m) where
  type Request (SetRouteT t r m) = Request m
@@ -270,6 +329,8 @@ instance Requester t m => Requester t (SetRouteT t r m) where
  requesting = SetRouteT . requesting
  requesting_ = SetRouteT . requesting_

instance (Monad m, SetRoute t r m) => SetRoute t r (RequesterT t req rsp m)

#ifndef ghcjs_HOST_OS
deriving instance MonadJSM m => MonadJSM (SetRouteT t r m)
#endif
@@ -293,6 +354,8 @@ instance PrimMonad m => PrimMonad (SetRouteT t r m ) where
  type PrimState (SetRouteT t r m) = PrimState m
  primitive = lift . primitive

instance HasConfigs m => HasConfigs (SetRouteT t r m)

instance (MonadHold t m, Adjustable t m) => Adjustable t (SetRouteT t r m) where
  runWithReplace a0 a' = SetRouteT $ runWithReplace (coerce a0) $ coerceEvent a'
  traverseIntMapWithKeyWithAdjust f a0 a' = SetRouteT $ traverseIntMapWithKeyWithAdjust (coerce f) (coerce a0) $ coerce a'
@@ -306,9 +369,11 @@ instance (Monad m, MonadQuery t vs m) => MonadQuery t vs (SetRouteT t r m) where

class RouteToUrl r m | m -> r where
  askRouteToUrl :: m (r -> Text)
  default askRouteToUrl :: (Monad m', MonadTrans f, RouteToUrl r m', m ~ f m') => m (r -> Text)
  askRouteToUrl = lift askRouteToUrl

newtype RouteToUrlT r m a = RouteToUrlT { unRouteToUrlT :: ReaderT (r -> Text) m a }
  deriving (Functor, Applicative, Monad, MonadFix, MonadTrans, NotReady t, MonadHold t, MonadSample t, PostBuild t, TriggerEvent t, MonadIO, MonadReflexCreateTrigger t, HasDocument)
  deriving (Functor, Applicative, Monad, MonadFix, MonadTrans, NotReady t, MonadHold t, MonadSample t, PostBuild t, TriggerEvent t, MonadIO, MonadReflexCreateTrigger t, HasDocument, DomRenderHook t)

runRouteToUrlT
  :: RouteToUrlT r m a
@@ -323,17 +388,22 @@ instance Monad m => RouteToUrl r (RouteToUrlT r m) where
  askRouteToUrl = RouteToUrlT ask

instance (Monad m, RouteToUrl r m) => RouteToUrl r (SetRouteT t r' m) where
  askRouteToUrl = lift askRouteToUrl

instance (Monad m, RouteToUrl r m) => RouteToUrl r (RoutedT t r' m) where
  askRouteToUrl = lift askRouteToUrl

instance (Monad m, RouteToUrl r m) => RouteToUrl r (ReaderT r' m)

instance (Monad m, RouteToUrl r m) => RouteToUrl r (RequesterT t req rsp m)

instance HasJSContext m => HasJSContext (RouteToUrlT r m) where
  type JSContextPhantom (RouteToUrlT r m) = JSContextPhantom m
  askJSContext = lift askJSContext

instance Prerender js m => Prerender js (RouteToUrlT r m) where
  prerenderClientDict = fmap (\Dict -> Dict) (prerenderClientDict :: Maybe (Dict (PrerenderClientConstraint js m)))
instance (Prerender js t m, Monad m) => Prerender js t (RouteToUrlT r m) where
  type Client (RouteToUrlT r m) = RouteToUrlT r (Client m)
  prerender server client = do
    r <- RouteToUrlT ask
    lift $ prerender (runRouteToUrlT server r) (runRouteToUrlT client r)

instance Requester t m => Requester t (RouteToUrlT r m) where
  type Request (RouteToUrlT r m) = Request m
@@ -383,6 +453,8 @@ instance (Monad m, MonadQuery t vs m) => MonadQuery t vs (RouteToUrlT r m) where
  askQueryResult = lift askQueryResult
  queryIncremental = lift . queryIncremental

instance HasConfigs m => HasConfigs (RouteToUrlT t m)

runRouteViewT
  :: forall t m r a.
  ( TriggerEvent t m
@@ -393,16 +465,20 @@ runRouteViewT
  , MonadFix m
  )
  => (Encoder Identity Identity r PageName)
  --TODO: Get rid of the switchover and useHash arguments
  -- useHash can probably be baked into the encoder
  -> Event t () -- ^ Switchover event; nothing is done until this event fires. Used to prevent incorrect DOM expectations at hydration switchover time
  -> Bool
  -> RoutedT t r (SetRouteT t r (RouteToUrlT r m)) a
  -> m a
runRouteViewT routeEncoder a = do
  rec historyState <- manageHistory $ HistoryCommand_PushState <$> setState
runRouteViewT routeEncoder switchover useHash a = do
  rec historyState <- manageHistory' switchover $ HistoryCommand_PushState <$> setState
      let theEncoder = pageNameEncoder . hoistParse (pure . runIdentity) routeEncoder
          -- NB: This can only fail if the uriPath doesn't begin with a '/' or if the uriQuery
          -- is nonempty but begins with a character that isn't '?'. Since we don't expect
          -- this ever to happen, we'll just handle it by failing completely with 'error'.
          route :: Dynamic t r
          route = fmap (errorLeft . tryDecode theEncoder . (adaptedUriPath &&& uriQuery) . _historyItem_uri) historyState
          route = fmap (errorLeft . tryDecode theEncoder . (adaptedUriPath useHash &&& uriQuery) . _historyItem_uri) historyState
            where
              errorLeft (Left e) = error (T.unpack e)
              errorLeft (Right x) = x
@@ -423,7 +499,7 @@ runRouteViewT routeEncoder a = do
            -- we can change this function later to accommodate.
            -- See: https://github.com/whatwg/html/issues/2174
            , _historyStateUpdate_title = ""
            , _historyStateUpdate_uri = Just $ setAdaptedUriPath newPath $ (_historyItem_uri currentHistoryState)
            , _historyStateUpdate_uri = Just $ setAdaptedUriPath useHash newPath $ (_historyItem_uri currentHistoryState)
              { uriQuery = newQuery
              }
            }
@@ -433,42 +509,141 @@ runRouteViewT routeEncoder a = do
-- | A link widget that, when clicked, sets the route to the provided route. In non-JavaScript
-- contexts, this widget falls back to using @href@s to control navigation
routeLink
  :: forall t m a route.
  :: forall t m a route js.
  ( DomBuilder t m
  , RouteToUrl (R route) m
  , SetRoute t (R route) m
  , RouteToUrl route m
  , SetRoute t route m
  , Prerender js t m
  )
  => R route -- ^ Target route
  => route -- ^ Target route
  -> m a -- ^ Child widget
  -> m a
routeLink r w = do
  (e, a) <- routeLinkImpl r w
  scrollToTop e
  return a

-- | Raw implementation of 'routeLink'. Does not scroll to the top of the page on clicks.
routeLinkImpl
  :: forall t m a route.
  ( DomBuilder t m
  , RouteToUrl route m
  , SetRoute t route m
  )
  => route -- ^ Target route
  -> m a -- ^ Child widget
  -> m (Event t (), a)
routeLinkImpl r w = do
  enc <- askRouteToUrl
  let cfg = (def :: ElementConfig EventResult t (DomBuilderSpace m))
        & elementConfig_eventSpec %~ addEventSpecFlags (Proxy :: Proxy (DomBuilderSpace m)) Click (\_ -> preventDefault)
        & elementConfig_initialAttributes .~ "href" =: enc r
  (e, a) <- element "a" cfg w
  setRoute $ r <$ domEvent Click e
  return (domEvent Click e, a)

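A hedged sketch of 'routeLink' in use; `FrontendRoute_Main` is an assumed constructor of a hypothetical app's frontend route GADT:

```haskell
-- Sketch only: a navigation widget rendering one link. Clicks set the route
-- client-side; without JavaScript the rendered href still navigates.
nav
  :: ( DomBuilder t m
     , RouteToUrl (R FrontendRoute) m
     , SetRoute t (R FrontendRoute) m
     , Prerender js t m
     )
  => m ()
nav = do
  _ <- routeLink (FrontendRoute_Main :/ ()) $ text "Home"
  pure ()
```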
scrollToTop :: forall m t js. (Prerender js t m, Monad m) => Event t () -> m ()
scrollToTop e = prerender_ blank $ performEvent_ $ ffor e $ \_ -> liftJSM $ DOM.currentWindow >>= \case
  Nothing -> pure ()
  Just win -> Window.scrollTo win 0 0

-- | Like 'routeLinkDynAttr' but without custom attributes.
dynRouteLink
  :: forall t m a route js.
  ( DomBuilder t m
  , PostBuild t m
  , RouteToUrl route m
  , SetRoute t route m
  , Prerender js t m
  )
  => Dynamic t route -- ^ Target route
  -> m a -- ^ Child widget
  -> m a
dynRouteLink r w = do
  (e, a) <- dynRouteLinkImpl r w
  scrollToTop e
  return a

-- | Raw implementation of 'dynRouteLink'. Does not scroll to the top of the page on clicks.
dynRouteLinkImpl
  :: forall t m a route.
  ( DomBuilder t m
  , PostBuild t m
  , RouteToUrl route m
  , SetRoute t route m
  )
  => Dynamic t route -- ^ Target route
  -> m a -- ^ Child widget
  -> m (Event t (), a)
dynRouteLinkImpl dr w = do
  enc <- askRouteToUrl
  er <- dynamicAttributesToModifyAttributes $ ("href" =:) . enc <$> dr
  let cfg = (def :: ElementConfig EventResult t (DomBuilderSpace m))
        & elementConfig_eventSpec %~ addEventSpecFlags (Proxy :: Proxy (DomBuilderSpace m)) Click (\_ -> preventDefault)
        & elementConfig_modifyAttributes .~ er
  (e, a) <- element "a" cfg w
  let clk = domEvent Click e
  setRoute $ tag (current dr) clk
  return (clk, a)

-- | An @a@-tag link widget that, when clicked, sets the route to the current value of the
-- provided dynamic route. In non-JavaScript contexts the post-build value of the dynamic
-- is used, so the link still works like 'routeLink'.
routeLinkDynAttr
  :: forall t m a route js.
  ( DomBuilder t m
  , PostBuild t m
  , RouteToUrl (R route) m
  , SetRoute t (R route) m
  , Prerender js t m
  )
  => Dynamic t (Map AttributeName Text) -- ^ Attributes for the @a@ element. Note that if @href@ is present it will be ignored
  -> Dynamic t (R route) -- ^ Target route
  -> m a -- ^ Child widget of the @a@ element
  -> m a
routeLinkDynAttr dAttr dr w = do
  (e, a) <- routeLinkDynAttrImpl dAttr dr w
  scrollToTop e
  return a

-- | Raw implementation of 'routeLinkDynAttr'. Does not scroll to the top of the page on clicks.
routeLinkDynAttrImpl
  :: forall t m a route.
  ( DomBuilder t m
  , PostBuild t m
  , RouteToUrl (R route) m
  , SetRoute t (R route) m
  )
  => Dynamic t (Map AttributeName Text) -- ^ Attributes for the @a@ element. Note that if @href@ is present it will be ignored
  -> Dynamic t (R route) -- ^ Target route
  -> m a -- ^ Child widget of the @a@ element
  -> m (Event t (), a)
routeLinkDynAttrImpl dAttr dr w = do
  enc <- askRouteToUrl
  er <- dynamicAttributesToModifyAttributes $ zipDynWith (<>) (("href" =:) . enc <$> dr) dAttr
  let cfg = (def :: ElementConfig EventResult t (DomBuilderSpace m))
        & elementConfig_eventSpec %~ addEventSpecFlags (Proxy :: Proxy (DomBuilderSpace m)) Click (\_ -> preventDefault)
        & elementConfig_modifyAttributes .~ er
  (e, a) <- element "a" cfg w
  let clk = domEvent Click e
  setRoute $ tag (current dr) clk
  return (clk, a)

-- On iOS, due to sandboxing, when loading the page from a file adapt the
-- path to be based on the hash.

adaptedUriPath :: URI -> String
#if defined(ios_HOST_OS)
adaptedUriPath = hashToPath . uriFragment
adaptedUriPath :: Bool -> URI -> String
adaptedUriPath = \case
  True -> hashToPath . uriFragment
  False -> uriPath

hashToPath :: String -> String
hashToPath = ('/' :) . fromMaybe "" . L.stripPrefix "#"
#else
adaptedUriPath = uriPath
#endif

setAdaptedUriPath :: String -> URI -> URI
#if defined(ios_HOST_OS)
setAdaptedUriPath s u = u { uriFragment = pathToHash s }
setAdaptedUriPath :: Bool -> String -> URI -> URI
setAdaptedUriPath useHash s u = case useHash of
  True -> u { uriFragment = pathToHash s }
  False -> u { uriPath = s }

pathToHash :: String -> String
pathToHash = ('#' :) . fromMaybe "" . L.stripPrefix "/"
#else
setAdaptedUriPath s u = u { uriPath = s }
#endif

hashToPath :: String -> String
hashToPath = ('/' :) . fromMaybe "" . L.stripPrefix "#"
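The two path/hash helpers are inverses on slash-prefixed paths. A standalone sketch (duplicating the helpers so it has no Obelisk dependency) demonstrating the round-trip:

```haskell
import Data.Maybe (fromMaybe)
import qualified Data.List as L

-- Copies of the helpers from Obelisk.Route.Frontend, for demonstration.
pathToHash :: String -> String
pathToHash = ('#' :) . fromMaybe "" . L.stripPrefix "/"

hashToPath :: String -> String
hashToPath = ('/' :) . fromMaybe "" . L.stripPrefix "#"

main :: IO ()
main = do
  -- "/users/42" becomes "#users/42" in hash-based routing, and back again.
  print (pathToHash "/users/42")
  print (hashToPath (pathToHash "/users/42"))
```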
@@ -1,8 +1,9 @@
module Obelisk.Route.TH (deriveRouteComponent) where

import Data.Constraint.Extras.TH
import Data.GADT.Show.TH
import Data.GADT.Compare.TH
import Data.Universe.TH
import Data.Universe.Some.TH
import Language.Haskell.TH

-- | Derive all the typeclasses needed for a RouteComponent type. The argument should be the name of a type of kind @k -> *@
@@ -11,8 +12,6 @@ deriveRouteComponent x = concat <$> traverse ($ x)
  [ deriveGShow
  , deriveGEq
  , deriveGCompare
  , deriveShowTagIdentity
  , deriveEqTagIdentity
  , deriveOrdTagIdentity
  , deriveSomeUniverse
  , deriveUniverseSome
  , deriveArgDict
  ]
@@ -10,9 +10,12 @@ library
    attoparsec
    , base
    , bytestring
    , containers
    , cookie
    , dependent-sum
    , dependent-sum-template
    , ghcjs-dom
    , HsOpenSSL
    , http-client
    , http-reverse-proxy
    , http-types
@@ -24,8 +27,7 @@ library
    , network
    , obelisk-asset-serve-snap
    , obelisk-backend
    , obelisk-executable-config
    , obelisk-executable-config-inject
    , obelisk-executable-config-lookup
    , obelisk-frontend
    , obelisk-route
    , process
@@ -35,11 +37,17 @@ library
    , snap-core
    , streaming-commons
    , text
    , time
    , universe
    , utf8-string
    , wai
    , wai-websockets
    , warp
    , warp-tls
    , websockets
  exposed-modules:
    Obelisk.Run
  ghc-options: -Wall -fwarn-redundant-constraints
  if os(linux)
    cpp-options: -DIPROUTE_SUPPORTED
    build-depends: which
@@ -1,17 +1,17 @@
{-# LANGUAGE CPP #-}
#if defined(IPROUTE_SUPPORTED)
{-# LANGUAGE TemplateHaskell #-}
#endif

{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE EmptyCase #-}
{-# LANGUAGE EmptyDataDecls #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE Rank2Types #-}
{-# LANGUAGE RecursiveDo #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE UndecidableInstances #-}
{-# OPTIONS_GHC -fno-warn-orphans #-} -- Due to instance HasJS x (EventWriterT t w m)
@@ -25,17 +25,22 @@ import Control.Exception
import Control.Lens ((%~), (^?), _Just, _Right)
import qualified Data.Attoparsec.ByteString.Char8 as A
import Data.ByteString (ByteString)
import qualified Data.ByteString as BS
import qualified Data.ByteString.Char8 as BSC
import qualified Data.ByteString.Lazy.Char8 as BSLC
import Data.Dependent.Sum (DSum (..))
import qualified Data.ByteString.UTF8 as BSUTF8
import Data.Functor.Identity
import Data.Functor.Sum
import Data.List (uncons)
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Maybe
import Data.Semigroup ((<>))
import Data.Streaming.Network (bindPortTCP)
import Data.String (fromString)
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Data.Time.Clock (getCurrentTime, addUTCTime)
import Language.Javascript.JSaddle.Run (syncPoint)
import Language.Javascript.JSaddle.WebSockets
import Network.HTTP.Client (Manager, defaultManagerSettings, newManager)
@@ -45,29 +50,36 @@ import Network.Socket
import Network.Wai (Application)
import qualified Network.Wai as W
import Network.Wai.Handler.Warp
import Network.Wai.Handler.WarpTLS
import Network.Wai.Handler.Warp.Internal (settingsHost, settingsPort)
import Network.WebSockets (ConnectionOptions)
import Network.WebSockets.Connection (defaultConnectionOptions)
import qualified Obelisk.Asset.Serve.Snap as Snap
import Obelisk.ExecutableConfig (get)
import Obelisk.ExecutableConfig.Inject (injectExecutableConfigs)
import Obelisk.Backend
import Obelisk.Frontend
import Obelisk.Route.Frontend
import qualified OpenSSL.PEM as PEM
import qualified OpenSSL.RSA as RSA
import qualified OpenSSL.X509 as X509
import qualified OpenSSL.X509.Request as X509Request
import Reflex.Dom.Core
import Snap.Core (Snap)
import System.Environment
import System.IO
import System.Process
import System.Exit (ExitCode(..))
import Text.URI (URI)
import qualified Text.URI as URI
import Text.URI.Lens
import Obelisk.Backend
import Web.Cookie

#if defined(IPROUTE_SUPPORTED)
import qualified System.Which
#endif

run
  :: Int -- ^ Port to run the backend
  -> ([Text] -> Snap ()) -- ^ Static asset handler
  -> Backend fullRoute frontendRoute -- ^ Backend
  -> Backend backendRoute frontendRoute -- ^ Backend
  -> Frontend (R frontendRoute) -- ^ Frontend
  -> IO ()
run port serveStaticAsset backend frontend = do
@@ -77,82 +89,126 @@ run port serveStaticAsset backend frontend = do
  case checkEncoder $ _backend_routeEncoder backend of
    Left e -> hPutStrLn stderr $ "backend error:\n" <> T.unpack e
    Right validFullEncoder -> do
      backendTid <- forkIO $ handle handleBackendErr $ withArgs ["--quiet", "--port", show port] $ do
        _backend_run backend $ \serveRoute -> do
          runSnapWithCommandLineArgs $ do
      publicConfigs <- getPublicConfigs
      backendTid <- forkIO $ handle handleBackendErr $ withArgs ["--quiet", "--port", show port] $
        _backend_run backend $ \serveRoute ->
          runSnapWithCommandLineArgs $
            getRouteWith validFullEncoder >>= \case
              Identity r -> case r of
                InL backendRoute :=> Identity a -> serveRoute $ backendRoute :/ a
                InR obeliskRoute :=> Identity a ->
                  serveDefaultObeliskApp (mkRouteToUrl validFullEncoder) serveStaticAsset frontend $ obeliskRoute :/ a
                FullRoute_Backend backendRoute :/ a -> serveRoute $ backendRoute :/ a
                FullRoute_Frontend obeliskRoute :/ a ->
                  serveDefaultObeliskApp appRouteToUrl (($ allJsUrl) <$> defaultGhcjsWidgets) serveStaticAsset frontend publicConfigs $ obeliskRoute :/ a
                  where
                    appRouteToUrl (k :/ v) = renderObeliskRoute validFullEncoder (FullRoute_Frontend (ObeliskRoute_App k) :/ v)
                    allJsUrl = renderAllJsPath validFullEncoder

      let conf = defRunConfig { _runConfig_redirectPort = port }
      runWidget conf frontend validFullEncoder `finally` killThread backendTid
      runWidget conf publicConfigs frontend validFullEncoder `finally` killThread backendTid

-- Convenience wrapper to handle path segments for 'Snap.serveAsset'
runServeAsset :: FilePath -> [Text] -> Snap ()
runServeAsset rootPath = Snap.serveAsset "" rootPath . T.unpack . T.intercalate "/"

getConfigRoute :: IO (Maybe URI)
getConfigRoute = get "config/common/route" >>= \case
  Just r -> case URI.mkURI $ T.strip r of
    Just route -> pure $ Just route
    Nothing -> do
      putStrLn $ "Route is invalid: " <> show r
      pure Nothing
  Nothing -> pure Nothing
getConfigRoute :: Map Text ByteString -> Either Text URI
getConfigRoute configs = case Map.lookup "common/route" configs of
  Just r ->
    let stripped = T.strip (T.decodeUtf8 r)
    in case URI.mkURI stripped of
      Just route -> Right route
      Nothing -> Left $ "Couldn't parse route as URI; value read was: " <> T.pack (show stripped)
  Nothing -> Left $ "Couldn't find config file common/route; it should contain the site's canonical root URI" <> T.pack (show $ Map.keys configs)

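The reworked `getConfigRoute` above swaps IO-plus-`Maybe` for a pure `Either Text URI` over the pre-loaded config map, so callers get a descriptive error instead of a silent default. A simplified, self-contained sketch of the same lookup-then-validate shape (a hypothetical `getConfigRoute'` using `String` in place of `Text`/`ByteString`, and a crude scheme-prefix check standing in for `URI.mkURI`):

```haskell
import Data.Char (isSpace)
import Data.List (dropWhileEnd, isPrefixOf)
import Data.Map (Map)
import qualified Data.Map as Map

-- Pure lookup-then-validate: a missing key and an unparseable
-- value produce distinct, descriptive errors.
getConfigRoute' :: Map String String -> Either String String
getConfigRoute' configs = case Map.lookup "common/route" configs of
  Nothing -> Left "Couldn't find config file common/route"
  Just raw ->
    -- Trim whitespace (the file typically ends in a newline).
    let stripped = dropWhileEnd isSpace (dropWhile isSpace raw)
    in if any (`isPrefixOf` stripped) ["http://", "https://"]
         then Right stripped
         else Left ("Couldn't parse route as URI; value read was: " <> show stripped)

main :: IO ()
main = do
  print (getConfigRoute' (Map.fromList [("common/route", "https://example.com\n")]))
  print (getConfigRoute' Map.empty)
```

Returning `Either` lets the caller decide how to fail; in the diff, `runWidget` turns the `Left` into `fail` rather than falling back to a default URI.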
defAppUri :: URI
defAppUri = fromMaybe (error "defAppUri") $ URI.mkURI "http://127.0.0.1:8000"

runWidget :: RunConfig -> Frontend (R route) -> Encoder Identity Identity (R (Sum backendRoute (ObeliskRoute route))) PageName -> IO ()
runWidget conf frontend validFullEncoder = do
  uri <- fromMaybe defAppUri <$> getConfigRoute
runWidget
  :: RunConfig
  -> Map Text ByteString
  -> Frontend (R frontendRoute)
  -> Encoder Identity Identity (R (FullRoute backendRoute frontendRoute)) PageName
  -> IO ()
runWidget conf configs frontend validFullEncoder = do
  uri <- either (fail . T.unpack) pure $ getConfigRoute configs
  let port = fromIntegral $ fromMaybe 80 $ uri ^? uriAuthority . _Right . authPort . _Just
      redirectHost = _runConfig_redirectHost conf
      redirectPort = _runConfig_redirectPort conf
      beforeMainLoop = do
        putStrLn $ "Frontend running on " <> T.unpack (URI.render uri)
      settings = setBeforeMainLoop beforeMainLoop (setPort port (setTimeout 3600 defaultSettings))
      -- Providing TLS here will also incidentally provide it to proxied requests to the backend.
      prepareRunner = case uri ^? uriScheme . _Just . unRText of
        Just "https" -> do
          -- Generate a private key and self-signed certificate for TLS
          privateKey <- RSA.generateRSAKey' 2048 3

          certRequest <- X509Request.newX509Req
          _ <- X509Request.setPublicKey certRequest privateKey
          _ <- X509Request.signX509Req certRequest privateKey Nothing

          cert <- X509.newX509 >>= X509Request.makeX509FromReq certRequest
          _ <- X509.setPublicKey cert privateKey
          now <- getCurrentTime
          _ <- X509.setNotBefore cert $ addUTCTime (-1) now
          _ <- X509.setNotAfter cert $ addUTCTime (365 * 24 * 60 * 60) now
          _ <- X509.signX509 cert privateKey Nothing

          certByteString <- BSUTF8.fromString <$> PEM.writeX509 cert
          privateKeyByteString <- BSUTF8.fromString <$> PEM.writePKCS8PrivateKey privateKey Nothing

          return $ runTLSSocket (tlsSettingsMemory certByteString privateKeyByteString)
        _ -> return runSettingsSocket
  runner <- prepareRunner
  bracket
    (bindPortTCPRetry settings (logPortBindErr port) (_runConfig_retryTimeout conf))
    close
    (\skt -> do
      man <- newManager defaultManagerSettings
      app <- obeliskApp defaultConnectionOptions frontend validFullEncoder uri $ fallbackProxy redirectHost redirectPort man
      runSettingsSocket settings skt app)
      app <- obeliskApp configs defaultConnectionOptions frontend validFullEncoder uri $ fallbackProxy redirectHost redirectPort man
      runner settings skt app)

obeliskApp
  :: forall route backendRoute. ConnectionOptions
  -> Frontend (R route)
  -> Encoder Identity Identity (R (Sum backendRoute (ObeliskRoute route))) PageName
  :: forall frontendRoute backendRoute
  .  Map Text ByteString
  -> ConnectionOptions
  -> Frontend (R frontendRoute)
  -> Encoder Identity Identity (R (FullRoute backendRoute frontendRoute)) PageName
  -> URI
  -> Application
  -> IO Application
obeliskApp opts frontend validFullEncoder uri backend = do
  let entryPoint = do
        runFrontend validFullEncoder frontend
obeliskApp configs opts frontend validFullEncoder uri backend = do
  let mode = FrontendMode
        { _frontendMode_hydrate = True
        , _frontendMode_adjustRoute = False
        }
      entryPoint = do
        runFrontendWithConfigsAndCurrentRoute mode configs validFullEncoder frontend
        syncPoint
  jsaddlePath <- URI.mkPathPiece "jsaddle"
  let jsaddleUri = BSLC.fromStrict $ URI.renderBs $ uri & uriPath %~ (<>[jsaddlePath])
  Right (jsaddleWarpRouteValidEncoder :: Encoder Identity (Either Text) (R JSaddleWarpRoute) PageName) <- return $ checkEncoder jsaddleWarpRouteEncoder
  jsaddle <- jsaddleWithAppOr opts entryPoint $ \_ sendResponse -> sendResponse $ W.responseLBS H.status500 [("Content-Type", "text/plain")] "obeliskApp: jsaddle got a bad URL"
  return $ \req sendResponse -> case tryDecode validFullEncoder (W.pathInfo req, mempty) of --TODO: Query strings
  return $ \req sendResponse -> case tryDecode validFullEncoder $ byteStringsToPageName (BS.dropWhile (== (fromIntegral $ fromEnum '/')) $ W.rawPathInfo req) (BS.drop 1 $ W.rawQueryString req) of
    Identity r -> case r of
      InR (ObeliskRoute_Resource ResourceRoute_JSaddleWarp) :=> Identity jsaddleRoute -> case jsaddleRoute of
      FullRoute_Frontend (ObeliskRoute_Resource ResourceRoute_JSaddleWarp) :/ jsaddleRoute -> case jsaddleRoute of
        JSaddleWarpRoute_JavaScript :/ () -> sendResponse $ W.responseLBS H.status200 [("Content-Type", "application/javascript")] $ jsaddleJs' (Just jsaddleUri) False
        _ -> flip jsaddle sendResponse $ req
          { W.pathInfo = fst $ encode jsaddleWarpRouteValidEncoder jsaddleRoute
          }
      InR (ObeliskRoute_App appRouteComponent) :=> Identity appRouteRest -> do
        html <- renderJsaddleFrontend (mkRouteToUrl validFullEncoder) (appRouteComponent :/ appRouteRest) frontend
      FullRoute_Frontend (ObeliskRoute_App appRouteComponent) :/ appRouteRest -> do
        let cookies = maybe [] parseCookies $ lookup (fromString "Cookie") (W.requestHeaders req)
            routeToUrl (k :/ v) = renderObeliskRoute validFullEncoder $ FullRoute_Frontend (ObeliskRoute_App k) :/ v
        html <- renderJsaddleFrontend configs cookies routeToUrl (appRouteComponent :/ appRouteRest) frontend
        sendResponse $ W.responseLBS H.status200 [("Content-Type", staticRenderContentType)] $ BSLC.fromStrict html
      _ -> backend req sendResponse

renderJsaddleFrontend :: (route -> Text) -> route -> Frontend route -> IO ByteString
renderJsaddleFrontend urlEnc r f =
renderJsaddleFrontend
  :: Map Text ByteString
  -> Cookies
  -> (route -> Text)
  -> route
  -> Frontend route
  -> IO ByteString
renderJsaddleFrontend configs cookies urlEnc r f =
  let jsaddleScript = elAttr "script" ("src" =: "/jsaddle/jsaddle.js") blank
      jsaddlePreload = elAttr "link" ("rel" =: "preload" <> "as" =: "script" <> "href" =: "/jsaddle/jsaddle.js") blank
  in renderFrontendHtml urlEnc r (_frontend_head f >> injectExecutableConfigs >> jsaddlePreload) (_frontend_body f >> jsaddleScript)
  in renderFrontendHtml configs cookies urlEnc r f jsaddlePreload jsaddleScript

-- | like 'bindPortTCP' but reconnects on exception
bindPortTCPRetry :: Settings
@@ -169,17 +225,22 @@ logPortBindErr p e = getProcessIdForPort p >>= \case
  Nothing -> putStrLn $ "runWidget: " <> show e
  Just pid -> putStrLn $ unwords [ "Port", show p, "is being used by process ID", show pid <> ".", "Please kill that process or change the port in config/common/route."]

ssPath :: Maybe String
ssPath =
#if defined(IPROUTE_SUPPORTED)
  Just $(System.Which.staticWhich "ss")
#else
  Nothing
#endif

getProcessIdForPort :: Int -> IO (Maybe Int)
getProcessIdForPort port = do
  -- First check if 'ss' is available
  (c, _, _) <- readProcessWithExitCode "which" ["ss"] mempty
  case c of
    ExitSuccess -> do
      xs <- lines <$> readProcess "ss" ["-lptn", "sport = " <> show port] mempty
      case uncons xs of
        Just (_, x:_) -> return $ A.maybeResult $ A.parse parseSsPid $ BSC.pack x
        _ -> return Nothing
    _ -> return Nothing
getProcessIdForPort port = case ssPath of
  Just ss -> do
    xs <- lines <$> readProcess ss ["-lptn", "sport = " <> show port] mempty
    case uncons xs of
      Just (_, x:_) -> return $ A.maybeResult $ A.parse parseSsPid $ BSC.pack x
      _ -> return Nothing
  _ -> return Nothing

parseSsPid :: A.Parser Int
parseSsPid = do
@@ -11,15 +11,17 @@ library
    base
    , containers
    , directory
    , filepath
    , hspec
    , http-client
    , http-types
    , HUnit
    , obelisk-cliapp
    , obelisk-run
    , process
    , network
    , shelly
    , obelisk-executable-config-lookup
    , obelisk-run
    , which
    , process
    , shelly >= 1.9.0
    , temporary
    , text
  exposed-modules:
@ -1,39 +1,43 @@
|
||||
{-# LANGUAGE DoAndIfThenElse #-}
|
||||
{-# LANGUAGE EmptyCase #-}
|
||||
{-# LANGUAGE LambdaCase #-}
|
||||
{-# LANGUAGE OverloadedStrings #-}
|
||||
{-# LANGUAGE PatternGuards #-}
|
||||
{-# LANGUAGE ScopedTypeVariables #-}
|
||||
{-# LANGUAGE TemplateHaskell #-}
|
||||
{-# LANGUAGE TypeApplications #-}
|
||||
{-# LANGUAGE ViewPatterns #-}
|
||||
module Obelisk.SelfTest where
|
||||
|
||||
import Control.Exception (bracket, throw)
|
||||
import Control.Concurrent (threadDelay)
|
||||
import Control.Exception (bracket, throw, try)
|
||||
import Control.Monad
|
||||
import Control.Monad.IO.Class
|
||||
import Data.Bool (bool)
|
||||
import Data.Function (fix)
|
||||
import Data.Semigroup (Semigroup, (<>))
|
||||
import Data.Semigroup ((<>))
|
||||
import qualified Data.Set as Set
|
||||
import Data.String
|
||||
import Data.Text (Text)
|
||||
import Data.Void
|
||||
import qualified Data.Text as T
|
||||
import qualified Data.Text.IO as T
|
||||
import qualified Network.HTTP.Client as HTTP
|
||||
import qualified Network.HTTP.Types as HTTP
|
||||
import qualified Network.Socket as Socket
|
||||
import Shelly
|
||||
import System.Directory (withCurrentDirectory, getDirectoryContents)
|
||||
import System.Directory (getCurrentDirectory, withCurrentDirectory, getDirectoryContents)
|
||||
import System.Environment
|
||||
import System.Exit (ExitCode (..))
|
||||
import System.Info
|
||||
import System.IO (Handle, hClose)
|
||||
import System.FilePath (addTrailingPathSeparator)
|
||||
import qualified System.Info
|
||||
import System.IO (Handle, hClose, hIsEOF, hGetContents)
|
||||
import System.IO.Temp
|
||||
import System.Process (readProcessWithExitCode, CreateProcess(cwd), readCreateProcessWithExitCode, proc)
|
||||
import System.Process (readProcessWithExitCode)
|
||||
import System.Which (staticWhich)
|
||||
import Test.Hspec
|
||||
import Test.HUnit.Base
|
||||
|
||||
import Obelisk.CliApp hiding (runCli, readCreateProcessWithExitCode)
|
||||
import qualified Obelisk.CliApp as CliApp
|
||||
import Obelisk.ExecutableConfig.Lookup (getConfigs)
|
||||
import Obelisk.Run (getConfigRoute)
|
||||
|
||||
data ObRunState
|
||||
@ -42,24 +46,47 @@ data ObRunState
|
||||
| ObRunState_BackendStarted
|
||||
deriving (Eq, Show)
|
||||
|
||||
doubleQuotes :: (IsString a, Semigroup a) => a -> a
|
||||
doubleQuotes s = "\"" <> s <> "\""
|
||||
cpPath :: FilePath
|
||||
cpPath = $(staticWhich "cp")
|
||||
|
||||
cabalPath :: FilePath
|
||||
cabalPath = $(staticWhich "cabal")
|
||||
|
||||
gitPath :: FilePath
|
||||
gitPath = $(staticWhich "git")
|
||||
|
||||
chownPath :: FilePath
|
||||
chownPath = $(staticWhich "chown")
|
||||
|
||||
chmodPath :: FilePath
|
||||
chmodPath = $(staticWhich "chmod")
|
||||
|
||||
whoamiPath :: FilePath
|
||||
whoamiPath = $(staticWhich "whoami")
|
||||
|
||||
nixBuildPath :: FilePath
|
||||
nixBuildPath = $(staticWhich "nix-build")
|
||||
|
||||
lnPath :: FilePath
|
||||
lnPath = $(staticWhich "ln")
|
||||
|
||||
rmPath :: FilePath
|
||||
rmPath = $(staticWhich "rm")
|
||||
|
||||
rsyncPath :: FilePath
|
||||
rsyncPath = $(staticWhich "rsync")
|
||||
|
||||
gitUserConfig :: [Text]
|
||||
gitUserConfig = ["-c", "user.name=Obelisk Selftest", "-c", "user.email=noreply@example.com"]
|
||||
|
||||
commit :: Text -> Sh ()
|
||||
commit msg = void $ run "git"
|
||||
[ "commit"
|
||||
commit msg = void $ run gitPath $ gitUserConfig <> [ "commit"
|
||||
, "--no-gpg-sign"
|
||||
, "--allow-empty"
|
||||
, "-m"
|
||||
, doubleQuotes msg
|
||||
, msg
|
||||
]
|
||||
|
||||
-- TODO replace Void and stop using loosely typed synchronous exceptions
|
||||
runCli :: MonadIO m => CliT Void IO a -> m a
|
||||
runCli f = liftIO $ do
|
||||
c <- newCliConfig Notice False False (\case {})
|
||||
CliApp.runCli c f
|
||||
|
||||
tshow :: Show a => a -> Text
|
||||
tshow = T.pack . show
|
||||
|
||||
@ -67,194 +94,294 @@ tshow = T.pack . show
|
||||
shellyOb :: MonadIO m => (Sh a -> Sh a) -> Sh a -> m a
|
||||
shellyOb f obTest = shelly $ f obTest
|
||||
|
||||
-- Set "ob" in a single place so it can be replaced with a
|
||||
-- link to obelisk in the nix store in the future,
|
||||
-- and avoid PATH hacking before calling this script.
|
||||
ob :: FilePath
|
||||
ob = "ob"
|
||||
|
||||
augmentWithVerbosity :: (String -> [Text] -> a) -> String -> Bool -> [Text] -> a
|
||||
augmentWithVerbosity runner executable isVerbose args = runner executable $ (if isVerbose then ("-v" :) else id) args
|
||||
|
||||
-- | Copies a git repo to a new location and "resets" the git history to include
|
||||
-- exactly one commit with all files added. It then restricts writing and reading
|
||||
-- for group and user to make the repo ideal for being a valid git remote for thunks.
|
||||
--
|
||||
-- Using this allows dirty repos to be used as git remotes during the test since 'git clone'ing
|
||||
-- a dirty repo will not include the uncommitted changes.
|
||||
copyForGitRemote :: Bool -> FilePath -> FilePath -> IO ()
|
||||
copyForGitRemote isVerbose origDir copyDir = shelly $ bool silently verbosely isVerbose $ do
|
||||
setenv "HOME" "/dev/null"
|
||||
setenv "GIT_CONFIG_NOSYSTEM" "1"
|
||||
run_ rsyncPath
|
||||
[ "-r", "--no-perms", "--no-owner", "--no-group", "--exclude", ".git"
|
||||
, toTextIgnore (addTrailingPathSeparator origDir), toTextIgnore copyDir
|
||||
]
|
||||
git ["init"]
|
||||
git ["config", "user.name", "SelfTest"]
|
||||
git ["config", "user.email", "self@test"]
|
||||
git ["add", "--all"]
|
||||
git ["commit", "-m", "Copy repo"]
|
||||
run_ chmodPath ["-R", "u-w,g-rw,o-rw", toTextIgnore copyDir] -- Freeze this state
|
||||
where
|
||||
git args = run_ gitPath $ ["-C", toTextIgnore copyDir] <> args
|
||||
|
||||
main :: IO ()
|
||||
main = do
|
||||
-- Note: you can pass hspec arguments as well, eg: `-m <pattern>`
|
||||
isVerbose <- (elem "-v") <$> getArgs
|
||||
isVerbose <- elem "-v" <$> getArgs
|
||||
unless isVerbose $
|
||||
putStrLn "Tests may take longer to run if there are unbuilt derivations: use -v for verbose output"
|
||||
let verbosity = bool silently verbosely isVerbose
|
||||
obeliskImpl <- fromString <$> getEnv "OBELISK_IMPL"
|
||||
|
||||
obeliskImplDirtyReadOnly <- getCurrentDirectory
|
||||
httpManager <- HTTP.newManager HTTP.defaultManagerSettings
|
||||
[p0, p1, p2, p3] <- liftIO $ getFreePorts 4
|
||||
withSystemTempDirectory "initCache" $ \initCache -> do
|
||||
-- Setup the ob init cache
|
||||
void . shellyOb verbosity $ chdir (fromString initCache) $ do
|
||||
run_ "ob" ["init"]
|
||||
run_ "git" ["init"]
|
||||
hspec $ parallel $ do
|
||||
let shelly_ = void . shellyOb verbosity
|
||||
|
||||
inTmp :: (Shelly.FilePath -> Sh a) -> IO ()
|
||||
inTmp f = withTmp (chdir <*> f)
|
||||
withSystemTempDirectory "obelisk-repo-git-remote" $ \copyDir -> do
|
||||
copyForGitRemote isVerbose obeliskImplDirtyReadOnly copyDir
|
||||
main' isVerbose httpManager copyDir
|
||||
|
||||
withTmp f = shelly_ . withSystemTempDirectory "test" $ f . fromString
|
||||
main' :: Bool -> HTTP.Manager -> FilePath -> IO ()
|
||||
main' isVerbose httpManager obeliskRepoReadOnly = withInitCache $ \initCache -> hspec $ parallel $ do
|
||||
let
|
||||
inTmpObInit' dirname f = inTmp' dirname $ \dir -> do
|
||||
run_ cpPath ["-rT", fromString initCache, toTextIgnore dir]
|
||||
f dir
|
||||
inTmpObInit = inTmpObInit' defaultTmpDirName
|
||||
|
||||
inTmpObInit f = inTmp $ \dir -> do
|
||||
run_ "cp" ["-a", fromString $ initCache <> "/.", toTextIgnore dir]
|
||||
f dir
|
||||
-- To be used in tests that change the obelisk impl directory
|
||||
inTmpObInitWithImplCopy f = inTmpObInit $ \dir ->
|
||||
withObeliskImplClean $ \(fromString -> implClean) -> do
|
||||
run_ rmPath [thunk]
|
||||
run_ lnPath ["-s", implClean, thunk]
|
||||
f dir
|
||||
|
||||
assertRevEQ a b = liftIO . assertEqual "" "" =<< diff a b
|
||||
assertRevNE a b = liftIO . assertBool "" . (/= "") =<< diff a b
|
||||
describe "ob init" $ parallel $ do
|
||||
it "works with default impl" $ inTmp $ \_ -> runOb ["init"]
|
||||
it "works with master branch impl" $ inTmp $ \_ -> runOb ["init", "--branch", "master"]
|
||||
it "works with symlink" $ inTmp $ \_ -> runOb ["init", "--symlink", toTextIgnore obeliskRepoReadOnly]
|
||||
it "doesn't silently overwrite existing files" $ inTmp $ \_ -> do
|
||||
let p force = errExit False $ do
|
||||
run_ ob $ "--no-handoff" : "-v" : "init" : ["--force"|force]
|
||||
(== 0) <$> lastExitCode
|
||||
|
||||
revParseHead = T.strip <$> run "git" ["rev-parse", "HEAD"]
|
||||
True <- p False
|
||||
False <- p False
|
||||
True <- p True
|
||||
pure ()
|
||||
|
||||
commitAll = do
|
||||
run_ "git" ["add", "."]
|
||||
commit "checkpoint"
|
||||
revParseHead
|
||||
it "doesn't create anything when given an invalid impl" $ inTmp $ \tmp -> do
|
||||
void $ errExit False $ runOb ["init", "--symlink", "/dev/null"]
|
||||
ls tmp >>= liftIO . assertEqual "" []
|
||||
|
||||
thunk = ".obelisk/impl"
|
||||
update = run "ob" ["thunk", "update", thunk] >> commitAll
|
||||
pack = run "ob" ["thunk", "pack", thunk] >> commitAll
|
||||
unpack = run "ob" ["thunk", "unpack", thunk] >> commitAll
|
||||
it "produces a valid route config" $ inTmpObInit $ \tmp -> liftIO $
|
||||
withCurrentDirectory (T.unpack $ toTextIgnore tmp) $ do
|
||||
configs <- getConfigs
|
||||
return (either (const Nothing) Just $ getConfigRoute configs) `shouldNotReturn` Nothing
|
||||
|
||||
diff a b = run "git" ["diff", a, b]
|
||||
-- These tests fail with "Could not find module 'Obelisk.Generated.Static'"
|
||||
-- when not run by 'nix-build --attr selftest'
|
||||
describe "ob run" $ {- NOT parallel $ -} do
|
||||
it "works in root directory" $ inTmpObInit $ \_ -> testObRunInDir' Nothing httpManager
|
||||
it "works in sub directory" $ inTmpObInit $ \_ -> testObRunInDir' (Just "frontend") httpManager
|
||||
|
||||
describe "ob init" $ parallel $ do
|
||||
it "works with default impl" $ inTmp $ \_ -> run "ob" ["init"]
|
||||
it "works with master branch impl" $ inTmp $ \_ -> run "ob" ["init", "--branch", "master"]
|
||||
it "works with symlink" $ inTmp $ \_ -> run "ob" ["init", "--symlink", obeliskImpl]
|
||||
it "doesn't silently overwrite existing files" $ withSystemTempDirectory "ob-init" $ \dir -> do
|
||||
let p force = (proc "ob" $ "--no-handoff" : "init" : if force then ["--force"] else []) { cwd = Just dir }
|
||||
(ExitSuccess, _, _) <- readCreateProcessWithExitCode (p False) ""
|
||||
(ExitFailure _, _, _) <- readCreateProcessWithExitCode (p False) ""
|
||||
(ExitSuccess, _, _) <- readCreateProcessWithExitCode (p True) ""
|
||||
pure ()
|
||||
describe "obelisk project" $ parallel $ do
|
||||
it "can build obelisk command" $ inTmpObInit $ \_ -> nixBuild ["-A", "command" , toTextIgnore obeliskRepoReadOnly]
|
||||
it "can build obelisk skeleton" $ inTmpObInit $ \_ -> nixBuild ["-A", "skeleton", toTextIgnore obeliskRepoReadOnly]
|
||||
it "can build obelisk shell" $ inTmpObInit $ \_ -> nixBuild ["-A", "shell", toTextIgnore obeliskRepoReadOnly]
|
||||
it "can build everything" $ inTmpObInit $ \_ -> nixBuild [toTextIgnore obeliskRepoReadOnly]
|
||||
|
||||
it "doesn't create anything when given an invalid impl" $ inTmp $ \tmp -> do
|
||||
void $ errExit False $ run "ob" ["init", "--symlink", "/dev/null"]
|
||||
ls tmp >>= liftIO . assertEqual "" []
|
||||
describe "blank initialized project" $ parallel $ do
|
||||
|
||||
it "produces a valid route config" $ inTmpObInit $ \tmp -> do
|
||||
liftIO $ withCurrentDirectory (T.unpack $ toTextIgnore tmp) $ getConfigRoute `shouldNotReturn` Nothing
|
||||
it "can build ghc.backend" $ inTmpObInit $ \_ -> nixBuild ["-A", "ghc.backend"]
|
||||
it "can build ghcjs.frontend" $ inTmpObInit $ \_ -> nixBuild ["-A", "ghcjs.frontend"]
|
||||
|
||||
-- These tests fail with "Could not find module 'Obelisk.Generated.Static'"
|
||||
-- when not run by 'nix-build --attr selftest'
|
||||
describe "ob run" $ parallel $ do
|
||||
it "works in root directory" $ inTmpObInit $ \_ -> do
|
||||
testObRunInDir p0 p1 Nothing httpManager
|
||||
it "works in sub directory" $ inTmpObInit $ \_ -> do
|
||||
testObRunInDir p2 p3 (Just "frontend") httpManager
|
||||
if System.Info.os == "darwin"
|
||||
then it "can build ios" $ inTmpObInit $ \_ -> nixBuild ["-A", "ios.frontend"]
|
||||
else it "can build android after accepting license" $ inTmpObInit $ \dir -> do
|
||||
let defaultNixPath = dir </> ("default.nix" :: FilePath)
|
||||
writefile defaultNixPath
|
||||
=<< T.replace
|
||||
"# config.android_sdk.accept_license = false;"
|
||||
"config.android_sdk.accept_license = true;"
|
||||
<$> readfile defaultNixPath
|
||||
nixBuild ["-A", "android.frontend"]
|
||||
|
||||
describe "obelisk project" $ parallel $ do
|
||||
it "can build obelisk command" $ inTmpObInit $ \_ -> run "nix-build" ["-A", "command" , obeliskImpl]
|
||||
it "can build obelisk skeleton" $ inTmpObInit $ \_ -> run "nix-build" ["-A", "skeleton", obeliskImpl]
|
||||
it "can build obelisk shell" $ inTmpObInit $ \_ -> run "nix-build" ["-A", "shell", obeliskImpl]
|
||||
-- See https://github.com/obsidiansystems/obelisk/issues/101
|
||||
-- it "can build everything" $ shelly_ $ run "nix-build" [obeliskImpl]
|
||||
forM_ ["ghc", "ghcjs"] $ \compiler -> do
|
||||
let
|
||||
shellName = "shells." <> compiler
|
||||
inShell cmd' = run_ "nix-shell" ["default.nix", "-A", fromString shellName, "--run", cmd']
|
||||
it ("can enter " <> shellName) $ inTmpObInit $ \_ -> inShell "exit"
|
||||
-- NOTE: We override the temporary directory name because cabal has a bug preventing new-build from working
|
||||
-- in a path that has unicode characters.
|
||||
it ("can build in " <> shellName) $ inTmpObInit' "test" $ \_ -> inShell $
|
||||
T.pack cabalPath <> " --version; " <> T.pack cabalPath <> " new-build --" <> T.pack compiler <> " all"
|
||||
|
||||
describe "blank initialized project" $ parallel $ do
|
||||
it "has idempotent thunk update" $ inTmpObInitWithImplCopy $ \_ -> do
|
||||
_ <- pack
|
||||
u <- update
|
||||
uu <- update
|
||||
assertRevEQ u uu
|
||||
|
||||
it "can build ghc.backend" $ inTmpObInit $ \_ -> do
|
||||
run "nix-build" ["--no-out-link", "-A", "ghc.backend"]
|
||||
it "can build ghcjs.frontend" $ inTmpObInit $ \_ -> do
|
||||
run "nix-build" ["--no-out-link", "-A", "ghcjs.frontend"]
|
||||
describe "ob thunk pack/unpack" $ parallel $ do
|
||||
it "has thunk pack and unpack inverses" $ inTmpObInitWithImplCopy $ \_ -> do
|
||||
|
||||
if os == "darwin"
|
||||
then it "can build ios" $ inTmpObInit $ \_ -> run "nix-build" ["--no-out-link", "-A", "ios.frontend" ]
|
||||
else it "can build android" $ inTmpObInit $ \_ -> run "nix-build" ["--no-out-link", "-A", "android.frontend"]
|
||||
_ <- pack
|
||||
e <- commitAll
|
||||
eu <- unpack
|
||||
eup <- pack
|
||||
eupu <- unpack
|
||||
_ <- pack
|
||||
|
||||
forM_ ["ghc", "ghcjs"] $ \compiler -> do
|
||||
let
|
||||
shell = "shells." <> compiler
|
||||
inShell cmd' = run "nix-shell" ["-A", fromString shell, "--run", cmd']
|
||||
it ("can enter " <> shell) $ inTmpObInit $ \_ -> inShell "exit"
|
||||
it ("can build in " <> shell) $ inTmpObInit $ \_ -> inShell $ "cabal new-build --" <> fromString compiler <> " all"
|
||||
assertRevEQ e eup
|
||||
assertRevEQ eu eupu
|
||||
assertRevNE e eu
|
||||
|
||||
it "can build reflex project" $ inTmpObInit $ \_ -> do
|
||||
run "nix-build" []
|
||||
it "unpacks the correct branch" $ withTmp $ \dir -> do
|
||||
let branch = "master"
|
||||
run_ gitPath ["clone", "https://github.com/reflex-frp/reflex.git", toTextIgnore dir, "--branch", branch]
|
||||
runOb_ ["thunk", "pack", toTextIgnore dir]
|
||||
runOb_ ["thunk", "unpack", toTextIgnore dir]
|
||||
branch' <- chdir dir $ run gitPath ["rev-parse", "--abbrev-ref", "HEAD"]
|
||||
liftIO $ assertEqual "" branch (T.strip branch')
|
||||
|
||||
it "has idempotent thunk update" $ inTmpObInit $ \_ -> do
|
||||
u <- update
|
||||
uu <- update
|
||||
assertRevEQ u uu
|
it "can pack and unpack plain git repos" $
shelly_ $ withSystemTempDirectory "git-repo" $ \dir -> do
let repo = toTextIgnore $ dir </> ("repo" :: FilePath)
run_ gitPath ["clone", "https://github.com/haskell/process.git", repo]
origHash <- chdir (fromText repo) revParseHead

describe "ob thunk pack/unpack" $ parallel $ do
it "has thunk pack and unpack inverses" $ inTmpObInit $ \_ -> do
runOb_ ["thunk", "pack", repo]
packedFiles <- Set.fromList <$> ls (fromText repo)
liftIO $ assertEqual "" packedFiles $ Set.fromList $ (repo </>) <$>
["default.nix", "github.json", ".attr-cache" :: FilePath]

e <- commitAll
eu <- unpack
eup <- pack
eupu <- unpack
_ <- pack
runOb_ ["thunk", "unpack", repo]
chdir (fromText repo) $ do
unpackHash <- revParseHead
assertRevEQ origHash unpackHash

assertRevEQ e eup
assertRevEQ eu eupu
assertRevNE e eu
testThunkPack' $ fromText repo

it "unpacks the correct branch" $ withTmp $ \dir -> do
let branch = "master"
run_ "git" ["clone", "https://github.com/reflex-frp/reflex.git", toTextIgnore dir, "--branch", branch]
run_ "ob" ["thunk", "pack", toTextIgnore dir]
run_ "ob" ["thunk", "unpack", toTextIgnore dir]
branch' <- chdir dir $ run "git" ["rev-parse", "--abbrev-ref", "HEAD"]
liftIO $ assertEqual "" branch (T.strip branch')
it "aborts thunk pack when there are uncommitted files" $ inTmpObInitWithImplCopy $ \dir -> do
testThunkPack' (dir </> thunk)

it "can pack and unpack plain git repos" $ do
shelly_ $ withSystemTempDirectory "git-repo" $ \dir -> do
let repo = toTextIgnore $ dir </> ("repo" :: String)
run_ "git" ["clone", "https://github.com/haskell/process.git", repo]
origHash <- chdir (fromText repo) revParseHead
describe "ob thunk update --branch" $ parallel $ do
it "can change a thunk to the latest version of a desired branch" $ withTmp $ \dir -> do
let branch1 = "master"
branch2 = "develop"
run_ gitPath ["clone", "https://github.com/reflex-frp/reflex.git", toTextIgnore dir, "--branch", branch1]
runOb_ ["thunk" , "pack", toTextIgnore dir]
runOb_ ["thunk", "update", toTextIgnore dir, "--branch", branch2]

run_ "ob" ["thunk", "pack", repo]
packedFiles <- Set.fromList <$> ls (fromText repo)
liftIO $ assertEqual "" packedFiles $ Set.fromList $ (repo </>) <$>
["default.nix", "github.json", ".attr-cache" :: String]
it "doesn't create anything when given an invalid branch" $ withTmp $ \dir -> do
let checkDir dir' = liftIO $ getDirectoryContents $ T.unpack $ toTextIgnore dir'
run_ gitPath ["clone", "https://github.com/reflex-frp/reflex.git", toTextIgnore dir, "--branch", "master"]
runOb_ ["thunk" , "pack", toTextIgnore dir]
startingContents <- checkDir dir
void $ errExit False $ runOb ["thunk", "update", toTextIgnore dir, "--branch", "dumble-palooza"]
checkDir dir >>= liftIO . assertEqual "" startingContents

run_ "ob" ["thunk", "unpack", repo]
chdir (fromText repo) $ do
unpackHash <- revParseHead
assertRevEQ origHash unpackHash
describe "ob hoogle" $ {- NOT parallel -} do
it "starts a hoogle server on the given port" $ inTmpObInit $ \_ -> do
[p0] <- liftIO $ getFreePorts 1
maskExitSuccess $ runHandle "ob" ["hoogle", "--port", T.pack (show p0)] $ \stdout -> flip fix Nothing $ \loop -> \case
Nothing -> do -- Still waiting for initial signal that the server has started
ln <- liftIO $ T.hGetLine stdout
let search = "Server starting on port " <> T.pack (show p0)
case search `T.isInfixOf` ln of
False -> loop Nothing -- keep waiting
True -> loop $ Just 10
Just (n :: Int) -> do -- Server has started and we have n attempts left
let req uri = liftIO $ try @HTTP.HttpException $ HTTP.parseRequest uri >>= flip HTTP.httpLbs httpManager
req ("http://127.0.0.1:" <> show p0) >>= \case
Right r | HTTP.responseStatus r == HTTP.ok200 -> exit 0
e -> if n <= 0
then errorExit $ "Request to hoogle server failed: " <> T.pack (show e)
else liftIO (threadDelay (1*10^(6 :: Int))) *> loop (Just $ n - 1)
where
verbosity = bool silently verbosely isVerbose
nixBuild args = run nixBuildPath ("--no-out-link" : args)

testThunkPack $ fromText repo
runOb_ = augmentWithVerbosity run_ ob isVerbose
runOb = augmentWithVerbosity run ob isVerbose
testObRunInDir' = augmentWithVerbosity testObRunInDir ob isVerbose ["run"]
testThunkPack' = augmentWithVerbosity testThunkPack ob isVerbose []

it "aborts thunk pack when there are uncommitted files" $ inTmpObInit $ \dir -> do
void $ unpack
testThunkPack (dir </> thunk)
withObeliskImplClean f =
withSystemTempDirectory "obelisk-impl-clean" $ \obeliskImpl -> do
void . shellyOb verbosity $ chdir obeliskImpl $ do
dirtyFiles <- T.strip <$> run gitPath ["-C", toTextIgnore obeliskRepoReadOnly, "diff", "--stat"]
() <- when (dirtyFiles /= "") $ error "SelfTest does not work correctly with dirty obelisk repos as remote"
run_ gitPath ["clone", "file://" <> toTextIgnore obeliskRepoReadOnly, toTextIgnore obeliskImpl]
f obeliskImpl

describe "ob thunk update --branch" $ parallel $ do
it "can change a thunk to the latest version of a desired branch" $ withTmp $ \dir -> do
let branch1 = "master"
branch2 = "develop"
run_ "git" ["clone", "https://github.com/reflex-frp/reflex.git", toTextIgnore dir, "--branch", branch1]
run_ "ob" ["thunk" , "pack", toTextIgnore dir]
run_ "ob" ["thunk", "update", toTextIgnore dir, "--branch", branch2]
withInitCache f =
withSystemTempDirectory "init Cache λ" $ \initCache -> do
-- Set up the ob init cache
void . shellyOb verbosity $ chdir initCache $ do
runOb_ ["init", "--symlink", toTextIgnore obeliskRepoReadOnly]
run_ gitPath ["init"]

it "doesn't create anything when given an invalid branch" $ withTmp $ \dir -> do
let checkDir dir' = liftIO $ getDirectoryContents $ T.unpack $ toTextIgnore dir'
run_ "git" ["clone", "https://github.com/reflex-frp/reflex.git", toTextIgnore dir, "--branch", "master"]
run_ "ob" ["thunk" , "pack", toTextIgnore dir]
startingContents <- checkDir dir
void $ errExit False $ run "ob" ["thunk", "update", toTextIgnore dir, "--branch", "dumble-palooza"]
checkDir dir >>= liftIO . assertEqual "" startingContents
f initCache

shelly_ = void . shellyOb verbosity

defaultTmpDirName = "test λ"

inTmp' :: FilePath -> (FilePath -> Sh a) -> IO ()
inTmp' dirname f = withTmp' dirname (chdir <*> f)
inTmp = inTmp' defaultTmpDirName

withTmp' dirname f = shelly_ . withSystemTempDirectory dirname $ f . fromString
withTmp = withTmp' defaultTmpDirName

assertRevEQ a b = liftIO . assertEqual "" "" =<< diff a b
assertRevNE a b = liftIO . assertBool "" . (/= "") =<< diff a b

revParseHead = T.strip <$> run gitPath ["rev-parse", "HEAD"]

commitAll = do
run_ gitPath ["add", "."]
commit "checkpoint"
revParseHead

thunk = ".obelisk/impl"
update = runOb ["thunk", "update", thunk] *> commitAll
pack = runOb ["thunk", "pack", thunk] *> commitAll
unpack = runOb ["thunk", "unpack", thunk] *> commitAll

diff a b = run gitPath ["diff", a, b]


maskExitSuccess :: Sh () -> Sh ()
maskExitSuccess = handle_sh (\case ExitSuccess -> pure (); e -> throw e)

-- | Run `ob run` in the given directory (maximum of one level deep)
testObRunInDir :: Socket.PortNumber -> Socket.PortNumber -> Maybe Shelly.FilePath -> HTTP.Manager -> Sh ()
testObRunInDir p0 p1 mdir httpManager = handle_sh (\case ExitSuccess -> pure (); e -> throw e) $ do
testObRunInDir :: String -> [Text] -> Maybe FilePath -> HTTP.Manager -> Sh ()
testObRunInDir executable extraArgs mdir httpManager = maskExitSuccess $ do
[p0, p1] <- liftIO $ getFreePorts 2
let uri p = "http://localhost:" <> T.pack (show p) <> "/" -- trailing slash required for comparison
writefile "config/common/route" $ uri p0
maybe id chdir mdir $ runHandle "ob" ["run"] $ \stdout -> do
firstUri <- handleObRunStdout httpManager stdout
maybe id chdir mdir $ runHandles executable extraArgs [] $ \_stdin stdout stderr -> do
firstUri <- handleObRunStdout httpManager stdout stderr
let newUri = uri p1
when (firstUri == newUri) $ errorExit $
"Startup URI (" <> firstUri <> ") is the same as test URI (" <> newUri <> ")"
maybe id (\_ -> chdir "..") mdir $ alterRouteTo newUri stdout
runningUri <- handleObRunStdout httpManager stdout
runningUri <- handleObRunStdout httpManager stdout stderr
if runningUri /= newUri
then errorExit $ "Reloading failed: expected " <> newUri <> " but got " <> runningUri
else exit 0

testThunkPack :: Shelly.FilePath -> Sh ()
testThunkPack path' = withTempFile (T.unpack $ toTextIgnore path') "test-file" $ \file handle -> do
let pack' = readProcessWithExitCode "ob" ["thunk", "pack", T.unpack $ toTextIgnore path'] ""
testThunkPack :: String -> [Text] -> FilePath -> Sh ()
testThunkPack executable args path' = withTempFile (T.unpack $ toTextIgnore path') "test-file" $ \file handle -> do
let pack' = readProcessWithExitCode executable (T.unpack <$> ["thunk", "pack", toTextIgnore path'] ++ args) ""
ensureThunkPackFails q = liftIO $ pack' >>= \case
(code, out, err)
| code == ExitSuccess -> fail "ob thunk pack succeeded when it should have failed"
| code == ExitSuccess -> fail $ "ob thunk pack succeeded when it should have failed with error '" <> show q <> "'"
| q `T.isInfixOf` T.pack (out <> err) -> pure ()
| otherwise -> fail $ "ob thunk pack failed for an unexpected reason: " <> show out <> "\nstderr: " <> err
git = chdir path' . run "git"
| otherwise -> fail $ "ob thunk pack failed for an unexpected reason, expecting '" <> show q <> "', received: " <> show out <> "\nstderr: " <> err
git = chdir path' . run gitPath
-- Untracked files
ensureThunkPackFails "Untracked files"
void $ git ["add", T.pack file]
@@ -267,7 +394,7 @@ testThunkPack path' = withTempFile (T.unpack $ toTextIgnore path') "test-file" $
liftIO $ T.hPutStrLn handle "test file" >> hClose handle
ensureThunkPackFails "modified"
-- Existing stashes
void $ git ["stash"]
void $ git $ gitUserConfig <> [ "stash" ]
ensureThunkPackFails "has stashes"

-- | Blocks until a non-empty line is available
@@ -288,9 +415,12 @@ alterRouteTo uri stdout = do
"Reloading failed: " <> T.pack (show t)

-- | Handle stdout of `ob run`: check that the frontend and backend servers are started correctly
handleObRunStdout :: HTTP.Manager -> Handle -> Sh Text
handleObRunStdout httpManager stdout = flip fix (ObRunState_Init, []) $ \loop (state, msgs) -> do
liftIO (T.hGetLine stdout) >>= \t -> case state of
handleObRunStdout :: HTTP.Manager -> Handle -> Handle -> Sh Text
handleObRunStdout httpManager stdout stderr = flip fix (ObRunState_Init, []) $ \loop (state, msgs) -> do
isEOF <- liftIO $ hIsEOF stdout
if isEOF
then handleObRunError msgs
else liftIO (T.hGetLine stdout) >>= \t -> case state of
ObRunState_Init
| "Running test..." `T.isPrefixOf` t -> loop (ObRunState_BackendStarted, msgs)
ObRunState_Startup
@@ -302,8 +432,12 @@ handleObRunStdout httpManager stdout = flip fix (ObRunState_Init, []) $ \loop (s
obRunCheck httpManager stdout uri
pure uri
| not (T.null t) -> errorExit $ "Started: " <> t -- If there's any other output here, startup failed
_ | "Failed" `T.isPrefixOf` t -> errorExit $ "ob run failed: " <> T.unlines (reverse $ t : msgs)
_ | "Failed" `T.isPrefixOf` t -> handleObRunError (t : msgs)
| otherwise -> loop (state, t : msgs)
where
handleObRunError msgs = do
stderrContent <- liftIO $ hGetContents stderr
errorExit $ "ob run failed: " <> T.unlines (reverse msgs) <> " stderr: " <> T.pack stderrContent

-- | Make requests to frontend/backend servers to check they are working properly
obRunCheck :: HTTP.Manager -> Handle -> Text -> Sh ()
@@ -1,6 +1,6 @@
name: obelisk-snap-extras
version: 0.1
synopsys: Extra functionality for Snap that should be considered for upstreaming
synopsis: Extra functionality for Snap that should be considered for upstreaming
license: BSD3
build-type: Simple
cabal-version: >=1.2

@@ -2,13 +2,13 @@
module Obelisk.Snap.Extras
( cachePermanently
, doNotCache
, ensureSecure
, serveFileIfExists
, serveFileIfExistsAs
) where

import Control.Monad.IO.Class
import Data.ByteString (ByteString)
import Data.Monoid
import Data.String
import Snap.Core
import Snap.Util.FileServe

5
lib/tabulation/CHANGELOG.md
Normal file
@@ -0,0 +1,5 @@
# Revision history for tabulation

## 0.1.0.0 -- 2019-05-24

* First version. Basically carve out this library as a place for the HasFields class.
30
lib/tabulation/LICENSE
Normal file
@@ -0,0 +1,30 @@
Copyright (c) 2019, Cale Gibbard

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.

* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.

* Neither the name of Cale Gibbard nor the names of other
contributors may be used to endorse or promote products derived
from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.