Big rewrite

1. Foliage takes as input a complete description of the index, where
   source distributions and revisions come with a timestamp. This allows
   us to recreate the entire index in a reproducible way.

2. Added an experimental command to import an index from Hackage (as
   downloaded with Cabal). This was originally a testing/development
   need, but there might be other use cases.
This commit is contained in:
Andrea Bedini 2022-03-28 12:53:39 +08:00
parent 5ce3fc0501
commit 080197e9e2
No known key found for this signature in database
GPG Key ID: EE8DEB94262733BE
30 changed files with 1023 additions and 1786 deletions

4
NOTES-HRT.md Normal file

@ -0,0 +1,4 @@
# Hackage Repo Tool
It looks like the update re-adds cabal files to the index based on
timestamp, but package.json based on content?

175
README.md

@ -5,62 +5,101 @@ A hash-friendly Haskell Package Repository.
Foliage is a tool to create custom or private Haskell package repositories,
in a fully reproducible way.
## Background
The problem of build reproducibility in the Haskell ecosystem has been
discussed many times. Hackage does not natively offer a way to pin down the
files it serves.
Although there are workarounds to obtain a fixed repository (e.g. by
truncating the index file, which is append-only), I think we can solve this
at the root.
## Main idea
_Like GitHub Pages but for Haskell Packages_
A "Hackage repository" is collection of files (source distributions, cabal
files, public keys and signatures).
A "Hackage repository" is collection of source distributions and cabal
files. In addition, Hackage has implemented [The Update
Framework (TUF)](https://theupdateframework.com) and the repository also
includes cryptographic metadata (public keys and signatures).
These files are commonly served by Hackage proper, that is, the central
deployment of [hackage-server](https://github.com/haskell/hackage-server/).
Foliage explores the idea of serving this content as a static website,
which is generated programmatically from a small set of input files.
Both the input files and the generated repository can be stored in a git
repository and referred to via stable URLs corresponding to commit hashes.
Foliage explores the idea of creating and serving this content as a static
website, generated programmatically from textual input files.
## Example
An input file could look like the following
```toml
[[sources]]
url = "https://.../source1.tar.gz"
[[sources]]
url = "https://.../source2.tar.gz"
subdirs = [
  "a",
  "b",
  "c"
]
```
Foliage expects a folder `_sources` with a subfolder per package name and
version. E.g.
```
_sources
└── typed-protocols
    └── 0.1.0.0
        └── meta.toml
```
This file basically mirrors the functionality of
[`source-repository-package`](https://cabal.readthedocs.io/en/3.6/cabal-project.html#specifying-packages-from-remote-version-control-locations)
in Cabal.
For each source (and each subdir, if any is specified), foliage will
download the tarball and make an sdist. Foliage will then use the
hackage-repo-tool to create an on-disk repository (e.g. in `_repo`) from
the collected packages. Additionally, one can specify revisions for each
package version.
The file `meta.toml` describes a package and looks like this:
```toml
timestamp = 2022-03-28T07:57:10Z
url = 'https://github.com/input-output-hk/ouroboros-network/tarball/d2d219a86cda42787325bb8c20539a75c2667132'
subdir = 'typed-protocols' # optional
```
Foliage will download the source URL for each package (assumed to be a
tarball), decompress it, make a source distribution and take the cabal
file.
After all packages have been processed, foliage will create a repository,
including the index and the TUF metadata. With the input above, foliage will
produce the following:
```
_repo
├── 01-index.tar
├── 01-index.tar.gz
├── index
│   └── typed-protocols
│       └── 0.1.0.0
│           ├── package.json
│           └── typed-protocols.cabal
├── mirrors.json
├── package
│   └── typed-protocols-0.1.0.0.tar.gz
├── root.json
├── snapshot.json
└── timestamp.json
```
* `typed-protocols-0.1.0.0.tar.gz` is obtained by running
  `cabal sdist` on the repository (and, optionally, the subfolder) specified in
  `meta.toml`.
* `typed-protocols.cabal` is extracted from the repository.
* `01-index.tar` will include the cabal files and signed target files, using
  the timestamp in `meta.toml`.
```bash
$ TZ=UTC tar tvf _repo/01-index.tar
-rw-r--r-- foliage/foliage 1627 2022-03-28 07:57 typed-protocols/0.1.0.0/typed-protocols.cabal
-rw-r--r-- foliage/foliage 833 2022-03-28 07:57 typed-protocols/0.1.0.0/package.json
```
* The TUF files (`mirrors.json`, `root.json`, `snapshot.json`,
  `timestamp.json`) are signed and contain reasonable defaults.
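To produce this layout one runs the build command. A minimal sketch, assuming the defaults from `Foliage.Options` further down in this commit (`--keys` defaults to `_keys`, `--output-directory` to `_repo`; `--current-time` optionally fixes the reference time from which the TUF expiry dates are derived):
```bash
$ foliage build --keys _keys --output-directory _repo --current-time 2022-03-28T07:57:10Z
```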
## Revisions
Foliage supports cabal file revisions. Adding the following snippet to a
package's `meta.toml` will make foliage look for a cabal file in
`<pkgName>/<pkgVersion>/revisions/1.cabal`.
```
[[revisions]]
number = 1
timestamp = 2022-03-22T14:15:00+00:00
```
The revised cabal file will enter the index with the timestamp provided in
`meta.toml`.
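For the example package above, the source folder would then look like this:
```
_sources
└── typed-protocols
    └── 0.1.0.0
        ├── meta.toml
        └── revisions
            └── 1.cabal
```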
## Using the repository with cabal
The resulting repository can then be served through HTTPS and used with
cabal, e.g. in a `cabal.project`:
```
repository packages.example.org
@ -68,7 +107,7 @@ repository packages.example.org
secure: True
```
Alternatively, cabal can read the repository directly off disk:
```
repository packages.example.org
@ -76,70 +115,12 @@ repository packages.example.org
secure: True
```
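In both cases a complete stanza also pins the repository's root keys. A sketch with a placeholder URL and key IDs (cf. the `hackage-repo-tool` README removed later in this commit):
```
repository packages.example.org
  url: https://packages.example.org/
  secure: True
  root-keys: <root-key-id-1>
             <root-key-id-2>
             <root-key-id-3>
  key-threshold: 2
```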
**Note:** The package id (package name + package version) is unknown at
download time and only known after looking at the cabal file. This is the
reason package names and versions do not show in the input file. Foliage
ensures two sources do not provide colliding package ids.
**Note:** Hackage implements [The Update
Framework](https://theupdateframework.io) which requires a set of public
and private keys. Foliage can either generate a new set of keys or reuse a
pre-existing one. Cabal can either trust a repository at first use or
verify signatures against public keys obtained separately.
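A fresh set of keys can be created up front with the `create-keys` subcommand (a sketch; the flag and its `_keys` default come from `Foliage.Options` below). `foliage build` will also generate a set automatically if the keys directory does not exist:
```bash
$ foliage create-keys --keys _keys
```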
## GitHub
Foliage can make use of several features supported by GitHub, to further advance automation.
1. GitHub has long supported accessing git repositories via HTTPS. E.g. one can access a blob in a git repo through the following URL,
   https://raw.githubusercontent.com/{owner}/{repo}/{ref}/path
   where `ref` can either be a commit hash or a branch name.
2. GitHub also offers URLs of tarballs for repos at a given commit, e.g.
https://github.com/Quid2/flat/tarball/ee59880f47ab835dbd73bea0847dab7869fc20d8
Afaik, these tarballs might not be entirely immutable (TODO)
3. GitHub offers URLs for tagged releases (these tarballs are supposed to be immutable).
4. GitHub Actions can be used to automate the generation of the repository.
5. (Perhaps optional) GitHub Pages supports publishing a git branch over HTTP.
This means we automatically have a stable URL for any package whose source is available on GitHub.
Also, the generated repository can be committed to a git branch and be immediately available through HTTPS to cabal.
E.g.
This configuration
https://github.com/andreabedini/byo-hackage/blob/933760117a3800366b420b07c8c887c1313e2b22/packages.tsv
(warning: old TSV format)
generated this repo https://github.com/andreabedini/byo-hackage/tree/1e8c5184836acb05972dfff00ac8edca575e1df1
which can be given to cabal like this:
```
repository my-hackage-repo
url: https://raw.githubusercontent.com/andreabedini/byo-hackage/1e8c5184836acb05972dfff00ac8edca575e1df1
secure: True
```
## To infinity and Beyond
One can think of more features:
- A pretty website could be automatically generated along with the
  repository, with a list of packages, their versions, metadata, etc.
- The input file itself could be automatically generated, e.g. from all
  tagged releases in a GitHub organisation, making it a turn-key Hackage
  repository for any GitHub organisation.
## Author
- Andrea Bedini (@andreabedini)

403
app/Foliage/CmdBuild.hs Normal file

@ -0,0 +1,403 @@
{-# LANGUAGE NamedFieldPuns #-}
{-# OPTIONS_GHC -Wno-name-shadowing #-}
module Foliage.CmdBuild (cmdBuild) where
import Codec.Archive.Tar qualified as Tar
import Codec.Archive.Tar.Entry qualified as Tar
import Codec.Compression.GZip qualified as GZip
import Control.Monad (unless)
import Data.ByteString.Lazy qualified as BSL
import Data.Foldable (for_)
import Data.List (isPrefixOf, sortOn)
import Data.Traversable (for)
import Development.Shake
import Development.Shake.FilePath
import Foliage.HackageSecurity
import Foliage.Meta
import Foliage.Options
import Foliage.Package
import Foliage.Shake
import Foliage.Shake.Oracle
import Foliage.Time qualified as Time
import Foliage.Utils
import System.Directory qualified as IO
cmdBuild :: BuildOptions -> IO ()
cmdBuild (BuildOptions keysPath mCurrentTime outDir) = do
ks <- IO.doesDirectoryExist keysPath
unless ks $ do
putStrLn $ "🗝️ You don't seem to have created a set of TUF keys. I will create one in " <> keysPath
createKeys keysPath
let opts =
shakeOptions
{ shakeChange = ChangeDigest,
shakeFiles = "_cache"
}
shake opts $ do
--
-- Oracles
--
getCurrentTime <- addOracle $ \GetCurrentTime ->
case mCurrentTime of
Nothing -> do
t <- Time.truncateSeconds <$> liftIO Time.getCurrentTime
putInfo $
unlines
[ "🕐 Current time set to " <> Time.iso8601Show t <> ".",
"You can set a fixed time using the --current-time option"
]
return t
Just t -> do
putInfo $ "🕐 Current time set to " <> Time.iso8601Show t <> "."
return t
getExpiryTime <- addOracle $ \GetExpiryTime -> do
t <- Time.addUTCTime (Time.nominalDay * 365) <$> getCurrentTime GetCurrentTime
putInfo $ "🕐 Expiry time set to " <> Time.iso8601Show t <> " (a year from now)."
return t
getSourceMeta <- addOracle $ \(GetSourceMeta PackageId {pkgName, pkgVersion}) ->
readSourceMeta' $ "_sources" </> pkgName </> pkgVersion </> "meta.toml"
getSourceDir <- addOracle $ \(GetSourceDir pkgId) -> do
SourceMeta {sourceUrl, sourceSubdir} <- getSourceMeta (GetSourceMeta pkgId)
let srcDir = "_cache" </> urlToFileName sourceUrl
need [srcDir </> ".downloaded"]
-- FIXME Without this, sometimes the download doesn't trigger
putInfo $ "👀 " <> sourceUrl
projectFiles <- liftIO $ filter ("cabal.project" `isPrefixOf`) <$> IO.getDirectoryContents srcDir
unless (null projectFiles) $ do
putWarn $ "⚠️ Deleting cabal project files from " ++ srcDir
liftIO $ for_ projectFiles $ IO.removeFile . (srcDir </>)
return $ case sourceSubdir of
Just s -> srcDir </> s
Nothing -> srcDir
getPackages <- addOracle $ \GetPackages -> do
metaFiles <- getDirectoryFiles "_sources" ["*/*/meta.toml"]
return $
[ PackageId pkgName pkgVersion
| path <- metaFiles,
let [pkgName, pkgVersion, _] = splitDirectories path
]
--
-- Entrypoint
--
-- This triggers the whole chain of TUF metadata
want [outDir </> "timestamp.json"]
-- This build the current index entry for all packages
action $ do
pkgIds <- getPackages GetPackages
need
[ outDir </> "index" </> pkgName </> pkgVersion </> pkgName <.> "cabal"
| PackageId pkgName pkgVersion <- pkgIds
]
--
-- timestamp.json
--
outDir </> "timestamp.json" %> \path -> do
snapshotInfo <- computeFileInfoSimple' (outDir </> "snapshot.json")
expires <- getExpiryTime GetExpiryTime
let timestamp =
Timestamp
{ timestampVersion = FileVersion 1,
timestampExpires = FileExpires (Just expires),
timestampInfoSnapshot = snapshotInfo
}
keys <- readKeysAt (keysPath </> "timestamp")
let timestampSigned = withSignatures hackageRepoLayout keys timestamp
liftIO $ do
p <- makeAbsolute (fromFilePath path)
writeJSON hackageRepoLayout p timestampSigned
putInfo $ "✅ Written " <> path
--
-- snapshot.json
--
outDir </> "snapshot.json" %> \path -> do
rootInfo <- computeFileInfoSimple' (outDir </> "root.json")
mirrorsInfo <- computeFileInfoSimple' (outDir </> "mirrors.json")
tarInfo <- computeFileInfoSimple' (outDir </> "01-index.tar")
tarGzInfo <- computeFileInfoSimple' (outDir </> "01-index.tar.gz")
expires <- getExpiryTime GetExpiryTime
let snapshot =
Snapshot
{ snapshotVersion = FileVersion 1,
snapshotExpires = FileExpires (Just expires),
snapshotInfoRoot = rootInfo,
snapshotInfoMirrors = mirrorsInfo,
snapshotInfoTar = Just tarInfo,
snapshotInfoTarGz = tarGzInfo
}
keys <- readKeysAt (keysPath </> "snapshot")
let snapshotSigned = withSignatures hackageRepoLayout keys snapshot
liftIO $ do
p <- makeAbsolute (fromFilePath path)
writeJSON hackageRepoLayout p snapshotSigned
putInfo $ "✅ Written " <> path
--
-- root.json
--
outDir </> "root.json" %> \path -> do
expires <- getExpiryTime GetExpiryTime
privateKeysRoot <- readKeysAt (keysPath </> "root")
privateKeysTarget <- readKeysAt (keysPath </> "target")
privateKeysSnapshot <- readKeysAt (keysPath </> "snapshot")
privateKeysTimestamp <- readKeysAt (keysPath </> "timestamp")
privateKeysMirrors <- readKeysAt (keysPath </> "mirrors")
let root =
Root
{ rootVersion = FileVersion 1,
rootExpires = FileExpires (Just expires),
rootKeys =
fromKeys $
concat
[ privateKeysRoot,
privateKeysTarget,
privateKeysSnapshot,
privateKeysTimestamp,
privateKeysMirrors
],
rootRoles =
RootRoles
{ rootRolesRoot =
RoleSpec
{ roleSpecKeys = map somePublicKey privateKeysRoot,
roleSpecThreshold = KeyThreshold 2
},
rootRolesSnapshot =
RoleSpec
{ roleSpecKeys = map somePublicKey privateKeysSnapshot,
roleSpecThreshold = KeyThreshold 1
},
rootRolesTargets =
RoleSpec
{ roleSpecKeys = map somePublicKey privateKeysTarget,
roleSpecThreshold = KeyThreshold 1
},
rootRolesTimestamp =
RoleSpec
{ roleSpecKeys = map somePublicKey privateKeysTimestamp,
roleSpecThreshold = KeyThreshold 1
},
rootRolesMirrors =
RoleSpec
{ roleSpecKeys = map somePublicKey privateKeysMirrors,
roleSpecThreshold = KeyThreshold 1
}
}
}
keys <- readKeysAt (keysPath </> "root")
let signedRoot = withSignatures hackageRepoLayout keys root
liftIO $ do
p <- makeAbsolute (fromFilePath path)
writeJSON hackageRepoLayout p signedRoot
putInfo $ "✅ Written " <> path
--
-- mirrors.json
--
outDir </> "mirrors.json" %> \path -> do
expires <- getExpiryTime GetExpiryTime
let mirrors =
Mirrors
{ mirrorsVersion = FileVersion 1,
mirrorsExpires = FileExpires (Just expires),
mirrorsMirrors = []
}
keys <- readKeysAt (keysPath </> "mirrors")
let signedMirrors = withSignatures hackageRepoLayout keys mirrors
liftIO $ do
p <- makeAbsolute (fromFilePath path)
writeJSON hackageRepoLayout p signedMirrors
putInfo $ "✅ Written " <> path
--
-- 01-index.tar
--
outDir </> "01-index.tar" %> \path -> do
pkgIds <- getPackages GetPackages
entries <-
fmap concat $
for pkgIds $ \pkgId -> do
let PackageId {pkgName, pkgVersion} = pkgId
SourceMeta {sourceTimestamp, sourceRevisions} <- getSourceMeta (GetSourceMeta pkgId)
srcDir <- getSourceDir (GetSourceDir pkgId)
sequence $
[ -- original cabal file
mkTarEntry
(srcDir </> pkgName <.> "cabal")
(pkgName </> pkgVersion </> pkgName <.> "cabal")
sourceTimestamp,
-- package.json
mkTarEntry
(outDir </> "index" </> pkgName </> pkgVersion </> "package.json")
(pkgName </> pkgVersion </> "package.json")
sourceTimestamp
]
++ [ -- revised cabal files
mkTarEntry
("_sources" </> pkgName </> pkgVersion </> "revisions" </> show revNum <.> "cabal")
(pkgName </> pkgVersion </> pkgName <.> "cabal")
revTimestamp
| RevisionMeta revTimestamp revNum <- sourceRevisions
]
liftIO $ BSL.writeFile path $ Tar.write (sortOn Tar.entryTime entries)
putInfo $ "✅ Written " <> path
--
-- 01-index.tar.gz
--
outDir </> "01-index.tar.gz" %> \path -> do
tar <- readFileByteStringLazy (outDir </> "01-index.tar")
liftIO $ BSL.writeFile path (GZip.compress tar)
putInfo $ "✅ Written " <> path
--
-- index cabal files (latest revision)
--
outDir </> "index/*/*/*.cabal" %> \path -> do
let [_, _, pkgName, pkgVersion, _] = splitDirectories path
let pkgId = PackageId pkgName pkgVersion
-- Figure out where to get it from
meta <- getSourceMeta $ GetSourceMeta pkgId
case latestRevisionNumber meta of
Nothing -> do
srcDir <- getSourceDir (GetSourceDir pkgId)
copyFileChanged (srcDir </> pkgName <.> "cabal") path
Just revNum -> do
let revisionCabal = "_sources" </> pkgName </> pkgVersion </> "revisions" </> show revNum <.> "cabal"
copyFileChanged revisionCabal path
putInfo $ "✅ Written " <> path
--
-- index package files (only depends on the source distribution)
--
outDir </> "index/*/*/package.json" %> \path -> do
let [_, _, pkgName, pkgVersion, _] = splitDirectories path
let packagePath = "package" </> pkgName <> "-" <> pkgVersion <.> "tar.gz"
let targetPath = rootPath $ fromUnrootedFilePath packagePath
targetFileInfo <- computeFileInfoSimple' ("_repo" </> packagePath)
expires <- getExpiryTime GetExpiryTime
let targets =
Targets
{ targetsVersion = FileVersion 1,
targetsExpires = FileExpires (Just expires),
targetsTargets = fromList [(TargetPathRepo targetPath, targetFileInfo)],
targetsDelegations = Nothing
}
keys <- readKeysAt (keysPath </> "target")
let signedTargets = withSignatures hackageRepoLayout keys targets
liftIO $ do
p <- makeAbsolute (fromFilePath path)
writeJSON hackageRepoLayout p signedTargets
putInfo $ "✅ Written " <> path
--
-- source distributions
--
outDir </> "package/*.tar.gz" %> \path -> do
let [_, _, filename] = splitDirectories path
let Just pkgId = parsePkgId <$> stripExtension "tar.gz" filename
srcDir <- getSourceDir (GetSourceDir pkgId)
withTempDir $ \tmpDir -> do
putInfo $ " Creating source distribution for " <> pkgIdToString pkgId
cmd_ Shell (Cwd srcDir) (FileStdout path) ("cabal sdist --ignore-project --output-directory " <> tmpDir)
-- check cabal sdist has produced a single tarball with the
-- expected name
ls <- liftIO $ IO.getDirectoryContents tmpDir
let ls' = filter (not . all (== '.')) ls
case ls' of
[l]
| l == filename ->
cmd_ Shell ["mv", tmpDir </> l, path]
[l]
| l /= filename ->
fail $ "cabal sdist produced a different package. I expected " <> filename <> " but found " <> l
_ ->
fail $ "cabal sdist for " <> pkgIdToString pkgId <> " did not produce a single tarball!"
putInfo $ "✅ Written " <> path
--
-- source tree downloads
--
"_cache/*/.downloaded" %> \path -> do
let [_, hashedUrl, _] = splitDirectories path
let url = fileNameToUrl hashedUrl
let srcDir = takeDirectory path
withTempDir $ \tmpDir -> do
-- Download and extract tarball
putInfo $ "🐢 Downloading " <> url
cmd_ Shell $ "curl --silent -L " <> url <> " | tar xz -C " <> tmpDir
-- Special treatment of top-level directory: we remove it
--
-- Note: Don't let shake look into tmpDir! it will cause
-- unnecessary rework because tmpDir is always new
ls <- liftIO $ IO.getDirectoryContents tmpDir
let ls' = filter (not . all (== '.')) ls
case ls' of
[l] -> cmd_ Shell ["mv", "-T", tmpDir </> l, srcDir]
_ -> cmd_ Shell ["mv", "-T", tmpDir, srcDir]
-- Touch the trigger file
writeFile' path ""
putStrLn $ "💥 All done. The repository is now available in " <> outDir <> "."
mkTarEntry :: FilePath -> [Char] -> UTCTime -> Action Tar.Entry
mkTarEntry filePath indexPath timestamp = do
let Right tarPath = Tar.toTarPath False indexPath
contents <- readFileByteStringLazy filePath
return
(Tar.fileEntry tarPath contents)
{ Tar.entryTime = floor $ Time.utcTimeToPOSIXSeconds timestamp,
Tar.entryOwnership =
Tar.Ownership
{ Tar.ownerName = "foliage",
Tar.groupName = "foliage",
Tar.ownerId = 0,
Tar.groupId = 0
}
}

View File

@ -0,0 +1,8 @@
module Foliage.CmdCreateKeys where
import Foliage.HackageSecurity
cmdCreateKeys :: FilePath -> IO ()
cmdCreateKeys keyPath = do
putStrLn $ "Creating a new set of keys in " <> keyPath
createKeys keyPath

View File

@ -0,0 +1,104 @@
module Foliage.CmdImportHackage
( cmdImportHackage,
)
where
import Codec.Archive.Tar qualified as Tar
import Codec.Archive.Tar.Entry qualified as Tar
import Data.ByteString.Lazy qualified as BSL
import Data.Foldable (for_)
import Data.List (isSuffixOf)
import Data.Map.Strict (Map)
import Data.Map.Strict qualified as M
import Data.Maybe (fromMaybe)
import Data.Time.Clock.POSIX (posixSecondsToUTCTime)
import Foliage.Meta
import Foliage.Options
import Foliage.Package
import System.Directory qualified as IO
import System.Environment
import System.FilePath
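-- Entry point for `foliage import-hackage [--package-name NAME]` (see Foliage.Options).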
cmdImportHackage :: ImportHackageOptions -> IO ()
cmdImportHackage (ImportHackageOptions Nothing) = importHackage (const True)
cmdImportHackage (ImportHackageOptions (Just s)) = importHackage ((== s) . pkgName)
importHackage ::
(PackageId -> Bool) ->
IO ()
importHackage f = do
putStrLn "EXPERIMENTAL. Import the Hackage index from $HOME/.cabal. Make sure you have done `cabal update` recently."
home <- getEnv "HOME"
entries <- Tar.read <$> BSL.readFile (home </> ".cabal/packages/hackage.haskell.org/01-index.tar")
m <- importIndex f entries M.empty
for_ (M.toList m) $ uncurry finalise
importIndex ::
Show e =>
(PackageId -> Bool) ->
Tar.Entries e ->
Map PackageId SourceMeta ->
IO (Map PackageId SourceMeta)
importIndex f (Tar.Next e es) m =
case isCabalFile e of
Just (pkgId, contents, time)
| f pkgId ->
do
putStrLn $ "Found cabal file " ++ pkgIdToString pkgId ++ " with time " ++ show time
m' <-
M.alterF
( \case
-- New package
Nothing ->
pure $
Just $
SourceMeta
{ sourceUrl = pkgIdToHackageUrl pkgId,
sourceTimestamp = time,
sourceSubdir = Nothing,
sourceRevisions = []
}
-- Existing package, new revision
Just sm -> do
let revnum = 1 + fromMaybe 0 (latestRevisionNumber sm)
newRevision = RevisionMeta {revisionNumber = revnum, revisionTimestamp = time}
-- bad performance here but I don't care
let sm' = sm {sourceRevisions = sourceRevisions sm ++ [newRevision]}
let PackageId pkgName pkgVersion = pkgId
let outDir = "_sources" </> pkgName </> pkgVersion </> "revisions"
IO.createDirectoryIfMissing True outDir
BSL.writeFile (outDir </> show revnum <.> "cabal") contents
return $ Just sm'
)
pkgId
m
importIndex f es m'
_ -> importIndex f es m
importIndex _f Tar.Done m =
return m
importIndex _f (Tar.Fail e) _ =
error $ show e
finalise ::
PackageId ->
SourceMeta ->
IO ()
finalise PackageId {pkgName, pkgVersion} meta = do
let dir = "_sources" </> pkgName </> pkgVersion
IO.createDirectoryIfMissing True dir
writeSourceMeta (dir </> "meta.toml") meta
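-- Recognise index entries that are cabal files, yielding the package id,
-- the file contents and the entry timestamp.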
isCabalFile ::
Tar.Entry ->
Maybe (PackageId, BSL.ByteString, UTCTime)
isCabalFile
Tar.Entry
{ Tar.entryTarPath = Tar.fromTarPath -> path,
Tar.entryContent = Tar.NormalFile contents _,
Tar.entryTime = posixSecondsToUTCTime . fromIntegral -> time
}
| ".cabal" `isSuffixOf` path =
let [pkgName, pkgVersion, _] = splitDirectories path
packageId = PackageId pkgName pkgVersion
in Just (packageId, contents, time)
isCabalFile _ = Nothing

View File

@ -1,3 +1,4 @@
{-# LANGUAGE OverloadedStrings #-}
module Foliage.Config
( Config (..),
Source (..),

View File

@ -0,0 +1,58 @@
{-# LANGUAGE FlexibleContexts #-}
module Foliage.HackageSecurity
( module Foliage.HackageSecurity,
module Hackage.Security.Server,
module Hackage.Security.TUF.FileMap,
module Hackage.Security.Key.Env,
module Hackage.Security.Util.Path,
module Hackage.Security.Util.Some,
)
where
import Control.Monad (replicateM_)
import Data.Functor.Identity
import Hackage.Security.Key.Env (fromKeys)
import Hackage.Security.Server
import Hackage.Security.TUF.FileMap
import Hackage.Security.Util.Path (fromFilePath, fromUnrootedFilePath, makeAbsolute, rootPath)
import Hackage.Security.Util.Some
import System.Directory (createDirectoryIfMissing)
import System.FilePath
readJSONSimple :: FromJSON ReadJSON_NoKeys_NoLayout a => FilePath -> IO (Either DeserializationError a)
readJSONSimple fp = do
p <- makeAbsolute (fromFilePath fp)
readJSON_NoKeys_NoLayout p
writeJSONSimple :: ToJSON Identity a => FilePath -> a -> IO ()
writeJSONSimple fp a = do
p <- makeAbsolute (fromFilePath fp)
writeJSON_NoLayout p a
computeFileInfoSimple :: FilePath -> IO FileInfo
computeFileInfoSimple fp = do
p <- makeAbsolute (fromFilePath fp)
computeFileInfo p
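-- Generate Ed25519 keys for each TUF role: three each for root, target and
-- mirrors, one each for timestamp and snapshot.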
createKeys :: FilePath -> IO ()
createKeys base = do
createDirectoryIfMissing True (base </> "root")
replicateM_ 3 $ createKey' KeyTypeEd25519 >>= writeKeyWithId (base </> "root")
createDirectoryIfMissing True (base </> "target")
replicateM_ 3 $ createKey' KeyTypeEd25519 >>= writeKeyWithId (base </> "target")
createDirectoryIfMissing True (base </> "timestamp")
replicateM_ 1 $ createKey' KeyTypeEd25519 >>= writeKeyWithId (base </> "timestamp")
createDirectoryIfMissing True (base </> "snapshot")
replicateM_ 1 $ createKey' KeyTypeEd25519 >>= writeKeyWithId (base </> "snapshot")
createDirectoryIfMissing True (base </> "mirrors")
replicateM_ 3 $ createKey' KeyTypeEd25519 >>= writeKeyWithId (base </> "mirrors")
writeKeyWithId :: FilePath -> Some Key -> IO ()
writeKeyWithId base k =
writeKey (base </> keyIdString (someKeyId k) <.> "json") k
writeKey :: FilePath -> Some Key -> IO ()
writeKey fp key = do
p <- makeAbsolute (fromFilePath fp)
writeJSON_NoLayout p key

86
app/Foliage/Meta.hs Normal file

@ -0,0 +1,86 @@
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DerivingVia #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE PatternSynonyms #-}
module Foliage.Meta
( SourceMeta,
pattern SourceMeta,
sourceTimestamp,
sourceUrl,
sourceSubdir,
sourceRevisions,
readSourceMeta,
writeSourceMeta,
RevisionMeta,
pattern RevisionMeta,
revisionTimestamp,
revisionNumber,
UTCTime,
latestRevisionNumber,
)
where
import Control.Monad (void)
import Data.Time.Format.ISO8601
import Data.Time.LocalTime (utc, utcToZonedTime, zonedTimeToUTC)
import Development.Shake.Classes
import Foliage.Time
import GHC.Generics
import Toml (TomlCodec, (.=))
import Toml qualified
data SourceMeta = SourceMeta' WrapUTCTime String (Maybe String) [RevisionMeta]
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
pattern SourceMeta :: UTCTime -> String -> Maybe String -> [RevisionMeta] -> SourceMeta
pattern SourceMeta {sourceTimestamp, sourceUrl, sourceSubdir, sourceRevisions} =
SourceMeta' (WrapUTCTime sourceTimestamp) sourceUrl sourceSubdir sourceRevisions
sourceMetaCodec :: TomlCodec SourceMeta
sourceMetaCodec =
SourceMeta
<$> timeCodec "timestamp" .= sourceTimestamp
<*> Toml.string "url" .= sourceUrl
<*> Toml.dioptional (Toml.string "subdir") .= sourceSubdir
<*> Toml.list revisionMetaCodec "revisions" .= sourceRevisions
readSourceMeta :: FilePath -> IO SourceMeta
readSourceMeta = Toml.decodeFile sourceMetaCodec
writeSourceMeta :: FilePath -> SourceMeta -> IO ()
writeSourceMeta fp a = void $ Toml.encodeToFile sourceMetaCodec fp a
data RevisionMeta = RevisionMeta' WrapUTCTime Int
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
pattern RevisionMeta :: UTCTime -> Int -> RevisionMeta
pattern RevisionMeta {revisionTimestamp, revisionNumber} =
RevisionMeta' (WrapUTCTime revisionTimestamp) revisionNumber
revisionMetaCodec :: TomlCodec RevisionMeta
revisionMetaCodec =
RevisionMeta
<$> timeCodec "timestamp" .= revisionTimestamp
<*> Toml.int "number" .= revisionNumber
newtype WrapUTCTime = WrapUTCTime {unwrapUTCTime :: UTCTime}
deriving (Show, Eq, Generic)
deriving anyclass (Hashable, NFData)
deriving (ISO8601) via UTCTime
instance Binary WrapUTCTime where
get = iso8601ParseM =<< get
put = put . iso8601Show
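-- Store a UTCTime as a TOML zoned time (rendered in UTC).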
timeCodec :: Toml.Key -> TomlCodec UTCTime
timeCodec key = Toml.dimap (utcToZonedTime utc) zonedTimeToUTC $ Toml.zonedTime key
latestRevisionNumber :: SourceMeta -> Maybe Int
latestRevisionNumber sm =
if null (sourceRevisions sm)
then Nothing
else Just $ maximum $ map revisionNumber (sourceRevisions sm)

View File

@ -1,42 +1,88 @@
module Foliage.Options
( parseOptions,
Options (..),
module Options.Applicative,
( parseCommand,
Command (..),
BuildOptions (..),
ImportHackageOptions (..),
)
where
import Foliage.Time
import Options.Applicative
parseOptions :: IO Options
parseOptions =
execParser $
info
data Command
= CreateKeys FilePath
| Build BuildOptions
| ImportHackage ImportHackageOptions
parseCommand :: IO Command
parseCommand =
customExecParser
(prefs showHelpOnEmpty)
$ info
(optionsParser <**> helper)
( fullDesc
<> progDesc "foliage"
<> header "foliage - a builder for static Hackage repositories"
)
data Options = Options
{ optionsConfig :: FilePath
, optionsKeys :: FilePath
}
optionsParser :: Parser Options
optionsParser :: Parser Command
optionsParser =
Options
hsubparser $
command "create-keys" (info createKeysCommand (progDesc "Create TUF keys"))
<> command "build" (info buildCommand (progDesc "Build repository"))
<> command "import-hackage" (info importHackageCommand (progDesc "Import from Hackage"))
data BuildOptions = BuildOptions FilePath (Maybe UTCTime) FilePath
buildCommand :: Parser Command
buildCommand =
Build
<$> ( BuildOptions
<$> strOption
( long "config"
<> metavar "CONFIG"
<> help "Config file"
<> showDefault
<> value "config.toml"
)
<*> strOption
( long "keys"
<> metavar "KEYS"
<> help "Keys folder"
<> help "TUF keys location"
<> showDefault
<> value "_keys"
)
<*> optional
( option
(maybeReader iso8601ParseM)
( long "current-time"
<> metavar "TIME"
<> help "Set current time"
<> showDefault
)
)
<*> strOption
( long "output-directory"
<> metavar "OUTDIR"
<> help "Repository output directory"
<> showDefault
<> value "_repo"
)
)
createKeysCommand :: Parser Command
createKeysCommand =
CreateKeys
<$> strOption
( long "keys"
<> metavar "KEYS"
<> help "TUF keys location"
<> showDefault
<> value "_keys"
)
newtype ImportHackageOptions = ImportHackageOptions (Maybe String)
importHackageCommand :: Parser Command
importHackageCommand =
ImportHackage . ImportHackageOptions
<$> optional
( strOption
( long "package-name"
<> metavar "NAME"
<> help "Filter for package name"
)
)

37
app/Foliage/Package.hs Normal file

@ -0,0 +1,37 @@
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DerivingStrategies #-}
module Foliage.Package
( PackageId (..),
parsePkgId,
pkgIdToTarGzName,
pkgIdToString,
pkgIdToHackageUrl,
)
where
import Data.Bifunctor
import Data.Tuple
import Development.Shake.Classes
import GHC.Generics
import System.FilePath
data PackageId = PackageId {pkgName :: String, pkgVersion :: String}
deriving (Eq, Ord, Show, Generic)
deriving anyclass (Binary, NFData, Hashable)
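-- Split e.g. "typed-protocols-0.1.0.0" into name and version at the last dash.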
parsePkgId :: String -> PackageId
parsePkgId fn = PackageId (init pn) pv
where
(pn, pv) = swap $ bimap reverse reverse $ break (== '-') $ reverse fn
pkgIdToTarGzName :: PackageId -> FilePath
pkgIdToTarGzName pkgId = pkgIdToString pkgId <.> "tar.gz"
pkgIdToString :: PackageId -> String
pkgIdToString (PackageId name version) = name <> "-" <> version
pkgIdToHackageUrl :: PackageId -> String
pkgIdToHackageUrl pkgId =
"https://hackage.haskell.org/package" </> pkgIdToString pkgId </> pkgIdToString pkgId <.> "tar.gz"

View File

@ -1,33 +0,0 @@
module Foliage.RepoToolWrapper
( bootstrapRepo,
createKeys,
updateRepo,
)
where
import Hackage.Security.RepoTool.Layout.Keys
import Hackage.Security.RepoTool.Main qualified as RepoTool
import Hackage.Security.RepoTool.Options
import Hackage.Security.RepoTool.Paths
import Hackage.Security.Server
import Hackage.Security.Util.Path
updateRepo :: FilePath -> FilePath -> IO ()
updateRepo keysDir repoDir = do
keysLoc <- KeysLoc <$> makeAbsolute (fromFilePath keysDir)
repoLoc <- RepoLoc <$> makeAbsolute (fromFilePath repoDir)
RepoTool.bootstrapOrUpdate (globalOpts $ Update keysLoc repoLoc) keysLoc repoLoc False
bootstrapRepo :: FilePath -> FilePath -> IO ()
bootstrapRepo keysDir repoDir = do
keysLoc <- KeysLoc <$> makeAbsolute (fromFilePath keysDir)
repoLoc <- RepoLoc <$> makeAbsolute (fromFilePath repoDir)
RepoTool.bootstrapOrUpdate (globalOpts $ Bootstrap keysLoc repoLoc) keysLoc repoLoc True
createKeys :: FilePath -> IO ()
createKeys keysDir = do
keysLoc <- KeysLoc <$> makeAbsolute (fromFilePath keysDir)
RepoTool.createKeys (globalOpts $ CreateKeys keysLoc) keysLoc
globalOpts :: Command -> GlobalOpts
globalOpts = GlobalOpts defaultKeysLayout hackageRepoLayout hackageIndexLayout True 1 10

34
app/Foliage/Shake.hs Normal file

@ -0,0 +1,34 @@
module Foliage.Shake
( computeFileInfoSimple',
readFileByteStringLazy,
readKeysAt,
readSourceMeta',
)
where
import Data.ByteString.Lazy qualified as BSL
import Data.Traversable (for)
import Development.Shake
import Development.Shake.FilePath
import Foliage.HackageSecurity
import Foliage.Meta
computeFileInfoSimple' :: FilePath -> Action FileInfo
computeFileInfoSimple' fp = do
need [fp] >> liftIO (computeFileInfoSimple fp)
readFileByteStringLazy :: FilePath -> Action BSL.ByteString
readFileByteStringLazy x = need [x] >> liftIO (BSL.readFile x)
readKeysAt :: FilePath -> Action [Some Key]
readKeysAt base = do
paths <- getDirectoryFiles base ["*.private"]
need $ map (base </>) paths
for paths $ \path -> do
Right key <- liftIO $ readJSONSimple (base </> path)
pure key
readSourceMeta' :: FilePath -> Action SourceMeta
readSourceMeta' fp = do
need [fp]
liftIO $ readSourceMeta fp

View File

@ -0,0 +1,51 @@
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DerivingStrategies #-}
{-# LANGUAGE TypeFamilies #-}
module Foliage.Shake.Oracle
( UTCTime,
GetCurrentTime (..),
GetExpiryTime (..),
GetSourceMeta (..),
GetPackages (..),
GetSourceDir (..),
)
where
import Data.Time.Compat ()
import Development.Shake (RuleResult)
import Development.Shake.Classes (Binary, Hashable, NFData)
import Foliage.Meta
import Foliage.Package
import GHC.Generics (Generic)
data GetCurrentTime = GetCurrentTime
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
type instance RuleResult GetCurrentTime = UTCTime
data GetExpiryTime = GetExpiryTime
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
type instance RuleResult GetExpiryTime = UTCTime
data GetPackages = GetPackages
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
type instance RuleResult GetPackages = [PackageId]
newtype GetSourceMeta = GetSourceMeta PackageId
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
type instance RuleResult GetSourceMeta = SourceMeta
newtype GetSourceDir = GetSourceDir PackageId
deriving (Show, Eq, Generic)
deriving anyclass (Binary, Hashable, NFData)
type instance RuleResult GetSourceDir = FilePath

27
app/Foliage/Time.hs Normal file

@ -0,0 +1,27 @@
{-# LANGUAGE TypeApplications #-}
{-# OPTIONS_GHC -Wno-orphans #-}
module Foliage.Time
( iso8601ParseM,
iso8601Show,
getCurrentTime,
UTCTime (..),
utcTimeToPOSIXSeconds,
addUTCTime,
nominalDay,
truncateSeconds,
)
where
import Data.Time
import Data.Time.Clock.POSIX
import Data.Time.Compat ()
import Data.Time.Format.ISO8601
import Development.Shake.Classes
instance Binary UTCTime where
get = iso8601ParseM =<< get
put = put . iso8601Show
truncateSeconds :: UTCTime -> UTCTime
truncateSeconds = posixSecondsToUTCTime . fromIntegral @Int . floor . utcTimeToPOSIXSeconds

14
app/Foliage/Utils.hs Normal file

@ -0,0 +1,14 @@
module Foliage.Utils
( urlToFileName,
fileNameToUrl,
)
where
import Data.Text qualified as T
import Data.Text.Encoding.Base64.URL qualified as T
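-- Encode a URL as a filesystem-safe name using unpadded base64url; fileNameToUrl reverses it.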
urlToFileName :: String -> FilePath
urlToFileName = T.unpack . T.encodeBase64Unpadded . T.pack
fileNameToUrl :: FilePath -> String
fileNameToUrl = T.unpack . T.decodeBase64Lenient . T.pack

View File

@ -1,142 +1,14 @@
{-# LANGUAGE ImportQualifiedPost #-}
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE OverloadedStrings #-}
{-# OPTIONS_GHC -Wno-unrecognised-pragmas #-}
module Main where
import Control.Monad
import Control.Monad.IO.Class (MonadIO)
import Data.List (isPrefixOf)
import Data.Maybe (catMaybes)
import Data.Text (Text)
import Data.Text qualified as T
import Foliage.Config (Config (..), Source (..), readConfig)
import Foliage.Options (Options (..), parseOptions)
import Foliage.RepoToolWrapper
import Shelly
import System.FilePath (replaceDirectory, stripExtension, takeFileName)
import System.IO (hPutStrLn, stderr)
cabal :: Text -> [Text] -> Sh ()
cabal = command1_ "cabal" []
import Foliage.CmdBuild
import Foliage.CmdCreateKeys
import Foliage.CmdImportHackage
import Foliage.Options
main :: IO ()
main = do
Options {optionsConfig, optionsKeys} <- parseOptions
eConfig <- readConfig optionsConfig
case eConfig of
Left e ->
hPutStrLn stderr e
Right config ->
makeRepository (configSources config) optionsKeys
makeRepository :: MonadIO m => [Source] -> FilePath -> m ()
makeRepository sources keysPath = shelly $ do
outDir <- absPath "_repo"
idxDir <- absPath "_repo/index"
pkgDir <- absPath "_repo/package"
-- clean repository directory
rm_rf outDir
mkdir outDir
mkdir pkgDir
keysDir <- absPath keysPath
ensureKeys keysDir
forM_ sources $ processSource pkgDir
echo "Bootstrapping repository"
liftIO $ bootstrapRepo keysDir outDir
revisions <-
fmap catMaybes $ do
pkgs <- ls pkgDir
forM pkgs $ \pkg -> do
let Just pkgId = stripExtension ".tar.gz" pkg
let revisionPath = replaceDirectory (pkgId <.> "cabal") "revisions"
hasRevision <- test_e revisionPath
return $
if hasRevision
then Just revisionPath
else Nothing
forM_ revisions $ \revisionFilePath -> do
let Just pkgId = stripExtension ".cabal" $ takeFileName revisionFilePath
let (pn, pv) = parsePkgId (toTextIgnore pkgId)
let cabalFilePath = idxDir </> pn </> pv </> (pn <.> "cabal")
echo $ toTextIgnore cabalFilePath
echo $ "Adopting revision " <> toTextIgnore revisionFilePath
cp revisionFilePath cabalFilePath
unless (null revisions) $ do
echo "Updating index after applying revisions"
liftIO $ updateRepo keysDir outDir
echo $ "Hackage repository built in " <> toTextIgnore outDir
parsePkgId :: Text -> (Text, Text)
parsePkgId t = (T.init pn, pv)
where
(pn, pv) = T.breakOnEnd "-" t
ensureKeys :: FilePath -> Sh ()
ensureKeys keysDir = do
b <- test_d keysDir
if b
then echo $ "Using existing keys in " <> toTextIgnore keysDir
else do
echo $ "Creating new repository keys in " <> toTextIgnore keysDir
liftIO $ createKeys keysDir
processSource :: FilePath -> Source -> Sh ()
processSource pkgDir (Source url subdirs) = do
echo $ "Processing " <> url
withTmpDir $ \tmpDir -> do
bash_ "curl" ["--silent", "-L", url, " | tar xz -C ", toTextIgnore tmpDir]
dir <- skipSingleDirectory tmpDir
chdir dir $ do
removeCabalProjectFiles
case subdirs of
[] ->
sdistWithProtection pkgDir
_ ->
forM_ subdirs $ \subdir ->
chdir (fromText subdir) $
sdistWithProtection pkgDir
sdistWithProtection :: FilePath -> Sh ()
sdistWithProtection pkgDir =
withTmpDir $ \tmpDir -> do
print_stdout False $ cabal "sdist" ["-o", toTextIgnore tmpDir]
[sdistPath] <- ls tmpDir
let destPath = replaceDirectory sdistPath pkgDir
-- this is a bit rude
False <- test_e destPath
echo $ "written " <> toTextIgnore destPath
cp sdistPath destPath
removeCabalProjectFiles :: Sh ()
removeCabalProjectFiles = do
cpfs <- findWhen (\p -> pure $ "./cabal.project" `isPrefixOf` p) "."
forM_ cpfs $ \p -> do
echo $ "removing " <> toTextIgnore p
rm p
skipSingleDirectory :: FilePath -> Sh FilePath
skipSingleDirectory dir = do
es <- ls dir
case es of
[e] -> do
b <- test_d e
return $
if b
then dir </> e
else dir
_ -> return dir
putStrLn "🌿 Foliage"
parseCommand >>= \case
CreateKeys path -> cmdCreateKeys path
Build buildOpts -> cmdBuild buildOpts
ImportHackage importHackageOpts -> cmdImportHackage importHackageOpts

View File

@ -1,6 +0,0 @@
packages:
./foliage.cabal
./hackage-repo-tool-0.1.1.2/hackage-repo-tool.cabal
allow-newer:
hackage-repo-tool:*

View File

@ -1,27 +1,45 @@
cabal-version: 2.4
name: foliage
version: 0.1.0.0
version: 0.4.0.0
maintainer: andrea@andreabedini.com
author: Andrea Bedini
executable foliage
main-is: Main.hs
other-modules: Foliage.Config
, Foliage.Options
, Foliage.RepoToolWrapper
hs-source-dirs: app
other-modules:
Foliage.CmdCreateKeys
Foliage.CmdBuild
Foliage.CmdImportHackage
Foliage.HackageSecurity
Foliage.Meta
Foliage.Options
Foliage.Package
Foliage.Shake
Foliage.Shake.Oracle
Foliage.Time
Foliage.Utils
default-language: Haskell2010
default-extensions:
OverloadedStrings
ImportQualifiedPost
ImportQualifiedPost LambdaCase NamedFieldPuns ViewPatterns
ghc-options: -Wall
build-depends:
base >=4.14.3.0 && <4.15,
aeson >=2.0.3.0 && <2.1,
base64 >=0.4.2.3 && <0.5,
bytestring >=0.10.12.0 && <0.11,
containers >=0.6.5.1 && <0.7,
cryptohash-sha256 >=0.11.102.1 && <0.12,
directory >=1.3.6.0 && <1.4,
filepath >=1.4.2.1 && <1.5,
tomland >=1.3.3.1 && <1.4,
hackage-security >=0.6.2.1 && <0.7,
optparse-applicative >=0.17.0.0 && <0.18,
shelly >=1.10.0 && <1.11,
shake >=0.19.6 && <0.20,
tar >=0.5.1.1 && <0.6,
text >=1.2.4.1 && <1.3,
hackage-security,
hackage-repo-tool
ghc-options: -Wall
time >=1.9.3 && <1.10,
time-compat >=1.9.6.1 && <1.10,
tomland >=1.3.3.1 && <1.4,
zlib >=0.6.2.3 && <0.7

View File

@ -1,22 +0,0 @@
0.1.1.2
-------
* Compat release for `hackage-security-0.6`
0.1.1.1
-------
* Make `hackage-repo-tool` buildable on Windows (#175)
* Fix build-failure w/ `directory-1.2`
0.1.1
-----
* Update for `hackage-security-0.5`
0.1.0.1
-------
* Add missing module to other-modules (#100)
* Allow for `hackage-security-0.3`
* Include ChangeLog.md in sdist (#98)
0.1.0.0
-------
* Initial (beta) release

View File

@ -1,30 +0,0 @@
Copyright (c) 2015, Well-Typed LLP
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
* Neither the name of Well-Typed LLP nor the names of other
contributors may be used to endorse or promote products derived
from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

View File

@ -1,98 +0,0 @@
# `hackage-repo-tool`: Manage secure file-based package repositories
**Please refer to the [package description](https://hackage.haskell.org/package/hackage-repo-tool#description) for an overview of `hackage-repo-tool`, TUF and `hackage-security`.**
## Setting up a secure file-based repo
A file-based repository (as opposed to one running the actual [Hackage software](https://hackage.haskell.org/package/hackage-server)) is much easier to set up and will suffice for many purposes. Note that such a local file-based package repository can be turned into a remotely accessible secure package repository by any HTTP server supporting static file serving such as Nginx or Apache httpd.
1. Create a directory `~/my-secure-repo` containing a single
subdirectory `~/my-secure-repo/package`. Put whatever packages you
want to make available from your repo in this subdirectory. At this
point your repository might look like
~/my-secure-repo/package/basic-sop-0.1.0.5.tar.gz
~/my-secure-repo/package/generics-sop-0.1.1.1.tar.gz
~/my-secure-repo/package/generics-sop-0.1.1.2.tar.gz
~/my-secure-repo/package/json-sop-0.1.0.4.tar.gz
~/my-secure-repo/package/lens-sop-0.1.0.2.tar.gz
~/my-secure-repo/package/pretty-sop-0.1.0.1.tar.gz
~/my-secure-repo/package/HsYAML-0.2.1.0.tar.gz
*(Due to [#174](https://github.com/haskell/hackage-security/issues/174) this folder must contain at least one package tarball or `hackage-repo-tool` will fail in non-obvious ways)*
Note the flat directory structure: different packages and
different versions of those packages all live in the one
directory.
2. Create public and private keys:
# hackage-repo-tool create-keys \
--keys ~/my-private-keys
This will create a directory structure such as
~/my-private-keys/mirrors/id01.private
~/my-private-keys/mirrors/..
~/my-private-keys/root/id04.private
~/my-private-keys/root/..
~/my-private-keys/snapshot/id07.private
~/my-private-keys/target/id08.private
~/my-private-keys/target/..
~/my-private-keys/timestamp/id11.private
containing keys for all the various TUF roles.
Note that these keys are stored outside of the repository proper.
3. Create the initial TUF metadata and construct an index using
# hackage-repo-tool bootstrap \
--repo ~/my-secure-repo \
--keys ~/my-private-keys
This will create a directory `~/my-secure-repo/index` containing the
`.cabal` files (extracted from the package tarballs) and TUF
metadata for all packages
~/my-secure-repo/index/basic-sop/0.1.0.5/basic-sop.cabal
~/my-secure-repo/index/basic-sop/0.1.0.5/package.json
~/my-secure-repo/index/generics-sop/0.1.1.1/generics-sop.cabal
~/my-secure-repo/index/generics-sop/0.1.1.1/package.json
...
and package the contents of that directory up as the index tarball
`~/my-secure-repo/00-index.tar.gz`; it will also create the
top-level metadata files
~/my-secure-repo/mirrors.json
~/my-secure-repo/root.json
~/my-secure-repo/snapshot.json
~/my-secure-repo/timestamp.json
4. The timestamp and snapshot are valid for three days, so you will
need to resign these files regularly using
# hackage-repo-tool update \
--repo ~/my-secure-repo \
--keys ~/my-private-keys
You can use the same command whenever you add any additional
packages to your repository.
5. If you now make this directory available (for instance, by pointing
Apache httpd at it) you'll be able to use `cabal` to access it remotely by
defining an appropriate [`repository` stanza](https://www.haskell.org/cabal/users-guide/installing-packages.html#using-secure-repositories):
repository my-secure-repo
url: http://packages.example.org/
secure: True
root-keys: 2ae741f4c4a5f70ed6e6c48762e0d7a493d8dd265e9cbc6c4037dfc7ceaec70e
32d3db5b4403935c0baf52a2bcb05031784a971ee2d43587288776f2e90609db
eed36d2bb15f94628221cde558e99c4e1ad36fd243fe3748e1ee7ad00eb9d628
key-threshold: 2
Note that the keys in example above must be replaced: You need to
copy the root key IDs from your generated `root.json` file (or you
can set `key-threshold` to 0 if you're aware of the security
implications)

View File

@ -1,2 +0,0 @@
import Distribution.Simple
main = defaultMain

View File

@ -1,98 +0,0 @@
cabal-version: 1.12
name: hackage-repo-tool
version: 0.1.1.2
x-revision: 2
build-type: Simple
synopsis: Manage secure file-based package repositories
description: This utility can be used to manage secure file-based package
repositories (creating [TUF](https://theupdateframework.github.io/)
metadata as well as a Hackage index tarball) which can be used by
clients such as [cabal-install](http://hackage.haskell.org/package/cabal-install).
Currently it also provides various lower level utilities for creating
and signing TUF files.
.
This is part of the [Hackage Security](https://github.com/haskell/hackage-security#readme)
infrastructure.
license: BSD3
license-file: LICENSE
author: Edsko de Vries
maintainer: cabal-devel@haskell.org
copyright: Copyright 2015 Well-Typed LLP
category: Distribution
homepage: https://github.com/haskell/hackage-security
bug-reports: https://github.com/haskell/hackage-security/issues
tested-with: GHC==8.8.1, GHC==8.6.5, GHC==8.4.4, GHC==8.2.2, GHC==8.0.2,
GHC==7.10.3, GHC==7.8.4, GHC==7.6.3, GHC==7.4.2
extra-source-files:
ChangeLog.md README.md
source-repository head
type: git
location: https://github.com/haskell/hackage-security.git
flag use-network-uri
description: Are we using @network-uri@?
manual: False
flag use-old-time
description: Are we using @old-time@?
manual: False
library
hs-source-dirs: src
exposed-modules: Hackage.Security.RepoTool.Main
Hackage.Security.RepoTool.Options
Hackage.Security.RepoTool.Layout
Hackage.Security.RepoTool.Layout.Keys
Hackage.Security.RepoTool.Paths
Hackage.Security.RepoTool.Util.IO
-- For boot libraries we try to accomodate the versions bundled with
-- the respective GHC release
build-depends: base >= 4.5 && < 4.15,
bytestring >= 0.9 && < 0.12,
Cabal >= 1.14 && < 1.26
|| >= 2.0 && < 2.6
|| >= 3.0 && < 3.4,
directory >= 1.1 && < 1.4,
filepath >= 1.3 && < 1.5,
time >= 1.4 && < 1.10
if !os(windows)
build-depends: unix >= 2.5 && < 2.8
if flag(use-old-time)
build-depends: directory < 1.2
, old-time == 1.1.*
else
build-depends: directory >= 1.2
-- For non-boot libraries we try to support single major versions
-- to reduce the risk of semantic differences
build-depends: microlens >= 0.4.11.2 && < 0.5,
optparse-applicative >= 0.15.1 && < 0.16,
tar >= 0.5 && < 0.6,
zlib >= 0.6 && < 0.7,
hackage-security >= 0.6 && < 0.7
-- see comments in hackage-security.cabal
if flag(use-network-uri)
build-depends: network-uri >= 2.6 && < 2.7,
network >= 2.6 && < 2.9
|| >= 3.0 && < 3.2
else
build-depends: network >= 2.5 && < 2.6
default-language: Haskell2010
default-extensions: DeriveDataTypeable
FlexibleContexts
FlexibleInstances
NoMonomorphismRestriction
ScopedTypeVariables
StandaloneDeriving
RecordWildCards
ghc-options: -Wall

View File

@ -1,89 +0,0 @@
-- | Layout of the local repository as managed by this tool
--
-- The local repository follows a RepoLayout exactly, but adds some additional
-- files. In addition, we also manage a directory of keys (although this will
-- eventually need to be replaced with a proper key management system).
module Hackage.Security.RepoTool.Layout (
-- * Additional paths in the repository
repoLayoutIndexDir
-- * Layout-parametrized version of TargetPath
, TargetPath'(..)
, prettyTargetPath'
, applyTargetPath'
-- * Utility
, anchorIndexPath
, anchorRepoPath
, anchorKeyPath
, anchorTargetPath'
) where
import Distribution.Package
import Hackage.Security.Client
import Hackage.Security.Util.Path
import Hackage.Security.Util.Pretty
import Hackage.Security.RepoTool.Layout.Keys
import Hackage.Security.RepoTool.Options
import Hackage.Security.RepoTool.Paths
{-------------------------------------------------------------------------------
Additional paths specifically to the kind of repository this tool manages
-------------------------------------------------------------------------------}
-- | Directory containing the unpacked index
--
-- Since the layout of the tarball may not match the layout of the index,
-- we create a local directory with the unpacked contents of the index.
repoLayoutIndexDir :: RepoLayout -> RepoPath
repoLayoutIndexDir _ = rootPath $ fragment "index"
{-------------------------------------------------------------------------------
TargetPath'
-------------------------------------------------------------------------------}
-- | This is a variation on 'TargetPath' parameterized by layout
data TargetPath' =
InRep (RepoLayout -> RepoPath)
| InIdx (IndexLayout -> IndexPath)
| InRepPkg (RepoLayout -> PackageIdentifier -> RepoPath) PackageIdentifier
| InIdxPkg (IndexLayout -> PackageIdentifier -> IndexPath) PackageIdentifier
prettyTargetPath' :: GlobalOpts -> TargetPath' -> String
prettyTargetPath' opts = pretty . applyTargetPath' opts
-- | Apply the layout
applyTargetPath' :: GlobalOpts -> TargetPath' -> TargetPath
applyTargetPath' GlobalOpts{..} targetPath =
case targetPath of
InRep file -> TargetPathRepo $ file globalRepoLayout
InIdx file -> TargetPathIndex $ file globalIndexLayout
InRepPkg file pkgId -> TargetPathRepo $ file globalRepoLayout pkgId
InIdxPkg file pkgId -> TargetPathIndex $ file globalIndexLayout pkgId
{-------------------------------------------------------------------------------
Utility
-------------------------------------------------------------------------------}
-- | Anchor a tarball path to the repo (see 'repoLayoutIndex')
anchorIndexPath :: GlobalOpts -> RepoLoc -> (IndexLayout -> IndexPath) -> Path Absolute
anchorIndexPath opts@GlobalOpts{..} repoLoc file =
anchorRepoPath opts repoLoc repoLayoutIndexDir
</> unrootPath (file globalIndexLayout)
anchorRepoPath :: GlobalOpts -> RepoLoc -> (RepoLayout -> RepoPath) -> Path Absolute
anchorRepoPath GlobalOpts{..} (RepoLoc repoLoc) file =
anchorRepoPathLocally repoLoc $ file globalRepoLayout
anchorKeyPath :: GlobalOpts -> KeysLoc -> (KeysLayout -> KeyPath) -> Path Absolute
anchorKeyPath GlobalOpts{..} (KeysLoc keysLoc) dir =
keysLoc </> unrootPath (dir globalKeysLayout)
anchorTargetPath' :: GlobalOpts -> RepoLoc -> TargetPath' -> Path Absolute
anchorTargetPath' opts repoLoc = go
where
go :: TargetPath' -> Path Absolute
go (InRep file) = anchorRepoPath opts repoLoc file
go (InIdx file) = anchorIndexPath opts repoLoc file
go (InRepPkg file pkgId) = anchorRepoPath opts repoLoc (`file` pkgId)
go (InIdxPkg file pkgId) = anchorIndexPath opts repoLoc (`file` pkgId)

View File

@ -1,44 +0,0 @@
-- | Layout of the directory containing the (private) keys
module Hackage.Security.RepoTool.Layout.Keys (
-- * Layout of the keys directory
KeysLayout(..)
, defaultKeysLayout
, keysLayoutKey
) where
import Hackage.Security.Client
import Hackage.Security.Util.Path
import Hackage.Security.Util.Some
import Hackage.Security.RepoTool.Paths
-- | Layout of the keys directory
--
-- Specifies the directories containing the keys (relative to the keys loc),
-- as well as the filename for individual keys.
data KeysLayout = KeysLayout {
keysLayoutRoot :: KeyPath
, keysLayoutTarget :: KeyPath
, keysLayoutTimestamp :: KeyPath
, keysLayoutSnapshot :: KeyPath
, keysLayoutMirrors :: KeyPath
, keysLayoutKeyFile :: Some Key -> Path Unrooted
}
defaultKeysLayout :: KeysLayout
defaultKeysLayout = KeysLayout {
keysLayoutRoot = rp $ fragment "root"
, keysLayoutTarget = rp $ fragment "target"
, keysLayoutTimestamp = rp $ fragment "timestamp"
, keysLayoutSnapshot = rp $ fragment "snapshot"
, keysLayoutMirrors = rp $ fragment "mirrors"
, keysLayoutKeyFile = \key -> let kId = keyIdString (someKeyId key)
in fragment kId <.> "private"
}
where
rp :: Path Unrooted -> KeyPath
rp = rootPath
keysLayoutKey :: (KeysLayout -> KeyPath) -> Some Key -> KeysLayout -> KeyPath
keysLayoutKey dir key keysLayout@KeysLayout{..} =
dir keysLayout </> keysLayoutKeyFile key

View File

@ -1,687 +0,0 @@
{-# LANGUAGE CPP #-}
module Hackage.Security.RepoTool.Main where
import Control.Exception
import Control.Monad
import Data.List (nub)
import Data.Maybe (catMaybes, mapMaybe)
import Data.Time
import GHC.Conc.Sync (setUncaughtExceptionHandler)
import Network.URI (URI)
import System.Exit
import qualified Data.ByteString.Lazy as BS.L
import qualified Lens.Micro as Lens
import qualified System.FilePath as FilePath
#ifndef mingw32_HOST_OS
import System.IO.Error (isAlreadyExistsError)
#endif
import System.IO.Error (isDoesNotExistError)
-- Cabal
import Distribution.Package
import Distribution.Text
-- hackage-security
import Hackage.Security.Server
import Hackage.Security.Util.Some
import Hackage.Security.Util.Path
import Hackage.Security.Util.Pretty
import qualified Hackage.Security.Key.Env as KeyEnv
import qualified Hackage.Security.TUF.FileMap as FileMap
import Text.JSON.Canonical (JSValue)
-- hackage-repo-tool
import Hackage.Security.RepoTool.Options
import Hackage.Security.RepoTool.Layout
import Hackage.Security.RepoTool.Layout.Keys
import Hackage.Security.RepoTool.Paths
import Hackage.Security.RepoTool.Util.IO
{-------------------------------------------------------------------------------
Main application driver
-------------------------------------------------------------------------------}
main :: IO ()
main = do
setUncaughtExceptionHandler topLevelExceptionHandler
opts@GlobalOpts{..} <- getOptions
case globalCommand of
CreateKeys keysLoc ->
createKeys opts keysLoc
Bootstrap keysLoc repoLoc ->
bootstrapOrUpdate opts keysLoc repoLoc True
Update keysLoc repoLoc ->
bootstrapOrUpdate opts keysLoc repoLoc False
CreateRoot keysLoc rootLoc ->
createRoot opts keysLoc rootLoc
CreateMirrors keysLoc mirrorsLoc mirrors ->
createMirrors opts keysLoc mirrorsLoc mirrors
#ifndef mingw32_HOST_OS
SymlinkCabalLocalRepo repoLoc cabalRepoLoc ->
symlinkCabalLocalRepo opts repoLoc cabalRepoLoc
#endif
Sign keys deleteExisting file ->
signFile keys deleteExisting file
-- | Top-level exception handler that uses 'displayException'
--
-- Although base 4.8 introduces 'displayException', the top-level exception
-- handler still uses 'show', sadly. See "PROPOSAL: Add displayException to
-- Exception typeclass" thread on the libraries mailing list.
--
-- NOTE: This is a terrible hack. See the above thread for some insights into
-- how we should do this better. For now it will do however.
topLevelExceptionHandler :: SomeException -> IO ()
topLevelExceptionHandler e = do
putStrLn $ displayException e
exitFailure
#if !MIN_VERSION_base(4,8,0)
displayException :: Exception e => e -> String
displayException = show
#endif
{-------------------------------------------------------------------------------
Creating keys
-------------------------------------------------------------------------------}
createKeys :: GlobalOpts -> KeysLoc -> IO ()
createKeys opts keysLoc = do
privateRoot <- replicateM 3 $ createKey' KeyTypeEd25519
privateTarget <- replicateM 3 $ createKey' KeyTypeEd25519
privateTimestamp <- replicateM 1 $ createKey' KeyTypeEd25519
privateSnapshot <- replicateM 1 $ createKey' KeyTypeEd25519
privateMirrors <- replicateM 3 $ createKey' KeyTypeEd25519
writeKeys opts keysLoc PrivateKeys{..}
{-------------------------------------------------------------------------------
Dealing with (private) keys
-------------------------------------------------------------------------------}
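-- | The private keys for each TUF role, as read from or written to the key store.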
data PrivateKeys = PrivateKeys {
privateRoot :: [Some Key]
, privateTarget :: [Some Key]
, privateTimestamp :: [Some Key]
, privateSnapshot :: [Some Key]
, privateMirrors :: [Some Key]
}
readKeys :: GlobalOpts -> KeysLoc -> IO PrivateKeys
readKeys opts keysLoc =
PrivateKeys <$> readKeysAt opts keysLoc keysLayoutRoot
<*> readKeysAt opts keysLoc keysLayoutTarget
<*> readKeysAt opts keysLoc keysLayoutTimestamp
<*> readKeysAt opts keysLoc keysLayoutSnapshot
<*> readKeysAt opts keysLoc keysLayoutMirrors
writeKeys :: GlobalOpts -> KeysLoc -> PrivateKeys -> IO ()
writeKeys opts keysLoc PrivateKeys{..} = do
forM_ privateRoot $ writeKey opts keysLoc keysLayoutRoot
forM_ privateTarget $ writeKey opts keysLoc keysLayoutTarget
forM_ privateTimestamp $ writeKey opts keysLoc keysLayoutTimestamp
forM_ privateSnapshot $ writeKey opts keysLoc keysLayoutSnapshot
forM_ privateMirrors $ writeKey opts keysLoc keysLayoutMirrors
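-- | Read all keys in one subdirectory of the key store, warning about and
-- skipping any file that does not parse as a key.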
readKeysAt :: GlobalOpts -> KeysLoc -> (KeysLayout -> KeyPath) -> IO [Some Key]
readKeysAt opts keysLoc subDir = catMaybes <$> do
entries <- getDirectoryContents absPath
forM entries $ \entry -> do
let path = absPath </> entry
mKey <- readJSON_NoKeys_NoLayout path
case mKey of
Left _err -> do logWarn opts $ "Skipping unrecognized " ++ pretty path
return Nothing
Right key -> return $ Just key
where
absPath = anchorKeyPath opts keysLoc subDir
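-- | Write a single private key to its location in the key store.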
writeKey :: GlobalOpts -> KeysLoc -> (KeysLayout -> KeyPath) -> Some Key -> IO ()
writeKey opts@GlobalOpts{..} keysLoc subDir key = do
logInfo opts $ "Writing " ++ pretty (relPath globalKeysLayout)
createDirectoryIfMissing True (takeDirectory absPath)
writeJSON_NoLayout absPath key
where
relPath = keysLayoutKey subDir key
absPath = anchorKeyPath opts keysLoc relPath
{-------------------------------------------------------------------------------
Creating individual files
We translate absolute paths to the repo layout to fit in with the rest of the infrastructure.
-------------------------------------------------------------------------------}
createRoot :: GlobalOpts -> KeysLoc -> Path Absolute -> IO ()
createRoot opts@GlobalOpts{..} keysLoc rootLoc = do
keys <- readKeys opts keysLoc
now <- getCurrentTime
updateRoot opts { globalRepoLayout = layout }
repoLoc
WriteUpdate
keys
now
where
repoLoc = RepoLoc $ takeDirectory rootLoc
layout = globalRepoLayout {
repoLayoutRoot = rootFragment $ takeFileName rootLoc
}
createMirrors :: GlobalOpts -> KeysLoc -> Path Absolute -> [URI] -> IO ()
createMirrors opts@GlobalOpts{..} keysLoc mirrorsLoc mirrors = do
keys <- readKeys opts keysLoc
now <- getCurrentTime
updateMirrors opts { globalRepoLayout = layout }
repoLoc
WriteUpdate
keys
now
mirrors
where
repoLoc = RepoLoc $ takeDirectory mirrorsLoc
layout = globalRepoLayout {
repoLayoutMirrors = rootFragment $ takeFileName mirrorsLoc
}
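-- | Interpret a single file name as a path relative to the repository root.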
rootFragment :: String -> RepoPath
rootFragment = rootPath . fragment
{-------------------------------------------------------------------------------
Bootstrapping / updating
  TODO: Some of this functionality should be moved to
  @Hackage.Security.Server.*@ (to be shared by, say, both Hackage and
  secure-local), but I'm not sure precisely in what form yet.
-------------------------------------------------------------------------------}
bootstrapOrUpdate :: GlobalOpts -> KeysLoc -> RepoLoc -> Bool -> IO ()
bootstrapOrUpdate opts@GlobalOpts{..} keysLoc repoLoc isBootstrap = do
-- Collect info
keys <- readKeys opts keysLoc
now <- getCurrentTime
pkgs <- findPackages opts repoLoc
-- Sanity check
repoLayoutOk <- checkRepoLayout opts repoLoc pkgs
unless repoLayoutOk $
throwIO $ userError "Unexpected repository layout"
    -- We overwrite files during the bootstrap process, but update them only
    -- if necessary during an update. Note that we _only_ write the updated
    -- files to the tarball, so if the user deletes the tarball and then calls
    -- update (rather than bootstrap), the tarball will be missing files.
let whenWrite = if isBootstrap
then WriteInitial
else WriteUpdate
-- If doing bootstrap: create root and mirrors
when isBootstrap $ do
updateRoot opts repoLoc whenWrite keys now
updateMirrors opts repoLoc whenWrite keys now []
-- Create targets.json for each package version
forM_ pkgs $ \pkgId -> do
createPackageMetadata opts repoLoc whenWrite pkgId
extractCabalFile opts repoLoc whenWrite pkgId
-- Recreate index tarball
newFiles <- findNewIndexFiles opts repoLoc whenWrite
case (whenWrite, null newFiles) of
(WriteInitial, _) -> do
-- If we are recreating all files, also recreate the index
_didExist <- handleDoesNotExist $ removeFile pathIndexTar
logInfo opts $ "Writing " ++ prettyRepo repoLayoutIndexTar
(WriteUpdate, True) -> do
logInfo opts $ "Skipping " ++ prettyRepo repoLayoutIndexTar
(WriteUpdate, False) ->
logInfo opts $ "Appending " ++ show (length newFiles)
++ " file(s) to " ++ prettyRepo repoLayoutIndexTar
unless (null newFiles) $ do
tarAppend
(anchorRepoPath opts repoLoc repoLayoutIndexTar)
(anchorRepoPath opts repoLoc repoLayoutIndexDir)
(map castRoot newFiles)
logInfo opts $ "Writing " ++ prettyRepo repoLayoutIndexTarGz
compress (anchorRepoPath opts repoLoc repoLayoutIndexTar)
(anchorRepoPath opts repoLoc repoLayoutIndexTarGz)
-- Create snapshot
-- TODO: If we are updating we should be incrementing the version, not
-- keeping it the same
rootInfo <- computeFileInfo' repoLayoutRoot
mirrorsInfo <- computeFileInfo' repoLayoutMirrors
tarInfo <- computeFileInfo' repoLayoutIndexTar
tarGzInfo <- computeFileInfo' repoLayoutIndexTarGz
let snapshot = Snapshot {
snapshotVersion = versionInitial
, snapshotExpires = expiresInDays now 3
, snapshotInfoRoot = rootInfo
, snapshotInfoMirrors = mirrorsInfo
, snapshotInfoTar = Just tarInfo
, snapshotInfoTarGz = tarGzInfo
}
updateFile opts
repoLoc
whenWrite
(InRep repoLayoutSnapshot)
(withSignatures globalRepoLayout (privateSnapshot keys))
snapshot
-- Finally, create the timestamp
snapshotInfo <- computeFileInfo' repoLayoutSnapshot
let timestamp = Timestamp {
timestampVersion = versionInitial
, timestampExpires = expiresInDays now 3
, timestampInfoSnapshot = snapshotInfo
}
updateFile opts
repoLoc
whenWrite
(InRep repoLayoutTimestamp)
(withSignatures globalRepoLayout (privateTimestamp keys))
timestamp
where
pathIndexTar :: Path Absolute
pathIndexTar = anchorRepoPath opts repoLoc repoLayoutIndexTar
-- | Compute file information for a file in the repo
computeFileInfo' :: (RepoLayout -> RepoPath) -> IO FileInfo
computeFileInfo' = computeFileInfo . anchorRepoPath opts repoLoc
prettyRepo :: (RepoLayout -> RepoPath) -> String
prettyRepo = prettyTargetPath' opts . InRep
-- | Create root metadata
updateRoot :: GlobalOpts
-> RepoLoc
-> WhenWrite
-> PrivateKeys
-> UTCTime
-> IO ()
updateRoot opts repoLoc whenWrite keys now =
updateFile opts
repoLoc
whenWrite
(InRep repoLayoutRoot)
(withSignatures' (privateRoot keys))
root
where
root :: Root
root = Root {
rootVersion = versionInitial
, rootExpires = expiresInDays now (globalExpireRoot opts * 365)
, rootKeys = KeyEnv.fromKeys $ concat [
privateRoot keys
, privateTarget keys
, privateSnapshot keys
, privateTimestamp keys
, privateMirrors keys
]
, rootRoles = RootRoles {
rootRolesRoot = RoleSpec {
roleSpecKeys = map somePublicKey (privateRoot keys)
, roleSpecThreshold = KeyThreshold 2
}
, rootRolesTargets = RoleSpec {
roleSpecKeys = map somePublicKey (privateTarget keys)
, roleSpecThreshold = KeyThreshold 1
}
, rootRolesSnapshot = RoleSpec {
roleSpecKeys = map somePublicKey (privateSnapshot keys)
, roleSpecThreshold = KeyThreshold 1
}
, rootRolesTimestamp = RoleSpec {
roleSpecKeys = map somePublicKey (privateTimestamp keys)
, roleSpecThreshold = KeyThreshold 1
}
, rootRolesMirrors = RoleSpec {
roleSpecKeys = map somePublicKey (privateMirrors keys)
, roleSpecThreshold = KeyThreshold 1
}
}
}
-- | Create mirrors metadata
updateMirrors :: GlobalOpts
-> RepoLoc
-> WhenWrite
-> PrivateKeys
-> UTCTime
-> [URI]
-> IO ()
updateMirrors opts repoLoc whenWrite keys now uris =
updateFile opts
repoLoc
whenWrite
(InRep repoLayoutMirrors)
(withSignatures' (privateMirrors keys))
mirrors
where
mirrors :: Mirrors
mirrors = Mirrors {
mirrorsVersion = versionInitial
, mirrorsExpires = expiresInDays now (globalExpireMirrors opts * 365)
, mirrorsMirrors = map mkMirror uris
}
mkMirror :: URI -> Mirror
mkMirror uri = Mirror uri MirrorFull
-- | Create package metadata
createPackageMetadata :: GlobalOpts -> RepoLoc -> WhenWrite -> PackageIdentifier -> IO ()
createPackageMetadata opts repoLoc whenWrite pkgId = do
srcTS <- getFileModTime opts repoLoc src
dstTS <- getFileModTime opts repoLoc dst
let skip = case whenWrite of
WriteInitial -> False
WriteUpdate -> dstTS >= srcTS
if skip
then logInfo opts $ "Skipping " ++ prettyTargetPath' opts dst
else do
fileMapEntries <- mapM computeFileMapEntry fileMapFiles
let targets = Targets {
targetsVersion = versionInitial
, targetsExpires = expiresNever
, targetsTargets = FileMap.fromList fileMapEntries
, targetsDelegations = Nothing
}
-- Currently we "sign" with no keys
updateFile opts
repoLoc
whenWrite
dst
(withSignatures' [])
targets
where
computeFileMapEntry :: TargetPath' -> IO (TargetPath, FileInfo)
computeFileMapEntry file = do
info <- computeFileInfo $ anchorTargetPath' opts repoLoc file
return (applyTargetPath' opts file, info)
-- The files we need to add to the package targets file
-- Currently this is just the .tar.gz file
fileMapFiles :: [TargetPath']
fileMapFiles = [src]
src, dst :: TargetPath'
src = InRepPkg repoLayoutPkgTarGz pkgId
dst = InIdxPkg indexLayoutPkgMetadata pkgId
{-------------------------------------------------------------------------------
Working with the index
-------------------------------------------------------------------------------}
-- | Find the files we need to add to the index
findNewIndexFiles :: GlobalOpts -> RepoLoc -> WhenWrite -> IO [IndexPath]
findNewIndexFiles opts@GlobalOpts{..} repoLoc whenWrite = do
indexTS <- getFileModTime opts repoLoc (InRep repoLayoutIndexTar)
indexFiles <- getRecursiveContents absIndexDir
let indexFiles' :: [IndexPath]
indexFiles' = map rootPath indexFiles
case whenWrite of
WriteInitial -> return indexFiles'
WriteUpdate -> liftM catMaybes $
forM indexFiles' $ \indexFile -> do
fileTS <- getFileModTime opts repoLoc $ InIdx (const indexFile)
if fileTS > indexTS then return $ Just indexFile
else return Nothing
where
absIndexDir :: Path Absolute
absIndexDir = anchorRepoPath opts repoLoc repoLayoutIndexDir
-- | Extract the cabal file from the package tarball and copy it to the index
extractCabalFile :: GlobalOpts -> RepoLoc -> WhenWrite -> PackageIdentifier -> IO ()
extractCabalFile opts@GlobalOpts{..} repoLoc whenWrite pkgId = do
srcTS <- getFileModTime opts repoLoc src
dstTS <- getFileModTime opts repoLoc dst
let skip = case whenWrite of
WriteInitial -> False
WriteUpdate -> dstTS >= srcTS
if skip
then logInfo opts $ "Skipping " ++ prettyTargetPath' opts dst
else do
mCabalFile <- try $ tarExtractFile opts repoLoc src pathCabalInTar
case mCabalFile of
Left (ex :: SomeException) ->
logWarn opts $ "Failed to extract .cabal from package " ++ display pkgId
++ ": " ++ displayException ex
Right Nothing ->
logWarn opts $ ".cabal file missing for package " ++ display pkgId
Right (Just (cabalFile, _cabalSize)) -> do
logInfo opts $ "Writing "
++ prettyTargetPath' opts dst
++ " (extracted from "
++ prettyTargetPath' opts src
++ ")"
withFile pathCabalInIdx WriteMode $ \h -> BS.L.hPut h cabalFile
where
pathCabalInTar :: FilePath
pathCabalInTar = FilePath.joinPath [
display pkgId
, display (packageName pkgId)
] FilePath.<.> "cabal"
pathCabalInIdx :: Path Absolute
pathCabalInIdx = anchorTargetPath' opts repoLoc dst
src, dst :: TargetPath'
dst = InIdxPkg indexLayoutPkgCabal pkgId
src = InRepPkg repoLayoutPkgTarGz pkgId
{-------------------------------------------------------------------------------
Updating files in the repo or in the index
-------------------------------------------------------------------------------}
data WhenWrite =
-- | Write the initial version of a file
--
-- If applicable, set file version to 1.
WriteInitial
    -- | Update an existing file
    --
    -- If applicable, increment the file version number.
| WriteUpdate
-- | Write canonical JSON
--
-- We render the new file and compare its file info with the file that is
-- already in the target location (if any). If it is the same (modulo the
-- version number) we skip writing it; otherwise we increment the version
-- number and write the new file.
updateFile :: forall a. (ToJSON WriteJSON a, HasHeader a)
=> GlobalOpts
-> RepoLoc
-> WhenWrite
-> TargetPath'
-> (a -> Signed a) -- ^ Signing function
-> a -- ^ Unsigned file contents
-> IO ()
updateFile opts@GlobalOpts{..} repoLoc whenWrite fileLoc signPayload a = do
mOldHeader :: Maybe (Either DeserializationError (UninterpretedSignatures Header)) <-
handleDoesNotExist $ readJSON_NoKeys_NoLayout fp
case (whenWrite, mOldHeader) of
(WriteInitial, _) ->
writeDoc writing a
(WriteUpdate, Nothing) -> -- no previous version
writeDoc creating a
(WriteUpdate, Just (Left _err)) -> -- old file corrupted
writeDoc overwriting a
(WriteUpdate, Just (Right (UninterpretedSignatures oldHeader _oldSigs))) -> do
-- We cannot quite read the entire old file, because we don't know what
-- key environment to use. Instead, we render the _new_ file, but
-- setting the version number to be equal to the version number of the
-- old file. If the result turns out to be equal to the old file (same
-- FileInfo), we skip writing this file. However, if this is NOT equal,
-- we set the version number of the new file to be equal to the version
-- number of the old plus one, and write it.
oldFileInfo <- computeFileInfo fp
let oldVersion :: FileVersion
oldVersion = headerVersion oldHeader
wOldVersion, wIncVersion :: a
wOldVersion = Lens.set fileVersion oldVersion a
wIncVersion = Lens.set fileVersion (versionIncrement oldVersion) a
wOldSigned :: Signed a
wOldSigned = signPayload wOldVersion
wOldRendered :: BS.L.ByteString
wOldRendered = renderJSON globalRepoLayout wOldSigned
          -- TODO: We could be more efficient here and verify the file size
          -- first; however, these files are tiny so it doesn't really matter.
wOldFileInfo :: FileInfo
wOldFileInfo = fileInfo wOldRendered
if knownFileInfoEqual oldFileInfo wOldFileInfo
then logInfo opts $ "Unchanged " ++ prettyTargetPath' opts fileLoc
else writeDoc updating wIncVersion
where
-- | Actually write the file
writeDoc :: String -> a -> IO ()
writeDoc reason doc = do
logInfo opts reason
createDirectoryIfMissing True (takeDirectory fp)
writeJSON globalRepoLayout fp (signPayload doc)
fp :: Path Absolute
fp = anchorTargetPath' opts repoLoc fileLoc
writing, creating, overwriting, updating :: String
writing = "Writing " ++ prettyTargetPath' opts fileLoc
creating = "Creating " ++ prettyTargetPath' opts fileLoc
overwriting = "Overwriting " ++ prettyTargetPath' opts fileLoc ++ " (old file corrupted)"
updating = "Updating " ++ prettyTargetPath' opts fileLoc
{-------------------------------------------------------------------------------
Inspect the repo layout
-------------------------------------------------------------------------------}
-- | Find packages
--
-- Repository layouts are configurable, but we don't know if the layout of the
-- current directory matches the specified layout. We therefore just search
-- through the directory for anything that looks like a package.
-- We can then verify that this list of packages actually matches the layout as
-- a separate step.
findPackages :: GlobalOpts -> RepoLoc -> IO [PackageIdentifier]
findPackages GlobalOpts{..} (RepoLoc repoLoc) =
nub . mapMaybe isPackage <$> getRecursiveContents repoLoc
where
isPackage :: Path Unrooted -> Maybe PackageIdentifier
isPackage path = do
guard $ not (isIndex path)
pkg <- hasExtensions path [".tar", ".gz"]
simpleParse pkg
isIndex :: Path Unrooted -> Bool
isIndex = (==) (unrootPath (repoLayoutIndexTarGz globalRepoLayout))
-- | Check that packages are in their expected location
checkRepoLayout :: GlobalOpts -> RepoLoc -> [PackageIdentifier] -> IO Bool
checkRepoLayout opts repoLoc = liftM and . mapM checkPackage
where
checkPackage :: PackageIdentifier -> IO Bool
checkPackage pkgId = do
existsTarGz <- doesFileExist $ anchorTargetPath' opts repoLoc expectedTarGz
unless existsTarGz $
logWarn opts $ "Package tarball " ++ display pkgId
++ " expected in location "
++ prettyTargetPath' opts expectedTarGz
return existsTarGz
where
expectedTarGz :: TargetPath'
expectedTarGz = InRepPkg repoLayoutPkgTarGz pkgId
#ifndef mingw32_HOST_OS
{-------------------------------------------------------------------------------
Creating Cabal-local-repo
-------------------------------------------------------------------------------}
symlinkCabalLocalRepo :: GlobalOpts -> RepoLoc -> RepoLoc -> IO ()
symlinkCabalLocalRepo opts@GlobalOpts{..} repoLoc cabalRepoLoc = do
symlink repoLayoutIndexTar
pkgs <- findPackages opts repoLoc
forM_ pkgs $ \pkgId -> symlink (`repoLayoutPkgTarGz` pkgId)
where
-- TODO: This gives a warning for files that we previously linked, as well
-- as for files that we _never_ need to link (because the location of both
-- repos is the same). This is potentially confusing.
symlink :: (RepoLayout -> RepoPath) -> IO ()
symlink file =
catch (createSymbolicLink target loc) $ \ex ->
if isAlreadyExistsError ex
then logWarn opts $ "Skipping " ++ pretty (file globalRepoLayout)
++ " (already exists)"
else throwIO ex
where
target = anchorRepoPath opts repoLoc file
loc = anchorRepoPath opts' cabalRepoLoc file
opts' = opts { globalRepoLayout = cabalLocalRepoLayout }
#endif
{-------------------------------------------------------------------------------
Signing individual files
-------------------------------------------------------------------------------}
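-- | Sign a single JSON file with the given private keys, either appending to
-- or replacing its existing signatures.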
signFile :: [KeyLoc] -> DeleteExistingSignatures -> Path Absolute -> IO ()
signFile keyLocs deleteExisting fp = do
UninterpretedSignatures (payload :: JSValue) oldSigs <-
throwErrors =<< readJSON_NoKeys_NoLayout fp
keys :: [Some Key] <- forM keyLocs $ \keyLoc ->
throwErrors =<< readJSON_NoKeys_NoLayout keyLoc
let newSigs = concat [
if deleteExisting then [] else oldSigs
, toPreSignatures (signRendered keys $ renderJSON_NoLayout payload)
]
writeJSON_NoLayout fp $ UninterpretedSignatures payload newSigs
{-------------------------------------------------------------------------------
Logging
-------------------------------------------------------------------------------}
logInfo :: GlobalOpts -> String -> IO ()
logInfo GlobalOpts{..} str = when globalVerbose $
putStrLn $ "Info: " ++ str
logWarn :: GlobalOpts -> String -> IO ()
logWarn _opts str =
putStrLn $ "Warning: " ++ str
{-------------------------------------------------------------------------------
Auxiliary
-------------------------------------------------------------------------------}
-- | Check that a file has the given extensions
--
-- Returns the filename without the verified extensions. For example:
--
-- > hasExtensions "foo.tar.gz" [".tar", ".gz"] == Just "foo"
hasExtensions :: Path a -> [String] -> Maybe String
hasExtensions = \fp exts -> go (takeFileName fp) (reverse exts)
where
go :: FilePath -> [String] -> Maybe String
go fp [] = return fp
go fp (e:es) = do let (fp', e') = FilePath.splitExtension fp
guard $ e == e'
go fp' es
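-- | Turn an 'Either' result into an 'IO' action, throwing the error if present.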
throwErrors :: Exception e => Either e a -> IO a
throwErrors (Left err) = throwIO err
throwErrors (Right a) = return a
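-- | Run an action, returning 'Nothing' (instead of throwing) if a file does not exist.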
handleDoesNotExist :: IO a -> IO (Maybe a)
handleDoesNotExist act = handle aux (Just <$> act)
where
aux e | isDoesNotExistError e = return Nothing
| otherwise = throwIO e


@ -1,205 +0,0 @@
{-# LANGUAGE CPP #-}
module Hackage.Security.RepoTool.Options (
GlobalOpts(..)
, Command(..)
, KeyLoc
, DeleteExistingSignatures
, getOptions
) where
import Network.URI (URI, parseURI)
import Options.Applicative
import System.IO.Unsafe (unsafePerformIO)
import Hackage.Security.Client
import Hackage.Security.Util.Path
import Hackage.Security.RepoTool.Layout.Keys
import Hackage.Security.RepoTool.Paths
{-------------------------------------------------------------------------------
Types
-------------------------------------------------------------------------------}
-- | Command line options
data GlobalOpts = GlobalOpts {
-- | Key directory layout
globalKeysLayout :: KeysLayout
-- | Local repository layout
, globalRepoLayout :: RepoLayout
-- | Local index layout
, globalIndexLayout :: IndexLayout
-- | Should we be verbose?
, globalVerbose :: Bool
-- | Expiry time when creating root (in years)
, globalExpireRoot :: Integer
-- | Expiry time when creating mirrors (in years)
, globalExpireMirrors :: Integer
-- | Command to execute
, globalCommand :: Command
}
data Command =
-- | Create keys
CreateKeys KeysLoc
-- | Bootstrap a secure local repository
| Bootstrap KeysLoc RepoLoc
-- | Update a previously bootstrapped local repository
| Update KeysLoc RepoLoc
    -- | Create root metadata
| CreateRoot KeysLoc (Path Absolute)
-- | Create mirrors metadata
| CreateMirrors KeysLoc (Path Absolute) [URI]
#ifndef mingw32_HOST_OS
    -- | Create a directory with symlinks in cabal-local-repo layout
| SymlinkCabalLocalRepo RepoLoc RepoLoc
#endif
-- | Sign an individual file
| Sign [KeyLoc] DeleteExistingSignatures (Path Absolute)
type KeyLoc = Path Absolute
type DeleteExistingSignatures = Bool
{-------------------------------------------------------------------------------
Parsers
-------------------------------------------------------------------------------}
getOptions :: IO GlobalOpts
getOptions = execParser opts
where
opts = info (helper <*> parseGlobalOptions) $ mconcat [
fullDesc
, header "Manage local Hackage repositories"
]
parseRepoLoc :: Parser RepoLoc
parseRepoLoc = RepoLoc <$> (option (str >>= readAbsolutePath) $ mconcat [
long "repo"
, metavar "PATH"
, help "Path to local repository"
])
parseKeysLoc :: Parser KeysLoc
parseKeysLoc = KeysLoc <$> (option (str >>= readAbsolutePath) $ mconcat [
long "keys"
, metavar "PATH"
, help "Path to key store"
])
parseCreateKeys :: Parser Command
parseCreateKeys = CreateKeys <$> parseKeysLoc
parseBootstrap :: Parser Command
parseBootstrap = Bootstrap <$> parseKeysLoc <*> parseRepoLoc
parseUpdate :: Parser Command
parseUpdate = Update <$> parseKeysLoc <*> parseRepoLoc
parseCreateRoot :: Parser Command
parseCreateRoot = CreateRoot
<$> parseKeysLoc
<*> (option (str >>= readAbsolutePath) $ mconcat [
short 'o'
, metavar "FILE"
, help "Location of the root file"
])
parseCreateMirrors :: Parser Command
parseCreateMirrors = CreateMirrors
<$> parseKeysLoc
<*> (option (str >>= readAbsolutePath) $ mconcat [
short 'o'
, metavar "FILE"
, help "Location of the mirrors file"
])
<*> (many $ argument (str >>= readURI) (metavar "MIRROR"))
#ifndef mingw32_HOST_OS
parseSymlinkCabalLocalRepo :: Parser Command
parseSymlinkCabalLocalRepo = SymlinkCabalLocalRepo
<$> parseRepoLoc
<*> (option (str >>= liftA RepoLoc . readAbsolutePath) $ mconcat [
long "cabal-repo"
, help "Location of the cabal repo"
])
#endif
parseSign :: Parser Command
parseSign = Sign
<$> (many . option (str >>= readAbsolutePath) $ mconcat [
long "key"
, help "Path to private key (can be specified multiple times)"
])
<*> (switch $ mconcat [
long "delete-existing"
, help "Delete any existing signatures"
])
<*> argument (str >>= readAbsolutePath) (metavar "FILE")
-- | Global options
--
-- TODO: Make repo and keys layout configurable
parseGlobalOptions :: Parser GlobalOpts
parseGlobalOptions = GlobalOpts
<$> (pure defaultKeysLayout)
<*> (pure hackageRepoLayout)
<*> (pure hackageIndexLayout)
<*> (switch $ mconcat [
long "verbose"
, short 'v'
, help "Verbose logging"
])
<*> (option auto $ mconcat [
long "expire-root"
, metavar "YEARS"
, help "Expiry time for the root info"
, value 1
, showDefault
])
<*> (option auto $ mconcat [
long "expire-mirrors"
, metavar "YEARS"
, help "Expiry time for the mirrors"
, value 10
, showDefault
])
<*> (subparser $ mconcat [
command "create-keys" $ info (helper <*> parseCreateKeys) $
progDesc "Create keys"
, command "bootstrap" $ info (helper <*> parseBootstrap) $
progDesc "Bootstrap a local repository"
, command "update" $ info (helper <*> parseUpdate) $
progDesc "Update a (previously bootstrapped) local repository"
, command "create-root" $ info (helper <*> parseCreateRoot) $
progDesc "Create root metadata"
, command "create-mirrors" $ info (helper <*> parseCreateMirrors) $
progDesc "Create mirrors metadata. All MIRRORs specified on the command line will be written to the file."
#ifndef mingw32_HOST_OS
, command "symlink-cabal-local-repo" $ info (helper <*> parseSymlinkCabalLocalRepo) $
progDesc "Create a directory in cabal-local-repo layout with symlinks to the specified repository."
#endif
, command "sign" $ info (helper <*> parseSign) $
progDesc "Sign a file"
])
readURI :: String -> ReadM URI
readURI uriStr =
case parseURI uriStr of
Nothing -> fail $ "Invalid URI " ++ show uriStr
Just uri -> return uri
-- Sadly, cannot do I/O actions inside ReadM
readAbsolutePath :: String -> ReadM (Path Absolute)
readAbsolutePath = return . unsafePerformIO . makeAbsolute . fromFilePath


@ -1,33 +0,0 @@
-- | Additional paths
module Hackage.Security.RepoTool.Paths (
-- * Repo
RepoLoc(..)
-- * Keys
, KeyRoot
, KeyPath
, KeysLoc(..)
) where
import Hackage.Security.Util.Path
import Hackage.Security.Util.Pretty
{-------------------------------------------------------------------------------
Repo
-------------------------------------------------------------------------------}
newtype RepoLoc = RepoLoc { repoLocPath :: Path Absolute }
deriving Eq
{-------------------------------------------------------------------------------
Keys
-------------------------------------------------------------------------------}
-- | The key directory
data KeyRoot
type KeyPath = Path KeyRoot
instance Pretty (Path KeyRoot) where
pretty (Path fp) = "<keys>/" ++ fp
newtype KeysLoc = KeysLoc { keysLocPath :: Path Absolute }
deriving Eq


@ -1,133 +0,0 @@
-- | IO utilities
{-# LANGUAGE CPP #-}
module Hackage.Security.RepoTool.Util.IO (
-- * Miscellaneous
compress
, getFileModTime
#ifndef mingw32_HOST_OS
, createSymbolicLink
#endif
-- * Tar archives
, TarGzError
, tarExtractFile
) where
import qualified Codec.Archive.Tar as Tar
import qualified Codec.Archive.Tar.Entry as Tar
import qualified Codec.Compression.GZip as GZip
import Control.Exception
import qualified Data.ByteString.Lazy as BS.L
import Data.Typeable
import qualified System.Directory as Directory
import System.IO.Error
-- hackage-security
import Hackage.Security.Util.Path
-- hackage-repo-tool
import Hackage.Security.RepoTool.Layout
import Hackage.Security.RepoTool.Options
import Hackage.Security.RepoTool.Paths
import System.Posix.Types (EpochTime)
#ifndef mingw32_HOST_OS
import qualified System.Posix.Files as Posix
#endif
#if MIN_VERSION_directory(1,2,0)
import Data.Time.Clock.POSIX (utcTimeToPOSIXSeconds)
#else
import System.Time (ClockTime (TOD))
#endif
-- | Get the modification time of the specified file
--
-- Returns 0 if the file does not exist.
getFileModTime :: GlobalOpts -> RepoLoc -> TargetPath' -> IO EpochTime
getFileModTime opts repoLoc targetPath =
handle handler $
-- Underlying implementation of 'Directory.getModificationTime' converts
-- from POSIX seconds, so there shouldn't be loss of precision.
-- NB: Apparently, this has low clock resolution on GHC < 7.8.
-- I don't think we care.
#if MIN_VERSION_directory(1,2,0)
fromInteger . floor . utcTimeToPOSIXSeconds
<$> Directory.getModificationTime (toFilePath fp)
#else
Directory.getModificationTime (toFilePath fp) >>= \(TOD s _) ->
return (fromInteger s)
#endif
where
fp :: Path Absolute
fp = anchorTargetPath' opts repoLoc targetPath
handler :: IOException -> IO EpochTime
handler ex = if isDoesNotExistError ex then return 0
else throwIO ex
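-- | GZip-compress the source file into the destination file.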
compress :: Path Absolute -> Path Absolute -> IO ()
compress src dst =
withFile dst WriteMode $ \h ->
BS.L.hPut h =<< GZip.compress <$> readLazyByteString src
#ifndef mingw32_HOST_OS
-- | Create a symbolic link (unix only)
--
-- Create the directory for the target if it does not exist.
--
-- TODO: Currently this always creates links to absolute locations, whether the
-- user specified an absolute or a relative target.
createSymbolicLink :: (FsRoot root, FsRoot root')
=> Path root -- ^ Link target
-> Path root' -- ^ Link location
-> IO ()
createSymbolicLink linkTarget linkLoc = do
createDirectoryIfMissing True (takeDirectory linkLoc)
linkTarget' <- toAbsoluteFilePath linkTarget
linkLoc' <- toAbsoluteFilePath linkLoc
Posix.createSymbolicLink linkTarget' linkLoc'
#endif
{-------------------------------------------------------------------------------
Working with tar archives
-------------------------------------------------------------------------------}
-- | Extract a file from a tar archive
--
-- Throws an exception if there is an error in the archive or when the entry
-- is not a file. Returns 'Nothing' if the entry cannot be found.
tarExtractFile :: GlobalOpts
-> RepoLoc
-> TargetPath'
-> FilePath
-> IO (Maybe (BS.L.ByteString, Tar.FileSize))
tarExtractFile opts repoLoc pathTarGz pathToExtract =
handle (throwIO . TarGzError (prettyTargetPath' opts pathTarGz)) $ do
let pathTarGz' = anchorTargetPath' opts repoLoc pathTarGz
go =<< Tar.read . GZip.decompress <$> readLazyByteString pathTarGz'
where
go :: Exception e => Tar.Entries e -> IO (Maybe (BS.L.ByteString, Tar.FileSize))
go Tar.Done = return Nothing
go (Tar.Fail err) = throwIO err
go (Tar.Next e es) =
if Tar.entryPath e == pathToExtract
then case Tar.entryContent e of
Tar.NormalFile bs sz -> return $ Just (bs, sz)
_ -> throwIO $ userError
$ "tarExtractFile: "
++ pathToExtract ++ " not a normal file"
else do -- putStrLn $ show (Tar.entryPath e) ++ " /= " ++ show path
go es
data TarGzError = TarGzError FilePath SomeException
deriving (Typeable)
instance Exception TarGzError where
#if MIN_VERSION_base(4,8,0)
displayException (TarGzError path e) = path ++ ": " ++ displayException e
deriving instance Show TarGzError
#else
instance Show TarGzError where
show (TarGzError path e) = path ++ ": " ++ show e
#endif


@ -1,27 +0,0 @@
-- | Smooth over differences between various GHC versions by making older
-- preludes look like base 4.8.0
{-# LANGUAGE PackageImports #-}
{-# LANGUAGE CPP #-}
module Prelude (
module P
#if !MIN_VERSION_base(4,8,0)
, Applicative(..)
, Monoid(..)
, (<$>)
, (<$)
, traverse
#endif
) where
#if MIN_VERSION_base(4,8,0)
import "base" Prelude as P
#else
#if MIN_VERSION_base(4,6,0)
import "base" Prelude as P
#else
import "base" Prelude as P hiding (catch)
#endif
import Control.Applicative
import Data.Monoid
import Data.Traversable (traverse)
#endif