Compare commits

...

30 Commits

Author SHA1 Message Date
github-actions[bot]
5d53ea9409 flake.lock: Update
Flake lock file updates:

• Updated input 'mmdoc':
    'github:ryantm/mmdoc/cec02bafac9456bd1ed9b261b8d163a893885e5b' (2022-11-21)
  → 'github:ryantm/mmdoc/b6ddf748b1d1c01ca582bb1b3dafd6bc3a4c83a6' (2024-03-17)
• Removed input 'mmdoc/nixpkgs-for-manual'
• Added input 'mmdoc/systems':
    'github:nix-systems/default/da67096a3b9bf56a91d16901293e51ba5b49a27e' (2023-04-09)
• Updated input 'nixpkgs':
    'github:NixOS/nixpkgs/293a28df6d7ff3dec1e61e37cc4ee6e6c0fb0847' (2022-12-30)
  → 'github:NixOS/nixpkgs/cf8cc1201be8bc71b7cbbbdaf349b22f4f99c7ae' (2024-04-28)
• Updated input 'runtimeDeps':
    'github:NixOS/nixpkgs/c8d7c8a78fb516c0842cc65346506a565c88014d' (2024-04-27)
  → 'github:NixOS/nixpkgs/9b03589f4d174bfd6ea9a9de6fb85333fd190f8b' (2024-04-30)
• Updated input 'treefmt-nix':
    'github:numtide/treefmt-nix/49dc4a92b02b8e68798abd99184f228243b6e3ac' (2024-04-01)
  → 'github:numtide/treefmt-nix/c6aaf729f34a36c445618580a9f95a48f5e4e03f' (2024-04-25)
2024-05-01 01:01:41 +00:00
zowoq
df23e62c33
Merge pull request #391 from qowoz/insecure
allow nixpkgs-review to build insecure packages
2024-04-28 12:15:47 +10:00
zowoq
fc6a441992
Merge pull request #399 from qowoz/review-timeout
increase nixpkgs-review timeout to 120 minutes, increase python max package rebuilds from 25 to 100
2024-04-28 12:15:13 +10:00
zowoq
5b65371222 increase python max package rebuilds from 25 to 100 2024-04-28 12:11:52 +10:00
zowoq
256708538c increase nixpkgs-review timeout to 120 minutes 2024-04-28 12:11:52 +10:00
zowoq
6e45a0911d allow nixpkgs-review to build insecure packages 2024-04-28 12:08:31 +10:00
zowoq
71ab24a2eb separate runtime dependencies
this is only intended to be temporary until the flake is updated
2024-04-28 12:07:16 +10:00
zowoq
0572aa48da
Merge pull request #406 from qowoz/treefmt
add treefmt
2024-04-28 12:01:12 +10:00
zowoq
936918ca61
Merge pull request #417 from nix-community/disable-package
doc: add section about disabling package updates
2024-04-28 10:39:34 +10:00
zowoq
170e20a236
Merge pull request #412 from qowoz/docs
docs: add contact page with github and matrix links
2024-04-28 10:39:29 +10:00
zowoq
08fc765cd5
Merge pull request #405 from qowoz/python-updatescript
doc: remove pypi, add updateScript
2024-04-28 10:39:14 +10:00
zowoq
9fca69f919
Merge pull request #418 from nix-community/readme
README: fix link to docs
2024-04-28 10:31:33 +10:00
zowoq
86bf57dce5
Merge pull request #394 from nix-community/dependabot/github_actions/cachix/install-nix-action-26
build(deps): bump cachix/install-nix-action from 15 to 26
2024-04-28 10:30:59 +10:00
dependabot[bot]
cea9d014b8
build(deps): bump cachix/install-nix-action from 15 to 26
Bumps [cachix/install-nix-action](https://github.com/cachix/install-nix-action) from 15 to 26.
- [Release notes](https://github.com/cachix/install-nix-action/releases)
- [Commits](https://github.com/cachix/install-nix-action/compare/v15...v26)

---
updated-dependencies:
- dependency-name: cachix/install-nix-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-28 00:29:43 +00:00
zowoq
9c6d0ff795 format 2024-04-28 10:25:40 +10:00
zowoq
a57bd0d25e add treefmt 2024-04-28 10:24:49 +10:00
zowoq
10b64aaaf0
Merge pull request #407 from qowoz/buildbot
switch to buildbot
2024-04-28 10:22:12 +10:00
zowoq
cfc9fd5cf5 switch to buildbot
- drop CI workflow, add attributes to checks

- change cachix to nix-community

- change doc workflow to deploy cached artifact
  also switch to using actions deployment that doesn't use repo branch
2024-04-28 10:20:24 +10:00
zowoq
8a113424ca
Merge pull request #401 from nix-community/dependabot/github_actions/DeterminateSystems/update-flake-lock-21
build(deps): bump DeterminateSystems/update-flake-lock from 20 to 21
2024-04-28 10:20:09 +10:00
zowoq
2a56175dd3 README: fix link to docs 2024-04-27 18:42:46 +10:00
zowoq
289072083f doc: add section about disabling package updates 2024-04-27 15:44:10 +10:00
Ryan Mulligan
1d6efcd1ed
Merge pull request #414 from qowoz/skiplist
skiplist: add package sets that cause duplicate PRs
2024-04-26 22:32:20 -07:00
zowoq
e879b3666b skiplist: add package sets that cause duplicate PRs 2024-04-21 09:57:48 +10:00
Ryan Hendrickson
fbd0e11cae
Merge pull request #411 from qowoz/repo-branch
update repo default branch
2024-04-10 23:18:13 -04:00
Ryan Hendrickson
9c581302b0
Merge pull request #410 from rnhmjoj/main
skiplist: add monero packages
2024-04-10 23:16:50 -04:00
zowoq
6bcc854083 docs: add contact page with github and matrix links 2024-04-07 11:15:46 +10:00
zowoq
b528afd663 update repo default branch 2024-04-07 11:02:58 +10:00
rnhmjoj
992d8bc9be
skiplist: add monero packages 2024-04-06 21:31:11 +02:00
zowoq
fee7238d72 doc: remove pypi, add updateScript 2024-04-02 15:47:56 +10:00
dependabot[bot]
c10ce7e506
build(deps): bump DeterminateSystems/update-flake-lock from 20 to 21
Bumps [DeterminateSystems/update-flake-lock](https://github.com/determinatesystems/update-flake-lock) from 20 to 21.
- [Release notes](https://github.com/determinatesystems/update-flake-lock/releases)
- [Commits](https://github.com/determinatesystems/update-flake-lock/compare/v20...v21)

---
updated-dependencies:
- dependency-name: DeterminateSystems/update-flake-lock
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-18 04:07:19 +00:00
33 changed files with 519 additions and 379 deletions

View File

@ -1,17 +0,0 @@
name: "CI"
on:
pull_request:
push:
jobs:
tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: cachix/install-nix-action@v20
- uses: cachix/cachix-action@v12
with:
name: nixpkgs-update
signingKey: '${{ secrets.CACHIX_SIGNING_KEY }}'
authToken: '${{ secrets.CACHIX_AUTH_TOKEN }}'
- run: nix build --accept-flake-config
- run: nix flake check

View File

@ -6,23 +6,44 @@ on:
- main
workflow_dispatch:
permissions:
contents: read
pages: write
id-token: write
concurrency:
group: "doc"
cancel-in-progress: false
jobs:
publish:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
# Nix Flakes doesn't work on shallow clones
fetch-depth: 0
- uses: cachix/install-nix-action@v15
- uses: cachix/install-nix-action@v26
with:
extra_nix_config: |
access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
- run: nix build .#nixpkgs-update-doc
- if: success()
uses: crazy-max/ghaction-github-pages@v4
accept-flake-config = true
experimental-features = nix-command flakes
max-jobs = 0
- name: Setup Pages
id: pages
uses: actions/configure-pages@v5
- name: Build Pages
run: |
nix build .#nixpkgs-update-doc
- name: Upload artifact
uses: actions/upload-pages-artifact@v3
with:
target_branch: gh-pages
build_dir: result/multi
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
path: ./result/multi
deploy:
environment:
name: github-pages
url: ${{ steps.deployment.outputs.page_url }}
runs-on: ubuntu-latest
needs: publish
steps:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v4

View File

@ -9,9 +9,9 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Install Nix
uses: cachix/install-nix-action@v25
uses: cachix/install-nix-action@v26
with:
extra_nix_config: |
experimental-features = nix-command flakes
- name: Update flake.lock
uses: DeterminateSystems/update-flake-lock@v20
uses: DeterminateSystems/update-flake-lock@v21

View File

@ -3,4 +3,4 @@
[![Build Status](https://github.com/ryantm/nixpkgs-update/workflows/CI/badge.svg)](https://github.com/ryantm/nixpkgs-update/actions)
[![Patreon](https://img.shields.io/badge/patreon-donate-blue.svg)](https://www.patreon.com/nixpkgsupdate)
Please read the [documentation](https://ryantm.github.io/nixpkgs-update/).
Please read the [documentation](https://nix-community.github.io/nixpkgs-update/).

View File

@ -71,9 +71,9 @@ deleteDoneParser =
commandParser :: O.Parser Command
commandParser =
O.hsubparser
(O.command
"update"
(O.info (updateParser) (O.progDesc "Update one package"))
( O.command
"update"
(O.info (updateParser) (O.progDesc "Update one package"))
<> O.command
"update-batch"
(O.info (updateBatchParser) (O.progDesc "Update one package in batch mode."))
@ -116,7 +116,8 @@ commandParser =
checkVulnerable :: O.Parser Command
checkVulnerable =
CheckVulnerable <$> O.strArgument (O.metavar "PRODUCT_ID")
CheckVulnerable
<$> O.strArgument (O.metavar "PRODUCT_ID")
<*> O.strArgument (O.metavar "OLD_VERSION")
<*> O.strArgument (O.metavar "NEW_VERSION")

View File

@ -46,8 +46,8 @@ subcommand.
There also exist alternative sources of updates; these include:
- PyPI, the Python Package Index:
[nixpkgs-update-pypi-releases](https://github.com/jonringer/nixpkgs-update-pypi-releases)
- package updateScript:
[passthru.updateScript](https://nixos.org/manual/nixpkgs/unstable/#var-passthru-updateScript)
- GitHub releases:
[nixpkgs-update-github-releases](https://github.com/synthetica9/nixpkgs-update-github-releases)
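The `passthru.updateScript` entry added above is a hook a package can define so that nixpkgs-update (via `maintainers/scripts/update.nix`) can run the package's own update logic. A minimal sketch of what that looks like in a nixpkgs derivation — the package name, source, and the use of the common `nix-update-script` helper are illustrative, not taken from this diff:

```nix
# Hypothetical package opting in to updateScript-driven updates.
{ lib, stdenv, fetchFromGitHub, nix-update-script }:

stdenv.mkDerivation rec {
  pname = "example-package"; # illustrative name
  version = "1.2.3";

  src = fetchFromGitHub {
    owner = "example";
    repo = "example-package";
    rev = "v${version}";
    hash = lib.fakeHash; # placeholder; real packages pin a hash
  };

  # The common helper from nixpkgs; a custom script such as
  # `passthru.updateScript = ./update.sh;` works as well.
  passthru.updateScript = nix-update-script { };
}
```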

doc/contact.md (new file, +5)
View File

@ -0,0 +1,5 @@
# Contact {#contact}
GitHub: [https://github.com/ryantm/nixpkgs-update](https://github.com/ryantm/nixpkgs-update)
Matrix: [https://matrix.to/#/#nixpkgs-update:nixos.org](https://matrix.to/#/#nixpkgs-update:nixos.org)

View File

@ -38,4 +38,4 @@ nix run nixpkgs#haskellPackages.hpack
Source files are formatted with [Ormolu](https://github.com/tweag/ormolu).
There is also a [Cachix cache](https://nixpkgs-update.cachix.org/) available for the dependencies of this program.
There is also a [Cachix cache](https://nix-community.cachix.org/) available for the dependencies of this program.

View File

@ -9,8 +9,8 @@ Run without installing on stable Nix:
```ShellSession
$ nix run \
--option extra-substituters 'https://nixpkgs-update.cachix.org/' \
--option extra-trusted-public-keys 'nixpkgs-update.cachix.org-1:6y6Z2JdoL3APdu6/+Iy8eZX2ajf09e4EE9SnxSML1W8=' \
--option extra-substituters 'https://nix-community.cachix.org/' \
--option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
-f https://github.com/ryantm/nixpkgs-update/archive/main.tar.gz \
-c nixpkgs-update --help
```
@ -19,8 +19,8 @@ Run without installing on unstable Nix with nix command enabled:
```ShellSession
$ nix shell \
--option extra-substituters 'https://nixpkgs-update.cachix.org/' \
--option extra-trusted-public-keys 'nixpkgs-update.cachix.org-1:6y6Z2JdoL3APdu6/+Iy8eZX2ajf09e4EE9SnxSML1W8=' \
--option extra-substituters 'https://nix-community.cachix.org/' \
--option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
-f https://github.com/ryantm/nixpkgs-update/archive/main.tar.gz \
-c nixpkgs-update --help
```
@ -29,8 +29,8 @@ Run without installing on unstable Nix with nix flakes enabled:
```ShellSession
$ nix run \
--option extra-substituters 'https://nixpkgs-update.cachix.org/' \
--option extra-trusted-public-keys 'nixpkgs-update.cachix.org-1:6y6Z2JdoL3APdu6/+Iy8eZX2ajf09e4EE9SnxSML1W8=' \
--option extra-substituters 'https://nix-community.cachix.org/' \
--option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
github:ryantm/nixpkgs-update -- --help
```
@ -38,8 +38,8 @@ Install into your Nix profile:
```ShellSession
$ nix-env \
--option extra-substituters 'https://nixpkgs-update.cachix.org/' \
--option extra-trusted-public-keys 'nixpkgs-update.cachix.org-1:6y6Z2JdoL3APdu6/+Iy8eZX2ajf09e4EE9SnxSML1W8=' \
--option extra-substituters 'https://nix-community.cachix.org/' \
--option extra-trusted-public-keys 'nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=' \
-if https://github.com/ryantm/nixpkgs-update/archive/main.tar.gz
```

View File

@ -17,5 +17,5 @@ It is the code used by the GitHub bot
[@r-ryantm](https://github.com/r-ryantm) to automatically update
nixpkgs. It uses package repository information from
[Repology.org](https://repology.org/repository/nix_unstable), the
GitHub releases API, and PyPI to generate a list of outdated
GitHub releases API, and each package's passthru.updateScript to generate a list of outdated
packages.

View File

@ -12,7 +12,7 @@ Thanks for being a maintainer. Hopefully, @r-ryantm will be able to save you som
## Why is @r-ryantm not updating my package? {#no-update}
There are lots of reasons a package might not be updated. You can usually figure out which one is the issue by looking at the [logs](https://r.ryantm.com/log/) or by asking @ryantm on Matrix or GitHub.
There are lots of reasons a package might not be updated. You can usually figure out which one is the issue by looking at the [logs](https://r.ryantm.com/log/) or by asking the [maintainers](#contact).
### No new version information
@ -20,15 +20,23 @@ r-ryantm gets its new version information from three sources:
* Repology - information from Repology is delayed because it only updates when there is an unstable channel release
* GitHub releases
* PyPi releases
* package passthru.updateScript
If none of these sources says the package is out of date, it will not attempt to update it.
### Disabling package updates
Updates can be disabled by adding a comment to the package:
```
# nixpkgs-update: no auto update
```
[Example in nixpkgs](https://github.com/NixOS/nixpkgs/blob/f2294037ad2b1345c5d9c2df0e81bdb00eab21f3/pkgs/applications/version-management/gitlab/gitlab-pages/default.nix#L7)
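A minimal sketch of where the comment goes — everything except the marker comment is hypothetical; the linked nixpkgs example places it next to `version`:

```nix
{ stdenv, fetchurl }:

stdenv.mkDerivation rec {
  pname = "example";
  # nixpkgs-update: no auto update
  version = "1.0.0";

  src = fetchurl {
    url = "https://example.org/example-${version}.tar.gz";
    hash = "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="; # placeholder hash
  };
}
```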
### Skiplist
We maintain a [Skiplist](https://github.com/ryantm/nixpkgs-update/blob/main/src/Skiplist.hs) of different things not to update. It is possible your package is triggering one of the skip criteria.
Python updates are skipped if they cause more than 25 rebuilds.
Python updates are skipped if they cause more than 100 rebuilds.
### Existing Open or Draft PR

View File

@ -9,3 +9,4 @@
* [Contributing](#contributing)
* [Donate](#donate)
* [Nixpkgs Maintainer FAQ](#nixpkgs-maintainer-faq)
* [Contact](#contact)

View File

@ -5,14 +5,14 @@
"nixpkgs": [
"nixpkgs"
],
"nixpkgs-for-manual": "nixpkgs-for-manual"
"systems": "systems"
},
"locked": {
"lastModified": 1669050835,
"narHash": "sha256-4ppYRBBY6lIqwMNYp0XA2mku1lSPyX4JaoTf+gt5NDg=",
"lastModified": 1710694589,
"narHash": "sha256-5wa+Jzxr+LygoxSZuZg0YU81jgdnx2IY/CqDIJMOgec=",
"owner": "ryantm",
"repo": "mmdoc",
"rev": "cec02bafac9456bd1ed9b261b8d163a893885e5b",
"rev": "b6ddf748b1d1c01ca582bb1b3dafd6bc3a4c83a6",
"type": "github"
},
"original": {
@ -23,11 +23,11 @@
},
"nixpkgs": {
"locked": {
"lastModified": 1672428209,
"narHash": "sha256-eejhqkDz2cb2vc5VeaWphJz8UXNuoNoM8/Op8eWv2tQ=",
"lastModified": 1714314149,
"narHash": "sha256-yNAevSKF4krRWacmLUsLK7D7PlfuY3zF0lYnGYNi9vQ=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "293a28df6d7ff3dec1e61e37cc4ee6e6c0fb0847",
"rev": "cf8cc1201be8bc71b7cbbbdaf349b22f4f99c7ae",
"type": "github"
},
"original": {
@ -35,26 +35,63 @@
"type": "indirect"
}
},
"nixpkgs-for-manual": {
"root": {
"inputs": {
"mmdoc": "mmdoc",
"nixpkgs": "nixpkgs",
"runtimeDeps": "runtimeDeps",
"treefmt-nix": "treefmt-nix"
}
},
"runtimeDeps": {
"locked": {
"lastModified": 1663819393,
"narHash": "sha256-SMWfyAOKRPBC95M8dhZJTlb0kHyilr2lKEAfQSHlM7I=",
"owner": "ryantm",
"lastModified": 1714463564,
"narHash": "sha256-8+w/34w9azAHZA9TwvzdT2z7VFFEey7tXsan5q5bR1M=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "6a6caacfdd079a0fa249046514480a1c4597d861",
"rev": "9b03589f4d174bfd6ea9a9de6fb85333fd190f8b",
"type": "github"
},
"original": {
"owner": "ryantm",
"ref": "minman",
"owner": "NixOS",
"ref": "nixos-unstable-small",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
},
"treefmt-nix": {
"inputs": {
"mmdoc": "mmdoc",
"nixpkgs": "nixpkgs"
"nixpkgs": [
"nixpkgs"
]
},
"locked": {
"lastModified": 1714058656,
"narHash": "sha256-Qv4RBm4LKuO4fNOfx9wl40W2rBbv5u5m+whxRYUMiaA=",
"owner": "numtide",
"repo": "treefmt-nix",
"rev": "c6aaf729f34a36c445618580a9f95a48f5e4e03f",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "treefmt-nix",
"type": "github"
}
}
},

View File

@ -4,11 +4,35 @@
inputs.mmdoc.url = "github:ryantm/mmdoc";
inputs.mmdoc.inputs.nixpkgs.follows = "nixpkgs";
nixConfig.extra-substituters = "https://nixpkgs-update.cachix.org";
nixConfig.extra-trusted-public-keys = "nixpkgs-update.cachix.org-1:6y6Z2JdoL3APdu6/+Iy8eZX2ajf09e4EE9SnxSML1W8=";
inputs.treefmt-nix.url = "github:numtide/treefmt-nix";
inputs.treefmt-nix.inputs.nixpkgs.follows = "nixpkgs";
outputs = { self, nixpkgs, mmdoc } @ args:
inputs.runtimeDeps.url = "github:NixOS/nixpkgs/nixos-unstable-small";
nixConfig.extra-substituters = "https://nix-community.cachix.org";
nixConfig.extra-trusted-public-keys = "nix-community.cachix.org-1:mB9FSh9qf2dCimDSUo8Zy7bkq5CX+/rkCWyvRCYg3Fs=";
outputs = { self, nixpkgs, mmdoc, treefmt-nix, runtimeDeps } @ args:
let
systems = [ "x86_64-linux" "aarch64-linux" "x86_64-darwin" "aarch64-darwin" ];
eachSystem = f: nixpkgs.lib.genAttrs systems (system: f nixpkgs.legacyPackages.${system});
treefmtEval = eachSystem (pkgs: treefmt-nix.lib.evalModule pkgs {
projectRootFile = ".git/config";
programs.ormolu.enable = true;
});
in
{
checks.x86_64-linux =
let
packages = nixpkgs.lib.mapAttrs' (n: nixpkgs.lib.nameValuePair "package-${n}") self.packages.x86_64-linux;
devShells = nixpkgs.lib.mapAttrs' (n: nixpkgs.lib.nameValuePair "devShell-${n}") self.devShells.x86_64-linux;
in
packages // devShells // {
treefmt = treefmtEval.x86_64-linux.config.build.check self;
};
formatter = eachSystem (pkgs: treefmtEval.${pkgs.system}.config.build.wrapper);
packages.x86_64-linux = import ./pkgs/default.nix (args // { system = "x86_64-linux"; });
devShells.x86_64-linux.default = self.packages."x86_64-linux".devShell;
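Condensed to a single system, the treefmt-nix wiring introduced above amounts to the sketch below (an assumed flake fragment, not the full flake.nix); `nix fmt` then runs the Ormolu-backed wrapper and `nix flake check` runs the matching formatting check:

```nix
{
  inputs.treefmt-nix.url = "github:numtide/treefmt-nix";

  outputs = { self, nixpkgs, treefmt-nix, ... }:
    let
      pkgs = nixpkgs.legacyPackages.x86_64-linux;
      treefmtEval = treefmt-nix.lib.evalModule pkgs {
        projectRootFile = ".git/config"; # anchor for locating the repo root
        programs.ormolu.enable = true;   # format the Haskell sources
      };
    in
    {
      formatter.x86_64-linux = treefmtEval.config.build.wrapper;
      checks.x86_64-linux.treefmt = treefmtEval.config.build.check self;
    };
}
```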

View File

@ -1,5 +1,6 @@
{ nixpkgs
, mmdoc
, runtimeDeps
, system
, self
, ...
@ -7,9 +8,11 @@
let
runtimePkgs = import runtimeDeps { inherit system; };
pkgs = import nixpkgs { inherit system; config = { allowBroken = true; }; };
drvAttrs = attrs: with pkgs; {
drvAttrs = attrs: with runtimePkgs; {
NIX = nix;
GIT = git;
JQ = jq;
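The hunk above is the "separate runtime dependencies" change from the commit list: the package is still built against the locked `nixpkgs` input, but the tools it shells out to at runtime now come from the `runtimeDeps` input tracking nixos-unstable-small. A simplified sketch of the idea (not the real pkgs/default.nix):

```nix
{ runtimeDeps, system, ... }:
let
  # Runtime tool set pinned independently of the build-time nixpkgs.
  runtimePkgs = import runtimeDeps { inherit system; };
in
{
  # Derivation attributes exposed as environment variables to the build.
  NIX = runtimePkgs.nix;
  GIT = runtimePkgs.git;
  JQ = runtimePkgs.jq;
}
```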

View File

@ -16,6 +16,7 @@ where
import Data.Aeson
( FromJSON,
Key,
Object,
eitherDecode,
parseJSON,
@ -23,7 +24,6 @@ import Data.Aeson
(.!=),
(.:),
(.:!),
Key,
)
import Data.Aeson.Types (Parser, prependFailure)
import qualified Data.ByteString.Lazy.Char8 as BSL

View File

@ -8,7 +8,7 @@ module Check
( result,
-- exposed for testing:
hasVersion,
versionWithoutPath
versionWithoutPath,
)
where
@ -19,7 +19,7 @@ import qualified Data.Text as T
import qualified Data.Text.IO as T
import Language.Haskell.TH.Env (envQ)
import OurPrelude
import System.Exit()
import System.Exit ()
import Text.Regex.Applicative.Text (RE', (=~))
import qualified Text.Regex.Applicative.Text as RE
import Utils (UpdateEnv (..), nixBuildOptions)
@ -50,11 +50,11 @@ isNonWordCharacter c = not (isWordCharacter c)
-- | Construct regex: /.*\b${version}\b.*/s
versionRegex :: Text -> RE' ()
versionRegex version =
(\_ -> ()) <$> (
(((many RE.anySym) <* (RE.psym isNonWordCharacter)) <|> (RE.pure ""))
*> (RE.string version) <*
((RE.pure "") <|> ((RE.psym isNonWordCharacter) *> (many RE.anySym)))
)
(\_ -> ())
<$> ( (((many RE.anySym) <* (RE.psym isNonWordCharacter)) <|> (RE.pure ""))
*> (RE.string version)
<* ((RE.pure "") <|> ((RE.psym isNonWordCharacter) *> (many RE.anySym)))
)
hasVersion :: Text -> Text -> Bool
hasVersion contents expectedVersion =
@ -63,10 +63,9 @@ hasVersion contents expectedVersion =
checkTestsBuild :: Text -> IO Bool
checkTestsBuild attrPath = do
let timeout = "10m"
let
args =
[ T.unpack timeout, "nix-build" ] ++
nixBuildOptions
let args =
[T.unpack timeout, "nix-build"]
++ nixBuildOptions
++ [ "-E",
"{ config }: (import ./. { inherit config; })."
++ (T.unpack attrPath)
@ -99,19 +98,19 @@ versionWithoutPath resultPath expectedVersion =
-- This can be done with negative lookbehind e.g
-- /^(?<!${storePathWithoutVersion})${version}/
-- Note we also escape the version with \Q/\E for grep -P
let storePath = fromMaybe (T.pack resultPath) $ T.stripPrefix "/nix/store/" (T.pack resultPath) in
case T.breakOn expectedVersion storePath of
(_, "") ->
-- no version in prefix, just match version
"\\Q"
<> T.unpack expectedVersion
<> "\\E"
(storePrefix, _) ->
"(?<!\\Q"
<> T.unpack storePrefix
<> "\\E)\\Q"
<> T.unpack expectedVersion
<> "\\E"
let storePath = fromMaybe (T.pack resultPath) $ T.stripPrefix "/nix/store/" (T.pack resultPath)
in case T.breakOn expectedVersion storePath of
(_, "") ->
-- no version in prefix, just match version
"\\Q"
<> T.unpack expectedVersion
<> "\\E"
(storePrefix, _) ->
"(?<!\\Q"
<> T.unpack storePrefix
<> "\\E)\\Q"
<> T.unpack expectedVersion
<> "\\E"
foundVersionInOutputs :: Text -> String -> IO (Maybe Text)
foundVersionInOutputs expectedVersion resultPath =
@ -140,7 +139,8 @@ foundVersionInFileNames expectedVersion resultPath =
( do
(_, contents) <-
shell ("find " <> resultPath) & ourReadProcessInterleaved
(contents =~ versionRegex expectedVersion) & hoistMaybe
(contents =~ versionRegex expectedVersion)
& hoistMaybe
& noteT (T.pack "Expected version not found")
return $
"- found "
@ -157,7 +157,8 @@ treeGist resultPath =
( do
contents <- procTree [resultPath] & ourReadProcessInterleavedBS_
g <-
shell gistBin & setStdin (byteStringInput contents)
shell gistBin
& setStdin (byteStringInput contents)
& ourReadProcessInterleaved_
return $ "- directory tree listing: " <> g <> "\n"
)
@ -169,7 +170,8 @@ duGist resultPath =
( do
contents <- proc "du" [resultPath] & ourReadProcessInterleavedBS_
g <-
shell gistBin & setStdin (byteStringInput contents)
shell gistBin
& setStdin (byteStringInput contents)
& ourReadProcessInterleaved_
return $ "- du listing: " <> g <> "\n"
)
@ -182,9 +184,9 @@ result updateEnv resultPath =
someReports <-
fromMaybe ""
<$> foundVersionInOutputs expectedVersion resultPath
<> foundVersionInFileNames expectedVersion resultPath
<> treeGist resultPath
<> duGist resultPath
<> foundVersionInFileNames expectedVersion resultPath
<> treeGist resultPath
<> duGist resultPath
return $
let testsBuildSummary = checkTestsBuildReport testsBuild
in [interpolate|

View File

@ -23,7 +23,7 @@ import Data.Aeson (FromJSON)
import Data.Bitraversable (bitraverse)
import qualified Data.Text as T
import qualified Data.Text.Encoding as T
import Data.Time.Clock (getCurrentTime, addUTCTime)
import Data.Time.Clock (addUTCTime, getCurrentTime)
import qualified Data.Vector as V
import qualified Git
import qualified GitHub as GH
@ -54,23 +54,25 @@ pr env title body prHead base = do
tryPR `catchE` \case
-- If creating the PR returns a 422, most likely cause is that the
-- branch was deleted, so push it again and retry once.
GH.HTTPError (HttpExceptionRequest _ (StatusCodeException r _)) | statusCode (responseStatus r) == 422 ->
Git.push env >> withExceptT (T.pack . show) tryPR
GH.HTTPError (HttpExceptionRequest _ (StatusCodeException r _))
| statusCode (responseStatus r) == 422 ->
Git.push env >> withExceptT (T.pack . show) tryPR
e ->
throwE . T.pack . show $ e
where
tryPR = ExceptT $
fmap ((False, ) . GH.getUrl . GH.pullRequestUrl)
<$> ( liftIO $
( GH.github
(authFrom env)
( GH.createPullRequestR
(N "nixos")
(N "nixpkgs")
(GH.CreatePullRequest title body prHead base)
tryPR =
ExceptT $
fmap ((False,) . GH.getUrl . GH.pullRequestUrl)
<$> ( liftIO $
( GH.github
(authFrom env)
( GH.createPullRequestR
(N "nixos")
(N "nixpkgs")
(GH.CreatePullRequest title body prHead base)
)
)
)
)
prUpdate :: forall m. MonadIO m => UpdateEnv -> Text -> Text -> Text -> Text -> ExceptT Text m (Bool, Text)
prUpdate env title body prHead base = do
@ -78,24 +80,25 @@ prUpdate env title body prHead base = do
runRequest = ExceptT . fmap (first (T.pack . show)) . liftIO . GH.github (authFrom env)
let inNixpkgs f = f (N "nixos") (N "nixpkgs")
prs <- runRequest $
inNixpkgs GH.pullRequestsForR (GH.optionsHead prHead) GH.FetchAll
prs <-
runRequest $
inNixpkgs GH.pullRequestsForR (GH.optionsHead prHead) GH.FetchAll
case V.toList prs of
[] -> pr env title body prHead base
(_:_:_) -> throwE $ "Too many open PRs from " <> prHead
(_ : _ : _) -> throwE $ "Too many open PRs from " <> prHead
[thePR] -> do
let withExistingPR :: (GH.Name GH.Owner -> GH.Name GH.Repo -> GH.IssueNumber -> a) -> a
withExistingPR f = inNixpkgs f (GH.simplePullRequestNumber thePR)
_ <- runRequest $
withExistingPR GH.updatePullRequestR $
GH.EditPullRequest (Just title) Nothing Nothing Nothing Nothing
_ <-
runRequest $
withExistingPR GH.updatePullRequestR $
GH.EditPullRequest (Just title) Nothing Nothing Nothing Nothing
_ <- runRequest $
withExistingPR GH.createCommentR body
_ <-
runRequest $
withExistingPR GH.createCommentR body
return (True, GH.getUrl $ GH.simplePullRequestUrl thePR)
@ -129,12 +132,18 @@ parseURLMaybe url =
extension = RE.string ".zip" <|> RE.string ".tar.gz"
toParts n o = URLParts (N n) (N o)
regex =
( toParts <$> (domain *> pathSegment) <* slash <*> pathSegment
( toParts
<$> (domain *> pathSegment)
<* slash
<*> pathSegment
<*> (RE.string "/releases/download/" *> pathSegment)
<* slash
<* pathSegment
)
<|> ( toParts <$> (domain *> pathSegment) <* slash <*> pathSegment
<|> ( toParts
<$> (domain *> pathSegment)
<* slash
<*> pathSegment
<*> (RE.string "/archive/" *> pathSegment)
<* extension
)
@ -187,7 +196,8 @@ commitIsOldEnoughToDelete auth ghUser sha = do
refShouldBeDeleted :: GH.Auth -> GH.Name GH.Owner -> (Text, GH.Name GH.GitCommit) -> IO Bool
refShouldBeDeleted auth ghUser (ref, sha) =
liftA2 (&&)
liftA2
(&&)
(either (const False) not <$> openPRWithAutoUpdateRefFrom auth ghUser ref)
(commitIsOldEnoughToDelete auth ghUser sha)

View File

@ -18,7 +18,7 @@ module Git
setupNixpkgs,
Git.show,
worktreeAdd,
worktreeRemove
worktreeRemove,
)
where
@ -33,9 +33,9 @@ import Data.Time.Clock (addUTCTime, getCurrentTime)
import qualified Data.Vector as V
import Language.Haskell.TH.Env (envQ)
import OurPrelude hiding (throw)
import System.Directory (doesDirectoryExist, doesFileExist, getModificationTime, getCurrentDirectory, setCurrentDirectory)
import System.Directory (doesDirectoryExist, doesFileExist, getCurrentDirectory, getModificationTime, setCurrentDirectory)
import System.Environment.XDG.BaseDir (getUserCacheDir)
import System.Exit()
import System.Exit ()
import System.IO.Error (tryIOError)
import System.Posix.Env (setEnv)
import Utils (Options (..), UpdateEnv (..), branchName, branchPrefix)
@ -57,8 +57,8 @@ worktreeRemove :: FilePath -> IO ()
worktreeRemove path = do
exist <- doesDirectoryExist path
if exist
then runProcessNoIndexIssue_IO $ silently $ procGit ["worktree", "remove", "--force", path]
else return ()
then runProcessNoIndexIssue_IO $ silently $ procGit ["worktree", "remove", "--force", path]
else return ()
checkout :: Text -> Text -> ProcessConfig () () ()
checkout branch target =
@ -98,8 +98,9 @@ diff :: MonadIO m => Text -> ExceptT Text m Text
diff branch = readProcessInterleavedNoIndexIssue_ $ procGit ["diff", T.unpack branch]
diffFileNames :: MonadIO m => Text -> ExceptT Text m [Text]
diffFileNames branch = readProcessInterleavedNoIndexIssue_ (procGit ["diff", T.unpack branch, "--name-only"])
& fmapRT T.lines
diffFileNames branch =
readProcessInterleavedNoIndexIssue_ (procGit ["diff", T.unpack branch, "--name-only"])
& fmapRT T.lines
staleFetchHead :: MonadIO m => m Bool
staleFetchHead =
@ -121,7 +122,8 @@ fetchIfStale = whenM staleFetchHead fetch
fetch :: MonadIO m => ExceptT Text m ()
fetch =
runProcessNoIndexIssue_ $
silently $ procGit ["fetch", "-q", "--prune", "--multiple", "upstream", "origin"]
silently $
procGit ["fetch", "-q", "--prune", "--multiple", "upstream", "origin"]
push :: MonadIO m => UpdateEnv -> ExceptT Text m ()
push updateEnv =
@ -210,7 +212,6 @@ deleteBranchesEverywhere branches = do
Left error2 -> T.putStrLn $ tshow error2
Right success2 -> T.putStrLn $ tshow success2
runProcessNoIndexIssue_IO ::
ProcessConfig () () () -> IO ()
runProcessNoIndexIssue_IO config = go
@ -220,8 +221,8 @@ runProcessNoIndexIssue_IO config = go
case code of
ExitFailure 128
| "index.lock" `BS.isInfixOf` BSL.toStrict e -> do
threadDelay 100000
go
threadDelay 100000
go
ExitSuccess -> return ()
ExitFailure _ -> throw $ ExitCodeException code config out e
@ -234,8 +235,8 @@ runProcessNoIndexIssue_ config = tryIOTextET go
case code of
ExitFailure 128
| "index.lock" `BS.isInfixOf` BSL.toStrict e -> do
threadDelay 100000
go
threadDelay 100000
go
ExitSuccess -> return ()
ExitFailure _ -> throw $ ExitCodeException code config out e
@ -248,12 +249,11 @@ readProcessInterleavedNoIndexIssue_ config = tryIOTextET go
case code of
ExitFailure 128
| "index.lock" `BS.isInfixOf` BSL.toStrict out -> do
threadDelay 100000
go
threadDelay 100000
go
ExitSuccess -> return $ bytestringToText out
ExitFailure _ -> throw $ ExitCodeException code config out out
readProcessInterleavedNoIndexIssue_IO ::
ProcessConfig () () () -> IO Text
readProcessInterleavedNoIndexIssue_IO config = go
@ -263,7 +263,7 @@ readProcessInterleavedNoIndexIssue_IO config = go
case code of
ExitFailure 128
| "index.lock" `BS.isInfixOf` BSL.toStrict out -> do
threadDelay 100000
go
threadDelay 100000
go
ExitSuccess -> return $ bytestringToText out
ExitFailure _ -> throw $ ExitCodeException code config out out

View File

@ -15,15 +15,19 @@ filter _ cpeMatch "socat" v
| cpeUpdatePresentAndNotPartOfVersion cpeMatch v = False -- TODO consider if this rule should be applied to all packages
filter _ cpeMatch "uzbl" v
| isNothing (v =~ yearRegex)
&& "2009.12.22" `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
False
&& "2009.12.22"
`anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
False
| isNothing (v =~ yearRegex)
&& "2010.04.03" `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
False
&& "2010.04.03"
`anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
False
filter _ cpeMatch "go" v
| "." `T.isInfixOf` v
&& "-" `anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
False
| "."
`T.isInfixOf` v
&& "-"
`anyVersionInfixOf` cpeMatchVersionMatcher cpeMatch =
False
filter _ cpeMatch "terraform" _
| cpeTargetSoftware (cpeMatchCPE cpeMatch) == Just "aws" = False
filter cve _ "tor" _
@ -31,7 +35,7 @@ filter cve _ "tor" _
filter _ cpeMatch "arena" _
| cpeVendor (cpeMatchCPE cpeMatch) == Just "rockwellautomation"
|| cpeVendor (cpeMatchCPE cpeMatch) == Just "openforis" =
False
False
filter _ cpeMatch "thrift" _
| cpeVendor (cpeMatchCPE cpeMatch) == Just "facebook" = False
filter _ cpeMatch "kanboard" _

View File

@ -40,8 +40,8 @@ import qualified Data.Text.Lazy.Encoding as TL
import qualified Git
import Language.Haskell.TH.Env (envQ)
import OurPrelude
import System.Exit ()
import qualified System.Process.Typed as TP
import System.Exit()
import Utils (UpdateEnv (..), nixBuildOptions, nixCommonOptions, srcOrMain)
import Prelude hiding (log)
@ -82,7 +82,7 @@ nixEvalApplyRaw applyFunc attrPath =
nixEvalExpr ::
MonadIO m =>
Text ->
Text ->
ExceptT Text m Text
nixEvalExpr expr =
ourReadProcess_
@ -116,22 +116,23 @@ assertNewerVersion updateEnv = do
lookupAttrPath :: MonadIO m => UpdateEnv -> ExceptT Text m Text
lookupAttrPath updateEnv =
-- lookup attrpath by nix-env
(proc
(binPath <> "/nix-env")
( [ "-qa",
(packageName updateEnv <> "-" <> oldVersion updateEnv) & T.unpack,
"-f",
".",
"--attr-path"
]
<> nixCommonOptions
)
& ourReadProcess_
& fmapRT (fst >>> T.lines >>> head >>> T.words >>> head))
<|>
-- if that fails, check by attrpath
(getAttrString "name" (packageName updateEnv))
& fmapRT (const (packageName updateEnv))
( proc
(binPath <> "/nix-env")
( [ "-qa",
(packageName updateEnv <> "-" <> oldVersion updateEnv) & T.unpack,
"-f",
".",
"--attr-path"
]
<> nixCommonOptions
)
& ourReadProcess_
& fmapRT (fst >>> T.lines >>> head >>> T.words >>> head)
)
<|>
-- if that fails, check by attrpath
(getAttrString "name" (packageName updateEnv))
& fmapRT (const (packageName updateEnv))
getDerivationFile :: MonadIO m => Text -> ExceptT Text m Text
getDerivationFile attrPath = do
@ -143,10 +144,10 @@ getDerivationFile attrPath = do
-- Get an attribute that can be evaluated off a derivation, as in:
-- getAttr "cargoSha256" "ripgrep" -> 0lwz661rbm7kwkd6mallxym1pz8ynda5f03ynjfd16vrazy2dj21
getAttr :: MonadIO m => Text -> Text -> ExceptT Text m Text
getAttr attr = srcOrMain (nixEvalApply ("p: p."<> attr))
getAttr attr = srcOrMain (nixEvalApply ("p: p." <> attr))
getAttrString :: MonadIO m => Text -> Text -> ExceptT Text m Text
getAttrString attr = srcOrMain (nixEvalApplyRaw ("p: p."<> attr))
getAttrString attr = srcOrMain (nixEvalApplyRaw ("p: p." <> attr))
getHash :: MonadIO m => Text -> ExceptT Text m Text
getHash = getAttrString "drvAttrs.outputHash"
@ -178,8 +179,9 @@ getHomepage :: MonadIO m => Text -> ExceptT Text m Text
getHomepage = nixEvalApplyRaw "p: p.meta.homepage or \"\""
getSrcUrl :: MonadIO m => Text -> ExceptT Text m Text
getSrcUrl = srcOrMain
(nixEvalApplyRaw "p: builtins.elemAt p.drvAttrs.urls 0")
getSrcUrl =
srcOrMain
(nixEvalApplyRaw "p: builtins.elemAt p.drvAttrs.urls 0")
buildCmd :: Text -> ProcessConfig () () ()
buildCmd attrPath =
@ -245,10 +247,9 @@ resultLink =
fakeHashMatching :: Text -> Text
fakeHashMatching oldHash =
if "sha512-" `T.isPrefixOf` oldHash then
"sha512-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=="
else
"sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
if "sha512-" `T.isPrefixOf` oldHash
then "sha512-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=="
else "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
-- fixed-output derivation produced path '/nix/store/fg2hz90z5bc773gpsx4gfxn3l6fl66nw-source' with sha256 hash '0q1lsgc1621czrg49nmabq6am9sgxa9syxrwzlksqqr4dyzw4nmf' instead of the expected hash '0bp22mzkjy48gncj5vm9b7whzrggcbs5pd4cnb6k8jpl9j02dhdv'
getHashFromBuild :: MonadIO m => Text -> ExceptT Text m Text
@ -287,17 +288,19 @@ hasPatchNamed attrPath name = do
return $ name `T.isInfixOf` ps
hasUpdateScript :: MonadIO m => Text -> ExceptT Text m Bool
hasUpdateScript attrPath= do
hasUpdateScript attrPath = do
nixEvalApply
"p: builtins.hasAttr \"updateScript\" p" attrPath
"p: builtins.hasAttr \"updateScript\" p"
attrPath
& readNixBool
runUpdateScript :: MonadIO m => Text -> ExceptT Text m (ExitCode, Text)
runUpdateScript attrPath = do
let timeout = "10m" :: Text
(exitCode, output) <- ourReadProcessInterleaved $
TP.setStdin (TP.byteStringInput "\n") $
proc "timeout" [T.unpack timeout, "nix-shell", "maintainers/scripts/update.nix", "--argstr", "package", T.unpack attrPath ]
(exitCode, output) <-
ourReadProcessInterleaved $
TP.setStdin (TP.byteStringInput "\n") $
proc "timeout" [T.unpack timeout, "nix-shell", "maintainers/scripts/update.nix", "--argstr", "package", T.unpack attrPath]
case exitCode of
ExitFailure 124 -> do
return (exitCode, "updateScript for " <> attrPath <> " took longer than " <> timeout <> " and timed out. Other output: " <> output)

View File

@ -33,20 +33,22 @@ run ::
FilePath ->
Text ->
Sem r Text
run cache commit = let timeout = "45m" :: Text in do
-- TODO: probably just skip running nixpkgs-review if the directory
-- already exists
void $
ourReadProcessInterleavedSem $
proc "rm" ["-rf", revDir cache commit]
(exitCode, _nixpkgsReviewOutput) <-
ourReadProcessInterleavedSem $
proc "timeout" [T.unpack timeout, (binPath <> "/nixpkgs-review"), "rev", T.unpack commit, "--no-shell"]
case exitCode of
ExitFailure 124 -> do
output $ "[check][nixpkgs-review] took longer than " <> timeout <> " and timed out"
return $ "nixpkgs-review took longer than " <> timeout <> " and timed out"
_ -> F.read $ (revDir cache commit) <> "/report.md"
run cache commit =
let timeout = "120m" :: Text
in do
-- TODO: probably just skip running nixpkgs-review if the directory
-- already exists
void $
ourReadProcessInterleavedSem $
proc "rm" ["-rf", revDir cache commit]
(exitCode, _nixpkgsReviewOutput) <-
ourReadProcessInterleavedSem $
proc "timeout" [T.unpack timeout, (binPath <> "/nixpkgs-review"), "rev", T.unpack commit, "--no-shell", "--extra-nixpkgs-config", "{ allowInsecurePredicate = x: true; }"]
case exitCode of
ExitFailure 124 -> do
output $ "[check][nixpkgs-review] took longer than " <> timeout <> " and timed out"
return $ "nixpkgs-review took longer than " <> timeout <> " and timed out"
_ -> F.read $ (revDir cache commit) <> "/report.md"
-- Assumes we are already in nixpkgs dir
runReport :: (Text -> IO ()) -> Text -> IO Text

View File

@ -34,7 +34,6 @@ module OurPrelude
)
where
import System.FilePath ((</>))
import Control.Applicative ((<|>))
import Control.Category ((>>>))
import Control.Error
@ -57,6 +56,7 @@ import Polysemy
import Polysemy.Error hiding (note, try, tryJust)
import qualified Process as P
import System.Exit
import System.FilePath ((</>))
import System.Process.Typed
interpolate :: QuasiQuoter
@ -82,16 +82,16 @@ ourReadProcessInterleavedBS_ = readProcessInterleaved_ >>> tryIOTextET
ourReadProcess_ ::
MonadIO m =>
ProcessConfig stdin stdout stderr ->
ProcessConfig stdin stdout stderr ->
ExceptT Text m (Text, Text)
ourReadProcess_ = readProcess_ >>> tryIOTextET >>> fmapRT (\(stdout,stderr) -> (bytestringToText stdout, bytestringToText stderr))
ourReadProcess_ = readProcess_ >>> tryIOTextET >>> fmapRT (\(stdout, stderr) -> (bytestringToText stdout, bytestringToText stderr))
ourReadProcess_Sem ::
Members '[P.Process] r =>
ProcessConfig stdin stdoutIgnored stderrIgnored ->
Sem r (Text, Text)
ourReadProcess_Sem =
P.read_ >>> fmap (\(stdout,stderr) -> (bytestringToText stdout, bytestringToText stderr))
P.read_ >>> fmap (\(stdout, stderr) -> (bytestringToText stdout, bytestringToText stderr))
ourReadProcessInterleaved_ ::
MonadIO m =>

View File

@ -17,14 +17,14 @@ import qualified Data.Set as S
import qualified Data.Text as T
import qualified Data.Text.IO as T
import qualified Data.Vector as V
import qualified System.Posix.Files as F
import qualified Git
import qualified Utils
import qualified System.Directory
import OurPrelude
import qualified System.Directory
import qualified System.Posix.Files as F
import Text.Parsec (parse)
import Text.Parser.Char
import Text.Parser.Combinators
import qualified Utils
outPathsExpr :: Text
outPathsExpr =
@ -94,14 +94,23 @@ outPath = do
liftIO $ T.writeFile outpathFile outPathsExpr
liftIO $ putStrLn "[outpaths] eval start"
currentDir <- liftIO $ System.Directory.getCurrentDirectory
result <- ourReadProcessInterleaved_ $ proc "nix-env" [
"-f", outpathFile,
"-qaP",
"--no-name",
"--out-path",
"--arg", "path", currentDir,
"--arg", "checkMeta", "true",
"--show-trace"]
result <-
ourReadProcessInterleaved_ $
proc
"nix-env"
[ "-f",
outpathFile,
"-qaP",
"--no-name",
"--out-path",
"--arg",
"path",
currentDir,
"--arg",
"checkMeta",
"true",
"--show-trace"
]
liftIO $ putStrLn "[outpaths] eval end"
pure result
@ -163,7 +172,8 @@ parseResults = S.fromList <$> parseResultLine `sepEndBy` newline
parseResultLine :: CharParsing m => m ResultLine
parseResultLine =
ResultLine <$> (T.dropWhileEnd (== '.') <$> parseAttrpath)
ResultLine
<$> (T.dropWhileEnd (== '.') <$> parseAttrpath)
<*> parseArchitecture
<* spaces
<*> parseOutpaths
@ -182,7 +192,8 @@ parseOutpaths = V.fromList <$> (parseOutpath `sepBy1` char ';')
parseOutpath :: CharParsing m => m Outpath
parseOutpath =
Outpath <$> optional (try (T.pack <$> (many (noneOf "=\n") <* char '=')))
Outpath
<$> optional (try (T.pack <$> (many (noneOf "=\n") <* char '=')))
<*> (T.pack <$> many (noneOf ";\n"))
packageRebuilds :: Set ResultLine -> Vector Text

View File

@ -16,7 +16,7 @@ import GHC.Generics
import Network.HTTP.Client.TLS (newTlsManager)
import OurPrelude
import Servant.API
import Servant.Client (BaseUrl (..), mkClientEnv, ClientM, Scheme (..), client, runClientM)
import Servant.Client (BaseUrl (..), ClientM, Scheme (..), client, mkClientEnv, runClientM)
import System.IO
baseUrl :: BaseUrl
@ -35,9 +35,9 @@ type Project = Vector Package
type Projects = HashMap Text Project
type API =
"project" :> Capture "project_name" Text :> Get '[JSON] Project :<|>
"projects" :> QueryParam "inrepo" Text :> QueryParam "outdated" Bool :> Get '[JSON] Projects :<|>
"projects" :> Capture "name" Text :> QueryParam "inrepo" Text :> QueryParam "outdated" Bool :> Get '[JSON] Projects
"project" :> Capture "project_name" Text :> Get '[JSON] Project
:<|> "projects" :> QueryParam "inrepo" Text :> QueryParam "outdated" Bool :> Get '[JSON] Projects
:<|> "projects" :> Capture "name" Text :> QueryParam "inrepo" Text :> QueryParam "outdated" Bool :> Get '[JSON] Projects
data Package = Package
{ repo :: Text,
@ -56,12 +56,10 @@ api :: Proxy API
api = Proxy
project :: Text -> ClientM (Vector Package)
projects ::
Maybe Text ->
Maybe Bool ->
ClientM Projects
projects' ::
Text ->
Maybe Text ->

View File

@ -22,7 +22,7 @@ import qualified Network.HTTP.Client as HTTP
import Network.HTTP.Types.Status (statusCode)
import qualified Nix
import OurPrelude
import System.Exit()
import System.Exit ()
import Utils (UpdateEnv (..))
import Prelude hiding (log)
@ -61,7 +61,7 @@ plan =
("golangModuleVersion", golangModuleVersion),
("npmDepsVersion", npmDepsVersion),
("updateScript", updateScript)
--("redirectedUrl", Rewrite.redirectedUrls)
-- ("redirectedUrl", Rewrite.redirectedUrls)
]
runAll :: (Text -> IO ()) -> Args -> ExceptT Text IO [Text]
@ -81,15 +81,15 @@ version :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
version log args@Args {..} = do
if
| Nix.numberOfFetchers derivationContents > 1 || Nix.numberOfHashes derivationContents > 1 -> do
lift $ log "generic version rewriter does not support multiple hashes"
return Nothing
lift $ log "generic version rewriter does not support multiple hashes"
return Nothing
| hasUpdateScript -> do
lift $ log "skipping because derivation has updateScript"
return Nothing
lift $ log "skipping because derivation has updateScript"
return Nothing
| otherwise -> do
srcVersionFix args
lift $ log "updated version and sha256"
return $ Just "Version update"
srcVersionFix args
lift $ log "updated version and sha256"
return $ Just "Version update"
--------------------------------------------------------------------------------
-- Redirect homepage when moved.
@ -130,28 +130,28 @@ redirectedUrls log Args {..} = do
rustCrateVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe Text)
rustCrateVersion log args@Args {..} = do
if
| and [(not (T.isInfixOf "cargoSha256" derivationContents)),(not (T.isInfixOf "cargoHash" derivationContents))] -> do
lift $ log "No cargoSha256 or cargoHash found"
return Nothing
| and [(not (T.isInfixOf "cargoSha256" derivationContents)), (not (T.isInfixOf "cargoHash" derivationContents))] -> do
lift $ log "No cargoSha256 or cargoHash found"
return Nothing
| hasUpdateScript -> do
lift $ log "skipping because derivation has updateScript"
return Nothing
lift $ log "skipping because derivation has updateScript"
return Nothing
| otherwise -> do
_ <- lift $ File.replaceIO "cargoSha256 =" "cargoHash =" derivationFile
-- This starts the same way `version` does, minus the assert
srcVersionFix args
-- But then from there we need to do this a second time for the cargoHash!
oldCargoHash <- Nix.getAttrString "cargoHash" attrPath
let fakeHash = Nix.fakeHashMatching oldCargoHash
_ <- lift $ File.replaceIO oldCargoHash fakeHash derivationFile
newCargoHash <- Nix.getHashFromBuild attrPath
when (oldCargoHash == newCargoHash) $ throwE ("cargo hashes equal; no update necessary: " <> oldCargoHash)
lift . log $ "Replacing cargoHash with " <> newCargoHash
_ <- lift $ File.replaceIO fakeHash newCargoHash derivationFile
-- Ensure the package actually builds and passes its tests
Nix.build attrPath
lift $ log "Finished updating Crate version and replacing hashes"
return $ Just "Rust version update"
_ <- lift $ File.replaceIO "cargoSha256 =" "cargoHash =" derivationFile
-- This starts the same way `version` does, minus the assert
srcVersionFix args
-- But then from there we need to do this a second time for the cargoHash!
oldCargoHash <- Nix.getAttrString "cargoHash" attrPath
let fakeHash = Nix.fakeHashMatching oldCargoHash
_ <- lift $ File.replaceIO oldCargoHash fakeHash derivationFile
newCargoHash <- Nix.getHashFromBuild attrPath
when (oldCargoHash == newCargoHash) $ throwE ("cargo hashes equal; no update necessary: " <> oldCargoHash)
lift . log $ "Replacing cargoHash with " <> newCargoHash
_ <- lift $ File.replaceIO fakeHash newCargoHash derivationFile
-- Ensure the package actually builds and passes its tests
Nix.build attrPath
lift $ log "Finished updating Crate version and replacing hashes"
return $ Just "Rust version update"
--------------------------------------------------------------------------------
-- Rewrite Golang packages with buildGoModule
@ -161,38 +161,38 @@ golangModuleVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Ma
golangModuleVersion log args@Args {..} = do
if
| and [not (T.isInfixOf "buildGoModule" derivationContents && T.isInfixOf "vendorSha256" derivationContents), not (T.isInfixOf "buildGoModule" derivationContents && T.isInfixOf "vendorHash" derivationContents)] -> do
lift $ log "Not a buildGoModule package with vendorSha256 or vendorHash"
return Nothing
lift $ log "Not a buildGoModule package with vendorSha256 or vendorHash"
return Nothing
| hasUpdateScript -> do
lift $ log "skipping because derivation has updateScript"
return Nothing
lift $ log "skipping because derivation has updateScript"
return Nothing
| otherwise -> do
_ <- lift $ File.replaceIO "vendorSha256 =" "vendorHash =" derivationFile
-- This starts the same way `version` does, minus the assert
srcVersionFix args
-- But then from there we need to do this a second time for the vendorHash!
-- Note that explicit `null` cannot be coerced to a string by nix eval --raw
oldVendorHash <- Nix.getAttr "vendorHash" attrPath
lift . log $ "Found old vendorHash = " <> oldVendorHash
original <- liftIO $ T.readFile derivationFile
_ <- lift $ File.replaceIO oldVendorHash "null" derivationFile
ok <- runExceptT $ Nix.build attrPath
_ <-
if isLeft ok
then do
_ <- liftIO $ T.writeFile derivationFile original
let fakeHash = Nix.fakeHashMatching oldVendorHash
_ <- lift $ File.replaceIO oldVendorHash ("\"" <> fakeHash <> "\"") derivationFile
newVendorHash <- Nix.getHashFromBuild attrPath
_ <- lift $ File.replaceIO fakeHash newVendorHash derivationFile
-- Note that on some small bumps, this may not actually change if go.sum did not
lift . log $ "Replaced vendorHash with " <> newVendorHash
else do
lift . log $ "Set vendorHash to null"
-- Ensure the package actually builds and passes its tests
Nix.build attrPath
lift $ log "Finished updating vendorHash"
return $ Just "Golang update"
_ <- lift $ File.replaceIO "vendorSha256 =" "vendorHash =" derivationFile
-- This starts the same way `version` does, minus the assert
srcVersionFix args
-- But then from there we need to do this a second time for the vendorHash!
-- Note that explicit `null` cannot be coerced to a string by nix eval --raw
oldVendorHash <- Nix.getAttr "vendorHash" attrPath
lift . log $ "Found old vendorHash = " <> oldVendorHash
original <- liftIO $ T.readFile derivationFile
_ <- lift $ File.replaceIO oldVendorHash "null" derivationFile
ok <- runExceptT $ Nix.build attrPath
_ <-
if isLeft ok
then do
_ <- liftIO $ T.writeFile derivationFile original
let fakeHash = Nix.fakeHashMatching oldVendorHash
_ <- lift $ File.replaceIO oldVendorHash ("\"" <> fakeHash <> "\"") derivationFile
newVendorHash <- Nix.getHashFromBuild attrPath
_ <- lift $ File.replaceIO fakeHash newVendorHash derivationFile
-- Note that on some small bumps, this may not actually change if go.sum did not
lift . log $ "Replaced vendorHash with " <> newVendorHash
else do
lift . log $ "Set vendorHash to null"
-- Ensure the package actually builds and passes its tests
Nix.build attrPath
lift $ log "Finished updating vendorHash"
return $ Just "Golang update"
--------------------------------------------------------------------------------
-- Rewrite NPM packages with buildNpmPackage
@ -202,26 +202,26 @@ npmDepsVersion :: MonadIO m => (Text -> m ()) -> Args -> ExceptT Text m (Maybe T
npmDepsVersion log args@Args {..} = do
if
| not (T.isInfixOf "npmDepsHash" derivationContents) -> do
lift $ log "No npmDepsHash"
return Nothing
lift $ log "No npmDepsHash"
return Nothing
| hasUpdateScript -> do
lift $ log "skipping because derivation has updateScript"
return Nothing
lift $ log "skipping because derivation has updateScript"
return Nothing
| otherwise -> do
-- This starts the same way `version` does, minus the assert
srcVersionFix args
-- But then from there we need to do this a second time for the cargoHash!
oldDepsHash <- Nix.getAttrString "npmDepsHash" attrPath
let fakeHash = Nix.fakeHashMatching oldDepsHash
_ <- lift $ File.replaceIO oldDepsHash fakeHash derivationFile
newDepsHash <- Nix.getHashFromBuild attrPath
when (oldDepsHash == newDepsHash) $ throwE ("deps hashes equal; no update necessary: " <> oldDepsHash)
lift . log $ "Replacing npmDepsHash with " <> newDepsHash
_ <- lift $ File.replaceIO fakeHash newDepsHash derivationFile
-- Ensure the package actually builds and passes its tests
Nix.build attrPath
lift $ log "Finished updating NPM deps version and replacing hashes"
return $ Just "NPM version update"
-- This starts the same way `version` does, minus the assert
srcVersionFix args
-- But then from there we need to do this a second time for the cargoHash!
oldDepsHash <- Nix.getAttrString "npmDepsHash" attrPath
let fakeHash = Nix.fakeHashMatching oldDepsHash
_ <- lift $ File.replaceIO oldDepsHash fakeHash derivationFile
newDepsHash <- Nix.getHashFromBuild attrPath
when (oldDepsHash == newDepsHash) $ throwE ("deps hashes equal; no update necessary: " <> oldDepsHash)
lift . log $ "Replacing npmDepsHash with " <> newDepsHash
_ <- lift $ File.replaceIO fakeHash newDepsHash derivationFile
-- Ensure the package actually builds and passes its tests
Nix.build attrPath
lift $ log "Finished updating NPM deps version and replacing hashes"
return $ Just "NPM version update"
--------------------------------------------------------------------------------
@ -241,6 +241,7 @@ updateScript log Args {..} = do
else do
lift $ log "skipping because derivation has no updateScript"
return Nothing
--------------------------------------------------------------------------------
-- Common helper functions and utilities
-- Helper to update version and src attributes, re-computing the sha256.

View File

@ -60,6 +60,9 @@ attrPathList =
prefix
"deepin"
"deepin packages are upgraded in lockstep https://github.com/NixOS/nixpkgs/pull/52327#issuecomment-447684194",
prefix
"monero-"
"monero-cli and monero-gui packages are upgraded in lockstep",
prefix
"element-desktop"
"@Ma27 asked to skip",
@ -87,7 +90,17 @@ attrPathList =
eq "klee" "update repeatedly exceeded the 6h timeout",
regex
(string "python" *> few (psym isDigit) *> string "Packages.mmengine")
"takes way too long to build"
"takes way too long to build",
prefix
"linuxKernel"
"creates too many duplicate PRs",
prefix
"postgresql"
"creates too many duplicate PRs",
prefix
-- bump this when the default version is changed
"python312Packages"
"isn't the default python version"
]
nameList :: Skiplist
@ -184,9 +197,9 @@ checkResultList =
skipOutpathCalcList :: Skiplist
skipOutpathCalcList =
[ eq "firefox-beta-bin-unwrapped" "master"
, eq "firefox-devedition-bin-unwrapped" "master"
-- "firefox-release-bin-unwrapped" is unneeded here because firefox-bin is a dependency of other packages that Hydra doesn't ignore.
[ eq "firefox-beta-bin-unwrapped" "master",
eq "firefox-devedition-bin-unwrapped" "master"
-- "firefox-release-bin-unwrapped" is unneeded here because firefox-bin is a dependency of other packages that Hydra doesn't ignore.
]
binariesStickAround :: Text -> (Text -> Bool, Text)
@ -221,4 +234,4 @@ python numPackageRebuilds derivationContents =
(not isPython || numPackageRebuilds <= maxPackageRebuild)
where
isPython = "buildPythonPackage" `T.isInfixOf` derivationContents
maxPackageRebuild = 25
maxPackageRebuild = 100

View File

@ -25,12 +25,12 @@ import Control.Exception (bracket)
import Control.Monad.Writer (execWriterT, tell)
import qualified Data.ByteString.Lazy.Char8 as BSL
import Data.Maybe (fromJust)
import Data.Monoid (Alt(..))
import Data.Monoid (Alt (..))
import qualified Data.Set as S
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Data.Time.Calendar (showGregorian)
import Data.Time.Clock (getCurrentTime, utctDay, addUTCTime, UTCTime)
import Data.Time.Clock (UTCTime, addUTCTime, getCurrentTime, utctDay)
import qualified GH
import qualified Git
import Language.Haskell.TH.Env (envQ)
@ -41,6 +41,8 @@ import OurPrelude
import qualified Outpaths
import qualified Rewrite
import qualified Skiplist
import System.Directory (doesDirectoryExist, withCurrentDirectory)
import System.Posix.Directory (createDirectory)
import Utils
( Boundary (..),
Options (..),
@ -55,12 +57,10 @@ import Utils
import qualified Utils as U
import qualified Version
import Prelude hiding (log)
import System.Directory (doesDirectoryExist, withCurrentDirectory)
import System.Posix.Directory (createDirectory)
default (T.Text)
alsoLogToAttrPath :: Text -> (Text -> IO()) -> IO (Text -> IO())
alsoLogToAttrPath :: Text -> (Text -> IO ()) -> IO (Text -> IO ())
alsoLogToAttrPath attrPath topLevelLog = do
logFile <- attrPathLogFilePath attrPath
let attrPathLog = log' logFile
@ -181,9 +181,10 @@ updatePackageBatch simpleLog updateInfoLine updateEnv@UpdateEnv {..} = do
Right foundAttrPath -> do
log <- alsoLogToAttrPath foundAttrPath simpleLog
log updateInfoLine
mergeBase <- if batchUpdate options
then Git.mergeBase
else pure "HEAD"
mergeBase <-
if batchUpdate options
then Git.mergeBase
else pure "HEAD"
withWorktree mergeBase foundAttrPath updateEnv $
updateAttrPath log mergeBase updateEnv foundAttrPath
@ -198,12 +199,14 @@ checkExistingUpdate log updateEnv existingCommitMsg attrPath = do
Nothing -> lift $ log "No auto update branch exists"
Just msg -> do
let nV = newVersion updateEnv
lift $ log
[interpolate|An auto update branch exists with message `$msg`. New version is $nV.|]
lift $
log
[interpolate|An auto update branch exists with message `$msg`. New version is $nV.|]
case U.titleVersion msg of
Just branchV | Version.matchVersion (RangeMatcher (Including nV) Unbounded) branchV ->
throwError "An auto update branch exists with an equal or greater version"
Just branchV
| Version.matchVersion (RangeMatcher (Including nV) Unbounded) branchV ->
throwError "An auto update branch exists with an equal or greater version"
_ ->
lift $ log "The auto update branch does not match or exceed the new version."
@ -329,7 +332,7 @@ updateAttrPath log mergeBase updateEnv@UpdateEnv {..} attrPath = do
when hasUpdateScript do
changedFiles <- Git.diffFileNames mergeBase
let rewrittenFile = case changedFiles of { [f] -> f; _ -> derivationFile }
let rewrittenFile = case changedFiles of [f] -> f; _ -> derivationFile
assertNotUpdatedOn updateEnv' rewrittenFile "master"
assertNotUpdatedOn updateEnv' rewrittenFile "staging"
assertNotUpdatedOn updateEnv' rewrittenFile "staging-next"
@ -350,16 +353,18 @@ updateAttrPath log mergeBase updateEnv@UpdateEnv {..} attrPath = do
--
-- Publish the result
lift . log $ "Successfully finished processing"
result <- Nix.resultLink
result <- Nix.resultLink
let opReport =
if isJust skipOutpathBase
then "Outpath calculations were skipped for this package; total number of rebuilds unknown."
else Outpaths.outpathReport opDiff
then "Outpath calculations were skipped for this package; total number of rebuilds unknown."
else Outpaths.outpathReport opDiff
let prBase =
flip fromMaybe skipOutpathBase
flip
fromMaybe
skipOutpathBase
if Outpaths.numPackageRebuilds opDiff <= 500
then "master"
else "staging"
then "master"
else "staging"
publishPackage log updateEnv' oldSrcUrl newSrcUrl attrPath result opReport prBase rewriteMsgs (isJust existingCommitMsg)
case successOrFailure of
@ -437,10 +442,12 @@ publishPackage log updateEnv oldSrcUrl newSrcUrl attrPath result opReport prBase
let ghUser = GH.untagName . githubUser . options $ updateEnv
let mkPR = if branchExists then GH.prUpdate else GH.pr
(reusedPR, pullRequestUrl) <- mkPR updateEnv (prTitle updateEnv attrPath) prMsg (ghUser <> ":" <> (branchName updateEnv)) prBase
when branchExists $ liftIO $ log
if reusedPR
then "Updated existing PR"
else "Reused existing auto update branch, but no corresponding open PR was found, so created a new PR"
when branchExists $
liftIO $
log
if reusedPR
then "Updated existing PR"
else "Reused existing auto update branch, but no corresponding open PR was found, so created a new PR"
liftIO $ log pullRequestUrl
else liftIO $ T.putStrLn prMsg
@@ -492,7 +499,7 @@ prMessage updateEnv isBroken metaDescription metaHomepage metaChangelog rewriteM
rewriteMsgsLine = foldl (\ms m -> ms <> T.pack "\n- " <> m) "\n###### Updates performed" rewriteMsgs
maintainersCc =
if not (T.null maintainers)
then "cc " <> maintainers <> " for [testing](https://github.com/ryantm/nixpkgs-update/blob/master/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do)."
then "cc " <> maintainers <> " for [testing](https://github.com/ryantm/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do)."
else ""
releaseUrlMessage =
if releaseUrl == T.empty
@@ -611,7 +618,8 @@ untilOfBorgFree log waitUntil = do
stats <-
shell "curl -s https://events.ofborg.org/stats.php" & readProcessInterleaved_
waiting <-
shell (jqBin <> " .evaluator.messages.waiting") & setStdin (byteStringInput stats)
shell (jqBin <> " .evaluator.messages.waiting")
& setStdin (byteStringInput stats)
& readProcessInterleaved_
& fmap (BSL.readInt >>> fmap fst >>> fromMaybe 0)
when (waiting > 2) $ do
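The reflowed pipeline above chains typed-process combinators with `&`: the output of one command is fed to the next command's stdin, and the leading digits of the final output are parsed as an Int. A minimal self-contained sketch of the same shape; it assumes `jq` is on PATH and substitutes a made-up JSON document for the ofborg stats endpoint:

```haskell
import qualified Data.ByteString.Lazy.Char8 as BSL
import Data.Function ((&))
import Data.Maybe (fromMaybe)
import System.Process.Typed (byteStringInput, readProcessInterleaved_, setStdin, shell)

main :: IO ()
main = do
  -- Stand-in for the curl call: produce some JSON on stdout.
  stats <-
    shell "echo '{\"evaluator\":{\"messages\":{\"waiting\":7}}}'"
      & readProcessInterleaved_
  -- Pipe that JSON into jq and parse the leading digits of its output.
  waiting <-
    shell "jq .evaluator.messages.waiting"
      & setStdin (byteStringInput stats)
      & readProcessInterleaved_
      & fmap (fromMaybe 0 . fmap fst . BSL.readInt)
  print (waiting :: Int) -- 7
```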
@@ -695,13 +703,12 @@ cveReport updateEnv =
doCachix :: MonadIO m => (Text -> m ()) -> UpdateEnv -> Text -> ExceptT Text m Text
doCachix log updateEnv resultPath =
let o = options updateEnv
in
if batchUpdate o && "r-ryantm" == (GH.untagName $ githubUser o)
then do
lift $ log ("cachix " <> (T.pack . show) resultPath)
Nix.cachix resultPath
return
[interpolate|
in if batchUpdate o && "r-ryantm" == (GH.untagName $ githubUser o)
then do
lift $ log ("cachix " <> (T.pack . show) resultPath)
Nix.cachix resultPath
return
[interpolate|
Either **download from Cachix**:
```
nix-store -r $resultPath \
@@ -716,9 +723,9 @@ doCachix log updateEnv resultPath =
Or, **build yourself**:
|]
else do
lift $ log "skipping cachix"
return "Build yourself:"
else do
lift $ log "skipping cachix"
return "Build yourself:"
updatePackage ::
Options ->
@@ -737,18 +744,19 @@ updatePackage o updateInfo = do
UpdatePackageSuccess -> do
log $ "[result] Success updating " <> updateInfoLine
withWorktree :: Text -> Text -> UpdateEnv -> IO a -> IO a
withWorktree branch attrpath updateEnv action = do
bracket
(do
( do
dir <- U.worktreeDir
let path = dir <> "/" <> T.unpack (T.replace ".lock" "_lock" attrpath)
Git.worktreeRemove path
Git.delete1 (branchName updateEnv)
Git.worktreeAdd path branch updateEnv
pure path)
(\ path -> do
pure path
)
( \path -> do
Git.worktreeRemove path
Git.delete1 (branchName updateEnv))
(\ path -> withCurrentDirectory path action)
Git.delete1 (branchName updateEnv)
)
(\path -> withCurrentDirectory path action)
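The `withWorktree` hunk above is the standard acquire/release/use shape of `bracket`: create the worktree, always tear it down, and run the action with the worktree as the current directory in between. A minimal sketch of the same pattern with a plain scratch directory instead of a Git worktree (illustration only):

```haskell
import Control.Exception (bracket)
import System.Directory
  ( createDirectoryIfMissing,
    removePathForcibly,
    withCurrentDirectory,
  )

withScratchDir :: FilePath -> IO a -> IO a
withScratchDir path action =
  bracket
    ( do
        createDirectoryIfMissing True path -- acquire
        pure path
    )
    removePathForcibly -- release: runs even if the action throws
    (\dir -> withCurrentDirectory dir action) -- use

main :: IO ()
main = withScratchDir "/tmp/nixpkgs-update-example" (putStrLn "inside the scratch dir")
```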

View File

@@ -27,7 +27,7 @@ module Utils
regDirMode,
outpathCacheDir,
cacheDir,
worktreeDir
worktreeDir,
)
where
@@ -48,10 +48,10 @@ import Database.SQLite.Simple.ToField (ToField, toField)
import qualified GitHub as GH
import OurPrelude
import Polysemy.Output
import System.Directory (doesDirectoryExist, createDirectoryIfMissing)
import System.Directory (createDirectoryIfMissing, doesDirectoryExist)
import System.Environment (lookupEnv)
import System.Posix.Directory (createDirectory)
import System.Posix.Env (getEnv)
import System.Environment (lookupEnv)
import System.Posix.Files
( directoryMode,
fileExist,
@@ -141,7 +141,10 @@ titleVersion title = if T.null prefix then Nothing else Just suffix
regDirMode :: FileMode
regDirMode =
directoryMode .|. ownerModes .|. groupModes .|. otherReadMode
directoryMode
.|. ownerModes
.|. groupModes
.|. otherReadMode
.|. otherExecuteMode
logsDirectory :: MonadIO m => ExceptT Text m FilePath
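For reference, the combined mode above works out to 0o40775, i.e. the directory bit plus rwxrwxr-x permission bits. A throwaway check, POSIX-only and purely illustrative:

```haskell
import Data.Bits ((.|.))
import System.Posix.Files
  ( directoryMode,
    groupModes,
    otherExecuteMode,
    otherReadMode,
    ownerModes,
  )
import Text.Printf (printf)

main :: IO ()
main =
  -- Prints 0o40775.
  printf
    "0o%o\n"
    (toInteger (directoryMode .|. ownerModes .|. groupModes .|. otherReadMode .|. otherExecuteMode))
```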
@@ -149,7 +152,7 @@ logsDirectory = do
dir <-
noteT "Could not get environment variable LOGS_DIRECTORY" $
MaybeT $
liftIO $
liftIO $
getEnv "LOGS_DIRECTORY"
dirExists <- liftIO $ doesDirectoryExist dir
tryAssert ("LOGS_DIRECTORY " <> T.pack dir <> " does not exist.") dirExists
@@ -163,8 +166,8 @@ logsDirectory = do
cacheDir :: MonadIO m => m FilePath
cacheDir = do
cacheDirectory <- liftIO $ lookupEnv "CACHE_DIRECTORY"
xdgCacheHome <- liftIO $ fmap (fmap (\ dir -> dir </> "nixpkgs-update")) $ lookupEnv "XDG_CACHE_HOME"
cacheHome <- liftIO $ fmap (fmap (\ dir -> dir </> ".cache/nixpkgs-update")) $ lookupEnv "HOME"
xdgCacheHome <- liftIO $ fmap (fmap (\dir -> dir </> "nixpkgs-update")) $ lookupEnv "XDG_CACHE_HOME"
cacheHome <- liftIO $ fmap (fmap (\dir -> dir </> ".cache/nixpkgs-update")) $ lookupEnv "HOME"
let dir = fromJust (cacheDirectory <|> xdgCacheHome <|> cacheHome)
liftIO $ createDirectoryIfMissing True dir
return dir
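In `cacheDir`, `(<|>)` on `Maybe` keeps the first `Just`, so the candidates are tried in priority order: `CACHE_DIRECTORY`, then `XDG_CACHE_HOME`, then `HOME` (the `fromJust` assumes at least one of them is set). A minimal standalone sketch of that selection; the `/tmp` default is hypothetical, added only to avoid `fromJust`:

```haskell
import Control.Applicative ((<|>))
import Data.Maybe (fromMaybe)
import System.FilePath ((</>))

-- Same fallback order as cacheDir above.
pickCacheDir :: Maybe FilePath -> Maybe FilePath -> Maybe FilePath -> FilePath
pickCacheDir cacheDirectory xdgCacheHome home =
  fromMaybe "/tmp/nixpkgs-update" $
    cacheDirectory
      <|> fmap (</> "nixpkgs-update") xdgCacheHome
      <|> fmap (</> ".cache/nixpkgs-update") home

main :: IO ()
main = putStrLn (pickCacheDir Nothing (Just "/home/u/.cache") Nothing)
-- prints /home/u/.cache/nixpkgs-update
```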
@@ -214,7 +217,9 @@ logDir :: IO FilePath
logDir = do
r <-
runExceptT
( logsDirectory <|> xdgRuntimeDir <|> tmpRuntimeDir
( logsDirectory
<|> xdgRuntimeDir
<|> tmpRuntimeDir
<|> throwE
"Failed to create log directory."
)

View File

@@ -8,8 +8,8 @@ module Version
)
where
import Data.Foldable (toList)
import Data.Char (isAlpha, isDigit)
import Data.Foldable (toList)
import Data.Function (on)
import qualified Data.PartialOrd as PO
import qualified Data.Text as T
@@ -62,30 +62,30 @@ clearBreakOn boundary string =
versionCompatibleWithPathPin :: Text -> Version -> Bool
versionCompatibleWithPathPin attrPath newVer
| "-unwrapped" `T.isSuffixOf` attrPath =
versionCompatibleWithPathPin (T.dropEnd 10 attrPath) newVer
versionCompatibleWithPathPin (T.dropEnd 10 attrPath) newVer
| "_x" `T.isSuffixOf` T.toLower attrPath =
versionCompatibleWithPathPin (T.dropEnd 2 attrPath) newVer
versionCompatibleWithPathPin (T.dropEnd 2 attrPath) newVer
| "_" `T.isInfixOf` attrPath =
let attrVersionPart =
let (_, version) = clearBreakOn "_" attrPath
in if T.any (notElemOf ('_' : ['0' .. '9'])) version
then Nothing
else Just version
-- Check assuming version part has underscore separators
attrVersionPeriods = T.replace "_" "." <$> attrVersionPart
in -- If we don't find version numbers in the attr path, exit success.
maybe True (`T.isPrefixOf` newVer) attrVersionPeriods
let attrVersionPart =
let (_, version) = clearBreakOn "_" attrPath
in if T.any (notElemOf ('_' : ['0' .. '9'])) version
then Nothing
else Just version
-- Check assuming version part has underscore separators
attrVersionPeriods = T.replace "_" "." <$> attrVersionPart
in -- If we don't find version numbers in the attr path, exit success.
maybe True (`T.isPrefixOf` newVer) attrVersionPeriods
| otherwise =
let attrVersionPart =
let version = T.dropWhile (notElemOf ['0' .. '9']) attrPath
in if T.any (notElemOf ['0' .. '9']) version
then Nothing
else Just version
-- Check assuming version part is the prefix of the version with dots
-- removed. For example, 91 => "9.1"
noPeriodNewVersion = T.replace "." "" newVer
in -- If we don't find version numbers in the attr path, exit success.
maybe True (`T.isPrefixOf` noPeriodNewVersion) attrVersionPart
let attrVersionPart =
let version = T.dropWhile (notElemOf ['0' .. '9']) attrPath
in if T.any (notElemOf ['0' .. '9']) version
then Nothing
else Just version
-- Check assuming version part is the prefix of the version with dots
-- removed. For example, 91 => "9.1"
noPeriodNewVersion = T.replace "." "" newVer
in -- If we don't find version numbers in the attr path, exit success.
maybe True (`T.isPrefixOf` noPeriodNewVersion) attrVersionPart
versionIncompatibleWithPathPin :: Text -> Version -> Bool
versionIncompatibleWithPathPin path version =
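The `otherwise` branch above compares the digit suffix of the attr path against the new version with its dots removed. A self-contained restatement of just that guard, with two worked examples; this is my reading of the code, not an authoritative spec:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.Text as T

-- Restates only the final guard of versionCompatibleWithPathPin.
pinMatchesByDigits :: T.Text -> T.Text -> Bool
pinMatchesByDigits attrPath newVer =
  let digits = T.dropWhile (`notElem` ['0' .. '9']) attrPath
      pin = if T.any (`notElem` ['0' .. '9']) digits then Nothing else Just digits
      noPeriodNewVersion = T.replace "." "" newVer
   in maybe True (`T.isPrefixOf` noPeriodNewVersion) pin

main :: IO ()
main = do
  print (pinMatchesByDigits "emacs28" "28.2") -- True:  "28" prefixes "282"
  print (pinMatchesByDigits "emacs28" "29.1") -- False: "28" does not prefix "291"
```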

View File

@@ -1,8 +1,8 @@
module CheckSpec where
import qualified Check
import qualified Data.Text as T
import Test.Hspec
import qualified Check
main :: IO ()
main = hspec spec

View File

@@ -84,4 +84,4 @@ nixpkgs-review comment body
###### Maintainer pings
cc @maintainer1 for [testing](https://github.com/ryantm/nixpkgs-update/blob/master/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).
cc @maintainer1 for [testing](https://github.com/ryantm/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).

View File

@@ -77,4 +77,4 @@ NixPkgs review skipped
###### Maintainer pings
cc @maintainer1 for [testing](https://github.com/ryantm/nixpkgs-update/blob/master/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).
cc @maintainer1 for [testing](https://github.com/ryantm/nixpkgs-update/blob/main/doc/nixpkgs-maintainer-faq.md#r-ryantm-opened-a-pr-for-my-package-what-do-i-do).