Modernise hooks support

Summary: Migrate hooks to new futures and thus modern tokio. In the process, replace Lua hooks with Rust hooks, and add fixes for the few cases where Lua was too restrictive about what could be done.

Reviewed By: StanislavGlebik

Differential Revision: D20165425

fbshipit-source-id: 7bdc6820144f2fdaed653a34ff7c998913007ca2
Simon Farnsworth 2020-03-18 09:13:05 -07:00 committed by Facebook GitHub Bot
parent 6744740f42
commit a908be34b3
21 changed files with 677 additions and 3202 deletions
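The change repeats one bridging pattern throughout: new code is written as `async`/`await` against futures 0.3, old futures 0.1 values are pulled in with `.compat()`, and async blocks are handed back to unmigrated callers via `.boxed().compat()`. A minimal sketch of both directions, assuming the `futures` 0.3 crate with its `compat` feature enabled and the 0.1 crate renamed to `futures_old`, as in the imports below; the function names are illustrative:

```rust
use anyhow::Error;
use futures::compat::Future01CompatExt; // .compat(): 0.1 future -> 0.3 future
use futures::{FutureExt, TryFutureExt}; // .boxed() and .compat(): 0.3 -> 0.1
use futures_old::Future as OldFuture; // futures 0.1, renamed as in this diff

// An old-style future, as still produced by not-yet-migrated callees.
fn fetch_old() -> impl OldFuture<Item = u64, Error = Error> {
    futures_old::future::ok(42)
}

// New code awaits old futures through the compat shim.
async fn fetch_new() -> Result<u64, Error> {
    let n = fetch_old().compat().await?;
    Ok(n + 1)
}

// Callers that still expect a 0.1 future get one back, exactly like the
// `.boxed().compat()` at the end of run_hooks_for_changeset below.
fn fetch_for_old_caller() -> impl OldFuture<Item = u64, Error = Error> {
    fetch_new().boxed().compat()
}
```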


@@ -13,7 +13,8 @@ use blobrepo::BlobRepo;
use bookmarks::BookmarkName;
use cloned::cloned;
use context::CoreContext;
use futures_ext::{spawn_future, BoxFuture, FutureExt};
use futures::{FutureExt, TryFutureExt};
use futures_ext::{spawn_future, BoxFuture, FutureExt as OldFutureExt};
use futures_old::{Future, Stream};
use hooks::{hook_loader::load_hooks, HookManager};
use hooks_content_stores::{blobrepo_text_only_store, BlobRepoChangesetStore};
@@ -285,24 +286,29 @@ fn run_hooks_for_changeset(
) -> impl Future<Item = (HgChangesetId, HookResults), Error = Error> {
repo.get_hg_from_bonsai_changeset(ctx.clone(), cs)
.and_then(move |hg_cs| {
debug!(ctx.logger(), "Running file hooks for changeset {:?}", hg_cs);
hm.run_file_hooks_for_bookmark(ctx.clone(), hg_cs.clone(), &bm, None)
.map(move |res| (hg_cs, res))
.and_then(move |(hg_cs, file_res)| {
let hg_cs = hg_cs.clone();
debug!(
ctx.logger(),
"Running changeset hooks for changeset {:?}", hg_cs
);
hm.run_changeset_hooks_for_bookmark(ctx, hg_cs.clone(), &bm, None)
.map(move |res| {
let hook_results = HookResults {
file_hooks_results: file_res,
cs_hooks_result: res,
};
(hg_cs, hook_results)
})
})
let ctx = ctx.clone();
let hm = hm.clone();
let bm = bm.clone();
async move {
debug!(ctx.logger(), "Running file hooks for changeset {:?}", hg_cs);
let file_hooks_results = hm
.run_file_hooks_for_bookmark(&ctx, hg_cs, &bm, None)
.await?;
debug!(
ctx.logger(),
"Running changeset hooks for changeset {:?}", hg_cs
);
let cs_hooks_result = hm
.run_changeset_hooks_for_bookmark(&ctx, hg_cs.clone(), &bm, None)
.await?;
let hook_results = HookResults {
file_hooks_results,
cs_hooks_result,
};
Ok((hg_cs, hook_results))
}
.boxed()
.compat()
})
}


@@ -1,130 +0,0 @@
# Commit Hooks
This document explains how to author and configure commit hooks for Mononoke.
## Overview
Two types of hooks are supported (note that both are pre-commit hooks):
* `PerChangeset` runs once per changeset.
* `PerAddedOrModifiedFile` runs once per added or modified file per changeset.
Individual hooks are declared as follows in the `server.toml` file that
contains the configuration for a repo:
```toml
[[hooks]]
# The name of your hook. Must be unique within this file.
name="my_hook"
# Path to the Lua script that implements your hook.
# Relative paths are resolved relative to the root of the config repo.
path="my_hook.lua"
# This must be either "PerChangeset" or "PerAddedOrModifiedFile".
hook_type="PerAddedOrModifiedFile"
# The following property is optional.
# If specified and there is a line in the commit message that matches
# the specified value, then the hook is not run for that commit.
bypass_commit_string="@ignore-conflict-markers"
# The following property is optional.
# If specified, the hook can be bypassed by specifying `--pushvars KEY=VALUE`
# when running `hg push`.
bypass_pushvar="KEY=VALUE"
```
Hooks are enabled per bookmark by declaring them in `[[bookmarks.hooks]]`:
```toml
[[bookmarks.hooks]]
# Note how this matches the "name" property in [[hooks]].
hook_name="my_hook"
```
Here is an example of a hook to prevent Perl files from being checked in:
```lua
-- file my_hook.lua
function hook (ctx)
  -- `%.` escapes the dot in a Lua pattern
  if ctx.file.path:match("%.perl$") then
    return false, "Boo: scary Perl!"
  else
    return true
  end
end
```
## Lua API
Your hook must be implemented in Lua. The entry point to your hook must be a
global function named `hook()`. This function should return up to three
values:
* `success` (`boolean`) This should be `true` if the hook was satisfied and
`false` if it was not. If this is `true`, then `description` and
`long_description` must be `nil`.
* `description` (`string` or `nil`) If the hook was not satisfied, this
must provide a short description of the failure, used to summarize this failure
with other similar failures.
* `long_description` (`string` or `nil`) If the hook was not satisfied, this
should provide a long description of the failure with suggestions for how the
user should approach fixing this hook failure. Mononoke will use `description` if this
is not provided, but this message needs to stand alone (`description` is not
automatically added to this message).
Some properties are common to both hook types. The `ctx` argument passed to the
hook function always has at least the following fields:
| key | description |
| --- | ----------- |
| `config_strings` | (`table of string`) Contains string configs defined per repository config |
| `config_ints` | (`table of int`) Contains int configs defined per repository config |
| `regex_match(regex, string)` | (`function`) Returns a `boolean` indicating whether the string matches the supplied regex |
The type `file` is a table with the following fields:
| key | description |
| --------- | ----------- |
| `path` | (`string`) Path to the file relative to the repo root. |
| `is_added()` | Returns a `boolean` indicating whether the file was added as part of the changeset |
| `is_deleted()` | Returns a `boolean` indicating whether the file was deleted as part of the changeset |
| `is_modified()` | Returns a `boolean` indicating whether the file was modified as part of the changeset |
| `len()` | Returns a `number` that is the length of the file in bytes. (Only present if `is_deleted()` returns `false`.) |
| `text()` | Returns a `string` containing the contents of the file, or `nil` if the file is binary or too large. (Only present if `is_deleted()` returns `false`.) |
| `path_regex_match(regex)` | Returns a `boolean` indicating whether the file's path matches the supplied regex |
### PerChangeset
Your `hook()` function receives a single `ctx` argument, which is a table with
the following additional fields:
| key | description |
| --------- | ----------- |
| `info` | (`table`) Described below. |
| `files` | (`table`) List of objects of type `file`, described above. |
| `file_text(path)` | (`function`) Takes the relative path to a file in the repo and returns its contents, or `nil` if the file does not exist, is binary, or too large. |
| `parse_commit_msg()` | (`function`) Returns a table of Phabricator tags parsed from the commit message. |
| `is_valid_reviewer(user)` | (`function`) Returns whether a user can review the commit. |
`ctx.info` is a table with the following fields:
| key | description |
| --------- | ----------- |
| `author` | (`string`) The author of the changeset. This should be something like `"Stanislau Hlebik <stash@fb.com>"`. |
| `comments` | (`string`) The commit message. |
| `parent1_hash` | (`string` or `nil`) `p1` for the commit as a hex string, if it exists. |
| `parent2_hash` | (`string` or `nil`) `p2` for the commit as a hex string, if it exists. |
### PerAddedOrModifiedFile
Your `hook()` function receives a single `ctx` argument, which is a table with
the following additional fields:
| key | description |
| --------- | ----------- |
| `file` | (`table`) Object of type `file`, described above. (Note `is_deleted()` will return `false`.) |


@@ -6,12 +6,15 @@
*/
use anyhow::Error;
use async_trait::async_trait;
use blobrepo::BlobRepo;
use blobstore::Loadable;
use cloned::cloned;
use context::CoreContext;
use futures::{Future, Stream};
use futures_ext::{BoxFuture, FutureExt, StreamExt};
use futures::{
compat::{Future01CompatExt, Stream01CompatExt},
future,
stream::TryStreamExt,
};
use manifest::{Diff, Entry, ManifestOps};
use mercurial_types::{blobs::HgBlobChangeset, FileBytes, HgChangesetId, HgFileNodeId, MPath};
use mononoke_types::FileType;
@@ -29,46 +32,46 @@ pub struct BlobRepoChangesetStore {
pub repo: BlobRepo,
}
#[async_trait]
impl FileContentStore for BlobRepoFileContentStore {
fn resolve_path(
&self,
ctx: CoreContext,
async fn resolve_path<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changeset_id: HgChangesetId,
path: MPath,
) -> BoxFuture<Option<HgFileNodeId>, Error> {
changeset_id
) -> Result<Option<HgFileNodeId>, Error> {
let cs = changeset_id
.load(ctx.clone(), self.repo.blobstore())
.from_err()
.and_then({
cloned!(self.repo, ctx);
move |changeset| {
changeset
.manifestid()
.find_entry(ctx, repo.get_blobstore(), Some(path))
.map(|entry| Some(entry?.into_leaf()?.1))
}
})
.boxify()
.compat()
.await?;
let entry = cs
.manifestid()
.find_entry(ctx.clone(), self.repo.get_blobstore(), Some(path))
.compat()
.await?;
Ok(entry.and_then(|entry| entry.into_leaf()).map(|leaf| leaf.1))
}
fn get_file_text(
&self,
ctx: CoreContext,
async fn get_file_text<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
id: HgFileNodeId,
) -> BoxFuture<Option<FileBytes>, Error> {
) -> Result<Option<FileBytes>, Error> {
let store = self.repo.get_blobstore();
id.load(ctx.clone(), &store)
.from_err()
.and_then(move |envelope| filestore::fetch_concat(&store, ctx, envelope.content_id()))
.map(|content| Some(FileBytes(content)))
.boxify()
let envelope = id.load(ctx.clone(), &store).compat().await?;
let content = filestore::fetch_concat(&store, ctx.clone(), envelope.content_id())
.compat()
.await?;
Ok(Some(FileBytes(content)))
}
fn get_file_size(&self, ctx: CoreContext, id: HgFileNodeId) -> BoxFuture<u64, Error> {
id.load(ctx, self.repo.blobstore())
.from_err()
.map(|envelope| envelope.content_size())
.boxify()
async fn get_file_size<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
id: HgFileNodeId,
) -> Result<u64, Error> {
let envelope = id.load(ctx.clone(), self.repo.blobstore()).compat().await?;
Ok(envelope.content_size())
}
}
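`FileContentStore` and the other store traits are converted with `#[async_trait]`; because each method borrows a `CoreContext`, the signatures carry explicit `'a`/`'b: 'a` lifetimes tying the borrow to the boxed future the macro generates. A minimal sketch of the same shape, with `Ctx` standing in for `CoreContext`:

```rust
use anyhow::Error;
use async_trait::async_trait;

struct Ctx; // stand-in for CoreContext

#[async_trait]
trait SizeStore: Send + Sync {
    // `'b: 'a` says the borrowed context must outlive the returned future,
    // which #[async_trait] desugars to Pin<Box<dyn Future + Send + 'a>>.
    async fn size<'a, 'b: 'a>(&'a self, ctx: &'b Ctx, key: &str) -> Result<u64, Error>;
}

struct Fixed(u64);

#[async_trait]
impl SizeStore for Fixed {
    async fn size<'a, 'b: 'a>(&'a self, _ctx: &'b Ctx, _key: &str) -> Result<u64, Error> {
        Ok(self.0)
    }
}
```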
@@ -78,94 +81,88 @@ impl BlobRepoFileContentStore {
}
}
#[async_trait]
impl ChangesetStore for BlobRepoChangesetStore {
fn get_changeset_by_changesetid(
&self,
ctx: CoreContext,
async fn get_changeset_by_changesetid<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changesetid: HgChangesetId,
) -> BoxFuture<HgBlobChangeset, Error> {
changesetid
.load(ctx, self.repo.blobstore())
.from_err()
.boxify()
}
fn get_changed_files(
&self,
ctx: CoreContext,
changesetid: HgChangesetId,
) -> BoxFuture<Vec<(String, ChangedFileType, Option<(HgFileNodeId, FileType)>)>, Error> {
cloned!(self.repo);
) -> Result<HgBlobChangeset, Error> {
changesetid
.load(ctx.clone(), self.repo.blobstore())
.from_err()
.map({
cloned!(ctx);
move |cs| {
let mf_id = cs.manifestid();
let parents = cs.parents();
let (maybe_p1, _) = parents.get_nodes();
match maybe_p1 {
Some(p1) => HgChangesetId::new(p1)
.load(ctx.clone(), repo.blobstore())
.from_err()
.map(|p1| p1.manifestid())
.map({
cloned!(repo);
move |p_mf_id| {
p_mf_id.diff(ctx.clone(), repo.get_blobstore(), mf_id)
}
})
.flatten_stream()
.filter_map(|diff| {
let (path, entry) = match diff.clone() {
Diff::Added(path, entry) => (path, entry),
Diff::Removed(path, entry) => (path, entry),
Diff::Changed(path, .., entry) => (path, entry),
};
.compat()
.await
.map_err(|e| e.into())
}
let hash_and_type = match entry {
Entry::Leaf((ty, hash)) => (hash, ty),
Entry::Tree(_) => {
return None;
}
};
async fn get_changed_files<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changesetid: HgChangesetId,
) -> Result<Vec<(String, ChangedFileType, Option<(HgFileNodeId, FileType)>)>, Error> {
let cs = changesetid
.load(ctx.clone(), self.repo.blobstore())
.compat()
.await?;
let mf_id = cs.manifestid();
let parents = cs.parents();
let (maybe_p1, _) = parents.get_nodes();
match maybe_p1 {
Some(p1) => {
let p1 = HgChangesetId::new(p1)
.load(ctx.clone(), self.repo.blobstore())
.compat()
.await?;
let p_mf_id = p1.manifestid();
p_mf_id
.diff(ctx.clone(), self.repo.get_blobstore(), mf_id)
.compat()
.try_filter_map(|diff| {
let (path, change_type, entry) = match diff {
Diff::Added(path, entry) => (path, ChangedFileType::Added, entry),
Diff::Removed(path, entry) => (path, ChangedFileType::Deleted, entry),
Diff::Changed(path, .., entry) => {
(path, ChangedFileType::Modified, entry)
}
};
match diff {
Diff::Added(..) => {
Some((path, ChangedFileType::Added, Some(hash_and_type)))
}
Diff::Changed(..) => {
Some((path, ChangedFileType::Modified, Some(hash_and_type)))
}
Diff::Removed(..) => {
Some((path, ChangedFileType::Deleted, None))
}
}
})
.filter_map(|(maybe_path, ty, hash_and_type)| {
maybe_path.map(|path| (path, ty, hash_and_type))
})
.boxify(),
None => mf_id
.list_leaf_entries(ctx.clone(), repo.get_blobstore())
.map(|(path, (ty, filenode))| {
(path, ChangedFileType::Added, Some((filenode, ty)))
})
.boxify(),
}
}
})
.flatten_stream()
.map(|(path, ty, hash_and_type)| {
(
String::from_utf8_lossy(&path.to_vec()).into_owned(),
ty,
hash_and_type,
)
})
.collect()
.boxify()
match (change_type, entry) {
(ChangedFileType::Deleted, Entry::Leaf(_)) => {
future::ok(Some((path, ChangedFileType::Deleted, None)))
}
(change_type, Entry::Leaf((ty, hash))) => {
future::ok(Some((path, change_type, Some((hash, ty)))))
}
(_, Entry::Tree(_)) => future::ok(None),
}
})
.try_filter_map(|(maybe_path, ty, hash_and_type)| {
future::ok(maybe_path.map(|path| {
(
String::from_utf8_lossy(&path.to_vec()).into_owned(),
ty,
hash_and_type,
)
}))
})
.try_collect()
.await
}
None => {
mf_id
.list_leaf_entries(ctx.clone(), self.repo.get_blobstore())
.compat()
.map_ok(|(path, (ty, filenode))| {
(
String::from_utf8_lossy(&path.to_vec()).into_owned(),
ChangedFileType::Added,
Some((filenode, ty)),
)
})
.try_collect()
.await
}
}
}
}
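`get_changed_files` drives the 0.1 manifest-diff stream from async code: `Stream01CompatExt::compat` turns it into a futures 0.3 `TryStream`, and `try_filter_map`/`try_collect` replace the old `filter_map`/`collect` while still failing fast on the first error. The same pipeline in isolation, with a stub stream standing in for the manifest diff:

```rust
use anyhow::Error;
use futures::compat::Stream01CompatExt;
use futures::{future, stream::TryStreamExt};

async fn collect_even() -> Result<Vec<u64>, Error> {
    // A futures 0.1 stream, standing in for the manifest diff stream.
    let old_stream = futures_old::stream::iter_ok::<_, Error>(vec![1u64, 2, 3, 4]);

    old_stream
        .compat() // 0.1 Stream -> 0.3 TryStream of Result<u64, Error>
        .try_filter_map(|n| future::ok(if n % 2 == 0 { Some(n) } else { None }))
        .try_collect() // collects into Vec<u64>, stopping at the first Err
        .await
}
```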


@@ -6,10 +6,9 @@
*/
use anyhow::Error;
use async_trait::async_trait;
use bytes::Bytes;
use context::CoreContext;
use futures::IntoFuture;
use futures_ext::{BoxFuture, FutureExt};
use mercurial_types::{blobs::HgBlobChangeset, FileBytes, HgChangesetId, HgFileNodeId, MPath};
use mononoke_types::FileType;
use std::collections::HashMap;
@@ -22,31 +21,28 @@ pub struct InMemoryChangesetStore {
map_cs: HashMap<HgChangesetId, HgBlobChangeset>,
}
#[async_trait]
impl ChangesetStore for InMemoryChangesetStore {
fn get_changeset_by_changesetid(
&self,
_ctx: CoreContext,
async fn get_changeset_by_changesetid<'a, 'b: 'a>(
&'a self,
_ctx: &'b CoreContext,
changesetid: HgChangesetId,
) -> BoxFuture<HgBlobChangeset, Error> {
) -> Result<HgBlobChangeset, Error> {
match self.map_cs.get(&changesetid) {
Some(cs) => Ok(cs.clone()),
None => Err(ErrorKind::NoSuchChangeset(changesetid.to_string()).into()),
}
.into_future()
.boxify()
}
fn get_changed_files(
&self,
_ctx: CoreContext,
async fn get_changed_files<'a, 'b: 'a>(
&'a self,
_ctx: &'b CoreContext,
changesetid: HgChangesetId,
) -> BoxFuture<Vec<(String, ChangedFileType, Option<(HgFileNodeId, FileType)>)>, Error> {
) -> Result<Vec<(String, ChangedFileType, Option<(HgFileNodeId, FileType)>)>, Error> {
match self.map_files.get(&changesetid) {
Some(files) => Ok(files.clone()),
None => Err(ErrorKind::NoSuchChangeset(changesetid.to_string()).into()),
}
.into_future()
.boxify()
}
}
@@ -102,22 +98,22 @@ pub struct InMemoryFileContentStore {
path_to_filenode: HashMap<(HgChangesetId, MPath), HgFileNodeId>,
}
#[async_trait]
impl FileContentStore for InMemoryFileContentStore {
fn resolve_path(
&self,
_ctx: CoreContext,
async fn resolve_path<'a, 'b: 'a>(
&'a self,
_ctx: &'b CoreContext,
cs_id: HgChangesetId,
path: MPath,
) -> BoxFuture<Option<HgFileNodeId>, Error> {
let filenode = self.path_to_filenode.get(&(cs_id, path)).cloned();
Ok(filenode).into_future().boxify()
) -> Result<Option<HgFileNodeId>, Error> {
Ok(self.path_to_filenode.get(&(cs_id, path)).cloned())
}
fn get_file_text(
&self,
_ctx: CoreContext,
async fn get_file_text<'a, 'b: 'a>(
&'a self,
_ctx: &'b CoreContext,
id: HgFileNodeId,
) -> BoxFuture<Option<FileBytes>, Error> {
) -> Result<Option<FileBytes>, Error> {
self.id_to_text
.get(&id)
.ok_or(Error::msg("file not found"))
@@ -125,11 +121,13 @@ impl FileContentStore for InMemoryFileContentStore {
InMemoryFileText::Present(ref bytes) => Some(bytes.clone()),
InMemoryFileText::Elided(_) => None,
})
.into_future()
.boxify()
}
fn get_file_size(&self, _ctx: CoreContext, id: HgFileNodeId) -> BoxFuture<u64, Error> {
async fn get_file_size<'a, 'b: 'a>(
&'a self,
_ctx: &'b CoreContext,
id: HgFileNodeId,
) -> Result<u64, Error> {
self.id_to_text
.get(&id)
.ok_or(Error::msg("file not found"))
@@ -137,8 +135,6 @@ impl FileContentStore for InMemoryFileContentStore {
InMemoryFileText::Present(ref bytes) => bytes.size() as u64,
InMemoryFileText::Elided(size) => *size,
})
.into_future()
.boxify()
}
}


@@ -6,45 +6,51 @@
*/
use anyhow::Error;
use async_trait::async_trait;
use context::CoreContext;
use futures_ext::BoxFuture;
use mercurial_types::{blobs::HgBlobChangeset, FileBytes, HgChangesetId, HgFileNodeId, MPath};
use mononoke_types::FileType;
#[derive(Clone)]
#[derive(Clone, PartialEq, Eq)]
pub enum ChangedFileType {
Added,
Deleted,
Modified,
}
#[async_trait]
pub trait FileContentStore: Send + Sync {
fn resolve_path(
&self,
ctx: CoreContext,
async fn resolve_path<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changeset_id: HgChangesetId,
path: MPath,
) -> BoxFuture<Option<HgFileNodeId>, Error>;
) -> Result<Option<HgFileNodeId>, Error>;
fn get_file_text(
&self,
ctx: CoreContext,
async fn get_file_text<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
id: HgFileNodeId,
) -> BoxFuture<Option<FileBytes>, Error>;
) -> Result<Option<FileBytes>, Error>;
fn get_file_size(&self, ctx: CoreContext, id: HgFileNodeId) -> BoxFuture<u64, Error>;
async fn get_file_size<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
id: HgFileNodeId,
) -> Result<u64, Error>;
}
#[async_trait]
pub trait ChangesetStore: Send + Sync {
fn get_changeset_by_changesetid(
&self,
ctx: CoreContext,
async fn get_changeset_by_changesetid<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changesetid: HgChangesetId,
) -> BoxFuture<HgBlobChangeset, Error>;
) -> Result<HgBlobChangeset, Error>;
fn get_changed_files(
&self,
ctx: CoreContext,
async fn get_changed_files<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changesetid: HgChangesetId,
) -> BoxFuture<Vec<(String, ChangedFileType, Option<(HgFileNodeId, FileType)>)>, Error>;
) -> Result<Vec<(String, ChangedFileType, Option<(HgFileNodeId, FileType)>)>, Error>;
}


@@ -6,10 +6,8 @@
*/
use anyhow::Error;
use cloned::cloned;
use async_trait::async_trait;
use context::CoreContext;
use futures::{Future, IntoFuture};
use futures_ext::{BoxFuture, FutureExt};
use mercurial_types::{FileBytes, HgChangesetId, HgFileNodeId, MPath};
use std::sync::Arc;
@@ -31,45 +29,42 @@ impl<T> TextOnlyFileContentStore<T> {
}
}
#[async_trait]
impl<T: FileContentStore + 'static> FileContentStore for TextOnlyFileContentStore<T> {
fn resolve_path(
&self,
ctx: CoreContext,
async fn resolve_path<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
changeset_id: HgChangesetId,
path: MPath,
) -> BoxFuture<Option<HgFileNodeId>, Error> {
self.inner.resolve_path(ctx, changeset_id, path)
) -> Result<Option<HgFileNodeId>, Error> {
self.inner.resolve_path(ctx, changeset_id, path).await
}
/// Override the inner store's get_file_text by filtering out files that are too large or
/// contain null bytes (those are assumed to be binary).
fn get_file_text(
&self,
ctx: CoreContext,
async fn get_file_text<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
id: HgFileNodeId,
) -> BoxFuture<Option<FileBytes>, Error> {
self.get_file_size(ctx.clone(), id)
.and_then({
cloned!(self.inner, self.max_size);
move |file_size| {
if file_size > max_size {
return Ok(None).into_future().left_future();
}
) -> Result<Option<FileBytes>, Error> {
let file_size = self.get_file_size(ctx, id).await?;
if file_size > self.max_size {
return Ok(None);
}
inner
.get_file_text(ctx, id)
.map(|file_bytes| match file_bytes {
Some(ref file_bytes) if looks_like_binary(&file_bytes) => None,
_ => file_bytes,
})
.right_future()
}
})
.boxify()
let file_bytes = self.inner.get_file_text(ctx, id).await?;
Ok(match file_bytes {
Some(ref file_bytes) if looks_like_binary(file_bytes) => None,
_ => file_bytes,
})
}
fn get_file_size(&self, ctx: CoreContext, id: HgFileNodeId) -> BoxFuture<u64, Error> {
self.inner.get_file_size(ctx, id)
async fn get_file_size<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
id: HgFileNodeId,
) -> Result<u64, Error> {
self.inner.get_file_size(ctx, id).await
}
}
@@ -94,7 +89,9 @@ mod test {
inner.insert(ONES_CSID, MPath::new("f1").unwrap(), TWOS_FNID, "foobar");
let store = TextOnlyFileContentStore::new(inner, 10);
let ret = rt.block_on(store.get_file_text(ctx, TWOS_FNID)).unwrap();
let ret = rt
.block_on_std(store.get_file_text(&ctx, TWOS_FNID))
.unwrap();
assert_eq!(ret, Some(FileBytes("foobar".into())));
}
@@ -107,7 +104,9 @@ mod test {
inner.insert(ONES_CSID, MPath::new("f1").unwrap(), TWOS_FNID, "foobar");
let store = TextOnlyFileContentStore::new(inner, 2);
let ret = rt.block_on(store.get_file_text(ctx, TWOS_FNID)).unwrap();
let ret = rt
.block_on_std(store.get_file_text(&ctx, TWOS_FNID))
.unwrap();
assert_eq!(ret, None);
}
@@ -120,7 +119,9 @@ mod test {
inner.insert(ONES_CSID, MPath::new("f1").unwrap(), TWOS_FNID, "foo\0");
let store = TextOnlyFileContentStore::new(inner, 10);
let ret = rt.block_on(store.get_file_text(ctx, TWOS_FNID)).unwrap();
let ret = rt
.block_on_std(store.get_file_text(&ctx, TWOS_FNID))
.unwrap();
assert_eq!(ret, None);
}
}
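The tests now build std futures and need a runtime that can drive them; `block_on_std` is the bridge method on the tokio-compat runtime these tests use. A hedged sketch of the same idea on a stock tokio 0.2 runtime, whose `block_on` plays the same role (the store call is stubbed):

```rust
use anyhow::Error;

async fn get_file_text_stub() -> Result<Option<String>, Error> {
    // Stands in for store.get_file_text(&ctx, TWOS_FNID).
    Ok(Some("foobar".to_string()))
}

fn main() -> Result<(), Error> {
    // tokio 0.2: a runtime driving a std::future to completion.
    let mut rt = tokio::runtime::Runtime::new()?;
    let ret = rt.block_on(get_file_text_stub())?;
    assert_eq!(ret, Some("foobar".to_string()));
    Ok(())
}
```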


@@ -8,20 +8,21 @@
#![deny(warnings)]
use anyhow::Error;
use async_trait::async_trait;
use blobrepo::BlobRepo;
use blobstore::Loadable;
use bookmarks::{BookmarkName, BookmarkUpdateReason};
use context::CoreContext;
use fbinit::FacebookInit;
use fixtures::many_files_dirs;
use futures::compat::Future01CompatExt;
use futures_ext::{BoxFuture, FutureExt};
use futures_old::future::ok;
use futures_old::Future;
use futures_old::{stream, Stream};
use futures::{
compat::Future01CompatExt,
future,
stream::{futures_unordered, TryStreamExt},
};
use hooks::{
hook_loader::load_hooks, ErrorKind, FileHookExecutionID, Hook, HookChangeset,
HookChangesetParents, HookContext, HookExecution, HookFile, HookManager, HookRejectionInfo,
hook_loader::load_hooks, ErrorKind, Hook, HookChangeset, HookChangesetParents, HookContext,
HookExecution, HookFile, HookManager, HookRejectionInfo,
};
use hooks_content_stores::{
BlobRepoChangesetStore, BlobRepoFileContentStore, ChangedFileType, InMemoryChangesetStore,
@@ -31,9 +32,9 @@ use maplit::{btreemap, hashmap, hashset};
use mercurial_types::{HgChangesetId, MPath};
use mercurial_types_mocks::nodehash::{ONES_FNID, THREES_FNID, TWOS_FNID};
use metaconfig_types::{
BlobConfig, BookmarkOrRegex, BookmarkParams, Bundle2ReplayParams, DerivedDataConfig, HookCode,
HookConfig, HookParams, HookType, InfinitepushParams, MetadataDBConfig, Redaction, RepoConfig,
RepoReadOnly, SourceControlServiceParams, StorageConfig,
BlobConfig, BookmarkParams, Bundle2ReplayParams, DerivedDataConfig, HookConfig, HookParams,
HookType, InfinitepushParams, MetadataDBConfig, Redaction, RepoConfig, RepoReadOnly,
SourceControlServiceParams, StorageConfig,
};
use mononoke_types::{FileType, RepositoryId};
use regex::Regex;
@@ -55,13 +56,14 @@ impl FnChangesetHook {
}
}
#[async_trait]
impl Hook<HookChangeset> for FnChangesetHook {
fn run(
async fn run(
&self,
_ctx: CoreContext,
_ctx: &CoreContext,
context: HookContext<HookChangeset>,
) -> BoxFuture<HookExecution, Error> {
ok((self.f)(context)).boxify()
) -> Result<HookExecution, Error> {
Ok((self.f)(context))
}
}
@@ -80,14 +82,15 @@ struct ContextMatchingChangesetHook {
expected_context: HookContext<HookChangeset>,
}
#[async_trait]
impl Hook<HookChangeset> for ContextMatchingChangesetHook {
fn run(
async fn run(
&self,
_ctx: CoreContext,
_ctx: &CoreContext,
context: HookContext<HookChangeset>,
) -> BoxFuture<HookExecution, Error> {
) -> Result<HookExecution, Error> {
assert_eq!(self.expected_context, context);
Box::new(ok(HookExecution::Accepted))
Ok(HookExecution::Accepted)
}
}
@@ -102,51 +105,48 @@ struct FileContentMatchingChangesetHook {
expected_content: HashMap<String, Option<String>>,
}
#[async_trait]
impl Hook<HookChangeset> for FileContentMatchingChangesetHook {
fn run(
async fn run(
&self,
ctx: CoreContext,
ctx: &CoreContext,
context: HookContext<HookChangeset>,
) -> BoxFuture<HookExecution, Error> {
let mut futs = stream::FuturesUnordered::new();
) -> Result<HookExecution, Error> {
let futs = futures_unordered::FuturesUnordered::new();
for file in context.data.files {
let fut = match self.expected_content.get(&file.path) {
Some(expected_content) => {
let expected_content = expected_content.clone();
file.file_text(ctx.clone())
.map(move |content| {
let content = content
.map(|c| std::str::from_utf8(c.as_bytes()).unwrap().to_string());
let fut = async move {
match self.expected_content.get(&file.path) {
Some(expected_content) => {
let content = file.file_text(ctx).await?;
let content =
content.map(|c| std::str::from_utf8(c.as_bytes()).unwrap().to_string());
match (content, expected_content) {
(Some(content), Some(expected_content)) => {
if content.contains(&expected_content) {
true
} else {
false
}
Ok(match (content, expected_content.as_ref()) {
(Some(content), Some(expected_content)) => {
if content.contains(expected_content) {
true
} else {
false
}
(None, None) => true,
_ => false,
}
(None, None) => true,
_ => false,
})
.boxify()
}
None => Ok(false),
}
None => Box::new(ok(false)),
};
futs.push(fut);
}
futs.skip_while(|b| Ok(*b))
.into_future()
.map(|(opt_item, _)| {
if opt_item.is_some() {
default_rejection()
} else {
HookExecution::Accepted
}
})
.map_err(|(e, _)| e)
.boxify()
let opt_item = futs
.try_skip_while(|b: &bool| future::ok::<_, Error>(*b))
.try_next()
.await?;
Ok(if opt_item.is_some() {
default_rejection()
} else {
HookExecution::Accepted
})
}
}
@@ -161,36 +161,36 @@ struct LengthMatchingChangesetHook {
expected_lengths: HashMap<String, u64>,
}
#[async_trait]
impl Hook<HookChangeset> for LengthMatchingChangesetHook {
fn run(
async fn run(
&self,
ctx: CoreContext,
ctx: &CoreContext,
context: HookContext<HookChangeset>,
) -> BoxFuture<HookExecution, Error> {
let mut futs = stream::FuturesUnordered::new();
) -> Result<HookExecution, Error> {
let futs = futures_unordered::FuturesUnordered::new();
for file in context.data.files {
let fut = match self.expected_lengths.get(&file.path) {
Some(expected_length) => {
let expected_length = *expected_length;
file.len(ctx.clone())
.map(move |length| length == expected_length)
.boxify()
let fut = async move {
match self.expected_lengths.get(&file.path) {
Some(expected_length) => {
let expected_length = *expected_length;
let length = file.len(ctx).await?;
Ok(length == expected_length)
}
None => Ok(false),
}
None => Box::new(ok(false)),
};
futs.push(fut);
}
futs.skip_while(|b| Ok(*b))
.into_future()
.map(|(opt_item, _)| {
if opt_item.is_some() {
default_rejection()
} else {
HookExecution::Accepted
}
})
.map_err(|(e, _)| e)
.boxify()
let opt_item = futs
.try_skip_while(|b: &bool| future::ok::<_, Error>(*b))
.try_next()
.await?;
Ok(if opt_item.is_some() {
default_rejection()
} else {
HookExecution::Accepted
})
}
}
@@ -206,27 +206,23 @@ struct OtherFileMatchingChangesetHook {
expected_content: Option<String>,
}
#[async_trait]
impl Hook<HookChangeset> for OtherFileMatchingChangesetHook {
fn run(
async fn run(
&self,
ctx: CoreContext,
ctx: &CoreContext,
context: HookContext<HookChangeset>,
) -> BoxFuture<HookExecution, Error> {
let expected_content = self.expected_content.clone();
context
) -> Result<HookExecution, Error> {
let opt = context
.data
.file_text(ctx, self.file_path.clone())
.map(|opt| {
opt.map(|content| std::str::from_utf8(content.as_bytes()).unwrap().to_string())
})
.map(move |opt| {
if opt == expected_content {
HookExecution::Accepted
} else {
default_rejection()
}
})
.boxify()
.await?
.map(|content| std::str::from_utf8(content.as_bytes()).unwrap().to_string());
Ok(if opt == self.expected_content {
HookExecution::Accepted
} else {
default_rejection()
})
}
}
@@ -251,13 +247,14 @@ impl FnFileHook {
}
}
#[async_trait]
impl Hook<HookFile> for FnFileHook {
fn run(
async fn run(
&self,
_ctx: CoreContext,
_ctx: &CoreContext,
context: HookContext<HookFile>,
) -> BoxFuture<HookExecution, Error> {
ok((self.f)(context)).boxify()
) -> Result<HookExecution, Error> {
Ok((self.f)(context))
}
}
@@ -276,18 +273,18 @@ struct PathMatchingFileHook {
paths: HashSet<String>,
}
#[async_trait]
impl Hook<HookFile> for PathMatchingFileHook {
fn run(
async fn run(
&self,
_ctx: CoreContext,
_ctx: &CoreContext,
context: HookContext<HookFile>,
) -> BoxFuture<HookExecution, Error> {
ok(if self.paths.contains(&context.data.path) {
) -> Result<HookExecution, Error> {
Ok(if self.paths.contains(&context.data.path) {
HookExecution::Accepted
} else {
default_rejection()
})
.boxify()
}
}
@@ -300,32 +297,26 @@ struct FileContentMatchingFileHook {
expected_content: Option<String>,
}
#[async_trait]
impl Hook<HookFile> for FileContentMatchingFileHook {
fn run(
async fn run(
&self,
ctx: CoreContext,
ctx: &CoreContext,
context: HookContext<HookFile>,
) -> BoxFuture<HookExecution, Error> {
let expected_content = self.expected_content.clone();
context
.data
.file_text(ctx)
.map(move |content| {
let content =
content.map(|c| std::str::from_utf8(c.as_bytes()).unwrap().to_string());
match (content, expected_content) {
(Some(content), Some(expected_content)) => {
if content.contains(&expected_content) {
HookExecution::Accepted
} else {
default_rejection()
}
}
(None, None) => HookExecution::Accepted,
_ => default_rejection(),
) -> Result<HookExecution, Error> {
let content = context.data.file_text(ctx).await?;
let content = content.map(|c| std::str::from_utf8(c.as_bytes()).unwrap().to_string());
Ok(match (content, self.expected_content.as_ref()) {
(Some(content), Some(expected_content)) => {
if content.contains(expected_content) {
HookExecution::Accepted
} else {
default_rejection()
}
})
.boxify()
}
(None, None) => HookExecution::Accepted,
_ => default_rejection(),
})
}
}
@@ -338,28 +329,23 @@ struct IsSymLinkMatchingFileHook {
is_symlink: bool,
}
#[async_trait]
impl Hook<HookFile> for IsSymLinkMatchingFileHook {
fn run(
async fn run(
&self,
ctx: CoreContext,
ctx: &CoreContext,
context: HookContext<HookFile>,
) -> BoxFuture<HookExecution, Error> {
let is_symlink = self.is_symlink;
context
.data
.file_type(ctx)
.map(move |file_type| {
let actual = match file_type {
FileType::Symlink => true,
_ => false,
};
if is_symlink == actual {
HookExecution::Accepted
} else {
default_rejection()
}
})
.boxify()
) -> Result<HookExecution, Error> {
let file_type = context.data.file_type(ctx)?;
let actual = match file_type {
FileType::Symlink => true,
_ => false,
};
Ok(if self.is_symlink == actual {
HookExecution::Accepted
} else {
default_rejection()
})
}
}
@@ -372,24 +358,19 @@ struct LengthMatchingFileHook {
length: u64,
}
#[async_trait]
impl Hook<HookFile> for LengthMatchingFileHook {
fn run(
async fn run(
&self,
ctx: CoreContext,
ctx: &CoreContext,
context: HookContext<HookFile>,
) -> BoxFuture<HookExecution, Error> {
let exp_length = self.length;
context
.data
.len(ctx)
.map(move |length| {
if length == exp_length {
HookExecution::Accepted
} else {
default_rejection()
}
})
.boxify()
) -> Result<HookExecution, Error> {
let length = context.data.len(ctx).await?;
Ok(if length == self.length {
HookExecution::Accepted
} else {
default_rejection()
})
}
}
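The rewritten matching hooks share one scan idiom: push a fallible boolean check per file into a `FuturesUnordered`, then apply `try_skip_while` over the passing results and `try_next`; if anything survives the skip, some check returned `false` and the hook rejects. The idiom in isolation (names are illustrative):

```rust
use anyhow::Error;
use futures::{future, stream::{FuturesUnordered, TryStreamExt}};

// Returns true only if every check passes; a verdict is reached as soon
// as one check yields false.
async fn all_pass(lengths: Vec<u64>, expected: u64) -> Result<bool, Error> {
    let checks = FuturesUnordered::new();
    for len in lengths {
        checks.push(async move { Ok::<_, Error>(len == expected) });
    }
    // Skip every `true`; the first `false` (if any) becomes Some(false).
    let first_failure = checks
        .try_skip_while(|passed| future::ok(*passed))
        .try_next()
        .await?;
    Ok(first_failure.is_none())
}
```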
@@ -1091,13 +1072,15 @@ async fn run_changeset_hooks_with_mgr(
for (hook_name, hook) in hooks {
hook_manager.register_changeset_hook(&hook_name, hook.into(), Default::default());
}
let fut = hook_manager.run_changeset_hooks_for_bookmark(
ctx,
default_changeset_id(),
&BookmarkName::new(bookmark_name).unwrap(),
None,
);
let res = fut.compat().await.unwrap();
let res = hook_manager
.run_changeset_hooks_for_bookmark(
&ctx,
default_changeset_id(),
&BookmarkName::new(bookmark_name).unwrap(),
None,
)
.await
.unwrap();
let map: HashMap<String, HookExecution> = res
.into_iter()
.map(|(exec_id, exec)| (exec_id.hook_name, exec))
@@ -1169,14 +1152,15 @@ async fn run_file_hooks_with_mgr(
for (hook_name, hook) in hooks {
hook_manager.register_file_hook(&hook_name, hook.into(), Default::default());
}
let fut: BoxFuture<Vec<(FileHookExecutionID, HookExecution)>, Error> = hook_manager
let res = hook_manager
.run_file_hooks_for_bookmark(
ctx,
&ctx,
hg_cs_id,
&BookmarkName::new(bookmark_name).unwrap(),
None,
);
let res = fut.compat().await.unwrap();
)
.await
.unwrap();
let map: HashMap<String, HashMap<String, HookExecution>> =
res.into_iter()
.fold(HashMap::new(), |mut m, (exec_id, exec)| {
@@ -1212,7 +1196,7 @@ async fn setup_hook_manager(
fn default_rejection() -> HookExecution {
HookExecution::Rejected(HookRejectionInfo::new_long(
"desc".into(),
"long_desc".into(),
"long_desc".to_string(),
))
}
@@ -1344,56 +1328,6 @@ fn default_repo_config() -> RepoConfig {
}
}
#[fbinit::test]
fn test_load_hooks(fb: FacebookInit) {
async_unit::tokio_unit_test(async move {
let mut config = default_repo_config();
config.bookmarks = vec![
BookmarkParams {
bookmark: BookmarkName::new("bm1").unwrap().into(),
hooks: vec!["hook1".into(), "hook2".into()],
only_fast_forward: false,
allowed_users: None,
rewrite_dates: None,
},
BookmarkParams {
bookmark: Regex::new("bm2").unwrap().into(),
hooks: vec!["hook2".into(), "hook3".into()],
only_fast_forward: false,
allowed_users: None,
rewrite_dates: None,
},
];
config.hooks = vec![
HookParams {
name: "hook1".into(),
code: Some(HookCode::Code("hook1 code".into())),
hook_type: HookType::PerAddedOrModifiedFile,
config: Default::default(),
},
HookParams {
name: "hook2".into(),
code: Some(HookCode::Code("hook2 code".into())),
hook_type: HookType::PerAddedOrModifiedFile,
config: Default::default(),
},
HookParams {
name: "hook3".into(),
code: Some(HookCode::Code("hook3 code".into())),
hook_type: HookType::PerChangeset,
config: Default::default(),
},
];
let mut hm = hook_manager_many_files_dirs_blobrepo(fb).await;
match load_hooks(fb, &mut hm, config, &hashset![]) {
Err(e) => assert!(false, format!("Failed to load hooks {}", e)),
Ok(()) => (),
};
});
}
#[fbinit::test]
fn test_verify_integrity_fast_failure(fb: FacebookInit) {
async_unit::tokio_unit_test(async move {
@@ -1407,7 +1341,6 @@ fn test_verify_integrity_fast_failure(fb: FacebookInit) {
}];
config.hooks = vec![HookParams {
name: "rust:verify_integrity".into(),
code: Some(HookCode::Code("whateva".into())),
hook_type: HookType::PerChangeset,
config: HookConfig {
strings: hashmap! {String::from("verify_integrity_path") => String::from("bad_nonexisting_filename")},
@@ -1421,40 +1354,6 @@ fn test_verify_integrity_fast_failure(fb: FacebookInit) {
});
}
#[fbinit::test]
fn test_load_hooks_no_such_hook(fb: FacebookInit) {
async_unit::tokio_unit_test(async move {
let book_or_rex = BookmarkOrRegex::Bookmark(BookmarkName::new("bm1").unwrap());
let mut config = default_repo_config();
config.bookmarks = vec![BookmarkParams {
bookmark: book_or_rex.clone(),
hooks: vec!["hook1".into(), "hook2".into()],
only_fast_forward: false,
allowed_users: None,
rewrite_dates: None,
}];
config.hooks = vec![HookParams {
name: "hook1".into(),
code: Some(HookCode::Code("hook1 code".into())),
hook_type: HookType::PerAddedOrModifiedFile,
config: Default::default(),
}];
let mut hm = hook_manager_many_files_dirs_blobrepo(fb).await;
match load_hooks(fb, &mut hm, config, &hashset![])
.unwrap_err()
.downcast::<ErrorKind>()
{
Ok(ErrorKind::NoSuchBookmarkHook(bookmark, _)) => {
assert_eq!(book_or_rex, bookmark);
}
_ => assert!(false, "Unexpected err type"),
};
});
}
#[fbinit::test]
fn test_load_hooks_bad_rust_hook(fb: FacebookInit) {
async_unit::tokio_unit_test(async move {
@@ -1469,7 +1368,6 @@ fn test_load_hooks_bad_rust_hook(fb: FacebookInit) {
config.hooks = vec![HookParams {
name: "rust:hook1".into(),
code: Some(HookCode::Code("hook1 code".into())),
hook_type: HookType::PerChangeset,
config: Default::default(),
}];
@@ -1497,7 +1395,6 @@ fn test_load_disabled_hooks(fb: FacebookInit) {
config.hooks = vec![HookParams {
name: "hook1".into(),
code: None,
hook_type: HookType::PerChangeset,
config: Default::default(),
}];
@@ -1524,7 +1421,6 @@ fn test_load_disabled_hooks_referenced_by_bookmark(fb: FacebookInit) {
config.hooks = vec![HookParams {
name: "hook1".into(),
code: None,
hook_type: HookType::PerChangeset,
config: Default::default(),
}];


@@ -10,18 +10,23 @@
#![deny(warnings)]
use crate::errors::*;
use crate::facebook::rust_hooks::check_nocommit::CheckNocommitHook;
use crate::facebook::rust_hooks::check_unittests::CheckUnittestsHook;
use crate::facebook::rust_hooks::ensure_valid_email::EnsureValidEmailHook;
use crate::facebook::rust_hooks::limit_commit_message_length::LimitCommitMessageLength;
use crate::facebook::rust_hooks::limit_path_length::LimitPathLengthHook;
use crate::facebook::rust_hooks::signed_source::SignedSourceHook;
use crate::facebook::rust_hooks::verify_integrity::VerifyIntegrityHook;
use crate::lua_hook::LuaHook;
use crate::facebook::rust_hooks::{
always_fail_changeset::AlwaysFailChangeset, block_cross_repo_commits::BlockCrossRepoCommits,
block_empty_commit::BlockEmptyCommit, check_nocommit::CheckNocommitHook,
check_unittests::CheckUnittestsHook, conflict_markers::ConflictMarkers, deny_files::DenyFiles,
ensure_valid_email::EnsureValidEmailHook,
gitattributes_textdirectives::GitattributesTextDirectives,
limit_commit_message_length::LimitCommitMessageLength, limit_commitsize::LimitCommitsize,
limit_filesize::LimitFilesize, limit_path_length::LimitPathLengthHook,
no_bad_filenames::NoBadFilenames, no_insecure_filenames::NoInsecureFilenames,
no_questionable_filenames::NoQuestionableFilenames, signed_source::SignedSourceHook,
tp2_symlinks_only::TP2SymlinksOnly, verify_integrity::VerifyIntegrityHook,
verify_reviewedby_info::VerifyReviewedbyInfo,
};
use crate::{Hook, HookChangeset, HookFile, HookManager};
use anyhow::{format_err, Error};
use anyhow::Error;
use fbinit::FacebookInit;
use metaconfig_types::{HookType, RepoConfig};
use metaconfig_types::RepoConfig;
use std::collections::HashSet;
use std::sync::Arc;
@@ -40,6 +45,7 @@ pub fn load_hooks(
let mut hook_set = HashSet::new();
for hook in config.hooks {
use LoadedRustHook::*;
let name = hook.name;
if disabled_hooks.contains(&name) {
@@ -47,53 +53,52 @@
continue;
}
if name.starts_with("rust:") {
use LoadedRustHook::*;
let rust_name = &name[5..];
let rust_name = rust_name.to_string();
let rust_hook = match rust_name.as_ref() {
"check_unittests" => {
ChangesetHook(Arc::new(CheckUnittestsHook::new(&hook.config)?))
}
"verify_integrity" => {
ChangesetHook(Arc::new(VerifyIntegrityHook::new(&hook.config)?))
}
"ensure_valid_email" => {
ChangesetHook(Arc::new(EnsureValidEmailHook::new(fb, &hook.config)))
}
"limit_commit_message_length" => {
ChangesetHook(Arc::new(LimitCommitMessageLength::new(&hook.config)?))
}
"limit_path_length" => FileHook(Arc::new(LimitPathLengthHook::new(&hook.config)?)),
"signed_source" => FileHook(Arc::new(SignedSourceHook::new(&hook.config)?)),
"check_nocommit" => FileHook(Arc::new(CheckNocommitHook::new(&hook.config)?)),
_ => return Err(ErrorKind::InvalidRustHook(name.clone()).into()),
};
match rust_hook {
FileHook(rust_hook) => {
hook_manager.register_file_hook(&name, rust_hook, hook.config)
}
ChangesetHook(rust_hook) => {
hook_manager.register_changeset_hook(&name, rust_hook, hook.config)
}
}
// Backwards compatibility only
let hook_name = if name.starts_with("rust:") {
name[5..].to_string()
} else {
let code = hook
.code
.ok_or(format_err!("Lua Hook must have path: {}", name))?
.load()?;
let lua_hook = LuaHook::new(name.clone(), code);
match hook.hook_type {
HookType::PerAddedOrModifiedFile => {
hook_manager.register_file_hook(&name, Arc::new(lua_hook), hook.config)
}
HookType::PerChangeset => {
hook_manager.register_changeset_hook(&name, Arc::new(lua_hook), hook.config)
}
name.clone()
};
let rust_hook = match hook_name.as_ref() {
"always_fail_changeset" => ChangesetHook(Arc::new(AlwaysFailChangeset::new())),
"block_cross_repo_commits" => FileHook(Arc::new(BlockCrossRepoCommits::new())),
"block_empty_commit" => ChangesetHook(Arc::new(BlockEmptyCommit::new())),
"check_nocommit" => FileHook(Arc::new(CheckNocommitHook::new(&hook.config)?)),
"check_unittests" => ChangesetHook(Arc::new(CheckUnittestsHook::new(&hook.config)?)),
"conflict_markers" => FileHook(Arc::new(ConflictMarkers::new())),
"deny_files" => FileHook(Arc::new(DenyFiles::new()?)),
"ensure_valid_email" => {
ChangesetHook(Arc::new(EnsureValidEmailHook::new(fb, &hook.config)?))
}
"gitattributes-textdirectives" => {
FileHook(Arc::new(GitattributesTextDirectives::new()?))
}
"limit_commit_message_length" => {
ChangesetHook(Arc::new(LimitCommitMessageLength::new(&hook.config)?))
}
"limit_commitsize" => ChangesetHook(Arc::new(LimitCommitsize::new(&hook.config))),
"limit_filesize" => FileHook(Arc::new(LimitFilesize::new(&hook.config))),
"limit_path_length" => FileHook(Arc::new(LimitPathLengthHook::new(&hook.config)?)),
"no_bad_filenames" => FileHook(Arc::new(NoBadFilenames::new()?)),
"no_insecure_filenames" => FileHook(Arc::new(NoInsecureFilenames::new()?)),
"no_questionable_filenames" => FileHook(Arc::new(NoQuestionableFilenames::new()?)),
"signed_source" => FileHook(Arc::new(SignedSourceHook::new(&hook.config)?)),
"tp2_symlinks_only" => FileHook(Arc::new(TP2SymlinksOnly::new())),
"verify_integrity" => ChangesetHook(Arc::new(VerifyIntegrityHook::new(&hook.config)?)),
"verify_reviewedby_info" => {
ChangesetHook(Arc::new(VerifyReviewedbyInfo::new(&hook.config)?))
}
_ => return Err(ErrorKind::InvalidRustHook(name.clone()).into()),
};
match rust_hook {
FileHook(rust_hook) => hook_manager.register_file_hook(&name, rust_hook, hook.config),
ChangesetHook(rust_hook) => {
hook_manager.register_changeset_hook(&name, rust_hook, hook.config)
}
}
hook_set.insert(name);
}
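With the Lua loader gone, `load_hooks` keeps existing configs working by accepting both the old `rust:foo` spelling and bare `foo`, stripping the prefix only to choose the implementation while still registering the hook under its configured name. The same logic in isolation (a sketch; the diff spells it with `starts_with` and slicing rather than `strip_prefix`):

```rust
/// "rust:foo" and "foo" select the same hook implementation; the "rust:"
/// spelling is kept for backwards compatibility with existing configs.
fn implementation_name(configured: &str) -> &str {
    configured.strip_prefix("rust:").unwrap_or(configured)
}

fn main() {
    assert_eq!(implementation_name("rust:deny_files"), "deny_files");
    assert_eq!(implementation_name("deny_files"), "deny_files");
}
```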


@@ -1,51 +0,0 @@
--[[
Copyright (c) Facebook, Inc. and its affiliates.
This software may be used and distributed according to the terms of the
GNU General Public License version 2.
--]]
g__set_common_file_functions = function(path, type)
local file = {}
file.path = path
file.is_added = function() return type == "added" end
file.is_deleted = function() return type == "deleted" end
file.is_modified = function() return type == "modified" end
return file
end
g__hook_start_base = function(info, arg, setup)
if hook == nil then
error("no hook function")
end
local ctx = {}
ctx.config_strings = g__config_strings
ctx.config_ints = g__config_ints
ctx.regex_match = function(pattern, s)
return coroutine.yield(g__regex_match(pattern, s))
end
ctx.info=info
setup(arg, ctx)
io = nil
os = nil
local acc, desc, long_desc = hook(ctx)
if type(acc) ~= "boolean" then
error("invalid hook return type")
end
if acc and desc ~= nil then
error("failure description must only be set if hook fails")
end
if acc and long_desc ~= nil then
error("failure long description must only be set if hook fails")
end
if desc ~= nil and type(desc) ~= "string" then
error("invalid hook failure short description type")
end
if long_desc ~= nil and type(long_desc) ~= "string" then
error("invalid hook failure long description type")
end
local res = {acc, desc, long_desc}
return res
end


@@ -1,69 +0,0 @@
--[[
Copyright (c) Facebook, Inc. and its affiliates.
This software may be used and distributed according to the terms of the
GNU General Public License version 2.
--]]
g__hook_start = function(info, arg)
return g__hook_start_base(info, arg, function(arg, ctx)
local files = {}
-- translation to lua from mercurial's util.shortuser()
local get_author_unixname = function(author)
local ind = author:find('@')
if ind then
author = author:sub(1, ind - 1)
end
ind = author:find('<')
if ind then
author = author:sub(ind + 1)
end
ind = author:find(' ')
if ind then
author = author:sub(0, ind)
end
ind = author:find('%.')
if ind then
author = author:sub(0, ind)
end
return author
end
for _, file_data in ipairs(arg) do
local file = g__set_common_file_functions(file_data.path, file_data.type)
if not file.is_deleted() then
file.contains_string = function(s)
return coroutine.yield(g__contains_string(file.path, s))
end
file.len = function()
return coroutine.yield(g__file_len(file.path))
end
file.text = function()
return coroutine.yield(g__file_text(file.path))
end
file.path_regex_match = function(p)
return coroutine.yield(g__regex_match(p, file.path))
end
end
files[#files+1] = file
end
ctx.files = files
ctx.info.author_unixname = get_author_unixname(ctx.info.author)
ctx.file_text = function(path)
return coroutine.yield(g__file_text(path))
end
ctx.parse_commit_msg = function()
return coroutine.yield(g__parse_commit_msg())
end
ctx.is_valid_reviewer = function(user)
return coroutine.yield(g__is_valid_reviewer(user))
end
end)
end


@@ -1,26 +0,0 @@
--[[
Copyright (c) Facebook, Inc. and its affiliates.
This software may be used and distributed according to the terms of the
GNU General Public License version 2.
--]]
g__hook_start = function(info, arg)
return g__hook_start_base(info, arg, function(arg, ctx)
local file = g__set_common_file_functions(arg.path, arg.type)
if not file.is_deleted() then
file.contains_string = function(s)
return coroutine.yield(g__contains_string(s))
end
file.len = function() return coroutine.yield(g__file_len()) end
file.text = function() return coroutine.yield(g__file_text()) end
file.is_symlink = function() return coroutine.yield(g__is_symlink()) end
file.path_regex_match = function(p)
return coroutine.yield(g__regex_match(p, file.path))
end
end
ctx.file = file
end)
end


@@ -17,22 +17,23 @@
pub mod errors;
mod facebook;
pub mod hook_loader;
pub mod lua_hook;
mod phabricator_message_parser;
pub mod rust_hook;
use aclchecker::{AclChecker, Identity};
use anyhow::{bail, Error};
use async_trait::async_trait;
use bookmarks::BookmarkName;
use bytes::Bytes;
use cloned::cloned;
use context::CoreContext;
pub use errors::*;
use failure_ext::FutureFailureErrorExt;
use fbinit::FacebookInit;
use futures_ext::{try_boxfuture, BoxFuture, FutureExt};
use futures_old::{future, Future, IntoFuture};
use futures_stats::Timed;
use futures::{
future::{try_join, try_join_all},
Future, TryFutureExt,
};
use futures_stats::TimedFutureExt;
use hooks_content_stores::{ChangedFileType, ChangesetStore, FileContentStore};
use mercurial_types::{FileBytes, HgChangesetId, HgFileNodeId, HgParents, MPath};
use metaconfig_types::{BookmarkOrRegex, HookBypass, HookConfig, HookManagerParams};
@@ -46,7 +47,6 @@ use std::fmt;
use std::hash::{Hash, Hasher};
use std::str;
use std::sync::Arc;
use tracing::{trace_args, Traced};
type ChangesetHooks = HashMap<String, (Arc<dyn Hook<HookChangeset>>, HookConfig)>;
type FileHooks = HashMap<String, (Arc<dyn Hook<HookFile>>, HookConfig)>;
@@ -190,13 +190,14 @@ impl HookManager {
// Changeset hooks
pub fn run_changeset_hooks_for_bookmark(
&self,
ctx: CoreContext,
pub fn run_changeset_hooks_for_bookmark<'a, 'b: 'a, 'c: 'a, 'd: 'a>(
&'a self,
ctx: &'b CoreContext,
changeset_id: HgChangesetId,
bookmark: &BookmarkName,
maybe_pushvars: Option<HashMap<String, Bytes>>,
) -> BoxFuture<Vec<(ChangesetHookExecutionID, HookExecution)>, Error> {
bookmark: &'c BookmarkName,
maybe_pushvars: Option<&'d HashMap<String, Bytes>>,
) -> impl Future<Output = Result<Vec<(ChangesetHookExecutionID, HookExecution)>, Error>> + 'a
{
debug!(
ctx.logger(),
"Running changeset hooks for bookmark {:?}", bookmark
@@ -211,20 +212,20 @@ impl HookManager {
)
}
fn run_changeset_hooks_for_changeset_id(
async fn run_changeset_hooks_for_changeset_id(
&self,
ctx: CoreContext,
ctx: &CoreContext,
changeset_id: HgChangesetId,
hooks: Vec<String>,
maybe_pushvars: Option<HashMap<String, Bytes>>,
maybe_pushvars: Option<&HashMap<String, Bytes>>,
bookmark: &BookmarkName,
) -> BoxFuture<Vec<(ChangesetHookExecutionID, HookExecution)>, Error> {
) -> Result<Vec<(ChangesetHookExecutionID, HookExecution)>, Error> {
debug!(
ctx.logger(),
"Running changeset hooks for changeset id {:?}", changeset_id
);
let hooks: Result<Vec<(String, (Arc<dyn Hook<HookChangeset>>, _))>, Error> = hooks
let hooks: Vec<_> = hooks
.iter()
.map(|hook_name| {
let hook = self
@@ -233,110 +234,87 @@ impl HookManager {
.ok_or(ErrorKind::NoSuchHook(hook_name.to_string()))?;
Ok((hook_name.clone(), hook.clone()))
})
.collect();
let hooks = try_boxfuture!(hooks);
.collect::<Result<_, Error>>()?;
cloned!(mut self.scuba);
scuba.add("hash", changeset_id.to_hex().to_string());
self.get_hook_changeset(ctx.clone(), changeset_id)
.and_then({
cloned!(bookmark);
move |hcs| {
let hooks = HookManager::filter_bypassed_hooks(
hooks,
&hcs.comments,
maybe_pushvars.as_ref(),
);
let hcs = self.get_hook_changeset(&ctx, changeset_id).await?;
let hooks = HookManager::filter_bypassed_hooks(hooks, &hcs.comments, maybe_pushvars);
HookManager::run_changeset_hooks_for_changeset(
ctx,
hcs.clone(),
hooks.clone(),
bookmark,
scuba,
)
}
let res = HookManager::run_changeset_hooks_for_changeset(ctx, hcs, hooks, bookmark, scuba)
.await?;
Ok(res
.into_iter()
.map(|(hook_name, exec)| {
(
ChangesetHookExecutionID {
cs_id: changeset_id,
hook_name,
},
exec,
)
})
.map(move |res| {
res.into_iter()
.map(|(hook_name, exec)| {
(
ChangesetHookExecutionID {
cs_id: changeset_id,
hook_name,
},
exec,
)
})
.collect()
})
.boxify()
.collect())
}
fn run_changeset_hooks_for_changeset(
ctx: CoreContext,
async fn run_changeset_hooks_for_changeset<'book, 'ctx: 'book>(
ctx: &'ctx CoreContext,
changeset: HookChangeset,
hooks: Vec<(String, Arc<dyn Hook<HookChangeset>>, HookConfig)>,
bookmark: BookmarkName,
bookmark: &'book BookmarkName,
scuba: ScubaSampleBuilder,
) -> BoxFuture<Vec<(String, HookExecution)>, Error> {
futures_old::future::join_all(hooks.into_iter().map(move |(hook_name, hook, config)| {
) -> Result<Vec<(String, HookExecution)>, Error> {
try_join_all(hooks.into_iter().map(|(hook_name, hook, config)| {
HookManager::run_hook(
ctx.clone(),
ctx,
hook,
HookContext::new(hook_name, config, changeset.clone(), bookmark.clone()),
HookContext::new(hook_name, config, changeset.clone(), bookmark),
scuba.clone(),
)
}))
.boxify()
.await
}
// File hooks
pub fn run_file_hooks_for_bookmark(
pub async fn run_file_hooks_for_bookmark(
&self,
ctx: CoreContext,
ctx: &CoreContext,
changeset_id: HgChangesetId,
bookmark: &BookmarkName,
maybe_pushvars: Option<HashMap<String, Bytes>>,
) -> BoxFuture<Vec<(FileHookExecutionID, HookExecution)>, Error> {
maybe_pushvars: Option<&HashMap<String, Bytes>>,
) -> Result<Vec<(FileHookExecutionID, HookExecution)>, Error> {
debug!(
ctx.logger(),
"Running file hooks for bookmark {:?}", bookmark
);
let file_hooks = self.file_hooks_for_bookmark(&bookmark);
self.run_file_hooks_for_changeset_id(
ctx.clone(),
&ctx,
changeset_id,
self.file_hooks_for_bookmark(bookmark),
file_hooks,
maybe_pushvars,
bookmark.clone(),
&bookmark,
)
.traced(
&ctx.trace(),
"run_file_hooks",
trace_args! {
"bookmark" => bookmark.to_string(),
"changeset" => changeset_id.to_hex().to_string(),
},
)
.boxify()
.await
}
fn run_file_hooks_for_changeset_id(
async fn run_file_hooks_for_changeset_id(
&self,
ctx: CoreContext,
ctx: &CoreContext,
changeset_id: HgChangesetId,
hooks: Vec<String>,
maybe_pushvars: Option<HashMap<String, Bytes>>,
bookmark: BookmarkName,
) -> BoxFuture<Vec<(FileHookExecutionID, HookExecution)>, Error> {
maybe_pushvars: Option<&HashMap<String, Bytes>>,
bookmark: &BookmarkName,
) -> Result<Vec<(FileHookExecutionID, HookExecution)>, Error> {
debug!(
ctx.logger(),
"Running file hooks for changeset id {:?}", changeset_id
);
let hooks: Result<Vec<(String, (Arc<dyn Hook<HookFile>>, HookConfig))>, Error> = hooks
let hooks: Vec<_> = hooks
.iter()
.map(|hook_name| {
let hook = self
@@ -345,42 +323,26 @@ impl HookManager {
.ok_or(ErrorKind::NoSuchHook(hook_name.to_string()))?;
Ok((hook_name.clone(), hook.clone()))
})
.collect();
let hooks = try_boxfuture!(hooks);
.collect::<Result<_, Error>>()?;
cloned!(mut self.scuba);
scuba.add("hash", changeset_id.to_hex().to_string());
self.get_hook_changeset(ctx.clone(), changeset_id)
.and_then({
move |hcs| {
let hooks = HookManager::filter_bypassed_hooks(
hooks.clone(),
&hcs.comments,
maybe_pushvars.as_ref(),
);
let hcs = self.get_hook_changeset(ctx, changeset_id).await?;
let hooks = HookManager::filter_bypassed_hooks(hooks, &hcs.comments, maybe_pushvars);
HookManager::run_file_hooks_for_changeset(
ctx.clone(),
changeset_id,
hcs.clone(),
hooks,
bookmark,
scuba,
)
}
})
.boxify()
HookManager::run_file_hooks_for_changeset(ctx, changeset_id, &hcs, hooks, bookmark, scuba)
.await
}
fn run_file_hooks_for_changeset(
ctx: CoreContext,
fn run_file_hooks_for_changeset<'cs, 'book: 'cs, 'ctx: 'cs>(
ctx: &'ctx CoreContext,
changeset_id: HgChangesetId,
changeset: HookChangeset,
changeset: &'cs HookChangeset,
hooks: Vec<(String, Arc<dyn Hook<HookFile>>, HookConfig)>,
bookmark: BookmarkName,
bookmark: &'book BookmarkName,
scuba: ScubaSampleBuilder,
) -> BoxFuture<Vec<(FileHookExecutionID, HookExecution)>, Error> {
let v: Vec<BoxFuture<Vec<(FileHookExecutionID, HookExecution)>, _>> = changeset
) -> impl Future<Output = Result<Vec<(FileHookExecutionID, HookExecution)>, Error>> + 'cs {
let v: Vec<_> = changeset
.files
.iter()
// Do not run file hooks for deleted files
@@ -388,11 +350,11 @@ impl HookManager {
match file.ty {
ChangedFileType::Added | ChangedFileType::Modified => Some(
HookManager::run_file_hooks(
ctx.clone(),
ctx,
changeset_id,
file.clone(),
hooks.clone(),
bookmark.clone(),
bookmark,
scuba.clone(),
)
),
@@ -400,31 +362,25 @@ impl HookManager {
}
})
.collect();
futures_old::future::join_all(v)
.map(|vv| vv.into_iter().flatten().collect())
.boxify()
try_join_all(v).map_ok(|vv| vv.into_iter().flatten().collect())
}
fn run_file_hooks(
ctx: CoreContext,
async fn run_file_hooks<'book, 'ctx: 'book>(
ctx: &'ctx CoreContext,
cs_id: HgChangesetId,
file: HookFile,
hooks: Vec<(String, Arc<dyn Hook<HookFile>>, HookConfig)>,
bookmark: BookmarkName,
bookmark: &'book BookmarkName,
scuba: ScubaSampleBuilder,
) -> BoxFuture<Vec<(FileHookExecutionID, HookExecution)>, Error> {
) -> Result<Vec<(FileHookExecutionID, HookExecution)>, Error> {
let hook_futs = hooks.into_iter().map(move |(hook_name, hook, config)| {
let hook_context = HookContext::new(
hook_name.to_string(),
config,
file.clone(),
bookmark.clone(),
);
let hook_context =
HookContext::new(hook_name.to_string(), config, file.clone(), bookmark);
cloned!(mut scuba);
scuba.add("hash", cs_id.to_hex().to_string());
HookManager::run_hook(ctx.clone(), hook, hook_context, scuba).map({
HookManager::run_hook(ctx, hook, hook_context, scuba).map_ok({
cloned!(file, bookmark);
move |(hook_name, exec)| {
(
@@ -439,15 +395,15 @@ impl HookManager {
}
})
});
futures_old::future::join_all(hook_futs).boxify()
try_join_all(hook_futs).await
}
fn run_hook<T: Clone>(
ctx: CoreContext,
async fn run_hook<T: Clone>(
ctx: &CoreContext,
hook: Arc<dyn Hook<T>>,
hook_context: HookContext<T>,
mut scuba: ScubaSampleBuilder,
) -> BoxFuture<(String, HookExecution), Error> {
) -> Result<(String, HookExecution), Error> {
let hook_name = hook_context.hook_name.clone();
debug!(ctx.logger(), "Running hook {:?}", hook_context.hook_name);
@@ -464,70 +420,62 @@ impl HookManager {
scuba.add("hook", hook_name.clone());
hook.run(ctx, hook_context)
.map({
cloned!(hook_name);
move |he| (hook_name, he)
})
.timed(move |stats, result| {
if let Err(e) = result.as_ref() {
scuba.add("stderr", e.to_string());
}
let (stats, result) = hook.run(ctx, hook_context).timed().await;
let elapsed = stats.completion_time.as_millis() as i64;
if let Err(e) = result.as_ref() {
scuba.add("stderr", e.to_string());
}
scuba
.add("elapsed", elapsed)
.add("total_time", elapsed)
.add("errorcode", result.is_err() as i32)
.add("failed_hooks", result.is_err() as i32)
.log();
let elapsed = stats.completion_time.as_millis() as i64;
Ok(())
})
.with_context(move || format!("while executing hook {}", hook_name))
.from_err()
.boxify()
scuba
.add("elapsed", elapsed)
.add("total_time", elapsed)
.add("errorcode", result.is_err() as i32)
.add("failed_hooks", result.is_err() as i32)
.log();
let he = result.map_err(|e| e.context(format!("while executing hook {}", hook_name)))?;
Ok((hook_name, he))
}
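`run_hook` swaps the 0.1 `Timed` combinator for `futures_stats::TimedFutureExt`, whose `.timed().await` yields `(stats, result)` so the scuba logging happens as straight-line code. A hedged sketch of the same shape with a hand-rolled timer, since `futures_stats` is a Mononoke-internal crate:

```rust
use anyhow::Error;
use std::time::Instant;

async fn hook_stub() -> Result<&'static str, Error> {
    Ok("accepted")
}

// The shape of the new run_hook: await the hook, capture wall-clock stats,
// log on both the success and failure paths, then propagate with context.
async fn run_timed() -> Result<&'static str, Error> {
    let start = Instant::now();
    let result = hook_stub().await;
    let elapsed_ms = start.elapsed().as_millis() as i64;
    println!("elapsed={} errorcode={}", elapsed_ms, result.is_err() as i32);
    result.map_err(|e| e.context("while executing hook hook_stub"))
}
```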
fn get_hook_changeset(
async fn get_hook_changeset(
&self,
ctx: CoreContext,
ctx: &CoreContext,
changeset_id: HgChangesetId,
) -> BoxFuture<HookChangeset, Error> {
) -> Result<HookChangeset, Error> {
let content_store = self.content_store.clone();
let hg_changeset = self
.changeset_store
.get_changeset_by_changesetid(ctx.clone(), changeset_id);
.get_changeset_by_changesetid(ctx, changeset_id);
let changed_files = self.changeset_store.get_changed_files(ctx, changeset_id);
let reviewers_acl_checker = self.reviewers_acl_checker.clone();
Box::new((hg_changeset, changed_files).into_future().and_then(
move |(changeset, changed_files)| {
let author = str::from_utf8(changeset.user())?.into();
let files = changed_files
.into_iter()
.map(|(path, ty, hash_and_type)| {
HookFile::new(
path,
content_store.clone(),
changeset_id.clone(),
ty,
hash_and_type,
)
})
.collect();
let comments = str::from_utf8(changeset.comments())?.into();
let parents = HookChangesetParents::from(changeset.parents());
Ok(HookChangeset::new(
author,
files,
comments,
parents,
changeset_id,
content_store,
reviewers_acl_checker,
))
},
let (changeset, changed_files) = try_join(hg_changeset, changed_files).await?;
let author = str::from_utf8(changeset.user())?.into();
let files = changed_files
.into_iter()
.map(|(path, ty, hash_and_type)| {
HookFile::new(
path,
content_store.clone(),
changeset_id.clone(),
ty,
hash_and_type,
)
})
.collect();
let comments = str::from_utf8(changeset.comments())?.into();
let parents = HookChangesetParents::from(changeset.parents());
Ok(HookChangeset::new(
author,
files,
comments,
parents,
changeset_id,
content_store,
reviewers_acl_checker,
))
}
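Fetching the changeset and its changed files used to go through `(a, b).into_future().and_then(...)`; the new body awaits both with `try_join`. A self-contained sketch of that pairing (the futures here are stand-ins, with `anyhow` for errors):

```rust
use anyhow::Error;
use futures::future::try_join;

async fn fetch_changeset_parts() -> Result<(String, Vec<String>), Error> {
    let changeset = async { Ok::<_, Error>("commit".to_string()) };
    let changed_files = async { Ok::<_, Error>(vec!["a.rs".to_string()]) };
    // Drive both futures concurrently; the first Err aborts the pair.
    try_join(changeset, changed_files).await
}
```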
@ -579,15 +527,16 @@ impl HookManager {
}
}
#[async_trait]
pub trait Hook<T>: Send + Sync
where
T: Clone,
{
fn run(
&self,
ctx: CoreContext,
async fn run<'a, 'b: 'a>(
&'a self,
ctx: &'b CoreContext,
hook_context: HookContext<T>,
) -> BoxFuture<HookExecution, Error>;
) -> Result<HookExecution, Error>;
}
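The `Hook` trait goes async via the `async-trait` crate, which is what allows `async fn` in a trait on stable Rust by boxing the returned future. A minimal sketch of the same shape with stand-in types, plus a trivial implementation in the style of `RustHook` below:

```rust
use anyhow::Error;
use async_trait::async_trait;

struct Ctx;
struct HookContext<T> {
    data: T,
}
enum HookExecution {
    Accepted,
}

#[async_trait]
trait Hook<T: Clone>: Send + Sync {
    // Desugars to a method returning Pin<Box<dyn Future<Output = ...> + Send>>.
    async fn run<'a, 'b: 'a>(
        &'a self,
        ctx: &'b Ctx,
        hook_context: HookContext<T>,
    ) -> Result<HookExecution, Error>;
}

struct AlwaysAccept;

#[async_trait]
impl Hook<String> for AlwaysAccept {
    // The impl may omit the explicit lifetimes; async_trait erases both
    // signatures to the same boxed-future form.
    async fn run(
        &self,
        _ctx: &Ctx,
        _hook_context: HookContext<String>,
    ) -> Result<HookExecution, Error> {
        Ok(HookExecution::Accepted)
    }
}
```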
/// Represents a changeset - more user friendly than the blob changeset
@ -670,35 +619,28 @@ impl HookFile {
}
}
pub fn len(&self, ctx: CoreContext) -> BoxFuture<u64, Error> {
let path = try_boxfuture!(MPath::new(self.path.as_bytes()));
pub async fn len(&self, ctx: &CoreContext) -> Result<u64, Error> {
let path = MPath::new(self.path.as_bytes())?;
match self.hash_and_type {
Some((entry_id, _)) => self.content_store.get_file_size(ctx, entry_id).boxify(),
None => {
future::err(ErrorKind::MissingFile(self.changeset_id, path.into()).into()).boxify()
}
Some((entry_id, _)) => self.content_store.get_file_size(ctx, entry_id).await,
None => Err(ErrorKind::MissingFile(self.changeset_id, path.into()).into()),
}
}
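`try_boxfuture!` existed because a synchronous failure had to be wrapped into a boxed future before the chain even started; inside an `async fn`, plain `?` does the same job. A tiny sketch of that conversion (both functions are illustrative, not from this diff):

```rust
use anyhow::{anyhow, Error};

fn parse_path(p: &str) -> Result<String, Error> {
    if p.is_empty() {
        return Err(anyhow!("empty path"));
    }
    Ok(p.to_string())
}

async fn fetch_size(path: &str) -> Result<u64, Error> {
    Ok(path.len() as u64)
}

// The synchronous prefix and the awaited suffix now share one error
// channel: the Result returned by the async fn itself.
async fn file_len(p: &str) -> Result<u64, Error> {
    let path = parse_path(p)?; // was: try_boxfuture!(MPath::new(...))
    fetch_size(&path).await // was: ...boxify()
}
```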
pub fn file_text(&self, ctx: CoreContext) -> BoxFuture<Option<FileBytes>, Error> {
let path = try_boxfuture!(MPath::new(self.path.as_bytes()));
pub async fn file_text(&self, ctx: &CoreContext) -> Result<Option<FileBytes>, Error> {
let path = MPath::new(self.path.as_bytes())?;
match self.hash_and_type {
Some((id, _)) => self.content_store.get_file_text(ctx, id).boxify(),
None => {
future::err(ErrorKind::MissingFile(self.changeset_id, path.into()).into()).boxify()
}
Some((id, _)) => self.content_store.get_file_text(ctx, id).await,
None => Err(ErrorKind::MissingFile(self.changeset_id, path.into()).into()),
}
}
pub fn file_type(&self, _ctx: CoreContext) -> BoxFuture<FileType, Error> {
let path = try_boxfuture!(MPath::new(self.path.as_bytes()));
cloned!(self.changeset_id);
pub fn file_type(&self, _ctx: &CoreContext) -> Result<FileType, Error> {
let path = MPath::new(self.path.as_bytes())?;
self.hash_and_type
.ok_or(ErrorKind::MissingFile(changeset_id, path.into()).into())
.into_future()
.ok_or(ErrorKind::MissingFile(self.changeset_id, path.into()).into())
.map(|(_, file_type)| file_type)
.boxify()
}
pub fn changed_file_type(&self) -> ChangedFileType {
@ -727,18 +669,20 @@ impl HookChangeset {
}
}
pub fn file_text(&self, ctx: CoreContext, path: String) -> BoxFuture<Option<FileBytes>, Error> {
let path = try_boxfuture!(MPath::new(path.as_bytes()));
self.content_store
.resolve_path(ctx.clone(), self.changeset_id, path)
.and_then({
cloned!(self.content_store);
move |id| match id {
Some(id) => content_store.get_file_text(ctx, id).left_future(),
None => Ok(None).into_future().right_future(),
}
})
.boxify()
pub async fn file_text(
&self,
ctx: &CoreContext,
path: String,
) -> Result<Option<FileBytes>, Error> {
let path = MPath::new(path.as_bytes())?;
let id = self
.content_store
.resolve_path(ctx, self.changeset_id, path)
.await?;
match id {
Some(id) => self.content_store.get_file_text(ctx, id).await,
None => Ok(None),
}
}
}
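With `async fn`, the `left_future`/`right_future` dance in the old `file_text` collapses into an ordinary `match`, since both arms are plain expressions rather than futures whose types must unify. A sketch with a dummy content store (this store API is illustrative, not Mononoke's):

```rust
use anyhow::Error;

struct ContentStore;

impl ContentStore {
    async fn resolve_path(&self, _path: &str) -> Result<Option<u64>, Error> {
        Ok(Some(1))
    }
    async fn get_file_text(&self, _id: u64) -> Result<Option<Vec<u8>>, Error> {
        Ok(Some(b"contents".to_vec()))
    }
}

async fn file_text(store: &ContentStore, path: &str) -> Result<Option<Vec<u8>>, Error> {
    // Resolve first; only fetch the text if the path maps to content.
    match store.resolve_path(path).await? {
        Some(id) => store.get_file_text(id).await,
        None => Ok(None),
    }
}
```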
@ -761,22 +705,26 @@ impl fmt::Display for HookExecution {
#[derive(Clone, Debug, PartialEq)]
pub struct HookRejectionInfo {
/// A short description for summarizing this failure with similar failures
pub description: String,
pub description: &'static str,
/// A full explanation of what went wrong, suitable for presenting to the user (should include guidance for fixing this failure, where possible)
pub long_description: String,
}
impl HookRejectionInfo {
pub fn new(description: String) -> Self {
Self::new_long(description.clone(), description)
/// A rejection with just a short description
/// The text should just summarize this failure - it should not be different on different invocations of this hook
pub fn new(description: &'static str) -> Self {
Self::new_long(description, description.to_string())
}
pub fn new_opt(description: String, long_description: Option<String>) -> Self {
let long_description = long_description.unwrap_or(description.clone());
Self::new_long(description, long_description)
}
pub fn new_long(description: String, long_description: String) -> Self {
/// A rejection with a possible per-invocation fix explanation.
pub fn new_long<OS>(description: &'static str, long_description: OS) -> Self
where
OS: Into<Option<String>>,
{
let long_description = long_description
.into()
.unwrap_or_else(|| description.to_string());
Self {
description,
long_description,
@ -836,13 +784,13 @@ where
hook_name: String,
config: HookConfig,
data: T,
bookmark: BookmarkName,
bookmark: &BookmarkName,
) -> HookContext<T> {
HookContext {
hook_name,
config,
data,
bookmark,
bookmark: bookmark.clone(),
}
}
}
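`HookRejectionInfo::new_long` now takes `OS: Into<Option<String>>`, so call sites can pass a `String`, a `Some(...)`, or a bare `None` without wrapping. A sketch of just that trick, with a stand-in struct name:

```rust
pub struct Rejection {
    pub description: &'static str,
    pub long_description: String,
}

impl Rejection {
    // Accepts String (via the blanket From<T> for Option<T>), Option<String>,
    // or None; absent a long description, fall back to the short one.
    pub fn new_long<OS>(description: &'static str, long_description: OS) -> Self
    where
        OS: Into<Option<String>>,
    {
        let long_description = long_description
            .into()
            .unwrap_or_else(|| description.to_string());
        Self {
            description,
            long_description,
        }
    }
}

fn demo() -> (Rejection, Rejection) {
    (
        Rejection::new_long("too big", None),
        Rejection::new_long("too big", "this file is 12 bytes".to_string()),
    )
}
```

This is also what lets `description` become a `&'static str`: the short text is a fixed summary per hook, while only the long explanation may vary per invocation.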

File diff suppressed because it is too large


@ -5,10 +5,9 @@
* GNU General Public License version 2.
*/
use hlua::{AnyLuaString, AnyLuaValue};
use lazy_static::lazy_static;
use regex::{Regex, RegexBuilder};
use std::collections::{HashMap, HashSet};
use std::collections::HashSet;
use std::mem;
const TITLE: &'static str = "title";
@ -133,49 +132,6 @@ impl PhabricatorMessage {
parsed
}
pub fn to_lua(self) -> HashMap<&'static str, AnyLuaValue> {
let mut map = HashMap::new();
let PhabricatorMessage {
title,
cc,
subscribers,
differential_revision,
revert_plan,
reviewed_by,
reviewers,
summary,
signature,
tasks,
test_plan,
} = self;
let insert_str = |map: &mut HashMap<&str, AnyLuaValue>, name, value| {
if let Some(v) = value {
map.insert(name, to_lua_string(v));
}
};
let insert_array = |map: &mut HashMap<&str, AnyLuaValue>, name, value| {
if let Some(v) = value {
map.insert(name, to_lua_array(v));
}
};
insert_str(&mut map, TITLE, title);
insert_array(&mut map, CC, cc);
insert_array(&mut map, SUBSCRIBERS, subscribers);
insert_str(&mut map, DIFFERENTIAL_REVISION, differential_revision);
insert_str(&mut map, REVERT_PLAN, revert_plan);
insert_array(&mut map, REVIEWED_BY, reviewed_by);
insert_array(&mut map, REVIEWERS, reviewers);
insert_str(&mut map, SUMMARY, summary);
insert_str(&mut map, SIGNATURE, signature);
insert_array(&mut map, TASKS, tasks);
insert_str(&mut map, TEST_PLAN, test_plan);
map
}
fn add(&mut self, tag: String, value: Vec<&str>) {
let value = itertools::join(value, "\n").trim().to_string();
@ -209,28 +165,9 @@ impl PhabricatorMessage {
}
}
fn to_lua_string(s: String) -> AnyLuaValue {
AnyLuaValue::LuaAnyString(AnyLuaString(s.as_bytes().to_vec()))
}
fn to_lua_array<'a, T: IntoIterator<Item = String>>(v: T) -> AnyLuaValue {
let v: Vec<_> = v
.into_iter()
.enumerate()
.map(|(i, val)| {
(
AnyLuaValue::LuaNumber((i + 1) as f64),
AnyLuaValue::LuaString(val),
)
})
.collect();
AnyLuaValue::LuaArray(v)
}
#[cfg(test)]
mod test {
use super::*;
use maplit::hashmap;
fn s(v: &str) -> String {
v.to_string()
@ -242,14 +179,9 @@ mod test {
#[test]
fn test_parse_commit_msg() {
fn check_parse_commit(
commit_msg: &str,
expected_msg: PhabricatorMessage,
expected_lua: HashMap<&'static str, AnyLuaValue>,
) {
fn check_parse_commit(commit_msg: &str, expected_msg: PhabricatorMessage) {
let msg = PhabricatorMessage::parse_message(commit_msg);
assert_eq!(msg, expected_msg);
assert_eq!(msg.to_lua(), expected_lua);
}
check_parse_commit(
@ -260,11 +192,6 @@ mod test {
test_plan: ss("testinprod"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("mononoke: fix bug")),
SUMMARY => to_lua_string(s("fix")),
TEST_PLAN => to_lua_string(s("testinprod")),
},
);
// multiline title
@ -276,11 +203,6 @@ mod test {
test_plan: ss("testinprod"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("mononoke: fix bug\nsecondline")),
SUMMARY => to_lua_string(s("fix")),
TEST_PLAN => to_lua_string(s("testinprod")),
},
);
check_parse_commit(
@ -291,11 +213,6 @@ mod test {
test_plan: ss("testinprod"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
SUMMARY => to_lua_string(s("fix")),
TEST_PLAN => to_lua_string(s("testinprod")),
},
);
// Tag should start at beginning of the line
@ -306,10 +223,6 @@ mod test {
summary: ss("fix\n Test Plan: testinprod"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
SUMMARY => to_lua_string(s("fix\n Test Plan: testinprod")),
},
);
check_parse_commit(
@ -319,10 +232,6 @@ mod test {
summary: ss("fix\nnot a tag: testinprod"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
SUMMARY => to_lua_string(s("fix\nnot a tag: testinprod")),
},
);
check_parse_commit(
@ -332,10 +241,6 @@ mod test {
summary: ss("fix\nFixed\na\nbug"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
SUMMARY => to_lua_string(s("fix\nFixed\na\nbug")),
},
);
check_parse_commit(
@ -346,11 +251,6 @@ mod test {
cc: Some(vec![]),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
SUMMARY => to_lua_string(s("fix")),
CC => to_lua_array(vec![]),
},
);
check_parse_commit(
@ -360,10 +260,6 @@ mod test {
cc: Some(vec![s("user1"), s("user2"), s("user3")]),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
CC => to_lua_array(vec![s("user1"), s("user2"), s("user3")]),
},
);
check_parse_commit(
@ -373,10 +269,6 @@ mod test {
tasks: Some(vec![s("T1111"), s("T2222"), s("T3333")]),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
TASKS => to_lua_array(vec![s("T1111"), s("T2222"), s("T3333")]),
},
);
check_parse_commit(
@ -388,12 +280,6 @@ mod test {
reviewed_by: Some(vec![s("stash"), s("luk"), s("simonfar")]),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("")),
SUMMARY => to_lua_string(s("fix")),
TEST_PLAN => to_lua_string(s("testinprod")),
REVIEWED_BY => to_lua_array(vec![s("stash"), s("luk"), s("simonfar")]),
},
);
check_parse_commit(
@ -422,16 +308,6 @@ Differential Revision: https://url/D123
differential_revision: ss("https://url/D123"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("mononoke: fix fixovich")),
SUMMARY => to_lua_string(s("fix\nof a mononoke\nbug")),
TEST_PLAN => to_lua_string(s("testinprod")),
REVIEWED_BY => to_lua_array(vec![s("stash")]),
REVIEWERS => to_lua_array(vec![s("#mononoke")]),
CC => to_lua_array(vec![s("jsgf")]),
TASKS => to_lua_array(vec![s("T1234")]),
DIFFERENTIAL_REVISION => to_lua_string(s("https://url/D123")),
},
);
// Parse (almost) a real commit message
@ -470,19 +346,6 @@ Signature: 111111111:1111111111:bbbbbbbbbbbbbbbb",
signature: ss("111111111:1111111111:bbbbbbbbbbbbbbbb"),
..Default::default()
},
hashmap! {
TITLE => to_lua_string(s("mononoke: log error only once")),
SUMMARY => to_lua_string(s("Previously `log_with_msg()` was logged twice if msg wasn't None - with and\n\
without the message. This diff fixes it.\n\
\n\
#accept2ship")),
TEST_PLAN => to_lua_string(s("buck check")),
REVIEWED_BY => to_lua_array(vec![s("simonfar")]),
REVIEWERS => to_lua_array(vec![s("simonfar"), s("#mononoke")]),
SUBSCRIBERS => to_lua_array(vec![s("jsgf")]),
DIFFERENTIAL_REVISION => to_lua_string(s("https://phabricator.intern.facebook.com/D1111111")),
SIGNATURE => to_lua_string(s("111111111:1111111111:bbbbbbbbbbbbbbbb")),
},
);
}
}


@ -12,20 +12,20 @@
use super::{Hook, HookChangeset, HookContext, HookExecution};
use anyhow::Error;
use async_trait::async_trait;
use context::CoreContext;
use futures_ext::{BoxFuture, FutureExt};
use futures_old::future::ok;
pub struct RustHook {
pub name: String,
}
#[async_trait]
impl Hook<HookChangeset> for RustHook {
fn run(
async fn run(
&self,
_ctx: CoreContext,
_ctx: &CoreContext,
_context: HookContext<HookChangeset>,
) -> BoxFuture<HookExecution, Error> {
ok(HookExecution::Accepted).boxify()
) -> Result<HookExecution, Error> {
Ok(HookExecution::Accepted)
}
}


@ -29,11 +29,10 @@ use maplit::hashmap;
use metaconfig_types::{
BookmarkOrRegex, BookmarkParams, Bundle2ReplayParams, CacheWarmupParams, CommitSyncConfig,
CommitSyncDirection, CommonConfig, DefaultSmallToLargeCommitSyncPathAction, DerivedDataConfig,
HookBypass, HookCode, HookConfig, HookManagerParams, HookParams, HookType,
InfinitepushNamespace, InfinitepushParams, LfsParams, PushParams, PushrebaseFlags,
PushrebaseParams, Redaction, RepoConfig, RepoReadOnly, SmallRepoCommitSyncConfig,
SourceControlServiceParams, StorageConfig, UnodeVersion, WhitelistEntry,
WireprotoLoggingConfig,
HookBypass, HookConfig, HookManagerParams, HookParams, HookType, InfinitepushNamespace,
InfinitepushParams, LfsParams, PushParams, PushrebaseFlags, PushrebaseParams, Redaction,
RepoConfig, RepoReadOnly, SmallRepoCommitSyncConfig, SourceControlServiceParams, StorageConfig,
UnodeVersion, WhitelistEntry, WireprotoLoggingConfig,
};
use mononoke_types::{MPath, RepositoryId};
use regex::Regex;
@ -78,7 +77,6 @@ impl RepoConfigs {
raw_repo_config.clone(),
&storage,
&commit_sync,
config_path,
)?;
if !repoids.insert(config.repoid) {
@ -375,43 +373,9 @@ impl RepoConfigs {
raw_config: RawRepoConfig,
storage_config: &HashMap<String, RawStorageConfig>,
commit_sync: &HashMap<String, CommitSyncConfig>,
config_root_path: &Path,
) -> Result<RepoConfig> {
let hooks = raw_config.hooks.clone().unwrap_or_default();
let base_hook_loading_path = if config_root_path.starts_with(CONFIGERATOR_PREFIX) {
// When we load main config from configerator, we expect to find our
// configuration relative to the main folder where the Mononoke binary
// is executed from.
std::env::current_exe()?
.parent()
.expect("expected to get base dir of mononoke executable")
.to_path_buf()
} else if config_root_path.is_file() {
// JSON config directories are expected to look like:
// $ tree /tmp/mononoke_configs_4vUi2g
// └── config
// ├── hooks
// │   ├── block_cross_repo_commits.lua
// │   ├── <snip>
// └── tiers
// ├── apiserver.json
// ├── <snip>
//
// When invoking Mononoke, we expect the path to point directly
// at the JSON file and paths in JSON config files are expected
// to refer to hooks as 'config/hooks/...', so we need to go 3
// parents up to get to the right place.
config_root_path
.ancestors()
.skip(3)
.next()
.expect("invalid config path structure")
.to_path_buf()
} else {
config_root_path.to_path_buf()
};
let mut all_hook_params = vec![];
for raw_hook_config in hooks {
let config = HookConfig {
@ -420,29 +384,10 @@ impl RepoConfigs {
ints: raw_hook_config.config_ints.unwrap_or_default(),
};
let hook_params = if raw_hook_config.name.starts_with("rust:") {
// No need to load lua code for rust hook
HookParams {
name: raw_hook_config.name,
code: None,
hook_type: HookType::from_str(&raw_hook_config.hook_type)?,
config,
}
} else {
let hook_path = raw_hook_config.path.clone();
let hook_path = match hook_path {
Some(hook_path) => base_hook_loading_path.join(hook_path),
None => {
return Err(ErrorKind::MissingPath().into());
}
};
HookParams {
name: raw_hook_config.name,
code: Some(HookCode::Path(hook_path.clone())),
hook_type: HookType::from_str(&raw_hook_config.hook_type)?,
config,
}
let hook_params = HookParams {
name: raw_hook_config.name,
hook_type: HookType::from_str(&raw_hook_config.hook_type)?,
config,
};
all_hook_params.push(hook_params);
@ -1593,9 +1538,6 @@ mod test {
hooks: vec![
HookParams {
name: "hook1".to_string(),
code: Some(HookCode::Path(
tmp_dir.path().join(PathBuf::from("common/hooks/hook1.lua")),
)),
hook_type: HookType::PerAddedOrModifiedFile,
config: HookConfig {
bypass: Some(HookBypass::CommitMessage("@allow_hook1".into())),
@ -1605,7 +1547,6 @@ mod test {
},
HookParams {
name: "rust:rusthook".to_string(),
code: None,
hook_type: HookType::PerChangeset,
config: HookConfig {
bypass: None,


@ -435,8 +435,6 @@ pub struct HookParams {
pub name: String,
/// The type of the hook
pub hook_type: HookType,
/// The code of the hook
pub code: Option<HookCode>,
/// Configs that should be passed to hook
pub config: HookConfig,
}
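With the Lua loader gone, `HookParams` carries no code at all; a hook can be resolved purely from its configured name. A hypothetical registry sketch of that idea (none of these names are from this diff, and this is not the actual loader):

```rust
use std::collections::HashMap;
use std::sync::Arc;

trait Hook: Send + Sync {}

struct HookParams {
    name: String,
}

// Hypothetical: map configured names like "rust:rusthook" to Rust
// implementations; there is no path or source file left to load.
fn resolve(
    params: &HookParams,
    registry: &HashMap<String, Arc<dyn Hook>>,
) -> Option<Arc<dyn Hook>> {
    registry
        .get(params.name.trim_start_matches("rust:"))
        .cloned()
}
```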


@ -13,9 +13,9 @@ use crate::{
use bookmarks::BookmarkName;
use bytes::Bytes;
use context::CoreContext;
use futures_ext::{BoxFuture, FutureExt};
use futures_old::future::ok;
use futures_old::{stream, Future, Stream};
use futures::{future, stream::futures_unordered, FutureExt, TryFutureExt, TryStreamExt};
use futures_ext::{BoxFuture, FutureExt as _};
use futures_old::future::{ok, Future};
use hooks::{ChangesetHookExecutionID, FileHookExecutionID, HookExecution, HookManager};
use std::collections::HashMap;
use std::sync::Arc;
@ -41,38 +41,51 @@ fn run_pushrebase_hooks(
let changesets = action.uploaded_hg_changeset_ids.clone();
let maybe_pushvars = action.maybe_pushvars.clone();
let bookmark = action.bookmark_spec.get_bookmark_name();
run_pushrebase_hooks_impl(ctx, changesets, maybe_pushvars, &bookmark, hook_manager)
run_pushrebase_hooks_impl(ctx, changesets, maybe_pushvars, bookmark, hook_manager)
}
fn run_pushrebase_hooks_impl(
ctx: CoreContext,
changesets: UploadedHgChangesetIds,
pushvars: Option<HashMap<String, Bytes>>,
onto_bookmark: &BookmarkName,
onto_bookmark: BookmarkName,
hook_manager: Arc<HookManager>,
) -> BoxFuture<(), BundleResolverError> {
// TODO: should we also accept the Option<HgBookmarkPush> and run hooks on that?
let mut futs = stream::FuturesUnordered::new();
for hg_cs_id in changesets {
futs.push(
hook_manager
.run_changeset_hooks_for_bookmark(
ctx.clone(),
hg_cs_id.clone(),
onto_bookmark,
pushvars.clone(),
)
.join(hook_manager.run_file_hooks_for_bookmark(
ctx.clone(),
hg_cs_id,
onto_bookmark,
pushvars.clone(),
)),
)
}
futs.collect()
let futs: futures_unordered::FuturesUnordered<_> = changesets
.into_iter()
.map({
|hg_cs_id| {
let ctx = ctx.clone();
let hook_manager = hook_manager.clone();
let onto_bookmark = onto_bookmark.clone();
let pushvars = pushvars.clone();
async move {
future::try_join(
hook_manager.run_changeset_hooks_for_bookmark(
&ctx,
hg_cs_id.clone(),
&onto_bookmark,
pushvars.as_ref(),
),
hook_manager.run_file_hooks_for_bookmark(
&ctx,
hg_cs_id,
&onto_bookmark,
pushvars.as_ref(),
),
)
.await
}
.boxed()
}
})
.collect();
futs.try_collect()
.boxed()
.compat()
.from_err()
.and_then(|res| {
.and_then(|res: Vec<_>| {
let (cs_hook_results, file_hook_results): (Vec<_>, Vec<_>) = res.into_iter().unzip();
let cs_hook_failures: Vec<(ChangesetHookExecutionID, HookExecution)> = cs_hook_results
.into_iter()

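`run_pushrebase_hooks_impl` shows the bridge pattern used throughout this commit: do the concurrent work with 0.3 futures (`FuturesUnordered`, `try_collect`), then convert back for old-futures callers with `.compat()`. A standalone sketch, assuming `futures` 0.3 with its `compat` feature enabled and the 0.1 crate renamed to `futures_old` as in this codebase:

```rust
use anyhow::Error;
use futures::stream::{FuturesUnordered, TryStreamExt};
use futures::{FutureExt, TryFutureExt};

// Fan out one async task per id, gather results as they complete, and
// expose the whole thing as a futures-0.1 Future for legacy callers.
fn double_all(ids: Vec<u32>) -> impl futures_old::Future<Item = Vec<u32>, Error = Error> {
    let futs: FuturesUnordered<_> = ids
        .into_iter()
        .map(|id| async move { Ok::<_, Error>(id * 2) }.boxed())
        .collect();
    futs.try_collect::<Vec<u32>>().boxed().compat()
}
```

Once the last old-futures caller is migrated, the `.boxed().compat()` tail and the `from_err` shim can simply be dropped.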

@ -794,10 +794,9 @@ CONFIG
function register_hook {
hook_name="$1"
path="$2"
hook_type="$3"
hook_type="$2"
shift 3
shift 2
EXTRA_CONFIG_DESCRIPTOR=""
if [[ $# -gt 0 ]]; then
EXTRA_CONFIG_DESCRIPTOR="$1"
@ -809,7 +808,6 @@ function register_hook {
hook_name="$hook_name"
[[hooks]]
name="$hook_name"
path="$path"
hook_type="$hook_type"
CONFIG
[ -n "$EXTRA_CONFIG_DESCRIPTOR" ] && cat "$EXTRA_CONFIG_DESCRIPTOR"
@ -1184,23 +1182,16 @@ function hook_test_setup() {
name="$HOOKBOOKMARK"
CONFIG
HOOK_FILE="$1"
HOOK_NAME="$2"
HOOK_TYPE="$3"
shift 3
HOOK_NAME="$1"
HOOK_TYPE="$2"
shift 2
EXTRA_CONFIG_DESCRIPTOR=""
if [[ $# -gt 0 ]]; then
EXTRA_CONFIG_DESCRIPTOR="$1"
fi
if [[ ! -z "$HOOK_FILE" ]] ; then
mkdir -p common/hooks
cp "$HOOK_FILE" common/hooks/"$HOOK_NAME".lua
register_hook "$HOOK_NAME" common/hooks/"$HOOK_NAME".lua "$HOOK_TYPE" "$EXTRA_CONFIG_DESCRIPTOR"
else
register_hook "$HOOK_NAME" "" "$HOOK_TYPE" "$EXTRA_CONFIG_DESCRIPTOR"
fi
register_hook "$HOOK_NAME" "$HOOK_TYPE" "$EXTRA_CONFIG_DESCRIPTOR"
setup_common_hg_configs
cd "$TESTTMP" || exit 1


@ -15,45 +15,11 @@ setup configuration
> name="master_bookmark"
> CONFIG
$ mkdir -p common/hooks
$ cat > common/hooks/file_size_hook.lua <<CONFIG
> hook = function (ctx)
> if ctx.file.len() > 10 then
> return false, "File is too large"
> end
> return true
> end
> CONFIG
$ register_hook file_size_hook common/hooks/file_size_hook.lua PerAddedOrModifiedFile <(
> echo 'bypass_commit_string="@allow_large_files"'
> )
$ cat > common/hooks/no_owners_file_deletes.lua <<CONFIG
> hook = function (ctx)
> for _, f in ipairs(ctx.files) do
> if f.is_deleted() and string.match(f.path, ".*OWNERS$") then
> return false, "Deletion of OWNERS files is not allowed"
> end
> end
> return true
> end
> CONFIG
$ register_hook no_owners_file_deletes common/hooks/no_owners_file_deletes.lua PerChangeset <(
> echo 'bypass_commit_string="@allow_delete_owners"'
> )
$ cat > common/hooks/no_owners2_file_deletes_pushvars.lua <<CONFIG
> hook = function (ctx)
> for _, f in ipairs(ctx.files) do
> if f.is_deleted() and string.match(f.path, ".*OWNERS2$") then
> return false, "Deletion of OWNERS files is not allowed"
> end
> end
> return true
> end
> CONFIG
$ register_hook no_owners2_file_deletes_pushvars common/hooks/no_owners2_file_deletes_pushvars.lua PerChangeset <(
> echo 'bypass_pushvar="ALLOW_DELETE_OWNERS=true"'
$ register_hook limit_filesize PerAddedOrModifiedFile <(
> cat <<CONF
> bypass_commit_string="@allow_large_files"
> config_ints={filesizelimit=10}
> CONF
> )
$ setup_common_hg_configs
@ -64,6 +30,8 @@ setup common configuration
$ cat >> $HGRCPATH <<EOF
> [ui]
> ssh="$DUMMYSSH"
> [extensions]
> amend=
> EOF
setup repo
@ -117,63 +85,6 @@ Delete a file, make sure that file_size_hook is not called on deleted files
added 0 changesets with 0 changes to 0 files
updating bookmark master_bookmark
Add OWNERS file, then delete it. Make sure deletion is not allowed
$ touch OWNERS && hg add OWNERS && hg ci -m 'add OWNERS'
$ hgmn push -r . --to master_bookmark -q
$ hg rm OWNERS
$ hg ci -m 'remove OWNERS'
$ hgmn push -r . --to master_bookmark
pushing rev 2d1a0bcf73ee to destination ssh://user@dummy/repo bookmark master_bookmark
searching for changes
remote: Command failed
remote: Error:
remote: hooks failed:
remote: no_owners_file_deletes for 2d1a0bcf73ee48cde9073fd52b6bbb71e4459c9b: Deletion of OWNERS files is not allowed
remote: Root cause:
remote: "hooks failed:\nno_owners_file_deletes for 2d1a0bcf73ee48cde9073fd52b6bbb71e4459c9b: Deletion of OWNERS files is not allowed"
abort: stream ended unexpectedly (got 0 bytes, expected 4)
[255]
Bypass owners check
$ cat >> .hg/hgrc <<CONFIG
> [extensions]
> amend=
> CONFIG
$ hg amend -m 'remove OWNERS\n@allow_delete_owners'
$ hgmn push -r . --to master_bookmark
pushing rev 67730b0d6122 to destination ssh://user@dummy/repo bookmark master_bookmark
searching for changes
adding changesets
adding manifests
adding file changes
added 0 changesets with 0 changes to 0 files
updating bookmark master_bookmark
Add OWNERS2 file. This time bypass it with pushvars
$ touch OWNERS2 && hg ci -Aqm 'add OWNERS2'
$ hgmn push -r . --to master_bookmark -q
$ hg rm OWNERS2
$ hg ci -m 'remove OWNERS2'
$ hgmn push -r . --to master_bookmark
pushing rev 55334cb4e1e4 to destination ssh://user@dummy/repo bookmark master_bookmark
searching for changes
remote: Command failed
remote: Error:
remote: hooks failed:
remote: no_owners2_file_deletes_pushvars for 55334cb4e1e487f6de665629326eb1aaddccde53: Deletion of OWNERS files is not allowed
remote: Root cause:
remote: "hooks failed:\nno_owners2_file_deletes_pushvars for 55334cb4e1e487f6de665629326eb1aaddccde53: Deletion of OWNERS files is not allowed"
abort: stream ended unexpectedly (got 0 bytes, expected 4)
[255]
$ hgmn push -r . --to master_bookmark --pushvars "ALLOW_DELETE_OWNERS=true"
pushing rev 55334cb4e1e4 to destination ssh://user@dummy/repo bookmark master_bookmark
searching for changes
adding changesets
adding manifests
adding file changes
added 0 changesets with 0 changes to 0 files
updating bookmark master_bookmark
Send large file
$ hg up -q 0
$ echo 'aaaaaaaaaaa' > largefile
@ -184,9 +95,9 @@ Send large file
remote: Command failed
remote: Error:
remote: hooks failed:
remote: file_size_hook for 3e0db158edcc82d93b971f44c13ac74836db5714: File is too large
remote: limit_filesize for 3e0db158edcc82d93b971f44c13ac74836db5714: File size limit is 10 bytes. You tried to push file largefile that is over the limit (12 bytes). See https://fburl.com/landing_big_diffs for instructions.
remote: Root cause:
remote: "hooks failed:\nfile_size_hook for 3e0db158edcc82d93b971f44c13ac74836db5714: File is too large"
remote: "hooks failed:\nlimit_filesize for 3e0db158edcc82d93b971f44c13ac74836db5714: File size limit is 10 bytes. You tried to push file largefile that is over the limit (12 bytes). See https://fburl.com/landing_big_diffs for instructions."
abort: stream ended unexpectedly (got 0 bytes, expected 4)
[255]
@ -212,8 +123,8 @@ Send large file inside a directory
remote: Command failed
remote: Error:
remote: hooks failed:
remote: file_size_hook for cbc62a724366fbea4663ca3e1f1a834af9f2f992: File is too large
remote: limit_filesize for cbc62a724366fbea4663ca3e1f1a834af9f2f992: File size limit is 10 bytes. You tried to push file dir/largefile that is over the limit (12 bytes). See https://fburl.com/landing_big_diffs for instructions.
remote: Root cause:
remote: "hooks failed:\nfile_size_hook for cbc62a724366fbea4663ca3e1f1a834af9f2f992: File is too large"
remote: "hooks failed:\nlimit_filesize for cbc62a724366fbea4663ca3e1f1a834af9f2f992: File size limit is 10 bytes. You tried to push file dir/largefile that is over the limit (12 bytes). See https://fburl.com/landing_big_diffs for instructions."
abort: stream ended unexpectedly (got 0 bytes, expected 4)
[255]


@ -22,13 +22,7 @@ setup configuration
> name="master_bookmark"
> CONFIG
$ mkdir -p common/hooks
$ cat > common/hooks/failing_hook.lua <<CONFIG
> hook = function (ctx)
> return false, "failed"
> end
> CONFIG
$ register_hook failing_hook common/hooks/failing_hook.lua PerChangeset <(
$ register_hook always_fail_changeset PerChangeset <(
> echo 'bypass_pushvar="BYPASS_REVIEW=true"'
> )
@ -89,9 +83,9 @@ Unsuccessful push creates a draft commit on the server
remote: Command failed
remote: Error:
remote: hooks failed:
remote: failing_hook for 812eca0823f97743f8d85cdef5cf338b54cebb01: failed
remote: always_fail_changeset for 812eca0823f97743f8d85cdef5cf338b54cebb01: This hook always fails
remote: Root cause:
remote: "hooks failed:\nfailing_hook for 812eca0823f97743f8d85cdef5cf338b54cebb01: failed"
remote: "hooks failed:\nalways_fail_changeset for 812eca0823f97743f8d85cdef5cf338b54cebb01: This hook always fails"
abort: stream ended unexpectedly (got 0 bytes, expected 4)
[255]