feat(rulebook): Paperfile-style DSL for Arstotzka policy #4
Introduces `projects/rulebook/` — a Papers-Please-themed DSL for declaring Arstotzka audiences, resources, actions, and roles. Services will carry a literal `Rulebook` file; a future `arstotzka_catalog` aggregator collates them into the JSON catalog the auth server loads at startup. This PR ships the DSL, loader, validator, renderer, and CLI only — no consumers yet. The evaluator is built on `starlark-pyo3` (same as Tiller), so `load()` has native Starlark semantics with implicit symbol binding.

Scope
Statements carry an explicit `effect` of `"allow"` or `"deny"`, so the broad-allow + point-deny pattern is expressible from day one — e.g. "console-reader has `read` on everything except `secrets.*`". The arstotzka evaluator gains matching deny semantics in PR #8 alongside the catalog loader, and that PR also updates `projects/arstotzka/SPEC.md`. Nothing reads the catalog before PR #8 lands; any future arstotzka build that reads the catalog without deny support would default effect-less statements to allow.

Closure expansion is asymmetric on `effect`:

- Allow statements expand `extends`, so listing `queue` actually emits `["queue", "queue.reservation"]` and the runtime fnmatch evaluator matches both.
- Deny statements stay literal. The surgical-exclusion contract ("allow `secrets.*` except `secrets.private`") is the load-bearing use case for deny; a deny that silently expanded into the whole subtree would also clobber `secrets.private.metadata` and break that contract. Subtree-deny remains expressible via a glob pattern (`secrets.private.*`) — fnmatch handles it.

Validation
Each action in a statement must appear in `r.all_actions` of every listed resource (extends-up inheritance only). Descendants do not launder their actions back up into ancestors at validation time — `statement(resources=[parent], actions=["release"])` fails if only a child of `parent` publishes `release`, because the renderer would otherwise emit a catalog entry that the runtime fnmatch evaluator allows on `parent` directly.

Dimensions and conditional matchers are deferred to a later PR that pairs each addition with the matching evaluator change. The `attr.*` namespace stays exposed for cross-Starlark consistency with Tiller / Bazel-style rule definitions; its carriers are inert today and re-enter the model alongside the evaluator change.

Stack
- `rulebook_schema` Bazel rule + first real Rulebook in `projects/doorman/`.
- `arstotzka_catalog` aggregator + server-side JSON loader + `GET /schema` + evaluator deny semantics + SPEC update.
- `authctl schema` / `authctl validate` / tab-completion off the live catalog.

The boundary at PR #8 is load-bearing: `starlark-pyo3` is build-time-only and never enters the auth server's runtime closure. The server consumes the JSON action output from PR #7's rule.

Review response
First-round review fixes:
- `extends` now does what the README claimed. The renderer expands every allow-statement's `resources` list to the transitive descendant closure of each listed resource, so listing the parent actually matches descendant literals under fnmatch.
- Statements carry `effect="allow" | "deny"`, and PR #8 takes ownership of the evaluator + SPEC update. The broad-allow + point-deny pattern is too useful to deprecate by omission.
- Loader hardening: `StarlarkError` wrapped at every eval site (prelude, parse, body, load); `_resolve_label` rejects `//../../etc:passwd`-style traversal via `is_relative_to`; load cycles caught by an in-flight set; raise on missing workspace marker; `.star`-only suffixes.
- Dropped the implicit-string-concat sniffer: the tokenize-based check did not handle 3.12+ f-strings and was redundant.
- `attr.string(pattern=...)` eagerly compiles the regex; previously the pattern sat unvalidated in the schema.

Second-round review fixes (`866aef8`):

- `all_actions` validation was rewritten to per-listed-resource `all_actions` (extends-up only). `test_action_must_be_on_listed_resource_not_just_descendants` is the regression test sketched in the review.
- `_statement_to_json` keeps deny resources literal; allow expands. `test_deny_statement_does_not_expand_descendants` pins the four-deep-extends-chain case.

Not addressed in this PR:

- `aud` naming collision — flagged but not a blocker.
- Implicit registration via `load()` — loaded modules can call `role()` directly. Authority shape worth revisiting; deferred so the staged plan can land.
- `attr.*` left callable-but-inert. The reviewer suggested either dropping it or making it raise; the explicit project direction is to keep the namespace stable for cross-Starlark consistency with Tiller, and raising would break that goal the moment a Rulebook author copies a snippet across.

Changes are visible to end-users: no
Test plan
- `projects/rulebook/tests/` (7 suites, 67 tests)
- `bazel test //projects/rulebook/...` is currently blocked by an unrelated infra issue: a downloader rewrite redirects `hermetic_launcher`'s GitHub upstream `runfiles_stub_x86_64_linux` artifact to `cairn.tirefireind.us`, which 404s on the expected blob key. `--repository_cache= --disk_cache=` does not bypass the rewrite. Tracked separately. 67/67 tests pass under `uv run pytest -p no:pudb` in the meantime.
- `uv run rulebook render` and `uv run rulebook validate` on sample files

Drop the hand-rolled restricted-Python `exec` evaluator in favour of starlark-pyo3, matching Tiller's architecture. Builders remain Python callables wrapped via `Module.add_callable`; return values travel as `OpaquePythonObject` so they round-trip through Starlark data structures without losing identity. `attr.enum` / `attr.string` are exposed via a Starlark `struct()` in a tiny prelude. `load("//path:name", "sym_a", "sym_b")` now uses native Starlark semantics — implicit symbol binding in the caller's scope — so the "load returns a value you assign" wart is gone.

Reviewed against trunk merge-base `6cbd038`. The real diff is 19 files / +1455, all new under `projects/rulebook/` plus three workspace-registration lines — Forgejo's "1455 / 1378 / 19" stat is misleading because the unrelated monoversion churn is on trunk, not this branch. Worth a `git fetch origin trunk && git rebase origin/trunk` before merge so the file list isn't confusing in the merge commit.

Hostile conceptual review
I think the model is doing too much, claims things it doesn't deliver, and is meaningfully out of step with the system it's allegedly typing. Going from worst to least bad:
1. Rulebook's data model does not match Arstotzka's. Per `projects/arstotzka/SPEC.md` and `projects/arstotzka/statements/src/arstotzka/statements/evaluator.py`, a statement is `{"actions": [...], "resources": [...]}`. Both fields are flat fnmatch globs against opaque strings. The `<type>:<name>` grammar is cosmetic. Rulebook adds (a) deny statements, (b) typed dimensions (enum/string), and (c) `extends`-based resource hierarchy. None of these have a wire representation in the current arstotzka model. The README says PR #3 will make arstotzka "validate its config against the catalog" — but the config statements have no slot for dimensions or denies, so either Rulebook is silently lying about the runtime semantics, or Arstotzka's whole model is changing and that change isn't in this PR's body. Pick one and say so.

2. The `extends` "implicit subresource cover" is a runtime fiction. The README claims listing a parent covers its descendants (`queue` covers `queue.reservation`). Read `render.py:_statement_to_json` — `resources` is rendered verbatim as `[r.name for r in s.resources]`. There is no expansion. The arstotzka evaluator does `fnmatchcase(resource, pattern)` against literals, so a statement listing `queue` will not match the runtime resource string `queue.reservation`. The behavior the README describes does not exist anywhere in the code; this is a correctness bug masquerading as a feature.

3. Dimensions have no story to ground them. `evaluator.allows(statements, action, resource)` takes two strings. The wire format has no place to put `priority=max`. Either dimensions get encoded into the resource string (e.g. `queue:priority=max`) — which the renderer does not do — or evaluated by relying parties out-of-band by some unwritten convention. As shipped, dimensions are decorative typing without a runtime contract.

4. `effect="deny"` is accepted, validated, and rendered, but arstotzka has no deny. Same root cause as (1). Pick a story.

5. The DSL is much bigger than the actual problem. Today's Arstotzka has on the order of two distinct statement shapes (`{*, *}` for admin, `{*:read, *}` for reader). For that you do not need: a Starlark interpreter, an `OpaquePythonObject` round-trip layer, a custom workspace-root walker, a bespoke `//pkg:name` label resolver, a Python-tokenize-based implicit-string-concat sniffer, registration-by-side-effect builders, and a four-stage roadmap. A YAML schema and 50 lines of jsonschema-style validation gets you the "compile-time vocabulary check" benefit without committing to a programming model. The Tiller analogy is doing a lot of work in this design ("we already use Starlark for one config, ship it again") — but Tiller has actual programmatic logic (digest pinning, transforms). Auth roles do not.

6. starlark-pyo3 will end up as a runtime dep on the auth server. PR #3 is described as "arstotzka server loads the catalog at startup." If "loads" means evaluates Rulebooks, that pulls a native Starlark interpreter into the boot path of the thing that gates every ingress. The right shape is: render the catalog to JSON at build time (the implied `rulebook_schema` Bazel rule in PR #2), check the JSON in to the arstotzka server's config, and never let Starlark anywhere near the server process. Worth being explicit about that boundary now so PR #3 doesn't drift.

7. Implicit registration via `load()` is an unfortunate authority shape for a security DSL. Loaded modules see the same builders as the root file and can call `role()` to register directly into the audience. That's action-at-a-distance: a `load("//tools/arstotzka:policy", "standard_user_admin_roles")` can in principle register additional roles you didn't ask for, invisibly. Pure constructor + explicit attach (`audience.add(make_role(...))`) is the kind of thing security policy specifically wants — the cost is verbose Rulebooks, the benefit is that what's in your file is what gets registered.

8. "Audience" naming collides with JWT `aud`. Arstotzka's `aud` claim means the forwarded host. Rulebook's `audience(name="...")` uses the same string for the same concept — fine — but reusing the JWT term for the service publishing the schema invites confusion when someone reads "audience" in code and is unsure whether it's the producer or the JWT consumer. Not a blocker, but the Papers-Please cosplay loses to clarity here.

9. PR-numbering in the body is internal-aspirational (this is forgejo PR #4, not the body's "PR #1"). Cosmetic, easy fix; flagged because anyone clicking the "PR #2/#3/#4" references will be confused.
The path I'd push for: shrink Rulebook to whatever model arstotzka's evaluator actually understands today, ship the JSON schema, and only grow back deny / dimensions / extends together with the matching evaluator changes. The current PR sets up an expressivity overhang — a DSL whose semantics have no runtime — that's hard to walk back once Rulebooks are written.
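For concreteness, the flat model in points 1–2 can be sketched standalone. This is a hedged sketch, not the real `evaluator` module — `allows` here merely mirrors the `evaluator.allows(statements, action, resource)` shape described above:

```python
from fnmatch import fnmatchcase

def allows(statements, action, resource):
    """A grant exists if any statement matches both fields as fnmatch globs."""
    return any(
        any(fnmatchcase(action, p) for p in st["actions"])
        and any(fnmatchcase(resource, p) for p in st["resources"])
        for st in statements
    )

admin = {"actions": ["*"], "resources": ["*"]}
reader = {"actions": ["read"], "resources": ["*"]}

assert allows([admin], "write", "anything")
assert allows([reader], "read", "queue.reservation")
assert not allows([reader], "write", "queue")

# Point 2's correctness bug, concretely: a literal resource pattern does not
# cover descendants -- "queue" never matches "queue.reservation".
assert not allows([{"actions": ["read"], "resources": ["queue"]}],
                  "read", "queue.reservation")
# Only an explicit glob (or render-time expansion to literals) covers the subtree:
assert allows([{"actions": ["read"], "resources": ["queue", "queue.*"]}],
              "read", "queue.reservation")
```

Everything beyond this flat two-glob match — deny, dimensions, `extends` — has no slot in the shape above, which is the whole complaint.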
Code review
Bugs and correctness
- `loader.py` — `StarlarkError` from a `load()`'d module isn't wrapped. `evaluate_rulebook` only catches `StarlarkError` at the top-level `starlark.eval`. Inside `_make_file_loader.load_func`, the inner `starlark.eval(mod, ast, globals_, file_loader=inner_loader)` call has no catch, so an error in a loaded policy module propagates as a raw `StarlarkError` past the CLI's `except RulebookError` handlers and prints a Python traceback to the user. Wrap the inner eval too. There's no test for this; `test_loader.py:test_open_is_not_exposed` only exercises top-level errors.
- `loader.py:_resolve_label` does not constrain `pkg` to the workspace. A label like `load("//../../../etc:passwd", "_")` resolves via `workspace_root / pkg / f"{name}{suffix}"` and `Path` will happily traverse out. Today this is low-risk because Rulebooks are trusted, but a label resolver should `resolve()` the candidate and verify `is_relative_to(workspace_root)`. Cheap to fix, prevents a footgun later.
- `loader.py:_make_file_loader` is recursion-cycle-naive. The `cache` only stores `FrozenModule` after eval completes; during eval, a self-referential `load()` re-enters and recurses until the stack blows. Maintain an "in-flight" set of resolved paths; raise on re-entry.
- `loader.py:_find_workspace_root` silently falls back to `start.resolve().parent` when no marker is found between the file and `/`. This is a hidden mode that makes label resolution work in tests (where `.git` is created in tmp_path) but produces silently-wrong resolution if a Rulebook is evaluated outside any workspace. Raise instead — the caller can catch.
- `_resolve_label` falls through to a no-extension match (`""`). `(.star, .py, "")` means `load("//foo:Rulebook")` resolves to a file literally named `Rulebook`. Probably unintended for `load()`; that bare-name path is for the root `Rulebook` file, not for module loads. Consider dropping the `""` suffix from the load path.
- `.py` files load as Starlark. README says load resolves `.star` "or `.py`". A `.py` file rarely parses cleanly as Starlark; this resolution mostly invites confusion when someone names a shared module `policy.py` and gets a `StarlarkError` they don't understand. I'd drop `.py` from the suffix list.
- `_check_implicit_string_concat` won't see f-string adjacency on Python 3.12+. Python 3.12 tokenizes `f"a"` as `FSTRING_START`/`FSTRING_MIDDLE`/`FSTRING_END`, not `STRING`, so `f"a" "b"` is not flagged. Belt-and-suspenders check is fine, but worth a note that Starlark's parser is the real defense.
- `validate.py:_is_glob` only matches `*`, while the runtime evaluator uses `fnmatchcase`, which also supports `?`, `[abc]`, `[!abc]`. So `f?o` is treated as a literal by validation and rejected for an enum that lacks the literal `f?o`, but accepted at runtime as a glob. Validator drift from the runtime matcher.
- `attr.string(pattern=...)` never validates the pattern. It's documented as a regex but never compiled, never checked. An invalid regex sits silently in the schema until something downstream tries to use it.
- `render.py` does not encode `effect="deny"` in any distinguishing way besides the field — fine on its own, but combined with conceptual-review point (4), the JSON schema is committing to a shape that arstotzka can't honor.

Style / nits
- `_install_builders` has an over-engineered closure capture. `def _wrap(inner=fn): ... ; module.add_callable(name, _wrap())` — the default-arg trick is right but could just be a top-level `_wrap_builder(fn)` helper for readability.
- `builders.py:_Context.audience` is registered-once-or-error, but `_require_audience("resource")` ducks under that with a runtime error string. Consider asserting the caller name with a typed enum/`Literal` — small thing.
- `validate.py` re-checks duplicate names that `builders.py` already enforces. Docstring calls it "defense in depth"; it's also dead in normal flows. Fine but flag-worthy.
- `validate.py:_validate_statement` joins resource names with `", "`. Resource names with embedded commas/spaces would make the message ambiguous. Cosmetic.
- `Statement` is the only model type without a `description` field. Consistency.

Test gaps
- No CLI tests (`click.testing.CliRunner`) — `--skip-validate`, error exit codes, JSON output shape are uncovered.
- No `effect="deny"` rendering test — given how shaky deny is, this should be exercised.
- No `load()` error-path tests: missing colon, missing `//` prefix, non-existent target, syntax error inside loaded module (this is the bug above).
- No traversal test for `load("//../...")`.
- No `.py` extension test for `load()`.

Build / packaging
- `pyproject.toml` is fine. `BUILD.bazel` follows the project pattern.
- No `__init__.py` under `src/rulebook/` — adheres to repo rule.
- `MODULE.bazel`, root `pyproject.toml`, and `uv.lock` look complete.

PR description
TL;DR
The code is clean and the build/packaging is correct, but conceptually the DSL is committing to model features (deny, dimensions, extends-as-cover) that the runtime doesn't implement, and the README claims a runtime behavior (`queue` covers `queue.reservation`) that the renderer doesn't produce. The biggest concrete bugs are: load-error wrapping, path-traversal in label resolution, and load-cycle handling. The biggest design question is: what does Rulebook commit Arstotzka to changing, and is that change in scope or out of scope?

`92f9a9891a` to `25233ac3b4`

Re-reviewed at `c4a207a`. Most of the original review is addressed; flagging what remains, plus one new finding.

Resolved
- `render.py:_expand_resources` walks descendants and dedupes; `test_statement_expands_to_extends_closure` and `test_listing_two_siblings_dedupes_resources` cover it. The README/PR-body framing of expansion as the load-bearing transform reads honestly.
- `StarlarkError` is wrapped per-eval site (prelude, parse, body), `_resolve_label` does `is_relative_to(workspace_resolved)`, an `in_flight` set catches cycles, missing-marker now raises. New tests for each (`test_load_inner_error_is_wrapped`, `test_load_path_traversal_is_rejected`, `test_load_cycle_is_detected`, `test_no_workspace_marker_raises`).
- Suffix list is `.star` only, with `test_load_only_resolves_star_extension` pinning it.
- `test_starlark_rejects_implicit_string_concat` still verifies the behavior end-to-end.
- `attr.string(pattern=...)` compiles the regex eagerly.
- CLI tests (`test_cli.py`, 7 cases) covering exit codes, `--skip-validate`, JSON shape, and load-error surfacing.
- `_install_builders` closure capture cleanup: `_wrap_builder` is now top-level.

Still flagged
Closure expansion grants actions on resources that don't publish them
This is new — it surfaces as a consequence of the closure-expansion design. Construct: resource `P` publishes `read`; `C` extends `P` and publishes `write`; a statement lists `resources=[P]` with `actions=["write"]`.

`validate.py:_validate_statement` builds the closure `[P, C]`, unions `r.all_actions` over the closure (`{"read", "write"}` because `C` has `write`), and accepts `write` on the statement. The renderer emits `{"actions": ["write"], "resources": ["P", "C"]}`. At runtime, `(action="write", resource="P")` matches via fnmatch — but `P` doesn't publish `write`.

So the catalog admits role grants that name actions on resources that don't expose them, by laundering through extends-down. Two coherent fixes:
1. Validate each action against `r.all_actions` (extends-up only) for every resource in `st.resources` (the original list, not the closure). Descendants are render-time noise and shouldn't supply actions for the listed parents.
2. Expand each statement into `(resource, action-subset-supported-by-that-resource)` tuples, fanning out closure expansion only over actions each leaf actually publishes.

(1) is the smaller change and matches what authors mean when they write `statement(resources=[P], actions=["write"])` — they're claiming `P` has `write`. Today the validator silently disagrees with the runtime.

A regression test along the lines of:
This currently passes empty-errors; it should fail.
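Since the sketched test didn't survive this rendering, the mismatch can be modeled standalone. This is a hedged toy model — `ACTIONS`, `CLOSURE`, and the two checker functions are hypothetical stand-ins, not the rulebook API:

```python
from fnmatch import fnmatchcase

# Hypothetical model: P publishes only "read"; C extends P and publishes "write".
ACTIONS = {"P": {"read"}, "C": {"write"}}
CLOSURE = {"P": ["P", "C"], "C": ["C"]}  # descendant closure used by the renderer

def errors_closure_union(listed, actions):
    """The buggy check: union all_actions over the whole descendant closure."""
    pool = {a for r in listed for member in CLOSURE[r] for a in ACTIONS[member]}
    return [a for a in actions if a not in pool]

def errors_per_listed(listed, actions):
    """Fix (1): every action must be published by every LISTED resource
    (extends-up inheritance elided here, since P has no ancestors)."""
    return [a for r in listed for a in actions if a not in ACTIONS[r]]

# The buggy validator passes empty-errors on write-via-descendant...
assert errors_closure_union(["P"], ["write"]) == []
# ...and the rendered catalog then admits (write, P) at runtime:
rendered = {"actions": ["write"], "resources": CLOSURE["P"]}
assert any(fnmatchcase("P", pat) for pat in rendered["resources"])
# The per-listed-resource check reports the error instead:
assert errors_per_listed(["P"], ["write"]) == ["write"]
```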
`attr.*` is dead surface

The model dropped dimensions, and `EnumType` / `StringType` no longer attach to anything. `test_attr_namespace_available_even_if_unused` literally writes `_ = attr.enum("a", "b")` and discards. The PR-body and README justify this as "Bazel-style cross-Starlark consistency" so the namespace stays stable across the future re-introduction.
attr.enum("max", "high")calls today and have them silently no-op. That's a footgun: someone reads the README'sattr.*paragraph, infers "this is how I type a dimension", writesdimensions={"priority": attr.enum(...)}(the previous shape), and getsTypeError: resource() got an unexpected keyword argument 'dimensions'. Better to either:attrfrom the prelude entirely and re-add when there's a consumer, orattr.enum(...)andattr.string(...)raise a friendlyRulebookError("attr.* type carriers have no consumer in the current rulebook model; they will be re-introduced with dimensions in a later PR. Remove the call for now.").The current "callable but inert" choice is the most surprising of the three.
Deny + descendants is asymmetric
The README frames deny as excluding "one descendant", but closure expansion runs on deny statements too (correct, given how `_statement_to_json` is uniform). So a deny listing a descendant `secrets.private` also denies any descendant of `secrets.private` — `secrets.private.metadata` and beyond. That's not "one descendant" — it's a subtree. Either the README phrasing should acknowledge the subtree, or `effect="deny"` statements should opt out of expansion (renderer emits the literal listed resources only). The latter is more useful for surgical exclusion; the former is what the renderer does today. Pick one and document.

A test pinning the deny+expansion behavior would help —
`test_deny_statement_renders_with_effect_field` doesn't exercise the case where the denied resource has its own descendants.

The `bazel test` claim is unverified

The PR body acknowledges the cairn mirror 404 on
`runfiles_stub_x86_64_linux` and falls back to `uv run pytest`. That's an honest disclosure, but it leaves the green-bazel claim unverified, which is the project's contract for "ready to merge" per `CLAUDE.md`. Worth either (a) opening a separate ticket on the mirror, (b) confirming locally with a non-cluster bazel cache, or (c) waiting until the mirror's healthy. The diff is well-tested via `pytest` in either case, but the "all testing through bazel" rule is being relaxed here.

Forward-compat phrasing slightly overstates the case
Pre-PR-#8 arstotzka doesn't read the catalog at all, so "older arstotzka builds reading the catalog" describes a state that doesn't exist in the field. The forward-compat story really only matters once the catalog loader exists without deny support — a window the staged plan keeps closed. Worth tightening to "any future arstotzka build that reads the catalog without deny support" — small but the current wording pretends there's a back-compat surface to maintain when there isn't yet.
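The window the staged plan keeps closed can be sketched: a hypothetical catalog reader without deny support would treat every matching statement as a grant, while a deny-aware one would not. Both functions below are assumptions for illustration (the deny-override rule here is a guess at PR #8's semantics, not its spec):

```python
from fnmatch import fnmatchcase

def _matches(st, action, resource):
    return (any(fnmatchcase(action, p) for p in st["actions"])
            and any(fnmatchcase(resource, p) for p in st["resources"]))

def allows_deny_unaware(statements, action, resource):
    """Hypothetical pre-deny reader: every matching statement, effect-less
    or otherwise, defaults to a grant."""
    return any(_matches(st, action, resource) for st in statements)

def allows_deny_aware(statements, action, resource):
    """Assumed PR #8 semantics: any matching deny overrides matching allows."""
    matched = [st for st in statements if _matches(st, action, resource)]
    if any(st.get("effect") == "deny" for st in matched):
        return False
    return bool(matched)

catalog = [
    {"effect": "allow", "actions": ["read"], "resources": ["secrets.*"]},
    {"effect": "deny", "actions": ["read"], "resources": ["secrets.private"]},
]

assert not allows_deny_aware(catalog, "read", "secrets.private")
# A deny-unaware reader would silently turn the deny row into a grant:
assert allows_deny_unaware(catalog, "read", "secrets.private")
```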
Consciously deferred (acknowledged)
- `aud` naming collision.
- Implicit registration via `load()` (loaded modules can call `role()` directly).

Tests look great
Coverage now includes load-error wrapping, traversal rejection, cycle detection, suffix list, missing-target, missing-`//`, missing-colon, missing-workspace, deny rendering, and CLI exit codes. The one regression test missing is the closure/validator mismatch above.

TL;DR
The architectural moves (drop dimensions, scope deny to PR #8, closure-expand at render time) are the right shape. The loader is now hardened. The remaining issue worth blocking on is the validator/render closure asymmetry — actions can be granted on parents that don't publish them, which defeats the "compile-time check against a real vocabulary" pitch. Once that's fixed (or explicitly justified), this is good to merge ahead of PR #7.
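The allow-expands / deny-stays-literal shape endorsed above can be modeled standalone. A hedged sketch — `Resource`, `closure`, and `render_resources` are hypothetical stand-ins for the `render.py` internals, not the real code:

```python
from dataclasses import dataclass, field
from fnmatch import fnmatchcase

@dataclass
class Resource:
    name: str
    children: list["Resource"] = field(default_factory=list)  # inverse of extends

def closure(r: Resource) -> list[str]:
    """r plus every transitive descendant, deduped, parent-first."""
    out = [r.name]
    for child in r.children:
        out += [n for n in closure(child) if n not in out]
    return out

def render_resources(effect: str, listed: list[Resource]) -> list[str]:
    if effect == "deny":                      # deny stays literal
        return [r.name for r in listed]
    out: list[str] = []                       # allow expands the closure
    for r in listed:
        out += [n for n in closure(r) if n not in out]
    return out

meta = Resource("secrets.private.metadata")
private = Resource("secrets.private", [meta])
secrets = Resource("secrets", [private])

assert render_resources("allow", [secrets]) == [
    "secrets", "secrets.private", "secrets.private.metadata"]
assert render_resources("deny", [private]) == ["secrets.private"]
# Subtree-deny stays expressible as a glob the runtime matcher understands:
assert fnmatchcase("secrets.private.metadata", "secrets.private.*")
```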
Two correctness fixes from the second-round review:

1. Validator no longer launders descendant actions up. Previously the validator unioned `all_actions` across the closure (parent + descendants), so `statement(resources=[parent], actions=["release"])` passed because a child published `"release"` — but the renderer would then emit `{"actions": ["release"], "resources": ["parent", ...]}` and arstotzka's fnmatch evaluator would allow `(release, parent)` even though `parent` never publishes `release`. The catalog laundered actions the listed resources don't expose. Tighten validation: each action must be in `r.all_actions` of every listed resource (extends-up only). The new `test_action_must_be_on_listed_resource_not_just_descendants` pins this.

2. Closure expansion is asymmetric on `effect`. Allow expands; deny does not. The surgical-exclusion use case ("allow `secrets.*` except `secrets.private`") needs deny to mean "just this one literal", not "this subtree". A deny that silently expanded would also clobber `secrets.private.metadata` and break the contract. Subtree deny is still expressible via fnmatch glob (`secrets.private.*`) — let the evaluator do the work. `test_deny_statement_does_not_expand_descendants` pins the new behavior; README documents the asymmetry explicitly.

Validator simplification falls out: with action checks against listed resources only, the descendants index is no longer needed. Drops the duplicate-name re-check that was already enforced by `builders.py`.

Pushed
`866aef8` addressing the second-round review.

Fixed
- Each action is now validated against `r.all_actions` of every listed resource (extends-up only). The closure-union over descendants is gone. New regression test `test_action_must_be_on_listed_resource_not_just_descendants` is the one you sketched in your review, and it now fails-then-passes against the fix. Added `test_listing_descendant_with_inherited_action_is_valid` and `test_action_on_one_of_many_listed_resources_is_reported` to pin the rest of the surface.
- `_statement_to_json` emits the literal listed resources for `effect="deny"` and the descendant closure only for `effect="allow"`. Subtree-deny stays expressible via fnmatch glob (`secrets.private.*`). New `test_deny_statement_does_not_expand_descendants` builds a four-deep extends chain and asserts the deny stops at its literal while the allow expands to the full subtree. README documents the asymmetry explicitly.
- Tightened the `render.py` docstring to talk about "any future arstotzka build that reads the catalog without deny support" — matching the staged plan.

Validator drops out
With the new check operating on the listed resources only, `validate.py` no longer needs the descendants index, and the dead "defense in depth" duplicate-name re-check is gone — `builders.py` already enforces uniqueness at construction time.

On `attr.*`

Keeping it callable-but-inert. Reid's direction explicitly was to keep the `attr` namespace stable for cross-Starlark consistency with Tiller and Bazel-style rule definitions; raising on call would break that goal the moment a Rulebook author copies a snippet from Tiller. The README and `attr.py` docstring both flag it as a reserved-but-inert surface today; if the footgun bites in practice we can revisit.

Bazel test
Still blocked by the cairn mirror's 404 on `runfiles_stub_x86_64_linux` — looks like a downloader rewrite redirects the GitHub upstream to cairn, but the artifact at the expected blob key isn't there. Tried `--repository_cache= --disk_cache=` to force a fresh fetch; same failure. This is environmental and outside this branch's scope. 67/67 tests pass under `uv run pytest -p no:pudb`. Worth opening a separate ticket on cairn.

Deferred (still consciously)
- `aud` naming.
- Implicit registration via `load()`.
- `attr.*` left callable (above).

Re-reviewed at `866aef8`. The two blockers from round two are landed cleanly; only one small inconsistency left.

Verified fixes
- `_validate_statement` iterates `(action, listed_resource)` pairs against `r.all_actions` (extends-up only). The closure/descendants index is gone from `validate.py` entirely. The regression test I sketched (`test_action_must_be_on_listed_resource_not_just_descendants`) now reproduces the bug and asserts the fix; `test_action_on_one_of_many_listed_resources_is_reported` pins the "must be on every listed resource" half. Inheritance up still works as expected (`test_listing_descendant_with_inherited_action_is_valid`).
- `_statement_to_json` branches on `effect`: `allow` expands, `deny` emits verbatim. The README documents both halves and points authors at glob (`secrets.private.*`) for subtree-deny. `test_deny_statement_does_not_expand_descendants` builds the four-deep chain (`secrets` → `secrets.private` → `secrets.private.metadata`) and verifies the deny stops at its literal while the allow expands the full subtree.
- `render.py` docstring — now talks about "any future arstotzka build that reads the catalog" rather than implying back-compat with current builds.

Held with rationale (acceptable)
- `attr.*` stays callable-inert for cross-Starlark consistency with Tiller / Bazel-style rule definitions. The README and `attr.py` docstring both flag it as reserved-but-unused. Reasonable trade — if it bites in practice, revisit.
- `aud` collision — flagged, deferred. README now anchors `audience.name` to the JWT `aud` claim explicitly, which mostly de-fangs it.
- Implicit registration via `load()` — deferred.
- Bazel still blocked on `runfiles_stub_x86_64_linux`. Environmental and out of this branch's scope; 67/67 under `pytest`. Worth a separate ticket on cairn.

Small cleanup left
`validate.py`'s docstring still lists check #2 ("Resource and role names are unique. … this is a final sanity check on hand-built models.") but the function no longer performs that check — the only iteration in `validate()` is over `_validate_statement` for the action presence rule. Either drop item #2 from the docstring (matching the code, since `builders.py` is the only enforcement now) or restore the check. The former is consistent with the "validator simplification" you described in the response.

Verdict
Approve once the docstring trim above is in. The architectural shape — closure-expand on allow, literal on deny, validate against listed resources only — now hangs together coherently, and `arstotzka_catalog` (PR #8) has a clean wire format to consume.

Pushed
`3fbc238` — `validate.py` docstring now describes the single rule the function actually enforces (action presence on every listed resource); the stale uniqueness item is gone, since `builders.py` is the only enforcement now. Stack rebased.

Ready for approval whenever you are.