Merge pull request #5 from CereusLinuxProject/next

Import changes from upstream
This commit is contained in:
Kevin Figueroa 2022-11-14 18:33:21 -06:00 committed by GitHub
commit 768c33f45b
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
60 changed files with 1330 additions and 816 deletions


@@ -4,15 +4,39 @@ void-packages is the backbone of the Void Linux distribution. It contains all th
This document describes how you, as a contributor, can help with adding packages, correcting bugs and adding features to void-packages.
## Package Requirements
To be included in the Void repository, software must meet at least one of the following requirements.
Exceptions to the list are possible, and might be accepted, but are extremely unlikely.
If you believe you have an exception, start a PR and make an argument for why that particular piece of software,
while not meeting any of the following requirements, is a good candidate for the Void packages system.
1. **System**: The software should be installed system-wide, not per-user.
1. **Compiled**: The software needs to be compiled before being used, even if it is software that is not needed by the whole system.
1. **Required**: Another package either within the repository or pending inclusion requires the package.
In particular, new themes are highly unlikely to be accepted.
Simple shell scripts are unlikely to be accepted unless they provide considerable value to a broad user base.
New fonts may be accepted if they provide value beyond aesthetics (e.g. they contain glyphs for a script missing in already packaged fonts).
Browser forks, including those based on Chromium and Firefox, are generally not accepted.
Such forks require heavy patching, maintenance and hours of build time.
Software must be packaged at a version its authors have announced as ready for use by the general public, usually called a release.
Betas, arbitrary VCS revisions, templates using the tip of a development branch taken at build time, and releases created by the package maintainer won't be accepted.
## Creating, updating, and modifying packages in Void by yourself
If you really want to get a new package or package update into Void Linux, we recommend you contribute it yourself.
We provide a [comprehensive Manual](./Manual.md) on how to create new packages.
There's also a [manual for xbps-src](./README.md), which is used to build package files from templates.
For this guide, we assume you have basic knowledge about [git](http://git-scm.org), as well as a [GitHub Account](http://github.com) with [SSH set up](https://docs.github.com/en/authentication/connecting-to-github-with-ssh).
You should also [set the email](https://docs.github.com/en/account-and-profile/setting-up-and-managing-your-personal-account-on-github/managing-email-preferences/setting-your-commit-email-address) on your GitHub account and in git so your commits are associated with your GitHub account properly.
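As a sketch (the name and address below are placeholders for your own GitHub identity), this can be done with:

```shell
# Set the identity git will attach to your commits
# ("Your Name" and "you@example.com" are placeholders).
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

# Confirm the result:
git config --global --get user.email
```

Use the same address as your GitHub account (or your GitHub-provided noreply address) so commits are linked to your profile.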
To get started, [fork](https://help.github.com/articles/fork-a-repo) the void-linux `void-packages` git repository on GitHub and clone it:
@@ -23,9 +47,24 @@ To keep your forked repository up to date, setup the `upstream` remote to pull i
$ git remote add upstream https://github.com/void-linux/void-packages.git
$ git pull --rebase upstream master
This can also be done with the `github-cli` tool:
$ gh repo fork void-linux/void-packages
$ gh repo clone <user>/void-packages
This automatically sets up the `upstream` remote, so `git pull --rebase upstream master` can still be used to keep your fork up-to-date.
Using the GitHub web editor for making changes is strongly discouraged, because you will need to clone the repo anyway to edit and test your changes.
Using the `master` branch of your fork for contributing is also strongly discouraged.
It can cause many issues with updating your pull request (also called a PR), and with having multiple PRs open at once.
To create a new branch:
$ git checkout master -b <a-descriptive-name>
### Creating a new template
You can use the helper tool `xnew`, from the [xtools](https://github.com/leahneukirchen/xtools) package, to create new templates:
$ xnew pkgname subpkg1 subpkg2 ...
@@ -33,44 +72,58 @@ Templates must have the name `void-packages/srcpkgs/<pkgname>/template`, where `
For deeper insights on the contents of template files, please read the [manual](./Manual.md), and be sure to browse the existing template files in the `srcpkgs` directory of this repository for concrete examples.
### Updating a template
At minimum, a template update will consist of changing `version` and `checksum`, if there was an upstream version change, and/or `revision`, if a template-specific change (e.g. patch, correction, etc.) is needed.
Other changes to the template may be needed depending on what changes the upstream has made.
The checksum can be updated automatically with the `xgensum` helper from the [xtools](https://github.com/leahneukirchen/xtools) package:
$ xgensum -i <pkgname>
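For illustration, a typical version update touches only a few fields of the template; the package name and values here are hypothetical:

```shell
# srcpkgs/foo/template (hypothetical package "foo")
pkgname=foo
version=1.2.0   # bumped from 1.1.0 for the new upstream release
revision=1      # reset to 1 on every version update
# checksum: regenerate with `xgensum -i foo` after changing version
```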
### Committing your changes
After making your changes, please check that the package builds successfully. From the top level directory of your local copy of the `void-packages` repository, run:
$ ./xbps-src pkg <pkgname>
Your package must build successfully for at least x86, but we recommend trying a cross-build for armv6l* as well, e.g.:
$ ./xbps-src -a armv6l pkg <pkgname>
When building for `x86_64*` or `i686`, building with the `-Q` flag or with `XBPS_CHECK_PKGS=yes` set in `etc/conf` (to run the check phase) is strongly encouraged.
Also, new packages and updates will not be accepted unless they have been runtime tested by installing and running the package.
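One way to runtime-test is to install the freshly built package from the local repository that `xbps-src` creates; this is a sketch, with `<pkgname>` standing in for your package (the `xi` shortcut comes from the same xtools package):

```shell
# install <pkgname> from the local repository created by xbps-src
sudo xbps-install --repository=hostdir/binpkgs <pkgname>
# or, with xtools installed:
xi <pkgname>
```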
When you've finished working on the template file, please check it with the `xlint` helper from the [xtools](https://github.com/leahneukirchen/xtools) package:
$ xlint template
If `xlint` reports any issues, resolve them before committing.
Once you have made and verified your changes to the package template and/or other files, make one commit per package (including all changes to its sub-packages). Each commit message should have one of the following formats:
* for new packages, use `New package: <pkgname>-<version>` ([example](https://github.com/void-linux/void-packages/commit/8ed8d41c40bf6a82cf006c7e207e05942c15bff8)).
* for package updates, use `<pkgname>: update to <version>.` ([example](https://github.com/void-linux/void-packages/commit/c92203f1d6f33026ae89f3e4c1012fb6450bbac1)).
* for template modifications without a version change, use `<pkgname>: <reason>` ([example](https://github.com/void-linux/void-packages/commit/ff39c912d412717d17232de9564f659b037e95b5)).
* for package removals, use `<pkgname>: remove package` and include the removal reason in the commit body ([example](https://github.com/void-linux/void-packages/commit/4322f923bdf5d4e0eb36738d4f4717d72d0a0ca4)).
* for changes to any other file, use `<filename>: <reason>` ([example](https://github.com/void-linux/void-packages/commit/e00bea014c36a70d60acfa1758514b0c7cb0627d), [example](https://github.com/void-linux/void-packages/commit/93bf159ce10d8e474da5296e5bc98350d00c6c82), [example](https://github.com/void-linux/void-packages/commit/dc62938c67b66a7ff295eab541dc37b92fb9fb78), [example](https://github.com/void-linux/void-packages/commit/e52317e939d41090562cf8f8131a68772245bdde)).
If you want to describe your changes in more detail, explain in the commit body (separated from the first line with a blank line) ([example](https://github.com/void-linux/void-packages/commit/f1c45a502086ba1952f23ace9084a870ce437bc6)).
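As a concrete sketch of the new-package format, using a throwaway repository and a hypothetical package "foo" so the commands can be run anywhere:

```shell
# Demo in a throwaway repository; in practice you commit from your
# void-packages clone. Package "foo" is hypothetical.
cd "$(mktemp -d)"
git init -q .
git config user.name "Your Name"
git config user.email "you@example.com"

mkdir -p srcpkgs/foo
echo 'pkgname=foo' > srcpkgs/foo/template

git add srcpkgs/foo
# first -m is the subject; the second -m becomes the commit body
git commit -q -m "New package: foo-1.0" -m "Optional details go in the body."

git log -1 --format=%s   # prints: New package: foo-1.0
```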
`xbump`, available in the [xtools](https://github.com/leahneukirchen/xtools) package, can be used to commit a new or updated package:
$ xbump <pkgname> <git commit options>
`xrevbump`, also available in the [xtools](https://github.com/leahneukirchen/xtools) package, can be used to commit a template modification for a package:
$ xrevbump '<message>' <pkgnames...>
`xbump` and `xrevbump` will use `git commit` to commit the changes with the appropriate commit message. For more fine-grained control over the commit, specific options can be passed to `git commit` by adding them after the package name.
### Starting a pull request
@@ -123,9 +176,31 @@ Once you have applied all requested changes, the reviewers will merge your reque
If the pull request becomes inactive for some days, the reviewers may or may not warn you when they are about to close it.
If it stays inactive further, it will be closed.
Please abstain from temporarily closing a pull request while revising the templates. Instead, leave a comment on the PR describing what still needs work, [mark it as a draft](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/changing-the-stage-of-a-pull-request#converting-a-pull-request-to-a-draft), or add "[WIP]" to the PR title. Only close your pull request if you're sure you don't want your changes to be included.
#### Publishing the package
Once the reviewers have merged the pull request, our [build server](http://build.voidlinux.org) is automatically triggered and builds
all packages in the pull request for all supported platforms. Upon completion, the packages are available to all Void Linux users.
## Testing Pull Requests
While it is the responsibility of the PR creator to test changes before submitting them, one person can't test all configuration options, use cases, hardware, etc.
Testing new package submissions and updates is always helpful, and is a great way to get started with contributing.
First, [clone the repository](https://github.com/void-linux/void-packages#quick-start) if you haven't done so already.
Then check out the pull request, either with `github-cli`:
$ gh pr checkout <number>
Or with `git`:
If your local void-packages repository is cloned from your fork, you may need to add the main repository as a remote first:
$ git remote add upstream https://github.com/void-linux/void-packages.git
Then fetch and check out the PR (replacing `<remote>` with either `origin` or `upstream`):
$ git fetch <remote> pull/<number>/head:<branch-name>
$ git checkout <branch-name>
Then [build and install](https://github.com/void-linux/void-packages#building-packages) the package and test its functionality.


@@ -6,7 +6,6 @@ packages for XBPS, the `Void Linux` native packaging system.
*Table of Contents*
* [Introduction](#Introduction)
* [Package build phases](#buildphase)
* [Package naming conventions](#namingconventions)
* [Libraries](#libs)
@@ -62,6 +61,7 @@ packages for XBPS, the `Void Linux` native packaging system.
* [kernel-hooks](#triggers_kernel_hooks)
* [mimedb](#triggers_mimedb)
* [mkdirs](#triggers_mkdirs)
* [openjdk-profile](#triggers_openjdk_profile)
* [pango-modules](#triggers_pango_module)
* [pycompile](#triggers_pycompile)
* [register-shell](#triggers_register_shell)
@@ -123,38 +123,6 @@ If everything went fine after running
a binary package named `foo-1.0_1.<arch>.xbps` will be generated in the local repository
`hostdir/binpkgs`.
<a id="buildphase"></a>
### Package build phases
@@ -427,6 +395,8 @@ in this directory such as `${XBPS_BUILDDIR}/${wrksrc}`.
- `XBPS_RUST_TARGET` The target architecture triplet used by `rustc` and `cargo`.
- `XBPS_BUILD_ENVIRONMENT` Enables continuous-integration-specific operations. Set to `void-packages-ci` if in continuous integration.
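For example, a template could branch on this variable to work around CI-only limitations; the snippet below is a sketch, and `--exclude flaky-test` is a made-up argument for whatever test runner the package uses:

```shell
# hypothetical do_check() fragment: loosen tests only on CI builders
if [ "$XBPS_BUILD_ENVIRONMENT" = "void-packages-ci" ]; then
	make_check_args+=" --exclude flaky-test"
fi
```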
<a id="available_vars"></a>
### Available variables
@@ -463,7 +433,7 @@ the generated `binary packages` have been modified.
- `short_desc` A string with a brief description for this package. Max 72 chars.
- `version` A string with the package version. Must not contain dashes or underscore
and at least one digit is required. Shell's variable substitution usage is not allowed.
Neither `pkgname` nor `version` should contain special characters which make it
necessary to quote them, so they shouldn't be quoted in the template.
@@ -542,13 +512,14 @@ can be specified by prepending a commercial at (@).
For tarballs you can find the contents checksum by using the command
`tar xf <tarball.ext> --to-stdout | sha256sum`.
- `wrksrc` The directory name where the package sources are extracted, set to `${pkgname}-${version}`.
If the top level directory of a package's `distfile` is different from the default, `wrksrc` must be set to the top level directory name inside the archive.
- `build_wrksrc` A directory relative to `${wrksrc}` that will be used when building the package.
- `create_wrksrc` Usually, after extraction, if there are multiple top-level files and/or
directories, or no directories at all, the top-level files and directories
will be wrapped inside one more layer of directory.
Set `create_wrksrc` to force this behaviour.
- `build_style` This specifies the `build method` for a package. Read below to know more
about the available package `build methods` or the effect of leaving it unset.
@@ -577,10 +548,8 @@ build methods. Unset by default.
`${build_style}` is set to `configure`, `gnu-configure` or `gnu-makefile`
build methods. Unset by default.
- `make_install_args` The arguments to be passed in to `${make_cmd}` at the `install`
phase if `${build_style}` is set to `configure`, `gnu-configure` or `gnu-makefile` build methods.
- `make_build_target` The build target. If `${build_style}` is set to `configure`, `gnu-configure`
or `gnu-makefile`, this is the target passed to `${make_cmd}` in the build phase;
@@ -600,6 +569,9 @@ path of the Python wheel produced by the build phase that will be installed; whe
`python-pep517` build style will look for a wheel matching the package name and version in the
current directory with respect to the install.
- `make_check_pre` The expression in front of `${make_cmd}`. This can be used for wrapper commands
or for setting environment variables for the check command. By default empty.
- `patch_args` The arguments to be passed in to the `patch(1)` command when applying
patches to the package sources during `do_patch()`. Patches are stored in
`srcpkgs/<pkgname>/patches` and must be in `-p1` format. By default set to `-Np1`.
@@ -609,6 +581,11 @@ and `XBPS_MAKEJOBS` will be set to 1. If a package does not work well with `XBPS
but still has a mechanism to build in parallel, set `disable_parallel_build` and
use `XBPS_ORIG_MAKEJOBS` (which holds the original value of `XBPS_MAKEJOBS`) in the template.
- `disable_parallel_check` If set tests for the package won't be built and run in parallel
and `XBPS_MAKEJOBS` will be set to 1. If a package does not work well with `XBPS_MAKEJOBS`
but still has a mechanism to run checks in parallel, set `disable_parallel_check` and
use `XBPS_ORIG_MAKEJOBS` (which holds the original value of `XBPS_MAKEJOBS`) in the template.
- `make_check` Sets the cases in which the `check` phase is run.
This option has to be accompanied by a comment explaining why the tests fail.
Allowed values:
@@ -653,7 +630,7 @@ debugging symbols. Files can be given by full path or by filename.
- `noshlibprovides` If set, the ELF binaries won't be inspected to collect the provided
sonames in shared libraries.
- `noverifyrdeps` If set, the ELF binaries and shared libraries won't be inspected to collect
their reverse dependencies. You need to specify all dependencies in the `depends` when you
need to set this.
@@ -693,7 +670,7 @@ This appends to the generated file rather than replacing it.
- `nopie` Only needs to be set to something to make active, disables building the package with hardening
features (PIE, relro, etc). Not necessary for most packages.
- `nopie_files` White-space separated list of ELF binaries that won't be checked
for PIE. Files must be given by full path.
- `reverts` xbps supports a unique feature which allows downgrading from broken
@@ -779,7 +756,7 @@ A special value `noarch` used to be available, but has since been removed.
So far, we have listed four types of `depends` variables: `hostmakedepends`,
`makedepends`, `checkdepends` and `depends`. These different kinds of variables
are necessary because `xbps-src` supports cross compilation and to avoid
installing unnecessary packages in the build environment.
During a build process, there are programs that must be _run_ on the host, such
as `yacc` or the C compiler. The packages that contain these programs should be
@@ -1124,9 +1101,9 @@ Current working directory for functions is set as follows:
- For do_fetch, post_fetch: `XBPS_BUILDDIR`.
- For do_extract through do_patch: `wrksrc`.
- For post_patch through post_install: `build_wrksrc`
if it is defined, otherwise `wrksrc`.
<a id="build_options"></a>
@@ -1276,8 +1253,8 @@ declaring a virtual name and version in the `${provides}` template variable (e.g
specific provider can declare a dependency on the virtual package name with the prefix `virtual?`
(e.g., `depends="virtual?vpkg-0.1_1"`). When a package is built by `xbps-src`, providers for any
virtual packages will be confirmed to exist and will be built if necessary. A map from virtual
packages to their default providers is defined in `etc/defaults.virtual`. Individual mappings can be
overridden by local preferences in `etc/virtual`. Comments in `etc/defaults.virtual` provide more
information on this map.
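The mechanism above can be sketched with two hypothetical template fragments:

```shell
# template of a provider package: declares the virtual package it supplies
provides="vpkg-0.1_1"

# template of a consumer package: accepts any provider of vpkg
depends="virtual?vpkg-0.1_1"
```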
<a id="install_remove_files"></a>
@@ -1406,6 +1383,14 @@ If the service requires directories in parts of the system that are not generall
temporary filesystems. Then use the `make_dirs` variable in the template to create
those directories when the package is installed.
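As a sketch, a service that needs a state directory under `/var/lib` might declare (the path, modes, and the `_foo` user are hypothetical):

```shell
# create /var/lib/foo (mode 0750, owned by _foo:_foo) at install time
make_dirs="/var/lib/foo 0750 _foo _foo"
```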
If the package installs a systemd service file or other unit, leave it in place as a
reference point so long as including it has no negative side effects.
Examples of when *not* to install systemd units:
1. When doing so changes runtime behavior of the packaged software.
2. When it is done via a compile time flag that also changes build dependencies.
<a id="32bit_pkgs"></a>
### 32bit packages
@@ -1585,12 +1570,11 @@ recursively by the target python version. This differs from `pycompile_module` i
path may be specified. Example: `pycompile_dirs="usr/share/foo"`.
- `python_version`: this variable expects the supported Python major version.
In most cases the version is inferred from the shebang, install path or build style.
Only required for some multi-language
applications (e.g., the application is written in C while the command is
written in Python) or just single Python file ones that live in `/usr/bin`.
Also, a set of useful variables are defined to use in the templates:
| Variable | Value |
@@ -1628,6 +1612,7 @@ The following template variables influence how Go packages are built:
any go.mod files, `default` to use Go's default behavior, or anything
accepted by `go build -mod MODE`. Defaults to `vendor` if there's
a vendor directory, otherwise `default`.
- `go_ldflags`: Arguments to pass to the linking steps of the go tool.
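A sketch of its use in a hypothetical template, embedding the package version into the binary via Go's standard `-X` linker flag:

```shell
# hypothetical Go package template fragment
go_ldflags="-X main.version=${version}"
```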
The following environment variables influence how Go packages are built:
@@ -1997,6 +1982,13 @@ During removal it will delete the directory using `rmdir`.
To include this trigger use the `make_dirs` variable, as the trigger won't do anything
unless it is defined.
<a id="triggers_openjdk_profile"></a>
#### openjdk-profile
The openjdk-profile trigger is responsible for creating an entry in /etc/profile.d that
sets the `JAVA_HOME` environment variable to the currently-selected alternative for
`/usr/bin/java` on installation. This trigger must be manually requested.
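Since the trigger must be requested manually, a sketch of the relevant template line (for a hypothetical JDK package):

```shell
# request the openjdk-profile trigger explicitly
triggers="openjdk-profile"
```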
<a id="triggers_pango_module"></a>
#### pango-modules


@@ -3,3 +3,4 @@ if [ "$CROSS_BUILD" ]; then
else
	export WX_CONFIG=/usr/bin/wx-config-gtk3
fi
configure_args+=" -DwxWidgets_CONFIG_EXECUTABLE=${WX_CONFIG} "


@@ -0,0 +1,16 @@
# fix building non-pure-python modules on cross
if [ -n "$CROSS_BUILD" ]; then
export PYPREFIX="$XBPS_CROSS_BASE"
export CFLAGS+=" -I${XBPS_CROSS_BASE}/${py3_inc} -I${XBPS_CROSS_BASE}/usr/include"
export LDFLAGS+=" -L${XBPS_CROSS_BASE}/${py3_lib} -L${XBPS_CROSS_BASE}/usr/lib"
export CC="${XBPS_CROSS_TRIPLET}-gcc -pthread $CFLAGS $LDFLAGS"
export LDSHARED="${CC} -shared $LDFLAGS"
export PYTHON_CONFIG="${XBPS_CROSS_BASE}/usr/bin/python3-config"
export PYTHONPATH="${XBPS_CROSS_BASE}/${py3_lib}"
for f in ${XBPS_CROSS_BASE}/${py3_lib}/_sysconfigdata_*; do
[ -f "$f" ] || continue
f=${f##*/}
_PYTHON_SYSCONFIGDATA_NAME=${f%.py}
done
[ -n "$_PYTHON_SYSCONFIGDATA_NAME" ] && export _PYTHON_SYSCONFIGDATA_NAME
fi
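The sysconfigdata lookup above can be exercised on its own. A minimal sketch, using a throwaway directory in place of `${XBPS_CROSS_BASE}/${py3_lib}` (the module filename below is an invented example):

```shell
# Derive _PYTHON_SYSCONFIGDATA_NAME the way the build-helper does:
# take the last matching file, strip the directory and the .py suffix.
tmp=$(mktemp -d)
touch "$tmp/_sysconfigdata__linux_x86_64-linux-gnu.py"

name=
for f in "$tmp"/_sysconfigdata_*; do
    [ -f "$f" ] || continue   # skip the unexpanded glob when nothing matches
    f=${f##*/}                # strip the directory
    name=${f%.py}             # strip the .py suffix
done

echo "$name"                  # -> _sysconfigdata__linux_x86_64-linux-gnu
rm -rf "$tmp"
```

The `[ -f "$f" ] || continue` guard is what keeps the helper from exporting a bogus name when the glob matches nothing.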

View file

@@ -3,24 +3,24 @@
 #
 do_build() {
-    : ${make_cmd:=cargo}
+    : ${make_cmd:=cargo auditable}

     ${make_cmd} build --release --target ${RUST_TARGET} ${configure_args}
 }

 do_check() {
-    : ${make_cmd:=cargo}
+    : ${make_cmd:=cargo auditable}

-    ${make_cmd} test --release --target ${RUST_TARGET} ${configure_args} \
+    ${make_check_pre} ${make_cmd} test --release --target ${RUST_TARGET} ${configure_args} \
         ${make_check_args}
 }

 do_install() {
-    : ${make_cmd:=cargo}
+    : ${make_cmd:=cargo auditable}
     : ${make_install_args:=--path .}

     ${make_cmd} install --target ${RUST_TARGET} --root="${DESTDIR}/usr" \
-        --locked ${configure_args} ${make_install_args}
+        --offline --locked ${configure_args} ${make_install_args}

     rm -f "${DESTDIR}"/usr/.crates.toml
     rm -f "${DESTDIR}"/usr/.crates2.json

View file

@@ -54,6 +54,14 @@ _EOF
     cmake_args+=" -DCMAKE_INSTALL_PREFIX=/usr"
     cmake_args+=" -DCMAKE_BUILD_TYPE=None"
     cmake_args+=" -DCMAKE_INSTALL_LIBDIR=lib${XBPS_TARGET_WORDSIZE}"
+    cmake_args+=" -DCMAKE_INSTALL_SYSCONFDIR=/etc"
+
+    if [ "$CROSS_BUILD" ]; then
+        cmake_args+=" -DQT_HOST_PATH=/usr"
+        # QT_HOST_PATH isn't enough in my system,
+        # which have binfmts support on and off
+        cmake_args+=" -DQT_HOST_PATH_CMAKE_DIR=/usr/lib/cmake"
+    fi

     if [[ $build_helper = *"qemu"* ]]; then
         echo "SET(CMAKE_CROSSCOMPILING_EMULATOR /usr/bin/qemu-${XBPS_TARGET_QEMU_MACHINE}-static)" \

@@ -116,7 +124,7 @@ do_check() {
     : ${make_check_target:=test}

-    ${make_cmd} ${make_check_args} ${make_check_target}
+    ${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
 }

 do_install() {

View file

@@ -29,7 +29,7 @@ do_check() {
     : ${make_cmd:=make}
     : ${make_check_target:=check}

-    ${make_cmd} ${make_check_args} ${make_check_target}
+    ${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
 }

 do_install() {

View file

@@ -30,7 +30,7 @@ do_check() {
     : ${make_cmd:=make}
     : ${make_check_target:=check}

-    ${make_cmd} ${make_check_args} ${make_check_target}
+    ${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
 }

 do_install() {

View file

@@ -9,8 +9,10 @@ do_build() {
             CC="$CC" CXX="$CXX" LD="$LD" AR="$AR" RANLIB="$RANLIB" \
             CPP="$CPP" AS="$AS" OBJCOPY="$OBJCOPY" OBJDUMP="$OBJDUMP" \
             CFLAGS="$CFLAGS" CXXFLAGS="$CXXFLAGS" LDFLAGS="$LDFLAGS" \
+            PREFIX=/usr prefix=/usr \
             ${makejobs} ${make_build_args} ${make_build_target}
     else
+        export PREFIX=/usr prefix=/usr
         ${make_cmd} ${makejobs} ${make_build_args} ${make_build_target}
     fi
 }

@@ -30,12 +32,12 @@ do_check() {
     : ${make_cmd:=make}
     : ${make_check_target:=check}

-    ${make_cmd} ${make_check_args} ${make_check_target}
+    ${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
 }

 do_install() {
     : ${make_cmd:=make}
     : ${make_install_target:=install}

-    ${make_cmd} STRIP=true PREFIX=/usr DESTDIR=${DESTDIR} ${make_install_args} ${make_install_target}
+    ${make_cmd} STRIP=true PREFIX=/usr prefix=/usr DESTDIR=${DESTDIR} ${make_install_args} ${make_install_target}
 }

View file

@@ -22,6 +22,13 @@ do_configure() {
 }

 do_build() {
+    # remove -s and -w from go_ldflags, we should let xbps-src strip binaries itself
+    for wd in $go_ldflags; do
+        if [ "$wd" == "-s" ] || [ "$wd" == "-w" ]; then
+            msg_error "$pkgname: remove -s and -w from go_ldflags\n"
+        fi
+    done
+
     go_package=${go_package:-$go_import_path}
     # Build using Go modules if there's a go.mod file
     if [ "${go_mod_mode}" != "off" ] && [ -f go.mod ]; then
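The word scan above rejects templates that strip Go binaries at link time. A standalone sketch of the same check, with `msg_error` (an xbps-src internal) replaced by a plain return code for testing:

```shell
# Return non-zero when go_ldflags contains -s or -w; xbps-src strips
# binaries itself, so these flags must not appear in templates.
check_go_ldflags() {
    local wd
    for wd in $1; do
        if [ "$wd" = "-s" ] || [ "$wd" = "-w" ]; then
            return 1
        fi
    done
    return 0
}

check_go_ldflags "-X main.version=1.0" && echo "ok"
check_go_ldflags "-s -w" || echo "rejected"
```

Note the scan is word-based, so `-s` embedded in another argument (e.g. `-X main.flag=-s`) would not trip it; only standalone words are caught.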

View file

@@ -103,7 +103,7 @@ do_configure() {
     export AR="gcc-ar"

     # unbuffered output for continuous logging
-    PYTHONUNBUFFERED=1 ${meson_cmd} \
+    PYTHONUNBUFFERED=1 ${meson_cmd} setup \
         --prefix=/usr \
         --libdir=/usr/lib${XBPS_TARGET_WORDSIZE} \
         --libexecdir=/usr/libexec \

@@ -138,7 +138,7 @@ do_check() {
     : ${make_check_target:=test}
     : ${meson_builddir:=build}

-    ${make_cmd} -C ${meson_builddir} ${makejobs} ${make_check_args} ${make_check_target}
+    ${make_check_pre} ${make_cmd} -C ${meson_builddir} ${makejobs} ${make_check_args} ${make_check_target}
 }

 do_install() {

View file

@@ -41,7 +41,7 @@ do_check() {
     if [ ! -x ./Build ]; then
         msg_error "$pkgver: cannot find ./Build script!\n"
     fi
-    ./Build test
+    ${make_check_pre} ./Build test
 }

 do_install() {

View file

@@ -79,7 +79,7 @@ do_check() {
     : ${make_cmd:=make}
     : ${make_check_target:=test}

-    ${make_cmd} ${make_check_args} ${make_check_target}
+    ${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
 }

 do_install() {

View file

@@ -49,7 +49,7 @@ do_check() {
             fi
         fi

-        python${pyver} setup.py ${make_check_target:-test} ${make_check_args}
+        ${make_check_pre} python${pyver} setup.py ${make_check_target:-test} ${make_check_args}

         rm build
     done
 }

View file

@@ -3,30 +3,18 @@
 #
 do_build() {
-    if [ -n "$CROSS_BUILD" ]; then
-        PYPREFIX="$XBPS_CROSS_BASE"
-        CFLAGS+=" -I${XBPS_CROSS_BASE}/${py3_inc} -I${XBPS_CROSS_BASE}/usr/include"
-        LDFLAGS+=" -L${XBPS_CROSS_BASE}/${py3_lib} -L${XBPS_CROSS_BASE}/usr/lib"
-        CC="${XBPS_CROSS_TRIPLET}-gcc -pthread $CFLAGS $LDFLAGS"
-        LDSHARED="${CC} -shared $LDFLAGS"
-        for f in ${XBPS_CROSS_BASE}/${py3_lib}/_sysconfigdata_*; do
-            f=${f##*/}
-            _PYTHON_SYSCONFIGDATA_NAME=${f%.py}
-        done
-        env CC="$CC" LDSHARED="$LDSHARED" \
-            PYPREFIX="$PYPREFIX" CFLAGS="$CFLAGS" \
-            PYTHONPATH=${XBPS_CROSS_BASE}/${py3_lib} \
-            _PYTHON_SYSCONFIGDATA_NAME="$_PYTHON_SYSCONFIGDATA_NAME" \
-            LDFLAGS="$LDFLAGS" python3 setup.py build ${make_build_args}
-    else
-        python3 setup.py build ${make_build_args}
-    fi
+    python3 setup.py build ${make_build_args}
 }

 do_check() {
+    local testjobs
     if python3 -c 'import pytest' >/dev/null 2>&1; then
+        if python3 -c 'import xdist' >/dev/null 2>&1; then
+            testjobs="-n $XBPS_MAKEJOBS"
+        fi
         PYTHONPATH="$(cd build/lib* && pwd)" \
-            python3 -m pytest ${make_check_args} ${make_check_target}
+            ${make_check_pre} \
+            python3 -m pytest ${testjobs} ${make_check_args} ${make_check_target}
     else
         # Fall back to deprecated setup.py test orchestration without pytest
         if [ -z "$make_check_target" ]; then

@@ -37,28 +25,10 @@ do_check() {
         fi
         : ${make_check_target:=test}
-        python3 setup.py ${make_check_target} ${make_check_args}
+        ${make_check_pre} python3 setup.py ${make_check_target} ${make_check_args}
     fi
 }

 do_install() {
-    if [ -n "$CROSS_BUILD" ]; then
-        PYPREFIX="$XBPS_CROSS_BASE"
-        CFLAGS+=" -I${XBPS_CROSS_BASE}/${py3_inc} -I${XBPS_CROSS_BASE}/usr/include"
-        LDFLAGS+=" -L${XBPS_CROSS_BASE}/${py3_lib} -L${XBPS_CROSS_BASE}/usr/lib"
-        CC="${XBPS_CROSS_TRIPLET}-gcc -pthread $CFLAGS $LDFLAGS"
-        LDSHARED="${CC} -shared $LDFLAGS"
-        for f in ${XBPS_CROSS_BASE}/${py3_lib}/_sysconfigdata_*; do
-            f=${f##*/}
-            _PYTHON_SYSCONFIGDATA_NAME=${f%.py}
-        done
-        env CC="$CC" LDSHARED="$LDSHARED" \
-            PYPREFIX="$PYPREFIX" CFLAGS="$CFLAGS" \
-            PYTHONPATH=${XBPS_CROSS_BASE}/${py3_lib} \
-            _PYTHON_SYSCONFIGDATA_NAME="$_PYTHON_SYSCONFIGDATA_NAME" \
-            LDFLAGS="$LDFLAGS" python3 setup.py \
-            install --prefix=/usr --root=${DESTDIR} ${make_install_args}
-    else
-        python3 setup.py install --prefix=/usr --root=${DESTDIR} ${make_install_args}
-    fi
+    python3 setup.py install --prefix=/usr --root=${DESTDIR} ${make_install_args}
 }
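The new `do_check` only enables parallel pytest runs when the xdist plugin is actually importable, since bare pytest rejects the `-n` option. A standalone sketch of that probe (with `XBPS_MAKEJOBS` pinned here for illustration):

```shell
XBPS_MAKEJOBS=4
testjobs=

# Only pass -n to pytest when pytest-xdist is importable; otherwise
# pytest would abort on the unknown option.
if python3 -c 'import xdist' >/dev/null 2>&1; then
    testjobs="-n $XBPS_MAKEJOBS"
fi

echo "pytest args: ${testjobs:-<serial>}"
```

The probe-then-enable pattern keeps the default behavior safe on systems where the optional dependency is missing.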

View file

@@ -3,31 +3,31 @@
 #
 do_build() {
-    # No PEP517 build tool currently supports compiled extensions
-    # Thus, there is no need to accommodate cross compilation here
     : ${make_build_target:=.}
+    : ${make_build_args:=--no-isolation --wheel}

-    mkdir -p build
-    TMPDIR=build python3 -m pip wheel --no-deps --use-pep517 --no-clean \
-        --no-build-isolation ${make_build_args} ${make_build_target}
+    python3 -m build ${make_build_args} ${make_build_target}
 }

 do_check() {
+    local testjobs
     if python3 -c 'import pytest' >/dev/null 2>&1; then
-        python3 -m pytest ${make_check_args} ${make_check_target}
+        if python3 -c 'import xdist' >/dev/null 2>&1; then
+            testjobs="-n $XBPS_MAKEJOBS"
+        fi
+        ${make_check_pre} python3 -m pytest ${testjobs} ${make_check_args} ${make_check_target}
     else
-        msg_warn "Unable to determine tests for PEP517 Python templates"
+        msg_warn "Unable to determine tests for PEP517 Python templates\n"
         return 0
     fi
 }

 do_install() {
-    # As with do_build, no need to accommodate cross compilation here
-    : ${make_install_target:=${pkgname#python3-}-${version}-*-*-*.whl}
+    if [ -z "${make_install_target}" ]; then
+        # Default wheel name normalizes hyphens to underscores
+        local wheelbase="${pkgname#python3-}"
+        make_install_target="dist/${wheelbase//-/_}-${version}-*-*-*.whl"
+    fi

-    # If do_build was overridden, make sure the TMPDIR exists
-    mkdir -p build
-    TMPDIR=build python3 -m pip install --use-pep517 --prefix /usr \
-        --root ${DESTDIR} --no-deps --no-build-isolation \
-        --no-clean ${make_install_args} ${make_install_target}
+    python3 -m installer --destdir ${DESTDIR} \
+        ${make_install_args} ${make_install_target}
 }

View file

@@ -1,17 +1,12 @@
 #
-# This helper is for templates using Qt4/Qt5 qmake.
+# This helper is for templates using Qt5/Qt6 qmake.
 #
 do_configure() {
     local qmake
     local qmake_args
     if [ -x "/usr/lib/qt5/bin/qmake" ]; then
-        # Qt5 qmake
         qmake="/usr/lib/qt5/bin/qmake"
     fi
-    if [ -x "/usr/lib/qt/bin/qmake" ]; then
-        # Qt4 qmake
-        qmake="/usr/lib/qt/bin/qmake"
-    fi
     if [ -z "${qmake}" ]; then
         msg_error "${pkgver}: Could not find qmake - missing in hostmakedepends?\n"
     fi

View file

@@ -3,7 +3,7 @@
 #
 do_check() {
-    RAKULIB=lib prove -r -e raku t/
+    RAKULIB=lib ${make_check_pre} prove -r -e raku t/
 }

 do_install() {

View file

@@ -9,18 +9,27 @@ do_build() {
         CXXFLAGS="$CXXFLAGS" LINKFLAGS="$LDFLAGS" \
         cxxflags="$CXXFLAGS" linkflags="$LDFLAGS" \
         RANLIB="$RANLIB" ranlib="$RANLIB" \
-        prefix=/usr destdir=${DESTDIR} DESTDIR=${DESTDIR} \
+        prefix=/usr \
+        ${scons_use_destdir:+DESTDIR="${DESTDIR}"} \
+        ${scons_use_destdir:+destdir="${DESTDIR}"} \
         ${make_build_args} ${make_build_target}
 }

 do_install() {
     : ${make_cmd:=scons}
     : ${make_install_target:=install}
+    local _sandbox=
+    if [ -z "$scons_use_destdir" ]; then _sandbox=yes ; fi
     ${make_cmd} ${makejobs} CC=$CC CXX=$CXX CCFLAGS="$CFLAGS" \
         cc=$CC cxx=$CXX ccflags="$CFLAGS" \
         CXXFLAGS="$CXXFLAGS" LINKFLAGS="$LDFLAGS" \
         cxxflags="$CXXFLAGS" linkflags="$LDFLAGS" \
         RANLIB="$RANLIB" ranlib="$RANLIB" \
-        prefix=/usr destdir=${DESTDIR} DESTDIR=${DESTDIR} \
+        prefix=/usr \
+        ${scons_use_destdir:+DESTDIR="${DESTDIR}"} \
+        ${scons_use_destdir:+destdir="${DESTDIR}"} \
+        ${_sandbox:+--install-sandbox="${DESTDIR}"} \
         ${make_install_args} ${make_install_target}
 }

View file

@@ -5,7 +5,6 @@
 # required variables
 #
 # build_style=slashpackage
-# wrksrc=<category>
 # build_wrksrc=${pkgname}-${version}
 # distfiles=<download link>
 #

@@ -15,7 +14,6 @@
 # pkgname=daemontools
 # version=0.76
 # revision=1
-# wrksrc=admin
 # build_wrksrc=${pkgname}-${version}
 # build_style=slashpackage
 # short_desc="A collection of tools for managing UNIX services"

View file

@@ -1,8 +1,10 @@
 makedepends+=" R"
 depends+=" R"
-wrksrc="${XBPS_BUILDDIR}/${pkgname#R-cran-}"
+create_wrksrc=required
+build_wrksrc="${pkgname#R-cran-}"

 # default to cran
 if [ -z "$distfiles" ]; then
-    distfiles="https://cran.r-project.org/src/contrib/${pkgname#R-cran-}_${version//r/-}.tar.gz"
+    distfiles="https://cran.r-project.org/src/contrib/${pkgname#R-cran-}_${version//r/-}.tar.gz
+     https://cran.r-project.org/src/contrib/Archive/${pkgname#R-cran-}/${pkgname#R-cran-}_${version//r/-}.tar.gz"
 fi

View file

@@ -1,5 +1,9 @@
 hostmakedepends+=" cargo"

+if ! [[ "$pkgname" =~ ^cargo-auditable(-bootstrap)?$ ]]; then
+    hostmakedepends+=" cargo-auditable"
+fi
+
 if [ "$CROSS_BUILD" ]; then
     makedepends+=" rust-std"
 fi
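The regex gate above prevents a dependency cycle: cargo-auditable cannot depend on itself (or its bootstrap variant) to build. A standalone sketch of the same check, with hypothetical package names:

```shell
# Return success when a package should pull in cargo-auditable as a
# host dependency, i.e. for everything except cargo-auditable itself
# and cargo-auditable-bootstrap (which would create a build cycle).
needs_auditable() {
    ! [[ "$1" =~ ^cargo-auditable(-bootstrap)?$ ]]
}

needs_auditable ripgrep && echo "add cargo-auditable"
needs_auditable cargo-auditable || echo "skip: would self-depend"
needs_auditable cargo-auditable-bootstrap || echo "skip: bootstrap variant"
```

The anchors `^...$` matter: without them a package merely containing the substring `cargo-auditable` in its name would also be skipped.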

View file

@@ -43,3 +43,8 @@ case "$XBPS_TARGET_MACHINE" in
     *-musl) export GOCACHE="${XBPS_HOSTDIR}/gocache-muslc" ;;
     *) export GOCACHE="${XBPS_HOSTDIR}/gocache-glibc" ;;
 esac
+
+case "$XBPS_TARGET_MACHINE" in
+    # https://go.dev/cl/421935
+    i686*) export CGO_CFLAGS="$CGO_CFLAGS -fno-stack-protector" ;;
+esac

View file

@@ -1,2 +1,3 @@
 lib32disabled=yes
 makedepends+=" python3"
+build_helper+=" python3"

View file

@@ -1,2 +1,3 @@
-hostmakedepends+=" python3-pip"
+hostmakedepends+=" python3-build python3-installer"
 lib32disabled=yes
+build_helper+=" python3"

View file

@@ -2,4 +2,3 @@
 hostmakedepends+=" rsync"
 # python_version isn't needed for everything either
 python_version=3
-create_wrksrc=yes

View file

@@ -1,6 +1,5 @@
 lib32disabled=yes
 nopie=yes
-create_wrksrc=yes
 nostrip_files+=" libcaf_single.a libgcc.a libgcov.a libgcc_eh.a
     libgnarl_pic.a libgnarl.a libgnat_pic.a libgnat.a libgmem.a"

View file

@@ -1,2 +1,16 @@
-CFLAGS="${CFLAGS} -fdebug-prefix-map=$wrksrc=."
-CXXFLAGS="${CXXFLAGS} -fdebug-prefix-map=$wrksrc=."
+local _wrksrc="$wrksrc${build_wrksrc:+/$build_wrksrc}"
+case "$build_style" in
+    cmake)
+        CFLAGS="${CFLAGS} -fdebug-prefix-map=$_wrksrc/${cmake_builddir:-build}=."
+        CXXFLAGS="${CXXFLAGS} -fdebug-prefix-map=$_wrksrc/${cmake_builddir:-build}=."
+        ;;
+    meson)
+        CFLAGS="${CFLAGS} -fdebug-prefix-map=$_wrksrc/${meson_builddir:-build}=."
+        CXXFLAGS="${CXXFLAGS} -fdebug-prefix-map=$_wrksrc/${meson_builddir:-build}=."
+        ;;
+    *)
+        CFLAGS="${CFLAGS} -fdebug-prefix-map=$_wrksrc=."
+        CXXFLAGS="${CXXFLAGS} -fdebug-prefix-map=$_wrksrc=."
+esac
+unset _wrksrc
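The `-fdebug-prefix-map=OLD=.` flag rewrites build-tree paths in debug info so packages are reproducible regardless of where they were built; the hook now has to pick the per-style build directory as well. A standalone sketch of that selection, with made-up `wrksrc`/`build_style` values:

```shell
# Hypothetical template values for illustration.
wrksrc=/builddir/foo-1.0
build_wrksrc=
build_style=cmake
cmake_builddir=
meson_builddir=

# Append /$build_wrksrc only when it is set, as the hook does.
_wrksrc="$wrksrc${build_wrksrc:+/$build_wrksrc}"
case "$build_style" in
    cmake) mapdir="$_wrksrc/${cmake_builddir:-build}" ;;
    meson) mapdir="$_wrksrc/${meson_builddir:-build}" ;;
    *)     mapdir="$_wrksrc" ;;
esac
CFLAGS="${CFLAGS} -fdebug-prefix-map=$mapdir=."

echo "$mapdir"   # -> /builddir/foo-1.0/build
```

For cmake and meson the compiler is invoked from inside the out-of-tree build directory, which is why those styles map the build directory rather than `$wrksrc` itself.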

View file

@@ -7,7 +7,7 @@ py2_lib="usr/lib/python${py2_ver}"
 py2_sitelib="${py2_lib}/site-packages"
 py2_inc="usr/include/python${py2_ver}"

-py3_ver="3.10"
+py3_ver="3.11"
 py3_abiver=""
 py3_lib="usr/lib/python${py3_ver}"
 py3_sitelib="${py3_lib}/site-packages"

View file

@@ -7,17 +7,24 @@ unset -v archs distfiles checksum build_style build_helper nocross broken
 unset -v configure_script configure_args wrksrc build_wrksrc create_wrksrc
 unset -v make_build_args make_check_args make_install_args
 unset -v make_build_target make_check_target make_install_target
-unset -v make_cmd meson_cmd gem_cmd fetch_cmd
+unset -v make_cmd meson_cmd gem_cmd fetch_cmd make_check_pre
 unset -v python_version stackage
 unset -v cmake_builddir meson_builddir
 unset -v meson_crossfile
 unset -v gemspec
 unset -v go_import_path go_package go_mod_mode
-unset -v patch_args disable_parallel_build keep_libtool_archives make_use_env
+unset -v patch_args disable_parallel_build disable_parallel_check
+unset -v keep_libtool_archives make_use_env
 unset -v reverts subpackages makedepends hostmakedepends checkdepends depends restricted
 unset -v nopie build_options build_options_default bootstrap repository reverts

 unset -v CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS LD_LIBRARY_PATH
 unset -v CC CXX CPP GCC LD AR AS RANLIB NM OBJDUMP OBJCOPY STRIP READELF PKG_CONFIG
+unset -v CMAKE_GENERATOR
+
+# build-helper python3
+unset -v PYPREFIX LDSHARED PYTHON_CONFIG PYTHONPATH _PYTHON_SYSCONFIGDATA_NAME
+
+# unset all $build_option_ variables
+unset -v "${!build_option_@}"

 # hooks/do-extract/00-distfiles
 unset -v skip_extraction

View file

@@ -57,8 +57,13 @@ vsed() {
         newdigest="$($XBPS_DIGEST_CMD "$f")"
         newdigest="${newdigest%% *}"

+        msgfunc=msg_warn
+        if [ -n "$XBPS_STRICT" ]; then
+            msgfunc=msg_error
+        fi
         if [ "$olddigest" = "$newdigest" ]; then
-            msg_warn "$pkgver: vsed: regex \"$rx\" didn't change file \"$f\"\n"
+            $msgfunc "$pkgver: vsed: regex \"$rx\" didn't change file \"$f\"\n"
         fi
         olddigest="${newdigest}"
     done

View file

@@ -3,7 +3,7 @@
 hook() {
     local srcdir="$XBPS_SRCDISTDIR/$pkgname-$version"
-    local f j curfile found extractdir
+    local f j curfile found extractdir innerdir num_dirs
     local TAR_CMD

     if [ -z "$distfiles" -a -z "$checksum" ]; then

@@ -20,10 +20,6 @@ hook() {
         fi
     done

-    if [ -n "$create_wrksrc" ]; then
-        mkdir -p "${wrksrc}" || msg_error "$pkgver: failed to create wrksrc.\n"
-    fi
-
     # Disable trap on ERR; the code is smart enough to report errors and abort.
     trap - ERR

@@ -31,6 +27,9 @@ hook() {
     [ -z "$TAR_CMD" ] && TAR_CMD="$(command -v tar)"
     [ -z "$TAR_CMD" ] && msg_error "xbps-src: no suitable tar cmd (bsdtar, tar)\n"

+    extractdir=$(mktemp -d "$XBPS_BUILDDIR/.extractdir-XXXXXXX") ||
+        msg_error "Cannot create temporary dir for do-extract\n"
+
     msg_normal "$pkgver: extracting distfile(s), please wait...\n"

     for f in ${distfiles}; do

@@ -73,12 +72,6 @@ hook() {
             *) msg_error "$pkgver: unknown distfile suffix for $curfile.\n";;
         esac

-        if [ -n "$create_wrksrc" ]; then
-            extractdir="$wrksrc"
-        else
-            extractdir="$XBPS_BUILDDIR"
-        fi
-
         case ${cursufx} in
         tar|txz|tbz|tlz|tgz|crate)
             $TAR_CMD -x --no-same-permissions --no-same-owner -f $srcdir/$curfile -C "$extractdir"

@@ -128,11 +121,7 @@ hook() {
             fi
             ;;
         txt)
-            if [ "$create_wrksrc" ]; then
-                cp -f $srcdir/$curfile "$extractdir"
-            else
-                msg_error "$pkgname: ${curfile##*.} files can only be extracted when create_wrksrc is set\n"
-            fi
+            cp -f $srcdir/$curfile "$extractdir"
             ;;
         7z)
             if command -v 7z &>/dev/null; then

@@ -150,16 +139,10 @@ hook() {
             fi
             ;;
         gem)
-            case "$TAR_CMD" in
-            *bsdtar)
-                $TAR_CMD -xOf $srcdir/$curfile data.tar.gz | \
-                    $TAR_CMD -xz -C "$extractdir" -s ",^,${wrksrc##*/}/," -f -
-                ;;
-            *)
-                $TAR_CMD -xOf $srcdir/$curfile data.tar.gz | \
-                    $TAR_CMD -xz -C "$extractdir" --transform="s,^,${wrksrc##*/}/,"
-                ;;
-            esac
+            innerdir="$extractdir/${wrksrc##*/}"
+            mkdir -p "$innerdir"
+            $TAR_CMD -xOf $srcdir/$curfile data.tar.gz |
+                $TAR_CMD -xz -C "$innerdir" -f -
             if [ $? -ne 0 ]; then
                 msg_error "$pkgver: extracting $curfile into $XBPS_BUILDDIR.\n"
             fi

@@ -169,4 +152,31 @@ hook() {
             ;;
         esac
     done
+
+    # find "$extractdir" -mindepth 1 -maxdepth 1 -printf '1\n' | wc -l
+    # However, it requires GNU's find
+    num_dirs=0
+    for f in "$extractdir"/* "$extractdir"/.*; do
+        if [ -e "$f" ] || [ -L "$f" ]; then
+            case "$f" in
+                */. | */..) ;;
+                *)
+                    innerdir="$f"
+                    num_dirs=$(( num_dirs + 1 ))
+                    ;;
+            esac
+        fi
+    done
+    rm -rf "$wrksrc"
+    if [ "$num_dirs" = 1 ] && [ -d "$innerdir" ] && [ -z "$create_wrksrc" ]; then
+        # rename the subdirectory (top-level of distfiles) to $wrksrc
+        mv "$innerdir" "$wrksrc" &&
+            rmdir "$extractdir"
+    elif [ "$num_dirs" -gt 1 ] || [ -n "$create_wrksrc" ]; then
+        # rename the tmpdir to wrksrc
+        mv "$extractdir" "$wrksrc"
+    else
+        mkdir -p "$wrksrc"
+    fi ||
+        msg_error "$pkgver: failed to move sources to $wrksrc\n"
 }
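The entry-counting loop added at the end of the hook deliberately avoids GNU `find` extensions so it stays portable: it globs both visible and hidden entries and skips `.` and `..` by hand. A self-contained sketch of the same counting logic against a throwaway directory:

```shell
# Count top-level entries the way the do-extract hook decides between
# "single inner directory" and "multiple entries" archive layouts.
tmp=$(mktemp -d)
mkdir "$tmp/pkg-1.0"   # simulate an archive with one top-level directory

num_dirs=0
innerdir=
for f in "$tmp"/* "$tmp"/.*; do
    if [ -e "$f" ] || [ -L "$f" ]; then     # skip unexpanded globs
        case "$f" in
            */. | */..) ;;                  # the second glob matches . and ..
            *)
                innerdir="$f"
                num_dirs=$(( num_dirs + 1 ))
                ;;
        esac
    fi
done

echo "$num_dirs"   # -> 1, so the inner dir would be renamed to $wrksrc
rm -rf "$tmp"
```

With exactly one entry the hook renames that directory to `$wrksrc`; with several (or with `create_wrksrc` set) it renames the whole temporary extract directory instead.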

View file

@@ -2,24 +2,6 @@
 # the $distfiles variable and then verifies its sha256 checksum comparing
 # its value with the one stored in the $checksum variable.

-# Get the checksum for $curfile at index $dfcount
-get_cksum() {
-    local curfile="$1" dfcount="$2" ckcount cksum i
-
-    ckcount=0
-    cksum=0
-    for i in ${checksum}; do
-        if [ $dfcount -eq $ckcount -a -n "$i" ]; then
-            cksum=$i
-        fi
-        ckcount=$((ckcount + 1))
-    done
-    if [ -z "$cksum" ]; then
-        msg_error "$pkgver: cannot find checksum for $curfile.\n"
-    fi
-    echo "$cksum"
-}
-
 # Return the checksum of the contents of a tarball
 contents_cksum() {
     local curfile="$1" cursufx cksum

@@ -110,9 +92,7 @@ contents_cksum() {

 # Verify the checksum for $curfile stored at $distfile and index $dfcount
 verify_cksum() {
-    local curfile="$1" distfile="$2" dfcount="$3" filesum cksum
-
-    cksum=$(get_cksum $curfile $dfcount)
+    local curfile="$1" distfile="$2" cksum="$3" filesum

     # If the checksum starts with an commercial at (@) it is the contents checksum
     if [ "${cksum:0:1}" = "@" ]; then

@@ -121,7 +101,7 @@ verify_cksum() {
         filesum=$(contents_cksum "$curfile")
         if [ "${cksum}" != "$filesum" ]; then
             echo
-            msg_red "SHA256 mismatch for '$curfile:'\n@$filesum\n"
+            msg_red "SHA256 mismatch for '${curfile}:'\n@${filesum}\n"
             errors=$((errors + 1))
         else
             msg_normal_append "OK.\n"

@@ -131,7 +111,7 @@ verify_cksum() {
         filesum=$(${XBPS_DIGEST_CMD} "$distfile")
         if [ "$cksum" != "$filesum" ]; then
             echo
-            msg_red "SHA256 mismatch for '$curfile:'\n$filesum\n"
+            msg_red "SHA256 mismatch for '${curfile}:'\n${filesum}\n"
             errors=$((errors + 1))
         else
             if [ ! -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" ]; then

@@ -145,22 +125,20 @@ verify_cksum() {

 # Link an existing cksum $distfile for $curfile at index $dfcount
 link_cksum() {
-    local curfile="$1" distfile="$2" dfcount="$3" filesum cksum
-
-    cksum=$(get_cksum $curfile $dfcount)
+    local curfile="$1" distfile="$2" cksum="$3"

     if [ -n "$cksum" -a -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" ]; then
         ln -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" "$distfile"
         msg_normal "$pkgver: using known distfile $curfile.\n"
+        return 0
     fi
+    return 1
 }

 try_mirrors() {
-    local curfile="$1" distfile="$2" dfcount="$3" subdir="$4" f="$5"
-    local filesum cksum basefile mirror path scheme
+    local curfile="$1" distfile="$2" cksum="$3" f="$4"
+    local filesum basefile mirror path scheme good

-    [ -z "$XBPS_DISTFILES_MIRROR" ] && return
+    [ -z "$XBPS_DISTFILES_MIRROR" ] && return 1
     basefile="${f##*/}"
-    cksum=$(get_cksum $curfile $dfcount)
     for mirror in $XBPS_DISTFILES_MIRROR; do
         scheme="file"
         if [[ $mirror == *://* ]]; then

@@ -179,28 +157,80 @@ try_mirrors() {
         fi
         if [[ "$mirror" == *voidlinux* ]]; then
             # For distfiles.voidlinux.* append the subdirectory
-            mirror="$mirror/$subdir"
+            mirror="$mirror/$pkgname-$version"
         fi
-        msg_normal "$pkgver: fetching distfile '$curfile' from '$mirror'...\n"
+        msg_normal "$pkgver: fetching distfile '$curfile' from mirror '$mirror'...\n"
         $fetch_cmd "$mirror/$curfile"
         # If basefile was not found, but a curfile file may exist, try to fetch it
-        if [ ! -f "$distfile" -a "$basefile" != "$curfile" ]; then
-            $fetch_cmd "$mirror/$basefile"
-        fi
+        # if [ ! -f "$distfile" -a "$basefile" != "$curfile" ]; then
+        #     msg_normal "$pkgver: fetching distfile '$basefile' from mirror '$mirror'...\n"
+        #     $fetch_cmd "$mirror/$basefile"
+        # fi
         [ ! -f "$distfile" ] && continue
         flock -n ${distfile}.part rm -f ${distfile}.part
         filesum=$(${XBPS_DIGEST_CMD} "$distfile")
-        [ "$cksum" == "$filesum" ] && break
+        if [ "$cksum" == "$filesum" ]; then
+            return 0
+        fi
         msg_normal "$pkgver: checksum failed - removing '$curfile'...\n"
         rm -f ${distfile}
     done
+    return 1
+}
+
+try_urls() {
+    local curfile="$1"
+    local good=
+    for i in ${_file_idxs["$curfile"]}; do
+        local cksum=${_checksums["$i"]}
+        local url=${_distfiles["$i"]}
+        # If distfile does not exist, download it from the original location.
+        if [[ "$FTP_RETRIES" && "${url}" =~ ^ftp:// ]]; then
+            max_retries="$FTP_RETRIES"
+        else
+            max_retries=1
+        fi
+        for retry in $(seq 1 1 $max_retries); do
+            if [ ! -f "$distfile" ]; then
+                if [ "$retry" == 1 ]; then
+                    msg_normal "$pkgver: fetching distfile '$curfile' from '$url'...\n"
+                else
+                    msg_normal "$pkgver: fetch attempt $retry of $max_retries...\n"
+                fi
+                flock "${distfile}.part" $fetch_cmd "$url"
+            fi
+        done
+        if [ ! -f "$distfile" ]; then
+            continue
+        fi
+        # distfile downloaded, verify sha256 hash.
+        flock -n "${distfile}.part" rm -f "${distfile}.part"
+        verify_cksum "$curfile" "$distfile" "$cksum"
+        return 0
+    done
+    return 1
 }

 hook() {
     local srcdir="$XBPS_SRCDISTDIR/$pkgname-$version"
     local dfcount=0 dfgood=0 errors=0 max_retries

-    if [ ! -d "$srcdir" ]; then
+    local -a _distfiles=($distfiles)
+    local -a _checksums=($checksum)
+    local -A _file_idxs
+
+    # Create a map from target file to index in _distfiles/_checksums
+    for i in ${!_distfiles[@]}; do
+        f="${_distfiles[$i]}"
+        curfile="${f#*>}"
+        curfile="${curfile##*/}"
+        _file_idxs["$curfile"]+=" $i"
+    done
+
+    if [[ ! -d "$srcdir" ]]; then
         mkdir -p -m775 "$srcdir"
         chgrp $(id -g) "$srcdir"
     fi

@@ -209,90 +239,72 @@ hook() {
     # Disable trap on ERR; the code is smart enough to report errors and abort.
     trap - ERR

     # Detect bsdtar and GNU tar (in that order of preference)
     TAR_CMD="$(command -v bsdtar)"
-    if [ -z "$TAR_CMD" ]; then
+    if [[ -z "$TAR_CMD" ]]; then
         TAR_CMD="$(command -v tar)"
     fi

     # Detect distfiles with obsolete checksum and purge them from the cache
-    for f in ${distfiles}; do
-        curfile="${f#*>}"
-        curfile="${curfile##*/}"
-        distfile="$srcdir/$curfile"
-
-        if [ -f "$distfile" ]; then
-            cksum=$(get_cksum $curfile $dfcount)
-            if [ "${cksum:0:1}" = "@" ]; then
-                cksum=${cksum:1}
-                filesum=$(contents_cksum "$distfile")
-            else
-                filesum=$(${XBPS_DIGEST_CMD} "$distfile")
-            fi
-            if [ "$cksum" = "$filesum" ]; then
-                dfgood=$((dfgood + 1))
-            else
-                inode=$(stat "$distfile" --printf "%i")
-                msg_warn "$pkgver: wrong checksum found for ${curfile} - purging\n"
-                find ${XBPS_SRCDISTDIR} -inum ${inode} -delete -print
-            fi
-        fi
-        dfcount=$((dfcount + 1))
+    for f in ${!_file_idxs[@]}; do
+        distfile="$srcdir/$f"
+        for i in ${_file_idxs["$f"]}; do
+            if [[ -f $distfile ]]; then
+                cksum=${_checksums["$i"]}
+                if [[ ${cksum:0:1} = @ ]]; then
+                    cksum=${cksum:1}
+                    filesum=$(contents_cksum "$distfile")
+                else
+                    filesum=$(${XBPS_DIGEST_CMD} "$distfile")
+                fi
+                if [[ $cksum = $filesum ]]; then
+                    dfgood=$((dfgood + 1))
+                else
+                    inode=$(stat "$distfile" --printf "%i")
+                    msg_warn "$pkgver: wrong checksum found for ${curfile} - purging\n"
+                    find ${XBPS_SRCDISTDIR} -inum ${inode} -delete -print
+                fi
+            fi
+            dfcount=$((dfcount + 1))
+        done
     done

     # We're done, if all distfiles were found and had good checksums
-    [ $dfcount -eq $dfgood ] && return
+    [[ $dfcount -eq $dfgood ]] && return

     # Download missing distfiles and verify their checksums
-    dfcount=0
-    for f in ${distfiles}; do
-        curfile="${f#*>}"
-        curfile="${curfile##*/}"
+    for curfile in ${!_file_idxs[@]}; do
         distfile="$srcdir/$curfile"
+        set -- ${_file_idxs["$curfile"]}
+        i="$1"

         # If file lock cannot be acquired wait until it's available.
-        while true; do
-            flock -w 1 ${distfile}.part true
-            [ $? -eq 0 ] && break
+        while ! flock -w 1 "${distfile}.part" true; do
             msg_warn "$pkgver: ${curfile} is already being downloaded, waiting for 1s ...\n"
         done
+
+        if [[ -f "$distfile" ]]; then
+            continue
+        fi
+
         # If distfile does not exist, try to link to it.
-        if [ ! -f "$distfile" ]; then
-            link_cksum $curfile $distfile $dfcount
+        if link_cksum "$curfile" "$distfile" "${_checksums[$i]}"; then
+            continue
         fi
+
         # If distfile does not exist, download it from a mirror location.
-        if [ ! -f "$distfile" ]; then
-            try_mirrors $curfile $distfile $dfcount $pkgname-$version $f
+        if try_mirrors "$curfile" "$distfile" "${_checksums[$i]}" "${_distfiles[$i]}"; then
+            continue
         fi
-        # If distfile does not exist, download it from the original location.
-        if [[ "$FTP_RETRIES" && "${f}" =~ ^ftp:// ]]; then
-            max_retries="$FTP_RETRIES"
-        else
-            max_retries=1
-        fi
-        for retry in $(seq 1 1 $max_retries); do
-            if [ ! -f "$distfile" ]; then
-                if [ "$retry" == 1 ]; then
-                    msg_normal "$pkgver: fetching distfile '$curfile'...\n"
-                else
-                    msg_normal "$pkgver: fetch attempt $retry of $max_retries...\n"
-                fi
-                flock "${distfile}.part" $fetch_cmd "$f"
-            fi
-        done
-        if [ ! -f "$distfile" ]; then
-            msg_error "$pkgver: failed to fetch $curfile.\n"
-        fi
-        # distfile downloaded, verify sha256 hash.
-        flock -n ${distfile}.part rm -f ${distfile}.part
-        verify_cksum $curfile $distfile $dfcount
-        dfcount=$((dfcount + 1))
+
+        if ! try_urls "$curfile"; then
+            msg_error "$pkgver: failed to fetch '$curfile'.\n"
+        fi
done done
unset TAR_CMD unset TAR_CMD
if [ $errors -gt 0 ]; then if [[ $errors -gt 0 ]]; then
msg_error "$pkgver: couldn't verify distfiles, exiting...\n" msg_error "$pkgver: couldn't verify distfiles, exiting...\n"
fi fi
} }
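In the loop above, a recorded checksum beginning with `@` denotes a contents checksum (`contents_cksum`) rather than a plain digest of the downloaded file. A minimal sketch of that dispatch, using a hypothetical recorded value:

```shell
# A leading "@" marks a contents checksum; strip it and switch the
# verification mode, as the purge loop does before comparing.
cksum="@deadbeef"    # hypothetical recorded checksum
if [ "${cksum:0:1}" = "@" ]; then
    cksum=${cksum:1}
    mode=contents    # would call contents_cksum "$distfile"
else
    mode=file        # would call ${XBPS_DIGEST_CMD} "$distfile"
fi
echo "$mode:$cksum"    # -> contents:deadbeef
```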


@ -6,23 +6,23 @@ _process_patch() {
_args="-Np1" _args="-Np1"
_patch=${i##*/} _patch=${i##*/}
if [ -f $PATCHESDIR/${_patch}.args ]; then if [ -f "$PATCHESDIR/${_patch}.args" ]; then
_args=$(<$PATCHESDIR/${_patch}.args) _args=$(<"$PATCHESDIR/${_patch}.args")
elif [ -n "$patch_args" ]; then elif [ -n "$patch_args" ]; then
_args=$patch_args _args=$patch_args
fi fi
cp -f $i "$wrksrc" cp -f "$i" "$wrksrc"
# Try to guess if it's a compressed patch. # Try to guess if it's a compressed patch.
if [[ $f =~ .gz$ ]]; then if [[ $i =~ .gz$ ]]; then
gunzip "$wrksrc/${_patch}" gunzip "$wrksrc/${_patch}"
_patch=${_patch%%.gz} _patch=${_patch%%.gz}
elif [[ $f =~ .bz2$ ]]; then elif [[ $i =~ .bz2$ ]]; then
bunzip2 "$wrksrc/${_patch}" bunzip2 "$wrksrc/${_patch}"
_patch=${_patch%%.bz2} _patch=${_patch%%.bz2}
elif [[ $f =~ .diff$ ]]; then elif [[ $i =~ .diff$ ]]; then
: :
elif [[ $f =~ .patch$ ]]; then elif [[ $i =~ .patch$ ]]; then
: :
else else
msg_warn "$pkgver: unknown patch type: $i.\n" msg_warn "$pkgver: unknown patch type: $i.\n"
@ -31,7 +31,7 @@ _process_patch() {
cd "$wrksrc" cd "$wrksrc"
msg_normal "$pkgver: patching: ${_patch}.\n" msg_normal "$pkgver: patching: ${_patch}.\n"
patch -s ${_args} -i ${_patch} 2>/dev/null patch -s ${_args} <"${_patch}" 2>/dev/null
} }
hook() { hook() {
@ -44,11 +44,11 @@ hook() {
done < $PATCHESDIR/series done < $PATCHESDIR/series
else else
for f in $PATCHESDIR/*; do for f in $PATCHESDIR/*; do
[ ! -f $f ] && continue [ ! -f "$f" ] && continue
if [[ $f =~ ^.*.args$ ]]; then if [[ $f =~ ^.*.args$ ]]; then
continue continue
fi fi
_process_patch $f _process_patch "$f"
done done
fi fi
} }
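The hook above picks a decompressor from the patch filename suffix. The same dispatch can be sketched with a `case` statement (hypothetical helper name `classify_patch`):

```shell
# Map a patch filename to the action the hook takes, mirroring the
# suffix checks above.
classify_patch() {
    case "$1" in
        *.gz)           echo gunzip ;;
        *.bz2)          echo bunzip2 ;;
        *.diff|*.patch) echo plain ;;
        *)              echo unknown ;;
    esac
}
classify_patch musl-fix.patch    # -> plain
```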


@ -24,8 +24,7 @@ hook() {
# Find all binaries in /usr/share and add them to the pool # Find all binaries in /usr/share and add them to the pool
while read -r f; do while read -r f; do
mime="${f##*:}" mime="${f##*: }"
mime="${mime// /}"
file="${f%:*}" file="${f%:*}"
file="${file#${PKGDESTDIR}}" file="${file#${PKGDESTDIR}}"
case "${mime}" in case "${mime}" in
@ -37,7 +36,7 @@ hook() {
fi fi
;; ;;
esac esac
done < <(find $PKGDESTDIR/usr/share $prune_expr -type f | file --mime-type --files-from -) done < <(find $PKGDESTDIR/usr/share $prune_expr -type f | file --no-pad --mime-type --files-from -)
# Check passed if no packages in pool # Check passed if no packages in pool
if [ -z "$matches" ]; then if [ -z "$matches" ]; then


@ -236,7 +236,7 @@ hook() {
generic_wrapper3 libetpan-config generic_wrapper3 libetpan-config
generic_wrapper3 giblib-config generic_wrapper3 giblib-config
python_wrapper python-config 2.7 python_wrapper python-config 2.7
python_wrapper python3-config 3.10 python_wrapper python3-config 3.11
apr_apu_wrapper apr-1-config apr_apu_wrapper apr-1-config
apr_apu_wrapper apu-1-config apr_apu_wrapper apu-1-config
} }


@ -34,15 +34,14 @@ add_rundep() {
store_pkgdestdir_rundeps() { store_pkgdestdir_rundeps() {
if [ -n "$run_depends" ]; then if [ -n "$run_depends" ]; then
: > ${PKGDESTDIR}/rdeps
for f in ${run_depends}; do for f in ${run_depends}; do
_curdep="$(echo "$f" | sed -e 's,\(.*\)?.*,\1,')" _curdep="$(echo "$f" | sed -e 's,\(.*\)?.*,\1,')"
if [ -z "$($XBPS_UHELPER_CMD getpkgdepname ${_curdep} 2>/dev/null)" -a \ if [ -z "$($XBPS_UHELPER_CMD getpkgdepname ${_curdep} 2>/dev/null)" -a \
-z "$($XBPS_UHELPER_CMD getpkgname ${_curdep} 2>/dev/null)" ]; then -z "$($XBPS_UHELPER_CMD getpkgname ${_curdep} 2>/dev/null)" ]; then
_curdep="${_curdep}>=0" _curdep="${_curdep}>=0"
fi fi
printf -- "${_curdep} " >> ${PKGDESTDIR}/rdeps printf -- "${_curdep}\n"
done done | sort | xargs > ${PKGDESTDIR}/rdeps
fi fi
} }
@ -166,6 +165,6 @@ hook() {
sorequires+="${f} " sorequires+="${f} "
done done
if [ -n "${sorequires}" ]; then if [ -n "${sorequires}" ]; then
echo "${sorequires}" > ${PKGDESTDIR}/shlib-requires echo "${sorequires}" | xargs -n1 | sort | xargs > ${PKGDESTDIR}/shlib-requires
fi fi
} }
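The change above routes the dependency lists through `sort` so `rdeps` and `shlib-requires` come out in a deterministic order regardless of discovery order. The normalization in isolation (hypothetical dependency names):

```shell
# An unsorted dependency list with irregular spacing...
deps="zlib>=0  libressl>=0 attr>=0"
# ...one item per line, sorted, then rejoined on single spaces.
normalized=$(printf '%s\n' "$deps" | xargs -n1 | sort | xargs)
echo "$normalized"    # -> attr>=0 libressl>=0 zlib>=0
```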


@ -22,7 +22,7 @@ hook() {
fi fi
done done
for f in var/run usr/local; do for f in var/run usr/local usr/etc; do
if [ -d ${PKGDESTDIR}/${f} ]; then if [ -d ${PKGDESTDIR}/${f} ]; then
msg_red "${pkgver}: /${f} directory is not allowed, remove it!\n" msg_red "${pkgver}: /${f} directory is not allowed, remove it!\n"
error=1 error=1
@ -103,11 +103,21 @@ hook() {
error=1 error=1
fi fi
if [ -d ${PKGDESTDIR}/usr/usr ]; then
msg_red "${pkgver}: /usr/usr is forbidden, use /usr.\n"
error=1
fi
if [ -d ${PKGDESTDIR}/usr/man ]; then if [ -d ${PKGDESTDIR}/usr/man ]; then
msg_red "${pkgver}: /usr/man is forbidden, use /usr/share/man.\n" msg_red "${pkgver}: /usr/man is forbidden, use /usr/share/man.\n"
error=1 error=1
fi fi
if [[ -d ${PKGDESTDIR}/usr/share/man/man ]]; then
msg_red "${pkgver}: /usr/share/man/man is forbidden, use /usr/share/man.\n"
error=1
fi
if [ -d ${PKGDESTDIR}/usr/doc ]; then if [ -d ${PKGDESTDIR}/usr/doc ]; then
msg_red "${pkgver}: /usr/doc is forbidden. Use /usr/share/doc.\n" msg_red "${pkgver}: /usr/doc is forbidden. Use /usr/share/doc.\n"
error=1 error=1
@ -182,7 +192,7 @@ hook() {
if [ -z "$found" ]; then if [ -z "$found" ]; then
_myshlib="${libname}.so" _myshlib="${libname}.so"
[ "${_myshlib}" != "${rev}" ] && _myshlib+=".${rev}" [ "${_myshlib}" != "${rev}" ] && _myshlib+=".${rev}"
msg_warn "${pkgver}: ${_myshlib} not found in common/shlibs!\n" msg_normal "${pkgver}: ${_myshlib} not found in common/shlibs.\n"
fi; fi;
} }
done done


@ -5,18 +5,17 @@ die() {
exit 1 exit 1
} }
GIT_CMD=$(command -v chroot-git 2>/dev/null) || command -v git >/dev/null 2>&1 ||
GIT_CMD=$(command -v git 2>/dev/null) ||
die "neither chroot-git nor git could be found!" die "neither chroot-git nor git could be found!"
rev_parse() { rev_parse() {
if [ -n "$1" ]; then if [ -n "$1" ]; then
"$GIT_CMD" rev-parse --verify "$1" git rev-parse --verify "$1"
else else
shift shift
while test "$#" != 0 while test "$#" != 0
do do
"$GIT_CMD" rev-parse --verify "$1" 2>/dev/null && return git rev-parse --verify "$1" 2>/dev/null && return
shift shift
done done
return 1 return 1
@ -27,26 +26,26 @@ base=$(rev_parse "$1" FETCH_HEAD ORIG_HEAD) || die "base commit not found"
tip=$(rev_parse "$2" HEAD) || die "tip commit not found" tip=$(rev_parse "$2" HEAD) || die "tip commit not found"
status=0 status=0
for cmt in $("$GIT_CMD" rev-list --abbrev-commit $base..$tip) for cmt in $(git rev-list --abbrev-commit $base..$tip)
do do
"$GIT_CMD" cat-file commit "$cmt" | git cat-file commit "$cmt" |
awk -vC="$cmt" ' awk -vC="$cmt" '
# skip header # skip header
/^$/ && !msg { msg = 1; next } /^$/ && !msg { msg = 1; next }
!msg { next } !msg { next }
# 3: long-line-is-banned-except-footnote-like-this-for-url # 3: long-line-is-banned-except-footnote-like-this-for-url
(NF > 2) && (length > 80) { print C ": long line: " $0; exit 1 } (NF > 2) && (length > 80) { print "::error title=Commit Lint::" C ": long line: " $0; exit 1 }
!subject { !subject {
if (length > 50) { print C ": subject is a bit long" } if (length > 50) { print "::warning title=Commit Lint::" C ": subject is a bit long" }
if (!($0 ~ ":" || $0 ~ "^Take over maintainership " || $0 ~ "^Orphan ")) { print C ": subject does not follow CONTRIBUTING.md guidelines"; exit 1 } if (!($0 ~ ":" || $0 ~ "^Take over maintainership " || $0 ~ "^Orphan ")) { print "::error title=Commit Lint::" C ": subject does not follow CONTRIBUTING.md guidelines"; exit 1 }
# Below check is too noisy? # Below check is too noisy?
# if (!($0 ~ "^New package:" || $0 ~ ".*: update to")) { # if (!($0 ~ "^New package:" || $0 ~ ".*: update to")) {
# print C ": not new package/update/removal?" # print "::warning title=Commit Lint::" C ": not new package/update/removal?"
# } # }
subject = 1; next subject = 1; next
} }
/^$/ { body = 1; next } /^$/ { body = 1; next }
!body { print C ": second line must be blank"; exit 1 } !body { print "::error title=Commit Lint::" C ": second line must be blank"; exit 1 }
' || status=1 ' || status=1
done done
exit $status exit $status
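The linter above now prints GitHub Actions workflow commands (`::error title=...::message`) so failures appear as inline PR annotations instead of plain log lines. A minimal reproduction of the subject check, with a hypothetical commit id and shortened message:

```shell
# A subject with no "pkgname:" prefix should be rejected, and the
# rejection is emitted in the ::error annotation format.
printf '%s\n' "update firefox" |
awk -v C=abc1234 '!($0 ~ ":") {
    print "::error title=Commit Lint::" C ": bad subject"
    exit 1
}' || echo "lint failed as expected"
```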

common/scripts/lint-conflicts Executable file

@ -0,0 +1,198 @@
#!/usr/bin/env bash
# Report packages installing same file and not marked with
# conflicts or replaces.
# Without argument, find conflicts between packages in local
# repository at hostdir/binpkgs and packages indexed in xlocate.
# With single path as argument, read that local repository.
# With -a flag, find conflicts between packages indexed in xlocate.
if [ "$#" = 0 ]; then
binpkgs="$PWD/hostdir/binpkgs"
elif [ "$1" = -a ]; then
all=1
elif [ -d "$1" ]; then
binpkgs="$1"
else
echo "Usage:"
echo "$0"
echo " check packages in ./hostdir/binpkgs"
echo "$0 path/to/hostdir/binpkgs"
echo " check packages there"
echo "$0 -a"
echo " check all packages indexed in xlocate"
exit 1
fi
declare -A newly_built conflicts_cache providers_cache pairs owners
repositories=("--repository=${binpkgs}" "--repository=${binpkgs}/nonfree")
rv=0
template_exists() {
[ -f "srcpkgs/$1/template" ]
}
partial_check() {
[ -z "$all" ]
}
providers_of() {
# print the pkgname and packages that `provides` it
local pkgname=$1
if [ "${providers_cache[$pkgname]}" = '' ]; then
local line provider_pkgver provided_pkgver provider_pkgname provided_pkgname
local -A providers
providers[$pkgname]=$pkgname
while read -r line; do
line=${line%%'('*}
provider_pkgver=${line%': '*}
provided_pkgver=${line#*': '}
provider_pkgname=${provider_pkgver%-*}
provided_pkgname=${provided_pkgver%-*}
# comes from $(xbps-query -s $pkgname), so $pkgname can be a substring
if [ "$provided_pkgname" = "$pkgname" ]; then
providers[$provider_pkgname]=$provider_pkgname
fi
done < <(xbps-query "${repositories[@]}" -p provides -R -s "$pkgname")
# leading space ensures ${[]} != ''
providers_cache[$pkgname]=" ${providers[*]}"
fi
echo ${providers_cache[$pkgname]}
}
conflicts_of() {
# print list of packages that are _marked_ as conflicting with given one
local pkgname=$1
if [ "${conflicts_cache[$pkgname]}" = '' ]; then
local in_conflict provider
local -A all
while read -r in_conflict; do
in_conflict=${in_conflict%'<'*}
in_conflict=${in_conflict%'>'*}
providers_of "$in_conflict" > /dev/null # executing in same process to fill cache
for provider in $(providers_of "$in_conflict"); do
all[$provider]=$provider
done
done < <(xbps-query "${repositories[@]}" -p conflicts,replaces -R "$pkgname")
# leading space ensures ${[]} != ''
conflicts_cache[$pkgname]=" ${all[*]}"
fi
echo ${conflicts_cache[$pkgname]}
}
conflict_between() {
# exit successfully if packages are _marked_ as conflicting
conflicts_of "$1" > /dev/null # executing in same process to fill cache
case " $(conflicts_of "$1") " in
*" $2 "*) return 0
esac
conflicts_of "$2" > /dev/null # executing in same process to fill cache
case " $(conflicts_of "$2") " in
*" $1 "*) return 0
esac
return 1
}
list_newly_built_files() {
# print one line per file in newly built packages
# each line contains pkgname and file path
local pkgver pkgname
while read -r pkgver; do
pkgname=${pkgver%-*}
xbps-query "${repositories[@]}" -i -f "$pkgname" | sed s'/ -> .*//;'" s/^/$pkgname /"
done < <(xbps-query "${repositories[@]}" -i -R -s '' | cut -d' ' -f 2)
}
list_interesting_files() {
# list files potentially contained in more than one package
# each line contains pkgver/pkgname and file path
if partial_check; then
list_newly_built_files
else
xlocate / | sed s'/ -> .*//' | grep -F -f <(xlocate / | cut -f 2- | sed s'/ -> .*//' | sort | uniq -d)
fi
}
group_by_file_full() {
# create associative array `owners` mapping file to list of packages
# for packages potentially conflicting with newly built ones
local pkgver file pkgname
while read -r pkgver file; do
pkgname=${pkgver%-*}
if template_exists "$pkgname"; then
owners[$file]+=" $pkgname"
fi
done < <(list_interesting_files)
}
group_by_file_partial() {
# create associative array `owners` mapping file to list of packages
# for all packages in xlocate
local pkgname file
## newly built packages
while read -r pkgver; do
pkgname=${pkgver%-*}
newly_built[$pkgname]=$pkgname
done < <(xbps-query "${repositories[@]}" -i -R -s '' | cut -d' ' -f 2)
while read -r pkgname file; do
owners[$file]+=" $pkgname"
done < <(list_newly_built_files)
## rest of repository
while read -r pkgver file; do
pkgname=${pkgver%-*}
if [ -z "${newly_built[$pkgname]}" ] && template_exists "$pkgname"; then
owners[$file]+=" $pkgname"
fi
done < <(xlocate / | sed s'/ -> .*//' | grep -F -f <(list_newly_built_files | cut -d ' ' -f 2-))
}
group_by_pair() {
# find package pairs owning same file and not marked as conflicting
local pkg file a b
while read -r pkg file; do
for a in ${owners[$file]}; do
for b in ${owners[$file]}; do
if ! [ "$a" "<" "$b" ]; then
continue
fi
if partial_check && [ -z "${newly_built[$a]}" ] && [ -z "${newly_built[$b]}" ]; then
continue
fi
if ! conflict_between "$a" "$b"; then
unset pair_files
local -A pair_files
eval "${pairs["$a $b"]}"
pair_files[$file]="$file"
pairs["$a $b"]="${pair_files[@]@A}"
fi
done
done
done < <(list_interesting_files)
}
print_out() {
local pair file
if [ "${#pairs[@]}" = 0 ]; then
echo 1>&2 "No conflicts found in" "${repositories[@]#*=}"
exit 0
fi
while read -r pair; do
rv=1
echo "${pair% *} and ${pair#* } conflict for"
unset pair_files
eval "${pairs[$pair]}"
for file in "${pair_files[@]}"; do
echo " $file"
done | sort
done < <(printf '%s\n' "${!pairs[@]}" | sort)
}
if partial_check; then
group_by_file_partial
else
group_by_file_full
fi
group_by_pair
print_out
exit $rv
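`lint-conflicts` repeatedly derives a package name from a `pkgver` string with `${pkgver%-*}`, which strips the shortest trailing `-…` match, i.e. the `version_revision` suffix:

```shell
# The name part may itself contain dashes; only the final
# "-version_revision" component is removed.
pkgver="xbps-triggers-0.124_1"
pkgname=${pkgver%-*}
echo "$pkgname"    # -> xbps-triggers
```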


@ -13,20 +13,17 @@ if ! [ "$base_rev" ]; then
die "usage: $0 TEMPLATE BASE-REVISION [TIP-REVISION]" die "usage: $0 TEMPLATE BASE-REVISION [TIP-REVISION]"
fi fi
if command -v chroot-git >/dev/null 2>&1; then if ! command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v chroot-git)
elif command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v git)
else
die "neither chroot-git nor git could be found" die "neither chroot-git nor git could be found"
fi fi
scan() { scan() {
rx="$1" msg="$2" rx="$1" msg="$2"
template_path=$template template_path=$template
maybe_git=
if [ "$tip_rev" ]; then if [ "$tip_rev" ]; then
template_path="${tip_rev}:${template}" template_path="${tip_rev}:${template}"
maybe_git="$GIT_CMD" maybe_git="git"
revspec="[^:]*:" revspec="[^:]*:"
fi fi
$maybe_git grep -P -Hn -e "$rx" "$template_path" | $maybe_git grep -P -Hn -e "$rx" "$template_path" |
@ -37,7 +34,7 @@ scan() {
show_template() { show_template() {
rev="$1" rev="$1"
if [ "$rev" ]; then if [ "$rev" ]; then
$GIT_CMD cat-file blob "${rev}:${template}" 2>/dev/null git cat-file blob "${rev}:${template}" 2>/dev/null
else else
cat "${template}" 2>/dev/null cat "${template}" 2>/dev/null
fi fi
@ -45,7 +42,10 @@ show_template() {
show_template_var() { show_template_var() {
rev="$1" var="$2" rev="$1" var="$2"
show_template "$rev" | grep -Po '^'${var}'=\K.*' (
show_template "$rev"
printf '%s\n' "printf '%s\\n' \"\$${var}\""
) | bash 2>/dev/null
} }
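The rewritten `show_template_var` appends a `printf` of the requested variable and evaluates the whole template through `bash`, so values that reference other variables expand fully, whereas the old `grep -Po '^var=\K.*'` returned only the literal right-hand side. A sketch against a hypothetical inline template:

```shell
# Hypothetical template where one variable references two others.
template='version=1.0
revision=2
fullver=${version}_${revision}'
var=fullver
# Feed the template plus a printf of the requested variable to bash,
# as the helper now does.
value=$( { printf '%s\n' "$template"
           printf '%s\n' "printf '%s\\n' \"\$${var}\""; } | bash )
echo "$value"    # -> 1.0_2
```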
revision_reset() { revision_reset() {
@ -72,6 +72,28 @@ reverts_on_downgrade() {
esac esac
} }
check_revert() {
for vr in $reverts; do
xbps-uhelper cmpver "${version}" "${vr%_*}"
case "$?" in
0 | 1)
scan '^version=' "remove $vr from \$reverts"
status=1
;;
esac
done
for vr in $prev_reverts; do
if ! xbps-uhelper cmpver "$version" "${vr%_*}"; then
continue
fi
if [ $revision -gt "${vr##*_}" ]; then
continue
fi
scan '^revision=' "undo a revert with same revision as before"
status=1
done
}
version_change() { version_change() {
version="$(show_template_var "$tip_rev" version)" version="$(show_template_var "$tip_rev" version)"
revision="$(show_template_var "$tip_rev" revision)" revision="$(show_template_var "$tip_rev" revision)"
@ -83,6 +105,7 @@ version_change() {
1) revision_reset;; 1) revision_reset;;
-1|255) reverts_on_downgrade;; -1|255) reverts_on_downgrade;;
esac esac
check_revert
} }
version_change version_change


@ -0,0 +1,11 @@
# Converts xlint/etc format lints into GH Actions annotations
# The original line is printed alongside the annotation command
{
split($0, a, ": ")
split(a[1], b, ":")
msg = substr($0, index($0, ": ") + 2)
if (b[2]) {
line = ",line=" b[2]
}
printf "::error title=Template Lint,file=%s%s::%s\n", b[1], line, msg
}
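Feeding a typical `path:line: message` lint line through the awk program above produces one annotation per finding; a quick check with a hypothetical lint message:

```shell
printf '%s\n' 'srcpkgs/foo/template:12: invalid homepage' | awk '{
    split($0, a, ": ")                       # a[1] = "path:line"
    split(a[1], b, ":")                      # b[1] = path, b[2] = line
    msg = substr($0, index($0, ": ") + 2)    # text after "path:line: "
    line = ""
    if (b[2]) line = ",line=" b[2]
    printf "::error title=Template Lint,file=%s%s::%s\n", b[1], line, msg
}'
```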

File diff suppressed because it is too large


@ -13,7 +13,7 @@ fi
PKGS=$(/hostrepo/xbps-src sort-dependencies $(cat /tmp/templates)) PKGS=$(/hostrepo/xbps-src sort-dependencies $(cat /tmp/templates))
for pkg in ${PKGS}; do for pkg in ${PKGS}; do
/hostrepo/xbps-src -j$(nproc) -H "$HOME"/hostdir $arch $test pkg "$pkg" /hostrepo/xbps-src -j$(nproc) -s -H "$HOME"/hostdir $arch $test pkg "$pkg"
[ $? -eq 1 ] && exit 1 [ $? -eq 1 ] && exit 1
done done


@ -2,19 +2,15 @@
# #
# changed_templates.sh # changed_templates.sh
if command -v chroot-git >/dev/null 2>&1; then tip="$(git rev-list -1 --parents HEAD)"
GIT_CMD=$(command -v chroot-git)
elif command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v git)
fi
tip="$($GIT_CMD rev-list -1 --parents HEAD)"
case "$tip" in case "$tip" in
# This is a merge commit, pick last parent
*" "*" "*) tip="${tip##* }" ;; *" "*" "*) tip="${tip##* }" ;;
# This is a non-merge commit, pick itself
*) tip="${tip%% *}" ;; *) tip="${tip%% *}" ;;
esac esac
base="$($GIT_CMD merge-base FETCH_HEAD "$tip")" || { base="$(git merge-base FETCH_HEAD "$tip")" || {
echo "Your branch is based on a copy that is too old." echo "Your branch is based on a copy that is too old."
echo "Please rebase onto the newest copy." echo "Please rebase onto the newest copy."
exit 1 exit 1
@ -23,7 +19,7 @@ base="$($GIT_CMD merge-base FETCH_HEAD "$tip")" || {
echo "$base $tip" >/tmp/revisions echo "$base $tip" >/tmp/revisions
/bin/echo -e '\x1b[32mChanged packages:\x1b[0m' /bin/echo -e '\x1b[32mChanged packages:\x1b[0m'
$GIT_CMD diff-tree -r --no-renames --name-only --diff-filter=AM \ git diff-tree -r --no-renames --name-only --diff-filter=AM \
"$base" "$tip" \ "$base" "$tip" \
-- 'srcpkgs/*/template' | -- 'srcpkgs/*/template' |
cut -d/ -f 2 | cut -d/ -f 2 |


@ -7,7 +7,7 @@ TAR=tar
command -v bsdtar >/dev/null && TAR=bsdtar command -v bsdtar >/dev/null && TAR=bsdtar
ARCH=$(uname -m)-musl ARCH=$(uname -m)-musl
VERSION=0.59_5 VERSION=0.59_5
URL="https://alpha.de.repo.voidlinux.org/static/xbps-static-static-${VERSION}.${ARCH}.tar.xz" URL="https://repo-ci.voidlinux.org/static/xbps-static-static-${VERSION}.${ARCH}.tar.xz"
FILE=${URL##*/} FILE=${URL##*/}
mkdir -p /tmp/bin mkdir -p /tmp/bin


@ -2,11 +2,8 @@
# #
# changed_templates.sh # changed_templates.sh
if command -v chroot-git >/dev/null 2>&1; then # required by git 2.35.2+
GIT_CMD=$(command -v chroot-git) git config --global --add safe.directory "$PWD"
elif command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v git)
fi
/bin/echo -e '\x1b[32mFetching upstream...\x1b[0m' /bin/echo -e '\x1b[32mFetching upstream...\x1b[0m'
$GIT_CMD fetch --depth 200 https://github.com/void-linux/void-packages.git master git fetch --depth 200 https://github.com/void-linux/void-packages.git master


@ -31,6 +31,7 @@ Apache-1.0
Apache-1.1 Apache-1.1
Apache-2.0 Apache-2.0
App-s2p App-s2p
Arphic-1999
Artistic-1.0-Perl Artistic-1.0-Perl
Artistic-1.0-cl8 Artistic-1.0-cl8
Artistic-1.0 Artistic-1.0
@ -58,12 +59,14 @@ BSD-Protection
BSD-Source-Code BSD-Source-Code
BSL-1.0 BSL-1.0
BUSL-1.1 BUSL-1.1
Baekmuk
Bahyph Bahyph
Barr Barr
Beerware Beerware
Bison-exception-2.2 Bison-exception-2.2
BitTorrent-1.0 BitTorrent-1.0
BitTorrent-1.1 BitTorrent-1.1
Bitstream-Vera
BlueOak-1.0.0 BlueOak-1.0.0
Bootloader-exception Bootloader-exception
Borceux Borceux
@ -77,6 +80,7 @@ CC-BY-2.5-AU
CC-BY-2.5 CC-BY-2.5
CC-BY-3.0-AT CC-BY-3.0-AT
CC-BY-3.0-DE CC-BY-3.0-DE
CC-BY-3.0-IGO
CC-BY-3.0-NL CC-BY-3.0-NL
CC-BY-3.0-US CC-BY-3.0-US
CC-BY-3.0 CC-BY-3.0
@ -220,6 +224,8 @@ GPL-3.0-linking-source-exception
GPL-3.0-only GPL-3.0-only
GPL-3.0-or-later GPL-3.0-or-later
GPL-CC-1.0 GPL-CC-1.0
GStreamer-exception-2005
GStreamer-exception-2008
Giftware Giftware
Glide Glide
Glulxe Glulxe
@ -244,6 +250,7 @@ JPNIC
JSON JSON
Jam Jam
JasPer-2.0 JasPer-2.0
KiCad-libraries-exception
LAL-1.2 LAL-1.2
LAL-1.3 LAL-1.3
LGPL-2.0-only LGPL-2.0-only
@ -262,6 +269,8 @@ LPPL-1.1
LPPL-1.2 LPPL-1.2
LPPL-1.3a LPPL-1.3a
LPPL-1.3c LPPL-1.3c
LZMA-SDK-9.11-to-9.20
LZMA-SDK-9.22
LZMA-exception LZMA-exception
Latex2e Latex2e
Leptonica Leptonica
@ -286,10 +295,12 @@ MPL-1.0
MPL-1.1 MPL-1.1
MPL-2.0-no-copyleft-exception MPL-2.0-no-copyleft-exception
MPL-2.0 MPL-2.0
MS-LPL
MS-PL MS-PL
MS-RL MS-RL
MTLL MTLL
MakeIndex MakeIndex
Minpack
MirOS MirOS
Motosoto Motosoto
MulanPSL-1.0 MulanPSL-1.0
@ -302,6 +313,7 @@ NBPL-1.0
NCGL-UK-2.0 NCGL-UK-2.0
NCSA NCSA
NGPL NGPL
NICTA-1.0
NIST-PD-fallback NIST-PD-fallback
NIST-PD NIST-PD
NLOD-1.0 NLOD-1.0
@ -379,6 +391,7 @@ Plexus
PolyForm-Noncommercial-1.0.0 PolyForm-Noncommercial-1.0.0
PolyForm-Small-Business-1.0.0 PolyForm-Small-Business-1.0.0
PostgreSQL PostgreSQL
Python-2.0.1
Python-2.0 Python-2.0
QPL-1.0 QPL-1.0
Qhull Qhull
@ -515,7 +528,9 @@ libpng-2.0
libselinux-1.0 libselinux-1.0
libtiff libtiff
mif-exception mif-exception
mpi-permissive
mpich2 mpich2
mplus
openvpn-openssl-exception openvpn-openssl-exception
psfrag psfrag
psutils psutils


@ -1,15 +1,8 @@
#!/bin/sh #!/bin/sh
TRAVIS_PROTO=http TRAVIS_MIRROR=repo-ci.voidlinux.org
TRAVIS_MIRROR=repo-us.voidlinux.org
for _i in etc/xbps.d/repos-remote*.conf ; do for _i in etc/xbps.d/repos-remote*.conf ; do
/bin/echo -e "\x1b[32mUpdating $_i...\x1b[0m" /bin/echo -e "\x1b[32mUpdating $_i...\x1b[0m"
# First fix the proto, ideally we'd serve everything with HTTPS, sed -i "s:repo-default\.voidlinux\.org:$TRAVIS_MIRROR:g" $_i
# but key management and rotation is a pain, and things are signed
# so we can afford to be a little lazy at times.
sed -i "s:https:$TRAVIS_PROTO:g" $_i
# Now set the mirror
sed -i "s:alpha\.de\.repo\.voidlinux\.org:$TRAVIS_MIRROR:g" $_i
done done


@ -11,7 +11,8 @@ common/scripts/lint-commits $base $tip || EXITCODE=$?
for t in $(awk '{ print "srcpkgs/" $0 "/template" }' /tmp/templates); do for t in $(awk '{ print "srcpkgs/" $0 "/template" }' /tmp/templates); do
/bin/echo -e "\x1b[32mLinting $t...\x1b[0m" /bin/echo -e "\x1b[32mLinting $t...\x1b[0m"
xlint "$t" || EXITCODE=$? xlint "$t" > /tmp/xlint_out || EXITCODE=$?
common/scripts/lint-version-change "$t" $base $tip || EXITCODE=$? common/scripts/lint-version-change "$t" $base $tip > /tmp/vlint_out || EXITCODE=$?
awk -f common/scripts/lint2annotations.awk /tmp/xlint_out /tmp/vlint_out
done done
exit $EXITCODE exit $EXITCODE

common/travis/xpkgdiff.sh Executable file

@ -0,0 +1,25 @@
#!/bin/sh
#
# xpkgdiff.sh
export XBPS_DISTDIR=/hostrepo XBPS_HOSTDIR="$HOME/hostdir"
export DIFF='diff --unified=0 --report-identical-files --suppress-common-lines
--color=always --label REPO --label BUILT'
ARGS="-a $2 -R https://repo-ci.voidlinux.org/current"
while read -r pkg; do
for subpkg in $(xsubpkg $pkg); do
if xbps-query --repository=$HOME/hostdir/binpkgs \
--repository=$HOME/hostdir/binpkgs/nonfree \
-i "$subpkg" >&/dev/null; then
/bin/echo -e "\x1b[34mFile Diff of $subpkg:\x1b[0m"
xpkgdiff $ARGS -f $subpkg
/bin/echo -e "\x1b[34mMetadata Diff of $subpkg:\x1b[0m"
xpkgdiff $ARGS -S $subpkg
/bin/echo -e "\x1b[34mDependency Diff of $subpkg:\x1b[0m"
xpkgdiff $ARGS -x $subpkg
else
/bin/echo -e "\x1b[33m$subpkg wasn't found\x1b[0m"
fi
done
done < /tmp/templates


@ -20,6 +20,13 @@ done
setup_pkg "$PKGNAME" $XBPS_CROSS_BUILD setup_pkg "$PKGNAME" $XBPS_CROSS_BUILD
if [ -n "$disable_parallel_check" ]; then
XBPS_MAKEJOBS=1
else
XBPS_MAKEJOBS="$XBPS_ORIG_MAKEJOBS"
fi
makejobs="-j$XBPS_MAKEJOBS"
XBPS_CHECK_DONE="${XBPS_STATEDIR}/${sourcepkg}_${XBPS_CROSS_BUILD}_check_done" XBPS_CHECK_DONE="${XBPS_STATEDIR}/${sourcepkg}_${XBPS_CROSS_BUILD}_check_done"
if [ -n "$XBPS_CROSS_BUILD" ]; then if [ -n "$XBPS_CROSS_BUILD" ]; then

View file

@ -25,9 +25,9 @@ setup_pkg_depends() {
_pkgname=$(xbps-uhelper getpkgname $_depname 2>/dev/null) _pkgname=$(xbps-uhelper getpkgname $_depname 2>/dev/null)
[ -z "$_pkgname" ] && _pkgname="$_depname" [ -z "$_pkgname" ] && _pkgname="$_depname"
if [ -s ${XBPS_DISTDIR}/etc/virtual ]; then if [ -s ${XBPS_DISTDIR}/etc/virtual ]; then
foo=$(egrep "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/virtual|cut -d ' ' -f2) foo=$(grep -E "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/virtual|cut -d ' ' -f2)
elif [ -s ${XBPS_DISTDIR}/etc/defaults.virtual ]; then elif [ -s ${XBPS_DISTDIR}/etc/defaults.virtual ]; then
foo=$(egrep "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/defaults.virtual|cut -d ' ' -f2) foo=$(grep -E "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/defaults.virtual|cut -d ' ' -f2)
fi fi
if [ -z "$foo" ]; then if [ -z "$foo" ]; then
msg_error "$pkgver: failed to resolve virtual dependency for '$j' (missing from etc/virtual)\n" msg_error "$pkgver: failed to resolve virtual dependency for '$j' (missing from etc/virtual)\n"


@ -54,7 +54,7 @@ bulk_sortdeps() {
} }
bulk_build() { bulk_build() {
local sys="$1" local bulk_build_cmd="$1"
local NPROCS=$(($(nproc)*2)) local NPROCS=$(($(nproc)*2))
local NRUNNING=0 local NRUNNING=0
@ -67,10 +67,17 @@ bulk_build() {
fi fi
# Compare installed pkg versions vs srcpkgs # Compare installed pkg versions vs srcpkgs
if [[ $sys ]]; then case "$bulk_build_cmd" in
xbps-checkvers -f '%n' -I -D $XBPS_DISTDIR installed)
bulk_sortdeps $(xbps-checkvers -f '%n' -I -D "$XBPS_DISTDIR")
return $? return $?
fi ;;
local)
bulk_sortdeps $(xbps-checkvers -f '%n' -i -R "${XBPS_REPOSITORY}" -R "${XBPS_REPOSITORY}/nonfree" -D "$XBPS_DISTDIR")
return $?
;;
esac
# compare repo pkg versions vs srcpkgs # compare repo pkg versions vs srcpkgs
for f in $(xbps-checkvers -f '%n' -D $XBPS_DISTDIR); do for f in $(xbps-checkvers -f '%n' -D $XBPS_DISTDIR); do
if [ $NRUNNING -eq $NPROCS ]; then if [ $NRUNNING -eq $NPROCS ]; then
@ -90,9 +97,9 @@ bulk_build() {
} }
bulk_update() { bulk_update() {
local args="$1" pkgs f rval local bulk_update_cmd="$1" pkgs f rval
pkgs="$(bulk_build ${args})" pkgs="$(bulk_build "${bulk_update_cmd}")"
[[ -z $pkgs ]] && return 0 [[ -z $pkgs ]] && return 0
msg_normal "xbps-src: the following packages must be rebuilt and updated:\n" msg_normal "xbps-src: the following packages must be rebuilt and updated:\n"
@ -112,7 +119,7 @@ bulk_update() {
msg_error "xbps-src: failed to build $pkgver pkg!\n" msg_error "xbps-src: failed to build $pkgver pkg!\n"
fi fi
done done
if [ -n "$pkgs" -a -n "$args" ]; then if [ -n "$pkgs" -a "$bulk_update_cmd" == installed ]; then
echo echo
msg_normal "xbps-src: updating your system, confirm to proceed...\n" msg_normal "xbps-src: updating your system, confirm to proceed...\n"
${XBPS_SUCMD} "xbps-install --repository=$XBPS_REPOSITORY --repository=$XBPS_REPOSITORY/nonfree -u ${pkgs//[$'\n']/ }" || return 1 ${XBPS_SUCMD} "xbps-install --repository=$XBPS_REPOSITORY --repository=$XBPS_REPOSITORY/nonfree -u ${pkgs//[$'\n']/ }" || return 1
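`${pkgs//[$'\n']/ }` above replaces every newline in the package list with a space so the whole set can be passed to a single `xbps-install` invocation; in isolation:

```shell
# Newline-separated package list (hypothetical names)...
pkgs=$'foo\nbar\nbaz'
# ...flattened onto one line via the same substitution used above.
flat="${pkgs//[$'\n']/ }"
echo "install -u $flat"    # -> install -u foo bar baz
```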


@ -8,22 +8,22 @@ install_base_chroot() {
XBPS_TARGET_PKG="$1" XBPS_TARGET_PKG="$1"
fi fi
# binary bootstrap # binary bootstrap
msg_normal "xbps-src: installing base-chroot-cereus...\n" msg_normal "xbps-src: installing base-chroot...\n"
# XBPS_TARGET_PKG == arch # XBPS_TARGET_PKG == arch
if [ "$XBPS_TARGET_PKG" ]; then if [ "$XBPS_TARGET_PKG" ]; then
_bootstrap_arch="env XBPS_TARGET_ARCH=$XBPS_TARGET_PKG" _bootstrap_arch="env XBPS_TARGET_ARCH=$XBPS_TARGET_PKG"
fi fi
(export XBPS_MACHINE=$XBPS_TARGET_PKG XBPS_ARCH=$XBPS_TARGET_PKG; chroot_sync_repodata) (export XBPS_MACHINE=$XBPS_TARGET_PKG XBPS_ARCH=$XBPS_TARGET_PKG; chroot_sync_repodata)
${_bootstrap_arch} $XBPS_INSTALL_CMD ${XBPS_INSTALL_ARGS} -y base-chroot-cereus ${_bootstrap_arch} $XBPS_INSTALL_CMD ${XBPS_INSTALL_ARGS} -y base-chroot
if [ $? -ne 0 ]; then if [ $? -ne 0 ]; then
msg_error "xbps-src: failed to install base-chroot-cereus!\n" msg_error "xbps-src: failed to install base-chroot!\n"
fi fi
# Reconfigure base-files to create dirs/symlinks. # Reconfigure base-files to create dirs/symlinks.
-    if xbps-query -r $XBPS_MASTERDIR base-files>=2022.07.03 &>/dev/null; then
-        XBPS_ARCH=$XBPS_TARGET_PKG xbps-reconfigure -r $XBPS_MASTERDIR -f base-files>=2022.07.03 &>/dev/null
+    if xbps-query -r $XBPS_MASTERDIR base-files &>/dev/null; then
+        XBPS_ARCH=$XBPS_TARGET_PKG xbps-reconfigure -r $XBPS_MASTERDIR -f base-files &>/dev/null
     fi
-    msg_normal "xbps-src: installed base-chroot-cereus successfully!\n"
+    msg_normal "xbps-src: installed base-chroot successfully!\n"
     chroot_prepare $XBPS_TARGET_PKG || msg_error "xbps-src: failed to initialize chroot!\n"
     chroot_check
     chroot_handler clean
@@ -34,7 +34,7 @@ reconfigure_base_chroot() {
     local pkgs="glibc-locales ca-certificates"
     [ -z "$IN_CHROOT" -o -e $statefile ] && return 0
     # Reconfigure ca-certificates.
-    msg_normal "xbps-src: reconfiguring base-chroot-cereus...\n"
+    msg_normal "xbps-src: reconfiguring base-chroot...\n"
     for f in ${pkgs}; do
         if xbps-query -r $XBPS_MASTERDIR $f &>/dev/null; then
             xbps-reconfigure -r $XBPS_MASTERDIR -f $f
@@ -51,7 +51,7 @@ update_base_chroot() {
     if $(${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -nu|grep -q xbps); then
         ${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -yu xbps || msg_error "xbps-src: failed to update xbps!\n"
     fi
-    ${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -yu || msg_error "xbps-src: failed to update base-chroot-cereus!\n"
+    ${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -yu || msg_error "xbps-src: failed to update base-chroot!\n"
     msg_normal "xbps-src: cleaning up $XBPS_MASTERDIR masterdir...\n"
     [ -z "$XBPS_KEEP_ALL" -a -z "$XBPS_SKIP_DEPS" ] && remove_pkg_autodeps
     [ -z "$XBPS_KEEP_ALL" -a -z "$keep_all_force" ] && rm -rf $XBPS_MASTERDIR/builddir $XBPS_MASTERDIR/destdir
@@ -115,14 +115,14 @@ chroot_prepare() {
         [ ! -d $XBPS_MASTERDIR/$f ] && mkdir -p $XBPS_MASTERDIR/$f
     done
-    # Copy /etc/passwd and /etc/group from base-files
+    # Copy /etc/passwd and /etc/group from base-files.
     cp -f $XBPS_SRCPKGDIR/base-files/files/passwd $XBPS_MASTERDIR/etc
     echo "$(whoami):x:$(id -u):$(id -g):$(whoami) user:/tmp:/bin/xbps-shell" \
         >> $XBPS_MASTERDIR/etc/passwd
     cp -f $XBPS_SRCPKGDIR/base-files/files/group $XBPS_MASTERDIR/etc
     echo "$(whoami):x:$(id -g):" >> $XBPS_MASTERDIR/etc/group
-    # Copy /etc/hosts from base-files
+    # Copy /etc/hosts from base-files.
     cp -f $XBPS_SRCPKGDIR/base-files/files/hosts $XBPS_MASTERDIR/etc
     # Prepare default locale: en_US.UTF-8.
View file
@@ -147,6 +147,23 @@ msg_normal() {
     fi
 }
+report_broken() {
+    if [ "$show_problems" = "ignore-problems" ]; then
+        return
+    fi
+    if [ -z "$XBPS_IGNORE_BROKENNESS" ]; then
+        for line in "$@"; do
+            msg_red "$line"
+        done
+        exit 2
+    elif [ "$XBPS_IGNORE_BROKENNESS" != reported ]; then
+        for line in "$@"; do
+            msg_warn "$line"
+        done
+        XBPS_IGNORE_BROKENNESS=reported
+    fi
+}
 msg_normal_append() {
     [ -n "$NOCOLORS" ] || printf "\033[1m"
     printf "$@"
@@ -472,7 +489,15 @@ setup_pkg() {
     fi
     makejobs="-j$XBPS_MAKEJOBS"
     if [ -n "$XBPS_BINPKG_EXISTS" ]; then
-        local _binpkgver="$($XBPS_QUERY_XCMD -R -ppkgver $pkgver 2>/dev/null)"
+        local extraflags=""
+        if [ -n "$XBPS_SKIP_REMOTEREPOS" ]; then
+            extraflags="-i"
+            # filter out remote repositories
+            for repo in $(xbps-query -L | awk '{ print $2 }' | grep '^/host/'); do
+                extraflags+=" --repository=$repo"
+            done
+        fi
+        local _binpkgver="$($XBPS_QUERY_XCMD -R -ppkgver $pkgver $extraflags 2>/dev/null)"
         if [ "$_binpkgver" = "$pkgver" ]; then
             if [ -z "$XBPS_DEPENDENCY" ]; then
                 local _repo="$($XBPS_QUERY_XCMD -R -prepository $pkgver 2>/dev/null)"
@@ -630,20 +655,16 @@ setup_pkg() {
     fi
     # Setup some specific package vars.
-    if [ -z "$wrksrc" ]; then
-        wrksrc="$XBPS_BUILDDIR/${sourcepkg}-${version}"
-    else
-        wrksrc="$XBPS_BUILDDIR/$wrksrc"
-    fi
+    wrksrc="$XBPS_BUILDDIR/${sourcepkg}-${version}"
-    if [ "$cross" -a "$nocross" -a "$show_problems" != "ignore-problems" ]; then
-        msg_red "$pkgver: cannot be cross compiled, exiting...\n"
-        msg_red "$pkgver: $nocross\n"
-        exit 2
-    elif [ "$broken" -a "$show_problems" != "ignore-problems" ]; then
-        msg_red "$pkgver: cannot be built, it's currently broken; see the build log:\n"
-        msg_red "$pkgver: $broken\n"
-        exit 2
+    if [ "$cross" -a "$nocross" ]; then
+        report_broken \
+            "$pkgver: cannot be cross compiled...\n" \
+            "$pkgver: $nocross\n"
+    elif [ "$broken" ]; then
+        report_broken \
+            "$pkgver: cannot be built, it's currently broken; see the build log:\n" \
+            "$pkgver: $broken\n"
     fi
     if [ -n "$restricted" -a -z "$XBPS_ALLOW_RESTRICTED" -a "$show_problems" != "ignore-problems" ]; then
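The nocross/broken branches above now funnel through the new report_broken helper. A minimal standalone sketch of its behavior (msg_red and msg_warn are stubbed here; in xbps-src they are the real colorized message helpers):

```shell
#!/bin/sh
# Stand-ins for xbps-src's message helpers.
msg_red()  { printf "ERROR: $1"; }
msg_warn() { printf "WARN: $1"; }

# report_broken as introduced by this change: fatal by default, demoted
# to a one-time warning when XBPS_IGNORE_BROKENNESS is set (-b flag).
report_broken() {
    if [ "$show_problems" = "ignore-problems" ]; then
        return
    fi
    if [ -z "$XBPS_IGNORE_BROKENNESS" ]; then
        for line in "$@"; do
            msg_red "$line"
        done
        exit 2
    elif [ "$XBPS_IGNORE_BROKENNESS" != reported ]; then
        for line in "$@"; do
            msg_warn "$line"
        done
        XBPS_IGNORE_BROKENNESS=reported
    fi
}

# Default: fatal with exit status 2 (run in a subshell so the sketch survives).
rc=0
( report_broken "pkg-1.0_1: is broken\n" >/dev/null ) || rc=$?

# With -b set: warn on the first call, stay silent on later ones because
# the variable is rewritten to "reported".
out=$(
    XBPS_IGNORE_BROKENNESS=yes
    report_broken "pkg-1.0_1: is broken\n"
    report_broken "pkg-1.0_1: is broken\n"
)
echo "exit=$rc warned=$out"
```

The "reported" sentinel is what keeps a single `-b` build from flooding the log with the same brokenness message once per call site.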
View file
@@ -72,7 +72,7 @@ prepare_cross_sysroot() {
     fi
     rm -f $errlog
     # Create top level symlinks in sysroot.
-    XBPS_ARCH=$XBPS_TARGET_MACHINE xbps-reconfigure -r $XBPS_CROSS_BASE -f base-files-cereus>=2022.07.03 &>/dev/null
+    XBPS_ARCH=$XBPS_TARGET_MACHINE xbps-reconfigure -r $XBPS_CROSS_BASE -f base-files &>/dev/null
     # Create a sysroot/include and sysroot/lib symlink just in case.
     ln -s usr/include ${XBPS_CROSS_BASE}/include
     ln -s usr/lib ${XBPS_CROSS_BASE}/lib
View file
@@ -34,8 +34,7 @@ check_pkg_arch() {
         esac
     done
     if [ -z "$nonegation" -a -n "$match" ] || [ -n "$nonegation" -a -z "$match" ]; then
-        msg_red "${pkgname}-${version}_${revision}: this package cannot be built for ${_arch}.\n"
-        exit 2
+        report_broken "${pkgname}-${version}_${revision}: this package cannot be built for ${_arch}.\n"
     fi
 fi
 }
View file
@@ -4,6 +4,7 @@ update_check() {
     local i p url pkgurlname rx found_version consider
     local update_override=$XBPS_SRCPKGDIR/$XBPS_TARGET_PKG/update
     local original_pkgname=$pkgname
+    local pkgname=$sourcepkg
     local urlpfx urlsfx
     local -A fetchedurls
@@ -23,8 +24,9 @@ update_check() {
     if [ -z "$site" ]; then
         case "$distfiles" in
-            # only consider versions those exist in ftp.gnome.org
-            *ftp.gnome.org*) ;;
+            # special case those sites provide better source elsewhere
+            *ftp.gnome.org*|*download.gnome.org*) ;;
+            *archive.xfce.org*) ;;
             *)
                 printf '%s\n' "$homepage" ;;
         esac
@@ -56,7 +58,8 @@ update_check() {
         *github.com*|\
         *//gitlab.*|\
         *bitbucket.org*|\
-        *ftp.gnome.org*|\
+        *ftp.gnome.org*|*download.gnome.org*|\
+        *archive.xfce.org*|\
         *kernel.org/pub/linux/kernel/*|\
         *cran.r-project.org/src/contrib*|\
         *rubygems.org*|\
@@ -122,8 +125,11 @@ update_check() {
             pkgurlname="$(printf %s "$url" | cut -d/ -f4,5)"
             url="https://github.com/$pkgurlname/tags"
             rx='/archive/refs/tags/(v?|\Q'"$pkgname"'\E-)?\K[\d.]+(?=\.tar\.gz")';;
-        *//gitlab.*)
-            pkgurlname="$(printf %s "$url" | cut -d/ -f1-5)"
+        *//gitlab.*|*code.videolan.org*)
+            case "$url" in
+                */-/*) pkgurlname="$(printf %s "$url" | sed -e 's%/-/.*%%g; s%/$%%')";;
+                *) pkgurlname="$(printf %s "$url" | cut -d / -f 1-5)";;
+            esac
             url="$pkgurlname/tags"
             rx='/archive/[^/]+/\Q'"$pkgname"'\E-v?\K[\d.]+(?=\.tar\.gz")';;
         *bitbucket.org*)
@@ -131,8 +137,11 @@ update_check() {
             url="https://bitbucket.org/$pkgurlname/downloads"
            rx='/(get|downloads)/(v?|\Q'"$pkgname"'\E-)?\K[\d.]+(?=\.tar)';;
         *ftp.gnome.org*|*download.gnome.org*)
-            : ${pattern="\Q$pkgname\E-\K(0|[13]\.[0-9]*[02468]|[4-9][0-9]+)\.[0-9.]*[0-9](?=)"}
+            : ${pattern="\Q$pkgname\E-\K(0|[13]\.[0-9]*[02468]|[4-9][0-9]+)\.[0-9.]*[0-9](?=.tar)"}
             url="https://download.gnome.org/sources/$pkgname/cache.json";;
+        *archive.xfce.org*)
+            : ${pattern="\Q$pkgname\E-\K((([4-9]|([1-9][0-9]+))\.[0-9]*[02468]\.[0-9.]*[0-9])|([0-3]\.[0-9.]*))(?=.tar)"}
+            url="https://archive.xfce.org/feeds/project/$pkgname" ;;
         *kernel.org/pub/linux/kernel/*)
             rx=linux-'\K'${version%.*}'[\d.]+(?=\.tar\.xz)';;
         *cran.r-project.org/src/contrib*)
@@ -145,8 +154,8 @@ update_check() {
             rx='/crates/'${pkgname#rust-}'/\K[0-9.]*(?=/download)' ;;
         *codeberg.org*)
             pkgurlname="$(printf %s "$url" | cut -d/ -f4,5)"
-            url="https://codeberg.org/$pkgurlname/releases"
-            rx='/archive/\K[\d.]+(?=\.tar\.gz)' ;;
+            url="https://codeberg.org/$pkgurlname/tags"
+            rx='/archive/(v-?|\Q'"$pkgname"'\E-)?\K[\d.]+(?=\.tar\.gz)' ;;
         *hg.sr.ht*)
             pkgurlname="$(printf %s "$url" | cut -d/ -f4,5)"
             url="https://hg.sr.ht/$pkgurlname/tags"
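The gitlab hunk above replaces a fixed cut of the first five URL components with a split on the `/-/` marker, so project paths with subgroups survive intact. A standalone sketch with a hypothetical nested-group URL:

```shell
#!/bin/sh
# Hypothetical GitLab distfile URL with a subgroup; the old fixed
# `cut -d/ -f1-5` would have dropped the trailing /project component.
url="https://gitlab.com/group/subgroup/project/-/archive/1.2.0/project-1.2.0.tar.gz"

# The new case: strip everything from the /-/ marker when present,
# fall back to the old five-component cut otherwise.
case "$url" in
    */-/*) pkgurlname="$(printf %s "$url" | sed -e 's%/-/.*%%g; s%/$%%')";;
    *)     pkgurlname="$(printf %s "$url" | cut -d / -f 1-5)";;
esac

echo "$pkgurlname/tags"
```

The same case also routes code.videolan.org URLs here, since that instance uses GitLab's `/-/archive/` layout.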
View file
@@ -122,6 +122,9 @@ show-repo-updates
 show-sys-updates
     Prints the list of outdated packages in your system.
+show-local-updates
+    Prints the list of outdated packages in your local repositories.
 sort-dependencies <pkg> <pkgN+1> ...
     Given a list of packages specified as additional arguments, a sorted dependency
     list will be returned to stdout.
@@ -132,6 +135,9 @@ update-bulk
 update-sys
     Rebuilds all packages in your system that are outdated and updates them.
+update-local
+    Rebuilds all packages in your local repositories that are outdated.
 update-check <pkgname>
     Check upstream site of <pkgname> for new releases.
@@ -150,6 +156,8 @@ Options:
 $(print_cross_targets)
+-b  Build packages even if marked as broken, nocross, or excluded with archs.
 -c <configuration>
     If specified, etc/conf.<configuration> will be used as the primary config
     file name; etc/conf will only be attempted if that does not exist.
@@ -157,7 +165,7 @@ $(print_cross_targets)
 -C  Do not remove build directory, automatic dependencies and
     package destdir after successful install.
--E  If a binary package exists in a local repository for the target package,
+-E  If a binary package exists in a repository for the target package,
     do not try to build it, exit immediately.
 -f  Force running the specified stage (configure/build/install/pkg)
@@ -216,6 +224,8 @@ $(print_cross_targets)
     This alternative repository will also be used to resolve dependencies
     with highest priority order than others.
+-s  Make vsed warnings errors.
 -t  Create a temporary masterdir to not pollute the current one. Note that
     the existing masterdir must be fully populated with binary-bootstrap first.
     Once the target has finished, this temporary masterdir will be removed.
@@ -357,7 +367,7 @@ readonly XBPS_SRC_VERSION="113"
 export XBPS_MACHINE=$(xbps-uhelper -C /dev/null arch)
 XBPS_OPTIONS=
-XBPS_OPTSTRING="1a:c:CEfgGhH:iIj:Lm:No:p:qQKr:tV"
+XBPS_OPTSTRING="1a:bc:CEfgGhH:iIj:Lm:No:p:qsQKr:tV"
 # Preprocess arguments in order to allow options before and after XBPS_TARGET.
 eval set -- $(getopt "$XBPS_OPTSTRING" "$@");
@@ -365,10 +375,12 @@ eval set -- $(getopt "$XBPS_OPTSTRING" "$@");
 # Options are saved as XBPS_ARG_FOO instead of XBPS_FOO for now; this is
 # because configuration files may override those and we want arguments to
 # take precedence over configuration files
 while getopts "$XBPS_OPTSTRING" opt; do
     case $opt in
         1) XBPS_ARG_BUILD_ONLY_ONE_PKG=yes; XBPS_OPTIONS+=" -1";;
         a) XBPS_ARG_CROSS_BUILD="$OPTARG"; XBPS_OPTIONS+=" -a $OPTARG";;
+        b) XBPS_ARG_IGNORE_BROKENNESS=yes; XBPS_OPTIONS+=" -b";;
         c) XBPS_ARG_CONFIG="$OPTARG"; XBPS_OPTIONS+=" -c $OPTARG";;
         C) XBPS_ARG_KEEP_ALL=1; XBPS_OPTIONS+=" -C";;
         E) XBPS_ARG_BINPKG_EXISTS=1; XBPS_OPTIONS+=" -E";;
@@ -389,6 +401,7 @@ while getopts "$XBPS_OPTSTRING" opt; do
         Q) XBPS_ARG_CHECK_PKGS=yes; XBPS_OPTIONS+=" -Q";;
         K) XBPS_ARG_CHECK_PKGS=full; XBPS_OPTIONS+=" -K";;
         r) XBPS_ARG_ALT_REPOSITORY="$OPTARG"; XBPS_OPTIONS+=" -r $OPTARG";;
+        s) XBPS_ARG_STRICT=yes; XBPS_OPTIONS+=" -s";;
         t) XBPS_ARG_TEMP_MASTERDIR=1; XBPS_OPTIONS+=" -t -C";;
         V) echo "xbps-src-$XBPS_SRC_VERSION $(xbps-uhelper -V)" && exit 0;;
         --) shift; break;;
@@ -459,6 +472,7 @@ fi
 # Set options passed on command line, after configuration files have been read
 [ -n "$XBPS_ARG_BUILD_ONLY_ONE_PKG" ] && XBPS_BUILD_ONLY_ONE_PKG=yes
+[ -n "$XBPS_ARG_IGNORE_BROKENNESS" ] && XBPS_IGNORE_BROKENNESS=1
 [ -n "$XBPS_ARG_SKIP_REMOTEREPOS" ] && XBPS_SKIP_REMOTEREPOS=1
 [ -n "$XBPS_ARG_BUILD_FORCEMODE" ] && XBPS_BUILD_FORCEMODE=1
 [ -n "$XBPS_ARG_INFORMATIVE_RUN" ] && XBPS_INFORMATIVE_RUN=1
@@ -471,6 +485,7 @@ fi
 [ -n "$XBPS_ARG_QUIET" ] && XBPS_QUIET=1
 [ -n "$XBPS_ARG_PRINT_VARIABLES" ] && XBPS_PRINT_VARIABLES="$XBPS_ARG_PRINT_VARIABLES"
 [ -n "$XBPS_ARG_ALT_REPOSITORY" ] && XBPS_ALT_REPOSITORY="$XBPS_ARG_ALT_REPOSITORY"
+[ -n "$XBPS_ARG_STRICT" ] && XBPS_STRICT="$XBPS_ARG_STRICT"
 [ -n "$XBPS_ARG_CROSS_BUILD" ] && XBPS_CROSS_BUILD="$XBPS_ARG_CROSS_BUILD"
 [ -n "$XBPS_ARG_CHECK_PKGS" ] && XBPS_CHECK_PKGS="$XBPS_ARG_CHECK_PKGS"
 [ -n "$XBPS_ARG_MAKEJOBS" ] && XBPS_MAKEJOBS="$XBPS_ARG_MAKEJOBS"
@@ -478,8 +493,8 @@ fi
 export XBPS_BUILD_ONLY_ONE_PKG XBPS_SKIP_REMOTEREPOS XBPS_BUILD_FORCEMODE \
     XBPS_INFORMATIVE_RUN XBPS_TEMP_MASTERDIR XBPS_BINPKG_EXISTS \
     XBPS_USE_GIT_REVS XBPS_CHECK_PKGS XBPS_DEBUG_PKGS XBPS_SKIP_DEPS \
-    XBPS_KEEP_ALL XBPS_QUIET XBPS_ALT_REPOSITORY XBPS_CROSS_BUILD \
-    XBPS_MAKEJOBS XBPS_PRINT_VARIABLES
+    XBPS_KEEP_ALL XBPS_QUIET XBPS_ALT_REPOSITORY XBPS_STRICT XBPS_CROSS_BUILD \
+    XBPS_MAKEJOBS XBPS_PRINT_VARIABLES XBPS_IGNORE_BROKENNESS
 # The masterdir/hostdir variables are forced and readonly in chroot
 if [ -z "$IN_CHROOT" ]; then
@@ -626,7 +641,7 @@ readonly XBPS_CMPVER_CMD="xbps-uhelper cmpver"
 export XBPS_SHUTILSDIR XBPS_CROSSPFDIR XBPS_TRIGGERSDIR \
     XBPS_SRCPKGDIR XBPS_COMMONDIR XBPS_BUILDDIR \
-    XBPS_REPOSITORY XBPS_ALT_REPOSITORY XBPS_SRCDISTDIR XBPS_DIGEST_CMD \
+    XBPS_REPOSITORY XBPS_ALT_REPOSITORY XBPS_STRICT XBPS_SRCDISTDIR XBPS_DIGEST_CMD \
     XBPS_UHELPER_CMD XBPS_INSTALL_CMD XBPS_QUERY_CMD XBPS_BUILD_ONLY_ONE_PKG \
     XBPS_RINDEX_CMD XBPS_RECONFIGURE_CMD XBPS_REMOVE_CMD XBPS_CHECKVERS_CMD \
     XBPS_CMPVER_CMD XBPS_FETCH_CMD XBPS_VERSION XBPS_BUILDSTYLEDIR \
@@ -639,7 +654,7 @@ export XBPS_SHUTILSDIR XBPS_CROSSPFDIR XBPS_TRIGGERSDIR \
     XBPS_LIBEXECDIR XBPS_DISTDIR XBPS_DISTFILES_MIRROR XBPS_ALLOW_RESTRICTED \
     XBPS_USE_GIT_COMMIT_DATE XBPS_PKG_COMPTYPE XBPS_REPO_COMPTYPE \
     XBPS_BUILDHELPERDIR XBPS_USE_BUILD_MTIME XBPS_BUILD_ENVIRONMENT \
-    XBPS_PRESERVE_PKGS
+    XBPS_PRESERVE_PKGS XBPS_IGNORE_BROKENNESS
 for i in REPOSITORY DESTDIR BUILDDIR SRCDISTDIR; do
     eval val="\$XBPS_$i"
@@ -797,6 +812,7 @@ case "$XBPS_TARGET" in
     if [ -n "$CHROOT_READY" -a -z "$IN_CHROOT" ]; then
         chroot_handler $XBPS_TARGET $XBPS_TARGET_PKG
     else
+        check_existing_pkg
         chroot_sync_repodata
         # prevent update_base_chroot from removing the builddir/destdir
         update_base_chroot keep-all-force
@@ -952,7 +968,10 @@ case "$XBPS_TARGET" in
     bulk_build
     ;;
 show-sys-updates)
-    bulk_build -I
+    bulk_build installed
+    ;;
+show-local-updates)
+    bulk_build local
     ;;
 sort-dependencies)
     bulk_sortdeps ${@/$XBPS_TARGET/}
@@ -961,7 +980,10 @@ case "$XBPS_TARGET" in
     bulk_update
     ;;
 update-sys)
-    bulk_update -I
+    bulk_update installed
+    ;;
+update-local)
+    bulk_update local
     ;;
 update-check)
     read_pkg ignore-problems