Correct repo merge

KF-Art 2022-11-14 18:27:23 -05:00
parent d0eb91a89d
commit 47e890b4b6
56 changed files with 31 additions and 1509 deletions


@ -4,17 +4,6 @@ void-packages is the backbone of the Void Linux distribution. It contains all th
This document describes how you, as a contributor, can help with adding packages, correcting bugs and adding features to void-packages.
<<<<<<< HEAD
## Getting your packages into Void by yourself
If you really want to get a package into Void Linux, we recommend you package it yourself.
We provide a [comprehensive Manual](./Manual.md) on how to create new packages.
There's also a [manual for xbps-src](./README.md), which is used
to build package files from templates.
For this guide, we assume you have basic knowledge about [git](http://git-scm.org), as well as a [GitHub Account](http://github.com).
=======
## Package Requirements
To be included in the Void repository, software must meet at least one of the following requirements.
@ -48,7 +37,6 @@ There's also a [manual for xbps-src](./README.md), which is used to build packag
For this guide, we assume you have basic knowledge about [git](http://git-scm.org), as well as a [GitHub Account](http://github.com) with [SSH set up](https://docs.github.com/en/authentication/connecting-to-github-with-ssh).
You should also [set the email](https://docs.github.com/en/account-and-profile/setting-up-and-managing-your-personal-account-on-github/managing-email-preferences/setting-your-commit-email-address) on your GitHub account and in git so your commits are associated with your GitHub account properly.
>>>>>>> upstream/master
To get started, [fork](https://help.github.com/articles/fork-a-repo) the void-linux `void-packages` git repository on GitHub and clone it:
@ -59,11 +47,6 @@ To keep your forked repository up to date, setup the `upstream` remote to pull i
$ git remote add upstream https://github.com/void-linux/void-packages.git
$ git pull --rebase upstream master
<<<<<<< HEAD
### Creating a new template
You can use the helper tool `xnew`, from the [xtools](https://github.com/chneukirchen/xtools) package, to create new templates:
=======
This can also be done with the `github-cli` tool:
$ gh repo fork void-linux/void-packages
@ -82,7 +65,6 @@ To create a new branch:
### Creating a new template
You can use the helper tool `xnew`, from the [xtools](https://github.com/leahneukirchen/xtools) package, to create new templates:
>>>>>>> upstream/master
$ xnew pkgname subpkg1 subpkg2 ...
@ -90,9 +72,6 @@ Templates must have the name `void-packages/srcpkgs/<pkgname>/template`, where `
For deeper insights on the contents of template files, please read the [manual](./Manual.md), and be sure to browse the existing template files in the `srcpkgs` directory of this repository for concrete examples.
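As a rough sketch of what such a template contains (package name, URLs and checksum below are placeholders, not a real package), a minimal GNU-configure-style template might look like:

    # Template file for 'foo'
    pkgname=foo
    version=1.0
    revision=1
    build_style=gnu-configure
    hostmakedepends="pkg-config"
    makedepends="zlib-devel"
    short_desc="Short example description"
    maintainer="Your Name <you@example.org>"
    license="MIT"
    homepage="https://example.org/foo"
    distfiles="https://example.org/foo/foo-${version}.tar.gz"
    checksum=0000000000000000000000000000000000000000000000000000000000000000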
<<<<<<< HEAD
When you've finished working on the template file, please check it with `xlint` helper from the [xtools](https://github.com/chneukirchen/xtools) package:
=======
### Updating a template
At minimum, a template update will consist of changing `version` and `checksum`, if there was an upstream version change, and/or `revision`, if a template-specific change (e.g. patch, correction, etc.) is needed.
@ -116,46 +95,11 @@ When building for `x86_64*` or `i686`, building with the `-Q` flag or with `XBPS
Also, new packages and updates will not be accepted unless they have been runtime tested by installing and running the package.
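For example, a check-enabled native build of a hypothetical package `foo` can be requested either way:

    $ ./xbps-src -Q pkg foo
    $ XBPS_CHECK_PKGS=yes ./xbps-src pkg foo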
When you've finished working on the template file, please check it with `xlint` helper from the [xtools](https://github.com/leahneukirchen/xtools) package:
>>>>>>> upstream/master
$ xlint template
If `xlint` reports any issues, resolve them before committing.
<<<<<<< HEAD
### Committing your changes
Once you have made and verified your changes to the package template and/or other files, make one commit per package (including all changes to its sub-packages). Each commit message should have one of the following formats:
* for new packages, use ```New package: <pkgname>-<version>``` ([example](https://github.com/void-linux/void-packages/commit/176d9655429188aac10cd229827f99b72982ab10)).
* for package updates, use ```<pkgname>: update to <version>.``` ([example](https://github.com/void-linux/void-packages/commit/b6b82dcbd4aeea5fc37a32e4b6a8dd8bd980d5a3)).
* for template modifications without a version change, use ```<pkgname>: <reason>``` ([example](https://github.com/void-linux/void-packages/commit/8b68d6bf1eb997cd5e7c095acd040e2c5379c91d)).
* for package removals, use ```<pkgname>: remove package``` ([example](https://github.com/void-linux/void-packages/commit/83784632d94deee5d038c8e1c4c1dffa922fca21)).
* for `common/shlibs` modifications, use `common/shlibs: <pkgname>` ([example](https://github.com/void-linux/void-packages/commit/613651c91811cb4fd2e1a6be701c87072d759a9f)).
If you want to describe your changes in more detail, add an empty line followed by those details ([example](https://github.com/void-linux/void-packages/commit/f1c45a502086ba1952f23ace9084a870ce437bc6)).
`xbump`, available in the [xtools](https://github.com/chneukirchen/xtools) package, can be used to commit a new or updated package:
$ xbump <pkgname> <git commit options>
`xbump` will use `git commit` to commit the changes with the appropriate commit message. For more fine-grained control over the commit, specific options can be passed to `git commit` by adding them after the package name.
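For instance, committing an update to a hypothetical package `foo`, optionally passing `--amend` through to `git commit` to fix up the previous commit:

    $ xbump foo
    $ xbump foo --amend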
After committing your changes, please check that the package builds successfully. From the top level directory of your local copy of the `void-packages` repository, run:
$ ./xbps-src pkg <pkgname>
Your package must build successfully for at least x86, but we recommend trying to build for armv* as well, e.g.:
$ ./xbps-src -a armv7l pkg <pkgname>
Runtime testing of packages and building with the `-Q` flag or with `XBPS_CHECK_PKGS=yes` set in the environment or `etc/conf` are strongly encouraged.
New packages will not be accepted unless they have been runtime tested.
=======
Once you have made and verified your changes to the package template and/or other files, make one commit per package (including all changes to its sub-packages). Each commit message should have one of the following formats:
* for new packages, use `New package: <pkgname>-<version>` ([example](https://github.com/void-linux/void-packages/commit/8ed8d41c40bf6a82cf006c7e207e05942c15bff8)).
@ -180,7 +124,6 @@ If you want to describe your changes in more detail, explain in the commit body
$ xrevbump '<message>' <pkgnames...>
`xbump` and `xrevbump` will use `git commit` to commit the changes with the appropriate commit message. For more fine-grained control over the commit, specific options can be passed to `git commit` by adding them after the package name.
>>>>>>> upstream/master
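As an illustration (package names and message are placeholders), a revision bump of two templates for a rebuild could look like:

    $ xrevbump 'rebuild against libfoo 2.0' foo bar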
### Starting a pull request
@ -233,18 +176,12 @@ Once you have applied all requested changes, the reviewers will merge your reque
If the pull request becomes inactive for some days, the reviewers may or may not warn you when they are about to close it.
If it stays inactive further, it will be closed.
<<<<<<< HEAD
Please abstain from temporarily closing a pull request while revising the templates. Instead, leave a comment on the PR describing what still needs work, or add "[WIP]" to the PR title. Only close your pull request if you're sure you don't want your changes to be included.
=======
Please abstain from temporarily closing a pull request while revising the templates. Instead, leave a comment on the PR describing what still needs work, [mark it as a draft](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/changing-the-stage-of-a-pull-request#converting-a-pull-request-to-a-draft), or add "[WIP]" to the PR title. Only close your pull request if you're sure you don't want your changes to be included.
>>>>>>> upstream/master
#### Publishing the package
Once the reviewers have merged the pull request, our [build server](http://build.voidlinux.org) is automatically triggered and builds
all packages in the pull request for all supported platforms. Upon completion, the packages are available to all Void Linux users.
<<<<<<< HEAD
=======
## Testing Pull Requests
@ -267,4 +204,3 @@ Then fetch and check out the PR (replacing `<remote>` with either `origin` or `u
$ git checkout <branch-name>
Then [build and install](https://github.com/void-linux/void-packages#building-packages) the package and test its functionality.
>>>>>>> upstream/master
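A typical local test run after checking out the PR branch might be (package name hypothetical; the second command needs root):

    $ ./xbps-src pkg foo
    # xbps-install --repository=hostdir/binpkgs foo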

Manual.md

@ -6,10 +6,6 @@ packages for XBPS, the `Void Linux` native packaging system.
*Table of Contents*
* [Introduction](#Introduction)
<<<<<<< HEAD
* [Quality Requirements](#quality_requirements)
=======
>>>>>>> upstream/master
* [Package build phases](#buildphase)
* [Package naming conventions](#namingconventions)
* [Libraries](#libs)
@ -65,10 +61,7 @@ packages for XBPS, the `Void Linux` native packaging system.
* [kernel-hooks](#triggers_kernel_hooks)
* [mimedb](#triggers_mimedb)
* [mkdirs](#triggers_mkdirs)
<<<<<<< HEAD
=======
* [openjdk-profile](#triggers_openjdk_profile)
>>>>>>> upstream/master
* [pango-modules](#triggers_pango_module)
* [pycompile](#triggers_pycompile)
* [register-shell](#triggers_register_shell)
@ -130,41 +123,6 @@ If everything went fine after running
a binary package named `foo-1.0_1.<arch>.xbps` will be generated in the local repository
`hostdir/binpkgs`.
<<<<<<< HEAD
<a id="quality_requirements"></a>
### Quality Requirements
To be included in the Void repository, software must meet at least one
of the following requirements. Exceptions to the list are possible,
and might be accepted, but are extremely unlikely. If you believe you have an
exception, start a PR and make an argument for why that particular piece of
software, while not meeting any of the following requirements, is a good candidate for
the Void packages system.
1. System: The software should be installed system-wide, not per-user.
1. Compiled: The software needs to be compiled before being used, even if it is
software that is not needed by the whole system.
1. Required: Another package either within the repository or pending inclusion
requires the package.
In particular, new themes are highly unlikely to be accepted. Simple shell
scripts are unlikely to be accepted unless they provide considerable value to a
broad user base. New fonts may be accepted if they provide value beyond
aesthetics (e.g. they contain glyphs for a script missing in already packaged
fonts).
Browser forks, including those based on Chromium and Firefox, are generally not
accepted. Such forks require heavy patching, maintenance and hours of build time.
Software need to be used in version announced by authors as ready to use by
the general public - usually called releases. Betas, arbitrary VCS revisions,
templates using tip of development branch taken at build time and releases
created by the package maintainer won't be accepted.
=======
>>>>>>> upstream/master
<a id="buildphase"></a>
### Package build phases
@ -437,11 +395,8 @@ in this directory such as `${XBPS_BUILDDIR}/${wrksrc}`.
- `XBPS_RUST_TARGET` The target architecture triplet used by `rustc` and `cargo`.
<<<<<<< HEAD
=======
- `XBPS_BUILD_ENVIRONMENT` Enables continuous-integration-specific operations. Set to `void-packages-ci` if in continuous integration.
>>>>>>> upstream/master
<a id="available_vars"></a>
### Available variables
@ -478,11 +433,7 @@ the generated `binary packages` have been modified.
- `short_desc` A string with a brief description for this package. Max 72 chars.
- `version` A string with the package version. Must not contain dashes or underscore
<<<<<<< HEAD
and at least one digit is required. Shell's variable substition usage is not allowed.
=======
and at least one digit is required. Shell's variable substitution usage is not allowed.
>>>>>>> upstream/master
Neither `pkgname` or `version` should contain special characters which make it
necessary to quote them, so they shouldn't be quoted in the template.
@ -561,15 +512,6 @@ can be specified by prepending a commercial at (@).
For tarballs you can find the contents checksum by using the command
`tar xf <tarball.ext> --to-stdout | sha256sum`.
<<<<<<< HEAD
- `wrksrc` The directory name where the package sources are extracted, by default
set to `${pkgname}-${version}`. If the top level directory of a package's `distfile` is different from the default, `wrksrc` must be set to the top level directory name inside the archive.
- `build_wrksrc` A directory relative to `${wrksrc}` that will be used when building the package.
- `create_wrksrc` Enable it to create the `${wrksrc}` directory. Required if a package
contains multiple `distfiles`.
=======
- `wrksrc` The directory name where the package sources are extracted, set to `${pkgname}-${version}`.
- `build_wrksrc` A directory relative to `${wrksrc}` that will be used when building the package.
@ -578,7 +520,6 @@ contains multiple `distfiles`.
files and/or directories or when there're no directories at all, top-level files,
and directories will be wrapped inside one more layer of directory.
Set `create_wrksrc` to force this behaviour.
>>>>>>> upstream/master
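For instance, a hypothetical template shipping two distfiles could force the wrapper directory and point the build at the main source tree:

    create_wrksrc=yes
    build_wrksrc="foo-${version}"
    distfiles="https://example.org/foo-${version}.tar.gz
     https://example.org/foo-data-${version}.tar.gz"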
- `build_style` This specifies the `build method` for a package. Read below to know more
about the available package `build methods` or effect of leaving this not set.
@ -607,15 +548,8 @@ build methods. Unset by default.
`${build_style}` is set to `configure`, `gnu-configure` or `gnu-makefile`
build methods. Unset by default.
<<<<<<< HEAD
- `make_install_args` The arguments to be passed in to `${make_cmd}` at the `install-destdir`
phase if `${build_style}` is set to `configure`, `gnu-configure` or
`gnu-makefile` build methods. By default set to
`PREFIX=/usr DESTDIR=${DESTDIR}`.
=======
- `make_install_args` The arguments to be passed in to `${make_cmd}` at the `install`
phase if `${build_style}` is set to `configure`, `gnu-configure` or `gnu-makefile` build methods.
>>>>>>> upstream/master
- `make_build_target` The build target. If `${build_style}` is set to `configure`, `gnu-configure`
or `gnu-makefile`, this is the target passed to `${make_cmd}` in the build phase;
@ -635,12 +569,9 @@ path of the Python wheel produced by the build phase that will be installed; whe
`python-pep517` build style will look for a wheel matching the package name and version in the
current directory with respect to the install.
<<<<<<< HEAD
=======
- `make_check_pre` The expression in front of `${make_cmd}`. This can be used for wrapper commands
or for setting environment variables for the check command. By default empty.
>>>>>>> upstream/master
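As an illustration, a test suite that needs a wrapper command or a tweaked environment might set one of (values illustrative):

    make_check_pre="xvfb-run"
    make_check_pre="env HOME=/tmp"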
- `patch_args` The arguments to be passed in to the `patch(1)` command when applying
patches to the package sources during `do_patch()`. Patches are stored in
`srcpkgs/<pkgname>/patches` and must be in `-p1` format. By default set to `-Np1`.
@ -650,14 +581,11 @@ and `XBPS_MAKEJOBS` will be set to 1. If a package does not work well with `XBPS
but still has a mechanism to build in parallel, set `disable_parallel_build` and
use `XBPS_ORIG_MAKEJOBS` (which holds the original value of `XBPS_MAKEJOBS`) in the template.
<<<<<<< HEAD
=======
- `disable_parallel_check` If set tests for the package won't be built and run in parallel
and `XBPS_MAKEJOBS` will be set to 1. If a package does not work well with `XBPS_MAKEJOBS`
but still has a mechanism to run checks in parallel, set `disable_parallel_check` and
use `XBPS_ORIG_MAKEJOBS` (which holds the original value of `XBPS_MAKEJOBS`) in the template.
>>>>>>> upstream/master
- `make_check` Sets the cases in which the `check` phase is run.
This option has to be accompanied by a comment explaining why the tests fail.
Allowed values:
@ -702,11 +630,7 @@ debugging symbols. Files can be given by full path or by filename.
- `noshlibprovides` If set, the ELF binaries won't be inspected to collect the provided
sonames in shared libraries.
<<<<<<< HEAD
- `noverifyrdeps` If set, the ELF binaries and shared libaries won't be inspected to collect
=======
- `noverifyrdeps` If set, the ELF binaries and shared libraries won't be inspected to collect
>>>>>>> upstream/master
their reverse dependencies. You need to specify all dependencies in the `depends` when you
need to set this.
@ -746,11 +670,7 @@ This appends to the generated file rather than replacing it.
- `nopie` Only needs to be set to something to make active, disables building the package with hardening
features (PIE, relro, etc). Not necessary for most packages.
<<<<<<< HEAD
- `nopie_files` White-space seperated list of ELF binaries that won't be checked
=======
- `nopie_files` White-space separated list of ELF binaries that won't be checked
>>>>>>> upstream/master
for PIE. Files must be given by full path.
- `reverts` xbps supports a unique feature which allows to downgrade from broken
@ -836,11 +756,7 @@ A special value `noarch` used to be available, but has since been removed.
So far, we have listed four types of `depends` variables: `hostmakedepends`,
`makedepends`, `checkdepends` and `depends`. These different kinds of variables
are necessary because `xbps-src` supports cross compilation and to avoid
<<<<<<< HEAD
installing unecessary packages in the build environment.
=======
installing unnecessary packages in the build environment.
>>>>>>> upstream/master
During a build process, there are programs that must be _run_ on the host, such
as `yacc` or the C compiler. The packages that contain these programs should be
@ -1185,15 +1101,9 @@ Current working directory for functions is set as follows:
- For do_fetch, post_fetch: `XBPS_BUILDDIR`.
<<<<<<< HEAD
- For do_extract, post_extract: `wrksrc`.
- For pre_patch through post_install: `build_wrksrc`
=======
- For do_extract through do_patch: `wrksrc`.
- For post_patch through post_install: `build_wrksrc`
>>>>>>> upstream/master
if it is defined, otherwise `wrksrc`.
<a id="build_options"></a>
@ -1343,13 +1253,8 @@ declaring a virtual name and version in the `${provides}` template variable (e.g
specific provider can declare a dependency on the virtual package name with the prefix `virtual?`
(e.g., `depends="virtual?vpkg-0.1_1"`). When a package is built by `xbps-src`, providers for any
virtual packages will be confirmed to exist and will be built if necessary. A map from virtual
<<<<<<< HEAD
packages to their default providers is defined in `etc/default.virtual`. Individual mappings can be
overridden by local preferences in `etc/virtual`. Comments in `etc/default.virtual` provide more
=======
packages to their default providers is defined in `etc/defaults.virtual`. Individual mappings can be
overridden by local preferences in `etc/virtual`. Comments in `etc/defaults.virtual` provide more
>>>>>>> upstream/master
information on this map.
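Putting both sides together, a provider and a consumer of a virtual package might declare (virtual name illustrative):

    # provider template
    provides="smtp-server-0_1"
    # dependent template
    depends="virtual?smtp-server-0_1"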
<a id="install_remove_files"></a>
@ -1478,8 +1383,6 @@ If the service requires directories in parts of the system that are not generall
temporary filesystems. Then use the `make_dirs` variable in the template to create
those directories when the package is installed.
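Each `make_dirs` entry takes a path, octal mode, owner and group; a sketch with illustrative values:

    make_dirs="/var/lib/foo 0750 _foo _foo"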
<<<<<<< HEAD
=======
If the package installs a systemd service file or other unit, leave it in place as a
reference point so long as including it has no negative side effects.
@ -1488,7 +1391,6 @@ Examples of when *not* to install systemd units:
1. When doing so changes runtime behavior of the packaged software.
2. When it is done via a compile time flag that also changes build dependencies.
>>>>>>> upstream/master
<a id="32bit_pkgs"></a>
### 32bit packages
@ -1668,20 +1570,11 @@ recursively by the target python version. This differs from `pycompile_module` i
path may be specified, Example: `pycompile_dirs="usr/share/foo"`.
- `python_version`: this variable expects the supported Python major version.
<<<<<<< HEAD
By default it's set to `2`. This variable is needed for multi-language
applications (e.g., the application is written in C while the command is
written in Python) or just single Python file ones that live in `/usr/bin`.
> NOTE: you need to define it *only* for non-Python modules.
=======
In most cases version is inferred from shebang, install path or build style.
Only required for some multi-language
applications (e.g., the application is written in C while the command is
written in Python) or just single Python file ones that live in `/usr/bin`.
>>>>>>> upstream/master
Also, a set of useful variables are defined to use in the templates:
| Variable | Value |
@ -1719,10 +1612,7 @@ The following template variables influence how Go packages are built:
any go.mod files, `default` to use Go's default behavior, or anything
accepted by `go build -mod MODE`. Defaults to `vendor` if there's
a vendor directory, otherwise `default`.
<<<<<<< HEAD
=======
- `go_ldflags`: Arguments to pass to the linking steps of go tool.
>>>>>>> upstream/master
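For example, a hypothetical Go template might combine these variables as follows (import path and flags illustrative):

    go_import_path="github.com/example/foo"
    go_ldflags="-X main.version=${version}"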
The following environment variables influence how Go packages are built:
@ -2092,8 +1982,6 @@ During removal it will delete the directory using `rmdir`.
To include this trigger use the `make_dirs` variable, as the trigger won't do anything
unless it is defined.
<<<<<<< HEAD
=======
<a id="triggers_openjdk_profile"></a>
#### openjdk-profile
@ -2101,7 +1989,6 @@ The openjdk-profile trigger is responsible for creating an entry in /etc/profile
sets the `JAVA_HOME` environment variable to the currently-selected alternative for
`/usr/bin/java` on installation. This trigger must be manually requested.
>>>>>>> upstream/master
<a id="triggers_pango_module"></a>
#### pango-modules


@ -3,7 +3,4 @@ if [ "$CROSS_BUILD" ]; then
else
export WX_CONFIG=/usr/bin/wx-config-gtk3
fi
<<<<<<< HEAD
=======
configure_args+=" -DwxWidgets_CONFIG_EXECUTABLE=${WX_CONFIG} "
>>>>>>> upstream/master


@ -3,42 +3,24 @@
#
do_build() {
<<<<<<< HEAD
: ${make_cmd:=cargo}
=======
: ${make_cmd:=cargo auditable}
>>>>>>> upstream/master
${make_cmd} build --release --target ${RUST_TARGET} ${configure_args}
}
do_check() {
<<<<<<< HEAD
: ${make_cmd:=cargo}
${make_cmd} test --release --target ${RUST_TARGET} ${configure_args} \
=======
: ${make_cmd:=cargo auditable}
${make_check_pre} ${make_cmd} test --release --target ${RUST_TARGET} ${configure_args} \
>>>>>>> upstream/master
${make_check_args}
}
do_install() {
<<<<<<< HEAD
: ${make_cmd:=cargo}
: ${make_install_args:=--path .}
${make_cmd} install --target ${RUST_TARGET} --root="${DESTDIR}/usr" \
--locked ${configure_args} ${make_install_args}
=======
: ${make_cmd:=cargo auditable}
: ${make_install_args:=--path .}
${make_cmd} install --target ${RUST_TARGET} --root="${DESTDIR}/usr" \
--offline --locked ${configure_args} ${make_install_args}
>>>>>>> upstream/master
rm -f "${DESTDIR}"/usr/.crates.toml
rm -f "${DESTDIR}"/usr/.crates2.json


@ -54,8 +54,6 @@ _EOF
cmake_args+=" -DCMAKE_INSTALL_PREFIX=/usr"
cmake_args+=" -DCMAKE_BUILD_TYPE=None"
cmake_args+=" -DCMAKE_INSTALL_LIBDIR=lib${XBPS_TARGET_WORDSIZE}"
<<<<<<< HEAD
=======
cmake_args+=" -DCMAKE_INSTALL_SYSCONFDIR=/etc"
if [ "$CROSS_BUILD" ]; then
@ -64,7 +62,6 @@ _EOF
# which have binfmts support on and off
cmake_args+=" -DQT_HOST_PATH_CMAKE_DIR=/usr/lib/cmake"
fi
>>>>>>> upstream/master
if [[ $build_helper = *"qemu"* ]]; then
echo "SET(CMAKE_CROSSCOMPILING_EMULATOR /usr/bin/qemu-${XBPS_TARGET_QEMU_MACHINE}-static)" \
@ -127,11 +124,7 @@ do_check() {
: ${make_check_target:=test}
<<<<<<< HEAD
${make_cmd} ${make_check_args} ${make_check_target}
=======
${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
}
do_install() {


@ -29,11 +29,7 @@ do_check() {
: ${make_cmd:=make}
: ${make_check_target:=check}
<<<<<<< HEAD
${make_cmd} ${make_check_args} ${make_check_target}
=======
${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
}
do_install() {


@ -30,11 +30,7 @@ do_check() {
: ${make_cmd:=make}
: ${make_check_target:=check}
<<<<<<< HEAD
${make_cmd} ${make_check_args} ${make_check_target}
=======
${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
}
do_install() {


@ -9,15 +9,10 @@ do_build() {
CC="$CC" CXX="$CXX" LD="$LD" AR="$AR" RANLIB="$RANLIB" \
CPP="$CPP" AS="$AS" OBJCOPY="$OBJCOPY" OBJDUMP="$OBJDUMP" \
CFLAGS="$CFLAGS" CXXFLAGS="$CXXFLAGS" LDFLAGS="$LDFLAGS" \
<<<<<<< HEAD
${makejobs} ${make_build_args} ${make_build_target}
else
=======
PREFIX=/usr prefix=/usr \
${makejobs} ${make_build_args} ${make_build_target}
else
export PREFIX=/usr prefix=/usr
>>>>>>> upstream/master
${make_cmd} ${makejobs} ${make_build_args} ${make_build_target}
fi
}
@ -37,20 +32,12 @@ do_check() {
: ${make_cmd:=make}
: ${make_check_target:=check}
<<<<<<< HEAD
${make_cmd} ${make_check_args} ${make_check_target}
=======
${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
}
do_install() {
: ${make_cmd:=make}
: ${make_install_target:=install}
<<<<<<< HEAD
${make_cmd} STRIP=true PREFIX=/usr DESTDIR=${DESTDIR} ${make_install_args} ${make_install_target}
=======
${make_cmd} STRIP=true PREFIX=/usr prefix=/usr DESTDIR=${DESTDIR} ${make_install_args} ${make_install_target}
>>>>>>> upstream/master
}


@ -22,8 +22,6 @@ do_configure() {
}
do_build() {
<<<<<<< HEAD
=======
# remove -s and -w from go_ldflags, we should let xbps-src strip binaries itself
for wd in $go_ldflags; do
if [ "$wd" == "-s" ] || [ "$wd" == "-w" ]; then
@ -31,7 +29,6 @@ do_build() {
fi
done
>>>>>>> upstream/master
go_package=${go_package:-$go_import_path}
# Build using Go modules if there's a go.mod file
if [ "${go_mod_mode}" != "off" ] && [ -f go.mod ]; then


@ -103,11 +103,7 @@ do_configure() {
export AR="gcc-ar"
# unbuffered output for continuous logging
<<<<<<< HEAD
PYTHONUNBUFFERED=1 ${meson_cmd} \
=======
PYTHONUNBUFFERED=1 ${meson_cmd} setup \
>>>>>>> upstream/master
--prefix=/usr \
--libdir=/usr/lib${XBPS_TARGET_WORDSIZE} \
--libexecdir=/usr/libexec \
@ -142,11 +138,7 @@ do_check() {
: ${make_check_target:=test}
: ${meson_builddir:=build}
<<<<<<< HEAD
${make_cmd} -C ${meson_builddir} ${makejobs} ${make_check_args} ${make_check_target}
=======
${make_check_pre} ${make_cmd} -C ${meson_builddir} ${makejobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
}
do_install() {


@ -41,11 +41,7 @@ do_check() {
if [ ! -x ./Build ]; then
msg_error "$pkgver: cannot find ./Build script!\n"
fi
<<<<<<< HEAD
./Build test
=======
${make_check_pre} ./Build test
>>>>>>> upstream/master
}
do_install() {


@ -79,11 +79,7 @@ do_check() {
: ${make_cmd:=make}
: ${make_check_target:=test}
<<<<<<< HEAD
${make_cmd} ${make_check_args} ${make_check_target}
=======
${make_check_pre} ${make_cmd} ${makejobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
}
do_install() {


@ -49,11 +49,7 @@ do_check() {
fi
fi
<<<<<<< HEAD
python${pyver} setup.py ${make_check_target:-test} ${make_check_args}
=======
${make_check_pre} python${pyver} setup.py ${make_check_target:-test} ${make_check_args}
>>>>>>> upstream/master
rm build
done
}


@ -3,32 +3,6 @@
#
do_build() {
<<<<<<< HEAD
if [ -n "$CROSS_BUILD" ]; then
PYPREFIX="$XBPS_CROSS_BASE"
CFLAGS+=" -I${XBPS_CROSS_BASE}/${py3_inc} -I${XBPS_CROSS_BASE}/usr/include"
LDFLAGS+=" -L${XBPS_CROSS_BASE}/${py3_lib} -L${XBPS_CROSS_BASE}/usr/lib"
CC="${XBPS_CROSS_TRIPLET}-gcc -pthread $CFLAGS $LDFLAGS"
LDSHARED="${CC} -shared $LDFLAGS"
for f in ${XBPS_CROSS_BASE}/${py3_lib}/_sysconfigdata_*; do
f=${f##*/}
_PYTHON_SYSCONFIGDATA_NAME=${f%.py}
done
env CC="$CC" LDSHARED="$LDSHARED" \
PYPREFIX="$PYPREFIX" CFLAGS="$CFLAGS" \
PYTHONPATH=${XBPS_CROSS_BASE}/${py3_lib} \
_PYTHON_SYSCONFIGDATA_NAME="$_PYTHON_SYSCONFIGDATA_NAME" \
LDFLAGS="$LDFLAGS" python3 setup.py build ${make_build_args}
else
python3 setup.py build ${make_build_args}
fi
}
do_check() {
if python3 -c 'import pytest' >/dev/null 2>&1; then
PYTHONPATH="$(cd build/lib* && pwd)" \
python3 -m pytest ${make_check_args} ${make_check_target}
=======
python3 setup.py build ${make_build_args}
}
@ -41,7 +15,6 @@ do_check() {
PYTHONPATH="$(cd build/lib* && pwd)" \
${make_check_pre} \
python3 -m pytest ${testjobs} ${make_check_args} ${make_check_target}
>>>>>>> upstream/master
else
# Fall back to deprecated setup.py test orchestration without pytest
if [ -z "$make_check_target" ]; then
@ -52,36 +25,10 @@ do_check() {
fi
: ${make_check_target:=test}
<<<<<<< HEAD
python3 setup.py ${make_check_target} ${make_check_args}
=======
${make_check_pre} python3 setup.py ${make_check_target} ${make_check_args}
>>>>>>> upstream/master
fi
}
do_install() {
<<<<<<< HEAD
if [ -n "$CROSS_BUILD" ]; then
PYPREFIX="$XBPS_CROSS_BASE"
CFLAGS+=" -I${XBPS_CROSS_BASE}/${py3_inc} -I${XBPS_CROSS_BASE}/usr/include"
LDFLAGS+=" -L${XBPS_CROSS_BASE}/${py3_lib} -L${XBPS_CROSS_BASE}/usr/lib"
CC="${XBPS_CROSS_TRIPLET}-gcc -pthread $CFLAGS $LDFLAGS"
LDSHARED="${CC} -shared $LDFLAGS"
for f in ${XBPS_CROSS_BASE}/${py3_lib}/_sysconfigdata_*; do
f=${f##*/}
_PYTHON_SYSCONFIGDATA_NAME=${f%.py}
done
env CC="$CC" LDSHARED="$LDSHARED" \
PYPREFIX="$PYPREFIX" CFLAGS="$CFLAGS" \
PYTHONPATH=${XBPS_CROSS_BASE}/${py3_lib} \
_PYTHON_SYSCONFIGDATA_NAME="$_PYTHON_SYSCONFIGDATA_NAME" \
LDFLAGS="$LDFLAGS" python3 setup.py \
install --prefix=/usr --root=${DESTDIR} ${make_install_args}
else
python3 setup.py install --prefix=/usr --root=${DESTDIR} ${make_install_args}
fi
=======
python3 setup.py install --prefix=/usr --root=${DESTDIR} ${make_install_args}
>>>>>>> upstream/master
}


@ -3,22 +3,6 @@
#
do_build() {
<<<<<<< HEAD
# No PEP517 build tool currently supports compiled extensions
# Thus, there is no need to accommodate cross compilation here
: ${make_build_target:=.}
mkdir -p build
TMPDIR=build python3 -m pip wheel --no-deps --use-pep517 --no-clean \
--no-build-isolation ${make_build_args} ${make_build_target}
}
do_check() {
if python3 -c 'import pytest' >/dev/null 2>&1; then
python3 -m pytest ${make_check_args} ${make_check_target}
else
msg_warn "Unable to determine tests for PEP517 Python templates"
=======
: ${make_build_target:=.}
: ${make_build_args:=--no-isolation --wheel}
python3 -m build ${make_build_args} ${make_build_target}
@ -33,22 +17,11 @@ do_check() {
${make_check_pre} python3 -m pytest ${testjobs} ${make_check_args} ${make_check_target}
else
msg_warn "Unable to determine tests for PEP517 Python templates\n"
>>>>>>> upstream/master
return 0
fi
}
do_install() {
<<<<<<< HEAD
# As with do_build, no need to accommodate cross compilation here
: ${make_install_target:=${pkgname#python3-}-${version}-*-*-*.whl}
# If do_build was overridden, make sure the TMPDIR exists
mkdir -p build
TMPDIR=build python3 -m pip install --use-pep517 --prefix /usr \
--root ${DESTDIR} --no-deps --no-build-isolation \
--no-clean ${make_install_args} ${make_install_target}
=======
if [ -z "${make_install_target}" ]; then
# Default wheel name normalizes hyphens to underscores
local wheelbase="${pkgname#python3-}"
@ -57,5 +30,4 @@ do_install() {
python3 -m installer --destdir ${DESTDIR} \
${make_install_args} ${make_install_target}
>>>>>>> upstream/master
}


@ -1,26 +1,12 @@
#
<<<<<<< HEAD
# This helper is for templates using Qt4/Qt5 qmake.
=======
# This helper is for templates using Qt5/Qt6 qmake.
>>>>>>> upstream/master
#
do_configure() {
local qmake
local qmake_args
if [ -x "/usr/lib/qt5/bin/qmake" ]; then
<<<<<<< HEAD
# Qt5 qmake
qmake="/usr/lib/qt5/bin/qmake"
fi
if [ -x "/usr/lib/qt/bin/qmake" ]; then
# Qt4 qmake
qmake="/usr/lib/qt/bin/qmake"
fi
=======
qmake="/usr/lib/qt5/bin/qmake"
fi
>>>>>>> upstream/master
if [ -z "${qmake}" ]; then
msg_error "${pkgver}: Could not find qmake - missing in hostmakedepends?\n"
fi


@ -3,11 +3,7 @@
#
do_check() {
<<<<<<< HEAD
RAKULIB=lib prove -r -e raku t/
=======
RAKULIB=lib ${make_check_pre} prove -r -e raku t/
>>>>>>> upstream/master
}
do_install() {


@ -9,38 +9,27 @@ do_build() {
CXXFLAGS="$CXXFLAGS" LINKFLAGS="$LDFLAGS" \
cxxflags="$CXXFLAGS" linkflags="$LDFLAGS" \
RANLIB="$RANLIB" ranlib="$RANLIB" \
<<<<<<< HEAD
prefix=/usr destdir=${DESTDIR} DESTDIR=${DESTDIR} \
=======
prefix=/usr \
${scons_use_destdir:+DESTDIR="${DESTDIR}"} \
${scons_use_destdir:+destdir="${DESTDIR}"} \
>>>>>>> upstream/master
${make_build_args} ${make_build_target}
}
do_install() {
: ${make_cmd:=scons}
: ${make_install_target:=install}
<<<<<<< HEAD
=======
local _sandbox=
if [ -z "$scons_use_destdir" ]; then _sandbox=yes ; fi
>>>>>>> upstream/master
${make_cmd} ${makejobs} CC=$CC CXX=$CXX CCFLAGS="$CFLAGS" \
cc=$CC cxx=$CXX ccflags="$CFLAGS" \
CXXFLAGS="$CXXFLAGS" LINKFLAGS="$LDFLAGS" \
cxxflags="$CXXFLAGS" linkflags="$LDFLAGS" \
RANLIB="$RANLIB" ranlib="$RANLIB" \
<<<<<<< HEAD
prefix=/usr destdir=${DESTDIR} DESTDIR=${DESTDIR} \
=======
prefix=/usr \
${scons_use_destdir:+DESTDIR="${DESTDIR}"} \
${scons_use_destdir:+destdir="${DESTDIR}"} \
${_sandbox:+--install-sandbox="${DESTDIR}"} \
>>>>>>> upstream/master
${make_install_args} ${make_install_target}
}


@ -5,10 +5,6 @@
# required variables
#
# build_style=slashpackage
<<<<<<< HEAD
# wrksrc=<category>
=======
>>>>>>> upstream/master
# build_wrksrc=${pkgname}-${version}
# distfiles=<download link>
#
@ -18,10 +14,6 @@
# pkgname=daemontools
# version=0.76
# revision=1
<<<<<<< HEAD
# wrksrc=admin
=======
>>>>>>> upstream/master
# build_wrksrc=${pkgname}-${version}
# build_style=slashpackage
# short_desc="A collection of tools for managing UNIX services"


@ -1,12 +1,5 @@
makedepends+=" R"
depends+=" R"
<<<<<<< HEAD
wrksrc="${XBPS_BUILDDIR}/${pkgname#R-cran-}"
# default to cran
if [ -z "$distfiles" ]; then
distfiles="https://cran.r-project.org/src/contrib/${pkgname#R-cran-}_${version//r/-}.tar.gz"
=======
create_wrksrc=required
build_wrksrc="${pkgname#R-cran-}"
@ -14,5 +7,4 @@ build_wrksrc="${pkgname#R-cran-}"
if [ -z "$distfiles" ]; then
distfiles="https://cran.r-project.org/src/contrib/${pkgname#R-cran-}_${version//r/-}.tar.gz
https://cran.r-project.org/src/contrib/Archive/${pkgname#R-cran-}/${pkgname#R-cran-}_${version//r/-}.tar.gz"
>>>>>>> upstream/master
fi


@ -1,12 +1,9 @@
hostmakedepends+=" cargo"
<<<<<<< HEAD
=======
if ! [[ "$pkgname" =~ ^cargo-auditable(-bootstrap)?$ ]]; then
hostmakedepends+=" cargo-auditable"
fi
>>>>>>> upstream/master
if [ "$CROSS_BUILD" ]; then
makedepends+=" rust-std"
fi


@ -43,11 +43,8 @@ case "$XBPS_TARGET_MACHINE" in
*-musl) export GOCACHE="${XBPS_HOSTDIR}/gocache-muslc" ;;
*) export GOCACHE="${XBPS_HOSTDIR}/gocache-glibc" ;;
esac
<<<<<<< HEAD
=======
case "$XBPS_TARGET_MACHINE" in
# https://go.dev/cl/421935
i686*) export CGO_CFLAGS="$CGO_CFLAGS -fno-stack-protector" ;;
esac
>>>>>>> upstream/master


@ -1,6 +1,3 @@
lib32disabled=yes
makedepends+=" python3"
<<<<<<< HEAD
=======
build_helper+=" python3"
>>>>>>> upstream/master


@ -1,8 +1,3 @@
<<<<<<< HEAD
hostmakedepends+=" python3-pip"
lib32disabled=yes
=======
hostmakedepends+=" python3-build python3-installer"
lib32disabled=yes
build_helper+=" python3"
>>>>>>> upstream/master


@ -2,7 +2,3 @@
hostmakedepends+=" rsync"
# python_version isn't needed for everything either
python_version=3
<<<<<<< HEAD
create_wrksrc=yes
=======
>>>>>>> upstream/master


@ -1,9 +1,5 @@
lib32disabled=yes
nopie=yes
<<<<<<< HEAD
create_wrksrc=yes
=======
>>>>>>> upstream/master
nostrip_files+=" libcaf_single.a libgcc.a libgcov.a libgcc_eh.a
libgnarl_pic.a libgnarl.a libgnat_pic.a libgnat.a libgmem.a"


@ -1,7 +1,3 @@
<<<<<<< HEAD
CFLAGS="${CFLAGS} -fdebug-prefix-map=$wrksrc=."
CXXFLAGS="${CXXFLAGS} -fdebug-prefix-map=$wrksrc=."
=======
local _wrksrc="$wrksrc${build_wrksrc:+/$build_wrksrc}"
case "$build_style" in
cmake)
@ -18,4 +14,3 @@ meson)
esac
unset _wrksrc
>>>>>>> upstream/master


@ -7,11 +7,7 @@ py2_lib="usr/lib/python${py2_ver}"
py2_sitelib="${py2_lib}/site-packages"
py2_inc="usr/include/python${py2_ver}"
<<<<<<< HEAD
py3_ver="3.10"
=======
py3_ver="3.11"
>>>>>>> upstream/master
py3_abiver=""
py3_lib="usr/lib/python${py3_ver}"
py3_sitelib="${py3_lib}/site-packages"


@ -7,35 +7,24 @@ unset -v archs distfiles checksum build_style build_helper nocross broken
unset -v configure_script configure_args wrksrc build_wrksrc create_wrksrc
unset -v make_build_args make_check_args make_install_args
unset -v make_build_target make_check_target make_install_target
<<<<<<< HEAD
unset -v make_cmd meson_cmd gem_cmd fetch_cmd
=======
unset -v make_cmd meson_cmd gem_cmd fetch_cmd make_check_pre
>>>>>>> upstream/master
unset -v python_version stackage
unset -v cmake_builddir meson_builddir
unset -v meson_crossfile
unset -v gemspec
unset -v go_import_path go_package go_mod_mode
<<<<<<< HEAD
unset -v patch_args disable_parallel_build keep_libtool_archives make_use_env
=======
unset -v patch_args disable_parallel_build disable_parallel_check
unset -v keep_libtool_archives make_use_env
>>>>>>> upstream/master
unset -v reverts subpackages makedepends hostmakedepends checkdepends depends restricted
unset -v nopie build_options build_options_default bootstrap repository reverts
unset -v CFLAGS CXXFLAGS FFLAGS CPPFLAGS LDFLAGS LD_LIBRARY_PATH
unset -v CC CXX CPP GCC LD AR AS RANLIB NM OBJDUMP OBJCOPY STRIP READELF PKG_CONFIG
<<<<<<< HEAD
=======
unset -v CMAKE_GENERATOR
# build-helper python3
unset -v PYPREFIX LDSHARED PYTHON_CONFIG PYTHONPATH _PYTHON_SYSCONFIGDATA_NAME
# unset all $build_option_ variables
unset -v "${!build_option_@}"
>>>>>>> upstream/master
# hooks/do-extract/00-distfiles
unset -v skip_extraction


@ -57,10 +57,6 @@ vsed() {
newdigest="$($XBPS_DIGEST_CMD "$f")"
newdigest="${newdigest%% *}"
<<<<<<< HEAD
if [ "$olddigest" = "$newdigest" ]; then
msg_warn "$pkgver: vsed: regex \"$rx\" didn't change file \"$f\"\n"
=======
msgfunc=msg_warn
if [ -n "$XBPS_STRICT" ]; then
msgfunc=msg_error
@ -68,7 +64,6 @@ vsed() {
if [ "$olddigest" = "$newdigest" ]; then
$msgfunc "$pkgver: vsed: regex \"$rx\" didn't change file \"$f\"\n"
>>>>>>> upstream/master
fi
olddigest="${newdigest}"
done


@ -3,11 +3,7 @@
hook() {
local srcdir="$XBPS_SRCDISTDIR/$pkgname-$version"
<<<<<<< HEAD
local f j curfile found extractdir
=======
local f j curfile found extractdir innerdir num_dirs
>>>>>>> upstream/master
local TAR_CMD
if [ -z "$distfiles" -a -z "$checksum" ]; then
@ -24,13 +20,6 @@ hook() {
fi
done
<<<<<<< HEAD
if [ -n "$create_wrksrc" ]; then
mkdir -p "${wrksrc}" || msg_error "$pkgver: failed to create wrksrc.\n"
fi
=======
>>>>>>> upstream/master
# Disable trap on ERR; the code is smart enough to report errors and abort.
trap - ERR
@ -38,12 +27,9 @@ hook() {
[ -z "$TAR_CMD" ] && TAR_CMD="$(command -v tar)"
[ -z "$TAR_CMD" ] && msg_error "xbps-src: no suitable tar cmd (bsdtar, tar)\n"
<<<<<<< HEAD
=======
extractdir=$(mktemp -d "$XBPS_BUILDDIR/.extractdir-XXXXXXX") ||
msg_error "Cannot create temporary dir for do-extract\n"
>>>>>>> upstream/master
msg_normal "$pkgver: extracting distfile(s), please wait...\n"
for f in ${distfiles}; do
@ -86,15 +72,6 @@ hook() {
*) msg_error "$pkgver: unknown distfile suffix for $curfile.\n";;
esac
<<<<<<< HEAD
if [ -n "$create_wrksrc" ]; then
extractdir="$wrksrc"
else
extractdir="$XBPS_BUILDDIR"
fi
=======
>>>>>>> upstream/master
case ${cursufx} in
tar|txz|tbz|tlz|tgz|crate)
$TAR_CMD -x --no-same-permissions --no-same-owner -f $srcdir/$curfile -C "$extractdir"
@ -144,15 +121,7 @@ hook() {
fi
;;
txt)
<<<<<<< HEAD
if [ "$create_wrksrc" ]; then
cp -f $srcdir/$curfile "$extractdir"
else
msg_error "$pkgname: ${curfile##*.} files can only be extracted when create_wrksrc is set\n"
fi
=======
cp -f $srcdir/$curfile "$extractdir"
>>>>>>> upstream/master
;;
7z)
if command -v 7z &>/dev/null; then
@ -170,23 +139,10 @@ hook() {
fi
;;
gem)
<<<<<<< HEAD
case "$TAR_CMD" in
*bsdtar)
$TAR_CMD -xOf $srcdir/$curfile data.tar.gz | \
$TAR_CMD -xz -C "$extractdir" -s ",^,${wrksrc##*/}/," -f -
;;
*)
$TAR_CMD -xOf $srcdir/$curfile data.tar.gz | \
$TAR_CMD -xz -C "$extractdir" --transform="s,^,${wrksrc##*/}/,"
;;
esac
=======
innerdir="$extractdir/${wrksrc##*/}"
mkdir -p "$innerdir"
$TAR_CMD -xOf $srcdir/$curfile data.tar.gz |
$TAR_CMD -xz -C "$innerdir" -f -
>>>>>>> upstream/master
if [ $? -ne 0 ]; then
msg_error "$pkgver: extracting $curfile into $XBPS_BUILDDIR.\n"
fi
@ -196,8 +152,6 @@ hook() {
;;
esac
done
<<<<<<< HEAD
=======
# find "$extractdir" -mindepth 1 -maxdepth 1 -printf '1\n' | wc -l
# However, it requires GNU's find
@ -225,5 +179,4 @@ hook() {
mkdir -p "$wrksrc"
fi ||
msg_error "$pkgver: failed to move sources to $wrksrc\n"
>>>>>>> upstream/master
}


@ -2,27 +2,6 @@
# the $distfiles variable and then verifies its sha256 checksum comparing
# its value with the one stored in the $checksum variable.
<<<<<<< HEAD
# Get the checksum for $curfile at index $dfcount
get_cksum() {
local curfile="$1" dfcount="$2" ckcount cksum i
ckcount=0
cksum=0
for i in ${checksum}; do
if [ $dfcount -eq $ckcount -a -n "$i" ]; then
cksum=$i
fi
ckcount=$((ckcount + 1))
done
if [ -z "$cksum" ]; then
msg_error "$pkgver: cannot find checksum for $curfile.\n"
fi
echo "$cksum"
}
=======
>>>>>>> upstream/master
# Return the checksum of the contents of a tarball
contents_cksum() {
local curfile="$1" cursufx cksum
@ -113,13 +92,7 @@ contents_cksum() {
# Verify the checksum for $curfile stored at $distfile and index $dfcount
verify_cksum() {
<<<<<<< HEAD
local curfile="$1" distfile="$2" dfcount="$3" filesum cksum
cksum=$(get_cksum $curfile $dfcount)
=======
local curfile="$1" distfile="$2" cksum="$3" filesum
>>>>>>> upstream/master
# If the checksum starts with an commercial at (@) it is the contents checksum
if [ "${cksum:0:1}" = "@" ]; then
@ -128,11 +101,7 @@ verify_cksum() {
filesum=$(contents_cksum "$curfile")
if [ "${cksum}" != "$filesum" ]; then
echo
<<<<<<< HEAD
msg_red "SHA256 mismatch for '$curfile:'\n@$filesum\n"
=======
msg_red "SHA256 mismatch for '${curfile}:'\n@${filesum}\n"
>>>>>>> upstream/master
errors=$((errors + 1))
else
msg_normal_append "OK.\n"
@ -142,11 +111,7 @@ verify_cksum() {
filesum=$(${XBPS_DIGEST_CMD} "$distfile")
if [ "$cksum" != "$filesum" ]; then
echo
<<<<<<< HEAD
msg_red "SHA256 mismatch for '$curfile:'\n$filesum\n"
=======
msg_red "SHA256 mismatch for '${curfile}:'\n${filesum}\n"
>>>>>>> upstream/master
errors=$((errors + 1))
else
if [ ! -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" ]; then
@ -160,24 +125,6 @@ verify_cksum() {
# Link an existing cksum $distfile for $curfile at index $dfcount
link_cksum() {
<<<<<<< HEAD
local curfile="$1" distfile="$2" dfcount="$3" filesum cksum
cksum=$(get_cksum $curfile $dfcount)
if [ -n "$cksum" -a -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" ]; then
ln -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" "$distfile"
msg_normal "$pkgver: using known distfile $curfile.\n"
fi
}
try_mirrors() {
local curfile="$1" distfile="$2" dfcount="$3" subdir="$4" f="$5"
local filesum cksum basefile mirror path scheme
[ -z "$XBPS_DISTFILES_MIRROR" ] && return
basefile="${f##*/}"
cksum=$(get_cksum $curfile $dfcount)
=======
local curfile="$1" distfile="$2" cksum="$3"
if [ -n "$cksum" -a -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" ]; then
ln -f "$XBPS_SRCDISTDIR/by_sha256/${cksum}_${curfile}" "$distfile"
@ -192,7 +139,6 @@ try_mirrors() {
local filesum basefile mirror path scheme good
[ -z "$XBPS_DISTFILES_MIRROR" ] && return 1
basefile="${f##*/}"
>>>>>>> upstream/master
for mirror in $XBPS_DISTFILES_MIRROR; do
scheme="file"
if [[ $mirror == *://* ]]; then
@ -211,23 +157,6 @@ try_mirrors() {
fi
if [[ "$mirror" == *voidlinux* ]]; then
# For distfiles.voidlinux.* append the subdirectory
<<<<<<< HEAD
mirror="$mirror/$subdir"
fi
msg_normal "$pkgver: fetching distfile '$curfile' from '$mirror'...\n"
$fetch_cmd "$mirror/$curfile"
# If basefile was not found, but a curfile file may exist, try to fetch it
if [ ! -f "$distfile" -a "$basefile" != "$curfile" ]; then
$fetch_cmd "$mirror/$basefile"
fi
[ ! -f "$distfile" ] && continue
flock -n ${distfile}.part rm -f ${distfile}.part
filesum=$(${XBPS_DIGEST_CMD} "$distfile")
[ "$cksum" == "$filesum" ] && break
msg_normal "$pkgver: checksum failed - removing '$curfile'...\n"
rm -f ${distfile}
done
=======
mirror="$mirror/$pkgname-$version"
fi
msg_normal "$pkgver: fetching distfile '$curfile' from mirror '$mirror'...\n"
@ -283,16 +212,12 @@ try_urls() {
return 0
done
return 1
>>>>>>> upstream/master
}
hook() {
local srcdir="$XBPS_SRCDISTDIR/$pkgname-$version"
local dfcount=0 dfgood=0 errors=0 max_retries
<<<<<<< HEAD
if [ ! -d "$srcdir" ]; then
=======
local -a _distfiles=($distfiles)
local -a _checksums=($checksum)
local -A _file_idxs
@ -306,7 +231,6 @@ hook() {
done
if [[ ! -d "$srcdir" ]]; then
>>>>>>> upstream/master
mkdir -p -m775 "$srcdir"
chgrp $(id -g) "$srcdir"
fi
@ -315,93 +239,13 @@ hook() {
# Disable trap on ERR; the code is smart enough to report errors and abort.
trap - ERR
<<<<<<< HEAD
# Detect bsdtar and GNU tar (in that order of preference)
TAR_CMD="$(command -v bsdtar)"
if [ -z "$TAR_CMD" ]; then
=======
# Detect bsdtar and GNU tar (in that order of preference)
TAR_CMD="$(command -v bsdtar)"
if [[ -z "$TAR_CMD" ]]; then
>>>>>>> upstream/master
TAR_CMD="$(command -v tar)"
fi
# Detect distfiles with obsolete checksum and purge them from the cache
<<<<<<< HEAD
for f in ${distfiles}; do
curfile="${f#*>}"
curfile="${curfile##*/}"
distfile="$srcdir/$curfile"
if [ -f "$distfile" ]; then
cksum=$(get_cksum $curfile $dfcount)
if [ "${cksum:0:1}" = "@" ]; then
cksum=${cksum:1}
filesum=$(contents_cksum "$distfile")
else
filesum=$(${XBPS_DIGEST_CMD} "$distfile")
fi
if [ "$cksum" = "$filesum" ]; then
dfgood=$((dfgood + 1))
else
inode=$(stat "$distfile" --printf "%i")
msg_warn "$pkgver: wrong checksum found for ${curfile} - purging\n"
find ${XBPS_SRCDISTDIR} -inum ${inode} -delete -print
fi
fi
dfcount=$((dfcount + 1))
done
# We're done, if all distfiles were found and had good checksums
[ $dfcount -eq $dfgood ] && return
# Download missing distfiles and verify their checksums
dfcount=0
for f in ${distfiles}; do
curfile="${f#*>}"
curfile="${curfile##*/}"
distfile="$srcdir/$curfile"
# If file lock cannot be acquired wait until it's available.
while true; do
flock -w 1 ${distfile}.part true
[ $? -eq 0 ] && break
msg_warn "$pkgver: ${curfile} is already being downloaded, waiting for 1s ...\n"
done
# If distfile does not exist, try to link to it.
if [ ! -f "$distfile" ]; then
link_cksum $curfile $distfile $dfcount
fi
# If distfile does not exist, download it from a mirror location.
if [ ! -f "$distfile" ]; then
try_mirrors $curfile $distfile $dfcount $pkgname-$version $f
fi
# If distfile does not exist, download it from the original location.
if [[ "$FTP_RETRIES" && "${f}" =~ ^ftp:// ]]; then
max_retries="$FTP_RETRIES"
else
max_retries=1
fi
for retry in $(seq 1 1 $max_retries); do
if [ ! -f "$distfile" ]; then
if [ "$retry" == 1 ]; then
msg_normal "$pkgver: fetching distfile '$curfile'...\n"
else
msg_normal "$pkgver: fetch attempt $retry of $max_retries...\n"
fi
flock "${distfile}.part" $fetch_cmd "$f"
fi
done
if [ ! -f "$distfile" ]; then
msg_error "$pkgver: failed to fetch $curfile.\n"
fi
# distfile downloaded, verify sha256 hash.
flock -n ${distfile}.part rm -f ${distfile}.part
verify_cksum $curfile $distfile $dfcount
dfcount=$((dfcount + 1))
=======
for f in ${!_file_idxs[@]}; do
distfile="$srcdir/$f"
for i in ${_file_idxs["$f"]}; do
@ -456,16 +300,11 @@ hook() {
if ! try_urls "$curfile"; then
msg_error "$pkgver: failed to fetch '$curfile'.\n"
fi
>>>>>>> upstream/master
done
unset TAR_CMD
<<<<<<< HEAD
if [ $errors -gt 0 ]; then
=======
if [[ $errors -gt 0 ]]; then
>>>>>>> upstream/master
msg_error "$pkgver: couldn't verify distfiles, exiting...\n"
fi
}


@ -6,25 +6,6 @@ _process_patch() {
_args="-Np1"
_patch=${i##*/}
<<<<<<< HEAD
if [ -f $PATCHESDIR/${_patch}.args ]; then
_args=$(<$PATCHESDIR/${_patch}.args)
elif [ -n "$patch_args" ]; then
_args=$patch_args
fi
cp -f $i "$wrksrc"
# Try to guess if its a compressed patch.
if [[ $f =~ .gz$ ]]; then
gunzip "$wrksrc/${_patch}"
_patch=${_patch%%.gz}
elif [[ $f =~ .bz2$ ]]; then
bunzip2 "$wrksrc/${_patch}"
_patch=${_patch%%.bz2}
elif [[ $f =~ .diff$ ]]; then
:
elif [[ $f =~ .patch$ ]]; then
=======
if [ -f "$PATCHESDIR/${_patch}.args" ]; then
_args=$(<"$PATCHESDIR/${_patch}.args")
elif [ -n "$patch_args" ]; then
@ -42,7 +23,6 @@ _process_patch() {
elif [[ $i =~ .diff$ ]]; then
:
elif [[ $i =~ .patch$ ]]; then
>>>>>>> upstream/master
:
else
msg_warn "$pkgver: unknown patch type: $i.\n"
@ -51,11 +31,7 @@ _process_patch() {
cd "$wrksrc"
msg_normal "$pkgver: patching: ${_patch}.\n"
<<<<<<< HEAD
patch -s ${_args} -i ${_patch} 2>/dev/null
=======
patch -s ${_args} <"${_patch}" 2>/dev/null
>>>>>>> upstream/master
}
hook() {
@ -68,19 +44,11 @@ hook() {
done < $PATCHESDIR/series
else
for f in $PATCHESDIR/*; do
<<<<<<< HEAD
[ ! -f $f ] && continue
if [[ $f =~ ^.*.args$ ]]; then
continue
fi
_process_patch $f
=======
[ ! -f "$f" ] && continue
if [[ $f =~ ^.*.args$ ]]; then
continue
fi
_process_patch "$f"
>>>>>>> upstream/master
done
fi
}


@ -24,12 +24,7 @@ hook() {
# Find all binaries in /usr/share and add them to the pool
while read -r f; do
<<<<<<< HEAD
mime="${f##*:}"
mime="${mime// /}"
=======
mime="${f##*: }"
>>>>>>> upstream/master
file="${f%:*}"
file="${file#${PKGDESTDIR}}"
case "${mime}" in
@ -41,11 +36,7 @@ hook() {
fi
;;
esac
<<<<<<< HEAD
done < <(find $PKGDESTDIR/usr/share $prune_expr -type f | file --mime-type --files-from -)
=======
done < <(find $PKGDESTDIR/usr/share $prune_expr -type f | file --no-pad --mime-type --files-from -)
>>>>>>> upstream/master
# Check passed if no packages in pool
if [ -z "$matches" ]; then


@ -236,11 +236,7 @@ hook() {
generic_wrapper3 libetpan-config
generic_wrapper3 giblib-config
python_wrapper python-config 2.7
<<<<<<< HEAD
python_wrapper python3-config 3.10
=======
python_wrapper python3-config 3.11
>>>>>>> upstream/master
apr_apu_wrapper apr-1-config
apr_apu_wrapper apu-1-config
}


@ -34,23 +34,14 @@ add_rundep() {
store_pkgdestdir_rundeps() {
if [ -n "$run_depends" ]; then
<<<<<<< HEAD
: > ${PKGDESTDIR}/rdeps
=======
>>>>>>> upstream/master
for f in ${run_depends}; do
_curdep="$(echo "$f" | sed -e 's,\(.*\)?.*,\1,')"
if [ -z "$($XBPS_UHELPER_CMD getpkgdepname ${_curdep} 2>/dev/null)" -a \
-z "$($XBPS_UHELPER_CMD getpkgname ${_curdep} 2>/dev/null)" ]; then
_curdep="${_curdep}>=0"
fi
<<<<<<< HEAD
printf -- "${_curdep} " >> ${PKGDESTDIR}/rdeps
done
=======
printf -- "${_curdep}\n"
done | sort | xargs > ${PKGDESTDIR}/rdeps
>>>>>>> upstream/master
fi
}
@ -174,10 +165,6 @@ hook() {
sorequires+="${f} "
done
if [ -n "${sorequires}" ]; then
<<<<<<< HEAD
echo "${sorequires}" > ${PKGDESTDIR}/shlib-requires
=======
echo "${sorequires}" | xargs -n1 | sort | xargs > ${PKGDESTDIR}/shlib-requires
>>>>>>> upstream/master
fi
}


@ -22,11 +22,7 @@ hook() {
fi
done
<<<<<<< HEAD
for f in var/run usr/local; do
=======
for f in var/run usr/local usr/etc; do
>>>>>>> upstream/master
if [ -d ${PKGDESTDIR}/${f} ]; then
msg_red "${pkgver}: /${f} directory is not allowed, remove it!\n"
error=1
@ -107,27 +103,21 @@ hook() {
error=1
fi
<<<<<<< HEAD
=======
if [ -d ${PKGDESTDIR}/usr/usr ]; then
msg_red "${pkgver}: /usr/usr is forbidden, use /usr.\n"
error=1
fi
>>>>>>> upstream/master
if [ -d ${PKGDESTDIR}/usr/man ]; then
msg_red "${pkgver}: /usr/man is forbidden, use /usr/share/man.\n"
error=1
fi
<<<<<<< HEAD
=======
if [[ -d ${PKGDESTDIR}/usr/share/man/man ]]; then
msg_red "${pkgver}: /usr/share/man/man is forbidden, use /usr/share/man.\n"
error=1
fi
>>>>>>> upstream/master
if [ -d ${PKGDESTDIR}/usr/doc ]; then
msg_red "${pkgver}: /usr/doc is forbidden. Use /usr/share/doc.\n"
error=1
@ -202,11 +192,7 @@ hook() {
if [ -z "$found" ]; then
_myshlib="${libname}.so"
[ "${_myshlib}" != "${rev}" ] && _myshlib+=".${rev}"
<<<<<<< HEAD
msg_warn "${pkgver}: ${_myshlib} not found in common/shlibs!\n"
=======
msg_normal "${pkgver}: ${_myshlib} not found in common/shlibs.\n"
>>>>>>> upstream/master
fi;
}
done


@ -5,30 +5,17 @@ die() {
exit 1
}
<<<<<<< HEAD
GIT_CMD=$(command -v chroot-git 2>/dev/null) ||
GIT_CMD=$(command -v git 2>/dev/null) ||
=======
command -v git >/dev/null 2>&1 ||
>>>>>>> upstream/master
die "neither chroot-git nor git could be found!"
rev_parse() {
if [ -n "$1" ]; then
<<<<<<< HEAD
"$GIT_CMD" rev-parse --verify "$1"
=======
git rev-parse --verify "$1"
>>>>>>> upstream/master
else
shift
while test "$#" != 0
do
<<<<<<< HEAD
"$GIT_CMD" rev-parse --verify "$1" 2>/dev/null && return
=======
git rev-parse --verify "$1" 2>/dev/null && return
>>>>>>> upstream/master
shift
done
return 1
@ -39,29 +26,14 @@ base=$(rev_parse "$1" FETCH_HEAD ORIG_HEAD) || die "base commit not found"
tip=$(rev_parse "$2" HEAD) || die "tip commit not found"
status=0
<<<<<<< HEAD
for cmt in $("$GIT_CMD" rev-list --abbrev-commit $base..$tip)
do
"$GIT_CMD" cat-file commit "$cmt" |
=======
for cmt in $(git rev-list --abbrev-commit $base..$tip)
do
git cat-file commit "$cmt" |
>>>>>>> upstream/master
awk -vC="$cmt" '
# skip header
/^$/ && !msg { msg = 1; next }
!msg { next }
# 3: long-line-is-banned-except-footnote-like-this-for-url
<<<<<<< HEAD
(NF > 2) && (length > 80) { print C ": long line: " $0; exit 1 }
!subject {
if (length > 50) { print C ": subject is a bit long" }
if (!($0 ~ ":" || $0 ~ "^Take over maintainership " || $0 ~ "^Orphan ")) { print C ": subject does not follow CONTRIBUTING.md guildelines"; exit 1 }
# Below check is too noisy?
# if (!($0 ~ "^New package:" || $0 ~ ".*: update to")) {
# print C ": not new package/update/removal?"
=======
(NF > 2) && (length > 80) { print "::error title=Commit Lint::" C ": long line: " $0; exit 1 }
!subject {
if (length > 50) { print "::warning title=Commit Lint::" C ": subject is a bit long" }
@ -69,16 +41,11 @@ do
# Below check is too noisy?
# if (!($0 ~ "^New package:" || $0 ~ ".*: update to")) {
# print "::warning title=Commit Lint::" C ": not new package/update/removal?"
>>>>>>> upstream/master
# }
subject = 1; next
}
/^$/ { body = 1; next }
<<<<<<< HEAD
!body { print C ": second line must be blank"; exit 1 }
=======
!body { print "::error title=Commit Lint::" C ": second line must be blank"; exit 1 }
>>>>>>> upstream/master
' || status=1
done
exit $status


@ -13,31 +13,17 @@ if ! [ "$base_rev" ]; then
die "usage: $0 TEMPLATE BASE-REVISION [TIP-REVISION]"
fi
<<<<<<< HEAD
if command -v chroot-git >/dev/null 2>&1; then
GIT_CMD=$(command -v chroot-git)
elif command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v git)
else
=======
if ! command -v git >/dev/null 2>&1; then
>>>>>>> upstream/master
die "neither chroot-git nor git could be found"
fi
scan() {
rx="$1" msg="$2"
template_path=$template
<<<<<<< HEAD
if [ "$tip_rev" ]; then
template_path="${tip_rev}:${template}"
maybe_git="$GIT_CMD"
=======
maybe_git=
if [ "$tip_rev" ]; then
template_path="${tip_rev}:${template}"
maybe_git="git"
>>>>>>> upstream/master
revspec="[^:]*:"
fi
$maybe_git grep -P -Hn -e "$rx" "$template_path" |
@ -48,11 +34,7 @@ scan() {
show_template() {
rev="$1"
if [ "$rev" ]; then
<<<<<<< HEAD
$GIT_CMD cat-file blob "${rev}:${template}" 2>/dev/null
=======
git cat-file blob "${rev}:${template}" 2>/dev/null
>>>>>>> upstream/master
else
cat "${template}" 2>/dev/null
fi
@ -60,14 +42,10 @@ show_template() {
show_template_var() {
rev="$1" var="$2"
<<<<<<< HEAD
show_template "$rev" | grep -Po '^'${var}'=\K.*'
=======
(
show_template "$rev"
printf '%s\n' "printf '%s\\n' \"\$${var}\""
) | bash 2>/dev/null
>>>>>>> upstream/master
}
revision_reset() {
@ -94,8 +72,6 @@ reverts_on_downgrade() {
esac
}
<<<<<<< HEAD
=======
check_revert() {
for vr in $reverts; do
xbps-uhelper cmpver "${version}" "${vr%_*}"
@ -118,7 +94,6 @@ check_revert() {
done
}
>>>>>>> upstream/master
version_change() {
version="$(show_template_var "$tip_rev" version)"
revision="$(show_template_var "$tip_rev" revision)"
@ -130,10 +105,7 @@ version_change() {
1) revision_reset;;
-1|255) reverts_on_downgrade;;
esac
<<<<<<< HEAD
=======
check_revert
>>>>>>> upstream/master
}
version_change

File diff suppressed because it is too large.


@ -13,11 +13,7 @@ fi
PKGS=$(/hostrepo/xbps-src sort-dependencies $(cat /tmp/templates))
for pkg in ${PKGS}; do
<<<<<<< HEAD
/hostrepo/xbps-src -j$(nproc) -H "$HOME"/hostdir $arch $test pkg "$pkg"
=======
/hostrepo/xbps-src -j$(nproc) -s -H "$HOME"/hostdir $arch $test pkg "$pkg"
>>>>>>> upstream/master
[ $? -eq 1 ] && exit 1
done


@ -2,21 +2,6 @@
#
# changed_templates.sh
<<<<<<< HEAD
if command -v chroot-git >/dev/null 2>&1; then
GIT_CMD=$(command -v chroot-git)
elif command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v git)
fi
tip="$($GIT_CMD rev-list -1 --parents HEAD)"
case "$tip" in
*" "*" "*) tip="${tip##* }" ;;
*) tip="${tip%% *}" ;;
esac
base="$($GIT_CMD merge-base FETCH_HEAD "$tip")" || {
=======
tip="$(git rev-list -1 --parents HEAD)"
case "$tip" in
# This is a merge commit, pick last parent
@ -26,7 +11,6 @@ case "$tip" in
esac
base="$(git merge-base FETCH_HEAD "$tip")" || {
>>>>>>> upstream/master
echo "Your branches is based on too old copy."
echo "Please rebase to newest copy."
exit 1
@ -35,11 +19,7 @@ base="$(git merge-base FETCH_HEAD "$tip")" || {
echo "$base $tip" >/tmp/revisions
/bin/echo -e '\x1b[32mChanged packages:\x1b[0m'
<<<<<<< HEAD
$GIT_CMD diff-tree -r --no-renames --name-only --diff-filter=AM \
=======
git diff-tree -r --no-renames --name-only --diff-filter=AM \
>>>>>>> upstream/master
"$base" "$tip" \
-- 'srcpkgs/*/template' |
cut -d/ -f 2 |
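
The pipeline above is cut off by the diff context; for reference, a sketch of the complete flow on the upstream side, run inside a void-packages checkout (the trailing sort is added here only to make the output readable):

    git fetch --depth 200 https://github.com/void-linux/void-packages.git master
    tip="$(git rev-list -1 --parents HEAD)"
    case "$tip" in
        *" "*" "*) tip="${tip##* }" ;;   # merge commit: compare against the last parent
        *) tip="${tip%% *}" ;;           # regular commit: compare against HEAD itself
    esac
    base="$(git merge-base FETCH_HEAD "$tip")"
    git diff-tree -r --no-renames --name-only --diff-filter=AM "$base" "$tip" \
        -- 'srcpkgs/*/template' | cut -d/ -f2 | sort -u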

View file

@ -7,11 +7,7 @@ TAR=tar
command -v bsdtar >/dev/null && TAR=bsdtar
ARCH=$(uname -m)-musl
VERSION=0.59_5
<<<<<<< HEAD
URL="https://alpha.de.repo.voidlinux.org/static/xbps-static-static-${VERSION}.${ARCH}.tar.xz"
=======
URL="https://repo-ci.voidlinux.org/static/xbps-static-static-${VERSION}.${ARCH}.tar.xz"
>>>>>>> upstream/master
FILE=${URL##*/}
mkdir -p /tmp/bin
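
A hedged sketch of how this snippet typically continues: download the static xbps tarball into /tmp/bin and unpack it there. The fetch command and the extracted usr/bin layout are assumptions, not taken from this diff:

    TAR=tar
    command -v bsdtar >/dev/null && TAR=bsdtar
    ARCH=$(uname -m)-musl
    VERSION=0.59_5
    URL="https://repo-ci.voidlinux.org/static/xbps-static-static-${VERSION}.${ARCH}.tar.xz"
    FILE=${URL##*/}
    mkdir -p /tmp/bin
    curl -fsSLo "/tmp/bin/$FILE" "$URL"      # assumed fetch command
    $TAR -C /tmp/bin -xf "/tmp/bin/$FILE"    # assumed extraction target
    export PATH=/tmp/bin/usr/bin:$PATH       # assumed layout inside the tarball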

View file

@ -2,19 +2,8 @@
#
# changed_templates.sh
<<<<<<< HEAD
if command -v chroot-git >/dev/null 2>&1; then
GIT_CMD=$(command -v chroot-git)
elif command -v git >/dev/null 2>&1; then
GIT_CMD=$(command -v git)
fi
/bin/echo -e '\x1b[32mFetching upstream...\x1b[0m'
$GIT_CMD fetch --depth 200 https://github.com/void-linux/void-packages.git master
=======
# required by git 2.35.2+
git config --global --add safe.directory "$PWD"
/bin/echo -e '\x1b[32mFetching upstream...\x1b[0m'
git fetch --depth 200 https://github.com/void-linux/void-packages.git master
>>>>>>> upstream/master

View file

@ -31,10 +31,7 @@ Apache-1.0
Apache-1.1
Apache-2.0
App-s2p
<<<<<<< HEAD
=======
Arphic-1999
>>>>>>> upstream/master
Artistic-1.0-Perl
Artistic-1.0-cl8
Artistic-1.0
@ -62,20 +59,14 @@ BSD-Protection
BSD-Source-Code
BSL-1.0
BUSL-1.1
<<<<<<< HEAD
=======
Baekmuk
>>>>>>> upstream/master
Bahyph
Barr
Beerware
Bison-exception-2.2
BitTorrent-1.0
BitTorrent-1.1
<<<<<<< HEAD
=======
Bitstream-Vera
>>>>>>> upstream/master
BlueOak-1.0.0
Bootloader-exception
Borceux
@ -89,10 +80,7 @@ CC-BY-2.5-AU
CC-BY-2.5
CC-BY-3.0-AT
CC-BY-3.0-DE
<<<<<<< HEAD
=======
CC-BY-3.0-IGO
>>>>>>> upstream/master
CC-BY-3.0-NL
CC-BY-3.0-US
CC-BY-3.0
@ -236,11 +224,8 @@ GPL-3.0-linking-source-exception
GPL-3.0-only
GPL-3.0-or-later
GPL-CC-1.0
<<<<<<< HEAD
=======
GStreamer-exception-2005
GStreamer-exception-2008
>>>>>>> upstream/master
Giftware
Glide
Glulxe
@ -265,10 +250,7 @@ JPNIC
JSON
Jam
JasPer-2.0
<<<<<<< HEAD
=======
KiCad-libraries-exception
>>>>>>> upstream/master
LAL-1.2
LAL-1.3
LGPL-2.0-only
@ -287,11 +269,8 @@ LPPL-1.1
LPPL-1.2
LPPL-1.3a
LPPL-1.3c
<<<<<<< HEAD
=======
LZMA-SDK-9.11-to-9.20
LZMA-SDK-9.22
>>>>>>> upstream/master
LZMA-exception
Latex2e
Leptonica
@ -316,18 +295,12 @@ MPL-1.0
MPL-1.1
MPL-2.0-no-copyleft-exception
MPL-2.0
<<<<<<< HEAD
=======
MS-LPL
>>>>>>> upstream/master
MS-PL
MS-RL
MTLL
MakeIndex
<<<<<<< HEAD
=======
Minpack
>>>>>>> upstream/master
MirOS
Motosoto
MulanPSL-1.0
@ -340,10 +313,7 @@ NBPL-1.0
NCGL-UK-2.0
NCSA
NGPL
<<<<<<< HEAD
=======
NICTA-1.0
>>>>>>> upstream/master
NIST-PD-fallback
NIST-PD
NLOD-1.0
@ -421,10 +391,7 @@ Plexus
PolyForm-Noncommercial-1.0.0
PolyForm-Small-Business-1.0.0
PostgreSQL
<<<<<<< HEAD
=======
Python-2.0.1
>>>>>>> upstream/master
Python-2.0
QPL-1.0
Qhull
@ -561,13 +528,9 @@ libpng-2.0
libselinux-1.0
libtiff
mif-exception
<<<<<<< HEAD
mpich2
=======
mpi-permissive
mpich2
mplus
>>>>>>> upstream/master
openvpn-openssl-exception
psfrag
psutils

View file

@ -1,23 +1,8 @@
#!/bin/sh
<<<<<<< HEAD
TRAVIS_PROTO=http
TRAVIS_MIRROR=repo-us.voidlinux.org
for _i in etc/xbps.d/repos-remote*.conf ; do
/bin/echo -e "\x1b[32mUpdating $_i...\x1b[0m"
# First fix the proto, ideally we'd serve everything with HTTPS,
# but key management and rotation is a pain, and things are signed
# so we can afford to be a little lazy at times.
sed -i "s:https:$TRAVIS_PROTO:g" $_i
# Now set the mirror
sed -i "s:alpha\.de\.repo\.voidlinux\.org:$TRAVIS_MIRROR:g" $_i
=======
TRAVIS_MIRROR=repo-ci.voidlinux.org
for _i in etc/xbps.d/repos-remote*.conf ; do
/bin/echo -e "\x1b[32mUpdating $_i...\x1b[0m"
sed -i "s:repo-default\.voidlinux\.org:$TRAVIS_MIRROR:g" $_i
>>>>>>> upstream/master
done
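
An illustrative before/after for the substitution kept from upstream, assuming the stock repository configuration format (the file name and contents here are examples):

    # etc/xbps.d/repos-remote.conf before:
    #   repository=https://repo-default.voidlinux.org/current
    # after:
    #   repository=https://repo-ci.voidlinux.org/current
    sed -i "s:repo-default\.voidlinux\.org:repo-ci.voidlinux.org:g" etc/xbps.d/repos-remote.conf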

View file

@ -11,13 +11,8 @@ common/scripts/lint-commits $base $tip || EXITCODE=$?
for t in $(awk '{ print "srcpkgs/" $0 "/template" }' /tmp/templates); do
/bin/echo -e "\x1b[32mLinting $t...\x1b[0m"
<<<<<<< HEAD
xlint "$t" || EXITCODE=$?
common/scripts/lint-version-change "$t" $base $tip || EXITCODE=$?
=======
xlint "$t" > /tmp/xlint_out || EXITCODE=$?
common/scripts/lint-version-change "$t" $base $tip > /tmp/vlint_out || EXITCODE=$?
awk -f common/scripts/lint2annotations.awk /tmp/xlint_out /tmp/vlint_out
>>>>>>> upstream/master
done
exit $EXITCODE
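
common/scripts/lint2annotations.awk itself is not shown in this diff; the following is a hedged sketch of the kind of rewrite it performs (exact severity and field handling are assumptions): turn 'path:line: message' lint output into GitHub Actions annotations.

    awk '{
        file = $0; sub(/:.*/, "", file)                    # text before the first ":"
        msg  = $0; sub(/^[^:]*(:[0-9]+)?: */, "", msg)     # drop the "path:line: " prefix
        printf "::warning file=%s::%s\n", file, msg
    }' /tmp/xlint_out /tmp/vlint_out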

View file

@ -20,8 +20,6 @@ done
setup_pkg "$PKGNAME" $XBPS_CROSS_BUILD
<<<<<<< HEAD
=======
if [ -n "$disable_parallel_check" ]; then
XBPS_MAKEJOBS=1
else
@ -29,7 +27,6 @@ else
fi
makejobs="-j$XBPS_MAKEJOBS"
>>>>>>> upstream/master
XBPS_CHECK_DONE="${XBPS_STATEDIR}/${sourcepkg}_${XBPS_CROSS_BUILD}_check_done"
if [ -n "$XBPS_CROSS_BUILD" ]; then

View file

@ -25,15 +25,9 @@ setup_pkg_depends() {
_pkgname=$(xbps-uhelper getpkgname $_depname 2>/dev/null)
[ -z "$_pkgname" ] && _pkgname="$_depname"
if [ -s ${XBPS_DISTDIR}/etc/virtual ]; then
<<<<<<< HEAD
foo=$(egrep "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/virtual|cut -d ' ' -f2)
elif [ -s ${XBPS_DISTDIR}/etc/defaults.virtual ]; then
foo=$(egrep "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/defaults.virtual|cut -d ' ' -f2)
=======
foo=$(grep -E "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/virtual|cut -d ' ' -f2)
elif [ -s ${XBPS_DISTDIR}/etc/defaults.virtual ]; then
foo=$(grep -E "^${_pkgname}[[:blank:]]" ${XBPS_DISTDIR}/etc/defaults.virtual|cut -d ' ' -f2)
>>>>>>> upstream/master
fi
if [ -z "$foo" ]; then
msg_error "$pkgver: failed to resolve virtual dependency for '$j' (missing from etc/virtual)\n"

View file

@ -54,11 +54,7 @@ bulk_sortdeps() {
}
bulk_build() {
<<<<<<< HEAD
local sys="$1"
=======
local bulk_build_cmd="$1"
>>>>>>> upstream/master
local NPROCS=$(($(nproc)*2))
local NRUNNING=0
@ -71,12 +67,6 @@ bulk_build() {
fi
# Compare installed pkg versions vs srcpkgs
<<<<<<< HEAD
if [[ $sys ]]; then
xbps-checkvers -f '%n' -I -D $XBPS_DISTDIR
return $?
fi
=======
case "$bulk_build_cmd" in
installed)
bulk_sortdeps $(xbps-checkvers -f '%n' -I -D "$XBPS_DISTDIR")
@ -88,7 +78,6 @@ bulk_build() {
;;
esac
>>>>>>> upstream/master
# compare repo pkg versions vs srcpkgs
for f in $(xbps-checkvers -f '%n' -D $XBPS_DISTDIR); do
if [ $NRUNNING -eq $NPROCS ]; then
@ -108,15 +97,9 @@ bulk_build() {
}
bulk_update() {
<<<<<<< HEAD
local args="$1" pkgs f rval
pkgs="$(bulk_build ${args})"
=======
local bulk_update_cmd="$1" pkgs f rval
pkgs="$(bulk_build "${bulk_update_cmd}")"
>>>>>>> upstream/master
[[ -z $pkgs ]] && return 0
msg_normal "xbps-src: the following packages must be rebuilt and updated:\n"
@ -136,11 +119,7 @@ bulk_update() {
msg_error "xbps-src: failed to build $pkgver pkg!\n"
fi
done
<<<<<<< HEAD
if [ -n "$pkgs" -a -n "$args" ]; then
=======
if [ -n "$pkgs" -a "$bulk_update_cmd" == installed ]; then
>>>>>>> upstream/master
echo
msg_normal "xbps-src: updating your system, confirm to proceed...\n"
${XBPS_SUCMD} "xbps-install --repository=$XBPS_REPOSITORY --repository=$XBPS_REPOSITORY/nonfree -u ${pkgs//[$'\n']/ }" || return 1
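
The net effect of the upstream side is that the bulk helpers now take a keyword (installed or local) instead of raw xbps-checkvers flags. Usage from the void-packages root, matching the help text further down:

    ./xbps-src show-sys-updates     # outdated packages installed on this system
    ./xbps-src show-local-updates   # outdated packages in the local repositories
    ./xbps-src update-sys           # rebuild and update outdated installed packages
    ./xbps-src update-local         # rebuild outdated packages in the local repositories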

View file

@ -8,28 +8,12 @@ install_base_chroot() {
XBPS_TARGET_PKG="$1"
fi
# binary bootstrap
<<<<<<< HEAD
msg_normal "xbps-src: installing base-chroot-cereus...\n"
=======
msg_normal "xbps-src: installing base-chroot...\n"
>>>>>>> upstream/master
# XBPS_TARGET_PKG == arch
if [ "$XBPS_TARGET_PKG" ]; then
_bootstrap_arch="env XBPS_TARGET_ARCH=$XBPS_TARGET_PKG"
fi
(export XBPS_MACHINE=$XBPS_TARGET_PKG XBPS_ARCH=$XBPS_TARGET_PKG; chroot_sync_repodata)
<<<<<<< HEAD
${_bootstrap_arch} $XBPS_INSTALL_CMD ${XBPS_INSTALL_ARGS} -y base-chroot-cereus
if [ $? -ne 0 ]; then
msg_error "xbps-src: failed to install base-chroot-cereus!\n"
fi
# Reconfigure base-files to create dirs/symlinks.
if xbps-query -r $XBPS_MASTERDIR base-files>=2022.07.03 &>/dev/null; then
XBPS_ARCH=$XBPS_TARGET_PKG xbps-reconfigure -r $XBPS_MASTERDIR -f base-files>=2022.07.03 &>/dev/null
fi
msg_normal "xbps-src: installed base-chroot-cereus successfully!\n"
=======
${_bootstrap_arch} $XBPS_INSTALL_CMD ${XBPS_INSTALL_ARGS} -y base-chroot
if [ $? -ne 0 ]; then
msg_error "xbps-src: failed to install base-chroot!\n"
@ -40,7 +24,6 @@ install_base_chroot() {
fi
msg_normal "xbps-src: installed base-chroot successfully!\n"
>>>>>>> upstream/master
chroot_prepare $XBPS_TARGET_PKG || msg_error "xbps-src: failed to initialize chroot!\n"
chroot_check
chroot_handler clean
@ -51,11 +34,7 @@ reconfigure_base_chroot() {
local pkgs="glibc-locales ca-certificates"
[ -z "$IN_CHROOT" -o -e $statefile ] && return 0
# Reconfigure ca-certificates.
<<<<<<< HEAD
msg_normal "xbps-src: reconfiguring base-chroot-cereus...\n"
=======
msg_normal "xbps-src: reconfiguring base-chroot...\n"
>>>>>>> upstream/master
for f in ${pkgs}; do
if xbps-query -r $XBPS_MASTERDIR $f &>/dev/null; then
xbps-reconfigure -r $XBPS_MASTERDIR -f $f
@ -72,11 +51,7 @@ update_base_chroot() {
if $(${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -nu|grep -q xbps); then
${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -yu xbps || msg_error "xbps-src: failed to update xbps!\n"
fi
<<<<<<< HEAD
${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -yu || msg_error "xbps-src: failed to update base-chroot-cereus!\n"
=======
${XBPS_INSTALL_CMD} ${XBPS_INSTALL_ARGS} -yu || msg_error "xbps-src: failed to update base-chroot!\n"
>>>>>>> upstream/master
msg_normal "xbps-src: cleaning up $XBPS_MASTERDIR masterdir...\n"
[ -z "$XBPS_KEEP_ALL" -a -z "$XBPS_SKIP_DEPS" ] && remove_pkg_autodeps
[ -z "$XBPS_KEEP_ALL" -a -z "$keep_all_force" ] && rm -rf $XBPS_MASTERDIR/builddir $XBPS_MASTERDIR/destdir
@ -140,22 +115,14 @@ chroot_prepare() {
[ ! -d $XBPS_MASTERDIR/$f ] && mkdir -p $XBPS_MASTERDIR/$f
done
<<<<<<< HEAD
# Copy /etc/passwd and /etc/group from base-files
=======
# Copy /etc/passwd and /etc/group from base-files.
>>>>>>> upstream/master
cp -f $XBPS_SRCPKGDIR/base-files/files/passwd $XBPS_MASTERDIR/etc
echo "$(whoami):x:$(id -u):$(id -g):$(whoami) user:/tmp:/bin/xbps-shell" \
>> $XBPS_MASTERDIR/etc/passwd
cp -f $XBPS_SRCPKGDIR/base-files/files/group $XBPS_MASTERDIR/etc
echo "$(whoami):x:$(id -g):" >> $XBPS_MASTERDIR/etc/group
<<<<<<< HEAD
# Copy /etc/hosts from base-files
=======
# Copy /etc/hosts from base-files.
>>>>>>> upstream/master
cp -f $XBPS_SRCPKGDIR/base-files/files/hosts $XBPS_MASTERDIR/etc
# Prepare default locale: en_US.UTF-8.

View file

@ -147,8 +147,6 @@ msg_normal() {
fi
}
<<<<<<< HEAD
=======
report_broken() {
if [ "$show_problems" = "ignore-problems" ]; then
return
@ -166,7 +164,6 @@ report_broken() {
fi
}
>>>>>>> upstream/master
msg_normal_append() {
[ -n "$NOCOLORS" ] || printf "\033[1m"
printf "$@"
@ -492,9 +489,6 @@ setup_pkg() {
fi
makejobs="-j$XBPS_MAKEJOBS"
if [ -n "$XBPS_BINPKG_EXISTS" ]; then
<<<<<<< HEAD
local _binpkgver="$($XBPS_QUERY_XCMD -R -ppkgver $pkgver 2>/dev/null)"
=======
local extraflags=""
if [ -n "$XBPS_SKIP_REMOTEREPOS" ]; then
extraflags="-i"
@ -504,7 +498,6 @@ setup_pkg() {
done
fi
local _binpkgver="$($XBPS_QUERY_XCMD -R -ppkgver $pkgver $extraflags 2>/dev/null)"
>>>>>>> upstream/master
if [ "$_binpkgver" = "$pkgver" ]; then
if [ -z "$XBPS_DEPENDENCY" ]; then
local _repo="$($XBPS_QUERY_XCMD -R -prepository $pkgver 2>/dev/null)"
@ -662,22 +655,6 @@ setup_pkg() {
fi
# Setup some specific package vars.
<<<<<<< HEAD
if [ -z "$wrksrc" ]; then
wrksrc="$XBPS_BUILDDIR/${sourcepkg}-${version}"
else
wrksrc="$XBPS_BUILDDIR/$wrksrc"
fi
if [ "$cross" -a "$nocross" -a "$show_problems" != "ignore-problems" ]; then
msg_red "$pkgver: cannot be cross compiled, exiting...\n"
msg_red "$pkgver: $nocross\n"
exit 2
elif [ "$broken" -a "$show_problems" != "ignore-problems" ]; then
msg_red "$pkgver: cannot be built, it's currently broken; see the build log:\n"
msg_red "$pkgver: $broken\n"
exit 2
=======
wrksrc="$XBPS_BUILDDIR/${sourcepkg}-${version}"
if [ "$cross" -a "$nocross" ]; then
@ -688,7 +665,6 @@ setup_pkg() {
report_broken \
"$pkgver: cannot be built, it's currently broken; see the build log:\n" \
"$pkgver: $broken\n"
>>>>>>> upstream/master
fi
if [ -n "$restricted" -a -z "$XBPS_ALLOW_RESTRICTED" -a "$show_problems" != "ignore-problems" ]; then

View file

@ -72,11 +72,7 @@ prepare_cross_sysroot() {
fi
rm -f $errlog
# Create top level symlinks in sysroot.
<<<<<<< HEAD
XBPS_ARCH=$XBPS_TARGET_MACHINE xbps-reconfigure -r $XBPS_CROSS_BASE -f base-files-cereus>=2022.07.03 &>/dev/null
=======
XBPS_ARCH=$XBPS_TARGET_MACHINE xbps-reconfigure -r $XBPS_CROSS_BASE -f base-files &>/dev/null
>>>>>>> upstream/master
# Create a sysroot/include and sysroot/lib symlink just in case.
ln -s usr/include ${XBPS_CROSS_BASE}/include
ln -s usr/lib ${XBPS_CROSS_BASE}/lib

View file

@ -34,12 +34,7 @@ check_pkg_arch() {
esac
done
if [ -z "$nonegation" -a -n "$match" ] || [ -n "$nonegation" -a -z "$match" ]; then
<<<<<<< HEAD
msg_red "${pkgname}-${version}_${revision}: this package cannot be built for ${_arch}.\n"
exit 2
=======
report_broken "${pkgname}-${version}_${revision}: this package cannot be built for ${_arch}.\n"
>>>>>>> upstream/master
fi
fi
}
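
For reference, the archs values this check evaluates look like the following in a template; a leading '~' negates a pattern, which is what the nonegation/match logic above tracks (values are illustrative):

    archs="x86_64* aarch64*"   # build only for these targets
    archs="~*-musl"            # build for everything except musl targets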

View file

@ -4,10 +4,7 @@ update_check() {
local i p url pkgurlname rx found_version consider
local update_override=$XBPS_SRCPKGDIR/$XBPS_TARGET_PKG/update
local original_pkgname=$pkgname
<<<<<<< HEAD
=======
local pkgname=$sourcepkg
>>>>>>> upstream/master
local urlpfx urlsfx
local -A fetchedurls
@ -27,14 +24,9 @@ update_check() {
if [ -z "$site" ]; then
case "$distfiles" in
<<<<<<< HEAD
# only consider versions that exist on ftp.gnome.org
*ftp.gnome.org*) ;;
=======
# special-case sites that provide better sources elsewhere
*ftp.gnome.org*|*download.gnome.org*) ;;
*archive.xfce.org*) ;;
>>>>>>> upstream/master
*)
printf '%s\n' "$homepage" ;;
esac
@ -66,12 +58,8 @@ update_check() {
*github.com*|\
*//gitlab.*|\
*bitbucket.org*|\
<<<<<<< HEAD
*ftp.gnome.org*|\
=======
*ftp.gnome.org*|*download.gnome.org*|\
*archive.xfce.org*|\
>>>>>>> upstream/master
*kernel.org/pub/linux/kernel/*|\
*cran.r-project.org/src/contrib*|\
*rubygems.org*|\
@ -137,16 +125,11 @@ update_check() {
pkgurlname="$(printf %s "$url" | cut -d/ -f4,5)"
url="https://github.com/$pkgurlname/tags"
rx='/archive/refs/tags/(v?|\Q'"$pkgname"'\E-)?\K[\d.]+(?=\.tar\.gz")';;
<<<<<<< HEAD
*//gitlab.*)
pkgurlname="$(printf %s "$url" | cut -d/ -f1-5)"
=======
*//gitlab.*|*code.videolan.org*)
case "$url" in
*/-/*) pkgurlname="$(printf %s "$url" | sed -e 's%/-/.*%%g; s%/$%%')";;
*) pkgurlname="$(printf %s "$url" | cut -d / -f 1-5)";;
esac
>>>>>>> upstream/master
url="$pkgurlname/tags"
rx='/archive/[^/]+/\Q'"$pkgname"'\E-v?\K[\d.]+(?=\.tar\.gz")';;
*bitbucket.org*)
@ -154,16 +137,11 @@ update_check() {
url="https://bitbucket.org/$pkgurlname/downloads"
rx='/(get|downloads)/(v?|\Q'"$pkgname"'\E-)?\K[\d.]+(?=\.tar)';;
*ftp.gnome.org*|*download.gnome.org*)
<<<<<<< HEAD
: ${pattern="\Q$pkgname\E-\K(0|[13]\.[0-9]*[02468]|[4-9][0-9]+)\.[0-9.]*[0-9](?=)"}
url="https://download.gnome.org/sources/$pkgname/cache.json";;
=======
: ${pattern="\Q$pkgname\E-\K(0|[13]\.[0-9]*[02468]|[4-9][0-9]+)\.[0-9.]*[0-9](?=.tar)"}
url="https://download.gnome.org/sources/$pkgname/cache.json";;
*archive.xfce.org*)
: ${pattern="\Q$pkgname\E-\K((([4-9]|([1-9][0-9]+))\.[0-9]*[02468]\.[0-9.]*[0-9])|([0-3]\.[0-9.]*))(?=.tar)"}
url="https://archive.xfce.org/feeds/project/$pkgname" ;;
>>>>>>> upstream/master
*kernel.org/pub/linux/kernel/*)
rx=linux-'\K'${version%.*}'[\d.]+(?=\.tar\.xz)';;
*cran.r-project.org/src/contrib*)
@ -176,13 +154,8 @@ update_check() {
rx='/crates/'${pkgname#rust-}'/\K[0-9.]*(?=/download)' ;;
*codeberg.org*)
pkgurlname="$(printf %s "$url" | cut -d/ -f4,5)"
<<<<<<< HEAD
url="https://codeberg.org/$pkgurlname/releases"
rx='/archive/\K[\d.]+(?=\.tar\.gz)' ;;
=======
url="https://codeberg.org/$pkgurlname/tags"
rx='/archive/(v-?|\Q'"$pkgname"'\E-)?\K[\d.]+(?=\.tar\.gz)' ;;
>>>>>>> upstream/master
*hg.sr.ht*)
pkgurlname="$(printf %s "$url" | cut -d/ -f4,5)"
url="https://hg.sr.ht/$pkgurlname/tags"

View file

@ -122,12 +122,9 @@ show-repo-updates
show-sys-updates
Prints the list of outdated packages in your system.
<<<<<<< HEAD
=======
show-local-updates
Prints the list of outdated packages in your local repositories.
>>>>>>> upstream/master
sort-dependencies <pkg> <pkgN+1> ...
Given a list of packages specified as additional arguments, a sorted dependency
list will be returned to stdout.
@ -138,12 +135,9 @@ update-bulk
update-sys
Rebuilds all packages in your system that are outdated and updates them.
<<<<<<< HEAD
=======
update-local
Rebuilds all packages in your local repositories that are outdated.
>>>>>>> upstream/master
update-check <pkgname>
Check upstream site of <pkgname> for new releases.
@ -162,11 +156,8 @@ Options:
$(print_cross_targets)
<<<<<<< HEAD
=======
-b Build packages even if marked as broken, nocross, or excluded with archs.
>>>>>>> upstream/master
-c <configuration>
If specified, etc/conf.<configuration> will be used as the primary config
file name; etc/conf will only be attempted if that does not exist.
@ -174,11 +165,7 @@ $(print_cross_targets)
-C Do not remove build directory, automatic dependencies and
package destdir after successful install.
<<<<<<< HEAD
-E If a binary package exists in a local repository for the target package,
=======
-E If a binary package exists in a repository for the target package,
>>>>>>> upstream/master
do not try to build it, exit immediately.
-f Force running the specified stage (configure/build/install/pkg)
@ -237,11 +224,8 @@ $(print_cross_targets)
This alternative repository will also be used to resolve dependencies
with higher priority than the others.
<<<<<<< HEAD
=======
-s Make vsed warnings errors.
>>>>>>> upstream/master
-t Create a temporary masterdir to not pollute the current one. Note that
the existing masterdir must be fully populated with binary-bootstrap first.
Once the target has finished, this temporary masterdir will be removed.
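
A short usage sketch for the two switches taken over from upstream, based on the descriptions above (foo is a placeholder package name):

    ./xbps-src -b pkg foo    # build even if marked broken, nocross, or excluded via archs
    ./xbps-src -s pkg foo    # treat vsed warnings as errors
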
@ -383,11 +367,7 @@ readonly XBPS_SRC_VERSION="113"
export XBPS_MACHINE=$(xbps-uhelper -C /dev/null arch)
XBPS_OPTIONS=
<<<<<<< HEAD
XBPS_OPTSTRING="1a:c:CEfgGhH:iIj:Lm:No:p:qQKr:tV"
=======
XBPS_OPTSTRING="1a:bc:CEfgGhH:iIj:Lm:No:p:qsQKr:tV"
>>>>>>> upstream/master
# Preprocess arguments in order to allow options before and after XBPS_TARGET.
eval set -- $(getopt "$XBPS_OPTSTRING" "$@");
@ -395,14 +375,12 @@ eval set -- $(getopt "$XBPS_OPTSTRING" "$@");
# Options are saved as XBPS_ARG_FOO instead of XBPS_FOO for now; this is
# because configuration files may override those and we want arguments to
# take precedence over configuration files
while getopts "$XBPS_OPTSTRING" opt; do
case $opt in
1) XBPS_ARG_BUILD_ONLY_ONE_PKG=yes; XBPS_OPTIONS+=" -1";;
a) XBPS_ARG_CROSS_BUILD="$OPTARG"; XBPS_OPTIONS+=" -a $OPTARG";;
<<<<<<< HEAD
=======
b) XBPS_ARG_IGNORE_BROKENNESS=yes; XBPS_OPTIONS+=" -b";;
>>>>>>> upstream/master
c) XBPS_ARG_CONFIG="$OPTARG"; XBPS_OPTIONS+=" -c $OPTARG";;
C) XBPS_ARG_KEEP_ALL=1; XBPS_OPTIONS+=" -C";;
E) XBPS_ARG_BINPKG_EXISTS=1; XBPS_OPTIONS+=" -E";;
@ -423,10 +401,7 @@ while getopts "$XBPS_OPTSTRING" opt; do
Q) XBPS_ARG_CHECK_PKGS=yes; XBPS_OPTIONS+=" -Q";;
K) XBPS_ARG_CHECK_PKGS=full; XBPS_OPTIONS+=" -K";;
r) XBPS_ARG_ALT_REPOSITORY="$OPTARG"; XBPS_OPTIONS+=" -r $OPTARG";;
<<<<<<< HEAD
=======
s) XBPS_ARG_STRICT=yes; XBPS_OPTIONS+=" -s";;
>>>>>>> upstream/master
t) XBPS_ARG_TEMP_MASTERDIR=1; XBPS_OPTIONS+=" -t -C";;
V) echo "xbps-src-$XBPS_SRC_VERSION $(xbps-uhelper -V)" && exit 0;;
--) shift; break;;
@ -497,10 +472,7 @@ fi
# Set options passed on command line, after configuration files have been read
[ -n "$XBPS_ARG_BUILD_ONLY_ONE_PKG" ] && XBPS_BUILD_ONLY_ONE_PKG=yes
<<<<<<< HEAD
=======
[ -n "$XBPS_ARG_IGNORE_BROKENNESS" ] && XBPS_IGNORE_BROKENNESS=1
>>>>>>> upstream/master
[ -n "$XBPS_ARG_SKIP_REMOTEREPOS" ] && XBPS_SKIP_REMOTEREPOS=1
[ -n "$XBPS_ARG_BUILD_FORCEMODE" ] && XBPS_BUILD_FORCEMODE=1
[ -n "$XBPS_ARG_INFORMATIVE_RUN" ] && XBPS_INFORMATIVE_RUN=1
@ -513,10 +485,7 @@ fi
[ -n "$XBPS_ARG_QUIET" ] && XBPS_QUIET=1
[ -n "$XBPS_ARG_PRINT_VARIABLES" ] && XBPS_PRINT_VARIABLES="$XBPS_ARG_PRINT_VARIABLES"
[ -n "$XBPS_ARG_ALT_REPOSITORY" ] && XBPS_ALT_REPOSITORY="$XBPS_ARG_ALT_REPOSITORY"
<<<<<<< HEAD
=======
[ -n "$XBPS_ARG_STRICT" ] && XBPS_STRICT="$XBPS_ARG_STRICT"
>>>>>>> upstream/master
[ -n "$XBPS_ARG_CROSS_BUILD" ] && XBPS_CROSS_BUILD="$XBPS_ARG_CROSS_BUILD"
[ -n "$XBPS_ARG_CHECK_PKGS" ] && XBPS_CHECK_PKGS="$XBPS_ARG_CHECK_PKGS"
[ -n "$XBPS_ARG_MAKEJOBS" ] && XBPS_MAKEJOBS="$XBPS_ARG_MAKEJOBS"
@ -524,13 +493,8 @@ fi
export XBPS_BUILD_ONLY_ONE_PKG XBPS_SKIP_REMOTEREPOS XBPS_BUILD_FORCEMODE \
XBPS_INFORMATIVE_RUN XBPS_TEMP_MASTERDIR XBPS_BINPKG_EXISTS \
XBPS_USE_GIT_REVS XBPS_CHECK_PKGS XBPS_DEBUG_PKGS XBPS_SKIP_DEPS \
<<<<<<< HEAD
XBPS_KEEP_ALL XBPS_QUIET XBPS_ALT_REPOSITORY XBPS_CROSS_BUILD \
XBPS_MAKEJOBS XBPS_PRINT_VARIABLES
=======
XBPS_KEEP_ALL XBPS_QUIET XBPS_ALT_REPOSITORY XBPS_STRICT XBPS_CROSS_BUILD \
XBPS_MAKEJOBS XBPS_PRINT_VARIABLES XBPS_IGNORE_BROKENNESS
>>>>>>> upstream/master
# The masterdir/hostdir variables are forced and readonly in chroot
if [ -z "$IN_CHROOT" ]; then
@ -677,11 +641,7 @@ readonly XBPS_CMPVER_CMD="xbps-uhelper cmpver"
export XBPS_SHUTILSDIR XBPS_CROSSPFDIR XBPS_TRIGGERSDIR \
XBPS_SRCPKGDIR XBPS_COMMONDIR XBPS_BUILDDIR \
<<<<<<< HEAD
XBPS_REPOSITORY XBPS_ALT_REPOSITORY XBPS_SRCDISTDIR XBPS_DIGEST_CMD \
=======
XBPS_REPOSITORY XBPS_ALT_REPOSITORY XBPS_STRICT XBPS_SRCDISTDIR XBPS_DIGEST_CMD \
>>>>>>> upstream/master
XBPS_UHELPER_CMD XBPS_INSTALL_CMD XBPS_QUERY_CMD XBPS_BUILD_ONLY_ONE_PKG \
XBPS_RINDEX_CMD XBPS_RECONFIGURE_CMD XBPS_REMOVE_CMD XBPS_CHECKVERS_CMD \
XBPS_CMPVER_CMD XBPS_FETCH_CMD XBPS_VERSION XBPS_BUILDSTYLEDIR \
@ -694,11 +654,7 @@ export XBPS_SHUTILSDIR XBPS_CROSSPFDIR XBPS_TRIGGERSDIR \
XBPS_LIBEXECDIR XBPS_DISTDIR XBPS_DISTFILES_MIRROR XBPS_ALLOW_RESTRICTED \
XBPS_USE_GIT_COMMIT_DATE XBPS_PKG_COMPTYPE XBPS_REPO_COMPTYPE \
XBPS_BUILDHELPERDIR XBPS_USE_BUILD_MTIME XBPS_BUILD_ENVIRONMENT \
<<<<<<< HEAD
XBPS_PRESERVE_PKGS
=======
XBPS_PRESERVE_PKGS XBPS_IGNORE_BROKENNESS
>>>>>>> upstream/master
for i in REPOSITORY DESTDIR BUILDDIR SRCDISTDIR; do
eval val="\$XBPS_$i"
@ -856,10 +812,7 @@ case "$XBPS_TARGET" in
if [ -n "$CHROOT_READY" -a -z "$IN_CHROOT" ]; then
chroot_handler $XBPS_TARGET $XBPS_TARGET_PKG
else
<<<<<<< HEAD
=======
check_existing_pkg
>>>>>>> upstream/master
chroot_sync_repodata
# prevent update_base_chroot from removing the builddir/destdir
update_base_chroot keep-all-force
@ -1015,14 +968,10 @@ case "$XBPS_TARGET" in
bulk_build
;;
show-sys-updates)
<<<<<<< HEAD
bulk_build -I
=======
bulk_build installed
;;
show-local-updates)
bulk_build local
>>>>>>> upstream/master
;;
sort-dependencies)
bulk_sortdeps ${@/$XBPS_TARGET/}
@ -1031,14 +980,10 @@ case "$XBPS_TARGET" in
bulk_update
;;
update-sys)
<<<<<<< HEAD
bulk_update -I
=======
bulk_update installed
;;
update-local)
bulk_update local
>>>>>>> upstream/master
;;
update-check)
read_pkg ignore-problems