>>> Building on exopi-6 under net/crawl
	BDEPENDS = [databases/db/v3]
	DIST = [net/crawl:crawl-0.4.tar.gz]
	FULLPKGNAME = crawl-0.4p5
	RDEPENDS = [databases/db/v3]
(Junk lock obtained for exopi-6 at 1713354127.90)
>>> Running depends in net/crawl at 1713354127.96
   last junk was in devel/py-pylint_venv,python3
/usr/sbin/pkg_add -aI -Drepair db-3.1.17p3v0
was: /usr/sbin/pkg_add -aI -Drepair db-3.1.17p3v0
/usr/sbin/pkg_add -aI -Drepair db-3.1.17p3v0
>>> Running show-prepare-results in net/crawl at 1713354131.31
===> net/crawl
===>  Building from scratch crawl-0.4p5
===>  crawl-0.4p5 depends on: db->=3,<4|db->=3v0,<4v0 -> db-3.1.17p3v0
===>  Verifying specs: c event lib/db/db=3
===>  found c.100.0 event.4.1 lib/db/db.3.1
(Junk lock released for exopi-6 at 1713354132.22)
distfiles size=111084
>>> Running build in net/crawl at 1713354132.29
===> net/crawl
===> Checking files for crawl-0.4p5
`/exopi-cvs/ports/distfiles/crawl-0.4.tar.gz' is up to date.
>> (SHA256) crawl-0.4.tar.gz: OK
===> Extracting for crawl-0.4p5
===> Patching for crawl-0.4p5
===> Applying OpenBSD patch patch-cfg_c
Hmm...  Looks like a unified diff to me...
The text leading up to this was:
--------------------------
|--- cfg.c.orig	Wed Dec 12 23:40:45 2001
|+++ cfg.c	Sat May 22 16:42:06 2010
--------------------------
Patching file cfg.c using Plan A...
Hunk #1 succeeded at 182.
Hunk #2 succeeded at 613.
done
===> Applying OpenBSD patch patch-configure
Hmm...  Looks like a unified diff to me...
The text leading up to this was:
--------------------------
|--- configure.orig	Sun May 18 03:50:55 2003
|+++ configure	Sat May 22 16:42:06 2010
--------------------------
Patching file configure using Plan A...
Hunk #1 succeeded at 2616.
Hunk #2 succeeded at 2625.
done
===> Applying OpenBSD patch patch-crawl_c
Hmm...  Looks like a unified diff to me...
The text leading up to this was:
--------------------------
|--- crawl.c.orig	Sun May 18 03:26:41 2003
|+++ crawl.c	Sat May 22 16:42:06 2010
--------------------------
Patching file crawl.c using Plan A...
Hunk #1 succeeded at 167.
done
===> Applying OpenBSD patch patch-dns_c
Hmm...  Looks like a unified diff to me...
The text leading up to this was:
--------------------------
|--- dns.c.orig	Sun May 18 03:21:33 2003
|+++ dns.c	Sun May 23 18:24:21 2010
--------------------------
Patching file dns.c using Plan A...
Hunk #1 succeeded at 139.
done
===> Applying OpenBSD patch patch-http_c
Hmm...  Looks like a unified diff to me...
The text leading up to this was:
--------------------------
|--- http.c.orig	Sun May 18 03:50:24 2003
|+++ http.c	Sat May 22 16:42:06 2010
--------------------------
Patching file http.c using Plan A...
Hunk #1 succeeded at 221.
Hunk #2 succeeded at 261.
Hunk #3 succeeded at 679.
Hunk #4 succeeded at 885.
Hunk #5 succeeded at 924.
done
===> Applying OpenBSD patch patch-robots_c
Hmm...  Looks like a unified diff to me...
The text leading up to this was:
--------------------------
|--- robots.c.orig	Tue Jul 16 02:07:20 2002
|+++ robots.c	Tue Jul 16 02:08:50 2002
--------------------------
Patching file robots.c using Plan A...
Hunk #1 succeeded at 98.
done
===> Compiler link: clang -> /usr/bin/clang
===> Compiler link: clang++ -> /usr/bin/clang++
===> Compiler link: cc -> /usr/bin/cc
===> Compiler link: c++ -> /usr/bin/c++
===> Generating configure for crawl-0.4p5
===> Configuring for crawl-0.4p5
Using /exopi-obj/pobj/crawl-0.4/config.site (generated)
configure: loading site script /exopi-obj/pobj/crawl-0.4/config.site
checking build system type... x86_64-unknown-openbsd7.5
checking host system type... x86_64-unknown-openbsd7.5
checking target system type... x86_64-unknown-openbsd7.5
checking for a BSD-compatible install... /exopi-obj/pobj/crawl-0.4/bin/install -c
checking whether build environment is sane... yes
checking whether make sets ${MAKE}... (cached) yes
checking for working aclocal-1.4... missing
checking for working autoconf... missing
checking for working automake-1.4... missing
checking for working autoheader... missing
checking for working makeinfo... found
checking for gcc... cc
checking for C compiler default output... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... (cached) o
checking whether we are using the GNU C compiler... (cached) yes
checking whether cc accepts -g... (cached) yes
checking for cc option to accept ANSI C... none needed
checking for a BSD-compatible install... /exopi-obj/pobj/crawl-0.4/bin/install -c
checking if we may use "-I/usr/include"... yes
checking for libevent... yes
checking for Berkeley DB with 1.85 compatibility... /usr/local
checking how to run the C preprocessor... cc -E
checking for egrep... (cached) grep -E
checking for ANSI C header files... (cached) yes
checking for sys/types.h... (cached) yes
checking for sys/stat.h... (cached) yes
checking for stdlib.h... (cached) yes
checking for string.h... (cached) yes
checking for memory.h... (cached) yes
checking for strings.h... (cached) yes
checking for inttypes.h... (cached) yes
checking for stdint.h... (cached) yes
checking for unistd.h... (cached) yes
checking for libgen.h... (cached) yes
checking for unistd.h... (cached) yes
checking for sys/time.h... (cached) yes
checking sys/queue.h usability... yes
checking sys/queue.h presence... yes
checking for sys/queue.h... yes
checking for TAILQ_FOREACH in sys/queue.h... yes
checking for LIST_FIRST in sys/queue.h... yes
checking for an ANSI C-conforming const... (cached) yes
checking for size_t... (cached) yes
checking for u_int64_t... yes
checking for u_int32_t... yes
checking for u_int16_t... yes
checking for u_int8_t... yes
checking whether time.h and sys/time.h may both be included... (cached) yes
checking for socket in -lsocket... no
checking for gettimeofday... (cached) yes
checking for memmove... (cached) yes
checking for memset... (cached) yes
checking for strcasecmp... (cached) yes
checking for strchr... (cached) yes
checking for strdup... (cached) yes
checking for strncasecmp... (cached) yes
checking for basename... (cached) yes
checking for dirname... (cached) yes
checking for getaddrinfo... (cached) yes
checking for strlcat... (cached) yes
checking for strlcpy... (cached) yes
checking for strsep... (cached) yes
checking for inet_aton... (cached) yes
checking for working basename... yes
checking for working dirname... yes
checking for md5 in libc... yes
checking for warnx... (cached) yes
checking for struct addrinfo in netdb.h... yes
checking for socklen_t... yes
checking for NI_MAXSERV... yes
checking for timeradd in sys/time.h... yes
configure: creating ./config.status
config.status: creating Makefile
mv: Makefile: set owner/group: Operation not permitted
config.status: creating config.h
mv: config.h: set owner/group: Operation not permitted
config.status: executing default-1 commands
===> Building for crawl-0.4p5
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c crawl.c
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c http.c
http.c:145:7: warning: variable 'n' set but not used [-Wunused-but-set-variable]
        int n = 0;
            ^
http.c:407:17: warning: self-comparison always evaluates to false [-Wtautological-compare]
        if (dns->depth != dns->depth)
                       ^
2 warnings generated.
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c connection.c
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c atomicio.c
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c html.c
atomicio.c:43:13: warning: passing arguments to a function without a prototype is deprecated in all versions of C and is not supported in C2x [-Wdeprecated-non-prototype]
                res = (f) (fd, s + pos, n - pos);
                           ^
atomicio.c:33:1: warning: a function definition without a prototype is deprecated in all versions of C and is not supported in C2x [-Wdeprecated-non-prototype]
atomicio(f, fd, _s, n)
^
2 warnings generated.
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c crawldb.c
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c util.c
util.c:131:1: warning: a function definition without a prototype is deprecated in all versions of C and is not supported in C2x [-Wdeprecated-non-prototype]
mkpath(path, mode, dir_mode)
^
1 warning generated.
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c dns.c
dns.c:495:6: warning: variable 'positive' set but not used [-Wunused-but-set-variable]
        int positive;
            ^
dns.c:90:1: warning: unused function 'tree_SPLAY_NEXT' [-Wunused-function]
SPLAY_PROTOTYPE(tree, dns_entry, splay_next, compare);
^
./tree.h:146:36: note: expanded from macro 'SPLAY_PROTOTYPE'
static __inline struct type *                                           \
                                   ^
<scratch space>:11:1: note: expanded from here
tree_SPLAY_NEXT
^
dns.c:90:1: warning: unused function 'tree_SPLAY_MIN_MAX' [-Wunused-function]
./tree.h:160:36: note: expanded from macro 'SPLAY_PROTOTYPE'
static __inline struct type *                                           \
                                   ^
<scratch space>:13:1: note: expanded from here
tree_SPLAY_MIN_MAX
^
3 warnings generated.
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c cfg.c
cc -DHAVE_CONFIG_H -I. -I. -I. -I./compat -I/usr/local/include/db -Wall -c robots.c
robots.c:79:1: warning: unused function 'rtree_SPLAY_NEXT' [-Wunused-function]
SPLAY_PROTOTYPE(rtree, http_robots, splay_next, compare);
^
./tree.h:146:36: note: expanded from macro 'SPLAY_PROTOTYPE'
static __inline struct type *                                           \
                                   ^
<scratch space>:11:1: note: expanded from here
rtree_SPLAY_NEXT
^
robots.c:79:1: warning: unused function 'rtree_SPLAY_MIN_MAX' [-Wunused-function]
./tree.h:160:36: note: expanded from macro 'SPLAY_PROTOTYPE'
static __inline struct type *                                           \
                                   ^
<scratch space>:13:1: note: expanded from here
rtree_SPLAY_MIN_MAX
^
2 warnings generated.
cc -Wall -o crawl crawl.o http.o connection.o atomicio.o html.o crawldb.o util.o dns.o cfg.o robots.o -levent -L/usr/local/lib/db -ldb
html.c(html.o:(html_parsetag)): warning: sprintf() is often misused, please use snprintf()
>>> Running package in net/crawl at 1713354142.97
===> net/crawl
===> Faking installation for crawl-0.4p5
/bin/sh ./mkinstalldirs /exopi-obj/pobj/crawl-0.4/fake-amd64/usr/local/bin
/exopi-obj/pobj/crawl-0.4/bin/install -c -s -m 755 crawl /exopi-obj/pobj/crawl-0.4/fake-amd64/usr/local/bin/crawl
/usr/bin/make install-man1
/bin/sh ./mkinstalldirs /exopi-obj/pobj/crawl-0.4/fake-amd64/usr/local/man/man1
/exopi-obj/pobj/crawl-0.4/bin/install -c -m 644 ./crawl.1 /exopi-obj/pobj/crawl-0.4/fake-amd64/usr/local/man/man1/crawl.1
===> Building package for crawl-0.4p5
Create /exopi-cvs/ports/packages/amd64/all/crawl-0.4p5.tgz
Creating package crawl-0.4p5
reading plist|
checking dependencies|
checking dependencies|databases/db/v3,-main
checksumming|
checksumming|                                                                 |  0%
checksumming|*****                                                            |  8%
checksumming|*********                                                        | 15%
checksumming|**************                                                   | 23%
checksumming|*******************                                              | 31%
checksumming|***********************                                          | 38%
checksumming|****************************                                     | 46%
checksumming|*********************************                                | 54%
checksumming|**************************************                           | 62%
checksumming|******************************************                       | 69%
checksumming|***********************************************                  | 77%
checksumming|****************************************************             | 85%
checksumming|********************************************************         | 92%
checksumming|*****************************************************************|100%
archiving|
archiving|                                                                    |  0%
archiving|*                                                                   |  1%
archiving|*************************************************************       | 95%
archiving|********************************************************************|100%
Link to /exopi-cvs/ports/packages/amd64/ftp/crawl-0.4p5.tgz
>>> Running clean in net/crawl at 1713354145.09
===> net/crawl
===> Cleaning for crawl-0.4p5
>>> Ended at 1713354145.40
max_stuck=2.88/depends=3.36/show-prepare-results=0.96/build=10.71/package=2.12/clean=0.35