Opened 3 years ago

Last modified 3 years ago

#24359 closed defect

Turn MeatAxe into a dynamic library — at Version 82

Reported by: SimonKing
Owned by:
Priority: major
Milestone: sage-8.2
Component: packages: optional
Keywords: meataxe static dynamic
Cc: jdemeyer
Merged in:
Authors:
Reviewers:
Report Upstream: None of the above - read trac for reasoning.
Work issues:
Branch: u/SimonKing/turn_meataxe_into_a_dynamic_library (Commits, GitHub, GitLab)
Commit: 168b297b7c03637281a6ce7574cac8d85a6688a5
Dependencies:
Stopgaps:


Description (last modified by SimonKing)

While working on #18514, it turned out that it is quite awkward that MeatAxe is a static library: MeatAxe uses some global variables, and each Cython module using MeatAxe library functions needs to set these global variables.

So, the purpose of this ticket is to change MeatAxe from a static into a dynamic library.

Unfortunately, we aren't upstream for MeatAxe. We are already extensively patching the upstream sources, both adding features and fixing bugs. But upstream isn't really active, the last release is several years old, and they seemed reluctant to consider our patches.

Therefore I created a fork, which I call SharedMeatAxe, as it provides a shared library. In contrast to upstream MeatAxe, it provides

  • an autotools build system
  • a shared library
  • a test suite that can be run in parallel and passes make distcheck
  • some bug fixes
  • a speedup in echelon computation
  • asymptotically fast matrix multiplication, which is faster than schoolbook multiplication even for rather small matrices.

It also fixes some errors in the documentation.

The tarball of the fork is available at http://users.minet.uni-jena.de/cohomology/shared_meataxe-1.0.tar.gz

Change History (82)

comment:1 Changed 3 years ago by SimonKing

  • Report Upstream changed from N/A to None of the above - read trac for reasoning.

comment:2 Changed 3 years ago by SimonKing

Problems:

  • I have no idea what needs to be changed in MeatAxe's Makefile to build a dynamic library.
  • I have no idea what needs to be changed in MeatAxe's Makefile to link against the newly created dynamic library. Is there a syntactical difference between linking against a static and a dynamic library? Or is it just -lmtx in both cases?
  • Is it a problem if the static libmtx.a is still around from a previous installation of meataxe? I.e., does spkg-install need to remove libmtx.a before the dynamic library is created?

comment:3 Changed 3 years ago by SimonKing

For the record: This is the upstream Makefile

include Makefile.conf

CFLAGS=$(CFLAGS1) -Itmp
LFLAGS=$(LFLAGS1)

PROGRAMS = \
  cfcomp checksum chop decomp genmod mkcycl mkdotl mkgraph mkhom mkhom_old\
  mkinc mksub mktree orbrep precond pseudochop pwkond rad soc symnew tcond tuc \
  zad zbl zcf zcl zcp zct zcv zef zev zfr ziv zkd zmo zmu zmw znu zor zpo zpr \
  zpt zqt zro zsc zsi zsp zsy ztc zte ztr zts zuk zvp

all build: $(PROGRAMS:%=bin/%)

clean:
	rm -rf doc bin tmp
	cd test; make clean


# ------------------------------------------------------------------------------
# Compile C sources
# ------------------------------------------------------------------------------

tmp/%.o: tmp/mk.dir src/%.c src/meataxe.h tmp/config.h
	@echo "Compiling $*.c"
	@$(CC) $(CFLAGS) -c src/$*.c -o $@

tmp/mk.dir:
	mkdir -p tmp
	touch $@

bin/mk.dir:
	mkdir -p bin
	touch $@

# ------------------------------------------------------------------------------
# Link programs
# ------------------------------------------------------------------------------

bin/%: bin/mk.dir tmp/%.o tmp/libmtx.a
	@echo "Linking $@"
	@$(CC) $(LFLAGS) -o $@ tmp/$*.o tmp/libmtx.a


# ------------------------------------------------------------------------------
# Library
# ------------------------------------------------------------------------------

LIB_OBJS=\
	args berlekmp \
	bsand bscore bsdup bsissub bsmatch bsminus \
	bsop bsor bsprint bsread bswrite \
	cfinfo \
	charpol chbasis \
	error \
	ffio \
	fpcore fpdup fpmul fpmul2 fpprint \
	gcd genseed\
	grmaprow grmatcore grtable \
	homcomp \
	imatcore imatread imatwrite\
	init intio issub \
	isisom kernel-$(ZZZ) \
	ldiag \
	maddmul mat2vec matadd matclean matcmp \
	maketabF \
	matcopy matcore matcut \
	matdup matech matid matins matinv matmul \
	matnull matorder \
	matpivot\
	matprint matpwr matread mattr mattrace matwrite \
	message \
	mfcore mfread mfreadlong mfwrite mfwritelong \
	minpol mkendo\
	mmulscal mraddgen mrcore mrread mrtranspose mrwrite \
	msclean mscore \
	mtensor mtxobj os \
	permcmp permcore permdup perminv permmul permorder\
	permprint permpwr permread permwrite poladd\
	polcmp polcore polderive poldiv poldup\
	polgcd polmul polprint polread polwrite \
	quotient random rdcfgen \
	saction setcore setinsert settest \
	spinup spinup2 \
	split stabpwr stfcore \
	stfread stfwrite \
	string \
	sumint \
	temap \
	tkinfo vec2mat \
	wgen \
	zcleanrow zcmprow zgap zpermrow \
	zzz2 \
	version

tmp/libmtx.a: $(LIB_OBJS:%=tmp/%.o)
	@echo "Creating $@"
	@rm -f $@
	@ar r $@ $(LIB_OBJS:%=tmp/%.o)


# ------------------------------------------------------------------------------
# Test suite
# ------------------------------------------------------------------------------

TS_OBJS1=c-args c-bitstring c-charpol\
	c-ffio c-fileio c-ffmat c-ffrow c-fpoly \
	c-grease c-kernel c-matins c-matrix c-matset\
	c-os c-perm c-poly c-pseed c-quot c-random \
	c-sets c-stf c-tensor

TS_OBJS=tmp/zzztest.o $(TS_OBJS1:%=tmp/%.o) tmp/libmtx.a

bin/zzztest: bin/mk.dir $(TS_OBJS)
	@echo "Linking $@"
	@$(CC) $(CFLAGS) -o $@ $(TS_OBJS)

bin/checksum: bin/mk.dir tmp/checksum.o
	@echo "Linking $@"
	@$(CC) $(CFLAGS) -o $@ tmp/checksum.o

TESTS=0000\
  0100 0100 0105 0106 0107 0108 0109 0110 0111 0112\
  0113 0114 0115 0116 0117 0118 0200 0201 0210 0211\
  0212 0213 0214 0215

check: tmp/zzztest.done $(TESTS:%=tmp/t-%.done)

tmp/zzztest.done: tmp/mk.dir bin/zzztest
	cd tmp && ../bin/zzztest
	touch $@

tmp/t-%.done: tmp/mk.dir test/t-% tmp/t.config bin/checksum build
	@echo "t-$* `grep '^#:' test/t-$* | cut -c 3-100`"
	@cd tmp && ../test/t-$*
	@touch $@

tmp/t.config: tmp/mk.dir test/config
	cp test/config $@

# ------------------------------------------------------------------------------
# config.h
# ------------------------------------------------------------------------------

tmp/config.h: tmp/mk.dir Makefile Makefile.conf src/genconfig.c svnversion
	@echo "Generating config.h"
	@$(CC) $(CFLAGS) $(LFLAGS) -o tmp/genconfig src/genconfig.c
	@tmp/genconfig >$@


	
	
.PHONY: doc tar clean install check
.PRECIOUS: tmp/%.o

svnversion: $(EXPORTED_FILES)
	svnversion >$@ || echo "UNKNOWN" >$@

.PHONY: force-svnversion
force-svnversion:
	svnversion >svnversion

# ------------------------------------------------------------------------------
# Documentation
# ------------------------------------------------------------------------------

.PHONY: doc

rebuild-doc:
	rm -rf doc
	mkdir -p doc
	doxygen etc/Doxyfile >/dev/null

doc: doc/index.html

doc/index.html: etc/Doxyfile $(PROGRAMS:%=src/%.c) $(LIB_OBJS:%=src/%.c) src/meataxe.h src/meataxe.doc src/changelog.doc
	mkdir -p doc
	doxygen etc/Doxyfile >/dev/null


# ------------------------------------------------------------------------------
# Releasing
# ------------------------------------------------------------------------------

EXPORTED_FILES =\
  $(PROGRAMS:%=src/%.c)\
  $(LIB_OBJS:%=src/%.c)\
  src/meataxe.h src/genconfig.c\
  src/check.h $(TS_OBJS1:%=src/%.c) $(TS_OBJS1:%=src/%.h) src/zzztest.c\
  $(TESTS:%=test/t-%) test/config test/data\
  Makefile Makefile.conf.dist README COPYING\
  src/meataxe.doc src/changelog.doc\
  svnversion etc doc

tar: svnversion all rebuild-doc
	V=2.4.`cat svnversion` \
	&& rm -f meataxe-$$V meataxe-$$V.tar meataxe-$$V.tar.gz \
	&& ln -s . meataxe-$$V \
	&& tar cf meataxe-$$V.tar $(EXPORTED_FILES) \
	&& rm meataxe-$$V \
	&& gzip meataxe-$$V.tar \
	&& echo "Created meataxe-$$V.tar.gz"




PS: Makefile.conf is empty.

Last edited 3 years ago by SimonKing (previous) (diff)

comment:4 follow-ups: Changed 3 years ago by dimpase

I propose autotoolization. It would be hell to make this work on OSX and other non-linux otherwise. Do shout if you need help with this - it's not hard, in fact.

comment:5 Changed 3 years ago by SimonKing

Remark: Probably it makes no sense to contact upstream about this.

For the current meataxe spkg, we have 7 patches. Some of them fix clear bugs, some add a considerable speed-up in echelon computation and matrix multiplication. However, none of them was taken into account by upstream, who stated that they are currently preparing another version anyway (no idea how long this will take and how many backward incompatible changes will be made in the new version).

comment:6 in reply to: ↑ 4 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

I propose autotoolization. It would be hell to make this work on OSX and other non-linux otherwise. Do shout if you need help with this - it's not hard, in fact.

Actually I already successfully autoconfiscated a package: David Green did not use autotools for his projective resolution programs, but when I became upstream, I made it use autotools.

However, I consider it a crucial difference that I am not upstream for MeatAxe. So, I am somewhat reluctant to completely change the build system.

OTOH, I did provide rather invasive patches for MeatAxe (e.g., one adding asymptotically fast matrix multiplication). So, I am not categorically against a patch that commits autoconfiscation...

comment:7 in reply to: ↑ 6 Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to dimpase:

I propose autotoolization. It would be hell to make this work on OSX and other non-linux otherwise. Do shout if you need help with this - it's not hard, in fact.

Actually I already successfully autoconfiscated a package: David Green did not use autotools for his projective resolution programs, but when I became upstream, I made it use autotools.

However, I consider it a crucial difference that I am not upstream for MeatAxe. So, I am somewhat reluctant to completely change the build system.

OTOH, I did provide rather invasive patches for MeatAxe (e.g., one adding asymptotically fast matrix multiplication). So, I am not categorically against a patch that commits autoconfiscation...

Another build system is less invasive than functionality changes, I think. Anyhow, if I were you I would just have forked MeatAxe and forgotten about the non-cooperating upstream.

PS. The threat of a fork sometimes makes upstream cooperative all of a sudden :-)

Last edited 3 years ago by dimpase (previous) (diff)

comment:8 in reply to: ↑ 4 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

I propose autotoolization. It would be hell to make this work on OSX and other non-linux otherwise. Do shout if you need help with this - it's not hard, in fact.

When I autotoolized David Green's modular resolution code, I used "Autotools - A Practitioner's guide to GNU autoconf, automake, and libtool" by John Calcote. Nonetheless, since I am still quite a newbie, I'd appreciate it if you could summarise for me what steps I need to take. I guess I would be able to look up the details myself on top of the summary.

comment:9 in reply to: ↑ 8 ; follow-up: Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to dimpase:

I propose autotoolization. It would be hell to make this work on OSX and other non-linux otherwise. Do shout if you need help with this - it's not hard, in fact.

When I autotoolized David Green's modular resolution code, I used "Autotools - A Practitioner's guide to GNU autoconf, automake, and libtool" by John Calcote. Nonetheless, since I am still quite a newbie, I'd appreciate it if you could summarise for me what steps I need to take. I guess I would be able to look up the details myself on top of the summary.

Not that I have much more experience with this, but yes, this book is probably the best source.

As this code does not have any nontrivial dependencies (according to a quick look at the makefile), I guess it's about

  • figuring out what the sources (*.c and *.h) are, and probably putting all *.h files into a separate include/ directory; I would keep the library sources in src/ but create another directory, say progs/, for the sources of the executables.
  • figuring out what to install: headers, library, executables
  • figuring out what to test.

Then, put a Makefile.am in the appropriate directories; for include/ you'd want a one-line Makefile.am with nobase_include_HEADERS = .... I'd go for include/meataxe/; then you'd need to modify the sources to have meataxe/ in their #include <> statements (this is for the headers you need installed, I gather, but there is no harm in installing them all in meataxe/).

The Makefile.am in src/ will deal with building the library; the one in progs with executables, the one in test/ with testing.

configure.ac would be very trivial...
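To give a rough idea, an untested sketch of the pieces could look as follows (the directory layout, version number and truncated source lists are assumptions on my side, to be adapted to the real tree):

configure.ac:

AC_INIT([meataxe], [2.4.24])
AM_INIT_AUTOMAKE([foreign subdir-objects])
AC_PROG_CC
LT_INIT
AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([Makefile include/Makefile src/Makefile progs/Makefile test/Makefile])
AC_OUTPUT

include/Makefile.am:

nobase_include_HEADERS = meataxe.h

src/Makefile.am:

lib_LTLIBRARIES = libmtx.la
libmtx_la_SOURCES = args.c berlekmp.c bsand.c   # ... the LIB_OBJS list from the old Makefile

LT_INIT pulls in libtool, which is what produces the shared library in a portable way.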

comment:10 in reply to: ↑ 9 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

Then, putting Makefile.am in the appropriate directories; for include/ you'd want an 1-line Makefile.am with nobase_include_HEADERS = I'd go for include/meataxe/,

Sure, currently the meataxe headers are there, too.

then you'd need to modify sources to have meataxe/ in #include <> statements (this is for headers you need installed, I gather, but there is no harm in installing them all in meataxe/)

I don't understand what that means. Currently, there is #include "meataxe.h". Are you saying that I need to change it into #include "meataxe/meataxe.h"? But why? Wouldn't "make build" take place in some build directory whose structure mimics the source structure, whereas the transfer of headers to include/meataxe would only happen during "make install"?

The Makefile.am in src/ will deal with building the library; the one in progs with executables, the one in test/ with testing.

I need to figure out what to do so that the Makefile.am in test/ will only be executed if someone does "make test".

comment:11 Changed 3 years ago by SimonKing

PS: There is only one header file to be installed, namely meataxe.h. The other header files are apparently only used for testing. So, they would better be moved into the test/ folder.

comment:12 Changed 3 years ago by SimonKing

Currently meataxe has only a single src directory, containing all headers (the one to be included and the ones used for testing) and all code files (regardless of whether they are for library functions, executables, or tests). That'd really require a major overhaul.

comment:13 in reply to: ↑ 4 Changed 3 years ago by jdemeyer

Replying to dimpase:

I propose autotoolization.

Indeed, it's the only realistic solution. Don't try to fix the upstream build system, just throw it out and start from scratch with autotools.

comment:14 in reply to: ↑ 10 Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to dimpase:

Then, putting Makefile.am in the appropriate directories; for include/ you'd want an 1-line Makefile.am with nobase_include_HEADERS = I'd go for include/meataxe/,

Sure, currently the meataxe headers are there, too.

then you'd need to modify sources to have meataxe/ in #include <> statements (this is for headers you need installed, I gather, but there is no harm in installing them all in meataxe/)

I don't understand what that means. Currently, there is #include "meataxe.h". Are you saying that I need to change it into #include "meataxe/meataxe.h"? But why? Wouldn't "make build" take place in some build directory whose structure mimics the source structure, whereas the transfer of headers to include/meataxe would only happen during "make install"?

No, that's not what I meant, but indeed if you only need to install meataxe.h, and no other headers, then you may leave it as it is. However, it's bad practice to mix *.c and *.h in the same dir. Still, I'd create include/ and move all the *.h files there. Then, if your AC_CONFIG_FILES in configure.ac mentions include/Makefile, the build setup will add the relevant -I../include/ or something like this to the compilation commands.

IIRC, nobase_include_HEADERS = meataxe.h in include/ will then make sure that make install copies meataxe.h (and no other headers) to where it should go, i.e. $SAGE_LOCAL/include. (But I might be wrong on this one, it could be that you must list all the *.h files there, and then they would all get installed, something you rather want to avoid, I guess. Not sure.)

The Makefile.am in src/ will deal with building the library; the one in progs with executables, the one in test/ with testing.

I need to figure out what to do so that the Makefile.am in test/ will only be executed if someone does "make test".

I think this is ensured by listing the corresponding executables in check_PROGRAMS = in test/Makefile.am, and not in bin_PROGRAMS =. The latter is for executables to be installed by make install.
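For instance, a test/Makefile.am along these lines (just a sketch, with names taken from the upstream Makefile above; untested):

check_PROGRAMS = zzztest
zzztest_SOURCES = zzztest.c c-args.c c-bitstring.c   # ... the TS_OBJS1 list
zzztest_LDADD = ../src/libmtx.la

TESTS = zzztest t-0000 t-0100   # ... the t-* shell scripts

Entries in TESTS that are not compiled programs are simply run as scripts by make check.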

PS. I'm following my autotoolization project here: https://github.com/dimpase/csdp

comment:15 follow-up: Changed 3 years ago by SimonKing

So far, I thought I should keep the upstream sources and add yet another patch. The patch would define the change from the upstream build system to whatever comes out by applying autotools.

However, autotools creates a lot of files, such as INSTALL, COPYING, Makefile.in, config.h.in, etc., and if I understand correctly these are all supposed to be part of a distribution, i.e., the patch would add all those files and would thus be HUGE.

Wouldn't it then be better to fork MeatAxe, after all?

I.e., remove the old build system, apply autotools, create meataxe-2.4.24.p4.tar.gz by make dist (IIRC that's one of the make targets provided by automake), and post this on my web page? Then, the current patches from SAGE_ROOT/build/pkgs/meataxe/patches will already be part of the .tar.gz.

I just checked that meataxe 2.5 is still not using autotools. But how can I tell upstream about it without causing anger?

comment:16 follow-up: Changed 3 years ago by SimonKing

Another question concerning the portability of the dynamic library approach. Are dynamic libraries really available on all platforms targeted by Sage?

comment:17 in reply to: ↑ 16 Changed 3 years ago by dimpase

Replying to SimonKing:

Another question concerning the portability of the dynamic library approach. Are dynamic libraries really available on all platforms targeted by Sage?

Certainly.

comment:18 in reply to: ↑ 15 Changed 3 years ago by dimpase

Replying to SimonKing:

So far, I thought I should keep the upstream sources and add yet another patch. The patch would define the change from the upstream build system to whatever comes out by applying autotools.

However, autotools creates a lot of files, say, INSTALL, COPYING, makefile.in, config.h.in, etc, and if I understand correctly these are all files supposed to be part of a distribution, i.e., the patch would add all those files and would thus be HUGE.

Wouldn't it then be better to fork MeatAxe, after all?

I.e., remove the old build system, apply autotools, create meataxe-2.4.24.p4.tar.gz by make dist (IIRC that's one of the make targets provided by automake), and post this on my web page? Then, the current patches from SAGE_ROOT/build/pkgs/meataxe/patches will already be part of the .tar.gz.

I just checked that meataxe 2.5 is still not using autotools. But how to tell upstream about it without creating anger?

Supporting dynamic libraries on a variety of platforms basically requires using autotools or cmake, otherwise it is a nightmare.

Sage has quite some experience with autotoolising upstreams; usually it ends up well. Cliquer (https://github.com/dimpase/autocliquer/) was an exception: upstream declined our patches, on the grounds that the software had not been touched for 6 or 7 years and that the code had never included outside contributions in the past.

I think *reasonable* people would gladly accept these changes...

comment:19 Changed 3 years ago by SimonKing

As I mentioned earlier, upstream ignores my patches adding Winograd-Strassen multiplication, although it is faster than schoolbook multiplication even for quite small matrices. But what would happen if upstream accepted autotoolization, but kept ignoring the other patches?

Wouldn't the addition of another source file imply a change in src/Makefile.am (which would be part of the distribution)? I.e., we would have to wait for upstream to accept the autotoolization and then still need to patch Makefile.am

comment:20 follow-up: Changed 3 years ago by dimpase

Just create a fork. The fact that upstream ignores patches should not preclude further development, and the license perfectly allows this.

Call it AutoMeatAxe, say...

Last edited 3 years ago by dimpase (previous) (diff)

comment:21 in reply to: ↑ 20 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

Just create a fork. The fact that upstream ignores patches should not preclude further development, and the license perfectly allows this.

Call it AutoMeatAxe, say...

What would then happen with the current optional "meataxe" spkg and with the OptionalExtension depending on it? Would sage -i meataxe henceforth install automeataxe? But how would that work, given that sage -i bla looks in upstream for the latest version of bla (so that sage -i meataxe couldn't pick up automeataxe)?

A static library provided by meataxe and a dynamic library provided by automeataxe can't coexist, I believe.

comment:22 follow-up: Changed 3 years ago by SimonKing

I just thought: Maybe a dynamic library is not a good solution, in particular if it comes to parallel computation?

libmtx has global C variables not only for the folders holding multiplication tables, but also for the current field and the current row length. Since the library is static, each Cython module and each executable linked against libmtx has its own copy of these variables, which means that all of them are obligated to set the appropriate variables.

Suppose we used a dynamic library instead. Now imagine a computation in which functions from different Cython modules A, B, C and different executables D, E, F are supposed to deal with matrices of different dimensions. If they were all linked against the same dynamic library, chaos would result: if module A changed the row size and executable E picked that up, it would undoubtedly crash.

Am I right that this would happen?

If "yes", then it is a show stopper, and we can mark this ticket as invalid.

comment:23 Changed 3 years ago by SimonKing

And what about a multiuser setting?

If user A changes a global variable in a dynamic library, would that affect user B?

comment:24 in reply to: ↑ 22 ; follow-up: Changed 3 years ago by jdemeyer

Global variables in dynamic libraries are only shared within a single process. Two different processes (regardless of whether they run the same executable or belong to the same user) won't share anything.

So, in practice, the relevant difference between static and shared for Sage will be if different Cython modules use the same library.

Since parallel computation will most likely involve just one Cython module, I don't see an issue there.

comment:25 in reply to: ↑ 24 Changed 3 years ago by SimonKing

Replying to jdemeyer:

Since parallel computation will most likely involve just one Cython module, I don't see an issue there.

The computation of the cohomology ring of a finite group G typically involves the computation of the cohomology rings of various subgroups of G. It would be natural to have these computed in parallel. It involves libmtx calls at least from sage/groups/modular_cohomology/resolution, sage/groups/modular_cohomology/cochain and sage/matrix/matrix_gfpn_dense, and these computations would likely be operating with different row sizes.

Last edited 3 years ago by SimonKing (previous) (diff)

comment:26 Changed 3 years ago by SimonKing

BTW, I just ran a parallel computation (using the @parallel decorator) of matrix products of different sizes. At least with the current static library, it works fine.

comment:27 in reply to: ↑ 21 ; follow-up: Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to dimpase:

Just create a fork. The fact that upstream ignores patches should not preclude further development, and the license perfectly allows this.

Call it AutoMeatAxe, say...

What would then happen with the current optional "meataxe" spkg and with the OptionalExtension depending on it? Would sage -i meataxe henceforth install automeataxe? But how would that work, given that sage -i bla looks in upstream for the latest version of bla (so that sage -i meataxe couldn't pick up automeataxe)?

The meataxe package would use the automeataxe tarball; the tarball name can be more or less arbitrary. The package name won't have to change: it's just the name of the upstream project. See how this is done with cliquer.

A static library provided by meataxe and a dynamic library provided by automeataxe can't coexist, I believe.

why would you need both?

comment:28 in reply to: ↑ 27 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

A static library provided by meataxe and a dynamic library provided by automeataxe can't coexist, I believe.

why would you need both?

I don't. But I was afraid that an automeataxe spkg couldn't be called meataxe. Also, I am slightly worried about a stale libmtx.a: How to make sure that everything is linked against a new dynamic library rather than an old static library?

That said, I still worry about potential problems with parallelism, due to shared global variables.

comment:29 follow-up: Changed 3 years ago by dimpase

@parallel does not do any memory sharing among subprocesses---which is often a problem for computations where you really want to avoid the overhead of copying everything, but in this case it would actually make the transition to the dynamic setting seamless.

(of course, in a real multithreaded application things like global variables are something that is to be avoided, but it's not the case here.)

comment:30 in reply to: ↑ 28 Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to dimpase:

A static library provided by meataxe and a dynamic library provided by automeataxe can't coexist, I believe.

why would you need both?

I don't. But I was afraid that an automeataxe spkg couldn't be called meataxe. Also, I am slightly worried about a stale libmtx.a: How to make sure that everything is linked against a new dynamic library rather than an old static library?

Remove the stale libs and other things in spkg-install (many packages actually do this).
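E.g., near the top of spkg-install, something along these lines (sketch only):

# remove remnants of the old static meataxe installation
rm -f "$SAGE_LOCAL"/lib/libmtx.a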

That said, I still worry about potential problems with parallelism, due to shared global variables.

As I said in the comment I just posted, with @parallel you will be fine.

comment:31 in reply to: ↑ 29 Changed 3 years ago by jdemeyer

Replying to dimpase:

@parallel does not do any memory sharing among subprocesses

To elaborate: @parallel uses fork() which creates multiple processes (which don't share memory), not threads.

comment:32 Changed 3 years ago by SimonKing

MeatAxe defines various symbols during its hand-made initialisation. Among them:

  • MTX_CONFIG_BIG_ENDIAN
  • MTX_BUILD and MTX_VERSION
  • MTX_CONFIG_LONG32 and MTX_CONFIG_LONG64
  • MTX_CONFIG_STRING

Some of them are only used in the docs, some are used in a file intio.c.

Question: How can autoconf be used to define these symbols? I.e., how can I check during configuration whether the machine is big or little endian and whether it is 32 or 64 bit? So far, I have only found runtime tests; in fact, the upstream hand-made configuration of MeatAxe first compiles a little C program that checks the machine at runtime and then defines the above symbols, so that the rest of the code knows the configuration at compile time.

comment:33 Changed 3 years ago by SimonKing

For the record, the current configuration works like this:

tmp/config.h: tmp/mk.dir Makefile Makefile.conf src/genconfig.c svnversion
	@echo "Generating config.h"
	@$(CC) $(CFLAGS) $(LFLAGS) -o tmp/genconfig src/genconfig.c
	@tmp/genconfig >$@

and the resulting header is then included in two files, intio.c and version.c. The former relies on endianness and on the 32/64 bit architecture. The latter provides a variable MtxVersion, which is then used to print help and status information.

Again, the question is how that would change when using autotools?

comment:34 follow-up: Changed 3 years ago by dimpase

There is the Autoconf macro AC_C_BIGENDIAN which you can use for endianness detection. For the version etc. there is AC_INIT, which is typically used for setting the version.

For 32 vs 64 bits, use AC_CHECK_SIZEOF
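A configure.ac fragment along these lines should do (untested sketch; the code would then test the standard symbols WORDS_BIGENDIAN and SIZEOF_LONG from config.h, or AC_DEFINE the MTX_* names from them):

AC_INIT([shared_meataxe], [1.0])
AC_CONFIG_HEADERS([config.h])
AC_PROG_CC
AC_C_BIGENDIAN              # defines WORDS_BIGENDIAN on big-endian machines
AC_CHECK_SIZEOF([long])     # defines SIZEOF_LONG, e.g. 4 or 8
AC_OUTPUT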

Last edited 3 years ago by dimpase (previous) (diff)

comment:35 in reply to: ↑ 34 Changed 3 years ago by SimonKing

Replying to dimpase:

there is Autoconf macro AC_C_BIGENDIAN which you can use for endianness detection. For the version etc there is AC_INIT, which is typically used for setting the version.

For 32 vs 64 bits, use AC_CHECK_SIZEOF

Thank you! That information is in fact missing in the autotools tutorial I am using. But I found it documented in the autoconf manual.

comment:36 follow-up: Changed 3 years ago by SimonKing

I made some progress: The library and the executables of SharedMeatAxe (that's the name I chose) build.

I now need to see how to run MeatAxe's test suite and how to build MeatAxe's documentation (they use doxygen).

What I don't understand: After installation, I expected to find a single shared library in lib/, but I find:

$ ls lib
libmtx.a  libmtx.la  libmtx.so	libmtx.so.0  libmtx.so.0.0.0

In particular, the static library libmtx.a is there as well. Is that not a problem?

Also, would it be a problem (e.g. for the OptionalExtension sage.matrix.matrix_gfpn_dense) that meataxe.h is not installed directly in include/ but in include/shared_meataxe/?

Last edited 3 years ago by SimonKing (previous) (diff)

comment:37 follow-up: Changed 3 years ago by dimpase

Various versions of .so are fine; it is the usual versioning stuff. Only one of them is not a link. By default, both static and dynamic libs are built. You can use a parameter to configure to build only the dynamic one, something like --disable-static.

The header location can be adjusted in the progs that use them.

Last edited 3 years ago by dimpase (previous) (diff)

comment:38 in reply to: ↑ 37 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

various versions of .so are fine, it is the usual versioning stuff. only one of them is not a link. by default both static and dynamic libs are built. you can use a parameter to configure to build only dynamic one, something like --disable-static

From my perspective, the crucial question is: What happens if one does -lmtx? Would that result in linking against the static or against the dynamic library, if both are present?

OTOH, I was told that Sage only supports platforms which do provide shared libraries. If that's the case then --disable-static probably is the way to go.

header location can be adjusted in the progs that use them.

Sure. But there is only a single header to be included. Thus, it might be better to avoid the additional shared_meataxe folder.

comment:39 in reply to: ↑ 36 ; follow-up: Changed 3 years ago by jdemeyer

Replying to SimonKing:

Also, would it be a problem (e.g. for the OptionalExtension sage.matrix.matrix_gfpn_dense) that meataxe.h is not installed directly in include/ but in include/shared_meataxe/?

shared_meataxe is a bit of a strange name. I think it should be either include/meataxe.h or include/meataxe/meataxe.h.

comment:40 in reply to: ↑ 38 ; follow-up: Changed 3 years ago by jdemeyer

Replying to SimonKing:

From my perspective, the crucial question is: What happens if one does -lmtx?

On Linux at least, dynamic libraries (.so) take precedence. But I know that handling of libraries differs a lot between operating systems, so I cannot speak for non-Linux systems.
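On Linux you can easily check what got picked up, e.g. (zcv is just one of the MeatAxe programs; on other systems you'd use otool -L or similar instead of ldd):

$ ldd "$SAGE_LOCAL"/bin/zcv | grep mtx
        libmtx.so.0 => ...

If libmtx does not show up in the ldd output, the static version was linked in.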

comment:41 in reply to: ↑ 39 ; follow-up: Changed 3 years ago by SimonKing

Replying to jdemeyer:

Replying to SimonKing:

Also, would it be a problem (e.g. for the OptionalExtension sage.matrix.matrix_gfpn_dense) that meataxe.h is not installed directly in include/ but in include/shared_meataxe/?

shared_meataxe is a bit a strange name.

SharedMeatAxe is the package name that I currently propose. Dima suggested AutoMeatAxe. However, I prefer a name that indicates what the package provides (namely: A shared MeatAxe library) over a name that tells how the package is built (namely: With autotools).

At the current stage, the name isn't fixed. So, if you can think of a catchy name, then please tell!

I think it should be either include/meataxe.h or include/meataxe/meataxe.h

I guess it will be the former; that's what the old meataxe spkg did anyway.

comment:42 in reply to: ↑ 40 Changed 3 years ago by SimonKing

Replying to jdemeyer:

Replying to SimonKing:

From my perspective, the crucial question is: What happens if one does -lmtx?

On Linux at least, dynamic libraries (.so) take precendence. But I know that handling of libraries differs a lot between operating systems, so I cannot speak for non-Linux systems.

That means I have to use --disable-static. The whole point of this ticket is to provide a shared library, so that all parts of Sage that use meataxe (matrix_gfpn_dense and, in the future, also the modular group cohomology code) need to define certain global parameters only once per session, rather than once in each module. So, on a machine where static libraries take precedence, the code would break.
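So spkg-install would essentially do something like this (a sketch; assuming the autotoolized tarball unpacks into src/ as usual, and without the error handling a real script needs):

cd src
./configure --prefix="$SAGE_LOCAL" --libdir="$SAGE_LOCAL/lib" --disable-static
$MAKE
$MAKE install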

comment:43 Changed 3 years ago by SimonKing

I am stuck with the question: How to build the documentation using autotools?

In "Autotools - A practitioner's guide...", the description on doxygen was obscure to me, and web search also didn't help me to understand what needs to be done.

For reference, here is the doc related part of the original Makefile:

# ------------------------------------------------------------------------------
# Documentation
# ------------------------------------------------------------------------------

.PHONY: doc

rebuild-doc:
        rm -rf doc
        mkdir -p doc
        doxygen etc/Doxyfile >/dev/null

doc: doc/index.html

doc/index.html: etc/Doxyfile $(PROGRAMS:%=src/%.c) $(LIB_OBJS:%=src/%.c) src/meataxe.h src/meataxe.doc src/changelog.doc
        mkdir -p doc
        doxygen etc/Doxyfile >/dev/null

How can one check for the presence of doxygen? How can one translate the above into an appropriate Makefile.am that should probably reside in doc/?

comment:44 Changed 3 years ago by SimonKing

I found a macro to check for doxygen in the Autoconf archive. Do I understand correctly that I am supposed to copy that macro into the m4/ directory of my package in order to use it?

comment:45 follow-up: Changed 3 years ago by SimonKing

I don't know if this is an autotools question or a doxygen question.

I am now able to get a make target "doxygen-doc". It somehow creates documentation, but unfortunately I encounter three problems:

  • The html pages created by doxygen are not as thorough as with the original MeatAxe. Namely, the data structures are documented, but the modules are not, and also the main page is basically empty.
  • When trying to build the pdf documentation, latex complains about a missing file refman.tex. Am I supposed to create that file? Is it supposed to be autogenerated? Why is it not autogenerated? I am a bit stuck.
  • make uninstall does not remove the created documentation.

In relation with doxygen, this is what I did:

  • In configure.ac:
    DX_INIT_DOXYGEN([shared_meataxe], [$(top_srcdir)/etc/Doxyfile], [doc])
    DX_DOXYGEN_FEATURE(ON)
    
  • In Makefile.am:
    @DX_RULES@
    
    MOSTLYCLEANFILES = DX_CLEANFILES
    

The Makefile generated by autotools/configure is then calling doxygen with these environment variables:

SRCDIR='../autoconfiscate' PROJECT='shared_meataxe' VERSION='1.0' PERL_PATH='/usr/bin/perl' HAVE_DOT='NO' GENERATE_MAN='NO' GENERATE_RTF='NO' GENERATE_XML='NO' GENERATE_HTMLHELP='NO' GENERATE_CHI='NO' GENERATE_HTML='YES' GENERATE_LATEX='YES'

Do you have an idea what the point of the obscure refman.tex is, why the documentation of some things is created while that of others isn't, why the main doc page is empty, and why the doc files aren't uninstalled?

comment:46 in reply to: ↑ 45 Changed 3 years ago by SimonKing

Replying to SimonKing:

SRCDIR='../autoconfiscate' PROJECT='shared_meataxe' VERSION='1.0' PERL_PATH='/usr/bin/perl' HAVE_DOT='NO' GENERATE_MAN='NO' GENERATE_RTF='NO' GENERATE_XML='NO' GENERATE_HTMLHELP='NO' GENERATE_CHI='NO' GENERATE_HTML='YES' GENERATE_LATEX='YES'

Note that SRCDIR='../autoconfiscate' sounds odd to me. The C sources are in fact in ../autoconfiscate/src. However, when I manually edited the Makefile so that SRCDIR got a seemingly better definition, the modules were still not documented.

So, where is one supposed to state (1) which .c and .h files are to be considered by doxygen, and (2) which file the main doc page is to be taken from?

comment:47 in reply to: ↑ 41 ; follow-up: Changed 3 years ago by jdemeyer

Replying to SimonKing:

SharedMeatAxe is the package name that I currently propose. Dima suggested AutoMeatAxe. However, I prefer a name that indicates what the package provides (namely: A shared MeatAxe library) over a name that tells how the package is built (namely: With autotools).

It doesn't matter what you call your package. The header file doesn't depend on whether it's plain old MeatAxe or SharedMeatAxe or AutoMeatAxe. So I simply wouldn't change the header file name or location.

comment:48 Changed 3 years ago by SimonKing

The setting is:

  • I have a directory in which I build the package: ~/Projekte/Meataxe/build.
  • The configure script resides in ~/Projekte/Meataxe/autoconfiscate
  • The C sources are in ~/Projekte/Meataxe/autoconfiscate/src
  • The Doxygen file is in ~/Projekte/Meataxe/autoconfiscate/etc

When doing make doxygen-doc, I see

error: the type 'dirs' is not supported for the entry tag within a navindex! Check your layout file!

Later, it says

Preprocessing /home/king/Projekte/MeatAxe/build/config.h...
Parsing file /home/king/Projekte/MeatAxe/build/config.h...
Preprocessing /home/king/Projekte/MeatAxe/build/include/meataxe.h...
Parsing file /home/king/Projekte/MeatAxe/build/include/meataxe.h...

which probably explains why some documentation is in fact created.

And then, latex starts:

writing tag file...
lookup cache used 523/65536 hits=1630 misses=530
finished...
echo Timestamp >doc/shared_meataxe.tag
cd doc/latex; \
rm -f *.aux *.toc *.idx *.ind *.ilg *.log *.out; \
/usr/bin/latex refman.tex; \
/usr/bin/makeindex refman.idx; \
/usr/bin/latex refman.tex; \
countdown=5; \

If refman.tex is expected, but not provided, I suppose that upstream didn't want pdf documentation to be created. The documentation that is shipped with the upstream meataxe package is html only.

However, I don't understand how to make doxygen consider the whole src directory, rather than only config.h and meataxe.h

comment:49 in reply to: ↑ 47 Changed 3 years ago by SimonKing

Replying to jdemeyer:

It doesn't matter what you call your package. The header file doesn't depend on whether it's plain old MeatAxe or SharedMeatAxe or AutoMeatAxe. So I simply wouldn't change the header file name or location.

  1. I didn't intend to change the name of the header file.
  2. At some stage, my Makefile.am used pkginclude_HEADERS, and thus autotools put meataxe.h into local/include/$PACKAGE_NAME/. However, I changed it to include_HEADERS, so that the header now goes directly into local/include. So, the problem vanished.
  3. The package needs to have some name. I prefer SharedMeatAxe over AutoMeatAxe, but I'm open for better suggestions.

comment:50 Changed 3 years ago by SimonKing

Concerning "type 'dirs' is not supported", I find the following matches for "dirs" in etc/ (where the doxygen layout file resides):

etc/layout.xml:    <tab type="dirs" visible="yes" title=""/>
etc/layout.xml:      <dirs visible="yes" title=""/>
etc/layout.xml:      <dirs visible="yes"/>

comment:51 follow-up: Changed 3 years ago by dimpase

Are you able to build the docs in the original package? It could be that the original package does not work with the up-to-date doxygen; if you google for the error message the type 'dirs' is not supported for the entry tag within a navindex! you will see that it might be due to a (relatively) recent doxygen update.

Perhaps simply removing these dirs directives in etc/layout.xml will do the trick.

comment:52 in reply to: ↑ 51 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

Are you able to build the docs in the original package?

Yes, I just tested it. I need to be in the top-level directory. I simply did

$ doxygen etc/Doxyfile

Then, html documentation was created in the ./doc/ folder, using all the files from the ./src/ folder.

So, I guess the question is how to make automake issue the doxygen command in the top-level directory of the source tree (NOT in src/, but one level above src/, and also not in the build/ directory).

It could be that the original package does not work with the up-to-date doxygen; if you google for the error message the type 'dirs' is not supported for the entry tag within a navindex! you will see that it might be due to a (relatively) recent doxygen update.

Yes, the error occurs with the upstream package as well. However, the "error" is more of a warning, as the documentation still builds in a useful way.

By the way, did you understand how the layout file is supposed to change? Because I didn't.

Last edited 3 years ago by SimonKing (previous) (diff)

comment:53 in reply to: ↑ 52 ; follow-up: Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to dimpase:

Are you able to build the docs in the original package?

Yes, I just tested it. I need to be in the top-level directory. I simply did

$ doxygen etc/Doxyfile

Then, html documentation was created in the ./doc/ folder, using all the files from the ./src/ folder.

So, I guess the question is how to make automake issue the doxygen command in the top-level directory of the source tree (NOT in src/, but one level above src/, and also not in the build/ directory).

IMHO, you don't need to tweak the autoconf stuff for this; you need to tweak the Doxyfile to have INPUT = specify the directories you need to process.

By the way, did you understand how the layout file is supposed to change? Because I didn't.

I've never used Doxygen...

comment:54 in reply to: ↑ 53 ; follow-up: Changed 3 years ago by SimonKing

Replying to dimpase:

IMHO, you don't need to tweak autoconf stuff for this, you need to tweak Doxyfile to have INPUT = specifying directories you need to process.

You are right! Upstream says INPUT = . (i.e. the current directory). But when I change it to

INPUT                  = $(SRCDIR)

then the html docs build fine.

However, the pdf documentation is not available. I guess that's because refman.tex is not automatically generated by doxygen (which I find strange).

Can you tell me how I can achieve that make uninstall removes the documentation?

Anyway: Next problem is solved! Now I can turn towards the test suite.

comment:55 in reply to: ↑ 54 Changed 3 years ago by SimonKing

Replying to SimonKing:

Can you tell me how I can achieve that make uninstall removes the documentation?

Another problem: So far I am unable to convince doxygen that it is NOT supposed to try to run LaTeX on the documentation. Also, I am unable to convince doxygen to create the documentation in a directory different from $(prefix)/doc.

I tried to put

DX_INIT_DOXYGEN([shared_meataxe], [$(top_srcdir)/etc/Doxyfile], [$(docdir)])

and was configuring with --docdir=..., but the html documentation was still built in doc/, not in the folder that I indicated by --docdir; however, the creation of a certain tag file in $(docdir) was attempted, which failed!

Any idea?

Last edited 3 years ago by SimonKing (previous) (diff)

comment:56 Changed 3 years ago by SimonKing

Very, very odd: Although the Doxyfile says GENERATE_LATEX='NO', autotools is still calling doxygen with GENERATE_LATEX='YES'.

comment:57 Changed 3 years ago by SimonKing

Solved it! Apparently DX_INIT_DOXYGEN needs to be called after the DX_..._FEATURE macros.
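So, roughly, the relevant part of configure.ac should be ordered like this (a sketch, with the features chosen as discussed above):

DX_DOXYGEN_FEATURE(ON)
DX_HTML_FEATURE(ON)
DX_PDF_FEATURE(OFF)
DX_PS_FEATURE(OFF)
DX_INIT_DOXYGEN([shared_meataxe], [$(top_srcdir)/etc/Doxyfile], [doc])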

comment:58 follow-up: Changed 3 years ago by SimonKing

There remain these problems: Documentation is not uninstalled and I cannot prescribe a directory name different from doc/.

Last edited 3 years ago by SimonKing (previous) (diff)

comment:59 in reply to: ↑ 58 ; follow-up: Changed 3 years ago by SimonKing

Replying to SimonKing:

There remain these problems: Documentation is not uninstalled

Still unsolved

and I cannot prescribe a directory name different from doc/.

Done: Define OUTPUT_DIRECTORY = $(DOCDIR) in Doxyfile, and then configure --docdir=... works.

comment:60 in reply to: ↑ 59 ; follow-up: Changed 3 years ago by dimpase

Replying to SimonKing:

Replying to SimonKing:

There remain these problems: Documentation is not uninstalled

Still unsolved

and I cannot prescribe a directory name different from doc/.

Done: Define OUTPUT_DIRECTORY = $(DOCDIR) in Doxyfile, and then configure --docdir=... works.

As far as I can see, the 3rd parameter of DX_INIT_DOXYGEN controls the output directory.

comment:61 in reply to: ↑ 60 Changed 3 years ago by SimonKing

Replying to dimpase:

As far as I can see, the 3rd parameter of DX_INIT_DOXYGEN controls the output directory.

I thought so. But in fact it seems that it only controls where certain "tag" files are put; the actual location of the documentation is determined by the OUTPUT_DIRECTORY variable defined in the Doxyfile.

Last edited 3 years ago by SimonKing (previous) (diff)

comment:62 Changed 3 years ago by SimonKing

For the record: I am now also fixing MeatAxe's documentation. Most doxygen sections are created incorrectly: the first word after @section is supposed to be a globally unique section label, if I understand correctly, but upstream just starts with what is supposed to be the section title, or starts with a non-unique label.

Also, they are still documenting a stand-alone program for the creation of multiplication tables that has been removed.

The more upstream bugs I fix, the more I think it is justified to say that I am forking MeatAxe.

comment:63 follow-up: Changed 3 years ago by SimonKing

I am now done with the docs and making some progress with getting the test suite to run.

Automake would run several test scripts in parallel, which I'm OK with. However, upstream doesn't seem to have thought of parallel tests: several test scripts write test data into local files WITH THE SAME NAME.

Solution: I want to create a sub-directory for each individual test. That sub-directory should be created with mkdir -p. However, mkdir -p is (if I understand correctly) not available on all supported platforms.

When I put AC_PROG_MKDIR_P into configure.ac, the variable MKDIR_P gets defined in the resulting Makefile. However, MKDIR_P is not visible to the test scripts.

So, what shall I do?

  • Use mkdir -p in my test scripts, although it may not be fully portable?
  • How can I obtain an environment variable for MKDIR_P that the test scripts can use?
Last edited 3 years ago by SimonKing (previous) (diff)

comment:64 in reply to: ↑ 63 Changed 3 years ago by SimonKing

Replying to SimonKing:

When I put AC_PROG_MKDIR_P into configure.ac, the variable MKDIR_P gets defined in the resulting Makefile. However, MKDIR_P is not visible to the test scripts.

I found something that works:

  • In my Makefile.am, I define check-local and let it create a little script that exports MKDIR_P
  • In the test scripts, I first source that script, and as a consequence the variable is then available.

Is there a more elegant solution, or is creating a custom temporary script fine?

comment:65 follow-up: Changed 3 years ago by dimpase

You can call $(MKDIR_P) in the makefile; why would you need to call it from your scripts? You can also export it in the makefile; why would you need a script for it?

However, it seems that you rather want to create temporary directories/files directly, make tests output to stdout, and redirect it to temporary files.

comment:66 in reply to: ↑ 65 ; follow-ups: Changed 3 years ago by SimonKing

Replying to dimpase:

You can call `$(MKDIR_P)$ in the makefile, why would you need to call it from your scripts? You can also export it in the makefile, why would you need a script for it?

Let's define what we mean by "scripts". In the first place, I am talking about "test scripts". They are essential for the test suite.

In order to be able to run tests in parallel, it is essential that each test script has its own directory in which it stores its results. Hence, my aim was that each test script creates such a directory, named test_$$, where as usual $$ provides the PID of a script.

Of course, the Makefile is unaware of the PIDs of the test scripts. Therefore, the Makefile cannot be made responsible for creating the directories. Consequently, I need mkdir -p to work IN THE TEST SCRIPTS.

Exporting MKDIR_P in the Makefile simply doesn't work. That was the first thing I tried, but the corresponding environment variable isn't visible for the test scripts.

Hence, my solution: Makefile creates a helper script that exports MKDIR_P (of course, Makefile is able to dump the value of that variable into the script!), and each test script sources the helper script. Problem solved.

Actually, I now have a SharedMeatAxe 1.0 creating a dynamic library and executables; it passes the full MeatAxe test suite, can create docs, make clean works fine, and make uninstall removes both the binaries, the library and the docs.

So, it basically is ready for publication.

However, it seems that you rather want to create temporary directories/files directly, make tests output to stdout, and redirect it to temporary files.

No. It is not about text written to stdout. The MeatAxe test suite is testing executables, that read input from data files, and write output into data files. So, during parallel tests, it must not happen that several tests have output data files of the same name. Hence, the need to use a temporary directory.

comment:67 in reply to: ↑ 66 Changed 3 years ago by SimonKing

Replying to SimonKing:

In order to be able to run tests in parallel, it is essential that each test script has its own directory in which it stores its results. Hence, my aim was that each test script creates such a directory, named test_$$, where as usual $$ provides the PID of a script.

Of course, the Makefile is unaware of the PIDs of the test scripts. Therefore, the Makefile cannot be made responsible for creating the directories. Consequently, I need mkdir -p to work IN THE TEST SCRIPTS.

Exporting MKDIR_P in the Makefile simply doesn't work. That was the first thing I tried, but the corresponding environment variable isn't visible for the test scripts.

Hence, my solution: Makefile creates a helper script that exports MKDIR_P (of course, Makefile is able to dump the value of that variable into the script!), and each test script sources the helper script. Problem solved.

Actually, I now have a SharedMeatAxe 1.0 creating a dynamic library and executables; it passes the full MeatAxe test suite, can create docs, make clean works fine, and make uninstall removes both the binaries, the library and the docs.

So, it basically is ready for publication.

However, it seems that you rather want to create temporary directories/files directly, make tests output to stdout, and redirect it to temporary files.

No. It is not about text written to stdout. The MeatAxe test suite is testing executables, that read input from data files, and write output into data files. So, during parallel tests, it must not happen that several tests have output data files of the same name. Hence, the need to use a temporary directory.

Or perhaps I can really proceed differently. The name scheme for the test scripts is t-0000, t-0001, etc. So, the Makefile could create unique directories test_0000, test_0001 etc, and then each script can cd into its designated directory and do its job.

However, how to do this most easily using autotools?

Currently, the test suite is implemented like this in Makefile.am:

TESTS                 = zzztest t-0100 t-0000 \
	t-0100 t-0100 t-0105 t-0106 t-0107 t-0108 t-0109 t-0110 t-0111 t-0112 \
	t-0113 t-0114 t-0115 t-0116 t-0117 t-0118 t-0200 t-0201 t-0210 t-0211 \
	t-0212 t-0213 t-0214 t-0215

check-local:
	@echo "CREATING ENV"
	@echo 'export MKDIR_P="@MKDIR_P@"' > MKDIR_P_ENV
	@chmod +x MKDIR_P_ENV

As you can see, make check first creates the MKDIR_P_ENV helper script, and then calls all the test scripts t-0100 etc., which in turn source the MKDIR_P_ENV helper script.
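For illustration, each test script then begins roughly like this (a sketch; the details of the real scripts differ):

#!/bin/sh
. ./MKDIR_P_ENV          # created by "make check"; exports MKDIR_P
$MKDIR_P test_$$         # private output directory, named after the PID
cd test_$$
# ... run the actual test, writing its data files here ...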

I will now try to work with a list of numbers that are both used to generate the list of TESTS and a list of test directories.

comment:68 Changed 3 years ago by SimonKing

Shouldn't the following work?

TestNumbers = 0100 0000 0100 0100 0105 0106 0107 0108 0109 0110 0111 0112 \
	0113 0114 0115 0116 0117 0118 0200 0201 0210 0211 0212 0213 0214 0215

TESTS                 = $(TestNumbers:%=t-%)

Well, it doesn't: in the end, TESTS is just the same as TestNumbers. Why?

In other words, given a list of strings separated by blanks, how can I prepend "t-" to each individual element of the list, in a Makefile?

comment:69 follow-up: Changed 3 years ago by SimonKing

I could work around it: I still don't know how to prepend something to the elements of a variable list; the above syntax, which works fine in a plain Makefile, doesn't work when used in Makefile.am.

However, it seems to be possible to APPEND something to the elements of a variable list. Hence, for test script t-xxxx I am now using the folder t-xxxx_out, with the following syntax in Makefile.am:

@$(MKDIR_P) $(TESTNUMBERS:=_out)

Again, the MeatAxe test suite passes, make clean works as it should, and I avoid the ugly hack of exporting MKDIR_P.

Now, I should finally address Sage: How do I tell Sage that ./sage -i meataxe is supposed to install shared_meataxe-1.0? Do I provide that information in checksum.ini?
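If I understand the layout correctly, build/pkgs/meataxe/ would then contain something like the following (the file is called checksums.ini, and the checksum values below are placeholders only), together with a package-version.txt saying 1.0 so that VERSION expands to 1.0:

tarball=shared_meataxe-VERSION.tar.gz
sha1=0000000000000000000000000000000000000000
md5=00000000000000000000000000000000
cksum=0000000000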

comment:70 Changed 3 years ago by SimonKing

Is it ok to always build and install the documentation if the package gets installed? Could be a problem if doxygen is missing. Or should it only be installed if SAGE_SPKG_INSTALL=yes?

comment:71 Changed 3 years ago by SimonKing

Is it fine to use rm -f in an install script, or is there some variable for it, such as $(RM) -f?

comment:72 in reply to: ↑ 69 Changed 3 years ago by jdemeyer

Replying to SimonKing:

Now, I should finally address Sage: How do I tell Sage that ./sage -i meataxe is supposed to install shared_meataxe-1.0? Do I provide that information in checksum.ini?

Yes.

Replying to SimonKing:

Is it ok to always build and install the documentation if the package gets installed?

Fine for me.

Could be a problem if doxygen is missing.

Then don't build the documentation.

Replying to SimonKing:

Is it fine to use rm -f in an install script

Yes.

comment:73 Changed 3 years ago by SimonKing

There is one test in sage.matrix.special that fails, but that's because of #23706. I opened #24445, but I think this shouldn't be a dependency: The failing test can be fixed without #24445.

comment:74 in reply to: ↑ 66 ; follow-up: Changed 3 years ago by jdemeyer

Replying to SimonKing:

Actually, I now have a SharedMeatAxe 1.0 creating a dynamic library and executables; it passes the full MeatAxe test suite, can create docs, make clean works fine, and make uninstall removes both the binaries, the library and the docs.

Don't forget make distcheck. That's the all-in-one autotools testsuite. If it passes that, you should be good to go.

comment:75 Changed 3 years ago by SimonKing

  • Branch set to u/SimonKing/turn_meataxe_into_a_dynamic_library

comment:76 Changed 3 years ago by SimonKing

  • Commit set to bf65465ed0561ec015d9bec2ee8c7fa1744ac51c
  • Description modified (diff)

New commits:

9e8caf1  Upgrade meataxe spkg to SharedMeatAxe 1.0 (dynamic library)
9690d79  Change sage.libs.meataxe initialisation
bf65465  Fix two failing tests

comment:77 in reply to: ↑ 74 ; follow-up: Changed 3 years ago by SimonKing

Replying to jdemeyer:

Replying to SimonKing:

Actually, I now have a SharedMeatAxe 1.0 creating a dynamic library and executables; it passes the full MeatAxe test suite, can create docs, make clean works fine, and make uninstall removes both the binaries, the library and the docs.

Don't forget make distcheck. That's the all-in-one autotools testsuite. If it passes that, you should be good to go.

It fails, but perhaps for an interesting reason: The test suite uses certain binaries. When the test suite is run during make distcheck, these binaries can't be found.

Why is that? Doesn't make distcheck do make install? Or do I need to append to PATH when running tests?

comment:78 in reply to: ↑ 77 Changed 3 years ago by SimonKing

  • Description modified (diff)

comment:79 follow-up: Changed 3 years ago by SimonKing

I can reproduce the problem with make distcheck as follows:

  • configure
  • make all
  • make check

When I instead do

  • configure
  • make install
  • make check

then things work.

I see that the missing binaries are built in src/ before being installed. So, is it a correct solution to add src/ to the PATH during the tests?
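Concretely, I am thinking of something like this in test/Makefile.am (untested sketch; with an older automake one would use TESTS_ENVIRONMENT instead):

# let the test scripts find the freshly built, not yet installed programs
AM_TESTS_ENVIRONMENT = PATH="$(abs_top_builddir)/src:$$PATH"; export PATH;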

comment:80 in reply to: ↑ 79 Changed 3 years ago by SimonKing

Replying to SimonKing:

So, is it a correct solution to add src/ to the PATH during the tests?

In any case, when I do so, make distcheck passes. So, I change the tarball accordingly.

comment:81 Changed 3 years ago by git

  • Commit changed from bf65465ed0561ec015d9bec2ee8c7fa1744ac51c to 168b297b7c03637281a6ce7574cac8d85a6688a5

Branch pushed to git repo; I updated commit sha1. New commits:

168b297  Replace shared_meataxe-1.0 by a package passing 'make distcheck'

comment:82 Changed 3 years ago by SimonKing

  • Description modified (diff)

For now, I have fixed make distcheck by appending to the path. It works...

Next, I will be running make test for Sage.
