
GNU Automake (version 1.16.1, 26 February 2018)

Permission is granted to copy, distribute and/or modify this
document under the terms of the GNU Free Documentation License,
Version 1.3 or any later version published by the Free Software
Foundation; with no Invariant Sections, with no Front-Cover texts,
and with no Back-Cover Texts. A copy of the license is included in
the section entitled “GNU Free Documentation License.”

15 Support for test suites


Automake can generate code to handle two kinds of test suites. One is
based on integration with the ‘dejagnu’ framework. The other (and most
used) form is based on the use of generic test scripts, and its
activation is triggered by the definition of the special ‘TESTS’
variable. This second form allows for various degrees of sophistication
and customization; in particular, it allows for concurrent execution of
test scripts, use of established test protocols such as TAP, and
definition of custom test drivers and test runners.

In either case, the testsuite is invoked via ‘make check’.

15.1 Generalities about Testing

The purpose of testing is to determine whether a program or system
behaves as expected (e.g., known inputs produce the expected outputs,
error conditions are correctly handled or reported, and older bugs do
not resurface).

The minimal unit of testing is usually called a test case, or simply
a test. How a test case is defined or delimited, and even what exactly
constitutes a test case, depends heavily on the testing paradigm
and/or framework in use, so we won’t attempt any more precise
definition. The set of the test cases for a given program or system
constitutes its testsuite.

A test harness (also testsuite harness) is a program or software
component that executes all (or part of) the defined test cases,
analyzes their outcomes, and reports or registers these outcomes
appropriately. Again, the details of how this is accomplished (and how
the developer and user can influence it or interface with it) vary
wildly, and we’ll attempt no precise definition.

A test is said to pass when it can determine that the condition or
behaviour it means to verify holds, and is said to fail when it can
determine that such condition or behaviour does not hold.

Sometimes, tests can rely on non-portable tools or prerequisites, or
simply make no sense on a given system (for example, a test checking a
Windows-specific feature makes no sense on a GNU/Linux system). In this
case, according to the definition above, the tests can neither be
considered passed nor failed; instead, they are skipped -- i.e., they
are not run, or their result is anyway ignored for what concerns the
count of failures and successes. Skips are usually explicitly reported
though, so that the user will be aware that not all of the testsuite has
really run.

It’s not uncommon, especially during early development stages, that
some tests fail for known reasons, and that the developer doesn’t want
to tackle these failures immediately (this is especially true when the
failing tests deal with corner cases). In this situation, the best
policy is to declare that each of those failures is an expected failure
(or xfail). In case a test that is expected to fail ends up passing
instead, many testing environments will flag the result as a special
kind of failure called unexpected pass (or xpass).

Many testing environments and frameworks distinguish between test
failures and hard errors. As we’ve seen, a test failure happens when
some invariant or expected behaviour of the software under test is not
met. A hard error happens when, e.g., the set-up of a test case
scenario fails, or when some other unexpected or highly undesirable
condition is encountered (for example, the program under test
experiences a segmentation fault).

15.2 Simple Tests

15.2.1 Scripts-based Testsuites

If the special variable ‘TESTS’ is defined, its value is taken to be a
list of programs or scripts to run in order to do the testing. Under
the appropriate circumstances, it’s possible for ‘TESTS’ to also list
data files to be passed to one or more test scripts defined by different
means (the so-called “log compilers”, *note Parallel Test Harness::).

Test scripts can be executed serially or concurrently. Automake
supports both these kinds of test execution, with the parallel test
harness being the default. The concurrent test harness relies on the
concurrency capabilities (if any) offered by the underlying ‘make’
implementation, and can thus only be as good as those are.

By default, only the exit statuses of the test scripts are considered
when determining the testsuite outcome. But Automake also allows the
use of more complex test protocols, either standard (*note Using the TAP
test protocol::) or custom (*note Custom Test Drivers::). Note that you
can’t enable such protocols when the serial harness is used, though. In
the rest of this section we are going to concentrate mostly on
protocol-less tests, since we cover test protocols in a later section
(again, *note Custom Test Drivers::).

When no test protocol is in use, an exit status of 0 from a test
script will denote a success, an exit status of 77 a skipped test, an
exit status of 99 a hard error, and any other exit status will denote a
failure.
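
For instance, a protocol-less test script can report its outcome
through its exit status alone. Here is a minimal sketch (the ‘frob’
tool and its options are invented for illustration):

 #!/bin/sh
 # Prerequisite missing: tell the harness this test was skipped.
 command -v frob >/dev/null 2>&1 || exit 77
 # Fixture setup broke unexpectedly: report a hard error.
 frob --init-test-fixture || exit 99
 # Exit status 0 means PASS, any other status means FAIL.
 test "`frob 2 2`" = 4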

You may define the variable ‘XFAIL_TESTS’ to a list of tests (usually
a subset of ‘TESTS’) that are expected to fail; this will effectively
reverse the result of those tests (with the provision that skips and
hard errors remain untouched). You may also instruct the testsuite
harness to treat hard errors like simple failures, by defining the
‘DISABLE_HARD_ERRORS’ make variable to a nonempty value.

Note however that, for tests based on more complex test protocols,
the exact effects of ‘XFAIL_TESTS’ and ‘DISABLE_HARD_ERRORS’ might
change, or they might even have no effect at all (for example, in tests
using TAP, there is no way to disable hard errors, and the
‘DISABLE_HARD_ERRORS’ variable has no effect on them).
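
For example (the test names here are invented), a ‘Makefile.am’ could
combine the two variables as follows:

 TESTS = basic.test corner-cases.test fuzz.test
 # corner-cases.test is known to fail for now: a failure there is
 # reported as XFAIL, and a pass as XPASS.
 XFAIL_TESTS = corner-cases.test
 # Treat hard errors (exit status 99) like simple failures.
 DISABLE_HARD_ERRORS = yes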

The result of each test case run by the scripts in ‘TESTS’ will be
printed on standard output, along with the test name. For test
protocols that allow more test cases per test script (such as TAP), a
number, identifier and/or brief description specific for the single test
case is expected to be printed in addition to the name of the test
script. The possible results (whose meanings should be clear from the
previous *note Generalities about Testing::) are ‘PASS’, ‘FAIL’, ‘SKIP’,
‘XFAIL’, ‘XPASS’ and ‘ERROR’. Here is an example of output from a
hypothetical testsuite that uses both plain and TAP tests:

 PASS: foo.sh
 PASS: zardoz.tap 1 - Daemon started
 PASS: zardoz.tap 2 - Daemon responding
 SKIP: zardoz.tap 3 - Daemon uses /proc # SKIP /proc is not mounted
 PASS: zardoz.tap 4 - Daemon stopped
 SKIP: bar.sh
 PASS: mu.tap 1
 XFAIL: mu.tap 2 # TODO frobnication not yet implemented

A testsuite summary (expected to report at least the number of run,
skipped and failed tests) will be printed at the end of the testsuite
run.

If the standard output is connected to a capable terminal, then the
test results and the summary are colored appropriately. The developer
and the user can disable colored output by setting the ‘make’ variable
‘AM_COLOR_TESTS=no’; the user can in addition force colored output even
without a connected terminal with ‘AM_COLOR_TESTS=always’. It’s also
worth noting that some ‘make’ implementations, when used in parallel
mode, have slightly different semantics (*note (autoconf)Parallel
make::), which can break the automatic detection of a connection to a
capable terminal. If this is the case, the user will have to resort to
the use of ‘AM_COLOR_TESTS=always’ in order to have the testsuite output
colorized.
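
For example, assuming the parallel harness is in use, the user could
force or suppress colorization from the command line:

 make check AM_COLOR_TESTS=always   # colorize even without a terminal
 make check AM_COLOR_TESTS=no       # never colorize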

Test programs that need data files should look for them in ‘srcdir’
(which is both a make variable and an environment variable made
available to the tests), so that they work when building in a separate
directory (*note Build Directories: (autoconf)Build Directories.), and
in particular for the ‘distcheck’ rule (*note Checking the
Distribution::).
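
For instance, a test script that compares the program output against
an expected-output file shipped with the sources (all names here are
hypothetical) might do:

 #!/bin/sh
 # Look for the data file in the source tree, so that the test also
 # works for VPATH builds and during 'make distcheck'.
 ./frob --dump >got.txt || exit 99
 diff -u "$srcdir/data/expected-output.txt" got.txt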

The ‘AM_TESTS_ENVIRONMENT’ and ‘TESTS_ENVIRONMENT’ variables can be
used to run initialization code and set environment variables for the
test scripts. The former variable is developer-reserved, and can be
defined in the ‘Makefile.am’, while the latter is reserved for the user,
who can employ it to extend or override the settings in the former;
for this to work portably, however, the contents of a non-empty
‘AM_TESTS_ENVIRONMENT’ must be terminated by a semicolon.

The ‘AM_TESTS_FD_REDIRECT’ variable can be used to define file
descriptor redirections for the test scripts. One might think that
‘AM_TESTS_ENVIRONMENT’ could be used for this purpose, but experience
has shown that doing so portably is practically impossible. The main
hurdle is constituted by Korn shells, which usually set the
close-on-exec flag on file descriptors opened with the ‘exec’ builtin,
thus rendering an idiom like ‘AM_TESTS_ENVIRONMENT = exec 9>&2;’
ineffectual. This issue also affects some Bourne shells, such as the
HP-UX’s ‘/bin/sh’. Here is an example of how these two variables can
be used in a ‘Makefile.am’:

 AM_TESTS_ENVIRONMENT = \
 ## Some environment initializations are kept in a separate shell
 ## file 'tests-env.sh', which can make it easier to also run tests
 ## from the command line.
   . $(srcdir)/tests-env.sh; \
 ## On Solaris, prefer more POSIX-compliant versions of the standard
 ## tools by default.
   if test -d /usr/xpg4/bin; then \
     PATH=/usr/xpg4/bin:$$PATH; export PATH; \
   fi;
 ## With this, the test scripts will be able to print diagnostic
 ## messages to the original standard error stream, even if the test
 ## driver redirects the stderr of the test scripts to a log file
 ## before executing them.
 AM_TESTS_FD_REDIRECT = 9>&2

Note however that ‘AM_TESTS_ENVIRONMENT’ is, for historical and
implementation reasons, not supported by the serial harness (*note
Serial Test Harness::).

Automake ensures that each file listed in ‘TESTS’ is built before it
is run; you can list both source and derived programs (or scripts) in
‘TESTS’; the generated rule will look both in ‘srcdir’ and ‘.’. For
instance, you might want to run a C program as a test. To do this you
would list its name in ‘TESTS’ and also in ‘check_PROGRAMS’, and then
specify it as you would any other program.

Programs listed in ‘check_PROGRAMS’ (and ‘check_LIBRARIES’,
‘check_LTLIBRARIES’…) are only built during ‘make check’, not during
‘make all’. You should list there any program needed by your tests that
does not need to be built by ‘make all’. Note that ‘check_PROGRAMS’ are
not automatically added to ‘TESTS’ because ‘check_PROGRAMS’ usually
lists programs used by the tests, not the tests themselves. Of course
you can set ‘TESTS = $(check_PROGRAMS)’ if all your programs are test
cases.
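
For example, a hypothetical setup mixing a compiled test, a helper
program, and a shell script could look like this:

 check_PROGRAMS = test-parser helper
 test_parser_SOURCES = test-parser.c
 helper_SOURCES = helper.c
 # helper is used by wrapper.test but is not a test itself, so it is
 # not listed in TESTS.
 TESTS = test-parser wrapper.test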

15.2.2 Older (and discouraged) serial test harness

First, note that today the use of this harness is strongly discouraged
in favour of the parallel test harness (*note Parallel Test Harness::).
Still, there are a few situations when the advantages offered by the
parallel harness are irrelevant, and when test concurrency can even
cause tricky problems. In those cases, it might make sense to still use
the serial harness, for simplicity and reliability (we still suggest
trying to give the parallel harness a shot though).

The serial test harness is enabled by the Automake option
‘serial-tests’. It operates by simply running the tests serially, one
at a time, without any I/O redirection. It’s up to the user to
implement logging of tests’ output, if that’s required or desired.

For historical and implementation reasons, the ‘AM_TESTS_ENVIRONMENT’
variable is not supported by this harness (it will be silently ignored
if defined); only ‘TESTS_ENVIRONMENT’ is, and it is to be considered a
developer-reserved variable. This is done so that, when using the
serial harness, ‘TESTS_ENVIRONMENT’ can be defined to an invocation of
an interpreter through which the tests are to be run. For instance, the
following setup may be used to run tests with Perl:

 TESTS_ENVIRONMENT = $(PERL) -Mstrict -w
 TESTS = foo.pl bar.pl baz.pl

It’s important to note that the use of ‘TESTS_ENVIRONMENT’ endorsed here
would be invalid with the parallel harness. That harness provides a
more elegant way to achieve the same effect, with the further benefit of
freeing the ‘TESTS_ENVIRONMENT’ variable for the user (*note Parallel
Test Harness::).

Another, less serious limitation of the serial harness is that it
doesn’t really distinguish between simple failures and hard errors; this
is due
to historical reasons only, and might be fixed in future Automake
versions.

15.2.3 Parallel Test Harness

By default, Automake generates a parallel (concurrent) test harness. It
features automatic collection of the test scripts output in ‘.log’
files, concurrent execution of tests with ‘make -j’, specification of
inter-test dependencies, lazy reruns of tests that have not completed in
a prior run, and hard errors for exceptional failures.

The parallel test harness operates by defining a set of ‘make’ rules
that run the test scripts listed in ‘TESTS’, and, for each such script,
save its output in a corresponding ‘.log’ file and its results (and
other “metadata”, *note API for Custom Test Drivers::) in a
corresponding ‘.trs’ (as in Test ReSults) file. The ‘.log’ file will
contain all the output emitted by the test on its standard output and
its standard error. The ‘.trs’ file will contain, among other
things, the results of the test cases run by the script.

The parallel test harness will also create a summary log file,
‘TEST_SUITE_LOG’, which defaults to ‘test-suite.log’ and requires a
‘.log’ suffix. This file depends upon all the ‘.log’ and ‘.trs’ files
created for the test scripts listed in ‘TESTS’.

As with the serial harness above, by default one status line is
printed per completed test, and a short summary after the suite has
completed. However, standard output and standard error of the test are
redirected to a per-test log file, so that parallel execution does not
produce intermingled output. The output from failed tests is collected
in the ‘test-suite.log’ file. If the variable ‘VERBOSE’ is set, this
file is output after the summary.

Each pair of ‘.log’ and ‘.trs’ files is created when the
corresponding test has completed. The set of log files is listed in the
read-only variable ‘TEST_LOGS’, and defaults to ‘TESTS’, with the
executable extension if any (*note EXEEXT::), as well as any suffix
listed in ‘TEST_EXTENSIONS’ removed, and ‘.log’ appended. Results are
undefined if a test file name ends in several concatenated suffixes.
‘TEST_EXTENSIONS’ defaults to ‘.test’; it can be overridden by the user,
in which case any extension listed in it must be constituted by a dot,
followed by a non-digit alphabetic character, followed by any number of
alphabetic characters. For example, ‘.sh’, ‘.T’ and ‘.t1’ are valid
extensions, while ‘.x-y’, ‘.6c’ and ‘.t.1’ are not.

It is important to note that, due to current limitations (unlikely to
be lifted), configure substitutions in the definition of ‘TESTS’ can
only work if they will expand to a list of tests that have a suffix
listed in ‘TEST_EXTENSIONS’.

For tests that match an extension ‘.EXT’ listed in ‘TEST_EXTENSIONS’,
you can provide a custom “test runner” using the variable
‘EXT_LOG_COMPILER’ (note the upper-case extension) and pass options in
‘AM_EXT_LOG_FLAGS’ and allow the user to pass options in
‘EXT_LOG_FLAGS’. It will cause all tests with this extension to be
called with this runner. For all tests without a registered extension,
the variables ‘LOG_COMPILER’, ‘AM_LOG_FLAGS’, and ‘LOG_FLAGS’ may be
used. For example,

 TESTS = foo.pl bar.py baz
 TEST_EXTENSIONS = .pl .py
 PL_LOG_COMPILER = $(PERL)
 AM_PL_LOG_FLAGS = -w
 PY_LOG_COMPILER = $(PYTHON)
 AM_PY_LOG_FLAGS = -v
 LOG_COMPILER = ./wrapper-script
 AM_LOG_FLAGS = -d

will invoke ‘$(PERL) -w foo.pl’, ‘$(PYTHON) -v bar.py’, and
‘./wrapper-script -d baz’ to produce ‘foo.log’, ‘bar.log’, and
‘baz.log’, respectively. The ‘foo.trs’, ‘bar.trs’ and ‘baz.trs’ files
will be automatically produced as a side-effect.

It’s important to note that, differently from what we’ve seen for the
serial test harness (*note Serial Test Harness::), the
‘AM_TESTS_ENVIRONMENT’ and ‘TESTS_ENVIRONMENT’ variables cannot be
used to define a custom test runner; the ‘LOG_COMPILER’ and ‘LOG_FLAGS’
(or their extension-specific counterparts) should be used instead:

 ## This is WRONG!
 AM_TESTS_ENVIRONMENT = PERL5LIB='$(srcdir)/lib' $(PERL) -Mstrict -w

 ## Do this instead.
 AM_TESTS_ENVIRONMENT = PERL5LIB='$(srcdir)/lib'; export PERL5LIB;
 LOG_COMPILER = $(PERL)
 AM_LOG_FLAGS = -Mstrict -w

By default, the test suite harness will run all tests, but there are
several ways to limit the set of tests that are run:

• You can set the ‘TESTS’ variable. For example, you can use a
command like this to run only a subset of the tests:

      env TESTS="foo.test bar.test" make -e check

 Note however that the command above will unconditionally overwrite
 the ‘test-suite.log’ file, thus clobbering the recorded results of
 any previous testsuite run.  This might be undesirable for packages
 whose testsuite takes long time to execute.  Luckily, this problem
 can easily be avoided by overriding also ‘TEST_SUITE_LOG’ at
 runtime; for example,

      env TEST_SUITE_LOG=partial.log TESTS="..." make -e check

 will write the result of the partial testsuite runs to the
 ‘partial.log’, without touching ‘test-suite.log’.

• You can set the ‘TEST_LOGS’ variable. By default, this variable is
computed at ‘make’ run time from the value of ‘TESTS’ as described
above. For example, you can use the following:

      set x subset*.log; shift
      env TEST_LOGS="foo.log $*" make -e check

 The comments made above about ‘TEST_SUITE_LOG’ overriding apply
 here too.

• By default, the test harness removes all old per-test ‘.log’ and
‘.trs’ files before it starts running tests to regenerate them.
The variable ‘RECHECK_LOGS’ contains the set of ‘.log’ (and, by
implication, ‘.trs’) files which are removed. ‘RECHECK_LOGS’
defaults to ‘TEST_LOGS’, which means all tests need to be
rechecked. By overriding this variable, you can choose which tests
need to be reconsidered. For example, you can lazily rerun only
those tests which are outdated, i.e., older than their prerequisite
test files, by setting this variable to the empty value:

      env RECHECK_LOGS= make -e check

• You can ensure that all tests are rerun which have failed or passed
unexpectedly, by running ‘make recheck’ in the test directory.
This convenience target will set ‘RECHECK_LOGS’ appropriately
before invoking the main test harness.

In order to guarantee an ordering between tests even with ‘make -jN’,
dependencies between the corresponding ‘.log’ files may be specified
through usual ‘make’ dependencies. For example, the following snippet
lets the test named ‘foo-execute.test’ depend upon completion of the
test ‘foo-compile.test’:

 TESTS = foo-compile.test foo-execute.test
 foo-execute.log: foo-compile.log

Please note that this ordering ignores the results of required tests,
thus the test ‘foo-execute.test’ is run even if the test
‘foo-compile.test’ failed or was skipped beforehand. Further, please
note that specifying such dependencies currently works only for tests
that end in one of the suffixes listed in ‘TEST_EXTENSIONS’.

Tests without such specified dependencies may be run concurrently
with parallel ‘make -jN’, so be sure they are prepared for concurrent
execution.

The combination of lazy test execution and correct dependencies
between tests and their sources may be exploited for efficient unit
testing during development. To further speed up the edit-compile-test
cycle, it may even be useful to specify compiled programs in
‘EXTRA_PROGRAMS’ instead of with ‘check_PROGRAMS’, as the former allows
intertwined compilation and test execution (but note that
‘EXTRA_PROGRAMS’ are not cleaned automatically, *note Uniform::).
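
A minimal sketch of such a setup (program and file names are
invented):

 # Built on demand when its test log is out of date, rather than all
 # at once by 'make check'; not cleaned automatically.
 EXTRA_PROGRAMS = unit-tests
 unit_tests_SOURCES = unit-tests.c
 TESTS = unit-tests

With this, ‘env RECHECK_LOGS= make -e check’ recompiles and reruns
only the tests whose programs (or other prerequisites) are newer than
their ‘.log’ files.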

The variables ‘TESTS’ and ‘XFAIL_TESTS’ may contain conditional parts
as well as configure substitutions. In the latter case, however,
certain restrictions apply: substituted test names must end with a
nonempty test suffix like ‘.test’, so that one of the inference rules
generated by ‘automake’ can apply. For literal test names, ‘automake’
can generate per-target rules to avoid this limitation.
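
For instance, assuming a ‘COND_BAR’ conditional and a ‘MORE_TESTS’
substitution are defined in ‘configure.ac’, and that ‘MORE_TESTS’
expands to names ending in ‘.test’:

 TESTS = foo.test
 if COND_BAR
 TESTS += bar.test
 endif
 # Substituted names must end in a suffix from TEST_EXTENSIONS.
 TESTS += $(MORE_TESTS)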

Please note that it is currently not possible to use ‘$(srcdir)/’ or
‘$(top_srcdir)/’ in the ‘TESTS’ variable. This technical limitation is
necessary to avoid generating test logs in the source tree and has the
unfortunate consequence that it is not possible to specify distributed
tests that are themselves generated by means of explicit rules, in a way
that is portable to all ‘make’ implementations (*note (autoconf)Make
Target Lookup::, the semantics of FreeBSD and OpenBSD ‘make’ conflict
with this). In case of doubt you may want to require the use of GNU
‘make’, or work around the issue with inference rules to generate the
tests.

15.3 Custom Test Drivers

15.3.1 Overview of Custom Test Drivers Support

Starting from Automake version 1.12, the parallel test harness allows
the package authors to use third-party custom test drivers, in case the
default ones are inadequate for their purposes, or do not support their
testing protocol of choice.

A custom test driver is expected to properly run the test programs
passed to it (including the command-line arguments passed to those
programs, if any), to analyze their execution and outcome, to create the
‘.log’ and ‘.trs’ files associated with these test runs, and to display
the test results on the console. It is the responsibility of the author
of the test driver to ensure that it implements all the above steps
meaningfully and correctly; Automake isn’t and can’t be of any help
here. On the other hand, the Automake-provided code for testsuite
summary generation offers support for test drivers allowing several test
results per test script, if they take care to register such results
properly (*note Log files generation and test results recording::).

The exact details of how test scripts’ results are to be determined
and analyzed are left to the individual drivers. Some drivers might only
consider the test script exit status (this is done for example by the
default test driver used by the parallel test harness, described in the
previous section). Other drivers might implement more complex and
advanced test protocols, which might require them to parse and
interpret the output emitted by the test script they’re running
(examples of such protocols are TAP and SubUnit).

It’s very important to note that, even when using custom test
drivers, most of the infrastructure described in the previous section
about the parallel harness remains in place; this includes:

• list of test scripts defined in ‘TESTS’, and overridable at runtime
through the redefinition of ‘TESTS’ or ‘TEST_LOGS’;
• concurrency through the use of ‘make’’s option ‘-j’;
• per-test ‘.log’ and ‘.trs’ files, and generation of a summary
‘.log’ file from them;
• ‘recheck’ target, ‘RECHECK_LOGS’ variable, and lazy reruns of
tests;
• inter-test dependencies;
• support for ‘check_*’ variables (‘check_PROGRAMS’,
‘check_LIBRARIES’, …);
• use of ‘VERBOSE’ environment variable to get verbose output on
testsuite failures;
• definition and honoring of ‘TESTS_ENVIRONMENT’,
‘AM_TESTS_ENVIRONMENT’ and ‘AM_TESTS_FD_REDIRECT’ variables;
• definition of generic and extension-specific ‘LOG_COMPILER’ and
‘LOG_FLAGS’ variables.

On the other hand, the exact semantics of how (and if) testsuite output
colorization, ‘XFAIL_TESTS’, and hard errors are supported and handled
is left to the individual test drivers.

15.3.2 Declaring Custom Test Drivers

Custom testsuite drivers are declared by defining the make variables
‘LOG_DRIVER’ or ‘EXT_LOG_DRIVER’ (where EXT must be declared in
‘TEST_EXTENSIONS’). They must be defined to programs or scripts that
will be used to drive the execution, logging, and outcome report of the
tests with corresponding extensions (or of those with no registered
extension in the case of ‘LOG_DRIVER’). Clearly, multiple distinct test
drivers can be declared in the same ‘Makefile.am’. Note moreover that
the ‘LOG_DRIVER’ variables are not a substitute for the ‘LOG_COMPILER’
variables: the two sets of variables can, and often do, usefully and
legitimately coexist.

The developer-reserved variable ‘AM_LOG_DRIVER_FLAGS’ and the
user-reserved variable ‘LOG_DRIVER_FLAGS’ can be used to define flags
that will be passed to each invocation of ‘LOG_DRIVER’, with the
user-defined flags obviously taking precedence over the
developer-reserved ones. Similarly, for each extension EXT declared in
‘TEST_EXTENSIONS’, flags listed in ‘AM_EXT_LOG_DRIVER_FLAGS’ and
‘EXT_LOG_DRIVER_FLAGS’ will be passed to invocations of
‘EXT_LOG_DRIVER’.
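
For example, a hypothetical ‘Makefile.am’ that ships its own driver as
‘build-aux/my-driver’ for tests with a ‘.chk’ suffix might contain
(the driver name and its ‘--verbose’ flag are made up):

 TEST_EXTENSIONS = .chk
 TESTS = alpha.chk beta.chk
 CHK_LOG_DRIVER = $(SHELL) $(top_srcdir)/build-aux/my-driver
 AM_CHK_LOG_DRIVER_FLAGS = --verbose
 # A runner can coexist with the driver: the harness passes the runner
 # command line to the driver as the program to execute.
 CHK_LOG_COMPILER = $(PYTHON)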

15.3.3 API for Custom Test Drivers

Note that the APIs described here are still highly experimental, and
will very likely undergo tightenings and likely also extensive changes
in the future, to accommodate new features or to satisfy additional
portability requirements.

The main characteristic of these APIs is that they are designed to
share as much infrastructure, semantics, and implementation details as
possible with the parallel test harness and its default driver.

15.3.3.1 Command-line arguments for test drivers
................................................

A custom driver can rely on various command-line options and arguments
being passed to it automatically by the Automake-generated test harness.
It is mandatory that it understands all of them (even if the exact
interpretation of the associated semantics can legitimately vary from
one test driver to another, and even be a no-op in some drivers).

Here is the list of options:

‘--test-name=NAME’
The name of the test, with VPATH prefix (if any) removed. This can
have a suffix and a directory component (as in e.g.,
‘sub/foo.test’), and is mostly meant to be used in console reports
about testsuite advancements and results (*note Testsuite progress
output::).
‘--log-file=PATH.log’
The ‘.log’ file the test driver must create (*note Basics of test
metadata::). If it has a directory component (as in e.g.,
‘sub/foo.log’), the test harness will ensure that such directory
exists before the test driver is called.
‘--trs-file=PATH.trs’
The ‘.trs’ file the test driver must create (*note Basics of test
metadata::). If it has a directory component (as in e.g.,
‘sub/foo.trs’), the test harness will ensure that such directory
exists before the test driver is called.
‘--color-tests={yes|no}’
Whether the console output should be colorized or not (*note Simple
tests and color-tests::, to learn when this option gets activated
and when it doesn’t).
‘--expect-failure={yes|no}’
Whether the tested program is expected to fail.
‘--enable-hard-errors={yes|no}’
Whether “hard errors” in the tested program should be treated
differently from normal failures or not (the default should be
‘yes’). The exact meaning of “hard error” is highly dependent on
the test protocols or conventions in use.
‘--’
Explicitly terminate the list of options.

The first non-option argument passed to the test driver is the program
to be run, and all the following ones are command-line options and
arguments for this program.

Note that the exact semantics attached to the ‘--color-tests’,
‘--expect-failure’ and ‘--enable-hard-errors’ options are left up to the
individual test drivers. Still, having a behaviour compatible or at
least similar to that provided by the default driver is advised, as that
would offer better consistency and a more pleasant user experience.

15.3.3.2 Log files generation and test results recording
........................................................

The test driver must correctly generate the files specified by the
‘--log-file’ and ‘--trs-file’ options (even when the tested program
fails or crashes).

The ‘.log’ file should ideally contain all the output produced by the
tested program, plus optionally other information that might facilitate
debugging or analysis of bug reports. Apart from that, its format is
basically free.

The ‘.trs’ file is used to register some metadata through the use of
custom reStructuredText fields. This metadata is expected to be
employed in various ways by the parallel test harness; for example, to
count the test results when printing the testsuite summary, or to decide
which tests to re-run upon ‘make recheck’. Unrecognized metadata in a
‘.trs’ file is currently ignored by the harness, but this might change
in the future. The list of currently recognized metadata follows.

‘:test-result:’
The test driver must use this field to register the results of
each test case run by a test script file. Several
‘:test-result:’ fields can be present in the same ‘.trs’ file; this
is done in order to support test protocols that allow a single test
script to run several test cases.

 The only recognized test results are currently ‘PASS’, ‘XFAIL’,
 ‘SKIP’, ‘FAIL’, ‘XPASS’ and ‘ERROR’.  These results, when declared
 with ‘:test-result:’, can be optionally followed by text holding
 the name and/or a brief description of the corresponding test; the
 harness will ignore such extra text when generating
 ‘test-suite.log’ and preparing the testsuite summary.

‘:recheck:’
If this field is present and defined to ‘no’, then the
corresponding test script will not be run upon a ‘make recheck’.
What happens when two or more ‘:recheck:’ fields are present in the
same ‘.trs’ file is undefined behaviour.

‘:copy-in-global-log:’
If this field is present and defined to ‘no’, then the content of
the ‘.log’ file will not be copied into the global
‘test-suite.log’. We allow such copying to be forgone because, while
it can be useful in debugging and in the analysis of bug reports, it can
also be just a waste of space in normal situations, e.g., when a
test script is successful. What happens when two or more
‘:copy-in-global-log:’ fields are present in the same ‘.trs’ file
is undefined behaviour.

‘:test-global-result:’
This is used to declare the “global result” of the script.
Currently, the value of this field is needed only to be reported
(more or less verbatim) in the generated global log file
‘$(TEST_SUITE_LOG)’, so it’s quite free-form. For example, a test
script which runs 10 test cases, 6 of which pass and 4 of which are
skipped, could reasonably have a ‘PASS/SKIP’ value for this field,
while a test script which runs 19 successful tests and one failed
test could have an ‘ALMOST PASSED’ value. What happens when two or
more ‘:test-global-result:’ fields are present in the same ‘.trs’
file is undefined behaviour.

Let’s see a small example. Assume a ‘.trs’ file contains the following
lines:

 :test-result: PASS server starts
 :copy-in-global-log: no
 :test-result: PASS HTTP/1.1 request
 :test-result: FAIL HTTP/1.0 request
 :recheck: yes
 :test-result: SKIP HTTPS request (TLS library wasn't available)
 :test-result: PASS server stops

Then the corresponding test script will be re-run by ‘make recheck’,
will contribute five test results to the testsuite summary (three of
these tests being successful, one failed, and one skipped), and the
content of the corresponding ‘.log’ file will not be copied into the
global log file ‘test-suite.log’.

15.3.3.3 Testsuite progress output
..................................

A custom test driver also has the task of displaying, on the standard
output, the test results as soon as they become available. Depending on
the protocol in use, it can also display the reasons for failures and
skips, and, more generally, any useful diagnostic output (but remember
that each line on the screen is precious, so that cluttering the screen
with overly verbose information is a bad idea). The exact format of this
progress output is left up to the test driver; in fact, a custom test
driver might theoretically even decide not to do any such report,
leaving it all to the testsuite summary (that would be a very lousy
idea, of course, and serves only to illustrate the flexibility that is
granted here).

Remember that consistency is good; so, if possible, try to be
consistent with the output of the built-in Automake test drivers,
providing a similar “look & feel”. In particular, the testsuite
progress output should be colorized when the ‘--color-tests’ option is
passed to the driver. On the other hand, if you are using a known and
widespread test protocol with well-established implementations, being
consistent with those implementations’ output might be a good idea too.

15.4 Using the TAP test protocol

15.4.1 Introduction to TAP

TAP, the Test Anything Protocol, is a simple text-based interface
between testing modules or programs and a test harness. The tests (also
called “TAP producers” in this context) write test results in a simple
format on standard output; a test harness (also called “TAP consumer”)
will parse and interpret these results, and properly present them to the
user, and/or register them for later analysis. The exact details of how
this is accomplished can vary among different test harnesses. The
Automake harness will present the results on the console in the usual
fashion (*note Testsuite progress on console::), and will use the ‘.trs’
files (*note Basics of test metadata::) to store the test results and
related metadata. Apart from that, it will try to remain as much
compatible as possible with pre-existing and widespread utilities, such
as the ‘prove’ utility
(http://search.cpan.org/~andya/Test-Harness/bin/prove), at least for the
simpler usages.

TAP started its life as part of the test harness for Perl, but today
it has been (mostly) standardized, and has various independent
implementations in different languages; among them, C, C++, Perl,
Python, PHP, and Java. For a semi-official specification of the TAP
protocol, please refer to the documentation of ‘Test::Harness::TAP’
(http://search.cpan.org/~petdance/Test-Harness/lib/Test/Harness/TAP.pod).

The most relevant real-world usages of TAP are obviously in the
testsuites of ‘perl’ and of many perl modules. Still, other important
third-party packages, such as ‘git’ (http://git-scm.com/), also use TAP
in their testsuites.

15.4.2 Use TAP with the Automake test harness

Currently, the TAP driver that comes with Automake requires some by-hand
steps on the developer’s part (this situation should hopefully be
improved in future Automake versions). You’ll have to grab the
‘tap-driver.sh’ script from the Automake distribution by hand, copy it
in your source tree, and use the Automake support for third-party test
drivers to instruct the harness to use the ‘tap-driver.sh’ script and
the awk program found by ‘AM_INIT_AUTOMAKE’ to run your TAP-producing
tests. See the example below for clarification.

Apart from the options common to all the Automake test drivers (*note
Command-line arguments for test drivers::), the ‘tap-driver.sh’ supports
the following options, whose names are chosen for enhanced compatibility
with the ‘prove’ utility.

‘--ignore-exit’
Causes the test driver to ignore the exit status of the test
scripts; by default, the driver will report an error if the script
exits with a non-zero status. This option also has effect on
non-zero exit statuses due to termination by a signal.
‘--comments’
Instruct the test driver to display TAP diagnostic (i.e., lines
beginning with the ‘#’ character) in the testsuite progress output
too; by default, TAP diagnostic is only copied to the ‘.log’ file.
‘--no-comments’
Revert the effects of ‘--comments’.
‘--merge’
Instruct the test driver to merge the test scripts’ standard error
into their standard output. This is necessary if you want to
ensure that diagnostics from the test scripts are displayed in the
correct order relative to test results; this can be of great help
in debugging (especially if your test scripts are shell scripts run
with shell tracing active). As a downside, this option might cause
the test harness to get confused if anything that appears on
standard error looks like a test result.
‘--no-merge’
Revert the effects of ‘--merge’.
‘--diagnostic-string=STRING’
Change the string that introduces TAP diagnostic from the default
value of “‘#’” to ‘STRING’. This can be useful if your TAP-based
test scripts produce verbose output on which they have limited
control (because, say, the output comes from other tools invoked in
the scripts), and it might contain text that gets spuriously
interpreted as TAP diagnostic: such an issue can be solved by
redefining the string that activates TAP diagnostic to a value you
know won’t appear by chance in the tests’ output. Note however
that this feature is non-standard, as the “official” TAP protocol
does not allow for such a customization; so don’t use it if you can
avoid it.

Here is an example of how the TAP driver can be set up and used.

 % cat configure.ac
 AC_INIT([GNU Try Tap], [1.0], [bug-automake@gnu.org])
 AC_CONFIG_AUX_DIR([build-aux])
 AM_INIT_AUTOMAKE([foreign -Wall -Werror])
 AC_CONFIG_FILES([Makefile])
 AC_REQUIRE_AUX_FILE([tap-driver.sh])
 AC_OUTPUT

 % cat Makefile.am
 TEST_LOG_DRIVER = env AM_TAP_AWK='$(AWK)' $(SHELL) \
                   $(top_srcdir)/build-aux/tap-driver.sh
 TESTS = foo.test bar.test baz.test
 EXTRA_DIST = $(TESTS)

 % cat foo.test
 #!/bin/sh
 echo 1..4 # Number of tests to be executed.
 echo 'ok 1 - Swallows fly'
 echo 'not ok 2 - Caterpillars fly # TODO metamorphosis in progress'
 echo 'ok 3 - Pigs fly # SKIP not enough acid'
 echo '# I just love word plays ...'
 echo 'ok 4 - Flies fly too :-)'

 % cat bar.test
 #!/bin/sh
 echo 1..3
 echo 'not ok 1 - Bummer, this test has failed.'
 echo 'ok 2 - This passed though.'
 echo 'Bail out! Ennui kicking in, sorry...'
 echo 'ok 3 - This will not be seen.'

 % cat baz.test
 #!/bin/sh
 echo 1..1
 echo ok 1
 # Exit with error, even if all the tests have been successful.
 exit 7

 % cp PREFIX/share/automake-APIVERSION/tap-driver.sh .
 % autoreconf -vi && ./configure && make check
 ...
 PASS: foo.test 1 - Swallows fly
 XFAIL: foo.test 2 - Caterpillars fly # TODO metamorphosis in progress
 SKIP: foo.test 3 - Pigs fly # SKIP not enough acid
 PASS: foo.test 4 - Flies fly too :-)
 FAIL: bar.test 1 - Bummer, this test has failed.
 PASS: bar.test 2 - This passed though.
 ERROR: bar.test - Bail out! Ennui kicking in, sorry...
 PASS: baz.test 1
 ERROR: baz.test - exited with status 7
 ...
 Please report to bug-automake@gnu.org
 ...
 % echo exit status: $?
 exit status: 1

 % env TEST_LOG_DRIVER_FLAGS='--comments --ignore-exit' \
       TESTS='foo.test baz.test' make -e check
 ...
 PASS: foo.test 1 - Swallows fly
 XFAIL: foo.test 2 - Caterpillars fly # TODO metamorphosis in progress
 SKIP: foo.test 3 - Pigs fly # SKIP not enough acid
 # foo.test: I just love word plays...
 PASS: foo.test 4 - Flies fly too :-)
 PASS: baz.test 1
 ...
 % echo exit status: $?
 exit status: 0

15.4.3 Incompatibilities with other TAP parsers and drivers

For implementation or historical reasons, the TAP driver and harness as
implemented by Automake have some minor incompatibilities with the
mainstream versions, which you should be aware of.

• A ‘Bail out!’ directive doesn’t stop the whole testsuite, but only
the test script it occurs in. This doesn’t follow TAP
specifications, but on the other hand it maximizes compatibility
(and code sharing) with the “hard error” concept of the default
testsuite driver.
• The ‘version’ and ‘pragma’ directives are not supported.
• The ‘--diagnostic-string’ option of our driver makes it possible to
modify the string that introduces TAP diagnostic from the default
value of “‘#’”. The standard TAP protocol has currently no way to
allow this, so if you use it your diagnostic will be lost to more
compliant tools like ‘prove’ and ‘Test::Harness’.
• And there are probably some other small and yet undiscovered
incompatibilities, especially in corner cases or with rare usages.

Here are some links to more extensive official or third-party
documentation and resources about the TAP protocol and related tools and
libraries.
• ‘Test::Harness::TAP’
(http://search.cpan.org/~petdance/Test-Harness/lib/Test/Harness/TAP.pod),
the (mostly) official documentation about the TAP format and
protocol.
• ‘prove’ (http://search.cpan.org/~andya/Test-Harness/bin/prove),
the most famous command-line TAP test driver, included in the
distribution of ‘perl’ and ‘Test::Harness’
(http://search.cpan.org/~andya/Test-Harness/lib/Test/Harness.pm).
• The TAP wiki (http://testanything.org/wiki/index.php/Main_Page).
• A “gentle introduction” to testing for perl coders:
‘Test::Tutorial’
(http://search.cpan.org/dist/Test-Simple/lib/Test/Tutorial.pod).
• ‘Test::Simple’
(http://search.cpan.org/~mschwern/Test-Simple/lib/Test/Simple.pm)
and ‘Test::More’
(http://search.cpan.org/~mschwern/Test-Simple/lib/Test/More.pm),
the standard perl testing libraries, which are based on TAP.
• C TAP Harness
(http://www.eyrie.org/~eagle/software/c-tap-harness/), a C-based
project implementing both a TAP producer and a TAP consumer.
• tap4j (http://www.tap4j.org/), a Java-based project implementing
both a TAP producer and a TAP consumer.

15.5 DejaGnu Tests

If ‘dejagnu’ (https://ftp.gnu.org/gnu/dejagnu/) appears in
‘AUTOMAKE_OPTIONS’, then a ‘dejagnu’-based test suite is assumed. The
variable ‘DEJATOOL’ is a list of names that are passed, one at a time,
as the ‘--tool’ argument to ‘runtest’ invocations; it defaults to the
name of the package.

The variable ‘RUNTESTDEFAULTFLAGS’ holds the ‘--tool’ and ‘--srcdir’
flags that are passed to dejagnu by default; this can be overridden if
necessary.

The variables ‘EXPECT’ and ‘RUNTEST’ can also be overridden to
provide project-specific values. For instance, you will need to do this
if you are testing a compiler toolchain, because the default values do
not take into account host and target names.

The contents of the variable ‘RUNTESTFLAGS’ are passed to the
‘runtest’ invocation. This is considered a “user variable” (*note User
Variables::). If you need to set ‘runtest’ flags in ‘Makefile.am’, you
can use ‘AM_RUNTESTFLAGS’ instead.

Automake will generate rules to create a local ‘site.exp’ file,
defining various variables detected by ‘configure’. This file is
automatically read by DejaGnu. It is OK for the user of a package to
edit this file in order to tune the test suite. However this is not the
place where the test suite author should define new variables: this
should be done elsewhere in the real test suite code. In particular,
‘site.exp’ should not be distributed.

Still, if the package author has legitimate reasons to extend
‘site.exp’ at ‘make’ time, he can do so by defining the variable
‘EXTRA_DEJAGNU_SITE_CONFIG’; the files listed there will be considered
‘site.exp’ prerequisites, and their content will be appended to it (in
the same order in which they appear in ‘EXTRA_DEJAGNU_SITE_CONFIG’).
Note that these files are not distributed by default.
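
As a sketch, a ‘Makefile.am’ for the DejaGnu testsuite of a
hypothetical tool ‘frob’ (file names invented) could contain:

 AUTOMAKE_OPTIONS = dejagnu
 DEJATOOL = frob
 # Developer-reserved runtest flags; RUNTESTFLAGS is left to the user.
 AM_RUNTESTFLAGS = --all
 # Fragments appended to the generated site.exp; distribute them
 # explicitly, since they are not distributed by default.
 EXTRA_DEJAGNU_SITE_CONFIG = local-site.exp
 EXTRA_DIST = local-site.exp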

For more information regarding DejaGnu test suites, see *note
(dejagnu)Top::.

15.6 Install Tests

The ‘installcheck’ target is available to the user as a way to run any
tests after the package has been installed. You can add tests to this
by writing an ‘installcheck-local’ rule.
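
For instance, a minimal sketch (assuming the package installs a
‘hello’ program) might be:

 # Smoke-test the installed program after 'make install'.
 installcheck-local:
 	$(bindir)/hello --version >/dev/null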


8 Building Programs and Libraries


A large part of Automake functionality is dedicated to making it easy
to build programs and libraries.

8.1 Building a program
======================

To build a program, you need to tell Automake which sources it is built
from, and which libraries it should be linked with.

8.1.1 Defining program sources


In a directory containing source that gets built into a program (as
opposed to a library or a script), the PROGRAMS primary is used.
Programs can be installed in bindir, sbindir, libexecdir,
pkglibexecdir, or not at all (noinst_). They can also be built only
for make check, in which case the prefix is check_.

For instance:

 bin_PROGRAMS = hello

In this simple case, the resulting Makefile.in will contain code to
generate a program named hello.

Associated with each program are several assisting variables that are
named after the program. These variables are all optional, and have
reasonable defaults. Each variable, its use, and default is spelled out
below; we use the “hello” example throughout.

The variable hello_SOURCES is used to specify which source files
get built into an executable:

 hello_SOURCES = hello.c version.c getopt.c getopt1.c getopt.h system.h

This causes each mentioned .c file to be compiled into the
corresponding .o. Then all are linked to produce hello.

If hello_SOURCES is not specified, then it defaults to the single
file hello.c.

Multiple programs can be built in a single directory. Multiple
programs can share a single source file, which must be listed in each
_SOURCES definition.

Header files listed in a _SOURCES definition will be included in
the distribution but otherwise ignored. In case it isn't obvious, you
should not include the header file generated by configure in a _SOURCES
variable; this file should not be distributed. Lex (.l) and Yacc (.y)
files can also be listed; see *note Yacc and Lex::.

8.1.2 Linking the program


If you want to link against libraries that configure does not know
about, you can use LDADD. This variable is used to specify the object
files or libraries to link with; if you need to pass additional linker
flags, use AM_LDFLAGS.

Sometimes, multiple programs are built in one directory but do not
share the same link-time requirements. In this case, you can use the
PROG_LDADD variable (where PROG is the name of the program as it
appears in some _PROGRAMS variable, and usually written in lowercase)
to override LDADD. If this variable exists for a given program, then
that program is not linked using LDADD.

For instance, in GNU cpio, pax, cpio and mt are linked against
the library libcpio.a. However, rmt is built in the same directory,
and has no such link requirement. Also, mt and rmt are only built
on certain architectures. Here is what cpio's src/Makefile.am looks
like (abridged):

 bin_PROGRAMS = cpio pax $(MT)
 libexec_PROGRAMS = $(RMT)
 EXTRA_PROGRAMS = mt rmt

 LDADD = ../lib/libcpio.a $(INTLLIBS)
 rmt_LDADD =

 cpio_SOURCES = ...
 pax_SOURCES = ...
 mt_SOURCES = ...
 rmt_SOURCES = ...

PROG_LDADD is inappropriate for passing program-specific linker
flags (except for -l, -L, -dlopen and -dlpreopen). So, use the
PROG_LDFLAGS variable for this purpose.
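
For instance (using the manual's maude placeholder; the flags are only
illustrative):

 bin_PROGRAMS = maude
 maude_SOURCES = maude.c
 # Libraries and object files to link in go in maude_LDADD ...
 maude_LDADD = libgnu.a
 # ... while program-specific linker flags go in maude_LDFLAGS.
 maude_LDFLAGS = -static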

It is also occasionally useful to have a program depend on some other
target that is not actually part of that program. This can be done
using either the PROG_DEPENDENCIES or the EXTRA_PROG_DEPENDENCIES
variable. Each program depends on the contents of both variables, but no
further interpretation is done.

Since these dependencies are associated with the link rule used to
create the programs, they should normally list files used by the link
command. That is *.$(OBJEXT), *.a, or *.la files. In rare cases
you may need to add other kinds of files such as linker scripts, but
listing a source file in _DEPENDENCIES is wrong. If some source
file needs to be built before all the components of a program are built,
consider using the BUILT_SOURCES variable instead (*note Sources::).

If PROG_DEPENDENCIES is not supplied, it is computed by Automake.
The automatically-assigned value is the contents of PROG_LDADD, with
most configure substitutions, -l, -L, -dlopen and -dlpreopen
options removed. The configure substitutions that are left in are only
$(LIBOBJS) and $(ALLOCA); these are left because it is known that
they will not cause an invalid value for PROG_DEPENDENCIES to be
generated.

The EXTRA_PROG_DEPENDENCIES variable may be useful for cases where
you merely want to augment the automake-generated PROG_DEPENDENCIES
rather than replacing it.
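
A small sketch (the linker script is hypothetical):

 bin_PROGRAMS = maude
 maude_LDADD = libutil.a
 # In addition to the dependencies computed from maude_LDADD, relink
 # maude whenever the linker script maude.ld changes.
 EXTRA_maude_DEPENDENCIES = maude.ld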

We recommend that you avoid using -l options in LDADD or
PROG_LDADD when referring to libraries built by your package.
Instead, write the file name of the library explicitly as in the above
cpio example. Use -l only to list third-party libraries. If you
follow this rule, the default value of PROG_DEPENDENCIES will list all
your local libraries and omit the other ones.

8.1.3 Conditional compilation of sources


You can't put a configure substitution (e.g., @FOO@ or $(FOO), where
FOO is defined via AC_SUBST) into a _SOURCES variable. The reason for
this is a bit hard to explain, but suffice it to say that it simply
won't work. Automake will give an error if you try to do this.

Fortunately there are two other ways to achieve the same result. One
is to use configure substitutions in _LDADD variables, the other is to
use an Automake conditional.

Conditional Compilation using _LDADD Substitutions
..................................................

Automake must know all the source files that could possibly go into a
program, even if not all the files are built in every circumstance. Any
files that are only conditionally built should be listed in the
appropriate EXTRA_ variable. For instance, if hello-linux.c or
hello-generic.c were conditionally included in hello, the
Makefile.am would contain:

 bin_PROGRAMS = hello
 hello_SOURCES = hello-common.c
 EXTRA_hello_SOURCES = hello-linux.c hello-generic.c
 hello_LDADD = $(HELLO_SYSTEM)
 hello_DEPENDENCIES = $(HELLO_SYSTEM)

You can then set up the $(HELLO_SYSTEM) substitution from
configure.ac:

 ...
 case $host in
   *linux*) HELLO_SYSTEM='hello-linux.$(OBJEXT)' ;;
   *)       HELLO_SYSTEM='hello-generic.$(OBJEXT)' ;;
 esac
 AC_SUBST([HELLO_SYSTEM])
 ...

In this case, the variable HELLO_SYSTEM should be replaced by
either hello-linux.o or hello-generic.o, and added to both
hello_DEPENDENCIES and hello_LDADD in order to be built and linked
in.

Conditional Compilation using Automake Conditionals
...................................................

An often simpler way to compile source files conditionally is to use
Automake conditionals. For instance, you could use this Makefile.am
construct to build the same hello example:

 bin_PROGRAMS = hello
 if LINUX
 hello_SOURCES = hello-linux.c hello-common.c
 else
 hello_SOURCES = hello-generic.c hello-common.c
 endif

In this case, configure.ac should set up the LINUX conditional
using AM_CONDITIONAL (*note Conditionals::).

When using conditionals like this you don't need to use the EXTRA_
variable, because Automake will examine the contents of each variable to
construct the complete list of source files.

If your program uses a lot of files, you will probably prefer a
conditional +=.

 bin_PROGRAMS = hello
 hello_SOURCES = hello-common.c
 if LINUX
 hello_SOURCES += hello-linux.c
 else
 hello_SOURCES += hello-generic.c
 endif

8.1.4 Conditional compilation of programs


Sometimes it is useful to determine the programs that are to be built at
configure time. For instance, GNU cpio only builds mt and rmt
under special circumstances. The means to achieve conditional
compilation of programs are the same you can use to compile source files
conditionally: substitutions or conditionals.

Conditional Programs using configure Substitutions
..................................................

In this case, you must notify Automake of all the programs that can
possibly be built, but at the same time cause the generated
Makefile.in to use the programs specified by configure. This is
done by having configure substitute values into each _PROGRAMS
definition, while listing all optionally built programs in
EXTRA_PROGRAMS.

 bin_PROGRAMS = cpio pax $(MT)
 libexec_PROGRAMS = $(RMT)
 EXTRA_PROGRAMS = mt rmt

As explained in *note EXEEXT::, Automake will rewrite bin_PROGRAMS,
libexec_PROGRAMS, and EXTRA_PROGRAMS, appending $(EXEEXT) to each
binary. Obviously it cannot rewrite values obtained at run-time through
configure substitutions, therefore you should take care of appending
$(EXEEXT) yourself, as in AC_SUBST([MT], ['mt${EXEEXT}']).

Conditional Programs using Automake Conditionals
................................................

You can also use Automake conditionals (*note Conditionals::) to select
programs to be built. In this case you don't have to worry about
$(EXEEXT) or EXTRA_PROGRAMS.

 bin_PROGRAMS = cpio pax
 if WANT_MT
   bin_PROGRAMS += mt
 endif
 if WANT_RMT
   libexec_PROGRAMS = rmt
 endif

8.2 Building a library
======================

Building a library is much like building a program. In this case, the
name of the primary is LIBRARIES. Libraries can be installed in
libdir or pkglibdir.

*Note A Shared Library::, for information on how to build shared
libraries using libtool and the LTLIBRARIES primary.

Each _LIBRARIES variable is a list of the libraries to be built.
For instance, to create a library named libcpio.a, but not install it,
you would write:

 noinst_LIBRARIES = libcpio.a
 libcpio_a_SOURCES = ...

The sources that go into a library are determined exactly as they are
for programs, via the _SOURCES variables. Note that the library name
is canonicalized (*note Canonicalization::), so the _SOURCES variable
corresponding to libcpio.a is libcpio_a_SOURCES, not
libcpio.a_SOURCES.

Extra objects can be added to a library using the LIBRARY_LIBADD
variable. This should be used for objects determined by configure.
Again from cpio:

 libcpio_a_LIBADD = $(LIBOBJS) $(ALLOCA)

In addition, sources for extra objects that will not exist until
configure-time must be added to the BUILT_SOURCES variable (*note
Sources::).

Building a static library is done by compiling all object files, then
by invoking $(AR) $(ARFLAGS) followed by the name of the library and
the list of objects, and finally by calling $(RANLIB) on that library.
You should call AC_PROG_RANLIB from your configure.ac to define
RANLIB (Automake will complain otherwise). You should also call
AM_PROG_AR to define AR, in order to support unusual archivers such
as Microsoft lib. ARFLAGS will default to cru; you can override
this variable by setting it in your Makefile.am or by AC_SUBSTing it
from your configure.ac. You can override the AR variable by
defining a per-library maude_AR variable (*note Program and Library
Variables::).
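
The corresponding configure.ac additions are minimal; a sketch:

 # configure.ac
 AM_PROG_AR       # defines AR, needed e.g. for unusual archivers
 AC_PROG_RANLIB   # defines RANLIB, required when building libraries
 # Optionally override the default 'cru' archiver flags:
 # AC_SUBST([ARFLAGS], [cr])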

Be careful when selecting library components conditionally. Because
building an empty library is not portable, you should ensure that any
library always contains at least one object.

To use a static library when building a program, add it to LDADD
for this program. In the following example, the program cpio is
statically linked with the library libcpio.a.

 noinst_LIBRARIES = libcpio.a
 libcpio_a_SOURCES = ...

 bin_PROGRAMS = cpio
 cpio_SOURCES = cpio.c ...
 cpio_LDADD = libcpio.a

8.3 Building a Shared Library
=============================

Building shared libraries portably is a relatively complex matter. For
this reason, GNU Libtool (*note Introduction: (libtool)Top.) was created
to help build shared libraries in a platform-independent way.

8.3.1 The Libtool Concept


Libtool abstracts shared and static libraries into a unified concept
henceforth called “libtool libraries”. Libtool libraries are files
using the .la suffix, and can designate a static library, a shared
library, or maybe both. Their exact nature cannot be determined until
./configure is run: not all platforms support all kinds of libraries,
and users can explicitly select which libraries should be built.
(However the package's maintainers can tune the default; *note The
AC_PROG_LIBTOOL macro: (libtool)AC_PROG_LIBTOOL.)

Because object files for shared and static libraries must be compiled
differently, libtool is also used during compilation. Object files
built by libtool are called “libtool objects”: these are files using the
.lo suffix. Libtool libraries are built from these libtool objects.

You should not assume anything about the structure of .la or .lo
files and how libtool constructs them: this is libtool's concern, and
the last thing one wants is to learn about libtool's guts. However the
existence of these files matters, because they are used as targets and
dependencies in Makefile rules when building libtool libraries.
There are situations where you may have to refer to these, for instance
when expressing dependencies for building source files conditionally
(*note Conditional Libtool Sources::).

People considering writing a plug-in system, with dynamically loaded
modules, should look into libltdl: libtool's dlopening library (*note
Using libltdl: (libtool)Using libltdl.). This offers a portable
dlopening facility to load libtool libraries dynamically, and can also
achieve static linking where unavoidable.

Before we discuss how to use libtool with Automake in detail, it
should be noted that the libtool manual also has a section about how to
use Automake with libtool (*note Using Automake with Libtool:
(libtool)Using Automake.).

8.3.2 Building Libtool Libraries


Automake uses libtool to build libraries declared with the LTLIBRARIES
primary. Each _LTLIBRARIES variable is a list of libtool libraries to
build. For instance, to create a libtool library named libgettext.la,
and install it in libdir, write:

 lib_LTLIBRARIES = libgettext.la
 libgettext_la_SOURCES = gettext.c gettext.h ...

Automake predefines the variable pkglibdir, so you can use
pkglib_LTLIBRARIES to install libraries in $(libdir)/@PACKAGE@/.
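
For example (with a hypothetical libsub.la):

 pkglib_LTLIBRARIES = libsub.la
 libsub_la_SOURCES = sub.c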

If gettext.h is a public header file that needs to be installed in
order for people to use the library, it should be declared using a
_HEADERS variable, not in libgettext_la_SOURCES. Headers listed in
the latter should be internal headers that are not part of the public
interface.

 lib_LTLIBRARIES = libgettext.la
 libgettext_la_SOURCES = gettext.c ...
 include_HEADERS = gettext.h ...

A package can build and install such a library along with other
programs that use it. This dependency should be specified using
LDADD. The following example builds a program named hello that is
linked with libgettext.la.

 lib_LTLIBRARIES = libgettext.la
 libgettext_la_SOURCES = gettext.c ...

 bin_PROGRAMS = hello
 hello_SOURCES = hello.c ...
 hello_LDADD = libgettext.la

Whether hello is statically or dynamically linked with libgettext.la
is not yet known: this will depend on the configuration of libtool and
the capabilities of the host.

8.3.3 Building Libtool Libraries Conditionally


Like conditional programs (*note Conditional Programs::), there are two
main ways to build conditional libraries: using Automake conditionals or
using Autoconf AC_SUBSTitutions.

The important implementation detail you have to be aware of is that
the place where a library will be installed matters to libtool: it needs
to be indicated at link-time using the -rpath option.

For libraries whose destination directory is known when Automake
runs, Automake will automatically supply the appropriate -rpath option
to libtool. This is the case for libraries listed explicitly in some
installable _LTLIBRARIES variables such as lib_LTLIBRARIES.

However, for libraries determined at configure time (and thus
mentioned in EXTRA_LTLIBRARIES), Automake does not know the final
installation directory. For such libraries you must add the -rpath
option to the appropriate _LDFLAGS variable by hand.

The examples below illustrate the differences between these two
methods.

Here is an example where WANTEDLIBS is an AC_SUBSTed variable set
at ./configure-time to either libfoo.la, libbar.la, both, or none.
Although $(WANTEDLIBS) appears in the lib_LTLIBRARIES, Automake
cannot guess it relates to libfoo.la or libbar.la at the time it
creates the link rule for these two libraries. Therefore the -rpath
argument must be explicitly supplied.

 EXTRA_LTLIBRARIES = libfoo.la libbar.la
 lib_LTLIBRARIES = $(WANTEDLIBS)
 libfoo_la_SOURCES = foo.c ...
 libfoo_la_LDFLAGS = -rpath '$(libdir)'
 libbar_la_SOURCES = bar.c ...
 libbar_la_LDFLAGS = -rpath '$(libdir)'

Here is how the same Makefile.am would look using Automake
conditionals named WANT_LIBFOO and WANT_LIBBAR. Now Automake is
able to compute the -rpath setting itself, because it's clear that
both libraries will end up in $(libdir) if they are installed.

 lib_LTLIBRARIES =
 if WANT_LIBFOO
 lib_LTLIBRARIES += libfoo.la
 endif
 if WANT_LIBBAR
 lib_LTLIBRARIES += libbar.la
 endif
 libfoo_la_SOURCES = foo.c ...
 libbar_la_SOURCES = bar.c ...

8.3.4 Libtool Libraries with Conditional Sources


Conditional compilation of sources in a library can be achieved in the
same way as conditional compilation of sources in a program (*note
Conditional Sources::). The only difference is that _LIBADD should be
used instead of _LDADD and that it should mention libtool objects
(.lo files).

So, to mimic the hello example from *note Conditional Sources::, we
could build a libhello.la library using either hello-linux.c or
hello-generic.c with the following Makefile.am.

 lib_LTLIBRARIES = libhello.la
 libhello_la_SOURCES = hello-common.c
 EXTRA_libhello_la_SOURCES = hello-linux.c hello-generic.c
 libhello_la_LIBADD = $(HELLO_SYSTEM)
 libhello_la_DEPENDENCIES = $(HELLO_SYSTEM)

And make sure configure defines HELLO_SYSTEM as either
hello-linux.lo or hello-generic.lo.

Or we could simply use an Automake conditional as follows.

 lib_LTLIBRARIES = libhello.la
 libhello_la_SOURCES = hello-common.c
 if LINUX
 libhello_la_SOURCES += hello-linux.c
 else
 libhello_la_SOURCES += hello-generic.c
 endif

8.3.5 Libtool Convenience Libraries

Sometimes you want to build libtool libraries that should not be
installed. These are called “libtool convenience libraries” and are
typically used to encapsulate many sublibraries, later gathered into one
big installed library.

Libtool convenience libraries are declared by directory-less
variables such as noinst_LTLIBRARIES, check_LTLIBRARIES, or even
EXTRA_LTLIBRARIES. Unlike installed libtool libraries they do not
need an -rpath flag at link time (actually this is the only
difference).

Convenience libraries listed in noinst_LTLIBRARIES are always
built. Those listed in check_LTLIBRARIES are built only upon 'make
check'. Finally, libraries listed in EXTRA_LTLIBRARIES are never
built explicitly: Automake outputs rules to build them, but if the
library does not appear as a Makefile dependency anywhere it won't be
built (this is why EXTRA_LTLIBRARIES is used for conditional
compilation).

Here is a sample setup merging libtool convenience libraries from
subdirectories into one main libtop.la library.

 # -- Top-level Makefile.am --
 SUBDIRS = sub1 sub2 ...
 lib_LTLIBRARIES = libtop.la
 libtop_la_SOURCES =
 libtop_la_LIBADD = \
   sub1/libsub1.la \
   sub2/libsub2.la \
   ...

 # -- sub1/Makefile.am --
 noinst_LTLIBRARIES = libsub1.la
 libsub1_la_SOURCES = ...

 # -- sub2/Makefile.am --
 # showing nested convenience libraries
 SUBDIRS = sub21 sub22 ...
 noinst_LTLIBRARIES = libsub2.la
 libsub2_la_SOURCES =
 libsub2_la_LIBADD = \
   sub21/libsub21.la \
   sub22/libsub22.la \
   ...

When using such a setup, beware that automake will assume libtop.la
is to be linked with the C linker. This is because libtop_la_SOURCES
is empty, so automake picks C as the default language. If
libtop_la_SOURCES were not empty, automake would select the linker as
explained in *note How the Linker is Chosen::.

If one of the sublibraries contains non-C source, it is important
that the appropriate linker be chosen. One way to achieve this is to
pretend that there is such a non-C file among the sources of the
library, thus forcing automake to select the appropriate linker. Here
is the top-level Makefile of our example updated to force C++ linking.

 SUBDIRS = sub1 sub2 ...
 lib_LTLIBRARIES = libtop.la
 libtop_la_SOURCES =
 # Dummy C++ source to cause C++ linking.
 nodist_EXTRA_libtop_la_SOURCES = dummy.cxx
 libtop_la_LIBADD = \
   sub1/libsub1.la \
   sub2/libsub2.la \
   ...

EXTRA_*_SOURCES variables are used to keep track of source files
that might be compiled (this is mostly useful when doing conditional
compilation using AC_SUBST, *note Conditional Libtool Sources::), and
the nodist_ prefix means the listed sources are not to be distributed
(*note Program and Library Variables::). In effect the file dummy.cxx
does not need to exist in the source tree. Of course if you have some
real source file to list in libtop_la_SOURCES there is no point in
cheating with nodist_EXTRA_libtop_la_SOURCES.

8.3.6 Libtool Modules


These are libtool libraries meant to be dlopened. They are indicated to
libtool by passing -module at link-time.

 pkglib_LTLIBRARIES = mymodule.la
 mymodule_la_SOURCES = doit.c
 mymodule_la_LDFLAGS = -module

Ordinarily, Automake requires that a library's name start with lib. However, when building a dynamically loadable module you might wish to use a "nonstandard" name. Automake will not complain about such nonstandard names if it knows the library being built is a libtool module, i.e., if -module explicitly appears in the library's
_LDFLAGS variable (or in the common AM_LDFLAGS variable when no
per-library _LDFLAGS variable is defined).

As always, AC_SUBST variables are black boxes to Automake since
their values are not yet known when automake is run. Therefore if
-module is set via such a variable, Automake cannot notice it and will
proceed as if the library was an ordinary libtool library, with strict
naming.

If mymodule_la_SOURCES is not specified, then it defaults to the
single file mymodule.c (*note Default _SOURCES::).

8.3.7 _LIBADD, _LDFLAGS, and _LIBTOOLFLAGS


As shown in previous sections, the LIBRARY_LIBADD variable should be
used to list extra libtool objects (.lo files) or libtool libraries
(.la) to add to LIBRARY.

The LIBRARY_LDFLAGS variable is the place to list additional
libtool linking flags, such as -version-info, -static, and a lot
more. *Note Link mode: (libtool)Link mode.

The libtool command has two kinds of options: mode-specific options
and generic options. Mode-specific options such as the aforementioned
linking flags should be lumped with the other flags passed to the tool
invoked by libtool (hence the use of LIBRARY_LDFLAGS for libtool
linking flags). Generic options, such as --tag=TAG and --silent
(*note Invoking libtool: (libtool)Invoking libtool. for more options),
should appear before the mode selection on the command line; in
Makefile.ams they should be listed in the LIBRARY_LIBTOOLFLAGS
variable.

If LIBRARY_LIBTOOLFLAGS is not defined, then the variable
AM_LIBTOOLFLAGS is used instead.

These flags are passed to libtool after the --tag=TAG option
computed by Automake (if any), so LIBRARY_LIBTOOLFLAGS (or
AM_LIBTOOLFLAGS) is a good place to override or supplement the
--tag=TAG setting.
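
A brief sketch of the distinction (the library name and flag values
are purely illustrative):

 lib_LTLIBRARIES = libmaude.la
 libmaude_la_SOURCES = maude.c
 # Link-mode (mode-specific) flags:
 libmaude_la_LDFLAGS = -version-info 3:12:1
 # Generic libtool options, emitted before the mode selection:
 libmaude_la_LIBTOOLFLAGS = --silent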

The libtool rules also use a LIBTOOLFLAGS variable that should not
be set in Makefile.am: this is a user variable (*note Flag Variables
Ordering::). It allows users to run make LIBTOOLFLAGS=--silent, for
instance. Note that the verbosity of libtool can also be influenced
by the Automake support for silent rules (*note Automake Silent
Rules::).

8.3.8 LTLIBOBJS and LTALLOCA

Where an ordinary library might include $(LIBOBJS) or $(ALLOCA)
(*note LIBOBJS::), a libtool library must use $(LTLIBOBJS) or
$(LTALLOCA). This is required because the object files that libtool
operates on do not necessarily end in .o.

Nowadays, the computation of LTLIBOBJS from LIBOBJS is performed
automatically by Autoconf (*note AC_LIBOBJ vs. LIBOBJS:
(autoconf)AC_LIBOBJ vs LIBOBJS.).
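
For instance, a libtool convenience library gathering the replacement
objects might look like this (file names are illustrative):

 noinst_LTLIBRARIES = libcompat.la
 libcompat_la_SOURCES = compat.c
 libcompat_la_LIBADD = $(LTLIBOBJS) $(LTALLOCA)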

8.3.9.1 Error: required file './ltmain.sh' not found
………………………………………………

Libtool comes with a tool called libtoolize that will install
libtool's supporting files into a package. Running this command will
install ltmain.sh. You should execute it before aclocal and automake.

People upgrading old packages to newer autotools are likely to face
this issue because older Automake versions used to call libtoolize
automatically, so their old build scripts do not call libtoolize
themselves.

Since Automake 1.6, it has been decided that running libtoolize was
none of Automake's business. Instead, that functionality has been
moved into the autoreconf command (*note Using autoreconf:
(autoconf)autoreconf Invocation.). If you do not want to remember
what to run and when, just learn the autoreconf command. Hopefully,
replacing existing bootstrap or autogen.sh scripts by a call to
autoreconf should also free you from any similar incompatible change
in the future.

8.3.9.2 Objects created with both libtool and without
……………………………………………….

Sometimes, the same source file is used both to build a libtool library
and to build another non-libtool target (be it a program or another
library).

Let's consider the following Makefile.am.

 bin_PROGRAMS = prog
 prog_SOURCES = prog.c foo.c ...

 lib_LTLIBRARIES = libfoo.la
 libfoo_la_SOURCES = foo.c ...

(In this trivial case the issue could be avoided by linking libfoo.la
with prog instead of listing foo.c in prog_SOURCES. But let's assume
we really want to keep prog and libfoo.la separate.)

Technically, it means that we should build foo.$(OBJEXT) for
prog, and foo.lo for libfoo.la. The problem is that in the course
of creating foo.lo, libtool may erase (or replace) foo.$(OBJEXT),
and this cannot be avoided.

Therefore, when Automake detects this situation it will complain with
a message such as
​ object ‘foo.$(OBJEXT)’ created both with libtool and without

A workaround for this issue is to ensure that these two objects get
different basenames. As explained in *note Renamed Objects::, this
happens automatically when per-target flags are used.

 bin_PROGRAMS = prog
 prog_SOURCES = prog.c foo.c ...
 prog_CFLAGS = $(AM_CFLAGS)

 lib_LTLIBRARIES = libfoo.la
 libfoo_la_SOURCES = foo.c ...

Adding prog_CFLAGS = $(AM_CFLAGS) is almost a no-op, because when
prog_CFLAGS is defined, it is used instead of AM_CFLAGS. However as
a side effect it will cause prog.c and foo.c to be compiled as
prog-prog.$(OBJEXT) and prog-foo.$(OBJEXT), which solves the issue.

8.4 Program and Library Variables

Associated with each program is a collection of variables that can be
used to modify how that program is built. There is a similar list of
such variables for each library. The canonical name of the program (or
library) is used as a base for naming these variables.

In the list below, we use the name “maude” to refer to the program or
library. In your Makefile.am you would replace this with the
canonical name of your program. This list also refers to “maude” as a
program, but in general the same rules apply for both static and dynamic
libraries; the documentation below notes situations where programs and
libraries differ.

maude_SOURCES
​ This variable, if it exists, lists all the source files that are
​ compiled to build the program. These files are added to the
​ distribution by default. When building the program, Automake will
​ cause each source file to be compiled to a single .o file (or
.lo when using libtool). Normally these object files are named
​ after the source file, but other factors can change this. If a
​ file in the _SOURCES variable has an unrecognized extension,
​ Automake will do one of two things with it. If a suffix rule
​ exists for turning files with the unrecognized extension into .o
​ files, then automake will treat this file as it will any other
​ source file (*note Support for Other Languages::). Otherwise, the
​ file will be ignored as though it were a header file.

 The prefixes `dist_` and `nodist_` can be used to control whether
 files listed in a `_SOURCES` variable are distributed.  `dist_` is
 redundant, as sources are distributed by default, but it can be
 specified for clarity if desired.

 It is possible to have both `dist_` and `nodist_` variants of a
 given `_SOURCES` variable at once; this lets you easily distribute
 some files and not others, for instance:

      nodist_maude_SOURCES = nodist.c
      dist_maude_SOURCES = dist-me.c

 By default the output file (on Unix systems, the `.o` file) will be
 put into the current build directory.  However, if the option
 `subdir-objects` is in effect in the current directory then the
 `.o` file will be put into the subdirectory named after the source
 file.  For instance, with `subdir-objects` enabled,
 `sub/dir/file.c` will be compiled to `sub/dir/file.o`.  Some people
 prefer this mode of operation.  You can specify `subdir-objects` in
 `AUTOMAKE_OPTIONS` (*note Options::).

EXTRA_maude_SOURCES
​ Automake needs to know the list of files you intend to compile
​ statically. For one thing, this is the only way Automake has of
​ knowing what sort of language support a given Makefile.in
​ requires. (1) This means that, for example, you can't put a
​ configure substitution like @my_sources@ into a _SOURCES
​ variable. If you intend to conditionally compile source files and
​ use configure to substitute the appropriate object names into,
​ e.g., _LDADD (see below), then you should list the corresponding
​ source files in the EXTRA_ variable.

 This variable also supports `dist_` and `nodist_` prefixes.  For
 instance, `nodist_EXTRA_maude_SOURCES` would list extra sources
 that may need to be built, but should not be distributed.
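
 For instance, if configure AC_SUBSTs a hypothetical MAUDE_OBJS
 variable holding the chosen object, one could write:

      bin_PROGRAMS = maude
      maude_SOURCES = main.c
      # Objects chosen by configure via AC_SUBST([MAUDE_OBJS]).
      maude_LDADD = $(MAUDE_OBJS)
      maude_DEPENDENCIES = $(MAUDE_OBJS)
      EXTRA_maude_SOURCES = sys-linux.c sys-generic.c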

maude_AR
​ A static library is created by default by invoking $(AR)
​ $(ARFLAGS) followed by the name of the library and then the
​ objects being put into the library. You can override this by
​ setting the _AR variable. This is usually used with C++; some
​ C++ compilers require a special invocation in order to instantiate
​ all the templates that should go into a library. For instance, the
​ SGI C++ compiler likes this variable set like so:
​ libmaude_a_AR = $(CXX) -ar -o

maude_LIBADD
​ Extra objects can be added to a library using the _LIBADD
​ variable. For instance, this should be used for objects determined
​ by configure (*note A Library::).

 In the case of libtool libraries, `maude_LIBADD` can also refer to
 other libtool libraries.

maude_LDADD
​ Extra objects (*.$(OBJEXT)) and libraries (*.a, *.la) can be
​ added to a program by listing them in the _LDADD variable. For
​ instance, this should be used for objects determined by configure
​ (*note Linking::).

 `_LDADD` and `_LIBADD` are inappropriate for passing
 program-specific linker flags (except for `-l`, `-L`, `-dlopen` and
 `-dlpreopen`).  Use the `_LDFLAGS` variable for this purpose.

 For instance, if your `configure.ac` uses `AC_PATH_XTRA`, you could
 link your program against the X libraries like so:

      maude_LDADD = $(X_PRE_LIBS) $(X_LIBS) $(X_EXTRA_LIBS)

 We recommend that you use `-l` and `-L` only when referring to
 third-party libraries, and give the explicit file names of any
 library built by your package.  Doing so will ensure that
 `maude_DEPENDENCIES` (see below) is correctly defined by default.

maude_LDFLAGS
​ This variable is used to pass extra flags to the link step of a
​ program or a shared library. It overrides the AM_LDFLAGS
​ variable.

maude_LIBTOOLFLAGS
​ This variable is used to pass extra options to libtool. It
​ overrides the AM_LIBTOOLFLAGS variable. These options are output
​ before libtool's --mode=MODE option, so they should not be
​ mode-specific options (those belong to the compiler or linker
​ flags). *Note Libtool Flags::.

maude_DEPENDENCIES
EXTRA_maude_DEPENDENCIES
​ It is also occasionally useful to have a target (program or
​ library) depend on some other file that is not actually part of
​ that target. This can be done using the _DEPENDENCIES variable.
​ Each target depends on the contents of such a variable, but no
​ further interpretation is done.

 Since these dependencies are associated to the link rule used to
 create the programs they should normally list files used by the
 link command.  That is `*.$(OBJEXT)`, `*.a`, or `*.la` files for
 programs; `*.lo` and `*.la` files for Libtool libraries; and
 `*.$(OBJEXT)` files for static libraries.  In rare cases you may
 need to add other kinds of files such as linker scripts, but
 _listing a source file in `_DEPENDENCIES` is wrong_.  If some
 source file needs to be built before all the components of a
 program are built, consider using the `BUILT_SOURCES` variable
 (*note Sources::).

 If `_DEPENDENCIES` is not supplied, it is computed by Automake.
 The automatically-assigned value is the contents of `_LDADD` or
 `_LIBADD`, with most configure substitutions, `-l`, `-L`, `-dlopen`
 and `-dlpreopen` options removed.  The configure substitutions that
 are left in are only `$(LIBOBJS)` and `$(ALLOCA)`; these are left
 because it is known that they will not cause an invalid value for
 `_DEPENDENCIES` to be generated.

 `_DEPENDENCIES` is more likely used to perform conditional
 compilation using an `AC_SUBST` variable that contains a list of
 objects.  *Note Conditional Sources::, and *note Conditional
 Libtool Sources::.

 The `EXTRA_*_DEPENDENCIES` variable may be useful for cases where
 you merely want to augment the `automake`-generated `_DEPENDENCIES`
 variable rather than replacing it.
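
 For instance, to relink a program whenever a (hypothetical) linker
 script changes while keeping the automatically computed
 dependencies:

      bin_PROGRAMS = maude
      maude_SOURCES = main.c
      # Hypothetical GNU ld version script passed at link time.
      maude_LDFLAGS = -Wl,--version-script=maude.map
      EXTRA_maude_DEPENDENCIES = maude.map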

maude_LINK
​ You can override the linker on a per-program basis. By default the
​ linker is chosen according to the languages used by the program.
​ For instance, a program that includes C++ source code would use the
​ C++ compiler to link. The _LINK variable must hold the name of a
​ command that can be passed all the .o file names and libraries to
​ link against as arguments. Note that the name of the underlying
​ program is not passed to _LINK; typically one uses $@:

      maude_LINK = $(CCLD) -magic -o $@

 If a `_LINK` variable is not supplied, it may still be generated
 and used by Automake due to the use of per-target link flags such
 as `_CFLAGS`, `_LDFLAGS` or `_LIBTOOLFLAGS`, in cases where they
 apply.

maude_CCASFLAGS
maude_CFLAGS
maude_CPPFLAGS
maude_CXXFLAGS
maude_FFLAGS
maude_GCJFLAGS
maude_LFLAGS
maude_OBJCFLAGS
maude_OBJCXXFLAGS
maude_RFLAGS
maude_UPCFLAGS
maude_YFLAGS
​ Automake allows you to set compilation flags on a per-program (or
​ per-library) basis. A single source file can be included in
​ several programs, and it will potentially be compiled with
​ different flags for each program. This works for any language
​ directly supported by Automake. These “per-target compilation
​ flags” are _CCASFLAGS, _CFLAGS, _CPPFLAGS, _CXXFLAGS,
_FFLAGS, _GCJFLAGS, _LFLAGS, _OBJCFLAGS, _OBJCXXFLAGS,
_RFLAGS, _UPCFLAGS, and _YFLAGS.

 When using a per-target compilation flag, Automake will choose a
 different name for the intermediate object files.  Ordinarily a
 file like `sample.c` will be compiled to produce `sample.o`.
  However, if the program's `_CFLAGS` variable is set, then the
 object file will be named, for instance, `maude-sample.o`.  (See
 also *note Renamed Objects::).

 In compilations with per-target flags, the ordinary `AM_` form of
 the flags variable is _not_ automatically included in the
 compilation (however, the user form of the variable _is_ included).
 So for instance, if you want the hypothetical `maude` compilations
 to also use the value of `AM_CFLAGS`, you would need to write:

      maude_CFLAGS = ... your flags ... $(AM_CFLAGS)

 *Note Flag Variables Ordering::, for more discussion about the
 interaction between user variables, `AM_` shadow variables, and
 per-target variables.

maude_SHORTNAME
​ On some platforms the allowable file names are very short. In
​ order to support these systems and per-target compilation flags at
​ the same time, Automake allows you to set a “short name” that will
​ influence how intermediate object files are named. For instance,
​ in the following example,

      bin_PROGRAMS = maude
      maude_CPPFLAGS = -DSOMEFLAG
      maude_SHORTNAME = m
      maude_SOURCES = sample.c ...

 the object file would be named `m-sample.o` rather than
 `maude-sample.o`.

 This facility is rarely needed in practice, and we recommend
 avoiding it until you find it is required.

(1) There are other, more obscure reasons for this limitation as
well.

8.5 Default _SOURCES

_SOURCES variables are used to specify source files of programs (*note
A Program::), libraries (*note A Library::), and Libtool libraries
(*note A Shared Library::).

When no such variable is specified for a target, Automake will define
one itself. The default is to compile a single C file whose base name
is the name of the target itself, with any extension replaced by
AM_DEFAULT_SOURCE_EXT, which defaults to .c.

For example if you have the following somewhere in your Makefile.am
with no corresponding libfoo_a_SOURCES:

 lib_LIBRARIES = libfoo.a sub/libc++.a

libfoo.a will be built using a default source file named libfoo.c,
and sub/libc++.a will be built from sub/libc++.c. (In older
versions sub/libc++.a would be built from sub_libc___a.c, i.e., the
default source was the canonized name of the target, with .c appended.
We believe the new behavior is more sensible, but for backward
compatibility automake will use the old name if a file or a rule with
that name exists and AM_DEFAULT_SOURCE_EXT is not used.)

Default sources are mainly useful in test suites, when building many
test programs each from a single source. For instance, in

 check_PROGRAMS = test1 test2 test3
 AM_DEFAULT_SOURCE_EXT = .cpp

test1, test2, and test3 will be built from test1.cpp,
test2.cpp, and test3.cpp. Without the last line, they will be built
from test1.c, test2.c, and test3.c.

Another case where this is convenient is building many Libtool
modules (moduleN.la), each defined in its own file (moduleN.c).

 AM_LDFLAGS = -module
 lib_LTLIBRARIES = module1.la module2.la module3.la

Finally, there is one situation where this default source computation
needs to be avoided: when a target should not be built from sources. We
already saw such an example in *note true::; this happens when all the
constituents of a target have already been compiled and just need to be
combined using a _LDADD variable. Then it is necessary to define an
empty _SOURCES variable, so that automake does not compute a
default.

 bin_PROGRAMS = target
 target_SOURCES =
 target_LDADD = libmain.a libmisc.a

8.6 Special handling for LIBOBJS and ALLOCA

The $(LIBOBJS) and $(ALLOCA) variables list object files that should
be compiled into the project to provide an implementation for functions
that are missing or broken on the host system. They are substituted by
configure.

These variables are defined by Autoconf macros such as AC_LIBOBJ,
AC_REPLACE_FUNCS (*note Generic Function Checks: (autoconf)Generic
Functions.), or AC_FUNC_ALLOCA (*note Particular Function Checks:
(autoconf)Particular Functions.). Many other Autoconf macros call
AC_LIBOBJ or AC_REPLACE_FUNCS to populate $(LIBOBJS).

Using these variables is very similar to doing conditional
compilation using AC_SUBST variables, as described in *note
Conditional Sources::. That is, when building a program, $(LIBOBJS)
and $(ALLOCA) should be added to the associated *_LDADD variable, or
to the *_LIBADD variable when building a library. However there is no
need to list the corresponding sources in EXTRA_*_SOURCES nor to
define *_DEPENDENCIES. Automake automatically adds $(LIBOBJS) and
$(ALLOCA) to the dependencies, and it will discover the list of
corresponding source files automatically (by tracing the invocations of
the AC_LIBSOURCE Autoconf macros). If you have already defined
*_DEPENDENCIES explicitly for an unrelated reason, then you either
need to add these variables manually, or use EXTRA_*_DEPENDENCIES
instead of *_DEPENDENCIES.

These variables are usually used to build a portability library that
is linked with all the programs of the project. We now review a sample
setup. First, configure.ac contains some checks that affect either
LIBOBJS or ALLOCA.

 # configure.ac
 ...
 AC_CONFIG_LIBOBJ_DIR([lib])
 ...
 AC_FUNC_MALLOC             dnl May add malloc.$(OBJEXT) to LIBOBJS
 AC_FUNC_MEMCMP             dnl May add memcmp.$(OBJEXT) to LIBOBJS
 AC_REPLACE_FUNCS([strdup]) dnl May add strdup.$(OBJEXT) to LIBOBJS
 AC_FUNC_ALLOCA             dnl May add alloca.$(OBJEXT) to ALLOCA
 ...
 AC_CONFIG_FILES([
   lib/Makefile
   src/Makefile
 ])
 AC_OUTPUT

The AC_CONFIG_LIBOBJ_DIR tells Autoconf that the source files of
these object files are to be found in the lib/ directory. Automake
can also use this information; otherwise it expects the source files
to be in the directory where the $(LIBOBJS) and $(ALLOCA) variables
are used.

The lib/ directory should therefore contain malloc.c, memcmp.c,
strdup.c, alloca.c. Here is its Makefile.am:

 # lib/Makefile.am

 noinst_LIBRARIES = libcompat.a
 libcompat_a_SOURCES =
 libcompat_a_LIBADD = $(LIBOBJS) $(ALLOCA)

The library can have any name, of course, and anyway it is not going
to be installed: it just holds the replacement versions of the missing
or broken functions so we can later link them in. Many projects also
include extra functions, specific to the project, in that library: they
are simply added on the _SOURCES line.

There is a small trap here, though: $(LIBOBJS) and $(ALLOCA)
might be empty, and building an empty library is not portable. You
should ensure that there is always something to put in libcompat.a.
Most projects will also add some utility functions in that directory,
and list them in libcompat_a_SOURCES, so in practice libcompat.a
cannot be empty.

Finally here is how this library could be used from the src/
directory.

 # src/Makefile.am

 # Link all programs in this directory with libcompat.a
 LDADD = ../lib/libcompat.a

 bin_PROGRAMS = tool1 tool2 ...
 tool1_SOURCES = ...
 tool2_SOURCES = ...

When option subdir-objects is not used, as in the above example,
the variables $(LIBOBJS) or $(ALLOCA) can only be used in the
directory where their sources lie. E.g., here it would be wrong to use
$(LIBOBJS) or $(ALLOCA) in src/Makefile.am. However if both
subdir-objects and AC_CONFIG_LIBOBJ_DIR are used, it is OK to use
these variables in other directories. For instance src/Makefile.am
could be changed as follows.

 # src/Makefile.am

 AUTOMAKE_OPTIONS = subdir-objects
 LDADD = $(LIBOBJS) $(ALLOCA)

 bin_PROGRAMS = tool1 tool2 ...
 tool1_SOURCES = ...
 tool2_SOURCES = ...

Because $(LIBOBJS) and $(ALLOCA) contain object file names that
end with .$(OBJEXT), they are not suitable for Libtool libraries
(where the expected object extension is .lo): LTLIBOBJS and
LTALLOCA should be used instead.

LTLIBOBJS is defined automatically by Autoconf and should not be
defined by hand (as in the past), however at the time of writing
LTALLOCA still needs to be defined from ALLOCA manually. *Note
AC_LIBOBJ vs. LIBOBJS: (autoconf)AC_LIBOBJ vs LIBOBJS.

8.7 Variables used when building a program

Occasionally it is useful to know which Makefile variables Automake
uses for compilations, and in which order (*note Flag Variables
Ordering::); for instance, you might need to do your own compilation in
some special cases.

Some variables are inherited from Autoconf; these are CC, CFLAGS,
CPPFLAGS, DEFS, LDFLAGS, and LIBS.

There are some additional variables that Automake defines on its own:

AM_CPPFLAGS
​ The contents of this variable are passed to every compilation that
​ invokes the C preprocessor; it is a list of arguments to the
​ preprocessor. For instance, -I and -D options should be listed
​ here.

 Automake already provides some `-I` options automatically, in a
 separate variable that is also passed to every compilation that
 invokes the C preprocessor.  In particular it generates `-I.`,
 `-I$(srcdir)`, and a `-I` pointing to the directory holding
  `config.h` (if you've used `AC_CONFIG_HEADERS`).  You can disable
 the default `-I` options using the `nostdinc` option.

 When a file to be included is generated during the build and not
 part of a distribution tarball, its location is under
 `$(builddir)`, not under `$(srcdir)`.  This matters especially for
 packages that use header files placed in sub-directories and want
 to allow builds outside the source tree (*note VPATH Builds::).  In
 that case we recommend to use a pair of `-I` options, such as,
 e.g., `-Isome/subdir -I$(srcdir)/some/subdir` or
 `-I$(top_builddir)/some/subdir -I$(top_srcdir)/some/subdir`.  Note
 that the reference to the build tree should come before the
 reference to the source tree, so that accidentally leftover
 generated files in the source directory are ignored.
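
 A sketch following that recommendation (the subdirectory is the one
 from the example above; the -D definition is illustrative):

      AM_CPPFLAGS = -Isome/subdir -I$(srcdir)/some/subdir \
                    -DMY_DATADIR='"$(pkgdatadir)"'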

 `AM_CPPFLAGS` is ignored in preference to a per-executable (or
 per-library) `_CPPFLAGS` variable if it is defined.

INCLUDES
​ This does the same job as AM_CPPFLAGS (or any per-target
_CPPFLAGS variable if it is used). It is an older name for the
​ same functionality. This variable is deprecated; we suggest using
AM_CPPFLAGS and per-target _CPPFLAGS instead.

AM_CFLAGS
​ This is the variable the Makefile.am author can use to pass in
​ additional C compiler flags. In some situations, this is not used,
​ in preference to the per-executable (or per-library) _CFLAGS.

COMPILE
​ This is the command used to actually compile a C source file. The
​ file name is appended to form the complete command line.

AM_LDFLAGS
​ This is the variable the Makefile.am author can use to pass in
​ additional linker flags. In some situations, this is not used, in
​ preference to the per-executable (or per-library) _LDFLAGS.

LINK
​ This is the command used to actually link a C program. It already
​ includes -o $@ and the usual variable references (for instance,
CFLAGS); it takes as “arguments” the names of the object files
​ and libraries to link in. This variable is not used when the
​ linker is overridden with a per-target _LINK variable or
​ per-target flags cause Automake to define such a _LINK variable.

8.8 Yacc and Lex support

Automake has somewhat idiosyncratic support for Yacc and Lex.

Automake assumes that the .c file generated by yacc (or lex)
should be named using the basename of the input file. That is, for a
yacc source file foo.y, Automake will cause the intermediate file to
be named foo.c (as opposed to y.tab.c, which is more traditional).

The extension of a yacc source file is used to determine the
extension of the resulting C or C++ source and header files. Note that
header files are generated only when the -d Yacc option is used; see
below for more information about this flag, and how to specify it.
Files with the extension .y will thus be turned into .c sources and
.h headers; likewise, .yy will become .cc and .hh, .y++ will
become .c++ and .h++, .yxx will become .cxx and .hxx, and .ypp
will become .cpp and .hpp.

Similarly, lex source files can be used to generate C or C++; the
extensions .l, .ll, .l++, .lxx, and .lpp are recognized.

You should never explicitly mention the intermediate (C or C++) file
in any SOURCES variable; only list the source file.

The intermediate files generated by yacc (or lex) will be
included in any distribution that is made. That way the user doesn't need to have yacc or lex.

If a yacc source file is seen, then your configure.ac must define
the variable YACC. This is most easily done by invoking the macro
AC_PROG_YACC (*note Particular Program Checks: (autoconf)Particular
Programs.).

When yacc is invoked, it is passed AM_YFLAGS and YFLAGS. The
latter is a user variable and the former is intended for the
Makefile.am author.

AM_YFLAGS is usually used to pass the -d option to yacc.
Automake knows what this means and will automatically adjust its rules
to update and distribute the header file built by yacc -d(1). What
Automake cannot guess, though, is where this header will be used: it is
up to you to ensure the header gets built before it is first used.
Typically this is necessary in order for dependency tracking to work
when the header is included by another file. The common solution is
listing the header file in BUILT_SOURCES (*note Sources::) as follows.

 BUILT_SOURCES = parser.h
 AM_YFLAGS = -d
 bin_PROGRAMS = foo
 foo_SOURCES = ... parser.y ...

If a lex source file is seen, then your configure.ac must define
the variable LEX. You can use AC_PROG_LEX to do this (*note
Particular Program Checks: (autoconf)Particular Programs.), but using
the AM_PROG_LEX macro (*note Macros::) is recommended.

When lex is invoked, it is passed AM_LFLAGS and LFLAGS. The
latter is a user variable and the former is intended for the
Makefile.am author.

When AM_MAINTAINER_MODE (*note maintainer-mode::) is used, the
rebuild rules for distributed Yacc and Lex sources are only used when
maintainer-mode is enabled, or when the files have been erased.

When lex or yacc sources are used, automake -a automatically
installs an auxiliary program called ylwrap in your package (*note
Auxiliary Programs::). This program is used by the build rules to
rename the output of these tools, and makes it possible to include
multiple yacc (or lex) source files in a single directory. (This is
necessary because yacc's output file name is fixed, and a parallel make could conceivably invoke more than one instance of yacc
simultaneously.)

For yacc, simply managing locking is insufficient. The output of
yacc always uses the same symbol names internally, so it isn't possible to link two yacc parsers into the same executable.

We recommend using the following renaming hack used in gdb:
​ #define yymaxdepth c_maxdepth
​ #define yyparse c_parse
​ #define yylex c_lex
​ #define yyerror c_error
​ #define yylval c_lval
​ #define yychar c_char
​ #define yydebug c_debug
​ #define yypact c_pact
​ #define yyr1 c_r1
​ #define yyr2 c_r2
​ #define yydef c_def
​ #define yychk c_chk
​ #define yypgo c_pgo
​ #define yyact c_act
​ #define yyexca c_exca
​ #define yyerrflag c_errflag
​ #define yynerrs c_nerrs
​ #define yyps c_ps
​ #define yypv c_pv
​ #define yys c_s
​ #define yy_yys c_yys
​ #define yystate c_state
​ #define yytmp c_tmp
​ #define yyv c_v
​ #define yy_yyv c_yyv
​ #define yyval c_val
​ #define yylloc c_lloc
​ #define yyreds c_reds
​ #define yytoks c_toks
​ #define yylhs c_yylhs
​ #define yylen c_yylen
​ #define yydefred c_yydefred
​ #define yydgoto c_yydgoto
​ #define yysindex c_yysindex
​ #define yyrindex c_yyrindex
​ #define yygindex c_yygindex
​ #define yytable c_yytable
​ #define yycheck c_yycheck
​ #define yyname c_yyname
​ #define yyrule c_yyrule

For each define, replace the c_ prefix with whatever you like.
These defines work for bison, byacc, and traditional yaccs. If
you find a parser generator that uses a symbol not covered here, please
report the new name so it can be added to the list.

(1) Please note that automake recognizes -d in AM_YFLAGS only
if it is not clustered with other options; for example, it won't be
recognized if AM_YFLAGS is -dt, but it will be if AM_YFLAGS is
-d -t or -t -d.

8.9 C++ Support

Automake includes full support for C++.

Any package including C++ code must define the output variable CXX
in configure.ac; the simplest way to do this is to use the
AC_PROG_CXX macro (*note Particular Program Checks:
(autoconf)Particular Programs.).
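
A minimal sketch (program and file names are hypothetical):

 # configure.ac
 AC_PROG_CXX

 # Makefile.am -- the C++ linker is selected automatically
 bin_PROGRAMS = hello
 hello_SOURCES = hello.cc hello.h main.cc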

A few additional variables are defined when a C++ source file is
seen:

CXX
​ The name of the C++ compiler.

CXXFLAGS
​ Any flags to pass to the C++ compiler.

AM_CXXFLAGS
​ The maintainer's variant of CXXFLAGS.

CXXCOMPILE
​ The command used to actually compile a C++ source file. The file
​ name is appended to form the complete command line.

CXXLINK
​ The command used to actually link a C++ program.

8.10 Objective C Support

Automake includes some support for Objective C.

Any package including Objective C code must define the output
variable OBJC in configure.ac; the simplest way to do this is to use
the AC_PROG_OBJC macro (*note Particular Program Checks:
(autoconf)Particular Programs.).

A few additional variables are defined when an Objective C source
file is seen:

OBJC
​ The name of the Objective C compiler.

OBJCFLAGS
​ Any flags to pass to the Objective C compiler.

AM_OBJCFLAGS
​ The maintainer's variant of OBJCFLAGS.

OBJCCOMPILE
​ The command used to actually compile an Objective C source file.
​ The file name is appended to form the complete command line.

OBJCLINK
​ The command used to actually link an Objective C program.

8.11 Objective C++ Support

Automake includes some support for Objective C++.

Any package including Objective C++ code must define the output
variable OBJCXX in configure.ac; the simplest way to do this is to
use the AC_PROG_OBJCXX macro (*note Particular Program Checks:
(autoconf)Particular Programs.).

A few additional variables are defined when an Objective C++ source
file is seen:

OBJCXX
​ The name of the Objective C++ compiler.

OBJCXXFLAGS
​ Any flags to pass to the Objective C++ compiler.

AM_OBJCXXFLAGS
​ The maintainer's variant of OBJCXXFLAGS.

OBJCXXCOMPILE
​ The command used to actually compile an Objective C++ source file.
​ The file name is appended to form the complete command line.

OBJCXXLINK
​ The command used to actually link an Objective C++ program.

8.12 Unified Parallel C Support

Automake includes some support for Unified Parallel C.

Any package including Unified Parallel C code must define the output
variable UPC in configure.ac; the simplest way to do this is to use
the AM_PROG_UPC macro (*note Public Macros::).

A few additional variables are defined when a Unified Parallel C
source file is seen:

UPC
​ The name of the Unified Parallel C compiler.

UPCFLAGS
​ Any flags to pass to the Unified Parallel C compiler.

AM_UPCFLAGS
​ The maintainer's variant of UPCFLAGS.

UPCCOMPILE
​ The command used to actually compile a Unified Parallel C source
​ file. The file name is appended to form the complete command line.

UPCLINK
​ The command used to actually link a Unified Parallel C program.

8.13 Assembly Support

Automake includes some support for assembly code. There are two forms
of assembler files: normal (*.s) and preprocessed by CPP (*.S or
*.sx).

The variable CCAS holds the name of the compiler used to build
assembly code. This compiler must work a bit like a C compiler; in
particular it must accept -c and -o. The values of CCASFLAGS and
AM_CCASFLAGS (or its per-target definition) are passed to the
compilation. For preprocessed files, DEFS, DEFAULT_INCLUDES,
INCLUDES, CPPFLAGS and AM_CPPFLAGS are also used.

The autoconf macro AM_PROG_AS will define CCAS and CCASFLAGS
for you (unless they are already set, it simply sets CCAS to the C
compiler and CCASFLAGS to the C compiler flags), but you are free to
define these variables by other means.

Only the suffixes .s, .S, and .sx are recognized by automake
as being files containing assembly code.
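
A minimal sketch (library and file names are hypothetical):

 # configure.ac
 AM_PROG_AS

 # Makefile.am
 noinst_LIBRARIES = libcpu.a
 libcpu_a_SOURCES = cpu.c lowlevel.S   # .S is run through CPP first
 AM_CCASFLAGS = -g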

8.14 Fortran 77 Support

Automake includes full support for Fortran 77.

Any package including Fortran 77 code must define the output variable
F77 in configure.ac; the simplest way to do this is to use the
AC_PROG_F77 macro (*note Particular Program Checks:
(autoconf)Particular Programs.).

A few additional variables are defined when a Fortran 77 source file
is seen:

F77
​ The name of the Fortran 77 compiler.

FFLAGS
​ Any flags to pass to the Fortran 77 compiler.

AM_FFLAGS
​ The maintainer's variant of FFLAGS.

RFLAGS
​ Any flags to pass to the Ratfor compiler.

AM_RFLAGS
​ The maintainer's variant of RFLAGS.

F77COMPILE
​ The command used to actually compile a Fortran 77 source file. The
​ file name is appended to form the complete command line.

FLINK
​ The command used to actually link a pure Fortran 77 program or
​ shared library.

Automake can handle preprocessing Fortran 77 and Ratfor source files
in addition to compiling them(1). Automake also contains some support
for creating programs and shared libraries that are a mixture of Fortran
77 and other languages (*note Mixing Fortran 77 With C and C++::).

These issues are covered in the following sections.

(1) Much, if not most, of the information in the following sections
pertaining to preprocessing Fortran 77 programs was taken almost
verbatim from *note Catalogue of Rules: (make)Catalogue of Rules.

8.14.1 Preprocessing Fortran 77

N.f is made automatically from N.F or N.r. This rule runs just
the preprocessor to convert a preprocessable Fortran 77 or Ratfor source
file into a strict Fortran 77 source file. The precise command used is
as follows:

.F
$(F77) -F $(DEFS) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_FFLAGS) $(FFLAGS)

.r
$(F77) -F $(AM_FFLAGS) $(FFLAGS) $(AM_RFLAGS) $(RFLAGS)

8.14.2 Compiling Fortran 77 Files

N.o is made automatically from N.f, N.F or N.r by running the
Fortran 77 compiler. The precise command used is as follows:

.f
$(F77) -c $(AM_FFLAGS) $(FFLAGS)

.F
$(F77) -c $(DEFS) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_FFLAGS) $(FFLAGS)

.r
$(F77) -c $(AM_FFLAGS) $(FFLAGS) $(AM_RFLAGS) $(RFLAGS)

8.14.3 Mixing Fortran 77 With C and C++

Automake currently provides limited support for creating programs and
shared libraries that are a mixture of Fortran 77 and C and/or C++.
However, there are many other issues related to mixing Fortran 77 with
other languages that are not (currently) handled by Automake, but that
are handled by other packages(1).

Automake can help in two ways:

  1. Automatic selection of the linker depending on which combination
    of source-code languages is used.

  2. Automatic selection of the appropriate linker flags (e.g., -L and
    -l) to pass to the automatically selected linker in order to link
    in the appropriate Fortran 77 intrinsic and run-time libraries.

    These extra Fortran 77 linker flags are supplied in the output
    variable FLIBS by the AC_F77_LIBRARY_LDFLAGS Autoconf macro.
    *Note Fortran Compiler Characteristics: (autoconf)Fortran Compiler.

If Automake detects that a program or shared library (as mentioned in
some _PROGRAMS or _LTLIBRARIES primary) contains source code that is
a mixture of Fortran 77 and C and/or C++, then it requires that the
macro AC_F77_LIBRARY_LDFLAGS be called in configure.ac, and that
either $(FLIBS) appear in the appropriate _LDADD (for programs) or
_LIBADD (for shared libraries) variables. It is the responsibility of
the person writing the Makefile.am to make sure that $(FLIBS)
appears in the appropriate _LDADD or _LIBADD variable.

For example, consider the following Makefile.am:

 bin_PROGRAMS = foo
 foo_SOURCES  = main.cc foo.f
 foo_LDADD    = libfoo.la $(FLIBS)

 pkglib_LTLIBRARIES = libfoo.la
 libfoo_la_SOURCES  = bar.f baz.c zardoz.cc
 libfoo_la_LIBADD   = $(FLIBS)

In this case, Automake will insist that AC_F77_LIBRARY_LDFLAGS is
mentioned in configure.ac. Also, if $(FLIBS) hadn't been mentioned in foo_LDADD and libfoo_la_LIBADD, then Automake would have issued a
warning.

(1) For example, the cfortran package
(http://www-zeus.desy.de/~burow/cfortran/) addresses all of these
inter-language issues, and runs under nearly all Fortran 77, C and C++
compilers on nearly all platforms. However, cfortran is not yet Free
Software, but it will be in the next major release.

8.14.3.1 How the Linker is Chosen
……………………………

When a program or library mixes several languages, Automake chooses the
linker according to the following priorities. (The names in parentheses
are the variables containing the link command.)

  1. Native Java (GCJLINK)
  2. Objective C++ (OBJCXXLINK)
  3. C++ (CXXLINK)
  4. Fortran 77 (F77LINK)
  5. Fortran (FCLINK)
  6. Objective C (OBJCLINK)
  7. Unified Parallel C (UPCLINK)
  8. C (LINK)

For example, if Fortran 77, C and C++ source code is compiled into a
program, then the C++ linker will be used. In this case, if the C or
Fortran 77 linkers required any special libraries that weren't included by the C++ linker, then they must be manually added to an _LDADD or _LIBADD variable by the user writing the Makefile.am.

Automake only looks at the file names listed in _SOURCES variables
to choose the linker, and defaults to the C linker. Sometimes this is
inconvenient because you are linking against a library written in
another language and would like to set the linker more appropriately.
*Note Libtool Convenience Libraries::, for a trick with
nodist_EXTRA_..._SOURCES.

A per-target _LINK variable will override the above selection.
Per-target link flags will cause Automake to write a per-target _LINK
variable according to the language chosen as above.

8.15 Fortran 9x Support
=======================

Automake includes support for Fortran 9x.

Any package including Fortran 9x code must define the output variable
FC in configure.ac; the simplest way to do this is to use the
AC_PROG_FC macro (*note Particular Program Checks:
(autoconf)Particular Programs.).

A few additional variables are defined when a Fortran 9x source file
is seen:

FC
​ The name of the Fortran 9x compiler.

FCFLAGS
​ Any flags to pass to the Fortran 9x compiler.

AM_FCFLAGS
​ The maintainer's variant of FCFLAGS.

FCCOMPILE
​ The command used to actually compile a Fortran 9x source file. The
​ file name is appended to form the complete command line.

FCLINK
​ The command used to actually link a pure Fortran 9x program or
​ shared library.

8.15.1 Compiling Fortran 9x Files


FILE.o is made automatically from FILE.f90, FILE.f95, FILE.f03,
or FILE.f08 by running the Fortran 9x compiler. The precise command
used is as follows:

.f90
$(FC) $(AM_FCFLAGS) $(FCFLAGS) -c $(FCFLAGS_f90) $<

.f95
$(FC) $(AM_FCFLAGS) $(FCFLAGS) -c $(FCFLAGS_f95) $<

.f03
$(FC) $(AM_FCFLAGS) $(FCFLAGS) -c $(FCFLAGS_f03) $<

.f08
$(FC) $(AM_FCFLAGS) $(FCFLAGS) -c $(FCFLAGS_f08) $<

8.16 Compiling Java sources using gcj

Automake includes support for natively compiled Java, using gcj, the
Java front end to the GNU Compiler Collection (rudimentary support for
compiling Java to bytecode using the javac compiler is also present,
albeit deprecated; *note Java::).

Any package including Java code to be compiled must define the output
variable GCJ in configure.ac; the variable GCJFLAGS must also be
defined somehow (either in configure.ac or Makefile.am). The
simplest way to do this is to use the AM_PROG_GCJ macro.

By default, programs including Java source files are linked with
gcj.

As always, the contents of AM_GCJFLAGS are passed to every
compilation invoking gcj (in its role as an ahead-of-time compiler;
when invoking it to create .class files, AM_JAVACFLAGS is used
instead). If it is necessary to pass options to gcj from
Makefile.am, this variable, and not the user variable GCJFLAGS,
should be used.

gcj can be used to compile .java, .class, .zip, or .jar
files.

When linking, gcj requires that the main class be specified using
the --main= option. The easiest way to do this is to use the
_LDFLAGS variable for the program.
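
For example, assuming a hypothetical class Hello that contains
main():

 # configure.ac
 AM_PROG_GCJ

 # Makefile.am
 bin_PROGRAMS = hello
 hello_SOURCES = Hello.java
 hello_LDFLAGS = --main=Hello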

8.17 Vala Support
=================

Automake provides initial support for Vala
(http://www.vala-project.org/). This requires valac version 0.7.0 or
later, and currently requires the user to use GNU make.

 foo_SOURCES = foo.vala bar.vala zardoc.c

Any .vala file listed in a _SOURCES variable will be compiled
into C code by the Vala compiler. The generated .c files are
distributed. The end user does not need to have a Vala compiler
installed.

Automake ships with an Autoconf macro called AM_PROG_VALAC that
will locate the Vala compiler and optionally check its version number.
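
For instance, to require at least Vala 0.7.0 and make configure fail,
rather than merely warn, when it is missing (the arguments are
detailed below):

 # configure.ac
 AM_PROG_VALAC([0.7.0], [],
               [AC_MSG_ERROR([valac 0.7.0 or later is required])])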

– Macro: AM_PROG_VALAC ([MINIMUM-VERSION], [ACTION-IF-FOUND],
​ [ACTION-IF-NOT-FOUND]) Search for a Vala compiler in PATH. If it
​ is found, the variable VALAC is set to point to it (see below for
​ more details). This macro takes three optional arguments. The
​ first argument, if present, is the minimum version of the Vala
​ compiler required to compile this package. If a compiler is found
​ and satisfies MINIMUM-VERSION, then ACTION-IF-FOUND is run (this
​ defaults to do nothing). Otherwise, ACTION-IF-NOT-FOUND is run.
​ If ACTION-IF-NOT-FOUND is not specified, the default value is to
​ print a warning in case no compiler is found, or if a too-old
​ version of the compiler is found.

There are a few variables that are used when compiling Vala sources:

VALAC
​ Absolute path to the Vala compiler, or simply valac if no
​ suitable Vala compiler could be found at configure runtime.

VALAFLAGS
​ Additional arguments for the Vala compiler.

AM_VALAFLAGS
​ The maintainer's variant of VALAFLAGS.

      lib_LTLIBRARIES = libfoo.la
      libfoo_la_SOURCES = foo.vala

Note that currently, you cannot use per-target *_VALAFLAGS (*note
Renamed Objects::) to produce different C files from one Vala source
file.

8.18 Support for Other Languages

Automake currently only includes full support for C, C++ (*note C++
Support::), Objective C (*note Objective C Support::), Objective C++
(*note Objective C++ Support::), Fortran 77 (*note Fortran 77
Support::), Fortran 9x (*note Fortran 9x Support::), and Java (*note
Java Support with gcj::). There is only rudimentary support for other
languages, support for which will be improved based on user demand.

Some limited support for adding your own languages is available via
the suffix rule handling (*note Suffixes::).

8.19 Automatic dependency tracking

As a developer it is often painful to continually update the
Makefile.am whenever the include-file dependencies change in a
project. Automake supplies a way to automatically track dependency
changes (*note Dependency Tracking::).

Automake always uses complete dependencies for a compilation,
including system headers. Automake's model is that dependency
computation should be a side effect of the build. To this end,
dependencies are computed by running all compilations through a
special wrapper program called depcomp. depcomp understands how to
coax many different C and C++ compilers into generating dependency
information in the format it requires. automake -a will install
depcomp into your source tree for you. If depcomp can't figure out
how to properly invoke your compiler, dependency tracking will simply
be disabled for your build.

Experience with earlier versions of Automake (*note Dependency
Tracking Evolution: (automake-history)Dependency Tracking Evolution.)
taught us that it is not reliable to generate dependencies only on the
maintainer's system, as configurations vary too much. So instead
Automake implements dependency tracking at build time.

Automatic dependency tracking can be suppressed by putting
no-dependencies in the variable AUTOMAKE_OPTIONS, or passing
no-dependencies as an argument to AM_INIT_AUTOMAKE (this should be
the preferred way). Or, you can invoke automake with the -i option.
Dependency tracking is enabled by default.
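
For instance, either of the following disables it (the
AM_INIT_AUTOMAKE form being the preferred one):

 # configure.ac
 AM_INIT_AUTOMAKE([no-dependencies])

 # Makefile.am
 AUTOMAKE_OPTIONS = no-dependencies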

The person building your package also can choose to disable
dependency tracking by configuring with --disable-dependency-tracking.

8.20 Support for executable extensions

On some platforms, such as Windows, executables are expected to have an
extension such as .exe. On these platforms, some compilers (GCC among
them) will automatically generate foo.exe when asked to generate
foo.

Automake provides mostly-transparent support for this. Unfortunately
mostly doesn't yet mean fully. Until the English dictionary is
revised, you will have to assist Automake if your package must support
those platforms.

One thing you must be aware of is that, internally, Automake rewrites
something like this:

 bin_PROGRAMS = liver

to this:

 bin_PROGRAMS = liver$(EXEEXT)

The targets Automake generates are likewise given the $(EXEEXT)
extension.

The variables TESTS and XFAIL_TESTS (*note Simple Tests::) are
also rewritten if they contain filenames that have been declared as
programs in the same Makefile. (This is mostly useful when some
programs from check_PROGRAMS are listed in TESTS.)

However, Automake cannot apply this rewriting to configure
substitutions. This means that if you are conditionally building a
program using such a substitution, then your configure.ac must take
care to add $(EXEEXT) when constructing the output variable.
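
A sketch of such a substitution (the tool name and shell variable are
hypothetical):

 # configure.ac
 if test "$want_mytool" = yes; then
   MYTOOLS='mytool$(EXEEXT)'     # note the explicit $(EXEEXT)
 fi
 AC_SUBST([MYTOOLS])

 # Makefile.am
 EXTRA_PROGRAMS = mytool
 bin_PROGRAMS = $(MYTOOLS)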

Sometimes maintainers like to write an explicit link rule for their
program. Without executable extension support, this is easy—you simply
write a rule whose target is the name of the program. However, when
executable extension support is enabled, you must instead add the
$(EXEEXT) suffix.

This might be a nuisance for maintainers who know their package will
never run on a platform that has executable extensions. For those
maintainers, the no-exeext option (*note Options::) will disable this
feature. This works in a fairly ugly way; if no-exeext is seen, then
the presence of a rule for a target named foo in Makefile.am will
override an automake-generated rule for foo$(EXEEXT). Without the
no-exeext option, this use will give a diagnostic.


7 Directories


For simple projects that distribute all files in the same directory it
is enough to have a single ‘Makefile.am’ that builds everything in
place.

In larger projects, it is common to organize files in different
directories, in a tree. For example, there could be a directory for the
program’s source, one for the testsuite, and one for the documentation;
or, for very large projects, there could be one directory per program,
per library or per module.

The traditional approach is to build these subdirectories
recursively, employing make recursion: each directory contains its own
‘Makefile’, and when ‘make’ is run from the top-level directory, it
enters each subdirectory in turn, and invokes there a new ‘make’
instance to build the directory’s contents.

Because this approach is very widespread, Automake offers built-in
support for it. However, it is worth noting that the use of make
recursion has its own serious issues and drawbacks, and that it's well
possible to have packages with a multi-directory layout that make
little or no use of such recursion (examples of such packages are GNU
Bison and GNU Automake itself); see also *note An Alternative Approach
to Subdirectories::.

7.1 Recursing subdirectories

In packages using make recursion, the top level ‘Makefile.am’ must tell
Automake which subdirectories are to be built. This is done via the
‘SUBDIRS’ variable.

The ‘SUBDIRS’ variable holds a list of subdirectories in which
building of various sorts can occur. The rules for many targets (e.g.,
‘all’) in the generated ‘Makefile’ will run commands both locally and in
all specified subdirectories. Note that the directories listed in
‘SUBDIRS’ are not required to contain ‘Makefile.am’s; only ‘Makefile’s
(after configuration). This allows inclusion of libraries from packages
that do not use Automake (such as ‘gettext’; see also

In packages that use subdirectories, the top-level ‘Makefile.am’ is
often very short. For instance, here is the ‘Makefile.am’ from the GNU
Hello distribution:

 EXTRA_DIST = BUGS ChangeLog.O README-alpha
 SUBDIRS = doc intl po src tests

When Automake invokes ‘make’ in a subdirectory, it uses the value of
the ‘MAKE’ variable. It passes the value of the variable ‘AM_MAKEFLAGS’
to the ‘make’ invocation; this can be set in ‘Makefile.am’ if there are
flags you must always pass to ‘make’.

The directories mentioned in ‘SUBDIRS’ are usually direct children of
the current directory, each subdirectory containing its own
‘Makefile.am’ with a ‘SUBDIRS’ pointing to deeper subdirectories.
Automake can be used to construct packages of arbitrary depth this way.

By default, Automake generates ‘Makefiles’ that work depth-first in
postfix order: the subdirectories are built before the current
directory. However, it is possible to change this ordering. You can do
this by putting ‘.’ into ‘SUBDIRS’. For instance, putting ‘.’ first
will cause a prefix ordering of directories.

Using

 SUBDIRS = lib src . test

will cause ‘lib/’ to be built before ‘src/’, then the current directory
will be built, finally the ‘test/’ directory will be built. It is
customary to arrange test directories to be built after everything else
since they are meant to test what has been constructed.

In addition to the built-in recursive targets defined by Automake
(‘all’, ‘check’, etc.), the developer can also define his own recursive
targets. That is done by passing the names of such targets as arguments
to the m4 macro ‘AM_EXTRA_RECURSIVE_TARGETS’ in ‘configure.ac’.
Automake generates rules to handle the recursion for such targets; and
the developer can define real actions for them by defining corresponding
‘-local’ targets.

 % cat configure.ac
 AC_INIT([pkg-name], [1.0])
 AM_INIT_AUTOMAKE
 AM_EXTRA_RECURSIVE_TARGETS([foo])
 AC_CONFIG_FILES([Makefile sub/Makefile sub/src/Makefile])
 AC_OUTPUT
 % cat Makefile.am
 SUBDIRS = sub
 foo-local:
         @echo This will be run by "make foo".
 % cat sub/Makefile.am
 SUBDIRS = src
 % cat sub/src/Makefile.am
 foo-local:
         @echo This too will be run by a "make foo" issued either in
         @echo the 'sub/src/' directory, the 'sub/' directory, or the
         @echo top-level directory.

7.2 Conditional Subdirectories

It is possible to define the ‘SUBDIRS’ variable conditionally if, like
in the case of GNU Inetutils, you want to only build a subset of the
entire package.

To illustrate how this works, let’s assume we have two directories
‘src/’ and ‘opt/’. ‘src/’ should always be built, but we want to decide
in ‘configure’ whether ‘opt/’ will be built or not. (For this example
we will assume that ‘opt/’ should be built when the variable ‘$want_opt’
was set to ‘yes’.)

Running ‘make’ should thus recurse into ‘src/’ always, and then maybe
in ‘opt/’.

However ‘make dist’ should always recurse into both ‘src/’ and
‘opt/’. Because ‘opt/’ should be distributed even if it is not needed
in the current configuration. This means ‘opt/Makefile’ should be
created unconditionally.

There are two ways to set up a project like this. You can use
Automake conditionals or Autoconf ‘AC_SUBST’ variables. Using Automake
conditionals is the preferred solution.
Before we illustrate these two possibilities, let’s introduce
‘DIST_SUBDIRS’.

7.2.1 ‘SUBDIRS’ vs. ‘DIST_SUBDIRS’

Automake considers two sets of directories, defined by the variables
‘SUBDIRS’ and ‘DIST_SUBDIRS’.

‘SUBDIRS’ contains the subdirectories of the current directory that
must be built. It must be defined manually;
Automake will never guess a directory is to be built. As we will see in
the next two sections, it is possible to define it conditionally so that
some directory will be omitted from the build.

‘DIST_SUBDIRS’ is used in rules that need to recurse in all
directories, even those that have been conditionally left out of the
build. Recall our example where we may not want to build subdirectory
‘opt/’, but yet we want to distribute it? This is where ‘DIST_SUBDIRS’
comes into play: ‘opt’ may not appear in ‘SUBDIRS’, but it must appear
in ‘DIST_SUBDIRS’.

Precisely, ‘DIST_SUBDIRS’ is used by ‘make maintainer-clean’, ‘make
distclean’ and ‘make dist’. All other recursive rules use ‘SUBDIRS’.

If ‘SUBDIRS’ is defined conditionally using Automake conditionals,
Automake will define ‘DIST_SUBDIRS’ automatically from the possible
values of ‘SUBDIRS’ in all conditions.

If ‘SUBDIRS’ contains ‘AC_SUBST’ variables, ‘DIST_SUBDIRS’ will not
be defined correctly because Automake does not know the possible values
of these variables. In this case ‘DIST_SUBDIRS’ needs to be defined
manually.

7.2.2 Subdirectories with ‘AM_CONDITIONAL’

‘configure’ should output the ‘Makefile’ for each directory and define
a condition stating whether ‘opt/’ should be built.

 ...
 AM_CONDITIONAL([COND_OPT], [test "$want_opt" = yes])
 AC_CONFIG_FILES([Makefile src/Makefile opt/Makefile])
 ...

Then ‘SUBDIRS’ can be defined in the top-level ‘Makefile.am’ as
follows.

 if COND_OPT
   MAYBE_OPT = opt
 endif
 SUBDIRS = src $(MAYBE_OPT)

As you can see, running ‘make’ will rightly recurse into ‘src/’ and
maybe ‘opt/’.

As you can’t see, running ‘make dist’ will recurse into both ‘src/’
and ‘opt/’ directories because ‘make dist’, unlike ‘make all’, doesn’t
use the ‘SUBDIRS’ variable. It uses the ‘DIST_SUBDIRS’ variable.

In this case Automake will define ‘DIST_SUBDIRS = src opt’
automatically because it knows that ‘MAYBE_OPT’ can contain ‘opt’ in
some condition.

7.2.3 Subdirectories with ‘AC_SUBST’

Another possibility is to define ‘MAYBE_OPT’ from ‘./configure’ using
‘AC_SUBST’:

 ...
 if test "$want_opt" = yes; then
   MAYBE_OPT=opt
 else
   MAYBE_OPT=
 fi
 AC_SUBST([MAYBE_OPT])
 AC_CONFIG_FILES([Makefile src/Makefile opt/Makefile])
 ...

In this case the top-level ‘Makefile.am’ should look as follows.

 SUBDIRS = src $(MAYBE_OPT)
 DIST_SUBDIRS = src opt

The drawback is that since Automake cannot guess what the possible
values of ‘MAYBE_OPT’ are, it is necessary to define ‘DIST_SUBDIRS’.

7.2.4 Unconfigured Subdirectories

The semantics of ‘DIST_SUBDIRS’ are often misunderstood by some users
that try to configure and build subdirectories conditionally. Here by
configuring we mean creating the ‘Makefile’ (it might also involve
running a nested ‘configure’ script: this is a costly operation that
explains why people want to do it conditionally, but only the ‘Makefile’
is relevant to the discussion).

The above examples all assume that every ‘Makefile’ is created, even
in directories that are not going to be built. The simple reason is
that we want ‘make dist’ to distribute even the directories that are not
being built (e.g., platform-dependent code), hence ‘make dist’ must
recurse into the subdirectory, hence this directory must be configured
and appear in ‘DIST_SUBDIRS’.

Building packages that do not configure every subdirectory is a
tricky business, and we do not recommend it to the novice as it is easy
to produce an incomplete tarball by mistake. We will not discuss this
topic in depth here, yet for the adventurous here are a few rules to
remember.

• ‘SUBDIRS’ should always be a subset of ‘DIST_SUBDIRS’.

 It makes little sense to have a directory in ‘SUBDIRS’ that is not
 in ‘DIST_SUBDIRS’.  Think of the former as a way to tell which
 directories listed in the latter should be built.

• Any directory listed in ‘DIST_SUBDIRS’ and ‘SUBDIRS’ must be
configured.

 I.e., the ‘Makefile’ must exist or the recursive ‘make’ rules will
 not be able to process the directory.

• Any configured directory must be listed in ‘DIST_SUBDIRS’.

 So that the cleaning rules remove the generated ‘Makefile’s.  It
 would be correct to see ‘DIST_SUBDIRS’ as a variable that lists all
 the directories that have been configured.

In order to prevent recursion in some unconfigured directory you must
therefore ensure that this directory does not appear in ‘DIST_SUBDIRS’
(and ‘SUBDIRS’). For instance, if you define ‘SUBDIRS’ conditionally
using ‘AC_SUBST’ and do not define ‘DIST_SUBDIRS’ explicitly, it will
default to ‘$(SUBDIRS)’; another possibility is to force ‘DIST_SUBDIRS =
$(SUBDIRS)’.
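
A minimal sketch of the forced form, assuming MY_SUBDIRS is the
AC_SUBST variable holding the configured list of directories:

 SUBDIRS = $(MY_SUBDIRS)
 DIST_SUBDIRS = $(SUBDIRS)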

Of course, directories that are omitted from ‘DIST_SUBDIRS’ will not
be distributed unless you make other arrangements for this to happen
(for instance, always running ‘make dist’ in a configuration where all
directories are known to appear in ‘DIST_SUBDIRS’; or writing a
‘dist-hook’ target to distribute these directories).
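
For instance, a dist-hook along these lines (the directory name win32/
is only an example) could copy an unconfigured directory into the
distribution:

 dist-hook:
         cp -pR $(srcdir)/win32 $(distdir)/win32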

In a few packages, unconfigured directories are not even expected to be
distributed. Although these packages do not require the aforementioned
extra arrangements, there is another pitfall. If the name of a
directory appears in ‘SUBDIRS’ or ‘DIST_SUBDIRS’, ‘automake’ will make
sure the directory exists. Consequently ‘automake’ cannot be run on
such a distribution when one directory has been omitted. One way to
avoid this check is to use the ‘AC_SUBST’ method to declare conditional
directories; since ‘automake’ does not know the values of ‘AC_SUBST’
variables it cannot ensure the corresponding directory exists.

7.3 An Alternative Approach to Subdirectories

If you’ve ever read Peter Miller’s excellent paper, Recursive Make
Considered Harmful (http://miller.emu.id.au/pmiller/books/rmch/), the
preceding sections on the use of make recursion will probably come as
unwelcome advice. For those who haven’t read the paper, Miller’s main
thesis is that recursive ‘make’ invocations are both slow and
error-prone.

Automake provides sufficient cross-directory support (1) to enable
you to write a single ‘Makefile.am’ for a complex multi-directory
package.

By default an installable file specified in a subdirectory will have
its directory name stripped before installation. For instance, in this
example, the header file will be installed as ‘$(includedir)/stdio.h’:

 include_HEADERS = inc/stdio.h

However, the ‘nobase_’ prefix can be used to circumvent this path
stripping. In this example, the header file will be installed as
‘$(includedir)/sys/types.h’:

 nobase_include_HEADERS = sys/types.h

‘nobase_’ should be specified first when used in conjunction with
either ‘dist_’ or ‘nodist_’. For instance:

 nobase_dist_pkgdata_DATA = images/vortex.pgm sounds/whirl.ogg

Finally, note that a variable using the ‘nobase_’ prefix can often be
replaced by several variables, one for each destination directory. For
instance, the last example could be rewritten as follows:

 imagesdir = $(pkgdatadir)/images
 soundsdir = $(pkgdatadir)/sounds
 dist_images_DATA = images/vortex.pgm
 dist_sounds_DATA = sounds/whirl.ogg

This latter syntax makes it possible to change one destination directory
without changing the layout of the source tree.

Currently, ‘nobase_*_LTLIBRARIES’ are the only exception to this
rule, in that there is no particular installation order guarantee for an
otherwise equivalent set of variables without ‘nobase_’ prefix.

———- Footnotes ———-

(1) We believe. This work is new and there are probably warts.

7.4 Nesting Packages

In the GNU Build System, packages can be nested to arbitrary depth.
This means that a package can embed other packages with their own
‘configure’, ‘Makefile’s, etc.

These other packages should just appear as subdirectories of their
parent package. They must be listed in ‘SUBDIRS’ like other ordinary
directories. However the subpackage’s ‘Makefile’s should be output by
its own ‘configure’ script, not by the parent’s ‘configure’. This is
achieved using the ‘AC_CONFIG_SUBDIRS’ Autoconf macro.

Here is an example package for an ‘arm’ program that links with a
‘hand’ library that is a nested package in subdirectory ‘hand/’.

‘arm’’s ‘configure.ac’:

 AC_INIT([arm], [1.0])
 AC_CONFIG_AUX_DIR([.])
 AM_INIT_AUTOMAKE
 AC_PROG_CC
 AC_CONFIG_FILES([Makefile])
 # Call hand's ./configure script recursively.
 AC_CONFIG_SUBDIRS([hand])
 AC_OUTPUT

‘arm’’s ‘Makefile.am’:

 # Build the library in the hand subdirectory first.
 SUBDIRS = hand

 # Include hand's header when compiling this directory.
 AM_CPPFLAGS = -I$(srcdir)/hand

 bin_PROGRAMS = arm
 arm_SOURCES = arm.c
 # link with the hand library.
 arm_LDADD = hand/libhand.a

Now here is ‘hand’’s ‘hand/configure.ac’:

 AC_INIT([hand], [1.2])
 AC_CONFIG_AUX_DIR([.])
 AM_INIT_AUTOMAKE
 AC_PROG_CC
 AM_PROG_AR
 AC_PROG_RANLIB
 AC_CONFIG_FILES([Makefile])
 AC_OUTPUT

and its ‘hand/Makefile.am’:

 lib_LIBRARIES = libhand.a
 libhand_a_SOURCES = hand.c

When ‘make dist’ is run from the top-level directory it will create
an archive ‘arm-1.0.tar.gz’ that contains the ‘arm’ code as well as the
‘hand’ subdirectory. This package can be built and installed like any
ordinary package, with the usual ‘./configure && make && make install’
sequence (the ‘hand’ subpackage will be built and installed by the
process).

When ‘make dist’ is run from the hand directory, it will create a
self-contained ‘hand-1.2.tar.gz’ archive. So although it appears to be
embedded in another package, it can still be used separately.

The purpose of the ‘AC_CONFIG_AUX_DIR([.])’ instruction is to force
Automake and Autoconf to search for auxiliary scripts in the current
directory. For instance, this means that there will be two copies of
‘install-sh’: one in the top-level of the ‘arm’ package, and another one
in the ‘hand/’ subdirectory for the ‘hand’ package.

The historical default is to search for these auxiliary scripts in
the parent directory and the grandparent directory. So if the
‘AC_CONFIG_AUX_DIR([.])’ line was removed from ‘hand/configure.ac’, that
subpackage would share the auxiliary script of the ‘arm’ package. This
may look like a gain in size (a few kilobytes), but it is actually a
loss of modularity as the ‘hand’ subpackage is no longer self-contained
(‘make dist’ in the subdirectory will not work anymore).

Packages that do not use Automake need more work to be integrated
this way.


6 Scanning configure.ac, using aclocal


Automake scans the package's configure.ac to determine certain
information about the package. Some autoconf macros must be defined in
that file.

6.1 Configuration requirements


The one macro that Automake absolutely requires is AM_INIT_AUTOMAKE.

Besides that, a few other macros are also needed, as shown below:

AC_CONFIG_FILES
AC_OUTPUT
 These two macros are usually placed at the very end of the file.

      AC_CONFIG_FILES([
        Makefile
        doc/Makefile
        src/Makefile
        src/lib/Makefile
        ...
      ])
      AC_OUTPUT

Automake uses this information to determine which files need to be
created: given the list above, if a directory contains a Makefile.am,
the corresponding Makefile will be generated.
When using AC_CONFIG_FILES with multiple input files, as in
AC_CONFIG_FILES([Makefile:top.in:Makefile.in:bot.in])

automake will generate the first .in input file for which a
.am file exists. If no such file exists the output file is not
considered to be generated by Automake.

Files created by AC_CONFIG_FILES are all removed by make distclean.
Their inputs are
automatically distributed, unless they are the output of prior
AC_CONFIG_FILES commands. Finally, rebuild rules are generated
in the Automake Makefile existing in the subdirectory of the
output file, if there is one, or in the top-level Makefile
otherwise.

The above machinery (cleaning, distributing, and rebuilding) works
fine if the AC_CONFIG_FILES specifications contain only literals.
If part of the specification uses shell variables, automake will
not be able to fulfill this setup, and you will have to complete
the missing bits by hand. For instance, on

      file=input
      ...
      AC_CONFIG_FILES([output:$file],, [file=$file])

automake will output rules to clean output, and rebuild it.
However the rebuild rule will not depend on input, and this file
will not be distributed either. (You must add EXTRA_DIST = input
to your Makefile.am if input is a source file.)

Similarly

 file=output
 file2=out:in
 ...
 AC_CONFIG_FILES([$file:input],, [file=$file])
 AC_CONFIG_FILES([$file2],, [file2=$file2])

will only cause input to be distributed. No file will be cleaned
automatically (add DISTCLEANFILES = output out yourself), and no
rebuild rule will be output.

Obviously automake cannot guess what value $file is going to
hold later when configure is run, and it cannot use the shell
variable $file in a Makefile. However, if you make reference
to $file as ${file} (i.e., in a way that is compatible with
make's syntax) and furthermore use AC_SUBST to ensure that ${file} is
meaningful in a Makefile, then automake will be able to use ${file} to
generate all of these rules. For
instance, here is how the Automake package itself generates
versioned scripts for its test suite:

 AC_SUBST([APIVERSION], ...)
 ...
 AC_CONFIG_FILES(
   [tests/aclocal-${APIVERSION}:tests/aclocal.in],
   [chmod +x tests/aclocal-${APIVERSION}],
   [APIVERSION=$APIVERSION])
 AC_CONFIG_FILES(
   [tests/automake-${APIVERSION}:tests/automake.in],
   [chmod +x tests/automake-${APIVERSION}])

Here cleaning, distributing, and rebuilding are done automatically,
because ${APIVERSION} is known at make-time.

Note that you should not use shell variables to declare Makefile
files for which automake must create Makefile.in. Even
AC_SUBST does not help here, because automake needs to know the
file name when it runs in order to check whether Makefile.am
exists. (In the very hairy case that your setup requires such use
of variables, you will have to tell Automake which Makefile.ins
to generate on the command-line.)

It is possible to let automake emit conditional rules for
AC_CONFIG_FILES with the help of AM_COND_IF .

To summarize:
• Use literals for Makefiles, and for other files whenever
​ possible.
• Use $file (or ${file} without AC_SUBST([file])) for
​ files that automake should ignore.
• Use ${file} and AC_SUBST([file]) for files that automake
​ should not ignore.

6.2 Other things Automake recognizes


Every time Automake is run it calls Autoconf to trace configure.ac.
This way it can recognize the use of certain macros and tailor the
generated Makefile.in appropriately. Currently recognized macros and
their effects are:

AC_CANONICAL_BUILD
AC_CANONICAL_HOST
AC_CANONICAL_TARGET
​ Automake will ensure that config.guess and config.sub exist.
​ Also, the Makefile variables build_triplet, host_triplet and
target_triplet are introduced.

AC_CONFIG_AUX_DIR
​ Automake will look for various helper scripts, such as
install-sh, in the directory named in this macro invocation.
​ (The full list of scripts is: ar-lib, config.guess,
config.sub, depcomp, compile, install-sh, ltmain.sh,
mdate-sh, missing, mkinstalldirs, py-compile,
test-driver, texinfo.tex, ylwrap.) Not all scripts are
​ always searched for; some scripts will only be sought if the
​ generated Makefile.in requires them.

 If `AC_CONFIG_AUX_DIR` is not given, the scripts are looked for in
 their standard locations.  For `mdate-sh`, `texinfo.tex`, and
 `ylwrap`, the standard location is the source directory
 corresponding to the current `Makefile.am`.  For the rest, the
 standard location is the first one of `.`, `..`, or `../..`
 (relative to the top source directory) that provides any one of the
 helper scripts.  

 Required files from `AC_CONFIG_AUX_DIR` are automatically
 distributed, even if there is no `Makefile.am` in this directory.

AC_CONFIG_LIBOBJ_DIR
​ Automake will require the sources file declared with AC_LIBSOURCE
​ (see below) in the directory specified by this macro.

AC_CONFIG_HEADERS
​ Automake will generate rules to rebuild these headers from the
​ corresponding templates (usually, the template for a foo.h header
​ being foo.h.in). Older versions of Automake required the use of
AM_CONFIG_HEADER; this is no longer the case, and that macro has
​ indeed been removed.

 As with `AC_CONFIG_FILES` , parts of the
 specification using shell variables will be ignored as far as
 cleaning, distributing, and rebuilding is concerned.

AC_CONFIG_LINKS
​ Automake will generate rules to remove configure generated links
​ on make distclean and to distribute named source files as part of
make dist.

 As for `AC_CONFIG_FILES` parts of the
 specification using shell variables will be ignored as far as
 cleaning and distributing is concerned.  (There are no rebuild
 rules for links.)

AC_LIBOBJ
AC_LIBSOURCE
AC_LIBSOURCES
​ Automake will automatically distribute any file listed in
AC_LIBSOURCE or AC_LIBSOURCES.

 Note that the `AC_LIBOBJ` macro calls `AC_LIBSOURCE`.  So if an
 Autoconf macro is documented to call `AC_LIBOBJ([file])`, then
 `file.c` will be distributed automatically by Automake.  This
 encompasses many macros like `AC_FUNC_ALLOCA`, `AC_FUNC_MEMCMP`,
 `AC_REPLACE_FUNCS`, and others.

 By the way, direct assignments to `LIBOBJS` are no longer
 supported.  You should always use `AC_LIBOBJ` for this purpose.
 Note `AC_LIBOBJ` vs. `LIBOBJS`: (autoconf)AC_LIBOBJ vs LIBOBJS.

AC_PROG_RANLIB
​ This is required if any libraries are built in the package. Note
​ Particular Program Checks: (autoconf)Particular Programs.

AC_PROG_CXX
​ This is required if any C++ source is included. Note Particular
​ Program Checks: (autoconf)Particular Programs.

AC_PROG_OBJC
​ This is required if any Objective C source is included. Note
​ Particular Program Checks: (autoconf)Particular Programs.

AC_PROG_OBJCXX
​ This is required if any Objective C++ source is included. Note
​ Particular Program Checks: (autoconf)Particular Programs.

AC_PROG_F77
​ This is required if any Fortran 77 source is included. Note
​ Particular Program Checks: (autoconf)Particular Programs.

AC_F77_LIBRARY_LDFLAGS
​ This is required for programs and shared libraries that are a
​ mixture of languages that include Fortran 77 .

AC_FC_SRCEXT
​ Automake will add the flags computed by AC_FC_SRCEXT to
​ compilation of files with the respective source extension

AC_PROG_FC
​ This is required if any Fortran 90/95 source is included. This
​ macro is distributed with Autoconf version 2.58 and later. Note
​ Particular Program Checks: (autoconf)Particular Programs.

AC_PROG_LIBTOOL
​ Automake will turn on processing for libtool

AC_PROG_YACC
​ If a Yacc source file is seen, then you must either use this macro
​ or define the variable YACC in configure.ac. The former is
​ preferred

AC_PROG_LEX
​ If a Lex source file is seen, then this macro must be used. Note
​ Particular Program Checks: (autoconf)Particular Programs.

AC_REQUIRE_AUX_FILE
​ For each AC_REQUIRE_AUX_FILE([FILE]), automake will ensure that
FILE exists in the aux directory, and will complain otherwise.
​ It will also automatically distribute the file. This macro should
​ be used by third-party Autoconf macros that require some supporting
​ files in the aux directory specified with AC_CONFIG_AUX_DIR
​ above. Note Finding configure Input: (autoconf)Input.

AC_SUBST
​ The first argument is automatically defined as a variable in each
​ generated Makefile.in, unless AM_SUBST_NOTMAKE is also used for
​ this variable. Note Setting Output Variables: (autoconf)Setting
​ Output Variables.

 For every substituted variable VAR, `automake` will add a line `VAR
 = VALUE` to each `Makefile.in` file.  Many Autoconf macros invoke
 `AC_SUBST` to set output variables this way, e.g., `AC_PATH_XTRA`
 defines `X_CFLAGS` and `X_LIBS`.  Thus, you can access these
 variables as `$(X_CFLAGS)` and `$(X_LIBS)` in any `Makefile.am` if
 `AC_PATH_XTRA` is called.

AM_CONDITIONAL
​ This introduces an Automake conditional

AM_COND_IF
​ This macro allows automake to detect subsequent access within
configure.ac to a conditional previously introduced with
AM_CONDITIONAL, thus enabling conditional AC_CONFIG_FILES

AM_GNU_GETTEXT
 This macro is required for packages that use GNU gettext.  It is
 distributed with gettext.  If Automake sees this macro it ensures
 that the package meets some of gettext's requirements.

AM_GNU_GETTEXT_INTL_SUBDIR
​ This macro specifies that the intl/ subdirectory is to be built,
​ even if the AM_GNU_GETTEXT macro was invoked with a first
​ argument of external.

AM_MAINTAINER_MODE([DEFAULT-MODE])
​ This macro adds an --enable-maintainer-mode option to
configure. If this is used, automake will cause
​ “maintainer-only” rules to be turned off by default in the
​ generated Makefile.ins, unless DEFAULT-MODE is enable. This
​ macro defines the MAINTAINER_MODE conditional, which you can use
​ in your own Makefile.am. Note maintainer-mode::.

AM_SUBST_NOTMAKE(VAR)
​ Prevent Automake from defining a variable VAR, even if it is
​ substituted by config.status. Normally, Automake defines a
make variable for each configure substitution, i.e., for each
AC_SUBST([VAR]). This macro prevents that definition from
​ Automake. If AC_SUBST has not been called for this variable,
​ then AM_SUBST_NOTMAKE has no effects. Preventing variable
​ definitions may be useful for substitution of multi-line values,
​ where VAR = @VALUE@ might yield unintended results.

m4_include
​ Files included by configure.ac using this macro will be detected
​ by Automake and automatically distributed. They will also appear
​ as dependencies in Makefile rules.

 `m4_include` is seldom used by `configure.ac` authors, but can
 appear in `aclocal.m4` when `aclocal` detects that some required
 macros come from files local to your package (as opposed to macros
 installed in a system-wide directory, note aclocal Invocation::).

6.3 Auto-generating aclocal.m4


Automake includes a number of Autoconf macros that can be used in your
package (Macros::); some of them are actually required by Automake
in certain situations. These macros must be defined in your
aclocal.m4; otherwise they will not be seen by autoconf.

The aclocal program automatically generates aclocal.m4 from the
contents of configure.ac. This provides a convenient way
to get Automake-provided macros, without having to search around. The
aclocal mechanism allows other packages to supply their own macros
(Extending aclocal::). You can also use it to maintain your own
set of custom macros (Local Macros::).

At startup, aclocal scans all the .m4 files it can find, looking
for macro definitions (Macro Search Path::). Then it scans
configure.ac. Any mention of one of the macros found in the first
step causes that macro, and any macros it in turn requires, to be put
into aclocal.m4.

Putting the file that contains the macro definition into
aclocal.m4 is usually done by copying the entire text of this file,
including unused macro definitions as well as both # and dnl
comments. If you want to make a comment that will be completely ignored
by aclocal, use ## as the comment leader.
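
As a small sketch (the macro name AX_EXAMPLE is made up):

 ## These two lines are dropped by aclocal and never reach aclocal.m4.
 ## Keep internal maintenance notes here.
 # This comment, like dnl comments, is copied into aclocal.m4.
 AC_DEFUN([AX_EXAMPLE], [...])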

When a file selected by aclocal is located in a subdirectory
specified as a relative search path with aclocal's -I argument,
aclocal assumes the file belongs to the package and uses m4_include
instead of copying it into aclocal.m4. This makes the package smaller,
eases dependency tracking, and causes the file to be distributed
automatically. (Note Local Macros::, for an example.) Any macro that
is found in a system-wide directory, or via an absolute search path,
will be copied. So use -I `pwd`/reldir instead of -I reldir whenever
some relative directory should be considered outside the package.

The contents of acinclude.m4, if this file exists, are also
automatically included in aclocal.m4. We recommend against using
acinclude.m4 in new packages (note Local Macros::).

While computing aclocal.m4, aclocal runs autom4te (note Using
Autom4te: (autoconf)Using autom4te.) in order to trace the macros that
are really used, and omit from aclocal.m4 all macros that are
mentioned but otherwise unexpanded (this can happen when a macro is
called conditionally). autom4te is expected to be in the PATH, just
as autoconf. Its location can be overridden using the AUTOM4TE
environment variable.

6.3.1 aclocal Options



aclocal accepts the following options:

--automake-acdir=DIR
​ Look for the automake-provided macro files in DIR instead of in the
​ installation directory. This is typically used for debugging.

 The environment variable `ACLOCAL_AUTOMAKE_DIR` provides another
 way to set the directory containing automake-provided macro files.
 However `--automake-acdir` takes precedence over it.

--system-acdir=DIR
​ Look for the system-wide third-party macro files (and the special
dirlist file) in DIR instead of in the installation directory.
​ This is typically used for debugging.

--diff[=COMMAND]
​ Run COMMAND on M4 file that would be installed or overwritten by
--install. The default COMMAND is diff -u. This option
​ implies --install and --dry-run.

--dry-run
​ Do not actually overwrite (or create) aclocal.m4 and M4 files
​ installed by --install.

--help
​ Print a summary of the command line options and exit.

-I DIR
​ Add the directory DIR to the list of directories searched for .m4
​ files.

--install
​ Install system-wide third-party macros into the first directory
​ specified with -I DIR instead of copying them in the output file.
​ Note that this will happen also if DIR is an absolute path.

 When this option is used, and only when this option is used,
 `aclocal` will also honor `#serial NUMBER` lines that appear in
 macros: an M4 file is ignored if there exists another M4 file with
 the same basename and a greater serial number in the search path
 (note Serials::).

--force
​ Always overwrite the output file. The default is to overwrite the
​ output file only when really needed, i.e., when its contents
​ changes or if one of its dependencies is younger.

 This option forces the update of `aclocal.m4` (or the file
 specified with `--output` below) and only this file, it has
 absolutely no influence on files that may need to be installed by
 `--install`.

--output=FILE
​ Cause the output to be put into FILE instead of aclocal.m4.

--print-ac-dir
​ Prints the name of the directory that aclocal will search to find
​ third-party .m4 files. When this option is given, normal
​ processing is suppressed. This option was used in the past by
​ third-party packages to determine where to install .m4 macro
​ files, but this usage is today discouraged, since it causes
$(prefix) not to be thoroughly honored (which violates the GNU
​ Coding Standards), and a similar semantics can be better obtained
​ with the ACLOCAL_PATH environment variable; note Extending
​ aclocal::.

--verbose
​ Print the names of the files it examines.

--version
​ Print the version number of Automake and exit.

-W CATEGORY
--warnings=CATEGORY
​ Output warnings falling in CATEGORY. CATEGORY can be one of:
syntax
​ dubious syntactic constructs, underquoted macros, unused
​ macros, etc.
unsupported
​ unknown macros
all
​ all the warnings, this is the default
none
​ turn off all the warnings
error
​ treat warnings as errors

 All warnings are output by default.

 The environment variable `WARNINGS` is honored in the same way as
 it is for `automake` (note automake Invocation::).

6.3.2 Macro Search Path



By default, aclocal searches for .m4 files in the following
directories, in this order:

ACDIR-APIVERSION
​ This is where the .m4 macros distributed with Automake itself are
​ stored. APIVERSION depends on the Automake release used; for
​ example, for Automake 1.11.x, APIVERSION = 1.11.

ACDIR
​ This directory is intended for third party .m4 files, and is
​ configured when automake itself is built. This is
@datadir@/aclocal/, which typically expands to
${prefix}/share/aclocal/. To find the compiled-in value of
​ ACDIR, use the --print-ac-dir option (note aclocal Options::).

As an example, suppose that automake-1.11.2 was configured with
--prefix=/usr/local. Then, the search path would be:

  1. /usr/local/share/aclocal-1.11.2/
  2. /usr/local/share/aclocal/

The paths for the ACDIR and ACDIR-APIVERSION directories can be
changed respectively through aclocal options --system-acdir and
--automake-acdir (note aclocal Options::). Note however that these
options are only intended for use by the internal Automake test suite,
or for debugging under highly unusual situations; they are not
ordinarily needed by end-users.

As explained in (note aclocal Options::), there are several options
that can be used to change or extend this search path.

Modifying the Macro Search Path: -I DIR
…………………………………..

Any extra directories specified using -I options (note aclocal
Options::) are prepended to this search list. Thus, aclocal -I /foo -I /bar results in the following search path:

  1. /foo
  2. /bar
  3. ACDIR-APIVERSION
  4. ACDIR

Modifying the Macro Search Path: dirlist
……………………………………

There is a third mechanism for customizing the search path. If a
dirlist file exists in ACDIR, then that file is assumed to contain a
list of directory patterns, one per line. aclocal expands these
patterns to directory names, and adds them to the search list after
all other directories. dirlist entries may use shell wildcards such
as *, ?, or [...].

For example, suppose ACDIR/dirlist contains the following:

 /test1
 /test2
 /test3*

and that aclocal was called with the -I /foo -I /bar options. Then,
the search path would be

  1. /foo
  2. /bar
  3. ACDIR-APIVERSION
  4. ACDIR
  5. /test1
  6. /test2

and all directories with path names starting with /test3.

If the --system-acdir=DIR option is used, then aclocal will
search for the dirlist file in DIR; but remember the warnings above
against the use of --system-acdir.

dirlist is useful in the following situation: suppose that
automake version 1.11.2 is installed with --prefix=/usr by the
system vendor. Thus, the default search directories are

  1. /usr/share/aclocal-1.11/
  2. /usr/share/aclocal/

However, suppose further that many packages have been manually
installed on the system, with $prefix=/usr/local, as is typical. In
that case, many of these “extra” .m4 files are in
/usr/local/share/aclocal. The only way to force /usr/bin/aclocal to
find these “extra” .m4 files is to always call aclocal -I /usr/local/share/aclocal. This is inconvenient. With dirlist, one
may create a file /usr/share/aclocal/dirlist containing only the
single line

 /usr/local/share/aclocal

Now, the “default” search path on the affected system is

  1. /usr/share/aclocal-1.11/
  2. /usr/share/aclocal/
  3. /usr/local/share/aclocal/

without the need for -I options; -I options can be reserved for
project-specific needs (my-source-dir/m4/), rather than using it to
work around local system-dependent tool installation directories.

Similarly, dirlist can be handy if you have installed a local copy
of Automake in your account and want aclocal to look for macros
installed at other places on the system.

Modifying the Macro Search Path: ACLOCAL_PATH
………………………………………..

The fourth and last mechanism to customize the macro search path is also
the simplest. Any directory included in the colon-separated environment
variable ACLOCAL_PATH is added to the search path and takes precedence
over system directories (including those found via dirlist), with the
exception of the versioned directory ACDIR-APIVERSION (note Macro
Search Path::). However, directories passed via -I will take
precedence over directories in ACLOCAL_PATH.
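
For example, assuming your extra macros live under a hypothetical
$HOME/share/aclocal, you might run:

 ACLOCAL_PATH=$HOME/share/aclocal aclocal -I m4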

Also note that, if the --install option is used, any .m4 file
containing a required macro that is found in a directory listed in
ACLOCAL_PATH will be installed locally. In this case, serial numbers
in .m4 are honored too, note Serials::.

Conversely to dirlist, ACLOCAL_PATH is useful if you are using a
global copy of Automake and want aclocal to look for macros somewhere
under your home directory.

Planned future incompatibilities
…………………………..

The order in which the directories in the macro search path are
currently looked up is confusing and/or suboptimal in various aspects,
and is probably going to be changed in a future Automake release. In
particular, directories in ACLOCAL_PATH and ACDIR might end up
taking precedence over ACDIR-APIVERSION, and directories in
ACDIR/dirlist might end up taking precedence over ACDIR. This is a
possible future incompatibility!

6.3.3 Writing your own aclocal macros



The aclocal program doesn't have any built-in knowledge of any macros,
so it is easy to extend it with your own macros.

This can be used by libraries that want to supply their own Autoconf
macros for use by other programs. For instance, the gettext library
supplies a macro AM_GNU_GETTEXT that should be used by any package
using gettext. When the library is installed, it installs this macro
so that aclocal will find it.

A macro file's name should end in .m4. Such files should be installed
in $(datadir)/aclocal. This is as simple as writing:

 aclocaldir = $(datadir)/aclocal
 aclocal_DATA = mymacro.m4 myothermacro.m4

Please do use $(datadir)/aclocal, and not something based on the
result of aclocal --print-ac-dir (note Hard-Coded Install Paths::,
for arguments). It might also be helpful to suggest to the user to add
the $(datadir)/aclocal directory to his ACLOCAL_PATH variable (note
ACLOCAL_PATH::) so that aclocal will find the .m4 files installed by
your package automatically.

A file of macros should be a series of properly quoted AC_DEFUN's
(note (autoconf)Macro Definitions::). The aclocal program also
understands AC_REQUIRE (note (autoconf)Prerequisite Macros::), so it
is safe to put each macro in a separate file. Each file should have no
side effects but macro definitions. Especially, any call to AC_PREREQ
should be done inside the defined macro, not at the beginning of the
file.

Starting with Automake 1.8, aclocal will warn about all underquoted
calls to AC_DEFUN. We realize this will annoy a lot of people,
because aclocal was not so strict in the past and many third party
macros are underquoted; and we have to apologize for this temporary
inconvenience. The reason we have to be stricter is that a future
implementation of aclocal (note Future of aclocal::) will have to
temporarily include all of these third party .m4 files, maybe several
times, including even files that are not actually needed. Doing so
should alleviate many problems of the current implementation, however it
requires a stricter style from the macro authors. Hopefully it is easy
to revise the existing macros. For instance,

 # bad style
 AC_PREREQ(2.68)
 AC_DEFUN(AX_FOOBAR,
 [AC_REQUIRE([AX_SOMETHING])dnl
 AX_FOO
 AX_BAR
 ])

should be rewritten as

 AC_DEFUN([AX_FOOBAR],
 [AC_PREREQ([2.68])dnl
 AC_REQUIRE([AX_SOMETHING])dnl
 AX_FOO
 AX_BAR
 ])

Wrapping the AC_PREREQ call inside the macro ensures that Autoconf
2.68 will not be required if AX_FOOBAR is not actually used. Most
importantly, quoting the first argument of AC_DEFUN allows the macro
to be redefined or included twice (otherwise this first argument would
be expanded during the second definition). For consistency we like to
quote even arguments such as 2.68 that do not require it.

If you have been directed here by the aclocal diagnostic but are
not the maintainer of the implicated macro, you will want to contact the
maintainer of that macro. Please make sure you have the latest version
of the macro and that the problem hasnt already been reported before doing so: people tend to work faster when they arent flooded by mails.

Another situation where aclocal is commonly used is to manage
macros that are used locally by the package, note Local Macros::.

6.3.4 Handling Local Macros



Feature tests offered by Autoconf do not cover all needs. People often
have to supplement existing tests with their own macros, or with
third-party macros.

There are two ways to organize custom macros in a package.

The first possibility (the historical practice) is to list all your
macros in acinclude.m4. This file will be included in aclocal.m4
when you run aclocal, and its macro(s) will henceforth be visible to
autoconf. However if it contains numerous macros, it will rapidly
become difficult to maintain, and it will be almost impossible to share
macros between packages.

The second possibility, which we do recommend, is to write each macro
in its own file and gather all these files in a directory. This
directory is usually called m4/. Then it is enough to update
configure.ac by adding a proper call to AC_CONFIG_MACRO_DIRS:

 AC_CONFIG_MACRO_DIRS([m4])

aclocal will then take care of automatically adding m4/ to its
search path for m4 files.

When aclocal is run, it will build an aclocal.m4 that
m4_includes any file from m4/ that defines a required macro. Macros
not found locally will still be searched in system-wide directories, as
explained in note Macro Search Path::.

Custom macros should be distributed for the same reason that
configure.ac is: so that other people have all the sources of your
package if they want to work on it. Actually, this distribution happens
automatically because all m4_included files are distributed.

However there is no consensus on the distribution of third-party
macros that your package may use. Many libraries install their own
macro in the system-wide aclocal directory (note Extending
aclocal::). For instance, Guile ships with a file called guile.m4
that contains the macro GUILE_FLAGS that can be used to define setup
compiler and linker flags appropriate for using Guile. Using
GUILE_FLAGS in configure.ac will cause aclocal to copy guile.m4
into aclocal.m4, but as guile.m4 is not part of the project, it will
not be distributed. Technically, that means a user who needs to rebuild
aclocal.m4 will have to install Guile first. This is probably OK, if
Guile already is a requirement to build the package. However, if Guile
is only an optional feature, or if your package might run on
architectures where Guile cannot be installed, this requirement will
hinder development. An easy solution is to copy such third-party macros
in your local m4/ directory so they get distributed.

Since Automake 1.10, aclocal offers the option --install to copy
these system-wide third-party macros in your local macro directory,
helping to solve the above problem.

With this setup, system-wide macros will be copied to m4/ the first
time you run aclocal. Then the locally installed macros will have
precedence over the system-wide installed macros each time aclocal is
run again.

One reason why you should keep --install in the flags even after
the first run is that when you later edit configure.ac and depend on a
new macro, this macro will be installed in your m4/ automatically.
Another one is that serial numbers (note Serials::) can be used to
update the macros in your source tree automatically when new system-wide
versions are installed. A serial number should be a single line of the
form

 #serial NNN

where NNN contains only digits and dots. It should appear in the M4
file before any macro definition. It is a good practice to maintain a
serial number for each macro you distribute, even if you do not use the
--install option of aclocal: this allows other people to use it.

6.3.5 Serial Numbers



Because third-party macros defined in *.m4 files are naturally shared
between multiple projects, some people like to version them. This makes
it easier to tell which of two M4 files is newer. Since at least 1996,
the tradition is to use a #serial line for this.

A serial number should be a single line of the form

 # serial VERSION

where VERSION is a version number containing only digits and dots.
Usually people use a single integer, and they increment it each time
they change the macro (hence the name of “serial”). Such a line should
appear in the M4 file before any macro definition.

The # must be the first character on the line, and it is OK to have
extra words after the version, as in

 #serial VERSION GARBAGE

Normally these serial numbers are completely ignored by aclocal and
autoconf, like any genuine comment. However when using aclocal's
--install feature, these serial numbers will modify the way aclocal
selects the macros to install in the package: if two files with the
same basename exist in your search path, and if at least one of them
uses a #serial line, aclocal will ignore the file that has the older
#serial line (or the file that has none).

Note that a serial number applies to a whole M4 file, not to any
macro it contains. A file can contain multiple macros, but only one
serial.
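
So a file like the following (macro names invented) carries a single
serial that covers both macros:

 # serial 4
 AC_DEFUN([AX_FOO], [...])
 AC_DEFUN([AX_BAR], [...])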

Here is a use case that illustrates the use of --install and its
interaction with serial numbers. Let's assume we maintain a package
called MyPackage, the configure.ac of which requires a third-party
macro AX_THIRD_PARTY defined in /usr/share/aclocal/thirdparty.m4 as
follows:

 # serial 1
 AC_DEFUN([AX_THIRD_PARTY], [...])

MyPackage uses an m4/ directory to store local macros as explained
in note Local Macros::, and has

 AC_CONFIG_MACRO_DIRS([m4])

in its configure.ac.

Initially the m4/ directory is empty. The first time we run
aclocal --install, it will notice that

• configure.ac uses AX_THIRD_PARTY
• No local macros define AX_THIRD_PARTY
• /usr/share/aclocal/thirdparty.m4 defines AX_THIRD_PARTY with
  serial 1.

Because /usr/share/aclocal/thirdparty.m4 is a system-wide macro and
aclocal was given the --install option, it will copy this file in
m4/thirdparty.m4, and output an aclocal.m4 that contains
m4_include([m4/thirdparty.m4]).

The next time aclocal --install is run, something different
happens. aclocal notices that

• configure.ac uses AX_THIRD_PARTY
• m4/thirdparty.m4 defines AX_THIRD_PARTY with serial 1.
• /usr/share/aclocal/thirdparty.m4 defines AX_THIRD_PARTY with
  serial 1.

Because both files have the same serial number, aclocal uses the first
it found in its search path order (note Macro Search Path::).
aclocal therefore ignores /usr/share/aclocal/thirdparty.m4 and
outputs an aclocal.m4 that contains m4_include([m4/thirdparty.m4]).

Local directories specified with -I are always searched before
system-wide directories, so a local file will always be preferred to the
system-wide file in case of equal serial numbers.

Now suppose the system-wide third-party macro is changed. This can
happen if the package installing this macro is updated. Let's suppose
the new macro has serial number 2. The next time aclocal --install is
run the situation is the following:

• configure.ac uses AX_THIRD_PARTY
• m4/thirdparty.m4 defines AX_THIRD_PARTY with serial 1.
• /usr/share/aclocal/thirdparty.m4 defines AX_THIRD_PARTY with
  serial 2.

When aclocal sees a greater serial number, it immediately forgets
anything it knows from files that have the same basename and a smaller
serial number. So after it has found /usr/share/aclocal/thirdparty.m4
with serial 2, aclocal will proceed as if it had never seen
m4/thirdparty.m4. This brings us back to a situation similar to that
at the beginning of our example, where no local file defined the macro.
aclocal will install the new version of the macro in
m4/thirdparty.m4, in this case overriding the old version. MyPackage
just had its macro updated as a side effect of running aclocal.

If you are leery of letting aclocal update your local macro, you
can run aclocal --diff to review the changes aclocal --install would
perform on these macros.
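
For example, in a package whose local macros live in m4/, something
like the following shows the pending updates without modifying any
file (recall that --diff implies --install and --dry-run):

 aclocal -I m4 --diff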

Finally, note that the --force option of aclocal has absolutely
no effect on the files installed by --install. For instance, if you
have modified your local macros, do not expect --install --force to
replace the local macros by their system-wide versions. If you want to
do so, simply erase the local macros you want to revert, and run
aclocal --install.

6.3.6 The Future of aclocal



aclocal is expected to disappear. This feature really should not be
offered by Automake. Automake should focus on generating Makefiles;
dealing with M4 macros really is Autoconf's job. The fact that some
people install Automake just to use aclocal, but do not use automake
otherwise is an indication of how that feature is misplaced.

The new implementation will probably be done slightly differently.
For instance, it could enforce the m4/-style layout discussed in note
Local Macros::.

We have no idea when and how this will happen. This has been
discussed several times in the past, but someone still has to commit to
that non-trivial task.

From the user point of view, aclocal's removal might turn out to be
painful. There is a simple precaution that you may take to make that
switch more seamless: never call aclocal yourself. Keep this guy under
the exclusive control of autoreconf and Automake's rebuild rules.
Hopefully you won't need to worry about things breaking, when aclocal
disappears, because everything will have been taken care of. If
otherwise you used to call aclocal directly yourself or from some
script, you will quickly notice the change.

Many packages come with a script called bootstrap or autogen.sh,
that will just call aclocal, libtoolize, gettextize or
autopoint, autoconf, autoheader, and automake in the right
order. Actually this is precisely what autoreconf can do for you. If
your package has such a bootstrap or autogen.sh script, consider
using autoreconf. That should simplify its logic a lot (less things
to maintain, yum!), it's even likely you will not need the script
anymore, and more to the point you will not call aclocal directly
anymore.
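
Under the assumption that configure.ac and Makefile.am are already
autoreconf-ready, a minimal replacement for such a script could be as
short as:

 #!/bin/sh
 # Let autoreconf run aclocal, autoconf, autoheader, automake, and
 # friends in the right order.
 exec autoreconf --install --verbose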

For the time being, third-party packages should continue to install
public macros into /usr/share/aclocal/. If aclocal is replaced by
another tool it might make sense to rename the directory, but supporting
/usr/share/aclocal/ for backward compatibility should be really easy
provided all macros are properly written (note Extending aclocal::).

6.4 Autoconf macros supplied with Automake


Automake ships with several Autoconf macros that you can use from your
configure.ac. When you use one of them it will be included by
aclocal in aclocal.m4.

6.4.1 Public Macros



AM_INIT_AUTOMAKE([OPTIONS])
​ Runs many macros required for proper operation of the generated
​ Makefiles.

 Today, `AM_INIT_AUTOMAKE` is called with a single argument: a
 space-separated list of Automake options that should be applied to
 every `Makefile.am` in the tree.  The effect is as if each option
 were listed in `AUTOMAKE_OPTIONS` (note Options::).

 This macro can also be called in another, _deprecated_ form:
 `AM_INIT_AUTOMAKE(PACKAGE, VERSION, [NO-DEFINE])`.  In this form,
 there are two required arguments: the package and the version
 number.  This usage is mostly obsolete because the PACKAGE and
 VERSION can be obtained from Autoconf's `AC_INIT` macro.  However,
 differently from what happens for `AC_INIT` invocations, this
 `AM_INIT_AUTOMAKE` invocation supports shell variables' expansions
 in the `PACKAGE` and `VERSION` arguments (which otherwise defaults,
 respectively, to the `PACKAGE_TARNAME` and `PACKAGE_VERSION`
 defined via the `AC_INIT` invocation; note The `AC_INIT` macro:
 (autoconf)AC_INIT.); and this can be still be useful in some
 selected situations.  Our hope is that future Autoconf versions
 will improve their support for package versions defined dynamically
 at configure runtime; when (and if) this happens, support for the
 two-args `AM_INIT_AUTOMAKE` invocation will likely be removed from
 Automake.

 If your `configure.ac` has:

      AC_INIT([src/foo.c])
      AM_INIT_AUTOMAKE([mumble], [1.5])

 you should modernize it as follows:

      AC_INIT([mumble], [1.5])
      AC_CONFIG_SRCDIR([src/foo.c])
      AM_INIT_AUTOMAKE

 Note that if you're upgrading your `configure.ac` from an earlier
 version of Automake, it is not always correct to simply move the
 package and version arguments from `AM_INIT_AUTOMAKE` directly to
 `AC_INIT`, as in the example above.  The first argument to
 `AC_INIT` should be the name of your package (e.g., `GNU
 Automake`), not the tarball name (e.g., `automake`) that you used
 to pass to `AM_INIT_AUTOMAKE`.  Autoconf tries to derive a tarball
 name from the package name, which should work for most but not all
 package names.  (If it doesn't work for yours, you can use the
 four-argument form of `AC_INIT` to provide the tarball name
 explicitly).

 By default this macro `AC_DEFINE`s `PACKAGE` and `VERSION`.  This
 can be avoided by passing the `no-define` option (note List of
 Automake options::):
      AM_INIT_AUTOMAKE([no-define ...])

AM_PATH_LISPDIR
 Searches for the program emacs, and, if found, sets the output
 variable lispdir to the full path to Emacs' site-lisp directory.

 Note that this test assumes the `emacs` found to be a version that
 supports Emacs Lisp (such as GNU Emacs or XEmacs).  Other emacsen
 can cause this test to hang (some, like old versions of MicroEmacs,
 start up in interactive mode, requiring `C-x C-c` to exit, which is
 hardly obvious for a non-emacs user).  In most cases, however, you
 should be able to use `C-c` to kill the test.  In order to avoid
 problems, you can set `EMACS` to “no” in the environment, or use
 the `--with-lispdir` option to `configure` to explicitly set the
 correct path (if you're sure you have an `emacs` that supports
 Emacs Lisp).

AM_PROG_AR([ACT-IF-FAIL])
​ You must use this macro when you use the archiver in your project,
​ if you want support for unusual archivers such as Microsoft lib.
​ The content of the optional argument is executed if the archiver
​ interface is not recognized; the default action is to abort
​ configure with an error message.

AM_PROG_AS
​ Use this macro when you have assembly code in your project. This
​ will choose the assembler for you (by default the C compiler) and
​ set CCAS, and will also set CCASFLAGS if required.

AM_PROG_CC_C_O
​ This is an obsolescent macro that checks that the C compiler
​ supports the -c and -o options together. Note that, since
​ Automake 1.14, the AC_PROG_CC is rewritten to implement such
​ checks itself, and thus the explicit use of AM_PROG_CC_C_O should
​ no longer be required.

AM_PROG_LEX
​ Like AC_PROG_LEX (note Particular Program Checks:
​ (autoconf)Particular Programs.), but uses the missing script on
​ systems that do not have lex. HP-UX 10 is one such system.

AM_PROG_GCJ
​ This macro finds the gcj program or causes an error. It sets
GCJ and GCJFLAGS. gcj is the Java front-end to the GNU
​ Compiler Collection.

AM_PROG_UPC([COMPILER-SEARCH-LIST])
​ Find a compiler for Unified Parallel C and define the UPC
​ variable. The default COMPILER-SEARCH-LIST is upcc upc. This
​ macro will abort configure if no Unified Parallel C compiler is
​ found.

AM_MISSING_PROG(NAME, PROGRAM)
 Find a maintainer tool PROGRAM and define the NAME environment
 variable with its location.  If PROGRAM is not detected, then NAME
 will instead invoke the missing script, in order to give useful
 advice to the user about the missing maintainer tool.  See the
 description of the missing script for more information on when it is
 appropriate.

AM_SILENT_RULES
​ Control the machinery for less verbose build output (note Automake
​ Silent Rules::).

AM_WITH_DMALLOC
​ Add support for the Dmalloc package (http://dmalloc.com/). If the
​ user runs configure with --with-dmalloc, then define
WITH_DMALLOC and add -ldmalloc to LIBS.

6.4.2 Obsolete Macros



Although using some of the following macros was required in past
releases, you should not use any of them in new code. All these macros
will be removed in the next major Automake version; if you are still
using them, running autoupdate should adjust your configure.ac
automatically (note Using autoupdate to Modernize configure.ac:
(autoconf)autoupdate Invocation.). Do it NOW!

AM_PROG_MKDIR_P

 From Automake 1.8 to 1.9.6 this macro used to define the output
 variable `mkdir_p` to one of `mkdir -p`, `install-sh -d`, or
 `mkinstalldirs`.

 Nowadays Autoconf provides a similar functionality with
 `AC_PROG_MKDIR_P` (note Particular Program Checks:
 (autoconf)Particular Programs.), however this defines the output
 variable `MKDIR_P` instead.  In case you are still using the
 `AM_PROG_MKDIR_P` macro in your `configure.ac`, or its provided
 variable `$(mkdir_p)` in your `Makefile.am`, you are advised to
 switch ASAP to the more modern Autoconf-provided interface instead;
 both the macro and the variable might be removed in a future major
 Automake release.

6.4.3 Private Macros
***********************

The following macros are private macros you should not call directly.
They are called by the other public macros when appropriate. Do not
rely on them, as they might be changed in a future version. Consider
them as implementation details; or better, do not consider them at all:
skip this section!

_AM_DEPENDENCIES
AM_SET_DEPDIR
AM_DEP_TRACK
AM_OUTPUT_DEPENDENCY_COMMANDS
 These macros are used to implement Automake's automatic dependency
 tracking scheme. They are called automatically by Automake when
 required, and there should be no need to invoke them manually.

AM_MAKE_INCLUDE
 This macro is used to discover how the user's make handles include
 statements. It is automatically invoked when needed; there should
 be no need to invoke it manually.

AM_PROG_INSTALL_STRIP
​ This is used to find a version of install that can be used to
​ strip a program at installation time. This macro is automatically
​ included when required.

AM_SANITY_CHECK
​ This checks to make sure that a file created in the build directory
​ is newer than a file in the source directory. This can fail on
​ systems where the clock is set incorrectly. This macro is
​ automatically run from AM_INIT_AUTOMAKE.


5 Creating a Makefile.in


To create a Makefile.in for a package, simply run automake in the top-level directory. automake scans configure.ac to find every Makefile.am and generates a corresponding Makefile.in for each of them.

Note that automake assumes a package has only one configure.ac, so if your package contains several configure.ac files you must run automake in each directory that holds one, or use Autoconf's autoreconf, which traverses all of the directories automatically.

Also note that automake must be run in the top-level directory, i.e. the directory that contains configure.ac.

automake scans configure.ac and its dependencies by running autoconf, so autoconf must be found on your PATH. If the AUTOCONF environment variable is set, its value is run in place of the default autoconf, which is useful when you need a specific version.

Note: automake only uses autoconf to scan configure.ac; it does not build configure, so you still need to run autoconf for that yourself.
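
A minimal sketch of the usual regeneration sequence (autoreconf --install is the single-command alternative that runs all of these tools in the right order):

$ aclocal                  # gather the required m4 macros into aclocal.m4
$ autoconf                 # generate configure from configure.ac
$ automake --add-missing   # generate Makefile.in from each Makefile.am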

automake accepts the following options:

-a
--add-missing
 Automake requires certain common files to exist in certain
 situations; for instance, config.guess is required if
 configure.ac invokes AC_CANONICAL_HOST. Automake is distributed
 with several of these files; this option will cause the missing
 ones to be automatically added to the package, whenever possible.
 In general, if Automake tells you a file is missing, try using this
 option. By default Automake tries to make a symbolic link pointing
 to its own copy of the missing file; this can be changed with --copy.

 Many of the potentially-missing files are common scripts whose
 location may be specified via the `AC_CONFIG_AUX_DIR` macro.
 Therefore, the `AC_CONFIG_AUX_DIR` setting affects whether a file is
 considered missing, and where the missing file is added.

 In some strictness modes, additional files are installed as well.

--libdir=DIR
​ Look for Automake data files in directory DIR instead of in the
​ installation directory. This is typically used for debugging.

 The environment variable `AUTOMAKE_LIBDIR` provides another way to
 set the directory containing Automake data files.  However
 `--libdir` takes precedence over it.

--print-libdir
​ Print the path of the installation directory containing
​ Automake-provided scripts and data files (like e.g., texinfo.texi
​ and install-sh).

-c
--copy
 When used with --add-missing, copy the missing files instead of
 creating symbolic links (symlinking is the default).

-f
--force-missing
​ When used with --add-missing, causes standard files to be
​ reinstalled even if they already exist in the source tree. This
​ involves removing the file from the source tree before creating the
​ new symlink (or, with --copy, copying the new file).

--foreign
 Set the strictness to foreign.

--gnits
 Set the strictness to gnits.

--gnu
 Set the strictness to gnu.

--help
 Print a summary of the command-line options and exit.

-i
--ignore-deps
 This disables the dependency tracking feature in generated
 Makefiles.

--include-deps
​ This enables the dependency tracking feature. This feature is
​ enabled by default. This option is provided for historical reasons
​ only and probably should not be used.

--no-force
​ Ordinarily automake creates all Makefile.ins mentioned in
configure.ac. This option causes it to only update those
Makefile.ins that are out of date with respect to one of their
​ dependents.

-o DIR
--output-dir=DIR
​ Put the generated Makefile.in in the directory DIR. Ordinarily
​ each Makefile.in is created in the directory of the corresponding
Makefile.am. This option is deprecated and will be removed in a
​ future release.

-v
--verbose
​ Cause Automake to print information about which files are being
​ read or created.

--version
 Print the version number of Automake and exit.

-W CATEGORY
--warnings=CATEGORY
 Output warnings falling in CATEGORY. CATEGORY can be one of:
gnu
 warnings related to the GNU Coding Standards
obsolete
​ obsolete features or constructions
override
​ user redefinitions of Automake rules or variables
portability
​ portability issues (e.g., use of make features that are
​ known to be not portable)
extra-portability
​ extra portability issues related to obscure tools. One
​ example of such a tool is the Microsoft lib archiver.
syntax
​ weird syntax, unused variables, typos
unsupported
​ unsupported or incomplete features
all
​ all the warnings
none
​ turn off all the warnings
error
​ treat warnings as errors

 A category can be turned off by prefixing its name with `no-`.  For
 instance, `-Wno-syntax` will hide the warnings about unused
 variables.

 The categories output by default are `obsolete`, `syntax` and
 `unsupported`.  Additionally, `gnu` and `portability` are enabled
 in `--gnu` and `--gnits` strictness.

 Turning off `portability` will also turn off `extra-portability`,
 and similarly turning on `extra-portability` will also turn on
 `portability`.  However, turning on `portability` or turning off
 `extra-portability` will not affect the other category.

 The environment variable `WARNINGS` can contain a comma-separated
 list of categories to enable.  It will be taken into account before
 the command-line switches; this way `-Wnone` will also ignore any
 warning category enabled by `WARNINGS`.  This variable is also used
 by other tools like `autoconf`; unknown categories are ignored for
 this reason.
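
For example, to regenerate the build files with every warning category enabled except the obsolete-construct ones (a sketch, not a required incantation):

$ automake --add-missing --copy -Wall -Wno-obsolete
$ WARNINGS=portability,error automake   # the same mechanism via the environment variable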

If the environment variable AUTOMAKE_JOBS contains a positive
number, it is taken as the maximum number of Perl threads to use in
automake for generating multiple Makefile.in files concurrently.
This is an experimental feature.
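
Since this is just an environment variable, trying it out is a one-liner (a sketch; the thread count 4 is arbitrary):

$ AUTOMAKE_JOBS=4 automake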

循序渐进学Docker (Learning Docker Step by Step) - Reading Notes

Chapter 1: Getting to Know Docker

Docker uses a container engine to solve the platform-dependency problem: a Docker daemon runs on every host, hides the platform-specific details, and presents a uniform interface to the applications above it.

Java once promised "Write once, run anywhere"; Docker promises "Build once, run anywhere, run anything".

So Docker is, in essence, a tool for managing software deployment: an application is packaged into an image, the image is versioned so that every change corresponds to a new version, and a finished image can be published to a registry for others to use.
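
A rough sketch of that workflow with the standard Docker CLI (the image name myapp and the registry host registry.example.com are made-up placeholders):

$ docker build -t myapp:1.0 .                    # package the app and its dependencies into an image
$ docker tag myapp:1.0 registry.example.com/myapp:1.0
$ docker push registry.example.com/myapp:1.0     # publish the versioned image to a registry
$ docker run -d registry.example.com/myapp:1.0   # run it on any host with a Docker daemon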

Chapter 2: A First Taste of Docker

Chapter 3: Using Docker on Ubuntu

Docker was born and grew up on Ubuntu, so its newest features are developed and tested on Ubuntu first, which makes Ubuntu the operating system with the best Docker support.

That said, Docker now supports all the major operating systems; for production environments I would still use RHEL or CentOS, and the Red Hat family suits me fine.

Note that if you do not want to type sudo before every command, you need to add your user (here called user) to the docker group:

$ sudo usermod -aG docker user

Here user is the user name; the change takes effect after a restart.

See sameersbn/docker-gitlab to learn how to set up a GitLab environment with Docker.

Chapter 4: Docker Fundamentals

Chapter 5: Docker Container Management

Chapter 6: Docker Image Management

Chapter 7: Docker Registry Management

Chapter 8: Docker Networking and Storage Management

Chapter 9: Day-to-Day Maintenance of Docker Projects

Chapter 10: Docker Swarm Container Clusters

Chapter 11: Docker Plugin Development

Chapter 12: Application Cases of Docker for Offline Systems

Chapter 13: Etcd, Cadvisor and Kubernetes in Practice

Chapter 14: Building Highly Available and Auto-Discovering Docker Architectures

Chapter 15: Docker Overlay Networks in Practice

Chapter 16: Exploring the Docker Source Code

Plan for Actively Initiating and Organizing International Big Science Programs and Big Science Projects

Actively proposing and leading the organization of international big science programs and projects is a major decision and deployment made by the CPC Central Committee and the State Council. This plan is formulated to ensure their sound organization and implementation.

I. Significance

International big science programs and projects (hereinafter "big science programs") are an important means by which humanity pushes the frontiers of knowledge, explores the unknown and solves major global problems, and an important reflection of a country's overall strength and competitiveness in scientific and technological innovation. Leading the organization of big science programs is a hallmark of building an innovative country and a world power in science and technology, and is of positive and far-reaching significance for strengthening China's capacity for innovation and its voice in international affairs.

(1) Leading big science programs is a powerful tool for solving key global scientific problems. Big science programs aim at original breakthroughs on major scientific questions and extend basic research across the scientific frontier; they provide important support for advancing scientific and technological progress worldwide and for meeting the common challenges facing human society. Taking the lead allows China to play a guiding role, contribute Chinese wisdom, Chinese proposals and a Chinese voice to the solution of major scientific problems, supply global public goods, and make a positive contribution to the development of world civilization.

(2) Leading big science programs is a high-end platform for gathering the world's best scientific and technological resources. It helps attract and concentrate top talent from around the world; train and produce leading scientists, high-level discipline leaders, core researchers, engineers and managers recognized by their international peers; form management teams and mechanisms of international standard; build high-end platforms for scientific experimentation and collaborative innovation; and move China's scientific and technological innovation from mainly following to keeping pace and leading.

(3) Leading big science programs is an important part of building a global innovation governance system. Big science programs play an important role in optimizing the global allocation of scientific and technological resources and improving the innovation governance system, and have become an important topic in international scientific cooperation. As an important avenue of science diplomacy, leading such programs helps build a new type of international relations centered on win-win cooperation and a global network of partnerships, and plays a positive role in implementing the country's overall diplomatic strategy.

II. General Requirements

(1) Guiding ideology.

Fully implement the spirit of the 19th CPC National Congress, take Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era as the guide, and carry out the spirit of the National Science and Technology Innovation Conference. Advance the "five-sphere integrated plan" and the "four-pronged comprehensive strategy" in a coordinated way, and firmly establish and apply the development concepts of innovation, coordination, green development, openness and sharing. In line with the overall requirements of the Outline of the National Innovation-Driven Development Strategy and the overall diplomatic agenda, adhere to the approach of Chinese leadership, forward-looking planning, step-by-step advancement and acting within our means; plan open scientific and technological cooperation with a global vision, implement the Belt and Road Initiative, and follow the principle of joint consultation, joint construction and shared benefits. Actively initiate and organize big science programs, focus on raising innovation capability and international influence in strategic frontier fields, build new open platforms for cooperative innovation, and help shape a new pattern of global innovation governance and a community with a shared future for mankind, so as to provide strong support for building an innovative country and a world power in science and technology and to make an important contribution to major-country diplomacy with Chinese characteristics.

(2) Basic principles.

International cutting edge, scientific frontier. In keeping with the basic, strategic and forward-looking character of big science programs, focus on research fields of broad concern to the international scientific community and with far-reaching impact on social development and scientific progress, and choose projects that can strike a wide chord internationally, striving to crack major scientific problems.

Strategic orientation, capability building. Implement the "three-step" strategy for building a world power in science and technology, serve the overall needs of scientific innovation and of economic and social development, pool outstanding scientific forces at home and abroad, produce a body of landmark research results with international influence, and comprehensively raise China's strength in scientific and technological innovation.

Chinese leadership, win-win cooperation. Let China play the leading role in determining core experts, framing the research questions, choosing technical routes, allocating scientific resources and siting facilities, while respecting the strengths of all participating countries and parties. Insist on joint participation by many countries and institutions with complementary advantages; adopt joint funding, in-kind contributions, dedicated funds and similar arrangements; share intellectual property; and achieve mutual benefit.

Innovative mechanisms, step-by-step advancement. Draw on advanced international experience and innovate systematically in the initiation, organization, construction, operation and management of big science programs; improve mechanisms for cooperating on and sharing scientific resources; bring in ministries and local governments; strengthen collaboration between the scientific community and industry; pilot first and argue each case fully, launching each program once the conditions for its implementation are ripe.

(3) Main objectives.

Overall objective: by leading big science programs, form a globally influential portfolio of such programs at the frontiers of world science and in the key fields driving economic and social development; carry out high-level research, cultivate and attract top scientific talent, strengthen the ability to build international consensus and innovate cooperatively, and raise China's level of scientific innovation and high-end manufacturing; push international cooperation in science and technology to a new level, and strive to become an advocate, promoter and formulator of major international scientific agendas and rules, enhancing China's core competitiveness and voice in global innovation.

Short-term objective: by 2020, cultivate 3-5 candidate projects, select and launch 1-2 big science programs led by China, form initial mechanisms and practices for leading such programs, and accumulate useful experience for subsequent work.

Medium-term objective: by 2035, cultivate 6-10 candidate projects, launch those that have matured, form the initial layout of China-led big science programs, and increase China's influence in a number of scientific and technological fields worldwide.

Long-term objective: by the middle of this century, cultivate a further set of projects and launch those that have matured; significantly raise China's capacity for original scientific innovation, play an important role in the international governance system for scientific innovation, and contribute continuously to major global scientific agendas.

III. Key Tasks

(1) Formulate a strategic plan and identify priority fields.

In accordance with the Outline of the National Innovation-Driven Development Strategy and related deployments, and taking into account development trends in strategic frontier fields, China's existing foundations and the potential risks, organize the drafting of a plan for leading big science programs. Around the priority directions, potential projects, construction priorities and organizational mechanisms in material science, the evolution of the universe, the origin of life, Earth systems, environmental and climate change, health, energy, materials, space, astronomy, agriculture, information and interdisciplinary fields, draw up a development roadmap that specifies staged strategic goals, funding sources, modes of construction and arrangements for operation and management, and advance the tasks in a scientific and orderly way.

(2) Carry out project selection and appraisal, cultivation and initiative-building, and launch and implementation.

Building on China's fields of particular strength, and in light of how mature the implementation conditions are and what human and financial resources are available, select a number of projects with cooperation potential for focused cultivation, issue the corresponding international initiatives, conduct consultations and negotiations, and decide which projects to launch as circumstances allow. Strengthen overall coordination with the country's major research deployments, dovetail with the "Science and Technology Innovation 2030 Major Projects", and make full use of national laboratories, comprehensive national science centers, major national science and technology infrastructure and other existing advantages, so that resources are openly shared and personnel exchanges are deepened.

(3) Establish management mechanisms suited to the character of the projects.

Relying on internationally influential national laboratories, research institutions, universities and scientific societies, and through cooperation between research institutions or between governments, integrate the resources of all parties and establish dedicated research institutions, joint-stock companies or intergovernmental international organizations to plan, build and operate big science programs. Actively strive to have the headquarters of newly formed intergovernmental organizations located in China. Each big science program may set up a project council and an expert advisory committee to make decisions on implementation and provide professional advice.

(4) Actively participate in big science programs initiated by other countries.

Continue to take part in big science programs initiated by other countries or jointly by several countries, actively undertake project tasks, participate deeply in operation and management, and accumulate organizational and managerial experience, forming a healthy pattern in which such participation complements, supports and interacts with the big science programs China itself leads. Actively take part in big-science activities of important international organizations and play an active role in drafting the relevant international rules.

IV. Safeguards for Organization and Implementation

(1) Strengthen organizational leadership and coordinated management.

Under the inter-ministerial joint meeting mechanism for national science and technology programs (special projects, funds, etc.), convene special meetings on leading big science programs, attended by the Ministry of Science and Technology, the National Development and Reform Commission, the Ministry of Education, the Ministry of Industry and Information Technology, the Ministry of Finance, the Ministry of Agriculture, the National Health and Family Planning Commission, the State Intellectual Property Office, the Chinese Academy of Sciences, the Chinese Academy of Engineering, the National Natural Science Foundation, the State Administration of Science, Technology and Industry for National Defence, the Equipment Development Department of the Central Military Commission, the Science and Technology Commission of the Central Military Commission, the China Association for Science and Technology and other departments and units, to coordinate and review the strategic planning, development directions, field layout, key tasks, project launches, operation and management mechanisms, intellectual property management and open-sharing policies of big science programs.

Establish an expert advisory committee for big science programs composed of senior experts from the scientific, engineering and industrial communities to review and advise on priority fields, strategic planning and project appraisal, providing input for national decision-making. Major matters such as strategic planning and project selection shall be reviewed by the leading group for national science and technology system reform and innovation system building and then reported to the State Council according to procedure, with especially major matters reported to the CPC Central Committee.

(2) Establish diversified investment and management mechanisms.

Improve the fiscal investment mechanism and make full use of existing resources and funding channels, letting government funds play their guiding role in China-led big science programs while attracting investment from local governments, enterprises, foreign countries and international organizations. Estimate and budget project expenditure according to actual needs, encourage the participation of private capital, and build a diversified investment mechanism. Drawing fully on international experience, use paid access, shared intellectual property and other arrangements to attract governments, research institutions, universities, scientific societies, enterprises and international organizations at home and abroad to support the construction, operation and management of big science programs.

(3) Strengthen the building of high-level professional teams.

Pursue a more active and open policy for bringing in high-level talent, rely on major national talent programs to train and recruit the people that big science programs need, and establish incentives for relevant personnel to take part. Explore a globally oriented recruitment system in line with international practice, openly recruiting world-class scientists and top engineering and technical talent. Strengthen the multi-level professional teams needed for China-led big science programs and build a sustainable talent pipeline.

(4) Establish supervision and evaluation mechanisms for big science programs.

Establish and improve mechanisms for supervision, evaluation and dynamic adjustment; regularly track and inspect the implementation and results of big science programs; and use the findings as an important basis for adjusting project objectives, technical routes, research tasks, budgets and schedules. Report evaluation results and adjustment proposals to the State Council in a timely manner.


4 Two Small Example Programs


This chapter walks through two examples. The first assumes a project that already handles portability with Autoconf but uses hand-written Makefiles, and shows how to switch it over to Automake; the second shows how to build two programs from the same source code with different compilation flags.

4.1 A simple program, done in a few steps


Suppose we have finished a piece of software called zardoz that already uses Autoconf to provide an extensible configuration framework, but whose Makefile.in files are written separately by hand. To round off that work, we now switch to Automake.

The first step is to update configure.ac by adding AM_INIT_AUTOMAKE right after AC_INIT:

 AC_INIT([zardoz], [1.0])
 AM_INIT_AUTOMAKE
 ...

We also need to regenerate configure, but before doing so we have to tell autoconf where to find the new macro we are now using. The simplest way is to use aclocal to generate aclocal.m4:

$ aclocal
$ autoconf

Next comes the Makefile.am. The following says that the program is to be installed in the bin directory, lists its source files, and declares a Texinfo manual:

bin_PROGRAMS = zardoz
zardoz_SOURCES = main.c head.c float.c vortex9.c gun.c

info_TEXINFOS = zardoz.texi

Finally, running automake --add-missing generates the Makefile.in, and we are done. The complete bootstrap sequence is sketched below.
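
A minimal sketch of the whole conversion, ending with the ordinary user-side build (the last line is just the usual configure-and-make step, not something specific to this example):

$ aclocal                    # collect AM_INIT_AUTOMAKE and friends into aclocal.m4
$ autoconf                   # regenerate configure
$ automake --add-missing     # generate Makefile.in and copy in any missing helper scripts
$ ./configure && make        # build as usual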

4.2 Building two programs from one source file


Now for the second example, which shows how to build two programs (true and false) from the same source file true.c by passing different cpp flags.

bin_PROGRAMS = true false
false_SOURCES =
false_LDADD = false.o

true.o: true.c
        $(COMPILE) -DEXIT_CODE=0 -c true.c

false.o: true.c
        $(COMPILE) -DEXIT_CODE=1 -o false.o -c true.c

Note that there is no true_SOURCES here: Automake then assumes the single source file is true.c, compiles it to true.o and links it into true. The explicit 'true.o: true.c' rule overrides the one Automake would generate by default, so our own compilation command is used instead.

false_SOURCES is set to empty because false has no source files of its own, so we have to tell Automake how it is built and linked; false_LDADD does that by supplying false.o.

If the compiler cannot handle -c and -o together, the Makefile.am above may not work. In that case we need a workaround, such as the following:

true.o: true.c false.o
        $(COMPILE) -DEXIT_CODE=0 -c true.c

false.o: true.c
        $(COMPILE) -DEXIT_CODE=1 -c true.c && mv true.o false.o

The above is a fairly tricky workaround. When starting out, it is better to keep things as simple and easy to understand as possible, as shown below:

bin_PROGRAMS = false true

false_SOURCES = true.c
false_CPPFLAGS = -DEXIT_CODE=1

true_SOURCES = true.c
true_CPPFLAGS = -DEXIT_CODE=0
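
With the per-program _CPPFLAGS approach, Automake compiles true.c twice under different object names, so no hand-written rules are needed at all. Assuming true.c simply returns EXIT_CODE from main (an assumption about the example source, which is not shown above), a quick check after building would be:

$ make
$ ./true;  echo $?     # prints 0
$ ./false; echo $?     # prints 1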

GitLab server migration

We switched to a new server and need to migrate the GitLab projects from the old server to the new one.

1. Preparation and approach: the migration goes from server a to server b. Because of GitLab's own compatibility constraints, a newer GitLab cannot restore a backup made by an older version, so make sure server b runs the same GitLab version as server a. Once that environment is ready, start the backup and data migration.

   Command to check the GitLab version:

   gitlab-rake gitlab:env:info

2. Back up the data on the old server a:

   gitlab-rake gitlab:backup:create RAILS_ENV=production

   PS: the backup file normally ends up under /var/opt/gitlab/backups, with an auto-generated name such as 1481529483_gitlab_backup.tar.

3. Copy the tar file produced in step 2 to the corresponding backups directory on server b; scp works fine for a direct copy:

   scp username@src_ip:/var/opt/gitlab/backups/xxxxxxxxxx_yyyy_mm_dd_gitlab_backup.tar /var/opt/gitlab/backups

   PS: username is the user name on the old server and src_ip is the old server's IP address.

4. Restore the data on server b:

   gitlab-rake gitlab:backup:restore RAILS_ENV=production BACKUP=xxxxxxxxxx_yyyy_mm_dd_

   PS: the BACKUP timestamp must match the name of the backup file produced on the old server.

Starting GitLab

$ gitlab-ctl reconfigure

$ gitlab-ctl restart

Where GitLab stores its repositories

$ ls /var/opt/gitlab/git-data/repositories

Manual GitLab backup

$ gitlab-backup create

阳野道子 - 好大的苹果 (What a Big Apple)

Who is quietly watching us from the shadows?

After a few readings, it had us laughing out loud.

The book grew out of the author's idle musings while eating an apple. A big apple falls from a tree; a bird spots it and happily starts eating, and soon a rabbit, a squirrel, a mouse, a butterfly and some bugs come to join in. The big apple is eaten down to a core, which the ants finally carry off to share. One big apple, and everyone gets to enjoy it together. It is a warm and touching story; the pictures are detailed yet simple and playful, and the author hides little surprises for readers to discover with delight.