[v3,0/5] fcntl fortification

Message ID 20230617222218.642125-1-bugaevc@gmail.com
Series fcntl fortification

Message

Sergey Bugaev June 17, 2023, 10:22 p.m. UTC
Hello,

this is v3 of the fcntl fortification work. v1 was at [0], v2 at [1].

[0]: https://sourceware.org/pipermail/libc-alpha/2023-May/148332.html
[1]: https://sourceware.org/pipermail/libc-alpha/2023-May/148569.html

Changes since v2:

- This is now in its own separate header, bits/fcntl3.h
  (bits/fcntl2.h is used by the open fortification)

- Clang is now supported in addition to GCC!
  - Clang neither supports nor needs the "-Wsystem-headers" pragma
  - Clang has gained support for the error/warning attributes recently
  - There seems to be a bug in Clang which prevents the type mismatch
    warning from actually firing. Specifically, it appears that Clang
    gets confused about C function names vs symbol names when it comes
    to attribute ((warning)), and does not emit the warning if the
    function declared with __warnattr has a symbol name matching that of
    another function that has not been declared with __warnattr (a
    minimal reduction of the pattern is included after this list).

    While this could be worked around in glibc (such as by adding
    __fcntl_warn as a real wrapper function when built with Clang), I
    think this just needs to be fixed in Clang. Any LLVM developers
    here? :D

- Changed the hide_constant utility to use an empty inline asm statement
  instead of volatile and noinline, as per the discussion (a sketch of
  this follows the list). I did not make this into a general-purpose
  glibc-wide utility because I don't know what a fitting name and place
  (header) for it would be. If you'd like to see it glibc-wide, please
  suggest where to put it and what to name it!

- Fixed the C++ template linkage thing

- Addressed misc review comments

- Looked into applying __builtin_constant_p to the result of the cmd
  check and not the cmd value itself, as suggested by Florian.

  Unfortunately this does not work at all :( __builtin_constant_p starts
  returning 0 for anything remotely complex, even a trivial inline
  function call (so technically hide_constant would still work if it
  were just 'return value;'), even if the function is (later?) fully
  inlined and const-folded.

  *Maybe* this could be made to work if I used an obscene amount of
  macros instead of inline functions (to handle all the various commands
  being conditionally defined), but I don't want to go there.

  So since this didn't work out, I left the runtime __fcntl_2 function,
  but split it into a separate patch, so you can apply it or drop it
  depending on what you prefer in the end.
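
For concreteness, here is a minimal, hypothetical reduction of the Clang
issue mentioned above. The names are made up and this is not the actual
glibc code; it only shows the shape of the pattern: two declarations that
resolve to the same symbol, where only one carries the warning attribute.

------------------------------------------------------------------
/* warnattr-reduction.c -- compile with e.g. `cc -O2 -c warnattr-reduction.c`.  */

/* Plain declaration of the symbol "do_fcntl".  */
extern int do_fcntl (int fd, int cmd, ...);

/* Redirected declaration of the same symbol, carrying the warning.  */
extern int do_fcntl_warn (int fd, int cmd, ...)
     __asm__ ("do_fcntl")
     __attribute__ ((__warning__ ("wrong argument type for this command")));

int
check (int fd)
{
  /* GCC diagnoses this call via the warning attribute; per the report
     above, Clang reportedly does not, because the symbol name also
     matches a declaration without the attribute.  */
  return do_fcntl_warn (fd, 0);
}
------------------------------------------------------------------

And a sketch of the hide_constant test helper in its empty-inline-asm form,
assuming the variant discussed in the v2 review (the helper in the actual
patch may differ in name and placement):

------------------------------------------------------------------
/* The empty asm with a "+r" constraint makes the value opaque to the
   optimizer, so __builtin_constant_p on the result is 0 even when the
   caller passes a literal, and the compile-time cmd check is skipped.  */
static inline int
hide_constant (int value)
{
  __asm__ ("" : "+r" (value));
  return value;
}
------------------------------------------------------------------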

Clang / C++ demo:

------------------------------------------------------------------
$ clang++ test-fcntl.cpp -D _FORTIFY_SOURCE=2 -O2
In file included from test-fcntl.cpp:1:
In file included from /usr/include/fcntl.h:348:
/usr/include/bits/fcntl3.h:394:5: error: call to '__fcntl_missing_arg' declared with 'error' attribute: fcntl with with this command needs 3 arguments
    __fcntl_missing_arg ();
    ^
1 error generated.
------------------------------------------------------------------

Sergey

Comments

Carlos O'Donell June 19, 2023, 12:58 p.m. UTC | #1
On 6/17/23 18:22, Sergey Bugaev via Libc-alpha wrote:
> Hello,
> 
> this is v3 of the fcntl fortification work. v1 was at [0], v2 at [1].
> 
> [0]: https://sourceware.org/pipermail/libc-alpha/2023-May/148332.html
> [1]: https://sourceware.org/pipermail/libc-alpha/2023-May/148569.html

Pre-commit CI regression caused by this series on both aarch64 and aarch32.
https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/

# Running glibc:io ...
# FAIL: io/check-installed-headers-c 
# FAIL: io/check-installed-headers-cxx 
# 
# Running glibc:misc ...
# FAIL: misc/check-installed-headers-c 
# FAIL: misc/check-installed-headers-cxx 
# 
# Running glibc:rt ...
# FAIL: rt/check-installed-headers-c 
# FAIL: rt/check-installed-headers-cxx 


> Changes since v2:
> 
> - This is now in its own separate header, bits/fcntl3.h
>   (bits/fcntl2.h is used by the open fortification)
> 
> - Clang is now supported in addition to GCC!
>   - Clang does not support nor need the "-Wsystem-headers" pragma
>   - Clang does support error/warning attributes since recently
>   - There seems to be a bug in Clang which prevents the type mismatch
>     warning from actually firing. Specifically, it appears that Clang
>     gets confused about C functions names vs symbol names when it comes
>     to attribute ((warning)), and does not emit the warning if the
>     function declared with __warnattr has a symbol name matching that of
>     another function that has not been declared with __warnattr.
> 
>     While this could be worked around in glibc (such as by adding
>     __fcntl_warn as a real wrapper function when built with Clang), I
>     think this just needs to be fixed in Clang. Any LLVM developers
>     here? :D
> 
> - Changed hide_constant utility to use an empty inline asm statement
>   instead of volatile and noinline, as per the discussion. I did not
>   make this into a general-purpose glibc-wide utility because I don't
>   know what a fitting name and place (header) for it would be. If you'd
>   like to see it glibc-wide, please suggest me where to put it and how
>   to name it!
> 
> - Fixed the C++ template linkage thing
> 
> - Addressed misc review comments
> 
> - Looked into applying __builtin_constant_p to the result of the cmd
>   check and not the cmd value itself, as suggested by Florian.
> 
>   Unfortunately this does not work at all :( __builtin_constant_p starts
>   returning 0 given anything remotely complex like even a trivial inline
>   function call (so technically hide_constant would still work if it was
>   just 'return value;'), even if the function is (later?) fully inlined
>   and const-folded.
> 
>   *Maybe* this could be made to work if I used an obscene amount of
>   macros instead of inline functions (to handle all the various commands
>   being conditionally defined), but I don't want to go there.
> 
>   So since this didn't work out, I left the runtime __fcntl_2 function,
>   but split it into a separate patch, so you can apply it or drop it
>   depending on what you prefer in the end.
> 
> Clang / C++ demo:
> 
> ------------------------------------------------------------------
> $ clang++ test-fcntl.cpp -D _FORTIFY_SOURCE=2 -O2
> In file included from test-fcntl.cpp:1:
> In file included from /usr/include/fcntl.h:348:
> /usr/include/bits/fcntl3.h:394:5: error: call to '__fcntl_missing_arg' declared with 'error' attribute: fcntl with with this command needs 3 arguments
>     __fcntl_missing_arg ();
>     ^
> 1 error generated.
> ------------------------------------------------------------------
> 
> Sergey
>
Sergey Bugaev June 19, 2023, 2:23 p.m. UTC | #2
On Mon, Jun 19, 2023 at 3:58 PM Carlos O'Donell <carlos@redhat.com> wrote:
> Pre-commit CI regression caused by this series on both aarch64 and aarch32.
> https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/
>
> # Running glibc:io ...
> # FAIL: io/check-installed-headers-c
> # FAIL: io/check-installed-headers-cxx
> #
> # Running glibc:misc ...
> # FAIL: misc/check-installed-headers-c
> # FAIL: misc/check-installed-headers-cxx
> #
> # Running glibc:rt ...
> # FAIL: rt/check-installed-headers-c
> # FAIL: rt/check-installed-headers-cxx

Thank you :(

I'm a little bit lost in the Jenkins web UI -- is there anywhere I can
view the output of the failing tests to understand what exactly went
wrong? I see the sum files here [0], but not the .out files.

[0]: https://ci.linaro.org/job/tcwg_glibc_check--master-arm-build/386/artifact/artifacts/artifacts.precommit/sumfiles/

Sergey
Adhemerval Zanella Netto June 19, 2023, 6:36 p.m. UTC | #3
On 19/06/23 11:23, Sergey Bugaev via Libc-alpha wrote:
> On Mon, Jun 19, 2023 at 3:58 PM Carlos O'Donell <carlos@redhat.com> wrote:
>> Pre-commit CI regression caused by this series on both aarch64 and aarch32.
>> https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/
>>
>> # Running glibc:io ...
>> # FAIL: io/check-installed-headers-c
>> # FAIL: io/check-installed-headers-cxx
>> #
>> # Running glibc:misc ...
>> # FAIL: misc/check-installed-headers-c
>> # FAIL: misc/check-installed-headers-cxx
>> #
>> # Running glibc:rt ...
>> # FAIL: rt/check-installed-headers-c
>> # FAIL: rt/check-installed-headers-cxx
> 
> Thank you :(
> 
> I'm a little bit lost in the Jenkins web UI -- is there anywhere I can
> view the output of the failing tests to understand what exactly went
> wrong? I see the sum files here [0], but not the .out files.
> 
> [0]: https://ci.linaro.org/job/tcwg_glibc_check--master-arm-build/386/artifact/artifacts/artifacts.precommit/sumfiles/

I could not reproduce it locally, so I think it is a limitation of our
(Linaro) CI framework, where not all test artifacts are being removed
with make test-clean. Maxim already sent a fix for this [1].

[1] https://patchwork.sourceware.org/project/glibc/patch/20230615152547.2924770-1-maxim.kuvyrkov@linaro.org/
Maxim Kuvyrkov June 20, 2023, 5:59 a.m. UTC | #4
> On Jun 19, 2023, at 22:36, Adhemerval Zanella Netto <adhemerval.zanella@linaro.org> wrote:
> 
> 
> 
> On 19/06/23 11:23, Sergey Bugaev via Libc-alpha wrote:
>> On Mon, Jun 19, 2023 at 3:58 PM Carlos O'Donell <carlos@redhat.com> wrote:
>>> Pre-commit CI regression caused by this series on both aarch64 and aarch32.
>>> https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/
>>> 
>>> # Running glibc:io ...
>>> # FAIL: io/check-installed-headers-c
>>> # FAIL: io/check-installed-headers-cxx
>>> #
>>> # Running glibc:misc ...
>>> # FAIL: misc/check-installed-headers-c
>>> # FAIL: misc/check-installed-headers-cxx
>>> #
>>> # Running glibc:rt ...
>>> # FAIL: rt/check-installed-headers-c
>>> # FAIL: rt/check-installed-headers-cxx
>> 
>> Thank you :(
>> 
>> I'm a little bit lost in the Jenkins web UI -- is there anywhere I can
>> view the output of the failing tests to understand what exactly went
>> wrong? I see the sum files here [0], but not the .out files.
>> 
>> [0]: https://ci.linaro.org/job/tcwg_glibc_check--master-arm-build/386/artifact/artifacts/artifacts.precommit/sumfiles/

Hi Sergey,

See the attached tarball with output from failed tests:
===
Running glibc:io ...
FAIL: io/check-installed-headers-c 
FAIL: io/check-installed-headers-cxx 

Running glibc:misc ...
FAIL: misc/check-installed-headers-c 
FAIL: misc/check-installed-headers-cxx 

Running glibc:rt ...
FAIL: rt/check-installed-headers-c 
FAIL: rt/check-installed-headers-cxx 
===

Let me know if you need any help in reproducing these.  The results are for Ubuntu 22.04 armhf target, but results for Ubuntu 22.04 aarch64 show the same regressions.

Interestingly the top-level check-installed-headers-* tests PASS:
===
PASS: check-installed-headers-c
PASS: check-installed-headers-cxx
===

> 
> I could not reproduce it locally, so I think it is a limitation of our 
> (Linaro) CI framework, where not all tests artifacts are being removed 
> with make test-clean. Maxim already sent a fix for this [1].
> 
> [1] https://patchwork.sourceware.org/project/glibc/patch/20230615152547.2924770-1-maxim.kuvyrkov@linaro.org/

Hi Adhemerval,

We (Linaro) have fixed or added workarounds for all known testsuite stability problems in our CI before enabling reporting for failed patches.  I am surprised that you could not reproduce the above failures in your setup; let's troubleshoot your local setup offline.

Kind regards,

--
Maxim Kuvyrkov
https://www.linaro.org
Sergey Bugaev June 20, 2023, 7:33 a.m. UTC | #5
On Tue, Jun 20, 2023 at 8:59 AM Maxim Kuvyrkov
<maxim.kuvyrkov@linaro.org> wrote:
> Hi Sergey,
>
> See the attached tarball with output from failed tests

Hi, and thank you -- so it's just that F_DUPFD_CLOEXEC is defined
conditionally (ifdef __USE_XOPEN2K8); that'd be easy to fix. I don't see
why it would vary depending on CPU architecture, though.
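
A hypothetical sketch of the kind of guard this needs -- not the actual
bits/fcntl3.h code, and the helper name is invented -- where
F_DUPFD_CLOEXEC is only referenced when the feature-test macros expose it:

------------------------------------------------------------------
#include <fcntl.h>

/* Does this command require a third fcntl argument?  */
static __inline int
__fcntl_needs_third_arg (int __cmd)
{
  switch (__cmd)
    {
    case F_DUPFD:
    case F_SETFD:
    case F_SETFL:
#ifdef __USE_XOPEN2K8
    case F_DUPFD_CLOEXEC:
#endif
      return 1;
    default:
      return 0;
    }
}
------------------------------------------------------------------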

Sergey
Maxim Kuvyrkov June 20, 2023, 9:41 a.m. UTC | #6
> On Jun 20, 2023, at 11:33, Sergey Bugaev <bugaevc@gmail.com> wrote:
> 
> On Tue, Jun 20, 2023 at 8:59 AM Maxim Kuvyrkov
> <maxim.kuvyrkov@linaro.org> wrote:
>> Hi Sergey,
>> 
>> See the attached tarball with output from failed tests
> 
> Hi, and thank you -- so it's just that F_DUPFD_CLOEXEC is defined
> conditionally (ifdef __USE_XOPEN2K8), that'd be easy to fix. I don't see
> why it would vary depending on CPU architecture though.

I don't think CPU architecture plays a role here either.  My guess (didn't verify) is that the difference comes from GCC version and its default C standard.

In our case we are running CI on the latest Ubuntu LTS (22.04 at the moment), which has GCC 11.

What environment did you test this in?

Kind regards,

--
Maxim Kuvyrkov
https://www.linaro.org
Sergey Bugaev June 20, 2023, 11:28 a.m. UTC | #7
On Tue, Jun 20, 2023 at 12:41 PM Maxim Kuvyrkov
<maxim.kuvyrkov@linaro.org> wrote:
> I don't think CPU architecture plays a role here either.  My guess (didn't verify) is that the difference comes from GCC version and its default C standard.

But the C standard is being explicitly set in these tests (those
-std=c89 flags), no?

> In our case we are running CI in latest Ubuntu LTS (22.04 at the moment), which has GCC 11.
>
> What environment did you test this in?

I have GCC 13.1 and Clang 16.0 targeting {x86_64,i686}-linux-gnu, GCC
12.2 targeting i686-gnu, and GCC master targeting x86_64-gnu. I have
only run the full testsuite on x86_64-linux-gnu (and it does pass,
including all the check-installed-headers tests); for the other
variants I've only checked debug/tst-fortify (which also tries to
compile the header in various configurations) and tried to build
sample code against the installed headers manually (but have not
checked different C standards and _XXXX_SOURCE definitions while doing
that). I have also done some checks on Compiler Explorer, and indeed I
can reproduce F_DUPFD_CLOEXEC being unavailable with -std=c89 (or any
non-GNU C, e.g. -std=c11) there -- but that happens when targeting
x86_64 too.

...Having written the above, I went and re-ran the test suite, and
traced which files GCC opens, and it does not even look at fcntl3.h
when running check-installed-headers! That explains it: the CI must
have _FORTIFY_SOURCE set in the CFLAGS for glibc itself, but I don't,
and Adhemerval probably doesn't either. With _FORTIFY_SOURCE manually
added, I can reproduce the exact test failure on my setup too.
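
For the record, with the patched headers installed, the failure can be
reproduced with something as simple as

$ echo '#include <fcntl.h>' | gcc -O2 -D_FORTIFY_SOURCE=2 -std=c89 -fsyntax-only -xc -

since the strict C standard hides F_DUPFD_CLOEXEC while fortification
still pulls in bits/fcntl3.h.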

Thanks!

Sergey
Maxim Kuvyrkov June 20, 2023, 12:38 p.m. UTC | #8
> On Jun 20, 2023, at 15:28, Sergey Bugaev <bugaevc@gmail.com> wrote:
> 
> On Tue, Jun 20, 2023 at 12:41 PM Maxim Kuvyrkov
> <maxim.kuvyrkov@linaro.org> wrote:
>> I don't think CPU architecture plays a role here either.  My guess (didn't verify) is that the difference comes from GCC version and its default C standard.
> 
> But the C standard is being explicitly set in these tests (those
> -std=c89 flags), no?
> 
>> In our case we are running CI in latest Ubuntu LTS (22.04 at the moment), which has GCC 11.
>> 
>> What environment did you test this in?
> 
> I have GCC 13.1 and Clang 16.0 targeting {x86_64,i686}-linux-gnu, GCC
> 12.2 targeting i686-gnu, and GCC master targeting x86_64-gnu. I have
> only run the full testsuite on x86_64-linux-gnu (and it does pass,
> including all the check-installed-headers tests); for the other
> variants I've only checked debug/tst-fortify (which also tries to
> compile the header in various configurations) and tried to build
> sample code against the installed headers manually (but have not
> checked different C standards and _XXXX_SOURCE definitions while doing
> that). I have also done some checks on Compiler Explorer, and indeed I
> can reproduce F_DUPFD_CLOEXEC being unavailable with -std=c89 (or any
> non-GNU C, e.g. -std=c11) there -- but that happens when targeting
> x86_64 too.
> 
> ...Having written the above, I went and re-ran the test suite, and
> traced which files GCC opens, and it does not even look at fcntl3.h
> when running check-installed-headers! That explains it, the CI must
> have _FORTIFY_SOURCE set in the CFLAGS for glibc itself, but I don't,
> and Adhemerval probably doesn't either. With _FORTIFY_SOURCE manually
> added, I can reproduce the exact test failure on my setup too.

We don't set _FORTIFY_SOURCE in our CI's glibc build, but, I think, it comes from Ubuntu's GCC, where it may be enabled by default.  Or are you using Ubuntu and not seeing this with default Ubuntu toolchain?

--
Maxim Kuvyrkov
https://www.linaro.org
Adhemerval Zanella Netto June 20, 2023, 12:50 p.m. UTC | #9
On 20/06/23 02:59, Maxim Kuvyrkov wrote:
>> On Jun 19, 2023, at 22:36, Adhemerval Zanella Netto <adhemerval.zanella@linaro.org> wrote:
>>
>>
>>
>> On 19/06/23 11:23, Sergey Bugaev via Libc-alpha wrote:
>>> On Mon, Jun 19, 2023 at 3:58 PM Carlos O'Donell <carlos@redhat.com> wrote:
>>>> Pre-commit CI regression caused by this series on both aarch64 and aarch32.
>>>> https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/
>>>>
>>>> # Running glibc:io ...
>>>> # FAIL: io/check-installed-headers-c
>>>> # FAIL: io/check-installed-headers-cxx
>>>> #
>>>> # Running glibc:misc ...
>>>> # FAIL: misc/check-installed-headers-c
>>>> # FAIL: misc/check-installed-headers-cxx
>>>> #
>>>> # Running glibc:rt ...
>>>> # FAIL: rt/check-installed-headers-c
>>>> # FAIL: rt/check-installed-headers-cxx
>>>
>>> Thank you :(
>>>
>>> I'm a little bit lost in the Jenkins web UI -- is there anywhere I can
>>> view the output of the failing tests to understand what exactly went
>>> wrong? I see the sum files here [0], but not the .out files.
>>>
>>> [0]: https://ci.linaro.org/job/tcwg_glibc_check--master-arm-build/386/artifact/artifacts/artifacts.precommit/sumfiles/
> 
> Hi Sergey,
> 
> See the attached tarball with output from failed tests:
> ===
> Running glibc:io ...
> FAIL: io/check-installed-headers-c 
> FAIL: io/check-installed-headers-cxx 
> 
> Running glibc:misc ...
> FAIL: misc/check-installed-headers-c 
> FAIL: misc/check-installed-headers-cxx 
> 
> Running glibc:rt ...
> FAIL: rt/check-installed-headers-c 
> FAIL: rt/check-installed-headers-cxx 
> ===
> 
> Let me know if you need any help in reproducing these.  The results are for Ubuntu 22.04 armhf target, but results for Ubuntu 22.04 aarch64 show the same regressions.
> 
> Interestingly the top-level check-installed-headers-* tests PASS:
> ===
> PASS: check-installed-headers-c
> PASS: check-installed-headers-cxx
> ===
> 
>>
>> I could not reproduce it locally, so I think it is a limitation of our 
>> (Linaro) CI framework, where not all tests artifacts are being removed 
>> with make test-clean. Maxim already sent a fix for this [1].
>>
>> [1] https://patchwork.sourceware.org/project/glibc/patch/20230615152547.2924770-1-maxim.kuvyrkov@linaro.org/
> 
> Hi Adhemerval,
> 
> We (Linaro) have fixed or added workarounds for all known testsuite stability problems in our CI before enabling reporting for failed patches.  I am surprised that you could not reproduce the above failures in your setup; let's troubleshoot your local setup offline.

I think I figured out what is happening here. I am using toolchains built
with scripts/build-many-glibcs.py, which does not enable fortification by
default, unlike the distros that are now making it the default.

Since our CI uses the system compiler to check the headers, fortify is
always enabled there, and that breaks the test.

Frédéric is fixing a lot of tests to assume fortify is enabled by default,
so maybe this is something he is already tracking.  In any case, I think we
should fix scripts/check-installed-headers.sh to check the different levels
of _FORTIFY_SOURCE along with the flags already in place.
Sergey Bugaev June 20, 2023, 12:53 p.m. UTC | #10
On Tue, Jun 20, 2023 at 3:38 PM Maxim Kuvyrkov
<maxim.kuvyrkov@linaro.org> wrote:
> We don't set _FORTIFY_SOURCE in our CI's glibc build, but, I think, it comes from Ubuntu's GCC, where it may be enabled by default.  Or are you using Ubuntu and not seeing this with default Ubuntu toolchain?

I'm not using Ubuntu.

Do they just set _FORTIFY_SOURCE by default -- i.e. not only when
building OS packages, but for all compilations? That's... unusual :|

It's also concerning that check-installed-headers doesn't check this
configuration (_FORTIFY_SOURCE with non-GNU C standard), and we're
only finding out about this by accident. There's a comment in
check-installed-headers.sh that says:

# An exhaustive test of feature selection macros would take far too long.
# These are probably the most commonly used three.

That makes sense when running the testsuite locally since you want it
to finish in minutes, not days, but wouldn't checking all combinations
(or at least _a lot_ more of them) make more sense for CI?

Sergey
Adhemerval Zanella Netto June 20, 2023, 1:40 p.m. UTC | #11
On 20/06/23 09:53, Sergey Bugaev wrote:
> On Tue, Jun 20, 2023 at 3:38 PM Maxim Kuvyrkov
> <maxim.kuvyrkov@linaro.org> wrote:
>> We don't set _FORTIFY_SOURCE in our CI's glibc build, but, I think, it comes from Ubuntu's GCC, where it may be enabled by default.  Or are you using Ubuntu and not seeing this with default Ubuntu toolchain?
> 
> I'm not using Ubuntu.
> 
> Do they just set _FORTIFY_SOURCE by default -- i.e. not only when
> building OS packages, but for all compilations? That's... unusual :|

It does for any optimized build:

$ gcc -v 2>&1 | grep 'gcc version'
gcc version 11.3.0 (Ubuntu 11.3.0-1ubuntu1~22.04.1)
$ gcc -dM -E - < /dev/null | grep -w _FORTIFY_SOURCE
$ gcc -O2 -dM -E - < /dev/null | grep -w _FORTIFY_SOURCE
#define _FORTIFY_SOURCE 2

And I think this is a common configuration for recent distros.

> 
> It's also concerning that check-installed-headers doesn't check this
> configuration (_FORTIFY_SOURCE with non-GNU C standard), and we're
> only finding out about this by accident. There's a comment in
> check-installed-headers.sh that says:
> 
> # An exhaustive test of feature selection macros would take far too long.
> # These are probably the most commonly used three.
> 
> That makes sense when running the testsuite locally since you want it
> to finish in minutes, not days, but wouldn't checking all combinations
> (or at least _a lot_ more of them) make more sense for CI?
> 

I think it is worth extending check-installed-headers.sh to include
fortify as well, especially since it is now being enabled by default
in multiple configurations.
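
A rough, hypothetical sketch of what that could look like -- not the
actual check-installed-headers.sh, and $CC/$header stand in for whatever
the script already uses -- sweeping fortify levels on top of the existing
standard/feature-macro combinations:

------------------------------------------------------------------
for std in c89 c11 gnu11; do
  for fortify in "" "-O2 -D_FORTIFY_SOURCE=2" "-O2 -D_FORTIFY_SOURCE=3"; do
    # Each installed header must at least compile cleanly in this mode.
    echo "#include <$header>" \
      | $CC -std=$std $fortify -fsyntax-only -xc - \
      || echo "FAIL: $header -std=$std $fortify"
  done
done
------------------------------------------------------------------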
Zack Weinberg June 20, 2023, 1:46 p.m. UTC | #12
On Tue, Jun 20, 2023, at 8:53 AM, Sergey Bugaev via Libc-alpha wrote:
> I'm not using Ubuntu.
> Do they just set _FORTIFY_SOURCE by default -- i.e. not only when
> building OS packages, but for all compilations? That's... unusual :|

To me it looks like they only set it when building packages:

$ lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 22.04.1 LTS
Release:	22.04
Codename:	jammy

$ which gcc
/usr/bin/gcc

$ echo '#include <string.h>' | gcc -E -dD -xc - | grep FORTIFY
#undef __USE_FORTIFY_LEVEL
#define __USE_FORTIFY_LEVEL 0
#define __bos(ptr) __builtin_object_size (ptr, __USE_FORTIFY_LEVEL > 1)

$ dpkg-buildflags | grep FORTIFY
CPPFLAGS=-Wdate-time -D_FORTIFY_SOURCE=2

> It's also concerning that check-installed-headers doesn't check this
> configuration (_FORTIFY_SOURCE with non-GNU C standard), and we're
> only finding out about this by accident. There's a comment in
> check-installed-headers.sh that says:
>
> # An exhaustive test of feature selection macros would take far too long.
> # These are probably the most commonly used three.
>
> That makes sense when running the testsuite locally since you want it
> to finish in minutes, not days, but wouldn't checking all combinations
> (or at least _a lot_ more of them) make more sense for CI?

When I wrote check-installed-headers, IIRC, the only CI we had was
Joseph's build-many-glibcs bot, which was already struggling to keep
up just with the load from its own built-in list of configurations.

It may well make sense to change this now, and I agree that permutations
involving _FORTIFY_SOURCE should be a priority since fortification makes
so many changes to important headers.

zw
Zack Weinberg June 20, 2023, 1:47 p.m. UTC | #13
On Tue, Jun 20, 2023, at 9:40 AM, Adhemerval Zanella Netto via Libc-alpha wrote:
> On 20/06/23 09:53, Sergey Bugaev wrote:
>> On Tue, Jun 20, 2023 at 3:38 PM Maxim Kuvyrkov
>> <maxim.kuvyrkov@linaro.org> wrote:
>>> We don't set _FORTIFY_SOURCE in our CI's glibc build, but, I think, it comes from Ubuntu's GCC, where it may be enabled by default.  Or are you using Ubuntu and not seeing this with default Ubuntu toolchain?
>> 
>> I'm not using Ubuntu.
>> 
>> Do they just set _FORTIFY_SOURCE by default -- i.e. not only when
>> building OS packages, but for all compilations? That's... unusual :|
>
> It does for any optimized build:
>
> $ gcc -v 2>&1 | grep 'gcc version'
> gcc version 11.3.0 (Ubuntu 11.3.0-1ubuntu1~22.04.1)
> $ gcc -dM -E - < /dev/null | grep -w _FORTIFY_SOURCE
> $ gcc -O2 -dM -E - < /dev/null | grep -w _FORTIFY_SOURCE
> #define _FORTIFY_SOURCE 2
>
> And I think this is a common configuration for recent distros.

Argh, I should have thought to try turning on optimization :-/

zw
Frederic Berat June 20, 2023, 2:21 p.m. UTC | #14
On Tue, Jun 20, 2023 at 2:50 PM Adhemerval Zanella Netto
<adhemerval.zanella@linaro.org> wrote:
>
>
>
> On 20/06/23 02:59, Maxim Kuvyrkov wrote:
> >> On Jun 19, 2023, at 22:36, Adhemerval Zanella Netto <adhemerval.zanella@linaro.org> wrote:
> >>
> >>
> >>
> >> On 19/06/23 11:23, Sergey Bugaev via Libc-alpha wrote:
> >>> On Mon, Jun 19, 2023 at 3:58 PM Carlos O'Donell <carlos@redhat.com> wrote:
> >>>> Pre-commit CI regression caused by this series on both aarch64 and aarch32.
> >>>> https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/
> >>>>
> >>>> # Running glibc:io ...
> >>>> # FAIL: io/check-installed-headers-c
> >>>> # FAIL: io/check-installed-headers-cxx
> >>>> #
> >>>> # Running glibc:misc ...
> >>>> # FAIL: misc/check-installed-headers-c
> >>>> # FAIL: misc/check-installed-headers-cxx
> >>>> #
> >>>> # Running glibc:rt ...
> >>>> # FAIL: rt/check-installed-headers-c
> >>>> # FAIL: rt/check-installed-headers-cxx
> >>>
> >>> Thank you :(
> >>>
> >>> I'm a little bit lost in the Jenkins web UI -- is there anywhere I can
> >>> view the output of the failing tests to understand what exactly went
> >>> wrong? I see the sum files here [0], but not the .out files.
> >>>
> >>> [0]: https://ci.linaro.org/job/tcwg_glibc_check--master-arm-build/386/artifact/artifacts/artifacts.precommit/sumfiles/
> >
> > Hi Sergey,
> >
> > See the attached tarball with output from failed tests:
> > ===
> > Running glibc:io ...
> > FAIL: io/check-installed-headers-c
> > FAIL: io/check-installed-headers-cxx
> >
> > Running glibc:misc ...
> > FAIL: misc/check-installed-headers-c
> > FAIL: misc/check-installed-headers-cxx
> >
> > Running glibc:rt ...
> > FAIL: rt/check-installed-headers-c
> > FAIL: rt/check-installed-headers-cxx
> > ===
> >
> > Let me know if you need any help in reproducing these.  The results are for Ubuntu 22.04 armhf target, but results for Ubuntu 22.04 aarch64 show the same regressions.
> >
> > Interestingly the top-level check-installed-headers-* tests PASS:
> > ===
> > PASS: check-installed-headers-c
> > PASS: check-installed-headers-cxx
> > ===
> >
> >>
> >> I could not reproduce it locally, so I think it is a limitation of our
> >> (Linaro) CI framework, where not all tests artifacts are being removed
> >> with make test-clean. Maxim already sent a fix for this [1].
> >>
> >> [1] https://patchwork.sourceware.org/project/glibc/patch/20230615152547.2924770-1-maxim.kuvyrkov@linaro.org/
> >
> > Hi Adhemerval,
> >
> > We (Linaro) have fixed or added workarounds for all known testsuite stability problems in our CI before enabling reporting for failed patches.  I am surprised that you could not reproduce the above failures in your setup; let's troubleshoot your local setup offline.
>
> I think I figure out what is happening here. I am using toolchains built with
> scripts/build-many-glibcs.py which does not set fortify as default, different
> than distro that are now setting the it as default.
>
> Since in our CI we use the system compiler to check for header, fortify will
> be always enabled and thus breaking the testing.
>
> Frédéric is fixing a lot of tests to assume fortify is enabled as default, so
> maybe it is something that he is already tracking.  In any case, I think we
> should fix scripts/check-installed-headers.sh to check for different levels
> of _FORTIFY_SOURCE along with already in places flags.
>

I'm indeed working on a patch series to enable _FORTIFY_SOURCE while
building Glibc, which may improve your situation.
I expect the patchset will be out in the next few days (or even hours
if everything goes right), so you might be able to check soon.
Adhemerval Zanella Netto July 21, 2023, 1:59 p.m. UTC | #15
On 19/06/23 15:36, Adhemerval Zanella Netto wrote:
> 
> 
> On 19/06/23 11:23, Sergey Bugaev via Libc-alpha wrote:
>> On Mon, Jun 19, 2023 at 3:58 PM Carlos O'Donell <carlos@redhat.com> wrote:
>>> Pre-commit CI regression caused by this series on both aarch64 and aarch32.
>>> https://patchwork.sourceware.org/project/glibc/patch/20230617222218.642125-6-bugaevc@gmail.com/
>>>
>>> # Running glibc:io ...
>>> # FAIL: io/check-installed-headers-c
>>> # FAIL: io/check-installed-headers-cxx
>>> #
>>> # Running glibc:misc ...
>>> # FAIL: misc/check-installed-headers-c
>>> # FAIL: misc/check-installed-headers-cxx
>>> #
>>> # Running glibc:rt ...
>>> # FAIL: rt/check-installed-headers-c
>>> # FAIL: rt/check-installed-headers-cxx
>>
>> Thank you :(
>>
>> I'm a little bit lost in the Jenkins web UI -- is there anywhere I can
>> view the output of the failing tests to understand what exactly went
>> wrong? I see the sum files here [0], but not the .out files.
>>
>> [0]: https://ci.linaro.org/job/tcwg_glibc_check--master-arm-build/386/artifact/artifacts/artifacts.precommit/sumfiles/
> 
> I could not reproduce it locally, so I think it is a limitation of our 
> (Linaro) CI framework, where not all tests artifacts are being removed 
> with make test-clean. Maxim already sent a fix for this [1].
> 
> [1] https://patchwork.sourceware.org/project/glibc/patch/20230615152547.2924770-1-maxim.kuvyrkov@linaro.org/

Hi Surgey,

Now that we have proper tests (30379efad117b and a3090c2c98facb) for
check-installed-headers with fortify usage, it is easy to trigger the
regressions. I think it should be easy to fix, and I hope we get this
fortify extension in the next development cycle.
Sergey Bugaev July 21, 2023, 3:50 p.m. UTC | #16
On Fri, Jul 21, 2023 at 4:59 PM Adhemerval Zanella Netto
<adhemerval.zanella@linaro.org> wrote:
> Hi Surgey,

Hi,

> Now that we have proper tests (30379efad117b and a3090c2c98facb) for
> check-installed-headers with fortify usage, it easy to trigger the regressions.
> I think it should be easy to fix and I hope we get this fortify extension
> on next development cycle.

That's great! Yes, the regression should be very easy to fix (need to
add an ifdef check around F_DUPFD_CLOEXEC). I've been putting it off
because I saw (some of?) Frederic's work landing, so I'll have to
rebase and see if there's any new breakage; and also because of the
upcoming release (this is clearly not 2.38 material at this point).
I'll look into rebasing & retesting this weekend then.

Sergey (not Surgey :D)