
Conversation

@gbaraldi (Member) commented Sep 9, 2025

No description provided.

@KristofferC (Member)

Any performance numbers for how this affects the runtime of a "typical" coverage run?

@gbaraldi (Member, Author) commented Sep 9, 2025

Haven't measured, but uncontended locks are very cheap. If there is contention on the coverage data, it starts losing performance.
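
As a rough illustration of that claim (not code from this PR; `cov_lock`, `counter`, and `bump!` are made-up names), here is a minimal Julia sketch comparing uncontended vs. contended acquisition of a lock guarding a counter, loosely analogous to a lock around coverage data:

```julia
# Rough micro-benchmark: cost of taking a lock around a counter increment
# when a single task owns it vs. when many tasks fight over it.
# Run with several threads, e.g. `julia -t 8 lock_bench.jl`.
using Base.Threads

const cov_lock = ReentrantLock()
const counter = Ref(0)

function bump!(n)
    for _ in 1:n
        lock(cov_lock)
        counter[] += 1
        unlock(cov_lock)
    end
end

bump!(10^5)  # warm up / compile

# Uncontended: one task repeatedly takes the lock.
@time bump!(10^7)

# Contended: all threads hammer the same lock for the same total work.
@time begin
    @sync for _ in 1:nthreads()
        @spawn bump!(10^7 ÷ nthreads())
    end
end
```

The uncontended loop should stay cheap, while the contended loop slows down noticeably as tasks serialize on the lock.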

@nsajko (Member) commented Sep 9, 2025

xref #59355

@vtjnash added the "backport 1.12" (Change should be backported to release-1.12) label on Sep 9, 2025
@IanButterworth (Member) commented Sep 9, 2025

We should run julia-buildkite CI with this branch before merging so that the scheduled coverage jobs are run. Given the work that went into making that green, it would be a shame to make it so slow that it breaks again this quickly.

@vtjnash added the "merge me" (PR is reviewed. Merge when all tests are passing) label on Sep 9, 2025
@IanButterworth (Member) commented Sep 9, 2025

The coverage job on JuliaCI/julia-buildkite#482 seems to have aborted early? Or it has somehow sped up significantly.. it's hard to tell because the logs are huge and the tests don't pass. I didn't see the test report on a quick look.

https://buildkite.com/julialang/julia-buildkite-scheduled/builds/1421#01992fb5-bdfd-4cd4-ba59-059c17ec4601

@IanButterworth (Member) commented Sep 10, 2025

It seems the macOS run actually completed.. huh, it went from 5h -> 3h 26m. Maybe nothing else was running on the Mac at that time?

Linux slowed down 6h 3m -> 6h 38m https://buildkite.com/julialang/julia-buildkite-scheduled/builds/1421#01992fb5-bdfb-4986-bd65-5499c11a8746
Windows slowed down 6h 13m -> timeout at 7h https://buildkite.com/julialang/julia-buildkite-scheduled/builds/1421#01992fb5-bdfe-4716-8c85-7b75cbef1c69

Maybe there's too much noise to be sure whether it's a real slowdown, but we should at least increase the timeout from 7h given it seems to be relatively close even on master under usual CI conditions (I saw shorter times than this during the weekend when I adjusted that timeout down from 12h).

There are probably also tests we should just skip on the coverage jobs.. like only running the threads_exec stuff once.

@vtjnash (Member) commented Sep 10, 2025

Seems about right then. The main risk with seeing it get a lot faster is likely just that it might be crashing more / failing more tests, and thus losing coverage. That doesn't seem to be the case here. The threads_exec test itself isn't actually slow; it just has a recently (and unnecessarily) added test that threads can sleep for a minute, which is then run repeatedly unnecessarily.

@vtjnash merged commit da5d73c into master on Sep 10, 2025
7 checks passed
@vtjnash deleted the gb/coverage-fun branch September 10, 2025 14:00
@IanButterworth removed the "merge me" label on Sep 10, 2025
Labels: backport 1.12, code coverage