
Conversation

@diegorusso
Contributor

No description provided.

install = True

# Specify '-j' parameter in 'make' command
jobs = 32

Contributor

If I'm not misunderstanding, 10 (pool size) * 32 will be a serious load!

Contributor Author

@diegorusso commented Oct 29, 2025

The machine I was running the backfill on has 192 cores, but the last 16 have been reserved for benchmarks, so there are really 176 cores available. I can decrease it to 16, or maybe decrease the number of parallel jobs.
In reality this doesn't cause a problem, because the compilation step with so many cores takes just a few minutes.
Anyway, I'll set it to something less aggressive (24).

Contributor Author

Also bear in mind that this file is used for the daily run of pyperformance, when the 192 cores would otherwise be idle :)
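
For reference, a minimal sketch of how the less aggressive setting would look in the config snippet at the top of this conversation (same ini-style format; 24 is the value proposed in this thread, and the core counts come from the discussion above):

install = True

# Specify '-j' parameter in 'make' command.
# Worker processes compiling in parallel can briefly oversubscribe the
# 176 available cores, but the compile step only lasts a few minutes.
jobs = 24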

@corona10
Member

Could you give me until this weekend?

@diegorusso
Contributor Author

> Could you give me until this weekend?

No rush!

@corona10
Member

corona10 commented Nov 4, 2025

Sorry for the delay, I plan to review within 24 hours.

Member

@corona10 left a comment

Overall LGTM. I left some comments, but it is all up to you!

Comment on lines +27 to +28
sha = revision[0]
branch = revision[1]

Member

Suggested change
sha = revision[0]
branch = revision[1]
sha, branch = revision

nit

"""


def get_revisions():

Member

Suggested change
def get_revisions():
def get_revisions() -> tuple[str, str]:

Comment on lines +52 to +64
pool = Pool(8)
signal.signal(signal.SIGINT, original_sigint_handler)
try:
    res = pool.map_async(run_pyperformance, get_revisions())
    # Without the timeout this blocking call ignores all signals.
    res.get(86400)
except KeyboardInterrupt:
    print("Caught KeyboardInterrupt, terminating workers")
    pool.terminate()
else:
    print("Normal termination")
    pool.close()
pool.join()

Member

Just a minor suggestion: why not use a simpler and more Pythonic approach like this?

Suggested change
pool = Pool(8)
signal.signal(signal.SIGINT, original_sigint_handler)
try:
    res = pool.map_async(run_pyperformance, get_revisions())
    # Without the timeout this blocking call ignores all signals.
    res.get(86400)
except KeyboardInterrupt:
    print("Caught KeyboardInterrupt, terminating workers")
    pool.terminate()
else:
    print("Normal termination")
    pool.close()
pool.join()

signal.signal(signal.SIGINT, original_sigint_handler)
with Pool(8) as pool:
    res = pool.map_async(run_pyperformance, get_revisions())
    # Without the timeout this blocking call ignores all signals.
    res.get(86400)
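
To make the suggestion concrete, here is a minimal self-contained sketch of the with-Pool pattern. run_pyperformance and get_revisions are simplified stand-ins for the script's real functions, used only for illustration, and the SIGINT handling follows the usual ignore-then-restore recipe rather than copying the file verbatim:

import signal
from multiprocessing import Pool


def run_pyperformance(revision):
    # Stand-in for the real benchmarking step.
    sha, branch = revision
    print(f"Running pyperformance for {sha} ({branch})")


def get_revisions():
    # Placeholder data; the real script derives (sha, branch) pairs elsewhere.
    return [("abc123", "main"), ("def456", "3.13")]


if __name__ == "__main__":
    # Let the workers inherit SIG_IGN so Ctrl-C is delivered only to the
    # parent, then restore the original handler once the pool exists.
    original_sigint_handler = signal.signal(signal.SIGINT, signal.SIG_IGN)
    with Pool(8) as pool:
        signal.signal(signal.SIGINT, original_sigint_handler)
        res = pool.map_async(run_pyperformance, get_revisions())
        # Without the timeout this blocking call ignores all signals.
        res.get(86400)

One nicety of the context-manager form: on KeyboardInterrupt, leaving the with block calls pool.terminate() automatically, which is what the original try/except did by hand.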
