
Conversation

@AlexanderKalistratov (Collaborator) commented Jul 18, 2023

Adding a laptop-friendly preset size, M16Gb, to some workloads.

@AlexanderKalistratov AlexanderKalistratov changed the title from "Modify M preset for workloads to better fit integrated GPU systems with 16GB of ram" to "Modify M preset for workloads to better fit systems with integrated GPU and 16GB of ram" on Jul 18, 2023
@AlexanderKalistratov (Collaborator, Author)

@diptorupd @ZzEeKkAa dpnp uses these numbers to report performance.

The previous numbers caused swapping on 16 GB RAM systems, preventing measurements.
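The swap problem can be illustrated with a rough memory estimate. Below is a minimal sketch (the array sizes are hypothetical, not the actual preset values changed in this PR), assuming float64 arrays and a pairwise-distance-style workload that materializes an n × n result matrix:

```python
# Rough memory estimate showing why an oversized preset can exceed
# 16 GB of RAM and trigger swapping. Sizes are hypothetical.

BYTES_PER_FLOAT64 = 8

def array_gib(n_elements: int) -> float:
    """Approximate size in GiB of a float64 array with n_elements."""
    return n_elements * BYTES_PER_FLOAT64 / 2**30

def pairwise_distance_gib(n: int, d: int) -> float:
    """Memory for an n x n distance matrix plus two n x d inputs."""
    return array_gib(n * n) + 2 * array_gib(n * d)

print(f"{pairwise_distance_gib(50_000, 3):.1f} GiB")  # ~18.6 GiB: swaps on a 16 GB machine
print(f"{pairwise_distance_gib(30_000, 3):.1f} GiB")  # ~6.7 GiB: fits comfortably
```

Because the dominant term grows as n², even a modest reduction in the preset's n brings the footprint well under 16 GB.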

@AlexanderKalistratov AlexanderKalistratov force-pushed the make_m_size_more_laptop_freindly branch from f300af1 to 90e95eb on July 18, 2023 16:51
@AlexanderKalistratov (Collaborator, Author)

@diptorupd pairwise_distance fails on CI with numba_dpex_p. Should I add it to the expected failures?

@AlexanderKalistratov AlexanderKalistratov force-pushed the make_m_size_more_laptop_freindly branch from 0baafac to 6179764 on July 18, 2023 17:17
@AlexanderKalistratov (Collaborator, Author)

rambo fails on Windows with numba_dpex_p.

@antonwolfy (Collaborator) left a comment

What about halving the M preset for l2_norm?

@AlexanderKalistratov AlexanderKalistratov force-pushed the make_m_size_more_laptop_freindly branch from 37d6a32 to 12d21e0 on July 18, 2023 21:06
@AlexanderKalistratov (Collaborator, Author)

@antonwolfy Fixed l2_norm.
As discussed with @diptorupd, we decided to add a special preset named M16Gb.
Please verify.

@AlexanderKalistratov AlexanderKalistratov force-pushed the make_m_size_more_laptop_freindly branch from 12d21e0 to 5bff057 on July 18, 2023 21:21
@antonwolfy (Collaborator) commented Jul 19, 2023

@antonwolfy Fixed l2_norm. As discussed with @diptorupd we decided to add special preset named M16Gb. Please verify

I've checked on a laptop with the M16Gb preset; it works great! Thank you @AlexanderKalistratov

@AlexanderKalistratov (Collaborator, Author)

@diptorupd @ZzEeKkAa ping

@diptorupd left a comment

Thank you!

@diptorupd

CI failures are unrelated to the change, so going ahead and merging.

@ZzEeKkAa (Contributor)

This PR is good to go; the current issue is similar to one that was fixed in IntelPython/numba-dpex#1093. For testing, CI installs an old version of numba (0.56.4). We can merge this one, and I'll work on a PR that fixes this issue.

@AlexanderKalistratov (Collaborator, Author)

@ZzEeKkAa I can't merge this PR since I don't have sufficient permissions.

@ZzEeKkAa ZzEeKkAa enabled auto-merge July 24, 2023 16:18
@ZzEeKkAa (Contributor)

@AlexanderKalistratov I've set it to auto-merge once conflicts are resolved. I'm working on those issues in a separate PR and should finish by EOD.

@ZzEeKkAa ZzEeKkAa mentioned this pull request Jul 24, 2023
@diptorupd diptorupd force-pushed the make_m_size_more_laptop_freindly branch from 5bff057 to e4f1682 on July 25, 2023 18:01
@ZzEeKkAa ZzEeKkAa merged commit d5d0fc4 into IntelPython:main Jul 25, 2023