[FEAT][EXP] Adaptive optimizations (CF-831) #1003
base: main
Conversation
```python
if len(prev_candidates) == 1:
    # we already have the refinement going for this single candidate tree, no need to do adaptive optimize
    return None
```
TODO: instead of checking the tree length, check whether a refinement happened before, and only call the adaptive endpoint if so.
The reason is that calling adaptive on a non-refined candidate is a bit of duplication, since the candidate will be refined anyway, so we can end up with these two processes running in parallel:

optimize (c1) -> repair (c2) -> adaptive (c3)
optimize (c1) -> repair (c2) -> refine (c3) -> adaptive (c4)

resulting in 3 different candidates (2 adaptive, 1 refined). It's better to only have

optimize (c1) -> repair (c2) -> refine (c3) -> adaptive (c4)

which results in 2 candidates (1 refined, 1 adaptive).
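The gating logic described in the TODO could be sketched as follows. This is a hypothetical illustration, assuming the candidate type tracks which stage produced it (the `Candidate` class and its `source` field here are assumptions, not the actual codeflash types):

```python
from dataclasses import dataclass


@dataclass
class Candidate:
    # Hypothetical stand-in for the PR's candidate type.
    optimization_id: str
    source: str  # which stage produced it: "optimize", "repair", "refine", ...


def should_adaptive_optimize(prev_candidates: list[Candidate]) -> bool:
    """Gate the adaptive call on a prior refinement instead of tree length.

    Only call the adaptive endpoint once at least one candidate came from
    refinement, so the single desired chain is produced:
    optimize (c1) -> repair (c2) -> refine (c3) -> adaptive (c4)
    """
    return any(c.source == "refine" for c in prev_candidates)
```

With this check, a chain that has only been optimized and repaired skips the adaptive call until a refined candidate appears.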
KRRT7 left a comment
Code Review: Adaptive Optimizations
Potential Issue: Forest Cycle/Corruption
In `CandidateForest.add()`, if `/adaptive_optimize` returns a candidate with a `parent_id` that doesn't exist yet, a placeholder node is created:

```python
if parent is None:
    parent = CandidateNode(candidate=None)  # placeholder
    self.nodes[pid] = parent
```

This placeholder has `candidate=None`, but `path_to_root()` unconditionally appends `node.candidate`:

```python
def path_to_root(self) -> list[OptimizedCandidate]:
    path = []
    node: CandidateNode | None = self
    while node:
        path.append(node.candidate)  # Will append None for placeholder
        node = node.parent
    return path[::-1]
```

Then in `call_adaptive_optimize()`, the code iterates over this path and accesses `c.optimization_id`, `c.source_code`, etc., which would raise an `AttributeError` if `c` is `None`.
Question: What prevents the adaptive optimize endpoint from returning a candidate with a parent_id that hasn't been processed yet? If this can happen, path_to_root() needs to filter out None candidates or the placeholder logic needs rethinking.
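One of the suggested fixes (filtering out `None` candidates) could look like the sketch below. `CandidateNode` here is a minimal stand-in for the class in the PR, not the real implementation:

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class CandidateNode:
    # Minimal stand-in: the real class holds an OptimizedCandidate.
    candidate: object | None = None
    parent: CandidateNode | None = None

    def path_to_root(self) -> list:
        """Walk parent links to the root, skipping placeholder nodes."""
        path = []
        node: CandidateNode | None = self
        while node:
            if node.candidate is not None:  # skip candidate=None placeholders
                path.append(node.candidate)
            node = node.parent
        return path[::-1]  # root-first order
```

This keeps `call_adaptive_optimize()` safe even if a placeholder remains in the chain, at the cost of silently shortening the path; the alternative is to guarantee parents are processed before children so placeholders never survive.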
used Claude to help me review here, testing it for that

@KRRT7 in adaptive optimization the new candidate's parent is the last candidate in the tree

@mohammedahmed18 why do we need a new endpoint for adaptive optimization? why can't we modify existing endpoints to take past optimizations into consideration?

@aseem then we'll have to keep those optimizations in memory in the backend, which would be tricky to implement I guess

@aseembits93 @KRRT7 actually I'm using the same optimizer context used in optimize & line-profiler optimize. The duplication we have in all 5 endpoints can be refactored, but maybe not into a single endpoint imo; that would make it hard for us to implement something specific for a single endpoint.
depends on https://github.com/codeflash-ai/codeflash-internal/pull/2157