
Conversation

@mohammedahmed18 (Contributor) commented Dec 31, 2025

@mohammedahmed18 marked this pull request as ready for review on January 1, 2026 at 17:10
Comment on lines 1099 to 1101
if len(prev_candidates) == 1:
# we already have the refinement going for this single candidate tree, no need to do adaptive optimize
return None
@mohammedahmed18 (Contributor, Author) commented Jan 1, 2026

TODO: instead of checking the tree length, check whether a refinement has already happened for this candidate, and only call the adaptive endpoint if so.

The reason is that calling adaptive on a non-refined candidate duplicates work, since the candidate will be refined anyway.

So we can end up with these two pipelines running in parallel:

optimize (c1) -> repair (c2) -> adaptive (c3)
optimize (c1) -> repair (c2) -> refine (c3) -> adaptive (c4)

resulting in 3 different candidates (2 adaptive, 1 refined).

It's better to only have:

optimize (c1) -> repair (c2) -> refine (c3) -> adaptive (c4)

which results in 2 candidates (1 refined, 1 adaptive).
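The proposed guard could be sketched roughly as follows. This is a hypothetical illustration, not the actual codebase API: the `Candidate.source` field and the `Node` class here are stand-ins I invented for the real `OptimizedCandidate`/tree types, and the walk up `parent` links replaces the `len(prev_candidates) == 1` check.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Candidate:
    # hypothetical stand-in: which step produced this candidate,
    # e.g. "optimize", "repair", "refine", "adaptive"
    source: str


@dataclass
class Node:
    candidate: Candidate
    parent: Optional["Node"] = None


def should_call_adaptive(node: Node) -> bool:
    """Call the adaptive endpoint only if a refinement already happened
    somewhere on the path to the root, instead of checking tree length."""
    current: Optional[Node] = node
    while current is not None:
        if current.candidate.source == "refine":
            return True
        current = current.parent
    return False
```

With this guard, the non-refined branch (`optimize -> repair`) never spawns its own adaptive call, so only the refined pipeline reaches the adaptive endpoint.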

@KRRT7 (Collaborator) left a comment

Code Review: Adaptive Optimizations

Potential Issue: Forest Cycle/Corruption

In CandidateForest.add(), if /adaptive_optimize returns a candidate with a parent_id that doesn't exist yet, a placeholder node is created:

if parent is None:
    parent = CandidateNode(candidate=None)  # placeholder
    self.nodes[pid] = parent

This placeholder has candidate=None, but path_to_root() unconditionally appends node.candidate:

def path_to_root(self) -> list[OptimizedCandidate]:
    path = []
    node: CandidateNode | None = self
    while node:
        path.append(node.candidate)  # Will append None for placeholder
        node = node.parent
    return path[::-1]

Then in call_adaptive_optimize(), the code iterates over this path and accesses c.optimization_id, c.source_code, etc., which would raise an AttributeError if c is None.

Question: What prevents the adaptive optimize endpoint from returning a candidate with a parent_id that hasn't been processed yet? If this can happen, path_to_root() needs to filter out None candidates or the placeholder logic needs rethinking.
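The "filter out None" option mentioned above could look like the sketch below. This is an illustrative, simplified stand-in for the real CandidateNode (candidate=None marks a placeholder, as in CandidateForest.add()), not the actual patch:

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class CandidateNode:
    # simplified stand-in: candidate=None marks a placeholder node
    # created for a parent_id that hasn't been processed yet
    candidate: Optional[Any] = None
    parent: Optional["CandidateNode"] = None

    def path_to_root(self) -> list:
        path = []
        node: Optional[CandidateNode] = self
        while node:
            # skip placeholders so callers never see a None candidate
            # and never hit AttributeError on c.optimization_id etc.
            if node.candidate is not None:
                path.append(node.candidate)
            node = node.parent
        return path[::-1]
```

The trade-off is that a filtered path silently hides the gap; if placeholders really can occur, it may be better to log or assert when one is skipped.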

@KRRT7 (Collaborator) commented Jan 1, 2026

I used Claude to help me review here; testing it for that.

@mohammedahmed18 (Contributor, Author) commented:

@KRRT7 in adaptive optimization the new candidate's parent is the last candidate in the tree,
so for any adaptive optimization we are sure its parent already exists.

https://github.com/codeflash-ai/codeflash-internal/blob/5d938dd3f9afe86c9d163eef97b4c28ae2ac7388/django/aiservice/adaptive_optimizer/adaptive_optimizer.py#L103-L105
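If that invariant holds (the server always sets parent_id to a candidate already present in the client's tree), the placeholder branch in add() is never taken for adaptive responses. A minimal sketch of that happy path, using simplified stand-ins I made up for the real Candidate/CandidateNode/CandidateForest classes:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Candidate:
    # simplified stand-in for OptimizedCandidate
    optimization_id: str
    parent_id: Optional[str] = None


@dataclass
class CandidateNode:
    candidate: Candidate
    parent: Optional["CandidateNode"] = None


class CandidateForest:
    def __init__(self) -> None:
        self.nodes = {}

    def add(self, candidate: Candidate) -> CandidateNode:
        # under the stated invariant, an adaptive candidate's parent_id
        # always refers to a node already in the forest, so no
        # placeholder node is ever needed on this path
        parent = self.nodes.get(candidate.parent_id)
        node = CandidateNode(candidate=candidate, parent=parent)
        self.nodes[candidate.optimization_id] = node
        return node
```

A lightweight assert on parent being non-None for adaptive candidates would turn a broken invariant into a loud failure instead of a latent None in the path.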

@KRRT7 previously approved these changes Jan 1, 2026
@aseembits93 (Contributor) commented:

@mohammedahmed18 why do we need a new endpoint for adaptive optimization? why can't we modify existing endpoints to take past optimizations into consideration?

@KRRT7 (Collaborator) commented Jan 1, 2026

@aseem then we'll have to keep those optimizations in memory in the backend, which would be tricky to implement, I guess

@mohammedahmed18 (Contributor, Author) commented:

@aseembits93 @KRRT7 actually I'm using the same optimizer context used in optimize & line-profiler optimize.

https://github.com/codeflash-ai/codeflash-internal/blob/main/django/aiservice/adaptive_optimizer/adaptive_optimizer_context.py

the duplication we have across all 5 endpoints can be refactored, but maybe not into a single endpoint imo; that would make it hard for us to implement something specific to a single endpoint.

4 participants