Message ID: 20230721171007.2065423-10-shikemeng@huaweicloud.com
State: Superseded
Series: A few fixes and cleanups to mballoc
Kemeng Shi <shikemeng@huaweicloud.com> writes:

> Return the good group when it is found in the loop, removing the further
> check after the loop for whether a good group was found.
>
> Signed-off-by: Kemeng Shi <shikemeng@huaweicloud.com>
> ---
>  fs/ext4/mballoc.c | 18 ++++++++----------
>  1 file changed, 8 insertions(+), 10 deletions(-)

Looks good to me. Feel free to add:

Reviewed-by: Ritesh Harjani (IBM) <ritesh.list@gmail.com>

-ritesh

> diff --git a/fs/ext4/mballoc.c b/fs/ext4/mballoc.c
> index 6f8e804905d5..b04eceeab967 100644
> --- a/fs/ext4/mballoc.c
> +++ b/fs/ext4/mballoc.c
> @@ -1043,18 +1043,16 @@ static void ext4_mb_choose_next_group_best_avail(struct ext4_allocation_context
>  			ac->ac_g_ex.fe_len);
>
>  		grp = ext4_mb_find_good_group_avg_frag_lists(ac, frag_order);
> -		if (grp)
> -			break;
> +		if (grp) {
> +			*group = grp->bb_group;
> +			ac->ac_flags |= EXT4_MB_CR_BEST_AVAIL_LEN_OPTIMIZED;
> +			return;
> +		}
>  	}
>
> -	if (grp) {
> -		*group = grp->bb_group;
> -		ac->ac_flags |= EXT4_MB_CR_BEST_AVAIL_LEN_OPTIMIZED;
> -	} else {
> -		/* Reset goal length to original goal length before falling into CR_GOAL_LEN_SLOW */
> -		ac->ac_g_ex.fe_len = ac->ac_orig_goal_len;
> -		*new_cr = CR_GOAL_LEN_SLOW;
> -	}
> +	/* Reset goal length to original goal length before falling into CR_GOAL_LEN_SLOW */
> +	ac->ac_g_ex.fe_len = ac->ac_orig_goal_len;
> +	*new_cr = CR_GOAL_LEN_SLOW;
>  }
>
>  static inline int should_optimize_scan(struct ext4_allocation_context *ac)
> --
> 2.30.0