The program points start from 1, so max_point should be equal to
length (), not length () - 1 (see the sketch below).
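
A minimal sketch of the off-by-one (standard C++ with std::vector used
only as a stand-in to illustrate the count vs. last-index distinction;
not part of the patch):

    #include <cassert>
    #include <vector>

    int main ()
    {
      /* Program points are numbered 1, 2, ..., n, so the container
         holding them has size () == n and its last point is n.  */
      std::vector<unsigned> points = {1, 2, 3};
      unsigned max_point = points.size ();   /* 3, the last point      */
      assert (max_point == points.back ());
      assert (points.size () - 1 == 2);      /* old value: one short   */
      return 0;
    }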
Tested on RV64 with no regressions.
gcc/ChangeLog:
	* config/riscv/riscv-vector-costs.cc (update_local_live_ranges):
	Use length () instead of length () - 1 for max_point.
Signed-off-by: demin.han <[email protected]>
---
gcc/config/riscv/riscv-vector-costs.cc | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/gcc/config/riscv/riscv-vector-costs.cc b/gcc/config/riscv/riscv-vector-costs.cc
index 484196b15b4..9f7fe936a29 100644
--- a/gcc/config/riscv/riscv-vector-costs.cc
+++ b/gcc/config/riscv/riscv-vector-costs.cc
@@ -759,7 +759,7 @@ update_local_live_ranges (
We will be likely using one more vector variable. */
unsigned int max_point
- = (*program_points_per_bb.get (bb)).length () - 1;
+ = (*program_points_per_bb.get (bb)).length ();
auto *live_ranges = live_ranges_per_bb.get (bb);
bool existed_p = false;
tree var = type == load_vec_info_type
--
2.44.0