Add a fast path in rb_get() and rb_get_64() that checks the tree's
cached leftmost->start and rightmost->end bounds before calling
find_addr(). If the query address lies outside these bounds, return 0
immediately, avoiding an unnecessary tree walk.
This mirrors the idea used in jserv/rbtree's rb_cached_contains():
use the cached min/max to short-circuit lookups.
Behavior is unchanged (both functions still return 0 when the address
is not found); only negative lookups become cheaper.
Co-authored-by: Jim Huang <jserv@ccns.ncku.edu.tw>
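A minimal sketch of the fast path described above, assuming a
simplified node/tree layout with half-open [start, end) ranges. Only
the names mentioned in the commit (rb_get_64, find_addr, leftmost,
rightmost, start, end) are taken from it; the struct layout and the
lookup body are illustrative, not box64's actual code.

```c
#include <stddef.h>
#include <stdint.h>

/* Simplified stand-ins for box64's real structures (assumed layout). */
typedef struct rbnode_s {
    struct rbnode_s *left, *right;
    uintptr_t start;    /* inclusive lower bound of the range */
    uintptr_t end;      /* exclusive upper bound of the range */
    uint64_t  data;
} rbnode;

typedef struct rbtree_s {
    rbnode *root;
    rbnode *leftmost;   /* cached node with the smallest start */
    rbnode *rightmost;  /* cached node with the largest end */
} rbtree;

/* Existing O(log N) lookup: descend to the node containing addr. */
static rbnode *find_addr(rbtree *tree, uintptr_t addr)
{
    rbnode *n = tree->root;
    while (n) {
        if (addr < n->start)      n = n->left;
        else if (addr >= n->end)  n = n->right;
        else                      return n;   /* start <= addr < end */
    }
    return NULL;
}

uint64_t rb_get_64(rbtree *tree, uintptr_t addr)
{
    /* Fast path: if addr lies outside the cached bounds, no node can
     * contain it, so skip the tree walk and report "not found". */
    if (!tree->root
        || addr < tree->leftmost->start
        || addr >= tree->rightmost->end)
        return 0;
    rbnode *node = find_addr(tree, addr);
    return node ? node->data : 0;
}
```

The extra cost on the hot path is two comparisons; lookups that do hit
a node still go through find_addr() exactly as before, which is why
only negative lookups get cheaper.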
|
Replaced all occurrences of `rb_get_righter` with `rb_get_rightmost` and
`rb_get_lefter` with `rb_get_leftmost` for improved semantic clarity.
Also updated related function declarations, definitions, and usage
across:
- rbtree.c
- rbtree.h
- custommem.c
- dynablock.c
- env.c
- box64context.c
|
* Cache the leftmost and rightmost nodes
Add two fields to `rbtree`, `lefter` and `righter`, to cache the
leftmost and rightmost nodes respectively. This eliminates the
O(log N) traversals previously done in `rb_get_lefter()` and
`rb_get_righter()`.
Motivated by the Linux kernel's use of cached pointers for `rb_first()`
and `rb_last()`, this change makes boundary queries cheaper by
replacing repeated tree walks with a direct pointer dereference
(a sketch of the cache update follows below).
Experiment: running `chess.exe` with Box64 + Wine (#2511):
- ~3,500 insertions into the tree
- 607 lightweight cache updates (a single assignment each)
- 397 full tree traversals avoided
This reduces the runtime overhead of boundary checks at a memory cost
of two extra pointers per tree. The expected benefit grows with larger
or more dynamic workloads.
Ref: https://docs.kernel.org/core-api/rbtree.html
* Remove redundant add_range() wrapper
The function `add_range()` was only called when `tree->root == NULL`.
In such cases, the while-loop inside `add_range()` never runs,
resulting in a call to `add_range_next_to()` with `prev == NULL`.
Replaced it with direct calls to `add_range_next_to(tree, NULL, ...)`.
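A minimal sketch of the cache maintenance referenced in the first
bullet, reusing the simplified layout from the sketch further up. The
field names `lefter` and `righter` come from this commit (they were
later renamed per the rb_get_leftmost/rb_get_rightmost commit above);
the update helper and the exact return values of the boundary getters
are assumptions for illustration.

```c
#include <stdint.h>

/* Simplified stand-ins for box64's real structures (assumed layout). */
typedef struct rbnode_s {
    struct rbnode_s *left, *right;
    uintptr_t start, end;   /* half-open range [start, end) */
    uint64_t  data;
} rbnode;

typedef struct rbtree_s {
    rbnode *root;
    rbnode *lefter;     /* cached leftmost node (smallest start) */
    rbnode *righter;    /* cached rightmost node (largest end) */
} rbtree;

/* Called after a node has been linked into the tree: a single compare
 * and assignment keeps each cache current, instead of an O(log N)
 * descent to the corresponding extreme of the tree. */
void update_bound_cache(rbtree *tree, rbnode *node)
{
    if (!tree->lefter || node->start < tree->lefter->start)
        tree->lefter = node;
    if (!tree->righter || node->end > tree->righter->end)
        tree->righter = node;
}

/* Boundary queries become a direct pointer dereference. */
uintptr_t rb_get_lefter(rbtree *tree)
{
    return tree->lefter ? tree->lefter->start : 0;
}

uintptr_t rb_get_righter(rbtree *tree)
{
    return tree->righter ? tree->righter->end : 0;
}
```

Deletion (not shown) is the asymmetric case: when the cached node
itself is removed, the new extreme has to be recomputed, e.g. with one
more tree walk, so the cache mainly pays off on insertions and reads.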
|
setting
|
FindDynablockFromNativeAddress function
|
Red-black tree operations now consistently use the 'rbtree_' prefix,
and internal functions remain unexposed. Tested on RV64GC, where the
change shrinks the .text section by 498 bytes.
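A hypothetical before/after illustrating that convention; the
declarations below are examples only, not box64's actual symbols,
assuming an opaque handle in the public header and static balancing
helpers in the implementation.

```c
#include <stdint.h>

/* rbtree.h -- only rbtree_-prefixed operations are public and the
 * handle stays opaque to callers (illustrative names, not the real
 * box64 API). */
typedef struct rbtree rbtree;

rbtree  *rbtree_init(void);
void     rbtree_delete(rbtree *tree);
int      rbtree_add_range(rbtree *tree, uintptr_t start, uintptr_t end,
                          uint64_t data);
uint64_t rbtree_get_64(rbtree *tree, uintptr_t addr);

/* rbtree.c -- balancing helpers stay static, so they are never
 * exported and the compiler is free to inline or drop them:
 *
 *     static void rbtree_rotate_left(rbtree *tree, rbnode *node);
 *     static void rbtree_fix_insert(rbtree *tree, rbnode *node);
 */
```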
|
* [MEMORY] Switched from a sparse array to an RB tree
* [RBTREE] Fixed the Android build