| author | Wilco Dijkstra <wdijkstr@arm.com> | 2017-10-17 18:55:16 +0100 |
|---|---|---|
| committer | Wilco Dijkstra <wdijkstr@arm.com> | 2017-10-17 18:55:16 +0100 |
| commit | 3381be5cdef2e43949db12f66a5a3ec23b2c4c90 | |
| tree | 113f4b911e7eb488ec01ded9307bedcb38fa5793 /malloc/arena.c | |
| parent | e956075a5a2044d05ce48b905b10270ed4a63e87 | |
Improve malloc initialization sequence
The current malloc initialization is quite convoluted. Instead of
sometimes calling malloc_consolidate from ptmalloc_init, call
malloc_init_state early so that the main_arena is always initialized.
The special initialization can now be removed from malloc_consolidate.
This also fixes BZ #22159.
Check all calls to malloc_consolidate and remove those that are now
redundant initialization after ptmalloc_init, as in int_mallinfo and
__libc_mallopt (the call in __libc_mallopt is kept because consolidation
is required for set_max_fast). Update comments to improve clarity.
Remove the now-impossible initialization check from _int_malloc and fix
the assert in do_check_malloc_state to ensure arena->top != 0. Fix the
obvious bugs in do_check_free_chunk and do_check_remalloced_chunk to
enable single-threaded malloc debugging (do_check_malloc_state is not
thread-safe!).
[BZ #22159]
* malloc/arena.c (ptmalloc_init): Call malloc_init_state.
* malloc/malloc.c (do_check_free_chunk): Fix build bug.
(do_check_remalloced_chunk): Fix build bug.
(do_check_malloc_state): Add assert that checks arena->top.
(malloc_consolidate): Remove initialization.
(int_mallinfo): Remove call to malloc_consolidate.
(__libc_mallopt): Clarify why malloc_consolidate is needed.
Diffstat (limited to 'malloc/arena.c')
-rw-r--r-- | malloc/arena.c | 13 |
1 file changed, 4 insertions, 9 deletions
```diff
diff --git a/malloc/arena.c b/malloc/arena.c
index 9e5a62d260..85b985e193 100644
--- a/malloc/arena.c
+++ b/malloc/arena.c
@@ -307,13 +307,9 @@ ptmalloc_init (void)

   thread_arena = &main_arena;

-#if HAVE_TUNABLES
-  /* Ensure initialization/consolidation and do it under a lock so that a
-     thread attempting to use the arena in parallel waits on us till we
-     finish.  */
-  __libc_lock_lock (main_arena.mutex);
-  malloc_consolidate (&main_arena);
+  malloc_init_state (&main_arena);

+#if HAVE_TUNABLES
   TUNABLE_GET (check, int32_t, TUNABLE_CALLBACK (set_mallopt_check));
   TUNABLE_GET (top_pad, size_t, TUNABLE_CALLBACK (set_top_pad));
   TUNABLE_GET (perturb, int32_t, TUNABLE_CALLBACK (set_perturb_byte));
@@ -322,13 +318,12 @@ ptmalloc_init (void)
   TUNABLE_GET (mmap_max, int32_t, TUNABLE_CALLBACK (set_mmaps_max));
   TUNABLE_GET (arena_max, size_t, TUNABLE_CALLBACK (set_arena_max));
   TUNABLE_GET (arena_test, size_t, TUNABLE_CALLBACK (set_arena_test));
-#if USE_TCACHE
+# if USE_TCACHE
   TUNABLE_GET (tcache_max, size_t, TUNABLE_CALLBACK (set_tcache_max));
   TUNABLE_GET (tcache_count, size_t, TUNABLE_CALLBACK (set_tcache_count));
   TUNABLE_GET (tcache_unsorted_limit, size_t,
               TUNABLE_CALLBACK (set_tcache_unsorted_limit));
-#endif
-  __libc_lock_unlock (main_arena.mutex);
+# endif
 #else
   const char *s = NULL;
   if (__glibc_likely (_environ != NULL))
```