I have found that memory allocated during a search
that sorts on a field is never freed by the garbage
collector.
My test code (see attached file) executes a search
that uses Lucene.Net.Search.Sort to sort on a field.
After closing the searcher, I print the number of
bytes still held on the managed heap.
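A minimal sketch of the measurement loop described above, written in Java for illustration (the original test is .NET code). The search itself is stubbed out; only the pattern of searching, forcing a collection, and printing the bytes still in use is shown. All names here are illustrative, not taken from the attached test code.

```java
public class MemoryLoopSketch {

    // Placeholder for: searcher.search(query, new Sort("field")),
    // followed by closing the searcher.
    static void runSortedSearch() {
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) {
            runSortedSearch();
            System.gc();  // request a collection before measuring
            Runtime rt = Runtime.getRuntime();
            long used = rt.totalMemory() - rt.freeMemory();
            System.out.println(i + ": 0 Hits");
            System.out.println(used + " bytes");
        }
    }
}
```

If the sorted search leaks, the "bytes" figure grows on every iteration; with sort disabled it stays flat, which is exactly the difference the two result listings below show.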
Result of 10 searches (using sort):
0: 0 Hits
19932 bytes
1: 0 Hits
31624 bytes
2: 0 Hits
43328 bytes
...
End of search:
125592 bytes
==> memory leak, about 100 kB lost!
Result of 10 searches (sort = null):
0: 0 Hits
12876 bytes
1: 0 Hits
12876 bytes
2: 0 Hits
12852 bytes
...
End of search:
12864 bytes
==> OK
The amount of memory lost seems to depend only on the
size of the sorted field. It does not matter whether
the search matches any documents.
This can be reproduced with the following versions:
- DotLucene 1.4.3 build 002
- DotLucene 1.9 RC1
Memory leak test code
Logged In: YES
user_id=778461
I have profiled the application with CLRProfiler from
Microsoft (nice tool...). It shows that the wasted memory
was allocated in Search\FieldCache.StringIndex(int[] values,
System.String[] lookup).
But I cannot find out why this memory cannot be released
by the garbage collector. I hope somebody can solve this...
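A hedged sketch, in Java, of one way such a cache can keep memory alive: if the sort data is stored in a static map keyed by the reader object with strong references, the entries stay reachable through the map even after the searcher is closed, so the GC can never reclaim them. The class and field names below are illustrative assumptions, not the actual Lucene.Net internals.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.WeakHashMap;

public class CacheRetentionSketch {

    // A strong-keyed static cache, similar in shape to what the
    // profiler output suggests: sort data cached per reader.
    static final Map<Object, String[]> strongCache = new HashMap<>();

    public static void main(String[] args) {
        Object reader = new Object();  // stands in for an IndexReader
        strongCache.put(reader, new String[] {"sorted", "terms"});

        reader = null;  // the searcher/reader is closed and dropped
        System.gc();

        // The entry is still strongly referenced by the static map,
        // so the sort data cannot be collected: this prints 1.
        System.out.println("strong cache entries: " + strongCache.size());

        // By contrast, a WeakHashMap holds its keys weakly: once the
        // reader is otherwise unreachable, the entry becomes eligible
        // for collection (actual timing is GC-dependent).
        Map<Object, String[]> weakCache = new WeakHashMap<>();
        Object reader2 = new Object();
        weakCache.put(reader2, new String[] {"sorted", "terms"});
        reader2 = null;
        System.gc();
    }
}
```

If the .NET port caches StringIndex data this way (a strong key where a weak one was intended), it would explain why the lost memory scales with the size of the sorted field and why the collector never frees it.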
Logged In: YES
user_id=1443706
The leak is still present in 1.4.3 build 004 on .NET Framework 2.0.