Bug 160150 - performance degrades when deleting many files
Status: RESOLVED FIXED
Product: beagle
Classification: Other
Component: General
Version: 0.0.x
Hardware/OS: Other / Linux
Priority: Normal   Severity: normal
Target Milestone: 1.5
Assigned To: Jon Trowbridge
Duplicates: 160888
Depends on:
Blocks:
 
 
Reported: 2004-12-01 22:46 UTC by Joe Shaw
Modified: 2005-06-30 18:51 UTC
See Also:
GNOME target: ---
GNOME version: ---



Description Joe Shaw 2004-12-01 22:46:58 UTC
The system performance really takes a hit if you rm -rf a directory, because
every one of those inotify events triggers an instantaneous delete from the index.

It'd be nice if beagled queued up deletes for, say, 500ms so that it can process
these in batches.
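
A minimal sketch of the kind of debounced batching described here (illustrative Python only; beagled itself is a C#/Mono daemon, and every name below is hypothetical):

import threading
import time

class BatchedDeleter:
    """Collect delete events and flush them as one batch after a quiet period."""

    def __init__(self, flush_callback, delay=0.5):
        self.flush_callback = flush_callback  # called with the list of queued paths
        self.delay = delay                    # quiet period before flushing (~500 ms)
        self._pending = []
        self._timer = None
        self._lock = threading.Lock()

    def on_delete_event(self, path):
        """Queue a delete event instead of touching the index immediately."""
        with self._lock:
            self._pending.append(path)
            # Restart the timer so the batch only flushes once events stop arriving.
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(self.delay, self._flush)
            self._timer.start()

    def _flush(self):
        with self._lock:
            batch, self._pending = self._pending, []
            self._timer = None
        if batch:
            self.flush_callback(batch)

if __name__ == "__main__":
    deleter = BatchedDeleter(lambda paths: print("removing %d entries from index" % len(paths)))
    for i in range(1000):      # simulate the burst of inotify events from `rm -rf`
        deleter.on_delete_event("/tmp/dir/file-%d" % i)
    time.sleep(1)              # let the timer fire

A real implementation would probably also cap how long a flush can be deferred, so a steady stream of events cannot postpone index updates indefinitely.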
Comment 1 Joe Shaw 2004-12-01 22:52:36 UTC
Filter out "joe milestone spam" to ignore these.

Moving milestones from versions to target milestones, and adding in actual versions
Comment 2 Joe Shaw 2004-12-09 20:51:42 UTC
*** Bug 160888 has been marked as a duplicate of this bug. ***
Comment 3 Arthur Peters 2005-06-17 15:27:05 UTC
I have experienced the same problem when adding files. I created ~7000 small
files and beagled used 100% of 2 CPUs for several minutes indexing them.

I know this is an extreme case, but imagine uncompressing an archive. This could
produce the same effect, especially if it is a huge archive like the Linux
kernel source.

I think queuing the events is a very good idea. Could the list of new files (or
deleted ones) then be passed to the same system that handles throttling indexing
at start-up? No reason to implement throttling twice.
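
A sketch of that suggestion, funnelling batched events into a single rate-limited scheduler so throttling lives in one place (again illustrative Python with hypothetical names, not beagled's actual scheduler):

import queue
import threading
import time

class ThrottledIndexScheduler:
    """Drain a shared work queue at a bounded rate, whatever the source of the items."""

    def __init__(self, process_item, items_per_second=50):
        self.process_item = process_item
        self.interval = 1.0 / items_per_second
        self.queue = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def enqueue_batch(self, paths):
        """Both the start-up crawl and batched inotify events go through here."""
        for path in paths:
            self.queue.put(path)

    def _run(self):
        while True:
            path = self.queue.get()
            self.process_item(path)
            time.sleep(self.interval)  # throttle so a burst cannot pin the CPU

if __name__ == "__main__":
    scheduler = ThrottledIndexScheduler(lambda p: print("indexing", p), items_per_second=5)
    scheduler.enqueue_batch(["/tmp/kernel/file-%d" % i for i in range(10)])
    time.sleep(3)   # let the worker drain the queue

With this split, the BatchedDeleter sketched earlier would simply hand its flushed batch to enqueue_batch, so the indexing rate is limited in exactly one place.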
Comment 4 Joe Shaw 2005-06-30 18:51:35 UTC
I checked in some scheduler tweaks which should fix this.