pcordes (Peter Cordes, Canada)

KleanSweep
System Software, by yogin (134 comments)
Score 58.0%, Jun 30 2010

Forgot to mention: the total duplicated size looks like it suffers from signed 32-bit overflow. Use long long, int64_t, or off_t with
#define _FILE_OFFSET_BITS 64
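Something like this (an untested, standalone sketch, not KleanSweep's actual code) avoids the wrap; the define has to come before any system header so off_t is 64-bit even on 32-bit builds:

#define _FILE_OFFSET_BITS 64   // must precede every system header
#include <sys/stat.h>
#include <cstdio>

int main(int argc, char **argv)
{
    off_t total = 0;                     // 64-bit thanks to the define
    struct stat st;
    for (int i = 1; i < argc; ++i)
        if (stat(argv[i], &st) == 0)
            total += st.st_size;
    std::printf("total: %lld bytes\n", (long long)total);
}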
- May 01 2009
KleanSweep spends a ton of CPU time updating the GUI while loading the dup list. While actually deleting the dups, it keeps Xorg at >80% CPU usage (on my C2Duo desktop, G965 graphics, Ubuntu Jaunty). It looks like it redraws the whole window for every file it deletes! Not good! It keeps Xorg busy even when the window is minimized. (I'm running fluxbox; I don't know if kwin can tell KDE apps that they don't need to actually draw while minimized...) The not-drawing-when-minimized optimization should probably live in Qt or KDE, not in individual apps, though.
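Something like this kind of throttling (a hypothetical sketch; deleteAll() and progressLabel are names I'm making up, not KleanSweep's actual API) would avoid one redraw per file by touching the GUI at most every 100 ms:

#include <QElapsedTimer>
#include <QFile>
#include <QLabel>
#include <QObject>
#include <QStringList>

void deleteAll(const QStringList &files, QLabel *progressLabel)
{
    QElapsedTimer timer;
    timer.start();
    for (int i = 0; i < files.size(); ++i) {
        QFile::remove(files.at(i));      // the actual work
        if (timer.elapsed() > 100) {     // update the label ~10x/sec at most
            progressLabel->setText(QObject::tr("deleted %1 of %2")
                                       .arg(i + 1).arg(files.size()));
            timer.restart();
        }
    }
    progressLabel->setText(QObject::tr("deleted %1 files").arg(files.size()));
}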

BTW, I'm using it to compare sets of files recovered with different methods from a half-lost filesystem. I've mostly had to hack stuff together in Perl, because I haven't found anything that lets you subtract the files found in other directories from the files in one given directory, i.e. for a given directory, keep only the stuff that isn't a duplicate of something outside that directory. I want to subtract sets of files.

I've done some of this with Perl parsing the output saved by fslint-gui, but it's a pain; what I want is more like the sketch below.
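Something like this is what I mean (an untested sketch I'm making up here, keyed on whole-file content hashes; it prints the files under a target directory whose contents also exist under a second directory, i.e. the candidates for deletion):

#include <QCryptographicHash>
#include <QDir>
#include <QDirIterator>
#include <QFile>
#include <QSet>
#include <QTextStream>

// Hash a file's entire contents (md5 is plenty for matching dups).
static QByteArray contentHash(const QString &path)
{
    QFile f(path);
    if (!f.open(QIODevice::ReadOnly))
        return QByteArray();
    return QCryptographicHash::hash(f.readAll(), QCryptographicHash::Md5);
}

int main(int argc, char **argv)
{
    if (argc != 3)
        return 1;                        // usage: subtract <target> <other>

    // Collect the hashes of everything outside the target directory.
    QSet<QByteArray> outside;
    QDirIterator others(argv[2], QDir::Files, QDirIterator::Subdirectories);
    while (others.hasNext())
        outside.insert(contentHash(others.next()));

    // Print target files whose contents duplicate something outside.
    QTextStream out(stdout);
    QDirIterator targets(argv[1], QDir::Files, QDirIterator::Subdirectories);
    while (targets.hasNext()) {
        QString path = targets.next();
        if (outside.contains(contentHash(path)))
            out << path << '\n';
    }
}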

Hmm, I just found komparator. It might do what I want...

- May 01 2009
I don't like self-extracting archives. I prefer to have extraction handled by archive software I trust and whose behaviour I understand, e.g. by using
tar xkf ...
I know I won't replace existing files (k is tar's --keep-old-files option).

If you want to make it easier for beginners, add an undo-cleanup option to the GUI. You could just prompt the user for the file to restore from, if you don't want to get all fancy with keeping track of which file was created by which cleanup action to provide more KleanSweep-specific metadata for the restore files.
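E.g. something like this for the simple version (a made-up sketch, not a patch; the function name and dialog text are mine): prompt for the backup archive and extract it through tar with k, so a restore can never clobber existing files:

#include <QFileDialog>
#include <QObject>
#include <QProcess>
#include <QStringList>

// Restore a cleanup backup without overwriting anything that exists now.
void undoCleanup(QWidget *parent)
{
    QString archive = QFileDialog::getOpenFileName(
        parent, QObject::tr("Select the backup archive to restore from"),
        QString(), QObject::tr("Archives (*.tar *.tar.gz *.tar.bz2)"));
    if (archive.isEmpty())
        return;                          // user cancelled

    // x = extract, k = keep existing files, f = archive filename
    QProcess::execute("tar", QStringList() << "xkf" << archive);
}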
- May 01 2009