Bug 90421 - Gimp difficult to use with large files

Status: RESOLVED DUPLICATE of bug 76096
Product: GIMP
Classification: Other
Component: General
Version: 1.x
OS: Other Linux
Importance: High minor
Target Milestone: ---
Assigned To: GIMP Bugs
QA Contact: GIMP Bugs
Reported: 2002-08-10 19:35 UTC by Leonard Evens
Modified: 2003-06-03 16:04 UTC

Description Leonard Evens 2002-08-10 19:35:28 UTC
I've used the Gimp successfully with scans of medium format negatives
up to 5000 x 7000, although it is a bit slow, even with a fast machine
with a lot of memory.  I am trying to use it with very large files obtained
from scanning 4 x 5 negatives.  The poor response for medium format is
exacerbated to the point that the Gimp is close to useless.  Photoshop is also
slow with such files, but it gives the user an immediate response in the
displayed image when making changes.  The Gimp changes the actual stored
image each time, and this can take so long it is difficult to experiment.
Often the Gimp will take 5 or 10 minutes to do something that seems fairly
straightforward, and I haven't been able to figure out why.  For example,
with an 8-bit b/w image of size 8800 x 11,000, I tried to select a region
with the bezier tool.   After the border is chosen and I click in the
region, it takes an extremely long time for the region to be selected.
During this time, top shows the Gimp using about 90 percent or more of
the CPU but only about 35 percent of memory.

Perhaps the number of people using the Gimp on large files is rather
small, but with more and more people using high resolution scanners,
they will become more common.  For example, scanning a medium format
negative at 4000 ppi can easily produce such a file.
Comment 1 Sven Neumann 2002-08-10 21:03:07 UTC
You did tune the size of the tile cache, didn't you?
Comment 2 Leonard Evens 2002-08-10 23:22:48 UTC
I've experimented with different tile cache sizes before, and it
didn't seem to make a big difference.  The Gimp Users Manual seems to
suggest a tile cache size of one fourth of the total memory.  Since I
have 1 GB of RAM, I set it to 256 MB.  A color image might be slightly
larger than that, but I already have lots of problems with grayscale
images, which are one third the size.
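
For concreteness, here is a back-of-envelope sizing sketch (an
editorial illustration, not part of the original comment).  The gimprc
key tile-cache-size is real, but the byte-count value syntax printed
below is an assumption about the 1.x-era format; later GIMP versions
accept suffixed values instead:

    # Hedged sketch: candidate tile-cache sizes for a 1 GB machine.
    # The "(tile-cache-size N)" gimprc line assumes the 1.x-era
    # raw byte-count syntax -- an assumption, check your version.
    RAM = 1024 * 1024 * 1024  # 1 GB of physical memory

    for fraction in (0.25, 0.5, 0.75):
        cache = int(RAM * fraction)
        print(f"{fraction:.0%} of RAM -> (tile-cache-size {cache})")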

However, this might have something to do with my problems.  It may be
that the Gimp is not making effective use of my memory when the images
get so large.  I notice an enormous improvement in response if I scale
down by a factor of two thirds.

Which tile cache size do you recommend I try?                
Comment 3 Simon Budig 2002-08-11 11:39:58 UTC
I'd recommend increasing the tile cache even more, assuming you
don't need the other 512 MB for other things.  I think the
recommendation in the GUM was written with multi-user systems in
mind.

You might also try to reduce the number of undo levels.

And one last thing: you seem to underestimate how much data the Gimp
stores for an image.  An 8800 by 11000 pixel grayscale image is
roughly 95 MB, and creating a selection in it adds another 95 MB (a
selection channel is basically another byte per pixel).  Add an undo
step (possibly for the whole image as well) and it easily exceeds the
256 MB tile cache from your original report.
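
The arithmetic works out as follows (an editorial sketch in Python,
using the per-pixel byte counts described above):

    # Hedged sketch of the memory arithmetic above: one byte per pixel
    # for the grayscale layer, one more for the selection channel, and
    # one whole-image copy for an undo step (a worst-case assumption).
    MiB = 1024 * 1024
    width, height = 8800, 11000

    layer     = width * height  # grayscale pixels, 1 byte each
    selection = width * height  # selection channel, 1 byte per pixel
    undo      = width * height  # assumed whole-image undo copy

    total = layer + selection + undo
    print(f"layer {layer / MiB:.0f} + selection {selection / MiB:.0f} "
          f"+ undo {undo / MiB:.0f} = {total / MiB:.0f} MiB")
    print("exceeds 256 MiB tile cache:", total > 256 * MiB)

Each component is 96.8 million bytes (about 92 MiB, matching the
"roughly 95 MB" above), for a total of roughly 277 MiB.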

It would be cool if Gimp could handle bigger images better though...
Comment 4 Leonard Evens 2002-08-11 15:32:45 UTC
As I noted, the increase in tile cache size to 512 MB resolved some of
my problems, and I will try even larger sizes.   But I still remember
having some problems when I had much less memory with what were then
for me relatively large files.  It does happen for example that the
Gimp and some plugin take up essentially all available memory, so it
is possible that making more space available for the program proper
reduces the memory available for a plugin.  It would be helpful if the
documentation provided guidelines on how to deal with such issues.

In the past, reducing the number of undo levels helped some but not as
much as I hoped.  Even reducing it to zero doesn't always help much. 
Recently, when dealing with large files, I've reduced it to one, and I
save regularly.  Even with smaller files, I seldom set undo to larger
than two.  Photoshop seems to be able to handle multiple undos better,
even under Windows 98.

There is still the issue of how tools like curves display changes.
That is still very slow with a large file.  In Photoshop, such changes
are made only to what is displayed on the screen.  Changing the full
image is delayed until the tool is closed. The way the Gimp does it
limits one's ability to experiment freely.  I would argue for placing
a high priority on changing this.
Comment 5 Raphaël Quinet 2002-08-12 09:21:34 UTC
We should mark this as a duplicate of bug #76096, in which I suggested
caching a scaled-down copy of the image (this is probably what
Photoshop does) and perform some operations such as the previews for
the color adjustments on the scaled-down version instead of the full
image.
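
To illustrate the caching idea (an editorial sketch using NumPy, not
GIMP's actual code; the function names are hypothetical): keep a
scaled-down cached copy for interactive previews, and touch the
full-resolution pixels only when the tool is committed.

    # Hedged sketch of the suggestion above: preview an operation (here
    # a simple curves-style lookup table) on a scaled-down cached copy,
    # and apply it to the full image only when the tool is closed.
    import numpy as np

    def make_preview(image, factor=8):
        """Cache a coarse copy by subsampling (real code would filter)."""
        return image[::factor, ::factor].copy()

    def apply_curve(pixels, lut):
        """Apply a 256-entry lookup table to 8-bit pixels."""
        return lut[pixels]

    full = np.random.randint(0, 256, (11000, 8800), dtype=np.uint8)
    preview = make_preview(full)               # cheap, cached once

    lut = np.clip(np.arange(256) * 1.2, 0, 255).astype(np.uint8)
    shown = apply_curve(preview, lut)          # fast: interactive feedback

    committed = apply_curve(full, lut)         # slow: done once, at the end

With an 8x reduction in each dimension, the preview holds under 2
percent of the pixels, which is why interactive feedback on it stays
fast even for an 8800 x 11000 image.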

Leonard, would that solve your problem?
Comment 6 Raphaël Quinet 2003-06-02 17:49:32 UTC
Due to the lack of feedback from the reporter, I am marking this as a
duplicate of bug #76096, assuming that my previous suggestion was
correct.  I have also submitted a new bug #114268 dealing more
specifically with the color transformations.

*** This bug has been marked as a duplicate of 76096 ***
Comment 7 Sven Neumann 2003-06-03 16:04:53 UTC
As pointed out in bug #114268, I don't think that this report is a
duplicate of bug #76096. The latter is about improving the scaling
algorithm for the image projection. This would not help at all when it
comes to previewing any drawable operations.