GNOME Bugzilla – Bug 748160
disk space too large on inactive storage pool
Last modified: 2016-09-20 08:15:55 UTC
When I tried to install CoreOS, gnome-boxes allowed me to select a disk size anywhere from 21.5 GB to 18.4 EB. This means that any increase results in a multi-PB disk, thousands of times larger than my total disk space (not to mention the free space on the partition).
(In reply to Maciej Piechotka from comment #0)
> When I tried to install CoreOS gnome-boxes allows to select a disk size from
> 21.5 GB to 18.4 EB. This means that any increase results in multi PB disk -
> thousands times larger then total disk space (not mentioning space on
> partition).

I have seen this happen but haven't been able to reproduce it recently, after some fixes for this (which came before 3.14). Are you sure you are using 3.14?
3.14.3.1 if the about box is to be believed.
I'm having trouble reproducing this against Boxes 3.14.3.1. I used coreos_production_iso_image.iso that I got from https://coreos.com/docs/running-coreos/platforms/iso/ . Is this the right one?
Yes. I've just checked.
Maciej, Kalev failed to reproduce this against the same ISO as well, so I wonder if this could be specific to your filesystem. Can you give us the following info about your host:

1. Which filesystem is your home directory on?
2. What does the following command say: virsh pool-info gnome-boxes
1. btrfs, but I keep big files on ext4 via a symlink (i.e. .local/share/gnome-boxes/images is a symlink)
2. It says:

Name:           gnome-boxes
UUID:           XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXXX
State:          inactive
Persistent:     yes
Autostart:      no
(In reply to Maciej Piechotka from comment #6)
> 1. btrfs but I keep big files on ext4 via symlink (i.e.
> .local/share/gnome-boxes/images is symlink)
> 2. It says:
> State: inactive

I don't see how/why it would be inactive unless you deactivated it yourself. That *might* explain what's going on on your system. Another possibility is that libvirt may not be able to deal with btrfs. If the pool weren't inactive, the output would also include:

Capacity:       XXXX.XX GiB
Allocation:     XXXX.XX GiB
Available:      XXXX.XX GiB

Could you please try `virsh pool-start gnome-boxes` and see if that succeeds and helps with the issue?
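For anyone hitting the same symptom, a sketch of the check-and-start sequence discussed above (assuming the default Boxes pool name `gnome-boxes` and that virsh is talking to the same libvirt connection Boxes uses):

```shell
# Inspect the Boxes storage pool; an inactive pool reports
# no Capacity/Allocation/Available lines.
virsh pool-info gnome-boxes

# Start the pool for this session...
virsh pool-start gnome-boxes

# ...and optionally mark it to start automatically from now on.
virsh pool-autostart gnome-boxes
```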
(In reply to Zeeshan Ali (Khattak) from comment #7)
> Could you please try `virsh pool-start gnome-boxes` and see if that succeeds
> and helps with the issue?

Capacity:       124.88 GiB
Allocation:     69.30 GiB
Available:      55.58 GiB
(In reply to Maciej Piechotka from comment #8)
> Capacity:       124.88 GiB
> Allocation:     69.30 GiB
> Available:      55.58 GiB

Thanks, but I'm a bit confused: did you get this output after running the above command, or was it there before and you just didn't paste it? If it's the former, I meant that the command might help with the original bug, so could you try reproducing it now?
(In reply to Zeeshan Ali (Khattak) from comment #9)
> If its former, I meant that that command might help with the original bug so
> could you try to reproducing now?

Oops, sorry - I misunderstood you. It seems to work correctly now (the maximum disk size is sane).
Any idea how the pool became deactivated? Did you change your home directory somehow, or change filesystems? Thanks for sticking around to provide all the info. I have heard the same complaint from some other folks, so it seems this can happen. Boxes should start the pool before using it.
Interesting. I just looked into the code, and Boxes does start the storage pool if it's inactive. I also tested that this works on both the 3.14 and master branches, so I don't have any clue what's going on. My guess is that somehow Boxes is failing to see that the pool is inactive on your machine. If that is the case, you should have seen at least a few warnings on the console. I guess setting autostart to 'true' should help avoid such a situation, so I'll see if I should enable that.
Created attachment 303527 [details] [review]
build: Require libvirt-glib >= 0.2.1

We'll need the next version of libvirt-glib to be able to set the autostart flag on the storage pool in the following patch.
Created attachment 303528 [details] [review]
vm-creator: Set autostart=true on storage pool

Even though we ensure that the storage pool is activated if it's been deactivated for some reason, and even create the backing directory, it seems that on at least some machines we somehow fail to do that. Lacking any data on why and when that happens, let's make the storage pool autostart in the hope that it helps the situation.
18.4 EB looks like (uint64_t)-1. I guess GVirStoragePoolInfo::capacity or GVirStoragePoolInfo::available gets set to that when gvir_storage_pool_get_info() is called on an invalid storage pool, and then some error handling/notifying is missing somewhere in the stack.
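As a quick sanity check on the (uint64_t)-1 observation (my own arithmetic, not from the report): 2^64 - 1 bytes is indeed about 18.4 EB in decimal units, so a capacity field left at all-ones would produce exactly the bogus maximum reported above:

```python
# UINT64_MAX bytes expressed in decimal exabytes (10**18 bytes each),
# the unit scale a size formatter would pick for a value this large.
UINT64_MAX = 2**64 - 1
exabytes = UINT64_MAX / 10**18
print(f"{exabytes:.1f} EB")  # -> 18.4 EB
```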
(In reply to Christophe Fergeau from comment #15)
> 18.4 EB looks like (uint64_t)-1, I guess GVirStoragePoolInfo::capacity or
> GVirStoragePoolInfo::available gets set when gvir_storage_pool_get_info() is
> called on an invalid storage pool, and then some error handling/notifying is
> missing somewhere in the stack.

True! However, I failed to reproduce this and I don't see where/how this can happen from looking at the code. So I guess we can just apply the attached patches and, for now, assume that they will help?
I'm assuming these patches help.

Attachment 303527 [details] pushed as e3287b8 - build: Require libvirt-glib >= 0.2.1
Attachment 303528 [details] pushed as 193dc1d - vm-creator: Set autostart=true on storage pool