GNOME Bugzilla – Bug 749372
WARNING **: send_infos_cb and send_done_cb: »org.gtk.vfs.Enumerator«, g-dbus-error-quark, 19
Last modified: 2016-04-08 06:33:55 UTC
I am using gvfs on Fedora 22. Every time I log in to a GNOME session, or when I connect to the first gvfs storage after logging in, I get these error messages:

org.gtk.vfs.Daemon[11842]: ** (process:12878): WARNING **: send_infos_cb: No such interface 'org.gtk.vfs.Enumerator' on object at path /org/gtk/vfs/client/enumerator/1 (g-dbus-error-quark, 19)

I get about 20 to 80 of these messages, and one to three of them with send_done_cb instead of send_infos_cb. Sometimes it changes to:

org.gtk.vfs.Daemon[1746]: ** (process:2900): WARNING **: send_infos_cb: The connection is closed (g-io-error-quark, 18)

which is repeated 20 to 40 times, with send_done_cb instead of send_infos_cb 1 to 2 times.
Thanks for the bug report. This probably isn't a bug in gvfs; it might instead indicate, e.g., a crash of another application using gvfs. These warnings are printed when an enumeration job isn't finished ordinarily. I see only a few of those warnings in my logs. Some of them happened immediately after org.gnome.Nautilus.service started:

May 15 09:07:57 localhost.localdomain /usr/libexec/gdm-x-session[1880]: Activating service name='org.gnome.Nautilus'
May 15 09:07:57 localhost.localdomain /usr/libexec/gdm-x-session[1880]: Successfully activated service 'org.gnome.Nautilus'
May 15 09:08:02 localhost.localdomain org.gtk.vfs.Daemon[1985]: ** (process:28388): WARNING **: send_infos_cb: No such interface 'org.gtk.vfs.Enumerator' on object at path /org/gtk/vfs/client/enumerator/1 (g-dbus-error-quark, 19)
May 15 09:08:02 localhost.localdomain org.gtk.vfs.Daemon[1985]: ** (process:28388): WARNING **: send_infos_cb: No such interface 'org.gtk.vfs.Enumerator' on object at path /org/gtk/vfs/client/enumerator/1 (g-dbus-error-quark, 19)
May 15 09:08:02 localhost.localdomain org.gtk.vfs.Daemon[1985]: ** (process:28388): WARNING **: send_done_cb: No such interface 'org.gtk.vfs.Enumerator' on object at path /org/gtk/vfs/client/enumerator/1 (g-dbus-error-quark, 19)

I also see some tracker crashes:

May 15 09:15:54 localhost.localdomain gnome-session[1991]: (tracker-miner-fs:2327): Tracker-CRITICAL **: tracker_indexing_tree_file_is_indexable: assertion 'G_IS_FILE (file)' failed

Could you attach your journal log with those warnings? Maybe we can find what's wrong from the log...
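For context, here is a minimal sketch of the daemon-side pattern that prints these warnings, assuming a plain g_dbus_connection_call(); the real gvfs code uses generated proxies for org.gtk.vfs.Enumerator, so the helper and the method name below are only illustrative:

#include <gio/gio.h>

/* Callback for an async D-Bus call back into the client's enumerator
 * object. If the client has already dropped that object, the reply is
 * an error like "No such interface ..." (g-dbus-error-quark, 19); if
 * the client exited, it is typically "The connection is closed"
 * (g-io-error-quark, 18, i.e. G_IO_ERROR_CLOSED). */
static void
send_done_cb (GObject *source, GAsyncResult *res, gpointer user_data)
{
  GError *error = NULL;
  GVariant *ret;

  ret = g_dbus_connection_call_finish (G_DBUS_CONNECTION (source), res, &error);
  if (error != NULL)
    {
      g_warning ("send_done_cb: %s (%s, %d)",
                 error->message, g_quark_to_string (error->domain), error->code);
      g_error_free (error);
      return;
    }
  g_variant_unref (ret);
}

/* After sending the last batch of file infos, the daemon calls back
 * into the client-side object seen in the logs ("Done" is an assumed
 * method name for this sketch): */
static void
send_done (GDBusConnection *connection, const char *client_bus_name)
{
  g_dbus_connection_call (connection, client_bus_name,
                          "/org/gtk/vfs/client/enumerator/1",
                          "org.gtk.vfs.Enumerator", "Done",
                          NULL, NULL, G_DBUS_CALL_FLAGS_NONE, -1,
                          NULL, send_done_cb, NULL);
}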
Can you provide a GPG key to encrypt to? I think the log will contain sensitive information.
There shouldn't be any sensitive data, but I understand. We don't necessarily need the full log; you can attach only the parts with the send_infos_cb/send_done_cb messages plus some reasonable context, and check that context for sensitive data yourself, e.g. using:

journalctl -b | grep -E '(send_infos_cb|send_done_cb)' -B 15 -A 5

If you encrypt the result with my key, the other developers won't be able to see it... Is that acceptable for you?
Created attachment 303432 [details] journal output as described in previous comment

Ok, see the attached journal. The system was booted at 15:18 and, as far as I know, the user didn't use any gvfs features like (s)ftp, smb, mounting archives, …
Thanks. I see the following line immediately before/after the send_infos_cb/send_done_cb lines:

May 15 15:31:57 localhost.localdomain org.gnome.Nautilus[1823]: (nautilus:3634): Gtk-WARNING **: gtk_widget_size_allocate(): attempt to allocate widget with width -15 and height 34

This might indicate that something is wrong with Nautilus, but I tested it on F22 and those Nautilus warnings seem unrelated. The long sequences of send_infos_cb warnings indicate that something is trying to use gvfs to read a directory with thousands of files, but doesn't handle the result... You can check which shares are mounted:

gvfs-mount -li

and which daemons are running:

ps ax | grep gvfs

If you also see those lines when mounting the first storage, you can try running your own main gvfs daemon before mounting and watching which folders are being read during the mount:

GVFS_DEBUG=1 /usr/libexec/gvfsd --replace

Could you attach the outputs of those commands, please?
Created attachment 303593 [details] output of gvfs-mount -li

Note that I don't have any partitions mounted using gvfs. There are only some system partitions like /tmp, /proc, /sys and one btrfs partition with three mount points on /, /var, /home. No removable storage at all. Another hard disk is present (/dev/sdb in the attached output), but it is not mounted.
Ok, there is nothing useful there. Hmm, but something could still be reading from e.g. trash, recent files, the network... Could you also attach the outputs of the other two commands (from Comment 5), please?
Created attachment 303683 [details] Output of `ps ax | grep gvfs`
I found a reliable way to trigger this issue: restart the nautilus process. If I close the old one (e.g. using gnome-system-monitor, sending SIGTERM) and start nautilus again, those lines show up in syslog. Running `GVFS_DEBUG=1 /usr/libexec/gvfsd --replace` doesn't generate anything at all, since I don't have any storage mounted and don't mount one to trigger the issue. Note that these warnings are NOT printed by a gvfs process but by the dbus-daemon instance started by gdm-x-session. There is another dbus-daemon running for the same user name (!), started by at-spi-bus-launcher, which does not emit these warnings.
The http and trash daemons are running:

3363 tty2 Sl+ 0:00 /usr/libexec/gvfsd-http --spawner :1.6 /org/gtk/gvfs/exec_spaw/0
4607 tty2 Sl+ 0:00 /usr/libexec/gvfsd-trash --spawner :1.6 /org/gtk/gvfs/exec_spaw/1

The trash daemon is usually started when you open Nautilus, but http shouldn't be running by default... However, enumeration isn't implemented for the http backend, so I suppose something (probably Nautilus) is reading from the trash... What version of nautilus are you using? Hmm, you are right, "GVFS_DEBUG=1 /usr/libexec/gvfsd --replace" seems useless for already-running backends...
Sorry, wrong info: when I get this message:

org.gtk.vfs.Daemon[1713]: ** (process:4607): WARNING **: send_infos_cb: No such interface 'org.gtk.vfs.Enumerator' on object at path /org/gtk/vfs/client/enumerator/1 (g-dbus-error-quark, 19)

PID 1713 is the dbus-daemon and PID 4607 is gvfsd-trash.
Created attachment 303687 [details] backtrace for send_infos_cb
Created attachment 303688 [details] backtrace for send_done_cb
I hope those backtraces help you. I don't know why gvfsd-http is running; GNOME's sharing functionality is disabled.
Oh, it is nautilus 3.16.2
As a workaround I emptied the trash and the problem is gone for now.
(cleared needinfo, since I gave the requested information some weeks ago.)
Thanks for the backtraces, and sorry I didn't respond sooner. I looked at the backtraces; unfortunately, they aren't really useful. Those warnings signal a problem on the client side, but the backtraces are from the gvfs side. As I said before, it doesn't necessarily mean that something is wrong. It might just mean that the client exited, or stopped enumerating before all infos were transferred. I just wonder why so many "No such interface" warnings are printed, because usually "The connection is closed" is printed in such cases. Finally I am able to "reproduce" it:
1/ start Nautilus and click on the Trash folder (with a really large number of files, or with GVfsJobEnumerate modified to send small batches and sleep in between)
2/ immediately click e.g. on the Home folder
3/ "No such interface" warnings are printed
It needs more investigation to find out what is happening there. Maybe the enumeration can be terminated more cleanly, though this isn't really critical... Also, maybe we shouldn't print those warnings at all...
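At the GIO level, the reproduction boils down to something like the following sketch (my illustration, not Nautilus code; whether the warnings actually appear depends on timing and on how many files are in the trash):

/* Enumerate trash:/// and abandon the enumeration after the first
 * batch, as a file manager does when the user navigates away. The
 * trash backend may then still try to deliver queued batches to the
 * now-gone client-side enumerator object, producing the send_infos_cb
 * warnings. Build with: gcc repro.c $(pkg-config --cflags --libs gio-2.0) */
#include <gio/gio.h>

static GMainLoop *loop;

static void
first_batch_cb (GObject *source, GAsyncResult *res, gpointer user_data)
{
  GList *infos = g_file_enumerator_next_files_finish (G_FILE_ENUMERATOR (source),
                                                      res, NULL);

  g_print ("got %u infos, abandoning enumeration\n", g_list_length (infos));
  g_list_free_full (infos, g_object_unref);

  /* Drop the enumerator without reading the rest and without closing
   * it cleanly -- "click on Trash, then immediately click on Home". */
  g_object_unref (source);
  g_main_loop_quit (loop);
}

static void
enumerate_cb (GObject *source, GAsyncResult *res, gpointer user_data)
{
  GError *error = NULL;
  GFileEnumerator *enumerator =
    g_file_enumerate_children_finish (G_FILE (source), res, &error);

  if (enumerator == NULL)
    {
      g_printerr ("enumerate failed: %s\n", error->message);
      g_error_free (error);
      g_main_loop_quit (loop);
      return;
    }

  /* Ask for one small batch only. */
  g_file_enumerator_next_files_async (enumerator, 10, G_PRIORITY_DEFAULT,
                                      NULL, first_batch_cb, NULL);
}

int
main (void)
{
  GFile *trash = g_file_new_for_uri ("trash:///");

  g_file_enumerate_children_async (trash, "standard::name",
                                   G_FILE_QUERY_INFO_NONE, G_PRIORITY_DEFAULT,
                                   NULL, enumerate_cb, NULL);

  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  g_object_unref (trash);
  return 0;
}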
(In reply to Ondrej Holy from comment #18)
> Finally I am able to "reproduce" it:
> 1/ start Nautilus and click on the Trash folder (with a really large
> number of files, or with GVfsJobEnumerate modified to send small
> batches and sleep in between)

That seems to be the reason in my case too. I had _many_ files in the trash folder.
I am pretty sure that it is related to the following bugs and will be fixed by the recently pushed patches:
https://bugzilla.gnome.org/show_bug.cgi?id=763218
https://bugzilla.gnome.org/show_bug.cgi?id=763600

*** This bug has been marked as a duplicate of bug 763218 ***