On Tue, May 30, 2017 at 12:44:21PM +0200, Michal Privoznik wrote:
> I've been experimenting with sparse streams and found a bug. If you try
> to download a volume which doesn't support sparseness here's what
> happens:
>
>   # virsh vol-download --sparse /dev/disk/by-path/ip-XX.XX.XX.XX:3260-iscsi-iqn.2017-03.com.blah:server-lun-0 /mnt/floppy/blah.raw
>   # echo $?
>   0
>   # ls -lhs /mnt/floppy/bla.raw
>   0 -rw-r--r-- 1 root root 0 May 30 12:40 /mnt/floppy/bla.raw
>
> That's not good. iSCSI doesn't know anything about sparseness so an
> error is expected here. Fortunately, the fix is fairly simple:
>
>   # virsh vol-download --sparse /dev/disk/by-path/ip-XX.XX.XX.XX:3260-iscsi-iqn.2017-03.com.blah:server-lun-0 /mnt/floppy/bla.raw
>   error: cannot close volume /dev/disk/by-path/ip-XX.XX.XX.XX:3260-iscsi-iqn.2017-03.com.blah:server-lun-0
>   error: Unable to seek to data: Invalid argument
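
For the record, that "Unable to seek to data: Invalid argument" is what you
get when the source doesn't support hole detection at all: as far as I can
tell the hole lookup boils down to an lseek(SEEK_DATA), which a block device
such as an iSCSI LUN typically rejects with EINVAL, hence "Invalid argument".
A minimal sketch of probing for that up front; this is illustrative only,
the helper name is made up and it is not the actual libvirt code:

#define _GNU_SOURCE              /* SEEK_DATA/SEEK_HOLE on Linux */
#include <errno.h>
#include <stdbool.h>
#include <sys/types.h>
#include <unistd.h>

/* Return true if @fd supports hole detection via SEEK_DATA,
 * false otherwise (e.g. EINVAL on a block device). */
static bool
supports_sparse_detection(int fd)
{
    off_t cur = lseek(fd, 0, SEEK_CUR);

    if (cur == (off_t) -1)
        return false;

    if (lseek(fd, cur, SEEK_DATA) == (off_t) -1) {
        /* ENXIO only means there is no data past @cur (e.g. a trailing
         * hole), which still counts as sparse-capable; EINVAL means
         * SEEK_DATA isn't supported here at all. */
        return errno == ENXIO;
    }

    lseek(fd, cur, SEEK_SET);    /* restore the original position */
    return true;
}

Checking something like that before the transfer starts would let the client
fail with a clear message instead of erroring out only at close time.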
I'm also getting confusing errors when there is no space on the
destination:
error: cannot receive data from volume fedora.img
error: An error occurred, but the cause is unknown
But that's not related to the sparse streams (unless it was caused by
making the iohelper a thread).
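
If it is the iohelper, my guess would be that the real errno (ENOSPC in this
case) gets lost somewhere between the failing write() and the RPC error the
client prints. Purely to illustrate what I mean, here is a generic write loop
that keeps the cause around; this is not the actual iohelper code and the
helper name is made up:

#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Write all @len bytes from @buf to @fd.  On failure, report the real
 * cause (e.g. ENOSPC -> "No space left on device") instead of letting a
 * generic error bubble up. */
static int
write_all(int fd, const char *buf, size_t len)
{
    while (len > 0) {
        ssize_t n = write(fd, buf, len);

        if (n < 0) {
            if (errno == EINTR)
                continue;
            fprintf(stderr, "write failed: %s\n", strerror(errno));
            return -1;           /* errno is left set for the caller */
        }
        buf += n;
        len -= (size_t) n;
    }
    return 0;
}

With something along those lines at the bottom, the user would at least see
"No space left on device" instead of "the cause is unknown".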
... a few moments later, after /me tries just a thing or two ...

Well, this made me try out a few more things and I found a few issues.
I'm not sure what's related to your patches and what's not, so here's
the rundown, and I'll let you decide:
- vol-download --sparse --offset $source_file_size --length 1
  /path/to/source.file destination.file

  - Every now and then (not always) it gets stuck waiting to receive
    data from the daemon (see the backtrace below and the sketch of the
    receive loop after it), but the daemon is not waiting for anything;
    it's just some weird race. We can try debugging it with Wireshark
    later. That file ends with a hole.
Thread 1 (Thread 0x7f1d2b434880 (LWP 28584)):
#0 0x00007f1d2796efbd in poll () at ../sysdeps/unix/syscall-template.S:84
#1 0x00007f1d2a806ee3 in poll (__timeout=5000, __nfds=2, __fds=0x7ffe9effd640) at
/usr/include/bits/poll2.h:46
#2 virNetClientIOEventLoop (client=client@entry=0x563525bb06d0,
thiscall=thiscall@entry=0x563525badc00) at rpc/virnetclient.c:1664
#3 0x00007f1d2a8074d3 in virNetClientIO (client=client@entry=0x563525bb06d0,
thiscall=0x563525badc00) at rpc/virnetclient.c:1957
#4 0x00007f1d2a80780e in virNetClientSendInternal (client=client@entry=0x563525bb06d0,
msg=msg@entry=0x563525bb03d0, expectReply=expectReply@entry=true,
nonBlock=nonBlock@entry=false) at rpc/virnetclient.c:2132
#5 0x00007f1d2a808dfc in virNetClientSendWithReplyStream
(client=client@entry=0x563525bb06d0, msg=msg@entry=0x563525bb03d0,
st=st@entry=0x563525bade10) at rpc/virnetclient.c:2236
#6 0x00007f1d2a80ab2d in virNetClientStreamRecvPacket (st=st@entry=0x563525bade10,
client=0x563525bb06d0, data=data@entry=0x7f1d20686010 "",
nbytes=nbytes@entry=262120, nonblock=false, flags=32766, flags@entry=1) at
rpc/virnetclientstream.c:499
#7 0x00007f1d2a7e0e3e in remoteStreamRecvFlags (st=0x563525badc60, data=0x7f1d20686010
"", nbytes=262120, flags=1) at remote/remote_driver.c:5664
#8 0x00007f1d2a7c8347 in virStreamRecvFlags (stream=stream@entry=0x563525badc60,
data=0x7f1d20686010 "", nbytes=nbytes@entry=262120, flags=flags@entry=1) at
libvirt-stream.c:361
#9 0x00007f1d2a7c9b7f in virStreamSparseRecvAll (stream=stream@entry=0x563525badc60,
handler=0x563525760196 <virshStreamSink>, holeHandler=0x56352576020b
<virshStreamSkip>, opaque=opaque@entry=0x7ffe9effd954) at libvirt-stream.c:964
#10 0x000056352576232e in cmdVolDownload (ctl=0x7ffe9effda40, cmd=<optimized out>)
at virsh-volume.c:834
#11 0x00005635257662f1 in vshCommandRun (ctl=0x7ffe9effda40, cmd=0x563525bacf40) at
vsh.c:1327
#12 0x000056352572aee2 in main (argc=9, argv=<optimized out>) at virsh.c:929
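
For reference, the receive loop that virStreamSparseRecvAll() (frame #9
above) runs is essentially the following. This is only a hand-rolled sketch
against the public API, assuming st is the connected download stream and fd
a seekable destination file; it is not the virsh code and it glosses over
short writes:

#include <sys/types.h>
#include <unistd.h>
#include <libvirt/libvirt.h>

static int
sparse_recv_loop(virStreamPtr st, int fd)
{
    char buf[256 * 1024];

    for (;;) {
        int got = virStreamRecvFlags(st, buf, sizeof(buf),
                                     VIR_STREAM_RECV_STOP_AT_HOLE);

        if (got == 0)            /* end of stream */
            return 0;

        if (got == -3) {         /* the server signalled a hole */
            long long len;

            if (virStreamRecvHole(st, &len, 0) < 0)
                return -1;
            /* Skip over the hole in the destination.  If the stream ends
             * right here (a trailing hole, as in the case above),
             * something like ftruncate() is still needed afterwards to
             * give the file its full size. */
            if (lseek(fd, len, SEEK_CUR) == (off_t) -1)
                return -1;
            continue;
        }

        if (got < 0)             /* -1 error, -2 would-block */
            return -1;

        if (write(fd, buf, got) != got)   /* short writes ignored here */
            return -1;
    }
}

The poll() at the top of the trace is the client waiting for the next stream
packet from the daemon on behalf of one of those virStreamRecvFlags() calls,
which matches the "stuck waiting to receive data" symptom.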
While trying to reproduce yet another one, the command got stuck even
with different offsets.
- vol-download --sparse --offset $X --length 1
  /path/to/source.file destination.file

  - This does not respect the length if:

      $X > $source_file_size - $last_hole_size

    The downloaded size ends up being $source_file_size - $X.
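
To put numbers on that (made up for illustration): with a 10M source whose
last 2M are a hole, --offset 9M --length 1 hits the condition above
(9M > 10M - 2M), and the destination comes out 10M - 9M = 1M long instead of
the requested single byte.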
I'm afraid to try more things, but I can provide more info for these if
you want.
Have a nice day,
Martin