On 03/29/2012 05:52 AM, Christophe Fergeau wrote:
> On Wed, Mar 28, 2012 at 03:20:45PM -0400, Laine Stump wrote:
>> I know I'm a bit late to the party (I missed your mail before), but
>> would it have been possible to do the "default" processing in the caller
>> rather than in the parsing function itself?
>
> I cannot disagree, since I wasn't sure at all that this was the right place
> for setting defaults; I should have mentioned this explicitly when sending
> the patch. Who is "the caller" exactly in this case?
I wasn't previously familiar with this attribute, but it looks like the
only place it's used is in qemuBuildVirtioSerialPortDevStr, where it
does the following:

1) if it's set and isn't equal to "com.redhat.spice.0", it logs an error
   and fails
2) if it's set, the option ",name=com.redhat.spice.0" is added to the
   device string.
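
In rough, simplified C, that's something like this (just a sketch of how I
read the logic, not the actual libvirt code -- the helper name, error
reporting, and buffer handling are made up for illustration):

  #include <stdio.h>
  #include <string.h>

  #define SPICE_DEFAULT_NAME "com.redhat.spice.0"

  /* Current behavior as I read it: reject any explicit name other than
   * the spice default, and only emit ",name=..." when a name was given
   * in the XML. Returns 0 on success, -1 on error. */
  static int build_port_name_option(const char *target_name,
                                    char *buf, size_t buflen)
  {
      if (target_name && strcmp(target_name, SPICE_DEFAULT_NAME) != 0) {
          fprintf(stderr, "unsupported channel target name '%s'\n",
                  target_name);
          return -1;
      }
      if (target_name)
          snprintf(buf, buflen, ",name=%s", SPICE_DEFAULT_NAME);
      else
          buf[0] = '\0';
      return 0;
  }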
So it looks like you could get the same effect by changing that last bit
to add the contents of dev->target.name or, if that's NULL,
"com.redhat.spice.0".
In this particular case I don't know if it's going to make any
difference in practice; I'm just always wary of parser code that
modifies the output such that it doesn't mirror the input. (One way of
thinking about it is that if you were to set everything in an object,
then do a Format/Parse roundtrip, the results would be different.)
That's going to end up being the case anyway, because there's already so
much existing parse code that commits this sin (my opinion), but I think
it's good to reduce this type of behavior as much as possible.
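
To illustrate (assuming the default is filled in by the parser, as in the
posted patch), input XML like:

  <channel type='spicevmc'>
    <target type='virtio'/>
  </channel>

would come back from a parse/format roundtrip with the attribute added:

  <channel type='spicevmc'>
    <target type='virtio' name='com.redhat.spice.0'/>
  </channel>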
Does anyone with more experience with the libvirt spice code have an
opinion on whether it makes any difference if we add the default in
the parsing function or only when we build the commandline? danpb?