Duplicate raw default values on eval rather than on init #25

Merged 8 commits into solnic:master on Nov 16, 2011



4 participants


Currently, default values are dup'ed on initialization. This, however,
leaves the raw default value exposed to be modified at runtime.


class Foo
  include Virtus
  attribute :bar, Array, :default => []
end

foo_one = Foo.new
foo_one.bar << 'baz'
foo_two = Foo.new

foo_two.bar #=> ['baz']

This bug is active on all supported dupable attribute types. My solution (to dup the raw default value on every eval), while not as efficient, should ensure that the raw default value keeps its hands clean.
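The idea of the fix can be sketched with a simplified stand-in for Virtus's default-value object (the class and method names here are hypothetical, not the actual Virtus internals): keep the raw default untouched and hand out a copy on every evaluation.

```ruby
# Simplified stand-in for a default-value holder (hypothetical names,
# not the real Virtus implementation).
class DefaultValue
  def initialize(value)
    @value = value # raw default, never exposed directly
  end

  # Duplicate on every evaluation so callers cannot mutate the raw default.
  # Non-dupable values (frozen objects, etc.) are returned as-is.
  def evaluate
    @value.respond_to?(:dup) && !@value.frozen? ? @value.dup : @value
  end
end

default = DefaultValue.new([])
copy = default.evaluate
copy << 'baz'
default.evaluate # => [] -- the raw default stays clean
```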

I have yet to write an acceptance spec for this, and I'm not even sure that this is how I want to fix this bug. Mostly I wanted to document the issue so I'd remember it in the morning when I'll take a look at fixing it.

In the meantime, it's late so I'm off to get some shuteye.

dkubb commented Oct 1, 2011

We definitely should dup inside #evaluate, which should only get executed once when the value is being assigned to the virtus instance. That's the same approach we use in DataMapper.

I would rather not have #value be dynamic though, and would rather localize it to #evaluate.

@solnic, wdyt?


@dkubb I agree: if #evaluate is only called once, and #value is never called at runtime, then we should duplicate the value in #evaluate.

I'll move the duplication line to #evaluate, and write up an acceptance spec tomorrow.

dkubb commented Oct 3, 2011

@rclosner thinking about this a bit more, in DataMapper we would actually consider using :default => [] to be a problem, the same as using :default => '' for a String.

If the value isn't immutable, then we would encourage :default => proc { [] } to make it completely unambiguous that evaluation happens at runtime. Really, this is the same thing as how Hash.new([]) works. Sure, Ruby could maybe guess that you wanted to use a dup of the Array for each Hash value, but it could be wrong. There may be valid use cases for using the same object for each value in the Hash.
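The Hash.new([]) comparison is the classic shared-default gotcha in Ruby: a single default object is reused for every missing key, so mutations leak across keys, while the block form (analogous to :default => proc { [] }) yields a fresh object per evaluation:

```ruby
# One shared array serves as the default for every missing key.
shared = Hash.new([])
shared[:a] << 1
shared[:b] # => [1] -- same array object, mutation leaked

# The block form builds (and here assigns) a fresh array per key,
# the same way :default => proc { [] } defers evaluation to runtime.
fresh = Hash.new { |hash, key| hash[key] = [] }
fresh[:a] << 1
fresh[:b] # => [] -- independent arrays
```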

Now, with that said, I'm not against the idea; syntactically it's quite beautiful, even if it does break a bit from convention. If it is done, I think there are more use cases to consider. For example, what if someone does :default => [ obj1, obj2, obj3 ]? Do we deep copy? What if those are supposed to be references to 3 singleton objects, and deep copying breaks the reference?

I really don't have a good answer for this; I don't think there's a strategy that could work for one set of use cases that would break others. I guess if I had to make a gut call, I would document the usage of Proc objects as default values, warn for any non-immutable objects used as default values (basically anything that has a proper #dup method), and then defer adding this until we've all had time to think about it more.

solnic commented Nov 15, 2011

@dkubb Let's just fix it by moving duplication to #evaluate - @rclosner are you still up to finishing this patch?


@solnic, if @dkubb agrees then I'll submit the patch to move dup'ing the default to #evaluate.

dkubb commented Nov 15, 2011

Yeah, I agree, we should dup inside evaluate.

It would also be great to have some specs that would've broken with the old behaviour, but pass with the new one. That way we can prevent any unintentional regressions.
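A regression check along these lines could mirror the acceptance example above. This sketch avoids a Virtus dependency by hand-rolling an attribute with dup-on-evaluate semantics; it asserts exactly what the old behaviour would have broken:

```ruby
# Self-contained regression check (no Virtus dependency; a hand-rolled
# attribute with dup-on-evaluate semantics stands in for the fix).
class Foo
  RAW_DEFAULT = [].freeze # the raw default, never handed out directly

  def bar
    @bar ||= RAW_DEFAULT.dup # evaluate: each instance gets a fresh copy
  end
end

foo_one = Foo.new
foo_one.bar << 'baz'
foo_two = Foo.new

foo_one.bar # => ['baz']
foo_two.bar # => [] -- the old behaviour would have leaked 'baz' here
```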

@lovehandle pushed a commit to lovehandle/virtus that referenced this pull request on Nov 16, 2011:
Ryan Closner: Dup default value on evaluate rather than on initialization (Issue #25)
@solnic merged commit 9568cc4 into solnic:master on Nov 16, 2011