>> > 2) Why would allowing one or both of the Bigs prevent Number from being
>> > allowed as a serializable type?
>> >
>> Not sure I said that. The problem is that if something is potentially
>> Big... then a database must be prepared to deal with it, and that has a
>> high cost.
>
>
>
> Every Puppet value is potentially a Big now.  What new cost is involved?
> I'm having trouble seeing how a database can deal efficiently with Puppet's
> current implicit typing anyway, Big values notwithstanding.  Without
> additional type information, it must be prepared for any given value to be a
> boolean, an integer, a float, or a 37 kB string (among other possibilities).
> Why do Big values present a particular problem in that regard?

So right now we have alternative native PostgreSQL columns for the
bare types: text, bigint, boolean, and double precision. This lets us
use the most efficient index for each type and, of course, avoid
storing any more than we need to. As I mentioned at the top of the
thread, we specifically do not support arbitrary-precision decimals.
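To make the idea concrete, here is a minimal sketch of how a storage layer might dispatch each value to one of several typed columns. The column names and the function are hypothetical, purely for illustration; they are not PuppetDB's actual schema or API.

```python
# Hypothetical sketch: route a value to the appropriate native typed
# column. Column names are illustrative, not PuppetDB's real schema.

INT64_MIN, INT64_MAX = -2**63, 2**63 - 1

def column_for(value):
    """Pick the native typed column for a value."""
    if isinstance(value, bool):   # check bool before int: bool subclasses int
        return "value_boolean"
    if isinstance(value, int):
        if INT64_MIN <= value <= INT64_MAX:
            return "value_integer"   # maps to PostgreSQL bigint
        return "value_numeric"       # would need arbitrary precision
    if isinstance(value, float):
        return "value_float"         # maps to double precision
    return "value_string"            # maps to text
```

Keeping each value in its native column is what lets PostgreSQL build the cheapest possible index for each type.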

To give at least one example: in PostgreSQL, once you jump to, say, a
numeric column type, the performance characteristics and index
efficiency change for the worse. The same goes for floats stored in a
numeric column. This is why you are usually better off avoiding the
conversion until you overflow and absolutely need an
arbitrary-precision decimal.
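The "defer until overflow" strategy can be sketched like this. The function name and the (column, value) return shape are invented for illustration; the only real constraint is bigint's 64-bit range.

```python
# Hypothetical sketch: keep an integer in the cheap bigint
# representation until it actually overflows, and only then promote
# it to an arbitrary-precision numeric representation.
from decimal import Decimal

INT64_MIN, INT64_MAX = -2**63, 2**63 - 1

def store_integer(n):
    """Return an illustrative (column_type, stored_value) pair."""
    if INT64_MIN <= n <= INT64_MAX:
        return ("bigint", n)          # fast path: native 64-bit column
    return ("numeric", Decimal(n))    # slow path: arbitrary precision
```

The point is that the expensive representation is an exceptional fallback, not the default, so the common case keeps the fast column type and its efficient index.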

ken.

-- 
You received this message because you are subscribed to the Google Groups 
"Puppet Developers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/puppet-dev/CAE4bNTngnVLb9Fxhbr4MBKee8sfyqLviua%2BgWWiS6q2H4eiBng%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
