Description
I came across an inconsistency in the API that was quite hard to diagnose. If you pass a tuple of size 2 to a `Boolean` scale (or a `Nominal` one, actually), you get the reverse of what you would get from a list for some of the properties. For example, in the following code:
```python
import matplotlib.pyplot as plt
import polars as pl
import seaborn.objects as so

df = pl.DataFrame({"x": [0.1, 0.2, 0.3], "y": [3, 2, 1], "preferred": [True, False, False]})
fig = plt.figure()
p = (
    so.Plot(data=df, x="x", y="y", pointsize="preferred")
    .add(so.Dots())
    .scale(
        pointsize=so.Boolean((10, 5)),  # tuple of size 2
    )
)
p.show()
```
`True` will be mapped to the small dots and `False` to the big dots, whereas you get the reverse if you pass `[10, 5]` instead. This also happens for `alpha` and other properties; it seems to me this is actually common to everything based on `IntervalProperty`.
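For reference, here are the two forms side by side (reusing the `df` from above); the only difference is whether a tuple or a list is passed to `so.Boolean`, yet the mapping flips:

```python
# Identical data and encoding; only the scale values differ.
p_tuple = (
    so.Plot(data=df, x="x", y="y", pointsize="preferred")
    .add(so.Dots())
    .scale(pointsize=so.Boolean((10, 5)))  # tuple: True -> small, False -> big (observed)
)
p_list = (
    so.Plot(data=df, x="x", y="y", pointsize="preferred")
    .add(so.Dots())
    .scale(pointsize=so.Boolean([10, 5]))  # list: True -> big, False -> small (expected)
)
```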
In the `_get_values` method, there is a special case where a tuple is interpreted as a `(vmin, vmax)` range rather than as a list of values. There is probably some reason it is like that, but it creates a rather unpleasant inconsistency, given that other properties like `color` give the expected result.
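To make the suspected mechanism concrete, here is a toy sketch of the tuple-vs-list dispatch described above. This is not seaborn's actual implementation; the helper name, the assumed level order, and the interpolation direction are assumptions chosen only to reproduce the behaviour reported for `(10, 5)` versus `[10, 5]`:

```python
import numpy as np

def resolve_values_sketch(levels, values):
    """Toy stand-in for the IntervalProperty._get_values dispatch (illustrative only)."""
    if isinstance(values, tuple):
        # Tuple path: unpack as a (vmin, vmax) range and interpolate across the
        # levels. The descending direction here is an assumption that matches
        # the reported output, not necessarily what seaborn does internally.
        vmin, vmax = values
        return list(np.linspace(vmax, vmin, len(levels)))
    if isinstance(values, list):
        # List path: values are used positionally, one per level.
        return values[: len(levels)]
    raise TypeError("expected a tuple or a list for this sketch")

levels = [True, False]  # assumed level order for a Boolean scale
print(resolve_values_sketch(levels, (10, 5)))  # [5.0, 10.0] -> True gets the small size
print(resolve_values_sketch(levels, [10, 5]))  # [10, 5]     -> True gets the big size
```

Once the pair is consumed as range endpoints instead of per-level values, the mapping from levels to sizes no longer follows the order the user wrote, which would explain the reversal described above.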