@data».sparkle
While reading solutions to the PWC, I spotted a pattern. There were plenty of .map-calls that contained only simple maths. It must not be so! Task 179-2 lends itself to vector operations.
use v6.d;

constant @sparks = "▁" .. "█";
constant \steps = +@sparks - 1;

multi MAIN(
    #| space separated list of numbers
    *@data
) {
    my $scale := @data.&{ (.max - .min) / steps };
    put @sparks[((@data »-» @data.min) »/» $scale)».round].join;
}

multi MAIN(
    #| output an example sparkline-graph
    Bool :$test
) {
    .&MAIN for <2 4 6 8 10 12 10 8 6 4 2>, <0 1 19 20>, <0 999 4000 4999 7000 7999>;
}
First, I define a character range (no need for .ord here) and store the number of bar-graph steps. Then I calculate the scale like it is used on a map (those paper things that always work and don’t break when dropped). Now I can use that scale to shrink (or grow) values. I believe rounding is better here than truncating to integers.
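To make the arithmetic concrete, here is a quick check with numbers of my own choosing (8 spark characters give 7 steps):

```raku
my @sparks = "▁" .. "█";            # 8 block characters, U+2581 .. U+2588
my \steps  = +@sparks - 1;          # 7 steps between lowest and highest bar
my ($min, $max) = 2, 12;
my $scale = ($max - $min) / steps;  # 10/7
say (10 - $min) / $scale;           # 5.6 — lands between two bars
say @sparks[((10 - $min) / $scale).round];  # rounds to index 6: ▇
```

Truncation would have pulled 5.6 down to 5; rounding picks the nearer bar.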
When we feed a list to postcircumfix:<[ ]> we get a list of values in return. However, we break the method-call chain when we do so. Since operators are just Subs with a funny syntax, we can solve that issue easily.
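A tiny illustration of that (my own example, not from the solution): every operator can be called by its full Sub name, and .assuming curries leading arguments — including the list argument of postcircumfix:<[ ]>.

```raku
say &infix:<+>(20, 22);                                # 42
my &pick = &postcircumfix:<[ ]>.assuming(<a b c d>);   # curry in the list
say pick( (0, 2) );                                    # (a c)
say (1, 3).&pick;                                      # (b d) — chainable via .&
```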
constant &sparks = {
    constant @sparks = "▁" .. "█";
    constant \steps = +@sparks - 1;
    &postcircumfix:<[ ]>.assuming(@sparks) but role :: { has $.steps = steps }
}.();

my $scale := @data.&{ (.max - .min) / &sparks.steps };
((@data »-» @data.min) »/» $scale)».round.&sparks.join.put;
The constants are hidden within a block scope and only steps is exposed as part of the Sub’s “namespace”. By declaring &sparks as a constant, I move the currying to compile time.
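The `but role :: { … }` part mixes an anonymous role into the curried Sub, which is how steps becomes callable as a method on it. The same mixin works on any value (a sketch of mine):

```raku
my $answer = 42 but role :: { method doubled { self * 2 } };
say $answer;            # 42
say $answer.doubled;    # 84
```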
Hypers don’t get much use in PWC solutions. I believe another entry in raku-did-you-know is in order.
August 29, 2022 at 22:02