Commit

add unique()
nothingmuch committed Sep 2, 2008
1 parent 87c4dc2 commit 66ba069
Showing 1 changed file with 17 additions and 1 deletion.
18 changes: 17 additions & 1 deletion lib/Data/Stream/Bulk/Util.pm
@@ -8,10 +8,12 @@ use warnings;
 use Data::Stream::Bulk::Nil;
 use Data::Stream::Bulk::Array;
 
+use Scalar::Util qw(refaddr);
+
 use namespace::clean;
 
 use Sub::Exporter -setup => {
-    exports => [qw(nil bulk cat filter)],
+    exports => [qw(nil bulk cat filter unique)],
 };
 
 sub nil () { Data::Stream::Bulk::Nil->new }
@@ -25,6 +27,11 @@ sub filter (&$) {
 	$stream->filter($filter);
 }
 
+sub unique ($) {
+	my %seen;
+	shift->filter(sub { [ grep { !$seen{ ref($_) ? refaddr($_) : $_ }++ } @$_ ] }); # FIXME Hash::Util::FieldHash::Compat::id()?
+}
+
 __PACKAGE__
 
 __END__
@@ -78,6 +85,15 @@ Returns C<nil> if no arguments are provided.
 Calls C<filter> on $stream with the provided filter.
 
+=item unique $stream
+
+Filter the stream to remove duplicates.
+
+Note that this may scale to O(k) memory, where k is the number of distinct
+items.
+
+In the future this will be optimized to be iterative for sorted streams.
+
 =back
 
 =cut
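The new export composes with the module's existing helpers. A minimal usage sketch, assuming Data::Stream::Bulk is installed and using the `bulk` constructor exported by the same file (the `->all` call that drains a stream into a list is assumed to be part of the stream API, per the module's documentation):

```perl
use strict;
use warnings;
use Data::Stream::Bulk::Util qw(bulk unique);

# Plain scalars are deduplicated by string value; references are
# deduplicated by refaddr, so two references to the same array
# count as one item, while equal-but-distinct arrayrefs do not.
my $ref    = [ 1, 2 ];
my $stream = unique( bulk( "a", "b", "a", $ref, $ref, "b" ) );

my @items = $stream->all;    # expected: "a", "b", $ref
```

Because the `%seen` hash is closed over by the filter callback, duplicate tracking spans block boundaries: an item seen in an earlier block is also dropped from later blocks of the same stream.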
