Algorithm::FeatureSelection - Pairwise Mutual Information and Information Gain for feature selection
    use Algorithm::FeatureSelection;

    my $fs = Algorithm::FeatureSelection->new();

    # feature-class data structure ...
    my $features = {
        feature_1 => {
            class_a => 10,
            class_b => 2,
        },
        feature_2 => {
            class_b => 11,
            class_d => 32,
        },
        .
        .
        .
    };

    # get pairwise mutual information
    my $pmi = $fs->pairwise_mutual_information($features);
    my $pmi = $fs->pmi($features);    # same as above

    # get information gain
    my $ig = $fs->information_gain($features);
    my $ig = $fs->ig($features);      # same as above
This library is a Perl implementation of 'Pairwise Mutual Information' and 'Information Gain', two well-known feature-selection methods in the text-mining field.
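To make the two measures concrete, here is a rough sketch of the arithmetic, assuming the standard definitions IG(f) = H(C) - [p(f) H(C|f) + p(not f) H(C|not f)] and PMI(f, c) = log2( p(f, c) / (p(f) p(c)) ), applied to the SYNOPSIS data. Python is used only to illustrate the math; this is not this module's API, and the module's exact implementation may differ:

```python
from math import log2

# Feature-class counts in the same shape the module's $features takes.
features = {
    "feature_1": {"class_a": 10, "class_b": 2},
    "feature_2": {"class_b": 11, "class_d": 32},
}

def entropy(counts):
    """Shannon entropy (base 2) of a dict of counts."""
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

def class_totals(features):
    """Total count per class, and the grand total, over all features."""
    class_counts, n = {}, 0
    for fc in features.values():
        for cls, c in fc.items():
            class_counts[cls] = class_counts.get(cls, 0) + c
            n += c
    return class_counts, n

def information_gain(features):
    """IG(f) = H(C) - weighted entropy of C given f present/absent."""
    class_counts, n = class_totals(features)
    h_class = entropy(class_counts)
    ig = {}
    for f, fc in features.items():
        nf = sum(fc.values())
        # class counts among items NOT carrying feature f
        rest = {cls: class_counts[cls] - fc.get(cls, 0) for cls in class_counts}
        h_cond = (nf / n) * entropy(fc) + ((n - nf) / n) * entropy(rest)
        ig[f] = h_class - h_cond
    return ig

def pmi(features):
    """PMI(f, c) = log2( p(f, c) / (p(f) p(c)) ) for each feature-class pair."""
    class_counts, n = class_totals(features)
    scores = {}
    for f, fc in features.items():
        p_f = sum(fc.values()) / n
        for cls, c in fc.items():
            scores[(f, cls)] = log2((c / n) / (p_f * (class_counts[cls] / n)))
    return scores
```

With these counts, feature_1 has a high information gain (knowing it is present almost pins the class down to class_a), and PMI is positive for every pair that co-occurs more often than chance.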
    my $features = {
        feature_1 => {
            class_a => 10,
            class_b => 2,
        },
        feature_2 => {
            class_b => 11,
            class_d => 32,
        },
        .
        .
        .
    };

    my $fs = Algorithm::FeatureSelection->new();
    my $ig = $fs->information_gain($features);
Short alias for information_gain().
    my $features = {
        feature_1 => {
            class_a => 10,
            class_b => 2,
        },
        feature_2 => {
            class_b => 11,
            class_d => 32,
        },
        .
        .
        .
    };

    my $fs = Algorithm::FeatureSelection->new();
    my $igr = $fs->information_gain_ratio($features);
Short alias for information_gain_ratio().
    my $features = {
        feature_1 => {
            class_a => 10,
            class_b => 2,
        },
        feature_2 => {
            class_b => 11,
            class_d => 32,
        },
        .
        .
        .
    };

    my $fs = Algorithm::FeatureSelection->new();
    my $pmi = $fs->pairwise_mutual_information($features);
Short alias for pairwise_mutual_information().
Calculates entropy.
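The entropy here is the Shannon entropy of a class-count distribution, H = -sum_i p_i * log2(p_i). A minimal Python sketch of that arithmetic (illustration only, not this module's Perl API):

```python
from math import log2

def entropy(counts):
    """Shannon entropy (base 2) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c)

# e.g. feature_1's class counts from the SYNOPSIS: 10 of class_a, 2 of class_b
print(entropy([10, 2]))  # roughly 0.65 bits
```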
Takeshi Miki <miki@cpan.org>
This library is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
To install Algorithm::FeatureSelection, copy and paste the appropriate command into your terminal.
cpanm
cpanm Algorithm::FeatureSelection
CPAN shell
    perl -MCPAN -e shell
    install Algorithm::FeatureSelection