
Thursday, September 7, 2017

WordCount


Word count using Hadoop MapReduce

package com.ciocomit.count;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class WordCount {
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);

            // emit (word, 1) for every token in the line
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }

    }// class

    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // sum all the 1s emitted for this word
            int sum = 0;
            for (IntWritable x : values) {
                sum += x.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }// class

    public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "mywc");

        job.setJarByClass(WordCount.class);
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        Path outputPath = new Path(args[1]);

        // Configure the job's input/output paths on the filesystem
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, outputPath);

        // Delete the output path from HDFS first, so it doesn't have to be removed by hand
        outputPath.getFileSystem(conf).delete(outputPath, true);

        // Submit the job and wait for it to finish
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}// class
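
A minimal way to run the job from a terminal, assuming the class has been packaged into a jar (hypothetically named wordcount.jar here) and the input text is already in HDFS; the paths are only examples:

hadoop jar wordcount.jar com.ciocomit.count.WordCount /user/hduser/input /user/hduser/output

# inspect the result once the job finishes
hdfs dfs -cat /user/hduser/output/part-r-00000

Since the driver deletes the output path before submitting, the same command can be run repeatedly without cleaning up by hand.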

Monday, July 17, 2017

AWK: n lines before, m lines after

awk 'c-->0; $0~s{if(b)for(c=b+1;c>1;c--)print r[(NR-c+1)%b];print;c=a}b{r[NR%b]=$0}' b=3 a=5 s="cadenita" short.txt
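
This prints b lines of context before and a lines after every line that matches the pattern s (here, 3 lines before and 5 after each occurrence of "cadenita"). When GNU grep is available, the same simple case can be covered with the -B/-A options (grep additionally prints a -- separator between context groups); the awk version remains handy on systems whose grep lacks those flags:

grep -B 3 -A 5 "cadenita" short.txt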

Saturday, July 15, 2017

Git Merge


git branch                       # local branches

git branch -r                   # remote branches

git branch -a                   # local and remote branches

git status                         # current branch status

git checkout develop     # go to "develop" branch

git checkout hotfix/tk_01_inv    # go to "hotfix/tk_01_inv" branch

git pull   # download the remote changes into current branch

git add .    # stage all changes

git commit -m 'some comment'    # commit into local branch

git push    # publish my local changes to the remote Git repository



git merge: the "develop" branch receives the changes from the "hotfix/tk_01_inv" branch

git branch -a    # local and remote branches
git status

git checkout hotfix/tk_01_inv   # branch with new changes
git pull

git push

git checkout develop

git pull

git status

git merge --no-ff hotfix/tk_01_inv    # merge the hotfix into develop, always creating a merge commit

git status

git push
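
As a quick check (not part of the recipe itself), the history of develop should now start with the merge commit that --no-ff forces git to create:

git log --oneline --graph -n 10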

Sunday, May 14, 2017

Unix shell scripting: Full Log


Creating a full log from a shell script: standard output goes to one file and the -x trace plus errors go to another.


#!/bin/bash -x

export Me=`basename $0 .sh`
{
export current_date=`date "+%Y-%m-%d_%H%M%S"`

#
echo "Me: $Me"
echo "Current Date: $current_date"
echo -e "Successful Job. \n\n\n"
exit 0

} > ${Me}".log1" 2> ${Me}".log2"



Results

full_log.log1
Me: full_log
Current Date: 2017-05-14_200038
Successful Job.

full_log.log2
++ date +%Y-%m-%d_%H%M%S
+ export current_date=2017-05-14_200038
+ current_date=2017-05-14_200038
+ echo 'Me: full_log'
+ echo 'Current Date: 2017-05-14_200038'
+ echo -e 'Successful Job. \n\n\n'
+ exit 0
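
A possible way to run the example, assuming the script above is saved as full_log.sh (which is what produces the full_log.log1/full_log.log2 names via basename):

chmod +x full_log.sh
./full_log.sh          # normal output ends up in full_log.log1
cat full_log.log2      # the -x trace and any errors end up here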

Thursday, October 27, 2016


PERL: Check digit (dígito verificador) of a Chilean RUT with Perl.


#!/usr/bin/perl

use strict;
use warnings;

my $s_Rut = "30686957";


my @a_rut = split '', $s_Rut;
print("@a_rut" ."\n");

my @rut_rev = reverse @a_rut;
print("@rut_rev" ."\n");

my $adding=0;
my $multi=2;

# the multipliers 2..7 are applied cyclically, starting from the rightmost digit
foreach my $n (@rut_rev) {
  print ("$n \n");
  $adding = $adding + ($n * $multi++);

  if($multi==8){
      $multi=2;
  }
}

print ("$adding \n");

# check digit is 11 - (sum mod 11); a result of 11 is written as 0 and 10 as K
my $module=$adding%11;
my $dv=11-$module;

if($dv == 11){
    $dv=0;
}

if($dv == 10){
    $dv='K';
}
$s_Rut="$s_Rut-$dv";
print("RUT: $s_Rut\n");

Tuesday, March 22, 2016

Perl: File into Array

#!/usr/bin/perl

use strict;
use warnings;

my $filename = 'hobbies.txt';

open (my $handle, '<:encoding(UTF-8)', $filename)
    or die "Could not open file '$filename' $!";

chomp (my @hobbies = <$handle>);
close $handle;

my $totalHobbies = $#hobbies;    # index of the last element (count - 1)

print "hobbies: @hobbies\n";

print "---------------: $totalHobbies\n";
print "1st. hobby: $hobbies[0]\n";

my $index=9;
print "tenth hobby (index 9): $hobbies[$index]\n";
print "last hobby: $hobbies[$totalHobbies]\n";

my $randomHobby=int(rand($totalHobbies+1));
print "Random hobby: $hobbies[$randomHobby]\n";

# generate ages between 18 and 117 years (18 plus a random value from 0 to 99)
my $minYear=18;
my $range=100;
my $randomYear=int(rand($range)) + $minYear;

print "Random year: $randomYear\n";

Thursday, February 4, 2016

I love my Job


I graduated as a Computer Systems Engineer from Universidad Católica de Temuco, Chile.

My experience covers design and modeling (databases and software) with UML, as well as software development. I have worked with and administered different operating systems, including Windows and Unix/Linux, and understand how different kinds of software interact with them. I know programming languages such as Java EE (11 years), HTML, CSS, JavaScript (with Ajax and jQuery), PHP, Linux shell scripting and Python. I have developed and deployed web applications with Java EE, including JSF, Struts, EJB, JPA and Hibernate. I have administered PostgreSQL (with PL/pgSQL), Oracle (with PL/SQL), SQL Server and MySQL databases; integrated applications with OSB (Oracle Service Bus) and SOAP/REST web services to exchange data with Android mobile systems; and managed servers on FreeBSD and Linux. I have also studied data analysis and knowledge extraction from data sets (with Simulated Annealing). Through self-learning I have studied NoSQL architecture, SAP HANA, in-memory databases and Big Data.

My plan for the future is to continue working in computer systems development and to join a software engineering network.

Best regards.