S3 files corruption #68

Closed
estezz opened this issue Jan 14, 2015 · 4 comments

estezz (Contributor) commented Jan 14, 2015

Secor is appending the tail end of a longer message to the output of a shorter message that follows it.

If you load these two messages into Kafka:
{"array":[1,2,3],"boolean":true,"null":null,"dog":"brown","dog2":"brown","dog3":"brown","dog4":"brown","dog5":"brown","dog6":"brown","dog7":"brown","dog8":"brown","dog9":"brown","dog10":"brown","dog11":"brown","number":123,"object":{"a":"b","c":"d","e":"f"},"string":"Hello World"}

{"array":[1,2,3],"boolean":true,"null":null,"number":123,"object":{"a":"b","c":"d","e":"f"},"string":"Hello World"}

the Secor file in S3 will look like this; note the extra data appended to the end of line 14.
13: {"array":[1,2,3],"boolean":true,"null":null,"dog":"brown","dog2":"brown","dog3":"brown","dog4":"brown","dog5":"brown","dog6":"brown","dog7":"brown","dog8":"brown","dog9":"brown","dog10":"brown","dog11":"brown","number":123,"object":{"a":"b","c":"d","e":"f"},"string":"Hello World"}

14: {"array":[1,2,3],"boolean":true,"null":null,"number":123,"object":{"a":"b","c":"d","e":"f"},"string":"Hello World"}n","dog6":"brown","dog7":"brown","dog8":"brown","dog9":"brown","dog10":"brown","dog11":"brown","number":123,"object":{"a":"b","c":"d","e":"f"},"string":"Hello World"}

This looks like an issue with BytesWritable. Here is some code that tests it:
package secor;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;

public class TestWriter {

    private SequenceFile.Writer mWriter;

    public static void main(String[] args) {
        TestWriter testWriter = new TestWriter();
        try {
            testWriter.init();

            // Write a longer record followed by a shorter one.
            testWriter.write(10, "This is the shit");
            testWriter.write(11, "Leo");
            testWriter.mWriter.close();

            testWriter.read();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void init() throws IOException {
        Configuration config = new Configuration();
        FileSystem fs = FileSystem.get(config);
        Path fsPath = new Path("/tmp/testfile");

        this.mWriter = SequenceFile.createWriter(fs, config, fsPath,
            LongWritable.class, BytesWritable.class);
    }

    public void write(long key, String value) throws IOException {
        LongWritable writeableKey = new LongWritable(key);
        BytesWritable writeableValue = new BytesWritable(value.getBytes());
        System.out.println(new String(writeableValue.getBytes()));
        this.mWriter.append(writeableKey, writeableValue);
    }

    public void read() throws IOException {
        Configuration config = new Configuration();
        FileSystem fs = FileSystem.get(config);
        Path fsPath = new Path("/tmp/testfile");

        SequenceFile.Reader reader = new SequenceFile.Reader(fs, fsPath, config);
        LongWritable key = new LongWritable();
        BytesWritable value = new BytesWritable();
        System.out.println("reading file " + fsPath);
        while (reader.next(key, value)) {
            // getBytes() returns the full backing buffer, which is not shrunk
            // when a shorter record is read into the same BytesWritable, so the
            // tail of the previous, longer record shows up after the short one.
            System.out.println(new String(value.getBytes()));
        }
        reader.close();
    }
}
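
For reference, the trailing bytes in the test output come from the way BytesWritable reuses its backing buffer: getBytes() returns the whole buffer, and only the first getLength() bytes belong to the record that was just read, so a short record read after a long one still shows the long record's tail. Below is a minimal sketch of a reader that truncates to the valid length; the TestReaderFixed class name is mine and it reuses the /tmp/testfile path from the test above. This is only an illustration of the getLength() idea, not the actual LogFilePrinterMain fix.

package secor;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;

public class TestReaderFixed {

    public static void main(String[] args) throws IOException {
        Configuration config = new Configuration();
        FileSystem fs = FileSystem.get(config);
        Path fsPath = new Path("/tmp/testfile");

        SequenceFile.Reader reader = new SequenceFile.Reader(fs, fsPath, config);
        LongWritable key = new LongWritable();
        BytesWritable value = new BytesWritable();
        while (reader.next(key, value)) {
            // Decode only the valid prefix of the reused backing buffer.
            System.out.println(new String(value.getBytes(), 0, value.getLength()));
        }
        reader.close();
    }
}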

estezz (Contributor, Author) commented Jan 14, 2015

This appears to be a problem with com.pinterest.secor.main.LogFilePrinterMain rather than with the writing of the file. I will close this issue and open a new one for com.pinterest.secor.main.LogFilePrinterMain.

estezz closed this as completed Jan 14, 2015
foovungle commented

Has this been fixed? Thx

estezz (Contributor, Author) commented Mar 30, 2015

Yes, this is fixed. It was a tool bug, not a data problem. See my comment from Jan 14.

foovungle commented

Got it. Thx
