I want to compress a file that is already in HDFS using different compression levels. To do so, I wrote the following program, Compress.java (a sketch of such a program is given below), and then ran these commands:
$ javac -classpath $(hadoop classpath) *.java
$ jar -cvf Compress.jar Compress.class
$ hadoop jar Compress.jar Compress file.txt test1 1
$ hadoop jar Compress.jar Compress file.txt test7 7
The file file.txt is 1 GB in size. When I then check the sizes of test1 and test7 with hdfs dfs -du -s -h, I get 594.6 M for each. This shows that the compression level is being ignored.
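Compress.java itself is not reproduced in the thread. As a rough point of reference only, a minimal sketch of the kind of program described might look like the following; it assumes hadoop-lzo's LzopCodec, and it assumes the level is meant to be passed through an io.compression.codec.lzo.compression.level property. That property name is an unverified assumption, so check it against the LzoCodec source of the build in use.

import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.util.ReflectionUtils;

import com.hadoop.compression.lzo.LzopCodec;

public class Compress {
    public static void main(String[] args) throws Exception {
        // args: <input path in HDFS> <output path> <compression level>
        Configuration conf = new Configuration();
        // Assumed property name for the LZO compression level.
        conf.setInt("io.compression.codec.lzo.compression.level",
                Integer.parseInt(args[2]));

        FileSystem fs = FileSystem.get(conf);
        // ReflectionUtils hands the Configuration to the codec, which is how
        // the level setting is supposed to reach the underlying compressor.
        CompressionCodec codec = ReflectionUtils.newInstance(LzopCodec.class, conf);

        try (InputStream in = fs.open(new Path(args[0]));
             OutputStream out = codec.createOutputStream(fs.create(new Path(args[1])))) {
            IOUtils.copyBytes(in, out, 64 * 1024);
        }
    }
}

Whether the native compressor then honours that level is exactly what the identical output sizes above call into question.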
In reply, the maintainer wrote:

Your code looks fine at first glance. I'm not actively maintaining this project anymore -- it's largely in maintenance mode, as most people have moved on to better file formats like Parquet along with LZ4 or Snappy. I'd suggest doing some debugging of your own -- rebuild hadoop-lzo with logging at the point where the compressor is created, see whether the level is getting passed through properly, and follow the breadcrumbs from there.
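As a purely illustrative version of that debugging step, the configured level can first be checked without rebuilding hadoop-lzo at all, by wrapping LzopCodec in a small subclass that logs whenever a compressor is requested. The class name and the property key here are assumptions for illustration, not part of hadoop-lzo:

import org.apache.hadoop.io.compress.Compressor;

import com.hadoop.compression.lzo.LzopCodec;

// Hypothetical debugging aid: log the configured level each time a
// compressor is requested, without patching hadoop-lzo itself.
public class LoggingLzopCodec extends LzopCodec {
    @Override
    public Compressor createCompressor() {
        // getConf() works because the codec is Configurable; -1 means the
        // (assumed) level property was never set on this Configuration.
        System.err.println("createCompressor(): configured level = "
                + getConf().getInt("io.compression.codec.lzo.compression.level", -1));
        return super.createCompressor();
    }
}

Instantiating LoggingLzopCodec instead of LzopCodec (again via ReflectionUtils.newInstance, so it receives the Configuration) shows whether the requested level reaches the codec at all; if it does, the next step is the rebuild-with-logging approach inside the compressor construction that the comment above suggests.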