
winutils's Introduction

winutils

winutils.exe, hadoop.dll, and hdfs.dll binaries for Hadoop on Windows

I had been using https://github.com/steveloughran/winutils, but it is no longer updated, so I compiled the binaries myself and publish them here for everyone.

compile steps (in Chinese)

how to use

Place a copy of the hadoop-ver folder on your local drive, then set these environment variables:

HADOOP_HOME=<your local hadoop-ver folder>
PATH=%PATH%;%HADOOP_HOME%\bin

You should then get past the "no native library" warning and the "access0" error.
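A quick way to confirm the layout above is in place is a tiny standalone program (a sketch only; the `CheckWinutils` class name and its messages are made up for illustration):

```java
import java.io.File;

// Illustrative sanity check for the setup described above: verifies that
// HADOOP_HOME points at a folder containing bin\winutils.exe.
public class CheckWinutils {

    // Returns a short status message for the given HADOOP_HOME value.
    static String check(String hadoopHome) {
        if (hadoopHome == null || hadoopHome.isEmpty()) {
            return "HADOOP_HOME is not set";
        }
        File exe = new File(hadoopHome, "bin" + File.separator + "winutils.exe");
        return exe.isFile() ? "found " + exe : "missing " + exe;
    }

    public static void main(String[] args) {
        System.out.println(check(System.getenv("HADOOP_HOME")));
    }
}
```

Run it with the same JVM you use for your Hadoop jobs; if it reports the file missing, fix HADOOP_HOME before debugging anything else.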

winutils's People

Contributors

cdarlint, haijialiu, huskyui, jarieshan


winutils's Issues

Windows 10 + Hadoop 3.2.1: adding bin has no effect

Environment:
win10
Hadoop 3.2.1 installed locally, environment variables configured.

After copying all files from 3.2.1/bin/ into my local bin/, I still get the native error. Adding the .dll files to System32 made no difference either.

Below are the relevant log, code, and dependencies:
```
"D:\Program Files\Java\jdk1.8.0_92\bin\java.exe" "-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA 2019.2\lib\idea_rt.jar=8233:D:\Program Files\JetBrains\IntelliJ IDEA 2019.2\bin" -Dfile.encoding=UTF-8 -classpath "D:\Program Files\Java\jdk1.8.0_92\jre\lib\charsets.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\deploy.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\endorsed\rt_debug.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\access-bridge-32.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\cldrdata.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\dnsns.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\jaccess.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\jfxrt.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\localedata.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\nashorn.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunec.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunjce_provider.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunmscapi.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\sunpkcs11.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\ext\zipfs.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\javaws.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\jce.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\jfr.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\jfxswt.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\jsse.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\management-agent.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\plugin.jar;D:\Program Files\Java\jdk1.8.0_92\jre\lib\resources.jar;D:\Program 
Files\Java\jdk1.8.0_92\jre\lib\rt.jar;E:\java_code\ideal\hadoop\demo\wordcount\target\classes;E:\m2\local\repository\org\apache\hadoop\hadoop-client\3.2.1\hadoop-client-3.2.1.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-hdfs-client\3.2.1\hadoop-hdfs-client-3.2.1.jar;E:\m2\local\repository\com\squareup\okhttp\okhttp\2.7.5\okhttp-2.7.5.jar;E:\m2\local\repository\com\squareup\okio\okio\1.6.0\okio-1.6.0.jar;E:\m2\local\repository\com\fasterxml\jackson\core\jackson-annotations\2.9.8\jackson-annotations-2.9.8.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-yarn-api\3.2.1\hadoop-yarn-api-3.2.1.jar;E:\m2\local\repository\javax\xml\bind\jaxb-api\2.2.11\jaxb-api-2.2.11.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-yarn-client\3.2.1\hadoop-yarn-client-3.2.1.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\3.2.1\hadoop-mapreduce-client-jobclient-3.2.1.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-mapreduce-client-common\3.2.1\hadoop-mapreduce-client-common-3.2.1.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-annotations\3.2.1\hadoop-annotations-3.2.1.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-hdfs\3.2.1\hadoop-hdfs-3.2.1.jar;E:\m2\local\repository\com\google\guava\guava\27.0-jre\guava-27.0-jre.jar;E:\m2\local\repository\com\google\guava\failureaccess\1.0\failureaccess-1.0.jar;E:\m2\local\repository\com\google\guava\listenablefuture\9999.0-empty-to-avoid-conflict-with-guava\listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar;E:\m2\local\repository\org\checkerframework\checker-qual\2.5.2\checker-qual-2.5.2.jar;E:\m2\local\repository\com\google\errorprone\error_prone_annotations\2.2.0\error_prone_annotations-2.2.0.jar;E:\m2\local\repository\com\google\j2objc\j2objc-annotations\1.1\j2objc-annotations-1.1.jar;E:\m2\local\repository\org\codehaus\mojo\animal-sniffer-annotations\1.17\animal-sniffer-annotations-1.17.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-server\9.3.24.v20180605\jetty-server-9.3.24.
v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-http\9.3.24.v20180605\jetty-http-9.3.24.v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-io\9.3.24.v20180605\jetty-io-9.3.24.v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-util\9.3.24.v20180605\jetty-util-9.3.24.v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-util-ajax\9.3.24.v20180605\jetty-util-ajax-9.3.24.v20180605.jar;E:\m2\local\repository\com\sun\jersey\jersey-core\1.19\jersey-core-1.19.jar;E:\m2\local\repository\javax\ws\rs\jsr311-api\1.1.1\jsr311-api-1.1.1.jar;E:\m2\local\repository\com\sun\jersey\jersey-server\1.19\jersey-server-1.19.jar;E:\m2\local\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;E:\m2\local\repository\commons-codec\commons-codec\1.11\commons-codec-1.11.jar;E:\m2\local\repository\commons-io\commons-io\2.5\commons-io-2.5.jar;E:\m2\local\repository\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;E:\m2\local\repository\commons-daemon\commons-daemon\1.0.13\commons-daemon-1.0.13.jar;E:\m2\local\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;E:\m2\local\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;E:\m2\local\repository\javax\servlet\javax.servlet-api\3.1.0\javax.servlet-api-3.1.0.jar;E:\m2\local\repository\io\netty\netty\3.10.5.Final\netty-3.10.5.Final.jar;E:\m2\local\repository\io\netty\netty-all\4.0.52.Final\netty-all-4.0.52.Final.jar;E:\m2\local\repository\org\apache\htrace\htrace-core4\4.1.0-incubating\htrace-core4-4.1.0-incubating.jar;E:\m2\local\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;E:\m2\local\repository\com\fasterxml\jackson\core\jackson-databind\2.9.8\jackson-databind-2.9.8.jar;E:\m2\local\repository\com\fasterxml\jackson\core\jackson-core\2.9.8\jackson-core-2.9.8.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-mapreduce-client-core\3.2.1\hadoop-mapreduce-client-core-3.2.1.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-yarn-common\3.
2.1\hadoop-yarn-common-3.2.1.jar;E:\m2\local\repository\com\sun\jersey\jersey-client\1.19\jersey-client-1.19.jar;E:\m2\local\repository\com\google\inject\guice\4.0\guice-4.0.jar;E:\m2\local\repository\javax\inject\javax.inject\1\javax.inject-1.jar;E:\m2\local\repository\aopalliance\aopalliance\1.0\aopalliance-1.0.jar;E:\m2\local\repository\com\sun\jersey\contribs\jersey-guice\1.19\jersey-guice-1.19.jar;E:\m2\local\repository\com\fasterxml\jackson\module\jackson-module-jaxb-annotations\2.9.8\jackson-module-jaxb-annotations-2.9.8.jar;E:\m2\local\repository\com\fasterxml\jackson\jaxrs\jackson-jaxrs-json-provider\2.9.8\jackson-jaxrs-json-provider-2.9.8.jar;E:\m2\local\repository\com\fasterxml\jackson\jaxrs\jackson-jaxrs-base\2.9.8\jackson-jaxrs-base-2.9.8.jar;E:\m2\local\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;E:\m2\local\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;E:\m2\local\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;E:\m2\local\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;E:\m2\local\repository\org\xerial\snappy\snappy-java\1.0.5\snappy-java-1.0.5.jar;E:\m2\local\repository\org\slf4j\slf4j-api\1.7.25\slf4j-api-1.7.25.jar;E:\m2\local\repository\org\slf4j\slf4j-log4j12\1.7.25\slf4j-log4j12-1.7.25.jar;E:\m2\local\repository\com\google\inject\extensions\guice-servlet\4.0\guice-servlet-4.0.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-common\3.2.1\hadoop-common-3.2.1.jar;E:\m2\local\repository\org\apache\commons\commons-math3\3.1.1\commons-math3-3.1.1.jar;E:\m2\local\repository\org\apache\httpcomponents\httpclient\4.5.6\httpclient-4.5.6.jar;E:\m2\local\repository\org\apache\httpcomponents\httpcore\4.4.10\httpcore-4.4.10.jar;E:\m2\local\repository\commons-net\commons-net\3.6\commons-net-3.6.jar;E:\m2\local\repository\commons-collections\commons-collections\3.2.2\commons-collections-3.2.2.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-servlet\
9.3.24.v20180605\jetty-servlet-9.3.24.v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-security\9.3.24.v20180605\jetty-security-9.3.24.v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-webapp\9.3.24.v20180605\jetty-webapp-9.3.24.v20180605.jar;E:\m2\local\repository\org\eclipse\jetty\jetty-xml\9.3.24.v20180605\jetty-xml-9.3.24.v20180605.jar;E:\m2\local\repository\javax\servlet\jsp\jsp-api\2.1\jsp-api-2.1.jar;E:\m2\local\repository\com\sun\jersey\jersey-servlet\1.19\jersey-servlet-1.19.jar;E:\m2\local\repository\com\sun\jersey\jersey-json\1.19\jersey-json-1.19.jar;E:\m2\local\repository\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;E:\m2\local\repository\com\sun\xml\bind\jaxb-impl\2.2.3-1\jaxb-impl-2.2.3-1.jar;E:\m2\local\repository\org\codehaus\jackson\jackson-jaxrs\1.9.2\jackson-jaxrs-1.9.2.jar;E:\m2\local\repository\org\codehaus\jackson\jackson-xc\1.9.2\jackson-xc-1.9.2.jar;E:\m2\local\repository\commons-beanutils\commons-beanutils\1.9.3\commons-beanutils-1.9.3.jar;E:\m2\local\repository\org\apache\commons\commons-configuration2\2.1.1\commons-configuration2-2.1.1.jar;E:\m2\local\repository\org\apache\commons\commons-lang3\3.7\commons-lang3-3.7.jar;E:\m2\local\repository\org\apache\commons\commons-text\1.4\commons-text-1.4.jar;E:\m2\local\repository\com\google\re2j\re2j\1.1\re2j-1.1.jar;E:\m2\local\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;E:\m2\local\repository\org\apache\hadoop\hadoop-auth\3.2.1\hadoop-auth-3.2.1.jar;E:\m2\local\repository\com\nimbusds\nimbus-jose-jwt\4.41.1\nimbus-jose-jwt-4.41.1.jar;E:\m2\local\repository\com\github\stephenc\jcip\jcip-annotations\1.0-1\jcip-annotations-1.0-1.jar;E:\m2\local\repository\net\minidev\json-smart\2.3\json-smart-2.3.jar;E:\m2\local\repository\net\minidev\accessors-smart\1.2\accessors-smart-1.2.jar;E:\m2\local\repository\org\ow2\asm\asm\5.0.4\asm-5.0.4.jar;E:\m2\local\repository\org\apache\curator\curator-framework\2.13.0\curator-framework-2.13.0.jar;E:\m2\local\repository\co
m\jcraft\jsch\0.1.54\jsch-0.1.54.jar;E:\m2\local\repository\org\apache\curator\curator-client\2.13.0\curator-client-2.13.0.jar;E:\m2\local\repository\org\apache\curator\curator-recipes\2.13.0\curator-recipes-2.13.0.jar;E:\m2\local\repository\com\google\code\findbugs\jsr305\3.0.0\jsr305-3.0.0.jar;E:\m2\local\repository\org\apache\zookeeper\zookeeper\3.4.13\zookeeper-3.4.13.jar;E:\m2\local\repository\jline\jline\0.9.94\jline-0.9.94.jar;E:\m2\local\repository\org\apache\yetus\audience-annotations\0.5.0\audience-annotations-0.5.0.jar;E:\m2\local\repository\org\apache\commons\commons-compress\1.18\commons-compress-1.18.jar;E:\m2\local\repository\org\apache\kerby\kerb-simplekdc\1.0.1\kerb-simplekdc-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-client\1.0.1\kerb-client-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerby-config\1.0.1\kerby-config-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-core\1.0.1\kerb-core-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerby-pkix\1.0.1\kerby-pkix-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerby-asn1\1.0.1\kerby-asn1-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerby-util\1.0.1\kerby-util-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-common\1.0.1\kerb-common-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-crypto\1.0.1\kerb-crypto-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-util\1.0.1\kerb-util-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\token-provider\1.0.1\token-provider-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-admin\1.0.1\kerb-admin-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-server\1.0.1\kerb-server-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerb-identity\1.0.1\kerb-identity-1.0.1.jar;E:\m2\local\repository\org\apache\kerby\kerby-xdr\1.0.1\kerby-xdr-1.0.1.jar;E:\m2\local\repository\org\codehaus\woodstox\stax2-api\3.1.4\stax2-api-3.1.4.jar;E:\m2\local\repository\com\fasterxml\woodstox\woodstox-core\5.0.3\woodstox-core-5.0.3.jar;E:\m2
\local\repository\dnsjava\dnsjava\2.1.7\dnsjava-2.1.7.jar" WordCountDriver
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:645)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1230)
at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:160)
at org.apache.hadoop.util.DiskChecker.checkDirInternal(DiskChecker.java:100)
at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:77)
at org.apache.hadoop.util.BasicDiskValidator.checkStatus(BasicDiskValidator.java:32)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:331)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:394)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:165)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:146)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:130)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:123)
at org.apache.hadoop.mapred.LocalJobRunner$Job.(LocalJobRunner.java:172)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:794)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:251)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at WordCountDriver.main(WordCountDriver.java:77)

Process finished with exit code 1
```

Maven dependency:

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.2.1</version>
</dependency>
```

Java program:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

import java.io.IOException;

/**
 * @author chen.chao
 * @version 1.0
 * @date 2020/2/6 15:12
 */
public class WordCountDriver {

    static class WordCountReducer extends Reducer<Text, LongWritable, Text, LongWritable> {

        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long count = 0;
            for (LongWritable value : values) {
                count += value.get();
            }
            context.write(key, new LongWritable(count));
        }
    }

    static class WordCountMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] words = value.toString().split(" ");
            for (String w : words) {
                context.write(new Text(w), new LongWritable(1));
            }
        }
    }

    public static void main(String[] args)
            throws IOException, ClassNotFoundException, InterruptedException {

        // Configuration and job instance
        Configuration conf = new Configuration();
        Job instance = Job.getInstance(conf);

        // Jar, mapper, and reducer
        instance.setJarByClass(WordCountDriver.class);
        instance.setMapperClass(WordCountMapper.class);
        instance.setReducerClass(WordCountReducer.class);

        // Output types
        instance.setOutputKeyClass(Text.class);
        instance.setOutputValueClass(LongWritable.class);

        // Input and output paths
        FileInputFormat.setInputPaths(instance, new Path("D:\\Program Files\\hadoop-3.2.1\\LICENSE.txt"));
        FileOutputFormat.setOutputPath(instance, new Path("D:\\out"));

        /*
        FileInputFormat.setInputPaths(instance, new Path(args[0]));
        FileOutputFormat.setOutputPath(instance, new Path(args[1]));
        */

        // Submit and wait for completion
        instance.submit();
        System.exit(instance.waitForCompletion(true) ? 0 : 1);
    }
}
```

winutils.exe is not compatible with the version of Windows

Hi, I have set up PySpark in client mode for research purposes. I tried to download and run winutils.exe from cmd, and I am getting this error: "This version of winutils.exe is not compatible with the version of Windows you're running. Check your computer's system information and then contact the software publisher".

I have set up HADOOP_HOME environment variable. Could anyone please help me with this?

Hadoop/Spark slower on Windows VS Linux

Hi,

I have two machines that have identical Hardware. CPU, RAM, and BIOS configurations are exactly the same. I am running Spark 3.3.1 with Hadoop 3.3.1. The benchmark is also exactly the same. I am not using any HDFS at all.

Problem: Spark on Windows runs slower than Linux

Any idea why the Windows implementation is slower? What exactly is inside hadoop.dll and winutils.exe?

Hadoop with snappy compression support

Can you compile a version of Hadoop that supports Snappy compression? I can't find a Windows build of Hadoop with Snappy support, nor any compilation tutorials or prebuilt binaries on the web.

hadoop-3.2.1 not compatible with this version of Windows

Cannot run program "Hadoop\hadoop-3.2.1\bin\winutils.exe": CreateProcess error=216, This version of %1 is not compatible with the version of Windows you're running. Check your computer's system information and then contact the software publisher. How do I solve this?

Mr

Can I get some help for Hadoop 3.2.2?

2.7.5?

Would you please provide binaries for Hadoop 2.7.5?

Thanks.

License Apache 2.0

Hi,
Firstly, thanks for updating the binaries.
It would be great if you could add an Apache 2.0 license to this repo; my company only accepts FOSS libraries.

This version of %1 is not compatible with the version of Windows you're running. Check your computer's system information and then contact the software publisher

I am on Windows 10 Pro x64 AMD, and cannot get winutils.exe or hadoop.dll to work.

I created: c:\hadoop and HADOOP_HOME = c:\hadoop

I dropped the files in c:\hadoop\bin, and added %HADOOP_HOME%\bin to my Windows PATH.

When I run, I get:

java.io.IOException: Cannot run program "C:\hadoop\bin\winutils.exe": CreateProcess error=216, This version of %1 is not compatible with the version of Windows you're running. Check your computer's system information and then contact the software publisher


Can someone please help me fix this?
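CreateProcess error=216 (ERROR_EXE_MACHINE_TYPE_MISMATCH) generally means the binary's architecture doesn't match Windows or the calling process, e.g. a 32-bit winutils.exe on an incompatible system. When diagnosing it, it also helps to know the bitness of the JVM that will load hadoop.dll, since a 32-bit process cannot load a 64-bit DLL. A minimal sketch (the `ArchInfo` class name is invented here; `sun.arch.data.model` is a HotSpot-specific property):

```java
// Prints the architecture the running JVM reports. Useful when diagnosing
// "not compatible with the version of Windows" (CreateProcess error=216),
// which typically indicates a 32-bit/64-bit mismatch.
public class ArchInfo {

    // Formats the two standard system properties into one line.
    static String describe(String osArch, String dataModel) {
        return "os.arch=" + osArch + ", JVM data model=" + dataModel + "-bit";
    }

    public static void main(String[] args) {
        System.out.println(describe(
                System.getProperty("os.arch"),
                System.getProperty("sun.arch.data.model")));
    }
}
```

Compare the reported bitness against the winutils.exe/hadoop.dll build you downloaded.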

It is impossible to compile winutils.exe and hadoop.dll. If you believe it is possible, find the instructions and attempt to do it.

Mapred Streaming on 3.3.1

Hi, I just want to say thank you for providing these files for the public. I am fairly new to Hadoop and currently using it for my lab environment for school.

When I try to run "mapred streaming", I receive the following error:
The system cannot find the batch label specified - streaming

I am currently using "hadoop jar hadoop-streaming.jar" as an alternative, but was wondering if you would be able to provide assistance with the error when using mapred streaming.

This version of %1 is not compatible with the version of Windows you're running

Windows 10 Pro 10.0.18363 Build 18363
AdoptOpenJDK build 1.8.0_282-b08
Spark 2.4.5
Scala 2.12.13

I'm getting this error when trying to save a DataFrame as Parquet locally on my Windows 10 computer. Does anyone have ideas on how to fix this, or why it throws an exception? I've tried changing the Java, Spark, Scala, and winutils.exe versions, all with the same result. It seems to be a Windows-specific issue.

"java.io.IOException: Cannot run program "C:\hadoop\bin\winutils.exe": CreateProcess error=216, This version of %1 is not compatible with the version of Windows you're running. Check your computer's system information and then contact the software publisher"

About usage

Hello: for version 3.2.1, I replaced the bin directory in my Hadoop folder with your compiled bin, but then found that the hdfs command was missing. How can I fix this? Thanks.

error

Please update C:\hadooop\hadoop-3.3.0.tar\hadoop-3.3.0\hadoop-3.3.0\etc\hadoop\hadoop-env.cmd: "'-Dhadoop.security.logger' is not recognized as an internal or external command, operable program or batch file."

Request: please help compile hadoop 2.10

The hadoop 2.10 line has now been released. Since I'm new to hadoop and not yet familiar with compiling it, I keep running into problems. Could you help by compiling it?

Winutils for Hadoop 3.3.6

Hi!
Can I please have the winutils for Hadoop 3.3.6? I don't think it is available on GitHub yet.

thank you and best regards.

just wanted to say thanks for doing this.

If you submit a PR to my repo, I'll point to it.

One recommendation: provide some details about yourself, sign the artifacts, etc. Some people are very cautious about putting untrusted native binaries on their systems.

[need help] error compiling 32x hdfs native client

There are errors when compiling hadoop 3.2.0 and 3.2.1 on Windows.
I believe there might be configuration problems or some Windows-related glitches.
I'll post the error messages here; please help.

This is a dedicated virtual machine for compiling hadoop; I've successfully compiled multiple versions, up to 3.1.x.

Since my machine uses a Chinese language and locale, I translated and hand-picked the most relevant messages, below.

Near the end of the output, there are messages like:
APACHE Hadoop HDFS Native Client ................... FAILURE [ 9.908 s]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned 1
[ERROR] around Ant part ...(exec failonerror="true" dir="C:\h3\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="msbuild">... @ 9:121 in c:\h3\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

Above this are error messages from compiling the native client. There are basically two problems:
1. unresolved references
2. a missing semicolon

Problem 1:
test_libhdfs_ops.obj : error LNK2019: unresolved external symbol hdfsFileDisableDirectRead referenced in main
test_libhdfs_ops.obj : error LNK2019: unresolved external symbol hdfsFileUsesDirectRead referenced in main

Problem 2: the error messages got too long, so I opened this solution in VS2010 on the same VM and collected the error messages there:
ClCompile:
test_libhdfs_threaded.c
..........\src\main\native\libhdfs-tests\test_libhdfs_threaded.c(163): error C2143: syntax error : missing ";" before "type"
..........\src\main\native\libhdfs-tests\test_libhdfs_threaded.c(164): error C2065: "invalid_path": undeclared identifier
