Details
- Type: Bug
- Status: Closed
- Priority: Critical
- Resolution: Fixed
- None
Description
> ./setup_demo.sh
[+] Running 1/0
 ⠿ compose  Warning: No resource found to remove  0.0s
[+] Running 15/15
 ⠿ namenode Pulled                       1.4s
 ⠿ kafka Pulled                          1.3s
 ⠿ presto-worker-1 Pulled                1.3s
 ⠿ historyserver Pulled                  1.4s
 ⠿ adhoc-2 Pulled                        1.3s
 ⠿ adhoc-1 Pulled                        1.4s
 ⠿ graphite Pulled                       1.3s
 ⠿ sparkmaster Pulled                    1.3s
 ⠿ hive-metastore-postgresql Pulled      1.3s
 ⠿ presto-coordinator-1 Pulled           1.3s
 ⠿ spark-worker-1 Pulled                 1.4s
 ⠿ hiveserver Pulled                     1.3s
 ⠿ hivemetastore Pulled                  1.4s
 ⠿ zookeeper Pulled                      1.3s
 ⠿ datanode1 Pulled                      1.3s
[+] Running 16/16
 ⠿ Network compose_default Created                  0.0s
 ⠿ Container hive-metastore-postgresql Started      1.1s
 ⠿ Container kafkabroker Started                    1.1s
 ⠿ Container zookeeper Started                      1.1s
 ⠿ Container namenode Started                       1.3s
 ⠿ Container graphite Started                       1.2s
 ⠿ Container historyserver Started                  2.2s
 ⠿ Container hivemetastore Started                  2.2s
 ⠿ Container datanode1 Started                      3.3s
 ⠿ Container presto-coordinator-1 Started           2.7s
 ⠿ Container hiveserver Started                     3.2s
 ⠿ Container presto-worker-1 Started                4.2s
 ⠿ Container sparkmaster Started                    3.5s
 ⠿ Container adhoc-2 Started                        4.7s
 ⠿ Container adhoc-1 Started                        4.8s
 ⠿ Container spark-worker-1 Started                 4.8s
Copying spark default config and setting up configs
21/11/18 01:16:19 WARN ipc.Client: Failed to connect to server: namenode/172.19.0.6:8020: try once and fail.
java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:685)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:788)
	at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:410)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1550)
	at org.apache.hadoop.ipc.Client.call(Client.java:1381)
	at org.apache.hadoop.ipc.Client.call(Client.java:1345)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
	at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:796)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346)
	at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1649)
	at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1440)
	at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1437)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1437)
	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:64)
	at org.apache.hadoop.fs.Globber.doGlob(Globber.java:269)
	at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1686)
	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:245)
	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:228)
	at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:175)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:317)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:380)
mkdir: Call From adhoc-1/172.19.0.13 to namenode:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
copyFromLocal: `/var/demo/.': No such file or directory: `hdfs://namenode:8020/var/demo'
Copying spark default config and setting up configs
Environment: MacBook with an M1 chip (Apple Silicon, arm64), macOS 12.0.1
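When reproducing on Apple Silicon, a common cause of containers exiting right after `docker compose up` (and of the resulting "Connection refused" to namenode:8020) is that the pulled images are amd64-only. A minimal diagnostic sketch, hypothetical and not part of setup_demo.sh, that checks the host architecture before the demo is run:

```shell
# Report the host CPU architecture; on an M1 MacBook this is arm64.
arch="$(uname -m)"
echo "host architecture: ${arch}"

case "${arch}" in
  arm64|aarch64)
    # On an ARM host, amd64-only images either run under slow emulation
    # or fail to start. Per-image architecture can be checked with, e.g.:
    #   docker inspect --format '{{.Architecture}}' <image>
    echo "ARM host detected: verify the demo images provide linux/arm64 variants"
    ;;
esac
```

On an x86 host the `case` branch simply does not fire, so the script is safe to run anywhere before starting the compose stack.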