Storm Deployment Walkthrough

1. Download Storm and extract it
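A sketch of this step is below. The archive URL and the -incubating version suffix are assumptions based on how Apache distributed Storm 0.9.2, so verify them against the release page; the author's directory is named storm-0.9.2, so the extracted directory may have been renamed afterwards.

```shell
# Assumed version string and archive URL; verify before use.
STORM_VERSION="0.9.2-incubating"
TARBALL="apache-storm-${STORM_VERSION}.tar.gz"
URL="http://archive.apache.org/dist/incubator/storm/apache-storm-${STORM_VERSION}/${TARBALL}"
# On the real machine:
#   wget "$URL"
#   tar -xzf "$TARBALL" -C ~/
echo "$URL"
```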

2. Edit conf/storm.yaml


I use hostnames here; IP addresses work just as well.
hadoop@namenode:~/storm-0.9.2/conf$ vim storm.yaml 
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


########### These MUST be filled in for a storm configuration
storm.zookeeper.servers:
    - "namenode"
    - "datanode1"
    - "datanode2"

nimbus.host: "namenode"
storm.local.dir: "/home/hadoop/storm-0.9.2/data"
supervisor.slots.ports:
    - 6700
    - 6701
    - 6702
    - 6703


# ##### These may optionally be filled in:
#    
## List of custom serializations
# topology.kryo.register:
#     - org.mycompany.MyType
#     - org.mycompany.MyType2: org.mycompany.MyType2Serializer
#
## List of custom kryo decorators
# topology.kryo.decorators:
#     - org.mycompany.MyDecorator
#
## Locations of the drpc servers
# drpc.servers:
#     - "server1"
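The Storm daemons will not start cleanly unless the ensemble listed under storm.zookeeper.servers is already running, so it is worth checking first. A minimal sketch, assuming zkServer.sh is on each node's PATH (which the ZOOKEEPER_HOME entry in step 5 suggests) and that passwordless ssh is set up:

```shell
# Hosts taken from storm.zookeeper.servers above.
ZK_HOSTS="namenode datanode1 datanode2"
for host in $ZK_HOSTS; do
    echo "would check ZooKeeper on $host"
    # On the real cluster, uncomment:
    #   ssh "$host" 'zkServer.sh status'
done
```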

3. Create the local data directory (the storm.local.dir path from the config)

hadoop@namenode:~/storm-0.9.2$ mkdir data
hadoop@namenode:~/storm-0.9.2$ ls
bin  CHANGELOG.md  conf  data  DISCLAIMER  examples  external  lib  LICENSE  logback  NOTICE  public  README.markdown  RELEASE  SECURITY.md                   

4. Copy Storm to every node

hadoop@namenode:~$ scp -r storm-0.9.2/ hadoop@datanode1:~/
hadoop@namenode:~$ scp -r storm-0.9.2/ hadoop@datanode2:~/

5. Configure the Storm environment variables

export STORM_HOME=/home/hadoop/storm-0.9.2
export PATH=$JAVA_HOME/bin:$ZOOKEEPER_HOME/bin:$HADOOP_HOME/bin:$MAHOUT_HOME/bin:$HBASE_HOME/bin:$STORM_HOME/bin:$PATH
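These two lines typically go into ~/.bashrc (or /etc/profile) on every node so the storm launcher resolves from any directory; this sketch just sets them for the current shell:

```shell
# Make the storm launcher resolvable from any directory.
export STORM_HOME=/home/hadoop/storm-0.9.2
export PATH="$STORM_HOME/bin:$PATH"
# Reload an existing session with:  source ~/.bashrc
```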

6. Start the Storm daemons

hadoop@namenode:~$ storm nimbus &
[1] 8341
hadoop@namenode:~$ Running: /usr/programs/jdk1.7.0_65/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/storm-0.9.2 -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx1024m -Dlogfile.name=nimbus.log -Dlogback.configurationFile=/home/hadoop/storm-0.9.2/logback/cluster.xml backtype.storm.daemon.nimbus

hadoop@namenode:~$ storm ui &
[2] 8391
hadoop@namenode:~$ Running: /usr/programs/jdk1.7.0_65/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/storm-0.9.2 -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx768m -Dlogfile.name=ui.log -Dlogback.configurationFile=/home/hadoop/storm-0.9.2/logback/cluster.xml backtype.storm.ui.core

hadoop@namenode:~$ storm logviewer &
[3] 8451
hadoop@namenode:~$ Running: /usr/programs/jdk1.7.0_65/bin/java -server -Dstorm.options= -Dstorm.home=/home/hadoop/storm-0.9.2 -Djava.library.path=/usr/local/lib:/opt/local/lib:/usr/lib -Dstorm.conf.file= -cp ... -Xmx128m -Dlogfile.name=logviewer.log -Dlogback.configurationFile=/home/hadoop/storm-0.9.2/logback/cluster.xml backtype.storm.daemon.logviewer



Then start a supervisor on each of the worker nodes:

hadoop@datanode2:~$ ~/storm-0.9.2/bin/storm supervisor &

hadoop@datanode1:~$ ~/storm-0.9.2/bin/storm supervisor &
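A quick way to confirm every daemon came up is jps, which ships with the JDK and lists running JVMs by main class. The helper name below is hypothetical, not part of Storm:

```shell
# Hypothetical helper: succeeds if a JVM whose main class matches
# the given name is running on this machine.
storm_daemon_running() {
    jps 2>/dev/null | grep -qw "$1"
}
# On the nimbus host expect: nimbus, core (the UI), logviewer.
# On each worker expect:     supervisor.
# Example (run on the real cluster):
#   storm_daemon_running nimbus && echo "nimbus is up"
```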


7. Visit the web UI

Open your nimbus node's address on port 8080 in a browser.
In my case that is http://namenode:8080
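If the page does not load, confirm the port against ui.port in storm.yaml (8080 is the default). A small reachability sketch; curl and the hostname are assumptions about your environment:

```shell
NIMBUS_HOST="namenode"     # replace with your nimbus host or IP
UI_URL="http://${NIMBUS_HOST}:8080"
# On a machine that can reach the cluster:
#   curl -fsS "$UI_URL" >/dev/null && echo "UI reachable at $UI_URL"
echo "$UI_URL"
```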