Environment: Alibaba Cloud lightweight server, 2 GB RAM, 2 CPUs
pip install torch ends with "Killed": the torch wheel and its dependencies come to nearly 2 GB, and pip's download cache exhausts the small machine's memory.
Fix: pip install --no-cache-dir torch
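Once the install finishes, a quick sanity check from Python (a minimal sketch; the tensor shape is arbitrary):
import torch
print(torch.__version__)   # confirms the package imports
x = torch.rand(2, 3)       # small CPU-only tensor
print(x.sum())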
Environment: Ubuntu 16.04
Install the prerequisites:
sudo apt-get install cmake libzmq5 libzmq5-dev libhiredis-dev libev-dev g++ redis-server
Install Boost 1.60:
wget http://sourceforge.net/projects/boost/files/boost/1.60.0/boost_1_60_0.tar.gz
tar xvf boost_1_60_0.tar.gz
cd boost_1_60_0
sudo ./bootstrap.sh --prefix=/usr/local --with-libraries=all
sudo ./b2 install
sudo /bin/bash -c 'echo "/usr/local/lib" > /etc/ld.so.conf.d/boost.conf'
sudo ldconfig
Download and build Clipper:
git clone --recursive https://github.com/ucbrise/clipper.git
cd clipper
./configure
make
Start the RESTful service:
cd /home/jerry/clipper
bin/start_clipper.sh
Test the service:
python examples/example_client.py
or
curl -H "Content-Type: application/json" -X POST -d '{"input": [0.4], "uid": 4}' http://192.168.56.101:1337/example_app/predict
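The same request from Python, assuming Clipper is listening on the host/port above and an application named example_app is registered (a minimal sketch using the requests library):
import json
import requests

# Same payload as the curl call above
url = "http://192.168.56.101:1337/example_app/predict"
payload = {"input": [0.4], "uid": 4}
resp = requests.post(url, headers={"Content-Type": "application/json"}, data=json.dumps(payload))
print(resp.status_code)
print(resp.text)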
That's it for now; Clipper's remaining features will be explored in later posts.
Environment: Ubuntu 14.04
export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.9.0-cp27-none-linux_x86_64.whl
sudo pip install --upgrade $TF_BINARY_URL
python
import tensorflow
This raises: AttributeError: type object 'NewBase' has no attribute 'is_abstract'
Fix:
python
import six
print(six.__file__)   # check which copy of six is being picked up
Reinstall six:
sudo pip uninstall six
sudo pip install six --upgrade --target="/usr/lib/python2.7/dist-packages"
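After reinstalling, a quick check that the new six is picked up and TensorFlow now imports cleanly (a minimal sketch):
import six
print(six.__file__)        # should point at the freshly installed copy
import tensorflow          # no AttributeError means the fix worked
print(tensorflow.__file__)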
Environment: Ubuntu 14.04
git clone https://github.com/Microsoft/lightlda.git
cd lightlda/
vi build.sh
Edit it as follows (replace the malformed multiverso clone URL):
#git clone https://github.com:Microsoft/multiverso.git
git clone https://github.com/Microsoft/multiverso.git
sh build.sh
cd example
export LD_LIBRARY_PATH=~/lightlda/multiverso/third_party/lib:$LD_LIBRARY_PATH
sh nytimes.sh
Environment: Ubuntu 14.04
git clone https://github.com/dmlc/xgboost.git
cd xgboost
Build:
make cxx11=1
Run the demo:
cd demo/binary_classification
../../xgboost mushroom.conf
Install the Python package:
cd ~/xgboost/python-package
sudo python setup.py install
The mushroom data (and many other datasets) can be found in the UCI repository: https://archive.ics.uci.edu/ml/datasets.html
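To confirm the Python package works, a minimal sketch that trains a small binary classifier on random data (the data and parameters here are made up for illustration; the real demo above uses the mushroom dataset):
import numpy as np
import xgboost as xgb

# Toy binary-classification data
X = np.random.rand(100, 5)
y = (X[:, 0] > 0.5).astype(int)

dtrain = xgb.DMatrix(X, label=y)
params = {"objective": "binary:logistic", "max_depth": 2, "eta": 1.0}
bst = xgb.train(params, dtrain, num_boost_round=2)
print(bst.predict(dtrain)[:5])   # predicted probabilities for the first few rows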
Environment: Ubuntu 12.04, CUDA 6.0
1. Install the prerequisites
pip install -r /u01/caffe/python/requirements.txt
sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libboost-all-dev libhdf5-serial-dev
# gflags
wget https://github.com/schuhschuh/gflags/archive/master.zip
unzip master.zip
cd gflags-master
mkdir build && cd build
CXXFLAGS="-fPIC" cmake .. -DGFLAGS_NAMESPACE=google
make && make install
# glog
wget https://google-glog.googlecode.com/files/glog-0.3.3.tar.gz
tar zxvf glog-0.3.3.tar.gz
cd glog-0.3.3
./configure
make && make install
# lmdb
git clone git://gitorious.org/mdb/mdb.git
cd mdb/libraries/liblmdb
make && make install
2. Configure the build
cp Makefile.config.example Makefile.config
vi Makefile.config and uncomment the following line (the VM has no GPU support):
CPU_ONLY := 1
3. Build. The link step fails as follows:
jerry@hq:/u01/caffe$ make
g++ .build_release/tools/convert_imageset.o .build_release/lib/libcaffe.a -o .build_release/tools/convert_imageset.bin -fPIC -DCPU_ONLY -DNDEBUG -O2 -I/usr/include/python2.7 -I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/local/include -I.build_release/src -I./src -I./include -Wall -Wno-sign-compare -L/usr/lib -L/usr/local/lib -L/usr/lib -lglog -lgflags -lpthread -lprotobuf -lleveldb -lsnappy -llmdb -lboost_system -lhdf5_hl -lhdf5 -lopencv_core -lopencv_highgui -lopencv_imgproc -lcblas -latlas
.build_release/lib/libcaffe.a(blob.o): In function `caffe::Blob<float>::Update()’:
blob.cpp:(.text._ZN5caffe4BlobIfE6UpdateEv[_ZN5caffe4BlobIfE6UpdateEv]+0x43): undefined reference to `void caffe::caffe_gpu_axpy<float>(int, float, float const*, float*)’
.build_release/lib/libcaffe.a(blob.o): In function `caffe::Blob<float>::asum_data() const’:
blob.cpp:(.text._ZNK5caffe4BlobIfE9asum_dataEv[_ZNK5caffe4BlobIfE9asum_dataEv]+0x3f): undefined reference to `void caffe::caffe_gpu_asum<float>(int, float const*, float*)’
.build_release/lib/libcaffe.a(blob.o): In function `caffe::Blob<float>::asum_diff() const’:
blob.cpp:(.text._ZNK5caffe4BlobIfE9asum_diffEv[_ZNK5caffe4BlobIfE9asum_diffEv]+0x3f): undefined reference to `void caffe::caffe_gpu_asum<float>(int, float const*, float*)’
.build_release/lib/libcaffe.a(blob.o): In function `caffe::Blob<double>::Update()’:
blob.cpp:(.text._ZN5caffe4BlobIdE6UpdateEv[_ZN5caffe4BlobIdE6UpdateEv]+0x43): undefined reference to `void caffe::caffe_gpu_axpy<double>(int, double, double const*, double*)’
.build_release/lib/libcaffe.a(blob.o): In function `caffe::Blob<double>::asum_data() const’:
blob.cpp:(.text._ZNK5caffe4BlobIdE9asum_dataEv[_ZNK5caffe4BlobIdE9asum_dataEv]+0x3f): undefined reference to `void caffe::caffe_gpu_asum<double>(int, double const*, double*)’
.build_release/lib/libcaffe.a(blob.o): In function `caffe::Blob<double>::asum_diff() const’:
blob.cpp:(.text._ZNK5caffe4BlobIdE9asum_diffEv[_ZNK5caffe4BlobIdE9asum_diffEv]+0x3f): undefined reference to `void caffe::caffe_gpu_asum<double>(int, double const*, double*)’
.build_release/lib/libcaffe.a(common.o): In function `caffe::GlobalInit(int*, char***)’:
common.cpp:(.text+0x12a): undefined reference to `gflags::ParseCommandLineFlags(int*, char***, bool)’
.build_release/lib/libcaffe.a(common.o): In function `caffe::Caffe::Caffe()’:
common.cpp:(.text+0x179): undefined reference to `cublasCreate_v2′
common.cpp:(.text+0x1cb): undefined reference to `curandCreateGenerator’
common.cpp:(.text+0x22d): undefined reference to `curandSetPseudoRandomGeneratorSeed’
.build_release/lib/libcaffe.a(common.o): In function `caffe::Caffe::~Caffe()’:
common.cpp:(.text+0x434): undefined reference to `cublasDestroy_v2′
common.cpp:(.text+0x456): undefined reference to `curandDestroyGenerator’
.build_release/lib/libcaffe.a(common.o): In function `caffe::Caffe::DeviceQuery()’:
common.cpp:(.text+0x5f8): undefined reference to `cudaGetDevice’
common.cpp:(.text+0x616): undefined reference to `cudaGetDeviceProperties’
common.cpp:(.text+0xd22): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(common.o): In function `caffe::Caffe::SetDevice(int)’:
common.cpp:(.text+0x1222): undefined reference to `cudaGetDevice’
common.cpp:(.text+0x1247): undefined reference to `cudaSetDevice’
common.cpp:(.text+0x127b): undefined reference to `cublasDestroy_v2′
common.cpp:(.text+0x12a9): undefined reference to `curandDestroyGenerator’
common.cpp:(.text+0x12ce): undefined reference to `cublasCreate_v2′
common.cpp:(.text+0x12fc): undefined reference to `curandCreateGenerator’
common.cpp:(.text+0x1330): undefined reference to `curandSetPseudoRandomGeneratorSeed’
common.cpp:(.text+0x1729): undefined reference to `cudaGetErrorString’
common.cpp:(.text+0x1882): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(common.o): In function `caffe::Caffe::set_random_seed(unsigned int)’:
common.cpp:(.text+0x1aff): undefined reference to `curandDestroyGenerator’
common.cpp:(.text+0x1b2d): undefined reference to `curandCreateGenerator’
common.cpp:(.text+0x1b5c): undefined reference to `curandSetPseudoRandomGeneratorSeed’
.build_release/lib/libcaffe.a(math_functions.o): In function `void caffe::caffe_copy<double>(int, double const*, double*)’:
math_functions.cpp:(.text._ZN5caffe10caffe_copyIdEEviPKT_PS1_[_ZN5caffe10caffe_copyIdEEviPKT_PS1_]+0x6c): undefined reference to `cudaMemcpy’
math_functions.cpp:(.text._ZN5caffe10caffe_copyIdEEviPKT_PS1_[_ZN5caffe10caffe_copyIdEEviPKT_PS1_]+0x160): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(math_functions.o): In function `void caffe::caffe_copy<int>(int, int const*, int*)’:
math_functions.cpp:(.text._ZN5caffe10caffe_copyIiEEviPKT_PS1_[_ZN5caffe10caffe_copyIiEEviPKT_PS1_]+0x6c): undefined reference to `cudaMemcpy’
math_functions.cpp:(.text._ZN5caffe10caffe_copyIiEEviPKT_PS1_[_ZN5caffe10caffe_copyIiEEviPKT_PS1_]+0x160): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(math_functions.o): In function `void caffe::caffe_copy<unsigned int>(int, unsigned int const*, unsigned int*)’:
math_functions.cpp:(.text._ZN5caffe10caffe_copyIjEEviPKT_PS1_[_ZN5caffe10caffe_copyIjEEviPKT_PS1_]+0x6c): undefined reference to `cudaMemcpy’
math_functions.cpp:(.text._ZN5caffe10caffe_copyIjEEviPKT_PS1_[_ZN5caffe10caffe_copyIjEEviPKT_PS1_]+0x160): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(math_functions.o): In function `void caffe::caffe_copy<float>(int, float const*, float*)’:
math_functions.cpp:(.text._ZN5caffe10caffe_copyIfEEviPKT_PS1_[_ZN5caffe10caffe_copyIfEEviPKT_PS1_]+0x6c): undefined reference to `cudaMemcpy’
math_functions.cpp:(.text._ZN5caffe10caffe_copyIfEEviPKT_PS1_[_ZN5caffe10caffe_copyIfEEviPKT_PS1_]+0x160): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(syncedmem.o): In function `caffe::SyncedMemory::cpu_data()’:
syncedmem.cpp:(.text+0x26): undefined reference to `caffe::caffe_gpu_memcpy(unsigned long, void const*, void*)’
.build_release/lib/libcaffe.a(syncedmem.o): In function `caffe::SyncedMemory::mutable_cpu_data()’:
syncedmem.cpp:(.text+0x136): undefined reference to `caffe::caffe_gpu_memcpy(unsigned long, void const*, void*)’
.build_release/lib/libcaffe.a(syncedmem.o): In function `caffe::SyncedMemory::~SyncedMemory()’:
syncedmem.cpp:(.text+0x1c1): undefined reference to `cudaFree’
syncedmem.cpp:(.text+0x20f): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(syncedmem.o): In function `caffe::SyncedMemory::mutable_gpu_data()’:
syncedmem.cpp:(.text+0x29a): undefined reference to `caffe::caffe_gpu_memcpy(unsigned long, void const*, void*)’
syncedmem.cpp:(.text+0x2b9): undefined reference to `cudaMalloc’
syncedmem.cpp:(.text+0x2e5): undefined reference to `cudaMemset’
syncedmem.cpp:(.text+0x321): undefined reference to `cudaGetErrorString’
syncedmem.cpp:(.text+0x379): undefined reference to `cudaMalloc’
syncedmem.cpp:(.text+0x3c2): undefined reference to `cudaGetErrorString’
syncedmem.cpp:(.text+0x435): undefined reference to `cudaGetErrorString’
.build_release/lib/libcaffe.a(syncedmem.o): In function `caffe::SyncedMemory::gpu_data()’:
syncedmem.cpp:(.text+0x4ca): undefined reference to `caffe::caffe_gpu_memcpy(unsigned long, void const*, void*)’
syncedmem.cpp:(.text+0x4e9): undefined reference to `cudaMalloc’
syncedmem.cpp:(.text+0x515): undefined reference to `cudaMemset’
syncedmem.cpp:(.text+0x549): undefined reference to `cudaMalloc’
syncedmem.cpp:(.text+0x592): undefined reference to `cudaGetErrorString’
syncedmem.cpp:(.text+0x608): undefined reference to `cudaGetErrorString’
syncedmem.cpp:(.text+0x678): undefined reference to `cudaGetErrorString’
collect2: error: ld returned 1 exit status
make: *** [.build_release/tools/convert_imageset.bin] Error 1
4. Edit Makefile.config again: comment out CPU_ONLY := 1 and set CUSTOM_CXX := g++-4.6
sudo apt-get install gcc-4.6 g++-4.6 gcc-4.6-multilib g++-4.6-multilib
Edit these two files:
vi src/caffe/common.cpp
vi tools/caffe.cpp
replacing the gflags namespace with google (gflags was built with -DGFLAGS_NAMESPACE=google above)
make clean
make
make pycaffe
g++-4.6 -shared -o python/caffe/_caffe.so python/caffe/_caffe.cpp \
.build_release/lib/libcaffe.a -fPIC -DNDEBUG -O2 -I/usr/include/python2.7 -I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/local/include -I.build_release/src -I./src -I./include -I/usr/local/cuda/include -Wall -Wno-sign-compare -L/usr/lib -L/usr/local/lib -L/usr/lib -L/usr/local/cuda/lib64 -L/usr/local/cuda/lib -lcudart -lcublas -lcurand -lglog -lgflags -lpthread -lprotobuf -lleveldb -lsnappy -llmdb -lboost_system -lhdf5_hl -lhdf5 -lopencv_core -lopencv_highgui -lopencv_imgproc -lcblas -latlas -lboost_python -lpython2.7
touch python/caffe/proto/__init__.py
protoc --proto_path=src --python_out=python src/caffe/proto/caffe_pretty_print.proto
protoc --proto_path=src --python_out=python src/caffe/proto/caffe.proto
Then run: sudo cp /u01/caffe/python/caffe/ /usr/local/lib/python2.7/dist-packages/ -Rf
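With the module copied into dist-packages, a quick import check (a minimal sketch; it only verifies that pycaffe loads):
import caffe
print(caffe.__file__)   # pycaffe is importable if this prints the installed path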
Environment: Ubuntu 14.04
I have been following the DMLC machine learning project. Its newest sub-project is Wormhole, which provides reliable and scalable machine learning tools on different computing platforms (MPI, YARN, Sun Grid Engine). It greatly lowers the barrier to installing and deploying distributed machine learning applications, provides consistent data-flow support across all components, and ships a unified script to build and run them, so users can conveniently run any of DMLC's distributed components on a local cluster.
Build and install it as follows:
git clone https://github.com/dmlc/wormhole.git
cd wormhole
cp make/config.mk .
vi config.mk
Comment out HDFS and S3:
#USE_HDFS = 1
#USE_S3 = 1
Then just build:
make
This produces two executables:
kmeans.dmlc xgboost.dmlc
Environment: Ubuntu 12.04, Octave 3.8.2
Octave is an open-source scientific computing and numerical analysis tool that aims to be syntax-compatible with Matlab; it supports vector and matrix computation and makes it easy to write mathematical expressions.
Install it as follows:
wget ftp://ftp.gnu.org/gnu/octave/octave-3.8.2.tar.gz
tar xvf octave-3.8.2.tar.gz
cd octave-3.8.2
./configure
make
sudo make install
Start the command line:
jerry@hq:~$ octave
warning: docstring file ‘/usr/local/share/octave/3.8.2/etc/built-in-docstrings’ not found
No protocol specified
GNU Octave, version 3.8.2
Copyright (C) 2014 John W. Eaton and others.
This is free software; see the source code for copying conditions.
There is ABSOLUTELY NO WARRANTY; not even for MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE. For details, type ‘warranty’.
Octave was configured for “x86_64-unknown-linux-gnu”.
Additional information about Octave is available at http://www.octave.org.
Please contribute if you find this software useful.
For more information, visit http://www.octave.org/get-involved.html
Read http://www.octave.org/bugs.html to learn how to submit bug reports.
For information about changes from previous versions, type ‘news’.
octave:1>
(Only fragments of the interactive session survive here: the printout of a 2x2 matrix, 1 2 / 3 4, followed by a later prompt octave:31>.)
Environment: CentOS 6.2
h2o-sparkling combines H2O with Spark for machine learning: it lets you use H2O's machine learning algorithms from inside a Spark environment.
Install it as follows:
git clone https://github.com/0xdata/h2o-sparkling.git
cd h2o-sparkling
sbt assembly
Run the test:
[cloudera@localhost h2o-sparkling]$ sbt -mem 500 "run --local"
[info] Loading project definition from /home/cloudera/h2o-sparkling/project
[info] Set current project to h2o-sparkling-demo (in build file:/home/cloudera/h2o-sparkling/)
[info] Running water.sparkling.demo.SparklingDemo --local
03:41:11.030 main INFO WATER: —– H2O started —–
03:41:11.046 main INFO WATER: Build git branch: (unknown)
03:41:11.047 main INFO WATER: Build git hash: (unknown)
03:41:11.047 main INFO WATER: Build git describe: (unknown)
03:41:11.047 main INFO WATER: Build project version: (unknown)
03:41:11.047 main INFO WATER: Built by: ‘(unknown)’
03:41:11.047 main INFO WATER: Built on: ‘(unknown)’
03:41:11.048 main INFO WATER: Java availableProcessors: 1
03:41:11.077 main INFO WATER: Java heap totalMemory: 3.87 gb
03:41:11.077 main INFO WATER: Java heap maxMemory: 3.87 gb
03:41:11.078 main INFO WATER: Java version: Java 1.6.0_31 (from Sun Microsystems Inc.)
03:41:11.078 main INFO WATER: OS version: Linux 2.6.32-220.23.1.el6.x86_64 (amd64)
03:41:11.381 main INFO WATER: Machine physical memory: 4.83 gb
03:41:11.393 main INFO WATER: ICE root: ‘/tmp/h2o-cloudera’
03:41:11.438 main INFO WATER: Possible IP Address: eth1 (eth1), 192.168.56.101
03:41:11.439 main INFO WATER: Possible IP Address: eth0 (eth0), 10.0.2.15
03:41:11.439 main INFO WATER: Possible IP Address: lo (lo), 127.0.0.1
03:41:11.669 main WARN WATER: Multiple local IPs detected:
+ /192.168.56.101 /10.0.2.15
+ Attempting to determine correct address…
+ Using /10.0.2.15
03:41:11.929 main INFO WATER: Internal communication uses port: 54322
+ Listening for HTTP and REST traffic on http://10.0.2.15:54321/
03:41:12.912 main INFO WATER: H2O cloud name: ‘cloudera’
03:41:12.913 main INFO WATER: (v(unknown)) ‘cloudera’ on /10.0.2.15:54321, discovery address /230.63.2.255:58943
03:41:12.913 main INFO WATER: If you have trouble connecting, try SSH tunneling from your local machine (e.g., via port 55555):
+ 1. Open a terminal and run ‘ssh -L 55555:localhost:54321 cloudera@10.0.2.15’
+ 2. Point your browser to http://localhost:55555
03:41:12.954 main INFO WATER: Cloud of size 1 formed [/10.0.2.15:54321 (00:00:00.000)]
03:41:12.954 main INFO WATER: Log dir: ‘/tmp/h2o-cloudera/h2ologs’
prostate
03:41:20.369 main INFO WATER: Running demo with following configuration: DemoConf(prostate,true,RDDExtractor@file,true)
03:41:20.409 main INFO WATER: Demo configuration: DemoConf(prostate,true,RDDExtractor@file,true)
03:41:21.830 main INFO WATER: Data : data/prostate.csv
03:41:21.831 main INFO WATER: Table: prostate_table
03:41:21.831 main INFO WATER: Query: SELECT * FROM prostate_table WHERE capsule=1
03:41:21.831 main INFO WATER: Spark: LOCAL
03:41:21.901 main INFO WATER: Creating LOCAL Spark context.
03:41:34.616 main INFO WATER: RDD result has: 153 rows
03:41:34.752 main INFO WATER: Going to write RDD into /tmp/rdd_null_6.csv
03:41:36.099 FJ-0-1 INFO WATER: Parse result for rdd_data_6 (153 rows):
03:41:36.136 FJ-0-1 INFO WATER: C1: numeric min(6.000000) max(378.000000)
03:41:36.140 FJ-0-1 INFO WATER: C2: numeric min(1.000000) max(1.000000) constant
03:41:36.146 FJ-0-1 INFO WATER: C3: numeric min(47.000000) max(79.000000)
03:41:36.152 FJ-0-1 INFO WATER: C4: numeric min(0.000000) max(2.000000)
03:41:36.158 FJ-0-1 INFO WATER: C5: numeric min(1.000000) max(4.000000)
03:41:36.161 FJ-0-1 INFO WATER: C6: numeric min(1.000000) max(2.000000)
03:41:36.165 FJ-0-1 INFO WATER: C7: numeric min(1.400000) max(139.700000)
03:41:36.169 FJ-0-1 INFO WATER: C8: numeric min(0.000000) max(73.400000)
03:41:36.176 FJ-0-1 INFO WATER: C9: numeric min(5.000000) max(9.000000)
03:41:37.457 main INFO WATER: Extracted frame from Spark:
03:41:37.474 main INFO WATER: {id,capsule,age,race,dpros,dcaps,psa,vol,gleason}, 2.8 KB
+ Chunk starts: {0,83,}
+ Rows: 153
03:41:37.482 #ti-UDP-R INFO WATER: Orderly shutdown command from /10.0.2.15:54321
[success] Total time: 44 s, completed Aug 4, 2014 3:41:37 AM
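Once the log shows the cloud has formed and is listening for HTTP/REST traffic, a quick reachability check from Python (host and port taken from the startup log above, using the requests library; adjust for your own network):
import requests

# H2O's web UI / REST interface, as reported in the startup log
resp = requests.get("http://10.0.2.15:54321/")
print(resp.status_code)   # 200 means the H2O node is up and answering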
Run against the local Spark cluster:
[cloudera@localhost h2o-sparkling]$ sbt -mem 100 "run --remote"
[info] Loading project definition from /home/cloudera/h2o-sparkling/project
[info] Set current project to h2o-sparkling-demo (in build file:/home/cloudera/h2o-sparkling/)
[info] Running water.sparkling.demo.SparklingDemo --remote
03:25:42.306 main INFO WATER: —– H2O started —–
03:25:42.309 main INFO WATER: Build git branch: (unknown)
03:25:42.309 main INFO WATER: Build git hash: (unknown)
03:25:42.309 main INFO WATER: Build git describe: (unknown)
03:25:42.309 main INFO WATER: Build project version: (unknown)
03:25:42.309 main INFO WATER: Built by: ‘(unknown)’
03:25:42.309 main INFO WATER: Built on: ‘(unknown)’
03:25:42.310 main INFO WATER: Java availableProcessors: 4
03:25:42.316 main INFO WATER: Java heap totalMemory: 3.83 gb
03:25:42.316 main INFO WATER: Java heap maxMemory: 3.83 gb
03:25:42.316 main INFO WATER: Java version: Java 1.6.0_31 (from Sun Microsystems Inc.)
03:25:42.317 main INFO WATER: OS version: Linux 2.6.32-220.23.1.el6.x86_64 (amd64)
03:25:42.383 main INFO WATER: Machine physical memory: 4.95 gb
03:25:42.384 main INFO WATER: ICE root: ‘/tmp/h2o-cloudera’
03:25:42.389 main INFO WATER: Possible IP Address: eth1 (eth1), 192.168.56.101
03:25:42.389 main INFO WATER: Possible IP Address: eth0 (eth0), 10.0.2.15
03:25:42.389 main INFO WATER: Possible IP Address: lo (lo), 127.0.0.1
03:25:42.587 main WARN WATER: Multiple local IPs detected:
+ /192.168.56.101 /10.0.2.15
+ Attempting to determine correct address…
+ Using /10.0.2.15
03:25:42.650 main INFO WATER: Internal communication uses port: 54322
+ Listening for HTTP and REST traffic on http://10.0.2.15:54321/
03:25:43.906 main INFO WATER: H2O cloud name: ‘cloudera’
03:25:43.906 main INFO WATER: (v(unknown)) ‘cloudera’ on /10.0.2.15:54321, discovery address /230.63.2.255:58943
03:25:43.907 main INFO WATER: If you have trouble connecting, try SSH tunneling from your local machine (e.g., via port 55555):
+ 1. Open a terminal and run ‘ssh -L 55555:localhost:54321 cloudera@10.0.2.15’
+ 2. Point your browser to http://localhost:55555
03:25:43.920 main INFO WATER: Cloud of size 1 formed [/10.0.2.15:54321 (00:00:00.000)]
03:25:43.921 main INFO WATER: Log dir: ‘/tmp/h2o-cloudera/h2ologs’
prostate
03:25:46.985 main INFO WATER: Running demo with following configuration: DemoConf(prostate,false,RDDExtractor@file,true)
03:25:46.991 main INFO WATER: Demo configuration: DemoConf(prostate,false,RDDExtractor@file,true)
03:25:48.000 main INFO WATER: Data : data/prostate.csv
03:25:48.000 main INFO WATER: Table: prostate_table
03:25:48.000 main INFO WATER: Query: SELECT * FROM prostate_table WHERE capsule=1
03:25:48.001 main INFO WATER: Spark: REMOTE
03:25:48.024 main INFO WATER: Creating REMOTE (spark://localhost:7077) Spark context.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 1.0:1 failed 4 times, most recent failure: TID 7 on host 192.168.56.101 failed for unknown reason
Driver stacktrace:
03:26:07.151 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1033)
03:26:07.151 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1017)
03:26:07.151 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1015)
03:26:07.152 main INFO WATER: at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
03:26:07.152 main INFO WATER: at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
03:26:07.152 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1015)
03:26:07.152 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:633)
03:26:07.152 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:633)
03:26:07.153 main INFO WATER: at scala.Option.foreach(Option.scala:236)
03:26:07.153 main INFO WATER: at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:633)
03:26:07.153 main INFO WATER: at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1207)
03:26:07.153 main INFO WATER: at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
03:26:07.155 main INFO WATER: at akka.actor.ActorCell.invoke(ActorCell.scala:456)
03:26:07.155 main INFO WATER: at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
03:26:07.156 main INFO WATER: at akka.dispatch.Mailbox.run(Mailbox.scala:219)
03:26:07.156 main INFO WATER: at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
03:26:07.157 main INFO WATER: at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
03:26:07.158 main INFO WATER: at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
03:26:07.158 main INFO WATER: at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
03:26:07.162 main INFO WATER: at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
03:26:07.172 #ti-UDP-R INFO WATER: Orderly shutdown command from /10.0.2.15:54321
[success] Total time: 27 s, completed Aug 4, 2014 3:26:07 PM