Installing and using Clipper

Environment: Ubuntu 16.04
Install the required packages:
sudo apt-get install cmake libzmq5 libzmq5-dev libhiredis-dev libev-dev g++ redis-server
Install Boost 1.60:
wget http://sourceforge.net/projects/boost/files/boost/1.60.0/boost_1_60_0.tar.gz
tar xvf boost_1_60_0.tar.gz
cd boost_1_60_0
sudo ./bootstrap.sh --prefix=/usr/local --with-libraries=all
sudo ./b2 install
sudo /bin/bash -c 'echo "/usr/local/lib" > /etc/ld.so.conf.d/boost.conf'
sudo ldconfig
Download and build Clipper:
git clone --recursive https://github.com/ucbrise/clipper.git
cd clipper
./configure
make


Start the RESTful service

cd /home/jerry/clipper

bin/start_clipper.sh

Test the service

python examples/example_client.py

or

curl -H "Content-Type: application/json" -X POST -d '{"input": [0.4], "uid": 4}' http://192.168.56.101:1337/example_app/predict


That's it for now; more features will be explored in follow-up posts.

Ubuntu 16 host-only networking in VirtualBox

Environment: Ubuntu 16, VirtualBox

After setting up an Ubuntu 16 virtual machine with both NAT and host-only adapters enabled, checking the network showed that the host-only interface had not been assigned an address:

ifconfig -a

Edit the interfaces file:

sudo vi /etc/network/interfaces

Add the following (where enp0s8 is the interface name found above):

auto enp0s8

iface enp0s8 inet dhcp

Then restart networking:

sudo /etc/init.d/networking restart
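
After the restart, enp0s8 should have obtained an address via DHCP; a quick check:

ifconfig enp0s8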

Using permalinks in WordPress

Environment: Ubuntu 14.04, WordPress 4.7.1, Apache/2.4.7

WordPress's default link format is the ?p=1 style, which is unfriendly to search-engine indexing, so switch to permalinks. The steps are as follows:

  1. Enable URL rewriting: sudo a2enmod rewrite
  2. vi /etc/apache2/apache2.conf, find AllowOverride None (in the <Directory /var/www/> block) and change None to All
  3. Restart Apache: sudo service apache2 restart
  4. Log in to the WordPress admin, go to Settings -> Permalinks -> "Month and name", then create a .htaccess file under /var/www/wordpress on the server and copy the .htaccess content shown at the bottom of that settings page into it (a typical block is shown after this list).
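
For reference, the .htaccess content WordPress generates for this setup is typically the standard rewrite block below; copy whatever your admin page actually shows, and note that RewriteBase differs if the site lives in a subdirectory:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress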

With that, WordPress permalinks work.

Migrating the WordPress database from PostgreSQL to MySQL

Environment: Ubuntu 14.04, PostgreSQL 9.3, MySQL 5.5, WordPress 4.7.1

This blog was originally built on WordPress 3.4.2 with PostgreSQL and pg4wp. When I later tried to upgrade, I found that pg4wp had not been updated since 1.3.1, which is a real trap: no newer WordPress release can be installed on top of it. With no other option, I bit the bullet and replaced PostgreSQL with MySQL.

# PostgreSQL operations: export the data from PostgreSQL
\copy wp_postmeta to '/home/jerry/postmeta.txt';
\copy wp_posts to '/home/jerry/posts.txt';
\copy wp_links to '/home/jerry/links.txt';
\copy wp_terms to '/home/jerry/terms.txt';
\copy wp_term_relationships to '/home/jerry/terms_relationships.txt';
\copy wp_term_taxonomy to '/home/jerry/terms_taxonomy.txt';
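
Two notes: with no options, \copy writes tab-delimited text with \N for NULLs, which is exactly what LOAD DATA INFILE expects by default on the MySQL side; and each export can also be run from the shell instead of the psql prompt, e.g. (assuming the PostgreSQL database is named wordpress):

psql -d wordpress -c "\copy wp_posts to '/home/jerry/posts.txt'"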


# MySQL operations: create a new wordpress database and import the exported data
CREATE DATABASE wordpress DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;

# the wp_* tables already exist at this point (created by a fresh WordPress install); clear the seeded rows first
delete from wp_postmeta;
delete from wp_posts;
delete from wp_links;
delete from wp_terms;
delete from wp_term_relationships;
delete from wp_term_taxonomy;

load data infile '/home/jerry/postmeta.txt' into table wp_postmeta;
load data infile '/home/jerry/posts.txt' into table wp_posts;
load data infile '/home/jerry/links.txt' into table wp_links;
load data infile '/home/jerry/terms.txt' into table wp_terms;
load data infile '/home/jerry/terms_relationships.txt' into table wp_term_relationships;
load data infile '/home/jerry/terms_taxonomy.txt' into table wp_term_taxonomy;

MySQL 5.5 LOAD DATA problems

Environment: Ubuntu 14.04, MySQL 5.5

Importing data with MySQL:

load data infile '/home/jerry/aa.txt' into table t1;

This ran into two problems:

  1. The MySQL variable secure_file_priv: data files can only be placed in and loaded from this directory. Check its value with show variables like '%secure_file_priv%'; (this alone did not fix it). I then tried pointing it elsewhere by adding the line secure-file-priv = "" under [mysqld] in /etc/mysql/my.cnf and restarting MySQL, still with no effect.
  2. /etc/apparmor.d/usr.sbin.mysqld: this AppArmor profile controls which files mysqld may read and write. Add the following two lines at the bottom of the file.

/home/jerry/ r,
/home/jerry/* rw,

Then reload AppArmor:

sudo /etc/init.d/apparmor reload


So far, MySQL appears to be very strict about file permissions.
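
One alternative worth noting, as a sketch I have not verified here: LOAD DATA LOCAL INFILE makes the mysql client read the file itself, so secure_file_priv and the server's AppArmor profile do not apply (the database name below is an assumption):

mysql --local-infile=1 -u root -p wordpress -e "load data local infile '/home/jerry/aa.txt' into table t1;"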


How to get data from Kafka into Elasticsearch

Three approaches come to mind:

1. Logstash: simple; only a forwarding agent needs to be started.

2. kafka-connect-elasticsearch: tightly coupled to Confluent; somewhat complex.

3. elasticsearch-river-kafka-1.2.1-plugin: the code has not been updated for a long time, so ongoing support is poor.


Logstash is used as follows:

input {
  kafka {
    zk_connect => "kafka:2181"
    group_id => "logstash"
    topic_id => "apache_logs"
    consumer_threads => 16
  }
}
output {
  elasticsearch {
    document_id => "%{my_uuid}"
  }
}
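
To run it, save the config above to a file and start Logstash against it (the file name here is arbitrary):

bin/logstash -f kafka_to_es.conf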

https://www.elastic.co/blog/just-enough-kafka-for-the-elastic-stack-part2

Integrating Spark with Elasticsearch

Environment: Spark 1.6, Elasticsearch 1.6.1, elasticsearch-hadoop

With elasticsearch-hadoop, data processed by Spark can be saved to Elasticsearch, which makes later inspection and querying of the data very convenient.
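
A minimal sketch of the write path, assuming spark-shell and an elasticsearch-hadoop jar compatible with the versions above are available locally (the jar path, ES address, and index name are all illustrative):

spark-shell --jars /path/to/elasticsearch-hadoop-2.1.2.jar \
  --conf spark.es.nodes=192.168.56.101 <<'EOF'
import org.elasticsearch.spark._
// write a small RDD of documents to the blog/posts index/type
sc.makeRDD(Seq(Map("title" -> "hello", "views" -> 1))).saveToEs("blog/posts")
EOF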

https://db-blog.web.cern.ch/blog/prasanth-kothuri/2016-05-integrating-hadoop-and-elasticsearch-%E2%80%93-part-2-%E2%80%93-writing-and-querying

https://www.elastic.co/guide/en/elasticsearch/hadoop/master/spark.html

https://spark-packages.org/package/elastic/elasticsearch-hadoop