The command is as follows:
time echo "scale=5000; 4*a(1)" | bc -l -q
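Why this works: in bc's math library (enabled by -l), a(x) is the arctangent, and arctan(1) = π/4, so 4*a(1) evaluates π to the requested scale. The identity can be sanity-checked with a quick Python sketch (double precision only; bc carries it to `scale` decimal places):

```python
import math

# arctan(1) = pi/4, so 4*atan(1) recovers pi.
pi_estimate = 4 * math.atan(1)
print(pi_estimate)
```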
Environment: CentOS 6.3
I have long needed to simulate a given port to send data, and the nc utility that ships with Linux turns out to be very convenient for this. The command is as follows:
nc -l 8888
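When nc is not available, the same listen-and-receive behavior can be sketched with Python's socket module (a minimal single-client sketch, reusing port 8888 from the example above):

```python
import socket

def listen_once(host="127.0.0.1", port=8888):
    """Accept one connection and return whatever the peer sends,
    roughly what `nc -l 8888` does for a single client."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            return conn.recv(4096)
```

Run it, and any client (nc included) can then connect to the port and send data.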
Environment: Ubuntu 14.04, gcc 4.8
Install gcc 4.9
sudo add-apt-repository ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get install g++-4.9
Change the default gcc version
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.9 150
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.8 100
sudo update-alternatives --config gcc
Environment: Ubuntu 14.04, pip
Today, installing a Python package with pip produced the following error:
ImportError: cannot import name IncompleteRead
Some searching shows this is a known pip bug.
Download and install a newer version:
sudo apt-get remove python-pip
sudo apt-get autoremove
wget https://raw.github.com/pypa/pip/master/contrib/get-pip.py --no-check-certificate
sudo python get-pip.py
Environment: CentOS 6.3, gcc 4.4.7, g++ 4.4.7
wget http://people.centos.org/tru/devtools-2/devtools-2.repo -O /etc/yum.repos.d/devtools-2.repo
yum install devtoolset-2-gcc devtoolset-2-binutils devtoolset-2-gcc-c++
scl enable devtoolset-2 bash
Environment: Ubuntu 12.04, Pidgin, pidgin-lwqq, Spark, ejabberd
Since I mostly work on Ubuntu, where pidgin and spark are the better cross-platform XMPP clients, I can build a private IM platform with them. I also want it to interoperate with external QQ contacts.
The concrete steps are as follows:
Install ejabberd
1. Install erlang
sudo apt-get install erlang
2. Install ejabberd
sudo apt-get install ejabberd
3. Edit ejabberd.cfg to add an admin user, password, and hostname
vi /etc/ejabberd/ejabberd.cfg
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% Options which are set by Debconf and managed by ucf
%% Admin user
{acl, admin, {user, "jerry", "hq"}}.
%% Hostname
{hosts, ["hq"]}.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
4. Edit /etc/hosts to map the IP address to the hostname
sudo vi /etc/hosts
5. Restart the ejabberd service
sudo service ejabberd restart
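After the restart, it may help to verify that ejabberd is accepting client connections on port 5222 (a sketch; the hostname "hq" is the one configured above):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP service is accepting connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("hq", 5222) should be True once ejabberd is running
```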
****************************************************************************************************************************
Install pidgin and pidgin-lwqq
1. Install pidgin
sudo apt-get install pidgin
2. Install pidgin-lwqq
sudo add-apt-repository ppa:lainme/pidgin-lwqq
sudo apt-get update
sudo apt-get install pidgin-lwqq
****************************************************************************************************************************
Start and use pidgin
1. Start pidgin
jerry@hq:~$ pidgin
2. Communicating with QQ
Add an account; under the "Basic" tab set the protocol to WebQQ and enter your QQ username and password.
3. Private messaging
Add an account; under the "Basic" tab set the protocol to XMPP, then under the "Advanced" tab enter the host and port (e.g. hq, 5222; these come from the ejabberd setup above).
At this point, you can use both your own and external instant messaging across platforms and protocols.
Environment: CentOS 5.6, Python
The statement is as follows:
[os.path.split(f)[1].split(".") for f in os.listdir("/u01/app/bietl/code/bdhi") if os.path.splitext(f)[1] == ".dat"]
/u01/app/bietl/code/bdhi: the directory name
dat: the file extension
Pass these two parameters in to get the result you want.
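The same filtering can be demonstrated self-contained with a temporary directory (a sketch; the file names here are made up for illustration):

```python
import os
import tempfile

# Create a scratch directory with a mix of extensions.
tmp = tempfile.mkdtemp()
for name in ("a.dat", "b.dat", "c.txt"):
    open(os.path.join(tmp, name), "w").close()

# Same idea as above: keep only the .dat files, split off the extension.
parts = [os.path.split(f)[1].split(".")
         for f in os.listdir(tmp)
         if os.path.splitext(f)[1] == ".dat"]
print(sorted(parts))  # → [['a', 'dat'], ['b', 'dat']]
```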
Environment: CentOS 5.7
mvn clean package -DskipTests
Environment: CentOS
When building Hadoop-related products with Maven, you need core components matching your Hadoop version, and most of what I use is the CDH distribution, so the corresponding packages have to be downloaded from that distribution's repository.
The fix is to add the following to pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>
</project>
The following table shows the project name, groupId, artifactId, and version required to access each CDH4 artifact.
| Project | groupId | artifactId | version |
| --- | --- | --- | --- |
| Hadoop | org.apache.hadoop | hadoop-annotations | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-archives | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-assemblies | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-auth | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-client | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-common | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-datajoin | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-dist | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-distcp | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-extras | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-gridmix | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-hdfs | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-client-app | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-client-common | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-client-core | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-client-hs | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-client-jobclient | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-client-shuffle | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-mapreduce-examples | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-rumen | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-api | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-applications-distributedshell | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-applications-unmanaged-am-launcher | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-client | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-common | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-server-common | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-server-nodemanager | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-server-resourcemanager | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-server-tests | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-server-web-proxy | 2.0.0-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-yarn-site | 2.0.0-cdh4.2.0 |
| Hadoop MRv1 | org.apache.hadoop | hadoop-core | 2.0.0-mr1-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-examples | 2.0.0-mr1-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-minicluster | 2.0.0-mr1-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-streaming | 2.0.0-mr1-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-test | 2.0.0-mr1-cdh4.2.0 |
|  | org.apache.hadoop | hadoop-tools | 2.0.0-mr1-cdh4.2.0 |
| Hive | org.apache.hive | hive-anttasks | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-builtins | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-cli | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-common | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-contrib | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-exec | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-hbase-handler | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-hwi | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-jdbc | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-metastore | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-pdk | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-serde | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-service | 0.10.0-cdh4.2.0 |
|  | org.apache.hive | hive-shims | 0.10.0-cdh4.2.0 |
| HBase | org.apache.hbase | hbase | 0.94.2-cdh4.2.0 |
| ZooKeeper | org.apache.zookeeper | zookeeper | 3.4.5-cdh4.2.0 |
| Sqoop | org.apache.sqoop | sqoop | 1.4.2-cdh4.2.0 |
| Pig | org.apache.pig | pig | 0.10.0-cdh4.2.0 |
|  | org.apache.pig | pigsmoke | 0.10.0-cdh4.2.0 |
|  | org.apache.pig | pigunit | 0.10.0-cdh4.2.0 |
| Flume 1.x | org.apache.flume | flume-ng-configuration | 1.3.0-cdh4.2.0 |
|  | org.apache.flume | flume-ng-core | 1.3.0-cdh4.2.0 |
|  | org.apache.flume | flume-ng-embedded-agent | 1.3.0-cdh4.2.0 |
|  | org.apache.flume | flume-ng-node | 1.3.0-cdh4.2.0 |
|  | org.apache.flume | flume-ng-sdk | 1.3.0-cdh4.2.0 |
|  | org.apache.flume | flume-ng-tests | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-channels | flume-file-channel | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-channels | flume-jdbc-channel | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-channels | flume-recoverable-memory-channel | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-clients | flume-ng-log4jappender | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-legacy-sources | flume-avro-source | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-legacy-sources | flume-thrift-source | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-sinks | flume-hdfs-sink | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-sinks | flume-irc-sink | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-sinks | flume-ng-elasticsearch-sink | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-sinks | flume-ng-hbase-sink | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-sources | flume-jms-source | 1.3.0-cdh4.2.0 |
|  | org.apache.flume.flume-ng-sources | flume-scribe-source | 1.3.0-cdh4.2.0 |
| Oozie | org.apache.oozie | oozie-client | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-core | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-examples | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-hadoop | 2.0.0-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-hadoop-distcp | 2.0.0-mr1-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-hadoop-test | 2.0.0-mr1-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-hbase | 0.94.2-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-distcp | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-distcp-yarn | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-hive | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-oozie | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-pig | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-sqoop | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-streaming | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-sharelib-streaming-yarn | 3.3.0-cdh4.2.0 |
|  | org.apache.oozie | oozie-tools | 3.3.0-cdh4.2.0 |
| Mahout | org.apache.mahout | mahout-buildtools | 0.7-cdh4.2.0 |
|  | org.apache.mahout | mahout-core | 0.7-cdh4.2.0 |
|  | org.apache.mahout | mahout-examples | 0.7-cdh4.2.0 |
|  | org.apache.mahout | mahout-integration | 0.7-cdh4.2.0 |
|  | org.apache.mahout | mahout-math | 0.7-cdh4.2.0 |
| Whirr | org.apache.whirr | whirr-build-tools | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-cassandra | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-cdh | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-chef | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-cli | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-core | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-elasticsearch | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-examples | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-ganglia | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-hadoop | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-hama | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-hbase | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-mahout | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-pig | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-puppet | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-solr | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-yarn | 0.8.0-cdh4.2.0 |
|  | org.apache.whirr | whirr-zookeeper | 0.8.0-cdh4.2.0 |
| DataFu | com.linkedin.datafu | datafu | 0.0.4-cdh4.2.0 |
| Sqoop2 | org.apache.sqoop | sqoop-client | 1.99.1-cdh4.2.0 |
|  | org.apache.sqoop | sqoop-common | 1.99.1-cdh4.2.0 |
|  | org.apache.sqoop | sqoop-core | 1.99.1-cdh4.2.0 |
|  | org.apache.sqoop | sqoop-docs | 1.99.1-cdh4.2.0 |
|  | org.apache.sqoop | sqoop-spi | 1.99.1-cdh4.2.0 |
|  | org.apache.sqoop.connector | sqoop-connector-generic-jdbc | 1.99.1-cdh4.2.0 |
|  | org.apache.sqoop.repository | sqoop-repository-derby | 1.99.1-cdh4.2.0 |
| HCatalog | org.apache.hcatalog | hcatalog-core | 0.4.0-cdh4.2.0 |
|  | org.apache.hcatalog | hcatalog-pig-adapter | 0.4.0-cdh4.2.0 |
|  | org.apache.hcatalog | hcatalog-server-extensions | 0.4.0-cdh4.2.0 |
|  | org.apache.hcatalog | webhcat | 0.4.0-cdh4.2.0 |
|  | org.apache.hcatalog | webhcat-java-client | 0.4.0-cdh4.2.0 |
Environment: Ubuntu 12.04, PostgreSQL 9.1, WordPress 3.4.2
1. Install the prerequisites
sudo apt-get install apache2
sudo apt-get install postgresql-9.1
sudo apt-get install php5
sudo apt-get install php5-pgsql
2. Download wordpress and the postgresql-for-wordpress plugin
wget -O wordpress.tar.gz http://wordpress.org/latest.tar.gz
wget https://downloads.wordpress.org/plugin/postgresql-for-wordpress.1.3.1.zip
3. Extract the archives and copy the files under /var/www
tar -xzf wordpress.tar.gz
unzip postgresql-for-wordpress.1.3.1.zip
sudo cp -R wordpress /var/www
sudo chown jerry:jerry /var/www/wordpress
sudo cp -R postgresql-for-wordpress/pg4wp /var/www/wordpress/wp-content/
cp /var/www/wordpress/wp-content/pg4wp/db.php /var/www/wordpress/wp-content
4. Switch to /var/www/wordpress and copy wp-config-sample.php to wp-config.php, then edit it
cd /var/www/wordpress
cp wp-config-sample.php wp-config.php
vi wp-config.php
Change these four settings to your PostgreSQL parameters:
define('DB_NAME', 'wordpress');
/** MySQL database username */
define('DB_USER', 'postgres');
/** MySQL database password */
define('DB_PASSWORD', 'xxxxxxx');
/** MySQL hostname */
define('DB_HOST', 'localhost:5432');
From now on, you can have your own blog!