Environment: CentOS 5.6, Python
The statement is as follows:
import os
[f.split(".") for f in os.listdir("/u01/app/bietl/code/bdhi") if f.endswith(".dat")]
/u01/app/bietl/code/bdhi is the directory name
dat is the file suffix
Pass in whichever directory and suffix you need for these two values.
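If you only need the matching file names, the same listing can also be done straight from the shell (a sketch using the example directory and suffix above):
ls /u01/app/bietl/code/bdhi/*.dat
# or, to stay in the top-level directory and match only regular files:
find /u01/app/bietl/code/bdhi -maxdepth 1 -type f -name '*.dat'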
Environment: CentOS 5.7
mvn clean package -DskipTests
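Note that -DskipTests compiles the test sources but does not run them; if you want Maven to skip compiling the tests as well, the standard property is:
mvn clean package -Dmaven.test.skip=true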
Environment: CentOS
When building Hadoop-related projects with Maven, you need core components that match your Hadoop version, and since most of what I run is the CDH distribution, the corresponding packages have to be pulled from Cloudera's repository.
The fix is to add the following to pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>
</project>
The table below lists the project name, groupId, artifactId, and version required to access each CDH4 artifact.
| Project | groupId | artifactId | version |
|---|---|---|---|
| Hadoop | org.apache.hadoop | hadoop-annotations | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-archives | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-assemblies | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-auth | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-client | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-common | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-datajoin | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-dist | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-distcp | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-extras | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-gridmix | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-hdfs | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-client-app | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-client-common | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-client-core | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-client-hs | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-client-jobclient | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-client-shuffle | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-mapreduce-examples | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-rumen | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-api | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-applications-distributedshell | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-applications-unmanaged-am-launcher | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-client | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-common | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-server-common | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-server-nodemanager | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-server-resourcemanager | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-server-tests | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-server-web-proxy | 2.0.0-cdh4.2.0 |
| | org.apache.hadoop | hadoop-yarn-site | 2.0.0-cdh4.2.0 |
| Hadoop MRv1 | org.apache.hadoop | hadoop-core | 2.0.0-mr1-cdh4.2.0 |
| | org.apache.hadoop | hadoop-examples | 2.0.0-mr1-cdh4.2.0 |
| | org.apache.hadoop | hadoop-minicluster | 2.0.0-mr1-cdh4.2.0 |
| | org.apache.hadoop | hadoop-streaming | 2.0.0-mr1-cdh4.2.0 |
| | org.apache.hadoop | hadoop-test | 2.0.0-mr1-cdh4.2.0 |
| | org.apache.hadoop | hadoop-tools | 2.0.0-mr1-cdh4.2.0 |
| Hive | org.apache.hive | hive-anttasks | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-builtins | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-cli | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-common | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-contrib | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-exec | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-hbase-handler | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-hwi | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-jdbc | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-metastore | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-pdk | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-serde | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-service | 0.10.0-cdh4.2.0 |
| | org.apache.hive | hive-shims | 0.10.0-cdh4.2.0 |
| HBase | org.apache.hbase | hbase | 0.94.2-cdh4.2.0 |
| ZooKeeper | org.apache.zookeeper | zookeeper | 3.4.5-cdh4.2.0 |
| Sqoop | org.apache.sqoop | sqoop | 1.4.2-cdh4.2.0 |
| Pig | org.apache.pig | pig | 0.10.0-cdh4.2.0 |
| | org.apache.pig | pigsmoke | 0.10.0-cdh4.2.0 |
| | org.apache.pig | pigunit | 0.10.0-cdh4.2.0 |
| Flume 1.x | org.apache.flume | flume-ng-configuration | 1.3.0-cdh4.2.0 |
| | org.apache.flume | flume-ng-core | 1.3.0-cdh4.2.0 |
| | org.apache.flume | flume-ng-embedded-agent | 1.3.0-cdh4.2.0 |
| | org.apache.flume | flume-ng-node | 1.3.0-cdh4.2.0 |
| | org.apache.flume | flume-ng-sdk | 1.3.0-cdh4.2.0 |
| | org.apache.flume | flume-ng-tests | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-channels | flume-file-channel | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-channels | flume-jdbc-channel | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-channels | flume-recoverable-memory-channel | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-clients | flume-ng-log4jappender | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-legacy-sources | flume-avro-source | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-legacy-sources | flume-thrift-source | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-sinks | flume-hdfs-sink | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-sinks | flume-irc-sink | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-sinks | flume-ng-elasticsearch-sink | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-sinks | flume-ng-hbase-sink | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-sources | flume-jms-source | 1.3.0-cdh4.2.0 |
| | org.apache.flume.flume-ng-sources | flume-scribe-source | 1.3.0-cdh4.2.0 |
| Oozie | org.apache.oozie | oozie-client | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-core | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-examples | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-hadoop | 2.0.0-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-hadoop-distcp | 2.0.0-mr1-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-hadoop-test | 2.0.0-mr1-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-hbase | 0.94.2-cdh4.2.0.oozie-3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-distcp | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-distcp-yarn | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-hive | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-oozie | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-pig | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-sqoop | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-streaming | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-sharelib-streaming-yarn | 3.3.0-cdh4.2.0 |
| | org.apache.oozie | oozie-tools | 3.3.0-cdh4.2.0 |
| Mahout | org.apache.mahout | mahout-buildtools | 0.7-cdh4.2.0 |
| | org.apache.mahout | mahout-core | 0.7-cdh4.2.0 |
| | org.apache.mahout | mahout-examples | 0.7-cdh4.2.0 |
| | org.apache.mahout | mahout-integration | 0.7-cdh4.2.0 |
| | org.apache.mahout | mahout-math | 0.7-cdh4.2.0 |
| Whirr | org.apache.whirr | whirr-build-tools | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-cassandra | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-cdh | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-chef | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-cli | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-core | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-elasticsearch | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-examples | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-ganglia | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-hadoop | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-hama | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-hbase | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-mahout | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-pig | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-puppet | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-solr | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-yarn | 0.8.0-cdh4.2.0 |
| | org.apache.whirr | whirr-zookeeper | 0.8.0-cdh4.2.0 |
| DataFu | com.linkedin.datafu | datafu | 0.0.4-cdh4.2.0 |
| Sqoop2 | org.apache.sqoop | sqoop-client | 1.99.1-cdh4.2.0 |
| | org.apache.sqoop | sqoop-common | 1.99.1-cdh4.2.0 |
| | org.apache.sqoop | sqoop-core | 1.99.1-cdh4.2.0 |
| | org.apache.sqoop | sqoop-docs | 1.99.1-cdh4.2.0 |
| | org.apache.sqoop | sqoop-spi | 1.99.1-cdh4.2.0 |
| | org.apache.sqoop.connector | sqoop-connector-generic-jdbc | 1.99.1-cdh4.2.0 |
| | org.apache.sqoop.repository | sqoop-repository-derby | 1.99.1-cdh4.2.0 |
| HCatalog | org.apache.hcatalog | hcatalog-core | 0.4.0-cdh4.2.0 |
| | org.apache.hcatalog | hcatalog-pig-adapter | 0.4.0-cdh4.2.0 |
| | org.apache.hcatalog | hcatalog-server-extensions | 0.4.0-cdh4.2.0 |
| | org.apache.hcatalog | webhcat | 0.4.0-cdh4.2.0 |
| | org.apache.hcatalog | webhcat-java-client | 0.4.0-cdh4.2.0 |
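A quick way to check that the Cloudera repository and a coordinate from the table resolve is the Maven dependency plugin; hadoop-client below is only an example, substitute any groupId:artifactId:version from the table.
# fetch one artifact directly from the Cloudera repository to verify the coordinates resolve
mvn dependency:get -Dartifact=org.apache.hadoop:hadoop-client:2.0.0-cdh4.2.0 -DremoteRepositories=https://repository.cloudera.com/artifactory/cloudera-repos/
In a project that already declares the repository shown above, the same coordinate can simply be added as a normal dependency in pom.xml.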
Environment: Ubuntu 12.04, PostgreSQL 9.1, WordPress 3.4.2
1. Install the required packages
sudo apt-get install apache2
sudo apt-get install postgresql-9.1
sudo apt-get install php5
sudo apt-get install php5-pgsql
2. Download WordPress and the PostgreSQL plugin
wget -O wordpress.tar.gz http://wordpress.org/latest.tar.gz
wget https://downloads.wordpress.org/plugin/postgresql-for-wordpress.1.3.1.zip
3. Extract the archives and copy WordPress to /var/www
tar -xzf wordpress.tar.gz
unzip postgresql-for-wordpress.1.3.1.zip
sudo cp -R wordpress /var/www
sudo chown jerry:jerry /var/www/wordpress
sudo cp -R postgresql-for-wordpress/pg4wp /var/www/wordpress/wp-content/
cp /var/www/wordpress/wp-content/pg4wp/db.php /var/www/wordpress/wp-content
4. Switch to the /var/www/wordpress directory and copy wp-config-sample.php to wp-config.php
vi wp-config.php
Change these four settings to your PostgreSQL connection parameters:
define('DB_NAME', 'wordpress');
/** MySQL database username */
define('DB_USER', 'postgres');
/** MySQL database password */
define('DB_PASSWORD', 'xxxxxxx');
/** MySQL hostname */
define('DB_HOST', 'localhost:5432');
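The 'wordpress' database referenced by DB_NAME has to exist before the installer will run; a minimal sketch, assuming the default postgres superuser created by the Ubuntu package:
# create the database used by DB_NAME above
sudo -u postgres psql -c "CREATE DATABASE wordpress;"
# set the password you put in DB_PASSWORD for the postgres role
sudo -u postgres psql -c "ALTER USER postgres WITH PASSWORD 'xxxxxxx';"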
Now you have a blog of your own!
See the following for reference:
https://github.com/txthinking/google-hosts
Environment: Windows 7, pidgin-2.10.9, send-screenshot-v0.8-3
Pidgin does not support sending screenshots out of the box, but a plugin fills the gap. Download send-screenshot-v0.8-3.exe from https://code.google.com/p/pidgin-sendscreenshot/downloads/list and install it.
Then, in a conversation window, use "Conversation" -> "More" -> "Send Screenshot" to send a screenshot.
Environment: Ubuntu 12.04
chmod u+w /etc/sudoers
vi /etc/sudoers
Add the line:
jerry ALL=(ALL:ALL) ALL
chmod u-w /etc/sudoers
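A safer alternative, if you would rather not toggle the file mode by hand, is visudo, which locks /etc/sudoers and checks the syntax before saving (run it as root):
visudo
# then add the same line:
# jerry ALL=(ALL:ALL) ALL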
Environment: CentOS 5.7
1. Set up passwordless SSH login
Scenario: host A and host B, where A logs in to B
First, run on host A:
[oracle@A ~]$ ssh-keygen -t rsa -P ''
[oracle@A ~]$ scp .ssh/id_rsa.pub oracle@B:/home/oracle
Then run on host B:
[oracle@B ~]$ cat /home/oracle/id_rsa.pub >> ~/.ssh/authorized_keys
[oracle@B ~]$ chmod 600 ~/.ssh/authorized_keys
[oracle@B ~]$ chmod 700 ~/.ssh
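If ssh-copy-id is available on host A, it performs the scp/cat/chmod steps above in a single command (shown here only as an alternative):
[oracle@A ~]$ ssh-copy-id oracle@B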
2. vi test_ssh.sh
The script is as follows:
#!/bin/sh
# build the command string locally ($1 is the date argument) and run it on the remote host
cmd="
cd /home/oracle
. ~/.bash_profile
ls
python load_zhixin.py $1
"
echo "$cmd"
ssh oracle@xx.xx.xx.xx "$cmd"
3. Run it like this: ./test_ssh.sh 20140917
Problem: plotting from Anaconda or RStudio on a Linux server needs a terminal with an X connection, but with Xmanager's Xstart you cannot copy and paste code, and the background cannot be switched to a black-and-white scheme.
Solution: combine Xmanager - Passive with PuTTY, and set the DISPLAY environment variable in the PuTTY terminal, e.g. export DISPLAY=192.168.56.1:0.0
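To confirm that the DISPLAY value actually reaches the Xmanager Passive server, a quick test from the PuTTY session (xclock is just an example client; any X program will do):
export DISPLAY=192.168.56.1:0.0
xclock   # a clock window should appear on the Windows desktop if the X connection works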
http://blog.csdn.net/zbyufei/article/details/6412043
http://outofmemory.cn/code-snippet/1127/python-control-shell-zhixingshijian-ruo-chaoshi-ze-qiangxing-tuichu