Archive for July, 2009
As the title says, the benefits in short:
1. Google's CDN serves the files, so clients can fetch them faster than from your own server.
2. Using it saves your bandwidth and network usage.
3. It increases the number of parallel downloads.
But in China it may not work so well, because Google is blocked sometimes.
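For example, instead of serving jQuery yourself, you can load it from Google (the exact version path here is just an assumption for illustration):
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>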
The programmers met a problem when using Typo3: clients couldn't log in with IE, but could log in with Firefox/Opera.
After investigating, we found the problem was the cookies. Typo3 can set the domain for its cookies; change the setting to “.domain.com” and it works.
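If I remember right, this is set in typo3conf/localconf.php; the exact variable name may differ between Typo3 versions, so treat this line as an assumption:
$TYPO3_CONF_VARS['SYS']['cookieDomain'] = '.domain.com';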
BTW, closing the browser and then cleaning the cookies is the right way to do it.
In order to do this properly, remember to close your browser first. This is because all your cookies are held in memory until you close your browser. So, if you delete the file with your browser open, it will make a new file when you close it, and your cookies will be back.
Reference on cookies.
An excellent article, but blogspot.com is blocked by something, so I paste it here.
Original Link: http://0xfe.blogspot.com/2006/03/troubleshooting-unix-systems-with-lsof.html
One of the least-talked-about tools in a UNIX sysadmin’s toolkit is lsof. Lsof lists information about files opened by processes. But that’s really an understatement.
Most people forget that, in UNIX, (almost) everything is a file. The OS makes hardware available to applications by way of files in /dev. Kernel, system, memory, device etc. information is made available inside files in /proc. TCP/UDP sockets are sometimes represented internally as files. Even directories are really just files containing other filenames.
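A few quick examples of the questions it can answer (the port, PID, and path here are just placeholders):
lsof -i :80              # which process is listening on port 80?
lsof -p 1234             # which files does process 1234 have open?
lsof /var/log/messages   # which processes hold this file open?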
I met the “aborted by signal=PIPE” error when I used BackupPC to back up a remote server,
so I ran the command on the console:
>/usr/bin/ssh -q -x -l backup remote_ip /usr/bin/rsync --server --sender -v --ignore-times . /
protocol version mismatch -- is your shell clean?
(see the rsync man page for an explanation)
rsync error: protocol incompatibility (code 2) at compat.c(171) [sender=3.0.4]
I searched the internet and got the answer: “clean up .bashrc / .bash_profile”…
but it still didn't work.
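By the way, the rsync man page suggests a direct way to test whether the remote shell really is clean (user and host here follow the command above):
/usr/bin/ssh -q -x -l backup remote_ip /bin/true > out.dat
ls -l out.dat   # out.dat should be zero bytes; any content means the shell is not clean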
Finally I found the answer: the argLists had been overwritten, so I changed it back, and BackupPC works.
The right argLists is:
/usr/bin/ssh -q -x -l backup remote_ip /usr/bin/rsync --server --sender --numeric-ids --perms --owner --group -D --links --hard-links --times --block-size=2048 --recursive --ignore-times . /
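If I recall correctly, BackupPC builds this command from $Conf{RsyncClientCmd} and $Conf{RsyncArgs} in config.pl, so that is where the overwritten list has to be restored; a rough sketch (exact values are assumptions):
$Conf{RsyncClientCmd} = '$sshPath -q -x -l backup $host $rsyncPath $argList+';
$Conf{RsyncArgs} = ['--numeric-ids', '--perms', '--owner', '--group', '-D',
                    '--links', '--hard-links', '--times',
                    '--block-size=2048', '--recursive'];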
A module for blocking some unkind users. To some extent, it can help with DDoS:
Distributed white and black listing of IPs, ranges, and CIDR blocks
Configurable timeouts, memcache server listings
Support for consistent hashing using libmemcached's Ketama
Windowed rate limiting based on response code (to block brute-force dictionary attacks against .htpasswd, for example)
I wrote a shell script to trace the Apache log before:
read the last 50000 lines of the access log and check each IP's activity,
block the busy ones with iptables, and free them some minutes later.
This module is more friendly, I think.
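For the record, a minimal sketch of that hand-rolled approach (log path, threshold, and the sample IP are assumptions):
# show the top talkers among the last 50000 requests
tail -n 50000 /var/log/apache2/access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head
# block one offender with iptables, then free it five minutes later
iptables -I INPUT -s 1.2.3.4 -j DROP
sleep 300 && iptables -D INPUT -s 1.2.3.4 -j DROP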
I read some articles about Ubuntu recently
and occasionally came across one about approx. It works like a cache server: it can cache the *.deb files for other machines to use.
On the approx server: run sudo apt-get install approx
sudo /etc/init.d/approx start
If you have a firewall, open port 9999.
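approx also has to know which upstream mirror each path prefix maps to; this lives in /etc/approx/approx.conf (the mirror URL is just an example):
ubuntu    http://archive.ubuntu.com/ubuntu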
On the clients: sudo mv /etc/apt/sources.list /etc/apt/sources.list.orig
sudo vi /etc/apt/sources.list
and paste the following lines
(note: change the release name, here intrepid, to your release):
deb http://approx_server_ip:9999/ubuntu intrepid main restricted universe multiverse
deb http://approx_server_ip:9999/ubuntu intrepid-updates universe multiverse
deb http://approx_server_ip:9999/ubuntu intrepid-security main restricted universe multiverse
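After that, updating the package lists on the client should go through the approx cache; the first fetch fills the cache, and later clients get the cached copies:
sudo apt-get update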
checkproc – Checks for a process by full path name
This error comes from crontab; it also happens on every openSUSE server that has tripwire installed.
Tripwire generates its report, and if the server is busy this takes a long time to finish,
but crontab runs every 15 minutes, so if one run takes more than 15 minutes you may see the error
“checkproc: checkproc: xread error: No such process”
-*/15 * * * * root test -x /usr/lib/cron/run-crons && /usr/lib/cron/run-crons >/dev/null 2>&1
/sbin/checkproc $SCRIPT && continue
BTW, it may also be caused by other cron jobs that take a long time to finish.
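For a quick feel of what checkproc does, point it at the full path of any running daemon (the cron path here is an openSUSE assumption):
/sbin/checkproc /usr/sbin/cron && echo "cron is running"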
There are lots of articles on Linux/Vim/other topics there;
they dig deep into the Linux commands.
lshw -class memory
lshw -class disk
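lshw also has a compact overview mode, handy before digging into one class (run as root for full detail):
sudo lshw -short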
1. Create your own packages and put them into the www home directory, so clients can reach them over the network,
such as /var/www/html/my_repo,
via http://my_ip/my_repo or even ftp://my_ip/my_repo.
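One step worth spelling out: yum needs repo metadata, so index the directory with the createrepo tool first (assuming the createrepo package is installed):
createrepo /var/www/html/my_repo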
2. Create your repo file on the clients, e.g. /etc/yum.repos.d/my_repo.repo:
[my_repo]
name=My Cluster Repo $basearch
baseurl=http://my_ip/my_repo
# if you read further, you should change this to "1" for security.
gpgcheck=0
3. Run yum search some_package on a client;
you should see your repo now.
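To double-check that the new repo is enabled, yum can also list its repos (on reasonably recent yum versions):
yum repolist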