2010/07 update: I am now surrounded by Linux computers, so I have finally gotten rid of Solaris, which was not pleasant to use, by the way.

These notes are mainly from my Solaris experience.

list files in order of file size
ls -l|sort -nk5
-n: numeric sort
-k5: key field definition (sort by the 5th field, which is the file size in ls -l output)
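As a quick sketch of the pipeline above (the demo directory and file names are made up for illustration):

```shell
#!/bin/sh
# Create two files of different sizes, then sort the ls -l listing
# by field 5 (the size column), smallest first.
mkdir -p /tmp/ls_demo
printf 'x' > /tmp/ls_demo/small            # 1 byte
printf 'xxxxxxxxxx' > /tmp/ls_demo/big     # 10 bytes
ls -l /tmp/ls_demo | sort -nk5
```

On GNU systems, ls -lS does the same thing (largest first) without the pipe.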

execute commands remotely without logging in
I often crawl gigantic amounts of data, usually from one website at a time, and very often, because I started late, I have to run the crawling in parallel from different IP addresses so that I won't get blocked. One natural way to do this is to SSH to each computer and type the command. But imagine doing that 50 times (that is, on 50 different computers) every time you want to crawl. Now I use a script to start the crawling simultaneously. Here's the solution:
1. run ssh-keygen -t rsa to generate a public/private key pair
2. append the content of your public key (.ssh/id_rsa.pub) to the file .ssh/authorized_keys on each computer you want to ssh to
3. run ssh computer_name command
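The steps above can be wrapped into a small script that fans the same command out to every machine at once. This is only a sketch: the host names and crawl.sh are placeholders, and it assumes the key setup above is already done.

```shell
#!/bin/sh
# Run one command on many hosts in parallel, without logging in to each.
# HOSTS and ./crawl.sh are hypothetical; substitute your own machines/script.
HOSTS="crawler01 crawler02 crawler03"

fan_out() {
    # $1: the command to run on every host
    for host in $HOSTS; do
        ssh -n "$host" "$1" &   # -n: keep ssh from swallowing the loop's stdin
    done
    wait                        # block until every remote command has returned
}

# Example call (commented out so the script is safe to source):
# fan_out "nohup ./crawl.sh >/dev/null 2>&1 &"
```

nohup plus backgrounding on the remote side lets the crawl keep running after the ssh connection closes.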

only list directories:
ls -F|grep /
ls -l|grep ^d
ls -d ~/*/
find * -type d

to list or unmount usb devices:
/opt/SUNWut/bin/utdiskadm -l
/opt/SUNWut/bin/utdiskadm -e device_name

xargs: build and execute command lines from standard input
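A minimal sketch of the classic find | xargs pattern (the directory and file names below are made up for illustration):

```shell
#!/bin/sh
# xargs reads items from standard input and appends them as arguments
# to a command, batching many items into few invocations.
mkdir -p /tmp/xargs_demo
printf 'one\n' > /tmp/xargs_demo/a.txt
printf 'two\n' > /tmp/xargs_demo/b.txt

# Delete every .txt file found, passing the whole list to a single rm:
find /tmp/xargs_demo -name '*.txt' | xargs rm
```

For file names that may contain spaces, GNU find/xargs support find ... -print0 | xargs -0 instead.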

Last modification: 01/2008