Plowshare is a command-line downloader and uploader for some of the most popular file-sharing websites. It works on UNIX-like systems and presently supports: Megaupload, Rapidshare, 2Shared, 4Shared, ZShare, Badongo, DepositFiles and Mediafire. It's basically a wget for the websites mentioned above.
Installing Plowshare in Ubuntu
1. First, we must install the dependencies:
sudo apt-get install curl recode imagemagick tesseract-ocr-eng spidermonkey-bin aview
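Before moving on, it can help to confirm the dependencies actually landed on your PATH. A minimal check, assuming the usual binary names (ImageMagick installs `convert`, spidermonkey-bin installs `js`; adjust for your distribution):

```shell
# Report any dependency binaries that are still missing.
# Binary names are assumptions; package names and binary names can differ.
for tool in curl recode convert tesseract js aview; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```

If the loop prints nothing, everything the installer needs is visible.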
2. Download the Plowshare archive from the project's download page, then extract it:
tar xvzf plowshare-0.8.tar.gz
3. Now let's set up Plowshare:
cd plowshare-0.8
sudo bash setup.sh install
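To confirm the setup script worked, you can check that the two scripts used in the examples below are now on your PATH (names taken from the usage examples; this loop only reports, it changes nothing):

```shell
# Report whether the plowshare entry points are visible on the PATH.
for cmd in plowdown plowup; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: installed"
  else
    echo "$cmd: not found"
  fi
done
```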
That's it!

Using Plowshare

Download
* Download a file from rapidshare:
plowdown http://www.rapidshare.com/files/86545320/Tux-Trainer_25-01-2008.rar
* Download a list of files (one link per line):
plowdown file_with_links.txt
* Download a file from megaupload using a free membership account:
plowdown -a myuser:mypassword http://www.megaupload.com/?d=132348234
* Download a password-protected file from megaupload:
plowdown -p somepassword http://www.megaupload.com/?d=ieo1g52v
* Get only the file URL without actually downloading it. Very handy if you want to use another web downloader:
plowdown --link-only http://www.2shared.com/file/4446939/c9fd70d6/Test.html | xargs -rt wget
* A more complex example for advanced users: you have found a webpage containing a bunch of megaupload links and want to download, say, the first ten. You could then write:
curl http://some-website.com/page.html | \
grep -o "http://www.megaupload.com/[^\"< ]*" | uniq | head -n10 | plowdown -
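To see what that pipeline hands to plowdown, you can run the same grep/uniq/head filter on a sample page locally, with no network involved (the page content here is made up for illustration):

```shell
# Sample page standing in for the curl output above.
page='<a href="http://www.megaupload.com/?d=aaa">one</a>
<a href="http://www.megaupload.com/?d=aaa">dup</a>
<a href="http://www.megaupload.com/?d=bbb">two</a>'

# Same filter as the example: extract links, drop adjacent duplicates,
# keep the first ten.
printf '%s\n' "$page" | grep -o "http://www.megaupload.com/[^\"< ]*" | uniq | head -n10
# prints:
# http://www.megaupload.com/?d=aaa
# http://www.megaupload.com/?d=bbb
```

Note that `uniq` only collapses adjacent duplicates; if the same link is scattered across the page, `sort -u` would catch all copies (at the cost of reordering them).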
* Filter alive links in a text file:
plowdown -c file_with_links.txt > file_with_active_links.txt
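Before handing a link list to `plowdown -c`, it can be worth pre-cleaning it: blank lines and duplicates just waste alive-checks. A small sketch with a stand-in sample file (the file contents are made up; substitute your real list):

```shell
# Sample link list standing in for file_with_links.txt.
cat > file_with_links.txt <<'EOF'
http://www.rapidshare.com/files/1/a.rar

http://www.rapidshare.com/files/1/a.rar
http://www.megaupload.com/?d=xyz
EOF

# Drop blank lines and duplicate links before the alive-check.
grep -v '^[[:space:]]*$' file_with_links.txt | sort -u
# prints:
# http://www.megaupload.com/?d=xyz
# http://www.rapidshare.com/files/1/a.rar
```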
Upload
* Upload a file to megaupload with a free membership account:
plowup -a myuser:mypassword -d "My description" /path/myfile.txt megaupload
* Upload a file to megaupload with a Premium account and multifetch upload:
plowup -a myuser:mypassword -d "My description" --multifetch http://www.somewherefarbeyond.com/somefile megaupload
* Upload a file to rapidshare anonymously, renaming the uploaded file:
plowup /path/myfile.txt rapidshare:anothername.txt
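Since plowup takes one file at a time in these examples, batch uploads are a natural shell loop. A dry-run sketch (the `echo` prints each command instead of running it; remove it once the commands look right, and substitute your own files and credentials):

```shell
# Dry run: print the plowup command that would run for each file.
# File names here are placeholders.
for f in file1.txt file2.txt; do
  echo plowup -a myuser:mypassword "$f" megaupload
done
# prints:
# plowup -a myuser:mypassword file1.txt megaupload
# plowup -a myuser:mypassword file2.txt megaupload
```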
[via gnometips]