
nuaa
Veteran | nuaa
Level: Forum Veteran


Digest posts: 2
Posts: 1051
Prestige: 967 points
Forum coins: 2062
Expert points: 41
Online time: 469 hours
Registered: 2006-11-01
Last login: 2008-04-25

 Looking for a batch-file expert (updated)

I want to write a batch file that downloads a file

over FTP or HTTP.

If you know how, please let me know.

Much appreciated!



Update: I tried the script from the reply below, but downloaded .exe files won't run. They all come out 0 bytes in size, and running them gives "not a valid Win32 application". Can anyone solve this?


HTTP would be preferred.
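
For reference, here is a minimal sketch of the FTP variant using the ftp.exe that ships with Windows. The host name, account, and file path below are placeholders, not details from this thread. The point to note for the problem described above is the binary command: ftp.exe defaults to ASCII transfers, which is the usual cause of a downloaded .exe reporting "not a valid Win32 application" (a 0-byte result more likely means the get itself failed).

@echo off
rem getfile.bat -- sketch only; server, account and file names are placeholders
rem generate a command script for the built-in ftp.exe
echo open ftp.example.com> ftpcmd.txt
echo user myuser mypassword>> ftpcmd.txt
echo binary>> ftpcmd.txt
echo get /pub/tool.exe tool.exe>> ftpcmd.txt
echo bye>> ftpcmd.txt

rem -n: no auto-login (the script logs in itself), -i: no prompts, -s: run the script
ftp -n -i -s:ftpcmd.txt
del ftpcmd.txt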
[ This post was last edited by nuaa on 2006-11-26 16:01 ]
YUWIND.COM
ALL THE BEST FOR YOU!
Posted: 2006-11-24 23:48 | [OP]
xingyun321
Nothing Is Impossible
Level: Senior Member


Digest posts: 0
Posts: 1879
Prestige: 1378 points
Forum coins: 2701
Expert points: 4
Online time: 619 hours
Registered: 2006-11-01
Last login: 2008-04-27

 

Is it a problem with bin mode?

I'll give it a try tomorrow at the office; I don't have an environment at hand right now.

HTTP doesn't seem easy, unless you can bundle a wget executable along with the batch file.
Posted: 2006-11-26 22:29 | #1
xingyun321
Nothing Is Impossible

 

I experimented with it today. The incorrect .exe transfer seems to be a problem with ftp itself: bin mode and ascii mode give the same result, and typing the commands directly on the command line behaves the same way. My servers are SUSE 10 (64-bit) and Red Hat 4 server edition.

HTTP is more of a hassle, because ftp ships with the system while HTTP needs wget as a transfer helper. I'm not sure whether that meets the OP's requirements.
Posted: 2006-11-27 12:44 | #2
xingyun321
Nothing Is Impossible

 

Actually, with wget it becomes really simple!

I already have a Cygwin-based script that you can take a look at, although it cannot be run directly under Windows.
Attachment: pic_down.rar (119 K) Downloads: 1
Posted: 2006-11-27 18:43 | #3
xingyun321
Nothing Is Impossible

 

With wget you can download directly, so there probably isn't much batch code left to write. My script is mainly for filtering out files.

GNU Wget 1.10+devel, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
-V, --version       display the version of Wget and exit.
-h, --help         print this help.
-b, --background     go to background after startup.
-e, --execute=COMMAND   execute a `.wgetrc'-style command.

Logging and input file:
-o, --output-file=FILE   log messages to FILE.
-a, --append-output=FILE append messages to FILE.
-d, --debug           print lots of debugging information.
-q, --quiet           quiet (no output).
-v, --verbose         be verbose (this is the default).
-nv, --no-verbose       turn off verboseness, without being quiet.
-i, --input-file=FILE   download URLs found in FILE.
-F, --force-html       treat input file as HTML.
-B, --base=URL         prepends URL to relative links in -F -i file.

Download:
-t, --tries=NUMBER         set number of retries to NUMBER (0 unlimits).
    --retry-connrefused     retry even if connection is refused.
-O, --output-document=FILE   write documents to FILE.
-nc, --no-clobber         skip downloads that would download to
                      existing files.
-c, --continue           resume getting a partially-downloaded file.
    --progress=TYPE       select progress gauge type.
-N, --timestamping         don't re-retrieve files unless newer than
                      local.
-S, --server-response       print server response.
    --spider             don't download anything.
-T, --timeout=SECONDS       set all timeout values to SECONDS.
    --dns-timeout=SECS     set the DNS lookup timeout to SECS.
    --connect-timeout=SECS   set the connect timeout to SECS.
    --read-timeout=SECS     set the read timeout to SECS.
-w, --wait=SECONDS         wait SECONDS between retrievals.
    --waitretry=SECONDS     wait 1..SECONDS between retries of a retrieval.
    --random-wait         wait from 0...2*WAIT secs between retrievals.
-Y, --proxy             explicitly turn on proxy.
    --no-proxy           explicitly turn off proxy.
-Q, --quota=NUMBER         set retrieval quota to NUMBER.
    --bind-address=ADDRESS   bind to ADDRESS (hostname or IP) on local host.
    --limit-rate=RATE       limit download rate to RATE.
    --no-dns-cache         disable caching DNS lookups.
    --restrict-file-names=OS restrict chars in file names to ones OS allows.
    --ignore-case         ignore case when matching files/directories.
    --user=USER           set both ftp and http user to USER.
    --password=PASS       set both ftp and http password to PASS.

Directories:
-nd, --no-directories       don't create directories.
-x, --force-directories     force creation of directories.
-nH, --no-host-directories     don't create host directories.
    --protocol-directories   use protocol name in directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
    --cut-dirs=NUMBER       ignore NUMBER remote directory components.

HTTP options:
    --http-user=USER     set http user to USER.
    --http-password=PASS   set http password to PASS.
    --no-cache         disallow server-cached data.
-E, --html-extension     save HTML documents with `.html' extension.
    --ignore-length       ignore `Content-Length' header field.
    --header=STRING       insert STRING among the headers.
    --proxy-user=USER     set USER as proxy username.
    --proxy-password=PASS   set PASS as proxy password.
    --referer=URL       include `Referer: URL' header in HTTP request.
    --save-headers       save the HTTP headers to file.
-U, --user-agent=AGENT     identify as AGENT instead of Wget/VERSION.
    --no-http-keep-alive   disable HTTP keep-alive (persistent connections).
    --no-cookies         don't use cookies.
    --load-cookies=FILE   load cookies from FILE before session.
    --save-cookies=FILE   save cookies to FILE after session.
    --keep-session-cookies load and save session (non-permanent) cookies.
    --post-data=STRING     use the POST method; send STRING as the data.
    --post-file=FILE     use the POST method; send contents of FILE.

FTP options:
    --ftp-user=USER       set ftp user to USER.
    --ftp-password=PASS   set ftp password to PASS.
    --no-remove-listing   don't remove `.listing' files.
    --no-glob           turn off FTP file name globbing.
    --no-passive-ftp     disable the "passive" transfer mode.
    --retr-symlinks       when recursing, get linked-to files (not dir).
    --preserve-permissions preserve remote file permissions.

Recursive download:
-r, --recursive       specify recursive download.
-l, --level=NUMBER     maximum recursion depth (inf or 0 for infinite).
    --delete-after     delete files locally after downloading them.
-k, --convert-links     make links in downloaded HTML point to local files.
-K, --backup-converted   before converting file X, back up as X.orig.
-m, --mirror         shortcut for -N -r -l inf --no-remove-listing.
-p, --page-requisites   get all images, etc. needed to display HTML page.
    --strict-comments   turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
-A, --accept=LIST           comma-separated list of accepted extensions.
-R, --reject=LIST           comma-separated list of rejected extensions.
-D, --domains=LIST         comma-separated list of accepted domains.
    --exclude-domains=LIST     comma-separated list of rejected domains.
    --follow-ftp           follow FTP links from HTML documents.
    --follow-tags=LIST       comma-separated list of followed HTML tags.
    --ignore-tags=LIST       comma-separated list of ignored HTML tags.
-H, --span-hosts           go to foreign hosts when recursive.
-L, --relative             follow relative links only.
-I, --include-directories=LIST list of allowed directories.
-X, --exclude-directories=LIST list of excluded directories.
-np, --no-parent           don't ascend to the parent directory.

Mail bug reports and suggestions to <[email protected]>.

That is how wget is used; the simplest form is just wget "URL".
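
To illustrate, here is a minimal batch sketch that wraps wget for the download step. It assumes wget.exe is on PATH or next to the batch file, and the URL-list file name is a placeholder rather than something from this thread:

@echo off
rem wgetdown.bat -- sketch only; requires wget.exe to be available
rem usage: wgetdown.bat urls.txt   (a text file with one URL per line)
if "%~1"=="" (
  echo usage: %~n0 url-list-file
  exit /b 1
)
rem -i reads the URLs from the file, -c resumes partial downloads, -t 3 retries up to 3 times
wget -c -t 3 -i "%~1"

For a single file, plain wget "URL" on its own is already enough, as the post says; wrapping it in a batch file only helps when there are several URLs to fetch or fixed options to reuse.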
Posted: 2006-11-27 20:48 | #4
xingyun321
Nothing Is Impossible

 

Downloading .exe files with that kind of batch script simply doesn't work; that's a known problem.

It probably isn't easy to pull off with ftp.

With wget, you can download it directly with wget http://.....exe
Posted: 2006-12-03 23:08 | #5