风云小站 » 『 Help Section 』
Topic: Looking for a batch scripting expert (updated)

nuaa

Looking for a batch scripting expert (updated)

I'd like to write a batch script that downloads a file

over FTP or HTTP.

If you know how, please tell me.

Much appreciated!


Update: I tried the script in the reply below, but downloaded exe files won't run. They all come out 0 bytes in size, and running them gives "not a valid Win32 application". Can anyone solve this?


HTTP would be preferable.
[ This post was last edited by nuaa on 2006-11-26 16:01 ]
YUWIND.COM
ALL THE BEST FOR YOU!
Posted: 2006-11-24 23:48 | OP
lovebasssolo

Create a file ftpc.bat with the following content:

@echo off
echo open ftp.sec.gov>tmp.bat
echo anonymous>>tmp.bat
echo [email protected]>>tmp.bat
echo user anonymous [email protected]>>tmp.bat
echo lcd c:\>>tmp.bat
echo get /edgar/data/1002135/0000914760-03-000098.txt>>tmp.bat
echo get /edgar/data/1034594/0000950168-03-001955.txt>>tmp.bat
echo bye>>tmp.bat

ftp -s:tmp.bat

cd /d c:\
ren "0000914760-03-000098.txt" 000098.txt

echo done!!
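
One likely cause of the broken .exe downloads reported in this thread is that the Windows ftp client defaults to ASCII transfer mode, which mangles binary files. A minimal sketch of the same approach with a binary command added; the remote .exe path below is only a placeholder, not a real file on that server:

@echo off
rem Build the ftp command script; "binary" switches to image mode so .exe files arrive intact
echo open ftp.sec.gov>tmp.bat
echo anonymous>>tmp.bat
echo [email protected]>>tmp.bat
echo binary>>tmp.bat
echo lcd c:\>>tmp.bat
rem Placeholder path: substitute the real remote .exe you want to fetch
echo get /pub/example/setup.exe>>tmp.bat
echo bye>>tmp.bat

ftp -s:tmp.bat
del tmp.bat
echo done!!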
Superior, paranoid-grade thinking
Posted: 2006-11-25 00:41 | #1
百灵儿

That's impressive!
Recent rating on this post:
  • 风云币 (forum coins): -2 (powerday)
Posted: 2006-11-26 09:30 | #2
xingyun321

Could it be a bin-mode problem?

I'll try it at the office tomorrow; I don't have the environment at the moment.

HTTP doesn't feel easy, though, unless you can bundle a wget exe along with the script, heh.
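
A rough sketch of that bundling idea, assuming a wget.exe is shipped in the same folder as the batch file (%~dp0 expands to the script's own directory); the URL is a placeholder:

@echo off
rem Look for a wget.exe sitting next to this batch file
set WGET=%~dp0wget.exe
if not exist "%WGET%" (
    echo wget.exe not found next to this script
    exit /b 1
)
rem Placeholder URL: replace with the real file to fetch over HTTP
"%WGET%" http://example.com/files/setup.exe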
Posted: 2006-11-26 22:29 | #3
xingyun321

I tried it today. The incorrect exe transfer seems to be a problem with ftp itself; bin mode and ascii mode give the same result, and so does typing the commands directly on the command line. My servers are SUSE 10 (64-bit) and Red Hat 4 server edition.

HTTP is more of a hassle: ftp ships with the system, while HTTP needs wget as a transfer helper, and I'm not sure whether that meets the OP's requirements.
Posted: 2006-11-27 12:44 | #4
nuaa

Thanks for the replies above. Could you just write me one that can download exe files? Thanks!
YUWIND.COM
ALL THE BEST FOR YOU!
Posted: 2006-11-27 16:21 | #5
xingyun321

Actually, with wget it's super simple, heh!

I have a script that uses cygwin; you can take a look at it first, though it can't be run directly under Windows.
Attachment: pic_down.rar (119 K), downloaded 1 time
Posted: 2006-11-27 18:43 | #6
mfkiud

These are all experts; I think they deserve a reward.
Posted: 2006-11-27 20:17 | #7
xingyun321

You can download directly with wget; there probably isn't much of a batch script to write. Mine was mainly for filtering out files.

    GNU Wget 1.10+devel, a non-interactive network retriever.
    Usage: wget [OPTION]... [URL]...

    Mandatory arguments to long options are mandatory for short options too.

    Startup:
    -V, --version       display the version of Wget and exit.
    -h, --help         print this help.
    -b, --background     go to background after startup.
    -e, --execute=COMMAND   execute a `.wgetrc'-style command.

    Logging and input file:
    -o, --output-file=FILE   log messages to FILE.
    -a, --append-output=FILE append messages to FILE.
    -d, --debug           print lots of debugging information.
    -q, --quiet           quiet (no output).
    -v, --verbose         be verbose (this is the default).
    -nv, --no-verbose       turn off verboseness, without being quiet.
    -i, --input-file=FILE   download URLs found in FILE.
    -F, --force-html       treat input file as HTML.
    -B, --base=URL         prepends URL to relative links in -F -i file.

    Download:
    -t, --tries=NUMBER         set number of retries to NUMBER (0 unlimits).
        --retry-connrefused     retry even if connection is refused.
    -O, --output-document=FILE   write documents to FILE.
    -nc, --no-clobber         skip downloads that would download to
                          existing files.
    -c, --continue           resume getting a partially-downloaded file.
        --progress=TYPE       select progress gauge type.
    -N, --timestamping         don't re-retrieve files unless newer than
                          local.
    -S, --server-response       print server response.
        --spider             don't download anything.
    -T, --timeout=SECONDS       set all timeout values to SECONDS.
        --dns-timeout=SECS     set the DNS lookup timeout to SECS.
        --connect-timeout=SECS   set the connect timeout to SECS.
        --read-timeout=SECS     set the read timeout to SECS.
    -w, --wait=SECONDS         wait SECONDS between retrievals.
        --waitretry=SECONDS     wait 1..SECONDS between retries of a retrieval

        --random-wait         wait from 0...2*WAIT secs between retrievals.
    -Y, --proxy             explicitly turn on proxy.
        --no-proxy           explicitly turn off proxy.
    -Q, --quota=NUMBER         set retrieval quota to NUMBER.
        --bind-address=ADDRESS   bind to ADDRESS (hostname or IP) on local host

        --limit-rate=RATE       limit download rate to RATE.
        --no-dns-cache         disable caching DNS lookups.
        --restrict-file-names=OS restrict chars in file names to ones OS allows

        --ignore-case         ignore case when matching files/directories.
        --user=USER           set both ftp and http user to USER.
        --password=PASS       set both ftp and http password to PASS.

    Directories:
    -nd, --no-directories       don't create directories.
    -x, --force-directories     force creation of directories.
    -nH, --no-host-directories     don't create host directories.
        --protocol-directories   use protocol name in directories.
    -P, --directory-prefix=PREFIX save files to PREFIX/...
        --cut-dirs=NUMBER       ignore NUMBER remote directory components.

    HTTP options:
        --http-user=USER     set http user to USER.
        --http-password=PASS   set http password to PASS.
        --no-cache         disallow server-cached data.
    -E, --html-extension     save HTML documents with `.html' extension.
        --ignore-length       ignore `Content-Length' header field.
        --header=STRING       insert STRING among the headers.
        --proxy-user=USER     set USER as proxy username.
        --proxy-password=PASS   set PASS as proxy password.
        --referer=URL       include `Referer: URL' header in HTTP request.
        --save-headers       save the HTTP headers to file.
    -U, --user-agent=AGENT     identify as AGENT instead of Wget/VERSION.
        --no-http-keep-alive   disable HTTP keep-alive (persistent connections)

        --no-cookies         don't use cookies.
        --load-cookies=FILE   load cookies from FILE before session.
        --save-cookies=FILE   save cookies to FILE after session.
        --keep-session-cookies load and save session (non-permanent) cookies.
        --post-data=STRING     use the POST method; send STRING as the data.
        --post-file=FILE     use the POST method; send contents of FILE.

    FTP options:
        --ftp-user=USER       set ftp user to USER.
        --ftp-password=PASS   set ftp password to PASS.
        --no-remove-listing   don't remove `.listing' files.
        --no-glob           turn off FTP file name globbing.
        --no-passive-ftp     disable the "passive" transfer mode.
        --retr-symlinks       when recursing, get linked-to files (not dir).
        --preserve-permissions preserve remote file permissions.

    Recursive download:
    -r, --recursive       specify recursive download.
    -l, --level=NUMBER     maximum recursion depth (inf or 0 for infinite).
        --delete-after     delete files locally after downloading them.
    -k, --convert-links     make links in downloaded HTML point to local files.
    -K, --backup-converted   before converting file X, back up as X.orig.
    -m, --mirror         shortcut for -N -r -l inf --no-remove-listing.
    -p, --page-requisites   get all images, etc. needed to display HTML page.
        --strict-comments   turn on strict (SGML) handling of HTML comments.

    Recursive accept/reject:
    -A, --accept=LIST           comma-separated list of accepted extensions.
    -R, --reject=LIST           comma-separated list of rejected extensions.
    -D, --domains=LIST         comma-separated list of accepted domains.
        --exclude-domains=LIST     comma-separated list of rejected domains.
        --follow-ftp           follow FTP links from HTML documents.
        --follow-tags=LIST       comma-separated list of followed HTML tags.
        --ignore-tags=LIST       comma-separated list of ignored HTML tags.
    -H, --span-hosts           go to foreign hosts when recursive.
    -L, --relative             follow relative links only.
    -I, --include-directories=LIST list of allowed directories.
    -X, --exclude-directories=LIST list of excluded directories.
    -np, --no-parent           don't ascend to the parent directory.

    Mail bug reports and suggestions to <[email protected]>.

That's how wget is used; the simplest form is just wget "URL", heh.
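
As a concrete example of the options listed above, a short batch snippet for grabbing an .exe over HTTP; the URL and file name are placeholders:

@echo off
rem -t 3 : retry up to 3 times   -c : resume a partial download   -O : choose the local file name
wget -t 3 -c -O setup.exe http://example.com/files/setup.exe
echo done!!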
Posted: 2006-11-27 20:48 | #8
nuaa

My sincere thanks to xingyun321 for the help. This is all too complicated for me to follow; could you just write me a batch script that can download exe files?
It should work like the script below, which downloads txt files; when I changed it to fetch an exe instead, the downloaded file wouldn't run.

@echo off
echo open ftp.sec.gov>tmp.bat
echo anonymous>>tmp.bat
echo [email protected]>>tmp.bat
echo user anonymous [email protected]>>tmp.bat
echo lcd c:\>>tmp.bat
echo get /edgar/data/1002135/0000914760-03-000098.txt>>tmp.bat
echo get /edgar/data/1034594/0000950168-03-001955.txt>>tmp.bat
echo bye>>tmp.bat

ftp -s:tmp.bat

cd /d c:\
ren "0000914760-03-000098.txt" 000098.txt

echo done!!


YUWIND.COM
ALL THE BEST FOR YOU!
Posted: 2006-12-03 22:48 | #9
xingyun321

Downloading exe files with this kind of batch script just doesn't work; it's a known issue.

It's probably not easy to do with ftp.

With wget it works: just wget http://.....exe downloads the file.
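
wget also accepts ftp:// URLs and, as far as I know, transfers FTP files in binary by default, so it sidesteps the ASCII problem of the built-in ftp client as well; a sketch with a placeholder host and path:

@echo off
rem Placeholder host/path; anonymous login is the default, and --ftp-user / --ftp-password (listed above) set credentials if needed
wget ftp://ftp.example.com/pub/tools/setup.exe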
Posted: 2006-12-03 23:08 | #10