FileZilla Forums
Cannot Download/Resume files which are larger than 2GB
Need help with FileZilla Client? Something does not work as expected? In this forum you may find an answer.
Moderator: Project members
8 posts • Page 1 of 1
- damageboy
- 500 Command not understood
- Posts: 3
- Joined: 2004-07-28 06:42
Cannot Download/Resume files which are larger than 2GB
#1 Post by damageboy » 2004-07-28 06:49
I've been unsuccessful (several times now) at downloading files larger than 2GB, or at resuming downloads of that size, from a FileZilla client.
I'm guessing there's some signed 32-bit integer inside FileZilla that overflows, and that's why my downloads get fsck'd.
I usually end up with never-ending downloads that would eventually fill my disk.
I've just started another one like that (a 3.1GB download) on 2.2.7b.
If it croaks, I'll try truncating the file to the REAL size and running MD5 to make sure it's just some weird counting problem.
Anyway... can someone point me to where in the source these things happen?
- botg
- Site Admin
- Posts: 33236
- Joined: 2004-02-23 20:49
- First name: Tim
- Last name: Kosse
#2 Post by botg » 2004-07-28 09:12
Either your filesystem or the server's filesystem does not support large files (> 2/4 GB); this problem is not caused by FileZilla.
If you're not using NTFS, convert your filesystem to NTFS. FAT32 does NOT support large files.
- damageboy
- 500 Command not understood
- Posts: 3
- Joined: 2004-07-28 06:42
#3 Post by damageboy » 2004-07-28 09:16
Sorry there. My Windows boxes have been running NTFS for about 3-5 years now.
I wouldn't know what FAT/FAT32 looks like even if I saw it.
I'm pretty sure it only happened when resuming.
The server side was/is proftpd.
Perhaps the problem is on the server side?
BTW: The clock is ticking... I'm 50% through the 3.1GB download.
No disconnections so far.
- damageboy
- 500 Command not understood
- Posts: 3
- Joined: 2004-07-28 06:42
Just happened again
#4 Post by damageboy » 2004-07-28 14:29
So it DID happen again.
The download went pretty well up until the last byte,
then FileZilla reported a transfer error.
Then it "resumed" the download even though all the bytes were in place,
and went on downloading.
I wrote a small trunc64.exe that truncated the file to the originally reported size.
I md5sum'd both copies (on the file server & the local truncated one) and the results came out the same... (at least that, huh?).
So I guess there is something bad going on.
I'm not saying it's all the FileZilla client's fault; the FTP server could also be partly to blame.
But the FileZilla client should at least be aware that the total file size went past the reported size...
i.e. if FileZilla says it's 100% complete but is still receiving bytes, then there's SOMETHING wrong on the client side as well, or at least it could be DETECTED by the client.
- botg
- Site Admin
- Posts: 33236
- Joined: 2004-02-23 20:49
- First name: Tim
- Last name: Kosse
Re: Just happened again
#5 Post by botg » 2004-07-28 14:56
Unfortunately, the size reported by the server in the directory listing or via the SIZE command is not always correct, or can't be retrieved at all.
damageboy wrote: i.e. if filezilla says it's 100% complete but still receives bytes then there's SOMETHING wrong at the client side as well. or at least it could be DETECTED by the client.
It's up to the server to close the transfer once it's complete.
FileZilla itself has no problems with large files, so if it's not your filesystem, then the server or any firewalls/proxies in between caused this problem.
- Tweak
- 500 Command not understood
- Posts: 3
- Joined: 2006-04-22 00:08
#6 Post by Tweak » 2006-04-22 00:21
I don't know if my problem is the same (I doubt it), but... I am trying to use g4u with the FileZilla FTP server. It seems to transfer fine, but after 2.4 gigs it stops. I know this is wrong, because this is a 20-gig hard drive I am talking about; there is no way g4u can compress it that small. So I believe FileZilla is killing it. I looked at the log and there are no errors; it says transfer OK and then disconnected. Any ideas, guys?
- Zythan
- 226 Transfer OK
- Posts: 82
- Joined: 2005-08-31 15:51
- Location: France
#7 Post by Zythan » 2006-04-22 10:31
Hello Tweak,
What is the server OS? What size is the disk on the server, and how much of it can you access?
TIA
Zythan
- ®om
- 500 Command not understood
- Posts: 2
- Joined: 2007-09-22 19:38
#8 Post by ®om » 2007-09-22 19:40
I have exactly the same problem.
I'm trying to transfer via SFTP (SSH) from my desktop PC (Linux, where the file in question is on an NTFS partition) to my laptop (also Linux, ext3). Once 2147483648 bytes have been transferred, the "resume/overwrite/..." dialog always opens.
I tried doing exactly the same thing with scp on the command line, and it works fine...
EDIT: maybe you use a signed int to store the file length instead of a long?