File capacity limitation in theory in SVN
We have about 1.65 TB of data in Samba: 2.72 million files and 274 thousand folders. It consists of all kinds of file types: txt, bin, pictures, audio, video, and so on. We are considering moving it from Samba to SVN, and the data may keep growing. Is there a theoretical file capacity limit in SVN? And if the data becomes very large, will it degrade performance? Thank you!
Interesting question for you: do you know how many, if any, of the data files are duplicates? In other words, if you were to run through your Samba-exported file system and take an MD5 of every file, would you find any duplicates? In any case, 1.65 TB would not be the largest SVN repo I've seen, but it would be among the largest. That said, I don't know of any limits that would cause trouble. Clearly, I'm not authoritative in this area, so I would suggest you ask that question directly to the committers via email (email@example.com). I'll also observe that proper backup will definitely be critical, and restoration should be verified periodically.
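The duplicate check suggested above can be sketched in a few lines. This is only an illustration, not part of the original reply; the `find_duplicates` name and the chunked-read size are my own choices:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Group files under `root` by MD5 digest; return only groups with duplicates."""
    by_hash = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.md5()
            with open(path, "rb") as f:
                # read in 1 MiB chunks so large media files don't exhaust memory
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)
    return {digest: paths for digest, paths in by_hash.items() if len(paths) > 1}
```

On a 2.72-million-file share this would take a while, but it answers the question of how much of the 1.65 TB is actually unique.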
Thank you, DougR. I have asked the dev list; here is the answer: https://mail-archives.apache.org/mod_mbox/subversion-users/201903.mbox/browser (search: "File capacity limitation in theory in SVN"). In short: 1. Subversion is not a good choice to use as a file server; if you need version control, it can be considered. 2. If you transfer such a large amount of data from Samba to SVN, it is recommended to break it up into many commits instead of one big commit.
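The "many small commits" advice could be implemented as a simple batching loop over the files to import. This is a sketch of my own, not from the dev-list thread; the batch size, commit message, and the assumption of an existing working copy are all illustrative:

```python
import subprocess

def batched(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def commit_in_batches(paths, batch_size=1000):
    """Add and commit files to an existing SVN working copy in small batches,
    so the server never has to process one enormous commit."""
    for chunk in batched(paths, batch_size):
        subprocess.run(["svn", "add", "--parents", *chunk], check=True)
        subprocess.run(["svn", "commit", "-m", "bulk import batch", *chunk], check=True)
```

With 2.72 million files and a batch size of 1000, that would be roughly 2,720 commits rather than one.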
Thank you for reporting back! Cheers!