Hi all. I've got a Galaxy instance up and running on my college's server, but the server admin adamantly does not want to use an FTP client for transferring files, for security reasons. Is there a way to just remove the limit on file size when using the "Get Data" tool, or some more secure alternative to FTP other people have used? I wouldn't mind if uploading the data took a long time, since that's something you can just leave in the background while you do other things. I also wouldn't mind if I had to edit the Get Data xml file or mess around with other code to achieve this. Thanks much!
Direct browser upload is capped at 2 GB due to the limitations of web browsers (they time out on long transfers). This is a browser limitation, not a Galaxy setting.
An alternative to using FTP is the URL upload method. The files can be served over http:// or ftp://, but they must be publicly accessible.
If the files are private, I believe a username/password can be embedded in the URL (with reserved characters such as "@" percent-encoded), but I haven't done that in a while, so please test it first if you try it. A web search will explain how to format a URL with a username/password in more detail.
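As a quick sketch of what that percent-encoding looks like, here is a small Python example; the username, password, hostname, and path are all placeholders, so substitute your own values:

```python
from urllib.parse import quote

# Hypothetical credentials -- replace with your own.
user = "jsmith"
password = "p@ss:word"

# Percent-encode reserved characters (e.g. "@" becomes "%40",
# ":" becomes "%3A") so they don't break URL parsing.
encoded_user = quote(user, safe="")
encoded_password = quote(password, safe="")

# example.edu and the file path are placeholders.
url = f"ftp://{encoded_user}:{encoded_password}@example.edu/data/reads.fastq"
print(url)
# ftp://jsmith:p%40ss%3Aword@example.edu/data/reads.fastq
```

The resulting URL can then be pasted into the Get Data > Upload File tool's URL box.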
Best, Jen, Galaxy team