Performance: Creating a duplicate file

by threeFatCat   Last Updated August 24, 2018 08:05 AM

I have a web server (local) and a file server (remote).

I need to create a duplicate of an XML file in the same location on the file server, but with a different file name. The file size varies, but can be up to 45 MB, and I have about 200 users a day. I'll be using ftp_put to transfer files between the servers. Which of the two approaches below is faster and more efficient?

a. Upload the file to the local server -> upload the file to the remote server with the new name

b. Read the contents of the remote file into a variable -> create a new file locally with those contents -> upload the file to the remote server with the new name

I'm not sure how I can benchmark this, so performance-wise, which is better, a or b, and why?
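
For illustration, here is a rough PHP sketch of what I mean by each approach; the host, credentials, and file paths are placeholders, not my actual setup:

```php
<?php
// Connect to the remote file server (placeholder host/credentials).
$conn = ftp_connect('files.example.com');
ftp_login($conn, 'ftpuser', 'ftppass');
ftp_pasv($conn, true);

$localFile  = '/tmp/report.xml';        // file already on the local web server
$remoteFile = '/data/report.xml';       // existing file on the file server
$remoteCopy = '/data/report_copy.xml';  // duplicate to be created

// Approach a: the file is already on the local web server,
// so upload it directly to the remote server under the new name.
ftp_put($conn, $remoteCopy, $localFile, FTP_BINARY);

// Approach b: download the existing remote file to a local temp file,
// then upload it again under the new name (the same data crosses the
// network twice: once down, once back up).
$tempFile = '/tmp/report_download.xml';
ftp_get($conn, $tempFile, $remoteFile, FTP_BINARY);
ftp_put($conn, $remoteCopy, $tempFile, FTP_BINARY);

ftp_close($conn);
```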


