Does anyone know if there is an easy way to generate 50K files (note: not 1 million as originally thought) at 75 MB each, so that we get the full ~3.5 TB of data, which is approximately what our suspect process was deleting? I would love to hear suggestions on how to create those files quickly.
We are presently running a script on an NFS client that copies /dev/random into a file to create a 75 MB file; that takes 11 sec/file (over 6 days for the 50K files). If we instead copy that randomly generated file to a new filename, it takes 1.5 sec/file (still 12+ hours to generate all 50K files).
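For reference, here is a minimal sketch of the two approaches described above, assuming a Linux client with the NFS mount at /mnt/nfs/testfiles (that path and the filenames are placeholders, not our real ones). Note this sketch reads /dev/urandom rather than /dev/random, since /dev/random can block waiting for entropy; our actual script uses /dev/random.

#!/bin/bash
DEST=/mnt/nfs/testfiles   # placeholder target directory on the NFS mount
mkdir -p "$DEST"

# Slow variant: generate fresh random data for every file
# (~11 sec/file in our testing):
#   dd if=/dev/urandom of="$DEST/file00001.dat" bs=1M count=75

# Faster variant: generate one 75 MB seed file, then copy it
# to 50K new filenames (~1.5 sec/file in our testing):
dd if=/dev/urandom of="$DEST/seed.dat" bs=1M count=75

for i in $(seq -f "%05g" 1 50000); do
    cp "$DEST/seed.dat" "$DEST/file$i.dat"
done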