I have a Windows 10 PC with several attached hard drives holding about 40TB of data (mostly music and videos). So there's a ton of data at risk, but it's data I rarely access (a movie or show file only gets opened when I actually want to watch it), and it never gets revised (the file is the file; I'm not editing a video file over and over the way I might an Excel spreadsheet).

The problem: the CrashPlan app needs 1GB of RAM for every 1TB backed up, and my PC only has 8GB of RAM. CrashPlan saves versions and also retains deleted files permanently.

What I'm wondering is: if I create a folder specifically for backing up to CrashPlan, move a few files in so they get backed up, then move them back out again so CrashPlan thinks they were deleted, would that solve the issue? That is, would those files now be safely held in deleted-file retention without counting against the 1GB RAM per 1TB requirement?

Other solutions are welcome, but the option above seems like it might work, and I'm wondering if anyone has tried it or has better alternatives for large cloud backups.