It's A Digital Disease!


This is a sub that aims to bring data hoarders together to share their passion with like-minded people.

founded 2 years ago
76
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Cultural-Victory3442 on 2025-06-01 23:10:36+00:00.


I am about to buy a higher-capacity hard drive for saving my files, because right now I only use 500GB hard drives that I've collected over the years.

So I want to move up to a larger drive.

But I'm not sure what price per TB counts as a good deal.
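
As a worked example (hypothetical numbers, not from the original post): price per TB is just price divided by capacity, so a $280 16TB drive would come out to $280 / 16 = $17.50 per TB.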

Any suggestions?

77
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Ok_Quantity_5697 on 2025-06-01 21:37:07+00:00.

78
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/JJPath005 on 2025-05-31 17:35:26+00:00.


Every week I take around 15 GB of footage, and it adds up pretty quickly. What is the most efficient way to upload and store this content? I'm saying 1 TB because it gives me headroom and avoids bigger crash issues. Is an SSD the best option?
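
For scale, using only the numbers above: 15 GB a week is roughly 15 × 52 = 780 GB a year, so a 1 TB drive covers a little over a year of footage at that rate.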

79
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/philcolinsfan on 2025-05-31 18:56:01+00:00.

80
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/searchjobs_poster on 2025-06-01 07:09:40+00:00.


Guide on downloading YouTube playlists:

https://www.reddit.com/r/downr/comments/1l0gi4f/how_to_download_an_entire_youtube_playlist/
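
The linked guide presumably covers a tool along the lines of yt-dlp; as a minimal sketch (the playlist URL and output template below are illustrative, not taken from the guide):

  yt-dlp -o "%(playlist_title)s/%(playlist_index)03d - %(title)s.%(ext)s" "https://www.youtube.com/playlist?list=PLxxxxxxxx"

This fetches every video in the playlist into a folder named after the playlist, numbered in playlist order.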

81
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/DrDoom229 on 2025-06-01 02:41:48+00:00.

82
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Haorelian on 2025-05-31 23:34:52+00:00.


Hey folks,

I’ve been working on a personal project: a doomsday-ready PC/phone setup packed with everything you'd need for survival and entertainment.

Right now, I’ve got a solid base going. Around 10GB of resources—over 200 books and PDFs—covering blacksmithing, water purification, wildlife ID, medical stuff (treatments + pharma), basic maintenance (car, electrical, general repairs), psychology, and more.

I’ve also set up a local LLM (Llama 3.1 8B), downloaded the entire Wikipedia, offline maps of my country (via OSM), and built a bootable USB with a portable Linux OS that has everything preloaded—plug in and go.

For entertainment, I’ve loaded enough content to last 10+ years: manga, light novels, classic literature, etc. I’ve also added ~30 practical video tutorials.

I’ve mirrored the whole setup across two laptops—one of them stored in a Faraday cage in case of EMP—and also cloned it onto my phone.

Now I’m looking to fine-tune it and get some outside input:

If you were building your own doomsday digital datahoard, what would your must-haves be?

Also, if this isn’t the right place for this kind of post—apologies in advance, and thanks for reading.

83
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/peyton_montana on 2025-05-31 23:29:41+00:00.

84
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Broad_Sheepherder593 on 2025-05-31 05:25:23+00:00.

85
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/babybuttoneyes on 2025-05-31 12:47:33+00:00.

86
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/babybuttoneyes on 2025-05-31 10:29:19+00:00.

87
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/XStylus on 2025-05-31 05:21:43+00:00.


I just spent a few hours hunting down an alarming issue when copying a folder via macOS Finder to a Samba share.

TL;DR: if you're using the veto files = "/.DS_Store/" global parameter in Samba, you're playing with fire. A bug in either Samba or macOS Finder (or both) will falsely indicate a successful folder copy when, in fact, files within the folder have not been copied.

Here are the conditions for replicating the issue (a scripted version of the setup follows the list):

  1. Set the following global parameter in smb.conf on the Samba file server:  veto files = "/.DS_Store/"
  2. Mount the Samba file server on a macOS client.
  3. Create three folders and put whatever files you want into each folder.
  4. Open up a Terminal window, navigate to the first folder, and run "ls -hal" to see if there's a .DS_Store file in it. If so, delete it.
  5. Navigate to the second folder via Terminal and check for a .DS_Store file. If one is in there that is larger than 0 bytes, delete it, then run "touch .DS_Store" to create one of 0 bytes.
  6. Navigate to the third folder via Terminal and, again, check for a .DS_Store file. If one is there and is larger than 0 bytes, leave it alone. If not, run "nano .DS_Store", type any gibberish you want, then save it.
  7. Copy the folders to your Samba share.
  8. Check the copied folders on the destination server. You'll note that the contents of the second folder (the one with the 0-byte .DS_Store file) did not copy at all, but Finder acted as though it did and gave absolutely no alert.
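
A minimal sketch of the setup in steps 3 through 6 as shell commands, run from a local scratch directory (folder and file names are illustrative):

  mkdir -p folder1 folder2 folder3
  echo data > folder1/file.txt          # folder 1: no .DS_Store
  echo data > folder2/file.txt
  touch folder2/.DS_Store               # folder 2: 0-byte .DS_Store
  echo data > folder3/file.txt
  echo gibberish > folder3/.DS_Store    # folder 3: non-empty .DS_Store

Then copy the three folders to the share with Finder (step 7) and inspect the results on the server (step 8).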

In summary, if a folder contains a 0-byte ".DS_Store" file, Finder will not copy any of the contents of that folder if the destination server is using the "veto files" parameter, but will behave as though it did.

The risk is that a user who isn't attentively checking that all data actually copied as intended can be lulled into thinking that all is well.

This issue does not happen when using other methods of file copy, such as rsync or Path Finder.
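
One way to cross-check a Finder copy, assuming a reasonably recent rsync on the client (the paths below are placeholders): a checksum-based dry run prints an itemized line for every file that would still need transferring.

  rsync -rcni ~/test/ /Volumes/share/test/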

I tested this on Ubuntu and TrueNAS using Samba versions 4.19.5 and 4.20.5 respectively, with macOS versions 14 through 15.5 as the client.

88
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/photoby_tj on 2025-05-30 21:58:56+00:00.


I’ve been a photographer for over a decade and have accumulated around 15TB of images, all spread across 12 external hard drives and dozens of Lightroom Classic catalogues. This includes everything: personal photos, professional shoots, travel, family, etc.

It’s been a bit of a “save everything, sort it later” approach, and now I’m facing the “later” part.

I have loads of catalogues (many need upgrading), each with 10k–50k photos inside. Some are organised, 99% aren't. I do have exported favourites saved for my website, but there are thousands more that I've forgotten about and would love to rediscover.

But the idea of manually opening and scrolling through dozens of 50,000-image catalogues makes my brain hurt.

So what’s the most efficient way to actually review and organise this? Merge catalogues? Use a tool like Photo Mechanic to batch preview?

Would love to hear from anyone who’s done large-scale digital cleanup / management before.

89
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/xEvilL_ on 2025-05-30 18:21:07+00:00.

90
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Neurrone on 2025-05-30 16:28:00+00:00.

91
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/AggravatingTear4919 on 2025-05-29 21:54:18+00:00.


Someone I know recently asked if I could share my entire collection with them. They're hesitant because their uncle did this and absolutely refused to share with anyone; he kept it all under lock and key. So would I share my data? The data I've been actively hoarding and collecting for 5+ years, while he gets it all in a matter of minutes? Abso-freaking-lutely. I'm hoarding this stuff TO potentially share, and he can act as a backup. He can spread the information I've collected to others and keep it alive.

92
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Afterlast1 on 2025-05-29 19:57:38+00:00.


I feel like I'm in the right place to ask this question - I have too many god damn hard drives! They've got all kinds of stuff on them: old school projects, ADHD hyperfixations, hundreds of gigabytes of raw photos. I've got hard drives that are backups of other hard drives, and at this point I don't know what's what. Does anyone here know of a process that can scan all the attached hard drives and highlight or ignore the duplicate files, so I can start clean, get organized, and only have, idk, maybe 3 full backups instead of half a dozen partial backups?
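
For what it's worth, a minimal sketch of the usual approach on a Linux box with GNU coreutils (the mount points are placeholders): hash every file and print the groups that share a digest, or let a dedicated tool such as fdupes do the same thing.

  find /mnt/drive1 /mnt/drive2 -type f -exec sha256sum {} + | sort | uniq -w64 --all-repeated=separate

  fdupes -r /mnt/drive1 /mnt/drive2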

93
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/dwhite21787 on 2025-05-29 19:34:26+00:00.

94
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/pokeyfortnite on 2025-05-29 17:38:51+00:00.

95
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/TheThingCreator on 2025-05-29 14:21:26+00:00.

Original Title: Pocket is Shutting down: Don't lose your folders and tags when importing your data somewhere else. Use this free/open-source tool to extract the meta data from the export file into a format that can easily migrate anywhere.

96
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/Confident_Bobcat5238 on 2025-05-28 23:33:28+00:00.


Anything I can reasonably do?

It's all null bytes 💀

Also, to confirm, this is the command I used to check whether TRIM was enabled:

fsutil behavior query DisableDeleteNotify
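
For context, output along the lines of the following means delete notifications (TRIM) are enabled:

  NTFS DisableDeleteNotify = 0

With TRIM active, the SSD has already discarded the deleted blocks and returns zeroes on read, which would explain the null bytes.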

97
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/cheater00 on 2025-05-28 09:20:45+00:00.

98
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/ploz on 2025-05-28 13:31:32+00:00.


120,000+ Historic Gaming Files to Find a New Home

Download.it, the trusted multilingual software download and review platform, announces the upcoming merger with FilePlanet.com, to be completed on May 29, 2025. Over 120,000 historic FilePlanet gaming files, including rare demos, mods, patches, and promotional materials, will be preserved and remain freely accessible through Download.it's infrastructure.

Originally founded in 1997 and previously operated by IGN Entertainment Inc. (Ziff Davis), FilePlanet served as an essential resource for gamers, modders, and enthusiasts for almost 28 years. Facing permanent closure, FilePlanet was acquired by Download.it to ensure these files, many unavailable elsewhere, could remain accessible to gaming communities around the world.

Download.it, established as a reliable destination for software, apps, and game downloads for Windows, macOS, and Android platforms, has always emphasized free and convenient access without registration barriers or fees. This merger furthers the platform's commitment to digital preservation, combining resources to create one of the largest free download archives online: over 500,000 files totaling nearly 30TB of content.

Key facts about the merger:

120,000+ historic gaming-related files saved from FilePlanet

Combined archive of 500,000+ files across both platforms

Nearly 30TB of preserved digital content

Free, no-registration-required access continues

Automatic redirects preserve all historic links

Starting May 29, users visiting original FilePlanet.com URLs will automatically redirect to equivalent pages at the new address, safeguarding decades of historic links and bookmarks.

Visit FilePlanet's new home starting May 29:

https://fileplanet.download.it/

About Download.it

Download.it is a multilingual software review and download portal, providing trusted, curated downloads for Windows, Android, and macOS users globally. Offering software, apps, games, utility tools, and now a historical gaming archive, Download.it serves millions of visitors with fast, reliable, and free downloads each month.

99
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/fordiem on 2025-05-27 20:11:53+00:00.


Hi all! Hoping I’m asking this in the right place — I’m part of a global video production team, and we’re currently looking for a long-term storage solution for our cold archive. I’m relatively new to NAS/storage infrastructure, so apologies if I misuse any terms!

We shoot a high volume of content each year — 2024 alone generated about 150TB of assets (footage, project files, etc.). We currently use a cloud-based platform for editorial and work-in-progress files, but need a physical, on-prem solution to store archived assets for the long haul.

Right now, we’re running:

  • 2 x QNAP TVS-1282T3 units (each with ~75TB)
  • Each connected to a QNAP DL-800C expansion (~110TB)
  • We’ll max these out by the end of 2025 once we finish archiving 2024

We’re looking for a new solution that can:

  • Store at least the next 2–3 years (so ideally 400–500TB total)
  • Be expandable as our needs grow
  • Function as cold storage — speed is less of a priority than reliability and scale
  • Be reasonably user-friendly (we’re a creative team, not full-time IT pros)
    • EDIT: We have an IT department! But unfortunately there's a lot of turnover in IT. The person who installed our existing QNAPs years ago was long gone by the time I started at my job; we begged IT to help us out since nobody knew how to access the units, but they said no / couldn't figure it out, so I had to learn how to use them myself. So I want to make sure the new setup would be easily understandable if/when someone takes over my job!

I’ve reached out to a few vendors (Synology, QNAP, SNS), and quotes so far have ranged anywhere from $40K to $100K, depending on the level of performance and scalability. That said, I’m wondering if there are better or more cost-effective options? Would something like a large DAS with 20–24TB drives work for us, or do we need to stick with the same/similar current NAS system? Is there anything better and expandable?

Would love any recommendations on setups, brands, or pitfalls to avoid. We’re in the process of cleaning up our archive — keeping only final exports and essential assets for older projects, but we aim to preserve the past two years of production in full, including all raw footage and project files.

Hoping to find the best path forward! Happy to clarify anything I’ve missed! :)

100
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/datahoarder by /u/vaaoid95 on 2025-05-28 02:55:54+00:00.
