
Sometimes it's hard to fathom the vast quantities of data that make up our digital reality. It's just as hard to wrap your head around how much data a single observatory can gather while monitoring outer space. The Vera C. Rubin Observatory, located in Chile, takes around 1,000 pictures of the night sky every night.
The New York Times recently published an interview with team members from the observatory, who confirmed that each photo taken by the telescope's camera is composed of 3.2 billion pixels, each one representing one of 65,536 shades of gray. The average image contains approximately 6.4 GB of data.
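Those numbers check out as a quick back-of-the-envelope calculation, assuming each pixel is stored as 16 bits (since 65,536 = 2^16) and using decimal gigabytes:

```python
# Rough sanity check on the per-image size (assumes 16-bit pixels, decimal GB)
pixels_per_image = 3_200_000_000          # 3.2 billion pixels per exposure
bytes_per_pixel = 2                       # 65,536 gray levels = 2**16, i.e. 16 bits
bytes_per_image = pixels_per_image * bytes_per_pixel
print(bytes_per_image / 1e9, "GB")        # 6.4 GB, matching the article's figure
```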
Images are automatically sent to nearby servers, freeing the telescope to move to another area and take additional photos. The plan is to observe the same regions repeatedly for at least a decade. The team expects the project to generate an enormous amount of data, accumulating a minimum of roughly 60,000 TB (60 petabytes) by the end.
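Scaling that per-image size up gives a feel for where the survey-wide total comes from. The sketch below assumes roughly 1,000 exposures per night and about 3,000 observing nights over the decade; both are ballpark assumptions rather than figures from the observatory:

```python
# Back-of-the-envelope survey volume (assumed figures, not official ones)
gb_per_image = 6.4
images_per_night = 1_000
survey_nights = 3_000                        # ~300 usable nights/year for ten years (assumption)

tb_per_night = gb_per_image * images_per_night / 1_000
raw_survey_tb = tb_per_night * survey_nights

print(f"{tb_per_night:.1f} TB per night")    # ~6.4 TB of raw images nightly
print(f"{raw_survey_tb:,.0f} TB raw total")  # ~19,200 TB of raw imagery over the survey
```

Raw exposures alone land in the tens of petabytes; the 60,000 TB minimum presumably also counts processed images, calibration data, and catalogs, which the article doesn't break down.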
To handle all of that data, a data center was constructed at Rubin, designed to store at least one month's worth of data locally in the event of unexpected network issues. This provides the redundancy needed to protect their work. The team can also run preliminary analysis on site at the observatory before the data is sent off for further processing.
The data is transmitted to the SLAC National Accelerator Laboratory, a U.S. Department of Energy laboratory in Menlo Park, California. Roughly 60 miles of fiber optic cable connect the observatory's data center to the city of La Serena in Chile, the first leg of the data's journey to the research center. At SLAC, new images are compared to older ones taken of the same region of space to look for any significant changes. Any variances are then isolated, highlighted, and studied in greater depth.
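The core idea behind that comparison step is difference imaging: subtract an earlier reference image from the new one and flag pixels that change significantly. The sketch below only illustrates that idea with NumPy; the real Rubin pipeline involves far more work (alignment, PSF matching, artifact rejection), and the function and array names here are made up for the example:

```python
import numpy as np

def find_changes(new_image: np.ndarray, template: np.ndarray, n_sigma: float = 5.0):
    """Flag pixels in new_image that differ from an older template by more
    than n_sigma standard deviations of the difference image."""
    diff = new_image.astype(np.float64) - template.astype(np.float64)
    threshold = n_sigma * diff.std()
    candidates = np.argwhere(np.abs(diff) > threshold)   # (row, col) of changed pixels
    return diff, candidates

# Tiny 16-bit frames standing in for 3.2-gigapixel exposures
rng = np.random.default_rng(0)
template = rng.integers(1000, 1100, size=(512, 512), dtype=np.uint16)  # flat "sky"
new = template.copy()
new[100, 200] = 60000                     # inject a fake transient source
_, candidates = find_changes(new, template)
print(candidates)                         # [[100 200]] -- the injected change
```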
While the team currently estimates the final total will land at around 60,000 TB of data, plenty of factors could push that number higher; they suggested it could reach as much as 500,000 TB. Dr. O’Mullane, the observatory's associate director of data production, expects AI to make it easier for astronomers to analyze data at this scale.

Ash Hill is a contributing writer for Tom's Hardware with a wealth of experience in hobby electronics, 3D printing, and PCs. She manages the Pi projects of the month and much of our daily Raspberry Pi reporting while also finding the best coupons and deals on all tech.