The story of Apollo 13 is one of courage, innovation, and partnership, but it is also a story of data: data that was expertly backed up, archived, and stored with the help of the backup module available to the crew. At first glance, the connection between space exploration and data may seem unclear. But the truth is that without data there can be no missions into the unknown.
To put things in perspective, NASA (National Aeronautics and Space Administration) missions gather several hundred terabytes of data per hour. The premier space exploration and aeronautics research agency conducts dozens of missions, which means it is sitting on a goldmine of data.
Digital records of satellite launches and tracking, images of Earth, and observations of faraway galaxies can yield solutions to some of humanity's greatest challenges.
All these digital records therefore need to be indexed, stored, and processed so that scientists, software engineers, spacecraft engineers, and people across the world can leverage the data. This calls for solutions and strategies capable of handling extremely complex requirements for records retention, e-discovery, and security.
Storing fuel for a smooth, long journey
There are three major areas of managing data in the context of space missions: storage, processing, and access.
Storage is the most challenging of the three simply because of the enormous volumes involved. For instance, once the Square Kilometre Array (SKA) -- an array of thousands of telescopes in Australia and South Africa -- becomes operational, it is expected to produce 700 terabytes of data daily, roughly double the data that flows across the internet every day.
Another example is Dhruva Space, an Indian space-tech startup that builds small satellite systems for commercial and government use. Its satellites collect vast amounts of Earth-observation data from orbit and relay it back for businesses and industries to extract value from. In the face of this big data, technology leaders are moving away from deploying more hardware and toward innovative software solutions and cloud models that store and extract data more effectively.
Like a journey through space, data traverses an uncharted trajectory, whether across multiple clouds or containers (or both!). While multi-cloud and container technologies bring simplicity and scalability to data management, it is imperative to mitigate and minimize the risks of these uncharted territories through proactive data management.
Democratising data: high availability and accessibility
Providing seamless access to archival data is crucial. Online archives offer primary access to both raw and calibrated mission data. Scientists combine and analyze data sets across traditionally separate wavelength boundaries.
Therefore, it is important to consider where long-term archival data should be stored, and how it can be made sustainable and usable for the community.
Besides overcoming archiving issues, technology leaders in the field of space exploration must also deal with complexity. The Mars Reconnaissance Orbiter, launched by NASA to study the red planet's climate and geology, regularly sends images back to Earth, each containing 120 megapixels. Because this information demands advanced visualization, technology leaders at NASA's Jet Propulsion Laboratory are creating computer graphics, animations, and movies from the data sets.
To ensure engineers and scientists can easily use the data sets they receive from space, IT leaders are working to automate the process of creating visualization products, paving the way for further innovation.
Yet another aspect of big data is its availability -- making it easy for users to access whatever they need from the archives. The data archived by ISRO (Indian Space Research Organisation) from the instruments aboard Chandrayaan-1, its first mission to the Moon, has been extensively used to address questions in lunar science and to apply remotely sensed data to the study of lunar evolution.
Satellite data is an important source of information for climate activists and environmentalists. However, the possibilities go beyond viewing and comparing satellite images. By combining different images, one can derive critical information on soil hydration levels or the health of vegetation. Timely research in these areas can go a long way in protecting our environment.
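As one concrete illustration of combining image bands, consider the widely used Normalized Difference Vegetation Index (NDVI), which estimates vegetation health from the red and near-infrared (NIR) bands of a satellite image. This is a minimal sketch using NumPy, with made-up toy pixel values rather than real satellite data:

```python
import numpy as np

def ndvi(nir, red):
    """Compute NDVI per pixel: (NIR - Red) / (NIR + Red).

    Values near +1 suggest dense, healthy vegetation; values near
    zero or below suggest bare soil or water.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero where both bands are dark.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Toy 2x2 "image": top row vegetated (high NIR reflectance),
# bottom row bare soil (NIR close to or below red).
nir_band = np.array([[0.8, 0.7], [0.3, 0.2]])
red_band = np.array([[0.1, 0.2], [0.3, 0.4]])
print(np.round(ndvi(nir_band, red_band), 2))
```

In practice, the bands would be read from real imagery (e.g., Landsat or Sentinel-2 scenes) rather than hand-coded arrays, but the per-pixel arithmetic is the same.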
Therefore, it is imperative that this data is always available and accessible. The data management solution should eliminate single points of failure and guarantee continuous uptime over extended periods of operation.
Data analysis is the key to unlocking the universe's mysteries, and processing data in milliseconds is critical. Ultimately, it comes down to having a solution that guarantees the high availability of big data.
Towards failure-proof space exploration
The next few decades will see accelerating growth in the terabytes generated by humanity's exploration of space. The dawn of interplanetary missions, the deployment of faster communication technology, and the increasing pace of commercial satellite deployment will all contribute to this end.
The space industry is steadily moving in a direction where everything will be governed by digital connections to data, models, and software. This digitization is necessary to ensure all data collected is being exploited to the fullest.
To put an intelligent, resilient, and secure data solution in place, space exploration agencies, public or private, must align with IT vendors that can provide a state-of-the-art, future-proof solution with a 'no boundaries' architecture, allowing data to flow seamlessly to and from the cloud. Such a solution lets them control and mitigate risk from a single command center while protecting the data from attacks. As big data challenges grow bigger, space agencies will need to keep innovating in data processing, storage, and visualization (access) technologies.
To weather the incoming data storm, they must continue to align with a robust data management and protection partner and develop new strategies because, in space, failures aren't an option.
- Pradeep Seshadri is director, sales engineering at Commvault India and SAARC