A major blunder is often a sure way to make headlines. That was the case for Kyoto University in Japan, whose supercomputer wiped out a total of 77 terabytes of valuable research data.
The suspected culprit is a faulty script designed to remove old and unnecessary data from the machine. Instead of unnecessary data, the script deleted 34 million highly relevant files, Techspot writes.
Perhaps the only silver lining is that researchers had initially feared the supercomputer destroyed as much as 100 terabytes of data, since all files older than 10 days had been marked for deletion.
The supercomputer's supplier, Hewlett Packard Enterprise Japan (HPE), took full responsibility for the blunder and sent a letter of apology to the university.
Kyoto University has not revealed in detail what kind of data was involved, but some of the files are unrecoverable and thus lost for good.
Source: Tivi by www.tivi.fi.