Interpreting deep-sea images with artificial intelligence
Explaining how the workflow operates, Dr. Timm Schoening says: “Over the past three years, we have developed a standardized workflow that makes it possible to scientifically evaluate large amounts of image data systematically and sustainably.”
He adds: “All this information has to be linked to the respective image because it provides important information for subsequent evaluation.” The autonomous underwater vehicle ABYSS, for example, collected more than 500,000 images of the seafloor over around 30 dives, creating a large amount of data for researchers to sift through. The artificial intelligence can take an array of separate images and combine them to form larger maps of the seafloor.
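To illustrate the kind of linking Schoening describes, the sketch below attaches the nearest navigation record (position and depth by timestamp) to each image entry. It is a minimal, hypothetical example rather than the published workflow: the file names, column names and the one-second matching tolerance are all assumptions.

```python
# Hedged sketch: link AUV navigation data to seafloor images by timestamp,
# so each image carries the context metadata needed for later evaluation.
# File names, column names and the 1 s tolerance are illustrative assumptions.
import pandas as pd

def link_images_to_navigation(image_index_csv: str, nav_csv: str) -> pd.DataFrame:
    """Attach the nearest navigation fix (by time) to every image record."""
    images = pd.read_csv(image_index_csv, parse_dates=["timestamp"])  # assumed columns: filename, timestamp
    nav = pd.read_csv(nav_csv, parse_dates=["timestamp"])             # assumed columns: timestamp, latitude, longitude, depth_m

    # merge_asof requires both frames to be sorted by the join key
    images = images.sort_values("timestamp")
    nav = nav.sort_values("timestamp")

    # For each image, take the closest navigation record within one second
    return pd.merge_asof(
        images, nav, on="timestamp",
        direction="nearest", tolerance=pd.Timedelta("1s"),
    )

if __name__ == "__main__":
    linked = link_images_to_navigation("abyss_images.csv", "abyss_navigation.csv")
    print(linked[["filename", "latitude", "longitude", "depth_m"]].head())
```

Once every image is tied to a position in this way, adjacent images can be placed next to one another geographically, which is what makes larger seafloor maps possible.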
The workflow itself consists of three steps: data acquisition, data curation and data management. During acquisition, the system specifies how the camera is set up, which data are to be captured and what lighting levels are used, so that a particular scientific question can be answered.
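A rough sketch of how those three stages might be expressed in code follows. The stage names come from the article, but the classes, fields and functions here are illustrative assumptions, not the published implementation.

```python
# Hedged sketch of the three workflow stages named in the article:
# acquisition -> curation -> management. All names below are assumptions.
from dataclasses import dataclass, field

@dataclass
class AcquisitionPlan:
    """Capture settings chosen to answer a specific scientific question."""
    scientific_question: str
    camera_setup: str          # e.g. "downward-facing camera at fixed altitude"
    data_to_capture: list[str] = field(default_factory=lambda: ["images", "navigation"])
    lighting_level: str = "constant strobe"

def acquire(plan: AcquisitionPlan) -> list[str]:
    """Stage 1: record raw images and context data according to the plan."""
    print(f"Acquiring {plan.data_to_capture} for: {plan.scientific_question}")
    return ["dive_001/img_0001.jpg"]  # placeholder for raw files

def curate(raw_files: list[str]) -> list[str]:
    """Stage 2: quality-check the raw data and keep only usable images."""
    return [f for f in raw_files if f.endswith(".jpg")]

def manage(curated_files: list[str]) -> None:
    """Stage 3: archive curated data so it stays findable and reusable."""
    print(f"Archiving {len(curated_files)} curated images")

if __name__ == "__main__":
    plan = AcquisitionPlan(
        scientific_question="Map seafloor coverage along AUV tracks",
        camera_setup="downward-facing camera at fixed altitude",
    )
    manage(curate(acquire(plan)))
```

Structuring the pipeline this way keeps the capture settings, the quality checks and the archiving step explicit, which is the point of defining the workflow before the data is collected.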
The new application is described in the journal Scientific Data, in a paper titled “An acquisition, curation and management workflow for sustainable, terabyte-scale marine image analysis.” The image data comes from a dataset published in PANGAEA, titled “Seafloor images and raw context data along AUV tracks during SONNE cruises SO239 and SO242/1.”