2017-02-13 - Questions 7 (Dylan)
# MSS Codex 0877 and Linked Data
In the 2014 report by Smith et al., I noticed that the transcription mentioned a [hole in the substrate between a and q prior to the moment of writing]. This led me to reflect on all the minute details necessitated by a study like this, and how difficult it is to translate those details into machine-readable data. My question then was: How do we mark up (and standardize the markup of) a detail like a hole in the substrate that existed prior to writing?
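To make the problem concrete, here is a minimal sketch of how such a detail might be encoded in TEI-style XML. The element and attribute names are loosely modeled on TEI conventions (`<space>`, `<desc>`, `<note>`), but whether the Guidelines sanction exactly this usage for a pre-existing hole is my assumption, not something from the reading:

```xml
<!-- Hedged sketch: a hole in the substrate between "a" and "q",
     present before the scribe wrote. Element/attribute choices are
     illustrative, not a confirmed TEI-conformant encoding. -->
a<space unit="mm" quantity="4">
  <desc>hole in the substrate prior to the moment of writing</desc>
</space>q
```

Even in this tiny example the standardization question is visible: should "prior to writing" be free text inside `<desc>`, a controlled attribute value, or a separate element? Each choice affects how machine-comparable the data is across projects.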
After the Linked Data reading, my question was: Do we need to? This has more to do with what we’re trying to accomplish in the field. I think Elaine has had some very apt questions about the purpose of digital humanities; I get very excited about the prospect of interoperability and cross-platform tools, but there’s also the importance of what we do with those tools. The reading also seemed to suggest that XML provides too much flexibility, to the point where it’d be difficult to standardize (a balance we’ve discussed before). A question from this reading that interested me was:
“How can a system process information without regard to its meaning and simultaneously generate meaning in the experience of its users?”
This is essentially what we’re trying to do; we’re trying to have all the advantages of machine-readable data (quick processing and analysis; quantifiability, flexibility, accessibility) along with the advantages of traditional humanities research (“control, provenance, transparency, reproducibility, and all the other elements of good research”). Is this a correct assessment? (Have I been repeating this sentiment?)
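The quoted tension can be illustrated with the same codicological detail expressed as Linked Data. In the RDF/XML sketch below, all URIs and property names are invented for illustration; the point is that a machine can merge and query these statements purely structurally, while the "meaning" of a term like SubstrateHole only materializes for the human reading the vocabulary:

```xml
<!-- Hedged sketch: the hole as Linked Data (RDF/XML).
     Every URI and vocabulary term here is hypothetical. -->
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ex="http://example.org/ms-vocab#">
  <rdf:Description rdf:about="http://example.org/codex0877/hole1">
    <rdf:type rdf:resource="http://example.org/ms-vocab#SubstrateHole"/>
    <!-- a machine can match this property across datasets
         without ever "understanding" what writing is -->
    <ex:priorToWriting>true</ex:priorToWriting>
    <ex:betweenCharacters>a, q</ex:betweenCharacters>
  </rdf:Description>
</rdf:RDF>
```

A system aggregating descriptions like this from many manuscript projects processes the triples without regard to their meaning; the meaning is generated in the experience of the scholar who queries them.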
Also, could we go over the difference between Linked Data and Semantic Mapping?