The increasing volume, diversity, and role of data in modern research have been very fruitful. However, these same factors have also made it harder to describe, in sufficient detail, the processing behind a scientific result within the confines of a traditional paper. It is thus becoming harder and harder to reproduce the results (i.e., to have them critically reviewed by coauthors, referees, or the larger community) that define scientific progress. In this talk, Maneage (MANaging data linEAGE) is introduced as a working solution to this problem.

Maneage is a template that is customized for every project. It enables the exact reproduction of a scientific analysis, from the input data and software, to the processing and the creation of the final report, paper, or dataset. The necessary software is built with a predefined configuration (from the low-level C compiler and shell, up to the higher-level science programs and all their dependencies). The data are imported from predetermined URLs (and validated against recorded checksums). The software is then run on the input data sets to produce the final result. The template finally produces a "dynamic" PDF using LaTeX macros: any change in the analysis automatically updates the relevant parts of the PDF (for example, numbers, tables, or figures).

A project defined in this template is fully managed and published in plain text, occupying only a few hundred kilobytes (unlike multi-gigabyte binary blobs such as containers). It is thus easy to publish, for example on arXiv with the paper's LaTeX source, or on Software Heritage, giving readers the ability to exactly study and reproduce the paper's results over the long term. It is also easily searchable, providing a treasure trove of metadata on the project (even after publication, and without the author's active involvement). This can be very valuable when implemented widely (e.g., using machine learning on many project sources to define automatic workflows). Most importantly, it allows other scientists to independently study the details, verify them in practice, and build incrementally on each other's work, without necessarily needing to run it.
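
As a minimal sketch of this macro mechanism (the file name project.tex, the macro names, and the values below are hypothetical illustrations, not necessarily Maneage's exact interface), the analysis can write each measured value as a LaTeX macro into a small file that the paper's source then inputs:

    % project.tex -- written automatically at the end of the analysis
    % (file name, macro names, and values are hypothetical examples).
    \newcommand{\numgalaxies}{1337}    % a measured count
    \newcommand{\bestfitslope}{2.45}   % a fitted parameter

    % paper.tex -- the macro file is input once, then used in the text.
    \documentclass{article}
    \input{project.tex}
    \begin{document}
    We detect \numgalaxies{} galaxies, with a best-fit slope of \bestfitslope{}.
    \end{document}

Because the numbers in the text are macros rather than hand-typed literals, rerunning the analysis regenerates the macro file, and rebuilding the PDF picks up the new values without any manual editing.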