This coming 7th June, the second edition of the course, the II Workshop on Data Visualisation for Healthcare Technicians and Scientific Journalism, will take place, in an effort to work jointly with tools that make health data more visible and user-friendly. You can register for the course, but first we would like to present a short report on the topics covered in last year’s edition.
The course was divided into two quite different parts: first, Eva Domínguez led a general discussion about digital journalism media, and second, Paula Guisado focused on procedures, tools and applications in the area of health data.
When we refer to digital journalism, we are referring to some of its emerging traits, for instance new narrative styles such as immersion, audiovisual development, adapting content to the most suitable format, and hybridisation.
Beyond these characteristics, we might be led to believe that going viral is a concept commonly associated with everything digital, but the questions remain: does everybody want to generate viral content, and does everybody have the capacity to do so?
Regardless of the objective, well-known success factors can be analysed and, when deemed appropriate, adapted to the desired objective and context. Recommendations in this area reference classic ideas such as emotion (being capable of generating an emotion in your audience), universality (a “universally” identifiable concept is likely to succeed) and brevity (eliminating superfluous elements in order to transmit the key message).
In practice, how can all this be achieved?
We can approach the idea of universality, for example, by trying to tell small stories that become big. As for generating emotion, the basic idea is to awaken empathy in the reader. From this point onwards, there is total freedom and creativity, along with a working proposal: we must question every technique in an effort to surprise the audience, without losing sight of the fact that “Content is King”. Not everything has to be interactive, but we do have to think carefully about what we want to explain and how we wish to go about it.
More ideas: interactive tools that enable us to identify ourselves work extremely well, whether the identification is quantitative or qualitative.
Another compelling element is to involve the audience in the story. How can we achieve this goal? The following strategies can be used:
• Transmedia / multiplatform. Confusion might arise as to whether the end product is a report, a database, a creative project, a project designed to raise awareness, activism or serialisation. The Spanish series El Ministerio del Tiempo, for example, has made the most of this.
• Serialisation. Fragmenting information into “chapters” or “instalments”. This can be addictive when done well. Example: the Serial podcast has managed to build a community of fans while presenting a journalistic investigation in weekly instalments.
• Creating an experience. Through navigation it is possible to establish a connection with the user, so that navigation itself becomes a factor of immersion. Example: Vice News on Ebola (Wired).
• Immersion through navigation (or immersion in the space). Interactive tools in which users place themselves inside the story. This is the case with a virtual-reality application that simulates being in Roman Tarraco.
• Letting the user participate and discover hidden elements. Play, the operative word here, with the fun element of the game… or with the fear element, as in Take this lollipop.
• Constructing a story within the story. Example: documentary film Mujeres en venta.
• Immersive narratives in the first person. The aim is to give visibility to large documentaries, in the format of the “docu-game”. Example: The refugee project.
• “Make it personal”. A close personal approach tends to work well. Example: Do not track regarding data privacy.
The second part of the course, which revolved around the applications to health data, got underway with a fascinating reflection: journalism with data is not data journalism (The Guardian 2011).
Massive analysis by computational means is the defining characteristic of data journalism. From this point onwards we can see specific patterns and tools:
• Data mining. Scraping tools such as Tabula, import.io and Kimono Labs
• Data visualisation tools: Adobe Edge, Tumult Hype, CartoDB, Datawrapper, Infogram, odyssey.js, juxtapose.js
• Data cleanup and transformation: Excel, OpenRefine
• Other tools: Tableau, Tableau Public, Quadrigam (in the beta phase at the time of the course)
• Final recommendations (unusual ones): Remove to improve, Spurious correlations
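To make the cleanup-and-transformation step above more concrete, here is a minimal sketch of the kind of normalisation that tools like Excel or OpenRefine perform, written in plain Python. The sample data, column names and the `clean_rows` helper are invented for illustration; they are not from the workshop materials.

```python
import csv
import io

# Invented sample of "messy" health data: stray whitespace, a thousands
# separator written as a space, a decimal comma, and a missing value.
raw = """region;cases ;Rate per 100k
 Barcelona;1 204;15,3
Girona;;4,1
Tarragona;310;7,8
"""

def clean_rows(text):
    """Normalise headers, whitespace, decimal commas and missing values."""
    reader = csv.reader(io.StringIO(text), delimiter=";")
    # "Rate per 100k" -> "rate_per_100k"
    headers = [h.strip().lower().replace(" ", "_") for h in next(reader)]
    rows = []
    for row in reader:
        record = dict(zip(headers, (cell.strip() for cell in row)))
        # "1 204" -> 1204; empty string -> None
        cases = record["cases"].replace(" ", "")
        record["cases"] = int(cases) if cases else None
        # decimal comma -> decimal point, e.g. "15,3" -> 15.3
        record["rate_per_100k"] = float(record["rate_per_100k"].replace(",", "."))
        rows.append(record)
    return rows

for r in clean_rows(raw):
    print(r)
```

The same transformations (trimming, type coercion, handling blanks) are exactly what OpenRefine exposes through its interface; scripting them is simply the reproducible alternative.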
We look forward to seeing you at the second edition of the course, which, like the first, aims to act as an incentive for innovation and professional development based on sharing knowledge and a range of tools among professionals whose objective is to collect the public’s health data in the best way possible.
You can also find the course information on the website of the Catalan Association of Scientific Communication, for both the 2015 edition and the 2016 edition.
Post written by Marta Millaret (@martamillaret) and Cristina Ribas (@cristinaribas), president of the Catalan Association of Scientific Communication (ACCC).
(Photo credit: dcJohn via Foter.com / CC BY)