CRI is co-constructing and sharing new ways of learning, teaching, conducting research and mobilizing collective intelligence in the fields of life, learning and digital sciences, in order to address the UN's sustainable development goals (SDGs).
CRI operates around four main areas:
CRI was founded in 2006 by François Taddei and Ariel Lindner, with the Bettencourt Schueller Foundation and Paris City Hall as essential supporting partners. It also benefits from the support of a wide range of foundations, corporate sponsors, and institutions, including the University of Paris, with which CRI co-founded the interdisciplinary action-based research challenge institute ("Institut des Défis") to prototype a model of a Learning University capable of responding to the global challenges of our time.
The defense will be online, using this link: https://u-paris.zoom.us/j/89886167970?pwd=RDdrSzFzQ2hpaVRNZkhqYktQSjBCdz09
The aim of this project was to understand how to democratize nucleic acid detection and how to harness it for citizen science and education: allowing anybody, anywhere, to perform and understand genetic detection simply, rapidly, and affordably, and to demystify, empower, educate, and inspire.
This interdisciplinary work both used and developed novel tools at the intersection of Molecular Biology, Citizen/Open Science, and Learning Sciences. We tested new high-throughput, low-volume, multi-parameter techniques to develop and optimize fluorescent isothermal nucleic acid amplification assays that are not only rapid, sensitive, and specific but also robust. We created a five-minute DNA extraction protocol that requires only water. Detection is performed with our ultra-affordable (less than $2), easy-to-build, open-hardware fluorescence detector.
This makes finding a specific fragment of DNA/RNA more accessible by reducing the cost of the reactions and instrumentation by at least an order of magnitude and halving the duration of the experiments compared to traditional PCR. The experiment is also simplified so that even untrained members of the public (ages 5 to 85) can successfully detect and see a gene with their own eyes. The hour of incubation time allows for deeper discussion, learning, and debate. The first use case is the detection of GMOs in food and feed.
This has all been packaged into a modular open Workshop/Lab that has been adapted to different audiences. The workshop has been run more than 25 times in 5 countries (France, UK, Switzerland, USA, Spain) with more than 400 participants, half of whom were K-14 students in the Paris region, the other half drawn from diverse groups such as researchers, biohackers/makers, and the general public. Pre/post-workshop questionnaires showed significantly improved understanding, empowerment, and motivation among participants.
As proof of the generality of this approach, during the COVID-19 pandemic these methodologies and the lessons learned were applied to the detection of SARS-CoV-2 in an open and collaborative way with partners around the globe, focusing particularly on solutions for low-resource settings, which may lack access to infrastructure and robust cold chains.
We believe this is an important step towards making nucleic acid detection accessible to a wider audience in an open, hands-on, learning-by-doing way. This powerful methodology could be used for a variety of other targets, such as interrogating the food we eat or searching for endangered, invasive, or pathogenic species.
The Jury:
Prof. Jim Haseloff, Professor of Synthetic Biology, Department of Plant Sciences, University of Cambridge (https://haseloff.plantsci.cam.ac.uk/), founder of OpenPlant and Biomaker
Dr. David Sun Kong, Director, Community Biotechnology Initiative, Research Scientist, MIT Media Lab (http://www.davidsunkong.com/)
Prof. Muriel Mambrini-Doudet, Head of the FIRE doctoral school, CRI/UDP/INRA
Dr. Amir Mitchell, Assistant Professor, University of Massachusetts (https://mitchell-lab.umassmed.edu/)
Dr. Fernan Federici, Assistant Professor, Pontificia Universidad Católica de Chile (https://federicilab.org/)
Tips for successful crowdfunding. This workshop is intended for club referees but is open to all CRI students, within the limit of available places (30 participants maximum), so remember to register in advance. The event is organized by the Student Life Coordination and the EphiScience Association team. Join us on 11 March at 6 PM on Google Meet: meet.google.com/zro-bccv-cqt
Perceptions and Movements in Collective Virtual Reality
Behavior is a fundamental property of living organisms. Individuals move in space, gather resources, mate, and form collective structures. Individuals produce an adapted response to their environment by perceiving external stimuli, e.g., the direction of the light or the positions of others, and internal stimuli, e.g., proprioception. The central problem of modeling is identifying functions that can predict an individual's behavior according to its perceived environment; a clear description of the environment is therefore critical. Recent advances in Virtual Reality (VR) allow us to investigate these questions by immersing individuals in a 3D virtual environment, where we can finely control each individual's visual field. This provides a unique opportunity to tackle vision in collective and individual behavior. I joined CRI last year to design a general platform for studying behavior through the networking and automation of VR systems, with two objectives in mind: (i) studying the relation between perception and movement, and (ii) providing an open platform for collective VR. I will present experiments in which people interact with a single object, discuss how this project has evolved with the ongoing pandemic, and outline the shape it will take in the future.
Tips to succeed when looking for funding. This workshop is intended for club referees but is open to all CRI students, within the limit of available places (30 participants maximum), so remember to register in advance. The workshop is organized by the Student Life Coordination and the HOME Association team. Join us on 31 March at 6 PM on Google Meet: meet.google.com/rua-rutn-mdg