We are proud to present one of our biggest Human Practices projects this year - The 6th SynBio Sense. Why 6th, you ask? Each of us learns about the environment through five senses: vision, hearing, smell, taste and touch. We wanted to dive deeper and introduce a sixth sense - seeing what is invisible to the naked eye: synthetic biology through the lens of Augmented Reality.
According to studies [1], science education in many parts of the world faces problems such as low student motivation and mostly passive learning that consists only of listening and writing. Research has also shown [2] that students with strong talent and interest in STEM often trace that interest back to extracurricular experiences in childhood. 'The 6th SynBio Sense' seeks to provide exactly that kind of experience - to connect with future researchers in new, technologically advanced ways.
It is no secret that innovative technologies profoundly impact how we perceive and learn new information [3]. At a time when the education sector must keep up with the newest trends in this area, it is essential to provide teachers with tools to excite their students.
However, our focus did not stop at the education of young pupils - the still-ongoing COVID-19 pandemic has heavily changed how adults think about life sciences. Given the public interest in how viruses infect people, how testing works and how vaccines are developed, we thought it important to build a strong base of biology knowledge for all age groups.
Our way of reaching this goal was to use the up-and-coming technology of Augmented Reality (AR) (Fig. 1). Like Virtual Reality (VR), this technology seeks to blur the line between the digital and physical worlds [3]. However, instead of creating a separate reality as VR does, Augmented Reality complements our environment - or, you could say, augments it - by adding digital elements on top of it. A 2019 study [4] concluded that using AR in mobile learning applications increased students' attention by 30.72% and their overall experience by 14.43%.
Figure 1. The 6th SynBio Sense in action
While there are many ways to implement Augmented Reality, for our web application we chose marker-based AR: a quick scan of a QR code and a custom-designed marker lets you experience the 6th sense of Synthetic Biology. The main idea was to create and implement 13 different 3D models depicting the main themes of life sciences, such as DNA, protein translation, GFP, etc. Each of these models is placed in a separate AR scene that can be opened in 3 quick steps, explained later in the ‘Usage’ section.
The objective was to place these 13 stops with QR codes and AR markers (Fig. 2) in several cities of Lithuania as well as to collaborate with other iGEM teams and their projects.
Figure 2. QR codes and AR markers on stickers
All 13 stops were distributed throughout the most popular places in each city, and every one of them was designed to be easy to use for anyone with a smartphone in their pocket. Only three steps are required to be fully immersed in an Augmented Reality scene (Fig. 3):
- Scan the QR code with your phone camera (or specific QR scanner app);
- Click ‘Allow’ in your browser when asked about camera usage permission;
- Scan the AR marker.
Figure 3. Three easy steps to experience Augmented Reality
After completing these three steps, users can examine the model. The next obvious step for us was to write a description for each model so that users could also read about what they are seeing. This was also important because studies have shown that some adults prefer textual information to visual [5]. To reach both tourists and locals who do not speak Lithuanian, each description is available in both English and Lithuanian.
To increase the usability of our website, we took several approaches. The site conforms to the Web Content Accessibility Guidelines (WCAG): a high colour contrast ratio ensures good text legibility, and vibrant call-to-action buttons are placed at the bottom of the mobile screen, where they are easier to press than the upper corners. We also took into consideration people with dyslexia and chose a typeface that improves readability [6].
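WCAG defines text legibility through a contrast ratio computed from the relative luminance of the foreground and background colours. As an illustration of this check, here is a minimal sketch of the WCAG 2.x formula (the colour values are examples, not our actual palette):

```python
def _linearize(channel: float) -> float:
    """Convert an sRGB channel in [0, 1] to linear light, per WCAG 2.x."""
    return channel / 12.92 if channel <= 0.03928 else ((channel + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """Relative luminance of an (R, G, B) colour with 0-255 components."""
    r, g, b = (_linearize(c / 255) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio of 21:1;
# WCAG AA requires at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

WCAG level AA asks for a ratio of at least 4.5:1 for normal text, which is the threshold a colour palette like ours would be checked against.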
Keeping in mind people with low vision and those who comprehend auditory information better, we included audio recordings about the 3D models in both languages. While most of these were recorded by our team member Liepa and instructor Povilas, we also invited some of Lithuania's most distinguished scientists - Dr. Urtė Neniškytė, Dr. Linas Mažutis and Prof. Aurelija Žvirblienė (Fig. 4). They were kind enough not only to help us record some of these informational texts but also to review them and make them even more understandable for the general public.
Figure 4. Lithuanian scientists who contributed to 'The 6th SynBio Sense' project - Urtė Neniškytė, Linas Mažutis and Aurelija Žvirblienė (from right to left)
Although the technology used to develop this project is supported by most modern browsers, there are a few exceptions - Google Chrome and Mozilla Firefox on iOS devices. These browsers do not allow the camera access required for marker detection. To solve this problem, we created an additional option (Fig. 5) that lets users examine the 3D models without the Augmented Reality implementation.
Figure 5. Accessibility and different features in our website
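As a sketch of how such a fallback could be triggered: Chrome and Firefox on iOS identify themselves with the user-agent tokens `CriOS` and `FxiOS` respectively, so a server or script could detect them and offer the plain 3D viewer. This helper is our illustrative assumption, not necessarily how our site implements the check:

```python
def needs_viewer_fallback(user_agent: str) -> bool:
    """Return True for browsers that cannot grant camera access to web
    pages on iOS: Chrome ("CriOS") and Firefox ("FxiOS"). For these,
    the plain 3D-model viewer should be offered instead of the AR scene."""
    return "CriOS" in user_agent or "FxiOS" in user_agent
```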
Another feature that seemed important in light of the recent pandemic was the ability to examine all the stops without leaving the house. In the main menu, you can find a link (Fig. 6) that lets you download all the markers used in the AR scenes (ideally printed out, though this is not necessary).
Figure 6. Feature that allows the user to examine all the stops without leaving the room
To spread our project as widely as possible, we decided to collaborate with various cities in our home country Lithuania, covering both densely and sparsely populated areas (Fig. 7). It was important for us to collaborate not only with the capital Vilnius, where we are from, but also with smaller cities, whose inhabitants have fewer cultural and educational resources in their hometowns.
Figure 7. Cities that collaborated with our team
Each of these cities could decide where to position the 13 stops. Most were placed in the central area of each city to attract as many people as possible. The most common mode of implementation was to print out hundreds of large stickers and place them throughout the city; one city, however, also put up stainless steel plates on benches. Spots like public transport stops, schools, cafés, etc. were chosen because they maximize the spread of the project.
‘The 6th SynBio Sense’ is also included in our other Human Practices project - ‘The (Un)hidden Code of Life’, a children's colouring book about life sciences (Fig. 8). The goal of combining these two learning activities was to create a shared learning environment for primary school children and allow them to interact with life sciences directly. Even though some of the information might be too advanced for this audience, we believe that seeing and analyzing these 3D models can be a great starting point for learning about science.
Figure 8. We incorporated the 3D models into the educational colouring book
We were also happy to collaborate with other iGEM teams to involve more people in this project. We participated in the International Postcard Project curated by the iGEM Team Düsseldorf. Our team created a postcard that describes our lab project, invites other teams to follow our journey until the Giant Jamboree and includes a sneak peek at ‘The 6th SynBio Sense’ project.
While meeting with the administrations of the cities where we wanted to implement our project, we received several requests to add a few more AR scenes capturing the link between biology and the city itself (famous scientists, local plants, etc.). From there, we thought: “Why don’t we create a tool that would allow anyone to create their own AR scene with no coding or 3D modelling skills?” And that is what we did.
The main inspiration for this additional project was to allow teachers of any subject to create their own Augmented Reality scenes with descriptions, so that their students could still experience engaging lessons even in the midst of a pandemic. Creating a scene generates a simple 6-digit code, which can later be sent to anyone and used on the same website with a default AR marker provided by us.
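A minimal sketch of how such a pair of codes could be generated - the 6-digit viewing code plus a separate, harder-to-guess edit code. The exact format of the edit code is our illustrative assumption:

```python
import secrets

def generate_scene_codes():
    """Return (view_code, edit_code) for a newly created AR scene.

    view_code: the short 6-digit code that is shared for viewing.
    edit_code: a longer random token kept by the creator for updates
               (hypothetical format - only the 6-digit code is fixed).
    """
    view_code = "".join(secrets.choice("0123456789") for _ in range(6))
    edit_code = secrets.token_urlsafe(8)
    return view_code, edit_code
```

Using a cryptographic source such as `secrets` (rather than `random`) matters here because the edit code is effectively a password for the scene.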
The creation of a scene involves several simple steps (you can find full instructions on our website):
- Downloading the model from 3D resource websites or creating your own with 3D modelling software;
- Converting or exporting (if you design it yourself) the model to .glb format;
- Uploading it to our website;
- Writing the title and description for your AR scene;
- Setting up the 3D model so that it is scaled, positioned and rotated correctly relative to the marker.
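The steps above boil down to bundling the uploaded .glb model with its title, description and transform relative to the marker. A minimal sketch of such a scene record (the field names are illustrative assumptions, not our actual schema):

```python
import json

def build_scene_record(title, description, glb_url,
                       scale=1.0,
                       position=(0.0, 0.0, 0.0),
                       rotation=(0.0, 0.0, 0.0)):
    """Serialize everything an AR scene needs: the model file plus its
    scale, position and rotation relative to the default marker."""
    if not glb_url.lower().endswith(".glb"):
        # Step 2 of the instructions: the model must be in .glb format.
        raise ValueError("model must be in .glb format")
    return json.dumps({
        "title": title,
        "description": description,
        "model": glb_url,
        "transform": {
            "scale": scale,
            "position": list(position),
            "rotation": list(rotation),
        },
    })
```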
When finished, the user receives a code that they can later send to friends, pupils - or anyone else. The user also receives an additional code for later updating the scene's information or the 3D model's setup.
This platform is quick and easy to use because it requires no login and no IT or 3D modelling skills. We hope that future iGEM teams will use it to improve their Human Practices and especially Education projects, since it is a fast and exciting new way to pass on their knowledge.
The Vilnius-Lithuania iGEM team strongly believes that the best way to spread the word about synthetic biology and life sciences in general is to share the technical implementation of projects like this one. ‘The 6th SynBio Sense’ is made up of three main parts:
- 3D model design;
- Augmented Reality implementation in a web app;
- Markers for the Augmented Reality scenes.
Visually compelling learning material makes the process considerably more effective and can spark interest in the subject. Yet to communicate science in particular, the information provided has to be not only interesting but also strictly accurate. Therefore, we chose 3D models over 2D art, because this format provides much more insight and helps learners understand the material better, all without losing visual appeal.
In the beginning, we wanted to use 3D models from popular databases, such as Sketchfab. But we quickly realized that there was a lack of visuals specifically tailored to educating the public about science: models were either overly simplified or, in contrast, only available in databases used for protein mapping and complex calculations. Therefore, a few of our team members delved into creating 3D models from scratch. To make the models more accurate, we used some protein models from those databases. Using the open-source software Blender and combining our knowledge of science, animation and graphic design, we were able to produce high-quality, informative models and use them to augment reality.
Our main requirement while developing this project was to create a web app rather than a native app that has to be downloaded from stores like Google Play or the App Store. This was important to us because it allows users to access ‘The 6th SynBio Sense’ directly from their default browser instead of spending time downloading an app. To do this, we had to research WebAR extensively, since it is a relatively new technology compared to Augmented Reality in native applications.
Another problem was the lack of well-documented examples or existing platforms that we could use for free. Applications like Pokémon Go and Snapchat drove the rapid development of Augmented Reality in native apps, which therefore enjoy mature tooling and documentation. WebAR, on the other hand, is only now gathering momentum. When we started developing this project, we tried numerous examples and modes of implementation, yet most of them did not work on many modern browsers or operating systems and were poorly documented.
To create robust markers easily recognized by any device camera, we had to take into consideration the black-to-white (B:W) ratio, edge sharpness and, especially, the complexity of the inner illustration design. While designing the inner markers, we discovered that an asymmetrical illustration significantly increases a marker's detectability, letting users scan the marker successfully from any angle.
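Both properties can be checked mechanically. Treating a marker as a grid of black and white cells, here is a sketch of the B:W ratio and a 180°-symmetry test - our illustrative formalization, not the tooling we actually used:

```python
def black_to_white_ratio(bitmap):
    """Given a marker as a 2D grid of 0 (white) and 1 (black) cells,
    return the black:white ratio considered when judging robustness."""
    black = sum(sum(row) for row in bitmap)
    total = sum(len(row) for row in bitmap)
    white = total - black
    return black / white if white else float("inf")

def is_rotation_ambiguous(bitmap):
    """An illustration that looks identical after a 180° turn gives the
    tracker no orientation cue, so asymmetric designs scan better."""
    rotated = [row[::-1] for row in bitmap[::-1]]  # 180° rotation
    return rotated == bitmap
```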
Each AR scene has its own simplified illustration, which reflects what the user can expect from the marker. To attract people's attention, we created colourful stickers with QR codes and AR markers (Fig. 9), each with an intriguing title and a fun short fact.
Figure 9. Augmented Reality markers and colourful stickers
1. Kaptan, K. & Timurlenk, O. Challenges for Science Education. Procedia - Social and Behavioral Sciences 51, 763–771 (2012).
2. VanMeter-Adams, A., Frankenfeld, C. L., Bases, J., Espina, V. & Liotta, L. A. Students who demonstrate strong talent and interest in STEM are initially attracted to STEM through extracurricular experiences. CBE Life Sciences Education 13, 687–697 (2014).
3. Kiryakova, G. The Immersive Power of Augmented Reality. In Human-Computer Interaction [Working Title] (2020). doi:10.5772/intechopen.92361
4. Khan, T., Johnston, K. & Ophoff, J. The Impact of an Augmented Reality Application on Learning Motivation of Students. Advances in Human-Computer Interaction 2019, 7208494 (2019).
5. Vera, F., Sánchez, J. A. & Cervantes, O. Enhancing User Experience in Points of Interest with Augmented Reality. International Journal of Computer Theory and Engineering 8, 450–457 (2016).
6. Rello, L. & Baeza-Yates, R. Good Fonts for Dyslexia. Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, 14:1–14:8 (2013). doi:10.1145/2513383.2513447