Brice Brown, Stuttgart High School, Germany
Mentor: Jason Baer; Daniel Coapstick, Stuttgart High School
Dominant feeding nature of Cnidarian-dinoflagellate symbiosis and effects of minerals on the coral model Aiptasia’s growth and health
Coral reefs have been suffering in recent years. One likely cause of this decline is the poor health of ecosystem engineers such as cnidarians (corals and sea anemones) and sponges in an already nutrient-poor environment. Understanding the preferred feeding mode of symbiotic cnidarians and the surrounding mineral composition can help reinvigorate and improve tropical reefs, which hold a significant amount of biota. This experiment seeks to reveal the optimal feeding regime and mineral supplements using Aiptasia pallida, a sea anemone commonly used in coral research. Other researchers in the field will be able to use these data to grow their Aiptasia efficiently and so ease coral research. Light levels were tested on five regimes: always-on, always-off, and eight-, twelve-, and sixteen-hour light cycles, with intermittent addition of brine shrimp to stimulate heterotrophic feeding. Iron, calcium, nitrate, and phosphate were chosen to supplement a twelve-hour light-dark cycle complemented by brine shrimp feeding, which provides a more realistic environment without introducing confounding variables like those in a mesocosm.
VIDEO
Thomas Thompson Arjona, Sigonella High School, Sicily
Mentor: Marsha McCauley, Sigonella High School
Aircraft Reliability in an Event of Engine Failure
This project examines how long and how far a plane can fly if its engine were to shut down, across different weather conditions and different airplanes. Testing this in real life is unrealistic, so professional simulation software that accurately depicts real-life situations was used to complete the experiment. The Boeing 737-800 and the Airbus A320 were tested in clear, rainy, and snowy weather conditions. Overall, statistical analysis of the collected data shows that Boeing aircraft are slightly safer in engine-failure emergencies. In each weather category, the Airbus does remain airborne longer on average than the Boeing, but by a very small margin; the Boeing, on the other hand, flies considerably farther than the Airbus after an engine failure.
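The distances involved can be sanity-checked with a standard glide approximation: with the engine out, still-air glide range is roughly altitude times the aircraft's lift-to-drag ratio. A minimal sketch (the altitude and L/D values here are illustrative assumptions, not the simulator settings used in the project):

```python
# Rough still-air glide range: range ≈ altitude × (L/D).
# Values below are illustrative assumptions, not measured data.

def glide_range_km(altitude_m: float, lift_to_drag: float) -> float:
    """Approximate distance an unpowered aircraft can glide, in km."""
    return altitude_m * lift_to_drag / 1000.0

cruise_altitude_m = 10_668   # ~35,000 ft, a typical cruise altitude
airliner_ld_guess = 17       # rough clean-configuration L/D for a jet airliner

print(glide_range_km(cruise_altitude_m, airliner_ld_guess))  # ≈ 181 km
```

Even small differences in effective L/D between airframes compound into large differences in glide distance, which is consistent with the distance gap reported above.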
VIDEO
Kenneth Colin Roedl, Stuttgart High School, Germany
Mentor: Kamisha Roedl, Kenneth W. Roedl; Daniel Coapstick, Stuttgart High School
“I Don’t See Color”: An Analysis of Racial Diversity within Prime Time Television
Racial representation on television has been changing ever since television was invented. This study aims to do three things: 1) track the trends television has set since the 1960s and project what the future of television might look like; 2) identify audience opinions on the growing diversification of television, as well as what different racial groups believe about the past, present, and future of diversity on television; and 3) understand the impact that diversity (or the lack thereof) on television has had on viewers, based on their perception of past shows and their self-confidence after watching television. Building on the research of Riva Tukachinsky and Dana Mastro, this project focuses more on the presence of people of color on TV since the medium's start and less on their portrayal. The study looks to answer two questions: “Is diversity within prime-time television becoming more mainstream?” and “What are prime-time television viewers’ opinions on diversity within prime-time television?” This was done through an examination of shows airing in prime time on ABC, NBC, and CBS. The survey also shows that minorities, more than Caucasians, agree that representation is necessary, and that they felt underrepresented on television as children, while white respondents believe they were represented on television as children and are still represented now. The research could serve as a blueprint for further studies into racial portrayal and the future of racial representation on television, and could be extended to other countries to point out differences in television representation.
VIDEO
Marta Laatsch, Kaiserslautern High School, Germany
Mentor: Ken Robinson, Kaiserslautern High School
Effects of Changes in Temperature on the Speed of Sound
Sound travels as waves of compression through a medium. Changes in the medium, such as density and temperature, can affect the speed of sound. Using a device whose distance calculation assumes a fixed speed of sound, researchers can use changes in the measured length of a fixed distance to determine the change in the speed of sound. This experiment used changes in the measured length of a tube over the course of a high-altitude balloon launch to measure the correlation between the temperature of the medium and the speed of sound when other factors are held constant. The experiment supports the idea that the speed of sound decreases as temperature decreases, and vice versa, as predicted by the kinetic molecular theory of gases.
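The kinetic-theory prediction the experiment tests can be written as v = sqrt(γRT/M) for an ideal gas. A short sketch of that relation, using standard-atmosphere constants rather than the balloon's actual sensor data:

```python
import math

# Ideal-gas speed of sound: v = sqrt(gamma * R * T / M).
GAMMA = 1.4          # adiabatic index of dry air
R = 8.314            # J/(mol·K), universal gas constant
M_AIR = 0.028965     # kg/mol, molar mass of dry air

def speed_of_sound(temp_kelvin: float) -> float:
    """Speed of sound in dry air (m/s) at the given absolute temperature."""
    return math.sqrt(GAMMA * R * temp_kelvin / M_AIR)

print(round(speed_of_sound(288.15), 1))  # sea level, 15 °C    -> ~340.3 m/s
print(round(speed_of_sound(216.65), 1))  # stratosphere, -56.5 °C -> ~295.1 m/s
```

The square-root dependence on temperature is exactly the decrease-with-cooling behavior the balloon measurements support.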
VIDEO
Sophie Hoffman, Stuttgart High School, Germany
Mentor: Daniel Coapstick, Stuttgart High School
A further analysis of the white dwarf binary candidates (optical counterparts) of x-ray sources in the Omega Centauri globular cluster
This research further analyzes the 18 white dwarf binary candidates in the globular cluster Omega Centauri (NGC 5139) previously identified in the study “Examining x-ray sources to detect possible optical candidates/counterparts in the Omega Centauri globular cluster”. The astronomical tools VizieR, Aladin, SIMBAD, Gaia, and Chandra facilitated the classification of the white dwarf binary candidates. Additionally, the spectral energy distribution graphs in the VizieR photometry viewer showed the energy emissions of each object within a 2 arcsec radius of the white dwarf binary candidates.
VIDEO
Arman Markarian, Brussels High School, Belgium
Mentor: Amy Parlo, Brussels High School
Comparing Visual Quality and Efficiency in Hardware and Software Video Encoding Implementations
Today, humanity is consuming and streaming more video and other digital media than ever before. For this reason, it is important to be as efficient as possible in media production, and part of achieving this goal is streamlining the process of video encoding. Video encoding, in the context of modern media, is the process of preparing a video for consumption by making it meet the specifications of different formats or playback devices. H.264 is widely regarded as the leading video coding standard, or codec, because of its visual quality and broad availability. However, there is debate as to whether H.264 should be implemented using hardware encoders or software encoders. Hardware encoders, like Nvidia’s NVENC or Intel’s Quick Sync Video, make use of dedicated circuitry on a computer’s GPU (graphics processing unit), whereas software encoders, like the open-source x264, run on the CPU (central processing unit). This study compared three encoders on six sample videos to determine which method of encoding is the most effective and efficient in terms of speed, system resource use, and visual quality. System resource use was measured as the average amount of RAM (random access memory) used by each encoder for each video, and visual quality was measured using Netflix’s perceptual video quality assessment algorithm, VMAF. The researcher hypothesized that hardware encoders encode video at higher quality and faster speed, with far fewer computational resources, than software encoders. The results support this hypothesis: Nvidia’s NVENC and Intel’s QSV, both hardware encoders, outperformed x264, a software encoder, on every point of comparison.
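As an illustration of how such a comparison can be run, the same source can be encoded with each implementation via FFmpeg and scored with the VMAF filter (assuming an FFmpeg build with NVENC, Quick Sync, and libvmaf support; file names are placeholders):

```shell
# Encode the same source with one software and two hardware H.264 encoders.
ffmpeg -i sample.mp4 -c:v libx264    -b:v 5M out_x264.mp4   # software (CPU)
ffmpeg -i sample.mp4 -c:v h264_nvenc -b:v 5M out_nvenc.mp4  # Nvidia hardware
ffmpeg -i sample.mp4 -c:v h264_qsv   -b:v 5M out_qsv.mp4    # Intel hardware

# Score an encode against the original with Netflix's VMAF metric.
ffmpeg -i out_x264.mp4 -i sample.mp4 -lavfi libvmaf -f null -
```

Holding the target bitrate constant across encoders, as above, is what makes the VMAF scores directly comparable.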
VIDEO
Lauren Schmidt, Stuttgart High School, Germany
Mentor: Daniel Coapstick, Stuttgart High School
Examining The Role Of Race In The Cook County Criminal Justice System
The criminal justice system of the United States is increasingly affected by racial bias in all aspects, from charge and arrest to the trial process to incarceration and recidivism rates. This study examines the trial process, and judges in particular, in Cook County (Chicago), Illinois, chosen for its high crime rates. Through a set of statistical analyses of disposition and sentencing data from the Cook County Courthouse, it is shown that Black defendants and convicts are disproportionately represented relative to White defendants and convicts. Additionally, White judges greatly outnumber Black judges overall. Placing the results of the analyses and statistical tests in a broader context shows that judges, juries, and trial courts are only some of the many components that determine racial bias in a city’s criminal justice system, and that while an abnormal bias is apparent, the results of this study alone cannot definitively hold the Cook County Courthouse responsible for the observed bias.
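One standard way to test whether a disproportion like this is statistically significant is a two-proportion z-test on conviction counts. A sketch in pure Python (the counts below are made-up placeholders, not the Cook County figures):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: convictions out of defendants for two groups.
z, p = two_proportion_z(720, 1000, 640, 1000)
print(round(z, 2), round(p, 4))
```

A small p-value indicates the gap in conviction rates is unlikely under chance alone, though, as the abstract notes, it cannot by itself identify where in the system the bias arises.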
VIDEO
Ian Kirkpatrick, Lakenheath High School, United Kingdom
Mentor: Anita Lang, Lakenheath High School
The Utilization of an Object-Oriented Approach in the Creation of a Multi-layered Perceptron
Much research has been done recently on deep learning. Most of it concerns the potential applications of deep learning and the different architectures it can use, but very little research focuses on the approach to creating the neural network itself. This project investigates the efficacy of an object-oriented approach to the creation and use of multi-layered perceptrons. While the object-oriented approach succeeds in making the movement from input to output through the neural network more understandable, it fails when the programmer attempts to train the network. Because the nature of object-oriented programming removes the programmer from low-level calculations, access to important values becomes very difficult when stochastic gradient descent is applied. For this reason, the use of object-oriented programming in creating a multi-layered perceptron is not recommended.
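A minimal sketch of the object-oriented pattern described here (forward pass only; the class names are illustrative, not the project's code). The comment marks where training becomes awkward: the per-layer intermediates that stochastic gradient descent needs are hidden inside the objects:

```python
import math
import random

class DenseLayer:
    """One fully connected layer with a sigmoid activation."""
    def __init__(self, n_in: int, n_out: int):
        self.weights = [[random.uniform(-1, 1) for _ in range(n_in)]
                        for _ in range(n_out)]
        self.biases = [0.0] * n_out

    def forward(self, inputs):
        return [
            1 / (1 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
            for row, b in zip(self.weights, self.biases)
        ]

class Perceptron:
    """Multi-layered perceptron: layers chained input -> output."""
    def __init__(self, layers):
        self.layers = layers

    def predict(self, inputs):
        for layer in self.layers:
            inputs = layer.forward(inputs)
        # NOTE: only the final activations escape the objects; the per-layer
        # pre-activations that backpropagation needs are not retained here.
        return inputs

net = Perceptron([DenseLayer(3, 4), DenseLayer(4, 2)])
print(net.predict([0.5, -0.2, 0.1]))  # two sigmoid outputs in (0, 1)
```

The readable forward pass versus the buried gradients in `predict` is exactly the trade-off the abstract identifies.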
VIDEO
Daniel Morrow, Stuttgart High School, Germany
Mentor: Jason Baer; Daniel Coapstick, Stuttgart High School
Buffering of Ocean Acidification in the Exaiptasia pallida, a Model System for the Cnidarian-Dinoflagellate Symbiosis
The ocean’s absorption of atmospheric CO2 causes a decrease in pH, a decrease in the concentration of carbonate (CO3²⁻) ions, and an increase in the ratio of CO2 to total dissolved inorganic carbon. This process, known as ocean acidification (OA), presents severe risks to marine life. Carbonate ions are essential for calcification, so calcifying organisms such as corals and bivalves are greatly harmed by OA. Furthermore, lower pH values have been shown to induce metabolic depression in marine species. Fortunately, the symbiosis between cnidarians, the phylum that includes corals and sea anemones, and the microscopic alga Symbiodinium is resilient to the metabolic effects of OA because of increased rates of photosynthesis under high pCO2. Nitrogen pollution in aquatic environments is a concern, but high nitrogen levels have been shown to benefit calcifying algae under OA. This study investigates whether anthropogenic nitrogen pollution has the potential to buffer the metabolic effects of OA in the cnidarian-dinoflagellate symbiosis, using the anemone Exaiptasia pallida.
VIDEO
Lauren Gerber, Kaiserslautern High School, Germany
Mentor: Ken Robinson, Kaiserslautern High School
Gravity's Effect on the Redshifting and Blueshifting of Light
Light can be used to gauge the distance of galaxies bordering the edge of our visible universe. Understanding how light warps and shifts with gravity could reveal areas of higher or lower gravity in space, showing where previously unseen planets or galaxies are located. Understanding light’s relation to gravitational force could also help scientists locate black holes and determine their positions relative to Earth. This experiment set out to show that, at the estimated final altitude, the difference in gravitational strength would not be great enough to cause any visible shift, and that if the relation of the observer to the light source was held constant, no visible shift could be recorded. Above all, this experiment set out to improve our knowledge of gravitational fields and how light interacts with them. Despite some difficulties, the experiment still provided adequate results from which a conclusion could be drawn. Over the course of the balloon flight carrying the experiment, the light’s amplitude was measured every two seconds. The final results showed that neither gravitational redshift nor gravitational blueshift occurred during the experiment.
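The null result is consistent with a back-of-the-envelope estimate: in a weak field, the fractional gravitational shift between two heights is roughly z ≈ gΔh/c². A quick sketch (30 km is an assumed typical balloon burst altitude, not the flight's recorded value):

```python
# Weak-field gravitational redshift between ground and balloon altitude:
#   z ≈ g * Δh / c²
g = 9.81          # m/s², surface gravitational acceleration
delta_h = 30_000  # m, assumed balloon burst altitude (~30 km)
c = 2.998e8       # m/s, speed of light

z = g * delta_h / c**2
print(f"{z:.2e}")  # on the order of 1e-12: far below any visible shift
```

A fractional shift that small is many orders of magnitude below what any optical measurement on a balloon payload could resolve, which is why no shift was expected or observed.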
VIDEO
Bettina Wagner, Stuttgart High School, Germany
Mentor: Daniel Coapstick, Stuttgart High School
Levels of Cognitive Processing in Relation to Reaction Times to External Stimuli
Sensory Processing Sensitivity (SPS) is estimated to be unknowingly possessed by fifteen percent of the world’s population, and has only been researched in the past twenty years. Awareness of this innate personality trait among Highly Sensitive Persons (HSPs) has been found to be life-changing and empowering in managing all aspects of well-being, but knowledge of the trait is limited, and further study would benefit the highly sensitive and non-sensitive alike. The aim of the present study was to investigate a potential correlation between reaction times to external stimuli and levels of cognitive processing. Through the creation of the Wagner SPS Survey, based on the frameworks provided by Aron and Aron and psychotherapist Julie Bjelland, distinctly highly sensitive individuals were identified in a Department of Defense (DoD) American high school in Germany. Individuals whose scores fell within the predetermined threshold were selected as participants of the study. Choice-reaction time testing was then performed with a constructed Raspberry Pi apparatus. Knowing that HSPs are inherently deeper processors than the general population, the researcher expects reaction times to be correspondingly slower in HSPs as a result of processing inhibitions. The data, currently under review, will undergo a statistical analysis to determine whether a correlation exists between the two variables. As SPS becomes increasingly recognized in society, it becomes more important to understand the HSP holistically.
VIDEO