Volume 8 Issue 3 — December 2017

Download Full Issue PDF

Contents

Using Inexpensive Microclusters and Accessible Materials for Cost-Effective Parallel and Distributed Computing Education

Joel C. Adams, Suzanne J. Matthews, Elizabeth Shoop, David Toth, and James Wolfer

pp. 2–10

https://doi.org/10.22369/issn.2153-4136/8/3/1

Download PDF

BibTeX
@article{jocse-8-3-1,
  author={Joel C. Adams and Suzanne J. Matthews and Elizabeth Shoop and David Toth and James Wolfer},
  title={Using Inexpensive Microclusters and Accessible Materials for Cost-Effective Parallel and Distributed Computing Education},
  journal={The Journal of Computational Science Education},
  year=2017,
  month=dec,
  volume=8,
  number={3},
  pages={2--10},
  doi={https://doi.org/10.22369/issn.2153-4136/8/3/1}
}

With parallel and distributed computing (PDC) now in the core CS curriculum, CS educators are building new pedagogical tools to teach their students about this cutting-edge area of computing. In this paper, we present an innovative approach we call microclusters (personal, portable Beowulf clusters) that provide students with hands-on PDC learning experiences. We present several different microclusters, each built using a different combination of single-board computers (SBCs) as its compute nodes, including various ODROID models, Nvidia's Jetson TK1, Adapteva's Parallella, and the Raspberry Pi. We explore different ways that CS educators are using these systems in their teaching, and describe specific courses in which CS educators have used microclusters. Finally, we present an overview of sources of free PDC pedagogical materials that can be used with microclusters.
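
To give a concrete flavor of the hands-on PDC exercises such microclusters support, the sketch below distributes a simple computation across the MPI ranks of a small cluster. Python with mpi4py, the midpoint-rule estimate of pi, and all names in the snippet are illustrative assumptions, not material from the paper.

# Hypothetical classroom exercise: estimate pi by splitting a midpoint-rule
# integration of 4/(1+x^2) over the ranks of a microcluster.  Assumes mpi4py
# is installed on every node; illustrative sketch only.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n = 1_000_000                 # total number of subintervals
h = 1.0 / n

# Each rank sums its own strided share of the subintervals.
local_sum = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(rank, n, size))

# Combine the partial sums on rank 0 and report the estimate.
pi_estimate = comm.reduce(local_sum * h, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi ~= {pi_estimate:.10f} using {size} process(es)")

On a Raspberry Pi or ODROID microcluster, a script like this would be launched with an MPI launcher such as mpiexec pointed at a hostfile listing the nodes; the exact invocation depends on the MPI distribution installed.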

Innovative Model, Tools, and Learning Environments to Promote Active Learning for Undergraduates in Computational Science & Engineering

Hong Liu, Michael Spector, Matthew Ikle, Andrei Ludu, and Jerry Klein

pp. 11–18

https://doi.org/10.22369/issn.2153-4136/8/3/2

Download PDF

BibTeX
@article{jocse-8-3-2,
  author={Hong Liu and Michael Spector and Matthew Ikle and Andrei Ludu and Jerry Klein},
  title={Innovative Model, Tools, and Learning Environments to Promote Active Learning for Undergraduates in Computational Science \& Engineering},
  journal={The Journal of Computational Science Education},
  year=2017,
  month=dec,
  volume=8,
  number={3},
  pages={11--18},
  doi={https://doi.org/10.22369/issn.2153-4136/8/3/2}
}

This paper presents an innovative hybrid learning model, along with the tools, resources, and learning environment, to promote active learning for both face-to-face and online students. Most small universities in the United States lack adequate resources and cost-justifiable enrollments to offer Computational Science and Engineering (CSE) courses. The goal of the project was to find an effective and affordable model for small universities to prepare underserved students with marketable analytical skills in CSE. As the primary outcome, the project created a cluster of collaborating institutions that combines students into common classes and uses cyberlearning tools to deliver and manage instruction. The instructional technologies included a Smart Podium, digital projectors, a teleconferencing system such as Adobe Connect, auto-tracking cameras, and high-quality audio in both local and remote classrooms. As an innovative active learning environment, an R&D process was used to provide a coherent framework for designing instruction and assessing learning. Course design centered on model-based learning, which proposes that students learn complex content by elaborating on their mental model, developing a conceptual model, refining a mathematical model, and conducting experiments to validate and revise their conceptual and mathematical models. A wave lab and an underwater robotics lab were used to facilitate the experimental components of hands-on research projects. Course delivery included interactive live online help sessions, immediate feedback to students, peer support, and teamwork, which were crucial for student success. Another key feature of the project's instruction was the use of emerging technologies such as HIMATT [8] to evaluate how students think through and model complex, ill-defined, and ill-structured realistic problems.

Computational approaches to scattering by microspheres

Reed M. Hodges, Kelvin Rosado-Ayala, and Maxim Durach

pp. 19–24

https://doi.org/10.22369/issn.2153-4136/8/3/3

Download PDF

BibTeX
@article{jocse-8-3-3,
  author={Reed M. Hodges and Kelvin Rosado-Ayala and Maxim Durach},
  title={Computational approaches to scattering by microspheres},
  journal={The Journal of Computational Science Education},
  year=2017,
  month=dec,
  volume=8,
  number={3},
  pages={19--24},
  doi={https://doi.org/10.22369/issn.2153-4136/8/3/3}
}

Mie theory is used to model scattering by wavelength-sized microspheres, and it has numerous applications across many different sphere geometries. Calculating the electromagnetic fields involves large sums over vector spherical harmonics, so even the basic task of evaluating the fields, along with additional analytical tools such as cross sections and intensities, requires large summations that are well suited to high performance computing. In this paper, we derive Mie theory from first principles and detail the process and results of implementing the Mie theory physics in Fortran 95. We describe the theoretical background specific to the microspheres in our system and the procedure for translating the relevant functions into Fortran. We then outline the process of optimizing the code and parallelizing various functions, comparing efficiencies and runtimes. The runtimes of the Fortran functions are then compared to those of their corresponding functions in Wolfram Mathematica; for our code, Fortran is faster than Mathematica by between one and four orders of magnitude. Parallelization further reduces the runtimes of the Fortran code for large jobs. Finally, various plots and data related to scattering by dielectric spheres are presented.
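
The summations described above are, at their core, truncated Mie series built from Riccati-Bessel functions. As a hedged illustration of the kind of sum being timed and parallelized, the Python sketch below evaluates the standard Bohren-Huffman extinction and scattering efficiencies for a non-absorbing sphere; the real refractive index, the truncation heuristic, and the use of SciPy are assumptions for this sketch, not the authors' Fortran 95 implementation.

import numpy as np
from scipy.special import spherical_jn, spherical_yn

def mie_efficiencies(m, x):
    """Extinction and scattering efficiencies of a sphere with real relative
    refractive index m and size parameter x = 2*pi*a/lambda."""
    # Common series-truncation heuristic: roughly x + 4*x**(1/3) + 2 terms.
    n_max = int(np.ceil(x + 4.0 * x ** (1.0 / 3.0) + 2.0))
    n = np.arange(1, n_max + 1)
    mx = m * x

    jx, djx = spherical_jn(n, x), spherical_jn(n, x, derivative=True)
    yx, dyx = spherical_yn(n, x), spherical_yn(n, x, derivative=True)
    jmx, djmx = spherical_jn(n, mx), spherical_jn(n, mx, derivative=True)

    # Riccati-Bessel functions: psi(rho) = rho*j_n(rho), xi(rho) = rho*h1_n(rho).
    psi_x, dpsi_x = x * jx, jx + x * djx
    psi_mx, dpsi_mx = mx * jmx, jmx + mx * djmx
    xi_x = x * (jx + 1j * yx)
    dxi_x = (jx + 1j * yx) + x * (djx + 1j * dyx)

    a_n = (m * psi_mx * dpsi_x - psi_x * dpsi_mx) / (m * psi_mx * dxi_x - xi_x * dpsi_mx)
    b_n = (psi_mx * dpsi_x - m * psi_x * dpsi_mx) / (psi_mx * dxi_x - m * xi_x * dpsi_mx)

    q_ext = (2.0 / x ** 2) * np.sum((2 * n + 1) * (a_n + b_n).real)
    q_sca = (2.0 / x ** 2) * np.sum((2 * n + 1) * (np.abs(a_n) ** 2 + np.abs(b_n) ** 2))
    return q_ext, q_sca

print(mie_efficiencies(1.5, 10.0))   # e.g. a glass-like dielectric microsphere

The number of terms grows with the size parameter, which is why evaluating the fields over many angles or sphere sizes becomes a natural target for optimization and parallelization.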

Toward simulating Black Widow binaries with CASTRO

Platon I. Karpov, Maria Barrios Sazo, Michael Zingale, Weiqun Zhang, and Alan C. Calder

pp. 25–29

https://doi.org/10.22369/issn.2153-4136/8/3/4

Download PDF

BibTeX
@article{jocse-8-3-4,
  author={Platon I. Karpov and Maria Barrios Sazo and Michael Zingale and Weiqun Zhang and Alan C. Calder},
  title={Toward simulating Black Widow binaries with CASTRO},
  journal={The Journal of Computational Science Education},
  year=2017,
  month=dec,
  volume=8,
  number={3},
  pages={25--29},
  doi={https://doi.org/10.22369/issn.2153-4136/8/3/4}
}

We present results and lessons learned from a 2015–2016 Blue Waters Student Internship. The project was to perform preliminary simulations of an astrophysics application, Black Widow binary systems, with the adaptive-mesh simulation code Castro. The process involved updating the code as needed to run on Blue Waters, constructing initial conditions, and performing scaling tests exploring Castro's hybrid message-passing/threaded architecture.
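
Since the abstract highlights scaling tests of Castro's hybrid message-passing/threaded setup, the short sketch below shows how the wall-clock times from such a strong-scaling test are typically reduced to speedup and parallel efficiency; the core counts and timings are invented placeholders, not measurements from the paper.

# Illustrative only: reducing strong-scaling timings to speedup and efficiency.
# The core counts and timings are placeholders, not results from the Castro runs.
baseline_cores = 32
timings = {          # cores -> wall-clock seconds for a fixed problem size
    32: 1000.0,
    64: 520.0,
    128: 275.0,
    256: 150.0,
}

t_base = timings[baseline_cores]
for cores in sorted(timings):
    speedup = t_base / timings[cores]
    efficiency = speedup / (cores / baseline_cores)
    print(f"{cores:4d} cores: speedup {speedup:5.2f}x, efficiency {efficiency:6.1%}")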

Using Blue Waters to Assess Non-Tornadic Outbreak Forecast Capability by Lead Time

Taylor Prislovsky and Andrew Mercer

pp. 30–35

https://doi.org/10.22369/issn.2153-4136/8/3/5

Download PDF

BibTeX
@article{jocse-8-3-5,
  author={Taylor Prislovsky and Andrew Mercer},
  title={Using Blue Waters to Assess Non-Tornadic Outbreak Forecast Capability by Lead Time},
  journal={The Journal of Computational Science Education},
  year=2017,
  month=dec,
  volume=8,
  number={3},
  pages={30--35},
  doi={https://doi.org/10.22369/issn.2153-4136/8/3/5}
}

Derechos are a dangerous, primarily non-tornadic type of severe weather outbreak responsible for a variety of atmospheric hazards. However, the predictability of these events as a function of lead time is unknown, and such knowledge would likely be invaluable to the forecasters responsible for predicting them. As such, the predictability of non-tornadic outbreaks by lead time was assessed. Five derecho events spanning 1979 to 2012 were selected and simulated using the Weather Research and Forecasting (WRF) model at 24, 48, 72, 96, and 120 hours of lead time. Nine stochastically perturbed initial conditions were generated for each case and each lead time, yielding an ensemble of derecho simulations. Moment statistics of the derecho composite parameter (DCP), a good proxy for derecho environments, were used to assess variability in forecast quality and precision by lead time. Overall, results showed that the 24- and 48-hour simulations had similar variability characteristics, as did the 96- and 120-hour simulations. This suggests a change point, or statistically notable drop-off in forecast performance, at 72 hours of lead time that should be explored more fully in future work. These results are useful to forecasters, as they provide a first guess at forecast skill and precision prior to initiating predictions at lead times of up to 5 days.
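
The core of the analysis above is straightforward bookkeeping: compute moment statistics of a diagnostic (here the DCP) across a small stochastically perturbed ensemble at each lead time and compare how they spread. A minimal sketch of that step is below; the array shapes, the synthetic placeholder values, and the use of SciPy are assumptions, not details from the paper.

# Sketch of ensemble moment statistics by lead time, assuming a 9-member
# ensemble at five lead times with one precomputed DCP value per member.
import numpy as np
from scipy import stats

lead_times = [24, 48, 72, 96, 120]            # hours
rng = np.random.default_rng(0)
# Placeholder stand-in for DCP values diagnosed from each WRF member,
# shaped (lead time, member); real values would come from model output.
dcp = rng.gamma(shape=2.0, scale=1.0, size=(len(lead_times), 9))

for lt, members in zip(lead_times, dcp):
    print(f"{lt:3d} h: mean={members.mean():.2f}  "
          f"variance={members.var(ddof=1):.2f}  "
          f"skewness={stats.skew(members):.2f}")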

Blue Waters Supercomputing Applications in Climate Modeling with the WRF Model

Morgan Smith and Andrew Mercer

pp. 36–43

https://doi.org/10.22369/issn.2153-4136/8/3/6

Download PDF

BibTeX
@article{jocse-8-3-6,
  author={Morgan Smith and Andrew Mercer},
  title={Blue Waters Supercomputing Applications in Climate Modeling with the WRF Model},
  journal={The Journal of Computational Science Education},
  year=2017,
  month=dec,
  volume=8,
  number={3},
  pages={36--43},
  doi={https://doi.org/10.22369/issn.2153-4136/8/3/6}
}

Long-term atmospheric forecasting remains a significant challenge in the field of operational meteorology. These long-term forecasts typically rely on climatological variability patterns in the geopotential height fields, known in meteorology as teleconnections. Despite this heavy reliance on teleconnections for long-term forecasts, the characterization of these patterns in operational weather models remains inadequate. The purpose of this study is to diagnose the ability of an operational forecast model to render well-known teleconnection patterns. The Weather Research and Forecasting (WRF) model, a commonly employed regional operational forecast model, was used to simulate the major 500 mb Northern Hemisphere midlatitude teleconnection patterns. These patterns were formulated using rotated principal component analysis on the 500 mb geopotential height fields. The resulting simulated teleconnection patterns were directly compared to observed teleconnection fields derived from the National Center for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis 500 mb geopotential height database, a commonly utilized observational dataset in climate research. Results were quite poor, as the simulated teleconnection patterns only somewhat resembled those constructed from the observed dataset, suggesting a limited capability of the WRF model to resolve the underlying variability structure of the hemispheric midlatitude atmosphere. Additionally, configuring the regional model for this simulation was met with a series of computational challenges, some of which were not successfully overcome. These results suggest a need for future improvements to the WRF model for reconstructing teleconnection fields and for use in climate modeling.
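
The teleconnection patterns discussed above come from rotated principal component analysis of 500 mb geopotential height fields. The sketch below shows one common recipe for that step (EOFs via an SVD of the anomaly matrix, followed by a varimax rotation of the leading loadings); the synthetic data, the number of retained modes, and the scaling conventions are illustrative assumptions rather than the authors' exact procedure.

# Illustrative rotated PCA (varimax-rotated EOFs) on a gridded height field.
# 'z500' is assumed to be an anomaly matrix of shape (time, grid points).
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a (variables x factors) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    total = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        if s.sum() < total * (1.0 + tol):
            break
        total = s.sum()
    return loadings @ rotation

def rotated_eofs(z500, n_modes=10):
    """Leading varimax-rotated loading patterns of an anomaly matrix."""
    anomalies = z500 - z500.mean(axis=0)                 # remove the time mean
    _, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    # Scale the eigenvectors by the singular values so the loadings carry amplitude.
    loadings = (vt[:n_modes].T * s[:n_modes]) / np.sqrt(z500.shape[0] - 1)
    return varimax(loadings)

# Example with synthetic data standing in for the reanalysis height fields:
rng = np.random.default_rng(1)
patterns = rotated_eofs(rng.standard_normal((480, 2592)), n_modes=10)
print(patterns.shape)   # (grid points, retained modes)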