A report from VIS 2016

3 days ago
IEEE VIS 2016 brought together researchers and practitioners to discuss the latest developments in visualization and visual analytics research and their applications. The authors describe the highlights of the 2016 event.

Recursive Orthogonal Label Regression: A Framework for Semisupervised Dimension Reduction

1 week ago
Semisupervised dimension reduction (DR) techniques using virtual label regression have attracted considerable attention, but they suffer from two restrictions: the number of discriminant directions available is constrained by the number of classes, and those directions are nonorthogonal. Traditional methods address these problems easily, but how to remove the restrictions within label regression modeling itself remains an interesting problem. To address it, the authors developed Recursive Orthogonal Label Regression (ROLR), a regression framework for semisupervised dimension reduction that applies label propagation and label regression in a recursive procedure. Here, they illustrate the formulation of ROLR using semisupervised regression encoding. ROLR provides a unified view for understanding and explaining a large family of label regression techniques. Experimental results show the approach's feasibility and effectiveness.
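The article gives the full formulation; purely as an illustration of the recursive idea, here is a minimal sketch that pairs a simple label propagation step with repeated ridge regressions, orthogonalizing each new direction against its predecessors. The propagation scheme, ridge solver, and deflation step are illustrative assumptions, not the authors' exact formulation.

    import numpy as np

    def propagate_labels(W, Y, alpha=0.9, iters=50):
        # spread the known labels in Y over the affinity graph W (assumed scheme)
        S = W / W.sum(axis=1, keepdims=True)
        F = Y.astype(float).copy()
        for _ in range(iters):
            F = alpha * (S @ F) + (1 - alpha) * Y
        return F

    def recursive_orthogonal_regression(X, F, n_dirs, lam=1e-2):
        # repeatedly regress the features onto the residual label matrix, keeping
        # each new projection direction orthogonal to the ones already found;
        # note that n_dirs is not constrained by the number of classes
        d = X.shape[1]
        dirs = np.zeros((d, 0))
        R = F.astype(float).copy()
        G = X.T @ X + lam * np.eye(d)          # ridge-regularized Gram matrix
        for _ in range(n_dirs):
            W = np.linalg.solve(G, X.T @ R)    # ridge regression toward residual labels
            u = np.linalg.svd(W, full_matrices=False)[0][:, 0]   # dominant direction
            u -= dirs @ (dirs.T @ u)           # Gram-Schmidt against earlier directions
            u /= np.linalg.norm(u)
            dirs = np.hstack([dirs, u[:, None]])
            z = X @ u                          # deflate the labels explained so far
            R -= np.outer(z, z @ R) / (z @ z)
        return dirs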

Application Modernization for the Exascale Era

2 weeks 5 days ago
The articles in this special issue describe a range of modernization activities: developing proxy applications to rapidly explore algorithmic and programming model changes, co-designing hardware and software features between vendors and application teams well in advance of delivery, and gaining early access to hardware and software stacks through dedicated hands-on activities with vendors. Together, they give others a starting point for their own application modernization roadmaps.

What Is the Blockchain?

2 weeks 6 days ago
Blockchain is a new technology, based on hashing, that underlies the platforms for trading cryptocurrencies and executing smart contracts. This article reviews the technology's basic ideas and provides a sample minimalist implementation in Python.
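The article's implementation isn't reproduced in this summary, but a comparable minimal sketch conveys the core idea: each block stores the hash of its predecessor, so tampering with any block invalidates every hash that follows. The class names and fields below are illustrative choices, not the article's code.

    import hashlib, json, time

    class Block:
        def __init__(self, index, data, prev_hash):
            self.index = index
            self.timestamp = time.time()
            self.data = data
            self.prev_hash = prev_hash
            self.hash = self.compute_hash()

        def compute_hash(self):
            # hash the block's contents, including the previous block's hash
            payload = json.dumps({"index": self.index, "timestamp": self.timestamp,
                                  "data": self.data, "prev_hash": self.prev_hash},
                                 sort_keys=True)
            return hashlib.sha256(payload.encode()).hexdigest()

    class Blockchain:
        def __init__(self):
            self.chain = [Block(0, "genesis", "0" * 64)]

        def add(self, data):
            prev = self.chain[-1]
            self.chain.append(Block(prev.index + 1, data, prev.hash))

        def is_valid(self):
            # every block must hash correctly and point at its predecessor's hash
            return all(cur.prev_hash == prev.hash and cur.hash == cur.compute_hash()
                       for prev, cur in zip(self.chain, self.chain[1:]))

    chain = Blockchain()
    chain.add({"from": "alice", "to": "bob", "amount": 5})
    print(chain.is_valid())              # True
    chain.chain[1].data["amount"] = 500
    print(chain.is_valid())              # False: the tampered block no longer hashes correctly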

Wavelet-Based Visual Analysis for Data Exploration

2 weeks 6 days ago
The conventional wavelet transform is widely used in image and signal processing, where a signal is decomposed into a combination of known signals; by analyzing the individual contributions, the behavior of the original signal can be inferred. In this article, the authors present an introductory overview of the extension of this theory to the graph domain. They review the graph Fourier transform and the graph wavelet transforms built from dictionaries of graph spectral filters, namely, spectral graph wavelet transforms. The main features of these graph wavelet transforms are then presented using real and synthetic data.
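For readers new to the area, the sketch below shows the standard construction the article reviews: the eigenvectors of the graph Laplacian play the role of Fourier modes, and a spectral graph wavelet applies a bandpass kernel to the Laplacian's eigenvalues. The example graph, signal, and kernel g are arbitrary illustrative choices.

    import numpy as np

    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 1],
                  [0, 1, 0, 1],
                  [0, 1, 1, 0]], dtype=float)   # adjacency of a small example graph
    L = np.diag(A.sum(axis=1)) - A              # combinatorial graph Laplacian
    eigvals, U = np.linalg.eigh(L)              # eigenvalues act as graph "frequencies"

    x = np.array([1.0, 0.5, -0.2, 0.3])         # a signal living on the graph's nodes
    x_hat = U.T @ x                             # forward graph Fourier transform
    x_rec = U @ x_hat                           # inverse transform recovers x

    # a spectral graph wavelet at scale s applies a bandpass kernel g to the spectrum
    g = lambda lam, s: lam * s * np.exp(-lam * s)
    wavelet_coeffs = U @ np.diag(g(eigvals, s=1.0)) @ U.T @ x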

Modeling and Simulating Internet-of-Things Systems: A Hybrid Agent-Oriented Approach

2 weeks 6 days ago
The focus of the Internet has recently shifted from conventional computers and mobile devices to everyday objects, people, and places; consequently, the Internet of Things (IoT) promises to be not only a compelling vision but the actual driving force of the upcoming fourth Industrial Revolution. Novel cyber-physical, customized, and highly pervasive services are impacting our lives, involving many stakeholders and fostering a globally interconnected ecosystem of unprecedented scale. However, IoT system development is a multifaceted process that's complex, error-prone, and time-consuming. Although modeling and simulation are crucial aspects that could effectively support IoT system development, an integrated approach that synergistically provides both is lacking. The authors propose a hybrid approach that uses agents for IoT modeling and OMNeT for simulation, providing mapping guidelines between the agent paradigm and the OMNeT simulator's abstractions. The proposed approach has been applied in small-, medium-, and large-scale IoT scenarios, where relevant performance indexes of communication among IoT entities were measured and analyzed.

Teaching Scenario-Based Programming: An Additional Paradigm for the High School Computer Science Curriculum, Part 1

2 weeks 6 days ago
This article describes a pilot programming course in which high school students were introduced, through the visual programming language of live sequence charts (LSC), to a new paradigm termed scenario-based programming. The rationale underlying this course was to teach high school students a second, very different programming paradigm. Using LSC for this purpose has other advantages, such as exposing students to high-level programming, dealing with nondeterminism and concurrency, and addressing human-computer interaction (HCI) issues. This work also contributes to the discussion about guiding principles for curriculum development. It highlights an important principle: the educational objective of a course should include more than mere knowledge enhancement. A course should be examined and justified through its contribution to learning fundamental ideas and forming useful habits of mind.

Parallel Implementation of the Ensemble Empirical Mode Decomposition and Its Application for Earth Science Data Analysis

2 weeks 6 days ago
To efficiently perform multiscale analysis of high-resolution, global, multidimensional datasets, the authors have deployed the parallel ensemble empirical mode decomposition (PEEMD) package by implementing three levels of parallelism into the ensemble empirical mode decomposition (EEMD), achieving scaled performance on up to 5,000 cores. In this study, they discuss the implementation of PEEMD and its application to the analysis of Earth science data, including the solution of the Lorenz model, an idealized terrain-induced flow, and Hurricane Sandy.
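EEMD's structure is also what makes it parallel-friendly: each noise-perturbed ensemble member can be sifted independently. Below is a minimal serial sketch that extracts only the first intrinsic mode function, using cubic-spline envelopes; the noise level, ensemble size, and fixed sifting count are illustrative assumptions, and the PEEMD package's actual three-level parallel decomposition is described in the article.

    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import argrelextrema

    def sift_once(x, t):
        # one sifting pass: subtract the mean of the upper and lower envelopes
        maxima = argrelextrema(x, np.greater)[0]
        minima = argrelextrema(x, np.less)[0]
        if len(maxima) < 2 or len(minima) < 2:
            return None                      # too few extrema to build envelopes
        upper = CubicSpline(t[maxima], x[maxima])(t)
        lower = CubicSpline(t[minima], x[minima])(t)
        return x - (upper + lower) / 2

    def eemd_first_imf(x, t, n_ensemble=50, noise_std=0.2, n_sifts=10):
        # average the first IMF over many noise-perturbed copies of the signal;
        # the loop over ensemble members is trivially parallelizable
        imfs = []
        for _ in range(n_ensemble):
            h = x + noise_std * np.std(x) * np.random.randn(len(x))
            for _ in range(n_sifts):
                nxt = sift_once(h, t)
                if nxt is None:
                    break
                h = nxt
            imfs.append(h)
        return np.mean(imfs, axis=0)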

SPACSSIM: Simulation and Analysis Software for Mathematical Modeling of Satellite Position and Attitude Control Systems

2 weeks 6 days ago
This article presents software for mathematical modeling of a satellite position and attitude control system. The software is developed in Matlab for conceptual design; an added benefit is that it shortens design time and decreases design costs. Interactive modules for different actuators, with various configurations and control algorithms, make this toolbox suitable for analyzing the effect of design parameters on satellite response and stability. Moreover, the toolbox contains adjustable modules for date and orbital parameters, which help users better understand how satellite position affects phenomena such as eclipses, the local magnetic field, and communication with the ground station. Taking position data, the software computes disturbance torques and lets users analyze satellite attitude control performance. The software presents simulation results as a set of graphics and text windows, making it more user friendly.
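The abstract doesn't specify which disturbance models the toolbox includes; as one standard example of a torque computed from position data, the sketch below evaluates the classic gravity-gradient torque T = (3μ/r³) r̂ × (I r̂), with Python standing in for the toolbox's Matlab. The inertia tensor and orbit values are made up for illustration.

    import numpy as np

    MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2

    def gravity_gradient_torque(r_body, inertia):
        # gravity-gradient disturbance torque, with the satellite position
        # vector r_body expressed in the body-fixed frame
        r = np.linalg.norm(r_body)
        r_hat = r_body / r
        return 3.0 * MU_EARTH / r**3 * np.cross(r_hat, inertia @ r_hat)

    I = np.diag([10.0, 12.0, 8.0])        # example inertia tensor, kg m^2
    r = np.array([7.0e6, 0.0, 1.0e5])     # example low-Earth-orbit position, m
    print(gravity_gradient_torque(r, I))  # disturbance torque in N m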

Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis

2 weeks 6 days ago
Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more nearly constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) system at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.

The Trinity Center of Excellence Co-Design Best Practices

2 weeks 6 days ago
Co-design between application developers and the vendors providing high-performance computing systems is essential to efficiently using current-generation HPC systems and shaping next-generation architectures. Application developers are faced with the challenge of effectively orchestrating many hardware threads and exploiting vector instructions, which can require code modifications. Will refactoring applications for current-generation systems apply to future-generation systems? Is there help available for this enormous task? Over the past 10 years, a partnership has evolved to address these issues. Centers of Excellence are mutually beneficial collaborations between application developers and vendors; vendors receive a better understanding of the needs of their users, and users receive a better understanding of the characteristics of the hardware, system software, and tools available to address their challenges. This article describes the partnership between two vendors, Cray and Intel, and the application teams working together in the Trinity Center of Excellence.

Application Modernization at LLNL and the Sierra Center of Excellence

2 weeks 6 days ago
In 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia that marks a significant shift in direction for LLNL by introducing heterogeneous computing via GPUs. LLNL's mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with the laboratory personnel who represent the application developers, system software, and tool providers, in a concentrated effort to prepare the laboratory's codes before the system transitions to production in 2018. This article presents LLNL's overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.

Exploratory Research to Expand Opportunities in Computer Science for Students with Learning Differences

3 weeks 5 days ago
The computer science (CS) education field is engaging in unprecedented efforts to expand learning opportunities in K-12 CS education, but one group of students is often overlooked: those with specific learning disabilities and related attention deficit disorders. As CS education initiatives grow, K-12 teachers need research-informed guidance to make computing more accessible for students who learn differently. This article reports on the first phase of a National Science Foundation-supported exploratory research study to address this problem. The authors present their education research-practice partnership, initial findings, and highlights of a collaborative process that has furthered their work to support more equitable learning in CS.

Signals, Systems, and Design [Book review]

1 month ago
This book differentiates itself through a number of seemingly small but powerful design decisions that go a long way in realizing an effective and instructive textbook for a first course in the subject. Students in the second year of an engineering program, especially those who have gone through the standard mathematics core courses, will find themselves comfortable studying and learning from this book. The book's organization is straightforward but effective. After two introductory chapters dedicated to signals, systems, and the mathematical background required to study them, the text continues with two sets of chapters focusing, respectively, on analog and digital signals and systems. The two sets are organized very similarly, sequentially dedicating one chapter each to signals in the time domain, systems in the time domain, signals in the frequency domain, and systems in the frequency domain. The time domain chapters mostly examine classification, with the addition of the sampling operation when discussing digital signals. The frequency domain chapters consider the fundamental transforms (such as the continuous- and discrete-time Fourier transforms, the Laplace transform for analog signals and systems, and the Z-transform for digital signals and systems) and the techniques for studying system response to different input signals. The last two chapters are dedicated to the discrete Fourier transform, including its relation to the discrete-time Fourier transform and the fast Fourier transform algorithm.

Understanding the Solar Wind–Mars Interaction with Global Magnetohydrodynamic Modeling

1 month ago
This article presents recent progress in understanding the solar wind–Mars interaction using a sophisticated global magnetohydrodynamic (MHD) model. Mars lacks a global intrinsic magnetic field and has only localized crustal magnetic fields, so the solar wind plasma flow interacts directly with the Mars atmosphere/ionosphere system. Such an interaction generates an induced current in the ionosphere, modifies the magnetic field environment around Mars, and, more importantly, causes erosion of the Mars atmosphere. The nonuniformly distributed crustal magnetic field also plays an important role in the interaction process, which is modulated by planetary rotation. Recent advances in computing power allow the inclusion of continuous crustal field rotation in the simulation with a time-dependent MHD model. Model results have been validated against observations from previous and ongoing Mars missions. The validated time-dependent MHD model is useful in quantifying the variation of ion loss rates with planetary rotation and the internal response time scale of the Martian ionosphere.

Quantity Correctness in Fortran Programs

1 month ago
The ISO/IEC Fortran standards working group states that the third most detectable type of error in scientific software is “incorrect use of units of measurement” and has developed a draft specification for incorporating unit-checking into Fortran compilers. This article shows that unit-checking is insufficient to detect all quantity errors, and that kind-of-quantity also needs to be verified. A suggested syntax and program for a Fortran source-code preprocessor are given.
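A classic quantity error that unit checking alone misses is adding a torque to an energy: both have dimensions kg·m²/s². The sketch below uses Python rather than the article's proposed Fortran preprocessor syntax, and shows how carrying a kind-of-quantity label alongside the dimension exponents catches such errors.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Quantity:
        value: float
        dims: tuple    # exponents of the SI base dimensions (m, kg, s)
        kind: str      # kind-of-quantity label, e.g. "energy" vs "torque"

        def __add__(self, other):
            if self.dims != other.dims:
                raise TypeError("incompatible units")
            # unit checking alone would stop here and accept energy + torque;
            # kind-of-quantity checking rejects it even though the units match
            if self.kind != other.kind:
                raise TypeError(f"incompatible kinds: {self.kind} vs {other.kind}")
            return Quantity(self.value + other.value, self.dims, self.kind)

    energy = Quantity(5.0, (2, 1, -2), "energy")   # joule: kg m^2 / s^2
    torque = Quantity(2.0, (2, 1, -2), "torque")   # newton-metre: same dimensions
    try:
        energy + torque
    except TypeError as err:
        print(err)     # incompatible kinds: energy vs torque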