Over much of its 80-year history, radio astronomy has been enabled by and dependent
on three technologies: digital signal processing, computing, and networking.
By the end of this decade, radio astronomers working in an international consortium will
have built the world's largest, most sensitive, fastest radio telescope, the Square
Kilometre Array. The SKA will be one of the foremost scientific instruments in the world,
addressing some of the most important questions in astrophysics and cosmology. The SKA will
naturally stress and stretch the state of the art in the three technology areas mentioned above.
One of the most notable technical aspects of the SKA is its very large data rate: more than
10 Pbit/s flowing over long-distance networks into the digital signal processing chain, about
1 Exaflop/s required for processing into images, and 1 Exabyte of science data produced per week.
This data rate raises many issues throughout the entire system design. I will concentrate on
the interaction between the computing architecture and the physics and algorithmics of the
measurement process.
The serpentine geometry is meant to isolate the influence of curvature on turbulence.
Periodicity is imposed between inlet and exit, so a fully developed flow field develops
in our direct numerical simulation. The duct is rotated to examine the combined effects
of curvature and rotation. The theoretical framework will be sketched.
Stabilizing and destabilizing influences of curvature and rotation are seen in the
DNS data. Large, but quite weak, streamwise vortices form due to convex curvature;
they break up under rotation, possibly because rotation destroys the symmetry
between the two bends of the serpentine. Within the bends turbulent intensities are
enhanced or suppressed roughly in accord with theory. Particles are tracked in the flow,
and some rather interesting distributions of particle accumulation are seen. Light particles
accumulate somewhat along the walls. It is unclear whether turbophoresis plays a role, or
whether this is a balance of centrifugal acceleration and turbulent dispersion.
Heavy particles form into jets, leaving the inner wall, and bands of concentration are caused
by multiple reflections from the outer wall. The concentration bands spread by turbulent mixing.
Standard drag formulas with one-way coupling are used for the particle trajectories. The domain
is partitioned geometrically, so the high local concentrations leave the particle tracking poorly
load balanced.
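The one-way-coupled tracking described above can be sketched minimally. The snippet below integrates a particle under Stokes drag through a prescribed fluid field; the parameter values, the forward-Euler integrator, and the uniform flow field are illustrative assumptions, not details of the actual simulation:

```python
import numpy as np

def stokes_relaxation_time(rho_p, d_p, mu):
    """Particle relaxation time tau_p = rho_p * d_p**2 / (18 * mu)."""
    return rho_p * d_p**2 / (18.0 * mu)

def track_particle(x0, v0, fluid_velocity, tau_p, dt, n_steps):
    """One-way coupled tracking: the particle feels Stokes drag from the
    fluid, but never modifies the fluid field (one-way coupling)."""
    x, v = np.array(x0, float), np.array(v0, float)
    path = [x.copy()]
    for _ in range(n_steps):
        u = fluid_velocity(x)            # sample fluid velocity at particle
        v = v + dt * (u - v) / tau_p     # Stokes drag: dv/dt = (u - v)/tau_p
        x = x + dt * v
        path.append(x.copy())
    return np.array(path)

# Hypothetical case: a particle released at rest in a uniform flow
# relaxes toward the fluid velocity on the timescale tau_p.
tau = stokes_relaxation_time(rho_p=1000.0, d_p=50e-6, mu=1.8e-5)
path = track_particle([0.0, 0.0], [0.0, 0.0],
                      lambda x: np.array([1.0, 0.0]),
                      tau, dt=tau / 20, n_steps=200)
```

Because the coupling is one-way, each particle only reads the fluid field; under a purely geometric domain partition, processors owning high-concentration regions therefore carry most of the tracking work.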
The University of Texas at Austin and Royal Institute of Technology at Stockholm |
We will give a brief overview of multiscale modeling for wave equation problems and
then focus on two techniques. One is the heterogeneous multiscale method applied to problems
with oscillatory velocity fields. In this technique, time-domain calculations on a refined
grid are performed on small subsets of the computational domain to achieve efficiency.
The other technique is a new class of sweeping preconditioners for
frequency-domain simulation with variable coefficients, resulting in computational procedures
that essentially scale linearly even in the high-frequency regime.
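As a minimal illustration of the flavour of such multiscale methods (not the authors' implementation): in 1D, the effective coefficient of a wave equation with a rapidly oscillating coefficient is its harmonic average, and a small local sampling window, in the spirit of an HMM micro-solve, can estimate it. All names and parameters below are illustrative:

```python
import numpy as np

def effective_coefficient(a_micro, x_center, delta, n_micro=5000):
    """Estimate the effective (homogenized) coefficient near x_center by
    harmonic averaging of the oscillatory coefficient over a small window
    of width delta. In 1D the harmonic average is the exact homogenized
    coefficient; in higher dimensions an HMM micro-solve estimates the
    analogous effective quantity by local fine-grid simulation."""
    x = np.linspace(x_center - delta / 2, x_center + delta / 2, n_micro)
    return 1.0 / np.mean(1.0 / a_micro(x))

# Oscillatory coefficient a(x) = 2 + sin(2*pi*x/eps), with the period
# eps much smaller than the sampling window delta.
eps = 1e-3
a = lambda x: 2.0 + np.sin(2 * np.pi * x / eps)
a_eff = effective_coefficient(a, x_center=0.5, delta=0.01)
# Analytic harmonic mean of 2 + sin(.) is sqrt(2**2 - 1) = sqrt(3).
```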
Australian National University |
One of the main obstacles to real progress in the science of complex
real world materials has been the need to accurately characterise
material structure and thereby to predict properties of materials in
three dimensions. We describe the development of a new quantitative
numerical laboratory approach to the study of complex real world
materials in 3D. A first part incorporates the development and
integration of experimental 3D imaging facilities for characterizing
materials at multiple scales. A second part is the development of
computational infrastructure for image reconstruction, phase
identification, multiscale mapping, 3D visualisation, structural
characterisation, and prediction of physical properties
from digitised 3D images. We describe examples where the
ability to quantitatively measure and characterize the structure and
properties of complex materials in 3D has had an impact on applied sciences,
including materials design and bone health diagnosis. We also describe
proven commercial applications of the technology in the resources sector.
University of New South Wales |
High dimensional problems, that is, problems with a very large number of variables,
are coming to play an ever more important role in applications. These include, for
example, option pricing problems in mathematical finance, maximum likelihood problems in statistics,
and porous flow problems in computational physics. High dimensional problems pose immense challenges
for practical computation, because of a nearly inevitable tendency for the costs of computation to
increase exponentially with dimension: this is the celebrated "curse of dimensionality". In this talk
I will give an introduction to "quasi-Monte Carlo methods" for tackling high dimensional integrals, with
a focus on "lattice rules", and discuss the challenges that we face while attempting to lift the curse
of dimensionality.
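As a minimal sketch of the idea, a rank-1 lattice rule approximates an integral over the unit cube by averaging the integrand over the points {iz/n mod 1}. The integrand below is illustrative, and the generating vector is chosen at random for demonstration only; in practice z comes from a component-by-component construction:

```python
import numpy as np

def lattice_rule(f, n, z):
    """Rank-1 lattice rule: average f over the n points {i*z/n mod 1}."""
    points = np.mod(np.outer(np.arange(n), z) / n, 1.0)  # n x d point set
    return f(points).mean()

# Illustrative smooth integrand on [0,1]^d with exact integral 1: a
# product with decaying weights, so later coordinates matter less --
# the "low effective dimension" setting where QMC shines.
d = 10
c = 1.0 / (1.0 + np.arange(d))**2
f = lambda x: np.prod(1.0 + c * (x - 0.5), axis=1)

n = 1021  # a prime number of points
z = np.random.default_rng(0).integers(1, n, size=d)  # random, for illustration
estimate = lattice_rule(f, n, z)
```

Because n is prime, every coordinate of the point set is a permutation of {0, 1/n, ..., (n-1)/n}, so each one-dimensional projection is perfectly stratified, which plain Monte Carlo cannot guarantee.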
The quantification of uncertainty in groundwater flow plays a central
role in the safety assessment of radioactive waste disposal and of CO2
capture and storage underground. Stochastic modelling of data
uncertainties in the rock permeabilities leads to elliptic PDEs with
random coefficients. A typical computational goal is the estimation of
the expected value or higher order moments of some relevant quantities
of interest, such as the effective permeability or the breakthrough
time of a plume of radionuclides. Because of the typically large variances
and short correlation lengths in groundwater flow applications, methods based
on truncated Karhunen-Loeve expansions are only of limited use and Monte Carlo type
methods are still most commonly used in practice. To overcome the
notoriously slow convergence of conventional Monte Carlo, we formulate
and implement novel methods based on (i) deterministic rules to cover
probability space (Quasi-Monte Carlo) and (ii) hierarchies of spatial
grids (multilevel Monte Carlo). It has been proven theoretically for
certain classes of problems that both of these approaches have the
potential to significantly outperform conventional Monte Carlo. A full
theoretical justification that the groundwater flow applications
discussed here belong to those problem classes is currently under
investigation. However, our numerical results show experimentally that
both methods do indeed consistently outperform conventional Monte
Carlo, even in this more complicated setting, to the extent that
asymptotically the computational cost is proportional to the cost of
solving one deterministic PDE to the same accuracy.
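A minimal sketch of the multilevel Monte Carlo telescoping identity, with a toy surrogate standing in for the PDE solve (the surrogate, its 2^-l "discretisation error", and the sample counts are all illustrative assumptions, not from the application):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_diff(level):
    """Sample Q_l - Q_{l-1} using the SAME random input on both levels,
    so the corrections have small variance (the key MLMC ingredient).
    Q_l is a toy surrogate: x**2 plus a level-l discretisation error of
    size 2**-l, standing in for a PDE solve on grid level l."""
    x = rng.standard_normal()
    def q(l):
        return x**2 + 2.0**(-l) * (1.0 + 0.5 * rng.standard_normal())
    return q(level) - q(level - 1) if level > 0 else q(0)

def mlmc(sample_diff, n_samples):
    """Multilevel Monte Carlo: E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_{l-1}],
    with each level estimated independently; coarse (cheap) levels get
    many samples, fine (expensive) levels few."""
    return sum(np.mean([sample_diff(l) for _ in range(n)])
               for l, n in enumerate(n_samples))

est = mlmc(sample_diff, n_samples=[40000, 10000, 2500, 625])
# Target: E[x**2] = 1, plus the finest level's residual bias 2**-3.
```

The geometric decay of both the correction variances and the sample counts is what lets the total cost approach that of a single fine-level solve.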
University of California, Berkeley |
Propagating interfaces occur in a wide variety of settings, and include ocean waves, burning flames,
and material boundaries. Less obvious boundaries are equally important, and include iso-intensity
contours in images, handwritten characters, and shapes against boundaries. In addition, some static
problems can be recast as advancing fronts, including robotic navigation and finding shortest paths
on contorted surfaces. One way to frame moving interfaces is to recast them as solutions to fixed
domain Eulerian partial differential equations, and this has led to a collection of
PDE-based techniques, including level set methods, fast marching methods, and ordered upwind methods.
These techniques easily accommodate merging boundaries and the delicate 3D physics of interface motion.
In many settings, they have proven valuable.
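As a minimal sketch of one of these techniques (a simplified first-order scheme, not a production implementation), the fast marching method computes first-arrival times of a front by accepting grid points in order of increasing arrival time, a Dijkstra-like sweep with an upwind finite-difference update:

```python
import heapq
import numpy as np

def eikonal_update(T, i, j, f, h):
    """Upwind update for the eikonal equation |grad T| = 1/f at (i, j)."""
    a = min(T[i-1, j] if i > 0 else np.inf,
            T[i+1, j] if i < T.shape[0] - 1 else np.inf)
    b = min(T[i, j-1] if j > 0 else np.inf,
            T[i, j+1] if j < T.shape[1] - 1 else np.inf)
    a, b = sorted((a, b))
    if b - a >= h / f:                    # only one upwind neighbour usable
        return a + h / f
    return 0.5 * (a + b + np.sqrt(2 * (h / f)**2 - (a - b)**2))

def fast_marching(speed, source, h=1.0):
    """Accept grid points in order of increasing arrival time T."""
    ny, nx = speed.shape
    T = np.full((ny, nx), np.inf)
    accepted = np.zeros((ny, nx), bool)
    T[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if accepted[i, j]:
            continue
        accepted[i, j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx and not accepted[ni, nj]:
                t_new = eikonal_update(T, ni, nj, speed[ni, nj], h)
                if t_new < T[ni, nj]:
                    T[ni, nj] = t_new
                    heapq.heappush(heap, (t_new, (ni, nj)))
    return T

# With unit speed, T approximates Euclidean distance from the source.
T = fast_marching(np.ones((50, 50)), source=(0, 0))
```

Because each point is accepted exactly once, the total cost is O(N log N) for N grid points, which is what makes these methods attractive for the shortest-path and navigation problems mentioned above.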
In this talk, we will focus on scientific and engineering applications of these techniques. These will
include: "How do home inkjet plotters work?", "What happens when my faucet drips?",
"How can we guide chemical probes through complex materials?", "How are semiconductors built?",
and "How can we automate the early detection of eye disease?".