Speakers

Invited Speaker Bios and Abstracts

Thierry Coupez, Ecole Centrale de Nantes

Title: Implicit Boundaries and Anisotropic Adaptive Meshing

Biography: Thierry Coupez is Professor at Ecole Centrale de Nantes and the director of ICI, the newly created HPC institute of Nantes. He was previously Professor at Mines ParisTech in the CEMEF laboratory. He has developed material forming simulation software widely used in industry (Forge3, Rem3D, Ximex, Thost, ... and the general parallel FE library CimLib). His expertise ranges from meshing and remeshing, anisotropic adaptive meshing, and parallel meshing to finite element solvers in fluid and solid mechanics.

Abstract: The automatic generation of a single mesh involving several complex geometrical domains is an issue in many applications, for instance in multiphase flow, fluid-structure interaction, computation from images, or multi-domain solid and fracture mechanics. To simplify this issue, we propose to combine an implicit boundary representation with an anisotropic adaptive framework. The interfaces are described by an implicit function based on the hyperbolic tangent of a distance function (derived from the level set method) and controlled by a thickness parameter [4]. This approach provides a very flexible way of building complex multi-domain simulations, in contrast to body-fitted techniques, which require boundaries and interfaces to be explicitly constrained in the volume mesh. The geometrical accuracy of implicit representations depends on the thickness parameter, which in turn depends on the fineness of the mesh at the targeted interfaces.
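The hyperbolic-tangent construction described above can be sketched in a few lines. The formulation below (u = eps * tanh(d / eps), with d the signed distance and eps the thickness parameter) follows the general idea of reference [4]; function names and the example geometry are illustrative.

```python
import numpy as np

def implicit_interface(d, eps):
    """Smoothed distance function u = eps * tanh(d / eps).

    Near the interface (|d| << eps) u behaves like the signed distance d
    itself; far away it saturates at +/- eps, so the transition region has
    a width controlled by the thickness parameter eps.
    """
    return eps * np.tanh(d / eps)

# Example: a circle of radius 1 embedded implicitly in a background grid.
x, y = np.meshgrid(np.linspace(-2, 2, 41), np.linspace(-2, 2, 41))
d = np.sqrt(x**2 + y**2) - 1.0          # signed distance to the circle
u = implicit_interface(d, eps=0.1)

# The zero level set of u coincides with the interface (d == 0),
# and |u| never exceeds the thickness parameter.
assert np.all(np.abs(u) <= 0.1)
```

The zero level set of u is exactly the interface, while the bounded range of u makes the function well suited to interpolation-error control on coarse background meshes.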

The new route proposed here is to use an adaptive meshing technique based on an a posteriori estimate of the interpolation error of the implicit function associated with the interface. Starting from any uniform coarse mesh, the anisotropic adaptive machinery [1,2] provides not only a mesh with arbitrarily complex embedded geometrical domains but also a mesh well adapted to flow features such as vortices and boundary layers [3]. Following this approach, mesh generation becomes a mesh adaptation process.
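One standard way to turn an interpolation-error estimate into an anisotropic mesh prescription is a Hessian-based metric: eigen-directions of the Hessian of the adapted field give the stretching directions, and rescaled absolute eigenvalues give the local sizes. The sketch below illustrates that idea only; the edge-based estimator of [1,2] differs in its details.

```python
import numpy as np

def metric_from_hessian(H, tol):
    """Build an anisotropic metric M from the (symmetric) Hessian H of the
    adapted field, for a target interpolation-error tolerance tol.

    Prescribed edge lengths per direction are h_i = 1 / sqrt(lambda_i(M)).
    """
    lam, V = np.linalg.eigh(H)                   # real eigenpairs (H symmetric)
    lam = np.maximum(np.abs(lam) / tol, 1e-12)   # keep M positive definite
    return V @ np.diag(lam) @ V.T

# Near a flat interface the field varies mostly across it, so the Hessian is
# nearly rank-one and the metric requests strong refinement in one direction.
H = np.array([[100.0, 0.0],
              [0.0,   0.1]])
M = metric_from_hessian(H, tol=0.01)
h = 1.0 / np.sqrt(np.linalg.eigvalsh(M))         # target sizes per direction
```

In this example the two prescribed sizes differ by more than an order of magnitude, which is precisely the anisotropy needed to capture a thin tanh transition layer cheaply.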

This technique is directly suited to diffuse interface methods but can be extended to body-fitted techniques. Moreover, it can be combined with an adaptive solution strategy based on the solver itself, here a variational multiscale (VMS) finite element solver. Several examples will be presented in multiphase flows, fluid-structure interaction, and CFD from 3D image data.


Dr. Oleg Skipa, Team Leader, CST - Computer Simulation Technology

Title: Mesh generation for electromagnetic design and analysis

Biography: Oleg Skipa works as a Team Leader in the R&D department of CST Computer Simulation Technology AG, based in Darmstadt, Germany. He is responsible for the mesh generation technology for electromagnetic and multiphysics simulations. He received his Ph.D. in 2004 from the Karlsruhe Institute of Technology (KIT), Germany, working on numerical simulation of the electric fields generated by the human heart. Before starting his postgraduate studies in 1999, he obtained a Diploma in Physics from Kiev National Taras Shevchenko University, Ukraine.

Abstract: A versatile tool able to simulate electromagnetic fields in 3D has proven extremely useful in a wide range of technical applications. Today, electromagnetic simulations are used to design components like antennas and filters and to study complex systems like a car or an airplane with electronic circuitry inside. The geometric scale may range from nanometers, in the case of metamaterial modeling, to many meters for objects like motors or cars. The need for densely packed electronics, the trend toward "smart" wirelessly interconnected devices, and the growth of available computational power are just a few factors contributing to the growth of the market for electromagnetic (EM) simulation tools.

What are the requirements on the mesh generation workflow in an all-in-one design environment for EM simulations? From the user's point of view they are quite simple: it should be a seamless push-button solution able to handle all kinds of input. In this talk we will take the developer's point of view and walk through a range of applications in the area of EM simulation. I will talk about the challenges we face when serving the meshing needs of various calculation methods, the requirements a meshing component must meet in order to be used in an EM simulation tool, and our experience in building the meshing pipeline so that in many cases the impression of a "push-button" simulation can be achieved.


Dr. Omar Ghattas, Institute for Computational Engineering and Sciences, Department of Mechanical Engineering, and Jackson School of Geosciences, The University of Texas at Austin

Title: Large-scale Bayesian Inverse Problems and the Flow of the Antarctic Ice Sheet

Biography: Dr. Omar Ghattas is the John A. and Katherine G. Jackson Chair in Computational Geosciences, Professor of Geological Sciences and Mechanical Engineering, and Director of the Center for Computational Geosciences in the Institute for Computational Engineering and Sciences (ICES) at The University of Texas at Austin. He is also a member of the faculty in the Computational Science, Engineering, and Mathematics (CSEM) interdisciplinary PhD program in ICES, serves as Director of the KAUST-UT Austin Academic Excellence Alliance, and holds courtesy appointments in Computer Science, Biomedical Engineering, the Institute for Geophysics, and the Texas Advanced Computing Center. Prior to coming to UT-Austin in 2005, he was a professor at Carnegie Mellon University for 16 years. He earned BS, MS, and PhD degrees from Duke University in 1984, 1986, and 1988.

Ghattas has general research interests in simulation and modeling of complex geophysical, mechanical, and biological systems on supercomputers, with specific interest in inverse problems and associated uncertainty quantification for large-scale systems. His center's current research is aimed at large-scale forward and inverse modeling of whole-earth, plate-boundary-resolving mantle convection; global seismic wave propagation; dynamics of polar ice sheets and their land, atmosphere, and ocean interactions; and subsurface flows, as well as the underlying computational, mathematical, and statistical techniques for making tractable the solution and uncertainty quantification of such complex forward and inverse problems on parallel supercomputers.

Ghattas received the 1998 Allen Newell Medal for Research Excellence, the 2004/2005 CMU College of Engineering Outstanding Research Prize, the SC2002 Best Technical Paper Award, the 2003 IEEE/ACM Gordon Bell Prize for Special Accomplishment in Supercomputing, the SC2006 HPC Analytics Challenge Award, and the 2008 TeraGrid Capability Computing Challenge award, and was a finalist for the 2008, 2010, and 2012 Bell Prizes. He has served on the editorial boards or as associate editor of 13 journals, has been co-organizer of 12 conferences and workshops and served on the scientific or program committees of 49 others, has delivered invited keynote or plenary lectures at 33 international conferences, and has been a member or chair of 27 national or international professional or governmental committees. He is a Fellow of the Society for Industrial and Applied Mathematics (SIAM).

Abstract: Bayesian inference provides a systematic and coherent framework for quantifying uncertainty in the solution of ill-posed inverse problems. Given uncertainty in observational data, the forward model, and any prior knowledge of the parameters, the solution of the Bayesian inverse problem yields the so-called posterior probability of the model parameters, conditioned on the data. The fundamental challenge is how to explore this posterior density, in particular when the forward model is represented by PDEs (so that evaluating the posterior at any point requires solution of these PDEs), and the uncertain parameters are given by (a discretized) infinite dimensional field (so that we are faced with numerous evaluations of the posterior).
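The Bayesian update described in the abstract can be written compactly in standard notation (not specific to this talk): with forward model F, data d, and parameters m,

```latex
% Bayes' rule for the posterior density of the parameters m given data d:
\pi_{\text{post}}(m \mid d) \;\propto\; \pi_{\text{like}}(d \mid m)\,\pi_{\text{prior}}(m)
% With additive Gaussian noise and a Gaussian prior, the negative
% log-posterior is the regularized data misfit familiar from
% deterministic inversion:
-\log \pi_{\text{post}}(m \mid d)
  \;=\; \tfrac{1}{2}\,\bigl\| F(m) - d \bigr\|_{\Gamma_{\text{noise}}^{-1}}^{2}
  \;+\; \tfrac{1}{2}\,\bigl\| m - m_{0} \bigr\|_{\Gamma_{\text{prior}}^{-1}}^{2}
  \;+\; \text{const}
```

Each evaluation of this posterior requires a forward PDE solve, and m is a discretized field, which is exactly the two-sided challenge the abstract identifies.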

The specific problem we address here is the flow of ice from polar ice sheets such as Antarctica and Greenland, which is the primary contributor to projected sea level rise. One of the main difficulties faced in modeling ice sheet flow is the uncertain spatially-varying Robin boundary condition that describes the resistance to sliding at the base of the ice. Satellite observations of the surface ice flow velocity, along with a model of ice as a creeping incompressible shear-thinning fluid, can be used to infer this uncertain basal boundary condition. We cast this ill-posed inverse problem in the framework of Bayesian inference, which allows us to infer not only the basal sliding parameters, but also the associated uncertainty.

Fast parallel adaptive mesh refinement is crucial for uncertainty quantification, since the forward problem must be solved thousands of times (at least!) for varying input parameters (such as the basal boundary condition). AMR can reduce the cost of the forward problem by several orders of magnitude, but it must be automatic and scale well in parallel. For this purpose, we discuss the p4est library for parallel forest-of-octrees AMR, which has scaled to nearly half a million cores.

To overcome the prohibitive nature of Bayesian methods for large-scale inverse problems, we exploit the fact that, despite the large size of observational data, they typically provide only sparse information on model parameters. This implicit dimension reduction is provided by low rank approximations of the data misfit Hessian. We show results for Bayesian inversion of the basal sliding parameter field for the full Antarctic continent.
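In the Gaussian (Laplace-approximation) setting, the low-rank idea above has a compact closed form: the posterior covariance is C_post = (H_misfit + C_prior^-1)^-1, and when the data inform only r directions, a rank-r eigendecomposition of the prior-preconditioned misfit Hessian captures the whole update to the prior. The dense-matrix sketch below illustrates the formula only; it is not the authors' matrix-free, randomized implementation.

```python
import numpy as np

def posterior_covariance_lowrank(C_prior, H_misfit, rank):
    """Approximate C_post = (H_misfit + C_prior^-1)^-1 via a rank-r
    eigendecomposition of the prior-preconditioned data-misfit Hessian
    (a Sherman-Morrison-Woodbury-type update of the prior)."""
    # Symmetric square root of the prior covariance.
    w, Q = np.linalg.eigh(C_prior)
    S = Q @ np.diag(np.sqrt(w)) @ Q.T
    # Dominant eigenpairs of the prior-preconditioned misfit Hessian
    # (PSD in the Gauss-Newton setting, so eigenvalues are >= 0).
    lam, V = np.linalg.eigh(S @ H_misfit @ S)
    lam, V = lam[::-1][:rank], V[:, ::-1][:, :rank]
    # C_post ~= C_prior - S V diag(lam / (1 + lam)) V^T S:
    # directions with lam >> 1 are data-dominated and strongly shrunk,
    # directions with lam << 1 stay at the prior (truncation error ~ lam).
    D = np.diag(lam / (1.0 + lam))
    return C_prior - S @ V @ D @ V.T @ S
```

When H_misfit has exact rank r, the rank-r approximation reproduces C_post exactly; more generally the truncation error is governed by the discarded eigenvalues, which decay rapidly when the data are informative only in a few directions.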

This work is joint with Carsten Burstedde, Tobin Isaac, James Martin, Noemi Petra, and Georg Stadler.


Leif Kobbelt, RWTH Aachen University

Title: 3D Model Augmentation: From Quad Meshes to Quad Layouts

Biography: Leif Kobbelt is a university professor of Computer Science with a specialization in Computer Graphics and Geometry Processing. He is the head of the Institute for Computer Graphics and Multimedia at RWTH Aachen University.

After receiving his diploma in 1992 and his Ph.D. in 1994 in Computer Science from the Karlsruhe Institute of Technology, he worked at the University of Wisconsin-Madison, the University of Erlangen-Nuremberg, and the Max Planck Institute for Computer Science before moving to RWTH Aachen University in 2001. His research interests include 3D reconstruction, efficient geometry processing and optimization, realistic real-time rendering, and (mobile) multimedia applications. For his research he was awarded a number of academic prizes, including the Heinz Maier-Leibnitz Prize 2000, the Eurographics Outstanding Technical Contribution Award 2004, the Günter Enderle Award (in 1999 and 2012), an ERC Advanced Grant 2013, and the Gottfried Wilhelm Leibniz Prize 2014. He has been named a Fellow of the Eurographics Association (2008) and a Distinguished Professor of RWTH Aachen University (2013).

Abstract: The conversion of raw geometric data (that typically comes in the form of unstructured triangle meshes) to high quality quad meshes is an important and challenging task. The complexity of the task results from the fact that quad mesh topologies are subject to global consistency requirements which cannot be dealt with by local constructions. This is why recent quad meshing techniques formulate the mesh generation process as a global optimization problem. By adding hard and soft constraints to this optimization, many desired properties such as structural simplicity, principal direction alignment, as well as injectivity can be guaranteed by construction. An even more challenging problem is the computation of quad layouts, where a coarse segmentation of the input surface into essentially rectangular patches is sought which also satisfies global consistency and shape quality requirements. While being structurally related, both problems need to be addressed by fundamentally different approaches. In my talk I will present some of these approaches and demonstrate that they can generate high quality quad meshes and quad layouts with a high degree of automation but that they also allow the user to interactively control the results by setting boundary conditions accordingly.