TACC Research: Projects

R&D to improve technology and provide innovative tools for scientific discovery


 

Algorithms & Libraries

TACC's software provides algorithms and standard libraries that perform calculation, data processing, and automated reasoning, and TACC's programmers also create custom libraries. Most modern software systems ship libraries that implement the majority of system services; these libraries have commoditized the services a modern application requires, so most of the code a modern application runs is provided by such system libraries.

Derives Krylov methods in the FLAME framework to show how FLAME can simplify the derivation of iterative methods, and makes the case that a FLAME-based environment for deriving new iterative methods is a distinct, and attractive, possibility.
Victor Eijkhout
eijkhout@tacc.utexas.edu
A novel strategy for sparse direct factorizations that is geared towards the matrices that arise from hp adaptive Finite Element Methods.
Victor Eijkhout
eijkhout@tacc.utexas.edu
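The Krylov-method work above concerns iterative solvers such as conjugate gradient. As a rough illustration of the kind of method being derived (plain Python with NumPy, not FLAME notation, and not code from the project), here is a minimal conjugate gradient sketch; all names are illustrative.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimal conjugate gradient solver for a symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # converged
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x
```

Each iteration needs only one matrix-vector product, which is why Krylov methods suit the large sparse systems that arise in finite element work.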


Bioinformatics

Bioinformatics is the computer-assisted data management discipline that helps researchers gather, analyze, and represent information to understand life's processes in healthy and disease states, and to find new or better drugs. TACC's Life Sciences Group is committed to providing the research community with the bioinformatics computational tools and expertise needed to address modern biological research questions. The TACC team is focused on ensuring that TACC maintains the hardware, software, and domain expertise to support a wide variety of bioscience research.

To develop and operate computational resources in support of grand challenges in the life sciences.
Matthew Vaughn
vaughn@tacc.utexas.edu
This project aims to develop new methods and tools to extract biological entities from publication and other text documents.
This project will create coupled fuel, electricity, water, and air models for Texas and eastern Mexico to develop a quantitative understanding of the implications of increased natural gas usage in the Texas-Mexico border region.
Paul Navratil
pnav@tacc.utexas.edu
This NeuroNex Technology Hub for enhanced resolution three-dimensional electron microscopy (3DEM) will enable the discovery of new details in how brain synapses function across different regions and species.
An integrated cyberinfrastructure that enables investigators to deploy analysis tools and aggregate data resources.
Matthew W. Vaughn
vaughn@tacc.utexas.edu


Cloud Computing & Interface Technologies

As a complement to TACC's HPC and visualization resources, TACC offers cloud services to give researchers access to on-demand or persistent hosting of virtual servers, data sets, and gateways. TACC conducts research into cloud architectures and interfaces to improve next-generation cloud platforms.

New communication mechanisms (integrated into MVAPICH2) for Dense Many-Core (DMC) architectures.
This project researches and develops a dynamic, reconfigurable tool that bridges users and remote computing resources to support educational tools for data science. With this tool, curricula and educational materials on essential data science tools will be developed and disseminated.


Experimental Systems

An open research system to investigate the use of field-programmable gate arrays (FPGAs) as data center accelerators to improve performance, reduce power consumption, and open new avenues of research.
Discovery is a testbed for benchmarking and testing hardware. TACC uses it to experiment and evaluate new technologies.
Fabric is an experimental system that combines IBM's POWER8 processors with NVIDIA GPUs and FPGAs from Altera and Xilinx, for researchers exploring alternative computer architectures.
A system dedicated to supporting workflows using Hadoop MapReduce and other Hadoop Distributed File System based tools.
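For readers unfamiliar with the programming model behind the Hadoop system above, here is a minimal pure-Python sketch of the MapReduce word-count pattern. Hadoop distributes these same map and reduce phases across a cluster over HDFS; the names here are illustrative, not Hadoop's API.

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map step: emit a (word, 1) pair for every word in the line."""
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    """Reduce step: sum the counts for each word key."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

lines = ["the quick brown fox", "the lazy dog"]
mapped = chain.from_iterable(map_phase(line) for line in lines)
result = reduce_phase(mapped)
print(result)  # {'the': 2, 'quick': 1, 'brown': 1, 'fox': 1, 'lazy': 1, 'dog': 1}
```

In a real Hadoop job, the framework shuffles the mapped pairs so that all pairs sharing a key reach the same reducer, which is what lets the pattern scale across nodes.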


Health Informatics & Compliance

Health informatics is the study of resources and methods for the management of health information. TACC is capable of handling research with data sets containing personalized health information through HIPAA and FISMA compliance. This enables deeper connections to clinical studies and research in human health.

Create a detailed spatial-temporal molecular atlas of the developing lung, from preterm infants to young children.
Allows researchers to perform drug screening using the Autodock Vina (Scripps) program on the Lonestar5 supercomputer to search more than 700,000 drug-like compounds in only a few hours.
Brings together researchers in medicine, computation, visualization and data to deploy passive monitoring to a cohort of 1000 pregnant women in Central Texas via ubiquitous cell phone apps to develop a digital phenotype of pregnancy.
Kelly Gaither
kelly@tacc.utexas.edu
Improve the quality of clinical care and patient safety by creating visualizations that give clinicians a systems view of physiologic and pathologic processes, as well as the efficacy and confounding interactions of pharmacologic interventions.
Luis Francisco-Revilla
revilla@tacc.utexas.edu


Machine Learning & Analytics

At the intersection of statistics, computer science, and emerging industry applications, machine learning and analytics focus on developing fast, efficient algorithms for real-time data processing, with the goal of delivering accurate predictions of various kinds.

Meeting the needs of faculty and researchers for data collection services, and contributing to the potential of data-driven research to make discoveries.
Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Dan Stanzione
dan@tacc.utexas.edu
This project quantifies the impact of silent data corruption on deep learning training.
This project enables scalable and efficient I/O for distributed deep learning training in computer clusters with existing hardware/software stack.
This project enables deep learning training on supercomputer scale without losing test accuracy.
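The silent-data-corruption project above studies what happens when memory errors flip bits unnoticed during training. As a hedged illustration of the failure mode (not the project's actual methodology), this Python/NumPy sketch flips one bit of a float32 weight; a single exponent-bit flip can turn 0.5 into 1.0 or into an enormous value, while a low mantissa bit barely perturbs it.

```python
import numpy as np

def flip_bit(value, bit):
    """Flip one bit of a float32 value, mimicking a silent memory error."""
    raw = np.array(value, dtype=np.float32).view(np.uint32)
    raw ^= np.uint32(1 << bit)          # XOR toggles exactly one bit
    return float(raw.view(np.float32))

w = 0.5  # a typical weight value
print(flip_bit(w, 0))   # low mantissa bit: tiny perturbation
print(flip_bit(w, 23))  # lowest exponent bit: 0.5 becomes 1.0
print(flip_bit(w, 30))  # high exponent bit: 0.5 becomes ~1.7e38
```

The wide range of outcomes is why such studies distinguish which bit positions and which training phases actually degrade the final model.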


Next Generation Portals & Interfaces

The effective use of advanced computing resources is a challenge to researchers. Providing powerful and intuitive interfaces accelerates the rate at which science can be performed by allowing researchers to focus on scientific exploration rather than technological hurdles.

To develop and operate computational resources in support of grand challenges in the life sciences.
Matthew Vaughn
vaughn@tacc.utexas.edu
A resource-sharing Web platform that will enable computer models and simulations of natural hazards that can be validated against real-world data, creating an easily accessible resource for natural hazards researchers across the country.
Dan Stanzione
dan@tacc.utexas.edu
Allows researchers to perform drug screening using the Autodock Vina (Scripps) program on the Lonestar5 supercomputer to search more than 700,000 drug-like compounds in only a few hours.
The BOINC@TACC project integrates volunteer computing with supercomputing and cloud computing. It provides a conduit for routing High-Throughput Computing (HTC) jobs from the TACC systems to the computing resources volunteered by individuals or institutions.
An integrated cyberinfrastructure that enables investigators to deploy analysis tools and aggregate data resources.
Matthew W. Vaughn
vaughn@tacc.utexas.edu
A web interface for users to manage their TACC account, projects, and allocations.
A single virtual system that scientists can use to interactively share computing resources, data and expertise, including Replication Service, XSEDE Online Information, XSEDE User Portal, and XSEDE User Portal Mobile.
A cyberinfrastructure that captures the "whole tale" of data-driven computational research in an easy-to-use web-based environment. By capturing the input data, processing and analysis steps, and the final data products, all three can be published along with any research publication to allow others to reproduce results or expand upon others' findings directly.
The Technology Audit Service (TAS) award for XSEDE focused first on developing the XDMoD auditing framework to provide XSEDE stakeholders (the public, users, PIs, support staff, center directors, XSEDE leadership, and NSF program managers) with ready access to utilization, performance, and quality of service data for XSEDE.
Collaborative initiatives between NASA and TACC on questions of mutual interest related to large-scale computation. Additionally, TACC makes available to NASA computational resources, training, and support that augments existing NASA capabilities.


Next Generation Computing Systems

What will TACC build next? TACC leadership and staff constantly monitor, evaluate, and develop new and innovative technologies in the marketplace to determine which next-generation computer systems will be most effective for science and engineering. Projects in this category are early, experimental technology evaluations, usually in collaboration with industry partners, that examine the efficacy of future system designs. TACC maintains its own expertise in this area, creating new frontiers and new challenges for next-generation computing.

The ACELab develops and packages open source benchmarks appropriate for the HPC environment.
Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Dan Stanzione
dan@tacc.utexas.edu
Promotes efficiency among scientists using advanced computing technology, and provides new educational and research opportunities for UT Austin students and faculty in emerging energy systems.
Nathaniel Mendoza
nmendoza@tacc.utexas.edu
This project is developing an Interactive Parallelization Tool (IPT) to assist domain experts and students in efficiently parallelizing their existing C/C++/Fortran applications.
This project aims to develop new methods to process video data collected from traffic monitoring cameras to support traffic analyses, including traffic counts and vehicle-pedestrian interactions.


Research Computing Infrastructure

The Chameleon testbed is deployed at the University of Chicago (UC) and the Texas Advanced Computing Center (TACC) and consists of 650 multi-core cloud nodes and 5 PB of total disk space, leveraging a 100 Gbps connection between the sites.
Dan Stanzione
dan@tacc.utexas.edu
The goal of this project and system is to open up new possibilities in science and engineering by providing computational capability that makes it possible for investigators to tackle much larger and more complex research challenges across a wide spectrum of domains.
Dan Stanzione
dan@tacc.utexas.edu
Jetstream is a new type of computational research resource for the national (non-classified) research community: a data analysis and computational resource that scientists and engineers use interactively to conduct their research anytime, anywhere.
Matthew Vaughn
vaughn@tacc.utexas.edu
Stampede2 is a powerful new system that builds on the success of Stampede and will continue to enable groundbreaking science.
Dan Stanzione
dan@tacc.utexas.edu
This award is to fund a transformational data intensive resource for the open science community called Wrangler.
Dan Stanzione
dan@tacc.utexas.edu
The goal of XSEDE is to accelerate open scientific discovery by enhancing the productivity and capability of researchers, engineers, and scholars, and by broadening their participation in science and engineering.
Kelly Gaither
kelly@tacc.utexas.edu


Software Defined Visualization

Software-defined visualization libraries provide performant rendering on general-purpose processors, including both traditional rasterization methods and ray tracing methods that can provide improved visual fidelity and expanded opportunities for simulation integration. By removing the dependence on particular hardware for performant rendering, these libraries enable visualization capabilities for both in situ and post-process analysis across the various hardware architectures found in high performance computing.
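At their core, the ray tracing methods mentioned above reduce to ray-primitive intersection tests run on general-purpose processors. As a minimal illustration of such a test (not the API of any TACC library), here is a ray-sphere intersection sketch in Python; all names are illustrative.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to its nearest sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic discriminant; a == 1 for a unit ray
    if disc < 0:
        return None  # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray from the origin along +z hits a unit sphere at z = 5 on its near side.
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Production ray tracers batch millions of such tests against acceleration structures, which is where scheduling of ray state and data accesses becomes the performance question.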

This project develops a dynamic ray scheduling algorithm that effectively manages both ray state and data accesses.
Paul Navrátil
pnav@tacc.utexas.edu
This project aims to create efficient shadowgraph and Schlieren visualizations of simulated airflow leveraging commodity hardware and open-source rendering platforms.
Paul Navrátil
pnav@tacc.utexas.edu
This project investigates making visual analysis of data more effective by using the time-tested methods visual artists employ to investigate, reframe, and generally make sense of their world. Imagine if data-intensive computing could be a more innately human and creative process, like sculpting (visual, tangible, active, physical).
To reduce the overall cost of the visual analysis of large-scale simulation and observational data by replacing brute-force techniques with data sampling based on the actual requirements and capabilities of the user.
