TACC Research: Projects

R&D to improve technology and provide innovative tools for scientific discovery


 

Algorithms & Libraries

TACC's software provides algorithms and standard libraries that perform calculation, data processing, and automated reasoning, and TACC's programmers also create custom libraries. Most modern software systems ship libraries that implement the majority of system services; these libraries have commoditized the services a modern application requires, so most of the code a modern application runs is provided by these system libraries.

Derives Krylov methods in the FLAME framework to show how FLAME can simplify the derivation of iterative methods, and makes the case that a FLAME-based environment for deriving new iterative methods is a distinct and attractive possibility (see the sketch after this list).
A novel strategy for sparse direct factorizations geared towards the matrices that arise from hp-adaptive finite element methods.
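
As a hedged illustration of the kind of Krylov iterative method such derivations target, the following minimal conjugate gradient solver is written in Python/NumPy. The test system and tolerance are hypothetical, and FLAME derives loops like this systematically rather than by hand; this is only the textbook algorithm:

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
        # Solve A x = b for symmetric positive-definite A via the
        # conjugate gradient method, a classic Krylov-subspace iteration.
        x = np.zeros_like(b)
        r = b - A @ x                      # residual
        p = r.copy()                       # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)      # step length along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:      # converged
                break
            p = r + (rs_new / rs_old) * p  # next conjugate direction
            rs_old = rs_new
        return x

    # Small SPD example (hypothetical data): exact answer is [1/11, 7/11]
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))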


Bioinformatics

Bioinformatics is the computer-assisted data management discipline that helps researchers gather, analyze, and represent information in order to understand life's processes in healthy and diseased states and to find new or better drugs. TACC's Life Sciences Group is committed to providing the research community with the bioinformatics computational tools and expertise needed to address modern biological research questions. The TACC team is focused on ensuring that TACC maintains the hardware, software, and domain expertise to support a wide variety of bioscience research.

An open-access online community resource for Arabidopsis model plant research.
Matthew Vaughn
vaughn@tacc.utexas.edu
To develop and operate computational resources in support of grand challenges in the life sciences.
Matthew Vaughn
vaughn@tacc.utexas.edu
This project aims to develop new methods and tools to extract biological entities from publications and other text documents.
This project will create coupled fuel, electricity, water, and air models for Texas and eastern Mexico to develop a quantitative understanding of the implications of increased natural gas usage in the Texas-Mexico border region.
Paul Navratil
pnav@tacc.utexas.edu
The project aims to use our knowledge of DNA replication to characterize epigenome dynamics during S phase in both Arabidopsis and maize and provide the basis for a deeper understanding of epigenetic regulation.
Matthew W. Vaughn
vaughn@tacc.utexas.edu
This NeuroNex Technology Hub for enhanced resolution three-dimensional electron microscopy (3DEM) will enable the discovery of new details in how brain synapses function across different regions and species.
An integrated cyberinfrastructure that enables investigators to deploy analysis tools and aggregate data resources.
Matthew W. Vaughn
vaughn@tacc.utexas.edu


Cloud Computing & Interface Technologies

As a complement to TACC's HPC and visualization resources, TACC offers cloud services that give researchers access to on-demand or persistent hosting of virtual servers, data sets, and gateways. TACC conducts research into cloud architectures and interfaces to improve next-generation cloud platforms.

New communication mechanisms (integrated into MVAPICH2) for Dense Many-Core (DMC) architectures; an illustrative MPI sketch follows this list.
This project covers the research and development of a dynamic, reconfigurable tool that bridges users and remote computing resources to support education in data science. With this tool, curricula and educational materials on essential data science tools will be developed and disseminated.
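
The communication mechanisms above live inside the MPI library itself; as a hedged illustration of the kind of collective operation they accelerate on many-core nodes, here is a generic allreduce written with mpi4py (mpi4py is an assumption made for illustration only; the project itself targets MVAPICH2 internals in C):

    # Run with, e.g.: mpirun -np 4 python allreduce_demo.py
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each rank contributes a local buffer; Allreduce sums across ranks
    # and delivers the result to every rank.
    local = np.full(4, rank, dtype='d')
    total = np.empty(4, dtype='d')
    comm.Allreduce(local, total, op=MPI.SUM)

    if rank == 0:
        print("sum across ranks:", total)   # [6. 6. 6. 6.] with 4 ranks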


Experimental Systems

An open research system to investigate the use of field-programmable gate arrays (FPGAs) as data center accelerators to improve performance, reduce power consumption, and open new avenues of investigation.
Discovery is a testbed for benchmarking and testing hardware. TACC uses it to experiment and evaluate new technologies.
Fabric is an experimental system combining IBM's latest POWER8 processors with NVIDIA GPUs, and FPGAs from Altera and Xilinx, for researchers exploring alternative computer architectures.
A system dedicated to supporting workflows using Hadoop MapReduce and other tools based on the Hadoop Distributed File System; a minimal MapReduce sketch follows this list.
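
As a hedged sketch of the MapReduce pattern such a system supports, here is the canonical word-count pair of Hadoop Streaming scripts in Python (the file names and job submission are illustrative, not a TACC-specific workflow):

    # mapper.py -- emit (word, 1) for every word read from stdin
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

    # reducer.py -- Hadoop sorts mapper output by key, so identical
    # words arrive on consecutive lines and can be summed in one pass
    import sys

    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

Both scripts would be submitted through the Hadoop Streaming jar (passing them as the -mapper and -reducer arguments along with input and output paths); the jar's location varies by installation.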


Health Informatics & Compliance

Health informatics is the study of resources and methods for the management of health information. TACC is capable of handling research with data sets containing protected health information through HIPAA and FISMA compliance. This enables deeper connections to clinical studies and research in human health.

Creates a detailed spatiotemporal molecular atlas of the developing lung, from preterm infants to young children.
Allows researchers to perform drug screening using the AutoDock Vina (Scripps) program on the Lonestar4 supercomputer to search more than 700,000 drug-like compounds in only a few hours; a batch-screening sketch follows this list.
Improves the quality of clinical care and patient safety by creating visualizations that give clinicians a systems view of physiologic and pathologic processes, as well as of the efficacy and confounding interactions of pharmacologic interventions.
Luis Francisco-Revilla
revilla@tacc.utexas.edu
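
As a hedged sketch of how the drug-screening service above might drive AutoDock Vina in batch, the script below docks every ligand in a directory against one receptor. The file names and search-box values are hypothetical, and a real run on the supercomputer would distribute these invocations across many nodes rather than loop serially:

    # screen.py -- dock each prepared ligand against a single receptor
    import glob
    import subprocess

    RECEPTOR = "receptor.pdbqt"            # hypothetical prepared target

    for ligand in sorted(glob.glob("ligands/*.pdbqt")):
        out = ligand.replace("ligands/", "docked/")
        # Vina takes a receptor, a ligand, and a docking search box
        subprocess.run([
            "vina",
            "--receptor", RECEPTOR,
            "--ligand", ligand,
            "--center_x", "0", "--center_y", "0", "--center_z", "0",
            "--size_x", "20", "--size_y", "20", "--size_z", "20",
            "--out", out,
        ], check=True)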


Machine Learning & Analytics

At the intersection of statistics, computer science, and emerging applications in industry, machine learning and analytics focus on developing fast, efficient algorithms for real-time processing of data, with the main goal of delivering accurate predictions of various kinds.
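
As a minimal sketch of what real-time processing can mean in practice, the snippet below fits a linear model by stochastic gradient descent, updating its weights after every incoming observation; the synthetic data stream and learning rate are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    w = np.zeros(3)               # model weights, updated online
    lr = 0.01                     # learning rate (illustrative)

    # Simulated stream: y = x . [1, -2, 3] plus a little noise
    for _ in range(10_000):
        x = rng.normal(size=3)
        y = x @ np.array([1.0, -2.0, 3.0]) + rng.normal(scale=0.1)
        err = x @ w - y           # prediction error on this one example
        w -= lr * err * x         # one SGD step on 0.5 * err**2

    print(w)                      # approaches [1, -2, 3]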

Meeting the needs of faculty and researchers for data collection services, and contributing to the potential of data-driven research to make discoveries.
Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Dan Stanzione
dan@tacc.utexas.edu


Next Generation Portals & Interfaces

The effective use of advanced computing resources is a challenge for researchers. Providing powerful and intuitive interfaces accelerates the rate at which science can be performed by allowing researchers to focus on scientific exploration rather than technological hurdles.

An open-access online community resource for Arabidopsis model plant research.
Matthew Vaughn
vaughn@tacc.utexas.edu
To develop and operate computational resources in support of grand challenges in the life sciences.
Matthew Vaughn
vaughn@tacc.utexas.edu
A resource-sharing Web platform that will enable computer models and simulations of natural hazards that can be validated against real-world data, creating an easily accessible resource for natural hazards researchers across the country.
Dan Stanzione
dan@tacc.utexas.edu
A sustainable, open, and easy-to-use repository that organizes images and related experimental measurements of diverse porous materials, improves access to porous media analysis results for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis, and enhances productivity, scientific inquiry, and engineering decisions founded on a data-driven basis.
Maria Esteva
maria@tacc.utexas.edu
Allows researchers to perform drug screening using the AutoDock Vina (Scripps) program on the Lonestar4 supercomputer to search more than 700,000 drug-like compounds in only a few hours.
An integrated cyberinfrastructure that enables investigators to deploy analysis tools and aggregate data resources.
Matthew W. Vaughn
vaughn@tacc.utexas.edu
A web interface for users to manage their TACC account, projects, and allocations.
A single virtual system that scientists can use to interactively share computing resources, data and expertise, including Replication Service, XSEDE Online Information, XSEDE User Portal, and XSEDE User Portal Mobile.
A cyberinfrastructure that captures the "whole tale" of data-driven computational research in an easy-to-use, web-based environment. By capturing the input data, the processing and analysis steps, and the final data products, all three can be published along with any research publication, allowing others to reproduce results or build directly upon others' findings.
The Technology Audit Service (TAS) award for XSEDE focused first on developing the XDMoD auditing framework to provide XSEDE stakeholders (the public, users, PIs, support staff, center directors, XSEDE leadership, and NSF program managers) with ready access to utilization, performance, and quality-of-service data for XSEDE.
Collaborative initiatives between NASA and TACC on questions of mutual interest related to large-scale computation. Additionally, TACC makes available to NASA computational resources, training, and support that augment existing NASA capabilities.


Next Generation Computing Systems

What will TACC build next? TACC leadership and staff constantly monitor, evaluate, and develop new and innovative technologies in the marketplace to determine which next-generation computer systems will be most effective for science and engineering. Projects in this category are early, experimental technology evaluations, usually in collaboration with industry partners, that examine the efficacy of future system designs. TACC maintains its own expertise in this area, creating new frontiers and new challenges for next-generation computing.

The ACELab develops and packages open source benchmarks appropriate for the HPC environment.
Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Dan Stanzione
dan@tacc.utexas.edu
Developing new materials to teach the scientific community how to adapt their codes for the Intel MIC Architecture.
Promotes efficiency among scientists using advanced computing technology, and provides new educational and research opportunities for UT Austin students and faculty in emerging energy systems.
Nathaniel Mendoza
nmendoza@tacc.utexas.edu
This project is developing an Interactive Parallelization Tool (IPT) to assist domain experts and students in efficiently parallelizing their existing C/C++/Fortran applications.
This project aims to develop new methods for processing video data collected from traffic-monitoring cameras to support traffic analyses, including traffic counts and vehicle-pedestrian interactions (a minimal sketch follows this list).
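
A minimal sketch of the first stage such a pipeline might use: background subtraction with OpenCV to isolate moving objects in a traffic clip. The input file and motion threshold are hypothetical, and real traffic counts would require tracking and classification on top of this:

    # count_motion.py -- flag frames containing moving objects
    import cv2

    cap = cv2.VideoCapture("traffic.mp4")          # hypothetical clip
    subtractor = cv2.createBackgroundSubtractorMOG2()

    moving_frames = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)             # foreground mask
        if cv2.countNonZero(mask) > 500:           # illustrative threshold
            moving_frames += 1

    cap.release()
    print("frames with motion:", moving_frames)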


Software Defined Visualization

Software-defined visualization libraries provide performant rendering on general-purpose processors, including both traditional rasterization methods and ray tracing methods that can provide improved visual fidelity and expanded opportunities for simulation integration. By removing the dependence on particular hardware for performant rendering, these libraries enable visualization capabilities for both in situ and post-process analysis across the various hardware architectures found in high performance computing.
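
To make the distinction concrete, the core operation of any ray tracer is an intersection test; below is a minimal ray-sphere version in Python/NumPy. Production software-defined visualization libraries implement this with vectorized kernels and acceleration structures, so this is purely a didactic sketch:

    import numpy as np

    def ray_sphere_hit(origin, direction, center, radius):
        # Distance to the nearest intersection, or None on a miss.
        # direction is assumed to be unit length.
        oc = origin - center
        b = 2.0 * np.dot(direction, oc)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0:
            return None                    # ray misses the sphere
        t = (-b - np.sqrt(disc)) / 2.0
        return t if t > 0 else None

    d = np.array([0.0, 0.0, -1.0])         # unit-length view ray
    hit = ray_sphere_hit(np.zeros(3), d, np.array([0.0, 0.0, -5.0]), 1.0)
    print(hit)                             # 4.0: sphere surface at z = -4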

This project develops a dynamic ray scheduling algorithm that effectively manages both ray state and data accesses.
Paul Navrátil
pnav@tacc.utexas.edu
This project aims to create efficient shadowgraph and Schlieren visualizations of simulated airflow, leveraging commodity hardware and open-source rendering platforms (see the sketch after this list).
Paul Navrátil
pnav@tacc.utexas.edu
This project investigates making visual analysis of data more effective by using the time-tested methods visual artists employ to investigate, reframe, and generally make sense of their world. Imagine if data-intensive computing could be a more innately human and creative process, like sculpting (visual, tangible, active, physical).
To reduce the overall cost of the visual analysis of large-scale simulation and observational data by replacing brute-force techniques with data sampling based on the actual requirements and capabilities of the user.
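
As a hedged sketch of the image-processing core behind the Schlieren project above: a schlieren image visualizes the gradient of the density field, while a shadowgraph responds to its second derivative. The density field below is synthetic stand-in data, not simulation output:

    import numpy as np

    # Synthetic 2-D density field standing in for simulated airflow
    y, x = np.mgrid[0:256, 0:256]
    density = np.exp(-((x - 128.0) ** 2 + (y - 128.0) ** 2) / 800.0)

    # Schlieren-style image: magnitude of the density gradient
    dy, dx = np.gradient(density)
    schlieren = np.hypot(dx, dy)

    # Shadowgraph-style image: Laplacian (second derivative) of density
    shadowgraph = np.abs(np.gradient(dy, axis=0) + np.gradient(dx, axis=1))

    print(schlieren.max(), shadowgraph.max())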
