Algorithms & Libraries
TACC's software offerings provide algorithms and standard libraries that perform calculation, data processing, and automated reasoning, and TACC's programmers also create custom libraries. Most modern software systems provide libraries that implement the majority of system services; these libraries have commoditized the services a modern application requires, so most of the code used by modern applications lives in system libraries.
Derives Krylov methods in the FLAME framework to show how FLAME can simplify the process of deriving iterative methods, and makes the case that a FLAME-based environment for deriving new iterative methods is a distinct, and attractive, possibility. (A sketch of one such iterative method appears below.)
Funding source(s): NSF 0904907

A novel strategy for sparse direct factorizations that is geared toward the matrices that arise from hp-adaptive Finite Element Methods.
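Krylov methods approximate the solution of a linear system Ax = b using only repeated applications of A, building the answer from the subspace spanned by b, Ab, A²b, and so on. As a hedged illustration of the kind of iterative method such a derivation framework targets (plain NumPy here, not FLAME's notation or APIs), a minimal conjugate gradient solver:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimal CG for a symmetric positive definite A (illustration only)."""
    x = np.zeros_like(b)
    r = b - A @ x              # residual of the current iterate
    p = r.copy()               # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # residual small enough: converged
            break
        p = r + (rs_new / rs) * p      # next direction, A-conjugate to p
        rs = rs_new
    return x

# Tiny demonstration on a 2x2 SPD system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))        # ~ [0.0909, 0.6364]
```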
Bioinformatics
Bioinformatics is the computer-assisted data management discipline that helps researchers gather, analyze, and represent information to understand life's processes in health and disease, and to find new or better drugs. TACC's Life Sciences Group is committed to providing the research community with the bioinformatics computational tools and expertise needed to address modern biological research questions. The team is focused on ensuring that TACC maintains the hardware, software, and domain expertise to support a wide variety of bioscience research.
To develop and operate computational resources in support of grand challenges in the life sciences.

This project aims to develop new methods and tools to extract biological entities from publications and other text documents. (A generic named-entity sketch appears below.)

This project will create coupled fuel, electricity, water, and air models for Texas and eastern Mexico to develop a quantitative understanding of the implications of increased natural gas usage in the Texas-Mexico border region.

This NeuroNex Technology Hub for enhanced-resolution three-dimensional electron microscopy (3DEM) will enable the discovery of new details in how brain synapses function across different regions and species.

An integrated cyberinfrastructure that enables investigators to deploy analysis tools and aggregate data resources.
Funding source(s): DARPA, HR001117S0003
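Extracting biological entities from text is a named-entity recognition (NER) task. A hedged sketch using spaCy's general-purpose English model (the project's own methods and models are not described here; a biomedical model, e.g. from scispaCy, would be a more realistic substitute):

```python
import spacy

# General-purpose English pipeline (install with:
#   pip install spacy && python -m spacy download en_core_web_sm).
nlp = spacy.load("en_core_web_sm")

text = ("BRCA1 mutations increase the risk of breast cancer, "
        "and tamoxifen is commonly used in treatment.")

doc = nlp(text)
for ent in doc.ents:
    # A stock model emits generic labels (ORG, PERSON, ...); a biomedical
    # model would emit types such as gene/protein or chemical instead.
    print(ent.text, ent.label_)
```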
Cloud Computing & Interface Technologies
As a complement to TACC's HPC and visualization resources, TACC offers cloud services that give researchers access to on-demand or persistent hosting of virtual servers, data sets, and gateways. TACC conducts research into cloud architectures and interfaces to improve next-generation cloud platforms.
New communication mechanisms (integrated into MVAPICH2) for Dense Many-Core (DMC) architectures. (A minimal MPI exchange illustrating the interface appears below.)
Funding source(s): National Science Foundation

This project aims to research and develop a dynamic, reconfigurable tool that bridges users and remote computing resources to facilitate educational tools for data science. With this tool, curricula and educational materials on essential data science tools will be developed and disseminated.
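MVAPICH2 is an implementation of the MPI standard, so applications reach these communication mechanisms through ordinary MPI calls. A hedged sketch of a point-to-point exchange using mpi4py (generic MPI usage, not the project's DMC-specific internals):

```python
from mpi4py import MPI  # mpi4py runs atop an MPI library such as MVAPICH2

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    payload = {"values": list(range(8))}
    comm.send(payload, dest=1, tag=11)      # blocking point-to-point send
elif rank == 1:
    payload = comm.recv(source=0, tag=11)   # matching blocking receive
    print(f"rank 1 received: {payload}")
```

Launched with, e.g., `mpirun -np 2 python exchange.py`; the MPI library underneath decides how the bytes actually move between cores and nodes, which is the layer this project targets.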
Experimental Systems
An open research system to investigate the use of field-programmable gate arrays (FPGAs) as data center accelerators to improve performance, reduce power consumption, and open new avenues of investigation.

Discovery is a testbed for benchmarking and testing hardware. TACC uses it to experiment with and evaluate new technologies.

Fabric is an experimental system combining IBM's latest POWER8 processors with NVIDIA GPUs, and FPGAs from Altera and Xilinx, for researchers exploring alternate computer architectures.

A system dedicated to supporting workflows using Hadoop MapReduce and other tools based on the Hadoop Distributed File System. (A minimal MapReduce sketch appears below.)
Funding source(s): O'Donnell Foundation
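Hadoop MapReduce expresses a computation as a map phase that emits key/value pairs and a reduce phase that aggregates the values for each key. A hedged sketch of the classic word count in the Hadoop Streaming style, where the mapper and reducer are ordinary programs reading stdin (the script name and invocation are illustrative):

```python
import sys

def mapper():
    # Map phase: emit "word<TAB>1" for every word on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Reduce phase: Streaming sorts mapper output by key, so all counts
    # for a word arrive contiguously and one accumulator suffices.
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    # Hypothetical Hadoop Streaming usage:
    #   -mapper "python wordcount.py map" -reducer "python wordcount.py reduce"
    mapper() if sys.argv[1] == "map" else reducer()
```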
Health Informatics & Compliance
Health informatics is the study of resources and methods for the management of health information. TACC can support research involving data sets that contain protected health information through HIPAA and FISMA compliance, enabling deeper connections to clinical studies and research in human health.
Create a detailed spatial-temporal molecular atlas of the developing lung, from preterm infants to young children.

Allows researchers to perform drug screening using the AutoDock Vina (Scripps) program on the Lonestar5 supercomputer to search more than 700,000 drug-like compounds in only a few hours. (A minimal screening sketch appears below.)
Funding source(s): Free

Brings together researchers in medicine, computation, visualization, and data to deploy passive monitoring to a cohort of 1,000 pregnant women in Central Texas via ubiquitous cell phone apps, developing a digital phenotype of pregnancy.

Improves the quality of clinical care and patient safety by creating visualizations that give clinicians a systems view of physiologic and pathologic processes, as well as the efficacy and confounding interactions of pharmacologic interventions.
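AutoDock Vina is driven from the command line with a receptor, a ligand, and a search box. A hedged sketch of screening a set of ligands by invoking the vina binary from Python (file names, box coordinates, and directory layout are illustrative; the actual Lonestar5 workflow distributes the library across many nodes):

```python
import subprocess
from pathlib import Path

RECEPTOR = "receptor.pdbqt"        # hypothetical prepared receptor
LIGAND_DIR = Path("ligands")       # hypothetical library of .pdbqt ligands
OUT_DIR = Path("docked")
OUT_DIR.mkdir(exist_ok=True)

for ligand in sorted(LIGAND_DIR.glob("*.pdbqt")):
    out = OUT_DIR / ligand.name
    # Standard Vina flags: receptor/ligand inputs plus a docking box.
    subprocess.run(
        ["vina",
         "--receptor", RECEPTOR,
         "--ligand", str(ligand),
         "--center_x", "10", "--center_y", "12", "--center_z", "-3",
         "--size_x", "20", "--size_y", "20", "--size_z", "20",
         "--out", str(out)],
        check=True,
    )
```

Each invocation is independent of the others, which is what lets a 700,000-compound screen spread cleanly across a supercomputer's nodes.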
Machine Learning & Analytics
Sitting at the intersection of statistics, computer science, and emerging applications in industry, machine learning and analytics focus on developing fast, efficient algorithms for real-time processing of data, with the main goal of delivering accurate predictions of various kinds.
Meeting the needs of faculty and researchers for data collection services, and contributing to the potential of data-driven research to make discoveries.

Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Funding source(s): O'Donnell Foundation

This project quantifies the impact of silent data corruption on deep learning training. (A generic detection sketch appears below.)
Funding source(s): Base Funding

This project enables scalable and efficient I/O for distributed deep learning training on computer clusters with the existing hardware/software stack.
Funding source(s): Base Funding

This project enables deep learning training at supercomputer scale without losing test accuracy.
Funding source(s): Base Funding
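Silent data corruption means a hardware fault flips bits without raising an error. One classic way to catch it, shown here purely as illustration and not as this project's methodology, is redundant execution with checksum comparison on a deterministic computation:

```python
import hashlib
import numpy as np

def checksum(arr: np.ndarray) -> str:
    # Content hash of a tensor; any single flipped bit changes the digest.
    return hashlib.sha256(arr.tobytes()).hexdigest()

def corrupted(weights: np.ndarray, grad_fn, runs: int = 2) -> bool:
    """Run the same deterministic gradient computation several times and
    compare checksums; differing digests indicate silent corruption."""
    digests = {checksum(grad_fn(weights)) for _ in range(runs)}
    return len(digests) > 1

# Toy example: gradient of mean squared error for a linear model y = Xw.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(64, 8)), rng.normal(size=64)
grad = lambda w: 2.0 * X.T @ (X @ w - y) / len(y)
print(corrupted(rng.normal(size=8), grad))  # False on healthy hardware
```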
Next Generation Portals & Interfaces
Effective use of advanced computing resources is a challenge for researchers. Powerful and intuitive interfaces accelerate the rate at which science can be performed by allowing researchers to focus on scientific exploration rather than technological hurdles.
The BOINC@TACC project integrates volunteer computing with supercomputing and cloud computing. It provides a conduit for routing High-Throughput Computing (HTC) jobs from TACC systems to the computing resources volunteered by individuals or institutions.

To develop and operate computational resources in support of grand challenges in the life sciences.

A resource-sharing web platform that will enable computer models and simulations of natural hazards to be validated against real-world data, creating an easily accessible resource for natural hazards researchers across the country.

Allows researchers to perform drug screening using the AutoDock Vina (Scripps) program on the Lonestar5 supercomputer to search more than 700,000 drug-like compounds in only a few hours.
Funding source(s): Free

An integrated cyberinfrastructure that enables investigators to deploy analysis tools and aggregate data resources.
Funding source(s): DARPA, HR001117S0003

A web interface for users to manage their TACC account, projects, and allocations.
Funding source(s): Base Funding

A cyberinfrastructure that captures the "whole tale" of data-driven computational research in an easy-to-use web-based environment. By capturing the input data, the processing and analysis steps, and the final data products, all three can be published along with any research publication, allowing others to reproduce the results or build directly on the findings.
Next Generation Computing Systems
What will TACC build next? TACC leadership and staff constantly monitor, evaluate, and develop new and innovative technologies in the marketplace to determine which next-generation computer systems will be most effective for science and engineering. Projects in this category are early, experimental technology evaluations, usually in collaboration with industry partners, that examine the efficacy of future system designs. TACC maintains its own expertise in this area, creating new frontiers and new challenges for next-generation computing.
The ACELab develops and packages open-source benchmarks appropriate for the HPC environment.
Funding source(s): Base Funding

Simplifying data management for administrators and researchers while reducing the overall cost of next-generation storage systems.
Funding source(s): O'Donnell Foundation

Promotes efficiency among scientists using advanced computing technology, and provides new educational and research opportunities for UT Austin students and faculty in emerging energy systems.
Funding source(s): NEDO, NTT

This project is developing an Interactive Parallelization Tool (IPT) that assists domain experts and students in efficiently parallelizing their existing C/C++/Fortran applications.
Funding source(s): NSF

This project aims to develop new methods to process video data collected from traffic monitoring cameras to support traffic analyses, including traffic counts and vehicle-pedestrian interactions. (A generic sketch appears below.)
Funding source(s): City of Austin
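As a hedged sketch of the kind of processing involved, here is generic background subtraction with OpenCV, not the project's actual pipeline (the video path and the blob-area threshold are illustrative, and counting blobs per frame counts repeated sightings, not unique vehicles):

```python
import cv2

cap = cv2.VideoCapture("intersection.mp4")   # hypothetical camera footage
subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                detectShadows=True)

frames, detections = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)           # foreground = moving objects
    # Shadows are marked gray (127); keep only confident foreground.
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Count foreground blobs large enough to be vehicles (guessed threshold).
    detections += sum(1 for c in contours if cv2.contourArea(c) > 1500)
    frames += 1
cap.release()
print(f"{detections} vehicle-sized detections across {frames} frames")
```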
Research Computing Infrastructure
The Chameleon testbed is deployed at the University of Chicago (UC) and the Texas Advanced Computing Center (TACC) and consists of 650 multi-core cloud nodes and 5 PB of total disk space, leveraging a 100 Gbps connection between the sites.

The goal of this project and system is to open up new possibilities in science and engineering by providing computational capability that makes it possible for investigators to tackle much larger and more complex research challenges across a wide spectrum of domains.

Jetstream is a new type of computational research resource for the national (non-classified) research community: a data analysis and computational resource that scientists and engineers use interactively to conduct their research anytime, anywhere.

Stampede 2 offers a powerful new system that builds on the success of Stampede and will continue to enable groundbreaking science.

This award funds a transformational data-intensive resource for the open science community called Wrangler.
Software Defined Visualization
Software-defined visualization libraries provide performant rendering on general-purpose processors, supporting both traditional rasterization and ray tracing methods that can deliver improved visual fidelity and expanded opportunities for simulation integration. By removing the dependence on particular hardware for performant rendering, these libraries enable visualization for both in situ and post-process analysis across the various hardware architectures found in high performance computing.
This project develops a dynamic ray scheduling algorithm that effectively manages both ray state and data accesses. (A much-simplified queueing sketch appears below.)
Funding source(s): Longhorn Supercomputer

This project aims to create efficient shadowgraph and Schlieren visualizations of simulated airflow leveraging commodity hardware and open-source rendering platforms.
Funding source(s): US Air Force SBIR Topic AF151-182

This project investigates making visual analysis of data more effective by using the time-tested methods visual artists employ to investigate, reframe, and generally make sense of their world. Imagine if data-intensive computing could be a more innately human and creative process, like sculpting (visual, tangible, active, physical).
Funding source(s): National Science Foundation

To reduce the overall cost of the visual analysis of large-scale simulation and observational data by replacing brute-force techniques with data sampling based on the actual requirements and capabilities of the user.
Funding source(s): Department of Energy Office of Advanced Scientific Computing Research
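In data-parallel ray tracers, rays are commonly queued against the data domains they need next, so each domain is loaded once and the cost is amortized over many rays. A hedged, much-simplified sketch of that queueing idea (toy 1-D domain decomposition and trace function; not the project's actual algorithm):

```python
from collections import defaultdict

def domain_of(ray):
    # Toy 1-D decomposition: map a ray to the data block it needs next.
    return int(ray["x"] // 10)

def schedule(rays, trace):
    queues = defaultdict(list)
    for ray in rays:
        queues[domain_of(ray)].append(ray)
    results = []
    while queues:
        # Serve the deepest queue so one domain load serves many rays.
        dom = max(queues, key=lambda d: len(queues[d]))
        for ray in queues.pop(dom):
            nxt = trace(ray, dom)     # returns a continuation ray or None
            if nxt is None:
                results.append(ray)   # ray terminated in this domain
            else:
                queues[domain_of(nxt)].append(nxt)
    return results

# Toy demo: rays advance +10 in x and terminate past x = 30.
rays = [{"x": float(x)} for x in range(0, 30, 5)]
step = lambda ray, dom: None if ray["x"] >= 30 else {"x": ray["x"] + 10}
print(len(schedule(rays, step)), "rays completed")
```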