
Office of Cyberinfrastructure (OCI) - #47.080

In fiscal year 2006, 130 proposals were received and 42 awards were made. In fiscal year 2007, 307 proposals were received and 70 awards were made. In fiscal year 2008, approximately 320 proposals are expected to be received and 71 awards made.
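For context, the counts above imply year-over-year funding rates that can be checked directly. A quick sketch using the figures as given (the FY 2008 numbers are estimates, per the text):

```python
# Funding rates implied by the proposal and award counts above.
counts = {
    2006: (130, 42),   # proposals received, awards made
    2007: (307, 70),
    2008: (320, 71),   # FY 2008 figures are estimates
}

for year, (proposals, awards) in counts.items():
    rate = awards / proposals
    print(f"FY {year}: {awards}/{proposals} = {rate:.0%}")
```

The rate falls from roughly 32 percent in FY 2006 to the low twenties as proposal volume more than doubles.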




1) The OptIPuter: Essentially, the OptIPuter is a 'virtual' parallel computer in which the individual 'processors' are widely distributed clusters; the 'backplane' is provided by IP delivered over multiple dedicated lambdas (each 1-10 Gbps); and the 'mass storage systems' are large distributed scientific data repositories, fed in near real-time by scientific instruments operating as OptIPuter peripheral devices. Furthermore, collaboration will be a defining OptIPuter characteristic; goals include implementing a next-generation Access Grid with optical multicast, enabling tiled stereo HDTV screens of reality-matching visual resolution. The OptIPuter is an embodiment of the 'hollowing out of the computer' prophesied in the mid-1990s: the old 'computer-in-a-box' is being blown up and scattered across the Net. The OptIPuter's fundamental inventions include software and middleware abstractions that deliver unique capabilities in a lambda-rich world, one in which endpoint-delivered bandwidth is greater than individual computers can saturate. The OptIPuter project explores a new architecture for the distributed information infrastructure (which NSF terms 'infostructure') required by a number of this decade's shared science and engineering facilities. The project is driven by a close collaboration with leaders of two of these community systems, NSF's EarthScope and NIH's Biomedical Imaging Research Network (BIRN), both of which are beginning to produce an accelerating flood of data that will be stored in distributed, federated data repositories. One characteristic blocking such science is that the individual data objects (a 3D brain image or a terrain dataset) are large (gigabytes) compared to what can be interactively manipulated or visualized over today's networks.
What these scientists require are ultra-high-speed, predictable 'clear-channel' networks linking PC clusters, storage and visualization systems, enabling collaborating scientists to interactively explore massive amounts of previously uncorrelated data. An important opportunity exists over the next few years to develop a radical new architecture for this needed scientific infostructure. 2) Engaging People in Cyberinfrastructure (EPIC): The TeacherTECH program reached 540 K-12 educators from February 2005 to December 2005 through monthly and summer workshops in science, technology, engineering and math. These educators came from all of San Diego County, urban and rural, from the tip of the county all the way to the border with Mexico. Seventy educators were reached last year. Workshops include TeacherTECH Technology Tools, TeacherTECH Advanced Technology Tools, the TeacherTECH Science Series and the TeacherTECH Math Series. Many of the TeacherTECH Science Series workshop presentations are archived on the TeacherTECH site (http://education.sdsc.edu/teachertech) for viewing by teachers across the nation. Ten thousand people have visited the new TeacherTECH website since its inception in May 2005. The site includes standards-based curriculum submitted by TeacherTECH participants, science lectures, scientific visualizations, notes, resources and much more, and has been praised by community partners and educators alike. 3) Tornadogenesis within a simulated supercell storm: Using the Advanced Regional Prediction System model and supercomputing facilities, researchers simulated a realistic supercell storm, successfully reproducing the vorticity and other features of the most intense tornado ever simulated. The use of a uniform-resolution grid large enough to contain the entire parent storm is a first, and eliminates the uncertainties of artificial human control associated with nested-grid simulations or simulations using horizontally stretched grids.
The peak wind speed, reaching 120 meters per second, places the tornado within the F5 intensity scale of Fujita. Atmospheric data from this simulation will be used to develop algorithms for a new system of cell-phone-tower-based Doppler radar, called CASA (Center for Collaborative Adaptive Sensing of the Atmosphere), projected to reduce tornado false-alarm forecasts from the current 75 percent to 25 percent. 4) The Pacific Rim Application and Grid Middleware Assembly (PRAGMA): An intensive international effort involving researchers from more than ten institutions to improve grid-based applications, collaboration, discourse and the exchange of scientific personnel among Pacific Rim institutions. It builds on an initial series of workshops that assessed the state of grid-based infrastructure, evaluated a collection of scientific applications and provided a recurring venue to nurture sustainable collaborations across the Pacific Rim. The original award provided funds to host the first workshop and to allow both computing-infrastructure specialists and applications scientists to attend follow-on workshops. This ongoing effort created the initial impetus to move forward in both building long-term international collaborations and creating a venue for applications scientists to effectively understand and use grid-based systems. 5) Keeping Condor Flight Worthy: This proposal outlines an effort to sustain the Condor High Throughput Computing software (Condor) by the Condor Project at the University of Wisconsin-Madison (UW). The proposed activities are categorized into four areas of work: Support, Release, Enhance, and Use. 'Support' activities aid users with installation, configuration, and usage through ticket-tracked email. Support also includes outreach work in the scientific computing community.
This outreach work includes hosting an annual international Condor workshop, participating in online forums dedicated to Condor and distributed computing, writing articles and book chapters on using Condor, and delivering invited tutorials at workshops and conferences. 'Release' activities serve mainly to deliver new versions of the Condor software, and also cover maintenance: ongoing bug fixing, support for new operating system releases and new versions of dependent software packages, and updates to documentation. 'Enhance' activities will generate basic improvements to the Condor software, such as enhancements in scalability and reliability, new capabilities on Win32, and the incorporation of recent advances in distributed computing technology. 'Use' activities leverage the Grid Laboratory of Wisconsin (GLOW) project. GLOW is a multidisciplinary team of researchers across the University of Wisconsin campus that develops, implements, tests and deploys grid-enabled capabilities.
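Item 5 concerns sustaining Condor's high-throughput computing software, to which users describe work in a submit description file read by condor_submit. A minimal sketch of one (the executable and file names are illustrative, not from the text above):

```
# Hypothetical Condor submit description file: run "analyze" once as a vanilla-universe job.
universe   = vanilla
executable = analyze
arguments  = input.dat
output     = job.out
error      = job.err
log        = job.log
queue
```

Condor matches the job to an available machine, runs it, and records its lifecycle events in the log file; the 'Support' and 'Release' activities above keep exactly this kind of user workflow working across operating system and dependency upgrades.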
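The F5 classification in item 3 can be checked with a unit conversion: 120 meters per second is about 268 miles per hour, and the original Fujita scale places F5 winds at roughly 261-318 mph. A minimal sketch of that check (the threshold table is an assumption drawn from the standard F-scale, not from the text above):

```python
# Convert a simulated peak wind to mph and look up its Fujita category.
# Lower bounds (mph) per the original Fujita scale; assumed, not from the source text.
F_SCALE_MPH = [(5, 261), (4, 207), (3, 158), (2, 113), (1, 73), (0, 40)]

def fujita_category(wind_ms: float) -> int:
    """Return the F-number for a peak wind speed given in meters per second."""
    mph = wind_ms * 2.23694  # 1 m/s = 2.23694 mph
    for category, lower_bound in F_SCALE_MPH:
        if mph >= lower_bound:
            return category
    return -1  # below F0

print(fujita_category(120.0))  # 120 m/s is roughly 268 mph
```

The simulated 120 m/s peak indeed lands just above the F5 lower bound, consistent with the text's claim.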






