iGrid 2002 Visualization and Virtual Reality

  • Maxine Brown & Tom DeFanti
  • Electronic Visualization Laboratory

SIGGRAPH 89 Science by Satellite

  • “Using satellite technology…demo of what it might be like to have high-speed fiber-optic links between advanced computers in two different geographic locations.”
  • ― Al Gore, Senator; Chair, US Senate Subcommittee on Science, Technology and Space
  • “What we really have to do is eliminate distance between individuals who want to interact with other people and with other computers.”
  • ― Larry Smarr, Director, National Center for Supercomputing Applications, UIUC

SIGGRAPH 92 Showcase: Science in the 21st Century

  • “From the atom to the Universe…it’s all here. Three dozen projects can now come through the network and appear to be in McCormick Place…Simulations on remote supercomputers or created from data gathered from far away instruments, these visualizations demonstrate the power of distributed computing, doing computing where the resources are and not necessarily on a single machine.”
  • ― Larry Smarr, Director, National Center for Supercomputing Applications, UIUC
  • “We have to develop the technology and techniques―and the sociology―to go along with group activities.”
  • ― Sid Karin, Director, San Diego Supercomputer Center, UCSD
  • UCSD National Center for Microscopy and Imaging Research (NCMIR) http://www-ncmir.ucsd.edu

SIGGRAPH 92 Showcase: Science in the 21st Century

  • “VR is a mode of scientific visualization. It’s something that lets you get inside of the data. Now, with most computer screens you’re outside looking in. In this, you’re inside looking out.”
  • ― Tom DeFanti, Director, Electronic Visualization Laboratory, UIC
  • www.evl.uic.edu
  • www.cavernus.org
  • “In a few years, the network is the computer…It doesn’t matter where your supercomputer is, your data resources, your sensors, scanners or satellite data. It can come from anywhere, be stored anywhere, but you can access it, at your fingertips, on your desktop.”
  • ― Maxine Brown, Associate Director, Electronic Visualization Laboratory, UIC

SIGGRAPH 94 VROOM: Virtual Reality Room

  • “It’s the real start of humans being able to immerse themselves inside the brains of computers―seeing what the computers are seeing.”
  • ― Larry Smarr, Director, National Center for Supercomputing Applications, UIUC
  • “See things you’ve never seen before.”
  • ― Tom DeFanti, Director, Electronic Visualization Laboratory, UIC
  • “Virtual prototyping of new products, from small to large.”
  • ― Rick Stevens, Director, Math and Computer Science Division, Argonne National Lab
  • “Next year―Get rid of the Machine Farm and put gigabit networks in place to talk to computers at remote sites―a whole new level of interaction and communication.”
  • ― Maxine Brown, Associate Director, Electronic Visualization Laboratory, UIC
  • Post-Euclidean Walkabout, George Francis, NCSA, UIUC
  • www.math.uiuc.edu/~gfrancis/
  • General Motors Research
  • www.evl.uic.edu/EVL/VROOM/HTML

Supercomputing 95 I-WAY: Information Wide Area Year

  • I-WAY featured:
  • Application demonstrations
  • OC-3 backbone
  • Large-scale immersive displays
  • I-Soft programming environment
  • http://archive.ncsa.uiuc.edu/General/Training/SC95/GII.HPCC.html
  • The Internet evolved from ARPAnet, a research network built in 1969 that served primarily as a communications tool for the research community until the invention of the World Wide Web, and later Mosaic, opened it up to the wider community.
  • I-WAY leaders Ian Foster, Larry Smarr, Rick Stevens and Tom DeFanti believe the next great wave of Internet evolution will be unleashed by I-WAY.

iGrid 1998 at SC’98 November 7-13, 1998, Orlando, Florida, USA

  • 10 countries: Australia, Canada, Germany, Japan, Netherlands, Russia, Singapore, Switzerland, Taiwan, USA
  • 22 demonstrations featured technical innovations and application advancements requiring high-speed networks, with emphasis on remote instrumentation control, tele-immersion, real-time client server systems, multimedia, tele-teaching, digital video, distributed computing, and high-throughput, high-priority data transfers
  • www.startap.net/igrid98

iGrid 2000 at INET 2000 July 18-21, 2000, Yokohama, Japan

  • 14 regions: Canada, CERN, Germany, Greece, Japan, Korea, Mexico, Netherlands, Singapore, Spain, Sweden, Taiwan, United Kingdom, USA
  • 24 demonstrations featuring technical innovations in tele-immersion, large datasets, distributed computing, remote instrumentation, collaboration, streaming media, human/computer interfaces, digital video and high-definition television, and grid architecture development, and application advancements in science, engineering, cultural heritage, distance education, media communications, and art and architecture
  • www.startap.net/igrid2000

  • 23-26 September 2002
  • Amsterdam Science and Technology Centre, The Netherlands
  • September 26, 2002
  • Maxine Brown
  • STAR TAP/StarLight co-Principal Investigator
  • Associate Director, Electronic Visualization Laboratory
  • University of Illinois at Chicago
  • iGrid 2002: International Virtual Laboratory
  • www.igrid2002.org

iGrid 2002 September 23-26, 2002, Amsterdam, The Netherlands

  • iGrid is a conference demonstrating application demands for increased bandwidth.
  • iGrid is a testbed enabling the world’s research community to work together briefly and intensely to advance the state of the art – moving from grid-intensive computing to LambdaGrid-intensive computing, in which computational resources worldwide are connected by multiple lambdas.
  • www.startap.net/igrid2002

iGrid 2002 Application Demonstrations

  • 28 demonstrations from 16 countries: Australia, Canada, CERN/Switzerland, France, Finland, Germany, Greece, Italy, Japan, Netherlands, Singapore, Spain, Sweden, Taiwan, the United Kingdom and the USA.
  • Applications to be demonstrated: art, bioinformatics, chemistry, cosmology, cultural heritage, education, high-definition media streaming, manufacturing, medicine, neuroscience, physics, tele-science
  • Grid technologies to be demonstrated: Major emphasis on grid middleware, data management grids, data replication grids, visualization grids, data/visualization grids, computational grids, access grids, grid portals

iGrid 2002 Featured Network Infrastructures

  • NetherLight, developed by SURFnet within the context of the Dutch Next Generation Internet project (GigaPort), is an advanced optical infrastructure and proving ground for network services optimized for high-performance applications located at the Amsterdam Internet Exchange facility.
  • StarLight, developed by the University of Illinois at Chicago, Northwestern University, and Argonne National Laboratory in partnership with Canada’s CANARIE and Holland’s SURFnet with funding from the USA NSF, is a persistent infrastructure that supports advanced applications and middleware research, and aggressive advanced networking services.

iGrid 2002 Enabling Technologies and Projects

  • EU-funded DataGrid Project aims to develop, implement and exploit a compu­tational and data-intensive grid of re­sources for the analysis of scientific data.
  • www.eu-datagrid.org
  • EU-funded DataTAG Project is creating an intercontinental testbed (Trans-Atlantic Grid) for data-intensive grids, with a focus on net­working techniques and interoperability issues among different grid domains.
  • www.datatag.org

iGrid 2002 Enabling Technologies and Projects

  • The Globus Project conducts research and development on the application of Grid concepts to scientific and engineering computing. The Globus Project provides software tools (the Globus Toolkit) that make it easier to build computational grids and grid-based applications.
  • www.globus.org
  • Quanta, the Quality of Service (QoS) Adaptive Networking Toolkit, is backward compatible with CAVERNsoft and provides application developers with an easy-to-use system to efficiently utilize the extremely high bandwidth afforded by optical networks.
  • www.evl.uic.edu/cavern/teranode/quanta

iGrid 2002 Singapore, Australia and Japan

  • APBioGrid of APBioNet
  • Bio Informatics Centre (BIC) National University of Singapore
  • Kooprime, Singapore
  • Cray, Singapore
  • www.bic.nus.edu.sg, www.bic.nus.edu.sg/biogrid, www.apbionet.org, http://s-star.org/main.htm
  • Using BIC’s APBioGrid (the Asia Pacific Bioinformatics Grid, a collection of networked computational resources) and KOOP testbed technology, biologists can quickly build a complex series of computations and database management activities on top of computational grids to solve real-world problems.
  • APBioGrid mimics tasks typical of a bioinformatician – it does resource discovery over the network, remotely distributing tasks that perform data acquisition, data transfer, data processing, data upload to databases, data analysis, computational calculations and visualizations.

iGrid 2002 Canada, CERN and The Netherlands

  • ATLAS Canada LightPath Data Transfer Trial
  • TRIUMF, Canada
  • Carleton University, Canada
  • University of Victoria, British Columbia, Canada
  • University of Alberta, Canada
  • University of Toronto, Canada
  • Simon Fraser University, Canada
  • BCNet, British Columbia, Canada
  • CANARIE, Canada
  • CERN, Switzerland
  • University of Amsterdam, The Netherlands
  • www.triumf.ca
  • The Lightpath Trial hopes to transmit 1TB of ATLAS Monte Carlo data from TRIUMF (Canada’s National Laboratory for Particle and Nuclear Physics) to CERN in under 2 hours. Using Canada’s 2.5Gbps link to StarLight, SURFnet’s 2.5Gbps link from StarLight to NetherLight, and a link from NetherLight to CERN, an end-to-end lightpath will be built between TRIUMF in Vancouver and CERN.
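A quick sanity check of the throughput this target implies (a back-of-the-envelope sketch using decimal units, not figures taken from the trial itself):

```python
# Sustained throughput needed to move 1 TB in under 2 hours,
# compared against one 2.5 Gbps lightpath segment.
# Assumption: decimal units (1 TB = 1e12 bytes).

DATA_BYTES = 1e12            # 1 TB of ATLAS Monte Carlo data
TIME_S = 2 * 3600            # 2-hour target
LINK_BPS = 2.5e9             # 2.5 Gbps lightpath

needed_bps = DATA_BYTES * 8 / TIME_S
print(f"required: {needed_bps / 1e9:.2f} Gbps")          # ~1.11 Gbps
print(f"link utilization: {needed_bps / LINK_BPS:.0%}")  # ~44%
```

So the 2-hour goal needs a sustained rate of roughly 1.1 Gbps, well within the 2.5 Gbps lightpath, provided the end hosts and TCP stacks can keep the pipe full.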

iGrid 2002 Netherlands, USA, Canada, CERN, France, Italy, Japan, UK

  • Bandwidth Challenge from the Low-Lands
  • NIKHEF, The Netherlands
  • Participating sites: APAN, Japan; ANL, USA; Lab, USA; Caltech, USA; CERN, Switzerland; Daresbury Laboratory, UK; ESnet, USA; Fermilab, USA; NASA GSFC, USA; IN2P3, France; INFN/Milan, Italy; INFN/Rome, Italy; Internet2, USA; JLab, USA; KEK High Energy Accelerator Research Organization, Japan; LANL, USA; LBNL, USA; LBNL/NERSC, USA; Manchester University, UK; NIKHEF, The Netherlands; ORNL, USA; Rice University, USA; RIKEN Accelerator Research Facility, Japan; Rutherford Appleton Laboratory, UK; SDSC/UCSD, USA; SLAC, USA; Stanford University, USA; Sun Microsystems, USA; TRIUMF, Canada; University College London, UK; University of Delaware, USA; University of Florida, USA; University of Michigan, USA; University of Texas at Dallas, USA; University of Wisconsin, Madison, USA
  • http://www-iepm.slac.stanford.edu/monitoring/bulk/igrid2002
  • Current data transfer capabilities to several international sites with high-performance links are demonstrated. iGrid 2002 serves as a HENP “Tier 0” or “Tier 1” site (an accelerator or major computing site), distributing data to multiple replica sites. Researchers investigate and demonstrate issues regarding TCP implementations for high-bandwidth, long-latency links, and create a repository of trace files of a few interesting flows to help explain the behavior of transport protocols over various production networks.

iGrid 2002 USA and CERN

  • Bandwidth Gluttony ― Distributed Grid-Enabled Particle Physics Event Analysis
  • Argonne National Laboratory (ANL), USA
  • Caltech, USA
  • CERN (EU DataGrid Project)
  • This demonstration is a joint effort between Caltech (HEP) and ANL (Globus/GridFTP). Requests for remote virtual data collections are issued by Grid-based software that is itself triggered from a customized version of the High-Energy Physics (HEP) analysis tool called ROOT. These requests cause the data to be moved across a wide-area network using both striped and standard GridFTP servers.
  • In addition, at iGrid, an attempt is made to saturate a 10Gbps link between Amsterdam, ANL and StarLight and a 2.5Gbps link between Amsterdam and CERN, using striped GridFTP channels and specially tuned TCP/IP stacks applied to memory-cached data.
  • http://pcbunn.cacr.caltech.edu/iGrid2002/demo.htm

iGrid 2002 USA

  • Beat Box
  • Indiana University, USA
  • Res Umbrae, USA
  • http://dolinsky.fa.indiana.edu/beatbox
  • Beat Box presents networked CAVE participants with a playful arena of interactive sound machines. Participants cycle through sound selections and give voice to an interval by introducing it to a thoroughly odd indigenous head. Each head represents a distinct moment in a sequence that contributes to the resultant delivery of the collective instruments.

iGrid 2002 USA

  • Collaborative Visualization over the Access Grid
  • Argonne National Laboratory/University of Chicago, USA
  • Northern Illinois University, USA
  • www.mcs.anl.gov/fl/events/igrid2002.html, www.accessgrid.org, www.globus.org/mpi, www.globus.org, www.teragrid.org
  • This demonstration shows next-generation Access Grid applications, where the Access Grid is coupled to high-speed networks and vast computational resources. Using the Globus Toolkit, MPICH-G2 and Access Grid technology, scientists can collaboratively and interactively analyze time-varying datasets that are multiple terabytes in size.

iGrid 2002 The Netherlands and USA

  • D0 Data Analysis
  • NIKHEF, The Netherlands
  • Fermi National Accelerator Laboratory (Fermilab), USA
  • Michigan State University, USA
  • www-d0.fnal.gov, www.nikhef.nl
  • The D0 Experiment, which relies on the Tevatron Collider at Fermilab, is a worldwide collaboration of scientists conducting research on the fundamental nature of matter. Currently, raw data from the D0 detector is processed at Fermilab’s computer farm and results are written to tape. At iGrid, researchers show that by using the transoceanic StarLight/NetherLight network, it is possible for Fermilab to send raw data to NIKHEF for processing and then have NIKHEF send the results back to Fermilab.

iGrid 2002 USA, Germany, Japan, Taiwan and UK

  • Distributed, On-Demand, Data-Intensive and Collaborative Simulation Analysis
  • Sandia National Laboratories, USA
  • Pittsburgh Supercomputing Center, USA
  • Tsukuba Advanced Computing Center, Japan
  • Manchester Computing Centre, UK
  • National Center for High-Performance Computing, Taiwan
  • High Performance Computing Center, Rechen-zentrum Universität Stuttgart, Germany
  • www.cs.sandia.gov/ilab, www.tbi.univie.ac.at/research/VirusPrj.html, www.hlrs.de
  • Grid tools applied to bioinformatics are demonstrated – specifically, predicting identifiable intron/exon splice sites in human genes based on RNA secondary structures. Modeling and simulation programs scale to geographically distributed supercomputer centers. Results are visualized in a collaborative environment, displaying spatial relationships and insights into identifying exonic splicing enhancers.

iGrid 2002 USA and UK

  • Dynamic Load Balancing of Structured Adaptive Mesh Refinement (SAMR) Applications on Distributed Systems
  • CS Department, Illinois Institute of Technology
  • ECE Department, Northwestern University, USA
  • Nuclear and Astrophysics Laboratory, Oxford University
  • AMR applications result in load imbalance among processors on distributed systems. ENZO, a successful parallel implementation of structured AMR (SAMR) used in astrophysics and cosmology, incorporates dynamic load balancing across distributed systems.
  • www.ece.nwu.edu/~zlan/research.html

iGrid 2002 USA and CERN

  • Fine-Grained Authorization for GARA Automated Bandwidth Reservation
  • University of Michigan, USA
  • CERN
  • www.citi.umich.edu/projects/qos
  • This demonstration shows modifications to the Globus General-purpose Architecture for Reservation and Allocation (GARA). Also shown is a secure and convenient Web interface for making reservation requests based on Kerberos credentials.
  • GARA modifications are demonstrated by reserving bandwidth for a video application running between sites with distinct security domains. Traffic generators overload the router interface servicing the video receiver, degrading the video quality when bandwidth is not reserved.

iGrid 2002 Italy and CERN

  • GENIUS Grid Portal
  • https://genius.ct.infn.it
  • Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Catania, Italy
  • Università di Catania, Italy
  • NICE srl, Camerano Casasco, Italy
  • CERN, Switzerland
  • The grid portal GENIUS (Grid Enabled web eNvironment for site Independent User job Submission) is an interactive data management tool being developed on the EU DataGrid testbed.
  • At iGrid 2002, researchers demonstrate GENIUS’s data movement and discovery, security mechanisms and system monitoring techniques, as well as optimization and fail-safe mechanisms ― for example, how to find network optimized files and how to detect system failure.

iGrid 2002 USA, Japan and Taiwan

  • Global Telescience Featuring IPv6
  • National Center for Microscopy and Imaging Research (NCMIR), UCSD, USA
  • San Diego Supercomputer Center, UCSD, USA
  • Cybermedia Center, Osaka University, Japan
  • National Center for High Performance Computing, Taiwan
  • https://gridport.npaci.edu/Telescience
  • Utilizing native IPv6 and a mixture of high bandwidth and low latency, this demonstration features a network-enabled end-to-end system for 3D electron tomography that utilizes richly connected resources to remotely control the intermediate high-voltage electron microscope in San Diego and the ultra-high-voltage electron microscope in Osaka.

iGrid 2002 The Netherlands and USA

  • Griz: Grid Visualization Over Optical Networks
  • Vrije Universiteit, The Netherlands
  • Electronic Visualization Laboratory, University of Illinois at Chicago, USA
  • www.cs.vu.nl/~renambot/vr/html/intro.htm
  • Aura, a distributed parallel rendering toolkit, is used to remotely render data on available graphics resources (in Chicago and in Amsterdam) for local display at the iGrid conference. Aura is applied to real-world scientific problems; notably, the visualization of high-resolution isosurfaces of the Visible Human dataset and an interactive molecular dynamics simulation.

iGrid 2002 USA, Canada, The Netherlands, Sweden and UK

  • High Performance Data Webs
  • Laboratory for Advanced Computing, University of Illinois at Chicago, USA
  • Dalhousie University, Halifax, Canada
  • Imperial College of Science, Technology & Medicine, University of London, UK
  • Universiteit van Amsterdam, The Netherlands
  • SARA, The Netherlands
  • Center for Parallel Computers, Royal Institute of Technology, Sweden
  • www.ncdm.uic.edu, www.dataspaceweb.net
  • DataSpace is a high-performance data web for the remote analysis, mining, and real-time interaction of scientific, engineering, business, and other complex data. DataSpace applications are designed to exploit the capabilities of high-performance networks so that gigabyte and terabyte datasets can be remotely explored in real time.

iGrid 2002 Spain and USA

  • HDTV Transmission over IP
  • Universitat Politècnica de Catalunya, Barcelona, Spain
  • Ovide Broadcast Services, Barcelona, Spain
  • ResearchChannel, Pacific Northwest GigaPoP, USA
  • iCAIR, Northwestern Univ., USA
  • Starmaze, Spain
  • www.i2cat.net, www.researchchannel.com, www.washington.edu, www.icair.org
  • First transcontinental HDTV broadcast using “Year Gaudi 2002” footage:
  • UPC 1.5Gbps (HDSDI) compressed/transmitted at 270Mbps (SDTI) over IP
  • ResearchChannel uncompressed bi-directional HDTV/IP using prototype Tektronix hardware at 1.5Gbps, Sony HDCAM/IP at 270Mbps, MPEG-2 at 10 Mbps, VideoOnDemand at 5.6Mbps, and AudioOnDemand at 1.4Mbps
  • iCAIR is streaming 270Mbps over IP
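For context, the compression these rates imply can be computed directly (a sketch using the rounded figures quoted above, not exact codec specifications):

```python
# Compression ratios implied by the bit rates on this slide.
HDSDI_MBPS = 1500    # uncompressed HDTV (HD-SDI), as quoted above
SDTI_MBPS = 270      # SDTI transport rate
MPEG2_MBPS = 10      # MPEG-2 rate

print(f"HD-SDI -> SDTI:   {HDSDI_MBPS / SDTI_MBPS:.1f}:1")   # ~5.6:1
print(f"HD-SDI -> MPEG-2: {HDSDI_MBPS / MPEG2_MBPS:.0f}:1")  # 150:1
```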

iGrid 2002 Taiwan and Germany

  • Image Feature Extraction on a Grid Testbed
  • National Center for High Performance Computing, Taiwan
  • Institute of Statistical Science, Academia Sinica, Taiwan
  • High Performance Computing Center, Rechenzentrum Universität Stuttgart
  • http://motif.nchc.gov.tw/DataGrid
  • For medical imagery (confocal laser-scanning microscopes, CT, MRI and PET), NCHC does image processing, analysis and 3D reconstruction. For biotechnology imagery (e.g., microarray biochips), NCHC uses a data clustering procedure for feature extraction that provides insight into an image, such as identifying diseases caused by some protein.
  • Grid techniques enable the use of distributed computing resources and shared data. High-speed networks enable fast processing; medical doctors typically want the procedure completed within 5 seconds for use in daily operations.

iGrid 2002 USA, Canada, France, Japan, The Netherlands, Singapore

  • Kites Flying In and Out of Space
  • Jacqueline Matisse-Monnier, Visiting Artist
  • Mountain Lake Workshop, Virginia Tech Foundation, USA
  • Virginia Tech, USA
  • Virginia Polytechnic Institute and State University, USA
  • EVL, University of Illinois at Chicago, USA
  • SARA, The Netherlands
  • Sorbonne/La Cité Museum de Musique Paris, France
  • Tohwa University, Japan
  • Institute of High Performance Computing, Singapore
  • New Media Innovation Center, Canada
  • http://calder.ncsa.uiuc.edu/ART/MATISSE/
  • This virtual-reality art piece is a study of the physical properties of the flying kinetic artwork of Jacqueline Matisse-Monnier. One PC supports the simulation of one kite. For iGrid, distributed grid computing for the arts is demonstrated.

iGrid 2002 Germany and USA

  • Network Intensive Grid Computing and Visualization
  • Max-Planck-Institut für Gravitationsphysik, Albert-Einstein-Institut/Golm, Germany
  • Konrad-Zuse-Zentrum für Informationstechnik/Berlin, Germany
  • Lawrence Berkeley National Laboratory/National Energy Research Scientific Computing Center, USA
  • www.cactuscode.org, www.griksl.org
  • Scientists run an astrophysics simulation at a USA supercomputing center and then compute detailed remote visualizations of the results. One part of the demo shows remote online visualization – as the simulation continues, each time step’s raw data is streamed from the USA to a Linux cluster in Amsterdam for parallel volume rendering. The other part demonstrates remote off-line visualization, using grid technologies to access data on remote data servers, as well as new rendering techniques for network-adaptive visualizations.

iGrid 2002 USA

  • PAAPAB (Pick An Avatar, Pick A Beat)
  • Department of Media Study, University at Buffalo, USA
  • Res Umbrae, USA
  • New York State Center for Engineering Design and Industrial Innovation, University at Buffalo, USA
  • http://resumbrae.com/projects/paapab, www.ccr.buffalo.edu/anstey/VR/PAAPAB, www.nyscedii.buffalo.edu
  • PAAPAB (Pick An Avatar, Pick A Beat) is a shared virtual-reality disco environment inhabited by life-size puppets (user avatars). Users tour the dance floor to see the puppets they animate, dance with the puppets, and dance with avatars of other users. This research focuses on creating interactive drama in virtual reality; that is, immersive stories. PAAPAB serves as a testbed for technology development as well as character and world design.

iGrid 2002 USA and The Netherlands

  • Photonic TeraStream
  • iCAIR, Northwestern University, USA
  • EVL, University of Illinois at Chicago, USA
  • Materials Sciences Research Center, Northwestern University, USA
  • Universiteit van Amsterdam, The Netherlands
  • Argonne National Laboratory, USA
  • www.icair.org/igrid2002, www.uva.nl, www.icair.org/omninet
  • The Photonic TeraStream, supported by OMNInet, demonstrates that photonic-enabled applications are possible. OMNInet is used to prototype tools for intelligent application signaling, dynamic lambda provisioning, and extensions to lightpaths through dynamically provisioned L2 and L3 configurations to access edge resources. The goal is to develop “Global Services-on-Demand” technologies for optical networks, enabling scientists to find, gather, integrate, and present information – large-scale datasets, scientific visualizations, streaming digital media, computational results – from resources worldwide.

iGrid 2002 Japan

  • TACC Quantum Chemistry Grid/Gaussian Portal
  • Grid Technology Research Center (GTRC), National Institute of Advanced Industrial Science and Technology (AIST), Japan
  • http://unit.aist.go.jp/grid/GSA/gaussian
  • The Gaussian code, used in computational chemistry, sometimes receives inadequate computational resources when run on large computers. The Tsukuba Advanced Computing Center (TACC) Gaussian Grid Portal efficiently utilizes costly computational resources without requiring users to know the specifications of each system environment. It consists of a Web interface, meta-scheduler, computational resources, archival resources and Grid software.

iGrid 2002 USA

  • TeraScope: Visual Tera Mining
  • Electronic Visualization Laboratory, University of Illinois at Chicago (UIC), USA
  • National Center for Data Mining, UIC, USA
  • www.evl.uic.edu/cavern/teranode/terascope, www.dataspaceweb.net
  • TeraScope is a massively parallelized set of information visualization tools for Visual Data Mining that interactively queries and mines terabyte datasets, correlates the data, and then visualizes the data using parallelized rendering software on tiled displays.
  • TeraScope’s main foci are to develop techniques to create TeraMaps (visualizations that summarize rather than plot enormous datasets) and to develop a distributed memory cache to collect pools of memory from optically connected clusters. These caches are used by TeraScope to bridge the impedance mismatch between large and slow distributed data stores and fast local memory.

iGrid 2002 USA

  • TeraVision: Visualization Streaming over Optical Networks
  • Electronic Visualization Laboratory, University of Illinois at Chicago, USA
  • www.evl.uic.edu/cavern/teranode/teravision
  • TeraVision is a hardware-assisted, high-resolution graphics streaming system for the Access Grid, enabling anyone to deliver a presentation without installing software or distributing data files in advance.
  • A user giving a presentation on a laptop or showing output from a node of a graphics cluster plugs the VGA or DVI output of the computer into the TeraVision Box. The box captures the signal at its native resolution, and digitizes and broadcasts it to another networked TeraVision Box, which is connected to a PC and DLP projector. Two Boxes can be used to stream stereoscopic computer graphics. Multiple Boxes can be used for an entire tiled display.
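To see why optical networks matter here, consider the bit rate of a single uncompressed captured signal. This is a hedged estimate: the resolution, color depth and frame rate below are illustrative assumptions, not TeraVision specifications.

```python
# Rough bit rate of one uncompressed captured display signal.
# All three parameters are assumptions for illustration only.

WIDTH, HEIGHT = 1024, 768   # assumed XGA laptop output
BITS_PER_PIXEL = 24         # assumed RGB color depth
FPS = 30                    # assumed capture rate

bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS
print(f"one stream: {bps / 1e6:.0f} Mbps")  # ~566 Mbps; multiply by the
                                            # number of Boxes for stereo
                                            # or tiled-display streaming
```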

iGrid 2002 USA and UK

  • The Universe
  • University of California, San Diego
  • Information Sciences Institute, University of Southern California
  • Stephen Hawking Laboratory, University of Cambridge, UK
  • http://virdir.ncsa.uiuc.edu/virdir/virdir.html
  • Virtual Director and related technologies enable multiple users to remotely collaborate in a shared, astrophysical virtual world. Users collaborate via video, audio and 3D avatar representations, and through discrete interactions with the data. Astrophysical scenes are rendered using several techniques, including an experimental renderer that creates time-series volume animations using pre-sorted points and billboard splats, allowing visualizations of very large datasets in real-time.

iGrid 2002 USA, France, Germany and Italy

  • Video IBPster
  • Logistical Computing and Internetworking (LoCI) Lab, University of Tennessee, USA
  • Innovative Computing Lab, Univ. Tennessee, USA
  • University of California, Santa Barbara, USA
  • University of California, San Diego, USA
  • ENS, Lyon, France
  • Università del Piemonte Orientale, Alessandria, Italy
  • High Performance Computing Center, Rechenzentrum Universität Stuttgart, Germany
  • http://loci.cs.utk.edu, http://nws.cs.ucsb.edu
  • Logistical Networking is the global scheduling and optimization of data movement, storage and computation. LoCI develops tools for fast data transfer, such as the Data Mover, using as much bandwidth as is available. At iGrid, a geographically distributed abstraction of a file is replicated, transported to depots that are closer according to network proximity values, and downloaded from the nearest site in a completely transparent way.

iGrid 2002 The Netherlands

  • Virtual Laboratory on a National Scale
  • University of Amsterdam, The Netherlands
  • www.vl-e.nl/VLAM-G/
  • This demonstration of upper middleware complements Grid services, enabling scientists to easily extract information from raw datasets utilizing multiple computing resources.
  • The Virtual Laboratory develops a formal series of steps, or process flow, to solve a particular problem (data analysis, visualization, etc.) in a particular application domain. Various clusters (DAS-2) are assigned parts of the problem (retrieval, analysis, visualization, etc.).

iGrid 2002 Greece and USA

  • Virtual Visit to the Site of Ancient Olympia
  • Foundation of the Hellenic World (FHW), Greece
  • University of Macedonia, Greece
  • Greek Research & Technology Network
  • EVL, University of Illinois at Chicago, USA
  • www.fhw.gr/fhw/en/projects, www.fhw.gr/fhw/en/projects/3dvr/templezeus.html, www.grnet.gr/grnet2/index_en.htm
  • In preparation for the 2004 Olympic Games hosted by Greece, the FHW, a cultural heritage institution based in Athens, is developing an accurate 3D reconstruction of the site of Olympia as it was in antiquity. If access to a high-performance network were available, the FHW’s museum could serve as a centre of excellence, delivering educational and heritage content to a number of sites worldwide.
  • One of the most important monuments of the site is the Temple of Zeus, which houses the famous statue of Zeus, one of the seven wonders of the ancient world of which nothing remains today.

iGrid 2002 The Netherlands, Finland, UK and USA

  • vlbiGrid
  • Joint Institute for VLBI in Europe, The Netherlands
  • Metsahovi Radio Observatory, Finland
  • Jodrell Bank Observatory, University of Manchester, UK
  • Haystack Observatory, MIT, USA
  • University of Manchester, UK
  • University College London, UK
  • University of Amsterdam, The Netherlands
  • www.jive.nl, www.jb.man.ac.uk, www.haystack.edu
  • Very Long Baseline Interferometry (VLBI) is a technique in which an array of physically independent radio telescopes observes simultaneously to yield high-resolution images of cosmic radio sources. The European VLBI Network (EVN) has access to multiple data sources that can deliver 1Gbps each and a data processor that can process 16 data streams simultaneously. High-speed networks would enable the EVN to achieve many-fold improvements in bandwidth.
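The aggregate rates involved follow directly from the figures above (the 16 simultaneous 1 Gbps streams come from the text; the rest is simple arithmetic):

```python
# Aggregate input rate into the EVN data processor, from the
# per-source figures quoted above.

N_STREAMS = 16        # data streams the processor can correlate at once
STREAM_GBPS = 1.0     # delivery rate per data source

aggregate_gbps = N_STREAMS * STREAM_GBPS
tb_per_hour = aggregate_gbps * 1e9 / 8 * 3600 / 1e12

print(f"aggregate input: {aggregate_gbps:.0f} Gbps")  # 16 Gbps
print(f"per observing hour: {tb_per_hour:.1f} TB")    # 7.2 TB
```

At 16 Gbps aggregate, an hour of observing produces on the order of 7 TB, which is why shipping tapes was the norm before networks of this class existed.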

iGrid 2002 …in addition, SURFnet Streaming Video Documentary!!!

  • iGrid 2002 to be streamed live on the Internet!
  • http://www.igrid2002.org/webcast.html
  • Live streams. SURFnet is streaming live conference plenary sessions and demonstration material over the Internet using Real Surestream, IP multicast H.261, MPEG-1 and MPEG-2.
  • Documentary. SURFnet is making a documentary of iGrid 2002 demonstrations.
  • On-demand video. After the conference, all video, both plenary sessions and documentary, will be available for on-demand viewing through the iGrid 2002 website and the SURFnet A/V streaming service.

Acknowledgments Organizing Institutions

  • The Netherlands:
  • Amsterdam Science & Technology Centre
  • GigaPort Project
  • SARA Computing and Networking Services
  • SURFnet
  • Universiteit van Amsterdam/ Science Faculty
  • United States of America:
  • Argonne National Laboratory/ Mathematics and Computer Science Division
  • Indiana University/ Office of the Vice President for Information Technology
  • Northwestern University/ International Center for Advanced Internet Research
  • University of Illinois at Chicago/ Electronic Visualization Laboratory

Acknowledgments Participating Organizations

  • Internet Educational Equal Access Foundation (IEEAF)
  • Global Grid Forum
  • Globus Project
  • GRIDS Center
  • National Lab for Applied Network Research, Distributed Applications Support Team (NLANR/DAST)
  • Pacific Rim Applications and Grid Middleware Assembly (PRAGMA)
  • UCAID/Internet2
  • University of California, San Diego/ California Institute for Telecommunications and Information Technology [Cal-(IT)2]

Acknowledgments Sponsors

  • Amsterdam Internet Exchange
  • Amsterdam Science & Technology Centre
  • Cisco Systems, Inc.
  • City of Amsterdam
  • GEOgraphic Network Affiliates–International
  • GigaPort Project
  • Glimmerglass Networks
  • HP
  • IBM
  • Juniper Networks
  • Level 3 Communications, Inc.
  • National Computer Facilities (NWO/NCF), NL
  • National Science Foundation, USA
  • Royal Philips Electronics
  • SARA Computing and Networking Services
  • Stichting FOM Foundation for Fundamental Research on Matter
  • Stichting HEF
  • Stichting SURF
  • SURFnet
  • Tyco Telecommunications
  • Unilever NV
  • Universiteit van Amsterdam


  • A lambda, in networking parlance, is a fully dedicated wavelength of light in an optical network, typically assumed capable of ≥10Gbps bandwidth. Lambdas are a circuit-based technology, but can carry packet-based information.
  • For this discussion, a lambda is a large, desirable unit of networking as applications see it: conceptually, it offers either end-to-end custom connectivity or bandwidth over-provisioned so massively that the connection is effectively uncongested.
  • Photonics, for our purposes, is the technology of hardware lambda management.
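The "effectively uncongested" claim can be made concrete with a toy utilization check. The 10 Gbps capacity comes from the definition above; the demand figures and the 10% threshold are illustrative assumptions, not from the text.

```python
# Toy over-provisioning check: a dedicated lambda is "effectively
# uncongested" when offered load is a small fraction of capacity.
LAMBDA_CAPACITY_GBPS = 10.0  # typical lambda, per the definition above

def is_effectively_uncongested(offered_load_gbps, threshold=0.10):
    """Illustrative rule of thumb (assumed): utilization below
    `threshold` means queueing delay is negligible."""
    return offered_load_gbps / LAMBDA_CAPACITY_GBPS < threshold

print(is_effectively_uncongested(0.5))  # 5% utilization -> True
print(is_effectively_uncongested(8.0))  # 80% utilization -> False
```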

International Lambda Grids are coming!

  • Amsterdam—NetherLight to…
  • Chicago—StarLight to…
  • Canada—CA*net4 to…
  • Others?
  • Bring us your Lambdas!

The Next S-Curves of Exponential Technology Growth

  • [Figure: Networking Technology S-Curves. Technology penetration (0% to 100%) plotted against time, roughly 1990s to 2010. Each S-curve rises from Research through Experimental/Early Adopters to Production/Mass Market; successive curves cover Gigabit Testbeds, Connections Programs, Internet2/Géant, DWDM, Experimental Networks, and Lambda Grids.]

The OptIPuter: A new US NSF ITR-Funded Experimental Network Project

  • Driven by Visualization of Large Neuroscience and Earth Science Data
    • Mark Ellisman, UCSD NCMIR
    • John Orcutt, UCSD SIO
    • Eric Frost, SDSU
  • Multiple Lambdas Linking Clusters, Storage, and Screens
    • Integration with Itanium PC clusters using TCP, UDP, FTP
    • Data fusion on peer-to-peer storage with optimized storewidth
    • Interactive collaborative volume visualization
  • A GriPhyN-class effort!

The OptIPuter

  • UCSD and UIC are OptIPuter primary campuses
    • Larry Smarr, PI; Maxine Brown, Project Manager
    • Tom DeFanti, Mark Ellisman, Jason Leigh, Phil Papadopoulos
    • USC, UCI, SDSU, NU/StarLight are partnering campuses
    • Industrial partners: IBM, Telcordia/SAIC, CENIC
    • Bob Hertzberger, Cees de Laat, UvA, int’l OptIPuter Affiliated Site

The OptIPuter

  • Security is a major 5-year goal of the OptIPuter
    • Need line speed security for dynamic, real-time collaborations
      • Protection of data read/write over fast, distributed storage
      • Privacy, authentication, audit trails, anti-DDoS

OptIPuter Applications Enabled by a UCSD Central Campus DWDM Optical Switch

  • [Diagram: a central campus DWDM optical switch interconnects edge switches at the San Diego Supercomputer Center, Scripps Institution of Oceanography, Medical Imaging and Microscopy, Chemistry/Engineering/Arts facilities, and an Oracle DB server, supporting Cluster – Disk, Disk – Disk, Viz – Disk, DB – Cluster, and Cluster – Cluster traffic patterns.]

OptIPuter Mod 0: Chicago 2002-2003

  • [Diagram: over metro GE DWDM, two 6509 switches connect a 16-processor Pentium cluster driving a 5x3 panel display (16x1 GE) and a 16-processor Pentium cluster with RAID (16x1 GE plus 10x1 GE), with uplinks to national networks (Illinois, California, Abilene, ESnet, NREN; Gb and 10Gb) and international networks (Netherlands, Canada, CERN, AmPATH, Asia; Gb, 2.5Gb, and 10Gb).]

Visualization at Near Print Resolution

  • 5x3 grid of 1280x1024-pixel LCD panels driven by a 16-PC cluster; resolution = 6400x3072 pixels, or ~3000x1500 pixels in autostereo
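The quoted resolution is tile arithmetic, sketched below. Panel counts and per-panel resolution come from the slide; the note that autostereo roughly halves each dimension (left- and right-eye views share the pixels) is an assumption used to interpret the slide's ~3000x1500 figure.

```python
# Tiled-display resolution arithmetic (grid and panel sizes from the slide).
COLS, ROWS = 5, 3                 # 5x3 grid of LCD panels
PANEL_W, PANEL_H = 1280, 1024     # pixels per panel

total_w, total_h = COLS * PANEL_W, ROWS * PANEL_H
print(f"Mural resolution: {total_w}x{total_h}")  # 6400x3072

# In autostereo, left- and right-eye views share the pixels, so effective
# per-eye resolution is roughly halved in each dimension (assumption;
# the slide quotes ~3000x1500).
print(f"~{total_w // 2}x{total_h // 2} per eye")
```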

OptIPuter Mod 1: Chicago 2003-2004

  • [Diagram: over metro GE/10GE, a 16-processor Itanium cluster at UIC with a 5x3 display (16x1 GE) connects to a 16-processor next-generation Intel cluster at StarLight (16x10 GE, plus 10x1 GE and 2x10GE), with uplinks to national networks (Illinois, California, Wisconsin, Indiana, Abilene, FedNets, Washington, Pennsylvania; Gb and 10Gb) and international networks (Netherlands, Canada, UK, CERN, AmPATH, Asia; Gb, 2.5Gb, and 10Gb).]

For More Information

  • University of Illinois at Chicago
  • Maxine Brown, maxine@uic.edu
  • Tom DeFanti, tom@uic.edu
  • Universiteit van Amsterdam
  • Cees de Laat, delaat@science.uva.nl
