Annual Report LIT 2000.

The Laboratory of Information Technologies (LIT) of JINR was established in 2000 in the course of the reorganization of the Laboratory of Computing Techniques and Automation. The main tasks of the Laboratory were formulated at the 88th session of the JINR Scientific Council: to maintain and develop the JINR computing and networking infrastructure.

The computing and networking infrastructure (JINR CoNet) as a JINR BASIC FACILITY includes:

  1. Telecommunications Services and Channels (External Networking);
  2. Local Area Network (LAN) & High Performance Computing Centre (HPCC);
  3. Support and development of standard software and modern tools of Computer Physics for users.

To support these activities, a new structure of the Laboratory has been worked out. Most of the JINR LAN technical support tasks are handled by the staff of the LIT chief engineer.

In 2000, the scientific programme of the Laboratory of Information Technologies covered three first-priority topics of the "Topical Plan for JINR Research and International Cooperation in 2000". The Laboratory staff also participated in 9 more topics of the Topical Plan at the project level, in collaboration with other JINR Laboratories, and in 16 further topics at the level of co-operation. The main results of the investigations performed within these topics in 2000 have been published in more than 100 articles in well-known journals, proceedings of scientific conferences, and JINR preprints and communications.

An indication of the high level of the investigations performed at the LCTA/LIT Computational Physics Department was the successful holding of the Second International Conference “Modern Trends in Computational Physics” in 2000. The scientific programme of the Conference covered various fields of research under way at LIT: mathematical modelling and computational methods for the study of complex physical processes, the use of modern vector-parallel computing systems, computer communications and distributed computations for massive data processing, numerical methods and computer algebra algorithms, and computational tools for modelling and analysis of experimental data. For the first time for a conference held at JINR, real-time access to the plenary sessions was provided via the Internet.

Telecommunication systems

In 2000, the throughput of the JINR telecommunication system was 2 Mb/s and remained at the level of 1999. The main Internet provider for JINR was ROSNIIROS (the Russian Institute for Public Networks), which by the end of 2000 provided JINR with paid access to international computer networks at 1 Mb/s within the common traffic as an RBNet user, as well as access to the Russian networks within the interdepartmental programme for creating networks and telecommunications for science and higher education. A 256 Kb/s channel of the CONTACT-DEMOS company, carrying about 5% of the load, was used as a backup to ensure reliable operation of the JINR network.

However, such a channel throughput is inadequate to satisfy JINR's needs. Fig.1 shows the daytime peak load of the link to Moscow since October 2000; the average weekly load was 65.4% (http://jicom.jinr.ru/stats/).

Fig.1 Statistics of the load of the link to Moscow

Table 1 shows the distribution of the incoming JINR traffic from May to December 2000 (2 TByte in total) among the JINR subdivisions and laboratories. It should be noted that the University of Dubna and the modem pool take a noticeable share of the common traffic. Software has been developed at LIT to quickly obtain information on the most active users of the JINR external channels; it makes it possible to monitor the proper use of the telecommunication resources.
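In essence, such traffic accounting reduces to summing per-host byte counts and ranking the totals. A minimal sketch of the idea (the host names, byte counts, and function name below are invented for illustration and are not taken from the actual LIT software):

```python
# Rank traffic consumers from per-host byte counts (illustrative data).
from collections import defaultdict

def top_consumers(records, limit=3):
    """records: iterable of (host, bytes_transferred); returns hosts ranked by total traffic."""
    totals = defaultdict(int)
    for host, nbytes in records:
        totals[host] += nbytes
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:limit]

records = [("lxpub01", 7_500_000), ("proxy", 52_000_000),
           ("modem-pool", 31_000_000), ("proxy", 14_000_000)]
print(top_consumers(records, limit=2))
# -> [('proxy', 66000000), ('modem-pool', 31000000)]
```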

The prospects of development of the JINR external telecommunications were discussed at the workshop «Strategy for the development of the JINR external computer communication links» in June 2000. The proceedings of the workshop and the projects presented there are available at http://lit.jinr.ru/LCTA/E_Publications/Workshop/


Table 1. Incoming JINR traffic distribution (in GByte)
over the JINR subdivisions and laboratories (>4 GByte)

LIT+proxy+servers   695.2
LHE                 235.6
Univ. Dubna         177.8
FLNR                199.3
DLNP                160.2
BLTP                123.1
Modem pool          112.8
LPP                 106.5
FLNP                 84.7
UC                   49.9
JINR Board           47.6
Other                16.1

JINR Local Area Network

The resources of the JINR LAN were used to the limit of their capacity. The increase in the network load, resulting from the growing number of elements connected to the network (at present the IP address database contains 3188 registered network elements) and from the failure of part of the ATM backbone equipment, has set the task of reorganizing the JINR LAN and migrating to present-day technologies.

Fig.2. Present-day JINR LAN topology.

A project on modernizing the network topology and selecting an adequate design technology is now under development. Fig.2 shows a modern JINR LAN topology. As a temporary solution, the failed ATM switches can be replaced by Cisco Catalyst switches.

Systematic work on LAN management was performed by the Network Operation Centre (http://noc.jinr.ru/). The rules for users of the JINR Computing and Networking Infrastructure have been worked out and approved by the JINR Directorate. A new NOC home page was designed using modern Internet technologies.

Computing service

The JINR High Performance Computer Centre comprises high-performance computing systems of various architectures (vector-scalar systems, multiprocessors, farms, and clusters with bulk memory). More than one thousand staff members of JINR and other research centres are HPCC users. The JINR HPCC is one of the five largest Russian centres. It actively co-operates with other leading centres, such as the Intergovernmental Supercomputer Centre and the Institute of High-Performance Computing and Data Bases (St. Petersburg). In collaboration with the leading nuclear physics centres of Russia, JINR participates in creating the Russian Regional Centre for LHC Data Handling (RRC-LHC) on the basis of the JINR HPCC resources.


Table 2. JINR High Performance Computing Centre (HPCC) main components.

                                 Peak performance, MFLOPS
HP Exemplar S-Class (SPP-2000)    5760
Convex C3840                       960
APE100                            1600
PC Farm                           9200
Total:                           17520

ATL 2640 Integrated Library System
Library Capacity       10.56 TByte
Cartridge Capacity     20/40 GByte
Drive Transfer Rate    1.5 MByte/s
Library Throughput     16.2 GByte/hr

In 2000, the JINR HPCC computing machine SPP2000 was used by 161 users; its load was 97%, with 58,000 hours of useful CPU time. The CONVEX-220 was used by 1140 users as a computing, mail and HTTP server.


Table 3. Relative use of the computing power and the modem pool by the JINR Laboratories.

             LIT    BLTP   DLNP   FLNR   FLNP   LPP    LHE    Board
SPP2000       5%    18%    17%     8%    17%    23%    12%     -
CONVEX220    26%     9%    13%    15%     5%     -     15%     7%
Modem pool   16.1%   0.1%  19.3%  12.4%  16.4%   4.6%  13.5%  17.6%

Software development

Information and computing support of JINR's participation in the experiments at CERN, DESY and BNL continued in 2000. The technology of designing object-oriented applications and databases (GEANT4, Objectivity/DB, ROOT) was studied. A new version of the LHC++ library has been installed on the LIT/JINR computing PC farm.

LHC Computing Support

For the last few years JINR has been involved in three LHC projects: ALICE, ATLAS and CMS. Participation of the Russian institutes in the LHC projects after the start-up of the accelerator and the experimental installations (planned for 2005) is directly connected with the need to provide a way of processing and analysing the experimental information in Russia itself. For this reason, by the end of 1999 a joint project, "Russian Regional Centre for LHC Data Handling" (RRC-LHC), was worked out. JINR and 9 leading Russian physics institutes participating in the LHC are involved in the project. In less than a year, LHC-oriented PC farms have been created at ITEP, IHEP, SINP MSU, and LIT and LNP of JINR. The program environment of these farms is completely unified and corresponds to the current state of the specialized software used at CERN. Thus, a start has been made on developing the prototype of the Russian regional centre.

In September-October, a run of mass production of physical events for the CMS high-level trigger was started at the LIT JINR PC farm (16 processor units of 500 MHz). Up to 20 GByte of simulated data are generated at the LIT PC farm within a day. The data production is performed with the PYTHIA (v6.136) generator and CMSIM (v.120), a program for simulation and reconstruction of events for the CMS experiment; the data are written in ZEBRA format (fz) in blocks of about 1 GByte, approximately 500 events per file. The data obtained will be transferred to CERN for inclusion into the object-oriented database (Objectivity/DB) that will be used for the definition of the basic units of information and for optimization of the trigger and event-reconstruction algorithms. The availability of the mass memory system at the JINR HPCC provides a way of testing various models of work with enormous data volumes, as well as of improving the technology of shared use of the mass memory together with the Moscow institutes.
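The figures quoted above (up to 20 GByte per day, blocks of about 1 GByte, roughly 500 events per file) imply about 20 files and 10,000 simulated events per day. A small sketch of this bookkeeping (the function name is ours; the parameters come from the text):

```python
def daily_production(gbytes_per_day=20, gbytes_per_file=1, events_per_file=500):
    """Estimate files and events produced per day from the figures quoted in the text."""
    files = gbytes_per_day // gbytes_per_file
    return files, files * events_per_file

files, events = daily_production()
print(files, events)  # -> 20 10000
```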

Investigations in parallelization of computations

A 32-processor APE100 complex in the 2x2x8 configuration was installed at the JINR HPCC last year. The APE project has been worked out and is being developed by a group of Italian theoretical physicists involved in QCD. The LIT group made central contributions to the current reworking of the TAO compiler kernel aimed at improving performance on APEmille. This rework also takes into account the specific architectural modifications necessary for porting the compiler to apeNEXT. This development has to be completed in order to obtain a reliable prototype of a stand-alone TAO compiler for apeNEXT and to allow combining the TAO compiler with a C compiler.

Maintenance of the JINR Program Library

New documents concerning the program libraries were prepared and made available on the WWW in 2000. They cover the implementation at JINR of electronic access to the program texts of the CPCLIB program library (Belfast, Northern Ireland) and the CPC (Computer Physics Communications) journal, as well as the maintenance of the NAG Library and CERNLIB on the JINR computer platforms. The JINRLIB continued to be filled with new codes.

DATABASE and WWW SERVICE

A systematic supplement and maintenance of the earlier constructed databases and information systems (IS) continued taking into account the users’ needs. Among these are:

A wide scope of problems has been solved in the field of information management, namely:

In order to maintain and develop the specialized WWW/FTP server FAXE (http://faxe.jinr.ru; ftp://faxe.jinr.ru), which offers program products to the JINR users, its hardware and software facilities have been modernized.

The XML (eXtensible Markup Language) technology has been studied. It is a new industrial standard that specifies the architecture of the next generation of Internet programming tools [2].

A converter, xcvt, has been developed in the Java language for processing XML documents. The program comprises style sheets for transforming XML documents into HTML and LaTeX. A practical investigation has been undertaken of the Internet applications designed under the aegis of the W3C consortium and applied in the WWW: Mathematical Markup Language, Vector Markup Language, and XHTML. These investigations can be effectively applied to
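The kind of transformation performed by xcvt can be illustrated with a minimal "style table" that maps XML elements to LaTeX markup. The sketch below is written in Python for brevity and is only an illustration of the principle; the actual xcvt is a Java program, and its real style tables are more elaborate:

```python
# Sketch of the xcvt idea: map XML elements to LaTeX via a style table.
# (Illustrative re-implementation, not the actual Java xcvt code.)
import xml.etree.ElementTree as ET

STYLE = {  # element tag -> (prefix, suffix): a minimal style table
    "title": (r"\section{", "}"),
    "emph":  (r"\emph{", "}"),
}

def to_latex(elem):
    """Recursively render an XML element tree as LaTeX text."""
    pre, post = STYLE.get(elem.tag, ("", ""))
    inner = (elem.text or "") + "".join(to_latex(c) + (c.tail or "") for c in elem)
    return pre + inner + post

doc = ET.fromstring("<article><title>XML at LIT</title>Uses <emph>style tables</emph>.</article>")
print(to_latex(doc))
# -> \section{XML at LIT}Uses \emph{style tables}.
```

An HTML target is obtained the same way by swapping the style table, which is exactly what makes the style-sheet approach attractive.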

Software for data visualization

Scientific visualization is an effective tool for deep insight into and analysis of the objects or processes under study. LIT supports and utilises several advanced visualization systems. The most powerful of them, the so-called modular visualization systems, are ConvexAVS and Iris Explorer.

Special codes for data visualization have been developed at LIT. For example, the PICASSO code was developed for visualization and interactive analysis of the results obtained with the GEANT-DIRAC simulation program. It is needed for debugging the program and for investigating processes in the DIRAC set-up [3]. Another example is the "Juno" program (Fig.3).

Fig.3 ‘JUNO’ application screenshot.

"Juno" is a tool for handling, conversion and statistical analysis of large experimental data bulks. It has the unique features enabling a non-programmer user perform complex manipulations on data, build one- and two-dimensional statistical distributions, accomplish rare events recognition by applying filters and additional criteria. “Juno” needs no special settings. It is implemented in Visual C++ environment and runs under Windows 9X/NT. The program is used to handle data gained at experimental installations for heavy ions physics research [4]

Computational Physics

The main tasks for Computer Physics at JINR are:

Mathematical modelling for experimental investigations

The properties of the projected experimental facility, a sub-critical assembly in Dubna (SAD) driven by the existing JINR 660 MeV proton accelerator, have been investigated using the particle transport codes LCS, MCNP4B/DLC189 and CASCADE [5]. The assembly consists of a central cylindrical lead target surrounded by mixed-oxide (MOX) fuel (PuO2 + UO2) and a lead reflector (Fig.4). The dependence of the energy gain on the proton energy, the neutron multiplication coefficient, and the neutron energy spectra have been calculated.

Fig.4. SAD scheme.

The calculations show that for the sub-critical assembly with mixed-oxide (MOX) BOR-60 fuel (29% PuO2 + 71% UO2) the multiplication coefficient k_eff is equal to 0.947, the energy gain is equal to 30, and the neutron flux density is 10^12 cm^-2 s^-1.
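For a sub-critical system, the total neutron multiplication implied by k_eff follows from the textbook geometric-series relation M = 1/(1 - k_eff); this relation is standard and is not taken from [5]. For k_eff = 0.947 it gives M of about 18.9; the energy gain of 30 quoted above additionally reflects the fission energy released relative to the beam energy:

```python
def multiplication(k_eff):
    """Total neutron multiplication M = 1/(1 - k_eff) of a sub-critical system."""
    if not 0 <= k_eff < 1:
        raise ValueError("sub-critical systems require 0 <= k_eff < 1")
    return 1.0 / (1.0 - k_eff)

print(round(multiplication(0.947), 1))  # -> 18.9
```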

Mathematical processing has been performed of the experimental data obtained in the first experiment carried out at the LNP phasotron within the SAD project. The main goal of the experiment was to study the differential characteristics of the secondary radiation field around a thick lead target irradiated by protons. Such experimental data are needed for verifying the calculations of the inter-nuclear cascade of secondary particles generated by the primary protons within the target. Fig.5 compares the calculated and experimental neutron spectra from the Pb target at 75° [6].

Fig.5. Comparison of the calculated and experimental neutron spectra from the Pb target at 75°.

One of the important problems of particle physics is the question of the existence of anomalously narrow multiquark states predicted in a series of theoretical studies. An experimental answer to the question of the existence of exotic hadrons, together with the study of their internal properties and of the processes in which they are formed, is of particular importance for basic ideas about the nature of hadronic matter. With the help of the techniques developed at LIT, an analysis was carried out of events recorded with the CERN two-metre hydrogen bubble chamber in π-p interactions at 16 GeV/c. The width of the observed structure is comparable with the experimental resolution [7]. In the analysis of the structure K(1630) → K0s π+ π-, kinematic features of its formation and decay were found which distinguish the group of events from the structure's interval from those of the other intervals of the mass spectrum. The probability of a random manifestation of these features is less than 10^-7. The conclusion has been drawn that a physical effect is observed. The results have been accepted by the Particle Data Group for the Review of Particle Physics [8].

Methods and software for complex physics system calculations

In collaboration with the Computational Science Division, Advanced Computing Centre of the Institute of Physical and Chemical Research (RIKEN), Japan, research on the molecular dynamics simulation of cluster-beam-surface impact processes for metallic phases was performed. An optimised version of the DL_POLY molecular dynamics simulation code [9] has been used. The interaction of energetic clusters of atoms with solid surfaces is investigated with the use of the Finnis-Sinclair many-body potential. The collision characteristics range from a soft landing (<0.1 eV/atom) up to higher impact energies (>1 eV/atom). The penetration of the cluster into the solid substrate results in such dynamic processes as plastic deformation of the material and shock waves. Shock waves and thermoelastic effects generated in the material are essential factors for the analysis of new nontrivial structures on the surface and may be used to explain the structural-phase changes of the treated surface. The modification of a surface exposed to high-energy cluster beams is studied by monitoring the molecular dynamics configurations of the system in real time and determining the critical impact energies necessary to produce implantation (Fig.6) [10].
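The time stepping behind such molecular dynamics runs can be sketched with the standard velocity Verlet scheme. The one-dimensional harmonic force below is a toy stand-in for the Finnis-Sinclair many-body potential actually used by DL_POLY; the function names are ours:

```python
# Core time-stepping loop of a molecular dynamics simulation (velocity Verlet).
def velocity_verlet(x, v, force, dt, steps, mass=1.0):
    """Integrate one particle; force is a callable giving the force at position x."""
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / mass) * dt * dt   # position update
        f_new = force(x)                            # force at new position
        v += 0.5 * (f + f_new) / mass * dt          # velocity update with averaged force
        f = f_new
    return x, v

k = 1.0  # toy harmonic spring constant
x, v = velocity_verlet(1.0, 0.0, lambda x: -k * x, dt=0.01, steps=628)
# after roughly one oscillation period (2*pi) the particle returns near its start
print(round(x, 2), round(v, 2))
```

The same loop, with a many-body force routine and millions of particles, is what a production MD code iterates.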

Fig.6 Four sequential snapshots of the Molecular Dynamics simulation.

A mathematical model of the evolution of the thermoelastic pulse arising in a metal exposed to an ion source is investigated. On the basis of numerical calculations, the relation between the thermoelastic wave form and the form and location of the source was studied, as well as the conditions for the propagation and damping of the thermoelastic waves. The influence of temperature on the velocity of the thermoelastic waves was established [11].

An effective algorithm for calculating wave functions of the continuous spectrum in the two-centre problem is proposed. To solve this problem, a fourth-order finite-difference scheme and the continuous analogue of Newton's method are applied. The continuous-spectrum wave functions of the two-centre problem for the hydrogen molecular ion were calculated, together with the phase shifts and the matrix elements between the continuous and discrete spectra. The absolute accuracy of the calculated phase shift is of order 10^-6 for electron momentum k ≥ 1 and of order 10^-4 for k ~ 0.1 [12].
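The continuous analogue of Newton's method evolves dx/dτ = -[F'(x)]^(-1) F(x) and, discretized with a step τ ≤ 1, yields a damped Newton iteration (τ = 1 recovers the classical method). A one-dimensional sketch with an illustrative equation, not the two-centre problem itself:

```python
def cnm_solve(F, dF, x0, tau=0.5, tol=1e-10, max_iter=200):
    """Continuous analogue of Newton's method: x <- x - tau * F(x)/F'(x)."""
    x = x0
    for _ in range(max_iter):
        step = F(x) / dF(x)
        if abs(step) < tol:   # Newton step small enough: converged
            return x
        x -= tau * step       # damped Newton update
    return x

# illustrative equation: x**2 - 2 = 0
root = cnm_solve(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(round(root, 6))  # -> 1.414214
```

The damping improves the convergence domain at the cost of the quadratic rate, which is the usual trade-off this method exploits.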

Software for computer modelling of relativistic heavy-ion collisions in the framework of the fluid-dynamic model for various equations of state was developed. The mathematical methods include the PIC (Particle-in-Cell) method for modelling the motion of nuclear matter, Newton and other iteration methods for solving the equation of state, and numerical integration methods for calculating the observables. Fortran and C++ codes were used for the computations, and IDL (Interactive Data Language) was used for visualization of the results.

The elastodynamic method and software in the theory of nuclear matter were developed and applied to nuclear fission physics. The model constructed predicts a two-mode character of fission: a spheroid mode (S-mode) and a torsion one (T-mode). It should be noted that the fission barriers for the T-mode lie higher than those for the S-mode. Since the T-mode is characterized by a compact fission configuration, one can expect the total kinetic energy (TKE) of the fragments flying apart to be higher than the TKE for the S-mode. Data on such TKE behaviour have been obtained in experiments (Obninsk) on fission of uranium isotopes induced by fast 8-10 MeV neutrons. A comparison of the fission barriers calculated in the elastodynamic model (S-mode and T-mode) with the experimental data allows one to conclude that the rotational mechanism of fission describes well the fission of medium nuclei (with mass numbers 170 < A < 210) [13].

A proof, started in 1999, of the invariance of the Feynman path integral (the transition amplitude in quantum mechanics, the partition function in statistical mechanics, the generating functional in field theory) under coordinate transformations has been completed at two loops of perturbation theory in the functional approach (i.e., without using a finite-dimensional approximation). All the problems related to determining the integration measure and to the existence of the counterterms arising at the quantum level have been completely solved [14]. The main value of these investigations is that their result allows one to apply the standard perturbation-theory method for a functional integral to problems with (topologically) nontrivial boundary conditions.

Modern computational tools in experimental data processing

Within the framework of software development for the HERA-B outer tracker, a new fast seeding algorithm for the tracking program RANGER was developed on the basis of the Radon-Hough transformation method and implemented as a C++ program. An algorithm for a very fast robust fit of a circle arc to the drift radii in the XoZ plane of the Magnet Chamber was developed, implemented and tested on real data from the MC super-layers.
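The idea behind Hough-transform seeding can be sketched for straight lines: each hit votes for all parameter cells (theta, r) consistent with it, and a track candidate shows up as a heavily voted cell. The toy version below only illustrates the principle and is not the RANGER code:

```python
# Illustrative Hough-transform seeding: hits vote in a discretized (theta, r)
# parameter space; the most-voted cell gives the track-line candidate.
import math

def hough_lines(hits, n_theta=180, r_max=100.0, n_r=200):
    acc = {}
    for x, y in hits:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            r = x * math.cos(theta) + y * math.sin(theta)  # normal form of a line
            j = int((r + r_max) / (2 * r_max) * n_r)       # discretize r
            if 0 <= j < n_r:
                acc[(i, j)] = acc.get((i, j), 0) + 1
    (i, j), votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * i / n_theta, (2 * r_max) * j / n_r - r_max, votes

# five collinear hits on the line y = x (so r is near 0)
hits = [(t, t) for t in range(1, 6)]
theta, r, votes = hough_lines(hits)
print(votes)  # -> 5: all five hits vote for one cell
```

Real track seeding replaces the line equation with the drift-circle geometry and keeps several top cells as seeds, but the voting structure is the same.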

A multifractal analysis of Atomic Force Microscope (AFM) images of Nb thin-film surfaces has been performed. The analysis makes it possible to propose a model of a new mechanism of order-parameter suppression at the 'superconductor-vacuum' boundary [15].

Physics fields and particles transport calculations

Within the framework of the engineering work for the ALICE experiment (CERN), three-dimensional calculations were performed for the magnetic system comprising the L3 magnet, a muon filter and a dipole magnet (Fig.7).

Fig.7. Computer model (a) and the distribution of the main magnetic field component (b) for one of the variants of a dipole magnet for the ALICE experiment.

Three-dimensional calculations of the electric field for the NA45 experiment (CERN) were also performed. The results of the computations were reported at a meeting of the NA45 collaboration in Darmstadt. For the project worked out at ITEP for an experiment with a polarized target, three-dimensional calculations were performed of the forces acting on the winding and the poles and of the forces polarizing the tips of the magnetic system [16].

Mathematical processing of experimental data in particle physics

Research, development, and integration of software and hardware platforms have been carried out for the modelling and data processing of a number of particle physics experiments. Basic properties of the local RISC cluster are its reconfigurability and scalability. The cluster is applied as an effective tool for solving physical data processing problems. The mathematical processing of the experimental data obtained at the EXCHARM installation is carried out on this cluster. The data banks containing the results of the mathematical processing of the initial experimental information (almost 200 GByte) for the EXCHARM experiment have been generated and prepared for further physical analysis.

The RISC cluster is also used for modelling the experiments on the processes involving charmed and strange particles at the U-70 accelerator in Serpukhov. A new data processing system has been created and put into operation. Its distinctive feature is the integration of the local Linux RISC cluster with the computing facilities of the JINR computer centre. By integrating the local cluster with the robotized bulk memory, a distributed software and hardware platform has been created for data processing in particle physics.

Within the framework of the CMS/LHC software activities, testing and modification of the CMSIM (Fortran) and ORCA (C++) programs for muon track reconstruction in the endcap muon system were performed [17].

Computer Algebra

In the year 2000, the following investigations were performed:

The algorithm and the results mentioned above are pioneering ones. Owing to the original algorithms embedded in them, the computer programs written in the C and C++ languages outperform the best foreign programs implementing the classical Buchberger algorithm for the calculation of Gröbner bases.

International co-operation

In accordance with the Agreement between JINR and the Research Centre Rossendorf (Germany) on co-operation in the field of application and development of computing systems, in particular for the «Zentrale Nutzerdatenbank» project, LIT takes part in creating an automated system for administering a computer complex that uses WWW technology as a means of access to an ORACLE database via the Internet. The LIT personnel provided a Java service: the development of Java programs working under the Microsoft Windows and UNIX (Linux, AIX) operating systems and controlled by standard WWW facilities, the Netscape Communicator and Internet Explorer browsers. These programs, in the form of Java applets, provide an interactive graphic user interface (GUI) for work with the Oracle database. Network access to the database was also provided via the Java JDBC API.

In co-operation with CERN and Brookhaven National Laboratory, the following work has been carried out:

In co-operation with Slovak scientists, qualitative and numerical research has been started on the nonlinear ODE system describing the existence and stability of disclination vortices in elastic matter. With the help of the computer algebra system MAPLE, the asymptotics of singular and nonsingular vortices at the zero point were obtained. The behaviour of the vortices at large r was numerically investigated for various parameters of the problem and various asymptotics.

Research on the nonlinear Schroedinger equation was performed in collaboration with the University of Cape Town (South Africa). It has been shown that the parametrically driven nonlinear Schroedinger equation has a wide class of travelling soliton solutions, some of which are stable. For small driving strengths, stable nonpropagating and moving solitons co-exist, while strongly forced solitons can only be stable if they move sufficiently fast [23].
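Equations of this type are commonly integrated numerically with a split-step Fourier scheme. The sketch below assumes the standard form of the parametrically driven, damped NLS from the literature, i psi_t + psi_xx + 2|psi|^2 psi = -i gamma psi + h conj(psi); the grid, parameters and the conservation check are illustrative and are not those of [23]:

```python
# Split-step Fourier sketch for the parametrically driven NLS
#   i psi_t + psi_xx + 2|psi|^2 psi = -i gamma psi + h conj(psi)
# (standard form assumed; pure-Python DFT keeps the sketch self-contained).
import cmath, math

def dft(x, sign):
    """Unnormalized forward DFT (sign=-1) or normalized inverse (sign=+1)."""
    n = len(x)
    out = [sum(x[m] * cmath.exp(sign * 2j * math.pi * k * m / n) for m in range(n))
           for k in range(n)]
    return [v / n for v in out] if sign > 0 else out

def split_step(psi, length, dt, steps, gamma=0.0, h=0.0):
    n = len(psi)
    ks = [2 * math.pi * (k if k < n // 2 else k - n) / length for k in range(n)]
    for _ in range(steps):
        # nonlinear, damping and driving part (pointwise in x)
        psi = [p * cmath.exp((2j * abs(p) ** 2 - gamma) * dt)
               - 1j * h * p.conjugate() * dt for p in psi]
        # linear dispersion part (diagonal in Fourier space): psi_t = i psi_xx
        spec = dft(psi, -1)
        spec = [s * cmath.exp(-1j * k * k * dt) for s, k in zip(spec, ks)]
        psi = dft(spec, +1)
    return psi

n, L = 16, 20.0
psi0 = [1.0 / math.cosh(j * L / n - L / 2) for j in range(n)]
psi = split_step(psi0, L, dt=0.01, steps=100)
norm0 = sum(abs(p) ** 2 for p in psi0)
norm1 = sum(abs(p) ** 2 for p in psi)
print(round(norm1 / norm0, 6))  # with gamma = h = 0 the L2 norm is conserved
```

With gamma and h switched on, the norm balance between damping and driving is what selects the stable solitons discussed above.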

The effective co-operation with the International Solvay Institutes for Physics and Chemistry (Brussels, Belgium) progressed in 2000. New integrated software for electrocardiogram analysis was developed [24]. Research on the analysis of results of optical coherence tomography of the human skin microstructure [25] was undertaken. Resonances, correlation, stabilization and control of complex systems were studied [26].

References

  1. Grushetsky M., Manafov A.Ya., Nikonov E.G. JINR P13-2000-173, Dubna, 2000.
  2. Galaktionov V.V. JINR P10-2000-44, Dubna, 2000.
  3. Zrelov P. – to be published.
  4. Krylov V. – to be published.
  5. Polanski A. Acta Phys. Polonica, Vol. B11, No. 1, p. 95, 2000; Barashenkov V.S. et al. JINR P2-2000-131; Sissakian A.N., Puzynin I.V., Polanski A., to be published.
  6. Bamblevski V.P., Krylov A.R., Polanski A. et al. Submitted to NIM.
  7. Karnaukhov V.M., Koka K., Moroz V.I. – Nuclear Physics, 2000, v.63, p.652.
  8. The European Physical Journal C, v.15, num.1-4, 2000, p.536.
  9. Kholmurodov K., Smith W., Yasuoka K., Ebisuzaki T. - Comput. Phys. Commun. 125, pp.167-192 (2000).
  10. Puzynin I.V., Kholmurodov K., Yasuoka K., Ebisuzaki T. - JINR E11-2000-228.
  11. Amirkhanov I.V. et al. –JINR P11-2000-263.
  12. Pavlov D.V., Puzynin I.V. et al., JINR, E11-2000-185.
  13. Bastrukov S., Salamatin V., Podgainy D., Streltsova O. in IV Int. Workshop on Nuclear Fission Physics, Obninsk, 2000, pp. 5-12.
  14. Kleinert H. and Chervyakov A. Phys. Lett. B 477, 373 (2000); Phys. Lett. A 269, 63 (2000); Phys. Lett. A 273, 1 (2000); Europhys. Lett. (2000); FU-Berlin preprint 2000 (quant-ph/0002067).
  15. Altaisky M.V. et al. Particles and Nuclei, Letters No.2[99]-2000.
  16. Ivanov A.I., M.V.Yuldasheva, O.I.Yuldashev NIMA, v.441, N1-2, pp.262-266, 2000.
  17. Golutvin I., … , Ososkov G., Palichik V., Tikhonenko E. Comput. Phys. Commun. 126 (2000) pp. 72-76; in Proceedings of CHEP2000, Padova, Italy, 2000, pp. 128-132.
  18. Gerdt V.P. In: "Problems of Modern Physics", JINR D2-99-263, 2000, pp.164-171.
  19. Kornyak V. International Journal of Modern Physics C, v.11, No.2 (2000) 397-414.
  20. Kornyak V., In: "Computer Algebra in Scientific Computing", Springer-Verlag, Berlin, 2000, pp. 273-284.
  21. Tarasov O.V. Nucl.Phys.B (Proc Supl) v.89 (2000) 112-116.
  22. Gerdt V.P. In: "Computer Algebra in Scientific Computing", Springer-Verlag, Berlin, 2000, pp.115-137.
  23. Barashenkov I.V. et al. JINR E17-2000-147; submitted to Phys. Rev. E.
  24. Ivanov V.V., Zrelov P.V. New Approach to ECG's Features Recognition Involving Neural Networks (submitted).
  25. Akishin P.G., et al; "Computer Physics Communications", vol. 126, No. 1/2, 2000, p.111-132.
  26. Antoniou I., Akritas P. and Ivanov V.: "Chaos, Solitons and Fractals", 11 (2000) 337-344; Antoniou I., et al. "Chaos, Solitons and Fractals", 11 (2000) 223-229; Akishin P.G.,et al. "Chaos, Solitons and Fractals", 11 (2000) 207-222, Antoniou I. and Ivanov V.V.: Computational Methods and Tools for Modeling and Analysis of Complex Processes, (submitted).