In 1997 the main efforts of the Laboratory of Computing Techniques and Automation (LCTA) were directed towards providing adequate networking, information, computing and software support for the research under way and the educational programme at JINR (the CONET project), and towards the modification of the JINR LAN, an installation unique in its length, complex topology, equipment, network protocols, operating systems and software, and in its enormous volume of data transfer.
The LCTA programme for scientific research is aimed at new results in computational physics (topic "Nonlinear Problems of Computational and Mathematical Physics: Investigations and Software") and in the simulation and mathematical processing of experimental data (topic "Development of Mathematical Algorithms and Creation of Software Support of Experiments in Particle Physics").
In 1997 the Laboratory took an active part in several comprehensive interdepartmental national programmes intended for the establishment of a modern centre for information processing at JINR, including the programmes "Creation of the National Telecommunication Network for Science and Higher Schools" and "Creation of High-Performance Computer Centres in Russia".
In 1997, following the world tendency in computing for science and higher education and the growing requirements of users, the Laboratory developed a conception for establishing a High Performance Computing Centre (HPCC) at JINR. The conception calls for the balanced development of four main components of the HPCC, namely
In 1997 external communication over the 2 Mbps fibre-optic link JINR - RSCC "Dubna" - Shabolovka - Moscow Backbone (M9) was started. The financing came from the Ministry of Science and Technology within the programme "Creation of the National Computer Telecommunications Network for Science and Higher Schools" for the project "Creation of the RBNet Infrastructure (Terrestrial Channels)". Thus, the problem of the computer communication link JINR - Moscow (M9) has been solved: the throughput of the channel is 16 times higher than was available before. At present, the Institute uses an integrated terrestrial-satellite communication link to Germany through the RUHEP network and the German DFN network. Collaborative use of the RADIO-MSU channel to Europe and the DEMOS channel to North America is under consideration.
In 1997, work was started on changing the JINR LAN backbone over to ATM technology. The Committee of JINR LAN users unanimously approved a programme of migrating the JINR LAN to this advanced communication technology; however, limited financing retards its full-scale realization. ATM technology will allow the creation of a reliable, high-performance and well-controlled JINR backbone. For these purposes, LCTA received part of the HP OpenView software package for network analysis and monitoring, which has been put into operation.
To make work with world databases more effective, the general database server, an AlphaServer 2100, has been upgraded and a proxy server has been introduced. A database of JINR IP addresses, based on client-server technology and with a WWW interface for visualization, has been developed and put into operation.
In 1997 the load on the main JINR servers grew. Figure 1 shows the distribution of CONVEX computing server usage among the Institute's Laboratories.
Fig. 1. Time distribution of the CONVEX computer central processor in 1997
As regards computing facilities, JINR faces a great diversity of computational problems in various fields of physics that need powerful computing resources: problems of theoretical and mathematical physics, solid state physics and, especially, experimental data processing in high energy physics. To address them, LCTA and other JINR Laboratories worked out a project for the creation of a modern high-performance computing centre. At the end of 1997 the Ministry of Science and Technology allocated funds for the initial stage of the project. At the beginning of 1998 the Laboratory expects to receive an S-class Hewlett-Packard SPP2000 system (8 processors, peak performance 6.4 GFlops), a D-class file server and a mass storage system of up to 10 TB. In accordance with an agreement with Germany, JINR received and put into operation a C3840 system comprising four scalar/vector processors.
A SUN/SPARC cluster has been created at JINR as an OS Solaris environment server; it runs under OS Solaris 2.5.1. Site licenses for Fortran F77-4.0 and C++-4.1 provide JINR specialists with full facilities for work in the OS Solaris environment, including the use of current versions of CERNLIB. The software is available to any JINR hosts running under OS Solaris. In addition, the latest versions of the FSF/GNU software and CERNLIB, popular with JINR users, have been installed on the server.
A specialized SUN cluster has been installed for the CMS experiment at JINR. Its software support is similar to that of the analogous CMS cluster at CERN. The CMS cluster at JINR supports simulation and data processing tasks.
Work was continued on maintaining the LINUX environment at JINR. The installation of the LINUX server was supported by the Russian Foundation for Basic Research.
In 1997 LCTA, LPP and LHE worked out the project "A Computing Farm for Solving Problems of Simulation and Homogeneous Physics Information Processing". The goal of the project is to establish a computing farm of PCs providing new facilities for the modelling of physics experiments and for the mass processing and monitoring of homogeneous arrays of experimental physics information.
Database systems and information support for all JINR activities have been developed. Work was in progress on the BAPHYS project to create at JINR an information centre for institutions involved in fundamental and applied nuclear physics research (JINR, IHEP (Protvino), St. Petersburg Institute of Nuclear Physics, Novosibirsk Institute for Nuclear Physics, Institute for Nuclear Research (Troitsk), ITEP and the Research Institute of Nuclear Physics of MSU (Moscow)). A database connected to the JINR Library Reference system is maintained on the basic server of this centre (http://dbserv.jinr.ru/library/), equipped with a WWW-ORACLE system. The subsections "High Energy Physics (HEP)", "Low and Medium Energy Physics" and "Physics Conferences, Workshops and Summer Schools" of the subsystem "Physics Information Servers and Data Bases", as well as "Publishing Offices", have been modified.
Several versions of introducing the JINR topical plan for research and related data into information retrieval systems, including WWW, have been realized.
Work was continued on the development of information support for the multilevel set of WWW servers at JINR, in particular the general JINR server and the LCTA server. On the JINR general server, the main modifications concerned the INFO and NEWS sections. The LCTA server received a new home page, and the relevant subsections were modernized accordingly.
The CMS information system relies heavily on the World Wide Web (WWW). The web server http://sunct2.jinr.dubna.su, designed at LCTA, contains information on the activities of the RDMS CMS collaboration and was adopted as the collaboration's official web server. It is now referenced from the CERN CMS web servers CMSDOC and CMSINFO.
A home page for the newspaper "Dubna" has been introduced and is currently maintained at the JINR WWW server.
A procedure has been worked out, from the methodical viewpoint, for introducing full-text databases from the JINR Publishing Department. It requires, however, the purchase of licensed software to provide proper access to the information.
In 1997 the maintenance and development of general-purpose program libraries progressed at LCTA, including:
An original system for graphic digitizing (GDS), providing a way to tabulate graphics presented in journals and preprints, has been developed and realized at LCTA. The special-purpose software supports the measuring procedure and the subsequent restoration of information; continuous lines in the graphics can be digitized automatically. The GDS has been tested on the restoration of numeric information from graphics; the tests have shown that the measurement procedure does not introduce noticeable errors into the input data. The system has a user-friendly interface (Fig. 2).
Fig. 2. GDS package workbench
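The idea of automatically digitizing a continuous line can be illustrated by a toy column scan in Python; this is a hypothetical sketch for illustration only, not the actual GDS algorithm:

```python
import numpy as np

def digitize_curve(img, threshold=0.5):
    """For each pixel column of a grayscale plot image, return the mean
    row index of the dark (below-threshold) pixels: a toy version of
    automatically digitizing a continuous line."""
    ys = []
    for col in img.T:                            # iterate over columns
        rows = np.nonzero(col < threshold)[0]
        ys.append(rows.mean() if rows.size else np.nan)
    return np.array(ys)

# Synthetic "plot": white image with a one-pixel-thick dark parabola.
h, w = 50, 80
img = np.ones((h, w))
x = np.arange(w)
y = (0.006 * (x - w / 2) ** 2 + 5).astype(int)   # curve row per column
img[y, x] = 0.0
curve = digitize_curve(img)
```

A real digitizer must in addition handle axis calibration, overlapping curves and line thickness, which this sketch deliberately ignores.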
Software based on up-to-date technologies has been designed: the Web-Oracle gateway Oralink (http://oradb1.jinr.dubna.su/software/oralink/), the Russian Web Survey (http://oradb1.jinr.dubna.su/rws/), Java applets, etc. The Russian Web Survey project is the most complete and reliable source of information on WWW servers in Russia and on the HTTP server software they use.
A special client-server interface based on IDL has been created for 3D visualization and analysis of data from the DEMON+CORSET set-up. Figure 3 shows a result of processing with the designed software.
Fig. 3. IDL package workbench
In 1997 the UC computing complex was upgraded by joint efforts of the LCTA and UC staff:
In 1997, research in computational physics was in progress; more than 100 scientific works were published or reported at international conferences.
A mathematical model of inelastic collisions of high-energy nuclei, developed at LCTA, was used to analyze experimental data of the international collaboration EMU-01/12 on interactions of gold ions with photoemulsion nuclei; a previously unknown feature of the high-energy fission reaction has been discovered: a spherically symmetric simultaneous emission of a large number of nuclear fragments.
Within the collaboration of LCTA with FLNR, the St. Petersburg Institute of Physics Research and the Russian Research Centre "Kurchatov Institute", computations were continued on quasielastic scattering and total cross sections for the systems 11Li+12C and 8B+12C, using densities obtained in nuclear-structure models and effective nucleon-nucleon forces.
Computations have been performed of the M1 and E1 strength distributions up to the giant resonances in deformed nuclei. It has been shown that the calculated fragmentation of the M1 strength below 4 MeV in 166,168Er, 172,174Yb and 178Hf is stronger than that in Dy and Gd, in agreement with experimental data.
Calculations were continued, and algorithms and software were elaborated, relating to the development of a wave theory of the nature of elementary particles and to research on their structure and resonance mass distributions.
An elastodynamical model of the macroscopic dynamics of symmetric nuclear fission has been formulated. Modelling has been performed, and a macroscopic barrier for the symmetric fission of the superheavy element 298114 was obtained (Fig. 4); the synthesis of this element was planned in experiments under way at FLNR, JINR.
Fig. 4. Calculated macroscopic fission-barrier profile for symmetric fission of the superheavy element 298114
By means of mathematical simulation, the energy dependence of the ionization loss and of the amplification coefficient in subcritical electronuclear systems was investigated. It has been shown that, in spite of the enormous ionization loss, the amplification coefficient drops only by half when the beam energy is reduced from 1 GeV to 200 MeV; this can easily be compensated by increasing the intensity of the accelerated particle beam.
A numerical solution of the nonlinear heat equation was obtained to simulate the evolution of the thermal field induced by high-current ion beam treatment of a metallic target. The dependences of the heating rate and the melting depth on the ion beam intensity were obtained.
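The type of calculation described above can be sketched with an explicit finite-difference scheme. The geometry, material constants and beam flux below are illustrative assumptions, and the sketch uses a constant conductivity, whereas the actual problem is nonlinear (temperature-dependent properties):

```python
import numpy as np

def heat_pulse(n=100, depth=1e-4, dt=1e-8, steps=2000, q=1e9,
               k=50.0, rho=7800.0, cp=500.0):
    """Explicit 1D finite-difference solution of
    rho*cp*dT/dt = d/dx(k*dT/dx), with a beam heat flux q [W/m^2]
    at the surface (x = 0) and ambient temperature at the far side.
    All parameters are illustrative, not taken from the experiment."""
    dx = depth / n
    alpha = k / (rho * cp)                 # thermal diffusivity
    assert alpha * dt / dx**2 < 0.5        # explicit-scheme stability limit
    T = np.full(n, 300.0)                  # initial target temperature, K
    for _ in range(steps):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
        Tn[0] = Tn[1] + q * dx / k         # surface flux boundary condition
        Tn[-1] = 300.0                     # far side held at ambient
        T = Tn
    return T

T = heat_pulse()                           # temperature profile after the pulse
```

From such profiles one reads off the surface heating rate and, with a melting temperature threshold, the melting depth as a function of beam intensity q.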
Work progressed on the modification of codes, developed on the basis of the mathematical methods of cellular automata and artificial neural networks, for solving problems of track recognition and of the search for and identification of secondary vertices in the DISTO, ATLAS, CERES/NA-45, EXCHARM and STAR experiments [10-12].
Within the DIRAC collaboration, the GEANT-DIRAC modelling programs were developed and maintained, and research on designing a three-level neural net trigger for measuring the lifetime of dimesoatoms in DIRAC was in progress. Research on parametric and nonparametric methods for the separation of Gaussian-shaped peaks was performed. The algorithms developed have been tested on model and real data; the problems were solved on the basis of a wavelet transformation of the initial data.
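The parametric variant of Gaussian peak separation amounts to a least-squares fit of a multi-peak model to the spectrum. A minimal sketch on synthetic data (all numbers are illustrative, and the wavelet preprocessing used above is omitted):

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, m1, s1, a2, m2, s2):
    """Sum of two Gaussian peaks."""
    return (a1 * np.exp(-0.5 * ((x - m1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - m2) / s2) ** 2))

# Synthetic spectrum: two overlapping peaks plus mild noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 400)
data = two_gaussians(x, 1.0, 4.0, 0.6, 0.7, 6.0, 0.5)
data += rng.normal(0.0, 0.01, x.size)

# Parametric separation: least-squares fit of the two-peak model.
p0 = [0.8, 3.5, 1.0, 0.8, 6.5, 1.0]        # rough initial guess
popt, _ = curve_fit(two_gaussians, x, data, p0=p0)
a1, m1, s1, a2, m2, s2 = popt
peaks = sorted([m1, m2])                   # recovered peak positions
```

Nonparametric methods instead locate peaks without assuming a fixed functional form, which is where the wavelet transformation of the data comes in.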
Computational models for the two dipole magnet projects VULKAN-1 and VULKAN-2 of the ALICE experiment have been elaborated. Calculations of the 3D magnetic field and of the magnet characteristics have been performed (Fig. 5).
Fig. 5. Computational model of the VULKAN-2 magnet and results of calculations
The problem of recomputing the SP-40A magnetic field of the EXCHARM spectrometer from one operating mode to another has been considered. Field distributions were simulated for two operating modes, experimental data being available for one of them. The calculated distribution was compared to the experimentally measured one with a satisfactory result. By comparing the two calculated distributions, an algorithm was obtained for recomputing the magnetic field from the mode with known experimental data into the other mode. Magnetic field maps have been composed for two modes of the EXCHARM installation: a measurement mode with Bo = 0.7840 T, and a mode with a central field equal to 0.85 of Bo. The results are intended for experimental data processing in the 10th session of the EXCHARM experiment.
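The recomputation algorithm itself is not spelled out above. One plausible reading, shown here only as a hypothetical sketch, rescales the measured map point by point by the ratio of the two calculated maps:

```python
import numpy as np

def recompute_field(b_measured, b_calc_known, b_calc_target):
    """Hypothetical reconstruction of the mode-to-mode recomputation:
    rescale the measured field map by the point-wise ratio of the
    calculated maps for the target and the known modes."""
    return b_measured * (b_calc_target / b_calc_known)

# Toy 1D field maps along the spectrometer axis (arbitrary shapes).
z = np.linspace(-1.0, 1.0, 5)
b_calc_known = 0.7840 * np.exp(-z**2)      # calculated map, known mode
b_calc_target = 0.85 * b_calc_known        # calculated map, target mode
b_measured = 1.02 * b_calc_known           # "measurement" with a 2% offset

b_target = recompute_field(b_measured, b_calc_known, b_calc_target)
```

The point of such a scheme is that systematic deviations between calculation and measurement cancel in the ratio, so the recomputed map inherits the accuracy of the measured one.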
An effective adiabatic scheme for the computation of energy levels and transitions of the antiprotonic helium atom has been obtained for the planned experiments.
The development of effective methods and algorithms for the numerical investigation of Cauchy problems for classical Hamiltonian systems was continued at LCTA. The problem arose from modelling the process of nuclear fragmentation in the FOBOS experiment. The efficiency of the approach has been demonstrated in dynamical computations for few-body systems with various pair potentials.
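Cauchy problems for Hamiltonian systems of this kind are commonly integrated with symplectic schemes, which conserve energy over long runs. A minimal velocity-Verlet (leapfrog) sketch, with a harmonic pair potential as a toy stand-in for the nuclear forces (not the actual LCTA algorithm):

```python
import numpy as np

def leapfrog(q, p, grad_v, dt, steps):
    """Velocity-Verlet (leapfrog) integration of H = p^2/2 + V(q),
    a standard symplectic scheme for Hamiltonian Cauchy problems
    (unit masses assumed)."""
    q = np.asarray(q, float).copy()
    p = np.asarray(p, float).copy()
    for _ in range(steps):
        p -= 0.5 * dt * grad_v(q)   # half kick
        q += dt * p                 # drift
        p -= 0.5 * dt * grad_v(q)   # half kick
    return q, p

# Toy potential V(q) = q^2/2, so grad V = q; exact energy is conserved.
grad_v = lambda q: q
q, p = leapfrog([1.0], [0.0], grad_v, dt=1e-3, steps=10_000)
energy = 0.5 * p[0]**2 + 0.5 * q[0]**2
```

Unlike generic Runge-Kutta schemes, the symplectic step keeps the energy error bounded rather than drifting, which matters in long fragmentation trajectories.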
Software has been developed for calculating the asymptotics of potential curves and adiabatic potentials in the framework of the hyperspherical adiabatic approach. The asymptotic potentials make it possible to calculate the energy levels and radial wave functions of two-electron systems in the adiabatic and coupled-channel approximations of this approach. The program has been applied to calculating the energies of the ground state and several doubly excited states of H- below the n = 2 threshold.
The fractal properties of various objects (irradiated thin films, pictures taken in cosmic space, chaotic signals) were investigated. On the basis of a computer analysis of these objects, conclusions have been drawn about the nature of the respective processes. In addition, the fractal properties of the signals were studied from a functional-analysis viewpoint.
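A standard way to estimate a fractal dimension from such image or signal data is box counting; the report does not state which estimator was used, so the following is a generic sketch, tested on a uniformly filled square whose dimension is known to be 2:

```python
import numpy as np

def box_counting_dimension(points, scales):
    """Estimate the box-counting dimension of a 2D point set:
    count occupied boxes N(s) at each scale s and fit
    log N(s) ~ -D log s."""
    counts = []
    for s in scales:
        boxes = {tuple(np.floor(p / s).astype(int)) for p in points}
        counts.append(len(boxes))
    slope = np.polyfit(np.log(scales), np.log(counts), 1)[0]
    return -slope

# Toy object: points uniformly filling the unit square (dimension 2).
rng = np.random.default_rng(1)
pts = rng.random((20_000, 2))
d = box_counting_dimension(pts, scales=[0.05, 0.1, 0.2])
```

For a fractal object such as a film-damage pattern, the same fit yields a non-integer slope, and deviations from a straight line in the log-log plot indicate the range of scales over which the object is actually self-similar.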
Approaches to the construction of asymptotically optimal interface solvers have been elaborated for nonlinear elliptic problems with anisotropic coefficients varying sharply within the domain. The approaches are based on an efficient sparse approximation of the related Schur complement matrices and on the construction of efficient multilevel iterative methods for solving the arising interface equations.
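The elimination underlying a Schur complement interface solver can be shown on a small dense system (the production solvers described above use sparse approximations of S and multilevel iterations instead of the dense solves below):

```python
import numpy as np

def schur_interface_solve(A, f, interior, interface):
    """Solve A u = f by eliminating interior unknowns: form the Schur
    complement S = A_GG - A_GI A_II^{-1} A_IG on the interface,
    solve the interface equation S u_G = g, then back-substitute."""
    Aii = A[np.ix_(interior, interior)]
    Aig = A[np.ix_(interior, interface)]
    Agi = A[np.ix_(interface, interior)]
    Agg = A[np.ix_(interface, interface)]
    S = Agg - Agi @ np.linalg.solve(Aii, Aig)          # interface operator
    g = f[interface] - Agi @ np.linalg.solve(Aii, f[interior])
    u = np.empty_like(f)
    u[interface] = np.linalg.solve(S, g)
    u[interior] = np.linalg.solve(Aii, f[interior] - Aig @ u[interface])
    return u

# 1D Laplacian on 5 unknowns; node 2 is the interface between two subdomains.
n = 5
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
f = np.ones(n)
u = schur_interface_solve(A, f, interior=[0, 1, 3, 4], interface=[2])
```

Since S couples only interface unknowns, the interface equation is much smaller than the original system, which is what makes sparse approximations of S and multilevel iteration on it pay off.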
In the framework of the development of computer methods, algorithms and software for solving problems of computational and mathematical physics under way at JINR, a new universal algorithmic approach to constructing Grobner bases has been designed as an alternative to the classical Buchberger algorithm. In this direction, an improved version of the involutive algorithm was developed [20, 21] which minimizes the number of intermediate polynomials, thus increasing computational efficiency.
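What a Grobner basis computation produces can be illustrated with SymPy's built-in routine (a Buchberger-type implementation, not the involutive algorithm developed at LCTA):

```python
# Compute a reduced lexicographic Groebner basis of a small system.
from sympy import groebner, symbols

x, y = symbols('x y')
# The basis rewrites the system so that leading terms expose its solutions:
# here the circle x^2 + y^2 = 1 intersected with the line x = y.
G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')
basis = list(G.exprs)
```

The involutive approach reaches an equivalent basis by completing the system with respect to involutive divisions, generating fewer intermediate polynomials than Buchberger-style S-polynomial reduction.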
A C program was written for the analytical computation of the coefficients of the heat kernel expansion for elliptic differential operators on curved manifolds and in the presence of arbitrary gauge fields. The latest version of the program is able to compute the most complicated, fourth DWSG coefficient (E4) and is considerably superior to its analogues at other institutions.
In 1997 the earlier developed C program for constructing finitely presented Lie (super)algebras was substantially improved. Its computational efficiency and speed are higher than those of other available programs.
In the course of research on the computation of multiloop Feynman diagrams, a new type of recurrence relations for Feynman integrals has been found, involving, in particular, recursions in the space-time dimension. A new method for computing Taylor expansions of Feynman integrals has been developed, based on these generalized recurrence relations. A new technique for the semi-analytical calculation of Feynman diagrams has been designed, based on the asymptotic expansion of integrals, the use of conformal mappings and the construction of Pade approximants. Two-loop radiative corrections to the so-called ρ-parameter in the Standard Model, containing the contributions of the t-quark and the Higgs boson, have been computed; the QCD corrections to this parameter were computed at the three-loop level. All the corrections were included in the program package ZFITTER for the comparison of theoretical predictions with experimental data obtained at the LEP I accelerator at CERN.
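The Pade step of the semi-analytical technique can be sketched directly from Taylor coefficients by a small linear solve (a textbook construction, not the actual package):

```python
import numpy as np
from math import factorial

def pade(coeffs, m, n):
    """Build the [m/n] Pade approximant P(x)/Q(x) from Taylor
    coefficients c_k (requires m + n + 1 coefficients and m >= n - 1):
    solve the linear system matching the series through order m + n."""
    c = np.asarray(coeffs, float)
    # Denominator q_1..q_n from sum_{j=0..n} q_j c_{m+i-j} = 0, q_0 = 1.
    A = np.array([[c[m + i - j] for j in range(1, n + 1)]
                  for i in range(1, n + 1)])
    q = np.concatenate(([1.0], np.linalg.solve(A, -c[m + 1:m + n + 1])))
    # Numerator from p_k = sum_{j<=min(k,n)} c_{k-j} q_j.
    p = np.array([sum(c[k - j] * q[j] for j in range(min(k, n) + 1))
                  for k in range(m + 1)])
    return p, q

# Example: [2/2] Pade of exp(x) from its Taylor coefficients 1/k!.
c = [1.0 / factorial(k) for k in range(6)]
p, q = pade(c, 2, 2)
approx = np.polyval(p[::-1], 1.0) / np.polyval(q[::-1], 1.0)   # ~ e
```

The rational approximant typically converges well beyond the radius of convergence of the truncated series, which is what makes the combination with conformal mappings effective for Feynman integrals.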
Work was continued on the creation, development and realization of distributed computer software and hardware systems for experiments in particle physics. The architecture and software have been developed, and a Parallel Extended Integrated System (PARIS-97) has been created on the basis of a cluster of Pentium-type PCs, an Ethernet network and EXABYTE peripheral devices.
In 1997 work on software support for the EXCHARM experiment was in progress. A program bank for a LINUX-compatible operating system has been optimized for primary EXCHARM experimental data processing (BISON-97), providing high productivity and reliability. Control and management tools for high-intensity stochastic information streams under asynchronous multi-access to limited disk storage in local and distributed processing systems have been designed, investigated and introduced into the BISON-97 processing system. A new version of the program package has been designed for the geometrical calibration of the trajectory detector modules, which allows one to determine the angular, longitudinal and transverse corrections of the local coordinate systems. Algorithms for the identification of various events have been developed. The numerical results of the mathematical processing (the data bank dst97-s10 and the time-ordered efficiency tables eff97) are used as input information for solving a wide range of physics problems relating to the production and decay of particles with charmed and strange quarks.
In 1997 the scientific cooperation of LCTA with leading research centres worldwide continued.
Within the collaboration with the Research Centre Rossendorf (FZR, Germany), a CONVEX C3840 computing system has been received.
According to the JINR-CERN Cooperation Agreement, work was continued on:
In accordance with the joint work schedule on the Trigger/DAQ system for the ATLAS experiment (CERN), LCTA's Distributed Computing Systems Sector completed a series of works on modelling and debugging the software for the ATLAS second-level muon trigger. As part of the joint work on designing a prototype of the ATLAS DAQ system, a high-level design of the Resource Manager (RM), a component of the ATLAS DAQ back-end software, has been produced. CORBA (in its ILU implementation) has been investigated for RM communication purposes. Jointly with CERN, a technological base of object-oriented technology has been created for carrying out a full cycle of engineering of distributed computing systems with the use of CASE tools. In collaboration with CERN's SPS/LEP division, a series of works was completed on the application of LabVIEW programming in the SPS/LEP control system.
In cooperation with the Research Centre Karlsruhe (Germany), SAND/1 neural processor modules have been developed on the basis of ALTERA 10K FPGA technology.
The international cooperation in the area of computational physics was continued:
© Laboratory of Information Technologies, JINR, Dubna, 1998