Distributed Computational Technologies
Level of education: Master's
Type of instruction: Full-time
Duration: 2 years
Main academic courses
- Algorithmic Fundamentals of Information Processing
- Technologies and Algorithms of High-Performance and Distributed Computations
- Algorithmic Theory of Coding and Fast Signal Processing
- Introduction to Problems of Digital System Investigation and Design
- Precomputation of Complex Problems
- Methods of Statistical Information Processing
- Applied Software for Modelling Complex Systems
- History of Computer Engineering and Programming Development
- Mathematical and Computer Modelling Processes in Complex Systems
- Mathematical Description of Hard-to-Formalise Processes
- Automated Systems of Data Collection and Processing
- High-Performance and Distributed Computing Systems
- Mathematical Methods in Programming and Databases
- Numerical Methods of Solving Complex Problems
- Scientific Visualisation of Complex Physical Processes
- Students are trained in the world's most in-demand, cutting-edge computer technologies: cloud computing, big data, and Hadoop.
- They have access to advanced computer architectures: Blue Gene, UltraSPARC, GPGPU, and others.
- They can take part in cooperation programmes with the world's leading universities.
- They receive a comprehensive education, from solid mathematical training to in-depth knowledge of engineering disciplines.
- Graduates have an opportunity to work in leading computer companies.
- Analyst – a specialist who can translate applied problems into computational form.
- Mathematician – a specialist who can formulate problems with the necessary rigour, correctness, and efficiency.
- System programmer – a specialist who can turn the developed models into application and middleware software that implements them.
- Our graduates are versatile researchers, ready to apply their knowledge to pressing, resource-intensive problems.
Organisations where our graduates work
- Computer companies and software producers: IBM; Oracle; EPAM Systems; T-Systems; Digital Services Association; Speech Technology Center Limited; NEOTEK MARIN JSC; the Digital Design group of companies; and others
- Computing centres and research institutions: the Resource Centre ‘The Computing Centre of St Petersburg University’; and the Laboratory of Information Technologies of the Joint Institute for Nuclear Research
- Industry: Avrora Scientific and Production Association; JSC Concern Okeanpribor
- Universities: St Petersburg University; the University of Amsterdam; Aalto University; and others
Partner universities
- Pavol Jozef Šafárik University in Košice (Slovakia)
- The University of Amsterdam (the Netherlands)
- Lappeenranta-Lahti University of Technology (Finland)
- Aalto University (Finland) and others
Main areas of research
- High-performance computing; grid and cloud technologies
- Distributed computing and data processing
- Big Data
- Distributed ledgers
- Mathematical modelling of complex problems, development of computing environments (‘virtual testing ground’)
- Artificial intelligence systems and the Internet of Things
- Application of information technology to applied problems in physics, engineering, medicine, and the socio-economic sphere
Key achievements
- Creation of the first Beowulf cluster in Russia (1997)
- First distributed computing over the Internet (1998)
- Participation in projects to create GRID infrastructure: NorduGRID, X-GRID, LCG (since 2000)
- A system for dynamic load balancing in a heterogeneous environment (the DINAMITE product) (2003)
- Development of middleware for the European project ‘Virtual Laboratory’ (in cooperation with the University of Amsterdam) (2001-2010)
- Development of a distributed telemedicine cardiology system (within the framework of the SKIF programme) (2004)
- Methods for optimising data access in a heterogeneous computer environment (2008)
- Hierarchical resource control in GRID calculations (2009)
- Using virtualisation methods as the basis for operating a supercomputer centre (2011)
- Development of tools for a virtual personal supercomputer (2013)
- Development of tools for creating a virtual elementary particle accelerator (2014)
- Development of tools for the analysis and processing of Big Data (2015)
- Development of tools for a virtual hydrodynamic testing ground (2018)
- Development of a distributed ledger system and consensus algorithms (2018)
- Development of virtualisation methods for processing distributed Big Data (2020)