Internship Title
On the development of benchmarks to support the evaluation of virtualized environments
Specialization Areas
Software Engineering
Communications, Services, and Infrastructures
Internship Location
DEI-SSE
Context
Virtualization is a technology that allows the creation of virtual machines (VMs) on top of physical hardware. This is achieved through hypervisors, which divide the hardware resources among the VMs created by users. Examples of hypervisors include Xen, KVM, and VirtualBox. These tools are used by cloud providers, which offer such resources to their end-users in a pay-per-use model (IaaS, Infrastructure as a Service). Like other software systems, hypervisors can also contain software vulnerabilities.
Software vulnerabilities can originate from a design flaw, an implementation bug (OWASP), or operation mistakes (misconfiguration). When exploited, they can lead to consequences such as unauthorized access, data loss, and financial loss, among others. Vulnerabilities in hypervisors are especially critical because an attack on one client's VM can interfere with the environment of other clients.
There are benchmarks available to evaluate the properties of a system, such as the performance of an application. A widely known benchmark is TPC-C (https://www.tpc.org/tpcc/default5.asp), whose goal is to evaluate On-Line Transaction Processing (OLTP) applications. Other benchmarks are built on top of TPC-C (e.g., sysbench (https://wiki.gentoo.org/wiki/Sysbench) and HammerDB (https://www.hammerdb.com)). However, none of them is prepared to run in a virtualized environment in the presence of attacks.
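In practice, adapting such a benchmark usually means automating its runs and parsing the textual report it prints. A minimal Python sketch, assuming a sysbench-like output format (the sample text and the `parse_tps` helper are illustrative, not part of any existing tool):

```python
import re

# Hypothetical excerpt of a sysbench OLTP report (exact format varies by version).
SAMPLE_OUTPUT = """\
SQL statistics:
    transactions:                        12045  (200.71 per sec.)
    queries:                             240900 (4014.27 per sec.)
"""

def parse_tps(output: str) -> float:
    """Extract the transactions-per-second figure from a sysbench-style report."""
    match = re.search(r"transactions:\s+\d+\s+\(([\d.]+) per sec\.\)", output)
    if match is None:
        raise ValueError("no transactions line found in benchmark output")
    return float(match.group(1))

print(parse_tps(SAMPLE_OUTPUT))  # → 200.71
```

A harness built this way can collect throughput from repeated runs inside and outside the VMs, which is the raw data the later comparison tasks need.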
Through this research, we aim to adapt existing benchmarks for use in virtualized environments in the presence of attacks. To do so, a virtualized environment needs to be set up, and the benchmark applications need to be deployed in that environment and properly adapted. The outcome of this research will help to understand how hypervisors react in the presence of attacks, and the impact caused to the end-user.
The main objective of the benchmark to be developed/adapted is to support the evaluation of virtualized environments (hypervisors) both in a normal environment and in the presence of attacks. Vulnerabilities and exploits from the research group will be provided to support the development of this work.
Objective
The primary learning objectives of this research are as follows:
• Gain practical expertise in Linux environments (shell/Python scripting)
• Gain practical knowledge of virtualization technologies (Xen/KVM)
• Gain practical experience working with a large open-source software project
• Acquire hands-on experience in evaluating the performance of applications with the support of benchmark applications
• Develop practical skills in using benchmarks and potentially create metrics to support them
Work Plan - Semester 1
T1. [09/09/2024 to 31/10/2024] Literature Review and Tool Familiarization.
During this initial phase, an extensive literature review will be conducted to understand the existing hypervisor tools and benchmark applications that can be used.
T2. [01/11/2024 to 30/11/2024] Tool Setup and Preliminary Evaluation.
Setup of the virtualized environment (e.g., Xen or KVM) and deployment of the benchmark applications in that environment. A comparison of the benchmark results in a virtualized environment and in a non-virtualized one should also be performed.
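The comparison between the two environments can be summarized as a relative throughput overhead. A minimal sketch with purely illustrative numbers (`virtualization_overhead` is a hypothetical helper, not an existing API):

```python
def virtualization_overhead(native_tps: float, virtualized_tps: float) -> float:
    """Relative throughput loss (%) when moving from native hardware to a VM."""
    if native_tps <= 0:
        raise ValueError("native throughput must be positive")
    return 100.0 * (native_tps - virtualized_tps) / native_tps

# Illustrative placeholder values, not measurements.
print(virtualization_overhead(250.0, 200.0))  # → 20.0 (% overhead)
```

Reporting a single overhead percentage per workload makes the virtualized and non-virtualized runs directly comparable across hypervisors.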
T3. [01/12/2024 to 10/01/2025] Write the intermediate report.
Work Plan - Semester 2
T4. [11/01/2025 to 28/02/2025] Benchmark System Adaptation.
In this hands-on phase, existing benchmarking systems will be adapted. Metrics to support the evaluation of other properties (such as those related to DBMS properties) may be considered and added to the benchmark system.
T5. [01/03/2025 to 30/04/2025] Evaluation of the Hypervisor Systems in the Presence of Attacks.
In this task, the hypervisor system will be evaluated under attacks on the hypervisor environment. Such attacks may already be available or may need to be adapted for the evaluation. The evaluation should consider more than one DBMS.
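One candidate metric for this evaluation is the fraction of measurement intervals in which throughput under attack stays above a tolerance of the attack-free baseline. A sketch under stated assumptions (the `resilience` function, the 80% threshold, and the sample values are all illustrative, not part of any existing benchmark):

```python
def resilience(baseline_tps: float, tps_under_attack: list[float],
               threshold: float = 0.8) -> float:
    """Fraction of measurement intervals whose throughput stays above
    `threshold` * baseline while an attack is running."""
    if not tps_under_attack:
        raise ValueError("at least one measurement interval is required")
    ok = sum(1 for tps in tps_under_attack if tps >= threshold * baseline_tps)
    return ok / len(tps_under_attack)

# Illustrative per-interval throughput samples collected during an exploit.
samples = [210.0, 150.0, 90.0, 180.0, 205.0]
print(resilience(200.0, samples))  # → 0.6 (3 of 5 intervals above 160 TPS)
```

Such a metric captures the end-user impact directly: the same attack can be replayed against different hypervisors and DBMSs, and the resulting scores compared.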
T6. [01/05/2025 to 30/06/2025] Report and Documentation.
The final phase will involve documenting the research findings, methodologies, and practical recommendations. A comprehensive report summarizing the research outcomes, including the adaptations performed in the benchmark applications, will be prepared.
Conditions
- You will have a position in the SSE Laprie Lab
- Computational infrastructure will be provided for the work
- Free beer once a month in our community meetings (Talk Ideas)
Remarks
Recommended Bibliography:
- C. F. Gonçalves, N. Antunes and M. Vieira, "Intrusion Injection for Virtualized Systems: Concepts and Approach," 2023 53rd Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), 2023.
- J. D. Pereira, J. H. Antunes and M. Vieira, "A Software Vulnerability Dataset of Large Open Source C/C++ Projects," 2022 IEEE 27th Pacific Rim International Symposium on Dependable Computing (PRDC), Beijing, China, 2022, pp. 152-163, doi: 10.1109/PRDC55274.2022.00029.
Reference data sources to be used during the master's:
• https://vulnerabilitydataset.dei.uc.pt
• https://www.cvedetails.com
• https://www.exploit-db.com
Supervisor
José Alexandre D'Abruzzo Pereira
josep@dei.uc.pt