Career Profile

Educational background in Software Engineering with a specialization in Data. Versatile experience across cloud, data, and software engineering roles. Currently employed as a Cloud Data Engineer at UBS, specializing in Azure infrastructure, security, and data pipelines. Operates a homelab for practicing DevOps principles: configuring networks, developing utilities, deploying applications with Ansible and Docker, and monitoring with Grafana.

Education

MSc Informatics Engineering

2011-2013
Universidade do Minho, Braga, Portugal
  • Dissertation: “Analysis of the influence of stress on the interaction with the computer”.
  • Machine Learning, Knowledge Extraction, and Data Mining;
  • Data Warehousing Systems and Online Analytical Processing.

Bachelor in Informatics Engineering

2008-2011
Universidade do Minho, Braga, Portugal
  • Software requirements, Modeling & Analysis;
  • Design, development, testing and maintenance of software;
  • Advanced Administration and Operation of Database Systems.

Experience

Senior Cloud/Data Engineer

2024-Present
UBS, Lausanne, Switzerland

Senior Cloud/Data Engineer at UBS, with responsibilities in Azure infrastructure, security, and data pipelines.

  • Design, deploy, and manage cloud infrastructure on Microsoft Azure using GitLab CI/CD, Azure DevOps, and ARM templates, ensuring a scalable and secure environment;
  • Develop, debug, and optimize Azure Kubernetes Service (AKS) workloads, improving resource utilization and reducing operational costs;
  • Build and maintain scalable data pipelines using Python, Databricks, and Apache Airflow, ensuring efficient ETL processing and data orchestration;
  • Configure network security rules and implement segregated access controls for Azure storage, ensuring compliance with data governance policies;
  • Work closely with data analysts, data scientists, and DevOps teams to optimize data solutions and drive business insights;
  • Automate deployment, monitoring, and maintenance processes to enhance system reliability and operational efficiency.

Senior Big Data Engineer

2019-2024
Credit Suisse (UBS Group), Lausanne, Switzerland

Senior Big Data Engineer at Credit Suisse, with responsibilities in developing and maintaining data processes within the bank's data science lab.

  • Work closely with data owners, business teams, and data science project teams to clarify requirements;
  • Data discovery and understanding of source systems: capturing metadata, mapping data elements to business data requirements, and profiling data;
  • Development of data sourcing pipelines into Palantir Foundry using the in-house Enzyme framework;
  • Development of custom connectors and data transformers for multiple source systems: REST APIs, databases, file transfer protocols, etc.;
  • Manipulation of data in the different tools available in the company: NAS filesystems, S3 storage, HDFS, databases, etc.;
  • Contribution to Enzyme framework (mostly Python code) developing new features and bug fixes;
  • Maintenance and enhancement of Enzyme’s Data Quality control framework;
  • Prepare technical deployment specifications for the support team, and implement data quality checks;
  • Maintenance of the metadata and feed catalog;
  • Management of version control software (git/Bitbucket) and code deployment tools (Atlassian Transporter/Jenkins);
  • Development of data sourcing and data handling pipelines using Python, PySpark, and Bash;
  • Management of the alerting system and its integration with Moogsoft;
  • Training new joiners in the data sourcing process and Credit Suisse tools;
  • Technical advisor for the rest of the team and other stakeholders.

Software Engineer

2016-2018
BySide, Porto, Portugal

Software Engineer at BySide, with the following responsibilities:

  • Design, implementation, and maintenance of several systems such as Elasticsearch, Zookeeper, Redis Cluster, and Kafka;
  • Data handling using MySQL, Elasticsearch, and Redis;
  • Implementation of several Kafka producers and consumers, mostly in Java and PHP;
  • Implementation of a real-time MySQL-to-Elasticsearch replication system using Java and Kafka;
  • Development of several Bash scripts for deployment and automation on company servers;
  • Web Back-End development using PHP and Symfony framework;
  • Development of several high-performance services such as data replication, data imports, and batch processing of marketing campaigns;
  • Design and development of a REST API using Symfony;
  • DevOps duties, including deployment of infrastructure services using Ansible on CentOS, and Continuous Integration and Deployment with GitLab CI/CD.

Data Engineer

2013-2016
WeDo Technologies, Porto, Portugal

Development of new features and maintenance of the WeDo RAID platform at Sonae MCH, one of the largest retail companies in Portugal.

  • Collection of stakeholders’ needs and requirements, and design of solutions;
  • Development of integration processes from diverse data sources using RAID integration tools and PL/SQL packages;
  • Handling and analysis of large volumes of data on Oracle Databases;
  • Tuning and optimization of SQL queries by altering database design, analyzing different query options, and indexing strategies;
  • Development of data validation processes on the implemented platform;
  • Implementation of Analytical and Operational Dashboards;
  • Monitoring the performance and reliability of process execution in a Linux server environment;
  • Review, validation, and supervision of new project deployments into the RAID production environment.