
Senior Data & Solution Architect Position in Porto - LUZA PT/HITO Solutions

Senior Data & Solution Architect



Job Description

Job Title: Senior Data & Solution Architect

Location: Porto, Portugal

Work Regime: Full-time & Hybrid (3 days per week in the office)



Overview / Summary:
We are looking for a Senior Data & Solution Architect to join our team!


Responsibilities and Tasks:
  • Design scalable data solutions on AWS, applying cloud architecture best practices;
  • Build and orchestrate data pipelines using Python, AWS Glue, Airflow, and Step Functions (an illustrative sketch follows this list);
  • Manage large-scale data integrations, including real-time (Kafka) and batch (ETL) processing;
  • Model and optimize databases, both relational (MySQL, PostgreSQL, Redshift) and NoSQL;
  • Implement and maintain Iceberg tables to ensure efficiency, consistency, and ACID compliance in big data environments;
  • Monitor production systems proactively, troubleshooting and mitigating potential issues;
  • Provide technical leadership, guiding teams on architecture and development best practices;
  • Document solutions and architectures, translating business requirements into clear technical specifications and challenging them where needed;
  • Leverage Azure Databricks for data engineering and analytics use cases when required.
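
As a rough illustration of the orchestration work described in the list above (not part of the original posting), the sketch below shows a minimal Airflow DAG that triggers a pre-existing AWS Glue job through the Amazon provider package. The DAG id, schedule, and Glue job name are assumptions, and an AWS connection is presumed to be configured in Airflow.

```python
# Illustrative only: minimal Airflow DAG that runs a hypothetical AWS Glue job.
# Assumes apache-airflow 2.x with apache-airflow-providers-amazon installed.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="daily_sales_etl",          # assumed DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ "schedule" argument
    catchup=False,
) as dag:
    run_glue_job = GlueJobOperator(
        task_id="run_glue_job",
        job_name="sales_to_redshift",  # hypothetical Glue job defined elsewhere
        wait_for_completion=True,      # poll until the Glue run finishes
    )
```

The same pattern extends to Step Functions (for example via boto3's Step Functions client), with Airflow handling scheduling, retries, and dependency ordering between tasks.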


Requirements

Mandatory Requirements:
  • Expertise in AWS: Extensive experience with AWS services, including Lambda, Glue, Step Functions, CloudFormation, and CloudWatch;
  • Strong Solution Architecture Knowledge: Ability to design scalable and efficient data solutions on AWS, adhering to best practices for cloud architecture and infrastructure;
  • Proficiency in Python and Databases: Strong programming skills in Python and experience with relational databases (MySQL, PostgreSQL, Redshift) and NoSQL databases;
  • Orchestration and Workflow Management: Experience with orchestration tools such as Apache Airflow and AWS Step Functions for automating and managing data workflows;
  • ETL Tools and Big Data Experience: Knowledge of ETL tools and experience working with large volumes of data, with a preference for experience with Kafka;
  • Experience with Iceberg Tables: Familiarity with Iceberg tables for managing large datasets efficiently, ensuring data consistency, and supporting ACID transactions (see the illustrative sketch after this list);
  • Production Awareness and Troubleshooting: Proactive approach to production monitoring and troubleshooting, with the ability to anticipate and mitigate potential issues;
  • Technical Leadership and Communication: Capability to evolve into a technical lead role, with excellent communication and teamwork skills for effective collaboration with cross-functional teams;
  • Documentation and Requirements Analysis: Experience in writing solution documents and technical documentation, with the ability to challenge and refine business requirements;
  • Exposure to Azure Databricks: Familiarity with Databricks for data engineering and analytics tasks;
  • Fluent in written and spoken English.
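
Purely as an illustration of the Iceberg requirement above (not from the posting), the sketch below creates an Apache Iceberg table with PySpark and applies an atomic MERGE INTO upsert. The catalog name, S3 warehouse path, and table/view names are hypothetical, and a matching iceberg-spark-runtime package is assumed to be on the Spark classpath.

```python
# Illustrative only: create and upsert into an Apache Iceberg table via Spark SQL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-sketch")
    # Hypothetical Iceberg catalog named "demo" using a Hadoop (path-based) layout.
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

spark.sql("CREATE NAMESPACE IF NOT EXISTS demo.sales")

# Iceberg tables provide snapshots, schema evolution, and ACID guarantees on object storage.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.sales.orders (
        order_id BIGINT,
        amount   DECIMAL(10, 2),
        ts       TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(ts))
""")

# MERGE INTO is an atomic upsert; `updates` is assumed to be a temp view of
# incoming records with a compatible schema, registered elsewhere in the job.
spark.sql("""
    MERGE INTO demo.sales.orders AS t
    USING updates AS s
    ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```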

Benefits

Important:
  • Our company does not sponsor work visas or work permits. All applicants must have the legal right to work in the country where the position is based.
  • Only candidates who meet the required qualifications and match the profile requested by our clients will be contacted.


#VisionaryFuture - Build the future, join our living ecosystem!






Required Skill Profession

Computer Occupations


