About Me

What is this blog about

I like to write about a range of topics; these are the main ones, in case you're interested:

Technology

I discuss a wide range of computing subjects: programming languages, cloud, and others.

Personal Experiences

I write about activities and events I participated in, and how they were from my perspective.

Hobbies

I like to take photos, work out, and read books. Here is a chance to talk a bit about each of these interests.

Thoughts

I enjoy putting on paper some of the ideas I'm having, exposing them a bit to see what people think of my insights.

Work Experience

Data Engineering Manager

Tech lead of Accelerate Database Modernization, an AI agent that automates database/data warehouse migrations from legacy on-prem systems to the cloud.

Managing 8 data engineers and architects, providing them with technical and professional guidance.

Support data projects across the financial and adtech industries on AWS: data lakehouses from S3 to Redshift, streaming pipelines using Kafka/MSK, and query optimization and modeling in PostgreSQL/Aurora.

Work side-by-side with the Sales team, supporting pre-sales discussions with customers and writing proposals for potential projects/engagements.

Provide expertise on data engineering projects and machine learning implementations.

Senior Data Architect

Communicated constantly with stakeholders and clients to deliver technical solutions.

Built architectures on the AWS stack (Kinesis, Glue, Athena).

Deployed resources using infrastructure as code (Terraform, CloudFormation, SDK, CDK).

Tuned machine learning models for production using the SageMaker ecosystem and other tools.

Configured cloud environments (subnets, inbound/outbound rules, permissions).

Coached junior engineers and architects.

Senior Data Engineer

Creating and architecting the data lake using GCS, Data Catalog, and BigQuery;

Developing the main data pipeline from NoSQL (MongoDB) to the data lake, using Python and Airflow;

Using DBT to create data marts and quality checks;

Implementing infrastructure using Terraform and GCP;

Providing dashboards and self-service analytics tools to empower business teams.

Senior Data Analyst

Supervised and elaborated the infrastructure/engineering of the data flow.

Collected and treated data in relational (MySQL) and non-relational (MongoDB) databases.

Used Google Cloud Platform (GCP) and BigQuery daily to explore information.

Used Docker, Airflow, and Python to move information and send clean data to the data lake.

Built dashboards and reports with Metabase and QuickSight.

BI and CRM Analyst

Responsible for the CRM business team, with daily and project management using Scrum and Kanban.

Created dashboards in Tableau and Power BI, updated daily to monitor marketing KPIs.

Built ETL pipelines in Python and put them into production on AWS, using good practices and CI/CD techniques.

Built the marketing DW using dimensional modeling, with daily use of SQL.

Used Salesforce daily to collect leads, forecast churn, and calculate cohorts for engagement analysis.