
Summary

Working as a Data & AI Coordinator, supporting pre-sales and delivering data projects.

Expertise in the architecture and development of distributed solutions with multiple, distinct integrations. Relevant experience with data pipelines and system backends with a high degree of automation, scalability and resilience, both in the cloud and on-premises.

Experience in projects on the Azure cloud platform, using data analytics and cognitive services.

More than 6 years working as a data engineer/architect with major players such as Microsoft (Azure), Cloudera, DataStax, Elastic and others.

Expertise in developing data flows (streaming and batch, with Hive/Spark/NiFi/StreamSets/Informatica), deploying the Hadoop ecosystem with tuning and security, and working with NoSQL stores such as Cassandra and Elasticsearch.

Personal Information

Born in Campinas, single and childless. Enjoys regular physical activities (gym and football). Always looking to learn new things, and passionate about big data and its many applications.

I like to stay connected and celebrate with the people around me; being part of the team and sharing its culture is essential to success.

Education

Jul 2018

Cloudera Bootcamp - San Francisco, CA

Cloudera
Jul 2018 - Jul 2020

CCA131 - Administrator

License 100-020-797
May 2016 - Jul 2017

Master of Business Administration

FIAP

MBA in Big Data - Data Science

Jan 2013 - Dec 2014

Certificate - Advanced Module

Uptime Comunicação em Ingles

English language course

Jan 2008 - Dec 2011

Bachelor's degree

Unicamp

Systems Analysis

Work History

Jan 2019 - Present

Data & AI Coordinator

Logicalis

Main responsibilities:

  • 2019 - Present -- Coordinator of a 12-person team across multiple projects, acting on some of them as engineer/pre-sales, responsible for the development and deployment of the solutions.
    • Azure project using a large range of components (IaaS and PaaS), integrating on-premises and cloud to build data pipelines with Azure Data Factory, HDInsight and Databricks on the integration layer, and AKS with API Management to expose APIs (see the sketch after this list)
    • Informatica stack and Cloudera for data pipelines and data quality at a telco company (multiple projects)
    • AI/R&D projects for image/video analytics with a retail customer
    • R&D - IoT data platform (Cloudera + Azure)
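A minimal sketch of the kind of batch step Databricks runs inside such an Azure Data Factory pipeline: read raw files landed by a copy activity, clean them, and write a curated zone consumed by reports and the APIs behind API Management. The storage account, container paths and column names (order_id, order_ts, amount) are hypothetical, not taken from the actual project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

# Read the raw files that the Data Factory copy activity landed in the lake
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

curated = (
    raw.dropDuplicates(["order_id"])                        # drop replayed events
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize timestamps
       .withColumn("order_date", F.to_date("order_ts"))     # partition key
       .filter(F.col("amount") > 0)                         # discard bad records
)

# Persist the curated zone consumed downstream
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/orders/"
)
```
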
Apr 2017 - Dec 2018

Big Data & IoT Architect/Engineer Specialist

Logicalis

Main responsibilities:

  • 2017 - 2018 -- Lead big data architect, disseminating knowledge with partners and internally, surveying and designing many proposals for big data projects with different levels of integration. Also worked on Cloudera installation projects and on the design of data pipelines both on-premises and in the cloud.
    • Deployed a Hadoop installation and data pipeline for an IoT/Smart City project in Granada, Spain, optimizing garbage collection routes, using Flume/Hive to ingest data and Spark with R to process it and run predictive models (see the sketch after this list)
    • Creation of the Eugenio IoT (Logicalis) platform on Azure, performing the installation and configuration of the NiFi and Hadoop environment, as well as building the data pipeline to collect and process IoT data on the platform
    • Design and construction of a corporate data lake solution for the biggest telecom customer in Brazil, defining governance standards, auditing and data lineage.
      Construction of components for ingestion, processing and extraction of data using the Cloudera Hadoop ecosystem
    • Worked on a project for data ingestion, processing and visualization on Azure, using Azure PaaS (Data Factory, Cosmos DB and Power BI)
    • Deployed a cluster and data pipeline for NAT logs (legal searches) at a telco using the Cloudera stack + Apache NiFi
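A minimal PySpark sketch of the processing stage in this kind of Flume/Hive + Spark pipeline: aggregate sensor readings into a daily feature table that a route-prediction model can consume. The Hive database, table and column names (bin_fill_readings, fill_pct, etc.) are hypothetical, and the real project ran the predictive models with Spark and R.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("bin-fill-features")
    .enableHiveSupport()   # read the tables populated by the Flume/Hive ingestion
    .getOrCreate()
)

readings = spark.table("iot_raw.bin_fill_readings")  # hypothetical Hive table

# Daily fill-level features per container, later fed to the routing model
features = (
    readings.groupBy("bin_id", F.to_date("reading_ts").alias("reading_date"))
            .agg(F.avg("fill_pct").alias("avg_fill"),
                 F.max("fill_pct").alias("max_fill"),
                 F.count(F.lit(1)).alias("n_readings"))
)

features.write.mode("overwrite").saveAsTable("iot_curated.bin_fill_daily")
```
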
Feb 2012 - Mar 2017

Senior Systems Analyst / Systems Engineer

Itau-Unibanco

Main responsibilities:

  • 2015 - 2017 -- Solution design and development of the Itau Data Warehouse project for data transfer, extraction and load between the Teradata platform and Hadoop, with historical control and governance of automated data models. IDP/ICCD project: testing, build and deployment of technologies/applications such as Cassandra, MOM (ActiveMQ), Python and Hadoop. Project structured in a cloud architecture tied to the Lambda and Kappa concepts, serving data views for real-time analytics, machine learning, CEP, geolocation, CRM, 720° customer view and others.
  • 2014 -- Development of load and quality engines for structured/semi-structured files and tables into the big data environment (Hadoop), plus support and process improvements in that environment. Development of control engines using shell scripts with Hive and MySQL (see the sketch after this list).
  • 2012 - 2013 -- Development of storage engines for structured information from Itau's media (channel systems) on the mainframe platform; governance and risk-management projects, working with partners on software development, testing and support of the solutions.
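A minimal sketch of the row-count reconciliation those control engines performed. It is written in PySpark rather than the original shell/Hive/MySQL scripts, and the landing path, table and partition names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("load-control").enableHiveSupport().getOrCreate()

load_date = "2016-05-31"  # hypothetical load partition

# Records landed in the ingestion area vs. rows visible in the target Hive table
landed = spark.read.csv(f"/data/landing/transactions/{load_date}/", header=True)
loaded = spark.table("dw.transactions").where(f"load_date = '{load_date}'")

landed_count, loaded_count = landed.count(), loaded.count()
status = "OK" if landed_count == loaded_count else "MISMATCH"

# In the real engines this result was written to a control table for monitoring
print(f"{load_date}: landed={landed_count} loaded={loaded_count} -> {status}")
```
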
Sep 2011 - Jan 2012

Sales

M3 - M. Officer

Sales and relationship with the public, selling services. Developed interpersonal skills and public speaking.


Mar 2010 - Sep 2010

Intern

IBM do Brasil

Support and monitoring of Control-M applications, remote software installations with customers

Main responsibilities:

  • Working together with the India team to monitor the Control-M applications, acting on issue resolution and providing user support for the installation and update of new market tools.

Professional Timeline

  • 2010
    • IBM – Intern
      • Application support of Control-M software in the Shell service account
  • 2011
    • M.Officer – Sales
      • Sales and customer service
  • 2012
    • Itau-Unibanco - Junior Systems Analyst
      • FLUIR – (Data Ingestion and Modeling (corporate data model))
  • 2013
    • Itau-Unibanco - Junior Systems Analyst
      • Recognized for high performance in 2012
      • FLUIR – (Management Engine (Mainframe platform))
  • 2014
    • Itau-Unibanco - Junior Systems Analyst
      • DMT – Data Management Transformation
      • Data pipeline using Hadoop and Teradata
  • 2015
    • Itau-Unibanco - Regular Systems Analyst
      • Promotion for high performance in 2014
      • DMT – IDW project using Hadoop as a data lake
      • Position Change - Regular Systems Engineer
      • Project Itau Data Platform
      • Project Itau Customer Centric Data
  • 2016-2017
    • Itau-Unibanco - Senior Systems Engineer
      • Promotion for high performance in 2016
      • Project Itau Data Platform
      • Project Itau Customer Centric Data
  • 2017
    • Logicalis
      • Eugenio IoT big data platform
      • Data lake for telco customer
      • IoT project for Smart City Granada, Spain
      • NAT logs for legal searches (telco)
      • Streaming data projects for CRM offers (telco)
    • Recognized for high performance in 2017
  • 2018
    • Recognized for high performance in 2018
    • Promotion for high performance (Coordinator)