Summary

Expertise in the engineering and development of distributed systems integrated with mainframe platforms. Relevant experience in pipeline and backend solutions with a high degree of automation (PaaS).

Currently working on a cloud platform project that integrates IaaS, PaaS, and SaaS.

Using Big Data technologies in conjunction with the Lambda and Kappa architectural patterns, leveraging the Hadoop ecosystem, Cassandra, SAP HANA, and RabbitMQ, integrated with API layers, to provide a platform that makes generating value for the institution agile and simple through data mining, CEP, machine learning, and other business insights.

Personal Information

Born in Campinas, single, no children. I have lived in São Paulo since 2015. I enjoy weekly physical activities (gym and football).

I plan to start an MBA in business management soon, to build a broader view (Technology x Market x Application), overcome daily challenges, and become a well-rounded professional.

Education

Jan 2013 - Dec 2014

Certificate, Advanced Module

Uptime Comunicação em Inglês

English language course

Jan 2008 - Dec 2011

Bachelor's degree

Unicamp

Systems Analysis

Work History

Feb 2012 - Present

Systems Analyst / Systems Engineer

Itau-Unibanco

Main responsibilities:

  • 2015 - 2016 -- Solution design and development of the Itau Data Warehouse project for data transfer, extraction, and load between the Teradata platform and Hadoop, with historical control and governance of automated data models. IDP / ICCD project, testing the integration capabilities of technologies including Cassandra, MOM (RabbitMQ), Python / shell script, SAP HANA, and Hadoop. Project structured on a cloud architecture based on the Lambda and Kappa patterns, serving and building data views for real-time analytics, machine learning, CEP, geolocation, CRM, 720° customer view, and more.
  • 2014 -- Development of loading and quality engines for structured and semi-structured files/tables in the Big Data environment (Hadoop); support and implementation of process improvements in the environment. Development of control engines using shell script with Hive and MySQL.

  • 2012-2013 -- Development of storage engines for structured information from Itau's media (systems channels) on the mainframe platform; governance and risk management projects; working with partners on software development, testing, and supporting solutions; responsible for information capture from Itau systems.
Sep 2011 - Jan 2012

Sales

M3 - M. Officer

Sales and customer relations, selling services. Developed interpersonal and public-speaking skills.


Mar 2010 - Sep 2010

Intern

IBM do Brasil

Support and monitoring of Control-M applications; remote software installations with customers.

Main responsibilities:

  • Working alongside the India team to monitor Control-M applications, resolving issues and providing user support for installing and updating new market tools.

Professional Timeline

  • 2010
    • IBM – Intern
      • Application support of Control-M software in the Shell service account
  • 2011
    • M.Officer – Sales
      • Sales and customer service
  • 2012
    • Itau-Unibanco - Junior Systems Analyst
      • FLUIR – Data Capture (corporate data model)
  • 2013
    • Itau-Unibanco - Junior Systems Analyst
      • Recognized for high performance in 2012
      • FLUIR – Management Engine (mainframe platform)
  • 2014
    • Itau-Unibanco - Junior Systems Analyst
      • DMT – Data Management Transformation
      • Data pipeline using Hadoop and Teradata
  • 2015
    • Itau-Unibanco - Regular Systems Analyst
      • Promoted for high performance in 2014
      • DMT – IDW project using Hadoop as a data lake
      • Position Change - Regular Systems Engineer
      • Project Itau Data Platform
      • Project Itau Customer Centricity Data
  • 2016
    • Itau-Unibanco - Regular Systems Analyst
      • Project Itau Data Platform
      • Project Itau Customer Centricity Data